Compare commits: `prerelease...v0.20.1` (51 commits)
Commits:

- e8658d6493
- 0a4bd29ce5
- 506b899676
- c2103b27f6
- 1a2e845958
- e5f270c8da
- 0fd224404a
- 215020f9bb
- b366cb3c90
- e0bba8f4a3
- 60195d72c9
- 2008503a1f
- 9df47aabdb
- 80e90b924a
- c6ff664366
- d27debd940
- c93b8bceb8
- f602c67606
- 3a082f0112
- 9fb25b7404
- 9924c92c40
- 16cb6ac1ed
- 5821746258
- d860083b08
- f2d3818d5c
- 0052f330be
- 456db45c4a
- e493ab31c6
- e0f45b33e9
- bb61ecb9b9
- 9694225b80
- 3b15e7bf41
- ac5f0cc340
- f67840511a
- bd9cfa3a47
- 96c262b13e
- d22fd79a0b
- 8cf4bf7559
- 14a703a4bb
- d7726f80e8
- d69c3a6ab7
- 1fd964140d
- c9bab946d4
- 74a5cd2bb0
- 151ce6f923
- 1c31ce82af
- caab2cbfd2
- a6f41006eb
- 18b4670d9f
- 322fe415e4
- 096bfa4ba9
.github/dependabot.yml (vendored, 5 changes)
@@ -5,3 +5,8 @@ updates:
     directory: "/"
     schedule:
       interval: "daily"
+  # Requirements to build documentation
+  - package-ecosystem: "pip"
+    directory: "/lib/spack/docs"
+    schedule:
+      interval: "daily"
.github/workflows/unit_tests.yaml (vendored, 1 change)
@@ -137,6 +137,7 @@ jobs:
       - name: Setup repo and non-root user
         run: |
           git --version
+          git config --global --add safe.directory /__w/spack/spack
           git fetch --unshallow
           . .github/workflows/setup_git.sh
           useradd spack-test
.github/workflows/valid-style.yml (vendored, 1 change)
@@ -72,6 +72,7 @@ jobs:
       - name: Setup repo and non-root user
         run: |
           git --version
+          git config --global --add safe.directory /__w/spack/spack
          git fetch --unshallow
          . .github/workflows/setup_git.sh
          useradd spack-test
@@ -1,10 +1,16 @@
 version: 2
 
+build:
+  os: "ubuntu-22.04"
+  apt_packages:
+    - graphviz
+  tools:
+    python: "3.11"
+
 sphinx:
   configuration: lib/spack/docs/conf.py
   fail_on_warning: true
 
 python:
-  version: 3.7
   install:
     - requirements: lib/spack/docs/requirements.txt
CHANGELOG.md (218 changes)
@@ -1,3 +1,221 @@
# v0.20.0 (2023-05-21)

`v0.20.0` is a major feature release.

## Features in this release

1. **`requires()` directive and enhanced package requirements**

   We've added some more enhancements to requirements in Spack (#36286).

   There is a new `requires()` directive for packages. `requires()` is the opposite of
   `conflicts()`. You can use it to impose constraints on this package when certain
   conditions are met:

   ```python
   requires(
       "%apple-clang",
       when="platform=darwin",
       msg="This package builds only with clang on macOS"
   )
   ```

   More on this in [the docs](
   https://spack.rtfd.io/en/latest/packaging_guide.html#conflicts-and-requirements).

   You can also now add a `when:` clause to `requires:` in your `packages.yaml`
   configuration or in an environment:

   ```yaml
   packages:
     openmpi:
       require:
       - any_of: ["%gcc"]
         when: "@:4.1.4"
         message: "Only OpenMPI 4.1.5 and up can build with fancy compilers"
   ```

   More details can be found [here](
   https://spack.readthedocs.io/en/latest/build_settings.html#package-requirements).

2. **Exact versions**

   Spack did not previously have a way to distinguish a version if it was a prefix of
   some other version. For example, `@3.2` would match `3.2`, `3.2.1`, `3.2.2`, etc. You
   can now match *exactly* `3.2` with `@=3.2`. This is useful, for example, if you need
   to patch *only* the `3.2` version of a package. The new syntax is described in [the docs](
   https://spack.readthedocs.io/en/latest/basic_usage.html#version-specifier).

   Generally, when writing packages, you should prefer ranges like `@3.2` over
   specific versions, as this allows the concretizer more leeway when selecting
   versions of dependencies. More details and recommendations are in the [packaging guide](
   https://spack.readthedocs.io/en/latest/packaging_guide.html#ranges-versus-specific-versions).

   See #36273 for full details on the version refactor.
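   As a minimal illustration of the difference (`mypackage` is a hypothetical stand-in,
   not a real package):

   ```console
   $ spack spec mypackage@3.2    # range: may select 3.2, 3.2.1, 3.2.2, ...
   $ spack spec mypackage@=3.2   # exact: matches only version 3.2
   ```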
3. **New testing interface**

   Writing package tests is now much simpler with a new [test interface](
   https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-tests).

   Writing a test is now as easy as adding a method that starts with `test_`:

   ```python
   class MyPackage(Package):
       ...

       def test_always_fails(self):
           """use assert to always fail"""
           assert False

       def test_example(self):
           """run installed example"""
           example = which(self.prefix.bin.example)
           example()
   ```

   You can use Python's native `assert` statement to implement your checks -- no more
   need to fiddle with `run_test` or other test framework methods. Spack will
   introspect the class and run `test_*` methods when you run `spack test`.
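   A sketch of the workflow, assuming an installed package named `mypackage`
   (a hypothetical name) and using the `spack test run` subcommand:

   ```console
   $ spack install mypackage      # stand-alone tests run against an installed spec
   $ spack test run mypackage     # discovers and runs the test_* methods
   ```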
4. **More stable concretization**

   * Now, `spack concretize` will *only* concretize the new portions of the environment
     and will not change existing parts of an environment unless you specify `--force`.
     This has always been true for `unify:false`, but not for `unify:true` and
     `unify:when_possible` environments. Now it is true for all of them (#37438, #37681).

   * The concretizer has a new `--reuse-deps` argument that *only* reuses dependencies.
     That is, it will always treat the *roots* of your environment as it would with
     `--fresh`. This allows you to upgrade just the roots of your environment while
     keeping everything else stable (#30990).
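   A hedged sketch of both behaviors (`myenv` is a placeholder environment name; the
   placement of `--reuse-deps` on `spack concretize` is an assumption based on the
   notes above):

   ```console
   $ spack -e myenv concretize                # re-concretizes only newly added specs
   $ spack -e myenv concretize --force        # re-concretizes the whole environment
   $ spack -e myenv concretize --reuse-deps   # fresh roots, reused dependencies
   ```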
5. **Weekly develop snapshot releases**

   Since last year, we have maintained a buildcache of `develop` at
   https://binaries.spack.io/develop, but the cache can grow to contain so many builds
   as to be unwieldy. When we get a stable `develop` build, we snapshot the release and
   add a corresponding tag to the Spack repository. So, you can use a stack from a
   specific day. There are now tags in the spack repository like:

   * `develop-2023-05-14`
   * `develop-2023-05-18`

   that correspond to build caches like:

   * https://binaries.spack.io/develop-2023-05-14/e4s
   * https://binaries.spack.io/develop-2023-05-18/e4s

   We plan to store these snapshot releases weekly.
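   One way to consume a snapshot is to register it as a mirror; the mirror name
   `develop-snapshot` below is arbitrary:

   ```console
   $ spack mirror add develop-snapshot https://binaries.spack.io/develop-2023-05-18/e4s
   $ spack buildcache keys --install --trust   # trust the cache's signing keys
   ```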
6. **Specs in buildcaches can be referenced by hash.**

   * Previously, you could run `spack buildcache list` and see the hashes in
     buildcaches, but referring to them by hash would fail.
   * You can now run commands like `spack spec` and `spack install` and refer to
     buildcache hashes directly, e.g. `spack install /abc123` (#35042)
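   For example (`/abc123` stands for a hash prefix taken from the listing):

   ```console
   $ spack buildcache list     # shows the hashes of specs in the cache
   $ spack spec /abc123        # inspect a cached spec by hash
   $ spack install /abc123     # install it directly from the buildcache
   ```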
7. **New package and buildcache index websites**

   Our public websites for searching packages have been completely revamped and updated.
   You can check them out here:

   * *Package Index*: https://packages.spack.io
   * *Buildcache Index*: https://cache.spack.io

   Both are searchable and more interactive than before. Currently major releases are
   shown; UI for browsing `develop` snapshots is coming soon.
8. **Default CMake and Meson build types are now Release**

   Spack has historically defaulted to building with optimization and debugging, but
   packages like `llvm` can be enormous with debug turned on. Our default build type for
   all Spack packages is now `Release` (#36679, #37436). This has a number of benefits:

   * much smaller binaries;
   * higher default optimization level; and
   * defining `NDEBUG` disables assertions, which may lead to further speedups.

   You can still get the old behavior back through requirements and package preferences.
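   A minimal sketch of restoring the old default via a package preference in
   `packages.yaml` (one possible route; a `require:` entry works similarly):

   ```yaml
   packages:
     all:
       variants: build_type=RelWithDebInfo
   ```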
## Other new commands and directives

* `spack checksum` can automatically add new versions to packages (#24532)
* new command: `spack pkg grep` to easily search package files (#34388); see the
  sketch after this list
* New `maintainers` directive (#35083)
* Add `spack buildcache push` (alias to `buildcache create`) (#34861)
* Allow using `-j` to control the parallelism of concretization (#37608)
* Add `--exclude` option to `spack external find` (#35013)
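A hedged example of `spack pkg grep` (the search pattern is illustrative; arguments
are forwarded to `grep` across all package files):

```console
$ spack pkg grep 'depends_on("mpi")'   # find package files that depend on MPI
```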
## Other new features of note

* editing: add a higher-precedence `SPACK_EDITOR` environment variable (see the
  example after this list)
* Many YAML formatting improvements from updating `ruamel.yaml` to the latest version
  supporting Python 3.6 (#31091, #24885, #37008)
* Requirements and preferences should not define (non-git) versions (#37687, #37747)
* Environments now store the spack version/commit in `spack.lock` (#32801)
* Users can specify the name of the `packages` subdirectory in repositories (#36643)
* Add container images supporting RHEL alternatives (#36713)
* make version(...) kwargs explicit (#36998)
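For example (the editor choice is arbitrary; as noted above, `SPACK_EDITOR` takes
precedence over the usual editor environment variables):

```console
$ export SPACK_EDITOR="emacs -nw"
$ spack edit zlib   # opens the zlib package file in the chosen editor
```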
## Notable refactors

* buildcache create: reproducible tarballs (#35623)
* Bootstrap most of Spack's dependencies using environments (#34029)
* Split `satisfies(..., strict=True/False)` into two functions (#35681)
* spack install: simplify behavior when inside environments (#35206)

## Binary cache and stack updates

* Major simplification of CI boilerplate in stacks (#34272, #36045)
* Many improvements to our CI pipeline's reliability
## Removals, Deprecations, and disablements

* Module file generation is disabled by default; you'll need to enable it to use it (#37258)
* Support for Python 2 was deprecated in `v0.19.0` and has been removed. `v0.20.0` only
  supports Python 3.6 and higher.
* Deprecated target names are no longer recognized by Spack. Use generic names instead:
  * `graviton` is now `cortex_a72`
  * `graviton2` is now `neoverse_n1`
  * `graviton3` is now `neoverse_v1`
* `blacklist` and `whitelist` in module configuration were deprecated in `v0.19.0` and are
  removed in this release. Use `exclude` and `include` instead.
* The `ignore=` parameter of the `extends()` directive has been removed. It was not used by
  any builtin packages and is no longer needed to avoid conflicts in environment views (#35588).
* Support for the old YAML buildcache format has been removed. It was deprecated in `v0.19.0` (#34347).
* `spack find --bootstrap` has been removed. It was deprecated in `v0.19.0`. Use
  `spack --bootstrap find` instead (#33964).
* `spack bootstrap trust` and `spack bootstrap untrust` are now removed, having been
  deprecated in `v0.19.0`. Use `spack bootstrap enable` and `spack bootstrap disable`.
* The `--mirror-name`, `--mirror-url`, and `--directory` options to buildcache and
  mirror commands were deprecated in `v0.19.0` and have now been removed. They have been
  replaced by positional arguments (#37457).
* Deprecate `env:` as a top-level environment key (#37424)
* Deprecate `buildcache create --rel` and `buildcache install --allow-root` (#37285)
* Support for very old perl-like spec format strings (e.g., `$_$@$%@+$+$=`) has been
  removed (#37425). This was deprecated in `v0.15` (#10556).
## Notable Bugfixes

* bugfix: don't fetch package metadata for unknown concrete specs (#36990)
* Improve package source code context display on error (#37655)
* Relax environment manifest filename requirements and lockfile identification criteria (#37413)
* `installer.py`: drop build edges of installed packages by default (#36707)
* Bugfix: package requirements with git commits (#35057, #36347)
* Package requirements: allow single specs in requirement lists (#36258)
* conditional variant values: allow boolean (#33939)
* spack uninstall: follow run/link edges on --dependents (#34058)

## Spack community stats

* 7,179 total packages, 499 new since `v0.19.0`
  * 329 new Python packages
  * 31 new R packages
* 336 people contributed to this release
  * 317 committers to packages
  * 62 committers to core


# v0.19.1 (2023-02-07)

### Spack Bugfixes
lib/spack/docs/_pygments/style.py (new file, 16 lines)
@@ -0,0 +1,16 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

# The name of the Pygments (syntax highlighting) style to use.
# We use our own extension of the default style with a few modifications
from pygments.styles.default import DefaultStyle
from pygments.token import Generic


class SpackStyle(DefaultStyle):
    styles = DefaultStyle.styles.copy()
    background_color = "#f4f4f8"
    styles[Generic.Output] = "#355"
    styles[Generic.Prompt] = "bold #346ec9"
lib/spack/docs/conf.py
@@ -149,7 +149,6 @@ def setup(sphinx):
 # Get nice vector graphics
 graphviz_output_format = "svg"
 
-
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ["_templates"]
@@ -233,30 +232,8 @@ def setup(sphinx):
 # If true, sectionauthor and moduleauthor directives will be shown in the
 # output. They are ignored by default.
 # show_authors = False
 
-# The name of the Pygments (syntax highlighting) style to use.
-# We use our own extension of the default style with a few modifications
-from pygments.style import Style
-from pygments.styles.default import DefaultStyle
-from pygments.token import Comment, Generic, Text
-
-
-class SpackStyle(DefaultStyle):
-    styles = DefaultStyle.styles.copy()
-    background_color = "#f4f4f8"
-    styles[Generic.Output] = "#355"
-    styles[Generic.Prompt] = "bold #346ec9"
-
-
-import pkg_resources
-
-dist = pkg_resources.Distribution(__file__)
-sys.path.append(".")  # make 'conf' module findable
-ep = pkg_resources.EntryPoint.parse("spack = conf:SpackStyle", dist=dist)
-dist._ep_map = {"pygments.styles": {"plugin1": ep}}
-pkg_resources.working_set.add(dist)
-
-pygments_style = "spack"
+sys.path.append("./_pygments")
+pygments_style = "style.SpackStyle"
 
 # A list of ignored prefixes for module index sorting.
 # modindex_common_prefix = []
@@ -341,16 +318,15 @@ class SpackStyle(DefaultStyle):
 # Output file base name for HTML help builder.
 htmlhelp_basename = "Spackdoc"
 
 
 # -- Options for LaTeX output --------------------------------------------------
 
 latex_elements = {
     # The paper size ('letterpaper' or 'a4paper').
-    #'papersize': 'letterpaper',
+    # 'papersize': 'letterpaper',
     # The font size ('10pt', '11pt' or '12pt').
-    #'pointsize': '10pt',
+    # 'pointsize': '10pt',
     # Additional stuff for the LaTeX preamble.
-    #'preamble': '',
+    # 'preamble': '',
 }
 
 # Grouping the document tree into LaTeX files. List of tuples
@@ -616,7 +616,7 @@ to customize the generation of container recipes:
      - No
    * - ``os_packages:command``
      - Tool used to manage system packages
-     - ``apt``, ``yum``, ``zypper``, ``apk``, ``yum_amazon``
+     - ``apt``, ``yum``, ``dnf``, ``dnf_epel``, ``zypper``, ``apk``, ``yum_amazon``
      - Only with custom base images
    * - ``os_packages:update``
      - Whether or not to update the list of available packages
@@ -916,9 +916,9 @@ function, as shown in the example below:
    .. code-block:: yaml
 
       projections:
-        zlib: {name}-{version}
-        ^mpi: {name}-{version}/{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}
-        all: {name}-{version}/{compiler.name}-{compiler.version}
+        zlib: "{name}-{version}"
+        ^mpi: "{name}-{version}/{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}"
+        all: "{name}-{version}/{compiler.name}-{compiler.version}"
 
    The entries in the projections configuration file must all be either
    specs or the keyword ``all``. For each spec, the projection used will
lib/spack/docs/requirements.txt
@@ -1,13 +1,8 @@
-# These dependencies should be installed using pip in order
-# to build the documentation.
-
-sphinx>=3.4,!=4.1.2,!=5.1.0
-sphinxcontrib-programoutput
-sphinx-design
-sphinx-rtd-theme
-python-levenshtein
-# Restrict to docutils <0.17 to workaround a list rendering issue in sphinx.
-# https://stackoverflow.com/questions/67542699
-docutils <0.17
-pygments <2.13
-urllib3 <2
+sphinx==6.2.1
+sphinxcontrib-programoutput==0.17
+sphinx_design==0.4.1
+sphinx-rtd-theme==1.2.1
+python-levenshtein==0.21.0
+docutils==0.18.1
+pygments==2.15.1
+urllib3==2.0.2
lib/spack/external/__init__.py (vendored, 2 changes)
@@ -18,7 +18,7 @@
 * Homepage: https://pypi.python.org/pypi/archspec
 * Usage: Labeling, comparison and detection of microarchitectures
-* Version: 0.2.1 (commit 4b1f21802a23b536bbcce73d3c631a566b20e8bd)
+* Version: 0.2.1 (commit 9e1117bd8a2f0581bced161f2a2e8d6294d0300b)
 
 astunparse
 ----------------
@@ -2803,7 +2803,7 @@
       "flags" : "-march=armv8.2-a+fp16+dotprod+crypto -mtune=cortex-a72"
     },
     {
-      "versions": "10.2",
+      "versions": "10.2:10.2.99",
       "flags" : "-mcpu=zeus"
     },
     {
@@ -4,7 +4,7 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 
 #: PEP440 canonical <major>.<minor>.<micro>.<devN> string
-__version__ = "0.20.0.dev0"
+__version__ = "0.20.1"
 spack_version = __version__
@@ -1216,6 +1216,9 @@ def child_fun():
     return child_result
 
 
+CONTEXT_BASES = (spack.package_base.PackageBase, spack.build_systems._checks.BaseBuilder)
+
+
 def get_package_context(traceback, context=3):
     """Return some context for an error message when the build fails.
@@ -1244,32 +1247,38 @@ def make_stack(tb, stack=None):
 
     stack = make_stack(traceback)
 
+    basenames = tuple(base.__name__ for base in CONTEXT_BASES)
     for tb in stack:
         frame = tb.tb_frame
         if "self" in frame.f_locals:
-            # Find the first proper subclass of PackageBase.
+            # Find the first proper subclass of the PackageBase or BaseBuilder, but
+            # don't provide context if the code is actually in the base classes.
             obj = frame.f_locals["self"]
-            if isinstance(obj, spack.package_base.PackageBase):
+            func = getattr(obj, tb.tb_frame.f_code.co_name, "")
+            if func:
+                typename, *_ = func.__qualname__.partition(".")
+
+            if isinstance(obj, CONTEXT_BASES) and typename not in basenames:
                 break
     else:
         return None
 
     # We found obj, the Package implementation we care about.
     # Point out the location in the install method where we failed.
-    lines = [
-        "{0}:{1:d}, in {2}:".format(
-            inspect.getfile(frame.f_code),
-            frame.f_lineno - 1,  # subtract 1 because f_lineno is 0-indexed
-            frame.f_code.co_name,
-        )
-    ]
+    filename = inspect.getfile(frame.f_code)
+    lineno = frame.f_lineno
+    if os.path.basename(filename) == "package.py":
+        # subtract 1 because we inject a magic import at the top of package files.
+        # TODO: get rid of the magic import.
+        lineno -= 1
+
+    lines = ["{0}:{1:d}, in {2}:".format(filename, lineno, frame.f_code.co_name)]
 
     # Build a message showing context in the install method.
     sourcelines, start = inspect.getsourcelines(frame)
 
     # Calculate lineno of the error relative to the start of the function.
-    # Subtract 1 because f_lineno is 0-indexed.
-    fun_lineno = frame.f_lineno - start - 1
+    fun_lineno = lineno - start
     start_ctx = max(0, fun_lineno - context)
     sourcelines = sourcelines[start_ctx : fun_lineno + context + 1]
@@ -1365,7 +1374,7 @@ def long_message(self):
             test_log = join_path(os.path.dirname(self.log_name), spack_install_test_log)
             if os.path.isfile(test_log):
                 out.write("\nSee test log for details:\n")
-                out.write("  {0}n".format(test_log))
+                out.write("  {0}\n".format(test_log))
 
         return out.getvalue()
@@ -180,51 +180,6 @@ def test(self):
             work_dir="spack-test",
         )
 
 
-class PythonPackage(PythonExtension):
-    """Specialized class for packages that are built using pip."""
-
-    #: Package name, version, and extension on PyPI
-    pypi: Optional[str] = None
-
-    # To be used in UI queries that require to know which
-    # build-system class we are using
-    build_system_class = "PythonPackage"
-    #: Legacy buildsystem attribute used to deserialize and install old specs
-    legacy_buildsystem = "python_pip"
-
-    #: Callback names for install-time test
-    install_time_test_callbacks = ["test"]
-
-    build_system("python_pip")
-
-    with spack.multimethod.when("build_system=python_pip"):
-        extends("python")
-        depends_on("py-pip", type="build")
-        # FIXME: technically wheel is only needed when building from source, not when
-        # installing a downloaded wheel, but I don't want to add wheel as a dep to every
-        # package manually
-        depends_on("py-wheel", type="build")
-
-    py_namespace: Optional[str] = None
-
-    @lang.classproperty
-    def homepage(cls):
-        if cls.pypi:
-            name = cls.pypi.split("/")[0]
-            return "https://pypi.org/project/" + name + "/"
-
-    @lang.classproperty
-    def url(cls):
-        if cls.pypi:
-            return "https://files.pythonhosted.org/packages/source/" + cls.pypi[0] + "/" + cls.pypi
-
-    @lang.classproperty
-    def list_url(cls):
-        if cls.pypi:
-            name = cls.pypi.split("/")[0]
-            return "https://pypi.org/simple/" + name + "/"
-
     def update_external_dependencies(self, extendee_spec=None):
         """
         Ensure all external python packages have a python dependency
@@ -270,6 +225,51 @@ def update_external_dependencies(self, extendee_spec=None):
         python._mark_concrete()
         self.spec.add_dependency_edge(python, deptypes=("build", "link", "run"))
 
 
+class PythonPackage(PythonExtension):
+    """Specialized class for packages that are built using pip."""
+
+    #: Package name, version, and extension on PyPI
+    pypi: Optional[str] = None
+
+    # To be used in UI queries that require to know which
+    # build-system class we are using
+    build_system_class = "PythonPackage"
+    #: Legacy buildsystem attribute used to deserialize and install old specs
+    legacy_buildsystem = "python_pip"
+
+    #: Callback names for install-time test
+    install_time_test_callbacks = ["test"]
+
+    build_system("python_pip")
+
+    with spack.multimethod.when("build_system=python_pip"):
+        extends("python")
+        depends_on("py-pip", type="build")
+        # FIXME: technically wheel is only needed when building from source, not when
+        # installing a downloaded wheel, but I don't want to add wheel as a dep to every
+        # package manually
+        depends_on("py-wheel", type="build")
+
+    py_namespace: Optional[str] = None
+
+    @lang.classproperty
+    def homepage(cls):
+        if cls.pypi:
+            name = cls.pypi.split("/")[0]
+            return "https://pypi.org/project/" + name + "/"
+
+    @lang.classproperty
+    def url(cls):
+        if cls.pypi:
+            return "https://files.pythonhosted.org/packages/source/" + cls.pypi[0] + "/" + cls.pypi
+
+    @lang.classproperty
+    def list_url(cls):
+        if cls.pypi:
+            name = cls.pypi.split("/")[0]
+            return "https://pypi.org/simple/" + name + "/"
+
     def get_external_python_for_prefix(self):
         """
         For an external package that extends python, find the most likely spec for the python
@@ -756,6 +756,7 @@ def generate_gitlab_ci_yaml(
     # Get the joined "ci" config with all of the current scopes resolved
     ci_config = cfg.get("ci")
 
+    config_deprecated = False
     if not ci_config:
         tty.warn("Environment does not have `ci` a configuration")
         gitlabci_config = yaml_root.get("gitlab-ci")
@@ -768,6 +769,7 @@ def generate_gitlab_ci_yaml(
         )
         translate_deprecated_config(gitlabci_config)
         ci_config = gitlabci_config
+        config_deprecated = True
 
     # Default target is gitlab...and only target is gitlab
     if not ci_config.get("target", "gitlab") == "gitlab":
@@ -831,6 +833,14 @@ def generate_gitlab_ci_yaml(
     # Values: "spack_pull_request", "spack_protected_branch", or not set
     spack_pipeline_type = os.environ.get("SPACK_PIPELINE_TYPE", None)
 
+    copy_only_pipeline = spack_pipeline_type == "spack_copy_only"
+    if copy_only_pipeline and config_deprecated:
+        tty.warn(
+            "SPACK_PIPELINE_TYPE=spack_copy_only is not supported when using\n",
+            "deprecated ci configuration, a no-op pipeline will be generated\n",
+            "instead.",
+        )
+
     if "mirrors" not in yaml_root or len(yaml_root["mirrors"].values()) < 1:
         tty.die("spack ci generate requires an env containing a mirror")
 
@@ -1207,7 +1217,7 @@ def main_script_replacements(cmd):
                 ).format(c_spec, release_spec)
                 tty.debug(debug_msg)
 
-            if prune_dag and not rebuild_spec and spack_pipeline_type != "spack_copy_only":
+            if prune_dag and not rebuild_spec and not copy_only_pipeline:
                 tty.debug(
                     "Pruning {0}/{1}, does not need rebuild.".format(
                         release_spec.name, release_spec.dag_hash()
@@ -1298,7 +1308,7 @@ def main_script_replacements(cmd):
                     max_length_needs = length_needs
                    max_needs_job = job_name
 
-            if spack_pipeline_type != "spack_copy_only":
+            if not copy_only_pipeline:
                 output_object[job_name] = job_object
                 job_id += 1
 
@@ -1330,7 +1340,7 @@ def main_script_replacements(cmd):
         "when": ["runner_system_failure", "stuck_or_timeout_failure", "script_failure"],
     }
 
-    if spack_pipeline_type == "spack_copy_only":
+    if copy_only_pipeline and not config_deprecated:
         stage_names.append("copy")
         sync_job = copy.deepcopy(spack_ci_ir["jobs"]["copy"]["attributes"])
         sync_job["stage"] = "copy"
@@ -1474,12 +1484,18 @@ def main_script_replacements(cmd):
         sorted_output = cinw.needs_to_dependencies(sorted_output)
     else:
         # No jobs were generated
-        tty.debug("No specs to rebuild, generating no-op job")
         noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]
-
         noop_job["retry"] = service_job_retries
 
-        sorted_output = {"no-specs-to-rebuild": noop_job}
+        if copy_only_pipeline and config_deprecated:
+            tty.debug("Generating no-op job as copy-only is unsupported here.")
+            noop_job["script"] = [
+                'echo "copy-only pipelines are not supported with deprecated ci configs"'
+            ]
+            sorted_output = {"unsupported-copy": noop_job}
+        else:
+            tty.debug("No specs to rebuild, generating no-op job")
+            sorted_output = {"no-specs-to-rebuild": noop_job}
 
     if known_broken_specs_encountered:
         tty.error("This pipeline generated hashes known to be broken on develop:")
@@ -347,7 +347,7 @@ def iter_groups(specs, indent, all_headers):
             spack.spec.architecture_color,
             architecture if architecture else "no arch",
             spack.spec.compiler_color,
-            f"{compiler.name}@{compiler.version}" if compiler else "no compiler",
+            f"{compiler.display_str}" if compiler else "no compiler",
         )
 
     # Sometimes we want to display specs that are not yet concretized.
@@ -53,7 +53,7 @@ def setup_parser(subparser):
         "--scope",
         choices=scopes,
         metavar=scopes_metavar,
-        default=spack.config.default_modify_scope("compilers"),
+        default=None,
         help="configuration scope to modify",
     )
@@ -98,7 +98,7 @@ def compiler_find(args):
        config = spack.config.config
        filename = config.get_config_filename(args.scope, "compilers")
        tty.msg("Added %d new compiler%s to %s" % (n, s, filename))
-        colify(reversed(sorted(c.spec for c in new_compilers)), indent=4)
+        colify(reversed(sorted(c.spec.display_str for c in new_compilers)), indent=4)
     else:
         tty.msg("Found no new compilers")
     tty.msg("Compilers are defined in the following files:")
@@ -106,19 +106,21 @@ def compiler_find(args):
 
 
 def compiler_remove(args):
-    cspec = spack.spec.CompilerSpec(args.compiler_spec)
-    compilers = spack.compilers.compilers_for_spec(cspec, scope=args.scope)
-    if not compilers:
-        tty.die("No compilers match spec %s" % cspec)
-    elif not args.all and len(compilers) > 1:
-        tty.error("Multiple compilers match spec %s. Choose one:" % cspec)
-        colify(reversed(sorted([c.spec for c in compilers])), indent=4)
+    compiler_spec = spack.spec.CompilerSpec(args.compiler_spec)
+    candidate_compilers = spack.compilers.compilers_for_spec(compiler_spec, scope=args.scope)
+
+    if not candidate_compilers:
+        tty.die("No compilers match spec %s" % compiler_spec)
+
+    if not args.all and len(candidate_compilers) > 1:
+        tty.error(f"Multiple compilers match spec {compiler_spec}. Choose one:")
+        colify(reversed(sorted([c.spec.display_str for c in candidate_compilers])), indent=4)
         tty.msg("Or, use `spack compiler remove -a` to remove all of them.")
         sys.exit(1)
 
-    for compiler in compilers:
-        spack.compilers.remove_compiler_from_config(compiler.spec, scope=args.scope)
-        tty.msg("Removed compiler %s" % compiler.spec)
+    for current_compiler in candidate_compilers:
+        spack.compilers.remove_compiler_from_config(current_compiler.spec, scope=args.scope)
+        tty.msg(f"{current_compiler.spec.display_str} has been removed")
 
 
 def compiler_info(args):
@@ -130,7 +132,7 @@ def compiler_info(args):
         tty.die("No compilers match spec %s" % cspec)
     else:
         for c in compilers:
-            print(str(c.spec) + ":")
+            print(c.spec.display_str + ":")
             print("\tpaths:")
             for cpath in ["cc", "cxx", "f77", "fc"]:
                 print("\t\t%s = %s" % (cpath, getattr(c, cpath, None)))
@@ -188,7 +190,7 @@ def compiler_list(args):
             os_str += "-%s" % target
         cname = "%s{%s} %s" % (spack.spec.compiler_color, name, os_str)
         tty.hline(colorize(cname), char="-")
-        colify(reversed(sorted(c.spec for c in compilers)))
+        colify(reversed(sorted(c.spec.display_str for c in compilers)))
 
 
 def compiler(parser, args):
@@ -302,7 +302,7 @@ def env_create(args):
         # the environment should not include a view.
         with_view = None
 
-    _env_create(
+    env = _env_create(
         args.create_env,
         init_file=args.envfile,
         dir=args.dir,
@@ -310,6 +310,9 @@ def env_create(args):
         keep_relative=args.keep_relative,
     )
 
+    # Generate views, only really useful for environments created from spack.lock files.
+    env.regenerate_views()
+
 
 def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep_relative=False):
     """Create a new environment, with an optional yaml description.
@@ -79,6 +79,12 @@ def setup_parser(subparser):
     read_cray_manifest.add_argument(
         "--directory", default=None, help="specify a directory storing a group of manifest files"
     )
+    read_cray_manifest.add_argument(
+        "--ignore-default-dir",
+        action="store_true",
+        default=False,
+        help="ignore the default directory of manifest files",
+    )
     read_cray_manifest.add_argument(
         "--dry-run",
         action="store_true",
@@ -177,11 +183,16 @@ def external_read_cray_manifest(args):
         manifest_directory=args.directory,
         dry_run=args.dry_run,
         fail_on_error=args.fail_on_error,
+        ignore_default_dir=args.ignore_default_dir,
     )
 
 
 def _collect_and_consume_cray_manifest_files(
-    manifest_file=None, manifest_directory=None, dry_run=False, fail_on_error=False
+    manifest_file=None,
+    manifest_directory=None,
+    dry_run=False,
+    fail_on_error=False,
+    ignore_default_dir=False,
 ):
     manifest_files = []
     if manifest_file:
@@ -191,7 +202,7 @@ def _collect_and_consume_cray_manifest_files(
     if manifest_directory:
         manifest_dirs.append(manifest_directory)
 
-    if os.path.isdir(cray_manifest.default_path):
+    if not ignore_default_dir and os.path.isdir(cray_manifest.default_path):
         tty.debug(
             "Cray manifest path {0} exists: collecting all files to read.".format(
                 cray_manifest.default_path
@@ -116,21 +116,23 @@ def one_spec_or_raise(specs):
 
 
 def check_module_set_name(name):
-    modules_config = spack.config.get("modules")
-    valid_names = set(
-        [
-            key
-            for key, value in modules_config.items()
-            if isinstance(value, dict) and value.get("enable", [])
-        ]
-    )
-    if "enable" in modules_config and modules_config["enable"]:
-        valid_names.add("default")
+    modules = spack.config.get("modules")
+    if name != "prefix_inspections" and name in modules:
+        return
 
-    if name not in valid_names:
-        msg = "Cannot use invalid module set %s." % name
-        msg += " Valid module set names are %s" % list(valid_names)
-        raise spack.config.ConfigError(msg)
+    names = [k for k in modules if k != "prefix_inspections"]
+
+    if not names:
+        raise spack.config.ConfigError(
+            f"Module set configuration is missing. Cannot use module set '{name}'"
+        )
+
+    pretty_names = "', '".join(names)
+
+    raise spack.config.ConfigError(
+        f"Cannot use invalid module set '{name}'.",
+        f"Valid module set names are: '{pretty_names}'.",
+    )
 
 
 _missing_modules_warning = (
@@ -25,7 +25,7 @@
 
 
 # tutorial configuration parameters
-tutorial_branch = "releases/v0.19"
+tutorial_branch = "releases/v0.20"
 tutorial_mirror = "file:///mirror"
 tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")
@@ -37,7 +37,6 @@
     "implicit_rpaths",
     "extra_rpaths",
 ]
-_cache_config_file = []
 
 # TODO: Caches at module level make it difficult to mock configurations in
 # TODO: unit tests. It might be worth reworking their implementation.
@@ -112,36 +111,26 @@ def _to_dict(compiler):
 def get_compiler_config(scope=None, init_config=True):
     """Return the compiler configuration for the specified architecture."""
 
-    def init_compiler_config():
-        """Compiler search used when Spack has no compilers."""
-        compilers = find_compilers()
-        compilers_dict = []
-        for compiler in compilers:
-            compilers_dict.append(_to_dict(compiler))
-        spack.config.set("compilers", compilers_dict, scope=scope)
-
-    config = spack.config.get("compilers", scope=scope)
-    # Update the configuration if there are currently no compilers
-    # configured. Avoid updating automatically if there ARE site
-    # compilers configured but no user ones.
-    if not config and init_config:
-        if scope is None:
-            # We know no compilers were configured in any scope.
-            init_compiler_config()
-            config = spack.config.get("compilers", scope=scope)
-        elif scope == "user":
-            # Check the site config and update the user config if
-            # nothing is configured at the site level.
-            site_config = spack.config.get("compilers", scope="site")
-            sys_config = spack.config.get("compilers", scope="system")
-            if not site_config and not sys_config:
-                init_compiler_config()
-                config = spack.config.get("compilers", scope=scope)
-        return config
-    elif config:
-        return config
-    else:
-        return []  # Return empty list which we will later append to.
+    config = spack.config.get("compilers", scope=scope) or []
+    if config or not init_config:
+        return config
+
+    merged_config = spack.config.get("compilers")
+    if merged_config:
+        return config
+
+    _init_compiler_config(scope=scope)
+    config = spack.config.get("compilers", scope=scope)
+    return config
+
+
+def _init_compiler_config(*, scope):
+    """Compiler search used when Spack has no compilers."""
+    compilers = find_compilers()
+    compilers_dict = []
+    for compiler in compilers:
+        compilers_dict.append(_to_dict(compiler))
+    spack.config.set("compilers", compilers_dict, scope=scope)
 
 
 def compiler_config_files():
@@ -165,52 +154,65 @@ def add_compilers_to_config(compilers, scope=None, init_config=True):
     compiler_config = get_compiler_config(scope, init_config)
     for compiler in compilers:
         compiler_config.append(_to_dict(compiler))
-    global _cache_config_file
-    _cache_config_file = compiler_config
     spack.config.set("compilers", compiler_config, scope=scope)
 
 
 @_auto_compiler_spec
 def remove_compiler_from_config(compiler_spec, scope=None):
-    """Remove compilers from the config, by spec.
+    """Remove compilers from configuration by spec.
+
+    If scope is None, all the scopes are searched for removal.
 
     Arguments:
-        compiler_specs: a list of CompilerSpec objects.
-        scope: configuration scope to modify.
+        compiler_spec: compiler to be removed
+        scope: configuration scope to modify
     """
-    # Need a better way for this
-    global _cache_config_file
+    candidate_scopes = [scope]
+    if scope is None:
+        candidate_scopes = spack.config.config.scopes.keys()
+
+    removal_happened = False
+    for current_scope in candidate_scopes:
+        removal_happened |= _remove_compiler_from_scope(compiler_spec, scope=current_scope)
+
+    return removal_happened
+
+
+def _remove_compiler_from_scope(compiler_spec, scope):
+    """Removes a compiler from a specific configuration scope.
+
+    Args:
+        compiler_spec: compiler to be removed
+        scope: configuration scope under consideration
+
+    Returns:
+        True if one or more compiler entries were actually removed, False otherwise
+    """
+    assert scope is not None, "a specific scope is needed when calling this function"
     compiler_config = get_compiler_config(scope)
     config_length = len(compiler_config)
 
     filtered_compiler_config = [
-        comp
-        for comp in compiler_config
+        compiler_entry
+        for compiler_entry in compiler_config
         if not spack.spec.parse_with_version_concrete(
-            comp["compiler"]["spec"], compiler=True
+            compiler_entry["compiler"]["spec"], compiler=True
        ).satisfies(compiler_spec)
    ]
 
-    # Update the cache for changes
-    _cache_config_file = filtered_compiler_config
-    if len(filtered_compiler_config) == config_length:  # No items removed
-        CompilerSpecInsufficientlySpecificError(compiler_spec)
-    spack.config.set("compilers", filtered_compiler_config, scope=scope)
+    if len(filtered_compiler_config) == len(compiler_config):
+        return False
+
+    # We need to preserve the YAML type for comments, hence we are copying the
+    # items in the list that has just been retrieved
+    compiler_config[:] = filtered_compiler_config
+    spack.config.set("compilers", compiler_config, scope=scope)
+    return True
 
 
 def all_compilers_config(scope=None, init_config=True):
     """Return a set of specs for all the compiler versions currently
     available to build with. These are instances of CompilerSpec.
     """
-    # Get compilers for this architecture.
-    # Create a cache of the config file so we don't load all the time.
-    global _cache_config_file
-    if not _cache_config_file:
-        _cache_config_file = get_compiler_config(scope, init_config)
-        return _cache_config_file
-    else:
-        return _cache_config_file
+    return get_compiler_config(scope, init_config)
 
 
 def all_compiler_specs(scope=None, init_config=True):
@@ -30,7 +30,7 @@
 
 
 def get_valid_fortran_pth(comp_ver):
-    cl_ver = str(comp_ver).split("@")[1]
+    cl_ver = str(comp_ver)
     sort_fn = lambda fc_ver: StrictVersion(fc_ver)
     sort_fc_ver = sorted(list(avail_fc_version), key=sort_fn)
     for ver in sort_fc_ver:
@@ -75,7 +75,7 @@ class Msvc(Compiler):
     # file based on compiler executable path.
 
     def __init__(self, *args, **kwargs):
-        new_pth = [pth if pth else get_valid_fortran_pth(args[0]) for pth in args[3]]
+        new_pth = [pth if pth else get_valid_fortran_pth(args[0].version) for pth in args[3]]
         args[3][:] = new_pth
         super(Msvc, self).__init__(*args, **kwargs)
         if os.getenv("ONEAPI_ROOT"):
@@ -1353,17 +1353,11 @@ def use_configuration(*scopes_or_paths):
     configuration = _config_from(scopes_or_paths)
     config.clear_caches(), configuration.clear_caches()
 
-    # Save and clear the current compiler cache
-    saved_compiler_cache = spack.compilers._cache_config_file
-    spack.compilers._cache_config_file = []
-
     saved_config, config = config, configuration
 
     try:
         yield configuration
     finally:
-        # Restore previous config files
-        spack.compilers._cache_config_file = saved_compiler_cache
         config = saved_config
@@ -17,7 +17,7 @@
         "template": "container/fedora_38.dockerfile",
         "image": "docker.io/fedora:38"
       },
-      "os_package_manager": "yum",
+      "os_package_manager": "dnf",
       "build": "spack/fedora38",
       "build_tags": {
         "develop": "latest"
@@ -31,7 +31,7 @@
         "template": "container/fedora_37.dockerfile",
         "image": "docker.io/fedora:37"
       },
-      "os_package_manager": "yum",
+      "os_package_manager": "dnf",
       "build": "spack/fedora37",
       "build_tags": {
         "develop": "latest"
@@ -45,7 +45,7 @@
         "template": "container/rockylinux_9.dockerfile",
         "image": "docker.io/rockylinux:9"
       },
-      "os_package_manager": "yum",
+      "os_package_manager": "dnf_epel",
       "build": "spack/rockylinux9",
       "build_tags": {
         "develop": "latest"
@@ -59,7 +59,7 @@
         "template": "container/rockylinux_8.dockerfile",
         "image": "docker.io/rockylinux:8"
       },
-      "os_package_manager": "yum",
+      "os_package_manager": "dnf_epel",
       "build": "spack/rockylinux8",
       "build_tags": {
         "develop": "latest"
@@ -73,7 +73,7 @@
         "template": "container/almalinux_9.dockerfile",
         "image": "quay.io/almalinux/almalinux:9"
       },
-      "os_package_manager": "yum",
+      "os_package_manager": "dnf_epel",
       "build": "spack/almalinux9",
       "build_tags": {
         "develop": "latest"
@@ -87,7 +87,7 @@
         "template": "container/almalinux_8.dockerfile",
         "image": "quay.io/almalinux/almalinux:8"
       },
-      "os_package_manager": "yum",
+      "os_package_manager": "dnf_epel",
      "build": "spack/almalinux8",
      "build_tags": {
        "develop": "latest"
@@ -101,7 +101,7 @@
         "template": "container/centos_stream.dockerfile",
         "image": "quay.io/centos/centos:stream"
       },
-      "os_package_manager": "yum",
+      "os_package_manager": "dnf_epel",
       "build": "spack/centos-stream",
       "final": {
         "image": "quay.io/centos/centos:stream"
@@ -185,6 +185,16 @@
       "install": "apt-get -yqq install",
       "clean": "rm -rf /var/lib/apt/lists/*"
     },
+    "dnf": {
+      "update": "dnf update -y",
+      "install": "dnf install -y",
+      "clean": "rm -rf /var/cache/dnf && dnf clean all"
+    },
+    "dnf_epel": {
+      "update": "dnf update -y && dnf install -y epel-release && dnf update -y",
+      "install": "dnf install -y",
+      "clean": "rm -rf /var/cache/dnf && dnf clean all"
+    },
     "yum": {
       "update": "yum update -y && yum install -y epel-release && yum update -y",
       "install": "yum install -y",
@@ -48,7 +48,8 @@ def translated_compiler_name(manifest_compiler_name):
 def compiler_from_entry(entry):
     compiler_name = translated_compiler_name(entry["name"])
     paths = entry["executables"]
-    version = entry["version"]
+    # to instantiate a compiler class we may need a concrete version:
+    version = "={}".format(entry["version"])
     arch = entry["arch"]
     operating_system = arch["os"]
     target = arch["target"]
@@ -1221,28 +1221,27 @@ def remove(self, query_spec, list_name=user_speclist_name, force=False):
         old_specs = set(self.user_specs)
         new_specs = set()
         for spec in matches:
-            if spec in list_to_change:
-                try:
-                    list_to_change.remove(spec)
-                    self.update_stale_references(list_name)
-                    new_specs = set(self.user_specs)
-                except spack.spec_list.SpecListError:
-                    # define new specs list
-                    new_specs = set(self.user_specs)
-                    msg = f"Spec '{spec}' is part of a spec matrix and "
-                    msg += f"cannot be removed from list '{list_to_change}'."
-                    if force:
-                        msg += " It will be removed from the concrete specs."
-                        # Mock new specs, so we can remove this spec from concrete spec lists
-                        new_specs.remove(spec)
-                    tty.warn(msg)
+            if spec not in list_to_change:
+                continue
+            try:
+                list_to_change.remove(spec)
+                self.update_stale_references(list_name)
+                new_specs = set(self.user_specs)
+            except spack.spec_list.SpecListError:
+                # define new specs list
+                new_specs = set(self.user_specs)
+                msg = f"Spec '{spec}' is part of a spec matrix and "
+                msg += f"cannot be removed from list '{list_to_change}'."
+                if force:
+                    msg += " It will be removed from the concrete specs."
+                    # Mock new specs, so we can remove this spec from concrete spec lists
+                    new_specs.remove(spec)
+                tty.warn(msg)
             else:
-                if list_name == user_speclist_name:
-                    self.manifest.remove_user_spec(str(spec))
+                if list_name == user_speclist_name:
+                    for user_spec in matches:
+                        self.manifest.remove_user_spec(str(user_spec))
                 else:
-                    self.manifest.remove_definition(str(spec), list_name=list_name)
+                    for user_spec in matches:
+                        self.manifest.remove_definition(str(user_spec), list_name=list_name)
 
         # If force, update stale concretized specs
         for spec in old_specs - new_specs:
@@ -1352,6 +1351,10 @@ def concretize(self, force=False, tests=False):
             self.concretized_order = []
             self.specs_by_hash = {}
 
+        # Remove concrete specs that no longer correlate to a user spec
+        for spec in set(self.concretized_user_specs) - set(self.user_specs):
+            self.deconcretize(spec)
+
         # Pick the right concretization strategy
         if self.unify == "when_possible":
             return self._concretize_together_where_possible(tests=tests)
@@ -1365,6 +1368,16 @@ def concretize(self, force=False, tests=False):
         msg = "concretization strategy not implemented [{0}]"
         raise SpackEnvironmentError(msg.format(self.unify))
 
+    def deconcretize(self, spec):
+        # spec has to be a root of the environment
+        index = self.concretized_user_specs.index(spec)
+        dag_hash = self.concretized_order.pop(index)
+        del self.concretized_user_specs[index]
+
+        # If this was the only user spec that concretized to this concrete spec, remove it
+        if dag_hash not in self.concretized_order:
+            del self.specs_by_hash[dag_hash]
+
     def _get_specs_to_concretize(
         self,
     ) -> Tuple[Set[spack.spec.Spec], Set[spack.spec.Spec], List[spack.spec.Spec]]:
@@ -1402,6 +1415,10 @@ def _concretize_together_where_possible(
         if not new_user_specs:
             return []
 
+        old_concrete_to_abstract = {
+            concrete: abstract for (abstract, concrete) in self.concretized_specs()
+        }
+
         self.concretized_user_specs = []
         self.concretized_order = []
         self.specs_by_hash = {}
@@ -1413,11 +1430,13 @@ def _concretize_together_where_possible(
 
         result = []
         for abstract, concrete in sorted(result_by_user_spec.items()):
+            # If the "abstract" spec is a concrete spec from the previous concretization
+            # translate it back to an abstract spec. Otherwise, keep the abstract spec
+            abstract = old_concrete_to_abstract.get(abstract, abstract)
             if abstract in new_user_specs:
                 result.append((abstract, concrete))
             else:
                 assert (abstract, concrete) in result
             self._add_concrete_spec(abstract, concrete)
 
         return result
 
     def _concretize_together(
@@ -1436,7 +1455,7 @@ def _concretize_together(
         self.specs_by_hash = {}
 
         try:
-            concrete_specs = spack.concretize.concretize_specs_together(
+            concrete_specs: List[spack.spec.Spec] = spack.concretize.concretize_specs_together(
                 *specs_to_concretize, tests=tests
             )
         except spack.error.UnsatisfiableSpecError as e:
@@ -1455,11 +1474,14 @@ def _concretize_together(
             )
             raise
 
-        # zip truncates the longer list, which is exactly what we want here
-        concretized_specs = [x for x in zip(new_user_specs | kept_user_specs, concrete_specs)]
+        # set() | set() does not preserve ordering, even though sets are ordered
+        ordered_user_specs = list(new_user_specs) + list(kept_user_specs)
+        concretized_specs = [x for x in zip(ordered_user_specs, concrete_specs)]
         for abstract, concrete in concretized_specs:
             self._add_concrete_spec(abstract, concrete)
-        return concretized_specs
+
+        # zip truncates the longer list, which is exactly what we want here
+        return list(zip(new_user_specs, concrete_specs))
 
     def _concretize_separately(self, tests=False):
         """Concretization strategy that concretizes separately one
@@ -215,6 +215,31 @@ def print_message(logger: LogType, msg: str, verbose: bool = False):
         tty.info(msg, format="g")
 
 
+def overall_status(current_status: "TestStatus", substatuses: List["TestStatus"]) -> "TestStatus":
+    """Determine the overall status based on the current and associated sub status values.
+
+    Args:
+        current_status: current overall status, assumed to default to PASSED
+        substatuses: status of each test part or overall status of each test spec
+    Returns:
+        test status encompassing the main test and all subtests
+    """
+    if current_status in [TestStatus.SKIPPED, TestStatus.NO_TESTS, TestStatus.FAILED]:
+        return current_status
+
+    skipped = 0
+    for status in substatuses:
+        if status == TestStatus.FAILED:
+            return status
+        elif status == TestStatus.SKIPPED:
+            skipped += 1
+
+    if skipped and skipped == len(substatuses):
+        return TestStatus.SKIPPED
+
+    return current_status
+
+
 class PackageTest:
     """The class that manages stand-alone (post-install) package tests."""
 
@@ -308,14 +333,12 @@ def status(self, name: str, status: "TestStatus", msg: Optional[str] = None):
         # to start with the same name) may not have PASSED. This extra
         # check is used to ensure the containing test part is not claiming
         # to have passed when at least one subpart failed.
-        if status == TestStatus.PASSED:
-            for pname, substatus in self.test_parts.items():
-                if pname != part_name and pname.startswith(part_name):
-                    if substatus == TestStatus.FAILED:
-                        print(f"{substatus}: {part_name}{extra}")
-                        self.test_parts[part_name] = substatus
-                        self.counts[substatus] += 1
-                        return
+        substatuses = []
+        for pname, substatus in self.test_parts.items():
+            if pname != part_name and pname.startswith(part_name):
+                substatuses.append(substatus)
+        if substatuses:
+            status = overall_status(status, substatuses)
 
         print(f"{status}: {part_name}{extra}")
         self.test_parts[part_name] = status
@@ -420,6 +443,26 @@ def summarize(self):
         lines.append(f"{totals:=^80}")
         return lines
 
+    def write_tested_status(self):
+        """Write the overall status to the tested file.
+
+        If there any test part failures, then the tests failed. If all test
+        parts are skipped, then the tests were skipped. If any tests passed
+        then the tests passed; otherwise, there were not tests executed.
+        """
+        status = TestStatus.NO_TESTS
+        if self.counts[TestStatus.FAILED] > 0:
+            status = TestStatus.FAILED
+        else:
+            skipped = self.counts[TestStatus.SKIPPED]
+            if skipped and self.parts() == skipped:
+                status = TestStatus.SKIPPED
+            elif self.counts[TestStatus.PASSED] > 0:
+                status = TestStatus.PASSED
+
+        with open(self.tested_file, "w") as f:
+            f.write(f"{status.value}\n")
+
 
 @contextlib.contextmanager
 def test_part(pkg: Pb, test_name: str, purpose: str, work_dir: str = ".", verbose: bool = False):
@@ -654,8 +697,9 @@ def process_test_parts(pkg: Pb, test_specs: List[spack.spec.Spec], verbose: bool
         try:
             tests = test_functions(spec.package_class)
         except spack.repo.UnknownPackageError:
-            # some virtuals don't have a package
-            tests = []
+            # Some virtuals don't have a package so we don't want to report
+            # them as not having tests when that isn't appropriate.
+            continue
 
         if len(tests) == 0:
             tester.status(spec.name, TestStatus.NO_TESTS)
@@ -682,7 +726,7 @@ def process_test_parts(pkg: Pb, test_specs: List[spack.spec.Spec], verbose: bool
 
     finally:
         if tester.ran_tests():
-            fs.touch(tester.tested_file)
+            tester.write_tested_status()
 
         # log one more test message to provide a completion timestamp
         # for CDash reporting
@@ -889,20 +933,15 @@ def __call__(self, *args, **kwargs):
                 if remove_directory:
                     shutil.rmtree(test_dir)
 
-                tested = os.path.exists(self.tested_file_for_spec(spec))
-                if tested:
-                    status = TestStatus.PASSED
-                else:
-                    self.ensure_stage()
-                    if spec.external and not externals:
-                        status = TestStatus.SKIPPED
-                    elif not spec.installed:
-                        status = TestStatus.SKIPPED
-                    else:
-                        status = TestStatus.NO_TESTS
+                status = self.test_status(spec, externals)
                 self.counts[status] += 1
-
                 self.write_test_result(spec, status)
+
             except SkipTest:
                 status = TestStatus.SKIPPED
                 self.counts[status] += 1
                 self.write_test_result(spec, TestStatus.SKIPPED)
 
             except BaseException as exc:
                 status = TestStatus.FAILED
                 self.counts[status] += 1
@@ -939,6 +978,31 @@ def __call__(self, *args, **kwargs):
         if failures:
             raise TestSuiteFailure(failures)
 
+    def test_status(self, spec: spack.spec.Spec, externals: bool) -> Optional[TestStatus]:
+        """Determine the overall test results status for the spec.
+
+        Args:
+            spec: instance of the spec under test
+            externals: ``True`` if externals are to be tested, else ``False``
+
+        Returns:
+            the spec's test status if available or ``None``
+        """
+        tests_status_file = self.tested_file_for_spec(spec)
+        if not os.path.exists(tests_status_file):
+            self.ensure_stage()
+            if spec.external and not externals:
+                status = TestStatus.SKIPPED
+            elif not spec.installed:
+                status = TestStatus.SKIPPED
+            else:
+                status = TestStatus.NO_TESTS
+            return status
+
+        with open(tests_status_file, "r") as f:
+            value = (f.read()).strip("\n")
+            return TestStatus(int(value)) if value else TestStatus.NO_TESTS
+
     def ensure_stage(self):
         """Ensure the test suite stage directory exists."""
         if not os.path.exists(self.stage):
@@ -40,7 +40,7 @@

import llnl.util.filesystem
import llnl.util.tty as tty
from llnl.util.lang import dedupe
from llnl.util.lang import dedupe, memoized

import spack.build_environment
import spack.config
@@ -170,17 +170,10 @@ def merge_config_rules(configuration, spec):
    Returns:
        dict: actions to be taken on the spec passed as an argument
    """
    # Get the top-level configuration for the module type we are using
    module_specific_configuration = copy.deepcopy(configuration)

    # Construct a dictionary with the actions we need to perform on the spec
    # passed as a parameter

    # The keyword 'all' is always evaluated first, all the others are
    # evaluated in order of appearance in the module file
    spec_configuration = module_specific_configuration.pop("all", {})
    for constraint, action in module_specific_configuration.items():
    spec_configuration = copy.deepcopy(configuration.get("all", {}))
    for constraint, action in configuration.items():
        if spec.satisfies(constraint):
            if hasattr(constraint, "override") and constraint.override:
                spec_configuration = {}
@@ -200,14 +193,14 @@ def merge_config_rules(configuration, spec):
    # configuration

    # Hash length in module files
    hash_length = module_specific_configuration.get("hash_length", 7)
    hash_length = configuration.get("hash_length", 7)
    spec_configuration["hash_length"] = hash_length

    verbose = module_specific_configuration.get("verbose", False)
    verbose = configuration.get("verbose", False)
    spec_configuration["verbose"] = verbose

    # module defaults per-package
    defaults = module_specific_configuration.get("defaults", [])
    defaults = configuration.get("defaults", [])
    spec_configuration["defaults"] = defaults

    return spec_configuration
@@ -678,7 +671,14 @@ def configure_options(self):
        # the configure option section
        return None

    def modification_needs_formatting(self, modification):
        """Returns True if environment modification entry needs to be formatted."""
        return (
            not isinstance(modification, (spack.util.environment.SetEnv)) or not modification.raw
        )

    @tengine.context_property
    @memoized
    def environment_modifications(self):
        """List of environment modifications to be processed."""
        # Modifications guessed by inspecting the spec prefix
@@ -740,15 +740,29 @@ def environment_modifications(self):
                _check_tokens_are_valid(x.name, message=msg)
                # Transform them
                x.name = spec.format(x.name, transform=transform)
            try:
                # Not every command has a value
                x.value = spec.format(x.value)
            except AttributeError:
                pass
            if self.modification_needs_formatting(x):
                try:
                    # Not every command has a value
                    x.value = spec.format(x.value)
                except AttributeError:
                    pass
            x.name = str(x.name).replace("-", "_")

        return [(type(x).__name__, x) for x in env if x.name not in exclude]

    @tengine.context_property
    def has_manpath_modifications(self):
        """True if MANPATH environment variable is modified."""
        for modification_type, cmd in self.environment_modifications:
            if not isinstance(
                cmd, (spack.util.environment.PrependPath, spack.util.environment.AppendPath)
            ):
                continue
            if cmd.name == "MANPATH":
                return True
        else:
            return False

    @tengine.context_property
    def autoload(self):
        """List of modules that needs to be loaded automatically."""
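The `raw` guard above decides whether a modification's value is run through `spec.format()` before landing in a module file. A self-contained sketch of that predicate, using a stub class instead of `spack.util.environment` (the stub's fields are assumptions for illustration only):

```python
from dataclasses import dataclass

@dataclass
class SetEnv:  # stub standing in for spack.util.environment.SetEnv
    name: str
    value: str
    raw: bool = False  # raw=True means "do not expand spec format tokens"

def needs_formatting(modification) -> bool:
    # Mirrors modification_needs_formatting() above: everything gets
    # formatted except a SetEnv explicitly marked raw.
    return not isinstance(modification, SetEnv) or not modification.raw

assert needs_formatting(SetEnv("FOO", "{name}-{version}")) is True
assert needs_formatting(SetEnv("FOO", "literal {braces}", raw=True)) is False
```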
@@ -7,7 +7,7 @@

import itertools
import os.path
import posixpath
from typing import Any, Dict
from typing import Any, Dict, List

import llnl.util.lang as lang
@@ -56,7 +56,7 @@ def make_context(spec, module_set_name, explicit):
    return LmodContext(conf)


def guess_core_compilers(name, store=False):
def guess_core_compilers(name, store=False) -> List[spack.spec.CompilerSpec]:
    """Guesses the list of core compilers installed in the system.

    Args:
@@ -64,21 +64,19 @@ def guess_core_compilers(name, store=False):
            modules.yaml configuration file

    Returns:
        List of core compilers, if found, or None
        List of found core compilers
    """
    core_compilers = []
    for compiler_config in spack.compilers.all_compilers_config():
    for compiler in spack.compilers.all_compilers():
        try:
            compiler = compiler_config["compiler"]
            # A compiler is considered to be a core compiler if any of the
            # C, C++ or Fortran compilers reside in a system directory
            is_system_compiler = any(
                os.path.dirname(x) in spack.util.environment.SYSTEM_DIRS
                for x in compiler["paths"].values()
                if x is not None
                os.path.dirname(getattr(compiler, x, "")) in spack.util.environment.SYSTEM_DIRS
                for x in ("cc", "cxx", "f77", "fc")
            )
            if is_system_compiler:
                core_compilers.append(str(compiler["spec"]))
                core_compilers.append(compiler.spec)
        except (KeyError, TypeError, AttributeError):
            continue
@@ -89,10 +87,10 @@ def guess_core_compilers(name, store=False):
        modules_cfg = spack.config.get(
            "modules:" + name, {}, scope=spack.config.default_modify_scope()
        )
        modules_cfg.setdefault("lmod", {})["core_compilers"] = core_compilers
        modules_cfg.setdefault("lmod", {})["core_compilers"] = [str(x) for x in core_compilers]
        spack.config.set("modules:" + name, modules_cfg, scope=spack.config.default_modify_scope())

    return core_compilers or None
    return core_compilers

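The "core compiler" test above boils down to: any of the cc/cxx/f77/fc binaries living in a system directory. A minimal sketch of that predicate (the `SYSTEM_DIRS` contents here are an illustrative assumption; Spack's real list lives in `spack.util.environment`):

```python
import os
from types import SimpleNamespace

SYSTEM_DIRS = ["/usr/bin", "/bin", "/usr/local/bin"]  # illustrative subset

def is_core_compiler(compiler) -> bool:
    # A compiler counts as "Core" if any of its C/C++/Fortran binaries
    # resides in a system directory, per guess_core_compilers() above.
    # "or ''" guards against missing entries such as f77=None.
    return any(
        os.path.dirname(getattr(compiler, attr, "") or "") in SYSTEM_DIRS
        for attr in ("cc", "cxx", "f77", "fc")
    )

gcc = SimpleNamespace(cc="/usr/bin/gcc", cxx="/usr/bin/g++", f77=None, fc=None)
assert is_core_compiler(gcc)
```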

class LmodConfiguration(BaseConfiguration):
@@ -104,7 +102,7 @@ class LmodConfiguration(BaseConfiguration):
    default_projections = {"all": posixpath.join("{name}", "{version}")}

    @property
    def core_compilers(self):
    def core_compilers(self) -> List[spack.spec.CompilerSpec]:
        """Returns the list of "Core" compilers

        Raises:
@@ -112,14 +110,18 @@ def core_compilers(self):
            specified in the configuration file or the sequence
            is empty
        """
        value = configuration(self.name).get("core_compilers") or guess_core_compilers(
            self.name, store=True
        )
        compilers = [
            spack.spec.CompilerSpec(c) for c in configuration(self.name).get("core_compilers", [])
        ]

        if not value:
        if not compilers:
            compilers = guess_core_compilers(self.name, store=True)

        if not compilers:
            msg = 'the key "core_compilers" must be set in modules.yaml'
            raise CoreCompilersNotFoundError(msg)
        return value

        return compilers

    @property
    def core_specs(self):
@@ -132,6 +134,7 @@ def filter_hierarchy_specs(self):
        return configuration(self.name).get("filter_hierarchy_specs", {})

    @property
    @lang.memoized
    def hierarchy_tokens(self):
        """Returns the list of tokens that are part of the modulefile
        hierarchy. 'compiler' is always present.
@@ -156,6 +159,7 @@ def hierarchy_tokens(self):
        return tokens

    @property
    @lang.memoized
    def requires(self):
        """Returns a dictionary mapping all the requirements of this spec
        to the actual provider. 'compiler' is always present among the
@@ -222,6 +226,7 @@ def available(self):
        return available

    @property
    @lang.memoized
    def missing(self):
        """Returns the list of tokens that are not available."""
        return [x for x in self.hierarchy_tokens if x not in self.available]
@@ -283,16 +288,18 @@ def token_to_path(self, name, value):

        # If we are dealing with a core compiler, return 'Core'
        core_compilers = self.conf.core_compilers
        if name == "compiler" and str(value) in core_compilers:
        if name == "compiler" and any(
            spack.spec.CompilerSpec(value).satisfies(c) for c in core_compilers
        ):
            return "Core"

        # CompilerSpec does not have an hash, as we are not allowed to
        # CompilerSpec does not have a hash, as we are not allowed to
        # use different flavors of the same compiler
        if name == "compiler":
            return path_part_fmt.format(token=value)

        # In case the hierarchy token refers to a virtual provider
        # we need to append an hash to the version to distinguish
        # we need to append a hash to the version to distinguish
        # among flavors of the same library (e.g. openblas~openmp vs.
        # openblas+openmp)
        path = path_part_fmt.format(token=value)
@@ -313,6 +320,7 @@ def available_path_parts(self):
        return parts

    @property
    @lang.memoized
    def unlocked_paths(self):
        """Returns a dictionary mapping conditions to a list of unlocked
        paths.
@@ -424,6 +432,7 @@ def missing(self):
        return self.conf.missing

    @tengine.context_property
    @lang.memoized
    def unlocked_paths(self):
        """Returns the list of paths that are unlocked unconditionally."""
        layout = make_layout(self.spec, self.conf.name, self.conf.explicit)
@@ -1231,6 +1231,7 @@ def dependencies_of_type(cls, *deptypes):
        if any(dt in cls.dependencies[name][cond].type for cond in conds for dt in deptypes)
    )

    # TODO: allow more than one active extendee.
    @property
    def extendee_spec(self):
        """
@@ -1246,7 +1247,6 @@ def extendee_spec(self):
            if dep.name in self.extendees:
                deps.append(dep)

        # TODO: allow more than one active extendee.
        if deps:
            assert len(deps) == 1
            return deps[0]
@@ -1256,7 +1256,6 @@ def extendee_spec(self):
        if self.spec._concrete:
            return None
        else:
            # TODO: do something sane here with more than one extendee
            # If it's not concrete, then return the spec from the
            # extends() directive since that is all we know so far.
            spec_str, kwargs = next(iter(self.extendees.items()))
@@ -2017,7 +2016,8 @@ def test_title(purpose, test_name):
            # stack instead of from traceback.
            # The traceback is truncated here, so we can't use it to
            # traverse the stack.
            m = "\n".join(spack.build_environment.get_package_context(tb))
            context = spack.build_environment.get_package_context(tb)
            m = "\n".join(context) if context else ""

        exc = e  # e is deleted after this block

@@ -37,7 +37,9 @@


def slingshot_network():
    return os.path.exists("/opt/cray/pe") and os.path.exists("/lib64/libcxi.so")
    return os.path.exists("/opt/cray/pe") and (
        os.path.exists("/lib64/libcxi.so") or os.path.exists("/usr/lib64/libcxi.so")
    )


def _target_name_from_craype_target_name(name):
@@ -686,7 +686,7 @@ def is_relocatable(spec):
    Raises:
        ValueError: if the spec is not installed
    """
    if not spec.install_status():
    if not spec.installed:
        raise ValueError("spec is not installed [{0}]".format(str(spec)))

    if spec.external or spec.virtual:
@@ -861,9 +861,9 @@ class SpackSolverSetup(object):
    def __init__(self, tests=False):
        self.gen = None  # set by setup()

        self.declared_versions = {}
        self.possible_versions = {}
        self.deprecated_versions = {}
        self.declared_versions = collections.defaultdict(list)
        self.possible_versions = collections.defaultdict(set)
        self.deprecated_versions = collections.defaultdict(set)

        self.possible_virtuals = None
        self.possible_compilers = []
@@ -1669,9 +1669,34 @@ class Body(object):
                    if concrete_build_deps or dtype != "build":
                        clauses.append(fn.attr("depends_on", spec.name, dep.name, dtype))

                # Ensure Spack will not co-concretize this with another provider
                # for the same virtual
                for virtual in dep.package.virtuals_provided:
                # TODO: We have to look up info from package.py here, but we'd
                # TODO: like to avoid this entirely. We should not need to look
                # TODO: up potentially wrong info if we have virtual edge info.
                try:
                    try:
                        pkg = dep.package

                    except spack.repo.UnknownNamespaceError:
                        # Try to look up the package of the same name and use its
                        # providers. This is as good as we can do without edge info.
                        pkg_class = spack.repo.path.get_pkg_class(dep.name)
                        spec = spack.spec.Spec(f"{dep.name}@{dep.version}")
                        pkg = pkg_class(spec)

                    virtuals = pkg.virtuals_provided

                except spack.repo.UnknownPackageError:
                    # Skip virtual node constraints for renamed/deleted packages,
                    # so their binaries can still be installed.
                    # NOTE: with current specs (which lack edge attributes) this
                    # can allow concretizations with two providers, but it's unlikely.
                    continue

                # Don't concretize with two providers of the same virtual.
                # See above for exception for unknown packages.
                # TODO: we will eventually record provider information on edges,
                # TODO: which avoids the need for the package lookup above.
                for virtual in virtuals:
                    clauses.append(fn.attr("virtual_node", virtual.name))
                    clauses.append(fn.provider(dep.name, virtual.name))
@@ -1697,10 +1722,6 @@ class Body(object):

    def build_version_dict(self, possible_pkgs):
        """Declare any versions in specs not declared in packages."""
        self.declared_versions = collections.defaultdict(list)
        self.possible_versions = collections.defaultdict(set)
        self.deprecated_versions = collections.defaultdict(set)

        packages_yaml = spack.config.get("packages")
        packages_yaml = _normalize_packages_yaml(packages_yaml)
        for pkg_name in possible_pkgs:
@@ -1734,13 +1755,47 @@ def key_fn(item):
        # All the preferred version from packages.yaml, versions in external
        # specs will be computed later
        version_preferences = packages_yaml.get(pkg_name, {}).get("version", [])
        for idx, v in enumerate(version_preferences):
            # v can be a string so force it into an actual version for comparisons
            ver = vn.Version(v)
        version_defs = []
        pkg_class = spack.repo.path.get_pkg_class(pkg_name)
        for vstr in version_preferences:
            v = vn.ver(vstr)
            if isinstance(v, vn.GitVersion):
                version_defs.append(v)
            else:
                satisfying_versions = self._check_for_defined_matching_versions(pkg_class, v)
                # Amongst all defined versions satisfying this specific
                # preference, the highest-numbered version is the
                # most-preferred: therefore sort satisfying versions
                # from greatest to least
                version_defs.extend(sorted(satisfying_versions, reverse=True))

        for weight, vdef in enumerate(llnl.util.lang.dedupe(version_defs)):
            self.declared_versions[pkg_name].append(
                DeclaredVersion(version=ver, idx=idx, origin=Provenance.PACKAGES_YAML)
                DeclaredVersion(version=vdef, idx=weight, origin=Provenance.PACKAGES_YAML)
            )
            self.possible_versions[pkg_name].add(ver)
            self.possible_versions[pkg_name].add(vdef)

    def _check_for_defined_matching_versions(self, pkg_class, v):
        """Given a version specification (which may be a concrete version,
        range, etc.), determine if any package.py version declarations
        or externals define a version which satisfies it.

        This is primarily for determining whether a version request (e.g.
        version preferences, which should not themselves define versions)
        refers to a defined version.

        This function raises an exception if no satisfying versions are
        found.
        """
        pkg_name = pkg_class.name
        satisfying_versions = list(x for x in pkg_class.versions if x.satisfies(v))
        satisfying_versions.extend(x for x in self.possible_versions[pkg_name] if x.satisfies(v))
        if not satisfying_versions:
            raise spack.config.ConfigError(
                "Preference for version {0} does not match any version"
                " defined for {1} (in its package.py or any external)".format(str(v), pkg_name)
            )
        return satisfying_versions

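In practice this means a packages.yaml version preference must name (or truncate to) a version the package already defines. A minimal illustration, mirroring `test_preferred_truncated` and `test_preferred_undefined_raises` later in this diff (the version numbers are examples from those tests):

```python
# With this preference, "3.5" is parsed as a version range, so the solver
# selects the highest version declared in python's package.py that satisfies
# it (e.g. 3.5.1) instead of inventing a new "3.5" version. A preference that
# matches nothing (e.g. "3.5.0.1") now raises spack.config.ConfigError.
packages_yaml = """
packages:
  python:
    version: ["3.5"]
"""
```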
    def add_concrete_versions_from_specs(self, specs, origin):
        """Add concrete versions to possible versions from lists of CLI/dev specs."""
@@ -2173,14 +2228,6 @@ def setup(self, driver, specs, reuse=None):
        # get possible compilers
        self.possible_compilers = self.generate_possible_compilers(specs)

        # traverse all specs and packages to build dict of possible versions
        self.build_version_dict(possible)
        self.add_concrete_versions_from_specs(specs, Provenance.SPEC)
        self.add_concrete_versions_from_specs(dev_specs, Provenance.DEV_SPEC)

        req_version_specs = _get_versioned_specs_from_pkg_requirements()
        self.add_concrete_versions_from_specs(req_version_specs, Provenance.PACKAGE_REQUIREMENT)

        self.gen.h1("Concrete input spec definitions")
        self.define_concrete_input_specs(specs, possible)
@@ -2208,6 +2255,14 @@ def setup(self, driver, specs, reuse=None):
        self.provider_requirements()
        self.external_packages()

        # traverse all specs and packages to build dict of possible versions
        self.build_version_dict(possible)
        self.add_concrete_versions_from_specs(specs, Provenance.SPEC)
        self.add_concrete_versions_from_specs(dev_specs, Provenance.DEV_SPEC)

        req_version_specs = self._get_versioned_specs_from_pkg_requirements()
        self.add_concrete_versions_from_specs(req_version_specs, Provenance.PACKAGE_REQUIREMENT)

        self.gen.h1("Package Constraints")
        for pkg in sorted(self.pkgs):
            self.gen.h2("Package rules: %s" % pkg)
@@ -2254,55 +2309,78 @@ def literal_specs(self, specs):
        if self.concretize_everything:
            self.gen.fact(fn.concretize_everything())

    def _get_versioned_specs_from_pkg_requirements(self):
        """If package requirements mention versions that are not mentioned
        elsewhere, then we need to collect those to mark them as possible
        versions.
        """
        req_version_specs = list()
        config = spack.config.get("packages")
        for pkg_name, d in config.items():
            if pkg_name == "all":
                continue
            if "require" in d:
                req_version_specs.extend(self._specs_from_requires(pkg_name, d["require"]))
        return req_version_specs

def _get_versioned_specs_from_pkg_requirements():
    """If package requirements mention versions that are not mentioned
    elsewhere, then we need to collect those to mark them as possible
    versions.
    """
    req_version_specs = list()
    config = spack.config.get("packages")
    for pkg_name, d in config.items():
        if pkg_name == "all":
            continue
        if "require" in d:
            req_version_specs.extend(_specs_from_requires(pkg_name, d["require"]))
    return req_version_specs


def _specs_from_requires(pkg_name, section):
    if isinstance(section, str):
        spec = spack.spec.Spec(section)
        if not spec.name:
            spec.name = pkg_name
        extracted_specs = [spec]
    else:
        spec_strs = []
        for spec_group in section:
            if isinstance(spec_group, str):
                spec_strs.append(spec_group)
            else:
                # Otherwise it is an object. The object can contain a single
                # "spec" constraint, or a list of them with "any_of" or
                # "one_of" policy.
                if "spec" in spec_group:
                    new_constraints = [spec_group["spec"]]
                else:
                    key = "one_of" if "one_of" in spec_group else "any_of"
                    new_constraints = spec_group[key]
                spec_strs.extend(new_constraints)

        extracted_specs = []
        for spec_str in spec_strs:
            spec = spack.spec.Spec(spec_str)
    def _specs_from_requires(self, pkg_name, section):
        """Collect specs from requirements which define versions (i.e. those that
        have a concrete version). Requirements can define *new* versions if
        they are included as part of an equivalence (hash=number) but not
        otherwise.
        """
        if isinstance(section, str):
            spec = spack.spec.Spec(section)
            if not spec.name:
                spec.name = pkg_name
            extracted_specs.append(spec)
            extracted_specs = [spec]
        else:
            spec_strs = []
            for spec_group in section:
                if isinstance(spec_group, str):
                    spec_strs.append(spec_group)
                else:
                    # Otherwise it is an object. The object can contain a single
                    # "spec" constraint, or a list of them with "any_of" or
                    # "one_of" policy.
                    if "spec" in spec_group:
                        new_constraints = [spec_group["spec"]]
                    else:
                        key = "one_of" if "one_of" in spec_group else "any_of"
                        new_constraints = spec_group[key]
                    spec_strs.extend(new_constraints)

        version_specs = [x for x in extracted_specs if x.versions.concrete]
        for spec in version_specs:
            spec.attach_git_version_lookup()
        return version_specs
            extracted_specs = []
            for spec_str in spec_strs:
                spec = spack.spec.Spec(spec_str)
                if not spec.name:
                    spec.name = pkg_name
                extracted_specs.append(spec)

        version_specs = []
        for spec in extracted_specs:
            if spec.versions.concrete:
                # Note: this includes git versions
                version_specs.append(spec)
                continue

            # Prefer spec's name if it exists, in case the spec is
            # requiring a specific implementation inside of a virtual section
            # e.g. packages:mpi:require:openmpi@4.0.1
            pkg_class = spack.repo.path.get_pkg_class(spec.name or pkg_name)
            satisfying_versions = self._check_for_defined_matching_versions(
                pkg_class, spec.versions
            )

            # Version ranges ("@1.3" without the "=", "@1.2:1.4") and lists
            # will end up here
            ordered_satisfying_versions = sorted(satisfying_versions, reverse=True)
            vspecs = list(spack.spec.Spec("@{0}".format(x)) for x in ordered_satisfying_versions)
            version_specs.extend(vspecs)

        for spec in version_specs:
            spec.attach_git_version_lookup()
        return version_specs

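The docstring above is the key behavioral change: a requirement can only *define* a brand-new version through a Git/hash equivalence (hash=number); a plain numbered version must already be defined. A small illustration, matching the `test_require_undefined_version` / `test_require_truncated` cases near the end of this diff (the mock package "x" defines 1.1 among its versions, per those tests):

```python
ok_conf = """
packages:
  x:
    require: "@1"        # a range: concretizes to the defined 1.1
"""
bad_conf = """
packages:
  x:
    require: "@1.2"      # defined nowhere: raises spack.config.ConfigError
"""
```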
class SpecBuilder(object):
|
||||
@@ -2758,12 +2836,13 @@ class InternalConcretizerError(spack.error.UnsatisfiableSpecError):
|
||||
"""
|
||||
|
||||
def __init__(self, provided, conflicts):
|
||||
indented = [" %s\n" % conflict for conflict in conflicts]
|
||||
error_msg = "".join(indented)
|
||||
msg = "Spack concretizer internal error. Please submit a bug report"
|
||||
msg += "\n Please include the command, environment if applicable,"
|
||||
msg += "\n and the following error message."
|
||||
msg = "\n %s is unsatisfiable, errors are:\n%s" % (provided, error_msg)
|
||||
msg = (
|
||||
"Spack concretizer internal error. Please submit a bug report and include the "
|
||||
"command, environment if applicable and the following error message."
|
||||
f"\n {provided} is unsatisfiable, errors are:"
|
||||
)
|
||||
|
||||
msg += "".join([f"\n {conflict}" for conflict in conflicts])
|
||||
|
||||
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)
|
||||
|
||||
|
@@ -50,6 +50,7 @@
"""
import collections
import collections.abc
import enum
import io
import itertools
import os
@@ -173,6 +174,16 @@
SPECFILE_FORMAT_VERSION = 3


# InstallStatus is used to map install statuses to symbols for display
# Options are artificially disjoint for display purposes
class InstallStatus(enum.Enum):
    installed = "@g{[+]} "
    upstream = "@g{[^]} "
    external = "@g{[e]} "
    absent = "@K{ - } "
    missing = "@r{[-]} "

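Each enum value doubles as the colorized marker that `Spec.tree` prints, which is why the `tree` hunk further down can emit `status.value` directly (the `@g{...}` markup is `llnl.util.tty.color` syntax). A self-contained sketch of the dispatch, with a stub enum shaped like the one added above and the boolean fallbacks the `tree` hunk keeps:

```python
import enum

class InstallStatus(enum.Enum):  # stub, same shape as the enum added above
    installed = "@g{[+]} "
    upstream = "@g{[^]} "
    absent = "@K{ - } "

def status_symbol(status) -> str:
    # tree() prints status.value directly for InstallStatus members and
    # keeps boolean fallbacks ([+] installed, [-] missing) otherwise.
    if status in list(InstallStatus):
        return status.value
    return "@g{[+]} " if status else "@r{[-]} "

assert status_symbol(InstallStatus.upstream) == "@g{[^]} "
assert status_symbol(False) == "@r{[-]} "
```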

def colorize_spec(spec):
    """Returns a spec colorized according to the colors specified in
    color_formats."""
@@ -679,6 +690,16 @@ def from_dict(d):
        d = d["compiler"]
        return CompilerSpec(d["name"], vn.VersionList.from_dict(d))

    @property
    def display_str(self):
        """Equivalent to {compiler.name}{@compiler.version} for Specs, without extra
        @= for readability."""
        if self.concrete:
            return f"{self.name}@{self.version}"
        elif self.versions != vn.any_version:
            return f"{self.name}@{self.versions}"
        return self.name

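`display_str` exists because concrete compiler versions now stringify with the `@=` marker; for human-facing output the plain `@` form reads better. A hypothetical before/after (the exact spec strings are illustrative, not taken from the diff):

```python
# str() keeps the exact-version marker, display_str drops it:
#   str(CompilerSpec("gcc@=8.4.0"))         -> "gcc@=8.4.0"
#   CompilerSpec("gcc@=8.4.0").display_str  -> "gcc@8.4.0"
#   CompilerSpec("gcc@8:").display_str      -> "gcc@8:"  (range kept as-is)
#   CompilerSpec("gcc").display_str         -> "gcc"     (no version part)
```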
    def __str__(self):
        out = self.name
        if self.versions and self.versions != vn.any_version:
@@ -1730,14 +1751,14 @@ def traverse_edges(self, **kwargs):
    def short_spec(self):
        """Returns a version of the spec with the dependencies hashed
        instead of completely enumerated."""
        spec_format = "{name}{@version}{%compiler}"
        spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
        spec_format += "{variants}{arch=architecture}{/hash:7}"
        return self.format(spec_format)

    @property
    def cshort_spec(self):
        """Returns an auto-colorized version of ``self.short_spec``."""
        spec_format = "{name}{@version}{%compiler}"
        spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
        spec_format += "{variants}{arch=architecture}{/hash:7}"
        return self.cformat(spec_format)
@@ -2789,11 +2810,11 @@ def inject_patches_variant(root):
        # Also record all patches required on dependencies by
        # depends_on(..., patch=...)
        for dspec in root.traverse_edges(deptype=all, cover="edges", root=False):
            pkg_deps = dspec.parent.package_class.dependencies
            if dspec.spec.name not in pkg_deps:
            if dspec.spec.concrete:
                continue

            if dspec.spec.concrete:
            pkg_deps = dspec.parent.package_class.dependencies
            if dspec.spec.name not in pkg_deps:
                continue

            patches = []
@@ -4391,12 +4412,20 @@ def __str__(self):
    def install_status(self):
        """Helper for tree to print DB install status."""
        if not self.concrete:
            return None
        try:
            record = spack.store.db.get_record(self)
            return record.installed
        except KeyError:
            return None
            return InstallStatus.absent

        if self.external:
            return InstallStatus.external

        upstream, record = spack.store.db.query_by_spec_hash(self.dag_hash())
        if not record:
            return InstallStatus.absent
        elif upstream and record.installed:
            return InstallStatus.upstream
        elif record.installed:
            return InstallStatus.installed
        else:
            return InstallStatus.missing

    def _installed_explicitly(self):
        """Helper for tree to print DB install status."""
@@ -4410,7 +4439,10 @@ def _installed_explicitly(self):

    def tree(self, **kwargs):
        """Prints out this spec and its dependencies, tree-formatted
        with indentation."""
        with indentation.

        Status function may either output a boolean or an InstallStatus
        """
        color = kwargs.pop("color", clr.get_color_when())
        depth = kwargs.pop("depth", False)
        hashes = kwargs.pop("hashes", False)
@@ -4442,14 +4474,12 @@ def tree(self, **kwargs):
            if status_fn:
                status = status_fn(node)
                if node.installed_upstream:
                    out += clr.colorize("@g{[^]} ", color=color)
                elif status is None:
                    out += clr.colorize("@K{ - } ", color=color)  # !installed
                if status in list(InstallStatus):
                    out += clr.colorize(status.value, color=color)
                elif status:
                    out += clr.colorize("@g{[+]} ", color=color)  # installed
                    out += clr.colorize("@g{[+]} ", color=color)
                else:
                    out += clr.colorize("@r{[-]} ", color=color)  # missing
                    out += clr.colorize("@r{[-]} ", color=color)

            if hashes:
                out += clr.colorize("@K{%s} ", color=color) % node.dag_hash(hlen)

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import textwrap
from typing import List
from typing import List, Optional, Tuple

import llnl.util.lang
@@ -66,17 +66,17 @@ def to_dict(self):
    return dict(d)


def make_environment(dirs=None):
    """Returns an configured environment for template rendering."""
@llnl.util.lang.memoized
def make_environment(dirs: Optional[Tuple[str, ...]] = None):
    """Returns a configured environment for template rendering."""
    # Import at this scope to avoid slowing Spack startup down
    import jinja2

    if dirs is None:
        # Default directories where to search for templates
        builtins = spack.config.get("config:template_dirs", ["$spack/share/spack/templates"])
        extensions = spack.extensions.get_template_dirs()
        dirs = [canonicalize_path(d) for d in itertools.chain(builtins, extensions)]

    # avoid importing this at the top level as it's used infrequently and
    # slows down startup a bit.
    import jinja2
        dirs = tuple(canonicalize_path(d) for d in itertools.chain(builtins, extensions))

    # Loader for the templates
    loader = jinja2.FileSystemLoader(dirs)

|
||||
|
||||
spack.config.config, old_config = cfg, spack.config.config
|
||||
spack.config.config.set("repos", [spack.paths.mock_packages_path])
|
||||
# This is essential, otherwise the cache will create weird side effects
|
||||
# that will compromise subsequent tests if compilers.yaml is modified
|
||||
monkeypatch.setattr(spack.compilers, "_cache_config_file", [])
|
||||
njobs = spack.config.get("config:build_jobs")
|
||||
if not njobs:
|
||||
spack.config.set("config:build_jobs", 4, scope="user")
|
||||
|
@@ -8,8 +8,6 @@

import pytest

import llnl.util.filesystem

import spack.compilers
import spack.main
import spack.version
@@ -18,124 +16,8 @@


@pytest.fixture
def mock_compiler_version():
    return "4.5.3"


@pytest.fixture()
def mock_compiler_dir(tmpdir, mock_compiler_version):
    """Return a directory containing a fake, but detectable compiler."""

    tmpdir.ensure("bin", dir=True)
    bin_dir = tmpdir.join("bin")

    gcc_path = bin_dir.join("gcc")
    gxx_path = bin_dir.join("g++")
    gfortran_path = bin_dir.join("gfortran")

    gcc_path.write(
        """\
#!/bin/sh

for arg in "$@"; do
    if [ "$arg" = -dumpversion ]; then
        echo '%s'
    fi
done
"""
        % mock_compiler_version
    )

    # Create some mock compilers in the temporary directory
    llnl.util.filesystem.set_executable(str(gcc_path))
    gcc_path.copy(gxx_path, mode=True)
    gcc_path.copy(gfortran_path, mode=True)

    return str(tmpdir)


@pytest.mark.skipif(
    sys.platform == "win32",
    reason="Cannot execute bash \
script on Windows",
)
@pytest.mark.regression("11678,13138")
def test_compiler_find_without_paths(no_compilers_yaml, working_env, tmpdir):
    with tmpdir.as_cwd():
        with open("gcc", "w") as f:
            f.write(
                """\
#!/bin/sh
echo "0.0.0"
"""
            )
        os.chmod("gcc", 0o700)

    os.environ["PATH"] = str(tmpdir)
    output = compiler("find", "--scope=site")

    assert "gcc" in output


@pytest.mark.regression("17589")
def test_compiler_find_no_apple_gcc(no_compilers_yaml, working_env, tmpdir):
    with tmpdir.as_cwd():
        # make a script to emulate apple gcc's version args
        with open("gcc", "w") as f:
            f.write(
                """\
#!/bin/sh
if [ "$1" = "-dumpversion" ]; then
    echo "4.2.1"
elif [ "$1" = "--version" ]; then
    echo "Configured with: --prefix=/dummy"
    echo "Apple clang version 11.0.0 (clang-1100.0.33.16)"
    echo "Target: x86_64-apple-darwin18.7.0"
    echo "Thread model: posix"
    echo "InstalledDir: /dummy"
else
    echo "clang: error: no input files"
fi
"""
            )
        os.chmod("gcc", 0o700)

    os.environ["PATH"] = str(tmpdir)
    output = compiler("find", "--scope=site")

    assert "gcc" not in output


def test_compiler_remove(mutable_config, mock_packages):
    assert spack.spec.CompilerSpec("gcc@=4.5.0") in spack.compilers.all_compiler_specs()
    args = spack.util.pattern.Bunch(all=True, compiler_spec="gcc@4.5.0", add_paths=[], scope=None)
    spack.cmd.compiler.compiler_remove(args)
    assert spack.spec.CompilerSpec("gcc@=4.5.0") not in spack.compilers.all_compiler_specs()


@pytest.mark.skipif(
    sys.platform == "win32",
    reason="Cannot execute bash \
script on Windows",
)
def test_compiler_add(mutable_config, mock_packages, mock_compiler_dir, mock_compiler_version):
    # Compilers available by default.
    old_compilers = set(spack.compilers.all_compiler_specs())

    args = spack.util.pattern.Bunch(
        all=None, compiler_spec=None, add_paths=[mock_compiler_dir], scope=None
    )
    spack.cmd.compiler.compiler_find(args)

    # Ensure new compiler is in there
    new_compilers = set(spack.compilers.all_compiler_specs())
    new_compiler = new_compilers - old_compilers
    assert any(c.version == spack.version.Version(mock_compiler_version) for c in new_compiler)


@pytest.fixture
def clangdir(tmpdir):
    """Create a directory with some dummy compiler scripts in it.
def compilers_dir(mock_executable):
    """Create a directory with some mock compiler scripts in it.

    Scripts are:
    - clang
@@ -145,11 +27,9 @@ def clangdir(tmpdir):
    - gfortran-8

    """
    with tmpdir.as_cwd():
        with open("clang", "w") as f:
            f.write(
                """\
#!/bin/sh
    clang_path = mock_executable(
        "clang",
        output="""
if [ "$1" = "--version" ]; then
    echo "clang version 11.0.0 (clang-1100.0.33.16)"
    echo "Target: x86_64-apple-darwin18.7.0"
@@ -159,12 +39,11 @@ def clangdir(tmpdir):
    echo "clang: error: no input files"
    exit 1
fi
"""
            )
        shutil.copy("clang", "clang++")
""",
    )
    shutil.copy(clang_path, clang_path.parent / "clang++")

    gcc_script = """\
#!/bin/sh
    gcc_script = """
if [ "$1" = "-dumpversion" ]; then
    echo "8"
elif [ "$1" = "-dumpfullversion" ]; then
@@ -178,120 +57,187 @@ def clangdir(tmpdir):
    exit 1
fi
"""
        with open("gcc-8", "w") as f:
            f.write(gcc_script.format("gcc", "gcc-8"))
        with open("g++-8", "w") as f:
            f.write(gcc_script.format("g++", "g++-8"))
        with open("gfortran-8", "w") as f:
            f.write(gcc_script.format("GNU Fortran", "gfortran-8"))
        os.chmod("clang", 0o700)
        os.chmod("clang++", 0o700)
        os.chmod("gcc-8", 0o700)
        os.chmod("g++-8", 0o700)
        os.chmod("gfortran-8", 0o700)
    mock_executable("gcc-8", output=gcc_script.format("gcc", "gcc-8"))
    mock_executable("g++-8", output=gcc_script.format("g++", "g++-8"))
    mock_executable("gfortran-8", output=gcc_script.format("GNU Fortran", "gfortran-8"))

    yield tmpdir
    return clang_path.parent

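Throughout this file the hand-rolled script writing (`open(...)` plus `os.chmod`) is replaced by the `mock_executable` fixture, which wraps the boilerplate of creating an executable shell script and returns its path. A simplified sketch of what such a fixture does — the real one lives in Spack's test conftest, so the details below (signature, default `subdir`, return type) are assumptions for illustration:

```python
import pathlib
import pytest

@pytest.fixture
def mock_executable(tmp_path):
    """Create an executable sh script under tmp_path and return its path."""

    def _factory(name: str, output: str = "", subdir=("bin",)) -> pathlib.Path:
        exe_dir = tmp_path.joinpath(*subdir)      # subdir=("base2", "bin") etc.
        exe_dir.mkdir(parents=True, exist_ok=True)
        exe = exe_dir / name
        exe.write_text(f"#!/bin/sh\n{output}\n")  # the fixture adds the shebang
        exe.chmod(0o755)
        return exe

    return _factory
```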

@pytest.mark.skipif(
    sys.platform == "win32",
    reason="Cannot execute bash \
script on Windows",
)
@pytest.mark.regression("17590")
def test_compiler_find_mixed_suffixes(no_compilers_yaml, working_env, clangdir):
    """Ensure that we'll mix compilers with different suffixes when necessary."""
    os.environ["PATH"] = str(clangdir)
@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
@pytest.mark.regression("11678,13138")
def test_compiler_find_without_paths(no_compilers_yaml, working_env, mock_executable):
    """Tests that 'spack compiler find' looks into PATH by default, if no specific path
    is given.
    """
    gcc_path = mock_executable("gcc", output='echo "0.0.0"')

    os.environ["PATH"] = str(gcc_path.parent)
    output = compiler("find", "--scope=site")

    assert "clang@=11.0.0" in output
    assert "gcc@=8.4.0" in output
    assert "gcc" in output


@pytest.mark.regression("17589")
def test_compiler_find_no_apple_gcc(no_compilers_yaml, working_env, mock_executable):
    """Tests that Spack won't mistake Apple's GCC as a "real" GCC, since it's really
    Clang with a few tweaks.
    """
    gcc_path = mock_executable(
        "gcc",
        output="""
if [ "$1" = "-dumpversion" ]; then
    echo "4.2.1"
elif [ "$1" = "--version" ]; then
    echo "Configured with: --prefix=/dummy"
    echo "Apple clang version 11.0.0 (clang-1100.0.33.16)"
    echo "Target: x86_64-apple-darwin18.7.0"
    echo "Thread model: posix"
    echo "InstalledDir: /dummy"
else
    echo "clang: error: no input files"
fi
""",
    )

    os.environ["PATH"] = str(gcc_path.parent)
    output = compiler("find", "--scope=site")

    assert "gcc" not in output


@pytest.mark.regression("37996")
def test_compiler_remove(mutable_config, mock_packages):
    """Tests that we can remove a compiler from configuration."""
    assert spack.spec.CompilerSpec("gcc@=4.5.0") in spack.compilers.all_compiler_specs()
    args = spack.util.pattern.Bunch(all=True, compiler_spec="gcc@4.5.0", add_paths=[], scope=None)
    spack.cmd.compiler.compiler_remove(args)
    assert spack.spec.CompilerSpec("gcc@=4.5.0") not in spack.compilers.all_compiler_specs()


@pytest.mark.regression("37996")
def test_removing_compilers_from_multiple_scopes(mutable_config, mock_packages):
    # Duplicate "site" scope into "user" scope
    site_config = spack.config.get("compilers", scope="site")
    spack.config.set("compilers", site_config, scope="user")

    assert spack.spec.CompilerSpec("gcc@=4.5.0") in spack.compilers.all_compiler_specs()
    args = spack.util.pattern.Bunch(all=True, compiler_spec="gcc@4.5.0", add_paths=[], scope=None)
    spack.cmd.compiler.compiler_remove(args)
    assert spack.spec.CompilerSpec("gcc@=4.5.0") not in spack.compilers.all_compiler_specs()


@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
def test_compiler_add(mutable_config, mock_packages, mock_executable):
    """Tests that we can add a compiler to configuration."""
    expected_version = "4.5.3"
    gcc_path = mock_executable(
        "gcc",
        output=f"""\
for arg in "$@"; do
    if [ "$arg" = -dumpversion ]; then
        echo '{expected_version}'
    fi
done
""",
    )
    bin_dir = gcc_path.parent
    root_dir = bin_dir.parent

    compilers_before_find = set(spack.compilers.all_compiler_specs())
    args = spack.util.pattern.Bunch(
        all=None, compiler_spec=None, add_paths=[str(root_dir)], scope=None
    )
    spack.cmd.compiler.compiler_find(args)
    compilers_after_find = set(spack.compilers.all_compiler_specs())

    compilers_added_by_find = compilers_after_find - compilers_before_find
    assert len(compilers_added_by_find) == 1
    new_compiler = compilers_added_by_find.pop()
    assert new_compiler.version == spack.version.Version(expected_version)


@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
@pytest.mark.regression("17590")
def test_compiler_find_mixed_suffixes(no_compilers_yaml, working_env, compilers_dir):
    """Ensure that we'll mix compilers with different suffixes when necessary."""
    os.environ["PATH"] = str(compilers_dir)
    output = compiler("find", "--scope=site")

    assert "clang@11.0.0" in output
    assert "gcc@8.4.0" in output

    config = spack.compilers.get_compiler_config("site", False)
    clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@=11.0.0")
    gcc = next(c["compiler"] for c in config if c["compiler"]["spec"] == "gcc@=8.4.0")

    gfortran_path = str(clangdir.join("gfortran-8"))
    gfortran_path = str(compilers_dir / "gfortran-8")

    assert clang["paths"] == {
        "cc": str(clangdir.join("clang")),
        "cxx": str(clangdir.join("clang++")),
        "cc": str(compilers_dir / "clang"),
        "cxx": str(compilers_dir / "clang++"),
        # we only auto-detect mixed clang on macos
        "f77": gfortran_path if sys.platform == "darwin" else None,
        "fc": gfortran_path if sys.platform == "darwin" else None,
    }

    assert gcc["paths"] == {
        "cc": str(clangdir.join("gcc-8")),
        "cxx": str(clangdir.join("g++-8")),
        "cc": str(compilers_dir / "gcc-8"),
        "cxx": str(compilers_dir / "g++-8"),
        "f77": gfortran_path,
        "fc": gfortran_path,
    }


@pytest.mark.skipif(
    sys.platform == "win32",
    reason="Cannot execute bash \
script on Windows",
)
@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
@pytest.mark.regression("17590")
def test_compiler_find_prefer_no_suffix(no_compilers_yaml, working_env, clangdir):
def test_compiler_find_prefer_no_suffix(no_compilers_yaml, working_env, compilers_dir):
    """Ensure that we'll pick 'clang' over 'clang-gpu' when there is a choice."""
    with clangdir.as_cwd():
        shutil.copy("clang", "clang-gpu")
        shutil.copy("clang++", "clang++-gpu")
        os.chmod("clang-gpu", 0o700)
        os.chmod("clang++-gpu", 0o700)
    clang_path = compilers_dir / "clang"
    shutil.copy(clang_path, clang_path.parent / "clang-gpu")
    shutil.copy(clang_path, clang_path.parent / "clang++-gpu")

    os.environ["PATH"] = str(clangdir)
    os.environ["PATH"] = str(compilers_dir)
    output = compiler("find", "--scope=site")

    assert "clang@=11.0.0" in output
    assert "gcc@=8.4.0" in output
    assert "clang@11.0.0" in output
    assert "gcc@8.4.0" in output

    config = spack.compilers.get_compiler_config("site", False)
    clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@=11.0.0")

    assert clang["paths"]["cc"] == str(clangdir.join("clang"))
    assert clang["paths"]["cxx"] == str(clangdir.join("clang++"))
    assert clang["paths"]["cc"] == str(compilers_dir / "clang")
    assert clang["paths"]["cxx"] == str(compilers_dir / "clang++")


@pytest.mark.skipif(
    sys.platform == "win32",
    reason="Cannot execute bash \
script on Windows",
)
def test_compiler_find_path_order(no_compilers_yaml, working_env, clangdir):
    """Ensure that we find compilers that come first in the PATH first"""

    with clangdir.as_cwd():
        os.mkdir("first_in_path")
        shutil.copy("gcc-8", "first_in_path/gcc-8")
        shutil.copy("g++-8", "first_in_path/g++-8")
        shutil.copy("gfortran-8", "first_in_path/gfortran-8")

        # the first_in_path folder should be searched first
        os.environ["PATH"] = "{0}:{1}".format(str(clangdir.join("first_in_path")), str(clangdir))
@pytest.mark.skipif(sys.platform == "win32", reason="Cannot execute bash script on Windows")
def test_compiler_find_path_order(no_compilers_yaml, working_env, compilers_dir):
    """Ensure that we look for compilers in the same order as PATH, when there are duplicates"""
    new_dir = compilers_dir / "first_in_path"
    new_dir.mkdir()
    for name in ("gcc-8", "g++-8", "gfortran-8"):
        shutil.copy(compilers_dir / name, new_dir / name)
    # Set PATH to have the new folder searched first
    os.environ["PATH"] = "{}:{}".format(str(new_dir), str(compilers_dir))

    compiler("find", "--scope=site")

    config = spack.compilers.get_compiler_config("site", False)

    gcc = next(c["compiler"] for c in config if c["compiler"]["spec"] == "gcc@=8.4.0")

    assert gcc["paths"] == {
        "cc": str(clangdir.join("first_in_path", "gcc-8")),
        "cxx": str(clangdir.join("first_in_path", "g++-8")),
        "f77": str(clangdir.join("first_in_path", "gfortran-8")),
        "fc": str(clangdir.join("first_in_path", "gfortran-8")),
        "cc": str(new_dir / "gcc-8"),
        "cxx": str(new_dir / "g++-8"),
        "f77": str(new_dir / "gfortran-8"),
        "fc": str(new_dir / "gfortran-8"),
    }


def test_compiler_list_empty(no_compilers_yaml, working_env, clangdir):
    # Spack should not automatically search for compilers when listing them and none
    # are available. And when stdout is not a tty like in tests, there should be no
    # output and no error exit code.
    os.environ["PATH"] = str(clangdir)
def test_compiler_list_empty(no_compilers_yaml, working_env, compilers_dir):
    """Spack should not automatically search for compilers when listing them and none are
    available. And when stdout is not a tty like in tests, there should be no output and
    no error exit code.
    """
    os.environ["PATH"] = str(compilers_dir)
    out = compiler("list")
    assert not out
    assert compiler.returncode == 0

@@ -390,6 +390,19 @@ def test_remove_after_concretize():
    assert not any(s.name == "mpileaks" for s in env_specs)


def test_remove_before_concretize():
    e = ev.create("test")
    e.unify = True

    e.add("mpileaks")
    e.concretize()

    e.remove("mpileaks")
    e.concretize()

    assert not list(e.concretized_specs())


def test_remove_command():
    env("create", "test")
    assert "test" in env("list")
@@ -906,7 +919,7 @@ def test_env_config_precedence(environment_from_manifest):
          mpileaks:
            version: ["2.2"]
          libelf:
            version: ["0.8.11"]
            version: ["0.8.10"]
    """
    )

@@ -3299,3 +3312,22 @@ def test_environment_created_in_users_location(mutable_config, tmpdir):
    assert dir_name in out
    assert env_dir in ev.root(dir_name)
    assert os.path.isdir(os.path.join(env_dir, dir_name))


def test_environment_created_from_lockfile_has_view(mock_packages, tmpdir):
    """When an env is created from a lockfile, a view should be generated for it"""
    env_a = str(tmpdir.join("a"))
    env_b = str(tmpdir.join("b"))

    # Create an environment and install a package in it
    env("create", "-d", env_a)
    with ev.Environment(env_a):
        add("libelf")
        install("--fake")

    # Create another environment from the lockfile of the first environment
    env("create", "-d", env_b, os.path.join(env_a, "spack.lock"))

    # Make sure the view was created
    with ev.Environment(env_b) as e:
        assert os.path.isdir(e.view_path_default)

@@ -44,9 +44,8 @@ def define_plat_exe(exe):

def test_find_external_single_package(mock_executable, executables_found, _platform_executables):
    pkgs_to_check = [spack.repo.path.get_pkg_class("cmake")]
    executables_found(
        {mock_executable("cmake", output="echo cmake version 1.foo"): define_plat_exe("cmake")}
    )
    cmake_path = mock_executable("cmake", output="echo cmake version 1.foo")
    executables_found({str(cmake_path): define_plat_exe("cmake")})

    pkg_to_entries = spack.detection.by_executable(pkgs_to_check)

@@ -71,7 +70,7 @@ def test_find_external_two_instances_same_package(
        "cmake", output="echo cmake version 3.17.2", subdir=("base2", "bin")
    )
    cmake_exe = define_plat_exe("cmake")
    executables_found({cmake_path1: cmake_exe, cmake_path2: cmake_exe})
    executables_found({str(cmake_path1): cmake_exe, str(cmake_path2): cmake_exe})

    pkg_to_entries = spack.detection.by_executable(pkgs_to_check)

@@ -107,7 +106,7 @@ def test_get_executables(working_env, mock_executable):
    cmake_path1 = mock_executable("cmake", output="echo cmake version 1.foo")
    path_to_exe = spack.detection.executables_in_path([os.path.dirname(cmake_path1)])
    cmake_exe = define_plat_exe("cmake")
    assert path_to_exe[cmake_path1] == cmake_exe
    assert path_to_exe[str(cmake_path1)] == cmake_exe


external = SpackCommand("external")
@@ -334,7 +333,7 @@ def test_packages_yaml_format(mock_executable, mutable_config, monkeypatch, _pla
    assert "extra_attributes" in external_gcc
    extra_attributes = external_gcc["extra_attributes"]
    assert "prefix" not in extra_attributes
    assert extra_attributes["compilers"]["c"] == gcc_exe
    assert extra_attributes["compilers"]["c"] == str(gcc_exe)


def test_overriding_prefix(mock_executable, mutable_config, monkeypatch, _platform_executables):

@@ -357,3 +357,18 @@ def test_find_loaded(database, working_env):
    output = find("--loaded")
    expected = find()
    assert output == expected


@pytest.mark.regression("37712")
def test_environment_with_version_range_in_compiler_doesnt_fail(tmp_path):
    """Tests that having an active environment with a root spec containing a compiler constrained
    by a version range (i.e. @X.Y rather than the single version @=X.Y) doesn't result in an error
    when invoking "spack find".
    """
    test_environment = ev.create_in_dir(tmp_path)
    test_environment.add("zlib %gcc@12.1.0")
    test_environment.write()

    with test_environment:
        output = find()
        assert "zlib%gcc@12.1.0" in output

@@ -319,3 +319,17 @@ def test_report_filename_for_cdash(install_mockery_mutable_config, mock_fetch):
    spack.cmd.common.arguments.sanitize_reporter_options(args)
    filename = spack.cmd.test.report_filename(args, suite)
    assert filename != "https://blahblah/submit.php?project=debugging"


def test_test_output_multiple_specs(
    mock_test_stage, mock_packages, mock_archive, mock_fetch, install_mockery_mutable_config
):
    """Ensure proper reporting for suite with skipped, failing, and passed tests."""
    install("test-error", "simple-standalone-test@0.9", "simple-standalone-test@1.0")
    out = spack_test("run", "test-error", "simple-standalone-test", fail_on_error=False)

    # Note that a spec with passing *and* skipped tests is still considered
    # to have passed at this level. If you want to see the spec-specific
    # part result summaries, you'll have to look at the "test-out.txt" files
    # for each spec.
    assert "1 failed, 2 passed of 3 specs" in out

@@ -337,8 +337,6 @@ def test_compiler_flags_differ_identical_compilers(self):

        # Get the compiler that matches the spec
        compiler = spack.compilers.compiler_for_spec("clang@=12.2.0", spec.architecture)
        # Clear cache for compiler config since it has its own cache mechanism outside of config
        spack.compilers._cache_config_file = []

        # Configure spack to have two identical compilers with different flags
        default_dict = spack.compilers._to_dict(compiler)
@@ -2137,7 +2135,7 @@ def test_compiler_with_custom_non_numeric_version(self, mock_executable):
            {
                "compiler": {
                    "spec": "gcc@foo",
                    "paths": {"cc": gcc_path, "cxx": gcc_path, "f77": None, "fc": None},
                    "paths": {"cc": str(gcc_path), "cxx": str(gcc_path), "f77": None, "fc": None},
                    "operating_system": "debian6",
                    "modules": [],
                }

@@ -152,7 +152,9 @@ def test_preferred_versions(self):
        assert spec.version == Version("2.2")

    def test_preferred_versions_mixed_version_types(self):
        update_packages("mixedversions", "version", ["2.0"])
        if spack.config.get("config:concretizer") == "original":
            pytest.skip("This behavior is not enforced for the old concretizer")
        update_packages("mixedversions", "version", ["=2.0"])
        spec = concretize("mixedversions")
        assert spec.version == Version("2.0")
@@ -228,6 +230,29 @@ def test_preferred(self):
        spec.concretize()
        assert spec.version == Version("3.5.0")

    def test_preferred_undefined_raises(self):
        """Preference should not specify an undefined version"""
        if spack.config.get("config:concretizer") == "original":
            pytest.xfail("This behavior is not enforced for the old concretizer")

        update_packages("python", "version", ["3.5.0.1"])
        spec = Spec("python")
        with pytest.raises(spack.config.ConfigError):
            spec.concretize()

    def test_preferred_truncated(self):
        """Versions without "=" are treated as version ranges: if there is
        a satisfying version defined in the package.py, we should use that
        (don't define a new version).
        """
        if spack.config.get("config:concretizer") == "original":
            pytest.skip("This behavior is not enforced for the old concretizer")

        update_packages("python", "version", ["3.5"])
        spec = Spec("python")
        spec.concretize()
        assert spec.satisfies("@3.5.1")

    def test_develop(self):
        """Test concretization with develop-like versions"""
        spec = Spec("develop-test")

@@ -66,6 +66,28 @@ class V(Package):
)


_pkgt = (
    "t",
    """\
class T(Package):
    version('2.1')
    version('2.0')

    depends_on('u', when='@2.1:')
""",
)


_pkgu = (
    "u",
    """\
class U(Package):
    version('1.1')
    version('1.0')
""",
)


@pytest.fixture
def create_test_repo(tmpdir, mutable_config):
    repo_path = str(tmpdir)
@@ -79,7 +101,7 @@ def create_test_repo(tmpdir, mutable_config):
    )

    packages_dir = tmpdir.join("packages")
    for pkg_name, pkg_str in [_pkgx, _pkgy, _pkgv]:
    for pkg_name, pkg_str in [_pkgx, _pkgy, _pkgv, _pkgt, _pkgu]:
        pkg_dir = packages_dir.ensure(pkg_name, dir=True)
        pkg_file = pkg_dir.join("package.py")
        with open(str(pkg_file), "w") as f:
@@ -144,6 +166,45 @@ def test_requirement_isnt_optional(concretize_scope, test_repo):
        Spec("x@1.1").concretize()


def test_require_undefined_version(concretize_scope, test_repo):
    """If a requirement specifies a numbered version that isn't in
    the associated package.py and isn't part of a Git hash
    equivalence (hash=number), then Spack should raise an error
    (it is assumed this is a typo, and raising the error here
    avoids a likely error when Spack attempts to fetch the version).
    """
    if spack.config.get("config:concretizer") == "original":
        pytest.skip("Original concretizer does not support configuration requirements")

    conf_str = """\
packages:
  x:
    require: "@1.2"
"""
    update_packages_config(conf_str)
    with pytest.raises(spack.config.ConfigError):
        Spec("x").concretize()


def test_require_truncated(concretize_scope, test_repo):
    """A requirement specifies a version range, with satisfying
    versions defined in the package.py. Make sure we choose one
    of the defined versions (vs. allowing the requirement to
    define a new version).
    """
    if spack.config.get("config:concretizer") == "original":
        pytest.skip("Original concretizer does not support configuration requirements")

    conf_str = """\
packages:
  x:
    require: "@1"
"""
    update_packages_config(conf_str)
    xspec = Spec("x").concretized()
    assert xspec.satisfies("@1.1")


def test_git_user_supplied_reference_satisfaction(
    concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
@@ -220,6 +281,40 @@ def test_requirement_adds_new_version(
    assert s1.version.ref == a_commit_hash


def test_requirement_adds_version_satisfies(
    concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
    """Make sure that new versions added by requirements are factored into
    conditions. In this case create a new version that satisfies a
    depends_on condition and make sure it is triggered (i.e. the
    dependency is added).
    """
    if spack.config.get("config:concretizer") == "original":
        pytest.skip("Original concretizer does not support configuration" " requirements")

    repo_path, filename, commits = mock_git_version_info
    monkeypatch.setattr(
        spack.package_base.PackageBase, "git", path_to_file_url(repo_path), raising=False
    )

    # Sanity check: early version of T does not include U
    s0 = Spec("t@2.0").concretized()
    assert not ("u" in s0)

    conf_str = """\
packages:
  t:
    require: "@{0}=2.2"
""".format(
        commits[0]
    )
    update_packages_config(conf_str)

    s1 = Spec("t").concretized()
    assert "u" in s1
    assert s1.satisfies("@2.2")


def test_requirement_adds_git_hash_version(
    concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
@@ -272,8 +367,11 @@ def test_requirement_adds_multiple_new_versions(
def test_preference_adds_new_version(
    concretize_scope, test_repo, mock_git_version_info, monkeypatch
):
    """Normally a preference cannot define a new version, but that constraint
    is ignored if the version is a Git hash-based version.
    """
    if spack.config.get("config:concretizer") == "original":
        pytest.skip("Original concretizer does not support configuration requirements")
        pytest.skip("Original concretizer does not enforce this constraint for preferences")

    repo_path, filename, commits = mock_git_version_info
    monkeypatch.setattr(
@@ -296,6 +394,29 @@ def test_preference_adds_new_version(
    assert not s3.satisfies("@2.3")


def test_external_adds_new_version_that_is_preferred(concretize_scope, test_repo):
    """Test that we can use a version, not declared in package recipe, as the
    preferred version if that version appears in an external spec.
    """
    if spack.config.get("config:concretizer") == "original":
        pytest.skip("Original concretizer does not enforce this constraint for preferences")

    conf_str = """\
packages:
  y:
    version: ["2.7"]
    externals:
    - spec: y@2.7 # Not defined in y
      prefix: /fake/nonexistent/path/
    buildable: false
"""
    update_packages_config(conf_str)

    spec = Spec("x").concretized()
    assert spec["y"].satisfies("@2.7")
    assert spack.version.Version("2.7") not in spec["y"].package.versions


def test_requirement_is_successfully_applied(concretize_scope, test_repo):
    """If a simple requirement can be satisfied, make sure the
    concretization succeeds and the requirement spec is applied.
@@ -1669,22 +1669,21 @@ def clear_directive_functions():


@pytest.fixture
def mock_executable(tmpdir):
def mock_executable(tmp_path):
    """Factory to create a mock executable in a temporary directory that
    output a custom string when run.
    """
    import jinja2

    shebang = "#!/bin/sh\n" if sys.platform != "win32" else "@ECHO OFF"

    def _factory(name, output, subdir=("bin",)):
        f = tmpdir.ensure(*subdir, dir=True).join(name)
        executable_dir = tmp_path.joinpath(*subdir)
        executable_dir.mkdir(parents=True, exist_ok=True)
        executable_path = executable_dir / name
        if sys.platform == "win32":
            f += ".bat"
        t = jinja2.Template("{{ shebang }}{{ output }}\n")
        f.write(t.render(shebang=shebang, output=output))
        f.chmod(0o755)
        return str(f)
            executable_path = executable_dir / (name + ".bat")
        executable_path.write_text(f"{ shebang }{ output }\n")
        executable_path.chmod(0o755)
        return executable_path

    return _factory

@@ -4,7 +4,7 @@ lmod:
  hash_length: 0

  core_compilers:
    - 'clang@3.3'
    - 'clang@12.0.0'

  core_specs:
    - 'mpich@3.0.1'

@@ -0,0 +1,5 @@
enable:
  - lmod
lmod:
  core_compilers:
    - 'clang@12.0.0'
@@ -0,0 +1,5 @@
enable:
  - lmod
lmod:
  core_compilers:
    - 'clang@=12.0.0'
17
lib/spack/spack/test/data/sourceme_modules.sh
Normal file
@@ -0,0 +1,17 @@
#!/usr/bin/env bash
#
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

_module_raw() { return 1; };
module() { return 1; };
ml() { return 1; };
export -f _module_raw;
export -f module;
export -f ml;

export MODULES_AUTO_HANDLING=1
export __MODULES_LMCONFLICT=bar&foo
export NEW_VAR=new
@@ -400,7 +400,7 @@ def test_sanitize_literals(env, exclude, include):
        ({"SHLVL": "1"}, ["SH.*"], [], [], ["SHLVL"]),
        # Check we can include using a regex
        ({"SHLVL": "1"}, ["SH.*"], ["SH.*"], ["SHLVL"], []),
        # Check regex to exclude Modules v4 related vars
        # Check regex to exclude Environment Modules related vars
        (
            {"MODULES_LMALTNAME": "1", "MODULES_LMCONFLICT": "2"},
            ["MODULES_(.*)"],
@@ -415,6 +415,13 @@ def test_sanitize_literals(env, exclude, include):
            [],
            ["A_modquar", "b_modquar", "C_modshare"],
        ),
        (
            {"__MODULES_LMTAG": "1", "__MODULES_LMPREREQ": "2"},
            ["__MODULES_(.*)"],
            [],
            [],
            ["__MODULES_LMTAG", "__MODULES_LMPREREQ"],
        ),
    ],
)
def test_sanitize_regex(env, exclude, include, expected, deleted):
@@ -489,3 +496,19 @@ def test_exclude_lmod_variables():
    # Check that variables related to lmod are not in there
    modifications = env.group_by_name()
    assert not any(x.startswith("LMOD_") for x in modifications)


@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
@pytest.mark.regression("13504")
def test_exclude_modules_variables():
    # Construct the list of environment modifications
    file = os.path.join(datadir, "sourceme_modules.sh")
    env = EnvironmentModifications.from_sourcing_file(file)

    # Check that variables related to modules are not in there
    modifications = env.group_by_name()
    assert not any(x.startswith("MODULES_") for x in modifications)
    assert not any(x.startswith("__MODULES_") for x in modifications)
    assert not any(x.startswith("BASH_FUNC_ml") for x in modifications)
    assert not any(x.startswith("BASH_FUNC_module") for x in modifications)
    assert not any(x.startswith("BASH_FUNC__module_raw") for x in modifications)
@@ -1399,17 +1399,24 @@ def test_print_install_test_log_skipped(install_mockery, mock_packages, capfd, r
    assert out == ""


def test_print_install_test_log_missing(
def test_print_install_test_log_failures(
    tmpdir, install_mockery, mock_packages, ensure_debug, capfd
):
    """Confirm expected error on attempt to print missing test log file."""
    """Confirm expected outputs when there are test failures."""
    name = "trivial-install-test-package"
    s = spack.spec.Spec(name).concretized()
    pkg = s.package

    # Missing test log is an error
    pkg.run_tests = True
    pkg.tester.test_log_file = str(tmpdir.join("test-log.txt"))
    pkg.tester.add_failure(AssertionError("test"), "test-failure")
    spack.installer.print_install_test_log(pkg)
    err = capfd.readouterr()[1]
    assert "no test log file" in err

    # Having test log results in path being output
    fs.touch(pkg.tester.test_log_file)
    spack.installer.print_install_test_log(pkg)
    out = capfd.readouterr()[0]
    assert "See test results at" in out
@@ -8,6 +8,8 @@

import pytest

import spack.cmd.modules
import spack.config
import spack.error
import spack.modules.tcl
import spack.package_base
@@ -187,3 +189,31 @@ def find_nothing(*args):
    assert module_path

    spack.package_base.PackageBase.uninstall_by_spec(spec)


@pytest.mark.regression("37649")
def test_check_module_set_name(mutable_config):
    """Tests that modules set name are validated correctly and an error is reported if the
    name we require does not exist or is reserved by the configuration."""

    # Minimal modules.yaml config.
    spack.config.set(
        "modules",
        {
            "prefix_inspections": {"./bin": ["PATH"]},
            # module sets
            "first": {},
            "second": {},
        },
    )

    # Valid module set name
    spack.cmd.modules.check_module_set_name("first")

    # Invalid module set names
    msg = "Valid module set names are"
    with pytest.raises(spack.config.ConfigError, match=msg):
        spack.cmd.modules.check_module_set_name("prefix_inspections")

    with pytest.raises(spack.config.ConfigError, match=msg):
        spack.cmd.modules.check_module_set_name("third")

@@ -45,6 +45,18 @@ def provider(request):

@pytest.mark.usefixtures("config", "mock_packages")
class TestLmod(object):
    @pytest.mark.regression("37788")
    @pytest.mark.parametrize("modules_config", ["core_compilers", "core_compilers_at_equal"])
    def test_layout_for_specs_compiled_with_core_compilers(
        self, modules_config, module_configuration, factory
    ):
        """Tests that specs compiled with core compilers are in the 'Core' folder. Also tests that
        we can use both ``compiler@version`` and ``compiler@=version`` to specify a core compiler.
        """
        module_configuration(modules_config)
        module, spec = factory("libelf%clang@12.0.0")
        assert "Core" in module.layout.available_path_parts

    def test_file_layout(self, compiler, provider, factory, module_configuration):
        """Tests the layout of files in the hierarchy is the one expected."""
        module_configuration("complex_hierarchy")
@@ -61,7 +73,7 @@ def test_file_layout(self, compiler, provider, factory, module_configuration):
        # is transformed to r"Core" if the compiler is listed among core
        # compilers
        # Check that specs listed as core_specs are transformed to "Core"
        if compiler == "clang@=3.3" or spec_string == "mpich@3.0.1":
        if compiler == "clang@=12.0.0" or spec_string == "mpich@3.0.1":
            assert "Core" in layout.available_path_parts
        else:
            assert compiler.replace("@=", "/") in layout.available_path_parts
@@ -155,6 +167,46 @@ def test_prepend_path_separator(self, modulefile_content, module_configuration):
        assert len([x for x in content if 'append_path("SPACE", "qux", " ")' in x]) == 1
        assert len([x for x in content if 'remove_path("SPACE", "qux", " ")' in x]) == 1

    @pytest.mark.regression("11355")
    def test_manpath_setup(self, modulefile_content, module_configuration):
        """Tests specific setup of MANPATH environment variable."""

        module_configuration("autoload_direct")

        # no manpath set by module
        content = modulefile_content("mpileaks")
        assert len([x for x in content if 'append_path("MANPATH", "", ":")' in x]) == 0

        # manpath set by module with prepend_path
        content = modulefile_content("module-manpath-prepend")
        assert (
            len([x for x in content if 'prepend_path("MANPATH", "/path/to/man", ":")' in x]) == 1
        )
        assert (
            len([x for x in content if 'prepend_path("MANPATH", "/path/to/share/man", ":")' in x])
            == 1
        )
        assert len([x for x in content if 'append_path("MANPATH", "", ":")' in x]) == 1

        # manpath set by module with append_path
        content = modulefile_content("module-manpath-append")
        assert len([x for x in content if 'append_path("MANPATH", "/path/to/man", ":")' in x]) == 1
        assert len([x for x in content if 'append_path("MANPATH", "", ":")' in x]) == 1

        # manpath set by module with setenv
        content = modulefile_content("module-manpath-setenv")
        assert len([x for x in content if 'setenv("MANPATH", "/path/to/man")' in x]) == 1
        assert len([x for x in content if 'append_path("MANPATH", "", ":")' in x]) == 0

    @pytest.mark.regression("29578")
    def test_setenv_raw_value(self, modulefile_content, module_configuration):
        """Tests that we can set environment variable value without formatting it."""

        module_configuration("autoload_direct")
        content = modulefile_content("module-setenv-raw")

        assert len([x for x in content if 'setenv("FOO", "{{name}}, {name}, {{}}, {}")' in x]) == 1

    def test_help_message(self, modulefile_content, module_configuration):
        """Tests the generation of module help message."""

@@ -37,6 +37,11 @@ def test_autoload_direct(self, modulefile_content, module_configuration):
        module_configuration("autoload_direct")
        content = modulefile_content(mpileaks_spec_string)

        assert (
            len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
            == 1
        )
        assert len([x for x in content if "depends-on " in x]) == 2
        assert len([x for x in content if "module load " in x]) == 2

        # dtbuild1 has
@@ -46,6 +51,11 @@ def test_autoload_direct(self, modulefile_content, module_configuration):
        # Just make sure the 'build' dependency is not there
        content = modulefile_content("dtbuild1")

        assert (
            len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
            == 1
        )
        assert len([x for x in content if "depends-on " in x]) == 2
        assert len([x for x in content if "module load " in x]) == 2

        # The configuration file sets the verbose keyword to False
@@ -58,6 +68,11 @@ def test_autoload_all(self, modulefile_content, module_configuration):
        module_configuration("autoload_all")
        content = modulefile_content(mpileaks_spec_string)

        assert (
            len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
            == 1
        )
        assert len([x for x in content if "depends-on " in x]) == 5
        assert len([x for x in content if "module load " in x]) == 5

        # dtbuild1 has
@@ -67,6 +82,11 @@ def test_autoload_all(self, modulefile_content, module_configuration):
        # Just make sure the 'build' dependency is not there
        content = modulefile_content("dtbuild1")

        assert (
            len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
            == 1
        )
        assert len([x for x in content if "depends-on " in x]) == 2
        assert len([x for x in content if "module load " in x]) == 2

    def test_prerequisites_direct(self, modulefile_content, module_configuration):
@@ -103,6 +123,7 @@ def test_alter_environment(self, modulefile_content, module_configuration):
        assert len([x for x in content if x.startswith("prepend-path CMAKE_PREFIX_PATH")]) == 0
        assert len([x for x in content if 'setenv FOO "foo"' in x]) == 0
        assert len([x for x in content if "unsetenv BAR" in x]) == 0
        assert len([x for x in content if "depends-on foo/bar" in x]) == 1
        assert len([x for x in content if "module load foo/bar" in x]) == 1
        assert len([x for x in content if "setenv LIBDWARF_ROOT" in x]) == 1

@@ -121,6 +142,55 @@ def test_prepend_path_separator(self, modulefile_content, module_configuration):
        assert len([x for x in content if 'append-path --delim " " SPACE "qux"' in x]) == 1
        assert len([x for x in content if 'remove-path --delim " " SPACE "qux"' in x]) == 1

    @pytest.mark.regression("11355")
    def test_manpath_setup(self, modulefile_content, module_configuration):
        """Tests specific setup of MANPATH environment variable."""

        module_configuration("autoload_direct")

        # no manpath set by module
        content = modulefile_content("mpileaks")
        assert len([x for x in content if 'append-path --delim ":" MANPATH ""' in x]) == 0

        # manpath set by module with prepend-path
        content = modulefile_content("module-manpath-prepend")
        assert (
            len([x for x in content if 'prepend-path --delim ":" MANPATH "/path/to/man"' in x])
            == 1
        )
        assert (
            len(
                [
                    x
                    for x in content
                    if 'prepend-path --delim ":" MANPATH "/path/to/share/man"' in x
                ]
            )
            == 1
        )
        assert len([x for x in content if 'append-path --delim ":" MANPATH ""' in x]) == 1

        # manpath set by module with append-path
        content = modulefile_content("module-manpath-append")
        assert (
            len([x for x in content if 'append-path --delim ":" MANPATH "/path/to/man"' in x]) == 1
        )
        assert len([x for x in content if 'append-path --delim ":" MANPATH ""' in x]) == 1

        # manpath set by module with setenv
        content = modulefile_content("module-manpath-setenv")
        assert len([x for x in content if 'setenv MANPATH "/path/to/man"' in x]) == 1
        assert len([x for x in content if 'append-path --delim ":" MANPATH ""' in x]) == 0

    @pytest.mark.regression("29578")
    def test_setenv_raw_value(self, modulefile_content, module_configuration):
        """Tests that we can set environment variable value without formatting it."""

        module_configuration("autoload_direct")
        content = modulefile_content("module-setenv-raw")

        assert len([x for x in content if 'setenv FOO "{{name}}, {name}, {{}}, {}"' in x]) == 1

    def test_help_message(self, modulefile_content, module_configuration):
        """Tests the generation of module help message."""

@@ -394,10 +464,16 @@ def test_autoload_with_constraints(self, modulefile_content, module_configuratio

        # Test the mpileaks that should have the autoloaded dependencies
        content = modulefile_content("mpileaks ^mpich2")
        assert len([x for x in content if "depends-on " in x]) == 2
        assert len([x for x in content if "module load " in x]) == 2

        # Test the mpileaks that should NOT have the autoloaded dependencies
        content = modulefile_content("mpileaks ^mpich")
        assert (
            len([x for x in content if "if {![info exists ::env(LMOD_VERSION_MAJOR)]} {" in x])
            == 0
        )
        assert len([x for x in content if "depends-on " in x]) == 0
        assert len([x for x in content if "module load " in x]) == 0

    def test_modules_no_arch(self, factory, module_configuration):
@@ -62,7 +62,7 @@ def source_file(tmpdir, is_relocatable):
        src = tmpdir.join("relocatable.c")
        shutil.copy(template_src, str(src))
    else:
        template_dirs = [os.path.join(spack.paths.test_path, "data", "templates")]
        template_dirs = (os.path.join(spack.paths.test_path, "data", "templates"),)
        env = spack.tengine.make_environment(template_dirs)
        template = env.get_template("non_relocatable.c")
        text = template.render({"prefix": spack.store.layout.root})
@@ -246,12 +246,12 @@ def test_set_elf_rpaths(mock_patchelf):
    # the call made to patchelf itself
    patchelf = mock_patchelf("echo $@")
    rpaths = ["/usr/lib", "/usr/lib64", "/opt/local/lib"]
    output = spack.relocate._set_elf_rpaths(patchelf, rpaths)
    output = spack.relocate._set_elf_rpaths(str(patchelf), rpaths)

    # Assert that the arguments of the call to patchelf are as expected
    assert "--force-rpath" in output
    assert "--set-rpath " + ":".join(rpaths) in output
    assert patchelf in output
    assert str(patchelf) in output


@skip_unless_linux
@@ -261,7 +261,7 @@ def test_set_elf_rpaths_warning(mock_patchelf):
    rpaths = ["/usr/lib", "/usr/lib64", "/opt/local/lib"]
    # To avoid using capfd in order to check if the warning was triggered
    # here we just check that output is not set
    output = spack.relocate._set_elf_rpaths(patchelf, rpaths)
    output = spack.relocate._set_elf_rpaths(str(patchelf), rpaths)
    assert output is None


@@ -71,7 +71,7 @@ def test_template_retrieval(self):
        """Tests the template retrieval mechanism hooked into config files"""
        # Check the directories are correct
        template_dirs = spack.config.get("config:template_dirs")
        template_dirs = [canonicalize_path(x) for x in template_dirs]
        template_dirs = tuple([canonicalize_path(x) for x in template_dirs])
        assert len(template_dirs) == 3

        env = tengine.make_environment(template_dirs)
@@ -12,6 +12,7 @@

import spack.install_test
import spack.spec
from spack.install_test import TestStatus
from spack.util.executable import which


@@ -20,7 +21,7 @@ def _true(*args, **kwargs):
    return True


def ensure_results(filename, expected):
def ensure_results(filename, expected, present=True):
    assert os.path.exists(filename)
    with open(filename, "r") as fd:
        lines = fd.readlines()
@@ -29,7 +30,10 @@ def ensure_results(filename, expected):
        if expected in line:
            have = True
            break
    assert have
    if present:
        assert have, f"Expected '{expected}' in the file"
    else:
        assert not have, f"Expected '{expected}' NOT to be in the file"


def test_test_log_name(mock_packages, config):
@@ -78,8 +82,8 @@ def test_write_test_result(mock_packages, mock_test_stage):
    assert spec.name in msg


def test_test_uninstalled(mock_packages, install_mockery, mock_test_stage):
    """Attempt to perform stand-alone test for uninstalled package."""
def test_test_not_installed(mock_packages, install_mockery, mock_test_stage):
    """Attempt to perform stand-alone test for not_installed package."""
    spec = spack.spec.Spec("trivial-smoke-test").concretized()
    test_suite = spack.install_test.TestSuite([spec])

@@ -91,10 +95,7 @@ def test_test_uninstalled(mock_packages, install_mockery, mock_test_stage):

@pytest.mark.parametrize(
    "arguments,status,msg",
    [
        ({}, spack.install_test.TestStatus.SKIPPED, "Skipped"),
        ({"externals": True}, spack.install_test.TestStatus.NO_TESTS, "No tests"),
    ],
    [({}, TestStatus.SKIPPED, "Skipped"), ({"externals": True}, TestStatus.NO_TESTS, "No tests")],
)
def test_test_external(
    mock_packages, install_mockery, mock_test_stage, monkeypatch, arguments, status, msg
@@ -156,6 +157,7 @@ def test_test_spec_passes(mock_packages, install_mockery, mock_test_stage, monke

    ensure_results(test_suite.results_file, "PASSED")
    ensure_results(test_suite.log_file_for_spec(spec), "simple stand-alone")
    ensure_results(test_suite.log_file_for_spec(spec), "standalone-ifc", present=False)


def test_get_test_suite():
@@ -212,8 +214,10 @@ def test_test_functions_pkgless(mock_packages, install_mockery, ensure_debug, ca
    spec = spack.spec.Spec("simple-standalone-test").concretized()
    fns = spack.install_test.test_functions(spec.package, add_virtuals=True)
    out = capsys.readouterr()
    assert len(fns) == 1, "Expected only one test function"
    assert "does not appear to have a package file" in out[1]
    assert len(fns) == 2, "Expected two test functions"
    for f in fns:
        assert f[1].__name__ in ["test_echo", "test_skip"]
    assert "virtual does not appear to have a package file" in out[1]


# TODO: This test should go away when compilers as dependencies is supported
@@ -301,7 +305,7 @@ def test_test_part_fail(tmpdir, install_mockery_mutable_config, mock_fetch, mock

    for part_name, status in pkg.tester.test_parts.items():
        assert part_name.endswith(name)
        assert status == spack.install_test.TestStatus.FAILED
        assert status == TestStatus.FAILED


def test_test_part_pass(install_mockery_mutable_config, mock_fetch, mock_test_stage):
@@ -317,7 +321,7 @@ def test_test_part_pass(install_mockery_mutable_config, mock_fetch, mock_test_st

    for part_name, status in pkg.tester.test_parts.items():
        assert part_name.endswith(name)
        assert status == spack.install_test.TestStatus.PASSED
        assert status == TestStatus.PASSED


def test_test_part_skip(install_mockery_mutable_config, mock_fetch, mock_test_stage):
@@ -331,7 +335,7 @@ def test_test_part_skip(install_mockery_mutable_config, mock_fetch, mock_test_st

    for part_name, status in pkg.tester.test_parts.items():
        assert part_name.endswith(name)
        assert status == spack.install_test.TestStatus.SKIPPED
        assert status == TestStatus.SKIPPED


def test_test_part_missing_exe_fail_fast(
@@ -354,7 +358,7 @@ def test_test_part_missing_exe_fail_fast(
    assert len(test_parts) == 1
    for part_name, status in test_parts.items():
        assert part_name.endswith(name)
        assert status == spack.install_test.TestStatus.FAILED
        assert status == TestStatus.FAILED


def test_test_part_missing_exe(
@@ -375,7 +379,90 @@ def test_test_part_missing_exe(
    assert len(test_parts) == 1
    for part_name, status in test_parts.items():
        assert part_name.endswith(name)
        assert status == spack.install_test.TestStatus.FAILED
        assert status == TestStatus.FAILED


# TODO (embedded test parts): Update this once embedded test part tracking
# TODO (embedded test parts): properly handles the nested context managers.
@pytest.mark.parametrize(
    "current,substatuses,expected",
    [
        (TestStatus.PASSED, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.PASSED),
        (TestStatus.FAILED, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.FAILED),
        (TestStatus.SKIPPED, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.SKIPPED),
        (TestStatus.NO_TESTS, [TestStatus.PASSED, TestStatus.PASSED], TestStatus.NO_TESTS),
        (TestStatus.PASSED, [TestStatus.PASSED, TestStatus.SKIPPED], TestStatus.PASSED),
        (TestStatus.PASSED, [TestStatus.PASSED, TestStatus.FAILED], TestStatus.FAILED),
        (TestStatus.PASSED, [TestStatus.SKIPPED, TestStatus.SKIPPED], TestStatus.SKIPPED),
    ],
)
def test_embedded_test_part_status(
    install_mockery_mutable_config, mock_fetch, mock_test_stage, current, substatuses, expected
):
    """Check to ensure the status of the enclosing test part reflects summary of embedded parts."""

    s = spack.spec.Spec("trivial-smoke-test").concretized()
    pkg = s.package
    base_name = "test_example"
    part_name = f"{pkg.__class__.__name__}::{base_name}"

    pkg.tester.test_parts[part_name] = current
    for i, status in enumerate(substatuses):
        pkg.tester.test_parts[f"{part_name}_{i}"] = status

    pkg.tester.status(base_name, current)
    assert pkg.tester.test_parts[part_name] == expected


@pytest.mark.parametrize(
    "statuses,expected",
    [
        ([TestStatus.PASSED, TestStatus.PASSED], TestStatus.PASSED),
        ([TestStatus.PASSED, TestStatus.SKIPPED], TestStatus.PASSED),
        ([TestStatus.PASSED, TestStatus.FAILED], TestStatus.FAILED),
        ([TestStatus.SKIPPED, TestStatus.SKIPPED], TestStatus.SKIPPED),
        ([], TestStatus.NO_TESTS),
    ],
)
def test_write_tested_status(
    tmpdir, install_mockery_mutable_config, mock_fetch, mock_test_stage, statuses, expected
):
    """Check to ensure the status of the enclosing test part reflects summary of embedded parts."""
    s = spack.spec.Spec("trivial-smoke-test").concretized()
    pkg = s.package
    for i, status in enumerate(statuses):
        pkg.tester.test_parts[f"test_{i}"] = status
        pkg.tester.counts[status] += 1

    pkg.tester.tested_file = tmpdir.join("test-log.txt")
    pkg.tester.write_tested_status()
    with open(pkg.tester.tested_file, "r") as f:
        status = int(f.read().strip("\n"))
    assert TestStatus(status) == expected


@pytest.mark.regression("37840")
def test_write_tested_status_no_repeats(
    tmpdir, install_mockery_mutable_config, mock_fetch, mock_test_stage
):
    """Emulate re-running the same stand-alone tests a second time."""
    s = spack.spec.Spec("trivial-smoke-test").concretized()
    pkg = s.package
    statuses = [TestStatus.PASSED, TestStatus.PASSED]
    for i, status in enumerate(statuses):
        pkg.tester.test_parts[f"test_{i}"] = status
        pkg.tester.counts[status] += 1

    pkg.tester.tested_file = tmpdir.join("test-log.txt")
    pkg.tester.write_tested_status()
    pkg.tester.write_tested_status()

    # The test should NOT result in a ValueError: invalid literal for int()
    # with base 10: '2\n2' (i.e., the results being appended instead of
    # written to the file).
    with open(pkg.tester.tested_file, "r") as f:
        status = int(f.read().strip("\n"))
    assert TestStatus(status) == TestStatus.PASSED


def test_check_special_outputs(tmpdir):
@@ -244,6 +244,7 @@ def check_ast_roundtrip(code1, filename="internal", mode="exec"):
    assert ast.dump(ast1) == ast.dump(ast2), error_msg


@pytest.mark.xfail(reason="https://github.com/spack/spack/pull/38424")
def test_core_lib_files():
    """Roundtrip source files from the Python core libs."""
    test_directories = [
@@ -935,7 +935,7 @@ def test_inclusion_upperbound():

@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
def test_git_version_repo_attached_after_serialization(
    mock_git_version_info, mock_packages, monkeypatch
    mock_git_version_info, mock_packages, config, monkeypatch
):
    """Test that a GitVersion instance can be serialized and deserialized
    without losing its repository reference.
@@ -954,7 +954,9 @@ def test_git_version_repo_attached_after_serialization(


@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
def test_resolved_git_version_is_shown_in_str(mock_git_version_info, mock_packages, monkeypatch):
def test_resolved_git_version_is_shown_in_str(
    mock_git_version_info, mock_packages, config, monkeypatch
):
    """Test that a GitVersion from a commit without a user supplied version is printed
    as <hash>=<version>, and not just <hash>."""
    repo_path, _, commits = mock_git_version_info
@@ -968,7 +970,7 @@ def test_resolved_git_version_is_shown_in_str(mock_git_version_info, mock_packag
    assert str(spec.version) == f"{commit}=1.0-git.1"


def test_unresolvable_git_versions_error(mock_packages):
def test_unresolvable_git_versions_error(config, mock_packages):
    """Test that VersionLookupError is raised when a git prop is not set on a package."""
    with pytest.raises(VersionLookupError):
        # The package exists, but does not have a git property set. When dereferencing
@@ -340,13 +340,20 @@ def execute(self, env: MutableMapping[str, str]):


class SetEnv(NameValueModifier):
    __slots__ = ("force",)
    __slots__ = ("force", "raw")

    def __init__(
        self, name: str, value: str, *, trace: Optional[Trace] = None, force: bool = False
        self,
        name: str,
        value: str,
        *,
        trace: Optional[Trace] = None,
        force: bool = False,
        raw: bool = False,
    ):
        super().__init__(name, value, trace=trace)
        self.force = force
        self.raw = raw

    def execute(self, env: MutableMapping[str, str]):
        tty.debug(f"SetEnv: {self.name}={str(self.value)}", level=3)
@@ -490,15 +497,16 @@ def _trace(self) -> Optional[Trace]:
        return Trace(filename=filename, lineno=lineno, context=current_context)

    @system_env_normalize
    def set(self, name: str, value: str, *, force: bool = False):
    def set(self, name: str, value: str, *, force: bool = False, raw: bool = False):
        """Stores a request to set an environment variable.

        Args:
            name: name of the environment variable
            value: value of the environment variable
            force: if True, audit will not consider this modification a warning
            raw: if True, format of value string is skipped
        """
        item = SetEnv(name, value, trace=self._trace(), force=force)
        item = SetEnv(name, value, trace=self._trace(), force=force, raw=raw)
        self.env_modifications.append(item)

    @system_env_normalize
@@ -757,16 +765,21 @@ def from_sourcing_file(
            "PS1",
            "PS2",
            "ENV",
            # Environment modules v4
            # Environment Modules or Lmod
            "LOADEDMODULES",
            "_LMFILES_",
            "BASH_FUNC_module()",
            "MODULEPATH",
            "MODULES_(.*)",
            r"(\w*)_mod(quar|share)",
            # Lmod configuration
            r"LMOD_(.*)",
            "MODULERCFILE",
            "BASH_FUNC_ml()",
            "BASH_FUNC_module()",
            # Environment Modules-specific configuration
            "MODULESHOME",
            "BASH_FUNC__module_raw()",
            r"MODULES_(.*)",
            r"__MODULES_(.*)",
            r"(\w*)_mod(quar|share)",
            # Lmod-specific configuration
            r"LMOD_(.*)",
        ]
    )


@@ -225,7 +225,7 @@ spack:
    - - $compiler
    - - $target

  mirrors: { "mirror": "s3://spack-binaries/develop/aws-ahug-aarch64" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/aws-ahug-aarch64" }

  ci:
    pipeline-gen:

@@ -222,7 +222,7 @@ spack:
    - - $compiler
    - - $target

  mirrors: { "mirror": "s3://spack-binaries/develop/aws-ahug" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/aws-ahug" }

  ci:
    pipeline-gen:

@@ -132,7 +132,7 @@ spack:
    - - $target


  mirrors: { "mirror": "s3://spack-binaries/develop/aws-isc-aarch64" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/aws-isc-aarch64" }

  ci:
    pipeline-gen:

@@ -143,7 +143,7 @@ spack:
    - - $target


  mirrors: { "mirror": "s3://spack-binaries/develop/aws-isc" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/aws-isc" }

  ci:
    pipeline-gen:

@@ -21,7 +21,7 @@ spack:
    - - $default_specs
    - - $arch

  mirrors: { "mirror": "s3://spack-binaries/develop/build_systems" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/build_systems" }

  cdash:
    build-group: Build tests for different build systems
    build-group: Build Systems

@@ -6,9 +6,9 @@ spack:
    mesa:
      require: "+glx +osmesa +opengl ~opengles +llvm"
    libosmesa:
      require: ^mesa +osmesa
      require: "mesa +osmesa"
    libglx:
      require: ^mesa +glx
      require: "mesa +glx"
    ospray:
      require: "@2.8.0 +denoiser +mpi"
    llvm:
@@ -64,7 +64,7 @@ spack:
    - [$sdk_base_spec]
    - [$^visit_specs]

  mirrors: { "mirror": "s3://spack-binaries/develop/data-vis-sdk" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/data-vis-sdk" }

  ci:
    pipeline-gen:

@@ -21,7 +21,7 @@ spack:
    - readline

  mirrors:
    mirror: s3://spack-binaries/develop/deprecated
    mirror: s3://spack-binaries/releases/v0.20/deprecated
  gitlab-ci:
    broken-tests-packages:
      - gptune

@@ -24,7 +24,7 @@ spack:
    - - $easy_specs
    - - $arch

  mirrors: { "mirror": "s3://spack-binaries/develop/e4s-mac" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/e4s-mac" }

  ci:
    pipeline-gen:

@@ -263,12 +263,16 @@ spack:
  # SKIPPED
  # - flecsi # dependency pfunit marks oneapi as an unsupported compiler

  mirrors: { "mirror": "s3://spack-binaries/develop/e4s-oneapi" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/e4s-oneapi" }

  ci:
    pipeline-gen:
    - build-job:
        image: ecpe4s/ubuntu20.04-runner-x86_64-oneapi:2023-01-01
        before_script:
        - - . /bootstrap/runner/view/lmod/lmod/init/bash
          - module use /opt/intel/oneapi/modulefiles
          - module load compiler

  cdash:
    build-group: E4S OneAPI

@@ -207,7 +207,7 @@ spack:
  # bricks: VSBrick-7pt.py-Scalar-8x8x8-1:30:3: error: 'vfloat512' was not declared in this scope


  mirrors: { "mirror": "s3://spack-binaries/develop/e4s-power" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/e4s-power" }

  ci:
    pipeline-gen:

@@ -232,7 +232,7 @@ spack:
  # CUDA failures
  #- parsec +cuda # parsec/mca/device/cuda/transfer.c:168: multiple definition of `parsec_CUDA_d2h_max_flows';

  mirrors: { "mirror": "s3://spack-binaries/develop/e4s" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/e4s" }

  ci:
    pipeline-gen:

@@ -51,7 +51,7 @@ spack:
  # FAILURES
  # - kokkos +wrapper +cuda cuda_arch=80 ^cuda@12.0.0 # https://github.com/spack/spack/issues/35378

  mirrors: { "mirror": "s3://spack-binaries/develop/gpu-tests" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/gpu-tests" }

  ci:
    pipeline-gen:

@@ -77,7 +77,7 @@ spack:
    - xgboost

  mirrors:
    mirror: s3://spack-binaries/develop/ml-linux-x86_64-cpu
    mirror: s3://spack-binaries/releases/v0.20/ml-linux-x86_64-cpu

  ci:
    pipeline-gen:

@@ -80,7 +80,7 @@ spack:
    - xgboost

  mirrors:
    mirror: s3://spack-binaries/develop/ml-linux-x86_64-cuda
    mirror: s3://spack-binaries/releases/v0.20/ml-linux-x86_64-cuda

  ci:
    pipeline-gen:

@@ -83,7 +83,7 @@ spack:
    - xgboost

  mirrors:
    mirror: s3://spack-binaries/develop/ml-linux-x86_64-rocm
    mirror: s3://spack-binaries/releases/v0.20/ml-linux-x86_64-rocm

  ci:
    pipeline-gen:

@@ -38,7 +38,7 @@ spack:
    - - $compiler
    - - $target

  mirrors: { "mirror": "s3://spack-binaries/develop/radiuss-aws-aarch64" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/radiuss-aws-aarch64" }

  ci:
    pipeline-gen:

@@ -44,7 +44,7 @@ spack:
    - - $compiler
    - - $target

  mirrors: { "mirror": "s3://spack-binaries/develop/radiuss-aws" }
  mirrors: { "mirror": "s3://spack-binaries/releases/v0.20/radiuss-aws" }

  ci:
    pipeline-gen:

@@ -41,7 +41,7 @@ spack:
    - zfp

  mirrors:
    mirror: "s3://spack-binaries/develop/radiuss"
    mirror: "s3://spack-binaries/releases/v0.20/radiuss"

  specs:
  - matrix:

@@ -50,7 +50,7 @@ spack:
    - $gcc_spack_built_packages

  mirrors:
    mirror: s3://spack-binaries/develop/tutorial
    mirror: s3://spack-binaries/releases/v0.20/tutorial
  ci:
    pipeline-gen:
    - build-job:
@@ -1060,7 +1060,7 @@ _spack_external_list() {
}

_spack_external_read_cray_manifest() {
    SPACK_COMPREPLY="-h --help --file --directory --dry-run --fail-on-error"
    SPACK_COMPREPLY="-h --help --file --directory --ignore-default-dir --dry-run --fail-on-error"
}

_spack_fetch() {
@@ -37,7 +37,7 @@ RUN find -L {{ paths.view }}/* -type f -exec readlink -f '{}' \; | \

# Modifications to the environment that are necessary to run
RUN cd {{ paths.environment }} && \
    spack env activate --sh -d . >> /etc/profile.d/z10_spack_environment.sh
    spack env activate --sh -d . > activate.sh

{% if extra_instructions.build %}
{{ extra_instructions.build }}
@@ -53,7 +53,13 @@ COPY --from=builder {{ paths.environment }} {{ paths.environment }}
COPY --from=builder {{ paths.store }} {{ paths.store }}
COPY --from=builder {{ paths.hidden_view }} {{ paths.hidden_view }}
COPY --from=builder {{ paths.view }} {{ paths.view }}
COPY --from=builder /etc/profile.d/z10_spack_environment.sh /etc/profile.d/z10_spack_environment.sh

RUN { \
      echo '#!/bin/sh' \
      && echo '.' {{ paths.environment }}/activate.sh \
      && echo 'exec "$@"'; \
    } > /entrypoint.sh \
 && chmod a+x /entrypoint.sh

{% block final_stage %}

@@ -70,6 +76,6 @@ RUN {% if os_package_update %}{{ os_packages_final.update }} \
{% for label, value in labels.items() %}
LABEL "{{ label }}"="{{ value }}"
{% endfor %}
ENTRYPOINT ["/bin/bash", "--rcfile", "/etc/profile", "-l", "-c", "$*", "--" ]
ENTRYPOINT [ "/entrypoint.sh" ]
CMD [ "/bin/bash" ]
{% endif %}
@@ -1,7 +1,7 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
 && yum install -y \
RUN dnf update -y \
 && dnf install -y \
    bzip2 \
    curl \
    file \
@@ -23,6 +23,6 @@ RUN yum update -y \
    unzip \
    zstd \
 && pip3 install boto3 \
 && rm -rf /var/cache/yum \
 && yum clean all
 && rm -rf /var/cache/dnf \
 && dnf clean all
{% endblock %}

@@ -1,9 +1,9 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
 && yum install -y epel-release \
 && yum update -y \
 && yum --enablerepo epel install -y \
RUN dnf update -y \
 && dnf install -y epel-release \
 && dnf update -y \
 && dnf --enablerepo epel install -y \
    bzip2 \
    curl-minimal \
    file \
@@ -25,6 +25,6 @@ RUN yum update -y \
    unzip \
    zstd \
 && pip3 install boto3 \
 && rm -rf /var/cache/yum \
 && yum clean all
 && rm -rf /var/cache/dnf \
 && dnf clean all
{% endblock %}

@@ -1,13 +1,13 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
RUN dnf update -y \
 # See https://fedoraproject.org/wiki/EPEL#Quickstart for powertools
 && yum install -y dnf-plugins-core \
 && dnf install -y dnf-plugins-core \
 && dnf config-manager --set-enabled powertools \
 && yum install -y epel-release \
 && yum update -y \
 && yum --enablerepo epel groupinstall -y "Development Tools" \
 && yum --enablerepo epel install -y \
 && dnf install -y epel-release \
 && dnf update -y \
 && dnf --enablerepo epel groupinstall -y "Development Tools" \
 && dnf --enablerepo epel install -y \
    curl \
    findutils \
    gcc-c++ \
@@ -24,6 +24,6 @@ RUN yum update -y \
    python38-setuptools \
    unzip \
 && pip3 install boto3 \
 && rm -rf /var/cache/yum \
 && yum clean all
 && rm -rf /var/cache/dnf \
 && dnf clean all
{% endblock %}

@@ -1,7 +1,7 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
 && yum install -y \
RUN dnf update -y \
 && dnf install -y \
    bzip2 \
    curl \
    file \
@@ -24,6 +24,6 @@ RUN yum update -y \
    zstd \
    xz \
 && pip3 install boto3 \
 && rm -rf /var/cache/yum \
 && yum clean all
 && rm -rf /var/cache/dnf \
 && dnf clean all
{% endblock %}

@@ -1,7 +1,7 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
 && yum install -y \
RUN dnf update -y \
 && dnf install -y \
    bzip2 \
    curl \
    file \
@@ -24,6 +24,6 @@ RUN yum update -y \
    xz \
    zstd \
 && pip3 install boto3 \
 && rm -rf /var/cache/yum \
 && yum clean all
 && rm -rf /var/cache/dnf \
 && dnf clean all
{% endblock %}

@@ -1,7 +1,7 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
 && yum install -y \
RUN dnf update -y \
 && dnf install -y \
    bzip2 \
    curl \
    file \
@@ -24,6 +24,6 @@ RUN yum update -y \
    xz \
    zstd \
 && pip3 install boto3 \
 && rm -rf /var/cache/yum \
 && yum clean all
 && rm -rf /var/cache/dnf \
 && dnf clean all
{% endblock %}

@@ -1,9 +1,9 @@
{% extends "container/bootstrap-base.dockerfile" %}
{% block install_os_packages %}
RUN yum update -y \
 && yum install -y epel-release \
 && yum update -y \
 && yum --enablerepo epel install -y \
RUN dnf update -y \
 && dnf install -y epel-release \
 && dnf update -y \
 && dnf --enablerepo epel install -y \
    bzip2 \
    curl-minimal \
    file \
@@ -26,6 +26,6 @@ RUN yum update -y \
    xz \
    zstd \
 && pip3 install boto3 \
 && rm -rf /var/cache/yum \
 && yum clean all
 && rm -rf /var/cache/dnf \
 && dnf clean all
{% endblock %}
@@ -84,6 +84,10 @@ setenv("{{ cmd.name }}", "{{ cmd.value }}")
unsetenv("{{ cmd.name }}")
{% endif %}
{% endfor %}
{# Make sure system man pages are enabled by appending trailing delimiter to MANPATH #}
{% if has_manpath_modifications %}
append_path("MANPATH", "", ":")
{% endif %}
{% endblock %}

{% block footer %}
@@ -26,9 +26,17 @@ proc ModulesHelp { } {
{% endblock %}

{% block autoloads %}
{% if autoload|length > 0 %}
if {![info exists ::env(LMOD_VERSION_MAJOR)]} {
{% for module in autoload %}
module load {{ module }}
    module load {{ module }}
{% endfor %}
} else {
{% for module in autoload %}
    depends-on {{ module }}
{% endfor %}
}
{% endif %}
{% endblock %}
{# #}
{% block prerequisite %}
@@ -58,6 +66,10 @@ unsetenv {{ cmd.name }}
{% endif %}
{# #}
{% endfor %}
{# Make sure system man pages are enabled by appending trailing delimiter to MANPATH #}
{% if has_manpath_modifications %}
append-path --delim ":" MANPATH ""
{% endif %}
{% endblock %}

{% block footer %}
@@ -0,0 +1,16 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class ModuleManpathAppend(Package):
    homepage = "http://www.llnl.gov"
    url = "http://www.llnl.gov/module-manpath-append-1.0.tar.gz"

    version("1.0", "0123456789abcdef0123456789abcdef")

    def setup_run_environment(self, env):
        env.append_path("MANPATH", "/path/to/man")
@@ -0,0 +1,17 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class ModuleManpathPrepend(Package):
    homepage = "http://www.llnl.gov"
    url = "http://www.llnl.gov/module-manpath-prepend-1.0.tar.gz"

    version("1.0", "0123456789abcdef0123456789abcdef")

    def setup_run_environment(self, env):
        env.prepend_path("MANPATH", "/path/to/man")
        env.prepend_path("MANPATH", "/path/to/share/man")
@@ -0,0 +1,16 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class ModuleManpathSetenv(Package):
    homepage = "http://www.llnl.gov"
    url = "http://www.llnl.gov/module-manpath-setenv-1.0.tar.gz"

    version("1.0", "0123456789abcdef0123456789abcdef")

    def setup_run_environment(self, env):
        env.set("MANPATH", "/path/to/man")
@@ -0,0 +1,16 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class ModuleSetenvRaw(Package):
    homepage = "http://www.llnl.gov"
    url = "http://www.llnl.gov/module-setenv-raw-1.0.tar.gz"

    version("1.0", "0123456789abcdef0123456789abcdef")

    def setup_run_environment(self, env):
        env.set("FOO", "{{name}}, {name}, {{}}, {}", raw=True)
@@ -14,6 +14,7 @@ class Python(Package):

    extendable = True

    version("3.7.1", md5="aaabbbcccdddeeefffaaabbbcccddd12")
    version("3.5.1", md5="be78e48cdfc1a7ad90efff146dce6cfe")
    version("3.5.0", md5="a56c0c0b45d75a0ec9c6dee933c41c36")
    version("2.7.11", md5="6b6076ec9e93f05dd63e47eb9c15728b", preferred=True)
@@ -7,16 +7,24 @@


class SimpleStandaloneTest(Package):
    """This package has a simple stand-alone test features."""
    """This package has simple stand-alone test features."""

    homepage = "http://www.example.com/simple_test"
    url = "http://www.unit-test-should-replace-this-url/simple_test-1.0.tar.gz"

    version("1.0", md5="0123456789abcdef0123456789abcdef")
    version("1.0", md5="123456789abcdef0123456789abcdefg")
    version("0.9", md5="0123456789abcdef0123456789abcdef")

    provides("standalone-test")
    provides("standalone-ifc")

    def test_echo(self):
        """simple stand-alone test"""
        echo = which("echo")
        echo("testing echo", output=str.split, error=str.split)

    def test_skip(self):
        """simple skip test"""
        if self.spec.satisfies("@1.0:"):
            raise SkipTest("This test is not available from v1.0 on")

        print("Ran test_skip")
@@ -436,6 +436,22 @@ def setup_dependent_package(self, module, dependent_spec):
        module.cmake = Executable(self.spec.prefix.bin.cmake)
        module.ctest = Executable(self.spec.prefix.bin.ctest)

    @property
    def libs(self):
        """CMake has no libraries, so if you ask for `spec['cmake'].libs`
        (which happens automatically for packages that depend on CMake as
        a link dependency) the default implementation of ``.libs`` will
        search the entire root prefix recursively before failing.

        The longer term solution is for all dependents of CMake to change
        their deptype. For now, this returns an empty set of libraries.
        """
        return LibraryList([])

    @property
    def headers(self):
        return HeaderList([])

    def test(self):
        """Perform smoke tests on the installed package."""
        spec_vers_str = "version {0}".format(self.spec.version)
@@ -32,11 +32,8 @@ class Libxcb(AutotoolsPackage):
    depends_on("xcb-proto@1.12:", when="@1.12:1.12.999")
    depends_on("xcb-proto@1.11:", when="@1.11:1.11.999")

    # TODO: uncomment once build deps can be resolved separately
    # See #7646, #4145, #4063, and #2548 for details
    # libxcb 1.13 added Python 3 support
    # depends_on('python', type='build')
    # depends_on('python@2:2.8', when='@:1.12', type='build')
    depends_on("python@3", when="@1.13:", type="build")

    depends_on("pkgconfig", type="build")
    depends_on("util-macros", type="build")
Some files were not shown because too many files have changed in this diff.