Compare commits


6 Commits

Author            SHA1        Message                                                       Date
Harmen Stoppels   614714fc01  delete dead code                                              2025-01-06 15:07:48 +01:00
Wouter Deconinck  b328b5d169  PythonPackage: call python and pass env mods                  2025-01-05 09:47:58 -06:00
Wouter Deconinck  cda1a2fd91  PythonPackage: fix style                                      2025-01-03 19:43:21 -06:00
Wouter Deconinck  4181237bcd  PythonPackage: fix missing import                             2025-01-03 19:40:57 -06:00
Wouter Deconinck  7a4859721a  EnvironmentModifications.set_env(self, env) context manager  2025-01-03 19:35:53 -06:00
Wouter Deconinck  d4a28b4e22  PythonPackage: run test_imports in run env context            2025-01-03 19:30:48 -06:00
591 changed files with 4105 additions and 6674 deletions

View File

@@ -29,7 +29,7 @@ jobs:
- run: coverage xml
- name: "Upload coverage report to CodeCov"
uses: codecov/codecov-action@1e68e06f1dbfde0e4cefc87efeba9e4643565303
uses: codecov/codecov-action@05f5a9cfad807516dbbef9929c4a42df3eb78766
with:
verbose: true
fail_ci_if_error: false

View File

@@ -2,6 +2,6 @@ black==24.10.0
clingo==5.7.1
flake8==7.1.1
isort==5.13.2
mypy==1.11.2
mypy==1.8.0
types-six==1.17.0.20241205
vermin==1.6.0

View File

@@ -20,7 +20,7 @@ jobs:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.13'
python-version: '3.11'
cache: 'pip'
- name: Install Python Packages
run: |
@@ -39,7 +39,7 @@ jobs:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.13'
python-version: '3.11'
cache: 'pip'
- name: Install Python packages
run: |
@@ -58,7 +58,7 @@ jobs:
secrets: inherit
with:
with_coverage: ${{ inputs.with_coverage }}
python_version: '3.13'
python_version: '3.11'
# Check that spack can bootstrap the development environment on Python 3.6 - RHEL8
bootstrap-dev-rhel8:
runs-on: ubuntu-latest

View File

@@ -65,7 +65,6 @@ packages:
unwind: [libunwind]
uuid: [util-linux-uuid, libuuid]
wasi-sdk: [wasi-sdk-prebuilt]
xkbdata-api: [xkeyboard-config, xkbdata]
xxd: [xxd-standalone, vim]
yacc: [bison, byacc]
ziglang: [zig]

View File

@@ -25,23 +25,14 @@ These settings can be overridden in ``etc/spack/config.yaml`` or
The location where Spack will install packages and their dependencies.
Default is ``$spack/opt/spack``.
---------------
``projections``
---------------
---------------------------------------------------
``install_hash_length`` and ``install_path_scheme``
---------------------------------------------------
.. warning::
Modifying projections of the install tree is strongly discouraged.
By default Spack installs all packages into a unique directory relative to the install
tree root with the following layout:
.. code-block::
{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}
In very rare cases, it may be necessary to reduce the length of this path. For example,
very old versions of the Intel compiler are known to segfault when input paths are too long:
The default Spack installation path can be very long and can create problems
for scripts with hardcoded shebangs. Additionally, when using the Intel
compiler with a long list of dependencies, the compiler may segfault.
If you see the following:
.. code-block:: console
@@ -49,25 +40,36 @@ very old versions of the Intel compiler are known to segfault when input paths a
** Segmentation violation signal raised. **
Access violation or stack overflow. Please contact Intel Support for assistance.
Another case is Python and R packages with many runtime dependencies, which can result
in very large ``PYTHONPATH`` and ``R_LIBS`` environment variables. This can cause the
``execve`` system call to fail with ``E2BIG``, preventing processes from starting.
it may be because the variables containing dependency specs are too long. There
are two parameters to help with long path names. First, the
``install_hash_length`` parameter can set the length of the hash in the
installation path from 1 to 32. The default path uses the full 32 characters.
For this reason, Spack allows users to modify the installation layout through custom
projections. For example:
Second, it is also possible to modify the entire installation
scheme. By default Spack uses
``{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}``
where the tokens that are available for use in this directive are the
same as those understood by the :meth:`~spack.spec.Spec.format`
method. Using this parameter, it is possible to use a different package
layout or reduce the depth of the installation paths. For example:
.. code-block:: yaml
config:
install_tree:
root: $spack/opt/spack
projections:
all: "{name}/{version}/{hash:16}"
install_path_scheme: '{name}/{version}/{hash:7}'
would install packages into sub-directories using only the package name, version and a
hash length of 16 characters.
would install packages into sub-directories using only the package
name, version and a hash length of 7 characters.
Notice that reducing the hash length increases the likelihood of hash collisions.
Either parameter affects only the representation of the hash in the
installation directory. Be aware that the shorter the hash, the more
likely naming conflicts become. These parameters are independent of those
used to configure module names.
.. warning:: Modifying the installation hash length or path scheme after
packages have been installed will prevent Spack from being
able to find the old installation directories.
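A short, hedged sketch of how the projection tokens above expand; ``Spec.format`` is the method the documentation references, while the package name and resulting path are illustrative:

.. code-block:: python

   from spack.concretize import concretize_one

   # concretize a spec, then expand the same tokens used in projections
   spec = concretize_one("zlib")
   print(spec.format("{name}/{version}/{hash:7}"))
   # e.g. zlib/1.3.1/abcdefg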
--------------------
``build_stage``

View File

@@ -543,10 +543,10 @@ With either interpreter you can run a single command:
.. code-block:: console
$ spack python -c 'from spack.concretize import concretize_one; concretize_one("python")'
$ spack python -c 'from spack.spec import Spec; Spec("python").concretized()'
...
$ spack python -i ipython -c 'from spack.concretize import concretize_one; concretize_one("python")'
$ spack python -i ipython -c 'from spack.spec import Spec; Spec("python").concretized()'
Out[1]: ...
or a file:

View File

@@ -456,13 +456,14 @@ For instance, the following config options,
tcl:
all:
suffixes:
^python@3: 'python{^python.version.up_to_2}'
^python@3: 'python{^python.version}'
^openblas: 'openblas'
will add a ``python3.12`` suffix to the module names of packages compiled with Python 3.12, and
similarly for all specs depending on ``python@3``. This makes it easy to tell which version of
Python a set of Python extensions is associated with. Likewise, the ``openblas`` string is attached
to any program that has openblas in the spec, most likely via the ``+blas`` variant specification.
will add a ``python-3.12.1`` version string to any packages compiled with
Python matching the spec, ``python@3``. This is useful to know which
version of Python a set of Python extensions is associated with. Likewise, the
``openblas`` string is attached to any program that has openblas in the spec,
most likely via the ``+blas`` variant specification.
The most heavyweight solution to module naming is to change the entire
naming convention for module files. This uses the projections format
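A hedged sketch of the projections format that sentence refers to, mirroring the install-tree projections shown in the config documentation above; the token choice is illustrative:

.. code-block:: yaml

   modules:
     default:
       tcl:
         projections:
           all: '{name}/{version}-{compiler.name}-{compiler.version}'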

View File

@@ -4,7 +4,7 @@ sphinx_design==0.6.1
sphinx-rtd-theme==3.0.2
python-levenshtein==0.26.1
docutils==0.21.2
pygments==2.19.1
pygments==2.18.0
urllib3==2.3.0
pytest==8.3.4
isort==5.13.2

View File

@@ -75,6 +75,7 @@
"install_tree",
"is_exe",
"join_path",
"last_modification_time_recursive",
"library_extensions",
"mkdirp",
"partition_path",
@@ -1469,36 +1470,15 @@ def set_executable(path):
@system_path_filter
def recursive_mtime_greater_than(path: str, time: float) -> bool:
"""Returns true if any file or dir recursively under `path` has mtime greater than `time`."""
# use bfs order to increase likelihood of early return
queue: Deque[str] = collections.deque([path])
if os.stat(path).st_mtime > time:
return True
while queue:
current = queue.popleft()
try:
entries = os.scandir(current)
except OSError:
continue
with entries:
for entry in entries:
try:
st = entry.stat(follow_symlinks=False)
except OSError:
continue
if st.st_mtime > time:
return True
if entry.is_dir(follow_symlinks=False):
queue.append(entry.path)
return False
def last_modification_time_recursive(path):
path = os.path.abspath(path)
times = [os.stat(path).st_mtime]
times.extend(
os.lstat(os.path.join(root, name)).st_mtime
for root, dirs, files in os.walk(path)
for name in dirs + files
)
return max(times)
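A brief usage sketch of the early-return variant above; the path and the one-hour threshold are made up:

.. code-block:: python

   import time

   # True as soon as anything under the path is newer than the timestamp,
   # without walking the entire tree first
   if recursive_mtime_greater_than("/tmp/spack-stage", time.time() - 3600):
       print("stage was modified within the last hour")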
@system_path_filter
@@ -1760,7 +1740,8 @@ def find(
def _log_file_access_issue(e: OSError, path: str) -> None:
tty.debug(f"find must skip {path}: {e}")
errno_name = errno.errorcode.get(e.errno, "UNKNOWN")
tty.debug(f"find must skip {path}: {errno_name} {e}")
def _file_id(s: os.stat_result) -> Tuple[int, int]:

View File

@@ -1356,8 +1356,14 @@ def _test_detection_by_executable(pkgs, debug_log, error_cls):
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type
if not isinstance(_detected, type(_expected)):
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
# If they are string expected is a regex
if isinstance(_expected, str) and isinstance(_detected, str):
if isinstance(_expected, str):
try:
_regex = re.compile(_expected)
except re.error:
@@ -1373,7 +1379,7 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
_details = [f"{_detected} does not match the regex"]
return [error_cls(summary=_summary, details=_details)]
elif isinstance(_expected, dict) and isinstance(_detected, dict):
if isinstance(_expected, dict):
_not_detected = set(_expected.keys()) - set(_detected.keys())
if _not_detected:
_summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
@@ -1388,10 +1394,6 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
result.extend(
_compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
)
else:
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
return result
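A hedged sketch of the comparison rules above, with made-up values: a string expected value is treated as a regex against the detected string, and a dict is compared key by key:

.. code-block:: python

   errors = _compare_extra_attribute(
       {"compilers": {"c": r".*/clang-\d+$"}},     # expected: regexes
       {"compilers": {"c": "/usr/bin/clang-14"}},  # detected: concrete values
       _spec=spec,
   )
   assert not errors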

View File

@@ -23,7 +23,7 @@
import urllib.request
import warnings
from contextlib import closing
from typing import IO, Callable, Dict, Iterable, List, NamedTuple, Optional, Set, Tuple, Union
from typing import IO, Dict, Iterable, List, NamedTuple, Optional, Set, Tuple, Union
import llnl.util.filesystem as fsys
import llnl.util.lang
@@ -591,18 +591,32 @@ def file_matches(f: IO[bytes], regex: llnl.util.lang.PatternBytes) -> bool:
f.seek(0)
def specs_to_relocate(spec: spack.spec.Spec) -> List[spack.spec.Spec]:
"""Return the set of specs that may be referenced in the install prefix of the provided spec.
We currently include non-external transitive link and direct run dependencies."""
specs = [
def deps_to_relocate(spec):
"""Return the transitive link and direct run dependencies of the spec.
This is a special traversal for dependencies we need to consider when relocating a package.
Package binaries, scripts, and other files may refer to the prefixes of dependencies, so
we need to rewrite those locations when dependencies are in a different place at install time
than they were at build time.
This traversal covers transitive link dependencies and direct run dependencies because:
1. Spack adds RPATHs for transitive link dependencies so that packages can find needed
dependency libraries.
2. Packages may call any of their *direct* run dependencies (and may bake their paths into
binaries or scripts), so we also need to search for run dependency prefixes when relocating.
This returns a deduplicated list of transitive link dependencies and direct run dependencies.
"""
deps = [
s
for s in itertools.chain(
spec.traverse(root=True, deptype="link", order="breadth", key=traverse.by_dag_hash),
spec.dependencies(deptype="run"),
spec.traverse(root=True, deptype="link"), spec.dependencies(deptype="run")
)
if not s.external
]
return list(llnl.util.lang.dedupe(specs, key=lambda s: s.dag_hash()))
return llnl.util.lang.dedupe(deps, key=lambda s: s.dag_hash())
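A hedged sketch of how this traversal is consumed, mirroring the ``hash_to_prefix`` construction in ``get_buildinfo_dict`` below (``specs_to_relocate`` is the name on one side of this diff, ``deps_to_relocate`` on the other):

.. code-block:: python

   # map each relocatable dependency's DAG hash to its install prefix
   hash_to_prefix = {d.dag_hash(): str(d.prefix) for d in specs_to_relocate(spec)}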
def get_buildinfo_dict(spec):
@@ -616,7 +630,7 @@ def get_buildinfo_dict(spec):
# "relocate_binaries": [],
# "relocate_links": [],
"hardlinks_deduped": True,
"hash_to_prefix": {d.dag_hash(): str(d.prefix) for d in specs_to_relocate(spec)},
"hash_to_prefix": {d.dag_hash(): str(d.prefix) for d in deps_to_relocate(spec)},
}
@@ -669,24 +683,19 @@ def sign_specfile(key: str, specfile_path: str) -> str:
def _read_specs_and_push_index(
file_list: List[str],
read_method: Callable,
cache_prefix: str,
db: BuildCacheDatabase,
temp_dir: str,
concurrency: int,
file_list, read_method, cache_prefix, db: BuildCacheDatabase, temp_dir, concurrency
):
"""Read all the specs listed in the provided list, using thread given thread parallelism,
generate the index, and push it to the mirror.
Args:
file_list: List of urls or file paths pointing at spec files to read
file_list (list(str)): List of urls or file paths pointing at spec files to read
read_method: A function taking a single argument, either a url or a file path,
and which reads the spec file at that location, and returns the spec.
cache_prefix: prefix of the build cache on s3 where index should be pushed.
cache_prefix (str): prefix of the build cache on s3 where index should be pushed.
db: A spack database used for adding specs and then writing the index.
temp_dir: Location to write index.json and hash for pushing
concurrency: Number of parallel processes to use when fetching
temp_dir (str): Location to write index.json and hash for pushing
concurrency (int): Number of parallel processes to use when fetching
"""
for file in file_list:
contents = read_method(file)
@@ -866,12 +875,9 @@ def _url_generate_package_index(url: str, tmpdir: str, concurrency: int = 32):
tty.debug(f"Retrieving spec descriptor files from {url} to build index")
db = BuildCacheDatabase(tmpdir)
db._write()
try:
_read_specs_and_push_index(
file_list, read_fn, url, db, str(db.database_directory), concurrency
)
_read_specs_and_push_index(file_list, read_fn, url, db, db.database_directory, concurrency)
except Exception as e:
raise GenerateIndexError(f"Encountered problem pushing package index to {url}: {e}") from e
@@ -1106,7 +1112,7 @@ def _exists_in_buildcache(spec: spack.spec.Spec, tmpdir: str, out_url: str) -> E
def prefixes_to_relocate(spec):
prefixes = [s.prefix for s in specs_to_relocate(spec)]
prefixes = [s.prefix for s in deps_to_relocate(spec)]
prefixes.append(spack.hooks.sbang.sbang_install_path())
prefixes.append(str(spack.store.STORE.layout.root))
return prefixes
@@ -2133,9 +2139,10 @@ def fetch_url_to_mirror(url):
def dedupe_hardlinks_if_necessary(root, buildinfo):
"""Updates a buildinfo dict for old archives that did not dedupe hardlinks. De-duping hardlinks
is necessary when relocating files in parallel and in-place. This means we must preserve inodes
when relocating."""
"""Updates a buildinfo dict for old archives that did
not dedupe hardlinks. De-duping hardlinks is necessary
when relocating files in parallel and in-place. This
means we must preserve inodes when relocating."""
# New archives don't need this.
if buildinfo.get("hardlinks_deduped", False):
@@ -2164,48 +2171,65 @@ def dedupe_hardlinks_if_necessary(root, buildinfo):
buildinfo[key] = new_list
def relocate_package(spec: spack.spec.Spec) -> None:
"""Relocate binaries and text files in the given spec prefix, based on its buildinfo file."""
spec_prefix = str(spec.prefix)
buildinfo = read_buildinfo_file(spec_prefix)
def relocate_package(spec):
"""
Relocate the given package
"""
workdir = str(spec.prefix)
buildinfo = read_buildinfo_file(workdir)
new_layout_root = str(spack.store.STORE.layout.root)
new_prefix = str(spec.prefix)
new_rel_prefix = str(os.path.relpath(new_prefix, new_layout_root))
new_spack_prefix = str(spack.paths.prefix)
old_sbang_install_path = None
if "sbang_install_path" in buildinfo:
old_sbang_install_path = str(buildinfo["sbang_install_path"])
old_layout_root = str(buildinfo["buildpath"])
old_spack_prefix = str(buildinfo.get("spackprefix"))
old_rel_prefix = buildinfo.get("relative_prefix")
old_prefix = os.path.join(old_layout_root, old_rel_prefix)
rel = buildinfo.get("relative_rpaths", False)
# Warn about old style tarballs created with the --rel flag (removed in Spack v0.20)
if buildinfo.get("relative_rpaths", False):
tty.warn(
f"Tarball for {spec} uses relative rpaths, which can cause library loading issues."
)
# In Spack 0.19 and older prefix_to_hash was the default and externals were not dropped, so
# prefixes were not unique.
# In the past prefix_to_hash was the default and externals were not dropped, so prefixes
# were not unique.
if "hash_to_prefix" in buildinfo:
hash_to_old_prefix = buildinfo["hash_to_prefix"]
elif "prefix_to_hash" in buildinfo:
hash_to_old_prefix = {v: k for (k, v) in buildinfo["prefix_to_hash"].items()}
hash_to_old_prefix = dict((v, k) for (k, v) in buildinfo["prefix_to_hash"].items())
else:
raise NewLayoutException(
"Package tarball was created from an install prefix with a different directory layout "
"and an older buildcache create implementation. It cannot be relocated."
)
hash_to_old_prefix = dict()
prefix_to_prefix: Dict[str, str] = {}
if old_rel_prefix != new_rel_prefix and not hash_to_old_prefix:
msg = "Package tarball was created from an install "
msg += "prefix with a different directory layout and an older "
msg += "buildcache create implementation. It cannot be relocated."
raise NewLayoutException(msg)
if "sbang_install_path" in buildinfo:
old_sbang_install_path = str(buildinfo["sbang_install_path"])
prefix_to_prefix[old_sbang_install_path] = spack.hooks.sbang.sbang_install_path()
# Spurious replacements (e.g. sbang) will cause issues with binaries
# For example, the new sbang can be longer than the old one.
# Hence 2 dictionaries are maintained here.
prefix_to_prefix_text = collections.OrderedDict()
prefix_to_prefix_bin = collections.OrderedDict()
# First match specific prefix paths. Possibly the *local* install prefix of some dependency is
# in an upstream, so we cannot assume the original spack store root can be mapped uniformly to
# the new spack store root.
if old_sbang_install_path:
install_path = spack.hooks.sbang.sbang_install_path()
prefix_to_prefix_text[old_sbang_install_path] = install_path
# If the spec is spliced, we need to handle the simultaneous mapping from the old install_tree
# to the new install_tree and from the build_spec to the spliced spec. Because foo.build_spec
# is foo for any non-spliced spec, we can simplify by checking for spliced-in nodes by checking
# for nodes not in the build_spec without any explicit check for whether the spec is spliced.
# An analog in this algorithm is any spec that shares a name or provides the same virtuals in
# the context of the relevant root spec. This ensures that the analog for a spec s is the spec
# that s replaced when we spliced.
relocation_specs = specs_to_relocate(spec)
# First match specific prefix paths. Possibly the *local* install prefix
# of some dependency is in an upstream, so we cannot assume the original
# spack store root can be mapped uniformly to the new spack store root.
#
# If the spec is spliced, we need to handle the simultaneous mapping
# from the old install_tree to the new install_tree and from the build_spec
# to the spliced spec.
# Because foo.build_spec is foo for any non-spliced spec, we can simplify
# by checking for spliced-in nodes by checking for nodes not in the build_spec
# without any explicit check for whether the spec is spliced.
# An analog in this algorithm is any spec that shares a name or provides the same virtuals
# in the context of the relevant root spec. This ensures that the analog for a spec s
# is the spec that s replaced when we spliced.
relocation_specs = deps_to_relocate(spec)
build_spec_ids = set(id(s) for s in spec.build_spec.traverse(deptype=dt.ALL & ~dt.BUILD))
for s in relocation_specs:
analog = s
@@ -2224,48 +2248,98 @@ def relocate_package(spec: spack.spec.Spec) -> None:
lookup_dag_hash = analog.dag_hash()
if lookup_dag_hash in hash_to_old_prefix:
old_dep_prefix = hash_to_old_prefix[lookup_dag_hash]
prefix_to_prefix[old_dep_prefix] = str(s.prefix)
prefix_to_prefix_bin[old_dep_prefix] = str(s.prefix)
prefix_to_prefix_text[old_dep_prefix] = str(s.prefix)
# Only then add the generic fallback of install prefix -> install prefix.
prefix_to_prefix[old_layout_root] = str(spack.store.STORE.layout.root)
prefix_to_prefix_text[old_prefix] = new_prefix
prefix_to_prefix_bin[old_prefix] = new_prefix
prefix_to_prefix_text[old_layout_root] = new_layout_root
prefix_to_prefix_bin[old_layout_root] = new_layout_root
# Delete identity mappings from prefix_to_prefix
prefix_to_prefix = {k: v for k, v in prefix_to_prefix.items() if k != v}
# This is vestigial code for the *old* location of sbang. Previously,
# sbang was a bash script, and it lived in the spack prefix. It is
# now a POSIX script that lives in the install prefix. Old packages
# will have the old sbang location in their shebangs.
orig_sbang = "#!/bin/bash {0}/bin/sbang".format(old_spack_prefix)
new_sbang = spack.hooks.sbang.sbang_shebang_line()
prefix_to_prefix_text[orig_sbang] = new_sbang
# If there's nothing to relocate, we're done.
if not prefix_to_prefix:
return
tty.debug("Relocating package from", "%s to %s." % (old_layout_root, new_layout_root))
for old, new in prefix_to_prefix.items():
tty.debug(f"Relocating: {old} => {new}.")
# Old archives may have hardlinks repeated.
dedupe_hardlinks_if_necessary(workdir, buildinfo)
# Old archives may have hardlinks repeated.
dedupe_hardlinks_if_necessary(spec_prefix, buildinfo)
def is_backup_file(file):
return file.endswith("~")
# Text files containing the prefix text
textfiles = [os.path.join(spec_prefix, f) for f in buildinfo["relocate_textfiles"]]
binaries = [os.path.join(spec_prefix, f) for f in buildinfo.get("relocate_binaries")]
links = [os.path.join(spec_prefix, f) for f in buildinfo.get("relocate_links", [])]
text_names = list()
for filename in buildinfo["relocate_textfiles"]:
text_name = os.path.join(workdir, filename)
# Don't add backup files generated by filter_file during install step.
if not is_backup_file(text_name):
text_names.append(text_name)
platform = spack.platforms.by_name(spec.platform)
if "macho" in platform.binary_formats:
relocate.relocate_macho_binaries(binaries, prefix_to_prefix)
elif "elf" in platform.binary_formats:
relocate.relocate_elf_binaries(binaries, prefix_to_prefix)
# If we are not installing back to the same install tree do the relocation
if old_prefix != new_prefix:
files_to_relocate = [
os.path.join(workdir, filename) for filename in buildinfo.get("relocate_binaries")
]
# If the buildcache was not created with relativized rpaths
# do the relocation of path in binaries
platform = spack.platforms.by_name(spec.platform)
if "macho" in platform.binary_formats:
relocate.relocate_macho_binaries(
files_to_relocate,
old_layout_root,
new_layout_root,
prefix_to_prefix_bin,
rel,
old_prefix,
new_prefix,
)
elif "elf" in platform.binary_formats and not rel:
# The new ELF dynamic section relocation logic only handles absolute to
# absolute relocation.
relocate.new_relocate_elf_binaries(files_to_relocate, prefix_to_prefix_bin)
elif "elf" in platform.binary_formats and rel:
relocate.relocate_elf_binaries(
files_to_relocate,
old_layout_root,
new_layout_root,
prefix_to_prefix_bin,
rel,
old_prefix,
new_prefix,
)
relocate.relocate_links(links, prefix_to_prefix)
relocate.relocate_text(textfiles, prefix_to_prefix)
changed_files = relocate.relocate_text_bin(binaries, prefix_to_prefix)
# Relocate links to the new install prefix
links = [os.path.join(workdir, f) for f in buildinfo.get("relocate_links", [])]
relocate.relocate_links(links, prefix_to_prefix_bin)
# Add ad-hoc signatures to patched macho files when on macOS.
if "macho" in platform.binary_formats and sys.platform == "darwin":
codesign = which("codesign")
if not codesign:
return
for binary in changed_files:
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
# For all buildcaches
# relocate the install prefixes in text files including dependencies
relocate.relocate_text(text_names, prefix_to_prefix_text)
# relocate the install prefixes in binary files including dependencies
changed_files = relocate.relocate_text_bin(files_to_relocate, prefix_to_prefix_bin)
# Add ad-hoc signatures to patched macho files when on macOS.
if "macho" in platform.binary_formats and sys.platform == "darwin":
codesign = which("codesign")
if not codesign:
return
for binary in changed_files:
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
else:
if old_spack_prefix != new_spack_prefix:
relocate.relocate_text(text_names, prefix_to_prefix_text)
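A condensed sketch of the relocation flow above, using the single prefix-map variant from one side of this diff; the old prefix is illustrative:

.. code-block:: python

   prefix_to_prefix = {"/old/store/zlib-abcdefg": str(spec.prefix)}
   relocate.relocate_links(links, prefix_to_prefix)
   relocate.relocate_text(textfiles, prefix_to_prefix)
   changed_files = relocate.relocate_text_bin(binaries, prefix_to_prefix)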
def _extract_inner_tarball(spec, filename, extract_to, signature_required: bool, remote_checksum):

View File

@@ -10,9 +10,7 @@
import sys
import sysconfig
import warnings
from typing import Optional, Sequence, Union
from typing_extensions import TypedDict
from typing import Dict, Optional, Sequence, Union
import archspec.cpu
@@ -20,17 +18,13 @@
from llnl.util import tty
import spack.platforms
import spack.spec
import spack.store
import spack.util.environment
import spack.util.executable
from .config import spec_for_current_python
class QueryInfo(TypedDict, total=False):
spec: spack.spec.Spec
command: spack.util.executable.Executable
QueryInfo = Dict[str, "spack.spec.Spec"]
def _python_import(module: str) -> bool:
@@ -217,9 +211,7 @@ def _executables_in_store(
):
spack.util.environment.path_put_first("PATH", [bin_dir])
if query_info is not None:
query_info["command"] = spack.util.executable.which(
*executables, path=bin_dir, required=True
)
query_info["command"] = spack.util.executable.which(*executables, path=bin_dir)
query_info["spec"] = concrete_spec
return True
return False
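A hedged sketch contrasting the two ``QueryInfo`` shapes in this hunk; the population step is elided, and the key access assumes the TypedDict form declared above:

.. code-block:: python

   info: QueryInfo = {}
   # ... populated by _try_import_from_store / _executables_in_store ...
   cmd = info.get("command")  # spack.util.executable.Executable, or None
   spec = info.get("spec")    # spack.spec.Spec, or None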

View File

@@ -34,10 +34,8 @@
from llnl.util.lang import GroupedExceptionHandler
import spack.binary_distribution
import spack.concretize
import spack.config
import spack.detection
import spack.error
import spack.mirrors.mirror
import spack.platforms
import spack.spec
@@ -49,13 +47,7 @@
import spack.version
from spack.installer import PackageInstaller
from ._common import (
QueryInfo,
_executables_in_store,
_python_import,
_root_spec,
_try_import_from_store,
)
from ._common import _executables_in_store, _python_import, _root_spec, _try_import_from_store
from .clingo import ClingoBootstrapConcretizer
from .config import spack_python_interpreter, spec_for_current_python
@@ -142,7 +134,7 @@ class BuildcacheBootstrapper(Bootstrapper):
def __init__(self, conf) -> None:
super().__init__(conf)
self.last_search: Optional[QueryInfo] = None
self.last_search: Optional[ConfigDictionary] = None
self.config_scope_name = f"bootstrap_buildcache-{uuid.uuid4()}"
@staticmethod
@@ -219,14 +211,14 @@ def _install_and_test(
for _, pkg_hash, pkg_sha256 in item["binaries"]:
self._install_by_hash(pkg_hash, pkg_sha256, bincache_platform)
info: QueryInfo = {}
info: ConfigDictionary = {}
if test_fn(query_spec=abstract_spec, query_info=info):
self.last_search = info
return True
return False
def try_import(self, module: str, abstract_spec_str: str) -> bool:
info: QueryInfo
info: ConfigDictionary
test_fn, info = functools.partial(_try_import_from_store, module), {}
if test_fn(query_spec=abstract_spec_str, query_info=info):
return True
@@ -239,7 +231,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
return self._install_and_test(abstract_spec, bincache_platform, data, test_fn)
def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bool:
info: QueryInfo
info: ConfigDictionary
test_fn, info = functools.partial(_executables_in_store, executables), {}
if test_fn(query_spec=abstract_spec_str, query_info=info):
self.last_search = info
@@ -257,11 +249,11 @@ class SourceBootstrapper(Bootstrapper):
def __init__(self, conf) -> None:
super().__init__(conf)
self.last_search: Optional[QueryInfo] = None
self.last_search: Optional[ConfigDictionary] = None
self.config_scope_name = f"bootstrap_source-{uuid.uuid4()}"
def try_import(self, module: str, abstract_spec_str: str) -> bool:
info: QueryInfo = {}
info: ConfigDictionary = {}
if _try_import_from_store(module, abstract_spec_str, query_info=info):
self.last_search = info
return True
@@ -278,10 +270,10 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
concrete_spec = bootstrapper.concretize()
else:
abstract_spec = spack.spec.Spec(
concrete_spec = spack.spec.Spec(
abstract_spec_str + " ^" + spec_for_current_python()
)
concrete_spec = spack.concretize.concretize_one(abstract_spec)
concrete_spec.concretize()
msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
tty.debug(msg.format(module, abstract_spec_str))
@@ -296,7 +288,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
return False
def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bool:
info: QueryInfo = {}
info: ConfigDictionary = {}
if _executables_in_store(executables, abstract_spec_str, query_info=info):
self.last_search = info
return True
@@ -307,7 +299,7 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
# might reduce compilation time by a fair amount
_add_externals_if_missing()
concrete_spec = spack.concretize.concretize_one(abstract_spec_str)
concrete_spec = spack.spec.Spec(abstract_spec_str).concretized()
msg = "[BOOTSTRAP] Try installing '{0}' from sources"
tty.debug(msg.format(abstract_spec_str))
with spack.config.override(self.mirror_scope):
@@ -324,9 +316,11 @@ def create_bootstrapper(conf: ConfigDictionary):
return _bootstrap_methods[btype](conf)
def source_is_enabled(conf: ConfigDictionary) -> bool:
"""Returns true if the source is not enabled for bootstrapping"""
return spack.config.get("bootstrap:trusted").get(conf["name"], False)
def source_is_enabled_or_raise(conf: ConfigDictionary):
"""Raise ValueError if the source is not enabled for bootstrapping"""
trusted, name = spack.config.get("bootstrap:trusted"), conf["name"]
if not trusted.get(name, False):
raise ValueError("source is not trusted")
def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str] = None):
@@ -356,23 +350,24 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
if create_bootstrapper(current_config).try_import(module, abstract_spec):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_import(module, abstract_spec):
return
assert exception_handler, (
f"expected at least one exception to have been raised at this point: "
f"while bootstrapping {module}"
)
msg = f'cannot bootstrap the "{module}" Python module '
if abstract_spec:
msg += f'from spec "{abstract_spec}" '
if not exception_handler:
msg += ": no bootstrapping sources are enabled"
elif spack.error.debug or spack.error.SHOW_BACKTRACE:
if tty.is_debug():
msg += exception_handler.grouped_message(with_tracebacks=True)
else:
msg += exception_handler.grouped_message(with_tracebacks=False)
msg += "\nRun `spack --backtrace ...` for more detailed errors"
msg += "\nRun `spack --debug ...` for more detailed errors"
raise ImportError(msg)
@@ -410,9 +405,8 @@ def ensure_executables_in_path_or_raise(
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_search_path(executables, abstract_spec):
# Additional environment variables needed
@@ -420,7 +414,6 @@ def ensure_executables_in_path_or_raise(
current_bootstrapper.last_search["spec"],
current_bootstrapper.last_search["command"],
)
assert cmd is not None, "expected an Executable"
cmd.add_default_envmod(
spack.user_environment.environment_modifications_for_specs(
concrete_spec, set_package_py_globals=False
@@ -428,17 +421,18 @@ def ensure_executables_in_path_or_raise(
)
return cmd
assert exception_handler, (
f"expected at least one exception to have been raised at this point: "
f"while bootstrapping {executables_str}"
)
msg = f"cannot bootstrap any of the {executables_str} executables "
if abstract_spec:
msg += f'from spec "{abstract_spec}" '
if not exception_handler:
msg += ": no bootstrapping sources are enabled"
elif spack.error.debug or spack.error.SHOW_BACKTRACE:
if tty.is_debug():
msg += exception_handler.grouped_message(with_tracebacks=True)
else:
msg += exception_handler.grouped_message(with_tracebacks=False)
msg += "\nRun `spack --backtrace ...` for more detailed errors"
msg += "\nRun `spack --debug ...` for more detailed errors"
raise RuntimeError(msg)

View File

@@ -63,6 +63,7 @@ def _missing(name: str, purpose: str, system_only: bool = True) -> str:
def _core_requirements() -> List[RequiredResponseType]:
_core_system_exes = {
"make": _missing("make", "required to build software from sources"),
"patch": _missing("patch", "required to patch source code before building"),
"tar": _missing("tar", "required to manage code archives"),
"gzip": _missing("gzip", "required to compress/decompress code archives"),

View File

@@ -44,19 +44,7 @@
from enum import Flag, auto
from itertools import chain
from multiprocessing.connection import Connection
from typing import (
Callable,
Dict,
List,
Optional,
Sequence,
Set,
TextIO,
Tuple,
Type,
Union,
overload,
)
from typing import Callable, Dict, List, Optional, Set, Tuple
import archspec.cpu
@@ -158,128 +146,48 @@ def get_effective_jobs(jobs, parallel=True, supports_jobserver=False):
class MakeExecutable(Executable):
"""Special callable executable object for make so the user can specify parallelism options
on a per-invocation basis.
"""Special callable executable object for make so the user can specify
parallelism options on a per-invocation basis. Specifying
'parallel' to the call will override whatever the package's
global setting is, so you can either default to true or false and
override particular calls. Specifying 'jobs_env' to a particular
call will name an environment variable which will be set to the
parallelism level (without affecting the normal invocation with
-j).
"""
def __init__(self, name: str, *, jobs: int, supports_jobserver: bool = True) -> None:
super().__init__(name)
def __init__(self, name, jobs, **kwargs):
supports_jobserver = kwargs.pop("supports_jobserver", True)
super().__init__(name, **kwargs)
self.supports_jobserver = supports_jobserver
self.jobs = jobs
@overload
def __call__(
self,
*args: str,
parallel: bool = ...,
jobs_env: Optional[str] = ...,
jobs_env_supports_jobserver: bool = ...,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Optional[TextIO], str] = ...,
error: Union[Optional[TextIO], str] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> None: ...
@overload
def __call__(
self,
*args: str,
parallel: bool = ...,
jobs_env: Optional[str] = ...,
jobs_env_supports_jobserver: bool = ...,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Type[str], Callable] = ...,
error: Union[Optional[TextIO], str, Type[str], Callable] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> str: ...
@overload
def __call__(
self,
*args: str,
parallel: bool = ...,
jobs_env: Optional[str] = ...,
jobs_env_supports_jobserver: bool = ...,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Optional[TextIO], str, Type[str], Callable] = ...,
error: Union[Type[str], Callable] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> str: ...
def __call__(
self,
*args: str,
parallel: bool = True,
jobs_env: Optional[str] = None,
jobs_env_supports_jobserver: bool = False,
**kwargs,
) -> Optional[str]:
"""Runs this "make" executable in a subprocess.
Args:
parallel: if False, parallelism is disabled
jobs_env: environment variable that will be set to the current level of parallelism
jobs_env_supports_jobserver: whether the jobs env supports a job server
For all the other **kwargs, refer to the base class.
def __call__(self, *args, **kwargs):
"""parallel, and jobs_env from kwargs are swallowed and used here;
remaining arguments are passed through to the superclass.
"""
parallel = kwargs.pop("parallel", True)
jobs_env = kwargs.pop("jobs_env", None)
jobs_env_supports_jobserver = kwargs.pop("jobs_env_supports_jobserver", False)
jobs = get_effective_jobs(
self.jobs, parallel=parallel, supports_jobserver=self.supports_jobserver
)
if jobs is not None:
args = (f"-j{jobs}",) + args
args = ("-j{0}".format(jobs),) + args
if jobs_env:
# Caller wants us to set an environment variable to control the parallelism
# Caller wants us to set an environment variable to
# control the parallelism.
jobs_env_jobs = get_effective_jobs(
self.jobs, parallel=parallel, supports_jobserver=jobs_env_supports_jobserver
)
if jobs_env_jobs is not None:
extra_env = kwargs.setdefault("extra_env", {})
extra_env.update({jobs_env: str(jobs_env_jobs)})
kwargs["extra_env"] = {jobs_env: str(jobs_env_jobs)}
return super().__call__(*args, **kwargs)
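A usage sketch for the keyword-only signature above; the job count and variable name are arbitrary:

.. code-block:: python

   make = MakeExecutable("make", jobs=16)
   make("install", parallel=False)  # force a serial invocation
   # additionally export the effective job count via an environment variable
   make("all", jobs_env="NPROC")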
class UndeclaredDependencyError(spack.error.SpackError):
"""Raised if a dependency is invoking an executable through a module global, without
declaring a dependency on it.
"""
class DeprecatedExecutable:
def __init__(self, pkg: str, exe: str, exe_pkg: str) -> None:
self.pkg = pkg
self.exe = exe
self.exe_pkg = exe_pkg
def __call__(self, *args, **kwargs):
raise UndeclaredDependencyError(
f"{self.pkg} is using {self.exe} without declaring a dependency on {self.exe_pkg}"
)
def add_default_env(self, key: str, value: str):
self.__call__()
def clean_environment():
# Stuff in here sanitizes the build environment to eliminate
# anything the user has set that may interfere. We apply it immediately
@@ -713,9 +621,10 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
module.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
module.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)
module.make = DeprecatedExecutable(pkg.name, "make", "gmake")
module.gmake = DeprecatedExecutable(pkg.name, "gmake", "gmake")
module.ninja = DeprecatedExecutable(pkg.name, "ninja", "ninja")
# TODO: make these build deps that can be installed if not found.
module.make = MakeExecutable("make", jobs)
module.gmake = MakeExecutable("gmake", jobs)
module.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
# own

View File

@@ -356,13 +356,6 @@ def _do_patch_libtool_configure(self) -> None:
)
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
# Configure scripts generated with libtool < 2.5.4 have a faulty test for the
# -single_module linker flag. A deprecation warning makes it think the default is
# -multi_module, triggering it to use problematic linker flags (such as ld -r). The
# linker default is `-single_module` from (ancient) macOS 10.4, so override by setting
# `lt_cv_apple_cc_single_mod=yes`. See the fix in libtool commit
# 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)
@spack.phase_callbacks.run_after("configure")
def _do_patch_libtool(self) -> None:

View File

@@ -293,26 +293,12 @@ def initconfig_hardware_entries(self):
entries.append(cmake_cache_string("AMDGPU_TARGETS", arch_str))
entries.append(cmake_cache_string("GPU_TARGETS", arch_str))
if spec.satisfies("%gcc"):
entries.append(
cmake_cache_string(
"CMAKE_HIP_FLAGS", f"--gcc-toolchain={self.pkg.compiler.prefix}"
)
)
return entries
def std_initconfig_entries(self):
cmake_prefix_path_env = os.environ["CMAKE_PREFIX_PATH"]
cmake_prefix_path = cmake_prefix_path_env.replace(os.pathsep, ";")
complete_rpath_list = ";".join(
[
self.pkg.spec.prefix.lib,
self.pkg.spec.prefix.lib64,
*os.environ.get("SPACK_COMPILER_EXTRA_RPATHS", "").split(":"),
*os.environ.get("SPACK_COMPILER_IMPLICIT_RPATHS", "").split(":"),
]
)
return [
"#------------------{0}".format("-" * 60),
"# !!!! This is a generated file, edit at own risk !!!!",
@@ -321,8 +307,6 @@ def std_initconfig_entries(self):
"#------------------{0}\n".format("-" * 60),
cmake_cache_string("CMAKE_PREFIX_PATH", cmake_prefix_path),
cmake_cache_string("CMAKE_INSTALL_RPATH_USE_LINK_PATH", "ON"),
cmake_cache_string("CMAKE_BUILD_RPATH", complete_rpath_list),
cmake_cache_string("CMAKE_INSTALL_RPATH", complete_rpath_list),
self.define_cmake_cache_from_variant("CMAKE_BUILD_TYPE", "build_type"),
]

View File

@@ -71,16 +71,13 @@ def build_directory(self):
@property
def build_args(self):
"""Arguments for ``cargo build``."""
return ["-j", str(self.pkg.module.make_jobs)]
return []
@property
def check_args(self):
"""Argument for ``cargo test`` during check phase"""
return []
def setup_build_environment(self, env):
env.set("CARGO_HOME", self.stage.path)
def build(self, pkg, spec, prefix):
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):

View File

@@ -10,9 +10,8 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on, extends
from spack.directives import build_system, extends
from spack.install_test import SkipTest, test_part
from spack.multimethod import when
from spack.util.executable import Executable
from ._checks import BuilderWithDefaults, execute_build_time_tests
@@ -29,9 +28,7 @@ class PerlPackage(spack.package_base.PackageBase):
build_system("perl")
with when("build_system=perl"):
extends("perl")
depends_on("gmake", type="build")
extends("perl", when="build_system=perl")
@property
@memoized

View File

@@ -17,8 +17,10 @@
import llnl.util.tty as tty
from llnl.util.filesystem import HeaderList, LibraryList, join_path
import spack.build_environment
import spack.builder
import spack.config
import spack.context
import spack.deptypes as dt
import spack.detection
import spack.multimethod
@@ -237,7 +239,11 @@ def test_imports(self) -> None:
purpose=f"checking import of {module}",
work_dir="spack-test",
):
python("-c", f"import {module}")
setup_context = spack.build_environment.SetupContext(
self.spec, context=spack.context.Context.RUN
)
mods = setup_context.get_env_modifications()
python("-c", f"import {module}", env=mods)
def update_external_dependencies(self, extendee_spec=None):
"""

View File

@@ -27,7 +27,6 @@ class QMakePackage(spack.package_base.PackageBase):
build_system("qmake")
depends_on("qmake", type="build", when="build_system=qmake")
depends_on("gmake", type="build")
@spack.builder.builder("qmake")

View File

@@ -94,7 +94,7 @@ def list_url(cls):
if cls.cran:
return f"https://cloud.r-project.org/src/contrib/Archive/{cls.cran}/"
@lang.classproperty
def git(cls):
if cls.bioc:
return f"https://git.bioconductor.org/packages/{cls.bioc}"
@property
def git(self):
if self.bioc:
return f"https://git.bioconductor.org/packages/{self.bioc}"

View File

@@ -140,7 +140,7 @@ class ROCmPackage(PackageBase):
when="+rocm",
)
depends_on("llvm-amdgpu", type="build", when="+rocm")
depends_on("llvm-amdgpu", when="+rocm")
depends_on("hsa-rocr-dev", when="+rocm")
depends_on("hip +rocm", when="+rocm")

View File

@@ -26,7 +26,6 @@
import spack.paths
import spack.repo
import spack.spec
import spack.spec_lookup
import spack.spec_parser
import spack.store
import spack.traverse as traverse
@@ -172,9 +171,7 @@ def quote_kvp(string: str) -> str:
def parse_specs(
args: Union[str, List[str]],
concretize: bool = False,
tests: spack.concretize.TestsType = False,
args: Union[str, List[str]], concretize: bool = False, tests: bool = False
) -> List[spack.spec.Spec]:
"""Convenience function for parsing arguments from specs. Handles common
exceptions and dies if there are errors.
@@ -186,13 +183,11 @@ def parse_specs(
if not concretize:
return specs
to_concretize: List[spack.concretize.SpecPairInput] = [(s, None) for s in specs]
to_concretize = [(s, None) for s in specs]
return _concretize_spec_pairs(to_concretize, tests=tests)
def _concretize_spec_pairs(
to_concretize: List[spack.concretize.SpecPairInput], tests: spack.concretize.TestsType = False
) -> List[spack.spec.Spec]:
def _concretize_spec_pairs(to_concretize, tests=False):
"""Helper method that concretizes abstract specs from a list of abstract,concrete pairs.
Any spec with a concrete spec associated with it will concretize to that spec. Any spec
@@ -203,7 +198,7 @@ def _concretize_spec_pairs(
# Special case for concretizing a single spec
if len(to_concretize) == 1:
abstract, concrete = to_concretize[0]
return [concrete or spack.concretize.concretize_one(abstract, tests=tests)]
return [concrete or abstract.concretized()]
# Special case if every spec is either concrete or has an abstract hash
if all(
@@ -212,8 +207,7 @@ def _concretize_spec_pairs(
):
# Get all the concrete specs
ret = [
concrete
or (abstract if abstract.concrete else spack.spec_lookup.lookup_hash(abstract))
concrete or (abstract if abstract.concrete else abstract.lookup_hash())
for abstract, concrete in to_concretize
]
@@ -256,9 +250,9 @@ def matching_spec_from_env(spec):
"""
env = ev.active_environment()
if env:
return env.matching_spec(spec) or spack.concretize.concretize_one(spec)
return env.matching_spec(spec) or spec.concretized()
else:
return spack.concretize.concretize_one(spec)
return spec.concretized()
def matching_specs_from_env(specs):
@@ -299,7 +293,7 @@ def disambiguate_spec(
def disambiguate_spec_from_hashes(
spec: spack.spec.Spec,
hashes: Optional[List[str]],
hashes: List[str],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,

View File

@@ -14,9 +14,9 @@
import spack.bootstrap
import spack.bootstrap.config
import spack.bootstrap.core
import spack.concretize
import spack.config
import spack.mirrors.utils
import spack.spec
import spack.stage
import spack.util.path
import spack.util.spack_yaml
@@ -397,7 +397,7 @@ def _mirror(args):
llnl.util.tty.msg(msg.format(spec_str, mirror_dir))
# Suppress tty from the call below for terser messages
llnl.util.tty.set_msg_enabled(False)
spec = spack.concretize.concretize_one(spec_str)
spec = spack.spec.Spec(spec_str).concretized()
for node in spec.traverse():
spack.mirrors.utils.create(mirror_dir, [node])
llnl.util.tty.set_msg_enabled(True)

View File

@@ -16,7 +16,6 @@
import spack.binary_distribution as bindist
import spack.cmd
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.environment as ev
@@ -555,7 +554,8 @@ def check_fn(args: argparse.Namespace):
tty.msg("No specs provided, exiting.")
return
specs = [spack.concretize.concretize_one(s) for s in specs]
for spec in specs:
spec.concretize()
# Next see if there are any configured binary mirrors
configured_mirrors = spack.config.get("mirrors", scope=args.scope)
@@ -623,7 +623,7 @@ def save_specfile_fn(args):
root = specs[0]
if not root.concrete:
root = spack.concretize.concretize_one(root)
root.concretize()
save_dependency_specfiles(
root, args.specfile_dir, dependencies=spack.cmd.parse_specs(args.specs)

View File

@@ -18,7 +18,6 @@
from llnl.util.symlink import symlink
import spack.cmd
import spack.concretize
import spack.environment as ev
import spack.installer
import spack.store
@@ -104,7 +103,7 @@ def deprecate(parser, args):
)
if args.install:
deprecator = spack.concretize.concretize_one(specs[1])
deprecator = specs[1].concretized()
else:
deprecator = spack.cmd.disambiguate_spec(specs[1], env, local=True)

View File

@@ -10,7 +10,6 @@
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments
import spack.concretize
import spack.config
import spack.repo
from spack.cmd.common import arguments
@@ -114,8 +113,8 @@ def dev_build(self, args):
source_path = os.path.abspath(source_path)
# Forces the build to run out of the source directory.
spec.constrain(f'dev_path="{source_path}"')
spec = spack.concretize.concretize_one(spec)
spec.constrain("dev_path=%s" % source_path)
spec.concretize()
if spec.installed:
tty.error("Already installed in %s" % spec.prefix)

View File

@@ -11,7 +11,6 @@
import spack.cmd
import spack.environment as ev
import spack.solver.asp as asp
import spack.spec_lookup
import spack.util.spack_json as sjson
from spack.cmd.common import arguments
@@ -211,7 +210,7 @@ def diff(parser, args):
specs = []
for spec in spack.cmd.parse_specs(args.specs):
# If the spec has a hash, check it before disambiguating
spack.spec_lookup.replace_hash(spec)
spec.replace_hash()
if spec.concrete:
specs.append(spec)
else:

View File

@@ -13,7 +13,6 @@
from llnl.util import lang, tty
import spack.cmd
import spack.concretize
import spack.config
import spack.environment as ev
import spack.paths
@@ -451,7 +450,7 @@ def concrete_specs_from_file(args):
else:
s = spack.spec.Spec.from_json(f)
concretized = spack.concretize.concretize_one(s)
concretized = s.concretized()
if concretized.dag_hash() != s.dag_hash():
msg = 'skipped invalid file "{0}". '
msg += "The file does not contain a concrete spec."

View File

@@ -7,9 +7,9 @@
from llnl.path import convert_to_posix_path
import spack.concretize
import spack.paths
import spack.util.executable
from spack.spec import Spec
description = "generate Windows installer"
section = "admin"
@@ -65,7 +65,8 @@ def make_installer(parser, args):
"""
if sys.platform == "win32":
output_dir = args.output_dir
cmake_spec = spack.concretize.concretize_one("cmake")
cmake_spec = Spec("cmake")
cmake_spec.concretize()
cmake_path = os.path.join(cmake_spec.prefix, "bin", "cmake.exe")
cpack_path = os.path.join(cmake_spec.prefix, "bin", "cpack.exe")
spack_source = args.spack_source

View File

@@ -492,7 +492,7 @@ def extend_with_additional_versions(specs, num_versions):
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
else:
mirror_specs = spack.mirrors.utils.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [spack.concretize.concretize_one(x) for x in mirror_specs]
mirror_specs = [x.concretized() for x in mirror_specs]
return mirror_specs

View File

@@ -144,7 +144,7 @@ def is_installed(spec):
record = spack.store.STORE.db.query_local_by_spec_hash(spec.dag_hash())
return record and record.installed
all_specs = traverse.traverse_nodes(
specs = traverse.traverse_nodes(
specs,
root=False,
order="breadth",
@@ -155,7 +155,7 @@ def is_installed(spec):
)
with spack.store.STORE.db.read_transaction():
return [spec for spec in all_specs if is_installed(spec)]
return [spec for spec in specs if is_installed(spec)]
def dependent_environments(

View File

@@ -749,18 +749,12 @@ def __init__(self, compiler, feature, flag_name, ver_string=None):
class CompilerCacheEntry:
"""Deserialized cache entry for a compiler"""
__slots__ = ("c_compiler_output", "real_version")
__slots__ = ["c_compiler_output", "real_version"]
def __init__(self, c_compiler_output: Optional[str], real_version: str):
self.c_compiler_output = c_compiler_output
self.real_version = real_version
@property
def empty(self) -> bool:
"""Sometimes the compiler is temporarily broken, preventing us from getting output. The
call site determines if that is a problem."""
return self.c_compiler_output is None
@classmethod
def from_dict(cls, data: Dict[str, Optional[str]]):
if not isinstance(data, dict):
@@ -798,10 +792,9 @@ def __init__(self, cache: "FileCache") -> None:
self.cache.init_entry(self.name)
self._data: Dict[str, Dict[str, Optional[str]]] = {}
def _get_entry(self, key: str, *, allow_empty: bool) -> Optional[CompilerCacheEntry]:
def _get_entry(self, key: str) -> Optional[CompilerCacheEntry]:
try:
entry = CompilerCacheEntry.from_dict(self._data[key])
return entry if allow_empty or not entry.empty else None
return CompilerCacheEntry.from_dict(self._data[key])
except ValueError:
del self._data[key]
except KeyError:
@@ -819,7 +812,7 @@ def get(self, compiler: Compiler) -> CompilerCacheEntry:
self._data = {}
key = self._key(compiler)
value = self._get_entry(key, allow_empty=False)
value = self._get_entry(key)
if value is not None:
return value
@@ -833,7 +826,7 @@ def get(self, compiler: Compiler) -> CompilerCacheEntry:
self._data = {}
# Use cache entry that may have been created by another process in the meantime.
entry = self._get_entry(key, allow_empty=True)
entry = self._get_entry(key)
# Finally compute the cache entry
if entry is None:

View File

@@ -5,7 +5,7 @@
import sys
import time
from contextlib import contextmanager
from typing import Iterable, List, Optional, Sequence, Tuple, Union
from typing import Iterable, Optional, Sequence, Tuple, Union
import llnl.util.tty as tty
@@ -35,14 +35,14 @@ def enable_compiler_existence_check():
CHECK_COMPILER_EXISTENCE = saved
SpecPairInput = Tuple[Spec, Optional[Spec]]
SpecPair = Tuple[Spec, Spec]
SpecLike = Union[Spec, str]
TestsType = Union[bool, Iterable[str]]
def _concretize_specs_together(
abstract_specs: Sequence[Spec], tests: TestsType = False
) -> List[Spec]:
def concretize_specs_together(
abstract_specs: Sequence[SpecLike], tests: TestsType = False
) -> Sequence[Spec]:
"""Given a number of specs as input, tries to concretize them together.
Args:
@@ -50,16 +50,17 @@ def _concretize_specs_together(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
from spack.solver.asp import Solver
import spack.solver.asp
allow_deprecated = spack.config.get("config:deprecated", False)
result = Solver().solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
solver = spack.solver.asp.Solver()
result = solver.solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
return [s.copy() for s in result.specs]
def concretize_together(
spec_list: Sequence[SpecPairInput], tests: TestsType = False
) -> List[SpecPair]:
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Given a number of specs as input, tries to concretize them together.
Args:
@@ -70,13 +71,13 @@ def concretize_together(
"""
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
abstract_specs = [abstract for abstract, _ in spec_list]
concrete_specs = _concretize_specs_together(to_concretize, tests=tests)
concrete_specs = concretize_specs_together(to_concretize, tests=tests)
return list(zip(abstract_specs, concrete_specs))
def concretize_together_when_possible(
spec_list: Sequence[SpecPairInput], tests: TestsType = False
) -> List[SpecPair]:
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Given a number of specs as input, tries to concretize them together to the extent possible.
See documentation for ``unify: when_possible`` concretization for the precise definition of
@@ -88,7 +89,7 @@ def concretize_together_when_possible(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
from spack.solver.asp import Solver
import spack.solver.asp
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
old_concrete_to_abstract = {
@@ -96,8 +97,9 @@ def concretize_together_when_possible(
}
result_by_user_spec = {}
solver = spack.solver.asp.Solver()
allow_deprecated = spack.config.get("config:deprecated", False)
for result in Solver().solve_in_rounds(
for result in solver.solve_in_rounds(
to_concretize, tests=tests, allow_deprecated=allow_deprecated
):
result_by_user_spec.update(result.specs_by_input)
@@ -111,8 +113,8 @@ def concretize_together_when_possible(
def concretize_separately(
spec_list: Sequence[SpecPairInput], tests: TestsType = False
) -> List[SpecPair]:
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Concretizes the input specs separately from each other.
Args:
@@ -121,7 +123,7 @@ def concretize_separately(
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
from spack.bootstrap import ensure_bootstrap_configuration, ensure_clingo_importable_or_raise
import spack.bootstrap
to_concretize = [abstract for abstract, concrete in spec_list if not concrete]
args = [
@@ -131,8 +133,8 @@ def concretize_separately(
]
ret = [(i, abstract) for i, abstract in enumerate(to_concretize) if abstract.concrete]
# Ensure we don't try to bootstrap clingo in parallel
with ensure_bootstrap_configuration():
ensure_clingo_importable_or_raise()
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
@@ -187,54 +189,10 @@ def _concretize_task(packed_arguments: Tuple[int, str, TestsType]) -> Tuple[int,
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
spec = concretize_one(Spec(spec_str), tests=tests)
spec = Spec(spec_str).concretized(tests=tests)
return index, spec, time.time() - start
def concretize_one(spec: Union[str, Spec], tests: TestsType = False) -> Spec:
"""Return a concretized copy of the given spec.
Args:
tests: if False disregard 'test' dependencies, if a list of names activate them for
the packages in the list, if True activate 'test' dependencies for all packages.
"""
from spack.solver.asp import Solver, SpecBuilder
from spack.spec_lookup import replace_hash
if isinstance(spec, str):
spec = Spec(spec)
replace_hash(spec)
if spec.concrete:
return spec.copy()
for node in spec.traverse():
if not node.name:
raise spack.error.SpecError(
f"Spec {node} has no name; cannot concretize an anonymous spec"
)
allow_deprecated = spack.config.get("config:deprecated", False)
result = Solver().solve([spec], tests=tests, allow_deprecated=allow_deprecated)
# take the best answer
opt, i, answer = min(result.answers)
name = spec.name
# TODO: Consolidate this code with similar code in solve.py
if spec.virtual:
providers = [s.name for s in answer.values() if s.package.provides(name)]
name = providers[0]
node = SpecBuilder.make_node(pkg=name)
assert (
node in answer
), f"cannot find {name} in the list of specs {','.join([n.pkg for n in answer.keys()])}"
concretized = answer[node]
return concretized
class UnavailableCompilerVersionError(spack.error.SpackError):
"""Raised when there is no available compiler that satisfies a
compiler spec."""
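_concretize_task hands back an (index, spec, elapsed) tuple so results arriving from an unordered process pool can be slotted back into input order. A generic sketch of that pattern, with no Spack imports (work stands in for the expensive concretization):

import time
from multiprocessing import Pool

def work(args):
    index, payload = args
    start = time.time()
    result = payload.upper()  # stand-in for the expensive step
    return index, result, time.time() - start

if __name__ == "__main__":
    inputs = list(enumerate(["zlib", "hdf5", "mpich"]))
    ordered = [None] * len(inputs)
    with Pool(2) as pool:
        for index, result, elapsed in pool.imap_unordered(work, inputs):
            ordered[index] = result  # elapsed can feed progress reporting
    print(ordered)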

View File

@@ -36,8 +36,6 @@
import sys
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union
import jsonschema
from llnl.util import filesystem, lang, tty
import spack.error
@@ -953,6 +951,12 @@ def set(path: str, value: Any, scope: Optional[str] = None) -> None:
return CONFIG.set(path, value, scope)
def add_default_platform_scope(platform: str) -> None:
plat_name = os.path.join("defaults", platform)
plat_path = os.path.join(CONFIGURATION_DEFAULTS_PATH[1], platform)
CONFIG.push_scope(DirectoryConfigScope(plat_name, plat_path))
def scopes() -> Dict[str, ConfigScope]:
"""Convenience function to get list of configuration scopes."""
return CONFIG.scopes
@@ -1050,6 +1054,8 @@ def validate(
This leverages the line information (start_mark, end_mark) stored
on Spack YAML structures.
"""
import jsonschema
try:
spack.schema.Validator(schema).validate(data)
except jsonschema.ValidationError as e:
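Several hunks in this compare move `import jsonschema` from module scope into the functions that use it. Sketched generically, the deferred-import pattern looks like this; the cost of the heavy import is paid only on the validating code path, traded against a small first-call latency and a less visible dependency:

def validate(data, schema):
    # jsonschema is imported lazily as it is heavy to import and
    # increases start-up time (the same rationale as the hunks above).
    import jsonschema

    jsonschema.validate(instance=data, schema=schema)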

View File

@@ -6,8 +6,6 @@
"""
import warnings
import jsonschema
import spack.environment as ev
import spack.schema.env as env
import spack.util.spack_yaml as syaml
@@ -32,6 +30,8 @@ def validate(configuration_file):
Returns:
A sanitized copy of the configuration stored in the input file
"""
import jsonschema
with open(configuration_file, encoding="utf-8") as f:
config = syaml.load(f)

View File

@@ -9,8 +9,6 @@
from collections import namedtuple
from typing import Optional
import jsonschema
import spack.environment as ev
import spack.error
import spack.schema.env
@@ -190,6 +188,8 @@ def paths(self):
@tengine.context_property
def manifest(self):
"""The spack.yaml file that should be used in the image"""
import jsonschema
# Copy in the part of spack.yaml prescribed in the configuration file
manifest = copy.deepcopy(self.config)
manifest.pop("container")

View File

@@ -419,25 +419,14 @@ class FailureTracker:
the likelihood of collision very low with no cleanup required.
"""
#: root directory of the failure tracker
dir: pathlib.Path
#: File for locking particular concrete spec hashes
locker: SpecLocker
def __init__(self, root_dir: Union[str, pathlib.Path], default_timeout: Optional[float]):
#: Ensure a persistent location for dealing with parallel installation
#: failures (e.g., across near-concurrent processes).
self.dir = pathlib.Path(root_dir) / _DB_DIRNAME / "failures"
self.locker = SpecLocker(failures_lock_path(root_dir), default_timeout=default_timeout)
def _ensure_parent_directories(self) -> None:
"""Ensure that parent directories of the FailureTracker exist.
Accesses the filesystem only once, the first time it's called on a given FailureTracker.
"""
self.dir.mkdir(parents=True, exist_ok=True)
self.locker = SpecLocker(failures_lock_path(root_dir), default_timeout=default_timeout)
def clear(self, spec: "spack.spec.Spec", force: bool = False) -> None:
"""Removes any persistent and cached failure tracking for the spec.
@@ -480,18 +469,13 @@ def clear_all(self) -> None:
tty.debug("Removing prefix failure tracking files")
try:
marks = os.listdir(str(self.dir))
except FileNotFoundError:
return # directory doesn't exist yet
for fail_mark in os.listdir(str(self.dir)):
try:
(self.dir / fail_mark).unlink()
except OSError as exc:
tty.warn(f"Unable to remove failure marking file {fail_mark}: {str(exc)}")
except OSError as exc:
tty.warn(f"Unable to remove failure marking files: {str(exc)}")
return
for fail_mark in marks:
try:
(self.dir / fail_mark).unlink()
except OSError as exc:
tty.warn(f"Unable to remove failure marking file {fail_mark}: {str(exc)}")
def mark(self, spec: "spack.spec.Spec") -> lk.Lock:
"""Marks a spec as failing to install.
@@ -499,8 +483,6 @@ def mark(self, spec: "spack.spec.Spec") -> lk.Lock:
Args:
spec: spec that failed to install
"""
self._ensure_parent_directories()
# Dump the spec to the failure file for (manual) debugging purposes
path = self._path(spec)
path.write_text(spec.to_json())
@@ -585,13 +567,17 @@ def __init__(
Relevant only if the repository is not an upstream.
"""
self.root = root
self.database_directory = pathlib.Path(self.root) / _DB_DIRNAME
self.database_directory = os.path.join(self.root, _DB_DIRNAME)
self.layout = layout
# Set up layout of database files within the db dir
self._index_path = self.database_directory / "index.json"
self._verifier_path = self.database_directory / "index_verifier"
self._lock_path = self.database_directory / "lock"
self._index_path = os.path.join(self.database_directory, "index.json")
self._verifier_path = os.path.join(self.database_directory, "index_verifier")
self._lock_path = os.path.join(self.database_directory, "lock")
# Create needed directories and files
if not is_upstream and not os.path.exists(self.database_directory):
fs.mkdirp(self.database_directory)
self.is_upstream = is_upstream
self.last_seen_verifier = ""
@@ -613,7 +599,7 @@ def __init__(
self.lock = ForbiddenLock()
else:
self.lock = lk.Lock(
str(self._lock_path),
self._lock_path,
default_timeout=self.db_lock_timeout,
desc="database",
enable=lock_cfg.enable,
@@ -630,11 +616,6 @@ def __init__(
self._write_transaction_impl = lk.WriteTransaction
self._read_transaction_impl = lk.ReadTransaction
def _ensure_parent_directories(self):
"""Create the parent directory for the DB, if necessary."""
if not self.is_upstream:
self.database_directory.mkdir(parents=True, exist_ok=True)
def write_transaction(self):
"""Get a write lock context manager for use in a `with` block."""
return self._write_transaction_impl(self.lock, acquire=self._read, release=self._write)
@@ -649,8 +630,6 @@ def _write_to_file(self, stream):
This function does not do any locking or transactions.
"""
self._ensure_parent_directories()
# map from per-spec hash code to installation record.
installs = dict(
(k, v.to_dict(include_fields=self.record_fields)) for k, v in self._data.items()
@@ -780,7 +759,7 @@ def _read_from_file(self, filename):
Does not do any locking.
"""
try:
with open(str(filename), "r", encoding="utf-8") as f:
with open(filename, "r", encoding="utf-8") as f:
# In the future we may use a stream of JSON objects, hence `raw_decode` for compat.
fdata, _ = JSONDecoder().raw_decode(f.read())
except Exception as e:
@@ -881,13 +860,11 @@ def reindex(self):
if self.is_upstream:
raise UpstreamDatabaseLockingError("Cannot reindex an upstream database")
self._ensure_parent_directories()
# Special transaction to avoid recursive reindex calls and to
# ignore errors if we need to rebuild a corrupt database.
def _read_suppress_error():
try:
if self._index_path.is_file():
if os.path.isfile(self._index_path):
self._read_from_file(self._index_path)
except CorruptDatabaseError as e:
tty.warn(f"Reindexing corrupt database, error was: {e}")
@@ -1030,7 +1007,7 @@ def _check_ref_counts(self):
% (key, found, expected, self._index_path)
)
def _write(self, type=None, value=None, traceback=None):
def _write(self, type, value, traceback):
"""Write the in-memory database index to its file path.
This is a helper function called by the WriteTransaction context
@@ -1041,8 +1018,6 @@ def _write(self, type=None, value=None, traceback=None):
This routine does no locking.
"""
self._ensure_parent_directories()
# Do not write if exceptions were raised
if type is not None:
# A failure interrupted a transaction, so we should record that
@@ -1051,16 +1026,16 @@ def _write(self, type=None, value=None, traceback=None):
self._state_is_inconsistent = True
return
temp_file = str(self._index_path) + (".%s.%s.temp" % (_getfqdn(), os.getpid()))
temp_file = self._index_path + (".%s.%s.temp" % (_getfqdn(), os.getpid()))
# Write a temporary database file them move it into place
try:
with open(temp_file, "w", encoding="utf-8") as f:
self._write_to_file(f)
fs.rename(temp_file, str(self._index_path))
fs.rename(temp_file, self._index_path)
if _use_uuid:
with self._verifier_path.open("w", encoding="utf-8") as f:
with open(self._verifier_path, "w", encoding="utf-8") as f:
new_verifier = str(uuid.uuid4())
f.write(new_verifier)
self.last_seen_verifier = new_verifier
@@ -1073,11 +1048,11 @@ def _write(self, type=None, value=None, traceback=None):
def _read(self):
"""Re-read Database from the data in the set location. This does no locking."""
if self._index_path.is_file():
if os.path.isfile(self._index_path):
current_verifier = ""
if _use_uuid:
try:
with self._verifier_path.open("r", encoding="utf-8") as f:
with open(self._verifier_path, "r", encoding="utf-8") as f:
current_verifier = f.read()
except BaseException:
pass
@@ -1355,7 +1330,7 @@ def deprecate(self, spec: "spack.spec.Spec", deprecator: "spack.spec.Spec") -> N
def installed_relatives(
self,
spec: "spack.spec.Spec",
direction: tr.DirectionType = "children",
direction: str = "children",
transitive: bool = True,
deptype: Union[dt.DepFlag, dt.DepTypes] = dt.ALL,
) -> Set["spack.spec.Spec"]:
@@ -1706,7 +1681,7 @@ def query(
)
results = list(local_results) + list(x for x in upstream_results if x not in local_results)
results.sort() # type: ignore[call-overload]
results.sort()
return results
def query_one(
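The _write path above uses the classic write-temp-then-rename trick so a crash mid-write never leaves a truncated index behind. A generic sketch, assuming a POSIX rename; Spack routes through its own fs.rename wrapper:

import os
import socket

def atomic_write(path: str, payload: str) -> None:
    # Unique temp name per host and process, as in Database._write above.
    temp = f"{path}.{socket.getfqdn()}.{os.getpid()}.temp"
    with open(temp, "w", encoding="utf-8") as f:
        f.write(payload)
    os.replace(temp, path)  # atomic when temp and path share a filesystem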

View File

@@ -8,7 +8,7 @@
import shutil
import sys
from pathlib import Path
from typing import Dict, List, Optional, Tuple
from typing import List, Optional, Tuple
import llnl.util.filesystem as fs
from llnl.util.symlink import readlink
@@ -17,6 +17,7 @@
import spack.hash_types as ht
import spack.projections
import spack.spec
import spack.store
import spack.util.spack_json as sjson
from spack.error import SpackError
@@ -68,9 +69,10 @@ def specs_from_metadata_dirs(root: str) -> List["spack.spec.Spec"]:
class DirectoryLayout:
"""A directory layout is used to associate unique paths with specs. Different installations are
going to want different layouts for their install, and they can use this to customize the
nesting structure of spack installs. The default layout is:
"""A directory layout is used to associate unique paths with specs.
Different installations are going to want different layouts for their
install, and they can use this to customize the nesting structure of
spack installs. The default layout is:
* <install root>/
@@ -80,30 +82,35 @@ class DirectoryLayout:
* <name>-<version>-<hash>
The installation directory projections can be modified with the projections argument."""
The hash here is a SHA-1 hash for the full DAG plus the build
spec.
def __init__(
self,
root,
*,
projections: Optional[Dict[str, str]] = None,
hash_length: Optional[int] = None,
) -> None:
The installation directory projections can be modified with the
projections argument.
"""
def __init__(self, root, **kwargs):
self.root = root
projections = projections or default_projections
self.projections = {key: projection.lower() for key, projection in projections.items()}
self.check_upstream = True
projections = kwargs.get("projections") or default_projections
self.projections = dict(
(key, projection.lower()) for key, projection in projections.items()
)
# apply hash length as appropriate
self.hash_length = hash_length
self.hash_length = kwargs.get("hash_length", None)
if self.hash_length is not None:
for when_spec, projection in self.projections.items():
if "{hash}" not in projection:
raise InvalidDirectoryLayoutParametersError(
"Conflicting options for installation layout hash length"
if "{hash" in projection
else "Cannot specify hash length when the hash is not part of all "
"install_tree projections"
)
if "{hash" in projection:
raise InvalidDirectoryLayoutParametersError(
"Conflicting options for installation layout hash" " length"
)
else:
raise InvalidDirectoryLayoutParametersError(
"Cannot specify hash length when the hash is not"
" part of all install_tree projections"
)
self.projections[when_spec] = projection.replace(
"{hash}", "{hash:%d}" % self.hash_length
)
@@ -272,6 +279,13 @@ def path_for_spec(self, spec):
if spec.external:
return spec.external_path
if self.check_upstream:
upstream, record = spack.store.STORE.db.query_by_spec_hash(spec.dag_hash())
if upstream:
raise SpackError(
"Internal error: attempted to call path_for_spec on"
" upstream-installed package."
)
path = self.relative_path_for_spec(spec)
assert not path.startswith(self.root)

View File

@@ -15,10 +15,6 @@
SHOW_BACKTRACE = False
class SpackAPIWarning(UserWarning):
"""Warning that formats with file and line number."""
class SpackError(Exception):
"""This is the superclass for all Spack errors.
Subclasses can be found in the modules they have to do with.
@@ -202,10 +198,3 @@ class MirrorError(SpackError):
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)
class InvalidHashError(SpecError):
def __init__(self, spec, hash):
msg = f"No spec with hash {hash} could be found to match {spec}."
msg += " Either the hash does not exist, or it does not match other spec constraints."
super().__init__(msg)

View File

@@ -35,6 +35,7 @@
import spack.config
import spack.directory_layout
import spack.paths
import spack.projections
import spack.relocate
import spack.schema.projections
@@ -43,6 +44,7 @@
import spack.util.spack_json as s_json
import spack.util.spack_yaml as s_yaml
from spack.error import SpackError
from spack.hooks import sbang
__all__ = ["FilesystemView", "YamlFilesystemView"]
@@ -89,10 +91,16 @@ def view_copy(
if stat.S_ISLNK(src_stat.st_mode):
spack.relocate.relocate_links(links=[dst], prefix_to_prefix=prefix_to_projection)
elif spack.relocate.is_binary(dst):
spack.relocate.relocate_text_bin(binaries=[dst], prefix_to_prefix=prefix_to_projection)
spack.relocate.relocate_text_bin(binaries=[dst], prefixes=prefix_to_projection)
else:
prefix_to_projection[spack.store.STORE.layout.root] = view._root
spack.relocate.relocate_text(files=[dst], prefix_to_prefix=prefix_to_projection)
# This is vestigial code for the *old* location of sbang.
prefix_to_projection[f"#!/bin/bash {spack.paths.spack_root}/bin/sbang"] = (
sbang.sbang_shebang_line()
)
spack.relocate.relocate_text(files=[dst], prefixes=prefix_to_projection)
# The os module on Windows does not have a chown function.
if sys.platform != "win32":

View File

@@ -275,7 +275,7 @@ def _do_fake_install(pkg: "spack.package_base.PackageBase") -> None:
fs.mkdirp(pkg.prefix.bin)
fs.touch(os.path.join(pkg.prefix.bin, command))
if sys.platform != "win32":
chmod = which("chmod", required=True)
chmod = which("chmod")
chmod("+x", os.path.join(pkg.prefix.bin, command))
# Install fake header file
@@ -539,7 +539,7 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
# Note that we copy them in as they are in the *install* directory
# NOT as they are in the repository, because we want a snapshot of
# how *this* particular build was done.
for node in spec.traverse(deptype="all"):
for node in spec.traverse(deptype=all):
if node is not spec:
# Locate the dependency package in the install tree and find
# its provenance information.

View File

@@ -503,16 +503,16 @@ def make_argument_parser(**kwargs):
return parser
def showwarning(message, category, filename, lineno, file=None, line=None):
def send_warning_to_tty(message, *args):
"""Redirects messages to tty.warn."""
if category is spack.error.SpackAPIWarning:
tty.warn(f"{filename}:{lineno}: {message}")
else:
tty.warn(message)
tty.warn(message)
def setup_main_options(args):
"""Configure spack globals based on the basic options."""
# Assign a custom function to show warnings
warnings.showwarning = send_warning_to_tty
# Set up environment based on args.
tty.set_verbose(args.verbose)
tty.set_debug(args.debug)
@@ -903,10 +903,9 @@ def _main(argv=None):
# main() is tricky to get right, so be careful where you put things.
#
# Things in this first part of `main()` should *not* require any
# configuration. This doesn't include much -- setting up the parser,
# configuration. This doesn't include much -- setting up th parser,
# restoring some key environment variables, very simple CLI options, etc.
# ------------------------------------------------------------------------
warnings.showwarning = showwarning
# Create a parser with a simple positional argument first. We'll
# lazily load the subcommand(s) we need later. This allows us to
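The showwarning override follows the standard library's hook signature exactly, which is why it gains the category/filename/lineno parameters. A self-contained sketch, with UserWarning standing in for SpackAPIWarning:

import warnings

def showwarning(message, category, filename, lineno, file=None, line=None):
    # Only "API" warnings get a file:line prefix, as in the hunk above.
    if category is UserWarning:
        print(f"{filename}:{lineno}: {message}")
    else:
        print(message)

warnings.showwarning = showwarning
warnings.warn("plain warning")  # routed through the custom hook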

View File

@@ -3,6 +3,8 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.lang
import spack.util.spack_yaml as syaml
@llnl.util.lang.lazy_lexicographic_ordering
class OperatingSystem:
@@ -40,4 +42,4 @@ def _cmp_iter(self):
yield self.version
def to_dict(self):
return {"name": self.name, "version": self.version}
return syaml.syaml_dict([("name", self.name), ("version", self.version)])

View File

@@ -106,16 +106,8 @@
from spack.variant import any_combination_of, auto_or_any_combination_of, disjoint_sets
from spack.version import Version, ver
# These are just here for editor support; they may be set when the build env is set up.
configure: Executable
make_jobs: int
make: MakeExecutable
ninja: MakeExecutable
python_include: str
python_platlib: str
python_purelib: str
python: Executable
spack_cc: str
spack_cxx: str
spack_f77: str
spack_fc: str
# These are just here for editor support; they will be replaced when the build env
# is set up.
make = MakeExecutable("make", jobs=1)
ninja = MakeExecutable("ninja", jobs=1)
configure = Executable(join_path(".", "configure"))

View File

@@ -767,9 +767,6 @@ def __init__(self, spec):
self.win_rpath = fsys.WindowsSimulatedRPath(self)
super().__init__()
def __getitem__(self, key: str) -> "PackageBase":
return self.spec[key].package
@classmethod
def dependency_names(cls):
return _subkeys(cls.dependencies)
@@ -1099,14 +1096,14 @@ def update_external_dependencies(self, extendee_spec=None):
"""
pass
def detect_dev_src_change(self) -> bool:
def detect_dev_src_change(self):
"""
Method for checking for source code changes to trigger rebuild/reinstall
"""
dev_path_var = self.spec.variants.get("dev_path", None)
_, record = spack.store.STORE.db.query_by_spec_hash(self.spec.dag_hash())
assert dev_path_var and record, "dev_path variant and record must be present"
return fsys.recursive_mtime_greater_than(dev_path_var.value, record.installation_time)
mtime = fsys.last_modification_time_recursive(dev_path_var.value)
return mtime > record.installation_time
def all_urls_for_version(self, version: StandardVersion) -> List[str]:
"""Return all URLs derived from version_urls(), url, urls, and
@@ -1819,6 +1816,12 @@ def _has_make_target(self, target):
Returns:
bool: True if 'target' is found, else False
"""
# Prevent altering LC_ALL for 'make' outside this function
make = copy.deepcopy(self.module.make)
# Use English locale for missing target message comparison
make.add_default_env("LC_ALL", "C")
# Check if we have a Makefile
for makefile in ["GNUmakefile", "Makefile", "makefile"]:
if os.path.exists(makefile):
@@ -1827,12 +1830,6 @@ def _has_make_target(self, target):
tty.debug("No Makefile found in the build directory")
return False
# Prevent altering LC_ALL for 'make' outside this function
make = copy.deepcopy(self.module.make)
# Use English locale for missing target message comparison
make.add_default_env("LC_ALL", "C")
# Check if 'target' is a valid target.
#
# `make -n target` performs a "dry run". It prints the commands that
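The LC_ALL=C copy of make exists so GNU make's "No rule to make target" message is emitted in English and can be matched reliably; -n does a dry run that builds nothing. A simplified probe using subprocess (a heuristic sketch, not Spack's exact logic):

import os
import subprocess

def has_make_target(target: str) -> bool:
    env = dict(os.environ, LC_ALL="C")  # force English diagnostics
    proc = subprocess.run(
        ["make", "-n", target], env=env, capture_output=True, text=True
    )
    # Success, or failure for a reason other than a missing target,
    # both suggest the target exists.
    return proc.returncode == 0 or "No rule to make target" not in proc.stderr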

View File

@@ -6,7 +6,8 @@
import os
import re
import sys
from typing import Dict, Iterable, List, Optional
from collections import OrderedDict
from typing import List, Optional
import macholib.mach_o
import macholib.MachO
@@ -17,11 +18,28 @@
from llnl.util.lang import memoized
from llnl.util.symlink import readlink, symlink
import spack.error
import spack.store
import spack.util.elf as elf
import spack.util.executable as executable
from .relocate_text import BinaryFilePrefixReplacer, PrefixToPrefix, TextFilePrefixReplacer
from .relocate_text import BinaryFilePrefixReplacer, TextFilePrefixReplacer
class InstallRootStringError(spack.error.SpackError):
def __init__(self, file_path, root_path):
"""Signal that the relocated binary still has the original
Spack's store root string
Args:
file_path (str): path of the binary
root_path (str): original Spack's store root string
"""
super().__init__(
"\n %s \ncontains string\n %s \n"
"after replacing it in rpaths.\n"
"Package should not be relocated.\n Use -a to override." % (file_path, root_path)
)
@memoized
@@ -36,11 +54,144 @@ def _patchelf() -> Optional[executable.Executable]:
return spack.bootstrap.ensure_patchelf_in_path_or_raise()
def _elf_rpaths_for(path):
"""Return the RPATHs for an executable or a library.
Args:
path (str): full path to the executable or library
Return:
RPATHs as a list of strings. Returns an empty array
on ELF parsing errors, or when the ELF file simply
has no rpaths.
"""
return elf.get_rpaths(path) or []
def _make_relative(reference_file, path_root, paths):
"""Return a list where any path in ``paths`` that starts with
``path_root`` is made relative to the directory in which the
reference file is stored.
After a path is made relative it is prefixed with the ``$ORIGIN``
string.
Args:
reference_file (str): file from which the reference directory
is computed
path_root (str): root of the relative paths
paths: (list) paths to be examined
Returns:
List of relative paths
"""
start_directory = os.path.dirname(reference_file)
pattern = re.compile(path_root)
relative_paths = []
for path in paths:
if pattern.match(path):
rel = os.path.relpath(path, start=start_directory)
path = os.path.join("$ORIGIN", rel)
relative_paths.append(path)
return relative_paths
def _normalize_relative_paths(start_path, relative_paths):
"""Normalize the relative paths with respect to the original path name
of the file (``start_path``).
The paths that are passed to this function existed or were relevant
on another filesystem, so os.path.abspath cannot be used.
A relative path may contain the signifier $ORIGIN. Assuming that
``start_path`` is absolute, this implies that the relative path
(relative to start_path) should be replaced with an absolute path.
Args:
start_path (str): path from which the starting directory
is extracted
relative_paths (list): list of relative paths as obtained by a
call to :ref:`_make_relative`
Returns:
List of normalized paths
"""
normalized_paths = []
pattern = re.compile(re.escape("$ORIGIN"))
start_directory = os.path.dirname(start_path)
for path in relative_paths:
if path.startswith("$ORIGIN"):
sub = pattern.sub(start_directory, path)
path = os.path.normpath(sub)
normalized_paths.append(path)
return normalized_paths
def _decode_macho_data(bytestring):
return bytestring.rstrip(b"\x00").decode("ascii")
def _macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
def macho_make_paths_relative(path_name, old_layout_root, rpaths, deps, idpath):
"""
Return a dictionary mapping the original rpaths to the relativized rpaths.
This dictionary is used to replace paths in mach-o binaries.
Replace old_layout_root with a path relative to the dirname of path_name
in rpaths and deps; idpath is replaced with @rpath/libname.
"""
paths_to_paths = dict()
if idpath:
paths_to_paths[idpath] = os.path.join("@rpath", "%s" % os.path.basename(idpath))
for rpath in rpaths:
if re.match(old_layout_root, rpath):
rel = os.path.relpath(rpath, start=os.path.dirname(path_name))
paths_to_paths[rpath] = os.path.join("@loader_path", "%s" % rel)
else:
paths_to_paths[rpath] = rpath
for dep in deps:
if re.match(old_layout_root, dep):
rel = os.path.relpath(dep, start=os.path.dirname(path_name))
paths_to_paths[dep] = os.path.join("@loader_path", "%s" % rel)
else:
paths_to_paths[dep] = dep
return paths_to_paths
def macho_make_paths_normal(orig_path_name, rpaths, deps, idpath):
"""
Return a dictionary mapping the relativized rpaths to the original rpaths.
This dictionary is used to replace paths in mach-o binaries.
Replace '@loader_path' with the dirname of the original path name
in rpaths and deps; idpath is replaced with the original path name
"""
rel_to_orig = dict()
if idpath:
rel_to_orig[idpath] = orig_path_name
for rpath in rpaths:
if re.match("@loader_path", rpath):
norm = os.path.normpath(
re.sub(re.escape("@loader_path"), os.path.dirname(orig_path_name), rpath)
)
rel_to_orig[rpath] = norm
else:
rel_to_orig[rpath] = rpath
for dep in deps:
if re.match("@loader_path", dep):
norm = os.path.normpath(
re.sub(re.escape("@loader_path"), os.path.dirname(orig_path_name), dep)
)
rel_to_orig[dep] = norm
else:
rel_to_orig[dep] = dep
return rel_to_orig
def macho_find_paths(orig_rpaths, deps, idpath, old_layout_root, prefix_to_prefix):
"""
Inputs
original rpaths from mach-o binaries
@@ -56,12 +207,13 @@ def _macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
# Sort from longest path to shortest, to ensure we try /foo/bar/baz before /foo/bar
prefix_iteration_order = sorted(prefix_to_prefix, key=len, reverse=True)
for orig_rpath in orig_rpaths:
for old_prefix in prefix_iteration_order:
new_prefix = prefix_to_prefix[old_prefix]
if orig_rpath.startswith(old_prefix):
new_rpath = re.sub(re.escape(old_prefix), new_prefix, orig_rpath)
paths_to_paths[orig_rpath] = new_rpath
break
if orig_rpath.startswith(old_layout_root):
for old_prefix in prefix_iteration_order:
new_prefix = prefix_to_prefix[old_prefix]
if orig_rpath.startswith(old_prefix):
new_rpath = re.sub(re.escape(old_prefix), new_prefix, orig_rpath)
paths_to_paths[orig_rpath] = new_rpath
break
else:
paths_to_paths[orig_rpath] = orig_rpath
@@ -85,7 +237,7 @@ def _macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
return paths_to_paths
def _modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
"""
This function is used to make machO buildcaches on macOS by
replacing old paths with new paths using install_name_tool
@@ -128,7 +280,7 @@ def _modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
install_name_tool(*args, temp_path)
def _macholib_get_paths(cur_path):
def macholib_get_paths(cur_path):
"""Get rpaths, dependent libraries, and library id of mach-o objects."""
headers = []
try:
@@ -196,7 +348,9 @@ def _set_elf_rpaths_and_interpreter(
return None
def relocate_macho_binaries(path_names, prefix_to_prefix):
def relocate_macho_binaries(
path_names, old_layout_root, new_layout_root, prefix_to_prefix, rel, old_prefix, new_prefix
):
"""
Use macholib python package to get the rpaths, dependent libraries
and library identity for libraries from the MachO object. Modify them
@@ -209,26 +363,88 @@ def relocate_macho_binaries(path_names, prefix_to_prefix):
# Corner case where macho object file ended up in the path name list
if path_name.endswith(".o"):
continue
# get the paths in the old prefix
rpaths, deps, idpath = _macholib_get_paths(path_name)
# get the mapping of paths in the old prefix to the new prefix
paths_to_paths = _macho_find_paths(rpaths, deps, idpath, prefix_to_prefix)
# replace the old paths with new paths
_modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
if rel:
# get the relativized paths
rpaths, deps, idpath = macholib_get_paths(path_name)
# get the file path name in the original prefix
orig_path_name = re.sub(re.escape(new_prefix), old_prefix, path_name)
# get the mapping of the relativized paths to the original
# normalized paths
rel_to_orig = macho_make_paths_normal(orig_path_name, rpaths, deps, idpath)
# replace the relativized paths with normalized paths
modify_macho_object(path_name, rpaths, deps, idpath, rel_to_orig)
# get the normalized paths in the mach-o binary
rpaths, deps, idpath = macholib_get_paths(path_name)
# get the mapping of paths in old prefix to path in new prefix
paths_to_paths = macho_find_paths(
rpaths, deps, idpath, old_layout_root, prefix_to_prefix
)
# replace the old paths with new paths
modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
# get the new normalized path in the mach-o binary
rpaths, deps, idpath = macholib_get_paths(path_name)
# get the mapping of paths to relative paths in the new prefix
paths_to_paths = macho_make_paths_relative(
path_name, new_layout_root, rpaths, deps, idpath
)
# replace the new paths with relativized paths in the new prefix
modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
else:
# get the paths in the old prefix
rpaths, deps, idpath = macholib_get_paths(path_name)
# get the mapping of paths in the old prefix to the new prefix
paths_to_paths = macho_find_paths(
rpaths, deps, idpath, old_layout_root, prefix_to_prefix
)
# replace the old paths with new paths
modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
def relocate_elf_binaries(binaries: Iterable[str], prefix_to_prefix: Dict[str, str]) -> None:
"""Take a list of binaries, and an ordered prefix to prefix mapping, and update the rpaths
accordingly."""
def _transform_rpaths(orig_rpaths, orig_root, new_prefixes):
"""Return an updated list of RPATHs where each entry in the original list
starting with the old root is relocated to another place according to the
mapping passed as argument.
Args:
orig_rpaths (list): list of the original RPATHs
orig_root (str): original root to be substituted
new_prefixes (dict): dictionary that maps the original prefixes to
where they should be relocated
Returns:
List of paths
"""
new_rpaths = []
for orig_rpath in orig_rpaths:
# If the original RPATH doesn't start with the target root
# append it verbatim and proceed
if not orig_rpath.startswith(orig_root):
new_rpaths.append(orig_rpath)
continue
# Otherwise inspect the mapping and transform + append any prefix
# that starts with a registered key
# avoiding duplicates
for old_prefix, new_prefix in new_prefixes.items():
if orig_rpath.startswith(old_prefix):
new_rpath = re.sub(re.escape(old_prefix), new_prefix, orig_rpath)
if new_rpath not in new_rpaths:
new_rpaths.append(new_rpath)
return new_rpaths
def new_relocate_elf_binaries(binaries, prefix_to_prefix):
"""Take a list of binaries, and an ordered dictionary of
prefix to prefix mapping, and update the rpaths accordingly."""
# Transform to binary string
prefix_to_prefix_bin = {
k.encode("utf-8"): v.encode("utf-8") for k, v in prefix_to_prefix.items()
}
prefix_to_prefix = OrderedDict(
(k.encode("utf-8"), v.encode("utf-8")) for (k, v) in prefix_to_prefix.items()
)
for path in binaries:
try:
elf.substitute_rpath_and_pt_interp_in_place_or_raise(path, prefix_to_prefix_bin)
elf.substitute_rpath_and_pt_interp_in_place_or_raise(path, prefix_to_prefix)
except elf.ElfCStringUpdatesFailed as e:
# Fall back to `patchelf --set-rpath ... --set-interpreter ...`
rpaths = e.rpath.new_value.decode("utf-8").split(":") if e.rpath else []
@@ -236,13 +452,105 @@ def relocate_elf_binaries(binaries: Iterable[str], prefix_to_prefix: Dict[str, s
_set_elf_rpaths_and_interpreter(path, rpaths=rpaths, interpreter=interpreter)
def _warn_if_link_cant_be_relocated(link: str, target: str):
def relocate_elf_binaries(
binaries, orig_root, new_root, new_prefixes, rel, orig_prefix, new_prefix
):
"""Relocate the binaries passed as arguments by changing their RPATHs.
Use patchelf to get the original RPATHs and then replace them with
rpaths in the new directory layout.
New RPATHs are determined from a dictionary mapping the prefixes in the
old directory layout to the prefixes in the new directory layout if the
rpath was in the old layout root, i.e. system paths are not replaced.
Args:
binaries (list): list of binaries that might need relocation, located
in the new prefix
orig_root (str): original root to be substituted
new_root (str): new root to be used, only relevant for relative RPATHs
new_prefixes (dict): dictionary that maps the original prefixes to
where they should be relocated
rel (bool): True if the RPATHs are relative, False if they are absolute
orig_prefix (str): prefix where the executable was originally located
new_prefix (str): prefix where we want to relocate the executable
"""
for new_binary in binaries:
orig_rpaths = _elf_rpaths_for(new_binary)
# TODO: Can we deduce `rel` from the original RPATHs?
if rel:
# Get the file path in the original prefix
orig_binary = re.sub(re.escape(new_prefix), orig_prefix, new_binary)
# Get the normalized RPATHs in the old prefix using the file path
# in the orig prefix
orig_norm_rpaths = _normalize_relative_paths(orig_binary, orig_rpaths)
# Get the normalize RPATHs in the new prefix
new_norm_rpaths = _transform_rpaths(orig_norm_rpaths, orig_root, new_prefixes)
# Get the relative RPATHs in the new prefix
new_rpaths = _make_relative(new_binary, new_root, new_norm_rpaths)
# check to see if relative rpaths are changed before rewriting
if sorted(new_rpaths) != sorted(orig_rpaths):
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
else:
new_rpaths = _transform_rpaths(orig_rpaths, orig_root, new_prefixes)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
def make_link_relative(new_links, orig_links):
"""Compute the relative target from the original link and
make the new link relative.
Args:
new_links (list): new links to be made relative
orig_links (list): original links
"""
for new_link, orig_link in zip(new_links, orig_links):
target = readlink(orig_link)
relative_target = os.path.relpath(target, os.path.dirname(orig_link))
os.unlink(new_link)
symlink(relative_target, new_link)
def make_macho_binaries_relative(cur_path_names, orig_path_names, old_layout_root):
"""
Replace old RPATHs with paths relative to old_dir in binary files
"""
if not sys.platform == "darwin":
return
for cur_path, orig_path in zip(cur_path_names, orig_path_names):
(rpaths, deps, idpath) = macholib_get_paths(cur_path)
paths_to_paths = macho_make_paths_relative(
orig_path, old_layout_root, rpaths, deps, idpath
)
modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths)
def make_elf_binaries_relative(new_binaries, orig_binaries, orig_layout_root):
"""Replace the original RPATHs in the new binaries making them
relative to the original layout root.
Args:
new_binaries (list): new binaries whose RPATHs is to be made relative
orig_binaries (list): original binaries
orig_layout_root (str): path to be used as a base for making
RPATHs relative
"""
for new_binary, orig_binary in zip(new_binaries, orig_binaries):
orig_rpaths = _elf_rpaths_for(new_binary)
if orig_rpaths:
new_rpaths = _make_relative(orig_binary, orig_layout_root, orig_rpaths)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
def warn_if_link_cant_be_relocated(link, target):
if not os.path.isabs(target):
return
tty.warn(f'Symbolic link at "{link}" to "{target}" cannot be relocated')
tty.warn('Symbolic link at "{}" to "{}" cannot be relocated'.format(link, target))
def relocate_links(links: Iterable[str], prefix_to_prefix: Dict[str, str]) -> None:
def relocate_links(links, prefix_to_prefix):
"""Relocate links to a new install prefix."""
regex = re.compile("|".join(re.escape(p) for p in prefix_to_prefix.keys()))
for link in links:
@@ -251,7 +559,7 @@ def relocate_links(links: Iterable[str], prefix_to_prefix: Dict[str, str]) -> No
# No match.
if match is None:
_warn_if_link_cant_be_relocated(link, old_target)
warn_if_link_cant_be_relocated(link, old_target)
continue
new_target = prefix_to_prefix[match.group()] + old_target[match.end() :]
@@ -259,32 +567,32 @@ def relocate_links(links: Iterable[str], prefix_to_prefix: Dict[str, str]) -> No
symlink(new_target, link)
def relocate_text(files: Iterable[str], prefix_to_prefix: PrefixToPrefix) -> None:
def relocate_text(files, prefixes):
"""Relocate text file from the original installation prefix to the
new prefix.
Relocation also affects the path in Spack's sbang script.
Args:
files: Text files to be relocated
prefix_to_prefix: ordered prefix to prefix mapping
files (list): Text files to be relocated
prefixes (OrderedDict): String prefixes which need to be changed
"""
TextFilePrefixReplacer.from_strings_or_bytes(prefix_to_prefix).apply(files)
TextFilePrefixReplacer.from_strings_or_bytes(prefixes).apply(files)
def relocate_text_bin(binaries: Iterable[str], prefix_to_prefix: PrefixToPrefix) -> List[str]:
def relocate_text_bin(binaries, prefixes):
"""Replace null terminated path strings hard-coded into binaries.
The new install prefix must be shorter than the original one.
Args:
binaries: paths to binaries to be relocated
prefix_to_prefix: ordered prefix to prefix mapping
binaries (list): binaries to be relocated
prefixes (OrderedDict): String prefixes which need to be changed.
Raises:
spack.relocate_text.BinaryTextReplaceError: when the new path is longer than the old path
"""
return BinaryFilePrefixReplacer.from_strings_or_bytes(prefix_to_prefix).apply(binaries)
return BinaryFilePrefixReplacer.from_strings_or_bytes(prefixes).apply(binaries)
def is_macho_magic(magic: bytes) -> bool:
@@ -321,7 +629,7 @@ def _exists_dir(dirname):
return os.path.isdir(dirname)
def is_macho_binary(path: str) -> bool:
def is_macho_binary(path):
try:
with open(path, "rb") as f:
return is_macho_magic(f.read(4))
@@ -345,7 +653,7 @@ def fixup_macos_rpath(root, filename):
return False
# Get Mach-O header commands
(rpath_list, deps, id_dylib) = _macholib_get_paths(abspath)
(rpath_list, deps, id_dylib) = macholib_get_paths(abspath)
# Convert rpaths list to (name -> number of occurrences)
add_rpaths = set()
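The heart of _make_relative above is turning store-absolute RPATHs into $ORIGIN-relative ones. A simplified sketch using startswith where the original compiles a regex:

import os

def make_relative(reference_file: str, path_root: str, paths):
    # Rewrite any rpath under path_root relative to the binary's directory.
    start = os.path.dirname(reference_file)
    out = []
    for p in paths:
        if p.startswith(path_root):
            p = os.path.join("$ORIGIN", os.path.relpath(p, start=start))
        out.append(p)
    return out

print(make_relative("/store/pkg/bin/tool", "/store", ["/store/dep/lib", "/usr/lib"]))
# -> ['$ORIGIN/../../dep/lib', '/usr/lib']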

View File

@@ -6,61 +6,64 @@
paths inside text files and binaries."""
import re
from typing import IO, Dict, Iterable, List, Union
from llnl.util.lang import PatternBytes
from collections import OrderedDict
from typing import Dict, Union
import spack.error
Prefix = Union[str, bytes]
PrefixToPrefix = Union[Dict[str, str], Dict[bytes, bytes]]
def encode_path(p: Prefix) -> bytes:
return p if isinstance(p, bytes) else p.encode("utf-8")
def _prefix_to_prefix_as_bytes(prefix_to_prefix: PrefixToPrefix) -> Dict[bytes, bytes]:
return {encode_path(k): encode_path(v) for (k, v) in prefix_to_prefix.items()}
def _prefix_to_prefix_as_bytes(prefix_to_prefix) -> Dict[bytes, bytes]:
return OrderedDict((encode_path(k), encode_path(v)) for (k, v) in prefix_to_prefix.items())
def utf8_path_to_binary_regex(prefix: str) -> PatternBytes:
def utf8_path_to_binary_regex(prefix: str):
"""Create a binary regex that matches the input path in utf8"""
prefix_bytes = re.escape(prefix).encode("utf-8")
return re.compile(b"(?<![\\w\\-_/])([\\w\\-_]*?)%s([\\w\\-_/]*)" % prefix_bytes)
def _byte_strings_to_single_binary_regex(prefixes: Iterable[bytes]) -> PatternBytes:
def _byte_strings_to_single_binary_regex(prefixes):
all_prefixes = b"|".join(re.escape(p) for p in prefixes)
return re.compile(b"(?<![\\w\\-_/])([\\w\\-_]*?)(%s)([\\w\\-_/]*)" % all_prefixes)
def utf8_paths_to_single_binary_regex(prefixes: Iterable[str]) -> PatternBytes:
def utf8_paths_to_single_binary_regex(prefixes):
"""Create a (binary) regex that matches any input path in utf8"""
return _byte_strings_to_single_binary_regex(p.encode("utf-8") for p in prefixes)
def filter_identity_mappings(prefix_to_prefix: Dict[bytes, bytes]) -> Dict[bytes, bytes]:
def filter_identity_mappings(prefix_to_prefix):
"""Drop mappings that are not changed."""
# NOTE: we don't guard against the following case:
# [/abc/def -> /abc/def, /abc -> /x] *will* be simplified to
# [/abc -> /x], meaning that after this simplification /abc/def will be
# mapped to /x/def instead of /abc/def. This should not be a problem.
return {k: v for k, v in prefix_to_prefix.items() if k != v}
return OrderedDict((k, v) for (k, v) in prefix_to_prefix.items() if k != v)
class PrefixReplacer:
"""Base class for applying a prefix to prefix map to a list of binaries or text files. Derived
classes implement _apply_to_file to do the actual work, which is different when it comes to
"""Base class for applying a prefix to prefix map
to a list of binaries or text files.
Child classes implement _apply_to_file to do the
actual work, which is different when it comes to
binaries and text files."""
def __init__(self, prefix_to_prefix: Dict[bytes, bytes]) -> None:
def __init__(self, prefix_to_prefix: Dict[bytes, bytes]):
"""
Arguments:
prefix_to_prefix: An ordered mapping from prefix to prefix. The order is relevant to
support substring fallbacks, for example
``[("/first/sub", "/x"), ("/first", "/y")]`` will ensure /first/sub is matched and
replaced before /first.
prefix_to_prefix (OrderedDict):
An ordered mapping from prefix to prefix. The order is
relevant to support substring fallbacks, for example
[("/first/sub", "/x"), ("/first", "/y")] will ensure
/first/sub is matched and replaced before /first.
"""
self.prefix_to_prefix = filter_identity_mappings(prefix_to_prefix)
@@ -71,7 +74,7 @@ def is_noop(self) -> bool:
or there are no prefixes to replace."""
return not self.prefix_to_prefix
def apply(self, filenames: Iterable[str]) -> List[str]:
def apply(self, filenames: list):
"""Returns a list of files that were modified"""
changed_files = []
if self.is_noop:
@@ -81,20 +84,17 @@ def apply(self, filenames: Iterable[str]) -> List[str]:
changed_files.append(filename)
return changed_files
def apply_to_filename(self, filename: str) -> bool:
def apply_to_filename(self, filename):
if self.is_noop:
return False
with open(filename, "rb+") as f:
return self.apply_to_file(f)
def apply_to_file(self, f: IO[bytes]) -> bool:
def apply_to_file(self, f):
if self.is_noop:
return False
return self._apply_to_file(f)
def _apply_to_file(self, f: IO) -> bool:
raise NotImplementedError("Derived classes must implement this method")
class TextFilePrefixReplacer(PrefixReplacer):
"""This class applies prefix to prefix mappings for relocation
@@ -112,11 +112,13 @@ def __init__(self, prefix_to_prefix: Dict[bytes, bytes]):
self.regex = _byte_strings_to_single_binary_regex(self.prefix_to_prefix.keys())
@classmethod
def from_strings_or_bytes(cls, prefix_to_prefix: PrefixToPrefix) -> "TextFilePrefixReplacer":
def from_strings_or_bytes(
cls, prefix_to_prefix: Dict[Prefix, Prefix]
) -> "TextFilePrefixReplacer":
"""Create a TextFilePrefixReplacer from an ordered prefix to prefix map."""
return cls(_prefix_to_prefix_as_bytes(prefix_to_prefix))
def _apply_to_file(self, f: IO) -> bool:
def _apply_to_file(self, f):
"""Text replacement implementation simply reads the entire file
in memory and applies the combined regex."""
replacement = lambda m: m.group(1) + self.prefix_to_prefix[m.group(2)] + m.group(3)
@@ -131,12 +133,12 @@ def _apply_to_file(self, f: IO) -> bool:
class BinaryFilePrefixReplacer(PrefixReplacer):
def __init__(self, prefix_to_prefix: Dict[bytes, bytes], suffix_safety_size: int = 7) -> None:
def __init__(self, prefix_to_prefix, suffix_safety_size=7):
"""
prefix_to_prefix: Ordered dictionary where the keys are bytes representing the old prefixes
and the values are the new
suffix_safety_size: in case of null terminated strings, what size of the suffix should
remain to avoid aliasing issues?
prefix_to_prefix (OrderedDict): OrderedDictionary where the keys are
bytes representing the old prefixes and the values are the new
suffix_safety_size (int): in case of null terminated strings, what size
of the suffix should remain to avoid aliasing issues?
"""
assert suffix_safety_size >= 0
super().__init__(prefix_to_prefix)
@@ -144,18 +146,17 @@ def __init__(self, prefix_to_prefix: Dict[bytes, bytes], suffix_safety_size: int
self.regex = self.binary_text_regex(self.prefix_to_prefix.keys(), suffix_safety_size)
@classmethod
def binary_text_regex(
cls, binary_prefixes: Iterable[bytes], suffix_safety_size: int = 7
) -> PatternBytes:
"""Create a regex that looks for exact matches of prefixes, and also tries to match a
C-string type null terminator in a small lookahead window.
def binary_text_regex(cls, binary_prefixes, suffix_safety_size=7):
"""
Create a regex that looks for exact matches of prefixes, and also tries to
match a C-string type null terminator in a small lookahead window.
Arguments:
binary_prefixes: Iterable of byte strings of prefixes to match
suffix_safety_size: Size of the lookahead for null-terminated string.
binary_prefixes (list): List of byte strings of prefixes to match
suffix_safety_size (int): Size of the lookahead for null-terminated string.
Returns: compiled regex
"""
# Note: it's important not to use capture groups for the prefix, since it destroys
# performance due to common prefix optimization.
return re.compile(
b"("
+ b"|".join(re.escape(p) for p in binary_prefixes)
@@ -164,34 +165,36 @@ def binary_text_regex(
@classmethod
def from_strings_or_bytes(
cls, prefix_to_prefix: PrefixToPrefix, suffix_safety_size: int = 7
cls, prefix_to_prefix: Dict[Prefix, Prefix], suffix_safety_size: int = 7
) -> "BinaryFilePrefixReplacer":
"""Create a BinaryFilePrefixReplacer from an ordered prefix to prefix map.
Arguments:
prefix_to_prefix: Ordered mapping of prefix to prefix.
suffix_safety_size: Number of bytes to retain at the end of a C-string to avoid binary
string-aliasing issues.
prefix_to_prefix (OrderedDict): Ordered mapping of prefix to prefix.
suffix_safety_size (int): Number of bytes to retain at the end of a C-string
to avoid binary string-aliasing issues.
"""
return cls(_prefix_to_prefix_as_bytes(prefix_to_prefix), suffix_safety_size)
def _apply_to_file(self, f: IO[bytes]) -> bool:
def _apply_to_file(self, f):
"""
Given a file opened in rb+ mode, apply the string replacements as specified by an ordered
dictionary of prefix to prefix mappings. This method takes special care of null-terminated
C-strings. C-string constants are problematic because compilers and linkers optimize
readonly strings for space by aliasing those that share a common suffix (only suffix since
all of them are null terminated). See https://github.com/spack/spack/pull/31739 and
https://github.com/spack/spack/pull/32253 for details. Our logic matches the original
prefix with a ``suffix_safety_size + 1`` lookahead for null bytes. If no null terminator
is found, we simply pad with leading /, assuming that it's a long C-string; the full
C-string after replacement has a large suffix in common with its original value. If there
*is* a null terminator we can do the same as long as the replacement has a sufficiently
long common suffix with the original prefix. As a last resort when the replacement does
not have a long enough common suffix, we can try to shorten the string, but this only
works if the new length is sufficiently short (typically the case when going from large
padding -> normal path) If the replacement string is longer, or all of the above fails,
we error out.
Given a file opened in rb+ mode, apply the string replacements as
specified by an ordered dictionary of prefix to prefix mappings. This
method takes special care of null-terminated C-strings. C-string constants
are problematic because compilers and linkers optimize readonly strings for
space by aliasing those that share a common suffix (only suffix since all
of them are null terminated). See https://github.com/spack/spack/pull/31739
and https://github.com/spack/spack/pull/32253 for details. Our logic matches
the original prefix with a ``suffix_safety_size + 1`` lookahead for null bytes.
If no null terminator is found, we simply pad with leading /, assuming that
it's a long C-string; the full C-string after replacement has a large suffix
in common with its original value.
If there *is* a null terminator we can do the same as long as the replacement
has a sufficiently long common suffix with the original prefix.
As a last resort when the replacement does not have a long enough common suffix,
we can try to shorten the string, but this only works if the new length is
sufficiently short (typically the case when going from large padding -> normal path)
If the replacement string is longer, or all of the above fails, we error out.
Arguments:
f: file opened in rb+ mode
@@ -201,10 +204,11 @@ def _apply_to_file(self, f: IO[bytes]) -> bool:
"""
assert f.tell() == 0
# We *could* read binary data in chunks to avoid loading all in memory, but it's nasty to
# deal with matches across boundaries, so let's stick to something simple.
# We *could* read binary data in chunks to avoid loading all in memory,
# but it's nasty to deal with matches across boundaries, so let's stick to
# something simple.
modified = False
modified = True
for match in self.regex.finditer(f.read()):
# The matching prefix (old) and its replacement (new)
@@ -214,7 +218,8 @@ def _apply_to_file(self, f: IO[bytes]) -> bool:
# Did we find a trailing null within a N + 1 bytes window after the prefix?
null_terminated = match.end(0) > match.end(1)
# Suffix string length, excluding the null byte. Only makes sense if null_terminated
# Suffix string length, excluding the null byte
# Only makes sense if null_terminated
suffix_strlen = match.end(0) - match.end(1) - 1
# How many bytes are we shrinking our string?
@@ -224,9 +229,9 @@ def _apply_to_file(self, f: IO[bytes]) -> bool:
if bytes_shorter < 0:
raise CannotGrowString(old, new)
# If we don't know whether this is a null terminated C-string (we're looking only N + 1
# bytes ahead), or if it is and we have a common suffix, we can simply pad with leading
# dir separators.
# If we don't know whether this is a null terminated C-string (we're looking
# only N + 1 bytes ahead), or if it is and we have a common suffix, we can
# simply pad with leading dir separators.
elif (
not null_terminated
or suffix_strlen >= self.suffix_safety_size # == is enough, but let's be defensive
@@ -235,9 +240,9 @@ def _apply_to_file(self, f: IO[bytes]) -> bool:
):
replacement = b"/" * bytes_shorter + new
# If it *was* null terminated, all that matters is that we can leave N bytes of old
# suffix in place. Note that > is required since we also insert an additional null
# terminator.
# If it *was* null terminated, all that matters is that we can leave N bytes
# of old suffix in place. Note that > is required since we also insert an
# additional null terminator.
elif bytes_shorter > self.suffix_safety_size:
replacement = new + match.group(2) # includes the trailing null
@@ -252,6 +257,22 @@ def _apply_to_file(self, f: IO[bytes]) -> bool:
return modified
class BinaryStringReplacementError(spack.error.SpackError):
def __init__(self, file_path, old_len, new_len):
"""The size of the file changed after binary path substitution
Args:
file_path (str): file with changing size
old_len (int): original length of the file
new_len (int): length of the file after substitution
"""
super().__init__(
"Doing a binary string replacement in %s failed.\n"
"The size of the file changed from %s to %s\n"
"when it should have remanined the same." % (file_path, old_len, new_len)
)
class BinaryTextReplaceError(spack.error.SpackError):
def __init__(self, msg):
msg += (
@@ -263,16 +284,17 @@ def __init__(self, msg):
class CannotGrowString(BinaryTextReplaceError):
def __init__(self, old, new):
return super().__init__(
f"Cannot replace {old!r} with {new!r} because the new prefix is longer."
)
msg = "Cannot replace {!r} with {!r} because the new prefix is longer.".format(old, new)
super().__init__(msg)
class CannotShrinkCString(BinaryTextReplaceError):
def __init__(self, old, new, full_old_string):
# Just interpolate binary string to not risk issues with invalid unicode, which would be
# really bad user experience: error in error. We have no clue if we actually deal with a
# real C-string nor what encoding it has.
super().__init__(
f"Cannot replace {old!r} with {new!r} in the C-string {full_old_string!r}."
# Just interpolate binary string to not risk issues with invalid
# unicode, which would be really bad user experience: error in error.
# We have no clue if we actually deal with a real C-string nor what
# encoding it has.
msg = "Cannot replace {!r} with {!r} in the C-string {!r}.".format(
old, new, full_old_string
)
super().__init__(msg)
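The key constraint in _apply_to_file is that a binary replacement must not shift byte offsets, so a shorter new prefix is left-padded with "/" (redundant separators are harmless in POSIX paths). A stripped-down sketch that ignores the C-string suffix-aliasing cases handled above:

def pad_replace(data: bytes, old: bytes, new: bytes) -> bytes:
    # Keep the byte length constant by left-padding the shorter new prefix.
    assert len(new) <= len(old), "cannot grow a string inside a binary"
    padded = b"/" * (len(old) - len(new)) + new
    return data.replace(old, padded)

blob = b"prefix=/very/long/spack/store/pkg\x00rest"
print(pad_replace(blob, b"/very/long/spack/store", b"/opt/s"))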

View File

@@ -48,7 +48,7 @@ def rewire_node(spec, explicit):
# spec
prefix_to_prefix = {spec.build_spec.prefix: spec.prefix}
build_spec_ids = set(id(s) for s in spec.build_spec.traverse(deptype=dt.ALL & ~dt.BUILD))
for s in bindist.specs_to_relocate(spec):
for s in bindist.deps_to_relocate(spec):
analog = s
if id(s) not in build_spec_ids:
analogs = [
@@ -69,7 +69,7 @@ def rewire_node(spec, explicit):
os.path.join(spec.prefix, rel_path) for rel_path in buildinfo["relocate_textfiles"]
]
if text_to_relocate:
relocate.relocate_text(files=text_to_relocate, prefix_to_prefix=prefix_to_prefix)
relocate.relocate_text(files=text_to_relocate, prefixes=prefix_to_prefix)
links = [os.path.join(spec.prefix, f) for f in buildinfo["relocate_links"]]
relocate.relocate_links(links, prefix_to_prefix)
bins_to_relocate = [
@@ -77,10 +77,26 @@ def rewire_node(spec, explicit):
]
if bins_to_relocate:
if "macho" in platform.binary_formats:
relocate.relocate_macho_binaries(bins_to_relocate, prefix_to_prefix)
relocate.relocate_macho_binaries(
bins_to_relocate,
str(spack.store.STORE.layout.root),
str(spack.store.STORE.layout.root),
prefix_to_prefix,
False,
spec.build_spec.prefix,
spec.prefix,
)
if "elf" in platform.binary_formats:
relocate.relocate_elf_binaries(bins_to_relocate, prefix_to_prefix)
relocate.relocate_text_bin(binaries=bins_to_relocate, prefix_to_prefix=prefix_to_prefix)
relocate.relocate_elf_binaries(
bins_to_relocate,
str(spack.store.STORE.layout.root),
str(spack.store.STORE.layout.root),
prefix_to_prefix,
False,
spec.build_spec.prefix,
spec.prefix,
)
relocate.relocate_text_bin(binaries=bins_to_relocate, prefixes=prefix_to_prefix)
shutil.rmtree(tempdir)
install_manifest = os.path.join(
spec.prefix,

View File

@@ -6,8 +6,6 @@
import typing
import warnings
import jsonschema
import llnl.util.lang
from spack.error import SpecSyntaxError
@@ -21,8 +19,12 @@ class DeprecationMessage(typing.NamedTuple):
# jsonschema is imported lazily as it is heavy to import
# and increases the start-up time
def _make_validator():
import jsonschema
def _validate_spec(validator, is_spec, instance, schema):
"""Check if the attributes on instance are valid specs."""
import jsonschema
import spack.spec_parser
if not validator.is_type(instance, "object"):
@@ -31,8 +33,8 @@ def _validate_spec(validator, is_spec, instance, schema):
for spec_str in instance:
try:
spack.spec_parser.parse(spec_str)
except SpecSyntaxError:
yield jsonschema.ValidationError(f"the key '{spec_str}' is not a valid spec")
except SpecSyntaxError as e:
yield jsonschema.ValidationError(str(e))
def _deprecated_properties(validator, deprecated, instance, schema):
if not (validator.is_type(instance, "object") or validator.is_type(instance, "array")):
@@ -65,7 +67,7 @@ def _deprecated_properties(validator, deprecated, instance, schema):
yield jsonschema.ValidationError("\n".join(errors))
return jsonschema.validators.extend(
jsonschema.Draft7Validator,
jsonschema.Draft4Validator,
{"validate_spec": _validate_spec, "deprecatedProperties": _deprecated_properties},
)
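
Note: the hunk above defers the jsonschema import into _make_validator() so that importing the schema package stays cheap; the heavy module is only paid for when a validator is actually built. A self-contained sketch of the same deferred-import pattern (json stands in for the heavy dependency):

import time

def _make_validator():
    # Deferred import: loaded on first use, not at program start-up.
    import json

    return json.JSONDecoder()

t0 = time.perf_counter()
_make_validator()  # the import cost is paid here
print(f"first construction: {time.perf_counter() - t0:.6f}s")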

View File

@@ -106,17 +106,10 @@
{
"names": ["install_missing_compilers"],
"message": "The config:install_missing_compilers option has been deprecated in "
"Spack v0.23, and is currently ignored. It will be removed from config in "
"Spack v0.23, and is currently ignored. It will be removed from config after "
"Spack v1.0.",
"error": False,
},
{
"names": ["install_path_scheme"],
"message": "The config:install_path_scheme option was deprecated in Spack v0.16 "
"in favor of config:install_tree:projections:all. It will be removed in Spack "
"v1.0.",
"error": False,
},
],
}
}

View File

@@ -19,7 +19,7 @@
"items": {
"type": "object",
"properties": {"when": {"type": "string"}},
"additionalProperties": spec_list_schema,
"patternProperties": {r"^(?!when$)\w*": spec_list_schema},
},
}
}
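
Note: the incoming pattern ^(?!when$)\w* accepts any word-character key except the literal "when", which stays reserved for the conditional. A quick check with Python's re (patternProperties uses the same regex semantics):

import re

pattern = re.compile(r"^(?!when$)\w*")
assert pattern.match("packages")
assert pattern.match("when_extended")  # only the exact key "when" is excluded
assert not pattern.match("when")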

View File

@@ -9,8 +9,6 @@
"""
from typing import Any, Dict
import jsonschema
#: Common properties for connection specification
connection = {
"url": {"type": "string"},
@@ -104,6 +102,8 @@
def update(data):
import jsonschema
errors = []
def check_access_pair(name, section):

View File

@@ -12,6 +12,22 @@
import spack.schema.environment
import spack.schema.projections
#: Matches a spec or a multi-valued variant but not another
#: valid keyword.
#:
#: THIS NEEDS TO BE UPDATED FOR EVERY NEW KEYWORD THAT
#: IS ADDED IMMEDIATELY BELOW THE MODULE TYPE ATTRIBUTE
spec_regex = (
r"(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|hide|"
r"include|exclude|projections|naming_scheme|core_compilers|all)(^\w[\w-]*)"
)
#: Matches a valid name for a module set
valid_module_set_name = r"^(?!prefix_inspections$)\w[\w-]*$"
#: Matches an anonymous spec, i.e. a spec without a root name
anonymous_spec_regex = r"^[\^@%+~]"
#: Definitions for parts of module schema
array_of_strings = {"type": "array", "default": [], "items": {"type": "string"}}
@@ -40,7 +56,7 @@
"suffixes": {
"type": "object",
"validate_spec": True,
"additionalProperties": {"type": "string"}, # key
"patternProperties": {r"\w[\w-]*": {"type": "string"}}, # key
},
"environment": spack.schema.environment.definition,
},
@@ -48,40 +64,34 @@
projections_scheme = spack.schema.projections.properties["projections"]
module_type_configuration: Dict = {
module_type_configuration = {
"type": "object",
"default": {},
"validate_spec": True,
"properties": {
"verbose": {"type": "boolean", "default": False},
"hash_length": {"type": "integer", "minimum": 0, "default": 7},
"include": array_of_strings,
"exclude": array_of_strings,
"exclude_implicits": {"type": "boolean", "default": False},
"defaults": array_of_strings,
"hide_implicits": {"type": "boolean", "default": False},
"naming_scheme": {"type": "string"},
"projections": projections_scheme,
"all": module_file_configuration,
},
"additionalProperties": module_file_configuration,
"allOf": [
{
"properties": {
"verbose": {"type": "boolean", "default": False},
"hash_length": {"type": "integer", "minimum": 0, "default": 7},
"include": array_of_strings,
"exclude": array_of_strings,
"exclude_implicits": {"type": "boolean", "default": False},
"defaults": array_of_strings,
"hide_implicits": {"type": "boolean", "default": False},
"naming_scheme": {"type": "string"}, # Can we be more specific here?
"projections": projections_scheme,
"all": module_file_configuration,
}
},
{
"validate_spec": True,
"patternProperties": {
spec_regex: module_file_configuration,
anonymous_spec_regex: module_file_configuration,
},
},
],
}
tcl_configuration = module_type_configuration.copy()
lmod_configuration = module_type_configuration.copy()
lmod_configuration["properties"].update(
{
"core_compilers": array_of_strings,
"hierarchy": array_of_strings,
"core_specs": array_of_strings,
"filter_hierarchy_specs": {
"type": "object",
"validate_spec": True,
"additionalProperties": array_of_strings,
},
}
)
module_config_properties = {
"use_view": {"anyOf": [{"type": "string"}, {"type": "boolean"}]},
@@ -95,8 +105,31 @@
"default": [],
"items": {"type": "string", "enum": ["tcl", "lmod"]},
},
"lmod": lmod_configuration,
"tcl": tcl_configuration,
"lmod": {
"allOf": [
# Base configuration
module_type_configuration,
{
"type": "object",
"properties": {
"core_compilers": array_of_strings,
"hierarchy": array_of_strings,
"core_specs": array_of_strings,
"filter_hierarchy_specs": {
"type": "object",
"patternProperties": {spec_regex: array_of_strings},
},
},
}, # Specific lmod extensions
]
},
"tcl": {
"allOf": [
# Base configuration
module_type_configuration,
{}, # Specific tcl extensions
]
},
"prefix_inspections": {
"type": "object",
"additionalProperties": False,
@@ -112,6 +145,7 @@
properties: Dict[str, Any] = {
"modules": {
"type": "object",
"additionalProperties": False,
"properties": {
"prefix_inspections": {
"type": "object",
@@ -122,11 +156,13 @@
},
}
},
"additionalProperties": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": module_config_properties,
"patternProperties": {
valid_module_set_name: {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": module_config_properties,
}
},
}
}
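
Note: spec_regex above is a negative lookahead followed by a name pattern, so module-set keys that are real configuration keywords are never mistaken for package specs, while anonymous specs are caught by their leading sigil. A runnable check:

import re

spec_regex = (
    r"(?!hierarchy|core_specs|verbose|hash_length|defaults|filter_hierarchy_specs|hide|"
    r"include|exclude|projections|naming_scheme|core_compilers|all)(^\w[\w-]*)"
)
anonymous_spec_regex = r"^[\^@%+~]"

assert re.match(spec_regex, "gcc")              # package-like keys match
assert re.match(spec_regex, "py-numpy")
assert not re.match(spec_regex, "projections")  # reserved keywords do not
assert re.match(anonymous_spec_regex, "@3.2 +shared")  # anonymous spec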

View File

@@ -98,6 +98,7 @@
"packages": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": {
"all": { # package name
"type": "object",
@@ -139,54 +140,58 @@
},
}
},
"additionalProperties": { # package name
"type": "object",
"default": {},
"additionalProperties": False,
"properties": {
"require": requirements,
"prefer": prefer_and_conflict,
"conflict": prefer_and_conflict,
"version": {
"type": "array",
"default": [],
# version strings
"items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
},
"buildable": {"type": "boolean", "default": True},
"permissions": permissions,
# If 'get_full_repo' is promoted to a Package-level
# attribute, it could be useful to set it here
"package_attributes": package_attributes,
"variants": variants,
"externals": {
"type": "array",
"items": {
"type": "object",
"properties": {
"spec": {"type": "string"},
"prefix": {"type": "string"},
"modules": {"type": "array", "items": {"type": "string"}},
"extra_attributes": {
"type": "object",
"additionalProperties": {"type": "string"},
"properties": {
"compilers": {
"type": "object",
"patternProperties": {r"(^\w[\w-]*)": {"type": "string"}},
"patternProperties": {
r"(?!^all$)(^\w[\w-]*)": { # package name
"type": "object",
"default": {},
"additionalProperties": False,
"properties": {
"require": requirements,
"prefer": prefer_and_conflict,
"conflict": prefer_and_conflict,
"version": {
"type": "array",
"default": [],
# version strings
"items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
},
"buildable": {"type": "boolean", "default": True},
"permissions": permissions,
# If 'get_full_repo' is promoted to a Package-level
# attribute, it could be useful to set it here
"package_attributes": package_attributes,
"variants": variants,
"externals": {
"type": "array",
"items": {
"type": "object",
"properties": {
"spec": {"type": "string"},
"prefix": {"type": "string"},
"modules": {"type": "array", "items": {"type": "string"}},
"extra_attributes": {
"type": "object",
"additionalProperties": True,
"properties": {
"compilers": {
"type": "object",
"patternProperties": {
r"(^\w[\w-]*)": {"type": "string"}
},
},
"environment": spack.schema.environment.definition,
"extra_rpaths": extra_rpaths,
"implicit_rpaths": implicit_rpaths,
"flags": flags,
},
"environment": spack.schema.environment.definition,
"extra_rpaths": extra_rpaths,
"implicit_rpaths": implicit_rpaths,
"flags": flags,
},
},
"additionalProperties": True,
"required": ["spec"],
},
"additionalProperties": True,
"required": ["spec"],
},
},
},
}
},
}
}
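
Note: for orientation, here is a minimal packages entry that satisfies the externals schema above, expressed as the Python dict the YAML parser would produce (package name and paths hypothetical):

packages = {
    "packages": {
        "openssl": {  # matches (?!^all$)(^\w[\w-]*): any package name but "all"
            "buildable": False,
            "externals": [
                # "spec" is the only required key per the schema above.
                {"spec": "openssl@3.0.2", "prefix": "/usr"}
            ],
        }
    }
}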

View File

@@ -37,9 +37,7 @@
import spack.package_prefs
import spack.platforms
import spack.repo
import spack.solver.splicing
import spack.spec
import spack.spec_lookup
import spack.store
import spack.util.crypto
import spack.util.libc
@@ -69,7 +67,7 @@
GitOrStandardVersion = Union[spack.version.GitVersion, spack.version.StandardVersion]
TransformFunction = Callable[[spack.spec.Spec, List[AspFunction]], List[AspFunction]]
TransformFunction = Callable[["spack.spec.Spec", List[AspFunction]], List[AspFunction]]
#: Enable the addition of a runtime node
WITH_RUNTIME = sys.platform != "win32"
@@ -129,8 +127,8 @@ def __str__(self):
@contextmanager
def named_spec(
spec: Optional[spack.spec.Spec], name: Optional[str]
) -> Iterator[Optional[spack.spec.Spec]]:
spec: Optional["spack.spec.Spec"], name: Optional[str]
) -> Iterator[Optional["spack.spec.Spec"]]:
"""Context manager to temporarily set the name of a spec"""
if spec is None or name is None:
yield spec
@@ -749,11 +747,11 @@ def on_model(model):
class KnownCompiler(NamedTuple):
"""Data class to collect information on compilers"""
spec: spack.spec.Spec
spec: "spack.spec.Spec"
os: str
target: Optional[str]
target: str
available: bool
compiler_obj: Optional[spack.compiler.Compiler]
compiler_obj: Optional["spack.compiler.Compiler"]
def _key(self):
return self.spec, self.os, self.target
@@ -1134,7 +1132,7 @@ def __init__(self, tests: bool = False):
set
)
self.possible_compilers: List[KnownCompiler] = []
self.possible_compilers: List = []
self.possible_oses: Set = set()
self.variant_values_from_specs: Set = set()
self.version_constraints: Set = set()
@@ -1388,7 +1386,7 @@ def effect_rules(self):
def define_variant(
self,
pkg: Type[spack.package_base.PackageBase],
pkg: "Type[spack.package_base.PackageBase]",
name: str,
when: spack.spec.Spec,
variant_def: vt.Variant,
@@ -1492,7 +1490,7 @@ def define_auto_variant(self, name: str, multi: bool):
)
)
def variant_rules(self, pkg: Type[spack.package_base.PackageBase]):
def variant_rules(self, pkg: "Type[spack.package_base.PackageBase]"):
for name in pkg.variant_names():
self.gen.h3(f"Variant {name} in package {pkg.name}")
for when, variant_def in pkg.variant_definitions(name):
@@ -1683,8 +1681,8 @@ def dependency_holds(input_spec, requirements):
def _gen_match_variant_splice_constraints(
self,
pkg,
cond_spec: spack.spec.Spec,
splice_spec: spack.spec.Spec,
cond_spec: "spack.spec.Spec",
splice_spec: "spack.spec.Spec",
hash_asp_var: "AspVar",
splice_node,
match_variants: List[str],
@@ -1742,7 +1740,7 @@ def package_splice_rules(self, pkg):
if any(
v in cond.variants or v in spec_to_splice.variants for v in match_variants
):
raise spack.error.PackageError(
raise Exception(
"Overlap between match_variants and explicitly set variants"
)
variant_constraints = self._gen_match_variant_splice_constraints(
@@ -2712,7 +2710,7 @@ def setup(
if env:
dev_specs = tuple(
spack.spec.Spec(info["spec"]).constrained(
'dev_path="%s"'
"dev_path=%s"
% spack.util.path.canonicalize_path(info["path"], default_wd=env.path)
)
for name, info in env.dev_specs.items()
@@ -2979,7 +2977,7 @@ def _specs_from_requires(self, pkg_name, section):
for s in spec_group[key]:
yield _spec_with_default_name(s, pkg_name)
def pkg_class(self, pkg_name: str) -> typing.Type[spack.package_base.PackageBase]:
def pkg_class(self, pkg_name: str) -> typing.Type["spack.package_base.PackageBase"]:
request = pkg_name
if pkg_name in self.explicitly_required_namespaces:
namespace = self.explicitly_required_namespaces[pkg_name]
@@ -3098,7 +3096,7 @@ def __init__(self, configuration) -> None:
self.compilers.add(candidate)
def with_input_specs(self, input_specs: List[spack.spec.Spec]) -> "CompilerParser":
def with_input_specs(self, input_specs: List["spack.spec.Spec"]) -> "CompilerParser":
"""Accounts for input specs when building the list of possible compilers.
Args:
@@ -3138,7 +3136,7 @@ def with_input_specs(self, input_specs: List[spack.spec.Spec]) -> "CompilerParse
return self
def add_compiler_from_concrete_spec(self, spec: spack.spec.Spec) -> None:
def add_compiler_from_concrete_spec(self, spec: "spack.spec.Spec") -> None:
"""Account for compilers that are coming from concrete specs, through reuse.
Args:
@@ -3376,6 +3374,14 @@ def consume_facts(self):
self._setup.effect_rules()
# This should be a dataclass, but dataclasses don't work on Python 3.6
class Splice:
def __init__(self, splice_node: NodeArgument, child_name: str, child_hash: str):
self.splice_node = splice_node
self.child_name = child_name
self.child_hash = child_hash
class SpecBuilder:
"""Class with actions to rebuild a spec from ASP results."""
@@ -3415,7 +3421,7 @@ def __init__(self, specs, hash_lookup=None):
self._specs: Dict[NodeArgument, spack.spec.Spec] = {}
# Matches parent nodes to splice node
self._splices: Dict[spack.spec.Spec, List[spack.solver.splicing.Splice]] = {}
self._splices: Dict[NodeArgument, List[Splice]] = {}
self._result = None
self._command_line_specs = specs
self._flag_sources: Dict[Tuple[NodeArgument, str], Set[str]] = collections.defaultdict(
@@ -3534,13 +3540,15 @@ def reorder_flags(self):
)
cmd_specs = dict((s.name, s) for spec in self._command_line_specs for s in spec.traverse())
for node, spec in self._specs.items():
for spec in self._specs.values():
# if bootstrapping, compiler is not in config and has no flags
flagmap_from_compiler = {}
if spec.compiler in compilers:
flagmap_from_compiler = compilers[spec.compiler].flags
for flag_type in spec.compiler_flags.valid_compiler_flags():
node = SpecBuilder.make_node(pkg=spec.name)
ordered_flags = []
# 1. Put compiler flags first
@@ -3622,12 +3630,49 @@ def splice_at_hash(
child_name: str,
child_hash: str,
):
parent_spec = self._specs[parent_node]
splice_spec = self._specs[splice_node]
splice = spack.solver.splicing.Splice(
splice_spec, child_name=child_name, child_hash=child_hash
)
self._splices.setdefault(parent_spec, []).append(splice)
splice = Splice(splice_node, child_name=child_name, child_hash=child_hash)
self._splices.setdefault(parent_node, []).append(splice)
def _resolve_automatic_splices(self):
"""After all of the specs have been concretized, apply all immediate splices.
Use reverse topological order to ensure that all dependencies are resolved
before their parents, allowing for maximal sharing and minimal copying.
"""
fixed_specs = {}
# create a mapping from dag hash to an integer representing position in reverse topo order.
specs = self._specs.values()
topo_order = list(traverse.traverse_nodes(specs, order="topo", key=traverse.by_dag_hash))
topo_lookup = {spec.dag_hash(): index for index, spec in enumerate(reversed(topo_order))}
# iterate over specs, children before parents
for node, spec in sorted(self._specs.items(), key=lambda x: topo_lookup[x[1].dag_hash()]):
immediate = self._splices.get(node, [])
if not immediate and not any(
edge.spec in fixed_specs for edge in spec.edges_to_dependencies()
):
continue
new_spec = spec.copy(deps=False)
new_spec.build_spec = spec
for edge in spec.edges_to_dependencies():
depflag = edge.depflag & ~dt.BUILD
if any(edge.spec.dag_hash() == splice.child_hash for splice in immediate):
splice = [s for s in immediate if s.child_hash == edge.spec.dag_hash()][0]
new_spec.add_dependency_edge(
self._specs[splice.splice_node], depflag=depflag, virtuals=edge.virtuals
)
elif edge.spec in fixed_specs:
new_spec.add_dependency_edge(
fixed_specs[edge.spec], depflag=depflag, virtuals=edge.virtuals
)
else:
new_spec.add_dependency_edge(
edge.spec, depflag=depflag, virtuals=edge.virtuals
)
self._specs[node] = new_spec
fixed_specs[spec] = new_spec
@staticmethod
def sort_fn(function_tuple) -> Tuple[int, int]:
@@ -3720,15 +3765,7 @@ def build_specs(self, function_tuples):
for root in roots.values():
root._finalize_concretization()
# Only attempt to resolve automatic splices if the solver produced any
if self._splices:
resolved_splices = spack.solver.splicing._resolve_collected_splices(
list(self._specs.values()), self._splices
)
new_specs = {}
for node, spec in self._specs.items():
new_specs[node] = resolved_splices.get(spec, spec)
self._specs = new_specs
self._resolve_automatic_splices()
for s in self._specs.values():
spack.spec.Spec.ensure_no_deprecated(s)
@@ -3775,7 +3812,7 @@ def execute_explicit_splices(self):
# The first iteration, we need to replace the abstract hash
if not replacement.concrete:
spack.spec_lookup.replace_hash(replacement)
replacement.replace_hash()
current_spec = current_spec.splice(replacement, transitive)
new_key = NodeArgument(id=key.id, pkg=current_spec.name)
specs[new_key] = current_spec
@@ -4134,7 +4171,7 @@ def solve_with_stats(
setup_only (bool): if True, stop after setup and don't solve (default False).
allow_deprecated (bool): allow deprecated version in the solve
"""
specs = [spack.spec_lookup.lookup_hash(s) for s in specs]
specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
@@ -4171,7 +4208,7 @@ def solve_in_rounds(
tests (bool): add test dependencies to the solve
allow_deprecated (bool): allow deprecated version in the solve
"""
specs = [spack.spec_lookup.lookup_hash(s) for s in specs]
specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)

View File

@@ -1,73 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from functools import cmp_to_key
from typing import Dict, List, NamedTuple
import spack.deptypes as dt
from spack.spec import Spec
from spack.traverse import by_dag_hash, traverse_nodes
class Splice(NamedTuple):
#: The spec being spliced into a parent
splice_spec: Spec
#: The name of the child that splice spec is replacing
child_name: str
#: The hash of the child that `splice_spec` is replacing
child_hash: str
def _resolve_collected_splices(
specs: List[Spec], splices: Dict[Spec, List[Splice]]
) -> Dict[Spec, Spec]:
"""After all of the specs have been concretized, apply all immediate splices.
Returns a dict mapping original specs to their resolved counterparts
"""
def splice_cmp(s1: Spec, s2: Spec):
"""This function can be used to sort a list of specs such that that any
spec which will be spliced into a parent comes after the parent it will
be spliced into. This order ensures that transitive splices will be
executed in the correct order.
"""
s1_splices = splices.get(s1, [])
s2_splices = splices.get(s2, [])
if any([s2.dag_hash() == splice.splice_spec.dag_hash() for splice in s1_splices]):
return -1
elif any([s1.dag_hash() == splice.splice_spec.dag_hash() for splice in s2_splices]):
return 1
else:
return 0
splice_order = sorted(specs, key=cmp_to_key(splice_cmp))
reverse_topo_order = reversed(
[x for x in traverse_nodes(splice_order, order="topo", key=by_dag_hash) if x in specs]
)
already_resolved: Dict[Spec, Spec] = {}
for spec in reverse_topo_order:
immediate = splices.get(spec, [])
if not immediate and not any(
edge.spec in already_resolved for edge in spec.edges_to_dependencies()
):
continue
new_spec = spec.copy(deps=False)
new_spec.clear_caches(ignore=("package_hash",))
new_spec.build_spec = spec
for edge in spec.edges_to_dependencies():
depflag = edge.depflag & ~dt.BUILD
if any(edge.spec.dag_hash() == splice.child_hash for splice in immediate):
splice = [s for s in immediate if s.child_hash == edge.spec.dag_hash()][0]
# If the spec being splice in is also spliced
splice_spec = already_resolved.get(splice.splice_spec, splice.splice_spec)
new_spec.add_dependency_edge(splice_spec, depflag=depflag, virtuals=edge.virtuals)
elif edge.spec in already_resolved:
new_spec.add_dependency_edge(
already_resolved[edge.spec], depflag=depflag, virtuals=edge.virtuals
)
else:
new_spec.add_dependency_edge(edge.spec, depflag=depflag, virtuals=edge.virtuals)
already_resolved[spec] = new_spec
return already_resolved
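
Note: both splice resolvers above visit children before parents (reverse topological order), so every parent links to the already-rewritten copy of each child and no spec is copied twice. A toy sketch of that ordering on a t -> h -> z chain:

graph = {"t": ["h"], "h": ["z"], "z": []}  # parent -> children

def children_first(root):
    order, seen = [], set()
    def visit(node):
        if node in seen:
            return
        seen.add(node)
        for child in graph[node]:
            visit(child)
        order.append(node)  # post-order DFS == children before parents
    visit(root)
    return order

assert children_first("t") == ["z", "h", "t"]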

View File

@@ -58,21 +58,7 @@
import re
import socket
import warnings
from typing import (
Any,
Callable,
Dict,
Iterable,
List,
Match,
Optional,
Set,
Tuple,
Union,
overload,
)
from typing_extensions import Literal
from typing import Any, Callable, Dict, Iterable, List, Match, Optional, Set, Tuple, Union
import archspec.cpu
@@ -86,6 +72,7 @@
import spack
import spack.compiler
import spack.compilers
import spack.config
import spack.deptypes as dt
import spack.error
import spack.hash_types as ht
@@ -93,9 +80,10 @@
import spack.platforms
import spack.provider_index
import spack.repo
import spack.solver
import spack.spec_parser
import spack.store
import spack.traverse
import spack.traverse as traverse
import spack.util.executable
import spack.util.hash
import spack.util.module_cmd as md
@@ -106,6 +94,8 @@
import spack.version as vn
import spack.version.git_ref_lookup
from .enums import InstallRecordStatus
__all__ = [
"CompilerSpec",
"Spec",
@@ -126,6 +116,8 @@
"UnsatisfiableArchitectureSpecError",
"UnsatisfiableProviderSpecError",
"UnsatisfiableDependencySpecError",
"AmbiguousHashError",
"InvalidHashError",
"SpecDeprecatedError",
]
@@ -574,9 +566,14 @@ def to_dict(self):
target_data = str(self.target)
else:
# Get rid of compiler flag information before turning the uarch into a dict
target_data = self.target.to_dict()
target_data.pop("compilers", None)
return {"arch": {"platform": self.platform, "platform_os": self.os, "target": target_data}}
uarch_dict = self.target.to_dict()
uarch_dict.pop("compilers", None)
target_data = syaml.syaml_dict(uarch_dict.items())
d = syaml.syaml_dict(
[("platform", self.platform), ("platform_os", self.os), ("target", target_data)]
)
return syaml.syaml_dict([("arch", d)])
@staticmethod
def from_dict(d):
@@ -701,7 +698,10 @@ def _cmp_iter(self):
yield self.versions
def to_dict(self):
return {"compiler": {"name": self.name, **self.versions.to_dict()}}
d = syaml.syaml_dict([("name", self.name)])
d.update(self.versions.to_dict())
return syaml.syaml_dict([("compiler", d)])
@staticmethod
def from_dict(d):
@@ -1339,16 +1339,16 @@ def tree(
depth: bool = False,
hashes: bool = False,
hashlen: Optional[int] = None,
cover: spack.traverse.CoverType = "nodes",
cover: str = "nodes",
indent: int = 0,
format: str = DEFAULT_FORMAT,
deptypes: Union[dt.DepFlag, dt.DepTypes] = dt.ALL,
deptypes: Union[Tuple[str, ...], str] = "all",
show_types: bool = False,
depth_first: bool = False,
recurse_dependencies: bool = True,
status_fn: Optional[Callable[["Spec"], InstallStatus]] = None,
prefix: Optional[Callable[["Spec"], str]] = None,
key: Callable[["Spec"], Any] = id,
key=id,
) -> str:
"""Prints out specs and their dependencies, tree-formatted with indentation.
@@ -1380,16 +1380,11 @@ def tree(
# reduce deptypes over all in-edges when covering nodes
if show_types and cover == "nodes":
deptype_lookup: Dict[str, dt.DepFlag] = collections.defaultdict(dt.DepFlag)
for edge in spack.traverse.traverse_edges(
specs, cover="edges", deptype=deptypes, root=False
):
for edge in traverse.traverse_edges(specs, cover="edges", deptype=deptypes, root=False):
deptype_lookup[edge.spec.dag_hash()] |= edge.depflag
# SupportsRichComparisonT issue with List[Spec]
sorted_specs: List["Spec"] = sorted(specs) # type: ignore[type-var]
for d, dep_spec in spack.traverse.traverse_tree(
sorted_specs, cover=cover, deptype=deptypes, depth_first=depth_first, key=key
for d, dep_spec in traverse.traverse_tree(
sorted(specs), cover=cover, deptype=deptypes, depth_first=depth_first, key=key
):
node = dep_spec.spec
@@ -1932,125 +1927,13 @@ def installed_upstream(self):
upstream, _ = spack.store.STORE.db.query_by_spec_hash(self.dag_hash())
return upstream
@overload
def traverse(
self,
*,
root: bool = ...,
order: spack.traverse.OrderType = ...,
cover: spack.traverse.CoverType = ...,
direction: spack.traverse.DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[False] = False,
key: Callable[["Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable["Spec"]: ...
@overload
def traverse(
self,
*,
root: bool = ...,
order: spack.traverse.OrderType = ...,
cover: spack.traverse.CoverType = ...,
direction: spack.traverse.DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[True],
key: Callable[["Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[Tuple[int, "Spec"]]: ...
def traverse(
self,
*,
root: bool = True,
order: spack.traverse.OrderType = "pre",
cover: spack.traverse.CoverType = "nodes",
direction: spack.traverse.DirectionType = "children",
deptype: Union[dt.DepFlag, dt.DepTypes] = "all",
depth: bool = False,
key: Callable[["Spec"], Any] = id,
visited: Optional[Set[Any]] = None,
) -> Iterable[Union["Spec", Tuple[int, "Spec"]]]:
def traverse(self, **kwargs):
"""Shorthand for :meth:`~spack.traverse.traverse_nodes`"""
return spack.traverse.traverse_nodes(
[self],
root=root,
order=order,
cover=cover,
direction=direction,
deptype=deptype,
depth=depth,
key=key,
visited=visited,
)
return traverse.traverse_nodes([self], **kwargs)
@overload
def traverse_edges(
self,
*,
root: bool = ...,
order: spack.traverse.OrderType = ...,
cover: spack.traverse.CoverType = ...,
direction: spack.traverse.DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[False] = False,
key: Callable[["Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[DependencySpec]: ...
@overload
def traverse_edges(
self,
*,
root: bool = ...,
order: spack.traverse.OrderType = ...,
cover: spack.traverse.CoverType = ...,
direction: spack.traverse.DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[True],
key: Callable[["Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[Tuple[int, DependencySpec]]: ...
def traverse_edges(
self,
*,
root: bool = True,
order: spack.traverse.OrderType = "pre",
cover: spack.traverse.CoverType = "nodes",
direction: spack.traverse.DirectionType = "children",
deptype: Union[dt.DepFlag, dt.DepTypes] = "all",
depth: bool = False,
key: Callable[["Spec"], Any] = id,
visited: Optional[Set[Any]] = None,
) -> Iterable[Union[DependencySpec, Tuple[int, DependencySpec]]]:
def traverse_edges(self, **kwargs):
"""Shorthand for :meth:`~spack.traverse.traverse_edges`"""
return spack.traverse.traverse_edges(
[self],
root=root,
order=order,
cover=cover,
direction=direction,
deptype=deptype,
depth=depth,
key=key,
visited=visited,
)
@property
def long_spec(self):
"""Returns a string of the spec with the dependencies completely
enumerated."""
root_str = [self.format()]
sorted_dependencies = sorted(
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
)
sorted_dependencies = [
d.format("{edge_attributes} " + DEFAULT_FORMAT) for d in sorted_dependencies
]
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
return traverse.traverse_edges([self], **kwargs)
@property
def short_spec(self):
@@ -2166,6 +2049,66 @@ def process_hash_bit_prefix(self, bits):
"""Get the first <bits> bits of the DAG hash as an integer type."""
return spack.util.hash.base32_prefix_bits(self.process_hash(), bits)
def _lookup_hash(self):
"""Lookup just one spec with an abstract hash, returning a spec from the the environment,
store, or finally, binary caches."""
import spack.binary_distribution
import spack.environment
active_env = spack.environment.active_environment()
# First env, then store, then binary cache
matches = (
(active_env.all_matching_specs(self) if active_env else [])
or spack.store.STORE.db.query(self, installed=InstallRecordStatus.ANY)
or spack.binary_distribution.BinaryCacheQuery(True)(self)
)
if not matches:
raise InvalidHashError(self, self.abstract_hash)
if len(matches) != 1:
raise AmbiguousHashError(
f"Multiple packages specify hash beginning '{self.abstract_hash}'.", *matches
)
return matches[0]
def lookup_hash(self):
"""Given a spec with an abstract hash, return a copy of the spec with all properties and
dependencies by looking up the hash in the environment, store, or finally, binary caches.
This is non-destructive."""
if self.concrete or not any(node.abstract_hash for node in self.traverse()):
return self
spec = self.copy(deps=False)
# root spec is replaced
if spec.abstract_hash:
spec._dup(self._lookup_hash())
return spec
# Get dependencies that need to be replaced
for node in self.traverse(root=False):
if node.abstract_hash:
spec._add_dependency(node._lookup_hash(), depflag=0, virtuals=())
# reattach nodes that were not otherwise satisfied by new dependencies
for node in self.traverse(root=False):
if not any(n.satisfies(node) for n in spec.traverse()):
spec._add_dependency(node.copy(), depflag=0, virtuals=())
return spec
def replace_hash(self):
"""Given a spec with an abstract hash, attempt to populate all properties and dependencies
by looking up the hash in the environment, store, or finally, binary caches.
This is destructive."""
if not any(node for node in self.traverse(order="post") if node.abstract_hash):
return
self._dup(self.lookup_hash())
def to_node_dict(self, hash=ht.dag_hash):
"""Create a dictionary representing the state of this Spec.
@@ -2218,7 +2161,9 @@ def to_node_dict(self, hash=ht.dag_hash):
Arguments:
hash (spack.hash_types.SpecHashDescriptor) type of hash to generate.
"""
d = {"name": self.name}
d = syaml.syaml_dict()
d["name"] = self.name
if self.versions:
d.update(self.versions.to_dict())
@@ -2232,7 +2177,7 @@ def to_node_dict(self, hash=ht.dag_hash):
if self.namespace:
d["namespace"] = self.namespace
params = dict(sorted(v.yaml_entry() for v in self.variants.values()))
params = syaml.syaml_dict(sorted(v.yaml_entry() for _, v in self.variants.items()))
# Only need the string compiler flag for yaml file
params.update(
@@ -2258,16 +2203,13 @@ def to_node_dict(self, hash=ht.dag_hash):
)
if self.external:
if self.extra_attributes:
extra_attributes = syaml.sorted_dict(self.extra_attributes)
else:
extra_attributes = None
d["external"] = {
"path": self.external_path,
"module": self.external_modules,
"extra_attributes": extra_attributes,
}
d["external"] = syaml.syaml_dict(
[
("path", self.external_path),
("module", self.external_modules),
("extra_attributes", self.extra_attributes),
]
)
if not self._concrete:
d["concrete"] = False
@@ -2298,25 +2240,29 @@ def to_node_dict(self, hash=ht.dag_hash):
# Note: Relies on sorting dict by keys later in algorithm.
deps = self._dependencies_dict(depflag=hash.depflag)
if deps:
d["dependencies"] = [
{
"name": name,
hash.name: dspec.spec._cached_hash(hash),
"parameters": {
"deptypes": dt.flag_to_tuple(dspec.depflag),
"virtuals": dspec.virtuals,
},
}
for name, edges_for_name in sorted(deps.items())
for dspec in edges_for_name
]
deps_list = []
for name, edges_for_name in sorted(deps.items()):
name_tuple = ("name", name)
for dspec in edges_for_name:
hash_tuple = (hash.name, dspec.spec._cached_hash(hash))
parameters_tuple = (
"parameters",
syaml.syaml_dict(
(
("deptypes", dt.flag_to_tuple(dspec.depflag)),
("virtuals", dspec.virtuals),
)
),
)
ordered_entries = [name_tuple, hash_tuple, parameters_tuple]
deps_list.append(syaml.syaml_dict(ordered_entries))
d["dependencies"] = deps_list
# Name is included in case this is replacing a virtual.
if self._build_spec:
d["build_spec"] = {
"name": self.build_spec.name,
hash.name: self.build_spec._cached_hash(hash),
}
d["build_spec"] = syaml.syaml_dict(
[("name", self.build_spec.name), (hash.name, self.build_spec._cached_hash(hash))]
)
return d
def to_dict(self, hash=ht.dag_hash):
@@ -2418,7 +2364,10 @@ def to_dict(self, hash=ht.dag_hash):
node_list.append(node)
hash_set.add(node_hash)
return {"spec": {"_meta": {"version": SPECFILE_FORMAT_VERSION}, "nodes": node_list}}
meta_dict = syaml.syaml_dict([("version", SPECFILE_FORMAT_VERSION)])
inner_dict = syaml.syaml_dict([("_meta", meta_dict), ("nodes", node_list)])
spec_dict = syaml.syaml_dict([("spec", inner_dict)])
return spec_dict
def node_dict_with_hashes(self, hash=ht.dag_hash):
"""Returns a node_dict of this spec with the dag hash added. If this
@@ -2869,16 +2818,44 @@ def ensure_no_deprecated(root):
raise SpecDeprecatedError(msg)
def concretize(self, tests: Union[bool, Iterable[str]] = False) -> None:
from spack.concretize import concretize_one
"""Concretize the current spec.
warnings.warn(
"`Spec.concretize` is deprecated and will be removed in version 1.0.0. Use "
"`spack.concretize.concretize_one` instead.",
category=spack.error.SpackAPIWarning,
stacklevel=2,
)
Args:
tests: if False disregard 'test' dependencies, if a list of names activate them for
the packages in the list, if True activate 'test' dependencies for all packages.
"""
import spack.solver.asp
self._dup(concretize_one(self, tests))
self.replace_hash()
for node in self.traverse():
if not node.name:
raise spack.error.SpecError(
f"Spec {node} has no name; cannot concretize an anonymous spec"
)
if self._concrete:
return
allow_deprecated = spack.config.get("config:deprecated", False)
solver = spack.solver.asp.Solver()
result = solver.solve([self], tests=tests, allow_deprecated=allow_deprecated)
# take the best answer
opt, i, answer = min(result.answers)
name = self.name
# TODO: Consolidate this code with similar code in solve.py
if self.virtual:
providers = [spec.name for spec in answer.values() if spec.package.provides(name)]
name = providers[0]
node = spack.solver.asp.SpecBuilder.make_node(pkg=name)
assert (
node in answer
), f"cannot find {name} in the list of specs {','.join([n.pkg for n in answer.keys()])}"
concretized = answer[node]
self._dup(concretized)
def _mark_root_concrete(self, value=True):
"""Mark just this spec (not dependencies) concrete."""
@@ -2967,17 +2944,20 @@ def _finalize_concretization(self):
for spec in self.traverse():
spec._cached_hash(ht.dag_hash)
def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "Spec":
from spack.concretize import concretize_one
def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "spack.spec.Spec":
"""This is a non-destructive version of concretize().
warnings.warn(
"`Spec.concretized` is deprecated and will be removed in version 1.0.0. Use "
"`spack.concretize.concretize_one` instead.",
category=spack.error.SpackAPIWarning,
stacklevel=2,
)
First clones, then returns a concrete version of this package
without modifying this package.
return concretize_one(self, tests)
Args:
tests (bool or list): if False disregard 'test' dependencies,
if a list of names activate them for the packages in the list,
if True activate 'test' dependencies for all packages.
"""
clone = self.copy()
clone.concretize(tests=tests)
return clone
def index(self, deptype="all"):
"""Return a dictionary that points to all the dependencies in this
@@ -3068,7 +3048,7 @@ def constrain(self, other, deps=True):
if not self.abstract_hash or other.abstract_hash.startswith(self.abstract_hash):
self.abstract_hash = other.abstract_hash
elif not self.abstract_hash.startswith(other.abstract_hash):
raise spack.error.InvalidHashError(self, other.abstract_hash)
raise InvalidHashError(self, other.abstract_hash)
if not (self.name == other.name or (not self.name) or (not other.name)):
raise UnsatisfiableSpecNameError(self.name, other.name)
@@ -3513,16 +3493,25 @@ def patches(self):
return self._patches
def _dup(self, other: "Spec", deps: Union[bool, dt.DepTypes, dt.DepFlag] = True) -> bool:
"""Copies "other" into self, by overwriting all attributes.
def _dup(self, other, deps: Union[bool, dt.DepTypes, dt.DepFlag] = True, cleardeps=True):
"""Copy the spec other into self. This is an overwriting
copy. It does not copy any dependents (parents), but by default
copies dependencies.
To duplicate an entire DAG, call _dup() on the root of the DAG.
Args:
other: spec to be copied onto ``self``
deps: if True copies all the dependencies. If False copies None.
If deptype, or depflag, copy matching types.
other (Spec): spec to be copied onto ``self``
deps: if True copies all the dependencies. If
False copies None. If deptype/depflag, copy matching types.
cleardeps (bool): if True clears the dependencies of ``self``,
before possibly copying the dependencies of ``other`` onto
``self``
Returns:
True if ``self`` changed because of the copy operation, False otherwise.
True if ``self`` changed because of the copy operation,
False otherwise.
"""
# We don't count dependencies as changes here
changed = True
@@ -3547,15 +3536,14 @@ def _dup(self, other: "Spec", deps: Union[bool, dt.DepTypes, dt.DepFlag] = True)
self.versions = other.versions.copy()
self.architecture = other.architecture.copy() if other.architecture else None
self.compiler = other.compiler.copy() if other.compiler else None
if cleardeps:
self._dependents = _EdgeMap(store_by_child=False)
self._dependencies = _EdgeMap(store_by_child=True)
self.compiler_flags = other.compiler_flags.copy()
self.compiler_flags.spec = self
self.variants = other.variants.copy()
self._build_spec = other._build_spec
# Clear dependencies
self._dependents = _EdgeMap(store_by_child=False)
self._dependencies = _EdgeMap(store_by_child=True)
# FIXME: we manage _patches_in_order_of_appearance specially here
# to keep it from leaking out of spec.py, but we should figure
# out how to handle it more elegantly in the Variant classes.
@@ -4060,7 +4048,15 @@ def __str__(self):
if not self._dependencies:
return self.format()
return self.long_spec
root_str = [self.format()]
sorted_dependencies = sorted(
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
)
sorted_dependencies = [
d.format("{edge_attributes} " + DEFAULT_FORMAT) for d in sorted_dependencies
]
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
@property
def colored_str(self):
@@ -4109,10 +4105,10 @@ def tree(
depth: bool = False,
hashes: bool = False,
hashlen: Optional[int] = None,
cover: spack.traverse.CoverType = "nodes",
cover: str = "nodes",
indent: int = 0,
format: str = DEFAULT_FORMAT,
deptypes: Union[dt.DepTypes, dt.DepFlag] = dt.ALL,
deptypes: Union[Tuple[str, ...], str] = "all",
show_types: bool = False,
depth_first: bool = False,
recurse_dependencies: bool = True,
@@ -4438,7 +4434,7 @@ def mask_build_deps(in_spec):
return spec
def clear_caches(self, ignore: Tuple[str, ...] = ()) -> None:
def clear_caches(self, ignore=()):
"""
Clears all cached hashes in a Spec, while preserving other properties.
"""
@@ -4823,7 +4819,9 @@ def from_node_dict(cls, node):
spec.external_modules = node["external"]["module"]
if spec.external_modules is False:
spec.external_modules = None
spec.extra_attributes = node["external"].get("extra_attributes", {})
spec.extra_attributes = node["external"].get(
"extra_attributes", syaml.syaml_dict()
)
# specs read in are concrete unless marked abstract
if node.get("concrete", True):
@@ -5275,6 +5273,21 @@ def __init__(self, spec):
super().__init__(msg)
class AmbiguousHashError(spack.error.SpecError):
def __init__(self, msg, *specs):
spec_fmt = "{namespace}.{name}{@version}{%compiler}{compiler_flags}"
spec_fmt += "{variants}{ arch=architecture}{/hash:7}"
specs_str = "\n " + "\n ".join(spec.format(spec_fmt) for spec in specs)
super().__init__(msg + specs_str)
class InvalidHashError(spack.error.SpecError):
def __init__(self, spec, hash):
msg = f"No spec with hash {hash} could be found to match {spec}."
msg += " Either the hash does not exist, or it does not match other spec constraints."
super().__init__(msg)
class SpecFilenameError(spack.error.SpecError):
"""Raised when a spec file name is invalid."""

View File

@@ -5,7 +5,6 @@
from typing import List
import spack.spec
import spack.spec_lookup
import spack.variant
from spack.error import SpackError
from spack.spec import Spec
@@ -231,7 +230,7 @@ def _expand_matrix_constraints(matrix_config):
pass
# Resolve abstract hashes for exclusion criteria
if any(spack.spec_lookup.lookup_hash(test_spec).satisfies(x) for x in excludes):
if any(test_spec.lookup_hash().satisfies(x) for x in excludes):
continue
if sigil:

View File

@@ -1,79 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.binary_distribution
import spack.environment
import spack.error
import spack.spec
import spack.store
from .enums import InstallRecordStatus
def _lookup_hash(spec: spack.spec.Spec):
"""Lookup just one spec with an abstract hash, returning a spec from the the environment,
store, or finally, binary caches."""
active_env = spack.environment.active_environment()
# First env, then store, then binary cache
matches = (
(active_env.all_matching_specs(spec) if active_env else [])
or spack.store.STORE.db.query(spec, installed=InstallRecordStatus.ANY)
or spack.binary_distribution.BinaryCacheQuery(True)(spec)
)
if not matches:
raise spack.error.InvalidHashError(spec, spec.abstract_hash)
if len(matches) != 1:
raise AmbiguousHashError(
f"Multiple packages specify hash beginning '{spec.abstract_hash}'.", *matches
)
return matches[0]
def lookup_hash(spec: spack.spec.Spec) -> spack.spec.Spec:
"""Given a spec with an abstract hash, return a copy of the spec with all properties and
dependencies by looking up the hash in the environment, store, or finally, binary caches.
This is non-destructive."""
if spec.concrete or not any(node.abstract_hash for node in spec.traverse()):
return spec
spec = spec.copy(deps=False)
# root spec is replaced
if spec.abstract_hash:
spec._dup(_lookup_hash(spec))
return spec
# Get dependencies that need to be replaced
for node in spec.traverse(root=False):
if node.abstract_hash:
spec._add_dependency(_lookup_hash(node), depflag=0, virtuals=())
# reattach nodes that were not otherwise satisfied by new dependencies
for node in spec.traverse(root=False):
if not any(n.satisfies(node) for n in spec.traverse()):
spec._add_dependency(node.copy(), depflag=0, virtuals=())
return spec
def replace_hash(spec: spack.spec.Spec) -> None:
"""Given a spec with an abstract hash, attempt to populate all properties and dependencies
by looking up the hash in the environment, store, or finally, binary caches.
This is destructive."""
if not any(node for node in spec.traverse(order="post") if node.abstract_hash):
return
spec._dup(lookup_hash(spec))
class AmbiguousHashError(spack.error.SpecError):
def __init__(self, msg, *specs):
spec_fmt = "{namespace}.{name}{@version}{%compiler}{compiler_flags}"
spec_fmt += "{variants}{ arch=architecture}{/hash:7}"
specs_str = "\n " + "\n ".join(spec.format(spec_fmt) for spec in specs)
super().__init__(msg + specs_str)
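
Note: the module deleted above exposed the same logic as free functions: lookup_hash returns a resolved copy, while replace_hash mutates its argument in place. A sketch of that calling convention with a hypothetical hash:

import spack.spec_lookup
from spack.spec import Spec

s = Spec("zlib/4zyh3lu")
resolved = spack.spec_lookup.lookup_hash(s)  # non-destructive copy
spack.spec_lookup.replace_hash(s)            # destructive, mutates s
assert resolved.dag_hash() == s.dag_hash()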

View File

@@ -7,14 +7,35 @@
import pytest
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.solver.asp
from spack.installer import PackageInstaller
from spack.solver.asp import SolverError
from spack.spec import Spec
class CacheManager:
def __init__(self, specs: List[str]) -> None:
self.req_specs = specs
self.concr_specs: List[Spec]
self.concr_specs = []
def __enter__(self):
self.concr_specs = [Spec(s).concretized() for s in self.req_specs]
for s in self.concr_specs:
PackageInstaller([s.package], fake=True, explicit=True).install()
def __exit__(self, exc_type, exc_val, exc_tb):
for s in self.concr_specs:
s.package.do_uninstall()
# On macOS and Windows this only works if you pass a module-level function
# rather than a closure
def _mock_has_runtime_dependencies(_x):
return True
def _make_specs_non_buildable(specs: List[str]):
output_config = {}
for spec in specs:
@@ -23,262 +44,203 @@ def _make_specs_non_buildable(specs: List[str]):
@pytest.fixture
def install_specs(
mutable_database,
mock_packages,
mutable_config,
do_not_check_runtimes_on_reuse,
install_mockery,
):
"""Returns a function that concretizes and installs a list of abstract specs"""
mutable_config.set("concretizer:reuse", True)
def _impl(*specs_str):
concrete_specs = [Spec(s).concretized() for s in specs_str]
PackageInstaller([s.package for s in concrete_specs], fake=True, explicit=True).install()
return concrete_specs
return _impl
def splicing_setup(mutable_database, mock_packages, monkeypatch):
spack.config.set("concretizer:reuse", True)
monkeypatch.setattr(
spack.solver.asp, "_has_runtime_dependencies", _mock_has_runtime_dependencies
)
def _enable_splicing():
spack.config.set("concretizer:splice", {"automatic": True})
@pytest.mark.parametrize("spec_str", ["splice-z", "splice-h@1"])
def test_spec_reuse(spec_str, install_specs, mutable_config):
"""Tests reuse of splice-z, without splicing, as a root and as a dependency of splice-h"""
splice_z = install_specs("splice-z@1.0.0+compat")[0]
mutable_config.set("packages", _make_specs_non_buildable(["splice-z"]))
concrete = spack.concretize.concretize_one(spec_str)
assert concrete["splice-z"].satisfies(splice_z)
def _has_build_dependency(spec: Spec, name: str):
return any(s.name == name for s in spec.dependencies(None, dt.BUILD))
@pytest.mark.regression("48578")
def test_splice_installed_hash(install_specs, mutable_config):
"""Tests splicing the dependency of an installed spec, for another installed spec"""
splice_t, splice_h = install_specs(
def test_simple_reuse(splicing_setup):
with CacheManager(["splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
assert Spec("splice-z").concretized().satisfies(Spec("splice-z"))
def test_simple_dep_reuse(splicing_setup):
with CacheManager(["splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
assert Spec("splice-h@1").concretized().satisfies(Spec("splice-h@1"))
def test_splice_installed_hash(splicing_setup):
cache = [
"splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0",
"splice-h@1.0.2+compat ^splice-z@1.0.0",
)
packages_config = _make_specs_non_buildable(["splice-t", "splice-h"])
mutable_config.set("packages", packages_config)
goal_spec = "splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0"
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
concrete = spack.concretize.concretize_one(goal_spec)
# splice-t has a dependency that is changing, thus its hash should be different
assert concrete.dag_hash() != splice_t.dag_hash()
assert concrete.build_spec.satisfies(splice_t)
assert not concrete.satisfies(splice_t)
# splice-h is reused, so the hash should stay the same
assert concrete["splice-h"].satisfies(splice_h)
assert concrete["splice-h"].build_spec.satisfies(splice_h)
assert concrete["splice-h"].dag_hash() == splice_h.dag_hash()
]
with CacheManager(cache):
packages_config = _make_specs_non_buildable(["splice-t", "splice-h"])
spack.config.set("packages", packages_config)
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0")
with pytest.raises(Exception):
goal_spec.concretized()
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
def test_splice_build_splice_node(install_specs, mutable_config):
"""Tests splicing the dependency of an installed spec, for a spec that is yet to be built"""
splice_t = install_specs("splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat")[0]
mutable_config.set("packages", _make_specs_non_buildable(["splice-t"]))
goal_spec = "splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0+compat"
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
concrete = spack.concretize.concretize_one(goal_spec)
# splice-t has a dependency that is changing, thus its hash should be different
assert concrete.dag_hash() != splice_t.dag_hash()
assert concrete.build_spec.satisfies(splice_t)
assert not concrete.satisfies(splice_t)
# splice-h should be different
assert concrete["splice-h"].dag_hash() != splice_t["splice-h"].dag_hash()
assert concrete["splice-h"].build_spec.dag_hash() == concrete["splice-h"].dag_hash()
def test_splice_build_splice_node(splicing_setup):
with CacheManager(["splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-t"]))
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0+compat")
with pytest.raises(Exception):
goal_spec.concretized()
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
def test_double_splice(install_specs, mutable_config):
"""Tests splicing two dependencies of an installed spec, for other installed specs"""
splice_t, splice_h, splice_z = install_specs(
def test_double_splice(splicing_setup):
cache = [
"splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat",
"splice-h@1.0.2+compat ^splice-z@1.0.1+compat",
"splice-z@1.0.2+compat",
)
mutable_config.set("packages", _make_specs_non_buildable(["splice-t", "splice-h", "splice-z"]))
goal_spec = "splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.2+compat"
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
concrete = spack.concretize.concretize_one(goal_spec)
# splice-t and splice-h have a dependency that is changing, thus its hash should be different
assert concrete.dag_hash() != splice_t.dag_hash()
assert concrete.build_spec.satisfies(splice_t)
assert not concrete.satisfies(splice_t)
assert concrete["splice-h"].dag_hash() != splice_h.dag_hash()
assert concrete["splice-h"].build_spec.satisfies(splice_h)
assert not concrete["splice-h"].satisfies(splice_h)
# splice-z is reused, so the hash should stay the same
assert concrete["splice-z"].dag_hash() == splice_z.dag_hash()
]
with CacheManager(cache):
freeze_builds_config = _make_specs_non_buildable(["splice-t", "splice-h", "splice-z"])
spack.config.set("packages", freeze_builds_config)
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.2+compat")
with pytest.raises(Exception):
goal_spec.concretized()
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
@pytest.mark.parametrize(
"original_spec,goal_spec",
[
# `virtual-abi-1` can be spliced for `virtual-abi-multi abi=one` and vice-versa
(
"depends-on-virtual-with-abi ^virtual-abi-1",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
),
(
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
"depends-on-virtual-with-abi ^virtual-abi-1",
),
# `virtual-abi-2` can be spliced for `virtual-abi-multi abi=two` and vice-versa
(
"depends-on-virtual-with-abi ^virtual-abi-2",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
),
(
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
"depends-on-virtual-with-abi ^virtual-abi-2",
),
],
)
def test_virtual_multi_splices_in(original_spec, goal_spec, install_specs, mutable_config):
"""Tests that we can splice a virtual dependency with a different, but compatible, provider."""
original = install_specs(original_spec)[0]
mutable_config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
spliced = spack.concretize.concretize_one(goal_spec)
assert spliced.dag_hash() != original.dag_hash()
assert spliced.build_spec.dag_hash() == original.dag_hash()
assert spliced["virtual-with-abi"].name != spliced.build_spec["virtual-with-abi"].name
# The next two tests are mirrors of one another
def test_virtual_multi_splices_in(splicing_setup):
cache = [
"depends-on-virtual-with-abi ^virtual-abi-1",
"depends-on-virtual-with-abi ^virtual-abi-2",
]
goal_specs = [
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
]
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
for gs in goal_specs:
with pytest.raises(Exception):
Spec(gs).concretized()
_enable_splicing()
for gs in goal_specs:
assert Spec(gs).concretized().satisfies(gs)
@pytest.mark.parametrize(
"original_spec,goal_spec",
[
def test_virtual_multi_can_be_spliced(splicing_setup):
cache = [
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
]
goal_specs = [
"depends-on-virtual-with-abi ^virtual-abi-1",
"depends-on-virtual-with-abi ^virtual-abi-2",
]
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
with pytest.raises(Exception):
for gs in goal_specs:
Spec(gs).concretized()
_enable_splicing()
for gs in goal_specs:
assert Spec(gs).concretized().satisfies(gs)
def test_manyvariant_star_matching_variant_splice(splicing_setup):
cache = [
# can_splice("manyvariants@1.0.0", when="@1.0.1", match_variants="*")
(
"depends-on-manyvariants ^manyvariants@1.0.0+a+b c=v1 d=v2",
"depends-on-manyvariants ^manyvariants@1.0.1+a+b c=v1 d=v2",
),
(
"depends-on-manyvariants ^manyvariants@1.0.0~a~b c=v3 d=v3",
"depends-on-manyvariants ^manyvariants@1.0.1~a~b c=v3 d=v3",
),
"depends-on-manyvariants ^manyvariants@1.0.0+a+b c=v1 d=v2",
"depends-on-manyvariants ^manyvariants@1.0.0~a~b c=v3 d=v3",
]
goal_specs = [
Spec("depends-on-manyvariants ^manyvariants@1.0.1+a+b c=v1 d=v2"),
Spec("depends-on-manyvariants ^manyvariants@1.0.1~a~b c=v3 d=v3"),
]
with CacheManager(cache):
freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
spack.config.set("packages", freeze_build_config)
for goal in goal_specs:
with pytest.raises(Exception):
goal.concretized()
_enable_splicing()
for goal in goal_specs:
assert goal.concretized().satisfies(goal)
def test_manyvariant_limited_matching(splicing_setup):
cache = [
# can_splice("manyvariants@2.0.0+a~b", when="@2.0.1~a+b", match_variants=["c", "d"])
(
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0+a~b c=v3 d=v2",
"depends-on-manyvariants@2.0 ^manyvariants@2.0.1~a+b c=v3 d=v2",
),
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0+a~b c=v3 d=v2",
# can_splice("manyvariants@2.0.0 c=v1 d=v1", when="@2.0.1+a+b")
(
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0~a~b c=v1 d=v1",
"depends-on-manyvariants@2.0 ^manyvariants@2.0.1+a+b c=v3 d=v3",
),
],
)
def test_manyvariant_matching_variant_splice(
original_spec, goal_spec, install_specs, mutable_config
):
"""Tests splicing with different kind of matching on variants"""
original = install_specs(original_spec)[0]
mutable_config.set("packages", {"depends-on-manyvariants": {"buildable": False}})
with pytest.raises(SolverError):
spack.concretize.concretize_one(goal_spec)
_enable_splicing()
spliced = spack.concretize.concretize_one(goal_spec)
assert spliced.dag_hash() != original.dag_hash()
assert spliced.build_spec.dag_hash() == original.dag_hash()
# The spliced 'manyvariants' is yet to be built
assert spliced["manyvariants"].dag_hash() != original["manyvariants"].dag_hash()
assert spliced["manyvariants"].build_spec.dag_hash() == spliced["manyvariants"].dag_hash()
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0~a~b c=v1 d=v1",
]
goal_specs = [
Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1~a+b c=v3 d=v2"),
Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1+a+b c=v3 d=v3"),
]
with CacheManager(cache):
freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
spack.config.set("packages", freeze_build_config)
for s in goal_specs:
with pytest.raises(Exception):
s.concretized()
_enable_splicing()
for s in goal_specs:
assert s.concretized().satisfies(s)
def test_external_splice_same_name(install_specs, mutable_config):
    """Tests that externals can be spliced for non-external specs"""
    original_splice_h, original_splice_t = install_specs(
        "splice-h@1.0.0 ^splice-z@1.0.0+compat",
        "splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.1+compat",
    )
    mutable_config.set("packages", _make_specs_non_buildable(["splice-t", "splice-h"]))
    mutable_config.set(
        "packages",
        {
            "splice-z": {
                "externals": [{"spec": "splice-z@1.0.2+compat", "prefix": "/usr"}],
                "buildable": False,
            }
        },
    )
    _enable_splicing()
    concrete_splice_h = spack.concretize.concretize_one("splice-h@1.0.0 ^splice-z@1.0.2")
    concrete_splice_t = spack.concretize.concretize_one(
        "splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.2"
    )
    assert concrete_splice_h.dag_hash() != original_splice_h.dag_hash()
    assert concrete_splice_h.build_spec.dag_hash() == original_splice_h.dag_hash()
    assert concrete_splice_h["splice-z"].external
    assert concrete_splice_t.dag_hash() != original_splice_t.dag_hash()
    assert concrete_splice_t.build_spec.dag_hash() == original_splice_t.dag_hash()
    assert concrete_splice_t["splice-z"].external
    assert concrete_splice_t["splice-z"].dag_hash() == concrete_splice_h["splice-z"].dag_hash()
def test_external_splice_same_name(splicing_setup):
    cache = [
        "splice-h@1.0.0 ^splice-z@1.0.0+compat",
        "splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.1+compat",
    ]
    packages_yaml = {
        "splice-z": {"externals": [{"spec": "splice-z@1.0.2+compat", "prefix": "/usr"}]}
    }
    goal_specs = [
        Spec("splice-h@1.0.0 ^splice-z@1.0.2"),
        Spec("splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.2"),
    ]
    with CacheManager(cache):
        spack.config.set("packages", packages_yaml)
        _enable_splicing()
        for s in goal_specs:
            assert s.concretized().satisfies(s)
def test_spliced_build_deps_only_in_build_spec(install_specs):
    """Tests that build specs are not reported in the spliced spec"""
    install_specs("splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.0")
    _enable_splicing()
    spliced = spack.concretize.concretize_one("splice-t@1.0 ^splice-h@1.0.2 ^splice-z@1.0.0")
    build_spec = spliced.build_spec
    # Spec has been spliced
    assert build_spec.dag_hash() != spliced.dag_hash()
    # Build spec has spliced build dependencies
    assert build_spec.dependencies("splice-h", dt.BUILD)
    assert build_spec.dependencies("splice-z", dt.BUILD)
    # Spliced build dependencies are removed
    assert len(spliced.dependencies(None, dt.BUILD)) == 0
def test_spliced_build_deps_only_in_build_spec(splicing_setup):
    cache = ["splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.0"]
    goal_spec = Spec("splice-t@1.0 ^splice-h@1.0.2 ^splice-z@1.0.0")
    with CacheManager(cache):
        _enable_splicing()
        concr_goal = goal_spec.concretized()
        build_spec = concr_goal._build_spec
        # Spec has been spliced
        assert build_spec is not None
        # Build spec has spliced build dependencies
        assert _has_build_dependency(build_spec, "splice-h")
        assert _has_build_dependency(build_spec, "splice-z")
        # Spliced build dependencies are removed
        assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
def test_spliced_transitive_dependency(install_specs, mutable_config):
    """Tests that build specs are not reported, even for spliced transitive dependencies"""
    install_specs("splice-depends-on-t@1.0 ^splice-h@1.0.1")
    mutable_config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
    _enable_splicing()
    spliced = spack.concretize.concretize_one("splice-depends-on-t^splice-h@1.0.2")
    # Spec has been spliced
    assert spliced.build_spec.dag_hash() != spliced.dag_hash()
    assert spliced["splice-t"].build_spec.dag_hash() != spliced["splice-t"].dag_hash()
    # Spliced build dependencies are removed
    assert len(spliced.dependencies(None, dt.BUILD)) == 0
    assert len(spliced["splice-t"].dependencies(None, dt.BUILD)) == 0
def test_spliced_transitive_dependency(splicing_setup):
    cache = ["splice-depends-on-t@1.0 ^splice-h@1.0.1"]
    goal_spec = Spec("splice-depends-on-t^splice-h@1.0.2")
    with CacheManager(cache):
        spack.config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
        _enable_splicing()
        concr_goal = goal_spec.concretized()
        # Spec has been spliced
        assert concr_goal._build_spec is not None
        assert concr_goal["splice-t"]._build_spec is not None
        assert concr_goal.satisfies(goal_spec)
        # Spliced build dependencies are removed
        assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
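Both generations of these tests flip splicing on via _enable_splicing() before re-concretizing. The helper's body is not part of this diff; a plausible reconstruction (an assumption, not the verbatim helper) is a one-line config toggle:

import spack.config

def _enable_splicing():
    # Let the concretizer automatically splice goal specs onto
    # already-installed builds instead of rebuilding them.
    spack.config.set("concretizer:splice", {"automatic": True})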

View File

@@ -133,5 +133,5 @@ def test_concretize_target_ranges(root_target_range, dep_target_range, result, m
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
)
with spack.concretize.disable_compiler_existence_check():
spec = spack.concretize.concretize_one(spec)
spec.concretize()
assert spec.target == spec["pkg-b"].target == result

View File

@@ -28,7 +28,6 @@
import spack.binary_distribution as bindist
import spack.caches
import spack.compilers
import spack.concretize
import spack.config
import spack.fetch_strategy
import spack.hooks.sbang as sbang
@@ -37,13 +36,14 @@
import spack.oci.image
import spack.paths
import spack.spec
import spack.stage
import spack.store
import spack.util.gpg
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack.binary_distribution import CannotListKeys, GenerateIndexError
from spack.installer import PackageInstaller
from spack.directory_layout import DirectoryLayout
from spack.paths import test_path
from spack.spec import Spec
@@ -136,28 +136,35 @@ def default_config(tmp_path, config_directory, monkeypatch, install_mockery):
@pytest.fixture(scope="function")
def install_dir_default_layout(tmpdir):
"""Hooks a fake install directory with a default layout"""
scheme = os.path.join(
"${architecture}", "${compiler.name}-${compiler.version}", "${name}-${version}-${hash}"
)
real_store, real_layout = spack.store.STORE, spack.store.STORE.layout
opt_dir = tmpdir.join("opt")
original_store, spack.store.STORE = spack.store.STORE, spack.store.Store(str(opt_dir))
spack.store.STORE = spack.store.Store(str(opt_dir))
spack.store.STORE.layout = DirectoryLayout(str(opt_dir), path_scheme=scheme)
try:
yield spack.store
finally:
spack.store.STORE = original_store
spack.store.STORE = real_store
spack.store.STORE.layout = real_layout
@pytest.fixture(scope="function")
def install_dir_non_default_layout(tmpdir):
"""Hooks a fake install directory with a non-default layout"""
opt_dir = tmpdir.join("opt")
original_store, spack.store.STORE = spack.store.STORE, spack.store.Store(
str(opt_dir),
projections={
"all": "{name}/{version}/{architecture}-{compiler.name}-{compiler.version}-{hash}"
},
scheme = os.path.join(
"${name}", "${version}", "${architecture}-${compiler.name}-${compiler.version}-${hash}"
)
real_store, real_layout = spack.store.STORE, spack.store.STORE.layout
opt_dir = tmpdir.join("opt")
spack.store.STORE = spack.store.Store(str(opt_dir))
spack.store.STORE.layout = DirectoryLayout(str(opt_dir), path_scheme=scheme)
try:
yield spack.store
finally:
spack.store.STORE = original_store
spack.store.STORE = real_store
spack.store.STORE.layout = real_layout
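The two fixture generations describe the same layouts through different APIs: a projections dict on the new-style Store versus a DirectoryLayout path_scheme template. A quick sketch of how such a template expands for a concrete spec (the spec string is illustrative; any concrete spec works):

import spack.concretize

s = spack.concretize.concretize_one("zlib@1.3")
template = "{name}/{version}/{architecture}-{compiler.name}-{compiler.version}-{hash}"
# Spec.format expands the same placeholders the non-default layout uses,
# e.g. zlib/1.3/linux-ubuntu22.04-x86_64-gcc-12.3.0-<hash>
print(s.format(template))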
@pytest.fixture(scope="function")
@@ -206,9 +213,8 @@ def test_default_rpaths_create_install_default_layout(temporary_mirror_dir):
Test the creation and installation of buildcaches with default rpaths
into the default directory layout scheme.
"""
gspec = spack.concretize.concretize_one("garply")
cspec = spack.concretize.concretize_one("corge")
sy_spec = spack.concretize.concretize_one("symly")
gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
sy_spec = Spec("symly").concretized()
# Install 'corge' without using a cache
install_cmd("--no-cache", cspec.name)
@@ -255,9 +261,9 @@ def test_default_rpaths_install_nondefault_layout(temporary_mirror_dir):
Test the creation and installation of buildcaches with default rpaths
into the non-default directory layout scheme.
"""
cspec = spack.concretize.concretize_one("corge")
cspec = Spec("corge").concretized()
# This guy tests for symlink relocation
sy_spec = spack.concretize.concretize_one("symly")
sy_spec = Spec("symly").concretized()
# Install some packages with dependent packages
# test install in non-default install path scheme
@@ -278,8 +284,7 @@ def test_relative_rpaths_install_default_layout(temporary_mirror_dir):
Test the creation and installation of buildcaches with relative
rpaths into the default directory layout scheme.
"""
gspec = spack.concretize.concretize_one("garply")
cspec = spack.concretize.concretize_one("corge")
gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
# Install buildcache created with relativized rpaths
buildcache_cmd("install", "-uf", cspec.name)
@@ -308,7 +313,7 @@ def test_relative_rpaths_install_nondefault(temporary_mirror_dir):
Test the installation of buildcaches with relativized rpaths
into the non-default directory layout scheme.
"""
cspec = spack.concretize.concretize_one("corge")
cspec = Spec("corge").concretized()
# Test install in non-default install path scheme and relative path
buildcache_cmd("install", "-uf", cspec.name)
@@ -361,8 +366,7 @@ def test_built_spec_cache(temporary_mirror_dir):
that cache from a buildcache index."""
buildcache_cmd("list", "-a", "-l")
gspec = spack.concretize.concretize_one("garply")
cspec = spack.concretize.concretize_one("corge")
gspec, cspec = Spec("garply").concretized(), Spec("corge").concretized()
for s in [gspec, cspec]:
results = bindist.get_mirrors_for_spec(s)
@@ -385,7 +389,7 @@ def test_spec_needs_rebuild(monkeypatch, tmpdir):
mirror_dir = tmpdir.join("mirror_dir")
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
s = spack.concretize.concretize_one("libdwarf")
s = Spec("libdwarf").concretized()
# Install a package
install_cmd(s.name)
@@ -414,7 +418,7 @@ def test_generate_index_missing(monkeypatch, tmpdir, mutable_config):
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
spack.config.set("mirrors", {"test": mirror_url})
s = spack.concretize.concretize_one("libdwarf")
s = Spec("libdwarf").concretized()
# Install a package
install_cmd("--no-cache", s.name)
@@ -496,40 +500,74 @@ def mock_list_url(url, recursive=False):
assert f"Encountered problem listing packages at {url}" in capfd.readouterr().err
def test_update_sbang(tmp_path, temporary_mirror, mock_fetch, install_mockery):
    """Test relocation of the sbang shebang line in a package script"""
    s = spack.concretize.concretize_one("old-sbang")
    PackageInstaller([s.package]).install()
    old_prefix, old_sbang_shebang = s.prefix, sbang.sbang_shebang_line()
    old_contents = f"""\
{old_sbang_shebang}
#!/usr/bin/env python3
{s.prefix.bin}
"""
    with open(os.path.join(s.prefix.bin, "script.sh"), encoding="utf-8") as f:
        assert f.read() == old_contents
    # Create a buildcache with the installed spec.
    buildcache_cmd("push", "--update-index", "--unsigned", temporary_mirror, f"/{s.dag_hash()}")
    # Switch the store to the new install tree locations
    with spack.store.use_store(str(tmp_path)):
        s._prefix = None  # clear the cached old prefix
        new_prefix, new_sbang_shebang = s.prefix, sbang.sbang_shebang_line()
        assert old_prefix != new_prefix
        assert old_sbang_shebang != new_sbang_shebang
        PackageInstaller([s.package], cache_only=True, unsigned=True).install()
        # Check that the sbang line refers to the new install tree
        new_contents = f"""\
{sbang.sbang_shebang_line()}
#!/usr/bin/env python3
{s.prefix.bin}
"""
        with open(os.path.join(s.prefix.bin, "script.sh"), encoding="utf-8") as f:
            assert f.read() == new_contents
@pytest.mark.usefixtures("mock_fetch", "install_mockery")
def test_update_sbang(tmpdir, temporary_mirror):
    """Test the creation and installation of buildcaches with default rpaths
    into the non-default directory layout scheme, triggering an update of the
    sbang.
    """
    spec_str = "old-sbang"
    # Concretize a package with some old-fashioned sbang lines.
    old_spec = Spec(spec_str).concretized()
    old_spec_hash_str = "/{0}".format(old_spec.dag_hash())
    # Need a fake mirror with *function* scope.
    mirror_dir = temporary_mirror
    # Assume all commands will concretize old_spec the same way.
    install_cmd("--no-cache", old_spec.name)
    # Create a buildcache with the installed spec.
    buildcache_cmd("push", "-u", mirror_dir, old_spec_hash_str)
    # Need to force an update of the buildcache index
    buildcache_cmd("update-index", mirror_dir)
    # Uninstall the original package.
    uninstall_cmd("-y", old_spec_hash_str)
    newtree_dir = tmpdir.join("newtree")
    with spack.store.use_store(str(newtree_dir)):
        new_spec = Spec("old-sbang").concretized()
        assert new_spec.dag_hash() == old_spec.dag_hash()
        # Install package from buildcache
        buildcache_cmd("install", "-u", "-f", new_spec.name)
        # Continue blowing away caches
        bindist.clear_spec_cache()
        spack.stage.purge()
        # test that the sbang was updated by the move
        sbang_style_1_expected = """{0}
#!/usr/bin/env python
{1}
""".format(
            sbang.sbang_shebang_line(), new_spec.prefix.bin
        )
        sbang_style_2_expected = """{0}
#!/usr/bin/env python
{1}
""".format(
            sbang.sbang_shebang_line(), new_spec.prefix.bin
        )
        installed_script_style_1_path = new_spec.prefix.bin.join("sbang-style-1.sh")
        assert (
            sbang_style_1_expected
            == open(str(installed_script_style_1_path), encoding="utf-8").read()
        )
        installed_script_style_2_path = new_spec.prefix.bin.join("sbang-style-2.sh")
        assert (
            sbang_style_2_expected
            == open(str(installed_script_style_2_path), encoding="utf-8").read()
        )
        uninstall_cmd("-y", "/%s" % new_spec.dag_hash())
@pytest.mark.skipif(

View File

@@ -220,12 +220,14 @@ def test_source_is_disabled(mutable_config):
# The source is not explicitly enabled or disabled, so the following
# call should raise to skip using it for bootstrapping
assert not spack.bootstrap.core.source_is_enabled(conf)
with pytest.raises(ValueError):
spack.bootstrap.core.source_is_enabled_or_raise(conf)
# Try to explicitly disable the source and verify that the behavior
# is the same as above
spack.config.add("bootstrap:trusted:{0}:{1}".format(conf["name"], False))
assert not spack.bootstrap.core.source_is_enabled(conf)
with pytest.raises(ValueError):
spack.bootstrap.core.source_is_enabled_or_raise(conf)
@pytest.mark.regression("45247")

View File

@@ -8,15 +8,15 @@
import pytest
import spack.binary_distribution as bd
import spack.concretize
import spack.mirrors.mirror
import spack.spec
from spack.installer import PackageInstaller
pytestmark = pytest.mark.not_on_windows("does not run on windows")
def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmp_path):
spec = spack.concretize.concretize_one("trivial-install-test-package")
spec = spack.spec.Spec("trivial-install-test-package").concretized()
PackageInstaller([spec.package], fake=True).install()
specs = [spec]

View File

@@ -16,7 +16,6 @@
import spack.build_environment
import spack.compiler
import spack.compilers
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.package_base
@@ -164,7 +163,8 @@ def test_static_to_shared_library(build_environment):
@pytest.mark.regression("8345")
@pytest.mark.usefixtures("config", "mock_packages")
def test_cc_not_changed_by_modules(monkeypatch, working_env):
s = spack.concretize.concretize_one("cmake")
s = spack.spec.Spec("cmake")
s.concretize()
pkg = s.package
def _set_wrong_cc(x):
@@ -184,7 +184,7 @@ def test_setup_dependent_package_inherited_modules(
working_env, mock_packages, install_mockery, mock_fetch
):
# This will raise on regression
s = spack.concretize.concretize_one("cmake-client-inheritor")
s = spack.spec.Spec("cmake-client-inheritor").concretized()
PackageInstaller([s.package]).install()
@@ -277,7 +277,7 @@ def platform_pathsep(pathlist):
return convert_to_platform_path(pathlist)
# Monkeypatch a pkg.compiler.environment with the required modifications
pkg = spack.concretize.concretize_one("cmake").package
pkg = spack.spec.Spec("cmake").concretized().package
monkeypatch.setattr(pkg.compiler, "environment", modifications)
# Trigger the modifications
spack.build_environment.setup_package(pkg, False)
@@ -301,7 +301,7 @@ def custom_env(pkg, env):
env.prepend_path("PATH", test_path)
env.append_flags("ENV_CUSTOM_CC_FLAGS", "--custom-env-flag1")
pkg = spack.concretize.concretize_one("cmake").package
pkg = spack.spec.Spec("cmake").concretized().package
monkeypatch.setattr(pkg.compiler, "setup_custom_environment", custom_env)
spack.build_environment.setup_package(pkg, False)
@@ -322,7 +322,7 @@ def test_external_config_env(mock_packages, mutable_config, working_env):
}
spack.config.set("packages:cmake", cmake_config)
cmake_client = spack.concretize.concretize_one("cmake-client")
cmake_client = spack.spec.Spec("cmake-client").concretized()
spack.build_environment.setup_package(cmake_client.package, False)
assert os.environ["TEST_ENV_VAR_SET"] == "yes it's set"
@@ -330,7 +330,8 @@ def test_external_config_env(mock_packages, mutable_config, working_env):
@pytest.mark.regression("9107")
def test_spack_paths_before_module_paths(config, mock_packages, monkeypatch, working_env):
s = spack.concretize.concretize_one("cmake")
s = spack.spec.Spec("cmake")
s.concretize()
pkg = s.package
module_path = os.path.join("path", "to", "module")
@@ -351,7 +352,8 @@ def _set_wrong_cc(x):
def test_package_inheritance_module_setup(config, mock_packages, working_env):
s = spack.concretize.concretize_one("multimodule-inheritance")
s = spack.spec.Spec("multimodule-inheritance")
s.concretize()
pkg = s.package
spack.build_environment.setup_package(pkg, False)
@@ -385,7 +387,8 @@ def test_wrapper_variables(
not in cuda_include_dirs
)
root = spack.concretize.concretize_one("dt-diamond")
root = spack.spec.Spec("dt-diamond")
root.concretize()
for s in root.traverse():
s.prefix = "/{0}-prefix/".format(s.name)
@@ -450,7 +453,7 @@ def test_external_prefixes_last(mutable_config, mock_packages, working_env, monk
"""
)
spack.config.set("packages", cfg_data)
top = spack.concretize.concretize_one("dt-diamond")
top = spack.spec.Spec("dt-diamond").concretized()
def _trust_me_its_a_dir(path):
return True
@@ -497,7 +500,8 @@ def test_parallel_false_is_not_propagating(default_mock_concretization):
)
def test_setting_dtags_based_on_config(config_setting, expected_flag, config, mock_packages):
# Pick a random package to be able to set compiler's variables
s = spack.concretize.concretize_one("cmake")
s = spack.spec.Spec("cmake")
s.concretize()
pkg = s.package
env = EnvironmentModifications()
@@ -529,7 +533,7 @@ def setup_dependent_package(module, dependent_spec):
assert dependent_module.ninja is not None
dependent_spec.package.test_attr = True
externaltool = spack.concretize.concretize_one("externaltest")
externaltool = spack.spec.Spec("externaltest").concretized()
monkeypatch.setattr(
externaltool["externaltool"].package, "setup_dependent_package", setup_dependent_package
)
@@ -724,7 +728,7 @@ def test_build_system_globals_only_set_on_root_during_build(default_mock_concret
But obviously it can lead to very hard to find bugs... We should get rid of those globals and
define them instead as a property on the package instance.
"""
root = spack.concretize.concretize_one("mpileaks")
root = spack.spec.Spec("mpileaks").concretized()
build_variables = ("std_cmake_args", "std_meson_args", "std_pip_args")
# See todo above, we clear out any properties that may have been set by the previous test.

View File

@@ -15,13 +15,12 @@
import spack.build_systems.autotools
import spack.build_systems.cmake
import spack.builder
import spack.concretize
import spack.environment
import spack.error
import spack.paths
import spack.platforms
import spack.platforms.test
from spack.build_environment import ChildError, MakeExecutable, setup_package
from spack.build_environment import ChildError, setup_package
from spack.installer import PackageInstaller
from spack.spec import Spec
from spack.util.executable import which
@@ -30,12 +29,10 @@
@pytest.fixture()
def concretize_and_setup(default_mock_concretization, monkeypatch):
def concretize_and_setup(default_mock_concretization):
def _func(spec_str):
s = default_mock_concretization(spec_str)
setup_package(s.package, False)
monkeypatch.setattr(s.package.module, "make", MakeExecutable("make", jobs=1))
monkeypatch.setattr(s.package.module, "ninja", MakeExecutable("ninja", jobs=1))
return s
return _func
@@ -147,7 +144,7 @@ def test_none_is_allowed(self, default_mock_concretization):
def test_libtool_archive_files_are_deleted_by_default(self, mutable_database):
# Install a package that creates a mock libtool archive
s = spack.concretize.concretize_one("libtool-deletion")
s = Spec("libtool-deletion").concretized()
PackageInstaller([s.package], explicit=True).install()
# Assert the libtool archive is not there and we have
@@ -162,7 +159,7 @@ def test_libtool_archive_files_might_be_installed_on_demand(
):
# Install a package that creates a mock libtool archive,
# patch its package to preserve the installation
s = spack.concretize.concretize_one("libtool-deletion")
s = Spec("libtool-deletion").concretized()
monkeypatch.setattr(
type(spack.builder.create(s.package)), "install_libtool_archives", True
)
@@ -176,9 +173,7 @@ def test_autotools_gnuconfig_replacement(self, mutable_database):
Tests whether only broken config.sub and config.guess are replaced with
files from working alternatives from the gnuconfig package.
"""
s = spack.concretize.concretize_one(
Spec("autotools-config-replacement +patch_config_files +gnuconfig")
)
s = Spec("autotools-config-replacement +patch_config_files +gnuconfig").concretized()
PackageInstaller([s.package]).install()
with open(os.path.join(s.prefix.broken, "config.sub"), encoding="utf-8") as f:
@@ -197,9 +192,7 @@ def test_autotools_gnuconfig_replacement_disabled(self, mutable_database):
"""
Tests whether disabling patch_config_files
"""
s = spack.concretize.concretize_one(
Spec("autotools-config-replacement ~patch_config_files +gnuconfig")
)
s = Spec("autotools-config-replacement ~patch_config_files +gnuconfig").concretized()
PackageInstaller([s.package]).install()
with open(os.path.join(s.prefix.broken, "config.sub"), encoding="utf-8") as f:
@@ -224,9 +217,8 @@ def test_autotools_gnuconfig_replacement_no_gnuconfig(self, mutable_database, mo
enabled, but gnuconfig is not listed as a direct build dependency.
"""
monkeypatch.setattr(spack.platforms.test.Test, "default", "x86_64")
s = spack.concretize.concretize_one(
Spec("autotools-config-replacement +patch_config_files ~gnuconfig")
)
s = Spec("autotools-config-replacement +patch_config_files ~gnuconfig")
s.concretize()
msg = "Cannot patch config files: missing dependencies: gnuconfig"
with pytest.raises(ChildError, match=msg):
@@ -306,7 +298,7 @@ def test_define(self, default_mock_concretization):
assert define("SINGLE", "red") == "-DSINGLE:STRING=red"
def test_define_from_variant(self):
s = spack.concretize.concretize_one("cmake-client multi=up,right ~truthy single=red")
s = Spec("cmake-client multi=up,right ~truthy single=red").concretized()
arg = s.package.define_from_variant("MULTI")
assert arg == "-DMULTI:STRING=right;up"

View File

@@ -8,9 +8,9 @@
from llnl.util.filesystem import touch
import spack.builder
import spack.concretize
import spack.paths
import spack.repo
import spack.spec
@pytest.fixture()
@@ -78,7 +78,7 @@ def builder_test_repository():
@pytest.mark.disable_clean_stage_check
def test_callbacks_and_installation_procedure(spec_str, expected_values, working_env):
"""Test the correct execution of callbacks and installation procedures for packages."""
s = spack.concretize.concretize_one(spec_str)
s = spack.spec.Spec(spec_str).concretized()
builder = spack.builder.create(s.package)
for phase_fn in builder:
phase_fn.execute()
@@ -101,7 +101,7 @@ def test_callbacks_and_installation_procedure(spec_str, expected_values, working
],
)
def test_old_style_compatibility_with_super(spec_str, method_name, expected):
s = spack.concretize.concretize_one(spec_str)
s = spack.spec.Spec(spec_str).concretized()
builder = spack.builder.create(s.package)
value = getattr(builder, method_name)()
assert value == expected
@@ -112,7 +112,7 @@ def test_old_style_compatibility_with_super(spec_str, method_name, expected):
@pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
@pytest.mark.disable_clean_stage_check
def test_build_time_tests_are_executed_from_default_builder():
s = spack.concretize.concretize_one("old-style-autotools")
s = spack.spec.Spec("old-style-autotools").concretized()
builder = spack.builder.create(s.package)
builder.pkg.run_tests = True
for phase_fn in builder:
@@ -126,7 +126,7 @@ def test_build_time_tests_are_executed_from_default_builder():
@pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
def test_monkey_patching_wrapped_pkg():
"""Confirm 'run_tests' is accessible through wrappers."""
s = spack.concretize.concretize_one("old-style-autotools")
s = spack.spec.Spec("old-style-autotools").concretized()
builder = spack.builder.create(s.package)
assert s.package.run_tests is False
assert builder.pkg.run_tests is False
@@ -141,7 +141,7 @@ def test_monkey_patching_wrapped_pkg():
@pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
def test_monkey_patching_test_log_file():
"""Confirm 'test_log_file' is accessible through wrappers."""
s = spack.concretize.concretize_one("old-style-autotools")
s = spack.spec.Spec("old-style-autotools").concretized()
builder = spack.builder.create(s.package)
s.package.tester.test_log_file = "/some/file"
@@ -154,7 +154,7 @@ def test_monkey_patching_test_log_file():
@pytest.mark.not_on_windows("Does not run on windows")
def test_install_time_test_callback(tmpdir, config, mock_packages, mock_stage):
"""Confirm able to run stand-alone test as a post-install callback."""
s = spack.concretize.concretize_one("py-test-callback")
s = spack.spec.Spec("py-test-callback").concretized()
builder = spack.builder.create(s.package)
builder.pkg.run_tests = True
s.package.tester.test_log_file = tmpdir.join("install_test.log")
@@ -174,7 +174,7 @@ def test_mixins_with_builders(working_env):
"""Tests that run_after and run_before callbacks are accumulated correctly,
when mixins are used with builders.
"""
s = spack.concretize.concretize_one("builder-and-mixins")
s = spack.spec.Spec("builder-and-mixins").concretized()
builder = spack.builder.create(s.package)
# Check that callbacks added by the mixin are in the list

View File

@@ -4,7 +4,6 @@
import pytest
import spack.concretize
import spack.deptypes as dt
import spack.installer as inst
import spack.repo
@@ -22,7 +21,8 @@ def test_build_request_errors(install_mockery):
def test_build_request_basics(install_mockery):
spec = spack.concretize.concretize_one("dependent-install")
spec = spack.spec.Spec("dependent-install")
spec.concretize()
assert spec.concrete
# Ensure key properties match expectations
@@ -39,7 +39,8 @@ def test_build_request_basics(install_mockery):
def test_build_request_strings(install_mockery):
"""Tests of BuildRequest repr and str for coverage purposes."""
# Using a package with one dependency
spec = spack.concretize.concretize_one("dependent-install")
spec = spack.spec.Spec("dependent-install")
spec.concretize()
assert spec.concrete
# Ensure key properties match expectations
@@ -71,7 +72,7 @@ def test_build_request_deptypes(
package_deptypes,
dependencies_deptypes,
):
s = spack.concretize.concretize_one("dependent-install")
s = spack.spec.Spec("dependent-install").concretized()
build_request = inst.BuildRequest(
s.package,

View File

@@ -4,7 +4,6 @@
import pytest
import spack.concretize
import spack.error
import spack.installer as inst
import spack.repo
@@ -25,7 +24,7 @@ def test_build_task_errors(install_mockery):
inst.BuildTask(pkg_cls(spec), None)
# Using a concretized package now means the request argument is checked.
spec = spack.concretize.concretize_one(spec)
spec.concretize()
assert spec.concrete
with pytest.raises(TypeError, match="is not a valid build request"):
@@ -48,7 +47,8 @@ def test_build_task_errors(install_mockery):
def test_build_task_basics(install_mockery):
spec = spack.concretize.concretize_one("dependent-install")
spec = spack.spec.Spec("dependent-install")
spec.concretize()
assert spec.concrete
# Ensure key properties match expectations
@@ -69,7 +69,8 @@ def test_build_task_basics(install_mockery):
def test_build_task_strings(install_mockery):
"""Tests of build_task repr and str for coverage purposes."""
# Using a package with one dependency
spec = spack.concretize.concretize_one("dependent-install")
spec = spack.spec.Spec("dependent-install")
spec.concretize()
assert spec.concrete
# Ensure key properties match expectations

View File

@@ -9,12 +9,13 @@
import llnl.util.filesystem as fs
import spack.ci as ci
import spack.concretize
import spack.environment as ev
import spack.error
import spack.paths as spack_paths
import spack.repo as repo
import spack.spec
import spack.util.git
from spack.spec import Spec
pytestmark = [pytest.mark.usefixtures("mock_packages")]
@@ -53,7 +54,7 @@ def test_pipeline_dag(config, tmpdir):
builder.add_package("pkg-a", dependencies=[("pkg-b", None, None), ("pkg-c", None, None)])
with repo.use_repositories(builder.root):
spec_a = spack.concretize.concretize_one("pkg-a")
spec_a = Spec("pkg-a").concretized()
key_a = ci.common.PipelineDag.key(spec_a)
key_b = ci.common.PipelineDag.key(spec_a["pkg-b"])
@@ -448,7 +449,7 @@ def test_ci_run_standalone_tests_not_installed_junit(
log_file = tmp_path / "junit.xml"
args = {
"log_file": str(log_file),
"job_spec": spack.concretize.concretize_one("printing-package"),
"job_spec": spack.spec.Spec("printing-package").concretized(),
"repro_dir": str(repro_dir),
"fail_fast": True,
}
@@ -467,7 +468,7 @@ def test_ci_run_standalone_tests_not_installed_cdash(
log_file = tmp_path / "junit.xml"
args = {
"log_file": str(log_file),
"job_spec": spack.concretize.concretize_one("printing-package"),
"job_spec": spack.spec.Spec("printing-package").concretized(),
"repro_dir": str(repro_dir),
}
@@ -500,7 +501,7 @@ def test_ci_run_standalone_tests_not_installed_cdash(
def test_ci_skipped_report(tmpdir, mock_packages, config):
"""Test explicit skipping of report as well as CI's 'package' arg."""
pkg = "trivial-smoke-test"
spec = spack.concretize.concretize_one(pkg)
spec = spack.spec.Spec(pkg).concretized()
ci_cdash = {
"url": "file://fake",
"build-group": "fake-group",

View File

@@ -10,7 +10,6 @@
import spack.bootstrap
import spack.bootstrap.core
import spack.concretize
import spack.config
import spack.environment as ev
import spack.main
@@ -184,7 +183,7 @@ def test_bootstrap_mirror_metadata(mutable_config, linux_os, monkeypatch, tmpdir
"""
old_create = spack.mirrors.utils.create
monkeypatch.setattr(spack.mirrors.utils, "create", lambda p, s: old_create(p, []))
monkeypatch.setattr(spack.concretize, "concretize_one", lambda p: spack.spec.Spec(p))
monkeypatch.setattr(spack.spec.Spec, "concretized", lambda p: p)
# Create the mirror in a temporary folder
compilers = [

View File

@@ -12,7 +12,6 @@
import spack.binary_distribution
import spack.cmd.buildcache
import spack.concretize
import spack.environment as ev
import spack.error
import spack.main
@@ -20,6 +19,7 @@
import spack.spec
import spack.util.url
from spack.installer import PackageInstaller
from spack.spec import Spec
buildcache = spack.main.SpackCommand("buildcache")
install = spack.main.SpackCommand("install")
@@ -81,7 +81,7 @@ def tests_buildcache_create(install_mockery, mock_fetch, monkeypatch, tmpdir):
buildcache("push", "--unsigned", str(tmpdir), pkg)
spec = spack.concretize.concretize_one(pkg)
spec = Spec(pkg).concretized()
tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
@@ -101,7 +101,7 @@ def tests_buildcache_create_env(
buildcache("push", "--unsigned", str(tmpdir))
spec = spack.concretize.concretize_one(pkg)
spec = Spec(pkg).concretized()
tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
@@ -145,7 +145,7 @@ def test_update_key_index(
gpg("create", "Test Signing Key", "nobody@nowhere.com")
s = spack.concretize.concretize_one("libdwarf")
s = Spec("libdwarf").concretized()
# Install a package
install(s.name)
@@ -175,7 +175,7 @@ def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
mirror("add", "--unsigned", "mirror", mirror_dir.as_uri())
mirror("add", "--autopush", "--unsigned", "mirror-autopush", mirror_autopush_dir.as_uri())
s = spack.concretize.concretize_one("libdwarf")
s = Spec("libdwarf").concretized()
# Install and generate build cache index
PackageInstaller([s.package], explicit=True).install()
@@ -219,7 +219,7 @@ def verify_mirror_contents():
assert False
# Install a package and put it in the buildcache
s = spack.concretize.concretize_one(out_env_pkg)
s = Spec(out_env_pkg).concretized()
install(s.name)
buildcache("push", "-u", "-f", src_mirror_url, s.name)
@@ -329,7 +329,7 @@ def test_buildcache_create_install(
buildcache("push", "--unsigned", str(tmpdir), pkg)
spec = spack.concretize.concretize_one(pkg)
spec = Spec(pkg).concretized()
tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
@@ -450,7 +450,7 @@ def test_push_and_install_with_mirror_marked_unsigned_does_not_require_extra_fla
def test_skip_no_redistribute(mock_packages, config):
specs = list(spack.concretize.concretize_one("no-redistribute-dependent").traverse())
specs = list(Spec("no-redistribute-dependent").concretized().traverse())
filtered = spack.cmd.buildcache._skip_no_redistribute_for_public(specs)
assert not any(s.name == "no-redistribute" for s in filtered)
assert any(s.name == "no-redistribute-dependent" for s in filtered)
@@ -490,7 +490,7 @@ def test_push_without_build_deps(tmp_path, temporary_store, mock_packages, mutab
mirror("add", "--unsigned", "my-mirror", str(tmp_path))
s = spack.concretize.concretize_one("dtrun3")
s = spack.spec.Spec("dtrun3").concretized()
PackageInstaller([s.package], explicit=True, fake=True).install()
s["dtbuild3"].package.do_uninstall()

View File

@@ -7,10 +7,10 @@
import pytest
import spack.cmd.checksum
import spack.concretize
import spack.error
import spack.package_base
import spack.repo
import spack.spec
import spack.stage
import spack.util.web
from spack.main import SpackCommand
@@ -308,7 +308,7 @@ def test_checksum_url(mock_packages, config):
def test_checksum_verification_fails(default_mock_concretization, capsys, can_fetch_versions):
spec = spack.concretize.concretize_one("zlib")
spec = spack.spec.Spec("zlib").concretized()
pkg = spec.package
versions = list(pkg.versions.keys())
version_hashes = {versions[0]: "abadhash", Version("0.1"): "123456789"}

View File

@@ -18,7 +18,6 @@
import spack.ci as ci
import spack.cmd
import spack.cmd.ci
import spack.concretize
import spack.environment as ev
import spack.hash_types as ht
import spack.main
@@ -1057,7 +1056,7 @@ def test_ci_rebuild_index(
with working_dir(tmp_path):
env_cmd("create", "test", "./spack.yaml")
with ev.read("test"):
concrete_spec = spack.concretize.concretize_one("callpath")
concrete_spec = Spec("callpath").concretized()
with open(tmp_path / "spec.json", "w", encoding="utf-8") as f:
f.write(concrete_spec.to_json(hash=ht.dag_hash))
@@ -1178,10 +1177,12 @@ def test_ci_generate_read_broken_specs_url(
ci_base_environment,
):
"""Verify that `broken-specs-url` works as intended"""
spec_a = spack.concretize.concretize_one("pkg-a")
spec_a = Spec("pkg-a")
spec_a.concretize()
a_dag_hash = spec_a.dag_hash()
spec_flattendeps = spack.concretize.concretize_one("flatten-deps")
spec_flattendeps = Spec("flatten-deps")
spec_flattendeps.concretize()
flattendeps_dag_hash = spec_flattendeps.dag_hash()
broken_specs_url = tmp_path.as_uri()
@@ -1532,7 +1533,8 @@ def dynamic_mapping_setup(tmpdir):
"""
)
spec_a = spack.concretize.concretize_one("pkg-a")
spec_a = Spec("pkg-a")
spec_a.concretize()
return gitlab_generator.get_job_name(spec_a)

View File

@@ -10,8 +10,10 @@
import spack.caches
import spack.cmd.clean
import spack.environment as ev
import spack.main
import spack.package_base
import spack.spec
import spack.stage
import spack.store
@@ -67,6 +69,20 @@ def test_function_calls(command_line, effects, mock_calls_for_clean):
assert mock_calls_for_clean[name] == (1 if name in effects else 0)
def test_env_aware_clean(mock_stage, install_mockery, mutable_mock_env_path, monkeypatch):
e = ev.create("test", with_view=False)
e.add("mpileaks")
e.concretize()
def fail(*args, **kwargs):
raise Exception("This should not have been called")
monkeypatch.setattr(spack.spec.Spec, "concretize", fail)
with e:
clean("mpileaks")
def test_remove_python_cache(tmpdir, monkeypatch):
cache_files = ["file1.pyo", "file2.pyc"]
source_file = "file1.py"

View File

@@ -8,12 +8,12 @@
import llnl.util.filesystem as fs
import spack.concretize
import spack.config
import spack.database
import spack.environment as ev
import spack.main
import spack.schema.config
import spack.spec
import spack.store
import spack.util.spack_yaml as syaml
@@ -593,7 +593,8 @@ def test_config_prefer_upstream(
prepared_db = spack.database.Database(mock_db_root, layout=gen_mock_layout("/a/"))
for spec in ["hdf5 +mpi", "hdf5 ~mpi", "boost+debug~icu+graph", "dependency-install", "patch"]:
dep = spack.concretize.concretize_one(spec)
dep = spack.spec.Spec(spec)
dep.concretize()
prepared_db.add(dep)
downstream_db_root = str(tmpdir_factory.mktemp("mock_downstream_db_root"))

View File

@@ -4,7 +4,6 @@
import pytest
import spack.concretize
import spack.spec
import spack.store
from spack.enums import InstallRecordStatus
@@ -67,8 +66,8 @@ def test_deprecate_deps(mock_packages, mock_archive, mock_fetch, install_mockery
install("libdwarf@20130729 ^libelf@0.8.13")
install("libdwarf@20130207 ^libelf@0.8.10")
new_spec = spack.concretize.concretize_one("libdwarf@20130729^libelf@0.8.13")
old_spec = spack.concretize.concretize_one("libdwarf@20130207^libelf@0.8.10")
new_spec = spack.spec.Spec("libdwarf@20130729^libelf@0.8.13").concretized()
old_spec = spack.spec.Spec("libdwarf@20130207^libelf@0.8.10").concretized()
all_installed = spack.store.STORE.db.query()
@@ -108,12 +107,12 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
install("libelf@0.8.12")
install("libelf@0.8.10")
deprecated_spec = spack.concretize.concretize_one("libelf@0.8.10")
deprecated_spec = spack.spec.Spec("libelf@0.8.10").concretized()
deprecate("-y", "libelf@0.8.10", "libelf@0.8.12")
deprecator = spack.store.STORE.db.deprecator(deprecated_spec)
assert deprecator == spack.concretize.concretize_one("libelf@0.8.12")
assert deprecator == spack.spec.Spec("libelf@0.8.12").concretized()
deprecate("-y", "libelf@0.8.10", "libelf@0.8.13")
@@ -123,7 +122,7 @@ def test_deprecate_already_deprecated(mock_packages, mock_archive, mock_fetch, i
assert len(all_available) == 3
deprecator = spack.store.STORE.db.deprecator(deprecated_spec)
assert deprecator == spack.concretize.concretize_one("libelf@0.8.13")
assert deprecator == spack.spec.Spec("libelf@0.8.13").concretized()
def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_mockery):
@@ -133,9 +132,9 @@ def test_deprecate_deprecator(mock_packages, mock_archive, mock_fetch, install_m
install("libelf@0.8.12")
install("libelf@0.8.10")
first_deprecated_spec = spack.concretize.concretize_one("libelf@0.8.10")
second_deprecated_spec = spack.concretize.concretize_one("libelf@0.8.12")
final_deprecator = spack.concretize.concretize_one("libelf@0.8.13")
first_deprecated_spec = spack.spec.Spec("libelf@0.8.10").concretized()
second_deprecated_spec = spack.spec.Spec("libelf@0.8.12").concretized()
final_deprecator = spack.spec.Spec("libelf@0.8.13").concretized()
deprecate("-y", "libelf@0.8.10", "libelf@0.8.12")
@@ -165,7 +164,7 @@ def test_concretize_deprecated(mock_packages, mock_archive, mock_fetch, install_
spec = spack.spec.Spec("libelf@0.8.10")
with pytest.raises(spack.spec.SpecDeprecatedError):
spack.concretize.concretize_one(spec)
spec.concretize()
@pytest.mark.usefixtures("mock_packages", "mock_archive", "mock_fetch", "install_mockery")

View File

@@ -8,7 +8,6 @@
import llnl.util.filesystem as fs
import spack.concretize
import spack.environment as ev
import spack.error
import spack.repo
@@ -24,9 +23,7 @@
def test_dev_build_basics(tmpdir, install_mockery):
spec = spack.concretize.concretize_one(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
assert "dev_path" in spec.variants
@@ -44,9 +41,7 @@ def test_dev_build_basics(tmpdir, install_mockery):
def test_dev_build_before(tmpdir, install_mockery):
spec = spack.concretize.concretize_one(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
with tmpdir.as_cwd():
with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -62,9 +57,7 @@ def test_dev_build_before(tmpdir, install_mockery):
def test_dev_build_until(tmpdir, install_mockery):
spec = spack.concretize.concretize_one(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
with tmpdir.as_cwd():
with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -82,9 +75,7 @@ def test_dev_build_until(tmpdir, install_mockery):
def test_dev_build_until_last_phase(tmpdir, install_mockery):
# Test that we ignore the last_phase argument if it is already last
spec = spack.concretize.concretize_one(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
with tmpdir.as_cwd():
with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -102,9 +93,7 @@ def test_dev_build_until_last_phase(tmpdir, install_mockery):
def test_dev_build_before_until(tmpdir, install_mockery):
spec = spack.concretize.concretize_one(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}")
)
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
with tmpdir.as_cwd():
with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -140,9 +129,8 @@ def test_dev_build_drop_in(tmpdir, mock_packages, monkeypatch, install_mockery,
def test_dev_build_fails_already_installed(tmpdir, install_mockery):
spec = spack.concretize.concretize_one(
spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
)
spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
spec.concretize()
with tmpdir.as_cwd():
with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -180,25 +168,12 @@ def test_dev_build_fails_no_version(mock_packages):
assert "dev-build spec must have a single, concrete version" in output
def test_dev_build_can_parse_path_with_at_symbol(tmpdir, install_mockery):
special_char_dir = tmpdir.mkdir("tmp@place")
spec = spack.spec.Spec(f'dev-build-test-install@0.0.0 dev_path="{special_char_dir}"')
spec.concretize()
with special_char_dir.as_cwd():
with open(spec.package.filename, "w", encoding="utf-8") as f:
f.write(spec.package.original_string)
dev_build("dev-build-test-install@0.0.0")
assert spec.package.filename in os.listdir(spec.prefix)
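This test pins down a parsing detail worth keeping in mind: an "@" inside dev_path must be quoted so the spec parser does not read it as a version separator. A minimal sketch (the path is invented):

from spack.spec import Spec

# Quoting keeps the "@" in the path out of the version grammar.
s = Spec('dev-build-test-install@0.0.0 dev_path="/tmp/build@location"')
assert s.variants["dev_path"].value == "/tmp/build@location"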
def test_dev_build_env(tmpdir, install_mockery, mutable_mock_env_path):
"""Test Spack does dev builds for packages in develop section of env."""
# setup dev-build-test-install package for dev build
build_dir = tmpdir.mkdir("build")
spec = spack.concretize.concretize_one(
spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
)
spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
spec.concretize()
with build_dir.as_cwd():
with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -233,9 +208,8 @@ def test_dev_build_env_with_vars(tmpdir, install_mockery, mutable_mock_env_path,
"""Test Spack does dev builds for packages in develop section of env (path with variables)."""
# setup dev-build-test-install package for dev build
build_dir = tmpdir.mkdir("build")
spec = spack.concretize.concretize_one(
spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={build_dir}")
)
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={build_dir}")
spec.concretize()
# store the build path in an environment variable that will be used in the environment
monkeypatch.setenv("CUSTOM_BUILD_PATH", build_dir)
@@ -272,9 +246,8 @@ def test_dev_build_env_version_mismatch(tmpdir, install_mockery, mutable_mock_en
"""Test Spack constraints concretization by develop specs."""
# setup dev-build-test-install package for dev build
build_dir = tmpdir.mkdir("build")
spec = spack.concretize.concretize_one(
spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
)
spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % tmpdir)
spec.concretize()
with build_dir.as_cwd():
with open(spec.package.filename, "w", encoding="utf-8") as f:
@@ -354,8 +327,8 @@ def test_dev_build_multiple(tmpdir, install_mockery, mutable_mock_env_path, mock
with ev.read("test"):
# Do concretization inside environment for dev info
# These specs are the source of truth to compare against the installs
leaf_spec = spack.concretize.concretize_one(leaf_spec)
root_spec = spack.concretize.concretize_one(root_spec)
leaf_spec.concretize()
root_spec.concretize()
# Do install
install()
@@ -401,8 +374,8 @@ def test_dev_build_env_dependency(tmpdir, install_mockery, mock_fetch, mutable_m
# concretize in the environment to get the dev build info
# equivalent to setting dev_build and dev_path variants
# on all specs above
spec = spack.concretize.concretize_one(spec)
dep_spec = spack.concretize.concretize_one(dep_spec)
spec.concretize()
dep_spec.concretize()
install()
# Ensure that both specs installed properly
@@ -426,9 +399,8 @@ def test_dev_build_rebuild_on_source_changes(
"""
# setup dev-build-test-install package for dev build
build_dir = tmpdir.mkdir("build")
spec = spack.concretize.concretize_one(
spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
)
spec = spack.spec.Spec("dev-build-test-install@0.0.0 dev_path=%s" % build_dir)
spec.concretize()
def reset_string():
with build_dir.as_cwd():

View File

@@ -8,7 +8,6 @@
import llnl.util.filesystem as fs
import spack.concretize
import spack.config
import spack.environment as ev
import spack.package_base
@@ -139,8 +138,7 @@ def check_path(stage, dest):
self.check_develop(e, spack.spec.Spec("mpich@=1.0"), path)
# Check modifications actually worked
result = spack.concretize.concretize_one("mpich@1.0")
assert result.satisfies("dev_path=%s" % abspath)
assert spack.spec.Spec("mpich@1.0").concretized().satisfies("dev_path=%s" % abspath)
def test_develop_canonicalize_path_no_args(self, monkeypatch):
env("create", "test")
@@ -167,8 +165,7 @@ def check_path(stage, dest):
self.check_develop(e, spack.spec.Spec("mpich@=1.0"), path)
# Check modifications actually worked
result = spack.concretize.concretize_one("mpich@1.0")
assert result.satisfies("dev_path=%s" % abspath)
assert spack.spec.Spec("mpich@1.0").concretized().satisfies("dev_path=%s" % abspath)
def _git_commit_list(git_repo_dir):
@@ -193,7 +190,7 @@ def test_develop_full_git_repo(
spack.package_base.PackageBase, "git", "file://%s" % repo_path, raising=False
)
spec = spack.concretize.concretize_one("git-test-commit@1.2")
spec = spack.spec.Spec("git-test-commit@1.2").concretized()
try:
spec.package.do_stage()
commits = _git_commit_list(spec.package.stage[0].source_path)
@@ -216,22 +213,3 @@ def test_develop_full_git_repo(
develop_dir = spec.variants["dev_path"].value
commits = _git_commit_list(develop_dir)
assert len(commits) > 1
def test_concretize_dev_path_with_at_symbol_in_env(mutable_mock_env_path, tmpdir, mock_packages):
spec_like = "develop-test@develop"
develop_dir = tmpdir.mkdir("build@location")
env("create", "test_at_sym")
with ev.read("test_at_sym") as e:
add(spec_like)
develop(f"--path={develop_dir}", spec_like)
e.concretize()
result = e.concrete_roots()
assert len(result) == 1
cspec = result[0]
assert cspec.satisfies(spec_like), cspec
assert cspec.is_develop, cspec
assert develop_dir in cspec.variants["dev_path"], cspec

View File

@@ -5,9 +5,9 @@
import pytest
import spack.cmd.diff
import spack.concretize
import spack.main
import spack.repo
import spack.spec
import spack.util.spack_json as sjson
from spack.test.conftest import create_test_repo
@@ -133,8 +133,8 @@ def test_repo(_create_test_repo, monkeypatch, mock_stage):
def test_diff_ignore(test_repo):
specA = spack.concretize.concretize_one("p1+usev1")
specB = spack.concretize.concretize_one("p1~usev1")
specA = spack.spec.Spec("p1+usev1").concretized()
specB = spack.spec.Spec("p1~usev1").concretized()
c1 = spack.cmd.diff.compare_specs(specA, specB, to_string=False)
@@ -154,8 +154,8 @@ def find(function_list, name, args):
# Check ignoring changes on multiple packages
specA = spack.concretize.concretize_one("p1+usev1 ^p3+p3var")
specA = spack.concretize.concretize_one("p1~usev1 ^p3~p3var")
specA = spack.spec.Spec("p1+usev1 ^p3+p3var").concretized()
specA = spack.spec.Spec("p1~usev1 ^p3~p3var").concretized()
c3 = spack.cmd.diff.compare_specs(specA, specB, to_string=False)
assert find(c3["a_not_b"], "variant_value", ["p3", "p3var"])
@@ -168,8 +168,8 @@ def find(function_list, name, args):
def test_diff_cmd(install_mockery, mock_fetch, mock_archive, mock_packages):
"""Test that we can install two packages and diff them"""
specA = spack.concretize.concretize_one("mpileaks")
specB = spack.concretize.concretize_one("mpileaks+debug")
specA = spack.spec.Spec("mpileaks").concretized()
specB = spack.spec.Spec("mpileaks+debug").concretized()
# Specs should be the same as themselves
c = spack.cmd.diff.compare_specs(specA, specA, to_string=True)

View File

@@ -19,7 +19,6 @@
from llnl.util.symlink import readlink
import spack.cmd.env
import spack.concretize
import spack.config
import spack.environment as ev
import spack.environment.depfile as depfile
@@ -958,7 +957,7 @@ def test_lockfile_spliced_specs(environment_from_manifest, install_mockery):
"""Test that an environment can round-trip a spliced spec."""
# Create a local install for zmpi to splice in
# Default concretization is not using zmpi
zmpi = spack.concretize.concretize_one("zmpi")
zmpi = spack.spec.Spec("zmpi").concretized()
PackageInstaller([zmpi.package], fake=True).install()
e1 = environment_from_manifest(
@@ -1321,43 +1320,39 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
with e:
# List of requirements, flip a variant
config("change", "packages:mpich:require:~debug")
test_spec = spack.concretize.concretize_one("mpich")
test_spec = spack.spec.Spec("mpich").concretized()
assert test_spec.satisfies("@3.0.2~debug")
# List of requirements, change the version (in a different scope)
config("change", "packages:mpich:require:@3.0.3")
test_spec = spack.concretize.concretize_one("mpich")
test_spec = spack.spec.Spec("mpich").concretized()
assert test_spec.satisfies("@3.0.3")
# "require:" as a single string, also try specifying
# a spec string that requires enclosing in quotes as
# part of the config path
config("change", 'packages:libelf:require:"@0.8.12:"')
spack.concretize.concretize_one("libelf@0.8.12")
spack.spec.Spec("libelf@0.8.12").concretized()
# No need for assert, if there wasn't a failure, we
# changed the requirement successfully.
# Use change to add a requirement for a package that
# has no requirements defined
config("change", "packages:fftw:require:+mpi")
test_spec = spack.concretize.concretize_one("fftw")
test_spec = spack.spec.Spec("fftw").concretized()
assert test_spec.satisfies("+mpi")
config("change", "packages:fftw:require:~mpi")
test_spec = spack.concretize.concretize_one("fftw")
test_spec = spack.spec.Spec("fftw").concretized()
assert test_spec.satisfies("~mpi")
config("change", "packages:fftw:require:@1.0")
test_spec = spack.concretize.concretize_one("fftw")
test_spec = spack.spec.Spec("fftw").concretized()
assert test_spec.satisfies("@1.0~mpi")
# Use "--match-spec" to change one spec in a "one_of"
# list
config("change", "packages:bowtie:require:@1.2.2", "--match-spec", "@1.2.0")
# confirm that we can concretize to either value
spack.concretize.concretize_one("bowtie@1.3.0")
spack.concretize.concretize_one("bowtie@1.2.2")
# confirm that we cannot concretize to the old value
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
spack.concretize.concretize_one("bowtie@1.2.0")
spack.spec.Spec("bowtie@1.3.0").concretize()
spack.spec.Spec("bowtie@1.2.2").concretized()
def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutable_config):
@@ -1372,8 +1367,8 @@ def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutab
with ev.Environment(tmp_path):
config("change", "packages:mpich:require:~debug")
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
spack.concretize.concretize_one("mpich+debug")
spack.concretize.concretize_one("mpich~debug")
spack.spec.Spec("mpich+debug").concretized()
spack.spec.Spec("mpich~debug").concretized()
# Now check that we raise an error if we need to add a require: constraint
# when preexisting config manually specified it as a singular spec
@@ -1387,7 +1382,7 @@ def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutab
"""
)
with ev.Environment(tmp_path):
assert spack.concretize.concretize_one("mpich").satisfies("@3.0.3")
assert spack.spec.Spec("mpich").concretized().satisfies("@3.0.3")
with pytest.raises(spack.error.ConfigError, match="not a list"):
config("change", "packages:mpich:require:~debug")
@@ -1695,7 +1690,7 @@ def test_stage(mock_stage, mock_fetch, install_mockery):
root = str(mock_stage)
def check_stage(spec):
spec = spack.concretize.concretize_one(spec)
spec = Spec(spec).concretized()
for dep in spec.traverse():
stage_name = f"{stage_prefix}{dep.name}-{dep.version}-{dep.dag_hash()}"
assert os.path.isdir(os.path.join(root, stage_name))
@@ -1796,7 +1791,7 @@ def test_indirect_build_dep(tmp_path):
with spack.repo.use_repositories(builder.root):
x_spec = Spec("x")
x_concretized = spack.concretize.concretize_one(x_spec)
x_concretized = x_spec.concretized()
_env_create("test", with_view=False)
e = ev.read("test")
@@ -1829,10 +1824,10 @@ def test_store_different_build_deps(tmp_path):
with spack.repo.use_repositories(builder.root):
y_spec = Spec("y ^z@3")
y_concretized = spack.concretize.concretize_one(y_spec)
y_concretized = y_spec.concretized()
x_spec = Spec("x ^z@2")
x_concretized = spack.concretize.concretize_one(x_spec)
x_concretized = x_spec.concretized()
# Even though x chose a different 'z', the y it chooses should be identical
# *aside* from the dependency on 'z'. The dag_hash() will show the difference
@@ -2125,7 +2120,15 @@ def configure_reuse(reuse_mode, combined_env) -> Optional[ev.Environment]:
"from_environment_raise",
],
)
def test_env_include_concrete_reuse(do_not_check_runtimes_on_reuse, reuse_mode):
def test_env_include_concrete_reuse(monkeypatch, reuse_mode):
# The mock packages do not use the gcc-runtime
def mock_has_runtime_dependencies(*args, **kwargs):
return True
monkeypatch.setattr(
spack.solver.asp, "_has_runtime_dependencies", mock_has_runtime_dependencies
)
# The default mpi version is 3.x provided by mpich in the mock repo.
# This test verifies that concretizing with an included concrete
# environment with "concretizer:reuse:true" the included

View File

@@ -5,18 +5,16 @@
import pytest
import spack.concretize
from spack.installer import PackageInstaller
from spack.main import SpackCommand, SpackCommandError
from spack.spec import Spec
extensions = SpackCommand("extensions")
@pytest.fixture
def python_database(mock_packages, mutable_database):
specs = [
spack.concretize.concretize_one(s) for s in ["python", "py-extension1", "py-extension2"]
]
specs = [Spec(s).concretized() for s in ["python", "py-extension1", "py-extension2"]]
PackageInstaller([s.package for s in specs], explicit=True, fake=True).install()
yield
@@ -24,7 +22,7 @@ def python_database(mock_packages, mutable_database):
@pytest.mark.not_on_windows("All Fetchers Failed")
@pytest.mark.db
def test_extensions(mock_packages, python_database, capsys):
ext2 = spack.concretize.concretize_one("py-extension2")
ext2 = Spec("py-extension2").concretized()
def check_output(ni):
with capsys.disabled():

View File

@@ -12,13 +12,13 @@
import spack.cmd as cmd
import spack.cmd.find
import spack.concretize
import spack.environment as ev
import spack.repo
import spack.store
import spack.user_environment as uenv
from spack.enums import InstallRecordStatus
from spack.main import SpackCommand
from spack.spec import Spec
from spack.test.conftest import create_test_repo
from spack.test.utilities import SpackCommandArgs
from spack.util.pattern import Bunch
@@ -201,8 +201,7 @@ def test_find_json_deps(database):
@pytest.mark.db
def test_display_json(database, capsys):
specs = [
spack.concretize.concretize_one(s)
for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
Spec(s).concretized() for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
]
cmd.display_specs_as_json(specs)
@@ -217,8 +216,7 @@ def test_display_json(database, capsys):
@pytest.mark.db
def test_display_json_deps(database, capsys):
specs = [
spack.concretize.concretize_one(s)
for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
Spec(s).concretized() for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
]
cmd.display_specs_as_json(specs, deps=True)
@@ -277,7 +275,7 @@ def test_find_format_deps(database, config):
def test_find_format_deps_paths(database, config):
output = find("-dp", "--format", "{name}-{version}", "mpileaks", "^zmpi")
spec = spack.concretize.concretize_one("mpileaks ^zmpi")
spec = Spec("mpileaks ^zmpi").concretized()
prefixes = [s.prefix for s in spec.traverse()]
assert (
@@ -302,8 +300,7 @@ def test_find_very_long(database, config):
output = find("-L", "--no-groups", "mpileaks")
specs = [
spack.concretize.concretize_one(s)
for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
Spec(s).concretized() for s in ["mpileaks ^zmpi", "mpileaks ^mpich", "mpileaks ^mpich2"]
]
assert set(output.strip().split("\n")) == set(

View File

@@ -5,7 +5,6 @@
import pytest
import spack.concretize
import spack.deptypes as dt
import spack.environment as ev
import spack.main
@@ -26,7 +25,8 @@ def test_gc_without_build_dependency(mutable_database):
@pytest.mark.db
def test_gc_with_build_dependency(mutable_database):
s = spack.concretize.concretize_one("simple-inheritance")
s = spack.spec.Spec("simple-inheritance")
s.concretize()
PackageInstaller([s.package], explicit=True, fake=True).install()
assert "There are no unused specs." in gc("-yb")
@@ -36,8 +36,8 @@ def test_gc_with_build_dependency(mutable_database):
@pytest.mark.db
def test_gc_with_constraints(mutable_database):
s_cmake1 = spack.concretize.concretize_one("simple-inheritance ^cmake@3.4.3")
s_cmake2 = spack.concretize.concretize_one("simple-inheritance ^cmake@3.23.1")
s_cmake1 = spack.spec.Spec("simple-inheritance ^cmake@3.4.3").concretized()
s_cmake2 = spack.spec.Spec("simple-inheritance ^cmake@3.23.1").concretized()
PackageInstaller([s_cmake1.package], explicit=True, fake=True).install()
PackageInstaller([s_cmake2.package], explicit=True, fake=True).install()
@@ -52,7 +52,8 @@ def test_gc_with_constraints(mutable_database):
@pytest.mark.db
def test_gc_with_environment(mutable_database, mutable_mock_env_path):
s = spack.concretize.concretize_one("simple-inheritance")
s = spack.spec.Spec("simple-inheritance")
s.concretize()
PackageInstaller([s.package], explicit=True, fake=True).install()
e = ev.create("test_gc")
@@ -67,7 +68,8 @@ def test_gc_with_environment(mutable_database, mutable_mock_env_path):
@pytest.mark.db
def test_gc_with_build_dependency_in_environment(mutable_database, mutable_mock_env_path):
s = spack.concretize.concretize_one("simple-inheritance")
s = spack.spec.Spec("simple-inheritance")
s.concretize()
PackageInstaller([s.package], explicit=True, fake=True).install()
e = ev.create("test_gc")
@@ -118,7 +120,8 @@ def test_gc_except_any_environments(mutable_database, mutable_mock_env_path):
@pytest.mark.db
def test_gc_except_specific_environments(mutable_database, mutable_mock_env_path):
s = spack.concretize.concretize_one("simple-inheritance")
s = spack.spec.Spec("simple-inheritance")
s.concretize()
PackageInstaller([s.package], explicit=True, fake=True).install()
assert mutable_database.query_local("zmpi")
@@ -144,7 +147,8 @@ def test_gc_except_nonexisting_dir_env(mutable_database, mutable_mock_env_path,
@pytest.mark.db
def test_gc_except_specific_dir_env(mutable_database, mutable_mock_env_path, tmpdir):
s = spack.concretize.concretize_one("simple-inheritance")
s = spack.spec.Spec("simple-inheritance")
s.concretize()
PackageInstaller([s.package], explicit=True, fake=True).install()
assert mutable_database.query_local("zmpi")
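
The gc hunks above also use the in-place variant, Spec.concretize(), which mutates the spec rather than returning a copy. A short sketch contrasting the two forms, under the same assumptions as the previous example:

from spack.spec import Spec

s = Spec("zlib")
s.concretize()  # concretizes s in place; returns None
assert s.concrete

t = Spec("zlib").concretized()  # returns a new concrete Spec instead
assert t.concrete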

View File

@@ -19,7 +19,6 @@
import spack.build_environment
import spack.cmd.common.arguments
import spack.cmd.install
import spack.concretize
import spack.config
import spack.environment as ev
import spack.error
@@ -135,7 +134,7 @@ def test_package_output(tmpdir, capsys, install_mockery, mock_fetch):
# we can't use output capture here because it interferes with Spack's
# logging. TODO: see whether we can get multiple log_outputs to work
# when nested AND in pytest
spec = spack.concretize.concretize_one("printing-package")
spec = Spec("printing-package").concretized()
pkg = spec.package
PackageInstaller([pkg], explicit=True, verbose=True).install()
@@ -175,7 +174,7 @@ def test_install_output_on_python_error(mock_packages, mock_archive, mock_fetch,
def test_install_with_source(mock_packages, mock_archive, mock_fetch, install_mockery):
"""Verify that source has been copied into place."""
install("--source", "--keep-stage", "trivial-install-test-package")
spec = spack.concretize.concretize_one("trivial-install-test-package")
spec = Spec("trivial-install-test-package").concretized()
src = os.path.join(spec.prefix.share, "trivial-install-test-package", "src")
assert filecmp.cmp(
os.path.join(mock_archive.path, "configure"), os.path.join(src, "configure")
@@ -183,7 +182,8 @@ def test_install_with_source(mock_packages, mock_archive, mock_fetch, install_mo
def test_install_env_variables(mock_packages, mock_archive, mock_fetch, install_mockery):
spec = spack.concretize.concretize_one("libdwarf")
spec = Spec("libdwarf")
spec.concretize()
install("libdwarf")
assert os.path.isfile(spec.package.install_env_path)
@@ -204,7 +204,8 @@ def test_show_log_on_error(mock_packages, mock_archive, mock_fetch, install_mock
def test_install_overwrite(mock_packages, mock_archive, mock_fetch, install_mockery):
# Try to install a spec and then to reinstall it.
spec = spack.concretize.concretize_one("libdwarf")
spec = Spec("libdwarf")
spec.concretize()
install("libdwarf")
@@ -237,7 +238,8 @@ def test_install_overwrite(mock_packages, mock_archive, mock_fetch, install_mock
def test_install_overwrite_not_installed(mock_packages, mock_archive, mock_fetch, install_mockery):
# Try to install a spec and then to reinstall it.
spec = spack.concretize.concretize_one("libdwarf")
spec = Spec("libdwarf")
spec.concretize()
assert not os.path.exists(spec.prefix)
@@ -258,7 +260,7 @@ def test_install_commit(mock_git_version_info, install_mockery, mock_packages, m
monkeypatch.setattr(spack.package_base.PackageBase, "git", file_url, raising=False)
# Use the earliest commit in the repository
spec = spack.concretize.concretize_one(f"git-test-commit@{commits[-1]}")
spec = Spec(f"git-test-commit@{commits[-1]}").concretized()
PackageInstaller([spec.package], explicit=True).install()
# Ensure first commit file contents were written
@@ -271,11 +273,13 @@ def test_install_commit(mock_git_version_info, install_mockery, mock_packages, m
def test_install_overwrite_multiple(mock_packages, mock_archive, mock_fetch, install_mockery):
# Try to install a spec and then to reinstall it.
libdwarf = spack.concretize.concretize_one("libdwarf")
libdwarf = Spec("libdwarf")
libdwarf.concretize()
install("libdwarf")
cmake = spack.concretize.concretize_one("cmake")
cmake = Spec("cmake")
cmake.concretize()
install("cmake")
@@ -351,7 +355,7 @@ def test_install_invalid_spec():
)
def test_install_from_file(spec, concretize, error_code, tmpdir):
if concretize:
spec = spack.concretize.concretize_one(spec)
spec.concretize()
specfile = tmpdir.join("spec.yaml")
@@ -481,7 +485,8 @@ def test_install_mix_cli_and_files(clispecs, filespecs, tmpdir):
for spec in filespecs:
filepath = tmpdir.join(spec + ".yaml")
args = ["-f", str(filepath)] + args
s = spack.concretize.concretize_one(spec)
s = Spec(spec)
s.concretize()
with filepath.open("w") as f:
s.to_yaml(f)
@@ -490,7 +495,8 @@ def test_install_mix_cli_and_files(clispecs, filespecs, tmpdir):
def test_extra_files_are_archived(mock_packages, mock_archive, mock_fetch, install_mockery):
s = spack.concretize.concretize_one("archive-files")
s = Spec("archive-files")
s.concretize()
install("archive-files")
@@ -609,7 +615,8 @@ def test_cdash_install_from_spec_json(
with capfd.disabled(), tmpdir.as_cwd():
spec_json_path = str(tmpdir.join("spec.json"))
pkg_spec = spack.concretize.concretize_one("pkg-a")
pkg_spec = Spec("pkg-a")
pkg_spec.concretize()
with open(spec_json_path, "w", encoding="utf-8") as fd:
fd.write(pkg_spec.to_json(hash=ht.dag_hash))
@@ -685,8 +692,8 @@ def test_cache_only_fails(tmpdir, mock_fetch, install_mockery, capfd):
def test_install_only_dependencies(tmpdir, mock_fetch, install_mockery):
dep = spack.concretize.concretize_one("dependency-install")
root = spack.concretize.concretize_one("dependent-install")
dep = Spec("dependency-install").concretized()
root = Spec("dependent-install").concretized()
install("--only", "dependencies", "dependent-install")
@@ -707,8 +714,8 @@ def test_install_only_package(tmpdir, mock_fetch, install_mockery, capfd):
def test_install_deps_then_package(tmpdir, mock_fetch, install_mockery):
dep = spack.concretize.concretize_one("dependency-install")
root = spack.concretize.concretize_one("dependent-install")
dep = Spec("dependency-install").concretized()
root = Spec("dependent-install").concretized()
install("--only", "dependencies", "dependent-install")
assert os.path.exists(dep.prefix)
@@ -726,8 +733,8 @@ def test_install_only_dependencies_in_env(
env("create", "test")
with ev.read("test"):
dep = spack.concretize.concretize_one("dependency-install")
root = spack.concretize.concretize_one("dependent-install")
dep = Spec("dependency-install").concretized()
root = Spec("dependent-install").concretized()
install("-v", "--only", "dependencies", "--add", "dependent-install")
@@ -743,8 +750,8 @@ def test_install_only_dependencies_of_all_in_env(
with ev.read("test"):
roots = [
spack.concretize.concretize_one("dependent-install@1.0"),
spack.concretize.concretize_one("dependent-install@2.0"),
Spec("dependent-install@1.0").concretized(),
Spec("dependent-install@2.0").concretized(),
]
add("dependent-install@1.0")
@@ -893,7 +900,7 @@ def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
# Ensure that even on non-x86_64 architectures, there are no
# dependencies installed
spec = spack.concretize.concretize_one("configure-warning")
spec = Spec("configure-warning").concretized()
spec.clear_dependencies()
specfile = "./spec.json"
with open(specfile, "w", encoding="utf-8") as f:
@@ -939,7 +946,7 @@ def test_install_env_with_tests_all(
):
env("create", "test")
with ev.read("test"):
test_dep = spack.concretize.concretize_one("test-dependency")
test_dep = Spec("test-dependency").concretized()
add("depb")
install("--test", "all")
assert os.path.exists(test_dep.prefix)
@@ -951,7 +958,7 @@ def test_install_env_with_tests_root(
):
env("create", "test")
with ev.read("test"):
test_dep = spack.concretize.concretize_one("test-dependency")
test_dep = Spec("test-dependency").concretized()
add("depb")
install("--test", "root")
assert not os.path.exists(test_dep.prefix)

View File

@@ -7,7 +7,7 @@
import pytest
import spack.concretize
import spack.spec
import spack.user_environment as uenv
from spack.main import SpackCommand
@@ -49,7 +49,7 @@ def test_load_shell(shell, set_command):
"""Test that `spack load` applies prefix inspections of its required runtime deps in
topo-order"""
install("mpileaks")
mpileaks_spec = spack.concretize.concretize_one("mpileaks")
mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
# Ensure our reference variable is clean.
os.environ["CMAKE_PREFIX_PATH"] = "/hello" + os.pathsep + "/world"
@@ -166,7 +166,7 @@ def test_unload(
"""Tests that any variables set in the user environment are undone by the
unload command"""
install("mpileaks")
mpileaks_spec = spack.concretize.concretize_one("mpileaks")
mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
# Set so unload has something to do
os.environ["FOOBAR"] = "mpileaks"
@@ -187,7 +187,7 @@ def test_unload_fails_no_shell(
):
"""Test that spack unload prints an error message without a shell."""
install("mpileaks")
mpileaks_spec = spack.concretize.concretize_one("mpileaks")
mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
os.environ[uenv.spack_loaded_hashes_var] = mpileaks_spec.dag_hash()
out = unload("mpileaks", fail_on_error=False)

View File

@@ -8,9 +8,9 @@
from llnl.util.filesystem import mkdirp
import spack.concretize
import spack.environment as ev
import spack.paths
import spack.spec
import spack.stage
from spack.main import SpackCommand, SpackCommandError
@@ -25,7 +25,7 @@
@pytest.fixture
def mock_spec():
# Make it look like the source was actually expanded.
s = spack.concretize.concretize_one("externaltest")
s = spack.spec.Spec("externaltest").concretized()
source_path = s.package.stage.source_path
mkdirp(source_path)
yield s, s.package

View File

@@ -13,7 +13,6 @@
import spack
import spack.cmd.logs
import spack.concretize
import spack.main
import spack.spec
from spack.main import SpackCommand
@@ -54,7 +53,7 @@ def disable_capture(capfd):
def test_logs_cmd_errors(install_mockery, mock_fetch, mock_archive, mock_packages):
spec = spack.concretize.concretize_one("libelf")
spec = spack.spec.Spec("libelf").concretized()
assert not spec.installed
with pytest.raises(spack.main.SpackCommandError, match="is not installed or staged"):
@@ -83,7 +82,7 @@ def test_dump_logs(install_mockery, mock_fetch, mock_archive, mock_packages, dis
decompress them.
"""
cmdline_spec = spack.spec.Spec("libelf")
concrete_spec = spack.concretize.concretize_one(cmdline_spec)
concrete_spec = cmdline_spec.concretized()
# Sanity check, make sure this test is checking what we want: to
# start with

View File

@@ -7,7 +7,6 @@
import pytest
import spack.cmd.mirror
import spack.concretize
import spack.config
import spack.environment as ev
import spack.error
@@ -61,7 +60,7 @@ def test_mirror_from_env(tmp_path, mock_packages, mock_fetch, mutable_mock_env_p
@pytest.fixture
def source_for_pkg_with_hash(mock_packages, tmpdir):
s = spack.concretize.concretize_one("trivial-pkg-with-valid-hash")
s = spack.spec.Spec("trivial-pkg-with-valid-hash").concretized()
local_url_basename = os.path.basename(s.package.url)
local_path = os.path.join(str(tmpdir), local_url_basename)
with open(local_path, "w", encoding="utf-8") as f:
@@ -73,9 +72,7 @@ def source_for_pkg_with_hash(mock_packages, tmpdir):
def test_mirror_skip_unstable(tmpdir_factory, mock_packages, config, source_for_pkg_with_hash):
mirror_dir = str(tmpdir_factory.mktemp("mirror-dir"))
specs = [
spack.concretize.concretize_one(x) for x in ["git-test", "trivial-pkg-with-valid-hash"]
]
specs = [spack.spec.Spec(x).concretized() for x in ["git-test", "trivial-pkg-with-valid-hash"]]
spack.mirrors.utils.create(mirror_dir, specs, skip_unstable_versions=True)
assert set(os.listdir(mirror_dir)) - set(["_source-cache"]) == set(
@@ -114,7 +111,7 @@ def test_exclude_specs(mock_packages, config):
mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
expected_include = set(
spack.concretize.concretize_one(x) for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
spack.spec.Spec(x).concretized() for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
)
expected_exclude = set(spack.spec.Spec(x) for x in ["mpich@3.0.1", "mpich@3.0.2", "mpich@1.0"])
assert expected_include <= set(mirror_specs)
@@ -148,7 +145,7 @@ def test_exclude_file(mock_packages, tmpdir, config):
mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
expected_include = set(
spack.concretize.concretize_one(x) for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
spack.spec.Spec(x).concretized() for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
)
expected_exclude = set(spack.spec.Spec(x) for x in ["mpich@3.0.1", "mpich@3.0.2", "mpich@1.0"])
assert expected_include <= set(mirror_specs)

View File

@@ -7,12 +7,12 @@
import pytest
import spack.concretize
import spack.config
import spack.main
import spack.modules
import spack.modules.lmod
import spack.repo
import spack.spec
import spack.store
from spack.installer import PackageInstaller
@@ -33,7 +33,7 @@ def ensure_module_files_are_there(mock_repo_path, mock_store, mock_configuration
def _module_files(module_type, *specs):
specs = [spack.concretize.concretize_one(x) for x in specs]
specs = [spack.spec.Spec(x).concretized() for x in specs]
writer_cls = spack.modules.module_types[module_type]
return [writer_cls(spec, "default").layout.filename for spec in specs]
@@ -184,15 +184,12 @@ def test_setdefault_command(mutable_database, mutable_config):
# Install two different versions of pkg-a
other_spec, preferred = "pkg-a@1.0", "pkg-a@2.0"
specs = [
spack.concretize.concretize_one(other_spec),
spack.concretize.concretize_one(preferred),
]
specs = [spack.spec.Spec(other_spec).concretized(), spack.spec.Spec(preferred).concretized()]
PackageInstaller([s.package for s in specs], explicit=True, fake=True).install()
writers = {
preferred: writer_cls(specs[1], "default"),
other_spec: writer_cls(specs[0], "default"),
preferred: writer_cls(spack.spec.Spec(preferred).concretized(), "default"),
other_spec: writer_cls(spack.spec.Spec(other_spec).concretized(), "default"),
}
# Create two module files for the same software

View File

@@ -2,9 +2,9 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.concretize
import spack.main
import spack.repo
import spack.spec
from spack.installer import PackageInstaller
tags = spack.main.SpackCommand("tags")
@@ -47,7 +47,7 @@ class tag_path:
def test_tags_installed(install_mockery, mock_fetch):
s = spack.concretize.concretize_one("mpich")
s = spack.spec.Spec("mpich").concretized()
PackageInstaller([s.package], explicit=True, fake=True).install()
out = tags("-i")

View File

@@ -11,10 +11,10 @@
import spack.cmd.common.arguments
import spack.cmd.test
import spack.concretize
import spack.config
import spack.install_test
import spack.paths
import spack.spec
from spack.install_test import TestStatus
from spack.main import SpackCommand
@@ -240,7 +240,7 @@ def test_read_old_results(mock_packages, mock_test_stage):
def test_test_results_none(mock_packages, mock_test_stage):
name = "trivial"
spec = spack.concretize.concretize_one("trivial-smoke-test")
spec = spack.spec.Spec("trivial-smoke-test").concretized()
suite = spack.install_test.TestSuite([spec], name)
suite.ensure_stage()
spack.install_test.write_test_suite_file(suite)
@@ -255,7 +255,7 @@ def test_test_results_none(mock_packages, mock_test_stage):
def test_test_results_status(mock_packages, mock_test_stage, status):
"""Confirm 'spack test results' returns expected status."""
name = "trivial"
spec = spack.concretize.concretize_one("trivial-smoke-test")
spec = spack.spec.Spec("trivial-smoke-test").concretized()
suite = spack.install_test.TestSuite([spec], name)
suite.ensure_stage()
spack.install_test.write_test_suite_file(suite)
@@ -278,7 +278,7 @@ def test_test_results_status(mock_packages, mock_test_stage, status):
def test_report_filename_for_cdash(install_mockery, mock_fetch):
"""Test that the temporary file used to write Testing.xml for CDash is not the upload URL"""
name = "trivial"
spec = spack.concretize.concretize_one("trivial-smoke-test")
spec = spack.spec.Spec("trivial-smoke-test").concretized()
suite = spack.install_test.TestSuite([spec], name)
suite.ensure_stage()

View File

@@ -1,8 +1,8 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.concretize
import spack.environment as ev
import spack.spec
from spack.main import SpackCommand
undevelop = SpackCommand("undevelop")
@@ -30,9 +30,9 @@ def test_undevelop(tmpdir, mutable_config, mock_packages, mutable_mock_env_path)
env("create", "test", "./spack.yaml")
with ev.read("test"):
before = spack.concretize.concretize_one("mpich")
before = spack.spec.Spec("mpich").concretized()
undevelop("mpich")
after = spack.concretize.concretize_one("mpich")
after = spack.spec.Spec("mpich").concretized()
# Removing dev spec from environment changes concretization
assert before.satisfies("dev_path=*")

View File

@@ -7,7 +7,7 @@
import llnl.util.filesystem as fs
import spack.concretize
import spack.spec
import spack.store
import spack.util.spack_json as sjson
import spack.verify
@@ -65,7 +65,7 @@ def test_single_file_verify_cmd(tmpdir):
def test_single_spec_verify_cmd(tmpdir, mock_packages, mock_archive, mock_fetch, install_mockery):
# Test the verify command interface to verify a single spec
install("libelf")
s = spack.concretize.concretize_one("libelf")
s = spack.spec.Spec("libelf").concretized()
prefix = s.prefix
hash = s.dag_hash()

View File

@@ -9,10 +9,10 @@
from llnl.util.symlink import _windows_can_symlink
import spack.concretize
import spack.util.spack_yaml as s_yaml
from spack.installer import PackageInstaller
from spack.main import SpackCommand
from spack.spec import Spec
extensions = SpackCommand("extensions")
install = SpackCommand("install")
@@ -190,7 +190,7 @@ def test_view_fails_with_missing_projections_file(tmpdir):
def test_view_files_not_ignored(
tmpdir, mock_packages, mock_archive, mock_fetch, install_mockery, cmd, with_projection
):
spec = spack.concretize.concretize_one("view-not-ignored")
spec = Spec("view-not-ignored").concretized()
pkg = spec.package
PackageInstaller([pkg], explicit=True).install()
pkg.assert_installed(spec.prefix)

View File

@@ -8,7 +8,6 @@
import archspec.cpu
import spack.concretize
import spack.config
import spack.paths
import spack.repo
@@ -21,7 +20,7 @@
def _concretize_with_reuse(*, root_str, reused_str):
reused_spec = spack.concretize.concretize_one(reused_str)
reused_spec = spack.spec.Spec(reused_str).concretized()
setup = spack.solver.asp.SpackSolverSetup(tests=False)
driver = spack.solver.asp.PyclingoDriver()
result, _, _ = driver.solve(setup, [spack.spec.Spec(f"{root_str}")], reuse=[reused_spec])
@@ -45,7 +44,7 @@ def enable_runtimes():
def test_correct_gcc_runtime_is_injected_as_dependency(runtime_repo):
s = spack.concretize.concretize_one("pkg-a%gcc@10.2.1 ^pkg-b%gcc@9.4.0")
s = spack.spec.Spec("pkg-a%gcc@10.2.1 ^pkg-b%gcc@9.4.0").concretized()
a, b = s["pkg-a"], s["pkg-b"]
# Both a and b should depend on the same gcc-runtime directly
@@ -62,7 +61,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
packages_yaml = {"pkg-b": {"externals": [{"spec": "pkg-b@1.0", "prefix": f"{str(tmp_path)}"}]}}
spack.config.set("packages", packages_yaml)
s = spack.concretize.concretize_one("pkg-a%gcc@10.2.1")
s = spack.spec.Spec("pkg-a%gcc@10.2.1").concretized()
a, b = s["pkg-a"], s["pkg-b"]

File diff suppressed because it is too large

View File

@@ -4,9 +4,9 @@
import pytest
import spack.concretize
import spack.config
import spack.solver.asp
import spack.spec
version_error_messages = [
"Cannot satisfy 'fftw@:1.0' and 'fftw@1.1:",
@@ -57,7 +57,7 @@ def test_error_messages(error_messages, config_set, spec, mock_packages, mutable
spack.config.set(path, conf)
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError) as e:
_ = spack.concretize.concretize_one(spec)
_ = spack.spec.Spec(spec).concretized()
for em in error_messages:
assert em in str(e.value)

Some files were not shown because too many files have changed in this diff