Support independent includes with conditional, optional, and remote entries (#48784)
Supersedes #46792. Closes #40018. Closes #31026. Closes #2700.

There were a number of feature requests for os-specific config. This enables os-specific config without adding a lot of special sub-scopes.

Support `include:` as an independent configuration schema, allowing users to include configuration scopes from files or directories. Includes can be:

* conditional (similar to definitions in environments), and/or
* optional (i.e., the include will be skipped if it does not exist).

Includes can be paths or URLs (`ftp`, `https`, `http` or `file`). Paths can be absolute or relative. Environments can include configuration files using the same schema. Remote includes must be checked by `sha256`.

Includes can also be recursive, and this modifies the config system accordingly so that we push included configuration scopes on the stack *before* their including scopes, and we remove configuration scopes from the stack when their including scopes are removed.

For example, you could have an `include.yaml` file (e.g., under `$HOME/.spack`) to specify global includes:

```
include:
- ./enable_debug.yaml
- path: https://github.com/spack/spack-configs/blob/main/NREL/configs/mac/config.yaml
  sha256: 37f982915b03de18cc4e722c42c5267bf04e46b6a6d6e0ef3a67871fcb1d258b
```

Or an environment `spack.yaml`:

```
spack:
  include:
  - path: "/path/to/a/config-dir-or-file"
    when: os == "ventura"
  - ./path/relative/to/containing/file/that/is/required
  - path: "/path/with/spack/variables/$os/$target"
    optional: true
  - path: https://raw.githubusercontent.com/spack/spack-configs/refs/heads/main/path/to/required/raw/config.yaml
    sha256: 26e871804a92cd07bb3d611b31b4156ae93d35b6a6d6e0ef3a67871fcb1d258b
```

Updated TODO:

- [x] Get existing unit tests to pass with Todd's changes
- [x] Resolve new (or old) circular imports
- [x] Ensure remote includes (global) work
- [x] Ensure remote includes for environments work (note: caches remote files under user cache root)
- [x] add sha256 field to include paths, validate, and require for remote includes
- [x] add sha256 remote file unit tests
- [x] revisit how diamond includes should work
- [x] support recursive includes
- [x] add recursive include unit tests
- [x] update docs and unit test to indicate ordering of recursive includes with conflicting options is deferred to follow-on work

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Peter Scheibel <scheibel1@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
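A remote include's `sha256` must match the fetched raw file exactly. As a minimal sketch of how one might compute that digest (not part of this PR; assumes only the Python standard library and reuses the placeholder URL from the example above):

```python
import hashlib
import urllib.request


def remote_config_sha256(url: str) -> str:
    """Return the sha256 hex digest of the raw configuration file at ``url``."""
    with urllib.request.urlopen(url) as response:
        return hashlib.sha256(response.read()).hexdigest()


# Paste the printed digest into the corresponding `sha256:` field of the include entry.
print(
    remote_config_sha256(
        "https://raw.githubusercontent.com/spack/spack-configs/refs/heads/main/path/to/required/raw/config.yaml"
    )
)
```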
This commit is contained in:
parent
07d4915e82
commit
3f8dcfc6ed
@ -14,6 +14,7 @@ case you want to skip directly to specific docs:
* :ref:`compilers.yaml <compiler-config>`
* :ref:`concretizer.yaml <concretizer-options>`
* :ref:`config.yaml <config-yaml>`
* :ref:`include.yaml <include-yaml>`
* :ref:`mirrors.yaml <mirrors>`
* :ref:`modules.yaml <modules>`
* :ref:`packages.yaml <packages-config>`
@ -670,24 +670,45 @@ This configuration sets the default compiler for all packages to

Included configurations
^^^^^^^^^^^^^^^^^^^^^^^

Spack environments allow an ``include`` heading in their yaml
schema. This heading pulls in external configuration files and applies
them to the environment.
Spack environments allow an ``include`` heading in their yaml schema.
This heading pulls in external configuration files and applies them to
the environment.

.. code-block:: yaml

   spack:
     include:
     - relative/path/to/config.yaml
     - environment/relative/path/to/config.yaml
     - https://github.com/path/to/raw/config/compilers.yaml
     - /absolute/path/to/packages.yaml
     - path: /path/to/$os/$target/environment
       optional: true
     - path: /path/to/os-specific/config-dir
       when: os == "ventura"

Included configuration files are required *unless* they are explicitly optional
or the entry's condition evaluates to ``false``. Optional includes are specified
with the ``optional`` clause and conditional with the ``when`` clause. (See
:ref:`include-yaml` for more information on optional and conditional entries.)

Files are listed using paths to individual files or directories containing them.
Path entries may be absolute or relative to the environment or specified as
URLs. URLs to individual files must link to the **raw** form of the file's
contents (e.g., `GitHub
<https://docs.github.com/en/repositories/working-with-files/using-files/viewing-and-understanding-files#viewing-or-copying-the-raw-file-content>`_
or `GitLab
<https://docs.gitlab.com/ee/api/repository_files.html#get-raw-file-from-repository>`_).
Only the ``file``, ``ftp``, ``http`` and ``https`` protocols (or schemes) are
supported. Spack-specific, environment and user path variables can be used.
(See :ref:`config-file-variables` for more information.)

.. warning::

   Recursive includes are not currently processed in a breadth-first manner
   so the value of a configuration option that is altered by multiple included
   files may not be what you expect. This will be addressed in a future
   update.

Environments can include files or URLs. File paths can be relative or
absolute. URLs include the path to the text for individual files or
can be the path to a directory containing configuration files.
Spack supports ``file``, ``http``, ``https`` and ``ftp`` protocols (or
schemes). Spack-specific, environment and user path variables may be
used in these paths. See :ref:`config-file-variables` for more information.

^^^^^^^^^^^^^^^^^^^^^^^^
Configuration precedence

51
lib/spack/docs/include_yaml.rst
Normal file
@ -0,0 +1,51 @@
.. Copyright Spack Project Developers. See COPYRIGHT file for details.

   SPDX-License-Identifier: (Apache-2.0 OR MIT)

.. _include-yaml:

===============================
Include Settings (include.yaml)
===============================

Spack allows you to include configuration files through ``include.yaml``.
Using the ``include:`` heading results in pulling in external configuration
information to be used by any Spack command.

Included configuration files are required *unless* they are explicitly optional
or the entry's condition evaluates to ``false``. Optional includes are specified
with the ``optional`` clause and conditional with the ``when`` clause. For
example,

.. code-block:: yaml

   include:
   - /path/to/a/required/config.yaml
   - path: /path/to/$os/$target/config
     optional: true
   - path: /path/to/os-specific/config-dir
     when: os == "ventura"

shows all three. The first entry, ``/path/to/a/required/config.yaml``,
indicates that the included ``config.yaml`` file is required (so must exist).
Use of ``optional: true`` for ``/path/to/$os/$target/config`` means
the path is only included if it exists. The condition ``os == "ventura"``
in the ``when`` clause for ``/path/to/os-specific/config-dir`` means the
path is only included when the operating system (``os``) is ``ventura``.

The same conditions and variables in `Spec List References
<https://spack.readthedocs.io/en/latest/environments.html#spec-list-references>`_
can be used for conditional activation in the ``when`` clauses.

Included files can be specified by path or by their parent directory.
Paths may be absolute, relative (to the configuration file including the path),
or specified as URLs. Only the ``file``, ``ftp``, ``http`` and ``https`` protocols (or
schemes) are supported. Spack-specific, environment and user path variables
can be used. (See :ref:`config-file-variables` for more information.)

.. warning::

   Recursive includes are not currently processed in a breadth-first manner
   so the value of a configuration option that is altered by multiple included
   files may not be what you expect. This will be addressed in a future
   update.
@ -71,6 +71,7 @@ or refer to the full manual below.

   configuration
   config_yaml
   include_yaml
   packages_yaml
   build_settings
   environments
@ -11,6 +11,7 @@
|
||||
import re
|
||||
import sys
|
||||
import traceback
|
||||
import types
|
||||
import typing
|
||||
import warnings
|
||||
from datetime import datetime, timedelta
|
||||
@ -707,14 +708,24 @@ def __init__(self, wrapped_object):
|
||||
|
||||
|
||||
class Singleton:
|
||||
"""Simple wrapper for lazily initialized singleton objects."""
|
||||
"""Wrapper for lazily initialized singleton objects."""
|
||||
|
||||
def __init__(self, factory):
|
||||
def __init__(self, factory: Callable[[], object]):
|
||||
"""Create a new singleton to be inited with the factory function.
|
||||
|
||||
Most factories will simply create the object to be initialized and
|
||||
return it.
|
||||
|
||||
In some cases, e.g. when bootstrapping some global state, the singleton
|
||||
may need to be initialized incrementally. If the factory returns a generator
|
||||
instead of a regular object, the singleton will assign each result yielded by
|
||||
the generator to the singleton instance. This allows methods called by
|
||||
the factory in later stages to refer back to the singleton.
|
||||
|
||||
Args:
|
||||
factory (function): function taking no arguments that
|
||||
creates the singleton instance.
|
||||
factory (function): function taking no arguments that creates the
|
||||
singleton instance.
|
||||
|
||||
"""
|
||||
self.factory = factory
|
||||
self._instance = None
|
||||
@ -722,7 +733,16 @@ def __init__(self, factory):
|
||||
@property
|
||||
def instance(self):
|
||||
if self._instance is None:
|
||||
self._instance = self.factory()
|
||||
instance = self.factory()
|
||||
|
||||
if isinstance(instance, types.GeneratorType):
|
||||
# if it's a generator, assign every value
|
||||
for value in instance:
|
||||
self._instance = value
|
||||
else:
|
||||
# if not, just assign the result like a normal singleton
|
||||
self._instance = instance
|
||||
|
||||
return self._instance
|
||||
|
||||
def __getattr__(self, name):
|
||||
|
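# Illustrative sketch (not part of this commit): a generator factory for the
# Singleton above. Each yielded value becomes the instance, so code that runs
# between yields can already look the singleton up and use its earlier state.
# Names below are hypothetical; only the ``Singleton`` import reflects this diff.
from llnl.util.lang import Singleton


def _make_registry():
    registry = {"scopes": []}
    yield registry                     # first stage: the singleton is now usable
    registry["scopes"].append("site")  # later stage could consult REGISTRY itself
    yield registry


REGISTRY = Singleton(_make_registry)
print(REGISTRY.instance["scopes"])     # lazy init runs both stages -> ['site']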
@ -32,9 +32,10 @@
|
||||
import copy
|
||||
import functools
|
||||
import os
|
||||
import os.path
|
||||
import re
|
||||
import sys
|
||||
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union
|
||||
from typing import Any, Callable, Dict, Generator, List, NamedTuple, Optional, Tuple, Union
|
||||
|
||||
import jsonschema
|
||||
|
||||
@ -42,7 +43,6 @@
|
||||
|
||||
import spack.error
|
||||
import spack.paths
|
||||
import spack.platforms
|
||||
import spack.schema
|
||||
import spack.schema.bootstrap
|
||||
import spack.schema.cdash
|
||||
@ -54,16 +54,16 @@
|
||||
import spack.schema.develop
|
||||
import spack.schema.env
|
||||
import spack.schema.env_vars
|
||||
import spack.schema.include
|
||||
import spack.schema.merged
|
||||
import spack.schema.mirrors
|
||||
import spack.schema.modules
|
||||
import spack.schema.packages
|
||||
import spack.schema.repos
|
||||
import spack.schema.upstreams
|
||||
import spack.schema.view
|
||||
|
||||
# Hacked yaml for configuration files preserves line numbers.
|
||||
import spack.util.remote_file_cache as rfc_util
|
||||
import spack.util.spack_yaml as syaml
|
||||
import spack.util.web as web_util
|
||||
from spack.util.cpus import cpus_available
|
||||
|
||||
from .enums import ConfigScopePriority
|
||||
@ -74,6 +74,7 @@
|
||||
"concretizer": spack.schema.concretizer.schema,
|
||||
"definitions": spack.schema.definitions.schema,
|
||||
"env_vars": spack.schema.env_vars.schema,
|
||||
"include": spack.schema.include.schema,
|
||||
"view": spack.schema.view.schema,
|
||||
"develop": spack.schema.develop.schema,
|
||||
"mirrors": spack.schema.mirrors.schema,
|
||||
@ -121,6 +122,17 @@
|
||||
#: Type used for raw YAML configuration
|
||||
YamlConfigDict = Dict[str, Any]
|
||||
|
||||
#: prefix for name of included configuration scopes
|
||||
INCLUDE_SCOPE_PREFIX = "include"
|
||||
|
||||
#: safeguard for recursive includes -- maximum include depth
|
||||
MAX_RECURSIVE_INCLUDES = 100
|
||||
|
||||
|
||||
def _include_cache_location():
|
||||
"""Location to cache included configuration files."""
|
||||
return os.path.join(spack.paths.user_cache_path, "includes")
|
||||
|
||||
|
||||
class ConfigScope:
|
||||
def __init__(self, name: str) -> None:
|
||||
@ -128,6 +140,9 @@ def __init__(self, name: str) -> None:
|
||||
self.writable = False
|
||||
self.sections = syaml.syaml_dict()
|
||||
|
||||
#: names of any included scopes
|
||||
self.included_scopes: List[str] = []
|
||||
|
||||
def get_section_filename(self, section: str) -> str:
|
||||
raise NotImplementedError
|
||||
|
||||
@ -433,7 +448,9 @@ def highest(self) -> ConfigScope:
|
||||
return next(self.scopes.reversed_values()) # type: ignore
|
||||
|
||||
@_config_mutator
|
||||
def push_scope(self, scope: ConfigScope, priority: Optional[int] = None) -> None:
|
||||
def push_scope(
|
||||
self, scope: ConfigScope, priority: Optional[int] = None, _depth: int = 0
|
||||
) -> None:
|
||||
"""Adds a scope to the Configuration, at a given priority.
|
||||
|
||||
If a priority is not given, it is assumed to be the current highest priority.
|
||||
@ -443,6 +460,30 @@ def push_scope(self, scope: ConfigScope, priority: Optional[int] = None) -> None
|
||||
priority: priority of the scope
|
||||
"""
|
||||
tty.debug(f"[CONFIGURATION: PUSH SCOPE]: {str(scope)}, priority={priority}", level=2)
|
||||
|
||||
# TODO: As a follow on to #48784, change this to create a graph of the
|
||||
# TODO: includes AND ensure properly sorted such that the order included
|
||||
# TODO: at the highest level is reflected in the value of an option that
|
||||
# TODO: is set in multiple included files.
|
||||
# before pushing the scope itself, push any included scopes recursively, at same priority
|
||||
includes = scope.get_section("include")
|
||||
if includes:
|
||||
include_paths = [included_path(data) for data in includes["include"]]
|
||||
for path in reversed(include_paths):
|
||||
included_scope = include_path_scope(path)
|
||||
if not included_scope:
|
||||
continue
|
||||
|
||||
if _depth + 1 > MAX_RECURSIVE_INCLUDES: # make sure we're not recursing endlessly
|
||||
mark = path.path._start_mark if syaml.marked(path.path) else "" # type: ignore
|
||||
raise RecursiveIncludeError(
|
||||
f"Maximum include recursion exceeded in {path.path}", str(mark)
|
||||
)
|
||||
|
||||
# record this inclusion so that remove_scope() can use it
|
||||
scope.included_scopes.append(included_scope.name)
|
||||
self.push_scope(included_scope, priority=priority, _depth=_depth + 1)
|
||||
|
||||
self.scopes.add(scope.name, value=scope, priority=priority)
|
||||
|
||||
@_config_mutator
|
||||
@ -450,10 +491,17 @@ def remove_scope(self, scope_name: str) -> Optional[ConfigScope]:
|
||||
"""Removes a scope by name, and returns it. If the scope does not exist, returns None."""
|
||||
try:
|
||||
scope = self.scopes.remove(scope_name)
|
||||
tty.debug(f"[CONFIGURATION: POP SCOPE]: {str(scope)}", level=2)
|
||||
tty.debug(f"[CONFIGURATION: REMOVE SCOPE]: {str(scope)}", level=2)
|
||||
except KeyError as e:
|
||||
tty.debug(f"[CONFIGURATION: POP SCOPE]: {e}", level=2)
|
||||
tty.debug(f"[CONFIGURATION: REMOVE SCOPE]: {e}", level=2)
|
||||
return None
|
||||
|
||||
# transitively remove included scopes
|
||||
for inc in scope.included_scopes:
|
||||
assert inc in self.scopes, f"Included scope '{inc}' was never added to configuration!"
|
||||
self.remove_scope(inc)
|
||||
scope.included_scopes.clear() # clean up includes for bookkeeping
|
||||
|
||||
return scope
|
||||
|
||||
@property
|
||||
@ -763,6 +811,8 @@ def _add_platform_scope(
|
||||
cfg: Configuration, name: str, path: str, priority: ConfigScopePriority, writable: bool = True
|
||||
) -> None:
|
||||
"""Add a platform-specific subdirectory for the current platform."""
|
||||
import spack.platforms # circular dependency
|
||||
|
||||
platform = spack.platforms.host().name
|
||||
scope = DirectoryConfigScope(
|
||||
f"{name}/{platform}", os.path.join(path, platform), writable=writable
|
||||
@ -770,6 +820,75 @@ def _add_platform_scope(
|
||||
cfg.push_scope(scope, priority=priority)
|
||||
|
||||
|
||||
#: An include path together with its applicability: inclusion may be conditioned
#: on a restricted python expression that evaluates to a boolean, and/or the
#: path may be explicitly marked as optional.
class IncludePath(NamedTuple):
|
||||
path: str
|
||||
when: str
|
||||
sha256: str
|
||||
optional: bool
|
||||
|
||||
|
||||
def included_path(entry: Union[str, dict]) -> IncludePath:
|
||||
"""Convert the included path entry into an IncludePath.
|
||||
|
||||
Args:
|
||||
entry: include configuration entry
|
||||
|
||||
Returns: converted entry, where an empty ``when`` means the path is
|
||||
not conditionally included
|
||||
"""
|
||||
if isinstance(entry, str):
|
||||
return IncludePath(path=entry, sha256="", when="", optional=False)
|
||||
|
||||
path = entry["path"]
|
||||
sha256 = entry.get("sha256", "")
|
||||
when = entry.get("when", "")
|
||||
optional = entry.get("optional", False)
|
||||
return IncludePath(path=path, sha256=sha256, when=when, optional=optional)
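# Illustrative usage sketch (not part of this commit): both entry forms allowed
# by the include schema map onto the same IncludePath tuple, e.g.
#
#   included_path("./enable_debug.yaml")
#   -> IncludePath(path='./enable_debug.yaml', when='', sha256='', optional=False)
#
#   included_path({"path": "/path/to/$os/config", "optional": True})
#   -> IncludePath(path='/path/to/$os/config', when='', sha256='', optional=True)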
|
||||
|
||||
|
||||
def include_path_scope(include: IncludePath) -> Optional[ConfigScope]:
|
||||
"""Instantiate an appropriate configuration scope for the given path.
|
||||
|
||||
Args:
|
||||
include: optional include path
|
||||
|
||||
Returns: configuration scope
|
||||
|
||||
Raises:
|
||||
ValueError: the included path has an unsupported URL scheme, is required
but does not exist, or the configuration stage directory argument is missing
|
||||
ConfigFileError: unable to access remote configuration file(s)
|
||||
"""
|
||||
# circular dependencies
|
||||
import spack.spec
|
||||
|
||||
if (not include.when) or spack.spec.eval_conditional(include.when):
|
||||
config_path = rfc_util.local_path(include.path, include.sha256, _include_cache_location)
|
||||
if not config_path:
|
||||
raise ConfigFileError(f"Unable to fetch remote configuration from {include.path}")
|
||||
|
||||
if os.path.isdir(config_path):
|
||||
# directories are treated as regular ConfigScopes
|
||||
config_name = f"{INCLUDE_SCOPE_PREFIX}:{os.path.basename(config_path)}"
|
||||
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
|
||||
return DirectoryConfigScope(config_name, config_path)
|
||||
|
||||
if os.path.exists(config_path):
|
||||
# files are assumed to be SingleFileScopes
|
||||
config_name = f"{INCLUDE_SCOPE_PREFIX}:{config_path}"
|
||||
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
|
||||
return SingleFileScope(config_name, config_path, spack.schema.merged.schema)
|
||||
|
||||
if not include.optional:
|
||||
path = f" at ({config_path})" if config_path != include.path else ""
|
||||
raise ValueError(f"Required path ({include.path}) does not exist{path}")
|
||||
|
||||
return None
|
||||
|
||||
|
||||
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
|
||||
"""Load configuration paths from entry points
|
||||
|
||||
@ -795,7 +914,7 @@ def config_paths_from_entry_points() -> List[Tuple[str, str]]:
|
||||
return config_paths
|
||||
|
||||
|
||||
def create() -> Configuration:
|
||||
def create_incremental() -> Generator[Configuration, None, None]:
|
||||
"""Singleton Configuration instance.
|
||||
|
||||
This constructs one instance associated with this module and returns
|
||||
@ -839,11 +958,25 @@ def create() -> Configuration:
|
||||
# Each scope can have per-platform overrides in subdirectories
|
||||
_add_platform_scope(cfg, name, path, priority=ConfigScopePriority.CONFIG_FILES)
|
||||
|
||||
return cfg
|
||||
# yield the config incrementally so that each config level's init code can get
|
||||
# data from the one below. This can be tricky, but it enables us to have a
|
||||
# single unified config system.
|
||||
#
|
||||
# TODO: think about whether we want to restrict what types of config can be used
|
||||
# at each level. e.g., we may want to just more forcibly disallow remote
|
||||
# config (which uses ssl and other config options) for some of the scopes,
|
||||
# to make the bootstrap issues more explicit, even if allowing config scope
|
||||
# init to reference lower scopes is more flexible.
|
||||
yield cfg
|
||||
|
||||
|
||||
def create() -> Configuration:
|
||||
"""Create a configuration using create_incremental(), return the last yielded result."""
|
||||
return list(create_incremental())[-1]
|
||||
|
||||
|
||||
#: This is the singleton configuration instance for Spack.
|
||||
CONFIG: Configuration = lang.Singleton(create) # type: ignore
|
||||
CONFIG: Configuration = lang.Singleton(create_incremental) # type: ignore
|
||||
|
||||
|
||||
def add_from_file(filename: str, scope: Optional[str] = None) -> None:
|
||||
@ -939,7 +1072,8 @@ def set(path: str, value: Any, scope: Optional[str] = None) -> None:
|
||||
|
||||
Accepts the path syntax described in ``get()``.
|
||||
"""
|
||||
return CONFIG.set(path, value, scope)
|
||||
result = CONFIG.set(path, value, scope)
|
||||
return result
|
||||
|
||||
|
||||
def scopes() -> lang.PriorityOrderedMapping[str, ConfigScope]:
|
||||
@ -1462,98 +1596,6 @@ def create_from(*scopes_or_paths: Union[ScopeWithOptionalPriority, str]) -> Conf
|
||||
return result
|
||||
|
||||
|
||||
def raw_github_gitlab_url(url: str) -> str:
|
||||
"""Transform a github URL to the raw form to avoid undesirable html.
|
||||
|
||||
Args:
|
||||
url: url to be converted to raw form
|
||||
|
||||
Returns:
|
||||
Raw github/gitlab url or the original url
|
||||
"""
|
||||
# Note we rely on GitHub to redirect the 'raw' URL returned here to the
|
||||
# actual URL under https://raw.githubusercontent.com/ with '/blob'
|
||||
# removed and or, '/blame' if needed.
|
||||
if "github" in url or "gitlab" in url:
|
||||
return url.replace("/blob/", "/raw/")
|
||||
|
||||
return url
|
||||
|
||||
|
||||
def collect_urls(base_url: str) -> list:
|
||||
"""Return a list of configuration URLs.
|
||||
|
||||
Arguments:
|
||||
base_url: URL for a configuration (yaml) file or a directory
|
||||
containing yaml file(s)
|
||||
|
||||
Returns:
|
||||
List of configuration file(s) or empty list if none
|
||||
"""
|
||||
if not base_url:
|
||||
return []
|
||||
|
||||
extension = ".yaml"
|
||||
|
||||
if base_url.endswith(extension):
|
||||
return [base_url]
|
||||
|
||||
# Collect configuration URLs if the base_url is a "directory".
|
||||
_, links = web_util.spider(base_url, 0)
|
||||
return [link for link in links if link.endswith(extension)]
|
||||
|
||||
|
||||
def fetch_remote_configs(url: str, dest_dir: str, skip_existing: bool = True) -> str:
|
||||
"""Retrieve configuration file(s) at the specified URL.
|
||||
|
||||
Arguments:
|
||||
url: URL for a configuration (yaml) file or a directory containing
|
||||
yaml file(s)
|
||||
dest_dir: destination directory
|
||||
skip_existing: Skip files that already exist in dest_dir if
|
||||
``True``; otherwise, replace those files
|
||||
|
||||
Returns:
|
||||
Path to the corresponding file if URL is or contains a
|
||||
single file and it is the only file in the destination directory or
|
||||
the root (dest_dir) directory if multiple configuration files exist
|
||||
or are retrieved.
|
||||
"""
|
||||
|
||||
def _fetch_file(url):
|
||||
raw = raw_github_gitlab_url(url)
|
||||
tty.debug(f"Reading config from url {raw}")
|
||||
return web_util.fetch_url_text(raw, dest_dir=dest_dir)
|
||||
|
||||
if not url:
|
||||
raise ConfigFileError("Cannot retrieve configuration without a URL")
|
||||
|
||||
# Return the local path to the cached configuration file OR to the
|
||||
# directory containing the cached configuration files.
|
||||
config_links = collect_urls(url)
|
||||
existing_files = os.listdir(dest_dir) if os.path.isdir(dest_dir) else []
|
||||
|
||||
paths = []
|
||||
for config_url in config_links:
|
||||
basename = os.path.basename(config_url)
|
||||
if skip_existing and basename in existing_files:
|
||||
tty.warn(
|
||||
f"Will not fetch configuration from {config_url} since a "
|
||||
f"version already exists in {dest_dir}"
|
||||
)
|
||||
path = os.path.join(dest_dir, basename)
|
||||
else:
|
||||
path = _fetch_file(config_url)
|
||||
|
||||
if path:
|
||||
paths.append(path)
|
||||
|
||||
if paths:
|
||||
return dest_dir if len(paths) > 1 else paths[0]
|
||||
|
||||
raise ConfigFileError(f"Cannot retrieve configuration (yaml) from {url}")
|
||||
|
||||
|
||||
def get_mark_from_yaml_data(obj):
|
||||
"""Try to get ``spack.util.spack_yaml`` mark from YAML data.
|
||||
|
||||
@ -1680,3 +1722,7 @@ def get_path(path, data):
|
||||
|
||||
# give up and return None if nothing worked
|
||||
return None
|
||||
|
||||
|
||||
class RecursiveIncludeError(spack.error.SpackError):
|
||||
"""Too many levels of recursive includes."""
|
||||
|
@ -10,8 +10,6 @@
|
||||
import re
|
||||
import shutil
|
||||
import stat
|
||||
import urllib.parse
|
||||
import urllib.request
|
||||
import warnings
|
||||
from typing import Any, Dict, Iterable, List, Optional, Sequence, Tuple, Union
|
||||
|
||||
@ -32,7 +30,6 @@
|
||||
import spack.paths
|
||||
import spack.repo
|
||||
import spack.schema.env
|
||||
import spack.schema.merged
|
||||
import spack.spec
|
||||
import spack.spec_list
|
||||
import spack.store
|
||||
@ -43,7 +40,6 @@
|
||||
import spack.util.path
|
||||
import spack.util.spack_json as sjson
|
||||
import spack.util.spack_yaml as syaml
|
||||
import spack.util.url
|
||||
from spack import traverse
|
||||
from spack.installer import PackageInstaller
|
||||
from spack.schema.env import TOP_LEVEL_KEY
|
||||
@ -577,13 +573,6 @@ def _write_yaml(data, str_or_file):
|
||||
syaml.dump_config(data, str_or_file, default_flow_style=False)
|
||||
|
||||
|
||||
def _eval_conditional(string):
|
||||
"""Evaluate conditional definitions using restricted variable scope."""
|
||||
valid_variables = spack.spec.get_host_environment()
|
||||
valid_variables.update({"re": re, "env": os.environ})
|
||||
return eval(string, valid_variables)
|
||||
|
||||
|
||||
def _is_dev_spec_and_has_changed(spec):
|
||||
"""Check if the passed spec is a dev build and whether it has changed since the
|
||||
last installation"""
|
||||
@ -1016,7 +1005,7 @@ def _process_definition(self, entry):
|
||||
"""Process a single spec definition item."""
|
||||
when_string = entry.get("when")
|
||||
if when_string is not None:
|
||||
when = _eval_conditional(when_string)
|
||||
when = spack.spec.eval_conditional(when_string)
|
||||
assert len([x for x in entry if x != "when"]) == 1
|
||||
else:
|
||||
when = True
|
||||
@ -1561,9 +1550,6 @@ def _get_specs_to_concretize(
|
||||
return new_user_specs, kept_user_specs, specs_to_concretize
|
||||
|
||||
def _concretize_together_where_possible(self, tests: bool = False) -> Sequence[SpecPair]:
|
||||
# Avoid cyclic dependency
|
||||
import spack.solver.asp
|
||||
|
||||
# Exit early if the set of concretized specs is the set of user specs
|
||||
new_user_specs, _, specs_to_concretize = self._get_specs_to_concretize()
|
||||
if not new_user_specs:
|
||||
@ -2674,20 +2660,23 @@ def _ensure_env_dir():
|
||||
# error handling for bad manifests is handled on other code paths
|
||||
return
|
||||
|
||||
# TODO: make this recursive
|
||||
includes = manifest[TOP_LEVEL_KEY].get("include", [])
|
||||
for include in includes:
|
||||
if os.path.isabs(include):
|
||||
included_path = spack.config.included_path(include)
|
||||
path = included_path.path
|
||||
if os.path.isabs(path):
|
||||
continue
|
||||
|
||||
abspath = pathlib.Path(os.path.normpath(environment_dir / include))
|
||||
abspath = pathlib.Path(os.path.normpath(environment_dir / path))
|
||||
common_path = pathlib.Path(os.path.commonpath([environment_dir, abspath]))
|
||||
if common_path != environment_dir:
|
||||
tty.debug(f"Will not copy relative include from outside environment: {include}")
|
||||
tty.debug(f"Will not copy relative include file from outside environment: {path}")
|
||||
continue
|
||||
|
||||
orig_abspath = os.path.normpath(envfile.parent / include)
|
||||
orig_abspath = os.path.normpath(envfile.parent / path)
|
||||
if not os.path.exists(orig_abspath):
|
||||
tty.warn(f"Included file does not exist; will not copy: '{include}'")
|
||||
tty.warn(f"Included file does not exist; will not copy: '{path}'")
|
||||
continue
|
||||
|
||||
fs.touchp(abspath)
|
||||
@ -2910,7 +2899,7 @@ def extract_name(_item):
|
||||
continue
|
||||
|
||||
condition_str = item.get("when", "True")
|
||||
if not _eval_conditional(condition_str):
|
||||
if not spack.spec.eval_conditional(condition_str):
|
||||
continue
|
||||
|
||||
yield idx, item
|
||||
@ -2971,127 +2960,20 @@ def __iter__(self):
|
||||
def __str__(self):
|
||||
return str(self.manifest_file)
|
||||
|
||||
@property
|
||||
def included_config_scopes(self) -> List[spack.config.ConfigScope]:
|
||||
"""List of included configuration scopes from the manifest.
|
||||
|
||||
Scopes are listed in the YAML file in order from highest to
|
||||
lowest precedence, so configuration from earlier scope will take
|
||||
precedence over later ones.
|
||||
|
||||
This routine returns them in the order they should be pushed onto
|
||||
the internal scope stack (so, in reverse, from lowest to highest).
|
||||
|
||||
Returns: Configuration scopes associated with the environment manifest
|
||||
|
||||
Raises:
|
||||
SpackEnvironmentError: if the manifest includes a remote file but
|
||||
no configuration stage directory has been identified
|
||||
"""
|
||||
scopes: List[spack.config.ConfigScope] = []
|
||||
|
||||
# load config scopes added via 'include:', in reverse so that
|
||||
# highest-precedence scopes are last.
|
||||
includes = self[TOP_LEVEL_KEY].get("include", [])
|
||||
missing = []
|
||||
for i, config_path in enumerate(reversed(includes)):
|
||||
# allow paths to contain spack config/environment variables, etc.
|
||||
config_path = substitute_path_variables(config_path)
|
||||
include_url = urllib.parse.urlparse(config_path)
|
||||
|
||||
# If scheme is not valid, config_path is not a url
|
||||
# of a type Spack is generally aware
|
||||
if spack.util.url.validate_scheme(include_url.scheme):
|
||||
# Transform file:// URLs to direct includes.
|
||||
if include_url.scheme == "file":
|
||||
config_path = urllib.request.url2pathname(include_url.path)
|
||||
|
||||
# Any other URL should be fetched.
|
||||
elif include_url.scheme in ("http", "https", "ftp"):
|
||||
# Stage any remote configuration file(s)
|
||||
staged_configs = (
|
||||
os.listdir(self.config_stage_dir)
|
||||
if os.path.exists(self.config_stage_dir)
|
||||
else []
|
||||
)
|
||||
remote_path = urllib.request.url2pathname(include_url.path)
|
||||
basename = os.path.basename(remote_path)
|
||||
if basename in staged_configs:
|
||||
# Do NOT re-stage configuration files over existing
|
||||
# ones with the same name since there is a risk of
|
||||
# losing changes (e.g., from 'spack config update').
|
||||
tty.warn(
|
||||
"Will not re-stage configuration from {0} to avoid "
|
||||
"losing changes to the already staged file of the "
|
||||
"same name.".format(remote_path)
|
||||
)
|
||||
|
||||
# Recognize the configuration stage directory
|
||||
# is flattened to ensure a single copy of each
|
||||
# configuration file.
|
||||
config_path = self.config_stage_dir
|
||||
if basename.endswith(".yaml"):
|
||||
config_path = os.path.join(config_path, basename)
|
||||
else:
|
||||
staged_path = spack.config.fetch_remote_configs(
|
||||
config_path, str(self.config_stage_dir), skip_existing=True
|
||||
)
|
||||
if not staged_path:
|
||||
raise SpackEnvironmentError(
|
||||
"Unable to fetch remote configuration {0}".format(config_path)
|
||||
)
|
||||
config_path = staged_path
|
||||
|
||||
elif include_url.scheme:
|
||||
raise ValueError(
|
||||
f"Unsupported URL scheme ({include_url.scheme}) for "
|
||||
f"environment include: {config_path}"
|
||||
)
|
||||
|
||||
# treat relative paths as relative to the environment
|
||||
if not os.path.isabs(config_path):
|
||||
config_path = os.path.join(self.manifest_dir, config_path)
|
||||
config_path = os.path.normpath(os.path.realpath(config_path))
|
||||
|
||||
if os.path.isdir(config_path):
|
||||
# directories are treated as regular ConfigScopes
|
||||
config_name = f"env:{self.name}:{os.path.basename(config_path)}"
|
||||
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
|
||||
scopes.append(spack.config.DirectoryConfigScope(config_name, config_path))
|
||||
elif os.path.exists(config_path):
|
||||
# files are assumed to be SingleFileScopes
|
||||
config_name = f"env:{self.name}:{config_path}"
|
||||
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
|
||||
scopes.append(
|
||||
spack.config.SingleFileScope(
|
||||
config_name, config_path, spack.schema.merged.schema
|
||||
)
|
||||
)
|
||||
else:
|
||||
missing.append(config_path)
|
||||
continue
|
||||
|
||||
if missing:
|
||||
msg = "Detected {0} missing include path(s):".format(len(missing))
|
||||
msg += "\n {0}".format("\n ".join(missing))
|
||||
raise spack.config.ConfigFileError(msg)
|
||||
|
||||
return scopes
|
||||
|
||||
@property
|
||||
def env_config_scopes(self) -> List[spack.config.ConfigScope]:
|
||||
"""A list of all configuration scopes for the environment manifest. On the first call this
|
||||
instantiates all the scopes, on subsequent calls it returns the cached list."""
|
||||
if self._config_scopes is not None:
|
||||
return self._config_scopes
|
||||
|
||||
scopes: List[spack.config.ConfigScope] = [
|
||||
*self.included_config_scopes,
|
||||
spack.config.SingleFileScope(
|
||||
self.scope_name,
|
||||
str(self.manifest_file),
|
||||
spack.schema.env.schema,
|
||||
yaml_path=[TOP_LEVEL_KEY],
|
||||
),
|
||||
)
|
||||
]
|
||||
ensure_no_disallowed_env_config_mods(scopes)
|
||||
self._config_scopes = scopes
|
||||
|
@ -21,7 +21,6 @@
|
||||
from llnl.util.lang import nullcontext
|
||||
from llnl.util.tty.color import colorize
|
||||
|
||||
import spack.build_environment
|
||||
import spack.config
|
||||
import spack.error
|
||||
import spack.package_base
|
||||
@ -398,7 +397,7 @@ def stand_alone_tests(self, kwargs):
|
||||
Args:
|
||||
kwargs (dict): arguments to be used by the test process
|
||||
"""
|
||||
import spack.build_environment
|
||||
import spack.build_environment # avoid circular dependency
|
||||
|
||||
spack.build_environment.start_build_process(self.pkg, test_process, kwargs)
|
||||
|
||||
@ -463,6 +462,8 @@ def write_tested_status(self):
|
||||
|
||||
@contextlib.contextmanager
|
||||
def test_part(pkg: Pb, test_name: str, purpose: str, work_dir: str = ".", verbose: bool = False):
|
||||
import spack.build_environment # avoid circular dependency
|
||||
|
||||
wdir = "." if work_dir is None else work_dir
|
||||
tester = pkg.tester
|
||||
assert test_name and test_name.startswith(
|
||||
|
@ -29,11 +29,7 @@
|
||||
# merged configuration scope schemas
|
||||
spack.schema.merged.properties,
|
||||
# extra environment schema properties
|
||||
{
|
||||
"include": {"type": "array", "default": [], "items": {"type": "string"}},
|
||||
"specs": spec_list_schema,
|
||||
"include_concrete": include_concrete,
|
||||
},
|
||||
{"specs": spec_list_schema, "include_concrete": include_concrete},
|
||||
),
|
||||
}
|
||||
}
|
||||
|
41
lib/spack/spack/schema/include.py
Normal file
@ -0,0 +1,41 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for include.yaml configuration file.

.. literalinclude:: _spack_root/lib/spack/spack/schema/include.py
   :lines: 12-
"""
from typing import Any, Dict

#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
    "include": {
        "type": "array",
        "default": [],
        "additionalProperties": False,
        "items": {
            "anyOf": [
                {
                    "type": "object",
                    "properties": {
                        "when": {"type": "string"},
                        "path": {"type": "string"},
                        "sha256": {"type": "string"},
                        "optional": {"type": "boolean"},
                    },
                    "required": ["path"],
                    "additionalProperties": False,
                },
                {"type": "string"},
            ]
        },
    }
}

#: Full schema with metadata
schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Spack include configuration file schema",
    "properties": properties,
}
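# Illustrative sketch (not part of this commit): the schema above can be
# exercised directly with jsonschema, which Spack already uses for config
# validation. The data below is hypothetical.
#
#   import jsonschema
#   import spack.schema.include
#
#   data = {"include": [{"path": "/path/to/config-dir", "when": 'os == "ventura"'}]}
#   jsonschema.validate(data, spack.schema.include.schema)  # raises on malformed entries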
@ -21,6 +21,7 @@
|
||||
import spack.schema.definitions
|
||||
import spack.schema.develop
|
||||
import spack.schema.env_vars
|
||||
import spack.schema.include
|
||||
import spack.schema.mirrors
|
||||
import spack.schema.modules
|
||||
import spack.schema.packages
|
||||
@ -40,6 +41,7 @@
|
||||
spack.schema.definitions.properties,
|
||||
spack.schema.develop.properties,
|
||||
spack.schema.env_vars.properties,
|
||||
spack.schema.include.properties,
|
||||
spack.schema.mirrors.properties,
|
||||
spack.schema.modules.properties,
|
||||
spack.schema.packages.properties,
|
||||
@ -48,7 +50,6 @@
|
||||
spack.schema.view.properties,
|
||||
)
|
||||
|
||||
|
||||
#: Full schema with metadata
|
||||
schema = {
|
||||
"$schema": "http://json-schema.org/draft-07/schema#",
|
||||
|
@ -648,10 +648,9 @@ class ConcretizationCache:
|
||||
"""
|
||||
|
||||
def __init__(self, root: Union[str, None] = None):
|
||||
if not root:
|
||||
root = spack.config.get(
|
||||
"config:concretization_cache:url", spack.paths.default_conc_cache_path
|
||||
)
|
||||
root = root or spack.config.get(
|
||||
"config:concretization_cache:url", spack.paths.default_conc_cache_path
|
||||
)
|
||||
self.root = pathlib.Path(spack.util.path.canonicalize_path(root))
|
||||
self._fc = FileCache(self.root)
|
||||
self._cache_manifest = ".cache_manifest"
|
||||
|
@ -5130,6 +5130,13 @@ def get_host_environment() -> Dict[str, Any]:
|
||||
}
|
||||
|
||||
|
||||
def eval_conditional(string):
|
||||
"""Evaluate conditional definitions using restricted variable scope."""
|
||||
valid_variables = get_host_environment()
|
||||
valid_variables.update({"re": re, "env": os.environ})
|
||||
return eval(string, valid_variables)
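# Illustrative sketch (not part of this commit): the restricted scope exposes
# host description variables (e.g. ``os``, ``target``, ``platform``) plus
# ``re`` and ``env`` (os.environ), so include/definition conditions evaluate
# to plain booleans, for example:
#
#   spack.spec.eval_conditional('os == "ventura"')            # True only on macOS Ventura
#   spack.spec.eval_conditional('env.get("CI") is not None')  # True only when CI is set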
|
||||
|
||||
|
||||
class SpecParseError(spack.error.SpecError):
|
||||
"""Wrapper for ParseError for when we're parsing specs."""
|
||||
|
||||
|
@ -200,7 +200,11 @@ def dummy_prefix(tmpdir):
|
||||
@pytest.mark.requires_executables(*required_executables)
|
||||
@pytest.mark.maybeslow
|
||||
@pytest.mark.usefixtures(
|
||||
"default_config", "cache_directory", "install_dir_default_layout", "temporary_mirror"
|
||||
"default_config",
|
||||
"cache_directory",
|
||||
"install_dir_default_layout",
|
||||
"temporary_mirror",
|
||||
"mutable_mock_env_path",
|
||||
)
|
||||
def test_default_rpaths_create_install_default_layout(temporary_mirror_dir):
|
||||
"""
|
||||
@ -272,7 +276,11 @@ def test_default_rpaths_install_nondefault_layout(temporary_mirror_dir):
|
||||
@pytest.mark.maybeslow
|
||||
@pytest.mark.nomockstage
|
||||
@pytest.mark.usefixtures(
|
||||
"default_config", "cache_directory", "install_dir_default_layout", "temporary_mirror"
|
||||
"default_config",
|
||||
"cache_directory",
|
||||
"install_dir_default_layout",
|
||||
"temporary_mirror",
|
||||
"mutable_mock_env_path",
|
||||
)
|
||||
def test_relative_rpaths_install_default_layout(temporary_mirror_dir):
|
||||
"""
|
||||
|
@ -5,6 +5,7 @@
|
||||
import filecmp
|
||||
import os
|
||||
import shutil
|
||||
import textwrap
|
||||
|
||||
import pytest
|
||||
|
||||
@ -259,15 +260,25 @@ def test_update_completion_arg(shell, tmpdir, monkeypatch):
|
||||
def test_updated_completion_scripts(shell, tmpdir):
|
||||
"""Make sure our shell tab completion scripts remain up-to-date."""
|
||||
|
||||
msg = (
|
||||
width = 72
|
||||
lines = textwrap.wrap(
|
||||
"It looks like Spack's command-line interface has been modified. "
|
||||
"Please update Spack's shell tab completion scripts by running:\n\n"
|
||||
" spack commands --update-completion\n\n"
|
||||
"and adding the changed files to your pull request."
|
||||
"If differences are more than your global 'include:' scopes, please "
|
||||
"update Spack's shell tab completion scripts by running:",
|
||||
width,
|
||||
)
|
||||
lines.append("\n spack commands --update-completion\n")
|
||||
lines.extend(
|
||||
textwrap.wrap(
|
||||
"and adding the changed files (minus your global 'include:' scopes) "
|
||||
"to your pull request.",
|
||||
width,
|
||||
)
|
||||
)
|
||||
msg = "\n".join(lines)
|
||||
|
||||
header = os.path.join(spack.paths.share_path, shell, f"spack-completion.{shell}")
|
||||
script = "spack-completion.{0}".format(shell)
|
||||
script = f"spack-completion.{shell}"
|
||||
old_script = os.path.join(spack.paths.share_path, script)
|
||||
new_script = str(tmpdir.join(script))
|
||||
|
||||
|
@ -213,7 +213,7 @@ def test_config_add_update_dict(mutable_empty_config):
|
||||
|
||||
def test_config_with_c_argument(mutable_empty_config):
|
||||
# I don't know how to add a spack argument to a Spack Command, so we test this way
|
||||
config_file = "config:install_root:root:/path/to/config.yaml"
|
||||
config_file = "config:install_tree:root:/path/to/config.yaml"
|
||||
parser = spack.main.make_argument_parser()
|
||||
args = parser.parse_args(["-c", config_file])
|
||||
assert config_file in args.config_vars
|
||||
@ -221,7 +221,7 @@ def test_config_with_c_argument(mutable_empty_config):
|
||||
# Add the path to the config
|
||||
config("add", args.config_vars[0], scope="command_line")
|
||||
output = config("get", "config")
|
||||
assert "config:\n install_root:\n root: /path/to/config.yaml" in output
|
||||
assert "config:\n install_tree:\n root: /path/to/config.yaml" in output
|
||||
|
||||
|
||||
def test_config_add_ordered_dict(mutable_empty_config):
|
||||
|
@ -15,6 +15,9 @@
|
||||
deprecate = SpackCommand("deprecate")
|
||||
find = SpackCommand("find")
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
pytestmark = pytest.mark.usefixtures("mutable_mock_env_path")
|
||||
|
||||
|
||||
def test_deprecate(mock_packages, mock_archive, mock_fetch, install_mockery):
|
||||
install("--fake", "libelf@0.8.13")
|
||||
|
@ -1067,13 +1067,17 @@ def test_init_from_yaml_relative_includes(tmp_path):
|
||||
assert os.path.exists(os.path.join(e2.path, f))
|
||||
|
||||
|
||||
# TODO: Should we be supporting relative path rewrites when creating new env from existing?
|
||||
# TODO: If so, then this should confirm that the absolute include paths in the new env exist.
|
||||
def test_init_from_yaml_relative_includes_outside_env(tmp_path):
|
||||
files = ["../outside_env_not_copied/repos.yaml"]
|
||||
"""Ensure relative includes to files outside the environment fail."""
|
||||
files = ["../outside_env/repos.yaml"]
|
||||
|
||||
manifest = f"""
|
||||
spack:
|
||||
specs: []
|
||||
include: {files}
|
||||
include:
|
||||
- path: {files[0]}
|
||||
"""
|
||||
|
||||
# subdir to ensure parent of environment dir is not shared
|
||||
@ -1086,7 +1090,7 @@ def test_init_from_yaml_relative_includes_outside_env(tmp_path):
|
||||
for f in files:
|
||||
fs.touchp(e1_path / f)
|
||||
|
||||
with pytest.raises(spack.config.ConfigFileError, match="Detected 1 missing include"):
|
||||
with pytest.raises(ValueError, match="does not exist"):
|
||||
_ = _env_create("test2", init_file=e1_manifest)
|
||||
|
||||
|
||||
@ -1186,14 +1190,14 @@ def test_env_with_config(environment_from_manifest):
|
||||
|
||||
|
||||
def test_with_config_bad_include_create(environment_from_manifest):
|
||||
"""Confirm missing include paths raise expected exception and error."""
|
||||
with pytest.raises(spack.config.ConfigFileError, match="2 missing include path"):
|
||||
"""Confirm missing required include raises expected exception."""
|
||||
err = "does not exist"
|
||||
with pytest.raises(ValueError, match=err):
|
||||
environment_from_manifest(
|
||||
"""
|
||||
spack:
|
||||
include:
|
||||
- /no/such/directory
|
||||
- no/such/file.yaml
|
||||
"""
|
||||
)
|
||||
|
||||
@ -1203,34 +1207,25 @@ def test_with_config_bad_include_activate(environment_from_manifest, tmpdir):
|
||||
include1 = env_root / "include1.yaml"
|
||||
include1.touch()
|
||||
|
||||
abs_include_path = os.path.abspath(tmpdir.join("subdir").ensure("include2.yaml"))
|
||||
|
||||
spack_yaml = env_root / ev.manifest_name
|
||||
spack_yaml.write_text(
|
||||
f"""
|
||||
"""
|
||||
spack:
|
||||
include:
|
||||
- ./include1.yaml
|
||||
- {abs_include_path}
|
||||
"""
|
||||
)
|
||||
|
||||
with ev.Environment(env_root) as e:
|
||||
e.concretize()
|
||||
|
||||
# we've created an environment with some included config files (which do
|
||||
# in fact exist): now we remove them and check that we get a sensible
|
||||
# error message
|
||||
# We've created an environment with included config file (which does
|
||||
# exist). Now we remove it and check that we get a sensible error.
|
||||
|
||||
os.remove(abs_include_path)
|
||||
os.remove(include1)
|
||||
with pytest.raises(spack.config.ConfigFileError) as exc:
|
||||
with pytest.raises(ValueError, match="does not exist"):
|
||||
ev.activate(ev.Environment(env_root))
|
||||
|
||||
err = exc.value.message
|
||||
assert "missing include" in err
|
||||
assert abs_include_path in err
|
||||
assert "include1.yaml" in err
|
||||
assert ev.active_environment() is None
|
||||
|
||||
|
||||
@ -1338,8 +1333,10 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
|
||||
included file scope.
|
||||
"""
|
||||
|
||||
env_path = tmp_path / "test_config"
|
||||
fs.mkdirp(env_path)
|
||||
included_file = "included-packages.yaml"
|
||||
included_path = tmp_path / included_file
|
||||
included_path = env_path / included_file
|
||||
with open(included_path, "w", encoding="utf-8") as f:
|
||||
f.write(
|
||||
"""\
|
||||
@ -1355,7 +1352,7 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
|
||||
"""
|
||||
)
|
||||
|
||||
spack_yaml = tmp_path / ev.manifest_name
|
||||
spack_yaml = env_path / ev.manifest_name
|
||||
spack_yaml.write_text(
|
||||
f"""\
|
||||
spack:
|
||||
@ -1369,7 +1366,8 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
|
||||
"""
|
||||
)
|
||||
|
||||
e = ev.Environment(tmp_path)
|
||||
mutable_config.set("config:misc_cache", str(tmp_path / "cache"))
|
||||
e = ev.Environment(env_path)
|
||||
with e:
|
||||
# List of requirements, flip a variant
|
||||
config("change", "packages:mpich:require:~debug")
|
||||
@ -1459,19 +1457,6 @@ def test_env_with_included_config_file_url(tmpdir, mutable_empty_config, package
|
||||
assert cfg["mpileaks"]["version"] == ["2.2"]
|
||||
|
||||
|
||||
def test_env_with_included_config_missing_file(tmpdir, mutable_empty_config):
|
||||
"""Test inclusion of a missing configuration file raises FetchError
|
||||
noting missing file."""
|
||||
|
||||
spack_yaml = tmpdir.join("spack.yaml")
|
||||
missing_file = tmpdir.join("packages.yaml")
|
||||
with spack_yaml.open("w") as f:
|
||||
f.write("spack:\n include:\n - {0}\n".format(missing_file.strpath))
|
||||
|
||||
with pytest.raises(spack.error.ConfigError, match="missing include path"):
|
||||
ev.Environment(tmpdir.strpath)
|
||||
|
||||
|
||||
def test_env_with_included_config_scope(mutable_mock_env_path, packages_file):
|
||||
"""Test inclusion of a package file from the environment's configuration
|
||||
stage directory. This test is intended to represent a case where a remote
|
||||
@ -1566,7 +1551,7 @@ def test_env_with_included_config_precedence(tmp_path):
|
||||
|
||||
|
||||
def test_env_with_included_configs_precedence(tmp_path):
|
||||
"""Test precendence of multiple included configuration files."""
|
||||
"""Test precedence of multiple included configuration files."""
|
||||
file1 = "high-config.yaml"
|
||||
file2 = "low-config.yaml"
|
||||
|
||||
@ -4277,21 +4262,31 @@ def test_unify_when_possible_works_around_conflicts():
|
||||
assert len([x for x in e.all_specs() if x.satisfies("mpich")]) == 1
|
||||
|
||||
|
||||
# Using mock_include_cache to ensure the "remote" file is cached in a temporary
|
||||
# location and not polluting the user cache.
|
||||
def test_env_include_packages_url(
|
||||
tmpdir, mutable_empty_config, mock_spider_configs, mock_curl_configs
|
||||
tmpdir, mutable_empty_config, mock_fetch_url_text, mock_curl_configs, mock_include_cache
|
||||
):
|
||||
"""Test inclusion of a (GitHub) URL."""
|
||||
develop_url = "https://github.com/fake/fake/blob/develop/"
|
||||
default_packages = develop_url + "etc/fake/defaults/packages.yaml"
|
||||
sha256 = "a422e35b3a18869d0611a4137b37314131749ecdc070a7cd7183f488da81201a"
|
||||
spack_yaml = tmpdir.join("spack.yaml")
|
||||
with spack_yaml.open("w") as f:
|
||||
f.write("spack:\n include:\n - {0}\n".format(default_packages))
|
||||
assert os.path.isfile(spack_yaml.strpath)
|
||||
f.write(
|
||||
f"""\
|
||||
spack:
|
||||
include:
|
||||
- path: {default_packages}
|
||||
sha256: {sha256}
|
||||
"""
|
||||
)
|
||||
|
||||
with spack.config.override("config:url_fetch_method", "curl"):
|
||||
env = ev.Environment(tmpdir.strpath)
|
||||
ev.activate(env)
|
||||
|
||||
# Make sure a setting from test/data/config/packages.yaml is present
|
||||
cfg = spack.config.get("packages")
|
||||
assert "mpich" in cfg["all"]["providers"]["mpi"]
|
||||
|
||||
@ -4360,7 +4355,7 @@ def test_env_view_disabled(tmp_path, mutable_mock_env_path):
|
||||
|
||||
|
||||
@pytest.mark.parametrize("first", ["false", "true", "custom"])
|
||||
def test_env_include_mixed_views(tmp_path, mutable_mock_env_path, mutable_config, first):
|
||||
def test_env_include_mixed_views(tmp_path, mutable_config, mutable_mock_env_path, first):
|
||||
"""Ensure including path and boolean views in different combinations result
|
||||
in the creation of only the first view if it is not disabled."""
|
||||
false_yaml = tmp_path / "false-view.yaml"
|
||||
|
@ -718,10 +718,11 @@ def test_install_deps_then_package(tmpdir, mock_fetch, install_mockery):
|
||||
assert os.path.exists(root.prefix)
|
||||
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
@pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
|
||||
@pytest.mark.regression("12002")
|
||||
def test_install_only_dependencies_in_env(
|
||||
tmpdir, mock_fetch, install_mockery, mutable_mock_env_path
|
||||
tmpdir, mutable_mock_env_path, mock_fetch, install_mockery
|
||||
):
|
||||
env("create", "test")
|
||||
|
||||
@ -735,9 +736,10 @@ def test_install_only_dependencies_in_env(
|
||||
assert not os.path.exists(root.prefix)
|
||||
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
@pytest.mark.regression("12002")
|
||||
def test_install_only_dependencies_of_all_in_env(
|
||||
tmpdir, mock_fetch, install_mockery, mutable_mock_env_path
|
||||
tmpdir, mutable_mock_env_path, mock_fetch, install_mockery
|
||||
):
|
||||
env("create", "--without-view", "test")
|
||||
|
||||
@ -757,7 +759,8 @@ def test_install_only_dependencies_of_all_in_env(
|
||||
assert os.path.exists(dep.prefix)
|
||||
|
||||
|
||||
def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock_env_path):
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
def test_install_no_add_in_env(tmpdir, mutable_mock_env_path, mock_fetch, install_mockery):
|
||||
# To test behavior of --add option, we create the following environment:
|
||||
#
|
||||
# mpileaks
|
||||
@ -932,9 +935,10 @@ def test_install_fails_no_args_suggests_env_activation(tmpdir):
|
||||
assert "using the `spack.yaml` in this directory" in output
|
||||
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
@pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
|
||||
def test_install_env_with_tests_all(
|
||||
tmpdir, mock_packages, mock_fetch, install_mockery, mutable_mock_env_path
|
||||
tmpdir, mutable_mock_env_path, mock_packages, mock_fetch, install_mockery
|
||||
):
|
||||
env("create", "test")
|
||||
with ev.read("test"):
|
||||
@ -944,9 +948,10 @@ def test_install_env_with_tests_all(
|
||||
assert os.path.exists(test_dep.prefix)
|
||||
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
@pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
|
||||
def test_install_env_with_tests_root(
|
||||
tmpdir, mock_packages, mock_fetch, install_mockery, mutable_mock_env_path
|
||||
tmpdir, mutable_mock_env_path, mock_packages, mock_fetch, install_mockery
|
||||
):
|
||||
env("create", "test")
|
||||
with ev.read("test"):
|
||||
@ -956,9 +961,10 @@ def test_install_env_with_tests_root(
|
||||
assert not os.path.exists(test_dep.prefix)
|
||||
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
@pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
|
||||
def test_install_empty_env(
|
||||
tmpdir, mock_packages, mock_fetch, install_mockery, mutable_mock_env_path
|
||||
tmpdir, mutable_mock_env_path, mock_packages, mock_fetch, install_mockery
|
||||
):
|
||||
env_name = "empty"
|
||||
env("create", env_name)
|
||||
@ -994,9 +1000,17 @@ def test_installation_fail_tests(install_mockery, mock_fetch, name, method):
|
||||
assert "See test log for details" in output
|
||||
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
@pytest.mark.not_on_windows("Buildcache not supported on windows")
|
||||
def test_install_use_buildcache(
|
||||
capsys, mock_packages, mock_fetch, mock_archive, mock_binary_index, tmpdir, install_mockery
|
||||
capsys,
|
||||
mutable_mock_env_path,
|
||||
mock_packages,
|
||||
mock_fetch,
|
||||
mock_archive,
|
||||
mock_binary_index,
|
||||
tmpdir,
|
||||
install_mockery,
|
||||
):
|
||||
"""
|
||||
Make sure installing with use-buildcache behaves correctly.
|
||||
|
@ -12,6 +12,9 @@
|
||||
install = SpackCommand("install")
|
||||
uninstall = SpackCommand("uninstall")
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
pytestmark = pytest.mark.usefixtures("mutable_mock_env_path")
|
||||
|
||||
|
||||
@pytest.mark.db
|
||||
def test_mark_mode_required(mutable_database):
|
||||
|
@ -38,8 +38,9 @@ def test_regression_8083(tmpdir, capfd, mock_packages, mock_fetch, config):
|
||||
assert "as it is an external spec" in output
|
||||
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
@pytest.mark.regression("12345")
|
||||
def test_mirror_from_env(tmp_path, mock_packages, mock_fetch, mutable_mock_env_path):
|
||||
def test_mirror_from_env(mutable_mock_env_path, tmp_path, mock_packages, mock_fetch):
|
||||
mirror_dir = str(tmp_path / "mirror")
|
||||
env_name = "test"
|
||||
|
||||
@ -342,8 +343,16 @@ def test_mirror_name_collision(mutable_config):
|
||||
mirror("add", "first", "1")
|
||||
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
def test_mirror_destroy(
|
||||
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch, tmpdir
|
||||
mutable_mock_env_path,
|
||||
install_mockery,
|
||||
mock_packages,
|
||||
mock_fetch,
|
||||
mock_archive,
|
||||
mutable_config,
|
||||
monkeypatch,
|
||||
tmpdir,
|
||||
):
|
||||
# Create a temp mirror directory for buildcache usage
|
||||
mirror_dir = tmpdir.join("mirror_dir")
|
||||
|
@ -13,7 +13,10 @@
|
||||
import spack.store
|
||||
from spack.main import SpackCommand, SpackCommandError
|
||||
|
||||
pytestmark = pytest.mark.usefixtures("mutable_config", "mutable_mock_repo")
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
pytestmark = pytest.mark.usefixtures(
|
||||
"mutable_mock_env_path", "mutable_config", "mutable_mock_repo"
|
||||
)
|
||||
|
||||
spec = SpackCommand("spec")
|
||||
|
||||
|
@ -16,6 +16,9 @@
|
||||
uninstall = SpackCommand("uninstall")
|
||||
install = SpackCommand("install")
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
pytestmark = pytest.mark.usefixtures("mutable_mock_env_path")
|
||||
|
||||
|
||||
class MockArgs:
|
||||
def __init__(self, packages, all=False, force=False, dependents=False):
|
||||
@ -220,9 +223,7 @@ class TestUninstallFromEnv:
|
||||
find = SpackCommand("find")
|
||||
|
||||
@pytest.fixture(scope="function")
|
||||
def environment_setup(
|
||||
self, mutable_mock_env_path, mock_packages, mutable_database, install_mockery
|
||||
):
|
||||
def environment_setup(self, mock_packages, mutable_database, install_mockery):
|
||||
TestUninstallFromEnv.env("create", "e1")
|
||||
e1 = spack.environment.read("e1")
|
||||
with e1:
|
||||
|
@ -11,8 +11,7 @@
|
||||
|
||||
import pytest
|
||||
|
||||
import llnl.util.tty as tty
|
||||
from llnl.util.filesystem import join_path, touch, touchp
|
||||
from llnl.util.filesystem import join_path, touch
|
||||
|
||||
import spack
|
||||
import spack.config
|
||||
@ -26,6 +25,7 @@
|
||||
import spack.schema.compilers
|
||||
import spack.schema.config
|
||||
import spack.schema.env
|
||||
import spack.schema.include
|
||||
import spack.schema.mirrors
|
||||
import spack.schema.repos
|
||||
import spack.spec
|
||||
@ -51,22 +51,9 @@
|
||||
|
||||
config_override_list = {"config": {"build_stage:": ["pathd", "pathe"]}}
|
||||
|
||||
config_merge_dict = {"config": {"info": {"a": 3, "b": 4}}}
|
||||
config_merge_dict = {"config": {"aliases": {"ls": "find", "dev": "develop"}}}
|
||||
|
||||
config_override_dict = {"config": {"info:": {"a": 7, "c": 9}}}
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def write_config_file(tmpdir):
|
||||
"""Returns a function that writes a config file."""
|
||||
|
||||
def _write(config, data, scope):
|
||||
config_yaml = tmpdir.join(scope, config + ".yaml")
|
||||
config_yaml.ensure()
|
||||
with config_yaml.open("w") as f:
|
||||
syaml.dump_config(data, f)
|
||||
|
||||
return _write
|
||||
config_override_dict = {"config": {"aliases:": {"be": "build-env", "deps": "dependencies"}}}
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
@ -1037,6 +1024,16 @@ def test_bad_config_yaml(tmpdir):
|
||||
)
|
||||
|
||||
|
||||
def test_bad_include_yaml(tmpdir):
|
||||
with pytest.raises(spack.config.ConfigFormatError, match="is not of type"):
|
||||
check_schema(
|
||||
spack.schema.include.schema,
|
||||
"""\
|
||||
include: $HOME/include.yaml
|
||||
""",
|
||||
)
|
||||
|
||||
|
||||
def test_bad_mirrors_yaml(tmpdir):
|
||||
with pytest.raises(spack.config.ConfigFormatError):
|
||||
check_schema(
|
||||
@ -1101,9 +1098,9 @@ def test_internal_config_section_override(mock_low_high_config, write_config_fil
|
||||
|
||||
def test_internal_config_dict_override(mock_low_high_config, write_config_file):
|
||||
write_config_file("config", config_merge_dict, "low")
|
||||
wanted_dict = config_override_dict["config"]["info:"]
|
||||
wanted_dict = config_override_dict["config"]["aliases:"]
|
||||
mock_low_high_config.push_scope(spack.config.InternalConfigScope("high", config_override_dict))
|
||||
assert mock_low_high_config.get("config:info") == wanted_dict
|
||||
assert mock_low_high_config.get("config:aliases") == wanted_dict
|
||||
|
||||
|
||||
def test_internal_config_list_override(mock_low_high_config, write_config_file):
|
||||
@ -1135,10 +1132,10 @@ def test_set_list_override(mock_low_high_config, write_config_file):
|
||||
|
||||
def test_set_dict_override(mock_low_high_config, write_config_file):
|
||||
write_config_file("config", config_merge_dict, "low")
|
||||
wanted_dict = config_override_dict["config"]["info:"]
|
||||
with spack.config.override("config:info:", wanted_dict):
|
||||
assert wanted_dict == mock_low_high_config.get("config:info")
|
||||
assert config_merge_dict["config"]["info"] == mock_low_high_config.get("config:info")
|
||||
wanted_dict = config_override_dict["config"]["aliases:"]
|
||||
with spack.config.override("config:aliases:", wanted_dict):
|
||||
assert wanted_dict == mock_low_high_config.get("config:aliases")
|
||||
assert config_merge_dict["config"]["aliases"] == mock_low_high_config.get("config:aliases")
|
||||
|
||||
|
||||
def test_set_bad_path(config):
|
||||
@ -1264,134 +1261,6 @@ def test_user_cache_path_is_default_when_env_var_is_empty(working_env):
|
||||
assert os.path.expanduser("~%s.spack" % os.sep) == spack.paths._get_user_cache_path()
|
||||
|
||||
|
||||
github_url = "https://github.com/fake/fake/{0}/develop"
|
||||
gitlab_url = "https://gitlab.fake.io/user/repo/-/blob/config/defaults"
|
||||
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"url,isfile",
|
||||
[
|
||||
(github_url.format("tree"), False),
|
||||
("{0}/README.md".format(github_url.format("blob")), True),
|
||||
("{0}/etc/fake/defaults/packages.yaml".format(github_url.format("blob")), True),
|
||||
(gitlab_url, False),
|
||||
(None, False),
|
||||
],
|
||||
)
|
||||
def test_config_collect_urls(mutable_empty_config, mock_spider_configs, url, isfile):
|
||||
with spack.config.override("config:url_fetch_method", "curl"):
|
||||
urls = spack.config.collect_urls(url)
|
||||
if url:
|
||||
if isfile:
|
||||
expected = 1 if url.endswith(".yaml") else 0
|
||||
assert len(urls) == expected
|
||||
else:
|
||||
# Expect multiple configuration files for a "directory"
|
||||
assert len(urls) > 1
|
||||
else:
|
||||
assert not urls
|
||||
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"url,isfile,fail",
|
||||
[
|
||||
(github_url.format("tree"), False, False),
|
||||
(gitlab_url, False, False),
|
||||
("{0}/README.md".format(github_url.format("blob")), True, True),
|
||||
("{0}/compilers.yaml".format(gitlab_url), True, False),
|
||||
(None, False, True),
|
||||
],
|
||||
)
|
||||
def test_config_fetch_remote_configs(
|
||||
tmpdir, mutable_empty_config, mock_collect_urls, mock_curl_configs, url, isfile, fail
|
||||
):
|
||||
def _has_content(filename):
|
||||
# The first element of all configuration files for this test happen to
|
||||
# be the basename of the file so this check leverages that feature. If
|
||||
# that changes, then this check will need to change accordingly.
|
||||
element = "{0}:".format(os.path.splitext(os.path.basename(filename))[0])
|
||||
with open(filename, "r", encoding="utf-8") as fd:
|
||||
for line in fd:
|
||||
if element in line:
|
||||
return True
|
||||
tty.debug("Expected {0} in '{1}'".format(element, filename))
|
||||
return False
|
||||
|
||||
dest_dir = join_path(tmpdir.strpath, "defaults")
|
||||
if fail:
|
||||
msg = "Cannot retrieve configuration"
|
||||
with spack.config.override("config:url_fetch_method", "curl"):
|
||||
with pytest.raises(spack.config.ConfigFileError, match=msg):
|
||||
spack.config.fetch_remote_configs(url, dest_dir)
|
||||
else:
|
||||
with spack.config.override("config:url_fetch_method", "curl"):
|
||||
path = spack.config.fetch_remote_configs(url, dest_dir)
|
||||
assert os.path.exists(path)
|
||||
if isfile:
|
||||
# Ensure correct file is "fetched"
|
||||
assert os.path.basename(path) == os.path.basename(url)
|
||||
# Ensure contents of the file has expected config element
|
||||
assert _has_content(path)
|
||||
else:
|
||||
for filename in os.listdir(path):
|
||||
assert _has_content(join_path(path, filename))
|
||||
|
||||
|
||||
@pytest.fixture(scope="function")
|
||||
def mock_collect_urls(mock_config_data, monkeypatch):
|
||||
"""Mock the collection of URLs to avoid mocking spider."""
|
||||
|
||||
_, config_files = mock_config_data
|
||||
|
||||
def _collect(base_url):
|
||||
if not base_url:
|
||||
return []
|
||||
|
||||
ext = os.path.splitext(base_url)[1]
|
||||
if ext:
|
||||
return [base_url] if ext == ".yaml" else []
|
||||
|
||||
return [join_path(base_url, f) for f in config_files]
|
||||
|
||||
monkeypatch.setattr(spack.config, "collect_urls", _collect)
|
||||
|
||||
yield
|
||||
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"url,skip",
|
||||
[(github_url.format("tree"), True), ("{0}/compilers.yaml".format(gitlab_url), True)],
|
||||
)
|
||||
def test_config_fetch_remote_configs_skip(
|
||||
tmpdir, mutable_empty_config, mock_collect_urls, mock_curl_configs, url, skip
|
||||
):
|
||||
"""Ensure skip fetching remote config file if it already exists when
|
||||
required and not skipping if replacing it."""
|
||||
|
||||
def check_contents(filename, expected):
|
||||
with open(filename, "r", encoding="utf-8") as fd:
|
||||
lines = fd.readlines()
|
||||
if expected:
|
||||
assert lines[0] == "compilers:"
|
||||
else:
|
||||
assert not lines
|
||||
|
||||
dest_dir = join_path(tmpdir.strpath, "defaults")
|
||||
filename = "compilers.yaml"
|
||||
|
||||
# Create a stage directory with an empty configuration file
|
||||
path = join_path(dest_dir, filename)
|
||||
touchp(path)
|
||||
|
||||
# Do NOT replace the existing cached configuration file if skipping
|
||||
expected = None if skip else "compilers:"
|
||||
|
||||
with spack.config.override("config:url_fetch_method", "curl"):
|
||||
path = spack.config.fetch_remote_configs(url, dest_dir, skip)
|
||||
result_filename = path if path.endswith(".yaml") else join_path(path, filename)
|
||||
check_contents(result_filename, expected)
|
||||
|
||||
|
||||
def test_config_file_dir_failure(tmpdir, mutable_empty_config):
|
||||
with pytest.raises(spack.config.ConfigFileError, match="not a file"):
|
||||
spack.config.read_config_file(tmpdir.strpath)
|
||||
|
@ -30,7 +30,15 @@
|
||||
import llnl.util.lang
|
||||
import llnl.util.lock
|
||||
import llnl.util.tty as tty
|
||||
from llnl.util.filesystem import copy_tree, mkdirp, remove_linked_tree, touchp, working_dir
|
||||
from llnl.util.filesystem import (
|
||||
copy,
|
||||
copy_tree,
|
||||
join_path,
|
||||
mkdirp,
|
||||
remove_linked_tree,
|
||||
touchp,
|
||||
working_dir,
|
||||
)
|
||||
|
||||
import spack.binary_distribution
|
||||
import spack.bootstrap.core
|
||||
@ -65,6 +73,7 @@
|
||||
from spack.installer import PackageInstaller
|
||||
from spack.main import SpackCommand
|
||||
from spack.util.pattern import Bunch
|
||||
from spack.util.remote_file_cache import raw_github_gitlab_url
|
||||
|
||||
from ..enums import ConfigScopePriority
|
||||
|
||||
@ -1906,35 +1915,21 @@ def __call__(self, *args, **kwargs):
|
||||
|
||||
|
||||
@pytest.fixture(scope="function")
|
||||
def mock_spider_configs(mock_config_data, monkeypatch):
|
||||
"""
|
||||
Mock retrieval of configuration file URLs from the web by grabbing
|
||||
them from the test data configuration directory.
|
||||
"""
|
||||
config_data_dir, config_files = mock_config_data
|
||||
def mock_fetch_url_text(tmpdir, mock_config_data, monkeypatch):
|
||||
"""Mock spack.util.web.fetch_url_text."""
|
||||
|
||||
def _spider(*args, **kwargs):
|
||||
root_urls = args[0]
|
||||
if not root_urls:
|
||||
return [], set()
|
||||
stage_dir, config_files = mock_config_data
|
||||
|
||||
root_urls = [root_urls] if isinstance(root_urls, str) else root_urls
|
||||
def _fetch_text_file(url, dest_dir):
|
||||
raw_url = raw_github_gitlab_url(url)
|
||||
mkdirp(dest_dir)
|
||||
basename = os.path.basename(raw_url)
|
||||
src = join_path(stage_dir, basename)
|
||||
dest = join_path(dest_dir, basename)
|
||||
copy(src, dest)
|
||||
return dest
|
||||
|
||||
# Any URL with an extension will be treated like a file; otherwise,
|
||||
# it is considered a directory/folder and we'll grab all available
|
||||
# files.
|
||||
urls = []
|
||||
for url in root_urls:
|
||||
if os.path.splitext(url)[1]:
|
||||
urls.append(url)
|
||||
else:
|
||||
urls.extend([os.path.join(url, f) for f in config_files])
|
||||
|
||||
return [], set(urls)
|
||||
|
||||
monkeypatch.setattr(spack.util.web, "spider", _spider)
|
||||
|
||||
yield
|
||||
monkeypatch.setattr(spack.util.web, "fetch_url_text", _fetch_text_file)
|
||||
|
||||
|
||||
@pytest.fixture(scope="function")
|
||||
@ -2197,3 +2192,27 @@ def info(self):
|
||||
@pytest.fixture()
|
||||
def mock_runtimes(config, mock_packages):
|
||||
return mock_packages.packages_with_tags("runtime")
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def write_config_file(tmpdir):
|
||||
"""Returns a function that writes a config file."""
|
||||
|
||||
def _write(config, data, scope):
|
||||
config_yaml = tmpdir.join(scope, config + ".yaml")
|
||||
config_yaml.ensure()
|
||||
with config_yaml.open("w") as f:
|
||||
syaml.dump_config(data, f)
|
||||
return config_yaml
|
||||
|
||||
return _write
|
||||
|
||||
|
||||
def _include_cache_root():
|
||||
return join_path(str(tempfile.mkdtemp()), "user_cache", "includes")
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def mock_include_cache(monkeypatch):
|
||||
"""Override the include cache directory so tests don't pollute user cache."""
|
||||
monkeypatch.setattr(spack.config, "_include_cache_location", _include_cache_root)
|
||||
|
@ -12,6 +12,7 @@
|
||||
|
||||
import spack.config
|
||||
import spack.environment as ev
|
||||
import spack.platforms
|
||||
import spack.solver.asp
|
||||
import spack.spec
|
||||
from spack.environment.environment import (
|
||||
@ -921,3 +922,50 @@ def test_environment_from_name_or_dir(mock_packages, mutable_mock_env_path, tmp_
|
||||
|
||||
with pytest.raises(ev.SpackEnvironmentError, match="no such environment"):
|
||||
_ = ev.environment_from_name_or_dir("fake-env")
|
||||
|
||||
|
||||
def test_env_include_configs(mutable_mock_env_path, mock_packages):
|
||||
"""check config and package values using new include schema"""
|
||||
env_path = mutable_mock_env_path
|
||||
env_path.mkdir()
|
||||
|
||||
this_os = spack.platforms.host().default_os
|
||||
config_root = env_path / this_os
|
||||
config_root.mkdir()
|
||||
config_path = str(config_root / "config.yaml")
|
||||
with open(config_path, "w", encoding="utf-8") as f:
|
||||
f.write(
|
||||
"""\
|
||||
config:
|
||||
verify_ssl: False
|
||||
"""
|
||||
)
|
||||
|
||||
packages_path = str(env_path / "packages.yaml")
|
||||
with open(packages_path, "w", encoding="utf-8") as f:
|
||||
f.write(
|
||||
"""\
|
||||
packages:
|
||||
python:
|
||||
require:
|
||||
- spec: "@3.11:"
|
||||
"""
|
||||
)
|
||||
|
||||
spack_yaml = env_path / ev.manifest_name
|
||||
spack_yaml.write_text(
|
||||
f"""\
|
||||
spack:
|
||||
include:
|
||||
- path: {config_path}
|
||||
optional: true
|
||||
- path: {packages_path}
|
||||
"""
|
||||
)
|
||||
|
||||
e = ev.Environment(env_path)
|
||||
with e.manifest.use_config():
|
||||
assert not spack.config.get("config:verify_ssl")
|
||||
python_reqs = spack.config.get("packages")["python"]["require"]
|
||||
req_specs = set(x["spec"] for x in python_reqs)
|
||||
assert req_specs == set(["@3.11:"])
|
||||
|
@ -680,13 +680,19 @@ def test_install_spliced_build_spec_installed(install_mockery, capfd, mock_fetch
|
||||
assert node.build_spec.installed
|
||||
|
||||
|
||||
# Unit tests should not be affected by the user's managed environments
|
||||
@pytest.mark.not_on_windows("lacking windows support for binary installs")
|
||||
@pytest.mark.parametrize("transitive", [True, False])
|
||||
@pytest.mark.parametrize(
|
||||
"root_str", ["splice-t^splice-h~foo", "splice-h~foo", "splice-vt^splice-a"]
|
||||
)
|
||||
def test_install_splice_root_from_binary(
|
||||
install_mockery, mock_fetch, mutable_temporary_mirror, transitive, root_str
|
||||
mutable_mock_env_path,
|
||||
install_mockery,
|
||||
mock_fetch,
|
||||
mutable_temporary_mirror,
|
||||
transitive,
|
||||
root_str,
|
||||
):
|
||||
"""Test installing a spliced spec with the root available in binary cache"""
|
||||
# Test splicing and rewiring a spec with the same name, different hash.
|
||||
|
@ -3,6 +3,9 @@
|
||||
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
||||
|
||||
|
||||
import os
|
||||
import os.path
|
||||
|
||||
import pytest
|
||||
|
||||
import llnl.util.filesystem as fs
|
||||
@ -13,8 +16,10 @@
|
||||
import spack.error
|
||||
import spack.main
|
||||
import spack.paths
|
||||
import spack.platforms
|
||||
import spack.util.executable as exe
|
||||
import spack.util.git
|
||||
import spack.util.spack_yaml as syaml
|
||||
|
||||
pytestmark = pytest.mark.not_on_windows(
|
||||
"Test functionality supported but tests are failing on Win"
|
||||
@ -167,3 +172,163 @@ def test_add_command_line_scope_env(tmp_path, mutable_mock_env_path):
|
||||
assert config.get("config:install_tree:root") == "/tmp/first"
|
||||
|
||||
assert ev.active_environment() is None # shouldn't cause an environment to be activated
|
||||
|
||||
|
||||
def test_include_cfg(mock_low_high_config, write_config_file, tmpdir):
|
||||
cfg1_path = str(tmpdir.join("include1.yaml"))
|
||||
with open(cfg1_path, "w", encoding="utf-8") as f:
|
||||
f.write(
|
||||
"""\
|
||||
config:
|
||||
verify_ssl: False
|
||||
dirty: True
|
||||
packages:
|
||||
python:
|
||||
require:
|
||||
- spec: "@3.11:"
|
||||
"""
|
||||
)
|
||||
|
||||
def python_cfg(_spec):
|
||||
return f"""\
|
||||
packages:
|
||||
python:
|
||||
require:
|
||||
- spec: {_spec}
|
||||
"""
|
||||
|
||||
def write_python_cfg(_spec, _cfg_name):
|
||||
cfg_path = str(tmpdir.join(_cfg_name))
|
||||
with open(cfg_path, "w", encoding="utf-8") as f:
|
||||
f.write(python_cfg(_spec))
|
||||
return cfg_path
|
||||
|
||||
# This config will not be included
|
||||
cfg2_path = write_python_cfg("+shared", "include2.yaml")
|
||||
|
||||
# The config will point to this using substitutable variables,
|
||||
# namely $os; we expect that Spack resolves these variables
|
||||
# into the actual path of the config
|
||||
this_os = spack.platforms.host().default_os
|
||||
cfg3_expanded_path = os.path.join(str(tmpdir), f"{this_os}", "include3.yaml")
|
||||
fs.mkdirp(os.path.dirname(cfg3_expanded_path))
|
||||
with open(cfg3_expanded_path, "w", encoding="utf-8") as f:
|
||||
f.write(python_cfg("+ssl"))
|
||||
cfg3_abstract_path = os.path.join(str(tmpdir), "$os", "include3.yaml")
|
||||
|
||||
# This will be included unconditionally
|
||||
cfg4_path = write_python_cfg("+tk", "include4.yaml")
|
||||
|
||||
# This config will not exist, and the config will explicitly
|
||||
# allow this
|
||||
cfg5_path = os.path.join(str(tmpdir), "non-existent.yaml")
|
||||
|
||||
include_entries = [
|
||||
{"path": f"{cfg1_path}", "when": f'os == "{this_os}"'},
|
||||
{"path": f"{cfg2_path}", "when": "False"},
|
||||
{"path": cfg3_abstract_path},
|
||||
cfg4_path,
|
||||
{"path": cfg5_path, "optional": True},
|
||||
]
|
||||
include_cfg = {"include": include_entries}
|
||||
filename = write_config_file("include", include_cfg, "low")
|
||||
|
||||
assert not spack.config.get("config:dirty")
|
||||
|
||||
spack.main.add_command_line_scopes(mock_low_high_config, [os.path.dirname(filename)])
|
||||
|
||||
assert spack.config.get("config:dirty")
|
||||
python_reqs = spack.config.get("packages")["python"]["require"]
|
||||
req_specs = set(x["spec"] for x in python_reqs)
|
||||
assert req_specs == set(["@3.11:", "+ssl", "+tk"])
|
||||
|
||||
|
||||
def test_include_duplicate_source(tmpdir, mutable_config):
|
||||
"""Check precedence when include.yaml files have the same path."""
|
||||
include_yaml = "debug.yaml"
|
||||
include_list = {"include": [f"./{include_yaml}"]}
|
||||
|
||||
system_filename = mutable_config.get_config_filename("system", "include")
|
||||
site_filename = mutable_config.get_config_filename("site", "include")
|
||||
|
||||
def write_configs(include_path, debug_data):
|
||||
fs.mkdirp(os.path.dirname(include_path))
|
||||
with open(include_path, "w", encoding="utf-8") as f:
|
||||
syaml.dump_config(include_list, f)
|
||||
|
||||
debug_path = fs.join_path(os.path.dirname(include_path), include_yaml)
|
||||
with open(debug_path, "w", encoding="utf-8") as f:
|
||||
syaml.dump_config(debug_data, f)
|
||||
|
||||
system_config = {"config": {"debug": False}}
|
||||
write_configs(system_filename, system_config)
|
||||
spack.main.add_command_line_scopes(mutable_config, [os.path.dirname(system_filename)])
|
||||
|
||||
site_config = {"config": {"debug": True}}
|
||||
write_configs(site_filename, site_config)
|
||||
spack.main.add_command_line_scopes(mutable_config, [os.path.dirname(site_filename)])
|
||||
|
||||
# Ensure takes the last value of the option pushed onto the stack
|
||||
assert mutable_config.get("config:debug") == site_config["config"]["debug"]
|
||||
|
||||
|
||||
def test_include_recurse_limit(tmpdir, mutable_config):
|
||||
"""Ensure hit the recursion limit."""
|
||||
include_yaml = "include.yaml"
|
||||
include_list = {"include": [f"./{include_yaml}"]}
|
||||
|
||||
include_path = str(tmpdir.join(include_yaml))
|
||||
with open(include_path, "w", encoding="utf-8") as f:
|
||||
syaml.dump_config(include_list, f)
|
||||
|
||||
with pytest.raises(spack.config.RecursiveIncludeError, match="recursion exceeded"):
|
||||
spack.main.add_command_line_scopes(mutable_config, [os.path.dirname(include_path)])
|
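The failing input here is minimal: the test writes an include.yaml that includes itself, which is what trips the recursion limit. A sketch of that file:

```
include:
- ./include.yaml
```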
||||
|
||||
|
||||
# TODO: Fix this once recursive includes are processed in the expected order.
|
||||
@pytest.mark.parametrize("child,expected", [("b", True), ("c", False)])
|
||||
def test_include_recurse_diamond(tmpdir, mutable_config, child, expected):
|
||||
"""Demonstrate include parent's value overrides that of child in diamond include.
|
||||
|
||||
Check that the value set by b or c overrides that set by d.
|
||||
"""
|
||||
configs_root = tmpdir.join("configs")
|
||||
configs_root.mkdir()
|
||||
|
||||
def write(path, contents):
|
||||
with open(path, "w", encoding="utf-8") as f:
|
||||
f.write(contents)
|
||||
|
||||
def debug_contents(value):
|
||||
return f"config:\n debug: {value}\n"
|
||||
|
||||
def include_contents(paths):
|
||||
indent = "\n - "
|
||||
values = indent.join([str(p) for p in paths])
|
||||
return f"include:{indent}{values}"
|
||||
|
||||
a_yaml = tmpdir.join("a.yaml")
|
||||
b_yaml = configs_root.join("b.yaml")
|
||||
c_yaml = configs_root.join("c.yaml")
|
||||
d_yaml = configs_root.join("d.yaml")
|
||||
debug_yaml = configs_root.join("enable_debug.yaml")
|
||||
|
||||
write(debug_yaml, debug_contents("true"))
|
||||
|
||||
a_contents = f"""\
|
||||
include:
|
||||
- {b_yaml}
|
||||
- {c_yaml}
|
||||
"""
|
||||
write(a_yaml, a_contents)
|
||||
write(d_yaml, debug_contents("false"))
|
||||
|
||||
write(b_yaml, include_contents([debug_yaml, d_yaml] if child == "b" else [d_yaml]))
|
||||
write(c_yaml, include_contents([debug_yaml, d_yaml] if child == "c" else [d_yaml]))
|
||||
|
||||
spack.main.add_command_line_scopes(mutable_config, [str(tmpdir)])
|
||||
|
||||
try:
|
||||
assert mutable_config.get("config:debug") is expected
|
||||
except AssertionError:
|
||||
pytest.xfail("recursive includes are not processed in the expected order")
|
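For orientation, the diamond this test builds for the child == "b" case looks roughly like the following (paths shortened; for child == "c" the enable_debug.yaml include moves to c.yaml instead, with enable_debug.yaml setting debug to true and d.yaml setting it to false):

```
# a.yaml (top of the diamond)
include:
- configs/b.yaml   # includes enable_debug.yaml and d.yaml
- configs/c.yaml   # includes only d.yaml
```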
||||
|
@ -84,6 +84,7 @@ def test_module_suffixes(module_suffixes_schema):
|
||||
"compilers",
|
||||
"config",
|
||||
"definitions",
|
||||
"include",
|
||||
"env",
|
||||
"merged",
|
||||
"mirrors",
|
||||
|
lib/spack/spack/test/util/remote_file_cache.py (new file, 97 lines)
@ -0,0 +1,97 @@
|
||||
# Copyright Spack Project Developers. See COPYRIGHT file for details.
|
||||
#
|
||||
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
||||
import os.path
|
||||
import sys
|
||||
|
||||
import pytest
|
||||
|
||||
import llnl.util.tty as tty
|
||||
from llnl.util.filesystem import join_path
|
||||
|
||||
import spack.config
|
||||
import spack.util.remote_file_cache as rfc_util
|
||||
|
||||
github_url = "https://github.com/fake/fake/{0}/develop"
|
||||
gitlab_url = "https://gitlab.fake.io/user/repo/-/blob/config/defaults"
|
||||
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"path,err",
|
||||
[
|
||||
("ssh://git@github.com:spack/", "Unsupported URL scheme"),
|
||||
("bad:///this/is/a/file/url/include.yaml", "Invalid URL scheme"),
|
||||
],
|
||||
)
|
||||
def test_rfc_local_path_bad_scheme(path, err):
|
||||
with pytest.raises(ValueError, match=err):
|
||||
_ = rfc_util.local_path(path, "")
|
||||
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"path", ["/a/b/c/d/e/config.py", "file:///this/is/a/file/url/include.yaml"]
|
||||
)
|
||||
def test_rfc_local_path_file(path):
|
||||
actual = path.split("://")[1] if ":" in path else path
|
||||
assert rfc_util.local_path(path, "") == os.path.normpath(actual)
|
||||
|
||||
|
||||
def test_rfc_remote_local_path_no_dest():
|
||||
path = f"{gitlab_url}/packages.yaml"
|
||||
with pytest.raises(ValueError, match="Requires the destination argument"):
|
||||
_ = rfc_util.local_path(path, "")
|
||||
|
||||
|
||||
compilers_sha256 = (
|
||||
"381732677538143a8f900406c0654f2730e2919a11740bdeaf35757ab3e1ef3e"
|
||||
if sys.platform == "win32"
|
||||
else "e91148ed5a0da7844e9f3f9cfce0fa60cce509461886bc3b006ee9eb711f69df"
|
||||
)
|
||||
|
||||
|
||||
@pytest.mark.parametrize(
|
||||
"url,sha256,err,msg",
|
||||
[
|
||||
(
|
||||
f"{join_path(github_url.format('tree'), 'config.yaml')}",
|
||||
"",
|
||||
ValueError,
|
||||
"Requires sha256",
|
||||
),
|
||||
(f"{gitlab_url}/compilers.yaml", compilers_sha256, None, ""),
|
||||
(f"{gitlab_url}/packages.yaml", "abcdef", ValueError, "does not match"),
|
||||
(f"{github_url.format('blob')}/README.md", "", OSError, "No such"),
|
||||
(github_url.format("tree"), "", OSError, "No such"),
|
||||
("", "", ValueError, "argument is required"),
|
||||
],
|
||||
)
|
||||
def test_rfc_remote_local_path(
|
||||
tmpdir, mutable_empty_config, mock_fetch_url_text, url, sha256, err, msg
|
||||
):
|
||||
def _has_content(filename):
|
||||
# The first element of all configuration files for this test happen to
|
||||
# be the basename of the file so this check leverages that feature. If
|
||||
# that changes, then this check will need to change accordingly.
|
||||
element = f"{os.path.splitext(os.path.basename(filename))[0]}:"
|
||||
with open(filename, "r", encoding="utf-8") as fd:
|
||||
for line in fd:
|
||||
if element in line:
|
||||
return True
|
||||
tty.debug(f"Expected {element} in '{filename}'")
|
||||
return False
|
||||
|
||||
def _dest_dir():
|
||||
return join_path(tmpdir.strpath, "cache")
|
||||
|
||||
if err is not None:
|
||||
with spack.config.override("config:url_fetch_method", "curl"):
|
||||
with pytest.raises(err, match=msg):
|
||||
rfc_util.local_path(url, sha256, _dest_dir)
|
||||
else:
|
||||
with spack.config.override("config:url_fetch_method", "curl"):
|
||||
path = rfc_util.local_path(url, sha256, _dest_dir)
|
||||
assert os.path.exists(path)
|
||||
# Ensure correct file is "fetched"
|
||||
assert os.path.basename(path) == os.path.basename(url)
|
||||
# Ensure the contents of the file contain the expected config element
|
||||
assert _has_content(path)
|
@ -14,6 +14,7 @@
|
||||
import sys
|
||||
import tempfile
|
||||
from datetime import date
|
||||
from typing import Optional
|
||||
|
||||
import llnl.util.tty as tty
|
||||
from llnl.util.lang import memoized
|
||||
@ -235,7 +236,7 @@ def add_padding(path, length):
|
||||
return os.path.join(path, padding)
|
||||
|
||||
|
||||
def canonicalize_path(path, default_wd=None):
|
||||
def canonicalize_path(path: str, default_wd: Optional[str] = None) -> str:
|
||||
"""Same as substitute_path_variables, but also take absolute path.
|
||||
|
||||
If the string is a yaml object with file annotations, make absolute paths
|
||||
@ -243,26 +244,39 @@ def canonicalize_path(path, default_wd=None):
|
||||
Otherwise, use ``default_wd`` if specified, otherwise ``os.getcwd()``
|
||||
|
||||
Arguments:
|
||||
path (str): path being converted as needed
|
||||
path: path being converted as needed
|
||||
|
||||
Returns:
|
||||
(str): An absolute path with path variable substitution
|
||||
Returns: An absolute path or non-file URL with path variable substitution
|
||||
"""
|
||||
import urllib.parse
|
||||
import urllib.request
|
||||
|
||||
# Get file in which path was written in case we need to make it absolute
|
||||
# relative to that path.
|
||||
filename = None
|
||||
if isinstance(path, syaml.syaml_str):
|
||||
filename = os.path.dirname(path._start_mark.name)
|
||||
assert path._start_mark.name == path._end_mark.name
|
||||
filename = os.path.dirname(path._start_mark.name) # type: ignore[attr-defined]
|
||||
assert path._start_mark.name == path._end_mark.name # type: ignore[attr-defined]
|
||||
|
||||
path = substitute_path_variables(path)
|
||||
|
||||
url = urllib.parse.urlparse(path)
|
||||
url_path = urllib.request.url2pathname(url.path)
|
||||
if url.scheme:
|
||||
if url.scheme != "file":
|
||||
# Have a remote URL so simply return it with substitutions
|
||||
return os.path.normpath(path)
|
||||
|
||||
# Drop the URL scheme from the local path
|
||||
path = url_path
|
||||
|
||||
if not os.path.isabs(path):
|
||||
if filename:
|
||||
path = os.path.join(filename, path)
|
||||
else:
|
||||
base = default_wd or os.getcwd()
|
||||
path = os.path.join(base, path)
|
||||
tty.debug("Using working directory %s as base for abspath" % base)
|
||||
tty.debug(f"Using working directory {base} as base for abspath")
|
||||
|
||||
return os.path.normpath(path)
|
||||
|
||||
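In short, the updated canonicalize_path distinguishes three cases: remote URLs are returned with variable substitution applied, file:// URLs are reduced to plain local paths, and bare paths are made absolute as before. A rough sketch of the intended behavior (the paths and values below are illustrative, not taken from the test suite):

```
from spack.util.path import canonicalize_path

# file:// URLs lose their scheme and come back as ordinary local paths.
canonicalize_path("file:///etc/spack/include.yaml")
# -> "/etc/spack/include.yaml"

# Relative paths are resolved against default_wd (or os.getcwd()).
canonicalize_path("configs/include.yaml", default_wd="/home/user")
# -> "/home/user/configs/include.yaml"

# Remote URLs are meant to be returned with substitutions ($os, $target,
# etc.) applied rather than being turned into filesystem paths.
canonicalize_path("https://example.com/$os/config.yaml")
```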
@ -347,6 +361,7 @@ def filter_padding():
|
||||
This is needed because Spack's debug output gets extremely long when we use a
|
||||
long padded installation path.
|
||||
"""
|
||||
# circular import
|
||||
import spack.config
|
||||
|
||||
padding = spack.config.get("config:install_tree:padded_length", None)
|
||||
|
lib/spack/spack/util/remote_file_cache.py (new file, 137 lines)
@ -0,0 +1,137 @@
|
||||
# Copyright Spack Project Developers. See COPYRIGHT file for details.
|
||||
#
|
||||
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
||||
|
||||
import hashlib
|
||||
import os.path
|
||||
import shutil
|
||||
import tempfile
|
||||
import urllib.parse
|
||||
import urllib.request
|
||||
from typing import Callable, Optional
|
||||
|
||||
import llnl.util.tty as tty
|
||||
from llnl.util.filesystem import copy, join_path, mkdirp
|
||||
|
||||
import spack.util.crypto
|
||||
from spack.util.path import canonicalize_path
|
||||
from spack.util.url import validate_scheme
|
||||
|
||||
|
||||
def raw_github_gitlab_url(url: str) -> str:
|
||||
"""Transform a github URL to the raw form to avoid undesirable html.
|
||||
|
||||
Args:
|
||||
url: url to be converted to raw form
|
||||
|
||||
Returns:
|
||||
Raw github/gitlab url or the original url
|
||||
"""
|
||||
# Note we rely on GitHub to redirect the 'raw' URL returned here to the
|
||||
# actual URL under https://raw.githubusercontent.com/ with '/blob'
|
||||
# removed and or, '/blame' if needed.
|
||||
if "github" in url or "gitlab" in url:
|
||||
return url.replace("/blob/", "/raw/")
|
||||
|
||||
return url
|
||||
|
||||
|
||||
def fetch_remote_text_file(url: str, dest_dir: str) -> str:
|
||||
"""Retrieve the text file from the url into the destination directory.
|
||||
|
||||
Arguments:
|
||||
url: URL for the remote text file
|
||||
dest_dir: destination directory in which to stage the file locally
|
||||
|
||||
Returns:
|
||||
Path to the fetched file
|
||||
|
||||
Raises:
|
||||
ValueError: if there are missing required arguments
|
||||
"""
|
||||
from spack.util.web import fetch_url_text # circular import
|
||||
|
||||
if not url:
|
||||
raise ValueError("Cannot retrieve the remote file without the URL")
|
||||
|
||||
raw_url = raw_github_gitlab_url(url)
|
||||
tty.debug(f"Fetching file from {raw_url} into {dest_dir}")
|
||||
|
||||
return fetch_url_text(raw_url, dest_dir=dest_dir)
|
||||
|
||||
|
||||
def local_path(raw_path: str, sha256: str, make_dest: Optional[Callable[[], str]] = None) -> str:
|
||||
"""Determine the actual path and, if remote, stage its contents locally.
|
||||
|
||||
Args:
|
||||
raw_path: raw path with possible variables needing substitution
|
||||
sha256: the expected sha256 for the file
|
||||
make_dest: function to create a stage for remote files, if needed (e.g., `mkdtemp`)
|
||||
|
||||
Returns: resolved, normalized local path or None
|
||||
|
||||
Raises:
|
||||
ValueError: missing or mismatched arguments, unsupported URL scheme
|
||||
"""
|
||||
if not raw_path:
|
||||
raise ValueError("path argument is required to cache remote files")
|
||||
|
||||
# Allow paths (and URLs) to contain spack config/environment variables,
|
||||
# etc.
|
||||
path = canonicalize_path(raw_path)
|
||||
url = urllib.parse.urlparse(path)
|
||||
|
||||
# Path isn't remote so return absolute, normalized path with substitutions.
|
||||
if url.scheme in ["", "file"]:
|
||||
return path
|
||||
|
||||
# If scheme is not valid, path is not a url
|
||||
# of a type Spack is generally aware
|
||||
if validate_scheme(url.scheme):
|
||||
# Fetch files from supported URL schemes.
|
||||
if url.scheme in ("http", "https", "ftp"):
|
||||
if make_dest is None:
|
||||
raise ValueError("Requires the destination argument to cache remote files")
|
||||
|
||||
# Stage the remote configuration file
|
||||
tmpdir = tempfile.mkdtemp()
|
||||
try:
|
||||
staged_path = fetch_remote_text_file(path, tmpdir)
|
||||
|
||||
# Ensure the sha256 is expected.
|
||||
checksum = spack.util.crypto.checksum(hashlib.sha256, staged_path)
|
||||
if sha256 and checksum != sha256:
|
||||
raise ValueError(
|
||||
f"Actual sha256 ('{checksum}') does not match expected ('{sha256}')"
|
||||
)
|
||||
|
||||
# Help the user by reporting the required checksum.
|
||||
if not sha256:
|
||||
raise ValueError(f"Requires sha256 ('{checksum}') to cache remote files.")
|
||||
|
||||
# Copy the file to the destination directory
|
||||
dest_dir = join_path(make_dest(), checksum)
|
||||
if not os.path.exists(dest_dir):
|
||||
mkdirp(dest_dir)
|
||||
|
||||
cache_path = join_path(dest_dir, os.path.basename(staged_path))
|
||||
copy(staged_path, cache_path)
|
||||
tty.debug(f"Cached {raw_path} in {cache_path}")
|
||||
|
||||
# Stash the associated URL to aid with debugging
|
||||
with open(join_path(dest_dir, "source_url.txt"), "w", encoding="utf-8") as f:
|
||||
f.write(f"{raw_path}\n")
|
||||
|
||||
return cache_path
|
||||
|
||||
except ValueError as err:
|
||||
tty.warn(f"Unable to cache {raw_path}: {str(err)}")
|
||||
raise
|
||||
|
||||
finally:
|
||||
shutil.rmtree(tmpdir)
|
||||
|
||||
raise ValueError(f"Unsupported URL scheme ({url.scheme}) in {raw_path}")
|
||||
|
||||
else:
|
||||
raise ValueError(f"Invalid URL scheme ({url.scheme}) in {raw_path}")
|
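Taken together, the new remote_file_cache module is what resolves an include entry to something on disk: raw_github_gitlab_url rewrites GitHub/GitLab "blob" URLs to their raw form, fetch_remote_text_file stages the remote file, and local_path either returns a local path directly or fetches the remote file, verifies it against the required sha256, and caches it under <make_dest()>/<sha256>/. A hedged usage sketch (the URL and checksum are placeholders, not real values):

```
import tempfile

import spack.util.remote_file_cache as rfc_util

# Local paths and file:// URLs resolve without fetching or a checksum.
cfg = rfc_util.local_path("file:///etc/spack/config.yaml", "")

# Remote files require a sha256 and a destination factory; the fetched
# file is checksummed and cached under <make_dest()>/<sha256>/.
cfg = rfc_util.local_path(
    "https://example.com/spack-configs/packages.yaml",  # placeholder URL
    "deadbeef" * 8,  # placeholder sha256, not a real checksum
    make_dest=tempfile.mkdtemp,
)
```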
@ -1203,19 +1203,19 @@ complete -c spack -n '__fish_spack_using_command config' -l scope -r -d 'configu
|
||||
|
||||
# spack config get
|
||||
set -g __fish_spack_optspecs_spack_config_get h/help
|
||||
complete -c spack -n '__fish_spack_using_command_pos 0 config get' -f -a 'bootstrap cdash ci compilers concretizer config definitions develop env_vars mirrors modules packages repos upstreams view'
|
||||
complete -c spack -n '__fish_spack_using_command_pos 0 config get' -f -a 'bootstrap cdash ci compilers concretizer config definitions develop env_vars include mirrors modules packages repos upstreams view'
|
||||
complete -c spack -n '__fish_spack_using_command config get' -s h -l help -f -a help
|
||||
complete -c spack -n '__fish_spack_using_command config get' -s h -l help -d 'show this help message and exit'
|
||||
|
||||
# spack config blame
|
||||
set -g __fish_spack_optspecs_spack_config_blame h/help
|
||||
complete -c spack -n '__fish_spack_using_command_pos 0 config blame' -f -a 'bootstrap cdash ci compilers concretizer config definitions develop env_vars mirrors modules packages repos upstreams view'
|
||||
complete -c spack -n '__fish_spack_using_command_pos 0 config blame' -f -a 'bootstrap cdash ci compilers concretizer config definitions develop env_vars include mirrors modules packages repos upstreams view'
|
||||
complete -c spack -n '__fish_spack_using_command config blame' -s h -l help -f -a help
|
||||
complete -c spack -n '__fish_spack_using_command config blame' -s h -l help -d 'show this help message and exit'
|
||||
|
||||
# spack config edit
|
||||
set -g __fish_spack_optspecs_spack_config_edit h/help print-file
|
||||
complete -c spack -n '__fish_spack_using_command_pos 0 config edit' -f -a 'bootstrap cdash ci compilers concretizer config definitions develop env_vars mirrors modules packages repos upstreams view'
|
||||
complete -c spack -n '__fish_spack_using_command_pos 0 config edit' -f -a 'bootstrap cdash ci compilers concretizer config definitions develop env_vars include mirrors modules packages repos upstreams view'
|
||||
complete -c spack -n '__fish_spack_using_command config edit' -s h -l help -f -a help
|
||||
complete -c spack -n '__fish_spack_using_command config edit' -s h -l help -d 'show this help message and exit'
|
||||
complete -c spack -n '__fish_spack_using_command config edit' -l print-file -f -a print_file
|
||||
|