Compare commits

1 commit: develop-20... -> packages/v...

Commit 5e7832d832
@@ -1,7 +1,7 @@
 black==25.1.0
 clingo==5.7.1
 flake8==7.1.2
-isort==6.0.1
+isort==6.0.0
 mypy==1.15.0
-types-six==1.17.0.20250304
+types-six==1.17.0.20241205
 vermin==1.6.0
.gitignore (vendored, 1 change)

@@ -201,6 +201,7 @@ tramp

 # Org-mode
 .org-id-locations
+*_archive

 # flymake-mode
 *_flymake.*
@@ -125,8 +125,6 @@ are stored in ``$spack/var/spack/cache``. These are stored indefinitely
 by default. Can be purged with :ref:`spack clean --downloads
 <cmd-spack-clean>`.

-.. _Misc Cache:
-
 --------------------
 ``misc_cache``
 --------------------
@@ -336,52 +334,3 @@ create a new alias called ``inst`` that will always call ``install -v``:

   aliases:
     inst: install -v
-
--------------------------------
-``concretization_cache:enable``
--------------------------------
-
-When set to ``true``, Spack will utilize a cache of solver outputs from
-successful concretization runs. When enabled, Spack will check the concretization
-cache prior to running the solver. If a previous request to solve a given
-problem is present in the cache, Spack will load the concrete specs and other
-solver data from the cache rather than running the solver. Specs not previously
-concretized will be added to the cache on a successful solve. The cache additionally
-holds solver statistics, so commands like ``spack solve`` will still return information
-about the run that produced a given solver result.
-
-This cache is a subcache of the :ref:`Misc Cache` and as such will be cleaned when the Misc
-Cache is cleaned.
-
-When ``false`` or ommitted, all concretization requests will be performed from scatch
-
-----------------------------
-``concretization_cache:url``
-----------------------------
-
-Path to the location where Spack will root the concretization cache. Currently this only supports
-paths on the local filesystem.
-
-Default location is under the :ref:`Misc Cache` at: ``$misc_cache/concretization``
-
-------------------------------------
-``concretization_cache:entry_limit``
-------------------------------------
-
-Sets a limit on the number of concretization results that Spack will cache. The limit is evaluated
-after each concretization run; if Spack has stored more results than the limit allows, the
-oldest concretization results are pruned until 10% of the limit has been removed.
-
-Setting this value to 0 disables the automatic pruning. It is expected users will be
-responsible for maintaining this cache.
-
------------------------------------
-``concretization_cache:size_limit``
------------------------------------
-
-Sets a limit on the size of the concretization cache in bytes. The limit is evaluated
-after each concretization run; if Spack has stored more results than the limit allows, the
-oldest concretization results are pruned until 10% of the limit has been removed.
-
-Setting this value to 0 disables the automatic pruning. It is expected users will be
-responsible for maintaining this cache.
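
Note: the removed ``entry_limit``/``size_limit`` sections above both describe the same pruning policy: once the cache exceeds its limit, the oldest results are evicted until 10% of the limit has been freed, and a limit of 0 disables pruning. A minimal sketch of that policy (illustrative only, not Spack's implementation; the names here are hypothetical):

    from collections import OrderedDict

    def prune_oldest(cache, entry_limit):
        """Evict oldest entries until the cache is 10% below entry_limit.

        Mirrors the policy described in the removed docs; entry_limit == 0
        disables pruning entirely.
        """
        if entry_limit == 0 or len(cache) <= entry_limit:
            return
        target = entry_limit - entry_limit // 10  # prune until 10% of the limit is freed
        while len(cache) > target:
            cache.popitem(last=False)  # OrderedDict preserves insertion (age) order

    cache = OrderedDict((i, f"result-{i}") for i in range(105))
    prune_oldest(cache, entry_limit=100)
    print(len(cache), next(iter(cache)))  # 90 15
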
@@ -14,7 +14,6 @@ case you want to skip directly to specific docs:
 * :ref:`compilers.yaml <compiler-config>`
 * :ref:`concretizer.yaml <concretizer-options>`
 * :ref:`config.yaml <config-yaml>`
-* :ref:`include.yaml <include-yaml>`
 * :ref:`mirrors.yaml <mirrors>`
 * :ref:`modules.yaml <modules>`
 * :ref:`packages.yaml <packages-config>`
@@ -670,45 +670,24 @@ This configuration sets the default compiler for all packages to
 Included configurations
 ^^^^^^^^^^^^^^^^^^^^^^^

-Spack environments allow an ``include`` heading in their yaml schema.
-This heading pulls in external configuration files and applies them to
-the environment.
+Spack environments allow an ``include`` heading in their yaml
+schema. This heading pulls in external configuration files and applies
+them to the environment.

 .. code-block:: yaml

    spack:
      include:
-     - environment/relative/path/to/config.yaml
+     - relative/path/to/config.yaml
      - https://github.com/path/to/raw/config/compilers.yaml
      - /absolute/path/to/packages.yaml
-     - path: /path/to/$os/$target/environment
-       optional: true
-     - path: /path/to/os-specific/config-dir
-       when: os == "ventura"
-
-Included configuration files are required *unless* they are explicitly optional
-or the entry's condition evaluates to ``false``. Optional includes are specified
-with the ``optional`` clause and conditional with the ``when`` clause. (See
-:ref:`include-yaml` for more information on optional and conditional entries.)
-
-Files are listed using paths to individual files or directories containing them.
-Path entries may be absolute or relative to the environment or specified as
-URLs. URLs to individual files need link to the **raw** form of the file's
-contents (e.g., `GitHub
-<https://docs.github.com/en/repositories/working-with-files/using-files/viewing-and-understanding-files#viewing-or-copying-the-raw-file-content>`_
-or `GitLab
-<https://docs.gitlab.com/ee/api/repository_files.html#get-raw-file-from-repository>`_).
-Only the ``file``, ``ftp``, ``http`` and ``https`` protocols (or schemes) are
-supported. Spack-specific, environment and user path variables can be used.
-(See :ref:`config-file-variables` for more information.)
-
-.. warning::
-
-   Recursive includes are not currently processed in a breadth-first manner
-   so the value of a configuration option that is altered by multiple included
-   files may not be what you expect. This will be addressed in a future
-   update.
+
+Environments can include files or URLs. File paths can be relative or
+absolute. URLs include the path to the text for individual files or
+can be the path to a directory containing configuration files.
+Spack supports ``file``, ``http``, ``https`` and ``ftp`` protocols (or
+schemes). Spack-specific, environment and user path variables may be
+used in these paths. See :ref:`config-file-variables` for more information.

 ^^^^^^^^^^^^^^^^^^^^^^^^
 Configuration precedence
@@ -1,51 +0,0 @@
-.. Copyright Spack Project Developers. See COPYRIGHT file for details.
-
-   SPDX-License-Identifier: (Apache-2.0 OR MIT)
-
-.. _include-yaml:
-
-===============================
-Include Settings (include.yaml)
-===============================
-
-Spack allows you to include configuration files through ``include.yaml``.
-Using the ``include:`` heading results in pulling in external configuration
-information to be used by any Spack command.
-
-Included configuration files are required *unless* they are explicitly optional
-or the entry's condition evaluates to ``false``. Optional includes are specified
-with the ``optional`` clause and conditional with the ``when`` clause. For
-example,
-
-.. code-block:: yaml
-
-   include:
-   - /path/to/a/required/config.yaml
-   - path: /path/to/$os/$target/config
-     optional: true
-   - path: /path/to/os-specific/config-dir
-     when: os == "ventura"
-
-shows all three. The first entry, ``/path/to/a/required/config.yaml``,
-indicates that included ``config.yaml`` file is required (so must exist).
-Use of ``optional: true`` for ``/path/to/$os/$target/config`` means
-the path is only included if it exists. The condition ``os == "ventura"``
-in the ``when`` clause for ``/path/to/os-specific/config-dir`` means the
-path is only included when the operating system (``os``) is ``ventura``.
-
-The same conditions and variables in `Spec List References
-<https://spack.readthedocs.io/en/latest/environments.html#spec-list-references>`_
-can be used for conditional activation in the ``when`` clauses.
-
-Included files can be specified by path or by their parent directory.
-Paths may be absolute, relative (to the configuration file including the path),
-or specified as URLs. Only the ``file``, ``ftp``, ``http`` and ``https`` protocols (or
-schemes) are supported. Spack-specific, environment and user path variables
-can be used. (See :ref:`config-file-variables` for more information.)
-
-.. warning::
-
-   Recursive includes are not currently processed in a breadth-first manner
-   so the value of a configuration option that is altered by multiple included
-   files may not be what you expect. This will be addressed in a future
-   update.
@@ -71,7 +71,6 @@ or refer to the full manual below.

    configuration
    config_yaml
-   include_yaml
    packages_yaml
    build_settings
    environments
@@ -1,13 +1,13 @@
-sphinx==8.2.3
+sphinx==8.2.1
 sphinxcontrib-programoutput==0.18
 sphinx_design==0.6.1
 sphinx-rtd-theme==3.0.2
-python-levenshtein==0.27.1
+python-levenshtein==0.26.1
 docutils==0.21.2
 pygments==2.19.1
 urllib3==2.3.0
-pytest==8.3.5
-isort==6.0.1
+pytest==8.3.4
+isort==6.0.0
 black==25.1.0
 flake8==7.1.2
 mypy==1.11.1
@@ -7,7 +7,6 @@
 import fnmatch
 import glob
 import hashlib
-import io
 import itertools
 import numbers
 import os
@@ -21,7 +20,6 @@
 from contextlib import contextmanager
 from itertools import accumulate
 from typing import (
-    IO,
     Callable,
     Deque,
     Dict,
@@ -2456,69 +2454,26 @@ class WindowsSimulatedRPath:
     and vis versa.
     """

-    def __init__(
-        self,
-        package,
-        base_modification_prefix: Optional[Union[str, pathlib.Path]] = None,
-        link_install_prefix: bool = True,
-    ):
+    def __init__(self, package, link_install_prefix=True):
         """
         Args:
             package (spack.package_base.PackageBase): Package requiring links
-            base_modification_prefix (str|pathlib.Path): Path representation indicating
-                the root directory in which to establish the simulated rpath, ie where the
-                symlinks that comprise the "rpath" behavior will be installed.
-
-                Note: This is a mutually exclusive option with `link_install_prefix` using
-                both is an error.
-
-                Default: None
             link_install_prefix (bool): Link against package's own install or stage root.
                 Packages that run their own executables during build and require rpaths to
-                the build directory during build time require this option.
-
-                Default: install
+                the build directory during build time require this option. Default: install
                 root
-
-                Note: This is a mutually exclusive option with `base_modification_prefix`, using
-                both is an error.
         """
         self.pkg = package
-        self._addl_rpaths: set[str] = set()
-        if link_install_prefix and base_modification_prefix:
-            raise RuntimeError(
-                "Invalid combination of arguments given to WindowsSimulated RPath.\n"
-                "Select either `link_install_prefix` to create an install prefix rpath"
-                " or specify a `base_modification_prefix` for any other link type. "
-                "Specifying both arguments is invalid."
-            )
-        if not (link_install_prefix or base_modification_prefix):
-            raise RuntimeError(
-                "Insufficient arguments given to WindowsSimulatedRpath.\n"
-                "WindowsSimulatedRPath requires one of link_install_prefix"
-                " or base_modification_prefix to be specified."
-                " Neither was provided."
-            )
-
+        self._addl_rpaths = set()
         self.link_install_prefix = link_install_prefix
-        if base_modification_prefix:
-            self.base_modification_prefix = pathlib.Path(base_modification_prefix)
-        else:
-            self.base_modification_prefix = pathlib.Path(self.pkg.prefix)
-        self._additional_library_dependents: set[pathlib.Path] = set()
-        if not self.link_install_prefix:
-            tty.debug(f"Generating rpath for non install context: {base_modification_prefix}")
+        self._additional_library_dependents = set()

     @property
     def library_dependents(self):
         """
         Set of directories where package binaries/libraries are located.
         """
-        base_pths = set()
-        if self.link_install_prefix:
-            base_pths.add(pathlib.Path(self.pkg.prefix.bin))
-        base_pths |= self._additional_library_dependents
-        return base_pths
+        return set([pathlib.Path(self.pkg.prefix.bin)]) | self._additional_library_dependents

     def add_library_dependent(self, *dest):
         """
@@ -2534,12 +2489,6 @@ def add_library_dependent(self, *dest):
                 new_pth = pathlib.Path(pth).parent
             else:
                 new_pth = pathlib.Path(pth)
-            path_is_in_prefix = new_pth.is_relative_to(self.base_modification_prefix)
-            if not path_is_in_prefix:
-                raise RuntimeError(
-                    f"Attempting to generate rpath symlink out of rpath context:\
-{str(self.base_modification_prefix)}"
-                )
             self._additional_library_dependents.add(new_pth)

     @property
@@ -2628,33 +2577,6 @@ def establish_link(self):
             self._link(library, lib_dir)


-def make_package_test_rpath(pkg, test_dir: Union[str, pathlib.Path]):
-    """Establishes a temp Windows simulated rpath for the pkg in the testing directory
-    so an executable can test the libraries/executables with proper access
-    to dependent dlls
-
-    Note: this is a no-op on all other platforms besides Windows
-
-    Args:
-        pkg (spack.package_base.PackageBase): the package for which the rpath should be computed
-        test_dir: the testing directory in which we should construct an rpath
-    """
-    # link_install_prefix as false ensures we're not linking into the install prefix
-    mini_rpath = WindowsSimulatedRPath(pkg, link_install_prefix=False)
-    # add the testing directory as a location to install rpath symlinks
-    mini_rpath.add_library_dependent(test_dir)
-
-    # check for whether build_directory is available, if not
-    # assume the stage root is the build dir
-    build_dir_attr = getattr(pkg, "build_directory", None)
-    build_directory = build_dir_attr if build_dir_attr else pkg.stage.path
-    # add the build dir & build dir bin
-    mini_rpath.add_rpath(os.path.join(build_directory, "bin"))
-    mini_rpath.add_rpath(os.path.join(build_directory))
-    # construct rpath
-    mini_rpath.establish_link()
-
-
 @system_path_filter
 @memoized
 def can_access_dir(path):
@@ -2883,20 +2805,6 @@ def keep_modification_time(*filenames):
         os.utime(f, (os.path.getatime(f), mtime))


-@contextmanager
-def temporary_file_position(stream):
-    orig_pos = stream.tell()
-    yield
-    stream.seek(orig_pos)
-
-
-@contextmanager
-def current_file_position(stream: IO[str], loc: int, relative_to=io.SEEK_CUR):
-    with temporary_file_position(stream):
-        stream.seek(loc, relative_to)
-        yield
-
-
 @contextmanager
 def temporary_dir(
     suffix: Optional[str] = None, prefix: Optional[str] = None, dir: Optional[str] = None
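
Note: the two removed context managers compose: ``current_file_position`` temporarily moves a stream offset and ``temporary_file_position`` restores it on exit. A usage sketch, with the helpers copied inline from the removed lines so it runs standalone:

    import io
    from contextlib import contextmanager

    @contextmanager
    def temporary_file_position(stream):
        orig_pos = stream.tell()
        yield
        stream.seek(orig_pos)

    @contextmanager
    def current_file_position(stream, loc, relative_to=io.SEEK_CUR):
        with temporary_file_position(stream):
            stream.seek(loc, relative_to)
            yield

    buf = io.StringIO("0123456789")
    buf.seek(5)
    with current_file_position(buf, 0, io.SEEK_SET):
        assert buf.read(3) == "012"  # temporarily rewound to the start
    assert buf.tell() == 5  # original position restored on exit
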
@@ -11,7 +11,6 @@
 import re
 import sys
 import traceback
-import types
 import typing
 import warnings
 from datetime import datetime, timedelta
@@ -708,24 +707,14 @@ def __init__(self, wrapped_object):


 class Singleton:
-    """Wrapper for lazily initialized singleton objects."""
+    """Simple wrapper for lazily initialized singleton objects."""

-    def __init__(self, factory: Callable[[], object]):
+    def __init__(self, factory):
         """Create a new singleton to be inited with the factory function.

-        Most factories will simply create the object to be initialized and
-        return it.
-
-        In some cases, e.g. when bootstrapping some global state, the singleton
-        may need to be initialized incrementally. If the factory returns a generator
-        instead of a regular object, the singleton will assign each result yielded by
-        the generator to the singleton instance. This allows methods called by
-        the factory in later stages to refer back to the singleton.
-
         Args:
-            factory (function): function taking no arguments that creates the
-                singleton instance.
+            factory (function): function taking no arguments that
+                creates the singleton instance.

         """
         self.factory = factory
         self._instance = None
@@ -733,16 +722,7 @@ def __init__(self, factory: Callable[[], object]):
     @property
     def instance(self):
         if self._instance is None:
-            instance = self.factory()
-
-            if isinstance(instance, types.GeneratorType):
-                # if it's a generator, assign every value
-                for value in instance:
-                    self._instance = value
-            else:
-                # if not, just assign the result like a normal singleton
-                self._instance = instance
-
+            self._instance = self.factory()
         return self._instance

     def __getattr__(self, name):
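
Note: the removed generator branch is the subtle part of ``Singleton.instance``: a factory may yield several times, each yielded value becomes the instance, and code running between yields can already see the partially initialized singleton. A simplified, self-contained sketch of that pattern (not the Spack class itself):

    import types

    class Singleton:
        """Lazily initialized singleton; supports plain and generator factories."""

        def __init__(self, factory):
            self.factory = factory
            self._instance = None

        @property
        def instance(self):
            if self._instance is None:
                result = self.factory()
                if isinstance(result, types.GeneratorType):
                    # each yielded value replaces the instance, so later stages
                    # of the factory can refer back to the singleton
                    for value in result:
                        self._instance = value
                else:
                    self._instance = result
            return self._instance

    def staged():
        state = {"phase": "bootstrap"}
        yield state  # after this yield, the singleton instance is already set
        state["phase"] = "ready"
        yield state

    CONFIG = Singleton(staged)
    print(CONFIG.instance["phase"])  # "ready": the generator ran to completion
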
@@ -13,18 +13,6 @@
 __version__ = "1.0.0.dev0"
 spack_version = __version__

-#: The current Package API version implemented by this version of Spack. The Package API defines
-#: the Python interface for packages as well as the layout of package repositories. The minor
-#: version is incremented when the package API is extended in a backwards-compatible way. The major
-#: version is incremented upon breaking changes. This version is changed independently from the
-#: Spack version.
-package_api_version = (1, 0)
-
-#: The minimum Package API version that this version of Spack is compatible with. This should
-#: always be a tuple of the form ``(major, 0)``, since compatibility with vX.Y implies
-#: compatibility with vX.0.
-min_package_api_version = (1, 0)
-

 def __try_int(v):
     try:
@@ -91,6 +79,4 @@ def get_short_version() -> str:
     "get_version",
     "get_spack_commit",
     "get_short_version",
-    "package_api_version",
-    "min_package_api_version",
 ]
@@ -881,6 +881,21 @@ def get_rpath_deps(pkg: spack.package_base.PackageBase) -> List[spack.spec.Spec]
     return _get_rpath_deps_from_spec(pkg.spec, pkg.transitive_rpaths)


+def load_external_modules(pkg):
+    """Traverse a package's spec DAG and load any external modules.
+
+    Traverse a package's dependencies and load any external modules
+    associated with them.
+
+    Args:
+        pkg (spack.package_base.PackageBase): package to load deps for
+    """
+    for dep in list(pkg.spec.traverse()):
+        external_modules = dep.external_modules or []
+        for external_module in external_modules:
+            load_module(external_module)
+
+
 def setup_package(pkg, dirty, context: Context = Context.BUILD):
     """Execute all environment setup routines."""
     if context not in (Context.BUILD, Context.TEST):
@@ -931,7 +946,7 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
         for mod in pkg.compiler.modules:
             load_module(mod)

-    load_external_modules(setup_context)
+    load_external_modules(pkg)

     # Make sure nothing's strange about the Spack environment.
     validate(env_mods, tty.warn)
@@ -1219,20 +1234,9 @@ def _make_runnable(self, dep: spack.spec.Spec, env: EnvironmentModifications):
         if os.path.isdir(bin_dir):
             env.prepend_path("PATH", bin_dir)

-
-def load_external_modules(context: SetupContext) -> None:
-    """Traverse a package's spec DAG and load any external modules.
-
-    Traverse a package's dependencies and load any external modules
-    associated with them.
-
-    Args:
-        context: A populated SetupContext object
-    """
-    for spec, _ in context.external:
-        external_modules = spec.external_modules or []
-        for external_module in external_modules:
-            load_module(external_module)
+        for cp_dir in spack.build_systems.cmake.get_cmake_prefix_path(dep.package):
+            env.append_path("CMAKE_PREFIX_PATH", cp_dir)
+        env.prune_duplicate_paths("CMAKE_PREFIX_PATH")


 def _setup_pkg_and_run(
@@ -6,7 +6,6 @@
 import codecs
 import json
 import os
-import pathlib
 import re
 import shutil
 import stat
@@ -24,6 +23,7 @@

 import spack
 import spack.binary_distribution as bindist
+import spack.builder
 import spack.concretize
 import spack.config as cfg
 import spack.environment as ev
@@ -33,7 +33,6 @@
 import spack.paths
 import spack.repo
 import spack.spec
-import spack.store
 import spack.util.git
 import spack.util.gpg as gpg_util
 import spack.util.spack_yaml as syaml
@@ -225,7 +224,7 @@ def rebuild_filter(s: spack.spec.Spec) -> RebuildDecision:

 def _format_pruning_message(spec: spack.spec.Spec, prune: bool, reasons: List[str]) -> str:
     reason_msg = ", ".join(reasons)
-    spec_fmt = "{name}{@version}{/hash:7}{%compiler}"
+    spec_fmt = "{name}{@version}{%compiler}{/hash:7}"

     if not prune:
         status = colorize("@*g{[x]} ")
@@ -582,25 +581,22 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
     tty.debug(f"job spec: {job_spec}")

     try:
-        package_metadata_root = pathlib.Path(spack.store.STORE.layout.metadata_path(job_spec))
-    except spack.error.SpackError as e:
-        tty.error(f"Cannot copy logs: {str(e)}")
+        pkg_cls = spack.repo.PATH.get_pkg_class(job_spec.name)
+        job_pkg = pkg_cls(job_spec)
+        tty.debug(f"job package: {job_pkg}")
+    except AssertionError:
+        msg = f"Cannot copy stage logs: job spec ({job_spec}) must be concrete"
+        tty.error(msg)
         return

-    # Get the package's archived files
-    archive_files = []
-    archive_root = package_metadata_root / "archived-files"
-    if archive_root.is_dir():
-        archive_files = [f for f in archive_root.rglob("*") if f.is_file()]
-    else:
-        msg = "Cannot copy package archived files: archived-files must be a directory"
-        tty.warn(msg)
-
-    build_log_zipped = package_metadata_root / "spack-build-out.txt.gz"
-    build_env_mods = package_metadata_root / "spack-build-env.txt"
-
-    for f in [build_log_zipped, build_env_mods, *archive_files]:
-        copy_files_to_artifacts(str(f), job_log_dir)
+    stage_dir = job_pkg.stage.path
+    tty.debug(f"stage dir: {stage_dir}")
+    for file in [
+        job_pkg.log_path,
+        job_pkg.env_mods_path,
+        *spack.builder.create(job_pkg).archive_files,
+    ]:
+        copy_files_to_artifacts(file, job_log_dir)


 def copy_test_logs_to_artifacts(test_stage, job_test_dir):
@@ -330,7 +330,7 @@ def ensure_single_spec_or_die(spec, matching_specs):
     if len(matching_specs) <= 1:
         return

-    format_string = "{name}{@version}{ arch=architecture} {%compiler.name}{@compiler.version}"
+    format_string = "{name}{@version}{%compiler.name}{@compiler.version}{ arch=architecture}"
     args = ["%s matches multiple packages." % spec, "Matching packages:"]
     args += [
         colorize(" @K{%s} " % s.dag_hash(7)) + s.cformat(format_string) for s in matching_specs
@@ -471,11 +471,12 @@ def get_arg(name, default=None):
     nfmt = "{fullname}" if namespaces else "{name}"
     ffmt = ""
     if full_compiler or flags:
-        ffmt += "{compiler_flags} {%compiler.name}"
+        ffmt += "{%compiler.name}"
         if full_compiler:
             ffmt += "{@compiler.version}"
+        ffmt += " {compiler_flags}"
     vfmt = "{variants}" if variants else ""
-    format_string = nfmt + "{@version}" + vfmt + ffmt
+    format_string = nfmt + "{@version}" + ffmt + vfmt

     def fmt(s, depth=0):
         """Formatter function for all output specs"""
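
Note: both hunks above only reorder string concatenation. Expanding the two branches for ``namespaces=False``, ``full_compiler=True``, ``flags=True`` makes the net effect visible (a sketch, not Spack code):

    nfmt, vfmt = "{name}", "{variants}"

    # "-" side (base): flags before the compiler, variants before both
    ffmt_old = "{compiler_flags} {%compiler.name}" + "{@compiler.version}"
    old = nfmt + "{@version}" + vfmt + ffmt_old

    # "+" side (head): compiler first, flags after, variants last
    ffmt_new = "{%compiler.name}" + "{@compiler.version}" + " {compiler_flags}"
    new = nfmt + "{@version}" + ffmt_new + vfmt

    print(old)  # {name}{@version}{variants}{compiler_flags} {%compiler.name}{@compiler.version}
    print(new)  # {name}{@version}{%compiler.name}{@compiler.version} {compiler_flags}{variants}
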
@@ -427,7 +427,7 @@ def ci_rebuild(args):

     # Arguments when installing the root from sources
     deps_install_args = install_args + ["--only=dependencies"]
-    root_install_args = install_args + ["--only=package"]
+    root_install_args = install_args + ["--keep-stage", "--only=package"]

     if cdash_handler:
         # Add additional arguments to `spack install` for CDash reporting.
@@ -464,7 +464,8 @@ def ci_rebuild(args):
         job_spec.to_dict(hash=ht.dag_hash),
     )

-    # Copy logs and archived files from the install metadata (.spack) directory to artifacts now
+    # We generated the "spack install ..." command to "--keep-stage", copy
+    # any logs from the staging directory to artifacts now
     spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)

     # If the installation succeeded and we're running stand-alone tests for
@@ -350,12 +350,9 @@ def _config_change(config_path, match_spec_str=None):
             if spack.config.get(key_path, scope=scope):
                 ideal_scope_to_modify = scope
                 break
-        # If we find our key in a specific scope, that's the one we want
-        # to modify. Otherwise we use the default write scope.
-        write_scope = ideal_scope_to_modify or spack.config.default_modify_scope()

         update_path = f"{key_path}:[{str(spec)}]"
-        spack.config.add(update_path, scope=write_scope)
+        spack.config.add(update_path, scope=ideal_scope_to_modify)
     else:
         raise ValueError("'config change' can currently only change 'require' sections")
@@ -55,7 +55,7 @@ def dependencies(parser, args):
         env = ev.active_environment()
         spec = spack.cmd.disambiguate_spec(specs[0], env)

-        format_string = "{name}{@version}{/hash:7}{%compiler}"
+        format_string = "{name}{@version}{%compiler}{/hash:7}"
        if sys.stdout.isatty():
             tty.msg("Dependencies of %s" % spec.format(format_string, color=True))
         deps = spack.store.STORE.db.installed_relatives(
@@ -93,7 +93,7 @@ def dependents(parser, args):
         env = ev.active_environment()
         spec = spack.cmd.disambiguate_spec(specs[0], env)

-        format_string = "{name}{@version}{/hash:7}{%compiler}"
+        format_string = "{name}{@version}{%compiler}{/hash:7}"
         if sys.stdout.isatty():
             tty.msg("Dependents of %s" % spec.cformat(format_string))
         deps = spack.store.STORE.db.installed_relatives(spec, "parents", args.transitive)
@@ -73,7 +73,7 @@
     boxlib @B{dim=2}            boxlib built for 2 dimensions
     libdwarf @g{%intel} ^libelf@g{%gcc}
         libdwarf, built with intel compiler, linked to libelf built with gcc
-    mvapich2 @B{fabrics=psm,mrail,sock} @g{%gcc}
+    mvapich2 @g{%gcc} @B{fabrics=psm,mrail,sock}
         mvapich2, built with gcc compiler, with support for multiple fabrics
 """
@@ -383,10 +383,8 @@ def modules_cmd(parser, args, module_type, callbacks=callbacks):
         query = " ".join(str(s) for s in args.constraint_specs)
         msg = f"the constraint '{query}' matches multiple packages:\n"
         for s in specs:
-            spec_fmt = (
-                "{hash:7} {name}{@version}{compiler_flags}{variants}"
-                "{arch=architecture} {%compiler}"
-            )
+            spec_fmt = "{hash:7} {name}{@version}{%compiler}"
+            spec_fmt += "{compiler_flags}{variants}{arch=architecture}"
             msg += "\t" + s.cformat(spec_fmt) + "\n"
         tty.die(msg, "In this context exactly *one* match is needed.")
@@ -6,9 +6,8 @@
 import os
 import re
 import sys
-import warnings
 from itertools import islice, zip_longest
-from typing import Callable, Dict, List, Optional
+from typing import Dict, List, Optional

 import llnl.util.tty as tty
 import llnl.util.tty.color as color
@@ -17,9 +16,6 @@
 import spack.paths
 import spack.repo
 import spack.util.git
-import spack.util.spack_yaml
-from spack.spec_parser import SPEC_TOKENIZER, SpecTokens
-from spack.tokenize import Token
 from spack.util.executable import Executable, which

 description = "runs source code style checks on spack"
@@ -202,13 +198,6 @@ def setup_parser(subparser):
         action="append",
         help="specify tools to skip (choose from %s)" % ", ".join(tool_names),
     )
-    subparser.add_argument(
-        "--spec-strings",
-        action="store_true",
-        help="upgrade spec strings in Python, JSON and YAML files for compatibility with Spack "
-        "v1.0 and v0.x. Example: spack style --spec-strings $(git ls-files). Note: this flag "
-        "will be removed in Spack v1.0.",
-    )

     subparser.add_argument("files", nargs=argparse.REMAINDER, help="specific files to check")
@@ -518,196 +507,7 @@ def _bootstrap_dev_dependencies():
     spack.bootstrap.ensure_environment_dependencies()


-IS_PROBABLY_COMPILER = re.compile(r"%[a-zA-Z_][a-zA-Z0-9\-]")
-
-
-def _spec_str_reorder_compiler(idx: int, blocks: List[List[Token]]) -> None:
-    # only move the compiler to the back if it exists and is not already at the end
-    if not 0 <= idx < len(blocks) - 1:
-        return
-    # if there's only whitespace after the compiler, don't move it
-    if all(token.kind == SpecTokens.WS for block in blocks[idx + 1 :] for token in block):
-        return
-    # rotate left and always add at least one WS token between compiler and previous token
-    compiler_block = blocks.pop(idx)
-    if compiler_block[0].kind != SpecTokens.WS:
-        compiler_block.insert(0, Token(SpecTokens.WS, " "))
-    # delete the WS tokens from the new first block if it was at the very start, to prevent leading
-    # WS tokens.
-    while idx == 0 and blocks[0][0].kind == SpecTokens.WS:
-        blocks[0].pop(0)
-    blocks.append(compiler_block)
-
-
-def _spec_str_format(spec_str: str) -> Optional[str]:
-    """Given any string, try to parse as spec string, and rotate the compiler token to the end
-    of each spec instance. Returns the formatted string if it was changed, otherwise None."""
-    # We parse blocks of tokens that include leading whitespace, and move the compiler block to
-    # the end when we hit a dependency ^... or the end of a string.
-    # [@3.1][ +foo][ +bar][ %gcc@3.1][ +baz]
-    # [@3.1][ +foo][ +bar][ +baz][ %gcc@3.1]
-
-    current_block: List[Token] = []
-    blocks: List[List[Token]] = []
-    compiler_block_idx = -1
-    in_edge_attr = False
-
-    for token in SPEC_TOKENIZER.tokenize(spec_str):
-        if token.kind == SpecTokens.UNEXPECTED:
-            # parsing error, we cannot fix this string.
-            return None
-        elif token.kind in (SpecTokens.COMPILER, SpecTokens.COMPILER_AND_VERSION):
-            # multiple compilers are not supported in Spack v0.x, so early return
-            if compiler_block_idx != -1:
-                return None
-            current_block.append(token)
-            blocks.append(current_block)
-            current_block = []
-            compiler_block_idx = len(blocks) - 1
-        elif token.kind in (
-            SpecTokens.START_EDGE_PROPERTIES,
-            SpecTokens.DEPENDENCY,
-            SpecTokens.UNQUALIFIED_PACKAGE_NAME,
-            SpecTokens.FULLY_QUALIFIED_PACKAGE_NAME,
-        ):
-            _spec_str_reorder_compiler(compiler_block_idx, blocks)
-            compiler_block_idx = -1
-            if token.kind == SpecTokens.START_EDGE_PROPERTIES:
-                in_edge_attr = True
-            current_block.append(token)
-            blocks.append(current_block)
-            current_block = []
-        elif token.kind == SpecTokens.END_EDGE_PROPERTIES:
-            in_edge_attr = False
-            current_block.append(token)
-            blocks.append(current_block)
-            current_block = []
-        elif in_edge_attr:
-            current_block.append(token)
-        elif token.kind in (
-            SpecTokens.VERSION_HASH_PAIR,
-            SpecTokens.GIT_VERSION,
-            SpecTokens.VERSION,
-            SpecTokens.PROPAGATED_BOOL_VARIANT,
-            SpecTokens.BOOL_VARIANT,
-            SpecTokens.PROPAGATED_KEY_VALUE_PAIR,
-            SpecTokens.KEY_VALUE_PAIR,
-            SpecTokens.DAG_HASH,
-        ):
-            current_block.append(token)
-            blocks.append(current_block)
-            current_block = []
-        elif token.kind == SpecTokens.WS:
-            current_block.append(token)
-        else:
-            raise ValueError(f"unexpected token {token}")
-
-    if current_block:
-        blocks.append(current_block)
-    _spec_str_reorder_compiler(compiler_block_idx, blocks)
-
-    new_spec_str = "".join(token.value for block in blocks for token in block)
-    return new_spec_str if spec_str != new_spec_str else None
-
-
-SpecStrHandler = Callable[[str, int, int, str, str], None]
-
-
-def _spec_str_default_handler(path: str, line: int, col: int, old: str, new: str):
-    """A SpecStrHandler that prints formatted spec strings and their locations."""
-    print(f"{path}:{line}:{col}: `{old}` -> `{new}`")
-
-
-def _spec_str_fix_handler(path: str, line: int, col: int, old: str, new: str):
-    """A SpecStrHandler that updates formatted spec strings in files."""
-    with open(path, "r", encoding="utf-8") as f:
-        lines = f.readlines()
-    new_line = lines[line - 1].replace(old, new)
-    if new_line == lines[line - 1]:
-        tty.warn(f"{path}:{line}:{col}: could not apply fix: `{old}` -> `{new}`")
-        return
-    lines[line - 1] = new_line
-    print(f"{path}:{line}:{col}: fixed `{old}` -> `{new}`")
-    with open(path, "w", encoding="utf-8") as f:
-        f.writelines(lines)
-
-
-def _spec_str_ast(path: str, tree: ast.AST, handler: SpecStrHandler) -> None:
-    """Walk the AST of a Python file and apply handler to formatted spec strings."""
-    has_constant = sys.version_info >= (3, 8)
-    for node in ast.walk(tree):
-        if has_constant and isinstance(node, ast.Constant) and isinstance(node.value, str):
-            current_str = node.value
-        elif not has_constant and isinstance(node, ast.Str):
-            current_str = node.s
-        else:
-            continue
-        if not IS_PROBABLY_COMPILER.search(current_str):
-            continue
-        new = _spec_str_format(current_str)
-        if new is not None:
-            handler(path, node.lineno, node.col_offset, current_str, new)
-
-
-def _spec_str_json_and_yaml(path: str, data: dict, handler: SpecStrHandler) -> None:
-    """Walk a YAML or JSON data structure and apply handler to formatted spec strings."""
-    queue = [data]
-    seen = set()
-
-    while queue:
-        current = queue.pop(0)
-        if id(current) in seen:
-            continue
-        seen.add(id(current))
-        if isinstance(current, dict):
-            queue.extend(current.values())
-            queue.extend(current.keys())
-        elif isinstance(current, list):
-            queue.extend(current)
-        elif isinstance(current, str) and IS_PROBABLY_COMPILER.search(current):
-            new = _spec_str_format(current)
-            if new is not None:
-                mark = getattr(current, "_start_mark", None)
-                if mark:
-                    line, col = mark.line + 1, mark.column + 1
-                else:
-                    line, col = 0, 0
-                handler(path, line, col, current, new)
-
-
-def _check_spec_strings(
-    paths: List[str], handler: SpecStrHandler = _spec_str_default_handler
-) -> None:
-    """Open Python, JSON and YAML files, and format their string literals that look like spec
-    strings. A handler is called for each formatting, which can be used to print or apply fixes."""
-    for path in paths:
-        is_json_or_yaml = path.endswith(".json") or path.endswith(".yaml") or path.endswith(".yml")
-        is_python = path.endswith(".py")
-        if not is_json_or_yaml and not is_python:
-            continue
-
-        try:
-            with open(path, "r", encoding="utf-8") as f:
-                # skip files that are likely too large to be user code or config
-                if os.fstat(f.fileno()).st_size > 1024 * 1024:
-                    warnings.warn(f"skipping {path}: too large.")
-                    continue
-                if is_json_or_yaml:
-                    _spec_str_json_and_yaml(path, spack.util.spack_yaml.load_config(f), handler)
-                elif is_python:
-                    _spec_str_ast(path, ast.parse(f.read()), handler)
-        except (OSError, spack.util.spack_yaml.SpackYAMLError, SyntaxError, ValueError):
-            warnings.warn(f"skipping {path}")
-            continue
-
-
 def style(parser, args):
-    if args.spec_strings:
-        if not args.files:
-            tty.die("No files provided to check spec strings.")
-        handler = _spec_str_fix_handler if args.fix else _spec_str_default_handler
-        return _check_spec_strings(args.files, handler)
-
     # save initial working directory for relativizing paths later
     args.initial_working_dir = os.getcwd()
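
Note: the removed ``--spec-strings`` tooling rewrites v0.x spec strings by rotating the compiler token to the end, as the ``[@3.1][ +foo][ +bar][ %gcc@3.1][ +baz]`` comment shows. A rough single-spec illustration of the same rotation using a regex (Spack's removed code uses the real spec tokenizer and also handles dependencies and edge attributes; the function name here is hypothetical):

    import re

    COMPILER = re.compile(r"\s*(%[A-Za-z_][\w.\-]*(?:@[\w.\-]+)?)")

    def rotate_compiler(spec_str):
        """Move a single %compiler[@version] token to the end of the string."""
        m = COMPILER.search(spec_str)
        if not m:
            return spec_str
        rest = (spec_str[: m.start()] + spec_str[m.end() :]).rstrip()
        return f"{rest} {m.group(1)}"

    print(rotate_compiler("pkg@3.1 +foo +bar %gcc@3.1 +baz"))
    # -> "pkg@3.1 +foo +bar +baz %gcc@3.1"
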
@@ -32,10 +32,9 @@
 import copy
 import functools
 import os
-import os.path
 import re
 import sys
-from typing import Any, Callable, Dict, Generator, List, NamedTuple, Optional, Tuple, Union
+from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union

 import jsonschema
@@ -43,6 +42,7 @@

 import spack.error
 import spack.paths
+import spack.platforms
 import spack.schema
 import spack.schema.bootstrap
 import spack.schema.cdash
@@ -54,18 +54,17 @@
 import spack.schema.develop
 import spack.schema.env
 import spack.schema.env_vars
-import spack.schema.include
-import spack.schema.merged
 import spack.schema.mirrors
 import spack.schema.modules
 import spack.schema.packages
 import spack.schema.repos
 import spack.schema.upstreams
 import spack.schema.view
-import spack.util.remote_file_cache as rfc_util
+
+# Hacked yaml for configuration files preserves line numbers.
 import spack.util.spack_yaml as syaml
+import spack.util.web as web_util
 from spack.util.cpus import cpus_available
-from spack.util.spack_yaml import get_mark_from_yaml_data

 from .enums import ConfigScopePriority
@@ -75,7 +74,6 @@
     "concretizer": spack.schema.concretizer.schema,
     "definitions": spack.schema.definitions.schema,
     "env_vars": spack.schema.env_vars.schema,
-    "include": spack.schema.include.schema,
     "view": spack.schema.view.schema,
     "develop": spack.schema.develop.schema,
     "mirrors": spack.schema.mirrors.schema,
@@ -123,17 +121,6 @@
 #: Type used for raw YAML configuration
 YamlConfigDict = Dict[str, Any]

-#: prefix for name of included configuration scopes
-INCLUDE_SCOPE_PREFIX = "include"
-
-#: safeguard for recursive includes -- maximum include depth
-MAX_RECURSIVE_INCLUDES = 100
-
-
-def _include_cache_location():
-    """Location to cache included configuration files."""
-    return os.path.join(spack.paths.user_cache_path, "includes")
-
-
 class ConfigScope:
     def __init__(self, name: str) -> None:
@@ -141,25 +128,6 @@ def __init__(self, name: str) -> None:
         self.writable = False
         self.sections = syaml.syaml_dict()

-        #: names of any included scopes
-        self._included_scopes: Optional[List["ConfigScope"]] = None
-
-    @property
-    def included_scopes(self) -> List["ConfigScope"]:
-        """Memoized list of included scopes, in the order they appear in this scope."""
-        if self._included_scopes is None:
-            self._included_scopes = []
-
-            includes = self.get_section("include")
-            if includes:
-                include_paths = [included_path(data) for data in includes["include"]]
-                for path in include_paths:
-                    included_scope = include_path_scope(path)
-                    if included_scope:
-                        self._included_scopes.append(included_scope)
-
-        return self._included_scopes
-
     def get_section_filename(self, section: str) -> str:
         raise NotImplementedError
@@ -465,9 +433,7 @@ def highest(self) -> ConfigScope:
         return next(self.scopes.reversed_values())  # type: ignore

     @_config_mutator
-    def push_scope(
-        self, scope: ConfigScope, priority: Optional[int] = None, _depth: int = 0
-    ) -> None:
+    def push_scope(self, scope: ConfigScope, priority: Optional[int] = None) -> None:
         """Adds a scope to the Configuration, at a given priority.

         If a priority is not given, it is assumed to be the current highest priority.
@@ -476,44 +442,18 @@ def push_scope(
             scope: scope to be added
             priority: priority of the scope
         """
-        # TODO: As a follow on to #48784, change this to create a graph of the
-        # TODO: includes AND ensure properly sorted such that the order included
-        # TODO: at the highest level is reflected in the value of an option that
-        # TODO: is set in multiple included files.
-        # before pushing the scope itself, push any included scopes recursively, at same priority
-        for included_scope in reversed(scope.included_scopes):
-            if _depth + 1 > MAX_RECURSIVE_INCLUDES:  # make sure we're not recursing endlessly
-                mark = ""
-                if hasattr(included_scope, "path") and syaml.marked(included_scope.path):
-                    mark = included_scope.path._start_mark  # type: ignore
-                raise RecursiveIncludeError(
-                    f"Maximum include recursion exceeded in {included_scope.name}", str(mark)
-                )
-
-            # record this inclusion so that remove_scope() can use it
-            self.push_scope(included_scope, priority=priority, _depth=_depth + 1)
-
         tty.debug(f"[CONFIGURATION: PUSH SCOPE]: {str(scope)}, priority={priority}", level=2)
         self.scopes.add(scope.name, value=scope, priority=priority)

     @_config_mutator
     def remove_scope(self, scope_name: str) -> Optional[ConfigScope]:
         """Removes a scope by name, and returns it. If the scope does not exist, returns None."""
         try:
             scope = self.scopes.remove(scope_name)
-            tty.debug(f"[CONFIGURATION: REMOVE SCOPE]: {str(scope)}", level=2)
+            tty.debug(f"[CONFIGURATION: POP SCOPE]: {str(scope)}", level=2)
         except KeyError as e:
-            tty.debug(f"[CONFIGURATION: REMOVE SCOPE]: {e}", level=2)
+            tty.debug(f"[CONFIGURATION: POP SCOPE]: {e}", level=2)
             return None
-
-        # transitively remove included scopes
-        for included_scope in scope.included_scopes:
-            assert (
-                included_scope.name in self.scopes
-            ), f"Included scope '{included_scope.name}' was never added to configuration!"
-            self.remove_scope(included_scope.name)
-
         return scope

     @property
@@ -823,8 +763,6 @@ def _add_platform_scope(
     cfg: Configuration, name: str, path: str, priority: ConfigScopePriority, writable: bool = True
 ) -> None:
     """Add a platform-specific subdirectory for the current platform."""
-    import spack.platforms  # circular dependency
-
     platform = spack.platforms.host().name
     scope = DirectoryConfigScope(
         f"{name}/{platform}", os.path.join(path, platform), writable=writable
@@ -832,75 +770,6 @@ def _add_platform_scope(
|
|||||||
cfg.push_scope(scope, priority=priority)
|
cfg.push_scope(scope, priority=priority)
|
||||||
|
|
||||||
|
|
||||||
#: Class for the relevance of an optional path conditioned on a limited
|
|
||||||
#: python code that evaluates to a boolean and or explicit specification
|
|
||||||
#: as optional.
|
|
||||||
class IncludePath(NamedTuple):
|
|
||||||
path: str
|
|
||||||
when: str
|
|
||||||
sha256: str
|
|
||||||
optional: bool
|
|
||||||
|
|
||||||
|
|
||||||
def included_path(entry: Union[str, dict]) -> IncludePath:
|
|
||||||
"""Convert the included path entry into an IncludePath.
|
|
||||||
|
|
||||||
Args:
|
|
||||||
entry: include configuration entry
|
|
||||||
|
|
||||||
Returns: converted entry, where an empty ``when`` means the path is
|
|
||||||
not conditionally included
|
|
||||||
"""
|
|
||||||
if isinstance(entry, str):
|
|
||||||
return IncludePath(path=entry, sha256="", when="", optional=False)
|
|
||||||
|
|
||||||
path = entry["path"]
|
|
||||||
sha256 = entry.get("sha256", "")
|
|
||||||
when = entry.get("when", "")
|
|
||||||
optional = entry.get("optional", False)
|
|
||||||
return IncludePath(path=path, sha256=sha256, when=when, optional=optional)
|
|
||||||
|
|
||||||
|
|
||||||
def include_path_scope(include: IncludePath) -> Optional[ConfigScope]:
|
|
||||||
"""Instantiate an appropriate configuration scope for the given path.
|
|
||||||
|
|
||||||
Args:
|
|
||||||
include: optional include path
|
|
||||||
|
|
||||||
Returns: configuration scope
|
|
||||||
|
|
||||||
Raises:
|
|
||||||
ValueError: included path has an unsupported URL scheme, is required
|
|
||||||
but does not exist; configuration stage directory argument is missing
|
|
||||||
ConfigFileError: unable to access remote configuration file(s)
|
|
||||||
"""
|
|
||||||
# circular dependencies
|
|
||||||
import spack.spec
|
|
||||||
|
|
||||||
if (not include.when) or spack.spec.eval_conditional(include.when):
|
|
||||||
config_path = rfc_util.local_path(include.path, include.sha256, _include_cache_location)
|
|
||||||
if not config_path:
|
|
||||||
raise ConfigFileError(f"Unable to fetch remote configuration from {include.path}")
|
|
||||||
|
|
||||||
if os.path.isdir(config_path):
|
|
||||||
# directories are treated as regular ConfigScopes
|
|
||||||
config_name = f"{INCLUDE_SCOPE_PREFIX}:{os.path.basename(config_path)}"
|
|
||||||
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
|
|
||||||
return DirectoryConfigScope(config_name, config_path)
|
|
||||||
|
|
||||||
if os.path.exists(config_path):
|
|
||||||
# files are assumed to be SingleFileScopes
|
|
||||||
config_name = f"{INCLUDE_SCOPE_PREFIX}:{config_path}"
|
|
||||||
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
|
|
||||||
return SingleFileScope(config_name, config_path, spack.schema.merged.schema)
|
|
||||||
|
|
||||||
if not include.optional:
|
|
||||||
path = f" at ({config_path})" if config_path != include.path else ""
|
|
||||||
raise ValueError(f"Required path ({include.path}) does not exist{path}")
|
|
||||||
|
|
||||||
return None
|
|
||||||
|
|
||||||
|
|
||||||
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
|
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
|
||||||
"""Load configuration paths from entry points
|
"""Load configuration paths from entry points
|
||||||
|
|
||||||
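The removed helpers normalize two entry shapes: a bare string is an unconditional, required include, while dict entries may carry `when`, `sha256`, and `optional`. A runnable condensation of that normalization (logic copied from the hunk; the demo entries are made up):

from typing import NamedTuple, Union

class IncludePath(NamedTuple):  # field names taken from the removed class
    path: str
    when: str
    sha256: str
    optional: bool

def included_path(entry: Union[str, dict]) -> IncludePath:
    if isinstance(entry, str):  # bare strings are unconditional and required
        return IncludePath(path=entry, sha256="", when="", optional=False)
    return IncludePath(
        path=entry["path"],
        sha256=entry.get("sha256", ""),
        when=entry.get("when", ""),
        optional=entry.get("optional", False),
    )

assert included_path("site/config").optional is False
assert included_path({"path": "a.yaml", "optional": True}).optional is True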
@@ -926,7 +795,7 @@ def config_paths_from_entry_points() -> List[Tuple[str, str]]:
     return config_paths


-def create_incremental() -> Generator[Configuration, None, None]:
+def create() -> Configuration:
     """Singleton Configuration instance.

     This constructs one instance associated with this module and returns
@@ -970,25 +839,11 @@ def create_incremental() -> Generator[Configuration, None, None]:
         # Each scope can have per-platform overrides in subdirectories
         _add_platform_scope(cfg, name, path, priority=ConfigScopePriority.CONFIG_FILES)

-    # yield the config incrementally so that each config level's init code can get
-    # data from the one below. This can be tricky, but it enables us to have a
-    # single unified config system.
-    #
-    # TODO: think about whether we want to restrict what types of config can be used
-    #     at each level. e.g., we may want to just more forcibly disallow remote
-    #     config (which uses ssl and other config options) for some of the scopes,
-    #     to make the bootstrap issues more explicit, even if allowing config scope
-    #     init to reference lower scopes is more flexible.
-    yield cfg
-
-
-def create() -> Configuration:
-    """Create a configuration using create_incremental(), return the last yielded result."""
-    return list(create_incremental())[-1]
+    return cfg


 #: This is the singleton configuration instance for Spack.
-CONFIG: Configuration = lang.Singleton(create_incremental)  # type: ignore
+CONFIG: Configuration = lang.Singleton(create)  # type: ignore


 def add_from_file(filename: str, scope: Optional[str] = None) -> None:
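For context, the generator-backed singleton this hunk removes exposes a partially built configuration at each yield so lower scopes can be read while higher ones are still being set up. A minimal sketch of that pattern, with made-up scope names:

def build_incremental():
    cfg = {"defaults": True}
    yield cfg                      # usable once defaults are in place
    cfg["site"] = True
    yield cfg                      # fully built

def build():
    # equivalent of the removed create(): take the last yielded result
    return list(build_incremental())[-1]

assert build() == {"defaults": True, "site": True}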
@@ -1084,8 +939,7 @@ def set(path: str, value: Any, scope: Optional[str] = None) -> None:

     Accepts the path syntax described in ``get()``.
     """
-    result = CONFIG.set(path, value, scope)
-    return result
+    return CONFIG.set(path, value, scope)


 def scopes() -> lang.PriorityOrderedMapping[str, ConfigScope]:
@@ -1608,6 +1462,120 @@ def create_from(*scopes_or_paths: Union[ScopeWithOptionalPriority, str]) -> Configuration:
     return result


+def raw_github_gitlab_url(url: str) -> str:
+    """Transform a github URL to the raw form to avoid undesirable html.
+
+    Args:
+        url: url to be converted to raw form
+
+    Returns:
+        Raw github/gitlab url or the original url
+    """
+    # Note we rely on GitHub to redirect the 'raw' URL returned here to the
+    # actual URL under https://raw.githubusercontent.com/ with '/blob'
+    # removed and or, '/blame' if needed.
+    if "github" in url or "gitlab" in url:
+        return url.replace("/blob/", "/raw/")
+
+    return url
+
+
+def collect_urls(base_url: str) -> list:
+    """Return a list of configuration URLs.
+
+    Arguments:
+        base_url: URL for a configuration (yaml) file or a directory
+            containing yaml file(s)
+
+    Returns:
+        List of configuration file(s) or empty list if none
+    """
+    if not base_url:
+        return []
+
+    extension = ".yaml"
+
+    if base_url.endswith(extension):
+        return [base_url]
+
+    # Collect configuration URLs if the base_url is a "directory".
+    _, links = web_util.spider(base_url, 0)
+    return [link for link in links if link.endswith(extension)]
+
+
+def fetch_remote_configs(url: str, dest_dir: str, skip_existing: bool = True) -> str:
+    """Retrieve configuration file(s) at the specified URL.
+
+    Arguments:
+        url: URL for a configuration (yaml) file or a directory containing
+            yaml file(s)
+        dest_dir: destination directory
+        skip_existing: Skip files that already exist in dest_dir if
+            ``True``; otherwise, replace those files
+
+    Returns:
+        Path to the corresponding file if URL is or contains a
+        single file and it is the only file in the destination directory or
+        the root (dest_dir) directory if multiple configuration files exist
+        or are retrieved.
+    """

+    def _fetch_file(url):
+        raw = raw_github_gitlab_url(url)
+        tty.debug(f"Reading config from url {raw}")
+        return web_util.fetch_url_text(raw, dest_dir=dest_dir)
+
+    if not url:
+        raise ConfigFileError("Cannot retrieve configuration without a URL")
+
+    # Return the local path to the cached configuration file OR to the
+    # directory containing the cached configuration files.
+    config_links = collect_urls(url)
+    existing_files = os.listdir(dest_dir) if os.path.isdir(dest_dir) else []
+
+    paths = []
+    for config_url in config_links:
+        basename = os.path.basename(config_url)
+        if skip_existing and basename in existing_files:
+            tty.warn(
+                f"Will not fetch configuration from {config_url} since a "
+                f"version already exists in {dest_dir}"
+            )
+            path = os.path.join(dest_dir, basename)
+        else:
+            path = _fetch_file(config_url)
+
+        if path:
+            paths.append(path)
+
+    if paths:
+        return dest_dir if len(paths) > 1 else paths[0]
+
+    raise ConfigFileError(f"Cannot retrieve configuration (yaml) from {url}")
+
+
+def get_mark_from_yaml_data(obj):
+    """Try to get ``spack.util.spack_yaml`` mark from YAML data.
+
+    We try the object, and if that fails we try its first member (if it's a container).
+
+    Returns:
+        mark if one is found, otherwise None.
+    """
+    # mark of object itself
+    mark = getattr(obj, "_start_mark", None)
+    if mark:
+        return mark
+
+    # mark of first member if it is a container
+    if isinstance(obj, (list, dict)):
+        first_member = next(iter(obj), None)
+        if first_member:
+            mark = getattr(first_member, "_start_mark", None)
+
+    return mark
+
+
 def determine_number_of_jobs(
     *,
     parallel: bool = False,
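An illustrative check of the '/blob/' -> '/raw/' rewrite restored above, reproduced standalone so it runs without Spack (the repository URL is made up):

def to_raw(url: str) -> str:
    # same substitution raw_github_gitlab_url performs
    if "github" in url or "gitlab" in url:
        return url.replace("/blob/", "/raw/")
    return url

assert to_raw("https://github.com/org/repo/blob/main/packages.yaml") == (
    "https://github.com/org/repo/raw/main/packages.yaml"
)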
@@ -1712,7 +1680,3 @@ def get_path(path, data):

     # give up and return None if nothing worked
     return None
-
-
-class RecursiveIncludeError(spack.error.SpackError):
-    """Too many levels of recursive includes."""
@@ -7,7 +7,6 @@
 import collections
 import concurrent.futures
 import os
-import pathlib
 import re
 import sys
 import traceback
@@ -16,7 +15,6 @@

 import llnl.util.filesystem
 import llnl.util.lang
-import llnl.util.symlink
 import llnl.util.tty

 import spack.error
@@ -72,21 +70,13 @@ def dedupe_paths(paths: List[str]) -> List[str]:
     """Deduplicate paths based on inode and device number. In case the list contains first a
     symlink and then the directory it points to, the symlink is replaced with the directory path.
     This ensures that we pick for example ``/usr/bin`` over ``/bin`` if the latter is a symlink to
-    the former."""
+    the former`."""
     seen: Dict[Tuple[int, int], str] = {}

-    linked_parent_check = lambda x: any(
-        [llnl.util.symlink.islink(str(y)) for y in pathlib.Path(x).parents]
-    )
-
     for path in paths:
         identifier = file_identifier(path)
         if identifier not in seen:
             seen[identifier] = path
-        # we also want to deprioritize paths if they contain a symlink in any parent
-        # (not just the basedir): e.g. oneapi has "latest/bin",
-        # where "latest" is a symlink to 2025.0"
-        elif not (llnl.util.symlink.islink(path) or linked_parent_check(path)):
+        elif not os.path.islink(path):
             seen[identifier] = path
     return list(seen.values())
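A standalone sketch of the inode/device deduplication shown above: a later non-symlink spelling of the same file replaces an earlier symlinked one, mirroring the simplified branch this compare target keeps:

import os
from typing import Dict, List, Tuple

def dedupe_by_inode(paths: List[str]) -> List[str]:
    seen: Dict[Tuple[int, int], str] = {}
    for path in paths:
        st = os.stat(path)                # follows symlinks
        key = (st.st_dev, st.st_ino)      # identifies the underlying file
        if key not in seen:
            seen[key] = path
        elif not os.path.islink(path):    # prefer the real directory
            seen[key] = path
    return list(seen.values())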
@@ -166,7 +166,7 @@ def __init__(
             " ".join(self._install_target(s.safe_name()) for s in item.prereqs),
             item.target.spec_hash(),
             item.target.unsafe_format(
-                "{name}{@version}{variants}{ arch=architecture} {%compiler}"
+                "{name}{@version}{%compiler}{variants}{arch=architecture}"
             ),
             item.buildcache_flag,
         )
@@ -10,6 +10,8 @@
 import re
 import shutil
 import stat
+import urllib.parse
+import urllib.request
 import warnings
 from typing import Any, Dict, Iterable, List, Optional, Sequence, Tuple, Union

@@ -30,6 +32,7 @@
 import spack.paths
 import spack.repo
 import spack.schema.env
+import spack.schema.merged
 import spack.spec
 import spack.spec_list
 import spack.store
@@ -40,6 +43,7 @@
 import spack.util.path
 import spack.util.spack_json as sjson
 import spack.util.spack_yaml as syaml
+import spack.util.url
 from spack import traverse
 from spack.installer import PackageInstaller
 from spack.schema.env import TOP_LEVEL_KEY
@@ -385,7 +389,6 @@ def create_in_dir(
         # dev paths in this environment to refer to their original
         # locations.
         _rewrite_relative_dev_paths_on_relocation(env, init_file_dir)
-        _rewrite_relative_repos_paths_on_relocation(env, init_file_dir)

     return env

@@ -402,8 +405,8 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
            dev_path = substitute_path_variables(entry["path"])
            expanded_path = spack.util.path.canonicalize_path(dev_path, default_wd=init_file_dir)

-           # Skip if the substituted and expanded path is the same (e.g. when absolute)
-           if entry["path"] == expanded_path:
+           # Skip if the expanded path is the same (e.g. when absolute)
+           if dev_path == expanded_path:
                continue

            tty.debug("Expanding develop path for {0} to {1}".format(name, expanded_path))
@@ -418,34 +421,6 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
        env._re_read()


-def _rewrite_relative_repos_paths_on_relocation(env, init_file_dir):
-    """When initializing the environment from a manifest file and we plan
-    to store the environment in a different directory, we have to rewrite
-    relative repo paths to absolute ones and expand environment variables."""
-    with env:
-        repos_specs = spack.config.get("repos", default={}, scope=env.scope_name)
-        if not repos_specs:
-            return
-        for i, entry in enumerate(repos_specs):
-            repo_path = substitute_path_variables(entry)
-            expanded_path = spack.util.path.canonicalize_path(repo_path, default_wd=init_file_dir)
-
-            # Skip if the substituted and expanded path is the same (e.g. when absolute)
-            if entry == expanded_path:
-                continue
-
-            tty.debug("Expanding repo path for {0} to {1}".format(entry, expanded_path))
-
-            repos_specs[i] = expanded_path
-
-        spack.config.set("repos", repos_specs, scope=env.scope_name)
-
-        env.repos_specs = None
-        # If we changed the environment's spack.yaml scope, that will not be reflected
-        # in the manifest that we read
-        env._re_read()
-
-
 def environment_dir_from_name(name: str, exists_ok: bool = True) -> str:
     """Returns the directory associated with a named environment.
@@ -573,6 +548,13 @@ def _write_yaml(data, str_or_file):
     syaml.dump_config(data, str_or_file, default_flow_style=False)


+def _eval_conditional(string):
+    """Evaluate conditional definitions using restricted variable scope."""
+    valid_variables = spack.spec.get_host_environment()
+    valid_variables.update({"re": re, "env": os.environ})
+    return eval(string, valid_variables)
+
+
 def _is_dev_spec_and_has_changed(spec):
     """Check if the passed spec is a dev build and whether it has changed since the
     last installation"""
@@ -1005,7 +987,7 @@ def _process_definition(self, entry):
         """Process a single spec definition item."""
         when_string = entry.get("when")
         if when_string is not None:
-            when = spack.spec.eval_conditional(when_string)
+            when = _eval_conditional(when_string)
             assert len([x for x in entry if x != "when"]) == 1
         else:
             when = True
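A minimal sketch of the restricted-scope evaluation `_eval_conditional` performs; the real scope also injects host platform/os/target data from spack.spec.get_host_environment(), which is omitted here, and the example `when:` string is made up:

import os
import re

def eval_when(expr: str) -> bool:
    scope = {"re": re, "env": os.environ}  # reduced variable scope
    return bool(eval(expr, scope))

# e.g. a 'when:' string gating a definition on an environment variable:
eval_when("env.get('SPACK_CI', '') == 'true'")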
@@ -1550,6 +1532,9 @@ def _get_specs_to_concretize(
         return new_user_specs, kept_user_specs, specs_to_concretize

     def _concretize_together_where_possible(self, tests: bool = False) -> Sequence[SpecPair]:
+        # Avoid cyclic dependency
+        import spack.solver.asp
+
         # Exit early if the set of concretized specs is the set of user specs
         new_user_specs, _, specs_to_concretize = self._get_specs_to_concretize()
         if not new_user_specs:
@@ -2660,23 +2645,20 @@ def _ensure_env_dir():
         # error handling for bad manifests is handled on other code paths
         return

-    # TODO: make this recursive
     includes = manifest[TOP_LEVEL_KEY].get("include", [])
     for include in includes:
-        included_path = spack.config.included_path(include)
-        path = included_path.path
-        if os.path.isabs(path):
+        if os.path.isabs(include):
             continue

-        abspath = pathlib.Path(os.path.normpath(environment_dir / path))
+        abspath = pathlib.Path(os.path.normpath(environment_dir / include))
         common_path = pathlib.Path(os.path.commonpath([environment_dir, abspath]))
         if common_path != environment_dir:
-            tty.debug(f"Will not copy relative include file from outside environment: {path}")
+            tty.debug(f"Will not copy relative include from outside environment: {include}")
             continue

-        orig_abspath = os.path.normpath(envfile.parent / path)
+        orig_abspath = os.path.normpath(envfile.parent / include)
         if not os.path.exists(orig_abspath):
-            tty.warn(f"Included file does not exist; will not copy: '{path}'")
+            tty.warn(f"Included file does not exist; will not copy: '{include}'")
             continue

         fs.touchp(abspath)
@@ -2899,7 +2881,7 @@ def extract_name(_item):
             continue

         condition_str = item.get("when", "True")
-        if not spack.spec.eval_conditional(condition_str):
+        if not _eval_conditional(condition_str):
             continue

         yield idx, item
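The containment check in the hunk above prevents a relative include from escaping the environment directory. A standalone sketch of that check (paths are made up):

import os
import pathlib

def stays_inside(environment_dir: pathlib.Path, include: str) -> bool:
    # normalize the joined path, then require the common prefix to be the env dir
    abspath = pathlib.Path(os.path.normpath(environment_dir / include))
    common = pathlib.Path(os.path.commonpath([environment_dir, abspath]))
    return common == environment_dir

assert stays_inside(pathlib.Path("/env"), "configs/packages.yaml")
assert not stays_inside(pathlib.Path("/env"), "../outside.yaml")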
@@ -2960,20 +2942,127 @@ def __iter__(self):
     def __str__(self):
         return str(self.manifest_file)

+    @property
+    def included_config_scopes(self) -> List[spack.config.ConfigScope]:
+        """List of included configuration scopes from the manifest.
+
+        Scopes are listed in the YAML file in order from highest to
+        lowest precedence, so configuration from earlier scope will take
+        precedence over later ones.
+
+        This routine returns them in the order they should be pushed onto
+        the internal scope stack (so, in reverse, from lowest to highest).
+
+        Returns: Configuration scopes associated with the environment manifest
+
+        Raises:
+            SpackEnvironmentError: if the manifest includes a remote file but
+                no configuration stage directory has been identified
+        """
+        scopes: List[spack.config.ConfigScope] = []
+
+        # load config scopes added via 'include:', in reverse so that
+        # highest-precedence scopes are last.
+        includes = self[TOP_LEVEL_KEY].get("include", [])
+        missing = []
+        for i, config_path in enumerate(reversed(includes)):
+            # allow paths to contain spack config/environment variables, etc.
+            config_path = substitute_path_variables(config_path)
+            include_url = urllib.parse.urlparse(config_path)
+
+            # If scheme is not valid, config_path is not a url
+            # of a type Spack is generally aware
+            if spack.util.url.validate_scheme(include_url.scheme):
+                # Transform file:// URLs to direct includes.
+                if include_url.scheme == "file":
+                    config_path = urllib.request.url2pathname(include_url.path)
+
+                # Any other URL should be fetched.
+                elif include_url.scheme in ("http", "https", "ftp"):
+                    # Stage any remote configuration file(s)
+                    staged_configs = (
+                        os.listdir(self.config_stage_dir)
+                        if os.path.exists(self.config_stage_dir)
+                        else []
+                    )
+                    remote_path = urllib.request.url2pathname(include_url.path)
+                    basename = os.path.basename(remote_path)
+                    if basename in staged_configs:
+                        # Do NOT re-stage configuration files over existing
+                        # ones with the same name since there is a risk of
+                        # losing changes (e.g., from 'spack config update').
+                        tty.warn(
+                            "Will not re-stage configuration from {0} to avoid "
+                            "losing changes to the already staged file of the "
+                            "same name.".format(remote_path)
+                        )
+
+                        # Recognize the configuration stage directory
+                        # is flattened to ensure a single copy of each
+                        # configuration file.
+                        config_path = self.config_stage_dir
+                        if basename.endswith(".yaml"):
+                            config_path = os.path.join(config_path, basename)
+                    else:
+                        staged_path = spack.config.fetch_remote_configs(
+                            config_path, str(self.config_stage_dir), skip_existing=True
+                        )
+                        if not staged_path:
+                            raise SpackEnvironmentError(
+                                "Unable to fetch remote configuration {0}".format(config_path)
+                            )
+                        config_path = staged_path
+
+                elif include_url.scheme:
+                    raise ValueError(
+                        f"Unsupported URL scheme ({include_url.scheme}) for "
+                        f"environment include: {config_path}"
+                    )
+
+            # treat relative paths as relative to the environment
+            if not os.path.isabs(config_path):
+                config_path = os.path.join(self.manifest_dir, config_path)
+                config_path = os.path.normpath(os.path.realpath(config_path))
+
+            if os.path.isdir(config_path):
+                # directories are treated as regular ConfigScopes
+                config_name = f"env:{self.name}:{os.path.basename(config_path)}"
+                tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
+                scopes.append(spack.config.DirectoryConfigScope(config_name, config_path))
+            elif os.path.exists(config_path):
+                # files are assumed to be SingleFileScopes
+                config_name = f"env:{self.name}:{config_path}"
+                tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
+                scopes.append(
+                    spack.config.SingleFileScope(
+                        config_name, config_path, spack.schema.merged.schema
+                    )
+                )
+            else:
+                missing.append(config_path)
+                continue
+
+        if missing:
+            msg = "Detected {0} missing include path(s):".format(len(missing))
+            msg += "\n   {0}".format("\n   ".join(missing))
+            raise spack.config.ConfigFileError(msg)
+
+        return scopes
+
     @property
     def env_config_scopes(self) -> List[spack.config.ConfigScope]:
         """A list of all configuration scopes for the environment manifest. On the first call this
         instantiates all the scopes, on subsequent calls it returns the cached list."""
         if self._config_scopes is not None:
             return self._config_scopes

         scopes: List[spack.config.ConfigScope] = [
+            *self.included_config_scopes,
             spack.config.SingleFileScope(
                 self.scope_name,
                 str(self.manifest_file),
                 spack.schema.env.schema,
                 yaml_path=[TOP_LEVEL_KEY],
-            )
+            ),
         ]
         ensure_no_disallowed_env_config_mods(scopes)
         self._config_scopes = scopes
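For reference, the kinds of `include:` entries the restored property consumes, in its string-only form, with the scope type each would map to per the branches above (all values hypothetical):

includes = [
    "./relative/configs",                 # directory -> DirectoryConfigScope
    "/abs/path/site.yaml",                # existing file -> SingleFileScope
    "file:///opt/spack/configs",          # file:// URL -> resolved to a local path
    "https://example.com/packages.yaml",  # http(s)/ftp -> staged via fetch_remote_configs
]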
@@ -643,7 +643,7 @@ def print_status(self, *specs, **kwargs):
            specs.sort()

            abbreviated = [
-               s.cformat("{name}{@version}{compiler_flags}{variants}{%compiler}")
+               s.cformat("{name}{@version}{%compiler}{compiler_flags}{variants}")
                for s in specs
            ]
@@ -482,7 +482,7 @@ class SimpleDAG(DotGraphBuilder):
     """Simple DOT graph, with nodes colored uniformly and edges without properties"""

     def node_entry(self, node):
-        format_option = "{name}{@version}{/hash:7}{%compiler}"
+        format_option = "{name}{@version}{%compiler}{/hash:7}"
         return node.dag_hash(), f'[label="{node.format(format_option)}"]'

     def edge_entry(self, edge):
@@ -515,7 +515,7 @@ def visit(self, edge):
         super().visit(edge)

     def node_entry(self, node):
-        node_str = node.format("{name}{@version}{/hash:7}{%compiler}")
+        node_str = node.format("{name}{@version}{%compiler}{/hash:7}")
         options = f'[label="{node_str}", group="build_dependencies", fillcolor="coral"]'
         if node.dag_hash() in self.main_unified_space:
             options = f'[label="{node_str}", group="main_psid"]'
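Illustrative only: these hunks reorder Spec.format placeholders so the compiler annotation renders before the short hash. Mimicked below with plain f-strings (Spack's Spec.format handles the sigils itself; the spec values are made up):

name, version, short_hash, compiler = "zlib", "1.3.1", "abcdefg", "gcc@13.2.0"
before = f"{name}@{version}/{short_hash}%{compiler}"  # {/hash:7}{%compiler}
after = f"{name}@{version}%{compiler}/{short_hash}"   # {%compiler}{/hash:7}
print(before, "->", after)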
@@ -21,6 +21,7 @@
 from llnl.util.lang import nullcontext
 from llnl.util.tty.color import colorize

+import spack.build_environment
 import spack.config
 import spack.error
 import spack.package_base
@@ -397,7 +398,7 @@ def stand_alone_tests(self, kwargs):
         Args:
             kwargs (dict): arguments to be used by the test process
         """
-        import spack.build_environment  # avoid circular dependency
+        import spack.build_environment

         spack.build_environment.start_build_process(self.pkg, test_process, kwargs)

@@ -462,8 +463,6 @@ def write_tested_status(self):

 @contextlib.contextmanager
 def test_part(pkg: Pb, test_name: str, purpose: str, work_dir: str = ".", verbose: bool = False):
-    import spack.build_environment  # avoid circular dependency
-
     wdir = "." if work_dir is None else work_dir
     tester = pkg.tester
     assert test_name and test_name.startswith(
@@ -564,12 +564,6 @@ def __init__(self, configuration):
     def spec(self):
         return self.conf.spec

-    @tengine.context_property
-    def tags(self):
-        if not hasattr(self.spec.package, "tags"):
-            return []
-        return self.spec.package.tags
-
     @tengine.context_property
     def timestamp(self):
         return datetime.datetime.now()
@@ -125,10 +125,9 @@ def windows_establish_runtime_linkage(self):
         # Spack should in general not modify things it has not installed
         # we can reasonably expect externals to have their link interface properly established
         if sys.platform == "win32" and not self.spec.external:
-            win_rpath = fsys.WindowsSimulatedRPath(self)
-            win_rpath.add_library_dependent(*self.win_add_library_dependent())
-            win_rpath.add_rpath(*self.win_add_rpath())
-            win_rpath.establish_link()
+            self.win_rpath.add_library_dependent(*self.win_add_library_dependent())
+            self.win_rpath.add_rpath(*self.win_add_rpath())
+            self.win_rpath.establish_link()
@@ -743,6 +742,7 @@ def __init__(self, spec):
         # Set up timing variables
         self._fetch_time = 0.0

+        self.win_rpath = fsys.WindowsSimulatedRPath(self)
         super().__init__()

     def __getitem__(self, key: str) -> "PackageBase":
@@ -108,8 +108,6 @@ def _get_user_cache_path():
 #: transient caches for Spack data (virtual cache, patch sha256 lookup, etc.)
 default_misc_cache_path = os.path.join(user_cache_path, "cache")

-#: concretization cache for Spack concretizations
-default_conc_cache_path = os.path.join(default_misc_cache_path, "concretization")
-
 # Below paths pull configuration from the host environment.
 #
@@ -32,7 +32,6 @@
 import llnl.util.tty as tty
 from llnl.util.filesystem import working_dir

-import spack
 import spack.caches
 import spack.config
 import spack.error
@@ -50,8 +49,6 @@
 #: Package modules are imported as spack.pkg.<repo-namespace>.<pkg-name>
 ROOT_PYTHON_NAMESPACE = "spack.pkg"

-_API_REGEX = re.compile(r"^v(\d+)\.(\d+)$")
-
-
 def python_package_for_repo(namespace):
     """Returns the full namespace of a repository, given its relative one
@@ -912,52 +909,19 @@ def __reduce__(self):
         return RepoPath.unmarshal, self.marshal()


-def _parse_package_api_version(
-    config: Dict[str, Any],
-    min_api: Tuple[int, int] = spack.min_package_api_version,
-    max_api: Tuple[int, int] = spack.package_api_version,
-) -> Tuple[int, int]:
-    api = config.get("api")
-    if api is None:
-        package_api = (1, 0)
-    else:
-        if not isinstance(api, str):
-            raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
-        api_match = _API_REGEX.match(api)
-        if api_match is None:
-            raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
-        package_api = (int(api_match.group(1)), int(api_match.group(2)))
-
-    if min_api <= package_api <= max_api:
-        return package_api
-
-    min_str = ".".join(str(i) for i in min_api)
-    max_str = ".".join(str(i) for i in max_api)
-    curr_str = ".".join(str(i) for i in package_api)
-    raise BadRepoError(
-        f"Package API v{curr_str} is not supported by this version of Spack ("
-        f"must be between v{min_str} and v{max_str})"
-    )
-
-
 class Repo:
     """Class representing a package repository in the filesystem.

-    Each package repository must have a top-level configuration file called `repo.yaml`.
+    Each package repository must have a top-level configuration file
+    called `repo.yaml`.

-    It contains the following keys:
+    Currently, `repo.yaml` must define:

     `namespace`:
         A Python namespace where the repository's packages should live.

     `subdirectory`:
         An optional subdirectory name where packages are placed

-    `api`:
-        A string of the form vX.Y that indicates the Package API version. The default is "v1.0".
-        For the repo to be compatible with the current version of Spack, the version must be
-        greater than or equal to :py:data:`spack.min_package_api_version` and less than or equal to
-        :py:data:`spack.package_api_version`.
     """

     def __init__(
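A standalone sketch of the "vX.Y" parsing and range check removed above; the bounds are made-up stand-ins for spack.min_package_api_version and spack.package_api_version:

import re

_API_REGEX = re.compile(r"^v(\d+)\.(\d+)$")
MIN_API, MAX_API = (1, 0), (2, 0)  # assumed bounds for illustration

def parse_api(api: str):
    m = _API_REGEX.match(api)
    if m is None:
        raise ValueError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
    version = (int(m.group(1)), int(m.group(2)))
    if not (MIN_API <= version <= MAX_API):  # tuple comparison handles X before Y
        raise ValueError(f"Package API v{version[0]}.{version[1]} is out of range")
    return version

assert parse_api("v1.0") == (1, 0)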
@@ -994,7 +958,7 @@ def check(condition, msg):
             f"{os.path.join(root, repo_config_name)} must define a namespace.",
         )

-        self.namespace: str = config["namespace"]
+        self.namespace = config["namespace"]
         check(
             re.match(r"[a-zA-Z][a-zA-Z0-9_.]+", self.namespace),
             f"Invalid namespace '{self.namespace}' in repo '{self.root}'. "
@@ -1007,14 +971,12 @@ def check(condition, msg):
         # Keep name components around for checking prefixes.
         self._names = self.full_namespace.split(".")

-        packages_dir: str = config.get("subdirectory", packages_dir_name)
+        packages_dir = config.get("subdirectory", packages_dir_name)
         self.packages_path = os.path.join(self.root, packages_dir)
         check(
             os.path.isdir(self.packages_path), f"No directory '{packages_dir}' found in '{root}'"
         )

-        self.package_api = _parse_package_api_version(config)
-
         # Class attribute overrides by package name
         self.overrides = overrides or {}

@@ -1064,7 +1026,7 @@ def is_prefix(self, fullname: str) -> bool:
         parts = fullname.split(".")
         return self._names[: len(parts)] == parts

-    def _read_config(self) -> Dict[str, Any]:
+    def _read_config(self) -> Dict[str, str]:
         """Check for a YAML config file in this db's root directory."""
         try:
             with open(self.config_file, encoding="utf-8") as reponame_file:
@@ -1406,8 +1368,6 @@ def create_repo(root, namespace=None, subdir=packages_dir_name):
             config.write(f"  namespace: '{namespace}'\n")
             if subdir != packages_dir_name:
                 config.write(f"  subdirectory: '{subdir}'\n")
-            x, y = spack.package_api_version
-            config.write(f"  api: v{x}.{y}\n")

         except OSError as e:
             # try to clean up.
@@ -58,15 +58,6 @@
                     {"type": "string"},  # deprecated
                 ]
             },
-            "concretization_cache": {
-                "type": "object",
-                "properties": {
-                    "enable": {"type": "boolean"},
-                    "url": {"type": "string"},
-                    "entry_limit": {"type": "integer", "minimum": 0},
-                    "size_limit": {"type": "integer", "minimum": 0},
-                },
-            },
             "install_hash_length": {"type": "integer", "minimum": 1},
             "install_path_scheme": {"type": "string"},  # deprecated
             "build_stage": {
@@ -29,7 +29,11 @@
             # merged configuration scope schemas
             spack.schema.merged.properties,
             # extra environment schema properties
-            {"specs": spec_list_schema, "include_concrete": include_concrete},
+            {
+                "include": {"type": "array", "default": [], "items": {"type": "string"}},
+                "specs": spec_list_schema,
+                "include_concrete": include_concrete,
+            },
         ),
     }
 }
@@ -1,41 +0,0 @@
-# Copyright Spack Project Developers. See COPYRIGHT file for details.
-#
-# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-"""Schema for include.yaml configuration file.
-
-.. literalinclude:: _spack_root/lib/spack/spack/schema/include.py
-   :lines: 12-
-"""
-from typing import Any, Dict
-
-#: Properties for inclusion in other schemas
-properties: Dict[str, Any] = {
-    "include": {
-        "type": "array",
-        "default": [],
-        "additionalProperties": False,
-        "items": {
-            "anyOf": [
-                {
-                    "type": "object",
-                    "properties": {
-                        "when": {"type": "string"},
-                        "path": {"type": "string"},
-                        "sha256": {"type": "string"},
-                        "optional": {"type": "boolean"},
-                    },
-                    "required": ["path"],
-                    "additionalProperties": False,
-                },
-                {"type": "string"},
-            ]
-        },
-    }
-}
-
-#: Full schema with metadata
-schema = {
-    "$schema": "http://json-schema.org/draft-07/schema#",
-    "title": "Spack include configuration file schema",
-    "properties": properties,
-}
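A minimal check (assuming the jsonschema package) of the difference these schema hunks make: the restored "include" property accepts only strings, whereas the removed include.py schema also allowed object entries:

import jsonschema

string_only = {"type": "array", "items": {"type": "string"}}
jsonschema.validate(["./configs", "https://example.com/a.yaml"], string_only)  # passes
try:
    jsonschema.validate([{"path": "./configs", "optional": True}], string_only)
except jsonschema.ValidationError:
    pass  # dict entries are rejected once include.py is gone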
@@ -21,7 +21,6 @@
 import spack.schema.definitions
 import spack.schema.develop
 import spack.schema.env_vars
-import spack.schema.include
 import spack.schema.mirrors
 import spack.schema.modules
 import spack.schema.packages
@@ -41,7 +40,6 @@
     spack.schema.definitions.properties,
     spack.schema.develop.properties,
     spack.schema.env_vars.properties,
-    spack.schema.include.properties,
     spack.schema.mirrors.properties,
     spack.schema.modules.properties,
     spack.schema.packages.properties,
@@ -50,6 +48,7 @@
     spack.schema.view.properties,
 )

+
 #: Full schema with metadata
 schema = {
     "$schema": "http://json-schema.org/draft-07/schema#",
@@ -5,12 +5,9 @@
 import collections.abc
 import copy
 import enum
-import errno
 import functools
-import hashlib
 import io
 import itertools
-import json
 import os
 import pathlib
 import pprint
@@ -20,25 +17,12 @@
 import typing
 import warnings
 from contextlib import contextmanager
-from typing import (
-    IO,
-    Callable,
-    Dict,
-    Iterator,
-    List,
-    NamedTuple,
-    Optional,
-    Set,
-    Tuple,
-    Type,
-    Union,
-)
+from typing import Callable, Dict, Iterator, List, NamedTuple, Optional, Set, Tuple, Type, Union

 import archspec.cpu

 import llnl.util.lang
 import llnl.util.tty as tty
-from llnl.util.filesystem import current_file_position
 from llnl.util.lang import elide_list

 import spack
@@ -50,18 +34,15 @@
 import spack.deptypes as dt
 import spack.environment as ev
 import spack.error
-import spack.hash_types as ht
 import spack.package_base
 import spack.package_prefs
 import spack.patch
-import spack.paths
 import spack.platforms
 import spack.repo
 import spack.solver.splicing
 import spack.spec
 import spack.store
 import spack.util.crypto
-import spack.util.hash
 import spack.util.libc
 import spack.util.module_cmd as md
 import spack.util.path
@@ -70,7 +51,6 @@
 import spack.version as vn
 import spack.version.git_ref_lookup
 from spack import traverse
-from spack.util.file_cache import FileCache

 from .core import (
     AspFunction,
@@ -558,364 +538,6 @@ def format_unsolved(unsolved_specs):
             msg += "\n\t(No candidate specs from solver)"
         return msg

-    def to_dict(self, test: bool = False) -> dict:
-        """Produces dict representation of Result object
-
-        Does not include anything related to unsatisfiability as we
-        are only interested in storing satisfiable results
-        """
-        serial_node_arg = (
-            lambda node_dict: f"""{{"id": "{node_dict.id}", "pkg": "{node_dict.pkg}"}}"""
-        )
-        spec_hash_type = ht.process_hash if test else ht.dag_hash
-        ret = dict()
-        ret["asp"] = self.asp
-        ret["criteria"] = self.criteria
-        ret["optimal"] = self.optimal
-        ret["warnings"] = self.warnings
-        ret["nmodels"] = self.nmodels
-        ret["abstract_specs"] = [str(x) for x in self.abstract_specs]
-        ret["satisfiable"] = self.satisfiable
-        serial_answers = []
-        for answer in self.answers:
-            serial_answer = answer[:2]
-            serial_answer_dict = {}
-            for node, spec in answer[2].items():
-                serial_answer_dict[serial_node_arg(node)] = spec.to_dict(hash=spec_hash_type)
-            serial_answer = serial_answer + (serial_answer_dict,)
-            serial_answers.append(serial_answer)
-        ret["answers"] = serial_answers
-        ret["specs_by_input"] = {}
-        input_specs = {} if not self.specs_by_input else self.specs_by_input
-        for input, spec in input_specs.items():
-            ret["specs_by_input"][str(input)] = spec.to_dict(hash=spec_hash_type)
-        return ret
-
-    @staticmethod
-    def from_dict(obj: dict):
-        """Returns Result object from compatible dictionary"""
-
-        def _dict_to_node_argument(dict):
-            id = dict["id"]
-            pkg = dict["pkg"]
-            return NodeArgument(id=id, pkg=pkg)
-
-        def _str_to_spec(spec_str):
-            return spack.spec.Spec(spec_str)
-
-        def _dict_to_spec(spec_dict):
-            loaded_spec = spack.spec.Spec.from_dict(spec_dict)
-            _ensure_external_path_if_external(loaded_spec)
-            spack.spec.Spec.ensure_no_deprecated(loaded_spec)
-            return loaded_spec
-
-        asp = obj.get("asp")
-        spec_list = obj.get("abstract_specs")
-        if not spec_list:
-            raise RuntimeError("Invalid json for concretization Result object")
-        if spec_list:
-            spec_list = [_str_to_spec(x) for x in spec_list]
-        result = Result(spec_list, asp)
-        result.criteria = obj.get("criteria")
-        result.optimal = obj.get("optimal")
-        result.warnings = obj.get("warnings")
-        result.nmodels = obj.get("nmodels")
-        result.satisfiable = obj.get("satisfiable")
-        result._unsolved_specs = []
-        answers = []
-        for answer in obj.get("answers", []):
-            loaded_answer = answer[:2]
-            answer_node_dict = {}
-            for node, spec in answer[2].items():
-                answer_node_dict[_dict_to_node_argument(json.loads(node))] = _dict_to_spec(spec)
-            loaded_answer.append(answer_node_dict)
-            answers.append(tuple(loaded_answer))
-        result.answers = answers
-        result._concrete_specs_by_input = {}
-        result._concrete_specs = []
-        for input, spec in obj.get("specs_by_input", {}).items():
-            result._concrete_specs_by_input[_str_to_spec(input)] = _dict_to_spec(spec)
-            result._concrete_specs.append(_dict_to_spec(spec))
-        return result
-
-
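A minimal sketch of the to_dict/from_dict round trip removed above, using a plain dataclass in place of the solver's Result (field names are illustrative, not the real Result attributes):

import json
from dataclasses import asdict, dataclass

@dataclass
class MiniResult:
    satisfiable: bool
    nmodels: int
    abstract_specs: list

    @staticmethod
    def from_dict(obj: dict) -> "MiniResult":
        return MiniResult(**obj)

# serialize to JSON and reconstruct, as the removed cache did with Result
blob = json.dumps(asdict(MiniResult(True, 1, ["zlib"])))
assert MiniResult.from_dict(json.loads(blob)).satisfiable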
class ConcretizationCache:
|
|
||||||
"""Store for Spack concretization results and statistics
|
|
||||||
|
|
||||||
Serializes solver result objects and statistics to json and stores
|
|
||||||
at a given endpoint in a cache associated by the sha256 of the
|
|
||||||
asp problem and the involved control files.
|
|
||||||
"""
|
|
||||||
|
|
||||||
def __init__(self, root: Union[str, None] = None):
|
|
||||||
root = root or spack.config.get(
|
|
||||||
"config:concretization_cache:url", spack.paths.default_conc_cache_path
|
|
||||||
)
|
|
||||||
self.root = pathlib.Path(spack.util.path.canonicalize_path(root))
|
|
||||||
self._fc = FileCache(self.root)
|
|
||||||
self._cache_manifest = ".cache_manifest"
|
|
||||||
self._manifest_queue: List[Tuple[pathlib.Path, int]] = []
|
|
||||||
|
|
||||||
def cleanup(self):
|
|
||||||
"""Prunes the concretization cache according to configured size and entry
|
|
||||||
count limits. Cleanup is done in FIFO ordering."""
|
|
||||||
# TODO: determine a better default
|
|
||||||
entry_limit = spack.config.get("config:concretization_cache:entry_limit", 1000)
|
|
||||||
bytes_limit = spack.config.get("config:concretization_cache:size_limit", 3e8)
|
|
||||||
# lock the entire buildcache as we're removing a lot of data from the
|
|
||||||
# manifest and cache itself
|
|
||||||
with self._fc.read_transaction(self._cache_manifest) as f:
|
|
||||||
count, cache_bytes = self._extract_cache_metadata(f)
|
|
||||||
if not count or not cache_bytes:
|
|
||||||
return
|
|
||||||
entry_count = int(count)
|
|
||||||
manifest_bytes = int(cache_bytes)
|
|
||||||
# move beyond the metadata entry
|
|
||||||
f.readline()
|
|
||||||
if entry_count > entry_limit and entry_limit > 0:
|
|
||||||
with self._fc.write_transaction(self._cache_manifest) as (old, new):
|
|
||||||
# prune the oldest 10% or until we have removed 10% of
|
|
||||||
# total bytes starting from oldest entry
|
|
||||||
# TODO: make this configurable?
|
|
||||||
prune_count = entry_limit // 10
|
|
||||||
lines_to_prune = f.readlines(prune_count)
|
|
||||||
for i, line in enumerate(lines_to_prune):
|
|
||||||
sha, cache_entry_bytes = self._parse_manifest_entry(line)
|
|
||||||
if sha and cache_entry_bytes:
|
|
||||||
cache_path = self._cache_path_from_hash(sha)
|
|
||||||
if self._fc.remove(cache_path):
|
|
||||||
entry_count -= 1
|
|
||||||
manifest_bytes -= int(cache_entry_bytes)
|
|
||||||
else:
|
|
||||||
tty.warn(
|
|
||||||
f"Invalid concretization cache entry: '{line}' on line: {i+1}"
|
|
||||||
)
|
|
||||||
self._write_manifest(f, entry_count, manifest_bytes)
|
|
||||||
|
|
||||||
elif manifest_bytes > bytes_limit and bytes_limit > 0:
|
|
||||||
with self._fc.write_transaction(self._cache_manifest) as (old, new):
|
|
||||||
# take 10% of current size off
|
|
||||||
prune_amount = bytes_limit // 10
|
|
||||||
total_pruned = 0
|
|
||||||
i = 0
|
|
||||||
while total_pruned < prune_amount:
|
|
||||||
sha, manifest_cache_bytes = self._parse_manifest_entry(f.readline())
|
|
||||||
if sha and manifest_cache_bytes:
|
|
||||||
entry_bytes = int(manifest_cache_bytes)
|
|
||||||
cache_path = self.root / sha[:2] / sha
|
|
||||||
if self._safe_remove(cache_path):
|
|
||||||
entry_count -= 1
|
|
||||||
entry_bytes -= entry_bytes
|
|
||||||
total_pruned += entry_bytes
|
|
||||||
else:
|
|
||||||
tty.warn(
|
|
||||||
"Invalid concretization cache entry "
|
|
||||||
f"'{sha} {manifest_cache_bytes}' on line: {i}"
|
|
||||||
)
|
|
||||||
i += 1
|
|
||||||
self._write_manifest(f, entry_count, manifest_bytes)
|
|
||||||
for cache_dir in self.root.iterdir():
|
|
||||||
if cache_dir.is_dir() and not any(cache_dir.iterdir()):
|
|
||||||
self._safe_remove(cache_dir)
|
|
||||||
|
|
||||||
    def cache_entries(self):
        """Generator producing cache entries"""
        for cache_dir in self.root.iterdir():
            # ensure component is cache entry directory
            # not metadata file
            if cache_dir.is_dir():
                for cache_entry in cache_dir.iterdir():
                    if not cache_entry.is_dir():
                        yield cache_entry
                    else:
                        raise RuntimeError(
                            "Improperly formed concretization cache. "
                            f"Directory {cache_entry.name} is improperly located "
                            "within the concretization cache."
                        )

    def _parse_manifest_entry(self, line):
        """Returns a parsed manifest entry line
        with handling for invalid reads."""
        if line:
            cache_values = line.strip("\n").split(" ")
            if len(cache_values) < 2:
                tty.warn(f"Invalid cache entry at {line}")
                return None, None
            return cache_values  # the (sha, bytes) pair
        return None, None

    def _write_manifest(self, manifest_file, entry_count, entry_bytes):
        """Writes new concretization cache manifest file.

        Arguments:
            manifest_file: IO stream opened for reading
              and writing wrapping the manifest file
              with cursor at calltime set to location
              where manifest should be truncated
            entry_count: new total entry count
            entry_bytes: new total entry bytes count
        """
        persisted_entries = manifest_file.readlines()
        manifest_file.truncate(0)
        manifest_file.seek(0)  # truncate does not move the cursor; rewind first
        manifest_file.write(f"{entry_count} {entry_bytes}\n")
        manifest_file.writelines(persisted_entries)

    def _results_from_cache(self, cache_entry_buffer: IO[str]) -> Union[Result, None]:
        """Returns a Result object from the concretizer cache

        Reads the cache hit and uses ``Result``'s own deserializer
        to produce a new Result object
        """
        with current_file_position(cache_entry_buffer, 0):
            cache_str = cache_entry_buffer.read()
            # TODO: Should this be an error if None?
            # Same for _stats_from_cache
            if cache_str:
                cache_entry = json.loads(cache_str)
                result_json = cache_entry["results"]
                return Result.from_dict(result_json)
        return None

    def _stats_from_cache(self, cache_entry_buffer: IO[str]) -> Union[List, None]:
        """Returns concretization statistics from the
        concretization associated with the cache.

        Deserializes the JSON representation of the
        statistics covering the cached concretization run
        and returns the Python data structures
        """
        with current_file_position(cache_entry_buffer, 0):
            cache_str = cache_entry_buffer.read()
            if cache_str:
                return json.loads(cache_str)["statistics"]
        return None

    def _extract_cache_metadata(self, cache_stream: IO[str]):
        """Extracts and returns cache entry count and bytes count from head of manifest
        file"""
        # make sure we're always reading from the beginning of the stream
        # concretization cache manifest data lives at the top of the file
        with current_file_position(cache_stream, 0):
            return self._parse_manifest_entry(cache_stream.readline())

    def _prefix_digest(self, problem: str) -> Tuple[str, str]:
        """Return the first two characters of, and the full, sha256 of the given asp problem"""
        prob_digest = hashlib.sha256(problem.encode()).hexdigest()
        prefix = prob_digest[:2]
        return prefix, prob_digest

    def _cache_path_from_problem(self, problem: str) -> pathlib.Path:
        """Returns a Path object representing the path to the cache
        entry for the given problem"""
        prefix, digest = self._prefix_digest(problem)
        return pathlib.Path(prefix) / digest

    def _cache_path_from_hash(self, hash: str) -> pathlib.Path:
        """Returns a Path object representing the cache entry
        corresponding to the given sha256 hash"""
        return pathlib.Path(hash[:2]) / hash

    def _lock_prefix_from_cache_path(self, cache_path: str):
        """Returns the bit location corresponding to a given cache entry path
        for file locking"""
        return spack.util.hash.base32_prefix_bits(
            spack.util.hash.b32_hash(cache_path), spack.util.crypto.bit_length(sys.maxsize)
        )

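A cache entry path is derived purely from the problem text, so lookups need no index: hash, shard on the first two hex characters (256 buckets, so no single directory accumulates every entry), and join. A standalone sketch of the same derivation:

# Standalone sketch of the prefix/digest sharding used by the helpers above.
import hashlib
import pathlib

problem = "% hypothetical ASP program text"
digest = hashlib.sha256(problem.encode()).hexdigest()
entry = pathlib.Path(digest[:2]) / digest
print(entry)  # e.g. 3f/3f9a... (two-character shard, then the full digest)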
    def flush_manifest(self):
        """Updates the concretization cache manifest file after a cache write operation
        Updates the current byte count and entry counts and writes to the head of the
        manifest file"""
        manifest_file = self.root / self._cache_manifest
        manifest_file.touch(exist_ok=True)
        with open(manifest_file, "r+", encoding="utf-8") as f:
            # check if manifest is empty
            count, cache_bytes = self._extract_cache_metadata(f)
            if not count or not cache_bytes:
                # cache is uninitialized
                count = 0
                cache_bytes = 0
            f.seek(0, io.SEEK_END)
            for manifest_update in self._manifest_queue:
                entry_path, entry_bytes = manifest_update
                count += 1
                cache_bytes += entry_bytes
                # one newline-delimited "<sha> <bytes>" record per entry
                f.write(f"{entry_path.name} {entry_bytes}\n")
            f.seek(0, io.SEEK_SET)
            new_stats = f"{int(count)+1} {int(cache_bytes)}\n"
            f.write(new_stats)

    def _register_cache_update(self, cache_path: pathlib.Path, bytes_written: int):
        """Adds manifest entry to update queue for later updates to the manifest"""
        self._manifest_queue.append((cache_path, bytes_written))

    def _safe_remove(self, cache_dir: pathlib.Path):
        """Removes cache entries with handling for the case where the entry has been
        removed already or there are multiple cache entries in a directory"""
        try:
            if cache_dir.is_dir():
                cache_dir.rmdir()
            else:
                cache_dir.unlink()
            return True
        except FileNotFoundError:
            # This is acceptable, removal is idempotent
            pass
        except OSError as e:
            if e.errno == errno.ENOTEMPTY:
                # there exists another cache entry in this directory, don't clean yet
                pass
        return False

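`rmdir` on an occupied directory fails with `errno.ENOTEMPTY`, which is why a shared two-character prefix directory survives while any sibling entry still lives in it. A small sketch of that distinction using a throwaway directory:

# Sketch of the errno distinction _safe_remove relies on.
import errno
import pathlib
import tempfile

root = pathlib.Path(tempfile.mkdtemp())
(root / "entry").write_text("cached result")
try:
    root.rmdir()  # still holds "entry"
except OSError as e:
    assert e.errno == errno.ENOTEMPTY  # occupied: skip it, don't fail
(root / "entry").unlink()
root.rmdir()  # now empty: removal succeeds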
    def store(self, problem: str, result: Result, statistics: List, test: bool = False):
        """Creates entry in concretization cache for problem if none exists,
        storing the concretization Result object and statistics in the cache
        as serialized json joined as a single file.

        Hash membership is computed based on the sha256 of the provided asp
        problem.
        """
        cache_path = self._cache_path_from_problem(problem)
        if self._fc.init_entry(cache_path):
            # if an entry for this conc hash exists already, we don't want
            # to overwrite, just exit
            tty.debug(f"Cache entry {cache_path} exists, will not be overwritten")
            return
        with self._fc.write_transaction(cache_path) as (old, new):
            if old:
                # Entry for this conc hash exists already, do not overwrite
                tty.debug(f"Cache entry {cache_path} exists, will not be overwritten")
                return
            cache_dict = {"results": result.to_dict(test=test), "statistics": statistics}
            bytes_written = new.write(json.dumps(cache_dict))
            self._register_cache_update(cache_path, bytes_written)

    def fetch(self, problem: str) -> Union[Tuple[Result, List], Tuple[None, None]]:
        """Returns the concretization cache result for a lookup based on the given problem.

        Checks the concretization cache for the given problem, and either returns the
        Python objects cached on disk representing the concretization results and statistics
        or returns None if no cache entry was found.
        """
        cache_path = self._cache_path_from_problem(problem)
        result, statistics = None, None
        with self._fc.read_transaction(cache_path) as f:
            if f:
                result = self._results_from_cache(f)
                statistics = self._stats_from_cache(f)
        if result and statistics:
            tty.debug(f"Concretization cache hit at {str(cache_path)}")
            return result, statistics
        tty.debug(f"Concretization cache miss at {str(cache_path)}")
        return None, None


CONC_CACHE: ConcretizationCache = llnl.util.lang.Singleton(
    lambda: ConcretizationCache()
)  # type: ignore

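Together, `store` and `fetch` implement a write-once, content-addressed lookup keyed on the problem text. A hypothetical round trip through the module-level singleton, assuming a `Result` and a JSON-serializable `stats` list are already in hand:

# Hypothetical round trip; `result` and `stats` stand in for real solver
# outputs produced by an earlier solve.
problem = "% sorted ASP facts and rules for some spec"
CONC_CACHE.store(problem, result, stats)
CONC_CACHE.flush_manifest()  # persist the queued manifest entry
cached_result, cached_stats = CONC_CACHE.fetch(problem)
assert cached_stats == stats  # hit: same problem text, same sha256 digest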
def _normalize_packages_yaml(packages_yaml):
    normalized_yaml = copy.copy(packages_yaml)

@@ -1184,15 +806,6 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
        if sys.platform == "win32":
            tty.debug("Ensuring basic dependencies {win-sdk, wgl} available")
            spack.bootstrap.core.ensure_winsdk_external_or_raise()
-        control_files = ["concretize.lp", "heuristic.lp", "display.lp"]
-        if not setup.concretize_everything:
-            control_files.append("when_possible.lp")
-        if using_libc_compatibility():
-            control_files.append("libc_compatibility.lp")
-        else:
-            control_files.append("os_compatibility.lp")
-        if setup.enable_splicing:
-            control_files.append("splices.lp")
-
        timer.start("setup")
        asp_problem = setup.setup(specs, reuse=reuse, allow_deprecated=allow_deprecated)
@@ -1202,133 +815,123 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
            return Result(specs), None, None
        timer.stop("setup")
timer.start("cache-check")
|
timer.start("load")
|
||||||
timer.start("ordering")
|
# Add the problem instance
|
||||||
# ensure deterministic output
|
self.control.add("base", [], asp_problem)
|
||||||
problem_repr = "\n".join(sorted(asp_problem.split("\n")))
|
# Load the file itself
|
||||||
timer.stop("ordering")
|
|
||||||
parent_dir = os.path.dirname(__file__)
|
parent_dir = os.path.dirname(__file__)
|
||||||
full_path = lambda x: os.path.join(parent_dir, x)
|
self.control.load(os.path.join(parent_dir, "concretize.lp"))
|
||||||
abs_control_files = [full_path(x) for x in control_files]
|
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
|
||||||
for ctrl_file in abs_control_files:
|
self.control.load(os.path.join(parent_dir, "display.lp"))
|
||||||
with open(ctrl_file, "r+", encoding="utf-8") as f:
|
if not setup.concretize_everything:
|
||||||
problem_repr += "\n" + f.read()
|
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
|
||||||
|
|
||||||
result = None
|
# Binary compatibility is based on libc on Linux, and on the os tag elsewhere
|
||||||
conc_cache_enabled = spack.config.get("config:concretization_cache:enable", True)
|
if using_libc_compatibility():
|
||||||
if conc_cache_enabled:
|
self.control.load(os.path.join(parent_dir, "libc_compatibility.lp"))
|
||||||
result, concretization_stats = CONC_CACHE.fetch(problem_repr)
|
else:
|
||||||
|
self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
|
||||||
|
if setup.enable_splicing:
|
||||||
|
self.control.load(os.path.join(parent_dir, "splices.lp"))
|
||||||
|
|
||||||
timer.stop("cache-check")
|
timer.stop("load")
|
||||||
if not result:
|
|
||||||
timer.start("load")
|
|
||||||
# Add the problem instance
|
|
||||||
self.control.add("base", [], asp_problem)
|
|
||||||
# Load the files
|
|
||||||
[self.control.load(lp) for lp in abs_control_files]
|
|
||||||
timer.stop("load")
|
|
||||||
|
|
||||||
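The `sorted` join in the removed branch is what makes the cache key order-independent: two runs that emit the same facts in a different order still hash to the same digest. A small illustration with hypothetical fact lines:

# Illustration of why the problem text is sorted before hashing; the
# fact lines are hypothetical.
import hashlib

run_a = "node(a).\nnode(b).\ndepends_on(a, b)."
run_b = "depends_on(a, b).\nnode(a).\nnode(b)."  # same facts, new order

def canonical(text):
    return "\n".join(sorted(text.split("\n")))

def digest(text):
    return hashlib.sha256(text.encode()).hexdigest()

assert digest(run_a) != digest(run_b)  # raw order leaks into the key
assert digest(canonical(run_a)) == digest(canonical(run_b))  # sorted form is stable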
        # Grounding is the first step in the solve -- it turns our facts
        # and first-order logic rules into propositional logic.
        timer.start("ground")
        self.control.ground([("base", [])])
        timer.stop("ground")

        # With a grounded program, we can run the solve.
        models = []  # stable models if things go well
        cores = []  # unsatisfiable cores if they do not

        def on_model(model):
            models.append((model.cost, model.symbols(shown=True, terms=True)))

        solve_kwargs = {
            "assumptions": setup.assumptions,
            "on_model": on_model,
            "on_core": cores.append,
        }

        if clingo_cffi():
            solve_kwargs["on_unsat"] = cores.append

        timer.start("solve")
        time_limit = spack.config.CONFIG.get("concretizer:timeout", -1)
        error_on_timeout = spack.config.CONFIG.get("concretizer:error_on_timeout", True)
        # Spack uses 0 to set no time limit, clingo API uses -1
        if time_limit == 0:
            time_limit = -1
        with self.control.solve(**solve_kwargs, async_=True) as handle:
            finished = handle.wait(time_limit)
            if not finished:
                specs_str = ", ".join(llnl.util.lang.elide_list([str(s) for s in specs], 4))
-                header = (
-                    f"Spack is taking more than {time_limit} seconds to solve for {specs_str}"
-                )
+                header = f"Spack is taking more than {time_limit} seconds to solve for {specs_str}"
                if error_on_timeout:
                    raise UnsatisfiableSpecError(f"{header}, stopping concretization")
                warnings.warn(f"{header}, using the best configuration found so far")
                handle.cancel()

            solve_result = handle.get()
        timer.stop("solve")

        # once done, construct the solve result
        result = Result(specs)
        result.satisfiable = solve_result.satisfiable

        if result.satisfiable:
            timer.start("construct_specs")
            # get the best model
            builder = SpecBuilder(specs, hash_lookup=setup.reusable_and_possible)
            min_cost, best_model = min(models)

            # first check for errors
            error_handler = ErrorHandler(best_model, specs)
            error_handler.raise_if_errors()

            # build specs from spec attributes in the model
-            spec_attrs = [
-                (name, tuple(rest)) for name, *rest in extract_args(best_model, "attr")
-            ]
+            spec_attrs = [(name, tuple(rest)) for name, *rest in extract_args(best_model, "attr")]
            answers = builder.build_specs(spec_attrs)

            # add best spec to the results
            result.answers.append((list(min_cost), 0, answers))

            # get optimization criteria
            criteria_args = extract_args(best_model, "opt_criterion")
            result.criteria = build_criteria_names(min_cost, criteria_args)

            # record the number of models the solver considered
            result.nmodels = len(models)

            # record the possible dependencies in the solve
            result.possible_dependencies = setup.pkgs
            timer.stop("construct_specs")
            timer.stop()
        elif cores:
            result.control = self.control
            result.cores.extend(cores)

-        result.raise_if_unsat()
-
-        if result.satisfiable and result.unsolved_specs and setup.concretize_everything:
-            unsolved_str = Result.format_unsolved(result.unsolved_specs)
-            raise InternalConcretizerError(
-                "Internal Spack error: the solver completed but produced specs"
-                " that do not satisfy the request. Please report a bug at "
-                f"https://github.com/spack/spack/issues\n\t{unsolved_str}"
-            )
-        if conc_cache_enabled:
-            CONC_CACHE.store(problem_repr, result, self.control.statistics, test=setup.tests)
-        concretization_stats = self.control.statistics
        if output.timers:
            timer.write_tty()
            print()

        if output.stats:
            print("Statistics:")
-            pprint.pprint(concretization_stats)
+            pprint.pprint(self.control.statistics)

-        return result, timer, concretization_stats
+        result.raise_if_unsat()
+
+        if result.satisfiable and result.unsolved_specs and setup.concretize_everything:
+            unsolved_str = Result.format_unsolved(result.unsolved_specs)
+            raise InternalConcretizerError(
+                "Internal Spack error: the solver completed but produced specs"
+                " that do not satisfy the request. Please report a bug at "
+                f"https://github.com/spack/spack/issues\n\t{unsolved_str}"
+            )
+
+        return result, timer, self.control.statistics


class ConcreteSpecsByHash(collections.abc.Mapping):
@@ -1770,7 +1373,7 @@ def effect_rules(self):
            return

        self.gen.h2("Imposed requirements")
-        for name in sorted(self._effect_cache):
+        for name in self._effect_cache:
            cache = self._effect_cache[name]
            for (spec_str, _), (effect_id, requirements) in cache.items():
                self.gen.fact(fn.pkg_fact(name, fn.effect_id(effect_id)))
@@ -1823,8 +1426,8 @@ def define_variant(

        elif isinstance(values, vt.DisjointSetsOfValues):
            union = set()
-            for sid, s in enumerate(sorted(values.sets)):
-                for value in sorted(s):
+            for sid, s in enumerate(values.sets):
+                for value in s:
                    pkg_fact(fn.variant_value_from_disjoint_sets(vid, value, sid))
                union.update(s)
            values = union
@@ -2005,7 +1608,7 @@ def package_provider_rules(self, pkg):
        self.gen.fact(fn.pkg_fact(pkg.name, fn.possible_provider(vpkg_name)))

        for when, provided in pkg.provided.items():
-            for vpkg in sorted(provided):
+            for vpkg in provided:
                if vpkg.name not in self.possible_virtuals:
                    continue

@@ -2020,8 +1623,8 @@ def package_provider_rules(self, pkg):
            condition_id = self.condition(
                when, required_name=pkg.name, msg="Virtuals are provided together"
            )
-            for set_id, virtuals_together in enumerate(sorted(sets_of_virtuals)):
-                for name in sorted(virtuals_together):
+            for set_id, virtuals_together in enumerate(sets_of_virtuals):
+                for name in virtuals_together:
                    self.gen.fact(
                        fn.pkg_fact(pkg.name, fn.provided_together(condition_id, set_id, name))
                    )
@@ -2131,7 +1734,7 @@ def package_splice_rules(self, pkg):
        for map in pkg.variants.values():
            for k in map:
                filt_match_variants.add(k)
-        filt_match_variants = sorted(filt_match_variants)
+        filt_match_variants = list(filt_match_variants)
        variant_constraints = self._gen_match_variant_splice_constraints(
            pkg, cond, spec_to_splice, hash_var, splice_node, filt_match_variants
        )
@@ -2661,7 +2264,7 @@ def define_package_versions_and_validate_preferences(
    ):
        """Declare any versions in specs not declared in packages."""
        packages_yaml = spack.config.get("packages")
-        for pkg_name in sorted(possible_pkgs):
+        for pkg_name in possible_pkgs:
            pkg_cls = self.pkg_class(pkg_name)

            # All the versions from the corresponding package.py file. Since concepts
@@ -2989,7 +2592,7 @@ def define_variant_values(self):
        """
        # Tell the concretizer about possible values from specs seen in spec_clauses().
        # We might want to order these facts by pkg and name if we are debugging.
-        for pkg_name, variant_def_id, value in sorted(self.variant_values_from_specs):
+        for pkg_name, variant_def_id, value in self.variant_values_from_specs:
            try:
                vid = self.variant_ids_by_def_id[variant_def_id]
            except KeyError:
@@ -3027,8 +2630,6 @@ def concrete_specs(self):
            # Declare as possible parts of specs that are not in package.py
            # - Add versions to possible versions
            # - Add OS to possible OS's

-            # is traverse deterministic?
            for dep in spec.traverse():
                self.possible_versions[dep.name].add(dep.version)
                if isinstance(dep.version, vn.GitVersion):
@@ -3266,7 +2867,7 @@ def define_runtime_constraints(self):
        recorder.consume_facts()

    def literal_specs(self, specs):
-        for spec in sorted(specs):
+        for spec in specs:
            self.gen.h2("Spec: %s" % str(spec))
            condition_id = next(self._id_counter)
            trigger_id = next(self._id_counter)
@@ -3767,7 +3368,7 @@ def consume_facts(self):
            # on the available compilers)
            self._setup.pkg_version_rules(runtime_pkg)

-        for imposed_spec, when_spec in sorted(self.runtime_conditions):
+        for imposed_spec, when_spec in self.runtime_conditions:
            msg = f"{when_spec} requires {imposed_spec} at runtime"
            _ = self._setup.condition(when_spec, imposed_spec=imposed_spec, msg=msg)

@@ -4624,9 +4225,6 @@ def solve_with_stats(
        reusable_specs.extend(self.selector.reusable_specs(specs))
        setup = SpackSolverSetup(tests=tests)
        output = OutputConfiguration(timers=timers, stats=stats, out=out, setup_only=setup_only)

-        CONC_CACHE.flush_manifest()
-        CONC_CACHE.cleanup()
        return self.driver.solve(
            setup, specs, reuse=reusable_specs, output=output, allow_deprecated=allow_deprecated
        )
@@ -4696,9 +4294,6 @@ def solve_in_rounds(
            for spec in result.specs:
                reusable_specs.extend(spec.traverse())

-            CONC_CACHE.flush_manifest()
-            CONC_CACHE.cleanup()


class UnsatisfiableSpecError(spack.error.UnsatisfiableSpecError):
    """There was an issue with the spec that was requested (i.e. a user error)."""
@@ -10,7 +10,7 @@
import spack.error
import spack.package_base
import spack.spec
-from spack.util.spack_yaml import get_mark_from_yaml_data
+from spack.config import get_mark_from_yaml_data


class RequirementKind(enum.Enum):
@@ -165,8 +165,7 @@ def _rules_from_requirements(

        # validate specs from YAML first, and fail with line numbers if parsing fails.
        constraints = [
-            parse_spec_from_yaml_string(constraint, named=kind == RequirementKind.VIRTUAL)
-            for constraint in constraints
+            parse_spec_from_yaml_string(constraint) for constraint in constraints
        ]
        when_str = requirement.get("when")
        when = parse_spec_from_yaml_string(when_str) if when_str else spack.spec.Spec()
@@ -204,7 +203,7 @@ def reject_requirement_constraint(
            s.validate_or_raise()
        except spack.error.SpackError as e:
            tty.debug(
-                f"[{__name__}] Rejecting the default '{constraint}' requirement "
+                f"[SETUP] Rejecting the default '{constraint}' requirement "
                f"on '{pkg_name}': {str(e)}",
                level=2,
            )
@@ -212,37 +211,21 @@ def reject_requirement_constraint(
    return False


-def parse_spec_from_yaml_string(string: str, *, named: bool = False) -> spack.spec.Spec:
+def parse_spec_from_yaml_string(string: str) -> spack.spec.Spec:
    """Parse a spec from YAML and add file/line info to errors, if it's available.

    Parse a ``Spec`` from the supplied string, but also intercept any syntax errors and
    add file/line information for debugging using file/line annotations from the string.

-    Args:
+    Arguments:
        string: a string representing a ``Spec`` from config YAML.
-        named: if True, the spec must have a name
    """
    try:
-        result = spack.spec.Spec(string)
+        return spack.spec.Spec(string)
    except spack.error.SpecSyntaxError as e:
        mark = get_mark_from_yaml_data(string)
        if mark:
            msg = f"{mark.name}:{mark.line + 1}: {str(e)}"
            raise spack.error.SpecSyntaxError(msg) from e
        raise e

-    if named is True and not result.name:
-        msg = f"expected a named spec, but got '{string}' instead"
-        mark = get_mark_from_yaml_data(string)
-
-        # Add a hint in case it's dependencies
-        deps = result.dependencies()
-        if len(deps) == 1:
-            msg = f"{msg}. Did you mean '{deps[0]}'?"
-
-        if mark:
-            msg = f"{mark.name}:{mark.line + 1}: {msg}"
-
-        raise spack.error.SpackError(msg)
-
-    return result
@@ -175,17 +175,15 @@
#: Spec(Spec("string").format()) == Spec("string)"
DEFAULT_FORMAT = (
    "{name}{@versions}"
-    "{compiler_flags}"
+    "{%compiler.name}{@compiler.versions}{compiler_flags}"
    "{variants}{ namespace=namespace_if_anonymous}{ arch=architecture}{/abstract_hash}"
-    " {%compiler.name}{@compiler.versions}"
)

#: Display format, which eliminates extra `@=` in the output, for readability.
DISPLAY_FORMAT = (
    "{name}{@version}"
-    "{compiler_flags}"
+    "{%compiler.name}{@compiler.version}{compiler_flags}"
    "{variants}{ namespace=namespace_if_anonymous}{ arch=architecture}{/abstract_hash}"
-    " {%compiler.name}{@compiler.version}"
)

#: Regular expression to pull spec contents out of clearsigned signature
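Both orderings render the same fields; the change only moves where the compiler appears in the output. A sketch of the difference, assuming a spec parsed from an illustrative string:

# Sketch using the format tokens from the constants above; the spec
# string is illustrative.
import spack.spec

s = spack.spec.Spec("zlib@1.3 %gcc@13.1.0 +shared")
print(s.format("{name}{@version}{%compiler.name}{@compiler.version}{variants}"))
# compiler after the version: zlib@1.3%gcc@13.1.0+shared
print(s.format("{name}{@version}{variants} {%compiler.name}{@compiler.version}"))
# compiler trailing at the end: zlib@1.3+shared %gcc@13.1.0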
@@ -2108,18 +2106,16 @@ def long_spec(self):
    def short_spec(self):
        """Returns a version of the spec with the dependencies hashed
        instead of completely enumerated."""
-        return self.format(
-            "{name}{@version}{variants}{ arch=architecture}"
-            "{/hash:7}{%compiler.name}{@compiler.version}"
-        )
+        spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
+        spec_format += "{variants}{ arch=architecture}{/hash:7}"
+        return self.format(spec_format)

    @property
    def cshort_spec(self):
        """Returns an auto-colorized version of ``self.short_spec``."""
-        return self.cformat(
-            "{name}{@version}{variants}{ arch=architecture}"
-            "{/hash:7}{%compiler.name}{@compiler.version}"
-        )
+        spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
+        spec_format += "{variants}{ arch=architecture}{/hash:7}"
+        return self.cformat(spec_format)

    @property
    def prefix(self) -> spack.util.prefix.Prefix:
@@ -5134,13 +5130,6 @@ def get_host_environment() -> Dict[str, Any]:
    }


-def eval_conditional(string):
-    """Evaluate conditional definitions using restricted variable scope."""
-    valid_variables = get_host_environment()
-    valid_variables.update({"re": re, "env": os.environ})
-    return eval(string, valid_variables)

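The removed helper evaluates `when:`-style conditionals with `eval` against a restricted namespace instead of arbitrary globals. A standalone sketch of the same pattern, with a hypothetical variable set:

# Standalone sketch of restricted-scope eval, mirroring eval_conditional;
# the "platform" variable here is hypothetical.
import os
import re

valid_variables = {"platform": "linux", "re": re, "env": os.environ}
print(eval("platform == 'linux'", valid_variables))  # True
print(eval("re.match('lin.*', platform) is not None", valid_variables))  # True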
class SpecParseError(spack.error.SpecError):
    """Wrapper for ParseError for when we're parsing specs."""

@@ -5299,10 +5288,8 @@ def __init__(self, spec):

class AmbiguousHashError(spack.error.SpecError):
    def __init__(self, msg, *specs):
-        spec_fmt = (
-            "{namespace}.{name}{@version}{compiler_flags}{variants}"
-            "{ arch=architecture}{/hash:7}{%compiler}"
-        )
+        spec_fmt = "{namespace}.{name}{@version}{%compiler}{compiler_flags}"
+        spec_fmt += "{variants}{ arch=architecture}{/hash:7}"
        specs_str = "\n    " + "\n    ".join(spec.format(spec_fmt) for spec in specs)
        super().__init__(msg + specs_str)


@@ -60,17 +60,13 @@
import pathlib
import re
import sys
-import traceback
-import warnings
-from typing import Iterator, List, Optional, Tuple
+from typing import Iterator, List, Optional

from llnl.util.tty import color

import spack.deptypes
import spack.error
-import spack.paths
import spack.spec
-import spack.util.spack_yaml
import spack.version
from spack.tokenize import Token, TokenBase, Tokenizer

@@ -208,32 +204,6 @@ def __init__(self, tokens: List[Token], text: str):
        super().__init__(message)


-def _warn_about_variant_after_compiler(literal_str: str, issues: List[str]):
-    """Issue a warning if variant or other token is preceded by a compiler token. The warning is
-    only issued if it's actionable: either we know the config file it originates from, or we have
-    call site that's not internal to Spack."""
-    ignore = [spack.paths.lib_path, spack.paths.bin_path]
-    mark = spack.util.spack_yaml.get_mark_from_yaml_data(literal_str)
-    issue_str = ", ".join(issues)
-    error = f"{issue_str} in `{literal_str}`"
-
-    # warning from config file
-    if mark:
-        warnings.warn(f"{mark.name}:{mark.line + 1}: {error}")
-        return
-
-    # warning from hopefully package.py
-    for frame in reversed(traceback.extract_stack()):
-        if frame.lineno and not any(frame.filename.startswith(path) for path in ignore):
-            warnings.warn_explicit(
-                error,
-                category=spack.error.SpackAPIWarning,
-                filename=frame.filename,
-                lineno=frame.lineno,
-            )
-            return

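The interesting piece of the deleted helper is `warnings.warn_explicit`, which attributes a warning to an arbitrary file and line, here the first stack frame outside Spack's own tree, rather than to the warning site itself. A minimal sketch with hypothetical ignore paths:

# Minimal sketch of attributing a warning to the first stack frame outside
# an ignored prefix, as the removed helper does; paths are hypothetical.
import traceback
import warnings

ignore = ["/opt/spack/lib", "/opt/spack/bin"]

for frame in reversed(traceback.extract_stack()):
    if frame.lineno and not any(frame.filename.startswith(p) for p in ignore):
        warnings.warn_explicit(
            "variant goes before the compiler",  # illustrative message
            category=UserWarning,
            filename=frame.filename,
            lineno=frame.lineno,
        )
        break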
class SpecParser:
    """Parse text into specs"""

@@ -272,31 +242,26 @@ def add_dependency(dep, **edge_properties):
                raise SpecParsingError(str(e), self.ctx.current_token, self.literal_str) from e

        initial_spec = initial_spec or spack.spec.Spec()
-        root_spec, parser_warnings = SpecNodeParser(self.ctx, self.literal_str).parse(initial_spec)
+        root_spec = SpecNodeParser(self.ctx, self.literal_str).parse(initial_spec)
        while True:
            if self.ctx.accept(SpecTokens.START_EDGE_PROPERTIES):
                edge_properties = EdgeAttributeParser(self.ctx, self.literal_str).parse()
                edge_properties.setdefault("depflag", 0)
                edge_properties.setdefault("virtuals", ())
-                dependency, warnings = self._parse_node(root_spec)
-                parser_warnings.extend(warnings)
+                dependency = self._parse_node(root_spec)
                add_dependency(dependency, **edge_properties)

            elif self.ctx.accept(SpecTokens.DEPENDENCY):
-                dependency, warnings = self._parse_node(root_spec)
-                parser_warnings.extend(warnings)
+                dependency = self._parse_node(root_spec)
                add_dependency(dependency, depflag=0, virtuals=())

            else:
                break

-        if parser_warnings:
-            _warn_about_variant_after_compiler(self.literal_str, parser_warnings)
-
        return root_spec

    def _parse_node(self, root_spec):
-        dependency, parser_warnings = SpecNodeParser(self.ctx, self.literal_str).parse()
+        dependency = SpecNodeParser(self.ctx, self.literal_str).parse()
        if dependency is None:
            msg = (
                "the dependency sigil and any optional edge attributes must be followed by a "
@@ -305,7 +270,7 @@ def _parse_node(self, root_spec):
            raise SpecParsingError(msg, self.ctx.current_token, self.literal_str)
        if root_spec.concrete:
            raise spack.spec.RedundantSpecError(root_spec, "^" + str(dependency))
-        return dependency, parser_warnings
+        return dependency

    def all_specs(self) -> List["spack.spec.Spec"]:
        """Return all the specs that remain to be parsed"""
@@ -325,7 +290,7 @@ def __init__(self, ctx, literal_str):

    def parse(
        self, initial_spec: Optional["spack.spec.Spec"] = None
-    ) -> Tuple["spack.spec.Spec", List[str]]:
+    ) -> Optional["spack.spec.Spec"]:
        """Parse a single spec node from a stream of tokens

        Args:
@@ -334,15 +299,12 @@ def parse(
        Return
            The object passed as argument
        """
-        parser_warnings: List[str] = []
-        last_compiler = None
+        if not self.ctx.next_token or self.ctx.expect(SpecTokens.DEPENDENCY):
+            return initial_spec

        if initial_spec is None:
            initial_spec = spack.spec.Spec()

-        if not self.ctx.next_token or self.ctx.expect(SpecTokens.DEPENDENCY):
-            return initial_spec, parser_warnings
-
        # If we start with a package name we have a named spec, we cannot
        # accept another package name afterwards in a node
        if self.ctx.accept(SpecTokens.UNQUALIFIED_PACKAGE_NAME):
@@ -356,7 +318,7 @@ def parse(
            initial_spec.namespace = namespace

        elif self.ctx.accept(SpecTokens.FILENAME):
-            return FileParser(self.ctx).parse(initial_spec), parser_warnings
+            return FileParser(self.ctx).parse(initial_spec)

        def raise_parsing_error(string: str, cause: Optional[Exception] = None):
            """Raise a spec parsing error with token context."""
@@ -369,12 +331,6 @@ def add_flag(name: str, value: str, propagate: bool):
            except Exception as e:
                raise_parsing_error(str(e), e)

-        def warn_if_after_compiler(token: str):
-            """Register a warning for %compiler followed by +variant that will in the future apply
-            to the compiler instead of the current root."""
-            if last_compiler:
-                parser_warnings.append(f"`{token}` should go before `{last_compiler}`")
-
        while True:
            if self.ctx.accept(SpecTokens.COMPILER):
                if self.has_compiler:
@@ -383,7 +339,6 @@ def warn_if_after_compiler(token: str):
                compiler_name = self.ctx.current_token.value[1:]
                initial_spec.compiler = spack.spec.CompilerSpec(compiler_name.strip(), ":")
                self.has_compiler = True
-                last_compiler = self.ctx.current_token.value

            elif self.ctx.accept(SpecTokens.COMPILER_AND_VERSION):
                if self.has_compiler:
@@ -394,7 +349,6 @@ def warn_if_after_compiler(token: str):
                    compiler_name.strip(), compiler_version
                )
                self.has_compiler = True
-                last_compiler = self.ctx.current_token.value

            elif (
                self.ctx.accept(SpecTokens.VERSION_HASH_PAIR)
@@ -409,17 +363,14 @@ def warn_if_after_compiler(token: str):
                )
                initial_spec.attach_git_version_lookup()
                self.has_version = True
-                warn_if_after_compiler(self.ctx.current_token.value)

            elif self.ctx.accept(SpecTokens.BOOL_VARIANT):
                variant_value = self.ctx.current_token.value[0] == "+"
                add_flag(self.ctx.current_token.value[1:].strip(), variant_value, propagate=False)
-                warn_if_after_compiler(self.ctx.current_token.value)

            elif self.ctx.accept(SpecTokens.PROPAGATED_BOOL_VARIANT):
                variant_value = self.ctx.current_token.value[0:2] == "++"
                add_flag(self.ctx.current_token.value[2:].strip(), variant_value, propagate=True)
-                warn_if_after_compiler(self.ctx.current_token.value)

            elif self.ctx.accept(SpecTokens.KEY_VALUE_PAIR):
                match = SPLIT_KVP.match(self.ctx.current_token.value)
@@ -427,7 +378,6 @@ def warn_if_after_compiler(token: str):

                name, _, value = match.groups()
                add_flag(name, strip_quotes_and_unescape(value), propagate=False)
-                warn_if_after_compiler(self.ctx.current_token.value)

            elif self.ctx.accept(SpecTokens.PROPAGATED_KEY_VALUE_PAIR):
                match = SPLIT_KVP.match(self.ctx.current_token.value)
@@ -435,19 +385,17 @@ def warn_if_after_compiler(token: str):

                name, _, value = match.groups()
                add_flag(name, strip_quotes_and_unescape(value), propagate=True)
-                warn_if_after_compiler(self.ctx.current_token.value)

            elif self.ctx.expect(SpecTokens.DAG_HASH):
                if initial_spec.abstract_hash:
                    break
                self.ctx.accept(SpecTokens.DAG_HASH)
                initial_spec.abstract_hash = self.ctx.current_token.value[1:]
-                warn_if_after_compiler(self.ctx.current_token.value)

            else:
                break

-        return initial_spec, parser_warnings
+        return initial_spec


class FileParser:
@@ -537,18 +485,23 @@ def parse_one_or_raise(
        text (str): text to be parsed
        initial_spec: buffer where to parse the spec. If None a new one will be created.
    """
-    parser = SpecParser(text)
+    stripped_text = text.strip()
+    parser = SpecParser(stripped_text)
    result = parser.next_spec(initial_spec)
-    next_token = parser.ctx.next_token
+    last_token = parser.ctx.current_token

-    if next_token:
-        message = f"expected a single spec, but got more:\n{text}"
-        underline = f"\n{' ' * next_token.start}{'^' * len(next_token.value)}"
-        message += color.colorize(f"@*r{{{underline}}}")
+    if last_token is not None and last_token.end != len(stripped_text):
+        message = "a single spec was requested, but parsed more than one:"
+        message += f"\n{text}"
+        if last_token is not None:
+            underline = f"\n{' ' * last_token.end}{'^' * (len(text) - last_token.end)}"
+            message += color.colorize(f"@*r{{{underline}}}")
        raise ValueError(message)

    if result is None:
-        raise ValueError("expected a single spec, but got none")
+        message = "a single spec was requested, but none was parsed:"
+        message += f"\n{text}"
+        raise ValueError(message)

    return result

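Both versions of `parse_one_or_raise` enforce the same contract: exactly one spec per input string, with a colorized caret underline pointing at the leftover text. A hypothetical call showing the failure mode:

# Hypothetical use of the single-spec contract enforced above.
one = parse_one_or_raise("zlib@1.3")  # fine: a single spec
try:
    parse_one_or_raise("zlib openssl")  # two root specs in one string
except ValueError as e:
    print(e)  # the error message underlines the unparsed "openssl"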
|
|||||||
@@ -129,7 +129,7 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
|
|||||||
)
|
)
|
||||||
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
|
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
|
||||||
spec = Spec(
|
spec = Spec(
|
||||||
f"pkg-a foobar=bar target={root_target_range} %gcc@10 ^pkg-b target={dep_target_range}"
|
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
|
||||||
)
|
)
|
||||||
with spack.concretize.disable_compiler_existence_check():
|
with spack.concretize.disable_compiler_existence_check():
|
||||||
spec = spack.concretize.concretize_one(spec)
|
spec = spack.concretize.concretize_one(spec)
|
||||||
|
|||||||
@@ -200,11 +200,7 @@ def dummy_prefix(tmpdir):
|
|||||||
@pytest.mark.requires_executables(*required_executables)
|
@pytest.mark.requires_executables(*required_executables)
|
||||||
@pytest.mark.maybeslow
|
@pytest.mark.maybeslow
|
||||||
@pytest.mark.usefixtures(
|
@pytest.mark.usefixtures(
|
||||||
"default_config",
|
"default_config", "cache_directory", "install_dir_default_layout", "temporary_mirror"
|
||||||
"cache_directory",
|
|
||||||
"install_dir_default_layout",
|
|
||||||
"temporary_mirror",
|
|
||||||
"mutable_mock_env_path",
|
|
||||||
)
|
)
|
||||||
def test_default_rpaths_create_install_default_layout(temporary_mirror_dir):
|
def test_default_rpaths_create_install_default_layout(temporary_mirror_dir):
|
||||||
"""
|
"""
|
||||||
@@ -276,11 +272,7 @@ def test_default_rpaths_install_nondefault_layout(temporary_mirror_dir):
|
|||||||
@pytest.mark.maybeslow
|
@pytest.mark.maybeslow
|
||||||
@pytest.mark.nomockstage
|
@pytest.mark.nomockstage
|
||||||
@pytest.mark.usefixtures(
|
@pytest.mark.usefixtures(
|
||||||
"default_config",
|
"default_config", "cache_directory", "install_dir_default_layout", "temporary_mirror"
|
||||||
"cache_directory",
|
|
||||||
"install_dir_default_layout",
|
|
||||||
"temporary_mirror",
|
|
||||||
"mutable_mock_env_path",
|
|
||||||
)
|
)
|
||||||
def test_relative_rpaths_install_default_layout(temporary_mirror_dir):
|
def test_relative_rpaths_install_default_layout(temporary_mirror_dir):
|
||||||
"""
|
"""
|
||||||
@@ -577,6 +569,7 @@ def test_FetchCacheError_only_accepts_lists_of_errors():
|
|||||||
def test_FetchCacheError_pretty_printing_multiple():
|
def test_FetchCacheError_pretty_printing_multiple():
|
||||||
e = bindist.FetchCacheError([RuntimeError("Oops!"), TypeError("Trouble!")])
|
e = bindist.FetchCacheError([RuntimeError("Oops!"), TypeError("Trouble!")])
|
||||||
str_e = str(e)
|
str_e = str(e)
|
||||||
|
print("'" + str_e + "'")
|
||||||
assert "Multiple errors" in str_e
|
assert "Multiple errors" in str_e
|
||||||
assert "Error 1: RuntimeError: Oops!" in str_e
|
assert "Error 1: RuntimeError: Oops!" in str_e
|
||||||
assert "Error 2: TypeError: Trouble!" in str_e
|
assert "Error 2: TypeError: Trouble!" in str_e
|
||||||
@@ -1140,7 +1133,7 @@ def test_get_valid_spec_file_no_json(tmp_path, filename):
|
|||||||
|
|
||||||
def test_download_tarball_with_unsupported_layout_fails(tmp_path, mutable_config, capsys):
|
def test_download_tarball_with_unsupported_layout_fails(tmp_path, mutable_config, capsys):
|
||||||
layout_version = bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION + 1
|
layout_version = bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION + 1
|
||||||
spec = Spec("gmake@4.4.1 arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
|
spec = Spec("gmake@4.4.1%gcc@13.1.0 arch=linux-ubuntu23.04-zen2")
|
||||||
spec._mark_concrete()
|
spec._mark_concrete()
|
||||||
spec_dict = spec.to_dict()
|
spec_dict = spec.to_dict()
|
||||||
spec_dict["buildcache_layout_version"] = layout_version
|
spec_dict["buildcache_layout_version"] = layout_version
|
||||||
|
|||||||
@@ -210,6 +210,7 @@ def check_args_contents(cc, args, must_contain, must_not_contain):
|
|||||||
"""
|
"""
|
||||||
with set_env(SPACK_TEST_COMMAND="dump-args"):
|
     with set_env(SPACK_TEST_COMMAND="dump-args"):
         cc_modified_args = cc(*args, output=str).strip().split("\n")
+        print(cc_modified_args)
     for a in must_contain:
         assert a in cc_modified_args
     for a in must_not_contain:

@@ -347,6 +347,7 @@ def test_get_spec_filter_list(mutable_mock_env_path, mutable_mock_repo):
     for key, val in expectations.items():
         affected_specs = ci.get_spec_filter_list(e1, touched, dependent_traverse_depth=key)
         affected_pkg_names = set([s.name for s in affected_specs])
+        print(f"{key}: {affected_pkg_names}")
         assert affected_pkg_names == val


@@ -214,7 +214,9 @@ def verify_mirror_contents():
         if in_env_pkg in p:
             found_pkg = True

-    assert found_pkg, f"Expected to find {in_env_pkg} in {dest_mirror_dir}"
+    if not found_pkg:
+        print("Expected to find {0} in {1}".format(in_env_pkg, dest_mirror_dir))
+        assert False

     # Install a package and put it in the buildcache
     s = spack.concretize.concretize_one(out_env_pkg)

@@ -867,7 +867,7 @@ def test_push_to_build_cache(
     logs_dir = scratch / "logs_dir"
     logs_dir.mkdir()
     ci.copy_stage_logs_to_artifacts(concrete_spec, str(logs_dir))
-    assert "spack-build-out.txt.gz" in os.listdir(logs_dir)
+    assert "spack-build-out.txt" in os.listdir(logs_dir)

     dl_dir = scratch / "download_dir"
     buildcache_cmd("download", "--spec-file", json_path, "--path", str(dl_dir))

@@ -5,7 +5,6 @@
 import filecmp
 import os
 import shutil
-import textwrap

 import pytest

@@ -260,25 +259,15 @@ def test_update_completion_arg(shell, tmpdir, monkeypatch):
 def test_updated_completion_scripts(shell, tmpdir):
     """Make sure our shell tab completion scripts remain up-to-date."""

-    width = 72
-    lines = textwrap.wrap(
+    msg = (
         "It looks like Spack's command-line interface has been modified. "
-        "If differences are more than your global 'include:' scopes, please "
-        "update Spack's shell tab completion scripts by running:",
-        width,
+        "Please update Spack's shell tab completion scripts by running:\n\n"
+        "    spack commands --update-completion\n\n"
+        "and adding the changed files to your pull request."
     )
-    lines.append("\n    spack commands --update-completion\n")
-    lines.extend(
-        textwrap.wrap(
-            "and adding the changed files (minus your global 'include:' scopes) "
-            "to your pull request.",
-            width,
-        )
-    )
-    msg = "\n".join(lines)

     header = os.path.join(spack.paths.share_path, shell, f"spack-completion.{shell}")
-    script = f"spack-completion.{shell}"
+    script = "spack-completion.{0}".format(shell)
     old_script = os.path.join(spack.paths.share_path, script)
     new_script = str(tmpdir.join(script))
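Aside: the hunk above trades a textwrap-based message for one built from adjacent string literals. A minimal standalone sketch of the two idioms involved, standard library only and illustrative rather than Spack's actual code:

import textwrap

# textwrap.wrap returns a list of lines wrapped to the given width, so
# multi-paragraph messages need manual list handling and a final join.
lines = textwrap.wrap("It looks like the command-line interface has been modified.", 72)
msg_wrapped = "\n".join(lines)

# Adjacent string literals are fused at compile time into a single string,
# which is what the replacement relies on instead of textwrap.
msg_literal = (
    "It looks like the command-line interface has been modified. "
    "Please update the tab completion scripts by running:\n\n"
    "    spack commands --update-completion\n"
)
print(msg_wrapped, msg_literal, sep="\n---\n")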
@@ -213,7 +213,7 @@ def test_config_add_update_dict(mutable_empty_config):


 def test_config_with_c_argument(mutable_empty_config):
     # I don't know how to add a spack argument to a Spack Command, so we test this way
-    config_file = "config:install_tree:root:/path/to/config.yaml"
+    config_file = "config:install_root:root:/path/to/config.yaml"
     parser = spack.main.make_argument_parser()
     args = parser.parse_args(["-c", config_file])
     assert config_file in args.config_vars

@@ -221,7 +221,7 @@ def test_config_with_c_argument(mutable_empty_config):
     # Add the path to the config
     config("add", args.config_vars[0], scope="command_line")
     output = config("get", "config")
-    assert "config:\n  install_tree:\n    root: /path/to/config.yaml" in output
+    assert "config:\n  install_root:\n    root: /path/to/config.yaml" in output


 def test_config_add_ordered_dict(mutable_empty_config):

@@ -15,9 +15,6 @@
 deprecate = SpackCommand("deprecate")
 find = SpackCommand("find")

-# Unit tests should not be affected by the user's managed environments
-pytestmark = pytest.mark.usefixtures("mutable_mock_env_path")
-

 def test_deprecate(mock_packages, mock_archive, mock_fetch, install_mockery):
     install("--fake", "libelf@0.8.13")

@@ -1067,17 +1067,13 @@ def test_init_from_yaml_relative_includes(tmp_path):
         assert os.path.exists(os.path.join(e2.path, f))


-# TODO: Should we be supporting relative path rewrites when creating new env from existing?
-# TODO: If so, then this should confirm that the absolute include paths in the new env exist.
 def test_init_from_yaml_relative_includes_outside_env(tmp_path):
-    """Ensure relative includes to files outside the environment fail."""
-    files = ["../outside_env/repos.yaml"]
+    files = ["../outside_env_not_copied/repos.yaml"]

     manifest = f"""
 spack:
   specs: []
-  include:
-  - path: {files[0]}
+  include: {files}
 """

     # subdir to ensure parent of environment dir is not shared

@@ -1090,7 +1086,7 @@ def test_init_from_yaml_relative_includes_outside_env(tmp_path):
     for f in files:
         fs.touchp(e1_path / f)

-    with pytest.raises(ValueError, match="does not exist"):
+    with pytest.raises(spack.config.ConfigFileError, match="Detected 1 missing include"):
         _ = _env_create("test2", init_file=e1_manifest)


@@ -1190,14 +1186,14 @@ def test_env_with_config(environment_from_manifest):


 def test_with_config_bad_include_create(environment_from_manifest):
-    """Confirm missing required include raises expected exception."""
-    err = "does not exist"
-    with pytest.raises(ValueError, match=err):
+    """Confirm missing include paths raise expected exception and error."""
+    with pytest.raises(spack.config.ConfigFileError, match="2 missing include path"):
         environment_from_manifest(
             """
 spack:
   include:
   - /no/such/directory
+  - no/such/file.yaml
 """
         )

@@ -1207,25 +1203,34 @@ def test_with_config_bad_include_activate(environment_from_manifest, tmpdir):
     include1 = env_root / "include1.yaml"
     include1.touch()

+    abs_include_path = os.path.abspath(tmpdir.join("subdir").ensure("include2.yaml"))
+
     spack_yaml = env_root / ev.manifest_name
     spack_yaml.write_text(
-        """
+        f"""
 spack:
   include:
   - ./include1.yaml
+  - {abs_include_path}
 """
     )

     with ev.Environment(env_root) as e:
         e.concretize()

-    # We've created an environment with included config file (which does
-    # exist). Now we remove it and check that we get a sensible error.
+    # we've created an environment with some included config files (which do
+    # in fact exist): now we remove them and check that we get a sensible
+    # error message

+    os.remove(abs_include_path)
     os.remove(include1)
-    with pytest.raises(ValueError, match="does not exist"):
+    with pytest.raises(spack.config.ConfigFileError) as exc:
         ev.activate(ev.Environment(env_root))

+    err = exc.value.message
+    assert "missing include" in err
+    assert abs_include_path in err
+    assert "include1.yaml" in err
     assert ev.active_environment() is None


@@ -1333,10 +1338,8 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
     included file scope.
     """

-    env_path = tmp_path / "test_config"
-    fs.mkdirp(env_path)
     included_file = "included-packages.yaml"
-    included_path = env_path / included_file
+    included_path = tmp_path / included_file
     with open(included_path, "w", encoding="utf-8") as f:
         f.write(
             """\

@@ -1352,7 +1355,7 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
 """
         )

-    spack_yaml = env_path / ev.manifest_name
+    spack_yaml = tmp_path / ev.manifest_name
     spack_yaml.write_text(
         f"""\
 spack:

@@ -1366,8 +1369,7 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
 """
     )

-    mutable_config.set("config:misc_cache", str(tmp_path / "cache"))
-    e = ev.Environment(env_path)
+    e = ev.Environment(tmp_path)
     with e:
         # List of requirements, flip a variant
         config("change", "packages:mpich:require:~debug")

@@ -1457,6 +1459,19 @@ def test_env_with_included_config_file_url(tmpdir, mutable_empty_config, package
     assert cfg["mpileaks"]["version"] == ["2.2"]


+def test_env_with_included_config_missing_file(tmpdir, mutable_empty_config):
+    """Test inclusion of a missing configuration file raises FetchError
+    noting missing file."""
+
+    spack_yaml = tmpdir.join("spack.yaml")
+    missing_file = tmpdir.join("packages.yaml")
+    with spack_yaml.open("w") as f:
+        f.write("spack:\n  include:\n  - {0}\n".format(missing_file.strpath))
+
+    with pytest.raises(spack.error.ConfigError, match="missing include path"):
+        ev.Environment(tmpdir.strpath)
+
+
 def test_env_with_included_config_scope(mutable_mock_env_path, packages_file):
     """Test inclusion of a package file from the environment's configuration
     stage directory. This test is intended to represent a case where a remote

@@ -1551,7 +1566,7 @@ def test_env_with_included_config_precedence(tmp_path):


 def test_env_with_included_configs_precedence(tmp_path):
-    """Test precedence of multiple included configuration files."""
+    """Test precendence of multiple included configuration files."""
     file1 = "high-config.yaml"
     file2 = "low-config.yaml"

@@ -1779,7 +1794,7 @@ def test_roots_display_with_variants():
     with ev.read("test"):
         out = find(output=str)

-    assert "boost+shared" in out
+    assert "boost +shared" in out


 def test_uninstall_keeps_in_env(mock_stage, mock_fetch, install_mockery):

@@ -4262,31 +4277,21 @@ def test_unify_when_possible_works_around_conflicts():
     assert len([x for x in e.all_specs() if x.satisfies("mpich")]) == 1


-# Using mock_include_cache to ensure the "remote" file is cached in a temporary
-# location and not polluting the user cache.
 def test_env_include_packages_url(
-    tmpdir, mutable_empty_config, mock_fetch_url_text, mock_curl_configs, mock_include_cache
+    tmpdir, mutable_empty_config, mock_spider_configs, mock_curl_configs
 ):
     """Test inclusion of a (GitHub) URL."""
     develop_url = "https://github.com/fake/fake/blob/develop/"
     default_packages = develop_url + "etc/fake/defaults/packages.yaml"
-    sha256 = "a422e35b3a18869d0611a4137b37314131749ecdc070a7cd7183f488da81201a"
     spack_yaml = tmpdir.join("spack.yaml")
     with spack_yaml.open("w") as f:
-        f.write(
-            f"""\
-spack:
-  include:
-  - path: {default_packages}
-    sha256: {sha256}
-"""
-        )
+        f.write("spack:\n  include:\n  - {0}\n".format(default_packages))
+    assert os.path.isfile(spack_yaml.strpath)

     with spack.config.override("config:url_fetch_method", "curl"):
         env = ev.Environment(tmpdir.strpath)
         ev.activate(env)

-    # Make sure a setting from test/data/config/packages.yaml is present
     cfg = spack.config.get("packages")
     assert "mpich" in cfg["all"]["providers"]["mpi"]

@@ -4355,7 +4360,7 @@ def test_env_view_disabled(tmp_path, mutable_mock_env_path):


 @pytest.mark.parametrize("first", ["false", "true", "custom"])
-def test_env_include_mixed_views(tmp_path, mutable_config, mutable_mock_env_path, first):
+def test_env_include_mixed_views(tmp_path, mutable_mock_env_path, mutable_config, first):
     """Ensure including path and boolean views in different combinations result
     in the creation of only the first view if it is not disabled."""
     false_yaml = tmp_path / "false-view.yaml"

@@ -320,7 +320,7 @@ def test_find_very_long(database, config):
 @pytest.mark.db
 def test_find_show_compiler(database, config):
     output = find("--no-groups", "--show-full-compiler", "mpileaks")
-    assert "mpileaks@2.3 %gcc@10.2.1" in output
+    assert "mpileaks@2.3%gcc@10.2.1" in output


 @pytest.mark.db

@@ -464,7 +464,7 @@ def test_environment_with_version_range_in_compiler_doesnt_fail(tmp_path):

     with test_environment:
         output = find()
-    assert "zlib %gcc@12.1.0" in output
+    assert "zlib%gcc@12.1.0" in output


 _pkga = (

@@ -718,11 +718,10 @@ def test_install_deps_then_package(tmpdir, mock_fetch, install_mockery):
     assert os.path.exists(root.prefix)


-# Unit tests should not be affected by the user's managed environments
 @pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
 @pytest.mark.regression("12002")
 def test_install_only_dependencies_in_env(
-    tmpdir, mutable_mock_env_path, mock_fetch, install_mockery
+    tmpdir, mock_fetch, install_mockery, mutable_mock_env_path
 ):
     env("create", "test")

@@ -736,10 +735,9 @@ def test_install_only_dependencies_in_env(
     assert not os.path.exists(root.prefix)


-# Unit tests should not be affected by the user's managed environments
 @pytest.mark.regression("12002")
 def test_install_only_dependencies_of_all_in_env(
-    tmpdir, mutable_mock_env_path, mock_fetch, install_mockery
+    tmpdir, mock_fetch, install_mockery, mutable_mock_env_path
 ):
     env("create", "--without-view", "test")

@@ -759,8 +757,7 @@ def test_install_only_dependencies_of_all_in_env(
     assert os.path.exists(dep.prefix)


-# Unit tests should not be affected by the user's managed environments
-def test_install_no_add_in_env(tmpdir, mutable_mock_env_path, mock_fetch, install_mockery):
+def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock_env_path):
     # To test behavior of --add option, we create the following environment:
     #
     # mpileaks

@@ -901,6 +898,7 @@ def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
     specfile = "./spec.json"
     with open(specfile, "w", encoding="utf-8") as f:
         f.write(spec.to_json())
+    print(spec.to_json())
     install("--log-file=cdash_reports", "--log-format=cdash", specfile)
     # Verify Configure.xml exists with expected contents.
     report_dir = tmpdir.join("cdash_reports")

@@ -935,10 +933,9 @@ def test_install_fails_no_args_suggests_env_activation(tmpdir):
     assert "using the `spack.yaml` in this directory" in output


-# Unit tests should not be affected by the user's managed environments
 @pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
 def test_install_env_with_tests_all(
-    tmpdir, mutable_mock_env_path, mock_packages, mock_fetch, install_mockery
+    tmpdir, mock_packages, mock_fetch, install_mockery, mutable_mock_env_path
 ):
     env("create", "test")
     with ev.read("test"):

@@ -948,10 +945,9 @@ def test_install_env_with_tests_all(
     assert os.path.exists(test_dep.prefix)


-# Unit tests should not be affected by the user's managed environments
 @pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
 def test_install_env_with_tests_root(
-    tmpdir, mutable_mock_env_path, mock_packages, mock_fetch, install_mockery
+    tmpdir, mock_packages, mock_fetch, install_mockery, mutable_mock_env_path
 ):
     env("create", "test")
     with ev.read("test"):

@@ -961,10 +957,9 @@ def test_install_env_with_tests_root(
     assert not os.path.exists(test_dep.prefix)


-# Unit tests should not be affected by the user's managed environments
 @pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
 def test_install_empty_env(
-    tmpdir, mutable_mock_env_path, mock_packages, mock_fetch, install_mockery
+    tmpdir, mock_packages, mock_fetch, install_mockery, mutable_mock_env_path
 ):
     env_name = "empty"
     env("create", env_name)

@@ -1000,17 +995,9 @@ def test_installation_fail_tests(install_mockery, mock_fetch, name, method):
     assert "See test log for details" in output


-# Unit tests should not be affected by the user's managed environments
 @pytest.mark.not_on_windows("Buildcache not supported on windows")
 def test_install_use_buildcache(
-    capsys,
-    mutable_mock_env_path,
-    mock_packages,
-    mock_fetch,
-    mock_archive,
-    mock_binary_index,
-    tmpdir,
-    install_mockery,
+    capsys, mock_packages, mock_fetch, mock_archive, mock_binary_index, tmpdir, install_mockery
 ):
     """
     Make sure installing with use-buildcache behaves correctly.

@@ -52,7 +52,8 @@ def test_load_shell(shell, set_command):
     mpileaks_spec = spack.concretize.concretize_one("mpileaks")

     # Ensure our reference variable is clean.
-    os.environ["CMAKE_PREFIX_PATH"] = "/hello" + os.pathsep + "/world"
+    hello_world_paths = [os.path.normpath(p) for p in ("/hello", "/world")]
+    os.environ["CMAKE_PREFIX_PATH"] = os.pathsep.join(hello_world_paths)

     shell_out = load(shell, "mpileaks")

@@ -69,7 +70,7 @@ def extract_value(output, variable):
     paths_shell = extract_value(shell_out, "CMAKE_PREFIX_PATH")

     # We should've prepended new paths, and keep old ones.
-    assert paths_shell[-2:] == ["/hello", "/world"]
+    assert paths_shell[-2:] == hello_world_paths

     # All but the last two paths are added by spack load; lookup what packages they're from.
     pkgs = [prefix_to_pkg(p) for p in paths_shell[:-2]]
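Aside: the rewrite in the hunk above avoids hard-coding POSIX-style paths, because both the path separator inside entries and the separator between entries differ per platform. A minimal standalone sketch of the idiom, standard library only:

import os

# os.path.normpath makes separators platform-native ("/hello" -> "\\hello" on
# Windows), and os.pathsep is the list separator (":" on POSIX, ";" on Windows).
paths = [os.path.normpath(p) for p in ("/hello", "/world")]
os.environ["CMAKE_PREFIX_PATH"] = os.pathsep.join(paths)

# Splitting on os.pathsep recovers the normalized entries, which is why the
# test can compare the tail of the resulting list against the same variable.
assert os.environ["CMAKE_PREFIX_PATH"].split(os.pathsep)[-2:] == paths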
@@ -12,9 +12,6 @@
 install = SpackCommand("install")
 uninstall = SpackCommand("uninstall")

-# Unit tests should not be affected by the user's managed environments
-pytestmark = pytest.mark.usefixtures("mutable_mock_env_path")
-

 @pytest.mark.db
 def test_mark_mode_required(mutable_database):

@@ -38,9 +38,8 @@ def test_regression_8083(tmpdir, capfd, mock_packages, mock_fetch, config):
     assert "as it is an external spec" in output


-# Unit tests should not be affected by the user's managed environments
 @pytest.mark.regression("12345")
-def test_mirror_from_env(mutable_mock_env_path, tmp_path, mock_packages, mock_fetch):
+def test_mirror_from_env(tmp_path, mock_packages, mock_fetch, mutable_mock_env_path):
     mirror_dir = str(tmp_path / "mirror")
     env_name = "test"

@@ -343,16 +342,8 @@ def test_mirror_name_collision(mutable_config):
     mirror("add", "first", "1")


-# Unit tests should not be affected by the user's managed environments
 def test_mirror_destroy(
-    mutable_mock_env_path,
-    install_mockery,
-    mock_packages,
-    mock_fetch,
-    mock_archive,
-    mutable_config,
-    monkeypatch,
-    tmpdir,
+    install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch, tmpdir
 ):
     # Create a temp mirror directory for buildcache usage
     mirror_dir = tmpdir.join("mirror_dir")

@@ -42,7 +42,7 @@ def mock_pkg_git_repo(git, tmp_path_factory):
     repo_dir = root_dir / "builtin.mock"
     shutil.copytree(spack.paths.mock_packages_path, str(repo_dir))

-    repo_cache = spack.util.file_cache.FileCache(root_dir / "cache")
+    repo_cache = spack.util.file_cache.FileCache(str(root_dir / "cache"))
     mock_repo = spack.repo.RepoPath(str(repo_dir), cache=repo_cache)
     mock_repo_packages = mock_repo.repos[0].packages_path

@@ -5,13 +5,9 @@

 import pytest

-import spack.config
-import spack.environment as ev
 import spack.main
-from spack.main import SpackCommand

 repo = spack.main.SpackCommand("repo")
-env = SpackCommand("env")


 def test_help_option():

@@ -37,33 +33,3 @@ def test_create_add_list_remove(mutable_config, tmpdir):
     repo("remove", "--scope=site", str(tmpdir))
     output = repo("list", "--scope=site", output=str)
     assert "mockrepo" not in output
-
-
-def test_env_repo_path_vars_substitution(
-    tmpdir, install_mockery, mutable_mock_env_path, monkeypatch
-):
-    """Test Spack correctly substitues repo paths with environment variables when creating an
-    environment from a manifest file."""
-
-    monkeypatch.setenv("CUSTOM_REPO_PATH", ".")
-
-    # setup environment from spack.yaml
-    envdir = tmpdir.mkdir("env")
-    with envdir.as_cwd():
-        with open("spack.yaml", "w", encoding="utf-8") as f:
-            f.write(
-                """\
-spack:
-  specs: []
-
-  repos:
-  - $CUSTOM_REPO_PATH
-"""
-            )
-        # creating env from manifest file
-        env("create", "test", "./spack.yaml")
-        # check that repo path was correctly substituted with the environment variable
-        current_dir = os.getcwd()
-        with ev.read("test") as newenv:
-            repos_specs = spack.config.get("repos", default={}, scope=newenv.scope_name)
-            assert current_dir in repos_specs

@@ -13,10 +13,7 @@
 import spack.store
 from spack.main import SpackCommand, SpackCommandError

-# Unit tests should not be affected by the user's managed environments
-pytestmark = pytest.mark.usefixtures(
-    "mutable_mock_env_path", "mutable_config", "mutable_mock_repo"
-)
+pytestmark = pytest.mark.usefixtures("mutable_config", "mutable_mock_repo")

 spec = SpackCommand("spec")

@@ -409,108 +409,3 @@ def test_case_sensitive_imports(tmp_path: pathlib.Path):
 def test_pkg_imports():
     assert spack.cmd.style._module_part(spack.paths.prefix, "spack.pkg.builtin.boost") is None
     assert spack.cmd.style._module_part(spack.paths.prefix, "spack.pkg") is None
-
-
-def test_spec_strings(tmp_path):
-    (tmp_path / "example.py").write_text(
-        """\
-def func(x):
-    print("dont fix %s me" % x, 3)
-    return x.satisfies("+foo %gcc +bar") and x.satisfies("%gcc +baz")
-"""
-    )
-    (tmp_path / "example.json").write_text(
-        """\
-{
-  "spec": [
-    "+foo %gcc +bar~nope ^dep %clang +yup @3.2 target=x86_64 /abcdef ^another %gcc ",
-    "%gcc +baz"
-  ],
-  "%gcc x=y": 2
-}
-"""
-    )
-    (tmp_path / "example.yaml").write_text(
-        """\
-spec:
-  - "+foo %gcc +bar"
-  - "%gcc +baz"
-  - "this is fine %clang"
-"%gcc x=y": 2
-"""
-    )
-
-    issues = set()
-
-    def collect_issues(path: str, line: int, col: int, old: str, new: str):
-        issues.add((path, line, col, old, new))
-
-    # check for issues with custom handler
-    spack.cmd.style._check_spec_strings(
-        [
-            str(tmp_path / "nonexistent.py"),
-            str(tmp_path / "example.py"),
-            str(tmp_path / "example.json"),
-            str(tmp_path / "example.yaml"),
-        ],
-        handler=collect_issues,
-    )
-
-    assert issues == {
-        (
-            str(tmp_path / "example.json"),
-            3,
-            9,
-            "+foo %gcc +bar~nope ^dep %clang +yup @3.2 target=x86_64 /abcdef ^another %gcc ",
-            "+foo +bar~nope %gcc ^dep +yup @3.2 target=x86_64 /abcdef %clang ^another %gcc ",
-        ),
-        (str(tmp_path / "example.json"), 4, 9, "%gcc +baz", "+baz %gcc"),
-        (str(tmp_path / "example.json"), 6, 5, "%gcc x=y", "x=y %gcc"),
-        (str(tmp_path / "example.py"), 3, 23, "+foo %gcc +bar", "+foo +bar %gcc"),
-        (str(tmp_path / "example.py"), 3, 57, "%gcc +baz", "+baz %gcc"),
-        (str(tmp_path / "example.yaml"), 2, 5, "+foo %gcc +bar", "+foo +bar %gcc"),
-        (str(tmp_path / "example.yaml"), 3, 5, "%gcc +baz", "+baz %gcc"),
-        (str(tmp_path / "example.yaml"), 5, 1, "%gcc x=y", "x=y %gcc"),
-    }
-
-    # fix the issues in the files
-    spack.cmd.style._check_spec_strings(
-        [
-            str(tmp_path / "nonexistent.py"),
-            str(tmp_path / "example.py"),
-            str(tmp_path / "example.json"),
-            str(tmp_path / "example.yaml"),
-        ],
-        handler=spack.cmd.style._spec_str_fix_handler,
-    )
-
-    assert (
-        (tmp_path / "example.json").read_text()
-        == """\
-{
-  "spec": [
-    "+foo +bar~nope %gcc ^dep +yup @3.2 target=x86_64 /abcdef %clang ^another %gcc ",
-    "+baz %gcc"
-  ],
-  "x=y %gcc": 2
-}
-"""
-    )
-    assert (
-        (tmp_path / "example.py").read_text()
-        == """\
-def func(x):
-    print("dont fix %s me" % x, 3)
-    return x.satisfies("+foo +bar %gcc") and x.satisfies("+baz %gcc")
-"""
-    )
-    assert (
-        (tmp_path / "example.yaml").read_text()
-        == """\
-spec:
-  - "+foo +bar %gcc"
-  - "+baz %gcc"
-  - "this is fine %clang"
-"x=y %gcc": 2
-"""
-    )

@@ -16,9 +16,6 @@
 uninstall = SpackCommand("uninstall")
 install = SpackCommand("install")

-# Unit tests should not be affected by the user's managed environments
-pytestmark = pytest.mark.usefixtures("mutable_mock_env_path")
-

 class MockArgs:
     def __init__(self, packages, all=False, force=False, dependents=False):

@@ -223,7 +220,9 @@ class TestUninstallFromEnv:
     find = SpackCommand("find")

     @pytest.fixture(scope="function")
-    def environment_setup(self, mock_packages, mutable_database, install_mockery):
+    def environment_setup(
+        self, mutable_mock_env_path, mock_packages, mutable_database, install_mockery
+    ):
         TestUninstallFromEnv.env("create", "e1")
         e1 = spack.environment.read("e1")
         with e1:

@@ -50,7 +50,7 @@ def test_list_long(capsys):
 def test_list_long_with_pytest_arg(capsys):
     with capsys.disabled():
         output = spack_test("--list-long", cmd_test_py)
-
+    print(output)
     assert "unit_test.py::\n" in output
     assert "test_list" in output
     assert "test_list_with_pytest_arg" in output

@@ -49,6 +49,7 @@ def test_single_file_verify_cmd(tmpdir):
         sjson.dump({filepath: data}, f)

     results = verify("manifest", "-f", filepath, fail_on_error=False)
+    print(results)
     assert not results

     os.utime(filepath, (0, 0))

@@ -92,7 +92,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
         # Same as before, but tests that we can reuse from a more generic target
         pytest.param(
             "pkg-a%gcc@9.4.0",
-            "pkg-b target=x86_64 %gcc@10.2.1",
+            "pkg-b%gcc@10.2.1 target=x86_64",
             {"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
             1,
             marks=pytest.mark.skipif(

@@ -101,7 +101,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
         ),
         pytest.param(
             "pkg-a%gcc@10.2.1",
-            "pkg-b target=x86_64 %gcc@9.4.0",
+            "pkg-b%gcc@9.4.0 target=x86_64",
             {
                 "pkg-a": "gcc-runtime@10.2.1 target=x86_64",
                 "pkg-b": "gcc-runtime@9.4.0 target=x86_64",

@@ -121,7 +121,7 @@ def binary_compatibility(monkeypatch, request):
         "mpileaks ^mpi@1.2:2",
         # conflict not triggered
         "conflict",
-        "conflict~foo%clang",
+        "conflict%clang~foo",
         "conflict-parent%gcc",
     ]
 )

@@ -387,8 +387,8 @@ def test_different_compilers_get_different_flags(
         t = archspec.cpu.host().family
         client = spack.concretize.concretize_one(
             Spec(
-                f"cmake-client platform=test os=redhat6 target={t} %gcc@11.1.0"
-                f" ^cmake platform=test os=redhat6 target={t} %clang@12.2.0"
+                f"cmake-client %gcc@11.1.0 platform=test os=redhat6 target={t}"
+                f" ^cmake %clang@12.2.0 platform=test os=redhat6 target={t}"
             )
         )
         cmake = client["cmake"]

@@ -403,7 +403,7 @@ def test_spec_flags_maintain_order(self, mutable_config, gcc11_with_flags):
         for successive concretizations.
         """
         mutable_config.set("compilers", [gcc11_with_flags])
-        spec_str = "libelf os=redhat6 %gcc@11.1.0"
+        spec_str = "libelf %gcc@11.1.0 os=redhat6"
         for _ in range(3):
             s = spack.concretize.concretize_one(spec_str)
             assert all(

@@ -414,7 +414,7 @@ def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12
         mutable_config.set("compilers", [clang12_with_flags])
         # Correct arch to use test compiler that has flags
         t = archspec.cpu.host().family
-        spec = Spec(f"pkg-a platform=test os=redhat6 target={t} %clang@12.2.0")
+        spec = Spec(f"pkg-a %clang@12.2.0 platform=test os=redhat6 target={t}")

         # Get the compiler that matches the spec (
         compiler = spack.compilers.compiler_for_spec("clang@=12.2.0", spec.architecture)

@@ -488,13 +488,13 @@ def test_architecture_deep_inheritance(self, mock_targets, compiler_factory):
         # CNL compiler has no target attribute, and this is essential to make detection pass
         del cnl_compiler["compiler"]["target"]
         with spack.config.override("compilers", [cnl_compiler]):
-            spec_str = "mpileaks os=CNL target=nocona %gcc@4.5.0 ^dyninst os=CNL ^callpath os=CNL"
+            spec_str = "mpileaks %gcc@4.5.0 os=CNL target=nocona ^dyninst os=CNL ^callpath os=CNL"
             spec = spack.concretize.concretize_one(spec_str)
             for s in spec.traverse(root=False):
                 assert s.architecture.target == spec.architecture.target

     def test_compiler_flags_from_user_are_grouped(self):
-        spec = Spec('pkg-a cflags="-O -foo-flag foo-val" platform=test %gcc')
+        spec = Spec('pkg-a%gcc cflags="-O -foo-flag foo-val" platform=test')
         spec = spack.concretize.concretize_one(spec)
         cflags = spec.compiler_flags["cflags"]
         assert any(x == "-foo-flag foo-val" for x in cflags)

@@ -804,7 +804,7 @@ def test_external_and_virtual(self, mutable_config):
         assert spec["stuff"].compiler.satisfies("gcc")

     def test_compiler_child(self):
-        s = Spec("mpileaks target=x86_64 %clang ^dyninst%gcc")
+        s = Spec("mpileaks%clang target=x86_64 ^dyninst%gcc")
         s = spack.concretize.concretize_one(s)
         assert s["mpileaks"].satisfies("%clang")
         assert s["dyninst"].satisfies("%gcc")

@@ -826,7 +826,7 @@ def test_conflict_in_all_directives_true(self):
         with pytest.raises(spack.error.SpackError):
             s = spack.concretize.concretize_one(s)

-    @pytest.mark.parametrize("spec_str", ["conflict@10.0+foo%clang"])
+    @pytest.mark.parametrize("spec_str", ["conflict@10.0%clang+foo"])
     def test_no_conflict_in_external_specs(self, spec_str):
         # Modify the configuration to have the spec with conflict
         # registered as an external

@@ -940,9 +940,9 @@ def test_noversion_pkg(self, spec):
         "spec, best_achievable",
         [
             ("mpileaks%gcc@=4.4.7 ^dyninst@=10.2.1 target=x86_64:", "core2"),
-            ("mpileaks target=x86_64: %gcc@=4.8", "haswell"),
-            ("mpileaks target=x86_64: %gcc@=5.3.0", "broadwell"),
-            ("mpileaks target=x86_64: %apple-clang@=5.1.0", "x86_64"),
+            ("mpileaks%gcc@=4.8 target=x86_64:", "haswell"),
+            ("mpileaks%gcc@=5.3.0 target=x86_64:", "broadwell"),
+            ("mpileaks%apple-clang@=5.1.0 target=x86_64:", "x86_64"),
         ],
     )
     @pytest.mark.regression("13361", "20537")

@@ -1286,7 +1286,7 @@ def test_compiler_match_is_preferred_to_newer_version(self, compiler_factory):
         with spack.config.override(
             "compilers", [compiler_factory(spec="gcc@10.1.0", operating_system="redhat6")]
         ):
-            spec_str = "simple-inheritance+openblas os=redhat6 %gcc@10.1.0"
+            spec_str = "simple-inheritance+openblas %gcc@10.1.0 os=redhat6"
             s = spack.concretize.concretize_one(spec_str)
             assert "openblas@0.2.15" in s
             assert s["openblas"].satisfies("%gcc@10.1.0")

@@ -1319,7 +1319,7 @@ def test_custom_compiler_version(self, mutable_config, compiler_factory, monkeyp
             "compilers", [compiler_factory(spec="gcc@10foo", operating_system="redhat6")]
         )
         monkeypatch.setattr(spack.compiler.Compiler, "real_version", "10.2.1")
-        s = spack.concretize.concretize_one("pkg-a os=redhat6 %gcc@10foo")
+        s = spack.concretize.concretize_one("pkg-a %gcc@10foo os=redhat6")
         assert "%gcc@10foo" in s

     def test_all_patches_applied(self):

@@ -1531,8 +1531,8 @@ def test_external_with_non_default_variant_as_dependency(self):
         ("mpileaks", "os=debian6"),
         # To trigger the bug in 22871 we need to have the same compiler
         # spec available on both operating systems
-        ("mpileaks platform=test os=debian6 %gcc@10.2.1", "os=debian6"),
-        ("mpileaks platform=test os=redhat6 %gcc@10.2.1", "os=redhat6"),
+        ("mpileaks%gcc@10.2.1 platform=test os=debian6", "os=debian6"),
+        ("mpileaks%gcc@10.2.1 platform=test os=redhat6", "os=redhat6"),
     ],
 )
 def test_os_selection_when_multiple_choices_are_possible(

@@ -2018,6 +2018,7 @@ def test_git_ref_version_is_equivalent_to_specified_version(self, git_ref):
         s = Spec("develop-branch-version@git.%s=develop" % git_ref)
         c = spack.concretize.concretize_one(s)
         assert git_ref in str(c)
+        print(str(c))
         assert s.satisfies("@develop")
         assert s.satisfies("@0.1:")

@@ -3013,7 +3014,7 @@ def test_virtuals_provided_together_but_only_one_required_in_dag(self):


 def test_reusable_externals_match(mock_packages, tmpdir):
-    spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
+    spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
     spec.external_path = tmpdir.strpath
     spec.external_modules = ["mpich/4.1"]
     spec._mark_concrete()

@@ -3031,7 +3032,7 @@ def test_reusable_externals_match(mock_packages, tmpdir):


 def test_reusable_externals_match_virtual(mock_packages, tmpdir):
-    spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
+    spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
     spec.external_path = tmpdir.strpath
     spec.external_modules = ["mpich/4.1"]
     spec._mark_concrete()

@@ -3049,7 +3050,7 @@ def test_reusable_externals_match_virtual(mock_packages, tmpdir):


 def test_reusable_externals_different_prefix(mock_packages, tmpdir):
-    spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
+    spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
     spec.external_path = "/other/path"
     spec.external_modules = ["mpich/4.1"]
     spec._mark_concrete()

@@ -3068,7 +3069,7 @@ def test_reusable_externals_different_prefix(mock_packages, tmpdir):

 @pytest.mark.parametrize("modules", [None, ["mpich/4.1", "libfabric/1.19"]])
 def test_reusable_externals_different_modules(mock_packages, tmpdir, modules):
-    spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
+    spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
     spec.external_path = tmpdir.strpath
     spec.external_modules = modules
     spec._mark_concrete()

@@ -3086,7 +3087,7 @@ def test_reusable_externals_different_modules(mock_packages, tmpdir, modules):


 def test_reusable_externals_different_spec(mock_packages, tmpdir):
-    spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
+    spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
     spec.external_path = tmpdir.strpath
     spec._mark_concrete()
     assert not spack.solver.asp._is_reusable(

@@ -3254,54 +3255,3 @@ def test_spec_unification(unify, mutable_config, mock_packages):
     maybe_fails = pytest.raises if unify is True else llnl.util.lang.nullcontext
     with maybe_fails(spack.solver.asp.UnsatisfiableSpecError):
         _ = spack.cmd.parse_specs([a_restricted, b], concretize=True)
-
-
-def test_concretization_cache_roundtrip(use_concretization_cache, monkeypatch, mutable_config):
-    """Tests whether we can write the results of a clingo solve to the cache
-    and load the same spec request from the cache to produce identical specs"""
-    # Force determinism:
-    # Solver setup is normally non-deterministic due to non-determinism in
-    # asp solver setup logic generation. The only other inputs to the cache keys are
-    # the .lp files, which are invariant over the course of this test.
-    # This method forces the same setup to be produced for the same specs
-    # which gives us a guarantee of cache hits, as it removes the only
-    # element of non deterministic solver setup for the same spec
-    # Basically just a quick and dirty memoization
-    solver_setup = spack.solver.asp.SpackSolverSetup.setup
-
-    def _setup(self, specs, *, reuse=None, allow_deprecated=False):
-        if not getattr(_setup, "cache_setup", None):
-            cache_setup = solver_setup(self, specs, reuse=reuse, allow_deprecated=allow_deprecated)
-            setattr(_setup, "cache_setup", cache_setup)
-        return getattr(_setup, "cache_setup")
-
-    # monkeypatch our forced determinism setup method into solver setup
-    monkeypatch.setattr(spack.solver.asp.SpackSolverSetup, "setup", _setup)
-
-    assert spack.config.get("config:concretization_cache:enable")
-
-    # run one standard concretization to populate the cache and the setup method
-    # memoization
-    h = spack.concretize.concretize_one("hdf5")
-
-    # due to our forced determinism above, we should not be observing
-    # cache misses, assert that we're not storing any new cache entries
-    def _ensure_no_store(self, problem: str, result, statistics, test=False):
-        # always throw, we never want to reach this code path
-        assert False, "Concretization cache hit expected"
-
-    # Assert that we're actually hitting the cache
-    cache_fetch = spack.solver.asp.ConcretizationCache.fetch
-
-    def _ensure_cache_hits(self, problem: str):
-        result, statistics = cache_fetch(self, problem)
-        assert result, "Expected successful concretization cache hit"
-        assert statistics, "Expected statistics to be non null on cache hit"
-        return result, statistics
-
-    monkeypatch.setattr(spack.solver.asp.ConcretizationCache, "store", _ensure_no_store)
-    monkeypatch.setattr(spack.solver.asp.ConcretizationCache, "fetch", _ensure_cache_hits)
-    # ensure subsequent concretizations of the same spec produce the same spec
-    # object
-    for _ in range(5):
-        assert h == spack.concretize.concretize_one("hdf5")
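Aside: the removed test above forces solver determinism by memoizing a method, stashing the first result as an attribute on the replacement function itself. A self-contained sketch of that pattern, with a hypothetical Solver class standing in for SpackSolverSetup (not Spack's actual API):

# The wrapper caches its first result on itself, so every later call --
# regardless of arguments -- returns the identical object.
class Solver:
    def setup(self, specs):
        return object()  # stands in for the expensive, non-deterministic setup

original_setup = Solver.setup

def _setup(self, specs):
    if getattr(_setup, "cached", None) is None:
        _setup.cached = original_setup(self, specs)
    return _setup.cached

Solver.setup = _setup  # a pytest test would use monkeypatch.setattr instead
assert Solver().setup(["hdf5"]) is Solver().setup(["hdf5"])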
@@ -94,7 +94,7 @@ def test_mix_spec_and_compiler_cfg(concretize_scope, test_repo):
|
|||||||
conf_str = _compiler_cfg_one_entry_with_cflags("-Wall")
|
conf_str = _compiler_cfg_one_entry_with_cflags("-Wall")
|
||||||
update_concretize_scope(conf_str, "compilers")
|
update_concretize_scope(conf_str, "compilers")
|
||||||
|
|
||||||
s1 = spack.concretize.concretize_one('y cflags="-O2" %gcc@12.100.100')
|
s1 = spack.concretize.concretize_one('y %gcc@12.100.100 cflags="-O2"')
|
||||||
assert s1.satisfies('cflags="-Wall -O2"')
|
assert s1.satisfies('cflags="-Wall -O2"')
|
||||||
|
|
||||||
|
|
||||||
@@ -182,7 +182,7 @@ def test_propagate_and_compiler_cfg(concretize_scope, test_repo):
|
|||||||
conf_str = _compiler_cfg_one_entry_with_cflags("-f2")
|
conf_str = _compiler_cfg_one_entry_with_cflags("-f2")
|
||||||
update_concretize_scope(conf_str, "compilers")
|
update_concretize_scope(conf_str, "compilers")
|
||||||
|
|
||||||
root_spec = spack.concretize.concretize_one("v cflags=='-f1' %gcc@12.100.100")
|
root_spec = spack.concretize.concretize_one("v %gcc@12.100.100 cflags=='-f1'")
|
||||||
assert root_spec["y"].satisfies("cflags='-f1 -f2'")
|
assert root_spec["y"].satisfies("cflags='-f1 -f2'")
|
||||||
|
|
||||||
|
|
||||||
@@ -229,7 +229,7 @@ def test_dev_mix_flags(tmp_path, concretize_scope, mutable_mock_env_path, test_r
|
|||||||
env_content = f"""\
|
env_content = f"""\
|
||||||
spack:
|
spack:
|
||||||
specs:
|
specs:
|
||||||
- y cflags=='-fsanitize=address' %gcc@12.100.100
|
- y %gcc@12.100.100 cflags=='-fsanitize=address'
|
||||||
develop:
|
develop:
|
||||||
y:
|
y:
|
||||||
spec: y cflags=='-fsanitize=address'
|
spec: y cflags=='-fsanitize=address'
|
||||||
|
|||||||
@@ -359,10 +359,10 @@ def test_one_package_multiple_oneof_groups(concretize_scope, test_repo):
|
|||||||
update_packages_config(conf_str)
|
update_packages_config(conf_str)
|
||||||
|
|
||||||
s1 = spack.concretize.concretize_one("y@2.5")
|
s1 = spack.concretize.concretize_one("y@2.5")
|
||||||
assert s1.satisfies("~shared%clang")
|
assert s1.satisfies("%clang~shared")
|
||||||
|
|
||||||
s2 = spack.concretize.concretize_one("y@2.4")
|
s2 = spack.concretize.concretize_one("y@2.4")
|
||||||
assert s2.satisfies("+shared%gcc")
|
assert s2.satisfies("%gcc+shared")
|
||||||
|
|
||||||
|
|
||||||
@pytest.mark.regression("34241")
|
@pytest.mark.regression("34241")
|
||||||
@@ -499,7 +499,7 @@ def test_default_requirements_with_all(spec_str, requirement_str, concretize_sco
|
|||||||
"requirements,expectations",
|
"requirements,expectations",
|
||||||
[
|
[
|
||||||
(("%gcc", "%clang"), ("%gcc", "%clang")),
|
(("%gcc", "%clang"), ("%gcc", "%clang")),
|
||||||
(("~shared%gcc", "@1.0"), ("~shared%gcc", "@1.0+shared")),
|
(("%gcc~shared", "@1.0"), ("%gcc~shared", "@1.0+shared")),
|
||||||
],
|
],
|
||||||
)
|
)
|
||||||
def test_default_and_package_specific_requirements(
|
def test_default_and_package_specific_requirements(
|
||||||
@@ -754,7 +754,7 @@ def test_skip_requirement_when_default_requirement_condition_cannot_be_met(
|
|||||||
update_packages_config(packages_yaml)
|
update_packages_config(packages_yaml)
|
||||||
s = spack.concretize.concretize_one("mpileaks")
|
s = spack.concretize.concretize_one("mpileaks")
|
||||||
|
|
||||||
assert s.satisfies("+shared %clang")
|
assert s.satisfies("%clang+shared")
|
||||||
# Sanity checks that 'callpath' doesn't have the shared variant, but that didn't
|
# Sanity checks that 'callpath' doesn't have the shared variant, but that didn't
|
||||||
# cause failures during concretization.
|
# cause failures during concretization.
|
||||||
assert "shared" not in s["callpath"].variants
|
assert "shared" not in s["callpath"].variants
|
||||||
@@ -1125,46 +1125,3 @@ def test_strong_preferences_higher_priority_than_reuse(concretize_scope, mock_pa
     )
     ascent = result.specs[0]
     assert ascent["adios2"].dag_hash() == reused_spec.dag_hash(), ascent
-
-
-@pytest.mark.parametrize(
-    "packages_yaml,err_match",
-    [
-        (
-            """
-        packages:
-          mpi:
-            require:
-            - "+bzip2"
-        """,
-            "expected a named spec",
-        ),
-        (
-            """
-        packages:
-          mpi:
-            require:
-            - one_of: ["+bzip2", openmpi]
-        """,
-            "expected a named spec",
-        ),
-        (
-            """
-        packages:
-          mpi:
-            require:
-            - "^mpich"
-        """,
-            "Did you mean",
-        ),
-    ],
-)
-def test_anonymous_spec_cannot_be_used_in_virtual_requirements(
-    packages_yaml, err_match, concretize_scope, mock_packages
-):
-    """Tests that using anonymous specs in requirements for virtual packages raises an
-    appropriate error message.
-    """
-    update_packages_config(packages_yaml)
-    with pytest.raises(spack.error.SpackError, match=err_match):
-        spack.concretize.concretize_one("mpileaks")
@@ -11,7 +11,8 @@

 import pytest

-from llnl.util.filesystem import join_path, touch
+import llnl.util.tty as tty
+from llnl.util.filesystem import join_path, touch, touchp

 import spack
 import spack.config
@@ -25,7 +26,6 @@
 import spack.schema.compilers
 import spack.schema.config
 import spack.schema.env
-import spack.schema.include
 import spack.schema.mirrors
 import spack.schema.repos
 import spack.spec
@@ -51,9 +51,22 @@

 config_override_list = {"config": {"build_stage:": ["pathd", "pathe"]}}

-config_merge_dict = {"config": {"aliases": {"ls": "find", "dev": "develop"}}}
+config_merge_dict = {"config": {"info": {"a": 3, "b": 4}}}

-config_override_dict = {"config": {"aliases:": {"be": "build-env", "deps": "dependencies"}}}
+config_override_dict = {"config": {"info:": {"a": 7, "c": 9}}}


+@pytest.fixture()
+def write_config_file(tmpdir):
+    """Returns a function that writes a config file."""
+
+    def _write(config, data, scope):
+        config_yaml = tmpdir.join(scope, config + ".yaml")
+        config_yaml.ensure()
+        with config_yaml.open("w") as f:
+            syaml.dump_config(data, f)
+
+    return _write
+
+
 @pytest.fixture()
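The hunk above moves the write_config_file fixture back into this test module. A minimal sketch of how the restored fixture and the info dicts above combine in a test (the test name here is hypothetical; the "low"/"high" scopes come from the mock_low_high_config harness used throughout this file):

    def test_info_merge_sketch(mock_low_high_config, write_config_file):
        # Write the merge dict into the "low" scope; it is then visible
        # through the normal configuration stack.
        write_config_file("config", config_merge_dict, "low")
        assert mock_low_high_config.get("config:info") == {"a": 3, "b": 4}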
@@ -1024,16 +1037,6 @@ def test_bad_config_yaml(tmpdir):
     )


-def test_bad_include_yaml(tmpdir):
-    with pytest.raises(spack.config.ConfigFormatError, match="is not of type"):
-        check_schema(
-            spack.schema.include.schema,
-            """\
-include: $HOME/include.yaml
-""",
-        )
-
-
 def test_bad_mirrors_yaml(tmpdir):
     with pytest.raises(spack.config.ConfigFormatError):
         check_schema(
@@ -1098,9 +1101,9 @@ def test_internal_config_section_override(mock_low_high_config, write_config_fil

 def test_internal_config_dict_override(mock_low_high_config, write_config_file):
     write_config_file("config", config_merge_dict, "low")
-    wanted_dict = config_override_dict["config"]["aliases:"]
+    wanted_dict = config_override_dict["config"]["info:"]
     mock_low_high_config.push_scope(spack.config.InternalConfigScope("high", config_override_dict))
-    assert mock_low_high_config.get("config:aliases") == wanted_dict
+    assert mock_low_high_config.get("config:info") == wanted_dict


 def test_internal_config_list_override(mock_low_high_config, write_config_file):
@@ -1132,10 +1135,10 @@ def test_set_list_override(mock_low_high_config, write_config_file):

 def test_set_dict_override(mock_low_high_config, write_config_file):
     write_config_file("config", config_merge_dict, "low")
-    wanted_dict = config_override_dict["config"]["aliases:"]
-    with spack.config.override("config:aliases:", wanted_dict):
-        assert wanted_dict == mock_low_high_config.get("config:aliases")
-    assert config_merge_dict["config"]["aliases"] == mock_low_high_config.get("config:aliases")
+    wanted_dict = config_override_dict["config"]["info:"]
+    with spack.config.override("config:info:", wanted_dict):
+        assert wanted_dict == mock_low_high_config.get("config:info")
+    assert config_merge_dict["config"]["info"] == mock_low_high_config.get("config:info")


 def test_set_bad_path(config):
@@ -1221,7 +1224,7 @@ def test_user_config_path_is_default_when_env_var_is_empty(working_env):


 def test_default_install_tree(monkeypatch, default_config):
-    s = spack.spec.Spec("nonexistent@x.y.z arch=foo-bar-baz %none@a.b.c")
+    s = spack.spec.Spec("nonexistent@x.y.z %none@a.b.c arch=foo-bar-baz")
     monkeypatch.setattr(s, "dag_hash", lambda length: "abc123")
     _, _, projections = spack.store.parse_install_tree(spack.config.get("config"))
     assert s.format(projections["all"]) == "foo-bar-baz/none-a.b.c/nonexistent-x.y.z-abc123"
@@ -1261,6 +1264,134 @@ def test_user_cache_path_is_default_when_env_var_is_empty(working_env):
     assert os.path.expanduser("~%s.spack" % os.sep) == spack.paths._get_user_cache_path()


+github_url = "https://github.com/fake/fake/{0}/develop"
+gitlab_url = "https://gitlab.fake.io/user/repo/-/blob/config/defaults"
+
+
+@pytest.mark.parametrize(
+    "url,isfile",
+    [
+        (github_url.format("tree"), False),
+        ("{0}/README.md".format(github_url.format("blob")), True),
+        ("{0}/etc/fake/defaults/packages.yaml".format(github_url.format("blob")), True),
+        (gitlab_url, False),
+        (None, False),
+    ],
+)
+def test_config_collect_urls(mutable_empty_config, mock_spider_configs, url, isfile):
+    with spack.config.override("config:url_fetch_method", "curl"):
+        urls = spack.config.collect_urls(url)
+        if url:
+            if isfile:
+                expected = 1 if url.endswith(".yaml") else 0
+                assert len(urls) == expected
+            else:
+                # Expect multiple configuration files for a "directory"
+                assert len(urls) > 1
+        else:
+            assert not urls
+
+
+@pytest.mark.parametrize(
+    "url,isfile,fail",
+    [
+        (github_url.format("tree"), False, False),
+        (gitlab_url, False, False),
+        ("{0}/README.md".format(github_url.format("blob")), True, True),
+        ("{0}/compilers.yaml".format(gitlab_url), True, False),
+        (None, False, True),
+    ],
+)
+def test_config_fetch_remote_configs(
+    tmpdir, mutable_empty_config, mock_collect_urls, mock_curl_configs, url, isfile, fail
+):
+    def _has_content(filename):
+        # The first element of all configuration files for this test happen to
+        # be the basename of the file so this check leverages that feature. If
+        # that changes, then this check will need to change accordingly.
+        element = "{0}:".format(os.path.splitext(os.path.basename(filename))[0])
+        with open(filename, "r", encoding="utf-8") as fd:
+            for line in fd:
+                if element in line:
+                    return True
+        tty.debug("Expected {0} in '{1}'".format(element, filename))
+        return False
+
+    dest_dir = join_path(tmpdir.strpath, "defaults")
+    if fail:
+        msg = "Cannot retrieve configuration"
+        with spack.config.override("config:url_fetch_method", "curl"):
+            with pytest.raises(spack.config.ConfigFileError, match=msg):
+                spack.config.fetch_remote_configs(url, dest_dir)
+    else:
+        with spack.config.override("config:url_fetch_method", "curl"):
+            path = spack.config.fetch_remote_configs(url, dest_dir)
+        assert os.path.exists(path)
+        if isfile:
+            # Ensure correct file is "fetched"
+            assert os.path.basename(path) == os.path.basename(url)
+            # Ensure contents of the file has expected config element
+            assert _has_content(path)
+        else:
+            for filename in os.listdir(path):
+                assert _has_content(join_path(path, filename))
+
+
+@pytest.fixture(scope="function")
+def mock_collect_urls(mock_config_data, monkeypatch):
+    """Mock the collection of URLs to avoid mocking spider."""
+
+    _, config_files = mock_config_data
+
+    def _collect(base_url):
+        if not base_url:
+            return []
+
+        ext = os.path.splitext(base_url)[1]
+        if ext:
+            return [base_url] if ext == ".yaml" else []
+
+        return [join_path(base_url, f) for f in config_files]
+
+    monkeypatch.setattr(spack.config, "collect_urls", _collect)
+
+    yield
+
+
+@pytest.mark.parametrize(
+    "url,skip",
+    [(github_url.format("tree"), True), ("{0}/compilers.yaml".format(gitlab_url), True)],
+)
+def test_config_fetch_remote_configs_skip(
+    tmpdir, mutable_empty_config, mock_collect_urls, mock_curl_configs, url, skip
+):
+    """Ensure skip fetching remote config file if it already exists when
+    required and not skipping if replacing it."""
+
+    def check_contents(filename, expected):
+        with open(filename, "r", encoding="utf-8") as fd:
+            lines = fd.readlines()
+            if expected:
+                assert lines[0] == "compilers:"
+            else:
+                assert not lines
+
+    dest_dir = join_path(tmpdir.strpath, "defaults")
+    filename = "compilers.yaml"
+
+    # Create a stage directory with an empty configuration file
+    path = join_path(dest_dir, filename)
+    touchp(path)
+
+    # Do NOT replace the existing cached configuration file if skipping
+    expected = None if skip else "compilers:"
+
+    with spack.config.override("config:url_fetch_method", "curl"):
+        path = spack.config.fetch_remote_configs(url, dest_dir, skip)
+    result_filename = path if path.endswith(".yaml") else join_path(path, filename)
+    check_contents(result_filename, expected)
+
+
 def test_config_file_dir_failure(tmpdir, mutable_empty_config):
     with pytest.raises(spack.config.ConfigFileError, match="not a file"):
         spack.config.read_config_file(tmpdir.strpath)
@@ -30,15 +30,7 @@
 import llnl.util.lang
 import llnl.util.lock
 import llnl.util.tty as tty
-from llnl.util.filesystem import (
-    copy,
-    copy_tree,
-    join_path,
-    mkdirp,
-    remove_linked_tree,
-    touchp,
-    working_dir,
-)
+from llnl.util.filesystem import copy_tree, mkdirp, remove_linked_tree, touchp, working_dir

 import spack.binary_distribution
 import spack.bootstrap.core
@@ -73,7 +65,6 @@
 from spack.installer import PackageInstaller
 from spack.main import SpackCommand
 from spack.util.pattern import Bunch
-from spack.util.remote_file_cache import raw_github_gitlab_url

 from ..enums import ConfigScopePriority

@@ -350,16 +341,6 @@ def pytest_collection_modifyitems(config, items):
             item.add_marker(skip_as_slow)


-@pytest.fixture(scope="function")
-def use_concretization_cache(mutable_config, tmpdir):
-    """Enables the use of the concretization cache"""
-    spack.config.set("config:concretization_cache:enable", True)
-    # ensure we have an isolated concretization cache
-    new_conc_cache_loc = str(tmpdir.mkdir("concretization"))
-    spack.config.set("config:concretization_cache:path", new_conc_cache_loc)
-    yield
-
-
 #
 # These fixtures are applied to all tests
 #
@@ -1700,7 +1681,7 @@ def installation_dir_with_headers(tmpdir_factory):
 ##########


-@pytest.fixture(params=["conflict+foo%clang", "conflict-parent@0.9^conflict~foo"])
+@pytest.fixture(params=["conflict%clang+foo", "conflict-parent@0.9^conflict~foo"])
 def conflict_spec(request):
     """Specs which violate constraints specified with the "conflicts"
     directive in the "conflict" package.
@@ -1915,21 +1896,35 @@ def __call__(self, *args, **kwargs):


 @pytest.fixture(scope="function")
-def mock_fetch_url_text(tmpdir, mock_config_data, monkeypatch):
-    """Mock spack.util.web.fetch_url_text."""
-
-    stage_dir, config_files = mock_config_data
-
-    def _fetch_text_file(url, dest_dir):
-        raw_url = raw_github_gitlab_url(url)
-        mkdirp(dest_dir)
-        basename = os.path.basename(raw_url)
-        src = join_path(stage_dir, basename)
-        dest = join_path(dest_dir, basename)
-        copy(src, dest)
-        return dest
-
-    monkeypatch.setattr(spack.util.web, "fetch_url_text", _fetch_text_file)
+def mock_spider_configs(mock_config_data, monkeypatch):
+    """
+    Mock retrieval of configuration file URLs from the web by grabbing
+    them from the test data configuration directory.
+    """
+    config_data_dir, config_files = mock_config_data
+
+    def _spider(*args, **kwargs):
+        root_urls = args[0]
+        if not root_urls:
+            return [], set()
+
+        root_urls = [root_urls] if isinstance(root_urls, str) else root_urls
+
+        # Any URL with an extension will be treated like a file; otherwise,
+        # it is considered a directory/folder and we'll grab all available
+        # files.
+        urls = []
+        for url in root_urls:
+            if os.path.splitext(url)[1]:
+                urls.append(url)
+            else:
+                urls.extend([os.path.join(url, f) for f in config_files])
+
+        return [], set(urls)
+
+    monkeypatch.setattr(spack.util.web, "spider", _spider)
+
+    yield


 @pytest.fixture(scope="function")
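The replacement fixture above is a standard pytest monkeypatch: swap a module-level function for the duration of one test and let pytest undo it afterwards. A self-contained sketch of the same pattern (the URL is a placeholder, and the ([], set()) return shape mirrors what the fixture assumes spack.util.web.spider returns):

    import spack.util.web

    def test_spider_is_mocked(monkeypatch):
        def _spider(*args, **kwargs):
            # Return the (pages, links) shape the fixture above assumes.
            return [], {"https://example.invalid/packages.yaml"}

        monkeypatch.setattr(spack.util.web, "spider", _spider)
        _, links = spack.util.web.spider("https://example.invalid")
        assert links == {"https://example.invalid/packages.yaml"}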
@@ -2143,7 +2138,8 @@ def _c_compiler_always_exists():
 @pytest.fixture(scope="session")
 def mock_test_cache(tmp_path_factory):
     cache_dir = tmp_path_factory.mktemp("cache")
-    return spack.util.file_cache.FileCache(cache_dir)
+    print(cache_dir)
+    return spack.util.file_cache.FileCache(str(cache_dir))


 class MockHTTPResponse(io.IOBase):
@@ -2192,27 +2188,3 @@ def info(self):
 @pytest.fixture()
 def mock_runtimes(config, mock_packages):
     return mock_packages.packages_with_tags("runtime")
-
-
-@pytest.fixture()
-def write_config_file(tmpdir):
-    """Returns a function that writes a config file."""
-
-    def _write(config, data, scope):
-        config_yaml = tmpdir.join(scope, config + ".yaml")
-        config_yaml.ensure()
-        with config_yaml.open("w") as f:
-            syaml.dump_config(data, f)
-        return config_yaml
-
-    return _write
-
-
-def _include_cache_root():
-    return join_path(str(tempfile.mkdtemp()), "user_cache", "includes")
-
-
-@pytest.fixture()
-def mock_include_cache(monkeypatch):
-    """Override the include cache directory so tests don't pollute user cache."""
-    monkeypatch.setattr(spack.config, "_include_cache_location", _include_cache_root)
@@ -14,5 +14,3 @@ config:
   checksum: true
   dirty: false
   locks: {1}
-  concretization_cache:
-    enable: false
@@ -161,7 +161,7 @@ def test_handle_unknown_package(temporary_store, config, mock_packages, tmp_path
     """
     layout = temporary_store.layout

-    repo_cache = spack.util.file_cache.FileCache(tmp_path / "cache")
+    repo_cache = spack.util.file_cache.FileCache(str(tmp_path / "cache"))
     mock_db = spack.repo.RepoPath(spack.paths.mock_packages_path, cache=repo_cache)

     not_in_mock = set.difference(
@@ -12,7 +12,6 @@

 import spack.config
 import spack.environment as ev
-import spack.platforms
 import spack.solver.asp
 import spack.spec
 from spack.environment.environment import (
@@ -922,50 +921,3 @@ def test_environment_from_name_or_dir(mock_packages, mutable_mock_env_path, tmp_

     with pytest.raises(ev.SpackEnvironmentError, match="no such environment"):
         _ = ev.environment_from_name_or_dir("fake-env")
-
-
-def test_env_include_configs(mutable_mock_env_path, mock_packages):
-    """check config and package values using new include schema"""
-    env_path = mutable_mock_env_path
-    env_path.mkdir()
-
-    this_os = spack.platforms.host().default_os
-    config_root = env_path / this_os
-    config_root.mkdir()
-    config_path = str(config_root / "config.yaml")
-    with open(config_path, "w", encoding="utf-8") as f:
-        f.write(
-            """\
-config:
-  verify_ssl: False
-"""
-        )
-
-    packages_path = str(env_path / "packages.yaml")
-    with open(packages_path, "w", encoding="utf-8") as f:
-        f.write(
-            """\
-packages:
-  python:
-    require:
-    - spec: "@3.11:"
-"""
-        )
-
-    spack_yaml = env_path / ev.manifest_name
-    spack_yaml.write_text(
-        f"""\
-spack:
-  include:
-  - path: {config_path}
-    optional: true
-  - path: {packages_path}
-"""
-    )
-
-    e = ev.Environment(env_path)
-    with e.manifest.use_config():
-        assert not spack.config.get("config:verify_ssl")
-        python_reqs = spack.config.get("packages")["python"]["require"]
-        req_specs = set(x["spec"] for x in python_reqs)
-        assert req_specs == set(["@3.11:"])
@@ -680,19 +680,13 @@ def test_install_spliced_build_spec_installed(install_mockery, capfd, mock_fetch
     assert node.build_spec.installed


-# Unit tests should not be affected by the user's managed environments
 @pytest.mark.not_on_windows("lacking windows support for binary installs")
 @pytest.mark.parametrize("transitive", [True, False])
 @pytest.mark.parametrize(
     "root_str", ["splice-t^splice-h~foo", "splice-h~foo", "splice-vt^splice-a"]
 )
 def test_install_splice_root_from_binary(
-    mutable_mock_env_path,
-    install_mockery,
-    mock_fetch,
-    mutable_temporary_mirror,
-    transitive,
-    root_str,
+    install_mockery, mock_fetch, mutable_temporary_mirror, transitive, root_str
 ):
     """Test installing a spliced spec with the root available in binary cache"""
     # Test splicing and rewiring a spec with the same name, different hash.
@@ -983,6 +977,7 @@ class MyBuildException(Exception):


 def _install_fail_my_build_exception(installer, task, install_status, **kwargs):
+    print(task, task.pkg.name)
     if task.pkg.name == "pkg-a":
         raise MyBuildException("mock internal package build error for pkg-a")
     else:
@@ -3,9 +3,6 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

-
-import os
-import os.path

 import pytest

 import llnl.util.filesystem as fs
@@ -16,10 +13,8 @@
 import spack.error
 import spack.main
 import spack.paths
-import spack.platforms
 import spack.util.executable as exe
 import spack.util.git
-import spack.util.spack_yaml as syaml

 pytestmark = pytest.mark.not_on_windows(
     "Test functionality supported but tests are failing on Win"
@@ -172,163 +167,3 @@ def test_add_command_line_scope_env(tmp_path, mutable_mock_env_path):
     assert config.get("config:install_tree:root") == "/tmp/first"

     assert ev.active_environment() is None  # shouldn't cause an environment to be activated
-
-
-def test_include_cfg(mock_low_high_config, write_config_file, tmpdir):
-    cfg1_path = str(tmpdir.join("include1.yaml"))
-    with open(cfg1_path, "w", encoding="utf-8") as f:
-        f.write(
-            """\
-config:
-  verify_ssl: False
-  dirty: True
-packages:
-  python:
-    require:
-    - spec: "@3.11:"
-"""
-        )
-
-    def python_cfg(_spec):
-        return f"""\
-packages:
-  python:
-    require:
-    - spec: {_spec}
-"""
-
-    def write_python_cfg(_spec, _cfg_name):
-        cfg_path = str(tmpdir.join(_cfg_name))
-        with open(cfg_path, "w", encoding="utf-8") as f:
-            f.write(python_cfg(_spec))
-        return cfg_path
-
-    # This config will not be included
-    cfg2_path = write_python_cfg("+shared", "include2.yaml")
-
-    # The config will point to this using substitutable variables,
-    # namely $os; we expect that Spack resolves these variables
-    # into the actual path of the config
-    this_os = spack.platforms.host().default_os
-    cfg3_expanded_path = os.path.join(str(tmpdir), f"{this_os}", "include3.yaml")
-    fs.mkdirp(os.path.dirname(cfg3_expanded_path))
-    with open(cfg3_expanded_path, "w", encoding="utf-8") as f:
-        f.write(python_cfg("+ssl"))
-    cfg3_abstract_path = os.path.join(str(tmpdir), "$os", "include3.yaml")
-
-    # This will be included unconditionally
-    cfg4_path = write_python_cfg("+tk", "include4.yaml")
-
-    # This config will not exist, and the config will explicitly
-    # allow this
-    cfg5_path = os.path.join(str(tmpdir), "non-existent.yaml")
-
-    include_entries = [
-        {"path": f"{cfg1_path}", "when": f'os == "{this_os}"'},
-        {"path": f"{cfg2_path}", "when": "False"},
-        {"path": cfg3_abstract_path},
-        cfg4_path,
-        {"path": cfg5_path, "optional": True},
-    ]
-    include_cfg = {"include": include_entries}
-    filename = write_config_file("include", include_cfg, "low")
-
-    assert not spack.config.get("config:dirty")
-
-    spack.main.add_command_line_scopes(mock_low_high_config, [os.path.dirname(filename)])
-
-    assert spack.config.get("config:dirty")
-    python_reqs = spack.config.get("packages")["python"]["require"]
-    req_specs = set(x["spec"] for x in python_reqs)
-    assert req_specs == set(["@3.11:", "+ssl", "+tk"])
-
-
-def test_include_duplicate_source(tmpdir, mutable_config):
-    """Check precedence when include.yaml files have the same path."""
-    include_yaml = "debug.yaml"
-    include_list = {"include": [f"./{include_yaml}"]}
-
-    system_filename = mutable_config.get_config_filename("system", "include")
-    site_filename = mutable_config.get_config_filename("site", "include")
-
-    def write_configs(include_path, debug_data):
-        fs.mkdirp(os.path.dirname(include_path))
-        with open(include_path, "w", encoding="utf-8") as f:
-            syaml.dump_config(include_list, f)
-
-        debug_path = fs.join_path(os.path.dirname(include_path), include_yaml)
-        with open(debug_path, "w", encoding="utf-8") as f:
-            syaml.dump_config(debug_data, f)
-
-    system_config = {"config": {"debug": False}}
-    write_configs(system_filename, system_config)
-    spack.main.add_command_line_scopes(mutable_config, [os.path.dirname(system_filename)])
-
-    site_config = {"config": {"debug": True}}
-    write_configs(site_filename, site_config)
-    spack.main.add_command_line_scopes(mutable_config, [os.path.dirname(site_filename)])
-
-    # Ensure takes the last value of the option pushed onto the stack
-    assert mutable_config.get("config:debug") == site_config["config"]["debug"]
-
-
-def test_include_recurse_limit(tmpdir, mutable_config):
-    """Ensure hit the recursion limit."""
-    include_yaml = "include.yaml"
-    include_list = {"include": [f"./{include_yaml}"]}
-
-    include_path = str(tmpdir.join(include_yaml))
-    with open(include_path, "w", encoding="utf-8") as f:
-        syaml.dump_config(include_list, f)
-
-    with pytest.raises(spack.config.RecursiveIncludeError, match="recursion exceeded"):
-        spack.main.add_command_line_scopes(mutable_config, [os.path.dirname(include_path)])
-
-
-# TODO: Fix this once recursive includes are processed in the expected order.
-@pytest.mark.parametrize("child,expected", [("b", True), ("c", False)])
-def test_include_recurse_diamond(tmpdir, mutable_config, child, expected):
-    """Demonstrate include parent's value overrides that of child in diamond include.
-
-    Check that the value set by b or c overrides that set by d.
-    """
-    configs_root = tmpdir.join("configs")
-    configs_root.mkdir()
-
-    def write(path, contents):
-        with open(path, "w", encoding="utf-8") as f:
-            f.write(contents)
-
-    def debug_contents(value):
-        return f"config:\n debug: {value}\n"
-
-    def include_contents(paths):
-        indent = "\n - "
-        values = indent.join([str(p) for p in paths])
-        return f"include:{indent}{values}"
-
-    a_yaml = tmpdir.join("a.yaml")
-    b_yaml = configs_root.join("b.yaml")
-    c_yaml = configs_root.join("c.yaml")
-    d_yaml = configs_root.join("d.yaml")
-    debug_yaml = configs_root.join("enable_debug.yaml")
-
-    write(debug_yaml, debug_contents("true"))
-
-    a_contents = f"""\
-include:
-- {b_yaml}
-- {c_yaml}
-"""
-    write(a_yaml, a_contents)
-    write(d_yaml, debug_contents("false"))
-
-    write(b_yaml, include_contents([debug_yaml, d_yaml] if child == "b" else [d_yaml]))
-    write(c_yaml, include_contents([debug_yaml, d_yaml] if child == "c" else [d_yaml]))
-
-    spack.main.add_command_line_scopes(mutable_config, [str(tmpdir)])
-
-    try:
-        assert mutable_config.get("config:debug") is expected
-    except AssertionError:
-        pytest.xfail("recursive includes are not processed in the expected order")
@@ -53,7 +53,7 @@ def test_no_version_match(pkg_name):
         # Constraints on compilers with a default
         ("%gcc", "has_a_default", "gcc"),
         ("%clang", "has_a_default", "clang"),
-        ("os=elcapitan %apple-clang", "has_a_default", "default"),
+        ("%apple-clang os=elcapitan", "has_a_default", "default"),
         # Constraints on dependencies
         ("^zmpi", "different_by_dep", "zmpi"),
         ("^mpich", "different_by_dep", "mpich"),
@@ -74,7 +74,7 @@ def test_multimethod_calls(
     with spack.config.override(
         "compilers", [compiler_factory(spec="apple-clang@9.1.0", operating_system="elcapitan")]
     ):
-        s = spack.concretize.concretize_one(f"{pkg_name} {constraint_str}")
+        s = spack.concretize.concretize_one(pkg_name + constraint_str)
         msg = f"Method {method_name} from {s} is giving a wrong result"
         assert getattr(s.package, method_name)() == expected_result, msg

@@ -38,9 +38,9 @@
         {"optional-dep-test@1.1%intel": {"pkg-b": None, "pkg-c": None}},
     ),
     (
-        "optional-dep-test@1.1+a%intel@64.1.2",
+        "optional-dep-test@1.1%intel@64.1.2+a",
         {
-            "optional-dep-test@1.1+a%intel@64.1.2": {
+            "optional-dep-test@1.1%intel@64.1.2+a": {
                 "pkg-a": None,
                 "pkg-b": None,
                 "pkg-c": None,
@@ -49,8 +49,8 @@
         },
     ),
     (
-        "optional-dep-test@1.1+a%clang@36.5",
-        {"optional-dep-test@1.1+a%clang@36.5": {"pkg-b": None, "pkg-a": None, "pkg-e": None}},
+        "optional-dep-test@1.1%clang@36.5+a",
+        {"optional-dep-test@1.1%clang@36.5+a": {"pkg-b": None, "pkg-a": None, "pkg-e": None}},
     ),
     # Chained MPI
     (
@@ -34,7 +34,7 @@ def extra_repo(tmp_path_factory, request):
   subdirectory: '{request.param}'
 """
     )
-    repo_cache = spack.util.file_cache.FileCache(cache_dir)
+    repo_cache = spack.util.file_cache.FileCache(str(cache_dir))
     return spack.repo.Repo(str(repo_dir), cache=repo_cache), request.param


@@ -194,7 +194,7 @@ def _repo_paths(repos):

     repo_paths, namespaces = _repo_paths(repos)

-    repo_cache = spack.util.file_cache.FileCache(tmp_path / "cache")
+    repo_cache = spack.util.file_cache.FileCache(str(tmp_path / "cache"))
     repo_path = spack.repo.RepoPath(*repo_paths, cache=repo_cache)
     assert len(repo_path.repos) == len(namespaces)
     assert [x.namespace for x in repo_path.repos] == namespaces
@@ -319,48 +319,3 @@ def test_get_repo(self, mock_test_cache):
         # foo is not there, raise
         with pytest.raises(spack.repo.UnknownNamespaceError):
             repo.get_repo("foo")
-
-
-def test_parse_package_api_version():
-    """Test that we raise an error if a repository has a version that is not supported."""
-    # valid version
-    assert spack.repo._parse_package_api_version(
-        {"api": "v1.2"}, min_api=(1, 0), max_api=(2, 3)
-    ) == (1, 2)
-    # too new and too old
-    with pytest.raises(
-        spack.repo.BadRepoError,
-        match=r"Package API v2.4 is not supported .* \(must be between v1.0 and v2.3\)",
-    ):
-        spack.repo._parse_package_api_version({"api": "v2.4"}, min_api=(1, 0), max_api=(2, 3))
-    with pytest.raises(
-        spack.repo.BadRepoError,
-        match=r"Package API v0.9 is not supported .* \(must be between v1.0 and v2.3\)",
-    ):
-        spack.repo._parse_package_api_version({"api": "v0.9"}, min_api=(1, 0), max_api=(2, 3))
-    # default to v1.0 if not specified
-    assert spack.repo._parse_package_api_version({}, min_api=(1, 0), max_api=(2, 3)) == (1, 0)
-    # if v1.0 support is dropped we should also raise
-    with pytest.raises(
-        spack.repo.BadRepoError,
-        match=r"Package API v1.0 is not supported .* \(must be between v2.0 and v2.3\)",
-    ):
-        spack.repo._parse_package_api_version({}, min_api=(2, 0), max_api=(2, 3))
-    # finally test invalid input
-    with pytest.raises(spack.repo.BadRepoError, match="Invalid Package API version"):
-        spack.repo._parse_package_api_version({"api": "v2"}, min_api=(1, 0), max_api=(3, 3))
-    with pytest.raises(spack.repo.BadRepoError, match="Invalid Package API version"):
-        spack.repo._parse_package_api_version({"api": 2.0}, min_api=(1, 0), max_api=(3, 3))
-
-
-def test_repo_package_api_version(tmp_path: pathlib.Path):
-    """Test that we can specify the API version of a repository."""
-    (tmp_path / "example" / "packages").mkdir(parents=True)
-    (tmp_path / "example" / "repo.yaml").write_text(
-        """\
-repo:
-  namespace: example
-"""
-    )
-    cache = spack.util.file_cache.FileCache(tmp_path / "cache")
-    assert spack.repo.Repo(str(tmp_path / "example"), cache=cache).package_api == (1, 0)
@@ -27,7 +27,9 @@ def check_spliced_spec_prefixes(spliced_spec):
         text_file_path = os.path.join(node.prefix, node.name)
         with open(text_file_path, "r", encoding="utf-8") as f:
             text = f.read()
+            print(text)
             for modded_spec in node.traverse(root=True, deptype=dt.ALL & ~dt.BUILD):
+                print(modded_spec)
                 assert modded_spec.prefix in text


@@ -84,7 +84,6 @@ def test_module_suffixes(module_suffixes_schema):
         "compilers",
         "config",
         "definitions",
-        "include",
         "env",
         "merged",
         "mirrors",
@@ -21,7 +21,6 @@
     SpecParsingError,
     SpecTokenizationError,
     SpecTokens,
-    parse_one_or_raise,
 )
 from spack.tokenize import Token

@@ -160,13 +159,13 @@ def _specfile_for(spec_str, filename):
         ),
         # Version after compiler
         (
-            "foo @2.0 %bar@1.0",
+            "foo %bar@1.0 @2.0",
             [
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="foo"),
-                Token(SpecTokens.VERSION, value="@2.0"),
                 Token(SpecTokens.COMPILER_AND_VERSION, value="%bar@1.0"),
+                Token(SpecTokens.VERSION, value="@2.0"),
             ],
-            "foo@2.0 %bar@1.0",
+            "foo@2.0%bar@1.0",
         ),
         # Single dependency with version
         dependency_with_version("openmpi ^hwloc@1.2e6"),
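The reordered cases above and below all encode the same canonicalization rule: the compiler token now follows the version directly, and the round-trip string omits the space before %. A sketch of the round-trip property being asserted, using SpecParser the same way later tests in this diff do (the import path is an assumption):

    from spack.spec_parser import SpecParser  # assumed import path

    spec = SpecParser("foo %bar@1.0 @2.0").next_spec()
    # Canonical form per the expected round-trip in the table above:
    assert str(spec) == "foo@2.0%bar@1.0"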
@@ -175,54 +174,54 @@ def _specfile_for(spec_str, filename):
         dependency_with_version("openmpi ^hwloc@1.2e6:1.4b7-rc3"),
         # Complex specs with multiple constraints
         (
-            "mvapich_foo ^_openmpi@1.2:1.4,1.6+debug~qt_4 %intel@12.1 ^stackwalker@8.1_1e",
+            "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1+debug~qt_4 ^stackwalker@8.1_1e",
             [
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="mvapich_foo"),
                 Token(SpecTokens.DEPENDENCY, value="^"),
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="_openmpi"),
                 Token(SpecTokens.VERSION, value="@1.2:1.4,1.6"),
+                Token(SpecTokens.COMPILER_AND_VERSION, value="%intel@12.1"),
                 Token(SpecTokens.BOOL_VARIANT, value="+debug"),
                 Token(SpecTokens.BOOL_VARIANT, value="~qt_4"),
-                Token(SpecTokens.COMPILER_AND_VERSION, value="%intel@12.1"),
                 Token(SpecTokens.DEPENDENCY, value="^"),
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="stackwalker"),
                 Token(SpecTokens.VERSION, value="@8.1_1e"),
             ],
-            "mvapich_foo ^_openmpi@1.2:1.4,1.6+debug~qt_4 %intel@12.1 ^stackwalker@8.1_1e",
+            "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1+debug~qt_4 ^stackwalker@8.1_1e",
         ),
         (
-            "mvapich_foo ^_openmpi@1.2:1.4,1.6~qt_4 debug=2 %intel@12.1 ^stackwalker@8.1_1e",
+            "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1~qt_4 debug=2 ^stackwalker@8.1_1e",
             [
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="mvapich_foo"),
                 Token(SpecTokens.DEPENDENCY, value="^"),
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="_openmpi"),
                 Token(SpecTokens.VERSION, value="@1.2:1.4,1.6"),
+                Token(SpecTokens.COMPILER_AND_VERSION, value="%intel@12.1"),
                 Token(SpecTokens.BOOL_VARIANT, value="~qt_4"),
                 Token(SpecTokens.KEY_VALUE_PAIR, value="debug=2"),
-                Token(SpecTokens.COMPILER_AND_VERSION, value="%intel@12.1"),
                 Token(SpecTokens.DEPENDENCY, value="^"),
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="stackwalker"),
                 Token(SpecTokens.VERSION, value="@8.1_1e"),
             ],
-            "mvapich_foo ^_openmpi@1.2:1.4,1.6~qt_4 debug=2 %intel@12.1 ^stackwalker@8.1_1e",
+            "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1~qt_4 debug=2 ^stackwalker@8.1_1e",
         ),
         (
-            "mvapich_foo ^_openmpi@1.2:1.4,1.6 cppflags=-O3 +debug~qt_4 %intel@12.1 "
+            "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1 cppflags=-O3 +debug~qt_4 "
             "^stackwalker@8.1_1e",
             [
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="mvapich_foo"),
                 Token(SpecTokens.DEPENDENCY, value="^"),
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="_openmpi"),
                 Token(SpecTokens.VERSION, value="@1.2:1.4,1.6"),
+                Token(SpecTokens.COMPILER_AND_VERSION, value="%intel@12.1"),
                 Token(SpecTokens.KEY_VALUE_PAIR, value="cppflags=-O3"),
                 Token(SpecTokens.BOOL_VARIANT, value="+debug"),
                 Token(SpecTokens.BOOL_VARIANT, value="~qt_4"),
-                Token(SpecTokens.COMPILER_AND_VERSION, value="%intel@12.1"),
                 Token(SpecTokens.DEPENDENCY, value="^"),
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="stackwalker"),
                 Token(SpecTokens.VERSION, value="@8.1_1e"),
             ],
-            "mvapich_foo ^_openmpi@1.2:1.4,1.6 cppflags=-O3 +debug~qt_4 %intel@12.1 "
+            "mvapich_foo ^_openmpi@1.2:1.4,1.6%intel@12.1 cppflags=-O3 +debug~qt_4 "
             "^stackwalker@8.1_1e",
         ),
         # Specs containing YAML or JSON in the package name
@@ -236,7 +235,7 @@ def _specfile_for(spec_str, filename):
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="boost"),
                 Token(SpecTokens.VERSION, value="@3.1.4"),
             ],
-            "yaml-cpp@0.1.8 %intel@12.1 ^boost@3.1.4",
+            "yaml-cpp@0.1.8%intel@12.1 ^boost@3.1.4",
         ),
         (
             r"builtin.yaml-cpp%gcc",
@@ -244,7 +243,7 @@ def _specfile_for(spec_str, filename):
                 Token(SpecTokens.FULLY_QUALIFIED_PACKAGE_NAME, value="builtin.yaml-cpp"),
                 Token(SpecTokens.COMPILER, value="%gcc"),
             ],
-            "yaml-cpp %gcc",
+            "yaml-cpp%gcc",
         ),
         (
             r"testrepo.yaml-cpp%gcc",
@@ -252,7 +251,7 @@ def _specfile_for(spec_str, filename):
                 Token(SpecTokens.FULLY_QUALIFIED_PACKAGE_NAME, value="testrepo.yaml-cpp"),
                 Token(SpecTokens.COMPILER, value="%gcc"),
             ],
-            "yaml-cpp %gcc",
+            "yaml-cpp%gcc",
         ),
         (
             r"builtin.yaml-cpp@0.1.8%gcc@7.2.0 ^boost@3.1.4",
@@ -264,7 +263,7 @@ def _specfile_for(spec_str, filename):
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="boost"),
                 Token(SpecTokens.VERSION, value="@3.1.4"),
             ],
-            "yaml-cpp@0.1.8 %gcc@7.2.0 ^boost@3.1.4",
+            "yaml-cpp@0.1.8%gcc@7.2.0 ^boost@3.1.4",
         ),
         (
             r"builtin.yaml-cpp ^testrepo.boost ^zlib",
@@ -486,12 +485,12 @@ def _specfile_for(spec_str, filename):
             "a@1:",
         ),
         (
-            "+ debug % intel @ 12.1:12.6",
+            "% intel @ 12.1:12.6 + debug",
             [
-                Token(SpecTokens.BOOL_VARIANT, value="+ debug"),
                 Token(SpecTokens.COMPILER_AND_VERSION, value="% intel @ 12.1:12.6"),
+                Token(SpecTokens.BOOL_VARIANT, value="+ debug"),
             ],
-            "+debug %intel@12.1:12.6",
+            "%intel@12.1:12.6+debug",
         ),
         (
             "@ 12.1:12.6 + debug - qt_4",
@@ -516,7 +515,7 @@ def _specfile_for(spec_str, filename):
                 Token(SpecTokens.VERSION, value="@:0.4"),
                 Token(SpecTokens.COMPILER, value="% nvhpc"),
             ],
-            "@:0.4 %nvhpc",
+            "@:0.4%nvhpc",
         ),
         (
             "^[virtuals=mpi] openmpi",
@@ -640,15 +639,15 @@ def test_parse_single_spec(spec_str, tokens, expected_roundtrip, mock_git_test_p
             ["mvapich cppflags=-O3", "emacs"],
         ),
         (
-            "mvapich emacs @1.1.1 cflags=-O3 %intel",
+            "mvapich emacs @1.1.1 %intel cflags=-O3",
             [
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="mvapich"),
                 Token(SpecTokens.UNQUALIFIED_PACKAGE_NAME, value="emacs"),
                 Token(SpecTokens.VERSION, value="@1.1.1"),
-                Token(SpecTokens.KEY_VALUE_PAIR, value="cflags=-O3"),
                 Token(SpecTokens.COMPILER, value="%intel"),
+                Token(SpecTokens.KEY_VALUE_PAIR, value="cflags=-O3"),
             ],
-            ["mvapich", "emacs @1.1.1 cflags=-O3 %intel"],
+            ["mvapich", "emacs @1.1.1 %intel cflags=-O3"],
         ),
         (
             'mvapich cflags="-O3 -fPIC" emacs^ncurses%intel',
@@ -1232,7 +1231,7 @@ def test_compare_abstract_specs():
         "foo.foo@foo+foo",
         "foo.foo@foo+foo arch=foo-foo-foo",
         "foo.foo@foo+foo arch=foo-foo-foo %foo",
-        "foo.foo@foo+foo arch=foo-foo-foo cflags=foo %foo",
+        "foo.foo@foo+foo arch=foo-foo-foo %foo cflags=foo",
     ]
     specs = [SpecParser(s).next_spec() for s in constraints]

@@ -1286,19 +1285,3 @@ def test_git_ref_spec_equivalences(mock_packages, lhs_str, rhs_str, expected):
 def test_platform_is_none_if_not_present(spec_str):
     s = SpecParser(spec_str).next_spec()
     assert s.architecture.platform is None, s
-
-
-def test_parse_one_or_raise_error_message():
-    with pytest.raises(ValueError) as exc:
-        parse_one_or_raise(" x y z")
-
-    msg = """\
-expected a single spec, but got more:
-x y z
-^\
-"""
-
-    assert str(exc.value) == msg
-
-    with pytest.raises(ValueError, match="expected a single spec, but got none"):
-        parse_one_or_raise(" ")
@@ -149,8 +149,11 @@ def test_reverse_environment_modifications(working_env):
     os.environ.clear()
     os.environ.update(start_env)

+    print(os.environ)
     to_reverse.apply_modifications()
+    print(os.environ)
     reversal.apply_modifications()
+    print(os.environ)

     start_env.pop("UNSET")
     assert os.environ == start_env
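The print calls added above bracket a set/apply/reverse round trip. A compact sketch of that round trip, assuming the EnvironmentModifications API this test file exercises (reversed() undoes each recorded modification):

    import os
    from spack.util.environment import EnvironmentModifications

    mods = EnvironmentModifications()
    mods.set("FOO", "1")
    reversal = mods.reversed()
    mods.apply_modifications()      # os.environ["FOO"] == "1"
    reversal.apply_modifications()  # FOO is removed again
    assert "FOO" not in os.environ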
@@ -134,18 +134,3 @@ def test_path_debug_padded_filter(debug, monkeypatch):
     monkeypatch.setattr(tty, "_debug", debug)
     with spack.config.override("config:install_tree", {"padded_length": 128}):
         assert expected == sup.debug_padded_filter(string)
-
-
-@pytest.mark.parametrize(
-    "path,expected",
-    [
-        ("/home/spack/path/to/file.txt", "/home/spack/path/to/file.txt"),
-        ("file:///home/another/config.yaml", "/home/another/config.yaml"),
-        ("path/to.txt", os.path.join(os.environ["SPACK_ROOT"], "path", "to.txt")),
-        (r"C:\Files (x86)\Windows\10", r"C:\Files (x86)\Windows\10"),
-        (r"E:/spack stage", "E:\\spack stage"),
-    ],
-)
-def test_canonicalize_file(path, expected):
-    """Confirm canonicalize path handles local files and file URLs."""
-    assert sup.canonicalize_path(path) == os.path.normpath(expected)
@@ -1,106 +0,0 @@
-# Copyright Spack Project Developers. See COPYRIGHT file for details.
-#
-# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import os.path
-import sys
-
-import pytest
-
-import llnl.util.tty as tty
-from llnl.util.filesystem import join_path
-
-import spack.config
-import spack.util.remote_file_cache as rfc_util
-
-github_url = "https://github.com/fake/fake/{0}/develop"
-gitlab_url = "https://gitlab.fake.io/user/repo/-/blob/config/defaults"
-
-
-@pytest.mark.parametrize(
-    "path,err",
-    [
-        ("ssh://git@github.com:spack/", "Unsupported URL scheme"),
-        ("bad:///this/is/a/file/url/include.yaml", "Invalid URL scheme"),
-    ],
-)
-def test_rfc_local_path_bad_scheme(path, err):
-    with pytest.raises(ValueError, match=err):
-        _ = rfc_util.local_path(path, "")
-
-
-@pytest.mark.parametrize(
-    "path,expected",
-    [
-        ("/a/b/c/d/e/config.py", "/a/b/c/d/e/config.py"),
-        ("file:///this/is/a/file/url/include.yaml", "/this/is/a/file/url/include.yaml"),
-        (
-            "relative/packages.txt",
-            os.path.join(os.environ["SPACK_ROOT"], "relative", "packages.txt"),
-        ),
-        (r"C:\Files (x86)\Windows\10", r"C:\Files (x86)\Windows\10"),
-        (r"D:/spack stage", "D:\\spack stage"),
-    ],
-)
-def test_rfc_local_file(path, expected):
-    assert rfc_util.local_path(path, "") == os.path.normpath(expected)
-
-
-def test_rfc_remote_local_path_no_dest():
-    path = f"{gitlab_url}/packages.yaml"
-    with pytest.raises(ValueError, match="Requires the destination argument"):
-        _ = rfc_util.local_path(path, "")
-
-
-compilers_sha256 = (
-    "381732677538143a8f900406c0654f2730e2919a11740bdeaf35757ab3e1ef3e"
-    if sys.platform == "win32"
-    else "e91148ed5a0da7844e9f3f9cfce0fa60cce509461886bc3b006ee9eb711f69df"
-)
-
-
-@pytest.mark.parametrize(
-    "url,sha256,err,msg",
-    [
-        (
-            f"{join_path(github_url.format('tree'), 'config.yaml')}",
-            "",
-            ValueError,
-            "Requires sha256",
-        ),
-        (f"{gitlab_url}/compilers.yaml", compilers_sha256, None, ""),
-        (f"{gitlab_url}/packages.yaml", "abcdef", ValueError, "does not match"),
-        (f"{github_url.format('blob')}/README.md", "", OSError, "No such"),
-        (github_url.format("tree"), "", OSError, "No such"),
-        ("", "", ValueError, "argument is required"),
-    ],
-)
-def test_rfc_remote_local_path(
-    tmpdir, mutable_empty_config, mock_fetch_url_text, url, sha256, err, msg
-):
-    def _has_content(filename):
-        # The first element of all configuration files for this test happen to
-        # be the basename of the file so this check leverages that feature. If
-        # that changes, then this check will need to change accordingly.
-        element = f"{os.path.splitext(os.path.basename(filename))[0]}:"
-        with open(filename, "r", encoding="utf-8") as fd:
-            for line in fd:
-                if element in line:
-                    return True
-        tty.debug(f"Expected {element} in '{filename}'")
-        return False
-
-    def _dest_dir():
-        return join_path(tmpdir.strpath, "cache")
-
-    if err is not None:
-        with spack.config.override("config:url_fetch_method", "curl"):
-            with pytest.raises(err, match=msg):
-                rfc_util.local_path(url, sha256, _dest_dir)
-    else:
-        with spack.config.override("config:url_fetch_method", "curl"):
-            path = rfc_util.local_path(url, sha256, _dest_dir)
-            assert os.path.exists(path)
-            # Ensure correct file is "fetched"
-            assert os.path.basename(path) == os.path.basename(url)
-            # Ensure contents of the file contains expected config element
-            assert _has_content(path)
@@ -257,6 +257,7 @@ def test_core_lib_files():
         names.append(os.path.join(test_dir, n))
 
     for filename in names:
+        print("Testing %s" % filename)
         source = read_pyfile(filename)
         check_ast_roundtrip(source)
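The added print call makes it easy to spot which source file breaks the roundtrip. `check_ast_roundtrip` is the test suite's own helper; a minimal stdlib-only equivalent (a sketch, assuming Python 3.9+ for `ast.unparse`) might look like:

import ast

def check_ast_roundtrip(source: str) -> None:
    # Unparse the parsed tree and re-parse the output; the two trees
    # must dump identically for the roundtrip to be considered lossless.
    tree = ast.parse(source)
    assert ast.dump(ast.parse(ast.unparse(tree))) == ast.dump(tree)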
@@ -5,17 +5,16 @@
 import errno
 import math
 import os
-import pathlib
 import shutil
-from typing import IO, Dict, Optional, Tuple, Union
+from typing import IO, Optional, Tuple
 
-from llnl.util.filesystem import rename
+from llnl.util.filesystem import mkdirp, rename
 
 from spack.error import SpackError
 from spack.util.lock import Lock, ReadTransaction, WriteTransaction
 
 
-def _maybe_open(path: Union[str, pathlib.Path]) -> Optional[IO[str]]:
+def _maybe_open(path: str) -> Optional[IO[str]]:
     try:
         return open(path, "r", encoding="utf-8")
     except OSError as e:
@@ -25,7 +24,7 @@ def _maybe_open(path: Union[str, pathlib.Path]) -> Optional[IO[str]]:
 
 
 class ReadContextManager:
-    def __init__(self, path: Union[str, pathlib.Path]) -> None:
+    def __init__(self, path: str) -> None:
         self.path = path
 
     def __enter__(self) -> Optional[IO[str]]:
@@ -71,7 +70,7 @@ class FileCache:
 
     """
 
-    def __init__(self, root: Union[str, pathlib.Path], timeout=120):
+    def __init__(self, root, timeout=120):
         """Create a file cache object.
 
         This will create the cache directory if it does not exist yet.
@@ -83,60 +82,58 @@ def __init__(self, root: Union[str, pathlib.Path], timeout=120):
             for cache files, this specifies how long Spack should wait
             before assuming that there is a deadlock.
         """
-        if isinstance(root, str):
-            root = pathlib.Path(root)
-        self.root = root
-        self.root.mkdir(parents=True, exist_ok=True)
+        self.root = root.rstrip(os.path.sep)
+        if not os.path.exists(self.root):
+            mkdirp(self.root)
 
-        self._locks: Dict[Union[pathlib.Path, str], Lock] = {}
+        self._locks = {}
         self.lock_timeout = timeout
 
     def destroy(self):
         """Remove all files under the cache root."""
-        for f in self.root.iterdir():
-            if f.is_dir():
-                shutil.rmtree(f, True)
+        for f in os.listdir(self.root):
+            path = os.path.join(self.root, f)
+            if os.path.isdir(path):
+                shutil.rmtree(path, True)
             else:
-                f.unlink()
+                os.remove(path)
 
-    def cache_path(self, key: Union[str, pathlib.Path]):
+    def cache_path(self, key):
         """Path to the file in the cache for a particular key."""
-        return self.root / key
+        return os.path.join(self.root, key)
 
-    def _lock_path(self, key: Union[str, pathlib.Path]):
+    def _lock_path(self, key):
         """Path to the file in the cache for a particular key."""
         keyfile = os.path.basename(key)
         keydir = os.path.dirname(key)
 
-        return self.root / keydir / ("." + keyfile + ".lock")
+        return os.path.join(self.root, keydir, "." + keyfile + ".lock")
 
-    def _get_lock(self, key: Union[str, pathlib.Path]):
+    def _get_lock(self, key):
         """Create a lock for a key, if necessary, and return a lock object."""
         if key not in self._locks:
-            self._locks[key] = Lock(str(self._lock_path(key)), default_timeout=self.lock_timeout)
+            self._locks[key] = Lock(self._lock_path(key), default_timeout=self.lock_timeout)
         return self._locks[key]
 
-    def init_entry(self, key: Union[str, pathlib.Path]):
+    def init_entry(self, key):
         """Ensure we can access a cache file. Create a lock for it if needed.
 
         Return whether the cache file exists yet or not.
         """
         cache_path = self.cache_path(key)
-        # Avoid using pathlib here to allow the logic below to
-        # function as is
-        # TODO: Maybe refactor the following logic for pathlib
         exists = os.path.exists(cache_path)
         if exists:
-            if not cache_path.is_file():
+            if not os.path.isfile(cache_path):
                 raise CacheError("Cache file is not a file: %s" % cache_path)
 
             if not os.access(cache_path, os.R_OK):
                 raise CacheError("Cannot access cache file: %s" % cache_path)
         else:
             # if the file is hierarchical, make parent directories
-            parent = cache_path.parent
-            if parent != self.root:
-                parent.mkdir(parents=True, exist_ok=True)
+            parent = os.path.dirname(cache_path)
+            if parent.rstrip(os.path.sep) != self.root:
+                mkdirp(parent)
 
             if not os.access(parent, os.R_OK | os.W_OK):
                 raise CacheError("Cannot access cache directory: %s" % parent)
@@ -145,7 +142,7 @@ def init_entry(self, key: Union[str, pathlib.Path]):
         self._get_lock(key)
         return exists
 
-    def read_transaction(self, key: Union[str, pathlib.Path]):
+    def read_transaction(self, key):
         """Get a read transaction on a file cache item.
 
         Returns a ReadTransaction context manager and opens the cache file for
@@ -156,11 +153,9 @@ def read_transaction(self, key: Union[str, pathlib.Path]):
 
         """
         path = self.cache_path(key)
-        return ReadTransaction(
-            self._get_lock(key), acquire=lambda: ReadContextManager(path)  # type: ignore
-        )
+        return ReadTransaction(self._get_lock(key), acquire=lambda: ReadContextManager(path))
 
-    def write_transaction(self, key: Union[str, pathlib.Path]):
+    def write_transaction(self, key):
         """Get a write transaction on a file cache item.
 
         Returns a WriteTransaction context manager that opens a temporary file
@@ -172,11 +167,9 @@ def write_transaction(self, key: Union[str, pathlib.Path]):
         if os.path.exists(path) and not os.access(path, os.W_OK):
             raise CacheError(f"Insufficient permissions to write to file cache at {path}")
 
-        return WriteTransaction(
-            self._get_lock(key), acquire=lambda: WriteContextManager(path)  # type: ignore
-        )
+        return WriteTransaction(self._get_lock(key), acquire=lambda: WriteContextManager(path))
 
-    def mtime(self, key: Union[str, pathlib.Path]) -> float:
+    def mtime(self, key) -> float:
         """Return modification time of cache file, or -inf if it does not exist.
 
         Time is in units returned by os.stat in the mtime field, which is
@@ -186,14 +179,14 @@ def mtime(self, key: Union[str, pathlib.Path]) -> float:
         if not self.init_entry(key):
             return -math.inf
         else:
-            return self.cache_path(key).stat().st_mtime
+            return os.stat(self.cache_path(key)).st_mtime
 
-    def remove(self, key: Union[str, pathlib.Path]):
+    def remove(self, key):
         file = self.cache_path(key)
         lock = self._get_lock(key)
         try:
             lock.acquire_write()
-            file.unlink()
+            os.unlink(file)
         except OSError as e:
             # File not found is OK, so remove is idempotent.
             if e.errno != errno.ENOENT:
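The hunks above revert FileCache from pathlib objects back to plain string paths. A minimal usage sketch of the resulting API (the root directory and key below are illustrative, not taken from the diff):

from spack.util.file_cache import FileCache

cache = FileCache("/tmp/demo-cache")  # hypothetical root; created if missing

# write_transaction locks the entry and atomically swaps in a temporary file.
with cache.write_transaction("indexes/demo.json") as (old, new):
    new.write("{}")

# read_transaction yields an open file object, or None if the entry is absent.
with cache.read_transaction("indexes/demo.json") as f:
    data = f.read() if f else ""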
@@ -9,13 +9,11 @@
 import contextlib
 import getpass
 import os
-import pathlib
 import re
 import subprocess
 import sys
 import tempfile
 from datetime import date
-from typing import Optional
 
 import llnl.util.tty as tty
 from llnl.util.lang import memoized
@@ -237,7 +235,7 @@ def add_padding(path, length):
     return os.path.join(path, padding)
 
 
-def canonicalize_path(path: str, default_wd: Optional[str] = None) -> str:
+def canonicalize_path(path, default_wd=None):
     """Same as substitute_path_variables, but also take absolute path.
 
     If the string is a yaml object with file annotations, make absolute paths
@@ -245,53 +243,28 @@ def canonicalize_path(path: str, default_wd: Optional[str] = None) -> str:
     Otherwise, use ``default_wd`` if specified, otherwise ``os.getcwd()``
 
     Arguments:
-        path: path being converted as needed
-        default_wd: optional working directory/root for non-yaml string paths
+        path (str): path being converted as needed
 
-    Returns: An absolute path or non-file URL with path variable substitution
+    Returns:
+        (str): An absolute path with path variable substitution
     """
-    import urllib.parse
-    import urllib.request
-
     # Get file in which path was written in case we need to make it absolute
     # relative to that path.
     filename = None
     if isinstance(path, syaml.syaml_str):
-        filename = os.path.dirname(path._start_mark.name)  # type: ignore[attr-defined]
-        assert path._start_mark.name == path._end_mark.name  # type: ignore[attr-defined]
+        filename = os.path.dirname(path._start_mark.name)
+        assert path._start_mark.name == path._end_mark.name
 
     path = substitute_path_variables(path)
+    if not os.path.isabs(path):
+        if filename:
+            path = os.path.join(filename, path)
+        else:
+            base = default_wd or os.getcwd()
+            path = os.path.join(base, path)
+            tty.debug("Using working directory %s as base for abspath" % base)
 
-    # Ensure properly process a Windows path
-    win_path = pathlib.PureWindowsPath(path)
-    if win_path.drive:
-        # Assume only absolute paths are supported with a Windows drive
-        # (though DOS does allow drive-relative paths).
-        return os.path.normpath(str(win_path))
-
-    # Now process linux-like paths and remote URLs
-    url = urllib.parse.urlparse(path)
-    url_path = urllib.request.url2pathname(url.path)
-    if url.scheme:
-        if url.scheme != "file":
-            # Have a remote URL so simply return it with substitutions
-            return os.path.normpath(path)
-
-        # Drop the URL scheme from the local path
-        path = url_path
-
-    if os.path.isabs(path):
-        return os.path.normpath(path)
-
-    # Have a relative path so prepend the appropriate dir to make it absolute
-    if filename:
-        # Prepend the directory of the syaml path
-        return os.path.normpath(os.path.join(filename, path))
-
-    # Prepend the default, if provided, or current working directory.
-    base = default_wd or os.getcwd()
-    tty.debug(f"Using working directory {base} as base for abspath")
-    return os.path.normpath(os.path.join(base, path))
+    return os.path.normpath(path)
 
 
 def longest_prefix_re(string, capture=True):
@@ -374,7 +347,6 @@ def filter_padding():
     This is needed because Spack's debug output gets extremely long when we use a
     long padded installation path.
     """
-    # circular import
    import spack.config
 
     padding = spack.config.get("config:install_tree:padded_length", None)
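With the Windows-drive and URL branches removed, canonicalize_path again only substitutes variables and absolutizes against the YAML file's directory, default_wd, or the current working directory. A small sketch of the reverted behavior (the paths are illustrative):

import os
from spack.util.path import canonicalize_path

# A relative string path is joined against default_wd (or os.getcwd())
# after variable substitution, then normalized.
result = canonicalize_path("relative/packages.txt", default_wd="/tmp/work")
assert result == os.path.normpath(os.path.join("/tmp/work", "relative/packages.txt"))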
@@ -1,146 +0,0 @@
-# Copyright Spack Project Developers. See COPYRIGHT file for details.
-#
-# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-
-import hashlib
-import os.path
-import pathlib
-import shutil
-import tempfile
-import urllib.parse
-import urllib.request
-from typing import Callable, Optional
-
-import llnl.util.tty as tty
-from llnl.util.filesystem import copy, join_path, mkdirp
-
-import spack.util.crypto
-from spack.util.path import canonicalize_path
-from spack.util.url import validate_scheme
-
-
-def raw_github_gitlab_url(url: str) -> str:
-    """Transform a github URL to the raw form to avoid undesirable html.
-
-    Args:
-        url: url to be converted to raw form
-
-    Returns:
-        Raw github/gitlab url or the original url
-    """
-    # Note we rely on GitHub to redirect the 'raw' URL returned here to the
-    # actual URL under https://raw.githubusercontent.com/ with '/blob'
-    # removed and or, '/blame' if needed.
-    if "github" in url or "gitlab" in url:
-        return url.replace("/blob/", "/raw/")
-
-    return url
-
-
-def fetch_remote_text_file(url: str, dest_dir: str) -> str:
-    """Retrieve the text file from the url into the destination directory.
-
-    Arguments:
-        url: URL for the remote text file
-        dest_dir: destination directory in which to stage the file locally
-
-    Returns:
-        Path to the fetched file
-
-    Raises:
-        ValueError: if there are missing required arguments
-    """
-    from spack.util.web import fetch_url_text  # circular import
-
-    if not url:
-        raise ValueError("Cannot retrieve the remote file without the URL")
-
-    raw_url = raw_github_gitlab_url(url)
-    tty.debug(f"Fetching file from {raw_url} into {dest_dir}")
-
-    return fetch_url_text(raw_url, dest_dir=dest_dir)
-
-
-def local_path(raw_path: str, sha256: str, make_dest: Optional[Callable[[], str]] = None) -> str:
-    """Determine the actual path and, if remote, stage its contents locally.
-
-    Args:
-        raw_path: raw path with possible variables needing substitution
-        sha256: the expected sha256 for the file
-        make_dest: function to create a stage for remote files, if needed (e.g., `mkdtemp`)
-
-    Returns: resolved, normalized local path
-
-    Raises:
-        ValueError: missing or mismatched arguments, unsupported URL scheme
-    """
-    if not raw_path:
-        raise ValueError("path argument is required to cache remote files")
-
-    file_schemes = ["", "file"]
-
-    # Allow paths (and URLs) to contain spack config/environment variables,
-    # etc.
-    path = canonicalize_path(raw_path)
-
-    # Save off the Windows drive of the canonicalized path (since now absolute)
-    # to ensure recognized by URL parsing as a valid file "scheme".
-    win_path = pathlib.PureWindowsPath(path)
-    if win_path.drive:
-        file_schemes.append(win_path.drive.lower().strip(":"))
-
-    url = urllib.parse.urlparse(path)
-
-    # Path isn't remote so return normalized, absolute path with substitutions.
-    if url.scheme in file_schemes:
-        return os.path.normpath(path)
-
-    # If scheme is not valid, path is not a supported url.
-    if validate_scheme(url.scheme):
-        # Fetch files from supported URL schemes.
-        if url.scheme in ("http", "https", "ftp"):
-            if make_dest is None:
-                raise ValueError("Requires the destination argument to cache remote files")
-
-            # Stage the remote configuration file
-            tmpdir = tempfile.mkdtemp()
-            try:
-                staged_path = fetch_remote_text_file(path, tmpdir)
-
-                # Ensure the sha256 is expected.
-                checksum = spack.util.crypto.checksum(hashlib.sha256, staged_path)
-                if sha256 and checksum != sha256:
-                    raise ValueError(
-                        f"Actual sha256 ('{checksum}') does not match expected ('{sha256}')"
-                    )
-
-                # Help the user by reporting the required checksum.
-                if not sha256:
-                    raise ValueError(f"Requires sha256 ('{checksum}') to cache remote files.")
-
-                # Copy the file to the destination directory
-                dest_dir = join_path(make_dest(), checksum)
-                if not os.path.exists(dest_dir):
-                    mkdirp(dest_dir)
-
-                cache_path = join_path(dest_dir, os.path.basename(staged_path))
-                copy(staged_path, cache_path)
-                tty.debug(f"Cached {raw_path} in {cache_path}")
-
-                # Stash the associated URL to aid with debugging
-                with open(join_path(dest_dir, "source_url.txt"), "w", encoding="utf-8") as f:
-                    f.write(f"{raw_path}\n")
-
-                return cache_path
-
-            except ValueError as err:
-                tty.warn(f"Unable to cache {raw_path}: {str(err)}")
-                raise
-
-            finally:
-                shutil.rmtree(tmpdir)
-
-        raise ValueError(f"Unsupported URL scheme ({url.scheme}) in {raw_path}")
-
-    else:
-        raise ValueError(f"Invalid URL scheme ({url.scheme}) in {raw_path}")
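For context, the file deleted above resolved a path or URL to a local file, staging and checksumming remote content. A sketch of how the removed helper was called (the URL and checksum are placeholders, not real values):

import tempfile

import spack.util.remote_file_cache as rfc_util

# Local paths pass through with variable substitution and normalization.
local = rfc_util.local_path("$spack/etc/spack/defaults/config.yaml", "")

# Remote files additionally need a destination factory; an empty sha256
# raises ValueError, reporting the checksum the caller should pin.
url = "https://example.com/packages.yaml"  # placeholder URL
sha = "0" * 64  # placeholder sha256
staged = rfc_util.local_path(url, sha, tempfile.mkdtemp)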
@@ -495,25 +495,3 @@ class SpackYAMLError(spack.error.SpackError):
 
     def __init__(self, msg, yaml_error):
         super().__init__(msg, str(yaml_error))
-
-
-def get_mark_from_yaml_data(obj):
-    """Try to get ``spack.util.spack_yaml`` mark from YAML data.
-
-    We try the object, and if that fails we try its first member (if it's a container).
-
-    Returns:
-        mark if one is found, otherwise None.
-    """
-    # mark of object itelf
-    mark = getattr(obj, "_start_mark", None)
-    if mark:
-        return mark
-
-    # mark of first member if it is a container
-    if isinstance(obj, (list, dict)):
-        first_member = next(iter(obj), None)
-        if first_member:
-            mark = getattr(first_member, "_start_mark", None)
-
-    return mark
@@ -9,7 +9,7 @@
         ]
       ],
       "python": "python@3.10.13",
-      "spec": "clingo-bootstrap platform=darwin target=aarch64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
     },
     {
       "binaries": [
@@ -20,7 +20,7 @@
         ]
       ],
       "python": "python@3.11.5",
-      "spec": "clingo-bootstrap platform=darwin target=aarch64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
     },
     {
       "binaries": [
@@ -31,7 +31,7 @@
         ]
       ],
       "python": "python@3.12.0",
-      "spec": "clingo-bootstrap platform=darwin target=aarch64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
     },
     {
       "binaries": [
@@ -42,7 +42,7 @@
         ]
       ],
       "python": "python@3.6",
-      "spec": "clingo-bootstrap platform=darwin target=aarch64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
     },
     {
       "binaries": [
@@ -53,7 +53,7 @@
         ]
       ],
       "python": "python@3.7.17",
-      "spec": "clingo-bootstrap platform=darwin target=aarch64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
     },
     {
       "binaries": [
@@ -64,7 +64,7 @@
         ]
       ],
       "python": "python@3.8.18",
-      "spec": "clingo-bootstrap platform=darwin target=aarch64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
     },
     {
       "binaries": [
@@ -75,7 +75,7 @@
         ]
       ],
       "python": "python@3.9.18",
-      "spec": "clingo-bootstrap platform=darwin target=aarch64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=aarch64"
     },
     {
       "binaries": [
@@ -86,7 +86,7 @@
         ]
       ],
       "python": "python@3.10",
-      "spec": "clingo-bootstrap platform=darwin target=x86_64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
     },
     {
       "binaries": [
@@ -97,7 +97,7 @@
         ]
       ],
       "python": "python@3.11",
-      "spec": "clingo-bootstrap platform=darwin target=x86_64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
     },
     {
       "binaries": [
@@ -108,7 +108,7 @@
         ]
       ],
       "python": "python@3.12",
-      "spec": "clingo-bootstrap platform=darwin target=x86_64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
     },
     {
       "binaries": [
@@ -119,7 +119,7 @@
         ]
       ],
       "python": "python@3.6",
-      "spec": "clingo-bootstrap platform=darwin target=x86_64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
     },
     {
       "binaries": [
@@ -130,7 +130,7 @@
         ]
       ],
       "python": "python@3.7",
-      "spec": "clingo-bootstrap platform=darwin target=x86_64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
     },
     {
       "binaries": [
@@ -141,7 +141,7 @@
         ]
       ],
       "python": "python@3.8",
-      "spec": "clingo-bootstrap platform=darwin target=x86_64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
     },
     {
       "binaries": [
@@ -152,7 +152,7 @@
         ]
       ],
       "python": "python@3.9",
-      "spec": "clingo-bootstrap platform=darwin target=x86_64 %apple-clang"
+      "spec": "clingo-bootstrap%apple-clang platform=darwin target=x86_64"
     },
     {
       "binaries": [
@@ -163,7 +163,7 @@
         ]
       ],
       "python": "python@3.10.13",
-      "spec": "clingo-bootstrap platform=linux target=aarch64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -174,7 +174,7 @@
         ]
       ],
       "python": "python@3.11.5",
-      "spec": "clingo-bootstrap platform=linux target=aarch64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -185,7 +185,7 @@
         ]
       ],
       "python": "python@3.12.0",
-      "spec": "clingo-bootstrap platform=linux target=aarch64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -196,7 +196,7 @@
         ]
       ],
       "python": "python@3.6",
-      "spec": "clingo-bootstrap platform=linux target=aarch64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -207,7 +207,7 @@
         ]
       ],
       "python": "python@3.7.17",
-      "spec": "clingo-bootstrap platform=linux target=aarch64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -218,7 +218,7 @@
         ]
       ],
       "python": "python@3.8.18",
-      "spec": "clingo-bootstrap platform=linux target=aarch64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -229,7 +229,7 @@
         ]
       ],
       "python": "python@3.9.18",
-      "spec": "clingo-bootstrap platform=linux target=aarch64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -240,7 +240,7 @@
         ]
       ],
       "python": "python@3.10.13",
-      "spec": "clingo-bootstrap platform=linux target=ppc64le %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -251,7 +251,7 @@
         ]
       ],
       "python": "python@3.11.5",
-      "spec": "clingo-bootstrap platform=linux target=ppc64le %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -262,7 +262,7 @@
         ]
       ],
       "python": "python@3.12.0",
-      "spec": "clingo-bootstrap platform=linux target=ppc64le %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -273,7 +273,7 @@
         ]
       ],
       "python": "python@3.6",
-      "spec": "clingo-bootstrap platform=linux target=ppc64le %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -284,7 +284,7 @@
         ]
       ],
       "python": "python@3.7.17",
-      "spec": "clingo-bootstrap platform=linux target=ppc64le %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -295,7 +295,7 @@
         ]
       ],
       "python": "python@3.8.18",
-      "spec": "clingo-bootstrap platform=linux target=ppc64le %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -306,7 +306,7 @@
         ]
       ],
       "python": "python@3.9.18",
-      "spec": "clingo-bootstrap platform=linux target=ppc64le %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -317,7 +317,7 @@
         ]
       ],
       "python": "python@3.10.13",
-      "spec": "clingo-bootstrap platform=linux target=x86_64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
     },
     {
       "binaries": [
@@ -328,7 +328,7 @@
         ]
       ],
       "python": "python@3.11.5",
-      "spec": "clingo-bootstrap platform=linux target=x86_64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
     },
     {
       "binaries": [
@@ -339,7 +339,7 @@
         ]
       ],
       "python": "python@3.12.0",
-      "spec": "clingo-bootstrap platform=linux target=x86_64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
     },
     {
       "binaries": [
@@ -350,7 +350,7 @@
         ]
       ],
       "python": "python@3.6",
-      "spec": "clingo-bootstrap platform=linux target=x86_64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
     },
     {
       "binaries": [
@@ -361,7 +361,7 @@
         ]
       ],
       "python": "python@3.7.17",
-      "spec": "clingo-bootstrap platform=linux target=x86_64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
     },
     {
       "binaries": [
@@ -372,7 +372,7 @@
         ]
       ],
       "python": "python@3.8.18",
-      "spec": "clingo-bootstrap platform=linux target=x86_64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
     },
     {
       "binaries": [
@@ -383,7 +383,7 @@
         ]
       ],
       "python": "python@3.9.18",
-      "spec": "clingo-bootstrap platform=linux target=x86_64 %gcc"
+      "spec": "clingo-bootstrap%gcc platform=linux target=x86_64"
     }
   ]
 }
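These bootstrap-metadata hunks only move the %compiler token next to the package name; both layouts should parse to the same abstract spec. A quick sanity check (a sketch, assuming an importable Spack whose Spec equality compares parsed components):

from spack.spec import Spec

old = Spec("clingo-bootstrap platform=darwin target=aarch64 %apple-clang")
new = Spec("clingo-bootstrap%apple-clang platform=darwin target=aarch64")
assert old == new  # same abstract spec, different string layout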
@@ -48,7 +48,7 @@
           "23fdd223493f441fa2e5f82d7e02837ecfad831fbfa4c27c175b3e294ed977d1"
         ]
       ],
-      "spec": "gnupg@2.3: platform=darwin target=aarch64 %apple-clang"
+      "spec": "gnupg@2.3: %apple-clang platform=darwin target=aarch64"
     },
     {
       "binaries": [
@@ -98,7 +98,7 @@
           "b9481e122e2cb26f69b70505830d0fcc0d200aadbb6c6572339825f17ad1e52d"
         ]
       ],
-      "spec": "gnupg@2.3: platform=darwin target=x86_64 %apple-clang"
+      "spec": "gnupg@2.3: %apple-clang platform=darwin target=x86_64"
     },
     {
       "binaries": [
@@ -148,7 +148,7 @@
           "228ccb475932f7f40a64e9d87dec045931cc57f71b1dfd4b4c3926107222d96c"
         ]
       ],
-      "spec": "gnupg@2.3: platform=linux target=aarch64 %gcc"
+      "spec": "gnupg@2.3: %gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -198,7 +198,7 @@
           "98e2bcb4064ec0830d896938bc1fe5264dac611da71ea546b9ca03349b752041"
         ]
       ],
-      "spec": "gnupg@2.3: platform=linux target=ppc64le %gcc"
+      "spec": "gnupg@2.3: %gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -248,7 +248,7 @@
           "054fac6eaad7c862ea4661461d847fb069876eb114209416b015748266f7d166"
         ]
       ],
-      "spec": "gnupg@2.3: platform=linux target=x86_64 %gcc"
+      "spec": "gnupg@2.3: %gcc platform=linux target=x86_64"
     }
   ]
 }
@@ -8,7 +8,7 @@
           "102800775f789cc293e244899f39a22f0b7a19373305ef0497ca3189223123f3"
         ]
       ],
-      "spec": "patchelf@0.13: platform=linux target=aarch64 %gcc"
+      "spec": "patchelf@0.13: %gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -18,7 +18,7 @@
           "91cf0a9d4750c04575c5ed3bcdefc4754e1cf9d1cd1bf197eb1fe20ccaa869f1"
         ]
       ],
-      "spec": "patchelf@0.13: platform=linux target=ppc64le %gcc"
+      "spec": "patchelf@0.13: %gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -28,7 +28,7 @@
           "73f4bde46b843c96521e3f5c31ab94756491404c1ad6429c9f61dbafbbfa6470"
         ]
       ],
-      "spec": "patchelf@0.13: platform=linux target=x86_64 %gcc"
+      "spec": "patchelf@0.13: %gcc platform=linux target=x86_64"
     }
   ]
 }
@@ -8,7 +8,7 @@
           "ff7f45db1645d1d857a315bf8d63c31447330552528bdf1fccdcf50735e62166"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=aarch64 %apple-clang ^python@3.10"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=aarch64 ^python@3.10"
     },
     {
       "binaries": [
@@ -18,7 +18,7 @@
           "e7491ac297cbb3f45c80aaf4ca5102e2b655b731e7b6ce7682807d302cb61f1c"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=aarch64 %apple-clang ^python@3.11"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=aarch64 ^python@3.11"
     },
     {
       "binaries": [
@@ -28,7 +28,7 @@
           "91214626a86c21fc0d76918884ec819050d4d52b4f78df7cc9769a83fbee2f71"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=aarch64 %apple-clang ^python@3.12"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=aarch64 ^python@3.12"
     },
     {
       "binaries": [
@@ -38,7 +38,7 @@
           "db596d9e6d8970d659f4be4cb510f9ba5dc2ec4ea42ecf2aed1325ec5ad72b45"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=aarch64 %apple-clang ^python@3.13"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=aarch64 ^python@3.13"
     },
     {
       "binaries": [
@@ -48,7 +48,7 @@
           "a7ed91aee1f8d5cfe2ca5ef2e3e74215953ffd2d8b5c722a670f2c303610e90c"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=aarch64 %apple-clang ^python@3.8"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=aarch64 ^python@3.8"
     },
     {
       "binaries": [
@@ -58,7 +58,7 @@
           "c856a98f92b9fa218377cea9272dffa736e93251d987b6386e6abf40058333dc"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=aarch64 %apple-clang ^python@3.9"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=aarch64 ^python@3.9"
     },
     {
       "binaries": [
@@ -68,7 +68,7 @@
           "d74cc0b44faa69473816dca16a3806123790e6eb9a59f611b1d80da7843f474a"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=x86_64 %apple-clang ^python@3.10"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=x86_64 ^python@3.10"
     },
     {
       "binaries": [
@@ -78,7 +78,7 @@
           "2cb12477504ca8e112522b6d56325ce32024c9466de5b8427fd70a1a81b15020"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=x86_64 %apple-clang ^python@3.11"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=x86_64 ^python@3.11"
     },
     {
       "binaries": [
@@ -88,7 +88,7 @@
           "4e73426599fa61df1a4faceafa38ade170a3dec45b6d8f333e6c2b6bfe697724"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=x86_64 %apple-clang ^python@3.12"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=x86_64 ^python@3.12"
     },
     {
       "binaries": [
@@ -98,7 +98,7 @@
           "4309b42e5642bc5c83ede90759b1a0b5d66fffa8991b6213208577626b588cde"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=x86_64 %apple-clang ^python@3.13"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=x86_64 ^python@3.13"
     },
     {
       "binaries": [
@@ -108,7 +108,7 @@
           "1feeab9e1a81ca56de838ccc234d60957e9ab14da038e38b6687732b7bae1ff6"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=x86_64 %apple-clang ^python@3.6"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=x86_64 ^python@3.6"
     },
     {
       "binaries": [
@@ -118,7 +118,7 @@
           "1149ab7d5f1c82d8de53f048af8aa5c5dbf0d21da4e4780c06e54b8ee902085b"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=x86_64 %apple-clang ^python@3.7"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=x86_64 ^python@3.7"
     },
     {
       "binaries": [
@@ -128,7 +128,7 @@
           "d6aeae2dbd7fa3c1d1c62f840a5c01f8e71b64afe2bdc9c90d4087694f7d3443"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=x86_64 %apple-clang ^python@3.8"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=x86_64 ^python@3.8"
     },
     {
       "binaries": [
@@ -138,7 +138,7 @@
           "81ef2beef78f46979965e5e69cd92b68ff4d2a59dbae1331c648d18b6ded1444"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=darwin target=x86_64 %apple-clang ^python@3.9"
+      "spec": "clingo-bootstrap@spack%apple-clang platform=darwin target=x86_64 ^python@3.9"
     },
     {
       "binaries": [
@@ -148,7 +148,7 @@
           "3d0830654f9e327fd7ec6dab214050295dbf0832f493937c0c133e516dd2a95a"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=aarch64 %gcc ^python@3.10"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=aarch64 ^python@3.10"
     },
     {
       "binaries": [
@@ -158,7 +158,7 @@
           "941b93cd89d5271c740d1b1c870e85f32e5970f9f7b842ad99870399215a93db"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=aarch64 %gcc ^python@3.11"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=aarch64 ^python@3.11"
     },
     {
       "binaries": [
@@ -168,7 +168,7 @@
           "8ca78e345da732643e3d1b077d8156ce89863c25095e4958d4ac6d1a458ae74b"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=aarch64 %gcc ^python@3.12"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=aarch64 ^python@3.12"
     },
     {
       "binaries": [
@@ -178,7 +178,7 @@
           "f6ced988b515494d86a1069f13ae9030caeb40fe951c2460f532123c80399154"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=aarch64 %gcc ^python@3.13"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=aarch64 ^python@3.13"
     },
     {
       "binaries": [
@@ -188,7 +188,7 @@
           "c00855b5cda99639b87c3912ee9c734c0b609dfe7d2c47ea947738c32bab6f03"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=aarch64 %gcc ^python@3.6"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=aarch64 ^python@3.6"
     },
     {
       "binaries": [
@@ -198,7 +198,7 @@
           "aa861cfdf6001fc2da6e83eecc9ad35df424d86d71f6d73e480818942405ce4e"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=aarch64 %gcc ^python@3.7"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=aarch64 ^python@3.7"
     },
     {
       "binaries": [
@@ -208,7 +208,7 @@
           "cb7807cd31fc5e0efe2acc1de1f74c7cef962bcadfc656b09ff853bc33c11bd0"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=aarch64 %gcc ^python@3.8"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=aarch64 ^python@3.8"
     },
     {
       "binaries": [
@@ -218,7 +218,7 @@
           "36e5efb6b15b431b661e9e272904ab3c29ae7b87bf6250c158d545ccefc2f424"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=aarch64 %gcc ^python@3.9"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=aarch64 ^python@3.9"
     },
     {
       "binaries": [
@@ -228,7 +228,7 @@
           "bd492c078b2cdaf327148eee5b0abd5b068dbb3ffa5dae0ec5d53257f471f7f7"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=ppc64le %gcc ^python@3.10"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=ppc64le ^python@3.10"
     },
     {
       "binaries": [
@@ -238,7 +238,7 @@
           "0ebe5e05246c33fc8529e90e21529b29742b5dd6756dbc07534577a90394c0e6"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=ppc64le %gcc ^python@3.11"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=ppc64le ^python@3.11"
     },
     {
       "binaries": [
@@ -248,7 +248,7 @@
           "9f97d3bf78b7642a775f12feb43781d46110793f58a7e69b0b68ac4fff47655c"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=ppc64le %gcc ^python@3.12"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=ppc64le ^python@3.12"
     },
     {
       "binaries": [
@@ -258,7 +258,7 @@
           "e7295bb4bcb11a936f39665632ce68c751c9f6cddc44904392a1b33a5290bbbe"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=ppc64le %gcc ^python@3.13"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=ppc64le ^python@3.13"
     },
     {
       "binaries": [
@@ -268,7 +268,7 @@
           "c44e7fbf721383aa8ee57d2305f41377e64a42ab8e02a9d3d6fc792d9b29ad08"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=ppc64le %gcc ^python@3.6"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=ppc64le ^python@3.6"
     },
     {
       "binaries": [
@@ -278,7 +278,7 @@
           "965ba5c1a42f436001162a3f3a0d1715424f2ec8f65c42d6b66efcd4f4566b77"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=ppc64le %gcc ^python@3.7"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=ppc64le ^python@3.7"
     },
     {
       "binaries": [
@@ -288,7 +288,7 @@
           "c8d31089d8f91718a5bde9c6b28cc67bdbadab401c8fdd07b296d588ece4ddfe"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=ppc64le %gcc ^python@3.8"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=ppc64le ^python@3.8"
     },
     {
       "binaries": [
@@ -298,7 +298,7 @@
           "ef3f05d30333a39fd18714b87ee22605679f52fda97f5e592764d1591527bbf3"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=ppc64le %gcc ^python@3.9"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=ppc64le ^python@3.9"
     },
     {
       "binaries": [
@@ -308,7 +308,7 @@
           "a4abec667660307ad5cff0a616d6651e187cc7b1386fd8cd4b6b288a01614076"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=x86_64 %gcc ^python@3.10"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=x86_64 ^python@3.10"
     },
     {
       "binaries": [
@@ -318,7 +318,7 @@
           "a572ab6db954f4a850d1292bb1ef6d6055916784a894d149d657996fa98d0367"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=x86_64 %gcc ^python@3.11"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=x86_64 ^python@3.11"
     },
     {
       "binaries": [
@@ -328,7 +328,7 @@
           "97f8ea17f3df3fb38904450114cbef9b4b0ea9c94da9de7a49b70b707012277a"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=x86_64 %gcc ^python@3.12"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=x86_64 ^python@3.12"
     },
     {
       "binaries": [
@@ -338,7 +338,7 @@
           "6599ac06ade0cb3e80695f36492ea94a306f8bde0537482521510076c5981aa0"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=x86_64 %gcc ^python@3.13"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=x86_64 ^python@3.13"
     },
     {
       "binaries": [
@@ -348,7 +348,7 @@
           "90b7cf4dd98e26c58578ad8604738cc32dfbb228cfb981bdfe103c99d0e7b5dd"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=x86_64 %gcc ^python@3.6"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=x86_64 ^python@3.6"
     },
     {
       "binaries": [
@@ -358,7 +358,7 @@
           "dc5dbfd9c05b43c4992bf6666638ae96cee5548921e94eb793ba85727b25ec59"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=x86_64 %gcc ^python@3.7"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=x86_64 ^python@3.7"
     },
     {
       "binaries": [
@@ -368,7 +368,7 @@
           "e8518de25baff7a74bdb42193e6e4b0496e7d0688434c42ce4bdc92fe4293a09"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=x86_64 %gcc ^python@3.8"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=x86_64 ^python@3.8"
     },
     {
       "binaries": [
@@ -378,7 +378,7 @@
           "0c5831932608e7b4084fc6ce60e2b67b77dab76e5515303a049d4d30cd772321"
         ]
       ],
-      "spec": "clingo-bootstrap@spack platform=linux target=x86_64 %gcc ^python@3.9"
+      "spec": "clingo-bootstrap@spack%gcc platform=linux target=x86_64 ^python@3.9"
     }
   ]
 }
@@ -48,7 +48,7 @@
           "61bcb83dc3fc2ae06fde30b9f79c2596bd0457cf56b4d339c8c562a38ca1c31f"
         ]
       ],
-      "spec": "gnupg@2.4.5 platform=darwin target=aarch64 %apple-clang"
+      "spec": "gnupg@2.4.5%apple-clang platform=darwin target=aarch64"
     },
     {
       "binaries": [
@@ -98,7 +98,7 @@
           "3d36bce8bbd06134445aa3cefa00a80068317b6d082d2b43bb1e3be81ede5849"
         ]
       ],
-      "spec": "gnupg@2.4.5 platform=darwin target=x86_64 %apple-clang"
+      "spec": "gnupg@2.4.5%apple-clang platform=darwin target=x86_64"
     },
     {
       "binaries": [
@@ -153,7 +153,7 @@
           "8398592ab0812d8c76a21deca06da4277d05f4db784e129ead7535bb8988faa2"
         ]
       ],
-      "spec": "gnupg@2.4.5 platform=linux target=aarch64 %gcc"
+      "spec": "gnupg@2.4.5%gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -208,7 +208,7 @@
           "cc7e4833af58913fa4ab2b7ce3fdb86d214594d54327c7e4eb4ca3f0784c046f"
         ]
       ],
-      "spec": "gnupg@2.4.5 platform=linux target=ppc64le %gcc"
+      "spec": "gnupg@2.4.5%gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -263,7 +263,7 @@
           "418b582f84547504b6464913fba5ba196482e86258081bdeb21af519fe8a2933"
         ]
       ],
-      "spec": "gnupg@2.4.5 platform=linux target=x86_64 %gcc"
+      "spec": "gnupg@2.4.5%gcc platform=linux target=x86_64"
     }
   ]
 }
@@ -13,7 +13,7 @@
           "820b8013b0b918ad85caa953740497e6c31c09d812bd34d087fc57128bfbdacb"
         ]
       ],
-      "spec": "patchelf@0.17.2 platform=linux target=aarch64 %gcc"
+      "spec": "patchelf@0.17.2%gcc platform=linux target=aarch64"
     },
     {
       "binaries": [
@@ -28,7 +28,7 @@
           "1569df037ea1ea316a50e89f5a0cafa0ce8e20629bbd07fcc3846d9fecd2451c"
         ]
       ],
-      "spec": "patchelf@0.17.2 platform=linux target=ppc64le %gcc"
+      "spec": "patchelf@0.17.2%gcc platform=linux target=ppc64le"
     },
     {
       "binaries": [
@@ -43,7 +43,7 @@
           "79dfb7064e7993a97474c5f6b7560254fe19465a6c4cfc44569852e5a6ab542b"
         ]
       ],
-      "spec": "patchelf@0.17.2 platform=linux target=x86_64 %gcc"
+      "spec": "patchelf@0.17.2%gcc platform=linux target=x86_64"
     }
   ]
 }

@@ -20,9 +20,8 @@ ci:
- k=$CI_GPG_KEY_ROOT/intermediate_ci_signing_key.gpg; [[ -r $k ]] && spack gpg trust $k
- k=$CI_GPG_KEY_ROOT/spack_public_key.gpg; [[ -r $k ]] && spack gpg trust $k
script::
- - if [ -n "$SPACK_EXTRA_MIRROR" ]; then spack mirror add local "${SPACK_EXTRA_MIRROR}/${SPACK_CI_STACK_NAME}"; fi
- - spack config blame mirrors
- spack config blame mirrors
- spack --color=always --backtrace ci rebuild -j ${SPACK_BUILD_JOBS} --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
- - spack --color=always --backtrace ci rebuild -j ${SPACK_BUILD_JOBS} --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
after_script:
- - cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true

@@ -1,3 +1,5 @@
config:
build_stage:
- $spack/tmp/stage
install_tree:
root: $spack/opt/spack

@@ -2,11 +2,18 @@ ci:
broken-tests-packages:
- mpich
- openmpi
- py-mpi4py
pipeline-gen:
- build-job-remove:
tags: [spack]
- build-job:
tags: [ "macos-sequoia", "apple-clang-16", "aarch64-macos" ]

# after_script intended to ensure all stage files are properly cleaned up,
# including those that may have been created as read-only by `go mod`
# as part of installation of a golang package
# see: https://github.com/spack/spack/issues/49147
after_script-:
- - if [[ -d tmp ]] ; then chmod -R u+w tmp ; else echo tmp not found ; fi
- ./bin/spack clean -a
- build-job-remove:
image:: macos-run-on-metal

@@ -5,7 +5,6 @@ ci:
- Write-Output "Done"

before_script::
- git config core.autocrlf true
- fsutil 8dot3name set C:\ 0
- . .\share\spack\setup-env.ps1
- If (Test-Path -path C:\\key\intermediate_ci_signing_key.gpg) { spack.ps1 gpg trust C:\\key\intermediate_ci_signing_key.gpg }

@@ -25,7 +25,7 @@ spack:
# Minimize LLVM
variants: ~lldb~lld~libomptarget~polly~gold libunwind=none compiler-rt=none
libllvm:
require: ["llvm"]
require: ["^llvm"]
visit:
require: ["@3.4.1"]

@@ -72,9 +72,6 @@ spack:
require:
- "~qt ^[virtuals=gl] osmesa"
- target=x86_64_v3
petsc:
require:
- "+batch"
trilinos:
require:
- one_of: [+amesos +amesos2 +anasazi +aztec +boost +epetra +epetraext +ifpack

@@ -30,13 +30,13 @@ spack:
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc target=neoverse_v2 %gcc'
require: '~wrapperrpath ~hwloc %gcc target=neoverse_v2'
tbb:
require: intel-tbb
vtk-m:
require: "+examples target=neoverse_v2 %gcc"
require: "+examples %gcc target=neoverse_v2"
paraview:
require: "+examples target=neoverse_v2 %gcc"
require: "+examples %gcc target=neoverse_v2"

specs:
# CPU
Some files were not shown because too many files have changed in this diff.