Remove the old concretizer (#45215)

The old concretizer is still used to bootstrap clingo from source. If we switch to a DAG model
where compilers are treated as nodes, we need to either:

1. fix the old concretizer to support this (which is a lot of work and possibly research), or
2. bootstrap `clingo` without the old concretizer.

This PR takes the second approach and removes the old concretizer code. To bootstrap
`clingo`, we store a few concrete spec prototypes as JSON, select one according to the
coarse-grained system architecture, and tweak it to match the current host.
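The prototype-selection step described above can be sketched roughly as follows. This is an illustrative, self-contained sketch, not the actual Spack implementation: the function name `select_prototype`, the directory layout, and the aarch64 fallback rule here are assumptions made for the example.

```python
import pathlib


def select_prototype(prototype_dir: pathlib.Path, platform: str, target_family: str) -> pathlib.Path:
    """Pick a concrete-spec prototype JSON for the coarse-grained host architecture."""
    candidate = prototype_dir / f"clingo-{platform}-{target_family}.json"
    if candidate.exists():
        return candidate
    if platform == "linux":
        # No exact match: fall back to a prototype for a similar Linux target
        fallback = prototype_dir / f"clingo-{platform}-aarch64.json"
        if fallback.exists():
            return fallback
    raise RuntimeError(f"cannot bootstrap clingo on {platform}/{target_family}")
```

The selected prototype is only a starting point; host-specific fields (OS, compiler, Python) still have to be rewritten before the spec is usable.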

The old concretizer and related dead code are removed. In particular, this removes
`Spec.normalize()` and related methods, which many unit tests used to set up their
test context. The tests have been updated to no longer use `normalize()`.

- [x] Bootstrap clingo concretization based on a JSON file
- [x] Bootstrap clingo *before* patchelf
- [x] Remove any use of the old concretizer, including:
      * Remove the `only_clingo` and `only_original` fixtures
      * Remove `_old_concretize` and `_new_concretize`
      * Remove `_concretize_together_old`
      * Remove `_concretize_together_new`
      * Remove any use of `SPACK_TEST_SOLVER`
      * Simplify CI jobs
- [x] Ensure bootstrapping `clingo` works on Darwin and Windows
- [x] Raise an intelligible error when a compiler is missing
- [x] Ensure bootstrapping works on FreeBSD
- [x] Remove `normalize()` and related methods

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Author: Massimiliano Culpo, 2024-08-11 01:12:27 +02:00 (committed by GitHub)
Commit: 2079b888c8 (parent 2dbc5213b0)
GPG Key ID: B5690EEEBB952194 (no known key found for this signature in database)
40 changed files with 286 additions and 1938 deletions


@@ -16,38 +16,27 @@ jobs:
       matrix:
         os: [ubuntu-latest]
         python-version: ['3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
-        concretizer: ['clingo']
         on_develop:
         - ${{ github.ref == 'refs/heads/develop' }}
         include:
-        - python-version: '3.11'
-          os: ubuntu-latest
-          concretizer: original
-          on_develop: ${{ github.ref == 'refs/heads/develop' }}
         - python-version: '3.6'
           os: ubuntu-20.04
-          concretizer: clingo
           on_develop: ${{ github.ref == 'refs/heads/develop' }}
         exclude:
         - python-version: '3.7'
           os: ubuntu-latest
-          concretizer: 'clingo'
           on_develop: false
         - python-version: '3.8'
           os: ubuntu-latest
-          concretizer: 'clingo'
           on_develop: false
         - python-version: '3.9'
           os: ubuntu-latest
-          concretizer: 'clingo'
           on_develop: false
         - python-version: '3.10'
           os: ubuntu-latest
-          concretizer: 'clingo'
           on_develop: false
         - python-version: '3.11'
           os: ubuntu-latest
-          concretizer: 'clingo'
           on_develop: false
     steps:
@@ -85,7 +74,6 @@ jobs:
     - name: Run unit tests
       env:
         SPACK_PYTHON: python
-        SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
         SPACK_TEST_PARALLEL: 2
         COVERAGE: true
         UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
@@ -182,7 +170,6 @@ jobs:
     - name: Run unit tests (full suite with coverage)
       env:
         COVERAGE: true
-        SPACK_TEST_SOLVER: clingo
       run: |
        share/spack/qa/run-unit-tests
    - uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
@@ -213,7 +200,6 @@ jobs:
        brew install dash fish gcc gnupg2 kcov
    - name: Run unit tests
      env:
-        SPACK_TEST_SOLVER: clingo
        SPACK_TEST_PARALLEL: 4
      run: |
        git --version


@@ -170,23 +170,6 @@ config:
   # If set to true, Spack will use ccache to cache C compiles.
   ccache: false

-  # The concretization algorithm to use in Spack. Options are:
-  #
-  #  'clingo': Uses a logic solver under the hood to solve DAGs with full
-  #            backtracking and optimization for user preferences. Spack will
-  #            try to bootstrap the logic solver, if not already available.
-  #
-  #  'original': Spack's original greedy, fixed-point concretizer. This
-  #              algorithm can make decisions too early and will not backtrack
-  #              sufficiently for many specs. This will soon be deprecated in
-  #              favor of clingo.
-  #
-  # See `concretizer.yaml` for more settings you can fine-tune when
-  # using clingo.
-  concretizer: clingo
-
   # How long to wait to lock the Spack installation database. This lock is used
   # when Spack needs to manage its own package metadata and all operations are
   # expected to complete within the default time limit. The timeout should


@@ -1,6 +1,5 @@
 config:
   locks: false
-  concretizer: clingo
   build_stage::
   - '$spack/.staging'
   stage_name: '{name}-{version}-{hash:7}'


@@ -0,0 +1,154 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Bootstrap concrete specs for clingo

Spack uses clingo to concretize specs. When clingo itself needs to be bootstrapped from sources,
we need to rely on another mechanism to get a concrete spec that fits the current host.

This module contains the logic to get a concrete spec for clingo, starting from a prototype
JSON file for a similar platform.
"""
import pathlib
import sys
from typing import Dict, Optional, Tuple

import archspec.cpu

import spack.compiler
import spack.compilers
import spack.platforms
import spack.spec
import spack.traverse

from .config import spec_for_current_python


class ClingoBootstrapConcretizer:
    def __init__(self, configuration):
        self.host_platform = spack.platforms.host()
        self.host_os = self.host_platform.operating_system("frontend")
        self.host_target = archspec.cpu.host().family
        self.host_architecture = spack.spec.ArchSpec.frontend_arch()
        self.host_architecture.target = str(self.host_target)
        self.host_compiler = self._valid_compiler_or_raise()
        self.host_python = self.python_external_spec()
        if str(self.host_platform) == "linux":
            self.host_libc = self.libc_external_spec()

        self.external_cmake, self.external_bison = self._externals_from_yaml(configuration)

    def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
        if str(self.host_platform) == "linux":
            compiler_name = "gcc"
        elif str(self.host_platform) == "darwin":
            compiler_name = "apple-clang"
        elif str(self.host_platform) == "windows":
            compiler_name = "msvc"
        elif str(self.host_platform) == "freebsd":
            compiler_name = "clang"
        else:
            raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
        candidates = spack.compilers.compilers_for_spec(
            compiler_name, arch_spec=self.host_architecture
        )
        if not candidates:
            raise RuntimeError(
                f"Cannot find any version of {compiler_name} to bootstrap clingo from sources"
            )
        candidates.sort(key=lambda x: x.spec.version, reverse=True)
        return candidates[0]

    def _externals_from_yaml(
        self, configuration: "spack.config.Configuration"
    ) -> Tuple[Optional["spack.spec.Spec"], Optional["spack.spec.Spec"]]:
        packages_yaml = configuration.get("packages")
        requirements = {"cmake": "@3.20:", "bison": "@2.5:"}
        selected: Dict[str, Optional["spack.spec.Spec"]] = {"cmake": None, "bison": None}
        for pkg_name in ["cmake", "bison"]:
            if pkg_name not in packages_yaml:
                continue

            candidates = packages_yaml[pkg_name].get("externals", [])
            for candidate in candidates:
                s = spack.spec.Spec(candidate["spec"], external_path=candidate["prefix"])
                if not s.satisfies(requirements[pkg_name]):
                    continue

                if not s.intersects(f"%{self.host_compiler.spec}"):
                    continue

                if not s.intersects(f"arch={self.host_architecture}"):
                    continue

                selected[pkg_name] = self._external_spec(s)
                break
        return selected["cmake"], selected["bison"]

    def prototype_path(self) -> pathlib.Path:
        """Path to a prototype concrete specfile for clingo"""
        parent_dir = pathlib.Path(__file__).parent
        result = parent_dir / "prototypes" / f"clingo-{self.host_platform}-{self.host_target}.json"
        if str(self.host_platform) == "linux":
            # Using aarch64 as a fallback, since it has gnuconfig (x86_64 doesn't have it)
            if not result.exists():
                result = parent_dir / "prototypes" / f"clingo-{self.host_platform}-aarch64.json"
        elif str(self.host_platform) == "freebsd":
            result = parent_dir / "prototypes" / f"clingo-{self.host_platform}-amd64.json"
        elif not result.exists():
            raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
        return result

    def concretize(self) -> "spack.spec.Spec":
        # Read the prototype and mark it NOT concrete
        s = spack.spec.Spec.from_specfile(str(self.prototype_path()))
        s._mark_concrete(False)

        # Tweak it to conform to the host architecture
        for node in s.traverse():
            node.architecture.os = str(self.host_os)
            node.compiler = self.host_compiler.spec
            node.architecture = self.host_architecture

            if node.name == "gcc-runtime":
                node.versions = self.host_compiler.spec.versions

        for edge in spack.traverse.traverse_edges([s], cover="edges"):
            if edge.spec.name == "python":
                edge.spec = self.host_python

            if edge.spec.name == "bison" and self.external_bison:
                edge.spec = self.external_bison

            if edge.spec.name == "cmake" and self.external_cmake:
                edge.spec = self.external_cmake

            if "libc" in edge.virtuals:
                edge.spec = self.host_libc

        s._finalize_concretization()

        # Work around the fact that the installer calls Spec.dependents() and
        # we modified edges inconsistently
        return s.copy()

    def python_external_spec(self) -> "spack.spec.Spec":
        """Python external spec corresponding to the current running interpreter"""
        result = spack.spec.Spec(spec_for_current_python(), external_path=sys.exec_prefix)
        return self._external_spec(result)

    def libc_external_spec(self) -> "spack.spec.Spec":
        result = self.host_compiler.default_libc
        return self._external_spec(result)

    def _external_spec(self, initial_spec) -> "spack.spec.Spec":
        initial_spec.namespace = "builtin"
        initial_spec.compiler = self.host_compiler.spec
        initial_spec.architecture = self.host_architecture
        for flag_type in spack.spec.FlagMap.valid_compiler_flags():
            initial_spec.compiler_flags[flag_type] = []
        return spack.spec.parse_with_version_concrete(initial_spec)
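The `concretize()` method above follows a simple pattern: load the prototype, strip its concreteness, rewrite the host-dependent fields on every node, then finalize. A toy, Spack-free sketch of that per-node rewrite (the dict-based "nodes" and field names here are illustrative, not Spack's real `Spec` data model):

```python
from typing import Dict, List


def retarget_prototype(
    nodes: List[Dict[str, str]], host_os: str, host_compiler: str, host_arch: str
) -> List[Dict[str, str]]:
    """Rewrite host-dependent fields on every node of a prototype DAG."""
    for node in nodes:
        node["os"] = host_os
        node["compiler"] = host_compiler
        node["arch"] = host_arch
        # The compiler runtime node must track the compiler that will build it
        if node["name"] == "gcc-runtime":
            node["version"] = host_compiler.split("@")[-1]
    return nodes
```

In the real code the same idea is applied to `Spec` nodes and edges, with externals (Python, cmake, bison, libc) spliced in where available.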


@@ -54,6 +54,7 @@
 import spack.version
 from ._common import _executables_in_store, _python_import, _root_spec, _try_import_from_store
+from .clingo import ClingoBootstrapConcretizer
 from .config import spack_python_interpreter, spec_for_current_python

 #: Name of the file containing metadata about the bootstrapping source
@@ -268,15 +269,13 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
         # Try to build and install from sources
         with spack_python_interpreter():
-            # Add hint to use frontend operating system on Cray
-            concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
-
             if module == "clingo":
-                # TODO: remove when the old concretizer is deprecated  # pylint: disable=fixme
-                concrete_spec._old_concretize(  # pylint: disable=protected-access
-                    deprecation_warning=False
-                )
+                bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
+                concrete_spec = bootstrapper.concretize()
             else:
+                concrete_spec = spack.spec.Spec(
+                    abstract_spec_str + " ^" + spec_for_current_python()
+                )
                 concrete_spec.concretize()

         msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
@@ -303,14 +302,7 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
         # might reduce compilation time by a fair amount
         _add_externals_if_missing()

-        concrete_spec = spack.spec.Spec(abstract_spec_str)
-        if concrete_spec.name == "patchelf":
-            concrete_spec._old_concretize(  # pylint: disable=protected-access
-                deprecation_warning=False
-            )
-        else:
-            concrete_spec.concretize()
+        concrete_spec = spack.spec.Spec(abstract_spec_str).concretized()

         msg = "[BOOTSTRAP] Try installing '{0}' from sources"
         tty.debug(msg.format(abstract_spec_str))
         with spack.config.override(self.mirror_scope):

(Seven file diffs suppressed because one or more lines are too long.)


@@ -90,7 +90,6 @@ def report(args):
     print("* **Spack:**", get_version())
     print("* **Python:**", platform.python_version())
     print("* **Platform:**", architecture)
-    print("* **Concretizer:**", spack.config.get("config:concretizer"))


 def debug(parser, args):


@@ -2,29 +2,11 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 """
-Functions here are used to take abstract specs and make them concrete.
-For example, if a spec asks for a version between 1.8 and 1.9, these
-functions might take will take the most recent 1.9 version of the
-package available.  Or, if the user didn't specify a compiler for a
-spec, then this will assign a compiler to the spec based on defaults
-or user preferences.
-
-TODO: make this customizable and allow users to configure
-concretization policies.
+(DEPRECATED) Used to contain the code for the original concretizer
 """
-import functools
-import platform
-import tempfile
 from contextlib import contextmanager
 from itertools import chain
-from typing import Union
-
-import archspec.cpu
-
-import llnl.util.lang
-import llnl.util.tty as tty

 import spack.abi
 import spack.compilers
@@ -37,639 +19,20 @@
 import spack.target
 import spack.tengine
 import spack.util.path
-import spack.variant as vt
-from spack.package_prefs import PackagePrefs, is_spec_buildable, spec_externals
-from spack.version import ClosedOpenRange, VersionList, ver
-
-#: impements rudimentary logic for ABI compatibility
-_abi: Union[spack.abi.ABI, llnl.util.lang.Singleton] = llnl.util.lang.Singleton(
-    lambda: spack.abi.ABI()
-)
-
-
-@functools.total_ordering
-class reverse_order:
-    """Helper for creating key functions.
-
-    This is a wrapper that inverts the sense of the natural
-    comparisons on the object.
-    """
-
-    def __init__(self, value):
-        self.value = value
-
-    def __eq__(self, other):
-        return other.value == self.value
-
-    def __lt__(self, other):
-        return other.value < self.value
-
-
 class Concretizer:
-    """You can subclass this class to override some of the default
-    concretization strategies, or you can override all of them.
-    """
+    """(DEPRECATED) Only contains logic to enable/disable compiler existence checks."""

     #: Controls whether we check that compiler versions actually exist
     #: during concretization. Used for testing and for mirror creation
     check_for_compiler_existence = None

-    #: Packages that the old concretizer cannot deal with correctly, and cannot build anyway.
-    #: Those will not be considered as providers for virtuals.
-    non_buildable_packages = {"glibc", "musl"}
-
-    def __init__(self, abstract_spec=None):
+    def __init__(self):
         if Concretizer.check_for_compiler_existence is None:
             Concretizer.check_for_compiler_existence = not spack.config.get(
                 "config:install_missing_compilers", False
             )
-        self.abstract_spec = abstract_spec
-        self._adjust_target_answer_generator = None
-
-    def concretize_develop(self, spec):
-        """
-        Add ``dev_path=*`` variant to packages built from local source.
-        """
-        env = spack.environment.active_environment()
-        dev_info = env.dev_specs.get(spec.name, {}) if env else {}
-        if not dev_info:
-            return False
-
-        path = spack.util.path.canonicalize_path(dev_info["path"], default_wd=env.path)
-
-        if "dev_path" in spec.variants:
-            assert spec.variants["dev_path"].value == path
-            changed = False
-        else:
-            spec.variants.setdefault("dev_path", vt.SingleValuedVariant("dev_path", path))
-            changed = True
-
-        changed |= spec.constrain(dev_info["spec"])
-        return changed
-
-    def _valid_virtuals_and_externals(self, spec):
-        """Returns a list of candidate virtual dep providers and external
-        packages that coiuld be used to concretize a spec.
-
-        Preferred specs come first in the list.
-        """
-        # First construct a list of concrete candidates to replace spec with.
-        candidates = [spec]
-        pref_key = lambda spec: 0  # no-op pref key
-
-        if spec.virtual:
-            candidates = [
-                s
-                for s in spack.repo.PATH.providers_for(spec)
-                if s.name not in self.non_buildable_packages
-            ]
-            if not candidates:
-                raise spack.error.UnsatisfiableProviderSpecError(candidates[0], spec)
-
-            # Find nearest spec in the DAG (up then down) that has prefs.
-            spec_w_prefs = find_spec(
-                spec, lambda p: PackagePrefs.has_preferred_providers(p.name, spec.name), spec
-            )  # default to spec itself.
-
-            # Create a key to sort candidates by the prefs we found
-            pref_key = PackagePrefs(spec_w_prefs.name, "providers", spec.name)
-
-        # For each candidate package, if it has externals, add those
-        # to the usable list.  if it's not buildable, then *only* add
-        # the externals.
-        usable = []
-        for cspec in candidates:
-            if is_spec_buildable(cspec):
-                usable.append(cspec)
-
-            externals = spec_externals(cspec)
-            for ext in externals:
-                if ext.intersects(spec):
-                    usable.append(ext)
-
-        # If nothing is in the usable list now, it's because we aren't
-        # allowed to build anything.
-        if not usable:
-            raise NoBuildError(spec)
-
-        # Use a sort key to order the results
-        return sorted(
-            usable,
-            key=lambda spec: (
-                not spec.external,  # prefer externals
-                pref_key(spec),  # respect prefs
-                spec.name,  # group by name
-                reverse_order(spec.versions),  # latest version
-                spec,  # natural order
-            ),
-        )
-
-    def choose_virtual_or_external(self, spec: spack.spec.Spec):
-        """Given a list of candidate virtual and external packages, try to
-        find one that is most ABI compatible.
-        """
-        candidates = self._valid_virtuals_and_externals(spec)
-        if not candidates:
-            return candidates
-
-        # Find the nearest spec in the dag that has a compiler.  We'll
-        # use that spec to calibrate compiler compatibility.
-        abi_exemplar = find_spec(spec, lambda x: x.compiler)
-        if abi_exemplar is None:
-            abi_exemplar = spec.root
-
-        # Sort candidates from most to least compatibility.
-        #   We reverse because True > False.
-        #   Sort is stable, so candidates keep their order.
-        return sorted(
-            candidates,
-            reverse=True,
-            key=lambda spec: (
-                _abi.compatible(spec, abi_exemplar, loose=True),
-                _abi.compatible(spec, abi_exemplar),
-            ),
-        )
-
-    def concretize_version(self, spec):
-        """If the spec is already concrete, return.  Otherwise take
-        the preferred version from spackconfig, and default to the package's
-        version if there are no available versions.
-
-        TODO: In many cases we probably want to look for installed
-              versions of each package and use an installed version
-              if we can link to it.  The policy implemented here will
-              tend to rebuild a lot of stuff becasue it will prefer
-              a compiler in the spec to any compiler already-
-              installed things were built with.  There is likely
-              some better policy that finds some middle ground
-              between these two extremes.
-        """
-        # return if already concrete.
-        if spec.versions.concrete:
-            return False
-
-        # List of versions we could consider, in sorted order
-        pkg_versions = spec.package_class.versions
-        usable = [v for v in pkg_versions if any(v.intersects(sv) for sv in spec.versions)]
-
-        yaml_prefs = PackagePrefs(spec.name, "version")
-
-        # The keys below show the order of precedence of factors used
-        # to select a version when concretizing.  The item with
-        # the "largest" key will be selected.
-        #
-        # NOTE: When COMPARING VERSIONS, the '@develop' version is always
-        #       larger than other versions.  BUT when CONCRETIZING,
-        #       the largest NON-develop version is selected by default.
-        keyfn = lambda v: (
-            # ------- Special direction from the user
-            # Respect order listed in packages.yaml
-            -yaml_prefs(v),
-            # The preferred=True flag (packages or packages.yaml or both?)
-            pkg_versions.get(v).get("preferred", False),
-            # ------- Regular case: use latest non-develop version by default.
-            # Avoid @develop version, which would otherwise be the "largest"
-            # in straight version comparisons
-            not v.isdevelop(),
-            # Compare the version itself
-            # This includes the logic:
-            # a) develop > everything (disabled by "not v.isdevelop() above)
-            # b) numeric > non-numeric
-            # c) Numeric or string comparison
-            v,
-        )
-        usable.sort(key=keyfn, reverse=True)
-
-        if usable:
-            spec.versions = ver([usable[0]])
-        else:
-            # We don't know of any SAFE versions that match the given
-            # spec.  Grab the spec's versions and grab the highest
-            # *non-open* part of the range of versions it specifies.
-            # Someone else can raise an error if this happens,
-            # e.g. when we go to fetch it and don't know how.  But it
-            # *might* work.
-            if not spec.versions or spec.versions == VersionList([":"]):
-                raise NoValidVersionError(spec)
-            else:
-                last = spec.versions[-1]
-                if isinstance(last, ClosedOpenRange):
-                    range_as_version = VersionList([last]).concrete_range_as_version
-                    if range_as_version:
-                        spec.versions = ver([range_as_version])
-                    else:
-                        raise NoValidVersionError(spec)
-                else:
-                    spec.versions = ver([last])
-
-        return True  # Things changed
-
-    def concretize_architecture(self, spec):
-        """If the spec is empty provide the defaults of the platform. If the
-        architecture is not a string type, then check if either the platform,
-        target or operating system are concretized. If any of the fields are
-        changed then return True. If everything is concretized (i.e the
-        architecture attribute is a namedtuple of classes) then return False.
-        If the target is a string type, then convert the string into a
-        concretized architecture. If it has no architecture and the root of the
-        DAG has an architecture, then use the root otherwise use the defaults
-        on the platform.
-        """
-        # ensure type safety for the architecture
-        if spec.architecture is None:
-            spec.architecture = spack.spec.ArchSpec()
-
-        if spec.architecture.concrete:
-            return False
-
-        # Get platform of nearest spec with a platform, including spec
-        # If spec has a platform, easy
-        if spec.architecture.platform:
-            new_plat = spack.platforms.by_name(spec.architecture.platform)
-        else:
-            # Else if anyone else has a platform, take the closest one
-            # Search up, then down, along build/link deps first
-            # Then any nearest. Algorithm from compilerspec search
-            platform_spec = find_spec(spec, lambda x: x.architecture and x.architecture.platform)
-            if platform_spec:
-                new_plat = spack.platforms.by_name(platform_spec.architecture.platform)
-            else:
-                # If no platform anywhere in this spec, grab the default
-                new_plat = spack.platforms.host()
-
-        # Get nearest spec with relevant platform and an os
-        # Generally, same algorithm as finding platform, except we only
-        # consider specs that have a platform
-        if spec.architecture.os:
-            new_os = spec.architecture.os
-        else:
-            new_os_spec = find_spec(
-                spec,
-                lambda x: (
-                    x.architecture
-                    and x.architecture.platform == str(new_plat)
-                    and x.architecture.os
-                ),
-            )
-            if new_os_spec:
-                new_os = new_os_spec.architecture.os
-            else:
-                new_os = new_plat.operating_system("default_os")
-
-        # Get the nearest spec with relevant platform and a target
-        # Generally, same algorithm as finding os
-        curr_target = None
-        if spec.architecture.target:
-            curr_target = spec.architecture.target
-        if spec.architecture.target and spec.architecture.target_concrete:
-            new_target = spec.architecture.target
-        else:
-            new_target_spec = find_spec(
-                spec,
-                lambda x: (
-                    x.architecture
-                    and x.architecture.platform == str(new_plat)
-                    and x.architecture.target
-                    and x.architecture.target != curr_target
-                ),
-            )
-            if new_target_spec:
-                if curr_target:
-                    # constrain one target by the other
-                    new_target_arch = spack.spec.ArchSpec(
-                        (None, None, new_target_spec.architecture.target)
-                    )
-                    curr_target_arch = spack.spec.ArchSpec((None, None, curr_target))
-                    curr_target_arch.constrain(new_target_arch)
-                    new_target = curr_target_arch.target
-                else:
-                    new_target = new_target_spec.architecture.target
-            else:
-                # To get default platform, consider package prefs
-                if PackagePrefs.has_preferred_targets(spec.name):
-                    new_target = self.target_from_package_preferences(spec)
-                else:
-                    new_target = new_plat.target("default_target")
-                if curr_target:
-                    # convert to ArchSpec to compare satisfaction
-                    new_target_arch = spack.spec.ArchSpec((None, None, str(new_target)))
-                    curr_target_arch = spack.spec.ArchSpec((None, None, str(curr_target)))
-
-                    if not new_target_arch.intersects(curr_target_arch):
-                        # new_target is an incorrect guess based on preferences
-                        # and/or default
-                        valid_target_ranges = str(curr_target).split(",")
-                        for target_range in valid_target_ranges:
-                            t_min, t_sep, t_max = target_range.partition(":")
-                            if not t_sep:
-                                new_target = t_min
-                                break
-                            elif t_max:
-                                new_target = t_max
-                                break
-                            elif t_min:
-                                # TODO: something better than picking first
-                                new_target = t_min
-                                break
-
-        # Construct new architecture, compute whether spec changed
-        arch_spec = (str(new_plat), str(new_os), str(new_target))
-        new_arch = spack.spec.ArchSpec(arch_spec)
-        spec_changed = new_arch != spec.architecture
-        spec.architecture = new_arch
-        return spec_changed
-
-    def target_from_package_preferences(self, spec):
-        """Returns the preferred target from the package preferences if
-        there's any.
-
-        Args:
-            spec: abstract spec to be concretized
-        """
-        target_prefs = PackagePrefs(spec.name, "target")
-        target_specs = [spack.spec.Spec("target=%s" % tname) for tname in archspec.cpu.TARGETS]
-
-        def tspec_filter(s):
-            # Filter target specs by whether the architecture
-            # family is the current machine type. This ensures
-            # we only consider x86_64 targets when on an
-            # x86_64 machine, etc. This may need to change to
-            # enable setting cross compiling as a default
-            target = archspec.cpu.TARGETS[str(s.architecture.target)]
-            arch_family_name = target.family.name
-            return arch_family_name == platform.machine()
-
-        # Sort filtered targets by package prefs
-        target_specs = list(filter(tspec_filter, target_specs))
-        target_specs.sort(key=target_prefs)
-
-        new_target = target_specs[0].architecture.target
-        return new_target
-
-    def concretize_variants(self, spec):
-        """If the spec already has variants filled in, return.  Otherwise, add
-        the user preferences from packages.yaml or the default variants from
-        the package specification.
-        """
-        changed = False
-        preferred_variants = PackagePrefs.preferred_variants(spec.name)
-        pkg_cls = spec.package_class
-        for name, entry in pkg_cls.variants.items():
-            variant, when = entry
-            var = spec.variants.get(name, None)
-            if var and "*" in var:
-                # remove variant wildcard before concretizing
-                # wildcard cannot be combined with other variables in a
-                # multivalue variant, a concrete variant cannot have the value
-                # wildcard, and a wildcard does not constrain a variant
-                spec.variants.pop(name)
-            if name not in spec.variants and any(spec.satisfies(w) for w in when):
-                changed = True
-                if name in preferred_variants:
-                    spec.variants[name] = preferred_variants.get(name)
-                else:
-                    spec.variants[name] = variant.make_default()
-            if name in spec.variants and not any(spec.satisfies(w) for w in when):
-                raise vt.InvalidVariantForSpecError(name, when, spec)
-
-        return changed
-
-    def concretize_compiler(self, spec):
-        """If the spec already has a compiler, we're done.  If not, then take
-        the compiler used for the nearest ancestor with a compiler
-        spec and use that.  If the ancestor's compiler is not
-        concrete, then used the preferred compiler as specified in
-        spackconfig.
-
-        Intuition: Use the spackconfig default if no package that depends on
-        this one has a strict compiler requirement.  Otherwise, try to
-        build with the compiler that will be used by libraries that
-        link to this one, to maximize compatibility.
-        """
-        # Pass on concretizing the compiler if the target or operating system
-        # is not yet determined
-        if not spec.architecture.concrete:
-            # We haven't changed, but other changes need to happen before we
-            # continue. `return True` here to force concretization to keep
-            # running.
-            return True
-
-        # Only use a matching compiler if it is of the proper style
-        # Takes advantage of the proper logic already existing in
-        # compiler_for_spec Should think whether this can be more
-        # efficient
-        def _proper_compiler_style(cspec, aspec):
-            compilers = spack.compilers.compilers_for_spec(cspec, arch_spec=aspec)
-            # If the spec passed as argument is concrete we want to check
-            # the versions match exactly
-            if (
-                cspec.concrete
-                and compilers
-                and cspec.version not in [c.version for c in compilers]
-            ):
-                return []
-
-            return compilers
-
-        if spec.compiler and spec.compiler.concrete:
-            if self.check_for_compiler_existence and not _proper_compiler_style(
-                spec.compiler, spec.architecture
-            ):
-                _compiler_concretization_failure(spec.compiler, spec.architecture)
-            return False
-
-        # Find another spec that has a compiler, or the root if none do
-        other_spec = spec if spec.compiler else find_spec(spec, lambda x: x.compiler, spec.root)
-        other_compiler = other_spec.compiler
-        assert other_spec
-
-        # Check if the compiler is already fully specified
-        if other_compiler and other_compiler.concrete:
-            if self.check_for_compiler_existence and not _proper_compiler_style(
-                other_compiler, spec.architecture
-            ):
-                _compiler_concretization_failure(other_compiler, spec.architecture)
-            spec.compiler = other_compiler
-            return True
-
-        if other_compiler:  # Another node has abstract compiler information
-            compiler_list = spack.compilers.find_specs_by_arch(other_compiler, spec.architecture)
-            if not compiler_list:
-                # We don't have a matching compiler installed
-                if not self.check_for_compiler_existence:
-                    # Concretize compiler spec versions as a package to build
-                    cpkg_spec = spack.compilers.pkg_spec_for_compiler(other_compiler)
-                    self.concretize_version(cpkg_spec)
-                    spec.compiler = spack.spec.CompilerSpec(
-                        other_compiler.name, cpkg_spec.versions
-                    )
-                    return True
-                else:
-                    # No compiler with a satisfactory spec was found
-                    raise UnavailableCompilerVersionError(other_compiler, spec.architecture)
-        else:
-            # We have no hints to go by, grab any compiler
-            compiler_list = spack.compilers.all_compiler_specs()
-            if not compiler_list:
-                # Spack has no compilers.
-                raise spack.compilers.NoCompilersError()
-
-        # By default, prefer later versions of compilers
-        compiler_list = sorted(compiler_list, key=lambda x: (x.name, x.version), reverse=True)
-        ppk = PackagePrefs(other_spec.name, "compiler")
-        matches = sorted(compiler_list, key=ppk)
-
-        # copy concrete version into other_compiler
-        try:
-            spec.compiler = next(
-                c for c in matches if _proper_compiler_style(c, spec.architecture)
-            ).copy()
-        except StopIteration:
-            # No compiler with a satisfactory spec has a suitable arch
-            _compiler_concretization_failure(other_compiler, spec.architecture)
-
-        assert spec.compiler.concrete
-        return True  # things changed.
-
-    def concretize_compiler_flags(self, spec):
-        """
-        The compiler flags are updated to match those of the spec whose
-        compiler is used, defaulting to no compiler flags in the spec.
-        Default specs set at the compiler level will still be added later.
-        """
-        # Pass on concretizing the compiler flags if the target or operating
-        # system is not set.
-        if not spec.architecture.concrete:
-            # We haven't changed, but other changes need to happen before we
-            # continue. `return True` here to force concretization to keep
-            # running.
-            return True
-
-        compiler_match = lambda other: (
-            spec.compiler == other.compiler and spec.architecture == other.architecture
-        )
-
-        ret = False
-        for flag in spack.spec.FlagMap.valid_compiler_flags():
-            if flag not in spec.compiler_flags:
-                spec.compiler_flags[flag] = list()
-            try:
-                nearest = next(
-                    p
-                    for p in spec.traverse(direction="parents")
-                    if (compiler_match(p) and (p is not spec) and flag in p.compiler_flags)
-                )
-                nearest_flags = nearest.compiler_flags.get(flag, [])
-                flags = spec.compiler_flags.get(flag, [])
-                if set(nearest_flags) - set(flags):
-                    spec.compiler_flags[flag] = list(llnl.util.lang.dedupe(nearest_flags + flags))
-                    ret = True
-            except StopIteration:
-                pass
-
-        # Include the compiler flag defaults from the config files
-        # This ensures that spack will detect conflicts that stem from a change
-        # in default compiler flags.
-        try:
-            compiler = spack.compilers.compiler_for_spec(spec.compiler, spec.architecture)
-        except spack.compilers.NoCompilerForSpecError:
-            if self.check_for_compiler_existence:
-                raise
-            return ret
-        for flag in compiler.flags:
-            config_flags = compiler.flags.get(flag, [])
-            flags = spec.compiler_flags.get(flag, [])
-            spec.compiler_flags[flag] = list(llnl.util.lang.dedupe(config_flags + flags))
if set(config_flags) - set(flags):
ret = True
return ret
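The flag-inheritance step above (take the nearest ancestor with the same compiler and architecture, merge its flags in front of the spec's own, and dedupe while preserving order) can be sketched in isolation. This is a hedged standalone model: `merge_flags` and `dedupe` are illustrative stand-ins for the removed logic and for `llnl.util.lang.dedupe`, not Spack API.

```python
def dedupe(seq):
    """Return items in order, dropping repeats (mirrors llnl.util.lang.dedupe)."""
    seen = set()
    out = []
    for item in seq:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

def merge_flags(own_flags, inherited_flags):
    """Merge inherited compiler flags in front of a spec's own flags.

    Returns the merged list and whether anything new was added, mirroring
    how concretize_compiler_flags reports that the spec changed.
    """
    if set(inherited_flags) - set(own_flags):
        return dedupe(inherited_flags + own_flags), True
    return list(own_flags), False

merged, changed = merge_flags(["-O2"], ["-g", "-O2"])  # -> (["-g", "-O2"], True)
```

Note that ordering matters: inherited flags go first, so a dependent's defaults can still be overridden by later flags on the spec itself.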
def adjust_target(self, spec):
"""Adjusts the target microarchitecture if the compiler is too old
to support the default one.
Args:
spec: spec to be concretized
Returns:
True if spec was modified, False otherwise
"""
# To minimize the impact on performance this function will attempt
# to adjust the target only at the very first call once necessary
# information is set. It will just return False on subsequent calls.
# The way this is achieved is by initializing a generator and making
# this function return the next answer.
if not (spec.architecture and spec.architecture.concrete):
# Not ready, but keep going because we have work to do later
return True
def _make_only_one_call(spec):
yield self._adjust_target(spec)
while True:
yield False
if self._adjust_target_answer_generator is None:
self._adjust_target_answer_generator = _make_only_one_call(spec)
return next(self._adjust_target_answer_generator)
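The adjust-once trick above relies on a generator that yields the real answer exactly once and `False` forever after. A minimal standalone version of the pattern (the names here are illustrative, not Spack API):

```python
def make_only_one_call(compute):
    """Yield the result of compute() on the first next(); False forever after."""
    yield compute()
    while True:
        yield False

calls = []
answers = make_only_one_call(lambda: calls.append("adjusted") or True)
first = next(answers)   # runs the computation once
second = next(answers)  # every later call is a cheap False
```

Storing the generator on the instance (as `_adjust_target_answer_generator` does) makes the expensive adjustment run at most once per concretization without a separate boolean flag.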
def _adjust_target(self, spec):
"""Assumes that the architecture and the compiler have been
set already and checks if the current target microarchitecture
is the default and can be optimized by the compiler.
If not, downgrades the microarchitecture until a suitable one
is found. If none can be found raise an error.
Args:
spec: spec to be concretized
Returns:
True if any modification happened, False otherwise
"""
import archspec.cpu
# Try to adjust the target only if it is the default
# target for this platform
current_target = spec.architecture.target
current_platform = spack.platforms.by_name(spec.architecture.platform)
default_target = current_platform.target("default_target")
if PackagePrefs.has_preferred_targets(spec.name):
default_target = self.target_from_package_preferences(spec)
if current_target != default_target or (
self.abstract_spec
and self.abstract_spec.architecture
and self.abstract_spec.architecture.concrete
):
return False
try:
current_target.optimization_flags(spec.compiler)
except archspec.cpu.UnsupportedMicroarchitecture:
microarchitecture = current_target.microarchitecture
for ancestor in microarchitecture.ancestors:
candidate = None
try:
candidate = spack.target.Target(ancestor)
candidate.optimization_flags(spec.compiler)
except archspec.cpu.UnsupportedMicroarchitecture:
continue
if candidate is not None:
msg = (
"{0.name}@{0.version} cannot build optimized "
'binaries for "{1}". Using best target possible: '
'"{2}"'
)
msg = msg.format(spec.compiler, current_target, candidate)
tty.warn(msg)
spec.architecture.target = candidate
return True
else:
raise
return False
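The downgrade loop in `_adjust_target` walks the microarchitecture's ancestors (nearest first) until the compiler can emit optimization flags for one, and re-raises if none works. A minimal model under stated assumptions: `supports` stands in for `optimization_flags` not raising `UnsupportedMicroarchitecture`, and the target names are hypothetical.

```python
def downgrade_target(ancestors, supports):
    """Return the first ancestor microarchitecture the compiler supports,
    or raise if none is usable (mirroring the re-raise in _adjust_target)."""
    for candidate in ancestors:
        if supports(candidate):
            return candidate
    raise RuntimeError("no supported ancestor target")

# e.g. a compiler too old for skylake but fine with older generations
chosen = downgrade_target(
    ["skylake", "haswell", "x86_64"], lambda t: t in {"haswell", "x86_64"}
)  # -> "haswell"
```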
@contextmanager
@@ -719,19 +82,6 @@ def find_spec(spec, condition, default=None):
return default  # Nothing matched the condition; return default.
def _compiler_concretization_failure(compiler_spec, arch):
# Distinguish between the case that there are compilers for
# the arch but not with the given compiler spec and the case that
# there are no compilers for the arch at all
if not spack.compilers.compilers_for_arch(arch):
available_os_targets = set(
(c.operating_system, c.target) for c in spack.compilers.all_compilers()
)
raise NoCompilersForArchError(arch, available_os_targets)
else:
raise UnavailableCompilerVersionError(compiler_spec, arch)
def concretize_specs_together(*abstract_specs, **kwargs):
"""Given a number of specs as input, tries to concretize them together.
@@ -744,12 +94,6 @@ def concretize_specs_together(*abstract_specs, **kwargs):
Returns:
List of concretized specs
"""
if spack.config.get("config:concretizer", "clingo") == "original":
return _concretize_specs_together_original(*abstract_specs, **kwargs)
return _concretize_specs_together_new(*abstract_specs, **kwargs)
def _concretize_specs_together_new(*abstract_specs, **kwargs):
import spack.solver.asp
allow_deprecated = spack.config.get("config:deprecated", False)
@@ -760,51 +104,6 @@ def _concretize_specs_together_new(*abstract_specs, **kwargs):
return [s.copy() for s in result.specs]
def _concretize_specs_together_original(*abstract_specs, **kwargs):
abstract_specs = [spack.spec.Spec(s) for s in abstract_specs]
tmpdir = tempfile.mkdtemp()
builder = spack.repo.MockRepositoryBuilder(tmpdir)
# Split recursive specs, as it seems the concretizer has issue
# respecting conditions on dependents expressed like
# depends_on('foo ^bar@1.0'), see issue #11160
split_specs = [
dep.copy(deps=False) for spec1 in abstract_specs for dep in spec1.traverse(root=True)
]
builder.add_package(
"concretizationroot", dependencies=[(str(x), None, None) for x in split_specs]
)
with spack.repo.use_repositories(builder.root, override=False):
# Spec from a helper package that depends on all the abstract_specs
concretization_root = spack.spec.Spec("concretizationroot")
concretization_root.concretize(tests=kwargs.get("tests", False))
# Retrieve the direct dependencies
concrete_specs = [concretization_root[spec.name].copy() for spec in abstract_specs]
return concrete_specs
class NoCompilersForArchError(spack.error.SpackError):
def __init__(self, arch, available_os_targets):
err_msg = (
"No compilers found"
" for operating system %s and target %s."
"\nIf previous installations have succeeded, the"
" operating system may have been updated." % (arch.os, arch.target)
)
available_os_target_strs = list()
for operating_system, t in available_os_targets:
os_target_str = "%s-%s" % (operating_system, t) if t else operating_system
available_os_target_strs.append(os_target_str)
err_msg += (
"\nCompilers are defined for the following"
" operating systems and targets:\n\t" + "\n\t".join(available_os_target_strs)
)
super().__init__(err_msg, "Run 'spack compiler find' to add compilers.")
class UnavailableCompilerVersionError(spack.error.SpackError):
"""Raised when there is no available compiler that satisfies a
compiler spec."""
@@ -820,37 +119,3 @@ def __init__(self, compiler_spec, arch=None):
"'spack compilers' to see which compilers are already recognized"
" by spack.",
)
class NoValidVersionError(spack.error.SpackError):
"""Raised when there is no way to have a concrete version for a
particular spec."""
def __init__(self, spec):
super().__init__(
"There are no valid versions for %s that match '%s'" % (spec.name, spec.versions)
)
class InsufficientArchitectureInfoError(spack.error.SpackError):
"""Raised when details on architecture cannot be collected from the
system"""
def __init__(self, spec, archs):
super().__init__(
"Cannot determine necessary architecture information for '%s': %s"
% (spec.name, str(archs))
)
class NoBuildError(spack.error.SpecError):
"""Raised when a package is configured with the buildable option False, but
no satisfactory external versions can be found
"""
def __init__(self, spec):
msg = (
"The spec\n '%s'\n is configured as not buildable, "
"and no matching external installs were found"
)
super().__init__(msg % spec)


@@ -99,7 +99,6 @@
"dirty": False,
"build_jobs": min(16, cpus_available()),
"build_stage": "$tempdir/spack-stage",
-"concretizer": "clingo",
"license_dir": spack.paths.default_license_dir,
}
}


@@ -1644,9 +1644,8 @@ def _concretize_separately(self, tests=False):
i += 1
# Ensure we don't try to bootstrap clingo in parallel
-if spack.config.get("config:concretizer", "clingo") == "clingo":
-with spack.bootstrap.ensure_bootstrap_configuration():
-spack.bootstrap.ensure_clingo_importable_or_raise()
+with spack.bootstrap.ensure_bootstrap_configuration():
+spack.bootstrap.ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting


@@ -84,7 +84,6 @@
"build_language": {"type": "string"},
"build_jobs": {"type": "integer", "minimum": 1},
"ccache": {"type": "boolean"},
-"concretizer": {"type": "string", "enum": ["original", "clingo"]},
"db_lock_timeout": {"type": "integer", "minimum": 1},
"package_lock_timeout": {
"anyOf": [{"type": "integer", "minimum": 1}, {"type": "null"}]
@@ -98,9 +97,9 @@
"aliases": {"type": "object", "patternProperties": {r"\w[\w-]*": {"type": "string"}}},
},
"deprecatedProperties": {
-"properties": ["terminal_title"],
-"message": "config:terminal_title has been replaced by "
-"install_status and is ignored",
+"properties": ["concretizer"],
+"message": "Spack supports only clingo as a concretizer from v0.23. "
+"The config:concretizer config option is ignored.",
"error": False,
},
}


@@ -70,7 +70,6 @@
import spack.compiler
import spack.compilers
import spack.config
-import spack.dependency as dp
import spack.deptypes as dt
import spack.error
import spack.hash_types as ht
@@ -2615,294 +2614,6 @@ def validate_detection(self):
validate_fn = getattr(pkg_cls, "validate_detected_spec", lambda x, y: None)
validate_fn(self, self.extra_attributes)
def _concretize_helper(self, concretizer, presets=None, visited=None):
"""Recursive helper function for concretize().
This concretizes everything bottom-up. As things are
concretized, they're added to the presets, and ancestors
will prefer the settings of their children.
"""
if presets is None:
presets = {}
if visited is None:
visited = set()
if self.name in visited:
return False
if self.concrete:
visited.add(self.name)
return False
changed = False
# Concretize deps first -- this is a bottom-up process.
for name in sorted(self._dependencies):
# WARNING: This function is an implementation detail of the
# WARNING: original concretizer. Since with that greedy
# WARNING: algorithm we don't allow multiple nodes from
# WARNING: the same package in a DAG, here we hard-code
# WARNING: using index 0 i.e. we assume that we have only
# WARNING: one edge from package "name"
changed |= self._dependencies[name][0].spec._concretize_helper(
concretizer, presets, visited
)
if self.name in presets:
changed |= self.constrain(presets[self.name])
else:
# Concretize virtual dependencies last. Because they're added
# to presets below, their constraints will all be merged, but we'll
# still need to select a concrete package later.
if not self.virtual:
changed |= any(
(
concretizer.concretize_develop(self), # special variant
concretizer.concretize_architecture(self),
concretizer.concretize_compiler(self),
concretizer.adjust_target(self),
# flags must be concretized after compiler
concretizer.concretize_compiler_flags(self),
concretizer.concretize_version(self),
concretizer.concretize_variants(self),
)
)
presets[self.name] = self
visited.add(self.name)
return changed
def _replace_with(self, concrete):
"""Replace this virtual spec with a concrete spec."""
assert self.virtual
virtuals = (self.name,)
for dep_spec in itertools.chain.from_iterable(self._dependents.values()):
dependent = dep_spec.parent
depflag = dep_spec.depflag
# remove self from all dependents, unless it is already removed
if self.name in dependent._dependencies:
del dependent._dependencies.edges[self.name]
# add the replacement, unless it is already a dep of dependent.
if concrete.name not in dependent._dependencies:
dependent._add_dependency(concrete, depflag=depflag, virtuals=virtuals)
else:
dependent.edges_to_dependencies(name=concrete.name)[0].update_virtuals(
virtuals=virtuals
)
def _expand_virtual_packages(self, concretizer):
"""Find virtual packages in this spec, replace them with providers,
and normalize again to include the provider's (potentially virtual)
dependencies. Repeat until there are no virtual deps.
Precondition: spec is normalized.
.. todo::
If a provider depends on something that conflicts with
other dependencies in the spec being expanded, this can
produce a conflicting spec. For example, if mpich depends
on hwloc@:1.3 but something in the spec needs hwloc@1.4:,
then we should choose an MPI other than mpich. Cases like
this are infrequent, but we should implement this before it is
a problem.
"""
# Make an index of stuff this spec already provides
self_index = spack.provider_index.ProviderIndex(
repository=spack.repo.PATH, specs=self.traverse(), restrict=True
)
changed = False
done = False
while not done:
done = True
for spec in list(self.traverse()):
replacement = None
if spec.external:
continue
if spec.virtual:
replacement = self._find_provider(spec, self_index)
if replacement:
# TODO: may break if in-place on self but
# shouldn't happen if root is traversed first.
spec._replace_with(replacement)
done = False
break
if not replacement:
# Get a list of possible replacements in order of
# preference.
candidates = concretizer.choose_virtual_or_external(spec)
# Try the replacements in order, skipping any that cause
# satisfiability problems.
for replacement in candidates:
if replacement is spec:
break
# Replace spec with the candidate and normalize
copy = self.copy()
copy[spec.name]._dup(replacement, deps=False)
try:
# If there are duplicate providers or duplicate
# provider deps, consolidate them and merge
# constraints.
copy.normalize(force=True)
break
except spack.error.SpecError:
# On error, we'll try the next replacement.
continue
# If replacement is external then trim the dependencies
if replacement.external:
if spec._dependencies:
for dep in spec.dependencies():
del dep._dependents.edges[spec.name]
changed = True
spec.clear_dependencies()
replacement.clear_dependencies()
replacement.architecture = self.architecture
# TODO: could this and the stuff in _dup be cleaned up?
def feq(cfield, sfield):
return (not cfield) or (cfield == sfield)
if replacement is spec or (
feq(replacement.name, spec.name)
and feq(replacement.versions, spec.versions)
and feq(replacement.compiler, spec.compiler)
and feq(replacement.architecture, spec.architecture)
and feq(replacement._dependencies, spec._dependencies)
and feq(replacement.variants, spec.variants)
and feq(replacement.external_path, spec.external_path)
and feq(replacement.external_modules, spec.external_modules)
):
continue
# Refine this spec to the candidate. This uses
# replace_with AND dup so that it can work in
# place. TODO: make this more efficient.
if spec.virtual:
spec._replace_with(replacement)
changed = True
if spec._dup(replacement, deps=False, cleardeps=False):
changed = True
self_index.update(spec)
done = False
break
return changed
def _old_concretize(self, tests=False, deprecation_warning=True):
"""A spec is concrete if it describes one build of a package uniquely.
This will ensure that this spec is concrete.
Args:
tests (list or bool): list of packages that will need test
dependencies, or True/False for test all/none
deprecation_warning (bool): enable or disable the deprecation
warning for the old concretizer
If this spec could describe more than one version, variant, or build
of a package, this will add constraints to make it concrete.
Some rigorous validation and checks are also performed on the spec.
Concretizing ensures that it is self-consistent and that it's
consistent with requirements of its packages. See flatten() and
normalize() for more details on this.
"""
import spack.concretize
# Add a warning message to inform users that the original concretizer
# will be removed
if deprecation_warning:
msg = (
"the original concretizer is currently being used.\n\tUpgrade to "
'"clingo" at your earliest convenience. The original concretizer '
"will be removed from Spack in a future version."
)
warnings.warn(msg)
self.replace_hash()
if not self.name:
raise spack.error.SpecError("Attempting to concretize anonymous spec")
if self._concrete:
return
# take the spec apart once before starting the main concretization loop and resolving
# deps, but don't break dependencies during concretization as the spec is built.
user_spec_deps = self.flat_dependencies(disconnect=True)
changed = True
force = False
concretizer = spack.concretize.Concretizer(self.copy())
while changed:
changes = (
self.normalize(force, tests, user_spec_deps, disconnect=False),
self._expand_virtual_packages(concretizer),
self._concretize_helper(concretizer),
)
changed = any(changes)
force = True
visited_user_specs = set()
for dep in self.traverse():
visited_user_specs.add(dep.name)
pkg_cls = spack.repo.PATH.get_pkg_class(dep.name)
visited_user_specs.update(pkg_cls(dep).provided_virtual_names())
extra = set(user_spec_deps.keys()).difference(visited_user_specs)
if extra:
raise InvalidDependencyError(self.name, extra)
Spec.inject_patches_variant(self)
for s in self.traverse():
# TODO: Refactor this into a common method to build external specs
# TODO: or turn external_path into a lazy property
Spec.ensure_external_path_if_external(s)
# assign hashes and mark concrete
self._finalize_concretization()
# If any spec in the DAG is deprecated, throw an error
Spec.ensure_no_deprecated(self)
# Update externals as needed
for dep in self.traverse():
if dep.external:
dep.package.update_external_dependencies()
# Now that the spec is concrete we should check if
# there are declared conflicts
#
# TODO: this needs rethinking, as currently we can only express
# TODO: internal configuration conflicts within one package.
matches = []
for x in self.traverse():
if x.external:
# external specs are already built, don't worry about whether
# it's possible to build that configuration with Spack
continue
for when_spec, conflict_list in x.package_class.conflicts.items():
if x.satisfies(when_spec):
for conflict_spec, msg in conflict_list:
if x.satisfies(conflict_spec):
when = when_spec.copy()
when.name = x.name
matches.append((x, conflict_spec, when, msg))
if matches:
raise ConflictsInSpecError(self, matches)
# Check if we can produce an optimized binary (will throw if
# there are declared inconsistencies)
self.architecture.target.optimization_flags(self.compiler)
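The removed `_old_concretize` above drives its normalize/expand/concretize passes to a fixed point: every pass runs each round (no short-circuiting, matching the tuple it builds before `any()`), and the loop stops only when a full round changes nothing. A hedged standalone sketch with illustrative names:

```python
def run_to_fixpoint(passes, state, max_iters=100):
    """Apply every pass each round until a full round reports no change."""
    for _ in range(max_iters):
        changes = [p(state) for p in passes]  # run all passes, then test
        if not any(changes):
            return state
    raise RuntimeError("concretization did not converge")

def bump_to_three(state):
    """Toy pass: report a change until the counter reaches 3."""
    if state["n"] < 3:
        state["n"] += 1
        return True
    return False

final = run_to_fixpoint([bump_to_three], {"n": 0})  # -> {"n": 3}
```

This greedy fixed-point structure is exactly what made adding a DAG compiler model hard, and why the commit replaces it with a single clingo solve.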
def _patches_assigned(self):
"""Whether patches have been assigned to this spec by the concretizer."""
# FIXME: _patches_in_order_of_appearance is attached after concretization
@@ -3032,7 +2743,13 @@ def ensure_no_deprecated(root):
msg += " For each package listed, choose another spec\n"
raise SpecDeprecatedError(msg)
-def _new_concretize(self, tests=False):
+def concretize(self, tests: Union[bool, List[str]] = False) -> None:
+"""Concretize the current spec.
+Args:
+tests: if False disregard 'test' dependencies, if a list of names activate them for
+the packages in the list, if True activate 'test' dependencies for all packages.
+"""
import spack.solver.asp
self.replace_hash()
@@ -3066,19 +2783,6 @@ def _new_concretize(self, tests=False):
concretized = answer[node]
self._dup(concretized)
def concretize(self, tests=False):
"""Concretize the current spec.
Args:
tests (bool or list): if False disregard 'test' dependencies,
if a list of names activate them for the packages in the list,
if True activate 'test' dependencies for all packages.
"""
if spack.config.get("config:concretizer", "clingo") == "clingo":
self._new_concretize(tests)
else:
self._old_concretize(tests)
def _mark_root_concrete(self, value=True):
"""Mark just this spec (not dependencies) concrete."""
if (not value) and self.concrete and self.installed:
@@ -3182,34 +2886,6 @@ def concretized(self, tests=False):
clone.concretize(tests=tests)
return clone
def flat_dependencies(self, disconnect: bool = False):
"""Build DependencyMap of all of this spec's dependencies with their constraints merged.
Arguments:
disconnect: if True, disconnect all dependents and dependencies among nodes in this
spec's DAG.
"""
flat_deps = {}
deptree = self.traverse(root=False)
for spec in deptree:
if spec.name not in flat_deps:
flat_deps[spec.name] = spec
else:
try:
flat_deps[spec.name].constrain(spec)
except spack.error.UnsatisfiableSpecError as e:
# DAG contains two instances of the same package with inconsistent constraints.
raise InconsistentSpecError("Invalid Spec DAG: %s" % e.message) from e
if disconnect:
for spec in flat_deps.values():
if not spec.concrete:
spec.clear_edges()
self.clear_dependencies()
return flat_deps
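The removed `flat_dependencies` above collapses a DAG traversal to one entry per package name, merging duplicates via `constrain()`. A toy model of that collapse, with constraints modeled as plain sets rather than Spack's real constraint semantics (names illustrative):

```python
def flatten_dependencies(traversal):
    """One entry per package name; duplicates union their constraints
    (a stand-in for Spec.constrain merging compatible requirements)."""
    flat = {}
    for name, constraints in traversal:
        if name in flat:
            flat[name] |= constraints  # merge a second sighting of the package
        else:
            flat[name] = set(constraints)
    return flat

flat = flatten_dependencies(
    [("zlib", {"@1.2:"}), ("mpi", set()), ("zlib", {"%gcc"})]
)  # -> {"zlib": {"@1.2:", "%gcc"}, "mpi": set()}
```

In the real method, incompatible duplicates raise `InconsistentSpecError` instead of merging, since the old model allowed only one node per package name.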
def index(self, deptype="all"):
"""Return a dictionary that points to all the dependencies in this
spec.
@@ -3219,312 +2895,6 @@ def index(self, deptype="all"):
dm[spec.name].append(spec)
return dm
def _evaluate_dependency_conditions(self, name):
"""Evaluate all the conditions on a dependency with this name.
Args:
name (str): name of dependency to evaluate conditions on.
Returns:
(Dependency): new Dependency object combining all constraints.
If the package depends on <name> in the current spec
configuration, return the constrained dependency and
corresponding dependency types.
If no conditions are True (and we don't depend on it), return
``None``.
"""
vt.substitute_abstract_variants(self)
# evaluate when specs to figure out constraints on the dependency.
dep = None
for when_spec, deps_by_name in self.package_class.dependencies.items():
if not self.satisfies(when_spec):
continue
for dep_name, dependency in deps_by_name.items():
if dep_name != name:
continue
if dep is None:
dep = dp.Dependency(Spec(self.name), Spec(name), depflag=0)
try:
dep.merge(dependency)
except spack.error.UnsatisfiableSpecError as e:
e.message = (
"Conflicting conditional dependencies for spec"
"\n\n\t{0}\n\n"
"Cannot merge constraint"
"\n\n\t{1}\n\n"
"into"
"\n\n\t{2}".format(self, dependency.spec, dep.spec)
)
raise e
return dep
def _find_provider(self, vdep, provider_index):
"""Find provider for a virtual spec in the provider index.
Raise an exception if there is a conflicting virtual
dependency already in this spec.
"""
assert spack.repo.PATH.is_virtual_safe(vdep.name), vdep
# note that this defensively copies.
providers = provider_index.providers_for(vdep)
# If there is a provider for the vpkg, then use that instead of
# the virtual package.
if providers:
# Remove duplicate providers that can concretize to the same
# result.
for provider in providers:
for spec in providers:
if spec is not provider and provider.intersects(spec):
providers.remove(spec)
# Can't have multiple providers for the same thing in one spec.
if len(providers) > 1:
raise MultipleProviderError(vdep, providers)
return providers[0]
else:
# The user might have required something insufficient for
# pkg_dep -- so we'll get a conflict. e.g., user asked for
# mpi@:1.1 but some package required mpi@2.1:.
required = provider_index.providers_for(vdep.name)
if len(required) > 1:
raise MultipleProviderError(vdep, required)
elif required:
raise UnsatisfiableProviderSpecError(required[0], vdep)
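`_find_provider` above first prunes providers that could concretize to the same result; the pruning amounts to "keep a provider only if nothing already kept intersects it". A sketch with an illustrative `intersects` predicate (real code calls `Spec.intersects`):

```python
def dedupe_providers(providers, intersects):
    """Keep the first of any mutually-intersecting providers, as
    _find_provider does before flagging MultipleProviderError."""
    kept = []
    for provider in providers:
        if not any(intersects(provider, existing) for existing in kept):
            kept.append(provider)
    return kept

# toy predicate: two specs intersect if they name the same package
same_package = lambda a, b: a.split("@")[0] == b.split("@")[0]
providers = dedupe_providers(
    ["openmpi@4.0", "openmpi@4.1", "mpich@3.4"], same_package
)  # -> ["openmpi@4.0", "mpich@3.4"]
```

If more than one provider survives the pruning, the old concretizer had no way to choose and raised `MultipleProviderError`.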
def _merge_dependency(self, dependency, visited, spec_deps, provider_index, tests):
"""Merge dependency information from a Package into this Spec.
Args:
dependency (Dependency): dependency metadata from a package;
this is typically the result of merging *all* matching
dependency constraints from the package.
visited (set): set of dependency nodes already visited by
``normalize()``.
spec_deps (dict): ``dict`` of all dependencies from the spec
being normalized.
provider_index (dict): ``provider_index`` of virtual dep
providers in the ``Spec`` as normalized so far.
NOTE: Caller should assume that this routine owns the
``dependency`` parameter, i.e., it needs to be a copy of any
internal structures.
This is the core of ``normalize()``. There are some basic steps:
* If dep is virtual, evaluate whether it corresponds to an
existing concrete dependency, and merge if so.
* If it's real and it provides some virtual dep, see if it provides
what some virtual dependency wants and merge if so.
* Finally, if none of the above, merge dependency and its
constraints into this spec.
This method returns True if the spec was changed, False otherwise.
"""
changed = False
dep = dependency.spec
# If it's a virtual dependency, try to find an existing
# provider in the spec, and merge that.
virtuals = ()
if spack.repo.PATH.is_virtual_safe(dep.name):
virtuals = (dep.name,)
visited.add(dep.name)
provider = self._find_provider(dep, provider_index)
if provider:
dep = provider
else:
index = spack.provider_index.ProviderIndex(
repository=spack.repo.PATH, specs=[dep], restrict=True
)
items = list(spec_deps.items())
for name, vspec in items:
if not spack.repo.PATH.is_virtual_safe(vspec.name):
continue
if index.providers_for(vspec):
vspec._replace_with(dep)
del spec_deps[vspec.name]
changed = True
else:
required = index.providers_for(vspec.name)
if required:
raise UnsatisfiableProviderSpecError(required[0], dep)
provider_index.update(dep)
# If the spec isn't already in the set of dependencies, add it.
# Note: dep is always owned by this method. If it's from the
# caller, it's a copy from _evaluate_dependency_conditions. If it
# comes from a vdep, it's a defensive copy from _find_provider.
if dep.name not in spec_deps:
if self.concrete:
return False
spec_deps[dep.name] = dep
changed = True
else:
# merge package/vdep information into spec
try:
tty.debug("{0} applying constraint {1}".format(self.name, str(dep)))
changed |= spec_deps[dep.name].constrain(dep)
except spack.error.UnsatisfiableSpecError as e:
fmt = "An unsatisfiable {0}".format(e.constraint_type)
fmt += " constraint has been detected for spec:"
fmt += "\n\n{0}\n\n".format(spec_deps[dep.name].tree(indent=4))
fmt += "while trying to concretize the partial spec:"
fmt += "\n\n{0}\n\n".format(self.tree(indent=4))
fmt += "{0} requires {1} {2} {3}, but spec asked for {4}"
e.message = fmt.format(
self.name, dep.name, e.constraint_type, e.required, e.provided
)
raise
# Add merged spec to my deps and recurse
spec_dependency = spec_deps[dep.name]
if dep.name not in self._dependencies:
self._add_dependency(spec_dependency, depflag=dependency.depflag, virtuals=virtuals)
changed |= spec_dependency._normalize_helper(visited, spec_deps, provider_index, tests)
return changed
def _normalize_helper(self, visited, spec_deps, provider_index, tests):
"""Recursive helper function for _normalize."""
if self.name in visited:
return False
visited.add(self.name)
# If we descend into a virtual spec, there's nothing more
# to normalize. Concretize will finish resolving it later.
if self.virtual or self.external:
return False
# Avoid recursively adding constraints for already-installed packages:
# these may include build dependencies which are not needed for this
# install (since this package is already installed).
if self.concrete and self.installed:
return False
# Combine constraints from package deps with constraints from
# the spec, until nothing changes.
any_change = False
changed = True
while changed:
changed = False
for dep_name in self.package_class.dependency_names():
# Do we depend on dep_name? If so pkg_dep is not None.
dep = self._evaluate_dependency_conditions(dep_name)
# If dep is a needed dependency, merge it.
if dep:
merge = (
# caller requested test dependencies
tests is True
or (tests and self.name in tests)
or
# this is not a test-only dependency
(dep.depflag & ~dt.TEST)
)
if merge:
changed |= self._merge_dependency(
dep, visited, spec_deps, provider_index, tests
)
any_change |= changed
return any_change
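The merge condition inside `_normalize_helper` above decides when a dependency participates in normalization: always for non-test dependencies, and for test-only dependencies only when tests were requested for this package. Isolated here, with an illustrative `TEST` bit standing in for `spack.deptypes.TEST`:

```python
TEST = 0b1000   # illustrative stand-in for the dt.TEST depflag bit
BUILD = 0b0001  # illustrative non-test depflag bit

def should_merge(depflag, tests, pkg_name):
    """True if the dependency is not test-only, or tests were requested
    for pkg_name (tests is True, or a list naming the package)."""
    return bool(
        tests is True
        or (tests and pkg_name in tests)
        or (depflag & ~TEST)  # any non-test dep type survives masking
    )
```

The bitmask check is the key subtlety: a dependency that is both a build and a test dependency still merges even when tests are off, because masking out `TEST` leaves its other dep types.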
def normalize(self, force=False, tests=False, user_spec_deps=None, disconnect=True):
"""When specs are parsed, any dependencies specified are hanging off
the root, and ONLY the ones that were explicitly provided are there.
Normalization turns a partial flat spec into a DAG, where:
1. Known dependencies of the root package are in the DAG.
2. Each node's dependencies dict only contains its known direct
deps.
3. There is only ONE unique spec for each package in the DAG.
* This includes virtual packages. If there is a non-virtual
package that provides a virtual package that is in the spec,
then we replace the virtual package with the non-virtual one.
TODO: normalize should probably implement some form of cycle
detection, to ensure that the spec is actually a DAG.
"""
if not self.name:
raise spack.error.SpecError("Attempting to normalize anonymous spec")
# Set _normal and _concrete to False when forced
if force and not self._concrete:
self._normal = False
if self._normal:
return False
# Ensure first that all packages & compilers in the DAG exist.
self.validate_or_raise()
# Clear the DAG and collect all dependencies in the DAG, which will be
# reapplied as constraints. All dependencies collected this way will
# have been created by a previous execution of 'normalize'.
# A dependency extracted here will only be reintegrated if it is
# discovered to apply according to _normalize_helper, so
# user-specified dependencies are recorded separately in case they
# refer to specs which take several normalization passes to
# materialize.
all_spec_deps = self.flat_dependencies(disconnect=disconnect)
if user_spec_deps:
for name, spec in user_spec_deps.items():
if not name:
msg = "Attempted to normalize anonymous dependency spec"
msg += " %s" % spec
raise InvalidSpecDetected(msg)
if name not in all_spec_deps:
all_spec_deps[name] = spec
else:
all_spec_deps[name].constrain(spec)
# Initialize index of virtual dependency providers if
# concretize didn't pass us one already
provider_index = spack.provider_index.ProviderIndex(
repository=spack.repo.PATH, specs=[s for s in all_spec_deps.values()], restrict=True
)
# traverse the package DAG and fill out dependencies according
# to package files & their 'when' specs
visited = set()
any_change = self._normalize_helper(visited, all_spec_deps, provider_index, tests)
# remove any leftover dependents outside the spec from, e.g., pruning externals
valid = {id(spec) for spec in all_spec_deps.values()} | {id(self)}
for spec in all_spec_deps.values():
remove = [dep for dep in spec.dependents() if id(dep) not in valid]
for dep in remove:
del spec._dependents.edges[dep.name]
del dep._dependencies.edges[spec.name]
# Mark the spec as normal once done.
self._normal = True
return any_change
def normalized(self):
"""
Return a normalized copy of this spec without modifying this spec.
"""
clone = self.copy()
clone.normalize()
return clone
    def validate_or_raise(self):
        """Checks that names and values in this spec are real. If they're not,
        it will raise an appropriate exception.


@@ -208,7 +208,6 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
     ],
 )
 @pytest.mark.usefixtures("mock_packages", "config")
-@pytest.mark.only_clingo("Fixing the parser broke this test for the original concretizer.")
 @pytest.mark.skipif(
     str(archspec.cpu.host().family) != "x86_64", reason="tests are for x86_64 uarch ranges"
 )


@@ -60,4 +60,3 @@ def test_report():
     assert get_version() in out
     assert platform.python_version() in out
     assert str(architecture) in out
-    assert spack.config.get("config:concretizer") in out


@@ -1057,7 +1057,6 @@ def test_env_with_included_config_file(mutable_mock_env_path, packages_file):
     assert any(x.satisfies("mpileaks@2.2") for x in e._get_environment_specs())
-@pytest.mark.only_clingo("original concretizer does not support requirements")
 def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages, mutable_config):
     """Test ``config change`` with config in the ``spack.yaml`` as well as an
     included file scope.
@@ -1133,7 +1132,6 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
     spack.spec.Spec("bowtie@1.2.2").concretized()
-@pytest.mark.only_clingo("original concretizer does not support requirements")
 def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutable_config):
     spack_yaml = tmp_path / ev.manifest_name
     spack_yaml.write_text(
@@ -2332,8 +2330,6 @@ def test_stack_concretize_extraneous_deps(tmpdir, mock_packages):
     # FIXME: constraints for stacks
     # FIXME: This now works for statically-determinable invalid deps
     # FIXME: But it still does not work for dynamically determined invalid deps
-    # if spack.config.get('config:concretizer') == 'clingo':
-    #     pytest.skip('Clingo concretizer does not support soft constraints')
     filename = str(tmpdir.join("spack.yaml"))
     with open(filename, "w") as f:
@@ -3180,9 +3176,7 @@ def test_concretize_user_specs_together():
     e.remove("mpich")
     e.add("mpich2")
-    exc_cls = spack.error.SpackError
-    if spack.config.get("config:concretizer") == "clingo":
-        exc_cls = spack.error.UnsatisfiableSpecError
+    exc_cls = spack.error.UnsatisfiableSpecError
     # Concretizing without invalidating the concrete spec for mpileaks fails
     with pytest.raises(exc_cls):
@@ -3208,10 +3202,8 @@ def test_duplicate_packages_raise_when_concretizing_together():
     e.add("mpileaks~opt")
     e.add("mpich")
-    exc_cls, match = spack.error.SpackError, None
-    if spack.config.get("config:concretizer") == "clingo":
-        exc_cls = spack.error.UnsatisfiableSpecError
-        match = r"You could consider setting `concretizer:unify`"
+    exc_cls = spack.error.UnsatisfiableSpecError
+    match = r"You could consider setting `concretizer:unify`"
     with pytest.raises(exc_cls, match=match):
         e.concretize()


@@ -30,7 +30,6 @@ def test_spec():
     assert "mpich@3.0.4" in output
-@pytest.mark.only_clingo("Known failure of the original concretizer")
 def test_spec_concretizer_args(mutable_database, do_not_check_runtimes_on_reuse):
     """End-to-end test of CLI concretizer prefs.


@@ -425,9 +425,6 @@ def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12
         spec.concretize()
         assert spec.satisfies("cflags=-O2")
-    @pytest.mark.only_clingo(
-        "Optional compiler propagation isn't deprecated for original concretizer"
-    )
     @pytest.mark.parametrize(
         "spec_str,expected,not_expected",
         [
@@ -472,7 +469,6 @@ def test_compiler_inherited_upwards(self):
         for dep in spec.traverse():
             assert "%clang" in dep
-    @pytest.mark.only_clingo("Fixing the parser broke this test for the original concretizer")
     def test_architecture_deep_inheritance(self, mock_targets, compiler_factory):
         """Make sure that indirect dependencies receive architecture
         information from the root even when partial architecture information
@@ -537,9 +533,6 @@ def test_concretize_two_virtuals_with_dual_provider_and_a_conflict(self):
         with pytest.raises(spack.error.SpackError):
             s.concretize()
-    @pytest.mark.only_clingo(
-        "Optional compiler propagation isn't deprecated for original concretizer"
-    )
     @pytest.mark.parametrize(
         "spec_str,expected_propagation",
         [
@@ -567,9 +560,6 @@ def test_concretize_propagate_disabled_variant(self, spec_str, expected_propagat
         for key, expected_satisfies in expected_propagation:
             spec[key].satisfies(expected_satisfies)
-    @pytest.mark.only_clingo(
-        "Optional compiler propagation isn't deprecated for original concretizer"
-    )
     def test_concretize_propagated_variant_is_not_passed_to_dependent(self):
         """Test a package variant value was passed from its parent."""
         spec = Spec("ascent~~shared +adios2 ^adios2+shared")
@@ -578,9 +568,6 @@ def test_concretize_propagated_variant_is_not_passed_to_dependent(self):
         assert spec.satisfies("^adios2+shared")
         assert spec.satisfies("^bzip2~shared")
-    @pytest.mark.only_clingo(
-        "Optional compiler propagation isn't deprecated for original concretizer"
-    )
     def test_concretize_propagate_specified_variant(self):
         """Test that only the specified variant is propagated to the dependencies"""
         spec = Spec("parent-foo-bar ~~foo")
@@ -589,7 +576,6 @@ def test_concretize_propagate_specified_variant(self):
         assert spec.satisfies("~foo") and spec.satisfies("^dependency-foo-bar~foo")
         assert spec.satisfies("+bar") and not spec.satisfies("^dependency-foo-bar+bar")
-    @pytest.mark.only_clingo("Original concretizer is allowed to forego variant propagation")
     def test_concretize_propagate_multivalue_variant(self):
         """Test that multivalue variants are propagating the specified value(s)
         to their dependecies. The dependencies should not have the default value"""
@@ -646,11 +632,6 @@ def test_virtual_is_fully_expanded_for_mpileaks(self):
         assert all(not d.dependencies(name="mpi") for d in spec.traverse())
         assert all(x in spec for x in ("zmpi", "mpi"))
-    def test_my_dep_depends_on_provider_of_my_virtual_dep(self):
-        spec = Spec("indirect-mpich")
-        spec.normalize()
-        spec.concretize()
     @pytest.mark.parametrize("compiler_str", ["clang", "gcc", "gcc@10.2.1", "clang@:15.0.0"])
     def test_compiler_inheritance(self, compiler_str):
         spec_str = "mpileaks %{0}".format(compiler_str)
@@ -746,7 +727,6 @@ def test_conflicts_in_spec(self, conflict_spec):
         with pytest.raises(spack.error.SpackError):
             s.concretize()
-    @pytest.mark.only_clingo("Testing debug statements specific to new concretizer")
     def test_conflicts_show_cores(self, conflict_spec, monkeypatch):
         s = Spec(conflict_spec)
         with pytest.raises(spack.error.SpackError) as e:
@@ -920,7 +900,6 @@ def test_concretize_anonymous_dep(self, spec_str):
             ("bowtie@1.2.2 os=redhat6", "%gcc@11.1.0"),
         ],
     )
-    @pytest.mark.only_clingo("Original concretizer cannot work around conflicts")
     def test_compiler_conflicts_in_package_py(
         self, spec_str, expected_str, clang12_with_flags, gcc11_with_flags
     ):
@@ -1036,7 +1015,6 @@ def test_patching_dependencies(self, spec_str, patched_deps):
             ("quantum-espresso~veritas", ["^libelf@0.8.13"]),
         ],
     )
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_working_around_conflicting_defaults(self, spec_str, expected):
         s = Spec(spec_str).concretized()
@@ -1049,7 +1027,6 @@ def test_working_around_conflicting_defaults(self, spec_str, expected):
         "spec_str,expected",
         [("cmake", ["%clang"]), ("cmake %gcc", ["%gcc"]), ("cmake %clang", ["%clang"])],
     )
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_external_package_and_compiler_preferences(self, spec_str, expected, mutable_config):
         packages_yaml = {
             "all": {"compiler": ["clang", "gcc"]},
@@ -1066,7 +1043,6 @@ def test_external_package_and_compiler_preferences(self, spec_str, expected, mut
             assert s.satisfies(condition)
     @pytest.mark.regression("5651")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_package_with_constraint_not_met_by_external(self):
         """Check that if we have an external package A at version X.Y in
         packages.yaml, but our spec doesn't allow X.Y as a version, then
@@ -1081,7 +1057,6 @@ def test_package_with_constraint_not_met_by_external(self):
         assert not s["libelf"].external
     @pytest.mark.regression("9744")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_cumulative_version_ranges_with_different_length(self):
         s = Spec("cumulative-vrange-root").concretized()
         assert s.concrete
@@ -1109,7 +1084,6 @@ def test_dependency_conditional_on_another_dependency_state(self):
     @pytest.mark.parametrize(
         "spec_str,expected", [("cmake %gcc", "%gcc"), ("cmake %clang", "%clang")]
     )
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_compiler_constraint_with_external_package(self, spec_str, expected):
         packages_yaml = {
             "cmake": {"externals": [{"spec": "cmake@3.4.3", "prefix": "/usr"}], "buildable": False}
@@ -1154,18 +1128,9 @@ def test_compiler_in_nonbuildable_external_package(
         spack.config.set("packages", packages_yaml)
         s = Spec(spec_str).concretized()
-        if xfailold and spack.config.get("config:concretizer") == "original":
-            pytest.xfail("This only works on the ASP-based concretizer")
         assert s.satisfies(expected)
         assert "external-common-perl" not in [d.name for d in s.dependencies()]
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
-    def test_external_packages_have_consistent_hash(self):
-        s, t = Spec("externaltool"), Spec("externaltool")
-        s._old_concretize(), t._new_concretize()
-        assert s.dag_hash() == t.dag_hash()
     def test_external_that_would_require_a_virtual_dependency(self):
         s = Spec("requires-virtual").concretized()
@@ -1182,7 +1147,6 @@ def test_transitive_conditional_virtual_dependency(self, mutable_config):
         assert s.satisfies("^[virtuals=stuff] externalvirtual")
     @pytest.mark.regression("20040")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_conditional_provides_or_depends_on(self):
         # Check that we can concretize correctly a spec that can either
         # provide a virtual or depend on it based on the value of a variant
@@ -1221,7 +1185,6 @@ def test_activating_test_dependencies(self, spec_str, tests_arg, with_dep, witho
             assert not node.dependencies(deptype="test"), msg.format(pkg_name)
     @pytest.mark.regression("20019")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_compiler_match_is_preferred_to_newer_version(self, compiler_factory):
         # This spec depends on openblas. Openblas has a conflict
         # that doesn't allow newer versions with gcc@4.4.0. Check
@@ -1240,7 +1203,6 @@ def test_target_ranges_in_conflicts(self):
         with pytest.raises(spack.error.SpackError):
             Spec("impossible-concretization").concretized()
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_target_compatibility(self):
         with pytest.raises(spack.error.SpackError):
             Spec("libdwarf target=x86_64 ^libelf target=x86_64_v2").concretized()
@@ -1257,7 +1219,6 @@ def test_variant_not_default(self):
         assert "+foo+bar+baz" in d
     @pytest.mark.regression("20055")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_custom_compiler_version(self, mutable_config, compiler_factory, monkeypatch):
         mutable_config.set(
             "compilers", [compiler_factory(spec="gcc@10foo", operating_system="redhat6")]
@@ -1358,7 +1319,6 @@ def mock_fn(*args, **kwargs):
             {"add_variant": True, "delete_variant": True},
         ],
     )
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_reuse_installed_packages_when_package_def_changes(
         self, context, mutable_database, repo_with_changing_recipe
     ):
@@ -1388,7 +1348,6 @@ def test_reuse_installed_packages_when_package_def_changes(
         # Structure and package hash will be different without reuse
         assert root.dag_hash() != new_root_without_reuse.dag_hash()
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     @pytest.mark.regression("43663")
     def test_no_reuse_when_variant_condition_does_not_hold(self, mutable_database, mock_packages):
         spack.config.set("concretizer:reuse", True)
@@ -1404,7 +1363,6 @@ def test_no_reuse_when_variant_condition_does_not_hold(self, mutable_database, m
         new2 = Spec("conditional-variant-pkg +two_whens").concretized()
         assert new2.satisfies("@2 +two_whens +version_based")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_reuse_with_flags(self, mutable_database, mutable_config):
         spack.config.set("concretizer:reuse", True)
         spec = Spec("pkg-a cflags=-g cxxflags=-g").concretized()
@@ -1425,7 +1383,6 @@ def test_concretization_of_test_dependencies(self):
     @pytest.mark.parametrize(
         "spec_str", ["wrong-variant-in-conflicts", "wrong-variant-in-depends-on"]
     )
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_error_message_for_inconsistent_variants(self, spec_str):
         s = Spec(spec_str)
         with pytest.raises(RuntimeError, match="not found in package"):
@@ -1530,7 +1487,6 @@ def test_multivalued_variants_from_cli(self, spec_str, expected_dict):
             ("deprecated-versions@=1.1.0", "deprecated-versions@1.1.0"),
         ],
     )
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_deprecated_versions_not_selected(self, spec_str, expected):
         with spack.config.override("config:deprecated", True):
             s = Spec(spec_str).concretized()
@@ -1591,7 +1547,6 @@ def test_non_default_provider_of_multiple_virtuals(self):
         "spec_str,expect_installed",
         [("mpich", True), ("mpich+debug", False), ("mpich~debug", True)],
     )
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_concrete_specs_are_not_modified_on_reuse(
         self, mutable_database, spec_str, expect_installed
     ):
@@ -1605,7 +1560,6 @@ def test_concrete_specs_are_not_modified_on_reuse(
         assert s.satisfies(spec_str)
     @pytest.mark.regression("26721,19736")
-    @pytest.mark.only_clingo("Original concretizer cannot use sticky variants")
     def test_sticky_variant_in_package(self):
         # Here we test that a sticky variant cannot be changed from its default value
         # by the ASP solver if not set explicitly. The package used in the test needs
@@ -1621,7 +1575,6 @@ def test_sticky_variant_in_package(self):
         assert s.satisfies("%clang") and s.satisfies("~allow-gcc")
     @pytest.mark.regression("42172")
-    @pytest.mark.only_clingo("Original concretizer cannot use sticky variants")
     @pytest.mark.parametrize(
         "spec,allow_gcc",
         [
@@ -1644,7 +1597,6 @@ def test_sticky_variant_in_external(self, spec, allow_gcc):
         assert s["sticky-variant"].satisfies("+allow-gcc")
         assert s["sticky-variant"].external
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_do_not_invent_new_concrete_versions_unless_necessary(self):
         # ensure we select a known satisfying version rather than creating
         # a new '2.7' version.
@@ -1666,14 +1618,12 @@ def test_do_not_invent_new_concrete_versions_unless_necessary(self):
             ("conditional-values-in-variant foo=foo", True),
         ],
     )
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_conditional_values_in_variants(self, spec_str, valid):
         s = Spec(spec_str)
         raises = pytest.raises((RuntimeError, spack.error.UnsatisfiableSpecError))
         with llnl.util.lang.nullcontext() if valid else raises:
             s.concretize()
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_conditional_values_in_conditional_variant(self):
         """Test that conditional variants play well with conditional possible values"""
         s = Spec("conditional-values-in-variant@1.50.0").concretized()
@@ -1682,7 +1632,6 @@ def test_conditional_values_in_conditional_variant(self):
         s = Spec("conditional-values-in-variant@1.60.0").concretized()
         assert "cxxstd" in s.variants
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_target_granularity(self):
         # The test architecture uses core2 as the default target. Check that when
        # we configure Spack for "generic" granularity we concretize for x86_64
@@ -1693,7 +1642,6 @@ def test_target_granularity(self):
         with spack.config.override("concretizer:targets", {"granularity": "generic"}):
             assert s.concretized().satisfies("target=%s" % generic_target)
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_host_compatible_concretization(self):
         # Check that after setting "host_compatible" to false we cannot concretize.
         # Here we use "k10" to set a target non-compatible with the current host
@@ -1706,7 +1654,6 @@ def test_host_compatible_concretization(self):
         with pytest.raises(spack.error.SpackError):
             s.concretized()
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_add_microarchitectures_on_explicit_request(self):
         # Check that if we consider only "generic" targets, we can still solve for
         # specific microarchitectures on explicit requests
@@ -1715,7 +1662,6 @@ def test_add_microarchitectures_on_explicit_request(self):
         assert s.satisfies("target=k10")
     @pytest.mark.regression("29201")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_delete_version_and_reuse(self, mutable_database, repo_with_changing_recipe):
         """Test that we can reuse installed specs with versions not
         declared in package.py
@@ -1730,7 +1676,6 @@ def test_delete_version_and_reuse(self, mutable_database, repo_with_changing_rec
         assert root.dag_hash() == new_root.dag_hash()
     @pytest.mark.regression("29201")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_installed_version_is_selected_only_for_reuse(
         self, mutable_database, repo_with_changing_recipe
     ):
@@ -1808,7 +1753,6 @@ def test_reuse_with_unknown_package_dont_raise(self, tmpdir, temporary_store, mo
             (["mpi", "mpich"], 1, 1),
         ],
     )
-    @pytest.mark.only_clingo("Original concretizer cannot concretize in rounds")
     def test_best_effort_coconcretize(self, specs, expected, libc_offset):
         specs = [Spec(s) for s in specs]
         solver = spack.solver.asp.Solver()
@@ -1853,7 +1797,6 @@ def test_best_effort_coconcretize(self, specs, expected, libc_offset):
             (["hdf5+mpi", "zmpi", "mpich"], "mpich", 2),
         ],
     )
-    @pytest.mark.only_clingo("Original concretizer cannot concretize in rounds")
     def test_best_effort_coconcretize_preferences(self, specs, expected_spec, occurances):
         """Test package preferences during coconcretization."""
         specs = [Spec(s) for s in specs]
@@ -1869,7 +1812,6 @@ def test_best_effort_coconcretize_preferences(self, specs, expected_spec, occura
             counter += 1
         assert counter == occurances, concrete_specs
-    @pytest.mark.only_clingo("Original concretizer cannot concretize in rounds")
     def test_solve_in_rounds_all_unsolved(self, monkeypatch, mock_packages):
         specs = [Spec(x) for x in ["libdwarf%gcc", "libdwarf%clang"]]
         solver = spack.solver.asp.Solver()
@@ -1885,7 +1827,6 @@ def test_solve_in_rounds_all_unsolved(self, monkeypatch, mock_packages):
         ):
             list(solver.solve_in_rounds(specs))
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_coconcretize_reuse_and_virtuals(self):
         reusable_specs = []
         for s in ["mpileaks ^mpich", "zmpi"]:
@@ -1902,7 +1843,6 @@ def test_coconcretize_reuse_and_virtuals(self):
         assert "zmpi" in spec
     @pytest.mark.regression("30864")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_misleading_error_message_on_version(self, mutable_database):
         # For this bug to be triggered we need a reusable dependency
         # that is not optimal in terms of optimization scores.
@@ -1919,7 +1859,6 @@ def test_misleading_error_message_on_version(self, mutable_database):
             solver.driver.solve(setup, [root_spec], reuse=reusable_specs)
     @pytest.mark.regression("31148")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_version_weight_and_provenance(self):
         """Test package preferences during coconcretization."""
         reusable_specs = [Spec(spec_str).concretized() for spec_str in ("pkg-b@0.9", "pkg-b@1.0")]
@@ -1951,7 +1890,6 @@ def test_version_weight_and_provenance(self):
             assert criterion in result.criteria, criterion
         assert result_spec.satisfies("^pkg-b@1.0")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_reuse_succeeds_with_config_compatible_os(self):
         root_spec = Spec("pkg-b")
         s = root_spec.concretized()
@@ -1977,7 +1915,6 @@ def test_git_hash_assigned_version_is_preferred(self):
         assert hash in str(c)
     @pytest.mark.parametrize("git_ref", ("a" * 40, "0.2.15", "main"))
-    @pytest.mark.only_clingo("Original concretizer cannot account for git hashes")
     def test_git_ref_version_is_equivalent_to_specified_version(self, git_ref):
         s = Spec("develop-branch-version@git.%s=develop" % git_ref)
c = s.concretized() c = s.concretized()
@ -1987,7 +1924,6 @@ def test_git_ref_version_is_equivalent_to_specified_version(self, git_ref):
assert s.satisfies("@0.1:") assert s.satisfies("@0.1:")
@pytest.mark.parametrize("git_ref", ("a" * 40, "0.2.15", "fbranch")) @pytest.mark.parametrize("git_ref", ("a" * 40, "0.2.15", "fbranch"))
@pytest.mark.only_clingo("Original concretizer cannot account for git hashes")
def test_git_ref_version_succeeds_with_unknown_version(self, git_ref): def test_git_ref_version_succeeds_with_unknown_version(self, git_ref):
# main is not defined in the package.py for this file # main is not defined in the package.py for this file
s = Spec("develop-branch-version@git.%s=main" % git_ref) s = Spec("develop-branch-version@git.%s=main" % git_ref)
@ -1995,7 +1931,6 @@ def test_git_ref_version_succeeds_with_unknown_version(self, git_ref):
assert s.satisfies("develop-branch-version@main") assert s.satisfies("develop-branch-version@main")
@pytest.mark.regression("31484") @pytest.mark.regression("31484")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_installed_externals_are_reused( def test_installed_externals_are_reused(
self, mutable_database, repo_with_changing_recipe, tmp_path self, mutable_database, repo_with_changing_recipe, tmp_path
): ):
@ -2027,7 +1962,6 @@ def test_installed_externals_are_reused(
assert external3.dag_hash() == external1.dag_hash() assert external3.dag_hash() == external1.dag_hash()
@pytest.mark.regression("31484") @pytest.mark.regression("31484")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_user_can_select_externals_with_require(self, mutable_database, tmp_path): def test_user_can_select_externals_with_require(self, mutable_database, tmp_path):
"""Test that users have means to select an external even in presence of reusable specs.""" """Test that users have means to select an external even in presence of reusable specs."""
external_conf = { external_conf = {
@ -2056,7 +1990,6 @@ def test_user_can_select_externals_with_require(self, mutable_database, tmp_path
assert mpi_spec.name == "multi-provider-mpi" assert mpi_spec.name == "multi-provider-mpi"
@pytest.mark.regression("31484") @pytest.mark.regression("31484")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_installed_specs_disregard_conflicts(self, mutable_database, monkeypatch): def test_installed_specs_disregard_conflicts(self, mutable_database, monkeypatch):
"""Test that installed specs do not trigger conflicts. This covers the rare case """Test that installed specs do not trigger conflicts. This covers the rare case
where a conflict is added on a package after a spec matching the conflict was installed. where a conflict is added on a package after a spec matching the conflict was installed.
@ -2077,7 +2010,6 @@ def test_installed_specs_disregard_conflicts(self, mutable_database, monkeypatch
assert s.satisfies("~debug"), s assert s.satisfies("~debug"), s
@pytest.mark.regression("32471") @pytest.mark.regression("32471")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_require_targets_are_allowed(self, mutable_database): def test_require_targets_are_allowed(self, mutable_database):
"""Test that users can set target constraints under the require attribute.""" """Test that users can set target constraints under the require attribute."""
# Configuration to be added to packages.yaml # Configuration to be added to packages.yaml
@ -2252,7 +2184,6 @@ def test_unsolved_specs_raises_error(self, monkeypatch, mock_packages):
solver.driver.solve(setup, specs, reuse=[]) solver.driver.solve(setup, specs, reuse=[])
@pytest.mark.regression("43141") @pytest.mark.regression("43141")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_clear_error_when_unknown_compiler_requested(self, mock_packages): def test_clear_error_when_unknown_compiler_requested(self, mock_packages):
"""Tests that the solver can report a case where the compiler cannot be set""" """Tests that the solver can report a case where the compiler cannot be set"""
with pytest.raises( with pytest.raises(
@ -2355,7 +2286,6 @@ def test_virtuals_are_annotated_on_edges(self, spec_str):
edges = spec.edges_to_dependencies(name="callpath") edges = spec.edges_to_dependencies(name="callpath")
assert len(edges) == 1 and edges[0].virtuals == () assert len(edges) == 1 and edges[0].virtuals == ()
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
@pytest.mark.db @pytest.mark.db
@pytest.mark.parametrize( @pytest.mark.parametrize(
"spec_str,mpi_name", "spec_str,mpi_name",
@ -2370,7 +2300,6 @@ def test_virtuals_are_reconstructed_on_reuse(self, spec_str, mpi_name, mutable_d
assert len(mpi_edges) == 1 assert len(mpi_edges) == 1
assert "mpi" in mpi_edges[0].virtuals assert "mpi" in mpi_edges[0].virtuals
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_dont_define_new_version_from_input_if_checksum_required(self, working_env): def test_dont_define_new_version_from_input_if_checksum_required(self, working_env):
os.environ["SPACK_CONCRETIZER_REQUIRE_CHECKSUM"] = "yes" os.environ["SPACK_CONCRETIZER_REQUIRE_CHECKSUM"] = "yes"
with pytest.raises(spack.error.UnsatisfiableSpecError): with pytest.raises(spack.error.UnsatisfiableSpecError):
@ -2407,7 +2336,6 @@ def test_reuse_python_from_cli_and_extension_from_db(self, mutable_database):
("hdf5 ^gmake", {"gmake": "duplicates.test", "hdf5": "duplicates.test"}), ("hdf5 ^gmake", {"gmake": "duplicates.test", "hdf5": "duplicates.test"}),
], ],
) )
@pytest.mark.only_clingo("Uses specs requiring multiple gmake specs")
def test_select_lower_priority_package_from_repository_stack( def test_select_lower_priority_package_from_repository_stack(
self, spec_str, expected_namespaces self, spec_str, expected_namespaces
): ):
@ -2423,7 +2351,6 @@ def test_select_lower_priority_package_from_repository_stack(
assert s[name].concrete assert s[name].concrete
assert s[name].namespace == namespace assert s[name].namespace == namespace
@pytest.mark.only_clingo("Old concretizer cannot reuse")
def test_reuse_specs_from_non_available_compilers(self, mutable_config, mutable_database): def test_reuse_specs_from_non_available_compilers(self, mutable_config, mutable_database):
"""Tests that we can reuse specs with compilers that are not configured locally.""" """Tests that we can reuse specs with compilers that are not configured locally."""
# All the specs in the mutable DB have been compiled with %gcc@=10.2.1 # All the specs in the mutable DB have been compiled with %gcc@=10.2.1
@ -2494,7 +2421,6 @@ def test_spec_with_build_dep_from_json(self, tmp_path):
assert s["dttop"].dag_hash() == build_dep.dag_hash() assert s["dttop"].dag_hash() == build_dep.dag_hash()
@pytest.mark.regression("44040") @pytest.mark.regression("44040")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_exclude_specs_from_reuse(self, monkeypatch): def test_exclude_specs_from_reuse(self, monkeypatch):
"""Tests that we can exclude a spec from reuse when concretizing, and that the spec """Tests that we can exclude a spec from reuse when concretizing, and that the spec
is not added back to the solve as a dependency of another reusable spec. is not added back to the solve as a dependency of another reusable spec.
@ -2544,7 +2470,6 @@ def test_exclude_specs_from_reuse(self, monkeypatch):
[], [],
], ],
) )
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_include_specs_from_externals_and_libcs( def test_include_specs_from_externals_and_libcs(
self, included_externals, mutable_config, tmp_path self, included_externals, mutable_config, tmp_path
): ):
@ -2577,7 +2502,6 @@ def test_include_specs_from_externals_and_libcs(
assert result["deprecated-versions"].satisfies("@1.0.0") assert result["deprecated-versions"].satisfies("@1.0.0")
@pytest.mark.regression("44085") @pytest.mark.regression("44085")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_can_reuse_concrete_externals_for_dependents(self, mutable_config, tmp_path): def test_can_reuse_concrete_externals_for_dependents(self, mutable_config, tmp_path):
"""Test that external specs that are in the DB can be reused. This means they are """Test that external specs that are in the DB can be reused. This means they are
preferred to concretizing another external from packages.yaml preferred to concretizing another external from packages.yaml
@ -2600,7 +2524,6 @@ def test_can_reuse_concrete_externals_for_dependents(self, mutable_config, tmp_p
sombrero = result.specs[0] sombrero = result.specs[0]
assert sombrero["externaltool"].dag_hash() == external_spec.dag_hash() assert sombrero["externaltool"].dag_hash() == external_spec.dag_hash()
@pytest.mark.only_clingo("Original concretizer cannot reuse")
def test_cannot_reuse_host_incompatible_libc(self): def test_cannot_reuse_host_incompatible_libc(self):
"""Test whether reuse concretization correctly fails to reuse a spec with a host """Test whether reuse concretization correctly fails to reuse a spec with a host
incompatible libc.""" incompatible libc."""
@ -2681,7 +2604,6 @@ def duplicates_test_repository():
@pytest.mark.usefixtures("mutable_config", "duplicates_test_repository") @pytest.mark.usefixtures("mutable_config", "duplicates_test_repository")
@pytest.mark.only_clingo("Not supported by the original concretizer")
class TestConcretizeSeparately: class TestConcretizeSeparately:
"""Collects test on separate concretization""" """Collects test on separate concretization"""
@ -2900,7 +2822,6 @@ def edges_test_repository():
@pytest.mark.usefixtures("mutable_config", "edges_test_repository") @pytest.mark.usefixtures("mutable_config", "edges_test_repository")
@pytest.mark.only_clingo("Edge properties not supported by the original concretizer")
class TestConcretizeEdges: class TestConcretizeEdges:
"""Collects tests on edge properties""" """Collects tests on edge properties"""
@ -3051,7 +2972,6 @@ def test_concretization_version_order():
] ]
@pytest.mark.only_clingo("Original concretizer cannot reuse specs")
@pytest.mark.parametrize( @pytest.mark.parametrize(
"roots,reuse_yaml,expected,not_expected,expected_length", "roots,reuse_yaml,expected,not_expected,expected_length",
[ [
@ -3126,7 +3046,6 @@ def test_spec_filters(specs, include, exclude, expected):
assert f.selected_specs() == expected assert f.selected_specs() == expected
@pytest.mark.only_clingo("clingo only reuse feature being tested")
@pytest.mark.regression("38484") @pytest.mark.regression("38484")
def test_git_ref_version_can_be_reused(install_mockery, do_not_check_runtimes_on_reuse): def test_git_ref_version_can_be_reused(install_mockery, do_not_check_runtimes_on_reuse):
first_spec = spack.spec.Spec("git-ref-package@git.2.1.5=2.1.5~opt").concretized() first_spec = spack.spec.Spec("git-ref-package@git.2.1.5=2.1.5~opt").concretized()
@ -3144,7 +3063,6 @@ def test_git_ref_version_can_be_reused(install_mockery, do_not_check_runtimes_on
assert third_spec.dag_hash() == first_spec.dag_hash() assert third_spec.dag_hash() == first_spec.dag_hash()
@pytest.mark.only_clingo("clingo only reuse feature being tested")
@pytest.mark.parametrize("standard_version", ["2.0.0", "2.1.5", "2.1.6"]) @pytest.mark.parametrize("standard_version", ["2.0.0", "2.1.5", "2.1.6"])
def test_reuse_prefers_standard_over_git_versions( def test_reuse_prefers_standard_over_git_versions(
standard_version, install_mockery, do_not_check_runtimes_on_reuse standard_version, install_mockery, do_not_check_runtimes_on_reuse

View File

@ -17,10 +17,7 @@
from spack.environment.environment import ViewDescriptor from spack.environment.environment import ViewDescriptor
from spack.version import Version from spack.version import Version
pytestmark = [ pytestmark = [pytest.mark.usefixtures("enable_runtimes")]
pytest.mark.only_clingo("Original concretizer does not support compiler runtimes"),
pytest.mark.usefixtures("enable_runtimes"),
]
def _concretize_with_reuse(*, root_str, reused_str): def _concretize_with_reuse(*, root_str, reused_str):

View File

@ -8,10 +8,7 @@
import spack.solver.asp import spack.solver.asp
import spack.spec import spack.spec
pytestmark = [ pytestmark = [pytest.mark.not_on_windows("Windows uses old concretizer")]
pytest.mark.not_on_windows("Windows uses old concretizer"),
pytest.mark.only_clingo("Original concretizer does not support configuration requirements"),
]
version_error_messages = [ version_error_messages = [
"Cannot satisfy 'fftw@:1.0' and 'fftw@1.1:", "Cannot satisfy 'fftw@:1.0' and 'fftw@1.1:",

View File

@ -113,7 +113,6 @@ def test_preferred_compilers(self, compiler_str, spec_str):
spec = spack.spec.Spec(spec_str).concretized() spec = spack.spec.Spec(spec_str).concretized()
assert spec.compiler == CompilerSpec(compiler_str) assert spec.compiler == CompilerSpec(compiler_str)
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_preferred_target(self, mutable_mock_repo): def test_preferred_target(self, mutable_mock_repo):
"""Test preferred targets are applied correctly""" """Test preferred targets are applied correctly"""
spec = concretize("mpich") spec = concretize("mpich")
@ -143,7 +142,6 @@ def test_preferred_versions(self):
spec = concretize("mpileaks") spec = concretize("mpileaks")
assert spec.version == Version("2.2") assert spec.version == Version("2.2")
@pytest.mark.only_clingo("This behavior is not enforced for the old concretizer")
def test_preferred_versions_mixed_version_types(self): def test_preferred_versions_mixed_version_types(self):
update_packages("mixedversions", "version", ["=2.0"]) update_packages("mixedversions", "version", ["=2.0"])
spec = concretize("mixedversions") spec = concretize("mixedversions")
@ -225,7 +223,6 @@ def test_preferred(self):
spec.concretize() spec.concretize()
assert spec.version == Version("3.5.0") assert spec.version == Version("3.5.0")
@pytest.mark.only_clingo("This behavior is not enforced for the old concretizer")
def test_preferred_undefined_raises(self): def test_preferred_undefined_raises(self):
"""Preference should not specify an undefined version""" """Preference should not specify an undefined version"""
update_packages("python", "version", ["3.5.0.1"]) update_packages("python", "version", ["3.5.0.1"])
@ -233,7 +230,6 @@ def test_preferred_undefined_raises(self):
with pytest.raises(spack.config.ConfigError): with pytest.raises(spack.config.ConfigError):
spec.concretize() spec.concretize()
@pytest.mark.only_clingo("This behavior is not enforced for the old concretizer")
def test_preferred_truncated(self): def test_preferred_truncated(self):
"""Versions without "=" are treated as version ranges: if there is """Versions without "=" are treated as version ranges: if there is
a satisfying version defined in the package.py, we should use that a satisfying version defined in the package.py, we should use that
@ -510,7 +506,6 @@ def test_sticky_variant_accounts_for_packages_yaml(self):
assert s.satisfies("%gcc") and s.satisfies("+allow-gcc") assert s.satisfies("%gcc") and s.satisfies("+allow-gcc")
@pytest.mark.regression("41134") @pytest.mark.regression("41134")
@pytest.mark.only_clingo("Not backporting the fix to the old concretizer")
def test_default_preference_variant_different_type_does_not_error(self): def test_default_preference_variant_different_type_does_not_error(self):
"""Tests that a different type for an existing variant in the 'all:' section of """Tests that a different type for an existing variant in the 'all:' section of
packages.yaml doesn't fail with an error. packages.yaml doesn't fail with an error.

View File

@ -19,10 +19,7 @@
from spack.test.conftest import create_test_repo from spack.test.conftest import create_test_repo
from spack.util.url import path_to_file_url from spack.util.url import path_to_file_url
pytestmark = [ pytestmark = [pytest.mark.not_on_windows("Windows uses old concretizer")]
pytest.mark.not_on_windows("Windows uses old concretizer"),
pytest.mark.only_clingo("Original concretizer does not support configuration requirements"),
]
def update_packages_config(conf_str): def update_packages_config(conf_str):

View File

@ -306,14 +306,14 @@ def test_add_config_path(mutable_config):
@pytest.mark.regression("17543,23259") @pytest.mark.regression("17543,23259")
def test_add_config_path_with_enumerated_type(mutable_config): def test_add_config_path_with_enumerated_type(mutable_config):
spack.config.add("config:concretizer:clingo") spack.config.add("config:flags:keep_werror:all")
assert spack.config.get("config")["concretizer"] == "clingo" assert spack.config.get("config")["flags"]["keep_werror"] == "all"
spack.config.add("config:concretizer:original") spack.config.add("config:flags:keep_werror:specific")
assert spack.config.get("config")["concretizer"] == "original" assert spack.config.get("config")["flags"]["keep_werror"] == "specific"
with pytest.raises(spack.config.ConfigError): with pytest.raises(spack.config.ConfigError):
spack.config.add("config:concretizer:foo") spack.config.add("config:flags:keep_werror:foo")
def test_add_config_filename(mock_low_high_config, tmpdir): def test_add_config_filename(mock_low_high_config, tmpdir):
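For context, the updated test swaps the removed `config:concretizer` enum for `config:flags:keep_werror`, another enumerated option, so the enumerated-type code path stays covered. Enumerated-value validation of this shape can be sketched with stdlib Python; the helper and the inclusion of `"none"` in the allowed set are illustrative assumptions, not Spack's actual config API:

```python
# Illustrative helper, not Spack's config API: reject values outside an
# enumerated set, as the ConfigError assertion above expects for "foo".
ALLOWED_KEEP_WERROR = {"all", "specific", "none"}  # "none" is assumed here

def add_keep_werror(config, value):
    """Set flags:keep_werror in a plain dict config, validating the value."""
    if value not in ALLOWED_KEEP_WERROR:
        raise ValueError(f"invalid keep_werror value: {value!r}")
    config.setdefault("flags", {})["keep_werror"] = value
    return config
```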

View File

@ -704,11 +704,10 @@ def configuration_dir(tmpdir_factory, linux_os):
tmpdir.ensure("user", dir=True) tmpdir.ensure("user", dir=True)
# Fill out config.yaml, compilers.yaml and modules.yaml templates. # Fill out config.yaml, compilers.yaml and modules.yaml templates.
solver = os.environ.get("SPACK_TEST_SOLVER", "clingo")
locks = sys.platform != "win32" locks = sys.platform != "win32"
config = tmpdir.join("site", "config.yaml") config = tmpdir.join("site", "config.yaml")
config_template = test_config / "config.yaml" config_template = test_config / "config.yaml"
config.write(config_template.read_text().format(install_tree_root, solver, locks)) config.write(config_template.read_text().format(install_tree_root, locks))
target = str(archspec.cpu.host().family) target = str(archspec.cpu.host().family)
compilers = tmpdir.join("site", "compilers.yaml") compilers = tmpdir.join("site", "compilers.yaml")
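The template fill above is plain `str.format` with positional fields: after this change the `config.yaml` template carries two placeholders (install tree root and locks) instead of three, since the solver field is gone. A stdlib sketch, with illustrative paths and an abbreviated template:

```python
# Sketch of the two-field template fill after the change above:
# {0} is the install tree root and {1} the locks flag.
TEMPLATE = """\
config:
  install_tree:
    root: {0}
  locks: {1}
"""

def render_config(install_tree_root, locks):
    # Positional formatting, mirroring config_template.read_text().format(...)
    return TEMPLATE.format(install_tree_root, locks)
```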
@ -1956,16 +1955,6 @@ def nullify_globals(request, monkeypatch):
def pytest_runtest_setup(item): def pytest_runtest_setup(item):
# Skip tests if they are marked only clingo and are run with the original concretizer
only_clingo_marker = item.get_closest_marker(name="only_clingo")
if only_clingo_marker and os.environ.get("SPACK_TEST_SOLVER") == "original":
pytest.skip(*only_clingo_marker.args)
# Skip tests if they are marked only original and are run with clingo
only_original_marker = item.get_closest_marker(name="only_original")
if only_original_marker and os.environ.get("SPACK_TEST_SOLVER", "clingo") == "clingo":
pytest.skip(*only_original_marker.args)
# Skip test marked "not_on_windows" if they're run on Windows # Skip test marked "not_on_windows" if they're run on Windows
not_on_windows_marker = item.get_closest_marker(name="not_on_windows") not_on_windows_marker = item.get_closest_marker(name="not_on_windows")
if not_on_windows_marker and sys.platform == "win32": if not_on_windows_marker and sys.platform == "win32":
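With the `SPACK_TEST_SOLVER` branches gone, the hook keeps only marker-based skipping like the `not_on_windows` check above. The pattern can be sketched without pytest; `Skipped` and `Item` below are stand-ins for pytest's own types, not real APIs:

```python
import sys

class Skipped(Exception):
    """Stand-in for pytest's skip outcome."""

class Item:
    """Stand-in for a pytest test item carrying markers."""
    def __init__(self, markers):
        self._markers = markers  # marker name -> args tuple

    def get_closest_marker(self, name):
        return self._markers.get(name)

def runtest_setup(item, platform=sys.platform):
    # Mirrors the surviving hook logic: skip "not_on_windows" tests on win32.
    marker = item.get_closest_marker("not_on_windows")
    if marker is not None and platform == "win32":
        raise Skipped(*marker)
```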

View File

@ -13,5 +13,4 @@ config:
ssl_certs: $SSL_CERT_FILE ssl_certs: $SSL_CERT_FILE
checksum: true checksum: true
dirty: false dirty: false
concretizer: {1} locks: {1}
locks: {2}

View File

@ -573,9 +573,6 @@ def test_conflicts_with_packages_that_are_not_dependencies(
"""Tests that we cannot concretize two specs together, if one conflicts with the other, """Tests that we cannot concretize two specs together, if one conflicts with the other,
even though they don't have a dependency relation. even though they don't have a dependency relation.
""" """
if spack.config.get("config:concretizer") == "original":
pytest.xfail("Known failure of the original concretizer")
manifest = tmp_path / "spack.yaml" manifest = tmp_path / "spack.yaml"
manifest.write_text( manifest.write_text(
f"""\ f"""\
@ -597,7 +594,6 @@ def test_conflicts_with_packages_that_are_not_dependencies(
@pytest.mark.regression("39455") @pytest.mark.regression("39455")
@pytest.mark.only_clingo("Known failure of the original concretizer")
@pytest.mark.parametrize( @pytest.mark.parametrize(
"possible_mpi_spec,unify", [("mpich", False), ("mpich", True), ("zmpi", False), ("zmpi", True)] "possible_mpi_spec,unify", [("mpich", False), ("mpich", True), ("zmpi", False), ("zmpi", True)]
) )
@ -698,7 +694,6 @@ def test_removing_spec_from_manifest_with_exact_duplicates(
@pytest.mark.regression("35298") @pytest.mark.regression("35298")
@pytest.mark.only_clingo("Propagation not supported in the original concretizer")
def test_variant_propagation_with_unify_false(tmp_path, mock_packages, config): def test_variant_propagation_with_unify_false(tmp_path, mock_packages, config):
"""Spack distributes concretizations to different processes, when unify:false is selected and """Spack distributes concretizations to different processes, when unify:false is selected and
the number of roots is 2 or more. When that happens, the specs to be concretized need to be the number of roots is 2 or more. When that happens, the specs to be concretized need to be
@ -814,7 +809,6 @@ def test_deconcretize_then_concretize_does_not_error(mutable_mock_env_path, mock
@pytest.mark.regression("44216") @pytest.mark.regression("44216")
@pytest.mark.only_clingo()
def test_root_version_weights_for_old_versions(mutable_mock_env_path, mock_packages): def test_root_version_weights_for_old_versions(mutable_mock_env_path, mock_packages):
"""Tests that, when we select two old versions of root specs that have the same version """Tests that, when we select two old versions of root specs that have the same version
optimization penalty, both are considered. optimization penalty, both are considered.

View File

@ -9,33 +9,6 @@
import spack.spec import spack.spec
def test_static_graph_mpileaks(config, mock_packages):
"""Test a static spack graph for a simple package."""
s = spack.spec.Spec("mpileaks").normalized()
stream = io.StringIO()
spack.graph.static_graph_dot([s], out=stream)
dot = stream.getvalue()
assert ' "mpileaks" [label="mpileaks"]\n' in dot
assert ' "dyninst" [label="dyninst"]\n' in dot
assert ' "callpath" [label="callpath"]\n' in dot
assert ' "libelf" [label="libelf"]\n' in dot
assert ' "libdwarf" [label="libdwarf"]\n' in dot
mpi_providers = spack.repo.PATH.providers_for("mpi")
for spec in mpi_providers:
assert ('"mpileaks" -> "%s"' % spec.name) in dot
assert ('"callpath" -> "%s"' % spec.name) in dot
assert ' "dyninst" -> "libdwarf"\n' in dot
assert ' "callpath" -> "dyninst"\n' in dot
assert ' "libdwarf" -> "libelf"\n' in dot
assert ' "mpileaks" -> "callpath"\n' in dot
assert ' "dyninst" -> "libelf"\n' in dot
def test_dynamic_dot_graph_mpileaks(default_mock_concretization): def test_dynamic_dot_graph_mpileaks(default_mock_concretization):
"""Test dynamically graphing the mpileaks package.""" """Test dynamically graphing the mpileaks package."""
s = default_mock_concretization("mpileaks") s = default_mock_concretization("mpileaks")
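The removed static-graph test asserted on DOT edge lines of the form `"a" -> "b"` collected via `io.StringIO`. Emitting such a digraph is straightforward; the helper below is an illustrative sketch, not the removed `spack.graph.static_graph_dot`:

```python
import io

def write_dot_edges(edges, out):
    # Emit a minimal DOT digraph with one quoted edge per dependency,
    # matching the '"a" -> "b"' lines the removed test asserted on.
    out.write("digraph G {\n")
    for parent, child in edges:
        out.write(f'  "{parent}" -> "{child}"\n')
    out.write("}\n")
```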

View File

@ -14,7 +14,6 @@
pytestmark = [ pytestmark = [
pytest.mark.usefixtures("mock_packages", "config"), pytest.mark.usefixtures("mock_packages", "config"),
pytest.mark.only_clingo("The original concretizer cannot concretize most of the specs"),
pytest.mark.not_on_windows("Not running on windows"), pytest.mark.not_on_windows("Not running on windows"),
] ]

View File

@ -72,13 +72,6 @@ def spec_and_expected(request):
return spec, Spec.from_literal(d) return spec, Spec.from_literal(d)
def test_normalize(spec_and_expected, config, mock_packages):
spec, expected = spec_and_expected
spec = Spec(spec)
spec.normalize()
assert spec.eq_dag(expected, deptypes=False)
def test_default_variant(config, mock_packages): def test_default_variant(config, mock_packages):
spec = Spec("optional-dep-test-3") spec = Spec("optional-dep-test-3")
spec.concretize() spec.concretize()

View File

@ -82,7 +82,6 @@ def test_test_deptype(tmpdir):
@pytest.mark.usefixtures("config") @pytest.mark.usefixtures("config")
@pytest.mark.only_clingo("fails with the original concretizer and full hashes")
def test_installed_deps(monkeypatch, mock_packages): def test_installed_deps(monkeypatch, mock_packages):
"""Ensure that concrete specs and their build deps don't constrain solves. """Ensure that concrete specs and their build deps don't constrain solves.
@ -183,14 +182,11 @@ def test_conflicting_package_constraints(self, set_dependency):
spec = Spec("mpileaks ^mpich ^callpath ^dyninst ^libelf ^libdwarf") spec = Spec("mpileaks ^mpich ^callpath ^dyninst ^libelf ^libdwarf")
# TODO: try to do something to show that the issue was with with pytest.raises(spack.error.UnsatisfiableSpecError):
# TODO: the user's input or with package inconsistencies. spec.concretize()
with pytest.raises(spack.spec.UnsatisfiableVersionSpecError):
spec.normalize()
def test_preorder_node_traversal(self): def test_preorder_node_traversal(self):
dag = Spec("mpileaks ^zmpi") dag = Spec("mpileaks ^zmpi").concretized()
dag.normalize()
names = ["mpileaks", "callpath", "dyninst", "libdwarf", "libelf", "zmpi", "fake"] names = ["mpileaks", "callpath", "dyninst", "libdwarf", "libelf", "zmpi", "fake"]
pairs = list(zip([0, 1, 2, 3, 4, 2, 3], names)) pairs = list(zip([0, 1, 2, 3, 4, 2, 3], names))
@ -202,8 +198,7 @@ def test_preorder_node_traversal(self):
assert [(x, y.name) for x, y in traversal] == pairs assert [(x, y.name) for x, y in traversal] == pairs
def test_preorder_edge_traversal(self): def test_preorder_edge_traversal(self):
dag = Spec("mpileaks ^zmpi") dag = Spec("mpileaks ^zmpi").concretized()
dag.normalize()
names = [ names = [
"mpileaks", "mpileaks",
@ -225,8 +220,7 @@ def test_preorder_edge_traversal(self):
assert [(x, y.name) for x, y in traversal] == pairs assert [(x, y.name) for x, y in traversal] == pairs
def test_preorder_path_traversal(self): def test_preorder_path_traversal(self):
dag = Spec("mpileaks ^zmpi") dag = Spec("mpileaks ^zmpi").concretized()
dag.normalize()
names = [ names = [
"mpileaks", "mpileaks",
@ -249,8 +243,7 @@ def test_preorder_path_traversal(self):
assert [(x, y.name) for x, y in traversal] == pairs assert [(x, y.name) for x, y in traversal] == pairs
def test_postorder_node_traversal(self): def test_postorder_node_traversal(self):
dag = Spec("mpileaks ^zmpi") dag = Spec("mpileaks ^zmpi").concretized()
dag.normalize()
names = ["libelf", "libdwarf", "dyninst", "fake", "zmpi", "callpath", "mpileaks"] names = ["libelf", "libdwarf", "dyninst", "fake", "zmpi", "callpath", "mpileaks"]
pairs = list(zip([4, 3, 2, 3, 2, 1, 0], names)) pairs = list(zip([4, 3, 2, 3, 2, 1, 0], names))
@ -262,8 +255,7 @@ def test_postorder_node_traversal(self):
assert [(x, y.name) for x, y in traversal] == pairs assert [(x, y.name) for x, y in traversal] == pairs
def test_postorder_edge_traversal(self): def test_postorder_edge_traversal(self):
dag = Spec("mpileaks ^zmpi") dag = Spec("mpileaks ^zmpi").concretized()
dag.normalize()
names = [ names = [
"libelf", "libelf",
@ -285,8 +277,7 @@ def test_postorder_edge_traversal(self):
assert [(x, y.name) for x, y in traversal] == pairs assert [(x, y.name) for x, y in traversal] == pairs
def test_postorder_path_traversal(self): def test_postorder_path_traversal(self):
dag = Spec("mpileaks ^zmpi") dag = Spec("mpileaks ^zmpi").concretized()
dag.normalize()
names = [ names = [
"libelf", "libelf",
@ -308,63 +299,6 @@ def test_postorder_path_traversal(self):
traversal = dag.traverse(cover="paths", depth=True, order="post") traversal = dag.traverse(cover="paths", depth=True, order="post")
assert [(x, y.name) for x, y in traversal] == pairs assert [(x, y.name) for x, y in traversal] == pairs
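The depth-annotated preorder traversal these updated tests exercise (yielding `(depth, node)` pairs, each node visited once) can be sketched in plain Python. The toy DAG below is shaped like the mock mpileaks graph so the expected pairs match the `[0, 1, 2, 3, 4, 2, 3]` depths above; the function is an illustrative stand-in, not Spack's `Spec.traverse`:

```python
# Illustrative sketch (not Spack's Spec.traverse): preorder traversal of a
# DAG yielding (depth, node) pairs, visiting each node at most once.
def traverse(node, children, depth=0, seen=None):
    seen = set() if seen is None else seen
    if node in seen:
        return
    seen.add(node)
    yield depth, node
    for child in children.get(node, ()):
        yield from traverse(child, children, depth + 1, seen)

# Toy DAG shaped like the mock mpileaks graph used in the tests above.
DAG = {
    "mpileaks": ["callpath"],
    "callpath": ["dyninst", "zmpi"],
    "dyninst": ["libdwarf", "libelf"],
    "libdwarf": ["libelf"],
    "zmpi": ["fake"],
}
```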
def test_conflicting_spec_constraints(self):
mpileaks = Spec("mpileaks ^mpich ^callpath ^dyninst ^libelf ^libdwarf")
# Normalize then add conflicting constraints to the DAG (this is an
# extremely unlikely scenario, but we test for it anyway)
mpileaks.normalize()
mpileaks.edges_to_dependencies(name="mpich")[0].spec = Spec("mpich@1.0")
mpileaks.edges_to_dependencies(name="callpath")[0].spec.edges_to_dependencies(
name="mpich"
)[0].spec = Spec("mpich@2.0")
with pytest.raises(spack.spec.InconsistentSpecError):
mpileaks.flat_dependencies()
def test_normalize_twice(self):
"""Make sure normalize can be run twice on the same spec,
and that it is idempotent."""
spec = Spec("mpileaks")
spec.normalize()
n1 = spec.copy()
spec.normalize()
assert n1 == spec
def test_normalize_a_lot(self):
spec = Spec("mpileaks")
spec.normalize()
spec.normalize()
spec.normalize()
spec.normalize()
def test_normalize_with_virtual_spec(self):
dag = Spec.from_literal(
{
"mpileaks": {
"callpath": {
"dyninst": {"libdwarf": {"libelf": None}, "libelf": None},
"mpi": None,
},
"mpi": None,
}
}
)
dag.normalize()
# make sure nothing with the same name occurs twice
counts = {}
for spec in dag.traverse(key=id):
if spec.name not in counts:
counts[spec.name] = 0
counts[spec.name] += 1
for name in counts:
assert counts[name] == 1
def test_dependents_and_dependencies_are_correct(self): def test_dependents_and_dependencies_are_correct(self):
spec = Spec.from_literal( spec = Spec.from_literal(
{ {
@@ -377,36 +311,26 @@ def test_dependents_and_dependencies_are_correct(self):
                 }
             }
         )
         check_links(spec)
-        spec.normalize()
+        spec.concretize()
         check_links(spec)
 
-    def test_unsatisfiable_version(self, set_dependency):
-        set_dependency("mpileaks", "mpich@1.0")
-        spec = Spec("mpileaks ^mpich@2.0 ^callpath ^dyninst ^libelf ^libdwarf")
-        with pytest.raises(spack.spec.UnsatisfiableVersionSpecError):
-            spec.normalize()
-
-    def test_unsatisfiable_compiler(self, set_dependency):
-        set_dependency("mpileaks", "mpich%gcc")
-        spec = Spec("mpileaks ^mpich%intel ^callpath ^dyninst ^libelf" " ^libdwarf")
-        with pytest.raises(spack.spec.UnsatisfiableCompilerSpecError):
-            spec.normalize()
-
-    def test_unsatisfiable_compiler_version(self, set_dependency):
-        set_dependency("mpileaks", "mpich%gcc@4.6")
-        spec = Spec("mpileaks ^mpich%gcc@4.5 ^callpath ^dyninst ^libelf" " ^libdwarf")
-        with pytest.raises(spack.spec.UnsatisfiableCompilerSpecError):
-            spec.normalize()
-
-    def test_unsatisfiable_architecture(self, set_dependency):
-        set_dependency("mpileaks", "mpich platform=test target=be")
-        spec = Spec(
-            "mpileaks ^mpich platform=test target=fe ^callpath" " ^dyninst ^libelf ^libdwarf"
-        )
-        with pytest.raises(spack.spec.UnsatisfiableArchitectureSpecError):
-            spec.normalize()
+    @pytest.mark.parametrize(
+        "constraint_str,spec_str",
+        [
+            ("mpich@1.0", "mpileaks ^mpich@2.0"),
+            ("mpich%gcc", "mpileaks ^mpich%intel"),
+            ("mpich%gcc@4.6", "mpileaks ^mpich%gcc@4.5"),
+            ("mpich platform=test target=be", "mpileaks ^mpich platform=test target=fe"),
+        ],
+    )
+    def test_unsatisfiable_cases(self, set_dependency, constraint_str, spec_str):
+        """Tests that synthetic cases of conflicting requirements raise an UnsatisfiableSpecError
+        when concretizing.
+        """
+        set_dependency("mpileaks", constraint_str)
+        with pytest.raises(spack.error.UnsatisfiableSpecError):
+            Spec(spec_str).concretize()
 
     @pytest.mark.parametrize(
         "spec_str", ["libelf ^mpich", "libelf ^libdwarf", "mpich ^dyninst ^libelf"]
@@ -451,106 +375,6 @@ def test_equal(self):
         assert not flip_flat.eq_dag(flip_dag)
         assert not dag.eq_dag(flip_dag)
 
-    def test_normalize_mpileaks(self):
-        # Spec parsed in from a string
-        spec = Spec.from_literal(
-            {"mpileaks ^mpich ^callpath ^dyninst ^libelf@1.8.11 ^libdwarf": None}
-        )
-
-        # What that spec should look like after parsing
-        expected_flat = Spec.from_literal(
-            {
-                "mpileaks": {
-                    "mpich": None,
-                    "callpath": None,
-                    "dyninst": None,
-                    "libelf@1.8.11": None,
-                    "libdwarf": None,
-                }
-            }
-        )
-
-        # What it should look like after normalization
-        mpich = Spec("mpich")
-        libelf = Spec("libelf@1.8.11")
-        expected_normalized = Spec.from_literal(
-            {
-                "mpileaks": {
-                    "callpath": {
-                        "dyninst": {"libdwarf": {libelf: None}, libelf: None},
-                        mpich: None,
-                    },
-                    mpich: None,
-                }
-            }
-        )
-
-        # Similar to normalized spec, but now with copies of the same
-        # libelf node. Normalization should result in a single unique
-        # node for each package, so this is the wrong DAG.
-        non_unique_nodes = Spec.from_literal(
-            {
-                "mpileaks": {
-                    "callpath": {
-                        "dyninst": {"libdwarf": {"libelf@1.8.11": None}, "libelf@1.8.11": None},
-                        mpich: None,
-                    },
-                    mpich: None,
-                }
-            },
-            normal=False,
-        )
-
-        # All specs here should be equal under regular equality
-        specs = (spec, expected_flat, expected_normalized, non_unique_nodes)
-        for lhs, rhs in zip(specs, specs):
-            assert lhs == rhs
-            assert str(lhs) == str(rhs)
-
-        # Test that equal and equal_dag are doing the right thing
-        assert spec == expected_flat
-        assert spec.eq_dag(expected_flat)
-
-        # Normalized has different DAG structure, so NOT equal.
-        assert spec != expected_normalized
-        assert not spec.eq_dag(expected_normalized)
-
-        # Again, different DAG structure so not equal.
-        assert spec != non_unique_nodes
-        assert not spec.eq_dag(non_unique_nodes)
-
-        spec.normalize()
-
-        # After normalizing, spec_dag_equal should match the normalized spec.
-        assert spec != expected_flat
-        assert not spec.eq_dag(expected_flat)
-
-        # verify DAG structure without deptypes.
-        assert spec.eq_dag(expected_normalized, deptypes=False)
-        assert not spec.eq_dag(non_unique_nodes, deptypes=False)
-
-        assert not spec.eq_dag(expected_normalized, deptypes=True)
-        assert not spec.eq_dag(non_unique_nodes, deptypes=True)
-
-    @pytest.mark.xfail(reason="String representation changed")
-    def test_normalize_with_virtual_package(self):
-        spec = Spec("mpileaks ^mpi ^libelf@1.8.11 ^libdwarf")
-        spec.normalize()
-
-        expected_normalized = Spec.from_literal(
-            {
-                "mpileaks": {
-                    "callpath": {
-                        "dyninst": {"libdwarf": {"libelf@1.8.11": None}, "libelf@1.8.11": None},
-                        "mpi": None,
-                    },
-                    "mpi": None,
-                }
-            }
-        )
-
-        assert str(spec) == str(expected_normalized)
-
     def test_contains(self):
         spec = Spec("mpileaks ^mpi ^libelf@1.8.11 ^libdwarf")
         assert Spec("mpi") in spec
@@ -576,20 +400,6 @@ def test_copy_simple(self):
         copy_ids = set(id(s) for s in copy.traverse())
         assert not orig_ids.intersection(copy_ids)
 
-    def test_copy_normalized(self):
-        orig = Spec("mpileaks")
-        orig.normalize()
-        copy = orig.copy()
-        check_links(copy)
-
-        assert orig == copy
-        assert orig.eq_dag(copy)
-
-        # ensure no shared nodes bt/w orig and copy.
-        orig_ids = set(id(s) for s in orig.traverse())
-        copy_ids = set(id(s) for s in copy.traverse())
-        assert not orig_ids.intersection(copy_ids)
-
     def test_copy_concretized(self):
         orig = Spec("mpileaks")
         orig.concretize()
@@ -650,63 +460,53 @@ def test_copy_through_spec_build_interface(self):
         run3 -b-> build3
         """
 
-    def test_deptype_traversal(self):
-        dag = Spec("dtuse")
-        dag.normalize()
-
-        names = [
-            "dtuse",
-            "dttop",
-            "dtbuild1",
-            "dtbuild2",
-            "dtlink2",
-            "dtlink1",
-            "dtlink3",
-            "dtlink4",
-        ]
-
-        traversal = dag.traverse(deptype=("build", "link"))
-        assert [x.name for x in traversal] == names
-
-    def test_deptype_traversal_with_builddeps(self):
-        dag = Spec("dttop")
-        dag.normalize()
-
-        names = ["dttop", "dtbuild1", "dtbuild2", "dtlink2", "dtlink1", "dtlink3", "dtlink4"]
-
-        traversal = dag.traverse(deptype=("build", "link"))
-        assert [x.name for x in traversal] == names
-
-    def test_deptype_traversal_full(self):
-        dag = Spec("dttop")
-        dag.normalize()
-
-        names = [
-            "dttop",
-            "dtbuild1",
-            "dtbuild2",
-            "dtlink2",
-            "dtrun2",
-            "dtlink1",
-            "dtlink3",
-            "dtlink4",
-            "dtrun1",
-            "dtlink5",
-            "dtrun3",
-            "dtbuild3",
-        ]
-
-        traversal = dag.traverse(deptype=all)
-        assert [x.name for x in traversal] == names
-
-    def test_deptype_traversal_run(self):
-        dag = Spec("dttop")
-        dag.normalize()
-
-        names = ["dttop", "dtrun1", "dtrun3"]
-
-        traversal = dag.traverse(deptype="run")
-        assert [x.name for x in traversal] == names
+    @pytest.mark.parametrize(
+        "spec_str,deptypes,expected",
+        [
+            (
+                "dtuse",
+                ("build", "link"),
+                [
+                    "dtuse",
+                    "dttop",
+                    "dtbuild1",
+                    "dtbuild2",
+                    "dtlink2",
+                    "dtlink1",
+                    "dtlink3",
+                    "dtlink4",
+                ],
+            ),
+            (
+                "dttop",
+                ("build", "link"),
+                ["dttop", "dtbuild1", "dtbuild2", "dtlink2", "dtlink1", "dtlink3", "dtlink4"],
+            ),
+            (
+                "dttop",
+                all,
+                [
+                    "dttop",
+                    "dtbuild1",
+                    "dtbuild2",
+                    "dtlink2",
+                    "dtrun2",
+                    "dtlink1",
+                    "dtlink3",
+                    "dtlink4",
+                    "dtrun1",
+                    "dtlink5",
+                    "dtrun3",
+                    "dtbuild3",
+                ],
+            ),
+            ("dttop", "run", ["dttop", "dtrun1", "dtrun3"]),
+        ],
+    )
+    def test_deptype_traversal(self, spec_str, deptypes, expected):
+        dag = Spec(spec_str).concretized()
+        traversal = dag.traverse(deptype=deptypes)
+        assert [x.name for x in traversal] == expected
 
     def test_hash_bits(self):
         """Ensure getting first n bits of a base32-encoded DAG hash works."""
@@ -834,15 +634,6 @@ def check_diamond_normalized_dag(self, spec):
 
         assert spec.eq_dag(dag)
 
-    def test_normalize_diamond_deptypes(self):
-        """Ensure that dependency types are preserved even if the same thing is
-        depended on in two different ways."""
-        s = Spec("dt-diamond")
-        s.normalize()
-
-        self.check_diamond_deptypes(s)
-        self.check_diamond_normalized_dag(s)
-
     def test_concretize_deptypes(self):
         """Ensure that dependency types are preserved after concretization."""
         s = Spec("dt-diamond")
@@ -851,22 +642,11 @@ def test_concretize_deptypes(self):
 
     def test_copy_deptypes(self):
         """Ensure that dependency types are preserved by spec copy."""
-        s1 = Spec("dt-diamond")
-        s1.normalize()
+        s1 = Spec("dt-diamond").concretized()
         self.check_diamond_deptypes(s1)
-        self.check_diamond_normalized_dag(s1)
 
         s2 = s1.copy()
-        self.check_diamond_normalized_dag(s2)
         self.check_diamond_deptypes(s2)
 
-        s3 = Spec("dt-diamond")
-        s3.concretize()
-        self.check_diamond_deptypes(s3)
-
-        s4 = s3.copy()
-        self.check_diamond_deptypes(s4)
-
     def test_getitem_query(self):
         s = Spec("mpileaks")
         s.concretize()


@@ -537,27 +537,20 @@ def test_self_index(self):
         s = Spec("callpath")
         assert s["callpath"] == s
 
-    def test_dep_index(self):
-        s = Spec("callpath")
-        s.normalize()
-
+    def test_dep_index(self, default_mock_concretization):
+        """Tests __getitem__ and __contains__ for specs."""
+        s = default_mock_concretization("callpath")
         assert s["callpath"] == s
-        assert isinstance(s["dyninst"], Spec)
-        assert isinstance(s["libdwarf"], Spec)
-        assert isinstance(s["libelf"], Spec)
-        assert isinstance(s["mpi"], Spec)
 
-        assert s["dyninst"].name == "dyninst"
-        assert s["libdwarf"].name == "libdwarf"
-        assert s["libelf"].name == "libelf"
-        assert s["mpi"].name == "mpi"
+        # Real dependencies
+        for key in ("dyninst", "libdwarf", "libelf"):
+            assert isinstance(s[key], Spec)
+            assert s[key].name == key
+            assert key in s
 
-    def test_spec_contains_deps(self):
-        s = Spec("callpath")
-        s.normalize()
-        assert "dyninst" in s
-        assert "libdwarf" in s
-        assert "libelf" in s
+        # Virtual dependencies
+        assert s["mpi"].name == "mpich"
         assert "mpi" in s
 
     @pytest.mark.usefixtures("config")
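The rewritten `test_dep_index` relies on `Spec.__getitem__` and `__contains__` performing name-based lookup across the dependency graph. A minimal sketch of that lookup behavior, using a hypothetical `Node` class rather than Spack's real `Spec`:

```python
class Node:
    """Hypothetical stand-in for a spec node with named dependencies."""

    def __init__(self, name, deps=()):
        self.name = name
        self.deps = list(deps)

    def traverse(self):
        """Yield this node, then all transitive dependencies (preorder)."""
        yield self
        for dep in self.deps:
            yield from dep.traverse()

    def __getitem__(self, name):
        for node in self.traverse():
            if node.name == name:
                return node
        raise KeyError(name)

    def __contains__(self, name):
        return any(node.name == name for node in self.traverse())


# A chain loosely mirroring the callpath mock package's dependencies.
libelf = Node("libelf")
libdwarf = Node("libdwarf", [libelf])
dyninst = Node("dyninst", [libdwarf, libelf])
callpath = Node("callpath", [dyninst])

assert callpath["callpath"] is callpath
for key in ("dyninst", "libdwarf", "libelf"):
    assert callpath[key].name == key
    assert key in callpath
```

The virtual-provider case in the real test (`s["mpi"].name == "mpich"`) differs only in that the lookup key is a virtual name resolved by the concretizer to a concrete provider node.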
@@ -1123,9 +1116,6 @@ def test_spec_override(self):
         ],
     )
     def test_virtual_deps_bindings(self, default_mock_concretization, spec_str, specs_in_dag):
-        if spack.config.get("config:concretizer") == "original":
-            pytest.skip("Use case not supported by the original concretizer")
-
         s = default_mock_concretization(spec_str)
         for label, expected in specs_in_dag:
             assert label in s
@@ -1141,9 +1131,6 @@ def test_virtual_deps_bindings(self, default_mock_concretization, spec_str, spec
         ],
     )
     def test_unsatisfiable_virtual_deps_bindings(self, spec_str):
-        if spack.config.get("config:concretizer") == "original":
-            pytest.skip("Use case not supported by the original concretizer")
-
         with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
             Spec(spec_str).concretized()


@@ -117,8 +117,9 @@ def test_yaml_subdag(config, mock_packages):
         assert spec[dep].eq_dag(json_spec[dep])
 
-def test_using_ordered_dict(mock_packages):
-    """Checks that dicts are ordered
+@pytest.mark.parametrize("spec_str", ["mpileaks ^zmpi", "dttop", "dtuse"])
+def test_using_ordered_dict(default_mock_concretization, spec_str):
+    """Checks that we use syaml_dicts for spec serialization.
 
     Necessary to make sure that dag_hash is stable across python
     versions and processes.
@@ -136,14 +137,10 @@ def descend_and_check(iterable, level=0):
                 max_level = nlevel
         return max_level
 
-    specs = ["mpileaks ^zmpi", "dttop", "dtuse"]
-    for spec in specs:
-        dag = Spec(spec)
-        dag.normalize()
-        level = descend_and_check(dag.to_node_dict())
-
-        # level just makes sure we are doing something here
-        assert level >= 5
+    s = default_mock_concretization(spec_str)
+    level = descend_and_check(s.to_node_dict())
+    # level just makes sure we are doing something here
+    assert level >= 5
 
 def test_ordered_read_not_required_for_consistent_dag_hash(config, mock_packages):
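`descend_and_check` guards serialization sanity by verifying that the node dict produced by `to_node_dict()` nests deeply enough. That depth-measuring recursion can be sketched as follows; the sample `node` dict below is invented and only shaped like a serialized spec, not Spack's actual format:

```python
def max_depth(value, level=0):
    """Recursively measure how deeply a nested dict/list structure nests."""
    if isinstance(value, dict):
        return max((max_depth(v, level + 1) for v in value.values()), default=level)
    if isinstance(value, (list, tuple)):
        return max((max_depth(v, level + 1) for v in value), default=level)
    return level


# Invented example shaped like a serialized node dict.
node = {
    "spec": {
        "name": "mpileaks",
        "dependencies": [{"name": "zmpi", "parameters": {"opts": ["+shared"]}}],
    }
}

# A too-shallow result would flag broken serialization, as in the test.
assert max_depth(node) >= 5
```

A trivially flat dict fails the same check (`max_depth({"a": 1})` is 1), which is the whole point of the `level >= 5` guard: it makes sure the walk really descended into a spec-shaped structure.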


@@ -213,7 +213,6 @@ def test_from_list_url(mock_packages, config, spec, url, digest, _fetch_method):
         ("=2.0.0", "foo-2.0.0.tar.gz", None),
     ],
 )
-@pytest.mark.only_clingo("Original concretizer doesn't resolve concrete versions to known ones")
 def test_new_version_from_list_url(
     mock_packages, config, _fetch_method, requested_version, tarball, digest
 ):


@@ -14,6 +14,4 @@ markers =
     enable_compiler_verification: enable compiler verification within unit tests
     enable_compiler_execution: enable compiler execution to detect link paths and libc
     disable_clean_stage_check: avoid failing tests if there are leftover files in the stage area
-    only_clingo: mark unit tests that run only with clingo
-    only_original: mark unit tests that are specific to the original concretizer
     not_on_windows: mark tests that are skipped on Windows


@@ -46,11 +46,6 @@ $coverage_run $(which spack) python -c "import spack.pkg.builtin.mpileaks; repr(
 #-----------------------------------------------------------
 # Run unit tests with code coverage
 #-----------------------------------------------------------
-if [[ "$SPACK_TEST_SOLVER" == "original" ]]; then
-    echo "ORIGINAL CONCRETIZER [skipping slow unit tests]"
-    export PYTEST_ADDOPTS='-m "not maybeslow"'
-fi
-
 # Check if xdist is available
 if python -m pytest --trace-config 2>&1 | grep xdist; then
     export PYTEST_ADDOPTS="$PYTEST_ADDOPTS --dist loadfile --tx '${SPACK_TEST_PARALLEL:=3}*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python'"