Add solver capability for synthesizing splices of ABI compatible packages. (#46729)
This PR provides two complementary features:

1. An augmentation to the package language to express ABI compatibility relationships among packages.
2. An extension to the concretizer that can synthesize splices between ABI-compatible packages.

### 1. The `can_splice` directive and ABI compatibility

We augment the package language with a single directive: `can_splice`. Here is an example of a package `Foo` exercising the `can_splice` directive:

    class Foo(Package):
        version("1.0")
        version("1.1")
        variant("compat", default=True)
        variant("json", default=False)
        variant("pic", default=False)
        can_splice("foo@1.0", when="@1.1")
        can_splice("bar@1.0", when="@1.0+compat")
        can_splice("baz@1.0+compat", when="@1.0+compat", match_variants="*")
        can_splice("quux@1.0", when="@1.1~compat", match_variants="json")

Explanations of the uses of each directive:

- `can_splice("foo@1.0", when="@1.1")`: If `foo@1.0` is the dependency of an already installed spec and `foo@1.1` could be a valid dependency for the parent spec, then `foo@1.1` can be spliced in for `foo@1.0` in the parent spec.
- `can_splice("bar@1.0", when="@1.0+compat")`: If `bar@1.0` is the dependency of an already installed spec and `foo@1.0+compat` could be a valid dependency for the parent spec, then `foo@1.0+compat` can be spliced in for `bar@1.0` in the parent spec.
- `can_splice("baz@1.0+compat", when="@1.0+compat", match_variants="*")`: If `baz@1.0+compat` is the dependency of an already installed spec and `foo@1.0+compat` could be a valid dependency for the parent spec, then `foo@1.0+compat` can be spliced in for `baz@1.0+compat` in the parent spec, provided that they have the same value for all other variants (regardless of what those values are).
- `can_splice("quux@1.0", when="@1.1~compat", match_variants="json")`: If `quux@1.0` is the dependency of an already installed spec and `foo@1.1~compat` could be a valid dependency for the parent spec, then `foo@1.1~compat` can be spliced in for `quux@1.0` in the parent spec, provided that they have the same value for their `json` variant.

### 2. Augmenting the solver to synthesize splices

#### Changes to the hash encoding in `asp.py`

Previously, when including concrete specs in the solve, they would have the following form:

    installed_hash("foo", "xxxyyy")
    imposed_constraint("xxxyyy", "foo", "attr1", ...)
    imposed_constraint("xxxyyy", "foo", "attr2", ...)
    % etc.

Concrete specs now have the following form:

    installed_hash("foo", "xxxyyy")
    hash_attr("xxxyyy", "foo", "attr1", ...)
    hash_attr("xxxyyy", "foo", "attr2", ...)

This transformation allows us to control which constraints are imposed when we select a hash, to facilitate the splicing of dependencies.

#### 2.1 Compiling `can_splice` directives in `asp.py`

Consider the concrete spec:

    foo@2.72%gcc@11.4 arch=linux-ubuntu22.04-icelake build_system=autotools ^bar ...

It will emit the following facts for reuse (below is a subset):

    installed_hash("foo", "xxxyyy")
    hash_attr("xxxyyy", "hash", "foo", "xxxyyy")
    hash_attr("xxxyyy", "version", "foo", "2.72")
    hash_attr("xxxyyy", "node_os", "ubuntu22.04")
    hash_attr("xxxyyy", "hash", "bar", "zzzqqq")
    hash_attr("xxxyyy", "depends_on", "foo", "bar", "link")

Rules that derive `abi_splice_conditions_hold` will be generated from uses of the `can_splice` directive. They will have the following form:

    can_splice("foo@1.0.0+a", when="@1.0.1+a", match_variants=["b"]) --->

    abi_splice_conditions_hold(0, node(SID, "foo"), "foo", BaseHash) :-
      installed_hash("foo", BaseHash),
      attr("node", node(SID, SpliceName)),
      attr("node_version_satisfies", node(SID, "foo"), "1.0.1"),
      hash_attr("hash", "node_version_satisfies", "foo", "1.0.1"),
      attr("variant_value", node(SID, "foo"), "a", "True"),
      hash_attr("hash", "variant_value", "foo", "a", "True"),
      attr("variant_value", node(SID, "foo"), "b", VariVar0),
      hash_attr("hash", "variant_value", "foo", "b", VariVar0).

#### 2.2 Synthesizing splices in `concretize.lp` and `splices.lp`

The ASP solver generates `splice_at_hash` attrs to indicate that a particular node has a splice in one of its immediate dependencies. Splices can be introduced in the dependencies of concrete specs when `splices.lp` is conditionally loaded (based on the config option `concretizer:splice:automatic`).

#### 2.3 Constructing spliced specs in `asp.py`

The method `SpecBuilder._resolve_automatic_splices` implements a top-down memoized implementation of hybrid splicing. This is an optimization over the more general `Spec.splice`, since the solver gives a global view of exactly which specs can be shared, ensuring the minimal number of splicing operations.

### Misc changes to facilitate configuration and benchmarking

- Added the method `Solver.solve_with_stats` to expose timers from the public interface for easier benchmarking.
- Added the config option `concretizer:splice:automatic` (boolean) to conditionally load splicing behavior.

Co-authored-by: Greg Becker <becker33@llnl.gov>
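As a quick end-to-end illustration (a hedged sketch, not part of the diff below), the new unit tests enable automatic splicing by setting the `concretizer:splice` config section programmatically and then concretize a goal spec that can only be satisfied by splicing an installed dependency. The `splice-*` names are the mock packages added for the test suite, and the snippet assumes the corresponding specs are already installed, as the tests arrange.

```python
import spack.config
from spack.spec import Spec

# Enable automatic splicing the same way the new tests in abi_splicing.py do.
spack.config.set("concretizer:splice", {"automatic": True})

# Assuming splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.0 is installed and, in the
# mock repo, splice-h@1.0.2 declares ABI compatibility with the installed splice-h,
# the solver may now satisfy this goal spec by splicing rather than rebuilding.
goal = Spec("splice-t@1.0 ^splice-h@1.0.2 ^splice-z@1.0.0").concretized()

# A spliced spec records the original, unspliced spec as its build spec.
assert goal._build_spec is not None
```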
parent ad518d975c
commit bf16f0bf74
@ -39,7 +39,8 @@ concretizer:
   # Option to deal with possible duplicate nodes (i.e. different nodes from the same package) in the DAG.
   duplicates:
     # "none": allows a single node for any package in the DAG.
-    # "minimal": allows the duplication of 'build-tools' nodes only (e.g. py-setuptools, cmake etc.)
+    # "minimal": allows the duplication of 'build-tools' nodes only
+    # (e.g. py-setuptools, cmake etc.)
     # "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
     strategy: minimal
   # Option to specify compatibility between operating systems for reuse of compilers and packages

@ -47,3 +48,10 @@ concretizer:
   # it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
   # requires two entries i.e. os_compatible: {sonoma: [monterey], monterey: [sonoma]}
   os_compatible: {}
+
+  # Option to specify whether to support splicing. Splicing allows for
+  # the relinking of concrete package dependencies in order to better
+  # reuse already built packages with ABI compatible dependencies
+  splice:
+    explicit: []
+    automatic: false
@ -237,3 +237,35 @@ is optional -- by default, splices will be transitive.
 ``mpich/abcdef`` instead of ``mvapich2`` as the MPI provider. Spack
 will warn the user in this case, but will not fail the
 concretization.
+
+.. _automatic_splicing:
+
+^^^^^^^^^^^^^^^^^^
+Automatic Splicing
+^^^^^^^^^^^^^^^^^^
+
+The Spack solver can be configured to do automatic splicing for
+ABI-compatible packages. Automatic splices are enabled in the concretizer
+config section
+
+.. code-block:: yaml
+
+   concretizer:
+     splice:
+       automatic: True
+
+Packages can include ABI-compatibility information using the
+``can_splice`` directive. See :ref:`the packaging
+guide<abi_compatibility>` for instructions on specifying ABI
+compatibility using the ``can_splice`` directive.
+
+.. note::
+
+   The ``can_splice`` directive is experimental and may be changed in
+   future versions.
+
+When automatic splicing is enabled, the concretizer will combine any
+number of ABI-compatible specs if possible to reuse installed packages
+and packages available from binary caches. The end result of these
+specs is equivalent to a series of transitive/intransitive splices,
+but the series may be non-obvious.
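For completeness, the same switch the YAML above sets can also be toggled programmatically; this is exactly what the new unit tests do before concretizing (a small sketch, not part of the documentation change itself):

```python
import spack.config

# Equivalent of the YAML example above; the new tests call this helper
# before concretizing goal specs that require a splice.
spack.config.set("concretizer:splice", {"automatic": True})

# The solver setup reads this flag to decide whether to load splices.lp.
assert spack.config.CONFIG.get("concretizer:splice:automatic")
```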
@ -7113,6 +7113,46 @@ might write:
    CXXFLAGS += -I$DWARF_PREFIX/include
    CXXFLAGS += -L$DWARF_PREFIX/lib
+
+.. _abi_compatibility:
+
+----------------------------
+Specifying ABI Compatibility
+----------------------------
+
+Packages can include ABI-compatibility information using the
+``can_splice`` directive. For example, if ``Foo`` version 1.1 can
+always replace version 1.0, then the package could have:
+
+.. code-block:: python
+
+   can_splice("foo@1.0", when="@1.1")
+
+For virtual packages, packages can also specify ABI-compatibility with
+other packages providing the same virtual. For example, ``zlib-ng``
+could specify:
+
+.. code-block:: python
+
+   can_splice("zlib@1.3.1", when="@2.2+compat")
+
+Some packages have ABI-compatibility that is dependent on matching
+variant values, either for all variants or for some set of
+ABI-relevant variants. In those cases, it is not necessary to specify
+the full combinatorial explosion. The ``match_variants`` keyword can
+cover all single-value variants.
+
+.. code-block:: python
+
+   can_splice("foo@1.1", when="@1.2", match_variants=["bar"])  # any value for bar as long as they're the same
+   can_splice("foo@1.2", when="@1.3", match_variants="*")  # any variant values if all single-value variants match
+
+The concretizer will use ABI compatibility to determine automatic
+splices when :ref:`automatic splicing<automatic_splicing>` is enabled.
+
+.. note::
+
+   The ``can_splice`` directive is experimental, and may be replaced
+   by a higher-level interface in future versions of Spack.
+
 .. _package_class_structure:

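Putting the rules above together, a package that mixes plain, starred, and variant-limited compatibility declarations might look like the sketch below. It mirrors the `manyvariants` mock package exercised by the new tests (the `can_splice` calls are quoted in their comments); the version and variant definitions are illustrative assumptions, not taken from a real package.

```python
from spack.package import *


class Manyvariants(Package):
    """Sketch of a package declaring several ABI-compatibility rules."""

    version("2.0.1")
    version("2.0.0")
    version("1.0.1")
    version("1.0.0")

    variant("a", default=True, description="illustrative boolean variant")
    variant("b", default=True, description="illustrative boolean variant")
    variant("c", values=("v1", "v2", "v3"), default="v1", multi=False)
    variant("d", values=("v1", "v2", "v3"), default="v1", multi=False)

    # 1.0.1 can replace 1.0.0 whenever every single-value variant matches.
    can_splice("manyvariants@1.0.0", when="@1.0.1", match_variants="*")

    # 2.0.1~a+b can replace 2.0.0+a~b when the ABI-relevant variants c and d match.
    can_splice("manyvariants@2.0.0+a~b", when="@2.0.1~a+b", match_variants=["c", "d"])

    # 2.0.1+a+b can replace the specific configuration 2.0.0 c=v1 d=v1.
    can_splice("manyvariants@2.0.0 c=v1 d=v1", when="@2.0.1+a+b")

    def install(self, spec, prefix):
        # Nothing to build for this sketch; just create the prefix layout.
        mkdirp(prefix.lib)
```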
@ -77,6 +77,7 @@ class OpenMpi(Package):
     "build_system",
     "requires",
     "redistribute",
+    "can_splice",
 ]

 _patch_order_index = 0
@ -505,6 +506,43 @@ def _execute_provides(pkg: "spack.package_base.PackageBase"):
     return _execute_provides


+@directive("splice_specs")
+def can_splice(
+    target: SpecType, *, when: SpecType, match_variants: Union[None, str, List[str]] = None
+):
+    """Packages can declare whether they are ABI-compatible with another package
+    and thus can be spliced into concrete versions of that package.
+
+    Args:
+        target: The spec that the current package is ABI-compatible with.
+
+        when: An anonymous spec constraining current package for when it is
+            ABI-compatible with target.
+
+        match_variants: A list of variants that must match
+            between target spec and current package, with special value '*'
+            which matches all variants. Example: a variant is defined on both
+            packages called json, and they are ABI-compatible whenever they agree on
+            the json variant (regardless of whether it is turned on or off). Note
+            that this cannot be applied to multi-valued variants and multi-valued
+            variants will be skipped by '*'.
+    """
+
+    def _execute_can_splice(pkg: "spack.package_base.PackageBase"):
+        when_spec = _make_when_spec(when)
+        if isinstance(match_variants, str) and match_variants != "*":
+            raise ValueError(
+                "* is the only valid string for match_variants "
+                "if looking to provide a single variant, use "
+                f"[{match_variants}] instead"
+            )
+        if when_spec is None:
+            return
+        pkg.splice_specs[when_spec] = (spack.spec.Spec(target), match_variants)
+
+    return _execute_can_splice
+
+
 @directive("patches")
 def patch(
     url_or_filename: str,
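The net effect of the directive on a package class is an entry in its `splice_specs` table, populated by `_execute_can_splice` above. A hypothetical illustration of the resulting mapping (the package and spec names here are made up for the example):

```python
import spack.spec

# After can_splice("foo@1.0", when="@1.1", match_variants=["json"]) executes on a
# package class Bar, its splice_specs dict holds:
#   key:   the `when` condition, named for the declaring package -> Spec("bar@1.1")
#   value: (the ABI-compatible target spec, the match_variants argument)
when_spec = spack.spec.Spec("bar@1.1")
splice_specs = {when_spec: (spack.spec.Spec("foo@1.0"), ["json"])}

target, match_variants = splice_specs[when_spec]
assert target.name == "foo" and match_variants == ["json"]
```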
@ -622,6 +622,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
     patches: Dict["spack.spec.Spec", List["spack.patch.Patch"]]
     variants: Dict["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]
     languages: Dict["spack.spec.Spec", Set[str]]
+    splice_specs: Dict["spack.spec.Spec", Tuple["spack.spec.Spec", Union[None, str, List[str]]]]

     #: By default, packages are not virtual
     #: Virtual packages override this attribute
@ -78,7 +78,8 @@
                     "transitive": {"type": "boolean", "default": False},
                 },
             },
-        }
+        },
+        "automatic": {"type": "boolean"},
     },
 },
 "duplicates": {
@ -52,6 +52,7 @@

 from .core import (
     AspFunction,
+    AspVar,
     NodeArgument,
     ast_sym,
     ast_type,
@ -524,12 +525,14 @@ def _compute_specs_from_answer_set(self):
             node = SpecBuilder.make_node(pkg=providers[0])
             candidate = answer.get(node)

-            if candidate and candidate.build_spec.satisfies(input_spec):
-                if not candidate.satisfies(input_spec):
-                    tty.warn(
-                        "explicit splice configuration has caused the concretized spec"
-                        f" {candidate} not to satisfy the input spec {input_spec}"
-                    )
+            if candidate and candidate.satisfies(input_spec):
+                self._concrete_specs.append(answer[node])
+                self._concrete_specs_by_input[input_spec] = answer[node]
+            elif candidate and candidate.build_spec.satisfies(input_spec):
+                tty.warn(
+                    "explicit splice configuration has caused the concretized spec"
+                    f" {candidate} not to satisfy the input spec {input_spec}"
+                )
                 self._concrete_specs.append(answer[node])
                 self._concrete_specs_by_input[input_spec] = answer[node]
             else:
@ -854,6 +857,8 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
             self.control.load(os.path.join(parent_dir, "libc_compatibility.lp"))
         else:
             self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
+        if setup.enable_splicing:
+            self.control.load(os.path.join(parent_dir, "splices.lp"))

         timer.stop("load")

@ -1166,6 +1171,9 @@ def __init__(self, tests: bool = False):
         # list of unique libc specs targeted by compilers (or an educated guess if no compiler)
         self.libcs: List[spack.spec.Spec] = []

+        # If true, we have to load the code for synthesizing splices
+        self.enable_splicing: bool = spack.config.CONFIG.get("concretizer:splice:automatic")
+
     def pkg_version_rules(self, pkg):
         """Output declared versions of a package.

@ -1336,6 +1344,10 @@ def pkg_rules(self, pkg, tests):
         # dependencies
         self.package_dependencies_rules(pkg)

+        # splices
+        if self.enable_splicing:
+            self.package_splice_rules(pkg)
+
         # virtual preferences
         self.virtual_preferences(
             pkg.name,
@ -1674,6 +1686,94 @@ def dependency_holds(input_spec, requirements):

         self.gen.newline()

+    def _gen_match_variant_splice_constraints(
+        self,
+        pkg,
+        cond_spec: "spack.spec.Spec",
+        splice_spec: "spack.spec.Spec",
+        hash_asp_var: "AspVar",
+        splice_node,
+        match_variants: List[str],
+    ):
+        # If there are no variants to match, no constraints are needed
+        variant_constraints = []
+        for i, variant_name in enumerate(match_variants):
+            vari_defs = pkg.variant_definitions(variant_name)
+            # the spliceable config of the package always includes the variant
+            if vari_defs != [] and any(cond_spec.satisfies(s) for (s, _) in vari_defs):
+                variant = vari_defs[0][1]
+                if variant.multi:
+                    continue  # cannot automatically match multi-valued variants
+                value_var = AspVar(f"VariValue{i}")
+                attr_constraint = fn.attr("variant_value", splice_node, variant_name, value_var)
+                hash_attr_constraint = fn.hash_attr(
+                    hash_asp_var, "variant_value", splice_spec.name, variant_name, value_var
+                )
+                variant_constraints.append(attr_constraint)
+                variant_constraints.append(hash_attr_constraint)
+        return variant_constraints
+
+    def package_splice_rules(self, pkg):
+        self.gen.h2("Splice rules")
+        for i, (cond, (spec_to_splice, match_variants)) in enumerate(
+            sorted(pkg.splice_specs.items())
+        ):
+            with named_spec(cond, pkg.name):
+                self.version_constraints.add((cond.name, cond.versions))
+                self.version_constraints.add((spec_to_splice.name, spec_to_splice.versions))
+                hash_var = AspVar("Hash")
+                splice_node = fn.node(AspVar("NID"), cond.name)
+                when_spec_attrs = [
+                    fn.attr(c.args[0], splice_node, *(c.args[2:]))
+                    for c in self.spec_clauses(cond, body=True, required_from=None)
+                    if c.args[0] != "node"
+                ]
+                splice_spec_hash_attrs = [
+                    fn.hash_attr(hash_var, *(c.args))
+                    for c in self.spec_clauses(spec_to_splice, body=True, required_from=None)
+                    if c.args[0] != "node"
+                ]
+                if match_variants is None:
+                    variant_constraints = []
+                elif match_variants == "*":
+                    filt_match_variants = set()
+                    for map in pkg.variants.values():
+                        for k in map:
+                            filt_match_variants.add(k)
+                    filt_match_variants = list(filt_match_variants)
+                    variant_constraints = self._gen_match_variant_splice_constraints(
+                        pkg, cond, spec_to_splice, hash_var, splice_node, filt_match_variants
+                    )
+                else:
+                    if any(
+                        v in cond.variants or v in spec_to_splice.variants for v in match_variants
+                    ):
+                        raise Exception(
+                            "Overlap between match_variants and explicitly set variants"
+                        )
+                    variant_constraints = self._gen_match_variant_splice_constraints(
+                        pkg, cond, spec_to_splice, hash_var, splice_node, match_variants
+                    )
+
+                rule_head = fn.abi_splice_conditions_hold(
+                    i, splice_node, spec_to_splice.name, hash_var
+                )
+                rule_body_components = (
+                    [
+                        # splice_set_fact,
+                        fn.attr("node", splice_node),
+                        fn.installed_hash(spec_to_splice.name, hash_var),
+                    ]
+                    + when_spec_attrs
+                    + splice_spec_hash_attrs
+                    + variant_constraints
+                )
+                rule_body = ",\n ".join(str(r) for r in rule_body_components)
+                rule = f"{rule_head} :-\n {rule_body}."
+                self.gen.append(rule)
+
+        self.gen.newline()
+
     def virtual_preferences(self, pkg_name, func):
         """Call func(vspec, provider, i) for each of pkg's provider prefs."""
         config = spack.config.get("packages")
@ -2536,8 +2636,9 @@ def concrete_specs(self):
         for h, spec in self.reusable_and_possible.explicit_items():
             # this indicates that there is a spec like this installed
             self.gen.fact(fn.installed_hash(spec.name, h))
-            # this describes what constraints it imposes on the solve
-            self.impose(h, spec, body=True)
+            # indirection layer between hash constraints and imposition to allow for splicing
+            for pred in self.spec_clauses(spec, body=True, required_from=None):
+                self.gen.fact(fn.hash_attr(h, *pred.args))
             self.gen.newline()
         # Declare as possible parts of specs that are not in package.py
         # - Add versions to possible versions
@ -3478,6 +3579,14 @@ def consume_facts(self):
         self._setup.effect_rules()


+# This should be a dataclass, but dataclasses don't work on Python 3.6
+class Splice:
+    def __init__(self, splice_node: NodeArgument, child_name: str, child_hash: str):
+        self.splice_node = splice_node
+        self.child_name = child_name
+        self.child_hash = child_hash
+
+
 class SpecBuilder:
     """Class with actions to rebuild a spec from ASP results."""

@ -3513,10 +3622,11 @@ def make_node(*, pkg: str) -> NodeArgument:
         """
         return NodeArgument(id="0", pkg=pkg)

-    def __init__(
-        self, specs: List[spack.spec.Spec], *, hash_lookup: Optional[ConcreteSpecsByHash] = None
-    ):
+    def __init__(self, specs, hash_lookup=None):
         self._specs: Dict[NodeArgument, spack.spec.Spec] = {}
+
+        # Matches parent nodes to splice node
+        self._splices: Dict[NodeArgument, List[Splice]] = {}
         self._result = None
         self._command_line_specs = specs
         self._flag_sources: Dict[Tuple[NodeArgument, str], Set[str]] = collections.defaultdict(
@ -3600,16 +3710,8 @@ def external_spec_selected(self, node, idx):

     def depends_on(self, parent_node, dependency_node, type):
         dependency_spec = self._specs[dependency_node]
-        edges = self._specs[parent_node].edges_to_dependencies(name=dependency_spec.name)
-        edges = [x for x in edges if id(x.spec) == id(dependency_spec)]
         depflag = dt.flag_from_string(type)
-
-        if not edges:
-            self._specs[parent_node].add_dependency_edge(
-                self._specs[dependency_node], depflag=depflag, virtuals=()
-            )
-        else:
-            edges[0].update_deptypes(depflag=depflag)
+        self._specs[parent_node].add_dependency_edge(dependency_spec, depflag=depflag, virtuals=())

     def virtual_on_edge(self, parent_node, provider_node, virtual):
         dependencies = self._specs[parent_node].edges_to_dependencies(name=(provider_node.pkg))
@ -3726,6 +3828,48 @@ def _order_index(flag_group):
     def deprecated(self, node: NodeArgument, version: str) -> None:
         tty.warn(f'using "{node.pkg}@{version}" which is a deprecated version')

+    def splice_at_hash(
+        self,
+        parent_node: NodeArgument,
+        splice_node: NodeArgument,
+        child_name: str,
+        child_hash: str,
+    ):
+        splice = Splice(splice_node, child_name=child_name, child_hash=child_hash)
+        self._splices.setdefault(parent_node, []).append(splice)
+
+    def _resolve_automatic_splices(self):
+        """After all of the specs have been concretized, apply all immediate
+        splices in size order. This ensures that all dependencies are resolved
+        before their parents, allowing for maximal sharing and minimal copying.
+        """
+        fixed_specs = {}
+        for node, spec in sorted(self._specs.items(), key=lambda x: len(x[1])):
+            immediate = self._splices.get(node, [])
+            if not immediate and not any(
+                edge.spec in fixed_specs for edge in spec.edges_to_dependencies()
+            ):
+                continue
+            new_spec = spec.copy(deps=False)
+            new_spec.build_spec = spec
+            for edge in spec.edges_to_dependencies():
+                depflag = edge.depflag & ~dt.BUILD
+                if any(edge.spec.dag_hash() == splice.child_hash for splice in immediate):
+                    splice = [s for s in immediate if s.child_hash == edge.spec.dag_hash()][0]
+                    new_spec.add_dependency_edge(
+                        self._specs[splice.splice_node], depflag=depflag, virtuals=edge.virtuals
+                    )
+                elif edge.spec in fixed_specs:
+                    new_spec.add_dependency_edge(
+                        fixed_specs[edge.spec], depflag=depflag, virtuals=edge.virtuals
+                    )
+                else:
+                    new_spec.add_dependency_edge(
+                        edge.spec, depflag=depflag, virtuals=edge.virtuals
+                    )
+            self._specs[node] = new_spec
+            fixed_specs[spec] = new_spec
+
     @staticmethod
     def sort_fn(function_tuple) -> Tuple[int, int]:
         """Ensure attributes are evaluated in the correct order.
@ -3755,7 +3899,6 @@ def build_specs(self, function_tuples):
         # them here so that directives that build objects (like node and
         # node_compiler) are called in the right order.
         self.function_tuples = sorted(set(function_tuples), key=self.sort_fn)
-
         self._specs = {}
         for name, args in self.function_tuples:
             if SpecBuilder.ignored_attributes.match(name):
@ -3785,10 +3928,14 @@ def build_specs(self, function_tuples):
                 continue

             # if we've already gotten a concrete spec for this pkg,
-            # do not bother calling actions on it
+            # do not bother calling actions on it except for node_flag_source,
+            # since node_flag_source is tracking information not in the spec itself
+            # we also need to keep track of splicing information.
             spec = self._specs.get(args[0])
             if spec and spec.concrete:
-                continue
+                do_not_ignore_attrs = ["node_flag_source", "splice_at_hash"]
+                if name not in do_not_ignore_attrs:
+                    continue

             action(*args)

@ -3798,7 +3945,7 @@ def build_specs(self, function_tuples):
         # inject patches -- note that we' can't use set() to unique the
         # roots here, because the specs aren't complete, and the hash
         # function will loop forever.
-        roots = [spec.root for spec in self._specs.values() if not spec.root.installed]
+        roots = [spec.root for spec in self._specs.values()]
         roots = dict((id(r), r) for r in roots)
         for root in roots.values():
             spack.spec.Spec.inject_patches_variant(root)
@ -3814,6 +3961,8 @@ def build_specs(self, function_tuples):
         for root in roots.values():
             root._finalize_concretization()

+        self._resolve_automatic_splices()
+
         for s in self._specs.values():
             spack.spec.Spec.ensure_no_deprecated(s)

@ -3828,7 +3977,6 @@ def build_specs(self, function_tuples):
         )

         specs = self.execute_explicit_splices()
-
         return specs

     def execute_explicit_splices(self):
@ -4165,7 +4313,6 @@ def reusable_specs(self, specs: List[spack.spec.Spec]) -> List[spack.spec.Spec]:
         result = []
         for reuse_source in self.reuse_sources:
             result.extend(reuse_source.selected_specs())
-
         # If we only want to reuse dependencies, remove the root specs
         if self.reuse_strategy == ReuseStrategy.DEPENDENCIES:
             result = [spec for spec in result if not any(root in spec for root in specs)]
@ -4335,11 +4482,10 @@ def __init__(self, provided, conflicts):

         super().__init__(msg)

-        self.provided = provided
-
         # Add attribute expected of the superclass interface
         self.required = None
         self.constraint_type = None
+        self.provided = provided


 class InvalidSpliceError(spack.error.SpackError):
@ -1449,25 +1449,71 @@ attr("node_flag", PackageNode, NodeFlag) :- attr("node_flag_set", PackageNode, N


 %-----------------------------------------------------------------------------
-% Installed packages
+% Installed Packages
 %-----------------------------------------------------------------------------
-% the solver is free to choose at most one installed hash for each package
-{ attr("hash", node(ID, Package), Hash) : installed_hash(Package, Hash) } 1
-  :- attr("node", node(ID, Package)), internal_error("Package must resolve to at most one hash").

+#defined installed_hash/2.
+#defined abi_splice_conditions_hold/4.
+
+% These are the previously concretized attributes of the installed package as
+% a hash. It has the general form:
+% hash_attr(Hash, Attribute, PackageName, Args*)
+#defined hash_attr/3.
+#defined hash_attr/4.
+#defined hash_attr/5.
+#defined hash_attr/6.
+#defined hash_attr/7.
+
+{ attr("hash", node(ID, PackageName), Hash): installed_hash(PackageName, Hash) } 1 :-
+  attr("node", node(ID, PackageName)),
+  internal_error("Package must resolve to at most 1 hash").
+
 % you can't choose an installed hash for a dev spec
 :- attr("hash", PackageNode, Hash), attr("variant_value", PackageNode, "dev_path", _).

 % You can't install a hash, if it is not installed
 :- attr("hash", node(ID, Package), Hash), not installed_hash(Package, Hash).
-% This should be redundant given the constraint above
-:- attr("node", PackageNode), 2 { attr("hash", PackageNode, Hash) }.

-% if a hash is selected, we impose all the constraints that implies
-impose(Hash, PackageNode) :- attr("hash", PackageNode, Hash).
+% hash_attrs are versions, but can_splice_attr are usually node_version_satisfies
+hash_attr(Hash, "node_version_satisfies", PackageName, Constraint) :-
+  hash_attr(Hash, "version", PackageName, Version),
+  pkg_fact(PackageName, version_satisfies(Constraint, Version)).
+
+% This recovers the exact semantics for hash reuse hash and depends_on are where
+% splices are decided, and virtual_on_edge can result in name-changes, which is
+% why they are all treated separately.
+imposed_constraint(Hash, Attr, PackageName) :-
+  hash_attr(Hash, Attr, PackageName).
+imposed_constraint(Hash, Attr, PackageName, A1) :-
+  hash_attr(Hash, Attr, PackageName, A1), Attr != "hash".
+imposed_constraint(Hash, Attr, PackageName, Arg1, Arg2) :-
+  hash_attr(Hash, Attr, PackageName, Arg1, Arg2),
+  Attr != "depends_on",
+  Attr != "virtual_on_edge".
+imposed_constraint(Hash, Attr, PackageName, A1, A2, A3) :-
+  hash_attr(Hash, Attr, PackageName, A1, A2, A3).
+imposed_constraint(Hash, "hash", PackageName, Hash) :- installed_hash(PackageName, Hash).
+% Without splicing, we simply recover the exact semantics
+imposed_constraint(ParentHash, "hash", ChildName, ChildHash) :-
+  hash_attr(ParentHash, "hash", ChildName, ChildHash),
+  ChildHash != ParentHash,
+  not abi_splice_conditions_hold(_, _, ChildName, ChildHash).
+
+imposed_constraint(Hash, "depends_on", PackageName, DepName, Type) :-
+  hash_attr(Hash, "depends_on", PackageName, DepName, Type),
+  hash_attr(Hash, "hash", DepName, DepHash),
+  not attr("splice_at_hash", _, _, DepName, DepHash).
+
+imposed_constraint(Hash, "virtual_on_edge", PackageName, DepName, VirtName) :-
+  hash_attr(Hash, "virtual_on_edge", PackageName, DepName, VirtName),
+  not attr("splice_at_hash", _, _, DepName,_).
+
+% Rules pertaining to attr("splice_at_hash") and abi_splice_conditions_hold will
+% be conditionally loaded from splices.lp
+
+impose(Hash, PackageNode) :- attr("hash", PackageNode, Hash), attr("node", PackageNode).
+
+% If there is not a hash for a package, we build it.
+build(PackageNode) :- attr("node", PackageNode), not concrete(PackageNode).

-% if we haven't selected a hash for a package, we'll be building it
-build(PackageNode) :- not attr("hash", PackageNode, _), attr("node", PackageNode).

 % Minimizing builds is tricky. We want a minimizing criterion

@ -1480,6 +1526,7 @@ build(PackageNode) :- not attr("hash", PackageNode, _), attr("node", PackageNode
 % criteria for built specs -- so that they take precedence over the otherwise
 % topmost-priority criterion to reuse what is installed.
 %
+
 % The priority ranges are:
 % 1000+ Optimizations for concretization errors
 % 300 - 1000 Highest priority optimizations for valid solutions
@ -1505,12 +1552,10 @@ build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", Package
   pkg_fact(Package, version_declared(Version, Weight, "installed")),
   not optimize_for_reuse().

-#defined installed_hash/2.
-
 % This statement, which is a hidden feature of clingo, let us avoid cycles in the DAG
 #edge (A, B) : depends_on(A, B).


 %-----------------------------------------------------------------
 % Optimization to avoid errors
 %-----------------------------------------------------------------
@ -44,6 +44,17 @@ def _id(thing: Any) -> Union[str, AspObject]:
     return f'"{str(thing)}"'


+class AspVar(AspObject):
+    """Represents a variable in an ASP rule, allows for conditionally generating
+    rules"""
+
+    def __init__(self, name: str):
+        self.name = name
+
+    def __str__(self) -> str:
+        return str(self.name)
+
+
 @lang.key_ordering
 class AspFunction(AspObject):
     """A term in the ASP logic program"""
@ -88,6 +99,8 @@ def _argify(self, arg: Any) -> Any:
             return clingo().Number(arg)
         elif isinstance(arg, AspFunction):
             return clingo().Function(arg.name, [self._argify(x) for x in arg.args], positive=True)
+        elif isinstance(arg, AspVar):
+            return clingo().Variable(arg.name)
         return clingo().String(str(arg))

     def symbol(self):
@ -15,7 +15,6 @@
 #show attr/4.
 #show attr/5.
 #show attr/6.
-
 % names of optimization criteria
 #show opt_criterion/2.

lib/spack/spack/solver/splices.lp (new file, 56 lines)
@ -0,0 +1,56 @@
% Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)

%=============================================================================
% These rules are conditionally loaded to handle the synthesis of spliced
% packages.
% =============================================================================
% Consider the concrete spec:
% foo@2.72%gcc@11.4 arch=linux-ubuntu22.04-icelake build_system=autotools ^bar ...
% It will emit the following facts for reuse (below is a subset)
% installed_hash("foo", "xxxyyy")
% hash_attr("xxxyyy", "hash", "foo", "xxxyyy")
% hash_attr("xxxyyy", "version", "foo", "2.72")
% hash_attr("xxxyyy", "node_os", "ubuntu22.04")
% hash_attr("xxxyyy", "hash", "bar", "zzzqqq")
% hash_attr("xxxyyy", "depends_on", "foo", "bar", "link")
% Rules that derive abi_splice_conditions_hold will be generated from
% use of the `can_splice` directive. They will have the following form:
% can_splice("foo@1.0.0+a", when="@1.0.1+a", match_variants=["b"]) --->
% abi_splice_conditions_hold(0, node(SID, "foo"), "foo", BaseHash) :-
%   installed_hash("foo", BaseHash),
%   attr("node", node(SID, SpliceName)),
%   attr("node_version_satisfies", node(SID, "foo"), "1.0.1"),
%   hash_attr("hash", "node_version_satisfies", "foo", "1.0.1"),
%   attr("variant_value", node(SID, "foo"), "a", "True"),
%   hash_attr("hash", "variant_value", "foo", "a", "True"),
%   attr("variant_value", node(SID, "foo"), "b", VariVar0),
%   hash_attr("hash", "variant_value", "foo", "b", VariVar0),

% If the splice is valid (i.e. abi_splice_conditions_hold is derived) in the
% dependency of a concrete spec, the solver is free to choose whether to continue
% with the exact hash semantics by simply imposing the child hash, or introducing
% a spliced node as the dependency instead
{ imposed_constraint(ParentHash, "hash", ChildName, ChildHash) } :-
  hash_attr(ParentHash, "hash", ChildName, ChildHash),
  abi_splice_conditions_hold(_, node(SID, SpliceName), ChildName, ChildHash).

attr("splice_at_hash", ParentNode, node(SID, SpliceName), ChildName, ChildHash) :-
  attr("hash", ParentNode, ParentHash),
  hash_attr(ParentHash, "hash", ChildName, ChildHash),
  abi_splice_conditions_hold(_, node(SID, SpliceName), ChildName, ChildHash),
  ParentHash != ChildHash,
  not imposed_constraint(ParentHash, "hash", ChildName, ChildHash).

% Names and virtual providers may change when a dependency is spliced in
imposed_constraint(Hash, "dependency_holds", ParentName, SpliceName, Type) :-
  hash_attr(Hash, "depends_on", ParentName, DepName, Type),
  hash_attr(Hash, "hash", DepName, DepHash),
  attr("splice_at_hash", node(ID, ParentName), node(SID, SpliceName), DepName, DepHash).

imposed_constraint(Hash, "virtual_on_edge", ParentName, SpliceName, VirtName) :-
  hash_attr(Hash, "virtual_on_edge", ParentName, DepName, VirtName),
  attr("splice_at_hash", node(ID, ParentName), node(SID, SpliceName), DepName, DepHash).
@ -1431,6 +1431,8 @@ def tree(
 class Spec:
     #: Cache for spec's prefix, computed lazily in the corresponding property
     _prefix = None
+    #: Cache for spec's length, computed lazily in the corresponding property
+    _length = None
     abstract_hash = None

     @staticmethod
@ -2907,7 +2909,7 @@ def _mark_concrete(self, value=True):
             if (not value) and s.concrete and s.installed:
                 continue
             elif not value:
-                s.clear_cached_hashes()
+                s.clear_caches()
             s._mark_root_concrete(value)

     def _finalize_concretization(self):
@ -3700,6 +3702,18 @@ def __getitem__(self, name: str):

         return child

+    def __len__(self):
+        if not self.concrete:
+            raise spack.error.SpecError(f"Cannot get length of abstract spec: {self}")
+
+        if not self._length:
+            self._length = 1 + sum(len(dep) for dep in self.dependencies())
+        return self._length
+
+    def __bool__(self):
+        # Need to define this so __len__ isn't used by default
+        return True
+
     def __contains__(self, spec):
         """True if this spec or some dependency satisfies the spec.

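A brief usage sketch for the two methods added above (hedged: it assumes a working Spack installation where `zlib` can be concretized):

```python
from spack.spec import Spec

s = Spec("zlib").concretized()
# A concrete spec's length counts the node itself plus its dependencies,
# exactly as the new __len__ computes it.
assert len(s) == 1 + sum(len(dep) for dep in s.dependencies())

# Abstract specs have no length (calling len() on one raises SpecError), but
# they remain truthy because __bool__ is defined explicitly; otherwise Python
# would fall back to __len__ for truth testing.
abstract = Spec("zlib")
assert bool(abstract)
```

This length is what `_resolve_automatic_splices` sorts on, so dependencies (which are always strictly shorter than their parents) are resolved first.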
@ -4256,7 +4270,7 @@ def _splice_detach_and_add_dependents(self, replacement, context):
         for ancestor in ancestors_in_context:
             # Only set it if it hasn't been spliced before
             ancestor._build_spec = ancestor._build_spec or ancestor.copy()
-            ancestor.clear_cached_hashes(ignore=(ht.package_hash.attr,))
+            ancestor.clear_caches(ignore=(ht.package_hash.attr,))
             for edge in ancestor.edges_to_dependencies(depflag=dt.BUILD):
                 if edge.depflag & ~dt.BUILD:
                     edge.depflag &= ~dt.BUILD
@ -4450,7 +4464,7 @@ def mask_build_deps(in_spec):

         return spec

-    def clear_cached_hashes(self, ignore=()):
+    def clear_caches(self, ignore=()):
         """
         Clears all cached hashes in a Spec, while preserving other properties.
         """
@ -4458,7 +4472,9 @@ def clear_cached_hashes(self, ignore=()):
             if h.attr not in ignore:
                 if hasattr(self, h.attr):
                     setattr(self, h.attr, None)
-        self._dunder_hash = None
+        for attr in ("_dunder_hash", "_prefix", "_length"):
+            if attr not in ignore:
+                setattr(self, attr, None)

     def __hash__(self):
         # If the spec is concrete, we leverage the process hash and just use
|
234
lib/spack/spack/test/abi_splicing.py
Normal file
234
lib/spack/spack/test/abi_splicing.py
Normal file
@ -0,0 +1,234 @@
|
|||||||
|
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
|
||||||
|
# Spack Project Developers. See the top-level COPYRIGHT file for details.
|
||||||
|
#
|
||||||
|
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
||||||
|
""" Test ABI-based splicing of dependencies """
|
||||||
|
|
||||||
|
from typing import List
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
import spack.config
|
||||||
|
import spack.deptypes as dt
|
||||||
|
import spack.package_base
|
||||||
|
import spack.paths
|
||||||
|
import spack.repo
|
||||||
|
import spack.solver.asp
|
||||||
|
from spack.installer import PackageInstaller
|
||||||
|
from spack.spec import Spec
|
||||||
|
|
||||||
|
|
||||||
|
class CacheManager:
|
||||||
|
def __init__(self, specs: List[str]) -> None:
|
||||||
|
self.req_specs = specs
|
||||||
|
self.concr_specs: List[Spec]
|
||||||
|
self.concr_specs = []
|
||||||
|
|
||||||
|
def __enter__(self):
|
||||||
|
self.concr_specs = [Spec(s).concretized() for s in self.req_specs]
|
||||||
|
for s in self.concr_specs:
|
||||||
|
PackageInstaller([s.package], fake=True, explicit=True).install()
|
||||||
|
|
||||||
|
def __exit__(self, exc_type, exc_val, exc_tb):
|
||||||
|
for s in self.concr_specs:
|
||||||
|
s.package.do_uninstall()
|
||||||
|
|
||||||
|
|
||||||
|
# MacOS and Windows only work if you pass this function pointer rather than a
|
||||||
|
# closure
|
||||||
|
def _mock_has_runtime_dependencies(_x):
|
||||||
|
return True
|
||||||
|
|
||||||
|
|
||||||
|
def _make_specs_non_buildable(specs: List[str]):
|
||||||
|
output_config = {}
|
||||||
|
for spec in specs:
|
||||||
|
output_config[spec] = {"buildable": False}
|
||||||
|
return output_config
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def splicing_setup(mutable_database, mock_packages, monkeypatch):
|
||||||
|
spack.config.set("concretizer:reuse", True)
|
||||||
|
monkeypatch.setattr(
|
||||||
|
spack.solver.asp, "_has_runtime_dependencies", _mock_has_runtime_dependencies
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def _enable_splicing():
|
||||||
|
spack.config.set("concretizer:splice", {"automatic": True})
|
||||||
|
|
||||||
|
|
||||||
|
def _has_build_dependency(spec: Spec, name: str):
|
||||||
|
return any(s.name == name for s in spec.dependencies(None, dt.BUILD))
|
||||||
|
|
||||||
|
|
||||||
|
def test_simple_reuse(splicing_setup):
|
||||||
|
with CacheManager(["splice-z@1.0.0+compat"]):
|
||||||
|
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
|
||||||
|
assert Spec("splice-z").concretized().satisfies(Spec("splice-z"))
|
||||||
|
|
||||||
|
|
||||||
|
def test_simple_dep_reuse(splicing_setup):
|
||||||
|
with CacheManager(["splice-z@1.0.0+compat"]):
|
||||||
|
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
|
||||||
|
assert Spec("splice-h@1").concretized().satisfies(Spec("splice-h@1"))
|
||||||
|
|
||||||
|
|
||||||
|
def test_splice_installed_hash(splicing_setup):
|
||||||
|
cache = [
|
||||||
|
"splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0",
|
||||||
|
"splice-h@1.0.2+compat ^splice-z@1.0.0",
|
||||||
|
]
|
||||||
|
with CacheManager(cache):
|
||||||
|
packages_config = _make_specs_non_buildable(["splice-t", "splice-h"])
|
||||||
|
spack.config.set("packages", packages_config)
|
||||||
|
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0")
|
||||||
|
with pytest.raises(Exception):
|
||||||
|
goal_spec.concretized()
|
||||||
|
_enable_splicing()
|
||||||
|
assert goal_spec.concretized().satisfies(goal_spec)
|
||||||
|
|
||||||
|
|
||||||
|
def test_splice_build_splice_node(splicing_setup):
|
||||||
|
with CacheManager(["splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat"]):
|
||||||
|
spack.config.set("packages", _make_specs_non_buildable(["splice-t"]))
|
||||||
|
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0+compat")
|
||||||
|
with pytest.raises(Exception):
|
||||||
|
goal_spec.concretized()
|
||||||
|
_enable_splicing()
|
||||||
|
assert goal_spec.concretized().satisfies(goal_spec)
|
||||||
|
|
||||||
|
|
||||||
|
def test_double_splice(splicing_setup):
|
||||||
|
cache = [
|
||||||
|
"splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat",
|
||||||
|
"splice-h@1.0.2+compat ^splice-z@1.0.1+compat",
|
||||||
|
"splice-z@1.0.2+compat",
|
||||||
|
]
|
||||||
|
with CacheManager(cache):
|
||||||
|
freeze_builds_config = _make_specs_non_buildable(["splice-t", "splice-h", "splice-z"])
|
||||||
|
spack.config.set("packages", freeze_builds_config)
|
||||||
|
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.2+compat")
|
||||||
|
with pytest.raises(Exception):
|
||||||
|
goal_spec.concretized()
|
||||||
|
_enable_splicing()
|
||||||
|
        assert goal_spec.concretized().satisfies(goal_spec)


# The next two tests are mirrors of one another
def test_virtual_multi_splices_in(splicing_setup):
    cache = [
        "depends-on-virtual-with-abi ^virtual-abi-1",
        "depends-on-virtual-with-abi ^virtual-abi-2",
    ]
    goal_specs = [
        "depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
        "depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
    ]
    with CacheManager(cache):
        spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
        for gs in goal_specs:
            with pytest.raises(Exception):
                Spec(gs).concretized()
        _enable_splicing()
        for gs in goal_specs:
            assert Spec(gs).concretized().satisfies(gs)


def test_virtual_multi_can_be_spliced(splicing_setup):
    cache = [
        "depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
        "depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
    ]
    goal_specs = [
        "depends-on-virtual-with-abi ^virtual-abi-1",
        "depends-on-virtual-with-abi ^virtual-abi-2",
    ]
    with CacheManager(cache):
        spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
        with pytest.raises(Exception):
            for gs in goal_specs:
                Spec(gs).concretized()
        _enable_splicing()
        for gs in goal_specs:
            assert Spec(gs).concretized().satisfies(gs)


def test_manyvariant_star_matching_variant_splice(splicing_setup):
    cache = [
        # can_splice("manyvariants@1.0.0", when="@1.0.1", match_variants="*")
        "depends-on-manyvariants ^manyvariants@1.0.0+a+b c=v1 d=v2",
        "depends-on-manyvariants ^manyvariants@1.0.0~a~b c=v3 d=v3",
    ]
    goal_specs = [
        Spec("depends-on-manyvariants ^manyvariants@1.0.1+a+b c=v1 d=v2"),
        Spec("depends-on-manyvariants ^manyvariants@1.0.1~a~b c=v3 d=v3"),
    ]
    with CacheManager(cache):
        freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
        spack.config.set("packages", freeze_build_config)
        for goal in goal_specs:
            with pytest.raises(Exception):
                goal.concretized()
        _enable_splicing()
        for goal in goal_specs:
            assert goal.concretized().satisfies(goal)


def test_manyvariant_limited_matching(splicing_setup):
    cache = [
        # can_splice("manyvariants@2.0.0+a~b", when="@2.0.1~a+b", match_variants=["c", "d"])
        "depends-on-manyvariants@2.0 ^manyvariants@2.0.0+a~b c=v3 d=v2",
        # can_splice("manyvariants@2.0.0 c=v1 d=v1", when="@2.0.1+a+b")
        "depends-on-manyvariants@2.0 ^manyvariants@2.0.0~a~b c=v1 d=v1",
    ]
    goal_specs = [
        Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1~a+b c=v3 d=v2"),
        Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1+a+b c=v3 d=v3"),
    ]
    with CacheManager(cache):
        freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
        spack.config.set("packages", freeze_build_config)
        for s in goal_specs:
            with pytest.raises(Exception):
                s.concretized()
        _enable_splicing()
        for s in goal_specs:
            assert s.concretized().satisfies(s)


def test_external_splice_same_name(splicing_setup):
    cache = [
        "splice-h@1.0.0 ^splice-z@1.0.0+compat",
        "splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.1+compat",
    ]
    packages_yaml = {
        "splice-z": {"externals": [{"spec": "splice-z@1.0.2+compat", "prefix": "/usr"}]}
    }
    goal_specs = [
        Spec("splice-h@1.0.0 ^splice-z@1.0.2"),
        Spec("splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.2"),
    ]
    with CacheManager(cache):
        spack.config.set("packages", packages_yaml)
        _enable_splicing()
        for s in goal_specs:
            assert s.concretized().satisfies(s)


def test_spliced_build_deps_only_in_build_spec(splicing_setup):
    cache = ["splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.0"]
    goal_spec = Spec("splice-t@1.0 ^splice-h@1.0.2 ^splice-z@1.0.0")

    with CacheManager(cache):
        _enable_splicing()
        concr_goal = goal_spec.concretized()
        build_spec = concr_goal._build_spec
        # Spec has been spliced
        assert build_spec is not None
        # Build spec has spliced build dependencies
        assert _has_build_dependency(build_spec, "splice-h")
        assert _has_build_dependency(build_spec, "splice-z")
        # Spliced build dependencies are removed
        assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
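The tests above lean on a handful of helpers defined earlier in the test module and not shown in this hunk (`CacheManager`, `_make_specs_non_buildable`, `_enable_splicing`, `_has_build_dependency`). A minimal sketch of what they are assumed to do, based only on how they are used here and on the `concretizer:splice` option described in the summary; `CacheManager` is assumed to be a context manager that installs the listed specs so they are available for reuse:

    import spack.config
    import spack.deptypes as dt


    def _make_specs_non_buildable(names):
        # Assumed shape: a packages config marking each named package non-buildable,
        # mirroring the freeze_build_config dicts built inline in other tests.
        return {name: {"buildable": False} for name in names}


    def _enable_splicing():
        # Assumed behavior: turn on the boolean concretizer:splice option so the
        # solver loads splices.lp and may synthesize splices during reuse.
        spack.config.set("concretizer:splice", True)


    def _has_build_dependency(spec, name):
        # Assumed behavior: check for a direct build-type dependency with this name.
        return any(d.name == name for d in spec.dependencies(None, dt.BUILD))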
@@ -311,7 +311,19 @@ def test_pkg_grep(mock_packages, capfd):
     output, _ = capfd.readouterr()
     assert output.strip() == "\n".join(
         spack.repo.PATH.get_pkg_class(name).module.__file__
-        for name in ["splice-a", "splice-h", "splice-t", "splice-vh", "splice-vt", "splice-z"]
+        for name in [
+            "depends-on-manyvariants",
+            "manyvariants",
+            "splice-a",
+            "splice-h",
+            "splice-t",
+            "splice-vh",
+            "splice-vt",
+            "splice-z",
+            "virtual-abi-1",
+            "virtual-abi-2",
+            "virtual-abi-multi",
+        ]
     )

     # ensure that this string isn't fouhnd
@@ -1763,8 +1763,8 @@ def test_package_hash_affects_dunder_and_dag_hash(mock_packages, default_mock_co
     assert a1.dag_hash() == a2.dag_hash()
     assert a1.process_hash() == a2.process_hash()

-    a1.clear_cached_hashes()
-    a2.clear_cached_hashes()
+    a1.clear_caches()
+    a2.clear_caches()

     # tweak the dag hash of one of these specs
     new_hash = "00000000000000000000000000000000"
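The hunk above reflects a rename of the hash-cache reset method from `clear_cached_hashes` to `clear_caches`. A short usage sketch of the renamed call, using only the no-argument form that appears in this diff:

    # Hypothetical usage of the renamed method: after mutating a concrete spec's
    # metadata, drop its cached hashes so they are recomputed on next access.
    spec = Spec("splice-t@1.0").concretized()
    spec.clear_caches()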
@@ -0,0 +1,25 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class DependsOnManyvariants(Package):
    """
    A package with a dependency on `manyvariants`, so that `manyvariants` can
    be spliced in tests.
    """

    homepage = "https://www.test.com"
    has_code = False

    version("1.0")
    version("2.0")

    depends_on("manyvariants@1.0", when="@1.0")
    depends_on("manyvariants@2.0", when="@2.0")

    def install(self, spec, prefix):
        touch(prefix.bar)
@@ -0,0 +1,19 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class DependsOnVirtualWithAbi(Package):
    """
    This has a virtual dependency on `virtual-with-abi`, mostly for testing
    automatic splicing of providers.
    """

    homepage = "https://www.example.com"
    has_code = False

    version("1.0")
    depends_on("virtual-with-abi")
@@ -0,0 +1,33 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class Manyvariants(Package):
    """
    A package with 4 variants of different arities to test the
    `match_variants` argument to `can_splice`.
    """

    homepage = "https://www.test.com"
    has_code = False

    version("2.0.1")
    version("2.0.0")
    version("1.0.1")
    version("1.0.0")

    variant("a", default=True)
    variant("b", default=False)
    variant("c", values=("v1", "v2", "v3"), multi=False, default="v1")
    variant("d", values=("v1", "v2", "v3"), multi=False, default="v1")

    can_splice("manyvariants@1.0.0", when="@1.0.1", match_variants="*")
    can_splice("manyvariants@2.0.0+a~b", when="@2.0.1~a+b", match_variants=["c", "d"])
    can_splice("manyvariants@2.0.0 c=v1 d=v1", when="@2.0.1+a+b")

    def install(self, spec, prefix):
        touch(prefix.bar)
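The three `can_splice` directives above exercise the different `match_variants` modes. The snippet below is illustrative only, not Spack's implementation: it sketches the matching rule that `match_variants` expresses, namely that the spliced-in spec must agree with the replaced spec on the listed variants (or on all of them for `"*"`).

    # Illustrative sketch only (not Spack's implementation) of the match_variants rule.
    def variants_match(old_variants, new_variants, match_variants):
        names = old_variants.keys() if match_variants == "*" else match_variants
        return all(old_variants[v] == new_variants[v] for v in names)


    # manyvariants@1.0.1 may replace manyvariants@1.0.0 only when every variant agrees:
    assert variants_match(
        {"a": True, "b": True, "c": "v1", "d": "v2"},
        {"a": True, "b": True, "c": "v1", "d": "v2"},
        "*",
    )
    # ...while @2.0.1~a+b replacing @2.0.0+a~b only needs "c" and "d" to agree:
    assert variants_match({"c": "v3", "d": "v2"}, {"c": "v3", "d": "v2"}, ["c", "d"])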
@@ -12,17 +12,24 @@ class SpliceH(Package):
     homepage = "http://www.example.com"
     url = "http://www.example.com/splice-h-1.0.tar.gz"

-    version("1.0", md5="0123456789abcdef0123456789abcdef")
+    version("1.0.2")
+    version("1.0.1")
+    version("1.0.0")

     variant("foo", default=False, description="nope")
     variant("bar", default=False, description="nope")
     variant("baz", default=False, description="nope")
+    variant("compat", default=True, description="nope")

     depends_on("splice-z")
     depends_on("splice-z+foo", when="+foo")

     provides("something")
     provides("somethingelse")
+    provides("virtual-abi")
+
+    can_splice("splice-h@1.0.0 +compat", when="@1.0.1 +compat")
+    can_splice("splice-h@1.0.0:1.0.1 +compat", when="@1.0.2 +compat")

     def install(self, spec, prefix):
         with open(prefix.join("splice-h"), "w") as f:
@@ -12,10 +12,16 @@ class SpliceZ(Package):
     homepage = "http://www.example.com"
     url = "http://www.example.com/splice-z-1.0.tar.gz"

-    version("1.0", md5="0123456789abcdef0123456789abcdef")
+    version("1.0.2")
+    version("1.0.1")
+    version("1.0.0")

     variant("foo", default=False, description="nope")
     variant("bar", default=False, description="nope")
+    variant("compat", default=True, description="nope")
+
+    can_splice("splice-z@1.0.0 +compat", when="@1.0.1 +compat")
+    can_splice("splice-z@1.0.0:1.0.1 +compat", when="@1.0.2 +compat")

     def install(self, spec, prefix):
         with open(prefix.join("splice-z"), "w") as f:
@@ -0,0 +1,25 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class VirtualAbi1(Package):
    """
    This package provides `virtual-with-abi` and is conditionally ABI
    compatible with `virtual-abi-multi`
    """

    homepage = "https://www.example.com"
    has_code = False

    version("1.0")

    provides("virtual-with-abi")

    can_splice("virtual-abi-multi@1.0 abi=one", when="@1.0")

    def install(self, spec, prefix):
        touch(prefix.foo)
@@ -0,0 +1,25 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class VirtualAbi2(Package):
    """
    This package provides `virtual-with-abi` and is conditionally ABI
    compatible with `virtual-abi-multi`
    """

    homepage = "https://www.example.com"
    has_code = False

    version("1.0")

    provides("virtual-with-abi")

    can_splice("virtual-abi-multi@1.0 abi=two", when="@1.0")

    def install(self, spec, prefix):
        touch(prefix.foo)
@@ -0,0 +1,29 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class VirtualAbiMulti(Package):
    """
    This package provides `virtual-with-abi` and is ABI compatible with
    either `virtual-abi-1` or `virtual-abi-2`, depending on the value of
    its `abi` variant.
    """

    homepage = "https://www.example.com"
    has_code = False

    version("1.0")

    variant("abi", default="custom", multi=False, values=("one", "two", "custom"))

    provides("virtual-with-abi")

    can_splice("virtual-abi-1@1.0", when="@1.0 abi=one")
    can_splice("virtual-abi-2@1.0", when="@1.0 abi=two")

    def install(self, spec, prefix):
        touch(prefix.foo)
@@ -0,0 +1,16 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class VirtualWithAbi(Package):
    """Virtual package for mocking an interface with a stable ABI."""

    homepage = "https://www.abi.org/"
    virtual = True

    def test_hello(self):
        print("Hello there!")