Compare commits

...

20 Commits

Author SHA1 Message Date
Todd Gamblin
970f18cd45
add back patch level
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-28 01:33:56 -07:00
Todd Gamblin
7b4aa6739f
clean up test
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-28 01:22:35 -07:00
Todd Gamblin
fbdb5a9c95
remove redundant list element
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-28 01:17:20 -07:00
Todd Gamblin
eda1a6f60f
source_id -> source_provenance
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-28 01:16:16 -07:00
Todd Gamblin
bb77a7733c
remove superfluous debug lines in test
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:55:53 -07:00
Todd Gamblin
cb588d933c
remove straggling uses of hash_types.package_hash
- [x] don't allow calling `spec.package_hash()` on abstract specs
- [x] get rid of last few uses of `ht.package_hash`, which was removed.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:55:51 -07:00
Todd Gamblin
552a35e12e
bugfix: submodule callbacks need to be resolved in specs
The submodules argument for git versions supports callbacks (which are used on some key
packages to conditionally fetch certain submodules). Callbacks can't be serialized to
JSON, so we need to ensure that they are resolved (by calling them) at concretization time.

- [x] ensure that submodule callbacks are called when creating spec JSON
- [x] add tests that try serializing all types of git versions
2024-10-27 17:54:09 -07:00
Todd Gamblin
d5f4c69076
don't remove patches variant
Patches variant needs to stay on the spec even if we also have resources.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:09 -07:00
Todd Gamblin
b8ecccbde9
installer: show tracebacks from builds to debug failed installs
`installer.py` currently swallows the traceback and preserves only
the error message if a build process fails.

- [x] preserve exceptions from failed build processes
- [x] print a full traceback for each one when running with `spack -d`

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:09 -07:00
Todd Gamblin
b7ebcc4a7b
extra debug statements
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:09 -07:00
Todd Gamblin
74684ea089
fix string in test
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:09 -07:00
Todd Gamblin
97f868a54e
avoid circular import of spack.patch
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:08 -07:00
Todd Gamblin
4005ae8651
remove comment
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:08 -07:00
Todd Gamblin
cf9720e00d
remove useless check
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:08 -07:00
Todd Gamblin
752c0d4fe2
clarify comment on use of _package_hash
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:08 -07:00
Todd Gamblin
4a2f5f4b50
abbreviate hashes in mock packages
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:08 -07:00
Todd Gamblin
de62f7fed2
don't refer to source id
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:08 -07:00
Todd Gamblin
e296d19146
update docstring
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:08 -07:00
Todd Gamblin
d06d812013
remove TODO: resources
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:08 -07:00
Todd Gamblin
eef041abee
specs: include source provenance in spec.json and package hash
We've included a package hash in Spack since #7193 for CI, and we started using it on
the spec in #28504. However, what goes into the package hash is a bit opaque. Here's
what `spec.json` looks like now:

```json
{
  "spec": {
    "_meta": {
      "version": 3
    },
    "nodes": [
      {
        "name": "zlib",
        "version": "1.2.12",
        ...
        "patches": [
          "0d38234384870bfd34dfcb738a9083952656f0c766a0f5990b1893076b084b76"
        ],
        "package_hash": "pthf7iophdyonixxeed7gyqiksopxeklzzjbxtjrw7nzlkcqleba====",
        "hash": "ke4alug7ypoxp37jb6namwlxssmws4kp"
      }
    ]
  }
}
```

The `package_hash` there is a hash of the concatenation of:

* A canonical hash of the `package.py` recipe, as implemented in #28156;
* `sha256`'s of patches applied to the spec; and
* `sha256` sums of archives, or commits/revisions of repositories, used to build the spec.

There are some issues with this: patches are counted twice in this spec (in `patches`
and in the `package_hash`), the hashes of sources used to build are conflated with the
`package.py` hash, and we don't actually include resources anywhere.
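
For reference, the old scheme reduced all of those components to a single opaque digest, roughly like this (a simplified sketch of the removed `content_hash()` logic; the function and argument names are illustrative, not Spack API):

```python
import base64
import hashlib
from typing import List, Optional


def old_style_package_hash(recipe_hash: str, patch_ids: List[str], source_id: Optional[str]) -> str:
    """Sketch of the old content hash: concatenate everything, sha256 it, base32-encode it."""
    components = [recipe_hash.encode("utf-8")]
    components += [p.encode("utf-8") for p in patch_ids]  # e.g. "<patch sha256>:<level>"
    components.append((source_id or "").encode("utf-8"))  # archive sha256 or commit
    # sort for stability, then hash and encode like other Spack hashes
    digest = hashlib.sha256(b"".join(sorted(components))).digest()
    return base64.b32encode(digest).lower().decode("utf-8")
```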

With this PR, I've expanded the package hash out in the `spec.json` body. Here is the
"same" spec with the new fields:

```json
{
  "spec": {
    "_meta": {
      "version": 3
    },
    "nodes": [
      {
        "name": "zlib",
        "version": "1.2.12",
        ...
        "package_hash": "6kkliqdv67ucuvfpfdwaacy5bz6s6en4",
        "sources": [
          {
            "type": "archive",
            "sha256": "91844808532e5ce316b3c010929493c0244f3d37593afd6de04f71821d5136d9"
          }
        ],
        "patches": [
          "0d38234384870bfd34dfcb738a9083952656f0c766a0f5990b1893076b084b76"
        ],
        "hash": "ts3gkpltbgzr5y6nrfy6rzwbjmkscein"
      }
    ]
  }
}
```

Now:

* Patches and archive hashes are no longer included in the `package_hash`;
* Artifacts used in the build go in `sources`, and we tell you their checksum in the `spec.json`;
* `sources` will include resources for packages that have them;
* Patches are the same as before -- but only represented once; and
* The `package_hash` is a base32-encoded `sha1`, like other hashes in Spack, and it now
  reflects only changes to the `package.py` recipe.

The behavior of the DAG hash (which includes the `package_hash`) is basically the same
as before, except now resources are included, and we can see differences in archives and
resources directly in the `spec.json`.

Note that we do not need to bump the spec meta version on this, as past versions of
Spack can still read the new specs; they just will not notice the new fields (which is
fine, since we currently do not do anything with them).

Among other things, this will more easily allow us to convert Spack specs to SBOM and
track relevant security information (like `sha256`'s of archives). For example, we could
do continuous scanning of a Spack installation based on these hashes, and if the
`sha256`'s become associated with CVE's, we'll know we're affected.
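
A minimal sketch of what such a scan could look like, assuming the new `spec.json` layout shown above (`known_bad_sha256s` stands in for a hypothetical advisory feed, not anything Spack provides):

```python
import json


def sources_affected_by(spec_json_path, known_bad_sha256s):
    """Map package names to recorded source checksums that appear in an advisory set."""
    with open(spec_json_path) as f:
        nodes = json.load(f)["spec"]["nodes"]

    affected = {}
    for node in nodes:
        bad = [s["sha256"] for s in node.get("sources", []) if s.get("sha256") in known_bad_sha256s]
        if bad:
            affected[node["name"]] = bad
    return affected
```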

- [x] Add a method, `spec_attrs()` to `FetchStrategy` that can be used to describe a
      fetcher for a `spec.json`.

- [x] Simplify the way package_hash() is handled in Spack. Previously, it was handled as
      a special-case spec hash in `hash_types.py`, but it really doesn't belong there.
      Now, it's handled as part of `Spec._finalize_concretization()` and `hash_types.py`
      is much simpler.

- [x] Change `PackageBase.content_hash()` to `PackageBase.artifact_hashes()`, and
      include more information about artifacts in it.

- [x] Update package hash tests and make them check for artifact and resource hashes.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-27 17:54:08 -07:00
16 changed files with 288 additions and 189 deletions

View File

```diff
@@ -74,6 +74,6 @@ def store(self, fetcher, relative_dest):

 #: Spack's local cache for downloaded source archives
-FETCH_CACHE: Union[spack.fetch_strategy.FsCache, llnl.util.lang.Singleton] = (
+FETCH_CACHE: Union["spack.fetch_strategy.FsCache", llnl.util.lang.Singleton] = (
     llnl.util.lang.Singleton(_fetch_cache)
 )
```

View File

```diff
@@ -33,7 +33,7 @@
 import urllib.parse
 import urllib.request
 from pathlib import PurePath
-from typing import List, Optional
+from typing import Dict, List, Optional

 import llnl.url
 import llnl.util
@@ -49,6 +49,7 @@
 import spack.util.archive
 import spack.util.crypto as crypto
 import spack.util.git
+import spack.util.spack_yaml as syaml
 import spack.util.url as url_util
 import spack.util.web as web_util
 import spack.version
@@ -110,6 +111,28 @@ def __init__(self, **kwargs):
         self.package = None

+    def source_provenance(self) -> Dict:
+        """Create a metadata dictionary that describes the artifacts fetched by this FetchStrategy.
+
+        The returned dictionary is added to the content used to determine the full hash
+        for a package. It should be serializable as JSON.
+
+        It should include data like sha256 hashes for archives, commits for source
+        repositories, and any information needed to describe exactly what artifacts went
+        into a build.
+
+        If a package has no source artifacts, it should return an empty dictionary.
+        """
+        attrs = syaml.syaml_dict()
+        if self.url_attr:
+            attrs["type"] = "archive" if self.url_attr == "url" else self.url_attr
+
+        for attr in self.optional_attrs:
+            value = getattr(self, attr, None)
+            if value:
+                attrs[attr] = value
+
+        return attrs
+
     def set_package(self, package):
         self.package = package
@@ -152,17 +175,6 @@ def cachable(self):
             bool: True if can cache, False otherwise.
         """

-    def source_id(self):
-        """A unique ID for the source.
-
-        It is intended that a human could easily generate this themselves using
-        the information available to them in the Spack package.
-
-        The returned value is added to the content which determines the full
-        hash for a package using `str()`.
-        """
-        raise NotImplementedError
-
     def mirror_id(self):
         """This is a unique ID for a source that is intended to help identify
         reuse of resources across packages.
@@ -213,9 +225,9 @@ def cachable(self):
         """Report False as there is no code to cache."""
         return False

-    def source_id(self):
-        """BundlePackages don't have a source id."""
-        return ""
+    def source_provenance(self) -> Dict:
+        """BundlePackages don't have a source of their own."""
+        return {}

     def mirror_id(self):
         """BundlePackages don't have a mirror id."""
@@ -260,8 +272,15 @@ def curl(self) -> Executable:
             self._curl = web_util.require_curl()
         return self._curl

-    def source_id(self):
-        return self.digest
+    def source_provenance(self) -> Dict:
+        attrs = super().source_provenance()
+        if self.digest:
+            try:
+                hash_type = spack.util.crypto.hash_algo_for_digest(self.digest)
+            except ValueError:
+                hash_type = "digest"
+            attrs[hash_type] = self.digest
+        return attrs

     def mirror_id(self):
         if not self.digest:
@@ -772,9 +791,15 @@ def git(self):
     def cachable(self):
         return self.cache_enabled and bool(self.commit)

-    def source_id(self):
-        # TODO: tree-hash would secure download cache and mirrors, commit only secures checkouts.
-        return self.commit
+    def source_provenance(self) -> Dict:
+        attrs = super().source_provenance()
+
+        # need to fully resolve submodule callbacks for node dicts
+        submodules = attrs.get("submodules", None)
+        if submodules and callable(submodules):
+            attrs["submodules"] = submodules(self.package)
+
+        return attrs

     def mirror_id(self):
         if self.commit:
@@ -1084,17 +1109,6 @@ def cvs(self):
     def cachable(self):
         return self.cache_enabled and (bool(self.branch) or bool(self.date))

-    def source_id(self):
-        if not (self.branch or self.date):
-            # We need a branch or a date to make a checkout reproducible
-            return None
-        id = "id"
-        if self.branch:
-            id += "-branch=" + self.branch
-        if self.date:
-            id += "-date=" + self.date
-        return id
-
     def mirror_id(self):
         if not (self.branch or self.date):
             # We need a branch or a date to make a checkout reproducible
@@ -1197,9 +1211,6 @@ def svn(self):
     def cachable(self):
         return self.cache_enabled and bool(self.revision)

-    def source_id(self):
-        return self.revision
-
     def mirror_id(self):
         if self.revision:
             repo_path = urllib.parse.urlparse(self.url).path
@@ -1307,9 +1318,6 @@ def hg(self):
     def cachable(self):
         return self.cache_enabled and bool(self.revision)

-    def source_id(self):
-        return self.revision
-
     def mirror_id(self):
         if self.revision:
             repo_path = urllib.parse.urlparse(self.url).path
```
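
To illustrate the shape of the metadata these `source_provenance()` implementations produce, here are hand-written examples (not captured from a real run): an archive fetcher reports its checksum, a git fetcher its commit and any resolved submodules. Resolving submodule callbacks before this point matters because function objects cannot be serialized:

```python
import json

# Hand-written, illustrative source_provenance() results:
archive_provenance = {
    "type": "archive",
    "sha256": "91844808532e5ce316b3c010929493c0244f3d37593afd6de04f71821d5136d9",
}
git_provenance = {"type": "git", "commit": "abc34", "submodules": ["a", "b"]}

# This serializes cleanly only because the submodules callback has already been
# called and replaced by its return value; a function object here would raise TypeError.
print(json.dumps([archive_provenance, git_provenance], indent=2))
```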

View File

```diff
@@ -5,7 +5,6 @@
 """Definitions that control how Spack creates Spec hashes."""

 import spack.deptypes as dt
-import spack.repo

 hashes = []
@@ -13,20 +12,17 @@
 class SpecHashDescriptor:
     """This class defines how hashes are generated on Spec objects.

-    Spec hashes in Spack are generated from a serialized (e.g., with
-    YAML) representation of the Spec graph. The representation may only
-    include certain dependency types, and it may optionally include a
-    canonicalized hash of the package.py for each node in the graph.
+    Spec hashes in Spack are generated from a serialized JSON representation of the DAG.
+    The representation may only include certain dependency types, and it may optionally
+    include a canonicalized hash of the ``package.py`` for each node in the graph.

-    We currently use different hashes for different use cases."""
+    """

-    def __init__(self, depflag: dt.DepFlag, package_hash, name, override=None):
+    def __init__(self, depflag: dt.DepFlag, package_hash, name):
         self.depflag = depflag
         self.package_hash = package_hash
         self.name = name
         hashes.append(self)
-        # Allow spec hashes to have an alternate computation method
-        self.override = override

     @property
     def attr(self):
@@ -54,18 +50,6 @@ def __repr__(self):
     )


-def _content_hash_override(spec):
-    pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
-    pkg = pkg_cls(spec)
-    return pkg.content_hash()
-
-
-#: Package hash used as part of dag hash
-package_hash = SpecHashDescriptor(
-    depflag=0, package_hash=True, name="package_hash", override=_content_hash_override
-)
-
-
 # Deprecated hash types, no longer used, but needed to understand old serialized
 # spec formats
```

View File

```diff
@@ -36,6 +36,7 @@
 import shutil
 import sys
 import time
+import traceback
 from collections import defaultdict
 from gzip import GzipFile
 from typing import Dict, Iterator, List, Optional, Set, Tuple, Union
@@ -2214,7 +2215,7 @@ def install(self) -> None:
                 if task.is_build_request:
                     if single_requested_spec:
                         raise
-                    failed_build_requests.append((pkg, pkg_id, str(exc)))
+                    failed_build_requests.append((pkg, pkg_id, exc))

             finally:
                 # Remove the install prefix if anything went wrong during
@@ -2241,6 +2242,9 @@ def install(self) -> None:
         if failed_build_requests or missing:
             for _, pkg_id, err in failed_build_requests:
                 tty.error(f"{pkg_id}: {err}")
+                if spack.error.debug:
+                    # note: in python 3.10+ this can just be print_exception(err)
+                    traceback.print_exception(type(err), err, err.__traceback__)

             for _, pkg_id in missing:
                 tty.error(f"{pkg_id}: Package was not installed")
```
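
The note in the hunk above refers to the Python 3.10 change that lets `traceback.print_exception` take a bare exception object; a small compatibility sketch of that pattern:

```python
import sys
import traceback


def print_stored_traceback(err: BaseException) -> None:
    """Print the traceback attached to a previously captured exception."""
    if sys.version_info >= (3, 10):
        traceback.print_exception(err)  # single-argument form, added in 3.10
    else:
        traceback.print_exception(type(err), err, err.__traceback__)
```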

View File

```diff
@@ -9,12 +9,10 @@
 packages.
 """

-import base64
 import collections
 import copy
 import functools
 import glob
-import hashlib
 import importlib
 import io
 import os
@@ -49,14 +47,15 @@
 import spack.store
 import spack.url
 import spack.util.environment
+import spack.util.package_hash as ph
 import spack.util.path
+import spack.util.spack_yaml as syaml
 import spack.util.web
 from spack.error import InstallError, NoURLError, PackageError
 from spack.filesystem_view import YamlFilesystemView
 from spack.install_test import PackageTest, TestSuite
 from spack.solver.version_order import concretization_version_order
 from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite, compute_stage_name
-from spack.util.package_hash import package_hash
 from spack.version import GitVersion, StandardVersion

 FLAG_HANDLER_RETURN_TYPE = Tuple[
@@ -1754,65 +1753,78 @@ def all_patches(cls):
         return patches

-    def content_hash(self, content=None):
-        """Create a hash based on the artifacts and patches used to build this package.
+    def artifact_hashes(self, content=None):
+        """Create a dictionary of hashes of artifacts used in the build of this package.

         This includes:
         * source artifacts (tarballs, repositories) used to build;
         * content hashes (``sha256``'s) of all patches applied by Spack; and
         * canonicalized contents the ``package.py`` recipe used to build.

-        This hash is only included in Spack's DAG hash for concrete specs, but if it
-        happens to be called on a package with an abstract spec, only applicable (i.e.,
-        determinable) portions of the hash will be included.
+        Example::
+
+            {
+                "package_hash": "qovi2hm2n2qsatng2r4n55yzjlhnwflx",
+                "sources": [
+                    {
+                        "sha256": "fc5fd69bb8736323f026672b1b7235da613d7177e72558893a0bdcd320466d60",
+                        "type": "archive"
+                    },
+                    {
+                        "sha256": "56ab9b90f5acbc42eb7a94cf482e6c058a63e8a1effdf572b8b2a6323a06d923",
+                        "type": "archive"
+                    }
+                ]
+            }
+
+        All hashes are added to concrete specs at the end of concretization. If this
+        method is called on an abstract spec, only hashes that can be known from the
+        abstract spec will be included.
         """
-        # list of components to make up the hash
-        hash_content = []
+        hashes = syaml.syaml_dict()

         # source artifacts/repositories
-        # TODO: resources
         if self.spec.versions.concrete:
+            sources = []
             try:
-                source_id = fs.for_package_version(self).source_id()
+                fetcher = fs.for_package_version(self)
+                provenance_dict = fetcher.source_provenance()
+                if provenance_dict:
+                    sources.append(provenance_dict)
             except (fs.ExtrapolationError, fs.InvalidArgsError):
                 # ExtrapolationError happens if the package has no fetchers defined.
                 # InvalidArgsError happens when there are version directives with args,
                 # but none of them identifies an actual fetcher.
-                source_id = None

-            if not source_id:
-                # TODO? in cases where a digest or source_id isn't available,
-                # should this attempt to download the source and set one? This
-                # probably only happens for source repositories which are
-                # referenced by branch name rather than tag or commit ID.
+                # if this is a develop spec, say so
                 from_local_sources = "dev_path" in self.spec.variants
+
+                # don't bother setting a hash if none is available, but warn if
+                # it seems like there should be one.
                 if self.has_code and not self.spec.external and not from_local_sources:
-                    message = "Missing a source id for {s.name}@{s.version}"
+                    message = "Missing a hash for {s.name}@{s.version}"
                     tty.debug(message.format(s=self))
-                    hash_content.append("".encode("utf-8"))
-            else:
-                hash_content.append(source_id.encode("utf-8"))
+
+            for resource in self._get_needed_resources():
+                sources.append(resource.fetcher.source_provenance())
+
+            if sources:
+                hashes["sources"] = sources

         # patch sha256's
         # Only include these if they've been assigned by the concretizer.
         # We check spec._patches_assigned instead of spec.concrete because
         # we have to call package_hash *before* marking specs concrete
         if self.spec._patches_assigned():
-            hash_content.extend(
-                ":".join((p.sha256, str(p.level))).encode("utf-8") for p in self.spec.patches
-            )
+            hashes["patches"] = [
+                {"sha256": patch.sha256, "level": patch.level} for patch in self.spec.patches
+            ]

         # package.py contents
-        hash_content.append(package_hash(self.spec, source=content).encode("utf-8"))
-
-        # put it all together and encode as base32
-        b32_hash = base64.b32encode(
-            hashlib.sha256(bytes().join(sorted(hash_content))).digest()
-        ).lower()
-        b32_hash = b32_hash.decode("utf-8")
-
-        return b32_hash
+        hashes["package_hash"] = ph.package_hash(self.spec, source=content)
+
+        return hashes

     @property
     def cmake_prefix_paths(self):
```

View File

```diff
@@ -805,7 +805,7 @@ def tag_index(self) -> spack.tag.TagIndex:
         return self._tag_index

     @property
-    def patch_index(self) -> spack.patch.PatchCache:
+    def patch_index(self) -> "spack.patch.PatchCache":
         """Merged PatchIndex from all Repos in the RepoPath."""
         if self._patch_index is None:
             self._patch_index = spack.patch.PatchCache(repository=self)
@@ -1158,7 +1158,7 @@ def tag_index(self) -> spack.tag.TagIndex:
         return self.index["tags"]

     @property
-    def patch_index(self) -> spack.patch.PatchCache:
+    def patch_index(self) -> "spack.patch.PatchCache":
        """Index of patches and packages they're defined on."""
        return self.index["patches"]
```

View File

```diff
@@ -1478,6 +1478,12 @@ def __init__(
         for h in ht.hashes:
             setattr(self, h.attr, None)

+        # hash of package.py at the time of concretization
+        self._package_hash = None
+
+        # dictionary of source artifact hashes, set at concretization time
+        self._artifact_hashes = None
+
         # Python __hash__ is handled separately from the cached spec hashes
         self._dunder_hash = None
@@ -1968,10 +1974,6 @@ def spec_hash(self, hash):
         Arguments:
             hash (spack.hash_types.SpecHashDescriptor): type of hash to generate.
         """
-        # TODO: currently we strip build dependencies by default. Rethink
-        # this when we move to using package hashing on all specs.
-        if hash.override is not None:
-            return hash.override(self)
         node_dict = self.to_node_dict(hash=hash)
         json_text = sjson.dump(node_dict)
         # This implements "frankenhashes", preserving the last 7 characters of the
@@ -1981,7 +1983,7 @@ def spec_hash(self, hash):
             return out[:-7] + self.build_spec.spec_hash(hash)[-7:]
         return out

-    def _cached_hash(self, hash, length=None, force=False):
+    def _cached_hash(self, hash, length=None):
         """Helper function for storing a cached hash on the spec.

         This will run spec_hash() with the deptype and package_hash
@@ -1991,7 +1993,6 @@ def _cached_hash(self, hash, length=None, force=False):
         Arguments:
             hash (spack.hash_types.SpecHashDescriptor): type of hash to generate.
             length (int): length of hash prefix to return (default is full hash string)
-            force (bool): cache the hash even if spec is not concrete (default False)
         """
         if not hash.attr:
             return self.spec_hash(hash)[:length]
@@ -2001,21 +2002,24 @@ def _cached_hash(self, hash, length=None, force=False):
             return hash_string[:length]
         else:
             hash_string = self.spec_hash(hash)
-            if force or self.concrete:
+            if self.concrete:
                 setattr(self, hash.attr, hash_string)
             return hash_string[:length]

     def package_hash(self):
         """Compute the hash of the contents of the package for this node"""
+        if not self.concrete:
+            raise ValueError("Spec is not concrete: " + str(self))
+
         # Concrete specs with the old DAG hash did not have the package hash, so we do
         # not know what the package looked like at concretization time
-        if self.concrete and not self._package_hash:
+        if not self._package_hash:
             raise ValueError(
                 "Cannot call package_hash() on concrete specs with the old dag_hash()"
             )

-        return self._cached_hash(ht.package_hash)
+        return self._package_hash

     def dag_hash(self, length=None):
         """This is Spack's default hash, used to identify installations.
@@ -2202,23 +2206,25 @@ def to_node_dict(self, hash=ht.dag_hash):
             if hasattr(variant, "_patches_in_order_of_appearance"):
                 d["patches"] = variant._patches_in_order_of_appearance

-        if (
-            self._concrete
-            and hash.package_hash
-            and hasattr(self, "_package_hash")
-            and self._package_hash
-        ):
-            # We use the attribute here instead of `self.package_hash()` because this
-            # should *always* be assignhed at concretization time. We don't want to try
-            # to compute a package hash for concrete spec where a) the package might not
-            # exist, or b) the `dag_hash` didn't include the package hash when the spec
-            # was concretized.
-            package_hash = self._package_hash
-
-            # Full hashes are in bytes
-            if not isinstance(package_hash, str) and isinstance(package_hash, bytes):
-                package_hash = package_hash.decode("utf-8")
-            d["package_hash"] = package_hash
+        if self._concrete and hash.package_hash:
+            # We use the `_package_hash` attribute here instead of `self.package_hash()`
+            # because `_package_hash` is *always* assigned at concretization time. If
+            # the attribute is present, we should include it. If it isn't, we avoid
+            # computing it because a) the package may no longer exist, or b) this is an
+            # older spec and the `dag_hash` didn't include the package hash when the
+            # spec was concretized.
+            if hasattr(self, "_package_hash") and self._package_hash:
+                d["package_hash"] = self._package_hash
+
+            if self._artifact_hashes:
+                for key, source_list in sorted(self._artifact_hashes.items()):
+                    # sources may be dictionaries (for archives/resources)
+                    def order(source):
+                        if isinstance(source, dict):
+                            return syaml.syaml_dict(sorted(source.items()))
+                        return source
+
+                    d[key] = [order(source) for source in source_list]

         # Note: Relies on sorting dict by keys later in algorithm.
         deps = self._dependencies_dict(depflag=hash.depflag)
@@ -2917,12 +2923,15 @@ def _finalize_concretization(self):
             # We only assign package hash to not-yet-concrete specs, for which we know
             # we can compute the hash.
             if not spec.concrete:
-                # we need force=True here because package hash assignment has to happen
-                # before we mark concrete, so that we know what was *already* concrete.
-                spec._cached_hash(ht.package_hash, force=True)
-
-                # keep this check here to ensure package hash is saved
-                assert getattr(spec, ht.package_hash.attr)
+                # package hash assignment has to happen before we mark concrete, so that
+                # we know what was *already* concrete.
+
+                # can't use self.package here b/c not concrete yet
+                pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
+                pkg = pkg_cls(spec)
+                artifact_hashes = pkg.artifact_hashes()
+
+                spec._package_hash = artifact_hashes.pop("package_hash")
+                spec._artifact_hashes = artifact_hashes

         # Mark everything in the spec as concrete
         self._mark_concrete()
@@ -3558,6 +3567,8 @@ def _dup(self, other, deps: Union[bool, dt.DepTypes, dt.DepFlag] = True, clearde
                 self._normal = other._normal
                 for h in ht.hashes:
                     setattr(self, h.attr, getattr(other, h.attr, None))
+                self._package_hash = getattr(other, "_package_hash", None)
+                self._artifact_hashes = getattr(other, "_artifact_hashes", None)
             else:
                 self._dunder_hash = None
                 # Note, we could use other._normal if we are copying all deps, but
@@ -3565,6 +3576,8 @@ def _dup(self, other, deps: Union[bool, dt.DepTypes, dt.DepFlag] = True, clearde
                 self._normal = False
                 for h in ht.hashes:
                     setattr(self, h.attr, None)
+                self._package_hash = None
+                self._artifact_hashes = None

         return changed
@@ -4225,7 +4238,11 @@ def _splice_detach_and_add_dependents(self, replacement, context):
         for ancestor in ancestors_in_context:
             # Only set it if it hasn't been spliced before
             ancestor._build_spec = ancestor._build_spec or ancestor.copy()
-            ancestor.clear_cached_hashes(ignore=(ht.package_hash.attr,))
+
+            # reset all hashes *except* package and artifact hashes (since we are not
+            # rebuilding the spec)
+            ancestor.clear_cached_hashes(content_hashes=False)
+
             for edge in ancestor.edges_to_dependencies(depflag=dt.BUILD):
                 if edge.depflag & ~dt.BUILD:
                     edge.depflag &= ~dt.BUILD
@@ -4419,14 +4436,18 @@ def mask_build_deps(in_spec):
         return spec

-    def clear_cached_hashes(self, ignore=()):
+    def clear_cached_hashes(self, content_hashes=True):
         """
         Clears all cached hashes in a Spec, while preserving other properties.
         """
         for h in ht.hashes:
-            if h.attr not in ignore:
-                if hasattr(self, h.attr):
-                    setattr(self, h.attr, None)
+            if hasattr(self, h.attr):
+                setattr(self, h.attr, None)
+
+        if content_hashes:
+            self._package_hash = None
+            self._artifact_hashes = None
+
         self._dunder_hash = None

     def __hash__(self):
@@ -4702,6 +4723,14 @@ def from_node_dict(cls, node):
         for h in ht.hashes:
             setattr(spec, h.attr, node.get(h.name, None))

+        # old and new-style package hash
+        if "package_hash" in node:
+            spec._package_hash = node["package_hash"]
+
+        # all source artifact hashes
+        if "sources" in node:
+            spec._artifact_hashes = syaml.syaml_dict([("sources", node["sources"])])
+
         spec.name = name
         spec.namespace = node.get("namespace", None)
```

View File

```diff
@@ -779,6 +779,7 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
     #     ^pkg-b
     # pkg-a
     #     ^pkg-b
+
     e = ev.create("test", with_view=False)
     e.add("mpileaks")
     e.add("libelf@0.8.10")  # so env has both root and dep libelf specs
@@ -786,14 +787,14 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
     e.add("pkg-a ~bvv")
     e.concretize()
     e.write()
-    env_specs = e.all_specs()
+    initial_concrete_specs = e.all_specs()

     a_spec = None
     b_spec = None
     mpi_spec = None

     # First find and remember some target concrete specs in the environment
-    for e_spec in env_specs:
+    for e_spec in initial_concrete_specs:
         if e_spec.satisfies(Spec("pkg-a ~bvv")):
             a_spec = e_spec
         elif e_spec.name == "pkg-b":
@@ -815,8 +816,7 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
     with e:
         # Assert using --no-add with a spec not in the env fails
        inst_out = install("--no-add", "boost", fail_on_error=False, output=str)
-
-        assert "You can add specs to the environment with 'spack add " in inst_out
+        assert "You can add specs to the environment with 'spack add boost'" in inst_out

         # Without --add, ensure that two packages "a" get installed
         inst_out = install("pkg-a", output=str)
@@ -828,13 +828,18 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
         install("dyninst")

         find_output = find("-l", output=str)
         assert "dyninst" in find_output
         assert "libdwarf" in find_output
         assert "libelf" in find_output
         assert "callpath" not in find_output

-        post_install_specs = e.all_specs()
-        assert all([s in env_specs for s in post_install_specs])
+        post_install_concrete_specs = e.all_specs()
+
+        for s in post_install_concrete_specs:
+            assert (
+                s in initial_concrete_specs
+            ), f"installed spec {s.format('{name}{@version}{/hash:7}')} not in original env"

     # Make sure we can install a concrete dependency spec from a spec.json
     # file on disk, and the spec is installed but not added as a root
```

View File

```diff
@@ -410,7 +410,7 @@ def test_nosource_pkg_install(install_mockery, mock_fetch, mock_packages, capfd,
     assert "Installing dependency-install" in out[0]

     # Make sure a warning for missing code is issued
-    assert "Missing a source id for nosource" in out[1]
+    assert "Missing a hash for nosource" in out[1]


 @pytest.mark.disable_clean_stage_check
@@ -427,7 +427,7 @@ def test_nosource_bundle_pkg_install(
     assert "Installing dependency-install" in out[0]

     # Make sure a warning for missing code is *not* issued
-    assert "Missing a source id for nosource" not in out[1]
+    assert "Missing a hash for nosource" not in out[1]


 def test_nosource_pkg_install_post_install(install_mockery, mock_fetch, mock_packages):
```

View File

```diff
@@ -15,6 +15,7 @@
 import spack.solver.asp
 import spack.spec
 import spack.store
+import spack.util.package_hash as ph
 import spack.variant
 import spack.version as vn
 from spack.error import SpecError, UnsatisfiableSpecError
@@ -1640,20 +1641,27 @@ def test_spec_installed(default_mock_concretization, database):
     assert not spec.installed


+def test_cannot_call_dag_hash_on_abstract_spec():
+    with pytest.raises(ValueError, match="Spec is not concrete"):
+        Spec("pkg-a").package_hash()
+
+
 @pytest.mark.regression("30678")
 def test_call_dag_hash_on_old_dag_hash_spec(mock_packages, default_mock_concretization):
     # create a concrete spec
     a = default_mock_concretization("pkg-a")
     dag_hashes = {spec.name: spec.dag_hash() for spec in a.traverse()}

+    for spec in a.traverse():
+        assert dag_hashes[spec.name] == spec.dag_hash()
+        assert spec.package_hash() == ph.package_hash(spec)
+
     # make it look like an old DAG hash spec with no package hash on the spec.
     for spec in a.traverse():
         assert spec.concrete
         spec._package_hash = None

     for spec in a.traverse():
-        assert dag_hashes[spec.name] == spec.dag_hash()
-
         with pytest.raises(ValueError, match="Cannot call package_hash()"):
             spec.package_hash()
```

View File

```diff
@@ -96,6 +96,20 @@ def test_invalid_json_spec(invalid_json, error_message):
         # Virtuals on edges
         "callpath",
         "mpileaks",
+        # Various types of git versions
+        # Ensure that we try to serialize all the things that might be in the node dict,
+        # e.g., submodule callbacks can fail serialization if they're not fully resolved.
+        "git-url-top-level@develop",
+        "git-url-top-level@submodules",
+        "git-url-top-level@submodules_callback",
+        "git-url-top-level@3.4",
+        "git-url-top-level@3.3",
+        "git-url-top-level@3.2",
+        "git-url-top-level@3.1",
+        "git-url-top-level@3.0",
+        # URL versions with checksums
+        "git-url-top-level@2.3",
+        "git-url-top-level@2.1",
     ],
 )
 def test_roundtrip_concrete_specs(abstract_spec, default_mock_concretization):
```

View File

```diff
@@ -19,29 +19,27 @@
 datadir = os.path.join(spack.paths.test_path, "data", "unparse")


-def compare_sans_name(eq, spec1, spec2):
+def canonical_source_equal_sans_name(spec1, spec2):
     content1 = ph.canonical_source(spec1)
     content1 = content1.replace(spack.repo.PATH.get_pkg_class(spec1.name).__name__, "TestPackage")
     content2 = ph.canonical_source(spec2)
     content2 = content2.replace(spack.repo.PATH.get_pkg_class(spec2.name).__name__, "TestPackage")
-    if eq:
-        assert content1 == content2
-    else:
-        assert content1 != content2
+
+    return content1 == content2


-def compare_hash_sans_name(eq, spec1, spec2):
+def package_hash_equal_sans_name(spec1, spec2):
     content1 = ph.canonical_source(spec1)
     pkg_cls1 = spack.repo.PATH.get_pkg_class(spec1.name)
     content1 = content1.replace(pkg_cls1.__name__, "TestPackage")
-    hash1 = pkg_cls1(spec1).content_hash(content=content1)
+    hash1 = ph.package_hash(spec1, source=content1)

     content2 = ph.canonical_source(spec2)
     pkg_cls2 = spack.repo.PATH.get_pkg_class(spec2.name)
     content2 = content2.replace(pkg_cls2.__name__, "TestPackage")
-    hash2 = pkg_cls2(spec2).content_hash(content=content2)
-
-    assert (hash1 == hash2) == eq
+    hash2 = ph.package_hash(spec2, source=content2)
+
+    return hash1 == hash2


 def test_hash(mock_packages, config):
@@ -57,11 +55,11 @@ def test_different_variants(mock_packages, config):
 def test_all_same_but_name(mock_packages, config):
     spec1 = Spec("hash-test1@=1.2")
     spec2 = Spec("hash-test2@=1.2")
-    compare_sans_name(True, spec1, spec2)
+    assert canonical_source_equal_sans_name(spec1, spec2)

     spec1 = Spec("hash-test1@=1.2 +varianty")
     spec2 = Spec("hash-test2@=1.2 +varianty")
-    compare_sans_name(True, spec1, spec2)
+    assert canonical_source_equal_sans_name(spec1, spec2)


 def test_all_same_but_archive_hash(mock_packages, config):
@@ -70,60 +68,63 @@ def test_all_same_but_archive_hash(mock_packages, config):
     """
     spec1 = Spec("hash-test1@=1.3")
     spec2 = Spec("hash-test2@=1.3")
-    compare_sans_name(True, spec1, spec2)
+    assert canonical_source_equal_sans_name(spec1, spec2)


 def test_all_same_but_patch_contents(mock_packages, config):
     spec1 = Spec("hash-test1@=1.1")
     spec2 = Spec("hash-test2@=1.1")
-    compare_sans_name(True, spec1, spec2)
+    assert canonical_source_equal_sans_name(spec1, spec2)


 def test_all_same_but_patches_to_apply(mock_packages, config):
     spec1 = Spec("hash-test1@=1.4")
     spec2 = Spec("hash-test2@=1.4")
-    compare_sans_name(True, spec1, spec2)
+    assert canonical_source_equal_sans_name(spec1, spec2)


 def test_all_same_but_install(mock_packages, config):
     spec1 = Spec("hash-test1@=1.5")
     spec2 = Spec("hash-test2@=1.5")
-    compare_sans_name(False, spec1, spec2)
+    assert not canonical_source_equal_sans_name(spec1, spec2)


-def test_content_hash_all_same_but_patch_contents(mock_packages, config):
+def test_package_hash_all_same_but_patch_contents_different(mock_packages, config):
     spec1 = Spec("hash-test1@1.1").concretized()
     spec2 = Spec("hash-test2@1.1").concretized()
-    compare_hash_sans_name(False, spec1, spec2)
+
+    assert package_hash_equal_sans_name(spec1, spec2)
+    assert spec1.dag_hash() != spec2.dag_hash()
+    assert spec1.to_node_dict()["patches"] != spec2.to_node_dict()["patches"]


-def test_content_hash_not_concretized(mock_packages, config):
-    """Check that Package.content_hash() works on abstract specs."""
-    # these are different due to the package hash
+def test_package_hash_not_concretized(mock_packages, config):
+    """Check that ``package_hash()`` works on abstract specs."""
+    # these are different due to patches but not package hash
     spec1 = Spec("hash-test1@=1.1")
     spec2 = Spec("hash-test2@=1.3")
-    compare_hash_sans_name(False, spec1, spec2)
+    assert package_hash_equal_sans_name(spec1, spec2)

     # at v1.1 these are actually the same package when @when's are removed
     # and the name isn't considered
     spec1 = Spec("hash-test1@=1.1")
     spec2 = Spec("hash-test2@=1.1")
-    compare_hash_sans_name(True, spec1, spec2)
+    assert package_hash_equal_sans_name(spec1, spec2)

-    # these end up being different b/c we can't eliminate much of the package.py
-    # without a version.
+    # these end up being different b/c without a version, we can't eliminate much of the
+    # package.py when canonicalizing source.
     spec1 = Spec("hash-test1")
     spec2 = Spec("hash-test2")
-    compare_hash_sans_name(False, spec1, spec2)
+    assert not package_hash_equal_sans_name(spec1, spec2)


-def test_content_hash_different_variants(mock_packages, config):
+def test_package_hash_different_variants(mock_packages, config):
     spec1 = Spec("hash-test1@1.2 +variantx").concretized()
     spec2 = Spec("hash-test2@1.2 ~variantx").concretized()
-    compare_hash_sans_name(True, spec1, spec2)
+    assert package_hash_equal_sans_name(spec1, spec2)


-def test_content_hash_cannot_get_details_from_ast(mock_packages, config):
+def test_package_hash_cannot_get_details_from_ast(mock_packages, config):
     """Packages hash-test1 and hash-test3 would be considered the same
     except that hash-test3 conditionally executes a phase based on
     a "when" directive that Spack cannot evaluate by examining the
@@ -135,18 +136,36 @@ def test_content_hash_cannot_get_details_from_ast(mock_packages, config):
     """
     spec3 = Spec("hash-test1@1.7").concretized()
     spec4 = Spec("hash-test3@1.7").concretized()
-    compare_hash_sans_name(False, spec3, spec4)
+    assert not package_hash_equal_sans_name(spec3, spec4)


-def test_content_hash_all_same_but_archive_hash(mock_packages, config):
+def test_package_hash_all_same_but_archive_hash(mock_packages, config):
     spec1 = Spec("hash-test1@1.3").concretized()
     spec2 = Spec("hash-test2@1.3").concretized()
-    compare_hash_sans_name(False, spec1, spec2)
+
+    assert package_hash_equal_sans_name(spec1, spec2)
+
+    # the sources for these two packages will not be the same b/c their archive hashes differ
+    assert spec1.to_node_dict()["sources"] != spec2.to_node_dict()["sources"]


-def test_content_hash_parse_dynamic_function_call(mock_packages, config):
+def test_package_hash_all_same_but_resources(mock_packages, config):
+    spec1 = Spec("hash-test1@1.7").concretized()
+    spec2 = Spec("hash-test1@1.8").concretized()
+
+    # these should be the same
+    assert canonical_source_equal_sans_name(spec1, spec2)
+    assert package_hash_equal_sans_name(spec1, spec2)
+
+    # but 1.7 has a resource that affects the hash
+    assert spec1.to_node_dict()["sources"] != spec2.to_node_dict()["sources"]
+    assert spec1.dag_hash() != spec2.dag_hash()
+
+
+def test_package_hash_parse_dynamic_function_call(mock_packages, config):
     spec = Spec("hash-test4").concretized()
-    spec.package.content_hash()
+    ph.package_hash(spec)


 many_strings = '''\
```

View File

```diff
@@ -6,6 +6,11 @@
 from spack.package import *


+def use_submodules(pkg):
+    """test example of a submodule callback"""
+    return ["a", "b"]
+
+
 class GitUrlTopLevel(Package):
     """Mock package that top-level git and url attributes.
@@ -22,6 +27,7 @@ class GitUrlTopLevel(Package):
     # These resolve to git fetchers
     version("develop", branch="develop")
     version("submodules", submodules=True)
+    version("submodules_callback", submodules=use_submodules)
     version("3.4", commit="abc34")
     version("3.3", branch="releases/v3.3", commit="abc33")
     version("3.2", branch="releases/v3.2")
```

View File

```diff
@@ -14,13 +14,14 @@ class HashTest1(Package):
     homepage = "http://www.hashtest1.org"
     url = "http://www.hashtest1.org/downloads/hashtest1-1.1.tar.bz2"

-    version("1.1", md5="a" * 32)
-    version("1.2", md5="b" * 32)
-    version("1.3", md5="c" * 32)
-    version("1.4", md5="d" * 32)
-    version("1.5", md5="d" * 32)
-    version("1.6", md5="e" * 32)
-    version("1.7", md5="f" * 32)
+    version("1.1", sha256="a" * 64)
+    version("1.2", sha256="b" * 64)
+    version("1.3", sha256="c" * 64)
+    version("1.4", sha256="d" * 64)
+    version("1.5", sha256="d" * 64)
+    version("1.6", sha256="e" * 64)
+    version("1.7", sha256="f" * 64)
+    version("1.8", sha256="1" * 64)

     patch("patch1.patch", when="@1.1")
     patch("patch2.patch", when="@1.4")
@@ -28,6 +29,12 @@ class HashTest1(Package):
     variant("variantx", default=False, description="Test variant X")
     variant("varianty", default=False, description="Test variant Y")

+    resource(
+        url="http://www.example.com/example-1.0-resource.tar.gz",
+        sha256="abcd1234" * 8,
+        when="@1.8",
+    )
+
     def setup_dependent_build_environment(self, env, dependent_spec):
         pass
```

View File

```diff
@@ -14,10 +14,13 @@ class HashTest2(Package):
     homepage = "http://www.hashtest2.org"
     url = "http://www.hashtest1.org/downloads/hashtest2-1.1.tar.bz2"

-    version("1.1", md5="a" * 32)
-    version("1.2", md5="b" * 32)
-    version("1.3", md5="c" * 31 + "x")
-    version("1.4", md5="d" * 32)
+    version("1.1", sha256="a" * 64)
+    version("1.2", sha256="b" * 64)
+
+    # Source hash differs from hash-test1@1.3
+    version("1.3", sha256=("c" * 63) + "f")
+    version("1.4", sha256="d" * 64)

     patch("patch1.patch", when="@1.1")
```

View File

```diff
@@ -14,11 +14,11 @@ class HashTest3(Package):
     homepage = "http://www.hashtest3.org"
     url = "http://www.hashtest1.org/downloads/hashtest3-1.1.tar.bz2"

-    version("1.2", md5="b" * 32)
-    version("1.3", md5="c" * 32)
-    version("1.5", md5="d" * 32)
-    version("1.6", md5="e" * 32)
-    version("1.7", md5="f" * 32)
+    version("1.2", sha256="b" * 64)
+    version("1.3", sha256="c" * 64)
+    version("1.5", sha256="d" * 64)
+    version("1.6", sha256="e" * 64)
+    version("1.7", sha256="f" * 64)

     variant("variantx", default=False, description="Test variant X")
     variant("varianty", default=False, description="Test variant Y")
```