Compare commits

...

8 Commits

Author SHA1 Message Date
Harmen Stoppels
a3cbafd87c PackageBase: revamp libs and headers API
Add `find_libs`, `find_headers`, `query_libs`, and `query_headers` to
`PackageBase` and implement `SpecBuildInterface` in terms of those.

The old-style `libs` and `headers` properties are deprecated but take
priority over the new-style `find_libs` and `find_headers` methods.
2025-03-27 10:22:11 +01:00
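For illustration, a minimal sketch of how the new hooks might be used from a `package.py`, assuming hypothetical `libfoo` and `bar` packages (this is not code from the commit itself):

```python
from spack.package import *


class Libfoo(Package):
    """Hypothetical provider overriding the new-style hook."""

    def find_libs(self, *, features=(), virtual=None):
        # Narrow the default search down to the single core library.
        return find_libraries("libfoo", root=self.home, recursive=True)


class Bar(Package):
    """Hypothetical dependent querying libfoo through the new API."""

    depends_on("libfoo")

    def install(self, spec, prefix):
        # query_libs/query_headers resolve the dependency by name (or by
        # virtual) and dispatch to its find_libs/find_headers.
        libs = self.query_libs("libfoo")
        headers = self.query_headers("libfoo")
        print(libs.ld_flags, headers.include_flags)
```

Per the commit message, a package that still defines the old-style `libs`/`headers` properties keeps using those, since they take priority over the new methods.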
Todd Gamblin
199133fca4 bugfix: concretization shouldn't require concrete packages to be known (#49706)
Concretization setup was checking whether any input spec has a dependency
that's *not* in the set of possible dependencies for all roots in the solve.
There are two reasons to check this:

1. The user could be asking for a dependency that none of the roots has, or
2. The user could be asking for a dependency that doesn't exist.

For abstract roots, (2) implies (1), and the check makes sense.  For concrete
roots, we don't care, because the spec has already been built. If a `package.py`
no longer depends on something it did before, it doesn't matter -- it's already
built. If the dependency no longer exists, we also do not care -- we already
built it and there's an installation for it somewhere. 

When you concretize an environment with a lockfile, *many* of the input specs
are concrete, and we don't need to build them. If a package changes its
dependencies, or if a `package.py` is removed for a concrete input spec, that
shouldn't cause an already-built environment to fail concretization.

A user reported that this was happening with an error like:

```console
spack concretize
==> Error: Package chapel does not depend on py-protobuf@5.28.2/a4rf4glr2tntfwsz6myzwmlk5iu25t74
```

Or, with traceback:
```console
  File "/apps/other/spack-devel/lib/spack/spack/solver/asp.py", line 3014, in setup
    raise spack.spec.InvalidDependencyError(spec.name, missing_deps)
spack.spec.InvalidDependencyError: Package chapel does not depend on py-protobuf@5.28.2/a4rf4glr2tntfwsz6myzwmlk5iu25t74
```

Fix this by skipping the check for concrete input specs. We already ignore conflicts,
etc. for concrete/external specs, and we do not need metadata in the solve for
concrete dependencies because they're imposed by hash constraints.

- [x] Ignore the package existence check for concrete input specs.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-26 23:58:12 +00:00
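A rough sketch of the guard this adds in the solver setup (names are simplified; the actual hunk appears in the `asp.py` diff further down):

```python
import spack.spec


def check_input_specs(input_specs, possible_dependencies):
    """Sketch of the dependency-existence check with the concrete-spec skip."""
    for spec in input_specs:
        if spec.concrete:
            # Concrete roots are already built and imposed by hash
            # constraints, so their dependencies need no package metadata.
            continue
        missing_deps = [
            str(d) for d in spec.traverse() if d.name not in possible_dependencies
        ]
        if missing_deps:
            raise spack.spec.InvalidDependencyError(spec.name, missing_deps)
```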
Alec Scott
ea3a3b51a0 ci: Skip removed packages during automatic checksum verification (#49681)
* Skip packages removed for automatic checksum verification

* Unify finding modified or added packages with spack.repo logic

* Remove unused imports

* Fix unit-tests using shared modified function

* Update last remaining unit test to new format
2025-03-26 16:47:11 -07:00
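Roughly, the CI command now finds candidate packages through `spack.repo` instead of grepping file paths; a sketch based on the `ci.py` diff below (the refs are placeholders):

```python
import spack.repo
import spack.spec

# "AC" selects packages Added or Changed between the two refs, so packages
# that were removed never show up and are skipped automatically.
pkgs = spack.repo.get_all_package_diffs("AC", "develop", "HEAD")

for pkg_name in pkgs:
    spec = spack.spec.Spec(pkg_name)
    pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
    path = spack.repo.PATH.package_path(pkg_name)
    if pkg.manual_download:
        # Trust the maintainers of manual-download packages.
        continue
    # ... verify the checksums newly added to `path` ...
```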
Robert Maaskant
23bd3e6104 glab: add v1.54.0 and v1.55.0 (#49689)
* glab: add v1.54.0 and v1.55.0

* glab: uniform go deps

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-03-26 14:48:18 -06:00
Robert Maaskant
c72477e67a gtk-doc: use download.gnome.org for downloads (#49660) 2025-03-26 12:03:41 -06:00
Todd Gamblin
2d2a4d1908 bugfix: display old-style compilers without @= in spack find output (#49693)
* bugfix: display old-style compilers without `@=` in `spack find` output

Fix `display_str` attribute of `DeprecatedCompiler` so that `spack find` displays
compilers without `@=` for the version (as expected).

- [x] Use `spec.format("{@version}")` to do this

before:
```
> spack find
-- darwin-sequoia-m1 / apple-clang@=16.0.0 -----------------------
abseil-cpp@20240722.0               py-graphviz@0.13.2
apple-libuuid@1353.100.2            py-hatch-fancy-pypi-readme@23.1.0
```

after:
```
> spack find
-- darwin-sequoia-m1 / apple-clang@16.0.0 -----------------------
abseil-cpp@20240722.0               py-graphviz@0.13.2
apple-libuuid@1353.100.2            py-hatch-fancy-pypi-readme@23.1.0
```

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>

* [@spackbot] updating style on behalf of tgamblin

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-26 10:34:05 -07:00
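As a small illustration of the formatting change (the spec is made up and the exact output can vary by Spack version):

```python
import spack.spec

compiler = spack.spec.Spec("apple-clang@=16.0.0")

# The fix routes the display string through Spec.format, which renders the
# version without the `@=` prefix.
print(compiler.format("{name}{@version}"))  # expected: apple-clang@16.0.0
```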
Afzal Patel
2cd773aea4 py-jaxlib: add spack-built ROCm support (#49611)
* py-jaxlib: add spack-built ROCm support

* fix style

* py-jaxlib 0.4.38 rocm support

* py-jaxlib 0.4.38 rocm support

* add comgr dependency

* changes for ROCm external and enable till 0.4.38

* enable version of py-jax

* add jax+rocm to ci

* add conflict for cuda and remove py-jaxlib from aarch64 pipeline

* Update var/spack/repos/builtin/packages/py-jaxlib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* add conflict for aarch64

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2025-03-26 09:23:52 -06:00
Mikael Simberg
145b0667cc asio: add 1.34.0 (#49529) 2025-03-26 14:01:01 +01:00
16 changed files with 325 additions and 304 deletions

View File

@@ -4,7 +4,6 @@
import json
import os
import re
import shutil
import sys
from typing import Dict
@@ -26,12 +25,10 @@
import spack.hash_types as ht
import spack.mirrors.mirror
import spack.package_base
import spack.paths
import spack.repo
import spack.spec
import spack.stage
import spack.util.executable
import spack.util.git
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
@@ -45,7 +42,6 @@
SPACK_COMMAND = "spack"
INSTALL_FAIL_CODE = 1
FAILED_CREATE_BUILDCACHE_CODE = 100
BUILTIN = re.compile(r"var\/spack\/repos\/builtin\/packages\/([^\/]+)\/package\.py")
def deindent(desc):
@@ -783,18 +779,15 @@ def ci_verify_versions(args):
then parses the git diff between the two to determine which packages
have been modified and verifies the new checksums inside of them.
"""
with fs.working_dir(spack.paths.prefix):
# We use HEAD^1 explicitly on the merge commit created by
# GitHub Actions. However HEAD~1 is a safer default for the helper function.
files = spack.util.git.get_modified_files(from_ref=args.from_ref, to_ref=args.to_ref)
# Get a list of package names from the modified files.
pkgs = [(m.group(1), p) for p in files for m in [BUILTIN.search(p)] if m]
# Get a list of all packages that have been changed or added
# between from_ref and to_ref
pkgs = spack.repo.get_all_package_diffs("AC", args.from_ref, args.to_ref)
failed_version = False
for pkg_name, path in pkgs:
for pkg_name in pkgs:
spec = spack.spec.Spec(pkg_name)
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
path = spack.repo.PATH.package_path(pkg_name)
# Skip checking manual download packages and trust the maintainers
if pkg.manual_download:
@@ -818,7 +811,7 @@ def ci_verify_versions(args):
# TODO: enforce that every version has a commit or a sha256 defined if not
# an infinite version (there are a lot of packages where this doesn't work yet.)
with fs.working_dir(spack.paths.prefix):
with fs.working_dir(os.path.dirname(path)):
added_checksums = spack_ci.get_added_versions(
checksums_version_dict, path, from_ref=args.from_ref, to_ref=args.to_ref
)

View File

@@ -22,9 +22,22 @@
import textwrap
import time
import traceback
from typing import Any, Callable, Dict, Iterable, List, Optional, Set, Tuple, Type, TypeVar, Union
from typing import (
Any,
Callable,
Dict,
Iterable,
List,
Optional,
Sequence,
Set,
Tuple,
Type,
TypeVar,
Union,
)
from typing_extensions import Literal
from typing_extensions import Literal, final
import llnl.util.filesystem as fsys
import llnl.util.tty as tty
@@ -1381,6 +1394,75 @@ def command(self) -> spack.util.executable.Executable:
return spack.util.executable.Executable(path)
raise RuntimeError(f"Unable to locate {self.spec.name} command in {self.home.bin}")
def find_headers(
self, *, features: Sequence[str] = (), virtual: Optional[str] = None
) -> fsys.HeaderList:
"""Return the header list for this package based on the query. This method can be
overridden by individual packages to return package specific headers.
Args:
features: query argument to filter or extend the header list.
virtual: when set, return headers relevant for the virtual provided by this package.
Raises:
spack.error.NoHeadersError: if there was an error locating the headers.
"""
spec = self.spec
home = self.home
headers = fsys.find_headers("*", root=home.include, recursive=True)
if headers:
return headers
raise spack.error.NoHeadersError(f"Unable to locate {spec.name} headers in {home}")
def find_libs(
self, *, features: Sequence[str] = (), virtual: Optional[str] = None
) -> fsys.LibraryList:
"""Return the library list for this package based on the query. This method can be
overridden by individual packages to return package specific libraries.
Args:
features: query argument to filter or extend the library list.
virtual: when set, return libraries relevant for the virtual provided by this package.
Raises:
spack.error.NoLibrariesError: if there was an error locating the libraries.
"""
spec = self.spec
home = self.home
name = self.spec.name.replace("-", "?")
# Avoid double 'lib' for packages whose names already start with lib
if not name.startswith("lib") and not spec.satisfies("platform=windows"):
name = "lib" + name
# If '+shared' search only for shared library; if '~shared' search only for
# static library; otherwise, first search for shared and then for static.
search_shared = (
[True] if ("+shared" in spec) else ([False] if ("~shared" in spec) else [True, False])
)
for shared in search_shared:
# Since we are searching for link libraries, on Windows search only for
# ".Lib" extensions by default as those represent import libraries for implicit links.
libs = fsys.find_libraries(name, home, shared=shared, recursive=True, runtime=False)
if libs:
return libs
raise spack.error.NoLibrariesError(
f"Unable to recursively locate {spec.name} libraries in {home}"
)
@final
def query_headers(self, name: str, *, features: Sequence[str] = ()) -> fsys.HeaderList:
"""Returns the header list for a dependency ``name``."""
spec, is_virtual = self.spec._get_dependency_by_name(name)
return spec.package.find_headers(features=features, virtual=name if is_virtual else None)
@final
def query_libs(self, name: str, *, features: Sequence[str] = ()) -> fsys.LibraryList:
"""Returns the library list for a dependency ``name``."""
spec, is_virtual = self.spec._get_dependency_by_name(name)
return spec.package.find_libs(features=features, virtual=name if is_virtual else None)
def url_version(self, version):
"""
Given a version, this returns a string that should be substituted

View File

@@ -3005,6 +3005,10 @@ def setup(
# Fail if we already know an unreachable node is requested
for spec in specs:
# concrete roots don't need their dependencies verified
if spec.concrete:
continue
missing_deps = [
str(d)
for d in spec.traverse()

View File

@@ -663,11 +663,9 @@ def versions(self):
def display_str(self):
"""Equivalent to {compiler.name}{@compiler.version} for Specs, without extra
@= for readability."""
if self.spec.concrete:
return f"{self.name}@{self.version}"
elif self.versions != vn.any_version:
return f"{self.name}@{self.versions}"
return self.name
if self.versions != vn.any_version:
return self.spec.format("{name}{@version}")
return self.spec.format("{name}")
def __lt__(self, other):
if not isinstance(other, CompilerSpec):
@@ -1072,123 +1070,26 @@ def clear(self):
self.edges.clear()
def _headers_default_handler(spec: "Spec"):
"""Default handler when looking for the 'headers' attribute.
Tries to search for ``*.h`` files recursively starting from
``spec.package.home.include``.
Parameters:
spec: spec that is being queried
Returns:
HeaderList: The headers in ``prefix.include``
Raises:
NoHeadersError: If no headers are found
"""
home = getattr(spec.package, "home")
headers = fs.find_headers("*", root=home.include, recursive=True)
if headers:
return headers
raise spack.error.NoHeadersError(f"Unable to locate {spec.name} headers in {home}")
def _libs_default_handler(spec: "Spec"):
"""Default handler when looking for the 'libs' attribute.
Tries to search for ``lib{spec.name}`` recursively starting from
``spec.package.home``. If ``spec.name`` starts with ``lib``, searches for
``{spec.name}`` instead.
Parameters:
spec: spec that is being queried
Returns:
LibraryList: The libraries found
Raises:
NoLibrariesError: If no libraries are found
"""
# Variable 'name' is passed to function 'find_libraries', which supports
# glob characters. For example, we have a package with a name 'abc-abc'.
# Now, we don't know if the original name of the package is 'abc_abc'
# (and it generates a library 'libabc_abc.so') or 'abc-abc' (and it
# generates a library 'libabc-abc.so'). So, we tell the function
# 'find_libraries' to give us anything that matches 'libabc?abc' and it
# gives us either 'libabc-abc.so' or 'libabc_abc.so' (or an error)
# depending on which one exists (there is a possibility, of course, to
# get something like 'libabcXabc.so, but for now we consider this
# unlikely).
name = spec.name.replace("-", "?")
home = getattr(spec.package, "home")
# Avoid double 'lib' for packages whose names already start with lib
if not name.startswith("lib") and not spec.satisfies("platform=windows"):
name = "lib" + name
# If '+shared' search only for shared library; if '~shared' search only for
# static library; otherwise, first search for shared and then for static.
search_shared = (
[True] if ("+shared" in spec) else ([False] if ("~shared" in spec) else [True, False])
)
for shared in search_shared:
# Since we are searching for link libraries, on Windows search only for
# ".Lib" extensions by default as those represent import libraries for implicit links.
libs = fs.find_libraries(name, home, shared=shared, recursive=True, runtime=False)
if libs:
return libs
raise spack.error.NoLibrariesError(
f"Unable to recursively locate {spec.name} libraries in {home}"
)
class ForwardQueryToPackage:
"""Descriptor used to forward queries from Spec to Package"""
def __init__(
self,
attribute_name: str,
default_handler: Optional[Callable[["Spec"], Any]] = None,
_indirect: bool = False,
) -> None:
def __init__(self, attribute_name: str, _indirect: bool = False) -> None:
"""Create a new descriptor.
Parameters:
attribute_name: name of the attribute to be searched for in the Package instance
default_handler: default function to be called if the attribute was not found in the
Package instance
_indirect: temporarily added to redirect a query to another package.
"""
self.attribute_name = attribute_name
self.default = default_handler
self.indirect = _indirect
def __get__(self, instance: "SpecBuildInterface", cls):
"""Retrieves the property from Package using a well defined chain
of responsibility.
"""Retrieves the property from Package using a well defined chain of responsibility.
The order of call is:
The call order is:
1. if the query was through the name of a virtual package try to
search for the attribute `{virtual_name}_{attribute_name}`
in Package
2. try to search for attribute `{attribute_name}` in Package
3. try to call the default handler
The first call that produces a value will stop the chain.
If no call can handle the request then AttributeError is raised with a
message indicating that no relevant attribute exists.
If a call returns None, an AttributeError is raised with a message
indicating a query failure, e.g. that library files were not found in a
'libs' query.
1. `pkg.{virtual_name}_{attribute_name}` if the query is for a virtual package
2. `pkg.{attribute_name}` otherwise
"""
# TODO: this indirection exist solely for `spec["python"].command` to actually return
# spec["python-venv"].command. It should be removed when `python` is a virtual.
@@ -1204,61 +1105,36 @@ def __get__(self, instance: "SpecBuildInterface", cls):
_ = instance.wrapped_obj[instance.wrapped_obj.name] # NOQA: ignore=F841
query = instance.last_query
callbacks_chain = []
# First in the chain : specialized attribute for virtual packages
# First try the deprecated attributes (e.g. `<virtual>_libs` and `libs`)
if query.isvirtual:
specialized_name = "{0}_{1}".format(query.name, self.attribute_name)
callbacks_chain.append(lambda: getattr(pkg, specialized_name))
# Try to get the generic method from Package
callbacks_chain.append(lambda: getattr(pkg, self.attribute_name))
# Final resort : default callback
if self.default is not None:
_default = self.default # make mypy happy
callbacks_chain.append(lambda: _default(instance.wrapped_obj))
deprecated_attrs = [f"{query.name}_{self.attribute_name}", self.attribute_name]
else:
deprecated_attrs = [self.attribute_name]
# Trigger the callbacks in order, the first one producing a
# value wins
value = None
message = None
for f in callbacks_chain:
try:
value = f()
# A callback can return None to trigger an error indicating
# that the query failed.
if value is None:
msg = "Query of package '{name}' for '{attrib}' failed\n"
msg += "\tprefix : {spec.prefix}\n"
msg += "\tspec : {spec}\n"
msg += "\tqueried as : {query.name}\n"
msg += "\textra parameters : {query.extra_parameters}"
message = msg.format(
name=pkg.name,
attrib=self.attribute_name,
spec=instance,
query=instance.last_query,
)
else:
return value
break
except AttributeError:
pass
# value is 'None'
if message is not None:
# Here we can use another type of exception. If we do that, the
# unit test 'test_getitem_exceptional_paths' in the file
# lib/spack/spack/test/spec_dag.py will need to be updated to match
# the type.
raise AttributeError(message)
# 'None' value at this point means that there are no appropriate
# properties defined and no default handler, or that all callbacks
# raised AttributeError. In this case, we raise AttributeError with an
# appropriate message.
fmt = "'{name}' package has no relevant attribute '{query}'\n"
fmt += "\tspec : '{spec}'\n"
fmt += "\tqueried as : '{spec.last_query.name}'\n"
fmt += "\textra parameters : '{spec.last_query.extra_parameters}'\n"
message = fmt.format(name=pkg.name, query=self.attribute_name, spec=instance)
raise AttributeError(message)
for attr in deprecated_attrs:
if not hasattr(pkg, attr):
continue
value = getattr(pkg, attr)
# Deprecated properties can return None to indicate the query failed.
if value is None:
raise AttributeError(
f"Query of package '{pkg.name}' for '{self.attribute_name}' failed\n"
f"\tprefix : {instance.prefix}\n" # type: ignore[attr-defined]
f"\tspec : {instance}\n"
f"\tqueried as : {query.name}\n"
f"\textra parameters : {query.extra_parameters}"
)
return value
# Then try the new functions (e.g. `find_libs`).
features = query.extra_parameters
virtual = query.name if query.isvirtual else None
if self.attribute_name == "libs":
return pkg.find_libs(features=features, virtual=virtual)
elif self.attribute_name == "headers":
return pkg.find_headers(features=features, virtual=virtual)
raise AttributeError(f"Package {pkg.name} has no attribute {self.attribute_name}")
def __set__(self, instance, value):
cls_name = type(instance).__name__
@@ -1272,10 +1148,10 @@ def __set__(self, instance, value):
class SpecBuildInterface(lang.ObjectWrapper):
# home is available in the base Package so no default is needed
home = ForwardQueryToPackage("home", default_handler=None)
headers = ForwardQueryToPackage("headers", default_handler=_headers_default_handler)
libs = ForwardQueryToPackage("libs", default_handler=_libs_default_handler)
command = ForwardQueryToPackage("command", default_handler=None, _indirect=True)
home = ForwardQueryToPackage("home")
headers = ForwardQueryToPackage("headers")
libs = ForwardQueryToPackage("libs")
command = ForwardQueryToPackage("command", _indirect=True)
def __init__(
self,
@@ -3644,6 +3520,21 @@ def version(self):
raise spack.error.SpecError("Spec version is not concrete: " + str(self))
return self.versions[0]
def _get_dependency_by_name(self, name: str) -> Tuple["Spec", bool]:
"""Get a dependency by package name or virtual. Returns a tuple with the matching spec
and a boolean indicating if the spec is a virtual dependency. Raises a KeyError if the
dependency is not found."""
# Consider all direct dependencies and transitive runtime dependencies
order = itertools.chain(
self.edges_to_dependencies(depflag=dt.BUILD | dt.TEST),
self.traverse_edges(deptype=dt.LINK | dt.RUN, order="breadth", cover="edges"),
)
edge = next((e for e in order if e.spec.name == name or name in e.virtuals), None)
if edge is None:
raise KeyError(f"No spec with name {name} in {self}")
return edge.spec, name in edge.virtuals
def __getitem__(self, name: str):
"""Get a dependency from the spec by its name. This call implicitly
sets a query state in the package being retrieved. The behavior of
@@ -3664,23 +3555,14 @@ def __getitem__(self, name: str):
csv = query_parameters.pop().strip()
query_parameters = re.split(r"\s*,\s*", csv)
# Consider all direct dependencies and transitive runtime dependencies
order = itertools.chain(
self.edges_to_dependencies(depflag=dt.BUILD | dt.TEST),
self.traverse_edges(deptype=dt.LINK | dt.RUN, order="breadth", cover="edges"),
)
try:
edge = next((e for e in order if e.spec.name == name or name in e.virtuals))
except StopIteration as e:
raise KeyError(f"No spec with name {name} in {self}") from e
spec, is_virtual = self._get_dependency_by_name(name)
if self._concrete:
return SpecBuildInterface(
edge.spec, name, query_parameters, _parent=self, is_virtual=name in edge.virtuals
spec, name, query_parameters, _parent=self, is_virtual=is_virtual
)
return edge.spec
return spec
def __contains__(self, spec):
"""True if this spec or some dependency satisfies the spec.

View File

@@ -32,7 +32,7 @@ def repro_dir(tmp_path):
def test_get_added_versions_new_checksum(mock_git_package_changes):
repo_path, filename, commits = mock_git_package_changes
repo, filename, commits = mock_git_package_changes
checksum_versions = {
"3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04": Version("2.1.5"),
@@ -41,7 +41,7 @@ def test_get_added_versions_new_checksum(mock_git_package_changes):
"86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8": Version("2.0.0"),
}
with fs.working_dir(str(repo_path)):
with fs.working_dir(repo.packages_path):
added_versions = ci.get_added_versions(
checksum_versions, filename, from_ref=commits[-1], to_ref=commits[-2]
)
@@ -50,7 +50,7 @@ def test_get_added_versions_new_checksum(mock_git_package_changes):
def test_get_added_versions_new_commit(mock_git_package_changes):
repo_path, filename, commits = mock_git_package_changes
repo, filename, commits = mock_git_package_changes
checksum_versions = {
"74253725f884e2424a0dd8ae3f69896d5377f325": Version("2.1.6"),
@@ -60,9 +60,9 @@ def test_get_added_versions_new_commit(mock_git_package_changes):
"86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8": Version("2.0.0"),
}
with fs.working_dir(str(repo_path)):
with fs.working_dir(repo.packages_path):
added_versions = ci.get_added_versions(
checksum_versions, filename, from_ref=commits[2], to_ref=commits[1]
checksum_versions, filename, from_ref=commits[-2], to_ref=commits[-3]
)
assert len(added_versions) == 1
assert added_versions[0] == Version("2.1.6")

View File

@@ -1978,6 +1978,13 @@ def test_ci_validate_git_versions_invalid(
assert f"Invalid commit for diff-test@{version}" in err
def mock_packages_path(path):
def packages_path():
return path
return packages_path
@pytest.fixture
def verify_standard_versions_valid(monkeypatch):
def validate_standard_versions(pkg, versions):
@@ -2024,9 +2031,12 @@ def test_ci_verify_versions_valid(
mock_git_package_changes,
verify_standard_versions_valid,
verify_git_versions_valid,
tmpdir,
):
repo_path, _, commits = mock_git_package_changes
monkeypatch.setattr(spack.paths, "prefix", repo_path)
repo, _, commits = mock_git_package_changes
spack.repo.PATH.put_first(repo)
monkeypatch.setattr(spack.repo, "packages_path", mock_packages_path(repo.packages_path))
out = ci_cmd("verify-versions", commits[-1], commits[-3])
assert "Validated diff-test@2.1.5" in out
@@ -2040,9 +2050,10 @@ def test_ci_verify_versions_standard_invalid(
verify_standard_versions_invalid,
verify_git_versions_invalid,
):
repo_path, _, commits = mock_git_package_changes
repo, _, commits = mock_git_package_changes
spack.repo.PATH.put_first(repo)
monkeypatch.setattr(spack.paths, "prefix", repo_path)
monkeypatch.setattr(spack.repo, "packages_path", mock_packages_path(repo.packages_path))
out = ci_cmd("verify-versions", commits[-1], commits[-3], fail_on_error=False)
assert "Invalid checksum found diff-test@2.1.5" in out
@@ -2050,8 +2061,10 @@ def test_ci_verify_versions_standard_invalid(
def test_ci_verify_versions_manual_package(monkeypatch, mock_packages, mock_git_package_changes):
repo_path, _, commits = mock_git_package_changes
monkeypatch.setattr(spack.paths, "prefix", repo_path)
repo, _, commits = mock_git_package_changes
spack.repo.PATH.put_first(repo)
monkeypatch.setattr(spack.repo, "packages_path", mock_packages_path(repo.packages_path))
pkg_class = spack.spec.Spec("diff-test").package_class
monkeypatch.setattr(pkg_class, "manual_download", True)

View File

@@ -243,13 +243,11 @@ def latest_commit():
@pytest.fixture
def mock_git_package_changes(git, tmpdir, override_git_repos_cache_path):
def mock_git_package_changes(git, tmpdir, override_git_repos_cache_path, monkeypatch):
"""Create a mock git repo with known structure of package edits
The structure of commits in this repo is as follows::
o diff-test: modification to make manual download package
|
o diff-test: add v1.2 (from a git ref)
|
o diff-test: add v1.1 (from source tarball)
@@ -261,8 +259,12 @@ def mock_git_package_changes(git, tmpdir, override_git_repos_cache_path):
Important attributes of the repo for test coverage are: multiple package
versions are added with some coming from a tarball and some from git refs.
"""
repo_path = str(tmpdir.mkdir("git_package_changes_repo"))
filename = "var/spack/repos/builtin/packages/diff-test/package.py"
filename = "diff-test/package.py"
repo_path, _ = spack.repo.create_repo(str(tmpdir.mkdir("myrepo")))
repo_cache = spack.util.file_cache.FileCache(str(tmpdir.mkdir("cache")))
repo = spack.repo.Repo(repo_path, cache=repo_cache)
def commit(message):
global commit_counter
@@ -276,7 +278,7 @@ def commit(message):
)
commit_counter += 1
with working_dir(repo_path):
with working_dir(repo.packages_path):
git("init")
git("config", "user.name", "Spack")
@@ -307,17 +309,11 @@ def latest_commit():
commit("diff-test: add v2.1.6")
commits.append(latest_commit())
# convert pkg-a to a manual download package
shutil.copy2(f"{spack.paths.test_path}/data/conftest/diff-test/package-3.txt", filename)
git("add", filename)
commit("diff-test: modification to make manual download package")
commits.append(latest_commit())
# The commits are ordered with the last commit first in the list
commits = list(reversed(commits))
# Return the git directory to install, the filename used, and the commits
yield repo_path, filename, commits
yield repo, filename, commits
@pytest.fixture(autouse=True)

View File

@@ -1,23 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class DiffTest(AutotoolsPackage):
"""zlib replacement with optimizations for next generation systems."""
homepage = "https://github.com/zlib-ng/zlib-ng"
url = "https://github.com/zlib-ng/zlib-ng/archive/2.0.0.tar.gz"
git = "https://github.com/zlib-ng/zlib-ng.git"
license("Zlib")
manual_download = True
version("2.1.6", tag="2.1.6", commit="74253725f884e2424a0dd8ae3f69896d5377f325")
version("2.1.5", sha256="3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04")
version("2.1.4", sha256="a0293475e6a44a3f6c045229fe50f69dc0eebc62a42405a51f19d46a5541e77a")
version("2.0.7", sha256="6c0853bb27738b811f2b4d4af095323c3d5ce36ceed6b50e5f773204fb8f7200")
version("2.0.0", sha256="86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8")

View File

@@ -7,9 +7,9 @@
def test_modified_files(mock_git_package_changes):
repo_path, filename, commits = mock_git_package_changes
repo, filename, commits = mock_git_package_changes
with working_dir(repo_path):
with working_dir(repo.packages_path):
files = get_modified_files(from_ref="HEAD~1", to_ref="HEAD")
assert len(files) == 1
assert files[0] == filename

View File

@@ -27,9 +27,8 @@ spack:
- py-transformers
# JAX
# Does not yet support Spack-installed ROCm
# - py-jax
# - py-jaxlib
- py-jax
- py-jaxlib
# Keras
- py-keras backend=tensorflow

View File

@@ -18,6 +18,7 @@ class Asio(AutotoolsPackage):
license("BSL-1.0")
# As uneven minor versions of asio are not considered stable, they won't be added anymore
version("1.34.0", sha256="061ed6c8b97527756aed3e34d2cbcbcb6d3c80afd26ed6304f51119e1ef6a1cd")
version("1.32.0", sha256="f1b94b80eeb00bb63a3c8cef5047d4e409df4d8a3fe502305976965827d95672")
version("1.30.2", sha256="755bd7f85a4b269c67ae0ea254907c078d408cce8e1a352ad2ed664d233780e8")
version("1.30.1", sha256="94b121cc2016680f2314ef58eadf169c2d34fff97fba01df325a192d502d3a58")

View File

@@ -15,6 +15,8 @@ class Glab(GoPackage):
license("MIT")
version("1.55.0", sha256="21f58698b92035461e8e8ba9040429f4b5a0f6d528d8333834ef522a973384c8")
version("1.54.0", sha256="99f5dd785041ad26c8463ae8630e98a657aa542a2bb02333d50243dd5cfdf9cb")
version("1.53.0", sha256="2930aa5dd76030cc6edcc33483bb49dd6a328eb531d0685733ca7be7b906e915")
version("1.52.0", sha256="585495e53d3994172fb927218627b7470678bc766320cb52f4b4204238677dde")
version("1.51.0", sha256="6a95d827004fee258aacb49a427875e3b505b063cc578933d965cd56481f5a19")
@@ -34,20 +36,38 @@ class Glab(GoPackage):
version("1.21.1", sha256="8bb35c5cf6b011ff14d1eaa9ab70ec052d296978792984250e9063b006ee4d50")
version("1.20.0", sha256="6beb0186fa50d0dea3b05fcfe6e4bc1f9be0c07aa5fa15b37ca2047b16980412")
depends_on("go@1.13:", type="build")
depends_on("go@1.17:", type="build", when="@1.22:")
depends_on("go@1.18:", type="build", when="@1.23:")
depends_on("go@1.19:", type="build", when="@1.35:")
depends_on("go@1.21:", type="build", when="@1.37:")
depends_on("go@1.22.3:", type="build", when="@1.41:")
depends_on("go@1.22.4:", type="build", when="@1.42:")
depends_on("go@1.22.5:", type="build", when="@1.44:")
depends_on("go@1.23:", type="build", when="@1.46:")
depends_on("go@1.23.2:", type="build", when="@1.48:")
depends_on("go@1.23.4:", type="build", when="@1.52:")
with default_args(type="build"):
depends_on("go@1.24.1:", when="@1.54:")
depends_on("go@1.23.4:", when="@1.52:")
depends_on("go@1.23.2:", when="@1.48:")
depends_on("go@1.23.0:", when="@1.46:")
depends_on("go@1.22.5:", when="@1.44:")
depends_on("go@1.22.4:", when="@1.42:")
depends_on("go@1.22.3:", when="@1.41:")
depends_on("go@1.21.0:", when="@1.37:")
depends_on("go@1.19.0:", when="@1.35:")
depends_on("go@1.18.0:", when="@1.23:")
depends_on("go@1.17.0:", when="@1.22:")
depends_on("go@1.13.0:")
build_directory = "cmd/glab"
# Required to correctly set the version
# https://gitlab.com/gitlab-org/cli/-/blob/v1.55.0/Makefile?ref_type=tags#L44
@property
def build_args(self):
extra_ldflags = [f"-X 'main.version=v{self.version}'"]
args = super().build_args
if "-ldflags" in args:
ldflags_index = args.index("-ldflags") + 1
args[ldflags_index] = args[ldflags_index] + " " + " ".join(extra_ldflags)
else:
args.extend(["-ldflags", " ".join(extra_ldflags)])
return args
@run_after("install")
def install_completions(self):
glab = Executable(self.prefix.bin.glab)

View File

@@ -14,12 +14,14 @@ class GtkDoc(AutotoolsPackage):
pdf/man-pages with some extra work."""
homepage = "https://wiki.gnome.org/DocumentationProject/GtkDoc"
url = "https://gitlab.gnome.org/GNOME/gtk-doc/-/archive/1.33.2/gtk-doc-1.33.2.tar.gz"
url = "https://download.gnome.org/sources/gtk-doc/1.33/gtk-doc-1.33.2.tar.xz"
list_url = "https://download.gnome.org/sources/gtk-doc/"
list_depth = 1
license("GPL-2.0-or-later AND GFDL-1.1-or-later")
version("1.33.2", sha256="2d1b0cbd26edfcb54694b2339106a02a81d630a7dedc357461aeb186874cc7c0")
version("1.32", sha256="0890c1f00d4817279be51602e67c4805daf264092adc58f9c04338566e8225ba")
version("1.33.2", sha256="cc1b709a20eb030a278a1f9842a362e00402b7f834ae1df4c1998a723152bf43")
version("1.32", sha256="de0ef034fb17cb21ab0c635ec730d19746bce52984a6706e7bbec6fb5e0b907c")
depends_on("c", type="build") # generated
@@ -60,14 +62,8 @@ def installcheck(self):
pass
def url_for_version(self, version):
"""Handle gnome's version-based custom URLs."""
if version <= Version("1.32"):
url = "https://gitlab.gnome.org/GNOME/gtk-doc/-/archive/GTK_DOC_{0}/gtk-doc-GTK_DOC_{0}.tar.gz"
return url.format(version.underscored)
url = "https://gitlab.gnome.org/GNOME/gtk-doc/-/archive/{0}/gtk-doc-{0}.tar.gz"
return url.format(version)
url = "https://download.gnome.org/sources/gtk-doc/{0}/gtk-doc-{1}.tar.xz"
return url.format(version.up_to(2), version)
def configure_args(self):
args = ["--with-xml-catalog={0}".format(self["docbook-xml"].catalog)]

View File

@@ -169,6 +169,10 @@ class Hpx(CMakePackage, CudaPackage, ROCmPackage):
# Patches and one-off conflicts
# Asio 1.34.0 removed io_context::work, used by HPX:
# https://github.com/chriskohlhoff/asio/commit/a70f2df321ff40c1809773c2c09986745abf8d20.
conflicts("^asio@1.34:", when="@:1.10")
# Certain Asio headers don't compile with nvcc from 1.17.0 onwards with
# C++17. Starting with CUDA 11.3 they compile again.
conflicts("^asio@1.17.0:", when="+cuda cxxstd=17 ^cuda@:11.2")

View File

@@ -20,13 +20,13 @@ class PyJax(PythonPackage):
maintainers("adamjstewart", "jonas-eschle")
# version("0.5.0", sha256="49df70bf293a345a7fb519f71193506d37a024c4f850b358042eb32d502c81c8")
# version("0.4.38", sha256="43bae65881628319e0a2148e8f81a202fbc2b8d048e35c7cb1df2416672fa4a8")
# version("0.4.37", sha256="7774f3d9e23fe199c65589c680c5a5be87a183b89598421a632d8245222b637b")
# version("0.4.36", sha256="088bff0575d01fc82682a9af4eb07433d60de7e5164686bd2cea3439492e608a")
# version("0.4.35", sha256="c0c986993026b10bf6f607fecb7417377460254640766ce40f1fef3fd139c12e")
# version("0.4.34", sha256="44196854f40c5f9cea3142824b9f1051f85afc3fcf7593ec5479fc8db01c58db")
# version("0.4.33", sha256="f0d788692fc0179653066c9e1c64e57311b8c15a389837fd7baf328abefcbb92")
# version("0.4.32", sha256="eb703909968da161894fb6135a931c5f3d2aab64fff7cba5fcb803ce6d968e08")
version("0.4.38", sha256="43bae65881628319e0a2148e8f81a202fbc2b8d048e35c7cb1df2416672fa4a8")
version("0.4.37", sha256="7774f3d9e23fe199c65589c680c5a5be87a183b89598421a632d8245222b637b")
version("0.4.36", sha256="088bff0575d01fc82682a9af4eb07433d60de7e5164686bd2cea3439492e608a")
version("0.4.35", sha256="c0c986993026b10bf6f607fecb7417377460254640766ce40f1fef3fd139c12e")
version("0.4.34", sha256="44196854f40c5f9cea3142824b9f1051f85afc3fcf7593ec5479fc8db01c58db")
version("0.4.33", sha256="f0d788692fc0179653066c9e1c64e57311b8c15a389837fd7baf328abefcbb92")
version("0.4.32", sha256="eb703909968da161894fb6135a931c5f3d2aab64fff7cba5fcb803ce6d968e08")
version("0.4.31", sha256="fd2d470643a0073d822737f0788f71391656af7e62cc5b2e7995ee390ceac287")
version("0.4.30", sha256="94d74b5b2db0d80672b61d83f1f63ebf99d2ab7398ec12b2ca0c9d1e97afe577")
version("0.4.29", sha256="12904571eaefddcdc8c3b8d4936482b783d5a216e99ef5adcd3522fdfb4fc186")
@@ -85,13 +85,13 @@ class PyJax(PythonPackage):
# https://github.com/google/jax/commit/8be057de1f50756fe7522f7e98b2f30fad56f7e4
for v in [
# "0.5.0",
# "0.4.38",
# "0.4.37",
# "0.4.36",
# "0.4.35",
# "0.4.34",
# "0.4.33",
# "0.4.32",
"0.4.38",
"0.4.37",
"0.4.36",
"0.4.35",
"0.4.34",
"0.4.33",
"0.4.32",
"0.4.31",
"0.4.30",
"0.4.29",
@@ -126,12 +126,12 @@ class PyJax(PythonPackage):
# See _minimum_jaxlib_version in jax/version.py
# depends_on("py-jaxlib@0.5:", when="@0.5:")
# depends_on("py-jaxlib@0.4.38:", when="@0.4.38:")
# depends_on("py-jaxlib@0.4.36:", when="@0.4.36:")
# depends_on("py-jaxlib@0.4.35:", when="@0.4.35:")
# depends_on("py-jaxlib@0.4.34:", when="@0.4.34:")
# depends_on("py-jaxlib@0.4.33:", when="@0.4.33:")
# depends_on("py-jaxlib@0.4.32:", when="@0.4.32:")
depends_on("py-jaxlib@0.4.38:", when="@0.4.38:")
depends_on("py-jaxlib@0.4.36:", when="@0.4.36:")
depends_on("py-jaxlib@0.4.35:", when="@0.4.35:")
depends_on("py-jaxlib@0.4.34:", when="@0.4.34:")
depends_on("py-jaxlib@0.4.33:", when="@0.4.33:")
depends_on("py-jaxlib@0.4.32:", when="@0.4.32:")
depends_on("py-jaxlib@0.4.30:", when="@0.4.31:")
depends_on("py-jaxlib@0.4.27:", when="@0.4.28:")
depends_on("py-jaxlib@0.4.23:", when="@0.4.27:")

View File

@@ -8,20 +8,27 @@
from spack.package import *
rocm_dependencies = [
"hsa-rocr-dev",
"comgr",
"hip",
"rccl",
"rocprim",
"hipblas",
"hipblaslt",
"hipcub",
"rocthrust",
"roctracer-dev",
"rocrand",
"hipsparse",
"hipfft",
"rocfft",
"rocblas",
"hiprand",
"hipsolver",
"hipsparse",
"hsa-rocr-dev",
"miopen-hip",
"rccl",
"rocblas",
"rocfft",
"rocminfo",
"rocprim",
"rocrand",
"rocsolver",
"rocsparse",
"roctracer-dev",
"rocm-core",
]
@@ -39,14 +46,17 @@ class PyJaxlib(PythonPackage, CudaPackage, ROCmPackage):
license("Apache-2.0")
maintainers("adamjstewart", "jonas-eschle")
# version("0.5.3", sha256="1094581a30ec069965f4e3e67d60262570cc3dd016adc62073bc24347b14270c")
# version("0.5.2", sha256="8e9de1e012dd65fc4a9eec8af4aa2bf6782767130a5d8e1c1e342b7d658280fe")
# version("0.5.1", sha256="e74b1209517682075933f757d646b73040d09fe39ee3e9e4cd398407dd0902d2")
# version("0.5.0", sha256="04cc2eeb2e7ce1916674cea03a7d75a59d583ddb779d5104e103a2798a283ce9")
# version("0.4.38", sha256="ca1e63c488d505b9c92e81499e8b06cc1977319c50d64a0e58adbd2dae1a625c")
# version("0.4.37", sha256="17a8444a931f26edda8ccbc921ab71c6bf46857287b1db186deebd357e526870")
# version("0.4.36", sha256="442bfdf491b509995aa160361e23a9db488d5b97c87e6648cc733501b06eda77")
# version("0.4.35", sha256="65e086708ae56670676b7b2340ad82b901d8c9993d1241a839c8990bdb8d6212")
# version("0.4.34", sha256="d3a75ad667772309ade81350fa70c4a78028a920028800282e46d8383c0ee6bb")
# version("0.4.33", sha256="122a806e80fc1cd7d8ffaf9620701f2cb8e4fe22271c2cec53a9c60b30bd4c31")
# version("0.4.32", sha256="3fe36d596e4d640443c0a5c533845c74fbc4341e024d9bb1cd75cb49f5f419c2")
version("0.4.38", sha256="ca1e63c488d505b9c92e81499e8b06cc1977319c50d64a0e58adbd2dae1a625c")
version("0.4.37", sha256="17a8444a931f26edda8ccbc921ab71c6bf46857287b1db186deebd357e526870")
version("0.4.36", sha256="442bfdf491b509995aa160361e23a9db488d5b97c87e6648cc733501b06eda77")
version("0.4.35", sha256="65e086708ae56670676b7b2340ad82b901d8c9993d1241a839c8990bdb8d6212")
version("0.4.34", sha256="d3a75ad667772309ade81350fa70c4a78028a920028800282e46d8383c0ee6bb")
version("0.4.33", sha256="122a806e80fc1cd7d8ffaf9620701f2cb8e4fe22271c2cec53a9c60b30bd4c31")
version("0.4.32", sha256="3fe36d596e4d640443c0a5c533845c74fbc4341e024d9bb1cd75cb49f5f419c2")
version("0.4.31", sha256="022ea1347f9b21cbea31410b3d650d976ea4452a48ea7317a5f91c238031bf94")
version("0.4.30", sha256="0ef9635c734d9bbb44fcc87df4f1c3ccce1cfcfd243572c80d36fcdf826fe1e6")
version("0.4.29", sha256="3a8005f4f62d35a5aad7e3dbd596890b47c81cc6e34fcfe3dcb93b3ca7cb1246")
@@ -93,6 +103,10 @@ class PyJaxlib(PythonPackage, CudaPackage, ROCmPackage):
for pkg_dep in rocm_dependencies:
depends_on(f"{pkg_dep}@6:", when="@0.4.28:")
depends_on(pkg_dep)
depends_on("rocprofiler-register", when="^hip@6.2:")
depends_on("hipblas-common", when="^hip@6.3:")
depends_on("hsakmt-roct", when="^hip@:6.2")
depends_on("llvm-amdgpu")
depends_on("py-nanobind")
with default_args(type="build"):
@@ -113,6 +127,7 @@ class PyJaxlib(PythonPackage, CudaPackage, ROCmPackage):
depends_on("python@3.9:", when="@0.4.14:")
depends_on("python@3.8:", when="@0.4.6:")
depends_on("python@:3.13")
depends_on("python@:3.12", when="+rocm")
depends_on("python@:3.12", when="@:0.4.33")
depends_on("python@:3.11", when="@:0.4.16")
@@ -167,6 +182,22 @@ class PyJaxlib(PythonPackage, CudaPackage, ROCmPackage):
# Fails to build with freshly released CUDA (#48708).
conflicts("^cuda@12.8:", when="@:0.4.31")
# external CUDA is not supported https://github.com/jax-ml/jax/issues/23689
conflicts("+cuda", when="@0.4.32:")
# aarch64 is not supported https://github.com/jax-ml/jax/issues/25598
conflicts("target=aarch64:", when="@0.4.32:")
resource(
name="xla",
url="https://github.com/ROCm/xla/archive/07543ab117699a57c1267b453a62f89b1d5953fd.tar.gz",
sha256="cee377479654201c61cc3f230d89603cd589525fea2faf44564a23c70ba1448d",
expand=True,
destination="",
placement="xla",
when="@0.4.38:0.5.2 +rocm",
)
def url_for_version(self, version):
url = "https://github.com/jax-ml/jax/archive/refs/tags/{}-v{}.tar.gz"
if version >= Version("0.4.33"):
@@ -175,6 +206,20 @@ def url_for_version(self, version):
name = "jaxlib"
return url.format(name, version)
def setup_build_environment(self, env):
spec = self.spec
if spec.satisfies("@0.4.38: +rocm") and not spec["hip"].external:
if spec.satisfies("^hip@6.2:"):
rocm_dependencies.append("rocprofiler-register")
if spec.satisfies("^hip@6.3:"):
rocm_dependencies.append("hipblas-common")
else:
rocm_dependencies.append("hsakmt-roct")
env.set("LLVM_PATH", spec["llvm-amdgpu"].prefix)
for pkg_dep in rocm_dependencies:
env.prepend_path("TF_ROCM_MULTIPLE_PATHS", spec[pkg_dep].prefix)
env.prune_duplicate_paths("TF_ROCM_MULTIPLE_PATHS")
def install(self, spec, prefix):
# https://jax.readthedocs.io/en/latest/developer.html
args = ["build/build.py"]
@@ -216,7 +261,15 @@ def install(self, spec, prefix):
args.append(f"--bazel_options=--repo_env=LOCAL_NCCL_PATH={spec['nccl'].prefix}")
if "+rocm" in spec:
args.extend(["--enable_rocm", f"--rocm_path={self.spec['hip'].prefix}"])
args.append(f"--rocm_path={self.spec['hip'].prefix}")
if spec.satisfies("@:0.4.35"):
args.append("--enable_rocm")
if spec.satisfies("@0.4.38:") and not spec["hip"].external:
args.append("--bazel_options=--@local_config_rocm//rocm:rocm_path_type=multiple")
if spec.satisfies("@0.4.38:0.5.2"):
args.append(
f"--bazel_options=--override_repository=xla={self.stage.source_path}/xla"
)
args.extend(
[
@@ -227,5 +280,6 @@ def install(self, spec, prefix):
)
python(*args)
whl = glob.glob(join_path("dist", "*.whl"))[0]
pip(*PythonPipBuilder.std_args(self), f"--prefix={self.prefix}", whl)
for whl in glob.glob(join_path("dist", "*.whl")):
pip(*PythonPipBuilder.std_args(self), f"--prefix={self.prefix}", whl)