Compare commits


43 Commits

Author SHA1 Message Date
Harmen Stoppels
14de4b67ef . 2024-09-11 09:54:02 +02:00
Harmen Stoppels
e59363eaa1 simplify 2024-09-11 09:46:33 +02:00
Harmen Stoppels
ded4935834 detection: reduce fs pressure 2024-09-11 09:46:17 +02:00
Harmen Stoppels
c505f4f70a use packagebase 2024-09-11 09:44:08 +02:00
afzpatel
e1da0a7312 Bump up the version for ROCm-6.2.0 (#45701)
* initial update for rocm 6.2
* fix typo in rocprofiler-register
* update rocm-device-libs
* add patch to use clang 18 for roctracer-dev
* add updates for rocm-opencl and rocm-validation-suite
* add hipify-clang patch
* update remaining packages to 6.2
* update hipblaslt mivisionx patches
* update rocm-tensile to 6.2.0
* add hipsparselt changes for 6.2
* add rocm-openmp-extras patch
* add build-time test for rocprofiler-register
* update flang-legacy support for 6.2
* simplify version range
* update boost dependency in rpp

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-09-10 20:13:21 -06:00
Taillefumier Mathieu
f58ee3ea2b SIRIUS: add v7.6.1 (#46296)
bug fix update for sirius
2024-09-10 13:08:32 -06:00
Massimiliano Culpo
ffdfa498bf Deprecate config:install_missing_compilers (#46237)
The option config:install_missing_compilers is currently buggy,
and has been for a while. Remove it, since it won't be needed
when compilers are treated as dependencies.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-10 20:02:05 +02:00
Teague Sterling
decefe0234 perl-bio-ensembl-io: new package (#44509)
* Adding the perl-bio-db-bigfile package

* Adding perl-bio-ensembl-io package

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Updating dependent package handling

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Updating dependent package handling

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Reverting variants

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Rename package.py to package.py

* Update package.py

* Update package.py

* Removing unneeded dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Updated from testing by @ebiarnie

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Updated from testing by @ebiarnie

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing depends

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fix styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-09-10 11:45:59 -06:00
Pranav Sivaraman
e2e5e74b99 fast-float: new package (#46137)
* fast-float: new package

* fast-float: add test dependency

* fast-float: fix doctest dependency type

* fast-float: convert deps to tuple

* fast-float: add v6.1.5 and v6.1.6

* fast-float: patch older versions to use find_package(doctest)
2024-09-10 12:36:23 -05:00
Tony Weaver
0629c5df38 py-your: Changed software pull location (#46201)
* py-your: new package

Spack package recipe for YOUR, Your Unified Reader.  YOUR processes pulsar data in different formats.

Output below from spack install py-your
spack install py-your
==> Installing py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x [83/83]
==> No binary for py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x found: installing from source
==> Fetching https://github.com/thepetabyteproject/your/archive/refs/tags/0.6.7.tar.gz
==> No patches needed for py-your
==> py-your: Executing phase: 'install'
==> py-your: Successfully installed py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x
  Stage: 1.43s.  Install: 0.99s.  Post-install: 0.12s.  Total: 3.12s

* Removed setup_run_environment

After some testing, both spack load and module load for the package will include the bin directory generated by py-your as well as the path to the version of python the package was built with, without the need for the setup_run_environment function.

I removed that function (although, like Tamara, I thought it would be necessary based on other package setups I used as a basis for this package).

Note: I also updated the required version of py-astropy from py-astropy@4.0: to py-astropy@6.1.0:. In my test builds, the install was picking up version py-astropy@4.0.1.post1 and numpy 1.26. However, when I tried to run some of the code, I was getting errors about py-astropy making numpy calls that have since been removed. The newer version of py-astropy corrects these. Ideally this would be handled in the py-astropy package, to make sure numpy isn't too new

* Changed  software pull location

The original package pulled a tagged release version from GitHub. That tagged version was created in 2022 and has not been updated since. It no longer runs because newer versions of numpy have removed deprecation warnings for several of their calls. The main branch of this repository has addressed these numpy issues as well as some other important fixes, but no new release has been generated. Because of this, and the apparently minimal development now going on, it is probably best to always pull from the main branch

* [@spackbot] updating style on behalf of aweaver1fandm

* py-your: Changed software pull location

1. Restored original URL and version (0.6.7) as requested
2. Updated py-numpy dependency versions to be constrained based on the version of your. Because of numpy deprecations related to your version 0.6.7, we need to ensure that the numpy version used is not 1.24 or greater, because the deprecations were removed starting with that version
2024-09-10 10:37:05 -05:00
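The version-conditional numpy constraint described in item 2 could be expressed in the recipe with Spack's `depends_on` directive. A hypothetical package.py fragment (the exact bound follows the commit message; the `type` tuple is an assumption typical of Python packages):

```python
# Hypothetical sketch: pin numpy below 1.24 for the tagged release, since
# the deprecated numpy calls it relies on were removed in numpy 1.24.
depends_on("py-numpy@:1.23", when="@0.6.7", type=("build", "run"))
```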
Robert Underwood
79d778f8cd Add additional cuda-toolkit location variable used by py-torch (#46245)
supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-10 17:20:05 +02:00
Teague Sterling
6f5ba44431 perl-bioperl: add v1.6.924, v1.7.8, deprecate v1.007002, refactor dependencies, update url (#46213)
* perl-bioperl: add v1.6.924

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fix style

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* perl-bioperl: add v1.6.924, v1.7.2, deprecate v1.007002, refactor dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* perl-bioperl: add v1.7.8

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* perl-bioperl: update url

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* perl-bioperl: cleanup version urls

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* remove v1.7.2

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-09-10 11:36:43 +01:00
Harmen Stoppels
0905edf592 r: do not create dir in setup_dependent_package (#46282) 2024-09-10 09:04:09 +02:00
Harmen Stoppels
16dba78288 spec.py: dedent format logic (#46279) 2024-09-10 09:02:37 +02:00
Todd Gamblin
b220938d42 bugfix: elfutils has no bzip2 or xz variants (#46294)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-10 06:20:39 +00:00
Mikael Simberg
78117877e0 boost: Refactor header-only install and add missing compiled libraries (#46281) 2024-09-10 08:16:47 +02:00
Harmen Stoppels
975f4fbf84 cmake: remove last occurrences of std_cmake_args globals (#46288) 2024-09-10 07:56:51 +02:00
AcriusWinter
dc853b2cf4 gptune: new test API (#45383)
* gptune: new test API
* gptune: cleanup; finish API changes; separate unrelated test parts
* gptune: standalone test cleanup with timeout constraints
* gptune: ensure stand-alone test bash failures terminate; enable in CI
* gptune: add directory to terminate_bash_failures
* gptune/stand-alone tests: use satisfies for checking variants

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-09-09 15:21:55 -07:00
Philipp Edelmann
69b628a3f0 makedepf90: new package (#45816)
* makedepf90: new package

* reorder versions and dependencies
2024-09-09 16:40:25 -05:00
David Collins
98fb9c23f9 numactl: Add versions 2.0.16-2.0.18 (#46150)
* Add numactl 2.0.16-2.0.18

* Create link-with-latomic-if-needed-v2.0.16.patch

Add a link to libatomic, if needed, for numactl v2.0.16.

* Add some missing patches to v2.0.16

* Create numactl-2.0.18-syscall-NR-ppc64.patch

In short, we need numactl to set __NR_set_mempolicy_home_node on ppc64, if it's not already defined.

* Apply a necessary patch for v2.0.18 on PPC64

* Add libatomic patch for v2.0.16
2024-09-09 14:33:13 -06:00
Pranav Sivaraman
d5e08abe46 py-poxy: add new package (#46214)
* py-poxy: add new package
* py-poxy: depends on misk >= 0.8.1
2024-09-09 12:19:54 -07:00
Harmen Stoppels
2f789f01d3 Revert "Set module variables for all packages before running setup_dependent_…" (#46283)
This reverts commit 6f08db4631.
2024-09-09 10:37:26 -07:00
Kevin Kuriakose
0229240df4 darshan-runtime: fix JOBID determination (#46148) 2024-09-09 10:53:15 -06:00
Harmen Stoppels
9059756a11 reindex: do not assume fixed layout (#46170)
`spack reindex` relies on projections from configuration to locate
installed specs and prefixes. This is problematic because config can
change over time, and we have reasons to do so when turning compilers
into dependencies (removing `{compiler.name}-{compiler.version}` from
projections)

This commit makes reindex recursively search for .spack/ metadirs.
2024-09-09 17:26:30 +02:00
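The layout-agnostic approach described above — walking the install root for `.spack` metadata directories instead of applying configured projections — can be sketched in a few lines (names here are illustrative, not Spack's actual API):

```python
import os
from typing import List


def find_install_prefixes(root: str) -> List[str]:
    """Recursively locate install prefixes under ``root``, identified by a
    ``.spack`` metadata directory, without assuming any projection scheme."""
    prefixes = []
    for dirpath, dirnames, _filenames in os.walk(root):
        if ".spack" in dirnames:
            prefixes.append(dirpath)
            dirnames.remove(".spack")  # no need to descend into the metadir
    return sorted(prefixes)
```

Because the search keys on the metadata directory itself, it keeps working even after the configured projections change, which is the point of the commit.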
Pierre Augier
67089b967e Add new package py-fluiddyn (#46235) 2024-09-09 09:47:44 -05:00
Mikael Simberg
98e35f7232 pika: Add conflict with fmt 11 and newer (#46280) 2024-09-09 14:30:32 +02:00
Teague Sterling
3ff441c5b0 perl-graphviz: add graphviz dependency (#46268)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-09-09 11:08:34 +02:00
Jordan Galby
2502a3c2b6 Fix regression in spec format string for individual variants (#46206)
Fix a regression in {variants.X} and {variants.X.value} spec format strings.
2024-09-09 10:42:04 +02:00
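For context, the two format references the fix restores behave differently. A hypothetical illustration (assuming a concretized spec with a multi-valued `build_system` variant; the exact rendered strings are assumptions, not taken from the PR):

```python
# {variants.X} renders the whole variant token, e.g. "build_system=cmake";
# {variants.X.value} renders only the value, e.g. "cmake".
spec.format("{variants.build_system}")
spec.format("{variants.build_system.value}")
```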
Adam J. Stewart
9bdc97043e PyTorch: unpin pybind11 dependency (#46266) 2024-09-09 08:42:09 +02:00
snehring
66ee4caeab nekrs: add v23.0, add new build system (#45992) 2024-09-09 07:32:14 +02:00
Eric Berquist
18218d732a intel-pin: add up to v3.31 (#46185) 2024-09-09 07:28:53 +02:00
Paul Ferrell
4c6b3ccb40 charliecloud: fix libsquashfuse dependency type (#46189)
Should have been a link dependency, not a build dependency.
2024-09-09 07:26:20 +02:00
Satish Balay
1d9c8a9034 xsdk: add amrex variant (#46190)
and remove compiler conditionals [as the amrex conflict with clang no longer exists since #22967]
2024-09-09 07:19:31 +02:00
AMD Toolchain Support
216619bb53 namd: variant updates (#45825)
* Add missing variant, already used in recipe (avxtiles)

* Add memopt variant 

Co-authored-by: viveshar <vivek.sharma2@amd.com>
2024-09-09 07:12:16 +02:00
Wouter Deconinck
47c771f03f assimp: add v5.4.3, enable testing (#46267) 2024-09-09 06:51:57 +02:00
Wouter Deconinck
c7139eb690 wayland-protocols: add v1.37 (#46269) 2024-09-09 06:50:30 +02:00
Wouter Deconinck
74ba81368a py-vector: add v1.5.1 (#46271) 2024-09-09 06:49:35 +02:00
Wouter Deconinck
ef11fcdf34 protobuf: patch @3.22:3.24.3 when ^abseil-cpp@20240116: (#46273) 2024-09-09 06:48:50 +02:00
Wouter Deconinck
c4f3348af1 (py-)onnx: add v1.16.2; onnx: enable testing (#46274) 2024-09-09 06:47:45 +02:00
Richard Berger
fec2f30d5a lammps: add 20240829 and 20230802.4 releases (#46131) 2024-09-09 06:46:12 +02:00
Wouter Deconinck
6af92f59ec xrootd: add v5.7.1 (#46270) 2024-09-09 06:45:15 +02:00
Pierre Augier
cb47c5f0ac Add package py-transonic (#46234) 2024-09-08 20:03:09 -05:00
Dom Heinzeller
32727087f1 Add patch vfile_cassert.patch for ecflow@5.11.4 (#46095) 2024-09-07 07:54:43 -06:00
125 changed files with 1691 additions and 1123 deletions


@@ -115,12 +115,6 @@ config:
suppress_gpg_warnings: false
# If set to true, Spack will attempt to build any compiler on the spec
# that is not already available. If set to False, Spack will only use
# compilers already configured in compilers.yaml
install_missing_compilers: false
# If set to true, Spack will always check checksums after downloading
# archives. If false, Spack skips the checksum step.
checksum: true


@@ -218,6 +218,7 @@ def setup(sphinx):
("py:class", "spack.spec.SpecfileReaderBase"),
("py:class", "spack.install_test.Pb"),
("py:class", "spack.filesystem_view.SimpleFilesystemView"),
("py:class", "spack.traverse.EdgeAndDepth"),
]
# The reST default role (used for this markup: `text`) to use for all documents.


@@ -181,10 +181,6 @@ Spec-related modules
:mod:`spack.parser`
Contains :class:`~spack.parser.SpecParser` and functions related to parsing specs.
:mod:`spack.concretize`
Contains :class:`~spack.concretize.Concretizer` implementation,
which allows site administrators to change Spack's :ref:`concretization-policies`.
:mod:`spack.version`
Implements a simple :class:`~spack.version.Version` class with simple
comparison semantics. Also implements :class:`~spack.version.VersionRange`


@@ -663,11 +663,7 @@ build the package.
When including a bootstrapping phase as in the example above, the result is that
the bootstrapped compiler packages will be pushed to the binary mirror (and the
local artifacts mirror) before the actual release specs are built. In this case,
the jobs corresponding to subsequent release specs are configured to
``install_missing_compilers``, so that if spack is asked to install a package
with a compiler it doesn't know about, it can be quickly installed from the
binary mirror first.
local artifacts mirror) before the actual release specs are built.
Since bootstrapping compilers is optional, those items can be left out of the
environment/stack file, and in that case no bootstrapping will be done (only the


@@ -1003,6 +1003,7 @@ def set_all_package_py_globals(self):
"""Set the globals in modules of package.py files."""
for dspec, flag in chain(self.external, self.nonexternal):
pkg = dspec.package
if self.should_set_package_py_globals & flag:
if self.context == Context.BUILD and self.needs_build_context & flag:
set_package_py_globals(pkg, context=Context.BUILD)
@@ -1010,12 +1011,6 @@ def set_all_package_py_globals(self):
# This includes runtime dependencies, also runtime deps of direct build deps.
set_package_py_globals(pkg, context=Context.RUN)
# Looping over the set of packages a second time
# ensures all globals are loaded into the module space prior to
# any package setup. This guarantees package setup methods have
# access to expected module level definitions such as "spack_cc"
for dspec, flag in chain(self.external, self.nonexternal):
pkg = dspec.package
for spec in dspec.dependents():
# Note: some specs have dependents that are unreachable from the root, so avoid
# setting globals for those.


@@ -5,6 +5,7 @@
from typing import Optional, Tuple
import llnl.util.lang as lang
from llnl.util.filesystem import mkdirp
from spack.directives import extends
@@ -36,6 +37,7 @@ def configure_vars(self):
def install(self, pkg, spec, prefix):
"""Installs an R package."""
mkdirp(pkg.module.r_lib_dir)
config_args = self.configure_args()
config_vars = self.configure_vars()
@@ -43,12 +45,12 @@ def install(self, pkg, spec, prefix):
args = ["--vanilla", "CMD", "INSTALL"]
if config_args:
args.append("--configure-args={0}".format(" ".join(config_args)))
args.append(f"--configure-args={' '.join(config_args)}")
if config_vars:
args.append("--configure-vars={0}".format(" ".join(config_vars)))
args.append(f"--configure-vars={' '.join(config_vars)}")
args.extend(["--library={0}".format(self.pkg.module.r_lib_dir), self.stage.source_path])
args.extend([f"--library={pkg.module.r_lib_dir}", self.stage.source_path])
pkg.module.R(*args)
@@ -79,27 +81,21 @@ class RPackage(Package):
@lang.classproperty
def homepage(cls):
if cls.cran:
return "https://cloud.r-project.org/package=" + cls.cran
return f"https://cloud.r-project.org/package={cls.cran}"
elif cls.bioc:
return "https://bioconductor.org/packages/" + cls.bioc
return f"https://bioconductor.org/packages/{cls.bioc}"
@lang.classproperty
def url(cls):
if cls.cran:
return (
"https://cloud.r-project.org/src/contrib/"
+ cls.cran
+ "_"
+ str(list(cls.versions)[0])
+ ".tar.gz"
)
return f"https://cloud.r-project.org/src/contrib/{cls.cran}_{str(list(cls.versions)[0])}.tar.gz"
@lang.classproperty
def list_url(cls):
if cls.cran:
return "https://cloud.r-project.org/src/contrib/Archive/" + cls.cran + "/"
return f"https://cloud.r-project.org/src/contrib/Archive/{cls.cran}/"
@property
def git(self):
if self.bioc:
return "https://git.bioconductor.org/packages/" + self.bioc
return f"https://git.bioconductor.org/packages/{self.bioc}"


@@ -82,10 +82,7 @@ def compiler_find(args):
"""
paths = args.add_paths or None
new_compilers = spack.compilers.find_compilers(
path_hints=paths,
scope=args.scope,
mixed_toolchain=args.mixed_toolchain,
max_workers=args.jobs,
path_hints=paths, scope=args.scope, mixed_toolchain=args.mixed_toolchain
)
if new_compilers:
n = len(new_compilers)


@@ -135,9 +135,7 @@ def external_find(args):
candidate_packages = packages_to_search_for(
names=args.packages, tags=args.tags, exclude=args.exclude
)
detected_packages = spack.detection.by_path(
candidate_packages, path_hints=args.path, max_workers=args.jobs
)
detected_packages = spack.detection.by_path(candidate_packages, path_hints=args.path)
new_specs = spack.detection.update_configuration(
detected_packages, scope=args.scope, buildable=not args.not_buildable


@@ -243,7 +243,6 @@ def find_compilers(
*,
scope: Optional[str] = None,
mixed_toolchain: bool = False,
max_workers: Optional[int] = None,
) -> List["spack.compiler.Compiler"]:
"""Searches for compiler in the paths given as argument. If any new compiler is found, the
configuration is updated, and the list of new compiler objects is returned.
@@ -254,7 +253,6 @@ def find_compilers(
scope: configuration scope to modify
mixed_toolchain: allow mixing compilers from different toolchains if otherwise missing for
a certain language
max_workers: number of processes used to search for compilers
"""
import spack.detection
@@ -267,9 +265,7 @@ def find_compilers(
default_paths.extend(windows_os.WindowsOs().compiler_search_paths)
compiler_pkgs = spack.repo.PATH.packages_with_tags(COMPILER_TAG, full=True)
detected_packages = spack.detection.by_path(
compiler_pkgs, path_hints=default_paths, max_workers=max_workers
)
detected_packages = spack.detection.by_path(compiler_pkgs, path_hints=default_paths)
valid_compilers = {}
for name, detected in detected_packages.items():


@@ -19,35 +19,23 @@
import spack.tengine
import spack.util.path
class Concretizer:
"""(DEPRECATED) Only contains logic to enable/disable compiler existence checks."""
#: Controls whether we check that compiler versions actually exist
#: during concretization. Used for testing and for mirror creation
check_for_compiler_existence = None
def __init__(self):
if Concretizer.check_for_compiler_existence is None:
Concretizer.check_for_compiler_existence = not spack.config.get(
"config:install_missing_compilers", False
)
CHECK_COMPILER_EXISTENCE = True
@contextmanager
def disable_compiler_existence_check():
saved = Concretizer.check_for_compiler_existence
Concretizer.check_for_compiler_existence = False
global CHECK_COMPILER_EXISTENCE
CHECK_COMPILER_EXISTENCE, saved = False, CHECK_COMPILER_EXISTENCE
yield
Concretizer.check_for_compiler_existence = saved
CHECK_COMPILER_EXISTENCE = saved
@contextmanager
def enable_compiler_existence_check():
saved = Concretizer.check_for_compiler_existence
Concretizer.check_for_compiler_existence = True
global CHECK_COMPILER_EXISTENCE
CHECK_COMPILER_EXISTENCE, saved = True, CHECK_COMPILER_EXISTENCE
yield
Concretizer.check_for_compiler_existence = saved
CHECK_COMPILER_EXISTENCE = saved
def find_spec(spec, condition, default=None):


@@ -207,7 +207,7 @@ class InstallRecord:
def __init__(
self,
spec: "spack.spec.Spec",
path: str,
path: Optional[str],
installed: bool,
ref_count: int = 0,
explicit: bool = False,
@@ -845,7 +845,7 @@ def check(cond, msg):
):
tty.warn(f"Spack database version changed from {version} to {_DB_VERSION}. Upgrading.")
self.reindex(spack.store.STORE.layout)
self.reindex()
installs = dict(
(k, v.to_dict(include_fields=self._record_fields)) for k, v in self._data.items()
)
@@ -918,8 +918,6 @@ def reindex(self):
if self.is_upstream:
raise UpstreamDatabaseLockingError("Cannot reindex an upstream database")
error: Optional[CorruptDatabaseError] = None
# Special transaction to avoid recursive reindex calls and to
# ignore errors if we need to rebuild a corrupt database.
def _read_suppress_error():
@@ -927,99 +925,116 @@ def _read_suppress_error():
if os.path.isfile(self._index_path):
self._read_from_file(self._index_path)
except CorruptDatabaseError as e:
nonlocal error
error = e
tty.warn(f"Reindexing corrupt database, error was: {e}")
self._data = {}
self._installed_prefixes = set()
transaction = lk.WriteTransaction(
self.lock, acquire=_read_suppress_error, release=self._write
)
with transaction:
if error is not None:
tty.warn(f"Spack database was corrupt. Will rebuild. Error was: {error}")
old_data = self._data
old_installed_prefixes = self._installed_prefixes
with lk.WriteTransaction(self.lock, acquire=_read_suppress_error, release=self._write):
old_installed_prefixes, self._installed_prefixes = self._installed_prefixes, set()
old_data, self._data = self._data, {}
try:
self._construct_from_directory_layout(old_data)
self._reindex(old_data)
except BaseException:
# If anything explodes, restore old data, skip write.
self._data = old_data
self._installed_prefixes = old_installed_prefixes
raise
def _construct_entry_from_directory_layout(
self,
old_data: Dict[str, InstallRecord],
spec: "spack.spec.Spec",
deprecator: Optional["spack.spec.Spec"] = None,
):
# Try to recover explicit value from old DB, but
# default it to True if DB was corrupt. This is
# just to be conservative in case a command like
# "autoremove" is run by the user after a reindex.
tty.debug(f"Reconstructing from spec file: {spec}")
explicit = True
inst_time = os.stat(spec.prefix).st_ctime
if old_data is not None:
old_info = old_data.get(spec.dag_hash())
if old_info is not None:
explicit = old_info.explicit
inst_time = old_info.installation_time
def _reindex(self, old_data: Dict[str, InstallRecord]):
# Specs on the file system are the source of truth for record.spec. The old database values
# if available are the source of truth for the rest of the record.
assert self.layout, "Database layout must be set to reindex"
self._add(spec, explicit=explicit, installation_time=inst_time)
if deprecator:
self._deprecate(spec, deprecator)
specs_from_fs = self.layout.all_specs()
deprecated_for = self.layout.deprecated_for(specs_from_fs)
def _construct_from_directory_layout(self, old_data: Dict[str, InstallRecord]):
# Read first the spec files in the prefixes. They should be considered authoritative with
# respect to DB reindexing, as entries in the DB may be corrupted in a way that still makes
# them readable. If we considered DB entries authoritative instead, we would perpetuate
# errors over a reindex.
assert self.layout is not None, "Cannot reindex a database without a known layout"
with self.layout.disable_upstream_check():
# Initialize data in the reconstructed DB
self._data = {}
self._installed_prefixes = set()
known_specs: List[spack.spec.Spec] = [
*specs_from_fs,
*(deprecated for _, deprecated in deprecated_for),
*(rec.spec for rec in old_data.values()),
]
# Start inspecting the installed prefixes
processed_specs = set()
upstream_hashes = {
dag_hash for upstream in self.upstream_dbs for dag_hash in upstream._data
}
upstream_hashes.difference_update(spec.dag_hash() for spec in known_specs)
for spec in self.layout.all_specs():
self._construct_entry_from_directory_layout(old_data, spec)
processed_specs.add(spec)
def create_node(edge: spack.spec.DependencySpec, is_upstream: bool):
if is_upstream:
return
for spec, deprecator in self.layout.all_deprecated_specs():
self._construct_entry_from_directory_layout(old_data, spec, deprecator)
processed_specs.add(spec)
self._data[edge.spec.dag_hash()] = InstallRecord(
spec=edge.spec.copy(deps=False),
path=edge.spec.external_path if edge.spec.external else None,
installed=edge.spec.external,
)
for entry in old_data.values():
# We already took care of this spec using spec file from its prefix.
if entry.spec in processed_specs:
tty.debug(
f"Skipping reconstruction from old db: {entry.spec}"
" [already reconstructed from spec file]"
)
continue
# Store all nodes of known specs, excluding ones found in upstreams
tr.traverse_breadth_first_with_visitor(
known_specs,
tr.CoverNodesVisitor(
NoUpstreamVisitor(upstream_hashes, create_node), key=tr.by_dag_hash
),
)
# If we arrived here it very likely means that we have external specs that are not
# dependencies of other specs. This may be the case for externally installed
# compilers or externally installed applications.
tty.debug(f"Reconstructing from old db: {entry.spec}")
try:
self._add(
spec=entry.spec,
explicit=entry.explicit,
installation_time=entry.installation_time,
)
processed_specs.add(entry.spec)
except Exception as e:
# Something went wrong, so the spec was not restored from old data
tty.debug(e)
# Store the prefix and other information for specs were found on the file system
for s in specs_from_fs:
record = self._data[s.dag_hash()]
record.path = s.prefix
record.installed = True
record.explicit = True # conservative assumption
record.installation_time = os.stat(s.prefix).st_ctime
self._check_ref_counts()
# Deprecate specs
for new, old in deprecated_for:
self._data[old.dag_hash()].deprecated_for = new.dag_hash()
# Copy data we have from the old database
for old_record in old_data.values():
record = self._data[old_record.spec.dag_hash()]
record.explicit = old_record.explicit
record.installation_time = old_record.installation_time
record.origin = old_record.origin
record.deprecated_for = old_record.deprecated_for
# Warn when the spec has been removed from the file system (i.e. it was not detected)
if not record.installed and old_record.installed:
tty.warn(
f"Spec {old_record.spec.short_spec} was marked installed in the database "
"but was not found on the file system. It is now marked as missing."
)
def create_edge(edge: spack.spec.DependencySpec, is_upstream: bool):
if not edge.parent:
return
parent_record = self._data[edge.parent.dag_hash()]
if is_upstream:
upstream, child_record = self.query_by_spec_hash(edge.spec.dag_hash())
assert upstream and child_record, "Internal error: upstream spec not found"
else:
child_record = self._data[edge.spec.dag_hash()]
parent_record.spec._add_dependency(
child_record.spec, depflag=edge.depflag, virtuals=edge.virtuals
)
# Then store edges
tr.traverse_breadth_first_with_visitor(
known_specs,
tr.CoverEdgesVisitor(
NoUpstreamVisitor(upstream_hashes, create_edge), key=tr.by_dag_hash
),
)
# Finally update the ref counts
for record in self._data.values():
for dep in record.spec.dependencies(deptype=_TRACKED_DEPENDENCIES):
dep_record = self._data.get(dep.dag_hash())
if dep_record: # dep might be upstream
dep_record.ref_count += 1
if record.deprecated_for:
self._data[record.deprecated_for].ref_count += 1
self._check_ref_counts()
def _check_ref_counts(self):
"""Ensure consistency of reference counts in the DB.
@@ -1199,7 +1214,7 @@ def _add(
for dep in spec.edges_to_dependencies(depflag=_TRACKED_DEPENDENCIES):
dkey = dep.spec.dag_hash()
upstream, record = self.query_by_spec_hash(dkey)
assert record, f"Missing dependency {dep.spec} in DB"
assert record, f"Missing dependency {dep.spec.short_spec} in DB"
new_spec._add_dependency(record.spec, depflag=dep.depflag, virtuals=dep.virtuals)
if not upstream:
record.ref_count += 1
@@ -1711,6 +1726,33 @@ def update_explicit(self, spec, explicit):
rec.explicit = explicit
class NoUpstreamVisitor:
"""Gives edges to upstream specs, but does follow edges from upstream specs."""
def __init__(
self,
upstream_hashes: Set[str],
on_visit: Callable[["spack.spec.DependencySpec", bool], None],
):
self.upstream_hashes = upstream_hashes
self.on_visit = on_visit
def accept(self, item: tr.EdgeAndDepth) -> bool:
self.on_visit(item.edge, self.is_upstream(item))
return True
def is_upstream(self, item: tr.EdgeAndDepth) -> bool:
return item.edge.spec.dag_hash() in self.upstream_hashes
def neighbors(self, item: tr.EdgeAndDepth):
# Prune edges from upstream nodes, only follow database tracked dependencies
return (
[]
if self.is_upstream(item)
else item.edge.spec.edges_to_dependencies(depflag=_TRACKED_DEPENDENCIES)
)
class UpstreamDatabaseLockingError(SpackError):
"""Raised when an operation would need to lock an upstream database"""


@@ -25,6 +25,7 @@
import llnl.util.tty
import spack.config
import spack.error
import spack.operating_systems.windows_os as winOs
import spack.spec
import spack.util.spack_yaml


@@ -18,10 +18,14 @@
import llnl.util.lang
import llnl.util.tty
import spack.package_base
import spack.repo
import spack.spec
import spack.util.elf as elf_utils
import spack.util.environment
import spack.util.environment as environment
import spack.util.ld_so_conf
import spack.util.parallel
from .common import (
WindowsCompilerExternalPaths,
@@ -84,22 +88,24 @@ def executables_in_path(path_hints: List[str]) -> Dict[str, str]:
return path_to_dict(search_paths)
def accept_elf(path, host_compat):
def accept_elf(entry: os.DirEntry, host_compat: Tuple[bool, bool, int]):
"""Accept an ELF file if the header matches the given compat triplet. In case it's not an ELF
(e.g. a static library, or some arbitrary file), fall back to is_readable_file."""
# Fast path: assume libraries at least have .so in their basename.
# Note: don't replace with splitext, because of libsmth.so.1.2.3 file names.
if ".so" not in os.path.basename(path):
return llnl.util.filesystem.is_readable_file(path)
if ".so" not in entry.name:
return is_readable_file(entry)
try:
return host_compat == elf_utils.get_elf_compat(path)
return host_compat == elf_utils.get_elf_compat(entry.path)
except (OSError, elf_utils.ElfParsingError):
return llnl.util.filesystem.is_readable_file(path)
return is_readable_file(entry)
def libraries_in_ld_and_system_library_path(
path_hints: Optional[List[str]] = None,
) -> Dict[str, str]:
def is_readable_file(entry: os.DirEntry) -> bool:
return entry.is_file() and os.access(entry.path, os.R_OK)
def system_library_paths() -> List[str]:
"""Get the paths of all libraries available from ``path_hints`` or the
following defaults:
@@ -113,79 +119,54 @@ def libraries_in_ld_and_system_library_path(
(i.e. the basename of the library path).
There may be multiple paths with the same basename. In this case it is
assumed there are two different instances of the library.
assumed there are two different instances of the library."""
Args:
path_hints: list of paths to be searched. If None the list will be
constructed based on the set of LD_LIBRARY_PATH, LIBRARY_PATH,
DYLD_LIBRARY_PATH, and DYLD_FALLBACK_LIBRARY_PATH environment
variables as well as the standard system library paths.
path_hints (list): list of paths to be searched. If ``None``, the default
system paths are used.
"""
if path_hints:
search_paths = llnl.util.filesystem.search_paths_for_libraries(*path_hints)
else:
search_paths = []
search_paths: List[str] = []
# Environment variables
if sys.platform == "darwin":
search_paths.extend(environment.get_path("DYLD_LIBRARY_PATH"))
search_paths.extend(environment.get_path("DYLD_FALLBACK_LIBRARY_PATH"))
elif sys.platform.startswith("linux"):
search_paths.extend(environment.get_path("LD_LIBRARY_PATH"))
# Dynamic linker paths
search_paths.extend(spack.util.ld_so_conf.host_dynamic_linker_search_paths())
# Drop redundant paths
search_paths = list(filter(os.path.isdir, search_paths))
# Make sure we don't doubly list /usr/lib and /lib etc
search_paths = list(llnl.util.lang.dedupe(search_paths, key=file_identifier))
try:
host_compat = elf_utils.get_elf_compat(sys.executable)
accept = lambda path: accept_elf(path, host_compat)
except (OSError, elf_utils.ElfParsingError):
accept = llnl.util.filesystem.is_readable_file
path_to_lib = {}
# Reverse order of search directories so that a lib in the first
# search path entry overrides later entries
for search_path in reversed(search_paths):
for lib in os.listdir(search_path):
lib_path = os.path.join(search_path, lib)
if accept(lib_path):
path_to_lib[lib_path] = lib
return path_to_lib
def libraries_in_windows_paths(path_hints: Optional[List[str]] = None) -> Dict[str, str]:
"""Get the paths of all libraries available from the system PATH paths.
For more details, see `libraries_in_ld_and_system_library_path` regarding
return type and contents.
Args:
path_hints: list of paths to be searched. If None the list will be
constructed based on the set of PATH environment
variables as well as the standard system library paths.
"""
search_hints = (
path_hints if path_hints is not None else spack.util.environment.get_path("PATH")
)
search_paths = llnl.util.filesystem.search_paths_for_libraries(*search_hints)
# on Windows, some libraries (.dlls) are found in the bin directory or sometimes
# at the search root. Add both of those options to the search scheme
search_paths.extend(llnl.util.filesystem.search_paths_for_executables(*search_hints))
if path_hints is None:
if sys.platform == "win32":
search_hints = spack.util.environment.get_path("PATH")
search_paths.extend(llnl.util.filesystem.search_paths_for_libraries(*search_hints))
# on Windows, some libraries (.dlls) are found in the bin directory or sometimes
# at the search root. Add both of those options to the search scheme
search_paths.extend(llnl.util.filesystem.search_paths_for_executables(*search_hints))
# if no user provided path was given, add defaults to the search
search_paths.extend(WindowsKitExternalPaths.find_windows_kit_lib_paths())
# SDK and WGL should be handled by above, however on occasion the WDK is in an atypical
# location, so we handle that case specifically.
search_paths.extend(WindowsKitExternalPaths.find_windows_driver_development_kit_paths())
return path_to_dict(search_paths)
elif sys.platform == "darwin":
search_paths.extend(environment.get_path("DYLD_LIBRARY_PATH"))
search_paths.extend(environment.get_path("DYLD_FALLBACK_LIBRARY_PATH"))
search_paths.extend(spack.util.ld_so_conf.host_dynamic_linker_search_paths())
elif sys.platform.startswith("linux"):
search_paths.extend(environment.get_path("LD_LIBRARY_PATH"))
search_paths.extend(spack.util.ld_so_conf.host_dynamic_linker_search_paths())
# Drop redundant paths
search_paths = list(filter(os.path.isdir, search_paths))
# Make sure we don't doubly list /usr/lib and /lib etc
search_paths = list(llnl.util.lang.dedupe(search_paths, key=file_identifier))
return search_paths
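The dedupe step above keys each directory by its underlying file identity, so symlinked aliases of the same directory (e.g. `/usr/lib` reachable via two paths) are only listed once. A minimal sketch of that idea; `file_identifier` and `dedupe` here are simplified stand-ins for the `llnl.util` helpers, not their actual implementations:

```python
import os
from typing import Callable, Iterable, List, Tuple


def file_identifier(path: str) -> Tuple[int, int]:
    # (device, inode) uniquely identifies a file or directory on POSIX,
    # so two symlinked aliases of the same directory compare equal.
    s = os.stat(path)
    return (s.st_dev, s.st_ino)


def dedupe(items: Iterable[str], key: Callable[[str], object]) -> List[str]:
    # Keep the first occurrence of each key, preserving input order.
    seen = set()
    result = []
    for item in items:
        k = key(item)
        if k not in seen:
            seen.add(k)
            result.append(item)
    return result
```

With this key function, a search-path list that names the same directory twice through a symlink collapses to a single entry, keeping the earlier (higher-priority) spelling.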
def libraries_in_path(search_paths: List[str]) -> Dict[str, str]:
try:
host_compat = elf_utils.get_elf_compat(sys.executable)
accept = lambda entry: accept_elf(entry, host_compat)
except (OSError, elf_utils.ElfParsingError):
accept = is_readable_file
path_to_lib = {}
# Reverse order of search directories so that a lib in the first
# search path entry overrides later entries
for search_path in reversed(search_paths):
with os.scandir(search_path) as it:
for entry in it:
if accept(entry):
path_to_lib[entry.path] = entry.name
return path_to_lib
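The reversed iteration above is what gives earlier search paths precedence: entries from the first search path are inserted last, so when a later consumer keys libraries by basename, the earlier path's copy wins. A small sketch of that precedence rule, with a hypothetical `listings` mapping standing in for `os.scandir`:

```python
from typing import Dict, List


def libraries_by_name(
    search_paths: List[str], listings: Dict[str, List[str]]
) -> Dict[str, str]:
    # listings maps a directory to the library names it contains;
    # it stands in for an os.scandir call in this sketch.
    name_to_path: Dict[str, str] = {}
    # Reverse so that a library in an earlier search path overwrites
    # the same name found in a later, lower-priority path.
    for search_path in reversed(search_paths):
        for name in listings.get(search_path, []):
            name_to_path[name] = f"{search_path}/{name}"
    return name_to_path
```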
def _group_by_prefix(paths: List[str]) -> Dict[str, Set[str]]:
@@ -198,10 +179,13 @@ def _group_by_prefix(paths: List[str]) -> Dict[str, Set[str]]:
class Finder:
"""Inspects the file-system looking for packages. Guesses places where to look using PATH."""
def __init__(self, paths: Dict[str, str]):
self.paths = paths
def default_path_hints(self) -> List[str]:
return []
def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> List[str]:
def search_patterns(self, *, pkg: Type[spack.package_base.PackageBase]) -> Optional[List[str]]:
"""Returns the list of patterns used to match candidate files.
Args:
@@ -209,15 +193,6 @@ def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> Lis
"""
raise NotImplementedError("must be implemented by derived classes")
def candidate_files(self, *, patterns: List[str], paths: List[str]) -> List[str]:
"""Returns a list of candidate files found on the system.
Args:
patterns: search patterns to be used for matching files
paths: paths where to search for files
"""
raise NotImplementedError("must be implemented by derived classes")
def prefix_from_path(self, *, path: str) -> str:
"""Given a path where a file was found, returns the corresponding prefix.
@@ -227,7 +202,7 @@ def prefix_from_path(self, *, path: str) -> str:
raise NotImplementedError("must be implemented by derived classes")
def detect_specs(
self, *, pkg: Type["spack.package_base.PackageBase"], paths: List[str]
self, *, pkg: Type[spack.package_base.PackageBase], paths: List[str]
) -> List["spack.spec.Spec"]:
"""Given a list of files matching the search patterns, returns a list of detected specs.
@@ -301,45 +276,36 @@ def detect_specs(
return result
def find(
self, *, pkg_name: str, repository, initial_guess: Optional[List[str]] = None
) -> List["spack.spec.Spec"]:
def find(self, *, pkg_name: str, repository: spack.repo.Repo) -> List[spack.spec.Spec]:
"""For a given package, returns a list of detected specs.
Args:
pkg_name: package being detected
repository: repository to retrieve the package
initial_guess: initial list of paths to search from the caller if None, default paths
are searched. If this is an empty list, nothing will be searched.
"""
pkg_cls = repository.get_pkg_class(pkg_name)
patterns = self.search_patterns(pkg=pkg_cls)
if not patterns:
return []
if initial_guess is None:
initial_guess = self.default_path_hints()
initial_guess.extend(common_windows_package_paths(pkg_cls))
candidates = self.candidate_files(patterns=patterns, paths=initial_guess)
result = self.detect_specs(pkg=pkg_cls, paths=candidates)
return result
regex = re.compile("|".join(patterns))
paths = [path for path, file in self.paths.items() if regex.search(file)]
paths.sort()
return self.detect_specs(pkg=pkg_cls, paths=paths)
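The new `find` compiles all of a package's search patterns into a single alternation regex and matches it against each file's basename, one pass over the path table instead of one pass per pattern. A sketch of that matching step (the example paths and patterns are made up):

```python
import re
from typing import Dict, List


def matching_paths(path_to_file: Dict[str, str], patterns: List[str]) -> List[str]:
    # One alternation regex over all patterns; search the basename
    # (the dict value), keep the full path (the dict key).
    regex = re.compile("|".join(patterns))
    paths = [path for path, name in path_to_file.items() if regex.search(name)]
    paths.sort()
    return paths
```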
class ExecutablesFinder(Finder):
def default_path_hints(self) -> List[str]:
return spack.util.environment.get_path("PATH")
@classmethod
def in_search_paths(cls, paths: List[str]):
return cls(executables_in_path(paths))
def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> List[str]:
result = []
@classmethod
def in_default_paths(cls):
return cls.in_search_paths(spack.util.environment.get_path("PATH"))
def search_patterns(self, *, pkg: Type[spack.package_base.PackageBase]) -> Optional[List[str]]:
if hasattr(pkg, "executables") and hasattr(pkg, "platform_executables"):
result = pkg.platform_executables()
return result
def candidate_files(self, *, patterns: List[str], paths: List[str]) -> List[str]:
executables_by_path = executables_in_path(path_hints=paths)
joined_pattern = re.compile(r"|".join(patterns))
result = [path for path, exe in executables_by_path.items() if joined_pattern.search(exe)]
result.sort()
return result
return pkg.platform_executables()
return None
def prefix_from_path(self, *, path: str) -> str:
result = executable_prefix(path)
@@ -350,29 +316,18 @@ def prefix_from_path(self, *, path: str) -> str:
class LibrariesFinder(Finder):
"""Finds libraries on the system, searching by LD_LIBRARY_PATH, LIBRARY_PATH,
DYLD_LIBRARY_PATH, DYLD_FALLBACK_LIBRARY_PATH, and standard system library paths
"""
"""Finds libraries in the provided paths matching package search patterns."""
def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> List[str]:
result = []
if hasattr(pkg, "libraries"):
result = pkg.libraries
return result
@classmethod
def in_search_paths(cls, paths: List[str]):
return cls(libraries_in_path(paths))
def candidate_files(self, *, patterns: List[str], paths: List[str]) -> List[str]:
libraries_by_path = (
libraries_in_ld_and_system_library_path(path_hints=paths)
if sys.platform != "win32"
else libraries_in_windows_paths(path_hints=paths)
)
patterns = [re.compile(x) for x in patterns]
result = []
for compiled_re in patterns:
for path, exe in libraries_by_path.items():
if compiled_re.search(exe):
result.append(path)
return result
@classmethod
def in_default_paths(cls):
return cls.in_search_paths(system_library_paths())
def search_patterns(self, *, pkg: Type[spack.package_base.PackageBase]) -> Optional[List[str]]:
return getattr(pkg, "libraries", None)
def prefix_from_path(self, *, path: str) -> str:
result = library_prefix(path)
@@ -383,11 +338,8 @@ def prefix_from_path(self, *, path: str) -> str:
def by_path(
packages_to_search: Iterable[str],
*,
path_hints: Optional[List[str]] = None,
max_workers: Optional[int] = None,
) -> Dict[str, List["spack.spec.Spec"]]:
packages_to_search: Iterable[str], *, path_hints: Optional[List[str]] = None
) -> Dict[str, List[spack.spec.Spec]]:
"""Return the list of packages that have been detected on the system, keyed by
unqualified package name.
@@ -395,31 +347,26 @@ def by_path(
packages_to_search: list of packages to be detected. Each package can be either unqualified
of fully qualified
path_hints: initial list of paths to be searched
max_workers: maximum number of workers to search for packages in parallel
"""
import spack.repo
# TODO: Packages should be able to define both .libraries and .executables in the future
# TODO: determine_spec_details should get all relevant libraries and executables in one call
executables_finder, libraries_finder = ExecutablesFinder(), LibrariesFinder()
if path_hints is None:
exe_finder = ExecutablesFinder.in_default_paths()
lib_finder = LibrariesFinder.in_default_paths()
else:
exe_finder = ExecutablesFinder.in_search_paths(path_hints)
lib_finder = LibrariesFinder.in_search_paths(path_hints)
detected_specs_by_package: Dict[str, Tuple[concurrent.futures.Future, ...]] = {}
result = collections.defaultdict(list)
repository = spack.repo.PATH.ensure_unwrapped()
with concurrent.futures.ProcessPoolExecutor(max_workers=max_workers) as executor:
with spack.util.parallel.make_concurrent_executor() as executor:
for pkg in packages_to_search:
executable_future = executor.submit(
executables_finder.find,
pkg_name=pkg,
initial_guess=path_hints,
repository=repository,
)
library_future = executor.submit(
libraries_finder.find,
pkg_name=pkg,
initial_guess=path_hints,
repository=repository,
exe_finder.find, pkg_name=pkg, repository=repository
)
library_future = executor.submit(lib_finder.find, pkg_name=pkg, repository=repository)
detected_specs_by_package[pkg] = executable_future, library_future
for pkg_name, futures in detected_specs_by_package.items():
@@ -435,7 +382,7 @@ def by_path(
)
except Exception as e:
llnl.util.tty.debug(
f"[EXTERNAL DETECTION] Skipping {pkg_name}: exception occured {e}"
f"[EXTERNAL DETECTION] Skipping {pkg_name} due to: {e.__class__}: {e}"
)
return result
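`by_path` fans out two independent searches per package (executables and libraries), then gathers both futures and skips any package whose search raised. A sketch of that fan-out/fan-in shape using the stdlib executor; the finder callables here are stand-ins, not Spack's API:

```python
import concurrent.futures
from typing import Callable, Dict, Iterable, List


def detect_all(
    packages: Iterable[str],
    exe_find: Callable[[str], List[str]],
    lib_find: Callable[[str], List[str]],
) -> Dict[str, List[str]]:
    result: Dict[str, List[str]] = {}
    with concurrent.futures.ThreadPoolExecutor() as executor:
        # Submit two independent searches per package.
        futures = {
            pkg: (executor.submit(exe_find, pkg), executor.submit(lib_find, pkg))
            for pkg in packages
        }
    for pkg, (exe_future, lib_future) in futures.items():
        detected: List[str] = []
        for future in (exe_future, lib_future):
            try:
                detected.extend(future.result())
            except Exception:
                continue  # mirror the debug-and-skip behavior above
        if detected:
            result[pkg] = detected
    return result
```

Spack uses a process-based executor; a thread pool is used here only to keep the sketch self-contained.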


@@ -4,14 +4,12 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import errno
import glob
import os
import posixpath
import re
import shutil
import sys
from contextlib import contextmanager
from pathlib import Path
from typing import List, Optional, Tuple
import llnl.util.filesystem as fs
from llnl.util.symlink import readlink
@@ -33,6 +31,42 @@ def _check_concrete(spec):
raise ValueError("Specs passed to a DirectoryLayout must be concrete!")
def _get_spec(prefix: str) -> Optional["spack.spec.Spec"]:
"""Returns a spec if the prefix contains a spec file in the .spack subdir"""
for f in ("spec.json", "spec.yaml"):
try:
return spack.spec.Spec.from_specfile(os.path.join(prefix, ".spack", f))
except Exception:
continue
return None
def specs_from_metadata_dirs(root: str) -> List["spack.spec.Spec"]:
stack = [root]
specs = []
while stack:
prefix = stack.pop()
spec = _get_spec(prefix)
if spec:
spec.prefix = prefix
specs.append(spec)
continue
try:
scandir = os.scandir(prefix)
except OSError:
continue
with scandir as entries:
for entry in entries:
if entry.is_dir(follow_symlinks=False):
stack.append(entry.path)
return specs
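`specs_from_metadata_dirs` is an iterative depth-first walk that stops descending as soon as a prefix yields a spec. The same shape works for any "find directories containing a marker" search, sketched here with a generic marker name:

```python
import os
from typing import List


def prefixes_with_marker(root: str, marker: str) -> List[str]:
    # Iterative DFS: once a directory contains the marker subdir it is
    # treated as an install prefix and not descended into further.
    stack, found = [root], []
    while stack:
        prefix = stack.pop()
        if os.path.isdir(os.path.join(prefix, marker)):
            found.append(prefix)
            continue
        try:
            entries = os.scandir(prefix)
        except OSError:
            continue  # unreadable or vanished directory
        with entries:
            for entry in entries:
                if entry.is_dir(follow_symlinks=False):
                    stack.append(entry.path)
    return found
```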
class DirectoryLayout:
"""A directory layout is used to associate unique paths with specs.
Different installations are going to want different layouts for their
@@ -184,12 +218,6 @@ def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
return yaml_path if os.path.exists(yaml_path) else json_path
@contextmanager
def disable_upstream_check(self):
self.check_upstream = False
yield
self.check_upstream = True
def metadata_path(self, spec):
return os.path.join(spec.prefix, self.metadata_dir)
@@ -244,53 +272,6 @@ def ensure_installed(self, spec):
"Spec file in %s does not match hash!" % spec_file_path
)
def all_specs(self):
if not os.path.isdir(self.root):
return []
specs = []
for _, path_scheme in self.projections.items():
path_elems = ["*"] * len(path_scheme.split(posixpath.sep))
# NOTE: Does not validate filename extension; should happen later
path_elems += [self.metadata_dir, "spec.json"]
pattern = os.path.join(self.root, *path_elems)
spec_files = glob.glob(pattern)
if not spec_files: # we're probably looking at legacy yaml...
path_elems += [self.metadata_dir, "spec.yaml"]
pattern = os.path.join(self.root, *path_elems)
spec_files = glob.glob(pattern)
specs.extend([self.read_spec(s) for s in spec_files])
return specs
def all_deprecated_specs(self):
if not os.path.isdir(self.root):
return []
deprecated_specs = set()
for _, path_scheme in self.projections.items():
path_elems = ["*"] * len(path_scheme.split(posixpath.sep))
# NOTE: Does not validate filename extension; should happen later
path_elems += [
self.metadata_dir,
self.deprecated_dir,
"*_spec.*",
] # + self.spec_file_name]
pattern = os.path.join(self.root, *path_elems)
spec_files = glob.glob(pattern)
get_depr_spec_file = lambda x: os.path.join(
os.path.dirname(os.path.dirname(x)), self.spec_file_name
)
deprecated_specs |= set(
(self.read_spec(s), self.read_spec(get_depr_spec_file(s))) for s in spec_files
)
return deprecated_specs
def specs_by_hash(self):
by_hash = {}
for spec in self.all_specs():
by_hash[spec.dag_hash()] = spec
return by_hash
def path_for_spec(self, spec):
"""Return absolute path from the root to a directory for the spec."""
_check_concrete(spec)
@@ -356,6 +337,35 @@ def remove_install_directory(self, spec, deprecated=False):
raise e
path = os.path.dirname(path)
def all_specs(self) -> List["spack.spec.Spec"]:
"""Returns a list of all specs detected in self.root, detected by `.spack` directories.
Their prefix is set to the directory containing the `.spack` directory. Note that these
specs may follow a different layout than the current layout if it was changed after
installation."""
return specs_from_metadata_dirs(self.root)
def deprecated_for(
self, specs: List["spack.spec.Spec"]
) -> List[Tuple["spack.spec.Spec", "spack.spec.Spec"]]:
"""Returns a list of tuples of specs (new, old) where new is deprecated for old"""
spec_with_deprecated = []
for spec in specs:
try:
deprecated = os.scandir(
os.path.join(str(spec.prefix), self.metadata_dir, self.deprecated_dir)
)
except OSError:
continue
with deprecated as entries:
for entry in entries:
try:
deprecated_spec = spack.spec.Spec.from_specfile(entry.path)
spec_with_deprecated.append((spec, deprecated_spec))
except Exception:
continue
return spec_with_deprecated
class DirectoryLayoutError(SpackError):
"""Superclass for directory layout errors."""


@@ -276,52 +276,6 @@ def _do_fake_install(pkg: "spack.package_base.PackageBase") -> None:
dump_packages(pkg.spec, packages_dir)
def _packages_needed_to_bootstrap_compiler(
compiler: "spack.spec.CompilerSpec", architecture: "spack.spec.ArchSpec", pkgs: list
) -> List[Tuple["spack.package_base.PackageBase", bool]]:
"""
Return a list of packages required to bootstrap ``pkg``'s compiler
Checks Spack's compiler configuration for a compiler that
matches the package spec.
Args:
compiler: the compiler to bootstrap
architecture: the architecture for which to bootstrap the compiler
pkgs: the packages that may need their compiler installed
Return:
list of tuples of packages and a boolean, for concretized compiler-related
packages that need to be installed and bool values specify whether the
package is the bootstrap compiler (``True``) or one of its dependencies
(``False``). The list will be empty if there are no compilers.
"""
tty.debug(f"Bootstrapping {compiler} compiler")
compilers = spack.compilers.compilers_for_spec(compiler, arch_spec=architecture)
if compilers:
return []
dep = spack.compilers.pkg_spec_for_compiler(compiler)
# Set the architecture for the compiler package in a way that allows the
# concretizer to back off if needed for the older bootstrapping compiler
dep.constrain(f"platform={str(architecture.platform)}")
dep.constrain(f"os={str(architecture.os)}")
dep.constrain(f"target={architecture.target.microarchitecture.family.name}:")
# concrete CompilerSpec has less info than concrete Spec
# concretize as Spec to add that information
dep.concretize()
# mark compiler as depended-on by the packages that use it
for pkg in pkgs:
dep._dependents.add(
spack.spec.DependencySpec(pkg.spec, dep, depflag=dt.BUILD, virtuals=())
)
packages = [(s.package, False) for s in dep.traverse(order="post", root=False)]
packages.append((dep.package, True))
return packages
def _hms(seconds: int) -> str:
"""
Convert seconds to hours, minutes, seconds
@@ -967,26 +921,6 @@ def __init__(
if package_id(d) != self.pkg_id
)
# Handle bootstrapped compiler
#
# The bootstrapped compiler is not a dependency in the spec, but it is
# a dependency of the build task. Here we add it to self.dependencies
compiler_spec = self.pkg.spec.compiler
arch_spec = self.pkg.spec.architecture
strict = spack.concretize.Concretizer().check_for_compiler_existence
if (
not spack.compilers.compilers_for_spec(compiler_spec, arch_spec=arch_spec)
and not strict
):
# The compiler is in the queue, identify it as dependency
dep = spack.compilers.pkg_spec_for_compiler(compiler_spec)
dep.constrain(f"platform={str(arch_spec.platform)}")
dep.constrain(f"os={str(arch_spec.os)}")
dep.constrain(f"target={arch_spec.target.microarchitecture.family.name}:")
dep.concretize()
dep_id = package_id(dep)
self.dependencies.add(dep_id)
# List of uninstalled dependencies, which is used to establish
# the priority of the build task.
#
@@ -1165,53 +1099,6 @@ def __str__(self) -> str:
installed = f"installed ({len(self.installed)}) = {self.installed}"
return f"{self.pid}: {requests}; {tasks}; {installed}; {failed}"
def _add_bootstrap_compilers(
self,
compiler: "spack.spec.CompilerSpec",
architecture: "spack.spec.ArchSpec",
pkgs: List["spack.package_base.PackageBase"],
request: BuildRequest,
all_deps,
) -> None:
"""
Add bootstrap compilers and dependencies to the build queue.
Args:
compiler: the compiler to bootstrap
architecture: the architecture for which to bootstrap the compiler
pkgs: the package list with possible compiler dependencies
request: the associated install request
all_deps (defaultdict(set)): dictionary of all dependencies and
associated dependents
"""
packages = _packages_needed_to_bootstrap_compiler(compiler, architecture, pkgs)
for comp_pkg, is_compiler in packages:
pkgid = package_id(comp_pkg.spec)
if pkgid not in self.build_tasks:
self._add_init_task(comp_pkg, request, is_compiler, all_deps)
elif is_compiler:
# ensure it's queued as a compiler
self._modify_existing_task(pkgid, "compiler", True)
def _modify_existing_task(self, pkgid: str, attr, value) -> None:
"""
Update a task in-place to modify its behavior.
Currently used to update the ``compiler`` field on tasks
that were originally created as a dependency of a compiler,
but are compilers in their own right.
For example, ``intel-oneapi-compilers-classic`` depends on
``intel-oneapi-compilers``, which can cause the latter to be
queued first as a non-compiler, and only later as a compiler.
"""
for i, tup in enumerate(self.build_pq):
key, task = tup
if task.pkg_id == pkgid:
tty.debug(f"Modifying task for {pkgid} to treat it as a compiler", level=2)
setattr(task, attr, value)
self.build_pq[i] = (key, task)
def _add_init_task(
self,
pkg: "spack.package_base.PackageBase",
@@ -1541,42 +1428,7 @@ def _add_tasks(self, request: BuildRequest, all_deps):
tty.warn(f"Installation request refused: {str(err)}")
return
install_compilers = spack.config.get("config:install_missing_compilers", False)
install_deps = request.install_args.get("install_deps")
# Bootstrap compilers first
if install_deps and install_compilers:
packages_per_compiler: Dict[
"spack.spec.CompilerSpec",
Dict["spack.spec.ArchSpec", List["spack.package_base.PackageBase"]],
] = {}
for dep in request.traverse_dependencies():
dep_pkg = dep.package
compiler = dep_pkg.spec.compiler
arch = dep_pkg.spec.architecture
if compiler not in packages_per_compiler:
packages_per_compiler[compiler] = {}
if arch not in packages_per_compiler[compiler]:
packages_per_compiler[compiler][arch] = []
packages_per_compiler[compiler][arch].append(dep_pkg)
compiler = request.pkg.spec.compiler
arch = request.pkg.spec.architecture
if compiler not in packages_per_compiler:
packages_per_compiler[compiler] = {}
if arch not in packages_per_compiler[compiler]:
packages_per_compiler[compiler][arch] = []
packages_per_compiler[compiler][arch].append(request.pkg)
for compiler, archs in packages_per_compiler.items():
for arch, packages in archs.items():
self._add_bootstrap_compilers(compiler, arch, packages, request, all_deps)
if install_deps:
for dep in request.traverse_dependencies():
@@ -1608,10 +1460,6 @@ def _add_tasks(self, request: BuildRequest, all_deps):
fail_fast = bool(request.install_args.get("fail_fast"))
self.fail_fast = self.fail_fast or fail_fast
def _add_compiler_package_to_config(self, pkg: "spack.package_base.PackageBase") -> None:
compiler_search_prefix = getattr(pkg, "compiler_search_prefix", pkg.spec.prefix)
spack.compilers.find_compilers([compiler_search_prefix])
def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
"""
Perform the installation of the requested spec and/or dependency
@@ -1639,8 +1487,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
if use_cache:
if _install_from_cache(pkg, explicit, unsigned):
self._update_installed(task)
if task.compiler:
self._add_compiler_package_to_config(pkg)
return
elif cache_only:
raise InstallError("No binary found when cache-only was specified", pkg=pkg)
@@ -1670,9 +1516,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
# the database, so that we don't need to re-read from file.
spack.store.STORE.db.add(pkg.spec, explicit=explicit)
# If a compiler, ensure it is added to the configuration
if task.compiler:
self._add_compiler_package_to_config(pkg)
except spack.build_environment.StopPhase as e:
# A StopPhase exception means that do_install was asked to
# stop early from clients, and is not an error at this point
@@ -2073,10 +1916,6 @@ def install(self) -> None:
path = spack.util.path.debug_padded_filter(pkg.prefix)
_print_installed_pkg(path)
# It's an already installed compiler, add it to the config
if task.compiler:
self._add_compiler_package_to_config(pkg)
else:
# At this point we've failed to get a write or a read
# lock, which means another process has taken a write


@@ -75,7 +75,6 @@
"verify_ssl": {"type": "boolean"},
"ssl_certs": {"type": "string"},
"suppress_gpg_warnings": {"type": "boolean"},
"install_missing_compilers": {"type": "boolean"},
"debug": {"type": "boolean"},
"checksum": {"type": "boolean"},
"deprecated": {"type": "boolean"},
@@ -102,7 +101,14 @@
"message": "Spack supports only clingo as a concretizer from v0.23. "
"The config:concretizer config option is ignored.",
"error": False,
}
},
{
"names": ["install_missing_compilers"],
"message": "The config:install_missing_compilers option has been deprecated in "
"Spack v0.23, and is currently ignored. It will be removed from config in "
"Spack v0.25.",
"error": False,
},
],
}
}


@@ -2731,10 +2731,6 @@ def define_runtime_constraints(self):
continue
current_libc = compiler.compiler_obj.default_libc
# If this is a compiler yet to be built (config:install_missing_compilers:true)
# infer libc from the Python process
if not current_libc and compiler.compiler_obj.cc is None:
current_libc = spack.util.libc.libc_from_current_python_process()
if using_libc_compatibility() and current_libc:
recorder("*").depends_on(
@@ -3156,7 +3152,7 @@ def with_input_specs(self, input_specs: List["spack.spec.Spec"]) -> "CompilerPar
Args:
input_specs: specs to be concretized
"""
strict = spack.concretize.Concretizer().check_for_compiler_existence
strict = spack.concretize.CHECK_COMPILER_EXISTENCE
default_os = str(spack.platforms.host().default_os)
default_target = str(archspec.cpu.host().family)
for s in traverse.traverse_nodes(input_specs):


@@ -1319,7 +1319,7 @@ node_compiler_weight(node(ID, Package), 100)
not compiler_weight(CompilerID, _).
% For the time being, be strict and reuse only if the compiler match one we have on the system
error(100, "Compiler {1}@{2} requested for {0} cannot be found. Set install_missing_compilers:true if intended.", Package, Compiler, Version)
error(100, "Compiler {1}@{2} requested for {0} cannot be found.", Package, Compiler, Version)
:- attr("node_compiler_version", node(ID, Package), Compiler, Version),
not node_compiler(node(ID, Package), _).


@@ -3911,43 +3911,43 @@ def format_attribute(match_object: Match) -> str:
for idx, part in enumerate(parts):
if not part:
raise SpecFormatStringError("Format string attributes must be non-empty")
if part.startswith("_"):
elif part.startswith("_"):
raise SpecFormatStringError("Attempted to format private attribute")
else:
if part == "variants" and isinstance(current, VariantMap):
# subscript instead of getattr for variant names
elif isinstance(current, VariantMap):
# subscript instead of getattr for variant names
try:
current = current[part]
else:
# aliases
if part == "arch":
part = "architecture"
elif part == "version":
# version (singular) requires a concrete versions list. Avoid
# pedantic errors by using versions (plural) when not concrete.
# These two are not entirely equivalent for pkg@=1.2.3:
# - version prints '1.2.3'
# - versions prints '=1.2.3'
if not current.versions.concrete:
part = "versions"
try:
current = getattr(current, part)
except AttributeError:
parent = ".".join(parts[:idx])
m = "Attempted to format attribute %s." % attribute
m += "Spec %s has no attribute %s" % (parent, part)
raise SpecFormatStringError(m)
if isinstance(current, vn.VersionList):
if current == vn.any_version:
# don't print empty version lists
return ""
if callable(current):
raise SpecFormatStringError("Attempted to format callable object")
if current is None:
# not printing anything
except KeyError:
raise SpecFormatStringError(f"Variant '{part}' does not exist")
else:
# aliases
if part == "arch":
part = "architecture"
elif part == "version" and not current.versions.concrete:
# version (singular) requires a concrete versions list. Avoid
# pedantic errors by using versions (plural) when not concrete.
# These two are not entirely equivalent for pkg@=1.2.3:
# - version prints '1.2.3'
# - versions prints '=1.2.3'
part = "versions"
try:
current = getattr(current, part)
except AttributeError:
raise SpecFormatStringError(
f"Attempted to format attribute {attribute}. "
f"Spec {'.'.join(parts[:idx])} has no attribute {part}"
)
if isinstance(current, vn.VersionList) and current == vn.any_version:
# don't print empty version lists
return ""
if callable(current):
raise SpecFormatStringError("Attempted to format callable object")
if current is None:
# not printing anything
return ""
# Set color codes for various attributes
color = None
if "architecture" in parts:


@@ -513,30 +513,6 @@ def test_setting_dtags_based_on_config(config_setting, expected_flag, config, mo
assert dtags_to_add.value == expected_flag
def test_module_globals_available_at_setup_dependent_time(
monkeypatch, mutable_config, mock_packages, working_env
):
"""Spack built package externaltest depends on an external package
externaltool. Externaltool's setup_dependent_package needs to be able to
access globals on the dependent"""
def setup_dependent_package(module, dependent_spec):
# Make sure set_package_py_globals was already called on
# dependents
# ninja is always set by the setup context and is not None
dependent_module = dependent_spec.package.module
assert hasattr(dependent_module, "ninja")
assert dependent_module.ninja is not None
dependent_spec.package.test_attr = True
externaltool = spack.spec.Spec("externaltest").concretized()
monkeypatch.setattr(
externaltool["externaltool"].package, "setup_dependent_package", setup_dependent_package
)
spack.build_environment.setup_package(externaltool.package, False)
assert externaltool.package.test_attr
def test_build_jobs_sequential_is_sequential():
assert (
determine_number_of_jobs(


@@ -19,7 +19,6 @@
import spack.cmd.common.arguments
import spack.cmd.install
import spack.compilers as compilers
import spack.config
import spack.environment as ev
import spack.hash_types as ht
@@ -29,7 +28,7 @@
from spack.error import SpackError
from spack.main import SpackCommand
from spack.parser import SpecSyntaxError
from spack.spec import CompilerSpec, Spec
from spack.spec import Spec
install = SpackCommand("install")
env = SpackCommand("env")
@@ -916,68 +915,6 @@ def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
assert "foo: No such file or directory" in content
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
def test_compiler_bootstrap(
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch
):
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
spack.config.set("config:install_missing_compilers", True)
assert CompilerSpec("gcc@=12.0") not in compilers.all_compiler_specs()
# Test succeeds if it does not raise an error
install("pkg-a%gcc@=12.0")
@pytest.mark.not_on_windows("Binary mirrors not supported on windows")
def test_compiler_bootstrap_from_binary_mirror(
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch, tmpdir
):
"""
Make sure installing compiler from buildcache registers compiler
"""
# Create a temp mirror directory for buildcache usage
mirror_dir = tmpdir.join("mirror_dir")
mirror_url = "file://{0}".format(mirror_dir.strpath)
# Install a compiler, because we want to put it in a buildcache
install("gcc@=10.2.0")
# Put installed compiler in the buildcache
buildcache("push", "-u", "-f", mirror_dir.strpath, "gcc@10.2.0")
# Now uninstall the compiler
uninstall("-y", "gcc@10.2.0")
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
spack.config.set("config:install_missing_compilers", True)
assert CompilerSpec("gcc@=10.2.0") not in compilers.all_compiler_specs()
# Configure the mirror where we put that buildcache w/ the compiler
mirror("add", "test-mirror", mirror_url)
# Now make sure that when the compiler is installed from binary mirror,
# it also gets configured as a compiler. Test succeeds if it does not
# raise an error
install("--no-check-signature", "--cache-only", "--only", "dependencies", "pkg-b%gcc@=10.2.0")
install("--no-cache", "--only", "package", "pkg-b%gcc@10.2.0")
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
@pytest.mark.regression("16221")
def test_compiler_bootstrap_already_installed(
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch
):
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
spack.config.set("config:install_missing_compilers", True)
assert CompilerSpec("gcc@=12.0") not in compilers.all_compiler_specs()
# Test succeeds if it does not raise an error
install("gcc@=12.0")
install("pkg-a%gcc@=12.0")
def test_install_fails_no_args(tmpdir):
# ensure no spack.yaml in directory
with tmpdir.as_cwd():


@@ -55,12 +55,24 @@ def test_reindex_with_deprecated_packages(
deprecate("-y", "libelf@0.8.12", "libelf@0.8.13")
all_installed = spack.store.STORE.db.query(installed=any)
non_deprecated = spack.store.STORE.db.query(installed=True)
db = spack.store.STORE.db
all_installed = db.query(installed=any)
non_deprecated = db.query(installed=True)
_clear_db(tmp_path)
reindex()
assert spack.store.STORE.db.query(installed=any) == all_installed
assert spack.store.STORE.db.query(installed=True) == non_deprecated
assert db.query(installed=any) == all_installed
assert db.query(installed=True) == non_deprecated
old_libelf = db.query_local_by_spec_hash(
db.query_local("libelf@0.8.12", installed=any)[0].dag_hash()
)
new_libelf = db.query_local_by_spec_hash(
db.query_local("libelf@0.8.13", installed=True)[0].dag_hash()
)
assert old_libelf.deprecated_for == new_libelf.spec.dag_hash()
assert new_libelf.deprecated_for is None
assert new_libelf.ref_count == 1


@@ -2375,26 +2375,6 @@ def test_externals_with_platform_explicitly_set(self, tmp_path):
s = Spec("mpich").concretized()
assert s.external
@pytest.mark.regression("43875")
def test_concretize_missing_compiler(self, mutable_config, monkeypatch):
"""Tests that Spack can concretize a spec with a missing compiler when the
option is active.
"""
def _default_libc(self):
if self.cc is None:
return None
return Spec("glibc@=2.28")
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
monkeypatch.setattr(spack.compiler.Compiler, "default_libc", property(_default_libc))
monkeypatch.setattr(
spack.util.libc, "libc_from_current_python_process", lambda: Spec("glibc@=2.28")
)
mutable_config.set("config:install_missing_compilers", True)
s = Spec("pkg-a %gcc@=13.2.0").concretized()
assert s.satisfies("%gcc@13.2.0")
@pytest.mark.regression("43267")
def test_spec_with_build_dep_from_json(self, tmp_path):
"""Tests that we can correctly concretize a spec, when we express its dependency as a


@@ -7,6 +7,7 @@
import functools
import json
import os
import re
import shutil
import sys
@@ -982,9 +983,12 @@ def test_reindex_removed_prefix_is_not_installed(mutable_database, mock_store, c
# Reindex should pick up libelf as a dependency of libdwarf
spack.store.STORE.reindex()
# Reindexing should warn about libelf not being found on the filesystem
err = capfd.readouterr()[1]
assert "this directory does not contain an installation of the spec" in err
# Reindexing should warn about libelf not found on the filesystem
assert re.search(
"libelf@0.8.13.+ was marked installed in the database "
"but was not found on the file system",
capfd.readouterr().err,
)
# And we should still have libelf in the database, but not installed.
assert not mutable_database.query_one("libelf", installed=True)
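The new assertion above matches the warning with `re.search` so that the spec hash between the package name and the message text can vary between runs; a minimal standalone sketch of the same pattern (the hash here is made up):

```python
import re

# a fabricated example of the warning text the test expects
err = (
    "libelf@0.8.13/abc1234 was marked installed in the database "
    "but was not found on the file system"
)

# ".+" absorbs the variable hash suffix after the version
assert re.search(
    "libelf@0.8.13.+ was marked installed in the database "
    "but was not found on the file system",
    err,
)
```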
@@ -1124,3 +1128,53 @@ def test_database_errors_with_just_a_version_key(tmp_path):
with pytest.raises(spack.database.InvalidDatabaseVersionError):
spack.database.Database(root).query_local()
def test_reindex_with_upstreams(tmp_path, monkeypatch, mock_packages, config):
# Reindexing should not put install records of upstream entries into the local database. Here
# we install `mpileaks` locally with dependencies in the upstream. And we even install
# `mpileaks` with the same hash in the upstream. After reindexing, `mpileaks` should still be
# in the local db, and `callpath` should not.
mpileaks = spack.spec.Spec("mpileaks").concretized()
callpath = mpileaks.dependencies("callpath")[0]
upstream_store = spack.store.create(
{"config": {"install_tree": {"root": str(tmp_path / "upstream")}}}
)
monkeypatch.setattr(spack.store, "STORE", upstream_store)
callpath.package.do_install(fake=True)
local_store = spack.store.create(
{
"config": {"install_tree": {"root": str(tmp_path / "local")}},
"upstreams": {"my-upstream": {"install_tree": str(tmp_path / "upstream")}},
}
)
monkeypatch.setattr(spack.store, "STORE", local_store)
mpileaks.package.do_install(fake=True)
# Sanity check that callpath is from upstream.
assert not local_store.db.query_local("callpath")
assert local_store.db.query("callpath")
# Install mpileaks also upstream with the same hash to ensure that determining upstreamness
# checks local installs before upstream databases, even when the local database is being
# reindexed.
monkeypatch.setattr(spack.store, "STORE", upstream_store)
mpileaks.package.do_install(fake=True)
# Delete the local database
shutil.rmtree(local_store.db.database_directory)
# Create a new instance s.t. we don't have cached specs in memory
reindexed_local_store = spack.store.create(
{
"config": {"install_tree": {"root": str(tmp_path / "local")}},
"upstreams": {"my-upstream": {"install_tree": str(tmp_path / "upstream")}},
}
)
reindexed_local_store.db.reindex()
assert not reindexed_local_store.db.query_local("callpath")
assert reindexed_local_store.db.query("callpath") == [callpath]
assert reindexed_local_store.db.query_local("mpileaks") == [mpileaks]


@@ -12,8 +12,6 @@
import py
import pytest
import archspec.cpu
import llnl.util.filesystem as fs
import llnl.util.lock as ulk
import llnl.util.tty as tty
@@ -435,76 +433,6 @@ def test_fake_install(install_mockery):
assert os.path.isdir(pkg.prefix.lib)
def test_packages_needed_to_bootstrap_compiler_none(install_mockery):
spec = spack.spec.Spec("trivial-install-test-package")
spec.concretize()
assert spec.concrete
packages = inst._packages_needed_to_bootstrap_compiler(
spec.compiler, spec.architecture, [spec.package]
)
assert not packages
@pytest.mark.xfail(reason="fails when assuming Spec.package can only be called on concrete specs")
def test_packages_needed_to_bootstrap_compiler_packages(install_mockery, monkeypatch):
spec = spack.spec.Spec("trivial-install-test-package")
spec.concretize()
def _conc_spec(compiler):
return spack.spec.Spec("pkg-a").concretized()
# Ensure we can get past functions that are precluding obtaining
# packages.
monkeypatch.setattr(spack.compilers, "compilers_for_spec", _none)
monkeypatch.setattr(spack.compilers, "pkg_spec_for_compiler", _conc_spec)
monkeypatch.setattr(spack.spec.Spec, "concretize", _noop)
packages = inst._packages_needed_to_bootstrap_compiler(
spec.compiler, spec.architecture, [spec.package]
)
assert packages
def test_update_tasks_for_compiler_packages_as_compiler(mock_packages, config, monkeypatch):
spec = spack.spec.Spec("trivial-install-test-package").concretized()
installer = inst.PackageInstaller([spec.package], {})
# Add a task to the queue
installer._add_init_task(spec.package, installer.build_requests[0], False, {})
# monkeypatch to make the list of compilers be what we test
def fake_package_list(compiler, architecture, pkgs):
return [(spec.package, True)]
monkeypatch.setattr(inst, "_packages_needed_to_bootstrap_compiler", fake_package_list)
installer._add_bootstrap_compilers("fake", "fake", "fake", None, {})
# Check that the only task is now a compiler task
assert len(installer.build_pq) == 1
assert installer.build_pq[0][1].compiler
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64",
reason="OneAPI compiler is not supported on other architectures",
)
def test_bootstrapping_compilers_with_different_names_from_spec(
install_mockery, mutable_config, mock_fetch, archspec_host_is_spack_test_host
):
"""Tests that, when we bootstrap '%oneapi' we can translate it to the
'intel-oneapi-compilers' package.
"""
with spack.config.override("config:install_missing_compilers", True):
with spack.concretize.disable_compiler_existence_check():
spec = spack.spec.Spec("trivial-install-test-package%oneapi@=22.2.0").concretized()
spec.package.do_install()
assert (
spack.spec.CompilerSpec("oneapi@=22.2.0") in spack.compilers.all_compiler_specs()
)
def test_dump_packages_deps_ok(install_mockery, tmpdir, mock_packages):
"""Test happy path for dump_packages with dependencies."""
@@ -696,26 +624,6 @@ def test_check_deps_status_upstream(install_mockery, monkeypatch):
assert inst.package_id(dep) in installer.installed
def test_add_bootstrap_compilers(install_mockery, monkeypatch):
from collections import defaultdict
def _pkgs(compiler, architecture, pkgs):
spec = spack.spec.Spec("mpi").concretized()
return [(spec.package, True)]
installer = create_installer(["trivial-install-test-package"], {})
request = installer.build_requests[0]
all_deps = defaultdict(set)
monkeypatch.setattr(inst, "_packages_needed_to_bootstrap_compiler", _pkgs)
installer._add_bootstrap_compilers("fake", "fake", [request.pkg], request, all_deps)
ids = list(installer.build_tasks)
assert len(ids) == 1
task = installer.build_tasks[ids[0]]
assert task.compiler
def test_prepare_for_install_on_installed(install_mockery, monkeypatch):
"""Test of _prepare_for_install's early return for installed task path."""
installer = create_installer(["dependent-install"], {})
@@ -729,18 +637,6 @@ def test_prepare_for_install_on_installed(install_mockery, monkeypatch):
installer._prepare_for_install(task)
def test_installer_init_requests(install_mockery):
"""Test of installer initial requests."""
spec_name = "dependent-install"
with spack.config.override("config:install_missing_compilers", True):
installer = create_installer([spec_name], {})
# There is only one explicit request in this case
assert len(installer.build_requests) == 1
request = installer.build_requests[0]
assert request.pkg.name == spec_name
def test_install_task_use_cache(install_mockery, monkeypatch):
installer = create_installer(["trivial-install-test-package"], {})
request = installer.build_requests[0]


@@ -741,6 +741,13 @@ def test_spec_formatting(self, default_mock_concretization):
("{/hash}", "/", lambda s: "/" + s.dag_hash()),
]
variants_segments = [
("{variants.debug}", spec, "debug"),
("{variants.foo}", spec, "foo"),
("{^pkg-a.variants.bvv}", spec["pkg-a"], "bvv"),
("{^pkg-a.variants.foo}", spec["pkg-a"], "foo"),
]
other_segments = [
("{spack_root}", spack.paths.spack_root),
("{spack_install}", spack.store.STORE.layout.root),
@@ -768,6 +775,12 @@ def check_prop(check_spec, fmt_str, prop, getter):
callpath, fmt_str = depify("callpath", named_str, sigil)
assert spec.format(fmt_str) == getter(callpath)
for named_str, test_spec, variant_name in variants_segments:
assert test_spec.format(named_str) == str(test_spec.variants[variant_name])
assert test_spec.format(named_str[:-1] + ".value}") == str(
test_spec.variants[variant_name].value
)
for named_str, expected in other_segments:
actual = spec.format(named_str)
assert expected == actual
@@ -827,6 +840,7 @@ def test_spec_formatting_sigil_mismatches(self, default_mock_concretization, fmt
r"{dag_hash}",
r"{foo}",
r"{+variants.debug}",
r"{variants.this_variant_does_not_exist}",
],
)
def test_spec_formatting_bad_formats(self, default_mock_concretization, fmt_str):


@@ -3,8 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from collections import defaultdict, namedtuple
from typing import Union
from collections import defaultdict
from typing import NamedTuple, Union
import spack.deptypes as dt
import spack.spec
@@ -12,11 +12,14 @@
# Export only the high-level API.
__all__ = ["traverse_edges", "traverse_nodes", "traverse_tree"]
#: Data class that stores a directed edge together with depth at
#: which the target vertex was found. It is passed to ``accept``
#: and ``neighbors`` of visitors, so they can decide whether to
#: follow the edge or not.
EdgeAndDepth = namedtuple("EdgeAndDepth", ["edge", "depth"])
class EdgeAndDepth(NamedTuple):
edge: "spack.spec.DependencySpec"
depth: int
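The change above swaps `collections.namedtuple` for a typed `typing.NamedTuple` with identical runtime behavior; a minimal standalone sketch (the `edge` field is a plain string here, standing in for `spack.spec.DependencySpec`):

```python
from typing import NamedTuple

class EdgeAndDepth(NamedTuple):
    edge: str  # placeholder for spack.spec.DependencySpec
    depth: int

e = EdgeAndDepth(edge="root -> child", depth=1)
# attribute access and tuple unpacking both work, as with namedtuple
assert e.depth == 1
edge, depth = e
assert edge == "root -> child"
```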
def sort_edges(edges):


@@ -2,7 +2,6 @@ ci:
target: gitlab
broken-tests-packages:
- gptune
- superlu-dist # srun -n 4 hangs
- papyrus


@@ -16,7 +16,7 @@ spack:
boost:
variants: +python +filesystem +iostreams +system
elfutils:
variants: +bzip2 ~nls +xz
variants: ~nls
hdf5:
variants: +fortran +hl +shared
libfabric:


@@ -20,6 +20,7 @@ class Amdsmi(CMakePackage):
libraries = ["libamd_smi"]
license("MIT")
version("6.2.0", sha256="49e4b15af62bf9800c02a24c75c6cd99dc8b146d69cc7f00ecbbcd60f6106315")
version("6.1.2", sha256="4583ea9bc71d55e987db4a42f9b3b730def22892953d30bca64ca29ac844e058")
version("6.1.1", sha256="10ece6b1ca8bb36ab3ae987fc512838f30a92ab788a2200410e9c1707fe0166b")
version("6.1.0", sha256="5bd1f150a2191b1703ff2670e40f6fed730f59f155623d6e43b7f64c39ae0967")


@@ -9,6 +9,7 @@
import llnl.util.tty as tty
from spack.build_systems.cmake import CMakeBuilder
from spack.package import *
@@ -65,7 +66,7 @@ def install(self, spec, prefix):
with working_dir("spack-build", create=True):
host_cfg_fname = self.create_host_config(spec, prefix)
print("Configuring APComp...")
cmake(*std_cmake_args, "-C", host_cfg_fname, "../src")
cmake(*CMakeBuilder.std_args(self), "-C", host_cfg_fname, "../src")
print("Building APComp...")
make()
print("Installing APComp...")


@@ -8,6 +8,20 @@
from spack.package import *
_versions = {
"6.2.0": {
"apt": (
"75f4417477abb80f6a453f836d1ac44c8a3d24447b21cfa4b29787a73725ef4e",
"https://repo.radeon.com/rocm/apt/6.2/pool/main/h/hsa-amd-aqlprofile/hsa-amd-aqlprofile_1.0.0.60200.60200-66~20.04_amd64.deb",
),
"yum": (
"d8ec6ceffe366c041d4dda11c418da53ca3b2234e8a57d4c4af9fdec936349ed",
"https://repo.radeon.com/rocm/yum/6.2/main/hsa-amd-aqlprofile-1.0.0.60200.60200-66.el7.x86_64.rpm",
),
"zyp": (
"e7b34e800e4da6542261379e00b4f3a0e3ebc15e80925bf056ce495aff0b25e9",
"https://repo.radeon.com/rocm/zyp/6.2/main/hsa-amd-aqlprofile-1.0.0.60200.60200-sles155.66.x86_64.rpm",
),
},
"6.1.2": {
"apt": (
"93faa8a0d702bc1623d2346e07a9a1c9134d99c0d3f9de62903e7394e0eedf47",


@@ -16,9 +16,10 @@ class Assimp(CMakePackage):
maintainers("wdconinc")
license("BSD-3-Clause")
license("BSD-3-Clause", checked_by="wdconinc")
version("master", branch="master")
version("5.4.3", sha256="66dfbaee288f2bc43172440a55d0235dfc7bf885dda6435c038e8000e79582cb")
version("5.4.2", sha256="7414861a7b038e407b510e8b8c9e58d5bf8ca76c9dfe07a01d20af388ec5086a")
version("5.4.0", sha256="a90f77b0269addb2f381b00c09ad47710f2aab6b1d904f5e9a29953c30104d3f")
version("5.3.1", sha256="a07666be71afe1ad4bc008c2336b7c688aca391271188eb9108d0c6db1be53f1")
@@ -32,9 +33,6 @@ class Assimp(CMakePackage):
version("5.0.1", sha256="11310ec1f2ad2cd46b95ba88faca8f7aaa1efe9aa12605c55e3de2b977b3dbfc")
version("4.0.1", sha256="60080d8ab4daaab309f65b3cffd99f19eb1af8d05623fff469b9b652818e286e")
depends_on("c", type="build") # generated
depends_on("cxx", type="build") # generated
patch(
"https://patch-diff.githubusercontent.com/raw/assimp/assimp/pull/4203.patch?full_index=1",
sha256="24135e88bcef205e118f7a3f99948851c78d3f3e16684104dc603439dd790d74",
@@ -43,6 +41,9 @@ class Assimp(CMakePackage):
variant("shared", default=True, description="Enables the build of shared libraries")
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("cmake@3.10:", type="build", when="@5.1:")
depends_on("cmake@3.22:", type="build", when="@5.4:")
@@ -54,10 +55,10 @@ def patch(self):
def cmake_args(self):
args = [
"-DASSIMP_HUNTER_ENABLED=OFF",
"-DASSIMP_BUILD_ZLIB=OFF",
"-DASSIMP_BUILD_MINIZIP=OFF",
"-DASSIMP_BUILD_TESTS=OFF",
self.define("ASSIMP_HUNTER_ENABLED", False),
self.define("ASSIMP_BUILD_ZLIB", False),
self.define("ASSIMP_BUILD_MINIZIP", False),
self.define("ASSIMP_BUILD_TESTS", self.run_tests),
self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
]
return args
@@ -67,3 +68,12 @@ def flag_handler(self, name, flags):
if name == "cxxflags":
flags.append(self.compiler.cxx11_flag)
return (None, None, flags)
def check(self):
unit = Executable(join_path(self.builder.build_directory, "bin", "unit"))
skipped_tests = [
"AssimpAPITest_aiMatrix3x3.aiMatrix3FromToTest",
"AssimpAPITest_aiMatrix4x4.aiMatrix4FromToTest",
"AssimpAPITest_aiQuaternion.aiQuaternionFromNormalizedQuaternionTest",
]
unit(f"--gtest_filter=-{':'.join(skipped_tests)}")


@@ -116,7 +116,9 @@ class Boost(Package):
# support. The header-only library is installed when no variant is given.
all_libs = [
"atomic",
"charconv",
"chrono",
"cobalt",
"container",
"context",
"contract",
@@ -146,6 +148,7 @@ class Boost(Package):
"thread",
"timer",
"type_erasure",
"url",
"wave",
]
@@ -497,7 +500,7 @@ def bjam_python_line(self, spec):
spec["python"].libs[0],
)
def determine_bootstrap_options(self, spec, with_libs, without_libs, options):
def determine_bootstrap_options(self, spec, with_libs, options):
boost_toolset_id = self.determine_toolset(spec)
# Arm compiler bootstraps with 'gcc' (but builds as 'clang')
@@ -506,9 +509,9 @@ def determine_bootstrap_options(self, spec, with_libs, without_libs, options):
else:
options.append("--with-toolset=%s" % boost_toolset_id)
if with_libs:
options.append("--with-libraries=%s" % ",".join(with_libs))
options.append("--with-libraries=%s" % ",".join(sorted(with_libs)))
else:
options.append("--without-libraries=%s" % ",".join(without_libs))
options.append("--with-libraries=headers")
if spec.satisfies("+python"):
options.append("--with-python=%s" % spec["python"].command.path)
@@ -679,50 +682,39 @@ def install(self, spec, prefix):
force_symlink("/usr/bin/libtool", join_path(newdir, "libtool"))
env["PATH"] = newdir + ":" + env["PATH"]
with_libs = list()
without_libs = list()
for lib in Boost.all_libs:
if "+{0}".format(lib) in spec:
with_libs.append(lib)
else:
without_libs.append(lib)
remove_if_in_list = lambda lib, libs: libs.remove(lib) if lib in libs else None
with_libs = {f"{lib}" for lib in Boost.all_libs if f"+{lib}" in spec}
# Remove libraries that the release version does not support
if not spec.satisfies("@1.85.0:"):
with_libs.discard("charconv")
if not spec.satisfies("@1.84.0:"):
with_libs.discard("cobalt")
if not spec.satisfies("@1.81.0:"):
with_libs.discard("url")
if not spec.satisfies("@1.75.0:"):
remove_if_in_list("json", with_libs)
remove_if_in_list("json", without_libs)
with_libs.discard("json")
if spec.satisfies("@1.69.0:"):
remove_if_in_list("signals", with_libs)
remove_if_in_list("signals", without_libs)
with_libs.discard("signals")
if not spec.satisfies("@1.54.0:"):
remove_if_in_list("log", with_libs)
remove_if_in_list("log", without_libs)
with_libs.discard("log")
if not spec.satisfies("@1.53.0:"):
remove_if_in_list("atomic", with_libs)
remove_if_in_list("atomic", without_libs)
with_libs.discard("atomic")
if not spec.satisfies("@1.48.0:"):
remove_if_in_list("locale", with_libs)
remove_if_in_list("locale", without_libs)
with_libs.discard("locale")
if not spec.satisfies("@1.47.0:"):
remove_if_in_list("chrono", with_libs)
remove_if_in_list("chrono", without_libs)
with_libs.discard("chrono")
if not spec.satisfies("@1.43.0:"):
remove_if_in_list("random", with_libs)
remove_if_in_list("random", without_libs)
with_libs.discard("random")
if not spec.satisfies("@1.39.0:"):
remove_if_in_list("exception", with_libs)
remove_if_in_list("exception", without_libs)
with_libs.discard("exception")
if spec.satisfies("+graph") and spec.satisfies("+mpi"):
with_libs.append("graph_parallel")
remove_if_in_list("graph_parallel", without_libs)
with_libs.add("graph_parallel")
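The rewrite above replaces the paired `with_libs`/`without_libs` lists with a single set built by comprehension, relying on `set.discard`, which is a no-op when the element is absent; a minimal sketch of the pattern with a mocked spec (library names and version gates here are illustrative):

```python
all_libs = ["atomic", "chrono", "json", "signals", "url"]
enabled_variants = {"+atomic", "+json", "+signals"}  # stands in for '"+lib" in spec'

with_libs = {lib for lib in all_libs if f"+{lib}" in enabled_variants}

# discard() never raises, so gating on release support needs no membership check
with_libs.discard("url")      # e.g. release predates Boost.URL
with_libs.discard("signals")  # e.g. library removed in this release
assert with_libs == {"atomic", "json"}
```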
# to make Boost find the user-config.jam
env["BOOST_BUILD_PATH"] = self.stage.source_path
bootstrap_options = ["--prefix=%s" % prefix]
self.determine_bootstrap_options(spec, with_libs, without_libs, bootstrap_options)
self.determine_bootstrap_options(spec, with_libs, bootstrap_options)
if self.spec.satisfies("platform=windows"):
bootstrap = Executable("cmd.exe")


@@ -150,9 +150,9 @@ class Charliecloud(AutotoolsPackage):
with when("+squashfuse"):
depends_on("libfuse@3:", type=("build", "run", "link"), when="@0.32:")
depends_on("pkgconfig", type="build", when="@0.37:")
depends_on("squashfuse@0.1.105:0.2.0,0.4.0:", type="build", when="@0.36:")
depends_on("squashfuse@0.1.105:0.2.0,0.4.0", type="build", when="@0.35")
depends_on("squashfuse@0.1.105", type="build", when="@0.32:0.34")
depends_on("squashfuse@0.1.105:0.2.0,0.4.0:", type="link", when="@0.36:")
depends_on("squashfuse@0.1.105:0.2.0,0.4.0", type="link", when="@0.35")
depends_on("squashfuse@0.1.105", type="link", when="@0.32:0.34")
def autoreconf(self, spec, prefix):
which("bash")("autogen.sh")


@@ -30,6 +30,7 @@ def url_for_version(self, version):
license("NCSA")
version("master", branch="amd-stg-open")
version("6.2.0", sha256="12ce17dc920ec6dac0c5484159b3eec00276e4a5b301ab1250488db3b2852200")
version("6.1.2", sha256="300e9d6a137dcd91b18d5809a316fddb615e0e7f982dc7ef1bb56876dff6e097")
version("6.1.1", sha256="f1a67efb49f76a9b262e9735d3f75ad21e3bd6a05338c9b15c01e6c625c4460d")
version("6.1.0", sha256="6bd9912441de6caf6b26d1323e1c899ecd14ff2431874a2f5883d3bc5212db34")
@@ -81,6 +82,7 @@ def url_for_version(self, version):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
"master",
]:
# llvm libs are linked statically, so this *could* be a build dep


@@ -19,6 +19,7 @@ class ComposableKernel(CMakePackage):
license("MIT")
version("master", branch="develop")
version("6.2.0", sha256="4a3024f4f93c080db99d560a607ad758745cd2362a90d0e8f215331686a6bc64")
version("6.1.2", sha256="54db801e1c14239f574cf94dd764a2f986b4abcc223393d55c49e4b276e738c9")
version("6.1.1", sha256="f55643c6eee0878e8f2d14a382c33c8b84af0bdf8f31b37b6092b377f7a9c6b5")
version("6.1.0", sha256="355a4514b96b56aa9edf78198a3e22067e7397857cfe29d9a64d9c5557b9f83d")
@@ -56,6 +57,7 @@ class ComposableKernel(CMakePackage):
for ver in [
"master",
"6.2.0",
"6.1.2",
"6.1.1",
"6.1.0",
@@ -88,11 +90,11 @@ def cmake_args(self):
]
if "auto" not in self.spec.variants["amdgpu_target"]:
args.append(self.define_from_variant("GPU_TARGETS", "amdgpu_target"))
else:
args.append(self.define("INSTANCES_ONLY", "ON"))
if self.spec.satisfies("@5.6.0:"):
if self.run_tests:
args.append(self.define("BUILD_TESTING", "ON"))
else:
args.append(self.define("INSTANCES_ONLY", "ON"))
args.append(self.define("CK_BUILD_JIT_LIB", "ON"))
args.append(self.define("CMAKE_POSITION_INDEPENDENT_CODE", "ON"))
if self.spec.satisfies("@:5.7"):


@@ -101,13 +101,13 @@ def configure_args(self):
extra_args = []
job_id = "NONE"
if spec.satisfies("+slurm"):
if spec.satisfies("scheduler=slurm"):
job_id = "SLURM_JOBID"
if spec.satisfies("+cobalt"):
elif spec.satisfies("scheduler=cobalt"):
job_id = "COBALT_JOBID"
if spec.satisfies("+pbs"):
elif spec.satisfies("scheduler=pbs"):
job_id = "PBS_JOBID"
if spec.satisfies("+sge"):
elif spec.satisfies("scheduler=sge"):
job_id = "JOB_ID"
if spec.satisfies("+hdf5"):
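The change above turns independent boolean variants (`+slurm`, `+cobalt`, ...) into a single multi-valued `scheduler` variant checked by an `if/elif` chain; the mapping it encodes can be sketched as a lookup table (a standalone sketch, not the package's code):

```python
# scheduler value -> environment variable holding the job id
JOB_ID_VARS = {
    "slurm": "SLURM_JOBID",
    "cobalt": "COBALT_JOBID",
    "pbs": "PBS_JOBID",
    "sge": "JOB_ID",
}

def job_id_var(scheduler):
    """Return the job-id variable for a scheduler, or NONE if unknown."""
    return JOB_ID_VARS.get(scheduler, "NONE")
```

With a lookup like this, adding a scheduler is a one-line table change rather than a new branch.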


@@ -8,6 +8,7 @@
import llnl.util.tty as tty
from spack.build_systems.cmake import CMakeBuilder
from spack.package import *
@@ -120,7 +121,12 @@ def install(self, spec, prefix):
with working_dir("spack-build", create=True):
host_cfg_fname = self.create_host_config(spec, prefix)
print("Configuring Devil Ray...")
cmake(*std_cmake_args, "-C", host_cfg_fname, "../src")
cmake(
*CMakeBuilder.std_args(self, generator="Unix Makefiles"),
"-C",
host_cfg_fname,
"../src",
)
print("Building Devil Ray...")
make()
# run unit tests if requested


@@ -76,6 +76,7 @@ class Ecflow(CMakePackage):
# https://github.com/JCSDA/spack-stack/issues/1001
# https://github.com/JCSDA/spack-stack/issues/1009
patch("ctsapi_cassert.patch", when="@5.11.4")
patch("vfile_cassert.patch", when="@5.11.4")
@when("@:4.13.0")
def patch(self):


@@ -0,0 +1,10 @@
--- a/Viewer/ecflowUI/src/VFile.cpp 2024-08-28 12:20:27.000000000 +0000
+++ b/Viewer/ecflowUI/src/VFile.cpp 2024-08-28 12:20:51.000000000 +0000
@@ -9,6 +9,7 @@
#include "VFile.hpp"
+#include <cassert>
#include <cstdio>
#include <cstdlib>
#include <cstring>


@@ -0,0 +1,36 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class FastFloat(CMakePackage):
"""Fast and exact implementation of the C++ from_chars functions for number
types."""
homepage = "https://github.com/fastfloat/fast_float"
url = "https://github.com/fastfloat/fast_float/archive/refs/tags/v6.1.4.tar.gz"
license("Apache-2.0 OR BSL-1.0 OR MIT", checked_by="pranav-sivararamn")
version("6.1.6", sha256="4458aae4b0eb55717968edda42987cabf5f7fc737aee8fede87a70035dba9ab0")
version("6.1.5", sha256="597126ff5edc3ee59d502c210ded229401a30dafecb96a513135e9719fcad55f")
version("6.1.4", sha256="12cb6d250824160ca16bcb9d51f0ca7693d0d10cb444f34f1093bc02acfce704")
depends_on("cxx", type="build")
depends_on("cmake@3.9:", type="build")
depends_on("doctest", type="test")
patch(
"https://github.com/fastfloat/fast_float/commit/a7ed4e89c7444b5c8585453fc6d015c0efdf8654.patch?full_index=1",
sha256="25561aa7db452da458fb0ae3075ef8e63ccab174ca8f5a6c79fb15cb342b3683",
when="@:6.1.5",
)
def cmake_args(self):
args = [self.define("FASTFLOAT_TEST", self.run_tests), self.define("SYSTEM_DOCTEST", True)]
return args


@@ -62,7 +62,7 @@ class Ginkgo(CMakePackage, CudaPackage, ROCmPackage):
depends_on("cuda@9:", when="+cuda @:1.4.0")
depends_on("cuda@9.2:", when="+cuda @1.5.0:")
depends_on("cuda@10.1:", when="+cuda @1.7.0:")
depends_on("mpi", when="+mpi")
depends_on("mpi@3.1:", when="+mpi")
depends_on("rocthrust", when="+rocm")
depends_on("hipsparse", when="+rocm")


@@ -2,11 +2,19 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.package import *
def terminate_bash_failures(dir):
"""Ensure bash scripts within the directory fail as soon as a command
within fails."""
for f in os.listdir(dir):
if f.endswith(".sh"):
filter_file(r"#!/bin/bash", r"#!/bin/bash" + "\nset -e", join_path(dir, f))
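`filter_file` is a Spack utility; the same shebang substitution can be sketched with the stdlib alone (a hypothetical standalone version, not the Spack API):

```python
import os
import re

def terminate_bash_failures_stdlib(directory):
    """Insert 'set -e' after the bash shebang of every .sh script in directory."""
    for name in os.listdir(directory):
        if name.endswith(".sh"):
            path = os.path.join(directory, name)
            with open(path) as fh:
                text = fh.read()
            # add 'set -e' once, right after the shebang line
            with open(path, "w") as fh:
                fh.write(re.sub(r"#!/bin/bash", "#!/bin/bash\nset -e", text, count=1))
```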
class Gptune(CMakePackage):
"""GPTune is an autotuning framework that relies on multitask and transfer
learning to help solve the underlying black-box optimization problem using
@@ -93,7 +101,6 @@ def cmake_args(self):
return args
examples_src_dir = "examples"
src_dir = "GPTune"
nodes = 1
cores = 4
@@ -101,45 +108,14 @@ def cmake_args(self):
def cache_test_sources(self):
"""Copy the example source files after the package is installed to an
install test subdirectory for use during `spack test run`."""
self.cache_extra_test_sources([self.examples_src_dir])
cache_extra_test_sources(self, [self.examples_src_dir])
def setup_run_environment(self, env):
env.set("GPTUNE_INSTALL_PATH", python_platlib)
def test(self):
spec = self.spec
# Create the environment setup script
comp_name = self.compiler.name
comp_version = str(self.compiler.version).replace(".", ",")
test_dir = join_path(self.test_suite.current_test_cache_dir, self.examples_src_dir)
if spec.satisfies("+superlu"):
superludriver = join_path(spec["superlu-dist"].prefix.lib, "EXAMPLE/pddrive_spawn")
op = ["-r", superludriver, "."]
# copy superlu-dist executables to the correct place
wd = join_path(test_dir, "SuperLU_DIST")
self.run_test("rm", options=["-rf", "superlu_dist"], work_dir=wd)
self.run_test(
"git",
options=["clone", "https://github.com/xiaoyeli/superlu_dist.git"],
work_dir=wd,
)
self.run_test("mkdir", options=["-p", "build"], work_dir=wd + "/superlu_dist")
self.run_test("mkdir", options=["-p", "EXAMPLE"], work_dir=wd + "/superlu_dist/build")
self.run_test("cp", options=op, work_dir=wd + "/superlu_dist/build/EXAMPLE")
if spec.satisfies("+hypre"):
hypredriver = join_path(spec["hypre"].prefix.bin, "ij")
op = ["-r", hypredriver, "."]
# copy superlu-dist executables to the correct place
wd = join_path(test_dir, "Hypre")
self.run_test("rm", options=["-rf", "hypre"], work_dir=wd)
self.run_test(
"git", options=["clone", "https://github.com/hypre-space/hypre.git"], work_dir=wd
)
self.run_test("cp", options=op, work_dir=wd + "/hypre/src/test/")
wd = self.test_suite.current_test_cache_dir
with open("{0}/run_env.sh".format(wd), "w") as envfile:
spec = self.spec
script_path = f"{install_test_root(self)}/run_env.sh"
with open(script_path, "w") as envfile:
envfile.write('if [[ $NERSC_HOST = "cori" ]]; then\n')
envfile.write(" export machine=cori\n")
envfile.write('elif [[ $(uname -s) = "Darwin" ]]; then\n')
@@ -154,13 +130,15 @@ def test(self):
envfile.write(" export machine=unknownlinux\n")
envfile.write("fi\n")
envfile.write("export GPTUNEROOT=$PWD\n")
envfile.write("export MPIRUN={0}\n".format(which(spec["mpi"].prefix.bin + "/mpirun")))
envfile.write("export PYTHONPATH={0}:$PYTHONPATH\n".format(python_platlib + "/gptune"))
mpirun = spec["mpi"].prefix.bin.mpirun
envfile.write(f"export MPIRUN={mpirun}\n")
gptune_path = join_path(python_platlib, "gptune")
envfile.write(f"export PYTHONPATH={gptune_path}:$PYTHONPATH\n")
envfile.write("export proc=$(spack arch)\n")
envfile.write("export mpi={0}\n".format(spec["mpi"].name))
envfile.write("export compiler={0}\n".format(comp_name))
envfile.write("export nodes={0} \n".format(self.nodes))
envfile.write("export cores={0} \n".format(self.cores))
envfile.write(f"export mpi={spec['mpi'].name}\n")
envfile.write(f"export compiler={comp_name}\n")
envfile.write(f"export nodes={self.nodes} \n")
envfile.write(f"export cores={self.cores} \n")
envfile.write("export ModuleEnv=$machine-$proc-$mpi-$compiler \n")
envfile.write(
'software_json=$(echo ",\\"software_configuration\\":'
@@ -214,28 +192,115 @@ def test(self):
+ '{\\"nodes\\":$nodes,\\"cores\\":$cores}}}") \n'
)
# copy the environment configuration files to non-cache directories
op = ["run_env.sh", python_platlib + "/gptune/."]
self.run_test("cp", options=op, work_dir=wd)
op = ["run_env.sh", self.install_test_root + "/."]
self.run_test("cp", options=op, work_dir=wd)
# copy the environment configuration to the python install directory
cp = which("cp")
cp(script_path, join_path(python_platlib, "gptune"))
apps = ["Scalapack-PDGEQRF_RCI"]
if spec.satisfies("+mpispawn"):
apps = apps + ["GPTune-Demo", "Scalapack-PDGEQRF"]
if spec.satisfies("+superlu"):
apps = apps + ["SuperLU_DIST_RCI"]
if spec.satisfies("+mpispawn"):
apps = apps + ["SuperLU_DIST"]
if spec.satisfies("+hypre"):
if spec.satisfies("+mpispawn"):
apps = apps + ["Hypre"]
def setup_run_environment(self, env):
env.set("GPTUNE_INSTALL_PATH", python_platlib)
for app in apps:
wd = join_path(test_dir, app)
self.run_test(
"bash",
options=["run_examples.sh"],
work_dir=wd,
purpose="gptune smoke test for {0}".format(app),
bash = which("bash")
cp = which("cp")
git = which("git")
rm = which("rm")
def test_hypre(self):
"""set up and run hypre example"""
spec = self.spec
if spec.satisfies("~hypre") or spec.satisfies("~mpispawn"):
raise SkipTest("Package must be installed with +hypre+mpispawn")
# https://github.com/spack/spack/pull/45383#discussion_r1737987370
if not self.spec["hypre"].satisfies("@2.19.0"):
raise SkipTest("Package test only works for hypre@2.19.0")
test_dir = join_path(self.test_suite.current_test_cache_dir, self.examples_src_dir)
# copy hypre executables to the correct place
wd = join_path(test_dir, "Hypre")
with working_dir(wd):
self.rm("-rf", "hypre")
self.git(
"clone",
"--depth",
"1",
"--branch",
f"v{self.spec['hypre'].version.string}",
"https://github.com/hypre-space/hypre.git",
)
hypre_test_dir = join_path(wd, "hypre", "src", "test")
mkdirp(hypre_test_dir)
self.cp("-r", self.spec["hypre"].prefix.bin.ij, hypre_test_dir)
# now run the test example
with working_dir(join_path(test_dir, "Hypre")):
terminate_bash_failures(".")
self.bash("run_examples.sh")
def test_superlu(self):
"""set up and run superlu tests"""
if self.spec.satisfies("~superlu"):
raise SkipTest("Package must be installed with +superlu")
# https://github.com/spack/spack/pull/45383#discussion_r1737987370
if self.spec["superlu-dist"].version < Version("7.1"):
raise SkipTest("Package must be installed with superlu-dist@7.1:")
test_dir = join_path(self.test_suite.current_test_cache_dir, self.examples_src_dir)
# copy superlu-dist executables to the correct place
wd = join_path(test_dir, "SuperLU_DIST")
with working_dir(wd):
self.rm("-rf", "superlu_dist")
version = self.spec["superlu-dist"].version.string
tag = f"v{version}" if version.replace(".", "").isdigit() else version
# TODO: Replace this IF/when superlu-dist renames its "master"
# branch's version from "develop" to "master".
tag = "master" if tag == "develop" else tag
self.git(
"clone",
"--depth",
"1",
"--branch",
tag,
"https://github.com/xiaoyeli/superlu_dist.git",
)
superludriver = self.spec["superlu-dist"].prefix.lib.EXAMPLE.pddrive_spawn
example_dir = join_path(wd, "superlu_dist", "build", "EXAMPLE")
mkdirp(example_dir)
self.cp("-r", superludriver, example_dir)
apps = ["SuperLU_DIST", "SuperLU_DIST_RCI"]
for app in apps:
with test_part(self, f"test_superlu_{app}", purpose=f"run {app} example"):
if app == "SuperLU_DIST" and self.spec.satisfies("~mpispawn"):
raise SkipTest("Package must be installed with +superlu+mpispawn")
with working_dir(join_path(test_dir, app)):
terminate_bash_failures(".")
self.bash("run_examples.sh")
def test_demo(self):
"""Run the demo test"""
if self.spec.satisfies("~mpispawn"):
raise SkipTest("Package must be installed with +mpispawn")
test_dir = join_path(self.test_suite.current_test_cache_dir, self.examples_src_dir)
with working_dir(join_path(test_dir, "GPTune-Demo")):
terminate_bash_failures(".")
self.bash("run_examples.sh")
def test_scalapack(self):
"""Run scalapack tests"""
test_dir = join_path(self.test_suite.current_test_cache_dir, self.examples_src_dir)
apps = ["Scalapack-PDGEQRF", "Scalapack-PDGEQRF_RCI"]
for app in apps:
with test_part(self, f"test_scalapack_{app}", purpose=f"run {app} example"):
if app == "Scalapack-PDGEQRF" and self.spec.satisfies("~mpispawn"):
raise SkipTest("Package must be installed with +superlu+mpispawn")
with working_dir(join_path(test_dir, app)):
terminate_bash_failures(".")
self.bash("run_examples.sh")
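In `test_superlu` above, the git tag to clone is derived from the superlu-dist version string. In isolation, that mapping behaves like the sketch below (the helper name is illustrative, not part of the package):

```python
def superlu_git_tag(version: str) -> str:
    """Map a superlu-dist version string to the git ref cloned by the test."""
    # Numeric releases get a "v" prefix, e.g. "8.1.2" -> "v8.1.2";
    # anything else is assumed to already be a branch or tag name.
    tag = f"v{version}" if version.replace(".", "").isdigit() else version
    # superlu-dist's "develop" version lives on the "master" branch.
    return "master" if tag == "develop" else tag


print(superlu_git_tag("8.1.2"))    # -> v8.1.2
print(superlu_git_tag("develop"))  # -> master
```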

View File

@@ -17,6 +17,7 @@ class HipTensor(CMakePackage, ROCmPackage):
maintainers("srekolam", "afzpatel")
version("master", branch="master")
version("6.2.0", sha256="adb7459416864fb2664064f5bea5fb669839247b702209a6415b396813626b31")
version("6.1.2", sha256="ac0e07a3019bcce4a0a98aafa4922d5fc9e953bed07084abef5306c851717783")
version("6.1.1", sha256="09bcdbf6b1d20dc4d75932abd335a9a534b16a8343858121daa5813a38f5ad3a")
version("6.1.0", sha256="9cc43b1b3394383f22f30e194d8753ca6ff1887c83ec1de5823cb2e94976eeed")
@@ -29,11 +30,11 @@ class HipTensor(CMakePackage, ROCmPackage):
variant("asan", default=False, description="Build with address-sanitizer enabled or disabled")
-for ver in ["5.7.0", "5.7.1", "6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2", "master"]:
+for ver in ["5.7.0", "5.7.1", "6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2", "6.2.0", "master"]:
depends_on(f"composable-kernel@{ver}", when=f"@{ver}")
depends_on(f"rocm-cmake@{ver}", when=f"@{ver}")
-for ver in ["6.1.0", "6.1.1", "6.1.2"]:
+for ver in ["6.1.0", "6.1.1", "6.1.2", "6.2.0"]:
depends_on(f"hipcc@{ver}", when=f"@{ver}")
def setup_build_environment(self, env):

View File

@@ -27,6 +27,7 @@ class Hip(CMakePackage):
license("MIT")
version("master", branch="master")
version("6.2.0", sha256="7ca261eba79793427674bf2372c92ac5483cc0fac5278f8ad611de396fad8bee")
version("6.1.2", sha256="9ba5f70a553b48b2cea25c7e16b97ad49320750c0152763b173b63b9f151e783")
version("6.1.1", sha256="09e8013b8071fca2cf914758001bbd1dccaa237e798e945970e4356cb9b90050")
version("6.1.0", sha256="6fd57910a16d0b54df822807e67b6207146233a2de5a46c6a05b940a21e2c4d7")
@@ -84,6 +85,7 @@ class Hip(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"hsakmt-roct@{ver}", when=f"@{ver}")
depends_on(f"hsa-rocr-dev@{ver}", when=f"@{ver}")
@@ -106,6 +108,7 @@ class Hip(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"hipify-clang@{ver}", when=f"@{ver}")
@@ -121,13 +124,16 @@ class Hip(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")
depends_on("rocprofiler-register@6.2.0", when="@6.2.0")
# hipcc likes to add `-lnuma` by default :(
# ref https://github.com/ROCm/HIP/pull/2202
depends_on("numactl", when="@3.7.0:")
-for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2"]:
+for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2", "6.2.0"]:
depends_on(f"hipcc@{ver}", when=f"@{ver}")
# roc-obj-ls requirements
@@ -189,6 +195,7 @@ class Hip(CMakePackage):
)
# Add hip-clr sources thru the below
for d_version, d_shasum in [
("6.2.0", "620e4c6a7f05651cc7a170bc4700fef8cae002420307a667c638b981d00b25e8"),
("6.1.2", "1a1e21640035d957991559723cd093f0c7e202874423667d2ba0c7662b01fea4"),
("6.1.1", "2db02f335c9d6fa69befcf7c56278e5cecfe3db0b457eaaa41206c2585ef8256"),
("6.1.0", "49b23eef621f4e8e528bb4de8478a17436f42053a2f7fde21ff221aa683205c7"),
@@ -242,6 +249,7 @@ class Hip(CMakePackage):
)
# Add hipother sources thru the below
for d_version, d_shasum in [
("6.2.0", "1f854b0c07d71b10450080e3bbffe47adaf10a9745a9212797d991756a100174"),
("6.1.2", "2740d1e3dcf1f2d07d2a8db6acf4c972941ae392172b83fd8ddcfe8706a40d0b"),
("6.1.1", "8b975623c8ed1db53feea2cfd5d29f2a615e890aee1157d0d17adeb97200643f"),
("6.1.0", "43a48ccc82f705a15852392ee7419e648d913716bfc04063a53d2d17979b1b46"),
@@ -260,6 +268,7 @@ class Hip(CMakePackage):
# Add hiptests sources thru the below
for d_version, d_shasum in [
("6.2.0", "314837dbac78be71844ceb959476470c484fdcd4fb622ff8de9277783e0fcf1c"),
("6.1.2", "5b14e4a30d8d8fb56c43e262009646ba9188eac1c8ff882d9a606a4bec69b56b"),
("6.1.1", "10c96ee72adf4580056292ab17cfd858a2fd7bc07abeb41c6780bd147b47f7af"),
("6.1.0", "cf3a6a7c43116032d933cc3bc88bfc4b17a4ee1513c978e751755ca11a5ed381"),

View File

@@ -24,6 +24,7 @@ class Hipblas(CMakePackage, CudaPackage, ROCmPackage):
version("develop", branch="develop")
version("master", branch="master")
version("6.2.0", sha256="33688a4d929b13e1fd800aff7e0833a9f7abf3913754b6b15995595e0d434e94")
version("6.1.2", sha256="73699892855775a67f48c38beae78169a516078c17f1ed5d67c80abe5d308502")
version("6.1.1", sha256="087ea82dff13c8162bf93343b174b18f1d58681711bce4fb7c8dc7212020c099")
version("6.1.0", sha256="5f8193c4ef0508967e608a8adf86d63066a984c5803a4d05dd617021d6298091")
@@ -75,7 +76,7 @@ class Hipblas(CMakePackage, CudaPackage, ROCmPackage):
depends_on("rocm-cmake@5.2.0:", type="build", when="@5.2.0:5.7")
depends_on("rocm-cmake@4.5.0:", type="build")
-for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2"]:
+for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2", "6.2.0"]:
depends_on(f"rocm-cmake@{ver}", when=f"+rocm @{ver}")
depends_on(f"rocm-openmp-extras@{ver}", type="test", when=f"+rocm @{ver}")
@@ -97,6 +98,7 @@ class Hipblas(CMakePackage, CudaPackage, ROCmPackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
"master",
"develop",
]:

View File

@@ -17,6 +17,7 @@ class Hipblaslt(CMakePackage):
maintainers("srekolam", "afzpatel", "renjithravindrankannath")
license("MIT")
version("6.2.0", sha256="aec9edc75ae4438aa712192c784e2bed683d2839b502b6aadb18f6012306749b")
version("6.1.2", sha256="fcfe950f7b87c421565abe090b2de6f463afc1549841002f105ecca7bbbf59e5")
version("6.1.1", sha256="1e21730ade59b5e32432fa0981383f689a380b1ffc92fe950822722da9521a72")
version("6.1.0", sha256="90fc2f2c9e11c87e0529e824e4b0561dbc850f8ffa21be6932ae63cbaa27cdf0")
@@ -35,7 +36,7 @@ class Hipblaslt(CMakePackage):
)
variant("asan", default=False, description="Build with address-sanitizer enabled or disabled")
-for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2"]:
+for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2", "6.2.0"]:
depends_on(f"hip@{ver}", when=f"@{ver}")
depends_on(f"hipblas@{ver}", when=f"@{ver}")
depends_on(f"rocm-openmp-extras@{ver}", type="test", when=f"@{ver}")
@@ -51,7 +52,7 @@ class Hipblaslt(CMakePackage):
patch("001_Set_LLVM_Paths_And_Add_Includes.patch", when="@6.0")
# Below patch sets the proper path for clang++ and clang-offload-blunder.
# Also adds hipblas and msgpack include directories for 6.1.0 release.
-patch("0001-Set-LLVM_Path-Add-Hiblas-Include-to-CmakeLists-6.1.Patch", when="@6.1")
+patch("0001-Set-LLVM_Path-Add-Hiblas-Include-to-CmakeLists-6.1.Patch", when="@6.1:6.2")
def setup_build_environment(self, env):
env.set("CXX", self.spec["hip"].hipcc)

View File

@@ -24,6 +24,7 @@ def url_for_version(self, version):
maintainers("srekolam", "renjithravindrankannath", "afzpatel")
license("MIT")
version("6.2.0", sha256="12ce17dc920ec6dac0c5484159b3eec00276e4a5b301ab1250488db3b2852200")
version("6.1.2", sha256="300e9d6a137dcd91b18d5809a316fddb615e0e7f982dc7ef1bb56876dff6e097")
version("6.1.1", sha256="f1a67efb49f76a9b262e9735d3f75ad21e3bd6a05338c9b15c01e6c625c4460d")
version("6.1.0", sha256="6bd9912441de6caf6b26d1323e1c899ecd14ff2431874a2f5883d3bc5212db34")

View File

@@ -17,6 +17,7 @@ class Hipcub(CMakePackage, CudaPackage, ROCmPackage):
license("BSD-3-Clause")
maintainers("srekolam", "renjithravindrankannath")
version("6.2.0", sha256="8dda8b77740e722fd4cf7223476313fc873bad75d50e6cb86ff284a91d76752d")
version("6.1.2", sha256="830a0f3231e07fcc6cd6261c4e1af2d7d0ac4862c606ecdc80c2635557ca3d9f")
version("6.1.1", sha256="967716d67e4270c599a60b770d543ea9148948edb907a0fa4d8be3a1785c2058")
version("6.1.0", sha256="39ac03053ecf35f1faf212e5b197b03c0104b74b0833f7cce5cf625c273ba71c")
@@ -74,6 +75,7 @@ class Hipcub(CMakePackage, CudaPackage, ROCmPackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocprim@{ver}", when=f"+rocm @{ver}")
depends_on(f"rocm-cmake@{ver}:", type="build", when=f"@{ver}")

View File

@@ -24,6 +24,7 @@ class Hipfft(CMakePackage, CudaPackage, ROCmPackage):
license("MIT")
version("master", branch="master")
version("6.2.0", sha256="8d19aebb1bbfea1f235ca08d34393ce39bea35dc9cbfa72a3cf7cdf1c56410e7")
version("6.1.2", sha256="6753e45d9c671d58e68bed2b0c1bfcd40fad9d690dba3fe6011e67e51dbe3cc6")
version("6.1.1", sha256="df84e488098d457a7411f6b459537fa5c5ee160027efc3a9a076980bbe57c4d3")
version("6.1.0", sha256="1a9cf598a932192f7f12b8987d96477f09186f9a95c5a28742f9caeb81640c95")
@@ -81,6 +82,7 @@ class Hipfft(CMakePackage, CudaPackage, ROCmPackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-cmake@{ver}:", type="build", when=f"@{ver}")
depends_on(f"rocfft@{ver}", when=f"+rocm @{ver}")

View File

@@ -17,6 +17,7 @@ class Hipfort(CMakePackage):
license("MIT")
maintainers("cgmb", "srekolam", "renjithravindrankannath")
version("6.2.0", sha256="7f6db61a0ac7771e5c4604a6113b36736f6c7f05cabd7e1df8e832c98b87311d")
version("6.1.2", sha256="f60d07fa3e5b09246c8908b2876addf175a91e91c8b0fac85b000f88b6743c7c")
version("6.1.1", sha256="646f7077399db7a70d7102fda8307d0a11039f616399a4a06a64fd824336419f")
version("6.1.0", sha256="70d3ccc9f3536f62686e73934f5972ed011c4df7654ed1f8e6d2d42c4289f47e")
@@ -59,6 +60,7 @@ class Hipfort(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"hip@{ver}", type="build", when=f"@{ver}")

View File

@@ -0,0 +1,12 @@
diff --git a/CMakeLists.txt b/CMakeLists.txt
index 0d105e2..0c1bbb5 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -177,6 +177,7 @@ if (NOT HIPIFY_CLANG_TESTS_ONLY)
install(
DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/bin
DESTINATION .
+ USE_SOURCE_PERMISSIONS
PATTERN "hipify-perl"
PATTERN "*.sh"
PATTERN "findcode.sh" EXCLUDE

View File

@@ -20,6 +20,7 @@ class HipifyClang(CMakePackage):
license("MIT")
version("master", branch="master")
version("6.2.0", sha256="11bfbde7c40e5cd5de02a47ec30dc6df4b233a12126bf7ee449432a30a3e6e1e")
version("6.1.2", sha256="7cc1e3fd7690a3e1d99cd07f2bd62ee73682cceeb4a46918226fc70f8092eb68")
version("6.1.1", sha256="240b83ccbe1b6514a6af6c2261e306948ce6c2b1c4d1056e830bbaebddeabd82")
version("6.1.0", sha256="dc61b476081750130c62c7540fce49ee3a45a2b74e185d20049382574c1842d1")
@@ -46,7 +47,8 @@ class HipifyClang(CMakePackage):
patch("0001-install-hipify-clang-in-bin-dir-and-llvm-clangs-head.patch", when="@5.1.0:5.5")
patch("0002-install-hipify-clang-in-bin-dir-and-llvm-clangs-head.patch", when="@5.6:6.0")
-patch("0003-install-hipify-clang-in-bin-dir-and-llvm-clangs-head.patch", when="@6.1:")
+patch("0003-install-hipify-clang-in-bin-dir-and-llvm-clangs-head.patch", when="@6.1")
+patch("0001-use-source-permission-for-hipify-perl.patch", when="@6.2")
depends_on("cmake@3.5:", type="build")
for ver in [
@@ -65,6 +67,7 @@ class HipifyClang(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
"master",
]:
depends_on(f"llvm-amdgpu@{ver}", when=f"@{ver}")
@@ -81,6 +84,7 @@ class HipifyClang(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")

View File

@@ -24,6 +24,7 @@ class Hiprand(CMakePackage, CudaPackage, ROCmPackage):
version("develop", branch="develop")
version("master", branch="master")
version("6.2.0", sha256="daaf32506eaaf3c3b715ed631387c27992cfe0d938353a88ad6acedc735eb54b")
version("6.1.2", sha256="f0f129811c144dd711e967305c7af283cefb94bfdbcd2a11296b92a9e966be2c")
version("6.1.1", sha256="dde1526fb6cde17b18bc9ee6daa719056fc468dfbda5801b9a61260daf2b4498")
version("6.1.0", sha256="f9d71af23092f8faa888d2c14713ee4d4d350454818ca9331d422c81c2587c1f")
@@ -93,6 +94,7 @@ class Hiprand(CMakePackage, CudaPackage, ROCmPackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
"master",
"develop",
]:

View File

@@ -29,6 +29,7 @@ class Hipsolver(CMakePackage, CudaPackage, ROCmPackage):
version("develop", branch="develop")
version("master", branch="master")
version("6.2.0", sha256="637577a9cc38e4865894dbcd7eb35050e3de5d45e6db03472e836b318602a84d")
version("6.1.2", sha256="406a8e5b82daae2fc03e0a738b5a054ade01bb41785cee4afb9e21c7ec91d492")
version("6.1.1", sha256="01d4553458f417824807c069cacfc65d23f6cac79536158473b4356986c8fafd")
version("6.1.0", sha256="3cb89ca486cdbdfcb1a07c35ee65f60219ef7bc62a5b0f94ca1a3206a0106495")
@@ -99,6 +100,7 @@ class Hipsolver(CMakePackage, CudaPackage, ROCmPackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
"master",
"develop",
]:

View File

@@ -21,6 +21,7 @@ class Hipsparse(CMakePackage, CudaPackage, ROCmPackage):
libraries = ["libhipsparse"]
license("MIT")
version("6.2.0", sha256="e51b9871d764763519c14be2ec52c1e1ae3959b439afb4be6518b9f9a6f0ebaf")
version("6.1.2", sha256="dd44f9b6000b3b0ac0fa238037a80f79d6745a689d4a6755f2d595643be1ef6d")
version("6.1.1", sha256="307cff012f0465942dd6666cb00ae60c35941699677c4b26b08e4832bc499059")
version("6.1.0", sha256="1d9277a11f71474ea4a9f8419a7a2c37170a86969584e5724e385ec74241e565")
@@ -80,6 +81,7 @@ class Hipsparse(CMakePackage, CudaPackage, ROCmPackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-cmake@{ver}:", type="build", when=f"@{ver}")
depends_on(f"rocsparse@{ver}", when=f"+rocm @{ver}")

View File

@@ -0,0 +1,77 @@
diff --git a/CMakeLists.txt b/CMakeLists.txt
index e10585c..a29bc63 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -185,7 +185,7 @@ else()
set( tensile_fork "ROCmSoftwarePlatform" CACHE STRING "Tensile fork to use" )
file (STRINGS "tensilelite_tag.txt" read_tensile_tag)
set( tensile_tag ${read_tensile_tag} CACHE STRING "Tensile tag to download" )
- virtualenv_install("git+https://github.com/${tensile_fork}/hipBLASLt.git@${tensile_tag}#subdirectory=tensilelite")
+ virtualenv_install("git+https://github.com/ROCm/hipBLASLt.git@modify-tensilelite-spack-6.2#subdirectory=tensilelite")
message (STATUS "using GIT Tensile fork=${tensile_fork} from branch=${tensile_tag}")
endif()
diff --git a/clients/gtest/CMakeLists.txt b/clients/gtest/CMakeLists.txt
index 2057db0..6085133 100644
--- a/clients/gtest/CMakeLists.txt
+++ b/clients/gtest/CMakeLists.txt
@@ -53,6 +53,7 @@ target_include_directories( hipsparselt-test
$<BUILD_INTERFACE:${BLAS_INCLUDE_DIR}>
$<BUILD_INTERFACE:${BLIS_INCLUDE_DIR}> # may be blank if not used
$<BUILD_INTERFACE:${GTEST_INCLUDE_DIRS}>
+ $<BUILD_INTERFACE:${HIPSPARSE_INCLUDE_DIRS}>
)
message("BLIS_INCLUDE_DIR=" ${BLIS_INCLUDE_DIR})
target_link_libraries( hipsparselt-test PRIVATE ${BLAS_LIBRARY} ${GTEST_BOTH_LIBRARIES} roc::hipsparselt )
diff --git a/clients/samples/CMakeLists.txt b/clients/samples/CMakeLists.txt
index 6b303d5..c6d608c 100644
--- a/clients/samples/CMakeLists.txt
+++ b/clients/samples/CMakeLists.txt
@@ -50,6 +50,11 @@ foreach( exe ${sample_list_all} )
$<BUILD_INTERFACE:${HIP_INCLUDE_DIRS}>
)
+ target_include_directories( ${exe}
+ SYSTEM PRIVATE
+ $<BUILD_INTERFACE:${HIPSPARSE_INCLUDE_DIRS}>
+ )
+
if( CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
# GCC or hip-clang needs specific flags to turn on f16c intrinsics
target_compile_options( ${exe} PRIVATE -mf16c )
diff --git a/library/CMakeLists.txt b/library/CMakeLists.txt
index aac8506..e282268 100644
--- a/library/CMakeLists.txt
+++ b/library/CMakeLists.txt
@@ -58,6 +58,9 @@ include(src/CMakeLists.txt)
# Create hipSPARSELt library
add_library(hipsparselt ${hipsparselt_source} ${hipsparselt_headers_public})
add_library(roc::hipsparselt ALIAS hipsparselt)
+target_include_directories( hipsparselt PRIVATE ${HIPSPARSE_INCLUDE_DIRS} )
+target_include_directories( hipsparselt PRIVATE ${MSGPACK_DIR}/include )
+
# Target compile definitions
if(NOT BUILD_CUDA)
diff --git a/library/src/CMakeLists.txt b/library/src/CMakeLists.txt
index 85f7cde..4c52b34 100755
--- a/library/src/CMakeLists.txt
+++ b/library/src/CMakeLists.txt
@@ -61,7 +61,7 @@ if(NOT BUILD_CUDA)
if(Tensile_CPU_THREADS MATCHES "^[0-9]+$")
# only including threads argument if number
TensileCreateLibraryFiles(
- "${CMAKE_CURRENT_SOURCE_DIR}/src/hcc_detail/rocsparselt/src/spmm/Tensile/Logic/${Tensile_LOGIC}"
+ "${CMAKE_CURRENT_SOURCE_DIR}/src/hcc_detail/rocsparselt/src/spmm/Tensile/Logic"
"${PROJECT_BINARY_DIR}/Tensile"
ARCHITECTURE ${Tensile_ARCHITECTURE}
CODE_OBJECT_VERSION ${Tensile_CODE_OBJECT_VERSION}
@@ -72,7 +72,7 @@ if(NOT BUILD_CUDA)
)
else()
TensileCreateLibraryFiles(
- "${CMAKE_CURRENT_SOURCE_DIR}/src/hcc_detail/rocsparselt/src/spmm/Tensile/Logic/${Tensile_LOGIC}"
+ "${CMAKE_CURRENT_SOURCE_DIR}/src/hcc_detail/rocsparselt/src/spmm/Tensile/Logic"
"${PROJECT_BINARY_DIR}/Tensile"
ARCHITECTURE ${Tensile_ARCHITECTURE}
CODE_OBJECT_VERSION ${Tensile_CODE_OBJECT_VERSION}

View File

@@ -21,6 +21,7 @@ class Hipsparselt(CMakePackage, ROCmPackage):
maintainers("srekolam", "afzpatel", "renjithravindrankannath")
license("MIT")
version("6.2.0", sha256="a25a3ce0ed3cc616b1a4e38bfdd5e68463bb9fe791a56d1367b8a6373bb63d12")
version("6.1.2", sha256="a5a01fec7bc6e1f4792ccd5c8eaee7b42deac315c54298a7ce5265e5551e8640")
version("6.1.1", sha256="ca6da099d9e385ffce2b68404f395a93b199af1592037cf52c620f9148a6a78d")
version("6.1.0", sha256="66ade6de4fd19d144cab27214352faf5b00bbe12afe59472efb441b16d090265")
@@ -43,7 +44,7 @@ class Hipsparselt(CMakePackage, ROCmPackage):
)
variant("asan", default=False, description="Build with address-sanitizer enabled or disabled")
-for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2"]:
+for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2", "6.2.0"]:
depends_on(f"hip@{ver}", when=f"@{ver}")
depends_on(f"hipsparse@{ver}", when=f"@{ver}")
depends_on(f"rocm-openmp-extras@{ver}", when=f"@{ver}", type="test")
@@ -64,6 +65,7 @@ class Hipsparselt(CMakePackage, ROCmPackage):
# tensorlite subdir of hipblas . Also adds hipsparse and msgpack include directories
# for 6.1.0 release.
patch("0001-update-llvm-path-add-hipsparse-include-dir-for-spack-6.1.patch", when="@6.1")
patch("0001-update-llvm-path-add-hipsparse-include-dir-for-spack-6.2.patch", when="@6.2")
def setup_build_environment(self, env):
env.set("CXX", self.spec["hip"].hipcc)

View File

@@ -24,6 +24,7 @@ class HsaRocrDev(CMakePackage):
libraries = ["libhsa-runtime64"]
version("master", branch="master")
version("6.2.0", sha256="c98090041fa56ca4a260709876e2666f85ab7464db9454b177a189e1f52e0b1a")
version("6.1.2", sha256="6eb7a02e5f1e5e3499206b9e74c9ccdd644abaafa2609dea0993124637617866")
version("6.1.1", sha256="72841f112f953c16619938273370eb8727ddf6c2e00312856c9fca54db583b99")
version("6.1.0", sha256="50386ebcb7ff24449afa2a10c76a059597464f877225c582ba3e097632a43f9c")
@@ -73,6 +74,7 @@ class HsaRocrDev(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
"master",
]:
depends_on(f"hsakmt-roct@{ver}", when=f"@{ver}")
@@ -92,6 +94,7 @@ class HsaRocrDev(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")

View File

@@ -22,6 +22,7 @@ class HsakmtRoct(CMakePackage):
maintainers("srekolam", "renjithravindrankannath")
version("master", branch="master")
version("6.2.0", sha256="73df98ca2be8a887cb76554c23f148ef6556bdbccfac99f34111fa1f87fd7c5d")
version("6.1.2", sha256="097a5b7eb136300667b36bd35bf55e4a283a1ed04e614cf24dddca0a65c86389")
version("6.1.1", sha256="c586d8a04fbd9a7bc0a15e0a6a161a07f88f654402bb11694bd8aebc343c00f0")
version("6.1.0", sha256="1085055068420821f7a7adb816692412b5fb38f89d67b9edb9995198f39e2f31")

View File

@@ -18,6 +18,26 @@ class IntelPin(Package):
license("MIT")
version(
"3.31",
sha256="82216144e3df768f0203b671ff48605314f13266903eb42dac01b91310eba956",
url="https://software.intel.com/sites/landingpage/pintool/downloads/pin-external-3.31-98869-gfa6f126a8-gcc-linux.tar.gz",
)
version(
"3.30",
sha256="be4f1130445c3fc4d83b7afad85c421d418f60013c33e8ee457bc7c9c194de1b",
url="https://software.intel.com/sites/landingpage/pintool/downloads/pin-3.30-98830-g1d7b601b3-gcc-linux.tar.gz",
)
version(
"3.29",
sha256="45c2a68d4b2184117584a55db17b44c86f9476e9cb8109b2fae50a965b1ea64f",
url="https://software.intel.com/sites/landingpage/pintool/downloads/pin-3.29-98790-g1a445fcd1-gcc-linux.tar.gz",
)
version(
"3.28",
sha256="5a5a3337f3f16176b97edcd3366b561936e1068fba4ebcfed4b836d81d45847b",
url="https://software.intel.com/sites/landingpage/pintool/downloads/pin-3.28-98749-g6643ecee5-gcc-linux.tar.gz",
)
version(
"3.27",
sha256="e7d44d25668632007d5a109e5033415e91db543b8ce9e665893a05e852b67707",

View File

@@ -130,6 +130,5 @@ def install_libraries(self):
for ver, lib, checksum in self.resource_list:
if self.spec.version == Version(ver):
with working_dir("kicad-{0}-{1}".format(lib, ver)):
-args = std_cmake_args
-cmake(*args)
+cmake(*self.std_cmake_args)
make("install")

View File

@@ -29,21 +29,43 @@ class Lammps(CMakePackage, CudaPackage, ROCmPackage, PythonExtension):
# marked deprecated=True
# * patch releases older than a stable release should be marked deprecated=True
version("develop", branch="develop")
-version("20240627", sha256="2174a99d266279823a8c57629ee1c21ec357816aefd85f964d9f859fe9222aa5")
-version("20240417", sha256="158b288725c251fd8b30dbcf61749e0d6a042807da92af865a7d3c413efdd8ea")
version(
"20240207.1", sha256="3ba62c2a1ed463fceedf313a1c3ea2997994aa102379a8d35b525ea424f56776"
"20240829",
sha256="6112e0cc352c3140a4874c7f74db3c0c8e30134024164509ecf3772b305fde2e",
preferred=True,
)
version(
"20240627",
sha256="2174a99d266279823a8c57629ee1c21ec357816aefd85f964d9f859fe9222aa5",
deprecated=True,
)
version(
"20240417",
sha256="158b288725c251fd8b30dbcf61749e0d6a042807da92af865a7d3c413efdd8ea",
deprecated=True,
)
version(
"20240207.1",
sha256="3ba62c2a1ed463fceedf313a1c3ea2997994aa102379a8d35b525ea424f56776",
deprecated=True,
)
version(
"20240207",
sha256="d518f32de4eb2681f2543be63926411e72072dd7d67c1670c090b5baabed98ac",
deprecated=True,
)
-version("20231121", sha256="704d8a990874a425bcdfe0245faf13d712231ba23f014a3ebc27bc14398856f1")
+version(
+"20231121",
+sha256="704d8a990874a425bcdfe0245faf13d712231ba23f014a3ebc27bc14398856f1",
+deprecated=True,
+)
version(
"20230802.4", sha256="6eed007cc24cda80b5dd43372b2ad4268b3982bb612669742c8c336b79137b5b"
)
version(
"20230802.3",
sha256="6666e28cb90d3ff01cbbda6c81bdb85cf436bbb41604a87f2ab2fa559caa8510",
-preferred=True,
+deprecated=True,
)
version(
"20230802.2",
@@ -372,7 +394,7 @@ class Lammps(CMakePackage, CudaPackage, ROCmPackage, PythonExtension):
depends_on("cxx", type="build")
# mdi, scafacos, ml-quip, qmmm require C, but not available in Spack
-for c_pkg in ("adios", "atc", "awpmd", "ml-pod", "electrode", "kim", "h5md", "tools"):
+for c_pkg in ("adios", "atc", "awpmd", "ml-pod", "electrode", "kim", "h5md", "tools", "rheo"):
depends_on("c", type="build", when=f"+{c_pkg}")
# scafacos, ml-quip require Fortran, but not available in Spack
@@ -380,6 +402,8 @@ class Lammps(CMakePackage, CudaPackage, ROCmPackage, PythonExtension):
depends_on("fortran", type="build", when=f"+{fc_pkg}")
stable_versions = {
"20240829",
"20230802.4",
"20230802.3",
"20230802.2",
"20230802.1",
@@ -495,6 +519,7 @@ def url_for_version(self, version):
"reaction": {"when": "@20210702:"},
"reax": {"when": "@:20181212"},
"reaxff": {"when": "@20210702:"},
"rheo": {"when": "@20240829:"},
"replica": {},
"rigid": {"default": True},
"shock": {},
@@ -569,6 +594,7 @@ def url_for_version(self, version):
variant("jpeg", default=False, description="Build with jpeg support")
variant("png", default=False, description="Build with png support")
variant("ffmpeg", default=False, description="Build with ffmpeg support")
variant("curl", default=False, description="Build with curl support", when="@20240829:")
variant("openmp", default=True, description="Build with OpenMP")
variant("opencl", default=False, description="Build with OpenCL")
variant(
@@ -658,6 +684,7 @@ def url_for_version(self, version):
depends_on("jpeg", when="+jpeg")
depends_on("kim-api", when="+kim")
depends_on("curl", when="@20190329:+kim")
depends_on("curl", when="+curl")
depends_on("libpng", when="+png")
depends_on("ffmpeg", when="+ffmpeg")
depends_on("kokkos+deprecated_code+shared@3.0.00", when="@20200303+kokkos")
@@ -688,6 +715,7 @@ def url_for_version(self, version):
depends_on("hipcub", when="~kokkos +rocm")
depends_on("llvm-amdgpu ", when="+rocm", type="build")
depends_on("rocm-openmp-extras", when="+rocm +openmp", type="build")
depends_on("gsl@2.6:", when="+rheo")
# propagate CUDA and ROCm architecture when +kokkos
for arch in CudaPackage.cuda_arch_values:
@@ -756,6 +784,12 @@ def url_for_version(self, version):
sha256="3dedd807f63a21c543d1036439099f05c6031fd98e7cb1ea7825822fc074106e",
when="@20220623.3:20230208 +kokkos +rocm +kspace",
)
# Fixed in https://github.com/lammps/lammps/pull/4305
patch(
"https://github.com/lammps/lammps/commit/49bdc3e26449634f150602a66d0dab34d09dbc0e.patch?full_index=1",
sha256="b8d1f08a82329e493e040de2bde9d2291af173a0fe6c7deb24750cc22823c421",
when="@20240829 %cce",
)
# Older LAMMPS does not compile with Kokkos 4.x
conflicts(
@@ -873,6 +907,7 @@ def cmake_args(self):
args.append(self.define_from_variant("WITH_JPEG", "jpeg"))
args.append(self.define_from_variant("WITH_PNG", "png"))
args.append(self.define_from_variant("WITH_FFMPEG", "ffmpeg"))
args.append(self.define_from_variant("WITH_CURL", "curl"))
for pkg, params in self.supported_packages.items():
if "when" not in params or spec.satisfies(params["when"]):

View File

@@ -3,6 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import shutil
from spack.package import *
@@ -13,7 +15,7 @@ class LlvmAmdgpu(CMakePackage, CompilerPackage):
homepage = "https://github.com/ROCm/llvm-project"
git = "https://github.com/ROCm/llvm-project.git"
-url = "https://github.com/ROCm/llvm-project/archive/rocm-6.1.2.tar.gz"
+url = "https://github.com/ROCm/llvm-project/archive/rocm-6.2.0.tar.gz"
tags = ["rocm"]
executables = [r"amdclang", r"amdclang\+\+", r"amdflang", r"clang.*", r"flang.*", "llvm-.*"]
generator("ninja")
@@ -23,6 +25,7 @@ class LlvmAmdgpu(CMakePackage, CompilerPackage):
license("Apache-2.0")
version("master", branch="amd-stg-open")
version("6.2.0", sha256="12ce17dc920ec6dac0c5484159b3eec00276e4a5b301ab1250488db3b2852200")
version("6.1.2", sha256="300e9d6a137dcd91b18d5809a316fddb615e0e7f982dc7ef1bb56876dff6e097")
version("6.1.1", sha256="f1a67efb49f76a9b262e9735d3f75ad21e3bd6a05338c9b15c01e6c625c4460d")
version("6.1.0", sha256="6bd9912441de6caf6b26d1323e1c899ecd14ff2431874a2f5883d3bc5212db34")
@@ -66,7 +69,8 @@ class LlvmAmdgpu(CMakePackage, CompilerPackage):
provides("libllvm@15", when="@5.3:5.4")
provides("libllvm@16", when="@5.5:5.6")
-provides("libllvm@17", when="@5.7:")
+provides("libllvm@17", when="@5.7:6.1")
+provides("libllvm@18", when="@6.2:")
depends_on("cmake@3.13.4:", type="build")
depends_on("python", type="build")
@@ -144,6 +148,7 @@ class LlvmAmdgpu(CMakePackage, CompilerPackage):
when="@master +rocm-device-libs",
)
for d_version, d_shasum in [
("6.2.0", "c98090041fa56ca4a260709876e2666f85ab7464db9454b177a189e1f52e0b1a"),
("6.1.2", "6eb7a02e5f1e5e3499206b9e74c9ccdd644abaafa2609dea0993124637617866"),
("6.1.1", "72841f112f953c16619938273370eb8727ddf6c2e00312856c9fca54db583b99"),
("6.1.0", "50386ebcb7ff24449afa2a10c76a059597464f877225c582ba3e097632a43f9c"),
@@ -284,6 +289,22 @@ def setup_dependent_run_environment(self, env, dependent_spec):
llvm_amdgpu_home = self.spec["llvm-amdgpu"].prefix
env.prepend_path("LD_LIBRARY_PATH", llvm_amdgpu_home + "/lib")
@run_after("install")
def post_install(self):
if self.spec.satisfies("@6.1: +rocm-device-libs"):
exe = self.prefix.bin.join("llvm-config")
output = Executable(exe)("--version", output=str, error=str)
version = re.split("[.]", output)[0]
mkdirp(join_path(self.prefix.lib.clang, version, "lib"), "amdgcn")
install_tree(
self.prefix.amdgcn, join_path(self.prefix.lib.clang, version, "lib", "amdgcn")
)
shutil.rmtree(self.prefix.amdgcn)
os.symlink(
join_path(self.prefix.lib.clang, version, "lib", "amdgcn"),
os.path.join(self.prefix, "amdgcn"),
)
# Required for enabling asan on dependent packages
def setup_dependent_build_environment(self, env, dependent_spec):
for root, _, files in os.walk(self.spec["llvm-amdgpu"].prefix):

View File

@@ -0,0 +1,24 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Makedepf90(AutotoolsPackage):
"""Makedepf90 is a program for automatic creation of Makefile-style dependency lists for
Fortran source code."""
homepage = "https://salsa.debian.org/science-team/makedepf90"
url = "https://deb.debian.org/debian/pool/main/m/makedepf90/makedepf90_3.0.1.orig.tar.xz"
maintainers("tukss")
license("GPL-2.0-only", checked_by="tukss")
version("3.0.1", sha256="a11601ea14ad793f23fca9c7e7df694b6337f962ccc930d995d72e172edf29ee")
depends_on("c", type="build")
depends_on("flex", type="build")
depends_on("bison", type="build")

View File

@@ -20,6 +20,7 @@ class Migraphx(CMakePackage):
libraries = ["libmigraphx"]
license("MIT")
version("6.2.0", sha256="7b36c1a0c44dd21f31ce6c9c4e7472923281aa7fdc693e75edd2670b101a6d48")
version("6.1.2", sha256="829f4a2bd9fe3dee130dfcca103ddc7691da18382f5b683aaca8f3ceceaef355")
version("6.1.1", sha256="e14a62678e97356236b45921e24f28ff430d670fb70456c3e5ebfeeb22160811")
version("6.1.0", sha256="2ba44146397624845c64f3898bb1b08837ad7a49f133329e58eb04c05d1f36ac")
@@ -94,6 +95,7 @@ class Migraphx(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-cmake@{ver}:", type="build", when=f"@{ver}")
depends_on(f"hip@{ver}", when=f"@{ver}")
@@ -101,7 +103,7 @@ class Migraphx(CMakePackage):
depends_on(f"rocblas@{ver}", when=f"@{ver}")
depends_on(f"miopen-hip@{ver}", when=f"@{ver}")
-for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2"]:
+for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2", "6.2.0"]:
depends_on(f"rocmlir@{ver}", when=f"@{ver}")
@property

View File

@@ -21,6 +21,7 @@ class MiopenHip(CMakePackage):
libraries = ["libMIOpen"]
license("MIT")
version("6.2.0", sha256="f4473f724362732019d505a0e01c17b060b542350859cb1e4bd4e3898b609276")
version("6.1.2", sha256="c8ff4af72264b2049bfe2685d581ea0f3e43319db7bd00dc347159bcf2731614")
version("6.1.1", sha256="cf568ea16dd23b32fe89e250bb33ed4722fea8aa7f407cc66ff37c37aab037ce")
version("6.1.0", sha256="3b373117eaeaf618aab9b39bb22e9950fd49bd0e264c8587b0c51fa348afe0d1")
@@ -84,12 +85,31 @@ class MiopenHip(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-cmake@{ver}:", type="build", when=f"@{ver}")
depends_on(f"hip@{ver}", when=f"@{ver}")
depends_on(f"rocm-clang-ocl@{ver}", when=f"@{ver}")
depends_on(f"rocblas@{ver}", when=f"@{ver}")
for ver in [
"5.3.0",
"5.3.3",
"5.4.0",
"5.4.3",
"5.5.0",
"5.5.1",
"5.6.0",
"5.6.1",
"5.7.0",
"5.7.1",
"6.0.0",
"6.0.2",
"6.1.0",
"6.1.1",
"6.1.2",
]:
depends_on(f"rocm-clang-ocl@{ver}", when=f"@{ver}")
for ver in ["5.3.0", "5.3.3"]:
depends_on(f"mlirmiopen@{ver}", when=f"@{ver}")
@@ -104,17 +124,20 @@ class MiopenHip(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on("nlohmann-json", type="link")
depends_on(f"composable-kernel@{ver}", when=f"@{ver}")
for ver in ["5.4.0", "5.4.3", "5.5.0"]:
depends_on("nlohmann-json", type="link")
depends_on(f"rocmlir@{ver}", when=f"@{ver}")
for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2"]:
for ver in ["6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2", "6.2.0"]:
depends_on("roctracer-dev@" + ver, when="@" + ver)
for ver in ["6.1.0", "6.1.1", "6.1.2"]:
depends_on("googletest")
depends_on("rocrand@6.2.0", when="@6.2.0")
def setup_build_environment(self, env):
lib_dir = self.spec["zlib-api"].libs.directories[0]
env.prepend_path("LIBRARY_PATH", lib_dir)
@@ -160,7 +183,7 @@ def cmake_args(self):
if self.spec.satisfies("@5.1.0:5.3"):
mlir_inc = spec["mlirmiopen"].prefix.include
args.append(self.define("CMAKE_CXX_FLAGS", "-I{0}".format(mlir_inc)))
if self.spec.satisfies("@5.4.0:"):
if self.spec.satisfies("@5.4.0:6.1"):
args.append(
"-DNLOHMANN_JSON_INCLUDE={0}".format(self.spec["nlohmann-json"].prefix.include)
)
@@ -174,14 +197,21 @@ def cmake_args(self):
args.append(self.define("MIOPEN_USE_MLIR", "OFF"))
if self.spec.satisfies("@5.7.0:"):
args.append(self.define("MIOPEN_ENABLE_AI_IMMED_MODE_FALLBACK", "OFF"))
args.append(
"-DNLOHMANN_JSON_INCLUDE={0}".format(self.spec["nlohmann-json"].prefix.include)
)
if self.spec.satisfies("@6.0.0:"):
if self.spec.satisfies("@6:6.1"):
args.append(
"-DROCTRACER_INCLUDE_DIR={0}".format(self.spec["roctracer-dev"].prefix.include)
)
args.append("-DROCTRACER_LIB_DIR={0}".format(self.spec["roctracer-dev"].prefix.lib))
if self.spec.satisfies("@6.1:"):
if self.spec.satisfies("@6.1"):
args.append("-DSQLITE_INCLUDE_DIR={0}".format(self.spec["sqlite"].prefix.include))
if self.spec.satisfies("@6.2:"):
args.append(
self.define(
"CMAKE_CXX_FLAGS",
f"-I{self.spec['roctracer-dev'].prefix.include} "
f"-L{self.spec['roctracer-dev'].prefix.lib} "
f"-I{self.spec['nlohmann-json'].prefix.include} "
f"-I{self.spec['sqlite'].prefix.include} ",
)
)
return args


@@ -26,6 +26,7 @@ def url_for_version(self, version):
return url.format(version)
license("MIT")
version("6.2.0", sha256="ce28ac3aef76f28869c4dad9ffd9ef090e0b54ac58088f1f1eef803641125b51")
version("6.1.2", sha256="0afa664931f566b7f5a3abd474dd641e56077529a2a5d7c788f5e6700e957ed6")
version("6.1.1", sha256="3483b5167c47047cca78581cc6c9685138f9b5b25edb11618b720814788fc2a0")
version("6.1.0", sha256="f18a72c4d12c36ab50f9c3a5c22fc3641feb11c99fed513540a16a65cd149fd1")
@@ -60,7 +61,7 @@ def url_for_version(self, version):
conflicts("+asan", when="os=centos8")
patch("0001-add-half-include-path.patch", when="@5.5")
patch("0001-add-half-include-path-5.6.patch", when="@5.6:")
patch("0001-add-half-include-path-5.6.patch", when="@5.6:6.1")
patch("0002-add-half-include-path-for-tests.patch", when="@5.5:6.0 +add_tests")
patch("0002-add-half-include-path-for-tests-6.1.0.patch", when="@6.1.0: +add_tests")
@@ -101,7 +102,7 @@ def patch(self):
"amd_openvx_extensions/amd_nn/nn_hip/CMakeLists.txt",
string=True,
)
if self.spec.satisfies("@5.5.0: + hip"):
if self.spec.satisfies("@5.5.0:6.1 + hip"):
filter_file(
r"${ROCM_PATH}/llvm/bin/clang++",
"{0}/bin/clang++".format(self.spec["llvm-amdgpu"].prefix),
@@ -249,6 +250,7 @@ def patch(self):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"miopen-hip@{ver}", when=f"@{ver}")
for ver in [
@@ -266,6 +268,7 @@ def patch(self):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"migraphx@{ver}", when=f"@{ver}")
depends_on(f"hip@{ver}", when=f"@{ver}")
@@ -282,10 +285,11 @@ def patch(self):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")
depends_on("python@3.5:", type="build")
for ver in ["5.7.0", "5.7.1", "6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2"]:
for ver in ["5.7.0", "5.7.1", "6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2", "6.2.0"]:
depends_on(f"rpp@{ver}", when=f"@{ver}")
def setup_run_environment(self, env):


@@ -75,9 +75,24 @@ class Namd(MakefilePackage, CudaPackage, ROCmPackage):
description="Enables Tcl and/or python interface",
)
variant("avxtiles", when="target=x86_64_v4:", default=False, description="Enable avxtiles")
variant(
"avxtiles",
when="target=x86_64_v4: @2.15:",
default=False,
description="Enable avxtiles supported with NAMD 2.15+",
)
variant("single_node_gpu", default=False, description="Single node GPU")
# Adding memopt variant to build memory-optimized mode that utilizes a compressed
# version of the molecular structure and also supports parallel I/O.
# Refer: https://www.ks.uiuc.edu/Research/namd/wiki/index.cgi?NamdMemoryReduction
variant(
"memopt",
when="@2.8:",
default=False,
description="Enable memory-optimized build supported with NAMD 2.8+",
)
# init_tcl_pointers() declaration and implementation are inconsistent
# "src/colvarproxy_namd.C", line 482: error: inherited member is not
# allowed
@@ -103,9 +118,13 @@ class Namd(MakefilePackage, CudaPackage, ROCmPackage):
depends_on("tcl", when="interface=python")
depends_on("python", when="interface=python")
conflicts("+avxtiles", when="@:2.14,3:", msg="AVXTiles algorithm requires NAMD 2.15")
conflicts("+rocm", when="+cuda", msg="NAMD supports only one GPU backend at a time")
conflicts("+single_node_gpu", when="~cuda~rocm")
conflicts(
"+memopt",
when="+single_node_gpu",
msg="memopt mode is not compatible with GPU-resident builds",
)
# https://www.ks.uiuc.edu/Research/namd/2.12/features.html
# https://www.ks.uiuc.edu/Research/namd/2.13/features.html
@@ -304,6 +323,9 @@ def edit(self, spec, prefix):
if "+single_node_gpu" in spec:
opts.extend(["--with-single-node-hip"])
if spec.satisfies("+memopt"):
opts.append("--with-memopt")
config = Executable("./config")
config(self.build_directory, *opts)
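The `edit()` excerpt above translates enabled variants into flags for NAMD's `./config` script. A minimal sketch of that mapping (the `config_opts` helper is hypothetical, not part of the package):

```python
def config_opts(variants):
    """Map enabled boolean variants to NAMD ./config flags,
    mirroring the edit() logic above (sketch only)."""
    opts = []
    if variants.get("single_node_gpu"):
        opts.append("--with-single-node-hip")
    if variants.get("memopt"):
        opts.append("--with-memopt")
    return opts

print(config_opts({"memopt": True, "single_node_gpu": False}))
# ['--with-memopt']
```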


@@ -5,16 +5,19 @@
import os
import spack.build_systems.cmake
import spack.build_systems.generic
from spack.package import *
class Nekrs(Package, CudaPackage, ROCmPackage):
class Nekrs(Package, CMakePackage, CudaPackage, ROCmPackage):
"""nekRS is an open-source Navier Stokes solver based on the spectral
element method targeting classical processors and hardware accelerators
like GPUs"""
homepage = "https://github.com/Nek5000/nekRS"
git = "https://github.com/Nek5000/nekRS.git"
url = "https://github.com/Nek5000/nekRS/archive/refs/tags/v23.0.tar.gz"
tags = [
"cfd",
@@ -32,6 +35,11 @@ class Nekrs(Package, CudaPackage, ROCmPackage):
license("BSD-3-Clause")
build_system(
conditional("cmake", when="@23.0:"), conditional("generic", when="@=21.0"), default="cmake"
)
version("23.0", sha256="2cb4ded69551b9614036e1a9d5ac54c8535826eae8f8b6a00ddb89043b2c392a")
version("21.0", tag="v21.0", commit="bcd890bf3f9fb4d91224c83aeda75c33570f1eaa")
depends_on("c", type="build") # generated
@@ -52,17 +60,35 @@ class Nekrs(Package, CudaPackage, ROCmPackage):
depends_on("git")
depends_on("cmake")
@run_before("install")
def fortran_check(self):
if not self.compiler.f77:
msg = "Cannot build NekRS without a Fortran 77 compiler."
raise RuntimeError(msg)
def patch(self):
with working_dir("scripts"):
# Make sure nekmpi wrapper uses srun when we know OpenMPI
# is not built with mpiexec
if self.spec.satisfies("^openmpi~legacylaunchers"):
filter_file(r"mpirun -np", "srun -n", "nrsmpi")
filter_file(r"mpirun -np", "srun -n", "nrspre")
filter_file(r"mpirun -np", "srun -n", "nrsbmpi")
# Following 4 methods are stolen from OCCA since we are using OCCA
# shipped with nekRS.
def setup_run_environment(self, env):
# The 'env' is included in the Spack generated module files.
spec = self.spec
env.set("OCCA_CXX", self.compiler.cxx)
cxxflags = spec.compiler_flags["cxxflags"]
if cxxflags:
# Run-time compiler flags:
env.set("OCCA_CXXFLAGS", " ".join(cxxflags))
if "+cuda" in spec:
cuda_dir = spec["cuda"].prefix
# Run-time CUDA compiler:
env.set("OCCA_CUDA_COMPILER", join_path(cuda_dir, "bin", "nvcc"))
class SetupEnvironment:
def _setup_runtime_flags(self, s_env):
spec = self.spec
s_env.set("OCCA_CXX", self.compiler.cxx)
s_env.set("OCCA_CXX", self.pkg.compiler.cxx)
cxxflags = spec.compiler_flags["cxxflags"]
if cxxflags:
@@ -111,26 +137,14 @@ def setup_build_environment(self, env):
env.set("OCCA_VERBOSE", "1")
self._setup_runtime_flags(env)
def setup_run_environment(self, env):
# The 'env' is included in the Spack generated module files.
self._setup_runtime_flags(env)
def setup_dependent_build_environment(self, env, dependent_spec):
# Export OCCA_* variables for everyone using this package from within
# Spack.
self._setup_runtime_flags(env)
def install(self, spec, prefix):
script_dir = "scripts"
with working_dir(script_dir):
# Make sure nekmpi wrapper uses srun when we know OpenMPI
# is not built with mpiexec
if "^openmpi~legacylaunchers" in spec:
filter_file(r"mpirun -np", "srun -n", "nrsmpi")
filter_file(r"mpirun -np", "srun -n", "nrspre")
filter_file(r"mpirun -np", "srun -n", "nrsbmpi")
class GenericBuilder(spack.build_systems.generic.GenericBuilder):
def install(self, pkg, spec, prefix):
makenrs = Executable(os.path.join(os.getcwd(), "makenrs"))
makenrs.add_default_env("NEKRS_INSTALL_DIR", prefix)
@@ -140,3 +154,17 @@ def install(self, spec, prefix):
makenrs.add_default_env("TRAVIS", "true")
makenrs(output=str, error=str, fail_on_error=True)
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
def cmake_args(self):
cxxflags = self.spec.compiler_flags["cxxflags"]
args = [
self.define("CMAKE_CXX_COMPILER", self.spec["mpi"].mpicxx),
self.define("NEKRS_COMPILER_FLAGS", cxxflags),
self.define("OCCA_CXXFLAGS", cxxflags),
self.define_from_variant("ENABLE_CUDA", "cuda"),
self.define_from_variant("ENABLE_OPENCL", "opencl"),
self.define_from_variant("ENABLE_HIP", "rocm"),
]
return args
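The `build_system(conditional(...))` declaration above dispatches between two builders by version. A rough approximation of that dispatch, assuming only the two release lines shown (Spack's real conditional logic handles full version ranges):

```python
def pick_build_system(version):
    """Approximate nekRS's conditional build_system(): releases from
    23.0 on use CMake, the pinned 21.0 git tag keeps the legacy
    generic 'makenrs' build. Sketch only."""
    major = int(version.split(".")[0])
    return "cmake" if major >= 23 else "generic"

print(pick_build_system("23.0"), pick_build_system("21.0"))
# cmake generic
```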


@@ -0,0 +1,11 @@
--- a/configure.ac
+++ b/configure.ac
@@ -24,6 +24,8 @@ AM_CONDITIONAL([HAVE_TREE_VECTORIZE], [test x"${tree_vectorize}" = x"true"])
AC_CONFIG_FILES([Makefile])
+AC_SEARCH_LIBS([__atomic_fetch_and_1], [atomic])
+
# GCC tries to be "helpful" and only issue a warning for unrecognized
# attributes. So we compile the test with Werror, so that if the
# attribute is not recognized the compilation fails


@@ -0,0 +1,14 @@
diff --git a/syscall.c b/syscall.c
index 63b3e53..5b354c4 100644
--- a/syscall.c
+++ b/syscall.c
@@ -141,7 +141,7 @@
#if !defined(__NR_set_mempolicy_home_node)
-#if defined(__x86_64__) || defined(__aarch64__)
+#if defined(__x86_64__) || defined(__aarch64__) || defined(__PPC64__)
#define __NR_set_mempolicy_home_node 450
#else
#error "Add syscalls for your architecture or update kernel headers"


@@ -16,6 +16,9 @@ class Numactl(AutotoolsPackage):
license("LGPL-2.1-only")
version("2.0.18", sha256="8cd6c13f3096e9c2293c1d732f56e2aa37a7ada1a98deed3fac7bd6da1aaaaf6")
version("2.0.17", sha256="af22829cda8b5bdee3d280e61291697bbd3f9bd372afdf119c9348b88369d40b")
version("2.0.16", sha256="a35c3bdb3efab5c65927e0de5703227760b1101f5e27ab741d8f32b3d5f0a44c")
version("2.0.14", sha256="1ee27abd07ff6ba140aaf9bc6379b37825e54496e01d6f7343330cf1a4487035")
version("2.0.12", sha256="7c3e819c2bdeb883de68bafe88776a01356f7ef565e75ba866c4b49a087c6bdf")
version("2.0.11", sha256="3e099a59b2c527bcdbddd34e1952ca87462d2cef4c93da9b0bc03f02903f7089")
@@ -25,8 +28,10 @@ class Numactl(AutotoolsPackage):
patch("numactl-2.0.11-sysmacros.patch", when="@2.0.11")
# https://github.com/numactl/numactl/issues/94
patch("numactl-2.0.14-symver.patch", when="@2.0.14")
patch("fix-empty-block.patch", when="@2.0.10:2.0.14")
patch("fix-empty-block.patch", when="@2.0.10:2.0.16")
patch("link-with-latomic-if-needed.patch", when="@2.0.14")
patch("link-with-latomic-if-needed-v2.0.16.patch", when="@2.0.16")
patch("numactl-2.0.18-syscall-NR-ppc64.patch", when="@2.0.18 target=ppc64le:")
depends_on("autoconf", type="build")
depends_on("automake", type="build")


@@ -17,9 +17,10 @@ class Onnx(CMakePackage):
url = "https://github.com/onnx/onnx/archive/refs/tags/v1.9.0.tar.gz"
git = "https://github.com/onnx/onnx.git"
license("Apache-2.0")
license("Apache-2.0", checked_by="wdconinc")
version("master", branch="master")
version("1.16.2", sha256="84fc1c3d6133417f8a13af6643ed50983c91dacde5ffba16cc8bb39b22c2acbb")
version("1.16.1", sha256="0e6aa2c0a59bb2d90858ad0040ea1807117cc2f05b97702170f18e6cd6b66fb3")
version("1.16.0", sha256="0ce153e26ce2c00afca01c331a447d86fbf21b166b640551fe04258b4acfc6a4")
version("1.15.0", sha256="c757132e018dd0dd171499ef74fca88b74c5430a20781ec53da19eb7f937ef68")
@@ -61,7 +62,7 @@ class Onnx(CMakePackage):
"1.1.0_2018-04-19", commit="7e1bed51cc508a25b22130de459830b5d5063c41"
) # py-torch@0.4.0
depends_on("cxx", type="build") # generated
depends_on("cxx", type="build")
generator("ninja")
depends_on("cmake@3.1:", type="build")
@@ -73,6 +74,9 @@ def patch(self):
filter_file("CMAKE_CXX_STANDARD 11", "CMAKE_CXX_STANDARD 14", "CMakeLists.txt")
def cmake_args(self):
# Try to get ONNX to use the same version of python as the spec is using
args = ["-DPY_VERSION={0}".format(self.spec["python"].version.up_to(2))]
args = [
# Try to get ONNX to use the same version of python as the spec is using
self.define("PY_VERSION", self.spec["python"].version.up_to(2)),
self.define("ONNX_BUILD_TESTS", self.run_tests),
]
return args
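The rewritten `cmake_args` above swaps manual `-D...` string formatting for `self.define(...)`. A rough stand-in for what such a helper emits (an assumption for illustration; Spack's actual `CMakePackage.define` also handles lists and other typed cache entries):

```python
def define(name, value):
    """Rough stand-in for Spack's self.define(): booleans become
    ON/OFF CMake cache entries, everything else is stringified."""
    if isinstance(value, bool):
        return f"-D{name}:BOOL={'ON' if value else 'OFF'}"
    return f"-D{name}={value}"

print(define("ONNX_BUILD_TESTS", False))
# -DONNX_BUILD_TESTS:BOOL=OFF
print(define("PY_VERSION", "3.11"))
# -DPY_VERSION=3.11
```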


@@ -0,0 +1,47 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PerlBioEnsemblIo(Package):
"""File parsing and writing code for Ensembl."""
homepage = "https://github.com/Ensembl/ensembl-io/"
url = "https://github.com/Ensembl/ensembl-io/archive/release/111.zip"
maintainers("teaguesterling")
license("APACHE-2.0", checked_by="teaguesterling")
for vers, sha in [
("112", "ccbffe7c15318075463db46be348655a5914762e05ff47da2d72a4c99414d39a"),
("111", "f81d4c1aea88aac7105aaa3fec548e39b79f129c7abc08b55be7d0345aa5482c"),
("110", "83cf00ecdb6184be480fc3cbf0ffc322d3e9411e14602396fda8d153345d6c2e"),
]:
version(vers, sha256=sha)
depends_on(f"perl-bio-ensembl@{vers}", when=f"@{vers}")
extends("perl")
variant("scripts", default=False, description="Install scripts")
depends_on("perl-bio-bigfile")
depends_on("perl-bio-db-hts")
depends_on("perl-bio-ensembl")
depends_on("perl-bioperl@1.6.924")
depends_on("perl-compress-bzip2")
depends_on("perl-json")
depends_on("perl-try-tiny")
depends_on("perl-uri")
depends_on("vcftools")
def install(self, spec, prefix):
install_tree("modules", prefix.lib.perl5)
mkdirp(prefix.share.ensembl)
for extra in ["scripts"]:
if spec.satisfies(f"+{extra}"):
extra = extra.replace("_", "-")
target = join_path(prefix.share.ensembl, extra)
install_tree(extra, target)


@@ -32,87 +32,81 @@ class PerlBioperl(PerlPackage):
and contribute your own if possible."""
homepage = "https://metacpan.org/pod/BioPerl"
url = "https://cpan.metacpan.org/authors/id/C/CD/CDRAUG/BioPerl-1.7.6.tar.gz"
url = "https://cpan.metacpan.org/authors/id/C/CJ/CJFIELDS/BioPerl-1.7.8.tar.gz"
license("Artistic-1.0")
version("1.7.8", sha256="c490a3be7715ea6e4305efd9710e5edab82dabc55fd786b6505b550a30d71738")
version(
"1.7.6",
sha256="df2a3efc991b9b5d7cc9d038a1452c6dac910c9ad2a0e47e408dd692c111688d",
preferred=True,
url="https://cpan.metacpan.org/authors/id/C/CD/CDRAUG/BioPerl-1.7.6.tar.gz",
)
version("1.6.924", sha256="616a7546bb3c58504de27304a0f6cb904e18b6bbcdb6a4ec8454f2bd37bb76d0")
# This is technically the same as 1.7.2, but with a more conventional version number.
version(
"1.007002",
sha256="17aa3aaab2f381bbcaffdc370002eaf28f2c341b538068d6586b2276a76464a1",
url="https://cpan.metacpan.org/authors/id/C/CJ/CJFIELDS/BioPerl-1.007002.tar.gz",
deprecated=True,
)
depends_on("fortran", type="build") # generated
with default_args(type=("build", "run")):
depends_on("perl-data-stag")
depends_on("perl-error")
depends_on("perl-graph")
depends_on("perl-http-message")
depends_on("perl-io-string")
depends_on("perl-io-stringy")
depends_on("perl-ipc-run")
depends_on("perl-libwww-perl")
depends_on("perl-libxml-perl")
depends_on("perl-list-moreutils")
depends_on("perl-module-build")
depends_on("perl-set-scalar")
depends_on("perl-test-most")
depends_on("perl-test-requiresinternet")
depends_on("perl-uri")
depends_on("perl-xml-dom")
depends_on("perl-xml-dom-xpath")
depends_on("perl-xml-libxml")
depends_on("perl-xml-parser")
depends_on("perl-xml-sax")
depends_on("perl-xml-sax-base")
depends_on("perl-xml-sax-writer")
depends_on("perl-xml-simple")
depends_on("perl-xml-twig")
depends_on("perl-yaml")
# According to cpandeps.grinnz.com Module-Build is both a build and run
# time dependency for BioPerl
depends_on("perl-module-build", type=("build", "run"))
depends_on("perl-uri", type=("build", "run"))
depends_on("perl-io-string", type=("build", "run"))
depends_on("perl-data-stag", type=("build", "run"))
depends_on("perl-test-most", type=("build", "run"))
depends_on("perl-error", when="@1.7.6:", type=("build", "run"))
depends_on("perl-graph", when="@1.7.6:", type=("build", "run"))
depends_on("perl-http-message", when="@1.7.6:", type=("build", "run"))
depends_on("perl-io-stringy", when="@1.7.6:", type=("build", "run"))
depends_on("perl-ipc-run", when="@1.7.6:", type=("build", "run"))
depends_on("perl-list-moreutils", when="@1.7.6:", type=("build", "run"))
depends_on("perl-set-scalar", when="@1.7.6:", type=("build", "run"))
depends_on("perl-test-requiresinternet", when="@1.7.6:", type=("build", "run"))
depends_on("perl-xml-dom", when="@1.7.6:", type=("build", "run"))
depends_on("perl-xml-dom-xpath", when="@1.7.6:", type=("build", "run"))
depends_on("perl-xml-libxml", when="@1.7.6:", type=("build", "run"))
depends_on("perl-xml-sax", when="@1.7.6:", type=("build", "run"))
depends_on("perl-xml-sax-base", when="@1.7.6:", type=("build", "run"))
depends_on("perl-xml-sax-writer", when="@1.7.6:", type=("build", "run"))
depends_on("perl-xml-twig", when="@1.7.6:", type=("build", "run"))
depends_on("perl-xml-writer", when="@1.7.6:", type=("build", "run"))
depends_on("perl-yaml", when="@1.7.6:", type=("build", "run"))
depends_on("perl-libwww-perl", when="@1.7.6:", type=("build", "run"))
depends_on("perl-libxml-perl", when="@1.7.6:", type=("build", "run"))
with when("@:1.7.0"):
depends_on("perl-clone")
depends_on("perl-db-file")
depends_on("perl-dbd-mysql")
depends_on("perl-dbd-pg")
depends_on("perl-dbd-sqlite")
depends_on("perl-dbi")
depends_on("perl-gd")
depends_on("perl-graphviz")
depends_on("perl-scalar-list-utils")
depends_on("perl-set-scalar")
depends_on("perl-svg")
@when("@1.007002")
def configure(self, spec, prefix):
# Overriding default configure method in order to cater to interactive
# Build.pl
self.build_method = "Build.PL"
self.build_executable = Executable(join_path(self.stage.source_path, "Build"))
# TODO:
# variant("optionaldeps", default=False, description="Add optional dependencies")
# with when("@:1.7.0+optionaldeps"):
# depends_on("perl-sort-naturally")
# depends_on("perl-test-harness")
# depends_on("perl-text-parsewords")
# depends_on("perl-algorithm-munkres")
# depends_on("perl-array-compare")
# depends_on("perl-bio-phylo")
# depends_on("perl-convert-binary-c")
# depends_on("perl-html-entities")
# depends_on("perl-html-headparser")
# depends_on("perl-html-tableextract")
# depends_on("perl-svg-graph")
# Config questions consist of:
# Do you want to run the Bio::DB::GFF or Bio::DB::SeqFeature::Store
# live database tests? y/n [n]
#
# Install [a]ll BioPerl scripts, [n]one, or choose groups
# [i]nteractively? [a]
#
# Do you want to run tests that require connection to servers across
# the internet (likely to cause some failures)? y/n [n]
#
# Eventually, someone can add capability for the other options, but
# the current answers are the most practical for a spack install.
config_answers = ["n\n", "a\n", "n\n"]
config_answers_filename = "spack-config.in"
with open(config_answers_filename, "w") as f:
f.writelines(config_answers)
with open(config_answers_filename, "r") as f:
perl("Build.PL", "--install_base=%s" % self.prefix, input=f)
# Need to also override the build and install methods to make sure that the
# Build script is run through perl and not use the shebang, as it might be
# too long. This is needed because this does not pick up the
# `@run_after(configure)` step defined in `PerlPackage`.
@when("@1.007002")
def build(self, spec, prefix):
perl("Build")
@when("@1.007002")
def install(self, spec, prefix):
perl("Build", "install")
def configure_args(self):
args = ["--accept=1"]
return args
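The 1.007002 path above feeds pre-baked answers to an interactive `Build.PL` through a file on stdin. A self-contained sketch of that piping pattern, using a hypothetical stand-in script in place of the real `Build.PL`:

```python
import os
import subprocess
import sys
import tempfile

# Hypothetical stand-in for BioPerl's interactive Build.PL: a script
# that reads three answers from stdin, like the prompts listed above.
fake_build_pl = "print(','.join(input() for _ in range(3)))"

# Same pre-baked answers as config_answers above: no live DB tests,
# install [a]ll scripts, no internet-dependent tests.
with tempfile.NamedTemporaryFile("w", suffix=".in", delete=False) as f:
    f.writelines(["n\n", "a\n", "n\n"])
    answers_file = f.name

# Reopen the answers file and hand it to the child process as stdin,
# just as the package does with perl("Build.PL", ..., input=f).
with open(answers_file) as fin:
    result = subprocess.run(
        [sys.executable, "-c", fake_build_pl],
        stdin=fin, capture_output=True, text=True,
    )
os.unlink(answers_file)
print(result.stdout.strip())
# n,a,n
```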


@@ -18,6 +18,7 @@ class PerlGraphviz(PerlPackage):
version("2.26", sha256="9a5d2520b3262bf30475272dd764a445f8e7f931bef88be0e3d3bff445da7328")
depends_on("graphviz", type=("build", "run", "test"))
depends_on("perl-file-which@1.09:", type=("build", "run", "test"))
depends_on("perl-ipc-run@0.6:", type=("build", "run", "test"))
depends_on("perl-libwww-perl", type=("build", "run", "test"))


@@ -126,6 +126,8 @@ class Pika(CMakePackage, CudaPackage, ROCmPackage):
# https://github.com/pika-org/pika/issues/686
conflicts("^fmt@10:", when="@:0.15 +cuda")
conflicts("^fmt@10:", when="@:0.15 +rocm")
# https://github.com/pika-org/pika/pull/1074
conflicts("^fmt@11:", when="@:0.23")
depends_on("spdlog@1.9.2:", when="@0.25:")
depends_on("hwloc@1.11.5:")
# https://github.com/pika-org/pika/issues/1223


@@ -120,6 +120,22 @@ class Protobuf(CMakePackage):
patch("msvc-abseil-target-namespace.patch", when="@3.22 %msvc")
# Missing #include "absl/container/internal/layout.h"
# See https://github.com/protocolbuffers/protobuf/pull/14042
patch(
"https://github.com/protocolbuffers/protobuf/commit/e052928c94f5a9a6a6cbdb82e09ab4ee92b7815f.patch?full_index=1",
when="@3.22:3.24.3 ^abseil-cpp@20240116:",
sha256="20e3cc99a9513b256e219653abe1bfc7d6b6a5413e269676e3d442830f99a1af",
)
# Missing #include "absl/strings/str_cat.h"
# See https://github.com/protocolbuffers/protobuf/pull/14054
patch(
"https://github.com/protocolbuffers/protobuf/commit/38a24729ec94e6576a1425951c898ad0b91ad2d2.patch?full_index=1",
when="@3.22:3.24.3 ^abseil-cpp@20240116:",
sha256="c061356db31cdce29c8cdd98a3a8219ef048ebc2318d0dec26c1f2c5e5dae29b",
)
def fetch_remote_versions(self, *args, **kwargs):
"""Ignore additional source artifacts uploaded with releases,
only keep known versions


@@ -0,0 +1,42 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyFluiddyn(PythonPackage):
"""Framework for studying fluid dynamics."""
pypi = "fluiddyn/fluiddyn-0.6.5.tar.gz"
maintainers("paugier")
license("CECILL-B", checked_by="paugier")
version("0.6.5", sha256="ad0df4c05855bd2ae702731983d310bfbb13802874ce83e2da6454bb7100b5df")
version("0.6.4", sha256="576eb0fa50012552b3a68dd17e81ce4f08ddf1e276812b02316016bb1c3a1342")
version("0.6.3", sha256="3c4c57ac8e48c55498aeafaf8b26daecefc03e6ac6e2c03a591e0f7fec13bb69")
version("0.6.2", sha256="40f772cfdf111797ae1c6cf7b67272207f2bc7c4f599085634cc1d74eb748ee5")
version("0.6.1", sha256="af75ed3adfaaa0f0d82822619ced2f9e0611ad15351c9cdbc1d802d67249c3de")
version("0.6.0", sha256="47ad53b3723487d3711ec4ea16bca2d7c270b5c5c5a0255f7684558d7397850e")
depends_on("python@3.9:", type=("build", "run"))
depends_on("py-pdm-backend", type="build")
with default_args(type="run"):
depends_on("py-numpy")
depends_on("py-matplotlib")
depends_on("py-h5py")
depends_on("py-h5netcdf")
depends_on("py-distro")
depends_on("py-simpleeval@0.9.13:")
depends_on("py-psutil@5.2.1:")
depends_on("py-ipython")
depends_on("py-scipy")
with default_args(type="test"):
depends_on("py-pytest")
depends_on("py-pytest-allclose")
depends_on("py-pytest-mock")


@@ -19,8 +19,9 @@ class PyOnnx(PythonPackage):
homepage = "https://github.com/onnx/onnx"
pypi = "Onnx/onnx-1.6.0.tar.gz"
license("Apache-2.0")
license("Apache-2.0", checked_by="wdconinc")
version("1.16.2", sha256="b33a282b038813c4b69e73ea65c2909768e8dd6cc10619b70632335daf094646")
version("1.16.1", sha256="8299193f0f2a3849bfc069641aa8e4f93696602da8d165632af8ee48ec7556b6")
version("1.16.0", sha256="237c6987c6c59d9f44b6136f5819af79574f8d96a760a1fa843bede11f3822f7")
version("1.15.0", sha256="b18461a7d38f286618ca2a6e78062a2a9c634ce498e631e708a8041b00094825")
@@ -59,6 +60,8 @@ class PyOnnx(PythonPackage):
depends_on("py-numpy", type=("build", "run"))
depends_on("py-numpy@1.16.6:", type=("build", "run"), when="@1.8.1:1.13")
depends_on("py-numpy@1.20:", type=("build", "run"), when="@1.16.0:")
depends_on("py-numpy@1.21:", type=("build", "run"), when="@1.16.2:")
depends_on("py-numpy@:1", type=("build", "run"), when="@:1.16")
# Historical dependencies
depends_on("py-six", type=("build", "run"), when="@:1.8.1")


@@ -0,0 +1,35 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyPoxy(PythonPackage):
"""Documentation generator for C++"""
homepage = "https://github.com/marzer/poxy"
pypi = "poxy/poxy-0.18.0.tar.gz"
license("MIT", checked_by="pranav-sivaraman")
version("0.18.0", sha256="f5da8ff04ec08859bfd1c8ec6ef61b70e3af630915a4cce6a3e377eec3bcd3d4")
depends_on("py-setuptools", type="build")
with default_args(type=("build", "run")):
depends_on("python@3.7:")
depends_on("py-misk@0.8.1:")
depends_on("py-beautifulsoup4")
depends_on("py-jinja2")
depends_on("py-pygments")
depends_on("py-html5lib")
depends_on("py-lxml")
depends_on("py-tomli")
depends_on("py-schema")
depends_on("py-requests")
depends_on("py-trieregex")
depends_on("py-colorama")
conflicts("py-schema@=0.7.5")


@@ -230,12 +230,12 @@ class PyTorch(PythonPackage, CudaPackage, ROCmPackage):
depends_on("pthreadpool@2020-10-05", when="@1.8")
depends_on("pthreadpool@2020-06-15", when="@1.6:1.7")
with default_args(type=("build", "link", "run")):
depends_on("py-pybind11@2.12.0", when="@2.3:")
depends_on("py-pybind11@2.11.0", when="@2.1:2.2")
depends_on("py-pybind11@2.10.1", when="@2.0")
depends_on("py-pybind11@2.10.0", when="@1.13:1")
depends_on("py-pybind11@2.6.2", when="@1.8:1.12")
depends_on("py-pybind11@2.3.0", when="@:1.7")
depends_on("py-pybind11@2.12.0:", when="@2.3:")
depends_on("py-pybind11@2.11.0:", when="@2.1:2.2")
depends_on("py-pybind11@2.10.1:", when="@2.0")
depends_on("py-pybind11@2.10.0:", when="@1.13:1")
depends_on("py-pybind11@2.6.2:", when="@1.8:1.12")
depends_on("py-pybind11@2.3.0:", when="@:1.7")
depends_on("sleef@3.6.0_2024-03-20", when="@2.4:")
depends_on("sleef@3.5.1_2020-12-22", when="@1.8:2.3")
depends_on("sleef@3.4.0_2019-07-30", when="@1.6:1.7")
@@ -526,6 +526,7 @@ def enable_or_disable(variant, keyword="USE", var=None):
enable_or_disable("cuda")
if "+cuda" in self.spec:
env.set("CUDA_TOOLKIT_ROOT_DIR", self.spec["cuda"].prefix) # Linux/macOS
env.set("CUDA_HOME", self.spec["cuda"].prefix) # Linux/macOS
env.set("CUDA_PATH", self.spec["cuda"].prefix) # Windows
self.torch_cuda_arch_list(env)
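The hunk above exports the CUDA prefix under three names because different build scripts probe different variables. A sketch of why all three are set (the probe order here is illustrative, not CMake's or PyTorch's actual discovery logic):

```python
import os

def find_cuda_root(environ=None):
    """Illustrative CUDA-root discovery: try each of the variable
    names exported above, in an assumed order, and return the first
    one that is set."""
    environ = os.environ if environ is None else environ
    for var in ("CUDA_HOME", "CUDA_TOOLKIT_ROOT_DIR", "CUDA_PATH"):
        if environ.get(var):
            return environ[var]
    return None

env = {"CUDA_PATH": r"C:\cuda", "CUDA_HOME": "/usr/local/cuda"}
print(find_cuda_root(env))
# /usr/local/cuda
```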


@@ -0,0 +1,28 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyTransonic(PythonPackage):
"""Make your Python code fly at transonic speeds!"""
pypi = "transonic/transonic-0.7.2.tar.gz"
maintainers("paugier")
license("BSD-3-Clause", checked_by="paugier")
version("0.7.2", sha256="d0c39c13b535df4f121a8a378efc42e3d3bf4e49536d131e6d26e9fe7d5a5bf4")
version("0.7.1", sha256="dcc59f1936d09129c800629cd4e6812571a74afe40dadd8193940b545e6ef03e")
depends_on("python@3.9:", type=("build", "run"))
depends_on("py-pdm-backend", type="build")
with default_args(type="run"):
depends_on("py-numpy")
depends_on("py-beniget@0.4")
depends_on("py-gast@0.5")
depends_on("py-autopep8")


@@ -16,8 +16,9 @@ class PyVector(PythonPackage):
tags = ["hep"]
license("BSD-3-Clause")
license("BSD-3-Clause", checked_by="wdconinc")
version("1.5.1", sha256="41ec731fb67ea35af2075eb3a4d6c83ef93b580dade63010821cbc00f1b98961")
version("1.5.0", sha256="77e48bd40b7e7d30a17bf79bb6ed0f2d6985d915fcb9bf0879836276a619a0a9")
version("1.4.2", sha256="3805848eb9e53e9c60aa24dd5be88c842a6cd3d241e22984bfe12629b08536a9")
version("1.4.1", sha256="15aef8911560db1ea3ffa9dbd5414d0ec575a504a2c3f23ea45170a18994466e")
@@ -43,7 +44,7 @@ class PyVector(PythonPackage):
depends_on("py-setuptools@42:", type="build")
depends_on("py-setuptools-scm@3.4: +toml", type="build")
depends_on("py-wheel", type="build")
depends_on("py-numpy@1.13.3:2.0", type=("build", "run"))
depends_on("py-numpy@1.13.3:", type=("build", "run"))
depends_on("py-packaging@19.0:", type=("build", "run"))
depends_on("py-importlib-metadata@0.22:", type=("build", "run"), when="@:1.0 ^python@:3.7")
depends_on("py-typing-extensions", type=("build", "run"), when="@:1.0 ^python@:3.7")
@@ -51,3 +52,6 @@ class PyVector(PythonPackage):
with when("+awkward"):
depends_on("py-awkward@1.2:", type=("build", "run"))
depends_on("py-awkward@2:", type=("build", "run"), when="@1.5:")
# Historical dependencies
depends_on("py-numpy@:2.0", type=("build", "run"), when="@:1.5.0")


@@ -10,14 +10,14 @@ class PyYour(PythonPackage):
"""Python library to read and process pulsar data in several different formats"""
homepage = "https://github.com/thepetabyteproject/your"
# the PyPI tarball is missing requirements.txt, so use the GitHub tarball
url = "https://github.com/thepetabyteproject/your/archive/refs/tags/0.6.7.tar.gz"
git = "https://github.com/thepetabyteproject/your.git"
maintainers("aweaver1fandm")
license("GPL-3.0")
version("main", branch="main", preferred=True)
version("0.6.7", sha256="f2124a630d413621cce067805feb6b9c21c5c8938f41188bd89684968478d814")
depends_on("python@3.8:", type=("build", "run"))
@@ -25,7 +25,9 @@ class PyYour(PythonPackage):
depends_on("py-astropy@6.1.0:", type=("build", "run"))
depends_on("py-matplotlib@3.2.1:", type=("build", "run"))
depends_on("py-numpy@1.18.4:", type=("build", "run"))
depends_on("py-numpy@1.18.4:1.23.5", when="@0.6.7", type=("build", "run"))
depends_on("py-numpy@1.18.4:", when="@main", type=("build", "run"))
depends_on("py-h5py@2.10:", type=("build", "run"))
depends_on("py-scikit-image@0.14.2:", type=("build", "run"))
depends_on("py-scipy@1.3:", type=("build", "run"))


@@ -297,8 +297,3 @@ def setup_dependent_package(self, module, dependent_spec):
# Add variable for library directory
module.r_lib_dir = join_path(dependent_spec.prefix, self.r_lib_dir)
# Make the site packages directory for extensions, if it does not exist
# already.
if dependent_spec.package.is_extension:
mkdirp(module.r_lib_dir)


@@ -21,6 +21,7 @@ class Rccl(CMakePackage):
maintainers("srekolam", "renjithravindrankannath")
libraries = ["librccl"]
version("6.2.0", sha256="a29c94ea3b9c1a0121d7b1450cb01a697f9f9132169632312b9b0bf744d3c0e3")
version("6.1.2", sha256="98af99c12d800f5439c7740d797162c35810a25e08e3b11b397d3300d3c0148e")
version("6.1.1", sha256="6368275059ba190d554535d5aeaa5c2510d944b56efd85c90a1701d0292a14c5")
version("6.1.0", sha256="c6308f6883cbd63dceadbe4ee154cc6fa9e6bdccbd2f0fda295b564b0cf01e9a")
@@ -73,6 +74,7 @@ class Rccl(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-cmake@{ver}:", type="build", when=f"@{ver}")
depends_on(f"hip@{ver}", when=f"@{ver}")
@@ -91,6 +93,7 @@ class Rccl(CMakePackage):
"6.0.2",
"6.1.0",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")


@@ -27,6 +27,7 @@ def url_for_version(self, version):
return url.format(version)
license("MIT")
version("6.2.0", sha256="dd12428426a4963d6eb3cfdd818acef7a3c4cddf32504df17f4c1004fa902bef")
version("6.1.2", sha256="5553b76d4c8b6381d236197613720587377d03d4fd43a5a20bb6a716d49f7dfc")
version("6.1.1", sha256="c133ebd20bf42e543d13c5b84ea420a7f7c069c77b1d6dcae9680de924e5f539")
version("6.1.0", sha256="a8ad5d880645c9e95c9c90b0c9026627b22467e3e879525fff38ccd924f36c39")
@@ -50,7 +51,8 @@ def url_for_version(self, version):
depends_on("grpc@1.28.1+shared", type="build", when="@:5.3")
depends_on("grpc@1.44.0+shared", when="@5.4.0:5.4")
depends_on("grpc@1.55.0+shared", when="@5.5.0:6.0")
-depends_on("grpc@1.59.1+shared", when="@6.1:")
+depends_on("grpc@1.59.1+shared", when="@6.1")
+depends_on("grpc@1.61.2+shared", when="@6.2:")
depends_on("protobuf")
depends_on("libcap")
@@ -70,6 +72,7 @@ def url_for_version(self, version):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-smi-lib@{ver}", type=("build", "link"), when=f"@{ver}")
depends_on(f"hsa-rocr-dev@{ver}", when=f"@{ver}")
@@ -86,8 +89,10 @@ def url_for_version(self, version):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")
depends_on("amdsmi@6.2.0", when="@6.2.0")
def patch(self):
filter_file(r"\${ROCM_DIR}/rocm_smi", "${ROCM_SMI_DIR}", "CMakeLists.txt")


@@ -27,6 +27,7 @@ class Rocalution(CMakePackage):
license("MIT")
version("6.2.0", sha256="fd9ad0aae5524d3995343d4d7c1948e7b21f0bdf5b1203d1de58548a814a9c39")
version("6.1.2", sha256="5f9fb302ab1951a1caf54ed31b41d6f41a353dd4b5ee32bc3de2e9f9244dd4ef")
version("6.1.1", sha256="1f80b33813291c2e81e5b1efc325d3f5bb6592c8670c016930d01e73e74ab46b")
version("6.1.0", sha256="699a9b73844fcd4e30d0607b4042dc779f9bcdc27ad732e7a038968ff555af2b")
@@ -78,6 +79,7 @@ class Rocalution(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"hip@{ver}", when=f"@{ver}")
depends_on(f"rocprim@{ver}", when=f"@{ver}")


@@ -23,6 +23,7 @@ class Rocblas(CMakePackage):
version("develop", branch="develop")
version("master", branch="master")
version("6.2.0", sha256="184e9b39dcbed57c25f351b047d44c613f8a2bbab3314a20c335f024a12ad4e5")
version("6.1.2", sha256="1e83918bd7b28ec9ee292c6fb7eb0fc5f4db2d5d831a9a3db541f14a90c20a1a")
version("6.1.1", sha256="c920742fb8f45512c360cdb40e37d0ac767f042e52f1981264853dab5ec2c876")
version("6.1.0", sha256="af00357909da60d82618038aa9a3cc1f9d4ce1bdfb54db20ec746b592d478edf")
@@ -69,7 +70,18 @@ class Rocblas(CMakePackage):
depends_on("googletest@1.10.0:", type="test")
depends_on("amdblis", type="test")
-for ver in ["5.6.0", "5.6.1", "5.7.0", "5.7.1", "6.0.0", "6.0.2", "6.1.0", "6.1.1", "6.1.2"]:
+for ver in [
+    "5.6.0",
+    "5.6.1",
+    "5.7.0",
+    "5.7.1",
+    "6.0.0",
+    "6.0.2",
+    "6.1.0",
+    "6.1.1",
+    "6.1.2",
+    "6.2.0",
+]:
depends_on(f"rocm-openmp-extras@{ver}", type="test", when=f"@{ver}")
depends_on("rocm-cmake@master", type="build", when="@master:")
@@ -90,6 +102,7 @@ class Rocblas(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"hip@{ver}", when=f"@{ver}")
depends_on(f"llvm-amdgpu@{ver}", type="build", when=f"@{ver}")
@@ -126,6 +139,7 @@ class Rocblas(CMakePackage):
("@6.1.0", "2b55ccf58712f67b3df0ca53b0445f094fcb96b2"),
("@6.1.1", "2b55ccf58712f67b3df0ca53b0445f094fcb96b2"),
("@6.1.2", "2b55ccf58712f67b3df0ca53b0445f094fcb96b2"),
("@6.2.0", "dbc2062dced66e4cbee8e0591d76e0a1588a4c70"),
]:
resource(
name="Tensile",
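The rocblas hunk above pins each release to a specific Tensile commit fetched as a `resource`. A small sketch of that mapping (commits copied from the hunk; the lookup helper is hypothetical, not part of the package):

```python
# rocBLAS version -> Tensile commit, as listed in the hunk above.
# The 6.1.x releases share one commit; 6.2.0 adds a new one.
TENSILE_COMMITS = {
    "6.1.0": "2b55ccf58712f67b3df0ca53b0445f094fcb96b2",
    "6.1.1": "2b55ccf58712f67b3df0ca53b0445f094fcb96b2",
    "6.1.2": "2b55ccf58712f67b3df0ca53b0445f094fcb96b2",
    "6.2.0": "dbc2062dced66e4cbee8e0591d76e0a1588a4c70",  # added in this change
}

def tensile_commit(version):
    """Look up the Tensile commit pinned for a given rocBLAS release."""
    return TENSILE_COMMITS[version]
```

Pinning an exact commit (rather than a branch) keeps the resource reproducible across rebuilds of the same rocBLAS version.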


@@ -20,6 +20,7 @@ class Rocfft(CMakePackage):
libraries = ["librocfft"]
license("MIT")
version("6.2.0", sha256="c9886ec2c713c502dcde4f5fed3d6e1a7dd019023fb07e82d3b622e66c6f2c36")
version("6.1.2", sha256="6f54609b0ecb8ceae8b7acd4c8692514c2c2dbaf0f8b199fe990fd4711428193")
version("6.1.1", sha256="d517a931d49a1e59df4e494ab2b68e301fe7ebf39723863985567467f111111c")
version("6.1.0", sha256="9e6643174a2b0f376127f43454e78d4feba6fac695d4cda9796da50005ecac66")
@@ -86,6 +87,7 @@ class Rocfft(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"hip@{ver}", when=f"@{ver}")
depends_on(f"rocm-cmake@{ver}:", type="build", when=f"@{ver}")


@@ -18,6 +18,7 @@ class RocmBandwidthTest(CMakePackage):
maintainers("srekolam", "renjithravindrankannath")
version("master", branch="master")
version("6.2.0", sha256="ca4caa4470c7ad0f1a4963072c1a25b0fd243844a72b26c83fcbca1e82091a41")
version("6.1.2", sha256="4259d53350d6731613d36c03593750547f84f084569f8017783947486b8189da")
version("6.1.1", sha256="01da756228f2bfb5e25ddb74b75a5939693b1b4f4559f37cfc85729e36a98450")
version("6.1.0", sha256="b06522efbd1a55247412c8f535321058e2463eab4abd25505c37e8c67941ae26")
@@ -55,6 +56,7 @@ class RocmBandwidthTest(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
"master",
]:
depends_on(f"hsa-rocr-dev@{ver}", when=f"@{ver}")
@@ -72,6 +74,7 @@ class RocmBandwidthTest(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")


@@ -21,6 +21,7 @@ class RocmCmake(CMakePackage):
license("MIT")
version("master", branch="master")
version("6.2.0", sha256="7b6aaa1bb616669636aa2cd5dbc7fdb7cd05642a8dcc61138e0efb7d0dc7e1a3")
version("6.1.2", sha256="0757bb90f25d6f1e6bc93bdd1e238f76bbaddf154d66f94f37e40c425dc6d259")
version("6.1.1", sha256="0eb81245f7573a3cadf9e91a854d9a0a014ce93610e4e7ea4d8309867a470bf6")
version("6.1.0", sha256="8b37d458e801b486521f12d18ca2103125173dd0f1130d37c8c36e795d34772b")
@@ -54,6 +55,7 @@ class RocmCmake(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")


@@ -20,6 +20,7 @@ class RocmCore(CMakePackage):
libraries = ["librocm-core"]
license("MIT")
version("6.2.0", sha256="9bafaf801721e98b398624c8d2fa78618d297d6800f96113e26c275889205526")
version("6.1.2", sha256="ce9cbe12977f2058564ecb4cdcef4fd0d7880f6eff8591630f542441092f4fa3")
version("6.1.1", sha256="a27bebdd1ba9d387f33b82a67f64c55cb565b482fe5017d5b5726d68da1ab839")
version("6.1.0", sha256="9dfe542d1647c42993b06f594c316dad63ba6d6fb2a7398bd72c5768fd1d7b5b")


@@ -25,6 +25,7 @@ class RocmDbgapi(CMakePackage):
license("MIT")
version("master", branch="amd-master")
version("6.2.0", sha256="311811ce0970ee83206791c21d539f351ddeac56ce3ff7efbefc830038748c0c")
version("6.1.2", sha256="6e55839e3d95c2cfe3ff89e3e31da77aeecc74012a17f5308589e8808df78026")
version("6.1.1", sha256="425a6cf6a3942c2854c1f5e7717bed906cf6c3753b46c44476f54bfef6188dac")
version("6.1.0", sha256="0985405b6fd44667a7ce8914aa39a7e651613e037e649fbdbfa2adcf744a2d50")
@@ -68,6 +69,7 @@ class RocmDbgapi(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
"master",
]:
depends_on(f"hsa-rocr-dev@{ver}", type="build", when=f"@{ver}")
@@ -85,6 +87,7 @@ class RocmDbgapi(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")


@@ -18,6 +18,7 @@ class RocmDebugAgent(CMakePackage):
maintainers("srekolam", "renjithravindrankannath")
libraries = ["librocm-debug-agent"]
version("6.2.0", sha256="a4b839c47b8a1cd8d00c3577eeeea04d3661210eb8124e221d88bcbedc742363")
version("6.1.2", sha256="c7cb779915a3d61e39d92cef172997bcf5eae720308f6d9c363a2cbc71b5621c")
version("6.1.1", sha256="c631281b346bab9ec3607c59404f548f7cba084a05e9c9ceb3c3579c48361ad1")
version("6.1.0", sha256="f52700563e490d662b505693d485272d73521aabff306107586dd1149fb4a70e")
@@ -62,6 +63,7 @@ class RocmDebugAgent(CMakePackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"hsa-rocr-dev@{ver}", when=f"@{ver}")
depends_on(f"hsakmt-roct@{ver}", when=f"@{ver}")
@@ -79,6 +81,7 @@ class RocmDebugAgent(CMakePackage):
"6.0.2",
"6.1.0",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")


@@ -25,6 +25,7 @@ def url_for_version(self, version):
maintainers("srekolam", "renjithravindrankannath", "haampie")
version("master", branch="amd-stg-open")
version("6.2.0", sha256="12ce17dc920ec6dac0c5484159b3eec00276e4a5b301ab1250488db3b2852200")
version("6.1.2", sha256="300e9d6a137dcd91b18d5809a316fddb615e0e7f982dc7ef1bb56876dff6e097")
version("6.1.1", sha256="f1a67efb49f76a9b262e9735d3f75ad21e3bd6a05338c9b15c01e6c625c4460d")
version("6.1.0", sha256="6bd9912441de6caf6b26d1323e1c899ecd14ff2431874a2f5883d3bc5212db34")
@@ -73,6 +74,7 @@ def url_for_version(self, version):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
"master",
]:
depends_on(f"llvm-amdgpu@{ver}", when=f"@{ver}")
@@ -89,6 +91,7 @@ def url_for_version(self, version):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")


@@ -18,6 +18,7 @@ class RocmGdb(AutotoolsPackage):
license("LGPL-2.0-or-later")
maintainers("srekolam", "renjithravindrankannath")
version("6.2.0", sha256="753fd4f34d49fb0297b01dca2dd7cdf12cd039caa622a5f2d153362d27a8659c")
version("6.1.2", sha256="19208de18d503e1da79dc0c9085221072a68e299f110dc836204364fa1b532cc")
version("6.1.1", sha256="3d982abc130a286d227948aca5783f2e4507ef4275be21dad0914e37217ba19e")
version("6.1.0", sha256="e90d855ca4c1478acf143d45ff0811e7ecd068711db155de6d5f3593cdef6230")
@@ -67,6 +68,7 @@ class RocmGdb(AutotoolsPackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-dbgapi@{ver}", type="link", when=f"@{ver}")
depends_on(f"comgr@{ver}", type="link", when=f"@{ver}")
@@ -83,6 +85,7 @@ class RocmGdb(AutotoolsPackage):
"6.1.0",
"6.1.1",
"6.1.2",
"6.2.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")

Some files were not shown because too many files have changed in this diff.