Compare commits

..

47 Commits

Author SHA1 Message Date
Adrien M. BERNEDE
0301896bf1 Apply fix suggested by @becker33 to copy over relatives includes at env creation 2025-01-21 11:43:17 +01:00
Adrien M. BERNEDE
d6334f7d39 Revert "Test a change @becker33 suggested to work around issue"
This reverts commit ca65fff09a.
2025-01-21 11:42:09 +01:00
Adrien M. BERNEDE
ca65fff09a Test a change @becker33 suggested to work around issue 2025-01-15 10:40:26 +01:00
Mosè Giordano
70534ac9d4 r-deseq2: add new versions (#48157) 2025-01-10 14:36:53 -08:00
Chris Marsh
b369d8b250 py-netcdf4: add v1.7.2 and fix variant (#47824) 2025-01-10 21:52:42 +01:00
Mosè Giordano
4d2319a785 r-sparsematrixstats: add new versions (#48154)
Also, specify incompatibility with `r-matrixstats` versions.
2025-01-10 10:50:04 -08:00
Dom Heinzeller
d6a9511f39 py-cylc-flow package: update package and dependents/dependencies
* Add py-protobuf@4.24.4 (needed for py-cylc-flow@8.3.6).
* Add py-cylc-flow@8.3.6 and enable png output when creating graphs
  by requesting variant pangocairo for graphviz dependency.
* Add corresponding versions of py-metomi-rose@2.3.2,
  py-cylc-rose@1.4.2, py-cylc-uiserver@1.4.2.
* Add myself as maintainer to all the cylc-related packages.
2025-01-10 10:12:41 -08:00
Massimiliano Culpo
dd69b646ad Add subscript notation to packages (#48467)
This PR allows using the subscript notation directly on packages. The
intent is to reduce the boilerplate needed to retrieve package
properties from nodes other than root.
2025-01-10 19:00:51 +01:00
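Note: a minimal standalone sketch of the pattern this PR enables. The Spec and Package classes below are simplified stand-ins for illustration only; in the changeset itself, PackageBase.__getitem__ simply returns self.spec[key].package (see the package_base.py and test diffs further down).

# Illustrative sketch only -- simplified stand-ins, not Spack's classes.
class Spec:
    def __init__(self, name, deps=()):
        self.name = name
        self.package = Package(self)
        self._deps = {d.name: d for d in deps}

    def __getitem__(self, key):
        # Look up a dependency spec by name.
        return self._deps[key]


class Package:
    def __init__(self, spec):
        self.spec = spec

    # The changeset adds an equivalent __getitem__ to PackageBase so that
    # pkg["dep"] is shorthand for pkg.spec["dep"].package.
    def __getitem__(self, key):
        return self.spec[key].package


mpi = Spec("mpi")
root = Spec("mpileaks", deps=[mpi])
assert root.package["mpi"] is mpi.package  # subscript returns a package object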
Niklas Bölter
b670205e54 py-nvitop: new package (#48426)
* Adding nvitop package

* [@spackbot] updating style on behalf of nboelte

* Adding dependency constraints and types

* Adding maintainer

* Adding python version dependency

* Style fix

* Commented out unused platform=windows dependency
2025-01-10 09:18:12 -08:00
Harmen Stoppels
d6d8800466 executable.py: fix overlapping overload set (#48503) 2025-01-10 16:34:04 +00:00
Rocco Meli
7a32954f7f dftd4: add v3.6.0 and v3.7.0, update URL (#48500) 2025-01-10 16:13:15 +01:00
Massimiliano Culpo
92564ecd42 palace: add a dependency on c (#48315) 2025-01-10 15:15:57 +01:00
Massimiliano Culpo
c1258a1431 chai: add dependency on C (#48316) 2025-01-10 15:15:14 +01:00
Harmen Stoppels
d46ac9b1e4 spec.py: fix return type of concretized() (#48504) 2025-01-10 13:31:41 +00:00
Massimiliano Culpo
2e472a13e5 llvm: minimal fix for llvm_config method (#48501) 2025-01-10 11:36:07 +01:00
Harmen Stoppels
7edb525599 binary_distribution: improve deps_to_relocate (#48484) 2025-01-10 11:32:11 +01:00
Harmen Stoppels
93cd216603 binary_distribution: stop relocating tarballs with relative rpaths (#48488) 2025-01-10 11:30:18 +01:00
Harmen Stoppels
c1d385ada2 traverse: use overload for return type when depth=True vs depth=False (#48482) 2025-01-10 09:53:28 +01:00
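Note: a standalone sketch of the typing pattern this change applies: typing.overload plus Literal, so the declared return type depends on whether depth=True or depth=False is passed. The names below (traverse, Node) are simplified stand-ins, not the actual spack.traverse signatures.

# Illustrative sketch only -- not the spack.traverse API.
from typing import Iterable, List, Literal, Tuple, Union, overload

Node = str  # stand-in for spack.spec.Spec


@overload
def traverse(nodes: List[Node], depth: Literal[False] = False) -> Iterable[Node]: ...
@overload
def traverse(nodes: List[Node], depth: Literal[True]) -> Iterable[Tuple[int, Node]]: ...


def traverse(
    nodes: List[Node], depth: bool = False
) -> Iterable[Union[Node, Tuple[int, Node]]]:
    # Toy traversal: yield each node, optionally paired with a fake depth of 0.
    for n in nodes:
        yield (0, n) if depth else n


# A type checker now infers Node for the first call and (int, Node) for the second.
names = list(traverse(["a", "b"]))
pairs = list(traverse(["a", "b"], depth=True))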
Dom Heinzeller
464390962f Comment out py-arrow@1.3.0 because of build errors with basic gcc stack (#48478)
* Comment out py-arrow@1.3.0 because of build errors with basic gcc stack
* Also comment out 1.3.0-specific depends_on lines
2025-01-09 19:43:47 -07:00
Wouter Deconinck
16734cd8c6 mlpack: add v4.5.1 (#48334)
* mlpack: add v4.5.1
* mlpack: variant +go when=@4.5.1:
2025-01-09 18:40:49 -08:00
Alec Scott
1dd9eeb0c6 rust: fix dependency version constraint (#48445) 2025-01-09 18:37:25 -08:00
Wouter Deconinck
f4ef0aec28 armadillo: add v14.2.2 (#48480)
* armadillo: add v14.2.2
* armadillo: hdf5 support ends with v10
2025-01-09 18:34:58 -08:00
Buldram
ea2c70a21a toybox: depend on virtual zlib (#48486) 2025-01-09 18:33:20 -08:00
Weiqun Zhang
72ddc03da9 amrex: add v25.01 (#48423)
* amrex: add v25.01
* Add dependency on rocsparse

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-01-09 17:37:26 -08:00
Alec Scott
32de71b0b6 gnutls: add v3.8.8 (#48373) 2025-01-09 19:23:20 -06:00
Stephen Nicholas Swatman
e94d5b935f acts dependencies: new versions as of 2025/01/08 (#48465)
New year, new versions of the ACTS dependencies. This time, we have a
new version of detray, a new version of algebra plugins, as well as a
new version of ACTS itself.
2025-01-09 19:22:20 -06:00
Alberto Invernizzi
85649be232 py-stevedore: bump to version 5.4.0 (plus some intermediate ones as well) (#47784) 2025-01-09 16:55:40 -08:00
Harmen Stoppels
c23d2cdb2b lmod: add v8.7.55 (#48439) 2025-01-09 16:43:08 -08:00
Wouter Deconinck
dc5dd896a2 ensmallen: add v2.22.1 (#48481) 2025-01-09 16:41:46 -08:00
Dave Keeshan
43f23589ef verilator: add v5.032 (#48493) 2025-01-09 16:40:24 -08:00
Massimiliano Culpo
5085f635dd Add type hints to spack.util.executable.Executable (#48468)
* Add type-hints to `spack.util.executable.Executable`
* Add type-hint to input
* Use overload, and remove assertions at calling sites
* Bump mypy to v1.11.2 (working locally), Python to 3.13
2025-01-09 14:16:24 -08:00
Paul R. C. Kent
46da7952d3 llvm: add v19.1.6 (#48162) 2025-01-09 13:51:46 -08:00
Adam J. Stewart
72783bcb0a py-keras: add v3.8.0, remove optional deps (#48461) 2025-01-09 09:16:57 -08:00
Olivier Cessenat
f4d2ff0068 ocaml: compile versions before 4.8.0 again (#48470) 2025-01-09 09:15:12 -08:00
Adam J. Stewart
a2b7fee3fe py-timm: add v1.0.12 (#48474) 2025-01-09 09:11:08 -08:00
Mikael Simberg
2ebf2df421 mold: Add 2.36.0 (#48483) 2025-01-09 13:20:40 +01:00
dependabot[bot]
e725aa527e build(deps): bump codecov/codecov-action from 5.0.3 to 5.1.2 (#48200)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 5.0.3 to 5.1.2.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](05f5a9cfad...1e68e06f1d)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-09 12:33:45 +01:00
Richard Berger
7455c8d173 libmesh: add newer versions up to 1.7.6 (#48476) 2025-01-09 04:07:48 -07:00
Vanessasaurus
99e2bce99f Automated deployment to update package flux-core 2025-01-08 (#48466)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-01-09 03:38:42 -07:00
Buldram
4204d16fd3 toybox: new package (#48472) 2025-01-09 03:33:28 -07:00
dependabot[bot]
e76677cbd5 build(deps): bump pygments from 2.18.0 to 2.19.1 in /lib/spack/docs (#48431)
Bumps [pygments](https://github.com/pygments/pygments) from 2.18.0 to 2.19.1.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.18.0...2.19.1)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-01-09 11:31:42 +01:00
Wouter Deconinck
57357a540f py-iminuit: new versions through v2.30.1 (#48471)
* py-iminuit: new versions through v2.30.1
* py-iminuit: add tag hep

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2025-01-09 03:18:18 -07:00
Adam J. Stewart
97e0b39b32 py-segmentation-models-pytorch: add v0.4.0 (#48473) 2025-01-08 15:32:09 -08:00
Paul Gessinger
247da9ea7a root: Patch range restriction and gcc lower bound (#48449)
* root: Restrict patch range

* root: Set minimum gcc version for cxxstd=20

* root: fix gcc range when cxxstd 20

Co-authored-by: Paul Gessinger <hello@paulgessinger.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-01-08 16:03:08 -06:00
Sinan
07f89a73d1 libspatialite: fix for newer libxml2 (#48125)
* package/libspatialite fix for newer libxml
* clarify constraint
* fix issue properly via variant

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2025-01-08 12:58:14 -08:00
Krishna Chilleri
60cfdcb6cc py-neo4j: new package (#48383)
* add neo4j package

* [@spackbot] updating style on behalf of kchilleri

* modifying license header
2025-01-08 13:46:32 -06:00
Seth R. Johnson
1c9b042d3a googletest: new version, absl support, c++20 flag (#48328)
* googletest: new version, absl support, c++20 flag
* Update googletest header
2025-01-08 11:22:38 -08:00
75 changed files with 996 additions and 962 deletions

View File

@@ -29,7 +29,7 @@ jobs:
       - run: coverage xml
       - name: "Upload coverage report to CodeCov"
-        uses: codecov/codecov-action@05f5a9cfad807516dbbef9929c4a42df3eb78766
+        uses: codecov/codecov-action@1e68e06f1dbfde0e4cefc87efeba9e4643565303
         with:
           verbose: true
           fail_ci_if_error: false

View File

@@ -2,6 +2,6 @@ black==24.10.0
 clingo==5.7.1
 flake8==7.1.1
 isort==5.13.2
-mypy==1.8.0
+mypy==1.11.2
 types-six==1.17.0.20241205
 vermin==1.6.0

View File

@@ -20,7 +20,7 @@ jobs:
       - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
       - uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
         with:
-          python-version: '3.11'
+          python-version: '3.13'
           cache: 'pip'
       - name: Install Python Packages
         run: |
@@ -39,7 +39,7 @@ jobs:
           fetch-depth: 0
       - uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
         with:
-          python-version: '3.11'
+          python-version: '3.13'
           cache: 'pip'
       - name: Install Python packages
         run: |
@@ -58,7 +58,7 @@ jobs:
     secrets: inherit
     with:
       with_coverage: ${{ inputs.with_coverage }}
-      python_version: '3.11'
+      python_version: '3.13'
   # Check that spack can bootstrap the development environment on Python 3.6 - RHEL8
   bootstrap-dev-rhel8:
     runs-on: ubuntu-latest

View File

@@ -4,7 +4,7 @@ sphinx_design==0.6.1
 sphinx-rtd-theme==3.0.2
 python-levenshtein==0.26.1
 docutils==0.21.2
-pygments==2.18.0
+pygments==2.19.1
 urllib3==2.3.0
 pytest==8.3.4
 isort==5.13.2

View File

@@ -591,32 +591,18 @@ def file_matches(f: IO[bytes], regex: llnl.util.lang.PatternBytes) -> bool:
     f.seek(0)


-def deps_to_relocate(spec):
-    """Return the transitive link and direct run dependencies of the spec.
-
-    This is a special traversal for dependencies we need to consider when relocating a package.
-
-    Package binaries, scripts, and other files may refer to the prefixes of dependencies, so
-    we need to rewrite those locations when dependencies are in a different place at install time
-    than they were at build time.
-
-    This traversal covers transitive link dependencies and direct run dependencies because:
-
-    1. Spack adds RPATHs for transitive link dependencies so that packages can find needed
-       dependency libraries.
-    2. Packages may call any of their *direct* run dependencies (and may bake their paths into
-       binaries or scripts), so we also need to search for run dependency prefixes when relocating.
-
-    This returns a deduplicated list of transitive link dependencies and direct run dependencies.
-    """
-    deps = [
+def specs_to_relocate(spec: spack.spec.Spec) -> List[spack.spec.Spec]:
+    """Return the set of specs that may be referenced in the install prefix of the provided spec.
+
+    We currently include non-external transitive link and direct run dependencies."""
+    specs = [
         s
         for s in itertools.chain(
-            spec.traverse(root=True, deptype="link"), spec.dependencies(deptype="run")
+            spec.traverse(root=True, deptype="link", order="breadth", key=traverse.by_dag_hash),
+            spec.dependencies(deptype="run"),
         )
         if not s.external
     ]
-    return llnl.util.lang.dedupe(deps, key=lambda s: s.dag_hash())
+    return list(llnl.util.lang.dedupe(specs, key=lambda s: s.dag_hash()))


 def get_buildinfo_dict(spec):
@@ -630,7 +616,7 @@ def get_buildinfo_dict(spec):
         # "relocate_binaries": [],
         # "relocate_links": [],
         "hardlinks_deduped": True,
-        "hash_to_prefix": {d.dag_hash(): str(d.prefix) for d in deps_to_relocate(spec)},
+        "hash_to_prefix": {d.dag_hash(): str(d.prefix) for d in specs_to_relocate(spec)},
     }
@@ -1112,7 +1098,7 @@ def _exists_in_buildcache(spec: spack.spec.Spec, tmpdir: str, out_url: str) -> E
 def prefixes_to_relocate(spec):
-    prefixes = [s.prefix for s in deps_to_relocate(spec)]
+    prefixes = [s.prefix for s in specs_to_relocate(spec)]
     prefixes.append(spack.hooks.sbang.sbang_install_path())
     prefixes.append(str(spack.store.STORE.layout.root))
     return prefixes
@@ -2189,7 +2175,12 @@ def relocate_package(spec):
     old_spack_prefix = str(buildinfo.get("spackprefix"))
     old_rel_prefix = buildinfo.get("relative_prefix")
     old_prefix = os.path.join(old_layout_root, old_rel_prefix)
-    rel = buildinfo.get("relative_rpaths", False)
+
+    # Warn about old style tarballs created with the now removed --rel flag.
+    if buildinfo.get("relative_rpaths", False):
+        tty.warn(
+            f"Tarball for {spec} uses relative rpaths, " "which can cause library loading issues."
+        )

     # In the past prefix_to_hash was the default and externals were not dropped, so prefixes
     # were not unique.
@@ -2229,7 +2220,7 @@ def relocate_package(spec):
     # An analog in this algorithm is any spec that shares a name or provides the same virtuals
     # in the context of the relevant root spec. This ensures that the analog for a spec s
     # is the spec that s replaced when we spliced.
-    relocation_specs = deps_to_relocate(spec)
+    relocation_specs = specs_to_relocate(spec)
     build_spec_ids = set(id(s) for s in spec.build_spec.traverse(deptype=dt.ALL & ~dt.BUILD))
     for s in relocation_specs:
         analog = s
@@ -2267,19 +2258,11 @@ def relocate_package(spec):
     tty.debug("Relocating package from", "%s to %s." % (old_layout_root, new_layout_root))

-    # Old archives maybe have hardlinks repeated.
+    # Old archives may have hardlinks repeated.
     dedupe_hardlinks_if_necessary(workdir, buildinfo)

-    def is_backup_file(file):
-        return file.endswith("~")
-
     # Text files containing the prefix text
-    text_names = list()
-    for filename in buildinfo["relocate_textfiles"]:
-        text_name = os.path.join(workdir, filename)
-        # Don't add backup files generated by filter_file during install step.
-        if not is_backup_file(text_name):
-            text_names.append(text_name)
+    text_names = [os.path.join(workdir, f) for f in buildinfo["relocate_textfiles"]]

     # If we are not installing back to the same install tree do the relocation
     if old_prefix != new_prefix:
@@ -2290,29 +2273,11 @@ def is_backup_file(file):
         # do the relocation of path in binaries
         platform = spack.platforms.by_name(spec.platform)
         if "macho" in platform.binary_formats:
-            relocate.relocate_macho_binaries(
-                files_to_relocate,
-                old_layout_root,
-                new_layout_root,
-                prefix_to_prefix_bin,
-                rel,
-                old_prefix,
-                new_prefix,
-            )
-        elif "elf" in platform.binary_formats and not rel:
+            relocate.relocate_macho_binaries(files_to_relocate, prefix_to_prefix_bin)
+        elif "elf" in platform.binary_formats:
             # The new ELF dynamic section relocation logic only handles absolute to
             # absolute relocation.
-            relocate.new_relocate_elf_binaries(files_to_relocate, prefix_to_prefix_bin)
-        elif "elf" in platform.binary_formats and rel:
-            relocate.relocate_elf_binaries(
-                files_to_relocate,
-                old_layout_root,
-                new_layout_root,
-                prefix_to_prefix_bin,
-                rel,
-                old_prefix,
-                new_prefix,
-            )
+            relocate.relocate_elf_binaries(files_to_relocate, prefix_to_prefix_bin)

     # Relocate links to the new install prefix
     links = [os.path.join(workdir, f) for f in buildinfo.get("relocate_links", [])]

View File

@@ -144,7 +144,7 @@ def is_installed(spec):
         record = spack.store.STORE.db.query_local_by_spec_hash(spec.dag_hash())
         return record and record.installed

-    specs = traverse.traverse_nodes(
+    all_specs = traverse.traverse_nodes(
         specs,
         root=False,
         order="breadth",
@@ -155,7 +155,7 @@ def is_installed(spec):
     )

     with spack.store.STORE.db.read_transaction():
-        return [spec for spec in specs if is_installed(spec)]
+        return [spec for spec in all_specs if is_installed(spec)]


 def dependent_environments(

View File

@@ -1330,7 +1330,7 @@ def deprecate(self, spec: "spack.spec.Spec", deprecator: "spack.spec.Spec") -> N
     def installed_relatives(
         self,
         spec: "spack.spec.Spec",
-        direction: str = "children",
+        direction: tr.DirectionType = "children",
         transitive: bool = True,
         deptype: Union[dt.DepFlag, dt.DepTypes] = dt.ALL,
     ) -> Set["spack.spec.Spec"]:

View File

@@ -2634,6 +2634,29 @@ def _ensure_env_dir():
     shutil.copy(envfile, target_manifest)

+    # Copy relative path includes that live inside the environment dir
+    try:
+        manifest = EnvironmentManifestFile(environment_dir)
+    except Exception as e:
+        msg = f"cannot initialize environment, '{environment_dir}' from manifest"
+        raise SpackEnvironmentError(msg) from e
+    else:
+        includes = manifest[TOP_LEVEL_KEY].get("include", [])
+        for include in includes:
+            if os.path.isabs(include):
+                continue
+
+            abspath = pathlib.Path(os.path.normpath(environment_dir / include))
+            if not abspath.is_relative_to(environment_dir):
+                # Warn that we are not copying relative path
+                msg = "Spack will not copy relative include path from outside environment"
+                msg += f" directory: {include}"
+                tty.warn(msg)
+                continue
+
+            orig_abspath = os.path.normpath(envfile.parent / include)
+            shutil.copy(orig_abspath, abspath)


 class EnvironmentManifestFile(collections.abc.Mapping):
     """Manages the in-memory representation of a manifest file, and its synchronization

View File

@@ -539,7 +539,7 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
     # Note that we copy them in as they are in the *install* directory
     # NOT as they are in the repository, because we want a snapshot of
     # how *this* particular build was done.
-    for node in spec.traverse(deptype=all):
+    for node in spec.traverse(deptype="all"):
         if node is not spec:
             # Locate the dependency package in the install tree and find
             # its provenance information.

View File

@@ -767,6 +767,9 @@ def __init__(self, spec):
         self.win_rpath = fsys.WindowsSimulatedRPath(self)
         super().__init__()

+    def __getitem__(self, key: str) -> "PackageBase":
+        return self.spec[key].package
+
     @classmethod
     def dependency_names(cls):
         return _subkeys(cls.dependencies)

View File

@@ -54,144 +54,11 @@ def _patchelf() -> Optional[executable.Executable]:
     return spack.bootstrap.ensure_patchelf_in_path_or_raise()
def _elf_rpaths_for(path):
"""Return the RPATHs for an executable or a library.
Args:
path (str): full path to the executable or library
Return:
RPATHs as a list of strings. Returns an empty array
on ELF parsing errors, or when the ELF file simply
has no rpaths.
"""
return elf.get_rpaths(path) or []
def _make_relative(reference_file, path_root, paths):
"""Return a list where any path in ``paths`` that starts with
``path_root`` is made relative to the directory in which the
reference file is stored.
After a path is made relative it is prefixed with the ``$ORIGIN``
string.
Args:
reference_file (str): file from which the reference directory
is computed
path_root (str): root of the relative paths
paths: (list) paths to be examined
Returns:
List of relative paths
"""
start_directory = os.path.dirname(reference_file)
pattern = re.compile(path_root)
relative_paths = []
for path in paths:
if pattern.match(path):
rel = os.path.relpath(path, start=start_directory)
path = os.path.join("$ORIGIN", rel)
relative_paths.append(path)
return relative_paths
def _normalize_relative_paths(start_path, relative_paths):
"""Normalize the relative paths with respect to the original path name
of the file (``start_path``).
The paths that are passed to this function existed or were relevant
on another filesystem, so os.path.abspath cannot be used.
A relative path may contain the signifier $ORIGIN. Assuming that
``start_path`` is absolute, this implies that the relative path
(relative to start_path) should be replaced with an absolute path.
Args:
start_path (str): path from which the starting directory
is extracted
relative_paths (str): list of relative paths as obtained by a
call to :ref:`_make_relative`
Returns:
List of normalized paths
"""
normalized_paths = []
pattern = re.compile(re.escape("$ORIGIN"))
start_directory = os.path.dirname(start_path)
for path in relative_paths:
if path.startswith("$ORIGIN"):
sub = pattern.sub(start_directory, path)
path = os.path.normpath(sub)
normalized_paths.append(path)
return normalized_paths
 def _decode_macho_data(bytestring):
     return bytestring.rstrip(b"\x00").decode("ascii")


-def macho_make_paths_relative(path_name, old_layout_root, rpaths, deps, idpath):
"""
Return a dictionary mapping the original rpaths to the relativized rpaths.
This dictionary is used to replace paths in mach-o binaries.
Replace old_dir with relative path from dirname of path name
in rpaths and deps; idpath is replaced with @rpath/libname.
"""
paths_to_paths = dict()
if idpath:
paths_to_paths[idpath] = os.path.join("@rpath", "%s" % os.path.basename(idpath))
for rpath in rpaths:
if re.match(old_layout_root, rpath):
rel = os.path.relpath(rpath, start=os.path.dirname(path_name))
paths_to_paths[rpath] = os.path.join("@loader_path", "%s" % rel)
else:
paths_to_paths[rpath] = rpath
for dep in deps:
if re.match(old_layout_root, dep):
rel = os.path.relpath(dep, start=os.path.dirname(path_name))
paths_to_paths[dep] = os.path.join("@loader_path", "%s" % rel)
else:
paths_to_paths[dep] = dep
return paths_to_paths
def macho_make_paths_normal(orig_path_name, rpaths, deps, idpath):
"""
Return a dictionary mapping the relativized rpaths to the original rpaths.
This dictionary is used to replace paths in mach-o binaries.
Replace '@loader_path' with the dirname of the origname path name
in rpaths and deps; idpath is replaced with the original path name
"""
rel_to_orig = dict()
if idpath:
rel_to_orig[idpath] = orig_path_name
for rpath in rpaths:
if re.match("@loader_path", rpath):
norm = os.path.normpath(
re.sub(re.escape("@loader_path"), os.path.dirname(orig_path_name), rpath)
)
rel_to_orig[rpath] = norm
else:
rel_to_orig[rpath] = rpath
for dep in deps:
if re.match("@loader_path", dep):
norm = os.path.normpath(
re.sub(re.escape("@loader_path"), os.path.dirname(orig_path_name), dep)
)
rel_to_orig[dep] = norm
else:
rel_to_orig[dep] = dep
return rel_to_orig
-def macho_find_paths(orig_rpaths, deps, idpath, old_layout_root, prefix_to_prefix):
+def macho_find_paths(orig_rpaths, deps, idpath, prefix_to_prefix):
     """
     Inputs
     original rpaths from mach-o binaries
@@ -207,13 +74,12 @@ def macho_find_paths(orig_rpaths, deps, idpath, old_layout_root, prefix_to_prefi
     # Sort from longest path to shortest, to ensure we try /foo/bar/baz before /foo/bar
     prefix_iteration_order = sorted(prefix_to_prefix, key=len, reverse=True)
     for orig_rpath in orig_rpaths:
-        if orig_rpath.startswith(old_layout_root):
-            for old_prefix in prefix_iteration_order:
-                new_prefix = prefix_to_prefix[old_prefix]
-                if orig_rpath.startswith(old_prefix):
-                    new_rpath = re.sub(re.escape(old_prefix), new_prefix, orig_rpath)
-                    paths_to_paths[orig_rpath] = new_rpath
-                    break
+        for old_prefix in prefix_iteration_order:
+            new_prefix = prefix_to_prefix[old_prefix]
+            if orig_rpath.startswith(old_prefix):
+                new_rpath = re.sub(re.escape(old_prefix), new_prefix, orig_rpath)
+                paths_to_paths[orig_rpath] = new_rpath
+                break
         else:
             paths_to_paths[orig_rpath] = orig_rpath
@@ -348,9 +214,7 @@ def _set_elf_rpaths_and_interpreter(
     return None


-def relocate_macho_binaries(
-    path_names, old_layout_root, new_layout_root, prefix_to_prefix, rel, old_prefix, new_prefix
-):
+def relocate_macho_binaries(path_names, prefix_to_prefix):
     """
     Use macholib python package to get the rpaths, depedent libraries
     and library identity for libraries from the MachO object. Modify them
@@ -363,77 +227,15 @@ def relocate_macho_binaries(
         # Corner case where macho object file ended up in the path name list
         if path_name.endswith(".o"):
             continue
-        if rel:
-            # get the relativized paths
-            rpaths, deps, idpath = macholib_get_paths(path_name)
-            # get the file path name in the original prefix
-            orig_path_name = re.sub(re.escape(new_prefix), old_prefix, path_name)
-            # get the mapping of the relativized paths to the original
-            # normalized paths
-            rel_to_orig = macho_make_paths_normal(orig_path_name, rpaths, deps, idpath)
-            # replace the relativized paths with normalized paths
-            modify_macho_object(path_name, rpaths, deps, idpath, rel_to_orig)
-            # get the normalized paths in the mach-o binary
-            rpaths, deps, idpath = macholib_get_paths(path_name)
-            # get the mapping of paths in old prefix to path in new prefix
-            paths_to_paths = macho_find_paths(
-                rpaths, deps, idpath, old_layout_root, prefix_to_prefix
-            )
-            # replace the old paths with new paths
-            modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
-            # get the new normalized path in the mach-o binary
-            rpaths, deps, idpath = macholib_get_paths(path_name)
-            # get the mapping of paths to relative paths in the new prefix
-            paths_to_paths = macho_make_paths_relative(
-                path_name, new_layout_root, rpaths, deps, idpath
-            )
-            # replace the new paths with relativized paths in the new prefix
-            modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
-        else:
-            # get the paths in the old prefix
-            rpaths, deps, idpath = macholib_get_paths(path_name)
-            # get the mapping of paths in the old prerix to the new prefix
-            paths_to_paths = macho_find_paths(
-                rpaths, deps, idpath, old_layout_root, prefix_to_prefix
-            )
-            # replace the old paths with new paths
-            modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
+        # get the paths in the old prefix
+        rpaths, deps, idpath = macholib_get_paths(path_name)
+        # get the mapping of paths in the old prerix to the new prefix
+        paths_to_paths = macho_find_paths(rpaths, deps, idpath, prefix_to_prefix)
+        # replace the old paths with new paths
+        modify_macho_object(path_name, rpaths, deps, idpath, paths_to_paths)
-def _transform_rpaths(orig_rpaths, orig_root, new_prefixes):
"""Return an updated list of RPATHs where each entry in the original list
starting with the old root is relocated to another place according to the
mapping passed as argument.
Args:
orig_rpaths (list): list of the original RPATHs
orig_root (str): original root to be substituted
new_prefixes (dict): dictionary that maps the original prefixes to
where they should be relocated
Returns:
List of paths
"""
new_rpaths = []
for orig_rpath in orig_rpaths:
# If the original RPATH doesn't start with the target root
# append it verbatim and proceed
if not orig_rpath.startswith(orig_root):
new_rpaths.append(orig_rpath)
continue
# Otherwise inspect the mapping and transform + append any prefix
# that starts with a registered key
# avoiding duplicates
for old_prefix, new_prefix in new_prefixes.items():
if orig_rpath.startswith(old_prefix):
new_rpath = re.sub(re.escape(old_prefix), new_prefix, orig_rpath)
if new_rpath not in new_rpaths:
new_rpaths.append(new_rpath)
return new_rpaths
-def new_relocate_elf_binaries(binaries, prefix_to_prefix):
+def relocate_elf_binaries(binaries, prefix_to_prefix):
     """Take a list of binaries, and an ordered dictionary of
     prefix to prefix mapping, and update the rpaths accordingly."""
@@ -452,98 +254,6 @@ def new_relocate_elf_binaries(binaries, prefix_to_prefix):
         _set_elf_rpaths_and_interpreter(path, rpaths=rpaths, interpreter=interpreter)
def relocate_elf_binaries(
binaries, orig_root, new_root, new_prefixes, rel, orig_prefix, new_prefix
):
"""Relocate the binaries passed as arguments by changing their RPATHs.
Use patchelf to get the original RPATHs and then replace them with
rpaths in the new directory layout.
New RPATHs are determined from a dictionary mapping the prefixes in the
old directory layout to the prefixes in the new directory layout if the
rpath was in the old layout root, i.e. system paths are not replaced.
Args:
binaries (list): list of binaries that might need relocation, located
in the new prefix
orig_root (str): original root to be substituted
new_root (str): new root to be used, only relevant for relative RPATHs
new_prefixes (dict): dictionary that maps the original prefixes to
where they should be relocated
rel (bool): True if the RPATHs are relative, False if they are absolute
orig_prefix (str): prefix where the executable was originally located
new_prefix (str): prefix where we want to relocate the executable
"""
for new_binary in binaries:
orig_rpaths = _elf_rpaths_for(new_binary)
# TODO: Can we deduce `rel` from the original RPATHs?
if rel:
# Get the file path in the original prefix
orig_binary = re.sub(re.escape(new_prefix), orig_prefix, new_binary)
# Get the normalized RPATHs in the old prefix using the file path
# in the orig prefix
orig_norm_rpaths = _normalize_relative_paths(orig_binary, orig_rpaths)
# Get the normalize RPATHs in the new prefix
new_norm_rpaths = _transform_rpaths(orig_norm_rpaths, orig_root, new_prefixes)
# Get the relative RPATHs in the new prefix
new_rpaths = _make_relative(new_binary, new_root, new_norm_rpaths)
# check to see if relative rpaths are changed before rewriting
if sorted(new_rpaths) != sorted(orig_rpaths):
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
else:
new_rpaths = _transform_rpaths(orig_rpaths, orig_root, new_prefixes)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
def make_link_relative(new_links, orig_links):
"""Compute the relative target from the original link and
make the new link relative.
Args:
new_links (list): new links to be made relative
orig_links (list): original links
"""
for new_link, orig_link in zip(new_links, orig_links):
target = readlink(orig_link)
relative_target = os.path.relpath(target, os.path.dirname(orig_link))
os.unlink(new_link)
symlink(relative_target, new_link)
def make_macho_binaries_relative(cur_path_names, orig_path_names, old_layout_root):
"""
Replace old RPATHs with paths relative to old_dir in binary files
"""
if not sys.platform == "darwin":
return
for cur_path, orig_path in zip(cur_path_names, orig_path_names):
(rpaths, deps, idpath) = macholib_get_paths(cur_path)
paths_to_paths = macho_make_paths_relative(
orig_path, old_layout_root, rpaths, deps, idpath
)
modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths)
def make_elf_binaries_relative(new_binaries, orig_binaries, orig_layout_root):
"""Replace the original RPATHs in the new binaries making them
relative to the original layout root.
Args:
new_binaries (list): new binaries whose RPATHs is to be made relative
orig_binaries (list): original binaries
orig_layout_root (str): path to be used as a base for making
RPATHs relative
"""
for new_binary, orig_binary in zip(new_binaries, orig_binaries):
orig_rpaths = _elf_rpaths_for(new_binary)
if orig_rpaths:
new_rpaths = _make_relative(orig_binary, orig_layout_root, orig_rpaths)
_set_elf_rpaths_and_interpreter(new_binary, new_rpaths)
 def warn_if_link_cant_be_relocated(link, target):
     if not os.path.isabs(target):
         return
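Note: a simplified sketch of the longest-prefix-first substitution the new macho_find_paths relies on (the diff's comment reads "Sort from longest path to shortest, to ensure we try /foo/bar/baz before /foo/bar"). remap_path and the example mapping below are illustrative, not part of spack.relocate.

# Illustrative sketch only -- not the spack.relocate implementation.
from typing import Dict


def remap_path(path: str, prefix_to_prefix: Dict[str, str]) -> str:
    # Try the longest old prefix first so a more specific mapping wins over a
    # shorter one that is also a prefix of the same path.
    for old in sorted(prefix_to_prefix, key=len, reverse=True):
        if path.startswith(old):
            return prefix_to_prefix[old] + path[len(old):]
    return path  # paths outside every old prefix are left untouched


mapping = {"/foo/bar": "/short", "/foo/bar/baz": "/specific"}
assert remap_path("/foo/bar/baz/libx.dylib", mapping) == "/specific/libx.dylib"
assert remap_path("/usr/lib/libz.dylib", mapping) == "/usr/lib/libz.dylib"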

View File

@@ -48,7 +48,7 @@ def rewire_node(spec, explicit):
     # spec
     prefix_to_prefix = {spec.build_spec.prefix: spec.prefix}
     build_spec_ids = set(id(s) for s in spec.build_spec.traverse(deptype=dt.ALL & ~dt.BUILD))
-    for s in bindist.deps_to_relocate(spec):
+    for s in bindist.specs_to_relocate(spec):
         analog = s
         if id(s) not in build_spec_ids:
             analogs = [
@@ -77,25 +77,9 @@ def rewire_node(spec, explicit):
     ]
     if bins_to_relocate:
         if "macho" in platform.binary_formats:
-            relocate.relocate_macho_binaries(
-                bins_to_relocate,
-                str(spack.store.STORE.layout.root),
-                str(spack.store.STORE.layout.root),
-                prefix_to_prefix,
-                False,
-                spec.build_spec.prefix,
-                spec.prefix,
-            )
+            relocate.relocate_macho_binaries(bins_to_relocate, prefix_to_prefix)
         if "elf" in platform.binary_formats:
-            relocate.relocate_elf_binaries(
-                bins_to_relocate,
-                str(spack.store.STORE.layout.root),
-                str(spack.store.STORE.layout.root),
-                prefix_to_prefix,
-                False,
-                spec.build_spec.prefix,
-                spec.prefix,
-            )
+            relocate.relocate_elf_binaries(bins_to_relocate, prefix_to_prefix)
         relocate.relocate_text_bin(binaries=bins_to_relocate, prefixes=prefix_to_prefix)
     shutil.rmtree(tempdir)
     install_manifest = os.path.join(

View File

@@ -58,7 +58,21 @@
 import re
 import socket
 import warnings
-from typing import Any, Callable, Dict, Iterable, List, Match, Optional, Set, Tuple, Union
+from typing import (
+    Any,
+    Callable,
+    Dict,
+    Iterable,
+    List,
+    Match,
+    Optional,
+    Set,
+    Tuple,
+    Union,
+    overload,
+)
+
+from typing_extensions import Literal

 import archspec.cpu
@@ -83,7 +97,7 @@
 import spack.solver
 import spack.spec_parser
 import spack.store
-import spack.traverse as traverse
+import spack.traverse
 import spack.util.executable
 import spack.util.hash
 import spack.util.module_cmd as md
@@ -1339,16 +1353,16 @@ def tree(
     depth: bool = False,
     hashes: bool = False,
     hashlen: Optional[int] = None,
-    cover: str = "nodes",
+    cover: spack.traverse.CoverType = "nodes",
     indent: int = 0,
     format: str = DEFAULT_FORMAT,
-    deptypes: Union[Tuple[str, ...], str] = "all",
+    deptypes: Union[dt.DepFlag, dt.DepTypes] = dt.ALL,
     show_types: bool = False,
     depth_first: bool = False,
     recurse_dependencies: bool = True,
     status_fn: Optional[Callable[["Spec"], InstallStatus]] = None,
     prefix: Optional[Callable[["Spec"], str]] = None,
-    key=id,
+    key: Callable[["Spec"], Any] = id,
 ) -> str:
     """Prints out specs and their dependencies, tree-formatted with indentation.
@@ -1380,11 +1394,16 @@ def tree(
     # reduce deptypes over all in-edges when covering nodes
     if show_types and cover == "nodes":
         deptype_lookup: Dict[str, dt.DepFlag] = collections.defaultdict(dt.DepFlag)
-        for edge in traverse.traverse_edges(specs, cover="edges", deptype=deptypes, root=False):
+        for edge in spack.traverse.traverse_edges(
+            specs, cover="edges", deptype=deptypes, root=False
+        ):
             deptype_lookup[edge.spec.dag_hash()] |= edge.depflag

-    for d, dep_spec in traverse.traverse_tree(
-        sorted(specs), cover=cover, deptype=deptypes, depth_first=depth_first, key=key
+    # SupportsRichComparisonT issue with List[Spec]
+    sorted_specs: List["Spec"] = sorted(specs)  # type: ignore[type-var]
+
+    for d, dep_spec in spack.traverse.traverse_tree(
+        sorted_specs, cover=cover, deptype=deptypes, depth_first=depth_first, key=key
     ):
         node = dep_spec.spec
@@ -1927,13 +1946,111 @@ def installed_upstream(self):
         upstream, _ = spack.store.STORE.db.query_by_spec_hash(self.dag_hash())
         return upstream

-    def traverse(self, **kwargs):
-        """Shorthand for :meth:`~spack.traverse.traverse_nodes`"""
-        return traverse.traverse_nodes([self], **kwargs)
+    @overload
+    def traverse(
+        self,
*,
root: bool = ...,
order: spack.traverse.OrderType = ...,
cover: spack.traverse.CoverType = ...,
direction: spack.traverse.DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[False] = False,
key: Callable[["Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable["Spec"]: ...
-    def traverse_edges(self, **kwargs):
+    @overload
def traverse(
self,
*,
root: bool = ...,
order: spack.traverse.OrderType = ...,
cover: spack.traverse.CoverType = ...,
direction: spack.traverse.DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[True],
key: Callable[["Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[Tuple[int, "Spec"]]: ...
def traverse(
self,
*,
root: bool = True,
order: spack.traverse.OrderType = "pre",
cover: spack.traverse.CoverType = "nodes",
direction: spack.traverse.DirectionType = "children",
deptype: Union[dt.DepFlag, dt.DepTypes] = "all",
depth: bool = False,
key: Callable[["Spec"], Any] = id,
visited: Optional[Set[Any]] = None,
) -> Iterable[Union["Spec", Tuple[int, "Spec"]]]:
"""Shorthand for :meth:`~spack.traverse.traverse_nodes`"""
return spack.traverse.traverse_nodes(
[self],
root=root,
order=order,
cover=cover,
direction=direction,
deptype=deptype,
depth=depth,
key=key,
visited=visited,
)
@overload
def traverse_edges(
self,
*,
root: bool = ...,
order: spack.traverse.OrderType = ...,
cover: spack.traverse.CoverType = ...,
direction: spack.traverse.DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[False] = False,
key: Callable[["Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[DependencySpec]: ...
@overload
def traverse_edges(
self,
*,
root: bool = ...,
order: spack.traverse.OrderType = ...,
cover: spack.traverse.CoverType = ...,
direction: spack.traverse.DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[True],
key: Callable[["Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[Tuple[int, DependencySpec]]: ...
def traverse_edges(
self,
*,
root: bool = True,
order: spack.traverse.OrderType = "pre",
cover: spack.traverse.CoverType = "nodes",
direction: spack.traverse.DirectionType = "children",
deptype: Union[dt.DepFlag, dt.DepTypes] = "all",
depth: bool = False,
key: Callable[["Spec"], Any] = id,
visited: Optional[Set[Any]] = None,
) -> Iterable[Union[DependencySpec, Tuple[int, DependencySpec]]]:
"""Shorthand for :meth:`~spack.traverse.traverse_edges`""" """Shorthand for :meth:`~spack.traverse.traverse_edges`"""
-        return traverse.traverse_edges([self], **kwargs)
+        return spack.traverse.traverse_edges(
[self],
root=root,
order=order,
cover=cover,
direction=direction,
deptype=deptype,
depth=depth,
key=key,
visited=visited,
)
     @property
     def short_spec(self):
@@ -2944,7 +3061,7 @@ def _finalize_concretization(self):
         for spec in self.traverse():
             spec._cached_hash(ht.dag_hash)

-    def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "spack.spec.Spec":
+    def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "Spec":
         """This is a non-destructive version of concretize().

         First clones, then returns a concrete version of this package
@@ -4105,10 +4222,10 @@ def tree(
         depth: bool = False,
         hashes: bool = False,
         hashlen: Optional[int] = None,
-        cover: str = "nodes",
+        cover: spack.traverse.CoverType = "nodes",
         indent: int = 0,
         format: str = DEFAULT_FORMAT,
-        deptypes: Union[Tuple[str, ...], str] = "all",
+        deptypes: Union[dt.DepTypes, dt.DepFlag] = dt.ALL,
         show_types: bool = False,
         depth_first: bool = False,
         recurse_dependencies: bool = True,

View File

@@ -285,3 +285,16 @@ def compilers(compiler, arch_spec):
     error = capfd.readouterr()[1]
     assert "Skipping tests for package" in error
     assert "test requires missing compiler" in error
+
+
+def test_package_subscript(default_mock_concretization):
+    """Tests that we can use the subscript notation on packages, and that it returns a package"""
+    root = default_mock_concretization("mpileaks")
+    root_pkg = root.package
+    # Subscript of a virtual
+    assert isinstance(root_pkg["mpi"], spack.package_base.PackageBase)
+    # Subscript on concrete
+    for d in root.traverse():
+        assert isinstance(root_pkg[d.name], spack.package_base.PackageBase)

View File

@@ -31,13 +31,7 @@
 from spack.fetch_strategy import URLFetchStrategy
 from spack.installer import PackageInstaller
 from spack.paths import mock_gpg_keys_path
-from spack.relocate import (
-    macho_find_paths,
-    macho_make_paths_normal,
-    macho_make_paths_relative,
-    relocate_links,
-    relocate_text,
-)
+from spack.relocate import macho_find_paths, relocate_links, relocate_text
 from spack.spec import Spec

 pytestmark = pytest.mark.not_on_windows("does not run on windows")
@@ -301,7 +295,6 @@ def test_replace_paths(tmpdir):
             os.path.join(oldlibdir_local, libfile_loco),
         ],
         os.path.join(oldlibdir_cc, libfile_c),
-        old_spack_dir,
         prefix2prefix,
     )
     assert out_dict == {
@@ -325,7 +318,6 @@ def test_replace_paths(tmpdir):
             os.path.join(oldlibdir_local, libfile_loco),
         ],
         None,
-        old_spack_dir,
         prefix2prefix,
     )
     assert out_dict == {
@@ -349,7 +341,6 @@ def test_replace_paths(tmpdir):
f"@rpath/{libfile_loco}", f"@rpath/{libfile_loco}",
], ],
None, None,
old_spack_dir,
prefix2prefix, prefix2prefix,
) )
@@ -369,7 +360,6 @@ def test_replace_paths(tmpdir):
         [oldlibdir_a, oldlibdir_b, oldlibdir_d, oldlibdir_local],
         [f"@rpath/{libfile_a}", f"@rpath/{libfile_b}", f"@rpath/{libfile_loco}"],
         None,
-        old_spack_dir,
         prefix2prefix,
     )
     assert out_dict == {
@@ -383,91 +373,6 @@ def test_replace_paths(tmpdir):
     }
def test_macho_make_paths():
out = macho_make_paths_relative(
"/Users/Shared/spack/pkgC/lib/libC.dylib",
"/Users/Shared/spack",
("/Users/Shared/spack/pkgA/lib", "/Users/Shared/spack/pkgB/lib", "/usr/local/lib"),
(
"/Users/Shared/spack/pkgA/libA.dylib",
"/Users/Shared/spack/pkgB/libB.dylib",
"/usr/local/lib/libloco.dylib",
),
"/Users/Shared/spack/pkgC/lib/libC.dylib",
)
assert out == {
"/Users/Shared/spack/pkgA/lib": "@loader_path/../../pkgA/lib",
"/Users/Shared/spack/pkgB/lib": "@loader_path/../../pkgB/lib",
"/usr/local/lib": "/usr/local/lib",
"/Users/Shared/spack/pkgA/libA.dylib": "@loader_path/../../pkgA/libA.dylib",
"/Users/Shared/spack/pkgB/libB.dylib": "@loader_path/../../pkgB/libB.dylib",
"/usr/local/lib/libloco.dylib": "/usr/local/lib/libloco.dylib",
"/Users/Shared/spack/pkgC/lib/libC.dylib": "@rpath/libC.dylib",
}
out = macho_make_paths_normal(
"/Users/Shared/spack/pkgC/lib/libC.dylib",
("@loader_path/../../pkgA/lib", "@loader_path/../../pkgB/lib", "/usr/local/lib"),
(
"@loader_path/../../pkgA/libA.dylib",
"@loader_path/../../pkgB/libB.dylib",
"/usr/local/lib/libloco.dylib",
),
"@rpath/libC.dylib",
)
assert out == {
"@rpath/libC.dylib": "/Users/Shared/spack/pkgC/lib/libC.dylib",
"@loader_path/../../pkgA/lib": "/Users/Shared/spack/pkgA/lib",
"@loader_path/../../pkgB/lib": "/Users/Shared/spack/pkgB/lib",
"/usr/local/lib": "/usr/local/lib",
"@loader_path/../../pkgA/libA.dylib": "/Users/Shared/spack/pkgA/libA.dylib",
"@loader_path/../../pkgB/libB.dylib": "/Users/Shared/spack/pkgB/libB.dylib",
"/usr/local/lib/libloco.dylib": "/usr/local/lib/libloco.dylib",
}
out = macho_make_paths_relative(
"/Users/Shared/spack/pkgC/bin/exeC",
"/Users/Shared/spack",
("/Users/Shared/spack/pkgA/lib", "/Users/Shared/spack/pkgB/lib", "/usr/local/lib"),
(
"/Users/Shared/spack/pkgA/libA.dylib",
"/Users/Shared/spack/pkgB/libB.dylib",
"/usr/local/lib/libloco.dylib",
),
None,
)
assert out == {
"/Users/Shared/spack/pkgA/lib": "@loader_path/../../pkgA/lib",
"/Users/Shared/spack/pkgB/lib": "@loader_path/../../pkgB/lib",
"/usr/local/lib": "/usr/local/lib",
"/Users/Shared/spack/pkgA/libA.dylib": "@loader_path/../../pkgA/libA.dylib",
"/Users/Shared/spack/pkgB/libB.dylib": "@loader_path/../../pkgB/libB.dylib",
"/usr/local/lib/libloco.dylib": "/usr/local/lib/libloco.dylib",
}
out = macho_make_paths_normal(
"/Users/Shared/spack/pkgC/bin/exeC",
("@loader_path/../../pkgA/lib", "@loader_path/../../pkgB/lib", "/usr/local/lib"),
(
"@loader_path/../../pkgA/libA.dylib",
"@loader_path/../../pkgB/libB.dylib",
"/usr/local/lib/libloco.dylib",
),
None,
)
assert out == {
"@loader_path/../../pkgA/lib": "/Users/Shared/spack/pkgA/lib",
"@loader_path/../../pkgB/lib": "/Users/Shared/spack/pkgB/lib",
"/usr/local/lib": "/usr/local/lib",
"@loader_path/../../pkgA/libA.dylib": "/Users/Shared/spack/pkgA/libA.dylib",
"@loader_path/../../pkgB/libB.dylib": "/Users/Shared/spack/pkgB/libB.dylib",
"/usr/local/lib/libloco.dylib": "/usr/local/lib/libloco.dylib",
}
 @pytest.fixture()
 def mock_download(monkeypatch):
     """Mock a failing download strategy."""
@@ -561,10 +466,6 @@ def test_macho_relocation_with_changing_projection(relocation_dict):
""" """
original_rpath = "/foo/bar/baz/abcdef" original_rpath = "/foo/bar/baz/abcdef"
result = macho_find_paths( result = macho_find_paths(
[original_rpath], [original_rpath], deps=[], idpath=None, prefix_to_prefix=relocation_dict
deps=[],
idpath=None,
old_layout_root="/foo",
prefix_to_prefix=relocation_dict,
) )
assert result[original_rpath] == "/a/b/c/abcdef" assert result[original_rpath] == "/a/b/c/abcdef"

View File

@@ -1,8 +1,6 @@
 # Copyright Spack Project Developers. See COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import os
-import os.path
 import re
 import shutil
@@ -114,49 +112,6 @@ def _copy_somewhere(orig_binary):
     return _copy_somewhere
@pytest.mark.parametrize(
"start_path,path_root,paths,expected",
[
(
"/usr/bin/test",
"/usr",
["/usr/lib", "/usr/lib64", "/opt/local/lib"],
[
os.path.join("$ORIGIN", "..", "lib"),
os.path.join("$ORIGIN", "..", "lib64"),
"/opt/local/lib",
],
)
],
)
def test_make_relative_paths(start_path, path_root, paths, expected):
relatives = spack.relocate._make_relative(start_path, path_root, paths)
assert relatives == expected
@pytest.mark.parametrize(
"start_path,relative_paths,expected",
[
# $ORIGIN will be replaced with os.path.dirname('usr/bin/test')
# and then normalized
(
"/usr/bin/test",
["$ORIGIN/../lib", "$ORIGIN/../lib64", "/opt/local/lib"],
[
os.sep + os.path.join("usr", "lib"),
os.sep + os.path.join("usr", "lib64"),
"/opt/local/lib",
],
),
# Relative path without $ORIGIN
("/usr/bin/test", ["../local/lib"], ["../local/lib"]),
],
)
def test_normalize_relative_paths(start_path, relative_paths, expected):
normalized = spack.relocate._normalize_relative_paths(start_path, relative_paths)
assert normalized == expected
 @pytest.mark.requires_executables("patchelf", "gcc")
 @skip_unless_linux
 def test_relocate_text_bin(binary_with_rpaths, prefix_like):
@@ -182,61 +137,13 @@ def test_relocate_elf_binaries_absolute_paths(binary_with_rpaths, copy_binary, p
     new_binary = copy_binary(orig_binary)

     spack.relocate.relocate_elf_binaries(
-        binaries=[str(new_binary)],
-        orig_root=str(orig_binary.dirpath()),
-        new_root=None,  # Not needed when relocating absolute paths
-        new_prefixes={str(orig_binary.dirpath()): "/foo"},
-        rel=False,
-        # Not needed when relocating absolute paths
-        orig_prefix=None,
-        new_prefix=None,
+        binaries=[str(new_binary)], prefix_to_prefix={str(orig_binary.dirpath()): "/foo"}
     )
     # Some compilers add rpaths so ensure changes included in final result
     assert "/foo/lib:/usr/lib64" in rpaths_for(new_binary)
@pytest.mark.requires_executables("patchelf", "gcc")
@skip_unless_linux
def test_relocate_elf_binaries_relative_paths(binary_with_rpaths, copy_binary):
# Create an executable, set some RPATHs, copy it to another location
orig_binary = binary_with_rpaths(rpaths=["lib", "lib64", "/opt/local/lib"])
new_binary = copy_binary(orig_binary)
spack.relocate.relocate_elf_binaries(
binaries=[str(new_binary)],
orig_root=str(orig_binary.dirpath()),
new_root=str(new_binary.dirpath()),
new_prefixes={str(orig_binary.dirpath()): "/foo"},
rel=True,
orig_prefix=str(orig_binary.dirpath()),
new_prefix=str(new_binary.dirpath()),
)
# Some compilers add rpaths so ensure changes included in final result
assert "/foo/lib:/foo/lib64:/opt/local/lib" in rpaths_for(new_binary)
@pytest.mark.requires_executables("patchelf", "gcc")
@skip_unless_linux
def test_make_elf_binaries_relative(binary_with_rpaths, copy_binary, prefix_tmpdir):
orig_binary = binary_with_rpaths(
rpaths=[
str(prefix_tmpdir.mkdir("lib")),
str(prefix_tmpdir.mkdir("lib64")),
"/opt/local/lib",
]
)
new_binary = copy_binary(orig_binary)
spack.relocate.make_elf_binaries_relative(
[str(new_binary)], [str(orig_binary)], str(orig_binary.dirpath())
)
# Some compilers add rpaths so ensure changes included in final result
assert "$ORIGIN/lib:$ORIGIN/lib64:/opt/local/lib" in rpaths_for(new_binary)
 @pytest.mark.requires_executables("patchelf", "gcc")
 @skip_unless_linux
 def test_relocate_text_bin_with_message(binary_with_rpaths, copy_binary, prefix_tmpdir):

View File

@@ -3,7 +3,21 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 from collections import defaultdict
-from typing import Any, Callable, List, NamedTuple, Set, Union
+from typing import (
+    Any,
+    Callable,
+    Iterable,
+    List,
+    NamedTuple,
+    Optional,
+    Sequence,
+    Set,
+    Tuple,
+    Union,
+    overload,
+)
+
+from typing_extensions import Literal

 import spack.deptypes as dt
 import spack.spec
@@ -424,49 +438,95 @@ def traverse_topo_edges_generator(edges, visitor, key=id, root=True, all_edges=F
 # High-level API: traverse_edges, traverse_nodes, traverse_tree.
OrderType = Literal["pre", "post", "breadth", "topo"]
CoverType = Literal["nodes", "edges", "paths"]
DirectionType = Literal["children", "parents"]
@overload
def traverse_edges(
specs: Sequence["spack.spec.Spec"],
*,
root: bool = ...,
order: OrderType = ...,
cover: CoverType = ...,
direction: DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[False] = False,
key: Callable[["spack.spec.Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable["spack.spec.DependencySpec"]: ...
@overload
def traverse_edges(
specs: Sequence["spack.spec.Spec"],
*,
root: bool = ...,
order: OrderType = ...,
cover: CoverType = ...,
direction: DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[True],
key: Callable[["spack.spec.Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[Tuple[int, "spack.spec.DependencySpec"]]: ...
@overload
def traverse_edges(
specs: Sequence["spack.spec.Spec"],
*,
root: bool = ...,
order: OrderType = ...,
cover: CoverType = ...,
direction: DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: bool,
key: Callable[["spack.spec.Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[Union["spack.spec.DependencySpec", Tuple[int, "spack.spec.DependencySpec"]]]: ...
 def traverse_edges(
-    specs,
-    root=True,
-    order="pre",
-    cover="nodes",
-    direction="children",
+    specs: Sequence["spack.spec.Spec"],
+    root: bool = True,
+    order: OrderType = "pre",
+    cover: CoverType = "nodes",
+    direction: DirectionType = "children",
     deptype: Union[dt.DepFlag, dt.DepTypes] = "all",
-    depth=False,
-    key=id,
-    visited=None,
-):
+    depth: bool = False,
+    key: Callable[["spack.spec.Spec"], Any] = id,
+    visited: Optional[Set[Any]] = None,
+) -> Iterable[Union["spack.spec.DependencySpec", Tuple[int, "spack.spec.DependencySpec"]]]:
     """
-    Generator that yields edges from the DAG, starting from a list of root specs.
+    Iterable of edges from the DAG, starting from a list of root specs.

     Arguments:
-        specs (list): List of root specs (considered to be depth 0)
-        root (bool): Yield the root nodes themselves
-        order (str): What order of traversal to use in the DAG. For depth-first
-            search this can be ``pre`` or ``post``. For BFS this should be ``breadth``.
-            For topological order use ``topo``
-        cover (str): Determines how extensively to cover the dag. Possible values:
+        specs: List of root specs (considered to be depth 0)
+        root: Yield the root nodes themselves
+        order: What order of traversal to use in the DAG. For depth-first search this can be
+            ``pre`` or ``post``. For BFS this should be ``breadth``. For topological order use
+            ``topo``
+        cover: Determines how extensively to cover the dag. Possible values:
             ``nodes`` -- Visit each unique node in the dag only once.
-            ``edges`` -- If a node has been visited once but is reached along a
-            new path, it's accepted, but not recurisvely followed. This traverses
-            each 'edge' in the DAG once.
-            ``paths`` -- Explore every unique path reachable from the root.
-            This descends into visited subtrees and will accept nodes multiple
-            times if they're reachable by multiple paths.
-        direction (str): ``children`` or ``parents``. If ``children``, does a traversal
-            of this spec's children. If ``parents``, traverses upwards in the DAG
-            towards the root.
+            ``edges`` -- If a node has been visited once but is reached along a new path, it's
+            accepted, but not recurisvely followed. This traverses each 'edge' in the DAG once.
+            ``paths`` -- Explore every unique path reachable from the root. This descends into
+            visited subtrees and will accept nodes multiple times if they're reachable by multiple
+            paths.
+        direction: ``children`` or ``parents``. If ``children``, does a traversal of this spec's
+            children. If ``parents``, traverses upwards in the DAG towards the root.
         deptype: allowed dependency types
-        depth (bool): When ``False``, yield just edges. When ``True`` yield
-            the tuple (depth, edge), where depth corresponds to the depth
-            at which edge.spec was discovered.
+        depth: When ``False``, yield just edges. When ``True`` yield the tuple (depth, edge), where
+            depth corresponds to the depth at which edge.spec was discovered.
         key: function that takes a spec and outputs a key for uniqueness test.
-        visited (set or None): a set of nodes not to follow
+        visited: a set of nodes not to follow

     Returns:
-        A generator that yields ``DependencySpec`` if depth is ``False``
-        or a tuple of ``(depth, DependencySpec)`` if depth is ``True``.
+        An iterable of ``DependencySpec`` if depth is ``False`` or a tuple of
+        ``(depth, DependencySpec)`` if depth is ``True``.
     """
     # validate input
     if order == "topo":
@@ -484,7 +544,7 @@ def traverse_edges(
root_edges = with_artificial_edges(specs)
# Depth-first
- if order in ("pre", "post"):
+ if order == "pre" or order == "post":
return traverse_depth_first_edges_generator(
root_edges, visitor, order == "post", root, depth
)
@@ -496,79 +556,135 @@ def traverse_edges(
)
- def traverse_nodes(
- specs,
- root=True,
- order="pre",
- cover="nodes",
- direction="children",
@overload
def traverse_nodes(
specs: Sequence["spack.spec.Spec"],
*,
root: bool = ...,
order: OrderType = ...,
cover: CoverType = ...,
direction: DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[False] = False,
key: Callable[["spack.spec.Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable["spack.spec.Spec"]: ...
@overload
def traverse_nodes(
specs: Sequence["spack.spec.Spec"],
*,
root: bool = ...,
order: OrderType = ...,
cover: CoverType = ...,
direction: DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: Literal[True],
key: Callable[["spack.spec.Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[Tuple[int, "spack.spec.Spec"]]: ...
@overload
def traverse_nodes(
specs: Sequence["spack.spec.Spec"],
*,
root: bool = ...,
order: OrderType = ...,
cover: CoverType = ...,
direction: DirectionType = ...,
deptype: Union[dt.DepFlag, dt.DepTypes] = ...,
depth: bool,
key: Callable[["spack.spec.Spec"], Any] = ...,
visited: Optional[Set[Any]] = ...,
) -> Iterable[Union["spack.spec.Spec", Tuple[int, "spack.spec.Spec"]]]: ...
def traverse_nodes(
specs: Sequence["spack.spec.Spec"],
*,
root: bool = True,
order: OrderType = "pre",
cover: CoverType = "nodes",
direction: DirectionType = "children",
deptype: Union[dt.DepFlag, dt.DepTypes] = "all",
- depth=False,
- key=id,
- visited=None,
- ):
+ depth: bool = False,
+ key: Callable[["spack.spec.Spec"], Any] = id,
+ visited: Optional[Set[Any]] = None,
+ ) -> Iterable[Union["spack.spec.Spec", Tuple[int, "spack.spec.Spec"]]]:
""" """
Generator that yields specs from the DAG, starting from a list of root specs. Iterable of specs from the DAG, starting from a list of root specs.
Arguments: Arguments:
specs (list): List of root specs (considered to be depth 0) specs: List of root specs (considered to be depth 0)
root (bool): Yield the root nodes themselves root: Yield the root nodes themselves
order (str): What order of traversal to use in the DAG. For depth-first order: What order of traversal to use in the DAG. For depth-first search this can be
search this can be ``pre`` or ``post``. For BFS this should be ``breadth``. ``pre`` or ``post``. For BFS this should be ``breadth``.
cover (str): Determines how extensively to cover the dag. Possible values: cover: Determines how extensively to cover the dag. Possible values:
``nodes`` -- Visit each unique node in the dag only once. ``nodes`` -- Visit each unique node in the dag only once.
``edges`` -- If a node has been visited once but is reached along a ``edges`` -- If a node has been visited once but is reached along a new path, it's
new path, it's accepted, but not recurisvely followed. This traverses accepted, but not recurisvely followed. This traverses each 'edge' in the DAG once.
each 'edge' in the DAG once. ``paths`` -- Explore every unique path reachable from the root. This descends into
``paths`` -- Explore every unique path reachable from the root. visited subtrees and will accept nodes multiple times if they're reachable by multiple
This descends into visited subtrees and will accept nodes multiple paths.
times if they're reachable by multiple paths. direction: ``children`` or ``parents``. If ``children``, does a traversal of this spec's
direction (str): ``children`` or ``parents``. If ``children``, does a traversal children. If ``parents``, traverses upwards in the DAG towards the root.
of this spec's children. If ``parents``, traverses upwards in the DAG
towards the root.
deptype: allowed dependency types deptype: allowed dependency types
depth (bool): When ``False``, yield just edges. When ``True`` yield depth: When ``False``, yield just edges. When ``True`` yield the tuple ``(depth, edge)``,
the tuple ``(depth, edge)``, where depth corresponds to the depth where depth corresponds to the depth at which ``edge.spec`` was discovered.
at which ``edge.spec`` was discovered.
key: function that takes a spec and outputs a key for uniqueness test. key: function that takes a spec and outputs a key for uniqueness test.
visited (set or None): a set of nodes not to follow visited: a set of nodes not to follow
Yields: Yields:
By default :class:`~spack.spec.Spec`, or a tuple ``(depth, Spec)`` if depth is By default :class:`~spack.spec.Spec`, or a tuple ``(depth, Spec)`` if depth is
set to ``True``. set to ``True``.
""" """
- for item in traverse_edges(specs, root, order, cover, direction, deptype, depth, key, visited):
- yield (item[0], item[1].spec) if depth else item.spec
+ for item in traverse_edges(
+ specs,
root=root,
order=order,
cover=cover,
direction=direction,
deptype=deptype,
depth=depth,
key=key,
visited=visited,
):
yield (item[0], item[1].spec) if depth else item.spec # type: ignore
def traverse_tree(
- specs, cover="nodes", deptype: Union[dt.DepFlag, dt.DepTypes] = "all", key=id, depth_first=True
- ):
+ specs: Sequence["spack.spec.Spec"],
+ cover: CoverType = "nodes",
deptype: Union[dt.DepFlag, dt.DepTypes] = "all",
key: Callable[["spack.spec.Spec"], Any] = id,
depth_first: bool = True,
) -> Iterable[Tuple[int, "spack.spec.DependencySpec"]]:
""" """
Generator that yields ``(depth, DependencySpec)`` tuples in the depth-first Generator that yields ``(depth, DependencySpec)`` tuples in the depth-first
pre-order, so that a tree can be printed from it. pre-order, so that a tree can be printed from it.
Arguments: Arguments:
specs (list): List of root specs (considered to be depth 0) specs: List of root specs (considered to be depth 0)
cover (str): Determines how extensively to cover the dag. Possible values: cover: Determines how extensively to cover the dag. Possible values:
``nodes`` -- Visit each unique node in the dag only once. ``nodes`` -- Visit each unique node in the dag only once.
``edges`` -- If a node has been visited once but is reached along a ``edges`` -- If a node has been visited once but is reached along a
new path, it's accepted, but not recurisvely followed. This traverses new path, it's accepted, but not recurisvely followed. This traverses each 'edge' in
each 'edge' in the DAG once. the DAG once.
``paths`` -- Explore every unique path reachable from the root. ``paths`` -- Explore every unique path reachable from the root. This descends into
This descends into visited subtrees and will accept nodes multiple visited subtrees and will accept nodes multiple times if they're reachable by multiple
times if they're reachable by multiple paths. paths.
deptype: allowed dependency types deptype: allowed dependency types
key: function that takes a spec and outputs a key for uniqueness test. key: function that takes a spec and outputs a key for uniqueness test.
depth_first (bool): Explore the tree in depth-first or breadth-first order. depth_first: Explore the tree in depth-first or breadth-first order. When setting
When setting ``depth_first=True`` and ``cover=nodes``, each spec only ``depth_first=True`` and ``cover=nodes``, each spec only occurs once at the shallowest
occurs once at the shallowest level, which is useful when rendering level, which is useful when rendering the tree in a terminal.
the tree in a terminal.
Returns: Returns:
A generator that yields ``(depth, DependencySpec)`` tuples in such an order A generator that yields ``(depth, DependencySpec)`` tuples in such an order that a tree can
that a tree can be printed. be printed.
""" """
# BFS only makes sense when going over edges and nodes, for paths the tree is # BFS only makes sense when going over edges and nodes, for paths the tree is
# identical to DFS, which is much more efficient then. # identical to DFS, which is much more efficient then.
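Taken together, the overloads above let a type checker infer the element type of a traversal from the literal value of ``depth``. A minimal usage sketch, assuming the helpers are imported from ``spack.traverse`` and that any concrete spec is available to act as a root:

import spack.spec
from spack.traverse import traverse_edges, traverse_nodes

root_specs = [spack.spec.Spec("zlib").concretized()]  # any concrete spec works here

# depth=False (the default): plain DependencySpec / Spec items.
for edge in traverse_edges(root_specs, cover="edges"):
    print(edge.spec.name)

# depth=True: (depth, item) tuples, which the Literal[True] overloads now encode.
for depth, node in traverse_nodes(root_specs, depth=True, order="breadth"):
    print("  " * depth + node.name)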

View File

@@ -7,11 +7,12 @@
import subprocess
import sys
from pathlib import Path, PurePath
from typing import Callable, Dict, Optional, Sequence, TextIO, Type, Union, overload
import llnl.util.tty as tty
import spack.error
- import spack.util.environment
+ from spack.util.environment import EnvironmentModifications
__all__ = ["Executable", "which", "which_string", "ProcessError"]
@@ -19,33 +20,29 @@
class Executable:
"""Class representing a program that can be run on the command line."""
- def __init__(self, name):
+ def __init__(self, name: str) -> None:
file_path = str(Path(name))
if sys.platform != "win32" and name.startswith("."):
# pathlib strips the ./ from relative paths so it must be added back
file_path = os.path.join(".", file_path)
self.exe = [file_path]
- self.default_env = {}
- self.default_envmod = spack.util.environment.EnvironmentModifications()
- self.returncode = None
+ self.default_env: Dict[str, str] = {}
+ self.default_envmod = EnvironmentModifications()
+ self.returncode = 0
self.ignore_quotes = False
- if not self.exe:
- raise ProcessError("Cannot construct executable for '%s'" % name)
- def add_default_arg(self, *args):
+ def add_default_arg(self, *args: str) -> None:
"""Add default argument(s) to the command."""
self.exe.extend(args)
- def with_default_args(self, *args):
+ def with_default_args(self, *args: str) -> "Executable":
"""Same as add_default_arg, but returns a copy of the executable."""
new = self.copy()
new.add_default_arg(*args)
return new
- def copy(self):
+ def copy(self) -> "Executable":
"""Return a copy of this Executable."""
new = Executable(self.exe[0])
new.exe[:] = self.exe
@@ -53,7 +50,7 @@ def copy(self):
new.default_envmod.extend(self.default_envmod)
return new
- def add_default_env(self, key, value):
+ def add_default_env(self, key: str, value: str) -> None:
"""Set an environment variable when the command is run.
Parameters:
@@ -62,68 +59,109 @@ def add_default_env(self, key, value):
""" """
self.default_env[key] = value self.default_env[key] = value
def add_default_envmod(self, envmod): def add_default_envmod(self, envmod: EnvironmentModifications) -> None:
"""Set an EnvironmentModifications to use when the command is run.""" """Set an EnvironmentModifications to use when the command is run."""
self.default_envmod.extend(envmod) self.default_envmod.extend(envmod)
@property @property
def command(self): def command(self) -> str:
"""The command-line string. """Returns the entire command-line string"""
Returns:
str: The executable and default arguments
"""
return " ".join(self.exe) return " ".join(self.exe)
@property @property
def name(self): def name(self) -> str:
"""The executable name. """Returns the executable name"""
Returns:
str: The basename of the executable
"""
return PurePath(self.path).name return PurePath(self.path).name
@property @property
def path(self): def path(self) -> str:
"""The path to the executable. """Returns the executable path"""
Returns:
str: The path to the executable
"""
return str(PurePath(self.exe[0])) return str(PurePath(self.exe[0]))
def __call__(self, *args, **kwargs): @overload
"""Run this executable in a subprocess. def __call__(
self,
*args: str,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Optional[TextIO], str] = ...,
error: Union[Optional[TextIO], str] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> None: ...
@overload
def __call__(
self,
*args: str,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Type[str], Callable],
error: Union[Optional[TextIO], str, Type[str], Callable] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> str: ...
@overload
def __call__(
self,
*args: str,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Optional[TextIO], str, Type[str], Callable] = ...,
error: Union[Type[str], Callable],
_dump_env: Optional[Dict[str, str]] = ...,
) -> str: ...
def __call__(
self,
*args: str,
fail_on_error: bool = True,
ignore_errors: Union[int, Sequence[int]] = (),
ignore_quotes: Optional[bool] = None,
timeout: Optional[int] = None,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = None,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = None,
input: Optional[TextIO] = None,
output: Union[Optional[TextIO], str, Type[str], Callable] = None,
error: Union[Optional[TextIO], str, Type[str], Callable] = None,
_dump_env: Optional[Dict[str, str]] = None,
) -> Optional[str]:
"""Runs this executable in a subprocess.
Parameters:
- *args (str): Command-line arguments to the executable to run
- Keyword Arguments:
- _dump_env (dict): Dict to be set to the environment actually
- used (envisaged for testing purposes only)
- env (dict or EnvironmentModifications): The environment with which
- to run the executable
- extra_env (dict or EnvironmentModifications): Extra items to add to
- the environment (neither requires nor precludes env)
- fail_on_error (bool): Raise an exception if the subprocess returns
- an error. Default is True. The return code is available as
- ``exe.returncode``
- ignore_errors (int or list): A list of error codes to ignore.
- If these codes are returned, this process will not raise
- an exception even if ``fail_on_error`` is set to ``True``
- ignore_quotes (bool): If False, warn users that quotes are not needed
- as Spack does not use a shell. Defaults to False.
- timeout (int or float): The number of seconds to wait before killing
- the child process
- input: Where to read stdin from
- output: Where to send stdout
- error: Where to send stderr
+ *args: command-line arguments to the executable to run
+ fail_on_error: if True, raises an exception if the subprocess returns an error
+ The return code is available as ``self.returncode``
+ ignore_errors: a sequence of error codes to ignore. If these codes are returned, this
+ process will not raise an exception, even if ``fail_on_error`` is set to ``True``
+ ignore_quotes: if False, warn users that quotes are not needed, as Spack does not
+ use a shell. If None, use ``self.ignore_quotes``.
+ timeout: the number of seconds to wait before killing the child process
+ env: the environment with which to run the executable
+ extra_env: extra items to add to the environment (neither requires nor precludes env)
+ input: where to read stdin from
+ output: where to send stdout
+ error: where to send stderr
+ _dump_env: dict to be set to the environment actually used (envisaged for
+ testing purposes only)
Accepted values for input, output, and error:
* python streams, e.g. open Python file objects, or ``os.devnull``
- * filenames, which will be automatically opened for writing
* ``str``, as in the Python string type. If you set these to ``str``,
output and error will be written to pipes and returned as a string.
If both ``output`` and ``error`` are set to ``str``, then one string
@@ -133,8 +171,11 @@ def __call__(self, *args, **kwargs):
Behaves the same as ``str``, except that value is also written to
``stdout`` or ``stderr``.
- By default, the subprocess inherits the parent's file descriptors.
+ For output and error it's also accepted:
+ * filenames, which will be automatically opened for writing
+ By default, the subprocess inherits the parent's file descriptors.
"""
def process_cmd_output(out, err):
@@ -159,44 +200,34 @@ def process_cmd_output(out, err):
sys.stderr.write(errstr)
return result
- # Environment
- env_arg = kwargs.get("env", None)
# Setup default environment
- env = os.environ.copy() if env_arg is None else {}
- self.default_envmod.apply_modifications(env)
- env.update(self.default_env)
+ current_environment = os.environ.copy() if env is None else {}
+ self.default_envmod.apply_modifications(current_environment)
+ current_environment.update(self.default_env)
# Apply env argument
- if isinstance(env_arg, spack.util.environment.EnvironmentModifications):
- env_arg.apply_modifications(env)
- elif env_arg:
- env.update(env_arg)
+ if isinstance(env, EnvironmentModifications):
+ env.apply_modifications(current_environment)
+ elif env:
+ current_environment.update(env)
# Apply extra env
- extra_env = kwargs.get("extra_env", {})
- if isinstance(extra_env, spack.util.environment.EnvironmentModifications):
- extra_env.apply_modifications(env)
- else:
- env.update(extra_env)
+ if isinstance(extra_env, EnvironmentModifications):
+ extra_env.apply_modifications(current_environment)
+ elif extra_env is not None:
+ current_environment.update(extra_env)
- if "_dump_env" in kwargs:
- kwargs["_dump_env"].clear()
- kwargs["_dump_env"].update(env)
+ if _dump_env is not None:
+ _dump_env.clear()
+ _dump_env.update(current_environment)
- fail_on_error = kwargs.pop("fail_on_error", True)
- ignore_errors = kwargs.pop("ignore_errors", ())
- ignore_quotes = kwargs.pop("ignore_quotes", self.ignore_quotes)
- timeout = kwargs.pop("timeout", None)
+ if ignore_quotes is None:
+ ignore_quotes = self.ignore_quotes
# If they just want to ignore one error code, make it a tuple.
if isinstance(ignore_errors, int):
ignore_errors = (ignore_errors,)
- input = kwargs.pop("input", None)
- output = kwargs.pop("output", None)
- error = kwargs.pop("error", None)
if input is str:
raise ValueError("Cannot use `str` as input stream.")
@@ -230,9 +261,15 @@ def streamify(arg, mode):
cmd_line_string = " ".join(escaped_cmd) cmd_line_string = " ".join(escaped_cmd)
tty.debug(cmd_line_string) tty.debug(cmd_line_string)
result = None
try: try:
proc = subprocess.Popen( proc = subprocess.Popen(
cmd, stdin=istream, stderr=estream, stdout=ostream, env=env, close_fds=False cmd,
stdin=istream,
stderr=estream,
stdout=ostream,
env=current_environment,
close_fds=False,
) )
out, err = proc.communicate(timeout=timeout) out, err = proc.communicate(timeout=timeout)
@@ -248,9 +285,6 @@ def streamify(arg, mode):
long_msg += "\n" + result
raise ProcessError("Command exited with status %d:" % proc.returncode, long_msg)
- return result
except OSError as e:
message = "Command: " + cmd_line_string
if " " in self.exe[0]:
@@ -286,6 +320,8 @@ def streamify(arg, mode):
if close_istream:
istream.close()
return result
def __eq__(self, other):
return hasattr(other, "exe") and self.exe == other.exe

View File

@@ -93,7 +93,7 @@ class AbseilCpp(CMakePackage):
depends_on("cmake@3.5:", when="@20190312:", type="build") depends_on("cmake@3.5:", when="@20190312:", type="build")
depends_on("cmake@3.1:", type="build") depends_on("cmake@3.1:", type="build")
depends_on("googletest", type="build", when="@20220623:") depends_on("googletest~absl", type="test", when="@20220623:")
def cmake_args(self): def cmake_args(self):
run_tests = self.run_tests and self.spec.satisfies("@20220623:") run_tests = self.run_tests and self.spec.satisfies("@20220623:")

View File

@@ -16,6 +16,7 @@ class ActsAlgebraPlugins(CMakePackage):
license("MPL-2.0", checked_by="stephenswat") license("MPL-2.0", checked_by="stephenswat")
version("0.26.2", sha256="0170f22e1a75493b86464f27991117bc2c5a9d52554c75786e321d4c591990e7")
version("0.26.1", sha256="8eb1e9e28ec2839d149b6a6bddd0f983b0cdf71c286c0aeb67ede31727c5b7d3") version("0.26.1", sha256="8eb1e9e28ec2839d149b6a6bddd0f983b0cdf71c286c0aeb67ede31727c5b7d3")
version("0.26.0", sha256="301702e3d0a3d12e46ae6d949f3027ddebd0b1167cbb3004d9a4a5697d3adc7f") version("0.26.0", sha256="301702e3d0a3d12e46ae6d949f3027ddebd0b1167cbb3004d9a4a5697d3adc7f")
version("0.25.0", sha256="bb0cba6e37558689d780a6de8f749abb3b96f8cd9e0c8851474eb4532e1e98b8") version("0.25.0", sha256="bb0cba6e37558689d780a6de8f749abb3b96f8cd9e0c8851474eb4532e1e98b8")

View File

@@ -40,6 +40,7 @@ class Acts(CMakePackage, CudaPackage):
# Supported Acts versions
version("main", branch="main")
version("master", branch="main", deprecated=True)  # For compatibility
version("38.2.0", commit="9cb8f4494656553fd9b85955938b79b2fac4c9b0", submodules=True)
version("38.1.0", commit="8a20c88808f10bf4fcdfd7c6e077f23614c3ab90", submodules=True)
version("38.0.0", commit="0a6b5155e29e3b755bf351b8a76067fff9b4214b", submodules=True)
version("37.4.0", commit="4ae9a44f54c854599d1d753222ec36e0b5b4e9c7", submodules=True)

View File

@@ -25,6 +25,7 @@ class Amrex(CMakePackage, CudaPackage, ROCmPackage):
license("BSD-3-Clause") license("BSD-3-Clause")
version("develop", branch="development") version("develop", branch="development")
version("25.01", sha256="29eb35cf67d66b0fd0654282454c210abfadf27fcff8478b256e3196f237c74f")
version("24.12", sha256="ca4b41ac73fabb9cf3600b530c9823eb3625f337d9b7b9699c1089e81c67fc67") version("24.12", sha256="ca4b41ac73fabb9cf3600b530c9823eb3625f337d9b7b9699c1089e81c67fc67")
version("24.11", sha256="31cc37b39f15e02252875815f6066046fc56a479bf459362b9889b0d6a202df6") version("24.11", sha256="31cc37b39f15e02252875815f6066046fc56a479bf459362b9889b0d6a202df6")
version("24.10", sha256="a2d15e417bd7c41963749338e884d939c80c5f2fcae3279fe3f1b463e3e4208a") version("24.10", sha256="a2d15e417bd7c41963749338e884d939c80c5f2fcae3279fe3f1b463e3e4208a")
@@ -151,6 +152,8 @@ class Amrex(CMakePackage, CudaPackage, ROCmPackage):
# Build dependencies
depends_on("mpi", when="+mpi")
with when("+linear_solvers"):
    depends_on("rocsparse", when="@25.01: +rocm")
with when("+fft"):
    depends_on("rocfft", when="+rocm")
    depends_on("fftw@3", when="~cuda ~rocm ~sycl")

View File

@@ -16,6 +16,7 @@ class Armadillo(CMakePackage):
license("Apache-2.0") license("Apache-2.0")
version("14.2.2", sha256="3054c8e63db3abdf1a5c8f9fdb7e6b4ad833f9bcfb58324c0ff86de0784c70e0")
version("14.0.3", sha256="ebd6215eeb01ee412fed078c8a9f7f87d4e1f6187ebcdc1bc09f46095a4f4003") version("14.0.3", sha256="ebd6215eeb01ee412fed078c8a9f7f87d4e1f6187ebcdc1bc09f46095a4f4003")
version("14.0.2", sha256="248e2535fc092add6cb7dea94fc86ae1c463bda39e46fd82d2a7165c1c197dff") version("14.0.2", sha256="248e2535fc092add6cb7dea94fc86ae1c463bda39e46fd82d2a7165c1c197dff")
version("12.8.4", sha256="558fe526b990a1663678eff3af6ec93f79ee128c81a4c8aef27ad328fae61138") version("12.8.4", sha256="558fe526b990a1663678eff3af6ec93f79ee128c81a4c8aef27ad328fae61138")
@@ -33,14 +34,14 @@ class Armadillo(CMakePackage):
depends_on("c", type="build") depends_on("c", type="build")
depends_on("cxx", type="build") depends_on("cxx", type="build")
variant("hdf5", default=False, description="Include HDF5 support") variant("hdf5", default=False, description="Include HDF5 support", when="@:10")
depends_on("cmake@2.8.12:", type="build") depends_on("cmake@2.8.12:", type="build")
depends_on("cmake@3.5:", type="build", when="@14:") depends_on("cmake@3.5:", type="build", when="@14:")
depends_on("arpack-ng") # old arpack causes undefined symbols depends_on("arpack-ng") # old arpack causes undefined symbols
depends_on("blas") depends_on("blas")
depends_on("lapack") depends_on("lapack")
depends_on("superlu@5.2:") depends_on("superlu@5.2:5") # only superlu@5 is supported
depends_on("hdf5", when="+hdf5") depends_on("hdf5", when="+hdf5")
# Adds an `#undef linux` to prevent preprocessor expansion of include # Adds an `#undef linux` to prevent preprocessor expansion of include

View File

@@ -2,47 +2,34 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
- import spack.build_systems.autotools
- import spack.build_systems.meson
from spack.package import *
- class Cairo(MesonPackage, AutotoolsPackage):
+ class Cairo(AutotoolsPackage):
"""Cairo is a 2D graphics library with support for multiple output
devices."""
homepage = "https://www.cairographics.org/"
url = "https://www.cairographics.org/releases/cairo-1.16.0.tar.xz"
- git = "https://gitlab.freedesktop.org/cairo/cairo.git"
license("LGPL-2.1-or-later OR MPL-1.1", checked_by="tgamblin")
- # Cairo has meson.build @1.17.4:, but we only support @1.17.8: here
- build_system(
- conditional("autotools", when="@:1.17.6"),
- conditional("meson", when="@1.17.8:"),
- default="meson",
- )
- version("1.18.2", sha256="a62b9bb42425e844cc3d6ddde043ff39dbabedd1542eba57a2eb79f85889d45a")
version("1.18.0", sha256="243a0736b978a33dee29f9cca7521733b78a65b5418206fef7bd1c3d4cf10b64")
- # 1.17.8: https://gitlab.freedesktop.org/cairo/cairo/-/issues/646 (we enable tee by default)
- version(
- "1.17.6",
- sha256="4eebc4c2bad0402bc3f501db184417094657d111fb6c06f076a82ea191fe1faf",
- url="https://cairographics.org/snapshots/cairo-1.17.6.tar.xz",
- )
version(
"1.17.4",
sha256="74b24c1ed436bbe87499179a3b27c43f4143b8676d8ad237a6fa787401959705",
url="https://cairographics.org/snapshots/cairo-1.17.4.tar.xz",
- )
+ )  # Snapshot
version(
"1.17.2",
sha256="6b70d4655e2a47a22b101c666f4b29ba746eda4aa8a0f7255b32b2e9408801df",
url="https://cairographics.org/snapshots/cairo-1.17.2.tar.xz",
)  # Snapshot
- version(
- "1.16.0",
- sha256="5e7b29b3f113ef870d1e3ecf8adf21f923396401604bda16d44be45e66052331",
- preferred=True,
- )
+ version("1.16.0", sha256="5e7b29b3f113ef870d1e3ecf8adf21f923396401604bda16d44be45e66052331")
version("1.14.12", sha256="8c90f00c500b2299c0a323dd9beead2a00353752b2092ead558139bd67f7bf16")
version("1.14.8", sha256="d1f2d98ae9a4111564f6de4e013d639cf77155baf2556582295a0f00a9bc5e20")
version("1.14.0", sha256="2cf5f81432e77ea4359af9dcd0f4faf37d015934501391c311bfd2d19a0134b7")
@@ -60,15 +47,6 @@ class Cairo(MesonPackage, AutotoolsPackage):
variant("shared", default=True, description="Build shared libraries") variant("shared", default=True, description="Build shared libraries")
variant("pic", default=True, description="Enable position-independent code (PIC)") variant("pic", default=True, description="Enable position-independent code (PIC)")
with when("build_system=autotools"):
depends_on("autoconf", type="build")
depends_on("automake", type="build")
depends_on("libtool", type="build")
depends_on("m4", type="build")
with when("build_system=meson"):
depends_on("meson@0.59:")
depends_on("libx11", when="+X") depends_on("libx11", when="+X")
depends_on("libxext", when="+X") depends_on("libxext", when="+X")
depends_on("libxrender", when="+X") depends_on("libxrender", when="+X")
@@ -76,21 +54,16 @@ class Cairo(MesonPackage, AutotoolsPackage):
depends_on("python", when="+X", type="build") depends_on("python", when="+X", type="build")
depends_on("libpng", when="+png") depends_on("libpng", when="+png")
depends_on("glib") depends_on("glib")
depends_on("pixman@0.40.0:", when="@1.18.2:")
depends_on("pixman@0.36.0:", when="@1.17.2:") depends_on("pixman@0.36.0:", when="@1.17.2:")
depends_on("pixman") depends_on("pixman")
depends_on("automake", type="build")
depends_on("autoconf", type="build")
depends_on("libtool", type="build")
depends_on("m4", type="build")
depends_on("freetype build_system=autotools", when="+ft") depends_on("freetype build_system=autotools", when="+ft")
# Require freetype with FT_Color
# https://gitlab.freedesktop.org/cairo/cairo/-/issues/792
depends_on("freetype@2.10:", when="@1.18.0: +ft")
depends_on("pkgconfig", type="build") depends_on("pkgconfig", type="build")
depends_on("fontconfig@2.10.91:", when="+fc") # Require newer version of fontconfig. depends_on("fontconfig@2.10.91:", when="+fc") # Require newer version of fontconfig.
depends_on("which", type="build") depends_on("which", type="build")
depends_on("zlib", when="+pdf")
# lzo is not strictly required, but cannot be disabled and may be pulled in accidentally
# https://github.com/mesonbuild/meson/issues/8224
# https://github.com/microsoft/vcpkg/pull/38313
depends_on("lzo", when="@1.18: build_system=meson")
conflicts("+png", when="platform=darwin") conflicts("+png", when="platform=darwin")
conflicts("+svg", when="platform=darwin") conflicts("+svg", when="platform=darwin")
@@ -98,37 +71,10 @@ class Cairo(MesonPackage, AutotoolsPackage):
# patch from https://gitlab.freedesktop.org/cairo/cairo/issues/346
patch("fontconfig.patch", when="@1.16.0:1.17.2")
- # Patch autogen.sh to not regenerate docs to avoid a dependency on gtk-doc
- patch("disable-gtk-docs.patch", when="build_system=autotools ^autoconf@2.70:")
+ # Don't regenerate docs to avoid a dependency on gtk-doc
+ patch("disable-gtk-docs.patch", when="^autoconf@2.70:")
- def check(self):
- """The checks are only for the cairo devs: They write others shouldn't bother"""
- pass
+ def autoreconf(self, spec, prefix):
- class MesonBuilder(spack.build_systems.meson.MesonBuilder):
- def meson_args(self):
- args = ["-Dtee=enabled"]
- if "+X" in self.spec:
- args.extend(["-Dxlib=enabled", "-Dxcb=enabled"])
- else:
- args.extend(["-Dxlib=disabled", "-Dxcb=disabled"])
- args.append("-Dzlib=" + ("enabled" if "+pdf" in self.spec else "disabled"))
- args.append(
- "-Dpng=" + ("enabled" if ("+png" in self.spec or "+svg" in self.spec) else "disabled")
- )
- args.append("-Dfreetype=" + ("enabled" if "+ft" in self.spec else "disabled"))
- args.append("-Dfontconfig=" + ("enabled" if "+fc" in self.spec else "disabled"))
- args.append("-Ddefault_library=" + ("shared" if "+shared" in self.spec else "static"))
- args.append("-Db_staticpic=" + ("true" if "+pic" in self.spec else "false"))
- return args
- class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
- def autoreconf(self, pkg, spec, prefix):
# Regenerate, directing the script *not* to call configure before Spack
# does
which("sh")("./autogen.sh", extra_env={"NOCONFIGURE": "1"})
@@ -156,3 +102,7 @@ def configure_args(self):
args.append(f"LIBS={libs}") args.append(f"LIBS={libs}")
return args return args
def check(self):
"""The checks are only for the cairo devs: They write others shouldn't bother"""
pass

View File

@@ -97,7 +97,8 @@ class Chai(CachedCMakePackage, CudaPackage, ROCmPackage):
)
version("1.0", tag="v1.0", commit="501a098ad879dc8deb4a74fcfe8c08c283a10627", submodules=True)
- depends_on("cxx", type="build")  # generated
+ depends_on("c", type="build")
+ depends_on("cxx", type="build")
# Patching Umpire for dual BLT targets import changed MPI target name in Umpire link interface
# We propagate the patch here.

View File

@@ -790,7 +790,7 @@ def edit(self, pkg, spec, prefix):
"# include Plumed.inc as recommended by" "# include Plumed.inc as recommended by"
"PLUMED to include libraries and flags" "PLUMED to include libraries and flags"
) )
mkf.write("include {0}\n".format(spec["plumed"].package.plumed_inc)) mkf.write("include {0}\n".format(self.pkg["plumed"].plumed_inc))
mkf.write("\n# COMPILER, LINKER, TOOLS\n\n") mkf.write("\n# COMPILER, LINKER, TOOLS\n\n")
mkf.write( mkf.write(

View File

@@ -19,6 +19,7 @@ class Detray(CMakePackage):
license("MPL-2.0", checked_by="stephenswat") license("MPL-2.0", checked_by="stephenswat")
version("0.87.0", sha256="2d4a76432dd6ddbfc00b88b5d482072e471fefc264b60748bb1f9a123963576e")
version("0.86.0", sha256="98350c94e8a2395b8712b7102fd449536857e8158b38a96cc913c79b70301170") version("0.86.0", sha256="98350c94e8a2395b8712b7102fd449536857e8158b38a96cc913c79b70301170")
version("0.85.0", sha256="a0121a27fd08243d4a6aab060e8ab379ad5129e96775b45f6a683835767fa8e7") version("0.85.0", sha256="a0121a27fd08243d4a6aab060e8ab379ad5129e96775b45f6a683835767fa8e7")
version("0.84.0", sha256="b1d133a97dc90b1513f8c1ef235ceaa542d80243028a41f59a79300c7d71eb25") version("0.84.0", sha256="b1d133a97dc90b1513f8c1ef235ceaa542d80243028a41f59a79300c7d71eb25")
@@ -77,6 +78,7 @@ class Detray(CMakePackage):
depends_on("acts-algebra-plugins +vc", when="+vc") depends_on("acts-algebra-plugins +vc", when="+vc")
depends_on("acts-algebra-plugins +eigen", when="+eigen") depends_on("acts-algebra-plugins +eigen", when="+eigen")
depends_on("acts-algebra-plugins +smatrix", when="+smatrix") depends_on("acts-algebra-plugins +smatrix", when="+smatrix")
depends_on("acts-algebra-plugins@0.26.0:", when="@0.87:")
# Detray imposes requirements on the C++ standard values used by Algebra # Detray imposes requirements on the C++ standard values used by Algebra
# Plugins. # Plugins.

View File

@@ -9,7 +9,7 @@ class Dftd4(MesonPackage):
"""Generally Applicable Atomic-Charge Dependent London Dispersion Correction""" """Generally Applicable Atomic-Charge Dependent London Dispersion Correction"""
homepage = "https://www.chemie.uni-bonn.de/pctc/mulliken-center/software/dftd4" homepage = "https://www.chemie.uni-bonn.de/pctc/mulliken-center/software/dftd4"
url = "https://github.com/dftd4/dftd4/releases/download/v3.5.0/dftd4-3.5.0-source.tar.xz" url = "https://github.com/dftd4/dftd4/releases/download/v0.0.0/dftd4-0.0.0.tar.xz"
git = "https://github.com/dftd4/dftd4.git" git = "https://github.com/dftd4/dftd4.git"
maintainers("awvwgk") maintainers("awvwgk")
@@ -17,6 +17,8 @@ class Dftd4(MesonPackage):
license("LGPL-3.0-only") license("LGPL-3.0-only")
version("main", branch="main") version("main", branch="main")
version("3.7.0", sha256="4e8749df6852bf863d5d1831780a2d30e9ac4afcfebbbfe5f6a6a73d06d6c6ee")
version("3.6.0", sha256="56b3b4650853a34347d3d56c93d7596ecbe2208c4a14dbd027959fd4a009679d")
version("3.5.0", sha256="d2bab992b5ef999fd13fec8eb1da9e9e8d94b8727a2e624d176086197a00a46f") version("3.5.0", sha256="d2bab992b5ef999fd13fec8eb1da9e9e8d94b8727a2e624d176086197a00a46f")
version("3.4.0", sha256="24fcb225cdd5c292ac26f7d3204ee3c4024174adb5272eeda9ae7bc57113ec8d") version("3.4.0", sha256="24fcb225cdd5c292ac26f7d3204ee3c4024174adb5272eeda9ae7bc57113ec8d")
version("3.3.0", sha256="408720b8545532d5240dd743c05d57b140af983192dad6d965b0d79393d0a9ef") version("3.3.0", sha256="408720b8545532d5240dd743c05d57b140af983192dad6d965b0d79393d0a9ef")
@@ -54,3 +56,8 @@ def meson_args(self):
"-Dopenmp={0}".format(str("+openmp" in self.spec).lower()), "-Dopenmp={0}".format(str("+openmp" in self.spec).lower()),
"-Dpython={0}".format(str("+python" in self.spec).lower()), "-Dpython={0}".format(str("+python" in self.spec).lower()),
] ]
def url_for_version(self, version):
if version <= Version("3.6.0"):
return f"https://github.com/dftd4/dftd4/releases/download/v{version}/dftd4-{version}-source.tar.xz"
return super().url_for_version(version)
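A small sketch of what the override above is meant to produce, using Spack's ``Version`` type; the two URL shapes below just restate the patterns visible in this diff:

from spack.version import Version

def sketch_url(version: str) -> str:
    # Mirrors the url_for_version logic: releases up to 3.6.0 keep the
    # "-source" tarball name, newer ones follow the plain url template above.
    v = Version(version)
    base = f"https://github.com/dftd4/dftd4/releases/download/v{v}"
    if v <= Version("3.6.0"):
        return f"{base}/dftd4-{v}-source.tar.xz"
    return f"{base}/dftd4-{v}.tar.xz"

assert sketch_url("3.6.0").endswith("dftd4-3.6.0-source.tar.xz")
assert sketch_url("3.7.0").endswith("dftd4-3.7.0.tar.xz")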

View File

@@ -20,15 +20,17 @@ class Ensmallen(CMakePackage):
license("BSD-3-Clause") license("BSD-3-Clause")
version("2.22.1", sha256="daf53fe96783043ca33151a3851d054a826fab8d9a173e6bcbbedd4a7eabf5b1")
version("2.21.1", sha256="820eee4d8aa32662ff6a7d883a1bcaf4e9bf9ca0a3171d94c5398fe745008750") version("2.21.1", sha256="820eee4d8aa32662ff6a7d883a1bcaf4e9bf9ca0a3171d94c5398fe745008750")
version("2.19.1", sha256="f36ad7f08b0688d2a8152e1c73dd437c56ed7a5af5facf65db6ffd977b275b2e") version("2.19.1", sha256="f36ad7f08b0688d2a8152e1c73dd437c56ed7a5af5facf65db6ffd977b275b2e")
depends_on("cxx", type="build") # generated depends_on("cxx", type="build")
variant("openmp", default=True, description="Use OpenMP for parallelization") variant("openmp", default=True, description="Use OpenMP for parallelization")
depends_on("cmake@3.3.2:") depends_on("cmake@3.3.2:")
depends_on("armadillo@9.800.0:") depends_on("armadillo@9.800.0:")
depends_on("armadillo@10.8.2:", when="@2.22:")
def cmake_args(self): def cmake_args(self):
args = [self.define_from_variant("USE_OPENMP", "openmp")] args = [self.define_from_variant("USE_OPENMP", "openmp")]

View File

@@ -20,6 +20,7 @@ class FluxCore(AutotoolsPackage):
license("LGPL-3.0-only") license("LGPL-3.0-only")
version("master", branch="master") version("master", branch="master")
version("0.67.0", sha256="9406e776cbeff971881143fd1b94c42ec912e5b226401d2d3d91d766dd81de8c")
version("0.66.0", sha256="0a25cfb1ebc033c249614eb2350c6fb57b00cdf3c584d0759c787f595c360daa") version("0.66.0", sha256="0a25cfb1ebc033c249614eb2350c6fb57b00cdf3c584d0759c787f595c360daa")
version("0.65.0", sha256="a60bc7ed13b8e6d09e99176123a474aad2d9792fff6eb6fd4da2a00e1d2865ab") version("0.65.0", sha256="a60bc7ed13b8e6d09e99176123a474aad2d9792fff6eb6fd4da2a00e1d2865ab")
version("0.64.0", sha256="0334d6191915f1b89b70cdbf14f24200f8899da31090df5f502020533b304bb3") version("0.64.0", sha256="0334d6191915f1b89b70cdbf14f24200f8899da31090df5f502020533b304bb3")
@@ -96,6 +97,7 @@ class FluxCore(AutotoolsPackage):
depends_on("py-pyyaml@3.10:", type=("build", "run")) depends_on("py-pyyaml@3.10:", type=("build", "run"))
depends_on("py-jsonschema@2.3:", type=("build", "run"), when="@:0.58.0") depends_on("py-jsonschema@2.3:", type=("build", "run"), when="@:0.58.0")
depends_on("py-ply", type=("build", "run"), when="@0.46.1:") depends_on("py-ply", type=("build", "run"), when="@0.46.1:")
depends_on("py-setuptools", type="build", when="@0.67.0:")
depends_on("jansson@2.10:") depends_on("jansson@2.10:")
depends_on("pkgconfig") depends_on("pkgconfig")
depends_on("lz4") depends_on("lz4")

View File

@@ -153,7 +153,7 @@ def common_args(self):
"CC={0}".format(env["CC"]), "CC={0}".format(env["CC"]),
"PREFIX={0}".format(self.spec.prefix.bin), "PREFIX={0}".format(self.spec.prefix.bin),
"MFEM_DIR={0}".format(self.spec["mfem"].prefix), "MFEM_DIR={0}".format(self.spec["mfem"].prefix),
"CONFIG_MK={0}".format(self.spec["mfem"].package.config_mk), "CONFIG_MK={0}".format(self.pkg["mfem"].config_mk),
] ]
# https://github.com/spack/spack/issues/42839 # https://github.com/spack/spack/issues/42839

View File

@@ -17,11 +17,13 @@ class Gnutls(AutotoolsPackage):
homepage = "https://www.gnutls.org" homepage = "https://www.gnutls.org"
url = "https://www.gnupg.org/ftp/gcrypt/gnutls/v3.5/gnutls-3.5.19.tar.xz" url = "https://www.gnupg.org/ftp/gcrypt/gnutls/v3.5/gnutls-3.5.19.tar.xz"
list_depth = 2
maintainers("alecbcs") maintainers("alecbcs")
license("LGPL-2.1-or-later") license("LGPL-2.1-or-later")
version("3.8.8", sha256="ac4f020e583880b51380ed226e59033244bc536cad2623f2e26f5afa2939d8fb")
version("3.8.4", sha256="2bea4e154794f3f00180fa2a5c51fe8b005ac7a31cd58bd44cdfa7f36ebc3a9b") version("3.8.4", sha256="2bea4e154794f3f00180fa2a5c51fe8b005ac7a31cd58bd44cdfa7f36ebc3a9b")
version("3.8.3", sha256="f74fc5954b27d4ec6dfbb11dea987888b5b124289a3703afcada0ee520f4173e") version("3.8.3", sha256="f74fc5954b27d4ec6dfbb11dea987888b5b124289a3703afcada0ee520f4173e")
version("3.7.8", sha256="c58ad39af0670efe6a8aee5e3a8b2331a1200418b64b7c51977fb396d4617114") version("3.7.8", sha256="c58ad39af0670efe6a8aee5e3a8b2331a1200418b64b7c51977fb396d4617114")

View File

@@ -15,6 +15,8 @@ class Googletest(CMakePackage):
maintainers("sethrj") maintainers("sethrj")
version("main", branch="main") version("main", branch="main")
version("1.15.2", sha256="7b42b4d6ed48810c5362c265a17faebe90dc2373c885e5216439d37927f02926")
version("1.15.0", sha256="7315acb6bf10e99f332c8a43f00d5fbb1ee6ca48c52f6b936991b216c586aaad")
version("1.14.0", sha256="8ad598c73ad796e0d8280b082cebd82a630d73e73cd3c70057938a6501bba5d7") version("1.14.0", sha256="8ad598c73ad796e0d8280b082cebd82a630d73e73cd3c70057938a6501bba5d7")
version("1.13.0", sha256="ad7fdba11ea011c1d925b3289cf4af2c66a352e18d4c7264392fead75e919363") version("1.13.0", sha256="ad7fdba11ea011c1d925b3289cf4af2c66a352e18d4c7264392fead75e919363")
version("1.12.1", sha256="81964fe578e9bd7c94dfdb09c8e4d6e6759e19967e397dbea48d1c10e45d0df2") version("1.12.1", sha256="81964fe578e9bd7c94dfdb09c8e4d6e6759e19967e397dbea48d1c10e45d0df2")
@@ -29,14 +31,18 @@ class Googletest(CMakePackage):
depends_on("c", type="build") depends_on("c", type="build")
depends_on("cxx", type="build") depends_on("cxx", type="build")
variant("absl", default=False, when="@1.12.1:", description="Build with abseil and RE2")
depends_on("abseil-cpp", when="+absl")
depends_on("re2", when="+absl")
variant("gmock", default=True, when="@1.8:", description="Build with gmock") variant("gmock", default=True, when="@1.8:", description="Build with gmock")
variant("pthreads", default=True, description="Build multithreaded version with pthreads") variant("pthreads", default=True, description="Build multithreaded version with pthreads")
variant("shared", default=True, description="Build shared libraries (DLLs)") variant("shared", default=True, description="Build shared libraries (DLLs)")
variant( variant(
"cxxstd", "cxxstd",
default="11", default="14",
values=("98", "11", "14", "17"), values=("98", "11", "14", "17", "20"),
multi=False, multi=False,
description="Use the specified C++ standard when building", description="Use the specified C++ standard when building",
) )
@@ -48,12 +54,13 @@ def cmake_args(self):
args = [
self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"),
self.define_from_variant("BUILD_GMOCK", "gmock"),
self.define_from_variant("GTEST_HAS_ABSL", "absl"),
self.define("gtest_disable_pthreads", spec.satisfies("~pthreads")),
]
- args.append(self.define("gtest_disable_pthreads", not spec.satisfies("+pthreads")))
- if spec.satisfies("@1.8:"):
- # New style (contains both Google Mock and Google Test)
+ if spec.satisfies("@:1.8.0"):
args.append(self.define("BUILD_GTEST", True))
- args.append(self.define_from_variant("BUILD_GMOCK", "gmock"))
return args
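To make the rewritten logic concrete, here is a small hedged sketch (plain Python, not a Spack recipe) of which CMake toggles the new ``cmake_args`` drives for a hypothetical ``googletest@1.15.2 +gmock +absl ~pthreads cxxstd=17`` spec:

# Not Spack code; it only restates the branch logic above for one hypothetical spec.
spec = {"version": "1.15.2", "gmock": True, "absl": True, "pthreads": False, "cxxstd": "17"}

cmake_toggles = {
    "CMAKE_CXX_STANDARD": spec["cxxstd"],
    "BUILD_GMOCK": spec["gmock"],
    "GTEST_HAS_ABSL": spec["absl"],
    "gtest_disable_pthreads": not spec["pthreads"],
}
# BUILD_GTEST is only added for the old releases (@:1.8.0), so it is absent here.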

View File

@@ -585,7 +585,7 @@ def patch(self):
)
if self.spec.satisfies("+plumed"):
- self.spec["plumed"].package.apply_patch(self)
+ self["plumed"].apply_patch(self)
if self.spec.satisfies("%nvhpc"):
# Disable obsolete workaround

View File

@@ -58,8 +58,8 @@ def build_targets(self):
spec = self.spec
targets.append("MFEM_DIR=%s" % spec["mfem"].prefix)
- targets.append("CONFIG_MK=%s" % spec["mfem"].package.config_mk)
- targets.append("TEST_MK=%s" % spec["mfem"].package.test_mk)
+ targets.append("CONFIG_MK=%s" % self["mfem"].config_mk)
+ targets.append("TEST_MK=%s" % self["mfem"].test_mk)
if spec.satisfies("@:2.0"):
targets.append("CXX=%s" % spec["mpi"].mpicxx)
if self.spec.satisfies("+ofast %gcc"):

View File

@@ -19,6 +19,10 @@ class Libmesh(AutotoolsPackage):
version("master", branch="master", submodules=True) version("master", branch="master", submodules=True)
version("1.7.6", sha256="65093cc97227193241f78647ec2f04a1852437f40d3d1c49285c6ff712cd0bc8")
version("1.7.5", sha256="03a50cb471e7724a46623f0892cf77152f969d9ba89f8fcebd20bdc0845aab83")
version("1.7.4", sha256="0d603aacd2761292dff61ff7ce59d9fddd8691133f0219f7d1576bd4626b77b2")
version("1.7.3", sha256="fe0bec45a083ddd9e87dc51ab7e68039f3859e7ef0c4a87e76e562b172b6f739")
version("1.7.1", sha256="0387d62773cf92356eb128ba92f767e56c298d78f4b97446e68bf288da1eb6b4") version("1.7.1", sha256="0387d62773cf92356eb128ba92f767e56c298d78f4b97446e68bf288da1eb6b4")
version("1.4.1", sha256="67eb7d5a9c954d891ca1386b70f138333a87a141d9c44213449ca6be69a66414") version("1.4.1", sha256="67eb7d5a9c954d891ca1386b70f138333a87a141d9c44213449ca6be69a66414")
version("1.4.0", sha256="62d7fce89096c950d1b38908484856ea63df57754b64cde6582e7ac407c8c81d") version("1.4.0", sha256="62d7fce89096c950d1b38908484856ea63df57754b64cde6582e7ac407c8c81d")

View File

@@ -42,7 +42,7 @@ class Libspatialite(AutotoolsPackage):
depends_on("geos@:3.9", when="@:5.0.0") depends_on("geos@:3.9", when="@:5.0.0")
depends_on("iconv") depends_on("iconv")
depends_on("librttopo", when="@5.0.1:") depends_on("librttopo", when="@5.0.1:")
depends_on("libxml2") depends_on("libxml2+http")
depends_on("minizip", when="@5.0.0:") depends_on("minizip", when="@5.0.0:")
depends_on("proj") depends_on("proj")
depends_on("proj@:5", when="@:4") depends_on("proj@:5", when="@:4")

View File

@@ -56,6 +56,7 @@ class Llvm(CMakePackage, CudaPackage, LlvmDetection, CompilerPackage):
license("Apache-2.0") license("Apache-2.0")
version("main", branch="main") version("main", branch="main")
version("19.1.6", sha256="f07fdcbb27b2b67aa95e5ddadf45406b33228481c250e65175066d36536a1ee2")
version("19.1.5", sha256="e2204b9903cd9d7ee833a2f56a18bef40a33df4793e31cc090906b32cbd8a1f5") version("19.1.5", sha256="e2204b9903cd9d7ee833a2f56a18bef40a33df4793e31cc090906b32cbd8a1f5")
version("19.1.4", sha256="010e1fd3cabee8799bd2f8a6fbc68f28207494f315cf9da7057a2820f79fd531") version("19.1.4", sha256="010e1fd3cabee8799bd2f8a6fbc68f28207494f315cf9da7057a2820f79fd531")
version("19.1.3", sha256="e5106e2bef341b3f5e41340e4b6c6a58259f4021ad801acf14e88f1a84567b05") version("19.1.3", sha256="e5106e2bef341b3f5e41340e4b6c6a58259f4021ad801acf14e88f1a84567b05")
@@ -1143,12 +1144,12 @@ def post_install(self):
with open(os.path.join(self.prefix.bin, cfg), "w") as f:
print(gcc_install_dir_flag, file=f)
- def llvm_config(self, *args, **kwargs):
+ def llvm_config(self, *args, result=None, **kwargs):
lc = Executable(self.prefix.bin.join("llvm-config"))
if not kwargs.get("output"):
kwargs["output"] = str
ret = lc(*args, **kwargs)
- if kwargs.get("result") == "list":
+ if result == "list":
return ret.split()
else:
return ret
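With ``result`` promoted to a real keyword, it is consumed by ``llvm_config`` itself instead of being smuggled through ``**kwargs`` into the ``Executable`` call. A hedged sketch of typical calls from inside the package class (the flags are standard llvm-config options):

# Illustrative only, written as if inside the llvm package class.
libdir = self.llvm_config("--libdir")                          # returns a plain string
components = self.llvm_config("--components", result="list")   # split into a list of names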

View File

@@ -22,6 +22,7 @@ class Lmod(AutotoolsPackage):
license("MIT") license("MIT")
version("8.7.55", sha256="f85ed9b55c23afb563fa99c7201037628be016e8d88a1aa8dba4632c0ab450bd")
version("8.7.37", sha256="171529152fedfbb3c45d27937b0eaa1ee62b5e5cdac3086f44a6d56e5d1d7da4") version("8.7.37", sha256="171529152fedfbb3c45d27937b0eaa1ee62b5e5cdac3086f44a6d56e5d1d7da4")
version("8.7.24", sha256="8451267652059b6507b652e1b563929ecf9b689ffb20830642085eb6a55bd539") version("8.7.24", sha256="8451267652059b6507b652e1b563929ecf9b689ffb20830642085eb6a55bd539")
version("8.7.20", sha256="c04deff7d2ca354610a362459a7aa9a1c642a095e45a4b0bb2471bb3254e85f4") version("8.7.20", sha256="c04deff7d2ca354610a362459a7aa9a1c642a095e45a4b0bb2471bb3254e85f4")

View File

@@ -1309,7 +1309,7 @@ def libs(self):
@property
def config_mk(self):
"""Export the location of the config.mk file.
- This property can be accessed using spec["mfem"].package.config_mk
+ This property can be accessed using pkg["mfem"].config_mk
"""
dirs = [self.prefix, self.prefix.share.mfem]
for d in dirs:
@@ -1321,7 +1321,7 @@ def config_mk(self):
@property
def test_mk(self):
"""Export the location of the test.mk file.
- This property can be accessed using spec["mfem"].package.test_mk.
+ This property can be accessed using pkg["mfem"].test_mk.
In version 3.3.2 and newer, the location of test.mk is also defined
inside config.mk, variable MFEM_TEST_MK.
"""

View File

@@ -19,6 +19,7 @@ class Mlpack(CMakePackage):
license("BSD-3-Clause", checked_by="wdconinc") license("BSD-3-Clause", checked_by="wdconinc")
version("4.5.1", sha256="58059b911a78b8bda91eef4cfc6278383b24e71865263c2e0569cf5faa59dda3")
version("4.5.0", sha256="aab70aee10c134ef3fe568843fe4b3bb5e8901af30ea666f57462ad950682317") version("4.5.0", sha256="aab70aee10c134ef3fe568843fe4b3bb5e8901af30ea666f57462ad950682317")
version("4.4.0", sha256="61c604026d05af26c244b0e47024698bbf150dfcc9d77b64057941d7d64d6cf6") version("4.4.0", sha256="61c604026d05af26c244b0e47024698bbf150dfcc9d77b64057941d7d64d6cf6")
version("4.3.0", sha256="08cd54f711fde66fc3b6c9db89dc26776f9abf1a6256c77cfa3556e2a56f1a3d") version("4.3.0", sha256="08cd54f711fde66fc3b6c9db89dc26776f9abf1a6256c77cfa3556e2a56f1a3d")
@@ -29,8 +30,7 @@ class Mlpack(CMakePackage):
depends_on("cxx", type="build") # generated depends_on("cxx", type="build") # generated
# TODO: Go bindings are not supported due to the absence of gonum in spack variant("go", default=False, description="Build Go bindings", when="@4.5.1:")
# variant("go", default=False, description="Build Go bindings")
variant("julia", default=False, description="Build Julia bindings") variant("julia", default=False, description="Build Julia bindings")
variant("python", default=False, description="Build Ppython bindings") variant("python", default=False, description="Build Ppython bindings")
variant("r", default=False, description="Build R bindings") variant("r", default=False, description="Build R bindings")
@@ -47,11 +47,9 @@ class Mlpack(CMakePackage):
conflicts("%gcc@:4", when="@4.0:", msg="mlpack 4.0+ requires at least gcc-5 with C++14") conflicts("%gcc@:4", when="@4.0:", msg="mlpack 4.0+ requires at least gcc-5 with C++14")
conflicts("%gcc@:7", when="@4.4:", msg="mlpack 4.4+ requires at least gcc-8 with C++17") conflicts("%gcc@:7", when="@4.4:", msg="mlpack 4.4+ requires at least gcc-8 with C++17")
# TODO: Go bindings are not supported due to the absence of gonum in spack with when("+go"):
# with when("+go"): # ref: src/mlpack/bindings/go/CMakeLists.txt
# # ref: src/mlpack/bindings/go/CMakeLists.txt depends_on("go@1.11.0:")
# depends_on("go@1.11.0:")
# depends_on("gonum")
with when("+julia"): with when("+julia"):
# ref: src/mlpack/bindings/julia/CMakeLists.txt # ref: src/mlpack/bindings/julia/CMakeLists.txt
depends_on("julia@0.7.0:") depends_on("julia@0.7.0:")
@@ -85,7 +83,7 @@ class Mlpack(CMakePackage):
def cmake_args(self):
args = [
self.define("BUILD_CLI_EXECUTABLES", True),
- # self.define_from_variant("BUILD_GO_BINDINGS", "go"),
+ self.define_from_variant("BUILD_GO_BINDINGS", "go"),
self.define_from_variant("BUILD_JULIA_BINDINGS", "julia"),
self.define_from_variant("BUILD_PYTHON_BINDINGS", "python"),
self.define_from_variant("BUILD_R_BINDINGS", "r"),

View File

@@ -15,6 +15,7 @@ class Mold(CMakePackage):
     license("MIT")
+    version("2.36.0", sha256="3f57fe75535500ecce7a80fa1ba33675830b7d7deb1e5ee9a737e2bc43cdb1c7")
     version("2.35.1", sha256="912b90afe7fde03e53db08d85a62c7b03a57417e54afc72c08e2fa07cab421ff")
     version("2.35.0", sha256="2703f1c88c588523815886478950bcae1ef02190dc4787e0d120a293b1a46e3b")
     version("2.34.1", sha256="a8cf638045b4a4b2697d0bcc77fd96eae93d54d57ad3021bf03b0333a727a59d")

View File

@@ -147,11 +147,9 @@ def _copy_arch_file(self, lib):
     def _append_option(self, opts, lib):
         if lib != "python":
             self._copy_arch_file(lib)
-        spec = self.spec
+        lib_pkg = self[lib]
         lib_prefix = (
-            spec[lib].package.component_prefix
-            if spec[lib].name == "intel-oneapi-mkl"
-            else spec[lib].prefix
+            lib_pkg.component_prefix if lib_pkg.name == "intel-oneapi-mkl" else lib_pkg.prefix
         )
         opts.extend(["--with-{0}".format(lib), "--{0}-prefix".format(lib), lib_prefix])

View File

@@ -35,7 +35,7 @@ class Nfft(AutotoolsPackage):
     @property
     def fftw_selected_precisions(self):
         if not self._fftw_precisions:
-            self._fftw_precisions = self.spec["fftw"].package.selected_precisions
+            self._fftw_precisions = self["fftw"].selected_precisions
         return self._fftw_precisions
     def configure(self, spec, prefix):

View File

@@ -78,7 +78,10 @@ def install(self, spec, prefix):
             string=True,
         )
-        configure(*(base_args), f"CC={self.compiler.cc}")
+        if self.spec.satisfies("@4.8.0:"):
+            base_args += [f"CC={self.compiler.cc}"]
+        configure(*(base_args))
         make("world.opt")
         make("install", "PREFIX={0}".format(prefix))

View File

@@ -84,10 +84,10 @@ def post_install(self):
             pyso = "pyopenvdb.dylib"
         else:
             pyso = "pyopenvdb.so"
-        pyver = "python{0}".format(spec["python"].package.version.up_to(2))
+        pyver = f"python{self['python'].version.up_to(2)}"
-        src = prefix.lib.join(pyver).join(pyso)
+        src = self.prefix.lib.join(pyver).join(pyso)
         if not os.path.isfile(src):
-            src = prefix.lib64.join(pyver).join(pyso)
+            src = self.prefix.lib64.join(pyver).join(pyso)
         assert os.path.isfile(src)
         os.rename(src, os.path.join(python_platlib, pyso))

View File

@@ -21,11 +21,11 @@ def home(self):
     @property
     def headers(self):
-        return self.spec["mesa"].package.libosmesa_headers
+        return self["mesa"].libosmesa_headers
     @property
     def libs(self):
-        return self.spec["mesa"].package.libosmesa_libs
+        return self["mesa"].libosmesa_libs
     @property
     def gl_headers(self):
@@ -33,4 +33,4 @@ def gl_headers(self):
     @property
     def gl_libs(self):
-        return self.spec["mesa"].package.libosmesa_libs
+        return self["mesa"].libosmesa_libs
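Several of the hunks above and below make the same mechanical change: the old accessor chain spec["name"].package.attr is shortened to the subscript notation self["name"].attr introduced by the "Add subscript notation to packages" change (#48467). A minimal sketch of the pattern is shown below, using a hypothetical Consumer package; the class name and the example_headers property are illustrative only, while libosmesa_headers is the mesa attribute already referenced in this diff:

from spack.package import *


class Consumer(Package):
    """Hypothetical package illustrating the subscript accessor from #48467."""

    depends_on("mesa")

    @property
    def example_headers(self):
        # Old spelling: go from this package to its spec, index the dependency
        # spec, then hop back to that spec's package object:
        #     return self.spec["mesa"].package.libosmesa_headers
        # New spelling: index the package directly, so the detour through
        # .spec and .package disappears.
        return self["mesa"].libosmesa_headers

Both spellings resolve to the same dependency package object, so properties such as headers, libs, or config_mk can be reached either way; the hunks in this compare simply adopt the shorter form.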

View File

@@ -19,7 +19,8 @@ class Palace(CMakePackage):
     version("0.12.0", tag="v0.12.0", commit="8c192071206466638d5818048ee712e1fada386f")
     version("0.11.2", tag="v0.11.2", commit="6c3aa5f84a934a6ddd58022b2945a1bdb5fa329d")
-    depends_on("cxx", type="build")  # generated
+    depends_on("c", type="build")
+    depends_on("cxx", type="build")
     variant("shared", default=True, description="Build shared libraries")
     variant("int64", default=False, description="Use 64 bit integers")

View File

@@ -31,7 +31,7 @@ class Pnfft(AutotoolsPackage):
     @property
     def fftw_selected_precisions(self):
         if not self._fftw_precisions:
-            self._fftw_precisions = self.spec["fftw"].package.selected_precisions
+            self._fftw_precisions = self["fftw"].selected_precisions
         return self._fftw_precisions
     def configure(self, spec, prefix):

View File

@@ -16,9 +16,12 @@ class PyArrow(PythonPackage):
     homepage = "https://arrow.readthedocs.io/en/latest/"
     pypi = "arrow/arrow-0.16.0.tar.gz"
+    maintainers("climbfuji")
     license("Apache-2.0")
-    version("1.3.0", sha256="d4540617648cb5f895730f1ad8c82a65f2dad0166f57b75f3ca54759c4d67a85")
+    # https://github.com/spack/spack/issues/48477
+    # version("1.3.0", sha256="d4540617648cb5f895730f1ad8c82a65f2dad0166f57b75f3ca54759c4d67a85")
     version("1.2.3", sha256="3934b30ca1b9f292376d9db15b19446088d12ec58629bc3f0da28fd55fb633a1")
     version("1.2.2", sha256="05caf1fd3d9a11a1135b2b6f09887421153b94558e5ef4d090b567b47173ac2b")
     version("1.2.1", sha256="c2dde3c382d9f7e6922ce636bf0b318a7a853df40ecb383b29192e6c5cc82840")
@@ -26,12 +29,15 @@ class PyArrow(PythonPackage):
     version("0.14.7", sha256="67f8be7c0cf420424bc62d8d7dc40b44e4bb2f7b515f9cc2954fb36e35797656")
     version("0.14.1", sha256="2d30837085011ef0b90ff75aa0a28f5c7d063e96b7e76b6cbc7e690310256685")
-    depends_on("python@3.8:", type=("build", "run"), when="@1.3:")
+    # https://github.com/spack/spack/issues/48477
+    # depends_on("python@3.8:", type=("build", "run"), when="@1.3:")
     depends_on("python@3.6:", type=("build", "run"), when="@1.2.1:")
     depends_on("python@2.7:2.8,3.5:", type=("build", "run"), when="@:0.16.0")
     depends_on("py-setuptools", type="build", when="@:1.2")
-    depends_on("py-flit-core@3.2:3", type="build", when="@1.3:")
+    # https://github.com/spack/spack/issues/48477
+    # depends_on("py-flit-core@3.2:3", type="build", when="@1.3:")
     depends_on("py-python-dateutil", type=("build", "run"))
     depends_on("py-typing-extensions", type=("build", "run"), when="@1.2.1:1.2 ^python@:3.7")
     depends_on("py-python-dateutil@2.7.0:", type=("build", "run"), when="@1.2.1:")
-    depends_on("py-types-python-dateutil@2.8.10:", type=("build", "run"), when="@1.3:")
+    # https://github.com/spack/spack/issues/48477
+    # depends_on("py-types-python-dateutil@2.8.10:", type=("build", "run"), when="@1.3:")

View File

@@ -10,16 +10,20 @@ class PyCylcFlow(PythonPackage):
     homepage = "https://cylc.org"
     pypi = "cylc-flow/cylc-flow-8.1.4.tar.gz"
+    git = "https://github.com/cylc/cylc-flow.git"
     maintainers("LydDeb", "climbfuji")
     license("GPL-3.0-only")
+    # Version 8.3.6 is available at PyPI, but not at the URL that is considered canonical by Spack
+    # https://github.com/spack/spack/issues/48479
+    version("8.3.6", commit="7f63b43164638e27636b992b14b3fa088b692b94")
     version("8.2.3", sha256="dd5bea9e4b8dad00edd9c3459a38fb778e5a073da58ad2725bc9b84ad718e073")
     version("8.2.0", sha256="cbe35e0d72d1ca36f28a4cebe9b9040a3445a74253bc94051a3c906cf179ded0")
     version("8.1.4", sha256="d1835ac18f6f24f3115c56b2bc821185484e834a86b12fd0033ff7e4dc3c1f63")
-    depends_on("py-setuptools@49:66,68:", type=("build", "run"))
+    depends_on("py-setuptools@49:66,68:", type=("build", "run"), when="@:8.2")
     depends_on("py-aiofiles@0.7", type=("build", "run"), when="@:8.1")
     depends_on("py-ansimarkup@1.0.0:", type=("build", "run"))
     depends_on("py-async-timeout@3.0.0:", type=("build", "run"))
@@ -28,15 +32,20 @@ class PyCylcFlow(PythonPackage):
     depends_on("py-jinja2@3.0", type=("build", "run"))
     depends_on("py-metomi-isodatetime@3.0", type=("build", "run"), when="@:8.2.0")
     depends_on("py-metomi-isodatetime@3:3.1", type=("build", "run"), when="@8.2.3:")
-    depends_on("py-protobuf@4.21.2:4.21", type=("build", "run"))
+    depends_on("py-packaging", type=("build", "run"), when="@8.3:")
+    depends_on("py-protobuf@4.21.2:4.21", type=("build", "run"), when="@:8.2")
+    depends_on("py-protobuf@4.24.4:4.24", type=("build", "run"), when="@8.3:")
     depends_on("py-psutil@5.6.0:", type=("build", "run"))
     depends_on("py-pyzmq@22:", type=("build", "run"), when="@8.2:")
     depends_on("py-pyzmq@22", type=("build", "run"), when="@:8.1")
-    depends_on("py-importlib-metadata", type=("build", "run"), when="^python@:3.7")
-    depends_on("py-urwid@2", type=("build", "run"))
+    depends_on("py-importlib-metadata", type=("build", "run"), when="@:8.2 ^python@:3.7")
+    depends_on("py-importlib-metadata@5:", type=("build", "run"), when="@8.3: ^python@:3.11")
+    depends_on("py-urwid@2:2.6.1,2.6.4:2", type=("build", "run"))
     depends_on("py-rx", type=("build", "run"))
     depends_on("py-promise", type=("build", "run"))
     depends_on("py-tomli@2:", type=("build", "run"), when="^python@:3.10")
-    # Non-Python dependencies
-    depends_on("graphviz", type="run")
+    # Non-Python dependencies for creating graphs.
+    # We want at least the pangocairo variant for
+    # graphviz so that we can create output as png.
+    depends_on("graphviz+pangocairo", type="run")

View File

@@ -10,15 +10,26 @@ class PyCylcRose(PythonPackage):
     homepage = "https://cylc.github.io/cylc-doc/latest/html/plugins/cylc-rose.html"
     pypi = "cylc-rose/cylc-rose-1.3.0.tar.gz"
+    git = "https://github.com/cylc/cylc-rose.git"
-    maintainers("LydDeb")
+    maintainers("LydDeb", "climbfuji")
     license("GPL-3.0-only")
+    # Version 1.4.2 is available at PyPI, but not at the URL that is considered canonical by Spack
+    # https://github.com/spack/spack/issues/48479
+    version("1.4.2", commit="8deda0480afed8cf92cfdf7938fc78d0aaf0c0e4")
     version("1.3.0", sha256="017072b69d7a50fa6d309a911d2428743b07c095f308529b36b1b787ebe7ab88")
     depends_on("py-setuptools", type="build")
-    depends_on("py-metomi-rose@2.1", type=("build", "run"))
-    depends_on("py-cylc-flow@8.2", type=("build", "run"))
     depends_on("py-metomi-isodatetime", type=("build", "run"))
     depends_on("py-jinja2", type=("build", "run"))
+    with when("@1.3.0"):
+        depends_on("py-metomi-rose@2.1", type=("build", "run"))
+        depends_on("py-cylc-flow@8.2", type=("build", "run"))
+    with when("@1.4.2"):
+        depends_on("py-metomi-rose@2.3", type=("build", "run"))
+        depends_on("py-cylc-flow@8.3.5:8.3", type=("build", "run"))
+        depends_on("py-ansimarkup", type=("build", "run"))

View File

@@ -10,22 +10,31 @@ class PyCylcUiserver(PythonPackage):
     homepage = "https://github.com/cylc/cylc-uiserver/"
     pypi = "cylc-uiserver/cylc-uiserver-1.3.0.tar.gz"
+    git = "https://github.com/cylc/cylc-uiserver.git"
-    maintainers("LydDeb")
+    maintainers("LydDeb", "climbfuji")
     license("GPL-3.0-or-later")
+    # Version 1.5.1 is available at PyPI, but not at the URL that is considered canonical by Spack
+    # https://github.com/spack/spack/issues/48479
+    version("1.5.1", commit="3a41c6fbefbcea33c41410f3698de8b62c9871b8")
     version("1.3.0", sha256="f3526e470c7ac2b61bf69e9b8d17fc7a513392219d28baed9b1166dcc7033d7a")
+    depends_on("python@3.8:", when="@1.5.1", type=("build", "run"))
     depends_on("py-wheel", type="build")
     depends_on("py-setuptools@40.9.0:", type="build")
-    depends_on("py-cylc-flow@8.2", type=("build", "run"))
+    depends_on("py-cylc-flow@8.2", when="@1.3.0", type=("build", "run"))
+    depends_on("py-cylc-flow@8.3", when="@1.5.1", type=("build", "run"))
     depends_on("py-ansimarkup@1.0.0:", type=("build", "run"))
     depends_on("py-graphene", type=("build", "run"))
     depends_on("py-graphene-tornado@2.6", type=("build", "run"))
     depends_on("py-graphql-ws@0.4.4", type=("build", "run"))
-    depends_on("py-jupyter-server@1.10.2:1", type=("build", "run"))
+    depends_on("py-jupyter-server@1.10.2:1", when="@1.3.0", type=("build", "run"))
+    depends_on("py-jupyter-server@2.7:", when="@1.5.1", type=("build", "run"))
     depends_on("py-requests", type=("build", "run"))
+    depends_on("py-psutil", when="@1.5.1", type=("build", "run"))
     depends_on("py-tornado@6.1.0:", type=("build", "run"))
     depends_on("py-traitlets@5.2.1:", type=("build", "run"))
     depends_on("py-pyzmq", type=("build", "run"))

View File

@@ -8,21 +8,66 @@
 class PyIminuit(PythonPackage):
     """Interactive IPython-Friendly Minimizer based on SEAL Minuit2."""
+    homepage = "http://github.com/scikit-hep/iminuit"
     pypi = "iminuit/iminuit-1.2.tar.gz"
+    tags = ["hep"]
+    license("MIT AND LGPL-2.0-only", checked_by="wdconinc")
+    version("2.30.1", sha256="2815bfdeb8e7f78185f316b75e2d4b19d0f6993bdc5ff03352ed37b70a796360")
+    version("2.29.1", sha256="474d10eb2f924b9320f6f7093e4c149d0a38c124d0419c12a07a3eca942de025")
+    version("2.28.0", sha256="6646ae0b66a4760e02cd73711d460a6cf2375382b78ce8344141751595596aad")
+    version("2.27.0", sha256="4ce830667730e76d20b10416a5851672c7fcc301dd1f48b9143cfd187b89ab8e")
+    version("2.26.0", sha256="a51233fbf1c2e008aa584f9eea65b6c30ed56624e4dea5d4e53370ccd84c9b4e")
+    version("2.25.2", sha256="3bf8a1b96865a60cedf29135f4feae09fa7c66237d29f68ded64e97a823a9b3e")
+    version("2.24.0", sha256="25ab631c3c8e024b1bcc7c96f66338caac54a4a2324d55f1e3ba5617816e44fd")
+    version("2.23.0", sha256="98f1589eb18d4882232ff1556d62e7ca19c91bbab7524ac8b405261a674452a1")
+    version("2.22.0", sha256="e0ccc37bad8bc1bd3b9d3fa07d28d4c0407e25a888faa9b559be2d9afbd2d97c")
+    version("2.21.3", sha256="fb313f0cc27e221b9b221bcd779b3a668fb4c77b0f90abfd5336833ecbdac016")
+    version("2.20.0", sha256="a73fe6e02f35e3180fc01bc5c1794edf662ff1725c3bc2a4f433567799da7504")
+    version("2.19.0", sha256="f4d1cbaccf115cdc4866968f649f2a37794a5c0de018de8156aa74556350a54c")
+    version("2.18.0", sha256="7ee2c6a0bcdac581b38fae8d0f343fdee55f91f1f6a6cc9643fcfbcc6c2dc3e6")
+    version("2.17.0", sha256="75f4a8a2bad21fda7b6bd42df7ca04120fb24636ebf9b566d259b26f2044b1d0")
+    version("2.16.0", sha256="1024a519dbc8fd52d5fd2a3779fd485b09bc27c40556def8b6f91695423199d6")
+    version("2.15.2", sha256="60ac7d2fe9405c9206675229273f401611d3f5dfa22942541646c4625b59f1ea")
+    version("2.14.0", sha256="5920880d6ec0194411942ab6040a1930398be45669c9f60fff391e666c863417")
+    version("2.13.0", sha256="e34785c2a2c0aea6ff86672fe81b80a04ac9d42a79ed8249630f2529a8f6a0fa")
+    version("2.12.2", sha256="29142ed38cf986c08683dc9e912a484abc70962a4d36d7d71b7d9d872316be8e")
+    version("2.11.2", sha256="8cae7917ca2d22c691e00792bfbbb812b84ac5c75120eb2ae879fb4ada41ee6c")
+    version("2.10.0", sha256="93b33ca6d2ffd73e80b40e8a400ca3dbc70e05662f1bd390e2b6040279101485")
+    version("2.9.0", sha256="656410ceffead79a52d3d727fdcd2bac78d7774239bef0efc3b7a86bae000ff3")
     version("2.8.4", sha256="4b09189f3094896cfc68596adc95b7f1d92772e1de1424e5dc4dd81def56e8b0")
     version("1.5.2", sha256="0b54f4d4fc3175471398b573d24616ddb8eb7d63808aa370cfc71fc1d636a1fd")
     version("1.3.7", sha256="9173e52cc4a0c0bda13ebfb862f9b074dc5de345b23cb15c1150863aafd8a26c")
     version("1.3.6", sha256="d79a197f305d4708a0e3e52b0a6748c1a6997360d2fbdfd09c022995a6963b5e")
     version("1.2", sha256="7651105fc3f186cfb5742f075ffebcc5088bf7797d8ed124c00977eebe0d1c64")
-    depends_on("cxx", type="build")  # generated
+    depends_on("cxx", type="build")
-    # Required dependencies
     depends_on("python@3.6:", type=("build", "run"), when="@2.6.1:")
-    depends_on("py-setuptools", type="build")
+    depends_on("python@3.7:", type=("build", "run"), when="@2.17.0:")
+    depends_on("python@3.8:", type=("build", "run"), when="@2.19.0:")
+    depends_on("python@3.9:", type=("build", "run"), when="@2.28.0:")
+    with when("@2.22:"):
+        depends_on("py-scikit-build-core@0.3:+pyproject", type="build")
+        depends_on("py-scikit-build-core@0.5:+pyproject", type="build", when="@2.26:")
+        depends_on("py-pybind11", type="build")
+        depends_on("py-pybind11@2.12:", type="build", when="@2.26:")
+    with when("@:2.21"):
+        depends_on("py-setuptools", type="build")
     depends_on("py-numpy", type=("build", "run"), when="@1.3:1.3.6")
     depends_on("py-numpy@1.11.3:", type=("build", "run"), when="@1.3.7:")
     # https://github.com/numpy/numpy/issues/26191#issuecomment-2179127999
+    depends_on("py-numpy@1.21:", type=("build", "run"), when="@2.22:")
     depends_on("py-numpy@:1", when="@:2.25", type=("build", "run"))
-    depends_on("cmake", type="build", when="@2.8.4")
+    depends_on("cmake@3.11:", type="build")
+    depends_on("cmake@3.13:", type="build", when="@2:")
+    depends_on("cmake@3.15:", type="build", when="@2.22:")
+    # Historical dependencies
+    with when("@:2.27"):
+        depends_on("py-typing-extensions", when="@2.21: ^python@:3.8", type=("build", "run"))
+        depends_on(
+            "py-typing-extensions@3.7.4:", when="@2.26: ^python@:3.8", type=("build", "run")
+        )

View File

@@ -22,6 +22,7 @@ class PyKeras(PythonPackage):
     maintainers("adamjstewart")
     license("Apache-2.0")
+    version("3.8.0", sha256="6289006e6f6cb2b68a563b58cf8ae5a45569449c5a791df6b2f54c1877f3f344")
     version("3.7.0", sha256="a4451a5591e75dfb414d0b84a3fd2fb9c0240cc87ebe7e397f547ce10b0e67b7")
     version("3.6.0", sha256="405727525a3522ed8f9ec0b46e0667e4c65fcf714a067322c16a00d902ded41d")
     version("3.5.0", sha256="53ae4f9472ec9d9c6941c82a3fda86969724ace3b7630a94ba0a1f17ba1065c3")
@@ -64,6 +65,7 @@ class PyKeras(PythonPackage):
     version("2.2.1", sha256="0d3cb14260a3fa2f4a5c4c9efa72226ffac3b4c50135ba6edaf2b3d1d23b11ee")
     version("2.2.0", sha256="5b8499d157af217f1a5ee33589e774127ebc3e266c833c22cb5afbb0ed1734bf")
+    # TODO: add openvino backend (keras 3.8+)
     variant(
         "backend",
         default="tensorflow",
@@ -85,7 +87,6 @@ class PyKeras(PythonPackage):
     depends_on("py-absl-py", when="@2.6:")
     depends_on("py-numpy")
     depends_on("py-rich", when="@3:")
-    depends_on("py-namex@0.0.8:", when="@3.3.3:")
     depends_on("py-namex", when="@3:")
     depends_on("py-h5py")
     depends_on("py-optree", when="@3.1:")
@@ -93,22 +94,21 @@ class PyKeras(PythonPackage):
     depends_on("py-packaging", when="@3.4:")
     # requirements-common.txt
-    depends_on("py-scipy")
-    depends_on("py-pandas")
-    depends_on("py-requests", when="@3:")
-    depends_on("py-protobuf", when="@3:")
+    # Many more (optional?) dependencies
     # requirements-tensorflow-cuda.txt
     with when("backend=tensorflow"):
         depends_on("py-tensorflow@2.18", when="@3.7:")
         depends_on("py-tensorflow@2.17", when="@3.5:3.6")
         depends_on("py-tensorflow@2.16.1:2.16", when="@3.0:3.4")
+        # depends_on("py-tf2onnx", when="@3.8:")
     # requirements-jax-cuda.txt
     with when("backend=jax"):
         depends_on("py-jax@0.4.28", when="@3.6:")
         depends_on("py-jax@0.4.23", when="@3.0.5:3.5")
         depends_on("py-jax", when="@3:")
+        # depends_on("py-flax", when="@3.2:")
     # requirements-torch-cuda.txt
     with when("backend=torch"):
@@ -126,6 +126,7 @@ class PyKeras(PythonPackage):
         depends_on("py-torchvision@0.16.2", when="@3.0.3:3.0.5")
         depends_on("py-torchvision@0.16.1", when="@3.0.1:3.0.2")
         depends_on("py-torchvision@0.16.0", when="@3.0.0")
+        # depends_on("py-torch-xla", when="@3.8:")
     # Historical dependencies
     with default_args(type="build"):

View File

@@ -11,10 +11,11 @@ class PyMetomiRose(PythonPackage):
     homepage = "https://metomi.github.io/rose/doc/html/index.html"
     pypi = "metomi-rose/metomi-rose-2.1.0.tar.gz"
-    maintainers("LydDeb")
+    maintainers("LydDeb", "climbfuji")
     license("GPL-3.0-only")
+    version("2.3.2", sha256="5d2a1593a5bbe8362fbe5e197eaa0cde2574700c62181d9b5c1fafa1e67656cd")
     version("2.1.0", sha256="1b60135a434fe4325d364a57e8f5e81e90f39b373b9d68733458c1adc2513c05")
     depends_on("fortran", type="build")  # generated
@@ -28,3 +29,6 @@ class PyMetomiRose(PythonPackage):
     depends_on("py-psutil@5.6.0:", type=("build", "run"))
     depends_on("py-requests", type=("build", "run"))
     depends_on("py-sqlalchemy@1", type=("build", "run"))
+    depends_on("py-importlib-metadata@5:", when="@2.3.2 ^python@:3.11")
+    depends_on("py-importlib-resources@2:", when="@2.3.2 ^python@:3.8")

View File

@@ -0,0 +1,20 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class PyNeo4j(PythonPackage):
    """This is the neo4j bolt driver for python from the official repository"""

    pypi = "neo4j/neo4j-5.25.0.tar.gz"

    license("LGPL-3.0-only")

    version("5.25.0", sha256="7c82001c45319092cc0b5df4c92894553b7ab97bd4f59655156fa9acab83aec9")

    depends_on("py-pytz", type="run")
    depends_on("py-setuptools@68.0.0", type="build")
    depends_on("py-tomlkit@0.11.8", type="build")
    depends_on("python@3.7.0:", type=("build", "run"))

View File

@@ -15,6 +15,7 @@ class PyNetcdf4(PythonPackage):
     license("MIT")
+    version("1.7.2", sha256="a4c6375540b19989896136943abb6d44850ff6f1fa7d3f063253b1ad3f8b7fce")
     version(
         "1.7.1.post2", sha256="37d557e36654889d7020192bfb56f9d5f93894cb32997eb837ae586c538fd7b6"
     )
@@ -27,6 +28,7 @@ class PyNetcdf4(PythonPackage):
     variant("mpi", default=True, description="Parallel IO support")
     depends_on("python", type=("build", "link", "run"))
+    depends_on("python@3.8:", when="@1.7.1:", type=("build", "link", "run"))
     depends_on("py-cython@0.29:", when="@1.6.5:", type="build")
     depends_on("py-cython@0.19:", type="build")
     depends_on("py-setuptools@61:", when="@1.6.5:", type="build")
@@ -35,15 +37,17 @@ class PyNetcdf4(PythonPackage):
     depends_on("py-setuptools-scm@3.4:+toml", when="@1.7:", type="build")
     depends_on("py-cftime", type=("build", "run"))
     depends_on("py-certifi", when="@1.6.5:", type=("build", "run"))
-    depends_on("py-numpy", when="@1.6.5:", type=("build", "link", "run"))
+    depends_on("py-numpy", type=("build", "link", "run"))
+    depends_on("py-numpy@2.0:", when="@1.7.1:", type=("build", "link", "run"))
     depends_on("py-numpy@1.9:", when="@1.5.4:1.6.2", type=("build", "link", "run"))
-    depends_on("py-numpy@1.7:", type=("build", "link", "run"))
     # https://github.com/Unidata/netcdf4-python/pull/1317
     depends_on("py-numpy@:1", when="@:1.6", type=("build", "link", "run"))
     depends_on("py-mpi4py", when="+mpi", type=("build", "run"))
-    depends_on("netcdf-c", when="-mpi")
+    # These forced variant requests are due to py-netcdf4 build scripts
+    # https://github.com/spack/spack/pull/47824#discussion_r1882473998
+    depends_on("netcdf-c~mpi", when="~mpi")
     depends_on("netcdf-c+mpi", when="+mpi")
-    depends_on("hdf5@1.8.0:+hl", when="-mpi")
+    depends_on("hdf5@1.8.0:+hl~mpi", when="~mpi")
     depends_on("hdf5@1.8.0:+hl+mpi", when="+mpi")
     # The installation script tries to find hdf5 using pkg-config. However, the
@@ -57,7 +61,7 @@ class PyNetcdf4(PythonPackage):
     patch(
         "https://github.com/Unidata/netcdf4-python/commit/49dcd0b5bd25824c254770c0d41445133fc13a46.patch?full_index=1",
         sha256="71eefe1d3065ad050fb72eb61d916ae1374a3fafd96ddaee6499cda952d992c4",
-        when="@1.6: %gcc@14:",
+        when="@1.6:1.6.5 %gcc@14:",
     )
     def url_for_version(self, version):

View File

@@ -0,0 +1,33 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class PyNvitop(PythonPackage):
    """
    An interactive NVIDIA-GPU process viewer and beyond,
    the one-stop solution for GPU process management.
    """

    homepage = "https://nvitop.readthedocs.io/"
    pypi = "nvitop/nvitop-1.4.0.tar.gz"

    maintainers("nboelte")

    license("Apache-2.0", checked_by="nboelte")

    version("1.4.0", sha256="92f313e9bd89fe1a9d54054e92f490f34331f1b7847a89ddaffd6a7fde1437bb")

    depends_on("py-nvidia-ml-py@11.450.51:12.561", type=("build", "run"))
    depends_on("py-psutil@5.6.6:", type=("build", "run"))
    depends_on("py-cachetools@1.0.1:", type=("build", "run"))
    depends_on("py-termcolor@1.0.0:", type=("build", "run"))
    depends_on("python@3.7:", type=("build", "run"))
    depends_on("py-setuptools", type="build")

    # Windows support would require the package py-windows-curses to be available in spack.
    # depends_on("py-colorama@0.4:", when="platform=windows", type=("build", "run"))
    # depends_on("py-windows-curses@2.2.0:", when="platform=windows", type=("build", "run"))
    conflicts("platform=windows")

View File

@@ -20,6 +20,7 @@ class PyProtobuf(PythonPackage):
     version("5.27.5", sha256="7fa81bc550201144a32f4478659da06e0b2ebe4d5303aacce9a202a1c3d5178d")
     version("5.26.1", sha256="8ca2a1d97c290ec7b16e4e5dff2e5ae150cc1582f55b5ab300d45cb0dfa90e51")
     version("4.25.3", sha256="25b5d0b42fd000320bd7830b349e3b696435f3b329810427a6bcce6a5492cc5c")
+    version("4.24.4", sha256="5a70731910cd9104762161719c3d883c960151eea077134458503723b60e3667")
     version("4.24.3", sha256="12e9ad2ec079b833176d2921be2cb24281fa591f0b119b208b788adc48c2561d")
     version("4.23.3", sha256="7a92beb30600332a52cdadbedb40d33fd7c8a0d7f549c440347bc606fb3fe34b")
     version("4.21.9", sha256="61f21493d96d2a77f9ca84fefa105872550ab5ef71d21c458eb80edcf4885a99")

View File

@@ -43,7 +43,7 @@ def configure_args(self):
             "--sip",
             self.spec["py-sip"].prefix.bin.sip,
             "--sip-incdir",
-            join_path(self.spec["py-sip"].prefix, self.spec["python"].package.include),
+            join_path(self.spec["py-sip"].prefix, self["python"].include),
             "--bindir",
             self.prefix.bin,
             "--destdir",

View File

@@ -14,24 +14,40 @@ class PySegmentationModelsPytorch(PythonPackage):
     license("MIT")
     maintainers("adamjstewart")
-    version("0.3.4", sha256="f4aee7f6add479bd3c3953e855b7055fc657dc6800bf7fc8ab733fd7f8acb163")
-    version("0.3.3", sha256="b3b21ab4cd26a6b2b9e7a6ed466ace6452eb26ed3c31ae491ea2d7cbb01e384b")
-    version("0.3.2", sha256="8372733e57a10cb8f6b9e18a20577fbb3fb83549b6945664dc774a9b6d3ecd13")
-    version("0.3.1", sha256="d4a4817cf48872c3461bb7d22864c00f9d491719a6460adb252c035f9b0e8d51")
-    version("0.3.0", sha256="8e00ed1707698d309d23f207aef15f21465e091aa0f1dc8043ec3300f5f67216")
-    version("0.2.1", sha256="86744552b04c6bedf7e10f7928791894d8d9b399b9ed58ed1a3236d2bf69ead6")
-    version("0.2.0", sha256="247266722c23feeef16b0862456c5ce815e5f0a77f95c2cd624a71bf00d955df")
-    depends_on("py-setuptools", type="build")
-    depends_on("py-torchvision@0.5.0:", type=("build", "run"))
-    depends_on("py-pretrainedmodels@0.7.4", type=("build", "run"))
-    depends_on("py-efficientnet-pytorch@0.7.1", when="@0.3:", type=("build", "run"))
-    depends_on("py-efficientnet-pytorch@0.6.3", when="@:0.2", type=("build", "run"))
-    depends_on("py-timm@0.9.7", when="@0.3.4", type=("build", "run"))
-    depends_on("py-timm@0.9.2", when="@0.3.3", type=("build", "run"))
-    depends_on("py-timm@0.6.12", when="@0.3.2", type=("build", "run"))
-    depends_on("py-timm@0.4.12", when="@:0.3.1", type=("build", "run"))
-    depends_on("py-huggingface-hub@0.24.6:", when="@0.3.4:", type=("build", "run"))
-    depends_on("py-tqdm", when="@0.3:", type=("build", "run"))
-    depends_on("pil", when="@0.3:", type=("build", "run"))
-    depends_on("py-six", when="@0.3.4:", type=("build", "run"))
+    version("0.4.0", sha256="8833e63f0846090667be6fce05a2bbebbd1537776d3dea72916aa3db9e22e55b")
+    with default_args(deprecated=True):
+        version("0.3.4", sha256="f4aee7f6add479bd3c3953e855b7055fc657dc6800bf7fc8ab733fd7f8acb163")
+        version("0.3.3", sha256="b3b21ab4cd26a6b2b9e7a6ed466ace6452eb26ed3c31ae491ea2d7cbb01e384b")
+        version("0.3.2", sha256="8372733e57a10cb8f6b9e18a20577fbb3fb83549b6945664dc774a9b6d3ecd13")
+        version("0.3.1", sha256="d4a4817cf48872c3461bb7d22864c00f9d491719a6460adb252c035f9b0e8d51")
+        version("0.3.0", sha256="8e00ed1707698d309d23f207aef15f21465e091aa0f1dc8043ec3300f5f67216")
+        version("0.2.1", sha256="86744552b04c6bedf7e10f7928791894d8d9b399b9ed58ed1a3236d2bf69ead6")
+        version("0.2.0", sha256="247266722c23feeef16b0862456c5ce815e5f0a77f95c2cd624a71bf00d955df")
+    with default_args(type="build"):
+        depends_on("py-setuptools@61:", when="@0.4:")
+        depends_on("py-setuptools")
+    with default_args(type=("build", "run")):
+        depends_on("py-efficientnet-pytorch@0.6.1:", when="@0.4:")
+        depends_on("py-efficientnet-pytorch@0.7.1", when="@0.3")
+        depends_on("py-efficientnet-pytorch@0.6.3", when="@:0.2")
+        depends_on("py-huggingface-hub@0.24:", when="@0.4:")
+        depends_on("py-huggingface-hub@0.24.6:", when="@0.3.4:0.3")
+        depends_on("py-numpy@1.19.3:", when="@0.4:")
+        depends_on("pil@8:", when="@0.4:")
+        depends_on("pil", when="@0.3:")
+        depends_on("py-pretrainedmodels@0.7.1:", when="@0.4:")
+        depends_on("py-pretrainedmodels@0.7.4", when="@:0.3")
+        depends_on("py-six@1.5:", when="@0.4:")
+        depends_on("py-six", when="@0.3.4:")
+        depends_on("py-timm@0.9:", when="@0.4:")
+        depends_on("py-timm@0.9.7", when="@0.3.4")
+        depends_on("py-timm@0.9.2", when="@0.3.3")
+        depends_on("py-timm@0.6.12", when="@0.3.2")
+        depends_on("py-timm@0.4.12", when="@:0.3.1")
+        depends_on("py-torch@1.8:", when="@0.4:")
+        depends_on("py-torchvision@0.9:", when="@0.4:")
+        depends_on("py-torchvision@0.5:")
+        depends_on("py-tqdm@4.42.1:", when="@0.4:")
+        depends_on("py-tqdm", when="@0.3:")

View File

@@ -71,7 +71,7 @@ def install(self, spec, prefix):
             "--sip-module={0}".format(spec.variants["module"].value),
             "--bindir={0}".format(prefix.bin),
             "--destdir={0}".format(python_platlib),
-            "--incdir={0}".format(join_path(prefix, spec["python"].package.include)),
+            "--incdir={0}".format(join_path(prefix, self["python"].include)),
             "--sipdir={0}".format(prefix.share.sip),
             "--stubsdir={0}".format(python_platlib),
         ]

View File

@@ -13,6 +13,11 @@ class PyStevedore(PythonPackage):
     license("Apache-2.0")
+    version("5.4.0", sha256="79e92235ecb828fe952b6b8b0c6c87863248631922c8e8e0fa5b17b232c4514d")
+    version("5.3.0", sha256="9a64265f4060312828151c204efbe9b7a9852a0d9228756344dbc7e4023e375a")
+    version("5.2.0", sha256="46b93ca40e1114cea93d738a6c1e365396981bb6bb78c27045b7587c9473544d")
+    version("5.1.0", sha256="a54534acf9b89bc7ed264807013b505bf07f74dbe4bcfa37d32bd063870b087c")
+    version("5.0.0", sha256="2c428d2338976279e8eb2196f7a94910960d9f7ba2f41f3988511e95ca447021")
     version("4.0.0", sha256="f82cc99a1ff552310d19c379827c2c64dd9f85a38bcd5559db2470161867b786")
     version("3.5.0", sha256="f40253887d8712eaa2bb0ea3830374416736dc8ec0e22f5a65092c1174c44335")
     version("1.28.0", sha256="f1c7518e7b160336040fee272174f1f7b29a46febb3632502a8f2055f973d60b")

View File

@@ -14,6 +14,7 @@ class PyTimm(PythonPackage):
     license("Apache-2.0")
     maintainers("adamjstewart")
+    version("1.0.12", sha256="9da490683bd06302ec40e1892f1ccf87985f033e41f3580887d886b9aee9449a")
     version("1.0.11", sha256="a005f72b87e67ed30cdbf405a9ffd4e723360c780a43b1cefe266af8ecc9d151")
     version("0.9.7", sha256="2bfb1029e90b72e65eb9c75556169815f2e82257eaa1f6ebd623a4b4a52867a2")
     version("0.9.5", sha256="669835f0030cfb2412c464b7b563bb240d4d41a141226afbbf1b457e4f18cff1")
@@ -32,15 +33,15 @@ class PyTimm(PythonPackage):
     with default_args(type=("build", "run")):
         # https://github.com/huggingface/pytorch-image-models/issues/1530
         # https://github.com/huggingface/pytorch-image-models/pull/1649
-        depends_on("python@:3.10", when="@:0.6.12", type=("build", "run"))
+        depends_on("python@:3.10", when="@:0.6.12")
-        depends_on("py-torch@1.7:", when="@0.6:", type=("build", "run"))
+        depends_on("py-torch@1.7:", when="@0.6:")
-        depends_on("py-torch@1.4:", type=("build", "run"))
+        depends_on("py-torch@1.4:")
-        depends_on("py-torchvision", type=("build", "run"))
+        depends_on("py-torchvision")
-        depends_on("py-pyyaml", when="@0.6:", type=("build", "run"))
+        depends_on("py-pyyaml", when="@0.6:")
-        depends_on("py-huggingface-hub", when="@0.6:", type=("build", "run"))
+        depends_on("py-huggingface-hub", when="@0.6:")
-        depends_on("py-safetensors", when="@0.9:", type=("build", "run"))
+        depends_on("py-safetensors", when="@0.9:")
         # https://github.com/rwightman/pytorch-image-models/pull/1256
-        depends_on("pil@:9", when="@:0.5", type=("build", "run"))
+        depends_on("pil@:9", when="@:0.5")
-        depends_on("py-numpy", when="@:0.5", type=("build", "run"))
+        depends_on("py-numpy", when="@:0.5")

View File

@@ -217,9 +217,7 @@ def fix_qsci_sip(self):
         elif "^py-pyqt6" in self.spec:
             pyqtx = "PyQt6"
-        sip_inc_dir = join_path(
-            self.spec["qscintilla"].package.module.python_platlib, pyqtx, "bindings"
-        )
+        sip_inc_dir = join_path(self["qscintilla"].module.python_platlib, pyqtx, "bindings")
         with open(join_path("python", "gui", "pyproject.toml.in"), "a") as tomlfile:
             tomlfile.write(f'\n[tool.sip.project]\nsip-include-dirs = ["{sip_inc_dir}"]\n')

View File

@@ -101,9 +101,7 @@ def make_qsci_python(self):
         with working_dir(join_path(self.stage.source_path, "Python")):
             copy(ftoml, "pyproject.toml")
-            sip_inc_dir = join_path(
-                self.spec[py_pyqtx].package.module.python_platlib, pyqtx, "bindings"
-            )
+            sip_inc_dir = join_path(self[py_pyqtx].module.python_platlib, pyqtx, "bindings")
             with open("pyproject.toml", "a") as tomlfile:
                 # https://pyqt-builder.readthedocs.io/en/latest/pyproject_toml.html

View File

@@ -13,9 +13,11 @@ class RDeseq2(RPackage):
     sequencing assays and test for differential expression based on a model
     using the negative binomial distribution."""
-    homepage = "https://bioconductor.org/packages/DESeq2"
-    git = "https://git.bioconductor.org/packages/DESeq2.git"
+    bioc = "DESeq2"
+    version("1.46.0", commit="4887eb42fa96fcc234118ead8ffd11032a8f08bb")
+    version("1.44.0", commit="5facd3093468ce2e75a2b742b1533efee13e5818")
+    version("1.42.0", commit="17a39b5296cb3d897f1e2a9aa4bebbdefb13b46a")
     version("1.40.0", commit="c4962c3b16546e552fbc1a712258e4e21ff44241")
     version("1.38.0", commit="0e059f425d4ce6a5203685a4ad434f15bbd6e211")
     version("1.36.0", commit="2800b78ae52c0600f7e603c54af59beed3a2ed17")

View File

@@ -16,6 +16,14 @@ class RSparsematrixstats(RPackage):
     bioc = "sparseMatrixStats"
+    # The repository is at
+    # https://code.bioconductor.org/browse/sparseMatrixStats/, to find the
+    # commit hash check the branch corresponding to a BioConductor release and
+    # the latest commit (or one of the latest ones) should be the one bumping
+    # the r-sparsematrixstats version.
+    version("1.18.0", commit="172c63ee6c8fa200d2fda5546750ab5ac8ddd858")
+    version("1.16.0", commit="2ad650c393497263c20d67d45d1a56ee6fa3b402")
+    version("1.14.0", commit="2923a3bb4e59cf0e05f0e21a8e8df66e670c4abc")
     version("1.12.0", commit="054bf939cd7220deaf8e768ff7029d0d38483c91")
     version("1.10.0", commit="75d85ba2c9c4c36887fef1a007883167aa85bd94")
     version("1.8.0", commit="4f1e2213e5b0d6b3d817c2c9129b7566288916f6")
@@ -29,4 +37,7 @@ class RSparsematrixstats(RPackage):
     depends_on("r-rcpp", type=("build", "run"))
     depends_on("r-matrix", type=("build", "run"))
     depends_on("r-matrixstats", type=("build", "run"))
-    depends_on("r-matrixstats@0.60.0:", type=("build", "run"), when="@1.6.0:")
+    depends_on("r-matrixstats@0.60.0:0.63.0", type=("build", "run"), when="@1.6.0:1.12")
+    # r-sparsematrixstats 1.12- is incompatible with r-matrixstats v1:
+    # https://github.com/HenrikBengtsson/matrixStats/issues/227
+    depends_on("r-matrixstats@1:", type=("build", "run"), when="@1.13.0:")

View File

@@ -38,14 +38,11 @@ class Remhos(MakefilePackage):
     @property
     def build_targets(self):
-        targets = []
-        spec = self.spec
-        targets.append("MFEM_DIR=%s" % spec["mfem"].prefix)
-        targets.append("CONFIG_MK=%s" % spec["mfem"].package.config_mk)
-        targets.append("TEST_MK=%s" % spec["mfem"].package.test_mk)
-        return targets
+        return [
+            f"MFEM_DIR={self['mfem'].prefix}",
+            f"CONFIG_MK={self['mfem'].config_mk}",
+            f"TEST_MK={self['mfem'].test_mk}",
+        ]
     # See lib/spack/spack/build_systems/makefile.py
     def check(self):

View File

@@ -142,7 +142,7 @@ class Root(CMakePackage):
     patch(
         "https://github.com/root-project/root/commit/2f00d6df258906c1f6fe848135a88b836db3077f.patch?full_index=1",
         sha256="8da36032082e65ae246c03558a4c3fd67b157d1d0c6d20adac9de263279d1db6",
-        when="@6.28:6.28.12",
+        when="@6.28.6:6.28.12",
     )
     patch(
         "https://github.com/root-project/root/commit/14838b35600b08278e69bc3d8d8669773bc11399.patch?full_index=1",
@@ -452,6 +452,8 @@ class Root(CMakePackage):
         "cxxstd=20", when="@:6.28.02", msg="C++20 support requires root version at least 6.28.04"
     )
+    conflicts("%gcc@:10", when="cxxstd=20")
     # See https://github.com/root-project/root/issues/11128
     conflicts("%clang@16:", when="@:6.26.07", msg="clang 16+ support was added in root 6.26.08")

View File

@@ -93,6 +93,7 @@ class Rust(Package):
     depends_on("rust-bootstrap@1.74:1.75", type="build", when="@1.75")
     depends_on("rust-bootstrap@1.77:1.78", type="build", when="@1.78")
     depends_on("rust-bootstrap@1.80:1.81", type="build", when="@1.81")
+    depends_on("rust-bootstrap@1.82:1.83", type="build", when="@1.83")
     # src/llvm-project/llvm/cmake/modules/CheckCompilerVersion.cmake
     conflicts("%gcc@:7.3", when="@1.73:", msg="Host GCC version must be at least 7.4")

View File

@@ -323,8 +323,8 @@ def install(self, spec, prefix):
             env["F77"] = spec["mpi"].mpif77
             env["FC"] = spec["mpi"].mpifc
             if spec["mpi"].name == "intel-oneapi-mpi":
-                options.append("-mpiinc=%s/include" % spec["mpi"].package.component_prefix)
-                options.append("-mpilib=%s/lib" % spec["mpi"].package.component_prefix)
+                options.append("-mpiinc=%s/include" % self["mpi"].component_prefix)
+                options.append("-mpilib=%s/lib" % self["mpi"].component_prefix)
             else:
                 options.append("-mpiinc=%s" % spec["mpi"].prefix.include)
                 options.append("-mpilib=%s" % spec["mpi"].prefix.lib)

View File

@@ -0,0 +1,98 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class Toybox(MakefilePackage):
    """All-in-one Linux command line"""

    homepage = "https://landley.net/toybox/"
    url = "https://landley.net/toybox/downloads/toybox-0.8.11.tar.gz"
    git = "https://github.com/landley/toybox.git"

    maintainers("Buldram")

    license("0BSD", checked_by="Buldram")

    version("0.8.11", sha256="15aa3f832f4ec1874db761b9950617f99e1e38144c22da39a71311093bfe67dc")
    version("0.8.10", sha256="d3afee05ca90bf425ced73f527e418fecd626c5340b5f58711a14531f8d7d108")
    version("0.8.9", sha256="06913dde3de7139b40f947bd7f23869dfc8796e9c6ff39de02719f8b7b2d47ad")
    version("0.8.8", sha256="dafd41978d40f02a61cf1be99a2b4a25812bbfb9c3157e679ee7611202d6ac58")
    version("0.8.7", sha256="b508bf336f82cb0739b77111f945931d1a143b5a53905cb753cd2607cfdd1494")
    version("0.8.6", sha256="4298c90a2b238348e4fdc9f89eb4988356c80da3f0cf78c279d2e82b9119034b")
    version("0.8.5", sha256="bfd230c187726347f7e31a1fc5841705871dfe4f3cbc6628f512b54e57360949")
    version("0.8.4", sha256="cb2a565a8d30015d08d73628795dca51a85b99b149aeabbbecd9e8dbdbd8fddc")
    version("0.8.3", sha256="eab28fd29d19d4e61ef09704e5871940e6f35fd35a3bb1285e41f204504b5c01")
    version("0.8.2", sha256="9a2760fa442e9baf1be6064ab5ba8b90f2098e1d4bc33c788960b8d73f52fed5")
    version("0.8.1", sha256="1ac41e62b809d2ab656479f7f4e20bb71c63c14473f5c7d13f25d4f7fcfefdb3")
    version("0.8.0", sha256="e3ccecd9446db909437427a026c2788f2a96ac7ebc591c95b35df77f4e923956")
    version("0.7.8", sha256="4962e16898cb3c6e2719205349c8e6749a30583618a264aa8911f9ad61d998d6")
    version("0.7.7", sha256="ee218ab21c80044c04112ada7f59320062c35909a6e5f850b1318b17988ffba0")
    version("0.7.6", sha256="e2c9643ebc2bcdec4d8f8db25d0b428dbe0928f7b730052dbbd25db47fb9db95")
    version("0.7.5", sha256="3ada450ac1eab1dfc352fee915ea6129b9a4349c1885f1394b61bd2d89a46c04")
    version("0.7.4", sha256="49d74ca897501e5c981516719870fe08581726f5c018abe35ef52c6f0de113e7")

    conflicts("platform=darwin", when="@:0.7.8,=0.8.1")
    conflicts("platform=freebsd", when="@:0.7.8,0.8.1:0.8.8")

    variant("userland", default=True, description="Install symlinked individual commands")
    variant("static", default=False, description="Build static binary")
    variant("ssl", default=True, description="Build with OpenSSL support")
    variant("zlib", default=True, description="Build with Zlib support")

    depends_on("c", type="build")
    depends_on("bash", type="build")
    depends_on("sed", type="build")
    depends_on("openssl", type="link", when="+ssl")
    depends_on("zlib-api", type="link", when="+zlib")

    # CVE-2022-32298
    patch(
        "https://github.com/landley/toybox/commit/6d4847934fc0fe47a3254ce6c0396d197a780cf4.patch?full_index=1",
        sha256="2c6ffad53102db23b620fd883636daad15c70a08c72f802a1fbcf96c331280cc",
        when="@=0.8.7",
    )

    # Fixes segfault when building with more recent toolchains.
    patch(
        "https://github.com/landley/toybox/commit/78289203031afc23585035c362beec10db54958d.patch?full_index=1",
        sha256="a27a831eb80f9d46809f619b52018eb2e481758581f7a6932423b95422f23911",
        when="@=0.7.4",
    )

    def setup_build_environment(self, env):
        env.set("NOSTRIP", 1)
        if not self.spec.satisfies("@=0.8.9"):
            env.set("V", 1)  # Verbose

        if self.spec.satisfies("+static"):
            env.append_flags("LDFLAGS", "--static")

    def edit(self, spec, prefix):
        if spec.satisfies("platform=darwin"):
            defconfig = "macos_defconfig"
        elif spec.satisfies("platform=freebsd"):
            defconfig = "bsd_defconfig"
        else:
            defconfig = "defconfig"

        make(defconfig, parallel=self.parallel and not spec.satisfies("@0.7.8:0.8.1"))

        config = FileFilter(".config")
        config.filter(
            "# CONFIG_TOYBOX_LIBCRYPTO is not set",
            "CONFIG_TOYBOX_LIBCRYPTO=" + ("y" if spec.satisfies("+ssl") else "n"),
        )
        config.filter(
            "# CONFIG_TOYBOX_LIBZ is not set",
            "CONFIG_TOYBOX_LIBZ=" + ("y" if spec.satisfies("+zlib") else "n"),
        )

    def install(self, spec, prefix):
        if spec.satisfies("+userland"):
            make("install_flat", "PREFIX=" + prefix.bin)
        else:
            mkdir(prefix.bin)
            install("toybox", prefix.bin)

View File

@@ -41,6 +41,7 @@ class Verilator(AutotoolsPackage):
     version("master", branch="master")
+    version("5.032", sha256="5a262564b10be8bdb31ff4fb67d77bcf5f52fc1b4e6c88d5ca3264fb481f1e41")
     version("5.030", sha256="b9e7e97257ca3825fcc75acbed792b03c3ec411d6808ad209d20917705407eac")
     version("5.028", sha256="02d4b6f34754b46a97cfd70f5fcbc9b730bd1f0a24c3fc37223397778fcb142c")
     version("5.026", sha256="87fdecf3967007d9ee8c30191ff2476f2a33635d0e0c6e3dbf345cc2f0c50b78")