Compare commits


59 Commits

Author SHA1 Message Date
Harmen Stoppels
a660d78670 Revert "Solver: Cache Concretization Results (#48198)"
This reverts commit d234df62d7.
2025-03-19 12:31:01 +01:00
Robert Maaskant
e9cc1b36bc kubernetes: add v1.30.0 -> v1.32.3 (#49211)
* kubernetes: add new versions

* kubernetes: add v1.30.11, v1.31.7, v1.32.3

* kubernetes: remove new deprecated versions and refactor build deps
2025-03-18 18:54:12 -06:00
David--Cléris Timothée
fd2c040981 hipsycl: rework llvm compatibility matrix (#49507)
* [hipsycl] add llvm 20 conflict
* add llvm matrix support & add 24.10 release

---------

Co-authored-by: tdavidcl <tdavidcl@users.noreply.github.com>
2025-03-18 15:54:03 -07:00
Robert Maaskant
33cd7d6033 kubectl: add v1.30.0 -> v1.32.3 (#49082)
* kubectl: add all versions currently supported upstream

* kubectl: build same way as kubernetes

* kubectl: revert back to GoPackage

* kubectl: fix version command

* kubectl: add v1.30.11, v1.31.7, v1.32.3

* kubectl: remove new deprecated versions

* kubectl: refactor build deps
2025-03-18 10:16:57 -07:00
Alex Richert
9c255381b1 parallelio: set WITH_PNETCDF from +/~pnetcdf (#49548) 2025-03-18 05:03:55 -06:00
SXS Bot
fd6c419682 spectre: add v2025.03.17 (#49533)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2025-03-18 05:03:37 -06:00
Robert Maaskant
9d1d808f94 py-tqdm: add v4.66.4 -> v4.67.1 (#49525) 2025-03-18 00:06:13 -06:00
Axel Huebl
7a0ef93332 WarpX: Remove Deprecated Versions (#46765)
* WarpX: Remove Deprecated Versions

* Conflict: WarpX SYCL RZ FFT Issue

Conflict out on WarpX issue until fixed
https://github.com/BLAST-WarpX/warpx/issues/5774

* Fix #49546
2025-03-17 21:40:38 -06:00
Teague Sterling
bf48b7662e wasi-sdk-prebuilt: add v25.0,v24.0,v23.0 (#49523)
* wasi-sdk-prebuilt: add v25.0,24.0,23.0

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-03-17 20:13:34 -06:00
Teague Sterling
d14333cc79 libgtop: add v2.41.1-2.41.3 (#49524)
* libgtop: new package
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
* Adding pkgconfig dep
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
* Adding note about https://github.com/spack/spack/pull/44323
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
* libgtop: add v2.41.3
   Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-03-17 19:41:42 -06:00
Teague Sterling
084361124e vep: add v113.3 (#49518) 2025-03-17 19:33:42 -06:00
Lehman Garrison
a1f4cc8b73 py-corrfunc: add new package at v2.5.3 (#49502) 2025-03-17 16:43:50 -07:00
Teague Sterling
b20800e765 awscli-v2: add v2.24.24 (#49519)
* awscli-v2: add v2.24.24

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2025-03-17 16:24:21 -07:00
Fabio Durastante
01b1e24074 psblas: new package (#49423)
* Package for installing the PSBLAS library
* Put some version as deprecated
* Removed FIXME comments
* Added missing :
* Fixed style to comply with flake8
* Another round of style fixes to comply with flake8
* Used black to reformat the file
* Fixed typo
   Co-authored-by: Luca Heltai <luca.heltai@unipi.it>
* Added explicit .git extension
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Fixed typo on METIS string
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Added url before removing urls from the version directives.
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Corrected typo in url.
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Reordered variant and depend_on, removed deprecated and old software versions

---------

Co-authored-by: Luca Heltai <luca.heltai@unipi.it>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-03-17 10:47:59 -07:00
Harmen Stoppels
8029279dad gcc: drop redundant --with-ld and --with-as configure flags (#49538)
it's redundant due to our spec file which adds -B, and it breaks -fuse-ld=
2025-03-17 18:35:23 +01:00
Greg Sjaardema
5f4e12d8f2 seacas: add 2025-03-13 (bug fix, new functionality, portability) (#49474)
* seacas: bug fix, new functionality, portability
* Get new checksum due to moving tag forward...
2025-03-17 15:44:17 +00:00
Satish Balay
a8728e700b petsc4py, slepc4py: update homepage, add maintainers (#49383) 2025-03-17 16:19:34 +01:00
Wouter Deconinck
f8adf2b70f libunwind: variant component value setjump -> setjmp (#49508) 2025-03-17 16:12:59 +01:00
Andy Porter
d0ef2d9e00 py-fparser: add v0.2.0 (#47807)
* Update to release 0.2.0

* #47087 updates for review
2025-03-17 11:19:55 +01:00
Melven Roehrig-Zoellner
d4bd3e298a cgns: patch for include path for 4.5 (#49161)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-03-17 11:19:13 +01:00
sbstndb/sbstndbs
40268634b6 xsimd: add v9.0.1 -> 13.1.0 (#49156) 2025-03-17 11:18:24 +01:00
sbstndb/sbstndbs
b0e8451d83 xtl: add v0.7.7 (#49157) 2025-03-17 11:16:40 +01:00
Massimiliano Culpo
868a52387b Revert "py-flowcept: add py-flowcept package (#47745)" (#49528)
This reverts commit 3fe89115c2.
2025-03-17 11:10:19 +01:00
Matthieu Dorier
3fe89115c2 py-flowcept: add py-flowcept package (#47745)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 11:05:51 +01:00
Wouter Deconinck
412024cf21 git: add v2.48.1 and friends (#49061)
* git: add v2.47.1, v2.48.1

* git: deprecate older versions

* fixed incorrect sha256 for git-manpage for git-manpages-2.47.2.tar.gz listed at https://mirrors.edge.kernel.org/pub/software/scm/git/sha256sums.asc (#49095)

---------

Co-authored-by: Jennifer Green <jkgreen@sandia.gov>
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2025-03-17 11:02:40 +01:00
Hariharan Devarajan
91b20ed7d0 pydftracer, brahma: add new releases (#49245) 2025-03-17 11:00:43 +01:00
Robert Maaskant
0caacc6e21 py-wheel: add v0.41.3 -> v0.45.1 (#49238) 2025-03-17 10:59:23 +01:00
Robert Maaskant
651126e64c openssl: add v3.4.1 and backports (#49250)
Release notes:
- https://github.com/openssl/openssl/blob/openssl-3.4/CHANGES.md#changes-between-340-and-341-11-feb-2025
- https://github.com/openssl/openssl/blob/openssl-3.3/CHANGES.md#changes-between-332-and-333-11-feb-2025
- https://github.com/openssl/openssl/blob/openssl-3.2/CHANGES.md#changes-between-323-and-324-11-feb-2025
- https://github.com/openssl/openssl/blob/openssl-3.1/CHANGES.md#changes-between-317-and-318-11-feb-2025
- https://github.com/openssl/openssl/blob/openssl-3.0/CHANGES.md#changes-between-3015-and-3016-11-feb-2025
2025-03-17 10:57:54 +01:00
Wouter Deconinck
e15a530f32 py-onnxruntime: use CudaPackage (#47684) 2025-03-17 10:35:20 +01:00
Martin Lang
0f84623914 elpa: add 2024.05.001, 2025.01.001 (#49335) 2025-03-17 10:33:23 +01:00
Kin Fai Tse
90afa5c5ef openfoam: add v2406, v2412, fix minor link deps (#49254) 2025-03-17 10:32:15 +01:00
Alberto Sartori
024620bd7b justbuild: add v1.5.0 (#49343) 2025-03-17 09:59:49 +01:00
Wouter Deconinck
9bec8e2f4b py-setuptools-scm-git-archive: add v1.4.1 (#49347) 2025-03-17 09:50:20 +01:00
Dave Keeshan
18dd465532 verible: Add v0.0.3946 (#49362) 2025-03-17 09:47:00 +01:00
Satish Balay
a2431ec00c mpich: add v4.3.0 (#49375) 2025-03-17 09:39:18 +01:00
MatthewLieber
78abe968a0 mvapich: add v4.0 and update default pmi version (#49399)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2025-03-17 01:38:19 -07:00
Wouter Deconinck
38e9043b9e yoda: add v2.1.0; rivet: add v4.1.0 (#49382) 2025-03-17 09:37:42 +01:00
Fernando Ayats
a0599e5e27 py-chex: add 0.1.89, py-optax: add 0.2.4(#49388) 2025-03-17 09:34:42 +01:00
George Young
1cd6f4e28f py-macs3: add @3.0.3 (#49365)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2025-03-17 09:30:08 +01:00
Eric Berquist
d2298e8e99 SST: update package maintainers (#49392) 2025-03-17 09:28:39 +01:00
Robert Maaskant
e3806aeac5 py-setuptools: add v75.8.1 -> v76.0.0 (#49251)
* py-setuptools: add v75.8.1, v75.8.2

Release notes:
- https://setuptools.pypa.io/en/stable/history.html#v75-8-1
- https://setuptools.pypa.io/en/stable/history.html#v75-8-2

* py-setuptools: add v75.9.1, v76.0.0
2025-03-17 09:26:41 +01:00
Seth R. Johnson
38309ced33 CLI11: new versions, PIC option (#49397) 2025-03-17 09:25:10 +01:00
Robert Maaskant
2f21201bf8 util-linux-uuid: add v2.40.3, v2.40.4 (#49441) 2025-03-17 09:16:19 +01:00
Matt Thompson
95a0f1924d openmpi: fix internal-libevent variant (#49463) 2025-03-17 09:06:43 +01:00
Olivier Cessenat
52969dfa78 gsl: add external find (#48665) 2025-03-17 09:05:08 +01:00
Massimiliano Culpo
ee588e4bbe chameleon: update to use oneapi packages (#49498)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:50:05 +01:00
Massimiliano Culpo
461f1d186b timemory: update to use oneapi packages (#49305)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:48:57 +01:00
Massimiliano Culpo
03b864f986 ghost: remove outdated comments (#49501)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:47:50 +01:00
Harmen Stoppels
bff4fa2761 spec.py: include test deps in dag hash, remove process_hash (take two) (#49505)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-17 08:47:09 +01:00
Massimiliano Culpo
ad3fd4e7e9 fleur: update to use oneapi packages (#49500)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:46:50 +01:00
Massimiliano Culpo
a574a995f8 converge: remove package (#49499)
The package was added in 2017 and never updated
substantially. It requires users to log in to
a platform to download code.

Thus, instead of updating to new versions and adding
support for oneAPI, remove the package.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:46:32 +01:00
Massimiliano Culpo
0002861daf camx: update to use oneapi packages (#49497)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:44:45 +01:00
Massimiliano Culpo
a65216f0a0 dftfe: update to use oneapi packages (#49430)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-17 08:44:30 +01:00
Sebastian Pipping
7604869198 expat: add v2.7.0 with security fixes + deprecate vulnerable 2.6.4 (#49481) 2025-03-17 08:31:56 +01:00
afzpatel
d409126c27 hip: apply LLVM_ROOT and Clang_ROOT args only when installing hip+rocm (#49368) 2025-03-17 07:37:31 +01:00
Lehman Garrison
2b0d985714 py-numexpr: add v2.10.2 (#49490) 2025-03-17 07:07:48 +01:00
Wouter Deconinck
eedec51566 dcap: add test dependency on cunit (#49510) 2025-03-17 06:56:01 +01:00
Seth R. Johnson
016954fcff vecgeom: new dev tag (#49511)
* vecgeom: add surface-dev 2

* Update hash and mark as deprecated
2025-03-17 06:54:43 +01:00
Adam J. Stewart
0f17672ddb py-numpy: add v2.2.4 (#49512) 2025-03-17 06:47:18 +01:00
89 changed files with 977 additions and 1633 deletions

View File

@@ -125,8 +125,6 @@ are stored in ``$spack/var/spack/cache``. These are stored indefinitely
by default. Can be purged with :ref:`spack clean --downloads
<cmd-spack-clean>`.
.. _Misc Cache:
--------------------
``misc_cache``
--------------------
@@ -336,52 +334,3 @@ create a new alias called ``inst`` that will always call ``install -v``:
aliases:
inst: install -v
-------------------------------
``concretization_cache:enable``
-------------------------------
When set to ``true``, Spack will utilize a cache of solver outputs from
successful concretization runs. When enabled, Spack will check the concretization
cache prior to running the solver. If a previous request to solve a given
problem is present in the cache, Spack will load the concrete specs and other
solver data from the cache rather than running the solver. Specs not previously
concretized will be added to the cache on a successful solve. The cache additionally
holds solver statistics, so commands like ``spack solve`` will still return information
about the run that produced a given solver result.
This cache is a subcache of the :ref:`Misc Cache` and as such will be cleaned when the Misc
Cache is cleaned.
When ``false`` or omitted, all concretization requests are performed from scratch.
----------------------------
``concretization_cache:url``
----------------------------
Path to the location where Spack will root the concretization cache. Currently this only supports
paths on the local filesystem.
Default location is under the :ref:`Misc Cache` at: ``$misc_cache/concretization``
------------------------------------
``concretization_cache:entry_limit``
------------------------------------
Sets a limit on the number of concretization results that Spack will cache. The limit is evaluated
after each concretization run; if Spack has stored more results than the limit allows, the
oldest concretization results are pruned until 10% of the limit has been removed.
Setting this value to 0 disables automatic pruning; users are then expected to
maintain the cache themselves.
-----------------------------------
``concretization_cache:size_limit``
-----------------------------------
Sets a limit on the size of the concretization cache in bytes. The limit is evaluated
after each concretization run; if the cache has grown beyond the limit, the
oldest concretization results are pruned until 10% of the limit has been removed.
Setting this value to 0 disables automatic pruning; users are then expected to
maintain the cache themselves.
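Taken together, these options live under ``config`` in ``config.yaml``. A minimal sketch of a cache configuration (the path and limits shown are illustrative assumptions, not documented defaults):

```yaml
config:
  concretization_cache:
    enable: true
    url: ~/.spack/cache/concretization   # local filesystem path only
    entry_limit: 1000                    # prune after this many cached results
    size_limit: 300000000                # bytes; 0 disables size-based pruning
```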

View File

@@ -7,7 +7,6 @@
import fnmatch
import glob
import hashlib
import io
import itertools
import numbers
import os
@@ -21,7 +20,6 @@
from contextlib import contextmanager
from itertools import accumulate
from typing import (
IO,
Callable,
Deque,
Dict,
@@ -2883,20 +2881,6 @@ def keep_modification_time(*filenames):
os.utime(f, (os.path.getatime(f), mtime))
@contextmanager
def temporary_file_position(stream):
orig_pos = stream.tell()
yield
stream.seek(orig_pos)
@contextmanager
def current_file_position(stream: IO[str], loc: int, relative_to=io.SEEK_CUR):
with temporary_file_position(stream):
stream.seek(loc, relative_to)
yield
@contextmanager
def temporary_dir(
suffix: Optional[str] = None, prefix: Optional[str] = None, dir: Optional[str] = None

View File

@@ -1126,7 +1126,7 @@ def _add(
installation_time:
Date and time of installation
allow_missing: if True, don't warn when installation is not found on disk
This is useful when installing specs without build deps.
This is useful when installing specs without build/test deps.
"""
if not spec.concrete:
raise NonConcreteSpecAddError("Specs added to DB must be concrete.")
@@ -1146,10 +1146,8 @@ def _add(
edge.spec,
explicit=False,
installation_time=installation_time,
# allow missing build-only deps. This prevents excessive warnings when a spec is
# installed, and its build dep is missing a build dep; there's no need to install
# the build dep's build dep first, and there's no need to warn about it missing.
allow_missing=allow_missing or edge.depflag == dt.BUILD,
# allow missing build / test only deps
allow_missing=allow_missing or edge.depflag & (dt.BUILD | dt.TEST) == edge.depflag,
)
# Make sure the directory layout agrees whether the spec is installed

View File

@@ -6,7 +6,7 @@
import spack.deptypes as dt
import spack.repo
hashes = []
HASHES = []
class SpecHashDescriptor:
@@ -23,7 +23,7 @@ def __init__(self, depflag: dt.DepFlag, package_hash, name, override=None):
self.depflag = depflag
self.package_hash = package_hash
self.name = name
hashes.append(self)
HASHES.append(self)
# Allow spec hashes to have an alternate computation method
self.override = override
@@ -43,13 +43,9 @@ def __repr__(self):
)
#: Spack's deployment hash. Includes all inputs that can affect how a package is built.
dag_hash = SpecHashDescriptor(depflag=dt.BUILD | dt.LINK | dt.RUN, package_hash=True, name="hash")
#: Hash descriptor used only to transfer a DAG, as is, across processes
process_hash = SpecHashDescriptor(
depflag=dt.BUILD | dt.LINK | dt.RUN | dt.TEST, package_hash=True, name="process_hash"
#: The DAG hash includes all inputs that can affect how a package is built.
dag_hash = SpecHashDescriptor(
depflag=dt.BUILD | dt.LINK | dt.RUN | dt.TEST, package_hash=True, name="hash"
)

View File

@@ -108,8 +108,6 @@ def _get_user_cache_path():
#: transient caches for Spack data (virtual cache, patch sha256 lookup, etc.)
default_misc_cache_path = os.path.join(user_cache_path, "cache")
#: concretization cache for Spack concretizations
default_conc_cache_path = os.path.join(default_misc_cache_path, "concretization")
# Below paths pull configuration from the host environment.
#

View File

@@ -58,15 +58,6 @@
{"type": "string"}, # deprecated
]
},
"concretization_cache": {
"type": "object",
"properties": {
"enable": {"type": "boolean"},
"url": {"type": "string"},
"entry_limit": {"type": "integer", "minimum": 0},
"size_limit": {"type": "integer", "minimum": 0},
},
},
"install_hash_length": {"type": "integer", "minimum": 1},
"install_path_scheme": {"type": "string"}, # deprecated
"build_stage": {

View File

@@ -5,12 +5,9 @@
import collections.abc
import copy
import enum
import errno
import functools
import hashlib
import io
import itertools
import json
import os
import pathlib
import pprint
@@ -20,25 +17,12 @@
import typing
import warnings
from contextlib import contextmanager
from typing import (
IO,
Callable,
Dict,
Iterator,
List,
NamedTuple,
Optional,
Set,
Tuple,
Type,
Union,
)
from typing import Callable, Dict, Iterator, List, NamedTuple, Optional, Set, Tuple, Type, Union
import archspec.cpu
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import current_file_position
from llnl.util.lang import elide_list
import spack
@@ -50,18 +34,15 @@
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.hash_types as ht
import spack.package_base
import spack.package_prefs
import spack.patch
import spack.paths
import spack.platforms
import spack.repo
import spack.solver.splicing
import spack.spec
import spack.store
import spack.util.crypto
import spack.util.hash
import spack.util.libc
import spack.util.module_cmd as md
import spack.util.path
@@ -70,7 +51,6 @@
import spack.version as vn
import spack.version.git_ref_lookup
from spack import traverse
from spack.util.file_cache import FileCache
from .core import (
AspFunction,
@@ -558,364 +538,6 @@ def format_unsolved(unsolved_specs):
msg += "\n\t(No candidate specs from solver)"
return msg
def to_dict(self, test: bool = False) -> dict:
"""Produces a dict representation of a Result object.
Does not include anything related to unsatisfiability, as we
are only interested in storing satisfiable results.
"""
serial_node_arg = (
lambda node_dict: f"""{{"id": "{node_dict.id}", "pkg": "{node_dict.pkg}"}}"""
)
spec_hash_type = ht.process_hash if test else ht.dag_hash
ret = dict()
ret["asp"] = self.asp
ret["criteria"] = self.criteria
ret["optimal"] = self.optimal
ret["warnings"] = self.warnings
ret["nmodels"] = self.nmodels
ret["abstract_specs"] = [str(x) for x in self.abstract_specs]
ret["satisfiable"] = self.satisfiable
serial_answers = []
for answer in self.answers:
serial_answer = answer[:2]
serial_answer_dict = {}
for node, spec in answer[2].items():
serial_answer_dict[serial_node_arg(node)] = spec.to_dict(hash=spec_hash_type)
serial_answer = serial_answer + (serial_answer_dict,)
serial_answers.append(serial_answer)
ret["answers"] = serial_answers
ret["specs_by_input"] = {}
input_specs = {} if not self.specs_by_input else self.specs_by_input
for input, spec in input_specs.items():
ret["specs_by_input"][str(input)] = spec.to_dict(hash=spec_hash_type)
return ret
@staticmethod
def from_dict(obj: dict):
"""Returns Result object from compatible dictionary"""
def _dict_to_node_argument(dict):
id = dict["id"]
pkg = dict["pkg"]
return NodeArgument(id=id, pkg=pkg)
def _str_to_spec(spec_str):
return spack.spec.Spec(spec_str)
def _dict_to_spec(spec_dict):
loaded_spec = spack.spec.Spec.from_dict(spec_dict)
_ensure_external_path_if_external(loaded_spec)
spack.spec.Spec.ensure_no_deprecated(loaded_spec)
return loaded_spec
asp = obj.get("asp")
spec_list = obj.get("abstract_specs")
if not spec_list:
raise RuntimeError("Invalid json for concretization Result object")
if spec_list:
spec_list = [_str_to_spec(x) for x in spec_list]
result = Result(spec_list, asp)
result.criteria = obj.get("criteria")
result.optimal = obj.get("optimal")
result.warnings = obj.get("warnings")
result.nmodels = obj.get("nmodels")
result.satisfiable = obj.get("satisfiable")
result._unsolved_specs = []
answers = []
for answer in obj.get("answers", []):
loaded_answer = answer[:2]
answer_node_dict = {}
for node, spec in answer[2].items():
answer_node_dict[_dict_to_node_argument(json.loads(node))] = _dict_to_spec(spec)
loaded_answer.append(answer_node_dict)
answers.append(tuple(loaded_answer))
result.answers = answers
result._concrete_specs_by_input = {}
result._concrete_specs = []
for input, spec in obj.get("specs_by_input", {}).items():
result._concrete_specs_by_input[_str_to_spec(input)] = _dict_to_spec(spec)
result._concrete_specs.append(_dict_to_spec(spec))
return result
class ConcretizationCache:
"""Store for Spack concretization results and statistics
Serializes solver result objects and statistics to JSON and stores
them at a given endpoint, in a cache keyed by the sha256 of the
asp problem and the involved control files.
"""
def __init__(self, root: Union[str, None] = None):
root = root or spack.config.get(
"config:concretization_cache:url", spack.paths.default_conc_cache_path
)
self.root = pathlib.Path(spack.util.path.canonicalize_path(root))
self._fc = FileCache(self.root)
self._cache_manifest = ".cache_manifest"
self._manifest_queue: List[Tuple[pathlib.Path, int]] = []
def cleanup(self):
"""Prunes the concretization cache according to configured size and entry
count limits. Cleanup is done in FIFO ordering."""
# TODO: determine a better default
entry_limit = spack.config.get("config:concretization_cache:entry_limit", 1000)
bytes_limit = spack.config.get("config:concretization_cache:size_limit", 3e8)
# lock the entire buildcache as we're removing a lot of data from the
# manifest and cache itself
with self._fc.read_transaction(self._cache_manifest) as f:
count, cache_bytes = self._extract_cache_metadata(f)
if not count or not cache_bytes:
return
entry_count = int(count)
manifest_bytes = int(cache_bytes)
# move beyond the metadata entry
f.readline()
if entry_count > entry_limit and entry_limit > 0:
with self._fc.write_transaction(self._cache_manifest) as (old, new):
# prune the oldest 10% or until we have removed 10% of
# total bytes starting from oldest entry
# TODO: make this configurable?
prune_count = entry_limit // 10
lines_to_prune = f.readlines(prune_count)
for i, line in enumerate(lines_to_prune):
sha, cache_entry_bytes = self._parse_manifest_entry(line)
if sha and cache_entry_bytes:
cache_path = self._cache_path_from_hash(sha)
if self._fc.remove(cache_path):
entry_count -= 1
manifest_bytes -= int(cache_entry_bytes)
else:
tty.warn(
f"Invalid concretization cache entry: '{line}' on line: {i+1}"
)
self._write_manifest(f, entry_count, manifest_bytes)
elif manifest_bytes > bytes_limit and bytes_limit > 0:
with self._fc.write_transaction(self._cache_manifest) as (old, new):
# take 10% of current size off
prune_amount = bytes_limit // 10
total_pruned = 0
i = 0
while total_pruned < prune_amount:
sha, manifest_cache_bytes = self._parse_manifest_entry(f.readline())
if sha and manifest_cache_bytes:
entry_bytes = int(manifest_cache_bytes)
cache_path = self.root / sha[:2] / sha
if self._safe_remove(cache_path):
entry_count -= 1
entry_bytes -= entry_bytes
total_pruned += entry_bytes
else:
tty.warn(
"Invalid concretization cache entry "
f"'{sha} {manifest_cache_bytes}' on line: {i}"
)
i += 1
self._write_manifest(f, entry_count, manifest_bytes)
for cache_dir in self.root.iterdir():
if cache_dir.is_dir() and not any(cache_dir.iterdir()):
self._safe_remove(cache_dir)
def cache_entries(self):
"""Generator producing cache entries"""
for cache_dir in self.root.iterdir():
# ensure component is cache entry directory
# not metadata file
if cache_dir.is_dir():
for cache_entry in cache_dir.iterdir():
if not cache_entry.is_dir():
yield cache_entry
else:
raise RuntimeError(
"Improperly formed concretization cache. "
f"Directory {cache_entry.name} is improperly located "
"within the concretization cache."
)
def _parse_manifest_entry(self, line):
"""Returns parsed manifest entry lines
with handling for invalid reads."""
if line:
cache_values = line.strip("\n").split(" ")
if len(cache_values) < 2:
tty.warn(f"Invalid cache entry at {line}")
return None, None
return cache_values[0], cache_values[1]
return None, None
def _write_manifest(self, manifest_file, entry_count, entry_bytes):
"""Writes new concretization cache manifest file.
Arguments:
manifest_file: IO stream opened for reading
and writing, wrapping the manifest file,
with its cursor at call time set to the location
where the manifest should be truncated
entry_count: new total entry count
entry_bytes: new total entry bytes count
"""
persisted_entries = manifest_file.readlines()
manifest_file.truncate(0)
manifest_file.write(f"{entry_count} {entry_bytes}\n")
manifest_file.writelines(persisted_entries)
def _results_from_cache(self, cache_entry_buffer: IO[str]) -> Union[Result, None]:
"""Returns a Result object from the concretizer cache
Reads the cache hit and uses `Result`'s own deserializer
to produce a new Result object
"""
with current_file_position(cache_entry_buffer, 0):
cache_str = cache_entry_buffer.read()
# TODO: Should this be an error if None?
# Same for _stats_from_cache
if cache_str:
cache_entry = json.loads(cache_str)
result_json = cache_entry["results"]
return Result.from_dict(result_json)
return None
def _stats_from_cache(self, cache_entry_buffer: IO[str]) -> Union[List, None]:
"""Returns concretization statistics from the
concretization run associated with the cache entry.
Deserializes the JSON representation of the
statistics covering the cached concretization run
and returns the Python data structures
"""
with current_file_position(cache_entry_buffer, 0):
cache_str = cache_entry_buffer.read()
if cache_str:
return json.loads(cache_str)["statistics"]
return None
def _extract_cache_metadata(self, cache_stream: IO[str]):
"""Extracts and returns cache entry count and bytes count from head of manifest
file"""
# make sure we're always reading from the beginning of the stream
# concretization cache manifest data lives at the top of the file
with current_file_position(cache_stream, 0):
return self._parse_manifest_entry(cache_stream.readline())
def _prefix_digest(self, problem: str) -> Tuple[str, str]:
"""Return the first two characters of, and the full, sha256 of the given asp problem"""
prob_digest = hashlib.sha256(problem.encode()).hexdigest()
prefix = prob_digest[:2]
return prefix, prob_digest
def _cache_path_from_problem(self, problem: str) -> pathlib.Path:
"""Returns a Path object representing the path to the cache
entry for the given problem"""
prefix, digest = self._prefix_digest(problem)
return pathlib.Path(prefix) / digest
def _cache_path_from_hash(self, hash: str) -> pathlib.Path:
"""Returns a Path object representing the cache entry
corresponding to the given sha256 hash"""
return pathlib.Path(hash[:2]) / hash
def _lock_prefix_from_cache_path(self, cache_path: str):
"""Returns the bit location corresponding to a given cache entry path
for file locking"""
return spack.util.hash.base32_prefix_bits(
spack.util.hash.b32_hash(cache_path), spack.util.crypto.bit_length(sys.maxsize)
)
def flush_manifest(self):
"""Updates the concretization cache manifest file after a cache write operation.
Updates the current byte and entry counts and writes them to the head of the
manifest file
manifest_file = self.root / self._cache_manifest
manifest_file.touch(exist_ok=True)
with open(manifest_file, "r+", encoding="utf-8") as f:
# check if manifest is empty
count, cache_bytes = self._extract_cache_metadata(f)
if not count or not cache_bytes:
# cache is uninitialized
count = 0
cache_bytes = 0
f.seek(0, io.SEEK_END)
for manifest_update in self._manifest_queue:
entry_path, entry_bytes = manifest_update
count += 1
cache_bytes += entry_bytes
f.write(f"{entry_path.name} {entry_bytes}")
f.seek(0, io.SEEK_SET)
new_stats = f"{int(count)+1} {int(cache_bytes)}\n"
f.write(new_stats)
def _register_cache_update(self, cache_path: pathlib.Path, bytes_written: int):
"""Adds manifest entry to update queue for later updates to the manifest"""
self._manifest_queue.append((cache_path, bytes_written))
def _safe_remove(self, cache_dir: pathlib.Path):
"""Removes cache entries with handling for the case where the entry has been
removed already or there are multiple cache entries in a directory"""
try:
if cache_dir.is_dir():
cache_dir.rmdir()
else:
cache_dir.unlink()
return True
except FileNotFoundError:
# This is acceptable, removal is idempotent
pass
except OSError as e:
if e.errno == errno.ENOTEMPTY:
# there exists another cache entry in this directory, don't clean yet
pass
return False
def store(self, problem: str, result: Result, statistics: List, test: bool = False):
"""Creates entry in concretization cache for problem if none exists,
storing the concretization Result object and statistics in the cache
as serialized JSON in a single file.
Hash membership is computed based on the sha256 of the provided asp
problem.
"""
cache_path = self._cache_path_from_problem(problem)
if self._fc.init_entry(cache_path):
# if an entry for this conc hash exists already, we don't want
# to overwrite it, just exit
tty.debug(f"Cache entry {cache_path} exists, will not be overwritten")
return
with self._fc.write_transaction(cache_path) as (old, new):
if old:
# Entry for this conc hash exists already, do not overwrite
tty.debug(f"Cache entry {cache_path} exists, will not be overwritten")
return
cache_dict = {"results": result.to_dict(test=test), "statistics": statistics}
bytes_written = new.write(json.dumps(cache_dict))
self._register_cache_update(cache_path, bytes_written)
def fetch(self, problem: str) -> Union[Tuple[Result, List], Tuple[None, None]]:
"""Returns the concretization cache result for a lookup based on the given problem.
Checks the concretization cache for the given problem, and either returns the
Python objects cached on disk representing the concretization results and statistics
or returns none if no cache entry was found.
"""
cache_path = self._cache_path_from_problem(problem)
result, statistics = None, None
with self._fc.read_transaction(cache_path) as f:
if f:
result = self._results_from_cache(f)
statistics = self._stats_from_cache(f)
if result and statistics:
tty.debug(f"Concretization cache hit at {str(cache_path)}")
return result, statistics
tty.debug(f"Concretization cache miss at {str(cache_path)}")
return None, None
CONC_CACHE: ConcretizationCache = llnl.util.lang.Singleton(
lambda: ConcretizationCache()
) # type: ignore
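As a standalone illustration of the sharding scheme used by ``_prefix_digest`` and ``_cache_path_from_problem`` above, here is a hypothetical helper (not part of Spack's API) that maps a problem string to its relative cache location:

```python
import hashlib
import pathlib


def cache_path(problem: str) -> pathlib.Path:
    """Map an ASP problem string to a relative cache entry path.

    Entries are sharded into up to 256 subdirectories named after the
    first two hex characters of the problem's sha256 digest, keeping
    any single directory from growing too large.
    """
    digest = hashlib.sha256(problem.encode()).hexdigest()
    return pathlib.Path(digest[:2]) / digest


p = cache_path("% some asp program")
assert len(p.name) == 64            # full sha256 hex digest
assert p.parent.name == p.name[:2]  # two-character shard prefix
```

Because the path is a pure function of the problem text, the same solve request always lands on the same cache entry, which is what makes the lookup in ``fetch`` possible.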
def _normalize_packages_yaml(packages_yaml):
normalized_yaml = copy.copy(packages_yaml)
@@ -1184,15 +806,6 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
if sys.platform == "win32":
tty.debug("Ensuring basic dependencies {win-sdk, wgl} available")
spack.bootstrap.core.ensure_winsdk_external_or_raise()
control_files = ["concretize.lp", "heuristic.lp", "display.lp"]
if not setup.concretize_everything:
control_files.append("when_possible.lp")
if using_libc_compatibility():
control_files.append("libc_compatibility.lp")
else:
control_files.append("os_compatibility.lp")
if setup.enable_splicing:
control_files.append("splices.lp")
timer.start("setup")
asp_problem = setup.setup(specs, reuse=reuse, allow_deprecated=allow_deprecated)
@@ -1202,133 +815,123 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
return Result(specs), None, None
timer.stop("setup")
timer.start("cache-check")
timer.start("ordering")
# ensure deterministic output
problem_repr = "\n".join(sorted(asp_problem.split("\n")))
timer.stop("ordering")
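The sorted join above exists to make the cache key independent of the order in which facts were emitted; a hypothetical key function built the same way:

```python
import hashlib


def deterministic_key(asp_problem: str) -> str:
    # The generated program's line order is not stable across runs, so sort
    # lines before hashing to obtain a reproducible cache key.
    canonical = "\n".join(sorted(asp_problem.split("\n")))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```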
timer.start("load")
# Add the problem instance
self.control.add("base", [], asp_problem)
# Load the file itself
parent_dir = os.path.dirname(__file__)
full_path = lambda x: os.path.join(parent_dir, x)
abs_control_files = [full_path(x) for x in control_files]
for ctrl_file in abs_control_files:
with open(ctrl_file, "r+", encoding="utf-8") as f:
problem_repr += "\n" + f.read()
self.control.load(os.path.join(parent_dir, "concretize.lp"))
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
result = None
conc_cache_enabled = spack.config.get("config:concretization_cache:enable", True)
if conc_cache_enabled:
result, concretization_stats = CONC_CACHE.fetch(problem_repr)
# Binary compatibility is based on libc on Linux, and on the os tag elsewhere
if using_libc_compatibility():
self.control.load(os.path.join(parent_dir, "libc_compatibility.lp"))
else:
self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
if setup.enable_splicing:
self.control.load(os.path.join(parent_dir, "splices.lp"))
timer.stop("cache-check")
if not result:
timer.start("load")
# Add the problem instance
self.control.add("base", [], asp_problem)
# Load the files
[self.control.load(lp) for lp in abs_control_files]
timer.stop("load")
timer.stop("load")
# Grounding is the first step in the solve -- it turns our facts
# and first-order logic rules into propositional logic.
timer.start("ground")
self.control.ground([("base", [])])
timer.stop("ground")
# Grounding is the first step in the solve -- it turns our facts
# and first-order logic rules into propositional logic.
timer.start("ground")
self.control.ground([("base", [])])
timer.stop("ground")
# With a grounded program, we can run the solve.
models = [] # stable models if things go well
cores = [] # unsatisfiable cores if they do not
# With a grounded program, we can run the solve.
models = [] # stable models if things go well
cores = [] # unsatisfiable cores if they do not
def on_model(model):
models.append((model.cost, model.symbols(shown=True, terms=True)))
def on_model(model):
models.append((model.cost, model.symbols(shown=True, terms=True)))
solve_kwargs = {
"assumptions": setup.assumptions,
"on_model": on_model,
"on_core": cores.append,
}
solve_kwargs = {
"assumptions": setup.assumptions,
"on_model": on_model,
"on_core": cores.append,
}
if clingo_cffi():
solve_kwargs["on_unsat"] = cores.append
if clingo_cffi():
solve_kwargs["on_unsat"] = cores.append
timer.start("solve")
time_limit = spack.config.CONFIG.get("concretizer:timeout", -1)
error_on_timeout = spack.config.CONFIG.get("concretizer:error_on_timeout", True)
# Spack uses 0 to set no time limit, clingo API uses -1
if time_limit == 0:
time_limit = -1
with self.control.solve(**solve_kwargs, async_=True) as handle:
finished = handle.wait(time_limit)
if not finished:
specs_str = ", ".join(llnl.util.lang.elide_list([str(s) for s in specs], 4))
header = (
f"Spack is taking more than {time_limit} seconds to solve for {specs_str}"
)
if error_on_timeout:
raise UnsatisfiableSpecError(f"{header}, stopping concretization")
warnings.warn(f"{header}, using the best configuration found so far")
handle.cancel()
timer.start("solve")
time_limit = spack.config.CONFIG.get("concretizer:timeout", -1)
error_on_timeout = spack.config.CONFIG.get("concretizer:error_on_timeout", True)
# Spack uses 0 to set no time limit, clingo API uses -1
if time_limit == 0:
time_limit = -1
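The wait/cancel pattern on the clingo solve handle can be mimicked with `concurrent.futures`; a sketch under the assumption of thread-based work (clingo's `handle.cancel()` genuinely interrupts the solver, while cancelling a running thread here is best-effort only):

```python
import concurrent.futures
import time


def solve_with_timeout(work, time_limit):
    # Mirrors the pattern above: run the solve asynchronously, wait up to
    # time_limit seconds (negative means no limit), give up on timeout.
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(work)
        try:
            return future.result(timeout=None if time_limit < 0 else time_limit)
        except concurrent.futures.TimeoutError:
            # best-effort: a running thread cannot be interrupted like clingo can
            future.cancel()
            return None
```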
with self.control.solve(**solve_kwargs, async_=True) as handle:
finished = handle.wait(time_limit)
if not finished:
specs_str = ", ".join(llnl.util.lang.elide_list([str(s) for s in specs], 4))
header = f"Spack is taking more than {time_limit} seconds to solve for {specs_str}"
if error_on_timeout:
raise UnsatisfiableSpecError(f"{header}, stopping concretization")
warnings.warn(f"{header}, using the best configuration found so far")
handle.cancel()
solve_result = handle.get()
timer.stop("solve")
solve_result = handle.get()
timer.stop("solve")
# once done, construct the solve result
result = Result(specs)
result.satisfiable = solve_result.satisfiable
# once done, construct the solve result
result = Result(specs)
result.satisfiable = solve_result.satisfiable
if result.satisfiable:
timer.start("construct_specs")
# get the best model
builder = SpecBuilder(specs, hash_lookup=setup.reusable_and_possible)
min_cost, best_model = min(models)
if result.satisfiable:
timer.start("construct_specs")
# get the best model
builder = SpecBuilder(specs, hash_lookup=setup.reusable_and_possible)
min_cost, best_model = min(models)
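`min(models)` works because each entry is a `(cost, symbols)` pair and Python compares tuples lexicographically; a small illustration with made-up cost vectors:

```python
# Stable models arrive as (cost, symbols) pairs; tuple comparison means
# min() picks the model with the lexicographically smallest cost vector.
models = [([2, 5], "model-a"), ([1, 9], "model-b"), ([1, 3], "model-c")]
min_cost, best_model = min(models)
```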
# first check for errors
error_handler = ErrorHandler(best_model, specs)
error_handler.raise_if_errors()
# first check for errors
error_handler = ErrorHandler(best_model, specs)
error_handler.raise_if_errors()
# build specs from spec attributes in the model
spec_attrs = [
(name, tuple(rest)) for name, *rest in extract_args(best_model, "attr")
]
answers = builder.build_specs(spec_attrs)
# build specs from spec attributes in the model
spec_attrs = [(name, tuple(rest)) for name, *rest in extract_args(best_model, "attr")]
answers = builder.build_specs(spec_attrs)
# add best spec to the results
result.answers.append((list(min_cost), 0, answers))
# add best spec to the results
result.answers.append((list(min_cost), 0, answers))
# get optimization criteria
criteria_args = extract_args(best_model, "opt_criterion")
result.criteria = build_criteria_names(min_cost, criteria_args)
# get optimization criteria
criteria_args = extract_args(best_model, "opt_criterion")
result.criteria = build_criteria_names(min_cost, criteria_args)
# record the number of models the solver considered
result.nmodels = len(models)
# record the number of models the solver considered
result.nmodels = len(models)
# record the possible dependencies in the solve
result.possible_dependencies = setup.pkgs
timer.stop("construct_specs")
timer.stop()
elif cores:
result.control = self.control
result.cores.extend(cores)
# record the possible dependencies in the solve
result.possible_dependencies = setup.pkgs
timer.stop("construct_specs")
timer.stop()
elif cores:
result.control = self.control
result.cores.extend(cores)
result.raise_if_unsat()
if result.satisfiable and result.unsolved_specs and setup.concretize_everything:
unsolved_str = Result.format_unsolved(result.unsolved_specs)
raise InternalConcretizerError(
"Internal Spack error: the solver completed but produced specs"
" that do not satisfy the request. Please report a bug at "
f"https://github.com/spack/spack/issues\n\t{unsolved_str}"
)
if conc_cache_enabled:
CONC_CACHE.store(problem_repr, result, self.control.statistics, test=setup.tests)
concretization_stats = self.control.statistics
if output.timers:
timer.write_tty()
print()
if output.stats:
print("Statistics:")
pprint.pprint(concretization_stats)
return result, timer, concretization_stats
pprint.pprint(self.control.statistics)
result.raise_if_unsat()
if result.satisfiable and result.unsolved_specs and setup.concretize_everything:
unsolved_str = Result.format_unsolved(result.unsolved_specs)
raise InternalConcretizerError(
"Internal Spack error: the solver completed but produced specs"
" that do not satisfy the request. Please report a bug at "
f"https://github.com/spack/spack/issues\n\t{unsolved_str}"
)
return result, timer, self.control.statistics
class ConcreteSpecsByHash(collections.abc.Mapping):
@@ -1770,7 +1373,7 @@ def effect_rules(self):
return
self.gen.h2("Imposed requirements")
for name in sorted(self._effect_cache):
for name in self._effect_cache:
cache = self._effect_cache[name]
for (spec_str, _), (effect_id, requirements) in cache.items():
self.gen.fact(fn.pkg_fact(name, fn.effect_id(effect_id)))
@@ -1823,8 +1426,8 @@ def define_variant(
elif isinstance(values, vt.DisjointSetsOfValues):
union = set()
for sid, s in enumerate(sorted(values.sets)):
for value in sorted(s):
for sid, s in enumerate(values.sets):
for value in s:
pkg_fact(fn.variant_value_from_disjoint_sets(vid, value, sid))
union.update(s)
values = union
@@ -2005,7 +1608,7 @@ def package_provider_rules(self, pkg):
self.gen.fact(fn.pkg_fact(pkg.name, fn.possible_provider(vpkg_name)))
for when, provided in pkg.provided.items():
for vpkg in sorted(provided):
for vpkg in provided:
if vpkg.name not in self.possible_virtuals:
continue
@@ -2020,8 +1623,8 @@ def package_provider_rules(self, pkg):
condition_id = self.condition(
when, required_name=pkg.name, msg="Virtuals are provided together"
)
for set_id, virtuals_together in enumerate(sorted(sets_of_virtuals)):
for name in sorted(virtuals_together):
for set_id, virtuals_together in enumerate(sets_of_virtuals):
for name in virtuals_together:
self.gen.fact(
fn.pkg_fact(pkg.name, fn.provided_together(condition_id, set_id, name))
)
@@ -2131,7 +1734,7 @@ def package_splice_rules(self, pkg):
for map in pkg.variants.values():
for k in map:
filt_match_variants.add(k)
filt_match_variants = sorted(filt_match_variants)
filt_match_variants = list(filt_match_variants)
variant_constraints = self._gen_match_variant_splice_constraints(
pkg, cond, spec_to_splice, hash_var, splice_node, filt_match_variants
)
@@ -2661,7 +2264,7 @@ def define_package_versions_and_validate_preferences(
):
"""Declare any versions in specs not declared in packages."""
packages_yaml = spack.config.get("packages")
for pkg_name in sorted(possible_pkgs):
for pkg_name in possible_pkgs:
pkg_cls = self.pkg_class(pkg_name)
# All the versions from the corresponding package.py file. Since concepts
@@ -2989,7 +2592,7 @@ def define_variant_values(self):
"""
# Tell the concretizer about possible values from specs seen in spec_clauses().
# We might want to order these facts by pkg and name if we are debugging.
for pkg_name, variant_def_id, value in sorted(self.variant_values_from_specs):
for pkg_name, variant_def_id, value in self.variant_values_from_specs:
try:
vid = self.variant_ids_by_def_id[variant_def_id]
except KeyError:
@@ -3027,8 +2630,6 @@ def concrete_specs(self):
# Declare as possible parts of specs that are not in package.py
# - Add versions to possible versions
# - Add OS to possible OS's
# is traverse deterministic?
for dep in spec.traverse():
self.possible_versions[dep.name].add(dep.version)
if isinstance(dep.version, vn.GitVersion):
@@ -3266,7 +2867,7 @@ def define_runtime_constraints(self):
recorder.consume_facts()
def literal_specs(self, specs):
for spec in sorted(specs):
for spec in specs:
self.gen.h2("Spec: %s" % str(spec))
condition_id = next(self._id_counter)
trigger_id = next(self._id_counter)
@@ -3767,7 +3368,7 @@ def consume_facts(self):
# on the available compilers)
self._setup.pkg_version_rules(runtime_pkg)
for imposed_spec, when_spec in sorted(self.runtime_conditions):
for imposed_spec, when_spec in self.runtime_conditions:
msg = f"{when_spec} requires {imposed_spec} at runtime"
_ = self._setup.condition(when_spec, imposed_spec=imposed_spec, msg=msg)
@@ -4624,9 +4225,6 @@ def solve_with_stats(
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
output = OutputConfiguration(timers=timers, stats=stats, out=out, setup_only=setup_only)
CONC_CACHE.flush_manifest()
CONC_CACHE.cleanup()
return self.driver.solve(
setup, specs, reuse=reusable_specs, output=output, allow_deprecated=allow_deprecated
)
@@ -4696,9 +4294,6 @@ def solve_in_rounds(
for spec in result.specs:
reusable_specs.extend(spec.traverse())
CONC_CACHE.flush_manifest()
CONC_CACHE.cleanup()
class UnsatisfiableSpecError(spack.error.UnsatisfiableSpecError):
"""There was an issue with the spec that was requested (i.e. a user error)."""



@@ -1515,7 +1515,7 @@ def __init__(self, spec_like=None, *, external_path=None, external_modules=None)
self.abstract_hash = None
# initial values for all spec hash types
for h in ht.hashes:
for h in ht.HASHES:
setattr(self, h.attr, None)
# cache for spec's prefix, computed lazily by prefix property
@@ -2198,30 +2198,16 @@ def package_hash(self):
def dag_hash(self, length=None):
"""This is Spack's default hash, used to identify installations.
Same as the full hash (includes package hash and build/link/run deps).
Tells us when package files and any dependencies have changes.
NOTE: Versions of Spack prior to 0.18 only included link and run deps.
NOTE: Versions of Spack prior to 1.0 did not include test deps.
"""
return self._cached_hash(ht.dag_hash, length)
def process_hash(self, length=None):
"""Hash used to transfer specs among processes.
This hash includes build and test dependencies and is only used to
serialize a spec and pass it around among processes.
"""
return self._cached_hash(ht.process_hash, length)
def dag_hash_bit_prefix(self, bits):
"""Get the first <bits> bits of the DAG hash as an integer type."""
return spack.util.hash.base32_prefix_bits(self.dag_hash(), bits)
def process_hash_bit_prefix(self, bits):
"""Get the first <bits> bits of the DAG hash as an integer type."""
return spack.util.hash.base32_prefix_bits(self.process_hash(), bits)
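Both `*_bit_prefix` methods delegate to `spack.util.hash.base32_prefix_bits`; a sketch of what such a helper does, assuming the lowercase RFC 4648 base32 alphabet used for Spack hashes (not the actual implementation):

```python
def base32_prefix_bits(hash_string: str, bits: int) -> int:
    """Return the first `bits` bits of a base32 string as an integer."""
    alphabet = "abcdefghijklmnopqrstuvwxyz234567"
    value = 0
    nbits = 0
    for char in hash_string:
        # each base32 character carries 5 bits
        value = (value << 5) | alphabet.index(char)
        nbits += 5
        if nbits >= bits:
            return value >> (nbits - bits)
    raise ValueError("hash string too short for requested number of bits")
```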
def _lookup_hash(self):
"""Lookup just one spec with an abstract hash, returning a spec from the the environment,
store, or finally, binary caches."""
@@ -3588,11 +3574,11 @@ def _dup(self, other: "Spec", deps: Union[bool, dt.DepTypes, dt.DepFlag] = True)
if self._concrete:
self._dunder_hash = other._dunder_hash
for h in ht.hashes:
for h in ht.HASHES:
setattr(self, h.attr, getattr(other, h.attr, None))
else:
self._dunder_hash = None
for h in ht.hashes:
for h in ht.HASHES:
setattr(self, h.attr, None)
return changed
@@ -3784,16 +3770,6 @@ def _cmp_iter(self):
# serialized before the hash change and one after, are considered different.
yield self.dag_hash() if self.concrete else None
# This needs to be in _cmp_iter so that no specs with different process hashes
# are considered the same by `__hash__` or `__eq__`.
#
# TODO: We should eventually unify the `_cmp_*` methods with `to_node_dict` so
# TODO: there aren't two sources of truth, but this needs some thought, since
# TODO: they exist for speed. We should benchmark whether it's really worth
# TODO: having two types of hashing now that we use `json` instead of `yaml` for
# TODO: spec hashing.
yield self.process_hash() if self.concrete else None
def deps():
for dep in sorted(itertools.chain.from_iterable(self._dependencies.values())):
yield dep.spec.name
@@ -4447,7 +4423,7 @@ def clear_caches(self, ignore: Tuple[str, ...] = ()) -> None:
"""
Clears all cached hashes in a Spec, while preserving other properties.
"""
for h in ht.hashes:
for h in ht.HASHES:
if h.attr not in ignore:
if hasattr(self, h.attr):
setattr(self, h.attr, None)
@@ -4456,18 +4432,12 @@ def clear_caches(self, ignore: Tuple[str, ...] = ()) -> None:
setattr(self, attr, None)
def __hash__(self):
# If the spec is concrete, we leverage the process hash and just use
# a 64-bit prefix of it. The process hash has the advantage that it's
# computed once per concrete spec, and it's saved -- so if we read
# concrete specs we don't need to recompute the whole hash. This is
# good for large, unchanging specs.
#
# We use the process hash instead of the DAG hash here because the DAG
# hash includes the package hash, which can cause infinite recursion,
# and which isn't defined unless the spec has a known package.
# If the spec is concrete, we leverage the dag hash and just use a 64-bit prefix of it.
# The dag hash has the advantage that it's computed once per concrete spec, and it's saved
# -- so if we read concrete specs we don't need to recompute the whole hash.
if self.concrete:
if not self._dunder_hash:
self._dunder_hash = self.process_hash_bit_prefix(64)
self._dunder_hash = self.dag_hash_bit_prefix(64)
return self._dunder_hash
# This is the normal hash for lazy_lexicographic_ordering. It's
@@ -4476,7 +4446,7 @@ def __hash__(self):
return hash(lang.tuplify(self._cmp_iter))
def __reduce__(self):
return Spec.from_dict, (self.to_dict(hash=ht.process_hash),)
return Spec.from_dict, (self.to_dict(hash=ht.dag_hash),)
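The `__reduce__` change only swaps which hash keys the serialized dict; the pattern itself (rebuild via a classmethod applied to a dict) can be sketched on a toy class:

```python
import pickle


class Node:
    """Sketch of the __reduce__ pattern: serialize to a dict, rebuild from it."""

    def __init__(self, name):
        self.name = name

    def to_dict(self):
        return {"name": self.name}

    @classmethod
    def from_dict(cls, data):
        return cls(data["name"])

    def __reduce__(self):
        # pickle will call Node.from_dict(self.to_dict()) when loading
        return Node.from_dict, (self.to_dict(),)
```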
def attach_git_version_lookup(self):
# Add a git lookup method for GitVersions
@@ -4798,7 +4768,7 @@ def from_node_dict(cls, node):
spec = Spec()
name, node = cls.name_and_data(node)
for h in ht.hashes:
for h in ht.HASHES:
setattr(spec, h.attr, node.get(h.name, None))
spec.name = name
@@ -4981,7 +4951,7 @@ def read_specfile_dep_specs(cls, deps, hash_type=ht.dag_hash.name):
"""
for dep_name, elt in deps.items():
if isinstance(elt, dict):
for h in ht.hashes:
for h in ht.HASHES:
if h.name in elt:
dep_hash, deptypes = elt[h.name], elt["type"]
hash_type = h.name
@@ -5024,7 +4994,7 @@ def read_specfile_dep_specs(cls, deps, hash_type=ht.dag_hash.name):
dep_name = dep["name"]
if isinstance(elt, dict):
# new format: elements of dependency spec are keyed.
for h in ht.hashes:
for h in ht.HASHES:
if h.name in elt:
dep_hash, deptypes, hash_type, virtuals = cls.extract_info_from_dep(elt, h)
break


@@ -42,7 +42,7 @@ def mock_pkg_git_repo(git, tmp_path_factory):
repo_dir = root_dir / "builtin.mock"
shutil.copytree(spack.paths.mock_packages_path, str(repo_dir))
repo_cache = spack.util.file_cache.FileCache(root_dir / "cache")
repo_cache = spack.util.file_cache.FileCache(str(root_dir / "cache"))
mock_repo = spack.repo.RepoPath(str(repo_dir), cache=repo_cache)
mock_repo_packages = mock_repo.repos[0].packages_path


@@ -59,13 +59,6 @@ def mock_fetch_remote_versions(*args, **kwargs):
assert v.strip(" \n\t") == "99.99.99\n 3.2.1"
@pytest.mark.maybeslow
def test_no_versions():
"""Test a package for which no remote versions are available."""
versions("converge")
@pytest.mark.maybeslow
def test_no_unchecksummed_versions():
"""Test a package for which no unchecksummed versions are available."""


@@ -3254,54 +3254,3 @@ def test_spec_unification(unify, mutable_config, mock_packages):
maybe_fails = pytest.raises if unify is True else llnl.util.lang.nullcontext
with maybe_fails(spack.solver.asp.UnsatisfiableSpecError):
_ = spack.cmd.parse_specs([a_restricted, b], concretize=True)
def test_concretization_cache_roundtrip(use_concretization_cache, monkeypatch, mutable_config):
"""Tests whether we can write the results of a clingo solve to the cache
and load the same spec request from the cache to produce identical specs"""
# Force determinism:
# Solver setup is normally non-deterministic due to non-determinism in
# asp solver setup logic generation. The only other inputs to the cache keys are
# the .lp files, which are invariant over the course of this test.
# This method forces the same setup to be produced for the same specs
# which gives us a guarantee of cache hits, as it removes the only
# element of non deterministic solver setup for the same spec
# Basically just a quick and dirty memoization
solver_setup = spack.solver.asp.SpackSolverSetup.setup
def _setup(self, specs, *, reuse=None, allow_deprecated=False):
if not getattr(_setup, "cache_setup", None):
cache_setup = solver_setup(self, specs, reuse=reuse, allow_deprecated=allow_deprecated)
setattr(_setup, "cache_setup", cache_setup)
return getattr(_setup, "cache_setup")
# monkeypatch our forced determinism setup method into solver setup
monkeypatch.setattr(spack.solver.asp.SpackSolverSetup, "setup", _setup)
assert spack.config.get("config:concretization_cache:enable")
# run one standard concretization to populate the cache and the setup method
# memoization
h = spack.concretize.concretize_one("hdf5")
# due to our forced determinism above, we should not be observing
# cache misses, assert that we're not storing any new cache entries
def _ensure_no_store(self, problem: str, result, statistics, test=False):
# always throw, we never want to reach this code path
assert False, "Concretization cache hit expected"
# Assert that we're actually hitting the cache
cache_fetch = spack.solver.asp.ConcretizationCache.fetch
def _ensure_cache_hits(self, problem: str):
result, statistics = cache_fetch(self, problem)
assert result, "Expected successful concretization cache hit"
assert statistics, "Expected statistics to be non null on cache hit"
return result, statistics
monkeypatch.setattr(spack.solver.asp.ConcretizationCache, "store", _ensure_no_store)
monkeypatch.setattr(spack.solver.asp.ConcretizationCache, "fetch", _ensure_cache_hits)
# ensure subsequent concretizations of the same spec produce the same spec
# object
for _ in range(5):
assert h == spack.concretize.concretize_one("hdf5")
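The function-attribute memoization used by `_setup` in the test above is a general trick; a hypothetical decorator form of the same idea:

```python
def memoize_first_call(func):
    """Cache the first result on the wrapper itself; every later call,
    regardless of arguments, returns that first result (as _setup does)."""

    def wrapper(*args, **kwargs):
        if not hasattr(wrapper, "_cached"):
            wrapper._cached = func(*args, **kwargs)
        return wrapper._cached

    return wrapper
```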


@@ -350,16 +350,6 @@ def pytest_collection_modifyitems(config, items):
item.add_marker(skip_as_slow)
@pytest.fixture(scope="function")
def use_concretization_cache(mutable_config, tmpdir):
"""Enables the use of the concretization cache"""
spack.config.set("config:concretization_cache:enable", True)
# ensure we have an isolated concretization cache
new_conc_cache_loc = str(tmpdir.mkdir("concretization"))
spack.config.set("config:concretization_cache:path", new_conc_cache_loc)
yield
#
# These fixtures are applied to all tests
#
@@ -2143,7 +2133,7 @@ def _c_compiler_always_exists():
@pytest.fixture(scope="session")
def mock_test_cache(tmp_path_factory):
cache_dir = tmp_path_factory.mktemp("cache")
return spack.util.file_cache.FileCache(cache_dir)
return spack.util.file_cache.FileCache(str(cache_dir))
class MockHTTPResponse(io.IOBase):


@@ -14,5 +14,3 @@ config:
checksum: true
dirty: false
locks: {1}
concretization_cache:
enable: false


@@ -161,7 +161,7 @@ def test_handle_unknown_package(temporary_store, config, mock_packages, tmp_path
"""
layout = temporary_store.layout
repo_cache = spack.util.file_cache.FileCache(tmp_path / "cache")
repo_cache = spack.util.file_cache.FileCache(str(tmp_path / "cache"))
mock_db = spack.repo.RepoPath(spack.paths.mock_packages_path, cache=repo_cache)
not_in_mock = set.difference(


@@ -34,7 +34,7 @@ def extra_repo(tmp_path_factory, request):
subdirectory: '{request.param}'
"""
)
repo_cache = spack.util.file_cache.FileCache(cache_dir)
repo_cache = spack.util.file_cache.FileCache(str(cache_dir))
return spack.repo.Repo(str(repo_dir), cache=repo_cache), request.param
@@ -194,7 +194,7 @@ def _repo_paths(repos):
repo_paths, namespaces = _repo_paths(repos)
repo_cache = spack.util.file_cache.FileCache(tmp_path / "cache")
repo_cache = spack.util.file_cache.FileCache(str(tmp_path / "cache"))
repo_path = spack.repo.RepoPath(*repo_paths, cache=repo_cache)
assert len(repo_path.repos) == len(namespaces)
assert [x.namespace for x in repo_path.repos] == namespaces
@@ -362,5 +362,5 @@ def test_repo_package_api_version(tmp_path: pathlib.Path):
namespace: example
"""
)
cache = spack.util.file_cache.FileCache(tmp_path / "cache")
cache = spack.util.file_cache.FileCache(str(tmp_path / "cache"))
assert spack.repo.Repo(str(tmp_path / "example"), cache=cache).package_api == (1, 0)


@@ -1751,7 +1751,6 @@ def test_package_hash_affects_dunder_and_dag_hash(mock_packages, default_mock_co
assert hash(a1) == hash(a2)
assert a1.dag_hash() == a2.dag_hash()
assert a1.process_hash() == a2.process_hash()
a1.clear_caches()
a2.clear_caches()
@@ -1764,7 +1763,6 @@ def test_package_hash_affects_dunder_and_dag_hash(mock_packages, default_mock_co
assert hash(a1) != hash(a2)
assert a1.dag_hash() != a2.dag_hash()
assert a1.process_hash() != a2.process_hash()
def test_intersects_and_satisfies_on_concretized_spec(default_mock_concretization):


@@ -878,9 +878,7 @@ def test_ambiguous_hash(mutable_database):
x1 = spack.concretize.concretize_one("pkg-a")
x2 = x1.copy()
x1._hash = "xyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy"
x1._process_hash = "xyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy"
x2._hash = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
x2._process_hash = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
assert x1 != x2 # doesn't hold when only the dag hash is modified.


@@ -203,13 +203,6 @@ def test_ordered_read_not_required_for_consistent_dag_hash(
spec = spec.copy(deps=ht.dag_hash.depflag)
# specs and their hashes are equal to the original
assert (
spec.process_hash()
== from_yaml.process_hash()
== from_json.process_hash()
== from_yaml_rev.process_hash()
== from_json_rev.process_hash()
)
assert (
spec.dag_hash()
== from_yaml.dag_hash()


@@ -5,17 +5,16 @@
import errno
import math
import os
import pathlib
import shutil
from typing import IO, Dict, Optional, Tuple, Union
from typing import IO, Optional, Tuple
from llnl.util.filesystem import rename
from llnl.util.filesystem import mkdirp, rename
from spack.error import SpackError
from spack.util.lock import Lock, ReadTransaction, WriteTransaction
def _maybe_open(path: Union[str, pathlib.Path]) -> Optional[IO[str]]:
def _maybe_open(path: str) -> Optional[IO[str]]:
try:
return open(path, "r", encoding="utf-8")
except OSError as e:
@@ -25,7 +24,7 @@ def _maybe_open(path: Union[str, pathlib.Path]) -> Optional[IO[str]]:
class ReadContextManager:
def __init__(self, path: Union[str, pathlib.Path]) -> None:
def __init__(self, path: str) -> None:
self.path = path
def __enter__(self) -> Optional[IO[str]]:
@@ -71,7 +70,7 @@ class FileCache:
"""
def __init__(self, root: Union[str, pathlib.Path], timeout=120):
def __init__(self, root, timeout=120):
"""Create a file cache object.
This will create the cache directory if it does not exist yet.
@@ -83,60 +82,58 @@ def __init__(self, root: Union[str, pathlib.Path], timeout=120):
for cache files, this specifies how long Spack should wait
before assuming that there is a deadlock.
"""
if isinstance(root, str):
root = pathlib.Path(root)
self.root = root
self.root.mkdir(parents=True, exist_ok=True)
self.root = root.rstrip(os.path.sep)
if not os.path.exists(self.root):
mkdirp(self.root)
self._locks: Dict[Union[pathlib.Path, str], Lock] = {}
self._locks = {}
self.lock_timeout = timeout
def destroy(self):
"""Remove all files under the cache root."""
for f in self.root.iterdir():
if f.is_dir():
shutil.rmtree(f, True)
for f in os.listdir(self.root):
path = os.path.join(self.root, f)
if os.path.isdir(path):
shutil.rmtree(path, True)
else:
f.unlink()
os.remove(path)
def cache_path(self, key: Union[str, pathlib.Path]):
def cache_path(self, key):
"""Path to the file in the cache for a particular key."""
return self.root / key
return os.path.join(self.root, key)
def _lock_path(self, key: Union[str, pathlib.Path]):
def _lock_path(self, key):
"""Path to the file in the cache for a particular key."""
keyfile = os.path.basename(key)
keydir = os.path.dirname(key)
return self.root / keydir / ("." + keyfile + ".lock")
return os.path.join(self.root, keydir, "." + keyfile + ".lock")
def _get_lock(self, key: Union[str, pathlib.Path]):
def _get_lock(self, key):
"""Create a lock for a key, if necessary, and return a lock object."""
if key not in self._locks:
self._locks[key] = Lock(str(self._lock_path(key)), default_timeout=self.lock_timeout)
self._locks[key] = Lock(self._lock_path(key), default_timeout=self.lock_timeout)
return self._locks[key]
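`_get_lock` lazily creates one lock per cache key and reuses it on later calls; the same pattern with plain `threading` locks (illustrative only, not Spack's file-based `Lock` class):

```python
import threading


class LockRegistry:
    """One lock per key, created on first use and reused thereafter."""

    def __init__(self):
        self._locks = {}

    def get(self, key):
        if key not in self._locks:
            self._locks[key] = threading.Lock()
        return self._locks[key]
```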
def init_entry(self, key: Union[str, pathlib.Path]):
def init_entry(self, key):
"""Ensure we can access a cache file. Create a lock for it if needed.
Return whether the cache file exists yet or not.
"""
cache_path = self.cache_path(key)
# Avoid using pathlib here to allow the logic below to
# function as is
# TODO: Maybe refactor the following logic for pathlib
exists = os.path.exists(cache_path)
if exists:
if not cache_path.is_file():
if not os.path.isfile(cache_path):
raise CacheError("Cache file is not a file: %s" % cache_path)
if not os.access(cache_path, os.R_OK):
raise CacheError("Cannot access cache file: %s" % cache_path)
else:
# if the file is hierarchical, make parent directories
parent = cache_path.parent
if parent != self.root:
parent.mkdir(parents=True, exist_ok=True)
parent = os.path.dirname(cache_path)
if parent.rstrip(os.path.sep) != self.root:
mkdirp(parent)
if not os.access(parent, os.R_OK | os.W_OK):
raise CacheError("Cannot access cache directory: %s" % parent)
@@ -145,7 +142,7 @@ def init_entry(self, key: Union[str, pathlib.Path]):
self._get_lock(key)
return exists
def read_transaction(self, key: Union[str, pathlib.Path]):
def read_transaction(self, key):
"""Get a read transaction on a file cache item.
Returns a ReadTransaction context manager and opens the cache file for
@@ -156,11 +153,9 @@ def read_transaction(self, key: Union[str, pathlib.Path]):
"""
path = self.cache_path(key)
return ReadTransaction(
self._get_lock(key), acquire=lambda: ReadContextManager(path) # type: ignore
)
return ReadTransaction(self._get_lock(key), acquire=lambda: ReadContextManager(path))
def write_transaction(self, key: Union[str, pathlib.Path]):
def write_transaction(self, key):
"""Get a write transaction on a file cache item.
Returns a WriteTransaction context manager that opens a temporary file
@@ -172,11 +167,9 @@ def write_transaction(self, key: Union[str, pathlib.Path]):
if os.path.exists(path) and not os.access(path, os.W_OK):
raise CacheError(f"Insufficient permissions to write to file cache at {path}")
return WriteTransaction(
self._get_lock(key), acquire=lambda: WriteContextManager(path) # type: ignore
)
return WriteTransaction(self._get_lock(key), acquire=lambda: WriteContextManager(path))
def mtime(self, key: Union[str, pathlib.Path]) -> float:
def mtime(self, key) -> float:
"""Return modification time of cache file, or -inf if it does not exist.
Time is in units returned by os.stat in the mtime field, which is
@@ -186,14 +179,14 @@ def mtime(self, key: Union[str, pathlib.Path]) -> float:
if not self.init_entry(key):
return -math.inf
else:
return self.cache_path(key).stat().st_mtime
return os.stat(self.cache_path(key)).st_mtime
def remove(self, key: Union[str, pathlib.Path]):
def remove(self, key):
file = self.cache_path(key)
lock = self._get_lock(key)
try:
lock.acquire_write()
file.unlink()
os.unlink(file)
except OSError as e:
# File not found is OK, so remove is idempotent.
if e.errno != errno.ENOENT:


@@ -106,7 +106,7 @@ spack:
- vecgeom ~cuda
- whizard +fastjet +gosam hepmc=3 +lcio +lhapdf +openloops +openmp +pythia8
- xrootd +davix +http +krb5 +python +readline +scitokens-cpp
- yoda +root
- yoda +hdf5 +highfive
# CUDA
#- acts +cuda +traccc cuda_arch=80


@@ -14,6 +14,7 @@ class AwscliV2(PythonPackage):
maintainers("climbfuji", "teaguesterling")
version("2.24.24", sha256="d7b135ef02c96d50d81c0b5eb2723cf474cfda8e1758cccabbcaa6c14f281419")
version("2.22.4", sha256="56c6170f3be830afef2dea60fc3fd7ed14cf2ca2efba055c085fe6a7c4de358e")
version("2.15.53", sha256="a4f5fd4e09b8f2fb3d2049d0610c7b0993f9aafaf427f299439f05643b25eb4b")
version("2.13.22", sha256="dd731a2ba5973f3219f24c8b332a223a29d959493c8a8e93746d65877d02afc1")


@@ -16,6 +16,10 @@ class Brahma(CMakePackage):
version("develop", branch="develop")
version("master", branch="master")
version("0.0.9", tag="v0.0.9", commit="4af20bbe241c983585e52c04e38868e8b56a9c21")
version("0.0.8", tag="v0.0.8", commit="a99b0f3a688d144b928e41c38977a2aecdaadb41")
version("0.0.7", tag="v0.0.7", commit="010662d1354080244b3b7b32e3e36aa9cfbbf3a1")
version("0.0.6", tag="v0.0.6", commit="e8ac7627d6e607310229b4dfe700715bdc92084e")
version("0.0.5", tag="v0.0.5", commit="219198c653cc4add845a644872e7b963a8de0fe2")
version("0.0.4", tag="v0.0.4", commit="8f41cc885dd8e31a1f134cbbcbaaab7e5d84331e")
version("0.0.3", tag="v0.0.3", commit="fd201c653e8fa00d4ba6197a56a513f740e3014e")


@@ -74,7 +74,7 @@ def edit(self, spec, prefix):
makefile.filter("-mcmodel=medium", "-mcmodel=large")
# Support Intel MPI.
if spec["mpi"].name == "intel-mpi":
if spec["mpi"].name == "intel-oneapi-mpi":
makefile.filter(
"else ifneq (, $(findstring $(MPI),openmpi openMPI OPENMPI))",
"""else ifneq (, $(findstring $(MPI),intel-mpi intel impi))


@@ -0,0 +1,13 @@
diff --git a/src/cgnstools/tkogl/tkogl.c b/src/cgnstools/tkogl/tkogl.c
index 506599d..71b4fb8 100644
--- a/src/cgnstools/tkogl/tkogl.c
+++ b/src/cgnstools/tkogl/tkogl.c
@@ -25,7 +25,7 @@
#if ! defined(__WIN32__) && ! defined(_WIN32)
/* For TkWmAddToColormapWindows. */
#define _TKPORT /* Typical installations cannot find tkPort.h. */
-#include <tk-private/generic/tkInt.h>
+#include <tkInt.h>
#endif
#ifndef CONST

@@ -81,6 +81,8 @@ class Cgns(CMakePackage):
# copied from https://github.com/CGNS/CGNS/pull/757
# (adjusted an include from tk-private/generic/tkInt.h to tkInt.h)
patch("gcc14.patch", when="@:4.4.0 %gcc@14:")
# the tk-private include from the patch above, which Spack cannot resolve, made it into the official 4.5.0 release
patch("cgns-4.5-tk-private.patch", when="@4.5.0")
def cmake_args(self):
spec = self.spec

@@ -113,15 +113,13 @@ def cmake_args(self):
]
)
if spec.satisfies("~simgrid"):
if spec.satisfies("^intel-mkl") or spec.satisfies("^intel-parallel-studio+mkl"):
if spec.satisfies("threads=none"):
args.extend([self.define("BLA_VENDOR", "Intel10_64lp_seq")])
else:
args.extend([self.define("BLA_VENDOR", "Intel10_64lp")])
elif spec.satisfies("^netlib-lapack"):
args.extend([self.define("BLA_VENDOR", "Generic")])
elif spec.satisfies("^openblas"):
args.extend([self.define("BLA_VENDOR", "OpenBLAS")])
if spec.satisfies("^[virtuals=blas,lapack] intel-oneapi-mkl threads=none"):
args.extend([self.define("BLA_VENDOR", "Intel10_64lp_seq")])
elif spec.satisfies("^[virtuals=blas,lapack] intel-oneapi-mkl"):
args.extend([self.define("BLA_VENDOR", "Intel10_64lp")])
elif spec.satisfies("^[virtuals=blas,lapack] netlib-lapack"):
args.extend([self.define("BLA_VENDOR", "Generic")])
elif spec.satisfies("^[virtuals=blas,lapack] openblas"):
args.extend([self.define("BLA_VENDOR", "OpenBLAS")])
return args
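The `^[virtuals=blas,lapack]` syntax pins each check to whichever package actually provides those virtuals, so the branch order above fully determines CMake's `BLA_VENDOR`. A minimal sketch of that mapping, with plain strings standing in for the real `Spec.satisfies` queries (the `provider`/`threads` parameters are illustrative stand-ins, not Spack API):

```python
def bla_vendor(provider: str, threads: str = "none") -> str:
    """Map a BLAS/LAPACK provider name to CMake's BLA_VENDOR value.

    Mirrors the branch order above; 'provider' and 'threads' are
    hypothetical stand-ins for the Spec.satisfies() queries.
    """
    if provider == "intel-oneapi-mkl":
        # sequential vs threaded MKL interface layer
        return "Intel10_64lp_seq" if threads == "none" else "Intel10_64lp"
    if provider == "netlib-lapack":
        return "Generic"
    if provider == "openblas":
        return "OpenBLAS"
    return ""  # let CMake's FindBLAS search unconstrained
```

The ordering matters: the sequential-MKL case must be tested before the general MKL case, exactly as in the diff.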

@@ -15,10 +15,12 @@ class Cli11(CMakePackage):
license("BSD-3-Clause")
version("2.5.0", sha256="17e02b4cddc2fa348e5dbdbb582c59a3486fa2b2433e70a0c3bacb871334fd55")
version("2.4.2", sha256="f2d893a65c3b1324c50d4e682c0cdc021dd0477ae2c048544f39eed6654b699a")
version("2.4.1", sha256="73b7ec52261ce8fe980a29df6b4ceb66243bb0b779451dbd3d014cfec9fdbb58")
version("2.3.2", sha256="aac0ab42108131ac5d3344a9db0fdf25c4db652296641955720a4fbe52334e22")
version("2.3.1", sha256="378da73d2d1d9a7b82ad6ed2b5bda3e7bc7093c4034a1d680a2e009eb067e7b2")
version("2.1.2", sha256="26291377e892ba0e5b4972cdfd4a2ab3bf53af8dac1f4ea8fe0d1376b625c8cb")
version("2.1.1", sha256="d69023d1d0ab6a22be86b4f59d449422bc5efd9121868f4e284d6042e52f682e")
version("2.1.0", sha256="2661b0112b02478bad3dc7f1749c4825bfc7e37b440cbb4c8c0e2ffaa3999112")
version("2.0.0", sha256="2c672f17bf56e8e6223a3bfb74055a946fa7b1ff376510371902adb9cb0ab6a3")
@@ -26,8 +28,11 @@ class Cli11(CMakePackage):
depends_on("cxx", type="build") # generated
variant("pic", default=True, description="Produce position-independent code")
depends_on("cmake@3.4:", type="build")
depends_on("cmake@3.5:", type="build", when="@2.4:")
depends_on("cmake@3.10:", type="build", when="@2.5:")
def cmake_args(self):
args = [
@@ -35,5 +40,6 @@ def cmake_args(self):
self.define("CLI11_BUILD_DOCS", False),
self.define("CLI11_BUILD_TESTS", False),
self.define("CLI11_PRECOMPILED", True),
self.define_from_variant("CMAKE_POSITION_INDEPENDENT_CODE", "pic"),
]
return args

@@ -1,221 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import glob
import os
from spack.package import *
class Converge(Package):
"""CONVERGE is a revolutionary computational fluid dynamics (CFD) program
that eliminates the grid generation bottleneck from the simulation process.
CONVERGE was developed by engine simulation experts and is straightforward
to use for both engine and non-engine simulations. Unlike many CFD
programs, CONVERGE automatically generates a perfectly orthogonal,
structured grid at runtime based on simple, user-defined grid control
parameters. This grid generation method completely eliminates the need to
manually generate a grid. In addition, CONVERGE offers many other features
to expedite the setup process and to ensure that your simulations are as
computationally efficient as possible."""
homepage = "https://www.convergecfd.com/"
url = "https://download.convergecfd.com/download/CONVERGE_2.4/Full_Solver_Packages/converge_install_2.4.10.tar.gz"
# In order to view available versions, you need to register for an account:
# https://download.convergecfd.com/wp-login.php?action=register
version("2.4.10", sha256="5d3c39894598d2395149cfcc653af13b8b1091177290edd62fcf22c7e830d410")
version("2.3.23", sha256="1217d16eaf9d263f917ee468778508bad9dacb7e4397a293cfa6467f39fb4c52")
version(
"2.2.0",
sha256="3885acbaf352c718ea69f0206c858a01be02f0928ffee738e4aceb1dd939a77a",
url="https://download.convergecfd.com/download/CONVERGE_2.2/Full_Solver_Packages/converge_install_2.2.0_042916.tar.gz",
)
version(
"2.1.0",
sha256="6b8896d42cf7b9013cae5456f4dc118306a5bd271d4a15945ceb7dae913e825a",
url="https://download.convergecfd.com/download/CONVERGE_2.1/Full_Solver_Packages/converge_install_2.1.0_111615.tar.gz",
)
version(
"2.0.0",
sha256="f32c4824eb33724d85e283481d67ebd0630b1406011c528d775028bb2546f34e",
url="https://download.convergecfd.com/download/CONVERGE_2.0/Full_Solver_Packages/converge_install_2.0.0_090214.tar.gz",
)
variant("mpi", default=True, description="Build with MPI support")
# The following MPI libraries are compatible with CONVERGE:
#
# +--------------+---------+---------+---------+---------+---------+
# | MPI Packages | v2.0 | v2.1 | v2.2 | v2.3 | v2.4 |
# +--------------+---------+---------+---------+---------+---------+
# | HP-MPI | 2.0.3+ | 2.0.3+ | 2.0.3+ | 2.0.3+ | |
# | Intel MPI | | | | | 17.0.98 |
# | MPICH | ?.?.? | ?.?.? | 1.2.1 | 3.1.4 | ?.?.? |
# | MVAPICH2 | ?.?.? | | | | |
# | Open MPI | 1.0-1.4 | 1.0-1.4 | 1.5-1.8 | 1.5-1.8 | 1.10 |
# | Platform MPI | | | 9.1.2 | 9.1.2 | 9.1.2 |
# +--------------+---------+---------+---------+---------+---------+
#
# NOTE: HP-MPI was bought out by Platform MPI
#
# These version requirements are more strict than for most packages.
# Since the tarball comes with pre-compiled executables,
# the version of libmpi.so must match exactly, or else
# you will end up with missing libraries and symbols.
depends_on("mpi", when="+mpi")
# TODO: Add version ranges for other MPI libraries
depends_on("openmpi@1.10.0:1.10", when="@2.4.0:2.4+mpi^[virtuals=mpi] openmpi")
depends_on("openmpi@1.5:1.8", when="@2.2:2.3+mpi^[virtuals=mpi] openmpi")
depends_on("openmpi@:1.4", when="@:2.1+mpi^[virtuals=mpi] openmpi")
conflicts("^intel-mpi", when="@:2.3")
conflicts("^intel-parallel-studio+mpi", when="@:2.3")
conflicts("^spectrum-mpi")
# Licensing
license_required = True
license_comment = "#"
license_files = ["license/license.lic"]
license_vars = ["RLM_LICENSE"]
license_url = "https://www.reprisesoftware.com/RLM_License_Administration.pdf"
def url_for_version(self, version):
url = "https://download.convergecfd.com/download/CONVERGE_{0}/Full_Solver_Packages/converge_install_{1}.tar.gz"
return url.format(version.up_to(2), version)
def install(self, spec, prefix):
# 2.0.0
# converge -> converge-2.0.0-hpmpi-090214
# converge-2.0.0-hpmpi-090214 -> libmpi.so.1, libmpio.so.1
# converge-2.0.0-mpich2-090214 -> libmpich.so.1.2
# converge-2.0.0-mvapich-090214 -> libibumad.so.1
# converge-2.0.0-openmpi-090214 -> libmpi.so.0
# converge-2.0.0-serial-090214
# make_surface
# post_convert
# 2.1.0
# converge -> converge-2.1.0-hpmpi-111615
# converge-2.1.0-hpmpi-111615 -> libmpi.so.1, libmpio.so.1
# converge-2.1.0-mpich2-111615 -> libmpich.so.1.2
# converge-2.1.0-openmpi-111615 -> libmpi.so.0
# converge-2.1.0-serial-111615
# make_surface
# post_convert
# 2.2.0
# converge -> converge-2.2.0-hpmpi-042916
# converge-2.2.0-hpmpi-042916 -> libmpi.so.1, libmpio.so.1
# converge-2.2.0-mpich2-042916
# converge-2.2.0-openmpi-042916 -> libmpi.so.1
# converge-2.2.0-pmpi-042916 -> libmpi.so.1, libmpio.so.1
# converge-2.2.0-serial-042916
# make_surface
# post_convert
# 2.3.23
# converge-2.3.23-hpmpi-linux-64 -> libmpi.so.1, libmpio.so.1
# converge-2.3.23-mpich2-linux-64 -> libmpi.so.12
# converge-2.3.23-openmpi-linux-64 -> libmpi.so.1
# converge-2.3.23-pmpi-linux-64 -> libmpi.so.1, libmpio.so.1
# converge-2.3.23-serial-linux-64
# make_surface_64
# post_convert_mpich_64 -> libmpi.so.12
# post_convert_ompi_64 -> libmpi.so.1
# post_convert_pmpi_64 -> libmpi.so.1, libmpio.so.1
# post_convert_serial_64
# 2.4.10
# converge-2.4.10-intel -> libmpi.so.12, libmpifort.so.12
# converge-2.4.10-mpich -> libmpi.so.12
# converge-2.4.10-ompi -> libmpi.so.12
# converge-2.4.10-pmpi -> libmpi.so.1, libmpio.so.1
# converge-2.4.10-serial
# make_surface_64
# post_convert_mpich_64 -> libmpi.so.12
# post_convert_ompi_64 -> libmpi.so.1
# post_convert_pmpi_64 -> libmpi.so.1
# post_convert_serial_64
# The CONVERGE tarball comes with binaries for several MPI libraries.
# Only install the binary that matches the MPI we are building with.
with working_dir("l_x86_64/bin"):
if spec.satisfies("~mpi"):
converge = glob.glob("converge-*-serial*")
post_convert = glob.glob("post_convert_serial*")
elif spec.satisfies("^hp-mpi"):
converge = glob.glob("converge-*-hpmpi*")
# No HP-MPI version of post_convert
post_convert = glob.glob("post_convert_serial*")
elif spec.satisfies("^intel-mpi") or spec.satisfies("^intel-parallel-studio+mpi"):
converge = glob.glob("converge-*-intel*")
# No Intel MPI version of post_convert
post_convert = glob.glob("post_convert_serial*")
elif spec.satisfies("^mpich"):
converge = glob.glob("converge-*-mpich*")
post_convert = glob.glob("post_convert_mpich*")
elif spec.satisfies("^mvapich2"):
converge = glob.glob("converge-*-mvapich*")
# MVAPICH2 hasn't been supported since the CONVERGE releases that
# came with a single serial post_convert
post_convert = glob.glob("post_convert")
elif spec.satisfies("^openmpi"):
converge = glob.glob("converge-*-o*mpi*")
post_convert = glob.glob("post_convert_o*mpi*")
elif spec.satisfies("^platform-mpi"):
converge = glob.glob("converge-*-pmpi*")
post_convert = glob.glob("post_convert_pmpi*")
else:
raise InstallError("Unsupported MPI provider")
make_surface = glob.glob("make_surface*")
# Old versions of CONVERGE come with a single serial post_convert
if not post_convert:
post_convert = glob.glob("post_convert")
# Make sure glob actually found something
if not converge:
raise InstallError("converge executable not found")
if not post_convert:
raise InstallError("post_convert executable not found")
if not make_surface:
raise InstallError("make_surface executable not found")
# Make sure glob didn't find multiple matches
if len(converge) > 1:
raise InstallError("multiple converge executables found")
if len(post_convert) > 1:
raise InstallError("multiple post_convert executables found")
if len(make_surface) > 1:
raise InstallError("multiple make_surface executables found")
converge = converge[0]
post_convert = post_convert[0]
make_surface = make_surface[0]
mkdir(prefix.bin)
# Install the executables
install(converge, join_path(prefix.bin, converge))
install(post_convert, join_path(prefix.bin, post_convert))
install(make_surface, join_path(prefix.bin, make_surface))
with working_dir(prefix.bin):
# Create generic symlinks to all executables
if not os.path.exists("converge"):
os.symlink(converge, "converge")
if not os.path.exists("post_convert"):
os.symlink(post_convert, "post_convert")
if not os.path.exists("make_surface"):
os.symlink(make_surface, "make_surface")
def setup_run_environment(self, env):
# CONVERGE searches for a valid license file in:
# $CONVERGE_ROOT/license/license.lic
env.set("CONVERGE_ROOT", self.prefix)

@@ -28,6 +28,8 @@ class Dcap(AutotoolsPackage):
depends_on("libxcrypt")
depends_on("zlib-api")
depends_on("cunit", type="test")
variant("plugins", default=True, description="Build plugins")
def patch(self):

@@ -54,7 +54,7 @@ def cmake_args(self):
"-DSPGLIB_DIR={0}".format(spec["spglib"].prefix),
]
if spec.satisfies("^intel-mkl"):
if spec.satisfies("^[virtuals=scalapack] intel-oneapi-mkl"):
args.append("-DWITH_INTEL_MKL=ON")
else:
args.append("-DWITH_INTEL_MKL=OFF")

@@ -23,6 +23,12 @@ class Elpa(AutotoolsPackage, CudaPackage, ROCmPackage):
version("master", branch="master")
version(
"2025.01.001", sha256="3ef0c6aed9a3e05db6efafe6e14d66eb88b2a1354d61e765b7cde0d3d5f3951e"
)
version(
"2024.05.001", sha256="9caf41a3e600e2f6f4ce1931bd54185179dade9c171556d0c9b41bbc6940f2f6"
)
version(
"2024.03.001", sha256="41c6cbf56d2dac26443faaba8a77307d261bf511682a64b96e24def77c813622"
)

@@ -15,8 +15,14 @@ class Expat(AutotoolsPackage, CMakePackage):
url = "https://github.com/libexpat/libexpat/releases/download/R_2_2_9/expat-2.2.9.tar.bz2"
license("MIT")
version("2.6.4", sha256="8dc480b796163d4436e6f1352e71800a774f73dbae213f1860b60607d2a83ada")
# deprecate all releases before 2.6.4 because of security issues
version("2.7.0", sha256="10f3e94896cd7f44de566cafa2e0e1f35e8df06d119b38d117c0e72d74a4b4b7")
# deprecate all releases before 2.7.0 because of security issues
# CVE-2024-8176 (fixed in 2.7.0)
version(
"2.6.4",
sha256="8dc480b796163d4436e6f1352e71800a774f73dbae213f1860b60607d2a83ada",
deprecated=True,
)
# CVE-2024-50602 (fixed in 2.6.4)
version(
"2.6.3",

@@ -16,6 +16,7 @@ class Fleur(Package):
license("MIT")
version("develop", branch="develop")
version("7.2", tag="MaX-R7.2", commit="447eed3b7ec3de5fcdfbd232cd1eda4caefb51d3")
version("5.1", tag="MaX-R5.1", commit="a482abd9511b16412c2222e2ac1b1a303acd454b")
version("5.0", tag="MaX-R5", commit="f2df362c3dad6ef39938807ea14e4ec4cb677723")
version("4.0", tag="MaX-R4", commit="ea0db7877451e6240124e960c5546318c9ab3953")
@@ -53,7 +54,7 @@ class Fleur(Package):
depends_on("lapack")
depends_on("libxml2")
depends_on("mpi", when="+mpi")
depends_on("intel-mkl", when="fft=mkl")
depends_on("intel-oneapi-mkl", when="fft=mkl")
depends_on("fftw-api", when="fft=fftw")
depends_on("scalapack", when="+scalapack")
depends_on("libxc", when="+external_libxc")
@@ -108,9 +109,9 @@ def configure(self):
options["-includedir"].append(join_path(spec["libxml2"].prefix.include, "libxml2"))
if spec.satisfies("fft=mkl"):
options["-link"].append(spec["intel-mkl"].libs.link_flags)
options["-libdir"].append(spec["intel-mkl"].prefix.lib)
options["-includedir"].append(spec["intel-mkl"].prefix.include)
options["-link"].append(spec["intel-oneapi-mkl"].libs.link_flags)
options["-libdir"].append(spec["intel-oneapi-mkl"].prefix.lib)
options["-includedir"].append(spec["intel-oneapi-mkl"].prefix.include)
if spec.satisfies("fft=fftw"):
options["-link"].append(spec["fftw-api"].libs.link_flags)
options["-libdir"].append(spec["fftw-api"].prefix.lib)

@@ -826,18 +826,6 @@ def configure_args(self):
if spec.satisfies("languages=jit"):
options.append("--enable-host-shared")
# Binutils
if spec.satisfies("+binutils"):
binutils = spec["binutils"].prefix.bin
options.extend(
[
"--with-gnu-ld",
"--with-ld=" + binutils.ld,
"--with-gnu-as",
"--with-as=" + binutils.join("as"),
]
)
# enable_bootstrap
if spec.satisfies("+bootstrap"):
options.extend(["--enable-bootstrap"])

@@ -8,6 +8,7 @@
class Ghost(CMakePackage, CudaPackage):
"""GHOST: a General, Hybrid and Optimized Sparse Toolkit.
This library provides highly optimized building blocks for implementing
sparse iterative eigenvalue and linear solvers on multi- and manycore
clusters and on heterogeneous CPU/GPU machines. For an iterative solver
@@ -31,8 +32,6 @@ class Ghost(CMakePackage, CudaPackage):
variant("scotch", default=False, description="enable/disable matrix reordering with PT-SCOTCH")
variant("zoltan", default=False, description="enable/disable matrix reordering with Zoltan")
# ###################### Dependencies ##########################
# Everything should be compiled position independent (-fpic)
depends_on("cmake@3.5:", type="build")
depends_on("hwloc")
@@ -44,18 +43,14 @@ class Ghost(CMakePackage, CudaPackage):
conflicts("^hwloc@2:")
def cmake_args(self):
spec = self.spec
# note: we require the cblas_include_dir property from the blas
# provider; this is implemented at least for intel-mkl and
# netlib-lapack
args = [
self.define_from_variant("GHOST_ENABLE_MPI", "mpi"),
self.define_from_variant("GHOST_USE_CUDA", "cuda"),
self.define_from_variant("GHOST_USE_SCOTCH", "scotch"),
self.define_from_variant("GHOST_USE_ZOLTAN", "zoltan"),
self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
"-DCBLAS_INCLUDE_DIR:STRING=%s" % format(spec["blas"].headers.directories[0]),
"-DBLAS_LIBRARIES=%s" % spec["blas:c"].libs.joined(";"),
self.define("CBLAS_INCLUDE_DIR", self.spec["blas"].headers.directories[0]),
self.define("BLAS_LIBRARIES", self.spec["blas:c"].libs.joined(";")),
]
return args

@@ -29,92 +29,111 @@ class Git(AutotoolsPackage):
# Every new git release comes with a corresponding manpage resource:
# https://www.kernel.org/pub/software/scm/git/git-manpages-{version}.tar.gz
# https://mirrors.edge.kernel.org/pub/software/scm/git/sha256sums.asc
version("2.47.0", sha256="a84a7917e0ab608312834413f01fc01edc7844f9f9002ba69f3b4f4bcb8d937a")
version("2.46.2", sha256="65c5689fd44f1d09de7fd8c44de7fef074ddd69dda8b8503d44afb91495ecbce")
version("2.45.2", sha256="98b26090ed667099a3691b93698d1e213e1ded73d36a2fde7e9125fce28ba234")
version("2.44.2", sha256="f0655e81c5ecfeef7440aa4fcffa1c1a77eaccf764d6fe29579e9a06eac2cd04")
version("2.43.5", sha256="324c3b85d668e6afe571b3502035848e4b349dead35188e2b8ab1b96c0cd45ff")
version("2.42.3", sha256="f42a8e8f6c0add4516f9e4607554c8ad698161b7d721b82073fe315a59621961")
version("2.41.2", sha256="481aa0a15aa37802880a6245b96c1570d7337c44700d5d888344cd6d43d85306")
version("2.40.3", sha256="b3dc96b20edcdbe6bea7736ea55bb80babf683d126cc7f353ed4e3bc304cd7da")
version("2.39.5", sha256="ca0ec03fb2696f552f37135a56a0242fa062bd350cb243dc4a15c86f1cafbc99")
version("2.48.1", sha256="51b4d03b1e311ba673591210f94f24a4c5781453e1eb188822e3d9cdc04c2212")
version("2.47.2", sha256="a5d26bf7b01b2f0571b5a99300c256e556bd89b2a03088accf7b81bfa4f8f2fd")
version("2.46.3", sha256="f7ae38b1d2c4724cd9088575da470229b3360903a17b531f2e86967d856ed7ed")
version("2.45.3", sha256="40a2c40323d5077eda1e0353806b102813a23a174d24ff4b5aa7b87ffd3fcb03")
version("2.44.3", sha256="4237c37cdf7b3d38102117b22993b2f761a4c02758dfbe33f7b7423c0b096ca9")
version("2.43.6", sha256="f11f89bb48ecb3e18a2ecfb2a2db5a96fd6115d7e617be04e40020a50b03a038")
version("2.42.4", sha256="886898866d624fce14f470773bc19c671c1c858091afdf5815bf569ae14356b6")
version("2.41.3", sha256="2bf6589869f59b9c06e7b71ff8da3d7bb67b75549ca42c6f0ec81ab5e4570aa8")
version("2.40.4", sha256="796993ef502481acbeb7caa22ffbf5f22daf8b273ab6d8dafc0ed178337d2659")
# Deprecated versions (https://groups.google.com/g/git-packagers/c/zxdH4LPix3U)
with default_args(deprecated=True):
version(
"2.47.0", sha256="a84a7917e0ab608312834413f01fc01edc7844f9f9002ba69f3b4f4bcb8d937a"
)
version(
"2.46.2", sha256="65c5689fd44f1d09de7fd8c44de7fef074ddd69dda8b8503d44afb91495ecbce"
)
version(
"2.45.2", sha256="98b26090ed667099a3691b93698d1e213e1ded73d36a2fde7e9125fce28ba234"
)
version(
"2.44.2", sha256="f0655e81c5ecfeef7440aa4fcffa1c1a77eaccf764d6fe29579e9a06eac2cd04"
)
version(
"2.43.5", sha256="324c3b85d668e6afe571b3502035848e4b349dead35188e2b8ab1b96c0cd45ff"
)
version(
"2.42.3", sha256="f42a8e8f6c0add4516f9e4607554c8ad698161b7d721b82073fe315a59621961"
)
version(
"2.41.2", sha256="481aa0a15aa37802880a6245b96c1570d7337c44700d5d888344cd6d43d85306"
)
version(
"2.40.3", sha256="b3dc96b20edcdbe6bea7736ea55bb80babf683d126cc7f353ed4e3bc304cd7da"
)
version(
"2.39.5", sha256="ca0ec03fb2696f552f37135a56a0242fa062bd350cb243dc4a15c86f1cafbc99"
)
# Deprecated versions (https://groups.google.com/g/git-packagers/c/x6-nKLV20aE)
version(
"2.45.1",
sha256="10acb581993061e616be9c5674469335922025a666318e0748cb8306079fef24",
deprecated=True,
)
version(
"2.44.1",
sha256="118214bb8d7ba971a62741416e757562b8f5451cefc087a407e91857897c92cc",
deprecated=True,
)
version(
"2.43.4",
sha256="bfd717dc31922f718232a25a929d199e26146df5e876fdf0ff90a7cc95fa06e2",
deprecated=True,
)
version(
"2.42.2",
sha256="3b24b712fa6e9a3da5b7d3e68b1854466905aadb93a43088a38816bcc3b9d043",
deprecated=True,
)
version(
"2.41.1",
sha256="06d2a681aa7f1bdb6e7f7101631407e7412faa534e1fa0eb6fdcb9975d867d31",
deprecated=True,
)
version(
"2.40.2",
sha256="1dcdfbb4eeb3ef2c2d9154f888d4a6f0cf19f19acad76f0d32e725e7bc147753",
deprecated=True,
)
version(
"2.39.4",
sha256="b895ed2b5d98fd3dcfde5807f16d5fb17c4f83044e7d08e597ae13de222f0d26",
deprecated=True,
)
with default_args(deprecated=True):
version(
"2.45.1", sha256="10acb581993061e616be9c5674469335922025a666318e0748cb8306079fef24"
)
version(
"2.44.1", sha256="118214bb8d7ba971a62741416e757562b8f5451cefc087a407e91857897c92cc"
)
version(
"2.43.4", sha256="bfd717dc31922f718232a25a929d199e26146df5e876fdf0ff90a7cc95fa06e2"
)
version(
"2.42.2", sha256="3b24b712fa6e9a3da5b7d3e68b1854466905aadb93a43088a38816bcc3b9d043"
)
version(
"2.41.1", sha256="06d2a681aa7f1bdb6e7f7101631407e7412faa534e1fa0eb6fdcb9975d867d31"
)
version(
"2.40.2", sha256="1dcdfbb4eeb3ef2c2d9154f888d4a6f0cf19f19acad76f0d32e725e7bc147753"
)
version(
"2.39.4", sha256="b895ed2b5d98fd3dcfde5807f16d5fb17c4f83044e7d08e597ae13de222f0d26"
)
# Deprecated versions (see https://github.blog/2024-05-14-securing-git-addressing-5-new-vulnerabilities/).
version(
"2.42.0",
sha256="34aedd54210d7216a55d642bbb4cfb22695b7610719a106bf0ddef4c82a8beed",
deprecated=True,
)
version(
"2.41.0",
sha256="c4a6a3dd1827895a80cbd824e14d94811796ae54037549e0da93f7b84cb45b9f",
deprecated=True,
)
version(
"2.40.1",
sha256="55511f10f3b1cdf5db4e0e3dea61819dfb67661b0507a5a2b061c70e4f87e14c",
deprecated=True,
)
version(
"2.39.3",
sha256="2f9aa93c548941cc5aff641cedc24add15b912ad8c9b36ff5a41b1a9dcad783e",
deprecated=True,
)
with default_args(deprecated=True):
version(
"2.42.0", sha256="34aedd54210d7216a55d642bbb4cfb22695b7610719a106bf0ddef4c82a8beed"
)
version(
"2.41.0", sha256="c4a6a3dd1827895a80cbd824e14d94811796ae54037549e0da93f7b84cb45b9f"
)
version(
"2.40.1", sha256="55511f10f3b1cdf5db4e0e3dea61819dfb67661b0507a5a2b061c70e4f87e14c"
)
version(
"2.39.3", sha256="2f9aa93c548941cc5aff641cedc24add15b912ad8c9b36ff5a41b1a9dcad783e"
)
depends_on("c", type="build") # generated
for _version, _sha256_manpage in {
"2.48.1": "88742466926d3d682be5214470ae92b79a68796a9d171d393763a5767de5a581",
"2.47.2": "8a36a81ee3a031acbfc831a0972d849aa8777926a6c49c76141b0e0e4744dcb3",
"2.47.0": "1a6f1e775dfe324a9b521793cbd2b3bba546442cc2ac2106d4df33dea9005038",
"2.46.3": "5744ca6fd3ef39d0390400a47f2d7208668433af3d599cfbec7bb1c7140efe79",
"2.46.2": "4bc3774ee4597098977befa4ec30b0f2cbed3b59b756e7cbb59ce1738682d43a",
"2.45.3": "eae81e0d8b00f19c47d7efecfa04642e06e777dd44e3e87ef2b192ba617cddaa",
"2.45.2": "48c1e2e3ecbb2ce9faa020a19fcdbc6ce64ea25692111b5930686bc0bb4f0e7f",
"2.45.1": "d9098fd93a3c0ef242814fc856a99886ce31dae2ba457afc416ba4e92af8f8f5",
"2.44.3": "0f76464bbf8c0f5ccccfbacbd58d121376ff1e5147c4e0753b1ab1d578b9371e",
"2.44.2": "ee6a7238d5ede18fe21c0cc2131c7fbff1f871c25e2848892ee864d40baf7218",
"2.44.1": "8d80359e44cbcce256c1eb1389cb8e15ccfcd267fbb8df567d5ce19ce006eb42",
"2.43.6": "ce364c74d475d321acc8b710558647ee8177876ee529456bd7f92cbb9f6961d8",
"2.43.5": "df3c3d0f0834959aa33005e6f8134c1e56ab01f34d1497ceb34b2dd8ec7d4de4",
"2.43.4": "99d3a0394a6093237123237fd6c0d3de1041d5ceaedc3bfc016807914275d3e2",
"2.42.4": "6d207f38158d9f01c26feccb99af5a65ed3df20a18451649ce1ee718aabc331d",
"2.42.3": "3c8c55dcbc3f59560c63e6ced400f7251e9a00d876d365cb4fe9be6b3c3e3713",
"2.42.2": "2ddfa2187fdaf9ab2b27c0ab043e46793127c26c82a824ffe980f006be049286",
"2.42.0": "51643c53d70ce15dde83b6da2bad76ba0c7bbcd4f944d7c378f03a15b9f2e1de",
"2.41.3": "4f373c1f3d35e8f22f0920928f3d9968aa99a2a5a2673a8ed9964b96c8ee10bf",
"2.41.2": "a758988c81478a942e1593ecf11568b962506bff1119061bad04bd4149b40c2c",
"2.41.1": "7093ef7dacfa8cdb3c4689d8bc1f06186d9b2420bec49087a3a6a4dee26ddcec",
"2.41.0": "7b77c646b36d33c5c0f62677a147142011093270d6fd628ca38c42d5301f3888",
"2.40.4": "4a03ec30184aa27f5cf4123c532590be42d80e4b4797ad096f00b82109de1486",
"2.40.3": "fa9b837e1e161ebdbbbfde27a883a90fe5f603ce1618086a384bccda59c47de5",
"2.40.2": "2c71f3f3e4801176f97708f2093756bce672ef260c6d95c255046e6727b3a031",
"2.40.1": "6bbde434121bd0bf8aa574c60fd9a162388383679bd5ddd99921505149ffd4c2",

@@ -2,6 +2,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import re
from spack.package import *
@@ -15,6 +16,10 @@ class Gsl(AutotoolsPackage, GNUMirrorPackage):
homepage = "https://www.gnu.org/software/gsl"
gnu_mirror_path = "gsl/gsl-2.3.tar.gz"
maintainers("cessenat")
tags = ["hpc"]
executables = ["^gsl-config$"]
license("GPL-3.0-or-later")
@@ -70,3 +75,9 @@ def setup_run_environment(self, env):
# cmake looks for GSL_ROOT_DIR to find GSL so this helps pick the spack one
# when there are multiple installations (e.g. a system one and a spack one)
env.set("GSL_ROOT_DIR", self.prefix)
@classmethod
def determine_version(cls, exe):
output = Executable(exe)("--version", output=str, error=str)
match = re.search(r"\s*(\d[\d\.]+)", output)
return match.group(1) if match else None
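The regex in `determine_version` can be exercised standalone; the sample strings below are assumptions about what `gsl-config --version` prints (typically just the bare version number), not captured from a real run:

```python
import re

def parse_gsl_version(output: str):
    # Same pattern as determine_version above: first digit,
    # followed by any run of digits and dots.
    match = re.search(r"\s*(\d[\d\.]+)", output)
    return match.group(1) if match else None

print(parse_gsl_version("2.7.1\n"))        # -> 2.7.1
print(parse_gsl_version("no digits here"))  # -> None
```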

@@ -588,18 +588,19 @@ def patch(self):
def cmake_args(self):
args = [
# find_package(Clang) and find_package(LLVM) in clr/hipamd/src/hiprtc/CMakeLists.txt
# should find llvm-amdgpu
self.define("LLVM_ROOT", self.spec["llvm-amdgpu"].prefix),
self.define("Clang_ROOT", self.spec["llvm-amdgpu"].prefix),
# Use the new behaviour of the policy CMP0074
# (https://cmake.org/cmake/help/latest/policy/CMP0074.html) which will search
# "prefixes specified by the <PackageName>_ROOT CMake variable".
# From HIP 6.2 onwards the policy is set explicitly by HIP itself:
# https://github.com/ROCm/clr/commit/a2a8dad980b0fa1a6086e0c0f95847ae80f5a2c6.
self.define("CMAKE_POLICY_DEFAULT_CMP0074", "NEW"),
self.define("CMAKE_POLICY_DEFAULT_CMP0074", "NEW")
]
if self.spec.satisfies("+rocm"):
# find_package(Clang) and find_package(LLVM) in clr/hipamd/src/hiprtc/CMakeLists.txt
# should find llvm-amdgpu
args.append(self.define("LLVM_ROOT", self.spec["llvm-amdgpu"].prefix))
args.append(self.define("Clang_ROOT", self.spec["llvm-amdgpu"].prefix))
args.append(self.define("HSA_PATH", self.spec["hsa-rocr-dev"].prefix))
args.append(self.define("HIP_COMPILER", "clang"))
args.append(

@@ -24,11 +24,11 @@ class Hipsycl(CMakePackage, ROCmPackage):
license("BSD-2-Clause")
version("stable", branch="stable", submodules=True)
version("24.10.0", commit="7677cf6eefd8ab46d66168cd07ab042109448124", submodules=True)
version("24.06.0", commit="fc51dae9006d6858fc9c33148cc5f935bb56b075", submodules=True)
version("24.02.0", commit="974adc33ea5a35dd8b5be68c7a744b37482b8b64", submodules=True)
version("23.10.0", commit="3952b468c9da89edad9dff953cdcab0a3c3bf78c", submodules=True)
version("0.9.4", commit="99d9e24d462b35e815e0e59c1b611936c70464ae", submodules=True)
version("0.9.4", commit="99d9e24d462b35e815e0e59c1b611936c70464ae", submodules=True)
version("0.9.3", commit="51507bad524c33afe8b124804091b10fa25618dc", submodules=True)
version("0.9.2", commit="49fd02499841ae884c61c738610e58c27ab51fdb", submodules=True)
version("0.9.1", commit="fe8465cd5399a932f7221343c07c9942b0fe644c", submodules=True)
@@ -44,9 +44,28 @@ class Hipsycl(CMakePackage, ROCmPackage):
depends_on("python@3:")
depends_on("llvm@8: +clang", when="~cuda")
depends_on("llvm@9: +clang", when="+cuda")
# hipSYCL 0.8.0 supported only LLVM 8-10:
# (https://github.com/AdaptiveCpp/AdaptiveCpp/blob/v0.8.0/CMakeLists.txt#L29-L37)
# recent versions support only up to LLVM 18
# https://github.com/spack/spack/issues/46681
# https://github.com/spack/spack/issues/49506
# The following list is based on the LLVM versions tested in the AdaptiveCpp GitHub CI
depends_on("llvm@14:18", when="@develop")
depends_on("llvm@14:18", when="@stable")
depends_on("llvm@14:18", when="@24.10.0")
depends_on("llvm@14:18", when="@24.06.0")
depends_on("llvm@13:17", when="@24.02.0")
depends_on("llvm@13:17", when="@23.10.0")
depends_on("llvm@11:15", when="@0.9.4")
depends_on("llvm@11:14", when="@0.9.3")
depends_on("llvm@11:13", when="@0.9.2")
depends_on("llvm@11", when="@0.9.1")
# depends_on("llvm@10:11", when="@0.9.0") # missing in releases
depends_on("llvm@8:10", when="@0.8.0")
# https://github.com/spack/spack/issues/45029 and https://github.com/spack/spack/issues/43142
conflicts("^gcc@12", when="@23.10.0")
# https://github.com/OpenSYCL/OpenSYCL/pull/918 was introduced after 0.9.4
@@ -74,8 +93,6 @@ class Hipsycl(CMakePackage, ROCmPackage):
"further info please refer to: "
"https://github.com/illuhad/hipSYCL/blob/master/doc/install-cuda.md",
)
# https://github.com/spack/spack/issues/46681
conflicts("^llvm@19", when="@24.02.0:24.06.0")
def cmake_args(self):
spec = self.spec

@@ -23,6 +23,7 @@ class Justbuild(Package):
license("Apache-2.0")
version("master", branch="master")
version("1.5.0", tag="v1.5.0", commit="21d9afbfb744596f0e7646c386870e78dbeab922")
version("1.4.3", tag="v1.4.3", commit="dfbfdc230805a7c92baa7e49d82edc2816e00511")
version("1.4.2", tag="v1.4.2", commit="7fd5d41bc219acf0d15da5dfc75d8dd4a6c53ba3")
version("1.4.1", tag="v1.4.1", commit="2dc306f510c7ba0661d95bd75305f7deb5eb54b2")

@@ -11,20 +11,68 @@ class Kubectl(GoPackage):
"""
homepage = "https://kubernetes.io"
url = "https://github.com/kubernetes/kubernetes/archive/refs/tags/v1.27.0.tar.gz"
url = "https://github.com/kubernetes/kubernetes/archive/refs/tags/v1.32.2.tar.gz"
maintainers("alecbcs")
license("Apache-2.0")
version("1.32.0", sha256="3793859c53f09ebc92e013ea858b8916cc19d7fe288ec95882dada4e5a075d08")
version("1.31.1", sha256="83094915698a9c24f93d1ffda3f17804a4024d3b65eabf681e77a62b35137208")
version("1.31.0", sha256="6679eb90815cc4c3bef6c1b93f7a8451bf3f40d003f45ab57fdc9f8c4e8d4b4f")
version("1.27.1", sha256="3a3f7c6b8cf1d9f03aa67ba2f04669772b1205b89826859f1636062d5f8bec3f")
version("1.27.0", sha256="536025dba2714ee5e940bb0a6b1df9ca97c244fa5b00236e012776a69121c323")
version("1.32.3", sha256="b1ed5abe78a626804aadc49ecb8ade6fd33b27ab8c23d43cd59dc86f6462ac09")
version("1.31.7", sha256="92005ebd010a8d4fe3a532444c4645840e0af486062611a4d9c8d862414c3f56")
version("1.30.11", sha256="f30e4082b6a554d4a2bfedd8b2308a5e6012287e15bec94f72987f717bab4133")
depends_on("bash", type="build")
depends_on("go@1.22:", type="build", when="@1.30:")
depends_on("go@1.23:", type="build", when="@1.32:")
with default_args(deprecated=True):
version(
"1.32.0", sha256="3793859c53f09ebc92e013ea858b8916cc19d7fe288ec95882dada4e5a075d08"
)
version(
"1.31.1", sha256="83094915698a9c24f93d1ffda3f17804a4024d3b65eabf681e77a62b35137208"
)
version(
"1.31.0", sha256="6679eb90815cc4c3bef6c1b93f7a8451bf3f40d003f45ab57fdc9f8c4e8d4b4f"
)
version(
"1.27.1", sha256="3a3f7c6b8cf1d9f03aa67ba2f04669772b1205b89826859f1636062d5f8bec3f"
)
version(
"1.27.0", sha256="536025dba2714ee5e940bb0a6b1df9ca97c244fa5b00236e012776a69121c323"
)
with default_args(type="build"):
depends_on("bash")
depends_on("go@1.23:", when="@1.32:")
depends_on("go@1.22:", when="@1.30:")
depends_on("go@1.21:", when="@1.29:")
depends_on("go@1.20:", when="@1.27:")
build_directory = "cmd/kubectl"
# Required to correctly set the version
# Method discovered by following the trail below
#
# 1. https://github.com/kubernetes/kubernetes/blob/v1.32.2/Makefile#L1
# 2. https://github.com/kubernetes/kubernetes/blob/v1.32.2/build/root/Makefile#L97
# 3. https://github.com/kubernetes/kubernetes/blob/v1.32.2/hack/make-rules/build.sh#L25
# 4. https://github.com/kubernetes/kubernetes/blob/v1.32.2/hack/lib/init.sh#L61
# 5. https://github.com/kubernetes/kubernetes/blob/v1.32.2/hack/lib/version.sh#L151-L183
@property
def build_args(self):
kube_ldflags = [
f"-X 'k8s.io/client-go/pkg/version.gitVersion=v{self.version}'",
f"-X 'k8s.io/client-go/pkg/version.gitMajor={self.version.up_to(1)}'",
f"-X 'k8s.io/client-go/pkg/version.gitMinor={str(self.version).split('.')[1]}'",
f"-X 'k8s.io/component-base/version.gitVersion=v{self.version}'",
f"-X 'k8s.io/component-base/version.gitMajor={self.version.up_to(1)}'",
f"-X 'k8s.io/component-base/version.gitMinor={str(self.version).split('.')[1]}'",
]
args = super().build_args
if "-ldflags" in args:
ldflags_index = args.index("-ldflags") + 1
args[ldflags_index] = args[ldflags_index] + " " + " ".join(kube_ldflags)
else:
args.extend(["-ldflags", " ".join(kube_ldflags)])
return args
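The `-ldflags` merging in `build_args` above can be exercised outside of Spack; a minimal standalone sketch of that same logic (the function name `merge_ldflags` is illustrative, not part of the package):

```python
def merge_ldflags(args, extra):
    # Standalone illustration of the kubectl build_args merging above:
    # if the go build command line already carries -ldflags, append the
    # extra -X flags to its value; otherwise add a fresh -ldflags pair.
    args = list(args)  # do not mutate the caller's list
    joined = " ".join(extra)
    if "-ldflags" in args:
        i = args.index("-ldflags") + 1
        args[i] = args[i] + " " + joined
    else:
        args.extend(["-ldflags", joined])
    return args
```

This keeps any ldflags the base builder already set (for example `-s -w`) while adding the version-stamping `-X` definitions.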


@@ -11,23 +11,40 @@ class Kubernetes(Package):
for deployment, maintenance, and scaling of applications."""
homepage = "https://kubernetes.io"
url = "https://github.com/kubernetes/kubernetes/archive/refs/tags/v1.27.0.tar.gz"
url = "https://github.com/kubernetes/kubernetes/archive/refs/tags/v1.32.2.tar.gz"
maintainers("alecbcs")
license("Apache-2.0")
version("1.32.0", sha256="3793859c53f09ebc92e013ea858b8916cc19d7fe288ec95882dada4e5a075d08")
version("1.27.2", sha256="c6fcfddd38f877ce49c49318973496f9a16672e83a29874a921242950cd1c5d2")
version("1.27.1", sha256="3a3f7c6b8cf1d9f03aa67ba2f04669772b1205b89826859f1636062d5f8bec3f")
version("1.27.0", sha256="536025dba2714ee5e940bb0a6b1df9ca97c244fa5b00236e012776a69121c323")
version("1.32.3", sha256="b1ed5abe78a626804aadc49ecb8ade6fd33b27ab8c23d43cd59dc86f6462ac09")
version("1.31.7", sha256="92005ebd010a8d4fe3a532444c4645840e0af486062611a4d9c8d862414c3f56")
version("1.30.11", sha256="f30e4082b6a554d4a2bfedd8b2308a5e6012287e15bec94f72987f717bab4133")
with default_args(deprecated=True):
version(
"1.32.0", sha256="3793859c53f09ebc92e013ea858b8916cc19d7fe288ec95882dada4e5a075d08"
)
version(
"1.27.2", sha256="c6fcfddd38f877ce49c49318973496f9a16672e83a29874a921242950cd1c5d2"
)
version(
"1.27.1", sha256="3a3f7c6b8cf1d9f03aa67ba2f04669772b1205b89826859f1636062d5f8bec3f"
)
version(
"1.27.0", sha256="536025dba2714ee5e940bb0a6b1df9ca97c244fa5b00236e012776a69121c323"
)
depends_on("c", type="build")
depends_on("bash", type="build")
depends_on("go", type="build")
depends_on("go@1.23:", type="build", when="@1.32:")
depends_on("gmake", type="build")
with default_args(type="build"):
depends_on("bash")
depends_on("gmake")
depends_on("go@1.23:", when="@1.32:")
depends_on("go@1.22:", when="@1.30:")
depends_on("go@1.21:", when="@1.29:")
depends_on("go@1.20:", when="@1.27:")
phases = ["build", "install"]


@@ -14,11 +14,13 @@ class Libgtop(AutotoolsPackage):
license("GPLv2", checked_by="teaguesterling")
version("2.41.3", sha256="775676df958e2ea2452f7568f28b2ea581063d312773dd5c0b7624c1b9b2da8c")
version("2.41.2", sha256="d9026cd8a48d27cdffd332f8d60a92764b56424e522c420cd13a01f40daf92c3")
version("2.41.1", sha256="43ea9ad13f7caf98303e64172b191be9b96bab340b019deeec72251ee140fe3b")
version("2.41.2", sha256="d9026cd8a48d27cdffd332f8d60a92764b56424e522c420cd13a01f40daf92c3")
version("2.41.1", sha256="43ea9ad13f7caf98303e64172b191be9b96bab340b019deeec72251ee140fe3b")
depends_on("c", type="build") # generated
depends_on("pkgconfig", type="build")
with default_args(type=("build", "link", "run")):
depends_on("glib@2.65:", when="@2.40:")


@@ -50,7 +50,7 @@ class Libunwind(AutotoolsPackage):
variant(
"components",
values=any_combination_of("coredump", "ptrace", "setjump"),
values=any_combination_of("coredump", "ptrace", "setjmp"),
description="Build specified libunwind libraries",
)


@@ -77,6 +77,7 @@ class Mpich(MpichEnvironmentModifications, AutotoolsPackage, CudaPackage, ROCmPa
license("mpich2")
version("develop", submodules=True)
version("4.3.0", sha256="5e04132984ad83cab9cc53f76072d2b5ef5a6d24b0a9ff9047a8ff96121bcc63")
version("4.2.3", sha256="7a019180c51d1738ad9c5d8d452314de65e828ee240bcb2d1f80de9a65be88a8")
version("4.2.2", sha256="883f5bb3aeabf627cb8492ca02a03b191d09836bbe0f599d8508351179781d41")
version("4.2.1", sha256="23331b2299f287c3419727edc2df8922d7e7abbb9fd0ac74e03b9966f9ad42d7")


@@ -26,6 +26,7 @@ class Mvapich(MpichEnvironmentModifications, AutotoolsPackage):
license("Unlicense")
# Prefer the latest stable release
version("4.0", sha256="c532f7bdd5cca71f78c12e0885c492f6e276e283711806c84d0b0f80bb3e3b74")
version("3.0", sha256="ee076c4e672d18d6bf8dd2250e4a91fa96aac1db2c788e4572b5513d86936efb")
depends_on("c", type="build")
@@ -67,8 +68,8 @@ class Mvapich(MpichEnvironmentModifications, AutotoolsPackage):
variant(
"pmi_version",
description="Which pmi version to be used. If using pmi2 add it to your CFLAGS",
default="simple",
values=("simple", "pmi2", "pmix"),
default="none",
values=("none", "pmi1", "pmi2", "pmix"),
multi=False,
)
@@ -163,7 +164,6 @@ def process_manager_options(self):
if "process_managers=slurm" in spec:
opts = [
"--with-pm=slurm",
"--with-pmi=simple",
"--with-slurm={0}".format(spec["slurm"].prefix),
"CFLAGS=-I{0}/include/slurm".format(spec["slurm"].prefix),
]
@@ -231,7 +231,8 @@ def configure_args(self):
]
args.extend(self.enable_or_disable("alloca"))
args.append("--with-pmi=" + spec.variants["pmi_version"].value)
if not spec.satisfies("pmi_version=none"):
args.append("--with-pmi=" + spec.variants["pmi_version"].value)
if "pmi_version=pmix" in spec:
args.append("--with-pmix={0}".format(spec["pmix"].prefix))
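The reworked mvapich logic above only passes `--with-pmi` when a PMI implementation is actually selected, with `pmix` additionally pointing at the pmix prefix. As a standalone illustration (not the recipe code itself):

```python
def pmi_configure_args(pmi_version, pmix_prefix=None):
    # Sketch of the new pmi_version handling: the default "none" adds
    # no PMI flags at all; "pmix" also wires in the pmix install prefix.
    args = []
    if pmi_version != "none":
        args.append(f"--with-pmi={pmi_version}")
    if pmi_version == "pmix" and pmix_prefix:
        args.append(f"--with-pmix={pmix_prefix}")
    return args
```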


@@ -265,6 +265,8 @@ class Openfoam(Package):
version("develop", branch="develop", submodules="True")
version("master", branch="master", submodules="True")
version("2412", sha256="c353930105c39b75dac7fa7cfbfc346390caa633a868130fd8c9816ef5f732cd")
version("2406", sha256="8d1450fb89eec1e7cecc55c3bb7bc486ccbf63d069379d1d5d7518fa16a4686a")
version("2312", sha256="f113183a4d027c93939212af8967053c5f8fe76fb62e5848cb11bbcf8e829552")
version("2306", sha256="d7fba773658c0f06ad17f90199565f32e9bf502b7bb03077503642064e1f5344")
version(
@@ -379,6 +381,9 @@ class Openfoam(Package):
depends_on("flex@:2.6.1,2.6.4:")
depends_on("cmake", type="build")
depends_on("m4", type="build")
depends_on("json-c")
depends_on("libyaml")
depends_on("readline")
# Require scotch with ptscotch - corresponds to standard OpenFOAM setup
depends_on("scotch~metis+mpi~int64", when="+scotch~int64")


@@ -1084,10 +1084,16 @@ def configure_args(self):
config_args.extend(["--enable-debug"])
# Package dependencies
for dep in ["libevent", "lustre", "singularity", "valgrind"]:
for dep in ["lustre", "singularity", "valgrind"]:
if "^" + dep in spec:
config_args.append("--with-{0}={1}".format(dep, spec[dep].prefix))
# libevent support
if spec.satisfies("+internal-libevent"):
config_args.append("--with-libevent=internal")
elif "^libevent" in spec:
config_args.append("--with-libevent={0}".format(spec["libevent"].prefix))
# PMIx support
if spec.satisfies("+internal-pmix"):
config_args.append("--with-pmix=internal")


@@ -27,63 +27,39 @@ class Openssl(Package): # Uses Fake Autotools, should subclass Package
license("Apache-2.0")
version("3.4.0", sha256="e15dda82fe2fe8139dc2ac21a36d4ca01d5313c75f99f46c4e8a27709b7294bf")
version("3.3.2", sha256="2e8a40b01979afe8be0bbfb3de5dc1c6709fedb46d6c89c10da114ab5fc3d281")
version("3.2.3", sha256="52b5f1c6b8022bc5868c308c54fb77705e702d6c6f4594f99a0df216acf46239")
version("3.1.7", sha256="053a31fa80cf4aebe1068c987d2ef1e44ce418881427c4464751ae800c31d06c")
version("3.0.15", sha256="23c666d0edf20f14249b3d8f0368acaee9ab585b09e1de82107c66e1f3ec9533")
version("3.4.1", sha256="002a2d6b30b58bf4bea46c43bdd96365aaf8daa6c428782aa4feee06da197df3")
version("3.3.3", sha256="712590fd20aaa60ec75d778fe5b810d6b829ca7fb1e530577917a131f9105539")
version("3.2.4", sha256="b23ad7fd9f73e43ad1767e636040e88ba7c9e5775bfa5618436a0dd2c17c3716")
version("3.1.8", sha256="d319da6aecde3aa6f426b44bbf997406d95275c5c59ab6f6ef53caaa079f456f")
version("3.0.16", sha256="57e03c50feab5d31b152af2b764f10379aecd8ee92f16c985983ce4a99f7ef86")
version(
"3.3.1",
sha256="777cd596284c883375a2a7a11bf5d2786fc5413255efab20c50d6ffe6d020b7e",
deprecated=True,
)
version(
"3.3.0",
sha256="53e66b043322a606abf0087e7699a0e033a37fa13feb9742df35c3a33b18fb02",
deprecated=True,
)
version(
"3.2.2",
sha256="197149c18d9e9f292c43f0400acaba12e5f52cacfe050f3d199277ea738ec2e7",
deprecated=True,
)
version(
"3.2.1",
sha256="83c7329fe52c850677d75e5d0b0ca245309b97e8ecbcfdc1dfdc4ab9fac35b39",
deprecated=True,
)
version(
"3.1.6",
sha256="5d2be4036b478ef3cb0a854ca9b353072c3a0e26d8a56f8f0ab9fb6ed32d38d7",
deprecated=True,
)
version(
"3.1.5",
sha256="6ae015467dabf0469b139ada93319327be24b98251ffaeceda0221848dc09262",
deprecated=True,
)
version(
"3.0.14",
sha256="eeca035d4dd4e84fc25846d952da6297484afa0650a6f84c682e39df3a4123ca",
deprecated=True,
)
version(
"3.0.13",
sha256="88525753f79d3bec27d2fa7c66aa0b92b3aa9498dafd93d7cfa4b3780cdae313",
deprecated=True,
)
with default_args(deprecated=True):
version("3.4.0", sha256="e15dda82fe2fe8139dc2ac21a36d4ca01d5313c75f99f46c4e8a27709b7294bf")
version("3.3.1", sha256="777cd596284c883375a2a7a11bf5d2786fc5413255efab20c50d6ffe6d020b7e")
version("3.3.2", sha256="2e8a40b01979afe8be0bbfb3de5dc1c6709fedb46d6c89c10da114ab5fc3d281")
version("3.3.0", sha256="53e66b043322a606abf0087e7699a0e033a37fa13feb9742df35c3a33b18fb02")
version("3.2.3", sha256="52b5f1c6b8022bc5868c308c54fb77705e702d6c6f4594f99a0df216acf46239")
version("3.2.2", sha256="197149c18d9e9f292c43f0400acaba12e5f52cacfe050f3d199277ea738ec2e7")
version("3.2.1", sha256="83c7329fe52c850677d75e5d0b0ca245309b97e8ecbcfdc1dfdc4ab9fac35b39")
version("3.1.7", sha256="053a31fa80cf4aebe1068c987d2ef1e44ce418881427c4464751ae800c31d06c")
version("3.1.6", sha256="5d2be4036b478ef3cb0a854ca9b353072c3a0e26d8a56f8f0ab9fb6ed32d38d7")
version("3.1.5", sha256="6ae015467dabf0469b139ada93319327be24b98251ffaeceda0221848dc09262")
version(
"3.0.15", sha256="23c666d0edf20f14249b3d8f0368acaee9ab585b09e1de82107c66e1f3ec9533"
)
version(
"3.0.14", sha256="eeca035d4dd4e84fc25846d952da6297484afa0650a6f84c682e39df3a4123ca"
)
version(
"3.0.13", sha256="88525753f79d3bec27d2fa7c66aa0b92b3aa9498dafd93d7cfa4b3780cdae313"
)
version(
"1.1.1w",
sha256="cf3098950cb4d853ad95c0841f1f9c6d3dc102dccfcacd521d93925208b76ac8",
deprecated=True,
)
version(
"1.0.2u",
sha256="ecd0c6ffb493dd06707d38b14bb4d8c2288bb7033735606569d8f90f89669d16",
deprecated=True,
)
version(
"1.1.1w", sha256="cf3098950cb4d853ad95c0841f1f9c6d3dc102dccfcacd521d93925208b76ac8"
)
version(
"1.0.2u", sha256="ecd0c6ffb493dd06707d38b14bb4d8c2288bb7033735606569d8f90f89669d16"
)
depends_on("c", type="build") # generated


@@ -88,6 +88,7 @@ def cmake_args(self):
define("GENF90_PATH", join_path(src, "genf90")),
define_from_variant("BUILD_SHARED_LIBS", "shared"),
define("PIO_ENABLE_EXAMPLES", False),
define_from_variant("WITH_PNETCDF", "pnetcdf"),
]
if spec.satisfies("+ncint"):
args.extend([define("PIO_ENABLE_NETCDF_INTEGRATION", True)])


@@ -15,7 +15,7 @@ class Petsc(Package, CudaPackage, ROCmPackage):
homepage = "https://petsc.org"
url = "https://web.cels.anl.gov/projects/petsc/download/release-snapshots/petsc-3.20.0.tar.gz"
git = "https://gitlab.com/petsc/petsc.git"
maintainers("balay", "barrysmith", "jedbrown")
maintainers("balay", "jczhang07", "joseeroman")
tags = ["e4s"]


@@ -0,0 +1,175 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Psblas(AutotoolsPackage):
"""PSBLAS: Parallel Sparse BLAS. A library of distributed sparse
linear algebra with support for GPU and Multithread acceleration.
Part of the PSCToolkit: Parallel Sparse Computation Toolkit."""
# Url for your package's homepage here.
homepage = "https://psctoolkit.github.io/"
git = "https://github.com/sfilippone/psblas3.git"
url = "https://github.com/sfilippone/psblas3/archive/refs/tags/v3.9.0-rc1.tar.gz"
# List of GitHub accounts to notify when the package is updated.
maintainers("cirdans-home", "sfilippone")
# SPDX identifier of the project's license below.
license("BSD-3-Clause", checked_by="cirdans-home")
version("development", branch="development")
version(
"3.9.0-rc1",
sha256="7a7091ce52582b6fc442e8793e36461be36c0947272ea803ad72736ec2d56da8",
preferred=True,
)
version("3.8.1-2", sha256="285ddb7c9a793ea7ecb428d68cf23f4cc04f1c567631aa84bc2bedb65a3d1b0c")
version("3.8.1", sha256="02e1f00e644426eb15eb08c735cf9c8ae692392f35c2cfe4f7474e1ab91575dc")
version("3.8.0-2", sha256="86a76bb0987edddd4c10c810d7f18e13742aadc66ac14ad3679669809c1184fa")
# Explicit phases: autoreconf, configure, build, install, and post_install
# which compiles the examples in the prefix/samples folder
phases = ["configure", "build", "install", "samples"]
# Variants:
# LPK/IPK: Integer precision variants
variant("LPK", default=8, values=int, description="Length in bytes for long integers (8 or 4)")
variant(
"IPK", default=4, values=int, description="Length in bytes for short integers (8 or 4)"
)
# MPI
variant("mpi", default=True, description="Activates MPI support")
# CUDA
variant(
"cuda", default=False, description="Activate CUDA support", when="@development,3.9.0-rc1"
)
variant(
"cudacc",
default="70,75,80,86,89,90",
multi=True,
description="Specify CUDA Compute Capabilities",
when="+cuda",
)
# METIS
variant("metis", default=False, description="Activate METIS support")
# SuiteSparse: Enable AMD library support via SuiteSparse
variant("amd", default=False, description="Activate AMD support via SuiteSparse")
# OpenMP
variant(
"openmp",
default=False,
description="Activates OpenMP support",
when="@development,3.9.0-rc1",
)
# OpenACC support (requires GCC >= 14.2.0)
variant(
"openacc",
default=False,
description="Activate OpenACC support",
when="@development,3.9.0-rc1",
)
# Additional configure options
variant("ccopt", default="none", description="Additional CCOPT flags")
variant("cxxopt", default="none", description="Additional CXXOPT flags")
variant("fcopt", default="none", description="Additional FCOPT flags")
variant("extra_opt", default="none", description="Additional EXTRA_OPT flags")
variant("libs", default="none", description="Additional link flags")
variant("clibs", default="none", description="Additional CLIBS flags")
variant("flibs", default="none", description="Additional FLIBS flags")
variant("extra_nvcc", default="none", description="Additional EXTRA_NVCC flags", when="+cuda")
variant(
"extraopenacc",
default="none",
description="Additional EXTRAOPENACC flags",
when="+openacc",
)
variant("ccopenacc", default="none", description="Additional CCOPENACC flags", when="+openacc")
variant(
"cxxopenacc", default="none", description="Additional CXXOPENACC flags", when="+openacc"
)
variant("fcopenacc", default="none", description="Additional FCOPENACC flags", when="+openacc")
# Dependencies:
# Languages: Fortran for much of the library, c for the interfaces,
# c++ for the matching routines
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("fortran", type="build")
# MPI
depends_on("mpi", when="+mpi")
# BLAS/LAPACK
depends_on("blas")
depends_on("lapack")
# CUDA
depends_on("cuda", when="+cuda")
# Metis
depends_on("metis@5:+int64", when="+metis LPK=8")
depends_on("metis@5:~int64", when="+metis LPK=4")
depends_on("metis@5:+int64", when="+metis")
# SuiteSparse: Enable AMD library support via SuiteSparse
depends_on("suite-sparse", when="+amd")
# OpenACC support (requires GCC >= 14.2.0)
depends_on("gcc@14.2.0:+nvptx", when="+openacc", type="build")
def configure_args(self):
args = [f"--prefix={self.prefix}"]
# LPK/IPK Choice for integer configuration
args.append(f"--with-lpk={self.spec.variants['LPK'].value}")
args.append(f"--with-ipk={self.spec.variants['IPK'].value}")
# MPI/serial configuration
if "+mpi" in self.spec:
pass
else:
args.append("--enable-serial")
# OPENMP
args.extend(self.enable_or_disable("openmp"))
# CUDA
args.extend(self.enable_or_disable("cuda"))
if "+cuda" in self.spec:
cudacc_values = ",".join(self.spec.variants["cudacc"].value)
args.append(f"--with-cudacc={cudacc_values}")
args.append(f"--with-cudadir={self.spec['cuda'].prefix}")
for opt in ["extra_nvcc"]:
val = self.spec.variants[opt].value
if val != "none":
args.append(f"--with-{opt.replace('_', '-')}={val}")
# OpenACC configuration
args.extend(self.enable_or_disable("openacc"))
if "+openacc" in self.spec:
for opt in ["extraopenacc", "ccopenacc", "cxxopenacc", "fcopenacc"]:
val = self.spec.variants[opt].value
if val != "none":
args.append(f"--with-{opt.replace('_', '-')}={val}")
# METIS configuration
if "+metis" in self.spec:
args.append(f"--with-metisincdir={self.spec['metis'].prefix}/include")
args.append(f"--with-metislibdir={self.spec['metis'].prefix}/lib")
# SuiteSparse configuration for AMD library support
if "+amd" in self.spec:
args.append(f"--with-amddir={self.spec['suite-sparse'].prefix}")
# All the other options
for opt in ["ccopt", "cxxopt", "fcopt", "extra_opt", "libs", "clibs", "flibs"]:
val = self.spec.variants[opt].value
if val != "none":
args.append(f"--with-{opt.replace('_', '-')}={val}")
return args
def configure(self, spec, prefix):
configure = Executable("./configure")
configure(*self.configure_args())
def build(self, spec, prefix):
make()
def install(self, spec, prefix):
make("install")
def samples(self, spec, prefix):
with working_dir(prefix.samples.fileread):
make()
with working_dir(prefix.samples.pargen):
make()


@@ -21,34 +21,10 @@ class PyAmrex(CMakePackage, PythonExtension, CudaPackage, ROCmPackage):
version("develop", branch="development")
version("25.03", sha256="5a65545d46b49dd3f2bca2647a174c3ee0384e49791dc3e335a3a39d9a045350")
version(
"25.02",
sha256="c743086b317f9fa90639d825db32a92376cde8dc5e1eab47a4c6a82af36d5b5c",
deprecated=True,
)
version(
"24.10",
sha256="dc1752ed3fbd5113dcfdbddcfe6c3c458e572b288ac9d41ed3ed7db130591d74",
deprecated=True,
)
version(
"24.08",
sha256="e7179d88261f64744f392a2194ff2744fe323fe0e21d0742ba60458709a1b47e",
deprecated=True,
)
version(
"24.04",
sha256="ab85695bb9644b702d0fc84e77205d264d27ba94999cab912c8a3212a7eb77fc",
deprecated=True,
)
with default_args(deprecated=True):
version("25.02", sha256="c743086b317f9fa90639d825db32a92376cde8dc5e1eab47a4c6a82af36d5b5c")
version(
"24.03",
sha256="bf85b4ad35b623278cbaae2c07e22138545dec0732d15c4ab7c53be76a7f2315",
deprecated=True,
)
for v in ["25.03", "25.02", "24.10", "24.08", "24.04", "24.03"]:
for v in ["25.03", "25.02", "develop"]:
depends_on("amrex@{0}".format(v), when="@{0}".format(v), type=("build", "link"))
variant(
@@ -77,8 +53,7 @@ class PyAmrex(CMakePackage, PythonExtension, CudaPackage, ROCmPackage):
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("cmake@3.20:3", type="build", when="@:24.08")
depends_on("cmake@3.24:3", type="build", when="@24.09:")
depends_on("cmake@3.24:3", type="build")
depends_on("pkgconfig", type="build") # amrex +fft
depends_on("python@3.8:", type=("build", "run"))
depends_on("py-mpi4py@2.1.0:", type=("build", "run"), when="+mpi")
@@ -128,38 +103,29 @@ class PyAmrex(CMakePackage, PythonExtension, CudaPackage, ROCmPackage):
depends_on("py-pandas", type="test")
depends_on("py-cupy", type="test", when="+cuda")
phases = ("cmake", "build", "install", "pip_install_nodeps")
build_targets = ["all", "pip_wheel"]
phases = ("cmake", "build", "install")
build_targets = ["all", "pip_wheel", "pip_install_nodeps"]
tests_src_dir = "tests/"
def cmake_args(self):
args = ["-DpyAMReX_amrex_internal=OFF", "-DpyAMReX_pybind11_internal=OFF"]
return args
pip_args = PythonPipBuilder.std_args(self) + [f"--prefix={self.prefix}"]
idx = pip_args.index("install")
# Docs: https://pyamrex.readthedocs.io/en/24.10/install/cmake.html#build-options
def pip_install_nodeps(self, spec, prefix):
"""Install everything from build directory."""
pip = spec["python"].command
pip.add_default_arg("-m", "pip")
args = PythonPipBuilder.std_args(self) + [
f"--prefix={prefix}",
"--find-links=amrex-whl",
"amrex",
return [
"-DpyAMReX_amrex_internal=OFF",
"-DpyAMReX_pybind11_internal=OFF",
"-DPY_PIP_OPTIONS=" + ";".join(pip_args[:idx]),
"-DPY_PIP_INSTALL_OPTIONS=" + ";".join(pip_args[idx + 1 :]),
]
with working_dir(self.build_directory):
pip(*args)
# todo: from PythonPipBuilder
# ....execute_install_time_tests()
def check(self):
"""Checks after the build phase"""
pytest = which("pytest")
pytest(join_path(self.stage.source_path, self.tests_src_dir))
@run_after("pip_install_nodeps")
@run_after("install")
def copy_test_sources(self):
"""Copy the example test files after the package is installed to an
install test subdirectory for use during `spack test run`."""
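The `PY_PIP_OPTIONS` / `PY_PIP_INSTALL_OPTIONS` split in the py-amrex change above hinges on locating the `install` verb in the pip command line and splitting the argument list around it. A hedged standalone sketch of that split (the helper name is illustrative):

```python
def split_pip_args(pip_args):
    # Mirror of the pip_args.index("install") trick above: everything
    # before the verb becomes global pip options, everything after it
    # becomes install-time options.
    idx = pip_args.index("install")
    return pip_args[:idx], pip_args[idx + 1:]
```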


@@ -15,6 +15,7 @@ class PyChex(PythonPackage):
license("Apache-2.0")
version("0.1.89", sha256="78f856e6a0a8459edfcbb402c2c044d2b8102eac4b633838cbdfdcdb09c6c8e0")
version("0.1.86", sha256="e8b0f96330eba4144659e1617c0f7a57b161e8cbb021e55c6d5056c7378091d1")
version("0.1.85", sha256="a27cfe87119d6e1fe24ccc1438a59195e6dc1d6e0e10099fcf618c3f64771faf")
version("0.1.5", sha256="686858320f8f220c82a6c7eeb54dcdcaa4f3d7f66690dacd13a24baa1ee8299e")


@@ -0,0 +1,35 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyCorrfunc(PythonPackage):
"""Blazing fast correlation functions on the CPU."""
homepage = "https://corrfunc.readthedocs.io/"
pypi = "corrfunc/corrfunc-2.5.3.tar.gz"
maintainers("lgarrison")
license("MIT", checked_by="lgarrison")
version("2.5.3", sha256="32836235e2389f55f028664231f0d6f5716ac0d4226c620c0bbac9407dc225a1")
depends_on("c", type="build")
depends_on("python@3.9:", type=("build", "run"))
depends_on("py-setuptools", type="build")
depends_on("gmake", type="build")
depends_on("py-numpy@1.20:", type=("build", "run"))
depends_on("py-future", type=("build", "run"))
depends_on("py-wurlitzer", type=("build", "run"))
depends_on("gsl@2.4:", type=("build", "link"))
def patch(self):
filter_file(r"^\s*CC\s*[:\?]?=.*", f"CC := {spack_cc}", "common.mk")
filter_file("-march=native", "", "common.mk")
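The `patch` method above rewrites `common.mk` with Spack's `filter_file`; the same two substitutions can be sketched with plain `re` on a string (illustrative only — the real code edits the file in place and substitutes `spack_cc`):

```python
import re

def force_cc(text, cc="cc"):
    # Pin any "CC =", "CC :=" or "CC ?=" assignment line to the chosen
    # compiler, then drop -march=native so the build is portable.
    text = re.sub(r"(?m)^\s*CC\s*[:\?]?=.*$", f"CC := {cc}", text)
    return text.replace("-march=native", "")
```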


@@ -7,23 +7,27 @@
class PyFparser(PythonPackage):
"""
This project is based upon the Fortran (77..2003) parser
This project is based upon the Fortran parser
originally developed by Pearu Peterson for the F2PY project,
www.f2py.com. It provides a parser for Fortran source code
implemented purely in Python with minimal dependencies.
(up to and including F2008) implemented purely in Python with
minimal dependencies.
"""
# Links
homepage = "https://github.com/stfc/fparser"
git = "https://github.com/stfc/fparser.git"
pypi = "fparser/fparser-0.1.4.tar.gz"
pypi = "fparser/fparser-0.2.0.tar.gz"
maintainers("arporter")
# License
license("BSD-3-Clause")
# Releases
version("develop", branch="master")
version("0.2.0", sha256="3901d31c104062c4e532248286929e7405e43b79a6a85815146a176673e69c82")
version("0.1.4", sha256="00d4f7e9bbd8a9024c3c2f308dd3be9b0eeff3cb852772c9f3cf0c4909dbafd4")
version("0.1.3", sha256="10ba8b2803632846f6f011278e3810188a078d89afcb4a38bed0cbf10f775736")
version("0.0.16", sha256="a06389b95a1b9ed12f8141b69c67343da5ba0a29277b2997b02573a93af14e13")
@@ -34,8 +38,6 @@ class PyFparser(PythonPackage):
version("0.0.6", sha256="bf8a419cb528df1bfc24ddd26d63f2ebea6f1e103f1a259d8d3a6c9b1cd53012")
version("0.0.5", sha256="f3b5b0ac56fd22abed558c0fb0ba4f28edb8de7ef24cfda8ca8996562215822f")
depends_on("fortran", type="build") # generated
# Dependencies for latest version
depends_on("py-setuptools@61:", type="build", when="@0.1.4:")
depends_on("py-setuptools", type="build")


@@ -16,12 +16,15 @@ class PyHmmlearn(PythonPackage):
license("BSD-3-Clause")
version("0.3.3", sha256="1d3c5dc4c5257e0c238dc1fe5387700b8cb987eab808edb3e0c73829f1cc44ec")
version("0.3.0", sha256="d13a91ea3695df881465e3d36132d7eef4e84d483f4ba538a4b46e24b5ea100f")
depends_on("cxx", type="build") # generated
depends_on("py-setuptools", type="build")
depends_on("py-setuptools-scm@3.3:", type="build")
depends_on("py-setuptools@62:", when="@0.3.3:", type="build")
depends_on("py-setuptools", when="@:0.3.2", type="build")
depends_on("py-setuptools-scm@6.2:", when="@0.3.3:", type="build")
depends_on("py-setuptools-scm@3.3:", when="@:0.3.2", type="build")
depends_on("py-pybind11@2.6:", type="build")
depends_on("py-numpy@1.10:", type=("build", "run"))


@@ -9,23 +9,37 @@ class PyMacs3(PythonPackage):
"""MACS: Model-based Analysis for ChIP-Seq"""
homepage = "https://github.com/macs3-project/MACS/"
pypi = "MACS3/MACS3-3.0.0b3.tar.gz"
pypi = "MACS3/macs3-3.0.2.tar.gz"
maintainers("snehring")
license("BSD-3-Clause")
version("3.0.3", sha256="ee1c892901c4010ff9e201b433c0623cbd747a3058300322386a7185623b1684")
version("3.0.0b3", sha256="caa794d4cfcd7368447eae15878505315dac44c21546e8fecebb3561e9cee362")
depends_on("c", type="build") # generated
def url_for_version(self, version):
if version < Version("3.0.2"):
url_fmt = "https://files.pythonhosted.org/packages/b1/88/df5436ec1510d635e7c41f2aee185da35d6d98ccde9bf1ec5a67ad2bbd62/MACS3-{}.tar.gz"
return url_fmt.format(version)
return super().url_for_version(version)
depends_on("python@3.9:", type=("build", "run"))
depends_on("py-setuptools@68.0:", when="@3.0.2:", type="build")
depends_on("py-setuptools@60.0:", when="@:3.0.1", type="build")
depends_on("py-cython@3.0", when="@3.0.2:", type=("build", "run"))
depends_on("py-cython@0.29:0", when="@:3.0.1", type=("build", "run"))
depends_on("py-setuptools@60.0:", type="build")
depends_on("py-cython@0.29:0", type=("build", "run"))
depends_on("py-numpy@1.19:", type=("build", "run"))
depends_on("py-numpy@1.25:", when="@3.0.3:", type=("build", "run"))
depends_on("py-numpy@1.24.2:", when="@3.0.1:3.0.2", type=("build", "run"))
depends_on("py-numpy@1.19:", when="@:3.0.0", type=("build", "run"))
depends_on("py-scipy@1.12:", when="@3.0.3:", type=("build", "run"))
depends_on("py-scipy@1.11.4:", when="@3.0.1:3.0.2", type=("build", "run"))
depends_on("py-cykhash@2", type=("build", "run"))
depends_on("py-hmmlearn@0.3:", type=("build", "run"))
depends_on("py-scikit-learn@1.3:", when="@3.0.2:", type=("build", "run"))
depends_on("py-scikit-learn@1.2.1:", when="@3.0.1", type=("build", "run"))
depends_on("py-hmmlearn@0.3.2:", when="@3.0.2:", type=("build", "run"))
depends_on("py-hmmlearn@0.3:", when="@:3.0.1", type=("build", "run"))
depends_on("zlib-api")
depends_on("c", type="build") # generated


@@ -11,8 +11,9 @@ class PyNumexpr(PythonPackage):
homepage = "https://github.com/pydata/numexpr"
url = "https://github.com/pydata/numexpr/archive/v2.7.0.tar.gz"
license("MIT")
license("MIT", checked_by="lgarrison")
version("2.10.2", sha256="7e61a8aa4dacb15787b31c31bd7edf90c026d5e6dbe727844c238726e8464592")
version("2.9.0", sha256="4df4163fcab20030137e8f2aa23e88e1e42e6fe702387cfd95d7675e1d84645e")
version("2.8.8", sha256="10b377c6ec6d9c01349d00e16dd82e6a6f4439c8c2b1945e490df1436c1825f5")
version("2.8.4", sha256="0e21addd25db5f62d60d97e4380339d9c1fb2de72c88b070c279776ee6455d10")
@@ -26,14 +27,16 @@ class PyNumexpr(PythonPackage):
version("2.5", sha256="4ca111a9a27c9513c2e2f5b70c0a84ea69081d7d8e4512d4c3f26a485292de0d")
version("2.4.6", sha256="2681faf55a3f19ba4424cc3d6f0a10610ebd49f029f8453f0ba64dd5c0fe4e0f")
depends_on("c", type="build") # generated
depends_on("cxx", type="build") # generated
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("python@3.7:", when="@2.8.3:", type=("build", "run"))
depends_on("python@3.9:", when="@2.8.7:", type=("build", "run"))
depends_on("py-setuptools", type="build")
depends_on("py-numpy@1.13.3:1.25", type=("build", "run"), when="@2.8.3:")
depends_on("py-numpy@2:", when="@2.10.2:", type="build")
depends_on("py-numpy@1.23:", when="@2.10.2:", type="run")
depends_on("py-numpy@1.13.3:1.25", type=("build", "run"), when="@2.8.3:2.9")
# https://github.com/pydata/numexpr/issues/397
depends_on("py-numpy@1.7:1.22", type=("build", "run"), when="@:2.7")
# https://github.com/pydata/numexpr/pull/478


@@ -21,6 +21,7 @@ class PyNumpy(PythonPackage):
license("BSD-3-Clause")
version("main", branch="main")
version("2.2.4", sha256="9ba03692a45d3eef66559efe1d1096c4b9b75c0986b5dff5530c378fb8331d4f")
version("2.2.3", sha256="dbdc15f0c81611925f382dfa97b3bd0bc2c1ce19d4fe50482cb0ddc12ba30020")
version("2.2.2", sha256="ed6906f61834d687738d25988ae117683705636936cc605be0bb208b23df4d8f")
version("2.2.1", sha256="45681fd7128c8ad1c379f0ca0776a8b0c6583d2f69889ddac01559dfe4390918")


@@ -6,7 +6,7 @@
from spack.package import *
class PyOnnxruntime(CMakePackage, PythonExtension, ROCmPackage):
class PyOnnxruntime(CMakePackage, PythonExtension, ROCmPackage, CudaPackage):
"""ONNX Runtime is a performance-focused complete scoring
engine for Open Neural Network Exchange (ONNX) models, with
an open extensible architecture to continually address the
@@ -35,8 +35,6 @@ class PyOnnxruntime(CMakePackage, PythonExtension, ROCmPackage):
depends_on("c", type="build") # generated
depends_on("cxx", type="build") # generated
variant("cuda", default=False, description="Build with CUDA support")
# cmake/CMakeLists.txt
depends_on("cmake@3.26:", when="@1.17:", type="build")
depends_on("cmake@3.1:", type="build")


@@ -13,6 +13,7 @@ class PyOptax(PythonPackage):
license("Apache-2.0")
version("0.2.4", sha256="4e05d3d5307e6dde4c319187ae36e6cd3a0c035d4ed25e9e992449a304f47336")
version("0.2.1", sha256="fc9f430fa057377140d00aa50611dabbd7e8f4999e3c7543f641f9db6997cb1a")
version("0.1.7", sha256="6a5a848bc5e55e619b187c749fdddc4a5443ea14be85cc769f995779865c110d")
@@ -22,6 +23,8 @@ class PyOptax(PythonPackage):
depends_on("py-absl-py@0.7.1:", type=("build", "run"))
depends_on("py-chex@0.1.7:", when="@0.2.1:", type=("build", "run"))
depends_on("py-chex@0.1.5:", when="@0.1.7", type=("build", "run"))
depends_on("py-chex@0.1.89:", when="@0.2.4", type=("build", "run"))
depends_on("py-jax@0.1.55:", type=("build", "run"))
depends_on("py-jaxlib@0.1.37:", type=("build", "run"))
depends_on("py-numpy@1.18.0:", type=("build", "run"))
depends_on("py-etils@1.7.0", type=("build", "run"), when="@0.2.3:0.2.4")


@@ -8,13 +8,13 @@
class PyPetsc4py(PythonPackage):
"""This package provides Python bindings for the PETSc package."""
homepage = "https://gitlab.com/petsc/petsc4py"
homepage = "https://petsc.org/release/petsc4py"
url = (
"https://web.cels.anl.gov/projects/petsc/download/release-snapshots/petsc4py-3.20.0.tar.gz"
)
git = "https://gitlab.com/petsc/petsc.git"
maintainers("balay")
maintainers("balay", "jczhang07", "joseeroman")
license("BSD-2-Clause")


@@ -24,31 +24,6 @@ class PyPicmistandard(PythonPackage):
version("0.24.0", sha256="55a82adcc14b41eb612caf0d9e47b0e2a56ffc196a58b41fa0cc395c6924be9a")
version("0.23.2", sha256="2853fcfaf2f226a88bb6063ae564832b7e69965294fd652cd2ac04756fa4599a")
version("0.23.1", sha256="c7375010b7a3431b519bc0accf097f2aafdb520e2a0126f42895cb96dcc7dcf1")
version(
"0.0.22",
sha256="e234a431274254b22cd70be64d6555b383d98426b2763ea0c174cf77bf4d0890",
deprecated=True,
)
version(
"0.0.21",
sha256="930056a23ed92dac7930198f115b6248606b57403bffebce3d84579657c8d10b",
deprecated=True,
)
version(
"0.0.20",
sha256="9c1822eaa2e4dd543b5afcfa97940516267dda3890695a6cf9c29565a41e2905",
deprecated=True,
)
version(
"0.0.19",
sha256="4b7ba1330964fbfd515e8ea2219966957c1386e0896b92d36bd9e134afb02f5a",
deprecated=True,
)
version(
"0.0.18",
sha256="68c208c0c54b4786e133bb13eef0dd4824998da4906285987ddee84e6d195e71",
deprecated=True,
)
depends_on("python@3.8:", type=("build", "run"))
depends_on("py-numpy@1.15:1", type=("build", "run"))


@@ -16,13 +16,18 @@ class PyPydftracer(PythonPackage):
version("develop", branch="develop")
version("master", branch="master")
version("1.0.8", tag="v1.0.8", commit="3602f9fbc76da7ae868080112b25d109a4439c79")
version("1.0.5", tag="v1.0.5", commit="e05f83143b88f3e03439af2ff95a1539de2ac811")
version("1.0.4", tag="v1.0.4", commit="757d9aacd0ab4a2b08e9885b34917872f531e0b7")
version("1.0.3", tag="v1.0.3", commit="856de0b958a22081d80a9a25bea3f74e2759d9ee")
version("1.0.2", tag="v1.0.2", commit="8a15f09ff54a909605eda0070689c0b99401db20")
version("1.0.1", tag="v1.0.1", commit="dc1ce44042e669e6da495f906ca5f8b155c9f155")
version("1.0.0", tag="v1.0.0", commit="b6df57d81ffb043b468e2bd3e8df9959fdb4af53")
depends_on("cpp-logger@0.0.4", when="@1.0.0:")
depends_on("brahma@0.0.5", when="@1.0.0:")
depends_on("brahma@0.0.5", when="@1.0.0:1.0.4")
depends_on("brahma@0.0.7", when="@1.0.5:1.0.7")
depends_on("brahma@0.0.9", when="@1.0.8:")
depends_on("yaml-cpp@0.6.3", when="@1.0.0:")
depends_on("py-setuptools@42:", type="build")
depends_on("py-pybind11", type=("build", "run"))

View File

@@ -16,9 +16,10 @@ class PySetuptoolsScmGitArchive(PythonPackage):
license("MIT")
version("1.4.1", sha256="c418bc77b3974d3ac65f268f058f23e01dc5f991f2233128b0e16a69de227b09")
version("1.4", sha256="b048b27b32e1e76ec865b0caa4bb85df6ddbf4697d6909f567ac36709f6ef2f0")
version("1.1", sha256="6026f61089b73fa1b5ee737e95314f41cb512609b393530385ed281d0b46c062")
version("1.0", sha256="52425f905518247c685fc64c5fdba6e1e74443c8562e141c8de56059be0e31da")
depends_on("py-setuptools", type="build")
depends_on("py-setuptools-scm", type=("build", "run"))
depends_on("py-setuptools-scm@:7", type=("build", "run"))

View File

@@ -19,6 +19,10 @@ class PySetuptools(Package, PythonExtension):
# Requires railroad
skip_modules = ["setuptools._vendor", "pkg_resources._vendor"]
version("76.0.0", sha256="199466a166ff664970d0ee145839f5582cb9bca7a0a3a2e795b6a9cb2308e9c6")
version("75.9.1", sha256="0a6f876d62f4d978ca1a11ab4daf728d1357731f978543ff18ecdbf9fd071f73")
version("75.8.2", sha256="558e47c15f1811c1fa7adbd0096669bf76c1d3f433f58324df69f3f5ecac4e8f")
version("75.8.1", sha256="3bc32c0b84c643299ca94e77f834730f126efd621de0cc1de64119e0e17dab1f")
version("75.8.0", sha256="e3982f444617239225d675215d51f6ba05f845d4eec313da4418fdbb56fb27e3")
version("75.3.0", sha256="f2504966861356aa38616760c0f66568e535562374995367b4e69c7143cf6bcd")
version("69.2.0", sha256="c21c49fb1042386df081cb5d86759792ab89efca84cf114889191cd09aacc80c")
@@ -48,66 +52,44 @@ class PySetuptools(Package, PythonExtension):
version("44.1.1", sha256="27a714c09253134e60a6fa68130f78c7037e5562c4f21f8f318f2ae900d152d5")
version("44.1.0", sha256="992728077ca19db6598072414fb83e0a284aca1253aaf2e24bb1e55ee6db1a30")
version("43.0.0", sha256="a67faa51519ef28cd8261aff0e221b6e4c370f8fb8bada8aa3e7ad8945199963")
version(
"41.4.0",
sha256="8d01f7ee4191d9fdcd9cc5796f75199deccb25b154eba82d44d6a042cf873670",
deprecated=True,
)
version(
"41.3.0",
sha256="e9832acd9be6f3174f4c34b40e7d913a146727920cbef6465c1c1bd2d21a4ec4",
deprecated=True,
)
version(
"41.0.1",
sha256="c7769ce668c7a333d84e17fe8b524b1c45e7ee9f7908ad0a73e1eda7e6a5aebf",
deprecated=True,
)
version(
"41.0.0",
sha256="e67486071cd5cdeba783bd0b64f5f30784ff855b35071c8670551fd7fc52d4a1",
deprecated=True,
)
version(
"40.8.0",
sha256="e8496c0079f3ac30052ffe69b679bd876c5265686127a3159cfa415669b7f9ab",
deprecated=True,
)
version(
"40.4.3",
sha256="ce4137d58b444bac11a31d4e0c1805c69d89e8ed4e91fde1999674ecc2f6f9ff",
deprecated=True,
)
version(
"40.2.0",
sha256="ea3796a48a207b46ea36a9d26de4d0cc87c953a683a7b314ea65d666930ea8e6",
deprecated=True,
)
version(
"39.2.0",
sha256="8fca9275c89964f13da985c3656cb00ba029d7f3916b37990927ffdf264e7926",
deprecated=True,
)
version(
"39.0.1",
sha256="8010754433e3211b9cdbbf784b50f30e80bf40fc6b05eb5f865fab83300599b8",
deprecated=True,
)
version(
"25.2.0",
sha256="2845247c359bb91097ccf8f6be8a69edfa44847f3d2d5def39aa43c3d7f615ca",
deprecated=True,
)
version(
"20.7.0",
sha256="8917a52aa3a389893221b173a89dae0471022d32bff3ebc31a1072988aa8039d",
deprecated=True,
)
version(
"20.6.7",
sha256="9982ee4d279a2541dc1a7efee994ff9c535cfc05315e121e09df7f93da48c442",
deprecated=True,
)
with default_args(deprecated=True):
version(
"41.4.0", sha256="8d01f7ee4191d9fdcd9cc5796f75199deccb25b154eba82d44d6a042cf873670"
)
version(
"41.3.0", sha256="e9832acd9be6f3174f4c34b40e7d913a146727920cbef6465c1c1bd2d21a4ec4"
)
version(
"41.0.1", sha256="c7769ce668c7a333d84e17fe8b524b1c45e7ee9f7908ad0a73e1eda7e6a5aebf"
)
version(
"41.0.0", sha256="e67486071cd5cdeba783bd0b64f5f30784ff855b35071c8670551fd7fc52d4a1"
)
version(
"40.8.0", sha256="e8496c0079f3ac30052ffe69b679bd876c5265686127a3159cfa415669b7f9ab"
)
version(
"40.4.3", sha256="ce4137d58b444bac11a31d4e0c1805c69d89e8ed4e91fde1999674ecc2f6f9ff"
)
version(
"40.2.0", sha256="ea3796a48a207b46ea36a9d26de4d0cc87c953a683a7b314ea65d666930ea8e6"
)
version(
"39.2.0", sha256="8fca9275c89964f13da985c3656cb00ba029d7f3916b37990927ffdf264e7926"
)
version(
"39.0.1", sha256="8010754433e3211b9cdbbf784b50f30e80bf40fc6b05eb5f865fab83300599b8"
)
version(
"25.2.0", sha256="2845247c359bb91097ccf8f6be8a69edfa44847f3d2d5def39aa43c3d7f615ca"
)
version(
"20.7.0", sha256="8917a52aa3a389893221b173a89dae0471022d32bff3ebc31a1072988aa8039d"
)
version(
"20.6.7", sha256="9982ee4d279a2541dc1a7efee994ff9c535cfc05315e121e09df7f93da48c442"
)
extends("python")

View File

@@ -8,7 +8,7 @@
class PySlepc4py(PythonPackage):
"""This package provides Python bindings for the SLEPc package."""
homepage = "https://gitlab.com/slepc/slepc4py"
homepage = "https://slepc.upv.es/slepc4py-current/docs"
url = "https://slepc.upv.es/download/distrib/slepc4py-3.17.1.tar.gz"
git = "https://gitlab.com/slepc/slepc.git"

View File

@@ -11,6 +11,11 @@ class PyTqdm(PythonPackage):
homepage = "https://github.com/tqdm/tqdm"
pypi = "tqdm/tqdm-4.45.0.tar.gz"
version("4.67.1", sha256="f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2")
version("4.67.0", sha256="fe5a6f95e6fe0b9755e9469b77b9c3cf850048224ecaa8293d7d2d31f97d869a")
version("4.66.6", sha256="4bdd694238bef1485ce839d67967ab50af8f9272aab687c0d7702a01da0be090")
version("4.66.5", sha256="e1020aef2e5096702d8a025ac7d16b1577279c9d63f8375b63083e9a5f0fcbad")
version("4.66.4", sha256="e4d936c9de8727928f3be6079590e97d9abfe8d39a590be678eb5919ffc186bb")
version("4.66.3", sha256="23097a41eba115ba99ecae40d06444c15d1c0c698d527a01c6c8bd1c5d0647e5")
version("4.66.1", sha256="d88e651f9db8d8551a62556d3cff9e3034274ca5d66e93197cf2490e2dcb69c7")
version("4.65.0", sha256="1871fb68a86b8fb3b59ca4cdd3dcccbc7e6d613eeed31f4c332531977b89beb5")
@@ -28,9 +33,20 @@ class PyTqdm(PythonPackage):
variant("telegram", default=False, description="Enable Telegram bot support")
variant("notebook", default=False, description="Enable Jupyter Notebook support")
depends_on("py-setuptools@42:", type=("build", "run"))
depends_on("py-setuptools-scm@3.4:+toml", type="build")
depends_on("py-colorama", when="platform=windows", type=("build", "run"))
with default_args(type=("build", "run")):
depends_on("python@3.7:", when="@4.65.0:")
depends_on("python@2.7:2,3.4:", when="@4.53.0:")
depends_on("python@2.6:2,3.2:", when="@4.8.4:")
depends_on("py-requests", when="+telegram", type=("build", "run"))
depends_on("py-ipywidgets@6:", when="+notebook", type=("build", "run"))
# not in original requirements, but pyproject.toml [project] requires py-setuptools@61:
depends_on("py-setuptools@61:", when="@4.65.1:")
depends_on("py-setuptools@42:")
depends_on("py-colorama", when="platform=windows")
depends_on("py-requests", when="+telegram")
depends_on("py-ipywidgets@6:", when="+notebook")
with default_args(type="build"):
depends_on("py-setuptools-scm@3.4:+toml")
depends_on("py-wheel", when="@4.53.0:")

View File
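The tqdm refactor above leans on `when=` version constraints such as `@4.65.0:` (that version and newer) and upper bounds like `@:24.09`. A rough sketch of how open-ended ranges like these can be matched, assuming simple dotted numeric versions (real Spack ranges are considerably richer and also handle exact versions, which this sketch does not):

```python
def parse(v):
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def in_range(version, rng):
    """Match 'lo:hi' ranges; either bound may be empty (open-ended)."""
    lo, _, hi = rng.partition(":")
    if lo and parse(version) < parse(lo):
        return False
    if hi and parse(version) > parse(hi):
        return False
    return True

print(in_range("4.67.1", "4.65.0:"))  # tqdm 4.67.1 satisfies "@4.65.0:"
print(in_range("4.53.0", "4.65.0:"))  # too old for that constraint
```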

@@ -1,103 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyWarpx(PythonPackage):
"""This package is deprecated. Please use `warpx +python`.
WarpX is an advanced electromagnetic Particle-In-Cell code. It supports
many features including Perfectly-Matched Layers (PML) and mesh refinement.
In addition, WarpX is a highly-parallel and highly-optimized code and
features hybrid OpenMP/MPI parallelization, advanced vectorization
techniques and load balancing capabilities.
These are the Python bindings of WarpX with PICMI input support.
See the C++ 'warpx' package for the WarpX application and library.
"""
homepage = "https://ecp-warpx.github.io"
url = "https://github.com/ECP-WarpX/WarpX/archive/refs/tags/23.08.tar.gz"
git = "https://github.com/ECP-WarpX/WarpX.git"
maintainers("ax3l", "dpgrote", "EZoni", "RemiLehe")
tags = ["e4s", "ecp"]
license("BSD-3-Clause-LBNL")
# NOTE: if you update the versions here, also see warpx
with default_args(deprecated=True):
version("develop", branch="development")
version("23.08", sha256="22b9ad39c89c3fae81b1ed49abd72f698fbc79d8ac3b8dd733fd02185c0fdf63")
version("23.07", sha256="eab91672191d7bf521cda40f2deb71a144da452b35245567b14873938f97497e")
version("23.06", sha256="9bbdcbfae3d7e87cd4457a0debd840897e9a9d15e14eaad4be8c8e16e13adca2")
version("23.05", sha256="fa4f6d8d0537057313bb657c60d5fd91ae6219d3de95d446282f1ad47eeda863")
version("23.04", sha256="90462a91106db8e241758aeb8aabdf3e217c9e3d2c531d6d30f0d03fd3ef789c")
version("23.03", sha256="26e79f8c7f0480b5f795d87ec2663e553b357894fc4b591c8e70d64bfbcb72a4")
version("23.02", sha256="596189e5cebb06e8e58a587423566db536f4ac3691d01db8d11f5503f1e7e35e")
version("23.01", sha256="ed0536ae5a75df4b93c275c6839a210fba61c9524a50ec6217c44d5ac83776b3")
version("22.12", sha256="bdd0a9ec909a5ac00f288bb4ab5193136b460e39e570ecb37d3d5d742b7e67bf")
version("22.11", sha256="03c580bcd0cf7b99a81b9ef09d14172c96672c7fb028a0ed6728b3cc9ec207e7")
version("22.10", sha256="736747184eaae65fb1bbeb6867b890c90b233132bc185d85ea605525637e7c53")
version("22.09", sha256="0dc7076bad1c46045abd812729fa650bc4f3d17fdfded6666cbaf06da70f09d7")
version("22.08", sha256="95930c4f4fc239dfe4ed36b1023dd3768637ad37ff746bb33cf05231ca08ee7a")
version("22.07", sha256="7d91305f8b54b36acf2359daec5a94e154e2a8d7cbae97429ed8e93f7c5ea661")
version("22.06", sha256="faa6550d7dd48fc64d4b4d67f583d34209c020bf4f026d10feb7c9b224caa208")
version("22.05", sha256="be97d695a425cfb0ecd3689a0b9706575b7b48ce1c73c39d4ea3abd616b15ad7")
version("22.04", sha256="ff6e3a379fafb76e311b2f48089da6b1ab328c5b52eccd347c41cce59d0441ed")
version("22.03", sha256="808a9f43514ee40fa4fa9ab5bf0ed11219ab6f9320eb414bb4f043fab112f7a0")
version("22.02", sha256="3179c54481c5dabde77a4e9a670bb97b599cecc617ad30f32ab3177559f67ffe")
version("22.01", sha256="73a65c1465eca80f0db2dab4347c22ddf68ad196e3bd0ccc0861d782f13b7388")
depends_on("cxx", type="build") # generated
variant("mpi", default=True, description="Enable MPI support")
for v in [
"23.08",
"23.07",
"23.06",
"23.05",
"23.04",
"23.03",
"23.02",
"23.01",
"22.12",
"22.11",
"22.10",
"22.09",
"22.08",
"22.07",
"22.06",
"22.05",
"22.04",
"22.03",
"22.02",
"22.01",
"develop",
]:
depends_on("warpx@{0}".format(v), when="@{0}".format(v), type=["build", "link"])
depends_on("python@3.7:", type=("build", "run"))
depends_on("python@3.8:", type=("build", "run"), when="@23.09:")
depends_on("py-numpy@1.15.0:1", type=("build", "run"))
depends_on("py-mpi4py@2.1.0:", type=("build", "run"), when="+mpi")
depends_on("py-periodictable@1.5:1", type=("build", "run"))
depends_on("py-picmistandard@0.25.0", type=("build", "run"), when="@23.07:")
depends_on("py-picmistandard@0.24.0", type=("build", "run"), when="@23.06")
depends_on("py-picmistandard@0.23.2", type=("build", "run"), when="@23.04:23.05")
depends_on("py-picmistandard@0.0.22", type=("build", "run"), when="@22.12:23.03")
depends_on("py-picmistandard@0.0.20", type=("build", "run"), when="@22.10:22.11")
depends_on("py-picmistandard@0.0.19", type=("build", "run"), when="@22.02:22.09")
depends_on("py-picmistandard@0.0.18", type=("build", "run"), when="@22.01")
depends_on("py-setuptools@42:", type="build")
# Since we use PYWARPX_LIB_DIR to pull binaries out of the
# 'warpx' spack package, we don't need cmake as declared
depends_on("warpx +lib ~mpi +shared", type=("build", "link"), when="~mpi")
depends_on("warpx +lib +mpi +shared", type=("build", "link"), when="+mpi")
def setup_build_environment(self, env):
env.set("PYWARPX_LIB_DIR", self.spec["warpx"].prefix.lib)

View File
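The deleted py-warpx recipe above hands the warpx install prefix to its build through `env.set("PYWARPX_LIB_DIR", ...)` in `setup_build_environment`. A sketch of the hook's mechanics with a stand-in environment object (the class and paths here are hypothetical; Spack's real `EnvironmentModifications` records many kinds of edits, not just `set`):

```python
class EnvMods:
    """Stand-in for an environment-modifications object: record now, apply later."""

    def __init__(self):
        self._sets = {}

    def set(self, name, value):
        self._sets[name] = value

    def apply_modifications(self, environ):
        environ.update(self._sets)

# a recipe's hook only records the change...
env = EnvMods()
env.set("PYWARPX_LIB_DIR", "/opt/spack/warpx/lib")

# ...and the build system applies it to the child process environment later
build_environ = {}
env.apply_modifications(build_environ)
print(build_environ["PYWARPX_LIB_DIR"])
```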

@@ -10,11 +10,17 @@ class PyWheel(Package, PythonExtension):
"""A built-package format for Python."""
homepage = "https://github.com/pypa/wheel"
url = "https://files.pythonhosted.org/packages/py3/w/wheel/wheel-0.41.2-py3-none-any.whl"
url = "https://files.pythonhosted.org/packages/py3/w/wheel/wheel-0.45.1-py3-none-any.whl"
list_url = "https://pypi.org/simple/wheel/"
tags = ["build-tools"]
version("0.45.1", sha256="708e7481cc80179af0e556bbf0cc00b8444c7321e2700b8d8580231d13017248")
version("0.45.0", sha256="52f0baa5e6522155090a09c6bd95718cc46956d1b51d537ea5454249edb671c7")
version("0.44.0", sha256="2376a90c98cc337d18623527a97c31797bd02bad0033d41547043a1cbfbe448f")
version("0.43.0", sha256="55c570405f142630c6b9f72fe09d9b67cf1477fcf543ae5b8dcb1f5b7377da81")
version("0.42.0", sha256="177f9c9b0d45c47873b619f5b650346d632cdc35fb5e4d25058e09c9e581433d")
version("0.41.3", sha256="488609bc63a29322326e05560731bf7bfea8e48ad646e1f5e40d366607de0942")
version("0.41.2", sha256="75909db2664838d015e3d9139004ee16711748a52c8f336b52882266540215d8")
version("0.37.1", sha256="4bdcd7d840138086126cd09254dc6195fb4fc6f01c050a1d7236f2630db1d22a")
version("0.37.0", sha256="21014b2bd93c6d0034b6ba5d35e4eb284340e09d63c59aef6fc14b0f346146fd")
@@ -30,6 +36,7 @@ class PyWheel(Package, PythonExtension):
extends("python")
depends_on("python +ctypes", type=("build", "run"))
depends_on("python@3.8:", when="@0.43.0:", type=("build", "run"))
depends_on("python@3.7:", when="@0.38:", type=("build", "run"))
depends_on("py-pip", type="build")

View File

@@ -18,6 +18,8 @@ class Rivet(AutotoolsPackage):
license("GPL-3.0-or-later")
version("4.1.0", sha256="6548a351a44e5a4303fb2277e7521690a9d84195df96d92c707b816f3b40c843")
version("4.0.3", sha256="dbb97b769d1877f34c3c50190127bfda7847bcec79ba1de59561fdf43cdd4869")
version("4.0.2", sha256="65a3b36f42bff782ed2767930e669e09b140899605d7972fc8f77785b4a882c0")
version("4.0.1", sha256="4e8692d6e8a53961c77983eb6ba4893c3765cf23f705789e4d865be4892eff79")
version("4.0.0", sha256="d3c42d9b83ede3e7f4b534535345c2e06e6dafb851454c2b0a5d2331ab0f04d0")
@@ -68,6 +70,7 @@ class Rivet(AutotoolsPackage):
depends_on("yoda@1.9.11:", when="@3.1.11:")
depends_on("yoda@:1", when="@:3")
depends_on("yoda@2.0.1:", when="@4.0.0:")
depends_on("yoda@2.1.0:", when="@4.1.0:")
# The following versions were not a part of LCG stack
# and thus the exact version of YODA is unknown
@@ -89,6 +92,7 @@ class Rivet(AutotoolsPackage):
depends_on("fastjet@3.4.0:", when="@3.1.7:")
depends_on("fjcontrib")
depends_on("highfive", when="@4:")
depends_on("yaml-cpp", when="@4.1:")
depends_on("python", type=("build", "run"))
depends_on("py-cython@0.24.0:", type="build")
depends_on("swig", type="build")
@@ -156,6 +160,9 @@ def configure_args(self):
if self.spec.satisfies("^highfive"):
args += ["--with-highfive=" + self.spec["highfive"].prefix]
if self.spec.satisfies("^yaml-cpp"):
args += ["--with-yaml-cpp=" + self.spec["yaml-cpp"].prefix]
args += ["--disable-pdfmanual"]
return args

View File
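The rivet hunks above follow a common Autotools-recipe pattern: probe the concretized spec for an optional dependency and emit a matching `--with-<pkg>=<prefix>` flag. A condensed sketch with hypothetical prefixes:

```python
def configure_args(dep_prefixes):
    """Map present dependencies to --with-<name>=<prefix> configure flags."""
    args = [f"--with-{name}={prefix}"
            for name, prefix in sorted(dep_prefixes.items())]
    args.append("--disable-pdfmanual")
    return args

args = configure_args({"highfive": "/opt/highfive", "yaml-cpp": "/opt/yaml-cpp"})
print(args)
```

Because the dictionary only contains dependencies that concretized into the spec, absent optional packages simply produce no flag, which is exactly what the `self.spec.satisfies("^yaml-cpp")` guard achieves in the recipe.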

@@ -34,6 +34,9 @@ class Seacas(CMakePackage):
# ###################### Versions ##########################
version("master", branch="master")
version(
"2025-03-13", sha256="406aff5b8908d6a3bf6687d825905990101caa9cf8c1213a508938eed2134d6d"
)
version(
"2025-02-27", sha256="224468d6215b4f4b15511ee7a29f755cdd9e7be18c08dfece9d9991e68185cfc"
)

View File

@@ -30,6 +30,9 @@ class Spectre(CMakePackage):
license("MIT")
version("develop", branch="develop")
version(
"2025.03.17", sha256="3b3187a6d0e0f0386ae9bf06b77485771b841886a6e71abb12516c2aa874ae65"
)
version(
"2025.01.30", sha256="1b79c297ca85e9c2c6242e3880144587fc8a1791124887a83f428c6301a80fe3"
)

View File

@@ -15,7 +15,7 @@ class SstCore(AutotoolsPackage):
git = "https://github.com/sstsimulator/sst-core.git"
url = "https://github.com/sstsimulator/sst-core/releases/download/v14.1.0_Final/sstcore-14.1.0.tar.gz"
maintainers("berquist", "naromero77")
maintainers("berquist", "jmlapre", "naromero77")
license("BSD-3-Clause")

View File

@@ -17,7 +17,7 @@ class SstDumpi(AutotoolsPackage):
url = "https://github.com/sstsimulator/sst-dumpi/archive/refs/tags/v13.0.0_Final.tar.gz"
git = "https://github.com/sstsimulator/sst-dumpi.git"
maintainers("berquist", "jpkenny", "calewis")
maintainers("berquist", "jmlapre")
license("BSD-3-Clause")

View File

@@ -15,7 +15,7 @@ class SstElements(AutotoolsPackage):
git = "https://github.com/sstsimulator/sst-elements.git"
url = "https://github.com/sstsimulator/sst-elements/releases/download/v14.1.0_Final/sstelements-14.1.0.tar.gz"
maintainers("berquist", "naromero77")
maintainers("berquist", "jmlapre", "naromero77")
license("BSD-3-Clause")

View File

@@ -18,7 +18,7 @@ class SstMacro(AutotoolsPackage):
git = "https://github.com/sstsimulator/sst-macro.git"
url = "https://github.com/sstsimulator/sst-macro/releases/download/v14.1.0_Final/sstmacro-14.1.0.tar.gz"
maintainers("berquist")
maintainers("berquist", "jmlapre")
version("14.1.0", sha256="241f42f5c460b0e7462592a7f412bda9c9de19ad7a4b62c22f35be4093b57014")
version("14.0.0", sha256="3962942268dd9fe6ebd4462e2d6d305ab757f3f510487e84687146a8d461be13")

View File

@@ -12,7 +12,7 @@ class SstTransports(CMakePackage):
homepage = "https://github.com/sstsimulator"
git = "https://github.com/jjwilke/sst-transports.git"
maintainers("jjwilke")
maintainers("berquist", "jmlapre")
license("BSD-3-Clause")

View File

@@ -3,6 +3,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#
# ----------------------------------------------------------------------------
import os
from spack.package import *
@@ -122,7 +123,7 @@ class Timemory(CMakePackage, PythonExtension):
variant(
"cpu_target",
default="auto",
description=("Build for specific cpu architecture (specify " "cpu-model)"),
description="Build for specific cpu architecture (specify " "cpu-model)",
)
variant(
"use_arch",
@@ -143,7 +144,7 @@ class Timemory(CMakePackage, PythonExtension):
variant(
"statistics",
default=True,
description=("Build components w/ support for statistics " "(min/max/stddev)"),
description="Build components w/ support for statistics " "(min/max/stddev)",
)
variant(
"extra_optimizations",
@@ -215,7 +216,7 @@ class Timemory(CMakePackage, PythonExtension):
depends_on("caliper", when="+caliper")
depends_on("dyninst", when="+dyninst")
depends_on("gperftools", when="+gperftools")
depends_on("intel-parallel-studio", when="+vtune")
depends_on("intel-oneapi-vtune", when="+vtune")
depends_on("arm-forge", when="+allinea_map")
conflicts(
@@ -335,4 +336,26 @@ def cmake_args(self):
args.append(self.define_from_variant(key, "cuda_arch"))
args.append(self.define_from_variant("CMAKE_CUDA_STANDARD", "cudastd"))
if self.spec.satisfies("+vtune"):
ittnotify_include = os.path.join(
self["intel-oneapi-vtune"].component_prefix, "include"
)
ittnotify_libraries = find_libraries(
"libittnotify",
root=self["intel-oneapi-vtune"].component_prefix,
shared=False,
recursive=True,
)
if len(ittnotify_libraries) != 1:
vtune_spec = self.spec["intel-oneapi-vtune"]
raise InstallError(f"{self.spec} cannot find libittnotify from {vtune_spec}")
ittnotify_library = ittnotify_libraries.libraries[0]
args.extend(
[
self.define("ITTNOTIFY_INCLUDE_DIR", ittnotify_include),
self.define("ITTNOTIFY_LIBRARY", ittnotify_library),
]
)
return args

View File
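The timemory change above locates a single static `libittnotify` under the VTune component prefix and raises an install error unless exactly one candidate turns up. A minimal glob-based stand-in for that lookup (Spack's `find_libraries` returns a richer `LibraryList` object; the function name here is invented):

```python
import glob
import os

def find_one_static_lib(name, root):
    """Recursively search root for <name>.a; insist on exactly one match."""
    matches = glob.glob(os.path.join(root, "**", name + ".a"), recursive=True)
    if len(matches) != 1:
        raise RuntimeError(
            f"expected exactly one {name}.a under {root}, found {len(matches)}"
        )
    return matches[0]
```

Failing loudly at configure time, as the recipe does, beats letting CMake cache an empty `ITTNOTIFY_LIBRARY` and fail obscurely mid-build.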

@@ -17,6 +17,8 @@ class UtilLinuxUuid(AutotoolsPackage):
license("BSD-3-Clause", checked_by="wdconinc")
version("2.40.4", sha256="5b3b1435c02ba201ebaa5066bb391965a614b61721155dfb7f7b6569e95b0627")
version("2.40.3", sha256="6d72589a24b7feccdf8db20336bb984f64c7cfc2ceb044ef01cac5dce480284e")
version("2.40.2", sha256="7bec316b713a14c6be1a5721aa0e56a3b6170277329e6e1f1a56013cc91eece0")
version("2.40.1", sha256="8e396eececae2b3b68db232c33b8810faa7c31f6df19f98f512739293d5829b7")
version("2.38.1", sha256="0820eb8eea90408047e3715424bc6be771417047f683950fecb4bdd2e2cbbc6e")

View File

@@ -20,10 +20,19 @@ class Vecgeom(CMakePackage, CudaPackage):
maintainers("drbenmorgan", "sethrj")
version("master", branch="master", get_full_repo=True)
# NOTE: the surfacedev branches are not stable or official and will be
# deleted when the next 2.0 RC comes out
version(
"2.0.0-surfacedev.2",
tag="v2.0.0-surfacedev.2",
commit="91f5ee554e012ffa3baecd2b30e6f5e6905e5ffb",
deprecated=True,
)
version(
"2.0.0-surfacedev.1",
tag="v2.0.0-surfacedev.1",
commit="1d9797ea47e3b35ab0114e72ce5925ecbd59cbf4",
deprecated=True,
)
version(
"1.2.10",
@@ -91,6 +100,7 @@ class Vecgeom(CMakePackage, CudaPackage):
description="Use the specified C++ standard when building",
)
variant("gdml", default=True, description="Support native GDML geometry descriptions")
# TODO: delete geant4/root variants since they don't affect the build
variant("geant4", default=False, description="Support Geant4 geometry construction")
variant("root", default=False, description="Support ROOT geometry construction")
variant("shared", default=True, description="Build shared libraries")

View File

@@ -20,6 +20,7 @@ class Vep(Package):
license("APACHE-2.0", checked_by="teaguesterling")
version("113.3", sha256="af3af9d0ecb8a73fec04ff694bcd2ee31a1ef963fe073dc5f47a996f1407e142")
version("112.0", sha256="46dd08838fd94ecbfaa931266c78570748a3cb39668b6e43c3608e6cd0aff93f")
version("111.0", sha256="9cb326a1fa0054ce1a417f8fd4f2325ba605c40ec10eefbf87f461c264a89407")
version("110.0", sha256="391a1fe50139064c1044c09e013bb21437933d677537b5d3336807f3b131fb51")

View File

@@ -33,6 +33,11 @@ class Verible(Package):
version("master", branch="master")
version(
"0.0.3946",
sha256="1454b8df8a978c11139b800b229b498c02bb22f8f2b97a21022b2a6f099604af",
url="https://github.com/chipsalliance/verible/archive/refs/tags/v0.0-3946-g851d3ff4.tar.gz",
)
version(
"0.0.3929",
sha256="dc37ea4e07aca5770c43c51a9cc3dbd3a3e5fa8027da701c3ebf8873fc086157",

View File
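The verible entry above needs a per-version `url=` because the upstream tags look like `v0.0-3946-g851d3ff4` while the Spack version string is `0.0.3946`. A sketch of deriving such URLs, treating the trailing short git hash as per-version input (the mapping is an assumption read off the one tag shown in the diff):

```python
def verible_url(version, git_hash):
    """Rebuild the tag-style tarball URL from a dotted version plus its hash."""
    base, _, build = version.rpartition(".")   # "0.0.3946" -> ("0.0", "3946")
    tag = f"v{base}-{build}-g{git_hash}"
    return ("https://github.com/chipsalliance/verible/"
            f"archive/refs/tags/{tag}.tar.gz")

print(verible_url("0.0.3946", "851d3ff4"))
```

Since the hash cannot be computed from the version alone, spelling out the full `url=` per version, as the recipe does, is the simpler and safer choice.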

@@ -23,53 +23,24 @@ class Warpx(CMakePackage, PythonExtension):
license("BSD-3-Clause-LBNL")
# NOTE: if you update the versions here, also see py-warpx
version("develop", branch="development")
version("25.03", sha256="18155ff67b036a00db2a25303058316167192a81cfe6dc1dec65fdef0b6d9903")
with default_args(deprecated=True):
version("25.02", sha256="7bdea9c1e94f82dbc3565f14f6b6ad7658a639217a10a6cf08c05a16aa26266f")
version("24.10", sha256="73b486b5fc561d97773fe95bb82751b9085aa8dfe27b4e2f285d646396b41323")
version("24.08", sha256="081b0d803d7b2b491626ba36e87e867b1fd1d20ddf0dee9c6ed4ff84f7d37553")
version("23.08", sha256="22b9ad39c89c3fae81b1ed49abd72f698fbc79d8ac3b8dd733fd02185c0fdf63")
version("23.07", sha256="eab91672191d7bf521cda40f2deb71a144da452b35245567b14873938f97497e")
version("23.06", sha256="9bbdcbfae3d7e87cd4457a0debd840897e9a9d15e14eaad4be8c8e16e13adca2")
version("23.05", sha256="fa4f6d8d0537057313bb657c60d5fd91ae6219d3de95d446282f1ad47eeda863")
version("23.04", sha256="90462a91106db8e241758aeb8aabdf3e217c9e3d2c531d6d30f0d03fd3ef789c")
version("23.03", sha256="26e79f8c7f0480b5f795d87ec2663e553b357894fc4b591c8e70d64bfbcb72a4")
version("23.02", sha256="596189e5cebb06e8e58a587423566db536f4ac3691d01db8d11f5503f1e7e35e")
version("23.01", sha256="ed0536ae5a75df4b93c275c6839a210fba61c9524a50ec6217c44d5ac83776b3")
version("22.12", sha256="bdd0a9ec909a5ac00f288bb4ab5193136b460e39e570ecb37d3d5d742b7e67bf")
version("22.11", sha256="03c580bcd0cf7b99a81b9ef09d14172c96672c7fb028a0ed6728b3cc9ec207e7")
version("22.10", sha256="736747184eaae65fb1bbeb6867b890c90b233132bc185d85ea605525637e7c53")
version("22.09", sha256="0dc7076bad1c46045abd812729fa650bc4f3d17fdfded6666cbaf06da70f09d7")
version("22.08", sha256="95930c4f4fc239dfe4ed36b1023dd3768637ad37ff746bb33cf05231ca08ee7a")
version("22.07", sha256="7d91305f8b54b36acf2359daec5a94e154e2a8d7cbae97429ed8e93f7c5ea661")
version("22.06", sha256="faa6550d7dd48fc64d4b4d67f583d34209c020bf4f026d10feb7c9b224caa208")
version("22.05", sha256="be97d695a425cfb0ecd3689a0b9706575b7b48ce1c73c39d4ea3abd616b15ad7")
version("22.04", sha256="ff6e3a379fafb76e311b2f48089da6b1ab328c5b52eccd347c41cce59d0441ed")
version("22.03", sha256="808a9f43514ee40fa4fa9ab5bf0ed11219ab6f9320eb414bb4f043fab112f7a0")
version("22.02", sha256="3179c54481c5dabde77a4e9a670bb97b599cecc617ad30f32ab3177559f67ffe")
version("22.01", sha256="73a65c1465eca80f0db2dab4347c22ddf68ad196e3bd0ccc0861d782f13b7388")
# 22.01+ requires C++17 or newer
version("21.12", sha256="3dd96d36db531f518cfec631bec243029fe63e1084b8cf7e8e75be50ebbdc794")
version("21.11", sha256="03727c21ee350fdc63057d4eebbff142928d74481f2234a8c3821cf338bfa4a0")
version("21.10", sha256="c35a98b1bd349cb944296c02d6e0602b6b7e33d1008207dd0d041a75cfb971e9")
version("21.09", sha256="0b20c5d7f13448f01115f68f131a3721e037ad9fab06aa3c24530bc48859c9eb")
version("21.08", sha256="5e61e4ec5a8605aa4fb89d49feba4a42d7d3f627745d4c85faab3657baf56011")
version("21.07", sha256="fe566f3de8d5b17a720e084d244c6617c87523b7d80756cbb5850df6e8100f5f")
version("21.06", sha256="246fb2c2bdb1dad347550c48e375326bc7bdeec0496c113c1057d2721a9ffd14")
version("21.05", sha256="16a206e898b22ace07c8dc9ea70af7f6f6f91a7a2e42c392fd15eb223faa1597")
version("21.04", sha256="13b13aebb25f43b7239743312dc9bb96bb365b72a99eb3c64492ae38f5141cff")
# 20.01+ requires C++14 or newer
for v in ["25.03", "25.02", "develop"]:
depends_on(
f"amrex@{v} build_system=cmake +linear_solvers +pic +particles +shared +tiny_profile",
when=f"@{v}",
type=("build", "link"),
)
depends_on("py-amrex@{0}".format(v), when="@{0} +python".format(v), type=("build", "run"))
variant("app", default=True, description="Build the WarpX executable application")
variant("ascent", default=False, description="Enable Ascent in situ visualization")
variant(
"catalyst",
default=False,
description="Enable Catalyst2 in situ visualization",
when="@24.09:",
)
variant("catalyst", default=False, description="Enable Catalyst2 in situ visualization")
variant("sensei", default=False, description="Enable SENSEI in situ visualization")
variant(
"compute",
@@ -78,24 +49,14 @@ class Warpx(CMakePackage, PythonExtension):
multi=False,
description="On-node, accelerated computing backend",
)
variant(
"dims",
default="3",
values=("1", "2", "3", "rz"),
multi=False,
description="Number of spatial dimensions",
when="@:23.05",
)
variant(
"dims",
default="1,2,rz,3",
values=("1", "2", "3", "rz"),
multi=True,
description="Number of spatial dimensions",
when="@23.06:",
)
variant("eb", default=True, description="Embedded boundary support", when="@24.10:")
variant("eb", default=False, description="Embedded boundary support", when="@:24.09")
variant("eb", default=True, description="Embedded boundary support")
# Spack defaults to False but pybind11 defaults to True (and IPO is highly
# encouraged to be used)
variant(
@@ -128,19 +89,8 @@ class Warpx(CMakePackage, PythonExtension):
depends_on("c", type="build")
depends_on("cxx", type="build")
for v in ["25.03", "25.02", "24.10", "24.08", "develop"]:
depends_on(
f"amrex@{v} build_system=cmake +linear_solvers +pic +particles +shared +tiny_profile",
when=f"@{v}",
type=("build", "link"),
)
depends_on("py-amrex@{0}".format(v), when="@{0} +python".format(v), type=("build", "run"))
depends_on("boost@1.66.0: +math", when="+qedtablegen")
depends_on("cmake@3.15.0:", type="build")
depends_on("cmake@3.18.0:", type="build", when="@22.01:")
depends_on("cmake@3.20.0:", type="build", when="@22.08:")
depends_on("cmake@3.24.0:", type="build", when="@24.09:")
depends_on("cmake@3.24.0:", type="build")
with when("+ascent"):
depends_on("ascent", when="+ascent")
depends_on("ascent +cuda", when="+ascent compute=cuda")
@@ -161,7 +111,7 @@ class Warpx(CMakePackage, PythonExtension):
with when("+eb"):
depends_on("amrex +eb")
with when("+fft"):
depends_on("amrex +fft", when="@24.11:")
depends_on("amrex +fft")
depends_on("mpi", when="+mpi")
with when("+mpi"):
depends_on("amrex +mpi")
@@ -173,12 +123,11 @@ class Warpx(CMakePackage, PythonExtension):
depends_on("amrex precision=single")
with when("precision=double"):
depends_on("amrex precision=double")
depends_on("py-pybind11@2.12.0:", when="@24.04: +python", type=("build", "link"))
depends_on("sensei@4.0.0:", when="@22.07: +sensei")
depends_on("py-pybind11@2.12.0:", when="+python", type=("build", "link"))
depends_on("sensei@4.0.0:", when="+sensei")
with when("compute=cuda"):
depends_on("amrex +cuda")
depends_on("cuda@9.2.88:")
depends_on("cuda@11.0:", when="@22.01:")
depends_on("cuda@11.7:")
with when("compute=hip"):
depends_on("amrex +rocm")
depends_on("rocfft", when="+fft")
@@ -198,15 +147,14 @@ class Warpx(CMakePackage, PythonExtension):
depends_on("fftw ~mpi", when="~mpi")
depends_on("fftw +mpi", when="+mpi")
depends_on("pkgconfig", type="build")
with when("compute=sycl"):
depends_on("amrex +sycl")
with when("+fft dims=rz"):
depends_on("lapackpp")
depends_on("blaspp")
depends_on("blaspp +cuda", when="compute=cuda")
with when("+openpmd"):
depends_on("openpmd-api@0.13.1:")
depends_on("openpmd-api@0.14.2:", when="@21.09:")
depends_on("openpmd-api@0.15.1:", when="@23.05:")
depends_on("openpmd-api@0.16.1:", when="@25.02:")
depends_on("openpmd-api@0.16.1:")
depends_on("openpmd-api ~mpi", when="~mpi")
depends_on("openpmd-api +mpi", when="+mpi")
@@ -218,57 +166,19 @@ class Warpx(CMakePackage, PythonExtension):
depends_on("py-numpy@1.15.0:", type=("build", "run"))
depends_on("py-mpi4py@2.1.0:", type=("build", "run"), when="+mpi")
depends_on("py-periodictable@1.5:1", type=("build", "run"))
depends_on("py-picmistandard@0.28.0", type=("build", "run"), when="@23.11:24.07")
depends_on("py-picmistandard@0.29.0", type=("build", "run"), when="@24.08")
depends_on("py-picmistandard@0.30.0", type=("build", "run"), when="@24.09:24.12")
depends_on("py-picmistandard@0.33.0", type=("build", "run"), when="@25.01:")
depends_on("py-pip@23:", type="build")
depends_on("py-setuptools@42:", type="build")
depends_on("py-pybind11@2.12.0:", type=("build", "link"))
depends_on("py-wheel@0.40:", type="build")
conflicts("+python", when="@:24.04", msg="Python bindings only supported in 24.04+")
conflicts("dims=1", when="@:21.12", msg="WarpX 1D support starts in 22.01+")
conflicts("~qed +qedtablegen", msg="WarpX PICSAR QED table generation needs +qed")
# https://github.com/BLAST-WarpX/warpx/issues/5774
conflicts(
"compute=sycl",
"compute=sycl dims=rz",
when="+fft",
msg="WarpX spectral solvers are not yet tested with SYCL " '(use "warpx ~fft")',
)
conflicts("+sensei", when="@:22.06", msg="WarpX supports SENSEI 4.0+ with 22.07 and newer")
# The symbolic aliases for our +lib target were missing in the install
# location
# https://github.com/BLAST-WarpX/warpx/pull/2626
patch(
"https://github.com/BLAST-WarpX/warpx/pull/2626.patch?full_index=1",
sha256="a431d4664049d6dcb6454166d6a948d8069322a111816ca5ce01553800607544",
when="@21.12",
)
# Workaround for AMReX<=22.06 no-MPI Gather
# https://github.com/BLAST-WarpX/warpx/pull/3134
# https://github.com/AMReX-Codes/amrex/pull/2793
patch(
"https://github.com/BLAST-WarpX/warpx/pull/3134.patch?full_index=1",
sha256="b786ce64a3c2c2b96ff2e635f0ee48532e4ae7ad9637dbf03f11c0768c290690",
when="@22.02:22.05",
)
# Forgot to install ABLASTR library
# https://github.com/BLAST-WarpX/warpx/pull/3141
patch(
"https://github.com/BLAST-WarpX/warpx/pull/3141.patch?full_index=1",
sha256="dab6fb44556ee1fd466a4cb0e20f89bde1ce445c9a51a2c0f59d1740863b5e7d",
when="@22.04,22.05",
)
# Fix failing 1D CUDA build
# https://github.com/BLAST-WarpX/warpx/pull/3162
patch(
"https://github.com/BLAST-WarpX/warpx/pull/3162.patch?full_index=1",
sha256="0ae573d1390ed8063f84e3402d30d34e522e65dc5dfeea3d07e165127ab373e9",
when="@22.06",
msg="WarpX spectral solvers are not yet running on SYCL GPUs for RZ (GH#5774)",
)
def cmake_args(self):
@@ -295,11 +205,8 @@ def cmake_args(self):
self.define_from_variant("WarpX_QED_TABLE_GEN", "qedtablegen"),
]
if spec.satisfies("@24.08:"):
args.append("-DWarpX_amrex_internal=OFF")
args.append(self.define_from_variant("WarpX_FFT", "fft"))
else:
args.append(self.define_from_variant("WarpX_PSATD", "fft"))
args.append("-DWarpX_amrex_internal=OFF")
args.append(self.define_from_variant("WarpX_FFT", "fft"))
# FindMPI needs an extra hint sometimes, particularly on cray systems
if "+mpi" in spec:
@@ -310,10 +217,18 @@ def cmake_args(self):
             args.append("-DWarpX_openpmd_internal=OFF")
         if "+python" in spec:
-            if spec.satisfies("@24.08:"):
-                args.append("-DWarpX_pyamrex_internal=OFF")
-                args.append("-DWarpX_pybind11_internal=OFF")
-                args.append(self.define_from_variant("WarpX_PYTHON_IPO", "python_ipo"))
+            pip_args = PythonPipBuilder.std_args(self) + [f"--prefix={self.prefix}"]
+            idx = pip_args.index("install")
+            # Docs: https://warpx.readthedocs.io/en/latest/install/cmake.html#build-options
+            args.extend(
+                [
+                    "-DWarpX_pyamrex_internal=OFF",
+                    "-DWarpX_pybind11_internal=OFF",
+                    self.define_from_variant("WarpX_PYTHON_IPO", "python_ipo"),
+                    "-DPY_PIP_OPTIONS=" + ";".join(pip_args[:idx]),
+                    "-DPY_PIP_INSTALL_OPTIONS=" + ";".join(pip_args[idx + 1 :]),
+                ]
+            )
         # Work-around for SENSEI 4.0: wrong install location for CMake config
         # https://github.com/SENSEI-insitu/SENSEI/issues/79
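The `PY_PIP_OPTIONS` / `PY_PIP_INSTALL_OPTIONS` defines split pip's argument vector around the `install` verb, joining each half with `;` so CMake receives them as lists. With a hypothetical `std_args` result standing in for `PythonPipBuilder.std_args`:

```python
# Hypothetical pip argument vector; the real one comes from PythonPipBuilder.std_args.
pip_args = ["--disable-pip-version-check", "install", "--no-deps", "--prefix=/opt/warpx"]
idx = pip_args.index("install")
pip_options = ";".join(pip_args[:idx])              # everything before the verb -> PY_PIP_OPTIONS
pip_install_options = ";".join(pip_args[idx + 1:])  # everything after it -> PY_PIP_INSTALL_OPTIONS
```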
@@ -321,32 +236,13 @@ def cmake_args(self):
             args.append(self.define("SENSEI_DIR", spec["sensei"].prefix.lib.cmake))
         # WarpX uses CCache by default, interfering with Spack wrappers
-        ccache_var = "CCACHE_PROGRAM" if spec.satisfies("@:24.01") else "WarpX_CCACHE"
-        args.append(self.define(ccache_var, False))
+        args.append(self.define("WarpX_CCACHE", False))
         return args

-    phases = ("cmake", "build", "install", "pip_install_nodeps")
-    build_targets = ["all"]
-
-    with when("+python"):
-        build_targets += ["pip_wheel"]
-
-    def pip_install_nodeps(self, spec, prefix):
-        """Install everything from build directory."""
-        pip = spec["python"].command
-        pip.add_default_arg("-m", "pip")
-
-        args = PythonPipBuilder.std_args(self) + [
-            f"--prefix={prefix}",
-            "--find-links=warpx-whl",
-            "pywarpx",
-        ]
-
-        with working_dir(self.build_directory):
-            pip(*args)
-
-        # todo: from PythonPipBuilder
-        # ....execute_install_time_tests()
+    def edit(self, spec, prefix):
+        with when("+python"):
+            self.build_targets.extend(["pip_wheel", "pip_install_nodeps"])

     @property
     def libs(self):
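The `pip_install_nodeps` phase reduces to one pip call against the wheel directory staged by the `pip_wheel` target. Roughly (install prefix and flags besides `--no-deps` are illustrative, not the exact `PythonPipBuilder.std_args` output):

```python
# Stand-in for PythonPipBuilder.std_args(...); real flags may differ.
std_args = ["install", "--no-deps"]
prefix = "/opt/spack/warpx"  # hypothetical install prefix
cmd = ["python", "-m", "pip"] + std_args + [
    f"--prefix={prefix}",
    "--find-links=warpx-whl",  # wheels produced by the pip_wheel build target
    "pywarpx",
]
cmd_line = " ".join(cmd)
```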
@@ -374,26 +270,20 @@ def _get_input_options(self, dim, post_install):
             install_test_root(self) if post_install else self.stage.source_path,
             self.examples_src_dir,
         )
-        if spec.satisfies("@:24.09"):
-            inputs_nD = {"1": "inputs_1d", "2": "inputs_2d", "3": "inputs_3d", "rz": "inputs_rz"}
-            if spec.satisfies("@:21.12"):
-                inputs_nD["rz"] = "inputs_2d_rz"
-        else:
-            inputs_nD = {
-                "1": "inputs_test_1d_laser_acceleration",
-                "2": "inputs_base_2d",
-                "3": "inputs_base_3d",
-                "rz": "inputs_base_rz",
-            }
+        inputs_nD = {
+            "1": "inputs_base_1d",
+            "2": "inputs_base_2d",
+            "3": "inputs_base_3d",
+            "rz": "inputs_base_rz",
+        }
         inputs = join_path(examples_dir, inputs_nD[dim])

         cli_args = [inputs, "max_step=50", "diag1.intervals=10"]
         # test openPMD output if compiled in
         if "+openpmd" in spec:
             cli_args.append("diag1.format=openpmd")
-            # RZ: New openPMD thetaMode output
-            if dim == "rz" and spec.satisfies("@22.04:"):
-                cli_args.append("diag1.fields_to_plot=Er Et Ez Br Bt Bz jr jt jz rho")
+            if dim == "rz":
+                # RZ: thetaMode output uses different variables
+                cli_args.append("diag1.fields_to_plot=Er Et Ez Br Bt Bz jr jt jz rho")
         return cli_args

     def check(self):
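`_get_input_options` assembles the smoke-test command line from the per-geometry input file plus diagnostic overrides. A self-contained sketch of that logic (directory name hypothetical, version conditionals dropped):

```python
def get_input_options(dim, inputs_dir, openpmd):
    """Rebuild the WarpX smoke-test CLI for one geometry (simplified sketch)."""
    inputs_nD = {
        "1": "inputs_base_1d",
        "2": "inputs_base_2d",
        "3": "inputs_base_3d",
        "rz": "inputs_base_rz",
    }
    cli_args = [f"{inputs_dir}/{inputs_nD[dim]}", "max_step=50", "diag1.intervals=10"]
    if openpmd:
        cli_args.append("diag1.format=openpmd")
        if dim == "rz":
            # RZ thetaMode output uses cylindrical field components
            cli_args.append("diag1.fields_to_plot=Er Et Ez Br Bt Bz jr jt jz rho")
    return cli_args
```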
@@ -429,8 +319,6 @@ def run_warpx(self, dim):
         if dim not in self.spec.variants["dims"].value:
             raise SkipTest(f"Package must be installed with {dim} in dims")
         dim_arg = f"{dim}d" if dim.isdigit() else dim
-        if self.spec.satisfies("@:23.05") and not dim.isdigit():
-            dim_arg = dim_arg.upper()
         exe = find(self.prefix.bin, f"warpx.{dim_arg}.*", recursive=False)[0]
         cli_args = self._get_input_options(dim, True)
         warpx = which(exe)
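The executable lookup derives its glob from `dims`: numeric values get a `d` suffix (`3` becomes `warpx.3d.*`) while `rz` passes through, and the removed `@:23.05` branch upper-cased non-numeric geometries. A sketch of both behaviors:

```python
def exe_pattern(dim, legacy=False):
    """Build the warpx binary glob; legacy mirrors the removed @:23.05 rule."""
    dim_arg = f"{dim}d" if dim.isdigit() else dim
    if legacy and not dim.isdigit():
        dim_arg = dim_arg.upper()  # old releases shipped warpx.RZ.* binaries
    return f"warpx.{dim_arg}.*"
```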


@@ -16,6 +16,9 @@ class WasiSdkPrebuilt(Package):
     license("APACHE-2.0", checked_by="teaguesterling")

+    version("25.0", sha256="52640dde13599bf127a95499e61d6d640256119456d1af8897ab6725bcf3d89c")
+    version("24.0", sha256="c6c38aab56e5de88adf6c1ebc9c3ae8da72f88ec2b656fb024eda8d4167a0bc5")
+    version("23.0", sha256="521838d92816c92a731dee9246b0364eb00e300c5e2336e6dfa38f26a6494b06")
     version("22.0", sha256="fa46b8f1b5170b0fecc0daf467c39f44a6d326b80ced383ec4586a50bc38d7b8")
     version("21.0", sha256="f2fe0723b337c484556b19d64c0f6c6044827014bfcd403d00951c65a86cfa26")
     version("20.0", sha256="7030139d495a19fbeccb9449150c2b1531e15d8fb74419872a719a7580aad0f9")


@@ -17,6 +17,11 @@ class Xsimd(CMakePackage):
     license("BSD-3-Clause")

     version("develop", branch="master")
+    version("13.1.0", sha256="88c9dc6da677feadb40fe09f467659ba0a98e9987f7491d51919ee13d897efa4")
+    version("12.1.1", sha256="73f94a051278ef3da4533b691d31244d12074d5d71107473a9fd8d7be15f0110")
+    version("11.2.0", sha256="509bbfe12e78ee1a0e81711019e7c7a372dabcff566dbf15b95cc94339443242")
+    version("10.0.0", sha256="73f818368b3a4dad92fab1b2933d93694241bd2365a6181747b2df1768f6afdd")
+    version("9.0.1", sha256="b1bb5f92167fd3a4f25749db0be7e61ed37e0a5d943490f3accdcd2cd2918cc0")
     version("8.1.0", sha256="d52551360d37709675237d2a0418e28f70995b5b7cdad7c674626bcfbbf48328")
     version("8.0.5", sha256="0e1b5d973b63009f06a3885931a37452580dbc8d7ca8ad40d4b8c80d2a0f84d7")
     version("8.0.4", sha256="5197529e7ca715ddfcae7c5c4097879c86dae6ef85f3f67c402e2e6c5e803c41")


@@ -17,6 +17,7 @@ class Xtl(CMakePackage):
     license("BSD-3-Clause")

     version("develop", branch="master")
+    version("0.7.7", sha256="44fb99fbf5e56af5c43619fc8c29aa58e5fad18f3ba6e7d9c55c111b62df1fbb")
     version("0.7.4", sha256="3c88be0e696b64150c4de7a70f9f09c00a335186b0b0b409771ef9f56bca7d9a")
     version("0.7.2", sha256="95c221bdc6eaba592878090916383e5b9390a076828552256693d5d97f78357c")
     version("0.6.4", sha256="5db5087c37daab3e1d35337782f79972aaaf19218a0de786a0515f247244e390")


@@ -3,7 +3,6 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

 from spack.package import *
-from spack.pkg.builtin.boost import Boost


 class Yoda(AutotoolsPackage):
@@ -16,6 +15,8 @@ class Yoda(AutotoolsPackage):
     license("GPL-3.0-or-later")

+    version("2.1.0", sha256="eba2efa58d407e5ca60205593339cdab12b7659255020358454b0f6502d115c2")
+    version("2.0.3", sha256="c066b1ae723e7cc76011e134fe9d69b378670ae03aae8a2781b0606692440143")
     version("2.0.2", sha256="31a41413641189814ff3c6bbb96ac5d17d2b68734fe327d06794cdbd3a540399")
     version("2.0.1", sha256="ae5a78eaae5574a5159d4058839d0983c9923558bfc88fbce21d251fd925d260")
     version("2.0.0", sha256="680f43dabebb3167ce1c5dee72d1c2c285c3190751245aa51e3260a005a99575")
@@ -71,25 +72,27 @@ class Yoda(AutotoolsPackage):
     version("1.0.4", sha256="697fe397c69689feecb2a731e19b2ff85e19343b8198c4f18a7064c4f7123950")
     version("1.0.3", sha256="6a1d1d75d9d74da457726ea9463c1b0b6ba38d4b43ef54e1c33f885e70fdae4b")

-    depends_on("c", type="build")  # generated
-    depends_on("cxx", type="build")  # generated
+    depends_on("cxx", type="build")

-    variant("root", default=False, description="Enable ROOT interface")
+    variant("hdf5", default=False, description="Enable HDF5 compatibility", when="@2.1:")
+    variant("highfive", default=False, description="Enable HighFive compatibility", when="@2.1:")
+    variant("root", default=False, description="Enable ROOT interface", when="@:2.0")

     depends_on("python", type=("build", "link", "run"))
     depends_on("py-future", type=("build", "run"))
     depends_on("zlib-api")
-    # TODO: replace this with an explicit list of components of Boost,
-    # for instance depends_on('boost +filesystem')
-    # See https://github.com/spack/spack/pull/22303 for reference
-    depends_on(Boost.with_default_variants, when="@:1.6.0", type=("build", "run"))
+    depends_on("boost", when="@:1.6.0", type=("build", "run"))
     depends_on("boost@1.48: ", when="@:1.5", type=("build", "run"))
+    depends_on("yaml-cpp", type=("build", "link", "run"), when="@2.1:")
     depends_on("py-cython@0.18:", type="build", when="@:1.4.0")
     depends_on("py-cython@0.20:", type="build", when="@1.4.0:1.6.5")
     depends_on("py-cython@0.23.5:", type="build", when="@1.6.5:1.8.0")
     depends_on("py-cython@0.24:", type="build", when="@1.8.0:")
     depends_on("py-matplotlib", when="@1.3.0:", type=("build", "run"))
+    depends_on("hdf5", type=("build", "link", "run"), when="+hdf5")
+    depends_on("highfive", type=("build", "link", "run"), when="+highfive")
+    depends_on("root", type=("build", "link", "run"), when="@2.1:")
     depends_on("root", type=("build", "link", "run"), when="+root")

     extends("python")
@@ -114,8 +117,14 @@ class Yoda(AutotoolsPackage):
     def configure_args(self):
         args = []
         if self.spec.satisfies("@:1.6.0"):
-            args.append("--with-boost=" + self.spec["boost"].prefix)
+            args.append(f"--with-boost={self.spec['boost'].prefix}")
+        if self.spec.satisfies("@2.1:"):
+            args.append(f"--with-yaml-cpp={self.spec['yaml-cpp'].prefix}")
+            args.extend(self.with_or_without("h5", variant="hdf5"))
+            args.extend(
+                self.with_or_without("highfive", variant="highfive", activation_value="prefix")
+            )
         args.extend(self.enable_or_disable("root"))
         return args
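`with_or_without` and `enable_or_disable` turn variants into `--with-X/--without-X` and `--enable-X/--disable-X` configure flags, and `activation_value="prefix"` appends the dependency's install prefix to the `--with-` form. A simplified stand-in (assumption: one boolean variant per call, not Spack's full multi-valued API):

```python
def with_or_without(name, active, activation_value=None):
    """Mimic AutotoolsPackage.with_or_without for one boolean variant (simplified)."""
    if active:
        suffix = f"={activation_value}" if activation_value else ""
        return [f"--with-{name}{suffix}"]
    return [f"--without-{name}"]

def enable_or_disable(name, active):
    """Same idea for --enable/--disable style flags."""
    return [f"--enable-{name}" if active else f"--disable-{name}"]
```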