Compare commits

...

60 Commits

Author SHA1 Message Date
Gregory Becker
54e5439dd6 Spec.format: conditional format strings 2023-08-22 11:22:36 -07:00
Wouter Deconinck
9f1a30d3b5 veccore: new variant vc (#39542)
* veccore: new variants umesimd and vc

* veccore: remove variant umesimd again
2023-08-22 11:20:19 -04:00
Harmen Stoppels
1340995249 clingo-bootstrap: pgo, lto, allocator optimizations (#34926)
Add support for PGO and LTO for gcc, clang and apple-clang, and add a
patch to allow mimalloc as an allocator in operator new/delete, which
reduces clingo runtime by about 30%.
2023-08-22 14:44:07 +02:00
dependabot[bot]
afebc11742 Bump sphinx from 6.2.1 to 7.2.2 in /lib/spack/docs (#39502)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.2.1 to 7.2.2.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v6.2.1...v7.2.2)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-22 12:02:58 +02:00
Jim Edwards
34e9fc612c parallelio: add v2.6.1 (#39479) 2023-08-22 09:44:06 +02:00
dependabot[bot]
1d8ff7f742 Bump sphinx-rtd-theme from 1.2.2 to 1.3.0 in /lib/spack/docs (#39562)
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 1.2.2 to 1.3.0.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/1.2.2...1.3.0)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-22 09:32:39 +02:00
Axel Huebl
0e27f05611 openPMD-api: 0.15.2 (#39528)
Latest patch release.
2023-08-21 18:31:53 -07:00
Axel Huebl
19aaa97ff2 C-Blosc2: v2.10.2 (#39537)
This fixes regressions with external dependencies introduced
in 2.10.1
2023-08-21 18:29:58 -07:00
Kamil Iskra
990309355f [bats] Update to v1.10.0 (#39561)
* [bats] Update to v1.10.0
  Original bats has been abandoned for years; nowadays it's
  community-maintained.
* [bats] Fix style
2023-08-21 18:25:35 -07:00
eugeneswalker
2cb66e6e44 tasmanian@:7.9 +cuda conflicts with cuda@12 (#39541) 2023-08-21 13:08:12 -07:00
David Gardner
cfaade098a add sundials v6.6.0 (#39526) 2023-08-21 10:48:58 -07:00
Wouter Deconinck
ed65532e27 singularityce: fix after no spaces in Executable allowed (#39553) 2023-08-21 13:43:11 -04:00
Dr Marco Claudio De La Pierre
696d4a1b85 amdblis recipe: adding variant for a performance flag (#39549)
* amdblis recipe: adding variant for a performance flag

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

* typo fix

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

* style fix

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

* another typo fix

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

* one more style fix

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

---------

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>
2023-08-21 10:16:39 -07:00
Taillefumier Mathieu
8def75b414 Update cp2k recipe (#39128)
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2023-08-21 17:09:28 +02:00
Harmen Stoppels
5389db821d aws-sdk-cpp: new versions and greatly reduce install size (#39511) 2023-08-21 14:52:21 +02:00
Wouter Deconinck
0d5ae3a809 cepgen: new version 1.1.0 (#39550) 2023-08-21 08:08:05 +02:00
Wouter Deconinck
b61ad8d2a8 whizard: new versions 3.1.1 and 3.1.2 (#39540) 2023-08-21 08:07:36 +02:00
Wouter Deconinck
b35db020eb zlib: new version 1.3 (#39539) 2023-08-21 08:06:41 +02:00
Wouter Deconinck
ca1d15101e zlib: add git url (#39533) 2023-08-21 08:04:28 +02:00
Brian Spilner
c9ec5fb9ac cdo: add v2.2.2 (#39506)
* update cdo-2.2.2

* add note on hdf5
2023-08-18 15:46:50 -07:00
Todd Gamblin
71abb8c7f0 less: update version, dependencies (#39521) 2023-08-18 15:43:54 -07:00
John W. Parent
4dafae8d17 spack.bat: Fixup CL arg parsing (#39359)
Previous changes to this file stopped directly processing CL args to
stop batch `for` from interpolating batch reserved characters needed in
arguments like URLS. In doing so, we relied on `for` for an easy
"split" operation, however this incorrectly splits paths with spaces in
certain cases. Processing everything ourselves with manual looping via
`goto` statements allows for full control over CL parsing and handling
of both paths with spaces and reserved characters.
2023-08-18 14:29:47 -07:00
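The pitfall described above is not batch-specific. A hypothetical Python analogue (illustrative names only, not Spack's actual parser) shows why re-tokenizing an already-split command line breaks paths with spaces, while walking arguments one at a time preserves them:

```python
# An already-parsed command line: a subcommand plus a path containing a space.
args = ["install", r"C:\Program Files\pkg"]

# Re-splitting the joined string on whitespace -- the equivalent of letting
# batch "for" tokenize the raw input -- cuts the path in two.
assert " ".join(args).split() == ["install", r"C:\Program", r"Files\pkg"]

# Walking the arguments one at a time keeps each one intact.
flags, subcommand, rest = [], None, []
for t in args:
    if t.startswith("-") and subcommand is None:
        flags.append(t)        # global flags come before the subcommand
    elif subcommand is None:
        subcommand = t         # first non-flag token is the subcommand
    else:
        rest.append(t)         # everything after belongs to the subcommand
```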
Jen Herting
b2b00df5cc [py-gradio-client] New package (#39496) 2023-08-18 12:03:37 -05:00
Jen Herting
114e5d4767 [py-openmim] Beginning work (#39494) 2023-08-18 12:00:48 -05:00
Jen Herting
fd70e7fb31 [py-confection] new package (#39491) 2023-08-18 11:59:47 -05:00
George Young
77760c8ea4 py-multiqc: add 1.15, correct py-matplotlib dependency (#39509)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-08-18 11:56:20 -05:00
Jen Herting
737a6dcc73 py-markdown-it-py: add linkify variant and v2.2.0 (#39492)
* [py-markdown-it-py] added linkify variant

* [py-markdown-it-py] added version 2.2.0
2023-08-18 11:14:15 -05:00
Seth R. Johnson
3826fe3765 fortrilinos: release 2.3.0 compatible with trilinos@14 (#39500)
* fortrilinos: release 2.3.0 compatible with trilinos@14.0
2023-08-18 12:10:00 -04:00
Benjamin Meyers
edb11941b2 New: py-alpaca-farm, py-alpaca-eval, py-tiktoken; Updated: py-accerlate, py-transformers (#39432) 2023-08-18 10:29:12 -05:00
Axel Huebl
1bd58a8026 WarpX 23.08 (#39407)
* WarpX 23.08

Update WarpX and related Python packages to the latest releases.

* fix style

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-08-18 10:03:47 -05:00
Jordan Galby
f8e0c8caed Fix Spack freeze on install child process unexpected exit (#39015)
* Fix spack frozen on child process defunct

* Rename parent/child pipe to read/write to emphasize non-duplex mode
2023-08-18 09:41:02 +02:00
Vijay M
d0412c1578 Updating the versions of available tarballs and adding an eigen3 variant as well (#39498) 2023-08-17 22:23:32 -04:00
Harmen Stoppels
ec500adb50 zlib-api: use zlib-ng +compat by default (#39358)
In the HPC package manager, we want the fastest `zlib` implementation by default.  `zlib-ng` is up to 4x faster than stock `zlib`, and it can do things like take advantage of AVX-512 instructions.  This PR makes `zlib-ng` the default `zlib-api` provider (`zlib-api` was introduced earlier, in #37372).

As far as I can see, the only issues you can encounter are:

1. Build issues with packages that heavily rely on `zlib` internals. In Gitlab CI only one out of hundreds of packages had that issue (it extended zlib with deflate stuff, and used its own copy of zlib sources).
2. Packages that like to detect `zlib-ng` separately and rely on `zlib-ng` internals. The only issue I've found with this among the hundreds of packages built in CI is `perl` trying to report more specific zlib-ng version details, and relied on some internals that got refactored. But yeah... that warrants a patch / conflict and is nothing special.

At runtime, you cannot really have any issues, given that zlib and zlib-ng export the exact same symbols (and zlib-ng tests this in their CI).

You can't really have issues with externals when using zlib-ng either. The only type of issue is when system zlib is rather new, and not marked as external; if another external uses new symbols, and Spack builds an older zlib/zlib-ng, then the external might not find the new symbols. But this is a configuration issue, and it's not an issue caused by zlib-ng, as the same would happen with older Spack zlib.

* zlib-api: use zlib-ng +compat by default
* make a trivial change to zlib-ng to trigger a rebuild
* add `haampie` as maintainer
2023-08-17 14:03:14 -07:00
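Because `zlib` and `zlib-ng +compat` export the same symbols, consumers cannot tell them apart at the API level. A quick sanity check through Python's own `zlib` module (which uses whichever zlib implementation the interpreter was linked against) illustrates the shared wire format:

```python
import zlib

payload = b"spack loves fast compression " * 64

# A round trip must succeed no matter which implementation backs the module.
blob = zlib.compress(payload, 9)
assert zlib.decompress(blob) == payload

# Both implementations emit the same zlib stream header: CMF byte 0x78
# (deflate with a 32 KiB window), so streams are interchangeable on disk too.
assert blob[0] == 0x78
```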
Harmen Stoppels
30f5c74614 py-flake8: bump including deps (#39426) 2023-08-17 21:02:11 +02:00
Massimiliano Culpo
713eb210ac zig: add v0.11.0 (#39484) 2023-08-17 19:18:00 +02:00
Massimiliano Culpo
a022e45866 ASP-based solver: optimize key to intermediate dicts (#39471)
Computing str(spec) is faster than computing hash(spec), and
since all the abstract specs we deal with come from user configuration,
they cannot cover DAG structures that are not captured by str() but
are captured by hash().
2023-08-17 14:11:49 +02:00
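The trade-off can be sketched with a toy stand-in for an abstract spec (hypothetical class and names, not Spack's actual `Spec`): `str()` is a cheap attribute read, while a DAG hash must visit every dependency node, so keying the intermediate dict on the string is cheaper and, for user-written abstract specs, loses nothing.

```python
import itertools

class TinySpec:
    """Toy abstract spec: str() is cheap, dag_hash() walks the whole DAG."""
    def __init__(self, name, deps=()):
        self.name = name
        self.deps = tuple(deps)

    def __str__(self):
        return self.name

    def dag_hash(self):
        # stand-in for the expensive operation: visits every dependency
        return hash((self.name,) + tuple(d.dag_hash() for d in self.deps))

_ids = itertools.count()
_cache = {}

def trigger_id(spec):
    # key the intermediate dict on str(spec) rather than a DAG hash
    key = str(spec)
    if key not in _cache:
        _cache[key] = next(_ids)
    return _cache[key]
```

Two abstract specs that print the same receive the same id, which is exactly the property the optimization relies on.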
eugeneswalker
82685a68d9 boost %oneapi: add cxxflags -Wno-error=enum-constexpr-conversion (#39477) 2023-08-17 07:49:20 -04:00
dependabot[bot]
b19691d503 Bump mypy from 1.5.0 to 1.5.1 in /lib/spack/docs (#39478)
Bumps [mypy](https://github.com/python/mypy) from 1.5.0 to 1.5.1.
- [Commits](https://github.com/python/mypy/compare/v1.5.0...v1.5.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-17 12:35:49 +02:00
eugeneswalker
54ea860b37 gmsh %oneapi: add cflag: -Wno-error=implicit-function-declaration (#39476) 2023-08-17 06:09:07 -04:00
eugeneswalker
fb598baa53 py-ruamel-yaml-clib %oneapi: -Wno-error=incompatible-function-pointer-types (#39480) 2023-08-17 11:59:45 +02:00
eugeneswalker
02763e967a sz %oneapi: add cflag=-Wno-error=implicit-function-declaration (#39467) 2023-08-17 02:36:17 -07:00
Seth R. Johnson
2846be315b kokkos: use 'when' instead of 'conflicts' for CUDA variants (#39463) 2023-08-17 05:12:52 -04:00
eugeneswalker
4818b75814 silo %oneapi: add cflags: -Wno-error=int, -Wno-error=int-conversion (#39475) 2023-08-17 04:42:58 -04:00
eugeneswalker
b613bf3855 py-matplotlib %oneapi: add cxxflags=-Wno-error=register (#39469) 2023-08-17 04:13:09 -04:00
Adam J. Stewart
3347372a7b py-deepspeed: add new package (#39427) 2023-08-17 08:47:40 +02:00
Sergey Kosukhin
c417a77a19 snappy: patch and conflict for %nvhpc (#39063) 2023-08-17 08:44:38 +02:00
Martin Aumüller
90d0d0176c botan: version 3 requires newer GCC (#39450)
e.g. 3.1.1 produces this during configuration when trying to install:
  ERROR: This version of Botan requires at least gcc 11.0
2023-08-17 08:34:43 +02:00
Adam J. Stewart
72b9f89504 py-setuptools: document Python 3.12 support (#39449) 2023-08-17 08:31:53 +02:00
Peter Scheibel
a89f1b1bf4 Add debugging statements to file search (#39121)
Co-authored-by: Scheibel <scheibel1@ml-9983616.the-lab.llnl.gov>
2023-08-17 08:31:06 +02:00
Adam J. Stewart
c6e26251a1 py-lightning: add v2.0.7 (#39468) 2023-08-17 08:19:50 +02:00
Harmen Stoppels
190a1bf523 Delay abstract hashes lookup (#39251)
Delay lookup for abstract hashes until concretization time, instead of
until Spec comparison. This has a few advantages:

1. `satisfies` / `intersects` etc don't always know where to resolve the
   abstract hash (in some cases it's wrong to look in the current env,
   db, buildcache, ...). Better to let the call site dictate it.
2. Allows search by abstract hash without triggering a database lookup,
   causing quadratic complexity issues (accidental nested loop during
   search)
3. Simplifies queries against the buildcache, they can now use Spec
   instances instead of strings.

The rules are straightforward:

1. a satisfies b when b's hash is prefix of a's hash
2. a intersects b when either a's or b's hash is a prefix of b's or a's
   hash respectively
2023-08-17 08:08:50 +02:00
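The two rules above can be written down directly. This is a sketch over bare hash strings, not Spack's `Spec` API:

```python
def hash_satisfies(a_hash: str, b_hash: str) -> bool:
    # a satisfies b when b's hash is a prefix of a's hash
    return a_hash.startswith(b_hash)

def hash_intersects(a_hash: str, b_hash: str) -> bool:
    # a intersects b when either hash is a prefix of the other
    return a_hash.startswith(b_hash) or b_hash.startswith(a_hash)
```

So a full hash satisfies any abbreviation of itself, while two abbreviations intersect only when one extends the other.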
eugeneswalker
e381e166ec py-numcodecs %oneapi: add cflags -Wno-error=implicit-function-declaration (#39454) 2023-08-16 15:52:01 -05:00
eugeneswalker
2f145b2684 pruners-ninja cflags: -Wno-error=implicit-function-declaration (#39452) 2023-08-16 15:51:38 -05:00
eugeneswalker
4c7748e954 amrex+sycl: restructure constraint on %oneapi (#39451) 2023-08-16 11:49:32 -06:00
eugeneswalker
86485dea14 hdf5-vol-cache %oneapi: cflags: add -Wno-error=incompatible-function-pointer-types (#39453) 2023-08-16 09:56:09 -06:00
Todd Kordenbrock
00f8f5898a faodel: update github URL organization to sandialabs (#39446) 2023-08-16 07:25:47 -07:00
Massimiliano Culpo
f41d7a89f3 Extract Package from PackageNode for error messages 2023-08-16 06:20:57 -07:00
Harmen Stoppels
4f07205c63 Avoid sort on singleton list during edge insertion (#39458)
The median length of this list is 1. For reasons I don't know, `.sort()`
still likes to call the key function.

This saves ~9% of total database read time, and the number of calls
goes from 5305 -> 1715.
2023-08-16 14:33:03 +02:00
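CPython's `list.sort` computes the key for every element up front, even when there is only one, so a simple length guard is enough to skip the work entirely. A small demonstration:

```python
calls = 0

def weight(edge):
    global calls
    calls += 1          # count how often the key function runs
    return edge

edges = [42]
edges.sort(key=weight)  # sorting a singleton still evaluates the key once
single_sort_calls = calls

calls = 0
if len(edges) > 1:      # the cheap guard: a singleton is already sorted
    edges.sort(key=weight)
guarded_calls = calls
```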
Massimiliano Culpo
08f9c7670e Do not impose provider conditions, if the node is not a provider (#39456)
* Do not impose provider conditions, if the node is not a provider

fixes #39455

When a node can be a provider of a spec, but is not selected as
a provider, we should not be imposing provider conditions on the
virtual.

* Adjust the integrity constraint, by using the correct atom
2023-08-16 05:15:13 -07:00
Harmen Stoppels
b451791336 json: minify by default (#39457) 2023-08-16 11:26:40 +00:00
78 changed files with 1071 additions and 335 deletions


@@ -51,65 +51,43 @@ setlocal enabledelayedexpansion
:: subcommands will never start with '-'
:: everything after the subcommand is an arg
:: we cannot allow batch "for" loop to directly process CL args
:: a number of batch reserved characters are commonly passed to
:: spack and allowing batch's "for" method to process the raw inputs
:: results in a large number of formatting issues
:: instead, treat the entire CLI as one string
:: and split by space manually
:: capture cl args in variable named cl_args
set cl_args=%*
:process_cl_args
rem tokens=1* returns the first processed token produced
rem by tokenizing the input string cl_args on spaces into
rem the named variable %%g
rem While this may look like a for loop, it only
rem executes a single time for each of the cl args
rem the actual iterative loop is performed by the
rem goto process_cl_args stanza
rem we are simply leveraging the "for" method's string
rem tokenization
for /f "tokens=1*" %%g in ("%cl_args%") do (
set t=%%~g
rem remainder of string is composed into %%h
rem these are the cl args yet to be processed
rem assign cl_args var to only the args to be processed
rem effectively discarding the current arg %%g
rem this will be nul when we have no further tokens to process
set cl_args=%%h
rem process the first space delineated cl arg
rem of this iteration
if "!t:~0,1!" == "-" (
if defined _sp_subcommand (
rem We already have a subcommand, processing args now
if not defined _sp_args (
set "_sp_args=!t!"
) else (
set "_sp_args=!_sp_args! !t!"
)
) else (
if not defined _sp_flags (
set "_sp_flags=!t!"
shift
) else (
set "_sp_flags=!_sp_flags! !t!"
shift
)
)
) else if not defined _sp_subcommand (
set "_sp_subcommand=!t!"
shift
) else (
rem Set first cl argument (denoted by %1) to be processed
set t=%1
rem shift moves all cl positional arguments left by one
rem meaning %2 is now %1, this allows us to iterate over each
rem argument
shift
rem assign next "first" cl argument to cl_args, will be null when
rem there are no further arguments to process
set cl_args=%1
if "!t:~0,1!" == "-" (
if defined _sp_subcommand (
rem We already have a subcommand, processing args now
if not defined _sp_args (
set "_sp_args=!t!"
shift
) else (
set "_sp_args=!_sp_args! !t!"
shift
)
) else (
if not defined _sp_flags (
set "_sp_flags=!t!"
) else (
set "_sp_flags=!_sp_flags! !t!"
)
)
) else if not defined _sp_subcommand (
set "_sp_subcommand=!t!"
) else (
if not defined _sp_args (
set "_sp_args=!t!"
) else (
set "_sp_args=!_sp_args! !t!"
)
)
rem if this is not null, we have more tokens to process
rem start above process again with remaining unprocessed cl args
if defined cl_args goto :process_cl_args


@@ -60,7 +60,7 @@ packages:
xxd: [xxd-standalone, vim]
yacc: [bison, byacc]
ziglang: [zig]
-zlib-api: [zlib, zlib-ng+compat]
+zlib-api: [zlib-ng+compat, zlib]
permissions:
read: world
write: user


@@ -1,7 +1,7 @@
-sphinx==6.2.1
+sphinx==7.2.2
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
-sphinx-rtd-theme==1.2.2
+sphinx-rtd-theme==1.3.0
python-levenshtein==0.21.1
docutils==0.18.1
pygments==2.16.1
@@ -10,4 +10,4 @@ pytest==7.4.0
isort==5.12.0
black==23.7.0
flake8==6.1.0
-mypy==1.5.0
+mypy==1.5.1


@@ -1754,9 +1754,14 @@ def find(root, files, recursive=True):
files = [files]
if recursive:
-return _find_recursive(root, files)
+tty.debug(f"Find (recursive): {root} {str(files)}")
+result = _find_recursive(root, files)
else:
-return _find_non_recursive(root, files)
+tty.debug(f"Find (not recursive): {root} {str(files)}")
+result = _find_non_recursive(root, files)
+tty.debug(f"Find complete: {root} {str(files)}")
+return result
@system_path_filter


@@ -2383,22 +2383,12 @@ def __init__(self, all_architectures):
self.possible_specs = specs
-def __call__(self, spec, **kwargs):
+def __call__(self, spec: Spec, **kwargs):
"""
Args:
-spec (str): The spec being searched for in its string representation or hash.
+spec: The spec being searched for
"""
-matches = []
-if spec.startswith("/"):
-# Matching a DAG hash
-query_hash = spec.replace("/", "")
-for candidate_spec in self.possible_specs:
-if candidate_spec.dag_hash().startswith(query_hash):
-matches.append(candidate_spec)
-else:
-# Matching a spec constraint
-matches = [s for s in self.possible_specs if s.satisfies(spec)]
-return matches
+return [s for s in self.possible_specs if s.satisfies(spec)]
class FetchIndexError(Exception):

View File

@@ -1027,7 +1027,7 @@ def get_cmake_prefix_path(pkg):
def _setup_pkg_and_run(
-serialized_pkg, function, kwargs, child_pipe, input_multiprocess_fd, jsfd1, jsfd2
+serialized_pkg, function, kwargs, write_pipe, input_multiprocess_fd, jsfd1, jsfd2
):
context = kwargs.get("context", "build")
@@ -1048,12 +1048,12 @@ def _setup_pkg_and_run(
pkg, dirty=kwargs.get("dirty", False), context=context
)
return_value = function(pkg, kwargs)
-child_pipe.send(return_value)
+write_pipe.send(return_value)
except StopPhase as e:
# Do not create a full ChildError from this, it's not an error
# it's a control statement.
-child_pipe.send(e)
+write_pipe.send(e)
except BaseException:
# catch ANYTHING that goes wrong in the child process
exc_type, exc, tb = sys.exc_info()
@@ -1102,10 +1102,10 @@ def _setup_pkg_and_run(
context,
package_context,
)
-child_pipe.send(ce)
+write_pipe.send(ce)
finally:
-child_pipe.close()
+write_pipe.close()
if input_multiprocess_fd is not None:
input_multiprocess_fd.close()
@@ -1149,7 +1149,7 @@ def child_fun():
For more information on `multiprocessing` child process creation
mechanisms, see https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods
"""
-parent_pipe, child_pipe = multiprocessing.Pipe()
+read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
input_multiprocess_fd = None
jobserver_fd1 = None
jobserver_fd2 = None
@@ -1174,7 +1174,7 @@ def child_fun():
serialized_pkg,
function,
kwargs,
-child_pipe,
+write_pipe,
input_multiprocess_fd,
jobserver_fd1,
jobserver_fd2,
@@ -1183,6 +1183,12 @@ def child_fun():
p.start()
+# We close the writable end of the pipe now to be sure that p is the
+# only process which owns a handle for it. This ensures that when p
+# closes its handle for the writable end, read_pipe.recv() will
+# promptly report the readable end as being ready.
+write_pipe.close()
except InstallError as e:
e.pkg = pkg
raise
@@ -1192,7 +1198,16 @@ def child_fun():
if input_multiprocess_fd is not None:
input_multiprocess_fd.close()
-child_result = parent_pipe.recv()
+def exitcode_msg(p):
+typ = "exit" if p.exitcode >= 0 else "signal"
+return f"{typ} {abs(p.exitcode)}"
+try:
+child_result = read_pipe.recv()
+except EOFError:
+p.join()
+raise InstallError(f"The process has stopped unexpectedly ({exitcode_msg(p)})")
p.join()
# If returns a StopPhase, raise it
@@ -1212,6 +1227,10 @@ def child_fun():
child_result.print_context()
raise child_result
+# Fallback. Usually caught beforehand in EOFError above.
+if p.exitcode != 0:
+raise InstallError(f"The process failed unexpectedly ({exitcode_msg(p)})")
return child_result
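The failure mode this diff handles can be reproduced in isolation: a child that dies before sending anything causes `recv()` on the read end to raise `EOFError` once every writable handle is closed, instead of hanging forever. A minimal sketch (assuming a Unix Python, where the `fork` start method is available):

```python
import multiprocessing
import os

def _crashing_child(write_pipe):
    os._exit(3)  # die abruptly, sending nothing over the pipe

def run_child():
    ctx = multiprocessing.get_context("fork")
    read_pipe, write_pipe = ctx.Pipe(duplex=False)
    p = ctx.Process(target=_crashing_child, args=(write_pipe,))
    p.start()
    # Close the parent's handle so the child holds the only writable end;
    # when it exits, recv() promptly reports EOF instead of hanging.
    write_pipe.close()
    try:
        return read_pipe.recv()
    except EOFError:
        p.join()
        typ = "exit" if p.exitcode >= 0 else "signal"
        return f"The process has stopped unexpectedly ({typ} {abs(p.exitcode)})"
```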


@@ -1994,14 +1994,10 @@ def get_one_by_hash(self, dag_hash):
def all_matching_specs(self, *specs: spack.spec.Spec) -> List[Spec]:
"""Returns all concretized specs in the environment satisfying any of the input specs"""
-# Look up abstract hashes ahead of time, to avoid O(n^2) traversal.
-specs = [s.lookup_hash() for s in specs]
-# Avoid double lookup by directly calling _satisfies.
return [
s
for s in traverse.traverse_nodes(self.concrete_roots(), key=traverse.by_dag_hash)
-if any(s._satisfies(t) for t in specs)
+if any(s.satisfies(t) for t in specs)
]
@spack.repo.autospec


@@ -267,12 +267,14 @@ def _id(thing):
@llnl.util.lang.key_ordering
class AspFunction(AspObject):
__slots__ = ["name", "args"]
def __init__(self, name, args=None):
self.name = name
self.args = () if args is None else tuple(args)
def _cmp_key(self):
-return (self.name, self.args)
+return self.name, self.args
def __call__(self, *args):
"""Return a new instance of this function with added arguments.
@@ -731,7 +733,9 @@ def fact(self, head):
"""
symbol = head.symbol() if hasattr(head, "symbol") else head
-self.out.write("%s.\n" % str(symbol))
+# This is commented out to avoid evaluating str(symbol) when we have no stream
+if not isinstance(self.out, llnl.util.lang.Devnull):
+self.out.write(f"{str(symbol)}.\n")
atom = self.backend.add_atom(symbol)
@@ -1363,26 +1367,29 @@ def condition(self, required_spec, imposed_spec=None, name=None, msg=None, node=
self.gen.fact(fn.condition_reason(condition_id, msg))
cache = self._trigger_cache[named_cond.name]
-if named_cond not in cache:
+named_cond_key = str(named_cond)
+if named_cond_key not in cache:
trigger_id = next(self._trigger_id_counter)
requirements = self.spec_clauses(named_cond, body=True, required_from=name)
-cache[named_cond] = (trigger_id, requirements)
-trigger_id, requirements = cache[named_cond]
+cache[named_cond_key] = (trigger_id, requirements)
+trigger_id, requirements = cache[named_cond_key]
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_trigger(condition_id, trigger_id)))
if not imposed_spec:
return condition_id
cache = self._effect_cache[named_cond.name]
-if imposed_spec not in cache:
+imposed_spec_key = str(imposed_spec)
+if imposed_spec_key not in cache:
effect_id = next(self._effect_id_counter)
requirements = self.spec_clauses(imposed_spec, body=False, required_from=name)
if not node:
requirements = list(
filter(lambda x: x.args[0] not in ("node", "virtual_node"), requirements)
)
-cache[imposed_spec] = (effect_id, requirements)
-effect_id, requirements = cache[imposed_spec]
+cache[imposed_spec_key] = (effect_id, requirements)
+effect_id, requirements = cache[imposed_spec_key]
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_effect(condition_id, effect_id)))
return condition_id
@@ -2953,6 +2960,7 @@ def solve(self, specs, out=None, timers=False, stats=False, tests=False, setup_o
setup_only (bool): if True, stop after setup and don't solve (default False).
"""
# Check upfront that the variants are admissible
+specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self._reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
@@ -2976,6 +2984,7 @@ def solve_in_rounds(self, specs, out=None, timers=False, stats=False, tests=Fals
stats (bool): print internal statistics if set to True
tests (bool): add test dependencies to the solve
"""
+specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self._reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)


@@ -124,6 +124,9 @@ attr(Name, node(min_dupe_id, A1), A2, A3, A4) :- literal(LiteralID, Name, A1, A2
attr("node_flag_source", node(min_dupe_id, A1), A2, node(min_dupe_id, A3)) :- literal(LiteralID, "node_flag_source", A1, A2, A3), solve_literal(LiteralID).
attr("depends_on", node(min_dupe_id, A1), node(min_dupe_id, A2), A3) :- literal(LiteralID, "depends_on", A1, A2, A3), solve_literal(LiteralID).
% Discriminate between "roots" that have been explicitly requested, and roots that are deduced from "virtual roots"
explicitly_requested_root(node(min_dupe_id, A1)) :- literal(LiteralID, "root", A1), solve_literal(LiteralID).
#defined concretize_everything/0.
#defined literal/1.
#defined literal/3.
@@ -144,10 +147,10 @@ error(100, no_value_error, Attribute, Package)
not attr(Attribute, node(ID, Package), _).
% Error when multiple attr need to be selected
error(100, multiple_values_error, Attribute, PackageNode)
:- attr("node", PackageNode),
error(100, multiple_values_error, Attribute, Package)
:- attr("node", node(ID, Package)),
attr_single_value(Attribute),
2 { attr(Attribute, PackageNode, Value) }.
2 { attr(Attribute, node(ID, Package), Value) }.
%-----------------------------------------------------------------------------
% Version semantics
@@ -478,13 +481,13 @@ attr("virtual_on_edge", PackageNode, ProviderNode, Virtual)
% If there's a virtual node, we must select one and only one provider.
% The provider must be selected among the possible providers.
error(100, "Cannot find valid provider for virtual {0}", VirtualNode)
:- attr("virtual_node", VirtualNode),
not provider(_, VirtualNode).
error(100, "Cannot find valid provider for virtual {0}", Virtual)
:- attr("virtual_node", node(X, Virtual)),
not provider(_, node(X, Virtual)).
error(100, "Cannot select a single provider for virtual '{0}'", VirtualNode)
:- attr("virtual_node", VirtualNode),
2 { provider(P, VirtualNode) }.
error(100, "Cannot select a single provider for virtual '{0}'", Virtual)
:- attr("virtual_node", node(X, Virtual)),
2 { provider(P, node(X, Virtual)) }.
% virtual roots imply virtual nodes, and that one provider is a root
attr("virtual_node", VirtualNode) :- attr("virtual_root", VirtualNode).
@@ -519,6 +522,19 @@ virtual_condition_holds(ID, node(ProviderID, Provider), Virtual) :-
condition_holds(ID, node(ProviderID, Provider)),
virtual(Virtual).
% If a "provider" condition holds, but this package is not a provider, do not impose the "provider" condition
do_not_impose(EffectID, node(X, Package))
:- virtual_condition_holds(ID, node(X, Package), Virtual),
pkg_fact(Package, condition_effect(ID, EffectID)),
not provider(node(X, Package), node(_, Virtual)).
% Choose the provider among root specs, if possible
:- provider(ProviderNode, node(min_dupe_id, Virtual)),
virtual_condition_holds(_, PossibleProvider, Virtual),
PossibleProvider != ProviderNode,
explicitly_requested_root(PossibleProvider),
not explicitly_requested_root(ProviderNode).
% A package cannot be the actual provider for a virtual if it does not
% fulfill the conditions to provide that virtual
:- provider(PackageNode, node(VirtualID, Virtual)),
@@ -727,23 +743,23 @@ attr("variant_value", node(ID, Package), Variant, Value) :-
attr("variant_propagate", node(ID, Package), Variant, Value, _),
pkg_fact(Package, variant_possible_value(Variant, Value)).
error(100, "{0} and {1} cannot both propagate variant '{2}' to package {3} with values '{4}' and '{5}'", Source1, Source2, Variant, PackageNode, Value1, Value2) :-
attr("variant_propagate", PackageNode, Variant, Value1, Source1),
attr("variant_propagate", PackageNode, Variant, Value2, Source2),
node_has_variant(PackageNode, Variant),
error(100, "{0} and {1} cannot both propagate variant '{2}' to package {3} with values '{4}' and '{5}'", Source1, Source2, Variant, Package, Value1, Value2) :-
attr("variant_propagate", node(X, Package), Variant, Value1, Source1),
attr("variant_propagate", node(X, Package), Variant, Value2, Source2),
node_has_variant(node(X, Package), Variant),
Value1 < Value2.
% a variant cannot be set if it is not a variant on the package
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, PackageNode)
:- attr("variant_set", PackageNode, Variant),
not node_has_variant(PackageNode, Variant),
build(PackageNode).
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
:- attr("variant_set", node(X, Package), Variant),
not node_has_variant(node(X, Package), Variant),
build(node(X, Package)).
% a variant cannot take on a value if it is not a variant of the package
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, PackageNode)
:- attr("variant_value", PackageNode, Variant, _),
not node_has_variant(PackageNode, Variant),
build(PackageNode).
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
:- attr("variant_value", node(X, Package), Variant, _),
not node_has_variant(node(X, Package), Variant),
build(node(X, Package)).
% if a variant is sticky and not set its value is the default value
attr("variant_value", node(ID, Package), Variant, Value) :-
@@ -770,11 +786,11 @@ error(100, "'{0}' required multiple values for single-valued variant '{1}'", Pac
build(node(ID, Package)),
2 { attr("variant_value", node(ID, Package), Variant, Value) }.
error(100, "No valid value for variant '{1}' of package '{0}'", PackageNode, Variant)
:- attr("node", PackageNode),
node_has_variant(PackageNode, Variant),
build(PackageNode),
not attr("variant_value", PackageNode, Variant, _).
error(100, "No valid value for variant '{1}' of package '{0}'", Package, Variant)
:- attr("node", node(X, Package)),
node_has_variant(node(X, Package), Variant),
build(node(X, Package)),
not attr("variant_value", node(X, Package), Variant, _).
% if a variant is set to anything, it is considered 'set'.
attr("variant_set", PackageNode, Variant) :- attr("variant_set", PackageNode, Variant, _).
@@ -858,11 +874,11 @@ variant_default_value(Package, Variant, Value) :-
% Treat 'none' in a special way - it cannot be combined with other
% values even if the variant is multi-valued
error(100, "{0} variant '{1}' cannot have values '{2}' and 'none'", PackageNode, Variant, Value)
:- attr("variant_value", PackageNode, Variant, Value),
attr("variant_value", PackageNode, Variant, "none"),
error(100, "{0} variant '{1}' cannot have values '{2}' and 'none'", Package, Variant, Value)
:- attr("variant_value", node(X, Package), Variant, Value),
attr("variant_value", node(X, Package), Variant, "none"),
Value != "none",
build(PackageNode).
build(node(X, Package)).
% patches and dev_path are special variants -- they don't have to be
% declared in the package, so we just allow them to spring into existence
@@ -911,18 +927,18 @@ os(OS) :- os(OS, _).
{ attr("node_os", PackageNode, OS) : os(OS) } :- attr("node", PackageNode).
% can't have a non-buildable OS on a node we need to build
error(100, "Cannot select '{0} os={1}' (operating system '{1}' is not buildable)", PackageNode, OS)
:- build(PackageNode),
attr("node_os", PackageNode, OS),
error(100, "Cannot select '{0} os={1}' (operating system '{1}' is not buildable)", Package, OS)
:- build(node(X, Package)),
attr("node_os", node(X, Package), OS),
not buildable_os(OS).
% can't have dependencies on incompatible OS's
-error(100, "{0} and dependency {1} have incompatible operating systems 'os={2}' and 'os={3}'", PackageNode, DependencyNode, PackageNodeOS, DependencyOS)
-  :- depends_on(PackageNode, DependencyNode),
-     attr("node_os", PackageNode, PackageNodeOS),
-     attr("node_os", DependencyNode, DependencyOS),
+error(100, "{0} and dependency {1} have incompatible operating systems 'os={2}' and 'os={3}'", Package, Dependency, PackageNodeOS, DependencyOS)
+  :- depends_on(node(X, Package), node(Y, Dependency)),
+     attr("node_os", node(X, Package), PackageNodeOS),
+     attr("node_os", node(Y, Dependency), DependencyOS),
not os_compatible(PackageNodeOS, DependencyOS),
-     build(PackageNode).
+     build(node(X, Package)).
% give OS choice weights according to os declarations
node_os_weight(PackageNode, Weight)
@@ -966,9 +982,9 @@ attr("node_os", PackageNode, OS) :- attr("node_os_set", PackageNode, OS), attr("
{ attr("node_target", PackageNode, Target) : target(Target) } :- attr("node", PackageNode).
% If a node must satisfy a target constraint, enforce it
-error(10, "'{0} target={1}' cannot satisfy constraint 'target={2}'", PackageNode, Target, Constraint)
-  :- attr("node_target", PackageNode, Target),
-     attr("node_target_satisfies", PackageNode, Constraint),
+error(10, "'{0} target={1}' cannot satisfy constraint 'target={2}'", Package, Target, Constraint)
+  :- attr("node_target", node(X, Package), Target),
+     attr("node_target_satisfies", node(X, Package), Constraint),
not target_satisfies(Constraint, Target).
% If a node has a target and the target satisfies a constraint, then the target
@@ -977,10 +993,10 @@ attr("node_target_satisfies", PackageNode, Constraint)
:- attr("node_target", PackageNode, Target), target_satisfies(Constraint, Target).
% If a node has a target, all of its dependencies must be compatible with that target
-error(100, "Cannot find compatible targets for {0} and {1}", PackageNode, DependencyNode)
-  :- depends_on(PackageNode, DependencyNode),
-     attr("node_target", PackageNode, Target),
-     not node_target_compatible(DependencyNode, Target).
+error(100, "Cannot find compatible targets for {0} and {1}", Package, Dependency)
+  :- depends_on(node(X, Package), node(Y, Dependency)),
+     attr("node_target", node(X, Package), Target),
+     not node_target_compatible(node(Y, Dependency), Target).
% Intermediate step for performance reasons
% When the integrity constraint above was formulated including this logic
@@ -992,13 +1008,13 @@ node_target_compatible(PackageNode, Target)
#defined target_satisfies/2.
% can't use targets on node if the compiler for the node doesn't support them
-error(100, "{0} compiler '{2}@{3}' incompatible with 'target={1}'", PackageNode, Target, Compiler, Version)
-  :- attr("node_target", PackageNode, Target),
-     node_compiler(PackageNode, CompilerID),
+error(100, "{0} compiler '{2}@{3}' incompatible with 'target={1}'", Package, Target, Compiler, Version)
+  :- attr("node_target", node(X, Package), Target),
+     node_compiler(node(X, Package), CompilerID),
not compiler_supports_target(CompilerID, Target),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, Version),
-     build(PackageNode).
+     build(node(X, Package)).
% if a target is set explicitly, respect it
attr("node_target", PackageNode, Target)
@@ -1021,9 +1037,9 @@ node_target_mismatch(ParentNode, DependencyNode)
not node_target_match(ParentNode, DependencyNode).
% disallow reusing concrete specs that don't have a compatible target
-error(100, "'{0} target={1}' is not compatible with this machine", PackageNode, Target)
-  :- attr("node", PackageNode),
-     attr("node_target", PackageNode, Target),
+error(100, "'{0} target={1}' is not compatible with this machine", Package, Target)
+  :- attr("node", node(X, Package)),
+     attr("node_target", node(X, Package), Target),
not target(Target).
%-----------------------------------------------------------------------------
@@ -1052,33 +1068,33 @@ attr("node_compiler_version", PackageNode, CompilerName, CompilerVersion)
attr("node_compiler", PackageNode, CompilerName)
:- attr("node_compiler_version", PackageNode, CompilerName, CompilerVersion).
-error(100, "No valid compiler version found for '{0}'", PackageNode)
-  :- attr("node", PackageNode),
-     not node_compiler(PackageNode, _).
+error(100, "No valid compiler version found for '{0}'", Package)
+  :- attr("node", node(X, Package)),
+     not node_compiler(node(X, Package), _).
% We can't have a compiler be enforced and select the version from another compiler
-error(100, "Cannot select a single compiler for package {0}", PackageNode)
-  :- attr("node", PackageNode),
-     2 { attr("node_compiler_version", PackageNode, C, V) }.
+error(100, "Cannot select a single compiler for package {0}", Package)
+  :- attr("node", node(X, Package)),
+     2 { attr("node_compiler_version", node(X, Package), C, V) }.
% If the compiler of a node cannot be satisfied, raise
-error(10, "No valid compiler for {0} satisfies '%{1}'", PackageNode, Compiler)
-  :- attr("node", PackageNode),
-     attr("node_compiler_version_satisfies", PackageNode, Compiler, ":"),
+error(10, "No valid compiler for {0} satisfies '%{1}'", Package, Compiler)
+  :- attr("node", node(X, Package)),
+     attr("node_compiler_version_satisfies", node(X, Package), Compiler, ":"),
not compiler_version_satisfies(Compiler, ":", _).
% If the compiler of a node must satisfy a constraint, then its version
% must be chosen among the ones that satisfy said constraint
-error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", PackageNode, Compiler, Constraint)
-  :- attr("node", PackageNode),
-     attr("node_compiler_version_satisfies", PackageNode, Compiler, Constraint),
+error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
+  :- attr("node", node(X, Package)),
+     attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, _).
-error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", PackageNode, Compiler, Constraint)
-  :- attr("node", PackageNode),
-     attr("node_compiler_version_satisfies", PackageNode, Compiler, Constraint),
+error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
+  :- attr("node", node(X, Package)),
+     attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, ID),
-     node_compiler(PackageNode, ID).
+     node_compiler(node(X, Package), ID).
% If the node is associated with a compiler and the compiler satisfy a constraint, then
% the compiler associated with the node satisfy the same constraint
@@ -1100,14 +1116,14 @@ attr("node_compiler_version_satisfies", PackageNode, Compiler, Constraint)
% Cannot select a compiler if it is not supported on the OS
% Compilers that are explicitly marked as allowed
% are excluded from this check
-error(100, "{0} compiler '%{1}@{2}' incompatible with 'os={3}'", PackageNode, Compiler, Version, OS)
-  :- attr("node_os", PackageNode, OS),
-     node_compiler(PackageNode, CompilerID),
+error(100, "{0} compiler '%{1}@{2}' incompatible with 'os={3}'", Package, Compiler, Version, OS)
+  :- attr("node_os", node(X, Package), OS),
+     node_compiler(node(X, Package), CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, Version),
not compiler_os(CompilerID, OS),
not allow_compiler(Compiler, Version),
-     build(PackageNode).
+     build(node(X, Package)).
% If a package and one of its dependencies don't have the
% same compiler there's a mismatch.


@@ -985,16 +985,14 @@ def __iter__(self):
def __len__(self):
return len(self.edges)
-    def add(self, edge):
-        """Adds a new edge to this object.
-
-        Args:
-            edge (DependencySpec): edge to be added
-        """
+    def add(self, edge: DependencySpec):
         key = edge.spec.name if self.store_by_child else edge.parent.name
-        current_list = self.edges.setdefault(key, [])
-        current_list.append(edge)
-        current_list.sort(key=_sort_by_dep_types)
+        if key in self.edges:
+            lst = self.edges[key]
+            lst.append(edge)
+            lst.sort(key=_sort_by_dep_types)
+        else:
+            self.edges[key] = [edge]
def __str__(self):
return "{deps: %s}" % ", ".join(str(d) for d in sorted(self.values()))
@@ -1927,19 +1925,15 @@ def _lookup_hash(self):
store, or finally, binary caches."""
import spack.environment
-        matches = []
         active_env = spack.environment.active_environment()

-        if active_env:
-            env_matches = active_env.get_by_hash(self.abstract_hash) or []
-            matches = [m for m in env_matches if m._satisfies(self)]
-
-        if not matches:
-            db_matches = spack.store.STORE.db.get_by_hash(self.abstract_hash) or []
-            matches = [m for m in db_matches if m._satisfies(self)]
-
-        if not matches:
-            query = spack.binary_distribution.BinaryCacheQuery(True)
-            remote_matches = query("/" + self.abstract_hash) or []
-            matches = [m for m in remote_matches if m._satisfies(self)]
+        # First env, then store, then binary cache
+        matches = (
+            (active_env.all_matching_specs(self) if active_env else [])
+            or spack.store.STORE.db.query(self, installed=any)
+            or spack.binary_distribution.BinaryCacheQuery(True)(self)
+        )
if not matches:
raise InvalidHashError(self, self.abstract_hash)
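The rewritten `_lookup_hash` above relies on Python's short-circuiting `or` to consult the active environment, then the local store, then the binary cache, stopping at the first source that yields matches. A minimal standalone sketch of that fallback idiom; the names `lookup`, `env_index`, `store_index`, and `cache_index` are illustrative stand-ins, not Spack's API:

```python
# Ordered-fallback lookup: each source is consulted only if every earlier
# one returned nothing, because `or` short-circuits on the first
# non-empty list.
def lookup(hash_prefix, env_index, store_index, cache_index):
    """Return matches from the first index that has any, else []."""
    return (
        [s for s in env_index if s.startswith(hash_prefix)]
        or [s for s in store_index if s.startswith(hash_prefix)]
        or [s for s in cache_index if s.startswith(hash_prefix)]
    )

env = ["abc123"]
store = ["abcdef", "abd000"]
cache = ["abcfff"]
print(lookup("abc", env, store, cache))  # the environment wins
print(lookup("abd", env, store, cache))  # falls through to the store
```

Because an empty list is falsy, a later source is only queried when all earlier ones missed, which mirrors the "first env, then store, then binary cache" comment in the new code.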
@@ -1960,19 +1954,17 @@ def lookup_hash(self):
spec = self.copy(deps=False)
# root spec is replaced
if spec.abstract_hash:
-            new = self._lookup_hash()
-            spec._dup(new)
+            spec._dup(self._lookup_hash())
return spec
# Get dependencies that need to be replaced
for node in self.traverse(root=False):
if node.abstract_hash:
-                new = node._lookup_hash()
-                spec._add_dependency(new, deptypes=(), virtuals=())
+                spec._add_dependency(node._lookup_hash(), deptypes=(), virtuals=())
# reattach nodes that were not otherwise satisfied by new dependencies
for node in self.traverse(root=False):
-            if not any(n._satisfies(node) for n in spec.traverse()):
+            if not any(n.satisfies(node) for n in spec.traverse()):
spec._add_dependency(node.copy(), deptypes=(), virtuals=())
return spec
@@ -1985,9 +1977,7 @@ def replace_hash(self):
if not any(node for node in self.traverse(order="post") if node.abstract_hash):
return
-        spec_by_hash = self.lookup_hash()
-
-        self._dup(spec_by_hash)
+        self._dup(self.lookup_hash())
def to_node_dict(self, hash=ht.dag_hash):
"""Create a dictionary representing the state of this Spec.
@@ -3723,15 +3713,19 @@ def intersects(self, other: "Spec", deps: bool = True) -> bool:
"""
other = self._autospec(other)
-        lhs = self.lookup_hash() or self
-        rhs = other.lookup_hash() or other
-
-        return lhs._intersects(rhs, deps)
-
-    def _intersects(self, other: "Spec", deps: bool = True) -> bool:
         if other.concrete and self.concrete:
             return self.dag_hash() == other.dag_hash()

+        self_hash = self.dag_hash() if self.concrete else self.abstract_hash
+        other_hash = other.dag_hash() if other.concrete else other.abstract_hash
+
+        if (
+            self_hash
+            and other_hash
+            and not (self_hash.startswith(other_hash) or other_hash.startswith(self_hash))
+        ):
+            return False
# If the names are different, we need to consider virtuals
if self.name != other.name and self.name and other.name:
if self.virtual and other.virtual:
@@ -3791,19 +3785,8 @@ def _intersects(self, other: "Spec", deps: bool = True) -> bool:
# If we need to descend into dependencies, do it, otherwise we're done.
if deps:
return self._intersects_dependencies(other)
-        else:
-            return True
-
-    def satisfies(self, other, deps=True):
-        """
-        This checks constraints on common dependencies against each other.
-        """
-        other = self._autospec(other)
-
-        lhs = self.lookup_hash() or self
-        rhs = other.lookup_hash() or other
-
-        return lhs._satisfies(rhs, deps=deps)
+
+        return True
def _intersects_dependencies(self, other):
if not other._dependencies or not self._dependencies:
@@ -3840,7 +3823,7 @@ def _intersects_dependencies(self, other):
return True
-    def _satisfies(self, other: "Spec", deps: bool = True) -> bool:
+    def satisfies(self, other: "Spec", deps: bool = True) -> bool:
"""Return True if all concrete specs matching self also match other, otherwise False.
Args:
@@ -3855,6 +3838,13 @@ def _satisfies(self, other: "Spec", deps: bool = True) -> bool:
# objects.
return self.concrete and self.dag_hash() == other.dag_hash()
+        # If the right-hand side has an abstract hash, make sure it's a prefix of the
+        # left-hand side's (abstract) hash.
+        if other.abstract_hash:
+            compare_hash = self.dag_hash() if self.concrete else self.abstract_hash
+            if not compare_hash or not compare_hash.startswith(other.abstract_hash):
+                return False
# If the names are different, we need to consider virtuals
if self.name != other.name and self.name and other.name:
# A concrete provider can satisfy a virtual dependency.
@@ -4231,9 +4221,7 @@ def eq_node(self, other):
def _cmp_iter(self):
"""Lazily yield components of self for comparison."""
-        cmp_spec = self.lookup_hash() or self
-
-        for item in cmp_spec._cmp_node():
+        for item in self._cmp_node():
yield item
# This needs to be in _cmp_iter so that no specs with different process hashes
@@ -4244,10 +4232,10 @@ def _cmp_iter(self):
# TODO: they exist for speed. We should benchmark whether it's really worth
# TODO: having two types of hashing now that we use `json` instead of `yaml` for
# TODO: spec hashing.
-        yield cmp_spec.process_hash() if cmp_spec.concrete else None
+        yield self.process_hash() if self.concrete else None
def deps():
-            for dep in sorted(itertools.chain.from_iterable(cmp_spec._dependencies.values())):
+            for dep in sorted(itertools.chain.from_iterable(self._dependencies.values())):
yield dep.spec.name
yield tuple(sorted(dep.deptypes))
yield hash(dep.spec)
@@ -4319,6 +4307,19 @@ def format(self, format_string=default_format, **kwargs):
``\{`` and ``\}`` for literal braces, and ``\\`` for the
literal ``\`` character.
+        The ``?`` sigil may be used to conditionally add a
+        value. Conditional format values are used if constructing the
+        value would not throw any error, and are ignored if it would
+        throw an error. For example, ``{?^mpi.name}`` will print
+        ``Spec["mpi"].name`` if such a node exists, and otherwise
+        prints nothing.
+
+        The ``?`` sigil may also be combined with a conditional
+        separator. This separator is prepended if anything is printed
+        for the conditional attribute. The syntax for this is
+        ``?sep?attribute``,
+        e.g. ``{name}-{version}{?/?^mpi.name}{?-?^mpi.version}``.
Args:
format_string (str): string containing the format to be expanded
@@ -4342,6 +4343,15 @@ def write(s, c=None):
def write_attribute(spec, attribute, color):
attribute = attribute.lower()
+            conditional = False
+            conditional_sep = ""
+            matches_conditional_sep = re.match(r"^\?(.*)\?", attribute)
+            if matches_conditional_sep:
+                conditional = True
+                conditional_sep = matches_conditional_sep.group(1)
+                attribute = attribute[matches_conditional_sep.end() :]
+            if attribute.startswith("?"):
+                conditional = True
+                attribute = attribute[1:]
sig = ""
if attribute.startswith(("@", "%", "/")):
# color sigils that are inside braces
@@ -4373,6 +4383,9 @@ def write_attribute(spec, attribute, color):
elif sig == " arch=" and attribute not in ("architecture", "arch"):
raise SpecFormatSigilError(sig, "the architecture", attribute)
+            # Now that we're done testing sig, combine it with conditional sep
+            sig = conditional_sep + sig
# find the morph function for our attribute
morph = transform.get(attribute, lambda s, x: x)
@@ -4402,7 +4415,12 @@ def write_attribute(spec, attribute, color):
else:
if isinstance(current, vt.VariantMap):
# subscript instead of getattr for variant names
-                        current = current[part]
+                        try:
+                            current = current[part]
+                        except KeyError:
+                            if conditional:
+                                return
+                            raise
else:
# aliases
if part == "arch":
@@ -4418,6 +4436,8 @@ def write_attribute(spec, attribute, color):
try:
current = getattr(current, part)
except AttributeError:
+                        if conditional:
+                            return
parent = ".".join(parts[:idx])
m = "Attempted to format attribute %s." % attribute
m += "Spec %s has no attribute %s" % (parent, part)
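The `?sep?attribute` handling added to `write_attribute` above can be re-created outside of Spack. This is a sketch of the parsing step only; the two lines that strip the consumed sigil off `attribute` are an assumption about how the string is used afterwards, since the hunk elides that detail:

```python
import re

def parse_conditional(attribute: str):
    """Split an optional `?sep?` or bare `?` prefix off a format attribute.

    Returns (conditional, separator, remaining_attribute). Stripping the
    matched prefix is assumed behavior, not shown in the hunk above.
    """
    conditional, sep = False, ""
    m = re.match(r"^\?(.*)\?", attribute)
    if m:  # `?sep?attribute` form, e.g. `?/?^mpi.name`
        conditional, sep = True, m.group(1)
        attribute = attribute[m.end():]
    elif attribute.startswith("?"):  # bare `?attribute` form
        conditional = True
        attribute = attribute[1:]
    return conditional, sep, attribute

print(parse_conditional("?/?^mpi.name"))  # (True, '/', '^mpi.name')
```

Note that the greedy `(.*)` means the separator runs up to the last `?` before the attribute body, which is what makes `?/?^mpi.name` yield separator `/`.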


@@ -197,7 +197,9 @@ def _expand_matrix_constraints(matrix_config):
for combo in itertools.product(*expanded_rows):
# Construct a combined spec to test against excludes
flat_combo = [constraint for constraint_list in combo for constraint in constraint_list]
-        flat_combo = [Spec(x) for x in flat_combo]
+
+        # Resolve abstract hashes so we can exclude by their concrete properties
+        flat_combo = [Spec(x).lookup_hash() for x in flat_combo]
test_spec = flat_combo[0].copy()
for constraint in flat_combo[1:]:


@@ -587,33 +587,39 @@ def test_conflicts_with_packages_that_are_not_dependencies(
assert any(s.satisfies(expected_spec) for s in e.concrete_roots())
-def test_requires_on_virtual_and_potential_providers(tmp_path, mock_packages, config):
+@pytest.mark.regression("39455")
+@pytest.mark.only_clingo("Known failure of the original concretizer")
+@pytest.mark.parametrize(
+    "possible_mpi_spec,unify", [("mpich", False), ("mpich", True), ("zmpi", False), ("zmpi", True)]
+)
+def test_requires_on_virtual_and_potential_providers(
+    possible_mpi_spec, unify, tmp_path, mock_packages, config
+):
"""Tests that in an environment we can add packages explicitly, even though they provide
a virtual package, and we require the provider of the same virtual to be another package,
if they are added explicitly by their name.
"""
-    if spack.config.get("config:concretizer") == "original":
-        pytest.xfail("Known failure of the original concretizer")
-
     manifest = tmp_path / "spack.yaml"
     manifest.write_text(
-        """\
+        f"""\
spack:
specs:
-  - mpich
+  - {possible_mpi_spec}
- mpich2
- mpileaks
packages:
mpi:
require: mpich2
+  concretizer:
+    unify: {unify}
"""
)
with ev.Environment(manifest.parent) as e:
e.concretize()
-        assert e.matching_spec("mpich")
+        assert e.matching_spec(possible_mpi_spec)
assert e.matching_spec("mpich2")
mpileaks = e.matching_spec("mpileaks")
-        assert mpileaks.satisfies("^mpich2")
+        assert mpileaks["mpi"].satisfies("mpich2")
-        assert not mpileaks.satisfies("^mpich")
+        assert not mpileaks.satisfies(f"^{possible_mpi_spec}")


@@ -1291,3 +1291,38 @@ def test_constrain(factory, lhs_str, rhs_str, result, constrained_str):
rhs = factory(rhs_str)
rhs.constrain(lhs)
assert rhs == factory(constrained_str)
+
+
+def test_abstract_hash_intersects_and_satisfies(default_mock_concretization):
+    concrete: Spec = default_mock_concretization("a")
+    hash = concrete.dag_hash()
+    hash_5 = hash[:5]
+    hash_6 = hash[:6]
+    # abstract hash that doesn't have a common prefix with the others.
+    hash_other = f"{'a' if hash_5[0] == 'b' else 'b'}{hash_5[1:]}"
+
+    abstract_5 = Spec(f"a/{hash_5}")
+    abstract_6 = Spec(f"a/{hash_6}")
+    abstract_none = Spec(f"a/{hash_other}")
+    abstract = Spec("a")
+
+    def assert_subset(a: Spec, b: Spec):
+        assert a.intersects(b) and b.intersects(a) and a.satisfies(b) and not b.satisfies(a)
+
+    def assert_disjoint(a: Spec, b: Spec):
+        assert (
+            not a.intersects(b)
+            and not b.intersects(a)
+            and not a.satisfies(b)
+            and not b.satisfies(a)
+        )
+
+    # left-hand side is more constrained, so its
+    # concretization space is a subset of the right-hand side's
+    assert_subset(concrete, abstract_5)
+    assert_subset(abstract_6, abstract_5)
+    assert_subset(abstract_5, abstract)
+
+    # disjoint concretization space
+    assert_disjoint(abstract_none, concrete)
+    assert_disjoint(abstract_none, abstract_5)
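Restricted to the abstract-hash constraint alone, the subset and disjointness relations this test asserts reduce to string-prefix checks, which is what the new `intersects`/`satisfies` code implements. A standalone sketch of just that rule (not Spack's implementation; the hash strings are made up):

```python
# Two hash constraints intersect iff one is a prefix of the other; `a`
# satisfies `b` iff `b` is a prefix of `a`, because a longer prefix is
# more constrained (matches fewer concrete specs).
def intersects(a: str, b: str) -> bool:
    return a.startswith(b) or b.startswith(a)

def satisfies(a: str, b: str) -> bool:
    # every concrete hash matching `a` also matches `b`
    return a.startswith(b)

full = "abcdef1234"  # stands in for a full dag hash
assert intersects(full, "abcde") and satisfies(full, "abcde")
assert satisfies("abcdef", "abcde") and not satisfies("abcde", "abcdef")
assert not intersects("bbcde", "abcde")  # no common prefix: disjoint
```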


@@ -726,22 +726,31 @@ def test_multiple_specs_with_hash(database, config):
@pytest.mark.db
def test_ambiguous_hash(mutable_database, default_mock_concretization, config):
+    """Test that abstract hash ambiguity is delayed until concretization.
+
+    In the past this ambiguity error would happen during parse time."""
+
+    # This is a very sketchy as manually setting hashes easily breaks invariants
     x1 = default_mock_concretization("a")
     x2 = x1.copy()
     x1._hash = "xyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy"
+    x1._process_hash = "xyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy"
     x2._hash = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
-    mutable_database.add(x1, spack.store.STORE.layout)
-    mutable_database.add(x2, spack.store.STORE.layout)
+    x2._process_hash = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
+
+    assert x1 != x2  # doesn't hold when only the dag hash is modified.
+
+    mutable_database.add(x1, directory_layout=None)
+    mutable_database.add(x2, directory_layout=None)

     # ambiguity in first hash character
+    s1 = SpecParser("/x").next_spec()
     with pytest.raises(spack.spec.AmbiguousHashError):
-        parsed_spec = SpecParser("/x").next_spec()
-        parsed_spec.replace_hash()
+        s1.lookup_hash()

     # ambiguity in first hash character AND spec name
+    s2 = SpecParser("a/x").next_spec()
     with pytest.raises(spack.spec.AmbiguousHashError):
-        parsed_spec = SpecParser("a/x").next_spec()
-        parsed_spec.replace_hash()
+        s2.lookup_hash()
@pytest.mark.db


@@ -11,7 +11,7 @@
__all__ = ["load", "dump", "SpackJSONError"]
-_json_dump_args = {"indent": 2, "separators": (",", ": ")}
+_json_dump_args = {"indent": None, "separators": (",", ":")}
def load(stream: Any) -> Dict:
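The change above switches `spack.util.spack_json` from pretty-printed output to the most compact `json.dumps` settings: no indentation or newlines, and no space after the separators. A quick stdlib comparison of the two argument sets:

```python
import json

data = {"name": "zlib", "version": "1.2.13"}
pretty = json.dumps(data, **{"indent": 2, "separators": (",", ": ")})  # old settings
compact = json.dumps(data, **{"indent": None, "separators": (",", ":")})  # new settings

print(compact)  # {"name":"zlib","version":"1.2.13"}
assert len(compact) < len(pretty)
```

The payload is identical after `json.loads`; only the serialized size changes, which matters for large spec files written and hashed repeatedly.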


@@ -224,7 +224,7 @@ spack:
# GPU
- aml +ze
-  - amrex +sycl dimensions=3
+  - amrex +sycl
- arborx +sycl ^kokkos +sycl +openmp cxxstd=17 +tests +examples
- cabana +sycl ^kokkos +sycl +openmp cxxstd=17 +tests +examples
- kokkos +sycl +openmp cxxstd=17 +tests +examples


@@ -40,6 +40,7 @@ class Amdblis(BlisBase):
version("2.2", sha256="e1feb60ac919cf6d233c43c424f6a8a11eab2c62c2c6e3f2652c15ee9063c0c9")
variant("ilp64", default=False, when="@3.0.1:", description="ILP64 support")
+    variant("suphandling", default=True, description="Small Unpacked Kernel handling")
def configure_args(self):
spec = self.spec
@@ -48,6 +49,11 @@ def configure_args(self):
if spec.satisfies("+ilp64"):
args.append("--blas-int-size=64")
+        if spec.satisfies("+suphandling"):
+            args.append("--enable-sup-handling")
+        else:
+            args.append("--disable-sup-handling")
# To enable Fortran to C calling convention for
# complex types when compiling with aocc flang
if self.spec.satisfies("@3.0 %aocc"):


@@ -159,7 +159,6 @@ class Amrex(CMakePackage, CudaPackage, ROCmPackage):
depends_on("hypre@2.19.0:", type="link", when="@21.03: ~cuda +hypre")
depends_on("hypre@2.20.0:", type="link", when="@21.03: +cuda +hypre")
depends_on("petsc", type="link", when="+petsc")
-    depends_on("intel-oneapi-compilers@2023.0.0:", type="build", when="@23.01: +sycl")
depends_on("intel-oneapi-mkl", type=("build", "link"), when="+sycl")
# these versions of gcc have lambda function issues
@@ -249,6 +248,8 @@ def get_cuda_arch_string(self, values):
#
@when("@20.12:,develop")
def cmake_args(self):
+        if self.spec.satisfies("@23.01: +sycl") and not self.spec.satisfies("%oneapi@2023.0.0:"):
+            raise InstallError("amrex +sycl requires %oneapi@2023.0.0:")
args = [
"-DUSE_XSDK_DEFAULTS=ON",
self.define_from_variant("AMReX_SPACEDIM", "dimensions"),


@@ -18,7 +18,10 @@ class AwsSdkCpp(CMakePackage):
homepage = "https://github.com/aws/aws-sdk-cpp"
git = "https://github.com/aws/aws-sdk-cpp.git"
+    version("1.11.144", tag="1.11.144", submodules=True)
+    version("1.10.57", tag="1.10.57", submodules=True)
version("1.10.32", tag="1.10.32", submodules=True)
version("1.9.379", tag="1.9.379", submodules=True)
version("1.9.247", tag="1.9.247", submodules=True)
depends_on("cmake@3.1:", type="build")
@@ -31,3 +34,6 @@ class AwsSdkCpp(CMakePackage):
sha256="ba86e0556322604fb4b70e2dd4f4fb874701868b07353fc1d5c329d90777bf45",
when="@1.9.247",
)
+    def cmake_args(self):
+        return [self.define("BUILD_ONLY", ("s3", "transfer"))]


@@ -9,11 +9,16 @@
class Bats(Package):
"""Bats is a TAP-compliant testing framework for Bash."""
-    homepage = "https://github.com/sstephenson/bats"
-    url = "https://github.com/sstephenson/bats/archive/v0.4.0.tar.gz"
+    homepage = "https://github.com/bats-core/bats-core"
+    url = "https://github.com/bats-core/bats-core/archive/refs/tags/v1.10.0.tar.gz"
-    version("0.4.0", sha256="480d8d64f1681eee78d1002527f3f06e1ac01e173b761bc73d0cf33f4dc1d8d7")
+    version("1.10.0", sha256="a1a9f7875aa4b6a9480ca384d5865f1ccf1b0b1faead6b47aa47d79709a5c5fd")
+    version(
+        "0.4.0",
+        sha256="480d8d64f1681eee78d1002527f3f06e1ac01e173b761bc73d0cf33f4dc1d8d7",
+        url="https://github.com/sstephenson/bats/archive/v0.4.0.tar.gz",
+    )
def install(self, spec, prefix):
bash = which("bash")
-        bash("install.sh", prefix)
+        bash("./install.sh", prefix)


@@ -431,6 +431,12 @@ def url_for_version(self, version):
return url.format(version.dotted, version.underscored)
+    def flag_handler(self, name, flags):
+        if name == "cxxflags":
+            if self.spec.satisfies("@1.79.0 %oneapi"):
+                flags.append("-Wno-error=enum-constexpr-conversion")
+        return (flags, None, None)
def determine_toolset(self, spec):
toolsets = {
"g++": "gcc",


@@ -43,6 +43,8 @@ class Botan(MakefilePackage):
depends_on("python", type="build")
depends_on("py-sphinx@1.2:", type="build", when="+doc")
+    conflicts("%gcc@:10", when="@3:")
def edit(self, spec, prefix):
configure = Executable("./configure.py")
configure(*self.configure_args())


@@ -11,13 +11,15 @@ class CBlosc2(CMakePackage):
other bells and whistles"""
homepage = "https://www.blosc.org/"
-    url = "https://github.com/Blosc/c-blosc2/archive/refs/tags/v2.10.1.tar.gz"
+    url = "https://github.com/Blosc/c-blosc2/archive/refs/tags/v2.10.2.tar.gz"
git = "https://github.com/Blosc/c-blosc2.git"
maintainers("ax3l", "robert-mijakovic")
version("develop", branch="master")
# 2.10.1+ adds Blosc2 CMake CONFIG files
+    # 2.10.2+ fixes regressions with external dependencies
+    version("2.10.2", sha256="069785bc14c006c7dab40ea0c620bdf3eb8752663fd55c706d145bceabc2a31d")
version("2.10.1", sha256="1dd65be2d76eee205c06e8812cc1360448620eee5e368b25ade4ea310654cd01")
version("2.10.0", sha256="cb7f7c0c62af78982140ecff21a2f3ca9ce6a0a1c02e314fcdce1a98da0fe231")
version("2.9.3", sha256="1f36b7d79d973505582b9a804803b640dcc0425af3d5e676070847ac4eb38176")


@@ -20,6 +20,11 @@ class Cdo(AutotoolsPackage):
maintainers("skosukhin", "Try2Code")
+    version(
+        "2.2.2",
+        sha256="419c77315244019af41a296c05066f474cccbf94debfaae9e2106da51bc7c937",
+        url="https://code.mpimet.mpg.de/attachments/download/28882/cdo-2.2.2.tar.gz",
+    )
version(
"2.2.0",
sha256="679c8d105706caffcba0960ec5ddc4a1332c1b40c52f82c3937356999d8fadf2",
@@ -167,13 +172,14 @@ class Cdo(AutotoolsPackage):
# We also need the backend of netcdf to be thread safe.
depends_on("hdf5+threadsafe", when="+netcdf")
# Same in case hdf5 is used in the frontend
-    depends_on("hdf5+threadsafe", when="+hdf5")
depends_on("grib-api", when="grib2=grib-api")
depends_on("eccodes", when="grib2=eccodes")
depends_on("szip", when="+szip")
+    depends_on("hdf5+threadsafe", when="+hdf5")
depends_on("udunits", when="+udunits2")
depends_on("libxml2", when="+libxml2")
depends_on("proj@:5", when="@:1.9.6+proj")


@@ -14,12 +14,16 @@ class Cepgen(CMakePackage):
tags = ["hep"]
+    version("1.1.0", sha256="2a4eaed161f007269516cbfb6e90421e657ab1922d4509de0165f08dde91bf3d")
version(
"1.0.2patch1", sha256="333bba0cb1965a98dec127e00c150eab1a515cd348a90f7b1d66d5cd8d206d21"
)
+    generator("ninja")
depends_on("cmake@3.5:", type="build", when="@1.0:")
+    depends_on("cmake@3.20:", type="build", when="@1.1:")
depends_on("gsl")
depends_on("openblas")
depends_on("hepmc")


@@ -0,0 +1,39 @@
diff --git a/CMakeLists.txt b/CMakeLists.txt
index f11e6e2..209970b 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -164,6 +164,7 @@ if (CLINGO_BUILD_WITH_LUA)
set_property(TARGET Lua::Lua PROPERTY INTERFACE_INCLUDE_DIRECTORIES "${LUA_INCLUDE_DIR}")
endif()
endif()
+find_package(mimalloc REQUIRED)
find_package(BISON "2.5")
find_package(RE2C "0.13")
if (PYCLINGO_USE_CFFI AND Python_Development_FOUND)
diff --git a/libclingo/CMakeLists.txt b/libclingo/CMakeLists.txt
index 83acc22..51d5762 100644
--- a/libclingo/CMakeLists.txt
+++ b/libclingo/CMakeLists.txt
@@ -50,7 +50,7 @@ else()
endif()
add_library(libclingo ${clingo_lib_type} ${header} ${source})
-target_link_libraries(libclingo PRIVATE libgringo libclasp)
+target_link_libraries(libclingo PRIVATE mimalloc-static libgringo libclasp)
target_include_directories(libclingo
PUBLIC
"$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}>"
diff --git a/libclingo/src/clingo_app.cc b/libclingo/src/clingo_app.cc
index 3e4d14c..fcfc9ea 100644
--- a/libclingo/src/clingo_app.cc
+++ b/libclingo/src/clingo_app.cc
@@ -27,6 +27,9 @@
#include <clasp/parser.h>
#include <climits>
+#include <mimalloc-new-delete.h>
+
+
namespace Gringo {
// {{{ declaration of ClingoApp


@@ -0,0 +1,39 @@
diff --git a/CMakeLists.txt b/CMakeLists.txt
index 7fbe16bc..78539519 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -224,6 +224,7 @@ if (CLINGO_BUILD_WITH_LUA)
set_property(TARGET Lua::Lua PROPERTY INTERFACE_INCLUDE_DIRECTORIES "${LUA_INCLUDE_DIR}")
endif()
endif()
+find_package(mimalloc REQUIRED)
find_package(BISON "2.5")
find_package(RE2C "0.101")
if (Python_Development_FOUND)
diff --git a/libclingo/CMakeLists.txt b/libclingo/CMakeLists.txt
index 1d70ba56..de2f2766 100644
--- a/libclingo/CMakeLists.txt
+++ b/libclingo/CMakeLists.txt
@@ -51,7 +51,7 @@ endif()
add_library(libclingo ${clingo_lib_type})
target_sources(libclingo ${clingo_private_scope_} ${header} ${source})
-target_link_libraries(libclingo ${clingo_private_scope_} libgringo libclasp)
+target_link_libraries(libclingo ${clingo_private_scope_} mimalloc-static libgringo libclasp)
target_include_directories(libclingo
${clingo_public_scope_}
"$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}>"
diff --git a/libclingo/src/clingo_app.cc b/libclingo/src/clingo_app.cc
index 13980efa..3c3b404b 100644
--- a/libclingo/src/clingo_app.cc
+++ b/libclingo/src/clingo_app.cc
@@ -26,6 +26,9 @@
#include <clasp/parser.h>
#include <climits>
+#include <mimalloc-new-delete.h>
+
+
namespace Gringo {
// {{{ declaration of ClingoApp


@@ -2,8 +2,15 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import glob
+import os
+
 import spack.compilers
+import spack.paths
+import spack.user_environment
+
 from spack.package import *
 from spack.pkg.builtin.clingo import Clingo
+from spack.util.environment import EnvironmentModifications
class ClingoBootstrap(Clingo):
@@ -13,24 +20,51 @@ class ClingoBootstrap(Clingo):
variant("build_type", default="Release", values=("Release",), description="CMake build type")
-    variant("static_libstdcpp", default=False, description="Require a static version of libstdc++")
+    variant(
+        "static_libstdcpp",
+        default=False,
+        when="platform=linux",
+        description="Require a static version of libstdc++",
+    )
+    variant(
+        "optimized",
+        default=False,
+        description="Enable a series of Spack-specific optimizations (PGO, LTO, mimalloc)",
+    )
+
+    # Enable LTO
+    conflicts("~ipo", when="+optimized")
+
+    with when("+optimized platform=linux"):
+        # Statically linked. Don't use ~override so we don't duplicate malloc/free, they
+        # get resolved to Python's libc's malloc in our case anyway.
+        depends_on("mimalloc +ipo libs=static ~override", type="build")
+        conflicts("~static_libstdcpp", msg="Custom allocator requires static libstdc++")
+        # Override new/delete with mimalloc.
+        patch("mimalloc.patch", when="@5.5.0:")
+        patch("mimalloc-pre-5.5.0.patch", when="@:5.4")
+        # ensure we hide libstdc++ with custom operator new/delete symbols
+        patch("version-script.patch")
# CMake at version 3.16.0 or higher has the possibility to force the
# Python interpreter, which is crucial to build against external Python
# in environment where more than one interpreter is in the same prefix
depends_on("cmake@3.16.0:", type="build")
-    # On Linux we bootstrap with GCC
-    for compiler_spec in [c for c in spack.compilers.supported_compilers() if c != "gcc"]:
+    # On Linux we bootstrap with GCC or clang
+    for compiler_spec in [
+        c for c in spack.compilers.supported_compilers() if c not in ("gcc", "clang")
+    ]:
conflicts(
"%{0}".format(compiler_spec),
when="platform=linux",
-            msg="GCC is required to bootstrap clingo on Linux",
+            msg="GCC or clang are required to bootstrap clingo on Linux",
)
conflicts(
"%{0}".format(compiler_spec),
when="platform=cray",
-            msg="GCC is required to bootstrap clingo on Cray",
+            msg="GCC or clang are required to bootstrap clingo on Cray",
)
conflicts("%gcc@:5", msg="C++14 support is required to bootstrap clingo")
@@ -51,28 +85,70 @@ def cmake_py_shared(self):
def cmake_args(self):
args = super().cmake_args()
args.extend(
[
# Avoid building the clingo executable
self.define("CLINGO_BUILD_APPS", "OFF")
]
)
args.append(self.define("CLINGO_BUILD_APPS", False))
return args
-def setup_build_environment(self, env):
-opts = None
-if "%apple-clang platform=darwin" in self.spec:
-opts = "-mmacosx-version-min=10.13"
-elif "%gcc" in self.spec:
-if "+static_libstdcpp" in self.spec:
-# This is either linux or cray
-opts = "-static-libstdc++ -static-libgcc -Wl,--exclude-libs,ALL"
-elif "platform=windows" in self.spec:
-pass
-else:
-msg = 'unexpected compiler for spec "{0}"'.format(self.spec)
-raise RuntimeError(msg)
-if opts:
-env.set("CXXFLAGS", opts)
-env.set("LDFLAGS", opts)
+@run_before("cmake", when="+optimized")
+def pgo_train(self):
+if self.spec.compiler.name == "clang":
+llvm_profdata = which("llvm-profdata", required=True)
+elif self.spec.compiler.name == "apple-clang":
+llvm_profdata = Executable(
+Executable("xcrun")("-find", "llvm-profdata", output=str).strip()
+)
# First, configure with PGO instrumentation flags, and do build the apps.
reports = os.path.abspath("reports")
sources = os.path.abspath(self.root_cmakelists_dir)
cmake_options = self.std_cmake_args + self.cmake_args() + [sources]
# Set PGO training flags.
generate_mods = EnvironmentModifications()
generate_mods.append_flags("CFLAGS", "-fprofile-generate={}".format(reports))
generate_mods.append_flags("CXXFLAGS", "-fprofile-generate={}".format(reports))
generate_mods.append_flags("LDFLAGS", "-fprofile-generate={} --verbose".format(reports))
with working_dir(self.build_directory, create=True):
cmake(*cmake_options, extra_env=generate_mods)
make()
make("install")
# Clean the reports dir.
rmtree(reports, ignore_errors=True)
# Run spack solve --fresh hdf5 with instrumented clingo.
python_runtime_env = EnvironmentModifications()
for s in self.spec.traverse(deptype=("run", "link"), order="post"):
python_runtime_env.extend(spack.user_environment.environment_modifications_for_spec(s))
python_runtime_env.unset("SPACK_ENV")
python_runtime_env.unset("SPACK_PYTHON")
self.spec["python"].command(
spack.paths.spack_script, "solve", "--fresh", "hdf5", extra_env=python_runtime_env
)
# Clean the build dir.
rmtree(self.build_directory, ignore_errors=True)
if self.spec.compiler.name in ("clang", "apple-clang"):
# merge reports
use_report = join_path(reports, "merged.prof")
raw_files = glob.glob(join_path(reports, "*.profraw"))
llvm_profdata("merge", "--output={}".format(use_report), *raw_files)
use_flag = "-fprofile-instr-use={}".format(use_report)
else:
use_flag = "-fprofile-use={}".format(reports)
# Set PGO use flags for next cmake phase.
use_mods = EnvironmentModifications()
use_mods.append_flags("CFLAGS", use_flag)
use_mods.append_flags("CXXFLAGS", use_flag)
use_mods.append_flags("LDFLAGS", use_flag)
cmake.add_default_envmod(use_mods)
def setup_build_environment(self, env):
if "%apple-clang" in self.spec:
env.append_flags("CFLAGS", "-mmacosx-version-min=10.13")
env.append_flags("CXXFLAGS", "-mmacosx-version-min=10.13")
env.append_flags("LDFLAGS", "-mmacosx-version-min=10.13")
elif self.spec.compiler.name in ("gcc", "clang") and "+static_libstdcpp" in self.spec:
env.append_flags("LDFLAGS", "-static-libstdc++ -static-libgcc -Wl,--exclude-libs,ALL")
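The `+optimized` path above is a two-phase PGO build: compile with profile-generation flags, run a representative workload (`spack solve --fresh hdf5`), merge the raw profiles for clang-family compilers with `llvm-profdata`, then rebuild with profile-use flags. The flag selection can be sketched with a hypothetical helper (not part of the package):

```python
def pgo_flags(phase, compiler, reports_dir):
    """Pick PGO compiler/linker flags for one build phase.

    phase: "generate" for the instrumented build, "use" for the rebuild.
    compiler: "gcc", "clang", or "apple-clang".
    """
    if phase == "generate":
        # gcc and clang both accept -fprofile-generate=<dir>.
        return ["-fprofile-generate={}".format(reports_dir)]
    if compiler in ("clang", "apple-clang"):
        # clang-family compilers consume a single merged profile
        # produced by `llvm-profdata merge`.
        return ["-fprofile-instr-use={}/merged.prof".format(reports_dir)]
    # gcc reads the raw profile data straight from the directory.
    return ["-fprofile-use={}".format(reports_dir)]
```

The package applies the same split: one `EnvironmentModifications` for the generate phase, another for the use phase.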


@@ -0,0 +1,48 @@
From 59859b8896e527bbd4a727beb798776d2716a8b3 Mon Sep 17 00:00:00 2001
From: Harmen Stoppels <me@harmenstoppels.nl>
Date: Thu, 10 Aug 2023 18:53:17 +0200
Subject: [PATCH] version script
---
libclingo/CMakeLists.txt | 12 ++++++++++++
libclingo/clingo.map | 4 ++++
2 files changed, 16 insertions(+)
create mode 100644 libclingo/clingo.map
diff --git a/libclingo/CMakeLists.txt b/libclingo/CMakeLists.txt
index 1d70ba56..0fd3bf49 100644
--- a/libclingo/CMakeLists.txt
+++ b/libclingo/CMakeLists.txt
@@ -58,6 +58,18 @@ target_include_directories(libclingo
"$<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}>")
target_compile_definitions(libclingo ${clingo_private_scope_} CLINGO_BUILD_LIBRARY)
+# Hide private symbols on Linux.
+include(CheckCSourceCompiles)
+file(WRITE "${CMAKE_CURRENT_BINARY_DIR}/version.map" "{ global: f; local: *;};")
+set(CMAKE_REQUIRED_FLAGS_SAVE ${CMAKE_REQUIRED_FLAGS})
+set(CMAKE_REQUIRED_FLAGS ${CMAKE_REQUIRED_FLAGS} "-Wl,--version-script='${CMAKE_CURRENT_BINARY_DIR}/version.map'")
+check_c_source_compiles("void f(void) {} int main(void) {return 0;}" HAVE_LD_VERSION_SCRIPT)
+set(CMAKE_REQUIRED_FLAGS ${CMAKE_REQUIRED_FLAGS_SAVE})
+file(REMOVE "${CMAKE_CURRENT_BINARY_DIR}/version.map")
+if(HAVE_LD_VERSION_SCRIPT)
+set_target_properties(libclingo PROPERTIES LINK_FLAGS "-Wl,--version-script='${CMAKE_CURRENT_SOURCE_DIR}/clingo.map'")
+endif()
+
if (NOT CLINGO_BUILD_SHARED)
target_compile_definitions(libclingo PUBLIC CLINGO_NO_VISIBILITY)
endif()
diff --git a/libclingo/clingo.map b/libclingo/clingo.map
new file mode 100644
index 00000000..a665456c
--- /dev/null
+++ b/libclingo/clingo.map
@@ -0,0 +1,4 @@
+{
+ global: clingo_*; gringo_*; g_clingo_*;
+ local: *;
+};
\ No newline at end of file
--
2.39.2
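The `clingo.map` version script above keeps only the `clingo_*`, `gringo_*`, and `g_clingo_*` symbols global and makes everything else local. The linker's patterns are glob-style; their effect can be sketched as:

```python
import fnmatch

# Export patterns taken from the clingo.map version script.
GLOBAL_PATTERNS = ("clingo_*", "gringo_*", "g_clingo_*")

def visibility(symbol):
    """Return 'global' for exported symbols, 'local' for hidden ones."""
    if any(fnmatch.fnmatch(symbol, pat) for pat in GLOBAL_PATTERNS):
        return "global"
    return "local"
```

Hiding the remaining symbols is what lets the package statically link libstdc++ without leaking its symbols to consumers.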


@@ -120,6 +120,11 @@ def cmake_args(self):
else:
args += ["-DCLINGO_BUILD_WITH_PYTHON=OFF"]
# Use LTO also for non-Intel compilers. This can be removed when they
# bump cmake_minimum_required to VERSION 3.9.
if "+ipo" in self.spec:
args.append("-DCMAKE_POLICY_DEFAULT_CMP0069=NEW")
return args
def win_add_library_dependent(self):


@@ -33,6 +33,7 @@ class Cp2k(MakefilePackage, CudaPackage, CMakePackage, ROCmPackage):
maintainers("dev-zero", "mtaillefumier")
version("2023.2", sha256="adbcc903c1a78cba98f49fe6905a62b49f12e3dfd7cedea00616d1a5f50550db")
version("2023.1", sha256="dff343b4a80c3a79363b805429bdb3320d3e1db48e0ff7d20a3dfd1c946a51ce")
version("2022.2", sha256="1a473dea512fe264bb45419f83de432d441f90404f829d89cbc3a03f723b8354")
version("2022.1", sha256="2c34f1a7972973c62d471cd35856f444f11ab22f2ff930f6ead20f3454fd228b")
@@ -180,6 +181,11 @@ class Cp2k(MakefilePackage, CudaPackage, CMakePackage, ROCmPackage):
depends_on("libxc@5.1.3:5.1", when="@8.2:8")
depends_on("libxc@5.1.7:5.1", when="@9:2022.2")
depends_on("libxc@6.1:", when="@2023.1:")
depends_on("libxc@6.2:", when="@2023.2:")
with when("+spla"):
depends_on("spla+cuda+fortran", when="+cuda")
depends_on("spla+rocm+fortran", when="+rocm")
with when("+mpi"):
depends_on("mpi@2:")
@@ -191,6 +197,7 @@ class Cp2k(MakefilePackage, CudaPackage, CMakePackage, ROCmPackage):
depends_on("cosma@2.5.1:", when="@9:")
depends_on("cosma@2.6.3:", when="@master:")
depends_on("cosma+cuda", when="+cuda")
depends_on("cosma+rocm", when="+rocm")
conflicts("~mpi")
# COSMA support was introduced in 8+
conflicts("@:7")
@@ -201,6 +208,7 @@ class Cp2k(MakefilePackage, CudaPackage, CMakePackage, ROCmPackage):
depends_on("elpa~openmp", when="~openmp")
depends_on("elpa@2021.05:", when="@8.3:")
depends_on("elpa@2021.11.001:", when="@9.1:")
depends_on("elpa@2023.05.001:", when="@2023.2:")
with when("+plumed"):
depends_on("plumed+shared")
@@ -219,6 +227,8 @@ class Cp2k(MakefilePackage, CudaPackage, CMakePackage, ROCmPackage):
# a consistent/compatible combination is pulled into the dependency graph.
with when("+sirius"):
depends_on("sirius+fortran+shared")
depends_on("sirius+cuda", when="+cuda")
depends_on("sirius+rocm", when="+rocm")
depends_on("sirius+openmp", when="+openmp")
depends_on("sirius~openmp", when="~openmp")
depends_on("sirius@:6", when="@:7")
@@ -865,16 +875,20 @@ def cmake_args(self):
raise InstallError("CP2K supports only one cuda_arch at a time.")
else:
gpu_ver = gpu_map[spec.variants["cuda_arch"].value[0]]
-args += ["-DCP2K_USE_ACCEL=CUDA"]
-args += [self.define("CP2K_WITH_GPU", gpu_ver)]
+args += [
+self.define("CP2K_USE_ACCEL", "CUDA"),
+self.define("CP2K_WITH_GPU", gpu_ver),
+]
if "+rocm" in spec:
if len(spec.variants["amdgpu_target"].value) > 1:
raise InstallError("CP2K supports only one amdgpu_target at a time.")
else:
gpu_ver = gpu_map[spec.variants["amdgpu_target"].value[0]]
-args += ["-DCP2K_USE_ACCEL=HIP"]
-args += [self.define("CP2K_WITH_GPU", gpu_ver)]
+args += [
+self.define("CP2K_USE_ACCEL", "HIP"),
+self.define("CP2K_WITH_GPU", gpu_ver),
+]
args += [
self.define_from_variant("CP2K_ENABLE_REGTESTS", "enable_regtests"),


@@ -9,9 +9,9 @@
class Faodel(CMakePackage):
"""Flexible, Asynchronous, Object Data-Exchange Libraries"""
-homepage = "https://github.com/faodel/faodel"
-url = "https://github.com/faodel/faodel/archive/v1.2108.1.tar.gz"
-git = "https://github.com/faodel/faodel.git"
+homepage = "https://github.com/sandialabs/faodel"
+url = "https://github.com/sandialabs/faodel/archive/v1.2108.1.tar.gz"
+git = "https://github.com/sandialabs/faodel.git"
maintainers("tkordenbrock", "craigulmer")


@@ -27,6 +27,7 @@ class Fortrilinos(CMakePackage):
tags = ["e4s"]
test_requires_compiler = True
version("2.3.0", sha256="7be5efecaea61ad773d3fe182aa28735ebc3e7af821e1805ad284e4ed4e31a49")
version("2.2.0", sha256="9e73fc71066bfaf7cde040e1467baf7a1ec797ff2874add49f9741e93f9fffb5")
version("2.1.0", sha256="2c62bb6106ae86a804497d549080cb6877c5d860b6bf2e72ec5cbcbbe63e3b5b")
version("2.0.1", sha256="291a62c885cd4ffd76cbebafa02789649bd4fa73f1005cf8da51fd153acb9e1a")
@@ -50,9 +51,10 @@ class Fortrilinos(CMakePackage):
variant("shared", default=True, description="Build shared libraries")
# Trilinos version dependencies
-depends_on("trilinos@13.4.0:13.4", when="@2.2.0:2.2")
-depends_on("trilinos@13.2.0:13.2", when="@2.1.0:2.1")
-depends_on("trilinos@13.0.0:13.2", when="@2.0.0:2.0")
+depends_on("trilinos@14.0", when="@2.3")
+depends_on("trilinos@13.4", when="@2.2")
+depends_on("trilinos@13.2", when="@2.1.0:2.1")
+depends_on("trilinos@13:13.2", when="@2.0")
depends_on("trilinos@12.18.1", when="@2.0.dev3")
depends_on("trilinos@12.18.1", when="@2.0.dev2")


@@ -107,6 +107,12 @@ class Gmsh(CMakePackage):
conflicts("+oce", when="^gmsh@4.10:4.10.3")
conflicts("+metis", when="+external", msg="External Metis cannot build with GMSH")
def flag_handler(self, name, flags):
if name == "cflags":
if self.spec.satisfies("%oneapi"):
flags.append("-Wno-error=implicit-function-declaration")
return (flags, None, None)
def cmake_args(self):
spec = self.spec

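Spack flag handlers like the one above return a three-way split of the flags (compiler-wrapper flags, environment flags, build-system flags); returning the list in the first slot injects it through the compiler wrapper. The pattern, reduced to a standalone function with the compiler check simplified to a string compare (an assumption for illustration):

```python
def oneapi_cflags(compiler, name, flags):
    """Append a warning downgrade for oneAPI C builds; pass others through."""
    flags = list(flags)  # do not mutate the caller's list
    if name == "cflags" and compiler == "oneapi":
        flags.append("-Wno-error=implicit-function-declaration")
    # (wrapper flags, environment flags, build-system flags)
    return (flags, None, None)
```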

@@ -20,6 +20,12 @@ class Hdf5VolCache(CMakePackage):
depends_on("hdf5@1.14: +mpi +threadsafe")
depends_on("hdf5-vol-async")
def flag_handler(self, name, flags):
if name == "cflags":
if self.spec.satisfies("%oneapi"):
flags.append("-Wno-error=incompatible-function-pointer-types")
return (flags, None, None)
def setup_run_environment(self, env):
env.prepend_path("HDF5_PLUGIN_PATH", self.spec.prefix.lib)


@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path
-from llnl.util import tty
+from llnl.util import lang, tty
from spack.package import *
@@ -175,22 +175,14 @@ class Kokkos(CMakePackage, CudaPackage, ROCmPackage):
description="Intel GPU architecture",
)
devices_values = list(devices_variants.keys())
-for dev in devices_variants:
-dflt, desc = devices_variants[dev]
+for dev, (dflt, desc) in devices_variants.items():
variant(dev, default=dflt, description=desc)
conflicts("+cuda", when="+rocm", msg="CUDA and ROCm are not compatible in Kokkos.")
-options_values = list(options_variants.keys())
-for opt in options_values:
-if "cuda" in opt:
-conflicts("+%s" % opt, when="~cuda", msg="Must enable CUDA to use %s" % opt)
-dflt, desc = options_variants[opt]
-variant(opt, default=dflt, description=desc)
+for opt, (dflt, desc) in options_variants.items():
+variant(opt, default=dflt, description=desc, when=("+cuda" if "cuda" in opt else None))
-tpls_values = list(tpls_variants.keys())
-for tpl in tpls_values:
-dflt, desc = tpls_variants[tpl]
+for tpl, (dflt, desc) in tpls_variants.items():
variant(tpl, default=dflt, description=desc)
depends_on(tpl, when="+%s" % tpl)
@@ -264,7 +256,7 @@ def append_args(self, cmake_prefix, cmake_options, spack_options):
optname = "Kokkos_%s_%s" % (cmake_prefix, opt.upper())
# Explicitly enable or disable
option = self.define_from_variant(optname, variant_name)
-if option not in spack_options:
+if option:
spack_options.append(option)
def setup_dependent_package(self, module, dependent_spec):
@@ -316,11 +308,11 @@ def cmake_args(self):
for arch in spack_microarches:
options.append(self.define("Kokkos_ARCH_" + arch.upper(), True))
-self.append_args("ENABLE", self.devices_values, options)
-self.append_args("ENABLE", self.options_values, options)
-self.append_args("ENABLE", self.tpls_values, options)
+self.append_args("ENABLE", self.devices_variants.keys(), options)
+self.append_args("ENABLE", self.options_variants.keys(), options)
+self.append_args("ENABLE", self.tpls_variants.keys(), options)
-for tpl in self.tpls_values:
+for tpl in self.tpls_variants:
if spec.variants[tpl].value:
options.append(self.define(tpl + "_DIR", spec[tpl].prefix))
@@ -334,7 +326,8 @@ def cmake_args(self):
if self.spec.satisfies("%oneapi") or self.spec.satisfies("%intel"):
options.append(self.define("CMAKE_CXX_FLAGS", "-fp-model=precise"))
-return options
+# Remove duplicate options
+return lang.dedupe(options)
test_script_relative_path = join_path("scripts", "spack_test")
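The `lang.dedupe` call above drops repeated CMake options while keeping each option's first-seen position. An order-preserving dedupe can be sketched as (assuming hashable items):

```python
def dedupe(items):
    """Yield each item once, in first-seen order."""
    seen = set()
    for item in items:
        if item not in seen:
            seen.add(item)
            yield item
```

A plain `set()` would also remove duplicates but scrambles the order, which matters when later options are meant to override earlier ones.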


@@ -15,6 +15,9 @@ class Less(AutotoolsPackage):
url = "https://www.greenwoodsoftware.com/less/less-551.zip"
list_url = "https://www.greenwoodsoftware.com/less/download.html"
depends_on("ncurses")
version("643", sha256="3bb417c4b909dfcb0adafc371ab87f0b22e8b15f463ec299d156c495fc9aa196")
version("590", sha256="69056021c365b16504cf5bd3864436a5e50cb2f98b76cd68b99b457064139375")
version("551", sha256="2630db16ef188e88b513b3cc24daa9a798c45643cc7da06e549c9c00cfd84244")
version("530", sha256="8c1652ba88a726314aa2616d1c896ca8fe9a30253a5a67bc21d444e79a6c6bc3")


@@ -13,6 +13,9 @@ class Libxc(AutotoolsPackage, CudaPackage):
homepage = "https://tddft.org/programs/libxc/"
url = "https://gitlab.com/libxc/libxc/-/archive/6.1.0/libxc-6.1.0.tar.gz"
version("6.2.2", sha256="3b0523924579cf494cafc6fea92945257f35692b004217d3dfd3ea7ca780e8dc")
version("6.2.1", sha256="b5f3b4514db6bc4ccda1da90ac6176ea1f82e12241cc66427c58cbc4a5197b9b")
version("6.2.0", sha256="3d25878782b5f94e7e4d41bd6de27f98983584cd0be0c65e69a9ada986b56b4d")
version("6.1.0", sha256="f593745fa47ebfb9ddc467aaafdc2fa1275f0d7250c692ce9761389a90dd8eaf")
version("6.0.0", sha256="0c774e8e195dd92800b9adf3df5f5721e29acfe9af4b191a9937c7de4f9aa9f6")
version("5.2.3", sha256="7b7a96d8eeb472c7b8cca7ac38eae27e0a8113ef44dae5359b0eb12592b4bcf2")


@@ -115,4 +115,9 @@ def cmake_args(self):
for lib in self.libs_values
]
args += [self.define_from_variant("MI_%s" % k.upper(), k) for k in self.mimalloc_options]
# Use LTO also for non-Intel compilers. This can be removed when they
# bump cmake_minimum_required to VERSION 3.9.
if "+ipo" in self.spec:
args.append("-DCMAKE_POLICY_DEFAULT_CMP0069=NEW")
return args


@@ -18,12 +18,16 @@ class Moab(AutotoolsPackage):
homepage = "https://bitbucket.org/fathomteam/moab"
git = "https://bitbucket.org/fathomteam/moab.git"
-url = "https://ftp.mcs.anl.gov/pub/fathom/moab-5.0.0.tar.gz"
+url = "https://ftp.mcs.anl.gov/pub/fathom/moab-5.5.0.tar.gz"
maintainers("vijaysm", "iulian787")
version("develop", branch="develop")
version("master", branch="master")
version("5.5.0", sha256="58969f8a1b209ec9036c08c53a6b7078b368eb3bf99d0368a4de5a2f2a8db678")
version("5.4.1", sha256="3625e25321bf37f88d98438f5d56c280b2774172602d8b6eb6c34eedf37686fc")
version("5.4.0", sha256="a30d2a1911fbf214ae0175b0856e0475c0077dc51ea5914c850d631155a72952")
version("5.3.1", sha256="2404fab2d84f87be72b57cfef5ea237bfa444aaca059e66a158f22134956fe54")
version("5.3.0", sha256="51c31ccbcaa76d9658a44452b9a39f076b795b27a1c9f408fc3d0eea97e032ef")
version("5.2.1", sha256="60d31762be3f0e5c89416c764e844ec88dac294169b59a5ead3c316b50f85c29")
version("5.2.0", sha256="805ed3546deff39e076be4d1f68aba1cf0dda8c34ce43e1fc179d1aff57c5d5d")
@@ -36,22 +40,23 @@ class Moab(AutotoolsPackage):
version("4.9.0", sha256="267a7c05da847e4ea856db2c649a5484fb7bdc132ab56721ca50ee69a7389f4d")
version("4.8.2", sha256="b105cff42930058dc14eabb9a25e979df7289b175732fe319d2494e83e09e968")
-variant("mpi", default=True, description="enable mpi support")
-variant("hdf5", default=True, description="Required to enable the hdf5 (default I/O) format")
-variant("netcdf", default=False, description="Required to enable the ExodusII reader/writer.")
-variant("pnetcdf", default=False, description="Enable pnetcdf (AKA parallel-netcdf) support")
-variant("zoltan", default=False, description="Enable zoltan support")
-variant("cgm", default=False, description="Enable common geometric module")
-variant("metis", default=True, description="Enable metis link")
-variant("parmetis", default=True, description="Enable parmetis link")
-variant("irel", default=False, description="Enable irel interface")
-variant("fbigeom", default=False, description="Enable fbigeom interface")
+variant("mpi", default=True, description="Enable MPI parallelism support")
+variant("hdf5", default=True, description="Enable the HDF5 (default I/O) format")
+variant("netcdf", default=False, description="Enable the ExodusII reader/writer support")
+variant("pnetcdf", default=False, description="Enable parallel-Netcdf library support")
+variant("zoltan", default=True, description="Enable Zoltan partitioner support")
+variant("eigen", default=True, description="Enable Eigen template library for linear algebra")
+variant("cgm", default=False, description="Enable the Common Geometry Module")
+variant("metis", default=False, description="Enable Metis partitioner")
+variant("parmetis", default=False, description="Enable Parmetis partitioner")
+variant("irel", default=False, description="Enable iRel interface")
+variant("fbigeom", default=False, description="Enable FBiGeom interface")
variant("coupler", default=True, description="Enable mbcoupler tool")
-variant("dagmc", default=False, description="Enable dagmc tool")
+variant("dagmc", default=False, description="Enable DagMC tool")
-variant("debug", default=False, description="enable debug symbols")
+variant("debug", default=False, description="Enable debug symbols in libraries")
variant("shared", default=False, description="Enables the build of shared libraries")
-variant("fortran", default=True, description="Enable Fortran support")
+variant("fortran", default=False, description="Enable Fortran support")
conflicts("+irel", when="~cgm")
conflicts("+pnetcdf", when="~mpi")
@@ -80,6 +85,7 @@ class Moab(AutotoolsPackage):
depends_on("cgm", when="+cgm")
depends_on("metis", when="+metis")
depends_on("parmetis", when="+parmetis")
depends_on("eigen", when="+eigen")
# FIXME it seems that zoltan needs to be built without fortran
depends_on("zoltan~fortran", when="+zoltan")
@@ -126,6 +132,8 @@ def configure_args(self):
options.append("--with-blas=%s" % spec["blas"].libs.ld_flags)
options.append("--with-lapack=%s" % spec["lapack"].libs.ld_flags)
if "+eigen" in spec:
options.append("--with-eigen3=%s" % spec["eigen"].prefix.include.eigen3)
if "+hdf5" in spec:
options.append("--with-hdf5=%s" % spec["hdf5"].prefix)
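The `spec["eigen"].prefix.include.eigen3` expression above works because Spack's `Prefix` type turns attribute access into path joins. The idea, in miniature (a simplified sketch; the real class handles more cases):

```python
import os

class Prefix(str):
    """String subclass where attribute access appends a path component."""

    def __getattr__(self, component):
        # Only invoked for names str itself does not define,
        # so ordinary string methods still work.
        return Prefix(os.path.join(self, component))
```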


@@ -10,7 +10,7 @@ class OpenpmdApi(CMakePackage):
"""C++ & Python API for Scientific I/O"""
homepage = "https://www.openPMD.org"
-url = "https://github.com/openPMD/openPMD-api/archive/0.15.1.tar.gz"
+url = "https://github.com/openPMD/openPMD-api/archive/0.15.2.tar.gz"
git = "https://github.com/openPMD/openPMD-api.git"
maintainers("ax3l", "franzpoeschel")
@@ -19,6 +19,7 @@ class OpenpmdApi(CMakePackage):
# C++17 up until here
version("develop", branch="dev")
version("0.15.2", sha256="fbe3b356fe6f4589c659027c8056844692c62382e3ec53b953bed1c87e58ba13")
version("0.15.1", sha256="0e81652152391ba4d2b62cfac95238b11233a4f89ff45e1fcffcc7bcd79dabe1")
version("0.15.0", sha256="290e3a3c5814204ea6527d53423bfacf7a8dc490713227c9e0eaa3abf4756177")
# C++14 up until here
@@ -70,8 +71,8 @@ class OpenpmdApi(CMakePackage):
depends_on("py-pybind11@2.6.2:", type="link")
depends_on("py-numpy@1.15.1:", type=["test", "run"])
depends_on("py-mpi4py@2.1.0:", when="+mpi", type=["test", "run"])
-depends_on("python@3.6:", type=["link", "test", "run"])
-depends_on("python@3.7:", when="@0.15.0:", type=["link", "test", "run"])
+depends_on("python@3.7:", type=["link", "test", "run"])
+depends_on("python@3.8:", when="@0.15.2:", type=["link", "test", "run"])
conflicts("^hdf5 api=v16", msg="openPMD-api requires HDF5 APIs for 1.8+")


@@ -17,6 +17,7 @@ class Parallelio(CMakePackage):
maintainers("jedwards4b")
version("2.6.1", sha256="83d3108d2b9db8219aa6b6ee333cfc12b2a588bcfc781587df5f8b24a716a6eb")
version("2.6.0", sha256="e56a980c71c7f57f396a88beae08f1670d4adf59be6411cd573fe85868ef98c0")
version("2.5.10", sha256="fac694827c81434a7766976711ba7179940e361e8ed0c189c7b397fd44d401de")
version("2.5.9", sha256="e5dbc153d8637111de3a51a9655660bf15367d55842de78240dcfc024380553d")
@@ -36,6 +37,12 @@ class Parallelio(CMakePackage):
)
variant("mpi", default=True, description="Use mpi to build, otherwise use mpi-serial")
# This patch addresses building pio 2.6.1 with a serial netcdf library; again, the issue is netcdf filters
patch("serial261.patch", when="@2.6.1")
# This patch addresses an issue when compiling pio2.6.0 with a serial netcdf library.
# netcdf4 filters are only available with the parallel build of netcdf.
patch("pio_260.patch", when="@2.6.0")
patch("remove_redefinition_of_mpi_offset.patch", when="@:2.5.6")
depends_on("cmake@3.7:", type="build")


@@ -0,0 +1,20 @@
diff --git a/src/clib/pio_internal.h b/src/clib/pio_internal.h
index c360ae4e..79cc48eb 100644
--- a/src/clib/pio_internal.h
+++ b/src/clib/pio_internal.h
@@ -17,11 +17,13 @@
#include <limits.h>
#include <math.h>
#include <netcdf.h>
-#ifdef _NETCDF4
-#include <netcdf_par.h>
+#ifdef PIO_HAS_PAR_FILTERS
#include <netcdf_filter.h>
#include <netcdf_meta.h>
#endif
+#ifdef _NETCDF4
+#include <netcdf_par.h>
+#endif
#ifdef _PNETCDF
#include <pnetcdf.h>
#endif


@@ -0,0 +1,36 @@
diff --git a/src/clib/pio.h b/src/clib/pio.h
index 0aea5a5f..767de18f 100644
--- a/src/clib/pio.h
+++ b/src/clib/pio.h
@@ -1267,9 +1267,8 @@ extern "C" {
const long long *op);
int PIOc_put_vard_ulonglong(int ncid, int varid, int decompid, const PIO_Offset recnum,
const unsigned long long *op);
-/* use this variable in the NETCDF library (introduced in v4.9.0) to determine if the following
- functions are available */
-#ifdef NC_HAS_MULTIFILTERS
+
+#ifdef PIO_HAS_PAR_FILTERS
int PIOc_def_var_filter(int ncid, int varid,unsigned int id, size_t nparams, unsigned int *params);
int PIOc_inq_var_filter_ids(int ncid, int varid, size_t *nfiltersp, unsigned int *ids);
int PIOc_inq_var_filter_info(int ncid, int varid, unsigned int id, size_t *nparamsp, unsigned int *params );
diff --git a/src/ncint/ncintdispatch.c b/src/ncint/ncintdispatch.c
index a77396bd..3dce9d2c 100644
--- a/src/ncint/ncintdispatch.c
+++ b/src/ncint/ncintdispatch.c
@@ -127,6 +127,7 @@ NC_Dispatch NCINT_dispatcher = {
NC_NOTNC4_def_var_filter,
NC_NOTNC4_set_var_chunk_cache,
NC_NOTNC4_get_var_chunk_cache,
+#ifdef PIO_HAS_PAR_FILTERS
#if NC_DISPATCH_VERSION == 2
PIO_NCINT_filter_actions,
#endif
@@ -141,6 +142,7 @@ NC_Dispatch NCINT_dispatcher = {
#if NC_DISPATCH_VERSION >= 5
PIOc_inq_filter_avail,
#endif
+#endif
};
/**


@@ -24,3 +24,9 @@ class PrunersNinja(AutotoolsPackage):
depends_on("libtool", type="build")
patch("pruners-mutli-def-a-pr3-fix.patch")
def flag_handler(self, name, flags):
if name == "cflags":
if self.spec.satisfies("%oneapi"):
flags.append("-Wno-error=implicit-function-declaration")
return (flags, None, None)


@@ -12,11 +12,16 @@ class PyAccelerate(PythonPackage):
homepage = "https://github.com/huggingface/accelerate"
pypi = "accelerate/accelerate-0.16.0.tar.gz"
maintainers("meyersbs")
version("0.21.0", sha256="e2959a0bf74d97c0b3c0e036ed96065142a060242281d27970d4c4e34f11ca59")
version("0.16.0", sha256="d13e30f3e6debfb46cada7b931af85560619b6a6a839d0cafeeab6ed7c6a498d")
depends_on("python@3.8.0:", when="@0.21.0:", type=("build", "run"))
depends_on("py-setuptools", type="build")
depends_on("py-numpy@1.17:", type=("build", "run"))
depends_on("py-packaging@20:", type=("build", "run"))
depends_on("py-psutil", type=("build", "run"))
depends_on("py-pyyaml", type=("build", "run"))
depends_on("py-torch@1.10.0:", when="@0.21.0:", type=("build", "run"))
depends_on("py-torch@1.4:", type=("build", "run"))


@@ -0,0 +1,28 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyAlpacaEval(PythonPackage):
"""An automatic evaluator for instruction-following language models.
Human-validated, high-quality, cheap, and fast."""
homepage = "https://github.com/tatsu-lab/alpaca_eval"
pypi = "alpaca_eval/alpaca_eval-0.2.8.tar.gz"
maintainers("meyersbs")
version("0.2.8", sha256="5b21b74d7362ee229481b6a6d826dd620b2ef6b82e4f5470645e0a4b696a31e6")
depends_on("py-setuptools", type="build")
depends_on("python@3.10:", type=("build", "run"))
depends_on("py-python-dotenv", type=("build", "run"))
depends_on("py-datasets", type=("build", "run"))
depends_on("py-openai", type=("build", "run"))
depends_on("py-pandas", type=("build", "run"))
depends_on("py-tiktoken@0.3.2:", type=("build", "run"))
depends_on("py-fire", type=("build", "run"))


@@ -0,0 +1,39 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyAlpacaFarm(PythonPackage):
"""AlpacaFarm is a simulator that enables research and development on learning
from feedback at a fraction of the usual cost, promoting accessible research on
instruction following and alignment."""
homepage = "https://github.com/tatsu-lab/alpaca_farm"
pypi = "alpaca_farm/alpaca_farm-0.1.9.tar.gz"
maintainers("meyersbs")
version("0.1.9", sha256="1039d33c814d0bbbcab6a0e77ed8e897992ad7107d5c4999d56bdad7e0b0a59f")
depends_on("py-setuptools", type="build")
depends_on("python@3.10:", type=("build", "run"))
depends_on("py-datasets", type=("build", "run"))
depends_on("py-einops", type=("build", "run"))
depends_on("py-nltk", type=("build", "run"))
depends_on("py-accelerate@0.18.0:", type=("build", "run"))
depends_on("py-tabulate", type=("build", "run"))
depends_on("py-transformers@4.26.0:", type=("build", "run"))
depends_on("py-statsmodels", type=("build", "run"))
depends_on("py-tiktoken@0.3.2:", type=("build", "run"))
depends_on("py-markdown", type=("build", "run"))
depends_on("py-scikit-learn", type=("build", "run"))
depends_on("py-sentencepiece", type=("build", "run"))
depends_on("py-pandas", type=("build", "run"))
depends_on("py-wandb", type=("build", "run"))
depends_on("py-torch@1.13.1:", type=("build", "run"))
depends_on("py-fire", type=("build", "run"))
depends_on("py-openai", type=("build", "run"))
depends_on("py-alpaca-eval@0.2.8:", type=("build", "run"))


@@ -0,0 +1,21 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyConfection(PythonPackage):
"""The sweetest config system for Python"""
homepage = "https://github.com/explosion/confection"
pypi = "confection/confection-0.0.4.tar.gz"
version("0.0.4", sha256="b1ddf5885da635f0e260a40b339730806dfb1bd17d30e08764f35af841b04ecf")
depends_on("python@3.6:", type=("build", "run"))
depends_on("py-setuptools", type="build")
depends_on("py-pydantic@1.7.4:1.7,1.9:1.10", type=("build", "run"))
depends_on("py-typing-extensions@3.7.4.1:4.4", type=("build", "run"), when="^python@3.7")
depends_on("py-srsly@2.4:2", type=("build", "run"))


@@ -0,0 +1,32 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyDeepspeed(PythonPackage):
"""DeepSpeed library.
DeepSpeed enables the world's most powerful language models like MT-530B and BLOOM. It is an
easy-to-use deep learning optimization software suite that powers unprecedented scale and
speed for both training and inference.
"""
homepage = "http://deepspeed.ai/"
pypi = "deepspeed/deepspeed-0.10.0.tar.gz"
version("0.10.0", sha256="afb06a97fde2a33d0cbd60a8357a70087c037b9f647ca48377728330c35eff3e")
depends_on("py-setuptools", type="build")
depends_on("py-hjson", type=("build", "run"))
depends_on("py-ninja", type=("build", "run"))
depends_on("py-numpy", type=("build", "run"))
depends_on("py-packaging@20:", type=("build", "run"))
depends_on("py-psutil", type=("build", "run"))
depends_on("py-py-cpuinfo", type=("build", "run"))
depends_on("py-pydantic@:1", type=("build", "run"))
# https://github.com/microsoft/DeepSpeed/issues/2830
depends_on("py-torch+distributed", type=("build", "run"))
depends_on("py-tqdm", type=("build", "run"))


@@ -13,6 +13,7 @@ class PyFlake8(PythonPackage):
homepage = "https://github.com/PyCQA/flake8"
pypi = "flake8/flake8-4.0.1.tar.gz"
version("6.1.0", sha256="d5b3857f07c030bdb5bf41c7f53799571d75c4491748a3adcd47de929e34cd23")
version("6.0.0", sha256="c61007e76655af75e6785a931f452915b371dc48f56efd765247c8fe68f2b181")
version("5.0.4", sha256="6fbe320aad8d6b95cec8b8e47bc933004678dc63095be98528b7bdd2a9f510db")
version("5.0.2", sha256="9cc32bc0c5d16eacc014c7ec6f0e9565fd81df66c2092c3c9df06e3c1ac95e5d")
@@ -43,6 +44,11 @@ class PyFlake8(PythonPackage):
# http://flake8.pycqa.org/en/latest/faq.html#why-does-flake8-use-ranges-for-its-dependencies
# http://flake8.pycqa.org/en/latest/internal/releases.html#releasing-flake8
# Flake8 6.1.X
depends_on("py-mccabe@0.7", when="@6.1", type=("build", "run"))
depends_on("py-pycodestyle@2.11", when="@6.1", type=("build", "run"))
depends_on("py-pyflakes@3.1", when="@6.1", type=("build", "run"))
# Flake8 6.0.X
depends_on("py-mccabe@0.7", when="@6.0", type=("build", "run"))
depends_on("py-pycodestyle@2.10", when="@6.0", type=("build", "run"))


@@ -0,0 +1,27 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyGradioClient(PythonPackage):
"""Python library for easily interacting with trained machine learning models"""
homepage = "https://github.com/gradio-app/gradio"
pypi = "gradio_client/gradio_client-0.2.9.tar.gz"
version("0.2.9", sha256="d4071709ab45a3dbacdbd0797fde5d66d87a98424559e060d576fbe9b0171f4d")
depends_on("python@3.8:", type=("build", "run"))
depends_on("py-hatchling", type="build")
depends_on("py-hatch-requirements-txt", type="build")
depends_on("py-hatch-fancy-pypi-readme@22.5:", type="build")
depends_on("py-requests", type=("build", "run"))
depends_on("py-websockets", type=("build", "run"))
depends_on("py-packaging", type=("build", "run"))
depends_on("py-fsspec", type=("build", "run"))
depends_on("py-huggingface-hub@0.13:", type=("build", "run"))
depends_on("py-typing-extensions", type=("build", "run"))
depends_on("py-httpx", type=("build", "run"))


@@ -16,6 +16,7 @@ class PyLightning(PythonPackage):
maintainers("adamjstewart")
version("2.0.7", sha256="f05acd4ba846505d40125b4f9f0bda0804b2b0356e2ad2fd4e4bf7d1c61c8cc6")
version("2.0.6", sha256="bff959f65eed2f626dd65e7b2cfd0d3ddcd0c4ca19ffc8f5f49a4ba4494ca528")
version("2.0.5", sha256="77df233129b29c11df7b5e071e24e29420d5efbdbbac9cb6fb4602b7b5afce8a")
version("2.0.4", sha256="f5f5ed75a657caa8931051590ed000d46bf1b8311ae89bb17a961c3f299dbf33")
@@ -55,7 +56,8 @@ class PyLightning(PythonPackage):
depends_on("py-numpy@1.17.2:2", type=("build", "run"))
depends_on("py-packaging@17.1:24", type=("build", "run"))
depends_on("py-psutil@:6", type=("build", "run"))
-depends_on("py-pydantic@1.7.4:2.0", when="@2.0.6:", type=("build", "run"))
+depends_on("py-pydantic@1.7.4:2.1", when="@2.0.7:", type=("build", "run"))
+depends_on("py-pydantic@1.7.4:2.0", when="@2.0.6", type=("build", "run"))
depends_on("py-pydantic@1.7.4:1", when="@2.0.5", type=("build", "run"))
depends_on("py-pydantic@1.7.4:3", when="@2.0.3:2.0.4", type=("build", "run"))
depends_on("py-pydantic@:2", when="@:2.0.2", type=("build", "run"))
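The pydantic constraints above rely on Spack's inclusive version ranges: `@1.7.4:2.1` admits any version from 1.7.4 up to and including the 2.1 series (so 2.1.9 matches, 2.2 does not). A simplified numeric-only check of that rule (hypothetical; Spack's real version grammar is richer):

```python
def in_range(version, lo=None, hi=None):
    """Inclusive range check on dotted numeric versions.

    The upper bound matches by prefix, mirroring Spack: hi="2.1"
    admits 2.1 and 2.1.9 but rejects 2.2.
    """
    v = [int(part) for part in version.split(".")]
    if lo is not None and v < [int(part) for part in lo.split(".")]:
        return False
    if hi is not None:
        bound = [int(part) for part in hi.split(".")]
        if v[: len(bound)] > bound:
            return False
    return True
```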


@@ -16,14 +16,20 @@ class PyMarkdownItPy(PythonPackage):
pypi = "markdown-it-py/markdown-it-py-1.1.0.tar.gz"
version("3.0.0", sha256="e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb")
version("2.2.0", sha256="7c9a5e412688bc771c67432cbfebcdd686c93ce6484913dccf06cb5a0bea35a1")
version("1.1.0", sha256="36be6bb3ad987bfdb839f5ba78ddf094552ca38ccbd784ae4f74a4e1419fc6e3")
variant("linkify", default=False, description="Linkify support")
depends_on("python@3.8:", when="@2.1:", type=("build", "run"))
depends_on("python@3.6:3", when="@:2.0", type=("build", "run"))
depends_on("py-flit-core@3.4:3", when="@2.1:", type="build")
depends_on("py-mdurl@0.1:0", when="@2:", type=("build", "run"))
depends_on("py-linkify-it-py@1.0", when="@1.1.0+linkify", type=("build", "run"))
depends_on("py-linkify-it-py@1:2", when="@2.2:+linkify", type=("build", "run"))
# Historical dependencies
depends_on("py-setuptools", when="@:2.0", type="build")
depends_on("py-attrs@19:21", when="@:2.0", type=("build", "run"))


@@ -261,6 +261,12 @@ def config_file(self):
def archive_files(self):
return [os.path.join(self.build_directory, self.config_file)]
def flag_handler(self, name, flags):
if name == "cxxflags":
if self.spec.satisfies("%oneapi"):
flags.append("-Wno-error=register")
return (flags, None, None)
def setup_build_environment(self, env):
include = []
library = []


@@ -14,6 +14,7 @@ class PyMultiqc(PythonPackage):
homepage = "https://multiqc.info"
pypi = "multiqc/multiqc-1.0.tar.gz"
version("1.15", sha256="ce5359a12226cf4ce372c6fdad142cfe2ae7501ffa97ac7aab544ced4db5ea3c")
version("1.14", sha256="dcbba405f0c9521ed2bbd7e8f7a9200643047311c9619878b81d167300149362")
version("1.13", sha256="0564fb0f894e6ca0822a0f860941b3ed2c33dce407395ac0c2103775d45cbfa0")
version("1.7", sha256="02e6a7fac7cd9ed036dcc6c92b8f8bcacbd28983ba6be53afb35e08868bd2d68")
@@ -24,7 +25,7 @@ class PyMultiqc(PythonPackage):
depends_on("python@2.7:", when="@:1.7", type=("build", "run"))
depends_on("python@3:", when="@1.9:", type=("build", "run"))
depends_on("py-setuptools", type="build")
depends_on("py-matplotlib@2.1.1:", type=("build", "run"), when="@:1.13")
depends_on("py-matplotlib@2.1.1:", type=("build", "run"), when="@1.13:")
depends_on("py-matplotlib@2.1.1:2", type=("build", "run"), when="@1.7")
depends_on("py-matplotlib@:2.1.0", type=("build", "run"), when="@1.5")
depends_on("py-matplotlib", type=("build", "run"), when="@:1.3")


@@ -35,3 +35,9 @@ class PyNumcodecs(PythonPackage):
depends_on("py-msgpack", type=("build", "run"), when="+msgpack")
patch("apple-clang-12.patch", when="%apple-clang@12:")
def flag_handler(self, name, flags):
if name == "cflags":
if self.spec.satisfies("%oneapi"):
flags.append("-Wno-error=implicit-function-declaration")
return (flags, None, None)


@@ -0,0 +1,26 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyOpenmim(PythonPackage):
"""MIM Installs OpenMMLab packages"""
homepage = "https://github.com/open-mmlab/mim"
pypi = "openmim/openmim-0.3.9.tar.gz"
version("0.3.9", sha256="b3977b92232b4b8c4d987cbc73e4515826d5543ccd3a66d49fcfc602cc5b3352")
depends_on("py-setuptools", type="build")
depends_on("py-click", type=("build", "run"))
depends_on("py-colorama", type=("build", "run"))
depends_on("py-model-index", type=("build", "run"))
depends_on("py-opendatalab", type=("build", "run"))
depends_on("py-pandas", type=("build", "run"))
depends_on("py-pip@19.3:", type=("build", "run"))
depends_on("py-requests", type=("build", "run"))
depends_on("py-rich", type=("build", "run"))
depends_on("py-tabulate", type=("build", "run"))


@@ -11,11 +11,12 @@ class PyPicmistandard(PythonPackage):
homepage = "https://picmi-standard.github.io"
git = "https://github.com/picmi-standard/picmi.git"
pypi = "picmistandard/picmistandard-0.25.0.tar.gz"
pypi = "picmistandard/picmistandard-0.26.0.tar.gz"
maintainers("ax3l", "dpgrote", "RemiLehe")
version("develop", branch="master")
version("master", branch="master")
version("0.26.0", sha256="b22689f576d064bf0cd8f435621e912359fc2ee9347350eab845d2d36ebb62eb")
version("0.25.0", sha256="3fe6a524822d382e52bfc9d3378249546075d28620969954c5ffb43e7968fb02")
version("0.24.0", sha256="55a82adcc14b41eb612caf0d9e47b0e2a56ffc196a58b41fa0cc395c6924be9a")
version("0.23.2", sha256="2853fcfaf2f226a88bb6063ae564832b7e69965294fd652cd2ac04756fa4599a")


@@ -13,6 +13,7 @@ class PyPycodestyle(PythonPackage):
homepage = "https://github.com/PyCQA/pycodestyle"
pypi = "pycodestyle/pycodestyle-2.8.0.tar.gz"
version("2.11.0", sha256="259bcc17857d8a8b3b4a2327324b79e5f020a13c16074670f9c8c8f872ea76d0")
version("2.10.0", sha256="347187bdb476329d98f695c213d7295a846d1152ff4fe9bacb8a9590b8ee7053")
version("2.9.1", sha256="2c9607871d58c76354b697b42f5d57e1ada7d261c261efac224b664affdc5785")
version("2.9.0", sha256="beaba44501f89d785be791c9462553f06958a221d166c64e1f107320f839acc2")


@@ -12,6 +12,7 @@ class PyPyflakes(PythonPackage):
homepage = "https://github.com/PyCQA/pyflakes"
pypi = "pyflakes/pyflakes-2.4.0.tar.gz"
version("3.1.0", sha256="a0aae034c444db0071aa077972ba4768d40c830d9539fd45bf4cd3f8f6992efc")
version("3.0.1", sha256="ec8b276a6b60bd80defed25add7e439881c19e64850afd9b346283d4165fd0fd")
version("2.5.0", sha256="491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3")
version("2.4.0", sha256="05a85c2872edf37a4ed30b0cce2f6093e1d0581f8c19d7393122da7e25b2b24c")


@@ -21,3 +21,9 @@ class PyRuamelYamlClib(PythonPackage):
# to prevent legacy-install-failure
depends_on("python@:3.9", when="@0.2.0", type=("build", "link", "run"))
depends_on("py-setuptools@28.7.0:", type="build")
def flag_handler(self, name, flags):
if name == "cflags":
if self.spec.satisfies("%oneapi"):
flags.append("-Wno-error=incompatible-function-pointer-types")
return (flags, None, None)


@@ -193,11 +193,16 @@ class PySetuptools(Package, PythonExtension):
)
extends("python")
depends_on("python@3.7:", when="@59.7:", type=("build", "run"))
depends_on("python@3.6:", when="@51:", type=("build", "run"))
depends_on("python@3.5:", when="@45:50", type=("build", "run"))
depends_on("python@2.7:2.8,3.5:", when="@44", type=("build", "run"))
depends_on("python@2.7:2.8,3.4:", when="@:43", type=("build", "run"))
# https://github.com/pypa/setuptools/issues/3661
conflicts("python@3.12:", when="@:67")
depends_on("py-pip", type="build")
def url_for_version(self, version):


@@ -0,0 +1,26 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyTiktoken(PythonPackage):
"""tiktoken is a fast BPE tokeniser for use with OpenAI's models."""
homepage = "https://github.com/openai/tiktoken"
pypi = "tiktoken/tiktoken-0.4.0.tar.gz"
maintainers("meyersbs")
version("0.4.0", sha256="59b20a819969735b48161ced9b92f05dc4519c17be4015cfb73b65270a243620")
# From pyproject.toml
depends_on("py-setuptools@62.4:", type="build")
depends_on("py-setuptools-rust@1.5.2:", type="build")
depends_on("py-wheel", type="build")
depends_on("python@3.8:", type=("build", "run"))
depends_on("py-regex@2022.1.18:", type=("build", "run"))
depends_on("py-requests@2.26.0:", type=("build", "run"))


@@ -16,15 +16,18 @@ class PyTransformers(PythonPackage):
maintainers("adamjstewart")
version("4.31.0", sha256="4302fba920a1c24d3a429a29efff6a63eac03f3f3cf55b55927fc795d01cb273")
version("4.24.0", sha256="486f353a8e594002e48be0e2aba723d96eda839e63bfe274702a4b5eda85559b")
version("4.6.1", sha256="83dbff763b7e7dc57cbef1a6b849655d4fcab6bffdd955c5e8bea12a4f76dc10")
version("2.8.0", sha256="b9f29cdfd39c28f29e0806c321270dea337d6174a7aa60daf9625bf83dbb12ee")
depends_on("python@3.8:", when="@4.31:", type=("build", "run"))
depends_on("python@3.7:", when="@4.24:", type=("build", "run"))
depends_on("python@3.6:", type=("build", "run"))
depends_on("py-setuptools", type="build")
depends_on("py-importlib-metadata", when="@4.6: ^python@:3.7", type=("build", "run"))
depends_on("py-filelock", type=("build", "run"))
depends_on("py-huggingface-hub@0.14.1:0", when="@4.26:", type=("build", "run"))
depends_on("py-huggingface-hub@0.10:0", when="@4.24:", type=("build", "run"))
depends_on("py-huggingface-hub@0.0.8", when="@4.6.1", type=("build", "run"))
depends_on("py-numpy@1.17:", when="@4.6:", type=("build", "run"))
@@ -34,6 +37,7 @@ class PyTransformers(PythonPackage):
depends_on("py-pyyaml@5.1:", when="@4.24:", type=("build", "run"))
depends_on("py-regex@:2019.12.16,2019.12.18:", type=("build", "run"))
depends_on("py-requests", type=("build", "run"))
depends_on("py-safetensors@0.3.1:", when="@4.31:", type=("build", "run"))
depends_on("py-tokenizers@0.11.1:0.11.2,0.11.4:0.13", when="@4.24:", type=("build", "run"))
depends_on("py-tokenizers@0.10.1:0.10", when="@4.6.1", type=("build", "run"))
depends_on("py-tokenizers@0.5.2", when="@2.8.0", type=("build", "run"))


@@ -18,7 +18,7 @@ class PyWarpx(PythonPackage):
"""
homepage = "https://ecp-warpx.github.io"
url = "https://github.com/ECP-WarpX/WarpX/archive/refs/tags/23.07.tar.gz"
url = "https://github.com/ECP-WarpX/WarpX/archive/refs/tags/23.08.tar.gz"
git = "https://github.com/ECP-WarpX/WarpX.git"
maintainers("ax3l", "dpgrote", "RemiLehe")
@@ -27,6 +27,7 @@ class PyWarpx(PythonPackage):
# NOTE: if you update the versions here, also see warpx
version("develop", branch="development")
version("23.08", sha256="67695ff04b83d1823ea621c19488e54ebaf268532b0e5eb4ea8ad293d7ab3ddc")
version("23.07", sha256="511633f94c0d0205013609bde5bbf92a29c2e69f6e69b461b80d09dc25602945")
version("23.06", sha256="75fcac949220c44dce04de581860c9a2caa31a0eee8aa7d49455fa5fc928514b")
version("23.05", sha256="34306a98fdb1f5f44ab4fb92f35966bfccdcf1680a722aa773af2b59a3060d73")
@@ -50,6 +51,7 @@ class PyWarpx(PythonPackage):
variant("mpi", default=True, description="Enable MPI support")
for v in [
"23.08",
"23.07",
"23.06",
"23.05",
@@ -73,24 +75,23 @@ class PyWarpx(PythonPackage):
]:
depends_on("warpx@{0}".format(v), when="@{0}".format(v), type=["build", "link"])
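The loop above keeps py-warpx pinned to the matching warpx release. A hedged sketch of what it expands to (plain Python, shortened version list, no Spack needed):

```python
# Sketch of the pinning loop above: each release string becomes one
# depends_on("warpx@X", when="@X") pair, so e.g. py-warpx@23.08 always
# builds and links against warpx@23.08.
versions = ["23.08", "23.07", "23.06", "23.05"]
pins = [("warpx@{0}".format(v), "@{0}".format(v)) for v in versions]
```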
depends_on("python@3.6:3.9", type=("build", "run"), when="@:21.12")
depends_on("python@3.6:", type=("build", "run"), when="@22.01:")
depends_on("python@3.7:", type=("build", "run"))
depends_on("python@3.8:", type=("build", "run"), when="@23.09:")
depends_on("py-numpy@1.15.0:1", type=("build", "run"))
depends_on("py-mpi4py@2.1.0:", type=("build", "run"), when="+mpi")
depends_on("py-periodictable@1.5:1", type=("build", "run"))
depends_on("py-picmistandard@0.0.18", type=("build", "run"), when="@22.01")
depends_on("py-picmistandard@0.0.19", type=("build", "run"), when="@22.02:22.09")
depends_on("py-picmistandard@0.0.20", type=("build", "run"), when="@22.10:22.11")
depends_on("py-picmistandard@0.0.22", type=("build", "run"), when="@22.12:23.03")
depends_on("py-picmistandard@0.23.2", type=("build", "run"), when="@23.04:23.05")
depends_on("py-picmistandard@0.24.0", type=("build", "run"), when="@23.06")
depends_on("py-picmistandard@0.25.0", type=("build", "run"), when="@23.07:")
depends_on("py-picmistandard@0.24.0", type=("build", "run"), when="@23.06")
depends_on("py-picmistandard@0.23.2", type=("build", "run"), when="@23.04:23.05")
depends_on("py-picmistandard@0.0.22", type=("build", "run"), when="@22.12:23.03")
depends_on("py-picmistandard@0.0.20", type=("build", "run"), when="@22.10:22.11")
depends_on("py-picmistandard@0.0.19", type=("build", "run"), when="@22.02:22.09")
depends_on("py-picmistandard@0.0.18", type=("build", "run"), when="@22.01")
depends_on("py-setuptools@42:", type="build")
# Since we use PYWARPX_LIB_DIR to pull binaries out of the
# 'warpx' spack package, we don't need py-cmake as declared
# depends_on('py-cmake@3.15:3', type='build')
# depends_on('py-cmake@3.18:3', type='build', when='@22.01:')
depends_on("py-wheel", type="build")
depends_on("warpx +lib ~mpi +shared", type=("build", "link"), when="~mpi")
depends_on("warpx +lib +mpi +shared", type=("build", "link"), when="+mpi")


@@ -109,6 +109,9 @@ def flag_handler(self, name, flags):
elif name == "fcflags":
flags.append(self.compiler.fc_pic_flag)
if name == "cflags" or name == "cxxflags":
if spec.satisfies("%oneapi"):
flags.append("-Wno-error=int")
flags.append("-Wno-error=int-conversion")
if "+hdf5" in spec:
# @:4.10 can use up to the 1.10 API
if "@:4.10" in spec:


@@ -79,14 +79,14 @@ def config_options(self):
# Hijack the edit stage to run mconfig.
def edit(self, spec, prefix):
with working_dir(self.build_directory):
confstring = "./mconfig --prefix=%s" % prefix
confstring += " " + " ".join(self.config_options)
_config_options = ["--prefix=%s" % prefix]
_config_options += self.config_options
if "~suid" in spec:
confstring += " --without-suid"
_config_options.append("--without-suid")
if "~network" in spec:
confstring += " --without-network"
configure = Executable(confstring)
configure()
_config_options.append("--without-network")
configure = Executable("./mconfig")
configure(*_config_options)
# Set these for use by MakefilePackage's default build/install methods.
build_targets = ["-C", "builddir", "parallel=False"]
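The refactor above replaces a single concatenated command string with an argv list handed to `Executable`, so each option stays one argument even when the prefix contains spaces or shell metacharacters. A minimal sketch of the argument-building logic, with a hypothetical helper name and parameters:

```python
# Hypothetical helper mirroring the refactored edit() logic above:
# build an argv list instead of one shell-style command string.
def build_mconfig_args(prefix, config_options, suid=True, network=True):
    args = ["--prefix=%s" % prefix]
    args += config_options
    if not suid:
        args.append("--without-suid")
    if not network:
        args.append("--without-network")
    return args


# A prefix with a space remains a single argv element; no quoting needed.
args = build_mconfig_args("/opt/my prefix", ["--verbose"], suid=False)
```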


@@ -24,6 +24,21 @@ class Snappy(CMakePackage):
patch("link_gtest.patch", when="@:1.1.8")
# Version 1.1.9 makes use of an assembler feature that is not necessarily available when the
# __GNUC__ preprocessor macro is defined. Version 1.1.10 switched to the correct macro
# __GCC_ASM_FLAG_OUTPUTS__; we backport that fix to version 1.1.9 by applying the patch from
# the upstream repo (see the commit message of the patch for more details).
patch(
"https://github.com/google/snappy/commit/8dd58a519f79f0742d4c68fbccb2aed2ddb651e8.patch?full_index=1",
sha256="debcdf182c046a30e9afea99ebbff280dd1fbb203e89abce6a05d3d17c587768",
when="@1.1.9",
)
# nvhpc@:22.3 does not know flag '-fno-rtti'
# nvhpc@:22.7 fails to compile snappy.cc: line 126: error: excessive recursion at instantiation
# of class "snappy::<unnamed>::make_index_sequence<
conflicts("@1.1.9:", when="%nvhpc@:22.7")
def cmake_args(self):
return [
self.define("CMAKE_INSTALL_LIBDIR", self.prefix.lib),


@@ -27,6 +27,7 @@ class Sundials(CMakePackage, CudaPackage, ROCmPackage):
# Versions
# ==========================================================================
version("develop", branch="develop")
version("6.6.0", sha256="f90029b8da846c8faff5530fd1fa4847079188d040554f55c1d5d1e04743d29d")
version("6.5.1", sha256="4252303805171e4dbdd19a01e52c1dcfe0dafc599c3cfedb0a5c2ffb045a8a75")
version("6.5.0", sha256="4e0b998dff292a2617e179609b539b511eb80836f5faacf800e688a886288502")
version("6.4.1", sha256="7bf10a8d2920591af3fba2db92548e91ad60eb7241ab23350a9b1bc51e05e8d0")


@@ -82,6 +82,12 @@ class Sz(CMakePackage, AutotoolsPackage):
patch("ctags-only-if-requested.patch", when="@2.1.8.1:2.1.8.3")
def flag_handler(self, name, flags):
if name == "cflags":
if self.spec.satisfies("%oneapi"):
flags.append("-Wno-error=implicit-function-declaration")
return (flags, None, None)
def setup_run_environment(self, env):
if "+hdf5" in self.spec:
env.prepend_path("HDF5_PLUGIN_PATH", self.prefix.lib64)


@@ -108,6 +108,9 @@ class Tasmanian(CMakePackage, CudaPackage, ROCmPackage):
depends_on("magma@2.4.0:", when="+magma @6.0:", type=("build", "run"))
depends_on("magma@2.5.0:", when="+magma @7.0:", type=("build", "run"))
# https://github.com/spack/spack/issues/39536#issuecomment-1685161942
conflicts("^cuda@12", when="@:7.9 +cuda")
conflicts("+magma", when="~cuda~rocm") # currently MAGMA only works with CUDA
conflicts("+cuda", when="+rocm") # can pick CUDA or ROCm, not both


@@ -31,3 +31,11 @@ class Veccore(CMakePackage):
version("0.4.1", sha256="59ffe668c061acde89afb33749f4eb8bab35dd5f6e51f632758794c1a745aabf")
version("0.4.0", sha256="0a38b958c92647c30b5709d17edaf39d241b92b988f1040c0fbe24932b42927e")
version("0.3.2", sha256="d72b03df00f5e94b2d07f78ab3af6d9d956c19e9a1fae07267b48f6fc8d7713f")
variant("vc", default=False, description="Enable Vc backend")
depends_on("vc@1.2.0:", when="@0.2.0: +vc")
depends_on("vc@1.3.3:", when="@0.6.0: +vc")
def cmake_args(self):
return [self.define_from_variant("VC", "vc")]


@@ -17,7 +17,7 @@ class Warpx(CMakePackage):
"""
homepage = "https://ecp-warpx.github.io"
url = "https://github.com/ECP-WarpX/WarpX/archive/refs/tags/23.07.tar.gz"
url = "https://github.com/ECP-WarpX/WarpX/archive/refs/tags/23.08.tar.gz"
git = "https://github.com/ECP-WarpX/WarpX.git"
maintainers("ax3l", "dpgrote", "MaxThevenet", "RemiLehe")
@@ -25,6 +25,7 @@ class Warpx(CMakePackage):
# NOTE: if you update the versions here, also see py-warpx
version("develop", branch="development")
version("23.08", sha256="67695ff04b83d1823ea621c19488e54ebaf268532b0e5eb4ea8ad293d7ab3ddc")
version("23.07", sha256="511633f94c0d0205013609bde5bbf92a29c2e69f6e69b461b80d09dc25602945")
version("23.06", sha256="75fcac949220c44dce04de581860c9a2caa31a0eee8aa7d49455fa5fc928514b")
version("23.05", sha256="34306a98fdb1f5f44ab4fb92f35966bfccdcf1680a722aa773af2b59a3060d73")


@@ -20,6 +20,8 @@ class Whizard(AutotoolsPackage):
maintainers("vvolkl")
version("master", branch="master")
version("3.1.2", sha256="4f706f8ef02a580ae4dba867828691dfe0b3f9f9b8982b617af72eb8cd4c6fa3")
version("3.1.1", sha256="dd48e4e39b8a4990be47775ec6171f89d8147cb2e9e293afc7051a7dbc5a23ef")
version("3.1.0", sha256="9dc5e6d1a25d2fc708625f85010cb81b63559ff02cceb9b35024cf9f426c0ad9")
version("3.0.3", sha256="20f2269d302fc162a6aed8e781b504ba5112ef0711c078cdb08b293059ed67cf")
version("3.0.2", sha256="f1db92cd95a0281f6afbf4ac32ab027670cb97a57ad8f5139c0d1f61593d66ec")


@@ -15,6 +15,7 @@ class Zig(CMakePackage):
maintainers("alalazo")
version("0.11.0", tag="0.11.0")
version("0.10.1", tag="0.10.1")
version("0.9.1", tag="0.9.1", deprecated=True)
@@ -28,6 +29,7 @@ class Zig(CMakePackage):
depends_on("llvm targets=all")
depends_on("llvm@13", when="@0.9.1")
depends_on("llvm@15", when="@0.10.1")
depends_on("llvm@16", when="@0.11.0")
depends_on("git", type="build")
depends_on("ccache")


@@ -12,6 +12,9 @@ class ZlibNg(AutotoolsPackage, CMakePackage):
homepage = "https://github.com/zlib-ng/zlib-ng"
url = "https://github.com/zlib-ng/zlib-ng/archive/2.0.0.tar.gz"
git = "https://github.com/zlib-ng/zlib-ng.git"
maintainers("haampie")
version("2.1.3", sha256="d20e55f89d71991c59f1c5ad1ef944815e5850526c0d9cd8e504eaed5b24491a")
version("2.1.2", sha256="383560d6b00697c04e8878e26c0187b480971a8bce90ffd26a5a7b0f7ecf1a33")


@@ -22,7 +22,9 @@ class Zlib(MakefilePackage, Package):
homepage = "https://zlib.net"
# URL must remain http:// so Spack can bootstrap curl
url = "http://zlib.net/fossils/zlib-1.2.11.tar.gz"
git = "https://github.com/madler/zlib.git"
version("1.3", sha256="ff0ba4c292013dbc27530b3a81e1f9a813cd39de01ca5e0f8bf355702efa593e")
version("1.2.13", sha256="b3a24de97a8fdbc835b9833169501030b8977031bcb54b3b3ac13740f846ab30")
version(
"1.2.12",