Compare commits


6 Commits

Author SHA1 Message Date
Todd Gamblin
37bb0843fd clean up some comments
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-05-20 09:25:20 -07:00
Todd Gamblin
07a8f6235b merge three branches into one 2025-05-20 01:02:49 -07:00
Todd Gamblin
485291ef20 Omit fake edge to root
This saves significant time in comparison

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-05-19 15:59:03 -07:00
Todd Gamblin
8363fbf40f Spec: use short-circuiting, stable comparison
## Background

Spec comparison on develop used a somewhat questionable optimization to
get decent spec comparison performance -- instead of comparing entire spec
DAGs, it put a `hash()` call in `_cmp_iter()` and compared specs by their
runtime hashes. This gets us good performance for abstract specs, which don't
have complex dependencies and for which hashing is cheap. But it makes
the order of specs unstable and hard to reproduce.
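One likely source of the instability: Python salts `str` hashes per process, so any ordering keyed on `hash()` of strings can change from run to run. A standalone demonstration (plain Python, not Spack code):

```python
# Standalone demonstration (not Spack code) that ordering by the built-in
# hash() is unstable across interpreter runs: str hashes are salted per
# process unless PYTHONHASHSEED is pinned.
import os
import subprocess
import sys

SNIPPET = "print(sorted(['zlib', 'cmake', 'hdf5', 'py-black'], key=hash))"

def sorted_by_hash(seed: str) -> str:
    """Run SNIPPET in a fresh interpreter with a fixed hash seed."""
    env = dict(os.environ, PYTHONHASHSEED=seed)
    result = subprocess.run(
        [sys.executable, "-c", SNIPPET], env=env, capture_output=True, text=True
    )
    return result.stdout

# the same seed is reproducible; different seeds usually give different orders
assert sorted_by_hash("1") == sorted_by_hash("1")
print(sorted_by_hash("1"), sorted_by_hash("2"))
```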

We really need to do a full, consistent traversal over specs to compare
and to get a stable ordering. Simply taking the hash out and yielding
dependencies recursively (i.e. yielding `dep.spec._cmp_iter()` instead
of a hash) goes exponential for concrete specs because it explores all
paths. Traversal tracks visited nodes, but it's expensive to set up
the data structures for that, and it can slow down comparison of simple
abstract specs. Abstract spec comparison performance is important for
concretization (specifically setup), so we don't want to do that.
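A toy illustration of the blowup (standalone Python, not Spack's data structures): in a chain of "diamonds" where each level's two nodes share one child, a recursive walk without a visited set doubles its work per level, while a visited-set walk stays linear in the number of nodes:

```python
# Toy DAG: n0 -> a0, b0 -> n1 -> a1, b1 -> n2 -> ... (a_k and b_k share n_{k+1})
def make_diamond_chain(depth):
    """Return the root of a diamond-chain DAG of the given depth."""
    node = {"name": f"n{depth}", "deps": []}
    for level in range(depth - 1, -1, -1):
        a = {"name": f"a{level}", "deps": [node]}
        b = {"name": f"b{level}", "deps": [node]}
        node = {"name": f"n{level}", "deps": [a, b]}
    return node

def naive_visits(node):
    """Count node visits without a visited set -- exponential in depth."""
    return 1 + sum(naive_visits(d) for d in node["deps"])

def tracked_visits(node, seen=None):
    """Count node visits with a visited set -- linear in DAG size."""
    seen = set() if seen is None else seen
    if id(node) in seen:
        return 0
    seen.add(id(node))
    return 1 + sum(tracked_visits(d, seen) for d in node["deps"])

root = make_diamond_chain(10)
print(naive_visits(root), tracked_visits(root))  # 4093 31
```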

## New comparison algorithm

We can have (mostly) the best of both worlds -- it's just a bit more
complicated.

This changes Spec comparison to do a proper, stable graph comparison:

1. Spec comparison will now short-circuit whenever possible for concrete
   specs, when DAG hashes are known to be equal or not equal. This means
   that concrete spec `==` and `!=` comparisons will no longer have
   to traverse the whole DAG.

2. Spec comparison now traverses the graph consistently, comparing nodes
   and edges in breadth-first order. This means Spec sort order is stable,
   and it won't vary arbitrarily from run to run.

3. Traversal can be expensive, so we avoid it for simple specs. Specifically,
   if a spec has no dependencies, or if its dependencies have no dependencies,
   we avoid calling `traverse_edges()` by doing some special casing.

The `_cmp_iter` method for `Spec` now iterates over the DAG and yields nodes
in BFS order. While it does that, it generates consistent ids for each node,
based on traversal order. It then outputs edges in terms of these ids, along with
their depflags and virtuals, so that all parts of the Spec DAG are included.
The resulting total ordering of specs keys on node attributes first, then
dependency nodes, then any edge differences between graphs.

Optimized cases skip the id generation and traversal, since we know the
order and therefore the ids in advance.
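The id scheme can be sketched outside of Spack with a plain adjacency dict (hypothetical `canonical_form` helper; names and structure are illustrative, not Spack's API). The root always gets id 0, new nodes get ids in BFS discovery order, and shared nodes keep a single id even when several edges point at them:

```python
# Sketch of consistent-id generation by BFS order (simplified stand-in for
# Spec._cmp_iter; edges here are (parent_id, child_id) pairs without
# depflags/virtuals).
import collections

def canonical_form(root, deps):
    """Yield node names in BFS order and edges as (parent_id, child_id)."""
    node_ids = collections.defaultdict(lambda: len(node_ids))
    node_ids[root]  # root is always 0
    nodes, edges = [root], []
    queue = collections.deque([root])
    while queue:
        parent = queue.popleft()
        for child in sorted(deps.get(parent, [])):
            if child not in node_ids:  # first encounter assigns the next id
                nodes.append(child)
                queue.append(child)
            edges.append((node_ids[parent], node_ids[child]))
    return nodes, edges

# a -> b, c; b -> d; c -> d  (d is shared and gets a single id)
deps = {"a": ["b", "c"], "b": ["d"], "c": ["d"]}
print(canonical_form("a", deps))
# (['a', 'b', 'c', 'd'], [(0, 1), (0, 2), (1, 3), (2, 3)])
```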

## Performance ramifications

### Abstract specs

This seems to add around 7-8% overhead to concretization setup time. It's
worth the cost, because this enables concretization caching (as input to
concretization was previously not stable) and setup will eventually be
parallelized, at which point it will no longer be a bottleneck for solving.
Together those two optimizations will cut well over 50% of the time (likely
closer to 90+%) off of most solves.

### Concrete specs

Comparison for concrete specs is faster than before, sometimes *way* faster
because comparison is now guaranteed to be linear time w.r.t. DAG size.
Times for comparing concrete Specs:

```python
import timeit

import spack.spec

def compare(json):
    a = spack.spec.Spec(json)
    b = spack.spec.Spec(json)
    print(a == b)
    print(timeit.timeit(lambda: a == b, number=1))

compare("./py-black.json")
compare("./hdf5.json")
```

* `develop` (uses our prior hash optimization):
  * `py-black`: 7.013e-05s
  * `hdf5`: 6.445e-05s

* `develop` with full traversal and no hash:
  * `py-black`: 3.955s
  * `hdf5`: 0.0122s

* This branch (full traversal, stable, short-circuiting, no hash):
  * `py-black`: 2.208e-06s
  * `hdf5`: 3.416e-06s

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-05-19 15:59:03 -07:00
Todd Gamblin
e4b898f7c3 lazy_lexicographic_ordering: Add short-circuiting with _cmp_fast_eq
Add a `_cmp_fast_eq` method to the `lazy_lexicographic_ordering` protocol
so that implementors can short-circuit full comparison if they know (e.g.
with a hash) that objects are equal (or not).

`_cmp_fast_eq` can return True (known true), False (known false), or
None (unknown -- do full comparison).
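A minimal sketch of the protocol (hypothetical `Blob` class in plain Python, without the actual `lazy_lexicographic_ordering` decorator): a cached cryptographic hash answers inequality immediately, identity answers equality, and equal hashes fall back to a full comparison:

```python
# Illustrative three-valued fast-equality check: True / False short-circuit,
# None means "unknown, do the full comparison".
import hashlib
from typing import Optional

class Blob:
    def __init__(self, data: bytes):
        self.data = data
        self._hash = hashlib.sha256(data).hexdigest()

    def _cmp_fast_eq(self, other: "Blob") -> Optional[bool]:
        if self is other:
            return True           # known equal
        if self._hash != other._hash:
            return False          # known not equal
        return None               # unknown: caller must compare fully

    def __eq__(self, other):
        fast = self._cmp_fast_eq(other)
        if fast is not None:
            return fast
        return self.data == other.data  # full comparison

a, b = Blob(b"spack"), Blob(b"spack")
print(a == b, a == Blob(b"other"))  # True False
```

Here equal hashes are treated as "unknown" rather than "known equal"; an implementor confident in a collision-resistant hash (as with `dag_hash()` on concrete specs) could return `True` there instead.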

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-05-19 15:59:03 -07:00
Todd Gamblin
a5f1a4ea81 Spec: Remove unused methods on _EdgeMap
`_EdgeMap` implements `_cmp_iter` for `lazy_lexicographic_ordering` but it's never used
or tested. Remove it.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-05-19 15:59:02 -07:00
76 changed files with 3079 additions and 944 deletions

View File

@@ -3744,6 +3744,9 @@ the build system. The build systems currently supported by Spack are:
| :class:`~spack_repo.builtin.build_systems.ruby` | Specialized build system for |
| | Ruby extensions |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack_repo.builtin.build_systems.intel` | Specialized build system for |
| | licensed Intel software |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack_repo.builtin.build_systems.oneapi` | Specialized build system for |
| | Intel oneAPI software |
+----------------------------------------------------------+----------------------------------+

View File

@@ -67,7 +67,7 @@ def index_by(objects, *funcs):
}
If any elements in funcs is a string, it is treated as the name
of an attribute, and acts like getattr(object, name). So
of an attribute, and acts like ``getattr(object, name)``. So
shorthand for the above two indexes would be::
index1 = index_by(list_of_specs, 'arch', 'compiler')
@@ -77,7 +77,8 @@ def index_by(objects, *funcs):
index1 = index_by(list_of_specs, ('target', 'compiler'))
Keys in the resulting dict will look like ('gcc', 'skylake').
Keys in the resulting dict will look like ``('gcc', 'skylake')``.
"""
if not funcs:
return objects
@@ -315,7 +316,9 @@ def lazy_lexicographic_ordering(cls, set_hash=True):
This is a lazy version of the tuple comparison used frequently to
implement comparison in Python. Given some objects with fields, you
might use tuple keys to implement comparison, e.g.::
might use tuple keys to implement comparison, e.g.:
.. code-block:: python
class Widget:
def _cmp_key(self):
@@ -343,7 +346,9 @@ def __lt__(self):
Lazy lexicographic comparison maps the tuple comparison shown above
to generator functions. Instead of comparing based on pre-constructed
tuple keys, users of this decorator can compare using elements from a
generator. So, you'd write::
generator. So, you'd write:
.. code-block:: python
@lazy_lexicographic_ordering
class Widget:
@@ -366,6 +371,38 @@ def cd_fun():
only has to worry about writing ``_cmp_iter``, and making sure the
elements in it are also comparable.
In some cases, you may have a fast way to determine whether two
objects are equal, e.g. the ``is`` function or an already-computed
cryptographic hash. For this, you can implement your own
``_cmp_fast_eq`` function:
.. code-block:: python
@lazy_lexicographic_ordering
class Widget:
def _cmp_iter(self):
yield a
yield b
def cd_fun():
yield c
yield d
yield cd_fun
yield e
def _cmp_fast_eq(self, other):
return self is other or None
``_cmp_fast_eq`` should return:
* ``True`` if ``self`` is equal to ``other``,
* ``False`` if ``self`` is not equal to ``other``, and
* ``None`` if it's not known whether they are equal, and the full
comparison should be done.
``lazy_lexicographic_ordering`` uses ``_cmp_fast_eq`` to short-circuit
the comparison if the answer can be determined quickly. If you do not
implement it, it defaults to ``self is other or None``.
Some things to note:
* If a class already has ``__eq__``, ``__ne__``, ``__lt__``,
@@ -386,34 +423,40 @@ def cd_fun():
if not hasattr(cls, "_cmp_iter"):
raise TypeError(f"'{cls.__name__}' doesn't define _cmp_iter().")
# get an equal operation that allows us to short-circuit comparison
# if it's not provided, default to `is`
_cmp_fast_eq = getattr(cls, "_cmp_fast_eq", lambda x, y: x is y or None)
# comparison operators are implemented in terms of lazy_eq and lazy_lt
def eq(self, other):
if self is other:
return True
fast_eq = _cmp_fast_eq(self, other)
if fast_eq is not None:
return fast_eq
return (other is not None) and lazy_eq(self._cmp_iter, other._cmp_iter)
def lt(self, other):
if self is other:
if _cmp_fast_eq(self, other) is True:
return False
return (other is not None) and lazy_lt(self._cmp_iter, other._cmp_iter)
def ne(self, other):
if self is other:
return False
fast_eq = _cmp_fast_eq(self, other)
if fast_eq is not None:
return not fast_eq
return (other is None) or not lazy_eq(self._cmp_iter, other._cmp_iter)
def gt(self, other):
if self is other:
if _cmp_fast_eq(self, other) is True:
return False
return (other is None) or lazy_lt(other._cmp_iter, self._cmp_iter)
def le(self, other):
if self is other:
if _cmp_fast_eq(self, other) is True:
return True
return (other is not None) and not lazy_lt(other._cmp_iter, self._cmp_iter)
def ge(self, other):
if self is other:
if _cmp_fast_eq(self, other) is True:
return True
return (other is None) or not lazy_lt(self._cmp_iter, other._cmp_iter)

View File

@@ -23,7 +23,7 @@
_BUILDERS: Dict[int, "Builder"] = {}
def register_builder(build_system_name: str):
def builder(build_system_name: str):
"""Class decorator used to register the default builder
for a given build-system.

View File

@@ -49,7 +49,7 @@
from llnl.util.symlink import symlink
from spack.build_environment import MakeExecutable
from spack.builder import BaseBuilder, Builder, register_builder
from spack.builder import BaseBuilder
from spack.config import determine_number_of_jobs
from spack.deptypes import ALL_TYPES as all_deptypes
from spack.directives import (
@@ -81,13 +81,7 @@
)
from spack.mixins import filter_compiler_wrappers
from spack.multimethod import default_args, when
from spack.package_base import (
PackageBase,
build_system_flags,
env_flags,
inject_flags,
on_package_attributes,
)
from spack.package_base import build_system_flags, env_flags, inject_flags, on_package_attributes
from spack.package_completions import (
bash_completion_path,
fish_completion_path,
@@ -222,9 +216,6 @@ class tty:
"cd",
"pwd",
"tty",
"Builder",
"PackageBase",
"register_builder",
]
# These are just here for editor support; they may be set when the build env is set up.

View File

@@ -3,7 +3,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import ast
import difflib
import os
import re
import shutil
@@ -83,8 +82,7 @@ def migrate_v1_to_v2(
errors = False
stack: List[Tuple[str, int]] = [(repo.packages_path, 0)]
stack: List[Tuple[str, int]] = [(repo.root, 0)]
while stack:
path, depth = stack.pop()
@@ -114,7 +112,11 @@ def migrate_v1_to_v2(
continue
# check if this is a package
if depth == 0 and os.path.exists(os.path.join(entry.path, "package.py")):
if (
depth == 1
and rel_path.startswith(f"{subdirectory}{os.sep}")
and os.path.exists(os.path.join(entry.path, "package.py"))
):
if "_" in entry.name:
print(
f"Invalid package name '{entry.name}': underscores are not allowed in "
@@ -142,7 +144,7 @@ def migrate_v1_to_v2(
rename_regex = re.compile("^(" + "|".join(re.escape(k) for k in rename.keys()) + ")")
if fix:
os.makedirs(os.path.join(new_root, repo.subdirectory), exist_ok=True)
os.makedirs(new_root, exist_ok=True)
def _relocate(rel_path: str) -> Tuple[str, str]:
old = os.path.join(repo.root, rel_path)
@@ -221,16 +223,6 @@ def _relocate(rel_path: str) -> Tuple[str, str]:
return result, (updated_repo if fix else None)
def _spack_pkg_to_spack_repo(modulename: str) -> str:
# rewrite spack.pkg.builtin.foo -> spack_repo.builtin.packages.foo.package
parts = modulename.split(".")
assert parts[:2] == ["spack", "pkg"]
parts[0:2] = ["spack_repo"]
parts.insert(2, "packages")
parts.append("package")
return ".".join(parts)
def migrate_v2_imports(
packages_dir: str, root: str, fix: bool, out: IO[str] = sys.stdout, err: IO[str] = sys.stderr
) -> bool:
@@ -255,6 +247,7 @@ def migrate_v2_imports(
"Package": "spack_repo.builtin.build_systems.generic",
"GNUMirrorPackage": "spack_repo.builtin.build_systems.gnu",
"GoPackage": "spack_repo.builtin.build_systems.go",
"IntelPackage": "spack_repo.builtin.build_systems.intel",
"LuaPackage": "spack_repo.builtin.build_systems.lua",
"MakefilePackage": "spack_repo.builtin.build_systems.makefile",
"MavenPackage": "spack_repo.builtin.build_systems.maven",
@@ -307,41 +300,12 @@ def migrate_v2_imports(
#: Set of symbols of interest that are already defined through imports, assignments, or
#: function definitions.
defined_symbols: Set[str] = set()
best_line: Optional[int] = None
seen_import = False
module_replacements: Dict[str, str] = {}
parent: Dict[int, ast.AST] = {}
#: List of (line, col start, old, new) tuples of strings to be replaced inline.
inline_updates: List[Tuple[int, int, str, str]] = []
#: List of (line from, line to, new lines) tuples of line replacements
multiline_updates: List[Tuple[int, int, List[str]]] = []
with open(pkg_path, "r", encoding="utf-8", newline="") as file:
original_lines = file.readlines()
if len(original_lines) < 2: # assume packagepy files have at least 2 lines...
continue
if original_lines[0].endswith("\r\n"):
newline = "\r\n"
elif original_lines[0].endswith("\n"):
newline = "\n"
elif original_lines[0].endswith("\r"):
newline = "\r"
else:
success = False
print(f"{pkg_path}: unknown line ending, cannot fix", file=err)
continue
updated_lines = original_lines.copy()
for node in ast.walk(tree):
for child in ast.iter_child_nodes(node):
if isinstance(child, ast.Attribute):
parent[id(child)] = node
# Get the last import statement from the first block of top-level imports
if isinstance(node, ast.Module):
for child in ast.iter_child_nodes(node):
@@ -361,7 +325,7 @@ def migrate_v2_imports(
if is_import:
if isinstance(child, (ast.stmt, ast.expr)):
best_line = (getattr(child, "end_lineno", None) or child.lineno) + 1
best_line = (child.end_lineno or child.lineno) + 1
if not seen_import and is_import:
seen_import = True
@@ -390,89 +354,12 @@ def migrate_v2_imports(
elif isinstance(node, ast.Name) and node.id in symbol_to_module:
referenced_symbols.add(node.id)
# Find lines where spack.pkg is used.
elif (
isinstance(node, ast.Attribute)
and isinstance(node.value, ast.Name)
and node.value.id == "spack"
and node.attr == "pkg"
):
# go as many attrs up until we reach a known module name to be replaced
known_module = "spack.pkg"
ancestor = node
while True:
next_parent = parent.get(id(ancestor))
if next_parent is None or not isinstance(next_parent, ast.Attribute):
break
ancestor = next_parent
known_module = f"{known_module}.{ancestor.attr}"
if known_module in module_replacements:
break
inline_updates.append(
(
ancestor.lineno,
ancestor.col_offset,
known_module,
module_replacements[known_module],
)
)
# Register imported symbols to make this operation idempotent
elif isinstance(node, ast.ImportFrom):
# Keep track of old style spack.pkg imports, to be replaced.
if node.module and node.module.startswith("spack.pkg.") and node.level == 0:
depth = node.module.count(".")
# simple case of find and replace
# from spack.pkg.builtin.my_pkg import MyPkg
# -> from spack_repo.builtin.packages.my_pkg.package import MyPkg
if depth == 3:
module_replacements[node.module] = _spack_pkg_to_spack_repo(node.module)
inline_updates.append(
(
node.lineno,
node.col_offset,
node.module,
module_replacements[node.module],
)
)
# non-trivial possible multiline case
# from spack.pkg.builtin import (boost, cmake as foo)
# -> import spack_repo.builtin.packages.boost.package as boost
# -> import spack_repo.builtin.packages.cmake.package as foo
elif depth == 2 and node.end_lineno is not None:
_, _, namespace = node.module.rpartition(".")
indent = original_lines[node.lineno - 1][: node.col_offset]
multiline_updates.append(
(
node.lineno,
node.end_lineno + 1,
[
f"{indent}import spack_repo.{namespace}.packages."
f"{alias.name}.package as {alias.asname or alias.name}"
f"{newline}"
for alias in node.names
],
)
)
else:
success = False
print(
f"{pkg_path}:{node.lineno}: don't know how to rewrite `{node.module}`",
file=err,
)
# Subtract the symbols that are imported so we don't repeatedly add imports.
for alias in node.names:
if alias.name in symbol_to_module:
if alias.asname is None:
defined_symbols.add(alias.name)
# error when symbols are explicitly imported that are no longer available
if node.module == "spack.package" and node.level == 0:
defined_symbols.add(alias.name)
if node.module == "spack.package":
success = False
print(
f"{pkg_path}:{node.lineno}: `{alias.name}` is imported from "
@@ -483,84 +370,59 @@ def migrate_v2_imports(
if alias.asname and alias.asname in symbol_to_module:
defined_symbols.add(alias.asname)
elif isinstance(node, ast.Import):
# normal imports are easy find and replace since they are single lines.
for alias in node.names:
if alias.asname and alias.asname in symbol_to_module:
defined_symbols.add(alias.name)
elif alias.asname is None and alias.name.startswith("spack.pkg."):
module_replacements[alias.name] = _spack_pkg_to_spack_repo(alias.name)
inline_updates.append(
(
alias.lineno,
alias.col_offset,
alias.name,
module_replacements[alias.name],
)
)
# Remove imported symbols from the referenced symbols
referenced_symbols.difference_update(defined_symbols)
# Sort from last to first so we can modify without messing up the line / col offsets
inline_updates.sort(reverse=True)
# Nothing to change here.
if not inline_updates and not referenced_symbols:
if not referenced_symbols:
continue
# First do module replacements of spack.pkg imports
for line, col, old, new in inline_updates:
updated_lines[line - 1] = updated_lines[line - 1][:col] + updated_lines[line - 1][
col:
].replace(old, new, 1)
if best_line is None:
print(f"{pkg_path}: failed to update imports", file=err)
success = False
continue
# Then insert new imports for symbols referenced in the package
if referenced_symbols:
if best_line is None:
print(f"{pkg_path}: failed to update imports", file=err)
success = False
continue
# Add the missing imports right after the last import statement
with open(pkg_path, "r", encoding="utf-8", newline="") as file:
lines = file.readlines()
# Group missing symbols by their module
missing_imports_by_module: Dict[str, list] = {}
for symbol in referenced_symbols:
module = symbol_to_module[symbol]
if module not in missing_imports_by_module:
missing_imports_by_module[module] = []
missing_imports_by_module[module].append(symbol)
# Group missing symbols by their module
missing_imports_by_module: Dict[str, list] = {}
for symbol in referenced_symbols:
module = symbol_to_module[symbol]
if module not in missing_imports_by_module:
missing_imports_by_module[module] = []
missing_imports_by_module[module].append(symbol)
new_lines = [
f"from {module} import {', '.join(sorted(symbols))}{newline}"
for module, symbols in sorted(missing_imports_by_module.items())
]
new_lines = [
f"from {module} import {', '.join(sorted(symbols))}\n"
for module, symbols in sorted(missing_imports_by_module.items())
]
if not seen_import:
new_lines.extend((newline, newline))
if not seen_import:
new_lines.extend(("\n", "\n"))
multiline_updates.append((best_line, best_line, new_lines))
multiline_updates.sort(reverse=True)
for start, end, new_lines in multiline_updates:
updated_lines[start - 1 : end - 1] = new_lines
if not fix:
if not fix: # only print the diff
success = False # packages need to be fixed, but we didn't do it
diff_start, diff_end = max(1, best_line - 3), min(best_line + 2, len(lines))
num_changed = diff_end - diff_start + 1
num_added = num_changed + len(new_lines)
rel_pkg_path = os.path.relpath(pkg_path, start=root)
diff = difflib.unified_diff(
original_lines,
updated_lines,
n=3,
fromfile=f"a/{rel_pkg_path}",
tofile=f"b/{rel_pkg_path}",
)
out.write("".join(diff))
out.write(f"--- a/{rel_pkg_path}\n+++ b/{rel_pkg_path}\n")
out.write(f"@@ -{diff_start},{num_changed} +{diff_start},{num_added} @@\n")
for line in lines[diff_start - 1 : best_line - 1]:
out.write(f" {line}")
for line in new_lines:
out.write(f"+{line}")
for line in lines[best_line - 1 : diff_end]:
out.write(f" {line}")
continue
lines[best_line - 1 : best_line - 1] = new_lines
tmp_file = pkg_path + ".tmp"
# binary mode to avoid newline conversion issues; utf-8 was already required upon read.
with open(tmp_file, "wb") as file:
file.write("".join(updated_lines).encode("utf-8"))
with open(tmp_file, "w", encoding="utf-8", newline="") as file:
file.writelines(lines)
os.replace(tmp_file, pkg_path)

View File

@@ -961,7 +961,6 @@ def _sort_by_dep_types(dspec: DependencySpec):
return dspec.depflag
@lang.lazy_lexicographic_ordering
class _EdgeMap(collections.abc.Mapping):
"""Represent a collection of edges (DependencySpec objects) in the DAG.
@@ -999,21 +998,6 @@ def add(self, edge: DependencySpec) -> None:
def __str__(self) -> str:
return f"{{deps: {', '.join(str(d) for d in sorted(self.values()))}}}"
def _cmp_iter(self):
for item in sorted(itertools.chain.from_iterable(self.edges.values())):
yield item
def copy(self):
"""Copies this object and returns a clone"""
clone = type(self)()
clone.store_by_child = self.store_by_child
# Copy everything from this dict into it.
for dspec in itertools.chain.from_iterable(self.values()):
clone.add(dspec.copy())
return clone
def select(
self,
*,
@@ -3785,26 +3769,152 @@ def eq_node(self, other):
"""Equality with another spec, not including dependencies."""
return (other is not None) and lang.lazy_eq(self._cmp_node, other._cmp_node)
def _cmp_iter(self):
"""Lazily yield components of self for comparison."""
for item in self._cmp_node():
yield item
def _cmp_fast_eq(self, other) -> Optional[bool]:
"""Short-circuit compare with other for equality, for lazy_lexicographic_ordering."""
# If there is ever a breaking change to hash computation, whether accidental or purposeful,
# two specs can be identical modulo DAG hash, depending on when they were concretized.
# From the perspective of many operations in Spack (database, build cache, etc.) a different
# DAG hash means a different spec. Here we ensure that two otherwise identical specs, one
# serialized before the hash change and one after, are considered different.
yield self.dag_hash() if self.concrete else None
if self is other:
return True
def deps():
for dep in sorted(itertools.chain.from_iterable(self._dependencies.values())):
yield dep.spec.name
yield dep.depflag
yield hash(dep.spec)
if self.concrete and other and other.concrete:
return self.dag_hash() == other.dag_hash()
yield deps
return None
def _cmp_iter(self):
"""Lazily yield components of self for comparison."""
# Spec comparison in Spack needs to be fast, so there are several cases here for
# performance. The main places we care about this are:
#
# * Abstract specs: there are lots of abstract specs in package.py files,
# which are put into metadata dictionaries and sorted during concretization
# setup. We want comparing abstract specs to be fast.
#
# * Concrete specs: concrete specs are bigger and have lots of nodes and
# edges. Because of the graph complexity, we need a full, linear time
# traversal to compare them -- that's pretty much is unavoidable. But they
# also have precoputed cryptographic hashes (dag_hash()), which we can use
# to do fast equality comparison. See _cmp_fast_eq() above for the
# short-circuit logic for hashes.
#
# A full traversal involves constructing data structures, visitor objects, etc.,
# and it can be expensive if we have to do it to compare a bunch of tiny
# abstract specs. Therefore, there are 3 cases below, which avoid calling
# `spack.traverse.traverse_edges()` unless necessary.
#
# WARNING: the cases below need to be consistent, so don't mess with this code
# unless you really know what you're doing. Be sure to keep all three consistent.
#
# All cases lazily yield:
#
# 1. A generator over nodes
# 2. A generator over canonical edges
#
# Canonical edges have consistent ids defined by breadth-first traversal order. That is,
# the root is always 0, dependencies of the root are 1, 2, 3, etc., and so on.
#
# The three cases are:
#
# 1. Spec has no dependencies
# * We can avoid any traversal logic and just yield this node's _cmp_node generator.
#
# 2. Spec has dependencies, but dependencies have no dependencies.
# * We need to sort edges, but we don't need to track visited nodes, which
# can save us the cost of setting up all the tracking data structures
# `spack.traverse` uses.
#
# 3. Spec has dependencies that have dependencies.
# * In this case, the spec is *probably* concrete. Equality comparisons
# will be short-circuited by dag_hash(), but other comparisons will need
# to lazily enumerate components of the spec. The traversal logic is
# unavoidable.
#
# TODO: consider reworking `spack.traverse` to construct fewer data structures
# and objects, as this would make all traversals faster and could eliminate the
# need for the complexity here. It was not clear at the time of writing how
# much optimization was possible in `spack.traverse`.
sorted_l1_edges = None
edge_list = None
node_ids = None
def nodes():
nonlocal sorted_l1_edges
nonlocal edge_list
nonlocal node_ids
# Level 0: root node
yield self._cmp_node # always yield the root (this node)
if not self._dependencies: # done if there are no dependencies
return
# Level 1: direct dependencies
# we can yield these in sorted order without tracking visited nodes
deps_have_deps = False
sorted_l1_edges = self.edges_to_dependencies(depflag=dt.ALL)
if len(sorted_l1_edges) > 1:
sorted_l1_edges = spack.traverse.sort_edges(sorted_l1_edges)
for edge in sorted_l1_edges:
yield edge.spec._cmp_node
if edge.spec._dependencies:
deps_have_deps = True
if not deps_have_deps: # done if level 1 specs have no dependencies
return
# Level 2: dependencies of direct dependencies
# now it's general; we need full traverse() to track visited nodes
l1_specs = [edge.spec for edge in sorted_l1_edges]
# the node_ids dict generates consistent ids based on BFS traversal order
# these are used to identify edges later
node_ids = collections.defaultdict(lambda: len(node_ids))
node_ids[id(self)] # self is 0
for spec in l1_specs:
node_ids[id(spec)] # l1 starts at 1
edge_list = []
for edge in spack.traverse.traverse_edges(
l1_specs, order="breadth", cover="edges", root=False, visited=set([0])
):
# yield each node only once, and generate a consistent id for it the
# first time it's encountered.
if id(edge.spec) not in node_ids:
yield edge.spec._cmp_node
node_ids[id(edge.spec)]
if edge.parent is None: # skip fake edge to root
continue
edge_list.append(
(
node_ids[id(edge.parent)],
node_ids[id(edge.spec)],
edge.depflag,
edge.virtuals,
)
)
def edges():
# no edges in single-node graph
if not self._dependencies:
return
# level 1 edges all start with zero
for i, edge in enumerate(sorted_l1_edges, start=1):
yield (0, i, edge.depflag, edge.virtuals)
# yield remaining edges in the order they were encountered during traversal
if edge_list:
yield from edge_list
yield nodes
yield edges
@property
def namespace_if_anonymous(self):

View File

@@ -95,47 +95,24 @@ class _7zip(Package):
pass
"""
# this is written like this to be explicit about line endings and indentation
OLD_NUMPY = (
b"# some comment\r\n"
b"\r\n"
b"import spack.pkg.builtin.foo, spack.pkg.builtin.bar\r\n"
b"from spack.package import *\r\n"
b"from something.unrelated import AutotoolsPackage\r\n"
b"\r\n"
b"if True:\r\n"
b"\tfrom spack.pkg.builtin import (\r\n"
b"\t\tfoo,\r\n"
b"\t\tbar as baz,\r\n"
b"\t)\r\n"
b"\r\n"
b"class PyNumpy(CMakePackage, AutotoolsPackage):\r\n"
b"\tgenerator('ninja')\r\n"
b"\r\n"
b"\tdef example(self):\r\n"
b"\t\t# unchanged comment: spack.pkg.builtin.foo.something\r\n"
b"\t\treturn spack.pkg.builtin.foo.example(), foo, baz\r\n"
)
OLD_NUMPY = b"""\
# some comment
NEW_NUMPY = (
b"# some comment\r\n"
b"\r\n"
b"import spack_repo.builtin.packages.foo.package, spack_repo.builtin.packages.bar.package\r\n"
b"from spack_repo.builtin.build_systems.cmake import CMakePackage, generator\r\n"
b"from spack.package import *\r\n"
b"from something.unrelated import AutotoolsPackage\r\n"
b"\r\n"
b"if True:\r\n"
b"\timport spack_repo.builtin.packages.foo.package as foo\r\n"
b"\timport spack_repo.builtin.packages.bar.package as baz\r\n"
b"\r\n"
b"class PyNumpy(CMakePackage, AutotoolsPackage):\r\n"
b"\tgenerator('ninja')\r\n"
b"\r\n"
b"\tdef example(self):\r\n"
b"\t\t# unchanged comment: spack.pkg.builtin.foo.something\r\n"
b"\t\treturn spack_repo.builtin.packages.foo.package.example(), foo, baz\r\n"
)
from spack.package import *
class PyNumpy(CMakePackage):
generator("ninja")
"""
NEW_NUMPY = b"""\
# some comment
from spack_repo.builtin.build_systems.cmake import CMakePackage, generator
from spack.package import *
class PyNumpy(CMakePackage):
generator("ninja")
"""
def test_repo_migrate(tmp_path: pathlib.Path, config):
@@ -165,6 +142,7 @@ def test_repo_migrate(tmp_path: pathlib.Path, config):
assert pkg_py_numpy_new.read_bytes() == NEW_NUMPY
@pytest.mark.not_on_windows("Known failure on windows")
def test_migrate_diff(git: Executable, tmp_path: pathlib.Path):
root, _ = spack.repo.create_repo(str(tmp_path), "foo", package_api=(2, 0))
r = pathlib.Path(root)

View File

@@ -6,6 +6,8 @@
import pytest
import llnl.util.lang
import spack.concretize
import spack.deptypes as dt
import spack.directives
@@ -2013,6 +2015,137 @@ def test_comparison_multivalued_variants():
assert Spec("x=a") < Spec("x=a,b") < Spec("x==a,b") < Spec("x==a,b,c")
@pytest.mark.parametrize(
"specs_in_expected_order",
[
("a", "b", "c", "d", "e"),
("a@1.0", "a@2.0", "b", "c@3.0", "c@4.0"),
("a^d", "b^c", "c^b", "d^a"),
("e^a", "e^b", "e^c", "e^d"),
("e^a@1.0", "e^a@2.0", "e^a@3.0", "e^a@4.0"),
("e^a@1.0 +a", "e^a@1.0 +b", "e^a@1.0 +c", "e^a@1.0 +c"),
("a^b%c", "a^b%d", "a^b%e", "a^b%f"),
("a^b%c@1.0", "a^b%c@2.0", "a^b%c@3.0", "a^b%c@4.0"),
("a^b%c@1.0 +a", "a^b%c@1.0 +b", "a^b%c@1.0 +c", "a^b%c@1.0 +d"),
("a cflags=-O1", "a cflags=-O2", "a cflags=-O3"),
("a %cmake@1.0 ^b %cmake@2.0", "a %cmake@2.0 ^b %cmake@1.0"),
("a^b^c^d", "a^b^c^e", "a^b^c^f"),
("a^b^c^d", "a^b^c^e", "a^b^c^e", "a^b^c^f"),
("a%b%c%d", "a%b%c%e", "a%b%c%e", "a%b%c%f"),
("d.a", "c.b", "b.c", "a.d"), # names before namespaces
],
)
def test_spec_ordering(specs_in_expected_order):
specs_in_expected_order = [Spec(s) for s in specs_in_expected_order]
assert sorted(specs_in_expected_order) == specs_in_expected_order
assert sorted(reversed(specs_in_expected_order)) == specs_in_expected_order
for i in range(len(specs_in_expected_order) - 1):
lhs, rhs = specs_in_expected_order[i : i + 2]
assert lhs <= rhs
assert (lhs < rhs and lhs != rhs) or lhs == rhs
assert rhs >= lhs
assert (rhs > lhs and rhs != lhs) or rhs == lhs
EMPTY_VER = vn.VersionList(":")
EMPTY_VAR = Spec().variants
EMPTY_FLG = Spec().compiler_flags
@pytest.mark.parametrize(
"spec,expected_tuplified",
[
# simple, no dependencies
[("a"), ((("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),), ())],
# with some node attributes
[
("a@1.0 +foo cflags='-O3 -g'"),
(
(
(
"a",
None,
vn.VersionList(["1.0"]),
Spec("+foo").variants,
Spec("cflags='-O3 -g'").compiler_flags,
None,
None,
None,
),
),
(),
),
],
# single edge case
[
("a^b"),
(
(
("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("b", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
),
((0, 1, 0, ()),),
),
],
# root with multiple deps
[
("a^b^c^d"),
(
(
("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("b", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("c", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("d", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
),
((0, 1, 0, ()), (0, 2, 0, ()), (0, 3, 0, ())),
),
],
# root with multiple build deps
[
("a%b%c%d"),
(
(
("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("b", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("c", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("d", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
),
((0, 1, dt.BUILD, ()), (0, 2, dt.BUILD, ()), (0, 3, dt.BUILD, ())),
),
],
# dependencies with dependencies
[
("a ^b %c %d ^e %f %g"),
(
(
("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("b", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("e", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("c", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("d", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("f", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
("g", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
),
(
(0, 1, 0, ()),
(0, 2, 0, ()),
(1, 3, dt.BUILD, ()),
(1, 4, dt.BUILD, ()),
(2, 5, dt.BUILD, ()),
(2, 6, dt.BUILD, ()),
),
),
],
],
)
def test_spec_canonical_comparison_form(spec, expected_tuplified):
assert llnl.util.lang.tuplify(Spec(spec)._cmp_iter) == expected_tuplified
def test_comparison_after_breaking_hash_change():
# We simulate a breaking change in DAG hash computation in Spack. We have two specs that are
# entirely equal modulo DAG hash. When deserializing these specs, we don't want them to compare


@@ -0,0 +1,660 @@
====================================
Development Notes on Intel Packages
====================================
These are notes for concepts and development of
lib/spack/spack/build_systems/intel.py .
For documentation on how to *use* ``IntelPackage``, see
lib/spack/docs/build_systems/intelpackage.rst .
-------------------------------------------------------------------------------
Installation and path handling as implemented in ./intel.py
-------------------------------------------------------------------------------
***************************************************************************
Prefix differences between Spack-external and Spack-internal installations
***************************************************************************
Problem summary
~~~~~~~~~~~~~~~~
For Intel packages that were installed external to Spack, ``self.prefix`` will
be a *component-specific* path (e.g. to an MKL-specific dir hierarchy), whereas
for a package installed by Spack itself, ``self.prefix`` will be a
*vendor-level* path that holds one or more components (or parts thereof), and
must be further qualified down to a particular desired component.
It is possible that a similar conceptual difference is inherent to other
package families that use a common vendor-style installer.
Description
~~~~~~~~~~~~
Spack makes packages available through two routes, let's call them A and B:
A. Packages pre-installed external to Spack and configured *for* Spack
B. Packages built and installed *by* Spack.
For a user who is interested in building end-user applications, it should not
matter through which route any of the application's dependent packages has
been installed.
Most packages natively support a ``prefix`` concept which unifies the two
routes just fine.
Intel packages, however, are more complicated because they consist of a number
of components that are released as a suite of varying extent, like "Intel
Parallel Studio *Foo* Edition", or subsetted into products like "MKL" or "MPI",
each of which also contains libraries from other components like the compiler
runtime and multithreading libraries. For this reason, an Intel package is
"anchored" during installation at a directory level higher than just the
user-facing directory that has the conventional hierarchy of ``bin``, ``lib``,
and others relevant for the end-product.
As a result, internal to Spack, there is a conceptual difference in what
``self.prefix`` represents for the two routes.
For route A, consider MKL installed outside of Spack. It will likely be one
product component among other products, at one particular release among others
that are installed in sibling or cousin directories on the local system.
Therefore, the path given to Spack in ``packages.yaml`` should be a
*product-specific and fully version-specific* directory. E.g., for an
``intel-mkl`` package, ``self.prefix`` should look like::
/opt/intel/compilers_and_libraries_2018.1.163/linux/mkl
In this route, the interaction point with the user is encapsulated in an
environment variable which will be (in pseudo-code)::
MKLROOT := {self.prefix}
For route B, a Spack-based installation of MKL will be placed in the directory
given to the ``./install.sh`` script of Intel's package distribution. This
directory is taken to be the *vendor*-specific anchor directory, playing the
same role as the default ``/opt/intel``. In this case, ``self.prefix`` will
be::
$SPACK_ROOT/opt/spack/linux-centos6-x86_64/gcc-4.9.3/intel-mkl-2018.1.163-<HASH>
However, now the environment variable will have to be constructed as *several
directory levels down*::
MKLROOT := {self.prefix}/compilers_and_libraries_2018.1.163/linux/mkl
A recent post on the Spack mailing list illustrates the confusion when route A
was taken while route B was the only one that was coded in Spack:
https://groups.google.com/d/msg/spack/x28qlmqPAys/Ewx6220uAgAJ
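The two routes can be made concrete with simple path arithmetic. The helper below is an illustrative sketch only (the function name and suite directory are assumptions, not actual Spack API):

```python
import os

# Hypothetical suite directory name used for illustration.
SUITE = "compilers_and_libraries_2018.1.163"

def mklroot(prefix, spack_installed):
    """Return the MKL component directory for an installation prefix.

    Route A (Spack-external): prefix already points at the component.
    Route B (Spack-internal): prefix is the vendor-level anchor and must
    be qualified several directory levels down to the component.
    """
    if spack_installed:
        return os.path.join(prefix, SUITE, "linux", "mkl")
    return prefix
```

For route A the prefix from ``packages.yaml`` is returned unchanged; for route B the component path is built underneath Spack's install prefix.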
Solution
~~~~~~~~~
Introduce a series of functions which will return the appropriate
directories, regardless of whether the Intel package has been installed
external or internal to Spack:
========================== ==================================================
Function Example return values
-------------------------- --------------------------------------------------
normalize_suite_dir() Spack-external installation:
/opt/intel/compilers_and_libraries_2018.1.163
Spack-internal installation:
$SPACK_ROOT/...<HASH>/compilers_and_libraries_2018.1.163
-------------------------- --------------------------------------------------
normalize_path('mkl') <suite_dir>/linux/mkl
component_bin_dir() <suite_dir>/linux/mkl/bin
component_lib_dir() <suite_dir>/linux/mkl/lib/intel64
-------------------------- --------------------------------------------------
normalize_path('mpi') <suite_dir>/linux/mpi
component_bin_dir('mpi') <suite_dir>/linux/mpi/intel64/bin
component_lib_dir('mpi') <suite_dir>/linux/mpi/intel64/lib
========================== ==================================================
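The normalization idea behind these functions can be sketched as follows. This is not the actual ``intel.py`` implementation, just the general shape of it under the assumption that route A prefixes embed the versioned suite directory while route B prefixes are anchors above it:

```python
import re

# Route A prefixes contain the versioned suite directory somewhere in
# the path; route B prefixes do not, and the suite dir lives below them.
SUITE_RE = re.compile(r"(.*compilers_and_libraries_[\d.]+)")

def normalize_suite_dir(prefix):
    match = SUITE_RE.match(prefix)
    if match:
        # Route A: the suite dir is embedded in the given prefix.
        return match.group(1)
    # Route B: a real implementation would glob under the prefix for the
    # suite dir; here we append a hypothetical known name.
    return prefix + "/compilers_and_libraries_2018.1.163"
```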
*********************************
Analysis of directory layouts
*********************************
Let's look at some sample directory layouts, using ``ls -lF``,
but focusing on names and symlinks only.
Spack-born installation of ``intel-mkl@2018.1.163``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
::
$ ls -l <prefix>
bin/
- compilervars.*sh (symlinked) ONLY
compilers_and_libraries -> compilers_and_libraries_2018
- generically-named entry point, stable across versions (one hopes)
compilers_and_libraries_2018/
- vaguely-versioned dirname, holding a stub hierarchy --ignorable
$ ls -l compilers_and_libraries_2018/linux/
bin - actual compilervars.*sh (reg. files) ONLY
documentation -> ../../documentation_2018/
lib -> ../../compilers_and_libraries_2018.1.163/linux/compiler/lib/
mkl -> ../../compilers_and_libraries_2018.1.163/linux/mkl/
pkg_bin -> ../../compilers_and_libraries_2018.1.163/linux/bin/
samples -> ../../samples_2018/
tbb -> ../../compilers_and_libraries_2018.1.163/linux/tbb/
compilers_and_libraries_2018.1.163/
- Main "product" + a minimal set of libs from related products
$ ls -l compilers_and_libraries_2018.1.163/linux/
bin/ - compilervars.*sh, link_install*sh ONLY
mkl/ - Main Product ==> to be assigned to MKLROOT
compiler/ - lib/intel64_lin/libiomp5* ONLY
tbb/ - tbb/lib/intel64_lin/gcc4.[147]/libtbb*.so* ONLY
parallel_studio_xe_2018 -> parallel_studio_xe_2018.1.038/
parallel_studio_xe_2018.1.038/
- Alternate product packaging - ignorable
$ ls -l parallel_studio_xe_2018.1.038/
bin/ - actual psxevars.*sh (reg. files)
compilers_and_libraries_2018 -> <full_path>/comp...aries_2018.1.163
documentation_2018 -> <full_path_prefix>/documentation_2018
samples_2018 -> <full_path_prefix>/samples_2018
...
documentation_2018/
samples_2018/
lib -> compilers_and_libraries/linux/lib/
mkl -> compilers_and_libraries/linux/mkl/
tbb -> compilers_and_libraries/linux/tbb/
- auxiliaries and convenience links
Spack-external installation of Intel-MPI 2018
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
For MPI, the layout is slightly different from MKL's. The prefix will have to
include an architecture directory (typically ``intel64``), which then contains
bin/, lib/, ..., all without further architecture branching. The environment
variable ``I_MPI_ROOT`` from the API documentation, however, must be the
package's top directory, not including the architecture.
FIXME: For MANPATH, need the parent dir.
::
$ ls -lF /opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/
bin64 -> intel64/bin/
etc64 -> intel64/etc/
include64 -> intel64/include/
lib64 -> intel64/lib/
benchmarks/
binding/
intel64/
man/
test/
The package contains an MPI-2019 preview; curiously, its release notes contain
the tag: "File structure clean-up." I could not find further documentation on
this, however, so it is unclear what, if any, changes will make it to release.
https://software.intel.com/en-us/articles/restoring-legacy-path-structure-on-intel-mpi-library-2019
::
$ ls -lF /opt/intel/compilers_and_libraries_2018.1.163/linux/mpi_2019/
binding/
doc/
imb/
intel64/
man/
test/
Spack-external installation of Intel Parallel Studio 2018
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
This is the main product bundle that I actually downloaded and installed on my
system. Its nominal installation directory mostly holds merely symlinks
to components installed in sibling dirs::
$ ls -lF /opt/intel/parallel_studio_xe_2018.1.038/
advisor_2018 -> /opt/intel/advisor_2018/
clck_2018 -> /opt/intel/clck/2018.1/
compilers_and_libraries_2018 -> /opt/intel/comp....aries_2018.1.163/
documentation_2018 -> /opt/intel/documentation_2018/
ide_support_2018 -> /opt/intel/ide_support_2018/
inspector_2018 -> /opt/intel/inspector_2018/
itac_2018 -> /opt/intel/itac/2018.1.017/
man -> /opt/intel/man/
samples_2018 -> /opt/intel/samples_2018/
vtune_amplifier_2018 -> /opt/intel/vtune_amplifier_2018/
psxevars.csh -> ./bin/psxevars.csh*
psxevars.sh -> ./bin/psxevars.sh*
bin/ - *vars.*sh scripts + sshconnectivity.exp ONLY
licensing/
uninstall*
The only relevant regular files are ``*vars.*sh``, but those also just churn
through the subordinate vars files of the components.
Installation model
~~~~~~~~~~~~~~~~~~~~
Intel packages come with an ``install.sh`` script that is normally run
interactively (in either text or GUI mode) but can run unattended with a
``--silent <file>`` option, which is of course what Spack uses.
Format of configuration file
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The configuration file is conventionally called ``silent.cfg`` and has a simple
``token=value`` syntax. Before using the configuration file, the installer
calls ``<staging_dir>/pset/check.awk`` to validate it. Example paths to the
validator are::
.../l_mkl_2018.1.163/pset/check.awk
.../parallel_studio_xe_2018_update1_cluster_edition/pset/check.awk
The tokens that are accepted in the configuration file vary between packages.
Tokens not supported for a given package **will cause the installer to stop
and fail.** This is particularly relevant for license-related tokens, which are
accepted only for packages that actually require a license.
Reference: [Intel's documentation](https://software.intel.com/en-us/articles/configuration-file-format)
See also: https://software.intel.com/en-us/articles/silent-installation-guide-for-intel-parallel-studio-xe-composer-edition-for-os-x
The following is from ``.../parallel_studio_xe_2018_update1_cluster_edition/pset/check.awk``:
* Tokens valid for all packages encountered::
ACCEPT_EULA {accept, decline}
CONTINUE_WITH_OPTIONAL_ERROR {yes, no}
PSET_INSTALL_DIR {/opt/intel, , filepat}
CONTINUE_WITH_INSTALLDIR_OVERWRITE {yes, no}
COMPONENTS {ALL, DEFAULTS, , anythingpat}
PSET_MODE {install, repair, uninstall}
NONRPM_DB_DIR {, filepat}
SIGNING_ENABLED {yes, no}
ARCH_SELECTED {IA32, INTEL64, ALL}
* Mentioned but unexplained in ``check.awk``::
NO_VALIDATE (?!)
* Only for licensed packages::
ACTIVATION_SERIAL_NUMBER {, snpat}
ACTIVATION_LICENSE_FILE {, lspat, filepat}
ACTIVATION_TYPE {exist_lic, license_server,
license_file, trial_lic,
serial_number}
PHONEHOME_SEND_USAGE_DATA {yes, no}
* Only for Amplifier (obviously)::
AMPLIFIER_SAMPLING_DRIVER_INSTALL_TYPE {build, kit}
AMPLIFIER_DRIVER_ACCESS_GROUP {, anythingpat, vtune}
AMPLIFIER_DRIVER_PERMISSIONS {, anythingpat, 666}
AMPLIFIER_LOAD_DRIVER {yes, no}
AMPLIFIER_C_COMPILER {, filepat, auto, none}
AMPLIFIER_KERNEL_SRC_DIR {, filepat, auto, none}
AMPLIFIER_MAKE_COMMAND {, filepat, auto, none}
AMPLIFIER_INSTALL_BOOT_SCRIPT {yes, no}
AMPLIFIER_DRIVER_PER_USER_MODE {yes, no}
* Only for MKL and Studio::
CLUSTER_INSTALL_REMOTE {yes, no}
CLUSTER_INSTALL_TEMP {, filepat}
CLUSTER_INSTALL_MACHINES_FILE {, filepat}
* "backward compatibility" (?)::
INSTALL_MODE {RPM, NONRPM}
download_only {yes}
download_dir {, filepat}
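Pieced together from the tokens above, a minimal ``silent.cfg`` for an unattended install of an unlicensed package might look like the following. The values are illustrative; recall that any token unsupported by the particular package will abort the installer::

```
ACCEPT_EULA=accept
CONTINUE_WITH_OPTIONAL_ERROR=yes
PSET_INSTALL_DIR=/opt/intel
CONTINUE_WITH_INSTALLDIR_OVERWRITE=yes
COMPONENTS=DEFAULTS
PSET_MODE=install
ARCH_SELECTED=INTEL64
```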
Details for licensing tokens
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Quoted from
https://software.intel.com/en-us/articles/configuration-file-format,
for reference:
[ed. note: As of 2018-05, the page incorrectly references ``ACTIVATION``, which
was used only until about 2012; this is corrected to ``ACTIVATION_TYPE`` here.]
...
``ACTIVATION_TYPE=exist_lic``
This directive tells the install program to look for an existing
license during the install process. This is the preferred method for
silent installs. Take the time to register your serial number and get
a license file (see below). Having a license file on the system
simplifies the process. In addition, as an administrator it is good
practice to know WHERE your licenses are saved on your system.
License files are plain text files with a .lic extension. By default
these are saved in /opt/intel/licenses which is searched by default.
If you save your license elsewhere, perhaps under an NFS folder, set
environment variable **INTEL_LICENSE_FILE** to the full path to your
license file prior to starting the installation or use the
configuration file directive ``ACTIVATION_LICENSE_FILE`` to specify the
full pathname to the license file.
Options for ``ACTIVATION_TYPE`` are ``{ exist_lic, license_file, server_lic,
serial_number, trial_lic }``
``exist_lic``
directs the installer to search for a valid license on the server.
Searches will utilize the environment variable **INTEL_LICENSE_FILE**,
search the default license directory /opt/intel/licenses, or use the
``ACTIVATION_LICENSE_FILE`` directive to find a valid license file.
``license_file``
is similar to exist_lic but directs the installer to use
``ACTIVATION_LICENSE_FILE`` to find the license file.
``server_lic``
is similar to ``exist_lic`` and ``license_file`` but directs the installer that
this is a client installation and a floating license server will be
contacted to activate the product. This option will contact your
floating license server on your network to retrieve the license
information. BEFORE using this option make sure your client is
correctly set up for your network including all networking, routing,
name service, and firewall configuration. Ensure that your client has
direct access to your floating license server and that firewalls are
set up to allow TCP/IP access for the 2 license server ports.
server_lic will use **INTEL_LICENSE_FILE** containing a port@host format
OR a client license file. The formats for these are described here
https://software.intel.com/en-us/articles/licensing-setting-up-the-client-floating-license
``serial_number``
directs the installer to use directive ``ACTIVATION_SERIAL_NUMBER`` for
activation. This method will require the installer to contact an
external Intel activation server over the Internet to confirm your
serial number. Due to user and company firewalls, this method is the most
complex and hence most error-prone of the available activation methods. We
highly recommend using a license file or license server for activation
instead.
``trial_lic``
is used only if you do not have an existing license and intend to
temporarily evaluate the compiler. This method creates a temporary
trial license in Trusted Storage on your system.
...
*******************
vars files
*******************
Intel's product packages contain a number of shell initialization files; let's call them vars files.
There are three kinds:
#. Component-specific vars files, such as `mklvars` or `tbbvars`.
#. Toplevel vars files such as "psxevars". They will scan for all
component-specific vars files associated with the product, and source them
if found.
#. Symbolic links to either of them. Links may appear under a different name
for backward compatibility.
At present, the ``IntelPackage`` class is only concerned with the toplevel vars files,
generally found in the product's toplevel bin/ directory.
For reference, here is an overview of the names and locations of the vars files
in the 2018 product releases, as seen for a Spack-native installation. NB: May be
incomplete as some components may have been omitted during installation.
Names of vars files seen::
$ cd opt/spack/linux-centos6-x86_64
$ find intel* -name \*vars.sh -printf '%f\n' | sort -u | nl
1 advixe-vars.sh
2 amplxe-vars.sh
3 apsvars.sh
4 compilervars.sh
5 daalvars.sh
6 debuggervars.sh
7 iccvars.sh
8 ifortvars.sh
9 inspxe-vars.sh
10 ippvars.sh
11 mklvars.sh
12 mpivars.sh
13 pstlvars.sh
14 psxevars.sh
15 sep_vars.sh
16 tbbvars.sh
Names and locations of vars files, sorted by Spack package name::
$ cd opt/spack/linux-centos6-x86_64
$ find intel* -name \*vars.sh -printf '%y\t%-15f\t%h\n' \
| cut -d/ -f1,4- \
| sed '/iccvars\|ifortvars/d; s,/,\t\t,; s,\.sh,,; s, */\(intel[/-]\),\1,' \
| sort -k3,3 -k2,2 \
| nl \
| awk '{printf "%6i %-2s %-16s %-24s %s\n", $1, $2, $3, $4, $5}'
--------------------------------------------------------------------------------------------------------
item no.
file or link
name of vars file
Spack package name
dir relative to Spack install dir
--------------------------------------------------------------------------------------------------------
1 f mpivars intel compilers_and_libraries_2018.1.163/linux/mpi/intel64/bin
2 f mpivars intel compilers_and_libraries_2018.1.163/linux/mpirt/bin/ia32_lin
3 f tbbvars intel compilers_and_libraries_2018.1.163/linux/tbb/bin
4 f pstlvars intel compilers_and_libraries_2018.1.163/linux/pstl/bin
5 f compilervars intel compilers_and_libraries_2018.1.163/linux/bin
6 f compilervars intel compilers_and_libraries_2018/linux/bin
7 l compilervars intel bin
8 f daalvars intel-daal compilers_and_libraries_2018.2.199/linux/daal/bin
9 f psxevars intel-daal parallel_studio_xe_2018.2.046/bin
10 l psxevars intel-daal parallel_studio_xe_2018.2.046
11 f compilervars intel-daal compilers_and_libraries_2018.2.199/linux/bin
12 f compilervars intel-daal compilers_and_libraries_2018/linux/bin
13 l compilervars intel-daal bin
14 f ippvars intel-ipp compilers_and_libraries_2018.2.199/linux/ipp/bin
15 f psxevars intel-ipp parallel_studio_xe_2018.2.046/bin
16 l psxevars intel-ipp parallel_studio_xe_2018.2.046
17 f compilervars intel-ipp compilers_and_libraries_2018.2.199/linux/bin
18 f compilervars intel-ipp compilers_and_libraries_2018/linux/bin
19 l compilervars intel-ipp bin
20 f mklvars intel-mkl compilers_and_libraries_2018.2.199/linux/mkl/bin
21 f psxevars intel-mkl parallel_studio_xe_2018.2.046/bin
22 l psxevars intel-mkl parallel_studio_xe_2018.2.046
23 f compilervars intel-mkl compilers_and_libraries_2018.2.199/linux/bin
24 f compilervars intel-mkl compilers_and_libraries_2018/linux/bin
25 l compilervars intel-mkl bin
26 f mpivars intel-mpi compilers_and_libraries_2018.2.199/linux/mpi_2019/intel64/bin
27 f mpivars intel-mpi compilers_and_libraries_2018.2.199/linux/mpi/intel64/bin
28 f psxevars intel-mpi parallel_studio_xe_2018.2.046/bin
29 l psxevars intel-mpi parallel_studio_xe_2018.2.046
30 f compilervars intel-mpi compilers_and_libraries_2018.2.199/linux/bin
31 f compilervars intel-mpi compilers_and_libraries_2018/linux/bin
32 l compilervars intel-mpi bin
33 f apsvars intel-parallel-studio vtune_amplifier_2018.1.0.535340
34 l apsvars intel-parallel-studio performance_snapshots_2018.1.0.535340
35 f ippvars intel-parallel-studio compilers_and_libraries_2018.1.163/linux/ipp/bin
36 f ippvars intel-parallel-studio composer_xe_2015.6.233/ipp/bin
37 f mklvars intel-parallel-studio compilers_and_libraries_2018.1.163/linux/mkl/bin
38 f mklvars intel-parallel-studio composer_xe_2015.6.233/mkl/bin
39 f mpivars intel-parallel-studio compilers_and_libraries_2018.1.163/linux/mpi/intel64/bin
40 f mpivars intel-parallel-studio compilers_and_libraries_2018.1.163/linux/mpirt/bin/ia32_lin
41 f tbbvars intel-parallel-studio compilers_and_libraries_2018.1.163/linux/tbb/bin
42 f tbbvars intel-parallel-studio composer_xe_2015.6.233/tbb/bin
43 f daalvars intel-parallel-studio compilers_and_libraries_2018.1.163/linux/daal/bin
44 f pstlvars intel-parallel-studio compilers_and_libraries_2018.1.163/linux/pstl/bin
45 f psxevars intel-parallel-studio parallel_studio_xe_2018.1.038/bin
46 l psxevars intel-parallel-studio parallel_studio_xe_2018.1.038
47 f sep_vars intel-parallel-studio vtune_amplifier_2018.1.0.535340
48 f sep_vars intel-parallel-studio vtune_amplifier_2018.1.0.535340/target/android_v4.1_x86_64
49 f advixe-vars intel-parallel-studio advisor_2018.1.1.535164
50 f amplxe-vars intel-parallel-studio vtune_amplifier_2018.1.0.535340
51 f inspxe-vars intel-parallel-studio inspector_2018.1.1.535159
52 f compilervars intel-parallel-studio compilers_and_libraries_2018.1.163/linux/bin
53 f compilervars intel-parallel-studio compilers_and_libraries_2018/linux/bin
54 l compilervars intel-parallel-studio bin
55 f debuggervars intel-parallel-studio debugger_2018/bin
********************
MPI linkage
********************
Library selection
~~~~~~~~~~~~~~~~~~~~~
In the Spack code so far, the library selections for MPI are:
::
libnames = ['libmpifort', 'libmpi']
if 'cxx' in self.spec.last_query.extra_parameters:
libnames = ['libmpicxx'] + libnames
return find_libraries(libnames,
root=self.component_lib_dir('mpi'),
shared=True, recursive=False)
The problem is that there are multiple library versions under ``component_lib_dir``::
$ cd $I_MPI_ROOT
$ find . -name libmpi.so | sort
./intel64/lib/debug/libmpi.so
./intel64/lib/debug_mt/libmpi.so
./intel64/lib/libmpi.so
./intel64/lib/release/libmpi.so
./intel64/lib/release_mt/libmpi.so
"mt" refers to multi-threading, not in the explicit sense but in the sense of being thread-safe::
$ mpiifort -help | grep mt
-mt_mpi link the thread safe version of the Intel(R) MPI Library
Well, why should we not inspect what the canonical script does? The wrapper
has its own hardcoded "prefix=..." and can thus tell us what it will do, from a
*wiped environment* no less!::
$ env - intel64/bin/mpiicc -show hello.c | ld-unwrap-args
icc 'hello.c' \
-I/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/include \
-L/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/lib/release_mt \
-L/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/lib \
-Xlinker --enable-new-dtags \
-Xlinker -rpath=/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/lib/release_mt \
-Xlinker -rpath=/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/lib \
-Xlinker -rpath=/opt/intel/mpi-rt/2017.0.0/intel64/lib/release_mt \
-Xlinker -rpath=/opt/intel/mpi-rt/2017.0.0/intel64/lib \
-lmpifort \
-lmpi \
-lmpigi \
-ldl \
-lrt \
-lpthread
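One way to mirror the wrapper's behavior when several library variants exist is to search directories in the same order the wrapper emits them: ``release_mt`` first, then the plain ``lib`` dir. A sketch of that idea (not Spack's actual ``find_libraries`` logic):

```python
import os

# Preference order taken from the -show output above.
PREFERRED_SUBDIRS = ["release_mt", ""]

def pick_libmpi(lib_root):
    """Return the first libmpi.so found in preference order, or None."""
    for subdir in PREFERRED_SUBDIRS:
        candidate = os.path.join(lib_root, subdir, "libmpi.so")
        if os.path.exists(candidate):
            return candidate
    return None
```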
MPI Wrapper options
~~~~~~~~~~~~~~~~~~~~~
For reference, here's the wrapper's builtin help output::
$ mpiifort -help
Simple script to compile and/or link MPI programs.
Usage: mpiifort [options] <files>
----------------------------------------------------------------------------
The following options are supported:
-fc=<name> | -f90=<name>
specify a FORTRAN compiler name: i.e. -fc=ifort
-echo print the scripts during their execution
-show show command lines without real calling
-config=<name> specify a configuration file: i.e. -config=ifort for mpif90-ifort.conf file
-v print version info of mpiifort and its native compiler
-profile=<name> specify a profile configuration file (an MPI profiling
library): i.e. -profile=myprofile for the myprofile.cfg file.
As a special case, lib<name>.so or lib<name>.a may be used
if the library is found
-check_mpi link against the Intel(R) Trace Collector (-profile=vtmc).
-static_mpi link the Intel(R) MPI Library statically
-mt_mpi link the thread safe version of the Intel(R) MPI Library
-ilp64 link the ILP64 support of the Intel(R) MPI Library
-no_ilp64 disable ILP64 support explicitly
-fast the same as -static_mpi + pass -fast option to a compiler.
-t or -trace
link against the Intel(R) Trace Collector
-trace-imbalance
link against the Intel(R) Trace Collector imbalance library
(-profile=vtim)
-dynamic_log link against the Intel(R) Trace Collector dynamically
-static use static linkage method
-nostrip turn off the debug information stripping during static linking
-O enable optimization
-link_mpi=<name>
link against the specified version of the Intel(R) MPI Library
All other options will be passed to the compiler without changing.
----------------------------------------------------------------------------
The following environment variables are used:
I_MPI_ROOT the Intel(R) MPI Library installation directory path
I_MPI_F90 or MPICH_F90
the path/name of the underlying compiler to be used
I_MPI_FC_PROFILE or I_MPI_F90_PROFILE or MPIF90_PROFILE
the name of profile file (without extension)
I_MPI_COMPILER_CONFIG_DIR
the folder which contains configuration files *.conf
I_MPI_TRACE_PROFILE
specify a default profile for the -trace option
I_MPI_CHECK_PROFILE
specify a default profile for the -check_mpi option
I_MPI_CHECK_COMPILER
enable compiler setup checks
I_MPI_LINK specify the version of the Intel(R) MPI Library
I_MPI_DEBUG_INFO_STRIP
turn on/off the debug information stripping during static linking
I_MPI_FCFLAGS
special flags needed for compilation
I_MPI_LDFLAGS
special flags needed for linking
----------------------------------------------------------------------------
Side Note: MPI version divergence in 2015 release
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The package `intel-parallel-studio@cluster.2015.6` contains both a full MPI
development version in `$prefix/impi` and an MPI Runtime under the
`composer_xe*` suite directory. Curiously, these have *different versions*,
with a release date nearly 1 year apart::
$ $SPACK_ROOT/...uaxaw7/impi/5.0.3.049/intel64/bin/mpiexec --version
Intel(R) MPI Library for Linux* OS, Version 5.0 Update 3 Build 20150804 (build id: 12452)
Copyright (C) 2003-2015, Intel Corporation. All rights reserved.
$ $SPACK_ROOT/...uaxaw7/composer_xe_2015.6.233/mpirt/bin/intel64/mpiexec --version
Intel(R) MPI Library for Linux* OS, Version 5.0 Update 1 Build 20140709
Copyright (C) 2003-2014, Intel Corporation. All rights reserved.
I'm not sure what to make of it.
**************
macOS support
**************
- On macOS, the Spack methods here only include support to integrate an
externally installed MKL.
- URLs in child packages will be Linux-specific; macOS download packages
are located in differently numbered dirs and are named m_*.dmg.


@@ -2,13 +2,19 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from typing import Callable, List
from typing import List
import llnl.util.lang
import spack.builder
import spack.error
import spack.phase_callbacks
import spack.relocate
from spack.package import Builder, InstallError, Spec, run_after
import spack.spec
import spack.store
def sanity_check_prefix(builder: Builder):
def sanity_check_prefix(builder: spack.builder.Builder):
"""Check that specific directories and files are created after installation.
The files to be checked are in the ``sanity_check_is_file`` attribute of the
@@ -19,31 +25,27 @@ def sanity_check_prefix(builder: Builder):
"""
pkg = builder.pkg
def check_paths(path_list: List[str], filetype: str, predicate: Callable[[str], bool]) -> None:
def check_paths(path_list, filetype, predicate):
if isinstance(path_list, str):
path_list = [path_list]
for path in path_list:
if not predicate(os.path.join(pkg.prefix, path)):
raise InstallError(
f"Install failed for {pkg.name}. No such {filetype} in prefix: {path}"
)
abs_path = os.path.join(pkg.prefix, path)
if not predicate(abs_path):
msg = "Install failed for {0}. No such {1} in prefix: {2}"
msg = msg.format(pkg.name, filetype, path)
raise spack.error.InstallError(msg)
check_paths(pkg.sanity_check_is_file, "file", os.path.isfile)
check_paths(pkg.sanity_check_is_dir, "directory", os.path.isdir)
# Check that the prefix is not empty apart from the .spack/ directory
with os.scandir(pkg.prefix) as entries:
f = next(
(f for f in entries if not (f.name == ".spack" and f.is_dir(follow_symlinks=False))),
None,
)
if f is None:
raise InstallError(f"Install failed for {pkg.name}. Nothing was installed!")
ignore_file = llnl.util.lang.match_predicate(spack.store.STORE.layout.hidden_file_regexes)
if all(map(ignore_file, os.listdir(pkg.prefix))):
msg = "Install failed for {0}. Nothing was installed!"
raise spack.error.InstallError(msg.format(pkg.name))
def apply_macos_rpath_fixups(builder: Builder):
def apply_macos_rpath_fixups(builder: spack.builder.Builder):
"""On Darwin, make installed libraries more easily relocatable.
Some build systems (handrolled, autotools, makefiles) can set their own
@@ -60,7 +62,9 @@ def apply_macos_rpath_fixups(builder: Builder):
spack.relocate.fixup_macos_rpaths(builder.spec)
def ensure_build_dependencies_or_raise(spec: Spec, dependencies: List[str], error_msg: str):
def ensure_build_dependencies_or_raise(
spec: spack.spec.Spec, dependencies: List[str], error_msg: str
):
"""Ensure that some build dependencies are present in the concrete spec.
If not, raise a RuntimeError with a helpful error message.
@@ -97,7 +101,7 @@ def ensure_build_dependencies_or_raise(spec: Spec, dependencies: List[str], erro
raise RuntimeError(msg)
def execute_build_time_tests(builder: Builder):
def execute_build_time_tests(builder: spack.builder.Builder):
"""Execute the build-time tests prescribed by builder.
Args:
@@ -110,7 +114,7 @@ def execute_build_time_tests(builder: Builder):
builder.pkg.tester.phase_tests(builder, "build", builder.build_time_test_callbacks)
def execute_install_time_tests(builder: Builder):
def execute_install_time_tests(builder: spack.builder.Builder):
"""Execute the install-time tests prescribed by builder.
Args:
@@ -123,8 +127,8 @@ def execute_install_time_tests(builder: Builder):
builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
class BuilderWithDefaults(Builder):
class BuilderWithDefaults(spack.builder.Builder):
"""Base class for all specific builders with common callbacks registered."""
# Check that self.prefix is there after installation
run_after("install")(sanity_check_prefix)
spack.phase_callbacks.run_after("install")(sanity_check_prefix)


@@ -23,6 +23,7 @@
from .generic import Package
from .gnu import GNUMirrorPackage
from .go import GoPackage
from .intel import IntelPackage
from .lua import LuaPackage
from .makefile import MakefilePackage
from .maven import MavenPackage
@@ -68,6 +69,7 @@
"Package",
"GNUMirrorPackage",
"GoPackage",
"IntelPackage",
"IntelOneApiLibraryPackageWithSdk",
"IntelOneApiLibraryPackage",
"IntelOneApiStaticLibraryList",


@@ -3,7 +3,12 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.package import Executable, Prefix, Spec, extends, filter_file
import llnl.util.filesystem as fs
import spack.directives
import spack.spec
import spack.util.executable
import spack.util.prefix
from .autotools import AutotoolsBuilder, AutotoolsPackage
@@ -15,13 +20,16 @@ class AspellBuilder(AutotoolsBuilder):
"""
def configure(
self, pkg: "AspellDictPackage", spec: Spec, prefix: Prefix # type: ignore[override]
self,
pkg: "AspellDictPackage", # type: ignore[override]
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
):
aspell = spec["aspell"].prefix.bin.aspell
prezip = spec["aspell"].prefix.bin.prezip
destdir = prefix
sh = Executable("/bin/sh")
sh = spack.util.executable.Executable("/bin/sh")
sh("./configure", "--vars", f"ASPELL={aspell}", f"PREZIP={prezip}", f"DESTDIR={destdir}")
@@ -34,7 +42,7 @@ def configure(
class AspellDictPackage(AutotoolsPackage):
"""Specialized class for building aspell dictionairies."""
extends("aspell", when="build_system=autotools")
spack.directives.extends("aspell", when="build_system=autotools")
#: Override the default autotools builder
AutotoolsBuilder = AspellBuilder
@@ -46,5 +54,5 @@ def patch(self):
datadir = aspell("dump", "config", "data-dir", output=str).strip()
dictdir = os.path.relpath(dictdir, aspell_spec.prefix)
datadir = os.path.relpath(datadir, aspell_spec.prefix)
filter_file(r"^dictdir=.*$", f"dictdir=/{dictdir}", "configure")
filter_file(r"^datadir=.*$", f"datadir=/{datadir}", "configure")
fs.filter_file(r"^dictdir=.*$", f"dictdir=/{dictdir}", "configure")
fs.filter_file(r"^datadir=.*$", f"datadir=/{datadir}", "configure")


@@ -7,36 +7,22 @@
from typing import Callable, List, Optional, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.build_environment
import spack.builder
import spack.compilers.libraries
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.environment
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from spack.operating_systems.mac_os import macos_version
from spack.package import (
EnvironmentModifications,
Executable,
FileFilter,
InstallError,
PackageBase,
Prefix,
Spec,
Version,
build_system,
conflicts,
copy,
depends_on,
find,
force_remove,
is_exe,
keep_modification_time,
mkdirp,
register_builder,
run_after,
run_before,
tty,
when,
working_dir,
)
from spack.util.executable import Executable
from spack.version import Version
from ._checks import (
BuilderWithDefaults,
@@ -47,7 +33,7 @@
)
class AutotoolsPackage(PackageBase):
class AutotoolsPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using GNU Autotools."""
#: This attribute is used in UI queries that need to know the build
@@ -92,7 +78,7 @@ def with_or_without(self, *args, **kwargs):
return spack.builder.create(self).with_or_without(*args, **kwargs)
@register_builder("autotools")
@spack.builder.builder("autotools")
class AutotoolsBuilder(BuilderWithDefaults):
"""The autotools builder encodes the default way of installing software built
with autotools. It has four phases that can be overridden, if need be:
@@ -206,7 +192,7 @@ def archive_files(self) -> List[str]:
files.append(self._removed_la_files_log)
return files
@run_after("autoreconf")
@spack.phase_callbacks.run_after("autoreconf")
def _do_patch_config_files(self) -> None:
"""Some packages ship with older config.guess/config.sub files and need to
have these updated when installed on a newer architecture.
@@ -244,7 +230,7 @@ def runs_ok(script_abs_path):
return True
# Get the list of files that needs to be patched
to_be_patched = find(self.pkg.stage.path, files=["config.sub", "config.guess"])
to_be_patched = fs.find(self.pkg.stage.path, files=["config.sub", "config.guess"])
to_be_patched = [f for f in to_be_patched if not runs_ok(f)]
# If there are no files to be patched, return early
@@ -263,13 +249,13 @@ def runs_ok(script_abs_path):
# An external gnuconfig may not have a prefix.
if gnuconfig_dir is None:
raise InstallError(
raise spack.error.InstallError(
"Spack could not find substitutes for GNU config files because no "
"prefix is available for the `gnuconfig` package. Make sure you set a "
"prefix path instead of modules for external `gnuconfig`."
)
candidates = find(gnuconfig_dir, files=to_be_found, recursive=False)
candidates = fs.find(gnuconfig_dir, files=to_be_found, recursive=False)
# For external packages the user may have specified an incorrect prefix.
# otherwise the installation is just corrupt.
@@ -283,7 +269,7 @@ def runs_ok(script_abs_path):
msg += (
" or the `gnuconfig` package prefix is misconfigured as" " an external package"
)
raise InstallError(msg)
raise spack.error.InstallError(msg)
# Filter working substitutes
candidates = [f for f in candidates if runs_ok(f)]
@@ -308,29 +294,29 @@ def runs_ok(script_abs_path):
and set the prefix to the directory containing the `config.guess` and
`config.sub` files.
"""
raise InstallError(msg.format(", ".join(to_be_found), self.pkg.name))
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.pkg.name))
# Copy the good files over the bad ones
for abs_path in to_be_patched:
name = os.path.basename(abs_path)
mode = os.stat(abs_path).st_mode
os.chmod(abs_path, stat.S_IWUSR)
copy(substitutes[name], abs_path)
fs.copy(substitutes[name], abs_path)
os.chmod(abs_path, mode)
@run_before("configure")
@spack.phase_callbacks.run_before("configure")
def _patch_usr_bin_file(self) -> None:
"""On NixOS file is not available in /usr/bin/file. Patch configure
scripts to use file from path."""
if self.spec.os.startswith("nixos"):
x = FileFilter(
*filter(is_exe, find(self.build_directory, "configure", recursive=True))
x = fs.FileFilter(
*filter(fs.is_exe, fs.find(self.build_directory, "configure", recursive=True))
)
with keep_modification_time(*x.filenames):
with fs.keep_modification_time(*x.filenames):
x.filter(regex="/usr/bin/file", repl="file", string=True)
@run_before("configure")
@spack.phase_callbacks.run_before("configure")
def _set_autotools_environment_variables(self) -> None:
"""Many autotools builds use a version of mknod.m4 that fails when
running as root unless FORCE_UNSAFE_CONFIGURE is set to 1.
@@ -344,7 +330,7 @@ def _set_autotools_environment_variables(self) -> None:
"""
os.environ["FORCE_UNSAFE_CONFIGURE"] = "1"
@run_before("configure")
@spack.phase_callbacks.run_before("configure")
def _do_patch_libtool_configure(self) -> None:
"""Patch bugs that propagate from libtool macros into "configure" and
further into "libtool". Note that patches that can be fixed by patching
@@ -355,12 +341,14 @@ def _do_patch_libtool_configure(self) -> None:
if not self.patch_libtool:
return
x = FileFilter(*filter(is_exe, find(self.build_directory, "configure", recursive=True)))
x = fs.FileFilter(
*filter(fs.is_exe, fs.find(self.build_directory, "configure", recursive=True))
)
# There are distributed automatically generated files that depend on the configure script
# and require additional tools for rebuilding.
# See https://github.com/spack/spack/pull/30768#issuecomment-1219329860
with keep_modification_time(*x.filenames):
with fs.keep_modification_time(*x.filenames):
# Fix parsing of compiler output when collecting predeps and postdeps
# https://lists.gnu.org/archive/html/bug-libtool/2016-03/msg00003.html
x.filter(regex=r'^(\s*if test x-L = )("\$p" \|\|\s*)$', repl=r"\1x\2")
@@ -377,7 +365,7 @@ def _do_patch_libtool_configure(self) -> None:
# 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)
@run_after("configure")
@spack.phase_callbacks.run_after("configure")
def _do_patch_libtool(self) -> None:
"""If configure generates a "libtool" script that does not correctly
detect the compiler (and patch_libtool is set), patch in the correct
@@ -399,7 +387,9 @@ def _do_patch_libtool(self) -> None:
if not self.patch_libtool:
return
x = FileFilter(*filter(is_exe, find(self.build_directory, "libtool", recursive=True)))
x = fs.FileFilter(
*filter(fs.is_exe, fs.find(self.build_directory, "libtool", recursive=True))
)
# Exit early if there is nothing to patch:
if not x.filenames:
@@ -555,10 +545,10 @@ def build_directory(self) -> str:
build_dir = os.path.join(self.pkg.stage.source_path, build_dir)
return build_dir
@run_before("autoreconf")
@spack.phase_callbacks.run_before("autoreconf")
def _delete_configure_to_force_update(self) -> None:
if self.force_autoreconf:
force_remove(self.configure_abs_path)
fs.force_remove(self.configure_abs_path)
@property
def autoreconf_search_path_args(self) -> List[str]:
@@ -568,7 +558,7 @@ def autoreconf_search_path_args(self) -> List[str]:
spack dependencies."""
return _autoreconf_search_path_args(self.spec)
@run_after("autoreconf")
@spack.phase_callbacks.run_after("autoreconf")
def _set_configure_or_die(self) -> None:
"""Ensure the presence of a "configure" script, or raise. If the "configure"
script is found, a module-level attribute is set.
@@ -592,7 +582,9 @@ def configure_args(self) -> List[str]:
"""
return []
def autoreconf(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
def autoreconf(
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Not needed usually, configure should be already there"""
# If configure exists nothing needs to be done
@@ -611,7 +603,7 @@ def autoreconf(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
tty.warn("* If the default procedure fails, consider implementing *")
tty.warn("* a custom AUTORECONF phase in the package *")
tty.warn("*********************************************************")
with working_dir(self.configure_directory):
with fs.working_dir(self.configure_directory):
# This line is what is needed most of the time
# --install, --verbose, --force
autoreconf_args = ["-ivf"]
@@ -619,7 +611,9 @@ def autoreconf(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
autoreconf_args += self.autoreconf_extra_args
self.pkg.module.autoreconf(*autoreconf_args)
def configure(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
def configure(
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "configure", with the arguments specified by the builder and an
appropriately set prefix.
"""
@@ -627,27 +621,31 @@ def configure(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
options += ["--prefix={0}".format(prefix)]
options += self.configure_args()
with working_dir(self.build_directory, create=True):
with fs.working_dir(self.build_directory, create=True):
pkg.module.configure(*options)
def build(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the build targets specified by the builder."""
# See https://autotools.io/automake/silent.html
params = ["V=1"]
params += self.build_targets
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.make(*params)
def install(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the install targets specified by the builder."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
@@ -715,7 +713,7 @@ def _activate_or_not(
Raises:
KeyError: if name is not among known variants
"""
spec: Spec = self.pkg.spec
spec: spack.spec.Spec = self.pkg.spec
args: List[str] = []
if activation_value == "prefix":
@@ -826,14 +824,14 @@ def enable_or_disable(
"""
return self._activate_or_not(name, "enable", "disable", activation_value, variant)
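`enable_or_disable` (like its sibling `with_or_without`) turns a boolean variant into the matching `--enable-<name>`/`--disable-<name>` configure flag via `_activate_or_not`. A simplified sketch of that mapping, with a plain dict standing in for the package's spec (the real method also handles multi-valued variants and activation callables):

```python
# Simplified sketch of enable_or_disable: map a boolean variant to
# --enable-<name> or --disable-<name>. Real Spack reads the variant from
# self.pkg.spec; a dict stands in for the spec here.
def enable_or_disable(name, variants, activation_value=None):
    if variants[name]:
        flag = f"--enable-{name}"
        if activation_value is not None:
            # e.g. activation_value="prefix" appends =<prefix> in Spack
            flag += f"={activation_value}"
        return [flag]
    return [f"--disable-{name}"]

variants = {"shared": True, "static": False}
print(enable_or_disable("shared", variants))
print(enable_or_disable("static", variants))
```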
run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def installcheck(self) -> None:
"""Run "make" on the ``installcheck`` target, if found."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("installcheck")
@run_after("install")
@spack.phase_callbacks.run_after("install")
def _remove_libtool_archives(self) -> None:
"""Remove all .la files in prefix sub-folders if the package sets
``install_libtool_archives`` to be False.
@@ -843,23 +841,25 @@ def _remove_libtool_archives(self) -> None:
return
# Remove the files and create a log of what was removed
libtool_files = find(str(self.pkg.prefix), "*.la", recursive=True)
libtool_files = fs.find(str(self.pkg.prefix), "*.la", recursive=True)
with fs.safe_remove(*libtool_files):
mkdirp(os.path.dirname(self._removed_la_files_log))
fs.mkdirp(os.path.dirname(self._removed_la_files_log))
with open(self._removed_la_files_log, mode="w", encoding="utf-8") as f:
f.write("\n".join(libtool_files))
def setup_build_environment(self, env: EnvironmentModifications) -> None:
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
if self.spec.platform == "darwin" and macos_version() >= Version("11"):
# Many configure files rely on matching '10.*' for macOS version
# detection and fail to add flags if it shows as version 11.
env.set("MACOSX_DEPLOYMENT_TARGET", "10.16")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
def _autoreconf_search_path_args(spec: Spec) -> List[str]:
def _autoreconf_search_path_args(spec: spack.spec.Spec) -> List[str]:
dirs_seen: Set[Tuple[int, int]] = set()
flags_spack: List[str] = []
flags_external: List[str] = []


@@ -1,10 +1,12 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import Builder, PackageBase, Prefix, Spec, build_system, register_builder
import spack.builder
import spack.directives
import spack.package_base
class BundlePackage(PackageBase):
class BundlePackage(spack.package_base.PackageBase):
"""General purpose bundle, or no-code, package class."""
#: This attribute is used in UI queries that require to know which
@@ -17,12 +19,12 @@ class BundlePackage(PackageBase):
#: Bundle packages do not have associated source or binary code.
has_code = False
build_system("bundle")
spack.directives.build_system("bundle")
@register_builder("bundle")
class BundleBuilder(Builder):
@spack.builder.builder("bundle")
class BundleBuilder(spack.builder.Builder):
phases = ("install",)
def install(self, pkg: BundlePackage, spec: Spec, prefix: Prefix) -> None:
def install(self, pkg, spec, prefix):
pass


@@ -7,7 +7,14 @@
import re
from typing import Optional, Tuple
from spack.package import Prefix, Spec, depends_on, install, mkdirp, run_after, tty, which_string
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import depends_on
from spack.util.executable import which_string
from .cmake import CMakeBuilder, CMakePackage
@@ -368,7 +375,9 @@ def initconfig_package_entries(self):
"""This method is to be overwritten by the package"""
return []
def initconfig(self, pkg: "CachedCMakePackage", spec: Spec, prefix: Prefix) -> None:
def initconfig(
self, pkg: "CachedCMakePackage", spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
cache_entries = (
self.std_initconfig_entries()
+ self.initconfig_compiler_entries()
@@ -388,10 +397,10 @@ def std_cmake_args(self):
args.extend(["-C", self.cache_path])
return args
@run_after("install")
@spack.phase_callbacks.run_after("install")
def install_cmake_cache(self):
mkdirp(self.pkg.spec.prefix.share.cmake)
install(self.cache_path, self.pkg.spec.prefix.share.cmake)
fs.mkdirp(self.pkg.spec.prefix.share.cmake)
fs.install(self.cache_path, self.pkg.spec.prefix.share.cmake)
class CachedCMakePackage(CMakePackage):


@@ -2,24 +2,21 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import (
EnvironmentModifications,
PackageBase,
Prefix,
Spec,
build_system,
depends_on,
install_tree,
register_builder,
run_after,
when,
working_dir,
)
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.environment
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BuilderWithDefaults, execute_install_time_tests
class CargoPackage(PackageBase):
class CargoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using cargo."""
#: This attribute is used in UI queries that need to know the build
@@ -32,7 +29,7 @@ class CargoPackage(PackageBase):
depends_on("rust", type="build")
@register_builder("cargo")
@spack.builder.builder("cargo")
class CargoBuilder(BuilderWithDefaults):
"""The Cargo builder encodes the most common way of building software with
a Rust Cargo.toml file. It has two phases that can be overridden, if need be:
@@ -90,24 +87,30 @@ def check_args(self):
"""Argument for ``cargo test`` during check phase"""
return []
def setup_build_environment(self, env: EnvironmentModifications) -> None:
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
env.set("CARGO_HOME", self.stage.path)
def build(self, pkg: CargoPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Runs ``cargo install`` in the source directory"""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.cargo(
"install", "--root", "out", "--path", ".", *self.std_build_args, *self.build_args
)
def install(self, pkg: CargoPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Copy build files into package prefix."""
with working_dir(self.build_directory):
install_tree("out", prefix)
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)
run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def check(self):
"""Run "cargo test"."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
self.pkg.module.cargo("test", *self.check_args)


@@ -10,25 +10,20 @@
from itertools import chain
from typing import Any, List, Optional, Tuple
import llnl.util.filesystem as fs
from llnl.util import tty
from llnl.util.lang import stable_partition
import spack.builder
import spack.deptypes as dt
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack import traverse
from spack.package import (
InstallError,
PackageBase,
Prefix,
Spec,
build_system,
conflicts,
depends_on,
register_builder,
run_after,
tty,
variant,
when,
working_dir,
)
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from spack.util.environment import filter_system_paths
from ._checks import BuilderWithDefaults, execute_build_time_tests
@@ -46,7 +41,7 @@ def _extract_primary_generator(generator):
return _primary_generator_extractor.match(generator).group(1)
def _maybe_set_python_hints(pkg: PackageBase, args: List[str]) -> None:
def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]) -> None:
"""Set the PYTHON_EXECUTABLE, Python_EXECUTABLE, and Python3_EXECUTABLE CMake variables
if the package has Python as build or link dep and ``find_python_hints`` is set to True. See
``find_python_hints`` for context."""
@@ -64,7 +59,7 @@ def _maybe_set_python_hints(pkg: PackageBase, args: List[str]) -> None:
)
def _supports_compilation_databases(pkg: PackageBase) -> bool:
def _supports_compilation_databases(pkg: spack.package_base.PackageBase) -> bool:
"""Check if this package (and CMake) can support compilation databases."""
# CMAKE_EXPORT_COMPILE_COMMANDS only exists for CMake >= 3.5
@@ -78,7 +73,7 @@ def _supports_compilation_databases(pkg: PackageBase) -> bool:
return True
def _conditional_cmake_defaults(pkg: PackageBase, args: List[str]) -> None:
def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[str]) -> None:
"""Set a few default defines for CMake, depending on its version."""
cmakes = pkg.spec.dependencies("cmake", dt.BUILD)
@@ -169,7 +164,7 @@ def _values(x):
conflicts(f"generator={x}")
def get_cmake_prefix_path(pkg: PackageBase) -> List[str]:
def get_cmake_prefix_path(pkg: spack.package_base.PackageBase) -> List[str]:
"""Obtain the CMAKE_PREFIX_PATH entries for a package, based on the cmake_prefix_path package
attribute of direct build/test and transitive link dependencies."""
edges = traverse.traverse_topo_edges_generator(
@@ -190,7 +185,7 @@ def get_cmake_prefix_path(pkg: PackageBase) -> List[str]:
)
class CMakePackage(PackageBase):
class CMakePackage(spack.package_base.PackageBase):
"""Specialized class for packages built using CMake
For more information on the CMake build system, see:
@@ -288,7 +283,7 @@ def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) ->
return define_from_variant(self, cmake_var, variant)
@register_builder("cmake")
@spack.builder.builder("cmake")
class CMakeBuilder(BuilderWithDefaults):
"""The cmake builder encodes the default way of building software with CMake. IT
has three phases that can be overridden:
@@ -375,7 +370,9 @@ def std_cmake_args(self) -> List[str]:
return args
@staticmethod
def std_args(pkg: PackageBase, generator: Optional[str] = None) -> List[str]:
def std_args(
pkg: spack.package_base.PackageBase, generator: Optional[str] = None
) -> List[str]:
"""Computes the standard cmake arguments for a generic package"""
default_generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
generator = generator or default_generator
@@ -385,7 +382,7 @@ def std_args(pkg: PackageBase, generator: Optional[str] = None) -> List[str]:
msg = "Invalid CMake generator: '{0}'\n".format(generator)
msg += "CMakePackage currently supports the following "
msg += "primary generators: '{0}'".format("', '".join(valid_primary_generators))
raise InstallError(msg)
raise spack.error.InstallError(msg)
try:
build_type = pkg.spec.variants["build_type"].value
@@ -423,11 +420,11 @@ def std_args(pkg: PackageBase, generator: Optional[str] = None) -> List[str]:
return args
@staticmethod
def define_cuda_architectures(pkg: PackageBase) -> str:
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_cuda_architectures(pkg)
@staticmethod
def define_hip_architectures(pkg: PackageBase) -> str:
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_hip_architectures(pkg)
@staticmethod
@@ -457,7 +454,9 @@ def cmake_args(self) -> List[str]:
"""
return []
def cmake(self, pkg: CMakePackage, spec: Spec, prefix: Prefix) -> None:
def cmake(
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Runs ``cmake`` in the build directory"""
if spec.is_develop:
@@ -481,33 +480,37 @@ def cmake(self, pkg: CMakePackage, spec: Spec, prefix: Prefix) -> None:
options = self.std_cmake_args
options += self.cmake_args()
options.append(os.path.abspath(self.root_cmakelists_dir))
with working_dir(self.build_directory, create=True):
with fs.working_dir(self.build_directory, create=True):
pkg.module.cmake(*options)
def build(self, pkg: CMakePackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the build targets"""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
pkg.module.make(*self.build_targets)
elif self.generator == "Ninja":
self.build_targets.append("-v")
pkg.module.ninja(*self.build_targets)
def install(self, pkg: CMakePackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the install targets"""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
pkg.module.make(*self.install_targets)
elif self.generator == "Ninja":
pkg.module.ninja(*self.install_targets)
run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self) -> None:
"""Search the CMake-generated files for the targets ``test`` and ``check``,
and runs them if found.
"""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
self.pkg._if_make_target_execute("test", jobs_env="CTEST_PARALLEL_LEVEL")
self.pkg._if_make_target_execute("check")
@@ -554,7 +557,9 @@ def define(cmake_var: str, value: Any) -> str:
return "".join(["-D", cmake_var, ":", kind, "=", value])
def define_from_variant(pkg: PackageBase, cmake_var: str, variant: Optional[str] = None) -> str:
def define_from_variant(
pkg: spack.package_base.PackageBase, cmake_var: str, variant: Optional[str] = None
) -> str:
"""Return a CMake command line argument from the given variant's value.
The optional ``variant`` argument defaults to the lower-case transform
@@ -614,7 +619,7 @@ def define_from_variant(pkg: PackageBase, cmake_var: str, variant: Optional[str]
return define(cmake_var, value)
def define_hip_architectures(pkg: PackageBase) -> str:
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
``amdgpu_target`` is variant composed of a list of the target HIP
@@ -630,7 +635,7 @@ def define_hip_architectures(pkg: PackageBase) -> str:
return ""
def define_cuda_architectures(pkg: PackageBase) -> str:
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
``cuda_arch`` is a variant composed of a list of target CUDA architectures and


@@ -8,16 +8,19 @@
import sys
from typing import Dict, List, Optional, Sequence, Tuple, Union
import llnl.util.tty as tty
from llnl.util.lang import classproperty, memoized
import spack
import spack.compilers.error
from spack.package import Executable, PackageBase, ProcessError, Spec, tty, which_string
import spack.package_base
import spack.util.executable
# Local "type" for type hints
Path = Union[str, pathlib.Path]
class CompilerPackage(PackageBase):
class CompilerPackage(spack.package_base.PackageBase):
"""A Package mixin for all common logic for packages that implement compilers"""
# TODO: how do these play nicely with other tags
@@ -49,7 +52,7 @@ class CompilerPackage(PackageBase):
#: Flags for generating debug information
debug_flags: Sequence[str] = []
def __init__(self, spec: Spec):
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
msg += f" supports: {self.supported_languages}, valid values: {self.compiler_languages}"
@@ -94,7 +97,7 @@ def determine_version(cls, exe: Path) -> str:
match = re.search(cls.compiler_version_regex, output)
if match:
return ".".join(match.groups())
except ProcessError:
except spack.util.executable.ProcessError:
pass
except Exception as e:
tty.debug(
@@ -227,7 +230,7 @@ def _compiler_output(
compiler_path: path of the compiler to be invoked
version_argument: the argument used to extract version information
"""
compiler = Executable(compiler_path)
compiler = spack.util.executable.Executable(compiler_path)
if not version_argument:
return compiler(
output=str, error=str, ignore_errors=ignore_errors, timeout=120, fail_on_error=True
@@ -250,7 +253,7 @@ def compiler_output(
# not just executable name. If we don't do this, and the path changes
# (e.g., during testing), we can get incorrect results.
if not os.path.isabs(compiler_path):
compiler_path = which_string(str(compiler_path), required=True)
compiler_path = spack.util.executable.which_string(str(compiler_path), required=True)
return _compiler_output(
compiler_path, version_argument=version_argument, ignore_errors=ignore_errors
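`determine_version` above invokes a compiler with its version argument and extracts the version by matching `compiler_version_regex` against the output. A standalone sketch of just the extraction step; the regex and sample output are illustrative, not taken from any real compiler package:

```python
import re

# Sketch of the version-extraction step in CompilerPackage.determine_version:
# search the compiler's version output with a class-level regex and join
# the captured groups with dots. Regex and output below are made up.
compiler_version_regex = r"version (\d+)\.(\d+)\.(\d+)"

def determine_version(output):
    match = re.search(compiler_version_regex, output)
    if match:
        return ".".join(match.groups())
    return None

print(determine_version("gcc (GCC) version 12.3.0 (toolchain build)"))
```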


@@ -5,7 +5,10 @@
import re
from typing import Iterable, List
from spack.package import PackageBase, any_combination_of, conflicts, depends_on, variant, when
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.multimethod import when
from spack.package_base import PackageBase
class CudaPackage(PackageBase):
@@ -68,7 +71,7 @@ class CudaPackage(PackageBase):
variant(
"cuda_arch",
description="CUDA architecture",
values=any_combination_of(*cuda_arch_values),
values=spack.variant.any_combination_of(*cuda_arch_values),
sticky=True,
when="+cuda",
)


@@ -3,12 +3,17 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Tuple
from spack.package import PackageBase, Prefix, Spec, build_system, register_builder, run_after
import spack.builder
import spack.directives
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from ._checks import BuilderWithDefaults, apply_macos_rpath_fixups, execute_install_time_tests
class Package(PackageBase):
class Package(spack.package_base.PackageBase):
"""General purpose class with a single ``install`` phase that needs to be
coded by packagers.
"""
@@ -19,10 +24,10 @@ class Package(PackageBase):
#: Legacy buildsystem attribute used to deserialize and install old specs
legacy_buildsystem = "generic"
build_system("generic")
spack.directives.build_system("generic")
@register_builder("generic")
@spack.builder.builder("generic")
class GenericBuilder(BuilderWithDefaults):
"""A builder for a generic build system, that require packagers
to implement an "install" phase.
@@ -41,10 +46,12 @@ class GenericBuilder(BuilderWithDefaults):
install_time_test_callbacks = []
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
# unconditionally perform any post-install phase tests
run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def install(self, pkg: Package, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: Package, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
raise NotImplementedError


@@ -4,11 +4,11 @@
from typing import Optional
import spack.package_base
import spack.util.url
from spack.package import PackageBase
class GNUMirrorPackage(PackageBase):
class GNUMirrorPackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for GNU packages."""
#: Path of the package in a GNU mirror


@@ -2,26 +2,21 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import (
EnvironmentModifications,
PackageBase,
Prefix,
Spec,
build_system,
depends_on,
install,
join_path,
mkdirp,
register_builder,
run_after,
when,
working_dir,
)
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.environment
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BuilderWithDefaults, execute_install_time_tests
class GoPackage(PackageBase):
class GoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using the Go toolchain."""
#: This attribute is used in UI queries that need to know the build
@@ -37,7 +32,7 @@ class GoPackage(PackageBase):
depends_on("go", type="build")
@register_builder("go")
@spack.builder.builder("go")
class GoBuilder(BuilderWithDefaults):
"""The Go builder encodes the most common way of building software with
a golang go.mod file. It has two phases that can be overridden, if need be:
@@ -74,10 +69,12 @@ class GoBuilder(BuilderWithDefaults):
#: Callback names for install-time test
install_time_test_callbacks = ["check"]
def setup_build_environment(self, env: EnvironmentModifications) -> None:
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
env.set("GO111MODULE", "on")
env.set("GOTOOLCHAIN", "local")
env.set("GOPATH", join_path(self.pkg.stage.path, "go"))
env.set("GOPATH", fs.join_path(self.pkg.stage.path, "go"))
@property
def build_directory(self):
@@ -103,20 +100,24 @@ def check_args(self):
"""Argument for ``go test`` during check phase"""
return []
def build(self, pkg: GoPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: GoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Runs ``go build`` in the source directory"""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.go("build", *self.build_args)
def install(self, pkg: GoPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: GoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install built binaries into prefix bin."""
with working_dir(self.build_directory):
mkdirp(prefix.bin)
install(pkg.name, prefix.bin)
with fs.working_dir(self.build_directory):
fs.mkdirp(prefix.bin)
fs.install(pkg.name, prefix.bin)
run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def check(self):
"""Run ``go test .`` in the source directory"""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
self.pkg.module.go("test", *self.check_args)
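`GoBuilder.setup_build_environment` above records variable changes on an `EnvironmentModifications` object rather than mutating `os.environ` directly. A minimal sketch of that record-then-apply pattern, loosely modeled on `spack.util.environment.EnvironmentModifications` (this is a hypothetical stand-in, not Spack's implementation):

```python
# Record environment edits as operations, then replay them onto a target
# mapping. Hypothetical sketch; not Spack's EnvironmentModifications.
from typing import Dict, List, Tuple

class EnvMods:
    def __init__(self) -> None:
        self._ops: List[Tuple[str, str, str]] = []

    def set(self, name: str, value: str) -> None:
        self._ops.append(("set", name, value))

    def unset(self, name: str) -> None:
        self._ops.append(("unset", name, ""))

    def apply(self, env: Dict[str, str]) -> None:
        # Replay recorded operations, in order, onto the target dict.
        for op, name, value in self._ops:
            if op == "set":
                env[name] = value
            else:
                env.pop(name, None)

# Usage mirroring the two unconditional settings in the diff above:
mods = EnvMods()
mods.set("GO111MODULE", "on")
mods.set("GOTOOLCHAIN", "local")
env: Dict[str, str] = {}
mods.apply(env)
```

Recording edits instead of applying them immediately lets the caller decide which process environment (build shell, test shell) the modifications land in.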

File diff suppressed because it is too large


@@ -3,23 +3,19 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.package import (
Builder,
EnvironmentModifications,
Executable,
PackageBase,
Prefix,
Spec,
build_system,
depends_on,
extends,
find,
register_builder,
when,
)
from llnl.util.filesystem import find
import spack.builder
import spack.package_base
import spack.spec
import spack.util.environment
import spack.util.executable
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
class LuaPackage(PackageBase):
class LuaPackage(spack.package_base.PackageBase):
"""Specialized class for lua packages"""
#: This attribute is used in UI queries that need to know the build
@@ -44,16 +40,16 @@ class LuaPackage(PackageBase):
@property
def lua(self):
return Executable(self.spec["lua-lang"].prefix.bin.lua)
return spack.util.executable.Executable(self.spec["lua-lang"].prefix.bin.lua)
@property
def luarocks(self):
lr = Executable(self.spec["lua-lang"].prefix.bin.luarocks)
lr = spack.util.executable.Executable(self.spec["lua-lang"].prefix.bin.luarocks)
return lr
@register_builder("lua")
class LuaBuilder(Builder):
@spack.builder.builder("lua")
class LuaBuilder(spack.builder.Builder):
phases = ("unpack", "generate_luarocks_config", "preprocess", "install")
#: Names associated with package methods in the old build-system format
@@ -62,7 +58,9 @@ class LuaBuilder(Builder):
#: Names associated with package attributes in the old build-system format
legacy_attributes = ()
def unpack(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
def unpack(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
if os.path.splitext(pkg.stage.archive_file)[1] == ".rock":
directory = pkg.luarocks("unpack", pkg.stage.archive_file, output=str)
dirlines = directory.split("\n")
@@ -73,7 +71,9 @@ def unpack(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
def _generate_tree_line(name, prefix):
return """{{ name = "{name}", root = "{prefix}" }};""".format(name=name, prefix=prefix)
def generate_luarocks_config(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
def generate_luarocks_config(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
spec = self.pkg.spec
table_entries = []
for d in spec.traverse(deptype=("build", "run")):
@@ -92,14 +92,18 @@ def generate_luarocks_config(self, pkg: LuaPackage, spec: Spec, prefix: Prefix)
)
)
def preprocess(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
def preprocess(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Override this to preprocess source before building with luarocks"""
pass
def luarocks_args(self):
return []
def install(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
rock = "."
specs = find(".", "*.rockspec", recursive=False)
if specs:
@@ -111,5 +115,7 @@ def install(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
def _luarocks_config_path(self):
return os.path.join(self.pkg.stage.source_path, "spack_luarocks.lua")
def setup_build_environment(self, env: EnvironmentModifications) -> None:
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
env.set("LUAROCKS_CONFIG", self._luarocks_config_path())


@@ -3,18 +3,15 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import List
from spack.package import (
PackageBase,
Prefix,
Spec,
build_system,
conflicts,
depends_on,
register_builder,
run_after,
when,
working_dir,
)
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from ._checks import (
BuilderWithDefaults,
@@ -24,7 +21,7 @@
)
class MakefilePackage(PackageBase):
class MakefilePackage(spack.package_base.PackageBase):
"""Specialized class for packages built using Makefiles."""
#: This attribute is used in UI queries that need to know the build
@@ -40,7 +37,7 @@ class MakefilePackage(PackageBase):
depends_on("gmake", type="build")
@register_builder("makefile")
@spack.builder.builder("makefile")
class MakefileBuilder(BuilderWithDefaults):
"""The Makefile builder encodes the most common way of building software with
Makefiles. It has three phases that can be overridden, if need be:
@@ -100,36 +97,42 @@ def build_directory(self) -> str:
"""Return the directory containing the main Makefile."""
return self.pkg.stage.source_path
def edit(self, pkg: MakefilePackage, spec: Spec, prefix: Prefix) -> None:
def edit(
self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Edit the Makefile before calling make. The default is a no-op."""
pass
def build(self, pkg: MakefilePackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the build targets specified by the builder."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.make(*self.build_targets)
def install(self, pkg: MakefilePackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the install targets specified by the builder."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def installcheck(self) -> None:
"""Searches the Makefile for an ``installcheck`` target
and runs it if found.
"""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("installcheck")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)


@@ -1,23 +1,20 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import (
PackageBase,
Prefix,
Spec,
build_system,
depends_on,
install_tree,
register_builder,
when,
which,
working_dir,
)
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
from spack.util.executable import which
from ._checks import BuilderWithDefaults
class MavenPackage(PackageBase):
class MavenPackage(spack.package_base.PackageBase):
"""Specialized class for packages that are built using the
Maven build system. See https://maven.apache.org/index.html
for more information.
@@ -37,7 +34,7 @@ class MavenPackage(PackageBase):
depends_on("maven", type="build")
@register_builder("maven")
@spack.builder.builder("maven")
class MavenBuilder(BuilderWithDefaults):
"""The Maven builder encodes the default way to build software with Maven.
It has two phases that can be overridden, if need be:
@@ -63,16 +60,20 @@ def build_args(self):
"""List of args to pass to build phase."""
return []
def build(self, pkg: MavenPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: MavenPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Compile code and package into a JAR file."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
mvn = which("mvn", required=True)
if self.pkg.run_tests:
mvn("verify", *self.build_args())
else:
mvn("package", "-DskipTests", *self.build_args())
def install(self, pkg: MavenPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: MavenPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Copy to installation prefix."""
with working_dir(self.build_directory):
install_tree(".", prefix)
with fs.working_dir(self.build_directory):
fs.install_tree(".", prefix)


@@ -4,24 +4,20 @@
import os
from typing import List
from spack.package import (
PackageBase,
Prefix,
Spec,
build_system,
conflicts,
depends_on,
register_builder,
run_after,
variant,
when,
working_dir,
)
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from ._checks import BuilderWithDefaults, execute_build_time_tests
class MesonPackage(PackageBase):
class MesonPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using Meson. For more information
on the Meson build system, see https://mesonbuild.com/
"""
@@ -70,7 +66,7 @@ def flags_to_build_system_args(self, flags):
setattr(self, "meson_flag_args", [])
@register_builder("meson")
@spack.builder.builder("meson")
class MesonBuilder(BuilderWithDefaults):
"""The Meson builder encodes the default way to build software with Meson.
The builder has three phases that can be overridden, if need be:
@@ -194,7 +190,9 @@ def meson_args(self) -> List[str]:
"""
return []
def meson(self, pkg: MesonPackage, spec: Spec, prefix: Prefix) -> None:
def meson(
self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run ``meson`` in the build directory"""
options = []
if self.spec["meson"].satisfies("@0.64:"):
@@ -202,25 +200,29 @@ def meson(self, pkg: MesonPackage, spec: Spec, prefix: Prefix) -> None:
options.append(os.path.abspath(self.root_mesonlists_dir))
options += self.std_meson_args
options += self.meson_args()
with working_dir(self.build_directory, create=True):
with fs.working_dir(self.build_directory, create=True):
pkg.module.meson(*options)
def build(self, pkg: MesonPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the build targets"""
options = ["-v"]
options += self.build_targets
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.ninja(*options)
def install(self, pkg: MesonPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the install targets"""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.ninja(*self.install_targets)
run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self) -> None:
"""Search Meson-generated files for the target ``test`` and run it if found."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
self.pkg._if_ninja_target_execute("test")
self.pkg._if_ninja_target_execute("check")


@@ -5,20 +5,16 @@
import llnl.util.filesystem as fs
from spack.package import (
PackageBase,
Prefix,
Spec,
build_system,
conflicts,
register_builder,
working_dir,
)
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts
from ._checks import BuilderWithDefaults
class MSBuildPackage(PackageBase):
class MSBuildPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using Visual Studio project files or solutions."""
#: This attribute is used in UI queries that need to know the build
@@ -34,7 +30,7 @@ def define(self, msbuild_arg, value):
return define(msbuild_arg, value)
@register_builder("msbuild")
@spack.builder.builder("msbuild")
class MSBuildBuilder(BuilderWithDefaults):
"""The MSBuild builder encodes the most common way of building software with
Mircosoft's MSBuild tool. It has two phases that can be overridden, if need be:
@@ -109,19 +105,23 @@ def msbuild_install_args(self):
as `msbuild_args` by default."""
return self.msbuild_args()
def build(self, pkg: MSBuildPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: MSBuildPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "msbuild" on the build targets specified by the builder."""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.msbuild(
*self.std_msbuild_args,
*self.msbuild_args(),
self.define_targets(*self.build_targets),
)
def install(self, pkg: MSBuildPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: MSBuildPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "msbuild" on the install targets specified by the builder.
This is INSTALL by default"""
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.msbuild(
*self.msbuild_install_args(), self.define_targets(*self.install_targets)
)


@@ -5,20 +5,16 @@
import llnl.util.filesystem as fs
from spack.package import (
PackageBase,
Prefix,
Spec,
build_system,
conflicts,
register_builder,
working_dir,
)
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts
from ._checks import BuilderWithDefaults
class NMakePackage(PackageBase):
class NMakePackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
#: This attribute is used in UI queries that need to know the build
@@ -30,7 +26,7 @@ class NMakePackage(PackageBase):
conflicts("platform=darwin", when="build_system=nmake")
@register_builder("nmake")
@spack.builder.builder("nmake")
class NMakeBuilder(BuilderWithDefaults):
"""The NMake builder encodes the most common way of building software with
Mircosoft's NMake tool. It has two phases that can be overridden, if need be:
@@ -129,16 +125,20 @@ def nmake_install_args(self):
Individual packages should override to specify NMake args to command line"""
return []
def build(self, pkg: NMakePackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: NMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "nmake" on the build targets specified by the builder."""
opts = self.std_nmake_args
opts += self.nmake_args()
if self.makefile_name:
opts.append("/F{}".format(self.makefile_name))
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.nmake(*opts, *self.build_targets, ignore_quotes=self.ignore_quotes)
def install(self, pkg: NMakePackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: NMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "nmake" on the install targets specified by the builder.
This is INSTALL by default"""
opts = self.std_nmake_args
@@ -147,5 +147,5 @@ def install(self, pkg: NMakePackage, spec: Spec, prefix: Prefix) -> None:
if self.makefile_name:
opts.append("/F{}".format(self.makefile_name))
opts.append(self.define("PREFIX", fs.windows_sfn(prefix)))
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pkg.module.nmake(*opts, *self.install_targets, ignore_quotes=self.ignore_quotes)


@@ -1,21 +1,18 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import (
EnvironmentModifications,
PackageBase,
Prefix,
Spec,
build_system,
extends,
register_builder,
when,
)
import spack.builder
import spack.package_base
import spack.spec
import spack.util.environment
import spack.util.prefix
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BuilderWithDefaults
class OctavePackage(PackageBase):
class OctavePackage(spack.package_base.PackageBase):
"""Specialized class for Octave packages. See
https://www.gnu.org/software/octave/doc/v4.2.0/Installing-and-Removing-Packages.html
for more information.
@@ -33,7 +30,7 @@ class OctavePackage(PackageBase):
extends("octave")
@register_builder("octave")
@spack.builder.builder("octave")
class OctaveBuilder(BuilderWithDefaults):
"""The octave builder provides the following phases that can be overridden:
@@ -48,7 +45,9 @@ class OctaveBuilder(BuilderWithDefaults):
#: Names associated with package attributes in the old build-system format
legacy_attributes = ()
def install(self, pkg: OctavePackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: OctavePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install the package from the archive file"""
pkg.module.octave(
"--quiet",
@@ -59,7 +58,9 @@ def install(self, pkg: OctavePackage, spec: Spec, prefix: Prefix) -> None:
"pkg prefix %s; pkg install %s" % (prefix, self.pkg.stage.archive_file),
)
def setup_build_environment(self, env: EnvironmentModifications) -> None:
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
# octave does not like those environment variables to be set:
env.unset("CC")
env.unset("CXX")


@@ -7,25 +7,16 @@
import shutil
from os.path import basename, isdir
from llnl.util import tty
from llnl.util.filesystem import HeaderList, LibraryList, find_libraries, join_path, mkdirp
from llnl.util.link_tree import LinkTree
import spack.util.path
from spack.build_environment import dso_suffix
from spack.package import (
EnvironmentModifications,
Executable,
HeaderList,
InstallError,
LibraryList,
conflicts,
find_libraries,
join_path,
license,
mkdirp,
redistribute,
tty,
variant,
)
from spack.directives import conflicts, license, redistribute, variant
from spack.error import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from .generic import Package


@@ -4,29 +4,23 @@
import os
from typing import Iterable
from llnl.util.filesystem import filter_file, find
from llnl.util.lang import memoized
from spack.package import (
Executable,
PackageBase,
Prefix,
SkipTest,
Spec,
build_system,
depends_on,
extends,
filter_file,
find,
register_builder,
run_after,
test_part,
when,
)
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
from spack.install_test import SkipTest, test_part
from spack.multimethod import when
from spack.util.executable import Executable
from ._checks import BuilderWithDefaults, execute_build_time_tests
class PerlPackage(PackageBase):
class PerlPackage(spack.package_base.PackageBase):
"""Specialized class for packages that are built using Perl."""
#: This attribute is used in UI queries that need to know the build
@@ -94,7 +88,7 @@ def test_use(self):
assert "OK" in out
@register_builder("perl")
@spack.builder.builder("perl")
class PerlBuilder(BuilderWithDefaults):
"""The perl builder provides four phases that can be overridden, if required:
@@ -157,7 +151,9 @@ def configure_args(self):
"""
return []
def configure(self, pkg: PerlPackage, spec: Spec, prefix: Prefix) -> None:
def configure(
self, pkg: PerlPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run Makefile.PL or Build.PL with arguments consisting of
an appropriate installation base directory followed by the
list returned by :py:meth:`~.PerlBuilder.configure_args`.
@@ -174,24 +170,28 @@ def configure(self, pkg: PerlPackage, spec: Spec, prefix: Prefix) -> None:
# Build.PL may be too long causing the build to fail. Patching the shebang
# does not happen until after install so set '/usr/bin/env perl' here in
# the Build script.
@run_after("configure")
@spack.phase_callbacks.run_after("configure")
def fix_shebang(self):
if self.build_method == "Build.PL":
pattern = "#!{0}".format(self.spec["perl"].command.path)
repl = "#!/usr/bin/env perl"
filter_file(pattern, repl, "Build", backup=False)
def build(self, pkg: PerlPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: PerlPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Builds a Perl package."""
self.build_executable()
# Ensure that tests run after build (if requested):
run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
"""Runs built-in tests of a Perl package."""
self.build_executable("test")
def install(self, pkg: PerlPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: PerlPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Installs a Perl package."""
self.build_executable("install")


@@ -13,37 +13,27 @@
import archspec
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.filesystem import HeaderList, LibraryList, join_path
from llnl.util.lang import ClassProperty, classproperty, match_predicate
import spack.builder
import spack.config
import spack.deptypes as dt
import spack.detection
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
import spack.platforms
import spack.repo
import spack.spec
import spack.store
from spack.package import (
HeaderList,
LibraryList,
NoHeadersError,
NoLibrariesError,
PackageBase,
Prefix,
Spec,
build_system,
depends_on,
extends,
filter_file,
find,
find_all_headers,
join_path,
register_builder,
run_after,
test_part,
tty,
when,
working_dir,
)
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part
from spack.spec import Spec
from spack.util.prefix import Prefix
from ._checks import BuilderWithDefaults, execute_install_time_tests
@@ -67,7 +57,7 @@ def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
yield f"{key}={item}"
class PythonExtension(PackageBase):
class PythonExtension(spack.package_base.PackageBase):
@property
def import_modules(self) -> Iterable[str]:
"""Names of modules that the Python package provides.
@@ -96,7 +86,7 @@ def import_modules(self) -> Iterable[str]:
# Some Python libraries are packages: collections of modules
# distributed in directories containing __init__.py files
for path in find(root, "__init__.py", recursive=True):
for path in fs.find(root, "__init__.py", recursive=True):
modules.append(
path.replace(root + os.sep, "", 1)
.replace(os.sep + "__init__.py", "")
@@ -105,7 +95,7 @@ def import_modules(self) -> Iterable[str]:
# Some Python libraries are modules: individual *.py files
# found in the site-packages directory
for path in find(root, "*.py", recursive=False):
for path in fs.find(root, "*.py", recursive=False):
modules.append(
path.replace(root + os.sep, "", 1).replace(".py", "").replace("/", ".")
)
@@ -190,7 +180,7 @@ def add_files_to_view(self, view, merge_map, skip_if_exists=True):
if (s.st_mode & 0b111) and fs.has_shebang(src):
copied_files[(s.st_dev, s.st_ino)] = dst
shutil.copy2(src, dst)
filter_file(
fs.filter_file(
python.prefix, os.path.abspath(view.get_projection_for_spec(self.spec)), dst
)
else:
@@ -372,7 +362,7 @@ class PythonPackage(PythonExtension):
build_system("python_pip")
with when("build_system=python_pip"):
with spack.multimethod.when("build_system=python_pip"):
extends("python")
depends_on("py-pip", type="build")
# FIXME: technically wheel is only needed when building from source, not when
@@ -405,7 +395,7 @@ def headers(self) -> HeaderList:
platlib = self.prefix.join(python.package.platlib).join(name)
purelib = self.prefix.join(python.package.purelib).join(name)
headers_list = map(find_all_headers, [include, platlib, purelib])
headers_list = map(fs.find_all_headers, [include, platlib, purelib])
headers = functools.reduce(operator.add, headers_list)
if headers:
@@ -437,7 +427,7 @@ def libs(self) -> LibraryList:
raise NoLibrariesError(msg.format(self.spec.name, platlib, purelib))
@register_builder("python_pip")
@spack.builder.builder("python_pip")
class PythonPipBuilder(BuilderWithDefaults):
phases = ("install",)
@@ -553,7 +543,7 @@ def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
else:
args.append(".")
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
pip(*args)
run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
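The truncated `_flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]` context in the hunk above yields `key=value` strings for config options. A hypothetical reconstruction of such a helper, with the list-handling detail assumed rather than taken from the diff:

```python
# Hypothetical reconstruction of a dict-flattening helper like the
# _flatten_dict shown in the diff context; the handling of list values
# is an assumption, not Spack's actual code.
from typing import Iterable, Mapping

def flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
    """Yield ``key=value`` strings; list values yield one entry per item."""
    for key, value in dictionary.items():
        if isinstance(value, (list, tuple)):
            for item in value:
                yield f"{key}={item}"
        else:
            yield f"{key}={value}"

assert list(flatten_dict({"a": 1, "b": [2, 3]})) == ["a=1", "b=2", "b=3"]
```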


@@ -1,21 +1,19 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import (
PackageBase,
Prefix,
Spec,
build_system,
depends_on,
register_builder,
run_after,
working_dir,
)
from llnl.util.filesystem import working_dir
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from ._checks import BuilderWithDefaults, execute_build_time_tests
class QMakePackage(PackageBase):
class QMakePackage(spack.package_base.PackageBase):
"""Specialized class for packages built using qmake.
For more information on the qmake build system, see:
@@ -34,7 +32,7 @@ class QMakePackage(PackageBase):
depends_on("gmake", type="build")
@register_builder("qmake")
@spack.builder.builder("qmake")
class QMakeBuilder(BuilderWithDefaults):
"""The qmake builder provides three phases that can be overridden:
@@ -66,17 +64,23 @@ def qmake_args(self):
"""List of arguments passed to qmake."""
return []
def qmake(self, pkg: QMakePackage, spec: Spec, prefix: Prefix) -> None:
def qmake(
self, pkg: QMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run ``qmake`` to configure the project and generate a Makefile."""
with working_dir(self.build_directory):
pkg.module.qmake(*self.qmake_args())
def build(self, pkg: QMakePackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: QMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the build targets"""
with working_dir(self.build_directory):
pkg.module.make()
def install(self, pkg: QMakePackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: QMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the install targets"""
with working_dir(self.build_directory):
pkg.module.make("install")
@@ -86,4 +90,4 @@ def check(self):
with working_dir(self.build_directory):
self.pkg._if_make_target_execute("check")
run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)


@@ -3,9 +3,10 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional, Tuple
from llnl.util.filesystem import mkdirp
from llnl.util.lang import ClassProperty, classproperty
from spack.package import extends, mkdirp
from spack.directives import extends
from .generic import GenericBuilder, Package


@@ -4,25 +4,19 @@
import os
from typing import Optional, Tuple
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import ClassProperty, classproperty
import spack.builder
import spack.spec
import spack.util.prefix
from spack.build_environment import SPACK_NO_PARALLEL_MAKE
from spack.package import (
Builder,
Executable,
PackageBase,
Prefix,
ProcessError,
Spec,
build_system,
determine_number_of_jobs,
extends,
maintainers,
register_builder,
tty,
working_dir,
)
from spack.config import determine_number_of_jobs
from spack.directives import build_system, extends, maintainers
from spack.package_base import PackageBase
from spack.util.environment import env_flag
from spack.util.executable import Executable, ProcessError
def _homepage(cls: "RacketPackage") -> Optional[str]:
@@ -52,8 +46,8 @@ class RacketPackage(PackageBase):
homepage: ClassProperty[Optional[str]] = classproperty(_homepage)
@register_builder("racket")
class RacketBuilder(Builder):
@spack.builder.builder("racket")
class RacketBuilder(spack.builder.Builder):
"""The Racket builder provides an ``install`` phase that can be overridden."""
phases = ("install",)
@@ -82,10 +76,12 @@ def build_directory(self):
ret = os.path.join(ret, self.subdirectory)
return ret
def install(self, pkg: RacketPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: RacketPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install everything from build directory."""
raco = Executable("raco")
with working_dir(self.build_directory):
with fs.working_dir(self.build_directory):
parallel = pkg.parallel and (not env_flag(SPACK_NO_PARALLEL_MAKE))
name = pkg.racket_name
assert name is not None, "Racket package name is not set"


@@ -76,14 +76,10 @@
import os
from spack.package import (
EnvironmentModifications,
PackageBase,
any_combination_of,
conflicts,
depends_on,
variant,
)
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.package_base import PackageBase
from spack.util.environment import EnvironmentModifications
class ROCmPackage(PackageBase):
@@ -139,7 +135,7 @@ class ROCmPackage(PackageBase):
variant(
"amdgpu_target",
description="AMD GPU architecture",
values=any_combination_of(*amdgpu_targets),
values=spack.variant.any_combination_of(*amdgpu_targets),
sticky=True,
when="+rocm",
)
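The `any_combination_of` helper above declares a multi-valued variant whose value may be `none` or any subset of the listed GPU targets. A minimal, self-contained sketch of that idea using `itertools` (illustrative only -- Spack's real helper in `spack.variant` returns a validator object, not an explicit list):

```python
from itertools import chain, combinations


def any_combination_of(*values):
    """Return every allowed setting: 'none' or any non-empty subset.

    Sketch of the concept; not Spack's implementation.
    """
    subsets = chain.from_iterable(
        combinations(values, r) for r in range(1, len(values) + 1)
    )
    return [("none",)] + [tuple(s) for s in subsets]


allowed = any_combination_of("gfx906", "gfx908", "gfx90a")
# one 'none' entry plus 2**3 - 1 = 7 non-empty subsets
print(len(allowed))  # prints 8
```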


@@ -3,20 +3,16 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import glob
from spack.package import (
PackageBase,
Prefix,
Spec,
build_system,
extends,
maintainers,
register_builder,
)
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, extends, maintainers
from ._checks import BuilderWithDefaults
class RubyPackage(PackageBase):
class RubyPackage(spack.package_base.PackageBase):
"""Specialized class for building Ruby gems."""
maintainers("Kerilk")
@@ -32,7 +28,7 @@ class RubyPackage(PackageBase):
extends("ruby", when="build_system=ruby")
@register_builder("ruby")
@spack.builder.builder("ruby")
class RubyBuilder(BuilderWithDefaults):
"""The Ruby builder provides two phases that can be overridden if required:
@@ -48,7 +44,9 @@ class RubyBuilder(BuilderWithDefaults):
#: Names associated with package attributes in the old build-system format
legacy_attributes = ()
def build(self, pkg: RubyPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: RubyPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Build a Ruby gem."""
# ruby-rake provides both rake.gemspec and Rakefile, but only
@@ -64,7 +62,9 @@ def build(self, pkg: RubyPackage, spec: Spec, prefix: Prefix) -> None:
# Some Ruby packages only ship `*.gem` files, so nothing to build
pass
def install(self, pkg: RubyPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: RubyPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install a Ruby gem.
The ruby package sets ``GEM_HOME`` to tell gem where to install to."""


@@ -1,20 +1,17 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import (
PackageBase,
Prefix,
Spec,
build_system,
depends_on,
register_builder,
run_after,
)
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from ._checks import BuilderWithDefaults, execute_build_time_tests
class SConsPackage(PackageBase):
class SConsPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using SCons.
See http://scons.org/documentation.html for more information.
@@ -32,7 +29,7 @@ class SConsPackage(PackageBase):
depends_on("scons", type="build", when="build_system=scons")
@register_builder("scons")
@spack.builder.builder("scons")
class SConsBuilder(BuilderWithDefaults):
"""The Scons builder provides the following phases that can be overridden:
@@ -64,7 +61,9 @@ def build_args(self, spec, prefix):
"""Arguments to pass to build."""
return []
def build(self, pkg: SConsPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: SConsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Build the package."""
pkg.module.scons(*self.build_args(spec, prefix))
@@ -72,7 +71,9 @@ def install_args(self, spec, prefix):
"""Arguments to pass to install."""
return []
def install(self, pkg: SConsPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: SConsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install the package."""
pkg.module.scons("install", *self.install_args(spec, prefix))
@@ -84,4 +85,4 @@ def build_test(self):
"""
pass
run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)


@@ -4,27 +4,23 @@
import os
import re
from spack.package import (
Executable,
PackageBase,
Prefix,
Spec,
build_system,
depends_on,
extends,
find,
register_builder,
run_after,
test_part,
tty,
when,
working_dir,
)
import llnl.util.tty as tty
from llnl.util.filesystem import find, working_dir
import spack.builder
import spack.install_test
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
from spack.util.executable import Executable
from ._checks import BuilderWithDefaults, execute_install_time_tests
class SIPPackage(PackageBase):
class SIPPackage(spack.package_base.PackageBase):
"""Specialized class for packages that are built using the
SIP build system. See https://www.riverbankcomputing.com/software/sip/intro
for more information.
@@ -100,7 +96,7 @@ def test_imports(self):
# Make sure we are importing the installed modules,
# not the ones in the source directory
for module in self.import_modules:
with test_part(
with spack.install_test.test_part(
self,
"test_imports_{0}".format(module),
purpose="checking import of {0}".format(module),
@@ -109,7 +105,7 @@ def test_imports(self):
self.python("-c", "import {0}".format(module))
@register_builder("sip")
@spack.builder.builder("sip")
class SIPBuilder(BuilderWithDefaults):
"""The SIP builder provides the following phases that can be overridden:
@@ -137,7 +133,9 @@ class SIPBuilder(BuilderWithDefaults):
build_directory = "build"
def configure(self, pkg: SIPPackage, spec: Spec, prefix: Prefix) -> None:
def configure(
self, pkg: SIPPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Configure the package."""
# https://www.riverbankcomputing.com/static/Docs/sip/command_line_tools.html
@@ -155,7 +153,9 @@ def configure_args(self):
"""Arguments to pass to configure."""
return []
def build(self, pkg: SIPPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: SIPPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Build the package."""
args = self.build_args()
@@ -166,7 +166,9 @@ def build_args(self):
"""Arguments to pass to build."""
return []
def install(self, pkg: SIPPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: SIPPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install the package."""
args = self.install_args()
@@ -177,4 +179,4 @@ def install_args(self):
"""Arguments to pass to install."""
return []
run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)


@@ -4,11 +4,11 @@
from typing import Optional
import spack.package_base
import spack.util.url
from spack.package import PackageBase
class SourceforgePackage(PackageBase):
class SourceforgePackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceforge
packages."""


@@ -3,11 +3,11 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional
import spack.package_base
import spack.util.url
from spack.package import PackageBase
class SourcewarePackage(PackageBase):
class SourcewarePackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceware.org
packages."""


@@ -1,21 +1,19 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import (
PackageBase,
Prefix,
Spec,
build_system,
depends_on,
register_builder,
run_after,
working_dir,
)
from llnl.util.filesystem import working_dir
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from ._checks import BuilderWithDefaults, execute_build_time_tests, execute_install_time_tests
class WafPackage(PackageBase):
class WafPackage(spack.package_base.PackageBase):
"""Specialized class for packages that are built using the
Waf build system. See https://waf.io/book/ for more information.
"""
@@ -33,7 +31,7 @@ class WafPackage(PackageBase):
depends_on("python@2.5:", type="build", when="build_system=waf")
@register_builder("waf")
@spack.builder.builder("waf")
class WafBuilder(BuilderWithDefaults):
"""The WAF builder provides the following phases that can be overridden:
@@ -101,7 +99,9 @@ def waf(self, *args, **kwargs):
with working_dir(self.build_directory):
self.python("waf", "-j{0}".format(jobs), *args, **kwargs)
def configure(self, pkg: WafPackage, spec: Spec, prefix: Prefix) -> None:
def configure(
self, pkg: WafPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Configures the project."""
args = ["--prefix={0}".format(self.pkg.prefix)]
args += self.configure_args()
@@ -112,7 +112,9 @@ def configure_args(self):
"""Arguments to pass to configure."""
return []
def build(self, pkg: WafPackage, spec: Spec, prefix: Prefix) -> None:
def build(
self, pkg: WafPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Executes the build."""
args = self.build_args()
@@ -122,7 +124,9 @@ def build_args(self):
"""Arguments to pass to build."""
return []
def install(self, pkg: WafPackage, spec: Spec, prefix: Prefix) -> None:
def install(
self, pkg: WafPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Installs the targets on the system."""
args = self.install_args()
@@ -140,7 +144,7 @@ def build_test(self):
"""
pass
run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def install_test(self):
"""Run unit tests after install.
@@ -150,4 +154,4 @@ def install_test(self):
"""
pass
run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)


@@ -4,11 +4,11 @@
from typing import Optional
import spack.package_base
import spack.util.url
from spack.package import PackageBase
class XorgPackage(PackageBase):
class XorgPackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for x.org
packages."""


@@ -75,18 +75,6 @@ def patch(self):
filter_file(r"bdb-sql", "", "dist/Makefile.in")
filter_file(r"gsg_db_server", "", "dist/Makefile.in")
def flag_handler(self, name, flags):
# GCC 15 defaults to C23 which gives a warning (and by default error)
# about assignment to an incompatible function pointer type
# (https://gcc.gnu.org/gcc-15/porting_to.html#c23-fn-decls-without-parameters):
#
# error: initialization of 'int (*)(void)' from incompatible pointer
# type 'int (*)(DB_ENV *, const char *)' {aka 'int (*)(struct __db_env
# *, const char *)'} [-Wincompatible-pointer-types]
if name == "cflags" and self.spec.satisfies("%[virtuals=c] gcc@15:"):
flags.append("-Wno-error=incompatible-pointer-types")
return (flags, None, None)
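The removed `flag_handler` above shows the common pattern of appending a workaround flag only for a specific flag group and compiler condition, then returning a three-slot tuple. A hedged, self-contained sketch of that conditional-append pattern (the compiler check is simplified to a boolean; real Spack code uses `spec.satisfies(...)`, and the meaning of the three return slots is Spack-specific):

```python
def flag_handler(name, flags, compiler_is_gcc15=False):
    """Append the C23 workaround flag only for cflags under GCC >= 15.

    Sketch of the pattern from the diff above, with the spec check
    replaced by a plain boolean for illustration.
    """
    if name == "cflags" and compiler_is_gcc15:
        flags = flags + ["-Wno-error=incompatible-pointer-types"]
    return (flags, None, None)
```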
def configure_args(self):
spec = self.spec


@@ -13,6 +13,7 @@
from llnl.util import lang
import spack.compilers.libraries
import spack.package_base
from spack.package import *
@@ -278,7 +279,7 @@ def enable_new_dtags(self) -> str:
return "--enable-new-dtags"
def _implicit_rpaths(pkg: PackageBase) -> List[str]:
def _implicit_rpaths(pkg: spack.package_base.PackageBase) -> List[str]:
detector = spack.compilers.libraries.CompilerPropertyDetector(pkg.spec)
paths = detector.implicit_rpaths()
return paths


@@ -14,7 +14,7 @@ class Cubelib(AutotoolsPackage):
url = "https://apps.fz-juelich.de/scalasca/releases/cube/4.4/dist/cubelib-4.4.tar.gz"
maintainers("swat-jsc", "wrwilliams")
version("4.9", sha256="a0658f5bf3f74bf7dcf465ab6e30476751ad07eb93618801bdcf190ba3029443")
version("4.8.2", sha256="d6fdef57b1bc9594f1450ba46cf08f431dd0d4ae595c47e2f3454e17e4ae74f4")
version("4.8.1", sha256="e4d974248963edab48c5d0fc5831146d391b0ae4632cccafe840bf5f12cd80a9")
version("4.8", sha256="171c93ac5afd6bc74c50a9a58efdaf8589ff5cc1e5bd773ebdfb2347b77e2f68")


@@ -15,7 +15,6 @@ class Cubew(AutotoolsPackage):
maintainers("swat-jsc", "wrwilliams")
version("4.9", sha256="4ef74e81c569bf53117459cba5a1ea52b5dac739493fa83be39678840cd2acdd")
version("4.8.2", sha256="4f3bcf0622c2429b8972b5eb3f14d79ec89b8161e3c1cc5862ceda417d7975d2")
version("4.8.1", sha256="42cbd743d87c16e805c8e28e79292ab33de259f2cfba46f2682cb35c1bc032d6")
version("4.8", sha256="73c7f9e9681ee45d71943b66c01cfe675b426e4816e751ed2e0b670563ca4cf3")


@@ -26,7 +26,6 @@ class Dcap(AutotoolsPackage):
depends_on("libtool", type="build")
depends_on("m4", type="build")
depends_on("krb5")
depends_on("openssl")
depends_on("libxcrypt")
depends_on("zlib-api")


@@ -22,7 +22,6 @@ class Easi(CMakePackage):
license("BSD-3-Clause")
version("master", branch="master")
version("1.6.1", tag="v1.6.1", commit="f79a943e34e1921b169af6504a68928dc626b6a9")
version("1.5.2", tag="v1.5.2", commit="0d87b1a7db31e453d52c7213cb9b31bda88cbf40")
version("1.5.1", tag="v1.5.1", commit="d12f3371ed26c7371e4efcc11e3cd468063ffdda")
version("1.5.0", tag="v1.5.0", commit="391698ab0072f66280d08441974c2bdb04a65ce0")
@@ -83,9 +82,6 @@ def cmake_args(self):
if spec.satisfies("jit=lua"):
args.append(self.define("LUA", True))
# Forces easilib over easilib64 in the path to python_wrapper
args.append(self.define("CMAKE_INSTALL_LIBDIR", "lib"))
if spec.satisfies("+python"):
args += [self.define("easi_INSTALL_PYTHONDIR", python_platlib)]


@@ -16,7 +16,7 @@ class Exaca(CMakePackage, CudaPackage, ROCmPackage):
homepage = "https://github.com/LLNL/ExaCA"
git = "https://github.com/LLNL/ExaCA.git"
url = "https://github.com/LLNL/ExaCA/archive/2.0.1.tar.gz"
url = "https://github.com/LLNL/ExaCA/archive/2.0.0.tar.gz"
maintainers("streeve", "MattRolchigo")
@@ -25,7 +25,6 @@ class Exaca(CMakePackage, CudaPackage, ROCmPackage):
license("MIT")
version("master", branch="master")
version("2.0.1", sha256="aa094511ee8d8fe4e420814a1b7c61b05e8f45bd0ccd17b8089b7f82b2fa601b")
version("2.0.0", sha256="a33cc65a6e79bed37a644f5bfc9dd5fe356239f78c5b82830c6354acc43e016b")
version("1.3.0", sha256="637215d3c64e8007b55d68bea6003b51671029d9045af847534e0e59c4271a94")
version("1.2.0", sha256="5038d63de96c6142ddea956998e1f4ebffbc4a5723caa4da0e73eb185e6623e4")


@@ -20,7 +20,6 @@ class HpcBeeflow(PythonPackage):
homepage = "https://github.com/lanl/bee"
pypi = "hpc_beeflow/hpc_beeflow-0.1.10.tar.gz"
git = "https://github.com/lanl/BEE.git"
maintainers("pagrubel")
@@ -28,13 +27,14 @@ class HpcBeeflow(PythonPackage):
license("MIT")
version("spack-develop", branch="spack-develop")
version("0.1.10", sha256="b7863798e15591a16f6cd265f9b5b7385779630f1c37d8a2a5178b8bf89fc664")
version("0.1.9", sha256="196eb9155a5ca6e35d0cc514e0609cf352fc757088707306653496b83a311ac1")
depends_on("neo4j@5.17.0", type=("build", "run"))
depends_on("redis@7.4.0", type=("build", "run"))
depends_on("python@3.8.3:3.13.0", type=("build", "run"))
depends_on("python@3.8.3:3.13.0", when="@0.1.10:", type=("build", "run"))
depends_on("python@3.8.3:3.12.2", when="@:0.1.9", type=("build", "run"))
depends_on("py-poetry@0.12:", type="build")
depends_on("py-flask@2.0:", type=("build", "run"))
@@ -43,11 +43,13 @@ class HpcBeeflow(PythonPackage):
depends_on("py-neo4j@5:", type=("build", "run"))
depends_on("py-pyyaml@6.0.1:", type=("build", "run"))
depends_on("py-flask-restful@0.3.9", type=("build", "run"))
depends_on("py-cwl-utils@0.16", type=("build", "run"))
depends_on("py-cwl-utils@0.16:", type=("build", "run"))
depends_on("py-apscheduler@3.6.3:", type=("build", "run"))
depends_on("py-jsonpickle@2.2.0:", type=("build", "run"))
depends_on("py-requests@2.32.3:", type=("build", "run"))
depends_on("py-requests-unixsocket@0.4.1:", type=("build", "run"))
depends_on("py-requests@:2.28", when="@:0.1.9", type=("build", "run"))
depends_on("py-requests@2.32.3:", when="@0.1.10:", type=("build", "run"))
depends_on("py-requests-unixsocket@0.3.0:", when="@:0.1.9", type=("build", "run"))
depends_on("py-requests-unixsocket@0.4.1:", when="@0.1.10:", type=("build", "run"))
depends_on("py-python-daemon@2.3.1:", type=("build", "run"))
depends_on("py-gunicorn@20.1.0:22", type=("build", "run"))
depends_on("py-typer@0.5.0", type=("build", "run"))
@@ -55,11 +57,10 @@ class HpcBeeflow(PythonPackage):
depends_on("py-celery+redis+sqlalchemy@5.3.4:", type=("build", "run"))
depends_on("py-docutils@0.18.1", type=("build", "run"))
depends_on("py-graphviz@0.20.3:", type=("build", "run"))
depends_on("py-networkx@3.1", type=("build", "run"))
depends_on("py-pre-commit@3.5.0", type=("build", "run"))
depends_on("py-networkx@3.1", when="@0.1.10:", type=("build", "run"))
depends_on("py-pre-commit@3.5.0", when="@0.1.10:", type=("build", "run"))
depends_on("py-mypy-extensions", type=("build", "run"))
# Setup for when "no containers" is specified
def setup_run_environment(self, env):
neo4j_bin = join_path(self.spec["neo4j"].prefix, "packaging/standalone/target")


@@ -66,7 +66,6 @@ class Krb5(AutotoolsPackage):
depends_on("openssl@:1", when="@:1.19")
depends_on("openssl")
depends_on("gettext")
depends_on("libedit")
depends_on("perl", type="build")
depends_on("findutils", type="build")
depends_on("pkgconfig", type="build")


@@ -13,6 +13,7 @@
from spack.operating_systems.mac_os import macos_sdk_path
from spack.package import *
from spack.package_base import PackageBase
class LlvmDetection(PackageBase):


@@ -43,7 +43,6 @@ class Mapl(CMakePackage):
version("2.55.0", sha256="13ec3d81d53cf18aa18322b74b9a6990ad7e51224f1156be5d1f834ee826f95c")
version("2.54.2", sha256="70b7be425d07a7be7d9bb0e53b93a372887a048caf23260e0ae602ca6e3670ed")
version("2.54.1", sha256="2430ded45a98989e9100037f54cf22f5a5083e17196514b3667d3003413e49e1")
version("2.53.4", sha256="da38348a72fcbaa2b888578bfa630ab36261206136d33700344ed6792f9f9aeb")
version("2.53.3", sha256="32e9f21e1fdb83a5a59924f498a57e5369732c21eff9014f35052be0a2fa6d6c")
version("2.53.2", sha256="0f294a5289541b0028773f8e5ab2bf04734ec09241baa5a3dcea0e939d40336f")
version("2.53.1", sha256="8371a75d4d81294eb9d99d66702f8cf62d4bd954cec3e247e1afae621b4e4726")


@@ -11,10 +11,11 @@
from spack_repo.builtin.build_systems.rocm import ROCmPackage
import spack.compilers.config
import spack.package_base
from spack.package import *
class MpichEnvironmentModifications(PackageBase):
class MpichEnvironmentModifications(spack.package_base.PackageBase):
"""Collects the environment modifications that are usually needed for the life-cycle of
MPICH, and derivatives.
"""


@@ -12,6 +12,7 @@
import archspec.cpu
import spack.platforms
import spack.version
from spack.package import *
FC_PATH: Dict[str, str] = dict()
@@ -153,7 +154,9 @@ def init_msvc(self):
if str(archspec.cpu.host().family) == "x86_64":
arch = "amd64"
msvc_version = Version(re.search(Msvc.compiler_version_regex, self.cc).group(1))
msvc_version = spack.version.Version(
re.search(Msvc.compiler_version_regex, self.cc).group(1)
)
self.vcvars_call = VCVarsInvocation(vcvars_script_path, arch, msvc_version)
env_cmds.append(self.vcvars_call)
@@ -234,7 +237,7 @@ def vc_toolset_ver(self):
@property
def msvc_version(self):
"""This is the VCToolset version *NOT* the actual version of the cl compiler"""
return Version(re.search(Msvc.compiler_version_regex, self.cc).group(1))
return spack.version.Version(re.search(Msvc.compiler_version_regex, self.cc).group(1))
@property
def vs_root(self):
@@ -264,7 +267,7 @@ def platform_toolset_ver(self):
# TODO: (johnwparent) Update this logic for the next platform toolset
# or VC toolset version update
toolset_ver = self.vc_toolset_ver
vs22_toolset = Version(toolset_ver) > Version("142")
vs22_toolset = spack.version.Version(toolset_ver) > Version("142")
return toolset_ver if not vs22_toolset else "143"
@@ -359,6 +362,6 @@ def get_valid_fortran_pth():
"""Assign maximum available fortran compiler version"""
# TODO (johnwparent): validate compatibility w/ try compiler
# functionality when added
sort_fn = lambda fc_ver: Version(fc_ver)
sort_fn = lambda fc_ver: spack.version.Version(fc_ver)
sort_fc_ver = sorted(list(FC_PATH.keys()), key=sort_fn)
return FC_PATH[sort_fc_ver[-1]] if sort_fc_ver else None
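The lambda above keys the sort on `spack.version.Version` rather than plain strings because lexical string order mis-ranks dotted versions (e.g. "4.10" sorts before "4.9"). A minimal numeric-key sketch standing in for the `Version` class (which also handles letters, pre-releases, and more):

```python
def version_key(ver: str):
    """Numeric sort key for simple dotted versions.

    Stand-in for spack.version.Version; only handles all-numeric parts.
    """
    return tuple(int(part) for part in ver.split("."))


versions = ["4.9", "4.10", "4.2"]
print(sorted(versions))                    # lexical: ['4.10', '4.2', '4.9']
print(sorted(versions, key=version_key))   # numeric: ['4.2', '4.9', '4.10']
```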


@@ -21,7 +21,6 @@ class Opari2(AutotoolsPackage):
homepage = "https://www.vi-hps.org/projects/score-p"
url = "https://perftools.pages.jsc.fz-juelich.de/cicd/opari2/tags/opari2-2.0.8/opari2-2.0.8.tar.gz"
version("2.0.9", sha256="d57139f757c5666afaaead45ed3d0954a9b98c4a6cef6b22afe672707cffd779")
version("2.0.8", sha256="196e59a2a625e6c795a6124c61e784bad142f9f38df0b4fa4d435ba9b9c19721")
version("2.0.7", sha256="e302a4cc265eb2a4aa27c16a90eabd9e1e58cb02a191dd1c4d86f9a0df128715")
version("2.0.6", sha256="55972289ce66080bb48622110c3189a36e88a12917635f049b37685b9d3bbcb0")


@@ -14,9 +14,8 @@ class Otf2(AutotoolsPackage):
"""
homepage = "https://www.vi-hps.org/projects/score-p"
url = "https://perftools.pages.jsc.fz-juelich.de/cicd/otf2/tags/otf2-3.1/otf2-3.1.tar.gz"
version("3.1.1", sha256="5a4e013a51ac4ed794fe35c55b700cd720346fda7f33ec84c76b86a5fb880a6e")
version("3.1", sha256="09dff2eda692486b88ad5ee189bbc9d7ebc1f17c863108c44ccf9631badbada4")
url = "https://perftools.pages.jsc.fz-juelich.de/cicd/otf2/tags/otf2-3.0/otf2-3.0.tar.gz"
version("3.0.3", sha256="18a3905f7917340387e3edc8e5766f31ab1af41f4ecc5665da6c769ca21c4ee8")
version("3.0", sha256="6fff0728761556e805b140fd464402ced394a3c622ededdb618025e6cdaa6d8c")
version("2.3", sha256="36957428d37c40d35b6b45208f050fb5cfe23c54e874189778a24b0e9219c7e3")


@@ -20,7 +20,6 @@ class PyCwlUtils(PythonPackage):
version("0.37", sha256="7b69c948f8593fdf44b44852bd8ef94c666736ce0ac12cf6e66e2a72ad16a773")
version("0.21", sha256="583f05010f7572f3a69310325472ccb6efc2db7f43dc6428d03552e0ffcbaaf9")
version("0.16", sha256="38182e6dd12b039601ac2f72911b3d93ca4e37efca3b0165ffe162abab3edf7b")
depends_on("python@3.6:", type=("build", "run"))
depends_on("python@3.8:", when="@0.29:", type=("build", "run"))


@@ -26,8 +26,6 @@ class PyPip(Package, PythonExtension):
license("MIT")
version("25.1.1", sha256="2913a38a2abf4ea6b64ab507bd9e967f3b53dc1ede74b01b0931e1ce548751af")
version("25.1", sha256="13b4aa0aaad055020a11bec8a1c2a70a2b2d080e12d89b962266029fff0a16ba")
version("25.0.1", sha256="c46efd13b6aa8279f33f2864459c8ce587ea6a1a59ee20de055868d8f7688f7f")
version("25.0", sha256="b6eb97a803356a52b2dd4bb73ba9e65b2ba16caa6bcb25a7497350a4e5859b65")
version("24.3.1", sha256="3790624780082365f47549d032f3770eeb2b1e8bd1f7b2e02dace1afa361b4ed")
@@ -53,7 +51,6 @@ class PyPip(Package, PythonExtension):
extends("python")
with default_args(type=("build", "run")):
depends_on("python@3.9:", when="@25.1:")
depends_on("python@3.8:", when="@24.1:")
depends_on("python@3.7:", when="@22:")


@@ -23,7 +23,6 @@ class PySetuptools(Package, PythonExtension):
# Requires railroad
skip_modules = ["setuptools._vendor", "pkg_resources._vendor"]
version("79.0.1", sha256="e147c0549f27767ba362f9da434eab9c5dc0045d5304feb602a0af001089fc51")
version("78.1.1", sha256="c3a9c4211ff4c309edb8b8c4f1cbfa7ae324c4ba9f91ff254e3d305b9fd54561")
version("78.1.0", sha256="3e386e96793c8702ae83d17b853fb93d3e09ef82ec62722e61da5cd22376dcd8")
version("78.0.2", sha256="4a612c80e1f1d71b80e4906ce730152e8dec23df439f82731d9d0b608d7b700d")


@@ -13,13 +13,9 @@ class PyWatchfiles(PythonPackage):
homepage = "https://github.com/samuelcolvin/watchfiles"
pypi = "watchfiles/watchfiles-0.18.1.tar.gz"
maintainers("viperML")
license("MIT")
version("1.0.5", sha256="b7529b5dcc114679d43827d8c35a07c493ad6f083633d573d81c660abc5979e9")
version("0.18.1", sha256="4ec0134a5e31797eb3c6c624dbe9354f2a8ee9c720e0b46fc5b7bab472b7c6d4")
depends_on("py-maturin@0.13", type="build", when="@0.18.1")
depends_on("py-maturin@0.14:2", type="build", when="@1:")
depends_on("py-maturin@0.13", type="build")
depends_on("py-anyio@3:", type=("build", "run"))


@@ -25,7 +25,6 @@ class R(AutotoolsPackage):
license("GPL-2.0-or-later")
version("4.5.0", sha256="3b33ea113e0d1ddc9793874d5949cec2c7386f66e4abfb1cef9aec22846c3ce1")
version("4.4.3", sha256="0d93d224442dea253c2b086f088db6d0d3cfd9b592cd5496e8cb2143e90fc9e8")
version("4.4.2", sha256="1578cd603e8d866b58743e49d8bf99c569e81079b6a60cf33cdf7bdffeb817ec")
version("4.4.1", sha256="b4cb675deaaeb7299d3b265d218cde43f192951ce5b89b7bb1a5148a36b2d94d")
@@ -99,7 +98,6 @@ class R(AutotoolsPackage):
depends_on("which", type=("build", "run"))
depends_on("zlib-api")
depends_on("zlib@1.2.5:", when="^[virtuals=zlib-api] zlib")
depends_on("zstd", when="@4.5:")
depends_on("texinfo", type="build")
depends_on("gettext")


@@ -16,7 +16,7 @@ class Scorep(AutotoolsPackage):
homepage = "https://www.vi-hps.org/projects/score-p"
url = "https://perftools.pages.jsc.fz-juelich.de/cicd/scorep/tags/scorep-7.1/scorep-7.1.tar.gz"
maintainers("wrwilliams")
version("9.0", sha256="5d0a5db4cc6f31c30ae03c7e6f6245e83667b0ff38a7041ffe8b2e8e581e0997")
version("8.4", sha256="7bbde9a0721d27cc6205baf13c1626833bcfbabb1f33b325a2d67976290f7f8a")
version("8.3", sha256="76c914e6319221c059234597a3bc53da788ed679179ac99c147284dcefb1574a")
# version 8.2 was immediately superseded before it hit Spack
@@ -88,51 +88,29 @@ def url_for_version(self, version):
sha256="d20b3046ba6a89ad9c106bcf372bceb1bd9ab780d4c7dd9e7373f0099b92d933",
)
# Variants
variant("mpi", default=True, description="Enable MPI support")
variant("papi", default=True, description="Enable PAPI")
variant("pdt", default=False, description="Enable PDT", when="@:8.4")
variant("pdt", default=False, description="Enable PDT")
variant("shmem", default=False, description="Enable shmem tracing")
variant("unwind", default=False, description="Enable sampling via libunwind and lib wrapping")
variant("cuda", default=False, description="Enable CUDA support")
variant("hip", default=False, description="Enable ROCm/HIP support", when="@8.0:")
variant("gcc-plugin", default=True, description="Enable gcc-plugin", when="%gcc")
variant(
"llvm-plugin", default=True, description="Enable LLVM compiler plugin", when="@9.0: ^llvm"
)
variant(
"binutils",
default=True,
description="Enable debug info lookup via binutils",
when="^binutils",
)
# Putting this in as preparation. F08 support exists in 9.0 but configure does not respect
# --enable-mpi-f08 and will not until 9.1.
variant("mpi_f08", default=True, description="Enable MPI F08 support", when="@9.1: +mpi")
# Dependencies for SCORE-P are quite tight. See the homepage for more
# information. Starting with scorep 4.0 / cube 4.4, Score-P only depends on
# two components of cube -- cubew and cubelib.
# Language dependencies
# TODO: we could allow a +fortran variant here.
depends_on("c", type="build") # generated
depends_on("cxx", type="build") # generated
depends_on("fortran", type="build") # generated
# SCOREP 9
depends_on("gotcha@1.0.8:", type="link", when="@9:")
depends_on("otf2@3.1:", when="@9:")
depends_on("cubew@4.9:", when="@9:")
depends_on("cubelib@4.9:", when="@9:")
depends_on("opari2@2.0.9", when="@9:")
# SCOREP 8
depends_on("binutils", type="link", when="@8:")
depends_on("otf2@3:", when="@8:")
depends_on("cubew@4.8.2:4.8", when="@8.3:")
depends_on("cubelib@4.8.2:4.8", when="@8.3:")
depends_on("cubew@4.8", when="@8:8.2")
depends_on("cubelib@4.8", when="@8:8.2")
depends_on("cubew@4.8.2:", when="@8.3:")
depends_on("cubelib@4.8.2:", when="@8.3:")
depends_on("cubew@4.8:", when="@8:8.2")
depends_on("cubelib@4.8:", when="@8:8.2")
# fall through to Score-P 7's OPARI2, no new release
# SCOREP 7
depends_on("otf2@2.3:2.3.99", when="@7.0:7")
@@ -162,7 +140,6 @@ def url_for_version(self, version):
depends_on("opari2@1.1.4", when="@1.3")
depends_on("cube@4.2.3", when="@1.3")
# Conditional dependencies for variants
depends_on("mpi@2.2:", when="@7.0:+mpi")
depends_on("mpi", when="+mpi")
depends_on("papi", when="+papi")
@@ -174,7 +151,6 @@ def url_for_version(self, version):
depends_on("hip@4.2:", when="+hip")
depends_on("rocprofiler-dev", when="+hip")
depends_on("rocm-smi-lib", when="+hip")
# Score-P requires a case-sensitive file system, and therefore
# does not work on macOS
# https://github.com/spack/spack/issues/1609
@@ -182,16 +158,14 @@ def url_for_version(self, version):
# Score-P first has support for ROCm 6.x as of v8.4
conflicts("hip@6.0:", when="@1.0:8.3+hip")
# Utility function: extract the first directory in `root` where
# we find `libname`. Used to handle CUDA irregular layouts.
def find_libpath(self, libname, root):
libs = find_libraries(libname, root, shared=True, recursive=True)
if len(libs.directories) == 0:
return None
return libs.directories[0]
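`find_libpath` above scans `root` recursively and returns the first directory containing the named shared library. A self-contained approximation using `pathlib` (Spack's `find_libraries` helper does more, e.g. static/shared suffix handling, so this is only a sketch):

```python
from pathlib import Path
from typing import Optional


def find_libpath(libname: str, root: str) -> Optional[str]:
    """Return the first directory under root holding libname.so*, else None.

    Rough stand-in for the find_libraries-based helper in the diff above.
    """
    matches = sorted(Path(root).rglob(f"{libname}.so*"))
    return str(matches[0].parent) if matches else None
```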
# Handle any mapping of Spack compiler names to Score-P args
# This should continue to exist for backward compatibility
# handle any mapping of Spack compiler names to Score-P args
# this should continue to exist for backward compatibility
def clean_compiler(self, compiler):
renames = {"cce": "cray", "rocmcc": "amdclang"}
if compiler in renames:
@@ -200,6 +174,7 @@ def clean_compiler(self, compiler):
def configure_args(self):
spec = self.spec
config_args = [
"--with-otf2=%s" % spec["otf2"].prefix.bin,
"--with-opari2=%s" % spec["opari2"].prefix.bin,
@@ -207,7 +182,8 @@ def configure_args(self):
]
cname = self.clean_compiler(spec.compiler.name)
config_args.extend(["--with-nocross-compiler-suite={0}".format(cname)])
config_args.append("--with-nocross-compiler-suite={0}".format(cname))
if self.version >= Version("4.0"):
config_args.append("--with-cubew=%s" % spec["cubew"].prefix.bin)
@@ -222,21 +198,14 @@ def configure_args(self):
if "+pdt" in spec:
config_args.append("--with-pdt=%s" % spec["pdt"].prefix.bin)
config_args.extend(
self.with_or_without("libunwind", activation_value="prefix", variant="unwind")
)
if "+unwind" in spec:
config_args.append("--with-libunwind=%s" % spec["libunwind"].prefix)
if "+cuda" in spec:
config_args.append("--with-libcudart=%s" % spec["cuda"].prefix)
cuda_driver_path = self.find_libpath("libcuda", spec["cuda"].prefix)
config_args.append("--with-libcuda-lib=%s" % cuda_driver_path)
config_args.extend(
self.with_or_without(
"rocm", activation_value=lambda _: self.spec["hip"].prefix, variant="hip"
)
)
config_args.extend(self.enable_or_disable("llvm-plugin"))
config_args.extend(self.enable_or_disable("gcc-plugin"))
config_args.extend(self.enable_or_disable("mpi_f08"))
if "+hip" in spec:
config_args.append("--with-rocm=%s" % spec["hip"].prefix)
if "~shmem" in spec:
config_args.append("--without-shmem")
@@ -258,7 +227,7 @@ def configure_args(self):
"^[virtuals=mpi] hpcx-mpi"
):
config_args.append("--with-mpi=openmpi")
elif spec.satisfies("~mpi"):
elif "~mpi" in spec:
config_args.append("--without-mpi")
# Let any +mpi that gets here autodetect, which is default
# Valid values are bullxmpi|cray|hp|ibmpoe|intel|intel2|intel3|intelpoe|lam|mpibull2
@@ -268,13 +237,8 @@ def configure_args(self):
# (see end of function)
# but add similar spec.satisfies clauses for any that you need.
# -- wrwilliams 12/2024
config_args.extend(
self.with_or_without(
"libbfd",
activation_value=lambda _: self.spec["binutils"].prefix,
variant="binutils",
)
)
if spec.satisfies("^binutils"):
config_args.append("--with-libbfd=%s" % spec["binutils"].prefix)
# when you build with gcc, you usually want to use the gcc-plugin!
# see, e.g., GNU Compiler Plug-In in https://scorepci.pages.jsc.fz-juelich.de/scorep-pipelines/docs/scorep-5.0/html/installationfile.html
@@ -290,7 +254,7 @@ def configure_args(self):
]
)
if spec.satisfies("+mpi"):
if "+mpi" in spec:
config_args.extend(
[
"MPICC={0}".format(spec["mpi"].mpicc),


@@ -159,10 +159,10 @@ def install_mpp(self):
)
def setup_run_environment(self, env: EnvironmentModifications) -> None:
env.set("SU2_RUN", self.prefix.bin)
env.set("SU2_HOME", self.prefix)
env.prepend_path("PATH", self.prefix.bin)
env.prepend_path("PYTHONPATH", self.prefix.bin)
env.set("su2_run", self.prefix.bin)
env.set("su2_home", self.prefix)
env.prepend_path("path", self.prefix.bin)
env.prepend_path("pythonpath", self.prefix.bin)
if "+mpp" in self.spec:
env.set("MPP_DATA_DIRECTORY", join_path(self.prefix, "mpp-data"))
env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib)
env.set("mpp_data_directory", join_path(self.prefix, "mpp-data"))
env.prepend_path("ld_library_path", self.prefix.lib)


@@ -18,7 +18,6 @@ class Ucc(AutotoolsPackage, CudaPackage, ROCmPackage):
maintainers("zzzoom")
version("1.4.4", sha256="e098e427c7b72b5434ae1e0da2258ab3bc271142c136b0bf4cf40ef9948b70d0")
version("1.3.0", sha256="b56379abe5f1c125bfa83be305d78d81a64aa271b7b5fff0ac17b86725ff3acf")
version("1.2.0", sha256="c1552797600835c0cf401b82dc89c4d27d5717f4fb805d41daca8e19f65e509d")

@@ -19,7 +19,6 @@ class Xnedit(MakefilePackage):
maintainers("davekeeshan")
version("1.6.3", sha256="069f4d40445ef5db636975e0f268ca11ec828bbc62bf4e6d7343d57b0697ecb0")
version("1.6.2", sha256="0ee832ad186b81b8ba8df43352d86e35997cea9708ff7ddad15e9d91fe81b6cb")
version("1.6.1", sha256="46489fa3017f5e40da810170b33c681affd3cd4dff1dbd0f8a4c51f8285ca5c4")
version("1.6.0", sha256="197e635fc1aa8e4ff2dcd2375efac597975f04170c3eace3280c4054bbbc57ac")
@@ -41,16 +40,11 @@ class Xnedit(MakefilePackage):
version("1.0.1", sha256="3efa26d180696ea7b24c3efd2599c52183b6851fc1bc87ce9a4f85d465962a8c")
version("1.0.0", sha256="f58dcbd268f226192584f56dd1a897290a66176d91a90d715a40d63578a84b72")
variant("motif", default=True, description="build with motif support")
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("automake", type="build")
depends_on("libx11")
depends_on("libxt")
depends_on("libxpm")
depends_on("motif", when="+motif")
depends_on("motif")
depends_on("pcre")
def setup_build_environment(self, env: EnvironmentModifications) -> None:

@@ -78,7 +78,7 @@ def with_or_without(self, *args, **kwargs):
return spack.builder.create(self).with_or_without(*args, **kwargs)
@spack.builder.register_builder("autotools")
@spack.builder.builder("autotools")
class AutotoolsBuilder(BuilderWithDefaults):
"""The autotools builder encodes the default way of installing software built
with autotools. It has four phases that can be overridden, if need be:

@@ -22,7 +22,7 @@ class BundlePackage(spack.package_base.PackageBase):
spack.directives.build_system("bundle")
@spack.builder.register_builder("bundle")
@spack.builder.builder("bundle")
class BundleBuilder(spack.builder.Builder):
phases = ("install",)

@@ -283,7 +283,7 @@ def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) ->
return define_from_variant(self, cmake_var, variant)
@spack.builder.register_builder("cmake")
@spack.builder.builder("cmake")
class CMakeBuilder(BuilderWithDefaults):
"""The cmake builder encodes the default way of building software with CMake. It
has three phases that can be overridden:

@@ -27,7 +27,7 @@ class Package(spack.package_base.PackageBase):
spack.directives.build_system("generic")
@spack.builder.register_builder("generic")
@spack.builder.builder("generic")
class GenericBuilder(BuilderWithDefaults):
"""A builder for a generic build system, that require packagers
to implement an "install" phase.

@@ -37,7 +37,7 @@ class MakefilePackage(spack.package_base.PackageBase):
depends_on("gmake", type="build")
@spack.builder.register_builder("makefile")
@spack.builder.builder("makefile")
class MakefileBuilder(BuilderWithDefaults):
"""The Makefile builder encodes the most common way of building software with
Makefiles. It has three phases that can be overridden, if need be:

@@ -88,7 +88,7 @@ def test_use(self):
assert "OK" in out
@spack.builder.register_builder("perl")
@spack.builder.builder("perl")
class PerlBuilder(BuilderWithDefaults):
"""The perl builder provides four phases that can be overridden, if required:

@@ -427,7 +427,7 @@ def libs(self) -> LibraryList:
raise NoLibrariesError(msg.format(self.spec.name, platlib, purelib))
@spack.builder.register_builder("python_pip")
@spack.builder.builder("python_pip")
class PythonPipBuilder(BuilderWithDefaults):
phases = ("install",)

@@ -35,7 +35,7 @@ def test_callback(self):
print("PyTestCallback test")
@spack.builder.register_builder("testcallback")
@spack.builder.builder("testcallback")
class MyBuilder(BuilderWithDefaults):
phases = ("install",)
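
A recurring change across the builder files above is the rename between the
`@spack.builder.register_builder("...")` and `@spack.builder.builder("...")`
decorators. For context, here is a minimal, self-contained sketch of the
decorator-based registry pattern those decorators implement; the names below
are illustrative stand-ins, not Spack's actual `spack.builder` module:

```python
# A toy builder registry, illustrating the pattern behind the
# register_builder/builder decorators seen in the diff above.
# All names here are hypothetical.

BUILDERS = {}


def register_builder(build_system_name):
    """Return a class decorator that records a builder class by name."""

    def decorator(cls):
        BUILDERS[build_system_name] = cls
        return cls

    return decorator


@register_builder("cmake")
class CMakeBuilder:
    phases = ("cmake", "build", "install")


@register_builder("autotools")
class AutotoolsBuilder:
    phases = ("autoreconf", "configure", "build", "install")
```

With this pattern, looking up the right builder for a package's declared build
system is a dictionary access, e.g. `BUILDERS["cmake"]`.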