Compare commits
2 Commits: develop ... hs/fix/rep

Author | SHA1 | Date
---|---|---
 | 931fa7ff51 |
 | 68b08498b7 |
1 .github/workflows/import-check.yaml (vendored)
@@ -6,7 +6,6 @@ on:
jobs:
  # Check we don't make the situation with circular imports worse
  import-check:
    continue-on-error: true
    runs-on: ubuntu-latest
    steps:
      - uses: julia-actions/setup-julia@v2
2 .github/workflows/sync-packages.yaml (vendored)
@@ -28,7 +28,7 @@ jobs:
        run: |
          cd spack-packages
          git-filter-repo --quiet --source ../spack \
            --path var/spack/repos/ --path-rename var/spack/repos/:python/ \
            --subdirectory-filter var/spack/repos \
            --path share/spack/gitlab/cloud_pipelines/ --path-rename share/spack/gitlab/cloud_pipelines/:.ci/gitlab/ \
            --refs develop
      - name: Push
@@ -276,7 +276,7 @@ remove dependent packages *before* removing their dependencies or use the

Garbage collection
^^^^^^^^^^^^^^^^^^

When Spack builds software from sources, it often installs tools that are needed
When Spack builds software from sources, if often installs tools that are needed
just to build or test other software. These are not necessary at runtime.
To support cases where removing these tools can be a benefit Spack provides
the ``spack gc`` ("garbage collector") command, which will uninstall all unneeded packages:
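The garbage-collection idea behind ``spack gc`` — keep explicitly installed packages and everything they need at runtime, and flag the rest for removal — can be sketched in plain Python. This is an illustration only; the package names and the flat dependency mapping below are hypothetical and not Spack's real data model:

```python
def gc_candidates(all_pkgs, explicit, runtime_deps):
    """Return packages that no explicitly installed package needs at runtime.

    all_pkgs: set of installed package names (hypothetical).
    explicit: packages the user asked for.
    runtime_deps: package -> list of runtime dependencies.
    """
    keep = set()
    stack = list(explicit)
    while stack:  # walk runtime edges from the explicit roots
        pkg = stack.pop()
        if pkg not in keep:
            keep.add(pkg)
            stack.extend(runtime_deps.get(pkg, ()))
    return all_pkgs - keep

# "cmake" was only needed to build "app", so it is collectible:
print(sorted(gc_candidates({"app", "zlib", "cmake"}, {"app"}, {"app": ["zlib"]})))
# ['cmake']
```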
@@ -89,7 +89,7 @@ You can see that the mirror is added with ``spack mirror list`` as follows:

   spack-public https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/


At this point, you've created a buildcache, but Spack hasn't indexed it, so if
At this point, you've create a buildcache, but spack hasn't indexed it, so if
you run ``spack buildcache list`` you won't see any results. You need to index
this new build cache as follows:
@@ -318,7 +318,7 @@ other system dependencies. However, they are still compatible with tools like
``skopeo``, ``podman``, and ``docker`` for pulling and pushing.

.. note::

   The Docker ``overlayfs2`` storage driver is limited to 128 layers, above which a
   The docker ``overlayfs2`` storage driver is limited to 128 layers, above which a
   ``max depth exceeded`` error may be produced when pulling the image. There
   are `alternative drivers <https://docs.docker.com/storage/storagedriver/>`_.
@@ -14,7 +14,7 @@ is an entire command dedicated to the management of every aspect of bootstrapping

.. command-output:: spack bootstrap --help

Spack is configured to bootstrap its dependencies lazily by default; i.e., the first time they are needed and
Spack is configured to bootstrap its dependencies lazily by default; i.e. the first time they are needed and
can't be found. You can readily check if any prerequisite for using Spack is missing by running:

.. code-block:: console
@@ -36,8 +36,8 @@ can't be found. You can readily check if any prerequisite for using Spack is mis

In the case of the output shown above Spack detected that both ``clingo`` and ``gnupg``
are missing and it's giving detailed information on why they are needed and whether
they can be bootstrapped. The return code of this command summarizes the results; if any
dependencies are missing, the return code is ``1``, otherwise ``0``. Running a command that
they can be bootstrapped. The return code of this command summarizes the results, if any
dependencies are missing the return code is ``1``, otherwise ``0``. Running a command that
concretizes a spec, like:

.. code-block:: console
@@ -228,7 +228,7 @@ def setup(sphinx):
    ("py:class", "spack.install_test.Pb"),
    ("py:class", "spack.filesystem_view.SimpleFilesystemView"),
    ("py:class", "spack.traverse.EdgeAndDepth"),
    ("py:class", "_vendoring.archspec.cpu.microarchitecture.Microarchitecture"),
    ("py:class", "archspec.cpu.microarchitecture.Microarchitecture"),
    ("py:class", "spack.compiler.CompilerCache"),
    # TypeVar that is not handled correctly
    ("py:class", "llnl.util.lang.T"),
@@ -148,8 +148,8 @@ this can expose you to attacks. Use at your own risk.

``ssl_certs``
--------------------

Path to custom certificates for SSL verification. The value can be a
filesystem path, or an environment variable that expands to an absolute file path.
Path to custom certificats for SSL verification. The value can be a
filesytem path, or an environment variable that expands to an absolute file path.
The default value is set to the environment variable ``SSL_CERT_FILE``
to use the same syntax used by many other applications that automatically
detect custom certificates.
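The resolution rule described above — the value is either a filesystem path or an environment variable expanding to one — can be sketched as follows. This is an illustration under the assumption that an env-var reference is written ``$VAR``; Spack's actual resolution logic may differ:

```python
import os

def resolve_ssl_certs(value, env=None):
    """Resolve an ssl_certs-style setting to a certificate path.

    A value starting with "$" is treated as the name of an environment
    variable (assumption for this sketch); anything else is taken to be
    a filesystem path already.
    """
    env = os.environ if env is None else env
    if value.startswith("$"):
        return env.get(value[1:])  # expand the referenced variable
    return value

# The default "$SSL_CERT_FILE" expands to whatever that variable holds:
print(resolve_ssl_certs("$SSL_CERT_FILE", env={"SSL_CERT_FILE": "/etc/ssl/cert.pem"}))
# /etc/ssl/cert.pem
```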
@@ -11,7 +11,7 @@ Container Images

Spack :ref:`environments` can easily be turned into container images. This page
outlines two ways in which this can be done:

1. By installing the environment on the host system and copying the installations
1. By installing the environment on the host system, and copying the installations
   into the container image. This approach does not require any tools like Docker
   or Singularity to be installed.
2. By generating a Docker or Singularity recipe that can be used to build the
@@ -56,8 +56,8 @@ environment roots and its runtime dependencies.

.. note::

   When using registries like GHCR and Docker Hub, the ``--oci-password`` flag specifies not
   the password for your account, but rather a personal access token you need to generate separately.
   When using registries like GHCR and Docker Hub, the ``--oci-password`` flag is not
   the password for your account, but a personal access token you need to generate separately.

The specified ``--base-image`` should have a libc that is compatible with the host system.
For example if your host system is Ubuntu 20.04, you can use ``ubuntu:20.04``, ``ubuntu:22.04``
@@ -20,7 +20,7 @@ be present on the machine where Spack is run:
   :header-rows: 1

These requirements can be easily installed on most modern Linux systems;
on macOS, the Command Line Tools package is required, and a full Xcode suite
on macOS, the Command Line Tools package is required, and a full XCode suite
may be necessary for some packages such as Qt and apple-gl. Spack is designed
to run on HPC platforms like Cray. Not all packages should be expected
to work on all platforms.
@@ -8,7 +8,7 @@

Modules (modules.yaml)
======================

The use of module systems to manage user environments in a controlled way
The use of module systems to manage user environment in a controlled way
is a common practice at HPC centers that is sometimes embraced also by
individual programmers on their development machines. To support this
common practice Spack integrates with `Environment Modules
@@ -490,7 +490,7 @@ that are already in the Lmod hierarchy.


.. note::

   Tcl and Lua modules also allow for explicit conflicts between module files.
   Tcl and Lua modules also allow for explicit conflicts between modulefiles.

.. code-block:: yaml
@@ -513,7 +513,7 @@ that are already in the Lmod hierarchy.
   :meth:`~spack.spec.Spec.format` method.

For Lmod and Environment Modules versions prior 4.2, it is important to
express the conflict on both module files conflicting with each other.
express the conflict on both modulefiles conflicting with each other.


.. note::
@@ -550,7 +550,7 @@ that are already in the Lmod hierarchy.

.. warning::
   Consistency of Core packages

   The user is responsible for maintaining consistency among core packages, as ``core_specs``
   The user is responsible for maintining consistency among core packages, as ``core_specs``
   bypasses the hierarchy that allows Lmod to safely switch between coherent software stacks.

.. warning::
@@ -179,7 +179,7 @@ Spack can be found at :ref:`package_class_structure`.

.. code-block:: python

   class Foo(CMakePackage):
   class Foo(CmakePackage):
       def cmake_args(self):
           ...
@@ -1212,7 +1212,7 @@ class-level tarball URL and VCS. For example:

   version("master", branch="master")
   version("12.12.1", md5="ecd4606fa332212433c98bf950a69cc7")
   version("12.10.1", md5="667333dbd7c0f031d47d7c5511fd0810")
   version("12.8.1", md5="9f37f683ee2b427b5540db8a20ed6b15")
   version("12.8.1", "9f37f683ee2b427b5540db8a20ed6b15")

If a package contains both a ``url`` and ``git`` class-level attribute,
Spack decides which to use based on the arguments to the ``version()``
@@ -1343,7 +1343,7 @@ Submodules

   version("1.0.1", tag="v1.0.1", submodules=True)

If a package needs more fine-grained control over submodules, define
If a package has needs more fine-grained control over submodules, define
``submodules`` to be a callable function that takes the package instance as
its only argument. The function should return a list of submodules to be fetched.
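A callable of that shape might look like the sketch below. The variant name, submodule names, and the stand-in package class here are hypothetical, made up for illustration:

```python
def submodules(package):
    """Return the submodules to fetch, based on the package instance.

    Takes the package instance as its only argument, as described above.
    The "+docs" variant and the submodule names are assumptions.
    """
    mods = ["core"]
    if "+docs" in package.spec:  # fetch the docs submodule only when asked
        mods.append("docs")
    return mods

# A stand-in for a package whose spec enables the hypothetical +docs variant
# (a plain string here; real Spack specs also support `in` membership tests):
class FakePackage:
    spec = "+docs"

print(submodules(FakePackage()))  # ['core', 'docs']
```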
@@ -2308,19 +2308,31 @@ looks like this:

   parallel = False

You can also disable parallel builds only for specific make
invocation:
Similarly, you can disable parallel builds only for specific make
commands, as ``libdwarf`` does:

.. code-block:: python
   :emphasize-lines: 5
   :emphasize-lines: 9, 12
   :linenos:

   class Libelf(Package):
       ...

       def install(self, spec, prefix):
           configure("--prefix=" + prefix,
                     "--enable-shared",
                     "--disable-dependency-tracking",
                     "--disable-debug")
           make()

           # The mkdir commands in libelf's install can fail in parallel
           make("install", parallel=False)

The first make will run in parallel here, but the second will not. If
you set ``parallel`` to ``False`` at the package level, then each call
to ``make()`` will be sequential by default, but packagers can call
``make(parallel=True)`` to override it.
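The precedence just described — a per-call ``parallel=`` argument beats the package-level default, and a sequential build always gets a single job — can be sketched as follows (a toy model, not Spack's actual implementation; the function name and ``make_jobs=8`` default are made up):

```python
def effective_jobs(package_parallel, call_parallel=None, make_jobs=8):
    """Decide how many jobs a make() invocation gets.

    package_parallel: the package-level `parallel` attribute.
    call_parallel:    per-call override (None means "not specified").
    make_jobs:        jobs to use when building in parallel.
    """
    parallel = package_parallel if call_parallel is None else call_parallel
    return make_jobs if parallel else 1

# Package-level parallel = False, but one call opts back in:
print(effective_jobs(False))                      # 1
print(effective_jobs(False, call_parallel=True))  # 8
```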

Note that the ``--jobs`` option works out of the box for all standard
build systems. If you are using a non-standard build system instead, you
can use the variable ``make_jobs`` to extract the number of jobs specified
@@ -2495,7 +2507,7 @@ necessary when there are breaking changes in the dependency that the
package cannot handle. In Spack we often add forward compatibility
bounds only at the time a new, breaking version of a dependency is
released. As with backward compatibility, it is typical to see a list
of forward compatibility bounds in a package file as separate lines:
of forward compatibility bounds in a package file as seperate lines:

.. code-block:: python
@@ -3371,7 +3383,7 @@ the above attribute implementations:

       "/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib/libFooBaz.so"
   ])

   # baz library directories in the baz subdirectory of the foo prefix
   # baz library directories in the baz subdirectory of the foo porefix
   >>> spec["baz"].libs.directories
   [
       "/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib"
@@ -5728,7 +5740,7 @@ running each executable, ``foo`` and ``bar``, as independent test parts.

.. note::

   The method name ``copy_test_files`` here is for illustration purposes.
   You are free to use a name that is better suited to your package.
   You are free to use a name that is more suited to your package.

The key to copying files for stand-alone testing at build time is use
of the ``run_after`` directive, which ensures the associated files are
@@ -7237,7 +7249,7 @@ which are not, there is the `checked_by` parameter in the license directive:

   license("<license>", when="<when>", checked_by="<github username>")

When you have validated a package license, either when doing so explicitly or
When you have validated a github license, either when doing so explicitly or
as part of packaging a new package, please set the `checked_by` parameter
to your Github username to signal that the license has been manually
verified.
@@ -214,7 +214,7 @@ package versions, simply run the following commands:

Running ``spack mark -i --all`` tells Spack to mark all of the existing
packages within an environment as "implicitly" installed. This tells
Spack's garbage collection system that these packages should be cleaned up.
spack's garbage collection system that these packages should be cleaned up.

Don't worry however, this will not remove your entire environment.
Running ``spack install`` will reexamine your spack environment after
1 lib/spack/external/_vendoring/_pyrsistent_version.pyi (vendored, new file)
@@ -0,0 +1 @@
from _pyrsistent_version import *
1 lib/spack/external/_vendoring/altgraph.pyi (vendored, new file)
@@ -0,0 +1 @@
from altgraph import *
@ -1,20 +0,0 @@
|
||||
The MIT License (MIT)
|
||||
|
||||
Copyright (c) 2014 Anders Høst
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy of
|
||||
this software and associated documentation files (the "Software"), to deal in
|
||||
the Software without restriction, including without limitation the rights to
|
||||
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
|
||||
the Software, and to permit persons to whom the Software is furnished to do so,
|
||||
subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
|
||||
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
|
||||
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
|
||||
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
|
||||
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
|
1 lib/spack/external/_vendoring/jsonschema.pyi (vendored, new file)
@@ -0,0 +1 @@
from jsonschema import *
1 lib/spack/external/_vendoring/macholib.pyi (vendored, new file)
@@ -0,0 +1 @@
from macholib import *
213 lib/spack/external/_vendoring/pyrsistent/__init__.pyi (vendored, new file)
@@ -0,0 +1,213 @@
# flake8: noqa: E704
# from https://gist.github.com/WuTheFWasThat/091a17d4b5cab597dfd5d4c2d96faf09
# Stubs for pyrsistent (Python 3.6)

from typing import Any
from typing import AnyStr
from typing import Callable
from typing import Iterable
from typing import Iterator
from typing import List
from typing import Optional
from typing import Mapping
from typing import MutableMapping
from typing import Sequence
from typing import Set
from typing import Union
from typing import Tuple
from typing import Type
from typing import TypeVar
from typing import overload

# see commit 08519aa for explanation of the re-export
from pyrsistent.typing import CheckedKeyTypeError as CheckedKeyTypeError
from pyrsistent.typing import CheckedPMap as CheckedPMap
from pyrsistent.typing import CheckedPSet as CheckedPSet
from pyrsistent.typing import CheckedPVector as CheckedPVector
from pyrsistent.typing import CheckedType as CheckedType
from pyrsistent.typing import CheckedValueTypeError as CheckedValueTypeError
from pyrsistent.typing import InvariantException as InvariantException
from pyrsistent.typing import PClass as PClass
from pyrsistent.typing import PBag as PBag
from pyrsistent.typing import PDeque as PDeque
from pyrsistent.typing import PList as PList
from pyrsistent.typing import PMap as PMap
from pyrsistent.typing import PMapEvolver as PMapEvolver
from pyrsistent.typing import PSet as PSet
from pyrsistent.typing import PSetEvolver as PSetEvolver
from pyrsistent.typing import PTypeError as PTypeError
from pyrsistent.typing import PVector as PVector
from pyrsistent.typing import PVectorEvolver as PVectorEvolver

T = TypeVar('T')
KT = TypeVar('KT')
VT = TypeVar('VT')

def pmap(initial: Union[Mapping[KT, VT], Iterable[Tuple[KT, VT]]] = {}, pre_size: int = 0) -> PMap[KT, VT]: ...
def m(**kwargs: VT) -> PMap[str, VT]: ...

def pvector(iterable: Iterable[T] = ...) -> PVector[T]: ...
def v(*iterable: T) -> PVector[T]: ...

def pset(iterable: Iterable[T] = (), pre_size: int = 8) -> PSet[T]: ...
def s(*iterable: T) -> PSet[T]: ...

# see class_test.py for use cases
Invariant = Tuple[bool, Optional[Union[str, Callable[[], str]]]]

@overload
def field(
    type: Union[Type[T], Sequence[Type[T]]] = ...,
    invariant: Callable[[Any], Union[Invariant, Iterable[Invariant]]] = lambda _: (True, None),
    initial: Any = object(),
    mandatory: bool = False,
    factory: Callable[[Any], T] = lambda x: x,
    serializer: Callable[[Any, T], Any] = lambda _, value: value,
) -> T: ...
# The actual return value (_PField) is irrelevant after a PRecord has been instantiated,
# see https://github.com/tobgu/pyrsistent/blob/master/pyrsistent/_precord.py#L10
@overload
def field(
    type: Any = ...,
    invariant: Callable[[Any], Union[Invariant, Iterable[Invariant]]] = lambda _: (True, None),
    initial: Any = object(),
    mandatory: bool = False,
    factory: Callable[[Any], Any] = lambda x: x,
    serializer: Callable[[Any, Any], Any] = lambda _, value: value,
) -> Any: ...

# Use precise types for the simplest use cases, but fall back to Any for
# everything else. See record_test.py for the wide range of possible types for
# item_type
@overload
def pset_field(
    item_type: Type[T],
    optional: bool = False,
    initial: Iterable[T] = ...,
) -> PSet[T]: ...
@overload
def pset_field(
    item_type: Any,
    optional: bool = False,
    initial: Any = (),
) -> PSet[Any]: ...

@overload
def pmap_field(
    key_type: Type[KT],
    value_type: Type[VT],
    optional: bool = False,
    invariant: Callable[[Any], Tuple[bool, Optional[str]]] = lambda _: (True, None),
) -> PMap[KT, VT]: ...
@overload
def pmap_field(
    key_type: Any,
    value_type: Any,
    optional: bool = False,
    invariant: Callable[[Any], Tuple[bool, Optional[str]]] = lambda _: (True, None),
) -> PMap[Any, Any]: ...

@overload
def pvector_field(
    item_type: Type[T],
    optional: bool = False,
    initial: Iterable[T] = ...,
) -> PVector[T]: ...
@overload
def pvector_field(
    item_type: Any,
    optional: bool = False,
    initial: Any = (),
) -> PVector[Any]: ...

def pbag(elements: Iterable[T]) -> PBag[T]: ...
def b(*elements: T) -> PBag[T]: ...

def plist(iterable: Iterable[T] = (), reverse: bool = False) -> PList[T]: ...
def l(*elements: T) -> PList[T]: ...

def pdeque(iterable: Optional[Iterable[T]] = None, maxlen: Optional[int] = None) -> PDeque[T]: ...
def dq(*iterable: T) -> PDeque[T]: ...

@overload
def optional(type: T) -> Tuple[T, Type[None]]: ...
@overload
def optional(*typs: Any) -> Tuple[Any, ...]: ...

T_PRecord = TypeVar('T_PRecord', bound='PRecord')
class PRecord(PMap[AnyStr, Any]):
    _precord_fields: Mapping
    _precord_initial_values: Mapping

    def __hash__(self) -> int: ...
    def __init__(self, **kwargs: Any) -> None: ...
    def __iter__(self) -> Iterator[Any]: ...
    def __len__(self) -> int: ...
    @classmethod
    def create(
        cls: Type[T_PRecord],
        kwargs: Mapping,
        _factory_fields: Optional[Iterable] = None,
        ignore_extra: bool = False,
    ) -> T_PRecord: ...
    # This is OK because T_PRecord is a concrete type
    def discard(self: T_PRecord, key: KT) -> T_PRecord: ...
    def remove(self: T_PRecord, key: KT) -> T_PRecord: ...

    def serialize(self, format: Optional[Any] = ...) -> MutableMapping: ...

    # From pyrsistent documentation:
    # This set function differs slightly from that in the PMap
    # class. First of all it accepts key-value pairs. Second it accepts multiple key-value
    # pairs to perform one, atomic, update of multiple fields.
    @overload
    def set(self, key: KT, val: VT) -> Any: ...
    @overload
    def set(self, **kwargs: VT) -> Any: ...

def immutable(
    members: Union[str, Iterable[str]] = '',
    name: str = 'Immutable',
    verbose: bool = False,
) -> Tuple: ...  # actually a namedtuple

# ignore mypy warning "Overloaded function signatures 1 and 5 overlap with
# incompatible return types"
@overload
def freeze(o: Mapping[KT, VT]) -> PMap[KT, VT]: ...  # type: ignore
@overload
def freeze(o: List[T]) -> PVector[T]: ...  # type: ignore
@overload
def freeze(o: Tuple[T, ...]) -> Tuple[T, ...]: ...
@overload
def freeze(o: Set[T]) -> PSet[T]: ...  # type: ignore
@overload
def freeze(o: T) -> T: ...


@overload
def thaw(o: PMap[KT, VT]) -> MutableMapping[KT, VT]: ...  # type: ignore
@overload
def thaw(o: PVector[T]) -> List[T]: ...  # type: ignore
@overload
def thaw(o: Tuple[T, ...]) -> Tuple[T, ...]: ...
# collections.abc.MutableSet is kind of garbage:
# https://stackoverflow.com/questions/24977898/why-does-collections-mutableset-not-bestow-an-update-method
@overload
def thaw(o: PSet[T]) -> Set[T]: ...  # type: ignore
@overload
def thaw(o: T) -> T: ...

def mutant(fn: Callable) -> Callable: ...

def inc(x: int) -> int: ...
@overload
def discard(evolver: PMapEvolver[KT, VT], key: KT) -> None: ...
@overload
def discard(evolver: PVectorEvolver[T], key: int) -> None: ...
@overload
def discard(evolver: PSetEvolver[T], key: T) -> None: ...
def rex(expr: str) -> Callable[[Any], bool]: ...
def ny(_: Any) -> bool: ...

def get_in(keys: Iterable, coll: Mapping, default: Optional[Any] = None, no_default: bool = False) -> Any: ...
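The ``freeze``/``thaw`` overloads in the stub above describe a recursive conversion between mutable and persistent structures. A minimal pure-Python sketch of the same idea, using stdlib stand-ins (tuples for PVector, frozenset for PSet, key-sorted tuples for PMap) rather than real pyrsistent types:

```python
def freeze(o):
    """Recursively convert mutable containers into hashable, immutable ones.

    This mirrors the shape of pyrsistent's freeze() overloads, but returns
    plain stdlib types instead of persistent collections.
    """
    if isinstance(o, dict):
        return tuple(sorted((k, freeze(v)) for k, v in o.items()))
    if isinstance(o, list):
        return tuple(freeze(x) for x in o)
    if isinstance(o, set):
        return frozenset(freeze(x) for x in o)
    return o  # scalars (and already-immutable values) pass through unchanged

print(freeze({"a": [1, 2], "b": {3}}))
# (('a', (1, 2)), ('b', frozenset({3})))
```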
1 lib/spack/external/_vendoring/ruamel.pyi (vendored, new file)
@@ -0,0 +1 @@
from ruamel import *
1 lib/spack/external/_vendoring/six/__init__.pyi (vendored, new file)
@@ -0,0 +1 @@
from six import *
1 lib/spack/external/_vendoring/six/moves/__init__.pyi (vendored, new file)
@@ -0,0 +1 @@
from six.moves import *
1 lib/spack/external/_vendoring/six/moves/configparser.pyi (vendored, new file)
@@ -0,0 +1 @@
from six.moves.configparser import *
81 lib/spack/external/archspec/README.md (vendored, new file)
@@ -0,0 +1,81 @@
[](https://github.com/archspec/archspec/actions)
[](https://codecov.io/gh/archspec/archspec)
[](https://archspec.readthedocs.io/en/latest/?badge=latest)

# Archspec (Python bindings)

Archspec aims at providing a standard set of human-understandable labels for
various aspects of a system architecture like CPU, network fabrics, etc. and
APIs to detect, query and compare them.

This project grew out of [Spack](https://spack.io/) and is currently under
active development. At present it supports APIs to detect and model
compatibility relationships among different CPU microarchitectures.

## Getting started with development

The `archspec` Python package needs [poetry](https://python-poetry.org/) to
be installed from VCS sources. The preferred method to install it is via
its custom installer outside of any virtual environment:
```console
$ curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python
```
You can refer to [Poetry's documentation](https://python-poetry.org/docs/#installation)
for further details or for other methods to install this tool. You'll also need `tox`
to run unit test:
```console
$ pip install --user tox
```
Finally you'll need to clone the repository:
```console
$ git clone --recursive https://github.com/archspec/archspec.git
```

### Running unit tests
Once you have your environment ready you can run `archspec` unit tests
using ``tox`` from the root of the repository:
```console
$ tox
[ ... ]
py27: commands succeeded
py35: commands succeeded
py36: commands succeeded
py37: commands succeeded
py38: commands succeeded
pylint: commands succeeded
flake8: commands succeeded
black: commands succeeded
congratulations :)
```

## Citing Archspec

If you are referencing `archspec` in a publication, please cite the following
paper:

* Massimiliano Culpo, Gregory Becker, Carlos Eduardo Arango Gutierrez, Kenneth
  Hoste, and Todd Gamblin.
  [**`archspec`: A library for detecting, labeling, and reasoning about
  microarchitectures**](https://tgamblin.github.io/pubs/archspec-canopie-hpc-2020.pdf).
  In *2nd International Workshop on Containers and New Orchestration Paradigms
  for Isolated Environments in HPC (CANOPIE-HPC'20)*, Online Event, November
  12, 2020.

## License

Archspec is distributed under the terms of both the MIT license and the
Apache License (Version 2.0). Users may choose either license, at their
option.

All new contributions must be made under both the MIT and Apache-2.0
licenses.

See [LICENSE-MIT](https://github.com/archspec/archspec/blob/master/LICENSE-MIT),
[LICENSE-APACHE](https://github.com/archspec/archspec/blob/master/LICENSE-APACHE),
[COPYRIGHT](https://github.com/archspec/archspec/blob/master/COPYRIGHT), and
[NOTICE](https://github.com/archspec/archspec/blob/master/NOTICE) for details.

SPDX-License-Identifier: (Apache-2.0 OR MIT)

LLNL-CODE-811653
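The compatibility reasoning the README describes largely comes down to walking ancestor chains of microarchitectures: a target supports at least the features of every ancestor it descends from. A toy sketch of that idea follows; the class, names, and hierarchy here are illustrative only (real archspec loads its microarchitecture graph from JSON and models far more detail):

```python
class Microarch:
    """Toy microarchitecture node with a single parent (real archspec
    supports multiple parents and feature sets)."""

    def __init__(self, name, parent=None):
        self.name, self.parent = name, parent

    def ancestors(self):
        node, result = self.parent, []
        while node:  # walk up the chain to the generic family
            result.append(node)
            node = node.parent
        return result

    def __ge__(self, other):
        # Compatible with `other` if it is the same target or an ancestor,
        # i.e. this target supports at least that target's features.
        return other is self or other in self.ancestors()

x86_64 = Microarch("x86_64")
haswell = Microarch("haswell", parent=x86_64)
skylake = Microarch("skylake", parent=haswell)

print(skylake >= haswell, haswell >= skylake)  # True False
```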
@@ -1,3 +1,3 @@
"""Init file to avoid namespace packages"""

__version__ = "0.2.5"
__version__ = "0.2.4"
@@ -9,8 +9,8 @@
import argparse
import typing

import _vendoring.archspec
import _vendoring.archspec.cpu
import archspec
import archspec.cpu


def _make_parser() -> argparse.ArgumentParser:

@@ -24,7 +24,7 @@ def _make_parser() -> argparse.ArgumentParser:
        "-V",
        help="Show the version and exit.",
        action="version",
        version=f"archspec, version {_vendoring.archspec.__version__}",
        version=f"archspec, version {archspec.__version__}",
    )
    parser.add_argument("--help", "-h", help="Show the help and exit.", action="help")
@@ -45,9 +45,9 @@ def _make_parser() -> argparse.ArgumentParser:


def cpu() -> int:
    """Run the `_vendoring.archspec.cpu` subcommand."""
    """Run the `archspec cpu` subcommand."""
    try:
        print(_vendoring.archspec.cpu.host())
        print(archspec.cpu.host())
    except FileNotFoundError as exc:
        print(exc)
        return 1
@@ -8,9 +8,9 @@
import re
import warnings

import _vendoring.archspec
import _vendoring.archspec.cpu.alias
import _vendoring.archspec.cpu.schema
import archspec
import archspec.cpu.alias
import archspec.cpu.schema

from .alias import FEATURE_ALIASES
from .schema import LazyDictionary

@@ -384,7 +384,7 @@ def fill_target_from_dict(name, data, targets):
        )

    known_targets = {}
    data = _vendoring.archspec.cpu.schema.TARGETS_JSON["microarchitectures"]
    data = archspec.cpu.schema.TARGETS_JSON["microarchitectures"]
    for name in data:
        if name in known_targets:
            # name was already brought in as ancestor to a target
22 lib/spack/external/archspec/json/COPYRIGHT (vendored, new file)
@@ -0,0 +1,22 @@
Intellectual Property Notice
------------------------------

Archspec is licensed under the Apache License, Version 2.0 (LICENSE-APACHE
or http://www.apache.org/licenses/LICENSE-2.0) or the MIT license,
(LICENSE-MIT or http://opensource.org/licenses/MIT), at your option.

Copyrights and patents in the Archspec project are retained by contributors.
No copyright assignment is required to contribute to Archspec.


SPDX usage
------------

Individual files contain SPDX tags instead of the full license text.
This enables machine processing of license information based on the SPDX
License Identifiers that are available here: https://spdx.org/licenses/

Files that are dual-licensed as Apache-2.0 OR MIT contain the following
text in the license header:

    SPDX-License-Identifier: (Apache-2.0 OR MIT)
1 lib/spack/external/vendor.txt (vendored)
@@ -9,4 +9,3 @@ macholib==1.16.2
altgraph==0.17.3
ruamel.yaml==0.17.21
typing_extensions==4.1.1
archspec @ git+https://github.com/archspec/archspec.git@38ce485258ffc4fc6dd6688f8dc90cb269478c47
@@ -12,9 +12,10 @@
import warnings
from typing import Optional, Sequence, Union

import _vendoring.archspec.cpu
from _vendoring.typing_extensions import TypedDict

import archspec.cpu

import llnl.util.filesystem as fs
from llnl.util import tty

@@ -137,7 +138,7 @@ def _fix_ext_suffix(candidate_spec: "spack.spec.Spec"):
    }

    # If the current architecture is not problematic return
    generic_target = _vendoring.archspec.cpu.host().family
    generic_target = archspec.cpu.host().family
    if str(generic_target) not in _suffix_to_be_checked:
        return

@@ -234,7 +235,7 @@ def _root_spec(spec_str: str) -> str:
    platform = str(spack.platforms.host())

    spec_str += f" platform={platform}"
    target = _vendoring.archspec.cpu.host().family
    target = archspec.cpu.host().family
    spec_str += f" target={target}"

    tty.debug(f"[BOOTSTRAP ROOT SPEC] {spec_str}")
@@ -13,7 +13,7 @@
 import sys
 from typing import Dict, Optional, Tuple

-import _vendoring.archspec.cpu
+import archspec.cpu

 import spack.compilers.config
 import spack.compilers.libraries
@@ -30,7 +30,7 @@ class ClingoBootstrapConcretizer:
    def __init__(self, configuration):
        self.host_platform = spack.platforms.host()
        self.host_os = self.host_platform.default_operating_system()
-        self.host_target = _vendoring.archspec.cpu.host().family
+        self.host_target = archspec.cpu.host().family
        self.host_architecture = spack.spec.ArchSpec.default_arch()
        self.host_architecture.target = str(self.host_target)
        self.host_compiler = self._valid_compiler_or_raise()
@@ -8,7 +8,7 @@
 import sys
 from typing import Iterable, List

-import _vendoring.archspec.cpu
+import archspec.cpu

 from llnl.util import tty

@@ -51,7 +51,7 @@ def environment_root(cls) -> pathlib.Path:
        """Environment root directory"""
        bootstrap_root_path = root_path()
        python_part = spec_for_current_python().replace("@", "")
-        arch_part = _vendoring.archspec.cpu.host().family
+        arch_part = archspec.cpu.host().family
        interpreter_part = hashlib.md5(sys.exec_prefix.encode()).hexdigest()[:5]
        environment_dir = f"{python_part}-{arch_part}-{interpreter_part}"
        return pathlib.Path(
@@ -112,7 +112,7 @@ def _write_spack_yaml_file(self) -> None:
        context = {
            "python_spec": spec_for_current_python(),
            "python_prefix": sys.exec_prefix,
-            "architecture": _vendoring.archspec.cpu.host().family,
+            "architecture": archspec.cpu.host().family,
            "environment_path": self.environment_root(),
            "environment_specs": self.spack_dev_requirements(),
            "store_path": store_path(),
@@ -59,7 +59,7 @@
    overload,
 )

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.tty as tty
 from llnl.string import plural
@@ -440,12 +440,10 @@ def optimization_flags(compiler, target):
    # Try to check if the current compiler comes with a version number or
    # has an unexpected suffix. If so, treat it as a compiler with a
    # custom spec.
-    version_number, _ = _vendoring.archspec.cpu.version_components(
-        compiler.version.dotted_numeric_string
-    )
+    version_number, _ = archspec.cpu.version_components(compiler.version.dotted_numeric_string)
    try:
        result = target.optimization_flags(compiler.name, version_number)
-    except (ValueError, _vendoring.archspec.cpu.UnsupportedMicroarchitecture):
+    except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
        result = ""

    return result
@@ -5,7 +5,7 @@
 import collections
 import warnings

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.tty.colify as colify
 import llnl.util.tty.color as color
@@ -92,11 +92,11 @@ def display_target_group(header, target_group):
 def arch(parser, args):
    if args.generic_target:
        # TODO: add deprecation warning in 0.24
-        print(_vendoring.archspec.cpu.host().generic)
+        print(archspec.cpu.host().generic)
        return

    if args.known_targets:
-        display_targets(_vendoring.archspec.cpu.TARGETS)
+        display_targets(archspec.cpu.TARGETS)
        return

    if args.frontend:
@@ -514,18 +514,17 @@ def extend_with_dependencies(specs):


 def concrete_specs_from_cli_or_file(args):
    tty.msg("Concretizing input specs")
    if args.specs:
-        specs = spack.cmd.parse_specs(args.specs, concretize=False)
+        specs = spack.cmd.parse_specs(args.specs, concretize=True)
        if not specs:
            raise SpackError("unable to parse specs from command line")

    if args.file:
-        specs = specs_from_text_file(args.file, concretize=False)
+        specs = specs_from_text_file(args.file, concretize=True)
        if not specs:
            raise SpackError("unable to parse specs from file '{}'".format(args.file))

-    concrete_specs = spack.cmd.matching_specs_from_env(specs)
-    return concrete_specs
+    return specs


 class IncludeFilter:
@@ -608,6 +607,11 @@ def process_mirror_stats(present, mirrored, error):


 def mirror_create(args):
    """create a directory to be used as a spack mirror, and fill it with package archives"""
+    if args.specs and args.all:
+        raise SpackError(
+            "cannot specify specs on command line if you chose to mirror all specs with '--all'"
+        )
+
    if args.file and args.all:
        raise SpackError(
            "cannot specify specs with a file if you chose to mirror all specs with '--all'"
@@ -201,19 +201,7 @@ def repo_migrate(args: Any) -> int:
    repo_v2 = None
    exit_code = 0

-    if not args.fix:
-        tty.error(
-            f"No changes were made to the repository {repo.root} with namespace "
-            f"'{repo.namespace}'. Run with --fix to apply the above changes."
-        )
-
-    elif exit_code == 1:
-        tty.error(
-            f"Repository '{repo.namespace}' could not be migrated to the latest Package API. "
-            "Please check the error messages above."
-        )
-
-    elif isinstance(repo_v2, spack.repo.Repo):
+    if exit_code == 0 and isinstance(repo_v2, spack.repo.Repo):
        tty.info(
            f"Repository '{repo_v2.namespace}' was successfully migrated from "
            f"package API {repo.package_api_str} to {repo_v2.package_api_str}."
@@ -224,9 +212,15 @@ def repo_migrate(args: Any) -> int:
            f" spack repo add {shlex.quote(repo_v2.root)}"
        )

-    else:
+    elif exit_code == 0:
        tty.info(f"Repository '{repo.namespace}' was successfully migrated")

+    elif not args.fix and exit_code == 1:
+        tty.error(
+            f"No changes were made to the repository {repo.root} with namespace "
+            f"'{repo.namespace}'. Run with --fix to apply the above changes."
+        )
+
    return exit_code
@@ -10,7 +10,7 @@
 import warnings
 from typing import Any, Dict, List, Optional, Tuple

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.filesystem as fs
 import llnl.util.lang
@@ -316,7 +316,7 @@ def from_external_yaml(config: Dict[str, Any]) -> Optional[spack.spec.Spec]:
    @staticmethod
    def _finalize_external_concretization(abstract_spec):
        if CompilerFactory._GENERIC_TARGET is None:
-            CompilerFactory._GENERIC_TARGET = _vendoring.archspec.cpu.host().family
+            CompilerFactory._GENERIC_TARGET = archspec.cpu.host().family

        if abstract_spec.architecture:
            abstract_spec.architecture.complete_with_defaults()
@@ -25,7 +25,7 @@
 import warnings
 from typing import List, Tuple

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.lang
 import llnl.util.tty as tty
@@ -734,7 +734,7 @@ def _compatible_sys_types():
    """
    host_platform = spack.platforms.host()
    host_os = str(host_platform.default_operating_system())
-    host_target = _vendoring.archspec.cpu.host()
+    host_target = archspec.cpu.host()
    compatible_targets = [host_target] + host_target.ancestors

    compatible_archs = [
@@ -794,7 +794,7 @@ def shell_set(var, value):
    # print environment module system if available. This can be expensive
    # on clusters, so skip it if not needed.
    if "modules" in info:
-        generic_arch = _vendoring.archspec.cpu.host().family
+        generic_arch = archspec.cpu.host().family
        module_spec = "environment-modules target={0}".format(generic_arch)
        specs = spack.store.STORE.db.query(module_spec)
        if specs:
@@ -986,9 +986,7 @@ def url_for_version(self, version):
        """
        return self._implement_all_urls_for_version(version)[0]

-    def _update_external_dependencies(
-        self, extendee_spec: Optional[spack.spec.Spec] = None
-    ) -> None:
+    def update_external_dependencies(self, extendee_spec=None):
        """
        Method to override in package classes to handle external dependencies
        """
@@ -4,7 +4,7 @@
 import warnings
 from typing import Optional

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.lang

@@ -38,15 +38,15 @@ def __init__(self, name):
        self.name = name
        self._init_targets()

-    def add_target(self, name: str, target: _vendoring.archspec.cpu.Microarchitecture) -> None:
+    def add_target(self, name: str, target: archspec.cpu.Microarchitecture) -> None:
        if name in Platform.reserved_targets:
            msg = f"{name} is a spack reserved alias and cannot be the name of a target"
            raise ValueError(msg)
        self.targets[name] = target

    def _init_targets(self):
-        self.default = _vendoring.archspec.cpu.host().name
-        for name, microarchitecture in _vendoring.archspec.cpu.TARGETS.items():
+        self.default = archspec.cpu.host().name
+        for name, microarchitecture in archspec.cpu.TARGETS.items():
            self.add_target(name, microarchitecture)

    def target(self, name):
@@ -3,7 +3,7 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import platform

-import _vendoring.archspec.cpu
+import archspec.cpu

 import spack.operating_systems

@@ -28,7 +28,7 @@ def __init__(self, name=None):
    def _init_targets(self):
        targets = ("aarch64", "m1") if platform.machine() == "arm64" else ("x86_64", "core2")
        for t in targets:
-            self.add_target(t, _vendoring.archspec.cpu.TARGETS[t])
+            self.add_target(t, archspec.cpu.TARGETS[t])

    @classmethod
    def detect(cls):
@@ -3,6 +3,7 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

 import ast
+import difflib
 import os
 import re
 import shutil
@@ -82,7 +83,8 @@ def migrate_v1_to_v2(

    errors = False

-    stack: List[Tuple[str, int]] = [(repo.root, 0)]
+    stack: List[Tuple[str, int]] = [(repo.packages_path, 0)]
+
    while stack:
        path, depth = stack.pop()

@@ -112,11 +114,7 @@ def migrate_v1_to_v2(
            continue

        # check if this is a package
-        if (
-            depth == 1
-            and rel_path.startswith(f"{subdirectory}{os.sep}")
-            and os.path.exists(os.path.join(entry.path, "package.py"))
-        ):
+        if depth == 0 and os.path.exists(os.path.join(entry.path, "package.py")):
            if "_" in entry.name:
                print(
                    f"Invalid package name '{entry.name}': underscores are not allowed in "
@@ -144,7 +142,7 @@ def migrate_v1_to_v2(
    rename_regex = re.compile("^(" + "|".join(re.escape(k) for k in rename.keys()) + ")")

    if fix:
-        os.makedirs(new_root, exist_ok=True)
+        os.makedirs(os.path.join(new_root, repo.subdirectory), exist_ok=True)

    def _relocate(rel_path: str) -> Tuple[str, str]:
        old = os.path.join(repo.root, rel_path)
@@ -223,6 +221,16 @@ def _relocate(rel_path: str) -> Tuple[str, str]:
    return result, (updated_repo if fix else None)


+def _spack_pkg_to_spack_repo(modulename: str) -> str:
+    # rewrite spack.pkg.builtin.foo -> spack_repo.builtin.packages.foo.package
+    parts = modulename.split(".")
+    assert parts[:2] == ["spack", "pkg"]
+    parts[0:2] = ["spack_repo"]
+    parts.insert(2, "packages")
+    parts.append("package")
+    return ".".join(parts)
+
+
 def migrate_v2_imports(
    packages_dir: str, root: str, fix: bool, out: IO[str] = sys.stdout, err: IO[str] = sys.stderr
 ) -> bool:
@@ -299,12 +307,41 @@ def migrate_v2_imports(
        #: Set of symbols of interest that are already defined through imports, assignments, or
        #: function definitions.
        defined_symbols: Set[str] = set()

        best_line: Optional[int] = None

        seen_import = False
+        module_replacements: Dict[str, str] = {}
+        parent: Dict[int, ast.AST] = {}
+
+        #: List of (line, col start, old, new) tuples of strings to be replaced inline.
+        inline_updates: List[Tuple[int, int, str, str]] = []
+
+        #: List of (line from, line to, new lines) tuples of line replacements
+        multiline_updates: List[Tuple[int, int, List[str]]] = []
+
+        with open(pkg_path, "r", encoding="utf-8", newline="") as file:
+            original_lines = file.readlines()
+
+        if len(original_lines) < 2:  # assume package.py files have at least 2 lines...
+            continue
+
+        if original_lines[0].endswith("\r\n"):
+            newline = "\r\n"
+        elif original_lines[0].endswith("\n"):
+            newline = "\n"
+        elif original_lines[0].endswith("\r"):
+            newline = "\r"
+        else:
+            success = False
+            print(f"{pkg_path}: unknown line ending, cannot fix", file=err)
+            continue
+
+        updated_lines = original_lines.copy()

        for node in ast.walk(tree):
+            for child in ast.iter_child_nodes(node):
+                if isinstance(child, ast.Attribute):
+                    parent[id(child)] = node
+
            # Get the last import statement from the first block of top-level imports
            if isinstance(node, ast.Module):
                for child in ast.iter_child_nodes(node):
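The newline sniffing added above works because `open(..., newline="")` disables universal-newline translation, so `readlines()` keeps each file's original terminators. A minimal standalone sketch of the same detection order (`\r\n` before `\n` before `\r`), with a hypothetical `detect_newline` name:

```python
from typing import Optional


def detect_newline(first_line: str) -> Optional[str]:
    # Check the two-character Windows terminator first, then Unix, then old Mac.
    if first_line.endswith("\r\n"):
        return "\r\n"
    if first_line.endswith("\n"):
        return "\n"
    if first_line.endswith("\r"):
        return "\r"
    return None  # unknown line ending: the migration reports an error and skips the file


print(repr(detect_newline("import os\r\n")))
# -> '\r\n'
```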
@@ -324,7 +361,7 @@ def migrate_v2_imports(

                if is_import:
                    if isinstance(child, (ast.stmt, ast.expr)):
-                        best_line = (child.end_lineno or child.lineno) + 1
+                        best_line = (getattr(child, "end_lineno", None) or child.lineno) + 1

                if not seen_import and is_import:
                    seen_import = True
@@ -353,12 +390,89 @@ def migrate_v2_imports(
            elif isinstance(node, ast.Name) and node.id in symbol_to_module:
                referenced_symbols.add(node.id)

-            # Register imported symbols to make this operation idempotent
+            # Find lines where spack.pkg is used.
+            elif (
+                isinstance(node, ast.Attribute)
+                and isinstance(node.value, ast.Name)
+                and node.value.id == "spack"
+                and node.attr == "pkg"
+            ):
+                # go as many attrs up until we reach a known module name to be replaced
+                known_module = "spack.pkg"
+                ancestor = node
+                while True:
+                    next_parent = parent.get(id(ancestor))
+                    if next_parent is None or not isinstance(next_parent, ast.Attribute):
+                        break
+                    ancestor = next_parent
+                    known_module = f"{known_module}.{ancestor.attr}"
+                    if known_module in module_replacements:
+                        break
+
+                inline_updates.append(
+                    (
+                        ancestor.lineno,
+                        ancestor.col_offset,
+                        known_module,
+                        module_replacements[known_module],
+                    )
+                )
+
            elif isinstance(node, ast.ImportFrom):
+                # Keep track of old style spack.pkg imports, to be replaced.
+                if node.module and node.module.startswith("spack.pkg.") and node.level == 0:
+
+                    depth = node.module.count(".")
+
+                    # simple case of find and replace
+                    # from spack.pkg.builtin.my_pkg import MyPkg
+                    # -> from spack_repo.builtin.packages.my_pkg.package import MyPkg
+                    if depth == 3:
+                        module_replacements[node.module] = _spack_pkg_to_spack_repo(node.module)
+                        inline_updates.append(
+                            (
+                                node.lineno,
+                                node.col_offset,
+                                node.module,
+                                module_replacements[node.module],
+                            )
+                        )
+
+                    # non-trivial possible multiline case
+                    # from spack.pkg.builtin import (boost, cmake as foo)
+                    # -> import spack_repo.builtin.packages.boost.package as boost
+                    # -> import spack_repo.builtin.packages.cmake.package as foo
+                    elif depth == 2 and node.end_lineno is not None:
+                        _, _, namespace = node.module.rpartition(".")
+                        indent = original_lines[node.lineno - 1][: node.col_offset]
+                        multiline_updates.append(
+                            (
+                                node.lineno,
+                                node.end_lineno + 1,
+                                [
+                                    f"{indent}import spack_repo.{namespace}.packages."
+                                    f"{alias.name}.package as {alias.asname or alias.name}"
+                                    f"{newline}"
+                                    for alias in node.names
+                                ],
+                            )
+                        )
+
+                    else:
+                        success = False
+                        print(
+                            f"{pkg_path}:{node.lineno}: don't know how to rewrite `{node.module}`",
+                            file=err,
+                        )
+
+                # Subtract the symbols that are imported so we don't repeatedly add imports.
                for alias in node.names:
                    if alias.name in symbol_to_module:
                        if alias.asname is None:
                            defined_symbols.add(alias.name)
-                            if node.module == "spack.package":
+
+                            # error when symbols are explicitly imported that are no longer available
+                            if node.module == "spack.package" and node.level == 0:
                                success = False
                                print(
                                    f"{pkg_path}:{node.lineno}: `{alias.name}` is imported from "
@@ -369,21 +483,45 @@ def migrate_v2_imports(
                    if alias.asname and alias.asname in symbol_to_module:
                        defined_symbols.add(alias.asname)

            elif isinstance(node, ast.Import):
+                # normal imports are easy find and replace since they are single lines.
                for alias in node.names:
                    if alias.asname and alias.asname in symbol_to_module:
                        defined_symbols.add(alias.name)
+                    elif alias.asname is None and alias.name.startswith("spack.pkg."):
+                        module_replacements[alias.name] = _spack_pkg_to_spack_repo(alias.name)
+                        inline_updates.append(
+                            (
+                                alias.lineno,
+                                alias.col_offset,
+                                alias.name,
+                                module_replacements[alias.name],
+                            )
+                        )

        # Remove imported symbols from the referenced symbols
        referenced_symbols.difference_update(defined_symbols)

-        if not referenced_symbols:
+        # Sort from last to first so we can modify without messing up the line / col offsets
+        inline_updates.sort(reverse=True)
+
+        # Nothing to change here.
+        if not inline_updates and not referenced_symbols:
            continue

+        # First do module replacements of spack.pkg imports
+        for line, col, old, new in inline_updates:
+            updated_lines[line - 1] = updated_lines[line - 1][:col] + updated_lines[line - 1][
+                col:
+            ].replace(old, new, 1)
+
+        # Then insert new imports for symbols referenced in the package
+        if referenced_symbols:
            if best_line is None:
                print(f"{pkg_path}: failed to update imports", file=err)
                success = False
                continue

-            # Add the missing imports right after the last import statement
-            with open(pkg_path, "r", encoding="utf-8", newline="") as file:
-                lines = file.readlines()
-
            # Group missing symbols by their module
            missing_imports_by_module: Dict[str, list] = {}
            for symbol in referenced_symbols:
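The `inline_updates` bookkeeping introduced above can be illustrated in isolation: each `(line, col, old, new)` tuple is applied from the end of the file backwards, so earlier replacements cannot shift the offsets recorded for later ones. The sample source line is hypothetical:

```python
# one recorded update: line 1, column 7 (right after "import "), old -> new module path
lines = ["import spack.pkg.builtin.foo\n"]
inline_updates = [(1, 7, "spack.pkg.builtin.foo", "spack_repo.builtin.packages.foo.package")]

# sort from last to first so line/col offsets recorded earlier stay valid
inline_updates.sort(reverse=True)
for line, col, old, new in inline_updates:
    # replace only from the recorded column onwards, and only the first occurrence
    lines[line - 1] = lines[line - 1][:col] + lines[line - 1][col:].replace(old, new, 1)

print(lines[0].strip())
# -> import spack_repo.builtin.packages.foo.package
```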
@@ -393,35 +531,36 @@ def migrate_v2_imports(
                missing_imports_by_module[module].append(symbol)

            new_lines = [
-                f"from {module} import {', '.join(sorted(symbols))}\n"
+                f"from {module} import {', '.join(sorted(symbols))}{newline}"
                for module, symbols in sorted(missing_imports_by_module.items())
            ]

            if not seen_import:
-                new_lines.extend(("\n", "\n"))
+                new_lines.extend((newline, newline))

-            if not fix:  # only print the diff
-                success = False  # packages need to be fixed, but we didn't do it
-                diff_start, diff_end = max(1, best_line - 3), min(best_line + 2, len(lines))
-                num_changed = diff_end - diff_start + 1
-                num_added = num_changed + len(new_lines)
+            multiline_updates.append((best_line, best_line, new_lines))
+
+        multiline_updates.sort(reverse=True)
+        for start, end, new_lines in multiline_updates:
+            updated_lines[start - 1 : end - 1] = new_lines
+
+        if not fix:
            rel_pkg_path = os.path.relpath(pkg_path, start=root)
-            out.write(f"--- a/{rel_pkg_path}\n+++ b/{rel_pkg_path}\n")
-            out.write(f"@@ -{diff_start},{num_changed} +{diff_start},{num_added} @@\n")
-            for line in lines[diff_start - 1 : best_line - 1]:
-                out.write(f" {line}")
-            for line in new_lines:
-                out.write(f"+{line}")
-            for line in lines[best_line - 1 : diff_end]:
-                out.write(f" {line}")
+            diff = difflib.unified_diff(
+                original_lines,
+                updated_lines,
+                n=3,
+                fromfile=f"a/{rel_pkg_path}",
+                tofile=f"b/{rel_pkg_path}",
+            )
+            out.write("".join(diff))
            continue

-        lines[best_line - 1 : best_line - 1] = new_lines
-
        tmp_file = pkg_path + ".tmp"

-        with open(tmp_file, "w", encoding="utf-8", newline="") as file:
-            file.writelines(lines)
+        # binary mode to avoid newline conversion issues; utf-8 was already required upon read.
+        with open(tmp_file, "wb") as file:
+            file.write("".join(updated_lines).encode("utf-8"))

        os.replace(tmp_file, pkg_path)
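The switch to `difflib.unified_diff` above replaces the hand-rolled hunk printing with the standard library's diff generator; a self-contained sketch with made-up file contents:

```python
import difflib

original_lines = ["from spack.package import *\n", "\n", "class Foo:\n", "    pass\n"]
updated_lines = [
    "from spack_repo.builtin.packages.bar.package import bar\n",  # hypothetical rewritten import
    "from spack.package import *\n",
    "\n",
    "class Foo:\n",
    "    pass\n",
]

# difflib emits the ---/+++ headers and @@ hunks itself, with n=3 context lines
diff = difflib.unified_diff(
    original_lines, updated_lines, n=3, fromfile="a/foo/package.py", tofile="b/foo/package.py"
)
print("".join(diff), end="")
```

Because `unified_diff` returns a generator of already-terminated lines, joining it and writing the result to `out` is all that remains of the old manual formatting.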
@@ -34,7 +34,7 @@
    Union,
 )

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.lang
 import llnl.util.tty as tty
@@ -1617,7 +1617,7 @@ def target_ranges(self, spec, single_target_fn):
        target = spec.architecture.target

        # Check if the target is a concrete target
-        if str(target) in _vendoring.archspec.cpu.TARGETS:
+        if str(target) in archspec.cpu.TARGETS:
            return [single_target_fn(spec.name, target)]

        self.target_constraints.add(target)
@@ -2753,7 +2753,7 @@ def _supported_targets(self, compiler_name, compiler_version, targets):
                    compiler_name, compiler_version.dotted_numeric_string
                )
                supported.append(target)
-            except _vendoring.archspec.cpu.UnsupportedMicroarchitecture:
+            except archspec.cpu.UnsupportedMicroarchitecture:
                continue
            except ValueError:
                continue
@@ -2818,7 +2818,7 @@ def target_defaults(self, specs):
            if not spec.architecture or not spec.architecture.target:
                continue

-            target = _vendoring.archspec.cpu.TARGETS.get(spec.target.name)
+            target = archspec.cpu.TARGETS.get(spec.target.name)
            if not target:
                self.target_ranges(spec, None)
                continue
@@ -2830,7 +2830,7 @@ def target_defaults(self, specs):
                candidate_targets.append(ancestor)

        platform = spack.platforms.host()
-        uarch = _vendoring.archspec.cpu.TARGETS.get(platform.default)
+        uarch = archspec.cpu.TARGETS.get(platform.default)
        best_targets = {uarch.family.name}
        for compiler in self.possible_compilers:
            supported = self._supported_targets(compiler.name, compiler.version, candidate_targets)
@@ -2938,7 +2938,7 @@ def _all_targets_satisfiying(single_constraint):
            return [single_constraint]

        t_min, _, t_max = single_constraint.partition(":")
-        for test_target in _vendoring.archspec.cpu.TARGETS.values():
+        for test_target in archspec.cpu.TARGETS.values():
            # Check lower bound
            if t_min and not t_min <= test_target:
                continue
@@ -3894,7 +3894,7 @@ def external_spec_selected(self, node, idx):

        if extendee_spec:
            extendee_node = SpecBuilder.make_node(pkg=extendee_spec.name)
-            package._update_external_dependencies(self._specs.get(extendee_node))
+            package.update_external_dependencies(self._specs.get(extendee_node, None))

    def depends_on(self, parent_node, dependency_node, type):
        dependency_spec = self._specs[dependency_node]
@@ -5,7 +5,7 @@
 import collections
 from typing import Dict, List, NamedTuple, Set, Tuple, Union

-import _vendoring.archspec.cpu
+import archspec.cpu

 from llnl.util import lang, tty

@@ -34,7 +34,7 @@ def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
        """
        raise NotImplementedError

-    def candidate_targets(self) -> List[_vendoring.archspec.cpu.Microarchitecture]:
+    def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
        """Returns a list of targets that are candidate for concretization"""
        raise NotImplementedError

@@ -70,7 +70,7 @@ def __init__(self, *, configuration: spack.config.Configuration, repo: spack.rep
        self.configuration = configuration
        self.repo = repo
        self._platform_condition = spack.spec.Spec(
-            f"platform={spack.platforms.host()} target={_vendoring.archspec.cpu.host().family}:"
+            f"platform={spack.platforms.host()} target={archspec.cpu.host().family}:"
        )

        try:
@@ -110,10 +110,10 @@ def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
        """
        return False

-    def candidate_targets(self) -> List[_vendoring.archspec.cpu.Microarchitecture]:
+    def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
        """Returns a list of targets that are candidate for concretization"""
        platform = spack.platforms.host()
-        default_target = _vendoring.archspec.cpu.TARGETS[platform.default]
+        default_target = archspec.cpu.TARGETS[platform.default]

        # Construct the list of targets which are compatible with the host
        candidate_targets = [default_target] + default_target.ancestors
@@ -125,7 +125,7 @@ def candidate_targets(self) -> List[_vendoring.archspec.cpu.Microarchitecture]:
        additional_targets_in_family = sorted(
            [
                t
-                for t in _vendoring.archspec.cpu.TARGETS.values()
+                for t in archspec.cpu.TARGETS.values()
                if (t.family.name == default_target.family.name and t not in candidate_targets)
            ],
            key=lambda x: len(x.ancestors),
@@ -74,9 +74,10 @@
    overload,
 )

-import _vendoring.archspec.cpu
 from _vendoring.typing_extensions import Literal

+import archspec.cpu
+
 import llnl.path
 import llnl.string
 import llnl.util.filesystem as fs
@@ -216,12 +217,10 @@ def ensure_modern_format_string(fmt: str) -> None:
    )


-def _make_microarchitecture(name: str) -> _vendoring.archspec.cpu.Microarchitecture:
-    if isinstance(name, _vendoring.archspec.cpu.Microarchitecture):
+def _make_microarchitecture(name: str) -> archspec.cpu.Microarchitecture:
+    if isinstance(name, archspec.cpu.Microarchitecture):
        return name
-    return _vendoring.archspec.cpu.TARGETS.get(
-        name, _vendoring.archspec.cpu.generic_microarchitecture(name)
-    )
+    return archspec.cpu.TARGETS.get(name, archspec.cpu.generic_microarchitecture(name))


 @lang.lazy_lexicographic_ordering
@@ -365,7 +364,7 @@ def target(self, value):
        # will assumed to be the host machine's platform.

        def target_or_none(t):
-            if isinstance(t, _vendoring.archspec.cpu.Microarchitecture):
+            if isinstance(t, archspec.cpu.Microarchitecture):
                return t
            if t and t != "None":
                return _make_microarchitecture(t)
@@ -3,9 +3,10 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import platform

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack.concretize
 import spack.operating_systems
 import spack.platforms
@@ -124,8 +125,7 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
 )
 @pytest.mark.usefixtures("mock_packages", "config")
 @pytest.mark.skipif(
-    str(_vendoring.archspec.cpu.host().family) != "x86_64",
-    reason="tests are for x86_64 uarch ranges",
+    str(archspec.cpu.host().family) != "x86_64", reason="tests are for x86_64 uarch ranges"
 )
 def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
    spec = spack.concretize.concretize_one(
@@ -8,9 +8,10 @@
 import sys
 from typing import Dict, Optional, Tuple

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 from llnl.path import Path, convert_to_platform_path
 from llnl.util.filesystem import HeaderList, LibraryList

@@ -726,15 +727,14 @@ def test_rpath_with_duplicate_link_deps():
 @pytest.mark.filterwarnings("ignore:microarchitecture specific")
 @pytest.mark.not_on_windows("Windows doesn't support the compiler wrapper")
 def test_optimization_flags(compiler_spec, target_name, expected_flags, compiler_factory):
-    target = _vendoring.archspec.cpu.TARGETS[target_name]
+    target = archspec.cpu.TARGETS[target_name]
    compiler = spack.spec.parse_with_version_concrete(compiler_spec)
    opt_flags = spack.build_environment.optimization_flags(compiler, target)
    assert opt_flags == expected_flags


 @pytest.mark.skipif(
-    str(_vendoring.archspec.cpu.host().family) != "x86_64",
-    reason="tests check specific x86_64 uarch flags",
+    str(archspec.cpu.host().family) != "x86_64", reason="tests check specific x86_64 uarch flags"
 )
 @pytest.mark.not_on_windows("Windows doesn't support the compiler wrapper")
 def test_optimization_flags_are_using_node_target(default_mock_concretization, monkeypatch):
@@ -5,10 +5,11 @@
 import glob
 import os

-import _vendoring.archspec.cpu
 import py.path
 import pytest

+import archspec.cpu
+
 import llnl.util.filesystem as fs

 import spack
@@ -216,8 +217,7 @@ def test_autotools_gnuconfig_replacement_disabled(self, mutable_database):

 @pytest.mark.disable_clean_stage_check
 @pytest.mark.skipif(
-    str(_vendoring.archspec.cpu.host().family) != "x86_64",
-    reason="test data is specific for x86_64",
+    str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
 )
 def test_autotools_gnuconfig_replacement_no_gnuconfig(self, mutable_database, monkeypatch):
    """
@@ -61,26 +61,6 @@ def test_mirror_from_env(mutable_mock_env_path, tmp_path, mock_packages, mock_fe
    assert mirror_res == expected


-# Test for command line-specified spec in concretized environment
-def test_mirror_spec_from_env(mutable_mock_env_path, tmp_path, mock_packages, mock_fetch):
-    mirror_dir = str(tmp_path / "mirror-B")
-    env_name = "test"
-
-    env("create", env_name)
-    with ev.read(env_name):
-        add("simple-standalone-test@0.9")
-        concretize()
-    with spack.config.override("config:checksum", False):
-        mirror("create", "-d", mirror_dir, "simple-standalone-test")
-
-    e = ev.read(env_name)
-    assert set(os.listdir(mirror_dir)) == set([s.name for s in e.user_specs])
-    spec = e.concrete_roots()[0]
-    mirror_res = os.listdir(os.path.join(mirror_dir, spec.name))
-    expected = ["%s.tar.gz" % spec.format("{name}-{version}")]
-    assert mirror_res == expected
-
-
 @pytest.fixture
 def source_for_pkg_with_hash(mock_packages, tmpdir):
    s = spack.concretize.concretize_one("trivial-pkg-with-valid-hash")
@@ -421,7 +401,8 @@ def test_all_specs_with_all_versions_dont_concretize(self):
    @pytest.mark.parametrize(
        "cli_args,error_str",
        [
-            # Passed more than one among -f --all
+            # Passed more than one among -f --all and specs
+            ({"specs": "hdf5", "file": None, "all": True}, "cannot specify specs on command line"),
            (
                {"specs": None, "file": "input.txt", "all": True},
                "cannot specify specs with a file if",
@@ -95,24 +95,47 @@ class _7zip(Package):
     pass
 """

-OLD_NUMPY = b"""\
-# some comment
-
-from spack.package import *
-
-class PyNumpy(CMakePackage):
-    generator("ninja")
-"""
-
-NEW_NUMPY = b"""\
-# some comment
-
-from spack_repo.builtin.build_systems.cmake import CMakePackage, generator
-from spack.package import *
-
-class PyNumpy(CMakePackage):
-    generator("ninja")
-"""
+# this is written like this to be explicit about line endings and indentation
+OLD_NUMPY = (
+    b"# some comment\r\n"
+    b"\r\n"
+    b"import spack.pkg.builtin.foo, spack.pkg.builtin.bar\r\n"
+    b"from spack.package import *\r\n"
+    b"from something.unrelated import AutotoolsPackage\r\n"
+    b"\r\n"
+    b"if True:\r\n"
+    b"\tfrom spack.pkg.builtin import (\r\n"
+    b"\t\tfoo,\r\n"
+    b"\t\tbar as baz,\r\n"
+    b"\t)\r\n"
+    b"\r\n"
+    b"class PyNumpy(CMakePackage, AutotoolsPackage):\r\n"
+    b"\tgenerator('ninja')\r\n"
+    b"\r\n"
+    b"\tdef example(self):\r\n"
+    b"\t\t# unchanged comment: spack.pkg.builtin.foo.something\r\n"
+    b"\t\treturn spack.pkg.builtin.foo.example(), foo, baz\r\n"
+)
+
+NEW_NUMPY = (
+    b"# some comment\r\n"
+    b"\r\n"
+    b"import spack_repo.builtin.packages.foo.package, spack_repo.builtin.packages.bar.package\r\n"
+    b"from spack_repo.builtin.build_systems.cmake import CMakePackage, generator\r\n"
+    b"from spack.package import *\r\n"
+    b"from something.unrelated import AutotoolsPackage\r\n"
+    b"\r\n"
+    b"if True:\r\n"
+    b"\timport spack_repo.builtin.packages.foo.package as foo\r\n"
+    b"\timport spack_repo.builtin.packages.bar.package as baz\r\n"
+    b"\r\n"
+    b"class PyNumpy(CMakePackage, AutotoolsPackage):\r\n"
+    b"\tgenerator('ninja')\r\n"
+    b"\r\n"
+    b"\tdef example(self):\r\n"
+    b"\t\t# unchanged comment: spack.pkg.builtin.foo.something\r\n"
+    b"\t\treturn spack_repo.builtin.packages.foo.package.example(), foo, baz\r\n"
+)


 def test_repo_migrate(tmp_path: pathlib.Path, config):
@@ -142,7 +165,6 @@ def test_repo_migrate(tmp_path: pathlib.Path, config):
     assert pkg_py_numpy_new.read_bytes() == NEW_NUMPY


-@pytest.mark.not_on_windows("Known failure on windows")
 def test_migrate_diff(git: Executable, tmp_path: pathlib.Path):
     root, _ = spack.repo.create_repo(str(tmp_path), "foo", package_api=(2, 0))
     r = pathlib.Path(root)
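The OLD_NUMPY/NEW_NUMPY fixtures above exercise the repo-migration rewrite, which maps old `spack.pkg.builtin.<name>` module paths onto the Package API v2 `spack_repo.builtin.packages.<name>.package` layout. A minimal regex sketch of just that rename (a hypothetical helper; unlike the real migration, it would also rewrite occurrences inside comments and strings):

```python
import re

def rewrite_builtin_imports(source: str) -> str:
    # Map old-style builtin module paths to the new spack_repo package layout.
    return re.sub(
        r"\bspack\.pkg\.builtin\.(\w+)\b",
        r"spack_repo.builtin.packages.\1.package",
        source,
    )

print(rewrite_builtin_imports("import spack.pkg.builtin.foo"))
# -> import spack_repo.builtin.packages.foo.package
```

The fixtures' "unchanged comment" line is exactly what this naive sketch gets wrong, which is why the tests pin down comment handling explicitly.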
@@ -4,9 +4,10 @@

 import os

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack.concretize
 import spack.config
 import spack.paths
@@ -85,8 +86,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
             {"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
             1,
             marks=pytest.mark.skipif(
-                str(_vendoring.archspec.cpu.host().family) != "x86_64",
-                reason="test data is x86_64 specific",
+                str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
             ),
         ),
         pytest.param(
@@ -98,8 +98,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
             },
             2,
             marks=pytest.mark.skipif(
-                str(_vendoring.archspec.cpu.host().family) != "x86_64",
-                reason="test data is x86_64 specific",
+                str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
             ),
         ),
     ],
@@ -6,10 +6,11 @@
 import platform
 import sys

-import _vendoring.archspec.cpu
 import _vendoring.jinja2
 import pytest

+import archspec.cpu
+
 import llnl.util.lang

 import spack.binary_distribution
@@ -145,12 +146,12 @@ def current_host(request, monkeypatch):

     monkeypatch.setattr(spack.platforms.Test, "default", cpu)
     if not is_preference:
-        target = _vendoring.archspec.cpu.TARGETS[cpu]
-        monkeypatch.setattr(_vendoring.archspec.cpu, "host", lambda: target)
+        target = archspec.cpu.TARGETS[cpu]
+        monkeypatch.setattr(archspec.cpu, "host", lambda: target)
         yield target
     else:
-        target = _vendoring.archspec.cpu.TARGETS["sapphirerapids"]
-        monkeypatch.setattr(_vendoring.archspec.cpu, "host", lambda: target)
+        target = archspec.cpu.TARGETS["sapphirerapids"]
+        monkeypatch.setattr(archspec.cpu, "host", lambda: target)
         with spack.config.override("packages:all", {"target": [cpu]}):
             yield target

@@ -382,7 +383,7 @@ def test_different_compilers_get_different_flags(
             "gcc": {"externals": [gcc11_with_flags]},
         },
     )
-    t = _vendoring.archspec.cpu.host().family
+    t = archspec.cpu.host().family
     client = spack.concretize.concretize_one(
         Spec(
             f"cmake-client platform=test os=redhat6 target={t} %gcc@11.1.0"
@@ -975,7 +976,7 @@ def test_noversion_pkg(self, spec):
     def test_adjusting_default_target_based_on_compiler(
         self, spec, compiler_spec, best_achievable, current_host, compiler_factory, mutable_config
     ):
-        best_achievable = _vendoring.archspec.cpu.TARGETS[best_achievable]
+        best_achievable = archspec.cpu.TARGETS[best_achievable]
         expected = best_achievable if best_achievable < current_host else current_host
         mutable_config.set(
             "packages", {"gcc": {"externals": [compiler_factory(spec=f"{compiler_spec}")]}}
@@ -1644,7 +1645,7 @@ def test_target_granularity(self):
        # The test architecture uses core2 as the default target. Check that when
        # we configure Spack for "generic" granularity we concretize for x86_64
        default_target = spack.platforms.test.Test.default
-        generic_target = _vendoring.archspec.cpu.TARGETS[default_target].generic.name
+        generic_target = archspec.cpu.TARGETS[default_target].generic.name
        s = Spec("python")
        assert spack.concretize.concretize_one(s).satisfies("target=%s" % default_target)
        with spack.config.override("concretizer:targets", {"granularity": "generic"}):
@@ -2010,7 +2011,7 @@ def test_installed_specs_disregard_conflicts(self, mutable_database, monkeypatch
     def test_require_targets_are_allowed(self, mutable_config, mutable_database):
         """Test that users can set target constraints under the require attribute."""
         # Configuration to be added to packages.yaml
-        required_target = _vendoring.archspec.cpu.TARGETS[spack.platforms.test.Test.default].family
+        required_target = archspec.cpu.TARGETS[spack.platforms.test.Test.default].family
         external_conf = {"all": {"require": f"target={required_target}"}}
         mutable_config.set("packages", external_conf)
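The `current_host` fixture above pins `archspec.cpu.host` to a fixed microarchitecture with `monkeypatch.setattr`, so every lookup through the module attribute sees the mocked host. The same idea in a stdlib-only sketch (the `cpu` namespace here is a hypothetical stand-in for the archspec module, and `unittest.mock` plays the role of pytest's `monkeypatch`):

```python
from types import SimpleNamespace
from unittest import mock

# Hypothetical stand-in for archspec.cpu: a module-like object with host().
cpu = SimpleNamespace(TARGETS={"core2": "core2"}, host=lambda: "skylake")

def detected_target() -> str:
    # Code under test reads host() through the module attribute,
    # so replacing cpu.host redirects it.
    return cpu.host()

with mock.patch.object(cpu, "host", lambda: cpu.TARGETS["core2"]):
    assert detected_target() == "core2"  # patched view
assert detected_target() == "skylake"    # restored on exit
```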
@@ -20,12 +20,13 @@
 import tempfile
 import xml.etree.ElementTree

-import _vendoring.archspec.cpu
-import _vendoring.archspec.cpu.microarchitecture
-import _vendoring.archspec.cpu.schema
 import py
 import pytest

+import archspec.cpu
+import archspec.cpu.microarchitecture
+import archspec.cpu.schema
+
 import llnl.util.lang
 import llnl.util.lock
 import llnl.util.tty as tty
@@ -371,12 +372,12 @@ def clean_test_environment():
 def _host():
     """Mock archspec host so there is no inconsistency on the Windows platform
     This function cannot be local as it needs to be pickleable"""
-    return _vendoring.archspec.cpu.Microarchitecture("x86_64", [], "generic", [], {}, 0)
+    return archspec.cpu.Microarchitecture("x86_64", [], "generic", [], {}, 0)


 @pytest.fixture(scope="function")
 def archspec_host_is_spack_test_host(monkeypatch):
-    monkeypatch.setattr(_vendoring.archspec.cpu, "host", _host)
+    monkeypatch.setattr(archspec.cpu, "host", _host)


 # Hooks to add command line options or set other custom behaviors.
@@ -727,14 +728,14 @@ def mock_uarch_json(tmpdir_factory):

 @pytest.fixture(scope="session")
 def mock_uarch_configuration(mock_uarch_json):
-    """Create mock dictionaries for the _vendoring.archspec.cpu."""
+    """Create mock dictionaries for the archspec.cpu."""

     def load_json():
         with open(mock_uarch_json, encoding="utf-8") as f:
             return json.load(f)

     targets_json = load_json()
-    targets = _vendoring.archspec.cpu.microarchitecture._known_microarchitectures()
+    targets = archspec.cpu.microarchitecture._known_microarchitectures()

     yield targets_json, targets

@@ -743,8 +744,8 @@ def load_json():
 def mock_targets(mock_uarch_configuration, monkeypatch):
     """Use this fixture to enable mock uarch targets for testing."""
     targets_json, targets = mock_uarch_configuration
-    monkeypatch.setattr(_vendoring.archspec.cpu.schema, "TARGETS_JSON", targets_json)
-    monkeypatch.setattr(_vendoring.archspec.cpu.microarchitecture, "TARGETS", targets)
+    monkeypatch.setattr(archspec.cpu.schema, "TARGETS_JSON", targets_json)
+    monkeypatch.setattr(archspec.cpu.microarchitecture, "TARGETS", targets)


 @pytest.fixture(scope="session")
@@ -772,7 +773,7 @@ def configuration_dir(tmpdir_factory, linux_os):
     config_template = test_config / "config.yaml"
     config.write(config_template.read_text().format(install_tree_root, locks))

-    target = str(_vendoring.archspec.cpu.host().family)
+    target = str(archspec.cpu.host().family)
     compilers = tmpdir.join("site", "packages.yaml")
     compilers_template = test_config / "packages.yaml"
     compilers.write(compilers_template.read_text().format(linux_os=linux_os, target=target))
@@ -2116,7 +2117,7 @@ def _factory(*, spec):
 @pytest.fixture()
 def host_architecture_str():
     """Returns the broad architecture family (x86_64, aarch64, etc.)"""
-    return str(_vendoring.archspec.cpu.host().family)
+    return str(archspec.cpu.host().family)


 def _true(x):
@@ -11,9 +11,10 @@
 import json
 import os

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack
 import spack.cmd
 import spack.cmd.external
@@ -103,7 +104,7 @@ def spec_json(self):

 @pytest.fixture
 def _common_arch(test_platform):
-    generic = _vendoring.archspec.cpu.TARGETS[test_platform.default].family
+    generic = archspec.cpu.TARGETS[test_platform.default].family
     return JsonArchEntry(platform=test_platform.name, os="redhat6", target=generic.name)
@@ -4,9 +4,10 @@

 import os

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack.concretize
 import spack.config
 import spack.environment as ev
@@ -220,8 +221,7 @@ def test_setenv_raw_value(self, modulefile_content, module_configuration):
         assert len([x for x in content if 'setenv("FOO", "{{name}}, {name}, {{}}, {}")' in x]) == 1

     @pytest.mark.skipif(
-        str(_vendoring.archspec.cpu.host().family) != "x86_64",
-        reason="test data is specific for x86_64",
+        str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
     )
     def test_help_message(self, modulefile_content, module_configuration):
         """Tests the generation of module help message."""
@@ -4,9 +4,10 @@

 import os

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack.concretize
 import spack.modules.common
 import spack.modules.tcl
@@ -185,8 +186,7 @@ def test_setenv_raw_value(self, modulefile_content, module_configuration):
         assert len([x for x in content if "setenv FOO {{{name}}, {name}, {{}}, {}}" in x]) == 1

     @pytest.mark.skipif(
-        str(_vendoring.archspec.cpu.host().family) != "x86_64",
-        reason="test data is specific for x86_64",
+        str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
     )
     def test_help_message(self, modulefile_content, module_configuration):
         """Tests the generation of module help message."""
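Several hunks above collapse the two-line skipif arguments onto one line; the guard itself just compares the host CPU family to x86_64 before running architecture-specific test data. A stdlib-only approximation (using `platform.machine()` as a rough stand-in for `archspec.cpu.host().family`, which archspec normalizes far more carefully):

```python
import platform

def is_x86_64_host() -> bool:
    # machine() strings vary by OS ("x86_64" on Linux, "AMD64" on Windows);
    # archspec maps all of these onto one family name.
    return platform.machine().lower() in ("x86_64", "amd64")

# The marker in the hunks above is then morally equivalent to:
# @pytest.mark.skipif(not is_x86_64_host(), reason="test data is specific for x86_64")
```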
@@ -208,7 +208,7 @@ def variant_type(self) -> VariantType:
         else:
             return VariantType.SINGLE

-    def __str__(self) -> str:
+    def __str__(self):
         return (
             f"Variant('{self.name}', "
             f"default='{self.default}', "
@@ -491,14 +491,14 @@ class DisjointSetsOfValues(collections.abc.Sequence):
         *sets (list): mutually exclusive sets of values
     """

-    _empty_set = {"none"}
+    _empty_set = set(("none",))

-    def __init__(self, *sets: Tuple[str, ...]) -> None:
+    def __init__(self, *sets):
         self.sets = [set(_flatten(x)) for x in sets]

         # 'none' is a special value and can appear only in a set of
         # a single element
-        if any("none" in s and s != {"none"} for s in self.sets):
+        if any("none" in s and s != set(("none",)) for s in self.sets):
             raise spack.error.SpecError(
                 "The value 'none' represents the empty set,"
                 " and must appear alone in a set. Use the "
@@ -75,6 +75,7 @@ section-order = [
   "future",
   "standard-library",
   "third-party",
+  "archspec",
   "llnl",
   "spack",
   "first-party",
@@ -83,6 +84,7 @@ section-order = [

 [tool.ruff.lint.isort.sections]
 spack = ["spack"]
+archspec = ["archspec"]
 llnl = ["llnl"]

 [tool.ruff.lint.per-file-ignores]
@@ -102,11 +104,13 @@ sections = [
   "FUTURE",
   "STDLIB",
   "THIRDPARTY",
+  "ARCHSPEC",
   "LLNL",
   "FIRSTPARTY",
   "LOCALFOLDER",
 ]
 known_first_party = "spack"
+known_archspec = "archspec"
 known_llnl = "llnl"
 known_third_party = ["ruamel", "six"]
 src_paths = "lib"
@@ -260,9 +264,6 @@ substitute = [
   { match = "from attr", replace = "from _vendoring.attr" },
   { match = "import jsonschema", replace = "import _vendoring.jsonschema" },
   { match = "from jsonschema", replace = "from _vendoring.jsonschema" },
-  { match = "archspec.cpu", replace = "_vendoring.archspec.cpu" },
-  { match = "archspec.__version__", replace = "_vendoring.archspec.__version__" },
-  { match = "import archspec", replace = "import _vendoring.archspec" },
 ]
 drop = [
   # contains unnecessary scripts
@@ -284,21 +285,11 @@ drop = [
   "pvectorc.*.so",
   # Trim jsonschema tests
   "jsonschema/tests",
-  "archspec/json/tests",
-  "archspec/vendor/cpuid/.gitignore",
   "pyrsistent/__init__.pyi",
 ]

 [tool.vendoring.typing-stubs]
 _pyrsistent_version = []
 altgraph = []
-archspec = []
+six = ["six.__init__", "six.moves.__init__", "six.moves.configparser"]
 distro = []
 jsonschema = []
 macholib = []
 pyrsistent = []
 ruamel = []
-six = []

 [tool.vendoring.license.directories]
 setuptools = "pkg_resources"
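The removed `substitute` rules are what rewrote archspec references into the `_vendoring` namespace at vendoring time. The mechanism is plain ordered string substitution; a sketch (the rule list is copied from the hunk, and application order matters so the dotted `archspec.cpu` rule fires before the bare `import archspec` rule, which is an assumption about how the vendoring tool applies them):

```python
SUBSTITUTIONS = [
    ("archspec.cpu", "_vendoring.archspec.cpu"),
    ("archspec.__version__", "_vendoring.archspec.__version__"),
    ("import archspec", "import _vendoring.archspec"),
]

def vendor_rewrite(text: str) -> str:
    # Apply each match/replace pair in order over the source text.
    for match, replace in SUBSTITUTIONS:
        text = text.replace(match, replace)
    return text

print(vendor_rewrite("import archspec.cpu"))
# -> import _vendoring.archspec.cpu
```

Note that after the first rule runs, the bare-import rule no longer matches, so the line is not double-prefixed.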
@@ -69,7 +69,7 @@ spack:
   - alpgen
   - ampt
   - apfel +lhapdf +python
-  - celeritas ~cuda +openmp ~rocm +vecgeom +covfie
+  - celeritas ~cuda +openmp ~rocm +vecgeom
   - cepgen
   - cernlib +shared
   - collier
@@ -130,7 +130,7 @@ spack:

   # CUDA
   #- acts +cuda +traccc cuda_arch=80
-  #- celeritas +cuda ~openmp +vecgeom cuda_arch=80 +covfie
+  #- celeritas +cuda ~openmp +vecgeom cuda_arch=80
   - root +cuda +cudnn +tmva-gpu
   - vecgeom +cuda cuda_arch=80
@@ -10,7 +10,7 @@
 import stat
 from typing import Dict, Iterable, List, Mapping, Optional, Tuple

-import _vendoring.archspec.cpu
+import archspec

 import llnl.util.filesystem as fs
 from llnl.util.lang import ClassProperty, classproperty, match_predicate
@@ -250,7 +250,7 @@ def test_imports(self) -> None:
         ):
             python("-c", f"import {module}")

-    def _update_external_dependencies(self, extendee_spec: Optional[Spec] = None) -> None:
+    def update_external_dependencies(self, extendee_spec=None):
         """
         Ensure all external python packages have a python dependency

@@ -286,7 +286,7 @@ def _update_external_dependencies(self, extendee_spec: Optional[Spec] = None) ->
         if not python.architecture.os:
             python.architecture.os = platform.default_operating_system()
         if not python.architecture.target:
-            python.architecture.target = _vendoring.archspec.cpu.host().family.name
+            python.architecture.target = archspec.cpu.host().family.name

         python.external_path = self.spec.external_path
         python._mark_concrete()
@@ -1,5 +1,4 @@
 #!/bin/sh
 cd ${0%/*} || exit 1 # Run from this directory

-wmake $targetType applications/solvers/additiveFoam/movingHeatSource
-wmake $targetType applications/solvers/additiveFoam
+applications/Allwmake $targetType $*
var/spack/repos/spack_repo/builtin/packages/additivefoam/assets/assets_1.0.0/applications/Allwmake (new executable file, vendored, +5)
@@ -0,0 +1,5 @@
+#!/bin/sh
+cd ${0%/*} || exit 1 # Run from this directory
+
+wmake libso solvers/additiveFoam/movingHeatSource
+wmake solvers/additiveFoam
@@ -1,4 +1,4 @@
 #!/bin/sh
 cd ${0%/*} || exit 1 # Run from this directory

-./applications/solvers/additiveFoam/Allwmake $targetType $*
+applications/Allwmake $targetType $*
@@ -0,0 +1,9 @@
+#!/bin/sh
+cd ${0%/*} || exit 1 # Run from this directory
+
+# Parse arguments for library compilation
+. $WM_PROJECT_DIR/wmake/scripts/AllwmakeParseArguments
+
+wmake $targetType solvers/additiveFoam/functionObjects/ExaCA
+wmake $targetType solvers/additiveFoam/movingHeatSource
+wmake $targetType solvers/additiveFoam
@@ -15,9 +15,9 @@
 class Additivefoam(Package):
     """AdditiveFOAM is a heat and mass transfer software for Additive Manufacturing (AM)"""

-    homepage = "https://ornl.github.io/AdditiveFOAM/"
+    homepage = "https://github.com/ORNL/AdditiveFOAM"
     git = "https://github.com/ORNL/AdditiveFOAM.git"
-    url = "https://github.com/ORNL/AdditiveFOAM/archive/1.1.0.tar.gz"
+    url = "https://github.com/ORNL/AdditiveFOAM/archive/1.0.0.tar.gz"

     maintainers("streeve", "colemanjs", "gknapp1")

@@ -26,17 +26,16 @@ class Additivefoam(Package):
     license("GPL-3.0-only")

     version("main", branch="main")
-    version("1.1.0", sha256="a13770bd66fe10224705fb3a2bfb557e63e0aea98c917b0084cf8b91eaa53ee2")
     version("1.0.0", sha256="abbdf1b0230cd2f26f526be76e973f508978611f404fe8ec4ecdd7d5df88724c")

     depends_on("cxx", type="build")  # generated

     depends_on("openfoam-org@10")

-    common = []
-    assets = ["Allwmake"]
+    common = ["spack-derived-Allwmake"]
+    assets = [join_path("applications", "Allwmake"), "Allwmake"]

-    build_script = "./Allwmake"
+    build_script = "./spack-derived-Allwmake"

     phases = ["configure", "build", "install"]

@@ -57,49 +56,15 @@ def add_extra_files(self, common, local_prefix, local):
             openfoam.install(join_path(indir, f), join_path(outdir, f))

     def patch(self):
-        """Patches build by adding Allwmake from the asset directory based on
-        the spec version.
-
-        For all versions after 1.0.0 there is an Allwmake script in
-        the AdditiveFOAM repository that can be called by the spack assets_main/Allwmake
-        script, whereas the assets_1.0.0/Allwmake script contains the
-        build instructions."""
         spec = self.spec
         asset_dir = ""
         if Version("main") in spec.versions:
             asset_dir = "assets_main"
-        if Version("1.0.0") in spec.versions:
+        elif Version("1.0.0") in spec.versions:
             asset_dir = "assets_1.0.0"
         self.add_extra_files(self.common, asset_dir, self.assets)

-    def setup_build_environment(self, env):
-        """Set up the build environment variables."""
-
-        # Ensure that the directories exist
-        mkdirp(self.prefix.bin)
-        mkdirp(self.prefix.lib)
-
-        # Add to the environment
-        env.set("FOAM_USER_APPBIN", self.prefix.bin)
-        env.set("FOAM_USER_LIBBIN", self.prefix.lib)
-
-    def setup_run_environment(self, env):
-        """Set up the run environment variables."""
-
-        # Add to the environment
-        env.prepend_path("PATH", self.prefix.bin)
-        env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib)
-
-    def activate(self, spec, prefix):
-        """Activate the package to modify the environment."""
-        self.setup_run_environment(self.spec.environment())
-
-    def deactivate(self, spec, prefix):
-        """Deactivate the package and clean up the environment."""
-        env = self.spec.environment()
-        env.pop("FOAM_USER_APPBIN", None)
-        env.pop("FOAM_USER_LIBBIN", None)
-
     def configure(self, spec, prefix):
-        """Configure the environment for building."""
         pass

     def build(self, spec, prefix):
@@ -21,7 +21,6 @@ class Amdsmi(CMakePackage):
     libraries = ["libamd_smi"]

     license("MIT")
-    version("6.4.0", sha256="6f0200ba7305171e9dadbfcd41ff00c194b98d2b88e0555c57739ef01c767233")
     version("6.3.3", sha256="e23abc65a1cd75764d7da049b91cce2a095b287279efcd4f90b4b9b63b974dd5")
     version("6.3.2", sha256="1ed452eedfe51ac6e615d7bfe0bd7a0614f21113874ae3cbea7df72343cc2d13")
     version("6.3.1", sha256="a3a5a711052e813b9be9304d5e818351d3797f668ec2a455e61253a73429c355")
@@ -39,7 +38,6 @@ class Amdsmi(CMakePackage):
     version("5.5.1", sha256="b794c7fd562fd92f2c9f2bbdc2d5dded7486101fcd4598f2e8c3484c9a939281")
     version("5.5.0", sha256="dcfbd96e93afcf86b1261464e008e9ef7e521670871a1885e6eaffc7cdc8f555")

-    depends_on("c", type="build")
     depends_on("cxx", type="build")  # generated

     depends_on("cmake@3.11:")
@@ -10,20 +10,6 @@
 from spack.package import *

 _versions = {
-    "6.4.0": {
-        "apt": (
-            "5ec56bc3c227ad37227072bd513c58c9501e1ceefb06692ad4812f337853dca4",
-            "https://repo.radeon.com/rocm/apt/6.4/pool/main/h/hsa-amd-aqlprofile/hsa-amd-aqlprofile_1.0.0.60400-47~22.04_amd64.deb",
-        ),
-        "yum": (
-            "22ed4c6a999ca6823e5e6bf9f4ab560cba68025f354346b1ac2ebb4757239c56",
-            "https://repo.radeon.com/rocm/rhel8/6.4/main/hsa-amd-aqlprofile-1.0.0.60400-47.el8.x86_64.rpm",
-        ),
-        "zyp": (
-            "7a4c9ca0e6ca178c65776f9b1d9d01ca7576eaa555fdcbf49a42def1ce6d6041",
-            "https://repo.radeon.com/rocm/zyp/6.4/main/hsa-amd-aqlprofile-1.0.0.60400-sles156.47.x86_64.rpm",
-        ),
-    },
     "6.3.3": {
         "apt": (
             "5fe2b18e75e8c0a66069af8446399796818f7340a9ef5f2b52adaa79ee8e2a37",
@@ -321,7 +307,6 @@ class Aqlprofile(Package):
         "6.3.1",
         "6.3.2",
         "6.3.3",
-        "6.4.0",
     ]:
         depends_on(f"hsa-rocr-dev@{ver}", when=f"@{ver}")
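The `_versions` table above keys checksum/URL pairs first by release, then by package format (`apt`/`yum`/`zyp`). Selecting an artifact is then a two-level dict lookup; a sketch with abbreviated, hypothetical data in the same shape:

```python
# Hypothetical, abbreviated table in the same shape as _versions above.
VERSIONS = {
    "6.3.3": {
        "apt": ("sha-of-deb", "https://example.invalid/hsa-amd-aqlprofile_6.3.3_amd64.deb"),
        "yum": ("sha-of-rpm", "https://example.invalid/hsa-amd-aqlprofile-6.3.3.x86_64.rpm"),
    }
}

def artifact_for(version: str, pkg_format: str) -> tuple:
    # Raises KeyError when either the release or the format is unknown.
    return VERSIONS[version][pkg_format]

sha, url = artifact_for("6.3.3", "apt")
```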
@@ -19,7 +19,6 @@ class AwsOfiNccl(AutotoolsPackage):
     maintainers("bvanessen")

     version("master", branch="master")
-    version("1.14.2", sha256="e523ea08ce0caeff5c949b2134b4897186d793ce908904dd9d47bb08230b9bbd")
     version("1.13.0", sha256="50dd231a0a99cec29300df46b8e828139ced15322a3c3c41b1d22dcc9a62ec02")
     version("1.12.1", sha256="821f0929c016e5448785bbc6795af5096559ecfc6c9479eb3818cafa61424576")
     version("1.12.0", sha256="93029207103b75f4dc15f023b3b8692851202b52b7e2824723dd5d328f0ea65b")
@@ -52,7 +51,7 @@ class AwsOfiNccl(AutotoolsPackage):
     depends_on("libtool", type="build")

     def url_for_version(self, version):
-        if version < Version("1.7.0") or version >= Version("1.14.0"):
+        if version < Version("1.7.0"):
             return super().url_for_version(version)
         url_fmt = "https://github.com/aws/aws-ofi-nccl/archive/v{0}-aws.tar.gz"
         return url_fmt.format(version)
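The `url_for_version` hunk toggles whether releases at or above 1.14.0 fall back to the default archive naming instead of the `v{X}-aws` tag format. A standalone sketch of the two-branch logic on the `-` side (tuples stand in for Spack's `Version`, and the fallback URL here is only a placeholder for what `super().url_for_version` would compute):

```python
def url_for_version(version: tuple) -> str:
    dotted = ".".join(str(p) for p in version)
    # Releases in [1.7.0, 1.14.0) were tagged "v{X}-aws" on GitHub.
    if version < (1, 7, 0) or version >= (1, 14, 0):
        return f"https://github.com/aws/aws-ofi-nccl/archive/v{dotted}.tar.gz"
    return f"https://github.com/aws/aws-ofi-nccl/archive/v{dotted}-aws.tar.gz"

print(url_for_version((1, 12, 1)))
# -> https://github.com/aws/aws-ofi-nccl/archive/v1.12.1-aws.tar.gz
```

Tuple comparison gives the same lexicographic ordering Spack's `Version` provides for simple dotted releases, which is why it works as a stand-in here.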
@@ -4,19 +4,23 @@

 from spack_repo.builtin.build_systems.generic import Package

+import archspec
+
 from spack.package import *


 class Blast2go(Package):
     """Blast2GO is a bioinformatics platform for high-quality functional
-    annotation and analysis of genomic datasets.
-    """
+    annotation and analysis of genomic datasets."""

     homepage = "https://www.blast2go.com/"

     version("5.2.5", sha256="c37aeda25f96ac0553b52da6b5af3167d50671ddbfb3b39bcb11afe5d0643891")

-    requires("target=x86_64:", msg="blast2go is available x86_64 only")
+    for t in set(
+        [str(x.family) for x in archspec.cpu.TARGETS.values() if str(x.family) != "x86_64"]
+    ):
+        conflicts("target={0}:".format(t), msg="blast2go is available x86_64 only")

     depends_on("bash", type="build")
     depends_on("blast-plus", type="run")
@@ -19,17 +19,12 @@ class Celeritas(CMakePackage, CudaPackage, ROCmPackage):
     git = "https://github.com/celeritas-project/celeritas.git"
     url = "https://github.com/celeritas-project/celeritas/releases/download/v0.1.0/celeritas-0.1.0.tar.gz"

-    maintainers("sethrj", "drbenmorgan")
+    maintainers("sethrj")

     license("Apache-2.0")

-    sanity_check_is_file = ["bin/celer-sim"]
-
     version("develop", branch="develop", get_full_repo=True)

-    version("0.6.0", sha256="c776dee357ecff42f85ed02c328f24b092400af28e67af2c0e195ce8f67613b0")
     version("0.5.3", sha256="4d1fe1f34e899c3599898fb6d44686d2582a41b0872784514aa8c562597b3ee6")
     version("0.5.2", sha256="46311c096b271d0331b82c02485ac6bf650d5b0f7bd948fb01aef5058f8824e3")
     version("0.5.1", sha256="182d5466fbd98ba9400b343b55f6a06e03b77daed4de1dd16f632ac0a3620249")
     version("0.5.0", sha256="4a8834224d96fd01897e5872ac109f60d91ef0bd7b63fac05a73dcdb61a5530e")
     version("0.4.4", sha256="8b5ae63aa2d50c2ecf48d752424e4a33c50c07d9f0f5ca5448246de3286fd836")
@@ -64,13 +59,11 @@ class Celeritas(CMakePackage, CudaPackage, ROCmPackage):
         multi=False,
         description="C++ standard version",
     )
-    variant("covfie", default=False, when="@0.6:", description="Enable covfie magnetic fields")
     variant("debug", default=False, description="Enable runtime debug assertions")
     variant("doc", default=False, description="Build and install documentation")
     variant("geant4", default=True, description="Enable Geant4 integration")
     variant("hepmc3", default=True, description="Use HepMC3 I/O interfaces")
     variant("openmp", default=False, description="Use OpenMP multithreading")
-    variant("perfetto", default=False, when="@0.5:", description="Use Perfetto profiling")
     variant("root", default=False, description="Use ROOT I/O")
     variant("shared", default=True, description="Build shared libraries")
     variant("swig", default=False, when="@:0.4", description="Generate SWIG Python bindings")
@@ -82,9 +75,7 @@ class Celeritas(CMakePackage, CudaPackage, ROCmPackage):
     depends_on("cmake@3.18:", type="build", when="+cuda+vecgeom")
     depends_on("cmake@3.22:", type="build", when="+rocm")

-    depends_on("cli11", when="@0.6:")
     depends_on("nlohmann-json")
-    depends_on("covfie@0.13:", when="+covfie")
     depends_on("geant4@10.5:11.1", when="@0.3.1:0.4.1 +geant4")
     depends_on("geant4@10.5:11.2", when="@0.4.2:0.4 +geant4")
     depends_on("geant4@10.5:", when="@0.5: +geant4")
@@ -113,20 +104,14 @@ class Celeritas(CMakePackage, CudaPackage, ROCmPackage):
     for _std in _cxxstd_values:
         for _pkg in ["geant4", "root", "vecgeom"]:
             depends_on(f"{_pkg} cxxstd={_std}", when=f"+{_pkg} cxxstd={_std}")
-    # NOTE: as a private dependency, covfie cxxstd can differ from the
-    # celeritas "public" standard

     # Ensure consistent CUDA architectures
     depends_on("vecgeom +cuda cuda_arch=none", when="+vecgeom +cuda cuda_arch=none")
     for _arch in CudaPackage.cuda_arch_values:
-        for _pkg in ["covfie", "vecgeom"]:
-            depends_on(f"{_pkg} +cuda cuda_arch={_arch}", when=f"+{_pkg} +cuda cuda_arch={_arch}")
+        depends_on(f"vecgeom +cuda cuda_arch={_arch}", when=f"+vecgeom +cuda cuda_arch={_arch}")

-    conflicts("+rocm", when="+covfie", msg="HIP support is not yet available with covfie")
     conflicts("+rocm", when="+cuda", msg="AMD and NVIDIA accelerators are incompatible")
     conflicts("+rocm", when="+vecgeom", msg="HIP support is only available with ORANGE")
-    for _arch in ["cuda", "rocm"]:
-        conflicts("+perfetto", when=f"+{_arch}", msg="Perfetto is only used for CPU profiling")

     # geant4@11.3.0 now returns const G4Element::GetElementTable()
     patch(
@@ -149,18 +134,7 @@ def cmake_args(self):
             define("CELERITAS_USE_Python", True),
         ]

-        # NOTE: package names are stylized like the dependency asks: Find{pkg}.cmake
-        for pkg in [
-            "covfie",
-            "CUDA",
-            "Geant4",
-            "HepMC3",
-            "OpenMP",
-            "Perfetto",
-            "ROOT",
-            "SWIG",
-            "VecGeom",
-        ]:
+        for pkg in ["CUDA", "Geant4", "HepMC3", "OpenMP", "ROOT", "SWIG", "VecGeom"]:
             args.append(from_variant("CELERITAS_USE_" + pkg, pkg.lower()))

         if self.spec.satisfies("+cuda"):
@@ -188,9 +162,8 @@ def cmake_args(self):
         args.extend(
             (
                 define(f"CELERITAS_BUILTIN_{pkg}", False)
-                for pkg in ["covfie", "CLI11", "GTest", "nlohmann_json", "G4VG"]
+                for pkg in ["GTest", "nlohmann_json", "G4VG"]
             )
         )
-        # NOTE: Perfetto is always vendored!

         return args
@@ -13,15 +13,13 @@ class Chafa(AutotoolsPackage):
     suitable for display in a terminal."""

     homepage = "https://hpjansson.org/chafa/"
-    url = "https://hpjansson.org/chafa/releases/chafa-1.16.1.tar.xz"
+    url = "https://hpjansson.org/chafa/releases/chafa-1.14.5.tar.xz"
     git = "https://github.com/hpjansson/chafa.git"

     license("LGPL-3.0-or-later", checked_by="Buldram")
     maintainers("Buldram")

     version("master", branch="master")
-    version("1.16.1", sha256="4a25debb71530baf0a748b15cfee6b8da6b513f696d9484987eaf410ecce1129")
-    version("1.16.0", sha256="bf863e57b6200b696bde1742aa95d7feb8cd23b9df1e91e91859b2b1e54fd290")
     version("1.14.5", sha256="7b5b384d5fb76a641d00af0626ed2115fb255ea371d9bef11f8500286a7b09e5")
     version("1.14.4", sha256="d0708a63f05b79269dae862a42671e38aece47fbd4fc852904bca51a65954454")
     version("1.14.3", sha256="f3d5530a96c8e55eea180448896e973093e0302f4cbde45d028179af8cfd90f3")
@@ -77,19 +75,6 @@ def configure_args(self):
             *self.with_or_without("jxl"),
         ]

-    @run_after("install")
-    def install_completions(self):
-        mkdirp(zsh_completion_path(self.prefix))
-        install(
-            "tools/completions/zsh-completion.zsh", zsh_completion_path(self.prefix) / "_chafa"
-        )
-        if self.spec.satisfies("@1.16.1:"):
-            mkdirp(fish_completion_path(self.prefix))
-            install(
-                "tools/completions/fish-completion.fish",
-                fish_completion_path(self.prefix) / "chafa.fish",
-            )
-
     @run_after("install", when="+man")
     def install_man(self):
         mkdirp(prefix.share.man.man1)
@@ -31,7 +31,6 @@ def url_for_version(self, version):
    license("NCSA")

    version("master", branch="amd-stg-open", deprecated=True)
    version("6.4.0", sha256="dca1c145a23f05229d5d646241f9d1d3c5dbf1d745b338ae020eabe33beb965c")
    version("6.3.3", sha256="4df9aba24e574edf23844c0d2d9dda112811db5c2b08c9428604a21b819eb23d")
    version("6.3.2", sha256="1f52e45660ea508d3fe717a9903fe27020cee96de95a3541434838e0193a4827")
    version("6.3.1", sha256="e9c2481cccacdea72c1f8d3970956c447cec47e18dfb9712cbbba76a2820552c")
@@ -95,7 +94,6 @@ def url_for_version(self, version):
        "6.3.1",
        "6.3.2",
        "6.3.3",
        "6.4.0",
        "master",
    ]:
        # llvm libs are linked statically, so this *could* be a build dep
@@ -125,7 +123,6 @@ def url_for_version(self, version):
        "6.3.1",
        "6.3.2",
        "6.3.3",
        "6.4.0",
    ]:
        depends_on(f"rocm-core@{ver}", when=f"@{ver}")
@@ -6,9 +6,10 @@
import sys
from typing import List

import _vendoring.archspec.cpu
from spack_repo.builtin.build_systems.generic import Package

import archspec.cpu

from llnl.util import lang

import spack.compilers.libraries
@@ -182,12 +183,12 @@ def setup_dependent_build_environment(
            env.set(f"SPACK_{wrapper_var_name}_RPATH_ARG", compiler_pkg.rpath_arg)

        uarch = dependent_spec.architecture.target
        version_number, _ = _vendoring.archspec.cpu.version_components(
        version_number, _ = archspec.cpu.version_components(
            compiler_pkg.spec.version.dotted_numeric_string
        )
        try:
            isa_arg = uarch.optimization_flags(compiler_pkg.archspec_name(), version_number)
        except (ValueError, _vendoring.archspec.cpu.UnsupportedMicroarchitecture):
        except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
            isa_arg = ""

        if isa_arg:
@@ -21,7 +21,6 @@ class ComposableKernel(CMakePackage):
    license("MIT")

    version("master", branch="develop", deprecated=True)
    version("6.4.0", sha256="8dbfea0bdc4950ca60e8d1ea43edf1f515c4a34e47ead951415c49a0669a3baf")
    version("6.3.3", sha256="b7102efba044455416a6127af1951019fe8365a653ea7eb0b1d83bb4542c9309")
    version("6.3.2", sha256="875237fe493ff040f8f63b827cddf2ff30a8d3aa18864f87d0e35323c7d62a2d")
    version("6.3.1", sha256="3e8c8c832ca3f9ceb99ab90f654b93b7db876f08d90eda87a70bc629c854052a")
@@ -66,7 +65,6 @@ class ComposableKernel(CMakePackage):

    for ver in [
        "master",
        "6.4.0",
        "6.3.3",
        "6.3.2",
        "6.3.1",
|
@ -1,32 +0,0 @@
|
||||
# Copyright Spack Project Developers. See COPYRIGHT file for details.
|
||||
#
|
||||
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
||||
|
||||
from spack_repo.builtin.build_systems.makefile import MakefilePackage
|
||||
|
||||
from spack.package import *
|
||||
|
||||
|
||||
class Cryoef(MakefilePackage):
|
||||
"""An open-source software package for robust analysis of the orientation distribution
|
||||
of cryoelectron microscopy data."""
|
||||
|
||||
homepage = "https://www.mrc-lmb.cam.ac.uk/crusso/cryoEF"
|
||||
url = "https://www.mrc-lmb.cam.ac.uk/crusso/cryoEF/cryoEF_v1.1.0.tar.gz"
|
||||
|
||||
version("1.1.0", sha256="655ed8543a0226754bdeb6e0dd4efc0467f15dc4c9c963c44ef7b8d3d0e41b62")
|
||||
|
||||
depends_on("c", type="build")
|
||||
depends_on("cxx", type="build")
|
||||
depends_on("fftw-api@3")
|
||||
|
||||
def patch(self):
|
||||
filter_file(
|
||||
"-lfftw3", f"-lfftw3 {self.spec['fftw-api'].libs.ld_flags} -no-pie", "Makefile"
|
||||
)
|
||||
|
||||
def install(self, spec, prefix):
|
||||
install_tree("TestData", prefix.TestData)
|
||||
install_tree("bin", prefix.bin)
|
||||
install_tree("lib", prefix.lib)
|
||||
install("PlotOD.py", prefix.bin)
|
@@ -41,4 +41,3 @@ class CyrusSasl(AutotoolsPackage):
    depends_on("groff", type="build")
    depends_on("openssl", type="link")
    depends_on("libxcrypt", type="link")
    depends_on("krb5", type="link")
@@ -22,7 +22,6 @@ class Direnv(GoPackage):

    # Versions (newest to oldest)
    version("master", branch="master")
    version("2.36.0", sha256="edb89ca67ef46a792d4e20177dae9dbd229e26dcbcfb17baa9645c1ff7cc47b0")
    version("2.35.0", sha256="a7aaec49d1b305f0745dad364af967fb3dc9bb5befc9f29d268d528b5a474e57")
    version("2.34.0", sha256="3d7067e71500e95d69eac86a271a6b6fc3f2f2817ba0e9a589524bf3e73e007c")
    version("2.33.0", sha256="8ef18051aa6bdcd6b59f04f02acdd0b78849b8ddbdbd372d4957af7889c903ea")
@@ -35,6 +34,5 @@ class Direnv(GoPackage):
    version("2.11.3", sha256="2d34103a7f9645059270763a0cfe82085f6d9fe61b2a85aca558689df0e7b006")

    # Build dependencies
    depends_on("go@1.24:", type="build", when="@2.36:")
    depends_on("go@1.20:", type="build", when="@2.33:")
    depends_on("go@1.16:", type="build", when="@2.28:")
@@ -26,7 +26,6 @@ class E2fsprogs(AutotoolsPackage):
    depends_on("c", type="build")  # generated
    depends_on("cxx", type="build")  # generated

    depends_on("uuid")
    depends_on("texinfo", type="build")
    depends_on("fuse", when="+fuse2fs")
    depends_on("pkgconfig", when="+fuse2fs")
@@ -22,7 +22,6 @@ class Edm4hep(CMakePackage):
    license("Apache-2.0")

    version("main", branch="main")
    version("0.99.2", sha256="b3e7abb61fd969e4c9aef55dd6839a2186bf0b0d3801174fe6e0b9df8e0ebace")
    version("0.99.1", sha256="84d990f09dbd0ad2198596c0c51238a4b15391f51febfb15dd3d191dc7aae9f4")
    version("0.99", sha256="3636e8c14474237029bf1a8be11c53b57ad3ed438fd70a7e9b87c5d08f1f2ea6")
    version("0.10.5", sha256="003c8e0c8e1d1844592d43d41384f4320586fbfa51d4d728ae0870b9c4f78d81")
@@ -80,7 +79,6 @@ class Edm4hep(CMakePackage):
    depends_on("podio@1:", when="@0.99:")
    depends_on("podio@0.15:", when="@:0.10.5")
    depends_on("podio@:1.1", when="@:0.99.0")
    depends_on("podio@1.3:", when="@0.99.2:")
    for _std in _cxxstd_values:
        for _v in _std:
            depends_on(f"podio cxxstd={_v.value}", when=f"cxxstd={_v.value}")
@@ -111,8 +109,6 @@ def cmake_args(self):
            self.define("BUILD_TESTING", self.run_tests),
            self.define_from_variant("EDM4HEP_WITH_JSON", "json"),
        ]
        if self.spec.satisfies("@:0.99.1 ^podio@1.3:"):
            args.append(self.define("PODIO_USE_CLANG_FORMAT", False))
        return args

    def setup_run_environment(self, env: EnvironmentModifications) -> None:
@@ -16,20 +16,67 @@ class Fairlogger(CMakePackage):
    maintainers("dennisklein", "ChristianTackeGSI")

    version("develop", branch="dev", get_full_repo=True)
    version("2.2.0", sha256="8dfb11e3aa0a9c545f3dfb310d261956727cea558d4123fd8c9c98e135e4d02b")
    version("1.11.1", sha256="bba5814f101d705792499e43b387190d8b8c7592466171ae045d4926485f2f70")
    version("1.10.4", sha256="2fa321893f2c8c599cca160db243299ce1e941fbfb3f935b1139caa943bc0dba")
    version("1.9.3", sha256="0c02076ed708372d5ae7bdebcefc8e45a8cbfa480eea781308336d60a2781f3a")
    version(
        "1.11.1",
        sha256="bba5814f101d705792499e43b387190d8b8c7592466171ae045d4926485f2f70",
        "1.9.0",
        sha256="13bcaa0d4129f8d4e69a0a2ece8e5b7073760082c8aa028e3fc0c11106503095",
        deprecated=True,
    )
    version(
        "1.10.4",
        sha256="2fa321893f2c8c599cca160db243299ce1e941fbfb3f935b1139caa943bc0dba",
        "1.8.0",
        sha256="3f0a38dba1411b542d998e02badcc099c057b33a402954fc5c2ab74947a0c42c",
        deprecated=True,
    )
    version(
        "1.9.3",
        sha256="0c02076ed708372d5ae7bdebcefc8e45a8cbfa480eea781308336d60a2781f3a",
        "1.7.0",
        sha256="ef467f0a70afc0549442323d70b165fa0b0b4b4e6f17834573ca15e8e0b007e4",
        deprecated=True,
    )
    version(
        "1.6.2",
        sha256="5c6ef0c0029eb451fee71756cb96e6c5011040a9813e8889667b6f3b6b04ed03",
        deprecated=True,
    )
    version(
        "1.6.1",
        sha256="3894580f4c398d724ba408e410e50f70c9f452e8cfaf7c3ff8118c08df28eaa8",
        deprecated=True,
    )
    version(
        "1.6.0",
        sha256="721e8cadfceb2f63014c2a727e098babc6deba653baab8866445a772385d0f5b",
        deprecated=True,
    )
    version(
        "1.5.0",
        sha256="8e74e0b1e50ee86f4fca87a44c6b393740b32099ac3880046bf252c31c58dd42",
        deprecated=True,
    )
    version(
        "1.4.0",
        sha256="75457e86984cc03ce87d6ad37adc5aab1910cabd39a9bbe5fb21ce2475a91138",
        deprecated=True,
    )
    version(
        "1.3.0",
        sha256="5cedea2773f7091d69aae9fd8f724e6e47929ee3784acdd295945a848eb36b93",
        deprecated=True,
    )
    version(
        "1.2.0",
        sha256="bc0e049cf84ceb308132d8679e7f22fcdca5561dda314d5233d0d5fe2b0f8c62",
        deprecated=True,
    )
    version(
        "1.1.0",
        sha256="e185e5bd07df648224f85e765d18579fae0de54adaab9a194335e3ad6d3d29f7",
        deprecated=True,
    )
    version(
        "1.0.6",
        sha256="2fc266a6e494adda40837be406aef8d9838f385ffd64fbfafb1164833906b4e0",
        deprecated=True,
    )

@@ -45,21 +92,14 @@ class Fairlogger(CMakePackage):
    variant(
        "cxxstd",
        default="default",
        values=(
            "default",
            conditional("11", when="@:1.9"),
            conditional("14", when="@:1"),
            "17",
            "20",
            "23",
            "26",
        ),
        values=("default", conditional("11", when="@:1.9"), "14", "17", "20"),
        multi=False,
        description="Use the specified C++ standard when building.",
    )
    variant(
        "pretty", default=False, description="Use BOOST_PRETTY_FUNCTION macro (Supported by 1.4+)."
    )
    conflicts("+pretty", when="@:1.3")

    depends_on("cxx", type="build")  # generated

@@ -68,15 +108,14 @@ class Fairlogger(CMakePackage):

    depends_on("boost", when="+pretty")
    conflicts("^boost@1.70:", when="^cmake@:3.14")
    depends_on("fmt")
    depends_on("fmt@:8", when="@:1.9")
    depends_on("fmt@5.3.0:5", when="@1.6.0:1.6.1")
    depends_on("fmt@5.3.0:", when="@1.6.2:")

    def patch(self):
        """FairLogger gets its version number from git.

        The tarball doesn't have that information, so we patch the spack
        version into CMakeLists.txt.
        """
        But the tarball doesn't have that information, so
        we patch the spack version into CMakeLists.txt"""
        if not self.spec.satisfies("@develop"):
            filter_file(
                r"(get_git_version\(.*)\)",
@@ -84,17 +123,14 @@ def patch(self):
                "CMakeLists.txt",
            )

        if self.spec.satisfies("@:1"):
            filter_file(r"(LANGUAGES C CXX)", r"LANGUAGES CXX", "CMakeLists.txt")

    def cmake_args(self):
        args = [
            self.define("DISABLE_COLOR", True),
            self.define_from_variant("USE_BOOST_PRETTY_FUNCTION", "pretty"),
            self.define("USE_EXTERNAL_FMT", True),
        ]
        args = [self.define("DISABLE_COLOR", True)]
        if self.spec.variants["cxxstd"].value != "default":
            args.append(self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"))
        if self.spec.satisfies("@1.4:"):
            args.append(self.define_from_variant("USE_BOOST_PRETTY_FUNCTION", "pretty"))
        if self.spec.satisfies("@1.6:"):
            args.append(self.define("USE_EXTERNAL_FMT", True))
        if self.spec.satisfies("^boost@:1.69"):
            args.append(self.define("Boost_NO_BOOST_CMAKE", True))
        return args
@@ -57,7 +57,6 @@ class Fairmq(CMakePackage):
    depends_on("git")

    depends_on("boost@1.66: +container+program_options+filesystem+date_time+regex")
    conflicts("^boost@1.88:")
    depends_on("fairlogger@1.6: +pretty")
    depends_on("libzmq@4.1.4:")
Some files were not shown because too many files have changed in this diff.