Compare commits: develop...stable-spe
6 commits

SHA1:
37bb0843fd
07a8f6235b
485291ef20
8363fbf40f
e4b898f7c3
a5f1a4ea81
.github/workflows/import-check.yaml (1 line changed, vendored)

@@ -6,7 +6,6 @@ on:
 jobs:
   # Check we don't make the situation with circular imports worse
   import-check:
-    continue-on-error: true
     runs-on: ubuntu-latest
     steps:
      - uses: julia-actions/setup-julia@v2
.github/workflows/sync-packages.yaml (2 lines changed, vendored)

@@ -28,7 +28,7 @@ jobs:
        run: |
          cd spack-packages
          git-filter-repo --quiet --source ../spack \
-           --path var/spack/repos/ --path-rename var/spack/repos/:python/ \
+           --subdirectory-filter var/spack/repos \
            --path share/spack/gitlab/cloud_pipelines/ --path-rename share/spack/gitlab/cloud_pipelines/:.ci/gitlab/ \
            --refs develop
      - name: Push
@@ -276,7 +276,7 @@ remove dependent packages *before* removing their dependencies or use the

 Garbage collection
 ^^^^^^^^^^^^^^^^^^

-When Spack builds software from sources, it often installs tools that are needed
+When Spack builds software from sources, if often installs tools that are needed
 just to build or test other software. These are not necessary at runtime.
 To support cases where removing these tools can be a benefit Spack provides
 the ``spack gc`` ("garbage collector") command, which will uninstall all unneeded packages:
@@ -89,7 +89,7 @@ You can see that the mirror is added with ``spack mirror list`` as follows:

     spack-public https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/


-At this point, you've created a buildcache, but Spack hasn't indexed it, so if
+At this point, you've create a buildcache, but spack hasn't indexed it, so if
 you run ``spack buildcache list`` you won't see any results. You need to index
 this new build cache as follows:

@@ -318,7 +318,7 @@ other system dependencies. However, they are still compatible with tools like
 ``skopeo``, ``podman``, and ``docker`` for pulling and pushing.

 .. note::

-    The Docker ``overlayfs2`` storage driver is limited to 128 layers, above which a
+    The docker ``overlayfs2`` storage driver is limited to 128 layers, above which a
     ``max depth exceeded`` error may be produced when pulling the image. There
     are `alternative drivers <https://docs.docker.com/storage/storagedriver/>`_.

@@ -14,7 +14,7 @@ is an entire command dedicated to the management of every aspect of bootstrappin

 .. command-output:: spack bootstrap --help

-Spack is configured to bootstrap its dependencies lazily by default; i.e., the first time they are needed and
+Spack is configured to bootstrap its dependencies lazily by default; i.e. the first time they are needed and
 can't be found. You can readily check if any prerequisite for using Spack is missing by running:

 .. code-block:: console

@@ -36,8 +36,8 @@ can't be found. You can readily check if any prerequisite for using Spack is mis

 In the case of the output shown above Spack detected that both ``clingo`` and ``gnupg``
 are missing and it's giving detailed information on why they are needed and whether
-they can be bootstrapped. The return code of this command summarizes the results; if any
-dependencies are missing, the return code is ``1``, otherwise ``0``. Running a command that
+they can be bootstrapped. The return code of this command summarizes the results, if any
+dependencies are missing the return code is ``1``, otherwise ``0``. Running a command that
 concretizes a spec, like:

 .. code-block:: console
@@ -228,7 +228,7 @@ def setup(sphinx):
     ("py:class", "spack.install_test.Pb"),
     ("py:class", "spack.filesystem_view.SimpleFilesystemView"),
     ("py:class", "spack.traverse.EdgeAndDepth"),
-    ("py:class", "_vendoring.archspec.cpu.microarchitecture.Microarchitecture"),
+    ("py:class", "archspec.cpu.microarchitecture.Microarchitecture"),
     ("py:class", "spack.compiler.CompilerCache"),
     # TypeVar that is not handled correctly
     ("py:class", "llnl.util.lang.T"),
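For context, the tuples in the hunk above live in Sphinx's ``nitpick_ignore`` list in ``conf.py``; a minimal sketch of how such a list is declared (entries copied from the diff, not a complete configuration):

```python
# Sketch of a Sphinx conf.py nitpick_ignore list: each entry is a
# (domain:role, target) pair whose unresolved cross-references Sphinx
# should not warn about. Entries here mirror the diff above.
nitpick_ignore = [
    ("py:class", "spack.install_test.Pb"),
    ("py:class", "spack.traverse.EdgeAndDepth"),
    # Targets under a vendoring prefix must track the prefix used on each branch:
    ("py:class", "_vendoring.archspec.cpu.microarchitecture.Microarchitecture"),
]
```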
@@ -148,8 +148,8 @@ this can expose you to attacks. Use at your own risk.
 ``ssl_certs``
 --------------------

-Path to custom certificates for SSL verification. The value can be a
-filesystem path, or an environment variable that expands to an absolute file path.
+Path to custom certificats for SSL verification. The value can be a
+filesytem path, or an environment variable that expands to an absolute file path.
 The default value is set to the environment variable ``SSL_CERT_FILE``
 to use the same syntax used by many other applications that automatically
 detect custom certificates.
@@ -11,7 +11,7 @@ Container Images
 Spack :ref:`environments` can easily be turned into container images. This page
 outlines two ways in which this can be done:

-1. By installing the environment on the host system and copying the installations
+1. By installing the environment on the host system, and copying the installations
    into the container image. This approach does not require any tools like Docker
    or Singularity to be installed.
 2. By generating a Docker or Singularity recipe that can be used to build the

@@ -56,8 +56,8 @@ environment roots and its runtime dependencies.

 .. note::

-    When using registries like GHCR and Docker Hub, the ``--oci-password`` flag specifies not
-    the password for your account, but rather a personal access token you need to generate separately.
+    When using registries like GHCR and Docker Hub, the ``--oci-password`` flag is not
+    the password for your account, but a personal access token you need to generate separately.

 The specified ``--base-image`` should have a libc that is compatible with the host system.
 For example if your host system is Ubuntu 20.04, you can use ``ubuntu:20.04``, ``ubuntu:22.04``
@@ -20,7 +20,7 @@ be present on the machine where Spack is run:
     :header-rows: 1

 These requirements can be easily installed on most modern Linux systems;
-on macOS, the Command Line Tools package is required, and a full Xcode suite
+on macOS, the Command Line Tools package is required, and a full XCode suite
 may be necessary for some packages such as Qt and apple-gl. Spack is designed
 to run on HPC platforms like Cray. Not all packages should be expected
 to work on all platforms.
@@ -8,7 +8,7 @@
 Modules (modules.yaml)
 ======================

-The use of module systems to manage user environments in a controlled way
+The use of module systems to manage user environment in a controlled way
 is a common practice at HPC centers that is sometimes embraced also by
 individual programmers on their development machines. To support this
 common practice Spack integrates with `Environment Modules
@@ -490,7 +490,7 @@ that are already in the Lmod hierarchy.


 .. note::

-    Tcl and Lua modules also allow for explicit conflicts between module files.
+    Tcl and Lua modules also allow for explicit conflicts between modulefiles.

 .. code-block:: yaml


@@ -513,7 +513,7 @@ that are already in the Lmod hierarchy.
     :meth:`~spack.spec.Spec.format` method.

 For Lmod and Environment Modules versions prior 4.2, it is important to
-express the conflict on both module files conflicting with each other.
+express the conflict on both modulefiles conflicting with each other.


 .. note::

@@ -550,7 +550,7 @@ that are already in the Lmod hierarchy.

 .. warning::

     Consistency of Core packages

-    The user is responsible for maintaining consistency among core packages, as ``core_specs``
+    The user is responsible for maintining consistency among core packages, as ``core_specs``
     bypasses the hierarchy that allows Lmod to safely switch between coherent software stacks.

 .. warning::
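For reference on the explicit-conflict option discussed in these hunks, a minimal ``modules.yaml`` sketch; the scope name and projection token follow the usual Spack configuration layout, and the values are illustrative:

```yaml
# Sketch: declare that every generated Tcl modulefile conflicts with any
# other modulefile of the same package name, so only one version of a
# package can be loaded at a time.
modules:
  default:
    tcl:
      all:
        conflict:
          - '{name}'
```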
@@ -179,7 +179,7 @@ Spack can be found at :ref:`package_class_structure`.

 .. code-block:: python

-    class Foo(CMakePackage):
+    class Foo(CmakePackage):
         def cmake_args(self):
             ...

@@ -1212,7 +1212,7 @@ class-level tarball URL and VCS. For example:
     version("master", branch="master")
     version("12.12.1", md5="ecd4606fa332212433c98bf950a69cc7")
     version("12.10.1", md5="667333dbd7c0f031d47d7c5511fd0810")
-    version("12.8.1", md5="9f37f683ee2b427b5540db8a20ed6b15")
+    version("12.8.1", "9f37f683ee2b427b5540db8a20ed6b15")

 If a package contains both a ``url`` and ``git`` class-level attribute,
 Spack decides which to use based on the arguments to the ``version()``
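This hunk swaps the keyword form ``md5=...`` for a positional checksum. A toy sketch (not Spack's real ``version()`` implementation) of how the two calling conventions can be made equivalent:

```python
# Toy model of a version() directive accepting a checksum either
# positionally or via the explicit md5= keyword; the dict it returns is
# purely illustrative.
def version(ver, checksum=None, *, md5=None, branch=None):
    """Record a package version; the md5= keyword form is more explicit."""
    digest = md5 if md5 is not None else checksum
    return {"version": ver, "md5": digest, "branch": branch}

# Both spellings from the diff describe the same version:
a = version("12.8.1", md5="9f37f683ee2b427b5540db8a20ed6b15")
b = version("12.8.1", "9f37f683ee2b427b5540db8a20ed6b15")
print(a == b)  # True
```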
@@ -1343,7 +1343,7 @@ Submodules

     version("1.0.1", tag="v1.0.1", submodules=True)

-If a package needs more fine-grained control over submodules, define
+If a package has needs more fine-grained control over submodules, define
 ``submodules`` to be a callable function that takes the package instance as
 its only argument. The function should return a list of submodules to be fetched.

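A hypothetical sketch of such a ``submodules`` callable; the ``FakeSpec``/``FakePackage`` classes are stand-ins for Spack's real objects, and the submodule paths and ``+docs`` variant are invented for illustration:

```python
# Stand-ins for Spack's Spec/Package, just enough to exercise the callable.
class FakeSpec:
    def __init__(self, variants):
        self._variants = set(variants)

    def satisfies(self, constraint):
        # Toy version of Spec.satisfies: check a single "+variant" token.
        return constraint.lstrip("+") in self._variants

class FakePackage:
    def __init__(self, spec):
        self.spec = spec

def submodules(package):
    """Return the list of submodule paths to fetch for this package."""
    mods = ["third_party/core"]
    if package.spec.satisfies("+docs"):
        mods.append("third_party/docs")
    return mods

print(submodules(FakePackage(FakeSpec({"docs"}))))
# ['third_party/core', 'third_party/docs']
```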
@@ -2308,19 +2308,31 @@ looks like this:

     parallel = False

-You can also disable parallel builds only for specific make
-invocation:
+Similarly, you can disable parallel builds only for specific make
+commands, as ``libdwarf`` does:

 .. code-block:: python
-   :emphasize-lines: 5
+   :emphasize-lines: 9, 12
    :linenos:

    class Libelf(Package):
        ...

        def install(self, spec, prefix):
+           configure("--prefix=" + prefix,
+                     "--enable-shared",
+                     "--disable-dependency-tracking",
+                     "--disable-debug")
+           make()
+
+           # The mkdir commands in libelf's install can fail in parallel
            make("install", parallel=False)

+The first make will run in parallel here, but the second will not. If
+you set ``parallel`` to ``False`` at the package level, then each call
+to ``make()`` will be sequential by default, but packagers can call
+``make(parallel=True)`` to override it.
+
 Note that the ``--jobs`` option works out of the box for all standard
 build systems. If you are using a non-standard build system instead, you
 can use the variable ``make_jobs`` to extract the number of jobs specified
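The interaction described above, a package-level ``parallel`` default with a per-call override, can be modeled in a few lines. This is a toy sketch, not Spack's implementation; the ``make`` method and job counts are invented:

```python
# Toy model: a per-call parallel= flag wins over the class-level default.
class Package:
    parallel = True  # package-level default

    def make(self, *targets, parallel=None):
        # Per-call flag takes precedence; otherwise use the class attribute.
        effective = self.parallel if parallel is None else parallel
        jobs = 8 if effective else 1
        return f"make -j{jobs} {' '.join(targets)}".strip()

class Libelf(Package):
    parallel = False  # serialize every make by default

pkg = Libelf()
print(pkg.make())                          # make -j1
print(pkg.make("install", parallel=True))  # make -j8 install
```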
@@ -2495,7 +2507,7 @@ necessary when there are breaking changes in the dependency that the
 package cannot handle. In Spack we often add forward compatibility
 bounds only at the time a new, breaking version of a dependency is
 released. As with backward compatibility, it is typical to see a list
-of forward compatibility bounds in a package file as separate lines:
+of forward compatibility bounds in a package file as seperate lines:

 .. code-block:: python

@@ -3371,7 +3383,7 @@ the above attribute implementations:
         "/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib/libFooBaz.so"
     ])

-    # baz library directories in the baz subdirectory of the foo prefix
+    # baz library directories in the baz subdirectory of the foo porefix
     >>> spec["baz"].libs.directories
     [
         "/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib"
@@ -3732,6 +3744,9 @@ the build system. The build systems currently supported by Spack are:
 | :class:`~spack_repo.builtin.build_systems.ruby`          | Specialized build system for     |
 |                                                          | Ruby extensions                  |
 +----------------------------------------------------------+----------------------------------+
+| :class:`~spack_repo.builtin.build_systems.intel`         | Specialized build system for     |
+|                                                          | licensed Intel software          |
++----------------------------------------------------------+----------------------------------+
 | :class:`~spack_repo.builtin.build_systems.oneapi`        | Specialized build system for     |
 |                                                          | Intel oneAPI software            |
 +----------------------------------------------------------+----------------------------------+
@@ -5728,7 +5743,7 @@ running each executable, ``foo`` and ``bar``, as independent test parts.

 .. note::

     The method name ``copy_test_files`` here is for illustration purposes.
-    You are free to use a name that is better suited to your package.
+    You are free to use a name that is more suited to your package.

 The key to copying files for stand-alone testing at build time is use
 of the ``run_after`` directive, which ensures the associated files are
||||||
@ -7237,7 +7252,7 @@ which are not, there is the `checked_by` parameter in the license directive:
|
|||||||
|
|
||||||
license("<license>", when="<when>", checked_by="<github username>")
|
license("<license>", when="<when>", checked_by="<github username>")
|
||||||
|
|
||||||
When you have validated a package license, either when doing so explicitly or
|
When you have validated a github license, either when doing so explicitly or
|
||||||
as part of packaging a new package, please set the `checked_by` parameter
|
as part of packaging a new package, please set the `checked_by` parameter
|
||||||
to your Github username to signal that the license has been manually
|
to your Github username to signal that the license has been manually
|
||||||
verified.
|
verified.
|
||||||
|
@@ -214,7 +214,7 @@ package versions, simply run the following commands:

 Running ``spack mark -i --all`` tells Spack to mark all of the existing
 packages within an environment as "implicitly" installed. This tells
-Spack's garbage collection system that these packages should be cleaned up.
+spack's garbage collection system that these packages should be cleaned up.

 Don't worry however, this will not remove your entire environment.
 Running ``spack install`` will reexamine your spack environment after
|
1
lib/spack/external/_vendoring/_pyrsistent_version.pyi
vendored
Normal file
1
lib/spack/external/_vendoring/_pyrsistent_version.pyi
vendored
Normal file
@ -0,0 +1 @@
|
|||||||
|
from _pyrsistent_version import *
|
1
lib/spack/external/_vendoring/altgraph.pyi
vendored
Normal file
1
lib/spack/external/_vendoring/altgraph.pyi
vendored
Normal file
@ -0,0 +1 @@
|
|||||||
|
from altgraph import *
|
@@ -1,20 +0,0 @@
-The MIT License (MIT)
-
-Copyright (c) 2014 Anders Høst
-
-Permission is hereby granted, free of charge, to any person obtaining a copy of
-this software and associated documentation files (the "Software"), to deal in
-the Software without restriction, including without limitation the rights to
-use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
-the Software, and to permit persons to whom the Software is furnished to do so,
-subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
-FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
-COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
-IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
-CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
lib/spack/external/_vendoring/jsonschema.pyi (new file, 1 line, vendored)

@@ -0,0 +1 @@
+from jsonschema import *

lib/spack/external/_vendoring/macholib.pyi (new file, 1 line, vendored)

@@ -0,0 +1 @@
+from macholib import *
213
lib/spack/external/_vendoring/pyrsistent/__init__.pyi
vendored
Normal file
213
lib/spack/external/_vendoring/pyrsistent/__init__.pyi
vendored
Normal file
@ -0,0 +1,213 @@
|
|||||||
|
# flake8: noqa: E704
|
||||||
|
# from https://gist.github.com/WuTheFWasThat/091a17d4b5cab597dfd5d4c2d96faf09
|
||||||
|
# Stubs for pyrsistent (Python 3.6)
|
||||||
|
|
||||||
|
from typing import Any
|
||||||
|
from typing import AnyStr
|
||||||
|
from typing import Callable
|
||||||
|
from typing import Iterable
|
||||||
|
from typing import Iterator
|
||||||
|
from typing import List
|
||||||
|
from typing import Optional
|
||||||
|
from typing import Mapping
|
||||||
|
from typing import MutableMapping
|
||||||
|
from typing import Sequence
|
||||||
|
from typing import Set
|
||||||
|
from typing import Union
|
||||||
|
from typing import Tuple
|
||||||
|
from typing import Type
|
||||||
|
from typing import TypeVar
|
||||||
|
from typing import overload
|
||||||
|
|
||||||
|
# see commit 08519aa for explanation of the re-export
|
||||||
|
from pyrsistent.typing import CheckedKeyTypeError as CheckedKeyTypeError
|
||||||
|
from pyrsistent.typing import CheckedPMap as CheckedPMap
|
||||||
|
from pyrsistent.typing import CheckedPSet as CheckedPSet
|
||||||
|
from pyrsistent.typing import CheckedPVector as CheckedPVector
|
||||||
|
from pyrsistent.typing import CheckedType as CheckedType
|
||||||
|
from pyrsistent.typing import CheckedValueTypeError as CheckedValueTypeError
|
||||||
|
from pyrsistent.typing import InvariantException as InvariantException
|
||||||
|
from pyrsistent.typing import PClass as PClass
|
||||||
|
from pyrsistent.typing import PBag as PBag
|
||||||
|
from pyrsistent.typing import PDeque as PDeque
|
||||||
|
from pyrsistent.typing import PList as PList
|
||||||
|
from pyrsistent.typing import PMap as PMap
|
||||||
|
from pyrsistent.typing import PMapEvolver as PMapEvolver
|
||||||
|
from pyrsistent.typing import PSet as PSet
|
||||||
|
from pyrsistent.typing import PSetEvolver as PSetEvolver
|
||||||
|
from pyrsistent.typing import PTypeError as PTypeError
|
||||||
|
from pyrsistent.typing import PVector as PVector
|
||||||
|
from pyrsistent.typing import PVectorEvolver as PVectorEvolver
|
||||||
|
|
||||||
|
T = TypeVar('T')
|
||||||
|
KT = TypeVar('KT')
|
||||||
|
VT = TypeVar('VT')
|
||||||
|
|
||||||
|
def pmap(initial: Union[Mapping[KT, VT], Iterable[Tuple[KT, VT]]] = {}, pre_size: int = 0) -> PMap[KT, VT]: ...
|
||||||
|
def m(**kwargs: VT) -> PMap[str, VT]: ...
|
||||||
|
|
||||||
|
def pvector(iterable: Iterable[T] = ...) -> PVector[T]: ...
|
||||||
|
def v(*iterable: T) -> PVector[T]: ...
|
||||||
|
|
||||||
|
def pset(iterable: Iterable[T] = (), pre_size: int = 8) -> PSet[T]: ...
|
||||||
|
def s(*iterable: T) -> PSet[T]: ...
|
||||||
|
|
||||||
|
# see class_test.py for use cases
|
||||||
|
Invariant = Tuple[bool, Optional[Union[str, Callable[[], str]]]]
|
||||||
|
|
||||||
|
@overload
|
||||||
|
def field(
|
||||||
|
type: Union[Type[T], Sequence[Type[T]]] = ...,
|
||||||
|
invariant: Callable[[Any], Union[Invariant, Iterable[Invariant]]] = lambda _: (True, None),
|
||||||
|
initial: Any = object(),
|
||||||
|
mandatory: bool = False,
|
||||||
|
factory: Callable[[Any], T] = lambda x: x,
|
||||||
|
serializer: Callable[[Any, T], Any] = lambda _, value: value,
|
||||||
|
) -> T: ...
|
||||||
|
# The actual return value (_PField) is irrelevant after a PRecord has been instantiated,
|
||||||
|
# see https://github.com/tobgu/pyrsistent/blob/master/pyrsistent/_precord.py#L10
|
||||||
|
@overload
|
||||||
|
def field(
|
||||||
|
type: Any = ...,
|
||||||
|
invariant: Callable[[Any], Union[Invariant, Iterable[Invariant]]] = lambda _: (True, None),
|
||||||
|
initial: Any = object(),
|
||||||
|
mandatory: bool = False,
|
||||||
|
factory: Callable[[Any], Any] = lambda x: x,
|
||||||
|
serializer: Callable[[Any, Any], Any] = lambda _, value: value,
|
||||||
|
) -> Any: ...
|
||||||
|
|
||||||
|
# Use precise types for the simplest use cases, but fall back to Any for
|
||||||
|
# everything else. See record_test.py for the wide range of possible types for
|
||||||
|
# item_type
|
||||||
|
@overload
|
||||||
|
def pset_field(
|
||||||
|
item_type: Type[T],
|
||||||
|
optional: bool = False,
|
||||||
|
initial: Iterable[T] = ...,
|
||||||
|
) -> PSet[T]: ...
|
||||||
|
@overload
|
||||||
|
def pset_field(
|
||||||
|
item_type: Any,
|
||||||
|
optional: bool = False,
|
||||||
|
initial: Any = (),
|
||||||
|
) -> PSet[Any]: ...
|
||||||
|
|
||||||
|
@overload
|
||||||
|
def pmap_field(
|
||||||
|
key_type: Type[KT],
|
||||||
|
value_type: Type[VT],
|
||||||
|
optional: bool = False,
|
||||||
|
invariant: Callable[[Any], Tuple[bool, Optional[str]]] = lambda _: (True, None),
|
||||||
|
) -> PMap[KT, VT]: ...
|
||||||
|
@overload
|
||||||
|
def pmap_field(
|
||||||
|
key_type: Any,
|
||||||
|
value_type: Any,
|
||||||
|
optional: bool = False,
|
||||||
|
invariant: Callable[[Any], Tuple[bool, Optional[str]]] = lambda _: (True, None),
|
||||||
|
) -> PMap[Any, Any]: ...
|
||||||
|
|
||||||
|
@overload
|
||||||
|
def pvector_field(
|
||||||
|
item_type: Type[T],
|
||||||
|
optional: bool = False,
|
||||||
|
initial: Iterable[T] = ...,
|
||||||
|
) -> PVector[T]: ...
|
||||||
|
@overload
|
||||||
|
def pvector_field(
|
||||||
|
item_type: Any,
|
||||||
|
optional: bool = False,
|
||||||
|
initial: Any = (),
|
||||||
|
) -> PVector[Any]: ...
|
||||||
|
|
||||||
|
def pbag(elements: Iterable[T]) -> PBag[T]: ...
|
||||||
|
def b(*elements: T) -> PBag[T]: ...
|
||||||
|
|
||||||
|
def plist(iterable: Iterable[T] = (), reverse: bool = False) -> PList[T]: ...
|
||||||
|
def l(*elements: T) -> PList[T]: ...
|
||||||
|
|
||||||
|
def pdeque(iterable: Optional[Iterable[T]] = None, maxlen: Optional[int] = None) -> PDeque[T]: ...
|
||||||
|
def dq(*iterable: T) -> PDeque[T]: ...
|
||||||
|
|
||||||
|
@overload
|
||||||
|
def optional(type: T) -> Tuple[T, Type[None]]: ...
|
||||||
|
@overload
|
||||||
|
def optional(*typs: Any) -> Tuple[Any, ...]: ...
|
||||||
|
|
||||||
|
T_PRecord = TypeVar('T_PRecord', bound='PRecord')
|
||||||
|
class PRecord(PMap[AnyStr, Any]):
|
||||||
|
_precord_fields: Mapping
|
||||||
|
_precord_initial_values: Mapping
|
||||||
|
|
||||||
|
def __hash__(self) -> int: ...
|
||||||
|
def __init__(self, **kwargs: Any) -> None: ...
|
||||||
|
def __iter__(self) -> Iterator[Any]: ...
|
||||||
|
def __len__(self) -> int: ...
|
||||||
|
@classmethod
|
||||||
|
def create(
|
||||||
|
cls: Type[T_PRecord],
|
||||||
|
kwargs: Mapping,
|
||||||
|
_factory_fields: Optional[Iterable] = None,
|
||||||
|
ignore_extra: bool = False,
|
||||||
|
) -> T_PRecord: ...
|
||||||
|
# This is OK because T_PRecord is a concrete type
|
||||||
|
def discard(self: T_PRecord, key: KT) -> T_PRecord: ...
|
||||||
|
def remove(self: T_PRecord, key: KT) -> T_PRecord: ...
|
||||||
|
|
||||||
|
    def serialize(self, format: Optional[Any] = ...) -> MutableMapping: ...

    # From pyrsistent documentation:
    # This set function differs slightly from that in the PMap
    # class. First of all it accepts key-value pairs. Second it accepts multiple key-value
    # pairs to perform one, atomic, update of multiple fields.
    @overload
    def set(self, key: KT, val: VT) -> Any: ...
    @overload
    def set(self, **kwargs: VT) -> Any: ...


def immutable(
    members: Union[str, Iterable[str]] = '',
    name: str = 'Immutable',
    verbose: bool = False,
) -> Tuple: ...  # actually a namedtuple


# ignore mypy warning "Overloaded function signatures 1 and 5 overlap with
# incompatible return types"
@overload
def freeze(o: Mapping[KT, VT]) -> PMap[KT, VT]: ...  # type: ignore
@overload
def freeze(o: List[T]) -> PVector[T]: ...  # type: ignore
@overload
def freeze(o: Tuple[T, ...]) -> Tuple[T, ...]: ...
@overload
def freeze(o: Set[T]) -> PSet[T]: ...  # type: ignore
@overload
def freeze(o: T) -> T: ...


@overload
def thaw(o: PMap[KT, VT]) -> MutableMapping[KT, VT]: ...  # type: ignore
@overload
def thaw(o: PVector[T]) -> List[T]: ...  # type: ignore
@overload
def thaw(o: Tuple[T, ...]) -> Tuple[T, ...]: ...
# collections.abc.MutableSet is kind of garbage:
# https://stackoverflow.com/questions/24977898/why-does-collections-mutableset-not-bestow-an-update-method
@overload
def thaw(o: PSet[T]) -> Set[T]: ...  # type: ignore
@overload
def thaw(o: T) -> T: ...


def mutant(fn: Callable) -> Callable: ...


def inc(x: int) -> int: ...
@overload
def discard(evolver: PMapEvolver[KT, VT], key: KT) -> None: ...
@overload
def discard(evolver: PVectorEvolver[T], key: int) -> None: ...
@overload
def discard(evolver: PSetEvolver[T], key: T) -> None: ...
def rex(expr: str) -> Callable[[Any], bool]: ...
def ny(_: Any) -> bool: ...


def get_in(keys: Iterable, coll: Mapping, default: Optional[Any] = None, no_default: bool = False) -> Any: ...
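The `freeze`/`thaw` stubs above follow the standard `typing.overload` pattern: several typed signatures followed by one untyped runtime implementation. A minimal self-contained sketch of that pattern (a hypothetical `double` function, not part of pyrsistent):

```python
from typing import Union, overload


@overload
def double(x: int) -> int: ...
@overload
def double(x: str) -> str: ...


def double(x: Union[int, str]) -> Union[int, str]:
    # Single runtime implementation behind the overloaded signatures:
    # type checkers pick the matching stub, the interpreter only runs this body.
    return x * 2


print(double(21))    # int in, int out: 42
print(double("ab"))  # str in, str out: 'abab'
```

In a `.pyi` stub file only the `@overload` signatures appear, exactly as in the vendored pyrsistent stub above; the implementation lives in the real module.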
lib/spack/external/_vendoring/ruamel.pyi  (1 line, vendored, new file)
@@ -0,0 +1 @@
+from ruamel import *

lib/spack/external/_vendoring/six/__init__.pyi  (1 line, vendored, new file)
@@ -0,0 +1 @@
+from six import *

lib/spack/external/_vendoring/six/moves/__init__.pyi  (1 line, vendored, new file)
@@ -0,0 +1 @@
+from six.moves import *

lib/spack/external/_vendoring/six/moves/configparser.pyi  (1 line, vendored, new file)
@@ -0,0 +1 @@
+from six.moves.configparser import *
lib/spack/external/archspec/README.md  (81 lines, vendored, new file)
@@ -0,0 +1,81 @@

[](https://github.com/archspec/archspec/actions)
[](https://codecov.io/gh/archspec/archspec)
[](https://archspec.readthedocs.io/en/latest/?badge=latest)

# Archspec (Python bindings)

Archspec aims at providing a standard set of human-understandable labels for
various aspects of a system architecture like CPU, network fabrics, etc. and
APIs to detect, query and compare them.

This project grew out of [Spack](https://spack.io/) and is currently under
active development. At present it supports APIs to detect and model
compatibility relationships among different CPU microarchitectures.

## Getting started with development

The `archspec` Python package needs [poetry](https://python-poetry.org/) to
be installed from VCS sources. The preferred method to install it is via
its custom installer outside of any virtual environment:

```console
$ curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python
```

You can refer to [Poetry's documentation](https://python-poetry.org/docs/#installation)
for further details or for other methods to install this tool. You'll also need `tox`
to run unit tests:

```console
$ pip install --user tox
```

Finally you'll need to clone the repository:

```console
$ git clone --recursive https://github.com/archspec/archspec.git
```

### Running unit tests

Once you have your environment ready you can run `archspec` unit tests
using ``tox`` from the root of the repository:

```console
$ tox
[ ... ]
py27: commands succeeded
py35: commands succeeded
py36: commands succeeded
py37: commands succeeded
py38: commands succeeded
pylint: commands succeeded
flake8: commands succeeded
black: commands succeeded
congratulations :)
```

## Citing Archspec

If you are referencing `archspec` in a publication, please cite the following
paper:

* Massimiliano Culpo, Gregory Becker, Carlos Eduardo Arango Gutierrez, Kenneth
  Hoste, and Todd Gamblin.
  [**`archspec`: A library for detecting, labeling, and reasoning about
  microarchitectures**](https://tgamblin.github.io/pubs/archspec-canopie-hpc-2020.pdf).
  In *2nd International Workshop on Containers and New Orchestration Paradigms
  for Isolated Environments in HPC (CANOPIE-HPC'20)*, Online Event, November
  12, 2020.

## License

Archspec is distributed under the terms of both the MIT license and the
Apache License (Version 2.0). Users may choose either license, at their
option.

All new contributions must be made under both the MIT and Apache-2.0
licenses.

See [LICENSE-MIT](https://github.com/archspec/archspec/blob/master/LICENSE-MIT),
[LICENSE-APACHE](https://github.com/archspec/archspec/blob/master/LICENSE-APACHE),
[COPYRIGHT](https://github.com/archspec/archspec/blob/master/COPYRIGHT), and
[NOTICE](https://github.com/archspec/archspec/blob/master/NOTICE) for details.

SPDX-License-Identifier: (Apache-2.0 OR MIT)

LLNL-CODE-811653
@@ -1,3 +1,3 @@
 """Init file to avoid namespace packages"""

-__version__ = "0.2.5"
+__version__ = "0.2.4"

@@ -9,8 +9,8 @@
 import argparse
 import typing

-import _vendoring.archspec
-import _vendoring.archspec.cpu
+import archspec
+import archspec.cpu


 def _make_parser() -> argparse.ArgumentParser:

@@ -24,7 +24,7 @@ def _make_parser() -> argparse.ArgumentParser:
         "-V",
         help="Show the version and exit.",
         action="version",
-        version=f"archspec, version {_vendoring.archspec.__version__}",
+        version=f"archspec, version {archspec.__version__}",
     )
     parser.add_argument("--help", "-h", help="Show the help and exit.", action="help")

@@ -45,9 +45,9 @@ def _make_parser() -> argparse.ArgumentParser:


 def cpu() -> int:
-    """Run the `_vendoring.archspec.cpu` subcommand."""
+    """Run the `archspec cpu` subcommand."""
     try:
-        print(_vendoring.archspec.cpu.host())
+        print(archspec.cpu.host())
     except FileNotFoundError as exc:
         print(exc)
         return 1

@@ -8,9 +8,9 @@
 import re
 import warnings

-import _vendoring.archspec
-import _vendoring.archspec.cpu.alias
-import _vendoring.archspec.cpu.schema
+import archspec
+import archspec.cpu.alias
+import archspec.cpu.schema

 from .alias import FEATURE_ALIASES
 from .schema import LazyDictionary

@@ -384,7 +384,7 @@ def fill_target_from_dict(name, data, targets):
     )

 known_targets = {}
-data = _vendoring.archspec.cpu.schema.TARGETS_JSON["microarchitectures"]
+data = archspec.cpu.schema.TARGETS_JSON["microarchitectures"]
 for name in data:
     if name in known_targets:
         # name was already brought in as ancestor to a target

lib/spack/external/archspec/json/COPYRIGHT  (22 lines, vendored, new file)
@@ -0,0 +1,22 @@

Intellectual Property Notice
------------------------------

Archspec is licensed under the Apache License, Version 2.0 (LICENSE-APACHE
or http://www.apache.org/licenses/LICENSE-2.0) or the MIT license,
(LICENSE-MIT or http://opensource.org/licenses/MIT), at your option.

Copyrights and patents in the Archspec project are retained by contributors.
No copyright assignment is required to contribute to Archspec.


SPDX usage
------------

Individual files contain SPDX tags instead of the full license text.
This enables machine processing of license information based on the SPDX
License Identifiers that are available here: https://spdx.org/licenses/

Files that are dual-licensed as Apache-2.0 OR MIT contain the following
text in the license header:

    SPDX-License-Identifier: (Apache-2.0 OR MIT)

lib/spack/external/vendor.txt  (vendored)
@@ -9,4 +9,3 @@ macholib==1.16.2
 altgraph==0.17.3
 ruamel.yaml==0.17.21
 typing_extensions==4.1.1
-archspec @ git+https://github.com/archspec/archspec.git@38ce485258ffc4fc6dd6688f8dc90cb269478c47
@@ -67,7 +67,7 @@ def index_by(objects, *funcs):
     }

     If any elements in funcs is a string, it is treated as the name
-    of an attribute, and acts like getattr(object, name). So
+    of an attribute, and acts like ``getattr(object, name)``. So
     shorthand for the above two indexes would be::

         index1 = index_by(list_of_specs, 'arch', 'compiler')

@@ -77,7 +77,8 @@ def index_by(objects, *funcs):

         index1 = index_by(list_of_specs, ('target', 'compiler'))

-    Keys in the resulting dict will look like ('gcc', 'skylake').
+    Keys in the resulting dict will look like ``('gcc', 'skylake')``.
+
     """
     if not funcs:
         return objects
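The `index_by` docstring above describes grouping objects into a dict keyed by attribute values. A simplified single-level sketch of that behavior (not Spack's actual implementation, which also supports tuples and multiple index levels):

```python
def index_by(objects, key):
    # Group objects into a dict keyed by an attribute name (a simplified
    # single-level sketch of the behavior the docstring describes).
    index = {}
    for obj in objects:
        index.setdefault(getattr(obj, key), []).append(obj)
    return index


class Spec:
    def __init__(self, compiler, target):
        self.compiler = compiler
        self.target = target


specs = [Spec("gcc", "skylake"), Spec("clang", "skylake"), Spec("gcc", "zen2")]
by_compiler = index_by(specs, "compiler")
print(sorted(by_compiler))  # ['clang', 'gcc']
```

Passing multiple keys, as the docstring's `('target', 'compiler')` example shows, would nest such dicts one level per key.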
@@ -315,7 +316,9 @@ def lazy_lexicographic_ordering(cls, set_hash=True):

     This is a lazy version of the tuple comparison used frequently to
     implement comparison in Python. Given some objects with fields, you
-    might use tuple keys to implement comparison, e.g.::
+    might use tuple keys to implement comparison, e.g.:
+
+    .. code-block:: python

         class Widget:
             def _cmp_key(self):

@@ -343,7 +346,9 @@ def __lt__(self):
     Lazy lexicographic comparison maps the tuple comparison shown above
     to generator functions. Instead of comparing based on pre-constructed
     tuple keys, users of this decorator can compare using elements from a
-    generator. So, you'd write::
+    generator. So, you'd write:
+
+    .. code-block:: python

         @lazy_lexicographic_ordering
         class Widget:

@@ -366,6 +371,38 @@ def cd_fun():
     only has to worry about writing ``_cmp_iter``, and making sure the
     elements in it are also comparable.

+    In some cases, you may have a fast way to determine whether two
+    objects are equal, e.g. the ``is`` function or an already-computed
+    cryptographic hash. For this, you can implement your own
+    ``_cmp_fast_eq`` function:
+
+    .. code-block:: python
+
+        @lazy_lexicographic_ordering
+        class Widget:
+            def _cmp_iter(self):
+                yield a
+                yield b
+                def cd_fun():
+                    yield c
+                    yield d
+                yield cd_fun
+                yield e
+
+            def _cmp_fast_eq(self, other):
+                return self is other or None
+
+    ``_cmp_fast_eq`` should return:
+
+    * ``True`` if ``self`` is equal to ``other``,
+    * ``False`` if ``self`` is not equal to ``other``, and
+    * ``None`` if it's not known whether they are equal, and the full
+      comparison should be done.
+
+    ``lazy_lexicographic_ordering`` uses ``_cmp_fast_eq`` to short-circuit
+    the comparison if the answer can be determined quickly. If you do not
+    implement it, it defaults to ``self is other or None``.
+
     Some things to note:

     * If a class already has ``__eq__``, ``__ne__``, ``__lt__``,
@@ -386,34 +423,40 @@ def cd_fun():
     if not hasattr(cls, "_cmp_iter"):
         raise TypeError(f"'{cls.__name__}' doesn't define _cmp_iter().")

+    # get an equal operation that allows us to short-circuit comparison;
+    # if it's not provided, default to `is`
+    _cmp_fast_eq = getattr(cls, "_cmp_fast_eq", lambda x, y: x is y or None)
+
     # comparison operators are implemented in terms of lazy_eq and lazy_lt
     def eq(self, other):
-        if self is other:
-            return True
+        fast_eq = _cmp_fast_eq(self, other)
+        if fast_eq is not None:
+            return fast_eq
         return (other is not None) and lazy_eq(self._cmp_iter, other._cmp_iter)

     def lt(self, other):
-        if self is other:
+        if _cmp_fast_eq(self, other) is True:
             return False
         return (other is not None) and lazy_lt(self._cmp_iter, other._cmp_iter)

     def ne(self, other):
-        if self is other:
-            return False
+        fast_eq = _cmp_fast_eq(self, other)
+        if fast_eq is not None:
+            return not fast_eq
         return (other is None) or not lazy_eq(self._cmp_iter, other._cmp_iter)

     def gt(self, other):
-        if self is other:
+        if _cmp_fast_eq(self, other) is True:
             return False
         return (other is None) or lazy_lt(other._cmp_iter, self._cmp_iter)

     def le(self, other):
-        if self is other:
+        if _cmp_fast_eq(self, other) is True:
             return True
         return (other is not None) and not lazy_lt(other._cmp_iter, self._cmp_iter)

     def ge(self, other):
-        if self is other:
+        if _cmp_fast_eq(self, other) is True:
             return True
         return (other is None) or not lazy_lt(self._cmp_iter, other._cmp_iter)
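The `_cmp_fast_eq` hook in the hunk above short-circuits a potentially expensive lazy field-by-field comparison when a cheap answer (identity, or a precomputed hash) is available. A simplified, self-contained sketch of the idea (a hypothetical `Widget` class, not Spack's actual decorator):

```python
def lazy_eq(left_iter, right_iter):
    # Compare two generators element by element, stopping at the first
    # mismatch instead of materializing full tuple keys up front.
    for a, b in zip(left_iter(), right_iter()):
        if a != b:
            return False
    return True


class Widget:
    def __init__(self, fields, known_hash=None):
        self.fields = fields
        self.known_hash = known_hash  # e.g. an already-computed content hash

    def _cmp_iter(self):
        yield from self.fields

    def _cmp_fast_eq(self, other):
        # True/False when the precomputed hashes decide equality;
        # None means "unknown, fall back to the full lazy comparison".
        if self.known_hash is not None and other.known_hash is not None:
            return self.known_hash == other.known_hash
        return self is other or None

    def __eq__(self, other):
        fast = self._cmp_fast_eq(other)
        if fast is not None:
            return fast
        return lazy_eq(self._cmp_iter, other._cmp_iter)


a = Widget([1, 2, 3], known_hash="abc")
b = Widget([9, 9, 9], known_hash="abc")  # hash wins: fields never iterated
c = Widget([1, 2, 3])                    # no hash: full lazy comparison runs
print(a == b)  # True (short-circuited by the hash)
print(a == c)  # True (decided by lazy_eq)
```

This mirrors the three-valued protocol documented above: `True`/`False` short-circuits, `None` defers to the generator-based comparison.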
@@ -12,9 +12,10 @@
 import warnings
 from typing import Optional, Sequence, Union

-import _vendoring.archspec.cpu
 from _vendoring.typing_extensions import TypedDict

+import archspec.cpu
+
 import llnl.util.filesystem as fs
 from llnl.util import tty

@@ -137,7 +138,7 @@ def _fix_ext_suffix(candidate_spec: "spack.spec.Spec"):
     }

     # If the current architecture is not problematic return
-    generic_target = _vendoring.archspec.cpu.host().family
+    generic_target = archspec.cpu.host().family
     if str(generic_target) not in _suffix_to_be_checked:
         return

@@ -234,7 +235,7 @@ def _root_spec(spec_str: str) -> str:
     platform = str(spack.platforms.host())

     spec_str += f" platform={platform}"
-    target = _vendoring.archspec.cpu.host().family
+    target = archspec.cpu.host().family
     spec_str += f" target={target}"

     tty.debug(f"[BOOTSTRAP ROOT SPEC] {spec_str}")

@@ -13,7 +13,7 @@
 import sys
 from typing import Dict, Optional, Tuple

-import _vendoring.archspec.cpu
+import archspec.cpu

 import spack.compilers.config
 import spack.compilers.libraries

@@ -30,7 +30,7 @@ class ClingoBootstrapConcretizer:
     def __init__(self, configuration):
         self.host_platform = spack.platforms.host()
         self.host_os = self.host_platform.default_operating_system()
-        self.host_target = _vendoring.archspec.cpu.host().family
+        self.host_target = archspec.cpu.host().family
         self.host_architecture = spack.spec.ArchSpec.default_arch()
         self.host_architecture.target = str(self.host_target)
         self.host_compiler = self._valid_compiler_or_raise()

@@ -8,7 +8,7 @@
 import sys
 from typing import Iterable, List

-import _vendoring.archspec.cpu
+import archspec.cpu

 from llnl.util import tty

@@ -51,7 +51,7 @@ def environment_root(cls) -> pathlib.Path:
     """Environment root directory"""
     bootstrap_root_path = root_path()
     python_part = spec_for_current_python().replace("@", "")
-    arch_part = _vendoring.archspec.cpu.host().family
+    arch_part = archspec.cpu.host().family
     interpreter_part = hashlib.md5(sys.exec_prefix.encode()).hexdigest()[:5]
     environment_dir = f"{python_part}-{arch_part}-{interpreter_part}"
     return pathlib.Path(

@@ -112,7 +112,7 @@ def _write_spack_yaml_file(self) -> None:
     context = {
         "python_spec": spec_for_current_python(),
         "python_prefix": sys.exec_prefix,
-        "architecture": _vendoring.archspec.cpu.host().family,
+        "architecture": archspec.cpu.host().family,
         "environment_path": self.environment_root(),
         "environment_specs": self.spack_dev_requirements(),
         "store_path": store_path(),
@@ -59,7 +59,7 @@
     overload,
 )

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.tty as tty
 from llnl.string import plural

@@ -440,12 +440,10 @@ def optimization_flags(compiler, target):
     # Try to check if the current compiler comes with a version number or
     # has an unexpected suffix. If so, treat it as a compiler with a
     # custom spec.
-    version_number, _ = _vendoring.archspec.cpu.version_components(
-        compiler.version.dotted_numeric_string
-    )
+    version_number, _ = archspec.cpu.version_components(compiler.version.dotted_numeric_string)
     try:
         result = target.optimization_flags(compiler.name, version_number)
-    except (ValueError, _vendoring.archspec.cpu.UnsupportedMicroarchitecture):
+    except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
         result = ""

     return result
@@ -23,7 +23,7 @@
 _BUILDERS: Dict[int, "Builder"] = {}


-def register_builder(build_system_name: str):
+def builder(build_system_name: str):
     """Class decorator used to register the default builder
     for a given build-system.

@@ -5,7 +5,7 @@
 import collections
 import warnings

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.tty.colify as colify
 import llnl.util.tty.color as color

@@ -92,11 +92,11 @@ def display_target_group(header, target_group):
 def arch(parser, args):
     if args.generic_target:
         # TODO: add deprecation warning in 0.24
-        print(_vendoring.archspec.cpu.host().generic)
+        print(archspec.cpu.host().generic)
         return

     if args.known_targets:
-        display_targets(_vendoring.archspec.cpu.TARGETS)
+        display_targets(archspec.cpu.TARGETS)
         return

     if args.frontend:

@@ -514,18 +514,17 @@ def extend_with_dependencies(specs):


 def concrete_specs_from_cli_or_file(args):
+    tty.msg("Concretizing input specs")
     if args.specs:
-        specs = spack.cmd.parse_specs(args.specs, concretize=False)
+        specs = spack.cmd.parse_specs(args.specs, concretize=True)
         if not specs:
             raise SpackError("unable to parse specs from command line")

     if args.file:
-        specs = specs_from_text_file(args.file, concretize=False)
+        specs = specs_from_text_file(args.file, concretize=True)
         if not specs:
             raise SpackError("unable to parse specs from file '{}'".format(args.file))
-
-    concrete_specs = spack.cmd.matching_specs_from_env(specs)
-    return concrete_specs
+    return specs


 class IncludeFilter:

@@ -608,6 +607,11 @@ def process_mirror_stats(present, mirrored, error):

 def mirror_create(args):
     """create a directory to be used as a spack mirror, and fill it with package archives"""
+    if args.specs and args.all:
+        raise SpackError(
+            "cannot specify specs on command line if you chose to mirror all specs with '--all'"
+        )
+
     if args.file and args.all:
         raise SpackError(
             "cannot specify specs with a file if you chose to mirror all specs with '--all'"
@@ -201,19 +201,7 @@ def repo_migrate(args: Any) -> int:
     repo_v2 = None
     exit_code = 0

-    if not args.fix:
-        tty.error(
-            f"No changes were made to the repository {repo.root} with namespace "
-            f"'{repo.namespace}'. Run with --fix to apply the above changes."
-        )
-
-    elif exit_code == 1:
-        tty.error(
-            f"Repository '{repo.namespace}' could not be migrated to the latest Package API. "
-            "Please check the error messages above."
-        )
-
-    elif isinstance(repo_v2, spack.repo.Repo):
+    if exit_code == 0 and isinstance(repo_v2, spack.repo.Repo):
         tty.info(
             f"Repository '{repo_v2.namespace}' was successfully migrated from "
             f"package API {repo.package_api_str} to {repo_v2.package_api_str}."

@@ -224,9 +212,15 @@ def repo_migrate(args: Any) -> int:
             f"  spack repo add {shlex.quote(repo_v2.root)}"
         )

-    else:
+    elif exit_code == 0:
         tty.info(f"Repository '{repo.namespace}' was successfully migrated")

+    elif not args.fix and exit_code == 1:
+        tty.error(
+            f"No changes were made to the repository {repo.root} with namespace "
+            f"'{repo.namespace}'. Run with --fix to apply the above changes."
+        )
+
     return exit_code
@ -10,7 +10,7 @@
|
|||||||
import warnings
|
import warnings
|
||||||
from typing import Any, Dict, List, Optional, Tuple
|
from typing import Any, Dict, List, Optional, Tuple
|
||||||
|
|
||||||
import _vendoring.archspec.cpu
|
import archspec.cpu
|
||||||
|
|
||||||
import llnl.util.filesystem as fs
|
import llnl.util.filesystem as fs
|
||||||
import llnl.util.lang
|
import llnl.util.lang
|
||||||
@ -316,7 +316,7 @@ def from_external_yaml(config: Dict[str, Any]) -> Optional[spack.spec.Spec]:
|
|||||||
@staticmethod
|
@staticmethod
|
||||||
def _finalize_external_concretization(abstract_spec):
|
def _finalize_external_concretization(abstract_spec):
|
||||||
if CompilerFactory._GENERIC_TARGET is None:
|
if CompilerFactory._GENERIC_TARGET is None:
|
||||||
CompilerFactory._GENERIC_TARGET = _vendoring.archspec.cpu.host().family
|
CompilerFactory._GENERIC_TARGET = archspec.cpu.host().family
|
||||||
|
|
||||||
if abstract_spec.architecture:
|
if abstract_spec.architecture:
|
||||||
abstract_spec.architecture.complete_with_defaults()
|
abstract_spec.architecture.complete_with_defaults()
|
||||||
|
@ -25,7 +25,7 @@
|
|||||||
import warnings
|
import warnings
|
||||||
from typing import List, Tuple
|
from typing import List, Tuple
|
||||||
|
|
||||||
import _vendoring.archspec.cpu
|
import archspec.cpu
|
||||||
|
|
||||||
import llnl.util.lang
|
import llnl.util.lang
|
||||||
import llnl.util.tty as tty
|
import llnl.util.tty as tty
|
||||||
@ -734,7 +734,7 @@ def _compatible_sys_types():
|
|||||||
"""
|
"""
|
||||||
host_platform = spack.platforms.host()
|
host_platform = spack.platforms.host()
|
||||||
host_os = str(host_platform.default_operating_system())
|
host_os = str(host_platform.default_operating_system())
|
||||||
host_target = _vendoring.archspec.cpu.host()
|
host_target = archspec.cpu.host()
|
||||||
compatible_targets = [host_target] + host_target.ancestors
|
compatible_targets = [host_target] + host_target.ancestors
|
||||||
|
|
||||||
compatible_archs = [
|
compatible_archs = [
|
||||||
@ -794,7 +794,7 @@ def shell_set(var, value):
|
|||||||
# print environment module system if available. This can be expensive
|
# print environment module system if available. This can be expensive
|
||||||
# on clusters, so skip it if not needed.
|
# on clusters, so skip it if not needed.
|
||||||
if "modules" in info:
|
if "modules" in info:
|
||||||
generic_arch = _vendoring.archspec.cpu.host().family
|
generic_arch = archspec.cpu.host().family
|
||||||
module_spec = "environment-modules target={0}".format(generic_arch)
|
module_spec = "environment-modules target={0}".format(generic_arch)
|
||||||
specs = spack.store.STORE.db.query(module_spec)
|
specs = spack.store.STORE.db.query(module_spec)
|
||||||
if specs:
|
if specs:
|
||||||
|
@ -49,7 +49,7 @@
|
|||||||
from llnl.util.symlink import symlink
|
 from llnl.util.symlink import symlink

 from spack.build_environment import MakeExecutable
-from spack.builder import BaseBuilder, Builder, register_builder
+from spack.builder import BaseBuilder
 from spack.config import determine_number_of_jobs
 from spack.deptypes import ALL_TYPES as all_deptypes
 from spack.directives import (
@@ -81,13 +81,7 @@
 )
 from spack.mixins import filter_compiler_wrappers
 from spack.multimethod import default_args, when
-from spack.package_base import (
-    PackageBase,
-    build_system_flags,
-    env_flags,
-    inject_flags,
-    on_package_attributes,
-)
+from spack.package_base import build_system_flags, env_flags, inject_flags, on_package_attributes
 from spack.package_completions import (
     bash_completion_path,
     fish_completion_path,
@@ -222,9 +216,6 @@ class tty:
     "cd",
     "pwd",
     "tty",
-    "Builder",
-    "PackageBase",
-    "register_builder",
 ]

 # These are just here for editor support; they may be set when the build env is set up.
@@ -986,9 +986,7 @@ def url_for_version(self, version):
         """
         return self._implement_all_urls_for_version(version)[0]

-    def _update_external_dependencies(
-        self, extendee_spec: Optional[spack.spec.Spec] = None
-    ) -> None:
+    def update_external_dependencies(self, extendee_spec=None):
         """
         Method to override in package classes to handle external dependencies
         """

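The `update_external_dependencies` method above is a no-op hook that individual package classes override. A minimal sketch of that pattern, with toy class names (`BasePackage`, `PythonExtension`) that are illustrative stand-ins, not Spack's real classes:

```python
from typing import List, Optional


class BasePackage:
    """Toy stand-in for Spack's package base class: the hook does nothing by default."""

    def update_external_dependencies(self, extendee_spec: Optional[str] = None) -> None:
        """Method to override in package classes to handle external dependencies."""


class PythonExtension(BasePackage):
    """A subclass overrides the hook to record dependencies discovered externally."""

    def __init__(self) -> None:
        self.dependencies: List[str] = []

    def update_external_dependencies(self, extendee_spec: Optional[str] = None) -> None:
        if extendee_spec is not None:
            self.dependencies.append(extendee_spec)


pkg = PythonExtension()
pkg.update_external_dependencies("python@3.11")  # caller passes the extendee, if any
print(pkg.dependencies)  # ['python@3.11']
```

The base-class hook taking an optional argument is what lets call sites like `package.update_external_dependencies(self._specs.get(extendee_node, None))` stay unconditional.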
@@ -4,7 +4,7 @@
 import warnings
 from typing import Optional

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.lang

@@ -38,15 +38,15 @@ def __init__(self, name):
         self.name = name
         self._init_targets()

-    def add_target(self, name: str, target: _vendoring.archspec.cpu.Microarchitecture) -> None:
+    def add_target(self, name: str, target: archspec.cpu.Microarchitecture) -> None:
         if name in Platform.reserved_targets:
             msg = f"{name} is a spack reserved alias and cannot be the name of a target"
             raise ValueError(msg)
         self.targets[name] = target

     def _init_targets(self):
-        self.default = _vendoring.archspec.cpu.host().name
-        for name, microarchitecture in _vendoring.archspec.cpu.TARGETS.items():
+        self.default = archspec.cpu.host().name
+        for name, microarchitecture in archspec.cpu.TARGETS.items():
             self.add_target(name, microarchitecture)

     def target(self, name):

@@ -3,7 +3,7 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import platform

-import _vendoring.archspec.cpu
+import archspec.cpu

 import spack.operating_systems

@@ -28,7 +28,7 @@ def __init__(self, name=None):
     def _init_targets(self):
         targets = ("aarch64", "m1") if platform.machine() == "arm64" else ("x86_64", "core2")
         for t in targets:
-            self.add_target(t, _vendoring.archspec.cpu.TARGETS[t])
+            self.add_target(t, archspec.cpu.TARGETS[t])

     @classmethod
     def detect(cls):

@@ -247,6 +247,7 @@ def migrate_v2_imports(
     "Package": "spack_repo.builtin.build_systems.generic",
     "GNUMirrorPackage": "spack_repo.builtin.build_systems.gnu",
     "GoPackage": "spack_repo.builtin.build_systems.go",
+    "IntelPackage": "spack_repo.builtin.build_systems.intel",
     "LuaPackage": "spack_repo.builtin.build_systems.lua",
     "MakefilePackage": "spack_repo.builtin.build_systems.makefile",
     "MavenPackage": "spack_repo.builtin.build_systems.maven",

@@ -34,7 +34,7 @@
 Union,
 )

-import _vendoring.archspec.cpu
+import archspec.cpu

 import llnl.util.lang
 import llnl.util.tty as tty
@@ -1617,7 +1617,7 @@ def target_ranges(self, spec, single_target_fn):
         target = spec.architecture.target

         # Check if the target is a concrete target
-        if str(target) in _vendoring.archspec.cpu.TARGETS:
+        if str(target) in archspec.cpu.TARGETS:
             return [single_target_fn(spec.name, target)]

         self.target_constraints.add(target)
@@ -2753,7 +2753,7 @@ def _supported_targets(self, compiler_name, compiler_version, targets):
                     compiler_name, compiler_version.dotted_numeric_string
                 )
                 supported.append(target)
-            except _vendoring.archspec.cpu.UnsupportedMicroarchitecture:
+            except archspec.cpu.UnsupportedMicroarchitecture:
                 continue
             except ValueError:
                 continue
@@ -2818,7 +2818,7 @@ def target_defaults(self, specs):
             if not spec.architecture or not spec.architecture.target:
                 continue

-            target = _vendoring.archspec.cpu.TARGETS.get(spec.target.name)
+            target = archspec.cpu.TARGETS.get(spec.target.name)
             if not target:
                 self.target_ranges(spec, None)
                 continue
@@ -2830,7 +2830,7 @@ def target_defaults(self, specs):
                 candidate_targets.append(ancestor)

         platform = spack.platforms.host()
-        uarch = _vendoring.archspec.cpu.TARGETS.get(platform.default)
+        uarch = archspec.cpu.TARGETS.get(platform.default)
         best_targets = {uarch.family.name}
         for compiler in self.possible_compilers:
             supported = self._supported_targets(compiler.name, compiler.version, candidate_targets)
@@ -2938,7 +2938,7 @@ def _all_targets_satisfiying(single_constraint):
         return [single_constraint]

     t_min, _, t_max = single_constraint.partition(":")
-    for test_target in _vendoring.archspec.cpu.TARGETS.values():
+    for test_target in archspec.cpu.TARGETS.values():
         # Check lower bound
         if t_min and not t_min <= test_target:
             continue
@@ -3894,7 +3894,7 @@ def external_spec_selected(self, node, idx):

         if extendee_spec:
             extendee_node = SpecBuilder.make_node(pkg=extendee_spec.name)
-            package._update_external_dependencies(self._specs.get(extendee_node))
+            package.update_external_dependencies(self._specs.get(extendee_node, None))

     def depends_on(self, parent_node, dependency_node, type):
         dependency_spec = self._specs[dependency_node]

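The `_all_targets_satisfiying` hunk above splits a target range such as `x86_64:haswell` with `str.partition(":")`, which handles open-ended bounds cleanly because missing halves come back as empty strings. A standalone sketch of that parsing step (the `parse_range` helper is ours, for illustration, not a Spack function):

```python
from typing import Optional, Tuple


def parse_range(constraint: str) -> Tuple[Optional[str], Optional[str]]:
    """Split a 'min:max' target range; either bound may be empty (open-ended)."""
    if ":" not in constraint:
        # a bare name is an exact constraint, not a range
        return (constraint, constraint)
    t_min, _, t_max = constraint.partition(":")
    return (t_min or None, t_max or None)


print(parse_range("x86_64:haswell"))  # ('x86_64', 'haswell')
print(parse_range("haswell:"))        # ('haswell', None) -- no upper bound
print(parse_range(":haswell"))        # (None, 'haswell') -- no lower bound
print(parse_range("haswell"))         # ('haswell', 'haswell') -- exact target
```

Unlike `str.split(":")`, `partition` always returns exactly three parts, so the unpacking never raises regardless of input shape.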
@@ -5,7 +5,7 @@
 import collections
 from typing import Dict, List, NamedTuple, Set, Tuple, Union

-import _vendoring.archspec.cpu
+import archspec.cpu

 from llnl.util import lang, tty

@@ -34,7 +34,7 @@ def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
         """
         raise NotImplementedError

-    def candidate_targets(self) -> List[_vendoring.archspec.cpu.Microarchitecture]:
+    def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
         """Returns a list of targets that are candidate for concretization"""
         raise NotImplementedError

@@ -70,7 +70,7 @@ def __init__(self, *, configuration: spack.config.Configuration, repo: spack.rep
         self.configuration = configuration
         self.repo = repo
         self._platform_condition = spack.spec.Spec(
-            f"platform={spack.platforms.host()} target={_vendoring.archspec.cpu.host().family}:"
+            f"platform={spack.platforms.host()} target={archspec.cpu.host().family}:"
         )

         try:
@@ -110,10 +110,10 @@ def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
         """
         return False

-    def candidate_targets(self) -> List[_vendoring.archspec.cpu.Microarchitecture]:
+    def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
         """Returns a list of targets that are candidate for concretization"""
         platform = spack.platforms.host()
-        default_target = _vendoring.archspec.cpu.TARGETS[platform.default]
+        default_target = archspec.cpu.TARGETS[platform.default]

         # Construct the list of targets which are compatible with the host
         candidate_targets = [default_target] + default_target.ancestors
@@ -125,7 +125,7 @@ def candidate_targets(self) -> List[_vendoring.archspec.cpu.Microarchitecture]:
         additional_targets_in_family = sorted(
             [
                 t
-                for t in _vendoring.archspec.cpu.TARGETS.values()
+                for t in archspec.cpu.TARGETS.values()
                 if (t.family.name == default_target.family.name and t not in candidate_targets)
             ],
             key=lambda x: len(x.ancestors),

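The `candidate_targets` hunk above builds its list as the default microarchitecture plus its ancestors, then appends other same-family targets ordered by generality (fewer ancestors first). A toy sketch of that ordering with a hand-rolled hierarchy standing in for `archspec.cpu` (the `Uarch` class and the target names' parent/child layout here are illustrative assumptions, not archspec data):

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Uarch:
    """Minimal stand-in for a microarchitecture node with a parent chain."""
    name: str
    parent: Optional["Uarch"] = None
    family_name: str = "x86_64"

    @property
    def ancestors(self) -> List["Uarch"]:
        chain, node = [], self.parent
        while node is not None:
            chain.append(node)
            node = node.parent
        return chain


# toy chain: x86_64 <- haswell <- skylake <- icelake
x86_64 = Uarch("x86_64")
haswell = Uarch("haswell", parent=x86_64)
skylake = Uarch("skylake", parent=haswell)
icelake = Uarch("icelake", parent=skylake)
TARGETS = {u.name: u for u in (x86_64, haswell, skylake, icelake)}

default = haswell
candidates = [default] + default.ancestors  # the default and everything it can run on
extra = sorted(
    [t for t in TARGETS.values()
     if t.family_name == default.family_name and t not in candidates],
    key=lambda t: len(t.ancestors),  # more generic family members come first
)
candidates += extra
print([t.name for t in candidates])  # ['haswell', 'x86_64', 'skylake', 'icelake']
```

The host-compatible prefix (`default` plus ancestors) stays ahead of newer same-family targets, which only matter when a spec explicitly requests them.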
@@ -74,9 +74,10 @@
     overload,
 )

-import _vendoring.archspec.cpu
 from _vendoring.typing_extensions import Literal

+import archspec.cpu
+
 import llnl.path
 import llnl.string
 import llnl.util.filesystem as fs
@@ -216,12 +217,10 @@ def ensure_modern_format_string(fmt: str) -> None:
         )


-def _make_microarchitecture(name: str) -> _vendoring.archspec.cpu.Microarchitecture:
-    if isinstance(name, _vendoring.archspec.cpu.Microarchitecture):
+def _make_microarchitecture(name: str) -> archspec.cpu.Microarchitecture:
+    if isinstance(name, archspec.cpu.Microarchitecture):
         return name
-    return _vendoring.archspec.cpu.TARGETS.get(
-        name, _vendoring.archspec.cpu.generic_microarchitecture(name)
-    )
+    return archspec.cpu.TARGETS.get(name, archspec.cpu.generic_microarchitecture(name))


 @lang.lazy_lexicographic_ordering
@@ -365,7 +364,7 @@ def target(self, value):
         # will assumed to be the host machine's platform.

         def target_or_none(t):
-            if isinstance(t, _vendoring.archspec.cpu.Microarchitecture):
+            if isinstance(t, archspec.cpu.Microarchitecture):
                 return t
             if t and t != "None":
                 return _make_microarchitecture(t)
@@ -962,7 +961,6 @@ def _sort_by_dep_types(dspec: DependencySpec):
     return dspec.depflag


-@lang.lazy_lexicographic_ordering
 class _EdgeMap(collections.abc.Mapping):
     """Represent a collection of edges (DependencySpec objects) in the DAG.

@@ -1000,21 +998,6 @@ def add(self, edge: DependencySpec) -> None:
     def __str__(self) -> str:
         return f"{{deps: {', '.join(str(d) for d in sorted(self.values()))}}}"

-    def _cmp_iter(self):
-        for item in sorted(itertools.chain.from_iterable(self.edges.values())):
-            yield item
-
-    def copy(self):
-        """Copies this object and returns a clone"""
-        clone = type(self)()
-        clone.store_by_child = self.store_by_child
-
-        # Copy everything from this dict into it.
-        for dspec in itertools.chain.from_iterable(self.values()):
-            clone.add(dspec.copy())
-
-        return clone
-
     def select(
         self,
         *,
@@ -3786,26 +3769,152 @@ def eq_node(self, other):
         """Equality with another spec, not including dependencies."""
         return (other is not None) and lang.lazy_eq(self._cmp_node, other._cmp_node)

-    def _cmp_iter(self):
-        """Lazily yield components of self for comparison."""
+    def _cmp_fast_eq(self, other) -> Optional[bool]:
+        """Short-circuit compare with other for equality, for lazy_lexicographic_ordering."""

-        for item in self._cmp_node():
-            yield item
-
         # If there is ever a breaking change to hash computation, whether accidental or purposeful,
         # two specs can be identical modulo DAG hash, depending on what time they were concretized
         # From the perspective of many operation in Spack (database, build cache, etc) a different
         # DAG hash means a different spec. Here we ensure that two otherwise identical specs, one
         # serialized before the hash change and one after, are considered different.
-        yield self.dag_hash() if self.concrete else None
+        if self is other:
+            return True

-        def deps():
-            for dep in sorted(itertools.chain.from_iterable(self._dependencies.values())):
-                yield dep.spec.name
-                yield dep.depflag
-                yield hash(dep.spec)
+        if self.concrete and other and other.concrete:
+            return self.dag_hash() == other.dag_hash()

-        yield deps
+        return None
+
+    def _cmp_iter(self):
+        """Lazily yield components of self for comparison."""
+
+        # Spec comparison in Spack needs to be fast, so there are several cases here for
+        # performance. The main places we care about this are:
+        #
+        # * Abstract specs: there are lots of abstract specs in package.py files,
+        #   which are put into metadata dictionaries and sorted during concretization
+        #   setup. We want comparing abstract specs to be fast.
+        #
+        # * Concrete specs: concrete specs are bigger and have lots of nodes and
+        #   edges. Because of the graph complexity, we need a full, linear time
+        #   traversal to compare them -- that's pretty much is unavoidable. But they
+        #   also have precoputed cryptographic hashes (dag_hash()), which we can use
+        #   to do fast equality comparison. See _cmp_fast_eq() above for the
+        #   short-circuit logic for hashes.
+        #
+        # A full traversal involves constructing data structurs, visitor objects, etc.,
+        # and it can be expensive if we have to do it to compare a bunch of tiny
+        # abstract specs. Therefore, there are 3 cases below, which avoid calling
+        # `spack.traverse.traverse_edges()` unless necessary.
+        #
+        # WARNING: the cases below need to be consistent, so don't mess with this code
+        # unless you really know what you're doing. Be sure to keep all three consistent.
+        #
+        # All cases lazily yield:
+        #
+        #   1. A generator over nodes
+        #   2. A generator over canonical edges
+        #
+        # Canonical edges have consistent ids defined by breadth-first traversal order. That is,
+        # the root is always 0, dependencies of the root are 1, 2, 3, etc., and so on.
+        #
+        # The three cases are:
+        #
+        #   1. Spec has no dependencies
+        #      * We can avoid any traversal logic and just yield this node's _cmp_node generator.
+        #
+        #   2. Spec has dependencies, but dependencies have no dependencies.
+        #      * We need to sort edges, but we don't need to track visited nodes, which
+        #        can save us the cost of setting up all the tracking data structures
+        #        `spack.traverse` uses.
+        #
+        #   3. Spec has dependencies that have dependencies.
+        #      * In this case, the spec is *probably* concrete. Equality comparisons
+        #        will be short-circuited by dag_hash(), but other comparisons will need
+        #        to lazily enumerate components of the spec. The traversal logic is
+        #        unavoidable.
+        #
+        # TODO: consider reworking `spack.traverse` to construct fewer data structures
+        # and objects, as this would make all traversals faster and could eliminate the
+        # need for the complexity here. It was not clear at the time of writing that how
+        # much optimization was possible in `spack.traverse`.
+
+        sorted_l1_edges = None
+        edge_list = None
+        node_ids = None
+
+        def nodes():
+            nonlocal sorted_l1_edges
+            nonlocal edge_list
+            nonlocal node_ids
+
+            # Level 0: root node
+            yield self._cmp_node  # always yield the root (this node)
+            if not self._dependencies:  # done if there are no dependencies
+                return
+
+            # Level 1: direct dependencies
+            # we can yield these in sorted order without tracking visited nodes
+            deps_have_deps = False
+            sorted_l1_edges = self.edges_to_dependencies(depflag=dt.ALL)
+            if len(sorted_l1_edges) > 1:
+                sorted_l1_edges = spack.traverse.sort_edges(sorted_l1_edges)
+
+            for edge in sorted_l1_edges:
+                yield edge.spec._cmp_node
+                if edge.spec._dependencies:
+                    deps_have_deps = True
+
+            if not deps_have_deps:  # done if level 1 specs have no dependencies
+                return
+
+            # Level 2: dependencies of direct dependencies
+            # now it's general; we need full traverse() to track visited nodes
+            l1_specs = [edge.spec for edge in sorted_l1_edges]
+
+            # the node_ids dict generates consistent ids based on BFS traversal order
+            # these are used to identify edges later
+            node_ids = collections.defaultdict(lambda: len(node_ids))
+            node_ids[id(self)]  # self is 0
+            for spec in l1_specs:
+                node_ids[id(spec)]  # l1 starts at 1
+
+            edge_list = []
+            for edge in spack.traverse.traverse_edges(
+                l1_specs, order="breadth", cover="edges", root=False, visited=set([0])
+            ):
+                # yield each node only once, and generate a consistent id for it the
+                # first time it's encountered.
+                if id(edge.spec) not in node_ids:
+                    yield edge.spec._cmp_node
+                    node_ids[id(edge.spec)]
+
+                if edge.parent is None:  # skip fake edge to root
+                    continue
+
+                edge_list.append(
+                    (
+                        node_ids[id(edge.parent)],
+                        node_ids[id(edge.spec)],
+                        edge.depflag,
+                        edge.virtuals,
+                    )
+                )
+
+        def edges():
+            # no edges in single-node graph
+            if not self._dependencies:
+                return
+
+            # level 1 edges all start with zero
+            for i, edge in enumerate(sorted_l1_edges, start=1):
+                yield (0, i, edge.depflag, edge.virtuals)
+
+            # yield remaining edges in the order they were encountered during traversal
+            if edge_list:
+                yield from edge_list
+
+        yield nodes
+        yield edges

     @property
     def namespace_if_anonymous(self):

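The new `_cmp_iter` above mints canonical node ids with `collections.defaultdict(lambda: len(node_ids))`: the first lookup of a key assigns the next sequential integer, so a breadth-first walk numbers the root 0, its children 1, 2, …, and edges can be recorded as id pairs. A self-contained illustration of that trick on a small made-up graph:

```python
import collections

# first lookup of a key mints the next sequential id; later lookups reuse it
node_ids = collections.defaultdict(lambda: len(node_ids))

# a tiny DAG: root -> a, root -> b, a -> c
graph = {"root": ["a", "b"], "a": ["c"], "b": [], "c": []}

# breadth-first order gives the root id 0, its children 1 and 2, and so on
queue = collections.deque(["root"])
edges = []
while queue:
    node = queue.popleft()
    for child in graph[node]:
        first_visit = child not in node_ids
        edges.append((node_ids[node], node_ids[child]))  # lookups mint ids as needed
        if first_visit:
            queue.append(child)

print(dict(node_ids))  # {'root': 0, 'a': 1, 'b': 2, 'c': 3}
print(edges)           # [(0, 1), (0, 2), (1, 3)]
```

Because the ids depend only on traversal order, not on memory addresses, two isomorphic graphs produce the same edge list, which is what makes the edges comparable across distinct `Spec` objects.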
@@ -3,9 +3,10 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import platform

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack.concretize
 import spack.operating_systems
 import spack.platforms
@@ -124,8 +125,7 @@ test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
 )
 @pytest.mark.usefixtures("mock_packages", "config")
 @pytest.mark.skipif(
-    str(_vendoring.archspec.cpu.host().family) != "x86_64",
-    reason="tests are for x86_64 uarch ranges",
+    str(archspec.cpu.host().family) != "x86_64", reason="tests are for x86_64 uarch ranges"
 )
 def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
     spec = spack.concretize.concretize_one(

@@ -8,9 +8,10 @@
 import sys
 from typing import Dict, Optional, Tuple

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 from llnl.path import Path, convert_to_platform_path
 from llnl.util.filesystem import HeaderList, LibraryList

@@ -726,15 +727,14 @@ def test_rpath_with_duplicate_link_deps():
 @pytest.mark.filterwarnings("ignore:microarchitecture specific")
 @pytest.mark.not_on_windows("Windows doesn't support the compiler wrapper")
 def test_optimization_flags(compiler_spec, target_name, expected_flags, compiler_factory):
-    target = _vendoring.archspec.cpu.TARGETS[target_name]
+    target = archspec.cpu.TARGETS[target_name]
     compiler = spack.spec.parse_with_version_concrete(compiler_spec)
     opt_flags = spack.build_environment.optimization_flags(compiler, target)
     assert opt_flags == expected_flags


 @pytest.mark.skipif(
-    str(_vendoring.archspec.cpu.host().family) != "x86_64",
-    reason="tests check specific x86_64 uarch flags",
+    str(archspec.cpu.host().family) != "x86_64", reason="tests check specific x86_64 uarch flags"
 )
 @pytest.mark.not_on_windows("Windows doesn't support the compiler wrapper")
 def test_optimization_flags_are_using_node_target(default_mock_concretization, monkeypatch):

|
|||||||
import glob
|
import glob
|
||||||
import os
|
import os
|
||||||
|
|
||||||
import _vendoring.archspec.cpu
|
|
||||||
import py.path
|
import py.path
|
||||||
import pytest
|
import pytest
|
||||||
|
|
||||||
|
import archspec.cpu
|
||||||
|
|
||||||
import llnl.util.filesystem as fs
|
import llnl.util.filesystem as fs
|
||||||
|
|
||||||
import spack
|
import spack
|
||||||
@ -216,8 +217,7 @@ def test_autotools_gnuconfig_replacement_disabled(self, mutable_database):
|
|||||||
|
|
||||||
@pytest.mark.disable_clean_stage_check
|
@pytest.mark.disable_clean_stage_check
|
||||||
@pytest.mark.skipif(
|
@pytest.mark.skipif(
|
||||||
str(_vendoring.archspec.cpu.host().family) != "x86_64",
|
str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
|
||||||
reason="test data is specific for x86_64",
|
|
||||||
)
|
)
|
||||||
def test_autotools_gnuconfig_replacement_no_gnuconfig(self, mutable_database, monkeypatch):
|
def test_autotools_gnuconfig_replacement_no_gnuconfig(self, mutable_database, monkeypatch):
|
||||||
"""
|
"""
|
||||||
|
@@ -61,26 +61,6 @@ def test_mirror_from_env(mutable_mock_env_path, tmp_path, mock_packages, mock_fe
     assert mirror_res == expected


-# Test for command line-specified spec in concretized environment
-def test_mirror_spec_from_env(mutable_mock_env_path, tmp_path, mock_packages, mock_fetch):
-    mirror_dir = str(tmp_path / "mirror-B")
-    env_name = "test"
-
-    env("create", env_name)
-    with ev.read(env_name):
-        add("simple-standalone-test@0.9")
-        concretize()
-        with spack.config.override("config:checksum", False):
-            mirror("create", "-d", mirror_dir, "simple-standalone-test")
-
-    e = ev.read(env_name)
-    assert set(os.listdir(mirror_dir)) == set([s.name for s in e.user_specs])
-    spec = e.concrete_roots()[0]
-    mirror_res = os.listdir(os.path.join(mirror_dir, spec.name))
-    expected = ["%s.tar.gz" % spec.format("{name}-{version}")]
-    assert mirror_res == expected
-
-
 @pytest.fixture
 def source_for_pkg_with_hash(mock_packages, tmpdir):
     s = spack.concretize.concretize_one("trivial-pkg-with-valid-hash")
@@ -421,7 +401,8 @@ def test_all_specs_with_all_versions_dont_concretize(self):
     @pytest.mark.parametrize(
         "cli_args,error_str",
         [
-            # Passed more than one among -f --all
+            # Passed more than one among -f --all and specs
+            ({"specs": "hdf5", "file": None, "all": True}, "cannot specify specs on command line"),
             (
                 {"specs": None, "file": "input.txt", "all": True},
                 "cannot specify specs with a file if",

@@ -4,9 +4,10 @@

 import os

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack.concretize
 import spack.config
 import spack.paths
@@ -85,8 +86,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
             {"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
             1,
             marks=pytest.mark.skipif(
-                str(_vendoring.archspec.cpu.host().family) != "x86_64",
-                reason="test data is x86_64 specific",
+                str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
             ),
         ),
         pytest.param(
@@ -98,8 +98,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
             },
             2,
             marks=pytest.mark.skipif(
-                str(_vendoring.archspec.cpu.host().family) != "x86_64",
-                reason="test data is x86_64 specific",
+                str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
             ),
         ),
     ],

@ -6,10 +6,11 @@
|
|||||||
import platform
|
import platform
|
||||||
import sys
|
import sys
|
||||||
|
|
||||||
import _vendoring.archspec.cpu
|
|
||||||
import _vendoring.jinja2
|
import _vendoring.jinja2
|
||||||
import pytest
|
import pytest
|
||||||
|
|
||||||
|
import archspec.cpu
|
||||||
|
|
||||||
import llnl.util.lang
|
import llnl.util.lang
|
||||||
|
|
||||||
import spack.binary_distribution
|
import spack.binary_distribution
|
||||||
@ -145,12 +146,12 @@ def current_host(request, monkeypatch):
|
|||||||
|
|
||||||
monkeypatch.setattr(spack.platforms.Test, "default", cpu)
|
monkeypatch.setattr(spack.platforms.Test, "default", cpu)
|
||||||
if not is_preference:
|
if not is_preference:
|
||||||
target = _vendoring.archspec.cpu.TARGETS[cpu]
|
target = archspec.cpu.TARGETS[cpu]
|
||||||
monkeypatch.setattr(_vendoring.archspec.cpu, "host", lambda: target)
|
monkeypatch.setattr(archspec.cpu, "host", lambda: target)
|
||||||
yield target
|
yield target
|
||||||
else:
|
else:
|
||||||
target = _vendoring.archspec.cpu.TARGETS["sapphirerapids"]
|
target = archspec.cpu.TARGETS["sapphirerapids"]
|
||||||
monkeypatch.setattr(_vendoring.archspec.cpu, "host", lambda: target)
|
monkeypatch.setattr(archspec.cpu, "host", lambda: target)
|
||||||
with spack.config.override("packages:all", {"target": [cpu]}):
|
with spack.config.override("packages:all", {"target": [cpu]}):
|
||||||
yield target
|
yield target
|
||||||
|
|
||||||
@ -382,7 +383,7 @@ def test_different_compilers_get_different_flags(
|
|||||||
"gcc": {"externals": [gcc11_with_flags]},
|
"gcc": {"externals": [gcc11_with_flags]},
|
||||||
},
|
},
|
||||||
)
|
)
|
||||||
t = _vendoring.archspec.cpu.host().family
|
t = archspec.cpu.host().family
|
||||||
client = spack.concretize.concretize_one(
|
client = spack.concretize.concretize_one(
|
||||||
Spec(
|
Spec(
|
||||||
f"cmake-client platform=test os=redhat6 target={t} %gcc@11.1.0"
|
f"cmake-client platform=test os=redhat6 target={t} %gcc@11.1.0"
|
||||||
@@ -975,7 +976,7 @@ def test_noversion_pkg(self, spec):
     def test_adjusting_default_target_based_on_compiler(
         self, spec, compiler_spec, best_achievable, current_host, compiler_factory, mutable_config
     ):
-        best_achievable = _vendoring.archspec.cpu.TARGETS[best_achievable]
+        best_achievable = archspec.cpu.TARGETS[best_achievable]
         expected = best_achievable if best_achievable < current_host else current_host
         mutable_config.set(
             "packages", {"gcc": {"externals": [compiler_factory(spec=f"{compiler_spec}")]}}
@@ -1644,7 +1645,7 @@ def test_target_granularity(self):
         # The test architecture uses core2 as the default target. Check that when
         # we configure Spack for "generic" granularity we concretize for x86_64
         default_target = spack.platforms.test.Test.default
-        generic_target = _vendoring.archspec.cpu.TARGETS[default_target].generic.name
+        generic_target = archspec.cpu.TARGETS[default_target].generic.name
         s = Spec("python")
         assert spack.concretize.concretize_one(s).satisfies("target=%s" % default_target)
         with spack.config.override("concretizer:targets", {"granularity": "generic"}):
@@ -2010,7 +2011,7 @@ def test_installed_specs_disregard_conflicts(self, mutable_database, monkeypatch
     def test_require_targets_are_allowed(self, mutable_config, mutable_database):
         """Test that users can set target constraints under the require attribute."""
         # Configuration to be added to packages.yaml
-        required_target = _vendoring.archspec.cpu.TARGETS[spack.platforms.test.Test.default].family
+        required_target = archspec.cpu.TARGETS[spack.platforms.test.Test.default].family
         external_conf = {"all": {"require": f"target={required_target}"}}
         mutable_config.set("packages", external_conf)

@@ -20,12 +20,13 @@
 import tempfile
 import xml.etree.ElementTree

-import _vendoring.archspec.cpu
-import _vendoring.archspec.cpu.microarchitecture
-import _vendoring.archspec.cpu.schema
 import py
 import pytest

+import archspec.cpu
+import archspec.cpu.microarchitecture
+import archspec.cpu.schema
+
 import llnl.util.lang
 import llnl.util.lock
 import llnl.util.tty as tty
@@ -371,12 +372,12 @@ def clean_test_environment():
 def _host():
     """Mock archspec host so there is no inconsistency on the Windows platform
     This function cannot be local as it needs to be pickleable"""
-    return _vendoring.archspec.cpu.Microarchitecture("x86_64", [], "generic", [], {}, 0)
+    return archspec.cpu.Microarchitecture("x86_64", [], "generic", [], {}, 0)


 @pytest.fixture(scope="function")
 def archspec_host_is_spack_test_host(monkeypatch):
-    monkeypatch.setattr(_vendoring.archspec.cpu, "host", _host)
+    monkeypatch.setattr(archspec.cpu, "host", _host)


 # Hooks to add command line options or set other custom behaviors.
@@ -727,14 +728,14 @@ def mock_uarch_json(tmpdir_factory):

 @pytest.fixture(scope="session")
 def mock_uarch_configuration(mock_uarch_json):
-    """Create mock dictionaries for the _vendoring.archspec.cpu."""
+    """Create mock dictionaries for the archspec.cpu."""

     def load_json():
         with open(mock_uarch_json, encoding="utf-8") as f:
             return json.load(f)

     targets_json = load_json()
-    targets = _vendoring.archspec.cpu.microarchitecture._known_microarchitectures()
+    targets = archspec.cpu.microarchitecture._known_microarchitectures()

     yield targets_json, targets
@@ -743,8 +744,8 @@ def load_json():
 def mock_targets(mock_uarch_configuration, monkeypatch):
     """Use this fixture to enable mock uarch targets for testing."""
     targets_json, targets = mock_uarch_configuration
-    monkeypatch.setattr(_vendoring.archspec.cpu.schema, "TARGETS_JSON", targets_json)
-    monkeypatch.setattr(_vendoring.archspec.cpu.microarchitecture, "TARGETS", targets)
+    monkeypatch.setattr(archspec.cpu.schema, "TARGETS_JSON", targets_json)
+    monkeypatch.setattr(archspec.cpu.microarchitecture, "TARGETS", targets)


 @pytest.fixture(scope="session")
@@ -772,7 +773,7 @@ def configuration_dir(tmpdir_factory, linux_os):
     config_template = test_config / "config.yaml"
     config.write(config_template.read_text().format(install_tree_root, locks))

-    target = str(_vendoring.archspec.cpu.host().family)
+    target = str(archspec.cpu.host().family)
     compilers = tmpdir.join("site", "packages.yaml")
     compilers_template = test_config / "packages.yaml"
     compilers.write(compilers_template.read_text().format(linux_os=linux_os, target=target))
@@ -2116,7 +2117,7 @@ def _factory(*, spec):
 @pytest.fixture()
 def host_architecture_str():
     """Returns the broad architecture family (x86_64, aarch64, etc.)"""
-    return str(_vendoring.archspec.cpu.host().family)
+    return str(archspec.cpu.host().family)


 def _true(x):
@@ -11,9 +11,10 @@
 import json
 import os

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack
 import spack.cmd
 import spack.cmd.external
@@ -103,7 +104,7 @@ def spec_json(self):

 @pytest.fixture
 def _common_arch(test_platform):
-    generic = _vendoring.archspec.cpu.TARGETS[test_platform.default].family
+    generic = archspec.cpu.TARGETS[test_platform.default].family
     return JsonArchEntry(platform=test_platform.name, os="redhat6", target=generic.name)

@@ -4,9 +4,10 @@

 import os

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack.concretize
 import spack.config
 import spack.environment as ev
@@ -220,8 +221,7 @@ def test_setenv_raw_value(self, modulefile_content, module_configuration):
         assert len([x for x in content if 'setenv("FOO", "{{name}}, {name}, {{}}, {}")' in x]) == 1

     @pytest.mark.skipif(
-        str(_vendoring.archspec.cpu.host().family) != "x86_64",
-        reason="test data is specific for x86_64",
+        str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
     )
     def test_help_message(self, modulefile_content, module_configuration):
         """Tests the generation of module help message."""
@@ -4,9 +4,10 @@

 import os

-import _vendoring.archspec.cpu
 import pytest

+import archspec.cpu
+
 import spack.concretize
 import spack.modules.common
 import spack.modules.tcl
@@ -185,8 +186,7 @@ def test_setenv_raw_value(self, modulefile_content, module_configuration):
         assert len([x for x in content if "setenv FOO {{{name}}, {name}, {{}}, {}}" in x]) == 1

     @pytest.mark.skipif(
-        str(_vendoring.archspec.cpu.host().family) != "x86_64",
-        reason="test data is specific for x86_64",
+        str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
     )
     def test_help_message(self, modulefile_content, module_configuration):
         """Tests the generation of module help message."""
@@ -6,6 +6,8 @@

 import pytest

+import llnl.util.lang
+
 import spack.concretize
 import spack.deptypes as dt
 import spack.directives
@@ -2013,6 +2015,137 @@ def test_comparison_multivalued_variants():
     assert Spec("x=a") < Spec("x=a,b") < Spec("x==a,b") < Spec("x==a,b,c")


+@pytest.mark.parametrize(
+    "specs_in_expected_order",
+    [
+        ("a", "b", "c", "d", "e"),
+        ("a@1.0", "a@2.0", "b", "c@3.0", "c@4.0"),
+        ("a^d", "b^c", "c^b", "d^a"),
+        ("e^a", "e^b", "e^c", "e^d"),
+        ("e^a@1.0", "e^a@2.0", "e^a@3.0", "e^a@4.0"),
+        ("e^a@1.0 +a", "e^a@1.0 +b", "e^a@1.0 +c", "e^a@1.0 +c"),
+        ("a^b%c", "a^b%d", "a^b%e", "a^b%f"),
+        ("a^b%c@1.0", "a^b%c@2.0", "a^b%c@3.0", "a^b%c@4.0"),
+        ("a^b%c@1.0 +a", "a^b%c@1.0 +b", "a^b%c@1.0 +c", "a^b%c@1.0 +d"),
+        ("a cflags=-O1", "a cflags=-O2", "a cflags=-O3"),
+        ("a %cmake@1.0 ^b %cmake@2.0", "a %cmake@2.0 ^b %cmake@1.0"),
+        ("a^b^c^d", "a^b^c^e", "a^b^c^f"),
+        ("a^b^c^d", "a^b^c^e", "a^b^c^e", "a^b^c^f"),
+        ("a%b%c%d", "a%b%c%e", "a%b%c%e", "a%b%c%f"),
+        ("d.a", "c.b", "b.c", "a.d"),  # names before namespaces
+    ],
+)
+def test_spec_ordering(specs_in_expected_order):
+    specs_in_expected_order = [Spec(s) for s in specs_in_expected_order]
+    assert sorted(specs_in_expected_order) == specs_in_expected_order
+    assert sorted(reversed(specs_in_expected_order)) == specs_in_expected_order
+
+    for i in range(len(specs_in_expected_order) - 1):
+        lhs, rhs = specs_in_expected_order[i : i + 2]
+        assert lhs <= rhs
+        assert (lhs < rhs and lhs != rhs) or lhs == rhs
+        assert rhs >= lhs
+        assert (rhs > lhs and rhs != lhs) or rhs == lhs
+
+
+EMPTY_VER = vn.VersionList(":")
+EMPTY_VAR = Spec().variants
+EMPTY_FLG = Spec().compiler_flags
+
+
+@pytest.mark.parametrize(
+    "spec,expected_tuplified",
+    [
+        # simple, no dependencies
+        [("a"), ((("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),), ())],
+        # with some node attributes
+        [
+            ("a@1.0 +foo cflags='-O3 -g'"),
+            (
+                (
+                    (
+                        "a",
+                        None,
+                        vn.VersionList(["1.0"]),
+                        Spec("+foo").variants,
+                        Spec("cflags='-O3 -g'").compiler_flags,
+                        None,
+                        None,
+                        None,
+                    ),
+                ),
+                (),
+            ),
+        ],
+        # single edge case
+        [
+            ("a^b"),
+            (
+                (
+                    ("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("b", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                ),
+                ((0, 1, 0, ()),),
+            ),
+        ],
+        # root with multiple deps
+        [
+            ("a^b^c^d"),
+            (
+                (
+                    ("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("b", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("c", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("d", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                ),
+                ((0, 1, 0, ()), (0, 2, 0, ()), (0, 3, 0, ())),
+            ),
+        ],
+        # root with multiple build deps
+        [
+            ("a%b%c%d"),
+            (
+                (
+                    ("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("b", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("c", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("d", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                ),
+                ((0, 1, dt.BUILD, ()), (0, 2, dt.BUILD, ()), (0, 3, dt.BUILD, ())),
+            ),
+        ],
+        # dependencies with dependencies
+        [
+            ("a ^b %c %d ^e %f %g"),
+            (
+                (
+                    ("a", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("b", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("e", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("c", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("d", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("f", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                    ("g", None, EMPTY_VER, EMPTY_VAR, EMPTY_FLG, None, None, None),
+                ),
+                (
+                    (0, 1, 0, ()),
+                    (0, 2, 0, ()),
+                    (1, 3, dt.BUILD, ()),
+                    (1, 4, dt.BUILD, ()),
+                    (2, 5, dt.BUILD, ()),
+                    (2, 6, dt.BUILD, ()),
+                ),
+            ),
+        ],
+    ],
+)
+def test_spec_canonical_comparison_form(spec, expected_tuplified):
+    print()
+    print()
+    print()
+    assert llnl.util.lang.tuplify(Spec(spec)._cmp_iter) == expected_tuplified
+
+
 def test_comparison_after_breaking_hash_change():
     # We simulate a breaking change in DAG hash computation in Spack. We have two specs that are
     # entirely equal modulo DAG hash. When deserializing these specs, we don't want them to compare
@@ -208,7 +208,7 @@ def variant_type(self) -> VariantType:
         else:
             return VariantType.SINGLE

-    def __str__(self) -> str:
+    def __str__(self):
         return (
             f"Variant('{self.name}', "
             f"default='{self.default}', "
@@ -491,14 +491,14 @@ class DisjointSetsOfValues(collections.abc.Sequence):
         *sets (list): mutually exclusive sets of values
     """

-    _empty_set = {"none"}
+    _empty_set = set(("none",))

-    def __init__(self, *sets: Tuple[str, ...]) -> None:
+    def __init__(self, *sets):
         self.sets = [set(_flatten(x)) for x in sets]

         # 'none' is a special value and can appear only in a set of
         # a single element
-        if any("none" in s and s != {"none"} for s in self.sets):
+        if any("none" in s and s != set(("none",)) for s in self.sets):
             raise spack.error.SpecError(
                 "The value 'none' represents the empty set,"
                 " and must appear alone in a set. Use the "
@@ -75,6 +75,7 @@ section-order = [
     "future",
     "standard-library",
     "third-party",
+    "archspec",
     "llnl",
     "spack",
     "first-party",
@@ -83,6 +84,7 @@ section-order = [

 [tool.ruff.lint.isort.sections]
 spack = ["spack"]
+archspec = ["archspec"]
 llnl = ["llnl"]

 [tool.ruff.lint.per-file-ignores]
@@ -102,11 +104,13 @@ sections = [
     "FUTURE",
     "STDLIB",
     "THIRDPARTY",
+    "ARCHSPEC",
     "LLNL",
     "FIRSTPARTY",
     "LOCALFOLDER",
 ]
 known_first_party = "spack"
+known_archspec = "archspec"
 known_llnl = "llnl"
 known_third_party = ["ruamel", "six"]
 src_paths = "lib"
@@ -260,9 +264,6 @@ substitute = [
     { match = "from attr", replace = "from _vendoring.attr" },
     { match = "import jsonschema", replace = "import _vendoring.jsonschema" },
     { match = "from jsonschema", replace = "from _vendoring.jsonschema" },
-    { match = "archspec.cpu", replace = "_vendoring.archspec.cpu" },
-    { match = "archspec.__version__", replace = "_vendoring.archspec.__version__" },
-    { match = "import archspec", replace = "import _vendoring.archspec" },
 ]
 drop = [
     # contains unnecessary scripts
@@ -284,21 +285,11 @@ drop = [
     "pvectorc.*.so",
     # Trim jsonschema tests
     "jsonschema/tests",
-    "archspec/json/tests",
-    "archspec/vendor/cpuid/.gitignore",
-    "pyrsistent/__init__.pyi",
 ]

 [tool.vendoring.typing-stubs]
-_pyrsistent_version = []
-altgraph = []
-archspec = []
+six = ["six.__init__", "six.moves.__init__", "six.moves.configparser"]
 distro = []
-jsonschema = []
-macholib = []
-pyrsistent = []
-ruamel = []
-six = []

 [tool.vendoring.license.directories]
 setuptools = "pkg_resources"
@@ -69,7 +69,7 @@ spack:
     - alpgen
     - ampt
     - apfel +lhapdf +python
-    - celeritas ~cuda +openmp ~rocm +vecgeom +covfie
+    - celeritas ~cuda +openmp ~rocm +vecgeom
     - cepgen
     - cernlib +shared
     - collier
@@ -130,7 +130,7 @@ spack:

     # CUDA
     #- acts +cuda +traccc cuda_arch=80
-    #- celeritas +cuda ~openmp +vecgeom cuda_arch=80 +covfie
+    #- celeritas +cuda ~openmp +vecgeom cuda_arch=80
     - root +cuda +cudnn +tmva-gpu
     - vecgeom +cuda cuda_arch=80

@@ -0,0 +1,660 @@
====================================
 Development Notes on Intel Packages
====================================

These are notes for concepts and development of
lib/spack/spack/build_systems/intel.py .

For documentation on how to *use* ``IntelPackage``, see
lib/spack/docs/build_systems/intelpackage.rst .

-------------------------------------------------------------------------------
Installation and path handling as implemented in ./intel.py
-------------------------------------------------------------------------------


***************************************************************************
Prefix differences between Spack-external and Spack-internal installations
***************************************************************************


Problem summary
~~~~~~~~~~~~~~~~

For Intel packages that were installed external to Spack, ``self.prefix`` will
be a *component-specific* path (e.g. to an MKL-specific dir hierarchy), whereas
for a package installed by Spack itself, ``self.prefix`` will be a
*vendor-level* path that holds one or more components (or parts thereof), and
must be further qualified down to a particular desired component.

It is possible that a similar conceptual difference is inherent to other
package families that use a common vendor-style installer.


Description
~~~~~~~~~~~~

Spack makes packages available through two routes, let's call them A and B:

A. Packages pre-installed external to Spack and configured *for* Spack
B. Packages built and installed *by* Spack.

For a user who is interested in building end-user applications, it should not
matter through which route any of its dependent packages has been installed.
Most packages natively support a ``prefix`` concept which unifies the two
routes just fine.

Intel packages, however, are more complicated because they consist of a number
of components that are released as a suite of varying extent, like "Intel
Parallel Studio *Foo* Edition", or subsetted into products like "MKL" or "MPI",
each of which also contains libraries from other components like the compiler
runtime and multithreading libraries. For this reason, an Intel package is
"anchored" during installation at a directory level higher than just the
user-facing directory that has the conventional hierarchy of ``bin``, ``lib``,
and others relevant for the end-product.

As a result, internal to Spack, there is a conceptual difference in what
``self.prefix`` represents for the two routes.

For route A, consider MKL installed outside of Spack. It will likely be one
product component among other products, at one particular release among others
that are installed in sibling or cousin directories on the local system.
Therefore, the path given to Spack in ``packages.yaml`` should be a
*product-specific and fully version-specific* directory. E.g., for an
``intel-mkl`` package, ``self.prefix`` should look like::

    /opt/intel/compilers_and_libraries_2018.1.163/linux/mkl

In this route, the interaction point with the user is encapsulated in an
environment variable which will be (in pseudo-code)::

    MKLROOT := {self.prefix}

For route B, a Spack-based installation of MKL will be placed in the directory
given to the ``./install.sh`` script of Intel's package distribution. This
directory is taken to be the *vendor*-specific anchor directory, playing the
same role as the default ``/opt/intel``. In this case, ``self.prefix`` will
be::

    $SPACK_ROOT/opt/spack/linux-centos6-x86_64/gcc-4.9.3/intel-mkl-2018.1.163-<HASH>

However, now the environment variable will have to be constructed as *several
directory levels down*::

    MKLROOT := {self.prefix}/compilers_and_libraries_2018.1.163/linux/mkl

A recent post on the Spack mailing list illustrates the confusion when route A
was taken while route B was the only one that was coded in Spack:
https://groups.google.com/d/msg/spack/x28qlmqPAys/Ewx6220uAgAJ


Solution
~~~~~~~~~

Introduce a series of functions which will return the appropriate
directories, regardless of whether the Intel package has been installed
external or internal to Spack:

==========================  ==================================================
Function                    Example return values
--------------------------  --------------------------------------------------
normalize_suite_dir()       Spack-external installation:
                            /opt/intel/compilers_and_libraries_2018.1.163
                            Spack-internal installation:
                            $SPACK_ROOT/...<HASH>/compilers_and_libraries_2018.1.163
--------------------------  --------------------------------------------------
normalize_path('mkl')       <suite_dir>/linux/mkl
component_bin_dir()         <suite_dir>/linux/mkl/bin
component_lib_dir()         <suite_dir>/linux/mkl/lib/intel64
--------------------------  --------------------------------------------------
normalize_path('mpi')       <suite_dir>/linux/mpi
component_bin_dir('mpi')    <suite_dir>/linux/mpi/intel64/bin
component_lib_dir('mpi')    <suite_dir>/linux/mpi/intel64/lib
==========================  ==================================================
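The two-route normalization described above can be illustrated with a small, self-contained sketch. This is hypothetical code for illustration only, not the actual ``normalize_suite_dir()`` implemented in ``intel.py``; the function name and the ``suite_pattern`` default are assumptions:

```python
import os

def normalize_suite_dir_sketch(prefix, suite_pattern="compilers_and_libraries"):
    """Resolve a prefix to the versioned suite directory, whether the prefix
    is a Spack-internal vendor-level dir (route B) or a Spack-external
    component-specific dir (route A)."""
    # Route B: the versioned suite dir is a child of the vendor-level prefix.
    if os.path.isdir(prefix):
        for entry in sorted(os.listdir(prefix)):
            if entry.startswith(suite_pattern) and "." in entry:
                return os.path.join(prefix, entry)
    # Route A: the prefix points somewhere below the suite dir; walk upward
    # until a path component matches the suite pattern.
    d = prefix
    while d != os.path.dirname(d):
        if os.path.basename(d).startswith(suite_pattern):
            return d
        d = os.path.dirname(d)
    # Fall back to the prefix itself if no suite dir can be identified.
    return prefix
```

Both an external component prefix such as ``.../compilers_and_libraries_2018.1.163/linux/mkl`` and an internal vendor-level prefix then resolve to the same suite directory, from which the component paths in the table can be derived.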
|
*********************************
|
||||||
|
Analysis of directory layouts
|
||||||
|
*********************************
|
||||||
|
|
||||||
|
Let's look at some sample directory layouts, using ``ls -lF``,
|
||||||
|
but focusing on names and symlinks only.
|
||||||
|
|
||||||
|
Spack-born installation of ``intel-mkl@2018.1.163``
|
||||||
|
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||||
|
|
||||||
|
::
|
||||||
|
|
||||||
|
$ ls -l <prefix>
|
||||||
|
|
||||||
|
bin/
|
||||||
|
- compilervars.*sh (symlinked) ONLY
|
||||||
|
|
||||||
|
compilers_and_libraries -> compilers_and_libraries_2018
|
||||||
|
- generically-named entry point, stable across versions (one hopes)
|
||||||
|
|
||||||
|
compilers_and_libraries_2018/
|
||||||
|
- vaguely-versioned dirname, holding a stub hierarchy --ignorable
|
||||||
|
|
||||||
|
$ ls -l compilers_and_libraries_2018/linux/
|
||||||
|
bin - actual compilervars.*sh (reg. files) ONLY
|
||||||
|
documentation -> ../../documentation_2018/
|
||||||
|
lib -> ../../compilers_and_libraries_2018.1.163/linux/compiler/lib/
|
||||||
|
mkl -> ../../compilers_and_libraries_2018.1.163/linux/mkl/
|
||||||
|
pkg_bin -> ../../compilers_and_libraries_2018.1.163/linux/bin/
|
||||||
|
samples -> ../../samples_2018/
|
||||||
|
tbb -> ../../compilers_and_libraries_2018.1.163/linux/tbb/
|
||||||
|
|
||||||
|
compilers_and_libraries_2018.1.163/
|
||||||
|
- Main "product" + a minimal set of libs from related products
|
||||||
|
|
||||||
|
$ ls -l compilers_and_libraries_2018.1.163/linux/
|
||||||
|
bin/ - compilervars.*sh, link_install*sh ONLY
|
||||||
|
mkl/ - Main Product ==> to be assigned to MKLROOT
|
||||||
|
compiler/ - lib/intel64_lin/libiomp5* ONLY
|
||||||
|
tbb/ - tbb/lib/intel64_lin/gcc4.[147]/libtbb*.so* ONLY
|
||||||
|
|
||||||
|
parallel_studio_xe_2018 -> parallel_studio_xe_2018.1.038/
|
||||||
|
parallel_studio_xe_2018.1.038/
|
||||||
|
- Alternate product packaging - ignorable
|
||||||
|
|
||||||
|
$ ls -l parallel_studio_xe_2018.1.038/
|
||||||
|
bin/ - actual psxevars.*sh (reg. files)
|
||||||
|
compilers_and_libraries_2018 -> <full_path>/comp...aries_2018.1.163
|
||||||
|
documentation_2018 -> <full_path_prefix>/documentation_2018
|
||||||
|
samples_2018 -> <full_path_prefix>/samples_2018
|
||||||
|
...
|
||||||
|
|
||||||
|
documentation_2018/
|
||||||
|
samples_2018/
|
||||||
|
lib -> compilers_and_libraries/linux/lib/
|
||||||
|
mkl -> compilers_and_libraries/linux/mkl/
|
||||||
|
tbb -> compilers_and_libraries/linux/tbb/
|
||||||
|
- auxiliaries and convenience links
|
||||||
|
|
||||||
|
Spack-external installation of Intel-MPI 2018
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

For MPI, the layout is slightly different from MKL's. The prefix will have to
include an architecture directory (typically ``intel64``), which then contains
bin/, lib/, ..., all without further architecture branching. The environment
variable ``I_MPI_ROOT`` from the API documentation, however, must be the
package's top directory, not including the architecture.

FIXME: For MANPATH, we need the parent dir.

::

    $ ls -lF /opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/
    bin64 -> intel64/bin/
    etc64 -> intel64/etc/
    include64 -> intel64/include/
    lib64 -> intel64/lib/

    benchmarks/
    binding/
    intel64/
    man/
    test/

The package contains an MPI-2019 preview; curiously, its release notes contain
the tag "File structure clean-up." I could not find further documentation on
this, however, so it is unclear what, if any, changes will make it to release.

https://software.intel.com/en-us/articles/restoring-legacy-path-structure-on-intel-mpi-library-2019

::

    $ ls -lF /opt/intel/compilers_and_libraries_2018.1.163/linux/mpi_2019/
    binding/
    doc/
    imb/
    intel64/
    man/
    test/
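To make the relationship between the prefix, the architecture directory, and
``I_MPI_ROOT`` concrete, here is a small sketch in plain Python. The helper
name ``mpi_env_from_prefix`` and its return shape are made up for
illustration; this is not Spack's actual API, just the layout rules described
above expressed as code:

```python
import posixpath


def mpi_env_from_prefix(prefix: str, arch: str = "intel64") -> dict:
    """Sketch: derive environment entries for an externally installed
    Intel-MPI, assuming ``prefix`` is the directory that *contains* the
    architecture dir (e.g. .../linux/mpi)."""
    arch_dir = posixpath.join(prefix, arch)
    return {
        # I_MPI_ROOT must be the package's top directory, NOT the arch dir:
        "I_MPI_ROOT": prefix,
        # bin/ and lib/ live under the arch dir, without further branching:
        "PATH": posixpath.join(arch_dir, "bin"),
        "LD_LIBRARY_PATH": posixpath.join(arch_dir, "lib"),
        # man/ sits next to the arch dir, so MANPATH uses the top dir:
        "MANPATH": posixpath.join(prefix, "man"),
    }


env = mpi_env_from_prefix("/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi")
```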
Spack-external installation of Intel Parallel Studio 2018
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

This is the main product bundle that I actually downloaded and installed on my
system. Its nominal installation directory mostly holds merely symlinks
to components installed in sibling dirs::

    $ ls -lF /opt/intel/parallel_studio_xe_2018.1.038/
    advisor_2018 -> /opt/intel/advisor_2018/
    clck_2018 -> /opt/intel/clck/2018.1/
    compilers_and_libraries_2018 -> /opt/intel/comp....aries_2018.1.163/
    documentation_2018 -> /opt/intel/documentation_2018/
    ide_support_2018 -> /opt/intel/ide_support_2018/
    inspector_2018 -> /opt/intel/inspector_2018/
    itac_2018 -> /opt/intel/itac/2018.1.017/
    man -> /opt/intel/man/
    samples_2018 -> /opt/intel/samples_2018/
    vtune_amplifier_2018 -> /opt/intel/vtune_amplifier_2018/

    psxevars.csh -> ./bin/psxevars.csh*
    psxevars.sh -> ./bin/psxevars.sh*
    bin/         - *vars.*sh scripts + sshconnectivity.exp  ONLY

    licensing/
    uninstall*

The only relevant regular files are ``*vars.*sh``, but those also just churn
through the subordinate vars files of the components.
Installation model
~~~~~~~~~~~~~~~~~~

Intel packages come with an ``install.sh`` script that is normally run
interactively (in either text or GUI mode) but can run unattended with a
``--silent <file>`` option, which is of course what Spack uses.

Format of configuration file
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The configuration file is conventionally called ``silent.cfg`` and has a simple
``token=value`` syntax. Before using the configuration file, the installer
calls ``<staging_dir>/pset/check.awk`` to validate it. Example paths to the
validator are::

    .../l_mkl_2018.1.163/pset/check.awk
    .../parallel_studio_xe_2018_update1_cluster_edition/pset/check.awk

The tokens that are accepted in the configuration file vary between packages.
Tokens not supported for a given package **will cause the installer to stop
and fail.** This is particularly relevant for license-related tokens, which are
accepted only for packages that actually require a license.

Reference: Intel's documentation at
https://software.intel.com/en-us/articles/configuration-file-format

See also:
https://software.intel.com/en-us/articles/silent-installation-guide-for-intel-parallel-studio-xe-composer-edition-for-os-x
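The ``token=value`` syntax is trivial to generate. A minimal sketch
(``make_silent_cfg`` is a hypothetical helper, not Spack code; the token
names are taken from the listings in this document, and which of them a
given installer accepts varies):

```python
def make_silent_cfg(tokens: dict) -> str:
    """Render a dict of tokens in the silent.cfg token=value syntax."""
    # One token per line; no quoting or escaping is part of the format.
    return "".join(f"{key}={value}\n" for key, value in tokens.items())


cfg = make_silent_cfg(
    {
        "ACCEPT_EULA": "accept",
        "PSET_INSTALL_DIR": "/opt/intel",
        "PSET_MODE": "install",
        "CONTINUE_WITH_INSTALLDIR_OVERWRITE": "yes",
        "COMPONENTS": "DEFAULTS",
    }
)
```

Remember that an unsupported token does not get ignored; it makes the
installer stop and fail, so only emit tokens known to apply to the package
at hand.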
The following is from ``.../parallel_studio_xe_2018_update1_cluster_edition/pset/check.awk``:

* Tokens valid for all packages encountered::

    ACCEPT_EULA                            {accept, decline}
    CONTINUE_WITH_OPTIONAL_ERROR           {yes, no}
    PSET_INSTALL_DIR                       {/opt/intel, , filepat}
    CONTINUE_WITH_INSTALLDIR_OVERWRITE     {yes, no}
    COMPONENTS                             {ALL, DEFAULTS, , anythingpat}
    PSET_MODE                              {install, repair, uninstall}
    NONRPM_DB_DIR                          {, filepat}

    SIGNING_ENABLED                        {yes, no}
    ARCH_SELECTED                          {IA32, INTEL64, ALL}

* Mentioned but unexplained in ``check.awk``::

    NO_VALIDATE                            (?!)

* Only for licensed packages::

    ACTIVATION_SERIAL_NUMBER               {, snpat}
    ACTIVATION_LICENSE_FILE                {, lspat, filepat}
    ACTIVATION_TYPE                        {exist_lic, license_server,
                                            license_file, trial_lic,
                                            serial_number}
    PHONEHOME_SEND_USAGE_DATA              {yes, no}

* Only for Amplifier (obviously)::

    AMPLIFIER_SAMPLING_DRIVER_INSTALL_TYPE {build, kit}
    AMPLIFIER_DRIVER_ACCESS_GROUP          {, anythingpat, vtune}
    AMPLIFIER_DRIVER_PERMISSIONS           {, anythingpat, 666}
    AMPLIFIER_LOAD_DRIVER                  {yes, no}
    AMPLIFIER_C_COMPILER                   {, filepat, auto, none}
    AMPLIFIER_KERNEL_SRC_DIR               {, filepat, auto, none}
    AMPLIFIER_MAKE_COMMAND                 {, filepat, auto, none}
    AMPLIFIER_INSTALL_BOOT_SCRIPT          {yes, no}
    AMPLIFIER_DRIVER_PER_USER_MODE         {yes, no}

* Only for MKL and Studio::

    CLUSTER_INSTALL_REMOTE                 {yes, no}
    CLUSTER_INSTALL_TEMP                   {, filepat}
    CLUSTER_INSTALL_MACHINES_FILE          {, filepat}

* "backward compatibility" (?)::

    INSTALL_MODE                           {RPM, NONRPM}
    download_only                          {yes}
    download_dir                           {, filepat}
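The validation that ``check.awk`` performs can be pictured as a lookup of
each token against its set of allowed values. Here is a hypothetical Python
mirror of a fragment of that check; only the literal-valued tokens above are
modeled, and the ``filepat``/``anythingpat`` patterns are left out:

```python
# Token -> allowed literal values, copied from the listings above.
ALLOWED = {
    "ACCEPT_EULA": {"accept", "decline"},
    "PSET_MODE": {"install", "repair", "uninstall"},
    "ARCH_SELECTED": {"IA32", "INTEL64", "ALL"},
    "SIGNING_ENABLED": {"yes", "no"},
}


def validate(tokens: dict) -> list:
    """Return one error message per token whose value is out of range."""
    errors = []
    for key, value in tokens.items():
        if key in ALLOWED and value not in ALLOWED[key]:
            errors.append(f"{key}: invalid value {value!r}")
    return errors
```

The real installer is stricter: it also fails on tokens that are not
applicable to the package at all, which this sketch does not model.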
Details for licensing tokens
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Quoted from
https://software.intel.com/en-us/articles/configuration-file-format,
for reference:

[ed. note: As of 2018-05, the page incorrectly references ``ACTIVATION``, which
was used only until about 2012; this is corrected to ``ACTIVATION_TYPE`` here.]

...

``ACTIVATION_TYPE=exist_lic``
    This directive tells the install program to look for an existing
    license during the install process. This is the preferred method for
    silent installs. Take the time to register your serial number and get
    a license file (see below). Having a license file on the system
    simplifies the process. In addition, as an administrator it is good
    practice to know WHERE your licenses are saved on your system.
    License files are plain text files with a .lic extension. By default
    these are saved in /opt/intel/licenses which is searched by default.
    If you save your license elsewhere, perhaps under an NFS folder, set
    environment variable **INTEL_LICENSE_FILE** to the full path to your
    license file prior to starting the installation or use the
    configuration file directive ``ACTIVATION_LICENSE_FILE`` to specify the
    full pathname to the license file.

Options for ``ACTIVATION_TYPE`` are ``{ exist_lic, license_file, server_lic,
serial_number, trial_lic }``

``exist_lic``
    directs the installer to search for a valid license on the server.
    Searches will utilize the environment variable **INTEL_LICENSE_FILE**,
    search the default license directory /opt/intel/licenses, or use the
    ``ACTIVATION_LICENSE_FILE`` directive to find a valid license file.

``license_file``
    is similar to exist_lic but directs the installer to use
    ``ACTIVATION_LICENSE_FILE`` to find the license file.

``server_lic``
    is similar to exist_lic but directs the installer that
    this is a client installation and a floating license server will be
    contacted to activate the product. This option will contact your
    floating license server on your network to retrieve the license
    information. BEFORE using this option make sure your client is
    correctly set up for your network including all networking, routing,
    name service, and firewall configuration. Insure that your client has
    direct access to your floating license server and that firewalls are
    set up to allow TCP/IP access for the 2 license server ports.
    server_lic will use **INTEL_LICENSE_FILE** containing a port@host format
    OR a client license file. The formats for these are described here
    https://software.intel.com/en-us/articles/licensing-setting-up-the-client-floating-license

``serial_number``
    directs the installer to use directive ``ACTIVATION_SERIAL_NUMBER`` for
    activation. This method will require the installer to contact an
    external Intel activation server over the Internet to confirm your
    serial number. Due to user and company firewalls, this method is the
    most complex and hence most error prone of the available activation
    methods. We highly recommend using a license file or license server
    for activation instead.

``trial_lic``
    is used only if you do not have an existing license and intend to
    temporarily evaluate the compiler. This method creates a temporary
    trial license in Trusted Storage on your system.

...
*******************
vars files
*******************

Intel's product packages contain a number of shell initialization files; let's
call them vars files.

There are three kinds:

#. Component-specific vars files, such as ``mklvars`` or ``tbbvars``.
#. Toplevel vars files such as ``psxevars``. They will scan for all
   component-specific vars files associated with the product, and source them
   if found.
#. Symbolic links to either of them. Links may appear under a different name
   for backward compatibility.

At present, the IntelPackage class is only concerned with the toplevel vars
files, generally found in the product's toplevel bin/ directory.

For reference, here is an overview of the names and locations of the vars files
in the 2018 product releases, as seen for a Spack-native installation. NB: may
be incomplete, as some components may have been omitted during installation.
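The three kinds can be told apart mechanically. A minimal sketch, assuming a
purely name- and filetype-based classification (``classify_vars_file`` is a
hypothetical helper; the name prefixes chosen here for "toplevel" are an
assumption based on the listings that follow, not a Spack rule):

```python
import os


def classify_vars_file(path: str) -> str:
    """Bucket a vars file into one of the three kinds described above."""
    # Kind 3: symbolic links (possibly under a backward-compat name).
    if os.path.islink(path):
        return "symlink"
    name = os.path.basename(path)
    # Kind 2: toplevel files that source the component-specific ones.
    if name.startswith(("psxevars", "compilervars")):
        return "toplevel"
    # Kind 1: everything else (mklvars, tbbvars, mpivars, ...).
    return "component"
```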
Names of vars files seen::

    $ cd opt/spack/linux-centos6-x86_64
    $ find intel* -name \*vars.sh -printf '%f\n' | sort -u | nl
         1  advixe-vars.sh
         2  amplxe-vars.sh
         3  apsvars.sh
         4  compilervars.sh
         5  daalvars.sh
         6  debuggervars.sh
         7  iccvars.sh
         8  ifortvars.sh
         9  inspxe-vars.sh
        10  ippvars.sh
        11  mklvars.sh
        12  mpivars.sh
        13  pstlvars.sh
        14  psxevars.sh
        15  sep_vars.sh
        16  tbbvars.sh
Names and locations of vars files, sorted by Spack package name::

    $ cd opt/spack/linux-centos6-x86_64
    $ find intel* -name \*vars.sh -printf '%y\t%-15f\t%h\n' \
        | cut -d/ -f1,4- \
        | sed '/iccvars\|ifortvars/d; s,/,\t\t,; s,\.sh,,; s, */\(intel[/-]\),\1,' \
        | sort -k3,3 -k2,2 \
        | nl \
        | awk '{printf "%6i %-2s %-16s %-24s %s\n", $1, $2, $3, $4, $5}'

    Columns: item no.; file (f) or link (l); name of vars file;
    Spack package name; dir relative to Spack install dir
    --------------------------------------------------------------------------------------------------------

         1 f  mpivars          intel                    compilers_and_libraries_2018.1.163/linux/mpi/intel64/bin
         2 f  mpivars          intel                    compilers_and_libraries_2018.1.163/linux/mpirt/bin/ia32_lin
         3 f  tbbvars          intel                    compilers_and_libraries_2018.1.163/linux/tbb/bin
         4 f  pstlvars         intel                    compilers_and_libraries_2018.1.163/linux/pstl/bin
         5 f  compilervars     intel                    compilers_and_libraries_2018.1.163/linux/bin
         6 f  compilervars     intel                    compilers_and_libraries_2018/linux/bin
         7 l  compilervars     intel                    bin
         8 f  daalvars         intel-daal               compilers_and_libraries_2018.2.199/linux/daal/bin
         9 f  psxevars         intel-daal               parallel_studio_xe_2018.2.046/bin
        10 l  psxevars         intel-daal               parallel_studio_xe_2018.2.046
        11 f  compilervars     intel-daal               compilers_and_libraries_2018.2.199/linux/bin
        12 f  compilervars     intel-daal               compilers_and_libraries_2018/linux/bin
        13 l  compilervars     intel-daal               bin
        14 f  ippvars          intel-ipp                compilers_and_libraries_2018.2.199/linux/ipp/bin
        15 f  psxevars         intel-ipp                parallel_studio_xe_2018.2.046/bin
        16 l  psxevars         intel-ipp                parallel_studio_xe_2018.2.046
        17 f  compilervars     intel-ipp                compilers_and_libraries_2018.2.199/linux/bin
        18 f  compilervars     intel-ipp                compilers_and_libraries_2018/linux/bin
        19 l  compilervars     intel-ipp                bin
        20 f  mklvars          intel-mkl                compilers_and_libraries_2018.2.199/linux/mkl/bin
        21 f  psxevars         intel-mkl                parallel_studio_xe_2018.2.046/bin
        22 l  psxevars         intel-mkl                parallel_studio_xe_2018.2.046
        23 f  compilervars     intel-mkl                compilers_and_libraries_2018.2.199/linux/bin
        24 f  compilervars     intel-mkl                compilers_and_libraries_2018/linux/bin
        25 l  compilervars     intel-mkl                bin
        26 f  mpivars          intel-mpi                compilers_and_libraries_2018.2.199/linux/mpi_2019/intel64/bin
        27 f  mpivars          intel-mpi                compilers_and_libraries_2018.2.199/linux/mpi/intel64/bin
        28 f  psxevars         intel-mpi                parallel_studio_xe_2018.2.046/bin
        29 l  psxevars         intel-mpi                parallel_studio_xe_2018.2.046
        30 f  compilervars     intel-mpi                compilers_and_libraries_2018.2.199/linux/bin
        31 f  compilervars     intel-mpi                compilers_and_libraries_2018/linux/bin
        32 l  compilervars     intel-mpi                bin
        33 f  apsvars          intel-parallel-studio    vtune_amplifier_2018.1.0.535340
        34 l  apsvars          intel-parallel-studio    performance_snapshots_2018.1.0.535340
        35 f  ippvars          intel-parallel-studio    compilers_and_libraries_2018.1.163/linux/ipp/bin
        36 f  ippvars          intel-parallel-studio    composer_xe_2015.6.233/ipp/bin
        37 f  mklvars          intel-parallel-studio    compilers_and_libraries_2018.1.163/linux/mkl/bin
        38 f  mklvars          intel-parallel-studio    composer_xe_2015.6.233/mkl/bin
        39 f  mpivars          intel-parallel-studio    compilers_and_libraries_2018.1.163/linux/mpi/intel64/bin
        40 f  mpivars          intel-parallel-studio    compilers_and_libraries_2018.1.163/linux/mpirt/bin/ia32_lin
        41 f  tbbvars          intel-parallel-studio    compilers_and_libraries_2018.1.163/linux/tbb/bin
        42 f  tbbvars          intel-parallel-studio    composer_xe_2015.6.233/tbb/bin
        43 f  daalvars         intel-parallel-studio    compilers_and_libraries_2018.1.163/linux/daal/bin
        44 f  pstlvars         intel-parallel-studio    compilers_and_libraries_2018.1.163/linux/pstl/bin
        45 f  psxevars         intel-parallel-studio    parallel_studio_xe_2018.1.038/bin
        46 l  psxevars         intel-parallel-studio    parallel_studio_xe_2018.1.038
        47 f  sep_vars         intel-parallel-studio    vtune_amplifier_2018.1.0.535340
        48 f  sep_vars         intel-parallel-studio    vtune_amplifier_2018.1.0.535340/target/android_v4.1_x86_64
        49 f  advixe-vars      intel-parallel-studio    advisor_2018.1.1.535164
        50 f  amplxe-vars      intel-parallel-studio    vtune_amplifier_2018.1.0.535340
        51 f  inspxe-vars      intel-parallel-studio    inspector_2018.1.1.535159
        52 f  compilervars     intel-parallel-studio    compilers_and_libraries_2018.1.163/linux/bin
        53 f  compilervars     intel-parallel-studio    compilers_and_libraries_2018/linux/bin
        54 l  compilervars     intel-parallel-studio    bin
        55 f  debuggervars     intel-parallel-studio    debugger_2018/bin
********************
MPI linkage
********************

Library selection
~~~~~~~~~~~~~~~~~~~~~

In the Spack code so far, the library selections for MPI are::

    libnames = ['libmpifort', 'libmpi']
    if 'cxx' in self.spec.last_query.extra_parameters:
        libnames = ['libmpicxx'] + libnames
    return find_libraries(libnames,
                          root=self.component_lib_dir('mpi'),
                          shared=True, recursive=False)

The problem is that there are multiple library versions under ``component_lib_dir``::

    $ cd $I_MPI_ROOT
    $ find . -name libmpi.so | sort
    ./intel64/lib/debug/libmpi.so
    ./intel64/lib/debug_mt/libmpi.so
    ./intel64/lib/libmpi.so
    ./intel64/lib/release/libmpi.so
    ./intel64/lib/release_mt/libmpi.so

Here, "mt" refers to multi-threading, not in the explicit sense but in the
sense of being thread-safe::

    $ mpiifort -help | grep mt
        -mt_mpi         link the thread safe version of the Intel(R) MPI Library

Well, why should we not inspect what the canonical wrapper script does? The
wrapper has its own hardcoded "prefix=..." and can thus tell us what it will
do, from a *wiped environment*, no less::

    $ env - intel64/bin/mpiicc -show hello.c | ld-unwrap-args
    icc 'hello.c' \
        -I/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/include \
        -L/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/lib/release_mt \
        -L/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/lib \
        -Xlinker --enable-new-dtags \
        -Xlinker -rpath=/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/lib/release_mt \
        -Xlinker -rpath=/opt/intel/compilers_and_libraries_2018.1.163/linux/mpi/intel64/lib \
        -Xlinker -rpath=/opt/intel/mpi-rt/2017.0.0/intel64/lib/release_mt \
        -Xlinker -rpath=/opt/intel/mpi-rt/2017.0.0/intel64/lib \
        -lmpifort \
        -lmpi \
        -lmpigi \
        -ldl \
        -lrt \
        -lpthread
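Once the ``-show`` output is captured, pulling the search and rpath
directories out of it is mostly string handling. A minimal sketch that works
on the captured text only (``link_dirs_from_show`` is a hypothetical helper;
it does not run ``mpiicc`` itself, and the sample line below is abridged):

```python
import shlex


def link_dirs_from_show(show_output: str) -> list:
    """Extract -L and -rpath= directories from a wrapper's ``-show`` line."""
    dirs = []
    tokens = shlex.split(show_output)
    for i, tok in enumerate(tokens):
        if tok.startswith("-L"):
            # Handle both "-L/dir" and "-L /dir" spellings.
            dirs.append(tok[2:] if len(tok) > 2 else tokens[i + 1])
        elif tok.startswith("-rpath="):
            dirs.append(tok.split("=", 1)[1])
    return dirs


sample = (
    "icc 'hello.c' "
    "-L/opt/mpi/intel64/lib/release_mt "
    "-Xlinker -rpath=/opt/mpi/intel64/lib "
    "-lmpifort -lmpi"
)
```

The first directory the wrapper emits (``release_mt`` above) is the variant
it actually prefers, which is useful when deciding which of the five
``libmpi.so`` copies to record.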
MPI Wrapper options
~~~~~~~~~~~~~~~~~~~~~

For reference, here's the wrapper's builtin help output::

    $ mpiifort -help
    Simple script to compile and/or link MPI programs.
    Usage: mpiifort [options] <files>
    ----------------------------------------------------------------------------
    The following options are supported:
       -fc=<name> | -f90=<name>
                       specify a FORTRAN compiler name: i.e. -fc=ifort
       -echo           print the scripts during their execution
       -show           show command lines without real calling
       -config=<name>  specify a configuration file: i.e. -config=ifort for mpif90-ifort.conf file
       -v              print version info of mpiifort and its native compiler
       -profile=<name> specify a profile configuration file (an MPI profiling
                       library): i.e. -profile=myprofile for the myprofile.cfg file.
                       As a special case, lib<name>.so or lib<name>.a may be used
                       if the library is found
       -check_mpi      link against the Intel(R) Trace Collector (-profile=vtmc).
       -static_mpi     link the Intel(R) MPI Library statically
       -mt_mpi         link the thread safe version of the Intel(R) MPI Library
       -ilp64          link the ILP64 support of the Intel(R) MPI Library
       -no_ilp64       disable ILP64 support explicitly
       -fast           the same as -static_mpi + pass -fast option to a compiler.
       -t or -trace
                       link against the Intel(R) Trace Collector
       -trace-imbalance
                       link against the Intel(R) Trace Collector imbalance library
                       (-profile=vtim)
       -dynamic_log    link against the Intel(R) Trace Collector dynamically
       -static         use static linkage method
       -nostrip        turn off the debug information stripping during static linking
       -O              enable optimization
       -link_mpi=<name>
                       link against the specified version of the Intel(R) MPI Library
    All other options will be passed to the compiler without changing.
    ----------------------------------------------------------------------------
    The following environment variables are used:
       I_MPI_ROOT      the Intel(R) MPI Library installation directory path
       I_MPI_F90 or MPICH_F90
                       the path/name of the underlying compiler to be used
       I_MPI_FC_PROFILE or I_MPI_F90_PROFILE or MPIF90_PROFILE
                       the name of profile file (without extension)
       I_MPI_COMPILER_CONFIG_DIR
                       the folder which contains configuration files *.conf
       I_MPI_TRACE_PROFILE
                       specify a default profile for the -trace option
       I_MPI_CHECK_PROFILE
                       specify a default profile for the -check_mpi option
       I_MPI_CHECK_COMPILER
                       enable compiler setup checks
       I_MPI_LINK      specify the version of the Intel(R) MPI Library
       I_MPI_DEBUG_INFO_STRIP
                       turn on/off the debug information stripping during static linking
       I_MPI_FCFLAGS
                       special flags needed for compilation
       I_MPI_LDFLAGS
                       special flags needed for linking
    ----------------------------------------------------------------------------
Side note: MPI version divergence in the 2015 release
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The package ``intel-parallel-studio@cluster.2015.6`` contains both a full MPI
development version in ``$prefix/impi`` and an MPI runtime under the
``composer_xe*`` suite directory. Curiously, these have *different versions*,
with release dates nearly 1 year apart::

    $ $SPACK_ROOT/...uaxaw7/impi/5.0.3.049/intel64/bin/mpiexec --version
    Intel(R) MPI Library for Linux* OS, Version 5.0 Update 3 Build 20150804 (build id: 12452)
    Copyright (C) 2003-2015, Intel Corporation. All rights reserved.

    $ $SPACK_ROOT/...uaxaw7/composer_xe_2015.6.233/mpirt/bin/intel64/mpiexec --version
    Intel(R) MPI Library for Linux* OS, Version 5.0 Update 1 Build 20140709
    Copyright (C) 2003-2014, Intel Corporation. All rights reserved.

I'm not sure what to make of it.
**************
macOS support
**************

- On macOS, the Spack methods here only include support to integrate an
  externally installed MKL.

- URLs in child packages will be Linux-specific; macOS download packages
  are located in differently numbered dirs and are named m_*.dmg.
@ -2,13 +2,19 @@
|
|||||||
#
|
#
|
||||||
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
||||||
import os
|
import os
|
||||||
from typing import Callable, List
|
from typing import List
|
||||||
|
|
||||||
|
import llnl.util.lang
|
||||||
|
|
||||||
|
import spack.builder
|
||||||
|
import spack.error
|
||||||
|
import spack.phase_callbacks
|
||||||
import spack.relocate
|
import spack.relocate
|
||||||
from spack.package import Builder, InstallError, Spec, run_after
|
import spack.spec
|
||||||
|
import spack.store
|
||||||
|
|
||||||
|
|
||||||
def sanity_check_prefix(builder: Builder):
|
def sanity_check_prefix(builder: spack.builder.Builder):
|
||||||
"""Check that specific directories and files are created after installation.
|
"""Check that specific directories and files are created after installation.
|
||||||
|
|
||||||
The files to be checked are in the ``sanity_check_is_file`` attribute of the
|
The files to be checked are in the ``sanity_check_is_file`` attribute of the
|
||||||
@ -19,31 +25,27 @@ def sanity_check_prefix(builder: Builder):
|
|||||||
"""
|
"""
|
||||||
pkg = builder.pkg
|
pkg = builder.pkg
|
||||||
|
|
||||||
def check_paths(path_list: List[str], filetype: str, predicate: Callable[[str], bool]) -> None:
|
def check_paths(path_list, filetype, predicate):
|
||||||
if isinstance(path_list, str):
|
if isinstance(path_list, str):
|
||||||
path_list = [path_list]
|
path_list = [path_list]
|
||||||
|
|
||||||
for path in path_list:
|
for path in path_list:
|
||||||
if not predicate(os.path.join(pkg.prefix, path)):
|
abs_path = os.path.join(pkg.prefix, path)
|
||||||
raise InstallError(
|
if not predicate(abs_path):
|
||||||
f"Install failed for {pkg.name}. No such {filetype} in prefix: {path}"
|
msg = "Install failed for {0}. No such {1} in prefix: {2}"
|
||||||
)
|
msg = msg.format(pkg.name, filetype, path)
|
||||||
|
raise spack.error.InstallError(msg)
|
||||||
|
|
||||||
check_paths(pkg.sanity_check_is_file, "file", os.path.isfile)
|
check_paths(pkg.sanity_check_is_file, "file", os.path.isfile)
|
||||||
check_paths(pkg.sanity_check_is_dir, "directory", os.path.isdir)
|
check_paths(pkg.sanity_check_is_dir, "directory", os.path.isdir)
|
||||||
|
|
||||||
# Check that the prefix is not empty apart from the .spack/ directory
|
ignore_file = llnl.util.lang.match_predicate(spack.store.STORE.layout.hidden_file_regexes)
|
||||||
with os.scandir(pkg.prefix) as entries:
|
if all(map(ignore_file, os.listdir(pkg.prefix))):
|
||||||
f = next(
|
msg = "Install failed for {0}. Nothing was installed!"
|
||||||
(f for f in entries if not (f.name == ".spack" and f.is_dir(follow_symlinks=False))),
|
raise spack.error.InstallError(msg.format(pkg.name))
|
||||||
None,
|
|
||||||
)
|
|
||||||
|
|
||||||
if f is None:
|
|
||||||
raise InstallError(f"Install failed for {pkg.name}. Nothing was installed!")
|
|
||||||
|
|
||||||
|
|
||||||
def apply_macos_rpath_fixups(builder: Builder):
|
def apply_macos_rpath_fixups(builder: spack.builder.Builder):
|
||||||
"""On Darwin, make installed libraries more easily relocatable.
|
"""On Darwin, make installed libraries more easily relocatable.
|
||||||
|
|
||||||
Some build systems (handrolled, autotools, makefiles) can set their own
|
Some build systems (handrolled, autotools, makefiles) can set their own
|
||||||
@ -60,7 +62,9 @@ def apply_macos_rpath_fixups(builder: Builder):
|
|||||||
spack.relocate.fixup_macos_rpaths(builder.spec)
|
spack.relocate.fixup_macos_rpaths(builder.spec)
|
||||||
|
|
||||||
|
|
||||||
def ensure_build_dependencies_or_raise(spec: Spec, dependencies: List[str], error_msg: str):
|
def ensure_build_dependencies_or_raise(
|
||||||
|
spec: spack.spec.Spec, dependencies: List[str], error_msg: str
|
||||||
|
):
|
||||||
"""Ensure that some build dependencies are present in the concrete spec.
|
"""Ensure that some build dependencies are present in the concrete spec.
|
||||||
|
|
||||||
If not, raise a RuntimeError with a helpful error message.
|
If not, raise a RuntimeError with a helpful error message.
|
||||||
@ -97,7 +101,7 @@ def ensure_build_dependencies_or_raise(spec: Spec, dependencies: List[str], erro
|
|||||||
raise RuntimeError(msg)
|
raise RuntimeError(msg)
|
||||||
|
|
||||||
|
|
||||||
def execute_build_time_tests(builder: Builder):
|
def execute_build_time_tests(builder: spack.builder.Builder):
|
||||||
"""Execute the build-time tests prescribed by builder.
|
"""Execute the build-time tests prescribed by builder.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
@ -110,7 +114,7 @@ def execute_build_time_tests(builder: Builder):
|
|||||||
builder.pkg.tester.phase_tests(builder, "build", builder.build_time_test_callbacks)
|
builder.pkg.tester.phase_tests(builder, "build", builder.build_time_test_callbacks)
|
||||||
|
|
||||||
|
|
||||||
def execute_install_time_tests(builder: Builder):
|
def execute_install_time_tests(builder: spack.builder.Builder):
|
||||||
"""Execute the install-time tests prescribed by builder.
|
"""Execute the install-time tests prescribed by builder.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
@ -123,8 +127,8 @@ def execute_install_time_tests(builder: Builder):
|
|||||||
builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
|
builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
|
||||||
|
|
||||||
|
|
||||||
class BuilderWithDefaults(Builder):
|
class BuilderWithDefaults(spack.builder.Builder):
|
||||||
"""Base class for all specific builders with common callbacks registered."""
|
"""Base class for all specific builders with common callbacks registered."""
|
||||||
|
|
||||||
# Check that self.prefix is there after installation
|
# Check that self.prefix is there after installation
|
||||||
run_after("install")(sanity_check_prefix)
|
spack.phase_callbacks.run_after("install")(sanity_check_prefix)
|
||||||
|
@ -23,6 +23,7 @@
|
|||||||
from .generic import Package
|
from .generic import Package
|
||||||
from .gnu import GNUMirrorPackage
|
from .gnu import GNUMirrorPackage
|
||||||
from .go import GoPackage
|
from .go import GoPackage
|
||||||
|
from .intel import IntelPackage
|
||||||
from .lua import LuaPackage
|
from .lua import LuaPackage
|
||||||
from .makefile import MakefilePackage
|
from .makefile import MakefilePackage
|
||||||
from .maven import MavenPackage
|
from .maven import MavenPackage
|
||||||
@ -68,6 +69,7 @@
|
|||||||
"Package",
|
"Package",
|
||||||
"GNUMirrorPackage",
|
"GNUMirrorPackage",
|
||||||
"GoPackage",
|
"GoPackage",
|
||||||
|
"IntelPackage",
|
||||||
"IntelOneApiLibraryPackageWithSdk",
|
"IntelOneApiLibraryPackageWithSdk",
|
||||||
"IntelOneApiLibraryPackage",
|
"IntelOneApiLibraryPackage",
|
||||||
"IntelOneApiStaticLibraryList",
|
"IntelOneApiStaticLibraryList",
|
||||||
|
@@ -3,7 +3,12 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import os
 
-from spack.package import Executable, Prefix, Spec, extends, filter_file
+import llnl.util.filesystem as fs
+
+import spack.directives
+import spack.spec
+import spack.util.executable
+import spack.util.prefix
 
 from .autotools import AutotoolsBuilder, AutotoolsPackage
 
@@ -15,13 +20,16 @@ class AspellBuilder(AutotoolsBuilder):
     """
 
     def configure(
-        self, pkg: "AspellDictPackage", spec: Spec, prefix: Prefix  # type: ignore[override]
+        self,
+        pkg: "AspellDictPackage",  # type: ignore[override]
+        spec: spack.spec.Spec,
+        prefix: spack.util.prefix.Prefix,
     ):
         aspell = spec["aspell"].prefix.bin.aspell
         prezip = spec["aspell"].prefix.bin.prezip
         destdir = prefix
 
-        sh = Executable("/bin/sh")
+        sh = spack.util.executable.Executable("/bin/sh")
         sh("./configure", "--vars", f"ASPELL={aspell}", f"PREZIP={prezip}", f"DESTDIR={destdir}")
 
 
@@ -34,7 +42,7 @@ def configure(
 class AspellDictPackage(AutotoolsPackage):
     """Specialized class for building aspell dictionairies."""
 
-    extends("aspell", when="build_system=autotools")
+    spack.directives.extends("aspell", when="build_system=autotools")
 
     #: Override the default autotools builder
     AutotoolsBuilder = AspellBuilder
@@ -46,5 +54,5 @@ def patch(self):
         datadir = aspell("dump", "config", "data-dir", output=str).strip()
         dictdir = os.path.relpath(dictdir, aspell_spec.prefix)
         datadir = os.path.relpath(datadir, aspell_spec.prefix)
-        filter_file(r"^dictdir=.*$", f"dictdir=/{dictdir}", "configure")
-        filter_file(r"^datadir=.*$", f"datadir=/{datadir}", "configure")
+        fs.filter_file(r"^dictdir=.*$", f"dictdir=/{dictdir}", "configure")
+        fs.filter_file(r"^datadir=.*$", f"datadir=/{datadir}", "configure")

@@ -7,36 +7,22 @@
 from typing import Callable, List, Optional, Set, Tuple, Union
 
 import llnl.util.filesystem as fs
+import llnl.util.tty as tty
 
 import spack.build_environment
 import spack.builder
 import spack.compilers.libraries
+import spack.error
+import spack.package_base
+import spack.phase_callbacks
+import spack.spec
+import spack.util.environment
+import spack.util.prefix
+from spack.directives import build_system, conflicts, depends_on
+from spack.multimethod import when
 from spack.operating_systems.mac_os import macos_version
-from spack.package import (
-    EnvironmentModifications,
-    Executable,
-    FileFilter,
-    InstallError,
-    PackageBase,
-    Prefix,
-    Spec,
-    Version,
-    build_system,
-    conflicts,
-    copy,
-    depends_on,
-    find,
-    force_remove,
-    is_exe,
-    keep_modification_time,
-    mkdirp,
-    register_builder,
-    run_after,
-    run_before,
-    tty,
-    when,
-    working_dir,
-)
+from spack.util.executable import Executable
+from spack.version import Version
 
 from ._checks import (
     BuilderWithDefaults,
@@ -47,7 +33,7 @@
 )
 
 
-class AutotoolsPackage(PackageBase):
+class AutotoolsPackage(spack.package_base.PackageBase):
     """Specialized class for packages built using GNU Autotools."""
 
     #: This attribute is used in UI queries that need to know the build
@@ -92,7 +78,7 @@ def with_or_without(self, *args, **kwargs):
         return spack.builder.create(self).with_or_without(*args, **kwargs)
 
 
-@register_builder("autotools")
+@spack.builder.builder("autotools")
 class AutotoolsBuilder(BuilderWithDefaults):
     """The autotools builder encodes the default way of installing software built
     with autotools. It has four phases that can be overridden, if need be:
@@ -206,7 +192,7 @@ def archive_files(self) -> List[str]:
             files.append(self._removed_la_files_log)
         return files
 
-    @run_after("autoreconf")
+    @spack.phase_callbacks.run_after("autoreconf")
     def _do_patch_config_files(self) -> None:
        """Some packages ship with older config.guess/config.sub files and need to
        have these updated when installed on a newer architecture.
@@ -244,7 +230,7 @@ def runs_ok(script_abs_path):
             return True
 
         # Get the list of files that needs to be patched
-        to_be_patched = find(self.pkg.stage.path, files=["config.sub", "config.guess"])
+        to_be_patched = fs.find(self.pkg.stage.path, files=["config.sub", "config.guess"])
         to_be_patched = [f for f in to_be_patched if not runs_ok(f)]
 
         # If there are no files to be patched, return early
@@ -263,13 +249,13 @@ def runs_ok(script_abs_path):
 
         # An external gnuconfig may not not have a prefix.
         if gnuconfig_dir is None:
-            raise InstallError(
+            raise spack.error.InstallError(
                 "Spack could not find substitutes for GNU config files because no "
                 "prefix is available for the `gnuconfig` package. Make sure you set a "
                 "prefix path instead of modules for external `gnuconfig`."
             )
 
-        candidates = find(gnuconfig_dir, files=to_be_found, recursive=False)
+        candidates = fs.find(gnuconfig_dir, files=to_be_found, recursive=False)
 
         # For external packages the user may have specified an incorrect prefix.
         # otherwise the installation is just corrupt.
@@ -283,7 +269,7 @@ def runs_ok(script_abs_path):
             msg += (
                 " or the `gnuconfig` package prefix is misconfigured as" " an external package"
             )
-            raise InstallError(msg)
+            raise spack.error.InstallError(msg)
 
         # Filter working substitutes
         candidates = [f for f in candidates if runs_ok(f)]
@@ -308,29 +294,29 @@ def runs_ok(script_abs_path):
 and set the prefix to the directory containing the `config.guess` and
 `config.sub` files.
 """
-            raise InstallError(msg.format(", ".join(to_be_found), self.pkg.name))
+            raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.pkg.name))
 
         # Copy the good files over the bad ones
         for abs_path in to_be_patched:
             name = os.path.basename(abs_path)
             mode = os.stat(abs_path).st_mode
             os.chmod(abs_path, stat.S_IWUSR)
-            copy(substitutes[name], abs_path)
+            fs.copy(substitutes[name], abs_path)
             os.chmod(abs_path, mode)
 
-    @run_before("configure")
+    @spack.phase_callbacks.run_before("configure")
     def _patch_usr_bin_file(self) -> None:
         """On NixOS file is not available in /usr/bin/file. Patch configure
         scripts to use file from path."""
 
         if self.spec.os.startswith("nixos"):
-            x = FileFilter(
-                *filter(is_exe, find(self.build_directory, "configure", recursive=True))
+            x = fs.FileFilter(
+                *filter(fs.is_exe, fs.find(self.build_directory, "configure", recursive=True))
             )
-            with keep_modification_time(*x.filenames):
+            with fs.keep_modification_time(*x.filenames):
                 x.filter(regex="/usr/bin/file", repl="file", string=True)
 
-    @run_before("configure")
+    @spack.phase_callbacks.run_before("configure")
     def _set_autotools_environment_variables(self) -> None:
         """Many autotools builds use a version of mknod.m4 that fails when
         running as root unless FORCE_UNSAFE_CONFIGURE is set to 1.
@@ -344,7 +330,7 @@ def _set_autotools_environment_variables(self) -> None:
         """
         os.environ["FORCE_UNSAFE_CONFIGURE"] = "1"
 
-    @run_before("configure")
+    @spack.phase_callbacks.run_before("configure")
     def _do_patch_libtool_configure(self) -> None:
         """Patch bugs that propagate from libtool macros into "configure" and
         further into "libtool". Note that patches that can be fixed by patching
@@ -355,12 +341,14 @@ def _do_patch_libtool_configure(self) -> None:
         if not self.patch_libtool:
             return
 
-        x = FileFilter(*filter(is_exe, find(self.build_directory, "configure", recursive=True)))
+        x = fs.FileFilter(
+            *filter(fs.is_exe, fs.find(self.build_directory, "configure", recursive=True))
+        )
 
         # There are distributed automatically generated files that depend on the configure script
         # and require additional tools for rebuilding.
         # See https://github.com/spack/spack/pull/30768#issuecomment-1219329860
-        with keep_modification_time(*x.filenames):
+        with fs.keep_modification_time(*x.filenames):
             # Fix parsing of compiler output when collecting predeps and postdeps
             # https://lists.gnu.org/archive/html/bug-libtool/2016-03/msg00003.html
             x.filter(regex=r'^(\s*if test x-L = )("\$p" \|\|\s*)$', repl=r"\1x\2")
@@ -377,7 +365,7 @@ def _do_patch_libtool_configure(self) -> None:
         # 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
         x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)
 
-    @run_after("configure")
+    @spack.phase_callbacks.run_after("configure")
     def _do_patch_libtool(self) -> None:
         """If configure generates a "libtool" script that does not correctly
         detect the compiler (and patch_libtool is set), patch in the correct
@@ -399,7 +387,9 @@ def _do_patch_libtool(self) -> None:
         if not self.patch_libtool:
             return
 
-        x = FileFilter(*filter(is_exe, find(self.build_directory, "libtool", recursive=True)))
+        x = fs.FileFilter(
+            *filter(fs.is_exe, fs.find(self.build_directory, "libtool", recursive=True))
+        )
 
         # Exit early if there is nothing to patch:
         if not x.filenames:
@@ -555,10 +545,10 @@ def build_directory(self) -> str:
             build_dir = os.path.join(self.pkg.stage.source_path, build_dir)
         return build_dir
 
-    @run_before("autoreconf")
+    @spack.phase_callbacks.run_before("autoreconf")
     def _delete_configure_to_force_update(self) -> None:
         if self.force_autoreconf:
-            force_remove(self.configure_abs_path)
+            fs.force_remove(self.configure_abs_path)
 
     @property
     def autoreconf_search_path_args(self) -> List[str]:
@@ -568,7 +558,7 @@ def autoreconf_search_path_args(self) -> List[str]:
         spack dependencies."""
         return _autoreconf_search_path_args(self.spec)
 
-    @run_after("autoreconf")
+    @spack.phase_callbacks.run_after("autoreconf")
     def _set_configure_or_die(self) -> None:
         """Ensure the presence of a "configure" script, or raise. If the "configure"
         is found, a module level attribute is set.
@@ -592,7 +582,9 @@ def configure_args(self) -> List[str]:
         """
         return []
 
-    def autoreconf(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
+    def autoreconf(
+        self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Not needed usually, configure should be already there"""
 
         # If configure exists nothing needs to be done
@@ -611,7 +603,7 @@ def autoreconf(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
         tty.warn("* If the default procedure fails, consider implementing *")
         tty.warn("* a custom AUTORECONF phase in the package *")
         tty.warn("*********************************************************")
-        with working_dir(self.configure_directory):
+        with fs.working_dir(self.configure_directory):
             # This line is what is needed most of the time
             # --install, --verbose, --force
             autoreconf_args = ["-ivf"]
@@ -619,7 +611,9 @@ def autoreconf(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
             autoreconf_args += self.autoreconf_extra_args
             self.pkg.module.autoreconf(*autoreconf_args)
 
-    def configure(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
+    def configure(
+        self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Run "configure", with the arguments specified by the builder and an
         appropriately set prefix.
         """
@@ -627,27 +621,31 @@ def configure(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
         options += ["--prefix={0}".format(prefix)]
         options += self.configure_args()
 
-        with working_dir(self.build_directory, create=True):
+        with fs.working_dir(self.build_directory, create=True):
             pkg.module.configure(*options)
 
-    def build(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
+    def build(
+        self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Run "make" on the build targets specified by the builder."""
         # See https://autotools.io/automake/silent.html
         params = ["V=1"]
         params += self.build_targets
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             pkg.module.make(*params)
 
-    def install(self, pkg: AutotoolsPackage, spec: Spec, prefix: Prefix) -> None:
+    def install(
+        self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Run "make" on the install targets specified by the builder."""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             pkg.module.make(*self.install_targets)
 
-    run_after("build")(execute_build_time_tests)
+    spack.phase_callbacks.run_after("build")(execute_build_time_tests)
 
     def check(self) -> None:
         """Run "make" on the ``test`` and ``check`` targets, if found."""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             self.pkg._if_make_target_execute("test")
             self.pkg._if_make_target_execute("check")
 
@@ -715,7 +713,7 @@ def _activate_or_not(
         Raises:
             KeyError: if name is not among known variants
         """
-        spec: Spec = self.pkg.spec
+        spec: spack.spec.Spec = self.pkg.spec
         args: List[str] = []
 
         if activation_value == "prefix":
@@ -826,14 +824,14 @@ def enable_or_disable(
         """
         return self._activate_or_not(name, "enable", "disable", activation_value, variant)
 
-    run_after("install")(execute_install_time_tests)
+    spack.phase_callbacks.run_after("install")(execute_install_time_tests)
 
     def installcheck(self) -> None:
         """Run "make" on the ``installcheck`` target, if found."""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             self.pkg._if_make_target_execute("installcheck")
 
-    @run_after("install")
+    @spack.phase_callbacks.run_after("install")
     def _remove_libtool_archives(self) -> None:
         """Remove all .la files in prefix sub-folders if the package sets
         ``install_libtool_archives`` to be False.
@@ -843,23 +841,25 @@ def _remove_libtool_archives(self) -> None:
             return
 
         # Remove the files and create a log of what was removed
-        libtool_files = find(str(self.pkg.prefix), "*.la", recursive=True)
+        libtool_files = fs.find(str(self.pkg.prefix), "*.la", recursive=True)
         with fs.safe_remove(*libtool_files):
-            mkdirp(os.path.dirname(self._removed_la_files_log))
+            fs.mkdirp(os.path.dirname(self._removed_la_files_log))
             with open(self._removed_la_files_log, mode="w", encoding="utf-8") as f:
                 f.write("\n".join(libtool_files))
 
-    def setup_build_environment(self, env: EnvironmentModifications) -> None:
+    def setup_build_environment(
+        self, env: spack.util.environment.EnvironmentModifications
+    ) -> None:
         if self.spec.platform == "darwin" and macos_version() >= Version("11"):
             # Many configure files rely on matching '10.*' for macOS version
             # detection and fail to add flags if it shows as version 11.
             env.set("MACOSX_DEPLOYMENT_TARGET", "10.16")
 
     # On macOS, force rpaths for shared library IDs and remove duplicate rpaths
-    run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
+    spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
 
 
-def _autoreconf_search_path_args(spec: Spec) -> List[str]:
+def _autoreconf_search_path_args(spec: spack.spec.Spec) -> List[str]:
     dirs_seen: Set[Tuple[int, int]] = set()
     flags_spack: List[str] = []
     flags_external: List[str] = []

@@ -1,10 +1,12 @@
 # Copyright Spack Project Developers. See COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
-from spack.package import Builder, PackageBase, Prefix, Spec, build_system, register_builder
+import spack.builder
+import spack.directives
+import spack.package_base
 
 
-class BundlePackage(PackageBase):
+class BundlePackage(spack.package_base.PackageBase):
     """General purpose bundle, or no-code, package class."""
 
     #: This attribute is used in UI queries that require to know which
@@ -17,12 +19,12 @@ class BundlePackage(PackageBase):
     #: Bundle packages do not have associated source or binary code.
     has_code = False
 
-    build_system("bundle")
+    spack.directives.build_system("bundle")
 
 
-@register_builder("bundle")
-class BundleBuilder(Builder):
+@spack.builder.builder("bundle")
+class BundleBuilder(spack.builder.Builder):
     phases = ("install",)
 
-    def install(self, pkg: BundlePackage, spec: Spec, prefix: Prefix) -> None:
+    def install(self, pkg, spec, prefix):
         pass

@@ -7,7 +7,14 @@
 import re
 from typing import Optional, Tuple
 
-from spack.package import Prefix, Spec, depends_on, install, mkdirp, run_after, tty, which_string
+import llnl.util.filesystem as fs
+import llnl.util.tty as tty
+
+import spack.phase_callbacks
+import spack.spec
+import spack.util.prefix
+from spack.directives import depends_on
+from spack.util.executable import which_string
 
 from .cmake import CMakeBuilder, CMakePackage
 
@@ -368,7 +375,9 @@ def initconfig_package_entries(self):
         """This method is to be overwritten by the package"""
         return []
 
-    def initconfig(self, pkg: "CachedCMakePackage", spec: Spec, prefix: Prefix) -> None:
+    def initconfig(
+        self, pkg: "CachedCMakePackage", spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         cache_entries = (
             self.std_initconfig_entries()
             + self.initconfig_compiler_entries()
@@ -388,10 +397,10 @@ def std_cmake_args(self):
         args.extend(["-C", self.cache_path])
         return args
 
-    @run_after("install")
+    @spack.phase_callbacks.run_after("install")
     def install_cmake_cache(self):
-        mkdirp(self.pkg.spec.prefix.share.cmake)
-        install(self.cache_path, self.pkg.spec.prefix.share.cmake)
+        fs.mkdirp(self.pkg.spec.prefix.share.cmake)
+        fs.install(self.cache_path, self.pkg.spec.prefix.share.cmake)
 
 
 class CachedCMakePackage(CMakePackage):

@@ -2,24 +2,21 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 
-from spack.package import (
-    EnvironmentModifications,
-    PackageBase,
-    Prefix,
-    Spec,
-    build_system,
-    depends_on,
-    install_tree,
-    register_builder,
-    run_after,
-    when,
-    working_dir,
-)
+import llnl.util.filesystem as fs
+
+import spack.builder
+import spack.package_base
+import spack.phase_callbacks
+import spack.spec
+import spack.util.environment
+import spack.util.prefix
+from spack.directives import build_system, depends_on
+from spack.multimethod import when
 
 from ._checks import BuilderWithDefaults, execute_install_time_tests
 
 
-class CargoPackage(PackageBase):
+class CargoPackage(spack.package_base.PackageBase):
     """Specialized class for packages built using cargo."""
 
     #: This attribute is used in UI queries that need to know the build
@@ -32,7 +29,7 @@ class CargoPackage(PackageBase):
     depends_on("rust", type="build")
 
 
-@register_builder("cargo")
+@spack.builder.builder("cargo")
 class CargoBuilder(BuilderWithDefaults):
     """The Cargo builder encodes the most common way of building software with
     a rust Cargo.toml file. It has two phases that can be overridden, if need be:
@@ -90,24 +87,30 @@ def check_args(self):
         """Argument for ``cargo test`` during check phase"""
         return []
 
-    def setup_build_environment(self, env: EnvironmentModifications) -> None:
+    def setup_build_environment(
+        self, env: spack.util.environment.EnvironmentModifications
+    ) -> None:
         env.set("CARGO_HOME", self.stage.path)
 
-    def build(self, pkg: CargoPackage, spec: Spec, prefix: Prefix) -> None:
+    def build(
+        self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Runs ``cargo install`` in the source directory"""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             pkg.module.cargo(
                 "install", "--root", "out", "--path", ".", *self.std_build_args, *self.build_args
             )
 
-    def install(self, pkg: CargoPackage, spec: Spec, prefix: Prefix) -> None:
+    def install(
+        self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Copy build files into package prefix."""
-        with working_dir(self.build_directory):
-            install_tree("out", prefix)
+        with fs.working_dir(self.build_directory):
+            fs.install_tree("out", prefix)
 
-    run_after("install")(execute_install_time_tests)
+    spack.phase_callbacks.run_after("install")(execute_install_time_tests)
 
     def check(self):
         """Run "cargo test"."""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             self.pkg.module.cargo("test", *self.check_args)

@@ -10,25 +10,20 @@
 from itertools import chain
 from typing import Any, List, Optional, Tuple
 
+import llnl.util.filesystem as fs
+from llnl.util import tty
 from llnl.util.lang import stable_partition
 
+import spack.builder
 import spack.deptypes as dt
+import spack.error
+import spack.package_base
+import spack.phase_callbacks
+import spack.spec
+import spack.util.prefix
 from spack import traverse
-from spack.package import (
-    InstallError,
-    PackageBase,
-    Prefix,
-    Spec,
-    build_system,
-    conflicts,
-    depends_on,
-    register_builder,
-    run_after,
-    tty,
-    variant,
-    when,
-    working_dir,
-)
+from spack.directives import build_system, conflicts, depends_on, variant
+from spack.multimethod import when
 from spack.util.environment import filter_system_paths
 
 from ._checks import BuilderWithDefaults, execute_build_time_tests
@@ -46,7 +41,7 @@ def _extract_primary_generator(generator):
     return _primary_generator_extractor.match(generator).group(1)
 
 
-def _maybe_set_python_hints(pkg: PackageBase, args: List[str]) -> None:
+def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]) -> None:
     """Set the PYTHON_EXECUTABLE, Python_EXECUTABLE, and Python3_EXECUTABLE CMake variables
     if the package has Python as build or link dep and ``find_python_hints`` is set to True. See
     ``find_python_hints`` for context."""
@@ -64,7 +59,7 @@ def _maybe_set_python_hints(pkg: PackageBase, args: List[str]) -> None:
     )
 
 
-def _supports_compilation_databases(pkg: PackageBase) -> bool:
+def _supports_compilation_databases(pkg: spack.package_base.PackageBase) -> bool:
     """Check if this package (and CMake) can support compilation databases."""
 
     # CMAKE_EXPORT_COMPILE_COMMANDS only exists for CMake >= 3.5
|
||||||
@ -78,7 +73,7 @@ def _supports_compilation_databases(pkg: PackageBase) -> bool:
|
|||||||
return True
|
return True
|
||||||
|
|
||||||
|
|
||||||
def _conditional_cmake_defaults(pkg: PackageBase, args: List[str]) -> None:
|
def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[str]) -> None:
|
||||||
"""Set a few default defines for CMake, depending on its version."""
|
"""Set a few default defines for CMake, depending on its version."""
|
||||||
cmakes = pkg.spec.dependencies("cmake", dt.BUILD)
|
cmakes = pkg.spec.dependencies("cmake", dt.BUILD)
|
||||||
|
|
||||||
@ -169,7 +164,7 @@ def _values(x):
|
|||||||
conflicts(f"generator={x}")
|
conflicts(f"generator={x}")
|
||||||
|
|
||||||
|
|
||||||
def get_cmake_prefix_path(pkg: PackageBase) -> List[str]:
|
def get_cmake_prefix_path(pkg: spack.package_base.PackageBase) -> List[str]:
|
||||||
"""Obtain the CMAKE_PREFIX_PATH entries for a package, based on the cmake_prefix_path package
|
"""Obtain the CMAKE_PREFIX_PATH entries for a package, based on the cmake_prefix_path package
|
||||||
attribute of direct build/test and transitive link dependencies."""
|
attribute of direct build/test and transitive link dependencies."""
|
||||||
edges = traverse.traverse_topo_edges_generator(
|
edges = traverse.traverse_topo_edges_generator(
|
||||||
@ -190,7 +185,7 @@ def get_cmake_prefix_path(pkg: PackageBase) -> List[str]:
|
|||||||
)
|
)
|
||||||
|
|
||||||
|
|
||||||
class CMakePackage(PackageBase):
|
class CMakePackage(spack.package_base.PackageBase):
|
||||||
"""Specialized class for packages built using CMake
|
"""Specialized class for packages built using CMake
|
||||||
|
|
||||||
For more information on the CMake build system, see:
|
For more information on the CMake build system, see:
|
||||||
@ -288,7 +283,7 @@ def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) ->
|
|||||||
return define_from_variant(self, cmake_var, variant)
|
return define_from_variant(self, cmake_var, variant)
|
||||||
|
|
||||||
|
|
||||||
@register_builder("cmake")
|
@spack.builder.builder("cmake")
|
||||||
class CMakeBuilder(BuilderWithDefaults):
|
class CMakeBuilder(BuilderWithDefaults):
|
||||||
"""The cmake builder encodes the default way of building software with CMake. IT
|
"""The cmake builder encodes the default way of building software with CMake. IT
|
||||||
has three phases that can be overridden:
|
has three phases that can be overridden:
|
||||||
@ -375,7 +370,9 @@ def std_cmake_args(self) -> List[str]:
|
|||||||
return args
|
return args
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
def std_args(pkg: PackageBase, generator: Optional[str] = None) -> List[str]:
|
def std_args(
|
||||||
|
pkg: spack.package_base.PackageBase, generator: Optional[str] = None
|
||||||
|
) -> List[str]:
|
||||||
"""Computes the standard cmake arguments for a generic package"""
|
"""Computes the standard cmake arguments for a generic package"""
|
||||||
default_generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
|
default_generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
|
||||||
generator = generator or default_generator
|
generator = generator or default_generator
|
||||||
@ -385,7 +382,7 @@ def std_args(pkg: PackageBase, generator: Optional[str] = None) -> List[str]:
|
|||||||
msg = "Invalid CMake generator: '{0}'\n".format(generator)
|
msg = "Invalid CMake generator: '{0}'\n".format(generator)
|
||||||
msg += "CMakePackage currently supports the following "
|
msg += "CMakePackage currently supports the following "
|
||||||
msg += "primary generators: '{0}'".format("', '".join(valid_primary_generators))
|
msg += "primary generators: '{0}'".format("', '".join(valid_primary_generators))
|
||||||
raise InstallError(msg)
|
raise spack.error.InstallError(msg)
|
||||||
|
|
||||||
try:
|
try:
|
||||||
build_type = pkg.spec.variants["build_type"].value
|
build_type = pkg.spec.variants["build_type"].value
|
||||||
@ -423,11 +420,11 @@ def std_args(pkg: PackageBase, generator: Optional[str] = None) -> List[str]:
|
|||||||
return args
|
return args
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
def define_cuda_architectures(pkg: PackageBase) -> str:
|
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
|
||||||
return define_cuda_architectures(pkg)
|
return define_cuda_architectures(pkg)
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
def define_hip_architectures(pkg: PackageBase) -> str:
|
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
|
||||||
return define_hip_architectures(pkg)
|
return define_hip_architectures(pkg)
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
@ -457,7 +454,9 @@ def cmake_args(self) -> List[str]:
|
|||||||
"""
|
"""
|
||||||
return []
|
return []
|
||||||
|
|
||||||
def cmake(self, pkg: CMakePackage, spec: Spec, prefix: Prefix) -> None:
|
def cmake(
|
||||||
|
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
|
||||||
|
) -> None:
|
||||||
"""Runs ``cmake`` in the build directory"""
|
"""Runs ``cmake`` in the build directory"""
|
||||||
|
|
||||||
if spec.is_develop:
|
if spec.is_develop:
|
||||||
@ -481,33 +480,37 @@ def cmake(self, pkg: CMakePackage, spec: Spec, prefix: Prefix) -> None:
|
|||||||
options = self.std_cmake_args
|
options = self.std_cmake_args
|
||||||
options += self.cmake_args()
|
options += self.cmake_args()
|
||||||
options.append(os.path.abspath(self.root_cmakelists_dir))
|
options.append(os.path.abspath(self.root_cmakelists_dir))
|
||||||
with working_dir(self.build_directory, create=True):
|
with fs.working_dir(self.build_directory, create=True):
|
||||||
pkg.module.cmake(*options)
|
pkg.module.cmake(*options)
|
||||||
|
|
||||||
def build(self, pkg: CMakePackage, spec: Spec, prefix: Prefix) -> None:
|
def build(
|
||||||
|
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
|
||||||
|
) -> None:
|
||||||
"""Make the build targets"""
|
"""Make the build targets"""
|
||||||
with working_dir(self.build_directory):
|
with fs.working_dir(self.build_directory):
|
||||||
if self.generator == "Unix Makefiles":
|
if self.generator == "Unix Makefiles":
|
||||||
pkg.module.make(*self.build_targets)
|
pkg.module.make(*self.build_targets)
|
||||||
elif self.generator == "Ninja":
|
elif self.generator == "Ninja":
|
||||||
self.build_targets.append("-v")
|
self.build_targets.append("-v")
|
||||||
pkg.module.ninja(*self.build_targets)
|
pkg.module.ninja(*self.build_targets)
|
||||||
|
|
||||||
def install(self, pkg: CMakePackage, spec: Spec, prefix: Prefix) -> None:
|
def install(
|
||||||
|
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
|
||||||
|
) -> None:
|
||||||
"""Make the install targets"""
|
"""Make the install targets"""
|
||||||
with working_dir(self.build_directory):
|
with fs.working_dir(self.build_directory):
|
||||||
if self.generator == "Unix Makefiles":
|
if self.generator == "Unix Makefiles":
|
||||||
pkg.module.make(*self.install_targets)
|
pkg.module.make(*self.install_targets)
|
||||||
elif self.generator == "Ninja":
|
elif self.generator == "Ninja":
|
||||||
pkg.module.ninja(*self.install_targets)
|
pkg.module.ninja(*self.install_targets)
|
||||||
|
|
||||||
run_after("build")(execute_build_time_tests)
|
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
|
||||||
|
|
||||||
def check(self) -> None:
|
def check(self) -> None:
|
||||||
"""Search the CMake-generated files for the targets ``test`` and ``check``,
|
"""Search the CMake-generated files for the targets ``test`` and ``check``,
|
||||||
and runs them if found.
|
and runs them if found.
|
||||||
"""
|
"""
|
||||||
with working_dir(self.build_directory):
|
with fs.working_dir(self.build_directory):
|
||||||
if self.generator == "Unix Makefiles":
|
if self.generator == "Unix Makefiles":
|
||||||
self.pkg._if_make_target_execute("test", jobs_env="CTEST_PARALLEL_LEVEL")
|
self.pkg._if_make_target_execute("test", jobs_env="CTEST_PARALLEL_LEVEL")
|
||||||
self.pkg._if_make_target_execute("check")
|
self.pkg._if_make_target_execute("check")
|
||||||
@ -554,7 +557,9 @@ def define(cmake_var: str, value: Any) -> str:
|
|||||||
return "".join(["-D", cmake_var, ":", kind, "=", value])
|
return "".join(["-D", cmake_var, ":", kind, "=", value])
|
||||||
|
|
||||||
|
|
||||||
def define_from_variant(pkg: PackageBase, cmake_var: str, variant: Optional[str] = None) -> str:
|
def define_from_variant(
|
||||||
|
pkg: spack.package_base.PackageBase, cmake_var: str, variant: Optional[str] = None
|
||||||
|
) -> str:
|
||||||
"""Return a CMake command line argument from the given variant's value.
|
"""Return a CMake command line argument from the given variant's value.
|
||||||
|
|
||||||
The optional ``variant`` argument defaults to the lower-case transform
|
The optional ``variant`` argument defaults to the lower-case transform
|
||||||
@ -614,7 +619,7 @@ def define_from_variant(pkg: PackageBase, cmake_var: str, variant: Optional[str]
|
|||||||
return define(cmake_var, value)
|
return define(cmake_var, value)
|
||||||
|
|
||||||
|
|
||||||
def define_hip_architectures(pkg: PackageBase) -> str:
|
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
|
||||||
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
|
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
|
||||||
|
|
||||||
``amdgpu_target`` is variant composed of a list of the target HIP
|
``amdgpu_target`` is variant composed of a list of the target HIP
|
||||||
@ -630,7 +635,7 @@ def define_hip_architectures(pkg: PackageBase) -> str:
|
|||||||
return ""
|
return ""
|
||||||
|
|
||||||
|
|
||||||
def define_cuda_architectures(pkg: PackageBase) -> str:
|
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
|
||||||
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
|
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
|
||||||
|
|
||||||
``cuda_arch`` is variant composed of a list of target CUDA architectures and
|
``cuda_arch`` is variant composed of a list of target CUDA architectures and
|
||||||
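The `define` helper that appears in the hunks above joins `-D`, the variable name, a type tag, and the value into a single CMake cache argument. A standalone sketch of that string construction follows; it is a simplified illustration, not Spack's exact implementation (the real helper handles more value kinds, but the bool-to-`ON`/`OFF` and semicolon-joined-list conventions shown here match the documented behavior):

```python
def define(cmake_var: str, value) -> str:
    """Build a -D<var>:<kind>=<value> CMake cache argument.

    Simplified sketch: booleans become BOOL ON/OFF, lists and tuples become
    semicolon-joined STRING values, everything else is stringified as STRING.
    """
    if isinstance(value, bool):
        kind, value = "BOOL", "ON" if value else "OFF"
    elif isinstance(value, (list, tuple)):
        kind, value = "STRING", ";".join(str(v) for v in value)
    else:
        kind, value = "STRING", str(value)
    return "".join(["-D", cmake_var, ":", kind, "=", value])


print(define("BUILD_SHARED_LIBS", True))  # -DBUILD_SHARED_LIBS:BOOL=ON
print(define("CMAKE_CUDA_ARCHITECTURES", ["70", "80"]))  # -DCMAKE_CUDA_ARCHITECTURES:STRING=70;80
```

This is why `define_from_variant` can return a single string per variant: the type tag and value formatting are decided once, in one place.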
@@ -8,16 +8,19 @@
 import sys
 from typing import Dict, List, Optional, Sequence, Tuple, Union

+import llnl.util.tty as tty
 from llnl.util.lang import classproperty, memoized

+import spack
 import spack.compilers.error
-from spack.package import Executable, PackageBase, ProcessError, Spec, tty, which_string
+import spack.package_base
+import spack.util.executable

 # Local "type" for type hints
 Path = Union[str, pathlib.Path]


-class CompilerPackage(PackageBase):
+class CompilerPackage(spack.package_base.PackageBase):
     """A Package mixin for all common logic for packages that implement compilers"""

     # TODO: how do these play nicely with other tags
@@ -49,7 +52,7 @@ class CompilerPackage(PackageBase):
     #: Flags for generating debug information
     debug_flags: Sequence[str] = []

-    def __init__(self, spec: Spec):
+    def __init__(self, spec: "spack.spec.Spec"):
         super().__init__(spec)
         msg = f"Supported languages for {spec} are not a subset of possible supported languages"
         msg += f" supports: {self.supported_languages}, valid values: {self.compiler_languages}"
@@ -94,7 +97,7 @@ def determine_version(cls, exe: Path) -> str:
             match = re.search(cls.compiler_version_regex, output)
             if match:
                 return ".".join(match.groups())
-        except ProcessError:
+        except spack.util.executable.ProcessError:
             pass
         except Exception as e:
             tty.debug(
@@ -227,7 +230,7 @@ def _compiler_output(
         compiler_path: path of the compiler to be invoked
         version_argument: the argument used to extract version information
     """
-    compiler = Executable(compiler_path)
+    compiler = spack.util.executable.Executable(compiler_path)
     if not version_argument:
         return compiler(
             output=str, error=str, ignore_errors=ignore_errors, timeout=120, fail_on_error=True
@@ -250,7 +253,7 @@ def compiler_output(
     # not just executable name. If we don't do this, and the path changes
     # (e.g., during testing), we can get incorrect results.
     if not os.path.isabs(compiler_path):
-        compiler_path = which_string(str(compiler_path), required=True)
+        compiler_path = spack.util.executable.which_string(str(compiler_path), required=True)

     return _compiler_output(
         compiler_path, version_argument=version_argument, ignore_errors=ignore_errors
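The `determine_version` hunk above shows the pattern shared by all compiler packages: run the compiler's version query, match the output against the class's `compiler_version_regex`, and join the captured groups with dots. A self-contained sketch of that extraction step (the regex below is a made-up example value, not any real package's `compiler_version_regex`):

```python
import re

# Hypothetical regex in the style of a compiler package's
# compiler_version_regex class attribute.
compiler_version_regex = r"version (\d+)\.(\d+)\.(\d+)"


def determine_version(output: str) -> str:
    """Extract a dotted version string from compiler output, or "" on no match,
    mirroring the match-then-join pattern shown in the diff above."""
    match = re.search(compiler_version_regex, output)
    return ".".join(match.groups()) if match else ""


print(determine_version("cc version 12.3.0 (demo)"))  # 12.3.0
```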
@@ -5,7 +5,10 @@
 import re
 from typing import Iterable, List

-from spack.package import PackageBase, any_combination_of, conflicts, depends_on, variant, when
+import spack.variant
+from spack.directives import conflicts, depends_on, variant
+from spack.multimethod import when
+from spack.package_base import PackageBase


 class CudaPackage(PackageBase):
@@ -68,7 +71,7 @@ class CudaPackage(PackageBase):
     variant(
         "cuda_arch",
         description="CUDA architecture",
-        values=any_combination_of(*cuda_arch_values),
+        values=spack.variant.any_combination_of(*cuda_arch_values),
         sticky=True,
         when="+cuda",
     )
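The `cuda_arch` variant above feeds `define_cuda_architectures` (documented in the cmake hunks earlier), which expands the selected architectures into one `-DCMAKE_CUDA_ARCHITECTURES:STRING=...` define. A minimal sketch of that expansion, assuming the values arrive as a plain list of strings; this is an illustration of the documented output format, not Spack's exact code:

```python
def define_cuda_architectures(cuda_arch_values) -> str:
    """Turn a list of CUDA arch values (e.g. ["70", "80"] from the cuda_arch
    variant) into the CMake cache define described in the docstring above.
    Returns "" when nothing is selected, mirroring the documented behavior."""
    if not cuda_arch_values:
        return ""
    return "-DCMAKE_CUDA_ARCHITECTURES:STRING={}".format(";".join(cuda_arch_values))


print(define_cuda_architectures(["70", "80"]))  # -DCMAKE_CUDA_ARCHITECTURES:STRING=70;80
```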
@@ -3,12 +3,17 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 from typing import Tuple

-from spack.package import PackageBase, Prefix, Spec, build_system, register_builder, run_after
+import spack.builder
+import spack.directives
+import spack.package_base
+import spack.phase_callbacks
+import spack.spec
+import spack.util.prefix

 from ._checks import BuilderWithDefaults, apply_macos_rpath_fixups, execute_install_time_tests


-class Package(PackageBase):
+class Package(spack.package_base.PackageBase):
     """General purpose class with a single ``install`` phase that needs to be
     coded by packagers.
     """
@@ -19,10 +24,10 @@ class Package(PackageBase):
     #: Legacy buildsystem attribute used to deserialize and install old specs
     legacy_buildsystem = "generic"

-    build_system("generic")
+    spack.directives.build_system("generic")


-@register_builder("generic")
+@spack.builder.builder("generic")
 class GenericBuilder(BuilderWithDefaults):
     """A builder for a generic build system, that require packagers
     to implement an "install" phase.
@@ -41,10 +46,12 @@ class GenericBuilder(BuilderWithDefaults):
     install_time_test_callbacks = []

     # On macOS, force rpaths for shared library IDs and remove duplicate rpaths
-    run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
+    spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)

     # unconditionally perform any post-install phase tests
-    run_after("install")(execute_install_time_tests)
+    spack.phase_callbacks.run_after("install")(execute_install_time_tests)

-    def install(self, pkg: Package, spec: Spec, prefix: Prefix) -> None:
+    def install(
+        self, pkg: Package, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         raise NotImplementedError
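The recurring `spack.phase_callbacks.run_after("install")(...)` lines in these hunks register functions to run once a build phase completes. A minimal pure-Python sketch of such a phase-callback registry, to show the mechanics; this is an assumed, simplified model for illustration, not Spack's internals (which also support `when=` conditions, as seen above):

```python
from collections import defaultdict

# Callbacks recorded per phase name, in registration order.
_callbacks = defaultdict(list)


def run_after(phase):
    """Return a decorator-style registrar, usable either as a decorator or
    called directly as run_after("install")(callback), matching the idiom
    in the diff above."""
    def register(fn):
        _callbacks[phase].append(fn)
        return fn
    return register


def run_phase(phase, log):
    """Execute a phase, then fire every callback registered for it."""
    log.append(phase)
    for fn in _callbacks[phase]:
        fn(log)


run_after("install")(lambda log: log.append("post-install checks"))

events = []
run_phase("install", events)
print(events)  # ['install', 'post-install checks']
```

The direct-call form is what makes lines like `run_after("install")(execute_install_time_tests)` work: the shared `_checks` helpers can be attached to many builders without redefining them as decorated methods.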
@@ -4,11 +4,11 @@

 from typing import Optional

+import spack.package_base
 import spack.util.url
-from spack.package import PackageBase


-class GNUMirrorPackage(PackageBase):
+class GNUMirrorPackage(spack.package_base.PackageBase):
     """Mixin that takes care of setting url and mirrors for GNU packages."""

     #: Path of the package in a GNU mirror
@@ -2,26 +2,21 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

-from spack.package import (
-    EnvironmentModifications,
-    PackageBase,
-    Prefix,
-    Spec,
-    build_system,
-    depends_on,
-    install,
-    join_path,
-    mkdirp,
-    register_builder,
-    run_after,
-    when,
-    working_dir,
-)
+import llnl.util.filesystem as fs
+
+import spack.builder
+import spack.package_base
+import spack.phase_callbacks
+import spack.spec
+import spack.util.environment
+import spack.util.prefix
+from spack.directives import build_system, depends_on
+from spack.multimethod import when

 from ._checks import BuilderWithDefaults, execute_install_time_tests


-class GoPackage(PackageBase):
+class GoPackage(spack.package_base.PackageBase):
     """Specialized class for packages built using the Go toolchain."""

     #: This attribute is used in UI queries that need to know the build
@@ -37,7 +32,7 @@ class GoPackage(PackageBase):
     depends_on("go", type="build")


-@register_builder("go")
+@spack.builder.builder("go")
 class GoBuilder(BuilderWithDefaults):
     """The Go builder encodes the most common way of building software with
     a golang go.mod file. It has two phases that can be overridden, if need be:
@@ -74,10 +69,12 @@ class GoBuilder(BuilderWithDefaults):
     #: Callback names for install-time test
     install_time_test_callbacks = ["check"]

-    def setup_build_environment(self, env: EnvironmentModifications) -> None:
+    def setup_build_environment(
+        self, env: spack.util.environment.EnvironmentModifications
+    ) -> None:
         env.set("GO111MODULE", "on")
         env.set("GOTOOLCHAIN", "local")
-        env.set("GOPATH", join_path(self.pkg.stage.path, "go"))
+        env.set("GOPATH", fs.join_path(self.pkg.stage.path, "go"))

     @property
     def build_directory(self):
@@ -103,20 +100,24 @@ def check_args(self):
         """Argument for ``go test`` during check phase"""
         return []

-    def build(self, pkg: GoPackage, spec: Spec, prefix: Prefix) -> None:
+    def build(
+        self, pkg: GoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Runs ``go build`` in the source directory"""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             pkg.module.go("build", *self.build_args)

-    def install(self, pkg: GoPackage, spec: Spec, prefix: Prefix) -> None:
+    def install(
+        self, pkg: GoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Install built binaries into prefix bin."""
-        with working_dir(self.build_directory):
-            mkdirp(prefix.bin)
-            install(pkg.name, prefix.bin)
+        with fs.working_dir(self.build_directory):
+            fs.mkdirp(prefix.bin)
+            fs.install(pkg.name, prefix.bin)

-    run_after("install")(execute_install_time_tests)
+    spack.phase_callbacks.run_after("install")(execute_install_time_tests)

     def check(self):
         """Run ``go test .`` in the source directory"""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             self.pkg.module.go("test", *self.check_args)
1380  var/spack/repos/spack_repo/builtin/build_systems/intel.py  (new file; diff suppressed because it is too large)
@@ -3,23 +3,19 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import os

-from spack.package import (
-    Builder,
-    EnvironmentModifications,
-    Executable,
-    PackageBase,
-    Prefix,
-    Spec,
-    build_system,
-    depends_on,
-    extends,
-    find,
-    register_builder,
-    when,
-)
+from llnl.util.filesystem import find
+
+import spack.builder
+import spack.package_base
+import spack.spec
+import spack.util.environment
+import spack.util.executable
+import spack.util.prefix
+from spack.directives import build_system, depends_on, extends
+from spack.multimethod import when


-class LuaPackage(PackageBase):
+class LuaPackage(spack.package_base.PackageBase):
     """Specialized class for lua packages"""

     #: This attribute is used in UI queries that need to know the build
@@ -44,16 +40,16 @@ class LuaPackage(PackageBase):

     @property
     def lua(self):
-        return Executable(self.spec["lua-lang"].prefix.bin.lua)
+        return spack.util.executable.Executable(self.spec["lua-lang"].prefix.bin.lua)

     @property
     def luarocks(self):
-        lr = Executable(self.spec["lua-lang"].prefix.bin.luarocks)
+        lr = spack.util.executable.Executable(self.spec["lua-lang"].prefix.bin.luarocks)
         return lr


-@register_builder("lua")
-class LuaBuilder(Builder):
+@spack.builder.builder("lua")
+class LuaBuilder(spack.builder.Builder):
     phases = ("unpack", "generate_luarocks_config", "preprocess", "install")

     #: Names associated with package methods in the old build-system format
@@ -62,7 +58,9 @@ class LuaBuilder(Builder):
     #: Names associated with package attributes in the old build-system format
     legacy_attributes = ()

-    def unpack(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
+    def unpack(
+        self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         if os.path.splitext(pkg.stage.archive_file)[1] == ".rock":
             directory = pkg.luarocks("unpack", pkg.stage.archive_file, output=str)
             dirlines = directory.split("\n")
@@ -73,7 +71,9 @@ def unpack(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
     def _generate_tree_line(name, prefix):
         return """{{ name = "{name}", root = "{prefix}" }};""".format(name=name, prefix=prefix)

-    def generate_luarocks_config(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
+    def generate_luarocks_config(
+        self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         spec = self.pkg.spec
         table_entries = []
         for d in spec.traverse(deptype=("build", "run")):
@@ -92,14 +92,18 @@ def generate_luarocks_config(self, pkg: LuaPackage, spec: Prefix)
             )
         )

-    def preprocess(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
+    def preprocess(
+        self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Override this to preprocess source before building with luarocks"""
         pass

     def luarocks_args(self):
         return []

-    def install(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
+    def install(
+        self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         rock = "."
         specs = find(".", "*.rockspec", recursive=False)
         if specs:
@@ -111,5 +115,7 @@ def install(self, pkg: LuaPackage, spec: Spec, prefix: Prefix) -> None:
     def _luarocks_config_path(self):
         return os.path.join(self.pkg.stage.source_path, "spack_luarocks.lua")

-    def setup_build_environment(self, env: EnvironmentModifications) -> None:
+    def setup_build_environment(
+        self, env: spack.util.environment.EnvironmentModifications
+    ) -> None:
         env.set("LUAROCKS_CONFIG", self._luarocks_config_path())
|
@@ -3,18 +3,15 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 from typing import List

-from spack.package import (
-    PackageBase,
-    Prefix,
-    Spec,
-    build_system,
-    conflicts,
-    depends_on,
-    register_builder,
-    run_after,
-    when,
-    working_dir,
-)
+import llnl.util.filesystem as fs
+
+import spack.builder
+import spack.package_base
+import spack.phase_callbacks
+import spack.spec
+import spack.util.prefix
+from spack.directives import build_system, conflicts, depends_on
+from spack.multimethod import when

 from ._checks import (
     BuilderWithDefaults,
@@ -24,7 +21,7 @@
 )


-class MakefilePackage(PackageBase):
+class MakefilePackage(spack.package_base.PackageBase):
     """Specialized class for packages built using Makefiles."""

     #: This attribute is used in UI queries that need to know the build
@@ -40,7 +37,7 @@ class MakefilePackage(PackageBase):
     depends_on("gmake", type="build")


-@register_builder("makefile")
+@spack.builder.builder("makefile")
 class MakefileBuilder(BuilderWithDefaults):
     """The Makefile builder encodes the most common way of building software with
     Makefiles. It has three phases that can be overridden, if need be:
@@ -100,36 +97,42 @@ def build_directory(self) -> str:
         """Return the directory containing the main Makefile."""
         return self.pkg.stage.source_path

-    def edit(self, pkg: MakefilePackage, spec: Spec, prefix: Prefix) -> None:
+    def edit(
+        self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Edit the Makefile before calling make. The default is a no-op."""
         pass

-    def build(self, pkg: MakefilePackage, spec: Spec, prefix: Prefix) -> None:
+    def build(
+        self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Run "make" on the build targets specified by the builder."""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             pkg.module.make(*self.build_targets)

-    def install(self, pkg: MakefilePackage, spec: Spec, prefix: Prefix) -> None:
+    def install(
+        self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Run "make" on the install targets specified by the builder."""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             pkg.module.make(*self.install_targets)

-    run_after("build")(execute_build_time_tests)
+    spack.phase_callbacks.run_after("build")(execute_build_time_tests)

     def check(self) -> None:
         """Run "make" on the ``test`` and ``check`` targets, if found."""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             self.pkg._if_make_target_execute("test")
             self.pkg._if_make_target_execute("check")

-    run_after("install")(execute_install_time_tests)
+    spack.phase_callbacks.run_after("install")(execute_install_time_tests)

     def installcheck(self) -> None:
         """Searches the Makefile for an ``installcheck`` target
         and runs it if found.
         """
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             self.pkg._if_make_target_execute("installcheck")

     # On macOS, force rpaths for shared library IDs and remove duplicate rpaths
-    run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
+    spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
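One side of the diff registers builders with the decorator spelled `@spack.builder.builder("makefile")`, the other with a `register_builder` name re-exported from `spack.package`. The decorator-based registry pattern behind both spellings can be sketched in a few lines (a simplified, self-contained stand-in; `BUILDERS` and `builder` here are illustrative names, not Spack's internals):

```python
# Map a build-system name to the builder class that implements it.
BUILDERS = {}


def builder(build_system_name):
    """Return a class decorator that registers its target under the given name."""

    def decorator(cls):
        BUILDERS[build_system_name] = cls
        return cls  # the class itself is returned unchanged

    return decorator


@builder("makefile")
class MakefileBuilder:
    # The three overridable phases named in the docstring above.
    phases = ("edit", "build", "install")


print(BUILDERS["makefile"].__name__)
```

Because the decorator returns the class unmodified, registration is a pure side effect: importing the module is enough to make the builder discoverable by name.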
@@ -1,23 +1,20 @@
 # Copyright Spack Project Developers. See COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
-from spack.package import (
-    PackageBase,
-    Prefix,
-    Spec,
-    build_system,
-    depends_on,
-    install_tree,
-    register_builder,
-    when,
-    which,
-    working_dir,
-)
+import llnl.util.filesystem as fs
+
+import spack.builder
+import spack.package_base
+import spack.spec
+import spack.util.prefix
+from spack.directives import build_system, depends_on
+from spack.multimethod import when
+from spack.util.executable import which

 from ._checks import BuilderWithDefaults


-class MavenPackage(PackageBase):
+class MavenPackage(spack.package_base.PackageBase):
     """Specialized class for packages that are built using the
     Maven build system. See https://maven.apache.org/index.html
     for more information.
@@ -37,7 +34,7 @@ class MavenPackage(PackageBase):
     depends_on("maven", type="build")


-@register_builder("maven")
+@spack.builder.builder("maven")
 class MavenBuilder(BuilderWithDefaults):
     """The Maven builder encodes the default way to build software with Maven.
     It has two phases that can be overridden, if need be:
@@ -63,16 +60,20 @@ def build_args(self):
         """List of args to pass to build phase."""
         return []

-    def build(self, pkg: MavenPackage, spec: Spec, prefix: Prefix) -> None:
+    def build(
+        self, pkg: MavenPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Compile code and package into a JAR file."""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             mvn = which("mvn", required=True)
             if self.pkg.run_tests:
                 mvn("verify", *self.build_args())
             else:
                 mvn("package", "-DskipTests", *self.build_args())

-    def install(self, pkg: MavenPackage, spec: Spec, prefix: Prefix) -> None:
+    def install(
+        self, pkg: MavenPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Copy to installation prefix."""
-        with working_dir(self.build_directory):
-            install_tree(".", prefix)
+        with fs.working_dir(self.build_directory):
+            fs.install_tree(".", prefix)
@@ -4,24 +4,20 @@
 import os
 from typing import List

-from spack.package import (
-    PackageBase,
-    Prefix,
-    Spec,
-    build_system,
-    conflicts,
-    depends_on,
-    register_builder,
-    run_after,
-    variant,
-    when,
-    working_dir,
-)
+import llnl.util.filesystem as fs
+
+import spack.builder
+import spack.package_base
+import spack.phase_callbacks
+import spack.spec
+import spack.util.prefix
+from spack.directives import build_system, conflicts, depends_on, variant
+from spack.multimethod import when

 from ._checks import BuilderWithDefaults, execute_build_time_tests


-class MesonPackage(PackageBase):
+class MesonPackage(spack.package_base.PackageBase):
     """Specialized class for packages built using Meson. For more information
     on the Meson build system, see https://mesonbuild.com/
     """
@@ -70,7 +66,7 @@ def flags_to_build_system_args(self, flags):
     setattr(self, "meson_flag_args", [])


-@register_builder("meson")
+@spack.builder.builder("meson")
 class MesonBuilder(BuilderWithDefaults):
     """The Meson builder encodes the default way to build software with Meson.
     The builder has three phases that can be overridden, if need be:
@@ -194,7 +190,9 @@ def meson_args(self) -> List[str]:
         """
         return []

-    def meson(self, pkg: MesonPackage, spec: Spec, prefix: Prefix) -> None:
+    def meson(
+        self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Run ``meson`` in the build directory"""
         options = []
         if self.spec["meson"].satisfies("@0.64:"):
@@ -202,25 +200,29 @@ def meson(self, pkg: MesonPackage, spec: Spec, prefix: Prefix) -> None:
         options.append(os.path.abspath(self.root_mesonlists_dir))
         options += self.std_meson_args
         options += self.meson_args()
-        with working_dir(self.build_directory, create=True):
+        with fs.working_dir(self.build_directory, create=True):
             pkg.module.meson(*options)

-    def build(self, pkg: MesonPackage, spec: Spec, prefix: Prefix) -> None:
+    def build(
+        self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Make the build targets"""
         options = ["-v"]
         options += self.build_targets
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             pkg.module.ninja(*options)

-    def install(self, pkg: MesonPackage, spec: Spec, prefix: Prefix) -> None:
+    def install(
+        self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
+    ) -> None:
         """Make the install targets"""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             pkg.module.ninja(*self.install_targets)

-    run_after("build")(execute_build_time_tests)
+    spack.phase_callbacks.run_after("build")(execute_build_time_tests)

     def check(self) -> None:
         """Search Meson-generated files for the target ``test`` and run it if found."""
-        with working_dir(self.build_directory):
+        with fs.working_dir(self.build_directory):
             self.pkg._if_ninja_target_execute("test")
             self.pkg._if_ninja_target_execute("check")
Some files were not shown because too many files have changed in this diff.