Compare commits


54 Commits

Author SHA1 Message Date
jmlapre
05ce2c7766
sst: update core, elements, macro to 15.0.0 (#50473) 2025-05-21 14:18:27 -07:00
Thomas Madlener
d8c819f3b8
podio, edm4hep: Add versions 1.3 and 0.99.2 and make sure EDM4hep still builds (#50489)
* Add tag for podio v01-03

* Update minimal root version for RNTuple support

* Make sure clang-format does not interfere with code generation

* Add edm4hep version 0.99.2 and new podio version dependency

* Keep root versions ordered for dependencies

Co-authored-by: Juan Miguel Carceller <22276694+jmcarcell@users.noreply.github.com>

---------

Co-authored-by: Juan Miguel Carceller <22276694+jmcarcell@users.noreply.github.com>
2025-05-21 14:06:45 -05:00
Afzal Patel
4d563acd1b
bump up the version for ROCm-6.4.0 (#50037)
* bump up the version for ROCm-6.4.0
* update rocm-openmp-extras
* fix for rocprofiler-sdk
* fix hipblaslt ci fail
* bump hipsparselt
* set clang to use llvm-amdgpu
* modify rocm-smi-lib and update hip-tensor
* miopen-hip: replace patch with commit
* miopen-hip: remove patch
* fix rocdecode libdrm error
* rocjpeg fix libdrm error
* rocm-bandwidth-test and amdsmi: add c dependency
* add explicit roctracer-dev dependency
* fix issue with rocm-openmp-extras using external
* add c dependency to rocm-debug-agent and rocm-dbgapi
* rocm-debug-agent: set clang as compiler
* hip-tensor: add c dependency
* hipsparselt: modify patch
* hipsparselt: fix ROCM_SMI_PATH
* hipsparselt: add c dependency
* rocm-validation-suite: add patch for libdrm
* rdc: remove rocm-smi-lib dependency
* rocwmma: replace patch
* rocwmma: remove old patch
* rocwmma: modify patch
* add c dependencies
* add c dependency to rocwmma and rocm-examples
* rocAL: force use of spack python
* rdc: add c dependency
* roctracer-dev: use llvm-amdgpu clang++
* rocm-tensile: add patch
* rocm-cmake: fix standalone test
* rocblas: fix client test error
* roctracer-dev: fix ci fail
* hipblaslt: add patch for client test
* hipfort: add c dependency
* py-tensorflow: restrict to ROCm 6.3
2025-05-21 11:53:30 -07:00
Julien Cortial
7d27e11698
vtk: add patch for missing includes (#50248)
Adds an upstream VTK patch for versions 9.1 and 9.2 that supplies a missing include, without which VTK fails to compile with modern compilers.
2025-05-21 14:27:15 -04:00
Adam J. Stewart
cac7684faa
py-matplotlib: add v3.10.3 (#50404) 2025-05-21 11:00:14 -07:00
Adam J. Stewart
8caba599af
py-scipy: add v1.15.3 (#50395) 2025-05-21 10:52:38 -07:00
Buldram
b542f379d5
chafa: add v1.16.1 (#50588) 2025-05-21 10:49:38 -07:00
Mark Abraham
a8aeb17e37
gromacs: support new releases (#50509)
* gromacs: support new releases

Also removed deprecated releases, per previously published policy, and marked new branches as deprecated.

* [@spackbot] updating style on behalf of mabraham
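For context on the deprecation mentioned above: in a Spack package this is normally just a flag on the version directive. A hedged sketch follows, with illustrative versions and placeholder checksums rather than the actual GROMACS entries:

```python
    # Fragment only, not the real GROMACS recipe; versions and checksums are
    # illustrative. Deprecated versions remain installable but warn users and
    # are candidates for removal in a later release, per the published policy.
    version("2025.1", sha256="<placeholder>")
    version("2021.7", sha256="<placeholder>", deprecated=True)
```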

---------

Co-authored-by: mabraham <mabraham@users.noreply.github.com>
2025-05-21 07:34:42 -10:00
Sam Reeve
8bcbe52b01
additivefoam: add 1.1 and website (#50551)
* additivefoam: add new version and website
* additivefoam: use spack prefix for bin & lib build path
* additivefoam: update logic for version asset selection

---------

Co-authored-by: Gerry Knapp <knappgl@ornl.gov>
2025-05-21 10:13:01 -07:00
Alec Scott
ecb02c1fc6
typescript: add v5.8.3 (#50574) 2025-05-21 11:06:24 -06:00
Afzal Patel
1f74ac5188
opencv: add pkgconfig dependency when +ffmpeg (#50457)
* opencv: add pkgconfig dependency when +ffmpeg

* add type build
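As a hedged sketch, the directive this kind of change typically adds inside the OpenCV package class looks roughly like this (fragment only, not the complete package file):

```python
    # pkg-config is only needed to locate FFmpeg at build time, so the
    # dependency is conditional on the +ffmpeg variant and build-only.
    depends_on("pkgconfig", when="+ffmpeg", type="build")
```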
2025-05-21 16:37:26 +02:00
Alberto Invernizzi
23ba489e06
neovim: add 0.10.3, 0.10.4 and 0.11 + drop old deps (#50123)
* add newer neovim versions + drop dependencies and add new ones

* waiting to fix/update some luajit problems on macOS ARM64

* Update var/spack/repos/builtin/packages/neovim/package.py

Co-authored-by: Felix Thaler <thaler@cscs.ch>

---------

Co-authored-by: Felix Thaler <thaler@cscs.ch>
2025-05-21 08:54:15 -04:00
Massimiliano Culpo
5879724a2a
builtin: remove spack.variant imports (#50576)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-05-21 11:35:06 +02:00
Harmen Stoppels
bfc52d6f50
spack repo migrate: fix order of if-elif (#50579) 2025-05-21 11:31:26 +02:00
Harmen Stoppels
0107792a9e
archspec: change module from archspec to _vendoring.archspec (#50450) 2025-05-21 11:25:02 +02:00
Massimiliano Culpo
de6f07094e
builtin: remove unnecessary use of archspec.cpu (#50582)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-05-21 10:09:21 +02:00
Harmen Stoppels
63c60c18c7
ci: sync packages to python/spack_repo instead of spack_repo/ (#50589) 2025-05-21 09:10:01 +02:00
Alex Richert
f01a442ad4
cmd/mirror.py: match CLI specs to concrete specs in env (#50307)
* cmd/mirror.py: match CLI specs to concrete specs in env

* Allow using 'mirror create -a/--all' to match multiple concrete specs

* remove unit test preventing 'mirror create -a <spec>'

* Add test_mirror_spec_from_env() unit test (cmd/mirror.py)
2025-05-21 05:40:14 +00:00
Alberto Sartori
cca0eb6873
justbuild: add v1.5.1 and v1.5.2 (#50566)
* justbuild: add v1.5.1
* justbuild: add v1.5.2
2025-05-20 23:33:40 -06:00
Todd Gamblin
c39725a9e6
bugfix: add build system imports to new packages (#50587)
Since #50452, build systems are no longer in core and need their own imports.

Specifically, in addition to:

```python
from spack.package import *
```

you now also need, e.g.:

```python
from spack_repo.builtin.build_systems.python import PythonPackage
```

Or similar for other build systems, or things will break.
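
For instance, a minimal new-style Python package would start roughly like this. This is a hypothetical sketch; the package name, URL, version, and checksum are placeholders, not part of this changeset:

```python
# Hypothetical minimal package file under the new layout.
from spack_repo.builtin.build_systems.python import PythonPackage

from spack.package import *


class PyExample(PythonPackage):
    """Placeholder package showing both imports that new packages now need."""

    homepage = "https://example.org"
    pypi = "example/example-1.0.tar.gz"

    version("1.0", sha256="<placeholder>")
```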

- [x] Fix `py-deprecat` package
- [x] Fix `cryoef` package

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-05-20 18:26:24 -07:00
Daniele Colombo
e58351f421
py-maturin: update rust dependency version (#50485) 2025-05-20 17:18:40 -07:00
Daniele Colombo
2a028144ba
cryoef: new package (#50486) 2025-05-20 17:05:34 -07:00
Adam J. Stewart
22a7419235
py-numpy: add v2.2.6 (#50524) 2025-05-20 16:28:04 -07:00
Briffou
7e21357045
mgis: wrong version for tfel 5.0.0 (#50523) 2025-05-20 16:26:55 -07:00
Adam J. Stewart
cf7e1643c3
py-distributed: Python 3.12+ not supported before 2022.6.1 (#50525) 2025-05-20 16:25:22 -07:00
Adam J. Stewart
158ddf72cf
py-datacube: add v1.9.3 (#50526) 2025-05-20 16:22:11 -07:00
Alec Scott
9567b13b4f
direnv: add v2.36.0 (#50527)
* direnv: add v2.36.0
* Fix go version constraint
2025-05-20 16:19:22 -07:00
Alec Scott
f6da60b541
fzf: add v0.62.0 (#50528) 2025-05-20 16:18:14 -07:00
Alec Scott
3caff2c5a0
goimports: add v0.33.0 (#50529)
* goimports: add v0.33.0
* Fix depends_on go version constraint
2025-05-20 16:12:08 -07:00
Alec Scott
091a8a4734
gopls: add v0.18.1 (#50530) 2025-05-20 16:11:10 -07:00
Alec Scott
cfcee7d092
hugo: add v0.147.3 (#50531) 2025-05-20 16:09:23 -07:00
Alec Scott
c6d04286c5
kubectl: add v1.33.1 (#50532)
* kubectl: add v1.33.1
* Fix go version constraint
2025-05-20 16:08:28 -07:00
Wouter Deconinck
3c3e1c6d30
rdma-core: add through v57.0 (#50535) 2025-05-20 16:06:40 -07:00
Buldram
3669f0356b
chafa: add v1.16.0 (#50536) 2025-05-20 16:05:12 -07:00
Eddie Nolan
dd72df89d6
mpfr: Apply patches for versions 4.1.0-4.2.1 (#50537)
In particular, the 4.2.1 patches address a bug that prevents building
gcc with recent clang and glibc.

See details here: https://www.mpfr.org/mpfr-4.2.1/#fixed
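
Upstream fixes like these are typically pulled in with per-version patch directives in the package file. A hedged sketch follows; the URL and checksum are placeholders, not the entries added by this commit:

```python
    # Illustrative only: apply an upstream fix to the affected release.
    patch(
        "https://example.org/mpfr-4.2.1-fixes.patch",  # placeholder URL
        sha256="<placeholder>",
        when="@4.2.1",
    )
```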
2025-05-20 16:01:17 -07:00
Fernando Ayats
445667adbe
py-py-spy: fix linkage with libunwind (#50542) 2025-05-20 15:58:17 -07:00
Adam J. Stewart
3ec4797513
py-shapely: add v2.1.1 (#50544) 2025-05-20 15:55:59 -07:00
Dennis Klein
50e8f6395c
fairlogger: add new v2.2.0 (#50545)
* remove deprecated
   * remove obsolete conditions
* deprecate remaining `@:1`
2025-05-20 15:54:58 -07:00
Dennis Klein
bf9426a48d
fairmq: conflicts boost@1.88: (#50546) 2025-05-20 15:50:30 -07:00
Wouter Deconinck
bf08b1e2c6
openldap: depends_on uuid (#50557) 2025-05-20 15:43:36 -07:00
Wouter Deconinck
687137a057
e2fsprogs: depends_on uuid (#50559) 2025-05-20 15:41:51 -07:00
Patrick Lavin
6c4c9985d5
sst-core: depends on c (#50561) 2025-05-20 15:30:05 -07:00
Harmen Stoppels
2927e708bc
PackageBase: make _update_external_dependencies private (#50580) 2025-05-20 15:27:35 -07:00
Bram Veenboer
a9d3bd8d7f
Update pynvml package (#50398)
* Add dependency on Python <= 3.11 for 8.0.4
   The SafeConfigParser was removed in Python 3.12.
* Add version 11.5.3
* Add version 12.2.0
* Update order of versions from newest to oldest
* Remove unneeded requirement on python@3.6
   Since Spack only supports Python 3.6 or newer anyway.

* Update license to BSD-3-Clause from version 12 onwards

* Set minimum Python version to 3.9 from version 12 onwards

* Add py-nvidia-ml-py dependency from version 12 onwards

* Update py-nvidia-ml-py package

- Change license to BSD-2-Clause
- Add many more versions

* Apply black formatting

* Add url_for_version helper function

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* Remove spaces on empty line

* Apply spack style

* Move depends_on directive
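The first bullet above pins the 8.0.4 release to Python 3.11 or older because SafeConfigParser was removed in Python 3.12. Expressed as a Spack directive, that constraint looks roughly like the following hedged sketch (not the exact line from the package file):

```python
    # Illustrative only: the old 8.0.4 release still uses SafeConfigParser,
    # which was removed in Python 3.12, so cap the Python version there.
    depends_on("python@:3.11", when="@8.0.4", type=("build", "run"))
```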

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-05-20 15:14:03 -07:00
Harmen Stoppels
64fc66ab48
ci: import-check continue-on-error: true (#50581)
Allow other CI checks to continue even if the circular import check fails.

Circular imports are often indicative of code that needs to be restructured, but there
are still places where we need to add them. This allows CI to continue, so that if CI is
passing *except* for a problematic import, we can still force merge and accept a bit
of technical debt to admit a useful feature.  Also, this allows people to keep working
while they fix their circular import issues, without being blind to CI results.

- [x] update ci workflow: import-check continue-on-error: true
2025-05-20 15:50:14 -06:00
Harmen Stoppels
c86b2860aa
docs: fix a few typos (#50573)
* docs: fix a few typos

* fix a few issues in packaging_guide.rst

* more fixes
2025-05-20 13:24:54 -04:00
Seth R. Johnson
d991ebbe09
celeritas: new versions through 0.6 (#50197)
* Add v0.5.2

* celeritas: new version

* celeritas: new version 0.6

* Add sanity check

* celeritas: add Ben as maintainer

* Enable celeritas covfie variant

* Fix missing covfie requirement

* Fix covfie/hip conflict

* fixup! Fix covfie/hip conflict

* Add new covfie version dependency

* Update covfie dependencies

* fixup! Update covfie dependencies

* try removing references to 0.6.1

* Style

* fixup! try removing references to 0.6.1

* Fix hep stack
2025-05-20 11:00:01 -04:00
Hugh Carson
3f00eeabd2
strumpack: Propagate cuda_arch to slate (#50460) 2025-05-20 07:51:09 -07:00
John Pennycook
b77d9b87f8
py-codebasin: new package (#50549) 2025-05-20 07:46:47 -07:00
jordialcaraz
e46dae9eb6
[Tau package] patch for ROCm 6.2; do not configure for rocprofiler by default, only if enabled with +rocprofiler (#50555)
* Updating TAU recipe to add patch file for ROCm > 6.2

* Create tau-rocm-disable-rocprofiler-default.patch

* Changing from version check to +rocm

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2025-05-20 07:31:34 -07:00
Kyle Brindley
760fc05da3
py-waves: add v0.12.9 -> v0.13.1 (#50090)
* FEAT: add versions 0.12.9->0.13.1

* MAINT: remove developer inline note after release with updated minimum salib spec

* MAINT: build scripts changed in 0.12.10

* MAINT: style guide updates
2025-05-20 10:19:00 -04:00
Robert Maaskant
e07503e4ac
rclone: add v1.69.2 (#50426)
* rclone: add v1.69.2

* rclone: explicit type=build deps

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-05-20 09:53:44 -04:00
Wouter Deconinck
aba8d85b4d
cyrus-sasl: depends_on krb5 (#50556) 2025-05-20 05:55:03 -06:00
Mikael Simberg
546625cdb8
aws-ofi-nccl: add v1.14.2 (#50558) 2025-05-20 13:53:17 +02:00
222 changed files with 1557 additions and 1295 deletions

View File

@ -6,6 +6,7 @@ on:
jobs:
# Check we don't make the situation with circular imports worse
import-check:
continue-on-error: true
runs-on: ubuntu-latest
steps:
- uses: julia-actions/setup-julia@v2

View File

@ -28,7 +28,7 @@ jobs:
run: |
cd spack-packages
git-filter-repo --quiet --source ../spack \
--subdirectory-filter var/spack/repos \
--path var/spack/repos/ --path-rename var/spack/repos/:python/ \
--path share/spack/gitlab/cloud_pipelines/ --path-rename share/spack/gitlab/cloud_pipelines/:.ci/gitlab/ \
--refs develop
- name: Push

View File

@ -276,7 +276,7 @@ remove dependent packages *before* removing their dependencies or use the
Garbage collection
^^^^^^^^^^^^^^^^^^
When Spack builds software from sources, if often installs tools that are needed
When Spack builds software from sources, it often installs tools that are needed
just to build or test other software. These are not necessary at runtime.
To support cases where removing these tools can be a benefit Spack provides
the ``spack gc`` ("garbage collector") command, which will uninstall all unneeded packages:

View File

@ -89,7 +89,7 @@ You can see that the mirror is added with ``spack mirror list`` as follows:
spack-public https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/
At this point, you've create a buildcache, but spack hasn't indexed it, so if
At this point, you've created a buildcache, but Spack hasn't indexed it, so if
you run ``spack buildcache list`` you won't see any results. You need to index
this new build cache as follows:
@ -318,7 +318,7 @@ other system dependencies. However, they are still compatible with tools like
``skopeo``, ``podman``, and ``docker`` for pulling and pushing.
.. note::
The docker ``overlayfs2`` storage driver is limited to 128 layers, above which a
The Docker ``overlayfs2`` storage driver is limited to 128 layers, above which a
``max depth exceeded`` error may be produced when pulling the image. There
are `alternative drivers <https://docs.docker.com/storage/storagedriver/>`_.

View File

@ -14,7 +14,7 @@ is an entire command dedicated to the management of every aspect of bootstrappin
.. command-output:: spack bootstrap --help
Spack is configured to bootstrap its dependencies lazily by default; i.e. the first time they are needed and
Spack is configured to bootstrap its dependencies lazily by default; i.e., the first time they are needed and
can't be found. You can readily check if any prerequisite for using Spack is missing by running:
.. code-block:: console
@ -36,8 +36,8 @@ can't be found. You can readily check if any prerequisite for using Spack is mis
In the case of the output shown above Spack detected that both ``clingo`` and ``gnupg``
are missing and it's giving detailed information on why they are needed and whether
they can be bootstrapped. The return code of this command summarizes the results, if any
dependencies are missing the return code is ``1``, otherwise ``0``. Running a command that
they can be bootstrapped. The return code of this command summarizes the results; if any
dependencies are missing, the return code is ``1``, otherwise ``0``. Running a command that
concretizes a spec, like:
.. code-block:: console

View File

@ -228,7 +228,7 @@ def setup(sphinx):
("py:class", "spack.install_test.Pb"),
("py:class", "spack.filesystem_view.SimpleFilesystemView"),
("py:class", "spack.traverse.EdgeAndDepth"),
("py:class", "archspec.cpu.microarchitecture.Microarchitecture"),
("py:class", "_vendoring.archspec.cpu.microarchitecture.Microarchitecture"),
("py:class", "spack.compiler.CompilerCache"),
# TypeVar that is not handled correctly
("py:class", "llnl.util.lang.T"),

View File

@ -148,8 +148,8 @@ this can expose you to attacks. Use at your own risk.
``ssl_certs``
--------------------
Path to custom certificats for SSL verification. The value can be a
filesytem path, or an environment variable that expands to an absolute file path.
Path to custom certificates for SSL verification. The value can be a
filesystem path, or an environment variable that expands to an absolute file path.
The default value is set to the environment variable ``SSL_CERT_FILE``
to use the same syntax used by many other applications that automatically
detect custom certificates.

View File

@ -11,7 +11,7 @@ Container Images
Spack :ref:`environments` can easily be turned into container images. This page
outlines two ways in which this can be done:
1. By installing the environment on the host system, and copying the installations
1. By installing the environment on the host system and copying the installations
into the container image. This approach does not require any tools like Docker
or Singularity to be installed.
2. By generating a Docker or Singularity recipe that can be used to build the
@ -56,8 +56,8 @@ environment roots and its runtime dependencies.
.. note::
When using registries like GHCR and Docker Hub, the ``--oci-password`` flag is not
the password for your account, but a personal access token you need to generate separately.
When using registries like GHCR and Docker Hub, the ``--oci-password`` flag specifies not
the password for your account, but rather a personal access token you need to generate separately.
The specified ``--base-image`` should have a libc that is compatible with the host system.
For example if your host system is Ubuntu 20.04, you can use ``ubuntu:20.04``, ``ubuntu:22.04``

View File

@ -20,7 +20,7 @@ be present on the machine where Spack is run:
:header-rows: 1
These requirements can be easily installed on most modern Linux systems;
on macOS, the Command Line Tools package is required, and a full XCode suite
on macOS, the Command Line Tools package is required, and a full Xcode suite
may be necessary for some packages such as Qt and apple-gl. Spack is designed
to run on HPC platforms like Cray. Not all packages should be expected
to work on all platforms.

View File

@ -8,7 +8,7 @@
Modules (modules.yaml)
======================
The use of module systems to manage user environment in a controlled way
The use of module systems to manage user environments in a controlled way
is a common practice at HPC centers that is sometimes embraced also by
individual programmers on their development machines. To support this
common practice Spack integrates with `Environment Modules
@ -490,7 +490,7 @@ that are already in the Lmod hierarchy.
.. note::
Tcl and Lua modules also allow for explicit conflicts between modulefiles.
Tcl and Lua modules also allow for explicit conflicts between module files.
.. code-block:: yaml
@ -513,7 +513,7 @@ that are already in the Lmod hierarchy.
:meth:`~spack.spec.Spec.format` method.
For Lmod and Environment Modules versions prior 4.2, it is important to
express the conflict on both modulefiles conflicting with each other.
express the conflict on both module files conflicting with each other.
.. note::
@ -550,7 +550,7 @@ that are already in the Lmod hierarchy.
.. warning::
Consistency of Core packages
The user is responsible for maintining consistency among core packages, as ``core_specs``
The user is responsible for maintaining consistency among core packages, as ``core_specs``
bypasses the hierarchy that allows Lmod to safely switch between coherent software stacks.
.. warning::

View File

@ -179,7 +179,7 @@ Spack can be found at :ref:`package_class_structure`.
.. code-block:: python
class Foo(CmakePackage):
class Foo(CMakePackage):
def cmake_args(self):
...
@ -1212,7 +1212,7 @@ class-level tarball URL and VCS. For example:
version("master", branch="master")
version("12.12.1", md5="ecd4606fa332212433c98bf950a69cc7")
version("12.10.1", md5="667333dbd7c0f031d47d7c5511fd0810")
version("12.8.1", "9f37f683ee2b427b5540db8a20ed6b15")
version("12.8.1", md5="9f37f683ee2b427b5540db8a20ed6b15")
If a package contains both a ``url`` and ``git`` class-level attribute,
Spack decides which to use based on the arguments to the ``version()``
@ -1343,7 +1343,7 @@ Submodules
version("1.0.1", tag="v1.0.1", submodules=True)
If a package has needs more fine-grained control over submodules, define
If a package needs more fine-grained control over submodules, define
``submodules`` to be a callable function that takes the package instance as
its only argument. The function should return a list of submodules to be fetched.
@ -2308,31 +2308,19 @@ looks like this:
parallel = False
Similarly, you can disable parallel builds only for specific make
commands, as ``libdwarf`` does:
You can also disable parallel builds only for specific make
invocation:
.. code-block:: python
:emphasize-lines: 9, 12
:emphasize-lines: 5
:linenos:
class Libelf(Package):
...
def install(self, spec, prefix):
configure("--prefix=" + prefix,
"--enable-shared",
"--disable-dependency-tracking",
"--disable-debug")
make()
# The mkdir commands in libelf's install can fail in parallel
make("install", parallel=False)
The first make will run in parallel here, but the second will not. If
you set ``parallel`` to ``False`` at the package level, then each call
to ``make()`` will be sequential by default, but packagers can call
``make(parallel=True)`` to override it.
Note that the ``--jobs`` option works out of the box for all standard
build systems. If you are using a non-standard build system instead, you
can use the variable ``make_jobs`` to extract the number of jobs specified
@ -2507,7 +2495,7 @@ necessary when there are breaking changes in the dependency that the
package cannot handle. In Spack we often add forward compatibility
bounds only at the time a new, breaking version of a dependency is
released. As with backward compatibility, it is typical to see a list
of forward compatibility bounds in a package file as seperate lines:
of forward compatibility bounds in a package file as separate lines:
.. code-block:: python
@ -3383,7 +3371,7 @@ the above attribute implementations:
"/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib/libFooBaz.so"
])
# baz library directories in the baz subdirectory of the foo porefix
# baz library directories in the baz subdirectory of the foo prefix
>>> spec["baz"].libs.directories
[
"/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib"
@ -5740,7 +5728,7 @@ running each executable, ``foo`` and ``bar``, as independent test parts.
.. note::
The method name ``copy_test_files`` here is for illustration purposes.
You are free to use a name that is more suited to your package.
You are free to use a name that is better suited to your package.
The key to copying files for stand-alone testing at build time is use
of the ``run_after`` directive, which ensures the associated files are
@ -7249,7 +7237,7 @@ which are not, there is the `checked_by` parameter in the license directive:
license("<license>", when="<when>", checked_by="<github username>")
When you have validated a github license, either when doing so explicitly or
When you have validated a package license, either when doing so explicitly or
as part of packaging a new package, please set the `checked_by` parameter
to your Github username to signal that the license has been manually
verified.

View File

@ -214,7 +214,7 @@ package versions, simply run the following commands:
Running ``spack mark -i --all`` tells Spack to mark all of the existing
packages within an environment as "implicitly" installed. This tells
spack's garbage collection system that these packages should be cleaned up.
Spack's garbage collection system that these packages should be cleaned up.
Don't worry however, this will not remove your entire environment.
Running ``spack install`` will reexamine your spack environment after

View File

@ -1 +0,0 @@
from _pyrsistent_version import *

View File

@ -1 +0,0 @@
from altgraph import *

View File

@ -1,3 +1,3 @@
"""Init file to avoid namespace packages"""
__version__ = "0.2.4"
__version__ = "0.2.5"

View File

@ -9,8 +9,8 @@
import argparse
import typing
import archspec
import archspec.cpu
import _vendoring.archspec
import _vendoring.archspec.cpu
def _make_parser() -> argparse.ArgumentParser:
@ -24,7 +24,7 @@ def _make_parser() -> argparse.ArgumentParser:
"-V",
help="Show the version and exit.",
action="version",
version=f"archspec, version {archspec.__version__}",
version=f"archspec, version {_vendoring.archspec.__version__}",
)
parser.add_argument("--help", "-h", help="Show the help and exit.", action="help")
@ -45,9 +45,9 @@ def _make_parser() -> argparse.ArgumentParser:
def cpu() -> int:
"""Run the `archspec cpu` subcommand."""
"""Run the `_vendoring.archspec.cpu` subcommand."""
try:
print(archspec.cpu.host())
print(_vendoring.archspec.cpu.host())
except FileNotFoundError as exc:
print(exc)
return 1

View File

@ -8,9 +8,9 @@
import re
import warnings
import archspec
import archspec.cpu.alias
import archspec.cpu.schema
import _vendoring.archspec
import _vendoring.archspec.cpu.alias
import _vendoring.archspec.cpu.schema
from .alias import FEATURE_ALIASES
from .schema import LazyDictionary
@ -384,7 +384,7 @@ def fill_target_from_dict(name, data, targets):
)
known_targets = {}
data = archspec.cpu.schema.TARGETS_JSON["microarchitectures"]
data = _vendoring.archspec.cpu.schema.TARGETS_JSON["microarchitectures"]
for name in data:
if name in known_targets:
# name was already brought in as ancestor to a target

View File

@ -0,0 +1,20 @@
The MIT License (MIT)
Copyright (c) 2014 Anders Høst
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

View File

@ -1 +0,0 @@
from jsonschema import *

View File

@ -1 +0,0 @@
from macholib import *

View File

@ -1,213 +0,0 @@
# flake8: noqa: E704
# from https://gist.github.com/WuTheFWasThat/091a17d4b5cab597dfd5d4c2d96faf09
# Stubs for pyrsistent (Python 3.6)
from typing import Any
from typing import AnyStr
from typing import Callable
from typing import Iterable
from typing import Iterator
from typing import List
from typing import Optional
from typing import Mapping
from typing import MutableMapping
from typing import Sequence
from typing import Set
from typing import Union
from typing import Tuple
from typing import Type
from typing import TypeVar
from typing import overload
# see commit 08519aa for explanation of the re-export
from pyrsistent.typing import CheckedKeyTypeError as CheckedKeyTypeError
from pyrsistent.typing import CheckedPMap as CheckedPMap
from pyrsistent.typing import CheckedPSet as CheckedPSet
from pyrsistent.typing import CheckedPVector as CheckedPVector
from pyrsistent.typing import CheckedType as CheckedType
from pyrsistent.typing import CheckedValueTypeError as CheckedValueTypeError
from pyrsistent.typing import InvariantException as InvariantException
from pyrsistent.typing import PClass as PClass
from pyrsistent.typing import PBag as PBag
from pyrsistent.typing import PDeque as PDeque
from pyrsistent.typing import PList as PList
from pyrsistent.typing import PMap as PMap
from pyrsistent.typing import PMapEvolver as PMapEvolver
from pyrsistent.typing import PSet as PSet
from pyrsistent.typing import PSetEvolver as PSetEvolver
from pyrsistent.typing import PTypeError as PTypeError
from pyrsistent.typing import PVector as PVector
from pyrsistent.typing import PVectorEvolver as PVectorEvolver
T = TypeVar('T')
KT = TypeVar('KT')
VT = TypeVar('VT')
def pmap(initial: Union[Mapping[KT, VT], Iterable[Tuple[KT, VT]]] = {}, pre_size: int = 0) -> PMap[KT, VT]: ...
def m(**kwargs: VT) -> PMap[str, VT]: ...
def pvector(iterable: Iterable[T] = ...) -> PVector[T]: ...
def v(*iterable: T) -> PVector[T]: ...
def pset(iterable: Iterable[T] = (), pre_size: int = 8) -> PSet[T]: ...
def s(*iterable: T) -> PSet[T]: ...
# see class_test.py for use cases
Invariant = Tuple[bool, Optional[Union[str, Callable[[], str]]]]
@overload
def field(
type: Union[Type[T], Sequence[Type[T]]] = ...,
invariant: Callable[[Any], Union[Invariant, Iterable[Invariant]]] = lambda _: (True, None),
initial: Any = object(),
mandatory: bool = False,
factory: Callable[[Any], T] = lambda x: x,
serializer: Callable[[Any, T], Any] = lambda _, value: value,
) -> T: ...
# The actual return value (_PField) is irrelevant after a PRecord has been instantiated,
# see https://github.com/tobgu/pyrsistent/blob/master/pyrsistent/_precord.py#L10
@overload
def field(
type: Any = ...,
invariant: Callable[[Any], Union[Invariant, Iterable[Invariant]]] = lambda _: (True, None),
initial: Any = object(),
mandatory: bool = False,
factory: Callable[[Any], Any] = lambda x: x,
serializer: Callable[[Any, Any], Any] = lambda _, value: value,
) -> Any: ...
# Use precise types for the simplest use cases, but fall back to Any for
# everything else. See record_test.py for the wide range of possible types for
# item_type
@overload
def pset_field(
item_type: Type[T],
optional: bool = False,
initial: Iterable[T] = ...,
) -> PSet[T]: ...
@overload
def pset_field(
item_type: Any,
optional: bool = False,
initial: Any = (),
) -> PSet[Any]: ...
@overload
def pmap_field(
key_type: Type[KT],
value_type: Type[VT],
optional: bool = False,
invariant: Callable[[Any], Tuple[bool, Optional[str]]] = lambda _: (True, None),
) -> PMap[KT, VT]: ...
@overload
def pmap_field(
key_type: Any,
value_type: Any,
optional: bool = False,
invariant: Callable[[Any], Tuple[bool, Optional[str]]] = lambda _: (True, None),
) -> PMap[Any, Any]: ...
@overload
def pvector_field(
item_type: Type[T],
optional: bool = False,
initial: Iterable[T] = ...,
) -> PVector[T]: ...
@overload
def pvector_field(
item_type: Any,
optional: bool = False,
initial: Any = (),
) -> PVector[Any]: ...
def pbag(elements: Iterable[T]) -> PBag[T]: ...
def b(*elements: T) -> PBag[T]: ...
def plist(iterable: Iterable[T] = (), reverse: bool = False) -> PList[T]: ...
def l(*elements: T) -> PList[T]: ...
def pdeque(iterable: Optional[Iterable[T]] = None, maxlen: Optional[int] = None) -> PDeque[T]: ...
def dq(*iterable: T) -> PDeque[T]: ...
@overload
def optional(type: T) -> Tuple[T, Type[None]]: ...
@overload
def optional(*typs: Any) -> Tuple[Any, ...]: ...
T_PRecord = TypeVar('T_PRecord', bound='PRecord')
class PRecord(PMap[AnyStr, Any]):
_precord_fields: Mapping
_precord_initial_values: Mapping
def __hash__(self) -> int: ...
def __init__(self, **kwargs: Any) -> None: ...
def __iter__(self) -> Iterator[Any]: ...
def __len__(self) -> int: ...
@classmethod
def create(
cls: Type[T_PRecord],
kwargs: Mapping,
_factory_fields: Optional[Iterable] = None,
ignore_extra: bool = False,
) -> T_PRecord: ...
# This is OK because T_PRecord is a concrete type
def discard(self: T_PRecord, key: KT) -> T_PRecord: ...
def remove(self: T_PRecord, key: KT) -> T_PRecord: ...
def serialize(self, format: Optional[Any] = ...) -> MutableMapping: ...
# From pyrsistent documentation:
# This set function differs slightly from that in the PMap
# class. First of all it accepts key-value pairs. Second it accepts multiple key-value
# pairs to perform one, atomic, update of multiple fields.
@overload
def set(self, key: KT, val: VT) -> Any: ...
@overload
def set(self, **kwargs: VT) -> Any: ...
def immutable(
members: Union[str, Iterable[str]] = '',
name: str = 'Immutable',
verbose: bool = False,
) -> Tuple: ... # actually a namedtuple
# ignore mypy warning "Overloaded function signatures 1 and 5 overlap with
# incompatible return types"
@overload
def freeze(o: Mapping[KT, VT]) -> PMap[KT, VT]: ... # type: ignore
@overload
def freeze(o: List[T]) -> PVector[T]: ... # type: ignore
@overload
def freeze(o: Tuple[T, ...]) -> Tuple[T, ...]: ...
@overload
def freeze(o: Set[T]) -> PSet[T]: ... # type: ignore
@overload
def freeze(o: T) -> T: ...
@overload
def thaw(o: PMap[KT, VT]) -> MutableMapping[KT, VT]: ... # type: ignore
@overload
def thaw(o: PVector[T]) -> List[T]: ... # type: ignore
@overload
def thaw(o: Tuple[T, ...]) -> Tuple[T, ...]: ...
# collections.abc.MutableSet is kind of garbage:
# https://stackoverflow.com/questions/24977898/why-does-collections-mutableset-not-bestow-an-update-method
@overload
def thaw(o: PSet[T]) -> Set[T]: ... # type: ignore
@overload
def thaw(o: T) -> T: ...
def mutant(fn: Callable) -> Callable: ...
def inc(x: int) -> int: ...
@overload
def discard(evolver: PMapEvolver[KT, VT], key: KT) -> None: ...
@overload
def discard(evolver: PVectorEvolver[T], key: int) -> None: ...
@overload
def discard(evolver: PSetEvolver[T], key: T) -> None: ...
def rex(expr: str) -> Callable[[Any], bool]: ...
def ny(_: Any) -> bool: ...
def get_in(keys: Iterable, coll: Mapping, default: Optional[Any] = None, no_default: bool = False) -> Any: ...

View File

@ -1 +0,0 @@
from ruamel import *

View File

@ -1 +0,0 @@
from six import *

View File

@ -1 +0,0 @@
from six.moves import *

View File

@ -1 +0,0 @@
from six.moves.configparser import *

View File

@ -1,81 +0,0 @@
[![](https://github.com/archspec/archspec/workflows/Unit%20tests/badge.svg)](https://github.com/archspec/archspec/actions)
[![codecov](https://codecov.io/gh/archspec/archspec/branch/master/graph/badge.svg)](https://codecov.io/gh/archspec/archspec)
[![Documentation Status](https://readthedocs.org/projects/archspec/badge/?version=latest)](https://archspec.readthedocs.io/en/latest/?badge=latest)
# Archspec (Python bindings)
Archspec aims at providing a standard set of human-understandable labels for
various aspects of a system architecture like CPU, network fabrics, etc. and
APIs to detect, query and compare them.
This project grew out of [Spack](https://spack.io/) and is currently under
active development. At present it supports APIs to detect and model
compatibility relationships among different CPU microarchitectures.
## Getting started with development
The `archspec` Python package needs [poetry](https://python-poetry.org/) to
be installed from VCS sources. The preferred method to install it is via
its custom installer outside of any virtual environment:
```console
$ curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python
```
You can refer to [Poetry's documentation](https://python-poetry.org/docs/#installation)
for further details or for other methods to install this tool. You'll also need `tox`
to run unit test:
```console
$ pip install --user tox
```
Finally you'll need to clone the repository:
```console
$ git clone --recursive https://github.com/archspec/archspec.git
```
### Running unit tests
Once you have your environment ready you can run `archspec` unit tests
using ``tox`` from the root of the repository:
```console
$ tox
[ ... ]
py27: commands succeeded
py35: commands succeeded
py36: commands succeeded
py37: commands succeeded
py38: commands succeeded
pylint: commands succeeded
flake8: commands succeeded
black: commands succeeded
congratulations :)
```
## Citing Archspec
If you are referencing `archspec` in a publication, please cite the following
paper:
* Massimiliano Culpo, Gregory Becker, Carlos Eduardo Arango Gutierrez, Kenneth
Hoste, and Todd Gamblin.
[**`archspec`: A library for detecting, labeling, and reasoning about
microarchitectures**](https://tgamblin.github.io/pubs/archspec-canopie-hpc-2020.pdf).
In *2nd International Workshop on Containers and New Orchestration Paradigms
for Isolated Environments in HPC (CANOPIE-HPC'20)*, Online Event, November
12, 2020.
## License
Archspec is distributed under the terms of both the MIT license and the
Apache License (Version 2.0). Users may choose either license, at their
option.
All new contributions must be made under both the MIT and Apache-2.0
licenses.
See [LICENSE-MIT](https://github.com/archspec/archspec/blob/master/LICENSE-MIT),
[LICENSE-APACHE](https://github.com/archspec/archspec/blob/master/LICENSE-APACHE),
[COPYRIGHT](https://github.com/archspec/archspec/blob/master/COPYRIGHT), and
[NOTICE](https://github.com/archspec/archspec/blob/master/NOTICE) for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
LLNL-CODE-811653

View File

@ -1,22 +0,0 @@
Intellectual Property Notice
------------------------------
Archspec is licensed under the Apache License, Version 2.0 (LICENSE-APACHE
or http://www.apache.org/licenses/LICENSE-2.0) or the MIT license,
(LICENSE-MIT or http://opensource.org/licenses/MIT), at your option.
Copyrights and patents in the Archspec project are retained by contributors.
No copyright assignment is required to contribute to Archspec.
SPDX usage
------------
Individual files contain SPDX tags instead of the full license text.
This enables machine processing of license information based on the SPDX
License Identifiers that are available here: https://spdx.org/licenses/
Files that are dual-licensed as Apache-2.0 OR MIT contain the following
text in the license header:
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@ -9,3 +9,4 @@ macholib==1.16.2
altgraph==0.17.3
ruamel.yaml==0.17.21
typing_extensions==4.1.1
archspec @ git+https://github.com/archspec/archspec.git@38ce485258ffc4fc6dd6688f8dc90cb269478c47

View File

@ -12,10 +12,9 @@
import warnings
from typing import Optional, Sequence, Union
import _vendoring.archspec.cpu
from _vendoring.typing_extensions import TypedDict
import archspec.cpu
import llnl.util.filesystem as fs
from llnl.util import tty
@ -138,7 +137,7 @@ def _fix_ext_suffix(candidate_spec: "spack.spec.Spec"):
}
# If the current architecture is not problematic return
generic_target = archspec.cpu.host().family
generic_target = _vendoring.archspec.cpu.host().family
if str(generic_target) not in _suffix_to_be_checked:
return
@ -235,7 +234,7 @@ def _root_spec(spec_str: str) -> str:
platform = str(spack.platforms.host())
spec_str += f" platform={platform}"
target = archspec.cpu.host().family
target = _vendoring.archspec.cpu.host().family
spec_str += f" target={target}"
tty.debug(f"[BOOTSTRAP ROOT SPEC] {spec_str}")

View File

@ -13,7 +13,7 @@
import sys
from typing import Dict, Optional, Tuple
import archspec.cpu
import _vendoring.archspec.cpu
import spack.compilers.config
import spack.compilers.libraries
@ -30,7 +30,7 @@ class ClingoBootstrapConcretizer:
def __init__(self, configuration):
self.host_platform = spack.platforms.host()
self.host_os = self.host_platform.default_operating_system()
self.host_target = archspec.cpu.host().family
self.host_target = _vendoring.archspec.cpu.host().family
self.host_architecture = spack.spec.ArchSpec.default_arch()
self.host_architecture.target = str(self.host_target)
self.host_compiler = self._valid_compiler_or_raise()

View File

@ -8,7 +8,7 @@
import sys
from typing import Iterable, List
import archspec.cpu
import _vendoring.archspec.cpu
from llnl.util import tty
@ -51,7 +51,7 @@ def environment_root(cls) -> pathlib.Path:
"""Environment root directory"""
bootstrap_root_path = root_path()
python_part = spec_for_current_python().replace("@", "")
arch_part = archspec.cpu.host().family
arch_part = _vendoring.archspec.cpu.host().family
interpreter_part = hashlib.md5(sys.exec_prefix.encode()).hexdigest()[:5]
environment_dir = f"{python_part}-{arch_part}-{interpreter_part}"
return pathlib.Path(
@ -112,7 +112,7 @@ def _write_spack_yaml_file(self) -> None:
context = {
"python_spec": spec_for_current_python(),
"python_prefix": sys.exec_prefix,
"architecture": archspec.cpu.host().family,
"architecture": _vendoring.archspec.cpu.host().family,
"environment_path": self.environment_root(),
"environment_specs": self.spack_dev_requirements(),
"store_path": store_path(),

View File

@ -59,7 +59,7 @@
overload,
)
import archspec.cpu
import _vendoring.archspec.cpu
import llnl.util.tty as tty
from llnl.string import plural
@ -440,10 +440,12 @@ def optimization_flags(compiler, target):
# Try to check if the current compiler comes with a version number or
# has an unexpected suffix. If so, treat it as a compiler with a
# custom spec.
version_number, _ = archspec.cpu.version_components(compiler.version.dotted_numeric_string)
version_number, _ = _vendoring.archspec.cpu.version_components(
compiler.version.dotted_numeric_string
)
try:
result = target.optimization_flags(compiler.name, version_number)
except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
except (ValueError, _vendoring.archspec.cpu.UnsupportedMicroarchitecture):
result = ""
return result

View File

@ -5,7 +5,7 @@
import collections
import warnings
import archspec.cpu
import _vendoring.archspec.cpu
import llnl.util.tty.colify as colify
import llnl.util.tty.color as color
@ -92,11 +92,11 @@ def display_target_group(header, target_group):
def arch(parser, args):
if args.generic_target:
# TODO: add deprecation warning in 0.24
print(archspec.cpu.host().generic)
print(_vendoring.archspec.cpu.host().generic)
return
if args.known_targets:
display_targets(archspec.cpu.TARGETS)
display_targets(_vendoring.archspec.cpu.TARGETS)
return
if args.frontend:

View File

@ -514,17 +514,18 @@ def extend_with_dependencies(specs):
def concrete_specs_from_cli_or_file(args):
tty.msg("Concretizing input specs")
if args.specs:
specs = spack.cmd.parse_specs(args.specs, concretize=True)
specs = spack.cmd.parse_specs(args.specs, concretize=False)
if not specs:
raise SpackError("unable to parse specs from command line")
if args.file:
specs = specs_from_text_file(args.file, concretize=True)
specs = specs_from_text_file(args.file, concretize=False)
if not specs:
raise SpackError("unable to parse specs from file '{}'".format(args.file))
return specs
concrete_specs = spack.cmd.matching_specs_from_env(specs)
return concrete_specs
class IncludeFilter:
@ -607,11 +608,6 @@ def process_mirror_stats(present, mirrored, error):
def mirror_create(args):
"""create a directory to be used as a spack mirror, and fill it with package archives"""
if args.specs and args.all:
raise SpackError(
"cannot specify specs on command line if you chose to mirror all specs with '--all'"
)
if args.file and args.all:
raise SpackError(
"cannot specify specs with a file if you chose to mirror all specs with '--all'"

View File

@ -201,7 +201,19 @@ def repo_migrate(args: Any) -> int:
repo_v2 = None
exit_code = 0
if exit_code == 0 and isinstance(repo_v2, spack.repo.Repo):
if not args.fix:
tty.error(
f"No changes were made to the repository {repo.root} with namespace "
f"'{repo.namespace}'. Run with --fix to apply the above changes."
)
elif exit_code == 1:
tty.error(
f"Repository '{repo.namespace}' could not be migrated to the latest Package API. "
"Please check the error messages above."
)
elif isinstance(repo_v2, spack.repo.Repo):
tty.info(
f"Repository '{repo_v2.namespace}' was successfully migrated from "
f"package API {repo.package_api_str} to {repo_v2.package_api_str}."
@ -212,15 +224,9 @@ def repo_migrate(args: Any) -> int:
f" spack repo add {shlex.quote(repo_v2.root)}"
)
elif exit_code == 0:
else:
tty.info(f"Repository '{repo.namespace}' was successfully migrated")
elif not args.fix and exit_code == 1:
tty.error(
f"No changes were made to the repository {repo.root} with namespace "
f"'{repo.namespace}'. Run with --fix to apply the above changes."
)
return exit_code

View File

@ -10,7 +10,7 @@
import warnings
from typing import Any, Dict, List, Optional, Tuple
import archspec.cpu
import _vendoring.archspec.cpu
import llnl.util.filesystem as fs
import llnl.util.lang
@ -316,7 +316,7 @@ def from_external_yaml(config: Dict[str, Any]) -> Optional[spack.spec.Spec]:
@staticmethod
def _finalize_external_concretization(abstract_spec):
if CompilerFactory._GENERIC_TARGET is None:
CompilerFactory._GENERIC_TARGET = archspec.cpu.host().family
CompilerFactory._GENERIC_TARGET = _vendoring.archspec.cpu.host().family
if abstract_spec.architecture:
abstract_spec.architecture.complete_with_defaults()

View File

@ -25,7 +25,7 @@
import warnings
from typing import List, Tuple
import archspec.cpu
import _vendoring.archspec.cpu
import llnl.util.lang
import llnl.util.tty as tty
@ -734,7 +734,7 @@ def _compatible_sys_types():
"""
host_platform = spack.platforms.host()
host_os = str(host_platform.default_operating_system())
host_target = archspec.cpu.host()
host_target = _vendoring.archspec.cpu.host()
compatible_targets = [host_target] + host_target.ancestors
compatible_archs = [
@ -794,7 +794,7 @@ def shell_set(var, value):
# print environment module system if available. This can be expensive
# on clusters, so skip it if not needed.
if "modules" in info:
generic_arch = archspec.cpu.host().family
generic_arch = _vendoring.archspec.cpu.host().family
module_spec = "environment-modules target={0}".format(generic_arch)
specs = spack.store.STORE.db.query(module_spec)
if specs:

View File

@ -986,7 +986,9 @@ def url_for_version(self, version):
"""
return self._implement_all_urls_for_version(version)[0]
def update_external_dependencies(self, extendee_spec=None):
def _update_external_dependencies(
self, extendee_spec: Optional[spack.spec.Spec] = None
) -> None:
"""
Method to override in package classes to handle external dependencies
"""

View File

@ -4,7 +4,7 @@
import warnings
from typing import Optional
import archspec.cpu
import _vendoring.archspec.cpu
import llnl.util.lang
@ -38,15 +38,15 @@ def __init__(self, name):
self.name = name
self._init_targets()
def add_target(self, name: str, target: archspec.cpu.Microarchitecture) -> None:
def add_target(self, name: str, target: _vendoring.archspec.cpu.Microarchitecture) -> None:
if name in Platform.reserved_targets:
msg = f"{name} is a spack reserved alias and cannot be the name of a target"
raise ValueError(msg)
self.targets[name] = target
def _init_targets(self):
self.default = archspec.cpu.host().name
for name, microarchitecture in archspec.cpu.TARGETS.items():
self.default = _vendoring.archspec.cpu.host().name
for name, microarchitecture in _vendoring.archspec.cpu.TARGETS.items():
self.add_target(name, microarchitecture)
def target(self, name):

View File

@ -3,7 +3,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import platform
import archspec.cpu
import _vendoring.archspec.cpu
import spack.operating_systems
@ -28,7 +28,7 @@ def __init__(self, name=None):
def _init_targets(self):
targets = ("aarch64", "m1") if platform.machine() == "arm64" else ("x86_64", "core2")
for t in targets:
self.add_target(t, archspec.cpu.TARGETS[t])
self.add_target(t, _vendoring.archspec.cpu.TARGETS[t])
@classmethod
def detect(cls):

View File

@ -3,7 +3,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import ast
import difflib
import os
import re
import shutil
@ -83,8 +82,7 @@ def migrate_v1_to_v2(
errors = False
stack: List[Tuple[str, int]] = [(repo.packages_path, 0)]
stack: List[Tuple[str, int]] = [(repo.root, 0)]
while stack:
path, depth = stack.pop()
@ -114,7 +112,11 @@ def migrate_v1_to_v2(
continue
# check if this is a package
if depth == 0 and os.path.exists(os.path.join(entry.path, "package.py")):
if (
depth == 1
and rel_path.startswith(f"{subdirectory}{os.sep}")
and os.path.exists(os.path.join(entry.path, "package.py"))
):
if "_" in entry.name:
print(
f"Invalid package name '{entry.name}': underscores are not allowed in "
@ -142,7 +144,7 @@ def migrate_v1_to_v2(
rename_regex = re.compile("^(" + "|".join(re.escape(k) for k in rename.keys()) + ")")
if fix:
os.makedirs(os.path.join(new_root, repo.subdirectory), exist_ok=True)
os.makedirs(new_root, exist_ok=True)
def _relocate(rel_path: str) -> Tuple[str, str]:
old = os.path.join(repo.root, rel_path)
@ -221,16 +223,6 @@ def _relocate(rel_path: str) -> Tuple[str, str]:
return result, (updated_repo if fix else None)
def _spack_pkg_to_spack_repo(modulename: str) -> str:
# rewrite spack.pkg.builtin.foo -> spack_repo.builtin.packages.foo.package
parts = modulename.split(".")
assert parts[:2] == ["spack", "pkg"]
parts[0:2] = ["spack_repo"]
parts.insert(2, "packages")
parts.append("package")
return ".".join(parts)
def migrate_v2_imports(
packages_dir: str, root: str, fix: bool, out: IO[str] = sys.stdout, err: IO[str] = sys.stderr
) -> bool:
@ -307,41 +299,12 @@ def migrate_v2_imports(
#: Set of symbols of interest that are already defined through imports, assignments, or
#: function definitions.
defined_symbols: Set[str] = set()
best_line: Optional[int] = None
seen_import = False
module_replacements: Dict[str, str] = {}
parent: Dict[int, ast.AST] = {}
#: List of (line, col start, old, new) tuples of strings to be replaced inline.
inline_updates: List[Tuple[int, int, str, str]] = []
#: List of (line from, line to, new lines) tuples of line replacements
multiline_updates: List[Tuple[int, int, List[str]]] = []
with open(pkg_path, "r", encoding="utf-8", newline="") as file:
original_lines = file.readlines()
if len(original_lines) < 2: # assume packagepy files have at least 2 lines...
continue
if original_lines[0].endswith("\r\n"):
newline = "\r\n"
elif original_lines[0].endswith("\n"):
newline = "\n"
elif original_lines[0].endswith("\r"):
newline = "\r"
else:
success = False
print(f"{pkg_path}: unknown line ending, cannot fix", file=err)
continue
updated_lines = original_lines.copy()
for node in ast.walk(tree):
for child in ast.iter_child_nodes(node):
if isinstance(child, ast.Attribute):
parent[id(child)] = node
# Get the last import statement from the first block of top-level imports
if isinstance(node, ast.Module):
for child in ast.iter_child_nodes(node):
@ -361,7 +324,7 @@ def migrate_v2_imports(
if is_import:
if isinstance(child, (ast.stmt, ast.expr)):
best_line = (getattr(child, "end_lineno", None) or child.lineno) + 1
best_line = (child.end_lineno or child.lineno) + 1
if not seen_import and is_import:
seen_import = True
@ -390,89 +353,12 @@ def migrate_v2_imports(
elif isinstance(node, ast.Name) and node.id in symbol_to_module:
referenced_symbols.add(node.id)
# Find lines where spack.pkg is used.
elif (
isinstance(node, ast.Attribute)
and isinstance(node.value, ast.Name)
and node.value.id == "spack"
and node.attr == "pkg"
):
# go as many attrs up until we reach a known module name to be replaced
known_module = "spack.pkg"
ancestor = node
while True:
next_parent = parent.get(id(ancestor))
if next_parent is None or not isinstance(next_parent, ast.Attribute):
break
ancestor = next_parent
known_module = f"{known_module}.{ancestor.attr}"
if known_module in module_replacements:
break
inline_updates.append(
(
ancestor.lineno,
ancestor.col_offset,
known_module,
module_replacements[known_module],
)
)
# Register imported symbols to make this operation idempotent
elif isinstance(node, ast.ImportFrom):
# Keep track of old style spack.pkg imports, to be replaced.
if node.module and node.module.startswith("spack.pkg.") and node.level == 0:
depth = node.module.count(".")
# simple case of find and replace
# from spack.pkg.builtin.my_pkg import MyPkg
# -> from spack_repo.builtin.packages.my_pkg.package import MyPkg
if depth == 3:
module_replacements[node.module] = _spack_pkg_to_spack_repo(node.module)
inline_updates.append(
(
node.lineno,
node.col_offset,
node.module,
module_replacements[node.module],
)
)
# non-trivial possible multiline case
# from spack.pkg.builtin import (boost, cmake as foo)
# -> import spack_repo.builtin.packages.boost.package as boost
# -> import spack_repo.builtin.packages.cmake.package as foo
elif depth == 2 and node.end_lineno is not None:
_, _, namespace = node.module.rpartition(".")
indent = original_lines[node.lineno - 1][: node.col_offset]
multiline_updates.append(
(
node.lineno,
node.end_lineno + 1,
[
f"{indent}import spack_repo.{namespace}.packages."
f"{alias.name}.package as {alias.asname or alias.name}"
f"{newline}"
for alias in node.names
],
)
)
else:
success = False
print(
f"{pkg_path}:{node.lineno}: don't know how to rewrite `{node.module}`",
file=err,
)
# Subtract the symbols that are imported so we don't repeatedly add imports.
for alias in node.names:
if alias.name in symbol_to_module:
if alias.asname is None:
defined_symbols.add(alias.name)
# error when symbols are explicitly imported that are no longer available
if node.module == "spack.package" and node.level == 0:
defined_symbols.add(alias.name)
if node.module == "spack.package":
success = False
print(
f"{pkg_path}:{node.lineno}: `{alias.name}` is imported from "
@ -483,84 +369,59 @@ def migrate_v2_imports(
if alias.asname and alias.asname in symbol_to_module:
defined_symbols.add(alias.asname)
elif isinstance(node, ast.Import):
# normal imports are easy find and replace since they are single lines.
for alias in node.names:
if alias.asname and alias.asname in symbol_to_module:
defined_symbols.add(alias.name)
elif alias.asname is None and alias.name.startswith("spack.pkg."):
module_replacements[alias.name] = _spack_pkg_to_spack_repo(alias.name)
inline_updates.append(
(
alias.lineno,
alias.col_offset,
alias.name,
module_replacements[alias.name],
)
)
# Remove imported symbols from the referenced symbols
referenced_symbols.difference_update(defined_symbols)
# Sort from last to first so we can modify without messing up the line / col offsets
inline_updates.sort(reverse=True)
# Nothing to change here.
if not inline_updates and not referenced_symbols:
if not referenced_symbols:
continue
# First do module replacements of spack.pkg imports
for line, col, old, new in inline_updates:
updated_lines[line - 1] = updated_lines[line - 1][:col] + updated_lines[line - 1][
col:
].replace(old, new, 1)
if best_line is None:
print(f"{pkg_path}: failed to update imports", file=err)
success = False
continue
# Then insert new imports for symbols referenced in the package
if referenced_symbols:
if best_line is None:
print(f"{pkg_path}: failed to update imports", file=err)
success = False
continue
# Add the missing imports right after the last import statement
with open(pkg_path, "r", encoding="utf-8", newline="") as file:
lines = file.readlines()
# Group missing symbols by their module
missing_imports_by_module: Dict[str, list] = {}
for symbol in referenced_symbols:
module = symbol_to_module[symbol]
if module not in missing_imports_by_module:
missing_imports_by_module[module] = []
missing_imports_by_module[module].append(symbol)
# Group missing symbols by their module
missing_imports_by_module: Dict[str, list] = {}
for symbol in referenced_symbols:
module = symbol_to_module[symbol]
if module not in missing_imports_by_module:
missing_imports_by_module[module] = []
missing_imports_by_module[module].append(symbol)
new_lines = [
f"from {module} import {', '.join(sorted(symbols))}{newline}"
for module, symbols in sorted(missing_imports_by_module.items())
]
new_lines = [
f"from {module} import {', '.join(sorted(symbols))}\n"
for module, symbols in sorted(missing_imports_by_module.items())
]
if not seen_import:
new_lines.extend((newline, newline))
if not seen_import:
new_lines.extend(("\n", "\n"))
multiline_updates.append((best_line, best_line, new_lines))
multiline_updates.sort(reverse=True)
for start, end, new_lines in multiline_updates:
updated_lines[start - 1 : end - 1] = new_lines
if not fix:
if not fix: # only print the diff
success = False # packages need to be fixed, but we didn't do it
diff_start, diff_end = max(1, best_line - 3), min(best_line + 2, len(lines))
num_changed = diff_end - diff_start + 1
num_added = num_changed + len(new_lines)
rel_pkg_path = os.path.relpath(pkg_path, start=root)
diff = difflib.unified_diff(
original_lines,
updated_lines,
n=3,
fromfile=f"a/{rel_pkg_path}",
tofile=f"b/{rel_pkg_path}",
)
out.write("".join(diff))
out.write(f"--- a/{rel_pkg_path}\n+++ b/{rel_pkg_path}\n")
out.write(f"@@ -{diff_start},{num_changed} +{diff_start},{num_added} @@\n")
for line in lines[diff_start - 1 : best_line - 1]:
out.write(f" {line}")
for line in new_lines:
out.write(f"+{line}")
for line in lines[best_line - 1 : diff_end]:
out.write(f" {line}")
continue
lines[best_line - 1 : best_line - 1] = new_lines
tmp_file = pkg_path + ".tmp"
# binary mode to avoid newline conversion issues; utf-8 was already required upon read.
with open(tmp_file, "wb") as file:
file.write("".join(updated_lines).encode("utf-8"))
with open(tmp_file, "w", encoding="utf-8", newline="") as file:
file.writelines(lines)
os.replace(tmp_file, pkg_path)

View File

@ -34,7 +34,7 @@
Union,
)
import archspec.cpu
import _vendoring.archspec.cpu
import llnl.util.lang
import llnl.util.tty as tty
@ -1617,7 +1617,7 @@ def target_ranges(self, spec, single_target_fn):
target = spec.architecture.target
# Check if the target is a concrete target
if str(target) in archspec.cpu.TARGETS:
if str(target) in _vendoring.archspec.cpu.TARGETS:
return [single_target_fn(spec.name, target)]
self.target_constraints.add(target)
@ -2753,7 +2753,7 @@ def _supported_targets(self, compiler_name, compiler_version, targets):
compiler_name, compiler_version.dotted_numeric_string
)
supported.append(target)
except archspec.cpu.UnsupportedMicroarchitecture:
except _vendoring.archspec.cpu.UnsupportedMicroarchitecture:
continue
except ValueError:
continue
@ -2818,7 +2818,7 @@ def target_defaults(self, specs):
if not spec.architecture or not spec.architecture.target:
continue
target = archspec.cpu.TARGETS.get(spec.target.name)
target = _vendoring.archspec.cpu.TARGETS.get(spec.target.name)
if not target:
self.target_ranges(spec, None)
continue
@ -2830,7 +2830,7 @@ def target_defaults(self, specs):
candidate_targets.append(ancestor)
platform = spack.platforms.host()
uarch = archspec.cpu.TARGETS.get(platform.default)
uarch = _vendoring.archspec.cpu.TARGETS.get(platform.default)
best_targets = {uarch.family.name}
for compiler in self.possible_compilers:
supported = self._supported_targets(compiler.name, compiler.version, candidate_targets)
@ -2938,7 +2938,7 @@ def _all_targets_satisfiying(single_constraint):
return [single_constraint]
t_min, _, t_max = single_constraint.partition(":")
for test_target in archspec.cpu.TARGETS.values():
for test_target in _vendoring.archspec.cpu.TARGETS.values():
# Check lower bound
if t_min and not t_min <= test_target:
continue
@ -3894,7 +3894,7 @@ def external_spec_selected(self, node, idx):
if extendee_spec:
extendee_node = SpecBuilder.make_node(pkg=extendee_spec.name)
package.update_external_dependencies(self._specs.get(extendee_node, None))
package._update_external_dependencies(self._specs.get(extendee_node))
def depends_on(self, parent_node, dependency_node, type):
dependency_spec = self._specs[dependency_node]

View File

@ -5,7 +5,7 @@
import collections
from typing import Dict, List, NamedTuple, Set, Tuple, Union
import archspec.cpu
import _vendoring.archspec.cpu
from llnl.util import lang, tty
@ -34,7 +34,7 @@ def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
"""
raise NotImplementedError
def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
def candidate_targets(self) -> List[_vendoring.archspec.cpu.Microarchitecture]:
"""Returns a list of targets that are candidate for concretization"""
raise NotImplementedError
@ -70,7 +70,7 @@ def __init__(self, *, configuration: spack.config.Configuration, repo: spack.rep
self.configuration = configuration
self.repo = repo
self._platform_condition = spack.spec.Spec(
f"platform={spack.platforms.host()} target={archspec.cpu.host().family}:"
f"platform={spack.platforms.host()} target={_vendoring.archspec.cpu.host().family}:"
)
try:
@ -110,10 +110,10 @@ def unreachable(self, *, pkg_name: str, when_spec: spack.spec.Spec) -> bool:
"""
return False
def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
def candidate_targets(self) -> List[_vendoring.archspec.cpu.Microarchitecture]:
"""Returns a list of targets that are candidate for concretization"""
platform = spack.platforms.host()
default_target = archspec.cpu.TARGETS[platform.default]
default_target = _vendoring.archspec.cpu.TARGETS[platform.default]
# Construct the list of targets which are compatible with the host
candidate_targets = [default_target] + default_target.ancestors
@ -125,7 +125,7 @@ def candidate_targets(self) -> List[archspec.cpu.Microarchitecture]:
additional_targets_in_family = sorted(
[
t
for t in archspec.cpu.TARGETS.values()
for t in _vendoring.archspec.cpu.TARGETS.values()
if (t.family.name == default_target.family.name and t not in candidate_targets)
],
key=lambda x: len(x.ancestors),

View File

@ -74,10 +74,9 @@
overload,
)
import _vendoring.archspec.cpu
from _vendoring.typing_extensions import Literal
import archspec.cpu
import llnl.path
import llnl.string
import llnl.util.filesystem as fs
@ -217,10 +216,12 @@ def ensure_modern_format_string(fmt: str) -> None:
)
def _make_microarchitecture(name: str) -> archspec.cpu.Microarchitecture:
if isinstance(name, archspec.cpu.Microarchitecture):
def _make_microarchitecture(name: str) -> _vendoring.archspec.cpu.Microarchitecture:
if isinstance(name, _vendoring.archspec.cpu.Microarchitecture):
return name
return archspec.cpu.TARGETS.get(name, archspec.cpu.generic_microarchitecture(name))
return _vendoring.archspec.cpu.TARGETS.get(
name, _vendoring.archspec.cpu.generic_microarchitecture(name)
)
@lang.lazy_lexicographic_ordering
@ -364,7 +365,7 @@ def target(self, value):
# will assumed to be the host machine's platform.
def target_or_none(t):
if isinstance(t, archspec.cpu.Microarchitecture):
if isinstance(t, _vendoring.archspec.cpu.Microarchitecture):
return t
if t and t != "None":
return _make_microarchitecture(t)

View File

@ -3,10 +3,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import platform
import _vendoring.archspec.cpu
import pytest
import archspec.cpu
import spack.concretize
import spack.operating_systems
import spack.platforms
@ -125,7 +124,8 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
)
@pytest.mark.usefixtures("mock_packages", "config")
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="tests are for x86_64 uarch ranges"
str(_vendoring.archspec.cpu.host().family) != "x86_64",
reason="tests are for x86_64 uarch ranges",
)
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
spec = spack.concretize.concretize_one(

View File

@ -8,10 +8,9 @@
import sys
from typing import Dict, Optional, Tuple
import _vendoring.archspec.cpu
import pytest
import archspec.cpu
from llnl.path import Path, convert_to_platform_path
from llnl.util.filesystem import HeaderList, LibraryList
@ -727,14 +726,15 @@ def test_rpath_with_duplicate_link_deps():
@pytest.mark.filterwarnings("ignore:microarchitecture specific")
@pytest.mark.not_on_windows("Windows doesn't support the compiler wrapper")
def test_optimization_flags(compiler_spec, target_name, expected_flags, compiler_factory):
target = archspec.cpu.TARGETS[target_name]
target = _vendoring.archspec.cpu.TARGETS[target_name]
compiler = spack.spec.parse_with_version_concrete(compiler_spec)
opt_flags = spack.build_environment.optimization_flags(compiler, target)
assert opt_flags == expected_flags
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="tests check specific x86_64 uarch flags"
str(_vendoring.archspec.cpu.host().family) != "x86_64",
reason="tests check specific x86_64 uarch flags",
)
@pytest.mark.not_on_windows("Windows doesn't support the compiler wrapper")
def test_optimization_flags_are_using_node_target(default_mock_concretization, monkeypatch):

View File

@ -5,11 +5,10 @@
import glob
import os
import _vendoring.archspec.cpu
import py.path
import pytest
import archspec.cpu
import llnl.util.filesystem as fs
import spack
@ -217,7 +216,8 @@ def test_autotools_gnuconfig_replacement_disabled(self, mutable_database):
@pytest.mark.disable_clean_stage_check
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
str(_vendoring.archspec.cpu.host().family) != "x86_64",
reason="test data is specific for x86_64",
)
def test_autotools_gnuconfig_replacement_no_gnuconfig(self, mutable_database, monkeypatch):
"""

View File

@ -61,6 +61,26 @@ def test_mirror_from_env(mutable_mock_env_path, tmp_path, mock_packages, mock_fe
assert mirror_res == expected
# Test for command line-specified spec in concretized environment
def test_mirror_spec_from_env(mutable_mock_env_path, tmp_path, mock_packages, mock_fetch):
mirror_dir = str(tmp_path / "mirror-B")
env_name = "test"
env("create", env_name)
with ev.read(env_name):
add("simple-standalone-test@0.9")
concretize()
with spack.config.override("config:checksum", False):
mirror("create", "-d", mirror_dir, "simple-standalone-test")
e = ev.read(env_name)
assert set(os.listdir(mirror_dir)) == set([s.name for s in e.user_specs])
spec = e.concrete_roots()[0]
mirror_res = os.listdir(os.path.join(mirror_dir, spec.name))
expected = ["%s.tar.gz" % spec.format("{name}-{version}")]
assert mirror_res == expected
@pytest.fixture
def source_for_pkg_with_hash(mock_packages, tmpdir):
s = spack.concretize.concretize_one("trivial-pkg-with-valid-hash")
@ -401,8 +421,7 @@ def test_all_specs_with_all_versions_dont_concretize(self):
@pytest.mark.parametrize(
"cli_args,error_str",
[
# Passed more than one among -f --all and specs
({"specs": "hdf5", "file": None, "all": True}, "cannot specify specs on command line"),
# Passed more than one among -f --all
(
{"specs": None, "file": "input.txt", "all": True},
"cannot specify specs with a file if",

View File

@ -95,47 +95,24 @@ class _7zip(Package):
pass
"""
# this is written like this to be explicit about line endings and indentation
OLD_NUMPY = (
b"# some comment\r\n"
b"\r\n"
b"import spack.pkg.builtin.foo, spack.pkg.builtin.bar\r\n"
b"from spack.package import *\r\n"
b"from something.unrelated import AutotoolsPackage\r\n"
b"\r\n"
b"if True:\r\n"
b"\tfrom spack.pkg.builtin import (\r\n"
b"\t\tfoo,\r\n"
b"\t\tbar as baz,\r\n"
b"\t)\r\n"
b"\r\n"
b"class PyNumpy(CMakePackage, AutotoolsPackage):\r\n"
b"\tgenerator('ninja')\r\n"
b"\r\n"
b"\tdef example(self):\r\n"
b"\t\t# unchanged comment: spack.pkg.builtin.foo.something\r\n"
b"\t\treturn spack.pkg.builtin.foo.example(), foo, baz\r\n"
)
OLD_NUMPY = b"""\
# some comment
from spack.package import *
class PyNumpy(CMakePackage):
generator("ninja")
"""
NEW_NUMPY = (
b"# some comment\r\n"
b"\r\n"
b"import spack_repo.builtin.packages.foo.package, spack_repo.builtin.packages.bar.package\r\n"
b"from spack_repo.builtin.build_systems.cmake import CMakePackage, generator\r\n"
b"from spack.package import *\r\n"
b"from something.unrelated import AutotoolsPackage\r\n"
b"\r\n"
b"if True:\r\n"
b"\timport spack_repo.builtin.packages.foo.package as foo\r\n"
b"\timport spack_repo.builtin.packages.bar.package as baz\r\n"
b"\r\n"
b"class PyNumpy(CMakePackage, AutotoolsPackage):\r\n"
b"\tgenerator('ninja')\r\n"
b"\r\n"
b"\tdef example(self):\r\n"
b"\t\t# unchanged comment: spack.pkg.builtin.foo.something\r\n"
b"\t\treturn spack_repo.builtin.packages.foo.package.example(), foo, baz\r\n"
)
NEW_NUMPY = b"""\
# some comment
from spack_repo.builtin.build_systems.cmake import CMakePackage, generator
from spack.package import *
class PyNumpy(CMakePackage):
generator("ninja")
"""
def test_repo_migrate(tmp_path: pathlib.Path, config):
@ -165,6 +142,7 @@ def test_repo_migrate(tmp_path: pathlib.Path, config):
assert pkg_py_numpy_new.read_bytes() == NEW_NUMPY
@pytest.mark.not_on_windows("Known failure on windows")
def test_migrate_diff(git: Executable, tmp_path: pathlib.Path):
root, _ = spack.repo.create_repo(str(tmp_path), "foo", package_api=(2, 0))
r = pathlib.Path(root)

View File

@ -4,10 +4,9 @@
import os
import _vendoring.archspec.cpu
import pytest
import archspec.cpu
import spack.concretize
import spack.config
import spack.paths
@ -86,7 +85,8 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
{"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
1,
marks=pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
str(_vendoring.archspec.cpu.host().family) != "x86_64",
reason="test data is x86_64 specific",
),
),
pytest.param(
@ -98,7 +98,8 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
},
2,
marks=pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
str(_vendoring.archspec.cpu.host().family) != "x86_64",
reason="test data is x86_64 specific",
),
),
],

View File

@ -6,11 +6,10 @@
import platform
import sys
import _vendoring.archspec.cpu
import _vendoring.jinja2
import pytest
import archspec.cpu
import llnl.util.lang
import spack.binary_distribution
@ -146,12 +145,12 @@ def current_host(request, monkeypatch):
monkeypatch.setattr(spack.platforms.Test, "default", cpu)
if not is_preference:
target = archspec.cpu.TARGETS[cpu]
monkeypatch.setattr(archspec.cpu, "host", lambda: target)
target = _vendoring.archspec.cpu.TARGETS[cpu]
monkeypatch.setattr(_vendoring.archspec.cpu, "host", lambda: target)
yield target
else:
target = archspec.cpu.TARGETS["sapphirerapids"]
monkeypatch.setattr(archspec.cpu, "host", lambda: target)
target = _vendoring.archspec.cpu.TARGETS["sapphirerapids"]
monkeypatch.setattr(_vendoring.archspec.cpu, "host", lambda: target)
with spack.config.override("packages:all", {"target": [cpu]}):
yield target
@ -383,7 +382,7 @@ def test_different_compilers_get_different_flags(
"gcc": {"externals": [gcc11_with_flags]},
},
)
t = archspec.cpu.host().family
t = _vendoring.archspec.cpu.host().family
client = spack.concretize.concretize_one(
Spec(
f"cmake-client platform=test os=redhat6 target={t} %gcc@11.1.0"
@ -976,7 +975,7 @@ def test_noversion_pkg(self, spec):
def test_adjusting_default_target_based_on_compiler(
self, spec, compiler_spec, best_achievable, current_host, compiler_factory, mutable_config
):
best_achievable = archspec.cpu.TARGETS[best_achievable]
best_achievable = _vendoring.archspec.cpu.TARGETS[best_achievable]
expected = best_achievable if best_achievable < current_host else current_host
mutable_config.set(
"packages", {"gcc": {"externals": [compiler_factory(spec=f"{compiler_spec}")]}}
@ -1645,7 +1644,7 @@ def test_target_granularity(self):
# The test architecture uses core2 as the default target. Check that when
# we configure Spack for "generic" granularity we concretize for x86_64
default_target = spack.platforms.test.Test.default
generic_target = archspec.cpu.TARGETS[default_target].generic.name
generic_target = _vendoring.archspec.cpu.TARGETS[default_target].generic.name
s = Spec("python")
assert spack.concretize.concretize_one(s).satisfies("target=%s" % default_target)
with spack.config.override("concretizer:targets", {"granularity": "generic"}):
@ -2011,7 +2010,7 @@ def test_installed_specs_disregard_conflicts(self, mutable_database, monkeypatch
def test_require_targets_are_allowed(self, mutable_config, mutable_database):
"""Test that users can set target constraints under the require attribute."""
# Configuration to be added to packages.yaml
required_target = archspec.cpu.TARGETS[spack.platforms.test.Test.default].family
required_target = _vendoring.archspec.cpu.TARGETS[spack.platforms.test.Test.default].family
external_conf = {"all": {"require": f"target={required_target}"}}
mutable_config.set("packages", external_conf)

View File

@ -20,13 +20,12 @@
import tempfile
import xml.etree.ElementTree
import _vendoring.archspec.cpu
import _vendoring.archspec.cpu.microarchitecture
import _vendoring.archspec.cpu.schema
import py
import pytest
import archspec.cpu
import archspec.cpu.microarchitecture
import archspec.cpu.schema
import llnl.util.lang
import llnl.util.lock
import llnl.util.tty as tty
@ -372,12 +371,12 @@ def clean_test_environment():
def _host():
"""Mock archspec host so there is no inconsistency on the Windows platform
This function cannot be local as it needs to be pickleable"""
return archspec.cpu.Microarchitecture("x86_64", [], "generic", [], {}, 0)
return _vendoring.archspec.cpu.Microarchitecture("x86_64", [], "generic", [], {}, 0)
@pytest.fixture(scope="function")
def archspec_host_is_spack_test_host(monkeypatch):
monkeypatch.setattr(archspec.cpu, "host", _host)
monkeypatch.setattr(_vendoring.archspec.cpu, "host", _host)
# Hooks to add command line options or set other custom behaviors.
@ -728,14 +727,14 @@ def mock_uarch_json(tmpdir_factory):
@pytest.fixture(scope="session")
def mock_uarch_configuration(mock_uarch_json):
"""Create mock dictionaries for the archspec.cpu."""
"""Create mock dictionaries for the _vendoring.archspec.cpu."""
def load_json():
with open(mock_uarch_json, encoding="utf-8") as f:
return json.load(f)
targets_json = load_json()
targets = archspec.cpu.microarchitecture._known_microarchitectures()
targets = _vendoring.archspec.cpu.microarchitecture._known_microarchitectures()
yield targets_json, targets
@ -744,8 +743,8 @@ def load_json():
def mock_targets(mock_uarch_configuration, monkeypatch):
"""Use this fixture to enable mock uarch targets for testing."""
targets_json, targets = mock_uarch_configuration
monkeypatch.setattr(archspec.cpu.schema, "TARGETS_JSON", targets_json)
monkeypatch.setattr(archspec.cpu.microarchitecture, "TARGETS", targets)
monkeypatch.setattr(_vendoring.archspec.cpu.schema, "TARGETS_JSON", targets_json)
monkeypatch.setattr(_vendoring.archspec.cpu.microarchitecture, "TARGETS", targets)
@pytest.fixture(scope="session")
@ -773,7 +772,7 @@ def configuration_dir(tmpdir_factory, linux_os):
config_template = test_config / "config.yaml"
config.write(config_template.read_text().format(install_tree_root, locks))
target = str(archspec.cpu.host().family)
target = str(_vendoring.archspec.cpu.host().family)
compilers = tmpdir.join("site", "packages.yaml")
compilers_template = test_config / "packages.yaml"
compilers.write(compilers_template.read_text().format(linux_os=linux_os, target=target))
@ -2117,7 +2116,7 @@ def _factory(*, spec):
@pytest.fixture()
def host_architecture_str():
"""Returns the broad architecture family (x86_64, aarch64, etc.)"""
return str(archspec.cpu.host().family)
return str(_vendoring.archspec.cpu.host().family)
def _true(x):

View File

@ -11,10 +11,9 @@
import json
import os
import _vendoring.archspec.cpu
import pytest
import archspec.cpu
import spack
import spack.cmd
import spack.cmd.external
@ -104,7 +103,7 @@ def spec_json(self):
@pytest.fixture
def _common_arch(test_platform):
generic = archspec.cpu.TARGETS[test_platform.default].family
generic = _vendoring.archspec.cpu.TARGETS[test_platform.default].family
return JsonArchEntry(platform=test_platform.name, os="redhat6", target=generic.name)

View File

@ -4,10 +4,9 @@
import os
import _vendoring.archspec.cpu
import pytest
import archspec.cpu
import spack.concretize
import spack.config
import spack.environment as ev
@ -221,7 +220,8 @@ def test_setenv_raw_value(self, modulefile_content, module_configuration):
assert len([x for x in content if 'setenv("FOO", "{{name}}, {name}, {{}}, {}")' in x]) == 1
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
str(_vendoring.archspec.cpu.host().family) != "x86_64",
reason="test data is specific for x86_64",
)
def test_help_message(self, modulefile_content, module_configuration):
"""Tests the generation of module help message."""

View File

@ -4,10 +4,9 @@
import os
import _vendoring.archspec.cpu
import pytest
import archspec.cpu
import spack.concretize
import spack.modules.common
import spack.modules.tcl
@ -186,7 +185,8 @@ def test_setenv_raw_value(self, modulefile_content, module_configuration):
assert len([x for x in content if "setenv FOO {{{name}}, {name}, {{}}, {}}" in x]) == 1
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
str(_vendoring.archspec.cpu.host().family) != "x86_64",
reason="test data is specific for x86_64",
)
def test_help_message(self, modulefile_content, module_configuration):
"""Tests the generation of module help message."""

View File

@ -208,7 +208,7 @@ def variant_type(self) -> VariantType:
else:
return VariantType.SINGLE
def __str__(self):
def __str__(self) -> str:
return (
f"Variant('{self.name}', "
f"default='{self.default}', "
@ -491,14 +491,14 @@ class DisjointSetsOfValues(collections.abc.Sequence):
*sets (list): mutually exclusive sets of values
"""
_empty_set = set(("none",))
_empty_set = {"none"}
def __init__(self, *sets):
def __init__(self, *sets: Tuple[str, ...]) -> None:
self.sets = [set(_flatten(x)) for x in sets]
# 'none' is a special value and can appear only in a set of
# a single element
if any("none" in s and s != set(("none",)) for s in self.sets):
if any("none" in s and s != {"none"} for s in self.sets):
raise spack.error.SpecError(
"The value 'none' represents the empty set,"
" and must appear alone in a set. Use the "

View File

@ -75,7 +75,6 @@ section-order = [
"future",
"standard-library",
"third-party",
"archspec",
"llnl",
"spack",
"first-party",
@ -84,7 +83,6 @@ section-order = [
[tool.ruff.lint.isort.sections]
spack = ["spack"]
archspec = ["archspec"]
llnl = ["llnl"]
[tool.ruff.lint.per-file-ignores]
@ -104,13 +102,11 @@ sections = [
"FUTURE",
"STDLIB",
"THIRDPARTY",
"ARCHSPEC",
"LLNL",
"FIRSTPARTY",
"LOCALFOLDER",
]
known_first_party = "spack"
known_archspec = "archspec"
known_llnl = "llnl"
known_third_party = ["ruamel", "six"]
src_paths = "lib"
@ -264,6 +260,9 @@ substitute = [
{ match = "from attr", replace = "from _vendoring.attr" },
{ match = "import jsonschema", replace = "import _vendoring.jsonschema" },
{ match = "from jsonschema", replace = "from _vendoring.jsonschema" },
{ match = "archspec.cpu", replace = "_vendoring.archspec.cpu" },
{ match = "archspec.__version__", replace = "_vendoring.archspec.__version__" },
{ match = "import archspec", replace = "import _vendoring.archspec" },
]
drop = [
# contains unnecessary scripts
@ -285,11 +284,21 @@ drop = [
"pvectorc.*.so",
# Trim jsonschema tests
"jsonschema/tests",
"archspec/json/tests",
"archspec/vendor/cpuid/.gitignore",
"pyrsistent/__init__.pyi",
]
[tool.vendoring.typing-stubs]
six = ["six.__init__", "six.moves.__init__", "six.moves.configparser"]
_pyrsistent_version = []
altgraph = []
archspec = []
distro = []
jsonschema = []
macholib = []
pyrsistent = []
ruamel = []
six = []
[tool.vendoring.license.directories]
setuptools = "pkg_resources"
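For context (not part of the diff), the substitute rules added above perform plain textual rewrites on the vendored sources. A rough standalone Python sketch of their effect, with a made-up input line and without using the vendoring tool itself:
# Textual effect of the three archspec substitute rules (illustration only).
rules = [
    ("archspec.cpu", "_vendoring.archspec.cpu"),
    ("archspec.__version__", "_vendoring.archspec.__version__"),
    ("import archspec", "import _vendoring.archspec"),
]

source = "import archspec.cpu\ntarget = archspec.cpu.host().family\n"
for match, replacement in rules:
    source = source.replace(match, replacement)
print(source, end="")
# import _vendoring.archspec.cpu
# target = _vendoring.archspec.cpu.host().family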

View File

@ -69,7 +69,7 @@ spack:
- alpgen
- ampt
- apfel +lhapdf +python
- celeritas ~cuda +openmp ~rocm +vecgeom
- celeritas ~cuda +openmp ~rocm +vecgeom +covfie
- cepgen
- cernlib +shared
- collier
@ -130,12 +130,12 @@ spack:
# CUDA
#- acts +cuda +traccc cuda_arch=80
#- celeritas +cuda ~openmp +vecgeom cuda_arch=80
#- celeritas +cuda ~openmp +vecgeom cuda_arch=80 +covfie
- root +cuda +cudnn +tmva-gpu
- vecgeom +cuda cuda_arch=80
# ROCm
- celeritas +rocm amdgpu_target=gfx90a ~openmp ~vecgeom # only available with ORANGE
- celeritas +rocm amdgpu_target=gfx90a ~openmp ~vecgeom # only available with ORANGE
ci:
pipeline-gen:

View File

@ -10,7 +10,7 @@
import stat
from typing import Dict, Iterable, List, Mapping, Optional, Tuple
import archspec
import _vendoring.archspec.cpu
import llnl.util.filesystem as fs
from llnl.util.lang import ClassProperty, classproperty, match_predicate
@ -250,7 +250,7 @@ def test_imports(self) -> None:
):
python("-c", f"import {module}")
def update_external_dependencies(self, extendee_spec=None):
def _update_external_dependencies(self, extendee_spec: Optional[Spec] = None) -> None:
"""
Ensure all external python packages have a python dependency
@ -286,7 +286,7 @@ def update_external_dependencies(self, extendee_spec=None):
if not python.architecture.os:
python.architecture.os = platform.default_operating_system()
if not python.architecture.target:
python.architecture.target = archspec.cpu.host().family.name
python.architecture.target = _vendoring.archspec.cpu.host().family.name
python.external_path = self.spec.external_path
python._mark_concrete()

View File

@ -1,4 +1,5 @@
#!/bin/sh
cd ${0%/*} || exit 1 # Run from this directory
applications/Allwmake $targetType $*
wmake $targetType applications/solvers/additiveFoam/movingHeatSource
wmake $targetType applications/solvers/additiveFoam

View File

@ -1,5 +0,0 @@
#!/bin/sh
cd ${0%/*} || exit 1 # Run from this directory
wmake libso solvers/additiveFoam/movingHeatSource
wmake solvers/additiveFoam

View File

@ -1,4 +1,4 @@
#!/bin/sh
cd ${0%/*} || exit 1 # Run from this directory
applications/Allwmake $targetType $*
./applications/solvers/additiveFoam/Allwmake $targetType $*

View File

@ -1,9 +0,0 @@
#!/bin/sh
cd ${0%/*} || exit 1 # Run from this directory
# Parse arguments for library compilation
. $WM_PROJECT_DIR/wmake/scripts/AllwmakeParseArguments
wmake $targetType solvers/additiveFoam/functionObjects/ExaCA
wmake $targetType solvers/additiveFoam/movingHeatSource
wmake $targetType solvers/additiveFoam

View File

@ -15,9 +15,9 @@
class Additivefoam(Package):
"""AdditiveFOAM is a heat and mass transfer software for Additive Manufacturing (AM)"""
homepage = "https://github.com/ORNL/AdditiveFOAM"
homepage = "https://ornl.github.io/AdditiveFOAM/"
git = "https://github.com/ORNL/AdditiveFOAM.git"
url = "https://github.com/ORNL/AdditiveFOAM/archive/1.0.0.tar.gz"
url = "https://github.com/ORNL/AdditiveFOAM/archive/1.1.0.tar.gz"
maintainers("streeve", "colemanjs", "gknapp1")
@ -26,16 +26,17 @@ class Additivefoam(Package):
license("GPL-3.0-only")
version("main", branch="main")
version("1.1.0", sha256="a13770bd66fe10224705fb3a2bfb557e63e0aea98c917b0084cf8b91eaa53ee2")
version("1.0.0", sha256="abbdf1b0230cd2f26f526be76e973f508978611f404fe8ec4ecdd7d5df88724c")
depends_on("cxx", type="build") # generated
depends_on("openfoam-org@10")
common = ["spack-derived-Allwmake"]
assets = [join_path("applications", "Allwmake"), "Allwmake"]
common = []
assets = ["Allwmake"]
build_script = "./spack-derived-Allwmake"
build_script = "./Allwmake"
phases = ["configure", "build", "install"]
@ -56,15 +57,49 @@ def add_extra_files(self, common, local_prefix, local):
openfoam.install(join_path(indir, f), join_path(outdir, f))
def patch(self):
"""Patches build by adding Allwmake from the asset directory based on
the spec version.
For all versions after 1.0.0 there is an Allwmake script in
the AdditiveFOAM repository that can be called by the spack assets_main/Allwmake
script, whereas the assets_1.0.0/Allwmake script contains the
build instructions."""
spec = self.spec
asset_dir = ""
if Version("main") in spec.versions:
asset_dir = "assets_main"
elif Version("1.0.0") in spec.versions:
asset_dir = "assets_main"
if Version("1.0.0") in spec.versions:
asset_dir = "assets_1.0.0"
self.add_extra_files(self.common, asset_dir, self.assets)
def setup_build_environment(self, env):
"""Set up the build environment variables."""
# Ensure that the directories exist
mkdirp(self.prefix.bin)
mkdirp(self.prefix.lib)
# Add to the environment
env.set("FOAM_USER_APPBIN", self.prefix.bin)
env.set("FOAM_USER_LIBBIN", self.prefix.lib)
def setup_run_environment(self, env):
"""Set up the run environment variables."""
# Add to the environment
env.prepend_path("PATH", self.prefix.bin)
env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib)
def activate(self, spec, prefix):
"""Activate the package to modify the environment."""
self.setup_run_environment(self.spec.environment())
def deactivate(self, spec, prefix):
"""Deactivate the package and clean up the environment."""
env = self.spec.environment()
env.pop("FOAM_USER_APPBIN", None)
env.pop("FOAM_USER_LIBBIN", None)
def configure(self, spec, prefix):
"""Configure the environment for building."""
pass
def build(self, spec, prefix):

View File

@ -21,6 +21,7 @@ class Amdsmi(CMakePackage):
libraries = ["libamd_smi"]
license("MIT")
version("6.4.0", sha256="6f0200ba7305171e9dadbfcd41ff00c194b98d2b88e0555c57739ef01c767233")
version("6.3.3", sha256="e23abc65a1cd75764d7da049b91cce2a095b287279efcd4f90b4b9b63b974dd5")
version("6.3.2", sha256="1ed452eedfe51ac6e615d7bfe0bd7a0614f21113874ae3cbea7df72343cc2d13")
version("6.3.1", sha256="a3a5a711052e813b9be9304d5e818351d3797f668ec2a455e61253a73429c355")
@ -38,6 +39,7 @@ class Amdsmi(CMakePackage):
version("5.5.1", sha256="b794c7fd562fd92f2c9f2bbdc2d5dded7486101fcd4598f2e8c3484c9a939281")
version("5.5.0", sha256="dcfbd96e93afcf86b1261464e008e9ef7e521670871a1885e6eaffc7cdc8f555")
depends_on("c", type="build")
depends_on("cxx", type="build") # generated
depends_on("cmake@3.11:")

View File

@ -10,6 +10,20 @@
from spack.package import *
_versions = {
"6.4.0": {
"apt": (
"5ec56bc3c227ad37227072bd513c58c9501e1ceefb06692ad4812f337853dca4",
"https://repo.radeon.com/rocm/apt/6.4/pool/main/h/hsa-amd-aqlprofile/hsa-amd-aqlprofile_1.0.0.60400-47~22.04_amd64.deb",
),
"yum": (
"22ed4c6a999ca6823e5e6bf9f4ab560cba68025f354346b1ac2ebb4757239c56",
"https://repo.radeon.com/rocm/rhel8/6.4/main/hsa-amd-aqlprofile-1.0.0.60400-47.el8.x86_64.rpm",
),
"zyp": (
"7a4c9ca0e6ca178c65776f9b1d9d01ca7576eaa555fdcbf49a42def1ce6d6041",
"https://repo.radeon.com/rocm/zyp/6.4/main/hsa-amd-aqlprofile-1.0.0.60400-sles156.47.x86_64.rpm",
),
},
"6.3.3": {
"apt": (
"5fe2b18e75e8c0a66069af8446399796818f7340a9ef5f2b52adaa79ee8e2a37",
@ -307,6 +321,7 @@ class Aqlprofile(Package):
"6.3.1",
"6.3.2",
"6.3.3",
"6.4.0",
]:
depends_on(f"hsa-rocr-dev@{ver}", when=f"@{ver}")

View File

@ -19,6 +19,7 @@ class AwsOfiNccl(AutotoolsPackage):
maintainers("bvanessen")
version("master", branch="master")
version("1.14.2", sha256="e523ea08ce0caeff5c949b2134b4897186d793ce908904dd9d47bb08230b9bbd")
version("1.13.0", sha256="50dd231a0a99cec29300df46b8e828139ced15322a3c3c41b1d22dcc9a62ec02")
version("1.12.1", sha256="821f0929c016e5448785bbc6795af5096559ecfc6c9479eb3818cafa61424576")
version("1.12.0", sha256="93029207103b75f4dc15f023b3b8692851202b52b7e2824723dd5d328f0ea65b")
@ -51,7 +52,7 @@ class AwsOfiNccl(AutotoolsPackage):
depends_on("libtool", type="build")
def url_for_version(self, version):
if version < Version("1.7.0"):
if version < Version("1.7.0") or version >= Version("1.14.0"):
return super().url_for_version(version)
url_fmt = "https://github.com/aws/aws-ofi-nccl/archive/v{0}-aws.tar.gz"
return url_fmt.format(version)

View File

@ -4,23 +4,19 @@
from spack_repo.builtin.build_systems.generic import Package
import archspec
from spack.package import *
class Blast2go(Package):
"""Blast2GO is a bioinformatics platform for high-quality functional
annotation and analysis of genomic datasets."""
annotation and analysis of genomic datasets.
"""
homepage = "https://www.blast2go.com/"
version("5.2.5", sha256="c37aeda25f96ac0553b52da6b5af3167d50671ddbfb3b39bcb11afe5d0643891")
for t in set(
[str(x.family) for x in archspec.cpu.TARGETS.values() if str(x.family) != "x86_64"]
):
conflicts("target={0}:".format(t), msg="blast2go is available x86_64 only")
requires("target=x86_64:", msg="blast2go is available x86_64 only")
depends_on("bash", type="build")
depends_on("blast-plus", type="run")

View File

@ -19,12 +19,17 @@ class Celeritas(CMakePackage, CudaPackage, ROCmPackage):
git = "https://github.com/celeritas-project/celeritas.git"
url = "https://github.com/celeritas-project/celeritas/releases/download/v0.1.0/celeritas-0.1.0.tar.gz"
maintainers("sethrj")
maintainers("sethrj", "drbenmorgan")
license("Apache-2.0")
sanity_check_is_file = ["bin/celer-sim"]
version("develop", branch="develop", get_full_repo=True)
version("0.6.0", sha256="c776dee357ecff42f85ed02c328f24b092400af28e67af2c0e195ce8f67613b0")
version("0.5.3", sha256="4d1fe1f34e899c3599898fb6d44686d2582a41b0872784514aa8c562597b3ee6")
version("0.5.2", sha256="46311c096b271d0331b82c02485ac6bf650d5b0f7bd948fb01aef5058f8824e3")
version("0.5.1", sha256="182d5466fbd98ba9400b343b55f6a06e03b77daed4de1dd16f632ac0a3620249")
version("0.5.0", sha256="4a8834224d96fd01897e5872ac109f60d91ef0bd7b63fac05a73dcdb61a5530e")
version("0.4.4", sha256="8b5ae63aa2d50c2ecf48d752424e4a33c50c07d9f0f5ca5448246de3286fd836")
@ -59,11 +64,13 @@ class Celeritas(CMakePackage, CudaPackage, ROCmPackage):
multi=False,
description="C++ standard version",
)
variant("covfie", default=False, when="@0.6:", description="Enable covfie magnetic fields")
variant("debug", default=False, description="Enable runtime debug assertions")
variant("doc", default=False, description="Build and install documentation")
variant("geant4", default=True, description="Enable Geant4 integration")
variant("hepmc3", default=True, description="Use HepMC3 I/O interfaces")
variant("openmp", default=False, description="Use OpenMP multithreading")
variant("perfetto", default=False, when="@0.5:", description="Use Perfetto profiling")
variant("root", default=False, description="Use ROOT I/O")
variant("shared", default=True, description="Build shared libraries")
variant("swig", default=False, when="@:0.4", description="Generate SWIG Python bindings")
@ -75,7 +82,9 @@ class Celeritas(CMakePackage, CudaPackage, ROCmPackage):
depends_on("cmake@3.18:", type="build", when="+cuda+vecgeom")
depends_on("cmake@3.22:", type="build", when="+rocm")
depends_on("cli11", when="@0.6:")
depends_on("nlohmann-json")
depends_on("covfie@0.13:", when="+covfie")
depends_on("geant4@10.5:11.1", when="@0.3.1:0.4.1 +geant4")
depends_on("geant4@10.5:11.2", when="@0.4.2:0.4 +geant4")
depends_on("geant4@10.5:", when="@0.5: +geant4")
@ -104,14 +113,20 @@ class Celeritas(CMakePackage, CudaPackage, ROCmPackage):
for _std in _cxxstd_values:
for _pkg in ["geant4", "root", "vecgeom"]:
depends_on(f"{_pkg} cxxstd={_std}", when=f"+{_pkg} cxxstd={_std}")
# NOTE: as a private dependency, covfie cxxstd can differ from the
# celeritas "public" standard
# Ensure consistent CUDA architectures
depends_on("vecgeom +cuda cuda_arch=none", when="+vecgeom +cuda cuda_arch=none")
for _arch in CudaPackage.cuda_arch_values:
depends_on(f"vecgeom +cuda cuda_arch={_arch}", when=f"+vecgeom +cuda cuda_arch={_arch}")
for _pkg in ["covfie", "vecgeom"]:
depends_on(f"{_pkg} +cuda cuda_arch={_arch}", when=f"+{_pkg} +cuda cuda_arch={_arch}")
conflicts("+rocm", when="+covfie", msg="HIP support is not yet available with covfie")
conflicts("+rocm", when="+cuda", msg="AMD and NVIDIA accelerators are incompatible")
conflicts("+rocm", when="+vecgeom", msg="HIP support is only available with ORANGE")
for _arch in ["cuda", "rocm"]:
conflicts("+perfetto", when=f"+{_arch}", msg="Perfetto is only used for CPU profiling")
# geant4@11.3.0 now returns const G4Element::GetElementTable()
patch(
@ -134,7 +149,18 @@ def cmake_args(self):
define("CELERITAS_USE_Python", True),
]
for pkg in ["CUDA", "Geant4", "HepMC3", "OpenMP", "ROOT", "SWIG", "VecGeom"]:
# NOTE: package names are stylized like the dependency asks: Find{pkg}.cmake
for pkg in [
"covfie",
"CUDA",
"Geant4",
"HepMC3",
"OpenMP",
"Perfetto",
"ROOT",
"SWIG",
"VecGeom",
]:
args.append(from_variant("CELERITAS_USE_" + pkg, pkg.lower()))
if self.spec.satisfies("+cuda"):
@ -162,8 +188,9 @@ def cmake_args(self):
args.extend(
(
define(f"CELERITAS_BUILTIN_{pkg}", False)
for pkg in ["GTest", "nlohmann_json", "G4VG"]
for pkg in ["covfie", "CLI11", "GTest", "nlohmann_json", "G4VG"]
)
)
# NOTE: Perfetto is always vendored!
return args
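A standalone approximation (not Spack's actual define_from_variant helper) of what the CELERITAS_USE_* loop above contributes to the CMake command line, for a hypothetical spec such as celeritas +covfie ~cuda +geant4:
def define(cmake_var: str, enabled: bool) -> str:
    # Mirrors the "-D<var>:BOOL=ON/OFF" shape of CMake cache definitions.
    return f"-D{cmake_var}:BOOL={'ON' if enabled else 'OFF'}"

# Hypothetical variant values; names are stylized as the Find<pkg>.cmake modules expect.
variant_values = {"covfie": True, "CUDA": False, "Geant4": True}
args = [define(f"CELERITAS_USE_{pkg}", on) for pkg, on in variant_values.items()]
print(args)
# ['-DCELERITAS_USE_covfie:BOOL=ON', '-DCELERITAS_USE_CUDA:BOOL=OFF', '-DCELERITAS_USE_Geant4:BOOL=ON']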

View File

@ -13,13 +13,15 @@ class Chafa(AutotoolsPackage):
suitable for display in a terminal."""
homepage = "https://hpjansson.org/chafa/"
url = "https://hpjansson.org/chafa/releases/chafa-1.14.5.tar.xz"
url = "https://hpjansson.org/chafa/releases/chafa-1.16.1.tar.xz"
git = "https://github.com/hpjansson/chafa.git"
license("LGPL-3.0-or-later", checked_by="Buldram")
maintainers("Buldram")
version("master", branch="master")
version("1.16.1", sha256="4a25debb71530baf0a748b15cfee6b8da6b513f696d9484987eaf410ecce1129")
version("1.16.0", sha256="bf863e57b6200b696bde1742aa95d7feb8cd23b9df1e91e91859b2b1e54fd290")
version("1.14.5", sha256="7b5b384d5fb76a641d00af0626ed2115fb255ea371d9bef11f8500286a7b09e5")
version("1.14.4", sha256="d0708a63f05b79269dae862a42671e38aece47fbd4fc852904bca51a65954454")
version("1.14.3", sha256="f3d5530a96c8e55eea180448896e973093e0302f4cbde45d028179af8cfd90f3")
@ -75,6 +77,19 @@ def configure_args(self):
*self.with_or_without("jxl"),
]
@run_after("install")
def install_completions(self):
mkdirp(zsh_completion_path(self.prefix))
install(
"tools/completions/zsh-completion.zsh", zsh_completion_path(self.prefix) / "_chafa"
)
if self.spec.satisfies("@1.16.1:"):
mkdirp(fish_completion_path(self.prefix))
install(
"tools/completions/fish-completion.fish",
fish_completion_path(self.prefix) / "chafa.fish",
)
@run_after("install", when="+man")
def install_man(self):
mkdirp(prefix.share.man.man1)

View File

@ -31,6 +31,7 @@ def url_for_version(self, version):
license("NCSA")
version("master", branch="amd-stg-open", deprecated=True)
version("6.4.0", sha256="dca1c145a23f05229d5d646241f9d1d3c5dbf1d745b338ae020eabe33beb965c")
version("6.3.3", sha256="4df9aba24e574edf23844c0d2d9dda112811db5c2b08c9428604a21b819eb23d")
version("6.3.2", sha256="1f52e45660ea508d3fe717a9903fe27020cee96de95a3541434838e0193a4827")
version("6.3.1", sha256="e9c2481cccacdea72c1f8d3970956c447cec47e18dfb9712cbbba76a2820552c")
@ -94,6 +95,7 @@ def url_for_version(self, version):
"6.3.1",
"6.3.2",
"6.3.3",
"6.4.0",
"master",
]:
# llvm libs are linked statically, so this *could* be a build dep
@ -123,6 +125,7 @@ def url_for_version(self, version):
"6.3.1",
"6.3.2",
"6.3.3",
"6.4.0",
]:
depends_on(f"rocm-core@{ver}", when=f"@{ver}")

View File

@ -6,10 +6,9 @@
import sys
from typing import List
import _vendoring.archspec.cpu
from spack_repo.builtin.build_systems.generic import Package
import archspec.cpu
from llnl.util import lang
import spack.compilers.libraries
@ -183,12 +182,12 @@ def setup_dependent_build_environment(
env.set(f"SPACK_{wrapper_var_name}_RPATH_ARG", compiler_pkg.rpath_arg)
uarch = dependent_spec.architecture.target
version_number, _ = archspec.cpu.version_components(
version_number, _ = _vendoring.archspec.cpu.version_components(
compiler_pkg.spec.version.dotted_numeric_string
)
try:
isa_arg = uarch.optimization_flags(compiler_pkg.archspec_name(), version_number)
except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
except (ValueError, _vendoring.archspec.cpu.UnsupportedMicroarchitecture):
isa_arg = ""
if isa_arg:

View File

@ -21,6 +21,7 @@ class ComposableKernel(CMakePackage):
license("MIT")
version("master", branch="develop", deprecated=True)
version("6.4.0", sha256="8dbfea0bdc4950ca60e8d1ea43edf1f515c4a34e47ead951415c49a0669a3baf")
version("6.3.3", sha256="b7102efba044455416a6127af1951019fe8365a653ea7eb0b1d83bb4542c9309")
version("6.3.2", sha256="875237fe493ff040f8f63b827cddf2ff30a8d3aa18864f87d0e35323c7d62a2d")
version("6.3.1", sha256="3e8c8c832ca3f9ceb99ab90f654b93b7db876f08d90eda87a70bc629c854052a")
@ -65,6 +66,7 @@ class ComposableKernel(CMakePackage):
for ver in [
"master",
"6.4.0",
"6.3.3",
"6.3.2",
"6.3.1",

View File

@ -0,0 +1,32 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack_repo.builtin.build_systems.makefile import MakefilePackage
from spack.package import *
class Cryoef(MakefilePackage):
"""An open-source software package for robust analysis of the orientation distribution
of cryoelectron microscopy data."""
homepage = "https://www.mrc-lmb.cam.ac.uk/crusso/cryoEF"
url = "https://www.mrc-lmb.cam.ac.uk/crusso/cryoEF/cryoEF_v1.1.0.tar.gz"
version("1.1.0", sha256="655ed8543a0226754bdeb6e0dd4efc0467f15dc4c9c963c44ef7b8d3d0e41b62")
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("fftw-api@3")
def patch(self):
filter_file(
"-lfftw3", f"-lfftw3 {self.spec['fftw-api'].libs.ld_flags} -no-pie", "Makefile"
)
def install(self, spec, prefix):
install_tree("TestData", prefix.TestData)
install_tree("bin", prefix.bin)
install_tree("lib", prefix.lib)
install("PlotOD.py", prefix.bin)

View File

@ -41,3 +41,4 @@ class CyrusSasl(AutotoolsPackage):
depends_on("groff", type="build")
depends_on("openssl", type="link")
depends_on("libxcrypt", type="link")
depends_on("krb5", type="link")

View File

@ -22,6 +22,7 @@ class Direnv(GoPackage):
# Versions (newest to oldest)
version("master", branch="master")
version("2.36.0", sha256="edb89ca67ef46a792d4e20177dae9dbd229e26dcbcfb17baa9645c1ff7cc47b0")
version("2.35.0", sha256="a7aaec49d1b305f0745dad364af967fb3dc9bb5befc9f29d268d528b5a474e57")
version("2.34.0", sha256="3d7067e71500e95d69eac86a271a6b6fc3f2f2817ba0e9a589524bf3e73e007c")
version("2.33.0", sha256="8ef18051aa6bdcd6b59f04f02acdd0b78849b8ddbdbd372d4957af7889c903ea")
@ -34,5 +35,6 @@ class Direnv(GoPackage):
version("2.11.3", sha256="2d34103a7f9645059270763a0cfe82085f6d9fe61b2a85aca558689df0e7b006")
# Build dependencies
depends_on("go@1.24:", type="build", when="@2.36:")
depends_on("go@1.20:", type="build", when="@2.33:")
depends_on("go@1.16:", type="build", when="@2.28:")

View File

@ -26,6 +26,7 @@ class E2fsprogs(AutotoolsPackage):
depends_on("c", type="build") # generated
depends_on("cxx", type="build") # generated
depends_on("uuid")
depends_on("texinfo", type="build")
depends_on("fuse", when="+fuse2fs")
depends_on("pkgconfig", when="+fuse2fs")

View File

@ -22,6 +22,7 @@ class Edm4hep(CMakePackage):
license("Apache-2.0")
version("main", branch="main")
version("0.99.2", sha256="b3e7abb61fd969e4c9aef55dd6839a2186bf0b0d3801174fe6e0b9df8e0ebace")
version("0.99.1", sha256="84d990f09dbd0ad2198596c0c51238a4b15391f51febfb15dd3d191dc7aae9f4")
version("0.99", sha256="3636e8c14474237029bf1a8be11c53b57ad3ed438fd70a7e9b87c5d08f1f2ea6")
version("0.10.5", sha256="003c8e0c8e1d1844592d43d41384f4320586fbfa51d4d728ae0870b9c4f78d81")
@ -79,6 +80,7 @@ class Edm4hep(CMakePackage):
depends_on("podio@1:", when="@0.99:")
depends_on("podio@0.15:", when="@:0.10.5")
depends_on("podio@:1.1", when="@:0.99.0")
depends_on("podio@1.3:", when="@0.99.2:")
for _std in _cxxstd_values:
for _v in _std:
depends_on(f"podio cxxstd={_v.value}", when=f"cxxstd={_v.value}")
@ -109,6 +111,8 @@ def cmake_args(self):
self.define("BUILD_TESTING", self.run_tests),
self.define_from_variant("EDM4HEP_WITH_JSON", "json"),
]
if self.spec.satisfies("@:0.99.1 ^podio@1.3:"):
args.append(self.define("PODIO_USE_CLANG_FORMAT", False))
return args
def setup_run_environment(self, env: EnvironmentModifications) -> None:

View File

@ -16,67 +16,20 @@ class Fairlogger(CMakePackage):
maintainers("dennisklein", "ChristianTackeGSI")
version("develop", branch="dev", get_full_repo=True)
version("1.11.1", sha256="bba5814f101d705792499e43b387190d8b8c7592466171ae045d4926485f2f70")
version("1.10.4", sha256="2fa321893f2c8c599cca160db243299ce1e941fbfb3f935b1139caa943bc0dba")
version("1.9.3", sha256="0c02076ed708372d5ae7bdebcefc8e45a8cbfa480eea781308336d60a2781f3a")
version("2.2.0", sha256="8dfb11e3aa0a9c545f3dfb310d261956727cea558d4123fd8c9c98e135e4d02b")
version(
"1.9.0",
sha256="13bcaa0d4129f8d4e69a0a2ece8e5b7073760082c8aa028e3fc0c11106503095",
"1.11.1",
sha256="bba5814f101d705792499e43b387190d8b8c7592466171ae045d4926485f2f70",
deprecated=True,
)
version(
"1.8.0",
sha256="3f0a38dba1411b542d998e02badcc099c057b33a402954fc5c2ab74947a0c42c",
"1.10.4",
sha256="2fa321893f2c8c599cca160db243299ce1e941fbfb3f935b1139caa943bc0dba",
deprecated=True,
)
version(
"1.7.0",
sha256="ef467f0a70afc0549442323d70b165fa0b0b4b4e6f17834573ca15e8e0b007e4",
deprecated=True,
)
version(
"1.6.2",
sha256="5c6ef0c0029eb451fee71756cb96e6c5011040a9813e8889667b6f3b6b04ed03",
deprecated=True,
)
version(
"1.6.1",
sha256="3894580f4c398d724ba408e410e50f70c9f452e8cfaf7c3ff8118c08df28eaa8",
deprecated=True,
)
version(
"1.6.0",
sha256="721e8cadfceb2f63014c2a727e098babc6deba653baab8866445a772385d0f5b",
deprecated=True,
)
version(
"1.5.0",
sha256="8e74e0b1e50ee86f4fca87a44c6b393740b32099ac3880046bf252c31c58dd42",
deprecated=True,
)
version(
"1.4.0",
sha256="75457e86984cc03ce87d6ad37adc5aab1910cabd39a9bbe5fb21ce2475a91138",
deprecated=True,
)
version(
"1.3.0",
sha256="5cedea2773f7091d69aae9fd8f724e6e47929ee3784acdd295945a848eb36b93",
deprecated=True,
)
version(
"1.2.0",
sha256="bc0e049cf84ceb308132d8679e7f22fcdca5561dda314d5233d0d5fe2b0f8c62",
deprecated=True,
)
version(
"1.1.0",
sha256="e185e5bd07df648224f85e765d18579fae0de54adaab9a194335e3ad6d3d29f7",
deprecated=True,
)
version(
"1.0.6",
sha256="2fc266a6e494adda40837be406aef8d9838f385ffd64fbfafb1164833906b4e0",
"1.9.3",
sha256="0c02076ed708372d5ae7bdebcefc8e45a8cbfa480eea781308336d60a2781f3a",
deprecated=True,
)
@ -92,14 +45,21 @@ class Fairlogger(CMakePackage):
variant(
"cxxstd",
default="default",
values=("default", conditional("11", when="@:1.9"), "14", "17", "20"),
values=(
"default",
conditional("11", when="@:1.9"),
conditional("14", when="@:1"),
"17",
"20",
"23",
"26",
),
multi=False,
description="Use the specified C++ standard when building.",
)
variant(
"pretty", default=False, description="Use BOOST_PRETTY_FUNCTION macro (Supported by 1.4+)."
)
conflicts("+pretty", when="@:1.3")
depends_on("cxx", type="build") # generated
@ -108,14 +68,15 @@ class Fairlogger(CMakePackage):
depends_on("boost", when="+pretty")
conflicts("^boost@1.70:", when="^cmake@:3.14")
depends_on("fmt")
depends_on("fmt@:8", when="@:1.9")
depends_on("fmt@5.3.0:5", when="@1.6.0:1.6.1")
depends_on("fmt@5.3.0:", when="@1.6.2:")
def patch(self):
"""FairLogger gets its version number from git.
But the tarball doesn't have that information, so
we patch the spack version into CMakeLists.txt"""
The tarball doesn't have that information, so we patch the spack
version into CMakeLists.txt.
"""
if not self.spec.satisfies("@develop"):
filter_file(
r"(get_git_version\(.*)\)",
@ -123,14 +84,17 @@ def patch(self):
"CMakeLists.txt",
)
if self.spec.satisfies("@:1"):
filter_file(r"(LANGUAGES C CXX)", r"LANGUAGES CXX", "CMakeLists.txt")
def cmake_args(self):
args = [self.define("DISABLE_COLOR", True)]
args = [
self.define("DISABLE_COLOR", True),
self.define_from_variant("USE_BOOST_PRETTY_FUNCTION", "pretty"),
self.define("USE_EXTERNAL_FMT", True),
]
if self.spec.variants["cxxstd"].value != "default":
args.append(self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"))
if self.spec.satisfies("@1.4:"):
args.append(self.define_from_variant("USE_BOOST_PRETTY_FUNCTION", "pretty"))
if self.spec.satisfies("@1.6:"):
args.append(self.define("USE_EXTERNAL_FMT", True))
if self.spec.satisfies("^boost@:1.69"):
args.append(self.define("Boost_NO_BOOST_CMAKE", True))
return args

View File

@ -57,6 +57,7 @@ class Fairmq(CMakePackage):
depends_on("git")
depends_on("boost@1.66: +container+program_options+filesystem+date_time+regex")
conflicts("^boost@1.88:")
depends_on("fairlogger@1.6: +pretty")
depends_on("libzmq@4.1.4:")

Some files were not shown because too many files have changed in this diff.