Compare commits


58 Commits

Author SHA1 Message Date
Greg Becker
719d31682f speed up relocation using memoized offsets
refactor to do scanning in a single pass
parallelize new relocate method with threadpool
relocate_by_offsets can recompute if offsets not memoized
2022-06-03 12:19:01 +02:00
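The memoized-offset approach in this commit can be sketched roughly as follows (a minimal illustration with made-up function names, not Spack's actual `relocate` API): scan each binary once, remember where every install prefix occurs, then patch those byte offsets directly instead of re-scanning on install.

```python
# Hypothetical sketch of offset-memoized relocation (not Spack's real code).
def compute_offsets(data, prefixes):
    """Single pass over the file: record byte offsets of every prefix occurrence."""
    offsets = {p: [] for p in prefixes}
    for prefix in prefixes:
        start = 0
        while True:
            idx = data.find(prefix, start)
            if idx == -1:
                break
            offsets[prefix].append(idx)
            start = idx + 1
    return offsets


def relocate(data, offsets, mapping):
    """Rewrite prefixes in place using memoized offsets (equal-length prefixes only)."""
    buf = bytearray(data)
    for old, new in mapping.items():
        assert len(new) == len(old), "sketch handles equal-length prefixes only"
        for idx in offsets.get(old, []):
            buf[idx:idx + len(old)] = new
    return bytes(buf)
```

Because the offsets are computed at buildcache-creation time, the install-time step degenerates to in-place byte patching, which is also easy to parallelize per file with a thread pool.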
Sreenivasa Murthy Kolam
1190d03b0f Bump up the version for ROCm-5.1.3 release (#30819)
* Bump up the version for ROCm-5.1.3 release

* remove extra comma from hashes for device-libs of rocm-openmp-extras
2022-06-01 10:57:10 -07:00
kwryankrattiger
5faa927fe6 Ascent: Patch find conduit python (#30949)
Some systems have trouble using the Python on the login node, so this
provides an option to build that doesn't require running Python.
2022-06-01 10:36:33 -07:00
Adam J. Stewart
c60d220f81 py-jupyterlab: add v3.4.2 (#30867) 2022-06-01 10:18:47 -07:00
Hartmut Kaiser
61d3d60414 Update HPX recipe for HPX V1.8.0 (#30896) 2022-06-01 11:06:03 -06:00
rashawnLK
174258c09a Updated intel-gtpin package.py for most recent version, GTPin 3.0. (#30877)
* Updated intel-gtpin package.py for most recent version, GTPin 3.0.

* Fixed style issues in package.py -- removed trailing whitespace on two
lines.
2022-06-01 11:05:41 -06:00
Erik
73c6a8f73d Version updates for SUNDIALS and CUDA (#30874) 2022-06-01 11:01:32 -06:00
Olivier Cessenat
86dc904080 ngspice: adding version 37 (#30925) 2022-06-01 10:50:02 -06:00
Weiqun Zhang
9e1c87409d amrex: add v22.06 (#30951) 2022-06-01 10:49:46 -06:00
Sergey Kosukhin
2b30dc2e30 nag: add new version (#30927)
* nag: add new version

* nag: update maintainers
2022-06-01 10:49:32 -06:00
Ben Darwin
b1ce756d69 minc-toolkit: add version 1.9.18.2 (#30926) 2022-06-01 10:45:34 -06:00
Ida Mjelde
1194ac6985 Adding a libunwind variant to libzmq (#30932)
* Adding a libunwind variant to libzmq

* Remove whitespace line 46
2022-06-01 10:29:56 -06:00
Asher Mancinelli
954f961208 tmux: support building from master and utf8 opts (#30928)
* tmux: support building from master and utf8 opts

* Fix style errors
2022-06-01 10:29:35 -06:00
Zack Galbreath
47ac710796 CPU & memory requests for jobs that generate GitLab CI pipelines (#30940)
gitlab ci: make sure pipeline generation isn't resource starved
2022-06-01 09:43:23 -06:00
Derek Ryan Strong
d7fb5a6db4 rclone: add 1.58 (#30887)
* Add rclone 1.58

* Update rclone git repo path
2022-05-31 19:35:12 -07:00
Maciej Wójcik
e0624b9278 gromacs: Add recent releases (#30892)
* gromacs: Add recent releases

* gromacs: Update branch name

* gromacs: Update links
2022-05-31 19:27:23 -07:00
Olivier Cessenat
e86614f7b8 gmsh: adding version 4.10.3 (#30923) 2022-05-31 18:53:03 -07:00
Garth N. Wells
d166b948ce fenics-dolfinx: dependency updates (#30919)
* Add pugixml dependency

* Dependency updates

* Fix Spack NumPy version

* Test more generous NumPy constraint

* Fix NumPy requirement
2022-05-31 18:49:49 -07:00
lorddavidiii
9cc3a2942d cfitsio: add 4.1.0 (#30920) 2022-05-31 18:46:07 -07:00
Marie Houillon
5d685f9ff6 New version for openCARP packages (#30931)
Co-authored-by: openCARP consortium <info@opencarp.org>
2022-05-31 18:21:55 -07:00
Paul Kuberry
9ddf45964d xyce: add sha for version 7.5.0 (#30941) 2022-05-31 17:59:13 -07:00
iarspider
b88cc77f16 xpmem package: add patches for building on FC 35 with kernel 5.16.18-200 (#29945) 2022-05-31 15:55:51 -07:00
Robert Cohn
f3af38ba9b Fix module support for oneapi compilers (#28901)
Updates to improve Spack-generated modules for Intel oneAPI compilers:

* intel-oneapi-compilers set CC etc.
* Add a new package intel-oneapi-compilers-classic which can be used to
  generate a module which sets CC etc. to older compilers (e.g. icc)
* lmod module logic now updated to treat the intel-oneapi-compilers*
  packages as compilers
2022-05-31 15:02:25 -07:00
Wouter Deconinck
adc9f887ea acts-dd4hep: new package; acts: new version (#30850)
* acts-dd4hep: new package, separated from new acts@19.1.0

* acts-dd4hep: improved versioning

* acts-dd4hep: don't use curl | sha256sum

* acts: new variant `odd` for Open Data Detector

* acts-dd4hep: style changes
2022-05-31 11:30:36 -07:00
Wouter Deconinck
9461f482d9 assimp: new version 5.2.4 (#30929) 2022-05-31 12:25:24 -06:00
Paul Kuberry
e014b889c6 xyce: remove python packages as +pymi dependencies and hdf5 from trilinos dependency (#30938) 2022-05-31 14:04:57 -04:00
snehring
181ac574bb sentieon-genomics: adding version 202112.04 (#30876) 2022-05-31 10:05:27 -06:00
Adam J. Stewart
055c9d125d CUDA: make cuda_arch sticky (#30910) 2022-05-30 12:53:15 -07:00
Evan Bollig
a94438b1f5 Added AWS-AHUG alinux2 pipeline (#24601)
Add spack stacks targeted at Spack + AWS + ARM HPC User Group hackathon.  Includes
a list of miniapps and full-apps that are ready to run on both x86_64 and aarch64.

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2022-05-30 10:26:39 -06:00
Joseph Wang
f583e471b8 pass CC variable to make (#30912)
Set CC to cc
2022-05-30 08:44:31 +02:00
Brian Van Essen
f67f3b1796 Add new versions of protobuf and py-protobuf (#30503)
* Add new versions

* Updated the hashes to match the published pypi.org hashes.  Added version constraints for Python.
2022-05-30 01:23:26 -05:00
Jean Luca Bez
77c86c759c HDF5 VOL-ASYNC update versions (#30900) 2022-05-29 17:49:52 -04:00
Adam J. Stewart
8084259bd3 protobuf: fix spack versions (#30879) 2022-05-28 14:53:24 -06:00
Evan Bollig
98860c6a5f Alinux isc buildcache (#30462)
Add two new stacks targeted at x86_64 and arm, representing an initial list of packages 
used by current and planned AWS Workshops, and built in conjunction with the ISC22
announcement of the spack public binary cache.

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2022-05-28 11:32:53 -06:00
Todd Gamblin
e6929b9ff9 0.18.0.dev0 -> 0.19.0.dev0 (#30907) 2022-05-28 17:23:01 +00:00
Tom Scogland
18c2f1a57a refactor: packages import spack.package explicitly (#30404)
Explicitly import package utilities in all packages, and corresponding fallout.

This includes:

* rename `spack.package` to `spack.package_base`
* rename `spack.pkgkit` to `spack.package`
* update all packages in builtin, builtin_mock and tutorials to include `from spack.package import *`
* update spack style
  * ensure packages include the import
  * automatically add the new import and remove any/all imports of `spack` and `spack.pkgkit`
    from packages when using `--fix`
  * add support for type-checking packages with mypy when SPACK_MYPY_CHECK_PACKAGES
    is set in the environment
* fix all type checking errors in packages in spack upstream
* update spack create to include the new imports
* update spack repo to inject the new import, injection persists to allow for a deprecation period

Original message below:
 
As requested @adamjstewart, update all packages to use pkgkit.  I ended up using isort to do this,
so repro is easy:

```console
$ isort -a 'from spack.pkgkit import *' --rm 'spack' ./var/spack/repos/builtin/packages/*/package.py
$ spack style --fix
```

There were several line spacing fixups caused either by space manipulation in isort or by packages
that haven't been touched since we added requirements, but there are no functional changes in here.

* [x] add config to isort to make sure this is maintained going forward
2022-05-28 12:55:44 -04:00
Todd Gamblin
3054cd0eff update changelog for v0.18.0 (#30905) 2022-05-28 17:33:20 +02:00
JDBetteridge
9016b79270 Additional BLAS/LAPACK library configuration for Numpy (#30817)
* Add amdblis and amdlibflame as BLAS/LAPACK options

* Add Cray-libsci as BLAS/LAPACK option

* Use Netlib config for Cray-libsci
2022-05-28 03:33:31 -06:00
Erik Schnetter
9f5c6fb398 hpx: New version 1.8.0 (#30848) 2022-05-28 09:40:20 +02:00
Greg Becker
19087c9d35 target optimization: re-norm optimization scale so that 0 is best. (#29926)
Preferred targets are currently the only minimization criterion for which Spack allows
negative values. That means Spack may be incentivized to add nodes to the DAG if they
match the preferred target.

This PR re-norms the minimization criteria so that preferred targets are weighted from 0,
and default target weights are offset by the number of preferred targets per-package to
calculate node_target_weight.

Also fixes a bug in the test for preferred targets that was making the test easier to pass
than it should be.
2022-05-27 22:49:41 -07:00
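The re-norming idea can be shown with a toy example (illustrative only, not the concretizer's actual weighting code): shifting a minimization criterion so its best value is 0 keeps all weights non-negative, so adding extra matching nodes can never lower the total cost.

```python
# Toy illustration: re-norm a minimization criterion so the best (preferred)
# choice weighs 0 and everything else is a non-negative offset. With negative
# weights, adding more matching nodes could *reduce* the total cost.
def renorm(weights):
    best = min(weights)
    return [w - best for w in weights]

# renorm([-2, 0, 3]) -> [0, 2, 5]
```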
Greg Becker
4116b04368 update tutorial command for v0.18.0 and new gpg key (#30904) 2022-05-28 02:36:20 +00:00
JDBetteridge
1485931695 Ensure same BLAS/LAPACK config from Numpy used in Scipy (#30818)
* Call Numpy package's set_blas_lapack() and setup_build_environment() in Scipy package

* Remove broken link from comment

* Use .package attribute of spec to avoid import
2022-05-27 10:46:21 -07:00
Derek Ryan Strong
78cac4d840 Add R 4.2.0 (#30859) 2022-05-27 08:24:28 -05:00
Michael Kuhn
2f628c3a97 gcc: add 9.5.0 (#30893) 2022-05-27 12:18:57 +02:00
Adam J. Stewart
a3a8710cbe Python: fix clingo bootstrapping on Apple M1 (#30834)
This PR fixes several issues I noticed while trying to get Spack working on Apple M1.

- [x] `build_environment.py` attempts to add `spec['foo'].libs` and `spec['foo'].headers` to our compiler wrappers for all dependencies using a try-except that ignores `NoLibrariesError` and `NoHeadersError` respectively. However, the `libs` and `headers` attributes of the Python package erroneously raised `RuntimeError` instead.
- [x] `spack external find python` (used during bootstrapping) currently has no way to determine whether or not an installation is `+shared`, so previously we would only search for static Python libs. However, most distributions including XCode/Conda/Intel ship shared Python libs. I updated `libs` to search for both shared and static (order based on variant) as a fallback.
- [x] The `headers` attribute was recursively searching in `prefix.include` for `pyconfig.h`, but this could lead to non-deterministic behavior if multiple versions of Python are installed and `pyconfig.h` files exist in multiple `<prefix>/include/pythonX.Y` locations. It's safer to search in `sysconfig.get_path('include')` instead.
- [x] The Python installation that comes with XCode is broken, and `sysconfig.get_paths` is hard-coded to return specific directories. This meant that our logic for `platlib`/`purelib`/`include` where we replace `platbase`/`base`/`installed_base` with `prefix` wasn't working and the `mkdirp` in `setup_dependent_package` was trying to create a directory in root, giving permissions issues. Even if you commented out those `mkdirp` calls, Spack would add the wrong directories to `PYTHONPATH`. Added a fallback hard-coded to `lib/pythonX.Y/site-packages` if sysconfig is broken (this is what distutils always did).
2022-05-27 03:18:20 -07:00
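The `sysconfig`-based header lookup mentioned in the third bullet can be sketched like this (illustrative; the real logic lives in Spack's `python` package): ask the running interpreter where its headers live instead of recursively globbing `<prefix>/include` for `pyconfig.h`, which can match a different Python when several versions share a prefix.

```python
# Deterministic header lookup via sysconfig rather than a recursive search.
import os
import sysconfig

include_dir = sysconfig.get_path('include')       # e.g. .../include/python3.X
pyconfig_h = os.path.join(include_dir, 'pyconfig.h')
print(include_dir)
```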
Paul R. C. Kent
0bf3a9c2af llvm: 14.0.3 and 14.0.4 (#30888) 2022-05-27 09:51:54 +02:00
Severin Strobl
cff955f7bd otf2/scorep: add versions 3.0/7.1 (#28631) 2022-05-27 00:43:58 +02:00
Scott Wittenburg
3d43ebec72 Revert "strip -Werror: all specific or none (#30284)" (#30878)
This reverts commit 330832c22c.
2022-05-26 14:17:01 -07:00
Robert Pavel
6fd07479e3 Updated mfem constraints in laghos spackage (#30851)
Updated mfem constraints in laghos spackage to better match comments and
support legacy builds of `laghos@1.0:2.0`
2022-05-26 10:20:42 -07:00
Simon Pintarelli
03bc36f8b0 q-e-sirius: remove ~apps constraint (#30857) 2022-05-26 10:18:37 -07:00
Brian Van Essen
93e1b283b7 Added hash for new versions (#30860) 2022-05-26 10:15:59 -07:00
Derek Ryan Strong
df2c0fbfbd Add new versions of GNU parallel (#30862) 2022-05-26 10:09:43 -07:00
Derek Ryan Strong
54a69587c3 Add newer nano versions (#30865) 2022-05-26 10:04:50 -07:00
Hans Johansen
294312f02b Adding new package bricks for x86, cuda (#30863)
* Adding new package bricks for x86, cuda

* Fixed complaints from "spack style" that CI found

* add license comment at top

Co-authored-by: drhansj <drhansj@berkeley.edu>
Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
2022-05-26 07:40:11 -07:00
Massimiliano Culpo
0636fdbfef Remove the warning that Spack prints at each spec (#30872)
Add instead a warning box in the documentation
2022-05-26 14:35:20 +00:00
Scott Wittenburg
85e13260cf ci: Support secure binary signing on protected pipelines (#30753)
This PR supports the creation of securely signed binaries built from spack
develop as well as release branches and tags. Specifically:

- remove internal pr mirror url generation logic in favor of buildcache destination
on command line
    - with a single mirror url specified in the spack.yaml, this makes it clearer where 
    binaries from various pipelines are pushed
- designate some tags as reserved: ['public', 'protected', 'notary']
    - these tags are stripped from all jobs by default and provisioned internally
    based on pipeline type
- update gitlab ci yaml to include pipelines on more protected branches than just
develop (so include releases and tags)
    - binaries from all protected pipelines are pushed into mirrors including the
    branch name so releases, tags, and develop binaries are kept separate
- update rebuild jobs running on protected pipelines to run on special runners
provisioned with an intermediate signing key
    - protected rebuild jobs no longer use "SPACK_SIGNING_KEY" env var to
    obtain signing key (in fact, final signing key is nowhere available to rebuild jobs)
    - these intermediate signatures are verified at the end of each pipeline by a new
    signing job to ensure binaries were produced by a protected pipeline
- optionally schedule a signing/notary job at the end of the pipeline to sign all
packages in the mirror
    - add signing-job-attributes to gitlab-ci section of spack environment to allow
    configuration
    - signing job runs on special runner (separate from protected rebuild runners)
    provisioned with public intermediate key and secret signing key
2022-05-26 08:31:22 -06:00
Adam J. Stewart
b5a519fa51 py-tensorboard: add v2.9.0 (#30832) 2022-05-26 07:43:40 -04:00
Adam J. Stewart
2e2d0b3211 libtiff: remove extra dependencies/patch (#30854) 2022-05-25 23:37:45 -06:00
6777 changed files with 13704 additions and 10134 deletions

View File

@@ -151,7 +151,7 @@ Package-related modules
^^^^^^^^^^^^^^^^^^^^^^^
:mod:`spack.package`
-Contains the :class:`~spack.package.Package` class, which
+Contains the :class:`~spack.package_base.Package` class, which
is the superclass for all packages in Spack. Methods on ``Package``
implement all phases of the :ref:`package lifecycle
<package-lifecycle>` and manage the build process.

View File

@@ -2393,9 +2393,9 @@ Influence how dependents are built or run
Spack provides a mechanism for dependencies to influence the
environment of their dependents by overriding the
-:meth:`setup_dependent_run_environment <spack.package.PackageBase.setup_dependent_run_environment>`
+:meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
or the
-:meth:`setup_dependent_build_environment <spack.package.PackageBase.setup_dependent_build_environment>`
+:meth:`setup_dependent_build_environment <spack.package_base.PackageBase.setup_dependent_build_environment>`
methods.
The Qt package, for instance, uses this call:
@@ -2417,7 +2417,7 @@ will have the ``PYTHONPATH``, ``PYTHONHOME`` and ``PATH`` environment
variables set appropriately before starting the installation. To make things
even simpler the ``python setup.py`` command is also inserted into the module
scope of dependents by overriding a third method called
-:meth:`setup_dependent_package <spack.package.PackageBase.setup_dependent_package>`
+:meth:`setup_dependent_package <spack.package_base.PackageBase.setup_dependent_package>`
:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
@@ -3022,7 +3022,7 @@ The classes that are currently provided by Spack are:
+----------------------------------------------------------+----------------------------------+
| **Base Class** | **Purpose** |
+==========================================================+==================================+
-| :class:`~spack.package.Package` | General base class not |
+| :class:`~spack.package_base.Package` | General base class not |
| | specialized for any build system |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.makefile.MakefilePackage` | Specialized class for packages |
@@ -3153,7 +3153,7 @@ for the install phase is:
For those not used to Python instance methods, this is the
package itself. In this case it's an instance of ``Foo``, which
extends ``Package``. For API docs on Package objects, see
-:py:class:`Package <spack.package.Package>`.
+:py:class:`Package <spack.package_base.Package>`.
``spec``
This is the concrete spec object created by Spack from an

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: (major, minor, micro, dev release) tuple
-spack_version_info = (0, 18, 0)
+spack_version_info = (0, 19, 0, 'dev0')
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
spack_version = '.'.join(str(s) for s in spack_version_info)
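As a quick check of the hunk above, the tuple renders to a PEP 440 string via the existing join:

```python
# The version tuple from the hunk above, rendered as a PEP 440 string.
spack_version_info = (0, 19, 0, 'dev0')
spack_version = '.'.join(str(s) for s in spack_version_info)
print(spack_version)  # -> 0.19.0.dev0
```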

View File

@@ -12,7 +12,7 @@
import spack.error
import spack.hooks
import spack.monitor
-import spack.package
+import spack.package_base
import spack.repo
import spack.util.executable

View File

@@ -624,10 +624,14 @@ def get_buildfile_manifest(spec):
"""
data = {"text_to_relocate": [], "binary_to_relocate": [],
"link_to_relocate": [], "other": [],
-"binary_to_relocate_fullpath": []}
+"binary_to_relocate_fullpath": [], "offsets": {}}
blacklist = (".spack", "man")
# Get all the paths we will want to relocate in binaries
paths_to_relocate = [s.prefix for s in spec.traverse(root=True)]
paths_to_relocate.append(spack.store.layout.root)
# Do this during tarball creation to save time when the tarball is unpacked.
# Used by make_package_relative to determine binaries to change.
for root, dirs, files in os.walk(spec.prefix, topdown=True):
@@ -662,6 +666,11 @@ def get_buildfile_manifest(spec):
(m_subtype in ('x-mach-binary')
and sys.platform == 'darwin') or
(not filename.endswith('.o'))):
+# Last path to relocate is the layout root, which is a substring
+# of the others
+indices = relocate.compute_indices(path_name, paths_to_relocate)
+data['offsets'][rel_path_name] = indices
data['binary_to_relocate'].append(rel_path_name)
data['binary_to_relocate_fullpath'].append(path_name)
added = True
@@ -700,6 +709,7 @@ def write_buildinfo_file(spec, workdir, rel=False):
buildinfo['relocate_binaries'] = manifest['binary_to_relocate']
buildinfo['relocate_links'] = manifest['link_to_relocate']
buildinfo['prefix_to_hash'] = prefix_to_hash
+buildinfo['offsets'] = manifest['offsets']
filename = buildinfo_file_name(workdir)
with open(filename, 'w') as outfile:
outfile.write(syaml.dump(buildinfo, default_flow_style=True))
@@ -1473,11 +1483,25 @@ def is_backup_file(file):
# If we are not installing back to the same install tree do the relocation
if old_prefix != new_prefix:
-files_to_relocate = [os.path.join(workdir, filename)
+# Relocate links to the new install prefix
+links = [link for link in buildinfo.get('relocate_links', [])]
+relocate.relocate_links(
+links, old_layout_root, old_prefix, new_prefix
+)
+# For all buildcaches
+# relocate the install prefixes in text files including dependencies
+relocate.relocate_text(text_names, prefix_to_prefix_text)
+# If the buildcache was not created with relativized rpaths
+# do the relocation of rpaths in binaries
+# TODO: Is this necessary? How are null-terminated strings handled
+# in the rpath header?
+files_to_relocate = [
+os.path.join(workdir, filename)
+for filename in buildinfo.get('relocate_binaries')
+]
-# If the buildcache was not created with relativized rpaths
-# do the relocation of path in binaries
platform = spack.platforms.by_name(spec.platform)
if 'macho' in platform.binary_formats:
relocate.relocate_macho_binaries(files_to_relocate,
@@ -1493,25 +1517,11 @@ def is_backup_file(file):
prefix_to_prefix_bin, rel,
old_prefix,
new_prefix)
-# Relocate links to the new install prefix
-links = [link for link in buildinfo.get('relocate_links', [])]
-relocate.relocate_links(
-links, old_layout_root, old_prefix, new_prefix
-)
-# For all buildcaches
-# relocate the install prefixes in text files including dependencies
-relocate.relocate_text(text_names, prefix_to_prefix_text)
-paths_to_relocate = [old_prefix, old_layout_root]
-paths_to_relocate.extend(prefix_to_hash.keys())
-files_to_relocate = list(filter(
-lambda pathname: not relocate.file_is_relocatable(
-pathname, paths_to_relocate=paths_to_relocate),
-map(lambda filename: os.path.join(workdir, filename),
-buildinfo['relocate_binaries'])))
# relocate the install prefixes in binary files including dependencies
-relocate.relocate_text_bin(files_to_relocate, prefix_to_prefix_bin)
+# If offsets is None, we will recompute offsets when needed
+offsets = buildinfo.get('offsets', None)
+relocate.relocate_text_bin(
+files_to_relocate, prefix_to_prefix_bin, offsets, workdir)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
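The offsets fallback in the last hunk might look roughly like this helper (hypothetical name and buildinfo shape, for illustration only): prefer offsets memoized at buildcache-creation time, and fall back to a fresh scan when the cache predates the `offsets` key.

```python
# Hypothetical helper mirroring the fallback behavior described above.
def offsets_for(buildinfo, rel_path, data, prefixes):
    memo = (buildinfo or {}).get('offsets') or {}
    if rel_path in memo:
        return memo[rel_path]
    # Recompute: byte offsets of each prefix actually present in the file.
    return {p: [i for i in range(len(data)) if data.startswith(p, i)]
            for p in prefixes}
```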

View File

@@ -55,7 +55,7 @@
import spack.config
import spack.install_test
import spack.main
-import spack.package
+import spack.package_base
import spack.paths
import spack.platforms
import spack.repo
@@ -722,7 +722,7 @@ def get_std_cmake_args(pkg):
package were a CMakePackage instance.
Args:
-pkg (spack.package.PackageBase): package under consideration
+pkg (spack.package_base.PackageBase): package under consideration
Returns:
list: arguments for cmake
@@ -738,7 +738,7 @@ def get_std_meson_args(pkg):
package were a MesonPackage instance.
Args:
-pkg (spack.package.PackageBase): package under consideration
+pkg (spack.package_base.PackageBase): package under consideration
Returns:
list: arguments for meson
@@ -748,12 +748,12 @@ def get_std_meson_args(pkg):
def parent_class_modules(cls):
"""
-Get list of superclass modules that descend from spack.package.PackageBase
+Get list of superclass modules that descend from spack.package_base.PackageBase
Includes cls.__module__
"""
-if (not issubclass(cls, spack.package.PackageBase) or
-issubclass(spack.package.PackageBase, cls)):
+if (not issubclass(cls, spack.package_base.PackageBase) or
+issubclass(spack.package_base.PackageBase, cls)):
return []
result = []
module = sys.modules.get(cls.__module__)
@@ -771,7 +771,7 @@ def load_external_modules(pkg):
associated with them.
Args:
-pkg (spack.package.PackageBase): package to load deps for
+pkg (spack.package_base.PackageBase): package to load deps for
"""
for dep in list(pkg.spec.traverse()):
external_modules = dep.external_modules or []
@@ -1109,7 +1109,7 @@ def start_build_process(pkg, function, kwargs):
Args:
-pkg (spack.package.PackageBase): package whose environment we should set up the
+pkg (spack.package_base.PackageBase): package whose environment we should set up the
child process for.
function (typing.Callable): argless function to run in the child
process.
@@ -1234,7 +1234,7 @@ def make_stack(tb, stack=None):
if 'self' in frame.f_locals:
# Find the first proper subclass of PackageBase.
obj = frame.f_locals['self']
-if isinstance(obj, spack.package.PackageBase):
+if isinstance(obj, spack.package_base.PackageBase):
break
# We found obj, the Package implementation we care about.
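The `parent_class_modules` logic in the hunk above generalizes to any base class; a standalone sketch (nothing Spack-specific assumed — the base class is passed in rather than hard-coded):

```python
# Walk a class hierarchy and collect the modules that define cls and its
# superclasses, for proper subclasses of `base` only.
import sys

def parent_class_modules(cls, base):
    if not issubclass(cls, base) or issubclass(base, cls):
        return []
    result = []
    module = sys.modules.get(cls.__module__)
    if module is not None:
        result.append(module)
    for superclass in cls.__bases__:
        result.extend(parent_class_modules(superclass, base))
    return result
```

For example, `parent_class_modules(collections.OrderedDict, dict)` yields the `collections` module, while `dict` itself (not a proper subclass) contributes nothing.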

View File

@@ -9,7 +9,7 @@
from spack.build_systems.autotools import AutotoolsPackage
from spack.directives import extends
-from spack.package import ExtensionError
+from spack.package_base import ExtensionError
from spack.util.executable import which

View File

@@ -16,7 +16,7 @@
from spack.build_environment import InstallError
from spack.directives import conflicts, depends_on
from spack.operating_systems.mac_os import macos_version
-from spack.package import PackageBase, run_after, run_before
+from spack.package_base import PackageBase, run_after, run_before
from spack.util.executable import Executable
from spack.version import Version

View File

@@ -8,7 +8,7 @@
from llnl.util.filesystem import install, mkdirp
from spack.build_systems.cmake import CMakePackage
-from spack.package import run_after
+from spack.package_base import run_after
def cmake_cache_path(name, value, comment=""):

View File

@@ -18,7 +18,7 @@
import spack.build_environment
from spack.directives import conflicts, depends_on, variant
-from spack.package import InstallError, PackageBase, run_after
+from spack.package_base import InstallError, PackageBase, run_after
from spack.util.path import convert_to_posix_path
# Regex to extract the primary generator from the CMake generator

View File

@@ -6,7 +6,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.multimethod import when
-from spack.package import PackageBase
+from spack.package_base import PackageBase
class CudaPackage(PackageBase):
@@ -37,6 +37,7 @@ class CudaPackage(PackageBase):
variant('cuda_arch',
description='CUDA architecture',
values=spack.variant.any_combination_of(*cuda_arch_values),
sticky=True,
when='+cuda')
# https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#nvcc-examples

View File

@@ -3,14 +3,16 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import spack.package
+from typing import Optional
+import spack.package_base
import spack.util.url
-class GNUMirrorPackage(spack.package.PackageBase):
+class GNUMirrorPackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for GNU packages."""
#: Path of the package in a GNU mirror
-gnu_mirror_path = None
+gnu_mirror_path = None # type: Optional[str]
#: List of GNU mirrors used by Spack
base_mirrors = [
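The `# type: Optional[str]` comments added throughout these hunks are PEP 484 comment-style annotations: they tell mypy that the attribute may be `None` or a `str`, without relying on annotation syntax in the class body. A minimal illustration (class names made up):

```python
# Comment-style annotation for a class attribute that defaults to None but
# is overridden with a str by subclasses, as in the mixins above.
from typing import Optional

class MirrorMixin:
    #: path of the package on the mirror; subclasses override with a str
    mirror_path = None  # type: Optional[str]

class ExamplePackage(MirrorMixin):
    mirror_path = "example/example-1.0.tar.gz"
```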

View File

@@ -26,7 +26,7 @@
import spack.error
from spack.build_environment import dso_suffix
-from spack.package import InstallError, PackageBase, run_after
+from spack.package_base import InstallError, PackageBase, run_after
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.util.prefix import Prefix
@@ -1115,7 +1115,7 @@ def _setup_dependent_env_callback(
raise InstallError('compilers_of_client arg required for MPI')
def setup_dependent_package(self, module, dep_spec):
-# https://spack.readthedocs.io/en/latest/spack.html#spack.package.PackageBase.setup_dependent_package
+# https://spack.readthedocs.io/en/latest/spack.html#spack.package_base.PackageBase.setup_dependent_package
# Reminder: "module" refers to Python module.
# Called before the install() method of dependents.

View File

@@ -10,7 +10,7 @@
from spack.directives import depends_on, extends
from spack.multimethod import when
-from spack.package import PackageBase
+from spack.package_base import PackageBase
from spack.util.executable import Executable

View File

@@ -11,7 +11,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import conflicts
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
class MakefilePackage(PackageBase):

View File

@@ -7,7 +7,7 @@
from llnl.util.filesystem import install_tree, working_dir
from spack.directives import depends_on
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
from spack.util.executable import which

View File

@@ -11,7 +11,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on, variant
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
class MesonPackage(PackageBase):

View File

@@ -6,7 +6,7 @@
import inspect
from spack.directives import extends
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
class OctavePackage(PackageBase):

View File

@@ -14,7 +14,7 @@
from llnl.util.filesystem import find_headers, find_libraries, join_path
-from spack.package import Package
+from spack.package_base import Package
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable

View File

@@ -10,7 +10,7 @@
from llnl.util.filesystem import filter_file
from spack.directives import extends
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
from spack.util.executable import Executable

View File

@@ -6,6 +6,7 @@
import os
import re
import shutil
+from typing import Optional
import llnl.util.tty as tty
from llnl.util.filesystem import (
@@ -19,13 +20,13 @@
from llnl.util.lang import match_predicate
from spack.directives import depends_on, extends
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
class PythonPackage(PackageBase):
"""Specialized class for packages that are built using pip."""
#: Package name, version, and extension on PyPI
-pypi = None
+pypi = None # type: Optional[str]
maintainers = ['adamjstewart']
@@ -46,7 +47,7 @@ class PythonPackage(PackageBase):
# package manually
depends_on('py-wheel', type='build')
-py_namespace = None
+py_namespace = None # type: Optional[str]
@staticmethod
def _std_args(cls):

View File

@@ -9,7 +9,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
class QMakePackage(PackageBase):

View File

@@ -5,9 +5,10 @@
import inspect
+from typing import Optional
from spack.directives import extends
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
class RPackage(PackageBase):
@@ -28,10 +29,10 @@ class RPackage(PackageBase):
# package attributes that can be expanded to set the homepage, url,
# list_url, and git values
# For CRAN packages
-cran = None
+cran = None # type: Optional[str]
# For Bioconductor packages
-bioc = None
+bioc = None # type: Optional[str]
maintainers = ['glennpj']

View File

@@ -3,13 +3,14 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
+from typing import Optional
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
from spack.build_environment import SPACK_NO_PARALLEL_MAKE, determine_number_of_jobs
from spack.directives import extends
-from spack.package import PackageBase
+from spack.package_base import PackageBase
from spack.util.environment import env_flag
from spack.util.executable import Executable, ProcessError
@@ -36,8 +37,8 @@ class RacketPackage(PackageBase):
extends('racket')
pkgs = False
-subdirectory = None
-name = None
+subdirectory = None # type: Optional[str]
+name = None # type: Optional[str]
parallel = True
@property

View File

@@ -77,7 +77,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
-from spack.package import PackageBase
+from spack.package_base import PackageBase
class ROCmPackage(PackageBase):

View File

@@ -7,7 +7,7 @@
import inspect
from spack.directives import extends
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
class RubyPackage(PackageBase):

View File

@@ -7,7 +7,7 @@
import inspect
from spack.directives import depends_on
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
class SConsPackage(PackageBase):

View File

@@ -11,7 +11,7 @@
from llnl.util.filesystem import find, join_path, working_dir
from spack.directives import depends_on, extends
-from spack.package import PackageBase, run_after
+from spack.package_base import PackageBase, run_after
class SIPPackage(PackageBase):

View File

@@ -3,15 +3,17 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import spack.package
+from typing import Optional
+import spack.package_base
import spack.util.url
-class SourceforgePackage(spack.package.PackageBase):
+class SourceforgePackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceforge
packages."""
#: Path of the package in a Sourceforge mirror
-sourceforge_mirror_path = None
+sourceforge_mirror_path = None # type: Optional[str]
#: List of Sourceforge mirrors used by Spack
base_mirrors = [


@@ -2,16 +2,17 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional
import spack.package
import spack.package_base
import spack.util.url
class SourcewarePackage(spack.package.PackageBase):
class SourcewarePackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceware.org
packages."""
#: Path of the package in a Sourceware mirror
sourceware_mirror_path = None
sourceware_mirror_path = None # type: Optional[str]
#: List of Sourceware mirrors used by Spack
base_mirrors = [


@@ -9,7 +9,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class WafPackage(PackageBase):


@@ -3,15 +3,17 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.package
from typing import Optional
import spack.package_base
import spack.util.url
class XorgPackage(spack.package.PackageBase):
class XorgPackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for x.org
packages."""
#: Path of the package in a x.org mirror
xorg_mirror_path = None
xorg_mirror_path = None # type: Optional[str]
#: List of x.org mirrors used by Spack
# Note: x.org mirrors are a bit tricky, since many are out-of-sync or off.


@@ -14,7 +14,7 @@
import spack.repo
import spack.stage
import spack.util.crypto
from spack.package import preferred_version
from spack.package_base import preferred_version
from spack.util.naming import valid_fully_qualified_module_name
from spack.version import Version, ver


@@ -57,7 +57,7 @@
# See the Spack documentation for more information on packaging.
# ----------------------------------------------------------------------------
from spack import *
from spack.package import *
class {class_name}({base_class_name}):


@@ -11,7 +11,7 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.package
import spack.package_base
import spack.repo
import spack.store
@@ -57,7 +57,7 @@ def dependencies(parser, args):
else:
spec = specs[0]
dependencies = spack.package.possible_dependencies(
dependencies = spack.package_base.possible_dependencies(
spec,
transitive=args.transitive,
expand_virtuals=args.expand_virtuals,


@@ -200,7 +200,7 @@ def external_list(args):
list(spack.repo.path.all_packages())
# Print all the detectable packages
tty.msg("Detectable packages per repository")
for namespace, pkgs in sorted(spack.package.detectable_packages.items()):
for namespace, pkgs in sorted(spack.package_base.detectable_packages.items()):
print("Repository:", namespace)
colify.colify(pkgs, indent=4, output=sys.stdout)


@@ -18,7 +18,7 @@
import spack.fetch_strategy as fs
import spack.repo
import spack.spec
from spack.package import has_test_method, preferred_version
from spack.package_base import has_test_method, preferred_version
description = 'get detailed information on a particular package'
section = 'basic'
@@ -269,14 +269,14 @@ def print_tests(pkg):
names = []
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
if has_test_method(pkg_cls):
pkg_base = spack.package.PackageBase
pkg_base = spack.package_base.PackageBase
test_pkgs = [str(cls.test) for cls in inspect.getmro(pkg_cls) if
issubclass(cls, pkg_base) and cls.test != pkg_base.test]
test_pkgs = list(set(test_pkgs))
names.extend([(test.split()[1]).lower() for test in test_pkgs])
# TODO Refactor START
# Use code from package.py's test_process IF this functionality is
# Use code from package_base.py's test_process IF this functionality is
# accepted.
v_names = list(set([vspec.name for vspec in pkg.virtuals_provided]))


@@ -302,7 +302,7 @@ def install(parser, args, **kwargs):
)
reporter = spack.report.collect_info(
spack.package.PackageInstaller, '_install_task', args.log_format, args)
spack.package_base.PackageInstaller, '_install_task', args.log_format, args)
if args.log_file:
reporter.filename = args.log_file


@@ -12,7 +12,7 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.package
import spack.package_base
import spack.repo
import spack.store
from spack.database import InstallStatuses


@@ -18,7 +18,7 @@
import spack.config
import spack.environment
import spack.hash_types as ht
import spack.package
import spack.package_base
import spack.solver.asp as asp
description = "concretize specs using an ASP solver"


@@ -65,7 +65,7 @@ def is_package(f):
packages, since we allow `from spack import *` and poking globals
into packages.
"""
return f.startswith("var/spack/repos/")
return f.startswith("var/spack/repos/") and f.endswith('package.py')
#: decorator for adding tools to the list
@@ -236,7 +236,7 @@ def translate(match):
continue
if not args.root_relative and re_obj:
line = re_obj.sub(translate, line)
print(" " + line)
print(line)
def print_style_header(file_list, args):
@@ -290,18 +290,24 @@ def run_flake8(flake8_cmd, file_list, args):
@tool("mypy")
def run_mypy(mypy_cmd, file_list, args):
# always run with config from running spack prefix
mypy_args = [
common_mypy_args = [
"--config-file", os.path.join(spack.paths.prefix, "pyproject.toml"),
"--package", "spack",
"--package", "llnl",
"--show-error-codes",
]
# not yet, need other updates to enable this
# if any([is_package(f) for f in file_list]):
# mypy_args.extend(["--package", "packages"])
mypy_arg_sets = [common_mypy_args + [
"--package", "spack",
"--package", "llnl",
]]
if 'SPACK_MYPY_CHECK_PACKAGES' in os.environ:
mypy_arg_sets.append(common_mypy_args + [
'--package', 'packages',
'--disable-error-code', 'no-redef',
])
returncode = 0
for mypy_args in mypy_arg_sets:
output = mypy_cmd(*mypy_args, fail_on_error=False, output=str)
returncode = mypy_cmd.returncode
returncode |= mypy_cmd.returncode
rewrite_and_print_output(output, args)
@@ -318,16 +324,29 @@ def run_isort(isort_cmd, file_list, args):
pat = re.compile("ERROR: (.*) Imports are incorrectly sorted")
replacement = "ERROR: {0} Imports are incorrectly sorted"
returncode = 0
returncode = [0]
def process_files(file_list, is_args):
for chunk in grouper(file_list, 100):
packed_args = isort_args + tuple(chunk)
packed_args = is_args + tuple(chunk)
output = isort_cmd(*packed_args, fail_on_error=False, output=str, error=str)
returncode |= isort_cmd.returncode
returncode[0] |= isort_cmd.returncode
rewrite_and_print_output(output, args, pat, replacement)
print_tool_result("isort", returncode)
return returncode
packages_isort_args = ('--rm', 'spack', '--rm', 'spack.pkgkit', '--rm',
'spack.package_defs', '-a', 'from spack.package import *')
packages_isort_args = packages_isort_args + isort_args
# packages
process_files(filter(is_package, file_list),
packages_isort_args)
# non-packages
process_files(filter(lambda f: not is_package(f), file_list),
isort_args)
print_tool_result("isort", returncode[0])
return returncode[0]
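The switch from `returncode = 0` to `returncode = [0]` in the isort hunk above is a closure idiom: at the time, Spack still supported Python 2, which has no `nonlocal`, so a nested function cannot rebind an outer local but can mutate a shared one-element list. A minimal standalone sketch of the pattern (the names here are illustrative, not Spack's):

```python
def run_checks(codes):
    """OR together the exit codes of several check runs."""
    returncode = [0]  # mutable cell shared with the closure (no `nonlocal` in py2)

    def process(code):
        # |= preserves any non-zero (failing) status across all runs
        returncode[0] |= code

    for code in codes:
        process(code)
    return returncode[0]

print(run_checks([0, 0, 1]))  # non-zero as soon as any check fails
```

The same `|=` aggregation appears in the mypy hunk, where multiple argument sets are run and a failure in any of them must fail the whole tool.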
@tool("black")


@@ -20,7 +20,7 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.install_test
import spack.package
import spack.package_base
import spack.repo
import spack.report
@@ -189,7 +189,7 @@ def test_run(args):
# Set up reporter
setattr(args, 'package', [s.format() for s in test_suite.specs])
reporter = spack.report.collect_info(
spack.package.PackageBase, 'do_test', args.log_format, args)
spack.package_base.PackageBase, 'do_test', args.log_format, args)
if not reporter.filename:
if args.log_file:
if os.path.isabs(args.log_file):
@@ -217,7 +217,7 @@ def test_list(args):
else set()
def has_test_and_tags(pkg_class):
return spack.package.has_test_method(pkg_class) and \
return spack.package_base.has_test_method(pkg_class) and \
(not args.tag or pkg_class.name in tagged)
if args.list_all:


@@ -15,7 +15,7 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.error
import spack.package
import spack.package_base
import spack.repo
import spack.store
from spack.database import InstallStatuses
@@ -221,7 +221,7 @@ def do_uninstall(env, specs, force):
except spack.repo.UnknownEntityError:
# The package.py file has gone away -- but still
# want to uninstall.
spack.package.Package.uninstall_by_spec(item, force=True)
spack.package_base.Package.uninstall_by_spec(item, force=True)
# A package is ready to be uninstalled when nothing else references it,
# unless we are requested to force uninstall it.


@@ -422,7 +422,7 @@ def url_list_parsing(args, urls, url, pkg):
urls (set): List of URLs that have already been added
url (str or None): A URL to potentially add to ``urls`` depending on
``args``
pkg (spack.package.PackageBase): The Spack package
pkg (spack.package_base.PackageBase): The Spack package
Returns:
set: The updated set of ``urls``
@@ -470,7 +470,7 @@ def name_parsed_correctly(pkg, name):
"""Determine if the name of a package was correctly parsed.
Args:
pkg (spack.package.PackageBase): The Spack package
pkg (spack.package_base.PackageBase): The Spack package
name (str): The name that was extracted from the URL
Returns:
@@ -487,7 +487,7 @@ def version_parsed_correctly(pkg, version):
"""Determine if the version of a package was correctly parsed.
Args:
pkg (spack.package.PackageBase): The Spack package
pkg (spack.package_base.PackageBase): The Spack package
version (str): The version that was extracted from the URL
Returns:


@@ -240,7 +240,7 @@ def compute_windows_program_path_for_package(pkg):
program files location, return list of best guesses
Args:
pkg (spack.package.Package): package for which
pkg (spack.package_base.Package): package for which
Program Files location is to be computed
"""
if not is_windows:


@@ -1556,7 +1556,7 @@ def _extrapolate(pkg, version):
try:
return URLFetchStrategy(pkg.url_for_version(version),
fetch_options=pkg.fetch_options)
except spack.package.NoURLError:
except spack.package_base.NoURLError:
msg = ("Can't extrapolate a URL for version %s "
"because package %s defines no URLs")
raise ExtrapolationError(msg % (version, pkg.name))


@@ -95,10 +95,7 @@ def view_copy(src, dst, view, spec=None):
view.get_projection_for_spec(dep)
if spack.relocate.is_binary(dst):
spack.relocate.relocate_text_bin(
binaries=[dst],
prefixes=prefix_to_projection
)
spack.relocate.relocate_text_bin([dst], prefix_to_projection)
else:
prefix_to_projection[spack.store.layout.root] = view._root
prefix_to_projection[orig_sbang] = new_sbang


@@ -50,7 +50,7 @@
import spack.error
import spack.hooks
import spack.monitor
import spack.package
import spack.package_base
import spack.package_prefs as prefs
import spack.repo
import spack.store
@@ -103,7 +103,7 @@ def _check_last_phase(pkg):
package already.
Args:
pkg (spack.package.PackageBase): the package being installed
pkg (spack.package_base.PackageBase): the package being installed
Raises:
``BadInstallPhase`` if stop_before or last phase is invalid
@@ -125,7 +125,7 @@ def _handle_external_and_upstream(pkg, explicit):
database if it is external package.
Args:
pkg (spack.package.Package): the package whose installation is under
pkg (spack.package_base.Package): the package whose installation is under
consideration
explicit (bool): the package was explicitly requested by the user
Return:
@@ -265,7 +265,7 @@ def _install_from_cache(pkg, cache_only, explicit, unsigned=False):
Extract the package from binary cache
Args:
pkg (spack.package.PackageBase): the package to install from the binary cache
pkg (spack.package_base.PackageBase): package to install from the binary cache
cache_only (bool): only extract from binary cache
explicit (bool): ``True`` if installing the package was explicitly
requested by the user, otherwise, ``False``
@@ -355,7 +355,7 @@ def _process_binary_cache_tarball(pkg, binary_spec, explicit, unsigned,
Process the binary cache tarball.
Args:
pkg (spack.package.PackageBase): the package being installed
pkg (spack.package_base.PackageBase): the package being installed
binary_spec (spack.spec.Spec): the spec whose cache has been confirmed
explicit (bool): the package was explicitly requested by the user
unsigned (bool): ``True`` if binary package signatures to be checked,
@@ -394,7 +394,7 @@ def _try_install_from_binary_cache(pkg, explicit, unsigned=False):
Try to extract the package from binary cache.
Args:
pkg (spack.package.PackageBase): the package to be extracted from binary cache
pkg (spack.package_base.PackageBase): package to be extracted from binary cache
explicit (bool): the package was explicitly requested by the user
unsigned (bool): ``True`` if binary package signatures to be checked,
otherwise, ``False``
@@ -530,7 +530,7 @@ def log(pkg):
Copy provenance into the install directory on success
Args:
pkg (spack.package.Package): the package that was built and installed
pkg (spack.package_base.Package): the package that was built and installed
"""
packages_dir = spack.store.layout.build_packages_path(pkg.spec)
@@ -616,7 +616,7 @@ def package_id(pkg):
and packages for combinatorial environments.
Args:
pkg (spack.package.PackageBase): the package from which the identifier is
pkg (spack.package_base.PackageBase): the package from which the identifier is
derived
"""
if not pkg.spec.concrete:
@@ -769,7 +769,7 @@ def _add_bootstrap_compilers(
Args:
compiler: the compiler to bootstrap
architecture: the architecture for which to bootstrap the compiler
pkgs (spack.package.PackageBase): the package with possible compiler
pkgs (spack.package_base.PackageBase): the package with possible compiler
dependencies
request (BuildRequest): the associated install request
all_deps (defaultdict(set)): dictionary of all dependencies and
@@ -786,7 +786,7 @@ def _add_init_task(self, pkg, request, is_compiler, all_deps):
Creates and queues the initial build task for the package.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
request (BuildRequest or None): the associated install request
where ``None`` can be used to indicate the package was
explicitly requested by the user
@@ -968,7 +968,7 @@ def _cleanup_task(self, pkg):
Cleanup the build task for the spec
Args:
pkg (spack.package.PackageBase): the package being installed
pkg (spack.package_base.PackageBase): the package being installed
"""
self._remove_task(package_id(pkg))
@@ -982,7 +982,7 @@ def _ensure_install_ready(self, pkg):
already locked.
Args:
pkg (spack.package.PackageBase): the package being locally installed
pkg (spack.package_base.PackageBase): the package being locally installed
"""
pkg_id = package_id(pkg)
pre = "{0} cannot be installed locally:".format(pkg_id)
@@ -1014,7 +1014,8 @@ def _ensure_locked(self, lock_type, pkg):
Args:
lock_type (str): 'read' for a read lock, 'write' for a write lock
pkg (spack.package.PackageBase): the package whose spec is being installed
pkg (spack.package_base.PackageBase): the package whose spec is being
installed
Return:
(lock_type, lock) tuple where lock will be None if it could not
@@ -1228,7 +1229,7 @@ def _install_task(self, task):
# Create a child process to do the actual installation.
# Preserve verbosity settings across installs.
spack.package.PackageBase._verbose = (
spack.package_base.PackageBase._verbose = (
spack.build_environment.start_build_process(
pkg, build_process, install_args)
)
@@ -1373,7 +1374,7 @@ def _setup_install_dir(self, pkg):
Write a small metadata file with the current spack environment.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
"""
if not os.path.exists(pkg.spec.prefix):
tty.debug('Creating the installation directory {0}'.format(pkg.spec.prefix))
@@ -1447,7 +1448,7 @@ def _flag_installed(self, pkg, dependent_ids=None):
known dependents.
Args:
pkg (spack.package.Package): Package that has been installed locally,
pkg (spack.package_base.Package): Package that has been installed locally,
externally or upstream
dependent_ids (list or None): list of the package's
dependent ids, or None if the dependent ids are limited to
@@ -1536,7 +1537,7 @@ def install(self):
Install the requested package(s) and or associated dependencies.
Args:
pkg (spack.package.Package): the package to be built and installed"""
pkg (spack.package_base.Package): the package to be built and installed"""
self._init_queue()
fail_fast_err = 'Terminating after first install failure'
@@ -1788,7 +1789,7 @@ def __init__(self, pkg, install_args):
process in the build.
Arguments:
pkg (spack.package.PackageBase) the package being installed.
pkg (spack.package_base.PackageBase) the package being installed.
install_args (dict) arguments to do_install() from parent process.
"""
@@ -1848,8 +1849,8 @@ def run(self):
# get verbosity from do_install() parameter or saved value
self.echo = self.verbose
if spack.package.PackageBase._verbose is not None:
self.echo = spack.package.PackageBase._verbose
if spack.package_base.PackageBase._verbose is not None:
self.echo = spack.package_base.PackageBase._verbose
self.pkg.stage.keep = self.keep_stage
@@ -2001,7 +2002,7 @@ def build_process(pkg, install_args):
This function's return value is returned to the parent process.
Arguments:
pkg (spack.package.PackageBase): the package being installed.
pkg (spack.package_base.PackageBase): the package being installed.
install_args (dict): arguments to do_install() from parent process.
"""
@@ -2049,7 +2050,7 @@ def __init__(self, pkg, request, compiler, start, attempts, status,
Instantiate a build task for a package.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
request (BuildRequest or None): the associated install request
where ``None`` can be used to indicate the package was
explicitly requested by the user
@@ -2062,7 +2063,7 @@ def __init__(self, pkg, request, compiler, start, attempts, status,
"""
# Ensure dealing with a package that has a concrete spec
if not isinstance(pkg, spack.package.PackageBase):
if not isinstance(pkg, spack.package_base.PackageBase):
raise ValueError("{0} must be a package".format(str(pkg)))
self.pkg = pkg
@@ -2240,11 +2241,11 @@ def __init__(self, pkg, install_args):
Instantiate a build request for a package.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
install_args (dict): the install arguments associated with ``pkg``
"""
# Ensure dealing with a package that has a concrete spec
if not isinstance(pkg, spack.package.PackageBase):
if not isinstance(pkg, spack.package_base.PackageBase):
raise ValueError("{0} must be a package".format(str(pkg)))
self.pkg = pkg
@@ -2314,7 +2315,7 @@ def get_deptypes(self, pkg):
"""Determine the required dependency types for the associated package.
Args:
pkg (spack.package.PackageBase): explicit or implicit package being
pkg (spack.package_base.PackageBase): explicit or implicit package being
installed
Returns:
@@ -2337,7 +2338,7 @@ def run_tests(self, pkg):
"""Determine if the tests should be run for the provided packages
Args:
pkg (spack.package.PackageBase): explicit or implicit package being
pkg (spack.package_base.PackageBase): explicit or implicit package being
installed
Returns:


@@ -196,6 +196,14 @@ def provides(self):
if self.spec.name == 'llvm-amdgpu':
provides['compiler'] = spack.spec.CompilerSpec(str(self.spec))
provides['compiler'].name = 'rocmcc'
# Special case for oneapi
if self.spec.name == 'intel-oneapi-compilers':
provides['compiler'] = spack.spec.CompilerSpec(str(self.spec))
provides['compiler'].name = 'oneapi'
# Special case for oneapi classic
if self.spec.name == 'intel-oneapi-compilers-classic':
provides['compiler'] = spack.spec.CompilerSpec(str(self.spec))
provides['compiler'].name = 'intel'
# All the other tokens in the hierarchy must be virtual dependencies
for x in self.hierarchy_tokens:

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -356,7 +356,7 @@ def patch_for_package(self, sha256, pkg):
Arguments:
sha256 (str): sha256 hash to look up
pkg (spack.package.Package): Package object to get patch for.
pkg (spack.package_base.Package): Package object to get patch for.
We build patch objects lazily because building them requires that
we have information about the package's location in its repo.


@@ -1,83 +0,0 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# flake8: noqa: F401
"""pkgkit is a set of useful build tools and directives for packages.
Everything in this module is automatically imported into Spack package files.
"""
import llnl.util.filesystem
from llnl.util.filesystem import *
import spack.directives
import spack.util.executable
from spack.build_systems.aspell_dict import AspellDictPackage
from spack.build_systems.autotools import AutotoolsPackage
from spack.build_systems.cached_cmake import (
CachedCMakePackage,
cmake_cache_option,
cmake_cache_path,
cmake_cache_string,
)
from spack.build_systems.cmake import CMakePackage
from spack.build_systems.cuda import CudaPackage
from spack.build_systems.gnu import GNUMirrorPackage
from spack.build_systems.intel import IntelPackage
from spack.build_systems.lua import LuaPackage
from spack.build_systems.makefile import MakefilePackage
from spack.build_systems.maven import MavenPackage
from spack.build_systems.meson import MesonPackage
from spack.build_systems.octave import OctavePackage
from spack.build_systems.oneapi import (
IntelOneApiLibraryPackage,
IntelOneApiPackage,
IntelOneApiStaticLibraryList,
)
from spack.build_systems.perl import PerlPackage
from spack.build_systems.python import PythonPackage
from spack.build_systems.qmake import QMakePackage
from spack.build_systems.r import RPackage
from spack.build_systems.racket import RacketPackage
from spack.build_systems.rocm import ROCmPackage
from spack.build_systems.ruby import RubyPackage
from spack.build_systems.scons import SConsPackage
from spack.build_systems.sip import SIPPackage
from spack.build_systems.sourceforge import SourceforgePackage
from spack.build_systems.sourceware import SourcewarePackage
from spack.build_systems.waf import WafPackage
from spack.build_systems.xorg import XorgPackage
from spack.dependency import all_deptypes
from spack.directives import *
from spack.install_test import get_escaped_text_output
from spack.installer import (
ExternalPackageError,
InstallError,
InstallLockError,
UpstreamPackageError,
)
from spack.mixins import filter_compiler_wrappers
from spack.multimethod import when
from spack.package import (
BundlePackage,
DependencyConflictError,
Package,
build_system_flags,
env_flags,
flatten_dependencies,
inject_flags,
install_dependency_symlinks,
on_package_attributes,
run_after,
run_before,
)
from spack.spec import InvalidSpecDetected, Spec
from spack.util.executable import *
from spack.variant import (
any_combination_of,
auto_or_any_combination_of,
conditional,
disjoint_sets,
)
from spack.version import Version, ver


@@ -469,47 +469,6 @@ def _replace_prefix_text(filename, compiled_prefixes):
f.truncate()
def _replace_prefix_bin(filename, byte_prefixes):
"""Replace all the occurrences of the old install prefix with a
new install prefix in binary files.
The new install prefix is prefixed with ``os.sep`` until the
lengths of the prefixes are the same.
Args:
filename (str): target binary file
byte_prefixes (OrderedDict): OrderedDictionary where the keys are
precompiled regex of the old prefixes and the values are the new
prefixes (utf-8 encoded)
"""
with open(filename, 'rb+') as f:
data = f.read()
f.seek(0)
for orig_bytes, new_bytes in byte_prefixes.items():
original_data_len = len(data)
# Skip this hassle if not found
if orig_bytes not in data:
continue
# We only care about this problem if we are about to replace
length_compatible = len(new_bytes) <= len(orig_bytes)
if not length_compatible:
tty.debug('Binary failing to relocate is %s' % filename)
raise BinaryTextReplaceError(orig_bytes, new_bytes)
pad_length = len(orig_bytes) - len(new_bytes)
padding = os.sep * pad_length
padding = padding.encode('utf-8')
data = data.replace(orig_bytes, new_bytes + padding)
# Really needs to be the same length
if not len(data) == original_data_len:
print('Length of pad:', pad_length, 'should be', len(padding))
print(new_bytes, 'was to replace', orig_bytes)
raise BinaryStringReplacementError(
filename, original_data_len, len(data))
f.write(data)
f.truncate()
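The removed `_replace_prefix_bin` above relied on same-length substitution: binary files embed null-terminated path strings whose offsets cannot shift, so a shorter new prefix is padded with `os.sep` (extra path separators are harmless to the OS) until the replacement matches the original length. A sketch of just that trick, with a hypothetical `pad_replace` helper:

```python
import os


def pad_replace(data, old, new):
    """Replace ``old`` with ``new`` in bytes ``data``, padding with os.sep
    so the total length of the data is unchanged."""
    if len(new) > len(old):
        raise ValueError("new prefix longer than old")
    padding = (os.sep * (len(old) - len(new))).encode('utf-8')
    return data.replace(old, new + padding)


data = b'/long/old/prefix/bin/tool\x00'
out = pad_replace(data, b'/long/old/prefix', b'/new')
assert len(out) == len(data)  # offsets of later strings are preserved
```

The offset-based replacement that supersedes it (below) drops the padding requirement for the tail of the string by rewriting through to the null terminator, but still requires the new prefix to be no longer than the old one.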
def relocate_macho_binaries(path_names, old_layout_root, new_layout_root,
prefix_to_prefix, rel, old_prefix, new_prefix):
"""
@@ -817,49 +776,6 @@ def relocate_text(files, prefixes, concurrency=32):
tp.join()
def relocate_text_bin(binaries, prefixes, concurrency=32):
"""Replace null terminated path strings hard coded into binaries.
The new install prefix must be shorter than the original one.
Args:
binaries (list): binaries to be relocated
prefixes (OrderedDict): String prefixes which need to be changed.
concurrency (int): Desired degree of parallelism.
Raises:
BinaryTextReplaceError: when the new path is longer than the old path
"""
byte_prefixes = collections.OrderedDict({})
for orig_prefix, new_prefix in prefixes.items():
if orig_prefix != new_prefix:
if isinstance(orig_prefix, bytes):
orig_bytes = orig_prefix
else:
orig_bytes = orig_prefix.encode('utf-8')
if isinstance(new_prefix, bytes):
new_bytes = new_prefix
else:
new_bytes = new_prefix.encode('utf-8')
byte_prefixes[orig_bytes] = new_bytes
# Do relocations on text in binaries that refers to the install tree
# multiprocessing.ThreadPool.map requires single argument
args = []
for binary in binaries:
args.append((binary, byte_prefixes))
tp = multiprocessing.pool.ThreadPool(processes=concurrency)
try:
tp.map(llnl.util.lang.star(_replace_prefix_bin), args)
finally:
tp.terminate()
tp.join()
def is_relocatable(spec):
"""Returns True if an installed spec is relocatable.
@@ -1126,3 +1042,120 @@ def fixup_macos_rpaths(spec):
))
else:
tty.debug('No rpath fixup needed for ' + specname)
def compute_indices(filename, paths_to_relocate):
"""
Compute the indices in filename at which each of paths_to_relocate occurs.
Arguments:
filename (str): file to compute indices for
paths_to_relocate (List[str]): paths to find indices of
Returns:
Dict[int, str]: mapping from byte offset in the file to the path from
``paths_to_relocate`` found at that offset
with open(filename, 'rb') as f:
contents = f.read()
substring_prefix = os.path.commonprefix(paths_to_relocate).encode('utf-8')
indices = {}
index = 0
max_length = max(len(path) for path in paths_to_relocate)
while True:
try:
# We search for the smallest substring of all paths we relocate
# In practice, this is the spack install root, and we relocate
# prefixes in the root and the root itself
index = contents.index(substring_prefix, index)
except ValueError:
# The string isn't found in the rest of the binary
break
else:
# only copy the smallest portion of the binary for comparisons
substring_to_check = contents[index:index + max_length]
for path in paths_to_relocate:
# We guarantee any substring in the list comes after any superstring
p = path.encode('utf-8')
if substring_to_check.startswith(p):
indices[index] = str(path)
index += len(path)
break
else:
index += 1
return indices
def _relocate_binary_text(filename, offsets, prefix_to_prefix):
"""
Relocate the text of a single binary file, given the offsets at which the
replacements need to be made
Arguments:
filename (str): file to modify
offsets (Dict[int, str]): locations of the strings to replace
prefix_to_prefix (Dict[str, str]): strings to replace and their replacements
"""
with open(filename, 'rb+') as f:
for index, prefix in offsets.items():
replacement = prefix_to_prefix[prefix].encode('utf-8')
if len(replacement) > len(prefix):
raise BinaryTextReplaceError(prefix, replacement)
# read forward until we find the end of the string including
# the prefix and compute the replacement as we go
f.seek(index + len(prefix))
c = f.read(1)
while c not in (None, b'\x00'):
replacement += c
c = f.read(1)
# seek back to the index position and write the replacement in
# and add null-terminator
f.seek(index)
f.write(replacement)
f.write(b'\x00')
def relocate_text_bin(
files_to_relocate, prefix_to_prefix, offsets=None,
relative_root=None, concurrency=32
):
"""
For each file given, replace all keys in the given translation dict with
the associated values. Optionally executes using precomputed memoized offsets
for the substitutions.
Arguments:
files_to_relocate (List[str]): The files to modify
prefix_to_prefix (Dict[str, str]): keys are strings to replace, values are
replacements
offsets (Dict[str, Dict[int, str]]): (optional) Mapping from relative filenames to
a mapping from indices to strings to replace found at each index
relative_root (str): (optional) prefix for relative paths in offsets
"""
# defaults to the common prefix of all input files
rel_root = relative_root or os.path.commonprefix(files_to_relocate)
if offsets is None:
offsets = {}
for filename in files_to_relocate:
indices = compute_indices(
filename,
list(prefix_to_prefix.keys()),
)
relpath = os.path.relpath(filename, rel_root)
offsets[relpath] = indices
args = [
(filename, offsets[os.path.relpath(filename, rel_root)], prefix_to_prefix)
for filename in files_to_relocate
]
tp = multiprocessing.pool.ThreadPool(processes=concurrency)
try:
tp.map(llnl.util.lang.star(_relocate_binary_text), args)
finally:
tp.terminate()
tp.join()
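The speedup claimed in the top commit comes from this shape: scan each binary once for the common prefix of all install paths (in practice the Spack root), memoize the offsets found, then replay those offsets on later relocations instead of rescanning. A simplified, self-contained stand-in for `compute_indices` (hypothetical name `compute_offsets`; the real function reads the file itself and returns relative-path-keyed maps):

```python
import os


def compute_offsets(contents, paths):
    """Scan ``contents`` once and map byte offsets to the path found there."""
    root = os.path.commonprefix(paths).encode('utf-8')
    offsets, i = {}, 0
    max_len = max(len(p) for p in paths)
    while True:
        i = contents.find(root, i)  # cheap search for the shared prefix only
        if i == -1:
            break
        window = contents[i:i + max_len]
        # try superstrings before their substrings, as the real code guarantees
        for p in sorted(paths, key=len, reverse=True):
            pb = p.encode('utf-8')
            if window.startswith(pb):
                offsets[i] = p
                i += len(pb)
                break
        else:
            i += 1
    return offsets


data = b'rpath=/opt/spack/gcc\x00lib=/opt/spack/zlib\x00'
print(compute_offsets(data, ['/opt/spack/gcc', '/opt/spack/zlib']))
```

Passing a map like this as the `offsets` argument of the new `relocate_text_bin` skips the scan entirely, which is what `relocate_by_offsets` exploits when the offsets were memoized at install time.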


@@ -202,17 +202,11 @@ class RepoLoader(_PrependFileLoader):
"""Loads a Python module associated with a package in specific repository"""
#: Code in ``_package_prepend`` is prepended to imported packages.
#:
#: Spack packages were originally expected to call `from spack import *`
#: themselves, but it became difficult to manage and imports in the Spack
#: core the top-level namespace polluted by package symbols this way. To
#: solve this, the top-level ``spack`` package contains very few symbols
#: of its own, and importing ``*`` is essentially a no-op. The common
#: routines and directives that packages need are now in ``spack.pkgkit``,
#: and the import system forces packages to automatically include
#: this. This way, old packages that call ``from spack import *`` will
#: continue to work without modification, but it's no longer required.
#: Spack packages are expected to call `from spack.package import *`
#: themselves, but we are allowing a deprecation period before breaking
#: external repos that don't do this yet.
_package_prepend = ('from __future__ import absolute_import;'
'from spack.pkgkit import *')
'from spack.package import *')
def __init__(self, fullname, repo, package_name):
self.repo = repo
@@ -450,10 +444,10 @@ def is_package_file(filename):
# Package files are named `package.py` and are not in lib/spack/spack
# We have to remove the file extension because it can be .py and can be
# .pyc depending on context, and can differ between the files
import spack.package # break cycle
import spack.package_base # break cycle
filename_noext = os.path.splitext(filename)[0]
packagebase_filename_noext = os.path.splitext(
inspect.getfile(spack.package.PackageBase))[0]
inspect.getfile(spack.package_base.PackageBase))[0]
return (filename_noext != packagebase_filename_noext and
os.path.basename(filename_noext) == 'package')


@@ -15,7 +15,7 @@
import spack.build_environment
import spack.fetch_strategy
import spack.package
import spack.package_base
from spack.install_test import TestSuite
from spack.reporter import Reporter
from spack.reporters.cdash import CDash
@@ -131,7 +131,7 @@ def gather_info(do_fn):
"""
@functools.wraps(do_fn)
def wrapper(instance, *args, **kwargs):
if isinstance(instance, spack.package.PackageBase):
if isinstance(instance, spack.package_base.PackageBase):
pkg = instance
elif hasattr(args[0], 'pkg'):
pkg = args[0].pkg

View File

@@ -22,7 +22,7 @@
import spack.build_environment
import spack.fetch_strategy
import spack.package
import spack.package_base
from spack.error import SpackError
from spack.reporter import Reporter
from spack.util.crypto import checksum

View File

@@ -7,7 +7,7 @@
import spack.build_environment
import spack.fetch_strategy
import spack.package
import spack.package_base
from spack.reporter import Reporter
__all__ = ['JUnit']

View File

@@ -93,8 +93,10 @@ def rewire_node(spec, explicit):
False,
spec.build_spec.prefix,
spec.prefix)
relocate.relocate_text_bin(binaries=bins_to_relocate,
prefixes=prefix_to_prefix)
# Relocate text strings of prefixes embedded in binaries
relocate.relocate_text_bin(bins_to_relocate, prefix_to_prefix)
# Copy package into place, except for spec.json (because spec.json
# describes the old spec and not the new spliced spec).
shutil.copytree(os.path.join(tempdir, spec.dag_hash()), spec.prefix,

View File

@@ -41,7 +41,7 @@
import spack.directives
import spack.environment as ev
import spack.error
import spack.package
import spack.package_base
import spack.package_prefs
import spack.platforms
import spack.repo
@@ -1822,7 +1822,7 @@ def setup(self, driver, specs, reuse=None):
self.possible_virtuals = set(
x.name for x in specs if x.virtual
)
possible = spack.package.possible_dependencies(
possible = spack.package_base.possible_dependencies(
*specs,
virtuals=self.possible_virtuals,
deptype=spack.dependency.all_deptypes

View File

@@ -128,7 +128,7 @@ def test_cmake_bad_generator(config, mock_packages):
s.concretize()
pkg = spack.repo.get(s)
pkg.generator = 'Yellow Sticky Notes'
with pytest.raises(spack.package.InstallError):
with pytest.raises(spack.package_base.InstallError):
get_std_cmake_args(pkg)

View File

@@ -10,7 +10,7 @@
import spack.cmd.install
import spack.config
import spack.package
import spack.package_base
import spack.util.spack_json as sjson
from spack.main import SpackCommand
from spack.spec import Spec

View File

@@ -9,7 +9,7 @@
import spack.caches
import spack.main
import spack.package
import spack.package_base
import spack.stage
clean = spack.main.SpackCommand('clean')
@@ -31,7 +31,7 @@ def __init__(self, name):
def __call__(self, *args, **kwargs):
counts[self.name] += 1
monkeypatch.setattr(spack.package.PackageBase, 'do_clean',
monkeypatch.setattr(spack.package_base.PackageBase, 'do_clean',
Counter('package'))
monkeypatch.setattr(spack.stage, 'purge', Counter('stages'))
monkeypatch.setattr(

View File

@@ -20,7 +20,7 @@
import spack.config
import spack.environment as ev
import spack.hash_types as ht
import spack.package
import spack.package_base
import spack.util.executable
from spack.error import SpackError
from spack.main import SpackCommand
@@ -66,7 +66,7 @@ def test_install_package_and_dependency(
def test_install_runtests_notests(monkeypatch, mock_packages, install_mockery):
def check(pkg):
assert not pkg.run_tests
monkeypatch.setattr(spack.package.PackageBase, 'unit_test_check', check)
monkeypatch.setattr(spack.package_base.PackageBase, 'unit_test_check', check)
install('-v', 'dttop')
@@ -75,7 +75,7 @@ def test_install_runtests_root(monkeypatch, mock_packages, install_mockery):
def check(pkg):
assert pkg.run_tests == (pkg.name == 'dttop')
monkeypatch.setattr(spack.package.PackageBase, 'unit_test_check', check)
monkeypatch.setattr(spack.package_base.PackageBase, 'unit_test_check', check)
install('--test=root', 'dttop')
@@ -84,7 +84,7 @@ def test_install_runtests_all(monkeypatch, mock_packages, install_mockery):
def check(pkg):
assert pkg.run_tests
monkeypatch.setattr(spack.package.PackageBase, 'unit_test_check', check)
monkeypatch.setattr(spack.package_base.PackageBase, 'unit_test_check', check)
install('--test=all', 'a')
@@ -257,7 +257,7 @@ def test_install_commit(
"""
repo_path, filename, commits = mock_git_version_info
monkeypatch.setattr(spack.package.PackageBase,
monkeypatch.setattr(spack.package_base.PackageBase,
'git', 'file://%s' % repo_path,
raising=False)

View File

@@ -22,7 +22,7 @@
#: new fake package template
pkg_template = '''\
from spack import *
from spack.package import *
class {name}(Package):
homepage = "http://www.example.com"

View File

@@ -28,7 +28,7 @@ def test_stage_spec(monkeypatch):
def fake_stage(pkg, mirror_only=False):
expected.remove(pkg.name)
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
stage('trivial-install-test-package', 'mpileaks')
@@ -42,7 +42,7 @@ def check_stage_path(monkeypatch, tmpdir):
def fake_stage(pkg, mirror_only=False):
assert pkg.path == expected_path
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
return expected_path
@@ -69,7 +69,7 @@ def fake_stage(pkg, mirror_only=False):
assert pkg.name == 'trivial-install-test-package'
assert pkg.path is None
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
e = ev.create('test')
e.add('mpileaks')
@@ -87,7 +87,7 @@ def fake_stage(pkg, mirror_only=False):
assert pkg.name == 'mpileaks'
assert pkg.version == Version('100.100')
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
e = ev.create('test')
e.add('mpileaks@100.100')
@@ -115,7 +115,7 @@ def test_stage_full_env(mutable_mock_env_path, monkeypatch):
def fake_stage(pkg, mirror_only=False):
expected.remove(pkg.name)
monkeypatch.setattr(spack.package.PackageBase, 'do_stage', fake_stage)
monkeypatch.setattr(spack.package_base.PackageBase, 'do_stage', fake_stage)
with e:
stage()

View File

@@ -76,7 +76,9 @@ def flake8_package_with_errors(scope="function"):
package = FileFilter(filename)
package.filter("state = 'unmodified'", "state = 'modified'", string=True)
package.filter(
"from spack import *", "from spack import *\nimport os", string=True
"from spack.package import *",
"from spack.package import *\nimport os",
string=True
)
yield filename
finally:
@@ -273,17 +275,17 @@ def test_style(flake8_package, tmpdir):
relative = os.path.relpath(flake8_package)
# no args
output = style()
output = style(fail_on_error=False)
assert relative in output
assert "spack style checks were clean" in output
# one specific arg
output = style(flake8_package)
output = style(flake8_package, fail_on_error=False)
assert relative in output
assert "spack style checks were clean" in output
# specific file that isn't changed
output = style(__file__)
output = style(__file__, fail_on_error=False)
assert relative not in output
assert __file__ in output
assert "spack style checks were clean" in output

View File

@@ -13,7 +13,7 @@
import spack.cmd.install
import spack.config
import spack.package
import spack.package_base
import spack.paths
import spack.store
from spack.main import SpackCommand
@@ -239,7 +239,7 @@ def test_test_list(
reason="Not supported on Windows (yet)")
def test_has_test_method_fails(capsys):
with pytest.raises(SystemExit):
spack.package.has_test_method('printing-package')
spack.package_base.has_test_method('printing-package')
captured = capsys.readouterr()[1]
assert 'is not a class' in captured

View File

@@ -35,7 +35,7 @@
import spack.database
import spack.directory_layout
import spack.environment as ev
import spack.package
import spack.package_base
import spack.package_prefs
import spack.paths
import spack.platforms
@@ -532,7 +532,7 @@ def _pkg_install_fn(pkg, spec, prefix):
@pytest.fixture
def mock_pkg_install(monkeypatch):
monkeypatch.setattr(spack.package.PackageBase, 'install',
monkeypatch.setattr(spack.package_base.PackageBase, 'install',
_pkg_install_fn, raising=False)
@@ -934,7 +934,7 @@ def mock_fetch(mock_archive, monkeypatch):
mock_fetcher.append(URLFetchStrategy(mock_archive.url))
monkeypatch.setattr(
spack.package.PackageBase, 'fetcher', mock_fetcher)
spack.package_base.PackageBase, 'fetcher', mock_fetcher)
class MockLayout(object):

View File

@@ -29,7 +29,7 @@
from llnl.util.tty.colify import colify
import spack.database
import spack.package
import spack.package_base
import spack.repo
import spack.spec
import spack.store
@@ -792,7 +792,7 @@ def test_uninstall_by_spec(mutable_database):
with mutable_database.write_transaction():
for spec in mutable_database.query():
if spec.installed:
spack.package.PackageBase.uninstall_by_spec(spec, force=True)
spack.package_base.PackageBase.uninstall_by_spec(spec, force=True)
else:
mutable_database.remove(spec)
assert len(mutable_database.query()) == 0

View File

@@ -10,7 +10,7 @@
import spack.build_environment
import spack.repo
import spack.spec
from spack.pkgkit import build_system_flags, env_flags, inject_flags
from spack.package import build_system_flags, env_flags, inject_flags
@pytest.fixture()

View File

@@ -15,7 +15,7 @@
import spack.repo
import spack.store
import spack.util.spack_json as sjson
from spack.package import (
from spack.package_base import (
InstallError,
PackageBase,
PackageStillNeededError,
@@ -238,7 +238,7 @@ def test_flatten_deps(
assert dependency_name not in os.listdir(pkg.prefix)
# Flatten the dependencies and ensure the dependency directory is there.
spack.package.flatten_dependencies(spec, pkg.prefix)
spack.package_base.flatten_dependencies(spec, pkg.prefix)
dependency_dir = os.path.join(pkg.prefix, dependency_name)
assert os.path.isdir(dependency_dir)
@@ -322,7 +322,7 @@ def test_partial_install_keep_prefix(install_mockery, mock_fetch, monkeypatch):
# If remove_prefix is called at any point in this test, that is an
# error
pkg.succeed = False # make the build fail
monkeypatch.setattr(spack.package.Package, 'remove_prefix', mock_remove_prefix)
monkeypatch.setattr(spack.package_base.Package, 'remove_prefix', mock_remove_prefix)
with pytest.raises(spack.build_environment.ChildError):
pkg.do_install(keep_prefix=True)
assert os.path.exists(pkg.prefix)
@@ -340,7 +340,7 @@ def test_partial_install_keep_prefix(install_mockery, mock_fetch, monkeypatch):
def test_second_install_no_overwrite_first(install_mockery, mock_fetch, monkeypatch):
spec = Spec('canfail').concretized()
pkg = spack.repo.get(spec)
monkeypatch.setattr(spack.package.Package, 'remove_prefix', mock_remove_prefix)
monkeypatch.setattr(spack.package_base.Package, 'remove_prefix', mock_remove_prefix)
pkg.succeed = True
pkg.do_install()

View File

@@ -69,7 +69,8 @@ def create_build_task(pkg, install_args={}):
Create a build task for the given (concretized) package
Args:
pkg (spack.package.PackageBase): concretized package associated with the task
pkg (spack.package_base.PackageBase): concretized package associated with
the task
install_args (dict): dictionary of kwargs (or install args)
Return:
@@ -716,7 +717,7 @@ def _add(_compilers):
task.compiler = True
# Preclude any meaningful side-effects
monkeypatch.setattr(spack.package.PackageBase, 'unit_test_check', _true)
monkeypatch.setattr(spack.package_base.PackageBase, 'unit_test_check', _true)
monkeypatch.setattr(inst.PackageInstaller, '_setup_install_dir', _noop)
monkeypatch.setattr(spack.build_environment, 'start_build_process', _noop)
monkeypatch.setattr(spack.database.Database, 'add', _noop)
@@ -1049,7 +1050,7 @@ def test_install_fail_fast_on_except(install_mockery, monkeypatch, capsys):
# This will prevent b from installing, which will cause the build of a
# to be skipped.
monkeypatch.setattr(
spack.package.PackageBase,
spack.package_base.PackageBase,
'do_patch',
_test_install_fail_fast_on_except_patch
)

View File

@@ -17,7 +17,7 @@
import llnl.util.filesystem as fs
import spack.package
import spack.package_base
import spack.repo
@@ -80,7 +80,7 @@ def test_possible_dependencies_virtual(mock_packages, mpi_names):
# only one mock MPI has a dependency
expected['fake'] = set()
assert expected == spack.package.possible_dependencies(
assert expected == spack.package_base.possible_dependencies(
"mpi", transitive=False)
@@ -125,7 +125,7 @@ def test_possible_dependencies_with_multiple_classes(
'dt-diamond-bottom': set(),
})
assert expected == spack.package.possible_dependencies(*pkgs)
assert expected == spack.package_base.possible_dependencies(*pkgs)
def setup_install_test(source_paths, install_test_root):

View File

@@ -16,7 +16,7 @@
# do sanity checks only on packages modified by a PR
import spack.cmd.style as style
import spack.fetch_strategy
import spack.package
import spack.package_base
import spack.paths
import spack.repo
import spack.util.crypto as crypto
@@ -265,7 +265,7 @@ def test_all_dependencies_exist():
"""Make sure no packages have nonexisting dependencies."""
missing = {}
pkgs = [pkg for pkg in spack.repo.path.all_package_names()]
spack.package.possible_dependencies(
spack.package_base.possible_dependencies(
*pkgs, transitive=True, missing=missing)
lines = [

View File

@@ -135,10 +135,10 @@ def test_urls_for_versions(mock_packages, config):
def test_url_for_version_with_no_urls(mock_packages, config):
pkg = spack.repo.get('git-test')
with pytest.raises(spack.package.NoURLError):
with pytest.raises(spack.package_base.NoURLError):
pkg.url_for_version('1.0')
with pytest.raises(spack.package.NoURLError):
with pytest.raises(spack.package_base.NoURLError):
pkg.url_for_version('1.1')
@@ -377,7 +377,7 @@ def test_fetch_options(mock_packages, config):
def test_has_test_method_fails(capsys):
with pytest.raises(SystemExit):
spack.package.has_test_method('printing-package')
spack.package_base.has_test_method('printing-package')
captured = capsys.readouterr()[1]
assert 'is not a class' in captured

View File

@@ -21,7 +21,7 @@
import spack.binary_distribution as bindist
import spack.cmd.buildcache as buildcache
import spack.package
import spack.package_base
import spack.repo
import spack.store
import spack.util.gpg
@@ -575,10 +575,10 @@ def fetch(self):
def fake_fn(self):
return fetcher
orig_fn = spack.package.PackageBase.fetcher
spack.package.PackageBase.fetcher = fake_fn
orig_fn = spack.package_base.PackageBase.fetcher
spack.package_base.PackageBase.fetcher = fake_fn
yield
spack.package.PackageBase.fetcher = orig_fn
spack.package_base.PackageBase.fetcher = orig_fn
@pytest.mark.parametrize("manual,instr", [(False, False), (False, True),
@@ -598,7 +598,7 @@ def _instr(pkg):
pkg.manual_download = manual
if instr:
monkeypatch.setattr(spack.package.PackageBase, 'download_instr',
monkeypatch.setattr(spack.package_base.PackageBase, 'download_instr',
_instr)
expected = pkg.download_instr if manual else 'All fetchers failed'
@@ -616,7 +616,7 @@ def fetch(self):
raise Exception("Sources are fetched but shouldn't have been")
fetcher = FetchStrategyComposite()
fetcher.append(FetchingNotAllowed())
monkeypatch.setattr(spack.package.PackageBase, 'fetcher', fetcher)
monkeypatch.setattr(spack.package_base.PackageBase, 'fetcher', fetcher)
def test_fetch_without_code_is_noop(install_mockery, fetching_not_allowed):

View File

@@ -294,14 +294,37 @@ def test_set_elf_rpaths_warning(mock_patchelf):
assert output is None
def test_relocate_binary_text(tmpdir):
filename = str(tmpdir.join('binary'))
with open(filename, 'wb') as f:
f.write(b'somebinarytext')
f.write(b'/usr/relpath')
f.write(b'\0')
f.write(b'morebinarytext')
spack.relocate._relocate_binary_text(filename, {14: '/usr'}, {'/usr': '/foo'})
with open(filename, 'rb') as f:
contents = f.read()
assert b'/foo/relpath\0' in contents
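The offset-based replacement this test exercises can be sketched in plain Python. This is an illustrative helper under the constraints stated in the surrounding tests (the new prefix may not be longer than the old one), not Spack's actual `_relocate_binary_text`:

```python
def replace_prefix_at(data, offset, old, new):
    """Replace the `old` prefix at byte `offset` in `data` (a bytearray),
    NUL-padding the C string so the binary's overall layout is unchanged.
    Illustrative sketch only, not Spack's actual implementation."""
    if len(new) > len(old):
        raise ValueError("new prefix is longer than old prefix")
    end = data.index(b'\0', offset)              # end of the C string
    suffix = bytes(data[offset + len(old):end])  # path after the prefix
    replacement = new + suffix
    data[offset:end] = replacement + b'\0' * (end - offset - len(replacement))

buf = bytearray(b'somebinarytext/usr/relpath\0morebinarytext')
replace_prefix_at(buf, buf.index(b'/usr'), b'/usr', b'/foo')
print(bytes(buf))  # b'somebinarytext/foo/relpath\x00morebinarytext'
```

Because the replacement is padded rather than shifted, every other byte offset in the file stays valid, which is what makes memoizing offsets across relocations safe.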
@pytest.mark.requires_executables('patchelf', 'strings', 'file', 'gcc')
@skip_unless_linux
def test_replace_prefix_bin(hello_world):
def test_relocate_binary(hello_world):
# Compile a "Hello world!" executable and set RPATHs
executable = hello_world(rpaths=['/usr/lib', '/usr/lib64'])
with open(str(executable), 'rb') as f:
contents = f.read()
index_0 = contents.index(b'/usr')
index_1 = contents.index(b'/usr', index_0 + 1)
offsets = {index_0: '/usr/lib', index_1: '/usr/lib64'}
# Relocate the RPATHs
spack.relocate._replace_prefix_bin(str(executable), {b'/usr': b'/foo'})
spack.relocate._relocate_binary_text(
str(executable), offsets,
{'/usr/lib': '/foo/lib', '/usr/lib64': '/foo/lib64'}
)
# Some compilers add their own rpaths, so ensure our changes survive in the final result
assert '/foo/lib:/foo/lib64' in rpaths_for(executable)
@@ -390,13 +413,11 @@ def test_relocate_text_bin(hello_world, copy_binary, tmpdir):
assert not text_in_bin(str(new_binary.dirpath()), new_binary)
# Check this call succeeds
orig_path_bytes = str(orig_binary.dirpath()).encode('utf-8')
new_path_bytes = str(new_binary.dirpath()).encode('utf-8')
orig_path_bytes = str(orig_binary.dirpath())
new_path_bytes = str(new_binary.dirpath())
spack.relocate.relocate_text_bin(
[str(new_binary)],
{orig_path_bytes: new_path_bytes}
)
[str(new_binary)], {orig_path_bytes: new_path_bytes})
# Check original directory is not there anymore and it was
# substituted with the new one
@@ -405,15 +426,14 @@ def test_relocate_text_bin(hello_world, copy_binary, tmpdir):
def test_relocate_text_bin_raise_if_new_prefix_is_longer(tmpdir):
short_prefix = b'/short'
long_prefix = b'/much/longer'
short_prefix = '/short'
long_prefix = '/much/longer'
fpath = str(tmpdir.join('fakebin'))
with open(fpath, 'w') as f:
f.write('/short')
with pytest.raises(spack.relocate.BinaryTextReplaceError):
spack.relocate.relocate_text_bin(
[fpath], {short_prefix: long_prefix}
)
[fpath], {short_prefix: long_prefix})
@pytest.mark.requires_executables('install_name_tool', 'file', 'cc')

View File

@@ -7,7 +7,7 @@
import pytest
import spack.package
import spack.package_base
import spack.paths
import spack.repo
@@ -116,11 +116,12 @@ def test_absolute_import_spack_packages_as_python_modules(mock_packages):
assert hasattr(spack.pkg.builtin.mock, 'mpileaks')
assert hasattr(spack.pkg.builtin.mock.mpileaks, 'Mpileaks')
assert isinstance(spack.pkg.builtin.mock.mpileaks.Mpileaks,
spack.package.PackageMeta)
assert issubclass(spack.pkg.builtin.mock.mpileaks.Mpileaks, spack.package.Package)
spack.package_base.PackageMeta)
assert issubclass(spack.pkg.builtin.mock.mpileaks.Mpileaks,
spack.package_base.Package)
def test_relative_import_spack_packages_as_python_modules(mock_packages):
from spack.pkg.builtin.mock.mpileaks import Mpileaks
assert isinstance(Mpileaks, spack.package.PackageMeta)
assert issubclass(Mpileaks, spack.package.Package)
assert isinstance(Mpileaks, spack.package_base.PackageMeta)
assert issubclass(Mpileaks, spack.package_base.Package)

View File

@@ -8,7 +8,7 @@
import pytest
import spack.error
import spack.package
import spack.package_base
import spack.util.hash as hashutil
from spack.dependency import Dependency, all_deptypes, canonical_deptype
from spack.spec import Spec

View File

@@ -14,7 +14,7 @@
from llnl.util.link_tree import MergeConflictError
import spack.package
import spack.package_base
import spack.spec
from spack.directory_layout import DirectoryLayout
from spack.filesystem_view import YamlFilesystemView

View File

@@ -14,7 +14,7 @@
from llnl.util.filesystem import working_dir
import spack.package
import spack.package_base
import spack.spec
from spack.util.executable import which
from spack.version import Version, VersionList, VersionRange, ver
@@ -590,7 +590,7 @@ def test_invalid_versions(version_str):
reason="Not supported on Windows (yet)")
def test_versions_from_git(mock_git_version_info, monkeypatch, mock_packages):
repo_path, filename, commits = mock_git_version_info
monkeypatch.setattr(spack.package.PackageBase, 'git', 'file://%s' % repo_path,
monkeypatch.setattr(spack.package_base.PackageBase, 'git', 'file://%s' % repo_path,
raising=False)
for commit in commits:
@@ -614,7 +614,7 @@ def test_git_hash_comparisons(
"""Check that hashes compare properly to versions
"""
repo_path, filename, commits = mock_git_version_info
monkeypatch.setattr(spack.package.PackageBase,
monkeypatch.setattr(spack.package_base.PackageBase,
'git', 'file://%s' % repo_path,
raising=False)

View File

@@ -15,7 +15,7 @@
class MockPackageBase(object):
"""Internal base class for mocking ``spack.package.PackageBase``.
"""Internal base class for mocking ``spack.package_base.PackageBase``.
Use ``MockPackageMultiRepo.add_package()`` to create new instances.

View File

@@ -7,7 +7,7 @@
import spack.directives
import spack.error
import spack.package
import spack.package_base
import spack.repo
import spack.spec
import spack.util.hash
@@ -63,7 +63,7 @@ def __init__(self, spec):
# list of URL attributes and metadata attributes
# these will be removed from packages.
self.metadata_attrs = [s.url_attr for s in spack.fetch_strategy.all_strategies]
self.metadata_attrs += spack.package.Package.metadata_attrs
self.metadata_attrs += spack.package_base.Package.metadata_attrs
self.spec = spec
self.in_classdef = False # used to avoid nested classdefs

View File

@@ -594,7 +594,7 @@ def find_versions_of_archive(
list_depth (int): max depth to follow links on list_url pages.
Defaults to 0.
concurrency (int): maximum number of concurrent requests
reference_package (spack.package.Package or None): a spack package
reference_package (spack.package_base.Package or None): a spack package
used as a reference for url detection. Uses the url_for_version
method on the package to produce reference urls which, if found,
are preferred.

View File

@@ -95,7 +95,7 @@ def validate_or_raise(self, vspec, pkg=None):
Args:
vspec (Variant): instance to be validated
pkg (spack.package.Package): the package that required the validation,
pkg (spack.package_base.Package): the package that required the validation,
if available
Raises:

View File

@@ -177,6 +177,7 @@ class Version(object):
]
def __init__(self, string):
# type: (str) -> None
if not isinstance(string, str):
string = str(string)

View File

@@ -16,8 +16,9 @@ honor_noqa = true
[tool.mypy]
python_version = 3.7
files = ['lib/spack/llnl/**/*.py', 'lib/spack/spack/**/*.py']
files = ['lib/spack/llnl/**/*.py', 'lib/spack/spack/**/*.py', './var/spack/repos/builtin/packages/*/package.py']
mypy_path = ['bin', 'lib/spack', 'lib/spack/external', 'var/spack/repos/builtin']
allow_redefinition = true
# This and a generated import file allows supporting packages
namespace_packages = true
@@ -36,6 +37,8 @@ ignore_missing_imports = true
module = 'packages.*'
ignore_errors = false
ignore_missing_imports = false
# we can't do this here, not a module scope option, in spack style instead
# disable_error_code = 'no-redef'
[[tool.mypy.overrides]]
module = 'llnl.*'
@@ -58,6 +61,15 @@ ignore_missing_imports = true
module = 'jinja2'
follow_imports = 'skip'
[tool.pyright]
useLibraryCodeForTypes = true
reportMissingImports = true
reportWildcardImportFromLibrary = false
include = ['lib/spack']
ignore = ['lib/spack/external']
extraPaths = ['lib/spack', 'lib/spack/external']
[tool.coverage.run]
parallel = true
concurrency = ["multiprocessing"]

View File

@@ -45,6 +45,9 @@ default:
paths:
- "${CI_PROJECT_DIR}/jobs_scratch_dir"
tags: ["spack", "aws", "public", "medium", "x86_64"]
variables:
KUBERNETES_CPU_REQUEST: 4000m
KUBERNETES_MEMORY_REQUEST: 8G
interruptible: true
retry:
max: 2
@@ -52,12 +55,22 @@ default:
- runner_system_failure
- stuck_or_timeout_failure
.generate-aarch64:
extends: [ ".generate" ]
tags: ["spack", "aws", "public", "medium", "aarch64"]
.pr-generate:
extends: [ ".pr", ".generate" ]
.pr-generate-aarch64:
extends: [ ".pr", ".generate-aarch64" ]
.protected-generate:
extends: [ ".protected", ".generate" ]
.protected-generate-aarch64:
extends: [ ".protected", ".generate-aarch64" ]
.build:
stage: build
@@ -423,6 +436,164 @@ data-vis-sdk-protected-build:
- artifacts: True
job: data-vis-sdk-protected-generate
########################################
# AWS AHUG Applications (x86_64)
########################################
# Call this AFTER .*-generate
.aws-ahug-overrides:
# This controls image for generate step; build step is controlled by spack.yaml
# Note that generator emits OS info for build so these should be the same.
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
.aws-ahug:
variables:
SPACK_CI_STACK_NAME: aws-ahug
aws-ahug-pr-generate:
extends: [ ".aws-ahug", ".pr-generate", ".aws-ahug-overrides" ]
tags: ["spack", "public", "medium", "x86_64_v4"]
aws-ahug-protected-generate:
extends: [ ".aws-ahug", ".protected-generate", ".aws-ahug-overrides" ]
tags: ["spack", "public", "medium", "x86_64_v4"]
aws-ahug-pr-build:
extends: [ ".aws-ahug", ".pr-build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: aws-ahug-pr-generate
strategy: depend
needs:
- artifacts: True
job: aws-ahug-pr-generate
aws-ahug-protected-build:
extends: [ ".aws-ahug", ".protected-build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: aws-ahug-protected-generate
strategy: depend
needs:
- artifacts: True
job: aws-ahug-protected-generate
# Parallel Pipeline for aarch64 (reuses override image, but generates and builds on aarch64)
.aws-ahug-aarch64:
variables:
SPACK_CI_STACK_NAME: aws-ahug-aarch64
aws-ahug-aarch64-pr-generate:
extends: [ ".aws-ahug-aarch64", ".pr-generate-aarch64", ".aws-ahug-overrides" ]
aws-ahug-aarch64-protected-generate:
extends: [ ".aws-ahug-aarch64", ".protected-generate-aarch64", ".aws-ahug-overrides" ]
aws-ahug-aarch64-pr-build:
extends: [ ".aws-ahug-aarch64", ".pr-build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: aws-ahug-aarch64-pr-generate
strategy: depend
needs:
- artifacts: True
job: aws-ahug-aarch64-pr-generate
aws-ahug-aarch64-protected-build:
extends: [ ".aws-ahug-aarch64", ".protected-build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: aws-ahug-aarch64-protected-generate
strategy: depend
needs:
- artifacts: True
job: aws-ahug-aarch64-protected-generate
########################################
# AWS ISC Applications (x86_64)
########################################
# Call this AFTER .*-generate
.aws-isc-overrides:
# This controls image for generate step; build step is controlled by spack.yaml
# Note that generator emits OS info for build so these should be the same.
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
.aws-isc:
variables:
SPACK_CI_STACK_NAME: aws-isc
aws-isc-pr-generate:
extends: [ ".aws-isc", ".pr-generate", ".aws-isc-overrides" ]
tags: ["spack", "public", "medium", "x86_64_v4"]
aws-isc-protected-generate:
extends: [ ".aws-isc", ".protected-generate", ".aws-isc-overrides" ]
tags: ["spack", "public", "medium", "x86_64_v4"]
aws-isc-pr-build:
extends: [ ".aws-isc", ".pr-build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: aws-isc-pr-generate
strategy: depend
needs:
- artifacts: True
job: aws-isc-pr-generate
aws-isc-protected-build:
extends: [ ".aws-isc", ".protected-build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: aws-isc-protected-generate
strategy: depend
needs:
- artifacts: True
job: aws-isc-protected-generate
# Parallel Pipeline for aarch64 (reuses override image, but generates and builds on aarch64)
.aws-isc-aarch64:
variables:
SPACK_CI_STACK_NAME: aws-isc-aarch64
aws-isc-aarch64-pr-generate:
extends: [ ".aws-isc-aarch64", ".pr-generate-aarch64", ".aws-isc-overrides" ]
aws-isc-aarch64-protected-generate:
extends: [ ".aws-isc-aarch64", ".protected-generate-aarch64", ".aws-isc-overrides" ]
aws-isc-aarch64-pr-build:
extends: [ ".aws-isc-aarch64", ".pr-build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: aws-isc-aarch64-pr-generate
strategy: depend
needs:
- artifacts: True
job: aws-isc-aarch64-pr-generate
aws-isc-aarch64-protected-build:
extends: [ ".aws-isc-aarch64", ".protected-build" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: aws-isc-aarch64-protected-generate
strategy: depend
needs:
- artifacts: True
job: aws-isc-aarch64-protected-generate
########################################
# Spack Tutorial
########################################

View File

@@ -0,0 +1,329 @@
spack:
view: false
concretizer:
reuse: false
unify: false
config:
concretizer: clingo
install_tree:
root: /home/software/spack
padded_length: 384
projections:
all: '{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'
packages:
all:
providers:
blas:
- openblas
mkl:
- intel-oneapi-mkl
mpi:
- openmpi
- mpich
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
version:
- 2.36.1
doxygen:
version:
- 1.8.20
elfutils:
variants: +bzip2 ~nls +xz
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=efa,tcp,udp,sockets,verbs,shm,mrail,rxd,rxm
libunwind:
variants: +pic +xz
#m4:
# version:
# - 1.4.18
mesa:
variants: ~llvm
mesa18:
variants: ~llvm
mpich:
#variants: ~wrapperrpath pmi=pmi netmod=ofi device=ch4
variants: ~wrapperrpath netmod=ofi device=ch4
#munge:
# variants: localstatedir=/var
ncurses:
variants: +termlib
openblas:
variants: threads=openmp
openmpi:
#variants: +pmi +internal-hwloc fabrics=ofi schedulers=slurm
variants: fabrics=ofi
openturns:
version: [1.18]
#slurm:
# variants: +pmix sysconfdir=/opt/slurm/etc
trilinos:
variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
definitions:
- compiler_specs:
- gcc
#- nvhpc@22.1
# - compilers:
# - '%gcc@7.5.0'
# - '%arm@21.0.0.879'
# - '%nvhpc@21.2'
# - 'arm@21.0.0.879'
#- when: arch.satisfies('os=ubuntu18.04')
# compilers: ['gcc@7.5.0', $bootstrap-compilers]
#- when: arch.satisfies('os=amzn2')
# compilers: ['gcc@7.3.1', $bootstrap-compilers]
# Note skipping spot since no spack package for it
- ahug_miniapps:
- cloverleaf
# - coevp
# Bad code pushed to github; needs specific versioning
# - cohmm
# - examinimd
# - exampm
# depends on openblas version conflicting on gcc@7.3.1
# - exasp2
# depends on openblas version conflicting on gcc@7.3.1
# - gamess-ri-mp2-miniapp
- hpcg
# depends on openblas version conflicting on gcc@7.3.1
# - laghos
- lulesh
# - miniaero
- miniamr
- minife
- minighost
# fails due to x86 specific timer (asm instructions)
# - minigmg
- minimd
# depends on openblas version conflicting on gcc@7.3.1
# - miniqmc
- minismac2d
- minitri
- minivite
- minixyce
- pennant
- picsarlite
- quicksilver
# - remhos
- rsbench
- simplemoc
- snap
- snappy
- tealeaf
# depends on openblas version conflicting on gcc@7.3.1
# - thornado-mini
- tycho2
- xsbench
- ahug_fullapps:
# depends on openblas version conflicting on gcc@7.3.1
# - abinit
- abyss
# conflicts on trilinos
# - albany
# - amber
- amg2013
# Bad variant fftw
# - aoflagger
# - athena
# - bowtie2
- branson
# - camx
# Bad variant gpu
# - candle-benchmarks
- cbench
- cgm
- chatterbug
# - cistem
- comd
# old version of openmpi
# - converge
# bad variant tensor ops mpi
# - cosmoflow-benchmark
# - cosmomc
- cosp2
# libxsmm not avail on arm
# - cp2k
# - dock
# - elk
- elmerfem
# - exabayes
# - examl
# - flecsph
# trilinos variant mumps
# - frontistr
- gatk
- graph500
- hpgmg
- lammps
- latte
- macsio
# - meep
- meme
# - modylas
# - mrbayes
# - mrchem
# cudnn dependency
# - mxnet
# trilinos variant mumps
# - nalu
# - nalu-wind
# - namd
- nek5000
- nekbone
# - nektar
# - nest
- nut
# - nwchem
- octopus
- openmm
- pathfinder
# - picsar
- pism
# meson version
# - qbox
- qmcpack
- quantum-espresso
# - relion
# - siesta
- snbone
- star
- su2
- swfft
- tinker
# gfortran lt 9 unsupported
# - vasp
- vpfft
- vpic
- warpx
# - yambo
- compiler:
- '%gcc@7.3.1'
- target:
- 'target=aarch64'
- 'target=graviton2'
specs:
- matrix:
- - $ahug_miniapps
- - $compiler
- - $target
- matrix:
- - $ahug_fullapps
- - $compiler
- - $target
# Build compilers to stage in binary cache
- matrix:
- - $compiler_specs
- - $compiler
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/aws-ahug-aarch64" }
gitlab-ci:
script:
- . "./share/spack/setup-env.sh"
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
mappings:
- match:
- llvm
- llvm-amdgpu
- paraview
runner-attributes:
tags: [ "spack", "huge", "aarch64" ]
variables:
CI_JOB_SIZE: huge
KUBERNETES_CPU_REQUEST: 11000m
KUBERNETES_MEMORY_REQUEST: 42G
- match:
- ascent
- axom
- cuda
- dyninst
- gcc
- ginkgo
- hpx
- kokkos-kernels
- kokkos-nvcc-wrapper
- magma
- mfem
- mpich
- openturns
- precice
- raja
- rocblas
- rocsolver
- rust
- slate
- strumpack
- sundials
- trilinos
- umpire
- vtk-h
- vtk-m
- warpx
runner-attributes:
tags: [ "spack", "large", "aarch64" ]
variables:
CI_JOB_SIZE: large
KUBERNETES_CPU_REQUEST: 8000m
KUBERNETES_MEMORY_REQUEST: 12G
- match: ['os=amzn2']
runner-attributes:
tags: ["spack", "aarch64"]
variables:
CI_JOB_SIZE: "default"
broken-specs-url: "s3://spack-binaries/broken-specs"
service-job-attributes:
before_script:
- . "./share/spack/setup-env.sh"
- spack --version
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
tags: ["spack", "public", "aarch64"]
signing-job-attributes:
image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
tags: ["spack", "aws"]
script:
- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
- /sign.sh
- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
cdash:
build-group: AHUG ARM HPC User Group
url: https://cdash.spack.io
project: Spack Testing
site: Cloud Gitlab Infrastructure
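
Each `matrix:` entry in the `specs:` section above crosses a definition list (such as `$ahug_miniapps`) with `$compiler` and `$target`; the expansion is a cartesian product. A minimal sketch with a hypothetical helper, not Spack's actual implementation:

```python
from itertools import product


def expand_matrix(*rows):
    """Cross every list of spec fragments and join each combination into one
    spec string -- a simplified model of a Spack `matrix:` entry."""
    return [" ".join(parts) for parts in product(*rows)]


miniapps = ["hpcg", "lulesh"]
compilers = ["%gcc@7.3.1"]
targets = ["target=aarch64", "target=graviton2"]

specs = expand_matrix(miniapps, compilers, targets)
# 2 miniapps x 1 compiler x 2 targets = 4 specs,
# e.g. "hpcg %gcc@7.3.1 target=aarch64"
```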


@@ -0,0 +1,330 @@
spack:
view: false
concretizer:
reuse: false
unify: false
config:
concretizer: clingo
install_tree:
root: /home/software/spack
padded_length: 384
projections:
all: '{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'
packages:
all:
providers:
blas:
- openblas
mkl:
- intel-oneapi-mkl
mpi:
- openmpi
- mpich
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
version:
- 2.36.1
doxygen:
version:
- 1.8.20
elfutils:
variants: +bzip2 ~nls +xz
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=efa,tcp,udp,sockets,verbs,shm,mrail,rxd,rxm
libunwind:
variants: +pic +xz
#m4:
# version:
# - 1.4.18
mesa:
variants: ~llvm
mesa18:
variants: ~llvm
mpich:
#variants: ~wrapperrpath pmi=pmi netmod=ofi device=ch4
variants: ~wrapperrpath netmod=ofi device=ch4
#munge:
# variants: localstatedir=/var
ncurses:
variants: +termlib
openblas:
variants: threads=openmp
openmpi:
#variants: +pmi +internal-hwloc fabrics=ofi schedulers=slurm
variants: fabrics=ofi
openturns:
version: [1.18]
#slurm:
# variants: +pmix sysconfdir=/opt/slurm/etc
trilinos:
variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
definitions:
- compiler_specs:
- gcc
#- nvhpc@22.1
# - compilers:
# - '%gcc@7.5.0'
# - '%arm@21.0.0.879'
# - '%nvhpc@21.2'
# - 'arm@21.0.0.879'
#- when: arch.satisfies('os=ubuntu18.04')
# compilers: ['gcc@7.5.0', $bootstrap-compilers]
#- when: arch.satisfies('os=amzn2')
# compilers: ['gcc@7.3.1', $bootstrap-compilers]
# Note skipping spot since no spack package for it
- ahug_miniapps:
- cloverleaf
# - coevp
# Bad code pushed to github; needs specific versioning
# - cohmm
# - examinimd
# - exampm
# depends on openblas version conflicting on gcc@7.3.1
# - exasp2
# depends on openblas version conflicting on gcc@7.3.1
# - gamess-ri-mp2-miniapp
- hpcg
# depends on openblas version conflicting on gcc@7.3.1
# - laghos
- lulesh
# - miniaero
- miniamr
- minife
- minighost
# fails due to x86 specific timer (asm instructions)
# - minigmg
- minimd
# depends on openblas version conflicting on gcc@7.3.1
# - miniqmc
- minismac2d
- minitri
- minivite
- minixyce
- pennant
- picsarlite
- quicksilver
# - remhos
- rsbench
- simplemoc
- snap
- snappy
- tealeaf
# depends on openblas version conflicting on gcc@7.3.1
# - thornado-mini
- tycho2
- xsbench
- ahug_fullapps:
# depends on openblas version conflicting on gcc@7.3.1
# - abinit
- abyss
# conflicts on trilinos
# - albany
# - amber
- amg2013
# Bad variant fftw
# - aoflagger
# - athena
# - bowtie2
- branson
# - camx
# Bad variant gpu
# - candle-benchmarks
- cbench
- cgm
- chatterbug
# - cistem
- comd
# old version of openmpi
# - converge
# bad variant tensor ops mpi
# - cosmoflow-benchmark
# - cosmomc
- cosp2
# libxsmm not available on ARM
# - cp2k
# - dock
# - elk
- elmerfem
# - exabayes
# - examl
# - flecsph
# trilinos variant mumps
# - frontistr
- gatk
- graph500
- hpgmg
- lammps
- latte
- macsio
# - meep
- meme
# - modylas
# - mrbayes
# - mrchem
# cudnn dependency
# - mxnet
# trilinos variant mumps
# - nalu
# - nalu-wind
# - namd
- nek5000
- nekbone
# - nektar
# - nest
- nut
# - nwchem
- octopus
- openmm
- pathfinder
# - picsar
- pism
# meson version
# - qbox
- qmcpack
- quantum-espresso
# - relion
# - siesta
- snbone
- star
- su2
- swfft
- tinker
# gfortran < 9 unsupported
# - vasp
- vpfft
- vpic
- warpx
# - yambo
- compiler:
- '%gcc@7.3.1'
- target:
#- 'target=x86_64'
- 'target=x86_64_v3'
- 'target=x86_64_v4'
specs:
- matrix:
- - $ahug_miniapps
- - $compiler
- - $target
- matrix:
- - $ahug_fullapps
- - $compiler
- - $target
# Build compilers to stage in binary cache
- matrix:
- - $compiler_specs
- - $compiler
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/aws-ahug" }
gitlab-ci:
script:
- . "./share/spack/setup-env.sh"
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
mappings:
- match:
- llvm
- llvm-amdgpu
- paraview
runner-attributes:
tags: [ "spack", "huge", "x86_64_v4" ]
variables:
CI_JOB_SIZE: huge
KUBERNETES_CPU_REQUEST: 11000m
KUBERNETES_MEMORY_REQUEST: 42G
- match:
- ascent
- axom
- cuda
- dyninst
- gcc
- ginkgo
- hpx
- kokkos-kernels
- kokkos-nvcc-wrapper
- magma
- mfem
- mpich
- openturns
- precice
- raja
- rocblas
- rocsolver
- rust
- slate
- strumpack
- sundials
- trilinos
- umpire
- vtk-h
- vtk-m
- warpx
runner-attributes:
tags: [ "spack", "large", "x86_64_v4" ]
variables:
CI_JOB_SIZE: large
KUBERNETES_CPU_REQUEST: 8000m
KUBERNETES_MEMORY_REQUEST: 12G
- match: ['os=amzn2']
runner-attributes:
tags: ["spack", "x86_64_v4"]
variables:
CI_JOB_SIZE: "default"
broken-specs-url: "s3://spack-binaries/broken-specs"
service-job-attributes:
before_script:
- . "./share/spack/setup-env.sh"
- spack --version
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
tags: ["spack", "public", "x86_64_v4"]
signing-job-attributes:
image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
tags: ["spack", "aws"]
script:
- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
- /sign.sh
- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
cdash:
build-group: AHUG ARM HPC User Group
url: https://cdash.spack.io
project: Spack Testing
site: Cloud Gitlab Infrastructure
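
The `mappings:` list above pairs `match:` package lists with `runner-attributes:`, so heavyweight builds like `llvm` land on larger runners. A small sketch of the first-match lookup this implies (an assumption about evaluation order, modeled in Python rather than Spack's actual code):

```python
# Ordered (match_list, attributes) pairs mirroring the `mappings:` section;
# the final catch-all corresponds to the `match: ['os=amzn2']` entry.
MAPPINGS = [
    (["llvm", "llvm-amdgpu", "paraview"],
     {"tags": ["spack", "huge", "x86_64_v4"], "CI_JOB_SIZE": "huge"}),
    (["ascent", "cuda", "trilinos", "warpx"],
     {"tags": ["spack", "large", "x86_64_v4"], "CI_JOB_SIZE": "large"}),
]
DEFAULT = {"tags": ["spack", "x86_64_v4"], "CI_JOB_SIZE": "default"}


def runner_attributes(pkg_name):
    """Return the attributes of the first mapping whose match list contains
    the package; fall back to the default runner otherwise."""
    for match_list, attrs in MAPPINGS:
        if pkg_name in match_list:
            return attrs
    return DEFAULT
```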


@@ -0,0 +1,256 @@
spack:
view: false
concretizer:
reuse: false
unify: false
config:
concretizer: clingo
install_tree:
root: /home/software/spack
padded_length: 384
projections:
all: '{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'
packages:
all:
providers:
blas:
- openblas
mkl:
- intel-oneapi-mkl
mpi:
- openmpi
- mpich
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
version:
- 2.36.1
doxygen:
version:
- 1.8.20
elfutils:
variants: +bzip2 ~nls +xz
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=efa,tcp,udp,sockets,verbs,shm,mrail,rxd,rxm
libunwind:
variants: +pic +xz
mesa:
variants: ~llvm
mesa18:
variants: ~llvm
mpich:
variants: ~wrapperrpath netmod=ofi device=ch4
ncurses:
variants: +termlib
openblas:
variants: threads=openmp
openmpi:
variants: fabrics=ofi +legacylaunchers
openturns:
version: [1.18]
relion:
variants: ~mklfft
# texlive:
# version: [20210325]
trilinos:
variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
definitions:
- compiler_specs:
- gcc@11.2
# Licensing OK?
# - intel-oneapi-compilers@2022.1
# - nvhpc
- cuda_specs:
# Depends on ctffind which embeds fsincos (x86-specific asm) within code. Will not build on ARM
#- relion +cuda cuda_arch=70
- raja +cuda cuda_arch=70
- mfem +cuda cuda_arch=70
- app_specs:
- bwa
# Depends on simde which requires newer compiler?
#- bowtie2
# Requires x86_64 specific ASM
#- cistem
- cromwell
- fastqc
- flux-sched
- flux-core
- flux-pmix
- gatk
- gromacs
- lammps
- wrf build_type=dm+sm
- mfem
- mpas-model ^parallelio+pnetcdf
- nextflow
- octave
- openfoam
- osu-micro-benchmarks
- parallel
- paraview
- picard
- quantum-espresso
- raja
# Depends on bowtie2 -> simde which requires newer compiler?
#- rsem
# Errors on texlive
#- rstudio
- salmon
- samtools
- seqtk
- snakemake
- star
# Requires gcc@9:
#- ufs-weather-model
# requires LLVM which fails without constraint
#- visit
- lib_specs:
- openmpi fabrics=ofi
- openmpi fabrics=ofi +legacylaunchers
- openmpi fabrics=auto
- mpich
- libfabric
- compiler:
- '%gcc@7.3.1'
- target:
- 'target=aarch64'
- 'target=graviton2'
specs:
- matrix:
- - $cuda_specs
- - $compiler
- - $target
- matrix:
- - $app_specs
- - $compiler
- - $target
- matrix:
- - $lib_specs
- - $compiler
- - $target
- matrix:
- - $compiler_specs
- - $compiler
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/aws-isc-aarch64" }
gitlab-ci:
script:
- . "./share/spack/setup-env.sh"
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
mappings:
- match:
- llvm
- llvm-amdgpu
- paraview
runner-attributes:
tags: [ "spack", "huge", "aarch64" ]
variables:
CI_JOB_SIZE: huge
KUBERNETES_CPU_REQUEST: 15000m
KUBERNETES_MEMORY_REQUEST: 62G
- match:
- ascent
- atk
- axom
- cistem
- ctffind
- cuda
- dyninst
- gcc
- ginkgo
- hdf5
- hpx
- kokkos-kernels
- kokkos-nvcc-wrapper
- magma
- mfem
- mpich
- openturns
- parallelio
- precice
- raja
- relion
- rocblas
- rocsolver
- rust
- slate
- strumpack
- sundials
- trilinos
- umpire
- vtk
- vtk-h
- vtk-m
- warpx
- wrf
- wxwidgets
runner-attributes:
tags: [ "spack", "large", "aarch64" ]
variables:
CI_JOB_SIZE: large
KUBERNETES_CPU_REQUEST: 8000m
KUBERNETES_MEMORY_REQUEST: 12G
- match: ['os=amzn2']
runner-attributes:
tags: ["spack", "aarch64"]
variables:
CI_JOB_SIZE: "default"
broken-specs-url: "s3://spack-binaries/broken-specs"
service-job-attributes:
before_script:
- . "./share/spack/setup-env.sh"
- spack --version
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
tags: ["spack", "public", "aarch64"]
signing-job-attributes:
image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
tags: ["spack", "aws"]
script:
- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
- /sign.sh
- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
cdash:
build-group: AWS Packages
url: https://cdash.spack.io
project: Spack Testing
site: Cloud Gitlab Infrastructure


@@ -0,0 +1,258 @@
spack:
view: false
concretizer:
reuse: false
unify: false
config:
concretizer: clingo
install_tree:
root: /home/software/spack
padded_length: 384
projections:
all: '{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'
packages:
all:
providers:
blas:
- openblas
mkl:
- intel-oneapi-mkl
mpi:
- openmpi
- mpich
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
version:
- 2.36.1
doxygen:
version:
- 1.8.20
elfutils:
variants: +bzip2 ~nls +xz
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=efa,tcp,udp,sockets,verbs,shm,mrail,rxd,rxm
libunwind:
variants: +pic +xz
mesa:
variants: ~llvm
mesa18:
variants: ~llvm
mpich:
variants: ~wrapperrpath netmod=ofi device=ch4
ncurses:
variants: +termlib
openblas:
variants: threads=openmp
openmpi:
variants: fabrics=ofi +legacylaunchers
openturns:
version: [1.18]
relion:
variants: ~mklfft
# texlive:
# version: [20210325]
trilinos:
variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
definitions:
- compiler_specs:
- gcc@11.2
# Licensing OK?
# - intel-oneapi-compilers@2022.1
# - nvhpc
- cuda_specs:
# Disabled for consistency with aarch64
#- relion +cuda cuda_arch=70
- raja +cuda cuda_arch=70
- mfem +cuda cuda_arch=70
- app_specs:
- bwa
# Disabled for consistency with aarch64
#- bowtie2
# Disabled for consistency with aarch64
#- cistem
- cromwell
- fastqc
- flux-sched
- flux-core
- flux-pmix
- gatk
- gromacs
- lammps
- wrf build_type=dm+sm
- mfem
- mpas-model ^parallelio+pnetcdf
- nextflow
- octave
- openfoam
- osu-micro-benchmarks
- parallel
- paraview
- picard
- quantum-espresso
# Build broken for gcc@7.3.1 x86_64_v4 (error: '_mm512_loadu_epi32' was not declared in this scope)
#- raja
# Disabled for consistency with aarch64
#- rsem
# Errors on texlive
#- rstudio
- salmon
- samtools
- seqtk
- snakemake
- star
# Requires gcc@9:
#- ufs-weather-model
# Disabled for consistency with aarch64
#- visit
- lib_specs:
- openmpi fabrics=ofi
- openmpi fabrics=ofi +legacylaunchers
- openmpi fabrics=auto
- mpich
- libfabric
- compiler:
- '%gcc@7.3.1'
- target:
- 'target=x86_64_v3'
- 'target=x86_64_v4'
specs:
- matrix:
- - $cuda_specs
- - $compiler
- - $target
- matrix:
- - $app_specs
- - $compiler
- - $target
- matrix:
- - $lib_specs
- - $compiler
- - $target
- matrix:
- - $compiler_specs
- - $compiler
- - $target
mirrors: { "mirror": "s3://spack-binaries/develop/aws-isc" }
gitlab-ci:
script:
- . "./share/spack/setup-env.sh"
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
mappings:
- match:
- llvm
- llvm-amdgpu
- pango
- paraview
runner-attributes:
tags: [ "spack", "huge", "x86_64_v4" ]
variables:
CI_JOB_SIZE: huge
KUBERNETES_CPU_REQUEST: 11000m
KUBERNETES_MEMORY_REQUEST: 42G
- match:
- ascent
- atk
- axom
- cistem
- ctffind
- cuda
- dyninst
- gcc
- ginkgo
- hdf5
- hpx
- kokkos-kernels
- kokkos-nvcc-wrapper
- magma
- mfem
- mpich
- openturns
- parallelio
- precice
- raja
- relion
- rocblas
- rocsolver
- rust
- slate
- strumpack
- sundials
- trilinos
- umpire
- vtk
- vtk-h
- vtk-m
- warpx
- wrf
- wxwidgets
runner-attributes:
tags: [ "spack", "large", "x86_64_v4" ]
variables:
CI_JOB_SIZE: large
KUBERNETES_CPU_REQUEST: 8000m
KUBERNETES_MEMORY_REQUEST: 12G
- match: ['os=amzn2']
runner-attributes:
tags: ["spack", "x86_64_v4"]
variables:
CI_JOB_SIZE: "default"
broken-specs-url: "s3://spack-binaries/broken-specs"
service-job-attributes:
before_script:
- . "./share/spack/setup-env.sh"
- spack --version
image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
tags: ["spack", "public", "x86_64_v4"]
signing-job-attributes:
image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
tags: ["spack", "aws"]
script:
- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
- /sign.sh
- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
cdash:
build-group: AWS Packages
url: https://cdash.spack.io
project: Spack Testing
site: Cloud Gitlab Infrastructure
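
The `spack -d ci rebuild > >(tee ...) 2> >(tee ... >&2)` line above duplicates the build's stdout and stderr into artifact logs while still streaming them to the job console. A rough Python equivalent of that capture-and-echo pattern (simplified to buffer rather than stream; `run_and_tee` is a hypothetical helper, not part of Spack):

```python
import subprocess
import sys


def run_and_tee(cmd, out_path, err_path):
    """Run cmd, writing its stdout/stderr both to log files and to this
    process's own streams, then return the exit code."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE, text=True)
    out, err = proc.communicate()
    with open(out_path, "w") as f:
        f.write(out)
    with open(err_path, "w") as f:
        f.write(err)
    sys.stdout.write(out)
    sys.stderr.write(err)
    return proc.returncode
```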


@@ -16,10 +16,10 @@
pattern_exemptions = {
# exemptions applied only to package.py files.
r"package.py$": {
# Allow 'from spack import *' in packages, but no other wildcards
# Allow 'from spack.package import *' in packages, but no other wildcards
"F403": [
r"^from spack import \*$",
r"^from spack.pkgkit import \*$",
r"^from spack.package import \*$",
r"^from spack.package_defs import \*$",
],
# Exempt lines with urls and descriptions from overlong line errors.
"E501": [


@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
from spack.package import *
class A(AutotoolsPackage):


@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
from spack.package import *
class ArchiveFiles(AutotoolsPackage):


@@ -4,6 +4,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class AutotoolsConditionalVariantsTest(AutotoolsPackage):
homepage = "https://www.example.com"
has_code = False


@@ -5,7 +5,7 @@
import os
from spack import *
from spack.package import *
class AutotoolsConfigReplacement(AutotoolsPackage):


@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
from spack.package import *
class B(Package):


@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
from spack.package import *
class Boost(Package):

Some files were not shown because too many files have changed in this diff.