Compare commits


88 Commits

Author SHA1 Message Date
Wouter Deconinck
09e1258ed4 qt-* (Qt6 pkgs): new versions 6.5.0, 6.5.1 (#36705)
* qt-base: new version 6.5.0

* qt-declarative: new version 6.5.0

* qt-quick3d: new version 6.5.0

* qt-quicktimeline: new version 6.5.0

* qt-shadertools: new version 6.5.0

* qt-*: new version 6.5.1

* qt-base: new version 6.5.1
2023-07-08 15:36:27 -05:00
Dax Lynch
b2dcd9bd42 PyNVTX: added new package (#38763) 2023-07-07 20:02:21 -05:00
Manuela Kuhn
2dc76248d3 py-pyarrow: disable dataset variant by default (#38775)
* py-pyarrow: enable parquet variant by default

* Disable parquet variant by default

* Add conflict to enable parquet when dataset is active

* Disable dataset variant by default
2023-07-07 16:27:32 -05:00
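
As a rough illustration of how such a variant coupling is typically written in a Spack recipe (a simplified, hypothetical sketch, not necessarily the exact change in #38775), the "dataset requires parquet" rule can be expressed with a ``conflicts()`` directive:

.. code-block:: python

    # Hypothetical, trimmed-down sketch; not the actual py-pyarrow package.py.
    from spack.package import *

    class PyPyarrow(PythonPackage):
        variant("parquet", default=False, description="Build Parquet support")
        variant("dataset", default=False, description="Build Dataset support")

        # Building +dataset without +parquet is not allowed.
        conflicts("~parquet", when="+dataset")
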
Adam J. Stewart
bf6eb832ae py-matplotlib: add v3.7.2 (#38745)
* py-matplotlib: add v3.7.2

* Update dep versions
2023-07-07 14:43:41 -05:00
Chris Richardson
e8e6d69af5 New package: py-nanobind (#38327)
* initial commit of nanobind package

* style fixes

* Update package.py

Typo

* addressed PR comments

* add v1.4.0

* Update var/spack/repos/builtin/packages/py-nanobind/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Matthew Archer <ma595@cam.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-07-07 12:02:45 -04:00
Manuela Kuhn
050d8df5a5 py-astropy: fix import tests and restrict py-pip version (#38731)
* py-astropy: fix import tests and restrict py-pip version

* Fix --install-option name in comments

* Rename variant and fix variant dependencies

* Remove parquet variant from py-pyarrow
2023-07-07 10:19:02 -05:00
Massimiliano Culpo
6958b49c2f Remove "node_compiler" from the list of unknown atoms (#38753) 2023-07-07 13:19:53 +02:00
Harmen Stoppels
a1d33e97ec Fix multiple quadratic complexity issues in environments (#38771)
1. Fix O(n^2) iteration in `_get_overwrite_specs`
2. Early exit `get_by_hash` on full hash
3. Fix O(n^2) double lookup in `all_matching_specs` with hashes
4. Fix some legibility issues
2023-07-07 10:51:58 +00:00
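
The pattern behind items 1 and 3 is replacing repeated list scans with a single hash-keyed lookup. A minimal, hypothetical Python sketch of that pattern (the names below are illustrative and not taken from the Spack code):

.. code-block:: python

    # Illustrative only: turning an O(n*m) membership scan into O(n + m)
    # by building a set of wanted hashes once.
    class Spec:
        def __init__(self, name, dag_hash):
            self.name = name
            self.dag_hash = dag_hash

    def matching_specs_quadratic(all_specs, wanted_hashes):
        # each "in" test scans the whole list: O(n * m)
        return [s for s in all_specs if s.dag_hash in wanted_hashes]

    def matching_specs_linear(all_specs, wanted_hashes):
        # build the set once, then each membership test is O(1)
        wanted = set(wanted_hashes)
        return [s for s in all_specs if s.dag_hash in wanted]

    specs = [Spec(f"pkg{i}", f"hash{i}") for i in range(5)]
    print([s.name for s in matching_specs_linear(specs, ["hash1", "hash3"])])
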
Massimiliano Culpo
ca9b52bbc5 Prevent "spack external find" to error out on wrong permissions (#38755)
fixes #38733
2023-07-07 12:05:32 +02:00
Andrey Parfenov
ae00d7c358 add info about spack env from spack-configs for oneAPI build tools (#38751)
Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>
2023-07-07 03:47:54 +00:00
Mickael PHILIT
1071c1d8e0 add cgns 4.4.0 (#38530) 2023-07-06 17:52:03 -07:00
Carlos Bederián
5f6c832020 freesurfer: add 7.4.1, 7.4.0, 7.3.2 (#38544) 2023-07-06 17:51:01 -07:00
Wouter Deconinck
9e4c4be3f5 mlpack: new package (#38277)
* mlpack: new package

mlpack is an intuitive, fast, and flexible header-only C++ machine learning library with bindings to other languages. It is meant to be a machine learning analog to LAPACK, and aims to implement a wide array of machine learning methods and functions as a "swiss army knife" for machine learning researchers.

* mlpack: upstream merged patch to allow python installation in spack
2023-07-06 13:42:56 -07:00
John W. Parent
6c325a2246 Curl Package: Fixup bugs preventing build on Win (#38757) 2023-07-06 14:59:42 -04:00
Ben Cowan
28b884ace5 Add new version 5.0.0 of PyAMG (#38674)
* Added v5.0.0 of PyAMG.  This required v7.1.0 of setuptools_scm due to a bug in 7.0.5.

* Added comment about version requirement.

* Loosened dependency based on build experiments.

* Updated tomli deps.

* Update var/spack/repos/builtin/packages/py-setuptools-scm/package.py

Dependence for 7.0 only.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pyamg/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Swapped lines.

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-07-06 13:54:29 -04:00
Harmen Stoppels
a13687f022 Disable fortran in openblas for darwin ci, fix variant default value (#38752) 2023-07-06 17:47:20 +02:00
Thomas Bouvier
bd81676e3c py-torch: fix build (#38730) 2023-07-06 09:35:11 -05:00
Adam J. Stewart
7eaa99eabc py-scikit-learn: add v1.3.0 (#38660) 2023-07-06 08:55:00 -05:00
kjrstory
5cb0b57b30 openfoam.org: add a maintainer(#37280) (#37697)
Co-authored-by: Dom Heinzeller <dom.heinzeller@icloud.com>
2023-07-06 13:44:24 +02:00
Jen Herting
c6867649b9 [py-blis] added versions 0.7.9 and 0.9.1 (#38269)
* [py-blis] added version 0.7.9

* [py-blis] added version 0.9.1

* [py-blis]

- removed type run for dependency py-cython
2023-07-06 05:09:11 -04:00
Jim Phillips
c129603192 namd: add maintainer (#38740) 2023-07-06 04:28:54 -04:00
Manuela Kuhn
4e456992e4 py-abipy: add 0.9.3 (#38716)
* py-abipy: add 0.9.3

* Remove py-cython dependency

* Remove dep version restrictions for new release
2023-07-06 04:23:55 -04:00
Jen Herting
10876736e0 [py-cymem] added version 2.0.7 (#38267)
* [py-cymem] added version 2.0.3

* [py-cymem] added restriction to py-wheel limitation
2023-07-06 04:13:44 -04:00
Thomas Bouvier
982cdd7988 py-dm-tree: add v0.1.8 (#38606)
* `py-dm-tree`: add v0.1.8

* Update dependencies

* Fix hash
2023-07-06 00:12:04 -05:00
Thomas Bouvier
095e48f399 py-horovod: update to v0.28.1 (#38732) 2023-07-05 23:56:47 -05:00
Manuela Kuhn
833db65fa3 py-pip: add 23.1.2 (#38608)
* py-pip: add 23.1.2

* Restrict py-pip version for py-protobuf

* Restrict py-pip version for straightforward packages

* Restrict py-pip version for nrm

* Fix --install-option name in comments

* Simplify py-pip restriction for py-scs

* nrm: fix wrong comment
2023-07-05 23:37:16 -04:00
Harmen Stoppels
06268f7b72 perl: add 5.38.0, 5.36.1; prefer all even minor versions over development versions (#38690)
* perl: add 5.38.0, 5.36.1; prefer all even minor versions over development versions

* fix libxcrypt build with new perl

* fix libxcrypt with a patch
2023-07-05 18:08:54 -04:00
Manuela Kuhn
f884e71a03 py-spglib: add 2.0.2 (#38715)
* py-spglib: add 2.0.2

* Update var/spack/repos/builtin/packages/py-spglib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Remove py-setuptools as run dependency

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-07-05 15:24:31 -05:00
Wouter Deconinck
ea1439dfa1 acts: new variant cxxstd (#38682)
* acts: allow ^root cxxstd=20

* acts: new variant cxxstd, pass through to root

* acts: always args.append CMAKE_CXX_STANDARD from variant

* acts: remove unused import

* acts: fix self.define_from_variant
2023-07-05 12:56:06 -07:00
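
A simplified sketch of what passing such a variant through to CMake typically looks like in a Spack recipe (illustrative only, not the actual acts package.py):

.. code-block:: python

    # Hypothetical, trimmed-down recipe: declare a cxxstd variant and
    # forward it to CMake via define_from_variant.
    from spack.package import *

    class Acts(CMakePackage):
        variant(
            "cxxstd",
            default="17",
            values=("17", "20"),
            multi=False,
            description="C++ standard used for the build",
        )

        def cmake_args(self):
            # always append CMAKE_CXX_STANDARD based on the variant value
            return [self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd")]
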
Adam J. Stewart
45838cee0b Drop Python 2 super syntax (#38718) 2023-07-05 09:04:29 -05:00
Adam J. Stewart
95847a0b37 Drop Python 2 object subclassing (#38720) 2023-07-05 14:37:44 +02:00
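
Both cleanups follow the same mechanical pattern, visible throughout the file diffs further down; an illustrative before/after with invented class names:

.. code-block:: python

    # Python 2-compatible style being removed:
    class LegacyEvent(object):
        def __init__(self, text):
            super(LegacyEvent, self).__init__()
            self.text = text

    # Equivalent Python 3-only style used after these commits:
    class Event:
        def __init__(self, text):
            super().__init__()
            self.text = text
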
George Young
8861fe0294 salmon: patching to build with %gcc@13: (#38553)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-07-05 14:36:55 +02:00
Harmen Stoppels
12427c6974 plumed: deprecate non-buildable versions, patch Python makefile only when supported (#38713) 2023-07-05 05:48:13 -04:00
Taillefumier Mathieu
a412403d7b Update cosma and a few related recipes (#35615)
* Add maintainers 

* Updated cosma archive checksum and costa version

- updated cosma version (in the cosma build system)
- updated costa version
- use the default generic url for downloading packages
- do not build tiled-mm when the cpu only version is needed


Signed-off-by: Dr. Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-07-05 11:02:50 +02:00
Martin Diehl
6a258c148f damask: fix build of alpha release (#38457) 2023-07-05 10:59:05 +02:00
Tom Epperly
8b13440038 Add version 1.8.23 that fixes a compilation bug on clang-15. (#38541) 2023-07-05 10:56:18 +02:00
Tamara Dahlgren
632f840d8a tk: convert to new stand-alone test process (#38575) 2023-07-05 10:55:07 +02:00
Alex Richert
8372726a88 ip: add v4.1.0, and additional variants (#38526) 2023-07-05 10:45:38 +02:00
Tamara Dahlgren
5dc84b64e9 tests/qthreads: convert to new stand-alone test process (#38600) 2023-07-05 10:41:25 +02:00
Tamara Dahlgren
8d72b8dd63 tests/papyrus: convert to new stand-alone test process (#38627) 2023-07-05 10:34:15 +02:00
Jim Phillips
adde84e663 Build NAMD with Tcl by default (#38645)
NAMD users expect the Tcl scripting interface to be enabled as it is used in many examples and tutorials in addition to being required for features such as multi-copy algorithms.
2023-07-05 10:29:11 +02:00
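
In Spack terms this is just flipping a variant default; a hypothetical sketch (not the actual namd package.py):

.. code-block:: python

    # Hypothetical excerpt: enable the Tcl scripting interface by default.
    from spack.package import *

    class Namd(Package):
        variant(
            "tcl",
            default=True,
            description="Build with the Tcl scripting interface",
        )
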
Rémi Lacroix
f863066b7e git-annex: add latest version 10.20230408 (#38728) 2023-07-05 04:27:46 -04:00
Juan Miguel Carceller
082afe04b8 xrootd: add _STAT_VER patch (#38547)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-07-05 10:14:18 +02:00
Peter Scheibel
f365386447 Installations: don't set group permissions when they match what is desired (#38036)
* When installing a package Spack will attempt to set group permissions on
the install prefix even when the configuration does not specify a group.

Co-authored-by: David Gomez <dvdgomez@users.noreply.github.com>
2023-07-05 09:54:04 +02:00
Carlos Bederián
a90200528f hcoll: ucx version requirements (#38665) 2023-07-05 09:48:58 +02:00
Adam J. Stewart
1ce6feef94 py-pyqt4: stricter dependency versions (#38673) 2023-07-05 09:45:24 +02:00
Weiqun Zhang
84010108b1 amrex: add v23.07 (#38676) 2023-07-05 09:43:37 +02:00
Adam J. Stewart
24d2005920 py-lightly: add v1.4.11 (#38717) 2023-07-05 09:41:08 +02:00
Robert Cohn
fa73b14247 intel-oneapi-mkl: support for cray mpich (#38725) 2023-07-05 09:20:19 +02:00
Manuela Kuhn
a99b7886e9 py-pymatgen: add 2022.9.8 (#38714) 2023-07-05 00:12:18 -05:00
Adam J. Stewart
2978911520 spack commands: add type hints and docstrings (#38705) 2023-07-04 16:43:02 -04:00
Harmen Stoppels
d35149d174 remove another Python 3.7 requirement & preference, since it was deprecated (#38710) 2023-07-04 20:33:47 +02:00
Manuela Kuhn
a3d11a7973 py-requests: add 2.31.0 (#38563) 2023-07-04 10:33:06 -04:00
Max Zeyen
cb69dbd804 gpi-space: add new versions (#38709)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-07-04 15:01:39 +02:00
Laura Bellentani
e6f50c5341 quantum-espresso: change in maintainers (#38688) 2023-07-04 14:06:21 +02:00
Chris White
32d0b5ca97 conduit: remove no longer needed blueos logic (#38698) 2023-07-04 07:52:59 -04:00
valmar
b537fad37a Added Python Prometheus client as dependency (#38700) 2023-07-04 07:12:47 -04:00
Thomas Bouvier
78e78eb1da nvtx: add new package (#38430)
Co-authored-by: thomas-bouvier <thomas-bouvier@users.noreply.github.com>
2023-07-04 07:07:41 -04:00
Stephen Sachs
8aeecafd1a wrf: add ARM compiler support (#38695) 2023-07-04 12:33:58 +02:00
Stephen Sachs
a0b2ca2dde Temporarily disable aws-pcluster pipelines (#38708) 2023-07-04 11:27:01 +02:00
Harmen Stoppels
08f23f4802 macos sip: apply on macos only, dont store LD_LIBRARY_PATH (#38687) 2023-07-04 10:54:13 +02:00
Adam J. Stewart
e9dc6dc96c Fix DYLD_* propagation to Python process from fish shell (#38615) 2023-07-04 10:48:56 +02:00
Adam J. Stewart
5588b93683 util-linux: add v2.39.1, fix macOS build (#38677) 2023-07-04 10:47:26 +02:00
George Young
70a38ea1c5 plink2: add new package @2.00a4.3 (#38469)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-07-04 04:09:45 -04:00
Adam J. Stewart
5bd7a0c563 spack make-installer: deterministic choice order (#38706) 2023-07-04 09:39:38 +02:00
Adam J. Stewart
33c5959e23 Remove from __future__ imports (#38703) 2023-07-04 08:30:29 +02:00
Manuela Kuhn
65288566e5 py-numpydoc: add 1.5.0 (#38701) 2023-07-03 22:13:54 -04:00
Manuela Kuhn
5e1c4693fd py-ruff: add 0.0.276 (#38702) 2023-07-03 21:59:14 -04:00
Aditya Bhamidipati
feb4681878 Add NCCL v2.18.3-1 release to recipe (#38647) 2023-07-03 16:26:07 -05:00
Thomas Bouvier
994b5ad49e ffmpeg: patch build failure (#38656)
* `ffmpeg`: patch build failure

* [@spackbot] updating style on behalf of thomas-bouvier

---------

Co-authored-by: thomas-bouvier <thomas-bouvier@users.noreply.github.com>
2023-07-03 11:12:26 -07:00
Harmen Stoppels
4654db54c7 hdf5-vol-log: depends on mpi (#38693)
From the configure.ac file:

> H5VL_log is built on top of MPI. Configure option --without-mpi or
> --with-mpi=no should not be used. Abort.

This currently fails to build in the oneAPI pipeline on `develop`
2023-07-03 20:08:40 +02:00
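
In a Spack recipe, that requirement is expressed as an unconditional dependency on the ``mpi`` virtual package; a minimal hypothetical sketch (not the actual hdf5-vol-log package.py):

.. code-block:: python

    # Hypothetical excerpt: H5VL_log is built on top of MPI, so depend on
    # the mpi virtual package unconditionally.
    from spack.package import *

    class Hdf5VolLog(AutotoolsPackage):
        depends_on("mpi")
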
Adam J. Stewart
a6ebff3a2e py-pillow: add v10.0.0 (#38670) 2023-07-03 11:02:24 -07:00
Ashwin Kumar Karnad
6e8fb30b83 Add hash for octopusV13 (#38655) 2023-07-03 10:24:52 -07:00
Cyrus Harrison
465f83b484 add ascent 0.9.2 release (#38661) 2023-07-03 10:14:01 -07:00
Harmen Stoppels
ba7ae2c153 Drop requirement of python@3.7 since it's deprecated (#38692) 2023-07-03 16:15:32 +02:00
Jonathon Anderson
54adab7eac python: require xz libs=shared when +lzma (#38593) 2023-07-03 07:11:30 -05:00
Andrey Parfenov
30cb55e97c add support for oneapi compiler to wrf 4.4 (#38607)
Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>
2023-07-03 07:34:06 -04:00
Harmen Stoppels
3594203f64 guile: fix %oneapi fast math madness (#38691) 2023-07-03 13:25:55 +02:00
Adam J. Stewart
f0add2428c dbus: AutotoolsPackage, optional documentation build (#38679)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-07-03 04:18:00 -04:00
Adam J. Stewart
af449b7943 qt-base: disable accessibility by default (#38680) 2023-07-03 09:45:22 +02:00
Adam J. Stewart
b6d591e39b bash: adam now uses fish (#38684) 2023-07-03 02:44:44 -05:00
Adam J. Stewart
3811dec18d Deprecate conda for Python 2 (#38681) 2023-07-03 09:44:20 +02:00
Adam J. Stewart
ae2efa1c27 xmlto: fix missing dependency on util-linux (#38678) 2023-07-03 09:43:18 +02:00
John Biddiscombe
acdcc8ed71 Add support to Paraview for TBB (#38582)
Co-authored-by: Jean Favre <jfavre@cscs.ch>
2023-07-03 09:33:56 +02:00
Jack Morrison
11bc27d984 Add libfabric 1.18.1 release (#38669) 2023-07-02 13:39:20 -07:00
Adam J. Stewart
4d5ff045e3 Deprecate Python 3.7 (#38619)
* Deprecated Python 3.7

* Add Python 3.7.17 because why not
2023-07-02 11:29:35 -05:00
Adam J. Stewart
63576275be SIPPackage: documentation fix (#38672) 2023-07-02 11:30:08 +02:00
Harmen Stoppels
cc74729115 Revert "openblas: do not build tests when installing (#38591)" (#38662)
This reverts commit 51c75c6da3.
2023-07-01 22:02:39 -04:00
352 changed files with 2604 additions and 1345 deletions

View File

@@ -25,8 +25,6 @@ exit 1
# Line above is a shell no-op, and ends a python multi-line comment.
# The code above runs this file with our preferred python interpreter.
from __future__ import print_function
import os
import os.path
import sys

View File

@@ -76,6 +76,53 @@ To build with ``icx``, do ::
spack install patchelf%oneapi
Using oneAPI Spack environment
-------------------------------
In this example, we build lammps with ``icx`` using a Spack environment for oneAPI packages created by Intel. The
compilers are installed with Spack as in the example above.
Install the oneAPI compilers::
spack install intel-oneapi-compilers
Add the compilers to your ``compilers.yaml`` so Spack can use them::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
Verify that the compilers are available::
spack compiler list
Clone `spack-configs <https://github.com/spack/spack-configs>`_ repo and activate Intel oneAPI CPU environment::
git clone https://github.com/spack/spack-configs
spack env activate spack-configs/INTEL/CPU
spack concretize -f
`Intel oneAPI CPU environment <https://github.com/spack/spack-configs/blob/main/INTEL/CPU/spack.yaml>`_ contains applications tested and validated by Intel; this list is constantly extended. It currently supports:
- `GROMACS <https://www.gromacs.org/>`_
- `HPCG <https://www.hpcg-benchmark.org/>`_
- `HPL <https://netlib.org/benchmark/hpl/>`_
- `LAMMPS <https://www.lammps.org/#gsc.tab=0>`_
- `OpenFOAM <https://www.openfoam.com/>`_
- `STREAM <https://www.cs.virginia.edu/stream/>`_
- `WRF <https://github.com/wrf-model/WRF>`_
To build lammps with the oneAPI compiler from this environment, just run::
spack install lammps
Compiled binaries can be found using::
spack cd -i lammps
You can do the same for all other applications from this environment.
Using oneAPI MPI to Satisfy a Virtual Dependence
------------------------------------------------------

View File

@@ -72,7 +72,7 @@ arguments to the configure phase, you can use:
.. code-block:: python
def configure_args(self, spec, prefix):
def configure_args(self):
return ['--no-python-dbus']

View File

@@ -97,9 +97,7 @@ class PatchedPythonDomain(PythonDomain):
def resolve_xref(self, env, fromdocname, builder, typ, target, node, contnode):
if "refspecific" in node:
del node["refspecific"]
return super(PatchedPythonDomain, self).resolve_xref(
env, fromdocname, builder, typ, target, node, contnode
)
return super().resolve_xref(env, fromdocname, builder, typ, target, node, contnode)
#

View File

@@ -121,7 +121,7 @@ Since v0.19, Spack supports two ways of writing a package recipe. The most comm
def url_for_version(self, version):
if version >= Version("2.1.1"):
return super(Openjpeg, self).url_for_version(version)
return super().url_for_version(version)
url_fmt = "https://github.com/uclouvain/openjpeg/archive/version.{0}.tar.gz"
return url_fmt.format(version)
@@ -155,7 +155,7 @@ builder class explicitly. Using the same example as above, this reads:
def url_for_version(self, version):
if version >= Version("2.1.1"):
return super(Openjpeg, self).url_for_version(version)
return super().url_for_version(version)
url_fmt = "https://github.com/uclouvain/openjpeg/archive/version.{0}.tar.gz"
return url_fmt.format(version)

View File

@@ -65,9 +65,6 @@
up to date with CTest, just make sure the ``*_matches`` and
``*_exceptions`` lists are kept up to date with CTest's build handler.
"""
from __future__ import print_function
from __future__ import division
import re
import math
import multiprocessing
@@ -211,7 +208,7 @@
]
class LogEvent(object):
class LogEvent:
"""Class representing interesting events (e.g., errors) in a build log."""
def __init__(self, text, line_no,
source_file=None, source_line_no=None,
@@ -348,7 +345,7 @@ def _parse_unpack(args):
return _parse(*args)
class CTestLogParser(object):
class CTestLogParser:
"""Log file parser that extracts errors and warnings."""
def __init__(self, profile=False):
# whether to record timing information

View File

@@ -3,33 +3,42 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import abc
import argparse
import errno
import io
import re
import sys
from argparse import ArgumentParser
from typing import IO, Optional, Sequence, Tuple
class Command(object):
class Command:
"""Parsed representation of a command from argparse.
This is a single command from an argparse parser. ``ArgparseWriter``
creates these and returns them from ``parse()``, and it passes one of
these to each call to ``format()`` so that we can take an action for
a single command.
Parts of a Command:
- prog: command name (str)
- description: command description (str)
- usage: command usage (str)
- positionals: list of positional arguments (list)
- optionals: list of optional arguments (list)
- subcommands: list of subcommand parsers (list)
This is a single command from an argparse parser. ``ArgparseWriter`` creates these and returns
them from ``parse()``, and it passes one of these to each call to ``format()`` so that we can
take an action for a single command.
"""
def __init__(self, prog, description, usage, positionals, optionals, subcommands):
def __init__(
self,
prog: str,
description: Optional[str],
usage: str,
positionals: Sequence[Tuple[str, str]],
optionals: Sequence[Tuple[Sequence[str], str, str]],
subcommands: Sequence[Tuple[ArgumentParser, str]],
) -> None:
"""Initialize a new Command instance.
Args:
prog: Program name.
description: Command description.
usage: Command usage.
positionals: List of positional arguments.
optionals: List of optional arguments.
subcommands: List of subcommand parsers.
"""
self.prog = prog
self.description = description
self.usage = usage
@@ -38,35 +47,34 @@ def __init__(self, prog, description, usage, positionals, optionals, subcommands
self.subcommands = subcommands
# NOTE: The only reason we subclass argparse.HelpFormatter is to get access
# to self._expand_help(), ArgparseWriter is not intended to be used as a
# formatter_class.
class ArgparseWriter(argparse.HelpFormatter):
"""Analyzes an argparse ArgumentParser for easy generation of help."""
# NOTE: The only reason we subclass argparse.HelpFormatter is to get access to self._expand_help(),
# ArgparseWriter is not intended to be used as a formatter_class.
class ArgparseWriter(argparse.HelpFormatter, abc.ABC):
"""Analyze an argparse ArgumentParser for easy generation of help."""
def __init__(self, prog, out=None, aliases=False):
"""Initializes a new ArgparseWriter instance.
def __init__(self, prog: str, out: IO = sys.stdout, aliases: bool = False) -> None:
"""Initialize a new ArgparseWriter instance.
Parameters:
prog (str): the program name
out (file object): the file to write to (default sys.stdout)
aliases (bool): whether or not to include subparsers for aliases
Args:
prog: Program name.
out: File object to write to.
aliases: Whether or not to include subparsers for aliases.
"""
super(ArgparseWriter, self).__init__(prog)
super().__init__(prog)
self.level = 0
self.prog = prog
self.out = sys.stdout if out is None else out
self.out = out
self.aliases = aliases
def parse(self, parser, prog):
"""Parses the parser object and returns the relavent components.
def parse(self, parser: ArgumentParser, prog: str) -> Command:
"""Parse the parser object and return the relavent components.
Parameters:
parser (argparse.ArgumentParser): the parser
prog (str): the command name
Args:
parser: Command parser.
prog: Program name.
Returns:
(Command) information about the command from the parser
Information about the command from the parser.
"""
self.parser = parser
@@ -80,8 +88,7 @@ def parse(self, parser, prog):
groups = parser._mutually_exclusive_groups
usage = fmt._format_usage(None, actions, groups, "").strip()
# Go through actions and split them into optionals, positionals,
# and subcommands
# Go through actions and split them into optionals, positionals, and subcommands
optionals = []
positionals = []
subcommands = []
@@ -98,7 +105,7 @@ def parse(self, parser, prog):
subcommands.append((subparser, subaction.dest))
# Look for aliases of the form 'name (alias, ...)'
if self.aliases:
if self.aliases and isinstance(subaction.metavar, str):
match = re.match(r"(.*) \((.*)\)", subaction.metavar)
if match:
aliases = match.group(2).split(", ")
@@ -113,28 +120,26 @@ def parse(self, parser, prog):
return Command(prog, description, usage, positionals, optionals, subcommands)
def format(self, cmd):
"""Returns the string representation of a single node in the
parser tree.
@abc.abstractmethod
def format(self, cmd: Command) -> str:
"""Return the string representation of a single node in the parser tree.
Override this in subclasses to define how each subcommand
should be displayed.
Override this in subclasses to define how each subcommand should be displayed.
Parameters:
(Command): parsed information about a command or subcommand
Args:
cmd: Parsed information about a command or subcommand.
Returns:
str: the string representation of this subcommand
String representation of this subcommand.
"""
raise NotImplementedError
def _write(self, parser, prog, level=0):
"""Recursively writes a parser.
def _write(self, parser: ArgumentParser, prog: str, level: int = 0) -> None:
"""Recursively write a parser.
Parameters:
parser (argparse.ArgumentParser): the parser
prog (str): the command name
level (int): the current level
Args:
parser: Command parser.
prog: Program name.
level: Current level.
"""
self.level = level
@@ -144,19 +149,17 @@ def _write(self, parser, prog, level=0):
for subparser, prog in cmd.subcommands:
self._write(subparser, prog, level=level + 1)
def write(self, parser):
def write(self, parser: ArgumentParser) -> None:
"""Write out details about an ArgumentParser.
Args:
parser (argparse.ArgumentParser): the parser
parser: Command parser.
"""
try:
self._write(parser, self.prog)
except IOError as e:
except BrokenPipeError:
# Swallow pipe errors
# Raises IOError in Python 2 and BrokenPipeError in Python 3
if e.errno != errno.EPIPE:
raise
pass
_rst_levels = ["=", "-", "^", "~", ":", "`"]
@@ -165,21 +168,33 @@ def write(self, parser):
class ArgparseRstWriter(ArgparseWriter):
"""Write argparse output as rst sections."""
def __init__(self, prog, out=None, aliases=False, rst_levels=_rst_levels):
"""Create a new ArgparseRstWriter.
def __init__(
self,
prog: str,
out: IO = sys.stdout,
aliases: bool = False,
rst_levels: Sequence[str] = _rst_levels,
) -> None:
"""Initialize a new ArgparseRstWriter instance.
Parameters:
prog (str): program name
out (file object): file to write to
aliases (bool): whether or not to include subparsers for aliases
rst_levels (list of str): list of characters
for rst section headings
Args:
prog: Program name.
out: File object to write to.
aliases: Whether or not to include subparsers for aliases.
rst_levels: List of characters for rst section headings.
"""
out = sys.stdout if out is None else out
super(ArgparseRstWriter, self).__init__(prog, out, aliases)
super().__init__(prog, out, aliases)
self.rst_levels = rst_levels
def format(self, cmd):
def format(self, cmd: Command) -> str:
"""Return the string representation of a single node in the parser tree.
Args:
cmd: Parsed information about a command or subcommand.
Returns:
String representation of a node.
"""
string = io.StringIO()
string.write(self.begin_command(cmd.prog))
@@ -205,7 +220,15 @@ def format(self, cmd):
return string.getvalue()
def begin_command(self, prog):
def begin_command(self, prog: str) -> str:
"""Text to print before a command.
Args:
prog: Program name.
Returns:
Text before a command.
"""
return """
----
@@ -218,10 +241,26 @@ def begin_command(self, prog):
prog.replace(" ", "-"), prog, self.rst_levels[self.level] * len(prog)
)
def description(self, description):
def description(self, description: str) -> str:
"""Description of a command.
Args:
description: Command description.
Returns:
Description of a command.
"""
return description + "\n\n"
def usage(self, usage):
def usage(self, usage: str) -> str:
"""Example usage of a command.
Args:
usage: Command usage.
Returns:
Usage of a command.
"""
return """\
.. code-block:: console
@@ -231,10 +270,24 @@ def usage(self, usage):
usage
)
def begin_positionals(self):
def begin_positionals(self) -> str:
"""Text to print before positional arguments.
Returns:
Positional arguments header.
"""
return "\n**Positional arguments**\n\n"
def positional(self, name, help):
def positional(self, name: str, help: str) -> str:
"""Description of a positional argument.
Args:
name: Argument name.
help: Help text.
Returns:
Positional argument description.
"""
return """\
{0}
{1}
@@ -243,13 +296,32 @@ def positional(self, name, help):
name, help
)
def end_positionals(self):
def end_positionals(self) -> str:
"""Text to print after positional arguments.
Returns:
Positional arguments footer.
"""
return ""
def begin_optionals(self):
def begin_optionals(self) -> str:
"""Text to print before optional arguments.
Returns:
Optional arguments header.
"""
return "\n**Optional arguments**\n\n"
def optional(self, opts, help):
def optional(self, opts: str, help: str) -> str:
"""Description of an optional argument.
Args:
opts: Optional argument.
help: Help text.
Returns:
Optional argument description.
"""
return """\
``{0}``
{1}
@@ -258,10 +330,23 @@ def optional(self, opts, help):
opts, help
)
def end_optionals(self):
def end_optionals(self) -> str:
"""Text to print after optional arguments.
Returns:
Optional arguments footer.
"""
return ""
def begin_subcommands(self, subcommands):
def begin_subcommands(self, subcommands: Sequence[Tuple[ArgumentParser, str]]) -> str:
"""Table with links to other subcommands.
Arguments:
subcommands: List of subcommands.
Returns:
Subcommand linking text.
"""
string = """
**Subcommands**
@@ -280,29 +365,25 @@ def begin_subcommands(self, subcommands):
class ArgparseCompletionWriter(ArgparseWriter):
"""Write argparse output as shell programmable tab completion functions."""
def format(self, cmd):
"""Returns the string representation of a single node in the
parser tree.
def format(self, cmd: Command) -> str:
"""Return the string representation of a single node in the parser tree.
Override this in subclasses to define how each subcommand
should be displayed.
Parameters:
(Command): parsed information about a command or subcommand
Args:
cmd: Parsed information about a command or subcommand.
Returns:
str: the string representation of this subcommand
String representation of this subcommand.
"""
assert cmd.optionals # we should always at least have -h, --help
assert not (cmd.positionals and cmd.subcommands) # one or the other
# We only care about the arguments/flags, not the help messages
positionals = []
positionals: Tuple[str, ...] = ()
if cmd.positionals:
positionals, _ = zip(*cmd.positionals)
optionals, _, _ = zip(*cmd.optionals)
subcommands = []
subcommands: Tuple[str, ...] = ()
if cmd.subcommands:
_, subcommands = zip(*cmd.subcommands)
@@ -315,71 +396,73 @@ def format(self, cmd):
+ self.end_function(cmd.prog)
)
def start_function(self, prog):
"""Returns the syntax needed to begin a function definition.
def start_function(self, prog: str) -> str:
"""Return the syntax needed to begin a function definition.
Parameters:
prog (str): the command name
Args:
prog: Program name.
Returns:
str: the function definition beginning
Function definition beginning.
"""
name = prog.replace("-", "_").replace(" ", "_")
return "\n_{0}() {{".format(name)
def end_function(self, prog=None):
"""Returns the syntax needed to end a function definition.
def end_function(self, prog: str) -> str:
"""Return the syntax needed to end a function definition.
Parameters:
prog (str or None): the command name
Args:
prog: Program name
Returns:
str: the function definition ending
Function definition ending.
"""
return "}\n"
def body(self, positionals, optionals, subcommands):
"""Returns the body of the function.
def body(
self, positionals: Sequence[str], optionals: Sequence[str], subcommands: Sequence[str]
) -> str:
"""Return the body of the function.
Parameters:
positionals (list): list of positional arguments
optionals (list): list of optional arguments
subcommands (list): list of subcommand parsers
Args:
positionals: List of positional arguments.
optionals: List of optional arguments.
subcommands: List of subcommand parsers.
Returns:
str: the function body
Function body.
"""
return ""
def positionals(self, positionals):
"""Returns the syntax for reporting positional arguments.
def positionals(self, positionals: Sequence[str]) -> str:
"""Return the syntax for reporting positional arguments.
Parameters:
positionals (list): list of positional arguments
Args:
positionals: List of positional arguments.
Returns:
str: the syntax for positional arguments
Syntax for positional arguments.
"""
return ""
def optionals(self, optionals):
"""Returns the syntax for reporting optional flags.
def optionals(self, optionals: Sequence[str]) -> str:
"""Return the syntax for reporting optional flags.
Parameters:
optionals (list): list of optional arguments
Args:
optionals: List of optional arguments.
Returns:
str: the syntax for optional flags
Syntax for optional flags.
"""
return ""
def subcommands(self, subcommands):
"""Returns the syntax for reporting subcommands.
def subcommands(self, subcommands: Sequence[str]) -> str:
"""Return the syntax for reporting subcommands.
Parameters:
subcommands (list): list of subcommand parsers
Args:
subcommands: List of subcommand parsers.
Returns:
str: the syntax for subcommand parsers
Syntax for subcommand parsers
"""
return ""

View File

@@ -402,7 +402,7 @@ def groupid_to_group(x):
os.remove(backup_filename)
class FileFilter(object):
class FileFilter:
"""Convenience class for calling ``filter_file`` a lot."""
def __init__(self, *filenames):
@@ -610,6 +610,8 @@ def chgrp(path, group, follow_symlinks=True):
gid = grp.getgrnam(group).gr_gid
else:
gid = group
if os.stat(path).st_gid == gid:
return
if follow_symlinks:
os.chown(path, -1, gid)
else:
@@ -1336,7 +1338,7 @@ def lexists_islink_isdir(path):
return True, is_link, is_dir
class BaseDirectoryVisitor(object):
class BaseDirectoryVisitor:
"""Base class and interface for :py:func:`visit_directory_tree`."""
def visit_file(self, root, rel_path, depth):
@@ -1890,7 +1892,7 @@ class HeaderList(FileList):
include_regex = re.compile(r"(.*?)(\binclude\b)(.*)")
def __init__(self, files):
super(HeaderList, self).__init__(files)
super().__init__(files)
self._macro_definitions = []
self._directories = None
@@ -1916,7 +1918,7 @@ def _default_directories(self):
"""Default computation of directories based on the list of
header files.
"""
dir_list = super(HeaderList, self).directories
dir_list = super().directories
values = []
for d in dir_list:
# If the path contains a subdirectory named 'include' then stop
@@ -2352,7 +2354,7 @@ def find_all_libraries(root, recursive=False):
)
class WindowsSimulatedRPath(object):
class WindowsSimulatedRPath:
"""Class representing Windows filesystem rpath analog
One instance of this class is associated with a package (only on Windows)

View File

@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import division
import collections.abc
import contextlib
import functools
@@ -768,10 +766,10 @@ def pretty_seconds(seconds):
class RequiredAttributeError(ValueError):
def __init__(self, message):
super(RequiredAttributeError, self).__init__(message)
super().__init__(message)
class ObjectWrapper(object):
class ObjectWrapper:
"""Base class that wraps an object. Derived classes can add new behavior
while staying undercover.
@@ -798,7 +796,7 @@ def __init__(self, wrapped_object):
self.__dict__ = wrapped_object.__dict__
class Singleton(object):
class Singleton:
"""Simple wrapper for lazily initialized singleton objects."""
def __init__(self, factory):
@@ -845,7 +843,7 @@ def __repr__(self):
return repr(self.instance)
class LazyReference(object):
class LazyReference:
"""Lazily evaluated reference to part of a singleton."""
def __init__(self, ref_function):
@@ -943,7 +941,7 @@ def _wrapper(args):
return _wrapper
class Devnull(object):
class Devnull:
"""Null stream with less overhead than ``os.devnull``.
See https://stackoverflow.com/a/2929954.
@@ -1060,7 +1058,7 @@ def __str__(self):
return str(self.data)
class GroupedExceptionHandler(object):
class GroupedExceptionHandler:
"""A generic mechanism to coalesce multiple exceptions and preserve tracebacks."""
def __init__(self):
@@ -1091,7 +1089,7 @@ def grouped_message(self, with_tracebacks: bool = True) -> str:
return "due to the following failures:\n{0}".format("\n".join(each_exception_message))
class GroupedExceptionForwarder(object):
class GroupedExceptionForwarder:
"""A contextmanager to capture exceptions and forward them to a
GroupedExceptionHandler."""
@@ -1111,7 +1109,7 @@ def __exit__(self, exc_type, exc_value, tb):
return True
class classproperty(object):
class classproperty:
"""Non-data descriptor to evaluate a class-level property. The function that performs
the evaluation is injected at creation time and take an instance (could be None) and
an owner (i.e. the class that originated the instance)

View File

@@ -5,8 +5,6 @@
"""LinkTree class for setting up trees of symbolic links."""
from __future__ import print_function
import filecmp
import os
import shutil
@@ -287,7 +285,7 @@ def visit_symlinked_file(self, root, rel_path, depth):
self.visit_file(root, rel_path, depth)
class LinkTree(object):
class LinkTree:
"""Class to create trees of symbolic links from a source directory.
LinkTree objects are constructed with a source root. Their
@@ -432,12 +430,12 @@ class MergeConflictError(Exception):
class ConflictingSpecsError(MergeConflictError):
def __init__(self, spec_1, spec_2):
super(MergeConflictError, self).__init__(spec_1, spec_2)
super().__init__(spec_1, spec_2)
class SingleMergeConflictError(MergeConflictError):
def __init__(self, path):
super(MergeConflictError, self).__init__("Package merge blocked by file: %s" % path)
super().__init__("Package merge blocked by file: %s" % path)
class MergeConflictSummary(MergeConflictError):
@@ -452,4 +450,4 @@ def __init__(self, conflicts):
msg += "\n `{0}` and `{1}` both project to `{2}`".format(
conflict.src_a, conflict.src_b, conflict.dst
)
super(MergeConflictSummary, self).__init__(msg)
super().__init__(msg)

View File

@@ -39,7 +39,7 @@
true_fn = lambda: True
class OpenFile(object):
class OpenFile:
"""Record for keeping track of open lockfiles (with reference counting).
There's really only one ``OpenFile`` per inode, per process, but we record the
@@ -53,7 +53,7 @@ def __init__(self, fh):
self.refs = 0
class OpenFileTracker(object):
class OpenFileTracker:
"""Track open lockfiles, to minimize number of open file descriptors.
The ``fcntl`` locks that Spack uses are associated with an inode and a process.
@@ -169,7 +169,7 @@ def _attempts_str(wait_time, nattempts):
return " after {} and {}".format(pretty_seconds(wait_time), attempts)
class LockType(object):
class LockType:
READ = 0
WRITE = 1
@@ -192,7 +192,7 @@ def is_valid(op):
return op == LockType.READ or op == LockType.WRITE
class Lock(object):
class Lock:
"""This is an implementation of a filesystem lock using Python's lockf.
In Python, ``lockf`` actually calls ``fcntl``, so this should work with
@@ -681,7 +681,7 @@ def _status_msg(self, locktype, status):
)
class LockTransaction(object):
class LockTransaction:
"""Simple nested transaction context manager that uses a file lock.
Arguments:
@@ -770,7 +770,7 @@ class LockDowngradeError(LockError):
def __init__(self, path):
msg = "Cannot downgrade lock from write to read on file: %s" % path
super(LockDowngradeError, self).__init__(msg)
super().__init__(msg)
class LockLimitError(LockError):
@@ -782,7 +782,7 @@ class LockTimeoutError(LockError):
def __init__(self, lock_type, path, time, attempts):
fmt = "Timed out waiting for a {} lock after {}.\n Made {} {} on file: {}"
super(LockTimeoutError, self).__init__(
super().__init__(
fmt.format(
lock_type,
pretty_seconds(time),
@@ -798,7 +798,7 @@ class LockUpgradeError(LockError):
def __init__(self, path):
msg = "Cannot upgrade lock from read to write on file: %s" % path
super(LockUpgradeError, self).__init__(msg)
super().__init__(msg)
class LockPermissionError(LockError):
@@ -810,7 +810,7 @@ class LockROFileError(LockPermissionError):
def __init__(self, path):
msg = "Can't take write lock on read-only file: %s" % path
super(LockROFileError, self).__init__(msg)
super().__init__(msg)
class CantCreateLockError(LockPermissionError):
@@ -819,4 +819,4 @@ class CantCreateLockError(LockPermissionError):
def __init__(self, path):
msg = "cannot create lock '%s': " % path
msg += "file does not exist and location is not writable"
super(LockError, self).__init__(msg)
super().__init__(msg)

View File

@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import unicode_literals
import contextlib
import io
import os

View File

@@ -6,8 +6,6 @@
"""
Routines for printing columnar output. See ``colify()`` for more information.
"""
from __future__ import division, unicode_literals
import io
import os
import sys

View File

@@ -59,8 +59,6 @@
To output an @, use '@@'. To output a } inside braces, use '}}'.
"""
from __future__ import unicode_literals
import re
import sys
from contextlib import contextmanager
@@ -70,7 +68,7 @@ class ColorParseError(Exception):
"""Raised when a color format fails to parse."""
def __init__(self, message):
super(ColorParseError, self).__init__(message)
super().__init__(message)
# Text styles for ansi codes
@@ -205,7 +203,7 @@ def color_when(value):
set_color_when(old_value)
class match_to_ansi(object):
class match_to_ansi:
def __init__(self, color=True, enclose=False):
self.color = _color_when_value(color)
self.enclose = enclose
@@ -321,7 +319,7 @@ def cescape(string):
return string
class ColorStream(object):
class ColorStream:
def __init__(self, stream, color=None):
self._stream = stream
self._color = color

View File

@@ -5,8 +5,6 @@
"""Utility classes for logging the output of blocks of code.
"""
from __future__ import unicode_literals
import atexit
import ctypes
import errno
@@ -67,7 +65,7 @@ def _strip(line):
return _escape.sub("", line)
class keyboard_input(object):
class keyboard_input:
"""Context manager to disable line editing and echoing.
Use this with ``sys.stdin`` for keyboard input, e.g.::
@@ -244,7 +242,7 @@ def __exit__(self, exc_type, exception, traceback):
signal.signal(signum, old_handler)
class Unbuffered(object):
class Unbuffered:
"""Wrapper for Python streams that forces them to be unbuffered.
This is implemented by forcing a flush after each write.
@@ -289,7 +287,7 @@ def _file_descriptors_work(*streams):
return False
class FileWrapper(object):
class FileWrapper:
"""Represents a file. Can be an open stream, a path to a file (not opened
yet), or neither. When unwrapped, it returns an open file (or file-like)
object.
@@ -331,7 +329,7 @@ def close(self):
self.file.close()
class MultiProcessFd(object):
class MultiProcessFd:
"""Return an object which stores a file descriptor and can be passed as an
argument to a function run with ``multiprocessing.Process``, such that
the file descriptor is available in the subprocess."""
@@ -431,7 +429,7 @@ def log_output(*args, **kwargs):
return nixlog(*args, **kwargs)
class nixlog(object):
class nixlog:
"""
Under the hood, we spawn a daemon and set up a pipe between this
process and the daemon. The daemon writes our output to both the
@@ -752,7 +750,7 @@ def close(self):
os.close(self.saved_stream)
class winlog(object):
class winlog:
"""
Similar to nixlog, with underlying
functionality ported to support Windows.

View File

@@ -13,8 +13,6 @@
Note: The functionality in this module is unsupported on Windows
"""
from __future__ import print_function
import multiprocessing
import os
import re
@@ -36,7 +34,7 @@
pass
class ProcessController(object):
class ProcessController:
"""Wrapper around some fundamental process control operations.
This allows one process (the controller) to drive another (the
@@ -157,7 +155,7 @@ def wait_running(self):
self.wait(lambda: "T" not in self.proc_status())
class PseudoShell(object):
class PseudoShell:
"""Sets up controller and minion processes with a PTY.
You can create a ``PseudoShell`` if you want to test how some

View File

@@ -13,7 +13,7 @@
from spack.util.executable import Executable, ProcessError
class ABI(object):
class ABI:
"""This class provides methods to test ABI compatibility between specs.
The current implementation is rather rough and could be improved."""

View File

@@ -60,7 +60,7 @@ def _search_duplicate_compilers(error_cls):
GROUPS = collections.defaultdict(list)
class Error(object):
class Error:
"""Information on an error reported in a test."""
def __init__(self, summary, details):

View File

@@ -80,14 +80,14 @@ def __init__(self, errors):
else:
err = errors[0]
self.message = "{0}: {1}".format(err.__class__.__name__, str(err))
super(FetchCacheError, self).__init__(self.message)
super().__init__(self.message)
class ListMirrorSpecsError(spack.error.SpackError):
"""Raised when unable to retrieve list of specs from the mirror"""
class BinaryCacheIndex(object):
class BinaryCacheIndex:
"""
The BinaryCacheIndex tracks what specs are available on (usually remote)
binary caches.
@@ -517,9 +517,7 @@ class NoOverwriteException(spack.error.SpackError):
"""Raised when a file would be overwritten"""
def __init__(self, file_path):
super(NoOverwriteException, self).__init__(
f"Refusing to overwrite the following file: {file_path}"
)
super().__init__(f"Refusing to overwrite the following file: {file_path}")
class NoGpgException(spack.error.SpackError):
@@ -528,7 +526,7 @@ class NoGpgException(spack.error.SpackError):
"""
def __init__(self, msg):
super(NoGpgException, self).__init__(msg)
super().__init__(msg)
class NoKeyException(spack.error.SpackError):
@@ -537,7 +535,7 @@ class NoKeyException(spack.error.SpackError):
"""
def __init__(self, msg):
super(NoKeyException, self).__init__(msg)
super().__init__(msg)
class PickKeyException(spack.error.SpackError):
@@ -548,7 +546,7 @@ class PickKeyException(spack.error.SpackError):
def __init__(self, keys):
err_msg = "Multiple keys available for signing\n%s\n" % keys
err_msg += "Use spack buildcache create -k <key hash> to pick a key."
super(PickKeyException, self).__init__(err_msg)
super().__init__(err_msg)
class NoVerifyException(spack.error.SpackError):
@@ -565,7 +563,7 @@ class NoChecksumException(spack.error.SpackError):
"""
def __init__(self, path, size, contents, algorithm, expected, computed):
super(NoChecksumException, self).__init__(
super().__init__(
f"{algorithm} checksum failed for {path}",
f"Expected {expected} but got {computed}. "
f"File size = {size} bytes. Contents = {contents!r}",
@@ -578,7 +576,7 @@ class NewLayoutException(spack.error.SpackError):
"""
def __init__(self, msg):
super(NewLayoutException, self).__init__(msg)
super().__init__(msg)
class UnsignedPackageException(spack.error.SpackError):
@@ -2337,7 +2335,7 @@ def download_single_spec(concrete_spec, destination, mirror_url=None):
return download_buildcache_entry(files_to_fetch, mirror_url)
class BinaryCacheQuery(object):
class BinaryCacheQuery:
"""Callable object to query if a spec is in a binary cache"""
def __init__(self, all_architectures):

View File

@@ -148,7 +148,7 @@ class MakeExecutable(Executable):
def __init__(self, name, jobs, **kwargs):
supports_jobserver = kwargs.pop("supports_jobserver", True)
super(MakeExecutable, self).__init__(name, **kwargs)
super().__init__(name, **kwargs)
self.supports_jobserver = supports_jobserver
self.jobs = jobs
@@ -175,7 +175,7 @@ def __call__(self, *args, **kwargs):
if jobs_env_jobs is not None:
kwargs["extra_env"] = {jobs_env: str(jobs_env_jobs)}
return super(MakeExecutable, self).__call__(*args, **kwargs)
return super().__call__(*args, **kwargs)
def _on_cray():
@@ -1332,7 +1332,7 @@ class ChildError(InstallError):
build_errors = [("spack.util.executable", "ProcessError")]
def __init__(self, msg, module, classname, traceback_string, log_name, log_type, context):
super(ChildError, self).__init__(msg)
super().__init__(msg)
self.module = module
self.name = classname
self.traceback = traceback_string

View File

@@ -312,7 +312,7 @@ def initconfig(self, pkg, spec, prefix):
@property
def std_cmake_args(self):
args = super(CachedCMakeBuilder, self).std_cmake_args
args = super().std_cmake_args
args.extend(["-C", self.cache_path])
return args

View File

@@ -175,7 +175,7 @@ def libs(self):
return find_libraries("*", root=lib_path, shared=True, recursive=True)
class IntelOneApiStaticLibraryList(object):
class IntelOneApiStaticLibraryList:
"""Provides ld_flags when static linking is needed
Oneapi puts static and dynamic libraries in the same directory, so

View File

@@ -63,7 +63,7 @@ def create(pkg):
return _BUILDERS[id(pkg)]
class _PhaseAdapter(object):
class _PhaseAdapter:
def __init__(self, builder, phase_fn):
self.builder = builder
self.phase_fn = phase_fn
@@ -115,7 +115,7 @@ class hierarchy (look at AspellDictPackage for an example of that)
# package. The semantic should be the same as the method in the base builder were still
# present in the base class of the package.
class _ForwardToBaseBuilder(object):
class _ForwardToBaseBuilder:
def __init__(self, wrapped_pkg_object, root_builder):
self.wrapped_package_object = wrapped_pkg_object
self.root_builder = root_builder
@@ -188,7 +188,7 @@ def __init__(self, pkg):
# Attribute containing the package wrapped in dispatcher with a `__getattr__`
# method that will forward certain calls to the default builder.
self.pkg_with_dispatcher = _ForwardToBaseBuilder(pkg, root_builder=self)
super(Adapter, self).__init__(pkg)
super().__init__(pkg)
# These two methods don't follow the (self, spec, prefix) signature of phases nor
# the (self) signature of methods, so they are added explicitly to avoid using a
@@ -388,7 +388,7 @@ def __new__(mcs, name, bases, attr_dict):
return super(_PackageAdapterMeta, mcs).__new__(mcs, name, bases, attr_dict)
class InstallationPhase(object):
class InstallationPhase:
"""Manages a single phase of the installation.
This descriptor stores at creation time the name of the method it should
@@ -530,9 +530,9 @@ def setup_build_environment(self, env):
modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
if not hasattr(super(Builder, self), "setup_build_environment"):
if not hasattr(super(), "setup_build_environment"):
return
super(Builder, self).setup_build_environment(env)
super().setup_build_environment(env)
def setup_dependent_build_environment(self, env, dependent_spec):
"""Sets up the build environment of packages that depend on this one.
@@ -563,9 +563,9 @@ def setup_dependent_build_environment(self, env, dependent_spec):
the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
if not hasattr(super(Builder, self), "setup_dependent_build_environment"):
if not hasattr(super(), "setup_dependent_build_environment"):
return
super(Builder, self).setup_dependent_build_environment(env, dependent_spec)
super().setup_dependent_build_environment(env, dependent_spec)
def __getitem__(self, idx):
key = self.phases[idx]

View File

@@ -58,7 +58,7 @@ def _fetch_cache():
return spack.fetch_strategy.FsCache(path)
class MirrorCache(object):
class MirrorCache:
def __init__(self, root, skip_unstable_versions):
self.root = os.path.abspath(root)
self.skip_unstable_versions = skip_unstable_versions

View File

@@ -57,7 +57,7 @@
PushResult = namedtuple("PushResult", "success url")
class TemporaryDirectory(object):
class TemporaryDirectory:
def __init__(self):
self.temporary_directory = tempfile.mkdtemp()
@@ -471,7 +471,7 @@ def _unpack_script(script_section, op=_noop):
return script
class RebuildDecision(object):
class RebuildDecision:
def __init__(self):
self.rebuild = True
self.mirrors = []
@@ -2128,7 +2128,7 @@ def run_standalone_tests(**kwargs):
tty.debug("spack test exited {0}".format(exit_code))
class CDashHandler(object):
class CDashHandler:
"""
Class for managing CDash data and processing.
"""

View File

@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import os
import re
@@ -149,7 +147,7 @@ def get_command(cmd_name):
return getattr(get_module(cmd_name), pname)
class _UnquotedFlags(object):
class _UnquotedFlags:
"""Use a heuristic in `.extract()` to detect whether the user is trying to set
multiple flags like the docker ENV attribute allows (e.g. 'cflags=-Os -pipe').
@@ -547,7 +545,7 @@ class PythonNameError(spack.error.SpackError):
def __init__(self, name):
self.name = name
super(PythonNameError, self).__init__("{0} is not a permissible Python name.".format(name))
super().__init__("{0} is not a permissible Python name.".format(name))
class CommandNameError(spack.error.SpackError):
@@ -555,9 +553,7 @@ class CommandNameError(spack.error.SpackError):
def __init__(self, name):
self.name = name
super(CommandNameError, self).__init__(
"{0} is not a permissible Spack command name.".format(name)
)
super().__init__("{0} is not a permissible Spack command name.".format(name))
########################################

View File

@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import collections
import archspec.cpu

View File

@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import os.path
import shutil
import tempfile

View File

@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import sys

View File

@@ -3,17 +3,22 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import copy
import os
import re
import sys
from argparse import ArgumentParser, Namespace
from typing import IO, Any, Callable, Dict, Sequence, Set
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.argparsewriter import ArgparseCompletionWriter, ArgparseRstWriter, ArgparseWriter
from llnl.util.argparsewriter import (
ArgparseCompletionWriter,
ArgparseRstWriter,
ArgparseWriter,
Command,
)
from llnl.util.tty.colify import colify
import spack.cmd
@@ -27,12 +32,12 @@
#: list of command formatters
formatters = {}
formatters: Dict[str, Callable[[Namespace, IO], None]] = {}
#: standard arguments for updating completion scripts
#: we iterate through these when called with --update-completion
update_completion_args = {
update_completion_args: Dict[str, Dict[str, Any]] = {
"bash": {
"aliases": True,
"format": "bash",
@@ -42,13 +47,25 @@
}
def formatter(func):
"""Decorator used to register formatters"""
def formatter(func: Callable[[Namespace, IO], None]) -> Callable[[Namespace, IO], None]:
"""Decorator used to register formatters.
Args:
func: Formatting function.
Returns:
The same function.
"""
formatters[func.__name__] = func
return func
def setup_parser(subparser):
def setup_parser(subparser: ArgumentParser) -> None:
"""Set up the argument parser.
Args:
subparser: Preliminary argument parser.
"""
subparser.add_argument(
"--update-completion",
action="store_true",
@@ -91,18 +108,34 @@ class SpackArgparseRstWriter(ArgparseRstWriter):
def __init__(
self,
prog,
out=None,
aliases=False,
documented_commands=[],
rst_levels=["-", "-", "^", "~", ":", "`"],
prog: str,
out: IO = sys.stdout,
aliases: bool = False,
documented_commands: Set[str] = set(),
rst_levels: Sequence[str] = ["-", "-", "^", "~", ":", "`"],
):
out = sys.stdout if out is None else out
super(SpackArgparseRstWriter, self).__init__(prog, out, aliases, rst_levels)
"""Initialize a new SpackArgparseRstWriter instance.
Args:
prog: Program name.
out: File object to write to.
aliases: Whether or not to include subparsers for aliases.
documented_commands: Set of commands with additional documentation.
rst_levels: List of characters for rst section headings.
"""
super().__init__(prog, out, aliases, rst_levels)
self.documented = documented_commands
def usage(self, *args):
string = super(SpackArgparseRstWriter, self).usage(*args)
def usage(self, usage: str) -> str:
"""Example usage of a command.
Args:
usage: Command usage.
Returns:
Usage of a command.
"""
string = super().usage(usage)
cmd = self.parser.prog.replace(" ", "-")
if cmd in self.documented:
@@ -112,11 +145,21 @@ def usage(self, *args):
class SubcommandWriter(ArgparseWriter):
def format(self, cmd):
"""Write argparse output as a list of subcommands."""
def format(self, cmd: Command) -> str:
"""Return the string representation of a single node in the parser tree.
Args:
cmd: Parsed information about a command or subcommand.
Returns:
String representation of this subcommand.
"""
return " " * self.level + cmd.prog + "\n"
_positional_to_subroutine = {
_positional_to_subroutine: Dict[str, str] = {
"package": "_all_packages",
"spec": "_all_packages",
"filter": "_all_packages",
@@ -138,7 +181,19 @@ def format(self, cmd):
class BashCompletionWriter(ArgparseCompletionWriter):
"""Write argparse output as bash programmable tab completion."""
def body(self, positionals, optionals, subcommands):
def body(
self, positionals: Sequence[str], optionals: Sequence[str], subcommands: Sequence[str]
) -> str:
"""Return the body of the function.
Args:
positionals: List of positional arguments.
optionals: List of optional arguments.
subcommands: List of subcommand parsers.
Returns:
Function body.
"""
if positionals:
return """
if $list_options
@@ -168,7 +223,15 @@ def body(self, positionals, optionals, subcommands):
self.optionals(optionals)
)
def positionals(self, positionals):
def positionals(self, positionals: Sequence[str]) -> str:
"""Return the syntax for reporting positional arguments.
Args:
positionals: List of positional arguments.
Returns:
Syntax for positional arguments.
"""
# If match found, return function name
for positional in positionals:
for key, value in _positional_to_subroutine.items():
@@ -178,22 +241,49 @@ def positionals(self, positionals):
# If no matches found, return empty list
return 'SPACK_COMPREPLY=""'
def optionals(self, optionals):
def optionals(self, optionals: Sequence[str]) -> str:
"""Return the syntax for reporting optional flags.
Args:
optionals: List of optional arguments.
Returns:
Syntax for optional flags.
"""
return 'SPACK_COMPREPLY="{0}"'.format(" ".join(optionals))
def subcommands(self, subcommands):
def subcommands(self, subcommands: Sequence[str]) -> str:
"""Return the syntax for reporting subcommands.
Args:
subcommands: List of subcommand parsers.
Returns:
Syntax for subcommand parsers.
"""
return 'SPACK_COMPREPLY="{0}"'.format(" ".join(subcommands))
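For reference, the reply strings these methods produce are just space-joined word lists; an illustrative run (the flag names here are examples only):

    optionals = ["-h", "--help", "--update-completion"]
    print('SPACK_COMPREPLY="{0}"'.format(" ".join(optionals)))
    # prints: SPACK_COMPREPLY="-h --help --update-completion"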
@formatter
def subcommands(args, out):
def subcommands(args: Namespace, out: IO) -> None:
"""Hierarchical tree of subcommands.
Args:
args: Command-line arguments.
out: File object to write to.
"""
parser = spack.main.make_argument_parser()
spack.main.add_all_commands(parser)
writer = SubcommandWriter(parser.prog, out, args.aliases)
writer.write(parser)
def rst_index(out):
def rst_index(out: IO) -> None:
"""Generate an index of all commands.
Args:
out: File object to write to.
"""
out.write("\n")
index = spack.main.index_commands()
@@ -221,13 +311,19 @@ def rst_index(out):
@formatter
def rst(args, out):
def rst(args: Namespace, out: IO) -> None:
"""ReStructuredText documentation of subcommands.
Args:
args: Command-line arguments.
out: File object to write to.
"""
# create a parser with all commands
parser = spack.main.make_argument_parser()
spack.main.add_all_commands(parser)
# extract cross-refs of the form `_cmd-spack-<cmd>:` from rst files
documented_commands = set()
documented_commands: Set[str] = set()
for filename in args.rst_files:
with open(filename) as f:
for line in f:
@@ -245,7 +341,13 @@ def rst(args, out):
@formatter
def names(args, out):
def names(args: Namespace, out: IO) -> None:
"""Simple list of top-level commands.
Args:
args: Command-line arguments.
out: File object to write to.
"""
commands = copy.copy(spack.cmd.all_commands())
if args.aliases:
@@ -255,7 +357,13 @@ def names(args, out):
@formatter
def bash(args, out):
def bash(args: Namespace, out: IO) -> None:
"""Bash tab-completion script.
Args:
args: Command-line arguments.
out: File object to write to.
"""
parser = spack.main.make_argument_parser()
spack.main.add_all_commands(parser)
@@ -263,7 +371,13 @@ def bash(args, out):
writer.write(parser)
def prepend_header(args, out):
def prepend_header(args: Namespace, out: IO) -> None:
"""Prepend header text at the beginning of a file.
Args:
args: Command-line arguments.
out: File object to write to.
"""
if not args.header:
return
@@ -271,10 +385,14 @@ def prepend_header(args, out):
out.write(header.read())
def _commands(parser, args):
def _commands(parser: ArgumentParser, args: Namespace) -> None:
"""This is the 'regular' command, which can be called multiple times.
See ``commands()`` below for ``--update-completion`` handling.
Args:
parser: Argument parser.
args: Command-line arguments.
"""
formatter = formatters[args.format]
@@ -296,12 +414,15 @@ def _commands(parser, args):
formatter(args, sys.stdout)
def update_completion(parser, args):
def update_completion(parser: ArgumentParser, args: Namespace) -> None:
"""Iterate through the shells and update the standard completion files.
This is a convenience method to avoid calling this command many
times, and to simplify completion update for developers.
Args:
parser: Argument parser.
args: Command-line arguments.
"""
for shell, shell_args in update_completion_args.items():
for attr, value in shell_args.items():
@@ -309,14 +430,20 @@ def update_completion(parser, args):
_commands(parser, args)
def commands(parser, args):
def commands(parser: ArgumentParser, args: Namespace) -> None:
"""Main function that calls formatter functions.
Args:
parser: Argument parser.
args: Command-line arguments.
"""
if args.update_completion:
if args.format != "names" or any([args.aliases, args.update, args.header]):
tty.die("--update-completion can only be specified alone.")
# this runs the command multiple times with different arguments
return update_completion(parser, args)
update_completion(parser, args)
else:
# run commands normally
return _commands(parser, args)
_commands(parser, args)


@@ -479,7 +479,7 @@ def __init__(
# substituting '_' for ':'.
dest = dest.replace(":", "_")
super(ConfigSetAction, self).__init__(
super().__init__(
option_strings=option_strings,
dest=dest,
nargs=0,
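This hunk is one instance of the Python 3 cleanup that runs through most of the files below: `super(SomeClass, self).__init__(...)` becomes the zero-argument `super()`, and `class Foo(object)` becomes `class Foo`. A small self-contained sketch of the equivalence (class names are made up):

    class Base:
        def __init__(self, msg):
            self.msg = msg

    class OldStyle(Base):
        def __init__(self, msg):
            super(OldStyle, self).__init__(msg)   # Python 2-compatible spelling

    class NewStyle(Base):
        def __init__(self, msg):
            super().__init__(msg)                 # zero-argument form, Python 3 only

    assert OldStyle("x").msg == NewStyle("x").msg == "x"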


@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import os


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import sys


@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import collections
import os
import shutil


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import os
import re
import urllib.parse
@@ -71,7 +69,7 @@ class {class_name}({base_class_name}):
'''
class BundlePackageTemplate(object):
class BundlePackageTemplate:
"""
Provides the default values to be used for a bundle package file template.
"""
@@ -122,7 +120,7 @@ def install(self, spec, prefix):
url_line = ' url = "{url}"'
def __init__(self, name, url, versions):
super(PackageTemplate, self).__init__(name, versions)
super().__init__(name, versions)
self.url_def = self.url_line.format(url=url)
@@ -200,7 +198,7 @@ def __init__(self, name, url, *args, **kwargs):
# Make it more obvious that we are renaming the package
tty.msg("Changing package name from {0} to lua-{0}".format(name))
name = "lua-{0}".format(name)
super(LuaPackageTemplate, self).__init__(name, url, *args, **kwargs)
super().__init__(name, url, *args, **kwargs)
class MesonPackageTemplate(PackageTemplate):
@@ -308,7 +306,7 @@ def __init__(self, name, url, *args, **kwargs):
tty.msg("Changing package name from {0} to rkt-{0}".format(name))
name = "rkt-{0}".format(name)
self.body_def = self.body_def.format(name[4:])
super(RacketPackageTemplate, self).__init__(name, url, *args, **kwargs)
super().__init__(name, url, *args, **kwargs)
class PythonPackageTemplate(PackageTemplate):
@@ -400,7 +398,7 @@ def __init__(self, name, url, *args, **kwargs):
+ self.url_line
)
super(PythonPackageTemplate, self).__init__(name, url, *args, **kwargs)
super().__init__(name, url, *args, **kwargs)
class RPackageTemplate(PackageTemplate):
@@ -439,7 +437,7 @@ def __init__(self, name, url, *args, **kwargs):
if bioc:
self.url_line = ' url = "{0}"\n' ' bioc = "{1}"'.format(url, r_name)
super(RPackageTemplate, self).__init__(name, url, *args, **kwargs)
super().__init__(name, url, *args, **kwargs)
class PerlmakePackageTemplate(PackageTemplate):
@@ -466,7 +464,7 @@ def __init__(self, name, *args, **kwargs):
tty.msg("Changing package name from {0} to perl-{0}".format(name))
name = "perl-{0}".format(name)
super(PerlmakePackageTemplate, self).__init__(name, *args, **kwargs)
super().__init__(name, *args, **kwargs)
class PerlbuildPackageTemplate(PerlmakePackageTemplate):
@@ -499,7 +497,7 @@ def __init__(self, name, *args, **kwargs):
tty.msg("Changing package name from {0} to octave-{0}".format(name))
name = "octave-{0}".format(name)
super(OctavePackageTemplate, self).__init__(name, *args, **kwargs)
super().__init__(name, *args, **kwargs)
class RubyPackageTemplate(PackageTemplate):
@@ -527,7 +525,7 @@ def __init__(self, name, *args, **kwargs):
tty.msg("Changing package name from {0} to ruby-{0}".format(name))
name = "ruby-{0}".format(name)
super(RubyPackageTemplate, self).__init__(name, *args, **kwargs)
super().__init__(name, *args, **kwargs)
class MakefilePackageTemplate(PackageTemplate):
@@ -572,7 +570,7 @@ def __init__(self, name, *args, **kwargs):
tty.msg("Changing package name from {0} to py-{0}".format(name))
name = "py-{0}".format(name)
super(SIPPackageTemplate, self).__init__(name, *args, **kwargs)
super().__init__(name, *args, **kwargs)
templates = {


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import os
import platform
import re


@@ -13,8 +13,6 @@
It is up to the user to ensure binary compatibility between the deprecated
installation and its deprecator.
"""
from __future__ import print_function
import argparse
import os


@@ -418,7 +418,7 @@ def env_list(args):
colify(color_names, indent=4)
class ViewAction(object):
class ViewAction:
regenerate = "regenerate"
enable = "enable"
disable = "disable"


@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import errno
import os


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import copy
import sys


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import textwrap
from itertools import zip_longest
@@ -73,7 +71,7 @@ def variant(s):
return spack.spec.enabled_variant_color + s + plain_format
class VariantFormatter(object):
class VariantFormatter:
def __init__(self, variants):
self.variants = variants
self.headers = ("Name [Default]", "When", "Allowed values", "Description")


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import os
import re
from collections import defaultdict
@@ -102,7 +100,7 @@ def list_files(args):
]
class LicenseError(object):
class LicenseError:
def __init__(self):
self.error_counts = defaultdict(int)


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import division, print_function
import argparse
import fnmatch
import json


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import os
import llnl.util.tty as tty


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
from collections import defaultdict


@@ -49,9 +49,8 @@ def setup_parser(subparser):
"-g",
"--git-installer-verbosity",
default="",
choices=set(["SILENT", "VERYSILENT"]),
help="Level of verbosity provided by bundled Git Installer.\
Default is fully verbose",
choices=["SILENT", "VERYSILENT"],
help="Level of verbosity provided by bundled Git Installer. Default is fully verbose",
required=False,
action="store",
dest="git_verbosity",


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import sys
from llnl.util import tty


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import itertools
import os


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import code
import os


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import os
import sys


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import os
import llnl.util.tty as tty


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import re
import sys


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import sys
import llnl.util.lang as lang


@@ -60,7 +60,7 @@ def is_package(f):
#: decorator for adding tools to the list
class tool(object):
class tool:
def __init__(self, name, required=False):
self.name = name
self.required = required


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import fnmatch
import os


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import os.path
import shutil


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import sys
from typing import Dict, List, Optional


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import division, print_function
import argparse
import collections
import io


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import division, print_function
import urllib.parse
from collections import defaultdict
@@ -290,7 +288,7 @@ def url_stats(args):
# dictionary of issue type -> package -> descriptions
issues = defaultdict(lambda: defaultdict(lambda: []))
class UrlStats(object):
class UrlStats:
def __init__(self):
self.total = 0
self.schemes = defaultdict(lambda: 0)


@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import argparse
import llnl.util.tty as tty


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import sys
import llnl.util.tty as tty


@@ -189,7 +189,7 @@ def in_system_subdirectory(path):
return any(path_contains_subdirectory(path, x) for x in system_dirs)
class Compiler(object):
class Compiler:
"""This class encapsulates a Spack "compiler", which includes C,
C++, and Fortran compilers. Subclasses should implement
support for specific compilers, their possible names, arguments,
@@ -673,17 +673,17 @@ class CompilerAccessError(spack.error.SpackError):
def __init__(self, compiler, paths):
msg = "Compiler '%s' has executables that are missing" % compiler.spec
msg += " or are not executable: %s" % paths
super(CompilerAccessError, self).__init__(msg)
super().__init__(msg)
class InvalidCompilerError(spack.error.SpackError):
def __init__(self):
super(InvalidCompilerError, self).__init__("Compiler has no executables.")
super().__init__("Compiler has no executables.")
class UnsupportedCompilerFlag(spack.error.SpackError):
def __init__(self, compiler, feature, flag_name, ver_string=None):
super(UnsupportedCompilerFlag, self).__init__(
super().__init__(
"{0} ({1}) does not support {2} (as compiler.{3}).".format(
compiler.name, ver_string if ver_string else compiler.version, feature, flag_name
),


@@ -369,7 +369,7 @@ def compiler_specs_for_arch(arch_spec, scope=None):
return [c.spec for c in compilers_for_arch(arch_spec, scope)]
class CacheReference(object):
class CacheReference:
"""This acts as a hashable reference to any object (regardless of whether
the object itself is hashable) and also prevents the object from being
garbage-collected (so if two CacheReference objects are equal, they
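A simplified sketch of what that docstring describes, i.e. a wrapper that is hashable by identity and keeps a strong reference to the wrapped object (`Ref` is a made-up name, not Spack's class):

    class Ref:
        def __init__(self, obj):
            self.obj = obj                     # strong reference keeps obj alive
        def __eq__(self, other):
            return isinstance(other, Ref) and self.obj is other.obj
        def __hash__(self):
            return id(self.obj)                # hashable even if obj itself is not

    key = Ref(["not", "hashable"])
    cache = {key: "cached value"}              # a list can now key a dict entry
    assert cache[Ref(key.obj)] == "cached value"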
@@ -820,7 +820,7 @@ def name_matches(name, name_list):
class InvalidCompilerConfigurationError(spack.error.SpackError):
def __init__(self, compiler_spec):
super(InvalidCompilerConfigurationError, self).__init__(
super().__init__(
'Invalid configuration for [compiler "%s"]: ' % compiler_spec,
"Compiler configuration must contain entries for all compilers: %s"
% _path_instance_vars,
@@ -829,19 +829,17 @@ def __init__(self, compiler_spec):
class NoCompilersError(spack.error.SpackError):
def __init__(self):
super(NoCompilersError, self).__init__("Spack could not find any compilers!")
super().__init__("Spack could not find any compilers!")
class UnknownCompilerError(spack.error.SpackError):
def __init__(self, compiler_name):
super(UnknownCompilerError, self).__init__(
"Spack doesn't support the requested compiler: {0}".format(compiler_name)
)
super().__init__("Spack doesn't support the requested compiler: {0}".format(compiler_name))
class NoCompilerForSpecError(spack.error.SpackError):
def __init__(self, compiler_spec, target):
super(NoCompilerForSpecError, self).__init__(
super().__init__(
"No compilers for operating system %s satisfy spec %s" % (target, compiler_spec)
)
@@ -860,11 +858,9 @@ def __init__(self, compiler_spec, arch_spec):
+ " in the following files:\n\t"
+ "\n\t".join(duplicate_msg(x, y) for x, y in duplicate_table)
)
super(CompilerDuplicateError, self).__init__(msg)
super().__init__(msg)
class CompilerSpecInsufficientlySpecificError(spack.error.SpackError):
def __init__(self, compiler_spec):
super(CompilerSpecInsufficientlySpecificError, self).__init__(
"Multiple compilers satisfy spec %s" % compiler_spec
)
super().__init__("Multiple compilers satisfy spec %s" % compiler_spec)


@@ -132,7 +132,7 @@ def setup_custom_environment(self, pkg, env):
the 'DEVELOPER_DIR' environment variables to cause the xcrun and
related tools to use this Xcode.app.
"""
super(AppleClang, self).setup_custom_environment(pkg, env)
super().setup_custom_environment(pkg, env)
if not pkg.use_xcode:
# if we do it for all packages, we get into big troubles with MPI:


@@ -12,7 +12,7 @@ class Cce(Compiler):
"""Cray compiler environment compiler."""
def __init__(self, *args, **kwargs):
super(Cce, self).__init__(*args, **kwargs)
super().__init__(*args, **kwargs)
# For old cray compilers on module based systems we replace
# ``version_argument`` with the old value. Cannot be a property
# as the new value is used in classmethods for path-based detection


@@ -77,7 +77,7 @@ class Msvc(Compiler):
def __init__(self, *args, **kwargs):
new_pth = [pth if pth else get_valid_fortran_pth(args[0].version) for pth in args[3]]
args[3][:] = new_pth
super(Msvc, self).__init__(*args, **kwargs)
super().__init__(*args, **kwargs)
if os.getenv("ONEAPI_ROOT"):
# If this found, it sets all the vars
self.setvarsfile = os.path.join(os.getenv("ONEAPI_ROOT"), "setvars.bat")


@@ -14,8 +14,6 @@
TODO: make this customizable and allow users to configure
concretization policies.
"""
from __future__ import print_function
import functools
import platform
import tempfile
@@ -50,7 +48,7 @@
@functools.total_ordering
class reverse_order(object):
class reverse_order:
"""Helper for creating key functions.
This is a wrapper that inverts the sense of the natural
@@ -67,7 +65,7 @@ def __lt__(self, other):
return other.value < self.value
class Concretizer(object):
class Concretizer:
"""You can subclass this class to override some of the default
concretization strategies, or you can override all of them.
"""
@@ -794,9 +792,7 @@ def __init__(self, arch, available_os_targets):
" operating systems and targets:\n\t" + "\n\t".join(available_os_target_strs)
)
super(NoCompilersForArchError, self).__init__(
err_msg, "Run 'spack compiler find' to add compilers."
)
super().__init__(err_msg, "Run 'spack compiler find' to add compilers.")
class UnavailableCompilerVersionError(spack.error.SpackError):
@@ -808,7 +804,7 @@ def __init__(self, compiler_spec, arch=None):
if arch:
err_msg += " for operating system {0} and target {1}.".format(arch.os, arch.target)
super(UnavailableCompilerVersionError, self).__init__(
super().__init__(
err_msg,
"Run 'spack compiler find' to add compilers or "
"'spack compilers' to see which compilers are already recognized"
@@ -821,7 +817,7 @@ class NoValidVersionError(spack.error.SpackError):
particular spec."""
def __init__(self, spec):
super(NoValidVersionError, self).__init__(
super().__init__(
"There are no valid versions for %s that match '%s'" % (spec.name, spec.versions)
)
@@ -832,7 +828,7 @@ class InsufficientArchitectureInfoError(spack.error.SpackError):
system"""
def __init__(self, spec, archs):
super(InsufficientArchitectureInfoError, self).__init__(
super().__init__(
"Cannot determine necessary architecture information for '%s': %s"
% (spec.name, str(archs))
)
@@ -848,4 +844,4 @@ def __init__(self, spec):
"The spec\n '%s'\n is configured as not buildable, "
"and no matching external installs were found"
)
super(NoBuildError, self).__init__(msg % spec)
super().__init__(msg % spec)


@@ -111,7 +111,7 @@
overrides_base_name = "overrides-"
class ConfigScope(object):
class ConfigScope:
"""This class represents a configuration scope.
A scope is one directory containing named configuration files.
@@ -182,7 +182,7 @@ def __init__(self, name, path, schema, yaml_path=None):
config:
install_tree: $spack/opt/spack
"""
super(SingleFileScope, self).__init__(name, path)
super().__init__(name, path)
self._raw_data = None
self.schema = schema
self.yaml_path = yaml_path or []
@@ -310,7 +310,7 @@ class InternalConfigScope(ConfigScope):
"""
def __init__(self, name, data=None):
super(InternalConfigScope, self).__init__(name, None)
super().__init__(name, None)
self.sections = syaml.syaml_dict()
if data:
@@ -382,7 +382,7 @@ def _method(self, *args, **kwargs):
return _method
class Configuration(object):
class Configuration:
"""A full Spack configuration, from a hierarchy of config files.
This class makes it easy to add a new scope on top of an existing one.
@@ -1495,7 +1495,7 @@ def __init__(self, validation_error, data, filename=None, line=None):
location += ":%d" % line
message = "%s: %s" % (location, validation_error.message)
super(ConfigError, self).__init__(message)
super().__init__(message)
def _get_mark(self, validation_error, data):
"""Get the file/line mark fo a validation error from a Spack YAML file."""


@@ -18,7 +18,7 @@ class DockerContext(PathContext):
@tengine.context_property
def manifest(self):
manifest_str = super(DockerContext, self).manifest
manifest_str = super().manifest
# Docker doesn't support HEREDOC, so we need to resort to
# a horrible echo trick to have the manifest in the Dockerfile
echoed_lines = []
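A hypothetical sketch of the "echo trick" the comment refers to: emit one echo command per manifest line so a Dockerfile RUN step can recreate the file (the target path below is invented):

    manifest = "spack:\n  specs:\n  - zlib\n"
    echoed_lines = [
        'echo "{0}" >> /opt/environment/spack.yaml'.format(line)
        for line in manifest.splitlines()
    ]
    print(" && \\\n".join(echoed_lines))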


@@ -199,4 +199,4 @@ def read(path, apply_updates):
class ManifestValidationError(spack.error.SpackError):
def __init__(self, msg, long_msg=None):
super(ManifestValidationError, self).__init__(msg, long_msg)
super().__init__(msg, long_msg)


@@ -135,7 +135,7 @@ class InstallStatus(str):
pass
class InstallStatuses(object):
class InstallStatuses:
INSTALLED = InstallStatus("installed")
DEPRECATED = InstallStatus("deprecated")
MISSING = InstallStatus("missing")
@@ -162,7 +162,7 @@ def canonicalize(cls, query_arg):
return query_arg
class InstallRecord(object):
class InstallRecord:
"""A record represents one installation in the DB.
The record keeps track of the spec for the installation, its
@@ -253,7 +253,7 @@ class ForbiddenLockError(SpackError):
"""Raised when an upstream DB attempts to acquire a lock"""
class ForbiddenLock(object):
class ForbiddenLock:
def __getattribute__(self, name):
raise ForbiddenLockError("Cannot access attribute '{0}' of lock".format(name))
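An illustrative stand-in for the same pattern: an object that refuses every attribute access, so any accidental use of it fails loudly (names below are made up):

    class Forbidden:
        def __getattribute__(self, name):
            raise RuntimeError("Cannot access attribute '{0}'".format(name))

    lock = Forbidden()
    try:
        lock.acquire()
    except RuntimeError as err:
        print(err)   # Cannot access attribute 'acquire'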
@@ -307,7 +307,7 @@ def __getattribute__(self, name):
"""
class Database(object):
class Database:
"""Per-process lock objects for each install prefix."""
@@ -1667,7 +1667,7 @@ def __init__(self, database, expected, found):
f"you need a newer Spack version to read the DB in '{database.root}'. "
f"{self.database_version_message}"
)
super(InvalidDatabaseVersionError, self).__init__(msg)
super().__init__(msg)
@property
def database_version_message(self):


@@ -112,10 +112,15 @@ def path_to_dict(search_paths):
# Reverse order of search directories so that a lib in the first
# entry overrides later entries
for search_path in reversed(search_paths):
for lib in os.listdir(search_path):
lib_path = os.path.join(search_path, lib)
if llnl.util.filesystem.is_readable_file(lib_path):
path_to_lib[lib_path] = lib
try:
for lib in os.listdir(search_path):
lib_path = os.path.join(search_path, lib)
if llnl.util.filesystem.is_readable_file(lib_path):
path_to_lib[lib_path] = lib
except OSError as e:
msg = f"cannot scan '{search_path}' for external software: {str(e)}"
llnl.util.tty.debug(msg)
return path_to_lib
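A stand-alone sketch of the fix in this hunk, using only the standard library (`os.access` stands in for `llnl.util.filesystem.is_readable_file`, and the debug call is replaced with a plain print):

    import os

    def scan_paths(search_paths):
        found = {}
        for search_path in reversed(search_paths):
            try:
                for name in os.listdir(search_path):
                    path = os.path.join(search_path, name)
                    if os.path.isfile(path) and os.access(path, os.R_OK):
                        found[path] = name
            except OSError as e:
                # unreadable or vanished directories are skipped, not fatal
                print("cannot scan '{0}': {1}".format(search_path, e))
        return found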
@@ -224,7 +229,7 @@ def _windows_drive():
return drive
class WindowsCompilerExternalPaths(object):
class WindowsCompilerExternalPaths:
@staticmethod
def find_windows_compiler_root_paths():
"""Helper for Windows compiler installation root discovery
@@ -260,7 +265,7 @@ def find_windows_compiler_bundled_packages():
)
class WindowsKitExternalPaths(object):
class WindowsKitExternalPaths:
if sys.platform == "win32":
plat_major_ver = str(winOs.windows_version()[0])


@@ -37,7 +37,7 @@ def _check_concrete(spec):
raise ValueError("Specs passed to a DirectoryLayout must be concrete!")
class DirectoryLayout(object):
class DirectoryLayout:
"""A directory layout is used to associate unique paths with specs.
Different installations are going to want different layouts for their
install, and they can use this to customize the nesting structure of
@@ -388,14 +388,14 @@ class DirectoryLayoutError(SpackError):
"""Superclass for directory layout errors."""
def __init__(self, message, long_msg=None):
super(DirectoryLayoutError, self).__init__(message, long_msg)
super().__init__(message, long_msg)
class RemoveFailedError(DirectoryLayoutError):
"""Raised when a DirectoryLayout cannot remove an install prefix."""
def __init__(self, installed_spec, prefix, error):
super(RemoveFailedError, self).__init__(
super().__init__(
"Could not remove prefix %s for %s : %s" % (prefix, installed_spec.short_spec, error)
)
self.cause = error
@@ -405,7 +405,7 @@ class InconsistentInstallDirectoryError(DirectoryLayoutError):
"""Raised when a package seems to be installed to the wrong place."""
def __init__(self, message, long_msg=None):
super(InconsistentInstallDirectoryError, self).__init__(message, long_msg)
super().__init__(message, long_msg)
class SpecReadError(DirectoryLayoutError):
@@ -416,7 +416,7 @@ class InvalidDirectoryLayoutParametersError(DirectoryLayoutError):
"""Raised when a invalid directory layout parameters are supplied"""
def __init__(self, message, long_msg=None):
super(InvalidDirectoryLayoutParametersError, self).__init__(message, long_msg)
super().__init__(message, long_msg)
class InvalidExtensionSpecError(DirectoryLayoutError):
@@ -427,16 +427,14 @@ class ExtensionAlreadyInstalledError(DirectoryLayoutError):
"""Raised when an extension is added to a package that already has it."""
def __init__(self, spec, ext_spec):
super(ExtensionAlreadyInstalledError, self).__init__(
"%s is already installed in %s" % (ext_spec.short_spec, spec.short_spec)
)
super().__init__("%s is already installed in %s" % (ext_spec.short_spec, spec.short_spec))
class ExtensionConflictError(DirectoryLayoutError):
"""Raised when an extension is added to a package that already has it."""
def __init__(self, spec, ext_spec, conflict):
super(ExtensionConflictError, self).__init__(
super().__init__(
"%s cannot be installed in %s because it conflicts with %s"
% (ext_spec.short_spec, spec.short_spec, conflict.short_spec)
)


@@ -39,7 +39,6 @@
import spack.stage
import spack.store
import spack.subprocess_context
import spack.traverse
import spack.user_environment as uenv
import spack.util.cpus
import spack.util.environment
@@ -51,6 +50,7 @@
import spack.util.spack_yaml as syaml
import spack.util.url
import spack.version
from spack import traverse
from spack.filesystem_view import SimpleFilesystemView, inverse_view_func_parser, view_func_parser
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
@@ -426,32 +426,6 @@ def _is_dev_spec_and_has_changed(spec):
return mtime > record.installation_time
def _spec_needs_overwrite(spec, changed_dev_specs):
"""Check whether the current spec needs to be overwritten because either it has
changed itself or one of its dependencies have changed
"""
# if it's not installed, we don't need to overwrite it
if not spec.installed:
return False
# If the spec itself has changed this is a trivial decision
if spec in changed_dev_specs:
return True
# if spec and all deps aren't dev builds, we don't need to overwrite it
if not any(spec.satisfies(c) for c in ("dev_path=*", "^dev_path=*")):
return False
# If any dep needs overwrite, or any dep is missing and is a dev build then
# overwrite this package
if any(
((not dep.installed) and dep.satisfies("dev_path=*"))
or _spec_needs_overwrite(dep, changed_dev_specs)
for dep in spec.traverse(root=False)
):
return True
def _error_on_nonempty_view_dir(new_root):
"""Defensively error when the target view path already exists and is not an
empty directory. This usually happens when the view symlink was removed, but
@@ -636,18 +610,16 @@ def specs_for_view(self, concretized_root_specs):
From the list of concretized user specs in the environment, flatten
the dags, and filter selected, installed specs, remove duplicates on dag hash.
"""
dag_hash = lambda spec: spec.dag_hash()
# With deps, requires traversal
if self.link == "all" or self.link == "run":
deptype = ("run") if self.link == "run" else ("link", "run")
specs = list(
spack.traverse.traverse_nodes(
concretized_root_specs, deptype=deptype, key=dag_hash
traverse.traverse_nodes(
concretized_root_specs, deptype=deptype, key=traverse.by_dag_hash
)
)
else:
specs = list(dedupe(concretized_root_specs, key=dag_hash))
specs = list(dedupe(concretized_root_specs, key=traverse.by_dag_hash))
# Filter selected, installed specs
with spack.store.db.read_transaction():
@@ -1808,17 +1780,29 @@ def _add_concrete_spec(self, spec, concrete, new=True):
self.specs_by_hash[h] = concrete
def _get_overwrite_specs(self):
# Collect all specs in the environment first before checking which ones
# to rebuild to avoid checking the same specs multiple times
specs_to_check = set()
for dag_hash in self.concretized_order:
root_spec = self.specs_by_hash[dag_hash]
specs_to_check.update(root_spec.traverse(root=True))
changed_dev_specs = set(s for s in specs_to_check if _is_dev_spec_and_has_changed(s))
# Find all dev specs that were modified.
changed_dev_specs = [
s
for s in traverse.traverse_nodes(
self.concrete_roots(), order="breadth", key=traverse.by_dag_hash
)
if _is_dev_spec_and_has_changed(s)
]
# Collect their hashes, and the hashes of their installed parents.
# Notice: with order=breadth all changed dev specs are at depth 0,
# even if they occur as parents of one another.
return [
s.dag_hash() for s in specs_to_check if _spec_needs_overwrite(s, changed_dev_specs)
spec.dag_hash()
for depth, spec in traverse.traverse_nodes(
changed_dev_specs,
root=True,
order="breadth",
depth=True,
direction="parents",
key=traverse.by_dag_hash,
)
if depth == 0 or spec.installed
]
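The comments above describe the new algorithm: start from the changed dev specs, walk breadth-first towards parents, and keep every start node (depth 0) plus every installed ancestor. A simplified, self-contained sketch of that traversal over a plain parent map (Spack's real code uses spack.traverse.traverse_nodes with direction="parents"):

    from collections import deque

    def overwrite_candidates(parents, changed, installed):
        seen = set(changed)
        queue = deque((node, 0) for node in changed)
        result = []
        while queue:
            node, depth = queue.popleft()
            if depth == 0 or node in installed:
                result.append(node)
            for parent in parents.get(node, ()):
                if parent not in seen:
                    seen.add(parent)
                    queue.append((parent, depth + 1))
        return result

    # b depends on a, c depends on b; a changed and c is installed:
    print(overwrite_candidates({"a": ["b"], "b": ["c"]}, ["a"], {"c"}))   # ['a', 'c']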
def _install_log_links(self, spec):
@@ -1925,7 +1909,7 @@ def install_specs(self, specs=None, **install_args):
def all_specs(self):
"""Return all specs, even those a user spec would shadow."""
roots = [self.specs_by_hash[h] for h in self.concretized_order]
specs = [s for s in spack.traverse.traverse_nodes(roots, lambda s: s.dag_hash())]
specs = [s for s in traverse.traverse_nodes(roots, key=traverse.by_dag_hash)]
specs.sort()
return specs
@@ -1971,13 +1955,18 @@ def concrete_roots(self):
roots *without* associated user spec"""
return [root for _, root in self.concretized_specs()]
def get_by_hash(self, dag_hash):
matches = {}
roots = [self.specs_by_hash[h] for h in self.concretized_order]
for spec in spack.traverse.traverse_nodes(roots, key=lambda s: s.dag_hash()):
def get_by_hash(self, dag_hash: str) -> List[Spec]:
# If it's not a partial hash prefix we can early exit
early_exit = len(dag_hash) == 32
matches = []
for spec in traverse.traverse_nodes(
self.concrete_roots(), key=traverse.by_dag_hash, order="breadth"
):
if spec.dag_hash().startswith(dag_hash):
matches[spec.dag_hash()] = spec
return list(matches.values())
matches.append(spec)
if early_exit:
break
return matches
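The early-exit logic added here is plain prefix matching that stops at the first hit once the query is already a full 32-character hash, since a complete hash can match at most one spec. A minimal sketch over plain strings:

    def find_by_hash_prefix(hashes, prefix, full_length=32):
        is_full_hash = len(prefix) == full_length
        matches = []
        for h in hashes:
            if h.startswith(prefix):
                matches.append(h)
                if is_full_hash:
                    break      # nothing else can match a complete hash
        return matches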
def get_one_by_hash(self, dag_hash):
"""Returns the single spec from the environment which matches the
@@ -1989,11 +1978,14 @@ def get_one_by_hash(self, dag_hash):
def all_matching_specs(self, *specs: spack.spec.Spec) -> List[Spec]:
"""Returns all concretized specs in the environment satisfying any of the input specs"""
key = lambda s: s.dag_hash()
# Look up abstract hashes ahead of time, to avoid O(n^2) traversal.
specs = [s.lookup_hash() for s in specs]
# Avoid double lookup by directly calling _satisfies.
return [
s
for s in spack.traverse.traverse_nodes(self.concrete_roots(), key=key)
if any(s.satisfies(t) for t in specs)
for s in traverse.traverse_nodes(self.concrete_roots(), key=traverse.by_dag_hash)
if any(s._satisfies(t) for t in specs)
]
@spack.repo.autospec
@@ -2017,9 +2009,9 @@ def matching_spec(self, spec):
env_root_to_user = {root.dag_hash(): user for user, root in self.concretized_specs()}
root_matches, dep_matches = [], []
for env_spec in spack.traverse.traverse_nodes(
for env_spec in traverse.traverse_nodes(
specs=[root for _, root in self.concretized_specs()],
key=lambda s: s.dag_hash(),
key=traverse.by_dag_hash,
order="breadth",
):
if not env_spec.satisfies(spec):
@@ -2093,8 +2085,8 @@ def _get_environment_specs(self, recurse_dependencies=True):
if recurse_dependencies:
specs.extend(
spack.traverse.traverse_nodes(
specs, root=False, deptype=("link", "run"), key=lambda s: s.dag_hash()
traverse.traverse_nodes(
specs, root=False, deptype=("link", "run"), key=traverse.by_dag_hash
)
)
@@ -2103,9 +2095,7 @@ def _get_environment_specs(self, recurse_dependencies=True):
def _to_lockfile_dict(self):
"""Create a dictionary to store a lockfile for this environment."""
concrete_specs = {}
for s in spack.traverse.traverse_nodes(
self.specs_by_hash.values(), key=lambda s: s.dag_hash()
):
for s in traverse.traverse_nodes(self.specs_by_hash.values(), key=traverse.by_dag_hash):
spec_dict = s.node_dict_with_hashes(hash=ht.dag_hash)
# Assumes no legacy formats, since this was just created.
spec_dict[ht.dag_hash.name] = s.dag_hash()
@@ -2262,7 +2252,7 @@ def ensure_env_directory_exists(self, dot_env: bool = False) -> None:
def update_environment_repository(self) -> None:
"""Updates the repository associated with the environment."""
for spec in spack.traverse.traverse_nodes(self.new_specs):
for spec in traverse.traverse_nodes(self.new_specs):
if not spec.concrete:
raise ValueError("specs passed to environment.write() must be concrete!")


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import inspect
import sys
@@ -21,7 +19,7 @@ class SpackError(Exception):
"""
def __init__(self, message, long_message=None):
super(SpackError, self).__init__()
super().__init__()
self.message = message
self._long_message = long_message
@@ -93,14 +91,14 @@ class UnsupportedPlatformError(SpackError):
"""Raised by packages when a platform is not supported"""
def __init__(self, message):
super(UnsupportedPlatformError, self).__init__(message)
super().__init__(message)
class NoLibrariesError(SpackError):
"""Raised when package libraries are requested but cannot be found"""
def __init__(self, message_or_name, prefix=None):
super(NoLibrariesError, self).__init__(
super().__init__(
message_or_name
if prefix is None
else "Unable to locate {0} libraries in {1}".format(message_or_name, prefix)
@@ -125,9 +123,7 @@ class UnsatisfiableSpecError(SpecError):
def __init__(self, provided, required, constraint_type):
# This is only the entrypoint for old concretizer errors
super(UnsatisfiableSpecError, self).__init__(
"%s does not satisfy %s" % (provided, required)
)
super().__init__("%s does not satisfy %s" % (provided, required))
self.provided = provided
self.required = required


@@ -176,7 +176,7 @@ class CommandNotFoundError(spack.error.SpackError):
"""
def __init__(self, cmd_name):
super(CommandNotFoundError, self).__init__(
super().__init__(
"{0} is not a recognized Spack command or extension command;"
" check with `spack commands`.".format(cmd_name)
)
@@ -188,6 +188,4 @@ class ExtensionNamingError(spack.error.SpackError):
"""
def __init__(self, path):
super(ExtensionNamingError, self).__init__(
"{0} does not match the format for a Spack extension path.".format(path)
)
super().__init__("{0} does not match the format for a Spack extension path.".format(path))


@@ -93,7 +93,7 @@ def fetcher(cls):
return cls
class FetchStrategy(object):
class FetchStrategy:
"""Superclass of all fetch strategies."""
#: The URL attribute must be specified either at the package class
@@ -234,9 +234,7 @@ class FetchStrategyComposite(pattern.Composite):
matches = FetchStrategy.matches
def __init__(self):
super(FetchStrategyComposite, self).__init__(
["fetch", "check", "expand", "reset", "archive", "cachable", "mirror_id"]
)
super().__init__(["fetch", "check", "expand", "reset", "archive", "cachable", "mirror_id"])
def source_id(self):
component_ids = tuple(i.source_id() for i in self)
@@ -263,7 +261,7 @@ class URLFetchStrategy(FetchStrategy):
optional_attrs = list(crypto.hashes.keys()) + ["checksum"]
def __init__(self, url=None, checksum=None, **kwargs):
super(URLFetchStrategy, self).__init__(**kwargs)
super().__init__(**kwargs)
# Prefer values in kwargs to the positionals.
self.url = kwargs.get("url", url)
@@ -580,7 +578,7 @@ class VCSFetchStrategy(FetchStrategy):
"""
def __init__(self, **kwargs):
super(VCSFetchStrategy, self).__init__(**kwargs)
super().__init__(**kwargs)
# Set a URL based on the type of fetch strategy.
self.url = kwargs.get(self.url_attr, None)
@@ -652,7 +650,7 @@ def __init__(self, **kwargs):
# call to __init__
forwarded_args = copy.copy(kwargs)
forwarded_args.pop("name", None)
super(GoFetchStrategy, self).__init__(**forwarded_args)
super().__init__(**forwarded_args)
self._go = None
@@ -681,7 +679,7 @@ def fetch(self):
self.go("get", "-v", "-d", self.url, env=env)
def archive(self, destination):
super(GoFetchStrategy, self).archive(destination, exclude=".git")
super().archive(destination, exclude=".git")
@_needs_stage
def expand(self):
@@ -740,7 +738,7 @@ def __init__(self, **kwargs):
# to __init__
forwarded_args = copy.copy(kwargs)
forwarded_args.pop("name", None)
super(GitFetchStrategy, self).__init__(**forwarded_args)
super().__init__(**forwarded_args)
self._git = None
self.submodules = kwargs.get("submodules", False)
@@ -948,7 +946,7 @@ def clone(self, dest=None, commit=None, branch=None, tag=None, bare=False):
git(*args)
def archive(self, destination):
super(GitFetchStrategy, self).archive(destination, exclude=".git")
super().archive(destination, exclude=".git")
@_needs_stage
def reset(self):
@@ -997,7 +995,7 @@ def __init__(self, **kwargs):
# to __init__
forwarded_args = copy.copy(kwargs)
forwarded_args.pop("name", None)
super(CvsFetchStrategy, self).__init__(**forwarded_args)
super().__init__(**forwarded_args)
self._cvs = None
if self.branch is not None:
@@ -1077,7 +1075,7 @@ def _remove_untracked_files(self):
os.unlink(path)
def archive(self, destination):
super(CvsFetchStrategy, self).archive(destination, exclude="CVS")
super().archive(destination, exclude="CVS")
@_needs_stage
def reset(self):
@@ -1113,7 +1111,7 @@ def __init__(self, **kwargs):
# to __init__
forwarded_args = copy.copy(kwargs)
forwarded_args.pop("name", None)
super(SvnFetchStrategy, self).__init__(**forwarded_args)
super().__init__(**forwarded_args)
self._svn = None
if self.revision is not None:
@@ -1172,7 +1170,7 @@ def _remove_untracked_files(self):
shutil.rmtree(path, ignore_errors=True)
def archive(self, destination):
super(SvnFetchStrategy, self).archive(destination, exclude=".svn")
super().archive(destination, exclude=".svn")
@_needs_stage
def reset(self):
@@ -1216,7 +1214,7 @@ def __init__(self, **kwargs):
# to __init__
forwarded_args = copy.copy(kwargs)
forwarded_args.pop("name", None)
super(HgFetchStrategy, self).__init__(**forwarded_args)
super().__init__(**forwarded_args)
self._hg = None
@@ -1277,7 +1275,7 @@ def fetch(self):
shutil.move(repo_name, self.stage.source_path)
def archive(self, destination):
super(HgFetchStrategy, self).archive(destination, exclude=".hg")
super().archive(destination, exclude=".hg")
@_needs_stage
def reset(self):
@@ -1306,7 +1304,7 @@ class S3FetchStrategy(URLFetchStrategy):
def __init__(self, *args, **kwargs):
try:
super(S3FetchStrategy, self).__init__(*args, **kwargs)
super().__init__(*args, **kwargs)
except ValueError:
if not kwargs.get("url"):
raise ValueError("S3FetchStrategy requires a url for fetching.")
@@ -1353,7 +1351,7 @@ class GCSFetchStrategy(URLFetchStrategy):
def __init__(self, *args, **kwargs):
try:
super(GCSFetchStrategy, self).__init__(*args, **kwargs)
super().__init__(*args, **kwargs)
except ValueError:
if not kwargs.get("url"):
raise ValueError("GCSFetchStrategy requires a url for fetching.")
@@ -1652,7 +1650,7 @@ def from_list_url(pkg):
tty.msg("Could not determine url from list_url.")
class FsCache(object):
class FsCache:
def __init__(self, root):
self.root = os.path.abspath(root)
@@ -1686,7 +1684,7 @@ class FailedDownloadError(web_util.FetchError):
"""Raised when a download fails."""
def __init__(self, url, msg=""):
super(FailedDownloadError, self).__init__("Failed to fetch file from URL: %s" % url, msg)
super().__init__("Failed to fetch file from URL: %s" % url, msg)
self.url = url
@@ -1716,7 +1714,7 @@ def __init__(self, pkg=None, version=None, **args):
if version:
msg += "@{version}".format(version=version)
long_msg = "with arguments: {args}".format(args=args)
super(InvalidArgsError, self).__init__(msg, long_msg)
super().__init__(msg, long_msg)
class ChecksumError(web_util.FetchError):
@@ -1727,6 +1725,4 @@ class NoStageError(web_util.FetchError):
"""Raised when fetch operations are called before set_stage()."""
def __init__(self, method):
super(NoStageError, self).__init__(
"Must call FetchStrategy.set_stage() before calling %s" % method.__name__
)
super().__init__("Must call FetchStrategy.set_stage() before calling %s" % method.__name__)


@@ -126,7 +126,7 @@ def inverse_view_func_parser(view_type):
return link_name
class FilesystemView(object):
class FilesystemView:
"""
Governs a filesystem view that is located at certain root-directory.
@@ -255,7 +255,7 @@ class YamlFilesystemView(FilesystemView):
"""
def __init__(self, root, layout, **kwargs):
super(YamlFilesystemView, self).__init__(root, layout, **kwargs)
super().__init__(root, layout, **kwargs)
# Super class gets projections from the kwargs
# YAML specific to get projections from YAML file
@@ -637,7 +637,7 @@ class SimpleFilesystemView(FilesystemView):
were added."""
def __init__(self, root, layout, **kwargs):
super(SimpleFilesystemView, self).__init__(root, layout, **kwargs)
super().__init__(root, layout, **kwargs)
def _sanity_check_view_projection(self, specs):
"""A very common issue is that we end up with two specs of the same


@@ -10,7 +10,7 @@
hashes = []
class SpecHashDescriptor(object):
class SpecHashDescriptor:
"""This class defines how hashes are generated on Spec objects.
Spec hashes in Spack are generated from a serialized (e.g., with


@@ -33,7 +33,7 @@
import spack.paths
class _HookRunner(object):
class _HookRunner:
#: Stores all hooks on first call, shared among
#: all HookRunner objects
_hooks = None


@@ -86,7 +86,7 @@
STATUS_REMOVED = "removed"
class InstallAction(object):
class InstallAction:
#: Don't perform an install
NONE = 0
#: Do a standard install
@@ -657,7 +657,7 @@ def package_id(pkg):
return "{0}-{1}-{2}".format(pkg.name, pkg.version, pkg.spec.dag_hash())
class TermTitle(object):
class TermTitle:
def __init__(self, pkg_count):
# Counters used for showing status information in the terminal title
self.pkg_num = 0
@@ -683,7 +683,7 @@ def set(self, text):
sys.stdout.flush()
class TermStatusLine(object):
class TermStatusLine:
"""
This class is used in distributed builds to inform the user that other packages are
being installed by another process.
@@ -727,7 +727,7 @@ def clear(self):
sys.stdout.flush()
class PackageInstaller(object):
class PackageInstaller:
"""
Class for managing the install process for a Spack instance based on a
bottom-up DAG approach.
@@ -1867,7 +1867,7 @@ def install(self):
)
class BuildProcessInstaller(object):
class BuildProcessInstaller:
"""This class implements the part installation that happens in the child process."""
def __init__(self, pkg, install_args):
@@ -2091,7 +2091,7 @@ def build_process(pkg, install_args):
return installer.run()
class OverwriteInstall(object):
class OverwriteInstall:
def __init__(self, installer, database, task):
self.installer = installer
self.database = database
@@ -2122,7 +2122,7 @@ def install(self):
raise e.inner_exception
class BuildTask(object):
class BuildTask:
"""Class for representing the build task for a package."""
def __init__(self, pkg, request, compiler, start, attempts, status, installed):
@@ -2338,7 +2338,7 @@ def priority(self):
return len(self.uninstalled_deps)
class BuildRequest(object):
class BuildRequest:
"""Class for representing an installation request."""
def __init__(self, pkg, install_args):
@@ -2505,7 +2505,7 @@ class InstallError(spack.error.SpackError):
"""
def __init__(self, message, long_msg=None, pkg=None):
super(InstallError, self).__init__(message, long_msg)
super().__init__(message, long_msg)
self.pkg = pkg
@@ -2513,9 +2513,7 @@ class BadInstallPhase(InstallError):
"""Raised for an install phase option is not allowed for a package."""
def __init__(self, pkg_name, phase):
super(BadInstallPhase, self).__init__(
"'{0}' is not a valid phase for package {1}".format(phase, pkg_name)
)
super().__init__("'{0}' is not a valid phase for package {1}".format(phase, pkg_name))
class ExternalPackageError(InstallError):


@@ -8,8 +8,6 @@
In a normal Spack installation, this is invoked from the bin/spack script
after the system path is set up.
"""
from __future__ import print_function
import argparse
import inspect
import io
@@ -197,7 +195,7 @@ def index_commands():
class SpackHelpFormatter(argparse.RawTextHelpFormatter):
def _format_actions_usage(self, actions, groups):
"""Formatter with more concise usage strings."""
usage = super(SpackHelpFormatter, self)._format_actions_usage(actions, groups)
usage = super()._format_actions_usage(actions, groups)
# Eliminate any occurrence of two or more consecutive spaces
usage = re.sub(r"[ ]{2,}", " ", usage)
@@ -212,7 +210,7 @@ def _format_actions_usage(self, actions, groups):
def add_arguments(self, actions):
actions = sorted(actions, key=operator.attrgetter("option_strings"))
super(SpackHelpFormatter, self).add_arguments(actions)
super().add_arguments(actions)
class SpackArgumentParser(argparse.ArgumentParser):
@@ -332,7 +330,7 @@ def add_subparsers(self, **kwargs):
if sys.version_info[:2] > (3, 6):
kwargs.setdefault("required", True)
sp = super(SpackArgumentParser, self).add_subparsers(**kwargs)
sp = super().add_subparsers(**kwargs)
# This monkey patching is needed for Python 3.6, which supports
# having a required subparser but don't expose the API used above
if sys.version_info[:2] == (3, 6):
@@ -382,7 +380,7 @@ def format_help(self, level="short"):
return self.format_help_sections(level)
else:
# in subparsers, self.prog is, e.g., 'spack install'
return super(SpackArgumentParser, self).format_help()
return super().format_help()
def _check_value(self, action, value):
# converted value must be one of the choices (if specified)
@@ -653,7 +651,7 @@ def _invoke_command(command, parser, args, unknown_args):
return 0 if return_val is None else return_val
class SpackCommand(object):
class SpackCommand:
"""Callable object that invokes a spack command (for testing).
Example usage::
@@ -857,6 +855,23 @@ def shell_set(var, value):
shell_set("_sp_module_prefix", "not_installed")
def restore_macos_dyld_vars():
"""
Spack mutates DYLD_* variables in `spack load` and `spack env activate`.
Unlike Linux, macOS SIP clears these variables in new processes, meaning
that os.environ["DYLD_*"] in our Python process is not the same as the user's
shell. Therefore, we store the user's DYLD_* variables in SPACK_DYLD_* and
restore them here.
"""
if not sys.platform == "darwin":
return
for dyld_var in ("DYLD_LIBRARY_PATH", "DYLD_FALLBACK_LIBRARY_PATH"):
stored_var_name = f"SPACK_{dyld_var}"
if stored_var_name in os.environ:
os.environ[dyld_var] = os.environ[stored_var_name]
def _main(argv=None):
"""Logic for the main entry point for the Spack command.
@@ -889,18 +904,6 @@ def _main(argv=None):
parser.add_argument("command", nargs=argparse.REMAINDER)
args, unknown = parser.parse_known_args(argv)
# Recover stored LD_LIBRARY_PATH variables from spack shell function
# This is necessary because MacOS System Integrity Protection clears
# (DY?)LD_LIBRARY_PATH variables on process start.
# Spack clears these variables before building and installing packages,
# but needs to know the prior state for commands like `spack load` and
# `spack env activate that modify the user environment.
recovered_vars = ("LD_LIBRARY_PATH", "DYLD_LIBRARY_PATH", "DYLD_FALLBACK_LIBRARY_PATH")
for var in recovered_vars:
stored_var_name = "SPACK_%s" % var
if stored_var_name in os.environ:
os.environ[var] = os.environ[stored_var_name]
# Just print help and exit if run with no arguments at all
no_args = (len(sys.argv) == 1) if argv is None else (len(argv) == 0)
if no_args:
@@ -923,6 +926,9 @@ def _main(argv=None):
# scopes, then environment configuration here.
# ------------------------------------------------------------------------
# Make spack load / env activate work on macOS
restore_macos_dyld_vars()
# make spack.config aware of any command line configuration scopes
if args.config_scopes:
spack.config.command_line_scopes = args.config_scopes


@@ -62,7 +62,7 @@ def _url_or_path_to_url(url_or_path: str) -> str:
return url_util.path_to_file_url(spack.util.path.canonicalize_path(url_or_path))
class Mirror(object):
class Mirror:
"""Represents a named location for storing source tarballs and binary
packages.
@@ -371,7 +371,7 @@ def _determine_extension(fetcher):
return ext
class MirrorReference(object):
class MirrorReference:
"""A ``MirrorReference`` stores the relative paths where you can store a
package/resource in a mirror directory.
@@ -597,7 +597,7 @@ def remove(name, scope):
tty.msg("Removed mirror %s." % name)
class MirrorStats(object):
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
@@ -693,4 +693,4 @@ class MirrorError(spack.error.SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super(MirrorError, self).__init__(msg, long_msg)
super().__init__(msg, long_msg)


@@ -7,8 +7,6 @@
include Tcl non-hierarchical modules, Lua hierarchical modules, and others.
"""
from __future__ import absolute_import
from .common import disable_modules, ensure_modules_are_enabled_or_warn
from .lmod import LmodModulefileWriter
from .tcl import TclModulefileWriter


@@ -294,7 +294,7 @@ def read_module_indices():
return module_indices
class UpstreamModuleIndex(object):
class UpstreamModuleIndex:
"""This is responsible for taking the individual module indices of all
upstream Spack installations and locating the module for a given spec
based on which upstream install it is located in."""
@@ -388,7 +388,7 @@ def get_module(module_type, spec, get_full_path, module_set_name="default", requ
return writer.layout.use_name
class BaseConfiguration(object):
class BaseConfiguration:
"""Manipulates the information needed to generate a module file to make
querying easier. It needs to be sub-classed for specific module types.
"""
@@ -551,7 +551,7 @@ def verbose(self):
return self.conf.get("verbose")
class BaseFileLayout(object):
class BaseFileLayout:
"""Provides information on the layout of module files. Needs to be
sub-classed for specific module types.
"""
@@ -821,7 +821,7 @@ def ensure_modules_are_enabled_or_warn():
warnings.warn(msg)
class BaseModuleFileWriter(object):
class BaseModuleFileWriter:
def __init__(self, spec, module_set_name, explicit=None):
self.spec = spec


@@ -52,7 +52,7 @@ def __init__(cls, name, bases, attr_dict):
super(MultiMethodMeta, cls).__init__(name, bases, attr_dict)
class SpecMultiMethod(object):
class SpecMultiMethod:
"""This implements a multi-method for Spack specs. Packages are
instantiated with a particular spec, and you may want to
execute different versions of methods based on what the spec
@@ -153,7 +153,7 @@ def __call__(self, package_or_builder_self, *args, **kwargs):
)
class when(object):
class when:
def __init__(self, condition):
"""Can be used both as a decorator, for multimethods, or as a context
manager to group ``when=`` arguments together.
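A hedged sketch of the two usages that docstring describes, as they might appear in a package recipe; the package, variant, and version constraints below are invented for illustration:

    from spack.package import *

    class Example(Package):
        variant("mpi", default=True, description="Build with MPI support")

        # context-manager form: group several directives under one condition
        with when("+mpi"):
            depends_on("mpi")
            depends_on("hdf5+mpi")

        # decorator form: pick a method implementation based on the spec
        @when("@:1.9")
        def install(self, spec, prefix):
            make("install", "PREFIX={0}".format(prefix))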
@@ -275,14 +275,14 @@ class MultiMethodError(spack.error.SpackError):
"""Superclass for multimethod dispatch errors"""
def __init__(self, message):
super(MultiMethodError, self).__init__(message)
super().__init__(message)
class NoSuchMethodError(spack.error.SpackError):
"""Raised when we can't find a version of a multi-method."""
def __init__(self, cls, method_name, spec, possible_specs):
super(NoSuchMethodError, self).__init__(
super().__init__(
"Package %s does not support %s called with %s. Options are: %s"
% (cls.__name__, method_name, spec, ", ".join(str(s) for s in possible_specs))
)


@@ -8,7 +8,7 @@
@llnl.util.lang.lazy_lexicographic_ordering
class OperatingSystem(object):
class OperatingSystem:
"""Base class for all the Operating Systems.
On a multiple architecture machine, the architecture spec field can be set to


@@ -86,9 +86,9 @@ def __init__(self):
# distro.linux_distribution, while still calling __init__
# methods further up the MRO, we skip LinuxDistro in the MRO and
# call the OperatingSystem superclass __init__ method
super(LinuxDistro, self).__init__(name, version)
super().__init__(name, version)
else:
super(CrayBackend, self).__init__()
super().__init__()
self.modulecmd = module
def __str__(self):


@@ -67,4 +67,4 @@ def __init__(self):
else:
version = version[0]
super(LinuxDistro, self).__init__(distname, version)
super().__init__(distname, version)


@@ -152,7 +152,7 @@ def __init__(self):
mac_ver = str(version.up_to(part))
name = mac_releases.get(mac_ver, "macos")
super(MacOs, self).__init__(name, mac_ver)
super().__init__(name, mac_ver)
def __str__(self):
return self.name


@@ -72,7 +72,7 @@ def __init__(self):
plat_ver = windows_version()
if plat_ver < Version("10"):
raise SpackError("Spack is not supported on Windows versions older than 10")
super(WindowsOs, self).__init__("windows{}".format(plat_ver), plat_ver)
super().__init__("windows{}".format(plat_ver), plat_ver)
def __str__(self):
return self.name


@@ -125,7 +125,7 @@ def preferred_version(pkg):
return max(pkg.versions, key=key_fn)
class WindowsRPath(object):
class WindowsRPath:
"""Collection of functionality surrounding Windows RPATH specific features
This is essentially meaningless for all other platforms
@@ -175,7 +175,7 @@ def windows_establish_runtime_linkage(self):
detectable_packages = collections.defaultdict(list)
class DetectablePackageMeta(object):
class DetectablePackageMeta:
"""Check if a package is detectable and add default implementations
for the detection function.
"""
@@ -365,7 +365,7 @@ def _wrapper(instance, *args, **kwargs):
return _execute_under_condition
class PackageViewMixin(object):
class PackageViewMixin:
"""This collects all functionality related to adding installed Spack
package to views. Packages can customize how they are added to views by
overriding these functions.
@@ -668,7 +668,7 @@ def __init__(self, spec):
pkg_cls = spack.repo.path.get_pkg_class(self.extendee_spec.name)
pkg_cls(self.extendee_spec)._check_extendable()
super(PackageBase, self).__init__()
super().__init__()
@classmethod
def possible_dependencies(
@@ -2511,7 +2511,7 @@ class PackageStillNeededError(InstallError):
"""Raised when package is still needed by another on uninstall."""
def __init__(self, spec, dependents):
super(PackageStillNeededError, self).__init__("Cannot uninstall %s" % spec)
super().__init__("Cannot uninstall %s" % spec)
self.spec = spec
self.dependents = dependents
@@ -2520,14 +2520,14 @@ class PackageError(spack.error.SpackError):
"""Raised when something is wrong with a package definition."""
def __init__(self, message, long_msg=None):
super(PackageError, self).__init__(message, long_msg)
super().__init__(message, long_msg)
class NoURLError(PackageError):
"""Raised when someone tries to build a URL for a package with no URLs."""
def __init__(self, cls):
super(NoURLError, self).__init__("Package %s has no version with a URL." % cls.__name__)
super().__init__("Package %s has no version with a URL." % cls.__name__)
class InvalidPackageOpError(PackageError):
@@ -2542,13 +2542,11 @@ class ActivationError(ExtensionError):
"""Raised when there are problems activating an extension."""
def __init__(self, msg, long_msg=None):
super(ActivationError, self).__init__(msg, long_msg)
super().__init__(msg, long_msg)
class DependencyConflictError(spack.error.SpackError):
"""Raised when the dependencies cannot be flattened as asked for."""
def __init__(self, conflict):
super(DependencyConflictError, self).__init__(
"%s conflicts with another file in the flattened directory." % (conflict)
)
super().__init__("%s conflicts with another file in the flattened directory." % (conflict))


@@ -19,7 +19,7 @@ def _spec_type(component):
     return _lesser_spec_types.get(component, spack.spec.Spec)
 
 
-class PackagePrefs(object):
+class PackagePrefs:
     """Defines the sort order for a set of specs.
 
     Spack's package preference implementation uses PackagePrefss to
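Several hunks, like the one above, also drop the explicit (object) base class. In Python 3 every class is already a new-style class rooted at object, so the two spellings declare the same thing. A throwaway check with hypothetical class names, not Spack code:

class Explicit(object):
    pass


class Implicit:
    pass


# Both method resolution orders end at object either way.
assert Explicit.__mro__ == (Explicit, object)
assert Implicit.__mro__ == (Implicit, object)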

View File

@@ -51,7 +51,7 @@ def apply_patch(stage, patch_path, level=1, working_dir="."):
     patch("-s", "-p", str(level), "-i", patch_path, "-d", working_dir)
 
 
-class Patch(object):
+class Patch:
     """Base class for patches.
 
     Arguments:
@@ -152,7 +152,7 @@ def __init__(self, pkg, relative_path, level, working_dir, ordering_key=None):
             msg += "package %s.%s does not exist." % (pkg.namespace, pkg.name)
             raise ValueError(msg)
 
-        super(FilePatch, self).__init__(pkg, abs_path, level, working_dir)
+        super().__init__(pkg, abs_path, level, working_dir)
         self.path = abs_path
         self._sha256 = None
         self.ordering_key = ordering_key
@@ -164,9 +164,7 @@ def sha256(self):
         return self._sha256
 
     def to_dict(self):
-        return llnl.util.lang.union_dicts(
-            super(FilePatch, self).to_dict(), {"relative_path": self.relative_path}
-        )
+        return llnl.util.lang.union_dicts(super().to_dict(), {"relative_path": self.relative_path})
 
 
 class UrlPatch(Patch):
@@ -181,7 +179,7 @@ class UrlPatch(Patch):
     """
 
     def __init__(self, pkg, url, level=1, working_dir=".", ordering_key=None, **kwargs):
-        super(UrlPatch, self).__init__(pkg, url, level, working_dir)
+        super().__init__(pkg, url, level, working_dir)
         self.url = url
         self._stage = None
@@ -264,7 +262,7 @@ def clean(self):
         self.stage.destroy()
 
     def to_dict(self):
-        data = super(UrlPatch, self).to_dict()
+        data = super().to_dict()
         data["url"] = self.url
         if self.archive_sha256:
             data["archive_sha256"] = self.archive_sha256
@@ -310,7 +308,7 @@ def from_dict(dictionary, repository=None):
         raise ValueError("Invalid patch dictionary: %s" % dictionary)
 
 
-class PatchCache(object):
+class PatchCache:
     """Index of patches used in a repository, by sha256 hash.
 
     This allows us to look up patches without loading all packages. It's
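The FilePatch.to_dict and UrlPatch.to_dict hunks above keep the existing override-and-merge pattern; only the super call syntax changes. A rough sketch of that shape, using a plain dict copy in place of Spack's llnl.util.lang.union_dicts helper and trimmed-down constructors that are illustrative rather than the real Patch API:

class Patch:
    def __init__(self, level, working_dir):
        self.level = level
        self.working_dir = working_dir

    def to_dict(self):
        return {"level": self.level, "working_dir": self.working_dir}


class FilePatch(Patch):
    def __init__(self, level, working_dir, relative_path):
        super().__init__(level, working_dir)
        self.relative_path = relative_path

    def to_dict(self):
        # Start from the parent's serialized form and add the subclass-specific
        # key, mirroring union_dicts(super().to_dict(), {...}) in the hunk above.
        data = dict(super().to_dict())
        data["relative_path"] = self.relative_path
        return data


assert FilePatch(1, ".", "fix.patch").to_dict() == {
    "level": 1,
    "working_dir": ".",
    "relative_path": "fix.patch",
}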

View File

@@ -35,7 +35,7 @@
 host = _host
 
 
-class _PickleableCallable(object):
+class _PickleableCallable:
     """Class used to pickle a callable that may substitute either
     _platform or _all_platforms. Lambda or nested functions are
     not pickleable.

View File

@@ -12,11 +12,11 @@
 class NoPlatformError(spack.error.SpackError):
     def __init__(self):
         msg = "Could not determine a platform for this machine"
-        super(NoPlatformError, self).__init__(msg)
+        super().__init__(msg)
 
 
 @llnl.util.lang.lazy_lexicographic_ordering
-class Platform(object):
+class Platform:
     """Platform is an abstract class extended by subclasses.
 
     To add a new type of platform (such as cray_xe), create a subclass and set all the

View File

@@ -59,7 +59,7 @@ def __init__(self):
         configuration file "targets.yaml" with keys 'front_end', 'back_end'
         scanning /etc/bash/bashrc.local for back_end only
         """
-        super(Cray, self).__init__("cray")
+        super().__init__("cray")
 
         # Make all craype targets available.
         for target in self._avail_targets():

View File

@@ -20,7 +20,7 @@ class Darwin(Platform):
     binary_formats = ["macho"]
 
     def __init__(self):
-        super(Darwin, self).__init__("darwin")
+        super().__init__("darwin")
 
         for name in archspec.cpu.TARGETS:
             self.add_target(name, spack.target.Target(name))

View File

@@ -16,7 +16,7 @@ class Linux(Platform):
     priority = 90
 
     def __init__(self):
-        super(Linux, self).__init__("linux")
+        super().__init__("linux")
 
         for name in archspec.cpu.TARGETS:
             self.add_target(name, spack.target.Target(name))

View File

@@ -31,7 +31,7 @@ class Test(Platform):
     def __init__(self, name=None):
         name = name or "test"
-        super(Test, self).__init__(name)
+        super().__init__(name)
         self.add_target(self.default, spack.target.Target(self.default))
         self.add_target(self.front_end, spack.target.Target(self.front_end))

Some files were not shown because too many files have changed in this diff.