Compare commits
13 Commits
hs/fix/ven ... psakiev/ca
| Author | SHA1 | Date |
|---|---|---|
| | 5986173e5b | |
| | 9e942bb3a3 | |
| | 815cb4f5b2 | |
| | 944b3dad3f | |
| | 9fe2796e9d | |
| | 200191cc3d | |
| | 6a48121ed7 | |
| | 9c03f15cbd | |
| | 6eda1b4d04 | |
| | 0240120d4f | |
| | 88d7249141 | |
| | 8d9af73d83 | |
| | 6fb1ded7c3 | |
**.github/workflows/import-check.yaml** (vendored, 1 change)
```diff
@@ -6,7 +6,6 @@ on:
 jobs:
   # Check we don't make the situation with circular imports worse
   import-check:
-    continue-on-error: true
     runs-on: ubuntu-latest
     steps:
       - uses: julia-actions/setup-julia@v2
```
**.github/workflows/sync-packages.yaml** (vendored, 5 changes)
```diff
@@ -27,10 +27,7 @@ jobs:
       - name: Sync spack/spack-packages with spack/spack
         run: |
           cd spack-packages
-          git-filter-repo --quiet --source ../spack \
-            --path var/spack/repos/ --path-rename var/spack/repos/:python/ \
-            --path share/spack/gitlab/cloud_pipelines/ --path-rename share/spack/gitlab/cloud_pipelines/:.ci/gitlab/ \
-            --refs develop
+          git-filter-repo --quiet --source ../spack --subdirectory-filter var/spack/repos --refs develop
       - name: Push
         run: |
           cd spack-packages
```
```diff
@@ -276,7 +276,7 @@ remove dependent packages *before* removing their dependencies or use the
 Garbage collection
 ^^^^^^^^^^^^^^^^^^
 
-When Spack builds software from sources, it often installs tools that are needed
+When Spack builds software from sources, if often installs tools that are needed
 just to build or test other software. These are not necessary at runtime.
 To support cases where removing these tools can be a benefit Spack provides
 the ``spack gc`` ("garbage collector") command, which will uninstall all unneeded packages:
```
```diff
@@ -89,7 +89,7 @@ You can see that the mirror is added with ``spack mirror list`` as follows:
     spack-public    https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/
 
 
-At this point, you've created a buildcache, but Spack hasn't indexed it, so if
+At this point, you've create a buildcache, but spack hasn't indexed it, so if
 you run ``spack buildcache list`` you won't see any results. You need to index
 this new build cache as follows:
 
```
```diff
@@ -318,7 +318,7 @@ other system dependencies. However, they are still compatible with tools like
 ``skopeo``, ``podman``, and ``docker`` for pulling and pushing.
 
 .. note::
 
-    The Docker ``overlayfs2`` storage driver is limited to 128 layers, above which a
+    The docker ``overlayfs2`` storage driver is limited to 128 layers, above which a
     ``max depth exceeded`` error may be produced when pulling the image. There
     are `alternative drivers <https://docs.docker.com/storage/storagedriver/>`_.
 
```
```diff
@@ -14,7 +14,7 @@ is an entire command dedicated to the management of every aspect of bootstrappin
 
 .. command-output:: spack bootstrap --help
 
-Spack is configured to bootstrap its dependencies lazily by default; i.e., the first time they are needed and
+Spack is configured to bootstrap its dependencies lazily by default; i.e. the first time they are needed and
 can't be found. You can readily check if any prerequisite for using Spack is missing by running:
 
 .. code-block:: console
```
```diff
@@ -36,8 +36,8 @@ can't be found. You can readily check if any prerequisite for using Spack is mis
 
 In the case of the output shown above Spack detected that both ``clingo`` and ``gnupg``
 are missing and it's giving detailed information on why they are needed and whether
-they can be bootstrapped. The return code of this command summarizes the results; if any
-dependencies are missing, the return code is ``1``, otherwise ``0``. Running a command that
+they can be bootstrapped. The return code of this command summarizes the results, if any
+dependencies are missing the return code is ``1``, otherwise ``0``. Running a command that
 concretizes a spec, like:
 
 .. code-block:: console
```
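The return-code contract described in this hunk is simple enough to sketch. The helper below is purely illustrative (it is not Spack code); it just mirrors the documented semantics:

```python
def bootstrap_status_code(missing_dependencies):
    # Mirrors the documented semantics: return code 1 if any
    # prerequisite is missing, 0 otherwise (illustrative helper).
    return 1 if missing_dependencies else 0

print(bootstrap_status_code(["clingo", "gnupg"]))  # 1
print(bootstrap_status_code([]))                   # 0
```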
```diff
@@ -66,7 +66,7 @@ on these ideas for each distinct build system that Spack supports:
    build_systems/rocmpackage
    build_systems/sourceforgepackage
 
-For reference, the :py:mod:`Build System API docs <spack_repo.builtin.build_systems>`
+For reference, the :py:mod:`Build System API docs <spack.build_systems>`
 provide a list of build systems and methods/attributes that can be
 overridden. If you are curious about the implementation of a particular
 build system, you can view the source code by running:
```
```diff
@@ -90,7 +90,7 @@ packages. You can quickly find examples by running:
 You can then view these packages with ``spack edit``.
 
 This guide is intended to supplement the
-:py:mod:`Build System API docs <spack_repo.builtin.build_systems>` with examples of
+:py:mod:`Build System API docs <spack.build_systems>` with examples of
 how to override commonly used methods. It also provides rules of thumb
 and suggestions for package developers who are unfamiliar with a
 particular build system.
```
```diff
@@ -129,8 +129,8 @@ Adding flags to cmake
 To add additional flags to the ``cmake`` call, simply override the
 ``cmake_args`` function. The following example defines values for the flags
 ``WHATEVER``, ``ENABLE_BROKEN_FEATURE``, ``DETECT_HDF5``, and ``THREADS`` with
-and without the :meth:`~spack_repo.builtin.build_systems.cmake.CMakeBuilder.define` and
-:meth:`~spack_repo.builtin.build_systems.cmake.CMakeBuilder.define_from_variant` helper functions:
+and without the :meth:`~spack.build_systems.cmake.CMakeBuilder.define` and
+:meth:`~spack.build_systems.cmake.CMakeBuilder.define_from_variant` helper functions:
 
 .. code-block:: python
 
```
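As a rough illustration of what the ``define`` helper named in this hunk produces, here is a minimal sketch of the value-to-flag conversion. The behavior shown (booleans become ``ON``/``OFF``, lists join with semicolons) is an assumption for illustration, not Spack's actual implementation:

```python
def define(cmake_var, value):
    # Convert a Python value into a CMake -D command-line flag.
    if isinstance(value, bool):
        value = "ON" if value else "OFF"
    elif isinstance(value, (list, tuple)):
        value = ";".join(str(v) for v in value)
    return f"-D{cmake_var}={value}"

print(define("ENABLE_BROKEN_FEATURE", False))  # -DENABLE_BROKEN_FEATURE=OFF
print(define("THREADS", True))                 # -DTHREADS=ON
```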
```diff
@@ -35,8 +35,8 @@
 if not os.path.exists(link_name):
     os.symlink(os.path.abspath("../../.."), link_name, target_is_directory=True)
 sys.path.insert(0, os.path.abspath("_spack_root/lib/spack/external"))
+sys.path.insert(0, os.path.abspath("_spack_root/lib/spack/external/_vendoring"))
 sys.path.append(os.path.abspath("_spack_root/lib/spack/"))
-sys.path.append(os.path.abspath("_spack_root/var/spack/repos/"))
 
 # Add the Spack bin directory to the path so that we can use its output in docs.
 os.environ["SPACK_ROOT"] = os.path.abspath("_spack_root")
```
```diff
@@ -76,20 +76,11 @@
         apidoc_args
         + [
             "_spack_root/lib/spack/spack",
-            "_spack_root/lib/spack/spack/package.py",  # sphinx struggles with os.chdir re-export.
             "_spack_root/lib/spack/spack/test/*.py",
             "_spack_root/lib/spack/spack/test/cmd/*.py",
         ]
     )
     sphinx_apidoc(apidoc_args + ["_spack_root/lib/spack/llnl"])
-    sphinx_apidoc(
-        apidoc_args
-        + [
-            "--implicit-namespaces",
-            "_spack_root/var/spack/repos/spack_repo",
-            "_spack_root/var/spack/repos/spack_repo/builtin/packages",
-        ]
-    )
 
 # Enable todo items
 todo_include_todos = True
```
```diff
@@ -218,7 +209,7 @@ def setup(sphinx):
     # Spack classes that are private and we don't want to expose
     ("py:class", "spack.provider_index._IndexBase"),
     ("py:class", "spack.repo._PrependFileLoader"),
-    ("py:class", "spack_repo.builtin.build_systems._checks.BuilderWithDefaults"),
+    ("py:class", "spack.build_systems._checks.BuilderWithDefaults"),
     # Spack classes that intersphinx is unable to resolve
     ("py:class", "spack.version.StandardVersion"),
     ("py:class", "spack.spec.DependencySpec"),
```
```diff
@@ -228,7 +219,7 @@ def setup(sphinx):
     ("py:class", "spack.install_test.Pb"),
     ("py:class", "spack.filesystem_view.SimpleFilesystemView"),
     ("py:class", "spack.traverse.EdgeAndDepth"),
-    ("py:class", "_vendoring.archspec.cpu.microarchitecture.Microarchitecture"),
+    ("py:class", "archspec.cpu.microarchitecture.Microarchitecture"),
     ("py:class", "spack.compiler.CompilerCache"),
     # TypeVar that is not handled correctly
     ("py:class", "llnl.util.lang.T"),
```
```diff
@@ -148,8 +148,8 @@ this can expose you to attacks. Use at your own risk.
 ``ssl_certs``
 --------------------
 
-Path to custom certificates for SSL verification. The value can be a
-filesystem path, or an environment variable that expands to an absolute file path.
+Path to custom certificats for SSL verification. The value can be a
+filesytem path, or an environment variable that expands to an absolute file path.
 The default value is set to the environment variable ``SSL_CERT_FILE``
 to use the same syntax used by many other applications that automatically
 detect custom certificates.
```
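Put concretely, the ``ssl_certs`` setting this hunk describes lives in ``config.yaml``. Both forms below (a filesystem path, or an environment variable that expands to one) are sketched from the prose above; the certificate path is a placeholder:

```yaml
config:
  # Either a filesystem path...
  ssl_certs: /etc/ssl/certs/ca-certificates.crt
  # ...or an environment variable that expands to an absolute file path:
  # ssl_certs: $SSL_CERT_FILE
```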
```diff
@@ -11,7 +11,7 @@ Container Images
 Spack :ref:`environments` can easily be turned into container images. This page
 outlines two ways in which this can be done:
 
-1. By installing the environment on the host system and copying the installations
+1. By installing the environment on the host system, and copying the installations
    into the container image. This approach does not require any tools like Docker
    or Singularity to be installed.
 2. By generating a Docker or Singularity recipe that can be used to build the
```
```diff
@@ -56,8 +56,8 @@ environment roots and its runtime dependencies.
 
 .. note::
 
-    When using registries like GHCR and Docker Hub, the ``--oci-password`` flag specifies not
-    the password for your account, but rather a personal access token you need to generate separately.
+    When using registries like GHCR and Docker Hub, the ``--oci-password`` flag is not
+    the password for your account, but a personal access token you need to generate separately.
 
 The specified ``--base-image`` should have a libc that is compatible with the host system.
 For example if your host system is Ubuntu 20.04, you can use ``ubuntu:20.04``, ``ubuntu:22.04``
```
```diff
@@ -539,9 +539,7 @@ from the command line.
 
 You can also include an environment directly in the ``spack.yaml`` file. It
 involves adding the ``include_concrete`` heading in the yaml followed by the
-absolute path to the independent environments. Note, that you may use Spack
-config variables such as ``$spack`` or environment variables as long as the
-expression expands to an absolute path.
+absolute path to the independent environments.
 
 .. code-block:: yaml
 
```
```diff
@@ -551,7 +549,7 @@ expression expands to an absolute path.
       unify: true
   include_concrete:
   - /absolute/path/to/environment1
-  - $spack/../path/to/environment2
+  - /absolute/path/to/environment2
 
 
 Once the ``spack.yaml`` has been updated you must concretize the environment to
```
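Assembled from the two ``include_concrete`` hunks above, a complete ``spack.yaml`` fragment would look roughly like this (the paths are placeholders):

```yaml
spack:
  concretizer:
    unify: true
  include_concrete:
  - /absolute/path/to/environment1
  - /absolute/path/to/environment2
```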
```diff
@@ -20,7 +20,7 @@ be present on the machine where Spack is run:
    :header-rows: 1
 
 These requirements can be easily installed on most modern Linux systems;
-on macOS, the Command Line Tools package is required, and a full Xcode suite
+on macOS, the Command Line Tools package is required, and a full XCode suite
 may be necessary for some packages such as Qt and apple-gl. Spack is designed
 to run on HPC platforms like Cray. Not all packages should be expected
 to work on all platforms.
```
```diff
@@ -103,7 +103,6 @@ or refer to the full manual below.
    :caption: API Docs
 
    Spack API Docs <spack>
-   Spack Builtin Repo <spack_repo>
    LLNL API Docs <llnl>
 
 ==================
```
```diff
@@ -120,6 +120,16 @@ what it looks like:
 Once this is done, you can tar up the ``spack-mirror-2014-06-24`` directory and
 copy it over to the machine you want it hosted on.
 
+Customization of the mirror contents can be done by selectively excluding
+specs using the ``--exclude-file`` or ``--exclude-specs`` flags with
+``spack mirror create``. Note that these only apply to source mirrors.
+
+You may additionally add an ``exclude`` or ``include``
+section to the ``mirrors`` configuration section for pushing to binary mirrors.
+These are lists of abstract or concrete specs to configure what gets pushed to your mirror.
+If overlapping inclusion and exclusions are applied then inclusion is preferred.
+
 ^^^^^^^^^^^^^^^^^^^
 Custom package sets
 ^^^^^^^^^^^^^^^^^^^
```
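The ``include``/``exclude`` keys added in this hunk would sit in the ``mirrors`` configuration; the exact placement under a named mirror entry, and the spec names, are assumptions for illustration:

```yaml
mirrors:
  my-binary-mirror:
    url: https://example.com/my-mirror
    binary: true
    include:
    - gcc
    exclude:
    - "mpich@3:"
```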
```diff
@@ -8,7 +8,7 @@
 Modules (modules.yaml)
 ======================
 
-The use of module systems to manage user environments in a controlled way
+The use of module systems to manage user environment in a controlled way
 is a common practice at HPC centers that is sometimes embraced also by
 individual programmers on their development machines. To support this
 common practice Spack integrates with `Environment Modules
```
```diff
@@ -490,7 +490,7 @@ that are already in the Lmod hierarchy.
 
 
 .. note::
-    Tcl and Lua modules also allow for explicit conflicts between module files.
+    Tcl and Lua modules also allow for explicit conflicts between modulefiles.
 
     .. code-block:: yaml
 
```
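The yaml block referenced in this hunk is not shown in the extract; a typical conflict configuration, sketched from the surrounding prose (key placement assumed), looks like:

```yaml
modules:
  default:
    tcl:
      all:
        conflict:
        - "{name}"
```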
```diff
@@ -513,7 +513,7 @@ that are already in the Lmod hierarchy.
 :meth:`~spack.spec.Spec.format` method.
 
 For Lmod and Environment Modules versions prior 4.2, it is important to
-express the conflict on both module files conflicting with each other.
+express the conflict on both modulefiles conflicting with each other.
 
 
 .. note::
```
```diff
@@ -550,7 +550,7 @@ that are already in the Lmod hierarchy.
 
 .. warning::
     Consistency of Core packages
-    The user is responsible for maintaining consistency among core packages, as ``core_specs``
+    The user is responsible for maintining consistency among core packages, as ``core_specs``
     bypasses the hierarchy that allows Lmod to safely switch between coherent software stacks.
 
 .. warning::
```
```diff
@@ -69,7 +69,7 @@ An example for ``CMake`` is, for instance:
 
 The predefined steps for each build system are called "phases".
 In general, the name and order in which the phases will be executed can be
-obtained by either reading the API docs at :py:mod:`~.spack_repo.builtin.build_systems`, or
+obtained by either reading the API docs at :py:mod:`~.spack.build_systems`, or
 using the ``spack info`` command:
 
 .. code-block:: console
```
```diff
@@ -158,7 +158,7 @@ builder class explicitly. Using the same example as above, this reads:
         url_fmt = "https://github.com/uclouvain/openjpeg/archive/version.{0}.tar.gz"
         return url_fmt.format(version)
 
-class CMakeBuilder(spack_repo.builtin.build_systems.cmake.CMakeBuilder):
+class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
     def cmake_args(self):
         args = [
             self.define_from_variant("BUILD_CODEC", "codec"),
```
```diff
@@ -179,7 +179,7 @@ Spack can be found at :ref:`package_class_structure`.
 
 .. code-block:: python
 
-    class Foo(CMakePackage):
+    class Foo(CmakePackage):
         def cmake_args(self):
             ...
 
```
```diff
@@ -256,7 +256,7 @@ for details):
 #
 # See the Spack documentation for more information on packaging.
 # ----------------------------------------------------------------------------
-import spack_repo.builtin.build_systems.autotools
+import spack.build_systems.autotools
 
 from spack.package import *
 
```
```diff
@@ -497,7 +497,7 @@ extends Spack's ``Package`` class. For example, here is
 .. code-block:: python
    :linenos:
 
-   from spack.package import *
+   from spack import *
 
    class Libelf(Package):
       """ ... description ... """
```
```diff
@@ -1089,7 +1089,7 @@ You've already seen the ``homepage`` and ``url`` package attributes:
 .. code-block:: python
    :linenos:
 
-   from spack.package import *
+   from spack import *
 
 
    class Mpich(Package):
```
```diff
@@ -1212,7 +1212,7 @@ class-level tarball URL and VCS. For example:
     version("master", branch="master")
     version("12.12.1", md5="ecd4606fa332212433c98bf950a69cc7")
     version("12.10.1", md5="667333dbd7c0f031d47d7c5511fd0810")
-    version("12.8.1", md5="9f37f683ee2b427b5540db8a20ed6b15")
+    version("12.8.1", "9f37f683ee2b427b5540db8a20ed6b15")
 
 If a package contains both a ``url`` and ``git`` class-level attribute,
 Spack decides which to use based on the arguments to the ``version()``
```
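The dispatch rule stated in the trailing context lines ("Spack decides which to use based on the arguments to the ``version()``" directive) can be sketched as follows. The helper and the key sets are illustrative assumptions, not Spack's API:

```python
GIT_KEYS = {"branch", "tag", "commit"}
URL_KEYS = {"md5", "sha256", "checksum"}

def fetch_strategy(**version_kwargs):
    # Git-style reference arguments select the VCS fetcher; checksum
    # arguments select the tarball URL fetcher (illustrative sketch).
    if GIT_KEYS & version_kwargs.keys():
        return "git"
    if URL_KEYS & version_kwargs.keys():
        return "url"
    return "unknown"

print(fetch_strategy(branch="master"))                         # git
print(fetch_strategy(md5="ecd4606fa332212433c98bf950a69cc7"))  # url
```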
```diff
@@ -1343,7 +1343,7 @@ Submodules
 
     version("1.0.1", tag="v1.0.1", submodules=True)
 
-If a package needs more fine-grained control over submodules, define
+If a package has needs more fine-grained control over submodules, define
 ``submodules`` to be a callable function that takes the package instance as
 its only argument. The function should return a list of submodules to be fetched.
 
```
```diff
@@ -2253,15 +2253,22 @@ RPATHs in Spack are handled in one of three ways:
    set in standard variables like ``CC``, ``CXX``, ``F77``, and ``FC``,
    so most build systems (autotools and many gmake systems) pick them
    up and use them.
-#. CMake has its own RPATH handling, and distinguishes between build and
-   install RPATHs. By default, during the build it registers RPATHs to
-   all libraries it links to, so that just-built executables can be run
-   during the build itself. Upon installation, these RPATHs are cleared,
-   unless the user defines the install RPATHs. When inheriting from
-   ``CMakePackage``, Spack handles this automatically, and sets
-   ``CMAKE_INSTALL_RPATH_USE_LINK_PATH`` and ``CMAKE_INSTALL_RPATH``,
-   so that libraries of dependencies and the package's own libraries
-   can be found at runtime.
+#. CMake also respects Spack's compiler wrappers, but many CMake
+   builds have logic to overwrite RPATHs when binaries are
+   installed. Spack provides the ``std_cmake_args`` variable, which
+   includes parameters necessary for CMake build use the right
+   installation RPATH. It can be used like this when ``cmake`` is
+   invoked:
+
+   .. code-block:: python
+
+      class MyPackage(Package):
+          ...
+          def install(self, spec, prefix):
+              cmake("..", *std_cmake_args)
+              make()
+              make("install")
 
 #. If you need to modify the build to add your own RPATHs, you can
    use the ``self.rpath`` property of your package, which will
    return a list of all the RPATHs that Spack will use when it
```
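Both sides of this hunk are ultimately about getting RPATH entries onto the link line. A generic sketch of how a list of RPATH directories becomes linker flags (illustrative only, not the implementation of Spack's ``self.rpath`` or ``std_cmake_args``):

```python
def rpath_flags(rpath_dirs):
    # One -Wl,-rpath,<dir> flag per directory, as a GCC-style
    # compiler driver expects when forwarding options to the linker.
    return [f"-Wl,-rpath,{d}" for d in rpath_dirs]

print(rpath_flags(["/opt/example/lib", "/opt/example/lib64"]))
```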
```diff
@@ -2308,19 +2315,31 @@ looks like this:
 
        parallel = False
 
-You can also disable parallel builds only for specific make
-invocation:
+Similarly, you can disable parallel builds only for specific make
+commands, as ``libdwarf`` does:
 
 .. code-block:: python
-   :emphasize-lines: 5
+   :emphasize-lines: 9, 12
    :linenos:
 
    class Libelf(Package):
       ...
 
      def install(self, spec, prefix):
+         configure("--prefix=" + prefix,
+                   "--enable-shared",
+                   "--disable-dependency-tracking",
+                   "--disable-debug")
+         make()
+
+         # The mkdir commands in libelf's install can fail in parallel
          make("install", parallel=False)
 
+The first make will run in parallel here, but the second will not. If
+you set ``parallel`` to ``False`` at the package level, then each call
+to ``make()`` will be sequential by default, but packagers can call
+``make(parallel=True)`` to override it.
+
 Note that the ``--jobs`` option works out of the box for all standard
 build systems. If you are using a non-standard build system instead, you
 can use the variable ``make_jobs`` to extract the number of jobs specified
```
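The override behavior described in the added paragraph ("each call to ``make()`` will be sequential by default, but packagers can call ``make(parallel=True)``") can be sketched as a small resolution rule; the helper and the default job count are illustrative assumptions:

```python
def effective_jobs(package_parallel, call_parallel=None, make_jobs=8):
    # A per-call parallel= argument overrides the package-level default;
    # returns the -j value a build would use (illustrative helper).
    parallel = package_parallel if call_parallel is None else call_parallel
    return make_jobs if parallel else 1

print(effective_jobs(False))                       # 1
print(effective_jobs(False, call_parallel=True))   # 8
print(effective_jobs(True, call_parallel=False))   # 1
```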
```diff
@@ -2495,7 +2514,7 @@ necessary when there are breaking changes in the dependency that the
 package cannot handle. In Spack we often add forward compatibility
 bounds only at the time a new, breaking version of a dependency is
 released. As with backward compatibility, it is typical to see a list
-of forward compatibility bounds in a package file as separate lines:
+of forward compatibility bounds in a package file as seperate lines:
 
 .. code-block:: python
 
```
```diff
@@ -3371,7 +3390,7 @@ the above attribute implementations:
         "/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib/libFooBaz.so"
     ])
 
-    # baz library directories in the baz subdirectory of the foo prefix
+    # baz library directories in the baz subdirectory of the foo porefix
     >>> spec["baz"].libs.directories
     [
         "/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib"
```
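The relationship between the two console snippets in this hunk (the ``.libs`` file list versus ``.libs.directories``) is parent-directory deduplication; a self-contained sketch, assuming that is all ``directories`` does:

```python
import os

def lib_directories(lib_files):
    # Deduplicated parent directories of the library files, in order.
    seen = []
    for path in lib_files:
        parent = os.path.dirname(path)
        if parent not in seen:
            seen.append(parent)
    return seen

print(lib_directories([
    "/opt/example/baz/lib/libFooBaz.so",
    "/opt/example/baz/lib/libFooBar.so",
]))  # ['/opt/example/baz/lib']
```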
@@ -3685,57 +3704,60 @@ the build system. The build systems currently supported by Spack are:
 +----------------------------------------------------------+----------------------------------+
 | **API docs** | **Description** |
 +==========================================================+==================================+
-| :class:`~spack_repo.builtin.build_systems.generic` | Generic build system without any |
+| :class:`~spack.build_systems.generic` | Generic build system without any |
 | | base implementation |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.makefile` | Specialized build system for |
+| :class:`~spack.build_systems.makefile` | Specialized build system for |
 | | software built invoking |
 | | hand-written Makefiles |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.autotools` | Specialized build system for |
+| :class:`~spack.build_systems.autotools` | Specialized build system for |
 | | software built using |
 | | GNU Autotools |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.cmake` | Specialized build system for |
+| :class:`~spack.build_systems.cmake` | Specialized build system for |
 | | software built using CMake |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.maven` | Specialized build system for |
+| :class:`~spack.build_systems.maven` | Specialized build system for |
 | | software built using Maven |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.meson` | Specialized build system for |
+| :class:`~spack.build_systems.meson` | Specialized build system for |
 | | software built using Meson |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.nmake` | Specialized build system for |
+| :class:`~spack.build_systems.nmake` | Specialized build system for |
 | | software built using NMake |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.qmake` | Specialized build system for |
+| :class:`~spack.build_systems.qmake` | Specialized build system for |
 | | software built using QMake |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.scons` | Specialized build system for |
+| :class:`~spack.build_systems.scons` | Specialized build system for |
 | | software built using SCons |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.waf` | Specialized build system for |
+| :class:`~spack.build_systems.waf` | Specialized build system for |
 | | software built using Waf |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.r` | Specialized build system for |
+| :class:`~spack.build_systems.r` | Specialized build system for |
 | | R extensions |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.octave` | Specialized build system for |
+| :class:`~spack.build_systems.octave` | Specialized build system for |
 | | Octave packages |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.python` | Specialized build system for |
+| :class:`~spack.build_systems.python` | Specialized build system for |
 | | Python extensions |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.perl` | Specialized build system for |
+| :class:`~spack.build_systems.perl` | Specialized build system for |
 | | Perl extensions |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.ruby` | Specialized build system for |
+| :class:`~spack.build_systems.ruby` | Specialized build system for |
 | | Ruby extensions |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.oneapi` | Specialized build system for |
+| :class:`~spack.build_systems.intel` | Specialized build system for |
+| | licensed Intel software |
++----------------------------------------------------------+----------------------------------+
+| :class:`~spack.build_systems.oneapi` | Specialized build system for |
 | | Intel oneAPI software |
 +----------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.aspell_dict` | Specialized build system for |
+| :class:`~spack.build_systems.aspell_dict` | Specialized build system for |
 | | Aspell dictionaries |
 +----------------------------------------------------------+----------------------------------+

@@ -3747,7 +3769,7 @@ the build system. The build systems currently supported by Spack are:
 rare cases where manual intervention is needed we need to stress that a
 package base class depends on the *build system* being used, not the language of the package.
 For example, a Python extension installed with CMake would ``extends("python")`` and
-subclass from :class:`~spack_repo.builtin.build_systems.cmake.CMakePackage`.
+subclass from :class:`~spack.build_systems.cmake.CMakePackage`.

 ^^^^^^^^^^^^^^^^^^^^^^^^^^
 Overriding builder methods
@@ -3755,7 +3777,7 @@ Overriding builder methods

 Build-system "phases" have default implementations that fit most of the common cases:

-.. literalinclude:: _spack_root/var/spack/repos/spack_repo/builtin/build_systems/autotools.py
+.. literalinclude:: _spack_root/lib/spack/spack/build_systems/autotools.py
    :pyobject: AutotoolsBuilder.configure
    :linenos:

@@ -3769,7 +3791,7 @@ configure arguments:

 Each specific build system has a list of attributes and methods that can be overridden to
 fine-tune the installation of a package without overriding an entire phase. To
-have more information on them the place to go is the API docs of the :py:mod:`~.spack_repo.builtin.build_systems`
+have more information on them the place to go is the API docs of the :py:mod:`~.spack.build_systems`
 module.

 ^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -3811,7 +3833,7 @@ If the ``package.py`` has build instructions in a separate

 .. code-block:: python

-   class CMakeBuilder(spack_repo.builtin.build_systems.cmake.CMakeBuilder):
+   class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
        def install(self, pkg, spec, prefix):
            ...

@@ -3824,32 +3846,31 @@ Mixin base classes
 Besides build systems, there are other cases where common metadata and behavior can be extracted
 and reused by many packages. For instance, packages that depend on ``Cuda`` or ``Rocm``, share
 common dependencies and constraints. To factor these attributes into a single place, Spack provides
-a few mixin classes in the ``spack_repo.builtin.build_systems`` module:
+a few mixin classes in the ``spack.build_systems`` module:

-+----------------------------------------------------------------------------+----------------------------------+
-| **API docs** | **Description** |
-+============================================================================+==================================+
-| :class:`~spack_repo.builtin.build_systems.cuda.CudaPackage` | A helper class for packages that |
-| | use CUDA |
-+----------------------------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.rocm.ROCmPackage` | A helper class for packages that |
-| | use ROCm |
-+----------------------------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.gnu.GNUMirrorPackage` | A helper class for GNU packages |
-| | |
-+----------------------------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.python.PythonExtension` | A helper class for Python |
-| | extensions |
-+----------------------------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.sourceforge.SourceforgePackage` | A helper class for packages |
-| | from sourceforge.org |
-+----------------------------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.sourceware.SourcewarePackage` | A helper class for packages |
-| | from sourceware.org |
-+----------------------------------------------------------------------------+----------------------------------+
-| :class:`~spack_repo.builtin.build_systems.xorg.XorgPackage` | A helper class for x.org |
-| | packages |
-+----------------------------------------------------------------------------+----------------------------------+
++---------------------------------------------------------------+----------------------------------+
+| **API docs** | **Description** |
++===============================================================+==================================+
+| :class:`~spack.build_systems.cuda.CudaPackage` | A helper class for packages that |
+| | use CUDA |
++---------------------------------------------------------------+----------------------------------+
+| :class:`~spack.build_systems.rocm.ROCmPackage` | A helper class for packages that |
+| | use ROCm |
++---------------------------------------------------------------+----------------------------------+
+| :class:`~spack.build_systems.gnu.GNUMirrorPackage` | A helper class for GNU packages |
++---------------------------------------------------------------+----------------------------------+
+| :class:`~spack.build_systems.python.PythonExtension` | A helper class for Python |
+| | extensions |
++---------------------------------------------------------------+----------------------------------+
+| :class:`~spack.build_systems.sourceforge.SourceforgePackage` | A helper class for packages |
+| | from sourceforge.org |
++---------------------------------------------------------------+----------------------------------+
+| :class:`~spack.build_systems.sourceware.SourcewarePackage` | A helper class for packages |
+| | from sourceware.org |
++---------------------------------------------------------------+----------------------------------+
+| :class:`~spack.build_systems.xorg.XorgPackage` | A helper class for x.org |
+| | packages |
++---------------------------------------------------------------+----------------------------------+

 These classes should be used by adding them to the inheritance tree of the package that needs them,
 for instance:
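The mixin idea in this hunk (shared metadata factored into a helper base class that packages inherit alongside their build-system base class) can be sketched in plain Python. All class names and dependency strings below are illustrative stand-ins, not Spack's actual API:

```python
# Plain-Python sketch of the mixin pattern used by helpers like CudaPackage:
# shared metadata is declared once in a mixin, and a package opts in simply
# by adding the mixin to its inheritance list. Names are hypothetical.

class BasePackage:
    """Stand-in for a build-system base class."""
    dependencies: list = []


class CudaMixin:
    """Shared constraints for CUDA-dependent packages (made-up values)."""
    extra_dependencies = ["cuda@10:"]


class MyApp(CudaMixin, BasePackage):
    dependencies = ["cmake"]


app = MyApp()
print(app.extra_dependencies)  # inherited from the mixin, not redeclared
```

The design point mirrors the table above: the mixin adds attributes without replacing the build-system base class, because Python's MRO lets both contribute to the final package.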
@@ -3893,13 +3914,13 @@ Additional build instructions are split into separate builder classes:

 .. code-block:: python

-   class CMakeBuilder(spack_repo.builtin.build_systems.cmake.CMakeBuilder):
+   class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
        def cmake_args(self):
            return [
                self.define_from_variant("MY_FEATURE", "my_feature")
            ]

-   class AutotoolsBuilder(spack_repo.builtin.build_systems.autotools.AutotoolsBuilder):
+   class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
        def configure_args(self):
            return self.with_or_without("my-feature", variant="my_feature")

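The split this hunk illustrates — one package's metadata consumed by several per-build-system builder classes — can be mimicked without Spack. Everything below is a hypothetical stand-in (the flag spellings are invented for illustration):

```python
# Sketch of the package/builder split: the package holds metadata (variants),
# while each builder translates that metadata into build-system-specific
# arguments. These classes are illustrative, not Spack's real API.

class MyPackage:
    variants = {"my_feature": True}


class CMakeBuilder:
    def __init__(self, pkg):
        self.pkg = pkg

    def cmake_args(self):
        on = self.pkg.variants["my_feature"]
        return [f"-DMY_FEATURE={'ON' if on else 'OFF'}"]


class AutotoolsBuilder:
    def __init__(self, pkg):
        self.pkg = pkg

    def configure_args(self):
        on = self.pkg.variants["my_feature"]
        return ["--with-my-feature" if on else "--without-my-feature"]


pkg = MyPackage()
print(CMakeBuilder(pkg).cmake_args())          # ['-DMY_FEATURE=ON']
print(AutotoolsBuilder(pkg).configure_args())  # ['--with-my-feature']
```

The benefit is the one the docs describe: a single variant definition drives every build system, and only the translation layer differs.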
@@ -5728,7 +5749,7 @@ running each executable, ``foo`` and ``bar``, as independent test parts.
 .. note::

    The method name ``copy_test_files`` here is for illustration purposes.
-   You are free to use a name that is better suited to your package.
+   You are free to use a name that is more suited to your package.

 The key to copying files for stand-alone testing at build time is use
 of the ``run_after`` directive, which ensures the associated files are
@@ -6162,7 +6183,7 @@ running:

 .. code-block:: python

-   from spack.package import *
+   from spack import *

 This is already part of the boilerplate for packages created with
 ``spack create``.
@@ -7237,7 +7258,7 @@ which are not, there is the `checked_by` parameter in the license directive:

    license("<license>", when="<when>", checked_by="<github username>")

-When you have validated a package license, either when doing so explicitly or
+When you have validated a github license, either when doing so explicitly or
 as part of packaging a new package, please set the `checked_by` parameter
 to your Github username to signal that the license has been manually
 verified.
@@ -214,7 +214,7 @@ package versions, simply run the following commands:

 Running ``spack mark -i --all`` tells Spack to mark all of the existing
 packages within an environment as "implicitly" installed. This tells
-Spack's garbage collection system that these packages should be cleaned up.
+spack's garbage collection system that these packages should be cleaned up.

 Don't worry however, this will not remove your entire environment.
 Running ``spack install`` will reexamine your spack environment after
lib/spack/external/_vendoring/_pyrsistent_version.pyi (vendored, new file)
@@ -0,0 +1 @@
+from _pyrsistent_version import *
lib/spack/external/_vendoring/altgraph.pyi (vendored, new file)
@@ -0,0 +1 @@
+from altgraph import *
lib/spack/external/_vendoring/altgraph/Dot.py (vendored, 10 changes)
@@ -1,8 +1,8 @@
 """
-_vendoring.altgraph.Dot - Interface to the dot language
+altgraph.Dot - Interface to the dot language
 ============================================

-The :py:mod:`~_vendoring.altgraph.Dot` module provides a simple interface to the
+The :py:mod:`~altgraph.Dot` module provides a simple interface to the
 file format used in the
 `graphviz <http://www.research.att.com/sw/tools/graphviz/>`_
 program. The module is intended to offload the most tedious part of the process
@@ -20,7 +20,7 @@

 Here is a typical usage::

-    from _vendoring.altgraph import Graph, Dot
+    from altgraph import Graph, Dot

     # create a graph
     edges = [ (1,2), (1,3), (3,4), (3,5), (4,5), (5,4) ]
@@ -77,7 +77,7 @@

 .. note::

-  dotty (invoked via :py:func:`~_vendoring.altgraph.Dot.display`) may not be able to
+  dotty (invoked via :py:func:`~altgraph.Dot.display`) may not be able to
   display all graphics styles. To verify the output save it to an image file
   and look at it that way.

@@ -111,7 +111,7 @@
 import os
 import warnings

-from _vendoring.altgraph import GraphError
+from altgraph import GraphError


 class Dot(object):
lib/spack/external/_vendoring/altgraph/Graph.py (vendored)
@@ -1,5 +1,5 @@
 """
-_vendoring.altgraph.Graph - Base Graph class
+altgraph.Graph - Base Graph class
 =================================

 ..
@@ -15,7 +15,7 @@

 from collections import deque

-from _vendoring.altgraph import GraphError
+from altgraph import GraphError


 class Graph(object):
lib/spack/external/_vendoring/altgraph/GraphAlgo.py (vendored)
@@ -1,8 +1,8 @@
 """
-_vendoring.altgraph.GraphAlgo - Graph algorithms
+altgraph.GraphAlgo - Graph algorithms
 =====================================
 """
-from _vendoring.altgraph import GraphError
+from altgraph import GraphError


 def dijkstra(graph, start, end=None):
@@ -25,7 +25,7 @@ def dijkstra(graph, start, end=None):
     and will raise an exception if it discovers that a negative edge has
     caused it to make a mistake.

-    Adapted to _vendoring.altgraph by Istvan Albert, Pennsylvania State University -
+    Adapted to altgraph by Istvan Albert, Pennsylvania State University -
     June, 9 2004
     """
     D = {}  # dictionary of final distances
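The file touched above implements `dijkstra()`. As a self-contained refresher on what that function computes (this is a generic sketch over an adjacency dict, not altgraph's implementation or API):

```python
# Dijkstra's shortest-path algorithm over a dict-of-dicts adjacency map,
# using a binary heap. Non-negative edge weights are assumed, which is the
# same precondition the altgraph docstring warns about.
import heapq

def dijkstra(graph, start):
    """Return shortest distances from start; graph maps node -> {nbr: weight}."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry, a shorter path was already found
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

print(dijkstra({"a": {"b": 1, "c": 4}, "b": {"c": 2}, "c": {}}, "a"))
# {'a': 0, 'b': 1, 'c': 3}
```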
lib/spack/external/_vendoring/altgraph/GraphStat.py (vendored)
@@ -1,5 +1,5 @@
 """
-_vendoring.altgraph.GraphStat - Functions providing various graph statistics
+altgraph.GraphStat - Functions providing various graph statistics
 =================================================================
 """

lib/spack/external/_vendoring/altgraph/GraphUtil.py (vendored)
@@ -1,17 +1,17 @@
 """
-_vendoring.altgraph.GraphUtil - Utility classes and functions
+altgraph.GraphUtil - Utility classes and functions
 ==================================================
 """

 import random
 from collections import deque

-from _vendoring.altgraph import Graph, GraphError
+from altgraph import Graph, GraphError


 def generate_random_graph(node_num, edge_num, self_loops=False, multi_edges=False):
     """
-    Generates and returns a :py:class:`~_vendoring.altgraph.Graph.Graph` instance with
+    Generates and returns a :py:class:`~altgraph.Graph.Graph` instance with
     *node_num* nodes randomly connected by *edge_num* edges.
     """
     g = Graph.Graph()
@@ -52,7 +52,7 @@ def generate_random_graph(node_num, edge_num, self_loops=False, multi_edges=Fals

 def generate_scale_free_graph(steps, growth_num, self_loops=False, multi_edges=False):
     """
-    Generates and returns a :py:class:`~_vendoring.altgraph.Graph.Graph` instance that
+    Generates and returns a :py:class:`~altgraph.Graph.Graph` instance that
     will have *steps* \\* *growth_num* nodes and a scale free (powerlaw)
     connectivity. Starting with a fully connected graph with *growth_num*
     nodes at every step *growth_num* nodes are added to the graph and are
lib/spack/external/_vendoring/altgraph/ObjectGraph.py (vendored)
@@ -1,14 +1,14 @@
 """
-_vendoring.altgraph.ObjectGraph - Graph of objects with an identifier
+altgraph.ObjectGraph - Graph of objects with an identifier
 ==========================================================

 A graph of objects that have a "graphident" attribute.
 graphident is the key for the object in the graph
 """

-from _vendoring.altgraph import GraphError
-from _vendoring.altgraph.Graph import Graph
-from _vendoring.altgraph.GraphUtil import filter_stack
+from altgraph import GraphError
+from altgraph.Graph import Graph
+from altgraph.GraphUtil import filter_stack


 class ObjectGraph(object):
lib/spack/external/_vendoring/altgraph/__init__.py (vendored)
@@ -1,18 +1,18 @@
 """
-_vendoring.altgraph - a python graph library
+altgraph - a python graph library
 =================================

-_vendoring.altgraph is a fork of `graphlib <http://pygraphlib.sourceforge.net>`_ tailored
+altgraph is a fork of `graphlib <http://pygraphlib.sourceforge.net>`_ tailored
 to use newer Python 2.3+ features, including additional support used by the
-py2app suite (modulegraph and _vendoring.macholib, specifically).
+py2app suite (modulegraph and macholib, specifically).

-_vendoring.altgraph is a python based graph (network) representation and manipulation
+altgraph is a python based graph (network) representation and manipulation
 package. It has started out as an extension to the
 `graph_lib module
 <http://www.ece.arizona.edu/~denny/python_nest/graph_lib_1.0.1.html>`_
 written by Nathan Denny it has been significantly optimized and expanded.

-The :class:`_vendoring.altgraph.Graph.Graph` class is loosely modeled after the
+The :class:`altgraph.Graph.Graph` class is loosely modeled after the
 `LEDA <http://www.algorithmic-solutions.com/enleda.htm>`_
 (Library of Efficient Datatypes) representation. The library
 includes methods for constructing graphs, BFS and DFS traversals,
@@ -22,22 +22,22 @@

 The package contains the following modules:

-  - the :py:mod:`_vendoring.altgraph.Graph` module contains the
-    :class:`~_vendoring.altgraph.Graph.Graph` class that stores the graph data
+  - the :py:mod:`altgraph.Graph` module contains the
+    :class:`~altgraph.Graph.Graph` class that stores the graph data

-  - the :py:mod:`_vendoring.altgraph.GraphAlgo` module implements graph algorithms
-    operating on graphs (:py:class:`~_vendoring.altgraph.Graph.Graph`} instances)
+  - the :py:mod:`altgraph.GraphAlgo` module implements graph algorithms
+    operating on graphs (:py:class:`~altgraph.Graph.Graph`} instances)

-  - the :py:mod:`_vendoring.altgraph.GraphStat` module contains functions for
+  - the :py:mod:`altgraph.GraphStat` module contains functions for
     computing statistical measures on graphs

-  - the :py:mod:`_vendoring.altgraph.GraphUtil` module contains functions for
+  - the :py:mod:`altgraph.GraphUtil` module contains functions for
     generating, reading and saving graphs

-  - the :py:mod:`_vendoring.altgraph.Dot` module contains functions for displaying
+  - the :py:mod:`altgraph.Dot` module contains functions for displaying
     graphs via `graphviz <http://www.research.att.com/sw/tools/graphviz/>`_

-  - the :py:mod:`_vendoring.altgraph.ObjectGraph` module implements a graph of
+  - the :py:mod:`altgraph.ObjectGraph` module implements a graph of
     objects with a unique identifier

 Installation
@@ -62,7 +62,7 @@
 Lets assume that we want to analyze the graph below (links to the full picture)
 GRAPH_IMG. Our script then might look the following way::

-    from _vendoring.altgraph import Graph, GraphAlgo, Dot
+    from altgraph import Graph, GraphAlgo, Dot

     # these are the edges
     edges = [ (1,2), (2,4), (1,3), (2,4), (3,4), (4,5), (6,5),
@@ -141,7 +141,7 @@
 """
 import pkg_resources

-__version__ = pkg_resources.require("_vendoring.altgraph")[0].version
+__version__ = pkg_resources.require("altgraph")[0].version


 class GraphError(ValueError):
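Every change in this file is the same mechanical rename between `altgraph` and `_vendoring.altgraph` across imports, docstrings, and Sphinx roles. A rename like that can be sketched as a single regex pass; this is purely an illustration, not the tool the Spack branch actually used:

```python
# Illustrative regex pass that adds a "_vendoring." prefix to bare
# references to a vendored package. Not the actual vendoring tool.
import re

def add_vendor_prefix(source: str, pkg: str = "altgraph") -> str:
    # (?<![\w.]) keeps us from touching attribute access or names that are
    # already prefixed, e.g. _vendoring.altgraph.
    return re.sub(rf"(?<![\w.]){pkg}\b", f"_vendoring.{pkg}", source)

print(add_vendor_prefix("from altgraph import GraphError"))
# from _vendoring.altgraph import GraphError
```

A real vendoring pass would also need to skip string literals that must stay unprefixed, which is why this remains a sketch.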
@@ -1,20 +0,0 @@
-The MIT License (MIT)
-
-Copyright (c) 2014 Anders Høst
-
-Permission is hereby granted, free of charge, to any person obtaining a copy of
-this software and associated documentation files (the "Software"), to deal in
-the Software without restriction, including without limitation the rights to
-use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
-the Software, and to permit persons to whom the Software is furnished to do so,
-subject to the following conditions:
-
-The above copyright notice and this permission notice shall be included in all
-copies or substantial portions of the Software.
-
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
-FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
-COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
-IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
-CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
2
lib/spack/external/_vendoring/attr/_make.py
vendored
2
lib/spack/external/_vendoring/attr/_make.py
vendored
@@ -38,7 +38,7 @@
|
|||||||
"typing.ClassVar",
|
"typing.ClassVar",
|
||||||
"t.ClassVar",
|
"t.ClassVar",
|
||||||
"ClassVar",
|
"ClassVar",
|
||||||
"_vendoring.typing_extensions.ClassVar",
|
"typing_extensions.ClassVar",
|
||||||
)
|
)
|
||||||
# we don't use a double-underscore prefix because that triggers
|
# we don't use a double-underscore prefix because that triggers
|
||||||
# name mangling when trying to create a slot for the field
|
# name mangling when trying to create a slot for the field
|
||||||
|
@@ -1,6 +1,6 @@
 # SPDX-License-Identifier: MIT

-from _vendoring.attr import (
+from attr import (
     NOTHING,
     Attribute,
     Factory,
@@ -28,7 +28,7 @@
     resolve_types,
     validate,
 )
-from _vendoring.attr._next_gen import asdict, astuple
+from attr._next_gen import asdict, astuple

 from . import converters, exceptions, filters, setters, validators

@@ -1,3 +1,3 @@
 # SPDX-License-Identifier: MIT

-from _vendoring.attr.converters import * # noqa
+from attr.converters import * # noqa

@@ -1,3 +1,3 @@
 # SPDX-License-Identifier: MIT

-from _vendoring.attr.exceptions import * # noqa
+from attr.exceptions import * # noqa

@@ -1,3 +1,3 @@
 # SPDX-License-Identifier: MIT

-from _vendoring.attr.filters import * # noqa
+from attr.filters import * # noqa

@@ -1,3 +1,3 @@
 # SPDX-License-Identifier: MIT

-from _vendoring.attr.setters import * # noqa
+from attr.setters import * # noqa

@@ -1,3 +1,3 @@
 # SPDX-License-Identifier: MIT

-from _vendoring.attr.validators import * # noqa
+from attr.validators import * # noqa
12 lib/spack/external/_vendoring/jinja2/bccache.py vendored
@@ -19,7 +19,7 @@
 from types import CodeType

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te
     from .environment import Environment

     class _MemcachedClient(te.Protocol):
@@ -101,7 +101,7 @@ def bytecode_to_string(self) -> bytes:
 class BytecodeCache:
     """To implement your own bytecode cache you have to subclass this class
     and override :meth:`load_bytecode` and :meth:`dump_bytecode`. Both of
-    these methods are passed a :class:`~_vendoring.jinja2.bccache.Bucket`.
+    these methods are passed a :class:`~jinja2.bccache.Bucket`.

     A very basic bytecode cache that saves the bytecode on the file system::

@@ -193,7 +193,7 @@ class FileSystemBytecodeCache(BytecodeCache):
     is created for the user in the system temp directory.

     The pattern can be used to have multiple separate caches operate on the
-    same directory. The default pattern is ``'___vendoring.jinja2_%s.cache'``. ``%s``
+    same directory. The default pattern is ``'__jinja2_%s.cache'``. ``%s``
     is replaced with the cache key.

     >>> bcc = FileSystemBytecodeCache('/tmp/jinja_cache', '%s.cache')
@@ -202,7 +202,7 @@ class FileSystemBytecodeCache(BytecodeCache):
     """

     def __init__(
-        self, directory: t.Optional[str] = None, pattern: str = "___vendoring.jinja2_%s.cache"
+        self, directory: t.Optional[str] = None, pattern: str = "__jinja2_%s.cache"
     ) -> None:
         if directory is None:
             directory = self._get_default_cache_dir()
@@ -225,7 +225,7 @@ def _unsafe_dir() -> "te.NoReturn":
         if not hasattr(os, "getuid"):
             _unsafe_dir()

-        dirname = f"__vendoring.jinja2-cache-{os.getuid()}"
+        dirname = f"_jinja2-cache-{os.getuid()}"
         actual_dir = os.path.join(tmpdir, dirname)

         try:
@@ -332,7 +332,7 @@ class MemcachedBytecodeCache(BytecodeCache):
     def __init__(
         self,
         client: "_MemcachedClient",
-        prefix: str = "_vendoring.jinja2/bytecode/",
+        prefix: str = "jinja2/bytecode/",
         timeout: t.Optional[int] = None,
         ignore_memcache_errors: bool = True,
     ):
@@ -6,8 +6,8 @@
 from itertools import chain
 from keyword import iskeyword as is_python_keyword

-from _vendoring.markupsafe import escape
-from _vendoring.markupsafe import Markup
+from markupsafe import escape
+from markupsafe import Markup

 from . import nodes
 from .exceptions import TemplateAssertionError
@@ -23,7 +23,7 @@
 from .visitor import NodeVisitor

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te
     from .environment import Environment

 F = t.TypeVar("F", bound=t.Callable[..., t.Any])
@@ -836,7 +836,7 @@ def visit_Template(
         exported_names = sorted(exported)

         self.writeline("from __future__ import generator_stop")  # Python < 3.7
-        self.writeline("from _vendoring.jinja2.runtime import " + ", ".join(exported_names))
+        self.writeline("from jinja2.runtime import " + ", ".join(exported_names))

         # if we want a deferred initialization we cannot move the
         # environment into a local name
@@ -8,7 +8,7 @@
 from .utils import Namespace

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te

 # defaults for the parser / lexer
 BLOCK_START_STRING = "{%"
@@ -12,7 +12,7 @@
 from functools import reduce
 from types import CodeType

-from _vendoring.markupsafe import Markup
+from markupsafe import Markup

 from . import nodes
 from .compiler import CodeGenerator
@@ -55,7 +55,7 @@
 from .utils import missing

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te
     from .bccache import BytecodeCache
     from .ext import Extension
     from .loaders import BaseLoader
@@ -126,7 +126,7 @@ def _environment_config_check(environment: "Environment") -> "Environment":
     """Perform a sanity check on the environment."""
     assert issubclass(
         environment.undefined, Undefined
-    ), "'undefined' must be a subclass of '_vendoring.jinja2.Undefined'."
+    ), "'undefined' must be a subclass of 'jinja2.Undefined'."
     assert (
         environment.block_start_string
         != environment.variable_start_string
@@ -221,7 +221,7 @@ class Environment:
         `autoescape`
             If set to ``True`` the XML/HTML autoescaping feature is enabled by
             default. For more details about autoescaping see
-            :class:`~_vendoring.markupsafe.Markup`. As of Jinja 2.4 this can also
+            :class:`~markupsafe.Markup`. As of Jinja 2.4 this can also
             be a callable that is passed the template name and has to
             return ``True`` or ``False`` depending on autoescape should be
             enabled by default.
@@ -264,7 +264,7 @@ class Environment:

     #: if this environment is sandboxed. Modifying this variable won't make
     #: the environment sandboxed though. For a real sandboxed environment
-    #: have a look at _vendoring.jinja2.sandbox. This flag alone controls the code
+    #: have a look at jinja2.sandbox. This flag alone controls the code
     #: generation by the compiler.
     sandboxed = False

@@ -279,11 +279,11 @@ class Environment:
     shared = False

     #: the class that is used for code generation. See
-    #: :class:`~_vendoring.jinja2.compiler.CodeGenerator` for more information.
+    #: :class:`~jinja2.compiler.CodeGenerator` for more information.
     code_generator_class: t.Type["CodeGenerator"] = CodeGenerator

     #: the context class that is used for templates. See
-    #: :class:`~_vendoring.jinja2.runtime.Context` for more information.
+    #: :class:`~jinja2.runtime.Context` for more information.
     context_class: t.Type[Context] = Context

     template_class: t.Type["Template"]
@@ -650,7 +650,7 @@ def _tokenize(
         state: t.Optional[str] = None,
     ) -> TokenStream:
         """Called by the parser to do the preprocessing and filtering
-        for all the extensions. Returns a :class:`~_vendoring.jinja2.lexer.TokenStream`.
+        for all the extensions. Returns a :class:`~jinja2.lexer.TokenStream`.
         """
         source = self.preprocess(source, name, filename)
         stream = self.lexer.tokenize(source, name, filename, state)
@@ -1547,7 +1547,7 @@ def __repr__(self) -> str:


 class TemplateExpression:
-    """The :meth:`_vendoring.jinja2.Environment.compile_expression` method returns an
+    """The :meth:`jinja2.Environment.compile_expression` method returns an
     instance of this object. It encapsulates the expression-like access
     to the template with an expression it wraps.
     """
18 lib/spack/external/_vendoring/jinja2/ext.py vendored
@@ -4,7 +4,7 @@
 import typing as t
 import warnings

-from _vendoring.markupsafe import Markup
+from markupsafe import Markup

 from . import defaults
 from . import nodes
@@ -18,7 +18,7 @@
 from .utils import pass_context

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te
     from .lexer import Token
     from .lexer import TokenStream
     from .parser import Parser
@@ -108,10 +108,10 @@ def preprocess(
     def filter_stream(
         self, stream: "TokenStream"
     ) -> t.Union["TokenStream", t.Iterable["Token"]]:
-        """It's passed a :class:`~_vendoring.jinja2.lexer.TokenStream` that can be used
+        """It's passed a :class:`~jinja2.lexer.TokenStream` that can be used
         to filter tokens returned. This method has to return an iterable of
-        :class:`~_vendoring.jinja2.lexer.Token`\\s, but it doesn't have to return a
-        :class:`~_vendoring.jinja2.lexer.TokenStream`.
+        :class:`~jinja2.lexer.Token`\\s, but it doesn't have to return a
+        :class:`~jinja2.lexer.TokenStream`.
         """
         return stream

@@ -145,7 +145,7 @@ def call_method(
         lineno: t.Optional[int] = None,
     ) -> nodes.Call:
         """Call a method of the extension. This is a shortcut for
-        :meth:`attr` + :class:`_vendoring.jinja2.nodes.Call`.
+        :meth:`attr` + :class:`jinja2.nodes.Call`.
         """
         if args is None:
             args = []
@@ -629,9 +629,9 @@ class DebugExtension(Extension):

     .. code-block:: text

-        {'context': {'cycler': <class '_vendoring.jinja2.utils.Cycler'>,
+        {'context': {'cycler': <class 'jinja2.utils.Cycler'>,
                      ...,
-                     'namespace': <class '_vendoring.jinja2.utils.Namespace'>},
+                     'namespace': <class 'jinja2.utils.Namespace'>},
          'filters': ['abs', 'attr', 'batch', 'capitalize', 'center', 'count', 'd',
                      ..., 'urlencode', 'urlize', 'wordcount', 'wordwrap', 'xmlattr'],
          'tests': ['!=', '<', '<=', '==', '>', '>=', 'callable', 'defined',
@@ -679,7 +679,7 @@ def extract_from_ast(

     This example explains the behavior:

-    >>> from _vendoring.jinja2 import Environment
+    >>> from jinja2 import Environment
     >>> env = Environment()
     >>> node = env.parse('{{ (_("foo"), _(), ngettext("foo", "bar", 42)) }}')
     >>> list(extract_from_ast(node))
20 lib/spack/external/_vendoring/jinja2/filters.py vendored
@@ -9,9 +9,9 @@
 from itertools import chain
 from itertools import groupby

-from _vendoring.markupsafe import escape
-from _vendoring.markupsafe import Markup
-from _vendoring.markupsafe import soft_str
+from markupsafe import escape
+from markupsafe import Markup
+from markupsafe import soft_str

 from .async_utils import async_variant
 from .async_utils import auto_aiter
@@ -28,7 +28,7 @@
 from .utils import urlize

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te
     from .environment import Environment
     from .nodes import EvalContext
     from .runtime import Context
@@ -48,7 +48,7 @@ def contextfilter(f: F) -> F:
     """Pass the context as the first argument to the decorated function.

     .. deprecated:: 3.0
-        Will be removed in Jinja 3.1. Use :func:`~_vendoring.jinja2.pass_context`
+        Will be removed in Jinja 3.1. Use :func:`~jinja2.pass_context`
         instead.
     """
     warnings.warn(
@@ -66,7 +66,7 @@ def evalcontextfilter(f: F) -> F:

     .. deprecated:: 3.0
         Will be removed in Jinja 3.1. Use
-        :func:`~_vendoring.jinja2.pass_eval_context` instead.
+        :func:`~jinja2.pass_eval_context` instead.

     .. versionadded:: 2.4
     """
@@ -85,7 +85,7 @@ def environmentfilter(f: F) -> F:

     .. deprecated:: 3.0
         Will be removed in Jinja 3.1. Use
-        :func:`~_vendoring.jinja2.pass_environment` instead.
+        :func:`~jinja2.pass_environment` instead.
     """
     warnings.warn(
         "'environmentfilter' is renamed to 'pass_environment', the old"
@@ -547,10 +547,10 @@ def do_default(
         {{ ''|default('the string was empty', true) }}

     .. versionchanged:: 2.11
-        It's now possible to configure the :class:`~_vendoring.jinja2.Environment` with
-        :class:`~_vendoring.jinja2.ChainableUndefined` to make the `default` filter work
+        It's now possible to configure the :class:`~jinja2.Environment` with
+        :class:`~jinja2.ChainableUndefined` to make the `default` filter work
         on nested elements and attributes that may contain undefined values
-        in the chain without getting an :exc:`~_vendoring.jinja2.UndefinedError`.
+        in the chain without getting an :exc:`~jinja2.UndefinedError`.
     """
     if isinstance(value, Undefined) or (boolean and not value):
         return default_value
@@ -14,7 +14,7 @@
 from .utils import LRUCache

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te
     from .environment import Environment

 # cache for the lexers. Exists in order to be able to have multiple
@@ -400,7 +400,7 @@ def close(self) -> None:

     def expect(self, expr: str) -> Token:
         """Expect a given token type and return it. This accepts the same
-        argument as :meth:`_vendoring.jinja2.lexer.Token.test`.
+        argument as :meth:`jinja2.lexer.Token.test`.
         """
         if not self.current.test(expr):
             expr = describe_token_expr(expr)
@@ -47,7 +47,7 @@ class BaseLoader:
     A very basic example for a loader that looks up templates on the file
     system could look like this::

-        from _vendoring.jinja2 import BaseLoader, TemplateNotFound
+        from jinja2 import BaseLoader, TemplateNotFound
        from os.path import join, exists, getmtime

        class MyLoader(BaseLoader):
@@ -594,7 +594,7 @@ class ModuleLoader(BaseLoader):
     def __init__(
         self, path: t.Union[str, os.PathLike, t.Sequence[t.Union[str, os.PathLike]]]
     ) -> None:
-        package_name = f"__vendoring.jinja2_module_templates_{id(self):x}"
+        package_name = f"_jinja2_module_templates_{id(self):x}"

         # create a fake module that looks for the templates in the
         # path given.
4 lib/spack/external/_vendoring/jinja2/meta.py vendored
@@ -36,7 +36,7 @@ def find_undeclared_variables(ast: nodes.Template) -> t.Set[str]:
     variables will be used depending on the path the execution takes at
     runtime, all variables are returned.

-    >>> from _vendoring.jinja2 import Environment, meta
+    >>> from jinja2 import Environment, meta
     >>> env = Environment()
     >>> ast = env.parse('{% set foo = 42 %}{{ bar + foo }}')
     >>> meta.find_undeclared_variables(ast) == {'bar'}
@@ -64,7 +64,7 @@ def find_referenced_templates(ast: nodes.Template) -> t.Iterator[t.Optional[str]
     imports. If dynamic inheritance or inclusion is used, `None` will be
     yielded.

-    >>> from _vendoring.jinja2 import Environment, meta
+    >>> from jinja2 import Environment, meta
     >>> env = Environment()
     >>> ast = env.parse('{% extends "layout.html" %}{% include helper %}')
     >>> list(meta.find_referenced_templates(ast))
14 lib/spack/external/_vendoring/jinja2/nodes.py vendored
@@ -7,12 +7,12 @@
 import typing as t
 from collections import deque

-from _vendoring.markupsafe import Markup
+from markupsafe import Markup

 from .utils import _PassArg

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te
     from .environment import Environment

 _NodeBound = t.TypeVar("_NodeBound", bound="Node")
@@ -1041,7 +1041,7 @@ class ExtensionAttribute(Expr):
     The identifier is the identifier of the :class:`Extension`.

     This node is usually constructed by calling the
-    :meth:`~_vendoring.jinja2.ext.Extension.attr` method on an extension.
+    :meth:`~jinja2.ext.Extension.attr` method on an extension.
     """

     fields = ("identifier", "name")
@@ -1063,7 +1063,7 @@ class ImportedName(Expr):
 class InternalName(Expr):
     """An internal name in the compiler. You cannot create these nodes
     yourself but the parser provides a
-    :meth:`~_vendoring.jinja2.parser.Parser.free_identifier` method that creates
+    :meth:`~jinja2.parser.Parser.free_identifier` method that creates
     a new identifier for you. This identifier is not available from the
     template and is not treated specially by the compiler.
     """
@@ -1114,7 +1114,7 @@ def as_const(
 class ContextReference(Expr):
     """Returns the current template context. It can be used like a
     :class:`Name` node, with a ``'load'`` ctx and will return the
-    current :class:`~_vendoring.jinja2.runtime.Context` object.
+    current :class:`~jinja2.runtime.Context` object.

     Here an example that assigns the current template name to a
     variable named `foo`::
@@ -1123,7 +1123,7 @@ class ContextReference(Expr):
         Getattr(ContextReference(), 'name'))

     This is basically equivalent to using the
-    :func:`~_vendoring.jinja2.pass_context` decorator when using the high-level
+    :func:`~jinja2.pass_context` decorator when using the high-level
     API, which causes a reference to the context to be passed as the
     first argument to a function.
     """
@@ -1188,7 +1188,7 @@ class EvalContextModifier(Stmt):
 class ScopedEvalContextModifier(EvalContextModifier):
     """Modifies the eval context and reverts it later. Works exactly like
     :class:`EvalContextModifier` but will only modify the
-    :class:`~_vendoring.jinja2.nodes.EvalContext` for nodes in the :attr:`body`.
+    :class:`~jinja2.nodes.EvalContext` for nodes in the :attr:`body`.
     """

     fields = ("body",)
@@ -9,7 +9,7 @@
 from .lexer import describe_token_expr

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te
     from .environment import Environment

 _ImportInclude = t.TypeVar("_ImportInclude", nodes.Import, nodes.Include)
@@ -156,7 +156,7 @@ def is_tuple_end(
         return False

     def free_identifier(self, lineno: t.Optional[int] = None) -> nodes.InternalName:
-        """Return a new free identifier as :class:`~_vendoring.jinja2.nodes.InternalName`."""
+        """Return a new free identifier as :class:`~jinja2.nodes.InternalName`."""
         self._last_identifier += 1
         rv = object.__new__(nodes.InternalName)
         nodes.Node.__init__(rv, f"fi{self._last_identifier}", lineno=lineno)
@@ -687,7 +687,7 @@ def parse_tuple(
         explicit_parentheses: bool = False,
     ) -> t.Union[nodes.Tuple, nodes.Expr]:
         """Works like `parse_expression` but if multiple expressions are
-        delimited by a comma a :class:`~_vendoring.jinja2.nodes.Tuple` node is created.
+        delimited by a comma a :class:`~jinja2.nodes.Tuple` node is created.
         This method could also return a regular expression instead of a tuple
         if no commas where found.

20 lib/spack/external/_vendoring/jinja2/runtime.py vendored

@@ -5,9 +5,9 @@
 from collections import abc
 from itertools import chain

-from _vendoring.markupsafe import escape  # noqa: F401
-from _vendoring.markupsafe import Markup
-from _vendoring.markupsafe import soft_str
+from markupsafe import escape  # noqa: F401
+from markupsafe import Markup
+from markupsafe import soft_str

 from .async_utils import auto_aiter
 from .async_utils import auto_await  # noqa: F401
@@ -28,7 +28,7 @@

 if t.TYPE_CHECKING:
     import logging
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te
     from .environment import Environment

     class LoopRenderFunc(te.Protocol):
@@ -849,7 +849,7 @@ class Undefined:
     >>> foo + 42
     Traceback (most recent call last):
       ...
-    _vendoring.jinja2.exceptions.UndefinedError: 'foo' is undefined
+    jinja2.exceptions.UndefinedError: 'foo' is undefined
     """

     __slots__ = (
@@ -1020,7 +1020,7 @@ class ChainableUndefined(Undefined):
     >>> foo.bar['baz'] + 42
     Traceback (most recent call last):
       ...
-    _vendoring.jinja2.exceptions.UndefinedError: 'foo' is undefined
+    jinja2.exceptions.UndefinedError: 'foo' is undefined

     .. versionadded:: 2.11.0
     """
@@ -1047,7 +1047,7 @@ class DebugUndefined(Undefined):
     >>> foo + 42
     Traceback (most recent call last):
       ...
-    _vendoring.jinja2.exceptions.UndefinedError: 'foo' is undefined
+    jinja2.exceptions.UndefinedError: 'foo' is undefined
     """

     __slots__ = ()
@@ -1077,15 +1077,15 @@ class StrictUndefined(Undefined):
     >>> str(foo)
     Traceback (most recent call last):
       ...
-    _vendoring.jinja2.exceptions.UndefinedError: 'foo' is undefined
+    jinja2.exceptions.UndefinedError: 'foo' is undefined
     >>> not foo
     Traceback (most recent call last):
       ...
-    _vendoring.jinja2.exceptions.UndefinedError: 'foo' is undefined
+    jinja2.exceptions.UndefinedError: 'foo' is undefined
     >>> foo + 42
     Traceback (most recent call last):
       ...
-    _vendoring.jinja2.exceptions.UndefinedError: 'foo' is undefined
+    jinja2.exceptions.UndefinedError: 'foo' is undefined
     """

     __slots__ = ()
@@ -9,8 +9,8 @@
 from collections import deque
 from string import Formatter

-from _vendoring.markupsafe import EscapeFormatter
-from _vendoring.markupsafe import Markup
+from markupsafe import EscapeFormatter
+from markupsafe import Markup

 from .environment import Environment
 from .exceptions import SecurityError
@@ -128,7 +128,7 @@ def is_internal_attribute(obj: t.Any, attr: str) -> bool:
     python objects. This is useful if the environment method
     :meth:`~SandboxedEnvironment.is_safe_attribute` is overridden.

-    >>> from _vendoring.jinja2.sandbox import is_internal_attribute
+    >>> from jinja2.sandbox import is_internal_attribute
     >>> is_internal_attribute(str, "mro")
     True
     >>> is_internal_attribute(str, "upper")
48 lib/spack/external/_vendoring/jinja2/utils.py vendored

@@ -12,10 +12,10 @@
 from types import CodeType
 from urllib.parse import quote_from_bytes

-import _vendoring.markupsafe
+import markupsafe

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te

 F = t.TypeVar("F", bound=t.Callable[..., t.Any])

@@ -28,7 +28,7 @@


 def pass_context(f: F) -> F:
-    """Pass the :class:`~_vendoring.jinja2.runtime.Context` as the first argument
+    """Pass the :class:`~jinja2.runtime.Context` as the first argument
     to the decorated function when called while rendering a template.

     Can be used on functions, filters, and tests.
@@ -45,7 +45,7 @@ def pass_context(f: F) -> F:


 def pass_eval_context(f: F) -> F:
-    """Pass the :class:`~_vendoring.jinja2.nodes.EvalContext` as the first argument
+    """Pass the :class:`~jinja2.nodes.EvalContext` as the first argument
     to the decorated function when called while rendering a template.
     See :ref:`eval-context`.

@@ -62,7 +62,7 @@ def pass_eval_context(f: F) -> F:


 def pass_environment(f: F) -> F:
-    """Pass the :class:`~_vendoring.jinja2.Environment` as the first argument to
+    """Pass the :class:`~jinja2.Environment` as the first argument to
     the decorated function when called while rendering a template.

     Can be used on functions, filters, and tests.
@@ -104,7 +104,7 @@ def contextfunction(f: F) -> F:
     """Pass the context as the first argument to the decorated function.

     .. deprecated:: 3.0
-        Will be removed in Jinja 3.1. Use :func:`~_vendoring.jinja2.pass_context`
+        Will be removed in Jinja 3.1. Use :func:`~jinja2.pass_context`
         instead.
     """
     warnings.warn(
@@ -122,7 +122,7 @@ def evalcontextfunction(f: F) -> F:

     .. deprecated:: 3.0
         Will be removed in Jinja 3.1. Use
-        :func:`~_vendoring.jinja2.pass_eval_context` instead.
+        :func:`~jinja2.pass_eval_context` instead.

     .. versionadded:: 2.4
     """
@@ -141,7 +141,7 @@ def environmentfunction(f: F) -> F:

     .. deprecated:: 3.0
         Will be removed in Jinja 3.1. Use
-        :func:`~_vendoring.jinja2.pass_environment` instead.
+        :func:`~jinja2.pass_environment` instead.
     """
     warnings.warn(
         "'environmentfunction' is renamed to 'pass_environment', the"
@@ -335,9 +335,9 @@ def trim_url(x: str) -> str:
     def trim_url(x: str) -> str:
         return x

-    words = re.split(r"(\s+)", str(_vendoring.markupsafe.escape(text)))
-    rel_attr = f' rel="{_vendoring.markupsafe.escape(rel)}"' if rel else ""
-    target_attr = f' target="{_vendoring.markupsafe.escape(target)}"' if target else ""
+    words = re.split(r"(\s+)", str(markupsafe.escape(text)))
+    rel_attr = f' rel="{markupsafe.escape(rel)}"' if rel else ""
+    target_attr = f' target="{markupsafe.escape(target)}"' if target else ""

     for i, word in enumerate(words):
         head, middle, tail = "", word, ""
@@ -455,8 +455,8 @@ def generate_lorem_ipsum(

     if not html:
         return "\n\n".join(result)
-    return _vendoring.markupsafe.Markup(
-        "\n".join(f"<p>{_vendoring.markupsafe.escape(x)}</p>" for x in result)
+    return markupsafe.Markup(
+        "\n".join(f"<p>{markupsafe.escape(x)}</p>" for x in result)
     )


@@ -658,7 +658,7 @@ def select_autoescape(
     If you want to enable it for all templates created from strings or
     for all templates with `.html` and `.xml` extensions::

-        from _vendoring.jinja2 import Environment, select_autoescape
+        from jinja2 import Environment, select_autoescape
         env = Environment(autoescape=select_autoescape(
             enabled_extensions=('html', 'xml'),
             default_for_string=True,
@@ -667,7 +667,7 @@ def select_autoescape(
     Example configuration to turn it on at all times except if the template
     ends with `.txt`::

-        from _vendoring.jinja2 import Environment, select_autoescape
+        from jinja2 import Environment, select_autoescape
         env = Environment(autoescape=select_autoescape(
             disabled_extensions=('txt',),
             default_for_string=True,
@@ -703,10 +703,10 @@ def autoescape(template_name: t.Optional[str]) -> bool:

 def htmlsafe_json_dumps(
     obj: t.Any, dumps: t.Optional[t.Callable[..., str]] = None, **kwargs: t.Any
-) -> _vendoring.markupsafe.Markup:
+) -> markupsafe.Markup:
     """Serialize an object to a string of JSON with :func:`json.dumps`,
     then replace HTML-unsafe characters with Unicode escapes and mark
-    the result safe with :class:`~_vendoring.markupsafe.Markup`.
+    the result safe with :class:`~markupsafe.Markup`.

     This is available in templates as the ``|tojson`` filter.

@@ -732,7 +732,7 @@ def htmlsafe_json_dumps(
     if dumps is None:
         dumps = json.dumps

-    return _vendoring.markupsafe.Markup(
+    return markupsafe.Markup(
         dumps(obj, **kwargs)
         .replace("<", "\\u003c")
         .replace(">", "\\u003e")
@@ -833,11 +833,11 @@ def __repr__(self) -> str:
         return f"<Namespace {self.__attrs!r}>"


-class Markup(_vendoring.markupsafe.Markup):
+class Markup(markupsafe.Markup):
     def __new__(cls, base="", encoding=None, errors="strict"):  # type: ignore
         warnings.warn(
-            "'_vendoring.jinja2.Markup' is deprecated and will be removed in Jinja"
-            " 3.1. Import '_vendoring.markupsafe.Markup' instead.",
+            "'jinja2.Markup' is deprecated and will be removed in Jinja"
+            " 3.1. Import 'markupsafe.Markup' instead.",
             DeprecationWarning,
             stacklevel=2,
         )
@@ -846,9 +846,9 @@ def __new__(cls, base="", encoding=None, errors="strict"):  # type: ignore

 def escape(s: t.Any) -> str:
     warnings.warn(
-        "'_vendoring.jinja2.escape' is deprecated and will be removed in Jinja"
-        " 3.1. Import '_vendoring.markupsafe.escape' instead.",
+        "'jinja2.escape' is deprecated and will be removed in Jinja"
+        " 3.1. Import 'markupsafe.escape' instead.",
         DeprecationWarning,
         stacklevel=2,
     )
-    return _vendoring.markupsafe.escape(s)
+    return markupsafe.escape(s)
@@ -6,7 +6,7 @@
 from .nodes import Node

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te

     class VisitCallable(te.Protocol):
         def __call__(self, node: Node, *args: t.Any, **kwargs: t.Any) -> t.Any:
1 lib/spack/external/_vendoring/jsonschema.pyi vendored Normal file

@@ -0,0 +1 @@
+from jsonschema import *
@@ -8,18 +8,18 @@
 instance under a schema, and will create a validator for you.
 """

-from _vendoring.jsonschema.exceptions import (
+from jsonschema.exceptions import (
     ErrorTree, FormatError, RefResolutionError, SchemaError, ValidationError
 )
-from _vendoring.jsonschema._format import (
+from jsonschema._format import (
     FormatChecker,
     draft3_format_checker,
     draft4_format_checker,
     draft6_format_checker,
     draft7_format_checker,
 )
-from _vendoring.jsonschema._types import TypeChecker
-from _vendoring.jsonschema.validators import (
+from jsonschema._types import TypeChecker
+from jsonschema.validators import (
     Draft3Validator,
     Draft4Validator,
     Draft6Validator,
@@ -1,2 +1,2 @@
-from _vendoring.jsonschema.cli import main
+from jsonschema.cli import main
 main()
@@ -3,8 +3,8 @@
 import socket
 import struct

-from _vendoring.jsonschema.compat import str_types
-from _vendoring.jsonschema.exceptions import FormatError
+from jsonschema.compat import str_types
+from jsonschema.exceptions import FormatError


 class FormatChecker(object):
@@ -1,6 +1,6 @@
-from _vendoring.jsonschema import _utils
-from _vendoring.jsonschema.compat import iteritems
-from _vendoring.jsonschema.exceptions import ValidationError
+from jsonschema import _utils
+from jsonschema.compat import iteritems
+from jsonschema.exceptions import ValidationError


 def dependencies_draft3(validator, dependencies, instance, schema):
@@ -9,7 +9,7 @@

 import sys

-from _vendoring.jsonschema.compat import PY3
+from jsonschema.compat import PY3


 class _NoModuleFound(Exception):
@@ -1,10 +1,10 @@
 import numbers

-from _vendoring.pyrsistent import pmap
-import _vendoring.attr
+from pyrsistent import pmap
+import attr

-from _vendoring.jsonschema.compat import int_types, str_types
-from _vendoring.jsonschema.exceptions import UndefinedTypeCheck
+from jsonschema.compat import int_types, str_types
+from jsonschema.exceptions import UndefinedTypeCheck


 def is_array(checker, instance):
@@ -45,7 +45,7 @@ def is_any(checker, instance):
     return True


-@_vendoring.attr.s(frozen=True)
+@attr.s(frozen=True)
 class TypeChecker(object):
     """
     A ``type`` property checker.
@@ -61,7 +61,7 @@ class TypeChecker(object):

         The initial mapping of types to their checking functions.
     """
-    _type_checkers = _vendoring.attr.ib(default=pmap(), converter=pmap)
+    _type_checkers = attr.ib(default=pmap(), converter=pmap)

     def is_type(self, instance, type):
         """
@@ -131,7 +131,7 @@ def redefine_many(self, definitions=()):

             A new `TypeChecker` instance.
         """
-        return _vendoring.attr.evolve(
+        return attr.evolve(
             self, type_checkers=self._type_checkers.update(definitions),
         )

@@ -162,7 +162,7 @@ def remove(self, *types):
                 checkers = checkers.remove(each)
             except KeyError:
                 raise UndefinedTypeCheck(each)
-        return _vendoring.attr.evolve(self, type_checkers=checkers)
+        return attr.evolve(self, type_checkers=checkers)


 draft3_type_checker = TypeChecker(
@@ -3,7 +3,7 @@
 import pkgutil
 import re

-from _vendoring.jsonschema.compat import MutableMapping, str_types, urlsplit
+from jsonschema.compat import MutableMapping, str_types, urlsplit


 class URIDict(MutableMapping):
@@ -51,7 +51,7 @@ def load_schema(name):
     Load a schema from ./schemas/``name``.json and return it.
     """

-    data = pkgutil.get_data("_vendoring.jsonschema", "schemas/{0}.json".format(name))
+    data = pkgutil.get_data("jsonschema", "schemas/{0}.json".format(name))
     return json.loads(data.decode("utf-8"))

@@ -1,6 +1,6 @@
 import re

-from _vendoring.jsonschema._utils import (
+from jsonschema._utils import (
     ensure_list,
     equal,
     extras_msg,
@@ -9,8 +9,8 @@
     unbool,
     uniq,
 )
-from _vendoring.jsonschema.exceptions import FormatError, ValidationError
-from _vendoring.jsonschema.compat import iteritems
+from jsonschema.exceptions import FormatError, ValidationError
+from jsonschema.compat import iteritems


 def patternProperties(validator, patternProperties, instance, schema):
@@ -6,10 +6,10 @@
 """
 from twisted.python.filepath import FilePath
 from pyperf import Runner
-from _vendoring.pyrsistent import m
+from pyrsistent import m

-from _vendoring.jsonschema.tests._suite import Version
-import _vendoring.jsonschema
+from jsonschema.tests._suite import Version
+import jsonschema


 issue232 = Version(
@@ -7,7 +7,7 @@
 """
 from pyperf import Runner

-from _vendoring.jsonschema.tests._suite import Suite
+from jsonschema.tests._suite import Suite


 if __name__ == "__main__":
@@ -6,9 +6,9 @@
 import json
 import sys

-from _vendoring.jsonschema import __version__
-from _vendoring.jsonschema._reflect import namedAny
-from _vendoring.jsonschema.validators import validator_for
+from jsonschema import __version__
+from jsonschema._reflect import namedAny
+from jsonschema.validators import validator_for


 def _namedAnyWithDefault(name):
@@ -6,10 +6,10 @@
 import pprint
 import textwrap

-import _vendoring.attr
+import attr

-from _vendoring.jsonschema import _utils
-from _vendoring.jsonschema.compat import PY3, iteritems
+from jsonschema import _utils
+from jsonschema.compat import PY3, iteritems


 WEAK_MATCHES = frozenset(["anyOf", "oneOf"])
@@ -149,13 +149,13 @@ class SchemaError(_Error):
     _word_for_instance_in_error_message = "schema"


-@_vendoring.attr.s(hash=True)
+@attr.s(hash=True)
 class RefResolutionError(Exception):
     """
     A ref could not be resolved.
     """

-    _cause = _vendoring.attr.ib()
+    _cause = attr.ib()

     def __str__(self):
         return str(self._cause)
5 lib/spack/external/_vendoring/jsonschema/tests/_helpers.py vendored Normal file

@@ -0,0 +1,5 @@
+def bug(issue=None):
+    message = "A known bug."
+    if issue is not None:
+        message += " See issue #{issue}.".format(issue=issue)
+    return message
239 lib/spack/external/_vendoring/jsonschema/tests/_suite.py vendored Normal file

@@ -0,0 +1,239 @@
+"""
+Python representations of the JSON Schema Test Suite tests.
+"""
+
+from functools import partial
+import json
+import os
+import re
+import subprocess
+import sys
+import unittest
+
+from twisted.python.filepath import FilePath
+import attr
+
+from jsonschema.compat import PY3
+from jsonschema.validators import validators
+import jsonschema
+
+
+def _find_suite():
+    root = os.environ.get("JSON_SCHEMA_TEST_SUITE")
+    if root is not None:
+        return FilePath(root)
+
+    root = FilePath(jsonschema.__file__).parent().sibling("json")
+    if not root.isdir():  # pragma: no cover
+        raise ValueError(
+            (
+                "Can't find the JSON-Schema-Test-Suite directory. "
+                "Set the 'JSON_SCHEMA_TEST_SUITE' environment "
+                "variable or run the tests from alongside a checkout "
+                "of the suite."
+            ),
+        )
+    return root
+
+
+@attr.s(hash=True)
+class Suite(object):
+
+    _root = attr.ib(default=attr.Factory(_find_suite))
+
+    def _remotes(self):
+        jsonschema_suite = self._root.descendant(["bin", "jsonschema_suite"])
+        remotes = subprocess.check_output(
+            [sys.executable, jsonschema_suite.path, "remotes"],
+        )
+        return {
+            "http://localhost:1234/" + name: schema
+            for name, schema in json.loads(remotes.decode("utf-8")).items()
+        }
+
+    def benchmark(self, runner):  # pragma: no cover
+        for name in validators:
+            self.version(name=name).benchmark(runner=runner)
+
+    def version(self, name):
+        return Version(
+            name=name,
+            path=self._root.descendant(["tests", name]),
+            remotes=self._remotes(),
+        )
+
+
+@attr.s(hash=True)
+class Version(object):
+
+    _path = attr.ib()
+    _remotes = attr.ib()
+
+    name = attr.ib()
+
+    def benchmark(self, runner, **kwargs):  # pragma: no cover
+        for suite in self.tests():
+            for test in suite:
+                runner.bench_func(
+                    test.fully_qualified_name,
+                    partial(test.validate_ignoring_errors, **kwargs),
+                )
+
+    def tests(self):
+        return (
+            test
+            for child in self._path.globChildren("*.json")
+            for test in self._tests_in(
+                subject=child.basename()[:-5],
+                path=child,
+            )
+        )
+
+    def format_tests(self):
+        path = self._path.descendant(["optional", "format"])
+        return (
+            test
+            for child in path.globChildren("*.json")
+            for test in self._tests_in(
+                subject=child.basename()[:-5],
+                path=child,
+            )
+        )
+
+    def tests_of(self, name):
+        return self._tests_in(
+            subject=name,
+            path=self._path.child(name + ".json"),
+        )
+
+    def optional_tests_of(self, name):
+        return self._tests_in(
+            subject=name,
+            path=self._path.descendant(["optional", name + ".json"]),
+        )
+
+    def to_unittest_testcase(self, *suites, **kwargs):
+        name = kwargs.pop("name", "Test" + self.name.title())
+        methods = {
+            test.method_name: test.to_unittest_method(**kwargs)
+            for suite in suites
+            for tests in suite
+            for test in tests
+        }
+        cls = type(name, (unittest.TestCase,), methods)
+
+        try:
+            cls.__module__ = _someone_save_us_the_module_of_the_caller()
+        except Exception:  # pragma: no cover
+            # We're doing crazy things, so if they go wrong, like a function
+            # behaving differently on some other interpreter, just make them
+            # not happen.
+            pass
+
+        return cls
+
+    def _tests_in(self, subject, path):
+        for each in json.loads(path.getContent().decode("utf-8")):
+            yield (
+                _Test(
+                    version=self,
+                    subject=subject,
+                    case_description=each["description"],
+                    schema=each["schema"],
+                    remotes=self._remotes,
+                    **test
+                ) for test in each["tests"]
+            )
+
+
+@attr.s(hash=True, repr=False)
+class _Test(object):
+
+    version = attr.ib()
+
+    subject = attr.ib()
+    case_description = attr.ib()
+    description = attr.ib()
+
+    data = attr.ib()
+    schema = attr.ib(repr=False)
+
+    valid = attr.ib()
+
+    _remotes = attr.ib()
+
+    def __repr__(self):  # pragma: no cover
+        return "<Test {}>".format(self.fully_qualified_name)
+
+    @property
+    def fully_qualified_name(self):  # pragma: no cover
+        return " > ".join(
+            [
+                self.version.name,
+                self.subject,
+                self.case_description,
+                self.description,
+            ]
+        )
+
+    @property
+    def method_name(self):
+        delimiters = r"[\W\- ]+"
+        name = "test_%s_%s_%s" % (
+            re.sub(delimiters, "_", self.subject),
+            re.sub(delimiters, "_", self.case_description),
+            re.sub(delimiters, "_", self.description),
+        )
+
+        if not PY3:  # pragma: no cover
+            name = name.encode("utf-8")
+        return name
+
+    def to_unittest_method(self, skip=lambda test: None, **kwargs):
+        if self.valid:
+            def fn(this):
+                self.validate(**kwargs)
+        else:
+            def fn(this):
+                with this.assertRaises(jsonschema.ValidationError):
+                    self.validate(**kwargs)
+
+        fn.__name__ = self.method_name
+        reason = skip(self)
+        return unittest.skipIf(reason is not None, reason)(fn)
+
+    def validate(self, Validator, **kwargs):
+        resolver = jsonschema.RefResolver.from_schema(
+            schema=self.schema,
+            store=self._remotes,
+            id_of=Validator.ID_OF,
+        )
+        jsonschema.validate(
+            instance=self.data,
+            schema=self.schema,
+            cls=Validator,
+            resolver=resolver,
+            **kwargs
+        )
+
+    def validate_ignoring_errors(self, Validator):  # pragma: no cover
+        try:
+            self.validate(Validator=Validator)
+        except jsonschema.ValidationError:
+            pass
+
+
+def _someone_save_us_the_module_of_the_caller():
+    """
+    The FQON of the module 2nd stack frames up from here.
+
+    This is intended to allow us to dynamicallly return test case classes that
+    are indistinguishable from being defined in the module that wants them.
+
+    Otherwise, trial will mis-print the FQON, and copy pasting it won't re-run
+    the class that really is running.
+
+    Save us all, this is all so so so so so terrible.
+    """
+
+    return sys._getframe(2).f_globals["__name__"]
151
lib/spack/external/_vendoring/jsonschema/tests/test_cli.py
vendored
Normal file
151
lib/spack/external/_vendoring/jsonschema/tests/test_cli.py
vendored
Normal file
@@ -0,0 +1,151 @@
from unittest import TestCase
import json
import subprocess
import sys

from jsonschema import Draft4Validator, ValidationError, cli, __version__
from jsonschema.compat import NativeIO
from jsonschema.exceptions import SchemaError


def fake_validator(*errors):
    errors = list(reversed(errors))

    class FakeValidator(object):
        def __init__(self, *args, **kwargs):
            pass

        def iter_errors(self, instance):
            if errors:
                return errors.pop()
            return []

        def check_schema(self, schema):
            pass
    return FakeValidator


class TestParser(TestCase):

    FakeValidator = fake_validator()
    instance_file = "foo.json"
    schema_file = "schema.json"

    def setUp(self):
        cli.open = self.fake_open
        self.addCleanup(delattr, cli, "open")

    def fake_open(self, path):
        if path == self.instance_file:
            contents = ""
        elif path == self.schema_file:
            contents = {}
        else:  # pragma: no cover
            self.fail("What is {!r}".format(path))
        return NativeIO(json.dumps(contents))

    def test_find_validator_by_fully_qualified_object_name(self):
        arguments = cli.parse_args(
            [
                "--validator",
                "jsonschema.tests.test_cli.TestParser.FakeValidator",
                "--instance", self.instance_file,
                self.schema_file,
            ]
        )
        self.assertIs(arguments["validator"], self.FakeValidator)

    def test_find_validator_in_jsonschema(self):
        arguments = cli.parse_args(
            [
                "--validator", "Draft4Validator",
                "--instance", self.instance_file,
                self.schema_file,
            ]
        )
        self.assertIs(arguments["validator"], Draft4Validator)


class TestCLI(TestCase):
    def test_draft3_schema_draft4_validator(self):
        stdout, stderr = NativeIO(), NativeIO()
        with self.assertRaises(SchemaError):
            cli.run(
                {
                    "validator": Draft4Validator,
                    "schema": {
                        "anyOf": [
                            {"minimum": 20},
                            {"type": "string"},
                            {"required": True},
                        ],
                    },
                    "instances": [1],
                    "error_format": "{error.message}",
                },
                stdout=stdout,
                stderr=stderr,
            )

    def test_successful_validation(self):
        stdout, stderr = NativeIO(), NativeIO()
        exit_code = cli.run(
            {
                "validator": fake_validator(),
                "schema": {},
                "instances": [1],
                "error_format": "{error.message}",
            },
            stdout=stdout,
            stderr=stderr,
        )
        self.assertFalse(stdout.getvalue())
        self.assertFalse(stderr.getvalue())
        self.assertEqual(exit_code, 0)

    def test_unsuccessful_validation(self):
        error = ValidationError("I am an error!", instance=1)
        stdout, stderr = NativeIO(), NativeIO()
        exit_code = cli.run(
            {
                "validator": fake_validator([error]),
                "schema": {},
                "instances": [1],
                "error_format": "{error.instance} - {error.message}",
            },
            stdout=stdout,
            stderr=stderr,
        )
        self.assertFalse(stdout.getvalue())
        self.assertEqual(stderr.getvalue(), "1 - I am an error!")
        self.assertEqual(exit_code, 1)

    def test_unsuccessful_validation_multiple_instances(self):
        first_errors = [
            ValidationError("9", instance=1),
            ValidationError("8", instance=1),
        ]
        second_errors = [ValidationError("7", instance=2)]
        stdout, stderr = NativeIO(), NativeIO()
        exit_code = cli.run(
            {
                "validator": fake_validator(first_errors, second_errors),
                "schema": {},
                "instances": [1, 2],
                "error_format": "{error.instance} - {error.message}\t",
            },
            stdout=stdout,
            stderr=stderr,
        )
        self.assertFalse(stdout.getvalue())
        self.assertEqual(stderr.getvalue(), "1 - 9\t1 - 8\t2 - 7\t")
        self.assertEqual(exit_code, 1)

    def test_version(self):
        version = subprocess.check_output(
            [sys.executable, "-m", "jsonschema", "--version"],
            stderr=subprocess.STDOUT,
        )
        version = version.decode("utf-8").strip()
        self.assertEqual(version, __version__)
462
lib/spack/external/_vendoring/jsonschema/tests/test_exceptions.py
vendored
Normal file
@@ -0,0 +1,462 @@
from unittest import TestCase
import textwrap

from jsonschema import Draft4Validator, exceptions
from jsonschema.compat import PY3


class TestBestMatch(TestCase):
    def best_match(self, errors):
        errors = list(errors)
        best = exceptions.best_match(errors)
        reversed_best = exceptions.best_match(reversed(errors))
        msg = "Didn't return a consistent best match!\nGot: {0}\n\nThen: {1}"
        self.assertEqual(
            best._contents(), reversed_best._contents(),
            msg=msg.format(best, reversed_best),
        )
        return best

    def test_shallower_errors_are_better_matches(self):
        validator = Draft4Validator(
            {
                "properties": {
                    "foo": {
                        "minProperties": 2,
                        "properties": {"bar": {"type": "object"}},
                    },
                },
            },
        )
        best = self.best_match(validator.iter_errors({"foo": {"bar": []}}))
        self.assertEqual(best.validator, "minProperties")

    def test_oneOf_and_anyOf_are_weak_matches(self):
        """
        A property you *must* match is probably better than one you have to
        match a part of.
        """

        validator = Draft4Validator(
            {
                "minProperties": 2,
                "anyOf": [{"type": "string"}, {"type": "number"}],
                "oneOf": [{"type": "string"}, {"type": "number"}],
            }
        )
        best = self.best_match(validator.iter_errors({}))
        self.assertEqual(best.validator, "minProperties")

    def test_if_the_most_relevant_error_is_anyOf_it_is_traversed(self):
        """
        If the most relevant error is an anyOf, then we traverse its context
        and select the otherwise *least* relevant error, since in this case
        that means the most specific, deep, error inside the instance.

        I.e. since only one of the schemas must match, we look for the most
        relevant one.
        """

        validator = Draft4Validator(
            {
                "properties": {
                    "foo": {
                        "anyOf": [
                            {"type": "string"},
                            {"properties": {"bar": {"type": "array"}}},
                        ],
                    },
                },
            },
        )
        best = self.best_match(validator.iter_errors({"foo": {"bar": 12}}))
        self.assertEqual(best.validator_value, "array")

    def test_if_the_most_relevant_error_is_oneOf_it_is_traversed(self):
        """
        If the most relevant error is an oneOf, then we traverse its context
        and select the otherwise *least* relevant error, since in this case
        that means the most specific, deep, error inside the instance.

        I.e. since only one of the schemas must match, we look for the most
        relevant one.
        """

        validator = Draft4Validator(
            {
                "properties": {
                    "foo": {
                        "oneOf": [
                            {"type": "string"},
                            {"properties": {"bar": {"type": "array"}}},
                        ],
                    },
                },
            },
        )
        best = self.best_match(validator.iter_errors({"foo": {"bar": 12}}))
        self.assertEqual(best.validator_value, "array")

    def test_if_the_most_relevant_error_is_allOf_it_is_traversed(self):
        """
        Now, if the error is allOf, we traverse but select the *most* relevant
        error from the context, because all schemas here must match anyways.
        """

        validator = Draft4Validator(
            {
                "properties": {
                    "foo": {
                        "allOf": [
                            {"type": "string"},
                            {"properties": {"bar": {"type": "array"}}},
                        ],
                    },
                },
            },
        )
        best = self.best_match(validator.iter_errors({"foo": {"bar": 12}}))
        self.assertEqual(best.validator_value, "string")

    def test_nested_context_for_oneOf(self):
        validator = Draft4Validator(
            {
                "properties": {
                    "foo": {
                        "oneOf": [
                            {"type": "string"},
                            {
                                "oneOf": [
                                    {"type": "string"},
                                    {
                                        "properties": {
                                            "bar": {"type": "array"},
                                        },
                                    },
                                ],
                            },
                        ],
                    },
                },
            },
        )
        best = self.best_match(validator.iter_errors({"foo": {"bar": 12}}))
        self.assertEqual(best.validator_value, "array")

    def test_one_error(self):
        validator = Draft4Validator({"minProperties": 2})
        error, = validator.iter_errors({})
        self.assertEqual(
            exceptions.best_match(validator.iter_errors({})).validator,
            "minProperties",
        )

    def test_no_errors(self):
        validator = Draft4Validator({})
        self.assertIsNone(exceptions.best_match(validator.iter_errors({})))


class TestByRelevance(TestCase):
    def test_short_paths_are_better_matches(self):
        shallow = exceptions.ValidationError("Oh no!", path=["baz"])
        deep = exceptions.ValidationError("Oh yes!", path=["foo", "bar"])
        match = max([shallow, deep], key=exceptions.relevance)
        self.assertIs(match, shallow)

        match = max([deep, shallow], key=exceptions.relevance)
        self.assertIs(match, shallow)

    def test_global_errors_are_even_better_matches(self):
        shallow = exceptions.ValidationError("Oh no!", path=[])
        deep = exceptions.ValidationError("Oh yes!", path=["foo"])

        errors = sorted([shallow, deep], key=exceptions.relevance)
        self.assertEqual(
            [list(error.path) for error in errors],
            [["foo"], []],
        )

        errors = sorted([deep, shallow], key=exceptions.relevance)
        self.assertEqual(
            [list(error.path) for error in errors],
            [["foo"], []],
        )

    def test_weak_validators_are_lower_priority(self):
        weak = exceptions.ValidationError("Oh no!", path=[], validator="a")
        normal = exceptions.ValidationError("Oh yes!", path=[], validator="b")

        best_match = exceptions.by_relevance(weak="a")

        match = max([weak, normal], key=best_match)
        self.assertIs(match, normal)

        match = max([normal, weak], key=best_match)
        self.assertIs(match, normal)

    def test_strong_validators_are_higher_priority(self):
        weak = exceptions.ValidationError("Oh no!", path=[], validator="a")
        normal = exceptions.ValidationError("Oh yes!", path=[], validator="b")
        strong = exceptions.ValidationError("Oh fine!", path=[], validator="c")

        best_match = exceptions.by_relevance(weak="a", strong="c")

        match = max([weak, normal, strong], key=best_match)
        self.assertIs(match, strong)

        match = max([strong, normal, weak], key=best_match)
        self.assertIs(match, strong)


class TestErrorTree(TestCase):
    def test_it_knows_how_many_total_errors_it_contains(self):
        # FIXME: https://github.com/Julian/jsonschema/issues/442
        errors = [
            exceptions.ValidationError("Something", validator=i)
            for i in range(8)
        ]
        tree = exceptions.ErrorTree(errors)
        self.assertEqual(tree.total_errors, 8)

    def test_it_contains_an_item_if_the_item_had_an_error(self):
        errors = [exceptions.ValidationError("a message", path=["bar"])]
        tree = exceptions.ErrorTree(errors)
        self.assertIn("bar", tree)

    def test_it_does_not_contain_an_item_if_the_item_had_no_error(self):
        errors = [exceptions.ValidationError("a message", path=["bar"])]
        tree = exceptions.ErrorTree(errors)
        self.assertNotIn("foo", tree)

    def test_validators_that_failed_appear_in_errors_dict(self):
        error = exceptions.ValidationError("a message", validator="foo")
        tree = exceptions.ErrorTree([error])
        self.assertEqual(tree.errors, {"foo": error})

    def test_it_creates_a_child_tree_for_each_nested_path(self):
        errors = [
            exceptions.ValidationError("a bar message", path=["bar"]),
            exceptions.ValidationError("a bar -> 0 message", path=["bar", 0]),
        ]
        tree = exceptions.ErrorTree(errors)
        self.assertIn(0, tree["bar"])
        self.assertNotIn(1, tree["bar"])

    def test_children_have_their_errors_dicts_built(self):
        e1, e2 = (
            exceptions.ValidationError("1", validator="foo", path=["bar", 0]),
            exceptions.ValidationError("2", validator="quux", path=["bar", 0]),
        )
        tree = exceptions.ErrorTree([e1, e2])
        self.assertEqual(tree["bar"][0].errors, {"foo": e1, "quux": e2})

    def test_multiple_errors_with_instance(self):
        e1, e2 = (
            exceptions.ValidationError(
                "1",
                validator="foo",
                path=["bar", "bar2"],
                instance="i1"),
            exceptions.ValidationError(
                "2",
                validator="quux",
                path=["foobar", 2],
                instance="i2"),
        )
        exceptions.ErrorTree([e1, e2])

    def test_it_does_not_contain_subtrees_that_are_not_in_the_instance(self):
        error = exceptions.ValidationError("123", validator="foo", instance=[])
        tree = exceptions.ErrorTree([error])

        with self.assertRaises(IndexError):
            tree[0]

    def test_if_its_in_the_tree_anyhow_it_does_not_raise_an_error(self):
        """
        If a validator is dumb (like :validator:`required` in draft 3) and
        refers to a path that isn't in the instance, the tree still properly
        returns a subtree for that path.
        """

        error = exceptions.ValidationError(
            "a message", validator="foo", instance={}, path=["foo"],
        )
        tree = exceptions.ErrorTree([error])
        self.assertIsInstance(tree["foo"], exceptions.ErrorTree)


class TestErrorInitReprStr(TestCase):
    def make_error(self, **kwargs):
        defaults = dict(
            message=u"hello",
            validator=u"type",
            validator_value=u"string",
            instance=5,
            schema={u"type": u"string"},
        )
        defaults.update(kwargs)
        return exceptions.ValidationError(**defaults)

    def assertShows(self, expected, **kwargs):
        if PY3:  # pragma: no cover
            expected = expected.replace("u'", "'")
        expected = textwrap.dedent(expected).rstrip("\n")

        error = self.make_error(**kwargs)
        message_line, _, rest = str(error).partition("\n")
        self.assertEqual(message_line, error.message)
        self.assertEqual(rest, expected)

    def test_it_calls_super_and_sets_args(self):
        error = self.make_error()
        self.assertGreater(len(error.args), 1)

    def test_repr(self):
        self.assertEqual(
            repr(exceptions.ValidationError(message="Hello!")),
            "<ValidationError: %r>" % "Hello!",
        )

    def test_unset_error(self):
        error = exceptions.ValidationError("message")
        self.assertEqual(str(error), "message")

        kwargs = {
            "validator": "type",
            "validator_value": "string",
            "instance": 5,
            "schema": {"type": "string"},
        }
        # Just the message should show if any of the attributes are unset
        for attr in kwargs:
            k = dict(kwargs)
            del k[attr]
            error = exceptions.ValidationError("message", **k)
            self.assertEqual(str(error), "message")

    def test_empty_paths(self):
        self.assertShows(
            """
            Failed validating u'type' in schema:
                {u'type': u'string'}

            On instance:
                5
            """,
            path=[],
            schema_path=[],
        )

    def test_one_item_paths(self):
        self.assertShows(
            """
            Failed validating u'type' in schema:
                {u'type': u'string'}

            On instance[0]:
                5
            """,
            path=[0],
            schema_path=["items"],
        )

    def test_multiple_item_paths(self):
        self.assertShows(
            """
            Failed validating u'type' in schema[u'items'][0]:
                {u'type': u'string'}

            On instance[0][u'a']:
                5
            """,
            path=[0, u"a"],
            schema_path=[u"items", 0, 1],
        )

    def test_uses_pprint(self):
        self.assertShows(
            """
            Failed validating u'maxLength' in schema:
                {0: 0,
                 1: 1,
                 2: 2,
                 3: 3,
                 4: 4,
                 5: 5,
                 6: 6,
                 7: 7,
                 8: 8,
                 9: 9,
                 10: 10,
                 11: 11,
                 12: 12,
                 13: 13,
                 14: 14,
                 15: 15,
                 16: 16,
                 17: 17,
                 18: 18,
                 19: 19}

            On instance:
                [0,
                 1,
                 2,
                 3,
                 4,
                 5,
                 6,
                 7,
                 8,
                 9,
                 10,
                 11,
                 12,
                 13,
                 14,
                 15,
                 16,
                 17,
                 18,
                 19,
                 20,
                 21,
                 22,
                 23,
                 24]
            """,
            instance=list(range(25)),
            schema=dict(zip(range(20), range(20))),
            validator=u"maxLength",
        )

    def test_str_works_with_instances_having_overriden_eq_operator(self):
        """
        Check for https://github.com/Julian/jsonschema/issues/164 which
        rendered exceptions unusable when a `ValidationError` involved
        instances with an `__eq__` method that returned truthy values.
        """

        class DontEQMeBro(object):
            def __eq__(this, other):  # pragma: no cover
                self.fail("Don't!")

            def __ne__(this, other):  # pragma: no cover
                self.fail("Don't!")

        instance = DontEQMeBro()
        error = exceptions.ValidationError(
            "a message",
            validator="foo",
            instance=instance,
            validator_value="some",
            schema="schema",
        )
        self.assertIn(repr(instance), str(error))


class TestHashable(TestCase):
    def test_hashable(self):
        set([exceptions.ValidationError("")])
        set([exceptions.SchemaError("")])
89
lib/spack/external/_vendoring/jsonschema/tests/test_format.py
vendored
Normal file
@@ -0,0 +1,89 @@
"""
Tests for the parts of jsonschema related to the :validator:`format` property.
"""

from unittest import TestCase

from jsonschema import FormatError, ValidationError, FormatChecker
from jsonschema.validators import Draft4Validator


BOOM = ValueError("Boom!")
BANG = ZeroDivisionError("Bang!")


def boom(thing):
    if thing == "bang":
        raise BANG
    raise BOOM


class TestFormatChecker(TestCase):
    def test_it_can_validate_no_formats(self):
        checker = FormatChecker(formats=())
        self.assertFalse(checker.checkers)

    def test_it_raises_a_key_error_for_unknown_formats(self):
        with self.assertRaises(KeyError):
            FormatChecker(formats=["o noes"])

    def test_it_can_register_cls_checkers(self):
        original = dict(FormatChecker.checkers)
        self.addCleanup(FormatChecker.checkers.pop, "boom")
        FormatChecker.cls_checks("boom")(boom)
        self.assertEqual(
            FormatChecker.checkers,
            dict(original, boom=(boom, ())),
        )

    def test_it_can_register_checkers(self):
        checker = FormatChecker()
        checker.checks("boom")(boom)
        self.assertEqual(
            checker.checkers,
            dict(FormatChecker.checkers, boom=(boom, ()))
        )

    def test_it_catches_registered_errors(self):
        checker = FormatChecker()
        checker.checks("boom", raises=type(BOOM))(boom)

        with self.assertRaises(FormatError) as cm:
            checker.check(instance=12, format="boom")

        self.assertIs(cm.exception.cause, BOOM)
        self.assertIs(cm.exception.__cause__, BOOM)

        # Unregistered errors should not be caught
        with self.assertRaises(type(BANG)):
            checker.check(instance="bang", format="boom")

    def test_format_error_causes_become_validation_error_causes(self):
        checker = FormatChecker()
        checker.checks("boom", raises=ValueError)(boom)
        validator = Draft4Validator({"format": "boom"}, format_checker=checker)

        with self.assertRaises(ValidationError) as cm:
            validator.validate("BOOM")

        self.assertIs(cm.exception.cause, BOOM)
        self.assertIs(cm.exception.__cause__, BOOM)

    def test_format_checkers_come_with_defaults(self):
        # This is bad :/ but relied upon.
        # The docs for quite awhile recommended people do things like
        # validate(..., format_checker=FormatChecker())
        # We should change that, but we can't without deprecation...
        checker = FormatChecker()
        with self.assertRaises(FormatError):
            checker.check(instance="not-an-ipv4", format="ipv4")

    def test_repr(self):
        checker = FormatChecker(formats=())
        checker.checks("foo")(lambda thing: True)
        checker.checks("bar")(lambda thing: True)
        checker.checks("baz")(lambda thing: True)
        self.assertEqual(
            repr(checker),
            "<FormatChecker checkers=['bar', 'baz', 'foo']>",
        )
277
lib/spack/external/_vendoring/jsonschema/tests/test_jsonschema_test_suite.py
vendored
Normal file
@@ -0,0 +1,277 @@
"""
Test runner for the JSON Schema official test suite

Tests comprehensive correctness of each draft's validator.

See https://github.com/json-schema-org/JSON-Schema-Test-Suite for details.
"""

import sys
import warnings

from jsonschema import (
    Draft3Validator,
    Draft4Validator,
    Draft6Validator,
    Draft7Validator,
    draft3_format_checker,
    draft4_format_checker,
    draft6_format_checker,
    draft7_format_checker,
)
from jsonschema.tests._helpers import bug
from jsonschema.tests._suite import Suite
from jsonschema.validators import _DEPRECATED_DEFAULT_TYPES, create


SUITE = Suite()
DRAFT3 = SUITE.version(name="draft3")
DRAFT4 = SUITE.version(name="draft4")
DRAFT6 = SUITE.version(name="draft6")
DRAFT7 = SUITE.version(name="draft7")


def skip(message, **kwargs):
    def skipper(test):
        if all(value == getattr(test, attr) for attr, value in kwargs.items()):
            return message
    return skipper


def missing_format(checker):
    def missing_format(test):
        schema = test.schema
        if schema is True or schema is False or "format" not in schema:
            return

        if schema["format"] not in checker.checkers:
            return "Format checker {0!r} not found.".format(schema["format"])
    return missing_format


is_narrow_build = sys.maxunicode == 2 ** 16 - 1
if is_narrow_build:  # pragma: no cover
    message = "Not running surrogate Unicode case, this Python is narrow."

    def narrow_unicode_build(test):  # pragma: no cover
        return skip(
            message=message,
            description="one supplementary Unicode code point is not long enough",
        )(test) or skip(
            message=message,
            description="two supplementary Unicode code points is long enough",
        )(test)
else:
    def narrow_unicode_build(test):  # pragma: no cover
        return


TestDraft3 = DRAFT3.to_unittest_testcase(
    DRAFT3.tests(),
    DRAFT3.optional_tests_of(name="bignum"),
    DRAFT3.optional_tests_of(name="format"),
    DRAFT3.optional_tests_of(name="zeroTerminatedFloats"),
    Validator=Draft3Validator,
    format_checker=draft3_format_checker,
    skip=lambda test: (
        narrow_unicode_build(test)
        or missing_format(draft3_format_checker)(test)
        or skip(
            message="Upstream bug in strict_rfc3339",
            subject="format",
            description="case-insensitive T and Z",
        )(test)
    ),
)


TestDraft4 = DRAFT4.to_unittest_testcase(
    DRAFT4.tests(),
    DRAFT4.optional_tests_of(name="bignum"),
    DRAFT4.optional_tests_of(name="format"),
    DRAFT4.optional_tests_of(name="zeroTerminatedFloats"),
    Validator=Draft4Validator,
    format_checker=draft4_format_checker,
    skip=lambda test: (
        narrow_unicode_build(test)
        or missing_format(draft4_format_checker)(test)
        or skip(
            message=bug(),
            subject="ref",
            case_description="Recursive references between schemas",
        )(test)
        or skip(
            message=bug(371),
            subject="ref",
            case_description="Location-independent identifier",
        )(test)
        or skip(
            message=bug(371),
            subject="ref",
            case_description=(
                "Location-independent identifier with absolute URI"
            ),
        )(test)
        or skip(
            message=bug(371),
            subject="ref",
            case_description=(
                "Location-independent identifier with base URI change in subschema"
            ),
        )(test)
        or skip(
            message=bug(),
            subject="refRemote",
            case_description="base URI change - change folder in subschema",
        )(test)
        or skip(
            message="Upstream bug in strict_rfc3339",
            subject="format",
            description="case-insensitive T and Z",
        )(test)
    ),
)


TestDraft6 = DRAFT6.to_unittest_testcase(
    DRAFT6.tests(),
    DRAFT6.optional_tests_of(name="bignum"),
    DRAFT6.optional_tests_of(name="format"),
    DRAFT6.optional_tests_of(name="zeroTerminatedFloats"),
    Validator=Draft6Validator,
    format_checker=draft6_format_checker,
    skip=lambda test: (
        narrow_unicode_build(test)
        or missing_format(draft6_format_checker)(test)
        or skip(
            message=bug(),
            subject="ref",
            case_description="Recursive references between schemas",
        )(test)
        or skip(
            message=bug(371),
            subject="ref",
            case_description="Location-independent identifier",
        )(test)
        or skip(
            message=bug(371),
            subject="ref",
            case_description=(
                "Location-independent identifier with absolute URI"
            ),
        )(test)
        or skip(
            message=bug(371),
            subject="ref",
            case_description=(
                "Location-independent identifier with base URI change in subschema"
            ),
        )(test)
        or skip(
            message=bug(),
            subject="refRemote",
            case_description="base URI change - change folder in subschema",
        )(test)
        or skip(
            message="Upstream bug in strict_rfc3339",
            subject="format",
            description="case-insensitive T and Z",
        )(test)
    ),
)


TestDraft7 = DRAFT7.to_unittest_testcase(
    DRAFT7.tests(),
    DRAFT7.format_tests(),
    DRAFT7.optional_tests_of(name="bignum"),
    DRAFT7.optional_tests_of(name="content"),
    DRAFT7.optional_tests_of(name="zeroTerminatedFloats"),
    Validator=Draft7Validator,
    format_checker=draft7_format_checker,
    skip=lambda test: (
        narrow_unicode_build(test)
        or missing_format(draft7_format_checker)(test)
        or skip(
            message=bug(),
            subject="ref",
            case_description="Recursive references between schemas",
        )(test)
        or skip(
            message=bug(371),
            subject="ref",
            case_description="Location-independent identifier",
        )(test)
        or skip(
            message=bug(371),
            subject="ref",
            case_description=(
                "Location-independent identifier with absolute URI"
            ),
        )(test)
        or skip(
            message=bug(371),
            subject="ref",
            case_description=(
                "Location-independent identifier with base URI change in subschema"
            ),
        )(test)
        or skip(
            message=bug(),
            subject="refRemote",
            case_description="base URI change - change folder in subschema",
        )(test)
        or skip(
            message="Upstream bug in strict_rfc3339",
            subject="date-time",
            description="case-insensitive T and Z",
        )(test)
        or skip(
            message=bug(593),
            subject="content",
            case_description=(
                "validation of string-encoded content based on media type"
            ),
        )(test)
        or skip(
            message=bug(593),
            subject="content",
            case_description="validation of binary string-encoding",
        )(test)
        or skip(
            message=bug(593),
            subject="content",
            case_description=(
                "validation of binary-encoded media type documents"
            ),
        )(test)
    ),
)


with warnings.catch_warnings():
    warnings.simplefilter("ignore", DeprecationWarning)

    TestDraft3LegacyTypeCheck = DRAFT3.to_unittest_testcase(
        # Interestingly the any part couldn't really be done w/the old API.
        (
            (test for test in each if test.schema != {"type": "any"})
            for each in DRAFT3.tests_of(name="type")
        ),
        name="TestDraft3LegacyTypeCheck",
        Validator=create(
            meta_schema=Draft3Validator.META_SCHEMA,
            validators=Draft3Validator.VALIDATORS,
            default_types=_DEPRECATED_DEFAULT_TYPES,
        ),
    )

    TestDraft4LegacyTypeCheck = DRAFT4.to_unittest_testcase(
        DRAFT4.tests_of(name="type"),
        name="TestDraft4LegacyTypeCheck",
        Validator=create(
            meta_schema=Draft4Validator.META_SCHEMA,
            validators=Draft4Validator.VALIDATORS,
            default_types=_DEPRECATED_DEFAULT_TYPES,
        ),
    )
190 lib/spack/external/_vendoring/jsonschema/tests/test_types.py vendored Normal file
@@ -0,0 +1,190 @@
"""
Tests on the new type interface. The actual correctness of the type checking
is handled in test_jsonschema_test_suite; these tests check that TypeChecker
functions correctly and can facilitate extensions to type checking
"""
from collections import namedtuple
from unittest import TestCase

from jsonschema import ValidationError, _validators
from jsonschema._types import TypeChecker
from jsonschema.exceptions import UndefinedTypeCheck
from jsonschema.validators import Draft4Validator, extend


def equals_2(checker, instance):
    return instance == 2


def is_namedtuple(instance):
    return isinstance(instance, tuple) and getattr(instance, "_fields", None)


def is_object_or_named_tuple(checker, instance):
    if Draft4Validator.TYPE_CHECKER.is_type(instance, "object"):
        return True
    return is_namedtuple(instance)


def coerce_named_tuple(fn):
    def coerced(validator, value, instance, schema):
        if is_namedtuple(instance):
            instance = instance._asdict()
        return fn(validator, value, instance, schema)
    return coerced


required = coerce_named_tuple(_validators.required)
properties = coerce_named_tuple(_validators.properties)


class TestTypeChecker(TestCase):
    def test_is_type(self):
        checker = TypeChecker({"two": equals_2})
        self.assertEqual(
            (
                checker.is_type(instance=2, type="two"),
                checker.is_type(instance="bar", type="two"),
            ),
            (True, False),
        )

    def test_is_unknown_type(self):
        with self.assertRaises(UndefinedTypeCheck) as context:
            TypeChecker().is_type(4, "foobar")
        self.assertIn("foobar", str(context.exception))

    def test_checks_can_be_added_at_init(self):
        checker = TypeChecker({"two": equals_2})
        self.assertEqual(checker, TypeChecker().redefine("two", equals_2))

    def test_redefine_existing_type(self):
        self.assertEqual(
            TypeChecker().redefine("two", object()).redefine("two", equals_2),
            TypeChecker().redefine("two", equals_2),
        )

    def test_remove(self):
        self.assertEqual(
            TypeChecker({"two": equals_2}).remove("two"),
            TypeChecker(),
        )

    def test_remove_unknown_type(self):
        with self.assertRaises(UndefinedTypeCheck) as context:
            TypeChecker().remove("foobar")
        self.assertIn("foobar", str(context.exception))

    def test_redefine_many(self):
        self.assertEqual(
            TypeChecker().redefine_many({"foo": int, "bar": str}),
            TypeChecker().redefine("foo", int).redefine("bar", str),
        )

    def test_remove_multiple(self):
        self.assertEqual(
            TypeChecker({"foo": int, "bar": str}).remove("foo", "bar"),
            TypeChecker(),
        )

    def test_type_check_can_raise_key_error(self):
        """
        Make sure no one writes:

            try:
                self._type_checkers[type](...)
            except KeyError:

        ignoring the fact that the function itself can raise that.
        """
        error = KeyError("Stuff")

        def raises_keyerror(checker, instance):
            raise error

        with self.assertRaises(KeyError) as context:
            TypeChecker({"foo": raises_keyerror}).is_type(4, "foo")

        self.assertIs(context.exception, error)


class TestCustomTypes(TestCase):
    def test_simple_type_can_be_extended(self):
        def int_or_str_int(checker, instance):
            if not isinstance(instance, (int, str)):
                return False
            try:
                int(instance)
            except ValueError:
                return False
            return True

        CustomValidator = extend(
            Draft4Validator,
            type_checker=Draft4Validator.TYPE_CHECKER.redefine(
                "integer", int_or_str_int,
            ),
        )
        validator = CustomValidator({"type": "integer"})

        validator.validate(4)
        validator.validate("4")

        with self.assertRaises(ValidationError):
            validator.validate(4.4)

    def test_object_can_be_extended(self):
        schema = {"type": "object"}

        Point = namedtuple("Point", ["x", "y"])

        type_checker = Draft4Validator.TYPE_CHECKER.redefine(
            u"object", is_object_or_named_tuple,
        )

        CustomValidator = extend(Draft4Validator, type_checker=type_checker)
        validator = CustomValidator(schema)

        validator.validate(Point(x=4, y=5))

    def test_object_extensions_require_custom_validators(self):
        schema = {"type": "object", "required": ["x"]}

        type_checker = Draft4Validator.TYPE_CHECKER.redefine(
            u"object", is_object_or_named_tuple,
        )

        CustomValidator = extend(Draft4Validator, type_checker=type_checker)
        validator = CustomValidator(schema)

        Point = namedtuple("Point", ["x", "y"])
        # Cannot handle required
        with self.assertRaises(ValidationError):
            validator.validate(Point(x=4, y=5))

    def test_object_extensions_can_handle_custom_validators(self):
        schema = {
            "type": "object",
            "required": ["x"],
            "properties": {"x": {"type": "integer"}},
        }

        type_checker = Draft4Validator.TYPE_CHECKER.redefine(
            u"object", is_object_or_named_tuple,
        )

        CustomValidator = extend(
            Draft4Validator,
            type_checker=type_checker,
            validators={"required": required, "properties": properties},
        )

        validator = CustomValidator(schema)

        Point = namedtuple("Point", ["x", "y"])
        # Can now process required and properties
        validator.validate(Point(x=4, y=5))

        with self.assertRaises(ValidationError):
            validator.validate(Point(x="not an integer", y=5))
1762 lib/spack/external/_vendoring/jsonschema/tests/test_validators.py vendored Normal file
File diff suppressed because it is too large
@@ -8,16 +8,16 @@
 import json
 import numbers

-from _vendoring.six import add_metaclass
+from six import add_metaclass

-from _vendoring.jsonschema import (
+from jsonschema import (
     _legacy_validators,
     _types,
     _utils,
     _validators,
     exceptions,
 )
-from _vendoring.jsonschema.compat import (
+from jsonschema.compat import (
     Sequence,
     int_types,
     iteritems,
@@ -33,7 +33,7 @@
 # Sigh. https://gitlab.com/pycqa/flake8/issues/280
 # https://github.com/pyga/ebb-lint/issues/7
 # Imported for backwards compatibility.
-from _vendoring.jsonschema.exceptions import ErrorTree
+from jsonschema.exceptions import ErrorTree
 ErrorTree
1 lib/spack/external/_vendoring/macholib.pyi vendored Normal file
@@ -0,0 +1 @@
+from macholib import *
@@ -7,7 +7,7 @@
 import struct
 import sys

-from _vendoring.macholib.util import fileview
+from macholib.util import fileview

 from .mach_o import (
     FAT_MAGIC,
@@ -41,7 +41,7 @@
 from .ptypes import sizeof

 try:
-    from _vendoring.macholib.compat import bytes
+    from macholib.compat import bytes
 except ImportError:
     pass
@@ -5,11 +5,11 @@
 import os
 import sys

-from _vendoring.altgraph.ObjectGraph import ObjectGraph
+from altgraph.ObjectGraph import ObjectGraph

-from _vendoring.macholib.dyld import dyld_find
+from macholib.dyld import dyld_find
-from _vendoring.macholib.itergraphreport import itergraphreport
+from macholib.itergraphreport import itergraphreport
-from _vendoring.macholib.MachO import MachO
+from macholib.MachO import MachO

 __all__ = ["MachOGraph"]
@@ -1,9 +1,9 @@
 import os
 from collections import deque

-from _vendoring.macholib.dyld import framework_info
+from macholib.dyld import framework_info
-from _vendoring.macholib.MachOGraph import MachOGraph, MissingMachO
+from macholib.MachOGraph import MachOGraph, MissingMachO
-from _vendoring.macholib.util import (
+from macholib.util import (
     flipwritable,
     has_filename_filter,
     in_system_path,
@@ -5,7 +5,7 @@

 import sys

-from _vendoring.macholib.mach_o import (
+from macholib.mach_o import (
     MH_CIGAM_64,
     MH_MAGIC_64,
     dylib_module,
@@ -3,8 +3,8 @@
 import os
 import sys

-from _vendoring.macholib import macho_dump, macho_standalone
+from macholib import macho_dump, macho_standalone
-from _vendoring.macholib.util import is_platform_file
+from macholib.util import is_platform_file

 gCommand = None

@@ -43,10 +43,10 @@ def walk_tree(callback, paths):

 def print_usage(fp):
     print("Usage:", file=fp)
-    print("    python -m_vendoring.macholib [help|--help]", file=fp)
+    print("    python -mmacholib [help|--help]", file=fp)
-    print("    python -m_vendoring.macholib dump FILE ...", file=fp)
+    print("    python -mmacholib dump FILE ...", file=fp)
-    print("    python -m_vendoring.macholib find DIR ...", file=fp)
+    print("    python -mmacholib find DIR ...", file=fp)
-    print("    python -m_vendoring.macholib standalone DIR ...", file=fp)
+    print("    python -mmacholib standalone DIR ...", file=fp)


 def main():
@@ -6,7 +6,7 @@
 import os
 import sys

-from _vendoring.macholib.util import is_platform_file
+from macholib.util import is_platform_file


 def check_file(fp, path, callback):
@@ -8,8 +8,8 @@
 import sys
 from itertools import chain

-from _vendoring.macholib.dylib import dylib_info
+from macholib.dylib import dylib_info
-from _vendoring.macholib.framework import framework_info
+from macholib.framework import framework_info

 __all__ = ["dyld_find", "framework_find", "framework_info", "dylib_info"]
@@ -13,7 +13,7 @@

 import time

-from _vendoring.macholib.ptypes import (
+from macholib.ptypes import (
     Structure,
     p_int32,
     p_int64,
@@ -4,9 +4,9 @@

 import sys

-from _vendoring.macholib._cmdline import main as _main
+from macholib._cmdline import main as _main
-from _vendoring.macholib.mach_o import CPU_TYPE_NAMES, MH_CIGAM_64, MH_MAGIC_64, get_cpu_subtype
+from macholib.mach_o import CPU_TYPE_NAMES, MH_CIGAM_64, MH_MAGIC_64, get_cpu_subtype
-from _vendoring.macholib.MachO import MachO
+from macholib.MachO import MachO

 ARCH_MAP = {
     ("<", "64-bit"): "x86_64",
@@ -45,7 +45,7 @@ def print_file(fp, path):

 def main():
     print(
-        "WARNING: 'macho_dump' is deprecated, use 'python -m_vendoring.macholib dump' " "instead"
+        "WARNING: 'macho_dump' is deprecated, use 'python -mmacholib dump' " "instead"
     )
     _main(print_file)
@@ -1,7 +1,7 @@
 #!/usr/bin/env python
 from __future__ import print_function

-from _vendoring.macholib._cmdline import main as _main
+from macholib._cmdline import main as _main


 def print_file(fp, path):
@@ -10,7 +10,7 @@ def print_file(fp, path):

 def main():
     print(
-        "WARNING: 'macho_find' is deprecated, " "use 'python -m_vendoring.macholib dump' instead"
+        "WARNING: 'macho_find' is deprecated, " "use 'python -mmacholib dump' instead"
     )
     _main(print_file)
@@ -3,8 +3,8 @@
 import os
 import sys

-from _vendoring.macholib.MachOStandalone import MachOStandalone
+from macholib.MachOStandalone import MachOStandalone
-from _vendoring.macholib.util import strip_files
+from macholib.util import strip_files


 def standaloneApp(path):
@@ -18,7 +18,7 @@ def standaloneApp(path):
 def main():
     print(
         "WARNING: 'macho_standalone' is deprecated, use "
-        "'python -m_vendoring.macholib standalone' instead"
+        "'python -mmacholib standalone' instead"
     )
     if not sys.argv[1:]:
         raise SystemExit("usage: %s [appbundle ...]" % (sys.argv[0],))
@@ -4,7 +4,7 @@
 import struct
 import sys

-from _vendoring.macholib import mach_o
+from macholib import mach_o

 MAGIC = [
     struct.pack("!L", getattr(mach_o, "MH_" + _))
@@ -4,7 +4,7 @@
 import typing as t

 if t.TYPE_CHECKING:
-    import _vendoring.typing_extensions as te
+    import typing_extensions as te

     class HasHTML(te.Protocol):
         def __html__(self) -> str:
@@ -1,35 +1,35 @@
 # -*- coding: utf-8 -*-

-from _vendoring.pyrsistent._pmap import pmap, m, PMap
+from pyrsistent._pmap import pmap, m, PMap

-from _vendoring.pyrsistent._pvector import pvector, v, PVector
+from pyrsistent._pvector import pvector, v, PVector

-from _vendoring.pyrsistent._pset import pset, s, PSet
+from pyrsistent._pset import pset, s, PSet

-from _vendoring.pyrsistent._pbag import pbag, b, PBag
+from pyrsistent._pbag import pbag, b, PBag

-from _vendoring.pyrsistent._plist import plist, l, PList
+from pyrsistent._plist import plist, l, PList

-from _vendoring.pyrsistent._pdeque import pdeque, dq, PDeque
+from pyrsistent._pdeque import pdeque, dq, PDeque

-from _vendoring.pyrsistent._checked_types import (
+from pyrsistent._checked_types import (
     CheckedPMap, CheckedPVector, CheckedPSet, InvariantException, CheckedKeyTypeError,
     CheckedValueTypeError, CheckedType, optional)

-from _vendoring.pyrsistent._field_common import (
+from pyrsistent._field_common import (
     field, PTypeError, pset_field, pmap_field, pvector_field)

-from _vendoring.pyrsistent._precord import PRecord
+from pyrsistent._precord import PRecord

-from _vendoring.pyrsistent._pclass import PClass, PClassMeta
+from pyrsistent._pclass import PClass, PClassMeta

-from _vendoring.pyrsistent._immutable import immutable
+from pyrsistent._immutable import immutable

-from _vendoring.pyrsistent._helpers import freeze, thaw, mutant
+from pyrsistent._helpers import freeze, thaw, mutant

-from _vendoring.pyrsistent._transformations import inc, discard, rex, ny
+from pyrsistent._transformations import inc, discard, rex, ny

-from _vendoring.pyrsistent._toolz import get_in
+from pyrsistent._toolz import get_in


 __all__ = ('pmap', 'm', 'PMap',
213 lib/spack/external/_vendoring/pyrsistent/__init__.pyi vendored Normal file
@@ -0,0 +1,213 @@
# flake8: noqa: E704
# from https://gist.github.com/WuTheFWasThat/091a17d4b5cab597dfd5d4c2d96faf09
# Stubs for pyrsistent (Python 3.6)

from typing import Any
from typing import AnyStr
from typing import Callable
from typing import Iterable
from typing import Iterator
from typing import List
from typing import Optional
from typing import Mapping
from typing import MutableMapping
from typing import Sequence
from typing import Set
from typing import Union
from typing import Tuple
from typing import Type
from typing import TypeVar
from typing import overload

# see commit 08519aa for explanation of the re-export
from pyrsistent.typing import CheckedKeyTypeError as CheckedKeyTypeError
from pyrsistent.typing import CheckedPMap as CheckedPMap
from pyrsistent.typing import CheckedPSet as CheckedPSet
from pyrsistent.typing import CheckedPVector as CheckedPVector
from pyrsistent.typing import CheckedType as CheckedType
from pyrsistent.typing import CheckedValueTypeError as CheckedValueTypeError
from pyrsistent.typing import InvariantException as InvariantException
from pyrsistent.typing import PClass as PClass
from pyrsistent.typing import PBag as PBag
from pyrsistent.typing import PDeque as PDeque
from pyrsistent.typing import PList as PList
from pyrsistent.typing import PMap as PMap
from pyrsistent.typing import PMapEvolver as PMapEvolver
from pyrsistent.typing import PSet as PSet
from pyrsistent.typing import PSetEvolver as PSetEvolver
from pyrsistent.typing import PTypeError as PTypeError
from pyrsistent.typing import PVector as PVector
from pyrsistent.typing import PVectorEvolver as PVectorEvolver

T = TypeVar('T')
KT = TypeVar('KT')
VT = TypeVar('VT')

def pmap(initial: Union[Mapping[KT, VT], Iterable[Tuple[KT, VT]]] = {}, pre_size: int = 0) -> PMap[KT, VT]: ...
def m(**kwargs: VT) -> PMap[str, VT]: ...

def pvector(iterable: Iterable[T] = ...) -> PVector[T]: ...
def v(*iterable: T) -> PVector[T]: ...

def pset(iterable: Iterable[T] = (), pre_size: int = 8) -> PSet[T]: ...
def s(*iterable: T) -> PSet[T]: ...

# see class_test.py for use cases
Invariant = Tuple[bool, Optional[Union[str, Callable[[], str]]]]

@overload
def field(
    type: Union[Type[T], Sequence[Type[T]]] = ...,
    invariant: Callable[[Any], Union[Invariant, Iterable[Invariant]]] = lambda _: (True, None),
    initial: Any = object(),
    mandatory: bool = False,
    factory: Callable[[Any], T] = lambda x: x,
    serializer: Callable[[Any, T], Any] = lambda _, value: value,
) -> T: ...
# The actual return value (_PField) is irrelevant after a PRecord has been instantiated,
# see https://github.com/tobgu/pyrsistent/blob/master/pyrsistent/_precord.py#L10
@overload
def field(
    type: Any = ...,
    invariant: Callable[[Any], Union[Invariant, Iterable[Invariant]]] = lambda _: (True, None),
    initial: Any = object(),
    mandatory: bool = False,
    factory: Callable[[Any], Any] = lambda x: x,
    serializer: Callable[[Any, Any], Any] = lambda _, value: value,
) -> Any: ...

# Use precise types for the simplest use cases, but fall back to Any for
# everything else. See record_test.py for the wide range of possible types for
# item_type
@overload
def pset_field(
    item_type: Type[T],
    optional: bool = False,
    initial: Iterable[T] = ...,
) -> PSet[T]: ...
@overload
def pset_field(
    item_type: Any,
    optional: bool = False,
    initial: Any = (),
) -> PSet[Any]: ...

@overload
def pmap_field(
    key_type: Type[KT],
    value_type: Type[VT],
    optional: bool = False,
    invariant: Callable[[Any], Tuple[bool, Optional[str]]] = lambda _: (True, None),
) -> PMap[KT, VT]: ...
@overload
def pmap_field(
    key_type: Any,
    value_type: Any,
    optional: bool = False,
    invariant: Callable[[Any], Tuple[bool, Optional[str]]] = lambda _: (True, None),
) -> PMap[Any, Any]: ...

@overload
def pvector_field(
    item_type: Type[T],
    optional: bool = False,
    initial: Iterable[T] = ...,
) -> PVector[T]: ...
@overload
def pvector_field(
    item_type: Any,
    optional: bool = False,
    initial: Any = (),
) -> PVector[Any]: ...

def pbag(elements: Iterable[T]) -> PBag[T]: ...
def b(*elements: T) -> PBag[T]: ...

def plist(iterable: Iterable[T] = (), reverse: bool = False) -> PList[T]: ...
def l(*elements: T) -> PList[T]: ...

def pdeque(iterable: Optional[Iterable[T]] = None, maxlen: Optional[int] = None) -> PDeque[T]: ...
def dq(*iterable: T) -> PDeque[T]: ...

@overload
def optional(type: T) -> Tuple[T, Type[None]]: ...
@overload
def optional(*typs: Any) -> Tuple[Any, ...]: ...

T_PRecord = TypeVar('T_PRecord', bound='PRecord')
class PRecord(PMap[AnyStr, Any]):
    _precord_fields: Mapping
    _precord_initial_values: Mapping

    def __hash__(self) -> int: ...
    def __init__(self, **kwargs: Any) -> None: ...
    def __iter__(self) -> Iterator[Any]: ...
    def __len__(self) -> int: ...
    @classmethod
    def create(
        cls: Type[T_PRecord],
        kwargs: Mapping,
        _factory_fields: Optional[Iterable] = None,
        ignore_extra: bool = False,
    ) -> T_PRecord: ...
    # This is OK because T_PRecord is a concrete type
    def discard(self: T_PRecord, key: KT) -> T_PRecord: ...
    def remove(self: T_PRecord, key: KT) -> T_PRecord: ...

    def serialize(self, format: Optional[Any] = ...) -> MutableMapping: ...

    # From pyrsistent documentation:
    #   This set function differs slightly from that in the PMap
    #   class. First of all it accepts key-value pairs. Second it accepts multiple key-value
    #   pairs to perform one, atomic, update of multiple fields.
    @overload
    def set(self, key: KT, val: VT) -> Any: ...
    @overload
    def set(self, **kwargs: VT) -> Any: ...

def immutable(
    members: Union[str, Iterable[str]] = '',
    name: str = 'Immutable',
    verbose: bool = False,
) -> Tuple: ...  # actually a namedtuple

# ignore mypy warning "Overloaded function signatures 1 and 5 overlap with
# incompatible return types"
@overload
def freeze(o: Mapping[KT, VT]) -> PMap[KT, VT]: ...  # type: ignore
@overload
def freeze(o: List[T]) -> PVector[T]: ...  # type: ignore
@overload
def freeze(o: Tuple[T, ...]) -> Tuple[T, ...]: ...
@overload
def freeze(o: Set[T]) -> PSet[T]: ...  # type: ignore
@overload
def freeze(o: T) -> T: ...


@overload
def thaw(o: PMap[KT, VT]) -> MutableMapping[KT, VT]: ...  # type: ignore
@overload
def thaw(o: PVector[T]) -> List[T]: ...  # type: ignore
@overload
def thaw(o: Tuple[T, ...]) -> Tuple[T, ...]: ...
# collections.abc.MutableSet is kind of garbage:
# https://stackoverflow.com/questions/24977898/why-does-collections-mutableset-not-bestow-an-update-method
@overload
def thaw(o: PSet[T]) -> Set[T]: ...  # type: ignore
@overload
def thaw(o: T) -> T: ...

def mutant(fn: Callable) -> Callable: ...

def inc(x: int) -> int: ...
@overload
def discard(evolver: PMapEvolver[KT, VT], key: KT) -> None: ...
@overload
def discard(evolver: PVectorEvolver[T], key: int) -> None: ...
@overload
def discard(evolver: PSetEvolver[T], key: T) -> None: ...
def rex(expr: str) -> Callable[[Any], bool]: ...
def ny(_: Any) -> bool: ...

def get_in(keys: Iterable, coll: Mapping, default: Optional[Any] = None, no_default: bool = False) -> Any: ...
@@ -3,9 +3,9 @@
 from abc import abstractmethod, ABCMeta
 from collections.abc import Iterable

-from _vendoring.pyrsistent._pmap import PMap, pmap
+from pyrsistent._pmap import PMap, pmap
-from _vendoring.pyrsistent._pset import PSet, pset
+from pyrsistent._pset import PSet, pset
-from _vendoring.pyrsistent._pvector import PythonPVector, python_pvector
+from pyrsistent._pvector import PythonPVector, python_pvector


 class CheckedType(object):
@@ -1,6 +1,6 @@
 import sys

-from _vendoring.pyrsistent._checked_types import (
+from pyrsistent._checked_types import (
     CheckedPMap,
     CheckedPSet,
     CheckedPVector,
@@ -11,8 +11,8 @@
     maybe_parse_user_type,
     maybe_parse_many_user_types,
 )
-from _vendoring.pyrsistent._checked_types import optional as optional_type
+from pyrsistent._checked_types import optional as optional_type
-from _vendoring.pyrsistent._checked_types import wrap_invariant
+from pyrsistent._checked_types import wrap_invariant
 import inspect

 PY2 = sys.version_info[0] < 3
@@ -1,11 +1,11 @@
 from functools import wraps
-from _vendoring.pyrsistent._pmap import PMap, pmap
+from pyrsistent._pmap import PMap, pmap
-from _vendoring.pyrsistent._pset import PSet, pset
+from pyrsistent._pset import PSet, pset
-from _vendoring.pyrsistent._pvector import PVector, pvector
+from pyrsistent._pvector import PVector, pvector

 def freeze(o, strict=True):
     """
-    Recursively convert simple Python containers into _vendoring.pyrsistent versions
+    Recursively convert simple Python containers into pyrsistent versions
     of those containers.

     - list is converted to pvector, recursively
@@ -47,7 +47,7 @@ def freeze(o, strict=True):

 def thaw(o, strict=True):
     """
-    Recursively convert _vendoring.pyrsistent containers into simple Python containers.
+    Recursively convert pyrsistent containers into simple Python containers.

     - pvector is converted to list, recursively
     - pmap is converted to dict, recursively on values (but not keys)
@@ -59,7 +59,7 @@ def thaw(o, strict=True):
     - thaw is called on elements of lists
     - thaw is called on values in dicts

-    >>> from _vendoring.pyrsistent import s, m, v
+    >>> from pyrsistent import s, m, v
     >>> thaw(s(1, 2))
     {1, 2}
     >>> thaw(v(1, m(a=3)))
@@ -94,7 +94,7 @@ def set(self, **kwargs):
|
|||||||
print(template)
|
print(template)
|
||||||
|
|
||||||
from collections import namedtuple
|
from collections import namedtuple
|
||||||
namespace = dict(namedtuple=namedtuple, __name__='_vendoring.pyrsistent_immutable')
|
namespace = dict(namedtuple=namedtuple, __name__='pyrsistent_immutable')
|
||||||
try:
|
try:
|
||||||
exec(template, namespace)
|
exec(template, namespace)
|
||||||
except SyntaxError as e:
|
except SyntaxError as e:
|
||||||
|
|||||||
@@ -1,6 +1,6 @@
|
|||||||
from collections.abc import Container, Iterable, Sized, Hashable
|
from collections.abc import Container, Iterable, Sized, Hashable
|
||||||
from functools import reduce
|
from functools import reduce
|
||||||
from _vendoring.pyrsistent._pmap import pmap
|
from pyrsistent._pmap import pmap
|
||||||
|
|
||||||
|
|
||||||
def _add_to_counters(counters, element):
|
def _add_to_counters(counters, element):
|
||||||
|
|||||||
@@ -1,8 +1,8 @@
|
|||||||
from _vendoring.pyrsistent._checked_types import (InvariantException, CheckedType, _restore_pickle, store_invariants)
|
from pyrsistent._checked_types import (InvariantException, CheckedType, _restore_pickle, store_invariants)
|
||||||
from _vendoring.pyrsistent._field_common import (
|
from pyrsistent._field_common import (
|
||||||
set_fields, check_type, is_field_ignore_extra_complaint, PFIELD_NO_INITIAL, serialize, check_global_invariants
|
set_fields, check_type, is_field_ignore_extra_complaint, PFIELD_NO_INITIAL, serialize, check_global_invariants
|
||||||
)
|
)
|
||||||
from _vendoring.pyrsistent._transformations import transform
|
from pyrsistent._transformations import transform
|
||||||
|
|
||||||
|
|
||||||
def _is_pclass(bases):
|
def _is_pclass(bases):
|
||||||
@@ -41,7 +41,7 @@ class PClass(CheckedType, metaclass=PClassMeta):
|
|||||||
is not a PMap and hence not a collection but rather a plain Python object.
|
is not a PMap and hence not a collection but rather a plain Python object.
|
||||||
|
|
||||||
|
|
||||||
More documentation and examples of PClass usage is available at https://github.com/tobgu/_vendoring.pyrsistent
|
More documentation and examples of PClass usage is available at https://github.com/tobgu/pyrsistent
|
||||||
"""
|
"""
|
||||||
def __new__(cls, **kwargs): # Support *args?
|
def __new__(cls, **kwargs): # Support *args?
|
||||||
result = super(PClass, cls).__new__(cls)
|
result = super(PClass, cls).__new__(cls)
|
||||||
@@ -84,7 +84,7 @@ def set(self, *args, **kwargs):
|
|||||||
Set a field in the instance. Returns a new instance with the updated value. The original instance remains
|
Set a field in the instance. Returns a new instance with the updated value. The original instance remains
|
||||||
unmodified. Accepts key-value pairs or single string representing the field name and a value.
|
unmodified. Accepts key-value pairs or single string representing the field name and a value.
|
||||||
|
|
||||||
>>> from _vendoring.pyrsistent import PClass, field
|
>>> from pyrsistent import PClass, field
|
||||||
>>> class AClass(PClass):
|
>>> class AClass(PClass):
|
||||||
... x = field()
|
... x = field()
|
||||||
...
|
...
|
||||||
|
|||||||
@@ -1,7 +1,7 @@
|
|||||||
from collections.abc import Sequence, Hashable
|
from collections.abc import Sequence, Hashable
|
||||||
from itertools import islice, chain
|
from itertools import islice, chain
|
||||||
from numbers import Integral
|
from numbers import Integral
|
||||||
from _vendoring.pyrsistent._plist import plist
|
from pyrsistent._plist import plist
|
||||||
|
|
||||||
|
|
||||||
class PDeque(object):
|
class PDeque(object):
|
||||||
|
|||||||
@@ -1,7 +1,7 @@
|
|||||||
from collections.abc import Mapping, Hashable
|
from collections.abc import Mapping, Hashable
|
||||||
from itertools import chain
|
from itertools import chain
|
||||||
from _vendoring.pyrsistent._pvector import pvector
|
from pyrsistent._pvector import pvector
|
||||||
from _vendoring.pyrsistent._transformations import transform
|
from pyrsistent._transformations import transform
|
||||||
|
|
||||||
|
|
||||||
class PMap(object):
|
class PMap(object):
|
||||||
@@ -256,7 +256,7 @@ def transform(self, *transformations):
|
|||||||
consists of two parts. One match expression that specifies which elements to transform
|
consists of two parts. One match expression that specifies which elements to transform
|
||||||
and one transformation function that performs the actual transformation.
|
and one transformation function that performs the actual transformation.
|
||||||
|
|
||||||
>>> from _vendoring.pyrsistent import freeze, ny
|
>>> from pyrsistent import freeze, ny
|
||||||
>>> news_paper = freeze({'articles': [{'author': 'Sara', 'content': 'A short article'},
|
>>> news_paper = freeze({'articles': [{'author': 'Sara', 'content': 'A short article'},
|
||||||
... {'author': 'Steve', 'content': 'A slightly longer article'}],
|
... {'author': 'Steve', 'content': 'A slightly longer article'}],
|
||||||
... 'weather': {'temperature': '11C', 'wind': '5m/s'}})
|
... 'weather': {'temperature': '11C', 'wind': '5m/s'}})
|
||||||
|
|||||||
@@ -1,8 +1,8 @@
|
|||||||
from _vendoring.pyrsistent._checked_types import CheckedType, _restore_pickle, InvariantException, store_invariants
|
from pyrsistent._checked_types import CheckedType, _restore_pickle, InvariantException, store_invariants
|
||||||
from _vendoring.pyrsistent._field_common import (
|
from pyrsistent._field_common import (
|
||||||
set_fields, check_type, is_field_ignore_extra_complaint, PFIELD_NO_INITIAL, serialize, check_global_invariants
|
set_fields, check_type, is_field_ignore_extra_complaint, PFIELD_NO_INITIAL, serialize, check_global_invariants
|
||||||
)
|
)
|
||||||
from _vendoring.pyrsistent._pmap import PMap, pmap
|
from pyrsistent._pmap import PMap, pmap
|
||||||
|
|
||||||
|
|
||||||
class _PRecordMeta(type):
|
class _PRecordMeta(type):
|
||||||
@@ -28,7 +28,7 @@ class PRecord(PMap, CheckedType, metaclass=_PRecordMeta):
|
|||||||
from PRecord. Because it is a PMap it has full support for all Mapping methods such as iteration and element
|
from PRecord. Because it is a PMap it has full support for all Mapping methods such as iteration and element
|
||||||
access using subscript notation.
|
access using subscript notation.
|
||||||
|
|
||||||
More documentation and examples of PRecord usage is available at https://github.com/tobgu/_vendoring.pyrsistent
|
More documentation and examples of PRecord usage is available at https://github.com/tobgu/pyrsistent
|
||||||
"""
|
"""
|
||||||
def __new__(cls, **kwargs):
|
def __new__(cls, **kwargs):
|
||||||
# Hack total! If these two special attributes exist that means we can create
|
# Hack total! If these two special attributes exist that means we can create
|
||||||
|
|||||||
@@ -1,6 +1,6 @@
|
|||||||
from collections.abc import Set, Hashable
|
from collections.abc import Set, Hashable
|
||||||
import sys
|
import sys
|
||||||
from _vendoring.pyrsistent._pmap import pmap
|
from pyrsistent._pmap import pmap
|
||||||
|
|
||||||
|
|
||||||
class PSet(object):
|
class PSet(object):
|
||||||
|
|||||||
@@ -2,7 +2,7 @@
|
|||||||
from collections.abc import Sequence, Hashable
|
from collections.abc import Sequence, Hashable
|
||||||
from numbers import Integral
|
from numbers import Integral
|
||||||
import operator
|
import operator
|
||||||
from _vendoring.pyrsistent._transformations import transform
|
from pyrsistent._transformations import transform
|
||||||
|
|
||||||
|
|
||||||
def _bitcount(val):
|
def _bitcount(val):
|
||||||
@@ -626,7 +626,7 @@ def transform(self, *transformations):
|
|||||||
consists of two parts. One match expression that specifies which elements to transform
|
consists of two parts. One match expression that specifies which elements to transform
|
||||||
and one transformation function that performs the actual transformation.
|
and one transformation function that performs the actual transformation.
|
||||||
|
|
||||||
>>> from _vendoring.pyrsistent import freeze, ny
|
>>> from pyrsistent import freeze, ny
|
||||||
>>> news_paper = freeze({'articles': [{'author': 'Sara', 'content': 'A short article'},
|
>>> news_paper = freeze({'articles': [{'author': 'Sara', 'content': 'A short article'},
|
||||||
... {'author': 'Steve', 'content': 'A slightly longer article'}],
|
... {'author': 'Steve', 'content': 'A slightly longer article'}],
|
||||||
... 'weather': {'temperature': '11C', 'wind': '5m/s'}})
|
... 'weather': {'temperature': '11C', 'wind': '5m/s'}})
|
||||||
|
|||||||
@@ -56,7 +56,7 @@ def get_in(keys, coll, default=None, no_default=False):
|
|||||||
|
|
||||||
``get_in`` is a generalization of ``operator.getitem`` for nested data
|
``get_in`` is a generalization of ``operator.getitem`` for nested data
|
||||||
structures such as dictionaries and lists.
|
structures such as dictionaries and lists.
|
||||||
>>> from _vendoring.pyrsistent import freeze
|
>>> from pyrsistent import freeze
|
||||||
>>> transaction = freeze({'name': 'Alice',
|
>>> transaction = freeze({'name': 'Alice',
|
||||||
... 'purchase': {'items': ['Apple', 'Orange'],
|
... 'purchase': {'items': ['Apple', 'Orange'],
|
||||||
... 'costs': [0.50, 1.25]},
|
... 'costs': [0.50, 1.25]},
|
||||||
|
Some files were not shown because too many files have changed in this diff.