Merge branch 'develop' into packages/mfem-4.8
Commit 314fcc15f4

.github/workflows/sync-packages.yaml (vendored, new file, 34 lines)
@@ -0,0 +1,34 @@
+name: sync with spack/spack-packages
+
+on:
+  push:
+    branches:
+      - develop
+
+jobs:
+  sync:
+    if: github.repository == 'spack/spack'
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout spack/spack
+        run: git clone https://github.com/spack/spack.git
+      - name: Checkout spack/spack-packages
+        uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
+        with:
+          ssh-key: ${{ secrets.SYNC_PACKAGES_KEY }}
+          path: spack-packages
+          repository: spack/spack-packages
+      - name: Install git-filter-repo
+        run: |
+          curl -LfsO https://raw.githubusercontent.com/newren/git-filter-repo/refs/tags/v2.47.0/git-filter-repo
+          echo "67447413e273fc76809289111748870b6f6072f08b17efe94863a92d810b7d94  git-filter-repo" | sha256sum -c -
+          chmod +x git-filter-repo
+          sudo mv git-filter-repo /usr/local/bin/
+      - name: Sync spack/spack-packages with spack/spack
+        run: |
+          cd spack-packages
+          git-filter-repo --quiet --source ../spack --subdirectory-filter var/spack/repos --refs develop
+      - name: Push
+        run: |
+          cd spack-packages
+          git push git@github.com:spack/spack-packages.git develop:develop --force
@@ -45,10 +45,14 @@ provided binary cache, which can be a local directory or a remote URL.
 Here is an example where a build cache is created in a local directory named
 "spack-cache", to which we push the "ninja" spec:

+ninja-1.12.1-vmvycib6vmiofkdqgrblo7zsvp7odwut
+
 .. code-block:: console

    $ spack buildcache push ./spack-cache ninja
-   ==> Pushing binary packages to file:///home/spackuser/spack/spack-cache/build_cache
+   ==> Selected 30 specs to push to file:///home/spackuser/spack/spack-cache
+   ...
+   ==> [30/30] Pushed ninja@1.12.1/ngldn2k

 Note that ``ninja`` must be installed locally for this to work.

@@ -98,9 +102,10 @@ Now you can use list:
 .. code-block:: console

    $ spack buildcache list
-   ==> 1 cached build.
-   -- linux-ubuntu20.04-skylake / gcc@9.3.0 ------------------------
-   ninja@1.10.2
+   ==> 24 cached builds.
+   -- linux-ubuntu22.04-sapphirerapids / gcc@12.3.0 ----------------
+   [ ... ]
+   ninja@1.12.1

 With ``mymirror`` configured and an index available, Spack will automatically
 use it during concretization and installation. That means that you can expect
@@ -111,17 +116,17 @@ verify by re-installing ninja:

    $ spack uninstall ninja
    $ spack install ninja
-   ==> Installing ninja-1.11.1-yxferyhmrjkosgta5ei6b4lqf6bxbscz
-   ==> Fetching file:///home/spackuser/spack/spack-cache/build_cache/linux-ubuntu20.04-skylake-gcc-9.3.0-ninja-1.10.2-yxferyhmrjkosgta5ei6b4lqf6bxbscz.spec.json.sig
-   gpg: Signature made Do 12 Jan 2023 16:01:04 CET
-   gpg: using RSA key 61B82B2B2350E171BD17A1744E3A689061D57BF6
+   [ ... ]
+   ==> Installing ninja-1.12.1-ngldn2kpvb6lqc44oqhhow7fzg7xu7lh [24/24]
+   gpg: Signature made Thu 06 Mar 2025 10:03:38 AM MST
+   gpg: using RSA key 75BC0528114909C076E2607418010FFAD73C9B07
    gpg: Good signature from "example (GPG created for Spack) <example@example.com>" [ultimate]
-   ==> Fetching file:///home/spackuser/spack/spack-cache/build_cache/linux-ubuntu20.04-skylake/gcc-9.3.0/ninja-1.10.2/linux-ubuntu20.04-skylake-gcc-9.3.0-ninja-1.10.2-yxferyhmrjkosgta5ei6b4lqf6bxbscz.spack
-   ==> Extracting ninja-1.10.2-yxferyhmrjkosgta5ei6b4lqf6bxbscz from binary cache
-   ==> ninja: Successfully installed ninja-1.11.1-yxferyhmrjkosgta5ei6b4lqf6bxbscz
-   Search: 0.00s. Fetch: 0.17s. Install: 0.12s. Total: 0.29s
-   [+] /home/harmen/spack/opt/spack/linux-ubuntu20.04-skylake/gcc-9.3.0/ninja-1.11.1-yxferyhmrjkosgta5ei6b4lqf6bxbscz
+   ==> Fetching file:///home/spackuser/spack/spack-cache/blobs/sha256/f0/f08eb62661ad159d2d258890127fc6053f5302a2f490c1c7f7bd677721010ee0
+   ==> Fetching file:///home/spackuser/spack/spack-cache/blobs/sha256/c7/c79ac6e40dfdd01ac499b020e52e57aa91151febaea3ad183f90c0f78b64a31a
+   ==> Extracting ninja-1.12.1-ngldn2kpvb6lqc44oqhhow7fzg7xu7lh from binary cache
+   ==> ninja: Successfully installed ninja-1.12.1-ngldn2kpvb6lqc44oqhhow7fzg7xu7lh
+   Search: 0.00s. Fetch: 0.11s. Install: 0.11s. Extract: 0.10s. Relocate: 0.00s. Total: 0.22s
+   [+] /home/spackuser/spack/opt/spack/linux-ubuntu22.04-sapphirerapids/gcc-12.3.0/ninja-1.12.1-ngldn2kpvb6lqc44oqhhow7fzg7xu7lh

 It worked! You've just completed a full example of creating a build cache with
 a spec of interest, adding it as a mirror, updating its index, listing the contents,
@@ -344,19 +349,18 @@ which lets you get started quickly. See the following resources for more information
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^

 Create tarball of installed Spack package and all dependencies.
-Tarballs are checksummed and signed if gpg2 is available.
-Places them in a directory ``build_cache`` that can be copied to a mirror.
-Commands like ``spack buildcache install`` will search Spack mirrors for build_cache to get the list of build caches.
+Tarballs and specfiles are compressed and checksummed; manifests are signed if gpg2 is available.
+Commands like ``spack buildcache install`` will search Spack mirrors to get the list of build caches.

 ============== ========================================================================================================================
 Arguments      Description
 ============== ========================================================================================================================
 ``<specs>``    list of partial specs or hashes with a leading ``/`` to match from installed packages and used for creating build caches
-``-d <path>``  directory in which ``build_cache`` directory is created, defaults to ``.``
-``-f``         overwrite ``.spack`` file in ``build_cache`` directory if it exists
+``-d <path>``  directory in which ``v3`` and ``blobs`` directories are created, defaults to ``.``
+``-f``         overwrite compressed tarball and spec metadata files if they already exist
 ``-k <key>``   the key to sign package with. In the case where multiple keys exist, the package will be unsigned unless ``-k`` is used.
 ``-r``         make paths in binaries relative before creating tarball
-``-y``         answer yes to all create unsigned ``build_cache`` questions
+``-y``         answer yes to all questions about creating unsigned build caches
 ============== ========================================================================================================================

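
As a rough illustration of "compressed and checksummed" above, here is a minimal, hypothetical sketch (standard library only, not Spack's implementation) of producing a gzip blob and its sha256 address::

   import gzip
   import hashlib
   import shutil

   def compress_and_checksum(src_path: str, blob_path: str) -> str:
       """Gzip-compress src_path into blob_path and return the sha256 of the result."""
       with open(src_path, "rb") as src, gzip.open(blob_path, "wb") as dst:
           shutil.copyfileobj(src, dst)
       hasher = hashlib.sha256()
       with open(blob_path, "rb") as f:
           for chunk in iter(lambda: f.read(1 << 20), b""):
               hasher.update(chunk)
       return hasher.hexdigest()
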
 ^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -397,6 +401,165 @@ List public keys available on Spack mirror.
 ========= ==============================================
 Arguments Description
 ========= ==============================================
-``-i``    trust the keys downloaded with prompt for each
+``-it``   trust the keys downloaded with prompt for each
 ``-y``    answer yes to all prompts; trust all keys downloaded
 ========= ==============================================

+.. _build_cache_layout:
+
+------------------
+Build Cache Layout
+------------------
+
+This section describes the structure and content of URL-style build caches, as
+distinguished from OCI-style build caches.
+
+The entry point for a binary package is a manifest json file that points to at
+least two other files stored as content-addressed blobs. These files include a spec
+metadata file, as well as the installation directory of the package stored as
+a compressed archive file. Binary package manifest files are named to indicate
+the package name and version, as well as the hash of the concrete spec. For
+example::
+
+   gcc-runtime-12.3.0-qyu2lvgt3nxh7izxycugdbgf5gsdpkjt.spec.manifest.json
+
+would contain the manifest for a binary package of ``gcc-runtime@12.3.0``.
+The id of the built package is defined to be the DAG hash of the concrete spec,
+and exists in the name of the file as well. The id distinguishes a particular
+binary package from all other binary packages with the same package name and
+version. Below is an example binary package manifest file. Such a file would
+live in the versioned spec manifests directory of a binary mirror, for example
+``v3/manifests/spec/``::
+
+   {
+     "version": 3,
+     "data": [
+       {
+         "contentLength": 10731083,
+         "mediaType": "application/vnd.spack.install.v2.tar+gzip",
+         "compression": "gzip",
+         "checksumAlgorithm": "sha256",
+         "checksum": "0f24aa6b5dd7150067349865217acd3f6a383083f9eca111d2d2fed726c88210"
+       },
+       {
+         "contentLength": 1000,
+         "mediaType": "application/vnd.spack.spec.v5+json",
+         "compression": "gzip",
+         "checksumAlgorithm": "sha256",
+         "checksum": "fba751c4796536737c9acbb718dad7429be1fa485f5585d450ab8b25d12ae041"
+       }
+     ]
+   }
+
+The manifest points to both the compressed tar file as well as the compressed
+spec metadata file, and contains the checksum of each. This checksum
+is also used as the address of the associated file, and hence, must be
+known in order to locate the tarball or spec file within the mirror. Once the
+tarball or spec metadata file is downloaded, the checksum should be computed locally
+and compared to the checksum in the manifest to ensure the contents have not changed
+since the binary package was pushed. Spack stores all data files (including compressed
+tar files, spec metadata, indices, public keys, etc.) within a ``blobs/<hash-algorithm>/``
+directory, using the first two characters of the checksum as a sub-directory
+to reduce the number of files in a single folder. Here is a depiction of the
+organization of binary mirror contents::
+
+   mirror_directory/
+     v3/
+       layout.json
+       manifests/
+         spec/
+           gcc-runtime/
+             gcc-runtime-12.3.0-s2nqujezsce4x6uhtvxscu7jhewqzztx.spec.manifest.json
+           gmake/
+             gmake-4.4.1-lpr4j77rcgkg5536tmiuzwzlcjsiomph.spec.manifest.json
+           compiler-wrapper/
+             compiler-wrapper-1.0-s7ieuyievp57vwhthczhaq2ogowf3ohe.spec.manifest.json
+         index/
+           index.manifest.json
+         key/
+           75BC0528114909C076E2607418010FFAD73C9B07.key.manifest.json
+           keys.manifest.json
+     blobs/
+       sha256/
+         0f/
+           0f24aa6b5dd7150067349865217acd3f6a383083f9eca111d2d2fed726c88210
+         fb/
+           fba751c4796536737c9acbb718dad7429be1fa485f5585d450ab8b25d12ae041
+         2a/
+           2a21836d206ccf0df780ab0be63fdf76d24501375306a35daa6683c409b7922f
+         ...
+
+Files within the ``manifests`` directory are organized into subdirectories by
+the type of entity they represent. Binary package manifests live in the ``spec/``
+directory, binary cache index manifests live in the ``index/`` directory, and
+manifests for public keys and their indices live in the ``key/`` subdirectory.
+Regardless of the type of entity they represent, all manifest files are named
+with an extension ``.manifest.json``.
+
+Every manifest contains a ``data`` array, each element of which refers to an
+associated file stored as a content-addressed blob. Considering the example spec
+manifest shown above, the compressed installation archive can be found by
+picking out the data blob with the appropriate ``mediaType``, which in this
+case would be ``application/vnd.spack.install.v2.tar+gzip``. The associated
+file is found by looking in the blobs directory under ``blobs/sha256/0f/`` for
+the file named with the complete checksum value.
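
To make that lookup concrete, here is a minimal, hypothetical sketch (standard library only, not Spack's implementation) that picks a blob out of a manifest by ``mediaType``, derives its ``blobs/<algorithm>/<prefix>/<checksum>`` path, and verifies the checksum::

   import hashlib
   import json
   import os

   def find_and_verify_blob(mirror_root: str, manifest_path: str, media_type: str) -> str:
       """Return the path of the blob with the given mediaType, after verifying its checksum."""
       with open(manifest_path, encoding="utf-8") as f:
           manifest = json.load(f)
       record = next(d for d in manifest["data"] if d["mediaType"] == media_type)
       checksum = record["checksum"]
       blob_path = os.path.join(
           mirror_root, "blobs", record["checksumAlgorithm"], checksum[:2], checksum
       )
       hasher = hashlib.new(record["checksumAlgorithm"])
       with open(blob_path, "rb") as f:
           for chunk in iter(lambda: f.read(1 << 20), b""):
               hasher.update(chunk)
       if hasher.hexdigest() != checksum:
           raise ValueError(f"checksum mismatch for {blob_path}")
       return blob_path
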
+
+As mentioned above, every entity in a binary mirror (aka build cache) is stored
+as a content-addressed blob pointed to by a manifest. While an example spec
+manifest (i.e. a manifest for a binary package) is shown above, here is what
+the manifest of a build cache index looks like::
+
+   {
+     "version": 3,
+     "data": [
+       {
+         "contentLength": 6411,
+         "mediaType": "application/vnd.spack.db.v8+json",
+         "compression": "none",
+         "checksumAlgorithm": "sha256",
+         "checksum": "225a3e9da24d201fdf9d8247d66217f5b3f4d0fc160db1498afd998bfd115234"
+       }
+     ]
+   }
+
+Some things to note about this manifest are that it points to a blob that is not
+compressed (``compression: "none"``), and that the ``mediaType`` is one we have
+not seen yet, ``application/vnd.spack.db.v8+json``. The decision not to compress
+build cache indices stems from the fact that spack does not yet sign build cache
+index manifests. Once that changes, you may start to see these indices stored as
+compressed blobs.
+
+For completeness, here are examples of manifests for the other two types of entities
+you might find in a spack build cache. First, a public key manifest::
+
+   {
+     "version": 3,
+     "data": [
+       {
+         "contentLength": 2472,
+         "mediaType": "application/pgp-keys",
+         "compression": "none",
+         "checksumAlgorithm": "sha256",
+         "checksum": "9fc18374aebc84deb2f27898da77d4d4410e5fb44c60c6238cb57fb36147e5c7"
+       }
+     ]
+   }
+
+Note the ``mediaType`` of ``application/pgp-keys``. Finally, a public key index manifest::
+
+   {
+     "version": 3,
+     "data": [
+       {
+         "contentLength": 56,
+         "mediaType": "application/vnd.spack.keyindex.v1+json",
+         "compression": "none",
+         "checksumAlgorithm": "sha256",
+         "checksum": "29b3a0eb6064fd588543bc43ac7d42d708a69058dafe4be0859e3200091a9a1c"
+       }
+     ]
+   }
+
+Again note the ``mediaType`` of ``application/vnd.spack.keyindex.v1+json``. Also note
+that both of the above manifests refer to uncompressed blobs; this is for the same
+reason spack does not yet compress build cache index blobs.
@@ -539,7 +539,9 @@ from the command line.

 You can also include an environment directly in the ``spack.yaml`` file. It
 involves adding the ``include_concrete`` heading in the yaml followed by the
-absolute path to the independent environments.
+absolute path to the independent environments. Note that you may use Spack
+config variables such as ``$spack`` or environment variables as long as the
+expression expands to an absolute path.

 .. code-block:: yaml

@@ -549,7 +551,7 @@ absolute path to the independent environments.
       unify: true
     include_concrete:
     - /absolute/path/to/environment1
-    - /absolute/path/to/environment2
+    - $spack/../path/to/environment2


 Once the ``spack.yaml`` has been updated you must concretize the environment to
@@ -497,7 +497,7 @@ extends Spack's ``Package`` class. For example, here is
 .. code-block:: python
    :linenos:

-   from spack import *
+   from spack.package import *

    class Libelf(Package):
       """ ... description ... """
@@ -1089,7 +1089,7 @@ You've already seen the ``homepage`` and ``url`` package attributes:
 .. code-block:: python
    :linenos:

-   from spack import *
+   from spack.package import *


    class Mpich(Package):
@@ -6183,7 +6183,7 @@ running:

 .. code-block:: python

-   from spack import *
+   from spack.package import *

 This is already part of the boilerplate for packages created with
 ``spack create``.
@@ -176,92 +176,72 @@ community without needing deep familiarity with GnuPG or Public Key
 Infrastructure.


-.. _build_cache_format:
+.. _build_cache_signing:

-------------------
-Build Cache Format
-------------------
+-------------------
+Build Cache Signing
+-------------------

-A binary package consists of a metadata file unambiguously defining the
-built package (and including other details such as how to relocate it)
-and the installation directory of the package stored as a compressed
-archive file. The metadata files can either be unsigned, in which case
-the contents are simply the json-serialized concrete spec plus metadata,
-or they can be signed, in which case the json-serialized concrete spec
-plus metadata is wrapped in a gpg cleartext signature. Built package
-metadata files are named to indicate the operating system and
-architecture for which the package was built as well as the compiler
-used to build it and the packages name and version. For example::
-
-    linux-ubuntu18.04-haswell-gcc-7.5.0-zlib-1.2.12-llv2ysfdxnppzjrt5ldybb5c52qbmoow.spec.json.sig
-
-would contain the concrete spec and binary metadata for a binary package
-of ``zlib@1.2.12``, built for the ``ubuntu`` operating system and ``haswell``
-architecture. The id of the built package exists in the name of the file
-as well (after the package name and version) and in this case begins
-with ``llv2ys``. The id distinguishes a particular built package from all
-other built packages with the same os/arch, compiler, name, and version.
-Below is an example of a signed binary package metadata file. Such a
-file would live in the ``build_cache`` directory of a binary mirror::
+For an in-depth description of the layout of a binary mirror, see
+the :ref:`documentation<build_cache_layout>` covering binary caches. The
+key takeaway from that discussion that applies here is that the entry point
+to a binary package is its manifest. The manifest refers unambiguously to the
+spec metadata and compressed archive, which are stored as content-addressed
+blobs.
+
+The manifest files can either be signed or unsigned, but are always given
+a name ending with ``.spec.manifest.json`` regardless. The difference between
+signed and unsigned manifests is simply that the signed version is wrapped in
+a gpg cleartext signature, as illustrated below::

     -----BEGIN PGP SIGNED MESSAGE-----
     Hash: SHA512

     {
-        "spec": {
-            <concrete-spec-contents-omitted>
-        },
-
-        "buildcache_layout_version": 1,
-        "binary_cache_checksum": {
-            "hash_algorithm": "sha256",
-            "hash": "4f1e46452c35a5e61bcacca205bae1bfcd60a83a399af201a29c95b7cc3e1423"
-        }
+        "version": 3,
+        "data": [
+            {
+                "contentLength": 10731083,
+                "mediaType": "application/vnd.spack.install.v2.tar+gzip",
+                "compression": "gzip",
+                "checksumAlgorithm": "sha256",
+                "checksum": "0f24aa6b5dd7150067349865217acd3f6a383083f9eca111d2d2fed726c88210"
+            },
+            {
+                "contentLength": 1000,
+                "mediaType": "application/vnd.spack.spec.v5+json",
+                "compression": "gzip",
+                "checksumAlgorithm": "sha256",
+                "checksum": "fba751c4796536737c9acbb718dad7429be1fa485f5585d450ab8b25d12ae041"
+            }
+        ]
     }

     -----BEGIN PGP SIGNATURE-----
-    iQGzBAEBCgAdFiEETZn0sLle8jIrdAPLx/P+voVcifMFAmKAGvwACgkQx/P+voVc
-    ifNoVgv/VrhA+wurVs5GB9PhmMA1m5U/AfXZb4BElDRwpT8ZcTPIv5X8xtv60eyn
-    4EOneGVbZoMThVxgev/NKARorGmhFXRqhWf+jknJZ1dicpqn/qpv34rELKUpgXU+
-    QDQ4d1P64AIdTczXe2GI9ZvhOo6+bPvK7LIsTkBbtWmopkomVxF0LcMuxAVIbA6b
-    887yBvVO0VGlqRnkDW7nXx49r3AG2+wDcoU1f8ep8QtjOcMNaPTPJ0UnjD0VQGW6
-    4ZFaGZWzdo45MY6tF3o5mqM7zJkVobpoW3iUz6J5tjz7H/nMlGgMkUwY9Kxp2PVH
-    qoj6Zip3LWplnl2OZyAY+vflPFdFh12Xpk4FG7Sxm/ux0r+l8tCAPvtw+G38a5P7
-    QEk2JBr8qMGKASmnRlJUkm1vwz0a95IF3S9YDfTAA2vz6HH3PtsNLFhtorfx8eBi
-    Wn5aPJAGEPOawEOvXGGbsH4cDEKPeN0n6cy1k92uPEmBLDVsdnur8q42jk5c2Qyx
-    j3DXty57
-    =3gvm
+    iQGzBAEBCgAdFiEEdbwFKBFJCcB24mB0GAEP+tc8mwcFAmf2rr4ACgkQGAEP+tc8
+    mwfefwv+KJs8MsQ5ovFaBdmyx5H/3k4rO4QHBzuSPOB6UaxErA9IyOB31iP6vNTU
+    HzYpxz6F5dJCJWmmNEMN/0+vjhMHEOkqd7M1l5reVcxduTF2yc4tBZUO2gienEHL
+    W0e+SnUznl1yc/aVpChUiahO2zToCsI8HZRNT4tu6iCnE/OpghqjsSdBOZHmSNDD
+    5wuuCxfDUyWI6ZlLclaaB7RdbCUUJf/iqi711J+wubvnDFhc6Ynwm1xai5laJ1bD
+    ev3NrSb2AAroeNFVo4iECA0fZC1OZQYzaRmAEhBXtCideGJ5Zf2Cp9hmCwNK8Hq6
+    bNt94JP9LqC3FCCJJOMsPyOOhMSA5MU44zyyzloRwEQpHHLuFzVdbTHA3dmTc18n
+    HxNLkZoEMYRc8zNr40g0yb2lCbc+P11TtL1E+5NlE34MX15mPewRCiIFTMwhCnE3
+    gFSKtW1MKustZE35/RUwd2mpJRf+mSRVCl1f1RiFjktLjz7vWQq7imIUSam0fPDr
+    XD4aDogm
+    =RrFX
     -----END PGP SIGNATURE-----

 If a user has trusted the public key associated with the private key
-used to sign the above spec file, the signature can be verified with
+used to sign the above manifest file, the signature can be verified with
 gpg, as follows::

-    $ gpg --verify linux-ubuntu18.04-haswell-gcc-7.5.0-zlib-1.2.12-llv2ysfdxnppzjrt5ldybb5c52qbmoow.spec.json.sig
+    $ gpg --verify gcc-runtime-12.3.0-s2nqujezsce4x6uhtvxscu7jhewqzztx.spec.manifest.json

-The metadata (regardless whether signed or unsigned) contains the checksum
-of the ``.spack`` file containing the actual installation. The checksum should
-be compared to a checksum computed locally on the ``.spack`` file to ensure the
-contents have not changed since the binary spec plus metadata were signed. The
-``.spack`` files are actually tarballs containing the compressed archive of the
+When attempting to install a binary package that has been signed, spack will
+attempt to verify the signature with one of the trusted keys in its keyring,
+and will fail if unable to do so. While not recommended, it is possible to
+force installation of a signed package without verification by providing the
+``--no-check-signature`` argument to ``spack install ...``.
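
Spack parses the payload of a clearsigned manifest with its own helper, ``spack.spec.Spec.extract_json_from_clearsig`` (used in the migration code added elsewhere in this commit). As a rough, hypothetical sketch of the idea, the JSON can be recovered by stripping the signature armor::

   import json

   def extract_clearsigned_json(text: str) -> dict:
       # Keep only the lines between the signed-message header and the
       # signature block, skipping the "Hash:" header and blank line.
       lines = text.splitlines()
       start = lines.index("-----BEGIN PGP SIGNED MESSAGE-----") + 1
       end = lines.index("-----BEGIN PGP SIGNATURE-----")
       body = lines[start:end]
       while body and (body[0].startswith("Hash:") or not body[0].strip()):
           body.pop(0)
       return json.loads("\n".join(body))
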
-install tree. These files, along with the metadata files, live within the
-``build_cache`` directory of the mirror, and together are organized as follows::
-
-    build_cache/
-        # unsigned metadata (for indexing, contains sha256 of .spack file)
-        <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spec.json
-        # clearsigned metadata (same as above, but signed)
-        <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spec.json.sig
-        <arch>/
-            <compiler>/
-                <name>-<ver>/
-                    # tar.gz-compressed prefix (may support more compression formats later)
-                    <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spack
-
-Uncompressing and extracting the ``.spack`` file results in the install tree.
-This is in contrast to previous versions of spack, where the ``.spack`` file
-contained a (duplicated) metadata file, a signature file and a nested tarball
-containing the install tree.

 .. _internal_implementation:

@@ -320,10 +300,10 @@ the following way:
    Reputational Public Key are imported into a keyring by the ``spack gpg …``
    sub-command. This is initiated by the job’s build script which is created by
    the generate job at the beginning of the pipeline.
-4. Assuming the package has dependencies those specs are verified using
+4. Assuming the package has dependencies, those spec manifests are verified using
    the keyring.
-5. The package is built and the spec.json is generated
-6. The spec.json is signed by the keyring and uploaded to the mirror’s
+5. The package is built and the spec manifest is generated
+6. The spec manifest is signed by the keyring and uploaded to the mirror’s
    build cache.

 **Reputational Key**
@@ -376,24 +356,24 @@ following way:
 4. In addition to the secret, the runner creates a tmpfs memory mounted
    directory where the GnuPG keyring will be created to verify, and
    then resign the package specs.
-5. The job script syncs all spec.json.sig files from the build cache to
+5. The job script syncs all spec manifest files from the build cache to
    a working directory in the job’s execution environment.
 6. The job script then runs the ``sign.sh`` script built into the
    notary Docker image.
 7. The ``sign.sh`` script imports the public components of the
    Reputational and Intermediate CI Keys and uses them to verify good
-   signatures on the spec.json.sig files. If any signed spec does not
-   verify the job immediately fails.
-8. Assuming all specs are verified, the ``sign.sh`` script then unpacks
-   the spec json data from the signed file in preparation for being
+   signatures on the spec.manifest.json files. If any signed manifest
+   does not verify, the job immediately fails.
+8. Assuming all manifests are verified, the ``sign.sh`` script then unpacks
+   the manifest json data from the signed file in preparation for being
    re-signed with the Reputational Key.
 9. The private components of the Reputational Key are decrypted to
    standard out using ``aws-encryption-cli`` directly into a ``gpg
    --import …`` statement which imports the key into the
    keyring mounted in-memory.
-10. The private key is then used to sign each of the json specs and the
+10. The private key is then used to sign each of the manifests and the
     keyring is removed from disk.
-11. The re-signed json specs are resynced to the AWS S3 Mirror and the
+11. The re-signed manifests are resynced to the AWS S3 Mirror and the
     public signing of the packages for the develop or release pipeline
     that created them is complete.

@@ -350,7 +350,7 @@ def _ensure_no_folders_without_package_py(error_cls):
     for repository in spack.repo.PATH.repos:
         missing = []
         for entry in os.scandir(repository.packages_path):
-            if not entry.is_dir():
+            if not entry.is_dir() or entry.name == "__pycache__":
                 continue
             package_py = pathlib.Path(entry.path) / spack.repo.package_file_name
             if not package_py.exists():
(File diff suppressed because it is too large.)
@@ -36,7 +36,7 @@ class CompilerPackage(spack.package_base.PackageBase):

     #: Compiler argument(s) that produces version information
     #: If multiple arguments, the earlier arguments must produce errors when invalid
-    compiler_version_argument: Union[str, Tuple[str]] = "-dumpversion"
+    compiler_version_argument: Union[str, Tuple[str, ...]] = "-dumpversion"

     #: Regex used to extract version from compiler's output
     compiler_version_regex: str = "(.*)"
lib/spack/spack/buildcache_migrate.py (new file, 351 lines)

@@ -0,0 +1,351 @@
+# Copyright Spack Project Developers. See COPYRIGHT file for details.
+#
+# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+
+import codecs
+import json
+import os
+import pathlib
+import tempfile
+from typing import NamedTuple
+
+import llnl.util.tty as tty
+
+import spack.binary_distribution as bindist
+import spack.database as spack_db
+import spack.error
+import spack.mirrors.mirror
+import spack.spec
+import spack.stage
+import spack.util.crypto
+import spack.util.parallel
+import spack.util.url as url_util
+import spack.util.web as web_util
+
+from .enums import InstallRecordStatus
+from .url_buildcache import (
+    BlobRecord,
+    BuildcacheComponent,
+    compressed_json_from_dict,
+    get_url_buildcache_class,
+    sign_file,
+    try_verify,
+)
+
+
+def v2_tarball_directory_name(spec):
+    """
+    Return name of the tarball directory according to the convention
+    <os>-<architecture>/<compiler>/<package>-<version>/
+    """
+    return spec.format_path("{architecture}/{compiler.name}-{compiler.version}/{name}-{version}")
+
+
+def v2_tarball_name(spec, ext):
+    """
+    Return the name of the tarfile according to the convention
+    <os>-<architecture>-<package>-<dag_hash><ext>
+    """
+    spec_formatted = spec.format_path(
+        "{architecture}-{compiler.name}-{compiler.version}-{name}-{version}-{hash}"
+    )
+    return f"{spec_formatted}{ext}"
+
+
+def v2_tarball_path_name(spec, ext):
+    """
+    Return the full path+name for a given spec according to the convention
+    <tarball_directory_name>/<tarball_name>
+    """
+    return os.path.join(v2_tarball_directory_name(spec), v2_tarball_name(spec, ext))
+
+
+class MigrateSpecResult(NamedTuple):
+    success: bool
+    message: str
+
+
+class MigrationException(spack.error.SpackError):
+    """
+    Raised when migration fails irrevocably
+    """
+
+    def __init__(self, msg):
+        super().__init__(msg)
+
+
+def _migrate_spec(
+    s: spack.spec.Spec, mirror_url: str, tmpdir: str, unsigned: bool = False, signing_key: str = ""
+) -> MigrateSpecResult:
+    """Parallelizable function to migrate a single spec"""
+    print_spec = f"{s.name}/{s.dag_hash()[:7]}"
+
+    # Check if the spec file exists in the new location and exit early if so
+    v3_cache_class = get_url_buildcache_class(layout_version=3)
+    v3_cache_entry = v3_cache_class(mirror_url, s, allow_unsigned=unsigned)
+    exists = v3_cache_entry.exists([BuildcacheComponent.SPEC, BuildcacheComponent.TARBALL])
+    v3_cache_entry.destroy()
+
+    if exists:
+        msg = f"No need to migrate {print_spec}"
+        return MigrateSpecResult(True, msg)
+
+    # Try to fetch the spec metadata
+    v2_metadata_urls = [
+        url_util.join(mirror_url, "build_cache", v2_tarball_name(s, ".spec.json.sig"))
+    ]
+
+    if unsigned:
+        v2_metadata_urls.append(
+            url_util.join(mirror_url, "build_cache", v2_tarball_name(s, ".spec.json"))
+        )
+
+    spec_contents = None
+
+    for meta_url in v2_metadata_urls:
+        try:
+            _, _, meta_file = web_util.read_from_url(meta_url)
+            spec_contents = codecs.getreader("utf-8")(meta_file).read()
+            v2_spec_url = meta_url
+            break
+        except (web_util.SpackWebError, OSError):
+            pass
+    else:
+        msg = f"Unable to read metadata for {print_spec}"
+        return MigrateSpecResult(False, msg)
+
+    spec_dict = {}
+
+    if unsigned:
+        # User asked for unsigned, if we found a signed specfile, just ignore
+        # the signature
+        if v2_spec_url.endswith(".sig"):
+            spec_dict = spack.spec.Spec.extract_json_from_clearsig(spec_contents)
+        else:
+            spec_dict = json.loads(spec_contents)
+    else:
+        # User asked for signed, we must successfully verify the signature
+        local_signed_pre_verify = os.path.join(
+            tmpdir, f"{s.name}_{s.dag_hash()}_verify.spec.json.sig"
+        )
+        with open(local_signed_pre_verify, "w", encoding="utf-8") as fd:
+            fd.write(spec_contents)
+        if not try_verify(local_signed_pre_verify):
+            return MigrateSpecResult(False, f"Failed to verify signature of {print_spec}")
+        with open(local_signed_pre_verify, encoding="utf-8") as fd:
+            spec_dict = spack.spec.Spec.extract_json_from_clearsig(fd.read())
+
+    # Read out and remove the bits needed to rename and position the archive
+    bcc = spec_dict.pop("binary_cache_checksum", None)
+    if not bcc:
+        msg = "Cannot migrate a spec that does not have 'binary_cache_checksum'"
+        return MigrateSpecResult(False, msg)
+
+    algorithm = bcc["hash_algorithm"]
+    checksum = bcc["hash"]
+
+    # TODO: Remove this key once oci buildcache no longer uses it
+    spec_dict["buildcache_layout_version"] = 2
+
+    v2_archive_url = url_util.join(mirror_url, "build_cache", v2_tarball_path_name(s, ".spack"))
+
+    # spack's web utilities do not include direct copying of s3 objects, so we
+    # need to download the archive locally, and then push it back to the target
+    # location
+    archive_stage_path = os.path.join(tmpdir, f"archive_stage_{s.name}_{s.dag_hash()}")
+    archive_stage = spack.stage.Stage(v2_archive_url, path=archive_stage_path)
+
+    try:
+        archive_stage.create()
+        archive_stage.fetch()
+    except spack.error.FetchError:
+        return MigrateSpecResult(False, f"Unable to fetch archive for {print_spec}")
+
+    local_tarfile_path = archive_stage.save_filename
+
+    # As long as we have to download the tarball anyway, we might as well compute the
+    # checksum locally and check it against the expected value
+    local_checksum = spack.util.crypto.checksum(
+        spack.util.crypto.hash_fun_for_algo(algorithm), local_tarfile_path
+    )
+
+    if local_checksum != checksum:
+        return MigrateSpecResult(
+            False, f"Checksum mismatch for {print_spec}: expected {checksum}, got {local_checksum}"
+        )
+
+    spec_dict["archive_size"] = os.stat(local_tarfile_path).st_size
+
+    # Compress the spec dict and compute its checksum
+    metadata_checksum_algo = "sha256"
+    spec_json_path = os.path.join(tmpdir, f"{s.name}_{s.dag_hash()}.spec.json")
+    metadata_checksum, metadata_size = compressed_json_from_dict(
+        spec_json_path, spec_dict, metadata_checksum_algo
+    )
+
+    tarball_blob_record = BlobRecord(
+        spec_dict["archive_size"], v3_cache_class.TARBALL_MEDIATYPE, "gzip", algorithm, checksum
+    )
+
+    metadata_blob_record = BlobRecord(
+        metadata_size,
+        v3_cache_class.SPEC_MEDIATYPE,
+        "gzip",
+        metadata_checksum_algo,
+        metadata_checksum,
+    )
+
+    # Compute the urls to the new blobs
+    v3_archive_url = v3_cache_class.get_blob_url(mirror_url, tarball_blob_record)
+    v3_spec_url = v3_cache_class.get_blob_url(mirror_url, metadata_blob_record)
+
+    # First push the tarball
+    tty.debug(f"Pushing {local_tarfile_path} to {v3_archive_url}")
+
+    try:
+        web_util.push_to_url(local_tarfile_path, v3_archive_url, keep_original=True)
+    except Exception:
+        return MigrateSpecResult(False, f"Failed to push archive for {print_spec}")
+
+    # Then push the spec file
+    tty.debug(f"Pushing {spec_json_path} to {v3_spec_url}")
+
+    try:
+        web_util.push_to_url(spec_json_path, v3_spec_url, keep_original=True)
+    except Exception:
+        return MigrateSpecResult(False, f"Failed to push spec metadata for {print_spec}")
+
+    # Generate the manifest and write it to a temporary location
+    manifest = {
+        "version": v3_cache_class.get_layout_version(),
+        "data": [tarball_blob_record.to_dict(), metadata_blob_record.to_dict()],
+    }
+
+    manifest_path = os.path.join(tmpdir, f"{s.dag_hash()}.manifest.json")
+    with open(manifest_path, "w", encoding="utf-8") as f:
+        json.dump(manifest, f, indent=0, separators=(",", ":"))
+        # Note: when using gpg clear sign, we need to avoid long lines (19995
+        # chars). If lines are longer, they are truncated without error. So,
+        # here we still add newlines, but no indent, to save on file size and
+        # line length.

+    # Possibly sign the manifest
+    if not unsigned:
+        manifest_path = sign_file(signing_key, manifest_path)
+
+    v3_manifest_url = v3_cache_class.get_manifest_url(s, mirror_url)
+
+    # Push the manifest
+    try:
+        web_util.push_to_url(manifest_path, v3_manifest_url, keep_original=True)
+    except Exception:
+        return MigrateSpecResult(False, f"Failed to push manifest for {print_spec}")
+
+    return MigrateSpecResult(True, f"Successfully migrated {print_spec}")
+
+
+def migrate(
+    mirror: spack.mirrors.mirror.Mirror, unsigned: bool = False, delete_existing: bool = False
+) -> None:
+    """Perform migration of the given mirror
+
+    If unsigned is True, signatures on signed specs will be ignored, and specs
+    will not be re-signed before pushing to the new location. Otherwise, spack
+    will attempt to verify signatures and re-sign specs, and will fail if not
+    able to do so. If delete_existing is True, spack will delete the original
+    contents of the mirror once the migration is complete."""
+    signing_key = ""
+    if not unsigned:
+        try:
+            signing_key = bindist.select_signing_key()
+        except (bindist.NoKeyException, bindist.PickKeyException):
+            raise MigrationException(
+                "Signed migration requires exactly one secret key in keychain"
+            )
+
+    delete_action = "deleting" if delete_existing else "keeping"
+    sign_action = "an unsigned" if unsigned else "a signed"
+    mirror_url = mirror.fetch_url
+
+    tty.msg(
+        f"Performing {sign_action} migration of {mirror.push_url} "
+        f"and {delete_action} existing contents"
+    )
+
+    index_url = url_util.join(mirror_url, "build_cache", spack_db.INDEX_JSON_FILE)
+    contents = None
+
+    try:
+        _, _, index_file = web_util.read_from_url(index_url)
+        contents = codecs.getreader("utf-8")(index_file).read()
+    except (web_util.SpackWebError, OSError):
+        raise MigrationException("Buildcache migration requires a buildcache index")
+
+    with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
+        index_path = os.path.join(tmpdir, "_tmp_index.json")
+        with open(index_path, "w", encoding="utf-8") as fd:
+            fd.write(contents)
+
+        db = bindist.BuildCacheDatabase(tmpdir)
+        db._read_from_file(pathlib.Path(index_path))
+
+        specs_to_migrate = [
+            s
+            for s in db.query_local(installed=InstallRecordStatus.ANY)
+            if not s.external and db.query_local_by_spec_hash(s.dag_hash()).in_buildcache
+        ]
+
+        # Run the tasks in parallel if possible
+        executor = spack.util.parallel.make_concurrent_executor()
+        migrate_futures = [
+            executor.submit(_migrate_spec, spec, mirror_url, tmpdir, unsigned, signing_key)
+            for spec in specs_to_migrate
+        ]
+
+        success_count = 0
+
+        tty.msg("Migration summary:")
+        for spec, migrate_future in zip(specs_to_migrate, migrate_futures):
+            result = migrate_future.result()
+            msg = f"  {spec.name}/{spec.dag_hash()[:7]}: {result.message}"
+            if result.success:
+                success_count += 1
+                tty.msg(msg)
+            else:
+                tty.error(msg)
+            # The migrated index should have the same specs as the original index,
+            # modulo any specs that we failed to migrate for whatever reason. So
+            # to avoid having to re-fetch all the spec files now, just mark them
+            # appropriately in the existing database and push that.
+            db.mark(spec, "in_buildcache", result.success)
+
+        if success_count > 0:
+            tty.msg("Updating index and pushing keys")
+
+            # If the layout.json doesn't yet exist on this mirror, push it
+            v3_cache_class = get_url_buildcache_class(layout_version=3)
+            v3_cache_class.maybe_push_layout_json(mirror_url)
+
+            # Push the migrated mirror index
+            index_tmpdir = os.path.join(tmpdir, "rebuild_index")
+            os.mkdir(index_tmpdir)
+            bindist._push_index(db, index_tmpdir, mirror_url)
+
+            # Push the public part of the signing key
+            if not unsigned:
+                keys_tmpdir = os.path.join(tmpdir, "keys")
+                os.mkdir(keys_tmpdir)
+                bindist._url_push_keys(
+                    mirror_url, keys=[signing_key], update_index=True, tmpdir=keys_tmpdir
+                )
+        else:
+            tty.warn("No specs migrated, did you mean to perform an unsigned migration instead?")
+
+        # Delete the old layout if the user requested it
+        if delete_existing:
+            delete_prefix = url_util.join(mirror_url, "build_cache")
+            tty.msg(f"Recursively deleting {delete_prefix}")
+            web_util.remove_url(delete_prefix, recursive=True)
+
+    tty.msg("Migration complete")
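
A hypothetical usage sketch of the new module (not part of this diff; the mirror URL is a placeholder, and constructing the ``Mirror`` object may differ in practice)::

   import spack.mirrors.mirror
   from spack.buildcache_migrate import migrate

   # Migrate an existing v2-layout mirror to the v3 layout, ignoring
   # signatures and keeping the original build_cache contents in place.
   mirror = spack.mirrors.mirror.Mirror("file:///path/to/mirror")
   migrate(mirror, unsigned=True, delete_existing=False)
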
@@ -33,6 +33,7 @@
 import spack.paths
 import spack.repo
 import spack.spec
+import spack.stage
 import spack.store
 import spack.util.git
 import spack.util.gpg as gpg_util
@@ -245,7 +246,9 @@ def rebuild_filter(s: spack.spec.Spec) -> RebuildDecision:
     if not spec_locations:
         return RebuildDecision(True, "not found anywhere")

-    urls = ",".join([loc["mirror_url"] for loc in spec_locations])
+    urls = ",".join(
+        [f"{loc.url_and_version.url}@v{loc.url_and_version.version}" for loc in spec_locations]
+    )
     message = f"up-to-date [{urls}]"
     return RebuildDecision(False, message)

@@ -1242,33 +1245,31 @@ def write_broken_spec(url, pkg_name, stack_name, job_url, pipeline_url, spec_dict
     """Given a url to write to and the details of the failed job, write an entry
     in the broken specs list.
     """
-    tmpdir = tempfile.mkdtemp()
-    file_path = os.path.join(tmpdir, "broken.txt")
+    with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
+        file_path = os.path.join(tmpdir, "broken.txt")

-    broken_spec_details = {
-        "broken-spec": {
-            "job-name": pkg_name,
-            "job-stack": stack_name,
-            "job-url": job_url,
-            "pipeline-url": pipeline_url,
-            "concrete-spec-dict": spec_dict,
+        broken_spec_details = {
+            "broken-spec": {
+                "job-name": pkg_name,
+                "job-stack": stack_name,
+                "job-url": job_url,
+                "pipeline-url": pipeline_url,
+                "concrete-spec-dict": spec_dict,
+            }
         }
-    }

-    try:
-        with open(file_path, "w", encoding="utf-8") as fd:
-            syaml.dump(broken_spec_details, fd)
-        web_util.push_to_url(
-            file_path, url, keep_original=False, extra_args={"ContentType": "text/plain"}
-        )
-    except Exception as err:
-        # If there is an S3 error (e.g., access denied or connection
-        # error), the first non boto-specific class in the exception
-        # hierarchy is Exception. Just print a warning and return
-        msg = f"Error writing to broken specs list {url}: {err}"
-        tty.warn(msg)
-    finally:
-        shutil.rmtree(tmpdir)
+        try:
+            with open(file_path, "w", encoding="utf-8") as fd:
+                syaml.dump(broken_spec_details, fd)
+            web_util.push_to_url(
+                file_path, url, keep_original=False, extra_args={"ContentType": "text/plain"}
+            )
+        except Exception as err:
+            # If there is an S3 error (e.g., access denied or connection
+            # error), the first non boto-specific class in the exception
+            # hierarchy is Exception. Just print a warning and return
+            msg = f"Error writing to broken specs list {url}: {err}"
+            tty.warn(msg)


 def read_broken_spec(broken_spec_url):
@@ -31,12 +31,12 @@
 import spack.spec
 import spack.util.compression as compression
 import spack.util.spack_yaml as syaml
-import spack.util.url as url_util
 import spack.util.web as web_util
 from spack import traverse
 from spack.reporters import CDash, CDashConfiguration
 from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
 from spack.reporters.cdash import build_stamp as cdash_build_stamp
+from spack.url_buildcache import get_url_buildcache_class

 IS_WINDOWS = sys.platform == "win32"
 SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
@@ -179,33 +179,13 @@ def write_pipeline_manifest(specs, src_prefix, dest_prefix, output_file):

     for release_spec in specs:
         release_spec_dag_hash = release_spec.dag_hash()
-        # TODO: This assumes signed version of the spec
-        buildcache_copies[release_spec_dag_hash] = [
-            {
-                "src": url_util.join(
-                    src_prefix,
-                    bindist.build_cache_relative_path(),
-                    bindist.tarball_name(release_spec, ".spec.json.sig"),
-                ),
-                "dest": url_util.join(
-                    dest_prefix,
-                    bindist.build_cache_relative_path(),
-                    bindist.tarball_name(release_spec, ".spec.json.sig"),
-                ),
-            },
-            {
-                "src": url_util.join(
-                    src_prefix,
-                    bindist.build_cache_relative_path(),
-                    bindist.tarball_path_name(release_spec, ".spack"),
-                ),
-                "dest": url_util.join(
-                    dest_prefix,
-                    bindist.build_cache_relative_path(),
-                    bindist.tarball_path_name(release_spec, ".spack"),
-                ),
-            },
-        ]
+        cache_class = get_url_buildcache_class(
+            layout_version=bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
+        )
+        buildcache_copies[release_spec_dag_hash] = {
+            "src": cache_class.get_manifest_url(release_spec, src_prefix),
+            "dest": cache_class.get_manifest_url(release_spec, dest_prefix),
+        }

     target_dir = os.path.dirname(output_file)

@@ -292,6 +292,9 @@ def main_script_replacements(cmd):
     )
     maybe_generate_manifest(pipeline, options, manifest_path)

+    relative_specs_url = bindist.buildcache_relative_specs_url()
+    relative_keys_url = bindist.buildcache_relative_keys_url()
+
     if options.pipeline_type == PipelineType.COPY_ONLY:
         stage_names.append("copy")
         sync_job = copy.deepcopy(spack_ci_ir["jobs"]["copy"]["attributes"])
@@ -301,9 +304,12 @@ def main_script_replacements(cmd):
         if "variables" not in sync_job:
             sync_job["variables"] = {}

-        sync_job["variables"][
-            "SPACK_COPY_ONLY_DESTINATION"
-        ] = options.buildcache_destination.fetch_url
+        sync_job["variables"].update(
+            {
+                "SPACK_COPY_ONLY_DESTINATION": options.buildcache_destination.fetch_url,
+                "SPACK_BUILDCACHE_RELATIVE_KEYS_URL": relative_keys_url,
+            }
+        )

         pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
         if "buildcache-source" not in pipeline_mirrors:
@@ -333,9 +339,13 @@ def main_script_replacements(cmd):
         signing_job["interruptible"] = True
         if "variables" not in signing_job:
             signing_job["variables"] = {}
-        signing_job["variables"][
-            "SPACK_BUILDCACHE_DESTINATION"
-        ] = options.buildcache_destination.push_url
+        signing_job["variables"].update(
+            {
+                "SPACK_BUILDCACHE_DESTINATION": options.buildcache_destination.push_url,
+                "SPACK_BUILDCACHE_RELATIVE_SPECS_URL": relative_specs_url,
+                "SPACK_BUILDCACHE_RELATIVE_KEYS_URL": relative_keys_url,
+            }
+        )
         signing_job["dependencies"] = []

         output_object["sign-pkgs"] = signing_job
@@ -2,6 +2,7 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import os
+import pathlib
 import shutil
 import sys
 import tempfile
@@ -28,7 +29,7 @@


 # Tarball to be downloaded if binary packages are requested in a local mirror
-BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.6/bootstrap-buildcache.tar.gz"
+BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.6/bootstrap-buildcache-v3.tar.gz"

 #: Subdirectory where to create the mirror
 LOCAL_MIRROR_DIR = "bootstrap_cache"
@ -410,8 +411,9 @@ def _mirror(args):
|
|||||||
stage.create()
|
stage.create()
|
||||||
stage.fetch()
|
stage.fetch()
|
||||||
stage.expand_archive()
|
stage.expand_archive()
|
||||||
build_cache_dir = os.path.join(stage.source_path, "build_cache")
|
stage_dir = pathlib.Path(stage.source_path)
|
||||||
shutil.move(build_cache_dir, mirror_dir)
|
for entry in stage_dir.iterdir():
|
||||||
|
shutil.move(str(entry), mirror_dir)
|
||||||
llnl.util.tty.set_msg_enabled(True)
|
llnl.util.tty.set_msg_enabled(True)
|
||||||
|
|
||||||
def write_metadata(subdir, metadata):
|
def write_metadata(subdir, metadata):
|
||||||
@ -436,7 +438,6 @@ def write_metadata(subdir, metadata):
|
|||||||
shutil.copy(spack.util.path.canonicalize_path(GNUPG_JSON), abs_directory)
|
shutil.copy(spack.util.path.canonicalize_path(GNUPG_JSON), abs_directory)
|
||||||
shutil.copy(spack.util.path.canonicalize_path(PATCHELF_JSON), abs_directory)
|
shutil.copy(spack.util.path.canonicalize_path(PATCHELF_JSON), abs_directory)
|
||||||
instructions += cmd.format("local-binaries", rel_directory)
|
instructions += cmd.format("local-binaries", rel_directory)
|
||||||
instructions += " % spack buildcache update-index <final-path>/bootstrap_cache\n"
|
|
||||||
print(instructions)
|
print(instructions)
|
||||||
|
|
||||||
|
|
@@ -4,11 +4,9 @@
 import argparse
 import glob
 import json
-import os
-import shutil
 import sys
 import tempfile
-from typing import List, Tuple
+from typing import List, Optional, Tuple
 
 import llnl.util.tty as tty
 from llnl.string import plural
@@ -27,14 +25,21 @@
 import spack.stage
 import spack.store
 import spack.util.parallel
-import spack.util.url as url_util
 import spack.util.web as web_util
 from spack import traverse
 from spack.cmd import display_specs
 from spack.cmd.common import arguments
 from spack.spec import Spec, save_dependency_specfiles
 
+from ..buildcache_migrate import migrate
 from ..enums import InstallRecordStatus
+from ..url_buildcache import (
+    BuildcacheComponent,
+    BuildcacheEntryError,
+    URLBuildcacheEntry,
+    check_mirror_for_layout,
+    get_url_buildcache_class,
+)
 
 description = "create, download and install binary packages"
 section = "packaging"
@@ -272,6 +277,27 @@ def setup_parser(subparser: argparse.ArgumentParser):
     )
     update_index.set_defaults(func=update_index_fn)
 
+    # Migrate a buildcache from layout_version 2 to version 3
+    migrate = subparsers.add_parser("migrate", help=migrate_fn.__doc__)
+    migrate.add_argument("mirror", type=arguments.mirror_name, help="name of a configured mirror")
+    migrate.add_argument(
+        "-u",
+        "--unsigned",
+        default=False,
+        action="store_true",
+        help="Ignore signatures and do not resign, default is False",
+    )
+    migrate.add_argument(
+        "-d",
+        "--delete-existing",
+        default=False,
+        action="store_true",
+        help="Delete the previous layout, the default is to keep it.",
+    )
+    arguments.add_common_arguments(migrate, ["yes_to_all"])
+    # TODO: add -y argument to prompt if user really means to delete existing
+    migrate.set_defaults(func=migrate_fn)
+
 
 def _matching_specs(specs: List[Spec]) -> List[Spec]:
     """Disambiguate specs and return a list of matching specs"""
@@ -397,6 +423,10 @@ def push_fn(args):
         (s, PackageNotInstalledError("package not installed")) for s in not_installed
     )
 
+    # Warn about possible old binary mirror layout
+    if not mirror.push_url.startswith("oci://"):
+        check_mirror_for_layout(mirror)
+
     with bindist.make_uploader(
         mirror=mirror,
         force=args.force,
@@ -527,8 +557,7 @@ def download_fn(args):
     if len(specs) != 1:
         tty.die("a single spec argument is required to download from a buildcache")
 
-    if not bindist.download_single_spec(specs[0], args.path):
-        sys.exit(1)
+    bindist.download_single_spec(specs[0], args.path)
 
 
 def save_specfile_fn(args):
@@ -553,29 +582,78 @@ def save_specfile_fn(args):
     )
 
 
-def copy_buildcache_file(src_url, dest_url, local_path=None):
-    """Copy from source url to destination url"""
-    tmpdir = None
-
-    if not local_path:
-        tmpdir = tempfile.mkdtemp()
-        local_path = os.path.join(tmpdir, os.path.basename(src_url))
-
-    try:
-        temp_stage = spack.stage.Stage(src_url, path=os.path.dirname(local_path))
-        try:
-            temp_stage.create()
-            temp_stage.fetch()
-            web_util.push_to_url(local_path, dest_url, keep_original=True)
-        except spack.error.FetchError as e:
-            # Expected, since we have to try all the possible extensions
-            tty.debug("no such file: {0}".format(src_url))
-            tty.debug(e)
-        finally:
-            temp_stage.destroy()
-    finally:
-        if tmpdir and os.path.exists(tmpdir):
-            shutil.rmtree(tmpdir)
+def copy_buildcache_entry(cache_entry: URLBuildcacheEntry, destination_url: str):
+    """Download buildcache entry and copy it to the destination_url"""
+    try:
+        spec_dict = cache_entry.fetch_metadata()
+        cache_entry.fetch_archive()
+    except bindist.BuildcacheEntryError as e:
+        tty.warn(f"Failed to retrieve buildcache for copying due to {e}")
+        cache_entry.destroy()
+        return
+
+    spec_blob_record = cache_entry.get_blob_record(BuildcacheComponent.SPEC)
+    local_spec_path = cache_entry.get_local_spec_path()
+    tarball_blob_record = cache_entry.get_blob_record(BuildcacheComponent.TARBALL)
+    local_tarball_path = cache_entry.get_local_archive_path()
+
+    target_spec = spack.spec.Spec.from_dict(spec_dict)
+    spec_label = f"{target_spec.name}/{target_spec.dag_hash()[:7]}"
+
+    if not tarball_blob_record:
+        cache_entry.destroy()
+        raise BuildcacheEntryError(f"No source tarball blob record, failed to sync {spec_label}")
+
+    # Try to push the tarball
+    tarball_dest_url = cache_entry.get_blob_url(destination_url, tarball_blob_record)
+
+    try:
+        web_util.push_to_url(local_tarball_path, tarball_dest_url, keep_original=True)
+    except Exception as e:
+        tty.warn(f"Failed to push {local_tarball_path} to {tarball_dest_url} due to {e}")
+        cache_entry.destroy()
+        return
+
+    if not spec_blob_record:
+        cache_entry.destroy()
+        raise BuildcacheEntryError(f"No source spec blob record, failed to sync {spec_label}")
+
+    # Try to push the spec file
+    spec_dest_url = cache_entry.get_blob_url(destination_url, spec_blob_record)
+
+    try:
+        web_util.push_to_url(local_spec_path, spec_dest_url, keep_original=True)
+    except Exception as e:
+        tty.warn(f"Failed to push {local_spec_path} to {spec_dest_url} due to {e}")
+        cache_entry.destroy()
+        return
+
+    # Stage the manifest locally, since if it's signed, we don't want to try to
+    # to reproduce that here. Instead just push the locally staged manifest to
+    # the expected path at the destination url.
+    manifest_src_url = cache_entry.remote_manifest_url
+    manifest_dest_url = cache_entry.get_manifest_url(target_spec, destination_url)
+
+    manifest_stage = spack.stage.Stage(manifest_src_url)
+
+    try:
+        manifest_stage.create()
+        manifest_stage.fetch()
+    except Exception as e:
+        tty.warn(f"Failed to fetch manifest from {manifest_src_url} due to {e}")
+        manifest_stage.destroy()
+        cache_entry.destroy()
+        return
+
+    local_manifest_path = manifest_stage.save_filename
+
+    try:
+        web_util.push_to_url(local_manifest_path, manifest_dest_url, keep_original=True)
+    except Exception as e:
+        tty.warn(f"Failed to push manifest to {manifest_dest_url} due to {e}")
+
+    manifest_stage.destroy()
+    cache_entry.destroy()
 
 
 def sync_fn(args):
@@ -615,37 +693,21 @@ def sync_fn(args):
             )
         )
 
-    build_cache_dir = bindist.build_cache_relative_path()
-    buildcache_rel_paths = []
-
     tty.debug("Syncing the following specs:")
-    for s in env.all_specs():
-        tty.debug("  {0}{1}: {2}".format("* " if s in env.roots() else " ", s.name, s.dag_hash()))
-
-        buildcache_rel_paths.extend(
-            [
-                os.path.join(build_cache_dir, bindist.tarball_path_name(s, ".spack")),
-                os.path.join(build_cache_dir, bindist.tarball_name(s, ".spec.json.sig")),
-                os.path.join(build_cache_dir, bindist.tarball_name(s, ".spec.json")),
-                os.path.join(build_cache_dir, bindist.tarball_name(s, ".spec.yaml")),
-            ]
-        )
-
-    tmpdir = tempfile.mkdtemp()
-
-    try:
-        for rel_path in buildcache_rel_paths:
-            src_url = url_util.join(src_mirror_url, rel_path)
-            local_path = os.path.join(tmpdir, rel_path)
-            dest_url = url_util.join(dest_mirror_url, rel_path)
-
-            tty.debug("Copying {0} to {1} via {2}".format(src_url, dest_url, local_path))
-            copy_buildcache_file(src_url, dest_url, local_path=local_path)
-    finally:
-        shutil.rmtree(tmpdir)
+    specs_to_sync = [s for s in env.all_specs() if not s.external]
+
+    for s in specs_to_sync:
+        tty.debug("  {0}{1}: {2}".format("* " if s in env.roots() else " ", s.name, s.dag_hash()))
+        cache_class = get_url_buildcache_class(
+            layout_version=bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
+        )
+        src_cache_entry = cache_class(src_mirror_url, s, allow_unsigned=True)
+        src_cache_entry.read_manifest()
+        copy_buildcache_entry(src_cache_entry, dest_mirror_url)
 
 
-def manifest_copy(manifest_file_list, dest_mirror=None):
+def manifest_copy(
+    manifest_file_list: List[str], dest_mirror: Optional[spack.mirrors.mirror.Mirror] = None
+):
     """Read manifest files containing information about specific specs to copy
     from source to destination, remove duplicates since any binary packge for
     a given hash should be the same as any other, and copy all files specified
@@ -655,21 +717,24 @@ def manifest_copy(manifest_file_list, dest_mirror=None):
     for manifest_path in manifest_file_list:
         with open(manifest_path, encoding="utf-8") as fd:
             manifest = json.loads(fd.read())
-            for spec_hash, copy_list in manifest.items():
+            for spec_hash, copy_obj in manifest.items():
                 # Last duplicate hash wins
-                deduped_manifest[spec_hash] = copy_list
+                deduped_manifest[spec_hash] = copy_obj
 
-    build_cache_dir = bindist.build_cache_relative_path()
-    for spec_hash, copy_list in deduped_manifest.items():
-        for copy_file in copy_list:
-            dest = copy_file["dest"]
-            if dest_mirror:
-                src_relative_path = os.path.join(
-                    build_cache_dir, copy_file["src"].rsplit(build_cache_dir, 1)[1].lstrip("/")
-                )
-                dest = url_util.join(dest_mirror.push_url, src_relative_path)
-            tty.debug("copying {0} to {1}".format(copy_file["src"], dest))
-            copy_buildcache_file(copy_file["src"], dest)
+    for spec_hash, copy_obj in deduped_manifest.items():
+        cache_class = get_url_buildcache_class(
+            layout_version=bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
+        )
+        src_cache_entry = cache_class(
+            cache_class.get_base_url(copy_obj["src"]), allow_unsigned=True
+        )
+        src_cache_entry.read_manifest(manifest_url=copy_obj["src"])
+        if dest_mirror:
+            destination_url = dest_mirror.push_url
+        else:
+            destination_url = cache_class.get_base_url(copy_obj["dest"])
+        tty.debug("copying {0} to {1}".format(copy_obj["src"], destination_url))
+        copy_buildcache_entry(src_cache_entry, destination_url)
 
 
 def update_index(mirror: spack.mirrors.mirror.Mirror, update_keys=False):
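For illustration (not part of this commit): the copy manifests read above map a spec's DAG hash to an object whose ``src`` and ``dest`` entries point at buildcache-entry manifest URLs. A hypothetical example, with a made-up hash and paths:

.. code-block:: python

   # Hypothetical copy manifest; the hash and URLs are placeholders.
   copy_manifest = {
       "abcdef1234567890abcdef1234567890abcdef12": {
           "src": "file:///mirrors/source/<path to the spec's manifest>",
           "dest": "file:///mirrors/destination/<path to the spec's manifest>",
       }
   }
   # manifest_copy() reads such files, dedupes entries by hash, and calls
   # copy_buildcache_entry() once per remaining entry.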
@@ -693,13 +758,9 @@ def update_index(mirror: spack.mirrors.mirror.Mirror, update_keys=False):
         bindist._url_generate_package_index(url, tmpdir)
 
     if update_keys:
-        keys_url = url_util.join(
-            url, bindist.build_cache_relative_path(), bindist.build_cache_keys_relative_path()
-        )
-
         try:
             with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
-                bindist.generate_key_index(keys_url, tmpdir)
+                bindist.generate_key_index(url, tmpdir)
         except bindist.CannotListKeys as e:
             # Do not error out if listing keys went wrong. This usually means that the _gpg path
             # does not exist. TODO: distinguish between this and other errors.
@@ -711,5 +772,53 @@ def update_index_fn(args):
     return update_index(args.mirror, update_keys=args.keys)
 
 
+def migrate_fn(args):
+    """perform in-place binary mirror migration (2 to 3)
+
+    A mirror can contain both layout version 2 and version 3 simultaneously without
+    interference. This command performs in-place migration of a binary mirror laid
+    out according to version 2, to a binary mirror laid out according to layout
+    version 3. Only indexed specs will be migrated, so consider updating the mirror
+    index before running this command. Re-run the command to migrate any missing
+    items.
+
+    The default mode of operation is to perform a signed migration, that is, spack
+    will attempt to verify the signatures on specs, and then re-sign them before
+    migration, using whatever keys are already installed in your key ring. You can
+    migrate a mirror of unsigned binaries (or convert a mirror of signed binaries
+    to unsigned) by providing the --unsigned argument.
+
+    By default spack will leave the original mirror contents (in the old layout) in
+    place after migration. You can have spack remove the old contents by providing
+    the --delete-existing argument. Because migrating a mostly-already-migrated
+    mirror should be fast, consider a workflow where you perform a default migration,
+    (i.e. preserve the existing layout rather than deleting it) then evaluate the
+    state of the migrated mirror by attempting to install from it, and finally
+    running the migration again with --delete-existing."""
+    target_mirror = args.mirror
+    unsigned = args.unsigned
+    assert isinstance(target_mirror, spack.mirrors.mirror.Mirror)
+    delete_existing = args.delete_existing
+
+    proceed = True
+    if delete_existing and not args.yes_to_all:
+        msg = (
+            "Using --delete-existing will delete the entire contents \n"
+            " of the old layout within the mirror. Because migrating a mirror \n"
+            " that has already been migrated should be fast, consider a workflow \n"
+            " where you perform a default migration (i.e. preserve the existing \n"
+            " layout rather than deleting it), then evaluate the state of the \n"
+            " migrated mirror by attempting to install from it, and finally, \n"
+            " run the migration again with --delete-existing."
+        )
+        tty.warn(msg)
+        proceed = tty.get_yes_or_no("Do you want to proceed?", default=False)
+
+    if not proceed:
+        tty.die("Migration aborted.")
+
+    migrate(target_mirror, unsigned=unsigned, delete_existing=delete_existing)
+
+
 def buildcache(parser, args):
     return args.func(args)
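For illustration (not part of this commit), this is roughly what the new subcommand does under the hood; the mirror name is hypothetical and ``migrate`` is the helper imported from ``spack.buildcache_migrate`` above.

.. code-block:: python

   # Illustrative sketch of the two-step migration workflow described in the
   # docstring above.
   import spack.mirrors.mirror
   from spack.buildcache_migrate import migrate

   mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
   target_mirror = mirrors["mymirror"]  # "mymirror" is a hypothetical mirror name

   # First pass: keep the old layout so the mirror stays usable while you
   # verify the migrated entries (e.g. by installing from the mirror).
   migrate(target_mirror, unsigned=False, delete_existing=False)

   # Once verified, re-run and drop the old layout.
   migrate(target_mirror, unsigned=False, delete_existing=True)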
@@ -423,7 +423,7 @@ def ci_rebuild(args):
         # jobs in subsequent stages.
         tty.msg("No need to rebuild {0}, found hash match at: ".format(job_spec_pkg_name))
         for match in matches:
-            tty.msg("  {0}".format(match["mirror_url"]))
+            tty.msg("  {0}".format(match.url_and_version.url))
 
         # Now we are done and successful
         return 0
@@ -56,7 +56,7 @@ def is_package(f):
     """Whether flake8 should consider a file as a core file or a package.
 
     We run flake8 with different exceptions for the core and for
-    packages, since we allow `from spack import *` and poking globals
+    packages, since we allow `from spack.package import *` and poking globals
     into packages.
     """
     return f.startswith("var/spack/") and f.endswith("package.py")
@@ -1049,7 +1049,11 @@ def add_view(name, values):
 
     def _process_concrete_includes(self):
         """Extract and load into memory included concrete spec data."""
-        self.included_concrete_envs = self.manifest[TOP_LEVEL_KEY].get(included_concrete_name, [])
+        _included_concrete_envs = self.manifest[TOP_LEVEL_KEY].get(included_concrete_name, [])
+        # Expand config and environment variables
+        self.included_concrete_envs = [
+            spack.util.path.canonicalize_path(_env) for _env in _included_concrete_envs
+        ]
 
         if self.included_concrete_envs:
             if os.path.exists(self.lock_path):
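For illustration (not part of this commit), a small sketch of what the added canonicalization does to each included path; the variable name and paths are hypothetical.

.. code-block:: python

   # Illustrative only: canonicalize_path expands config/environment variables
   # and "~" in each included concrete-environment path.
   import os
   import spack.util.path

   os.environ["CONCRETE_ENVS"] = "/home/user/envs"  # hypothetical
   included = ["$CONCRETE_ENVS/prod", "~/dev-env"]
   expanded = [spack.util.path.canonicalize_path(p) for p in included]
   # e.g. ["/home/user/envs/prod", "/home/user/dev-env"]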
@@ -202,3 +202,16 @@ class MirrorError(SpackError):
 
     def __init__(self, msg, long_msg=None):
         super().__init__(msg, long_msg)
+
+
+class NoChecksumException(SpackError):
+    """
+    Raised if file fails checksum verification.
+    """
+
+    def __init__(self, path, size, contents, algorithm, expected, computed):
+        super().__init__(
+            f"{algorithm} checksum failed for {path}",
+            f"Expected {expected} but got {computed}. "
+            f"File size = {size} bytes. Contents = {contents!r}",
+        )
@@ -65,6 +65,7 @@
 import spack.util.executable
 import spack.util.path
 import spack.util.timer as timer
+from spack.url_buildcache import BuildcacheEntryError
 from spack.util.environment import EnvironmentModifications, dump_environment
 from spack.util.executable import which
 
@@ -449,17 +450,17 @@ def _process_binary_cache_tarball(
         else ``False``
     """
     with timer.measure("fetch"):
-        download_result = binary_distribution.download_tarball(
+        tarball_stage = binary_distribution.download_tarball(
             pkg.spec.build_spec, unsigned, mirrors_for_spec
         )
 
-        if download_result is None:
+        if tarball_stage is None:
             return False
 
     tty.msg(f"Extracting {package_id(pkg.spec)} from binary cache")
 
     with timer.measure("install"), spack.util.path.filter_padding():
-        binary_distribution.extract_tarball(pkg.spec, download_result, force=False, timer=timer)
+        binary_distribution.extract_tarball(pkg.spec, tarball_stage, force=False, timer=timer)
 
     if pkg.spec.spliced:  # overwrite old metadata with new
         spack.store.STORE.layout.write_spec(
@@ -2177,7 +2178,7 @@ def install(self) -> None:
                 )
                 raise
 
-            except binary_distribution.NoChecksumException as exc:
+            except BuildcacheEntryError as exc:
                 if task.cache_only:
                     raise
 
@@ -550,7 +550,6 @@ def setup_main_options(args):
         spack.config.CONFIG.scopes["command_line"].sections["repos"] = syaml.syaml_dict(
             [(key, [spack.paths.mock_packages_path])]
         )
-        spack.repo.PATH = spack.repo.create(spack.config.CONFIG)
 
     # If the user asked for it, don't check ssl certs.
     if args.insecure:
@@ -561,6 +560,8 @@ def setup_main_options(args):
     for config_var in args.config_vars or []:
         spack.config.add(fullpath=config_var, scope="command_line")
 
+    spack.repo.enable_repo(spack.repo.create(spack.config.CONFIG))
+
     # On Windows10 console handling for ASCI/VT100 sequences is not
     # on by default. Turn on before we try to write to console
     # with color
@@ -172,3 +172,5 @@ class tty:
     spack_cxx: str
     spack_f77: str
     spack_fc: str
+    prefix: Prefix
+    dso_suffix: str
@@ -91,29 +91,8 @@ class ReposFinder:
     Returns a loader based on the inspection of the current repository list.
     """
 
-    def __init__(self):
-        self._repo_init = _path
-        self._repo: Optional[RepoType] = None
-
-    @property
-    def current_repository(self):
-        if self._repo is None:
-            self._repo = self._repo_init()
-        return self._repo
-
-    @current_repository.setter
-    def current_repository(self, value):
-        self._repo = value
-
-    @contextlib.contextmanager
-    def switch_repo(self, substitute: "RepoType"):
-        """Switch the current repository list for the duration of the context manager."""
-        old = self._repo
-        try:
-            self._repo = substitute
-            yield
-        finally:
-            self._repo = old
+    #: The current list of repositories.
+    repo_path: "RepoPath"
 
     def find_spec(self, fullname, python_path, target=None):
         # "target" is not None only when calling importlib.reload()
@@ -134,14 +113,11 @@ def compute_loader(self, fullname: str):
         namespace, dot, module_name = fullname.rpartition(".")
 
         # If it's a module in some repo, or if it is the repo's namespace, let the repo handle it.
-        current_repo = self.current_repository
-        is_repo_path = isinstance(current_repo, RepoPath)
-        if is_repo_path:
-            repos = current_repo.repos
-        else:
-            repos = [current_repo]
-
-        for repo in repos:
+        if not hasattr(self, "repo_path"):
+            return None
+
+        for repo in self.repo_path.repos:
             # We are using the namespace of the repo and the repo contains the package
             if namespace == repo.full_namespace:
                 # With 2 nested conditionals we can call "repo.real_name" only once
@@ -156,9 +132,7 @@ def compute_loader(self, fullname: str):
 
         # No repo provides the namespace, but it is a valid prefix of
         # something in the RepoPath.
-        if is_repo_path and current_repo.by_namespace.is_prefix(
-            fullname[len(PKG_MODULE_PREFIX_V1) :]
-        ):
+        if self.repo_path.by_namespace.is_prefix(fullname[len(PKG_MODULE_PREFIX_V1) :]):
            return SpackNamespaceLoader()
 
         return None
@@ -258,6 +232,8 @@ def get_all_package_diffs(type: str, repo: "Repo", rev1="HEAD^1", rev2="HEAD") -
     changed: Set[str] = set()
     for path in lines:
         dir_name, _, _ = path.partition("/")
+        if not nm.valid_module_name(dir_name, repo.package_api):
+            continue
         pkg_name = nm.pkg_dir_to_pkg_name(dir_name, repo.package_api)
         if pkg_name not in added and pkg_name not in removed:
             changed.add(pkg_name)
@@ -662,7 +638,6 @@ def __init__(
                 if isinstance(repo, str):
                     assert cache is not None, "cache must hold a value, when repo is a string"
                     repo = Repo(repo, cache=cache, overrides=overrides)
-                repo.finder(self)
                 self.put_last(repo)
             except RepoError as e:
                 tty.warn(
@@ -672,6 +647,20 @@ def __init__(
                     f" spack repo rm {repo}",
                 )
 
+    def enable(self) -> None:
+        """Set the relevant search paths for package module loading"""
+        REPOS_FINDER.repo_path = self
+        for p in reversed(self.python_paths()):
+            if p not in sys.path:
+                sys.path.insert(0, p)
+
+    def disable(self) -> None:
+        """Disable the search paths for package module loading"""
+        del REPOS_FINDER.repo_path
+        for p in self.python_paths():
+            if p in sys.path:
+                sys.path.remove(p)
+
     def ensure_unwrapped(self) -> "RepoPath":
         """Ensure we unwrap this object from any dynamic wrapper (like Singleton)"""
         return self
@@ -845,9 +834,6 @@ def python_paths(self) -> List[str]:
 
     def get_pkg_class(self, pkg_name: str) -> Type["spack.package_base.PackageBase"]:
         """Find a class for the spec's package and return the class object."""
-        for p in self.python_paths():
-            if p not in sys.path:
-                sys.path.insert(0, p)
         return self.repo_for_pkg(pkg_name).get_pkg_class(pkg_name)
 
     @autospec
@@ -1094,9 +1080,6 @@ def check(condition, msg):
         # Class attribute overrides by package name
         self.overrides = overrides or {}
 
-        # Optional reference to a RepoPath to influence module import from spack.pkg
-        self._finder: Optional[RepoPath] = None
-
         # Maps that goes from package name to corresponding file stat
         self._fast_package_checker: Optional[FastPackageChecker] = None
 
@@ -1108,9 +1091,6 @@ def check(condition, msg):
     def package_api_str(self) -> str:
         return f"v{self.package_api[0]}.{self.package_api[1]}"
 
-    def finder(self, value: RepoPath) -> None:
-        self._finder = value
-
     def real_name(self, import_name: str) -> Optional[str]:
         """Allow users to import Spack packages using Python identifiers.
 
@@ -1363,11 +1343,9 @@ def get_pkg_class(self, pkg_name: str) -> Type["spack.package_base.PackageBase"]
         fullname += ".package"
 
         class_name = nm.pkg_name_to_class_name(pkg_name)
-        if self.python_path and self.python_path not in sys.path:
-            sys.path.insert(0, self.python_path)
         try:
-            with REPOS_FINDER.switch_repo(self._finder or self):
-                module = importlib.import_module(fullname)
+            module = importlib.import_module(fullname)
         except ImportError as e:
             raise UnknownPackageError(fullname) from e
         except Exception as e:
@@ -1560,12 +1538,6 @@ def create_or_construct(
     return from_path(repo_yaml_dir)
 
 
-def _path(configuration=None):
-    """Get the singleton RepoPath instance for Spack."""
-    configuration = configuration or spack.config.CONFIG
-    return create(configuration=configuration)
-
-
 def create(configuration: spack.config.Configuration) -> RepoPath:
     """Create a RepoPath from a configuration object.
 
@@ -1588,8 +1560,10 @@ def create(configuration: spack.config.Configuration) -> RepoPath:
     return RepoPath(*repo_dirs, cache=spack.caches.MISC_CACHE, overrides=overrides)
 
 
-#: Singleton repo path instance
-PATH: RepoPath = llnl.util.lang.Singleton(_path)  # type: ignore
+#: Global package repository instance.
+PATH: RepoPath = llnl.util.lang.Singleton(
+    lambda: create(configuration=spack.config.CONFIG)
+)  # type: ignore[assignment]
 
 # Add the finder to sys.meta_path
 REPOS_FINDER = ReposFinder()
@@ -1615,20 +1589,27 @@ def use_repositories(
     Returns:
         Corresponding RepoPath object
     """
-    global PATH
     paths = [getattr(x, "root", x) for x in paths_and_repos]
-    scope_name = "use-repo-{}".format(uuid.uuid4())
+    scope_name = f"use-repo-{uuid.uuid4()}"
     repos_key = "repos:" if override else "repos"
     spack.config.CONFIG.push_scope(
         spack.config.InternalConfigScope(name=scope_name, data={repos_key: paths})
    )
-    PATH, saved = create(configuration=spack.config.CONFIG), PATH
+    old_repo, new_repo = PATH, create(configuration=spack.config.CONFIG)
+    old_repo.disable()
+    enable_repo(new_repo)
     try:
-        with REPOS_FINDER.switch_repo(PATH):  # type: ignore
-            yield PATH
+        yield new_repo
     finally:
         spack.config.CONFIG.remove_scope(scope_name=scope_name)
-        PATH = saved
+        enable_repo(old_repo)
+
+
+def enable_repo(repo_path: RepoPath) -> None:
+    """Set the global package repository and make them available in module search paths."""
+    global PATH
+    PATH = repo_path
+    PATH.enable()
 
 
 class MockRepositoryBuilder:
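For illustration (not part of this commit), a sketch of how callers use the reworked API; the repository path and package name below are placeholders.

.. code-block:: python

   # Illustrative only: enable_repo() replaces direct assignment to
   # spack.repo.PATH, and use_repositories() now disables the old RepoPath and
   # enables the temporary one for the duration of the context manager.
   import spack.config
   import spack.repo

   # Rebuild the global repository from the current configuration.
   spack.repo.enable_repo(spack.repo.create(spack.config.CONFIG))

   # Temporarily switch to an extra package repository (path is hypothetical).
   with spack.repo.use_repositories("/path/to/extra/repo") as repo_path:
       pkg_cls = repo_path.get_pkg_class("mypackage")  # hypothetical package name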
@@ -19,10 +19,6 @@
         "additionalProperties": True,
         "items": spack.schema.spec.properties,
     },
-    "binary_cache_checksum": {
-        "type": "object",
-        "properties": {"hash_algorithm": {"type": "string"}, "hash": {"type": "string"}},
-    },
     "buildcache_layout_version": {"type": "number"},
 }
 
@@ -30,6 +26,6 @@
     "$schema": "http://json-schema.org/draft-07/schema#",
     "title": "Spack buildcache specfile schema",
     "type": "object",
-    "additionalProperties": False,
+    "additionalProperties": True,
     "properties": properties,
 }
45
lib/spack/spack/schema/url_buildcache_manifest.py
Normal file
@@ -0,0 +1,45 @@
+# Copyright Spack Project Developers. See COPYRIGHT file for details.
+#
+# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+
+"""Schema for buildcache entry manifest file
+
+.. literalinclude:: _spack_root/lib/spack/spack/schema/url_buildcache_manifest.py
+   :lines: 11-
+"""
+from typing import Any, Dict
+
+properties: Dict[str, Any] = {
+    "version": {"type": "integer"},
+    "data": {
+        "type": "array",
+        "items": {
+            "type": "object",
+            "required": [
+                "contentLength",
+                "mediaType",
+                "compression",
+                "checksumAlgorithm",
+                "checksum",
+            ],
+            "properties": {
+                "contentLength": {"type": "integer"},
+                "mediaType": {"type": "string"},
+                "compression": {"type": "string"},
+                "checksumAlgorithm": {"type": "string"},
+                "checksum": {"type": "string"},
+            },
+            "additionalProperties": True,
+        },
+    },
+}
+
+#: Full schema with metadata
+schema = {
+    "$schema": "http://json-schema.org/draft-07/schema#",
+    "title": "Buildcache manifest schema",
+    "type": "object",
+    "required": ["version", "data"],
+    "additionalProperties": True,
+    "properties": properties,
+}
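For illustration (not part of this commit), a manifest document that should satisfy the schema above; all values are invented, and validation uses the jsonschema package (assumed available alongside Spack).

.. code-block:: python

   # Illustrative only: sizes, media type and checksum are made up.
   import jsonschema  # assumed available

   from spack.schema.url_buildcache_manifest import schema

   example = {
       "version": 3,  # invented; the schema only requires an integer
       "data": [
           {
               "contentLength": 10240,
               "mediaType": "application/vnd.spack.binary.package",  # invented
               "compression": "gzip",
               "checksumAlgorithm": "sha256",
               "checksum": "0" * 64,  # placeholder digest
           }
       ],
   }

   jsonschema.validate(example, schema)  # raises ValidationError if malformed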
@@ -106,7 +106,7 @@ def __init__(self):
 
     def restore(self):
         spack.config.CONFIG = self.config
-        spack.repo.PATH = spack.repo.create(self.config)
+        spack.repo.enable_repo(spack.repo.create(self.config))
         spack.platforms.host = self.platform
         spack.store.STORE = self.store
         self.test_patches.restore()
@@ -129,7 +129,6 @@ def restore(self):
 
 
 def store_patches():
-    global patches
     module_patches = list()
     class_patches = list()
    if not patches:
@@ -17,11 +17,10 @@
 import urllib.request
 import urllib.response
 from pathlib import Path, PurePath
+from typing import Any, Callable, Dict, NamedTuple, Optional
 
 import pytest
 
-import archspec.cpu
-
 from llnl.util.filesystem import copy_tree, join_path
 from llnl.util.symlink import readlink
 
@@ -38,16 +37,27 @@
 import spack.paths
 import spack.repo
 import spack.spec
+import spack.stage
 import spack.store
 import spack.util.gpg
 import spack.util.spack_yaml as syaml
 import spack.util.url as url_util
 import spack.util.web as web_util
-from spack.binary_distribution import INDEX_HASH_FILE, CannotListKeys, GenerateIndexError
+from spack.binary_distribution import CannotListKeys, GenerateIndexError
 from spack.database import INDEX_JSON_FILE
 from spack.installer import PackageInstaller
 from spack.paths import test_path
 from spack.spec import Spec
+from spack.url_buildcache import (
+    INDEX_MANIFEST_FILE,
+    BuildcacheComponent,
+    BuildcacheEntryError,
+    URLBuildcacheEntry,
+    URLBuildcacheEntryV2,
+    compression_writer,
+    get_url_buildcache_class,
+    get_valid_spec_file,
+)
 
 pytestmark = pytest.mark.not_on_windows("does not run on windows")
 
@@ -372,7 +382,7 @@ def test_built_spec_cache(temporary_mirror_dir):
 
     for s in [gspec, cspec]:
         results = bindist.get_mirrors_for_spec(s)
-        assert any([r["spec"] == s for r in results])
+        assert any([r.spec == s for r in results])
 
 
 def fake_dag_hash(spec, length=None):
@@ -435,7 +445,11 @@ def test_generate_index_missing(monkeypatch, tmpdir, mutable_config):
     assert "libelf" in cache_list
 
     # Remove dependency from cache
-    libelf_files = glob.glob(os.path.join(mirror_dir.join("build_cache").strpath, "*libelf*"))
+    libelf_files = glob.glob(
+        os.path.join(
+            mirror_dir.join(bindist.buildcache_relative_specs_path()).strpath, "libelf", "*libelf*"
+        )
+    )
     os.remove(*libelf_files)
 
     # Update index
@@ -480,8 +494,7 @@ def mock_list_url(url, recursive=False):
 
     assert (
         "Warning: Encountered problem listing packages at "
-        f"{test_url}/{bindist.BUILD_CACHE_RELATIVE_PATH}: Some HTTP error"
-        in capfd.readouterr().err
+        f"{test_url}: Some HTTP error" in capfd.readouterr().err
     )
 
 
@@ -538,29 +551,6 @@ def test_update_sbang(tmp_path, temporary_mirror, mock_fetch, install_mockery):
         assert f.read() == new_contents
 
 
-@pytest.mark.skipif(
-    str(archspec.cpu.host().family) != "x86_64",
-    reason="test data uses gcc 4.5.0 which does not support aarch64",
-)
-def test_install_legacy_buildcache_layout(mutable_config, compiler_factory, install_mockery):
-    """Legacy buildcache layout involved a nested archive structure
-    where the .spack file contained a repeated spec.json and another
-    compressed archive file containing the install tree. This test
-    makes sure we can still read that layout."""
-    legacy_layout_dir = os.path.join(test_path, "data", "mirrors", "legacy_layout")
-    mirror_url = f"file://{legacy_layout_dir}"
-    filename = (
-        "test-debian6-core2-gcc-4.5.0-archive-files-2.0-"
-        "l3vdiqvbobmspwyb4q2b62fz6nitd4hk.spec.json"
-    )
-    spec_json_path = os.path.join(legacy_layout_dir, "build_cache", filename)
-    mirror_cmd("add", "--scope", "site", "test-legacy-layout", mirror_url)
-    output = install_cmd("--no-check-signature", "--cache-only", "-f", spec_json_path, output=str)
-    mirror_cmd("rm", "--scope=site", "test-legacy-layout")
-    expect_line = "Extracting archive-files-2.0-l3vdiqvbobmspwyb4q2b62fz6nitd4hk from binary cache"
-    assert expect_line in output
-
-
 def test_FetchCacheError_only_accepts_lists_of_errors():
     with pytest.raises(TypeError, match="list"):
         bindist.FetchCacheError("error")
@@ -600,7 +590,60 @@ def test_text_relocate_if_needed(install_mockery, temporary_store, mock_fetch, t
     assert join_path("bin", "secretexe") not in manifest["relocate_textfiles"]
 
 
-def test_etag_fetching_304():
+def test_compression_writer(tmp_path):
+    text = "This is some text. We might or might not like to compress it as we write."
+    checksum_algo = "sha256"
+
+    # Write the data using gzip compression
+    compressed_output_path = str(tmp_path / "compressed_text")
+    with compression_writer(compressed_output_path, "gzip", checksum_algo) as (
+        compressor,
+        checker,
+    ):
+        compressor.write(text.encode("utf-8"))
+
+    compressed_size = checker.length
+    compressed_checksum = checker.hexdigest()
+
+    with open(compressed_output_path, "rb") as f:
+        binary_content = f.read()
+
+    assert bindist.compute_hash(binary_content) == compressed_checksum
+    assert os.stat(compressed_output_path).st_size == compressed_size
+    assert binary_content[:2] == b"\x1f\x8b"
+    decompressed_content = gzip.decompress(binary_content).decode("utf-8")
+
+    assert decompressed_content == text
+
+    # Write the data without compression
+    uncompressed_output_path = str(tmp_path / "uncompressed_text")
+    with compression_writer(uncompressed_output_path, "none", checksum_algo) as (
+        compressor,
+        checker,
+    ):
+        compressor.write(text.encode("utf-8"))
+
+    uncompressed_size = checker.length
+    uncompressed_checksum = checker.hexdigest()
+
+    with open(uncompressed_output_path, "r", encoding="utf-8") as f:
+        content = f.read()
+
+    assert bindist.compute_hash(content) == uncompressed_checksum
+    assert os.stat(uncompressed_output_path).st_size == uncompressed_size
+    assert content == text
+
+    # Make sure we raise if requesting unknown compression type
+    nocare_output_path = str(tmp_path / "wontwrite")
+    with pytest.raises(BuildcacheEntryError, match="Unknown compression type"):
+        with compression_writer(nocare_output_path, "gsip", checksum_algo) as (
+            compressor,
+            checker,
+        ):
+            compressor.write(text)
+
+
+def test_v2_etag_fetching_304():
     # Test conditional fetch with etags. If the remote hasn't modified the file
     # it returns 304, which is an HTTPError in urllib-land. That should be
     # handled as success, since it means the local cache is up-to-date.
@@ -613,7 +656,7 @@ def response_304(request: urllib.request.Request):
             )
         assert False, "Should not fetch {}".format(url)
 
-    fetcher = bindist.EtagIndexFetcher(
+    fetcher = bindist.EtagIndexFetcherV2(
         url="https://www.example.com",
         etag="112a8bbc1b3f7f185621c1ee335f0502",
         urlopen=response_304,
@@ -624,7 +667,7 @@ def response_304(request: urllib.request.Request):
     assert result.fresh
 
 
-def test_etag_fetching_200():
+def test_v2_etag_fetching_200():
     # Test conditional fetch with etags. The remote has modified the file.
     def response_200(request: urllib.request.Request):
         url = request.get_full_url()
@@ -638,7 +681,7 @@ def response_200(request: urllib.request.Request):
             )
         assert False, "Should not fetch {}".format(url)
 
-    fetcher = bindist.EtagIndexFetcher(
+    fetcher = bindist.EtagIndexFetcherV2(
         url="https://www.example.com",
         etag="112a8bbc1b3f7f185621c1ee335f0502",
         urlopen=response_200,
@@ -652,7 +695,7 @@ def response_200(request: urllib.request.Request):
     assert result.hash == bindist.compute_hash("Result")
 
 
-def test_etag_fetching_404():
+def test_v2_etag_fetching_404():
     # Test conditional fetch with etags. The remote has modified the file.
     def response_404(request: urllib.request.Request):
         raise urllib.error.HTTPError(
@@ -663,7 +706,7 @@ def response_404(request: urllib.request.Request):
             fp=None,
         )
 
-    fetcher = bindist.EtagIndexFetcher(
+    fetcher = bindist.EtagIndexFetcherV2(
         url="https://www.example.com",
         etag="112a8bbc1b3f7f185621c1ee335f0502",
         urlopen=response_404,
@@ -673,13 +716,13 @@ def response_404(request: urllib.request.Request):
         fetcher.conditional_fetch()
 
 
-def test_default_index_fetch_200():
+def test_v2_default_index_fetch_200():
     index_json = '{"Hello": "World"}'
     index_json_hash = bindist.compute_hash(index_json)
 
     def urlopen(request: urllib.request.Request):
         url = request.get_full_url()
-        if url.endswith(INDEX_HASH_FILE):
+        if url.endswith("index.json.hash"):
             return urllib.response.addinfourl(  # type: ignore[arg-type]
                 io.BytesIO(index_json_hash.encode()),
                 headers={},  # type: ignore[arg-type]
@@ -697,7 +740,7 @@ def urlopen(request: urllib.request.Request):
 
         assert False, "Unexpected request {}".format(url)
 
-    fetcher = bindist.DefaultIndexFetcher(
+    fetcher = bindist.DefaultIndexFetcherV2(
         url="https://www.example.com", local_hash="outdated", urlopen=urlopen
     )
 
@@ -710,7 +753,7 @@ def urlopen(request: urllib.request.Request):
     assert result.hash == index_json_hash
 
 
-def test_default_index_dont_fetch_index_json_hash_if_no_local_hash():
+def test_v2_default_index_dont_fetch_index_json_hash_if_no_local_hash():
     # When we don't have local hash, we should not be fetching the
     # remote index.json.hash file, but only index.json.
     index_json = '{"Hello": "World"}'
@@ -728,7 +771,7 @@ def urlopen(request: urllib.request.Request):
 
         assert False, "Unexpected request {}".format(url)
 
-    fetcher = bindist.DefaultIndexFetcher(
+    fetcher = bindist.DefaultIndexFetcherV2(
         url="https://www.example.com", local_hash=None, urlopen=urlopen
     )
 
@@ -741,13 +784,13 @@ def urlopen(request: urllib.request.Request):
     assert not result.fresh
 
 
-def test_default_index_not_modified():
+def test_v2_default_index_not_modified():
     index_json = '{"Hello": "World"}'
     index_json_hash = bindist.compute_hash(index_json)
 
     def urlopen(request: urllib.request.Request):
         url = request.get_full_url()
-        if url.endswith(INDEX_HASH_FILE):
+        if url.endswith("index.json.hash"):
             return urllib.response.addinfourl(
                 io.BytesIO(index_json_hash.encode()),
                 headers={},  # type: ignore[arg-type]
@@ -758,7 +801,7 @@ def urlopen(request: urllib.request.Request):
         # No request to index.json should be made.
         assert False, "Unexpected request {}".format(url)
 
-    fetcher = bindist.DefaultIndexFetcher(
+    fetcher = bindist.DefaultIndexFetcherV2(
         url="https://www.example.com", local_hash=index_json_hash, urlopen=urlopen
     )
 
@@ -766,7 +809,7 @@ def urlopen(request: urllib.request.Request):
 
 
 @pytest.mark.parametrize("index_json", [b"\xa9", b"!#%^"])
-def test_default_index_invalid_hash_file(index_json):
+def test_v2_default_index_invalid_hash_file(index_json):
     # Test invalid unicode / invalid hash type
     index_json_hash = bindist.compute_hash(index_json)
 
@@ -778,21 +821,21 @@ def urlopen(request: urllib.request.Request):
             code=200,
         )
 
-    fetcher = bindist.DefaultIndexFetcher(
+    fetcher = bindist.DefaultIndexFetcherV2(
         url="https://www.example.com", local_hash=index_json_hash, urlopen=urlopen
     )
 
     assert fetcher.get_remote_hash() is None
 
 
-def test_default_index_json_404():
+def test_v2_default_index_json_404():
     # Test invalid unicode / invalid hash type
     index_json = '{"Hello": "World"}'
     index_json_hash = bindist.compute_hash(index_json)
 
     def urlopen(request: urllib.request.Request):
         url = request.get_full_url()
-        if url.endswith(INDEX_HASH_FILE):
+        if url.endswith("index.json.hash"):
            return urllib.response.addinfourl(
                 io.BytesIO(index_json_hash.encode()),
                 headers={},  # type: ignore[arg-type]
@@ -811,7 +854,7 @@ def urlopen(request: urllib.request.Request):
 
         assert False, "Unexpected fetch {}".format(url)
 
-    fetcher = bindist.DefaultIndexFetcher(
+    fetcher = bindist.DefaultIndexFetcherV2(
         url="https://www.example.com", local_hash="invalid", urlopen=urlopen
     )
 
||||||
@ -1097,9 +1140,7 @@ def test_get_valid_spec_file(tmp_path, layout, expect_success):
|
|||||||
json.dump(spec_dict, f)
|
json.dump(spec_dict, f)
|
||||||
|
|
||||||
try:
|
try:
|
||||||
spec_dict_disk, layout_disk = bindist._get_valid_spec_file(
|
spec_dict_disk, layout_disk = get_valid_spec_file(str(path), max_supported_layout=1)
|
||||||
str(path), max_supported_layout=1
|
|
||||||
)
|
|
||||||
assert expect_success
|
assert expect_success
|
||||||
assert spec_dict_disk == spec_dict
|
assert spec_dict_disk == spec_dict
|
||||||
assert layout_disk == effective_layout
|
assert layout_disk == effective_layout
|
||||||
@ -1109,51 +1150,66 @@ def test_get_valid_spec_file(tmp_path, layout, expect_success):
|
|||||||
|
|
||||||
def test_get_valid_spec_file_doesnt_exist(tmp_path):
|
def test_get_valid_spec_file_doesnt_exist(tmp_path):
|
||||||
with pytest.raises(bindist.InvalidMetadataFile, match="No such file"):
|
with pytest.raises(bindist.InvalidMetadataFile, match="No such file"):
|
||||||
bindist._get_valid_spec_file(str(tmp_path / "no-such-file"), max_supported_layout=1)
|
get_valid_spec_file(str(tmp_path / "no-such-file"), max_supported_layout=1)
|
||||||
|
|
||||||
|
|
||||||
def test_get_valid_spec_file_gzipped(tmp_path):
|
|
||||||
# Create a gzipped file, contents don't matter
|
|
||||||
path = tmp_path / "spec.json.gz"
|
|
||||||
with gzip.open(path, "wb") as f:
|
|
||||||
f.write(b"hello")
|
|
||||||
with pytest.raises(
|
|
||||||
bindist.InvalidMetadataFile, match="Compressed spec files are not supported"
|
|
||||||
):
|
|
||||||
bindist._get_valid_spec_file(str(path), max_supported_layout=1)
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.mark.parametrize("filename", ["spec.json", "spec.json.sig"])
|
@pytest.mark.parametrize("filename", ["spec.json", "spec.json.sig"])
|
||||||
def test_get_valid_spec_file_no_json(tmp_path, filename):
|
def test_get_valid_spec_file_no_json(tmp_path, filename):
|
||||||
tmp_path.joinpath(filename).write_text("not json")
|
tmp_path.joinpath(filename).write_text("not json")
|
||||||
with pytest.raises(bindist.InvalidMetadataFile):
|
with pytest.raises(bindist.InvalidMetadataFile):
|
||||||
bindist._get_valid_spec_file(str(tmp_path / filename), max_supported_layout=1)
|
get_valid_spec_file(str(tmp_path / filename), max_supported_layout=1)
|
||||||
|
|
||||||
|
|
||||||
def test_download_tarball_with_unsupported_layout_fails(
|
@pytest.mark.usefixtures("install_mockery", "mock_packages", "mock_fetch", "temporary_mirror")
|
||||||
tmp_path, mock_packages, mutable_config, capsys
|
def test_url_buildcache_entry_v3(monkeypatch, tmpdir):
|
||||||
):
|
"""Make sure URLBuildcacheEntry behaves as expected"""
|
||||||
layout_version = bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION + 1
|
|
||||||
spec = spack.concretize.concretize_one("pkg-c")
|
|
||||||
spec_dict = spec.to_dict()
|
|
||||||
spec_dict["buildcache_layout_version"] = layout_version
|
|
||||||
|
|
||||||
# Setup a basic local build cache structure
|
# Create a temp mirror directory for buildcache usage
|
||||||
path = (
|
mirror_dir = tmpdir.join("mirror_dir")
|
||||||
tmp_path / bindist.build_cache_relative_path() / bindist.tarball_name(spec, ".spec.json")
|
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
|
||||||
)
|
|
||||||
path.parent.mkdir(parents=True)
|
|
||||||
with open(path, "w", encoding="utf-8") as f:
|
|
||||||
json.dump(spec_dict, f)
|
|
||||||
|
|
||||||
# Configure as a mirror.
|
s = Spec("libdwarf").concretized()
|
||||||
mirror_cmd("add", "test-mirror", str(tmp_path))
|
|
||||||
|
|
||||||
# Shouldn't be able "download" this.
|
# Install libdwarf
|
||||||
assert bindist.download_tarball(spec, unsigned=True) is None
|
install_cmd("--fake", s.name)
|
||||||
|
|
||||||
# And there should be a warning about an unsupported layout version.
|
# Push libdwarf to buildcache
|
||||||
assert f"Layout version {layout_version} is too new" in capsys.readouterr().err
|
buildcache_cmd("push", "-u", mirror_dir.strpath, s.name)
|
||||||
|
|
||||||
|
cache_class = get_url_buildcache_class(bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION)
|
||||||
|
build_cache = cache_class(mirror_url, s, allow_unsigned=True)
|
||||||
|
|
||||||
|
manifest = build_cache.read_manifest()
|
||||||
|
spec_dict = build_cache.fetch_metadata()
|
||||||
|
local_tarball_path = build_cache.fetch_archive()
|
||||||
|
|
||||||
|
assert "spec" in spec_dict
|
||||||
|
|
||||||
|
for blob_record in manifest.data:
|
||||||
|
blob_path = build_cache.get_staged_blob_path(blob_record)
|
||||||
|
assert os.path.exists(blob_path)
|
||||||
|
actual_blob_size = os.stat(blob_path).st_size
|
||||||
|
assert blob_record.content_length == actual_blob_size
|
||||||
|
|
||||||
|
build_cache.destroy()
|
||||||
|
|
||||||
|
assert not os.path.exists(local_tarball_path)
|
||||||
|
|
||||||
|
|
||||||
|
def test_relative_path_components():
|
||||||
|
blobs_v3 = URLBuildcacheEntry.get_relative_path_components(BuildcacheComponent.BLOB)
|
||||||
|
assert len(blobs_v3) == 1
|
||||||
|
assert "blobs" in blobs_v3
|
||||||
|
|
||||||
|
blobs_v2 = URLBuildcacheEntryV2.get_relative_path_components(BuildcacheComponent.BLOB)
|
||||||
|
assert len(blobs_v2) == 1
|
||||||
|
assert "build_cache" in blobs_v2
|
||||||
|
|
||||||
|
v2_spec_url = "file:///home/me/mymirror/build_cache/linux-ubuntu22.04-sapphirerapids-gcc-12.3.0-gmake-4.4.1-5pddli3htvfe6svs7nbrqmwi5735agi3.spec.json.sig"
|
||||||
|
assert URLBuildcacheEntryV2.get_base_url(v2_spec_url) == "file:///home/me/mymirror"
|
||||||
|
|
||||||
|
v3_manifest_url = "file:///home/me/mymirror/v3/manifests/gmake-4.4.1-5pddli3htvfe6svs7nbrqmwi5735agi3.spec.manifest.json"
|
||||||
|
assert URLBuildcacheEntry.get_base_url(v3_manifest_url) == "file:///home/me/mymirror"
|
||||||
|
|
||||||
|
|
||||||
@pytest.mark.parametrize(
|
@pytest.mark.parametrize(
|
||||||
@ -1170,3 +1226,244 @@ def test_download_tarball_with_unsupported_layout_fails(
|
|||||||
def test_default_tag(spec: str):
|
def test_default_tag(spec: str):
|
||||||
"""Make sure that computed image tags are valid."""
|
"""Make sure that computed image tags are valid."""
|
||||||
assert re.fullmatch(spack.oci.image.tag, bindist._oci_default_tag(spack.spec.Spec(spec)))
|
assert re.fullmatch(spack.oci.image.tag, bindist._oci_default_tag(spack.spec.Spec(spec)))
|
||||||
|
|
||||||
|
|
||||||
|
class IndexInformation(NamedTuple):
|
||||||
|
manifest_contents: Dict[str, Any]
|
||||||
|
index_contents: str
|
||||||
|
index_hash: str
|
||||||
|
manifest_path: str
|
||||||
|
index_path: str
|
||||||
|
manifest_etag: str
|
||||||
|
fetched_blob: Callable[[], bool]
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def mock_index(tmp_path, monkeypatch) -> IndexInformation:
|
||||||
|
mirror_root = tmp_path / "mymirror"
|
||||||
|
index_json = '{"Hello": "World"}'
|
||||||
|
index_json_hash = bindist.compute_hash(index_json)
|
||||||
|
fetched = False
|
||||||
|
|
||||||
|
cache_class = get_url_buildcache_class(
|
||||||
|
layout_version=bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
|
)
|
||||||
|
|
||||||
|
index_blob_path = os.path.join(
|
||||||
|
str(mirror_root),
|
||||||
|
*cache_class.get_relative_path_components(BuildcacheComponent.BLOB),
|
||||||
|
"sha256",
|
||||||
|
index_json_hash[:2],
|
||||||
|
index_json_hash,
|
||||||
|
)
|
||||||
|
|
||||||
|
os.makedirs(os.path.dirname(index_blob_path))
|
||||||
|
with open(index_blob_path, "w", encoding="utf-8") as fd:
|
||||||
|
fd.write(index_json)
|
||||||
|
|
||||||
|
index_blob_record = bindist.BlobRecord(
|
||||||
|
os.stat(index_blob_path).st_size,
|
||||||
|
cache_class.BUILDCACHE_INDEX_MEDIATYPE,
|
||||||
|
"none",
|
||||||
|
"sha256",
|
||||||
|
index_json_hash,
|
||||||
|
)
|
||||||
|
|
||||||
|
index_manifest = {
|
||||||
|
"version": cache_class.get_layout_version(),
|
||||||
|
"data": [index_blob_record.to_dict()],
|
||||||
|
}
|
||||||
|
|
||||||
|
manifest_json_path = cache_class.get_index_url(str(mirror_root))
|
||||||
|
|
||||||
|
os.makedirs(os.path.dirname(manifest_json_path))
|
||||||
|
|
||||||
|
with open(manifest_json_path, "w", encoding="utf-8") as f:
|
||||||
|
json.dump(index_manifest, f)
|
||||||
|
|
||||||
|
def fetch_patch(stage, mirror_only: bool = False, err_msg: Optional[str] = None):
|
||||||
|
nonlocal fetched
|
||||||
|
fetched = True
|
||||||
|
|
||||||
|
@property # type: ignore
|
||||||
|
def save_filename_patch(stage):
|
||||||
|
return str(index_blob_path)
|
||||||
|
|
||||||
|
monkeypatch.setattr(spack.stage.Stage, "fetch", fetch_patch)
|
||||||
|
monkeypatch.setattr(spack.stage.Stage, "save_filename", save_filename_patch)
|
||||||
|
|
||||||
|
def get_did_fetch():
|
||||||
|
# nonlocal fetched
|
||||||
|
return fetched
|
||||||
|
|
||||||
|
return IndexInformation(
|
||||||
|
index_manifest,
|
||||||
|
index_json,
|
||||||
|
index_json_hash,
|
||||||
|
manifest_json_path,
|
||||||
|
index_blob_path,
|
||||||
|
"59bcc3ad6775562f845953cf01624225",
|
||||||
|
get_did_fetch,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def test_etag_fetching_304():
|
||||||
|
# Test conditional fetch with etags. If the remote hasn't modified the file
|
||||||
|
# it returns 304, which is an HTTPError in urllib-land. That should be
|
||||||
|
# handled as success, since it means the local cache is up-to-date.
|
||||||
|
def response_304(request: urllib.request.Request):
|
||||||
|
url = request.get_full_url()
|
||||||
|
if url.endswith(INDEX_MANIFEST_FILE):
|
||||||
|
assert request.get_header("If-none-match") == '"112a8bbc1b3f7f185621c1ee335f0502"'
|
||||||
|
raise urllib.error.HTTPError(
|
||||||
|
url, 304, "Not Modified", hdrs={}, fp=None # type: ignore[arg-type]
|
||||||
|
)
|
||||||
|
assert False, "Unexpected request {}".format(url)
|
||||||
|
|
||||||
|
fetcher = bindist.EtagIndexFetcher(
|
||||||
|
bindist.MirrorURLAndVersion(
|
||||||
|
"https://www.example.com", bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
|
),
|
||||||
|
etag="112a8bbc1b3f7f185621c1ee335f0502",
|
||||||
|
urlopen=response_304,
|
||||||
|
)
|
||||||
|
|
||||||
|
result = fetcher.conditional_fetch()
|
||||||
|
assert isinstance(result, bindist.FetchIndexResult)
|
||||||
|
assert result.fresh
|
||||||
|
|
||||||
|
|
||||||
|
def test_etag_fetching_200(mock_index):
|
||||||
|
# Test conditional fetch with etags. The remote has modified the file.
|
||||||
|
def response_200(request: urllib.request.Request):
|
||||||
|
url = request.get_full_url()
|
||||||
|
if url.endswith(INDEX_MANIFEST_FILE):
|
||||||
|
assert request.get_header("If-none-match") == '"112a8bbc1b3f7f185621c1ee335f0502"'
|
||||||
|
return urllib.response.addinfourl(
|
||||||
|
io.BytesIO(json.dumps(mock_index.manifest_contents).encode()),
|
||||||
|
headers={"Etag": f'"{mock_index.manifest_etag}"'}, # type: ignore[arg-type]
|
||||||
|
url=url,
|
||||||
|
code=200,
|
||||||
|
)
|
||||||
|
assert False, "Unexpected request {}".format(url)
|
||||||
|
|
||||||
|
fetcher = bindist.EtagIndexFetcher(
|
||||||
|
bindist.MirrorURLAndVersion(
|
||||||
|
"https://www.example.com", bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
|
),
|
||||||
|
etag="112a8bbc1b3f7f185621c1ee335f0502",
|
||||||
|
urlopen=response_200,
|
||||||
|
)
|
||||||
|
|
||||||
|
result = fetcher.conditional_fetch()
|
||||||
|
assert isinstance(result, bindist.FetchIndexResult)
|
||||||
|
assert not result.fresh
|
||||||
|
assert mock_index.fetched_blob()
|
||||||
|
assert result.etag == mock_index.manifest_etag
|
||||||
|
assert result.data == mock_index.index_contents
|
||||||
|
assert result.hash == mock_index.index_hash
|
||||||
|
|
||||||
|
|
||||||
|
def test_etag_fetching_404():
|
||||||
|
# Test conditional fetch with etags. The remote has modified the file.
|
||||||
|
def response_404(request: urllib.request.Request):
|
||||||
|
raise urllib.error.HTTPError(
|
||||||
|
request.get_full_url(),
|
||||||
|
404,
|
||||||
|
"Not found",
|
||||||
|
hdrs={"Etag": '"59bcc3ad6775562f845953cf01624225"'}, # type: ignore[arg-type]
|
||||||
|
fp=None,
|
||||||
|
)
|
||||||
|
|
||||||
|
fetcher = bindist.EtagIndexFetcher(
|
||||||
|
bindist.MirrorURLAndVersion(
|
||||||
|
"https://www.example.com", bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
|
),
|
||||||
|
etag="112a8bbc1b3f7f185621c1ee335f0502",
|
||||||
|
urlopen=response_404,
|
||||||
|
)
|
||||||
|
|
||||||
|
with pytest.raises(bindist.FetchIndexError):
|
||||||
|
fetcher.conditional_fetch()
|
||||||
|
|
||||||
|
|
||||||
|
def test_default_index_fetch_200(mock_index):
|
||||||
|
# We fetch the manifest and then the index blob if the hash is outdated
|
||||||
|
def urlopen(request: urllib.request.Request):
|
||||||
|
url = request.get_full_url()
|
||||||
|
if url.endswith(INDEX_MANIFEST_FILE):
|
||||||
|
return urllib.response.addinfourl( # type: ignore[arg-type]
|
||||||
|
io.BytesIO(json.dumps(mock_index.manifest_contents).encode()),
|
||||||
|
headers={"Etag": f'"{mock_index.manifest_etag}"'}, # type: ignore[arg-type]
|
||||||
|
url=url,
|
||||||
|
code=200,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert False, "Unexpected request {}".format(url)
|
||||||
|
|
||||||
|
fetcher = bindist.DefaultIndexFetcher(
|
||||||
|
bindist.MirrorURLAndVersion(
|
||||||
|
"https://www.example.com", bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
|
),
|
||||||
|
local_hash="outdated",
|
||||||
|
urlopen=urlopen,
|
||||||
|
)
|
||||||
|
|
||||||
|
result = fetcher.conditional_fetch()
|
||||||
|
|
||||||
|
assert isinstance(result, bindist.FetchIndexResult)
|
||||||
|
assert not result.fresh
|
||||||
|
assert mock_index.fetched_blob()
|
||||||
|
assert result.etag == mock_index.manifest_etag
|
||||||
|
assert result.data == mock_index.index_contents
|
||||||
|
assert result.hash == mock_index.index_hash
|
||||||
|
|
||||||
|
|
||||||
|
def test_default_index_404():
|
||||||
|
# We get a fetch error if the index can't be fetched
|
||||||
|
def urlopen(request: urllib.request.Request):
|
||||||
|
raise urllib.error.HTTPError(
|
||||||
|
request.get_full_url(),
|
||||||
|
404,
|
||||||
|
"Not found",
|
||||||
|
hdrs={"Etag": '"59bcc3ad6775562f845953cf01624225"'}, # type: ignore[arg-type]
|
||||||
|
fp=None,
|
||||||
|
)
|
||||||
|
|
||||||
|
fetcher = bindist.DefaultIndexFetcher(
|
||||||
|
bindist.MirrorURLAndVersion(
|
||||||
|
"https://www.example.com", bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
|
),
|
||||||
|
local_hash=None,
|
||||||
|
urlopen=urlopen,
|
||||||
|
)
|
||||||
|
|
||||||
|
with pytest.raises(bindist.FetchIndexError):
|
||||||
|
fetcher.conditional_fetch()
|
||||||
|
|
||||||
|
|
||||||
|
def test_default_index_not_modified(mock_index):
|
||||||
|
# We don't fetch the index blob if hash didn't change
|
||||||
|
def urlopen(request: urllib.request.Request):
|
||||||
|
url = request.get_full_url()
|
||||||
|
if url.endswith(INDEX_MANIFEST_FILE):
|
||||||
|
return urllib.response.addinfourl(
|
||||||
|
io.BytesIO(json.dumps(mock_index.manifest_contents).encode()),
|
||||||
|
headers={}, # type: ignore[arg-type]
|
||||||
|
url=url,
|
||||||
|
code=200,
|
||||||
|
)
|
||||||
|
|
||||||
|
# No other request should be made.
|
||||||
|
assert False, "Unexpected request {}".format(url)
|
||||||
|
|
||||||
|
fetcher = bindist.DefaultIndexFetcher(
|
||||||
|
bindist.MirrorURLAndVersion(
|
||||||
|
"https://www.example.com", bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
|
),
|
||||||
|
local_hash=mock_index.index_hash,
|
||||||
|
urlopen=urlopen,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert fetcher.conditional_fetch().fresh
|
||||||
|
assert not mock_index.fetched_blob()
|
||||||
|
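The tests above drive the v3-layout index fetchers. As a rough usage sketch (not part of the commit): the constructor arguments and result fields below are taken from the test code in this section; the tests always inject a fake urlopen, so the sketch sticks to DefaultIndexFetcher, which a CI test later in this diff constructs with only a MirrorURLAndVersion and a local hash.

    import spack.binary_distribution as bindist

    def fetch_index_data(mirror_url: str, cached_hash=None):
        """Fetch a mirror's buildcache index the way the tests above drive it."""
        url_and_version = bindist.MirrorURLAndVersion(
            mirror_url, bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
        )
        fetcher = bindist.DefaultIndexFetcher(url_and_version, cached_hash)
        result = fetcher.conditional_fetch()  # raises bindist.FetchIndexError on failure
        if result.fresh:
            return None  # local hash still matches; the index blob was not re-fetched
        return result.data  # JSON text of the index; result.hash / result.etag for caching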
@@ -2,7 +2,7 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

-import os
+import shutil

 import pytest

@@ -37,12 +37,7 @@ def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmp_p
     assert not skipped

     # Remove the tarball, which should cause push to push.
-    os.remove(
-        tmp_path
-        / bd.BUILD_CACHE_RELATIVE_PATH
-        / bd.tarball_directory_name(spec)
-        / bd.tarball_name(spec, ".spack")
-    )
+    shutil.rmtree(tmp_path / bd.buildcache_relative_blobs_path())

     with bd.make_uploader(mirror) as uploader:
         skipped = uploader.push_or_raise(specs)
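A note on the hunk above, with a small sketch that is not part of the commit: under the new layout, pushed archives live in a shared, content-addressed blob tree rather than one .spack tarball per spec, which is why the overwrite test now clears the whole blob directory. The helper name below is hypothetical; buildcache_relative_blobs_path is the accessor the diff itself uses.

    import os
    import shutil

    import spack.binary_distribution as bd

    def clear_pushed_blobs(mirror_root: str) -> None:
        """Remove every pushed blob so the next push cannot be skipped."""
        shutil.rmtree(os.path.join(mirror_root, bd.buildcache_relative_blobs_path()))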
@@ -414,7 +414,7 @@ def test_get_spec_filter_list(mutable_mock_env_path, mutable_mock_repo):


 @pytest.mark.regression("29947")
-def test_affected_specs_on_first_concretization(mutable_mock_env_path, mock_packages):
+def test_affected_specs_on_first_concretization(mutable_mock_env_path):
     e = ev.create("first_concretization")
     e.add("mpileaks~shared")
     e.add("mpileaks+shared")
@@ -444,7 +444,7 @@ def _fail(self, args):
     ci.process_command("help", [], str(repro_dir))


-def test_ci_create_buildcache(tmpdir, working_env, config, mock_packages, monkeypatch):
+def test_ci_create_buildcache(tmpdir, working_env, config, monkeypatch):
     """Test that create_buildcache returns a list of objects with the correct
     keys and types."""
     monkeypatch.setattr(ci, "push_to_build_cache", lambda a, b, c: True)
@@ -483,7 +483,7 @@ def test_ci_run_standalone_tests_missing_requirements(

 @pytest.mark.not_on_windows("Reliance on bash script not supported on Windows")
 def test_ci_run_standalone_tests_not_installed_junit(
-    tmp_path, repro_dir, working_env, mock_test_stage, capfd, mock_packages
+    tmp_path, repro_dir, working_env, mock_test_stage, capfd
 ):
     log_file = tmp_path / "junit.xml"
     args = {
@@ -501,7 +501,7 @@ def test_ci_run_standalone_tests_not_installed_junit(

 @pytest.mark.not_on_windows("Reliance on bash script not supported on Windows")
 def test_ci_run_standalone_tests_not_installed_cdash(
-    tmp_path, repro_dir, working_env, mock_test_stage, capfd, mock_packages
+    tmp_path, repro_dir, working_env, mock_test_stage, capfd
 ):
     """Test run_standalone_tests with cdash and related options."""
     log_file = tmp_path / "junit.xml"
@@ -537,7 +537,7 @@ def test_ci_run_standalone_tests_not_installed_cdash(
     assert "No such file or directory" in err


-def test_ci_skipped_report(tmpdir, mock_packages, config):
+def test_ci_skipped_report(tmpdir, config):
     """Test explicit skipping of report as well as CI's 'package' arg."""
     pkg = "trivial-smoke-test"
     spec = spack.concretize.concretize_one(pkg)
|
@ -5,12 +5,16 @@
|
|||||||
import errno
|
import errno
|
||||||
import json
|
import json
|
||||||
import os
|
import os
|
||||||
|
import pathlib
|
||||||
import shutil
|
import shutil
|
||||||
from typing import List
|
from typing import List
|
||||||
|
|
||||||
import pytest
|
import pytest
|
||||||
|
|
||||||
|
from llnl.util.filesystem import copy_tree, find
|
||||||
|
|
||||||
import spack.binary_distribution
|
import spack.binary_distribution
|
||||||
|
import spack.buildcache_migrate as migrate
|
||||||
import spack.cmd.buildcache
|
import spack.cmd.buildcache
|
||||||
import spack.concretize
|
import spack.concretize
|
||||||
import spack.environment as ev
|
import spack.environment as ev
|
||||||
@ -18,8 +22,16 @@
|
|||||||
import spack.main
|
import spack.main
|
||||||
import spack.mirrors.mirror
|
import spack.mirrors.mirror
|
||||||
import spack.spec
|
import spack.spec
|
||||||
import spack.util.url
|
import spack.util.url as url_util
|
||||||
from spack.installer import PackageInstaller
|
from spack.installer import PackageInstaller
|
||||||
|
from spack.paths import test_path
|
||||||
|
from spack.url_buildcache import (
|
||||||
|
BuildcacheComponent,
|
||||||
|
URLBuildcacheEntry,
|
||||||
|
URLBuildcacheEntryV2,
|
||||||
|
check_mirror_for_layout,
|
||||||
|
get_url_buildcache_class,
|
||||||
|
)
|
||||||
|
|
||||||
buildcache = spack.main.SpackCommand("buildcache")
|
buildcache = spack.main.SpackCommand("buildcache")
|
||||||
install = spack.main.SpackCommand("install")
|
install = spack.main.SpackCommand("install")
|
||||||
@ -74,20 +86,6 @@ def test_buildcache_list_allarch(database, mock_get_specs_multiarch, capsys):
|
|||||||
assert output.count("mpileaks") == 2
|
assert output.count("mpileaks") == 2
|
||||||
|
|
||||||
|
|
||||||
def tests_buildcache_create(install_mockery, mock_fetch, monkeypatch, tmpdir):
|
|
||||||
""" "Ensure that buildcache create creates output files"""
|
|
||||||
pkg = "trivial-install-test-package"
|
|
||||||
install(pkg)
|
|
||||||
|
|
||||||
buildcache("push", "--unsigned", str(tmpdir), pkg)
|
|
||||||
|
|
||||||
spec = spack.concretize.concretize_one(pkg)
|
|
||||||
tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
|
|
||||||
tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
|
|
||||||
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
|
|
||||||
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball))
|
|
||||||
|
|
||||||
|
|
||||||
def tests_buildcache_create_env(
|
def tests_buildcache_create_env(
|
||||||
install_mockery, mock_fetch, monkeypatch, tmpdir, mutable_mock_env_path
|
install_mockery, mock_fetch, monkeypatch, tmpdir, mutable_mock_env_path
|
||||||
):
|
):
|
||||||
@ -102,10 +100,15 @@ def tests_buildcache_create_env(
|
|||||||
buildcache("push", "--unsigned", str(tmpdir))
|
buildcache("push", "--unsigned", str(tmpdir))
|
||||||
|
|
||||||
spec = spack.concretize.concretize_one(pkg)
|
spec = spack.concretize.concretize_one(pkg)
|
||||||
tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
|
|
||||||
tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
|
mirror_url = f"file://{tmpdir.strpath}"
|
||||||
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
|
|
||||||
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball))
|
cache_class = get_url_buildcache_class(
|
||||||
|
layout_version=spack.binary_distribution.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
|
)
|
||||||
|
cache_entry = cache_class(mirror_url, spec, allow_unsigned=True)
|
||||||
|
assert cache_entry.exists([BuildcacheComponent.SPEC, BuildcacheComponent.TARBALL])
|
||||||
|
cache_entry.destroy()
|
||||||
|
|
||||||
|
|
||||||
def test_buildcache_create_fails_on_noargs(tmpdir):
|
def test_buildcache_create_fails_on_noargs(tmpdir):
|
||||||
@ -159,12 +162,14 @@ def test_update_key_index(
|
|||||||
# it causes the index to get update.
|
# it causes the index to get update.
|
||||||
buildcache("update-index", "--keys", mirror_dir.strpath)
|
buildcache("update-index", "--keys", mirror_dir.strpath)
|
||||||
|
|
||||||
key_dir_list = os.listdir(os.path.join(mirror_dir.strpath, "build_cache", "_pgp"))
|
key_dir_list = os.listdir(
|
||||||
|
os.path.join(mirror_dir.strpath, spack.binary_distribution.buildcache_relative_keys_path())
|
||||||
|
)
|
||||||
|
|
||||||
uninstall("-y", s.name)
|
uninstall("-y", s.name)
|
||||||
mirror("rm", "test-mirror")
|
mirror("rm", "test-mirror")
|
||||||
|
|
||||||
assert "index.json" in key_dir_list
|
assert "keys.manifest.json" in key_dir_list
|
||||||
|
|
||||||
|
|
||||||
def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
|
def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
|
||||||
@ -180,10 +185,14 @@ def test_buildcache_autopush(tmp_path, install_mockery, mock_fetch):
|
|||||||
# Install and generate build cache index
|
# Install and generate build cache index
|
||||||
PackageInstaller([s.package], fake=True, explicit=True).install()
|
PackageInstaller([s.package], fake=True, explicit=True).install()
|
||||||
|
|
||||||
metadata_file = spack.binary_distribution.tarball_name(s, ".spec.json")
|
assert s.name is not None
|
||||||
|
manifest_file = URLBuildcacheEntry.get_manifest_filename(s)
|
||||||
|
specs_dirs = os.path.join(
|
||||||
|
*URLBuildcacheEntry.get_relative_path_components(BuildcacheComponent.SPEC), s.name
|
||||||
|
)
|
||||||
|
|
||||||
assert not (mirror_dir / "build_cache" / metadata_file).exists()
|
assert not (mirror_dir / specs_dirs / manifest_file).exists()
|
||||||
assert (mirror_autopush_dir / "build_cache" / metadata_file).exists()
|
assert (mirror_autopush_dir / specs_dirs / manifest_file).exists()
|
||||||
|
|
||||||
|
|
||||||
def test_buildcache_sync(
|
def test_buildcache_sync(
|
||||||
@ -205,7 +214,11 @@ def test_buildcache_sync(
|
|||||||
out_env_pkg = "libdwarf"
|
out_env_pkg = "libdwarf"
|
||||||
|
|
||||||
def verify_mirror_contents():
|
def verify_mirror_contents():
|
||||||
dest_list = os.listdir(os.path.join(dest_mirror_dir, "build_cache"))
|
dest_list = os.listdir(
|
||||||
|
os.path.join(
|
||||||
|
dest_mirror_dir, spack.binary_distribution.buildcache_relative_specs_path()
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
found_pkg = False
|
found_pkg = False
|
||||||
|
|
||||||
@ -252,33 +265,15 @@ def verify_mirror_contents():
|
|||||||
verify_mirror_contents()
|
verify_mirror_contents()
|
||||||
shutil.rmtree(dest_mirror_dir)
|
shutil.rmtree(dest_mirror_dir)
|
||||||
|
|
||||||
|
cache_class = get_url_buildcache_class(
|
||||||
|
layout_version=spack.binary_distribution.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
|
)
|
||||||
|
|
||||||
def manifest_insert(manifest, spec, dest_url):
|
def manifest_insert(manifest, spec, dest_url):
|
||||||
manifest[spec.dag_hash()] = [
|
manifest[spec.dag_hash()] = {
|
||||||
{
|
"src": cache_class.get_manifest_url(spec, src_mirror_url),
|
||||||
"src": spack.util.url.join(
|
"dest": cache_class.get_manifest_url(spec, dest_url),
|
||||||
src_mirror_url,
|
}
|
||||||
spack.binary_distribution.build_cache_relative_path(),
|
|
||||||
spack.binary_distribution.tarball_name(spec, ".spec.json"),
|
|
||||||
),
|
|
||||||
"dest": spack.util.url.join(
|
|
||||||
dest_url,
|
|
||||||
spack.binary_distribution.build_cache_relative_path(),
|
|
||||||
spack.binary_distribution.tarball_name(spec, ".spec.json"),
|
|
||||||
),
|
|
||||||
},
|
|
||||||
{
|
|
||||||
"src": spack.util.url.join(
|
|
||||||
src_mirror_url,
|
|
||||||
spack.binary_distribution.build_cache_relative_path(),
|
|
||||||
spack.binary_distribution.tarball_path_name(spec, ".spack"),
|
|
||||||
),
|
|
||||||
"dest": spack.util.url.join(
|
|
||||||
dest_url,
|
|
||||||
spack.binary_distribution.build_cache_relative_path(),
|
|
||||||
spack.binary_distribution.tarball_path_name(spec, ".spack"),
|
|
||||||
),
|
|
||||||
},
|
|
||||||
]
|
|
||||||
|
|
||||||
manifest_file = os.path.join(tmpdir.strpath, "manifest_dest.json")
|
manifest_file = os.path.join(tmpdir.strpath, "manifest_dest.json")
|
||||||
with open(manifest_file, "w", encoding="utf-8") as fd:
|
with open(manifest_file, "w", encoding="utf-8") as fd:
|
||||||
@ -298,9 +293,7 @@ def manifest_insert(manifest, spec, dest_url):
|
|||||||
with open(manifest_file, "w", encoding="utf-8") as fd:
|
with open(manifest_file, "w", encoding="utf-8") as fd:
|
||||||
manifest = {}
|
manifest = {}
|
||||||
for spec in test_env.specs_by_hash.values():
|
for spec in test_env.specs_by_hash.values():
|
||||||
manifest_insert(
|
manifest_insert(manifest, spec, url_util.join(dest_mirror_url, "invalid_path"))
|
||||||
manifest, spec, spack.util.url.join(dest_mirror_url, "invalid_path")
|
|
||||||
)
|
|
||||||
json.dump(manifest, fd)
|
json.dump(manifest, fd)
|
||||||
|
|
||||||
# Trigger the warning
|
# Trigger the warning
|
||||||
@ -327,11 +320,37 @@ def test_buildcache_create_install(
|
|||||||
|
|
||||||
buildcache("push", "--unsigned", str(tmpdir), pkg)
|
buildcache("push", "--unsigned", str(tmpdir), pkg)
|
||||||
|
|
||||||
|
mirror_url = f"file://{tmpdir.strpath}"
|
||||||
|
|
||||||
spec = spack.concretize.concretize_one(pkg)
|
spec = spack.concretize.concretize_one(pkg)
|
||||||
tarball_path = spack.binary_distribution.tarball_path_name(spec, ".spack")
|
cache_class = get_url_buildcache_class(
|
||||||
tarball = spack.binary_distribution.tarball_name(spec, ".spec.json")
|
layout_version=spack.binary_distribution.CURRENT_BUILD_CACHE_LAYOUT_VERSION
|
||||||
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball_path))
|
)
|
||||||
assert os.path.exists(os.path.join(str(tmpdir), "build_cache", tarball))
|
cache_entry = cache_class(mirror_url, spec, allow_unsigned=True)
|
||||||
|
assert spec.name is not None
|
||||||
|
manifest_path = os.path.join(
|
||||||
|
str(tmpdir),
|
||||||
|
*cache_class.get_relative_path_components(BuildcacheComponent.SPEC),
|
||||||
|
spec.name,
|
||||||
|
cache_class.get_manifest_filename(spec),
|
||||||
|
)
|
||||||
|
|
||||||
|
assert os.path.exists(manifest_path)
|
||||||
|
cache_entry.read_manifest()
|
||||||
|
spec_blob_record = cache_entry.get_blob_record(BuildcacheComponent.SPEC)
|
||||||
|
tarball_blob_record = cache_entry.get_blob_record(BuildcacheComponent.TARBALL)
|
||||||
|
|
||||||
|
spec_blob_path = os.path.join(
|
||||||
|
tmpdir.strpath, *cache_class.get_blob_path_components(spec_blob_record)
|
||||||
|
)
|
||||||
|
assert os.path.exists(spec_blob_path)
|
||||||
|
|
||||||
|
tarball_blob_path = os.path.join(
|
||||||
|
tmpdir.strpath, *cache_class.get_blob_path_components(tarball_blob_record)
|
||||||
|
)
|
||||||
|
assert os.path.exists(tarball_blob_path)
|
||||||
|
|
||||||
|
cache_entry.destroy()
|
||||||
|
|
||||||
|
|
||||||
@pytest.mark.parametrize(
|
@pytest.mark.parametrize(
|
||||||
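As a hedged sketch (not part of the diff) of what the rewritten assertions above imply about the v3 on-disk layout: a per-spec manifest sits under the SPEC path components plus the package name, and the spec file and tarball are content-addressed blobs reachable through the manifest's blob records. Only calls that appear in the tests above are used; the helper below is illustrative.

    import os

    import spack.binary_distribution as bindist
    from spack.url_buildcache import BuildcacheComponent, get_url_buildcache_class

    def locate_entry(mirror_root: str, mirror_url: str, spec):
        """Return (manifest path, tarball blob path) for a spec pushed to mirror_root."""
        cache_class = get_url_buildcache_class(
            layout_version=bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
        )
        manifest_path = os.path.join(
            mirror_root,
            *cache_class.get_relative_path_components(BuildcacheComponent.SPEC),
            spec.name,
            cache_class.get_manifest_filename(spec),
        )
        entry = cache_class(mirror_url, spec, allow_unsigned=True)
        entry.read_manifest()
        tarball_record = entry.get_blob_record(BuildcacheComponent.TARBALL)
        blob_path = os.path.join(
            mirror_root, *cache_class.get_blob_path_components(tarball_record)
        )
        return manifest_path, blob_path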
@@ -503,3 +522,230 @@ def test_push_without_build_deps(tmp_path, temporary_store, mock_packages, mutab
         "push", "--update-index", "--without-build-dependencies", "my-mirror", f"/{s.dag_hash()}"
     )
     assert spack.binary_distribution.update_cache_and_get_specs() == [s]
+
+
+@pytest.fixture(scope="function")
+def v2_buildcache_layout(tmp_path):
+    def _layout(signedness: str = "signed"):
+        source_path = str(pathlib.Path(test_path) / "data" / "mirrors" / "v2_layout" / signedness)
+        test_mirror_path = tmp_path / "mirror"
+        copy_tree(source_path, test_mirror_path)
+        return test_mirror_path
+
+    return _layout
+
+
+def test_check_mirror_for_layout(v2_buildcache_layout, mutable_config, capsys):
+    """Check printed warning in the presence of v2 layout binary mirrors"""
+    test_mirror_path = v2_buildcache_layout("unsigned")
+
+    check_mirror_for_layout(spack.mirrors.mirror.Mirror.from_local_path(str(test_mirror_path)))
+    err = str(capsys.readouterr()[1])
+    assert all([word in err for word in ["Warning", "missing", "layout"]])
+
+
+def test_url_buildcache_entry_v2_exists(
+    capsys, v2_buildcache_layout, mock_packages, mutable_config
+):
+    """Test existence check for v2 buildcache entries"""
+    test_mirror_path = v2_buildcache_layout("unsigned")
+    mirror_url = f"file://{test_mirror_path}"
+    mirror("add", "v2mirror", mirror_url)
+
+    with capsys.disabled():
+        output = buildcache("list", "-a", "-l")
+
+    assert "Fetching an index from a v2 binary mirror layout" in output
+    assert "is deprecated" in output
+
+    v2_cache_class = URLBuildcacheEntryV2
+
+    # If you don't give it a spec, it returns False
+    build_cache = v2_cache_class(mirror_url)
+    assert not build_cache.exists([BuildcacheComponent.SPEC, BuildcacheComponent.TARBALL])
+
+    spec = spack.concretize.concretize_one("libdwarf")
+
+    # In v2 we have to ask for both, because we need to have the spec to have the tarball
+    build_cache = v2_cache_class(mirror_url, spec, allow_unsigned=True)
+    assert not build_cache.exists([BuildcacheComponent.TARBALL])
+    assert not build_cache.exists([BuildcacheComponent.SPEC])
+    # But if we do ask for both, they should be there in this case
+    assert build_cache.exists([BuildcacheComponent.SPEC, BuildcacheComponent.TARBALL])
+
+    spec_path = build_cache._get_spec_url(spec, mirror_url, ext=".spec.json")[7:]
+    tarball_path = build_cache._get_tarball_url(spec, mirror_url)[7:]
+
+    os.remove(tarball_path)
+    build_cache = v2_cache_class(mirror_url, spec, allow_unsigned=True)
+    assert not build_cache.exists([BuildcacheComponent.SPEC, BuildcacheComponent.TARBALL])
+
+    os.remove(spec_path)
+    build_cache = v2_cache_class(mirror_url, spec, allow_unsigned=True)
+    assert not build_cache.exists([BuildcacheComponent.SPEC, BuildcacheComponent.TARBALL])
+
+
+@pytest.mark.parametrize("signing", ["unsigned", "signed"])
+def test_install_v2_layout(
+    signing,
+    capsys,
+    v2_buildcache_layout,
+    mock_packages,
+    mutable_config,
+    mutable_mock_env_path,
+    install_mockery,
+    mock_gnupghome,
+    monkeypatch,
+):
+    """Ensure we can still install from signed and unsigned v2 buildcache"""
+    test_mirror_path = v2_buildcache_layout(signing)
+    mirror("add", "my-mirror", str(test_mirror_path))
+
+    # Trust original signing key (no-op if this is the unsigned pass)
+    buildcache("keys", "--install", "--trust")
+
+    with capsys.disabled():
+        output = install("--fake", "--no-check-signature", "libdwarf")
+
+    assert "Extracting libelf" in output
+    assert "libelf: Successfully installed" in output
+    assert "Extracting libdwarf" in output
+    assert "libdwarf: Successfully installed" in output
+    assert "Installing a spec from a v2 binary mirror layout" in output
+    assert "is deprecated" in output
+
+
+def test_basic_migrate_unsigned(capsys, v2_buildcache_layout, mutable_config):
+    """Make sure first unsigned migration results in usable buildcache,
+    leaving the previous layout in place. Also test that a subsequent one
+    doesn't need to migrate anything, and that using --delete-existing
+    removes the previous layout"""
+
+    test_mirror_path = v2_buildcache_layout("unsigned")
+    mirror("add", "my-mirror", str(test_mirror_path))
+
+    with capsys.disabled():
+        output = buildcache("migrate", "--unsigned", "my-mirror")
+
+    # The output indicates both specs were migrated
+    assert output.count("Successfully migrated") == 6
+
+    build_cache_path = str(test_mirror_path / "build_cache")
+
+    # Without "--delete-existing" and "--yes-to-all", migration leaves the
+    # previous layout in place
+    assert os.path.exists(build_cache_path)
+    assert os.path.isdir(build_cache_path)
+
+    # Now list the specs available under the new layout
+    with capsys.disabled():
+        output = buildcache("list", "--allarch")
+
+    assert "libdwarf" in output and "libelf" in output
+
+    with capsys.disabled():
+        output = buildcache(
+            "migrate", "--unsigned", "--delete-existing", "--yes-to-all", "my-mirror"
+        )
+
+    # A second migration of the same mirror indicates neither spec
+    # needs to be migrated
+    assert output.count("No need to migrate") == 6
+
+    # When we provide "--delete-existing" and "--yes-to-all", migration
+    # removes the old layout
+    assert not os.path.exists(build_cache_path)
+
+
+def test_basic_migrate_signed(
+    capsys, v2_buildcache_layout, monkeypatch, mock_gnupghome, mutable_config
+):
+    """Test a signed migration requires a signing key, requires the public
+    key originally used to sign the pkgs, fails and prints reasonable messages
+    if those requirements are unmet, and eventually succeeds when they are met."""
+    test_mirror_path = v2_buildcache_layout("signed")
+    mirror("add", "my-mirror", str(test_mirror_path))
+
+    with pytest.raises(migrate.MigrationException) as error:
+        buildcache("migrate", "my-mirror")
+
+    # Without a signing key spack fails and explains why
+    assert error.value.message == "Signed migration requires exactly one secret key in keychain"
+
+    # Create a signing key and trust the key used to sign the pkgs originally
+    gpg("create", "New Test Signing Key", "noone@nowhere.org")
+
+    with capsys.disabled():
+        output = buildcache("migrate", "my-mirror")
+
+    # Without trusting the original signing key, spack fails with an explanation
+    assert "Failed to verify signature of libelf" in output
+    assert "Failed to verify signature of libdwarf" in output
+    assert "did you mean to perform an unsigned migration" in output
+
+    # Trust original signing key (since it's in the original layout location,
+    # this is where the monkeypatched attribute is used)
+    with capsys.disabled():
+        output = buildcache("keys", "--install", "--trust")
+
+    with capsys.disabled():
+        output = buildcache("migrate", "my-mirror")
+
+    # Once we have the proper keys, migration should succeed
+    assert "Successfully migrated libelf" in output
+    assert "Successfully migrated libelf" in output
+
+    # Now list the specs available under the new layout
+    with capsys.disabled():
+        output = buildcache("list", "--allarch")
+
+    assert "libdwarf" in output and "libelf" in output
+
+
+def test_unsigned_migrate_of_signed_mirror(capsys, v2_buildcache_layout, mutable_config):
+    """Test spack can do an unsigned migration of a signed buildcache by
+    ignoring signatures and skipping re-signing."""
+
+    test_mirror_path = v2_buildcache_layout("signed")
+    mirror("add", "my-mirror", str(test_mirror_path))
+
+    with capsys.disabled():
+        output = buildcache(
+            "migrate", "--unsigned", "--delete-existing", "--yes-to-all", "my-mirror"
+        )
+
+    # Now list the specs available under the new layout
+    with capsys.disabled():
+        output = buildcache("list", "--allarch")
+
+    assert "libdwarf" in output and "libelf" in output
+
+    # We should find two spec manifest files, one for each spec
+    file_list = find(test_mirror_path, "*.spec.manifest.json")
+    assert len(file_list) == 6
+    assert any(["libdwarf" in file for file in file_list])
+    assert any(["libelf" in file for file in file_list])
+
+    # The two spec manifest files should be unsigned
+    for file_path in file_list:
+        with open(file_path, "r", encoding="utf-8") as fd:
+            assert json.load(fd)
+
+
+def test_migrate_requires_index(capsys, v2_buildcache_layout, mutable_config):
+    """Test spack fails with a reasonable error message when mirror does
+    not have an index"""
+
+    test_mirror_path = v2_buildcache_layout("unsigned")
+    v2_index_path = test_mirror_path / "build_cache" / "index.json"
+    v2_index_hash_path = test_mirror_path / "build_cache" / "index.json.hash"
+    os.remove(str(v2_index_path))
+    os.remove(str(v2_index_hash_path))
+
+    mirror("add", "my-mirror", str(test_mirror_path))
+
+    with pytest.raises(migrate.MigrationException) as error:
+        buildcache("migrate", "--unsigned", "my-mirror")
+
+    # If the buildcache has no index, spack fails and explains why
+    assert error.value.message == "Buildcache migration requires a buildcache index"
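The migration tests above translate into a short workflow. The sketch below is not part of the commit; it uses the same SpackCommand wrappers the tests use, a placeholder mirror path, and only the flag combinations exercised above.

    import spack.main

    buildcache = spack.main.SpackCommand("buildcache")
    mirror = spack.main.SpackCommand("mirror")
    gpg = spack.main.SpackCommand("gpg")

    mirror("add", "my-mirror", "/path/to/v2/mirror")  # placeholder path

    # Unsigned migration leaves the old build_cache/ tree in place ...
    buildcache("migrate", "--unsigned", "my-mirror")
    # ... while --delete-existing --yes-to-all removes the v2 layout afterwards.
    buildcache("migrate", "--unsigned", "--delete-existing", "--yes-to-all", "my-mirror")

    # A signed migration needs exactly one secret key in the keychain and the
    # original public key trusted first, per the assertions above:
    gpg("create", "New Test Signing Key", "noone@nowhere.org")
    buildcache("keys", "--install", "--trust")
    buildcache("migrate", "my-mirror")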
@@ -31,11 +31,8 @@
 from spack.ci.common import PipelineDag, PipelineOptions, SpackCIConfig
 from spack.ci.generator_registry import generator
 from spack.cmd.ci import FAILED_CREATE_BUILDCACHE_CODE
-from spack.database import INDEX_JSON_FILE
 from spack.error import SpackError
-from spack.schema.buildcache_spec import schema as specfile_schema
 from spack.schema.database_index import schema as db_idx_schema
-from spack.spec import Spec
 from spack.test.conftest import MockHTTPResponse

 config_cmd = spack.main.SpackCommand("config")
@@ -718,7 +715,7 @@ def test_ci_nothing_to_rebuild(
     )

     install_cmd("archive-files")
-    buildcache_cmd("push", "-f", "-u", mirror_url, "archive-files")
+    buildcache_cmd("push", "-f", "-u", "--update-index", mirror_url, "archive-files")

     with working_dir(tmp_path):
         env_cmd("create", "test", "./spack.yaml")
@@ -855,18 +852,18 @@ def test_push_to_build_cache(

     # Test generating buildcache index while we have bin mirror
     buildcache_cmd("update-index", mirror_url)
-    with open(mirror_dir / "build_cache" / INDEX_JSON_FILE, encoding="utf-8") as idx_fd:
-        index_object = json.load(idx_fd)
-        jsonschema.validate(index_object, db_idx_schema)
+    # Validate resulting buildcache (database) index
+    layout_version = spack.binary_distribution.CURRENT_BUILD_CACHE_LAYOUT_VERSION
+    url_and_version = spack.binary_distribution.MirrorURLAndVersion(
+        mirror_url, layout_version
+    )
+    index_fetcher = spack.binary_distribution.DefaultIndexFetcher(url_and_version, None)
+    result = index_fetcher.conditional_fetch()
+    jsonschema.validate(json.loads(result.data), db_idx_schema)

     # Now that index is regenerated, validate "buildcache list" output
     assert "patchelf" in buildcache_cmd("list", output=str)
-    # Also test buildcache_spec schema
-    for file_name in os.listdir(mirror_dir / "build_cache"):
-        if file_name.endswith(".spec.json.sig"):
-            with open(mirror_dir / "build_cache" / file_name, encoding="utf-8") as f:
-                spec_dict = Spec.extract_json_from_clearsig(f.read())
-                jsonschema.validate(spec_dict, specfile_schema)

     logs_dir = scratch / "logs_dir"
     logs_dir.mkdir()
@@ -1032,7 +1029,7 @@ def test_ci_generate_override_runner_attrs(


 def test_ci_rebuild_index(
-    tmp_path: pathlib.Path, working_env, mutable_mock_env_path, install_mockery, mock_fetch
+    tmp_path: pathlib.Path, working_env, mutable_mock_env_path, install_mockery, mock_fetch, capsys
 ):
     scratch = tmp_path / "working_dir"
     mirror_dir = scratch / "mirror"
@@ -1069,8 +1066,9 @@ def test_ci_rebuild_index(
     buildcache_cmd("push", "-u", "-f", mirror_url, "callpath")
     ci_cmd("rebuild-index")

-    with open(mirror_dir / "build_cache" / INDEX_JSON_FILE, encoding="utf-8") as f:
-        jsonschema.validate(json.load(f), db_idx_schema)
+    with capsys.disabled():
+        output = buildcache_cmd("list", "--allarch")
+    assert "callpath" in output


 def test_ci_get_stack_changed(mock_git_repo, monkeypatch):
@@ -2030,13 +2028,12 @@ def test_ci_verify_versions_valid(
     tmpdir,
 ):
     repo, _, commits = mock_git_package_changes
-    spack.repo.PATH.put_first(repo)
-
-    monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
-
-    out = ci_cmd("verify-versions", commits[-1], commits[-3])
-    assert "Validated diff-test@2.1.5" in out
-    assert "Validated diff-test@2.1.6" in out
+    with spack.repo.use_repositories(repo):
+        monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
+
+        out = ci_cmd("verify-versions", commits[-1], commits[-3])
+        assert "Validated diff-test@2.1.5" in out
+        assert "Validated diff-test@2.1.6" in out


 def test_ci_verify_versions_standard_invalid(
@@ -2047,23 +2044,21 @@ def test_ci_verify_versions_standard_invalid(
     verify_git_versions_invalid,
 ):
     repo, _, commits = mock_git_package_changes
-    spack.repo.PATH.put_first(repo)
-
-    monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
-
-    out = ci_cmd("verify-versions", commits[-1], commits[-3], fail_on_error=False)
-    assert "Invalid checksum found diff-test@2.1.5" in out
-    assert "Invalid commit for diff-test@2.1.6" in out
+    with spack.repo.use_repositories(repo):
+        monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
+
+        out = ci_cmd("verify-versions", commits[-1], commits[-3], fail_on_error=False)
+        assert "Invalid checksum found diff-test@2.1.5" in out
+        assert "Invalid commit for diff-test@2.1.6" in out


 def test_ci_verify_versions_manual_package(monkeypatch, mock_packages, mock_git_package_changes):
     repo, _, commits = mock_git_package_changes
-    spack.repo.PATH.put_first(repo)
-
-    monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
-
-    pkg_class = spack.spec.Spec("diff-test").package_class
-    monkeypatch.setattr(pkg_class, "manual_download", True)
-
-    out = ci_cmd("verify-versions", commits[-1], commits[-2])
-    assert "Skipping manual download package: diff-test" in out
+    with spack.repo.use_repositories(repo):
+        monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
+
+        pkg_class = spack.spec.Spec("diff-test").package_class
+        monkeypatch.setattr(pkg_class, "manual_download", True)
+
+        out = ci_cmd("verify-versions", commits[-1], commits[-2])
+        assert "Skipping manual download package: diff-test" in out
@@ -448,7 +448,7 @@ def test_find_loaded(database, working_env):


 @pytest.mark.regression("37712")
-def test_environment_with_version_range_in_compiler_doesnt_fail(tmp_path):
+def test_environment_with_version_range_in_compiler_doesnt_fail(tmp_path, mock_packages):
     """Tests that having an active environment with a root spec containing a compiler constrained
     by a version range (i.e. @X.Y rather the single version than @=X.Y) doesn't result in an error
     when invoking "spack find".
@@ -8,6 +8,7 @@

 import llnl.util.filesystem as fs

+import spack.binary_distribution as bindist
 import spack.util.executable
 import spack.util.gpg
 from spack.main import SpackCommand
@@ -172,23 +173,25 @@ def test_gpg(tmpdir, mutable_config, mock_gnupghome):
     # Verification should now succeed again.
     gpg("verify", str(test_path))

+    relative_keys_path = bindist.buildcache_relative_keys_path()
+
     # Publish the keys using a directory path
     test_path = tmpdir.join("dir_cache")
-    os.makedirs("%s" % test_path)
+    os.makedirs(f"{test_path}")
     gpg("publish", "--rebuild-index", "-d", str(test_path))
-    assert os.path.exists("%s/build_cache/_pgp/index.json" % test_path)
+    assert os.path.exists(f"{test_path}/{relative_keys_path}/keys.manifest.json")

     # Publish the keys using a mirror url
     test_path = tmpdir.join("url_cache")
-    os.makedirs("%s" % test_path)
-    test_url = "file://%s" % test_path
+    os.makedirs(f"{test_path}")
+    test_url = f"file://{test_path}"
     gpg("publish", "--rebuild-index", "--mirror-url", test_url)
-    assert os.path.exists("%s/build_cache/_pgp/index.json" % test_path)
+    assert os.path.exists(f"{test_path}/{relative_keys_path}/keys.manifest.json")

     # Publish the keys using a mirror name
     test_path = tmpdir.join("named_cache")
-    os.makedirs("%s" % test_path)
-    mirror_url = "file://%s" % test_path
+    os.makedirs(f"{test_path}")
+    mirror_url = f"file://{test_path}"
     mirror("add", "gpg", mirror_url)
     gpg("publish", "--rebuild-index", "-m", "gpg")
-    assert os.path.exists("%s/build_cache/_pgp/index.json" % test_path)
+    assert os.path.exists(f"{test_path}/{relative_keys_path}/keys.manifest.json")
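A small sketch (not part of the diff) of where `spack gpg publish --rebuild-index` now leaves the key index, per the assertions above: the old build_cache/_pgp/index.json location is replaced by a keys.manifest.json under the relative keys path. The helper name below is hypothetical.

    import os

    import spack.binary_distribution as bindist

    def published_key_index(mirror_root: str) -> str:
        """Path where the rebuilt key index is expected after `spack gpg publish`."""
        return os.path.join(
            mirror_root, bindist.buildcache_relative_keys_path(), "keys.manifest.json"
        )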
@@ -9,6 +9,8 @@
 import spack.main
 import spack.repo

+pytestmark = [pytest.mark.usefixtures("mock_packages")]
+
 maintainers = spack.main.SpackCommand("maintainers")

 MAINTAINED_PACKAGES = [
@@ -26,17 +28,17 @@ def split(output):
     return re.split(r"\s+", output) if output else []


-def test_maintained(mock_packages):
+def test_maintained():
     out = split(maintainers("--maintained"))
     assert out == MAINTAINED_PACKAGES


-def test_unmaintained(mock_packages):
+def test_unmaintained():
     out = split(maintainers("--unmaintained"))
     assert out == sorted(set(spack.repo.all_package_names()) - set(MAINTAINED_PACKAGES))


-def test_all(mock_packages, capfd):
+def test_all(capfd):
     with capfd.disabled():
         out = split(maintainers("--all"))
         assert out == [
@@ -63,7 +65,7 @@ def test_all(mock_packages, capfd):
         assert out == ["maintainers-1:", "user1,", "user2"]


-def test_all_by_user(mock_packages, capfd):
+def test_all_by_user(capfd):
     with capfd.disabled():
         out = split(maintainers("--all", "--by-user"))
         assert out == [
@@ -100,22 +102,22 @@ def test_all_by_user(mock_packages, capfd):
         ]


-def test_no_args(mock_packages):
+def test_no_args():
     with pytest.raises(spack.main.SpackCommandError):
         maintainers()


-def test_no_args_by_user(mock_packages):
+def test_no_args_by_user():
     with pytest.raises(spack.main.SpackCommandError):
         maintainers("--by-user")


-def test_mutex_args_fail(mock_packages):
+def test_mutex_args_fail():
     with pytest.raises(SystemExit):
         maintainers("--maintained", "--unmaintained")


-def test_maintainers_list_packages(mock_packages, capfd):
+def test_maintainers_list_packages(capfd):
     with capfd.disabled():
         out = split(maintainers("maintainers-1"))
         assert out == ["user1", "user2"]
@@ -129,13 +131,13 @@ def test_maintainers_list_packages(mock_packages, capfd):
         assert out == ["user2", "user3"]


-def test_maintainers_list_fails(mock_packages, capfd):
+def test_maintainers_list_fails(capfd):
     out = maintainers("pkg-a", fail_on_error=False)
     assert not out
     assert maintainers.returncode == 1


-def test_maintainers_list_by_user(mock_packages, capfd):
+def test_maintainers_list_by_user(capfd):
     with capfd.disabled():
         out = split(maintainers("--by-user", "user1"))
         assert out == ["maintainers-1", "maintainers-3", "py-extension1"]
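The maintainers tests above stop taking `mock_packages` as an explicit argument and rely on the module-level `pytestmark` added in the first hunk. A small self-contained sketch of that pytest pattern follows; the toy fixture body is invented for illustration, only the `usefixtures` mechanics mirror the diff:

```python
import pytest


@pytest.fixture
def mock_packages():
    # Illustrative stand-in; in Spack this fixture wires up the mock package repo.
    return ["maintainers-1", "maintainers-2"]


# Applies the fixture to every test in this module, so individual tests
# no longer need to list it as a parameter.
pytestmark = [pytest.mark.usefixtures("mock_packages")]


def test_maintained():
    # The fixture still runs for this test even though it is not an argument;
    # only its side effects are available, not its return value.
    assert True
```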
@@ -6,6 +6,7 @@

 import pytest

+import spack.binary_distribution as bindist
 import spack.cmd.mirror
 import spack.concretize
 import spack.config
@@ -365,8 +366,10 @@ def test_mirror_destroy(
     install("--fake", "--no-cache", spec_name)
     buildcache("push", "-u", "-f", mirror_dir.strpath, spec_name)

+    blobs_path = bindist.buildcache_relative_blobs_path()
+
     contents = os.listdir(mirror_dir.strpath)
-    assert "build_cache" in contents
+    assert blobs_path in contents

     # Destroy mirror by name
     mirror("destroy", "-m", "atest")
@@ -376,7 +379,7 @@ def test_mirror_destroy(
     buildcache("push", "-u", "-f", mirror_dir.strpath, spec_name)

     contents = os.listdir(mirror_dir.strpath)
-    assert "build_cache" in contents
+    assert blobs_path in contents

     # Destroy mirror by url
     mirror("destroy", "--mirror-url", mirror_url)
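The interesting change here is the assertion: instead of hard-coding the legacy `build_cache` directory name, the test asks `spack.binary_distribution` for the layout-relative blobs path. Below is a minimal sketch of the same idea, with a hypothetical `relative_blobs_path()` helper standing in for the real `bindist.buildcache_relative_blobs_path()`; its return value is made up:

```python
import os


def relative_blobs_path() -> str:
    # Hypothetical stand-in for bindist.buildcache_relative_blobs_path();
    # the real helper returns whatever directory name the current layout uses.
    return "blobs"


def mirror_has_buildcache(mirror_dir: str) -> bool:
    # Asking the layout helper instead of hard-coding "build_cache" keeps the
    # check valid if the on-disk layout changes again.
    return relative_blobs_path() in os.listdir(mirror_dir)
```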
@@ -7,6 +7,8 @@

 from spack.main import SpackCommand

+pytestmark = [pytest.mark.usefixtures("mock_packages")]
+
 providers = SpackCommand("providers")


@@ -24,16 +26,28 @@ def test_it_just_runs(pkg):
         (
             ("mpi",),
             [
-                "mpich",
-                "mpilander",
-                "mvapich2",
-                "openmpi",
-                "openmpi@1.7.5:",
-                "openmpi@2.0.0:",
-                "spectrum-mpi",
+                "intel-parallel-studio",
+                "low-priority-provider",
+                "mpich@3:",
+                "mpich2",
+                "multi-provider-mpi@1.10.0",
+                "multi-provider-mpi@2.0.0",
+                "zmpi",
             ],
         ),
-        (("D", "awk"), ["ldc", "gawk", "mawk"]),  # Call 2 virtual packages at once
+        (
+            ("lapack", "something"),
+            [
+                "intel-parallel-studio",
+                "low-priority-provider",
+                "netlib-lapack",
+                "openblas-with-lapack",
+                "simple-inheritance",
+                "splice-a",
+                "splice-h",
+                "splice-vh",
+            ],
+        ),  # Call 2 virtual packages at once
     ],
 )
 def test_provider_lists(vpkg, provider_list):
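The expected provider lists for the `mpi` and two-virtual cases are updated to match the mock repository. The test itself is a plain `pytest.mark.parametrize` over `(vpkg, provider_list)` pairs; a self-contained sketch of that pattern is below, with an invented provider table standing in both for the mock packages and for the real `providers` command:

```python
import pytest

# Invented lookup table standing in for the mock package repository.
FAKE_PROVIDERS = {
    "mpi": ["mpich@3:", "zmpi"],
    "lapack": ["netlib-lapack", "openblas-with-lapack"],
}


def fake_providers_cmd(*vpkgs):
    # Stand-in for running the providers command: one sorted, de-duplicated
    # list of providers covering every requested virtual package.
    found = set()
    for vpkg in vpkgs:
        found.update(FAKE_PROVIDERS.get(vpkg.split("@")[0], []))
    return sorted(found)


@pytest.mark.parametrize(
    "vpkg,provider_list",
    [
        (("mpi",), ["mpich@3:", "zmpi"]),
        (("lapack", "mpi"), ["mpich@3:", "netlib-lapack", "openblas-with-lapack", "zmpi"]),
    ],
)
def test_provider_lists(vpkg, provider_list):
    assert fake_providers_cmd(*vpkg) == provider_list
```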
@@ -19,7 +19,7 @@ def test_list():

 def test_list_with_pytest_arg():
     output = spack_test("--list", cmd_test_py)
-    assert output.strip() == cmd_test_py
+    assert cmd_test_py in output.strip()


 def test_list_with_keywords():
@@ -27,7 +27,7 @@ def test_list_with_keywords():
     # since the behavior is inconsistent across different pytest
    # versions, see https://stackoverflow.com/a/48814787/771663
     output = spack_test("--list", "-k", "unit_test.py")
-    assert output.strip() == cmd_test_py
+    assert cmd_test_py in output.strip()


 def test_list_long(capsys):
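Relaxing `==` to `in` makes the assertion tolerant of extra entries in the `--list` output, which appears to be the point of the change. A tiny illustration:

```python
def test_membership_vs_equality():
    # With extra entries present, equality fails but substring membership holds.
    listed = ["cmd/unit_test.py", "cmd/other_test.py"]
    output = "\n".join(listed)

    assert "cmd/unit_test.py" in output.strip()   # new-style check passes
    assert output.strip() != "cmd/unit_test.py"   # old-style check would fail here
```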
@@ -1068,9 +1068,7 @@ def install_mockery(temporary_store: spack.store.Store, mutable_config, mock_pac
 @pytest.fixture(scope="module")
 def temporary_mirror_dir(tmpdir_factory):
     dir = tmpdir_factory.mktemp("mirror")
-    dir.ensure("build_cache", dir=True)
     yield str(dir)
-    dir.join("build_cache").remove()


 @pytest.fixture(scope="function")
@@ -1084,9 +1082,7 @@ def temporary_mirror(temporary_mirror_dir):
 @pytest.fixture(scope="function")
 def mutable_temporary_mirror_dir(tmpdir_factory):
     dir = tmpdir_factory.mktemp("mirror")
-    dir.ensure("build_cache", dir=True)
     yield str(dir)
-    dir.join("build_cache").remove()


 @pytest.fixture(scope="function")
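Both fixtures lose their `build_cache` setup and teardown, presumably because the code under test now creates whatever layout directories it needs inside the mirror on push. A minimal sketch of the resulting fixture shape, assuming pytest's `tmpdir_factory` owns the lifetime of the directories it hands out:

```python
import pytest


@pytest.fixture(scope="module")
def temporary_mirror_dir(tmpdir_factory):
    # Hand out an empty scratch directory; any buildcache layout inside it is
    # created by the code under test, not by the fixture.
    mirror_root = tmpdir_factory.mktemp("mirror")
    yield str(mirror_root)
```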
@@ -1,54 +0,0 @@
-{
-  "spec": {
-    "_meta": {
-      "version": 3
-    },
-    "nodes": [
-      {
-        "name": "archive-files",
-        "version": "2.0",
-        "arch": {
-          "platform": "test",
-          "platform_os": "debian6",
-          "target": {
-            "name": "core2",
-            "vendor": "GenuineIntel",
-            "features": [
-              "mmx",
-              "sse",
-              "sse2",
-              "ssse3"
-            ],
-            "generation": 0,
-            "parents": [
-              "nocona"
-            ]
-          }
-        },
-        "compiler": {
-          "name": "gcc",
-          "version": "4.5.0"
-        },
-        "namespace": "builtin.mock",
-        "parameters": {
-          "cflags": [],
-          "cppflags": [],
-          "cxxflags": [],
-          "fflags": [],
-          "ldflags": [],
-          "ldlibs": []
-        },
-        "package_hash": "ncv2pr4o2yemepsa4h7u4p4dsgieul5fxvh6s5am5fsb65ebugaa====",
-        "hash": "l3vdiqvbobmspwyb4q2b62fz6nitd4hk"
-      }
-    ]
-  },
-  "binary_cache_checksum": {
-    "hash_algorithm": "sha256",
-    "hash": "c226b51d88876746efd6f9737cc6dfdd349870b6c0b9c045d9bad0f2764a40b9"
-  },
-  "buildinfo": {
-    "relative_prefix": "test-debian6-core2/gcc-4.5.0/archive-files-2.0-l3vdiqvbobmspwyb4q2b62fz6nitd4hk",
-    "relative_rpaths": false
-  }
-}
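The deleted fixture is an old-style buildcache metadata file: a version-3 spec document plus a `binary_cache_checksum` over the associated tarball. A hedged sketch of how such a checksum could be checked with only the standard library; the file paths are placeholders and this is not Spack's own verification code:

```python
import hashlib
import json


def tarball_matches_manifest(manifest_path: str, tarball_path: str) -> bool:
    # Compare the recorded binary_cache_checksum against the actual tarball.
    # Field names follow the deleted fixture above.
    with open(manifest_path, encoding="utf-8") as f:
        manifest = json.load(f)

    recorded = manifest["binary_cache_checksum"]
    algo = hashlib.new(recorded["hash_algorithm"])
    with open(tarball_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            algo.update(chunk)

    return algo.hexdigest() == recorded["hash"]
```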
Binary file not shown.
@@ -0,0 +1,29 @@
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mQINBGf23+EBEAC6UqaiE43cF9jFuVjA8xJ5j31BMhufpnk0cwoE5Iks/GgR/Hki
+LMYbzy36V7TZGObel+5DtFKipX+WCwWj2XsjbeqHeuCkxZhzHFwfi1UJl9FO2T28
+iNn6OsBiGeU6ULNmehSia2hx0uhj1re/FUwJExOAvuYv8nc7M+nozqi7Pp/WjP8v
+UTiqP2onzZJbidlSBvmZ2nheWk7G78e617gcV/ye+UyXZvciiF2UQBg9YV6D8JuD
+YhBbNAVOzJOiyOdTBmZmOkmYsGx58sEbFVqGeOMB0xoxZrqKjMm9NhvjqjJF/sWs
+hN/PD5ylW1UR05/fGxlG2GLKKfBInbdqnC101OFWXP5HenYHmKaBJoCKCAUfsoJ0
+r/t/GVh3z3w/99p0TRDONnTecKm5S9z3/5QjjE5RsWcd4ll7mRikUiVpe1WhKRwT
+4T76pQLq3XwNJqiOmuMQuSHoBE9OMufvRFiTYC0QHyLoCV2H5PCWtS2xSsIDN4PB
+0RNd0hnHKanVV7d2TkIrGOagoAo0wXqyW/Op6KUG1NdaFYYziDFEHeZxfGoPKytO
+iS5PEwZG2FqambAZhJU5OXwzgnCRIoE5DCZad4YS6U5YD/2zg+RrQ/5GUxl5Cc+W
+Zwesn9FV5jywx/oFePYbTSNQVPQ6jbUDvhmHvZ8c/OfGOVXQr0VpvfIwdwARAQAB
+tD1UZXN0IFNpZ25pbmcgS2V5IChHUEcgY3JlYXRlZCBmb3IgU3BhY2spIDxub2Jv
+ZHlAbm93aGVyZS5jb20+iQJRBBMBCAA7FiEEqYoEuILhnYX9Nu4GlWXYCwVckv8F
+Amf23+ECGwMFCwkIBwICIgIGFQoJCAsCBBYCAwECHgcCF4AACgkQlWXYCwVckv9i
+pg//eGjBR9ph9hUYRsekzKWM1xB5zFOFfNoqlpCut/W7LAfy0XXkFy/y6EvPdcgn
+lLWRWPsOFfsKGwZd7LgSovhEMQ2MRsAUUB/KNZx7s6vO/P773PmJspF3odQ/lcrM
+1fum2lShChWqimdBdNLrXxG+8duO9uWaMBIp28diBCyB25M/MqpHtKYu00FB/QJ6
+ZwQH4OsgXVQHRjyrtIGx/2FQoWt0ah3eJMJCEw46GgkgiojtoTfXQQc4fIJP324b
+O1sxz5lx3xVBG/EZYzyV3xnSoG9aZNJ1cJq8EKO7ZoNKc/8jwkVu5gewGaXYI0LK
+/WkOeiXcSHPMSdu7TpnitvLYFCjc9YAEKQnjooXdt7+BElwC3+5hZJNXEnoGPMzn
+3UL60sQE/ViCsGcW+l9rtzXPNTmLMjEg4rGRqOhX+UmwyhvGD2QYbZtXlayu5xn+
+5m/PfmdqgL1xsdvNsLo/BOo+6kizMdBk48Xfp0YM8AC4BzUEENypGzC4T0WYF0k1
+Jfc6/eSwiytIcIkJ42GlaVfEFE8UxfYc1/2zqTBN9EdzWJqy0Bh+mVOgOaeb0Dzi
+xWpUpChi1fBB3PXWJ5iAS/w0HSVn4G5/JAIEFAs7r6ju2YtKBfuk+u/K5Q28mo7W
+6LrZQywN44nBMTvSQUhhXpSNYG+juyotXJUJ3F2u9Cf/jVU=
+=TkbL
+-----END PGP PUBLIC KEY BLOCK-----
@@ -0,0 +1,29 @@
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mQINBGfHlp4BEAC5wkZSHqF9z6GcymuHpk1m9aNXCJdt4ZWvE8ck8GcuVu1nbzlZ
+h959jqtwk7nFMki5YaNMz6jcQf0eeS75viL4CoPAqFiVyhyCCh5am75h9F7vTBq6
+190017lhu9IgkAkiklnjfDbyXH+BwqJ78nXp6e6R4ShFMHNGGvYLem1wmPKzqPlZ
+zN0yjc0+d5pw4hu+IEFrM63yqGp2BVX1X132IKUEcROCQt1QOma5oORhYEtSCieX
+PuhuHJOA7q6nJuFccPCs5OcDS4IbQgGAbWL4L1+LAGVLVGpK4IVtqEZ831Srclh8
+0ruyFFeV/hqOONThwwile0Jwh5Jz/2sYxT5c+nlumXWK+CXTm4OCfGt1UuGy6c6u
+Rz84PHfanbKnATp6RUjz4DMREkmA6qBnUFqGLLGaBKBsm42b7kbo7m5aeItuOwLE
+U7AcnBEqqHLfI7O1zrHKjQCxhEWP/iok0kgEdiJ4tlPhfDjQRG6thlmZnVdt/08V
++bvVkbYZyWPzjbG3QHyFew1+uzPHb2UopgpByVKYEWhCgNfcFtE56lEI9c40Ba5o
+LaZl0VlgfSLP4c+LoFB6gZp1gcVQuPo1JKd1v5WP60f1iHhazL5LEeMYcW6kvujK
+58Q683gSH5DsVAnxaj1uU4nvtKDh8IF1CNKKXk8RVsltdpv9bGhV8b4qVQARAQAB
+tD1UZXN0IFNpZ25pbmcgS2V5IChHUEcgY3JlYXRlZCBmb3IgU3BhY2spIDxub2Jv
+ZHlAbm93aGVyZS5jb20+iQJOBBMBCgA4FiEE6J1JcfAJex56PrVzcbSEgC54180F
+AmfHlp4CGwMFCwkIBwIGFQoJCAsCBBYCAwECHgECF4AACgkQcbSEgC54180aDg//
+f7GqIW5LzYqIqkey+IjdkSSfeD47tlWc2ukKYStHu0gTlHhrUp4rHNJ/s8XQ1o6o
+jwzWfNMYh68wt9sjuM2BEkkh3RUFEjVqqW+k562gS5ibfKTDtJb2Yj0n/CQKWvoi
+vUUzO88xW0AnZFieP+vD5iI5Zw4H2dY8cH4X1XlWAJufFdH4WBaZjujNwNOcCsnd
+w2nE050wKTR2wroWq0HKn1Ni3QNtKWPpLoHGAlhW6ACLa+EFqxHU6D3KhW6IV4Jc
+sdt36nHNiRiy6nT99asqtN6Z0Yw+EnQSuIDosIbmSgZoieINh0gU6AKwgydxLUxL
+Cu1w2fZHGuFR/ym0c/tTpM893DxHMc/EZ/SpU8fXkC9lYnQO3or/Y0mLHd0kSEv7
+XoonvcOu1tOQzmvrvUQUtTn4+6OKpGViyZG5C8Lbk8/yKWFv5b+Gpss/EiGTHSsk
+bPTHf5jMsWElv0GgFq2TpybtIcY52yJoZ1fBMEA9Nk76Y/MNFlN0d7HyS6tWGr6E
+8FWJB7RYG5XHMEDIKSheq+Q5cORwz92JPFI+sovZukp+20G7f7/gwos441KamJPc
+y1+M4uO21aKX2fA07bcgFtm25gNLoHyvjQLcmyDis6xogvciCV3iQ/mtunewgYp/
+lUX1dv0R5o8TteaAIkbJicbdLtur/iuAWN404E/QShc=
+=8P00
+-----END PGP PUBLIC KEY BLOCK-----
@@ -0,0 +1 @@
+{"keys":{"A98A04B882E19D85FD36EE069565D80B055C92FF":{},"E89D4971F0097B1E7A3EB57371B484802E78D7CD":{}}}
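The new key index is a single JSON object mapping each signing key's fingerprint to a (currently empty) metadata object, so reading it needs nothing beyond `json`. The snippet below uses the exact content added above:

```python
import json

# Content of the key index added in this commit.
index_text = (
    '{"keys":{"A98A04B882E19D85FD36EE069565D80B055C92FF":{},'
    '"E89D4971F0097B1E7A3EB57371B484802E78D7CD":{}}}'
)

fingerprints = sorted(json.loads(index_text)["keys"])
print(fingerprints)
```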
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
+7f94d6038bb4e5e7fff817151da5b22d7dd6d1e2d9ad51bd55504676786c17bd
@ -0,0 +1,124 @@
|
|||||||
|
-----BEGIN PGP SIGNED MESSAGE-----
|
||||||
|
Hash: SHA512
|
||||||
|
|
||||||
|
{
|
||||||
|
"spec":{
|
||||||
|
"_meta":{
|
||||||
|
"version":4
|
||||||
|
},
|
||||||
|
"nodes":[
|
||||||
|
{
|
||||||
|
"name":"libdwarf",
|
||||||
|
"version":"20130729",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"core2",
|
||||||
|
"vendor":"GenuineIntel",
|
||||||
|
"features":[
|
||||||
|
"mmx",
|
||||||
|
"sse",
|
||||||
|
"sse2",
|
||||||
|
"ssse3"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"nocona"
|
||||||
|
],
|
||||||
|
"cpupart":""
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"compiler":{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1"
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"n7axrpelzl5kjuctt4yoaaf33gvgnik6cx7fjudwhc6hvywdrr4q====",
|
||||||
|
"dependencies":[
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"hash":"rqh2vuf6fqwkmipzgi2wjx352mq7y7ez",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build",
|
||||||
|
"link"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"hash":"sk2gqqz4n5njmvktycnd25wq25jxiqkr"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"version":"0.8.13",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"core2",
|
||||||
|
"vendor":"GenuineIntel",
|
||||||
|
"features":[
|
||||||
|
"mmx",
|
||||||
|
"sse",
|
||||||
|
"sse2",
|
||||||
|
"ssse3"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"nocona"
|
||||||
|
],
|
||||||
|
"cpupart":""
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"compiler":{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1"
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ejr32l7tkp6uhdrlunqv4adkuxqwyac7vbqcjvg6dh72mll4cpiq====",
|
||||||
|
"hash":"rqh2vuf6fqwkmipzgi2wjx352mq7y7ez"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"buildcache_layout_version":2,
|
||||||
|
"binary_cache_checksum":{
|
||||||
|
"hash_algorithm":"sha256",
|
||||||
|
"hash":"811f500a89ae7d2f61e2c0ef6f56e352dfbac245ae88275809088a1481489d5b"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
-----BEGIN PGP SIGNATURE-----
|
||||||
|
|
||||||
|
iQIzBAEBCgAdFiEE6J1JcfAJex56PrVzcbSEgC54180FAmfHlp8ACgkQcbSEgC54
|
||||||
|
180hlxAAisLofFhr/PQvLcQ79T3t3V0tqGgz9x6QnPKfbPCgvb66tTNlny+ML0fY
|
||||||
|
y1H9xXQO53QOxfN9cdXcf2EVbRQ2eT6ltmwekI3ZZuCaTguflNu/i11UV6UnDy3x
|
||||||
|
dXOYQhky5QjtPbhJ0NxG5XDKoRFoUPR/rgXsiNG5O0sk3M5H9ldpsj8af5W/6LCL
|
||||||
|
gCTNM8fF0TVbd4MF9TiIECFBng2CrxhHwpl2gPHHxab1zxLRCF6t1lZvL6To0hmC
|
||||||
|
e/Tqre+42PhRSCtXuwhK22r0rvreVUaiglYn8udjOJHwNVKdzLnTZ1OBAFeIq00U
|
||||||
|
9uuroyaF841pq9+8PitwUORurv0lsnHUbfbi/+ou0HzMiaXzz+MPdOXt8nUuyScs
|
||||||
|
oKOi8ExvpWJ7vn6klkvQtMK/Gakzd4YOxO/nk9K8BJgVN3qrODwHYSORk8RrdITS
|
||||||
|
tkjiEJiIoklddiwCf3NUzlxiIYWbiqKqNbY+Pxh4B+OpVDnvRmpkJHgoSuVoCS8b
|
||||||
|
coaOTIgqDpnIClHIj7ogxO+ureRjIIkGNNh6wVhlHDlgm1GzxNUOklMrzDkYMD01
|
||||||
|
eTYxrbicw7ZVwqhFtR8olODKT9QAqXUJOkGHS9IA6FJctatkUkIOG1DSI52AZV1r
|
||||||
|
PYzgdKtTxS60EkN8Igl6VMTkaC05anLygCTyOvGaV7sqVKmzHY8=
|
||||||
|
=8OR5
|
||||||
|
-----END PGP SIGNATURE-----
|
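The signed spec fixtures above are PGP clear-signed documents: a `Hash:` header block, the signed JSON payload, and an armored signature trailer. A minimal sketch of slicing the JSON back out without verifying the signature; verification would still be done with GPG, and this helper is illustrative, not Spack's own parser:

```python
import json


def extract_clearsigned_json(text: str) -> dict:
    # Pull the JSON payload out of a PGP clear-signed document like the
    # fixture above. The signed text sits between the armor header block
    # and the "-----BEGIN PGP SIGNATURE-----" trailer.
    lines = text.splitlines()
    start = lines.index("-----BEGIN PGP SIGNED MESSAGE-----") + 1
    end = lines.index("-----BEGIN PGP SIGNATURE-----")
    body = lines[start:end]
    # Skip the "Hash: ..." header line(s); they end at the first blank line.
    while body and body[0].strip():
        body.pop(0)
    return json.loads("\n".join(body))
```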
@ -0,0 +1,72 @@
|
|||||||
|
-----BEGIN PGP SIGNED MESSAGE-----
|
||||||
|
Hash: SHA512
|
||||||
|
|
||||||
|
{
|
||||||
|
"spec":{
|
||||||
|
"_meta":{
|
||||||
|
"version":4
|
||||||
|
},
|
||||||
|
"nodes":[
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"version":"0.8.13",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"core2",
|
||||||
|
"vendor":"GenuineIntel",
|
||||||
|
"features":[
|
||||||
|
"mmx",
|
||||||
|
"sse",
|
||||||
|
"sse2",
|
||||||
|
"ssse3"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"nocona"
|
||||||
|
],
|
||||||
|
"cpupart":""
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"compiler":{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1"
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ejr32l7tkp6uhdrlunqv4adkuxqwyac7vbqcjvg6dh72mll4cpiq====",
|
||||||
|
"hash":"rqh2vuf6fqwkmipzgi2wjx352mq7y7ez"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"buildcache_layout_version":2,
|
||||||
|
"binary_cache_checksum":{
|
||||||
|
"hash_algorithm":"sha256",
|
||||||
|
"hash":"48c8aa769a62535f9d9f613722e3d3f5a48b91fde3c99a644b22f277a4502d75"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
-----BEGIN PGP SIGNATURE-----
|
||||||
|
|
||||||
|
iQIzBAEBCgAdFiEE6J1JcfAJex56PrVzcbSEgC54180FAmfHlp8ACgkQcbSEgC54
|
||||||
|
182ezg/7Bkil1mY6d4recJMkFhpBzzDs8aMD+WQOBPoy/bWHIGsPb1DyOOW7lTLa
|
||||||
|
QC9jh9Rq02oMeX0LWvNg7k6iMTayWcrPzJwk1rgh3pg/ySgCTZ576/aP/UOZwA8h
|
||||||
|
HT/3RzsDFlq7Wkh4yYaDgSEDVc5PgUevb1p2f126Z9HMFjG8siEWmuZQOcy4I9JG
|
||||||
|
osQFtwWTLmx96sBMzweZTu2i3iGTPNz4Ae1hu+v5clmSFg43eW7EWChEVoob+3hb
|
||||||
|
hLRxajZEPsIho4yR5yynoxduXeXrLLP7GH6XGnYt7Z2GJR0UamIrPfxYuWBK76V1
|
||||||
|
03Ie2rRXwOKfsjDWw9Z8ziTVu25G0aZ274DX6eQyaWKfvzz69cBXO0fgw1lU8B9S
|
||||||
|
K0j9k/xtnDCrIkPSh4QGQpFRlbzxkj20E+EnwgDCGIlK1rBzo2V5na4YNj+SbC91
|
||||||
|
0BmWrj6dRkQZUMJHeb95kBMfFpKG5B6u7HQxZtIwHFAfF0nypbiB7xmdy/gAmUao
|
||||||
|
ej3Cu34DvWtLVeSh7lRimeEc44WyBDk2YSPqYleAwYMZBn4WSozUS/KVLU2T/AhZ
|
||||||
|
VlLaEBaFrVngmsw5PCdck0XRSNSAN9HUgPItpOzYig20NeT1/69wIlUZVNpLEYGT
|
||||||
|
yvZsmqHFnkunAs6av3XmGl0i8rSA6DujunpNXML6hUciFEK5wg4=
|
||||||
|
=Aq8h
|
||||||
|
-----END PGP SIGNATURE-----
|
Binary file not shown.
Binary file not shown.
@ -0,0 +1,429 @@
|
|||||||
|
-----BEGIN PGP SIGNED MESSAGE-----
|
||||||
|
Hash: SHA256
|
||||||
|
|
||||||
|
{
|
||||||
|
"spec":{
|
||||||
|
"_meta":{
|
||||||
|
"version":5
|
||||||
|
},
|
||||||
|
"nodes":[
|
||||||
|
{
|
||||||
|
"name":"libdwarf",
|
||||||
|
"version":"20130729",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"n7axrpelzl5kjuctt4yoaaf33gvgnik6cx7fjudwhc6hvywdrr4q====",
|
||||||
|
"dependencies":[
|
||||||
|
{
|
||||||
|
"name":"compiler-wrapper",
|
||||||
|
"hash":"qeehcxyvluwnihsc2qxstmpomtxo3lrc",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[
|
||||||
|
"c",
|
||||||
|
"cxx"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc-runtime",
|
||||||
|
"hash":"izgzpzeljwairalfjm3k6fntbb64nt6n",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"link"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"hash":"jr3yipyxyjulcdvckwwwjrrumis7glpa",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build",
|
||||||
|
"link"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"u5uz3dcch5if4eve4sef67o2rf2lbfgh"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"compiler-wrapper",
|
||||||
|
"version":"1.0",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ss7ybgvqf2fa2lvkf67eavllfxpxthiml2dobtkdq6wn7zkczteq====",
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"qeehcxyvluwnihsc2qxstmpomtxo3lrc"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":"aarch64"
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"languages":[
|
||||||
|
"c",
|
||||||
|
"c++",
|
||||||
|
"fortran"
|
||||||
|
],
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"external":{
|
||||||
|
"path":"/path",
|
||||||
|
"module":null,
|
||||||
|
"extra_attributes":{
|
||||||
|
"compilers":{
|
||||||
|
"c":"/path/bin/gcc-10",
|
||||||
|
"cxx":"/path/bin/g++-10",
|
||||||
|
"fortran":"/path/bin/gfortran-10"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"package_hash":"a7d6wvl2mh4od3uue3yxqonc7r7ihw3n3ldedu4kevqa32oy2ysa====",
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc-runtime",
|
||||||
|
"version":"10.2.1",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"up2pdsw5tfvmn5gwgb3opl46la3uxoptkr3udmradd54s7qo72ha====",
|
||||||
|
"dependencies":[
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"izgzpzeljwairalfjm3k6fntbb64nt6n"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"version":"0.8.13",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ejr32l7tkp6uhdrlunqv4adkuxqwyac7vbqcjvg6dh72mll4cpiq====",
|
||||||
|
"dependencies":[
|
||||||
|
{
|
||||||
|
"name":"compiler-wrapper",
|
||||||
|
"hash":"qeehcxyvluwnihsc2qxstmpomtxo3lrc",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[
|
||||||
|
"c"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc-runtime",
|
||||||
|
"hash":"izgzpzeljwairalfjm3k6fntbb64nt6n",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"link"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"jr3yipyxyjulcdvckwwwjrrumis7glpa"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"buildcache_layout_version":2,
|
||||||
|
"binary_cache_checksum":{
|
||||||
|
"hash_algorithm":"sha256",
|
||||||
|
"hash":"0898457b4cc4b18d71059ea254667fb6690f5933c82e1627f9fed3606488dbca"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
-----BEGIN PGP SIGNATURE-----
|
||||||
|
|
||||||
|
iQIzBAEBCAAdFiEEqYoEuILhnYX9Nu4GlWXYCwVckv8FAmf23+QACgkQlWXYCwVc
|
||||||
|
kv9Xlg//d7uWhVbHjujSXRpoN3hzH5sUvvTSZ9xzvXGAXCoAu2oEGg4hxZPIFQJ3
|
||||||
|
pZzKysZMfeFg+UKwDzex5TlKZ3JtKgCTKYl64zZfUl2EQgo/d/Fjz5mSFHW/6sa1
|
||||||
|
1uTe3+sVt+HlijN72t2412Qbp+/uGvU+KBvXPA7kgkp88Kd/PL9xe3jlT9ytH5Nw
|
||||||
|
3LIghe++JiepjFAKXTfIA04EjLb8c50AAxsK5Xx37HOOVHHQ8L9anFnOVYM+DxAz
|
||||||
|
gn4dBYUQ9Uu5k5uEu5CwtxsED2/Yar7YWIepEnyp6z4zQVbwjO4/w0vZ3wSJ9c4P
|
||||||
|
UhZs8V2akuqIWyzlQuBOjywnEQc/nw9v0py+Dr/Qr3U4XWh/LARWABMxa4IqXMOK
|
||||||
|
aVmd6weVjV4U929gaOT/FCtZPfaFNRbk97YP8yAxuLhSdiGS0Mp16Ygz21fVWB7C
|
||||||
|
UjkGGsKK1cdiJQ0m1CffmydU/nbDjSuw4WZIoIgDzvN7SFm7YBtE+xY+RUPsHU22
|
||||||
|
QMAXojF5abwn48HJeP47MYdfR7+nUJq6XJiJ7/80a7Ciy8SAVxinQWqvigf/hmTf
|
||||||
|
kAiQaqOVSlRBJ2yry5fYBKHSIRvghCqS4t4es8o13R7n2wz68VqKu0JkNlT3Ijjc
|
||||||
|
QjJYtI+844PCDNetPVV8iNWF6upnTJnPHcFmKAEO1663hOc3Dh8=
|
||||||
|
=3fA5
|
||||||
|
-----END PGP SIGNATURE-----
|
@ -0,0 +1,317 @@
|
|||||||
|
-----BEGIN PGP SIGNED MESSAGE-----
|
||||||
|
Hash: SHA256
|
||||||
|
|
||||||
|
{
|
||||||
|
"spec":{
|
||||||
|
"_meta":{
|
||||||
|
"version":5
|
||||||
|
},
|
||||||
|
"nodes":[
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"version":"0.8.13",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ejr32l7tkp6uhdrlunqv4adkuxqwyac7vbqcjvg6dh72mll4cpiq====",
|
||||||
|
"dependencies":[
|
||||||
|
{
|
||||||
|
"name":"compiler-wrapper",
|
||||||
|
"hash":"qeehcxyvluwnihsc2qxstmpomtxo3lrc",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[
|
||||||
|
"c"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc-runtime",
|
||||||
|
"hash":"izgzpzeljwairalfjm3k6fntbb64nt6n",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"link"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"jr3yipyxyjulcdvckwwwjrrumis7glpa"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"compiler-wrapper",
|
||||||
|
"version":"1.0",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ss7ybgvqf2fa2lvkf67eavllfxpxthiml2dobtkdq6wn7zkczteq====",
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"qeehcxyvluwnihsc2qxstmpomtxo3lrc"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":"aarch64"
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"languages":[
|
||||||
|
"c",
|
||||||
|
"c++",
|
||||||
|
"fortran"
|
||||||
|
],
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"external":{
|
||||||
|
"path":"/path",
|
||||||
|
"module":null,
|
||||||
|
"extra_attributes":{
|
||||||
|
"compilers":{
|
||||||
|
"c":"/path/bin/gcc-10",
|
||||||
|
"cxx":"/path/bin/g++-10",
|
||||||
|
"fortran":"/path/bin/gfortran-10"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"package_hash":"a7d6wvl2mh4od3uue3yxqonc7r7ihw3n3ldedu4kevqa32oy2ysa====",
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc-runtime",
|
||||||
|
"version":"10.2.1",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"up2pdsw5tfvmn5gwgb3opl46la3uxoptkr3udmradd54s7qo72ha====",
|
||||||
|
"dependencies":[
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"izgzpzeljwairalfjm3k6fntbb64nt6n"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"buildcache_layout_version":2,
|
||||||
|
"binary_cache_checksum":{
|
||||||
|
"hash_algorithm":"sha256",
|
||||||
|
"hash":"c068bcd1a27a3081c07ba775d83e90228e340bb6a7f0d55deb18a462760c4bcf"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
-----BEGIN PGP SIGNATURE-----
|
||||||
|
|
||||||
|
iQIzBAEBCAAdFiEEqYoEuILhnYX9Nu4GlWXYCwVckv8FAmf23+QACgkQlWXYCwVc
|
||||||
|
kv/zSg/+NrS4JjT9TFSFR/q2vaN9aL7fSTunxp+M8eAzTmg0sgHc/D6ov2PMpUF7
|
||||||
|
1E2mnZ2gL5a5dHtsSCf30ILFzQoD+m+I9yOwcJopcbEjr8pcnXBFe6TT8lkxlXtI
|
||||||
|
EHNsYGMUHFbFvc+hFdWatQJicdDaIbdyEMGAC7Kobs/4KpdBF5VWV+sIrzD5+XzO
|
||||||
|
ACiKRjBmcaJpa950nuEaFzBITgq1aDtZ0EEZdXYvjRnzj9Bm6gbqmWzlllW1wf4r
|
||||||
|
5hSMTpAsRED4TxL433nuf0nKIvTD5Mywzs88kiLCtEABfDy1qccyBAnjyNypFF6B
|
||||||
|
fPqSDnr33s+JQ35t7RcHKfrgowk69UablE25YOUrQP6LtH4QzLBLj4/Z0zuz33hO
|
||||||
|
v+YYe51DgixsMQ2WCKWEO6sNcrcrLBJMFVwUP2FyTTdW3jCYRlFiTYLSfoDhTRJ/
|
||||||
|
4o7f2eEp3sVoOe12jKI6dw/P+c70dl8K4+1ICcnZkwsb0pd0vt2z4J2kPs2+1/0g
|
||||||
|
vpywJO1HL5Zy7/ZRlmeeSMHYEDX2eKhm7QRFbxw1IEbg3stQCA7a425JWztyJ05K
|
||||||
|
sfhFQgPt7F/xanJVFYk/hdza+3+5pFr1K/ARcLFBdLBKGxAXTMMR+NkMp3J5NiOo
|
||||||
|
SMZJ3jG6xA2ntvSkyx/GFawD0FpnlgEByU3E+R/WiQA4VojLpvo=
|
||||||
|
=kfWI
|
||||||
|
-----END PGP SIGNATURE-----
|
@ -0,0 +1,99 @@
|
|||||||
|
-----BEGIN PGP SIGNED MESSAGE-----
|
||||||
|
Hash: SHA256
|
||||||
|
|
||||||
|
{
|
||||||
|
"spec":{
|
||||||
|
"_meta":{
|
||||||
|
"version":5
|
||||||
|
},
|
||||||
|
"nodes":[
|
||||||
|
{
|
||||||
|
"name":"compiler-wrapper",
|
||||||
|
"version":"1.0",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ss7ybgvqf2fa2lvkf67eavllfxpxthiml2dobtkdq6wn7zkczteq====",
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"qeehcxyvluwnihsc2qxstmpomtxo3lrc"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"buildcache_layout_version":2,
|
||||||
|
"binary_cache_checksum":{
|
||||||
|
"hash_algorithm":"sha256",
|
||||||
|
"hash":"2c1c5576e30b7063aa02a22111eb24b3f2a93c35ac0f64b4e491c7078706c0ea"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
-----BEGIN PGP SIGNATURE-----
|
||||||
|
|
||||||
|
iQIzBAEBCAAdFiEEqYoEuILhnYX9Nu4GlWXYCwVckv8FAmf23+QACgkQlWXYCwVc
|
||||||
|
kv/T8BAAhK/v7CP6lMIKILj35nEi+Gftjs7B7f6qvb4QNtqcGHum6z9t3JxkOOrd
|
||||||
|
+q+Wd329kLYAFs/y9eaGe5X7wY1U7/f863i3XrxHbtmrnMci61D8qMjA1xnBGC+5
|
||||||
|
yd746aVeV/VRbJxTeB9kGcKPMcIQYcearlDMgj5fKfpCKM8a+VyJfw7qHNUyrTnu
|
||||||
|
d6LSGsEey6tGkJecgnJZTNSwryO3BZbg/4EviivMXm38AKGZrSib06qjkoHrPRvB
|
||||||
|
8ftGSGlK4YmFs5/YjKFL7QzuNJeqPNJt4mD64tsk21urOfbQJe5AmdMLPGY0PbW/
|
||||||
|
w++06c8lsd/6FmzUwlnTBUa39lKJjhkhoK7KFGVqZROcXZfhwAyqPZt7ReA5FDMV
|
||||||
|
l5X7sytjQuSFaQPGi5g1xXQGEI394T2I55p5T5/RuQ2PXcFxxSOmIcEcD8o6Z7+x
|
||||||
|
XWLq44KUWQyQP/StjaVhIz9YPogeBBJllA9hN+GzVrr2i+Esu1QO5uDgVuJP7pTA
|
||||||
|
9wwCLV/t0hf2TZcpU2fwEu+DMniaHm6haVwqiu6QGkbkMBx49zkV9b5i9L441GoC
|
||||||
|
Q86R2Gs9O0+QzHuN6egbQ0xKm/lfU8dmJSzV0snXawAeQ/vgCpdinx40EMc7Nz03
|
||||||
|
rgZ3j88c/ADvCb1DVKmu1Phf6U7WqG6/AvB9tYl4Zl30VX7ETaw=
|
||||||
|
=ifvQ
|
||||||
|
-----END PGP SIGNATURE-----
|
@ -0,0 +1,151 @@
|
|||||||
|
-----BEGIN PGP SIGNED MESSAGE-----
|
||||||
|
Hash: SHA256
|
||||||
|
|
||||||
|
{
|
||||||
|
"spec":{
|
||||||
|
"_meta":{
|
||||||
|
"version":5
|
||||||
|
},
|
||||||
|
"nodes":[
|
||||||
|
{
|
||||||
|
"name":"gcc-runtime",
|
||||||
|
"version":"10.2.1",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"up2pdsw5tfvmn5gwgb3opl46la3uxoptkr3udmradd54s7qo72ha====",
|
||||||
|
"dependencies":[
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"izgzpzeljwairalfjm3k6fntbb64nt6n"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":"aarch64"
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"languages":[
|
||||||
|
"c",
|
||||||
|
"c++",
|
||||||
|
"fortran"
|
||||||
|
],
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"external":{
|
||||||
|
"path":"/path",
|
||||||
|
"module":null,
|
||||||
|
"extra_attributes":{
|
||||||
|
"compilers":{
|
||||||
|
"c":"/path/bin/gcc-10",
|
||||||
|
"cxx":"/path/bin/g++-10",
|
||||||
|
"fortran":"/path/bin/gfortran-10"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"package_hash":"a7d6wvl2mh4od3uue3yxqonc7r7ihw3n3ldedu4kevqa32oy2ysa====",
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"buildcache_layout_version":2,
|
||||||
|
"binary_cache_checksum":{
|
||||||
|
"hash_algorithm":"sha256",
|
||||||
|
"hash":"f33e7a6798a5fb2db6e538d3a530cc79b298e36d56a1df385d93889a9ba431d0"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
-----BEGIN PGP SIGNATURE-----
|
||||||
|
|
||||||
|
iQIzBAEBCAAdFiEEqYoEuILhnYX9Nu4GlWXYCwVckv8FAmf23+QACgkQlWXYCwVc
|
||||||
|
kv+MsRAAsaQjZbB9iW/Lq9b87H/E5Zmv6RrClvpjSnwvhLR4nhPL3p0G70k6tI/b
|
||||||
|
NEdXctDyvBOJOEoLaEBrCODl/3GjV8B9Gj7OhT/BIKQjlOfJqVdwIrnHgav5ri+Q
|
||||||
|
UUXLtejhJiUNoxeILI/xZx2CoKT9q/3EpQ5ysqdybJmYJCf/hv+lXEhnwUIv8vV/
|
||||||
|
xdRYY//rfeMowCNIZtFPjSejMywXJfFKjl7h5dN5kwM63D6z/sh4zW7tqHq4kk+A
|
||||||
|
2m0WcorVg93wAm+YoJaQJVx8bYeMGfV/TjmY/cSouCt8PM4Vi93vwieZCkzEpXbM
|
||||||
|
BkVN4X3PTMZSOf0WTkEbnQD5v090/DoQPZyBrcDoJ/HmWDiz5Is2wUI0mLVkbg2L
|
||||||
|
+rKNC3ZajJhsWElMGNNtZRLmGeTIe8hT+LNAejo221vrOJbnUmpIjKxVjStDbXmW
|
||||||
|
nulgyEPSTfsJaXgbXmeJ8LOk0tWpBAGC16VzgXrPxoGD2XKxoiPCGLNrF/l1wyl+
|
||||||
|
n+nw3TchNFrofpPrqJzT/vS71B6KDb0PVSTQZfM9+FahrQ+YbsIkzDAuxVZb5t3q
|
||||||
|
HUME95RgoIBbccUGxAPwkaNme2OLaLzsJZ/Xhl5I8T1fraLYapsKNjQ5+CSKO8+t
|
||||||
|
MlJYgSHuazWSetRbZ2H7g7QJWqeHUAWi9i1szpNDYxTFSs8wgDY=
|
||||||
|
=edPy
|
||||||
|
-----END PGP SIGNATURE-----
|
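Several of these fixtures carry two version markers: `_meta.version` on the spec document and `annotations.original_specfile_version` on each node, which, as the name suggests, appears to record the format the spec was originally written in. A small sketch of reading both; the `example` dict is a trimmed copy of the fixture data:

```python
def specfile_versions(manifest: dict) -> tuple:
    # Current specfile format plus the per-node original version annotation.
    nodes = manifest["spec"]["nodes"]
    meta_version = manifest["spec"]["_meta"]["version"]
    original = nodes[0].get("annotations", {}).get("original_specfile_version")
    return meta_version, original


example = {
    "spec": {
        "_meta": {"version": 5},
        "nodes": [{"name": "libelf", "annotations": {"original_specfile_version": 5}}],
    }
}
print(specfile_versions(example))  # (5, 5)
```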
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
File diff suppressed because one or more lines are too long
@@ -0,0 +1 @@
+57cad2589fae55cda3c35cadf4286d2e7702f90a708da80d70a76213fc45a688
@ -0,0 +1,105 @@
|
|||||||
|
{
|
||||||
|
"spec":{
|
||||||
|
"_meta":{
|
||||||
|
"version":4
|
||||||
|
},
|
||||||
|
"nodes":[
|
||||||
|
{
|
||||||
|
"name":"libdwarf",
|
||||||
|
"version":"20130729",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"core2",
|
||||||
|
"vendor":"GenuineIntel",
|
||||||
|
"features":[
|
||||||
|
"mmx",
|
||||||
|
"sse",
|
||||||
|
"sse2",
|
||||||
|
"ssse3"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"nocona"
|
||||||
|
],
|
||||||
|
"cpupart":""
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"compiler":{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1"
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"n7axrpelzl5kjuctt4yoaaf33gvgnik6cx7fjudwhc6hvywdrr4q====",
|
||||||
|
"dependencies":[
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"hash":"rqh2vuf6fqwkmipzgi2wjx352mq7y7ez",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build",
|
||||||
|
"link"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"hash":"sk2gqqz4n5njmvktycnd25wq25jxiqkr"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"version":"0.8.13",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"core2",
|
||||||
|
"vendor":"GenuineIntel",
|
||||||
|
"features":[
|
||||||
|
"mmx",
|
||||||
|
"sse",
|
||||||
|
"sse2",
|
||||||
|
"ssse3"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"nocona"
|
||||||
|
],
|
||||||
|
"cpupart":""
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"compiler":{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1"
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ejr32l7tkp6uhdrlunqv4adkuxqwyac7vbqcjvg6dh72mll4cpiq====",
|
||||||
|
"hash":"rqh2vuf6fqwkmipzgi2wjx352mq7y7ez"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"buildcache_layout_version":2,
|
||||||
|
"binary_cache_checksum":{
|
||||||
|
"hash_algorithm":"sha256",
|
||||||
|
"hash":"f31bce1bfdcaa9b11cd02b869dd07a843db9819737399b99ac614ab3552bd4b6"
|
||||||
|
}
|
||||||
|
}
|
@ -0,0 +1,53 @@
|
|||||||
|
{
|
||||||
|
"spec":{
|
||||||
|
"_meta":{
|
||||||
|
"version":4
|
||||||
|
},
|
||||||
|
"nodes":[
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"version":"0.8.13",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"core2",
|
||||||
|
"vendor":"GenuineIntel",
|
||||||
|
"features":[
|
||||||
|
"mmx",
|
||||||
|
"sse",
|
||||||
|
"sse2",
|
||||||
|
"ssse3"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"nocona"
|
||||||
|
],
|
||||||
|
"cpupart":""
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"compiler":{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1"
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ejr32l7tkp6uhdrlunqv4adkuxqwyac7vbqcjvg6dh72mll4cpiq====",
|
||||||
|
"hash":"rqh2vuf6fqwkmipzgi2wjx352mq7y7ez"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"buildcache_layout_version":2,
|
||||||
|
"binary_cache_checksum":{
|
||||||
|
"hash_algorithm":"sha256",
|
||||||
|
"hash":"523ebc3691b0687cf93fcd477002972ea7d6da5d3e8d46636b30d4f1052fcbf2"
|
||||||
|
}
|
||||||
|
}
|
Binary file not shown.
Binary file not shown.
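The unsigned spec.json fixtures above share one node layout: each node lists its dependencies by hash, so the root spec is the node no other node depends on. A short sketch of recovering the roots from such a file; the path argument is a placeholder:

```python
import json


def root_specs(path: str):
    # For spec.json fixtures like the ones added above: collect every hash
    # that appears as a dependency, then keep the nodes nobody depends on.
    with open(path, encoding="utf-8") as f:
        nodes = json.load(f)["spec"]["nodes"]

    dep_hashes = {
        dep["hash"] for node in nodes for dep in node.get("dependencies", [])
    }
    return [(n["name"], n["hash"]) for n in nodes if n["hash"] not in dep_hashes]
```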
@ -0,0 +1,410 @@
|
|||||||
|
{
|
||||||
|
"spec":{
|
||||||
|
"_meta":{
|
||||||
|
"version":5
|
||||||
|
},
|
||||||
|
"nodes":[
|
||||||
|
{
|
||||||
|
"name":"libdwarf",
|
||||||
|
"version":"20130729",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"n7axrpelzl5kjuctt4yoaaf33gvgnik6cx7fjudwhc6hvywdrr4q====",
|
||||||
|
"dependencies":[
|
||||||
|
{
|
||||||
|
"name":"compiler-wrapper",
|
||||||
|
"hash":"qeehcxyvluwnihsc2qxstmpomtxo3lrc",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"hash":"vd7v4ssgnoqdplgxyig3orum67n4vmhq",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build"
|
||||||
|
],
|
||||||
|
"virtuals":[
|
||||||
|
"c",
|
||||||
|
"cxx"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc-runtime",
|
||||||
|
"hash":"izgzpzeljwairalfjm3k6fntbb64nt6n",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"link"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"libelf",
|
||||||
|
"hash":"jr3yipyxyjulcdvckwwwjrrumis7glpa",
|
||||||
|
"parameters":{
|
||||||
|
"deptypes":[
|
||||||
|
"build",
|
||||||
|
"link"
|
||||||
|
],
|
||||||
|
"virtuals":[]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"u5uz3dcch5if4eve4sef67o2rf2lbfgh"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"compiler-wrapper",
|
||||||
|
"version":"1.0",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
|
||||||
|
"platform_os":"debian6",
|
||||||
|
"target":{
|
||||||
|
"name":"m1",
|
||||||
|
"vendor":"Apple",
|
||||||
|
"features":[
|
||||||
|
"aes",
|
||||||
|
"asimd",
|
||||||
|
"asimddp",
|
||||||
|
"asimdfhm",
|
||||||
|
"asimdhp",
|
||||||
|
"asimdrdm",
|
||||||
|
"atomics",
|
||||||
|
"cpuid",
|
||||||
|
"crc32",
|
||||||
|
"dcpodp",
|
||||||
|
"dcpop",
|
||||||
|
"dit",
|
||||||
|
"evtstrm",
|
||||||
|
"fcma",
|
||||||
|
"flagm",
|
||||||
|
"flagm2",
|
||||||
|
"fp",
|
||||||
|
"fphp",
|
||||||
|
"frint",
|
||||||
|
"ilrcpc",
|
||||||
|
"jscvt",
|
||||||
|
"lrcpc",
|
||||||
|
"paca",
|
||||||
|
"pacg",
|
||||||
|
"pmull",
|
||||||
|
"sb",
|
||||||
|
"sha1",
|
||||||
|
"sha2",
|
||||||
|
"sha3",
|
||||||
|
"sha512",
|
||||||
|
"ssbs",
|
||||||
|
"uscat"
|
||||||
|
],
|
||||||
|
"generation":0,
|
||||||
|
"parents":[
|
||||||
|
"armv8.4a"
|
||||||
|
],
|
||||||
|
"cpupart":"0x022"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"namespace":"builtin.mock",
|
||||||
|
"parameters":{
|
||||||
|
"build_system":"generic",
|
||||||
|
"cflags":[],
|
||||||
|
"cppflags":[],
|
||||||
|
"cxxflags":[],
|
||||||
|
"fflags":[],
|
||||||
|
"ldflags":[],
|
||||||
|
"ldlibs":[]
|
||||||
|
},
|
||||||
|
"package_hash":"ss7ybgvqf2fa2lvkf67eavllfxpxthiml2dobtkdq6wn7zkczteq====",
|
||||||
|
"annotations":{
|
||||||
|
"original_specfile_version":5
|
||||||
|
},
|
||||||
|
"hash":"qeehcxyvluwnihsc2qxstmpomtxo3lrc"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name":"gcc",
|
||||||
|
"version":"10.2.1",
|
||||||
|
"arch":{
|
||||||
|
"platform":"test",
        "platform_os": "debian6",
        "target": "aarch64"
      },
      "namespace": "builtin.mock",
      "parameters": {"build_system": "generic", "languages": ["c", "c++", "fortran"], "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
      "external": {"path": "/path", "module": null, "extra_attributes": {"compilers": {"c": "/path/bin/gcc-10", "cxx": "/path/bin/g++-10", "fortran": "/path/bin/gfortran-10"}}},
      "package_hash": "a7d6wvl2mh4od3uue3yxqonc7r7ihw3n3ldedu4kevqa32oy2ysa====",
      "annotations": {"original_specfile_version": 5},
      "hash": "vd7v4ssgnoqdplgxyig3orum67n4vmhq"
    },
    {
      "name": "gcc-runtime",
      "version": "10.2.1",
      "arch": {"platform": "test", "platform_os": "debian6", "target": {"name": "m1", "vendor": "Apple", "features": ["aes", "asimd", "asimddp", "asimdfhm", "asimdhp", "asimdrdm", "atomics", "cpuid", "crc32", "dcpodp", "dcpop", "dit", "evtstrm", "fcma", "flagm", "flagm2", "fp", "fphp", "frint", "ilrcpc", "jscvt", "lrcpc", "paca", "pacg", "pmull", "sb", "sha1", "sha2", "sha3", "sha512", "ssbs", "uscat"], "generation": 0, "parents": ["armv8.4a"], "cpupart": "0x022"}},
      "namespace": "builtin.mock",
      "parameters": {"build_system": "generic", "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
      "package_hash": "up2pdsw5tfvmn5gwgb3opl46la3uxoptkr3udmradd54s7qo72ha====",
      "dependencies": [{"name": "gcc", "hash": "vd7v4ssgnoqdplgxyig3orum67n4vmhq", "parameters": {"deptypes": ["build"], "virtuals": []}}],
      "annotations": {"original_specfile_version": 5},
      "hash": "izgzpzeljwairalfjm3k6fntbb64nt6n"
    },
    {
      "name": "libelf",
      "version": "0.8.13",
      "arch": {"platform": "test", "platform_os": "debian6", "target": {"name": "m1", "vendor": "Apple", "features": ["aes", "asimd", "asimddp", "asimdfhm", "asimdhp", "asimdrdm", "atomics", "cpuid", "crc32", "dcpodp", "dcpop", "dit", "evtstrm", "fcma", "flagm", "flagm2", "fp", "fphp", "frint", "ilrcpc", "jscvt", "lrcpc", "paca", "pacg", "pmull", "sb", "sha1", "sha2", "sha3", "sha512", "ssbs", "uscat"], "generation": 0, "parents": ["armv8.4a"], "cpupart": "0x022"}},
      "namespace": "builtin.mock",
      "parameters": {"build_system": "generic", "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
      "package_hash": "ejr32l7tkp6uhdrlunqv4adkuxqwyac7vbqcjvg6dh72mll4cpiq====",
      "dependencies": [
        {"name": "compiler-wrapper", "hash": "qeehcxyvluwnihsc2qxstmpomtxo3lrc", "parameters": {"deptypes": ["build"], "virtuals": []}},
        {"name": "gcc", "hash": "vd7v4ssgnoqdplgxyig3orum67n4vmhq", "parameters": {"deptypes": ["build"], "virtuals": ["c"]}},
        {"name": "gcc-runtime", "hash": "izgzpzeljwairalfjm3k6fntbb64nt6n", "parameters": {"deptypes": ["link"], "virtuals": []}}
      ],
      "annotations": {"original_specfile_version": 5},
      "hash": "jr3yipyxyjulcdvckwwwjrrumis7glpa"
    }
  ]
},
"buildcache_layout_version": 2,
"binary_cache_checksum": {"hash_algorithm": "sha256", "hash": "2a6f045996e4998ec37679e1aa8a245795fbbffaf9844692ba2de6eeffcbc722"}
}
@ -0,0 +1,298 @@
{
  "spec": {
    "_meta": {"version": 5},
    "nodes": [
      {
        "name": "libelf",
        "version": "0.8.13",
        "arch": {"platform": "test", "platform_os": "debian6", "target": {"name": "m1", "vendor": "Apple", "features": ["aes", "asimd", "asimddp", "asimdfhm", "asimdhp", "asimdrdm", "atomics", "cpuid", "crc32", "dcpodp", "dcpop", "dit", "evtstrm", "fcma", "flagm", "flagm2", "fp", "fphp", "frint", "ilrcpc", "jscvt", "lrcpc", "paca", "pacg", "pmull", "sb", "sha1", "sha2", "sha3", "sha512", "ssbs", "uscat"], "generation": 0, "parents": ["armv8.4a"], "cpupart": "0x022"}},
        "namespace": "builtin.mock",
        "parameters": {"build_system": "generic", "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
        "package_hash": "ejr32l7tkp6uhdrlunqv4adkuxqwyac7vbqcjvg6dh72mll4cpiq====",
        "dependencies": [
          {"name": "compiler-wrapper", "hash": "qeehcxyvluwnihsc2qxstmpomtxo3lrc", "parameters": {"deptypes": ["build"], "virtuals": []}},
          {"name": "gcc", "hash": "vd7v4ssgnoqdplgxyig3orum67n4vmhq", "parameters": {"deptypes": ["build"], "virtuals": ["c"]}},
          {"name": "gcc-runtime", "hash": "izgzpzeljwairalfjm3k6fntbb64nt6n", "parameters": {"deptypes": ["link"], "virtuals": []}}
        ],
        "annotations": {"original_specfile_version": 5},
        "hash": "jr3yipyxyjulcdvckwwwjrrumis7glpa"
      },
      {
        "name": "compiler-wrapper",
        "version": "1.0",
        "arch": {"platform": "test", "platform_os": "debian6", "target": {"name": "m1", "vendor": "Apple", "features": ["aes", "asimd", "asimddp", "asimdfhm", "asimdhp", "asimdrdm", "atomics", "cpuid", "crc32", "dcpodp", "dcpop", "dit", "evtstrm", "fcma", "flagm", "flagm2", "fp", "fphp", "frint", "ilrcpc", "jscvt", "lrcpc", "paca", "pacg", "pmull", "sb", "sha1", "sha2", "sha3", "sha512", "ssbs", "uscat"], "generation": 0, "parents": ["armv8.4a"], "cpupart": "0x022"}},
        "namespace": "builtin.mock",
        "parameters": {"build_system": "generic", "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
        "package_hash": "ss7ybgvqf2fa2lvkf67eavllfxpxthiml2dobtkdq6wn7zkczteq====",
        "annotations": {"original_specfile_version": 5},
        "hash": "qeehcxyvluwnihsc2qxstmpomtxo3lrc"
      },
      {
        "name": "gcc",
        "version": "10.2.1",
        "arch": {"platform": "test", "platform_os": "debian6", "target": "aarch64"},
        "namespace": "builtin.mock",
        "parameters": {"build_system": "generic", "languages": ["c", "c++", "fortran"], "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
        "external": {"path": "/path", "module": null, "extra_attributes": {"compilers": {"c": "/path/bin/gcc-10", "cxx": "/path/bin/g++-10", "fortran": "/path/bin/gfortran-10"}}},
        "package_hash": "a7d6wvl2mh4od3uue3yxqonc7r7ihw3n3ldedu4kevqa32oy2ysa====",
        "annotations": {"original_specfile_version": 5},
        "hash": "vd7v4ssgnoqdplgxyig3orum67n4vmhq"
      },
      {
        "name": "gcc-runtime",
        "version": "10.2.1",
        "arch": {"platform": "test", "platform_os": "debian6", "target": {"name": "m1", "vendor": "Apple", "features": ["aes", "asimd", "asimddp", "asimdfhm", "asimdhp", "asimdrdm", "atomics", "cpuid", "crc32", "dcpodp", "dcpop", "dit", "evtstrm", "fcma", "flagm", "flagm2", "fp", "fphp", "frint", "ilrcpc", "jscvt", "lrcpc", "paca", "pacg", "pmull", "sb", "sha1", "sha2", "sha3", "sha512", "ssbs", "uscat"], "generation": 0, "parents": ["armv8.4a"], "cpupart": "0x022"}},
        "namespace": "builtin.mock",
        "parameters": {"build_system": "generic", "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
        "package_hash": "up2pdsw5tfvmn5gwgb3opl46la3uxoptkr3udmradd54s7qo72ha====",
        "dependencies": [{"name": "gcc", "hash": "vd7v4ssgnoqdplgxyig3orum67n4vmhq", "parameters": {"deptypes": ["build"], "virtuals": []}}],
        "annotations": {"original_specfile_version": 5},
        "hash": "izgzpzeljwairalfjm3k6fntbb64nt6n"
      }
    ]
  },
  "buildcache_layout_version": 2,
  "binary_cache_checksum": {"hash_algorithm": "sha256", "hash": "59141e49bd05abe40639360cd9422020513781270a3461083fee0eba2af62ca0"}
}
@ -0,0 +1,80 @@
{
  "spec": {
    "_meta": {"version": 5},
    "nodes": [
      {
        "name": "compiler-wrapper",
        "version": "1.0",
        "arch": {"platform": "test", "platform_os": "debian6", "target": {"name": "m1", "vendor": "Apple", "features": ["aes", "asimd", "asimddp", "asimdfhm", "asimdhp", "asimdrdm", "atomics", "cpuid", "crc32", "dcpodp", "dcpop", "dit", "evtstrm", "fcma", "flagm", "flagm2", "fp", "fphp", "frint", "ilrcpc", "jscvt", "lrcpc", "paca", "pacg", "pmull", "sb", "sha1", "sha2", "sha3", "sha512", "ssbs", "uscat"], "generation": 0, "parents": ["armv8.4a"], "cpupart": "0x022"}},
        "namespace": "builtin.mock",
        "parameters": {"build_system": "generic", "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
        "package_hash": "ss7ybgvqf2fa2lvkf67eavllfxpxthiml2dobtkdq6wn7zkczteq====",
        "annotations": {"original_specfile_version": 5},
        "hash": "qeehcxyvluwnihsc2qxstmpomtxo3lrc"
      }
    ]
  },
  "buildcache_layout_version": 2,
  "binary_cache_checksum": {"hash_algorithm": "sha256", "hash": "f27ddff0ef4268acbe816c51f6f1fc907dc1010d31f2d6556b699c80f026c47d"}
}
@ -0,0 +1,132 @@
{
  "spec": {
    "_meta": {"version": 5},
    "nodes": [
      {
        "name": "gcc-runtime",
        "version": "10.2.1",
        "arch": {"platform": "test", "platform_os": "debian6", "target": {"name": "m1", "vendor": "Apple", "features": ["aes", "asimd", "asimddp", "asimdfhm", "asimdhp", "asimdrdm", "atomics", "cpuid", "crc32", "dcpodp", "dcpop", "dit", "evtstrm", "fcma", "flagm", "flagm2", "fp", "fphp", "frint", "ilrcpc", "jscvt", "lrcpc", "paca", "pacg", "pmull", "sb", "sha1", "sha2", "sha3", "sha512", "ssbs", "uscat"], "generation": 0, "parents": ["armv8.4a"], "cpupart": "0x022"}},
        "namespace": "builtin.mock",
        "parameters": {"build_system": "generic", "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
        "package_hash": "up2pdsw5tfvmn5gwgb3opl46la3uxoptkr3udmradd54s7qo72ha====",
        "dependencies": [{"name": "gcc", "hash": "vd7v4ssgnoqdplgxyig3orum67n4vmhq", "parameters": {"deptypes": ["build"], "virtuals": []}}],
        "annotations": {"original_specfile_version": 5},
        "hash": "izgzpzeljwairalfjm3k6fntbb64nt6n"
      },
      {
        "name": "gcc",
        "version": "10.2.1",
        "arch": {"platform": "test", "platform_os": "debian6", "target": "aarch64"},
        "namespace": "builtin.mock",
        "parameters": {"build_system": "generic", "languages": ["c", "c++", "fortran"], "cflags": [], "cppflags": [], "cxxflags": [], "fflags": [], "ldflags": [], "ldlibs": []},
        "external": {"path": "/path", "module": null, "extra_attributes": {"compilers": {"c": "/path/bin/gcc-10", "cxx": "/path/bin/g++-10", "fortran": "/path/bin/gfortran-10"}}},
        "package_hash": "a7d6wvl2mh4od3uue3yxqonc7r7ihw3n3ldedu4kevqa32oy2ysa====",
        "annotations": {"original_specfile_version": 5},
        "hash": "vd7v4ssgnoqdplgxyig3orum67n4vmhq"
      }
    ]
  },
  "buildcache_layout_version": 2,
  "binary_cache_checksum": {"hash_algorithm": "sha256", "hash": "348f23717c5641fa6bbb90862e62bf632367511c53e9c6450584f0d000841320"}
}
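The four new fixtures above are spec manifests for buildcache layout version 2: each carries a spec graph (specfile version 5) plus a ``binary_cache_checksum`` for the associated archive. As a reader's sketch only, not the ``url_buildcache.py`` implementation whose diff is suppressed further below, a manifest like these could be sanity-checked roughly as follows; the helper name and the assumption that the checksum is computed over the referenced tarball file are mine:

.. code-block:: python

   import hashlib
   import json


   def check_manifest(manifest_path: str, tarball_path: str) -> None:
       """Minimal sanity check of a v2 spec manifest against its archive (illustrative only)."""
       with open(manifest_path, encoding="utf-8") as f:
           manifest = json.load(f)

       # Fields visible in the fixtures above.
       assert manifest["buildcache_layout_version"] == 2
       assert manifest["spec"]["_meta"]["version"] == 5

       checksum = manifest["binary_cache_checksum"]
       digest = hashlib.new(checksum["hash_algorithm"])  # "sha256" in these fixtures
       with open(tarball_path, "rb") as f:
           for chunk in iter(lambda: f.read(1 << 20), b""):
               digest.update(chunk)
       if digest.hexdigest() != checksum["hash"]:
           raise ValueError(f"checksum mismatch for {tarball_path}")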
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@ -22,7 +22,10 @@
 )
 from spack.environment.list import UndefinedReferenceError

-pytestmark = pytest.mark.not_on_windows("Envs are not supported on windows")
+pytestmark = [
+    pytest.mark.not_on_windows("Envs are not supported on windows"),
+    pytest.mark.usefixtures("mock_packages"),
+]


 class TestDirectoryInitialization:
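The hunk above replaces the single module-level marker with a ``pytestmark`` list that also applies ``pytest.mark.usefixtures("mock_packages")`` to every test in the file, which is why the tests in the following hunks drop ``mock_packages`` from their signatures. A minimal standalone illustration of that pytest pattern, with hypothetical fixture and test names rather than Spack's:

.. code-block:: python

   import pytest


   @pytest.fixture
   def sample_repo():
       # Stand-in for a setup-only fixture such as mock_packages.
       return {"zlib": "1.3"}


   # Applied to every test in this module; tests no longer list the fixture
   # explicitly (and, being use-only, they do not receive its return value).
   pytestmark = pytest.mark.usefixtures("sample_repo")


   def test_repo_is_prepared():
       # sample_repo ran for this test even though it is not a parameter.
       assert True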
@ -35,7 +38,7 @@ def test_environment_dir_from_name(self, mutable_mock_env_path):
             ev.environment_dir_from_name("test", exists_ok=False)


-def test_hash_change_no_rehash_concrete(tmp_path, mock_packages, config):
+def test_hash_change_no_rehash_concrete(tmp_path, config):
     # create an environment
     env_path = tmp_path / "env_dir"
     env_path.mkdir(exist_ok=False)
@ -65,7 +68,7 @@ def test_hash_change_no_rehash_concrete(tmp_path, mock_packages, config):
     assert read_in.specs_by_hash[read_in.concretized_order[0]]._hash == new_hash


-def test_env_change_spec(tmp_path, mock_packages, config):
+def test_env_change_spec(tmp_path, config):
     env_path = tmp_path / "env_dir"
     env_path.mkdir(exist_ok=False)
     env = ev.create_in_dir(env_path)
@ -98,7 +101,7 @@ def test_env_change_spec(tmp_path, mock_packages, config):
 """


-def test_env_change_spec_in_definition(tmp_path, mock_packages, mutable_mock_env_path):
+def test_env_change_spec_in_definition(tmp_path, mutable_mock_env_path):
     manifest_file = tmp_path / ev.manifest_name
     manifest_file.write_text(_test_matrix_yaml)
     e = ev.create("test", manifest_file)
@ -121,7 +124,7 @@ def test_env_change_spec_in_definition(tmp_path, mock_packages, mutable_mock_env
     assert not any(x.intersects("mpileaks@2.1%gcc") for x in e.user_specs)


-def test_env_change_spec_in_matrix_raises_error(tmp_path, mock_packages, mutable_mock_env_path):
+def test_env_change_spec_in_matrix_raises_error(tmp_path, mutable_mock_env_path):
     manifest_file = tmp_path / ev.manifest_name
     manifest_file.write_text(_test_matrix_yaml)
     e = ev.create("test", manifest_file)
@ -202,7 +205,7 @@ def test_environment_cant_modify_environments_root(tmpdir):
   unify: false"""
     ],
 )
-def test_roundtrip_spack_yaml_with_comments(original_content, mock_packages, config, tmp_path):
+def test_roundtrip_spack_yaml_with_comments(original_content, config, tmp_path):
     """Ensure that round-tripping a spack.yaml file doesn't change its content."""
     spack_yaml = tmp_path / "spack.yaml"
     spack_yaml.write_text(original_content)
@ -242,7 +245,7 @@ def test_removing_from_non_existing_list_fails(tmp_path):
         (False, False),
     ],
 )
-def test_update_default_view(init_view, update_value, tmp_path, mock_packages, config):
+def test_update_default_view(init_view, update_value, tmp_path, config):
     """Tests updating the default view with different values."""
     env = ev.create_in_dir(tmp_path, with_view=init_view)
     env.update_default_view(update_value)
@ -291,7 +294,7 @@ def test_update_default_view(init_view, update_value, tmp_path, mock_packages, c
     ],
 )
 def test_update_default_complex_view(
-    initial_content, update_value, expected_view, tmp_path, mock_packages, config
+    initial_content, update_value, expected_view, tmp_path, config
 ):
     spack_yaml = tmp_path / "spack.yaml"
     spack_yaml.write_text(initial_content)
@ -366,7 +369,7 @@ def test_error_on_nonempty_view_dir(tmpdir):
         _error_on_nonempty_view_dir("file")


-def test_can_add_specs_to_environment_without_specs_attribute(tmp_path, mock_packages, config):
+def test_can_add_specs_to_environment_without_specs_attribute(tmp_path, config):
     """Sometimes users have template manifest files, and save one line in the YAML file by
     removing the empty 'specs: []' attribute. This test ensures that adding a spec to an
     environment without the 'specs' attribute, creates the attribute first instead of returning
@ -397,12 +400,12 @@ def test_can_add_specs_to_environment_without_specs_attribute(tmp_path, mock_pac
   # baz
   - zlib
 """,
-            "libpng",
+            "libdwarf",
             """spack:
   specs:
   # baz
   - zlib
-  - libpng
+  - libdwarf
 """,
         )
     ],
@ -572,7 +575,7 @@ def test_environment_config_scheme_used(tmp_path, unify_in_config):
     ],
 )
 def test_conflicts_with_packages_that_are_not_dependencies(
-    spec_str, expected_raise, expected_spec, tmp_path, mock_packages, config
+    spec_str, expected_raise, expected_spec, tmp_path, config
 ):
     """Tests that we cannot concretize two specs together, if one conflicts with the other,
     even though they don't have a dependency relation.
@ -601,9 +604,7 @@ def test_conflicts_with_packages_that_are_not_dependencies(
 @pytest.mark.parametrize(
     "possible_mpi_spec,unify", [("mpich", False), ("mpich", True), ("zmpi", False), ("zmpi", True)]
 )
-def test_requires_on_virtual_and_potential_providers(
-    possible_mpi_spec, unify, tmp_path, mock_packages, config
-):
+def test_requires_on_virtual_and_potential_providers(possible_mpi_spec, unify, tmp_path, config):
     """Tests that in an environment we can add packages explicitly, even though they provide
     a virtual package, and we require the provider of the same virtual to be another package,
     if they are added explicitly by their name.
@ -698,7 +699,7 @@ def test_removing_spec_from_manifest_with_exact_duplicates(


 @pytest.mark.regression("35298")
-def test_variant_propagation_with_unify_false(tmp_path, mock_packages, config):
+def test_variant_propagation_with_unify_false(tmp_path, config):
     """Spack distributes concretizations to different processes, when unify:false is selected and
     the number of roots is 2 or more. When that happens, the specs to be concretized need to be
     properly reconstructed on the worker process, if variant propagation was requested.
@ -722,7 +723,7 @@ def test_variant_propagation_with_unify_false(tmp_path, mock_packages, config):
         assert node.satisfies("+foo")


-def test_env_with_include_defs(mutable_mock_env_path, mock_packages):
+def test_env_with_include_defs(mutable_mock_env_path):
     """Test environment with included definitions file."""
     env_path = mutable_mock_env_path
     env_path.mkdir()
@ -756,7 +757,7 @@ def test_env_with_include_defs(mutable_mock_env_path, mock_packages):
     e.concretize()


-def test_env_with_include_def_missing(mutable_mock_env_path, mock_packages):
+def test_env_with_include_def_missing(mutable_mock_env_path):
     """Test environment with included definitions file that is missing a definition."""
     env_path = mutable_mock_env_path
     env_path.mkdir()
@ -782,7 +783,7 @@ def test_env_with_include_def_missing(mutable_mock_env_path, mock_packages):


 @pytest.mark.regression("41292")
-def test_deconcretize_then_concretize_does_not_error(mutable_mock_env_path, mock_packages):
+def test_deconcretize_then_concretize_does_not_error(mutable_mock_env_path):
     """Tests that, after having deconcretized a spec, we can reconcretize an environment which
     has 2 or more user specs mapping to the same concrete spec.
     """
@ -811,7 +812,7 @@ def test_deconcretize_then_concretize_does_not_error(mutable_mock_env_path, mock


 @pytest.mark.regression("44216")
-def test_root_version_weights_for_old_versions(mutable_mock_env_path, mock_packages):
+def test_root_version_weights_for_old_versions(mutable_mock_env_path):
     """Tests that, when we select two old versions of root specs that have the same version
     optimization penalty, both are considered.
     """
@ -839,7 +840,7 @@ def test_root_version_weights_for_old_versions(mutable_mock_env_packa
     assert gcc.satisfies("@=1.0")


-def test_env_view_on_empty_dir_is_fine(tmp_path, config, mock_packages, temporary_store):
+def test_env_view_on_empty_dir_is_fine(tmp_path, config, temporary_store):
     """Tests that creating a view pointing to an empty dir is not an error."""
     view_dir = tmp_path / "view"
     view_dir.mkdir()
@ -851,7 +852,7 @@ def test_env_view_on_empty_dir_is_fine(tmp_path, config, mock_packages, temporar
     assert view_dir.is_symlink()


-def test_env_view_on_non_empty_dir_errors(tmp_path, config, mock_packages, temporary_store):
+def test_env_view_on_non_empty_dir_errors(tmp_path, config, temporary_store):
     """Tests that creating a view pointing to a non-empty dir errors."""
     view_dir = tmp_path / "view"
     view_dir.mkdir()
@ -868,7 +869,7 @@ def test_env_view_on_non_empty_dir_errors(tmp_path, config, mock_packages, tempo
     "matrix_line", [("^zmpi", "^mpich"), ("~shared", "+shared"), ("shared=False", "+shared-libs")]
 )
 @pytest.mark.regression("40791")
-def test_stack_enforcement_is_strict(tmp_path, matrix_line, config, mock_packages):
+def test_stack_enforcement_is_strict(tmp_path, matrix_line, config):
     """Ensure that constraints in matrices are applied strictly after expansion, to avoid
     inconsistencies between abstract user specs and concrete specs.
     """
@ -894,7 +895,7 @@ def test_stack_enforcement_is_strict(tmp_path, matrix_line, config, mock_package
         e.concretize()


-def test_only_roots_are_explicitly_installed(tmp_path, mock_packages, config, temporary_store):
+def test_only_roots_are_explicitly_installed(tmp_path, config, temporary_store):
     """When installing specific non-root specs from an environment, we continue to mark them
     as implicitly installed. What makes installs explicit is that they are root of the env."""
     env = ev.create_in_dir(tmp_path)
@ -908,7 +909,7 @@ def test_only_roots_are_explicitly_installed(tmp_path, mock_packages, config, te
     assert temporary_store.db.query(explicit=True) == [mpileaks]


-def test_environment_from_name_or_dir(mock_packages, mutable_mock_env_path, tmp_path):
+def test_environment_from_name_or_dir(mutable_mock_env_path, tmp_path):
     test_env = ev.create("test")

     name_env = ev.environment_from_name_or_dir(test_env.name)
@ -923,7 +924,7 @@ def test_environment_from_name_or_dir(mock_packages, mutable_mock_env_path, tmp_
         _ = ev.environment_from_name_or_dir("fake-env")


-def test_env_include_configs(mutable_mock_env_path, mock_packages):
+def test_env_include_configs(mutable_mock_env_path):
     """check config and package values using new include schema"""
     env_path = mutable_mock_env_path
     env_path.mkdir()
@ -970,9 +971,7 @@ def test_env_include_configs(mutable_mock_env_path, mock_packages):
     assert req_specs == set(["@3.11:"])


-def test_using_multiple_compilers_on_a_node_is_discouraged(
-    tmp_path, mutable_config, mock_packages
-):
+def test_using_multiple_compilers_on_a_node_is_discouraged(tmp_path, mutable_config):
     """Tests that when we specify %<compiler> Spack tries to use that compiler for all the
     languages needed by that node.
     """
@ -89,7 +89,7 @@ def mock_patch_stage(tmpdir_factory, monkeypatch):
         (os.path.join(data_path, "foo.patch"), platform_url_sha, None),
     ],
 )
-def test_url_patch(mock_patch_stage, filename, sha256, archive_sha256, config):
+def test_url_patch(mock_packages, mock_patch_stage, filename, sha256, archive_sha256, config):
     # Make a patch object
     url = url_util.path_to_file_url(filename)
     s = spack.concretize.concretize_one("patch")
@ -466,7 +466,7 @@ def test_equality():
     assert patch1 != "not a patch"


-def test_sha256_setter(mock_patch_stage, config):
+def test_sha256_setter(mock_packages, mock_patch_stage, config):
     path = os.path.join(data_path, "foo.patch")
     s = spack.concretize.concretize_one("patch")
     patch = spack.patch.FilePatch(s.package, path, level=1, working_dir=".")
@ -312,7 +312,6 @@ class TestRepoPath:
     def test_creation_from_string(self, mock_test_cache):
         repo = spack.repo.RepoPath(spack.paths.mock_packages_path, cache=mock_test_cache)
         assert len(repo.repos) == 1
-        assert repo.repos[0]._finder is repo
         assert repo.by_namespace["builtin.mock"] is repo.repos[0]

     def test_get_repo(self, mock_test_cache):
@ -55,6 +55,7 @@ def parser_and_speclist():
     return parser, result


+@pytest.mark.usefixtures("mock_packages")
 class TestSpecList:
     @pytest.mark.regression("28749")
     @pytest.mark.parametrize(
@ -83,8 +84,8 @@ class TestSpecList:
             ),
             # A constraint affects both the root and a dependency
             (
-                [{"matrix": [["gromacs"], ["%gcc"], ["+plumed ^plumed%gcc"]]}],
-                ["gromacs+plumed%gcc ^plumed%gcc"],
+                [{"matrix": [["version-test-root"], ["%gcc"], ["^version-test-pkg%gcc"]]}],
+                ["version-test-root%gcc ^version-test-pkg%gcc"],
             ),
         ],
     )
@ -158,7 +159,7 @@ def test_spec_list_recursion_specs_as_constraints(self):
         assert result.specs == DEFAULT_SPECS

     @pytest.mark.regression("16841")
-    def test_spec_list_matrix_exclude(self, mock_packages):
+    def test_spec_list_matrix_exclude(self):
         parser = SpecListParser()
         result = parser.parse_user_specs(
             name="specs",
@ -171,7 +172,7 @@ def test_spec_list_matrix_exclude(self, mock_packages):
         )
         assert len(result.specs) == 1

-    def test_spec_list_exclude_with_abstract_hashes(self, mock_packages, install_mockery):
+    def test_spec_list_exclude_with_abstract_hashes(self, install_mockery):
         # Put mpich in the database so it can be referred to by hash.
         mpich_1 = spack.concretize.concretize_one("mpich+debug")
         mpich_2 = spack.concretize.concretize_one("mpich~debug")
@ -1837,7 +1837,7 @@ def test_abstract_contains_semantic(lhs, rhs, expected, mock_packages):
         (Spec, "mpileaks ^callpath %gcc@5", "mpileaks ^callpath %gcc@5.4", (True, False, True)),
     ],
 )
-def test_intersects_and_satisfies(factory, lhs_str, rhs_str, results):
+def test_intersects_and_satisfies(mock_packages, factory, lhs_str, rhs_str, results):
     lhs = factory(lhs_str)
     rhs = factory(rhs_str)

1239 lib/spack/spack/url_buildcache.py (new file)
File diff suppressed because it is too large.
210 pyproject.toml
@ -1,11 +1,8 @@
 [project]
-name="spack"
-description="The spack package manager"
-requires-python=">=3.6"
-dependencies=[
-    "clingo",
-    "setuptools",
-]
+name = "spack"
+description = "The spack package manager"
+requires-python = ">=3.6"
+dependencies = ["clingo", "setuptools"]
 dynamic = ["version"]

 [project.scripts]
@ -21,16 +18,13 @@ dev = [
     "pytest-xdist",
     "setuptools",
     "click",
-    'black',
+    "black",
     "mypy",
     "isort",
     "flake8",
     "vermin",
 ]
-ci = [
-    "pytest-cov",
-    "codecov[toml]",
-]
+ci = ["pytest-cov", "codecov[toml]"]

 [build-system]
 requires = ["hatchling"]
@ -53,9 +47,7 @@ include = [
 ]

 [tool.hatch.envs.default]
-features = [
-    "dev",
-]
+features = ["dev"]

 [tool.hatch.envs.default.scripts]
 spack = "./bin/spack"
@ -63,10 +55,7 @@ style = "./bin/spack style"
 test = "./bin/spack unit-test"

 [tool.hatch.envs.ci]
-features = [
-    "dev",
-    "ci",
-]
+features = ["dev", "ci"]

 [tool.ruff]
 line-length = 99
@ -83,14 +72,14 @@ ignore = ["E731", "E203"]
 [tool.ruff.lint.isort]
 split-on-trailing-comma = false
 section-order = [
     "future",
     "standard-library",
     "third-party",
     "archspec",
     "llnl",
     "spack",
     "first-party",
     "local-folder",
 ]

 [tool.ruff.lint.isort.sections]
@ -104,8 +93,8 @@ llnl = ["llnl"]

 [tool.black]
 line-length = 99
-include = '(lib/spack|var/spack/repos|var/spack/test_repos)/.*\.pyi?$|bin/spack$'
-extend-exclude = 'lib/spack/external'
+include = "(lib/spack|var/spack/repos|var/spack/test_repos)/.*\\.pyi?$|bin/spack$"
+extend-exclude = "lib/spack/external"
 skip_magic_trailing_comma = true

 [tool.isort]
@ -115,7 +104,9 @@ sections = [
     "FUTURE",
     "STDLIB",
     "THIRDPARTY",
-    "ARCHSPEC", "LLNL", "FIRSTPARTY",
+    "ARCHSPEC",
+    "LLNL",
+    "FIRSTPARTY",
     "LOCALFOLDER",
 ]
 known_first_party = "spack"
@ -129,13 +120,9 @@ honor_noqa = true
 files = [
     "lib/spack/llnl/**/*.py",
     "lib/spack/spack/**/*.py",
-    "var/spack/repos/spack_repo/builtin/packages/*/package.py"
+    "var/spack/repos/spack_repo/builtin/packages/*/package.py",
 ]
-mypy_path = [
-    "lib/spack",
-    "lib/spack/external",
-    "var/spack/repos",
-]
+mypy_path = ["lib/spack", "lib/spack/external", "var/spack/repos"]
 allow_redefinition = true

 # This and a generated import file allows supporting packages
@ -146,68 +133,68 @@ namespace_packages = true
 ignore_errors = true
 ignore_missing_imports = true

 [[tool.mypy.overrides]]
-module = 'spack.*'
+module = "spack.*"
 ignore_errors = false
 ignore_missing_imports = false

 [[tool.mypy.overrides]]
-module = 'spack_repo.*'
+module = "spack_repo.*"
 ignore_errors = false
 ignore_missing_imports = false
 # we can't do this here, not a module scope option, in spack style instead
 # disable_error_code = 'no-redef'

 [[tool.mypy.overrides]]
-module = 'llnl.*'
+module = "llnl.*"
 ignore_errors = false
 ignore_missing_imports = false

 [[tool.mypy.overrides]]
-module = 'spack.test.packages'
+module = "spack.test.packages"
 ignore_errors = true

 # ignore errors in fake import path for packages
 [[tool.mypy.overrides]]
-module = 'spack.pkg.*'
+module = "spack.pkg.*"
 ignore_errors = true
 ignore_missing_imports = true

 # Spack imports a number of external packages, and they *may* require Python 3.8 or
 # higher in recent versions. This can cause mypy to fail because we check for 3.7
 # compatibility. We could restrict mypy to run for the oldest supported version (3.7),
 # but that means most developers won't be able to run mypy, which means it'll fail
 # more in CI. Instead, we exclude these imported packages from mypy checking.
 [[tool.mypy.overrides]]
 module = [
-    'IPython',
-    'altgraph',
-    'attr',
-    'boto3',
-    'botocore',
-    'distro',
-    'importlib.metadata',
-    'jinja2',
-    'jsonschema',
-    'macholib',
-    'markupsafe',
-    'numpy',
-    'pkg_resources',
-    'pyristent',
-    'pytest',
-    'ruamel.yaml',
-    'six',
+    "IPython",
+    "altgraph",
+    "attr",
+    "boto3",
+    "botocore",
+    "distro",
+    "importlib.metadata",
+    "jinja2",
+    "jsonschema",
+    "macholib",
+    "markupsafe",
+    "numpy",
+    "pkg_resources",
+    "pyristent",
+    "pytest",
+    "ruamel.yaml",
+    "six",
 ]
-follow_imports = 'skip'
+follow_imports = "skip"
 follow_imports_for_stubs = true

 [tool.pyright]
 useLibraryCodeForTypes = true
 reportMissingImports = true
 reportWildcardImportFromLibrary = false
-include = ['lib/spack', 'var/spack/repos', 'var/spack/test_repos']
-ignore = ['lib/spack/external']
-extraPaths = ['lib/spack', 'lib/spack/external']
+include = ["lib/spack", "var/spack/repos", "var/spack/test_repos"]
+ignore = ["lib/spack/external"]
+extraPaths = ["lib/spack", "lib/spack/external"]


 [tool.coverage.run]
@ -217,39 +204,39 @@ branch = true
 source = ["bin", "lib"]
 data_file = "./tests-coverage/.coverage"
 omit = [
-    'lib/spack/spack/test/*',
-    'lib/spack/docs/*',
-    'lib/spack/external/*',
-    'share/spack/qa/*',
+    "lib/spack/spack/test/*",
+    "lib/spack/docs/*",
+    "lib/spack/external/*",
+    "share/spack/qa/*",
 ]

 [tool.coverage.report]
 # Regexes for lines to exclude from consideration
 exclude_lines = [
     # Have to re-enable the standard pragma
-    'pragma: no cover',
+    "pragma: no cover",

     # Don't complain about missing debug-only code:
-    'def __repr__',
-    'if self\.debug',
+    "def __repr__",
+    "if self\\.debug",

     # Don't complain if tests don't hit defensive assertion code:
-    'raise AssertionError',
-    'raise NotImplementedError',
+    "raise AssertionError",
+    "raise NotImplementedError",

     # Don't complain if non-runnable code isn't run:
-    'if 0:',
-    'if False:',
-    'if __name__ == .__main__.:',
+    "if 0:",
+    "if False:",
+    "if __name__ == .__main__.:",
 ]
 ignore_errors = true

 [tool.coverage.paths]
 source = [
     ".",
     "/Users/runner/work/spack/spack",
     "/System/Volumes/Data/home/runner/work/spack/spack",
     "D:\\a\\spack\\spack",
 ]

 [tool.coverage.html]
@ -264,8 +251,7 @@ protected-files = ["__init__.py", "README.rst", "vendor.txt"]
 patches-dir = "lib/spack/external/patches"

 [tool.vendoring.transformations]
-substitute = [
-]
+substitute = []
 drop = [
     # contains unnecessary scripts
     "bin/",
@ -278,12 +264,12 @@ drop = [
     "pkg_resources/extern/",
     # trim vendored pygments styles and lexers
     "pygments/styles/[!_]*.py",
-    '^pygments/lexers/(?!python|__init__|_mapping).*\.py$',
+    "^pygments/lexers/(?!python|__init__|_mapping).*\\.py$",
     # trim rich's markdown support
     "rich/markdown.py",
     # ruamel.yaml installs unneded files
     "ruamel.*.pth",
-    "pvectorc.*.so"
+    "pvectorc.*.so",
 ]

 [tool.vendoring.typing-stubs]
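Most of the churn above is quote normalization: TOML literal (single-quoted) strings become basic (double-quoted) strings, so a backslash such as the one in 'if self\.debug' now has to be escaped as "if self\\.debug". A small, purely illustrative check with Python 3.11's tomllib shows that the two spellings decode to the same value:

.. code-block:: python

   import tomllib  # Python 3.11+

   literal = tomllib.loads(r"pattern = 'if self\.debug'")   # old style: literal string
   basic = tomllib.loads(r'pattern = "if self\\.debug"')    # new style: escaped basic string

   # Identical after parsing, so the rewrite does not change any configured regex.
   assert literal["pattern"] == basic["pattern"] == "if self\\.debug"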
@ -42,13 +42,22 @@ ci:
           aud: "${OIDC_TOKEN_AUDIENCE}"

   - signing-job:
-      image: { "name": "ghcr.io/spack/notary:0.0.1", "entrypoint": [""] }
+      image:
+        name: ghcr.io/spack/notary@sha256:d5a183b090602dea5dc89d5023fe777d1e64d9a7ddcb6cc9ec58a79bb410c168
+        entrypoint: [""]
       tags: ["aws"]
       script:
-        - - aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_BUILDCACHE_DESTINATION}/build_cache /tmp
+        - - export BUILD_CACHE="${SPACK_BUILDCACHE_DESTINATION}/${SPACK_BUILDCACHE_RELATIVE_SPECS_URL}"
+          - mkdir -p /tmp/input /tmp/output
+          - aws s3 sync --exclude "*" --include "*spec.manifest.json" ${BUILD_CACHE} /tmp/input
           - /sign.sh
-          - aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_BUILDCACHE_DESTINATION}/build_cache
-          - aws s3 cp /tmp/public_keys ${SPACK_BUILDCACHE_DESTINATION}/build_cache/_pgp --recursive --exclude "*" --include "*.pub"
+          - aws s3 sync --exclude "*" --include "*spec.manifest.json" /tmp/output ${BUILD_CACHE}
+          - |+
+            for keyfile in $( find /tmp/public_keys -type f );
+            do
+              spack gpg trust $keyfile
+            done
+          - spack gpg publish --update-index --mirror-url ${SPACK_BUILDCACHE_DESTINATION}
       id_tokens:
         GITLAB_OIDC_TOKEN:
           aud: "${OIDC_TOKEN_AUDIENCE}"
@ -62,10 +71,14 @@ ci:
        - export SPACK_COPY_ONLY_SOURCE=${SPACK_BUILDCACHE_SOURCE//SPACK_REPLACE_VERSION/${SPACK_REPLACE_VERSION}}
      script:
        - - spack env activate --without-view ${SPACK_CONCRETE_ENV_DIR}
+          # TODO: remove this when we stop getting windows config includes added to the environment
+          - spack config add 'config:build_stage:["$tempdir/$user/spack-stage", "$user_cache_path/stage"]'
+          - spack config blame config
          - echo Copying environment specs from ${SPACK_COPY_ONLY_SOURCE} to ${SPACK_COPY_ONLY_DESTINATION}
          - spack buildcache sync "${SPACK_COPY_ONLY_SOURCE}" "${SPACK_COPY_ONLY_DESTINATION}"
          - curl -fLsS https://spack.github.io/keys/spack-public-binary-key.pub -o /tmp/spack-public-binary-key.pub
-          - aws s3 cp /tmp/spack-public-binary-key.pub "${SPACK_COPY_ONLY_DESTINATION}/build_cache/_pgp/spack-public-binary-key.pub"
+          - spack gpg trust /tmp/spack-public-binary-key.pub
+          - spack gpg publish --mirror-url "${SPACK_COPY_ONLY_DESTINATION}"
          - spack buildcache update-index --keys "${SPACK_COPY_ONLY_DESTINATION}"
      when: "always"
      retry:
@ -9,8 +9,3 @@ mirrors:
     push: ${PR_MIRROR_PUSH_DOMAIN}/${CI_COMMIT_REF_NAME}/${SPACK_CI_STACK_NAME}
     source: False
     binary: True
-  buildcache-shared:
-    fetch: ${PR_MIRROR_FETCH_DOMAIN}/shared_pr_mirror/${SPACK_CI_STACK_NAME}
-    push: ${PR_MIRROR_PUSH_DOMAIN}/shared_pr_mirror/${SPACK_CI_STACK_NAME}
-    source: False
-    binary: True
@ -29,8 +29,9 @@ spack:
     - signing-job:
         before_script:
           # Do not distribute Intel & ARM binaries
-          - - for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/ | grep intel-oneapi | awk '{print $4}' | sed -e 's?^.*build_cache/??g'); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/$i; done
-            - for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/ | grep armpl | awk '{print $4}' | sed -e 's?^.*build_cache/??g'); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/$i; done
+          - - export SPECS_PATH=${SPACK_BUILDCACHE_RELATIVE_SPECS_PATH}
+            - for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/ | grep intel-oneapi | awk '{print $4}' | sed -e "s?^.*${SPECS_PATH}/??g"); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/$i; done
+            - for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/ | grep armpl | awk '{print $4}' | sed -e "s?^.*${SPECS_PATH}/??g"); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/$i; done
   cdash:
     build-group: AWS Packages

@ -56,6 +57,7 @@ spack:
       require:
         - gromacs@2024.3 ^armpl-gcc ^openmpi
         - "%gcc"
+        - target=neoverse_v1
     libfabric:
       buildable: true
       externals:
@ -63,33 +65,42 @@ spack:
           spec: libfabric@1.17.0
       require:
         - fabrics=shm,efa
+        - target=neoverse_v1
     llvm:
       variants: ~lldb
     mpas-model:
       require:
         - precision=single ^parallelio+pnetcdf
         - "%gcc"
+        - target=neoverse_v1
     mpich:
       require:
         - mpich pmi=pmi2 device=ch4 netmod=ofi +slurm
+        - target=neoverse_v1
     nvhpc:
       require:
         - "target=aarch64"
     openfoam:
       require:
         - openfoam ^scotch@6.0.9
+        - target=neoverse_v1
     openmpi:
       variants: ~atomics ~cuda ~cxx ~cxx_exceptions ~internal-hwloc ~java +legacylaunchers ~lustre ~memchecker +pmi +romio ~singularity +vt +wrapper-rpath fabrics=ofi schedulers=slurm
-      require: '@4:'
+      require:
+        - '@4:'
+        - target=neoverse_v1
     # Palace does not build correctly with armpl until https://github.com/awslabs/palace/pull/207 is merged into a version.
     # palace:
     #   require:
     #     - one_of: ["palace cxxflags=\"-include cstdint\" ^fmt@9.1.0"]
     pmix:
-      require: "pmix@3:"
+      require:
+        - "pmix@3:"
+        - target=neoverse_v1
     quantum-espresso:
       require:
         - quantum-espresso@6.6 %gcc ^armpl-gcc
+        - target=neoverse_v1
     slurm:
       buildable: false
       externals:
@ -98,8 +109,10 @@ spack:
       require:
         - "+pmix"
     all:
+      target: ["neoverse_v1"]
       require:
         - "%gcc"
+        - "target=neoverse_v1"
       providers:
         blas: [armpl-gcc, openblas]
         fftw-api: [armpl-gcc, fftw]
@@ -33,8 +33,9 @@ spack:
     - signing-job:
         before_script:
           # Do not distribute Intel & ARM binaries
-          - - for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/ | grep intel-oneapi | awk '{print $4}' | sed -e 's?^.*build_cache/??g'); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/$i; done
-            - for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/ | grep armpl | awk '{print $4}' | sed -e 's?^.*build_cache/??g'); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/$i; done
+          - - export SPECS_PATH=${SPACK_BUILDCACHE_RELATIVE_SPECS_PATH}
+            - for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/ | grep intel-oneapi | awk '{print $4}' | sed -e "s?^.*${SPECS_PATH}/??g"); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/$i; done
+            - for i in $(aws s3 ls --recursive ${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/ | grep armpl | awk '{print $4}' | sed -e "s?^.*${SPECS_PATH}/??g"); do aws s3 rm ${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/$i; done
     cdash:
       build-group: AWS Packages

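
The signing-job change above stops hard-coding the v2 `build_cache/` prefix and instead prunes whatever relative specs path the mirror layout uses, taken from `SPACK_BUILDCACHE_RELATIVE_SPECS_PATH`. Expanded into a more readable script, the new one-liners do roughly the following (a sketch only; it assumes the same CI variables are already set by the pipeline):

    # Remove Intel and ARM binaries from the mirror before signing.
    export SPECS_PATH=${SPACK_BUILDCACHE_RELATIVE_SPECS_PATH}
    for pattern in intel-oneapi armpl; do
      aws s3 ls --recursive "${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/" \
        | grep "$pattern" \
        | awk '{print $4}' \
        | sed -e "s?^.*${SPECS_PATH}/??g" \
        | while read -r key; do
            aws s3 rm "${SPACK_REMOTE_MIRROR_OVERRIDE}/${SPECS_PATH}/$key"
          done
    done
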
@@ -3,6 +3,7 @@ spack:

   packages:
     all:
+      target: ["aarch64"]
       require: target=aarch64

   config:
@@ -3,6 +3,7 @@ spack:

   packages:
     all:
+      target: ["x86_64_v3"]
       require: target=x86_64_v3

   config:
@@ -2,6 +2,7 @@ spack:
   view: false
   packages:
     all:
+      target: ["x86_64_v3"]
       require:
       - target=x86_64_v3
     c:
@@ -2,32 +2,43 @@ spack:
   view: false
   packages:
     all:
-      require: target=x86_64_v3
+      target: ["x86_64_v3"]
+      require:
+      - target=x86_64_v3
     cmake:
       variants: ~ownlibs
     ecp-data-vis-sdk:
       require:
       - "+ascent +adios2 +cinema +darshan +faodel +hdf5 +pnetcdf +sensei +sz +unifyfs +veloc +vtkm +zfp"
+      - target=x86_64_v3
     hdf5:
       require:
       - "@1.14"
+      - target=x86_64_v3
     mesa:
       require:
       - "+glx +osmesa +opengl ~opengles +llvm"
+      - target=x86_64_v3
     libglx:
       require: "mesa +glx"
     ospray:
       require:
       - "@2.8.0"
       - "+denoiser +mpi"
+      - target=x86_64_v3
     llvm:
-      require: ["@14:"]
+      require:
+      - "@14:"
+      - target=x86_64_v3
       # Minimize LLVM
       variants: ~lldb~lld~libomptarget~polly~gold libunwind=none compiler-rt=none
     libllvm:
       require: ["llvm"]
     visit:
-      require: ["@3.4.1"]
+      require:
+      - "@3.4.1"
+      - target=x86_64_v3
+

   concretizer:
     unify: when_possible
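
As elsewhere in this commit, each pinned requirement above (`visit@3.4.1`, `llvm@14:`, and so on) now carries an explicit `target=x86_64_v3` entry. A quick way to confirm that such a constraint is honored is to concretize one of the pinned packages from inside the environment and look for the target in the reported arch; this is a hypothetical spot check with a placeholder environment path, and the output is omitted:

    $ spack -e /path/to/data-vis-sdk-env spec visit | grep -E 'visit@|arch='
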
@@ -44,7 +44,9 @@ spack:
         - netlib-scalapack

     ncurses:
-      require: +termlib ldflags=-Wl,--undefined-version
+      require:
+      - +termlib ldflags=-Wl,--undefined-version
+      - target=x86_64_v3
     tbb:
       require: "intel-tbb"
     binutils:
@@ -78,6 +80,7 @@ spack:
     petsc:
       require:
       - "+batch"
+      - target=x86_64_v3
     trilinos:
       require:
       - one_of: [+amesos +amesos2 +anasazi +aztec +boost +epetra +epetraext +ifpack
@@ -10,12 +10,16 @@ spack:
       require:
       - "%gcc"
       - target=neoverse_v2
+      target: ["neoverse_v2"]
       providers:
         blas: [openblas]
         mpi: [mpich]
       variants: +mpi
     binutils:
       variants: +ld +gold +headers +libiberty ~nls
+    blas:
+      require:
+      - openblas
     hdf5:
       variants: +fortran +hl +shared
     libfabric:
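
The new `blas:` entry above pins the virtual BLAS dependency to openblas for this stack, on top of the `providers:` preference that was already there. Listing the packages that can satisfy a virtual is a convenient cross-check when tightening such requirements; this is an optional inspection step, not part of the diff:

    $ spack providers blas
    $ spack providers mpi
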
@@ -8,6 +8,7 @@ spack:

   packages:
     all:
+      target: ["x86_64_v3"]
       require:
       - "%gcc"
       - target=x86_64_v3
@@ -17,20 +18,34 @@ spack:
         tbb: [intel-tbb]
       variants: +mpi
     acts:
-      require: +analysis +dd4hep +edm4hep +examples +fatras +geant4 +hepmc3 +podio +pythia8 +python +svg +tgeo cxxstd=20
+      require:
+      - +analysis +dd4hep +edm4hep +examples +fatras +geant4 +hepmc3 +podio +pythia8 +python +svg +tgeo cxxstd=20
+      - target=x86_64_v3
     celeritas:
-      require: +geant4 +hepmc3 +root +shared cxxstd=20
+      require:
+      - +geant4 +hepmc3 +root +shared cxxstd=20
+      - target=x86_64_v3
     geant4:
-      require: +opengl +qt +threads +x11
+      require:
+      - +opengl +qt +threads +x11
+      - target=x86_64_v3
     hip:
-      require: '@5.7.1 +rocm'
+      require:
+      - '@5.7.1 +rocm'
+      - target=x86_64_v3
     rivet:
-      require: hepmc=3
+      require:
+      - hepmc=3
+      - target=x86_64_v3
     root:
-      require: +arrow ~daos +davix +dcache +emacs +examples +fftw +fits +fortran +gdml +graphviz +gsl +http +math +minuit +mlp +mysql +opengl +postgres +pythia8 +python +r +roofit +root7 +rpath ~shadow +spectrum +sqlite +ssl +tbb +threads +tmva +tmva-cpu +unuran +vc +vdt +veccore +webgui +x +xml +xrootd # cxxstd=20
+      require:
+      - +arrow ~daos +davix +dcache +emacs +examples +fftw +fits +fortran +gdml +graphviz +gsl +http +math +minuit +mlp +mysql +opengl +postgres +pythia8 +python +r +roofit +root7 +rpath ~shadow +spectrum +sqlite +ssl +tbb +threads +tmva +tmva-cpu +unuran +vc +vdt +veccore +webgui +x +xml +xrootd # cxxstd=20
       # note: root cxxstd=20 not concretizable within sherpa
+      - target=x86_64_v3
     vecgeom:
-      require: +gdml +geant4 +root +shared cxxstd=20
+      require:
+      - +gdml +geant4 +root +shared cxxstd=20
+      - target=x86_64_v3

     # Mark geant4 data as external to prevent wasting bandwidth on GB-scale files
     geant4-data:
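
The inline note above records that `root cxxstd=20` does not concretize when pulled in through sherpa, which is why the C++ standard pin stays commented out in the root requirement. The conflict can be reproduced by asking the concretizer for that combination directly; per the comment this is expected to fail, so treat it as a hypothetical diagnostic rather than a working recipe:

    $ spack spec "sherpa ^root cxxstd=20"
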
@@ -3,6 +3,7 @@ spack:

   packages:
     all:
+      target: ["aarch64"]
       require:
       - target=aarch64
       - +mps
@@ -11,7 +12,9 @@ spack:
     mpi:
       require: mpich
     openblas:
-      require: ~fortran
+      require:
+      - ~fortran
+      - target=aarch64

   specs:
   # Horovod
|
@ -2,6 +2,7 @@ spack:
|
|||||||
view: false
|
view: false
|
||||||
packages:
|
packages:
|
||||||
all:
|
all:
|
||||||
|
target: ["aarch64"]
|
||||||
require:
|
require:
|
||||||
- target=aarch64
|
- target=aarch64
|
||||||
- ~cuda
|
- ~cuda
|
||||||
|
@ -2,6 +2,7 @@ spack:
|
|||||||
view: false
|
view: false
|
||||||
packages:
|
packages:
|
||||||
all:
|
all:
|
||||||
|
target: ["aarch64"]
|
||||||
require:
|
require:
|
||||||
- target=aarch64
|
- target=aarch64
|
||||||
- ~rocm
|
- ~rocm
|
||||||
@@ -9,7 +10,9 @@ spack:
       - cuda_arch=80
     llvm:
       # https://github.com/spack/spack/issues/27999
-      require: ~cuda
+      require:
+      - ~cuda
+      - target=aarch64
     mpi:
       require: openmpi
     py-torch:
@@ -2,6 +2,7 @@ spack:
   view: false
   packages:
     all:
+      target: ["x86_64_v3"]
       require:
       - target=x86_64_v3
       - ~cuda
@@ -2,6 +2,7 @@ spack:
   view: false
   packages:
     all:
+      target: ["x86_64_v3"]
       require:
       - target=x86_64_v3
       - ~rocm
@@ -9,7 +10,9 @@ spack:
       - cuda_arch=80
     llvm:
       # https://github.com/spack/spack/issues/27999
-      require: ~cuda
+      require:
+      - ~cuda
+      - target=x86_64_v3
     mpi:
       require: openmpi
     py-torch:
@@ -2,6 +2,7 @@ spack:
   view: false
   packages:
     all:
+      target: ["x86_64_v3"]
       require:
       - target=x86_64_v3
       - ~cuda
@@ -2,15 +2,16 @@ spack:
   view: false
   packages:
     all:
-      providers:
-        blas:
-        - openblas
-        mkl:
-        - intel-oneapi-mkl
-        mpi:
-        - openmpi
-        - mpich
-      variants: +mpi
+      target: ["target=aarch64"]
+      require:
+      - "target=aarch64"
+      - "+mpi"
+    blas:
+      require:
+      - openblas
+    mpi:
+      require:
+      - openmpi

   definitions:
   - radiuss:
@@ -2,6 +2,7 @@ spack:
   view: false
   packages:
     all:
+      target: [ "x86_64_v3" ]
       require:
       - target=x86_64_v3
       # prefer %gcc@7.5.0 but also allow %gcc@12
@@ -40,9 +40,7 @@ spack -p --lines 20 spec mpileaks%gcc
 $coverage_run $(which spack) bootstrap status --dev --optional

 # Check that we can import Spack packages directly as a first import
-# TODO: this check is disabled, because sys.path is only updated once
-# spack.repo.PATH.get_pkg_class is called.
-# $coverage_run $(which spack) python -c "import spack.pkg.builtin.mpileaks; repr(spack.pkg.builtin.mpileaks.Mpileaks)"
+$coverage_run $(which spack) python -c "from spack_repo.builtin.packages.mpileaks.package import Mpileaks"

 #-----------------------------------------------------------
 # Run unit tests with code coverage
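
The re-enabled check above exercises the new `spack_repo.builtin` package namespace: builtin packages can now be imported directly as the first import, instead of only after `spack.repo.PATH.get_pkg_class` has populated `sys.path`. The same check can be run by hand outside the QA script (the trailing `print` is just an added confirmation):

    $ spack python -c "from spack_repo.builtin.packages.mpileaks.package import Mpileaks; print(Mpileaks)"
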
@@ -563,7 +563,7 @@ _spack_buildcache() {
     then
         SPACK_COMPREPLY="-h --help"
     else
-        SPACK_COMPREPLY="push create install list keys check download save-specfile sync update-index rebuild-index"
+        SPACK_COMPREPLY="push create install list keys check download save-specfile sync update-index rebuild-index migrate"
     fi
 }

@@ -651,6 +651,15 @@ _spack_buildcache_rebuild_index() {
     fi
 }

+_spack_buildcache_migrate() {
+    if $list_options
+    then
+        SPACK_COMPREPLY="-h --help -u --unsigned -d --delete-existing -y --yes-to-all"
+    else
+        _mirrors
+    fi
+}
+
 _spack_cd() {
     if $list_options
     then
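
Together with the bash completion above, this wires up the new `spack buildcache migrate` subcommand: its flags are `-u/--unsigned`, `-d/--delete-existing`, and `-y/--yes-to-all`, and the positional argument completes to a configured mirror name. A representative invocation, inferred from these completion entries with a hypothetical mirror called `mymirror` (check `spack buildcache migrate --help` for the authoritative usage):

    $ spack buildcache migrate --unsigned --delete-existing -y mymirror
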
@@ -697,6 +697,7 @@ complete -c spack -n '__fish_spack_using_command_pos 0 buildcache' -f -a save-sp
 complete -c spack -n '__fish_spack_using_command_pos 0 buildcache' -f -a sync -d 'sync binaries (and associated metadata) from one mirror to another'
 complete -c spack -n '__fish_spack_using_command_pos 0 buildcache' -f -a update-index -d 'update a buildcache index'
 complete -c spack -n '__fish_spack_using_command_pos 0 buildcache' -f -a rebuild-index -d 'update a buildcache index'
+complete -c spack -n '__fish_spack_using_command_pos 0 buildcache' -f -a migrate -d 'perform in-place binary mirror migration (2 to 3)'
 complete -c spack -n '__fish_spack_using_command buildcache' -s h -l help -f -a help
 complete -c spack -n '__fish_spack_using_command buildcache' -s h -l help -d 'show this help message and exit'

@@ -861,6 +862,18 @@ complete -c spack -n '__fish_spack_using_command buildcache rebuild-index' -s h
 complete -c spack -n '__fish_spack_using_command buildcache rebuild-index' -s k -l keys -f -a keys
 complete -c spack -n '__fish_spack_using_command buildcache rebuild-index' -s k -l keys -d 'if provided, key index will be updated as well as package index'

+# spack buildcache migrate
+set -g __fish_spack_optspecs_spack_buildcache_migrate h/help u/unsigned d/delete-existing y/yes-to-all
+
+complete -c spack -n '__fish_spack_using_command buildcache migrate' -s h -l help -f -a help
+complete -c spack -n '__fish_spack_using_command buildcache migrate' -s h -l help -d 'show this help message and exit'
+complete -c spack -n '__fish_spack_using_command buildcache migrate' -s u -l unsigned -f -a unsigned
+complete -c spack -n '__fish_spack_using_command buildcache migrate' -s u -l unsigned -d 'Ignore signatures and do not resign, default is False'
+complete -c spack -n '__fish_spack_using_command buildcache migrate' -s d -l delete-existing -f -a delete_existing
+complete -c spack -n '__fish_spack_using_command buildcache migrate' -s d -l delete-existing -d 'Delete the previous layout, the default is to keep it.'
+complete -c spack -n '__fish_spack_using_command buildcache migrate' -s y -l yes-to-all -f -a yes_to_all
+complete -c spack -n '__fish_spack_using_command buildcache migrate' -s y -l yes-to-all -d 'assume "yes" is the answer to every confirmation request'
+
 # spack cd
 set -g __fish_spack_optspecs_spack_cd h/help m/module-dir r/spack-root i/install-dir p/package-dir P/packages s/stage-dir S/stages c/source-dir b/build-dir e/env= first
 complete -c spack -n '__fish_spack_using_command_pos_remainder 0 cd' -f -k -a '(__fish_spack_specs)'
var/spack/repos/spack_repo/builtin/__init__.py (new file, 3 lines)
@@ -0,0 +1,3 @@
+# Copyright Spack Project Developers. See COPYRIGHT file for details.
+#
+# SPDX-License-Identifier: (Apache-2.0 OR MIT)
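
This new `__init__.py` makes `spack_repo/builtin` a regular Python package, which is what allows the first-import check added to the QA script earlier in this diff. Assuming the repository prefix is on the interpreter path, as that check implies, the package can be imported directly:

    $ spack python -c "import spack_repo.builtin; print(spack_repo.builtin.__file__)"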