Compare commits


7 Commits

Author SHA1 Message Date
Todd Gamblin
4c6564f10a update changelog for v0.18.0 (#30905) 2022-05-28 17:33:52 +02:00
Massimiliano Culpo
82919cb6a5 Remove the warning that Spack prints at each spec (#30872)
Add instead a warning box in the documentation
2022-05-28 16:37:51 +02:00
Greg Becker
844c799299 target optimization: re-norm optimization scale so that 0 is best. (#29926)
Preferred targets are currently the only minimization criteria for Spack for which we allow
negative values. That means Spack may be incentivized to add nodes to the DAG if they
match the preferred target.

This PR re-norms the minimization criteria so that preferred targets are weighted from 0,
and default target weights are offset by the number of preferred targets per-package to
calculate node_target_weight.

Also fixes a bug in the test for preferred targets that was making the test easier to pass
than it should be.
2022-05-27 22:51:03 -07:00
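The re-norming described in this commit can be illustrated with a small standalone sketch (a hypothetical helper with illustrative target names, not Spack's actual ASP encoding): preferred targets get weights starting at 0, and default target weights are offset by the number of preferred targets, so no weight is ever negative.

```python
def node_target_weight(target, preferred, default_order):
    """Weight a candidate target so that 0 is best and nothing is negative.

    preferred: the package's preferred targets, best first.
    default_order: the global target ranking, best first.
    Preferred targets get weights 0..N-1; default weights are offset
    by N = len(preferred).
    """
    if target in preferred:
        return preferred.index(target)
    return len(preferred) + default_order.index(target)

# A preferred target always beats a non-preferred one:
print(node_target_weight('zen2', ['zen2'], ['skylake', 'zen2', 'x86_64']))  # → 0
print(node_target_weight('skylake', ['zen2'], ['skylake', 'zen2', 'x86_64']))  # → 1
```

With every weight non-negative, adding extra nodes to the DAG can never lower the total cost, which removes the incentive the commit message describes.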
Greg Becker
9198ab63ae update tutorial command for v0.18.0 and new gpg key 2022-05-27 18:45:19 -07:00
Todd Gamblin
8f9bc5bba4 Revert "strip -Werror: all specific or none (#30284)"
This reverts commit 330832c22c.

`-Werror` changes were unfortunately causing the `rdma-core` build to fail.
Reverting on `v0.18`; we can fix this in `develop`.
2022-05-26 12:13:40 -07:00
Scott Wittenburg
ca0c968639 ci: Support secure binary signing on protected pipelines (#30753)
This PR supports the creation of securely signed binaries built from spack
develop as well as release branches and tags. Specifically:

- remove internal pr mirror url generation logic in favor of buildcache destination
on command line
    - with a single mirror url specified in the spack.yaml, this makes it clearer where 
    binaries from various pipelines are pushed
- designate some tags as reserved: ['public', 'protected', 'notary']
    - these tags are stripped from all jobs by default and provisioned internally
    based on pipeline type
- update gitlab ci yaml to include pipelines on more protected branches than just
develop (so include releases and tags)
    - binaries from all protected pipelines are pushed into mirrors including the
    branch name so releases, tags, and develop binaries are kept separate
- update rebuild jobs running on protected pipelines to run on special runners
provisioned with an intermediate signing key
    - protected rebuild jobs no longer use "SPACK_SIGNING_KEY" env var to
    obtain signing key (in fact, final signing key is nowhere available to rebuild jobs)
    - these intermediate signatures are verified at the end of each pipeline by a new
    signing job to ensure binaries were produced by a protected pipeline
- optionally schedule a signing/notary job at the end of the pipeline to sign all
packages in the mirror
    - add signing-job-attributes to gitlab-ci section of spack environment to allow
    configuration
    - signing job runs on special runner (separate from protected rebuild runners)
    provisioned with public intermediate key and secret signing key
2022-05-26 09:10:18 -07:00
Gregory Becker
d99a1b1047 release number for v0.18.0 2022-05-25 20:29:03 -07:00
26 changed files with 944 additions and 355 deletions


@@ -1,3 +1,205 @@
# v0.18.0 (2022-05-28)
`v0.18.0` is a major feature release.
## Major features in this release
1. **Concretizer now reuses by default**
`spack install --reuse` was introduced in `v0.17.0`, and `--reuse`
is now the default concretization mode. Spack will try hard to
resolve dependencies using installed packages or binaries (#30396).
To avoid reuse and to use the latest package configurations (the
old default), you can use `spack install --fresh`, or add
configuration like this to your environment or `concretizer.yaml`:
```yaml
concretizer:
reuse: false
```
2. **Finer-grained hashes**
Spack hashes now include `link`, `run`, *and* `build` dependencies,
as well as a canonical hash of package recipes. Previously, hashes
only included `link` and `run` dependencies (though `build`
dependencies were stored by environments). We coarsened the hash to
reduce churn in user installations, but the new default concretizer
behavior mitigates this concern and gets us reuse *and* provenance.
You will be able to see the build dependencies of new installations
with `spack find`. Old installations will not change and their
hashes will not be affected. (#28156, #28504, #30717, #30861)
3. **Improved error messages**
Error handling with the new concretizer is now done with
optimization criteria rather than with unsatisfiable cores, and
Spack reports many more details about conflicting constraints.
(#30669)
4. **Unify environments when possible**
Environments have thus far supported `concretization: together` or
`concretization: separately`. These have been replaced by a new
preference in `concretizer.yaml`:
```yaml
concretizer:
unify: [true|false|when_possible]
```
`concretizer:unify:when_possible` will *try* to resolve a fully
unified environment, but if it cannot, it will create multiple
configurations of some packages where it has to. For large
environments that previously had to be concretized separately, this
can result in a huge speedup (40-50x). (#28941)
5. **Automatically find externals on Cray machines**
Spack can now automatically discover installed packages in the Cray
Programming Environment by running `spack external find` (or `spack
external read-cray-manifest` to *only* query the PE). Packages from
the PE (e.g., `cray-mpich`) are added to the database with full
dependency information, and compilers from the PE are added to
`compilers.yaml`. Available with the June 2022 release of the Cray
Programming Environment. (#24894, #30428)
6. **New binary format and hardened signing**
Spack now has an updated binary format, with improvements for
security. The new format has a detached signature file, and Spack
verifies the signature before untarring or decompressing the binary
package. The previous format embedded the signature in a `tar`
file, which required the client to run `tar` *before* verifying
(#30750). Spack can still install from build caches using the old
format, but we encourage users to switch to the new format going
forward.
Production GitLab pipelines have been hardened to securely sign
binaries. There is now a separate signing stage so that signing
keys are never exposed to build system code, and signing keys are
ephemeral and only live as long as the signing pipeline stage.
(#30753)
7. **Bootstrap mirror generation**
The `spack bootstrap mirror` command can automatically create a
mirror for bootstrapping the concretizer and other needed
dependencies in an air-gapped environment. (#28556)
8. **Nascent Windows support**
Spack now has initial support for Windows. Spack core has been
refactored to run in the Windows environment, and a small number of
packages can now build for Windows. More details are
[in the documentation](https://spack.rtfd.io/en/latest/getting_started.html#spack-on-windows)
(#27021, #28385, many more)
9. **Makefile generation**
`spack env depfile` can be used to generate a `Makefile` from an
environment, which can be used to build the packages in the
environment in parallel on a single node, e.g.:
```console
spack -e myenv env depfile > Makefile
make
```
Spack propagates `gmake` jobserver information to builds so that
their jobs can share cores. (#30039, #30254, #30302, #30526)
10. **New variant features**
In addition to being conditional themselves, variants can now have
[conditional *values*](https://spack.readthedocs.io/en/latest/packaging_guide.html#conditional-possible-values)
that are only possible for certain configurations of a package. (#29530)
Variants can be
[declared "sticky"](https://spack.readthedocs.io/en/latest/packaging_guide.html#sticky-variants),
which prevents them from being enabled or disabled by the
concretizer. Sticky variants must be set explicitly by users
on the command line or in `packages.yaml`. (#28630)
* Allow conditional possible values in variants
* Add a "sticky" property to variants
## Other new features of note
* Environment views can optionally link only `run` dependencies
with `link:run` (#29336)
* `spack external find --all` finds library-only packages in
addition to build dependencies (#28005)
* Customizable `config:license_dir` option (#30135)
* `spack external find --path PATH` takes a custom search path (#30479)
* `spack spec` has a new `--format` argument like `spack find` (#27908)
* `spack concretize --quiet` skips printing concretized specs (#30272)
* `spack info` now has cleaner output and displays test info (#22097)
* Package-level submodule option for git commit versions (#30085, #30037)
* Using `/hash` syntax to refer to concrete specs in an environment
now works even if `/hash` is not installed. (#30276)
## Major internal refactors
* full hash (see above)
* new develop versioning scheme `0.19.0-dev0`
* Allow for multiple dependencies/dependents from the same package (#28673)
* Splice differing virtual packages (#27919)
## Performance Improvements
* Concretization of large environments with `unify: when_possible` is
much faster than concretizing separately (#28941, see above)
* Single-pass view generation algorithm is 2.6x faster (#29443)
## Archspec improvements
* `oneapi` and `dpcpp` flag support (#30783)
* better support for `M1` and `a64fx` (#30683)
## Removals and Deprecations
* Spack no longer supports Python `2.6` (#27256)
* Removed deprecated `--run-tests` option of `spack install`;
use `spack test` (#30461)
* Removed deprecated `spack flake8`; use `spack style` (#27290)
* Deprecate `spack:concretization` config option; use
`concretizer:unify` (#30038)
* Deprecate top-level module configuration; use module sets (#28659)
* `spack activate` and `spack deactivate` are deprecated in favor of
environments; will be removed in `0.19.0` (#29430; see also `link:run`
in #29336 above)
## Notable Bugfixes
* Fix bug that broke locks with many parallel builds (#27846)
* Many bugfixes and consistency improvements for the new concretizer
and `--reuse` (#30357, #30092, #29835, #29933, #28605, #29694, #28848)
## Packages
* `CMakePackage` uses `CMAKE_INSTALL_RPATH_USE_LINK_PATH` (#29703)
* Refactored `lua` support: `lua-lang` virtual supports both
`lua` and `luajit` via new `LuaPackage` build system (#28854)
* PythonPackage: now installs packages with `pip` (#27798)
* Python: improve site_packages_dir handling (#28346)
* Extends: support spec, not just package name (#27754)
* `find_libraries`: search for both .so and .dylib on macOS (#28924)
* Use stable URLs and `?full_index=1` for all github patches (#29239)
## Spack community stats
* 6,416 total packages, 458 new since `v0.17.0`
* 219 new Python packages
* 60 new R packages
* 377 people contributed to this release
* 337 committers to packages
* 85 committers to core
# v0.17.2 (2022-04-13)
### Spack bugfixes
@@ -11,7 +213,7 @@
* Fixed a few bugs affecting the spack ci command (#29518, #29419)
* Fix handling of Intel compiler environment (#29439)
* Fix a few edge cases when reindexing the DB (#28764)
* Remove "Known issues" from documentation (#29664)
* Other miscellaneous bugfixes (0b72e070583fc5bcd016f5adc8a84c99f2b7805f, #28403, #29261)
# v0.17.1 (2021-12-23)


@@ -50,6 +50,13 @@ build cache files for the "ninja" spec:
Note that the targeted spec must already be installed. Once you have a build cache,
you can add it as a mirror, discussed next.
.. warning::
Spack improved the format used for binary caches in v0.18. The entire v0.18 series
will be able to verify and install binary caches both in the new and in the old format.
Support for using the old format is expected to end in v0.19, so we advise users to
recreate relevant buildcaches using Spack v0.18 or higher.
---------------------------------------
Finding or installing build cache files
---------------------------------------

lib/spack/env/cc vendored

@@ -401,8 +401,7 @@ input_command="$*"
# command line and recombine them with Spack arguments later. We
# parse these out so that we can make sure that system paths come
# last, that package arguments come first, and that Spack arguments
# are injected properly. Based on configuration, we also strip -Werror
# arguments.
# are injected properly.
#
# All other arguments, including -l arguments, are treated as
# 'other_args' and left in their original order. This ensures that
@@ -441,29 +440,6 @@ while [ $# -ne 0 ]; do
continue
fi
if [ -n "${SPACK_COMPILER_FLAGS_KEEP}" ] ; then
# NOTE: the eval is required to allow `|` alternatives inside the variable
eval "\
case '$1' in
$SPACK_COMPILER_FLAGS_KEEP)
append other_args_list "$1"
shift
continue
;;
esac
"
fi
if [ -n "${SPACK_COMPILER_FLAGS_REMOVE}" ] ; then
eval "\
case '$1' in
$SPACK_COMPILER_FLAGS_REMOVE)
shift
continue
;;
esac
"
fi
case "$1" in
-isystem*)
arg="${1#-isystem}"
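The block removed above relied on `eval` so that a variable holding `|`-separated glob patterns could expand into `case` alternatives; a minimal standalone sketch of that matching trick (the pattern values are illustrative):

```shell
# eval lets $SPACK_COMPILER_FLAGS_KEEP expand into case alternatives.
SPACK_COMPILER_FLAGS_KEEP='-Werror=*|-Wall'

keep_or_drop() {
    eval "case \"\$1\" in
        $SPACK_COMPILER_FLAGS_KEEP) echo keep ;;
        *) echo drop ;;
    esac"
}

keep_or_drop '-Werror=format-security'   # keep
keep_or_drop '-Werror'                   # drop
```

Without the `eval`, the shell would treat the whole variable as a single literal pattern rather than a list of alternatives.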


@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: (major, minor, micro, dev release) tuple
spack_version_info = (0, 18, 0, 'dev0')
spack_version_info = (0, 18, 0)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
spack_version = '.'.join(str(s) for s in spack_version_info)


@@ -210,7 +210,7 @@ def get_all_built_specs(self):
return spec_list
def find_built_spec(self, spec):
def find_built_spec(self, spec, mirrors_to_check=None):
"""Look in our cache for the built spec corresponding to ``spec``.
If the spec can be found among the configured binary mirrors, a
@@ -225,6 +225,8 @@ def find_built_spec(self, spec):
Args:
spec (spack.spec.Spec): Concrete spec to find
mirrors_to_check: Optional mapping containing mirrors to check. If
None, just assumes all configured mirrors.
Returns:
An list of objects containing the found specs and mirror url where
@@ -240,17 +242,23 @@ def find_built_spec(self, spec):
]
"""
self.regenerate_spec_cache()
return self.find_by_hash(spec.dag_hash())
return self.find_by_hash(spec.dag_hash(), mirrors_to_check=mirrors_to_check)
def find_by_hash(self, find_hash):
def find_by_hash(self, find_hash, mirrors_to_check=None):
"""Same as find_built_spec but uses the hash of a spec.
Args:
find_hash (str): hash of the spec to search
mirrors_to_check: Optional mapping containing mirrors to check. If
None, just assumes all configured mirrors.
"""
if find_hash not in self._mirrors_for_spec:
return None
return self._mirrors_for_spec[find_hash]
results = self._mirrors_for_spec[find_hash]
if not mirrors_to_check:
return results
mirror_urls = mirrors_to_check.values()
return [r for r in results if r['mirror_url'] in mirror_urls]
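The filtering this hunk adds to `find_by_hash` can be sketched standalone (hypothetical function name; `results` entries are dicts with a `mirror_url` key, as in the surrounding code):

```python
def filter_by_mirrors(results, mirrors_to_check=None):
    """Keep only results whose mirror_url is in mirrors_to_check.

    mirrors_to_check is an optional {name: url} mapping; None (or an
    empty mapping) means all configured mirrors are acceptable.
    """
    if not mirrors_to_check:
        return results
    mirror_urls = mirrors_to_check.values()
    return [r for r in results if r['mirror_url'] in mirror_urls]

results = [{'mirror_url': 's3://a'}, {'mirror_url': 's3://b'}]
print(filter_by_mirrors(results, {'override': 's3://a'}))  # → [{'mirror_url': 's3://a'}]
```

This is what lets a protected pipeline ignore binaries already present in the main mirror and rebuild anything missing from the override mirror.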
def update_spec(self, spec, found_list):
"""
@@ -1592,9 +1600,6 @@ def extract_tarball(spec, download_result, allow_root=False, unsigned=False,
# Handle the older buildcache layout where the .spack file
# contains a spec json/yaml, maybe an .asc file (signature),
# and another tarball containing the actual install tree.
tty.warn("This binary package uses a deprecated layout, "
"and support for it will eventually be removed "
"from spack.")
tmpdir = tempfile.mkdtemp()
try:
tarfile_path = _extract_inner_tarball(
@@ -1822,7 +1827,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
tty.debug("No Spack mirrors are currently configured")
return {}
results = binary_index.find_built_spec(spec)
results = binary_index.find_built_spec(spec, mirrors_to_check=mirrors_to_check)
# Maybe we just didn't have the latest information from the mirror, so
# try to fetch directly, unless we are only considering the indices.


@@ -242,17 +242,6 @@ def clean_environment():
# show useful matches.
env.set('LC_ALL', build_lang)
remove_flags = set()
keep_flags = set()
if spack.config.get('config:flags:keep_werror') == 'all':
keep_flags.add('-Werror*')
else:
if spack.config.get('config:flags:keep_werror') == 'specific':
keep_flags.add('-Werror=*')
remove_flags.add('-Werror*')
env.set('SPACK_COMPILER_FLAGS_KEEP', '|'.join(keep_flags))
env.set('SPACK_COMPILER_FLAGS_REMOVE', '|'.join(remove_flags))
# Remove any macports installs from the PATH. The macports ld can
# cause conflicts with the built-in linker on el capitan. Solves
# assembler issues, e.g.:


@@ -33,7 +33,6 @@
import spack.util.executable as exe
import spack.util.gpg as gpg_util
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack.error import SpackError
from spack.spec import Spec
@@ -42,10 +41,8 @@
'always',
]
SPACK_PR_MIRRORS_ROOT_URL = 's3://spack-binaries-prs'
SPACK_SHARED_PR_MIRROR_URL = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
'shared_pr_mirror')
TEMP_STORAGE_MIRROR_NAME = 'ci_temporary_mirror'
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
spack_gpg = spack.main.SpackCommand('gpg')
spack_compiler = spack.main.SpackCommand('compiler')
@@ -199,6 +196,11 @@ def _get_cdash_build_name(spec, build_group):
spec.name, spec.version, spec.compiler, spec.architecture, build_group)
def _remove_reserved_tags(tags):
"""Convenience function to strip reserved tags from jobs"""
return [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
def _get_spec_string(spec):
format_elements = [
'{name}{@version}',
@@ -231,8 +233,10 @@ def _add_dependency(spec_label, dep_label, deps):
deps[spec_label].add(dep_label)
def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False):
spec_deps_obj = _compute_spec_deps(specs, check_index_only=check_index_only)
def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False,
mirrors_to_check=None):
spec_deps_obj = _compute_spec_deps(specs, check_index_only=check_index_only,
mirrors_to_check=mirrors_to_check)
if spec_deps_obj:
dependencies = spec_deps_obj['dependencies']
@@ -249,7 +253,7 @@ def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False):
_add_dependency(entry['spec'], entry['depends'], deps)
def stage_spec_jobs(specs, check_index_only=False):
def stage_spec_jobs(specs, check_index_only=False, mirrors_to_check=None):
"""Take a set of release specs and generate a list of "stages", where the
jobs in any stage are dependent only on jobs in previous stages. This
allows us to maximize build parallelism within the gitlab-ci framework.
@@ -261,6 +265,8 @@ def stage_spec_jobs(specs, check_index_only=False):
are up to date on those mirrors. This flag limits that search to
the binary cache indices on those mirrors to speed the process up,
even though there is no garantee the index is up to date.
mirrors_to_checK: Optional mapping giving mirrors to check instead of
any configured mirrors.
Returns: A tuple of information objects describing the specs, dependencies
and stages:
@@ -297,8 +303,8 @@ def _remove_satisfied_deps(deps, satisfied_list):
deps = {}
spec_labels = {}
_get_spec_dependencies(
specs, deps, spec_labels, check_index_only=check_index_only)
_get_spec_dependencies(specs, deps, spec_labels, check_index_only=check_index_only,
mirrors_to_check=mirrors_to_check)
# Save the original deps, as we need to return them at the end of the
# function. In the while loop below, the "dependencies" variable is
@@ -340,7 +346,7 @@ def _print_staging_summary(spec_labels, dependencies, stages):
_get_spec_string(s)))
def _compute_spec_deps(spec_list, check_index_only=False):
def _compute_spec_deps(spec_list, check_index_only=False, mirrors_to_check=None):
"""
Computes all the dependencies for the spec(s) and generates a JSON
object which provides both a list of unique spec names as well as a
@@ -413,7 +419,7 @@ def append_dep(s, d):
continue
up_to_date_mirrors = bindist.get_mirrors_for_spec(
spec=s, index_only=check_index_only)
spec=s, mirrors_to_check=mirrors_to_check, index_only=check_index_only)
skey = _spec_deps_key(s)
spec_labels[skey] = {
@@ -602,8 +608,8 @@ def get_spec_filter_list(env, affected_pkgs, dependencies=True, dependents=True)
def generate_gitlab_ci_yaml(env, print_summary, output_file,
prune_dag=False, check_index_only=False,
run_optimizer=False, use_dependencies=False,
artifacts_root=None):
artifacts_root=None, remote_mirror_override=None):
""" Generate a gitlab yaml file to run a dynamic chile pipeline from
""" Generate a gitlab yaml file to run a dynamic child pipeline from
the spec matrix in the active environment.
Arguments:
@@ -629,6 +635,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
artifacts_root (str): Path where artifacts like logs, environment
files (spack.yaml, spack.lock), etc should be written. GitLab
requires this to be within the project directory.
remote_mirror_override (str): Typically only needed when one spack.yaml
is used to populate several mirrors with binaries, based on some
criteria. Spack protected pipelines populate different mirrors based
on branch name, facilitated by this option.
"""
with spack.concretize.disable_compiler_existence_check():
with env.write_transaction():
@@ -678,17 +688,19 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
for s in affected_specs:
tty.debug(' {0}'.format(s.name))
generate_job_name = os.environ.get('CI_JOB_NAME', None)
parent_pipeline_id = os.environ.get('CI_PIPELINE_ID', None)
# Downstream jobs will "need" (depend on, for both scheduling and
# artifacts, which include spack.lock file) this pipeline generation
# job by both name and pipeline id. If those environment variables
# do not exist, then maybe this is just running in a shell, in which
# case, there is no expectation gitlab will ever run the generated
# pipeline and those environment variables do not matter.
generate_job_name = os.environ.get('CI_JOB_NAME', 'job-does-not-exist')
parent_pipeline_id = os.environ.get('CI_PIPELINE_ID', 'pipeline-does-not-exist')
# Values: "spack_pull_request", "spack_protected_branch", or not set
spack_pipeline_type = os.environ.get('SPACK_PIPELINE_TYPE', None)
is_pr_pipeline = spack_pipeline_type == 'spack_pull_request'
spack_pr_branch = os.environ.get('SPACK_PR_BRANCH', None)
spack_buildcache_copy = os.environ.get('SPACK_COPY_BUILDCACHE', None)
pr_mirror_url = None
if spack_pr_branch:
pr_mirror_url = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
spack_pr_branch)
if 'mirrors' not in yaml_root or len(yaml_root['mirrors'].values()) < 1:
tty.die('spack ci generate requires an env containing a mirror')
@@ -743,14 +755,25 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'strip-compilers': False,
})
# Add per-PR mirror (and shared PR mirror) if enabled, as some specs might
# be up to date in one of those and thus not need to be rebuilt.
if pr_mirror_url:
spack.mirror.add(
'ci_pr_mirror', pr_mirror_url, cfg.default_modify_scope())
spack.mirror.add('ci_shared_pr_mirror',
SPACK_SHARED_PR_MIRROR_URL,
cfg.default_modify_scope())
# If a remote mirror override (alternate buildcache destination) was
# specified, add it here in case it has already built hashes we might
# generate.
mirrors_to_check = None
if remote_mirror_override:
if spack_pipeline_type == 'spack_protected_branch':
# Overriding the main mirror in this case might result
# in skipping jobs on a release pipeline because specs are
# up to date in develop. Eventually we want to notice and take
# advantage of this by scheduling a job to copy the spec from
# develop to the release, but until we have that, this makes
# sure we schedule a rebuild job if the spec isn't already in
# override mirror.
mirrors_to_check = {
'override': remote_mirror_override
}
else:
spack.mirror.add(
'ci_pr_mirror', remote_mirror_override, cfg.default_modify_scope())
pipeline_artifacts_dir = artifacts_root
if not pipeline_artifacts_dir:
@@ -825,11 +848,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
phase_spec.concretize()
staged_phases[phase_name] = stage_spec_jobs(
concrete_phase_specs,
check_index_only=check_index_only)
check_index_only=check_index_only,
mirrors_to_check=mirrors_to_check)
finally:
# Clean up PR mirror if enabled
if pr_mirror_url:
spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
# Clean up remote mirror override if enabled
if remote_mirror_override:
if spack_pipeline_type != 'spack_protected_branch':
spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
all_job_names = []
output_object = {}
@@ -889,6 +914,14 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
tags = [tag for tag in runner_attribs['tags']]
if spack_pipeline_type is not None:
# For spack pipelines "public" and "protected" are reserved tags
tags = _remove_reserved_tags(tags)
if spack_pipeline_type == 'spack_protected_branch':
tags.extend(['aws', 'protected'])
elif spack_pipeline_type == 'spack_pull_request':
tags.extend(['public'])
variables = {}
if 'variables' in runner_attribs:
variables.update(runner_attribs['variables'])
@@ -1174,6 +1207,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
service_job_config,
cleanup_job)
if 'tags' in cleanup_job:
service_tags = _remove_reserved_tags(cleanup_job['tags'])
cleanup_job['tags'] = service_tags
cleanup_job['stage'] = 'cleanup-temp-storage'
cleanup_job['script'] = [
'spack -d mirror destroy --mirror-url {0}/$CI_PIPELINE_ID'.format(
@@ -1181,9 +1218,74 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
]
cleanup_job['when'] = 'always'
cleanup_job['retry'] = service_job_retries
cleanup_job['interruptible'] = True
output_object['cleanup'] = cleanup_job
if ('signing-job-attributes' in gitlab_ci and
spack_pipeline_type == 'spack_protected_branch'):
# External signing: generate a job to check and sign binary pkgs
stage_names.append('stage-sign-pkgs')
signing_job_config = gitlab_ci['signing-job-attributes']
signing_job = {}
signing_job_attrs_to_copy = [
'image',
'tags',
'variables',
'before_script',
'script',
'after_script',
]
_copy_attributes(signing_job_attrs_to_copy,
signing_job_config,
signing_job)
signing_job_tags = []
if 'tags' in signing_job:
signing_job_tags = _remove_reserved_tags(signing_job['tags'])
for tag in ['aws', 'protected', 'notary']:
if tag not in signing_job_tags:
signing_job_tags.append(tag)
signing_job['tags'] = signing_job_tags
signing_job['stage'] = 'stage-sign-pkgs'
signing_job['when'] = 'always'
signing_job['retry'] = {
'max': 2,
'when': ['always']
}
signing_job['interruptible'] = True
output_object['sign-pkgs'] = signing_job
if spack_buildcache_copy:
# Generate a job to copy the contents from wherever the builds are getting
# pushed to the url specified in the "SPACK_BUILDCACHE_COPY" environment
# variable.
src_url = remote_mirror_override or remote_mirror_url
dest_url = spack_buildcache_copy
stage_names.append('stage-copy-buildcache')
copy_job = {
'stage': 'stage-copy-buildcache',
'tags': ['spack', 'public', 'medium', 'aws', 'x86_64'],
'image': 'ghcr.io/spack/python-aws-bash:0.0.1',
'when': 'on_success',
'interruptible': True,
'retry': service_job_retries,
'script': [
'. ./share/spack/setup-env.sh',
'spack --version',
'aws s3 sync --exclude *index.json* --exclude *pgp* {0} {1}'.format(
src_url, dest_url)
]
}
output_object['copy-mirror'] = copy_job
if rebuild_index_enabled:
# Add a final job to regenerate the index
stage_names.append('stage-rebuild-index')
@@ -1194,9 +1296,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
service_job_config,
final_job)
if 'tags' in final_job:
service_tags = _remove_reserved_tags(final_job['tags'])
final_job['tags'] = service_tags
index_target_mirror = mirror_urls[0]
if remote_mirror_override:
index_target_mirror = remote_mirror_override
final_job['stage'] = 'stage-rebuild-index'
final_job['script'] = [
@@ -1205,6 +1311,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
]
final_job['when'] = 'always'
final_job['retry'] = service_job_retries
final_job['interruptible'] = True
output_object['rebuild-index'] = final_job
@@ -1237,8 +1344,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'SPACK_PIPELINE_TYPE': str(spack_pipeline_type)
}
if remote_mirror_override:
(output_object['variables']
['SPACK_REMOTE_MIRROR_OVERRIDE']) = remote_mirror_override
spack_stack_name = os.environ.get('SPACK_CI_STACK_NAME', None)
if spack_stack_name:


@@ -64,6 +64,11 @@ def setup_parser(subparser):
'--dependencies', action='store_true', default=False,
help="(Experimental) disable DAG scheduling; use "
' "plain" dependencies.')
generate.add_argument(
'--buildcache-destination', default=None,
help="Override the mirror configured in the environment (spack.yaml) " +
"in order to push binaries from the generated pipeline to a " +
"different location.")
prune_group = generate.add_mutually_exclusive_group()
prune_group.add_argument(
'--prune-dag', action='store_true', dest='prune_dag',
@@ -127,6 +132,7 @@ def ci_generate(args):
prune_dag = args.prune_dag
index_only = args.index_only
artifacts_root = args.artifacts_root
buildcache_destination = args.buildcache_destination
if not output_file:
output_file = os.path.abspath(".gitlab-ci.yml")
@@ -140,7 +146,8 @@ def ci_generate(args):
spack_ci.generate_gitlab_ci_yaml(
env, True, output_file, prune_dag=prune_dag,
check_index_only=index_only, run_optimizer=run_optimizer,
use_dependencies=use_dependencies, artifacts_root=artifacts_root,
remote_mirror_override=buildcache_destination)
if copy_yaml_to:
copy_to_dir = os.path.dirname(copy_yaml_to)
@@ -180,6 +187,9 @@ def ci_rebuild(args):
if not gitlab_ci:
tty.die('spack ci rebuild requires an env containing gitlab-ci cfg')
tty.msg('SPACK_BUILDCACHE_DESTINATION={0}'.format(
os.environ.get('SPACK_BUILDCACHE_DESTINATION', None)))
# Grab the environment variables we need. These either come from the
# pipeline generation step ("spack ci generate"), where they were written
# out as variables, or else provided by GitLab itself.
@@ -196,7 +206,7 @@ def ci_rebuild(args):
compiler_action = get_env_var('SPACK_COMPILER_ACTION')
cdash_build_name = get_env_var('SPACK_CDASH_BUILD_NAME')
spack_pipeline_type = get_env_var('SPACK_PIPELINE_TYPE')
remote_mirror_override = get_env_var('SPACK_REMOTE_MIRROR_OVERRIDE')
remote_mirror_url = get_env_var('SPACK_REMOTE_MIRROR_URL')
# Construct absolute paths relative to current $CI_PROJECT_DIR
@@ -244,6 +254,10 @@ def ci_rebuild(args):
tty.debug('Pipeline type - PR: {0}, develop: {1}'.format(
spack_is_pr_pipeline, spack_is_develop_pipeline))
# If no override url exists, then just push binary package to the
# normal remote mirror url.
buildcache_mirror_url = remote_mirror_override or remote_mirror_url
# Figure out what is our temporary storage mirror: Is it artifacts
# buildcache? Or temporary-storage-url-prefix? In some cases we need to
# force something or pipelines might not have a way to propagate build
@@ -373,7 +387,24 @@ def ci_rebuild(args):
cfg.default_modify_scope())
# Check configured mirrors for a built spec with a matching hash
mirrors_to_check = None
if remote_mirror_override and spack_pipeline_type == 'spack_protected_branch':
# Passing "mirrors_to_check" below means we *only* look in the override
# mirror to see if we should skip building, which is what we want.
mirrors_to_check = {
'override': remote_mirror_override
}
# Adding this mirror to the list of configured mirrors means dependencies
# could be installed from either the override mirror or any other configured
# mirror (e.g. remote_mirror_url which is defined in the environment or
# pipeline_mirror_url), which is also what we want.
spack.mirror.add('mirror_override',
remote_mirror_override,
cfg.default_modify_scope())
matches = bindist.get_mirrors_for_spec(
job_spec, mirrors_to_check=mirrors_to_check, index_only=False)
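The logic above has an asymmetry worth making explicit: the rebuild decision consults only the override mirror on protected pipelines, while dependency installs may still use any configured mirror. A minimal sketch of just the selection step, with an illustrative helper name (not part of Spack's API):

```python
# Hypothetical helper mirroring the selection above: decide which mirrors to
# consult when checking whether a spec already has a built binary.
def select_mirrors_to_check(remote_mirror_override, spack_pipeline_type):
    if remote_mirror_override and spack_pipeline_type == 'spack_protected_branch':
        # Only look in the override mirror when deciding whether to skip a build.
        return {'override': remote_mirror_override}
    # None means: check every configured mirror.
    return None


print(select_mirrors_to_check('s3://override', 'spack_protected_branch'))
print(select_mirrors_to_check('s3://override', 'spack_pull_request'))
```

On a protected pipeline this guarantees a spec is rebuilt unless the override mirror itself already has it, even if the main mirror does.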
if matches:
# Got a hash match on at least one configured mirror. All
@@ -517,13 +548,6 @@ def ci_rebuild(args):
# any logs from the staging directory to artifacts now
spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)
# If the install succeeded, create a buildcache entry for this job spec
# and push it to one or more mirrors. If the install did not succeed,
# print out some instructions on how to reproduce this build failure


@@ -24,7 +24,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.18"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")


@@ -105,9 +105,6 @@
'build_stage': '$tempdir/spack-stage',
'concretizer': 'clingo',
'license_dir': spack.paths.default_license_dir,
}
}


@@ -91,16 +91,7 @@
'additional_external_search_paths': {
'type': 'array',
'items': {'type': 'string'}
}
},
'deprecatedProperties': {
'properties': ['module_roots'],


@@ -110,6 +110,7 @@
},
},
'service-job-attributes': runner_selector_schema,
'signing-job-attributes': runner_selector_schema,
'rebuild-index': {'type': 'boolean'},
'broken-specs-url': {'type': 'string'},
},


@@ -716,6 +716,7 @@ def __init__(self, tests=False):
self.variant_values_from_specs = set()
self.version_constraints = set()
self.target_constraints = set()
self.default_targets = {}
self.compiler_version_constraints = set()
self.post_facts = []
@@ -1164,7 +1165,7 @@ def preferred_variants(self, pkg_name):
pkg_name, variant.name, value
))
def target_preferences(self, pkg_name):
key_fn = spack.package_prefs.PackagePrefs(pkg_name, 'target')
if not self.target_specs_cache:
@@ -1175,13 +1176,25 @@ def preferred_targets(self, pkg_name):
target_specs = self.target_specs_cache
preferred_targets = [x for x in target_specs if key_fn(x) < 0]
for i, preferred in enumerate(preferred_targets):
self.gen.fact(fn.package_target_weight(
str(preferred.architecture.target), pkg_name, i
))
# generate weights for non-preferred targets on a per-package basis
default_targets = {
name: weight for
name, weight in self.default_targets.items()
if not any(preferred.architecture.target.name == name
for preferred in preferred_targets)
}
num_preferred = len(preferred_targets)
for name, weight in default_targets.items():
self.gen.fact(fn.default_target_weight(
name, pkg_name, weight + num_preferred + 30
))
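The re-norming above can be summarized in a short standalone sketch: preferred targets get weights 0..N-1, and each remaining default target weight is offset by N + 30 so that all weights are non-negative and preferences always win. The helper name and plain-string targets are illustrative, not Spack's API:

```python
# Sketch of the re-normed target weighting: preferred targets are weighted
# from 0 (best), and default weights are shifted past them by N + 30.
def renorm_target_weights(preferred_targets, default_targets):
    # Per-package weights: position in the preference list (0 is best).
    package_weights = {t: i for i, t in enumerate(preferred_targets)}
    # Offset default weights past all preferred ones, plus a margin of 30.
    offset = len(preferred_targets) + 30
    default_weights = {
        name: weight + offset
        for name, weight in default_targets.items()
        if name not in package_weights  # preferred targets keep their own weight
    }
    return package_weights, default_weights


pkg, dflt = renorm_target_weights(['znver2'], {'znver2': 0, 'x86_64': 100})
print(pkg)   # {'znver2': 0}
print(dflt)  # {'x86_64': 131}
```

Because 0 is now the best possible score, the solver is no longer incentivized to add extra DAG nodes just because they match a preferred target (the bug the old negative -30 weight allowed).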
def flag_defaults(self):
self.gen.h2("Compiler flag defaults")
@@ -1572,7 +1585,7 @@ def target_defaults(self, specs):
compiler.name, compiler.version, uarch.family.name
))
i = 0  # TODO compute per-target offset?
for target in candidate_targets:
self.gen.fact(fn.target(target.name))
self.gen.fact(fn.target_family(target.name, target.family.name))
@@ -1581,11 +1594,13 @@ def target_defaults(self, specs):
# prefer best possible targets; weight others poorly so
# they're not used unless set explicitly
# these are stored to be generated as facts later offset by the
# number of preferred targets
if target.name in best_targets:
self.default_targets[target.name] = i
i += 1
else:
self.default_targets[target.name] = 100
self.gen.newline()
@@ -1862,7 +1877,7 @@ def setup(self, driver, specs, reuse=None):
self.pkg_rules(pkg, tests=self.tests)
self.gen.h2('Package preferences: %s' % pkg)
self.preferred_variants(pkg)
self.target_preferences(pkg)
# Inject dev_path from environment
env = ev.active_environment()


@@ -779,10 +779,13 @@ target_compatible(Descendent, Ancestor)
#defined target_satisfies/2.
#defined target_parent/2.
% If the package does not have any specific weight for this
% target, offset the default weights by the number of specific
% weights and use that. We additionally offset by 30 to ensure
% preferences are propagated even against large numbers of
% otherwise "better" matches.
target_weight(Target, Package, Weight)
:- default_target_weight(Target, Package, Weight),
node(Package),
not derive_target_from_parent(_, Package),
not package_target_weight(Target, Package, _).
@@ -1238,7 +1241,7 @@ opt_criterion(1, "non-preferred targets").
%-----------------
#heuristic version(Package, Version) : version_declared(Package, Version, 0), node(Package). [10, true]
#heuristic version_weight(Package, 0) : version_declared(Package, Version, 0), node(Package). [10, true]
#heuristic node_target(Package, Target) : package_target_weight(Target, Package, 0), node(Package). [10, true]
#heuristic node_target_weight(Package, 0) : node(Package). [10, true]
#heuristic variant_value(Package, Variant, Value) : variant_default_value(Package, Variant, Value), node(Package). [10, true]
#heuristic provider(Package, Virtual) : possible_provider_weight(Package, Virtual, 0, _), virtual_node(Virtual). [10, true]


@@ -12,9 +12,6 @@
import pytest
from spack.paths import build_env_path
from spack.util.environment import set_env, system_dirs
from spack.util.executable import Executable, ProcessError
@@ -132,9 +129,7 @@ def wrapper_environment():
SPACK_TARGET_ARGS="-march=znver2 -mtune=znver2",
SPACK_LINKER_ARG='-Wl,',
SPACK_DTAGS_TO_ADD='--disable-new-dtags',
SPACK_DTAGS_TO_STRIP='--enable-new-dtags'):
yield
@@ -162,21 +157,6 @@ def check_args(cc, args, expected):
assert expected == cc_modified_args
def check_env_var(executable, var, expected):
"""Check environment variables updated by the passed compiler wrapper
@@ -662,63 +642,6 @@ def test_no_ccache_prepend_for_fc(wrapper_environment):
common_compile_args)
@pytest.mark.regression('9160')
def test_disable_new_dtags(wrapper_environment, wrapper_flags):
with set_env(SPACK_TEST_COMMAND='dump-args'):


@@ -635,10 +635,6 @@ def test_ci_generate_for_pr_pipeline(tmpdir, mutable_mock_env_path,
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as f:
@@ -683,10 +679,6 @@ def test_ci_generate_with_external_pkg(tmpdir, mutable_mock_env_path,
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as f:
@@ -920,6 +912,77 @@ def fake_dl_method(spec, *args, **kwargs):
env_cmd('deactivate')
def test_ci_generate_mirror_override(tmpdir, mutable_mock_env_path,
install_mockery_mutable_config, mock_packages,
mock_fetch, mock_stage, mock_binary_index,
ci_base_environment):
"""Ensure that protected pipelines using --buildcache-destination do not
skip building specs that are not in the override mirror when they are
found in the main mirror."""
os.environ.update({
'SPACK_PIPELINE_TYPE': 'spack_protected_branch',
})
working_dir = tmpdir.join('working_dir')
mirror_dir = working_dir.join('mirror')
mirror_url = 'file://{0}'.format(mirror_dir.strpath)
spack_yaml_contents = """
spack:
definitions:
- packages: [patchelf]
specs:
- $packages
mirrors:
test-mirror: {0}
gitlab-ci:
mappings:
- match:
- patchelf
runner-attributes:
tags:
- donotcare
image: donotcare
service-job-attributes:
tags:
- nonbuildtag
image: basicimage
""".format(mirror_url)
filename = str(tmpdir.join('spack.yaml'))
with open(filename, 'w') as f:
f.write(spack_yaml_contents)
with tmpdir.as_cwd():
env_cmd('create', 'test', './spack.yaml')
first_ci_yaml = str(tmpdir.join('.gitlab-ci-1.yml'))
second_ci_yaml = str(tmpdir.join('.gitlab-ci-2.yml'))
with ev.read('test'):
install_cmd()
buildcache_cmd('create', '-u', '--mirror-url', mirror_url, 'patchelf')
buildcache_cmd('update-index', '--mirror-url', mirror_url, output=str)
# This generate should not trigger a rebuild of patchelf, since it's in
# the main mirror referenced in the environment.
ci_cmd('generate', '--check-index-only', '--output-file', first_ci_yaml)
# Because we used a mirror override (--buildcache-destination) on a
# spack protected pipeline, we expect to only look in the override
# mirror for the spec, and thus the patchelf job should be generated in
# this pipeline
ci_cmd('generate', '--check-index-only', '--output-file', second_ci_yaml,
'--buildcache-destination', 'file:///mirror/not/exist')
with open(first_ci_yaml) as fd1:
first_yaml = fd1.read()
assert 'no-specs-to-rebuild' in first_yaml
with open(second_ci_yaml) as fd2:
second_yaml = fd2.read()
assert 'no-specs-to-rebuild' not in second_yaml
@pytest.mark.disable_clean_stage_check
def test_push_mirror_contents(tmpdir, mutable_mock_env_path,
install_mockery_mutable_config, mock_packages,
@@ -1151,10 +1214,6 @@ def test_ci_generate_override_runner_attrs(tmpdir, mutable_mock_env_path,
with ev.read('test'):
monkeypatch.setattr(
spack.main, 'get_version', lambda: '0.15.3-416-12ad69eb1')
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as f:
@@ -1256,10 +1315,6 @@ def test_ci_generate_with_workarounds(tmpdir, mutable_mock_env_path,
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
ci_cmd('generate', '--output-file', outputfile, '--dependencies')
with open(outputfile) as f:
@@ -1417,11 +1472,6 @@ def fake_get_mirrors_for_spec(spec=None, mirrors_to_check=None,
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as of:
@@ -1630,11 +1680,6 @@ def test_ci_generate_temp_storage_url(tmpdir, mutable_mock_env_path,
env_cmd('create', 'test', './spack.yaml')
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
ci_cmd('generate', '--output-file', outputfile)
@@ -1715,6 +1760,64 @@ def test_ci_generate_read_broken_specs_url(tmpdir, mutable_mock_env_path,
assert(ex not in output)
def test_ci_generate_external_signing_job(tmpdir, mutable_mock_env_path,
install_mockery,
mock_packages, monkeypatch,
ci_base_environment):
"""Verify that in external signing mode: 1) each rebuild jobs includes
the location where the binary hash information is written and 2) we
properly generate a final signing job in the pipeline."""
os.environ.update({
'SPACK_PIPELINE_TYPE': 'spack_protected_branch'
})
filename = str(tmpdir.join('spack.yaml'))
with open(filename, 'w') as f:
f.write("""\
spack:
specs:
- archive-files
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
temporary-storage-url-prefix: file:///work/temp/mirror
mappings:
- match:
- archive-files
runner-attributes:
tags:
- donotcare
image: donotcare
signing-job-attributes:
tags:
- nonbuildtag
- secretrunner
image:
name: customdockerimage
entrypoint: []
variables:
IMPORTANT_INFO: avalue
script:
- echo hello
""")
with tmpdir.as_cwd():
env_cmd('create', 'test', './spack.yaml')
outputfile = str(tmpdir.join('.gitlab-ci.yml'))
with ev.read('test'):
ci_cmd('generate', '--output-file', outputfile)
with open(outputfile) as of:
pipeline_doc = syaml.load(of.read())
assert 'sign-pkgs' in pipeline_doc
signing_job = pipeline_doc['sign-pkgs']
assert 'tags' in signing_job
signing_job_tags = signing_job['tags']
for expected_tag in ['notary', 'protected', 'aws']:
assert expected_tag in signing_job_tags
def test_ci_reproduce(tmpdir, mutable_mock_env_path,
install_mockery, mock_packages, monkeypatch,
last_two_git_commits, ci_base_environment, mock_binary_index):


@@ -144,7 +144,7 @@ def test_preferred_target(self, mutable_mock_repo):
update_packages('mpileaks', 'target', [default])
spec = concretize('mpileaks')
assert str(spec['mpileaks'].target) == default
assert str(spec['mpich'].target) == default
def test_preferred_versions(self):


@@ -1,4 +1,4 @@
stages: [ "generate", "build", "publish" ]
default:
image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
@@ -9,16 +9,25 @@ default:
.pr:
only:
- /^pr[\d]+_.*$/
- /^github\/pr[\d]+_.*$/
variables:
SPACK_BUILDCACHE_DESTINATION: "s3://spack-binaries-prs/${CI_COMMIT_REF_NAME}"
SPACK_PIPELINE_TYPE: "spack_pull_request"
SPACK_PRUNE_UNTOUCHED: "True"
.protected-refs:
only:
- /^develop$/
- /^releases\/v.*/
- /^v.*/
- /^github\/develop$/
.protected:
extends: [ ".protected-refs" ]
variables:
SPACK_BUILDCACHE_DESTINATION: "s3://spack-binaries/${CI_COMMIT_REF_NAME}/${SPACK_CI_STACK_NAME}"
SPACK_COPY_BUILDCACHE: "s3://spack-binaries/${CI_COMMIT_REF_NAME}"
SPACK_PIPELINE_TYPE: "spack_protected_branch"
.generate:
@@ -29,12 +38,13 @@ default:
- cd share/spack/gitlab/cloud_pipelines/stacks/${SPACK_CI_STACK_NAME}
- spack env activate --without-view .
- spack ci generate --check-index-only
--buildcache-destination "${SPACK_BUILDCACHE_DESTINATION}"
--artifacts-root "${CI_PROJECT_DIR}/jobs_scratch_dir"
--output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/cloud-ci-pipeline.yml"
artifacts:
paths:
- "${CI_PROJECT_DIR}/jobs_scratch_dir"
tags: ["spack", "aws", "public", "medium", "x86_64"]
interruptible: true
retry:
max: 2
@@ -45,8 +55,8 @@ default:
.pr-generate:
extends: [ ".pr", ".generate" ]
.protected-generate:
extends: [ ".protected", ".generate" ]
.build:
stage: build
@@ -57,12 +67,24 @@ default:
AWS_ACCESS_KEY_ID: ${PR_MIRRORS_AWS_ACCESS_KEY_ID}
AWS_SECRET_ACCESS_KEY: ${PR_MIRRORS_AWS_SECRET_ACCESS_KEY}
.protected-build:
extends: [ ".protected", ".build" ]
variables:
AWS_ACCESS_KEY_ID: ${PROTECTED_MIRRORS_AWS_ACCESS_KEY_ID}
AWS_SECRET_ACCESS_KEY: ${PROTECTED_MIRRORS_AWS_SECRET_ACCESS_KEY}
SPACK_SIGNING_KEY: ${PACKAGE_SIGNING_KEY}
protected-publish:
stage: publish
extends: [ ".protected-refs" ]
image: "ghcr.io/spack/python-aws-bash:0.0.1"
tags: ["spack", "public", "medium", "aws", "x86_64"]
variables:
AWS_ACCESS_KEY_ID: ${PROTECTED_MIRRORS_AWS_ACCESS_KEY_ID}
AWS_SECRET_ACCESS_KEY: ${PROTECTED_MIRRORS_AWS_SECRET_ACCESS_KEY}
script:
- . "./share/spack/setup-env.sh"
- spack --version
- spack buildcache update-index --mirror-url "s3://spack-binaries/${CI_COMMIT_REF_NAME}"
########################################
# TEMPLATE FOR ADDING ANOTHER PIPELINE
@@ -83,8 +105,8 @@ default:
# my-super-cool-stack-pr-generate:
# extends: [ ".my-super-cool-stack", ".pr-generate"]
#
# my-super-cool-stack-protected-generate:
# extends: [ ".my-super-cool-stack", ".protected-generate"]
#
# my-super-cool-stack-pr-build:
# extends: [ ".my-super-cool-stack", ".pr-build" ]
@@ -94,24 +116,62 @@ default:
# job: my-super-cool-stack-pr-generate
# strategy: depend
#
# my-super-cool-stack-protected-build:
# extends: [ ".my-super-cool-stack", ".protected-build" ]
# trigger:
# include:
# - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
# job: my-super-cool-stack-protected-generate
# strategy: depend
########################################
# E4S Mac Stack # E4S Mac Stack
#
# With no near-future plans to have
# protected aws runners running mac
# builds, it seems best to decouple
# them from the rest of the stacks for
# the time being. This way they can
# still run on UO runners and be signed
# using the previous approach.
######################################## ########################################
.e4s-mac: .e4s-mac:
variables: variables:
SPACK_CI_STACK_NAME: e4s-mac SPACK_CI_STACK_NAME: e4s-mac
allow_failure: True allow_failure: True
.mac-pr:
only:
- /^pr[\d]+_.*$/
- /^github\/pr[\d]+_.*$/
variables:
SPACK_BUILDCACHE_DESTINATION: "s3://spack-binaries-prs/${CI_COMMIT_REF_NAME}"
SPACK_PRUNE_UNTOUCHED: "True"
.mac-protected:
only:
- /^develop$/
- /^releases\/v.*/
- /^v.*/
- /^github\/develop$/
variables:
SPACK_BUILDCACHE_DESTINATION: "s3://spack-binaries/${CI_COMMIT_REF_NAME}/${SPACK_CI_STACK_NAME}"
.mac-pr-build:
extends: [ ".mac-pr", ".build" ]
variables:
AWS_ACCESS_KEY_ID: ${PR_MIRRORS_AWS_ACCESS_KEY_ID}
AWS_SECRET_ACCESS_KEY: ${PR_MIRRORS_AWS_SECRET_ACCESS_KEY}
.mac-protected-build:
extends: [ ".mac-protected", ".build" ]
variables:
AWS_ACCESS_KEY_ID: ${PROTECTED_MIRRORS_AWS_ACCESS_KEY_ID}
AWS_SECRET_ACCESS_KEY: ${PROTECTED_MIRRORS_AWS_SECRET_ACCESS_KEY}
SPACK_SIGNING_KEY: ${PACKAGE_SIGNING_KEY}
e4s-mac-pr-generate: e4s-mac-pr-generate:
extends: [".e4s-mac", ".pr"] extends: [".e4s-mac", ".mac-pr"]
stage: generate stage: generate
script: script:
- tmp="$(mktemp -d)"; export SPACK_USER_CONFIG_PATH="$tmp"; export SPACK_USER_CACHE_PATH="$tmp" - tmp="$(mktemp -d)"; export SPACK_USER_CONFIG_PATH="$tmp"; export SPACK_USER_CACHE_PATH="$tmp"
@@ -135,8 +195,8 @@ e4s-mac-pr-generate:
- stuck_or_timeout_failure - stuck_or_timeout_failure
timeout: 60 minutes timeout: 60 minutes
e4s-mac-develop-generate: e4s-mac-protected-generate:
extends: [".e4s-mac", ".develop"] extends: [".e4s-mac", ".mac-protected"]
stage: generate stage: generate
script: script:
- tmp="$(mktemp -d)"; export SPACK_USER_CONFIG_PATH="$tmp"; export SPACK_USER_CACHE_PATH="$tmp" - tmp="$(mktemp -d)"; export SPACK_USER_CONFIG_PATH="$tmp"; export SPACK_USER_CACHE_PATH="$tmp"
@@ -161,7 +221,7 @@ e4s-mac-develop-generate:
timeout: 60 minutes timeout: 60 minutes
e4s-mac-pr-build: e4s-mac-pr-build:
extends: [ ".e4s-mac", ".pr-build" ] extends: [ ".e4s-mac", ".mac-pr-build" ]
trigger: trigger:
include: include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
@@ -171,16 +231,16 @@ e4s-mac-pr-build:
- artifacts: True - artifacts: True
job: e4s-mac-pr-generate job: e4s-mac-pr-generate
e4s-mac-develop-build: e4s-mac-protected-build:
extends: [ ".e4s-mac", ".develop-build" ] extends: [ ".e4s-mac", ".mac-protected-build" ]
trigger: trigger:
include: include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: e4s-mac-develop-generate job: e4s-mac-protected-generate
strategy: depend strategy: depend
needs: needs:
- artifacts: True - artifacts: True
job: e4s-mac-develop-generate job: e4s-mac-protected-generate
######################################## ########################################
# E4S pipeline # E4S pipeline
@@ -192,8 +252,8 @@ e4s-mac-develop-build:
e4s-pr-generate: e4s-pr-generate:
extends: [ ".e4s", ".pr-generate"] extends: [ ".e4s", ".pr-generate"]
e4s-develop-generate: e4s-protected-generate:
extends: [ ".e4s", ".develop-generate"] extends: [ ".e4s", ".protected-generate"]
e4s-pr-build: e4s-pr-build:
extends: [ ".e4s", ".pr-build" ] extends: [ ".e4s", ".pr-build" ]
@@ -206,16 +266,16 @@ e4s-pr-build:
- artifacts: True - artifacts: True
job: e4s-pr-generate job: e4s-pr-generate
e4s-develop-build: e4s-protected-build:
extends: [ ".e4s", ".develop-build" ] extends: [ ".e4s", ".protected-build" ]
trigger: trigger:
include: include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: e4s-develop-generate job: e4s-protected-generate
strategy: depend strategy: depend
needs: needs:
- artifacts: True - artifacts: True
job: e4s-develop-generate job: e4s-protected-generate
######################################## ########################################
# E4S on Power # E4S on Power
@@ -231,8 +291,8 @@ e4s-develop-build:
# e4s-on-power-pr-generate: # e4s-on-power-pr-generate:
# extends: [ ".e4s-on-power", ".pr-generate", ".power-e4s-generate-tags-and-image"] # extends: [ ".e4s-on-power", ".pr-generate", ".power-e4s-generate-tags-and-image"]
# e4s-on-power-develop-generate: # e4s-on-power-protected-generate:
# extends: [ ".e4s-on-power", ".develop-generate", ".power-e4s-generate-tags-and-image"] # extends: [ ".e4s-on-power", ".protected-generate", ".power-e4s-generate-tags-and-image"]
# e4s-on-power-pr-build: # e4s-on-power-pr-build:
# extends: [ ".e4s-on-power", ".pr-build" ] # extends: [ ".e4s-on-power", ".pr-build" ]
@@ -245,16 +305,16 @@ e4s-develop-build:
# - artifacts: True # - artifacts: True
# job: e4s-on-power-pr-generate # job: e4s-on-power-pr-generate
# e4s-on-power-develop-build: # e4s-on-power-protected-build:
# extends: [ ".e4s-on-power", ".develop-build" ] # extends: [ ".e4s-on-power", ".protected-build" ]
# trigger: # trigger:
# include: # include:
# - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml # - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
# job: e4s-on-power-develop-generate # job: e4s-on-power-protected-generate
# strategy: depend # strategy: depend
# needs: # needs:
# - artifacts: True # - artifacts: True
# job: e4s-on-power-develop-generate # job: e4s-on-power-protected-generate
######################################### #########################################
# Build tests for different build-systems # Build tests for different build-systems
@@ -266,8 +326,8 @@ e4s-develop-build:
build_systems-pr-generate: build_systems-pr-generate:
extends: [ ".build_systems", ".pr-generate"] extends: [ ".build_systems", ".pr-generate"]
build_systems-develop-generate: build_systems-protected-generate:
extends: [ ".build_systems", ".develop-generate"] extends: [ ".build_systems", ".protected-generate"]
build_systems-pr-build: build_systems-pr-build:
extends: [ ".build_systems", ".pr-build" ] extends: [ ".build_systems", ".pr-build" ]
@@ -280,16 +340,16 @@ build_systems-pr-build:
- artifacts: True - artifacts: True
job: build_systems-pr-generate job: build_systems-pr-generate
build_systems-develop-build: build_systems-protected-build:
extends: [ ".build_systems", ".develop-build" ] extends: [ ".build_systems", ".protected-build" ]
trigger: trigger:
include: include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: build_systems-develop-generate job: build_systems-protected-generate
strategy: depend strategy: depend
needs: needs:
- artifacts: True - artifacts: True
job: build_systems-develop-generate job: build_systems-protected-generate
######################################### #########################################
# RADIUSS # RADIUSS
@@ -313,20 +373,20 @@ radiuss-pr-build:
- artifacts: True - artifacts: True
job: radiuss-pr-generate job: radiuss-pr-generate
# --------- Develop --------- # --------- Protected ---------
radiuss-develop-generate: radiuss-protected-generate:
extends: [ ".radiuss", ".develop-generate" ] extends: [ ".radiuss", ".protected-generate" ]
radiuss-develop-build: radiuss-protected-build:
extends: [ ".radiuss", ".develop-build" ] extends: [ ".radiuss", ".protected-build" ]
trigger: trigger:
include: include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: radiuss-develop-generate job: radiuss-protected-generate
strategy: depend strategy: depend
needs: needs:
- artifacts: True - artifacts: True
job: radiuss-develop-generate job: radiuss-protected-generate
######################################## ########################################
# ECP Data & Vis SDK # ECP Data & Vis SDK
@@ -338,8 +398,8 @@ radiuss-develop-build:
data-vis-sdk-pr-generate: data-vis-sdk-pr-generate:
extends: [ ".data-vis-sdk", ".pr-generate"] extends: [ ".data-vis-sdk", ".pr-generate"]
data-vis-sdk-develop-generate: data-vis-sdk-protected-generate:
extends: [ ".data-vis-sdk", ".develop-generate"] extends: [ ".data-vis-sdk", ".protected-generate"]
data-vis-sdk-pr-build: data-vis-sdk-pr-build:
extends: [ ".data-vis-sdk", ".pr-build" ] extends: [ ".data-vis-sdk", ".pr-build" ]
@@ -352,16 +412,16 @@ data-vis-sdk-pr-build:
- artifacts: True - artifacts: True
job: data-vis-sdk-pr-generate job: data-vis-sdk-pr-generate
data-vis-sdk-develop-build: data-vis-sdk-protected-build:
extends: [ ".data-vis-sdk", ".develop-build" ] extends: [ ".data-vis-sdk", ".protected-build" ]
trigger: trigger:
include: include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: data-vis-sdk-develop-generate job: data-vis-sdk-protected-generate
strategy: depend strategy: depend
needs: needs:
- artifacts: True - artifacts: True
job: data-vis-sdk-develop-generate job: data-vis-sdk-protected-generate
######################################## ########################################
# Spack Tutorial # Spack Tutorial
@@ -373,8 +433,8 @@ data-vis-sdk-develop-build:
tutorial-pr-generate: tutorial-pr-generate:
extends: [ ".tutorial", ".pr-generate"] extends: [ ".tutorial", ".pr-generate"]
tutorial-develop-generate: tutorial-protected-generate:
extends: [ ".tutorial", ".develop-generate"] extends: [ ".tutorial", ".protected-generate"]
tutorial-pr-build: tutorial-pr-build:
extends: [ ".tutorial", ".pr-build" ] extends: [ ".tutorial", ".pr-build" ]
@@ -387,13 +447,13 @@ tutorial-pr-build:
- artifacts: True - artifacts: True
job: tutorial-pr-generate job: tutorial-pr-generate
tutorial-develop-build: tutorial-protected-build:
extends: [ ".tutorial", ".develop-build" ] extends: [ ".tutorial", ".protected-build" ]
trigger: trigger:
include: include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: tutorial-develop-generate job: tutorial-protected-generate
strategy: depend strategy: depend
needs: needs:
- artifacts: True - artifacts: True
job: tutorial-develop-generate job: tutorial-protected-generate

View File

@@ -29,7 +29,7 @@ spack:
 - - $default_specs
 - - $arch
-mirrors: { "mirror": "s3://spack-binaries/build_systems" }
+mirrors: { "mirror": "s3://spack-binaries/develop/build_systems" }
 gitlab-ci:
 script:
@@ -38,6 +38,8 @@ spack:
 - cd ${SPACK_CONCRETE_ENV_DIR}
 - spack env activate --without-view .
 - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
+- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
+- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
 - spack -d ci rebuild
 image:
@@ -48,7 +50,7 @@ spack:
 - match:
 - cmake
 runner-attributes:
-tags: [ "spack", "public", "large", "x86_64"]
+tags: [ "spack", "large", "x86_64"]
 variables:
 CI_JOB_SIZE: large
 KUBERNETES_CPU_REQUEST: 8000m
@@ -61,7 +63,7 @@ spack:
 - openjpeg
 - sqlite
 runner-attributes:
-tags: [ "spack", "public", "medium", "x86_64" ]
+tags: [ "spack", "medium", "x86_64" ]
 variables:
 CI_JOB_SIZE: "medium"
 KUBERNETES_CPU_REQUEST: "2000m"
@@ -85,7 +87,7 @@ spack:
 - xz
 - zlib
 runner-attributes:
-tags: [ "spack", "public", "medium", "x86_64" ]
+tags: [ "spack", "medium", "x86_64" ]
 variables:
 CI_JOB_SIZE: "small"
 KUBERNETES_CPU_REQUEST: "500m"
@@ -94,18 +96,27 @@ spack:
 - match:
 - 'os=ubuntu18.04'
 runner-attributes:
-tags: ["spack", "public", "x86_64"]
+tags: ["spack", "x86_64"]
 variables:
 CI_JOB_SIZE: "default"
-broken-specs-url: "s3://spack-binaries-develop/broken-specs"
+broken-specs-url: "s3://spack-binaries/broken-specs"
 service-job-attributes:
 before_script:
 - . "./share/spack/setup-env.sh"
 - spack --version
 image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
 tags: ["spack", "public", "x86_64"]
+signing-job-attributes:
+image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
+tags: ["spack", "aws"]
+script:
+- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
+- /sign.sh
+- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
 cdash:
 build-group: Build tests for different build systems
 url: https://cdash.spack.io

View File

@@ -42,7 +42,7 @@ spack:
 +zfp
 +visit
-mirrors: { "mirror": "s3://spack-binaries/data-vis-sdk" }
+mirrors: { "mirror": "s3://spack-binaries/develop/data-vis-sdk" }
 gitlab-ci:
 image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
@@ -52,13 +52,15 @@ spack:
 - cd ${SPACK_CONCRETE_ENV_DIR}
 - spack env activate --without-view .
 - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
+- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
+- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
 - spack -d ci rebuild
 mappings:
 - match:
 - llvm
 - qt
 runner-attributes:
-tags: [ "spack", "public", "huge", "x86_64" ]
+tags: [ "spack", "huge", "x86_64" ]
 variables:
 CI_JOB_SIZE: huge
 KUBERNETES_CPU_REQUEST: 11000m
@@ -72,7 +74,7 @@ spack:
 - visit
 - vtk-m
 runner-attributes:
-tags: [ "spack", "public", "large", "x86_64" ]
+tags: [ "spack", "large", "x86_64" ]
 variables:
 CI_JOB_SIZE: large
 KUBERNETES_CPU_REQUEST: 8000m
@@ -98,7 +100,7 @@ spack:
 - raja
 - vtk-h
 runner-attributes:
-tags: [ "spack", "public", "medium", "x86_64" ]
+tags: [ "spack", "medium", "x86_64" ]
 variables:
 CI_JOB_SIZE: "medium"
 KUBERNETES_CPU_REQUEST: "2000m"
@@ -133,7 +135,7 @@ spack:
 - util-linux-uuid
 runner-attributes:
-tags: [ "spack", "public", "small", "x86_64" ]
+tags: [ "spack", "small", "x86_64" ]
 variables:
 CI_JOB_SIZE: "small"
 KUBERNETES_CPU_REQUEST: "500m"
@@ -141,11 +143,12 @@ spack:
 - match: ['@:']
 runner-attributes:
-tags: ["spack", "public", "x86_64"]
+tags: ["spack", "x86_64"]
 variables:
 CI_JOB_SIZE: "default"
-broken-specs-url: "s3://spack-binaries-develop/broken-specs"
+broken-specs-url: "s3://spack-binaries/broken-specs"
 service-job-attributes:
 image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
 before_script:
@@ -153,6 +156,14 @@ spack:
 - spack --version
 tags: ["spack", "public", "medium", "x86_64"]
+signing-job-attributes:
+image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
+tags: ["spack", "aws"]
+script:
+- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
+- /sign.sh
+- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
 cdash:
 build-group: Data and Vis SDK
 url: https://cdash.spack.io

View File

@@ -32,7 +32,7 @@ spack:
 - - $easy_specs
 - - $arch
-mirrors: { "mirror": "s3://spack-binaries/e4s-mac" }
+mirrors: { "mirror": "s3://spack-binaries/develop/e4s-mac" }
 gitlab-ci:
@@ -51,7 +51,9 @@ spack:
 runner-attributes:
 tags:
 - omicron
-broken-specs-url: "s3://spack-binaries-develop/broken-specs"
+broken-specs-url: "s3://spack-binaries/broken-specs"
 service-job-attributes:
 before_script:
 - . "./share/spack/setup-env.sh"

View File

@@ -222,7 +222,7 @@ spack:
 - - $cuda_specs
 - - $arch
-mirrors: { "mirror": "s3://spack-binaries/e4s" }
+mirrors: { "mirror": "s3://spack-binaries/develop/e4s" }
 gitlab-ci:
@@ -233,6 +233,8 @@ spack:
 - spack env activate --without-view .
 - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
 - mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
+- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
+- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
 - spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
 image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
@@ -240,7 +242,7 @@ spack:
 - match:
 - llvm
 runner-attributes:
-tags: [ "spack", "public", "huge", "x86_64" ]
+tags: [ "spack", "huge", "x86_64" ]
 variables:
 CI_JOB_SIZE: huge
 KUBERNETES_CPU_REQUEST: 11000m
@@ -265,7 +267,7 @@ spack:
 - vtk-m
 - warpx
 runner-attributes:
-tags: [ "spack", "public", "large", "x86_64" ]
+tags: [ "spack", "large", "x86_64" ]
 variables:
 CI_JOB_SIZE: large
 KUBERNETES_CPU_REQUEST: 8000m
@@ -333,7 +335,7 @@ spack:
 - vtk-h
 - zfp
 runner-attributes:
-tags: [ "spack", "public", "medium", "x86_64" ]
+tags: [ "spack", "medium", "x86_64" ]
 variables:
 CI_JOB_SIZE: "medium"
 KUBERNETES_CPU_REQUEST: "2000m"
@@ -394,7 +396,7 @@ spack:
 - zlib
 - zstd
 runner-attributes:
-tags: [ "spack", "public", "small", "x86_64" ]
+tags: [ "spack", "small", "x86_64" ]
 variables:
 CI_JOB_SIZE: "small"
 KUBERNETES_CPU_REQUEST: "500m"
@@ -402,11 +404,12 @@ spack:
 - match: ['os=ubuntu18.04']
 runner-attributes:
-tags: ["spack", "public", "x86_64"]
+tags: ["spack", "x86_64"]
 variables:
 CI_JOB_SIZE: "default"
-broken-specs-url: "s3://spack-binaries-develop/broken-specs"
+broken-specs-url: "s3://spack-binaries/broken-specs"
 service-job-attributes:
 before_script:
 - . "./share/spack/setup-env.sh"
@@ -414,6 +417,14 @@ spack:
 image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
 tags: ["spack", "public", "x86_64"]
+signing-job-attributes:
+image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
+tags: ["spack", "aws"]
+script:
+- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
+- /sign.sh
+- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
 cdash:
 build-group: New PR testing workflow
 url: https://cdash.spack.io

View File

@@ -54,7 +54,7 @@ spack:
 - zfp
 mirrors:
-mirror: "s3://spack-binaries/radiuss"
+mirror: "s3://spack-binaries/develop/radiuss"
 specs:
 - matrix:
@@ -69,6 +69,8 @@ spack:
 - cd ${SPACK_CONCRETE_ENV_DIR}
 - spack env activate --without-view .
 - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
+- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
+- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
 - spack -d ci rebuild
 mappings:
 - match:
@@ -76,7 +78,7 @@ spack:
 - openblas
 - rust
 runner-attributes:
-tags: ["spack", "public", "large", "x86_64"]
+tags: ["spack", "large", "x86_64"]
 variables:
 CI_JOB_SIZE: large
 KUBERNETES_CPU_REQUEST: 8000m
@@ -96,7 +98,7 @@ spack:
 - vtk-h
 - vtk-m
 runner-attributes:
-tags: ["spack", "public", "medium", "x86_64"]
+tags: ["spack", "medium", "x86_64"]
 variables:
 CI_JOB_SIZE: "medium"
 KUBERNETES_CPU_REQUEST: "2000m"
@@ -150,7 +152,7 @@ spack:
 - zfp
 - zlib
 runner-attributes:
-tags: ["spack", "public", "small", "x86_64"]
+tags: ["spack", "small", "x86_64"]
 variables:
 CI_JOB_SIZE: "small"
 KUBERNETES_CPU_REQUEST: "500m"
@@ -158,10 +160,12 @@ spack:
 - match: ['os=ubuntu18.04']
 runner-attributes:
-tags: ["spack", "public", "x86_64"]
+tags: ["spack", "x86_64"]
 variables:
 CI_JOB_SIZE: "default"
+broken-specs-url: "s3://spack-binaries/broken-specs"
 service-job-attributes:
 before_script:
 - . "./share/spack/setup-env.sh"
@@ -169,6 +173,14 @@ spack:
 image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
 tags: ["spack", "public", "x86_64"]
+signing-job-attributes:
+image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
+tags: ["spack", "aws"]
+script:
+- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
+- /sign.sh
+- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
 cdash:
 build-group: RADIUSS
 url: https://cdash.spack.io

View File

@@ -59,7 +59,7 @@ spack:
 - $gcc_spack_built_packages
 mirrors:
-mirror: 's3://spack-binaries/tutorial'
+mirror: 's3://spack-binaries/develop/tutorial'
 gitlab-ci:
 script:
@@ -69,6 +69,8 @@ spack:
 - cd ${SPACK_CONCRETE_ENV_DIR}
 - spack env activate --without-view .
 - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
+- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
+- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
 - spack -d ci rebuild
 image: { "name": "ghcr.io/spack/tutorial-ubuntu-18.04:v2021-11-02", "entrypoint": [""] }
@@ -81,7 +83,7 @@ spack:
 - netlib-lapack
 - trilinos
 runner-attributes:
-tags: ["spack", "public", "large", "x86_64"]
+tags: ["spack", "large", "x86_64"]
 variables:
 CI_JOB_SIZE: large
 KUBERNETES_CPU_REQUEST: 8000m
@@ -99,7 +101,7 @@ spack:
 - py-scipy
 - slurm
 runner-attributes:
-tags: ["spack", "public", "medium", "x86_64"]
+tags: ["spack", "medium", "x86_64"]
 variables:
 CI_JOB_SIZE: "medium"
 KUBERNETES_CPU_REQUEST: "2000m"
@@ -129,7 +131,7 @@ spack:
 - tar
 - util-linux-uuid
 runner-attributes:
-tags: ["spack", "public", "small", "x86_64"]
+tags: ["spack", "small", "x86_64"]
 variables:
 CI_JOB_SIZE: "small"
 KUBERNETES_CPU_REQUEST: "500m"
@@ -137,11 +139,12 @@ spack:
 - match: ['@:']
 runner-attributes:
-tags: ["spack", "public", "x86_64"]
+tags: ["spack", "x86_64"]
 variables:
 CI_JOB_SIZE: default
-broken-specs-url: "s3://spack-binaries-develop/broken-specs"
+broken-specs-url: "s3://spack-binaries/broken-specs"
 service-job-attributes:
 image: { "name": "ghcr.io/spack/tutorial-ubuntu-18.04:v2021-11-02", "entrypoint": [""] }
 before_script:
@@ -149,6 +152,14 @@ spack:
 - spack --version
 tags: ["spack", "public", "x86_64"]
+signing-job-attributes:
+image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
+tags: ["spack", "aws"]
+script:
+- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
+- /sign.sh
+- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
 cdash:
 build-group: Spack Tutorial
 url: https://cdash.spack.io

View File

@@ -1,37 +1,164 @@
------BEGIN PGP PUBLIC KEY BLOCK-----
-mQENBGATaEABCAC//LQBgVfSPb46EwV+JLwxdfGhbk6EwJFoxiYh5lACqJple+Hg
-8louI+h5KFGxTGflhmVNCx3VfHS2QsPySp+qu3uCsmanQdkHyQkaji4YO5XzJEhI
-IWt6kKfUcttvBSlfXHhtPNvO0PPeRTQypqMlHx4QfJqmuH7jDMZUf3Kg6eLWOuEb
-cW49KeAPfXdGXdq91a7XCNsurvyz8xFxgH/CywXF8pG8LVLB4dfN8b7M+RWxA023
-UZR1f1tYg9eSPRwh7V4E69PcjF7WqvWRw+Uhkes7rUdDR2ZWXhFSn2BvL6p1MdI8
-ZHCFHO6l6QwuYmSIhsCyh21YSwOb/GC0Z8bxABEBAAG0MFNwYWNrIEJ1aWxkIFBp
-cGVsaW5lIChEZW1vIEtleSkgPGtleUBzcGFjay5kZW1vPokBTgQTAQoAOBYhBImQ
-4GrBi1LDHJkUx5Mo0B3n+1WHBQJgE2hAAhsvBQsJCAcCBhUKCQgLAgQWAgMBAh4B
-AheAAAoJEJMo0B3n+1WH7cMH/1Ay1GUB5V9/K+LJGFDLXzCVTqwJbJUB2IeXNpef
-qQfhZWQzOi8qXFCYAIqRlJH3c+rYoQTpR+l7uPS87Q24MLHT/mN6ZP+mI4JLM00T
-CUhs18wN5owNBM7FOs4FTlRmvWhlTCjicCXa0UuH6pB/T35Z/OQVisAUQY82kNu0
-CUkWLfmfNfm9lVOWWMbceJ49sDGWsApYv4ihzzIvnDSS6n5Fg1p8+BEoDbzk2+f5
-+Jr0lNXZmQvTx7kGUnwRfuUxJifB8SNbABWL0En2scaE/QACQXkbaNTPMdI8+l59
-ucmvDDsQHlBRXPGRM1ut+1DHSkdkKqjor3TnLkJDz+rOL+K5AQ0EYBNoQAEIAMin
-weK4wrqZhWOotTD3FS6IUh46Jd57jxd9dWBl5YkjvTwHQJQ54csneY/jWaMSoxhJ
-CEuEnb4P/6P0g5lCVYflkXLrCPLrYPJazW0EtTXQ5YRxFT7ISytsQDNgfSQO6irs
-rJlD+OWUGQYeIpa58hB+N9GnM8eka7lxKfay9lM3rn5Nz3E4x10mdgxYY9RzrFHv
-1MTGvNe/wRO67e9s0yJT+JEJ5No5h/c6J0dcrAiegiOvbhUtAYaygCpaxryTz9Bt
-CsSrBOXadzxIEnyp2pJE4vyxCVyHWve2EBk7Fagh45Z+JdA5QhGmNS3tHQmZ6Nyu
-CPAEjzn4k3jjHgoDfTUAEQEAAYkCbAQYAQoAIBYhBImQ4GrBi1LDHJkUx5Mo0B3n
-+1WHBQJgE2hAAhsuAUAJEJMo0B3n+1WHwHQgBBkBCgAdFiEEwbGjdbck4hC1jzBi
-DDW/ourdPjAFAmATaEAACgkQDDW/ourdPjDaqgf+Oav1oC+TfOvPyIJgfuZK3Of6
-hvTPW81udyKgmZ/pbwJ4rSAkX2BRe0k11OXSc0V4TEkUPG62lfyNbrb54FsZAaPk
-s2C8G5LW2xlJ91JXIxsFQJcGlWTTrd7IFMe+YtcBHYSBJNmtRXodXO0sYUXCcaMk
-Au/6y6x3m9nhfpqCsc3XZ6C0QxVMMhgrdSfnEPhHV/84m8mqDobU34+eDjnY/l6V
-7ZYycH7Ihtv7z4Ed5Ahasr2FmrMOA9y6VFHeFxmUPhRi2QwFl2TZJ0z8sMosUUg0
-0X2yMfkxnMUrym12NdLYrIMGCPo8vm6UhqY7TZis7N5esBEqmMMsqSH0xuGp8d+e
-B/99W1lQjtdhtE/UEW/wRMQHFoDC2Hyd9jA+NpFK0l7ryft0Jq284b/H4reFffSj
-ctQL123KtOLNFQsG5w2Theo2XtC9tvhYTAK8736bf7CWJFw3oW5OSpvfXntzJpmw
-qcISIERXJPMENLSwUwg7YfpgmSKdrafWSaQEr/e5t2fjf0O3rJfagWH9s0+BetlY
-NhAwpSf7Tm1X+rcep/8rKAsxwhgEQpfn88+2NTzVJDrTSt7CjcbV7nVIdUcK5ki9
-2+262W2wWFNnZ3ofWutFl9yTKEY3RVbxpkzYAIM7Q1vUEpP+rYSCYlUwYudl5Dwb
-BX2VXKOmi9HIn475ykM23BDR
-=magh
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+mQINBGKOOXIBEACy2KAwhV/qjObbfLc0StT0u8SaSUcubHE3DXpQOo69qyGxWM+e
+2cfOt7cDuyw8yEYUzpKmRhUgVbUchCtyDQZlzx2nZ0IfuAudsUl0RANk7nbZdjeG
+F4C21NvA69gtT4sGrDqwTGpKCLxcAUwwpYw2WUcyyz5e7mlGdxA4DmJ8uThDFHhd
+Yoq2X8YHHvBRIIS8q+T1de2NeFsSIEV2DqYx/L+z6IWkgE60mJy+5mfcuT/+mRpX
+iZ+w0JAUJDbATndp24TahLo60B+S/G2oIWN5WbKYfsJmHbU5EgjbbC/H8cITt8wS
+ZTGm+ZnSH6QMPGc8A1w/n/77JAAyVpQW907gLnUOX8qRypkmpNUGVw5gmQj7jvR0
+JyCO0z3V2W8DCvxzQR+Mci13ZTGw53pNiHQNn0K1iMT2IJ6XmTcKXrBy37yEQClx
+06h3DxSWfNQlQXBO5lnvwetMrU3OuwztYfsrnlqM3byLW21ZRCft7OCSzwiNbWu/
+qg8eyA9xiH/ElduuA1Z5dKcRY4dywHUy3GvbqkqJBlRCyySZlzeXkm3aBelFDaQ8
+rviKZ9Bc5AIAgjUG6Sz5rZu2VgHkxPo8ZzVJXAR5BPzRcY9UFmGH5c2jja/Hf2kd
+zP43wWAtXLM8Oci0fb5nizohTmgQq0JJNYRtZOEW0fSRd3tzh/eGTqQWOQARAQAB
+tDZTcGFjayBQcm9qZWN0IE9mZmljaWFsIEJpbmFyaWVzIDxtYWludGFpbmVyc0Bz
+cGFjay5pbz6JAk4EEwEKADgWIQQsjdMiTvNXOkK9Ih+o4Mo8HCraLwUCYo45cgIb
+AwULCQgHAgYVCgkICwIEFgIDAQIeAQIXgAAKCRCo4Mo8HCraL4vfD/9EQ5sotTYj
+83YmyJP/MdRB358QzmKV7+ewcE2+f184kx7z4IK/gBFYz9t32rnBBgm6/aQASHD+
+scm/mqV5+26B1nLUr5GbAEAm5yFDJulTBXX/TZpc5FyC+AgSh4j47kc9ZDM+YHFU
+uPLNqhJI19LVMSyleifqd8Xbqo9/aCPw6eRZMGmw8estV+QgHCT5oD1Y4SSRm2th
+/CUspVEWr8Dg0j3n7N/3y+pIBlf5lQ3wBeRgvH2c2ty28f8HauHiXAi0tkdCTU1W
+509avGE8a5XWYL0zjBrcintEfeaf7e4zOW+YQdVmIp1HDvBkBrpuPSUwecZ5jtCX
+1w74wa0XQOYy3J19Vhc/t2DN63wYCKV9BxznvOFOWDn9foo2kUkx7E5gnYcV/P6n
+cGG5iikikRvK0IKGGmaYa4W0VNDK819npsTM0EFxE+6j9VWsGBXHSSdPk1vIRYuw
+tWJi79/v5wpnsqwTycYv1v4V0PGeIi+dtCpFWj/4BFpqPLVVT4SswmHmEikAuFaC
+APczWR8H62Y+qrLyMlk66GdyU0j0+SkWmLZVhqAinskcpnnwXzHS6cQtIePjygXy
+09whvSRq1BZtiNqcND/hz6cJR/v29brxXhQplPLCu942OixvG+DUjTPn/GhMfA8R
+mXvx0MFYcw2T0m8JfqYOXra7Tbi8UOulP4kCMwQQAQgAHRYhBFZuUukEjpZald+h
+cYfpC6CmUwNPBQJijnY4AAoJEIfpC6CmUwNPcKYP/22pYBPIIUZwBeD8q/DileiX
+L8YgHU+ziy1qxHPiVXuwWEvQUwnCG5BeyvsCViQrs1670OCqtsKk/sBJJvp17kkt
+tKmBjUoZDS+YbD6d4q9QgHVN8jW1HzotWPGKrVV4oDdnNBBKoS5h6i4tpMca3kXq
+d7Ow3qzxSYkHCmJqqXNCBQzgIlha2FijMNWe1cnJr+IpG6eJ5wfeQm3llIVHbeCj
+YZyAorjRSzsopfWwXrQJXLTBJzgj4JDUWMmk6IdKcF41NOWB6f04Eky0zEHX02Xl
+eRxJygan9bSayELz3vjWQu2aBR4NxDdsvRy2E35wfHHruqLKTzaC1VU3nYGDPcby
+W+A/fA6+sY7Gh5y41xiD6hyDJxKkhEn9SfuF47YghFtW0r5fPe4ktwjI5e0sHWI6
+WFdNLF51F6B7xSbzNwRPw+XSWNvLvXvOu94FSPV1aoT5Vgq79377q/wlqmv/SSk/
+FcZFmglmIwJiLj/qUt2TlAkXb9UxQCfP3Sn98wGMOtZ6Hm0Uga3zjmw9dCTY/anL
+n/RZYTJSXlEDAtnT/aU708zsz5skKHfLWTk4/1wLg/mUqImLJGRQ26J1C1njl9RK
+p5SRAObb7qvoyM730avvGkiom5/GwxAzXiRs2c9gbuTEeCYhcBrwMSK1QiHZbG8y
+IAe8fagtFxvtN5oJVmqpiQIzBBABCgAdFiEEE6CQhKr7XHIVX7vh1XGr4uJd+5wF
+AmKOeGUACgkQ1XGr4uJd+5xphRAAmEINmSJVDApYyhlDRCZAE39oD+FUlbOyOkr+
+yGXsCSz2cnMXdypBms3hVhs5ut6zmiekFmGJ1X2SrHWcm/NoZZAZDu+8LT0HF90W
+MG+u4hP66lsglwlVYhzX8XjQpoSCunFHUb7HAShNVPSC30aWlVJZTyowLIuXiMmV
+pwB5lhtHLoMjjAEpsy3l7lEey/+clL03fW3FzZWAwY2RgORNlz6ctaIPlN6Tjoya
+iO6iFWE8/DiiATJ4fass3FijmfERD8o4H6nKKwEQzYTG5MoMuKC6fbokCL58wQ5k
+jJ8wFpiajFKFsKfuk5+0q+LZ3FuLY0TUAOR7AJUELqTsBMUM3qf9EFU7UHN4KHoK
++3FrqCouT90pUjq8KoYXi0nfOqROuXJBAcx+g9G7H6x8yEPISE4la6T9aNujavip
+jxMP7D3fY5leqxRozVtUZ5rHAvN1s6DtzPnrQVsn9RiwRJO3lYj64wvSRPHkRAVL
+U9RtZXonlHsx1PVbx4XlAuSFOxLHApBWM+CqemyhXdtC1MCpTgqOpt7rTendZnqc
+T2tDcaZIHw3+KAdXrU9zvvEqkk/dfnckdDQZTSd96r16HGLtYj6ILzd17/j/qTnq
+BYeZl9CbYtyF117zyIjzaq8oTZxj97Tu5a9WXNOyaeB4e6A9AQeSJVA9vFM9vUCM
+5pJRJzqJAjMEEAEKAB0WIQR802Mbz07k2wBqg1zqYglGK5OFuQUCYo54fAAKCRDq
+YglGK5OFufWXEACH06ybO/LwTX4I+aoj19d5OSe8xaSZBHu2WVh2KfbPxInXNIhQ
+1/W44hnQNGS0B6a/fN80xPD+tWdBLF1fAl7tgz+KBECc9fTsebeXMmY/FJZP60uL
+1Da1RMEOd3lg7DgNfLjgIiXi4eqqYHHRwCSFww4VhZJ3lxnXrcuFwLXDAvGzB/5x
+mK23fhCQ0tK9I/jcKyzrN3a/tcU7bXuQ0ewRDtVvfCnimGAjcayIpb3bhn0CByxJ
+B4fH3mIw1/nMzBZr6PtNEPwSRLEnVsWwmUs9sCHWftgAqDpF3dnC5nk5XZdr+auo
+3MeP47gbMW6xQcGY6XJgoWNIeNUpVJg12e0AsaOdEJJO4NU+IEmb5d5M889Kr3e2
sor5/6kbRAjh5RUShOLrae15Gkzd13K5oOlhBcyTTEqf4QSnOtvnvpKghHjaBkSh
em2N+HfZ8hmahRv3DI79rVx/vjLSFwc9Y/GYw56Xu8bBFmHP4rdFZP4VC87EjOwa
zNb/0XfpUJkFsGXyUdrvd+Ma4z8PA3Got9ZO2ZV/yAMnCpyTX5pUvF6RBA2zlv2Z
8EqafabD/iGJJFubHuoDLzTCSgD0hmO6l8r/lR5TmjHH2fw8kYXOCpRe2rbtyS1f
ttwP1IvoFb7mVQ6/Q7ghMxPiOxg/cGtW5fK+TYGU5cyYFu0Ad7dL11rgI4kCMwQQ
AQoAHRYhBBmw8mZl6mDaDDnZvOI98LWuJhGfBQJijnd2AAoJEOI98LWuJhGfXIEP
/0Nk2P2rMrX4MrfxACkHnRFS35GKbs2EqQGy2mxok5s+VDE/neKLozzBU/2x4ub8
P8UrXKBHAyW2MwZ1etx27ARoWcbGaOICbIMUCgmGSCqMlfo18SJVyssRPDvKxy/+
S7PIwgdFlRb79UxEMYi7L5Ig0H5nHYaHPAqvzTOssy+OXub0oU+sCK4s9WD9OPBf
vA9dfGJMnLyi4wTs0/6LXKAf5BwGOzeXhWL4GQmpRqb8Kw40BgBXhye9xUwr72BI
iAVVfUG5LTY9K8b9eK6DB78fdaZsvtfgY85Ou+OiMjEPitYCQF1mIt0qb9GcaC1b
VujdvM/ifpxXpyTdC95KUf773kTrv+v8842U99gccBNQp6rYUHkDT3bZXISAQEpd
c22iclcr6dCKRTRnaQpEkfDcidTEOnpadEDjl0EeZOeAS333awNe4ABP7pR7AvRW
2vg1cY7z52FEoLG20SkXbLb3Iaf5t3AOqS5z0kS59ALy6hxuuDJG8gb1KfzmJgMn
k9PJE8LdBVwsz346GLNUxLKzqBVFRX4N6WKDYURq30h/68p+U24K9/IeA5BTWsUQ
p7q11dk/JpkbyrK74V2hThwEyTv9hjMQuALTNr9sh3aUqT5XbhCgMnJcb1mkxhU9
ADKN+4h+tfuz6C0+wf3nFE08sUkStlN3Vbh/w+nJaII1iQEzBBABCgAdFiEEuNcs
eEKe1WcgoStyNsSqUuTK0GUFAmKOXFQACgkQNsSqUuTK0GWOmAf/RNODdqbdXkyP
8J4ePZDcPbzEhGNW6piSCbyJ7cVOQHLYuwV7i6gJ3L+h9AGii2fI4YBQkEzjiGJm
8o0HazR76R2iT7QcFQ4vLkX8wYgO5q4SFGlzOfhHr+OOrd2r2D6Pid/oCADUfa+x
NJt9V9naVCjkQq6rAFUK3cpwVCC8pB4at5+sL503u718Ce4u0uzuKwGXqmhsRruF
9ZIn1hkoneKKFDb7C1zT08oegy484LcjTfbzKuBHZOXW0cNkqtSCuL9lBmrD2t+r
KZWwNX8DLIOZtfxux+jCxks2dp903Zd9H5PlaZOldMbXGIEINfcUurl5H79vAXqW
EUOtsYYX74kCMwQQAQgAHRYhBPMiP8Tk+ZH+vALLH8FnKfGqz2bGBQJijpCCAAoJ
EMFnKfGqz2bGrZoP/jobYML0JDIQT22TyNfz2Q38WhLtdaEnKeEMU5HDq9uEOjjz
BZewMB4LLTgE3uvXngL7K/2R6i6fu1YAOX2RaySG4VfxNd7z1TTHnTWmeKd+rtB/
o5/iH4hp3uLYFPvWqKjr7PPuXzi1JE99lEThuyqM88GcKfuNvldJtjhALZL548St
es6D82tGumbWzFEeyDbCxJRBOWfX6vkVBR7w3Q2NRxEOtvc58mhXiHOs2/vXMMzr
1RMYzzYvq8jXi8uaa5Esmeo6r1Md67oaNfPNulhYUe4mKYwPuphcBSNCfRGQnRlU
8oToURyRcXI6Bd9dJSvtznMHrsWO+Zm4O752cvfi/GKHUPVJ/FvO5L0qo2546+tn
nIDPVhvAnhWO5+95ooRIXsxa2mzYtaudAu6pcI6OaMANjJ8fUTxFmedN9HqlkKPF
ghvcwqJdpmpRs05nAuNzHmnKkMVI8R7uBavB6F5cAfonNVgoCKCsFHpG2jGViMRj
/OtovngNpYrweyIiPxRYGhKiP3rNzNjMT2HfVz8xiTttIeMXU+JrBem40CagxBFa
JYsZjATkKDJ3tXlRs1JGqKiI3Veia4elCCLv6uzfKVGAg4xMWjtKbdxpi3ku8JSh
Ntzc928pJkHprP1WPGVbZ3xPPJ3N+WTawGYnblcLlRFeVErdKSJeQOknbH/EiQIz
BBABCAAdFiEEikTOi7BILoKkJagWI2JUH20U7YQFAmKOvW0ACgkQI2JUH20U7YRW
Aw//dBsV+CCqb0i9s8O6l/L7kZG///jxqobv6SLlUOKuFIckGMKBVi7QSLC11Wgv
Z8ETswrSDSP9YzP46TP4Ad2tZQoulhB+sEfNIsRu1doYXPmr23T68Jof4dinCTVO
rgoU8XboKjzQzy27ziziJ4OZxRl9c4zIaSw4FyEzt4BLKAByi9NT5CtJN5Sr3v9X
CncuKnekqpTpLltbLJYYK+DT+Vy8+FT9XehQbndKtM9i4FXvp6xzM61GfL3s3MA8
Xosol+8OrwYLKhUM5mbg0sqreqVRcmeiRCBO5MfCfrpukCbqBmwi/E7qyw+S4IAl
HdU8JVRQvCoJCCy8pZqfMAdgx35E3CM7/GIlb5Dk2teKPlmSBXu7ckMhhFauiDFK
ImX6ThCe/uExvK5npiowKvQsEjhDeUU4zt9N8UxgaRpPYr2tyHnqqRTeEtk9/K9j
O8WH825DeKhjwU6Eg4Qtb8HmlA0fnZ//L826KC8mTkFSbkdKtIMvlq8u6nAgHsMF
GoUbBvtDbvenZhkndQpuDd2tXpSob+9f1TqZGfWv2nbOtfEfXEf9BwayX+iJOH2F
cC6bJbIG5UTirbjxDmVmKn71CxgJTHRqSUULKE4rimpnDpN6S2qw4ZEK4EIkSH6b
qmlDWCsiprIc2KwiTOz8wCCsKNwXnc3PfFtM8bGO6N9Yz965Ag0EYo45cgEQAOnf
WNZhXdeapCMfs+YxuySSfCw8X2ucHegKV0tKCg1ozKY9Z/CQ4apcNxJD1ZS++0y8
38OjNo7JayAp4joCT+/lFN+OzFuMf5xc5E5pQeF1UAsi27FJFJWX9NIvdZ0BmTzy
E0GJGg9CSUPOfImV1fW7uVWkkzi+UE09pe8llmkY3JCX5ViIH0bTFzF52BZL06np
0MxNFwBVm4sZXyPOxInqOm66gICrbxLDriz3EYa2bJm17I6Kclvw/X/ohCeGU9WW
KEzWTE03OxRMqLlPfxgqVshIz2dO77u47yehI6BOsOhpp4Ag7rRgLpRs1Iemg02/
Oa82FiiLw0S4g/4UXyi4cZKF4vCefNs2z+IaSEe3l9z9Gg19gPsanrYP+CfZRsXk
2EYxwt0Ffmma2rQ24zQ9yJ5NvqiINX9xd/LjOv8pmkNArKXsbtY4KwtH7zVkiS4Q
9FsU5C/9BaM9z/fMpQNUq6mx9FCw0C+NntWYvfXn4PFNPC7klYgM/2VFvxq+vBk6
CVpbcSoYy1+7uZZ+hskyQek8Dbtnk7fLBRC4gHK9T2gbroo/eS0u9b1PFlaka0HG
1zKwU1u6Iq19r2qapKoMf3SGStEilh8x0eyCdEqqCHEi/HKYU4zGa54zBGlpkmMy
Q7VZmeozNpY/F02KZByMWIstjKZGEINXhaI/2F5HABEBAAGJAjYEGAEKACAWIQQs
jdMiTvNXOkK9Ih+o4Mo8HCraLwUCYo45cgIbDAAKCRCo4Mo8HCraL5bAD/9bS+7u
0Xq+kt8/sQpWnmwevgcnsSxVwENf1In9l/ZSShtaLvBUaj2Ot80bnfTl9dVruH8v
+Nh1HrzklGwwcqNP5jQZEVFaqQO/jA/c84gKvXoQPUA555rcTonZYquEBMqEMlJe
jil+Lb+pfII+BVD0wvQDCnpzni8u7r0cEjPEMevLoTJdcgNhn1WuvDRFp8+HtlTx
198wcZbAPgFHRpE1NQjrP2CBket/ZIxuvuAEf9TpYifsjG0NZcdxeN0JZ3HOKZUT
kKgNanm+PxqXRynnrdEEH6I2vPR+KMr5+ZqFcXbamvDy64Xewi0EVYecQk1SllfC
lCuDih5ZqcjmZqdcqoFxc+tc7gcb509Fo/+mBCq6nXEVorKPJqdoW3IGbz29Nkqc
ZczFyeOpCk3UaPCz3kxebVfaDydiRkFnWlFEZNkAidZGOKs5ykEmEvq8c9+dSyaS
3Y7xcx/SaGyF/a4+9cdd974/HcPKcRHRi7nXrn+yEVQq8CZAvKWVYyme461isPkz
loWb1AKXK5kHR0CFq4HTXMZrrNsdWoU2lP+BNVg0dQG4Z0QpcOrgwbLjXrqTVrbB
PITOx6cYB7RafdBmhBF+8qOHmr7wwI92DV0vYeEjlGT9FazAquCAMVqBnqz/52GM
Vzr8QG9fSxBmTeoMQkkdXah/sex7zWVfUqoJLrkCDQRijjtHARAAmPVZ6gtFeFcI
NzMD3iZVkFglIM91SfdglmgjxdC4mAjSGQk5dBxrqFJrT6Tn5b/i8JLGgrOpaDvb
O50chDmys+VtWEmoxa9kl4BOjzEbr7e0UymoK1GTK6ZAIIrHytFkSjcP3LSi5Am9
aCwVhZ3NoH3LKBzj+ast7I0QJT/mt5ti2h/4qEw3qJ6ImKUmjfLTOkTjt5NfWgT9
McdnV74XpOi3OdIL8vTp+1S5Dm3pByVYJdR0mfD0uRg9+OHN04t4D3k2hPovBxJN
E1uKE5IPt6RJ+E1POCfA1dM1PsaSf1S4zyyIlUK7HM99oXUg00JXHBUD5z12hXuy
5sQZE+lFaMRej+uwO/uO2YiebansrQc+MMnrkAKElCzb0esuRNWff5CPnQ2K0ZL/
x+xNfziRNvTAICz6ir4bPONa201V6rDFoe4HNLxL0u+mLnag5i/4xiE/eWtzEBSp
F1HNn+LSrNn65JDjU1y3o0iDwZZo1hVzG2zgx8f/7gXDJpMcVHpLWz5Why77643l
NoR+qdUwoofzC5Soz+m1SoOOoEfCTiZZaukaOSFDeh/ZQ7M/MvQ0ytd8HZwFm6po
QJYQiwJUV9Es4szqndr8bxHxoy55mJqewe+cTvqB2Nqy7OjXNFZD37TuLLw9clJK
MLKFrz2VitRyRADhg11oCmGp6GAZBLEAEQEAAYkEbAQYAQoAIBYhBCyN0yJO81c6
Qr0iH6jgyjwcKtovBQJijjtHAhsCAkAJEKjgyjwcKtovwXQgBBkBCgAdFiEE0sfr
PysF+oZZDSk8BAAbLj2wxyMFAmKOO0cACgkQBAAbLj2wxyNbCw//ah/m1jrdV254
WAEt800l9brvONRU5YwjZE9HuHaTXxaRU5jd55lreFGWqzYJDe1vKkR0BdCGHIB8
KERNGXq6oQUJ4+oVrLWeV11ojen7xmbSYwEvd5VEK2ihYHyq5n0n2mwOG3+9HPkm
5N1QqjykePr7xqkBn3xgMJgB4KGydZi/zk5mNM3R23T061gn0G3TntnGWjppzRzx
a4CUz4b+ys6yz6I6LQ1pIG3pYeXgb6p9McdWP3+gec1xYPTgR001AbcbuMAXzjRI
GNGblsy0CAXTPju1451129wTx9l5x4sLLscmHv/TDRT9/YpEPfhA0xtL/XdNgG9o
lndhi6UsC8dg68sKI6MZzbFJBUmzwvThZi9DjvS7tI4IynQEENB0rEpNwBgNpE3w
OvoJBB+Ykr7Lyg7M/AdymBu4sHTW6nUuLlDo45gHAaFkKdM+WCRllvdRDI6/CnDh
dqSnqrfcyFFgzPgrA3fqoQ1TX8pgoMWasnBShaZn2vmXBUcfImxKCCVSpqhzfSIx
lmkneo3WC2hqkMDTfcx8z77u37pJYPWMiHidGqkJCRr7P4K112SoH5SVa0R94yNu
5zOolbyvt1lgKYmS/UDpxfHkUHL1WVJo0Ki/LTADNYCMYUKz/4E3Em5T9DIBRrkq
F7BWxpCF3kogEnpOQ5Qj9cTOZsfBEqM4jA/9H233pFPKvgzYcYst3l65Ootx+dsh
5gHIbp0aWBGGxLMbwkjSXdIbGxZrmxTng+9CWgpAX9j5lkjCNJxEpxzYGiRwJE2N
p7O+dGnfO9VTsVWqcCc73T4s6HVnmxX8ZxGSW2LhoI1na+rqnz6rzz1Rdj+RnG57
HHDzGKvzjbFfbQREweduG+M4JbtOMLCCooojwzxyCRTbNsQEh0uleMyse1PjmnYz
dII0l+arVaOg73i+1KkMyCKvxd+zPw5/gPM62LcUxkqMfnFgBHyzh/W199b6ukZP
DODeXOzKhiJiIVtxztl6L+hpXY+yE60iPgcbIiP4qMZxFGGolM3LfDzzD57evJWH
6SuDy3yv6vmUcZgmEpDSU/wbByNN7FNTHblrImRDGHg9Xs/9NQV0ngA96jPvILsu
BUw4y1ybVwu5GgNct3VbDGzlaSpUpYnSyp04id3iZJRIYKBln+wJ7q73oo8vltlX
QpOENRJHJoDECq0R/UDg1j+mwEw/p1A9xcQ8caw9p0Y2YyIlcSboqeKbjXrOv4oi
rEUyfGBlSvk+8Deg7TIW4fGQ/uW4sthkRNLpSxWgV+t8VRTEtQ6+ONL+u2ehQkbF
7+kvlN1LTQNITLpJ+8DGlYke8qlloGY2ROPR+INyNbJCiJOLba2CjRBu48w30iXy
cHFsNYXiYw5O7lA=
=cjRy
-----END PGP PUBLIC KEY BLOCK-----


@@ -626,7 +626,7 @@ _spack_ci() {
 }
 _spack_ci_generate() {
-    SPACK_COMPREPLY="-h --help --output-file --copy-to --optimize --dependencies --prune-dag --no-prune-dag --check-index-only --artifacts-root"
+    SPACK_COMPREPLY="-h --help --output-file --copy-to --optimize --dependencies --buildcache-destination --prune-dag --no-prune-dag --check-index-only --artifacts-root"
 }
 _spack_ci_rebuild_index() {
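
The completion change above adds `--buildcache-destination` to the flag list that bash completion offers for `spack ci generate`. A minimal sketch of what the updated function does: it populates the `SPACK_COMPREPLY` variable with the candidate flags, which the completion machinery then matches against the current word. The function body is copied from the diff; the check at the end is only illustrative.

```shell
# Updated completion function from the diff: it sets SPACK_COMPREPLY to every
# flag that should be offered when completing `spack ci generate`.
_spack_ci_generate() {
    SPACK_COMPREPLY="-h --help --output-file --copy-to --optimize --dependencies --buildcache-destination --prune-dag --no-prune-dag --check-index-only --artifacts-root"
}

# Illustrative check that the new flag is now among the completion candidates.
_spack_ci_generate
case "$SPACK_COMPREPLY" in
    *--buildcache-destination*) echo "flag offered" ;;
    *) echo "flag missing" ;;
esac
```

Running the snippet prints `flag offered`, confirming the new option is in the candidate list.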