ci: Remove deprecated logic from the ci module (#47062)

Remove the following from the ci module, schema, and tests:

- deprecated ci stack and handling of old ci config
- deprecated mirror handling logic
- support for artifacts buildcache
- support for temporary storage url

Author: Scott Wittenburg (committed via GitHub on 2024-10-23 12:50:55 -06:00)
parent 755c113c16
commit 1472dcace4
11 changed files with 169 additions and 1221 deletions


@@ -59,7 +59,7 @@ Functional Example
 ------------------
 The simplest fully functional standalone example of a working pipeline can be
-examined live at this example `project <https://gitlab.com/scott.wittenburg/spack-pipeline-demo>`_
+examined live at this example `project <https://gitlab.com/spack/pipeline-quickstart>`_
 on gitlab.com.
 Here's the ``.gitlab-ci.yml`` file from that example that builds and runs the
@@ -67,39 +67,46 @@ pipeline:
 .. code-block:: yaml

-   stages: [generate, build]
+   stages: [ "generate", "build" ]

    variables:
-     SPACK_REPO: https://github.com/scottwittenburg/spack.git
-     SPACK_REF: pipelines-reproducible-builds
+     SPACK_REPOSITORY: "https://github.com/spack/spack.git"
+     SPACK_REF: "develop-2024-10-06"
+     SPACK_USER_CONFIG_PATH: ${CI_PROJECT_DIR}
+     SPACK_BACKTRACE: 1

    generate-pipeline:
+     stage: generate
      tags:
-       - docker
+       - saas-linux-small-amd64
-     stage: generate
      image:
-       name: ghcr.io/scottwittenburg/ecpe4s-ubuntu18.04-runner-x86_64:2020-09-01
-       entrypoint: [""]
+       name: ghcr.io/spack/ubuntu20.04-runner-x86_64:2023-01-01
-     before_script:
-       - git clone ${SPACK_REPO}
-       - pushd spack && git checkout ${SPACK_REF} && popd
-       - . "./spack/share/spack/setup-env.sh"
      script:
+       - git clone ${SPACK_REPOSITORY}
+       - cd spack && git checkout ${SPACK_REF} && cd ../
+       - . "./spack/share/spack/setup-env.sh"
+       - spack --version
        - spack env activate --without-view .
-       - spack -d ci generate
+       - spack -d -v --color=always
+         ci generate
+         --check-index-only
          --artifacts-root "${CI_PROJECT_DIR}/jobs_scratch_dir"
-         --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
+         --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/cloud-ci-pipeline.yml"
      artifacts:
        paths:
          - "${CI_PROJECT_DIR}/jobs_scratch_dir"

-   build-jobs:
+   build-pipeline:
      stage: build
      trigger:
        include:
-         - artifact: "jobs_scratch_dir/pipeline.yml"
+         - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
            job: generate-pipeline
        strategy: depend
+     needs:
+       - artifacts: True
+         job: generate-pipeline
 The key thing to note above is that there are two jobs: The first job to run,
 ``generate-pipeline``, runs the ``spack ci generate`` command to generate a
@@ -114,82 +121,93 @@ And here's the spack environment built by the pipeline represented as a
 spack:
   view: false
   concretizer:
-    unify: false
+    unify: true
+    reuse: false

   definitions:
   - pkgs:
     - zlib
-    - bzip2
-  - arch:
-    - '%gcc@7.5.0 arch=linux-ubuntu18.04-x86_64'
+    - bzip2 ~debug
+  - compiler:
+    - '%gcc'

   specs:
   - matrix:
     - - $pkgs
-    - - $arch
-
-  mirrors: { "mirror": "s3://spack-public/mirror" }
+    - - $compiler

   ci:
-    enable-artifacts-buildcache: True
-    rebuild-index: False
+    target: gitlab
+
     pipeline-gen:
     - any-job:
-        before_script:
-          - git clone ${SPACK_REPO}
-          - pushd spack && git checkout ${SPACK_CHECKOUT_VERSION} && popd
-          - . "./spack/share/spack/setup-env.sh"
-    - build-job:
-        tags: [docker]
+        tags:
+        - saas-linux-small-amd64
         image:
-          name: ghcr.io/scottwittenburg/ecpe4s-ubuntu18.04-runner-x86_64:2020-09-01
-          entrypoint: [""]
+          name: ghcr.io/spack/ubuntu20.04-runner-x86_64:2023-01-01
+        before_script:
+        - git clone ${SPACK_REPOSITORY}
+        - cd spack && git checkout ${SPACK_REF} && cd ../
+        - . "./spack/share/spack/setup-env.sh"
+        - spack --version
+        - export SPACK_USER_CONFIG_PATH=${CI_PROJECT_DIR}
+        - spack config blame mirrors

+The elements of this file important to spack ci pipelines are described in more
+detail below, but there are a couple of things to note about the above working
+example:
+
 .. note::
-   There is no ``script`` attribute specified here. The reason for this is
-   Spack CI will automatically generate reasonable default scripts. More
-   detail on what is in these scripts can be found below.
-
-   Also notice the ``before_script`` section. It is required when using any of the
-   default scripts to source the ``setup-env.sh`` script in order to inform
-   the default scripts where to find the ``spack`` executable.
+   The use of ``reuse: false`` in spack environments used for pipelines is
+   almost always what you want, as without it your pipelines will not rebuild
+   packages even if package hashes have changed. This is due to the concretizer
+   strongly preferring known hashes when ``reuse: true``.

-Normally ``enable-artifacts-buildcache`` is not recommended in production as it
-results in large binary artifacts getting transferred back and forth between
-gitlab and the runners. But in this example on gitlab.com where there is no
-shared, persistent file system, and where no secrets are stored for giving
-permission to write to an S3 bucket, ``enable-artifacts-buildcache`` is the only
-way to propagate binaries from jobs to their dependents.
+The ``ci`` section in the above environment file contains the bare minimum
+configuration required for ``spack ci generate`` to create a working pipeline.
+The ``target: gitlab`` tells spack that the desired pipeline output is for
+gitlab. However, this isn't strictly required, as currently gitlab is the
+only possible output format for pipelines. The ``pipeline-gen`` section
+contains the key information needed to specify attributes for the generated
+jobs. Notice that it contains a list which has only a single element in
+this case. In real pipelines it will almost certainly have more elements,
+and in those cases, order is important: spack starts at the bottom of the
+list and works upwards when applying attributes.

-Also, it is usually a good idea to let the pipeline generate a final "rebuild the
-buildcache index" job, so that subsequent pipeline generation can quickly determine
-which specs are up to date and which need to be rebuilt (it's a good idea for other
-reasons as well, but those are out of scope for this discussion). In this case we
-have disabled it (using ``rebuild-index: False``) because the index would only be
-generated in the artifacts mirror anyway, and consequently would not be available
-during subsequent pipeline runs.
+But in this simple case, we use only the special key ``any-job`` to
+indicate that spack should apply the specified attributes (``tags``, ``image``,
+and ``before_script``) to any job it generates. This includes jobs for
+building/pushing all packages, a ``rebuild-index`` job at the end of the
+pipeline, as well as any ``noop`` jobs that might be needed by gitlab when
+no rebuilds are required.

-.. note::
-   With the addition of reproducible builds (#22887) a previously working
-   pipeline will require some changes:
+Something to note is that in this simple case, we rely on spack to
+generate a reasonable script for the package build jobs (it just creates
+a script that invokes ``spack ci rebuild``).

-   * In the build-jobs, the environment location changed.
-     This will typically show as a ``KeyError`` in the failing job. Be sure to
-     point to ``${SPACK_CONCRETE_ENV_DIR}``.
+Another thing to note is the use of the ``SPACK_USER_CONFIG_PATH`` environment
+variable in any generated jobs. The purpose of this is to make spack
+aware of one final file in the example, the one that contains the mirror
+configuration. This file, ``mirrors.yaml``, looks like this:

-   * When using ``include`` in your environment, be sure to make the included
-     files available in the build jobs. This means adding those files to the
-     artifact directory. Those files will also be missing in the reproducibility
-     artifact.
+.. code-block:: yaml

-   * Because the location of the environment changed, including files with
-     relative path may have to be adapted to work both in the project context
-     (generation job) and in the concrete env dir context (build job).
+   mirrors:
+     buildcache-destination:
+       url: oci://registry.gitlab.com/spack/pipeline-quickstart
+       binary: true
+       access_pair:
+         id_variable: CI_REGISTRY_USER
+         secret_variable: CI_REGISTRY_PASSWORD
+
+Note the name of the mirror is ``buildcache-destination``, which is required
+as of Spack 0.23 (see below for more information). The mirror url simply
+points to the container registry associated with the project, while
+``id_variable`` and ``secret_variable`` refer to environment variables
+containing the access credentials for the mirror.
+
+When spack builds packages for this example project, they will be pushed to
+the project container registry, where they will be available for subsequent
+jobs to install as dependencies, or for other pipelines to use to build runnable
+container images.
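To make the ordering rule above concrete, here is a hypothetical ``pipeline-gen`` list with two elements; the runner tags are placeholders, and the sketch assumes that attributes applied later (nearer the top of the list) take precedence:

.. code-block:: yaml

   ci:
     target: gitlab
     pipeline-gen:
     # applied last, since spack works upward from the bottom of the list
     - build-job:
         tags: [ example-large-runner ]
     # applied first, to every kind of generated job
     - any-job:
         tags: [ example-small-runner ]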
 -----------------------------------
 Spack commands supporting pipelines
@@ -417,15 +435,6 @@ configuration with a ``script`` attribute. Specifying a signing job without a script
 does not create a signing job and the job configuration attributes will be ignored.
 Signing jobs are always assigned the runner tags ``aws``, ``protected``, and ``notary``.

-^^^^^^^^^^^^^^^^^
-Cleanup (cleanup)
-^^^^^^^^^^^^^^^^^
-
-When using ``temporary-storage-url-prefix`` the cleanup job will destroy the mirror
-created for the associated Gitlab pipeline. Cleanup jobs do not allow modifying the
-script, but do expect that the spack command is in the path and require a
-``before_script`` to be specified that sources the ``setup-env.sh`` script.
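For reference, a signing job only materializes when a ``signing-job`` entry in ``pipeline-gen`` supplies a ``script``, per the paragraph above. This is a minimal sketch only; the image name and script line are placeholders, not part of this commit:

.. code-block:: yaml

   ci:
     pipeline-gen:
     - signing-job:
         image: example/signing-image:latest
         script:
         - sign-binaries.sh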
 .. _noop_jobs:

 ^^^^^^^^^^^^
@@ -741,15 +750,6 @@ environment/stack file, and in that case no bootstrapping will be done (only the
 specs will be staged for building) and the runners will be expected to already
 have all needed compilers installed and configured for spack to use.

-^^^^^^^^^^^^^^^^^^^
-Pipeline Buildcache
-^^^^^^^^^^^^^^^^^^^
-
-The ``enable-artifacts-buildcache`` key
-takes a boolean and determines whether the pipeline uses artifacts to store and
-pass along the buildcaches from one stage to the next (the default if you don't
-provide this option is ``False``).
 ^^^^^^^^^^^^^^^^
 Broken Specs URL
 ^^^^^^^^^^^^^^^^


@@ -34,7 +34,7 @@
 import spack.binary_distribution as bindist
 import spack.concretize
 import spack.config as cfg
-import spack.environment as ev
+import spack.error
 import spack.main
 import spack.mirror
 import spack.paths
@@ -95,8 +95,6 @@ def dispatch_open(fullurl, data=None, timeout=None, verify_ssl=True):
 TEMP_STORAGE_MIRROR_NAME = "ci_temporary_mirror"
 SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
-# TODO: Remove this in Spack 0.23
-SHARED_PR_MIRROR_URL = "s3://spack-binaries-prs/shared_pr_mirror"
 JOB_NAME_FORMAT = (
     "{name}{@version} {/hash:7} {%compiler.name}{@compiler.version}{ arch=architecture}"
 )
@@ -201,11 +199,11 @@ def _remove_satisfied_deps(deps, satisfied_list):
     return nodes, edges, stages

-def _print_staging_summary(spec_labels, stages, mirrors_to_check, rebuild_decisions):
+def _print_staging_summary(spec_labels, stages, rebuild_decisions):
     if not stages:
         return

-    mirrors = spack.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True)
+    mirrors = spack.mirror.MirrorCollection(binary=True)
     tty.msg("Checked the following mirrors for binaries:")
     for m in mirrors.values():
         tty.msg(f"  {m.fetch_url}")
@@ -252,21 +250,14 @@ def _spec_matches(spec, match_string):
     return spec.intersects(match_string)

-def _format_job_needs(
-    dep_jobs, build_group, prune_dag, rebuild_decisions, enable_artifacts_buildcache
-):
+def _format_job_needs(dep_jobs, build_group, prune_dag, rebuild_decisions):
     needs_list = []
     for dep_job in dep_jobs:
         dep_spec_key = _spec_ci_label(dep_job)
         rebuild = rebuild_decisions[dep_spec_key].rebuild

         if not prune_dag or rebuild:
-            needs_list.append(
-                {
-                    "job": get_job_name(dep_job, build_group),
-                    "artifacts": enable_artifacts_buildcache,
-                }
-            )
+            needs_list.append({"job": get_job_name(dep_job, build_group), "artifacts": False})

     return needs_list
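Every generated ``needs`` entry is now scheduling-only. As a sketch, one entry in the emitted pipeline yaml would look like the following, where the concrete job name is hypothetical but follows the ``JOB_NAME_FORMAT`` defined earlier in this file:

.. code-block:: yaml

   needs:
   - job: "zlib@1.3.1 /abcdefg %gcc@11.4.0 arch=linux-ubuntu20.04-x86_64"
     artifacts: false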
@@ -410,12 +401,6 @@ def __init__(self, ci_config, spec_labels, stages):
         self.ir = {
             "jobs": {},
-            "temporary-storage-url-prefix": self.ci_config.get(
-                "temporary-storage-url-prefix", None
-            ),
-            "enable-artifacts-buildcache": self.ci_config.get(
-                "enable-artifacts-buildcache", False
-            ),
             "rebuild-index": self.ci_config.get("rebuild-index", True),
             "broken-specs-url": self.ci_config.get("broken-specs-url", None),
             "broken-tests-packages": self.ci_config.get("broken-tests-packages", []),
@@ -698,14 +683,13 @@ def generate_gitlab_ci_yaml(
     prune_dag=False,
     check_index_only=False,
     artifacts_root=None,
-    remote_mirror_override=None,
 ):
     """Generate a gitlab yaml file to run a dynamic child pipeline from
     the spec matrix in the active environment.

     Arguments:
         env (spack.environment.Environment): Activated environment object
-            which must contain a gitlab-ci section describing how to map
+            which must contain a ci section describing how to map
             specs to runners
         print_summary (bool): Should we print a summary of all the jobs in
             the stages in which they were placed.
@@ -720,39 +704,21 @@ def generate_gitlab_ci_yaml(
         artifacts_root (str): Path where artifacts like logs, environment
             files (spack.yaml, spack.lock), etc should be written.  GitLab
             requires this to be within the project directory.
-        remote_mirror_override (str): Typically only needed when one spack.yaml
-            is used to populate several mirrors with binaries, based on some
-            criteria.  Spack protected pipelines populate different mirrors based
-            on branch name, facilitated by this option.  DEPRECATED
     """
     with spack.concretize.disable_compiler_existence_check():
         with env.write_transaction():
             env.concretize()
             env.write()

-    yaml_root = env.manifest[ev.TOP_LEVEL_KEY]
-
     # Get the joined "ci" config with all of the current scopes resolved
     ci_config = cfg.get("ci")

-    config_deprecated = False
     if not ci_config:
-        tty.warn("Environment does not have `ci` a configuration")
-        gitlabci_config = yaml_root.get("gitlab-ci")
-        if not gitlabci_config:
-            tty.die("Environment yaml does not have `gitlab-ci` config section. Cannot recover.")
-
-        tty.warn(
-            "The `gitlab-ci` configuration is deprecated in favor of `ci`.\n",
-            "To update run \n\t$ spack env update /path/to/ci/spack.yaml",
-        )
-        translate_deprecated_config(gitlabci_config)
-        ci_config = gitlabci_config
-        config_deprecated = True
+        raise SpackCIError("Environment does not have a `ci` configuration")

     # Default target is gitlab...and only target is gitlab
     if not ci_config.get("target", "gitlab") == "gitlab":
-        tty.die('Spack CI module only generates target "gitlab"')
+        raise SpackCIError('Spack CI module only generates target "gitlab"')

     cdash_config = cfg.get("cdash")
     cdash_handler = CDashHandler(cdash_config) if "build-group" in cdash_config else None
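With the fallback gone, ``spack ci generate`` raises ``SpackCIError`` (added at the bottom of this file) unless the environment supplies a ``ci`` section targeting gitlab. A minimal sketch of a passing configuration, with a placeholder runner tag:

.. code-block:: yaml

   spack:
     ci:
       target: gitlab   # optional; gitlab is the default and only supported target
       pipeline-gen:
       - any-job:
           tags: [ example-runner ]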
@@ -813,12 +779,6 @@ def generate_gitlab_ci_yaml(
     spack_pipeline_type = os.environ.get("SPACK_PIPELINE_TYPE", None)
     copy_only_pipeline = spack_pipeline_type == "spack_copy_only"

-    if copy_only_pipeline and config_deprecated:
-        tty.warn(
-            "SPACK_PIPELINE_TYPE=spack_copy_only is not supported when using\n",
-            "deprecated ci configuration, a no-op pipeline will be generated\n",
-            "instead.",
-        )

     def ensure_expected_target_path(path):
         """Returns passed paths with all Windows path separators exchanged
@@ -837,38 +797,16 @@ def ensure_expected_target_path(path):
         return path

     pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
-    deprecated_mirror_config = False
     buildcache_destination = None
-    if "buildcache-destination" in pipeline_mirrors:
-        if remote_mirror_override:
-            tty.die(
-                "Using the deprecated --buildcache-destination cli option and "
-                "having a mirror named 'buildcache-destination' at the same time "
-                "is not allowed"
-            )
-        buildcache_destination = pipeline_mirrors["buildcache-destination"]
-    else:
-        deprecated_mirror_config = True
-        # TODO: This will be an error in Spack 0.23
+    if "buildcache-destination" not in pipeline_mirrors:
+        raise SpackCIError("spack ci generate requires a mirror named 'buildcache-destination'")

-    # TODO: Remove this block in spack 0.23
-    remote_mirror_url = None
-    if deprecated_mirror_config:
-        if "mirrors" not in yaml_root or len(yaml_root["mirrors"].values()) < 1:
-            tty.die("spack ci generate requires an env containing a mirror")
-
-        ci_mirrors = yaml_root["mirrors"]
-        mirror_urls = [url for url in ci_mirrors.values()]
-        remote_mirror_url = mirror_urls[0]
+    buildcache_destination = pipeline_mirrors["buildcache-destination"]

     spack_buildcache_copy = os.environ.get("SPACK_COPY_BUILDCACHE", None)
     if spack_buildcache_copy:
         buildcache_copies = {}
-        buildcache_copy_src_prefix = (
-            buildcache_destination.fetch_url
-            if buildcache_destination
-            else remote_mirror_override or remote_mirror_url
-        )
+        buildcache_copy_src_prefix = buildcache_destination.fetch_url
         buildcache_copy_dest_prefix = spack_buildcache_copy

     # Check for a list of "known broken" specs that we should not bother
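Concretely, environments must now name their binary mirror ``buildcache-destination``; a sketch, with a placeholder url:

.. code-block:: yaml

   spack:
     mirrors:
       buildcache-destination: s3://example-bucket/mirror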
@@ -878,55 +816,10 @@ def ensure_expected_target_path(path):
     if "broken-specs-url" in ci_config:
         broken_specs_url = ci_config["broken-specs-url"]

-    enable_artifacts_buildcache = False
-    if "enable-artifacts-buildcache" in ci_config:
-        tty.warn("Support for enable-artifacts-buildcache will be removed in Spack 0.23")
-        enable_artifacts_buildcache = ci_config["enable-artifacts-buildcache"]
-
     rebuild_index_enabled = True
     if "rebuild-index" in ci_config and ci_config["rebuild-index"] is False:
         rebuild_index_enabled = False

-    temp_storage_url_prefix = None
-    if "temporary-storage-url-prefix" in ci_config:
-        tty.warn("Support for temporary-storage-url-prefix will be removed in Spack 0.23")
-        temp_storage_url_prefix = ci_config["temporary-storage-url-prefix"]
-
-    # If a remote mirror override (alternate buildcache destination) was
-    # specified, add it here in case it has already built hashes we might
-    # generate.
-    # TODO: Remove this block in Spack 0.23
-    mirrors_to_check = None
-    if deprecated_mirror_config and remote_mirror_override:
-        if spack_pipeline_type == "spack_protected_branch":
-            # Overriding the main mirror in this case might result
-            # in skipping jobs on a release pipeline because specs are
-            # up to date in develop.  Eventually we want to notice and take
-            # advantage of this by scheduling a job to copy the spec from
-            # develop to the release, but until we have that, this makes
-            # sure we schedule a rebuild job if the spec isn't already in
-            # override mirror.
-            mirrors_to_check = {"override": remote_mirror_override}
-
-        # If we have a remote override and we want generate pipeline using
-        # --check-index-only, then the override mirror needs to be added to
-        # the configured mirrors when bindist.update() is run, or else we
-        # won't fetch its index and include in our local cache.
-        spack.mirror.add(
-            spack.mirror.Mirror(remote_mirror_override, name="ci_pr_mirror"),
-            cfg.default_modify_scope(),
-        )
-
-    # TODO: Remove this block in Spack 0.23
-    shared_pr_mirror = None
-    if deprecated_mirror_config and spack_pipeline_type == "spack_pull_request":
-        stack_name = os.environ.get("SPACK_CI_STACK_NAME", "")
-        shared_pr_mirror = url_util.join(SHARED_PR_MIRROR_URL, stack_name)
-        spack.mirror.add(
-            spack.mirror.Mirror(shared_pr_mirror, name="ci_shared_pr_mirror"),
-            cfg.default_modify_scope(),
-        )
-
     pipeline_artifacts_dir = artifacts_root
     if not pipeline_artifacts_dir:
         proj_dir = os.environ.get("CI_PROJECT_DIR", os.getcwd())
@@ -935,9 +828,8 @@ def ensure_expected_target_path(path):
     pipeline_artifacts_dir = os.path.abspath(pipeline_artifacts_dir)
     concrete_env_dir = os.path.join(pipeline_artifacts_dir, "concrete_environment")

-    # Now that we've added the mirrors we know about, they should be properly
-    # reflected in the environment manifest file, so copy that into the
-    # concrete environment directory, along with the spack.lock file.
+    # Copy the environment manifest file into the concrete environment directory,
+    # along with the spack.lock file.
     if not os.path.exists(concrete_env_dir):
         os.makedirs(concrete_env_dir)
     shutil.copyfile(env.manifest_path, os.path.join(concrete_env_dir, "spack.yaml"))
@@ -962,18 +854,12 @@ def ensure_expected_target_path(path):
         env_includes.extend(include_scopes)
         env_yaml_root["spack"]["include"] = [ensure_expected_target_path(i) for i in env_includes]

-        if "gitlab-ci" in env_yaml_root["spack"] and "ci" not in env_yaml_root["spack"]:
-            env_yaml_root["spack"]["ci"] = env_yaml_root["spack"].pop("gitlab-ci")
-            translate_deprecated_config(env_yaml_root["spack"]["ci"])
-
         with open(os.path.join(concrete_env_dir, "spack.yaml"), "w") as fd:
             fd.write(syaml.dump_config(env_yaml_root, default_flow_style=False))

     job_log_dir = os.path.join(pipeline_artifacts_dir, "logs")
     job_repro_dir = os.path.join(pipeline_artifacts_dir, "reproduction")
     job_test_dir = os.path.join(pipeline_artifacts_dir, "tests")
-    # TODO: Remove this line in Spack 0.23
-    local_mirror_dir = os.path.join(pipeline_artifacts_dir, "mirror")
     user_artifacts_dir = os.path.join(pipeline_artifacts_dir, "user_data")

     # We communicate relative paths to the downstream jobs to avoid issues in
@@ -987,8 +873,6 @@ def ensure_expected_target_path(path):
     rel_job_log_dir = os.path.relpath(job_log_dir, ci_project_dir)
     rel_job_repro_dir = os.path.relpath(job_repro_dir, ci_project_dir)
     rel_job_test_dir = os.path.relpath(job_test_dir, ci_project_dir)
-    # TODO: Remove this line in Spack 0.23
-    rel_local_mirror_dir = os.path.join(local_mirror_dir, ci_project_dir)
     rel_user_artifacts_dir = os.path.relpath(user_artifacts_dir, ci_project_dir)

     # Speed up staging by first fetching binary indices from all mirrors
@@ -1050,7 +934,7 @@ def ensure_expected_target_path(path):
             continue

         up_to_date_mirrors = bindist.get_mirrors_for_spec(
-            spec=release_spec, mirrors_to_check=mirrors_to_check, index_only=check_index_only
+            spec=release_spec, index_only=check_index_only
         )

         spec_record.rebuild = not up_to_date_mirrors
@@ -1094,11 +978,6 @@ def main_script_replacements(cmd):
             job_object["needs"] = []
             if spec_label in dependencies:
-                if enable_artifacts_buildcache:
-                    # Get dependencies transitively, so they're all
-                    # available in the artifacts buildcache.
-                    dep_jobs = [d for d in release_spec.traverse(deptype="all", root=False)]
-                else:
-                    # In this case, "needs" is only used for scheduling
-                    # purposes, so we only get the direct dependencies.
-                    dep_jobs = []
+                # In this case, "needs" is only used for scheduling
+                # purposes, so we only get the direct dependencies.
+                dep_jobs = []
@@ -1106,13 +985,7 @@ def main_script_replacements(cmd):
                     dep_jobs.append(spec_labels[dep_label])

                 job_object["needs"].extend(
-                    _format_job_needs(
-                        dep_jobs,
-                        build_group,
-                        prune_dag,
-                        rebuild_decisions,
-                        enable_artifacts_buildcache,
-                    )
+                    _format_job_needs(dep_jobs, build_group, prune_dag, rebuild_decisions)
                 )

             rebuild_spec = spec_record.rebuild
@@ -1194,19 +1067,6 @@ def main_script_replacements(cmd):
                 },
             )

-            # TODO: Remove this block in Spack 0.23
-            if enable_artifacts_buildcache:
-                bc_root = os.path.join(local_mirror_dir, "build_cache")
-                job_object["artifacts"]["paths"].extend(
-                    [
-                        os.path.join(bc_root, p)
-                        for p in [
-                            bindist.tarball_name(release_spec, ".spec.json"),
-                            bindist.tarball_directory_name(release_spec),
-                        ]
-                    ]
-                )
-
             job_object["stage"] = stage_name
             job_object["retry"] = {"max": 2, "when": JOB_RETRY_CONDITIONS}
             job_object["interruptible"] = True
@@ -1221,15 +1081,7 @@ def main_script_replacements(cmd):
             job_id += 1

     if print_summary:
-        _print_staging_summary(spec_labels, stages, mirrors_to_check, rebuild_decisions)
-
-    # Clean up remote mirror override if enabled
-    # TODO: Remove this block in Spack 0.23
-    if deprecated_mirror_config:
-        if remote_mirror_override:
-            spack.mirror.remove("ci_pr_mirror", cfg.default_modify_scope())
-        if spack_pipeline_type == "spack_pull_request":
-            spack.mirror.remove("ci_shared_pr_mirror", cfg.default_modify_scope())
+        _print_staging_summary(spec_labels, stages, rebuild_decisions)

     tty.debug(f"{job_id} build jobs generated in {stage_id} stages")
@@ -1251,7 +1103,7 @@ def main_script_replacements(cmd):
         "when": ["runner_system_failure", "stuck_or_timeout_failure", "script_failure"],
     }

-    if copy_only_pipeline and not config_deprecated:
+    if copy_only_pipeline:
         stage_names.append("copy")
         sync_job = copy.deepcopy(spack_ci_ir["jobs"]["copy"]["attributes"])
         sync_job["stage"] = "copy"
@@ -1261,17 +1113,12 @@ def main_script_replacements(cmd):
         if "variables" not in sync_job:
             sync_job["variables"] = {}

-        sync_job["variables"]["SPACK_COPY_ONLY_DESTINATION"] = (
-            buildcache_destination.fetch_url
-            if buildcache_destination
-            else remote_mirror_override or remote_mirror_url
-        )
+        sync_job["variables"]["SPACK_COPY_ONLY_DESTINATION"] = buildcache_destination.fetch_url

-        if "buildcache-source" in pipeline_mirrors:
-            buildcache_source = pipeline_mirrors["buildcache-source"].fetch_url
-        else:
-            # TODO: Remove this condition in Spack 0.23
-            buildcache_source = os.environ.get("SPACK_SOURCE_MIRROR", None)
+        if "buildcache-source" not in pipeline_mirrors:
+            raise SpackCIError("Copy-only pipelines require a mirror named 'buildcache-source'")
+
+        buildcache_source = pipeline_mirrors["buildcache-source"].fetch_url
         sync_job["variables"]["SPACK_BUILDCACHE_SOURCE"] = buildcache_source
         sync_job["dependencies"] = []
@@ -1279,27 +1126,6 @@ def main_script_replacements(cmd):
         job_id += 1

     if job_id > 0:
-        # TODO: Remove this block in Spack 0.23
-        if temp_storage_url_prefix:
-            # There were some rebuild jobs scheduled, so we will need to
-            # schedule a job to clean up the temporary storage location
-            # associated with this pipeline.
-            stage_names.append("cleanup-temp-storage")
-            cleanup_job = copy.deepcopy(spack_ci_ir["jobs"]["cleanup"]["attributes"])
-
-            cleanup_job["stage"] = "cleanup-temp-storage"
-            cleanup_job["when"] = "always"
-            cleanup_job["retry"] = service_job_retries
-            cleanup_job["interruptible"] = True
-
-            cleanup_job["script"] = _unpack_script(
-                cleanup_job["script"],
-                op=lambda cmd: cmd.replace("mirror_prefix", temp_storage_url_prefix),
-            )
-
-            cleanup_job["dependencies"] = []
-            output_object["cleanup"] = cleanup_job
-
         if (
             "script" in spack_ci_ir["jobs"]["signing"]["attributes"]
             and spack_pipeline_type == "spack_protected_branch"
@@ -1316,11 +1142,9 @@ def main_script_replacements(cmd):
             signing_job["interruptible"] = True
             if "variables" not in signing_job:
                 signing_job["variables"] = {}
-            signing_job["variables"]["SPACK_BUILDCACHE_DESTINATION"] = (
-                buildcache_destination.push_url  # need the s3 url for aws s3 sync
-                if buildcache_destination
-                else remote_mirror_override or remote_mirror_url
-            )
+            signing_job["variables"][
+                "SPACK_BUILDCACHE_DESTINATION"
+            ] = buildcache_destination.push_url
             signing_job["dependencies"] = []

             output_object["sign-pkgs"] = signing_job
@@ -1331,8 +1155,6 @@ def main_script_replacements(cmd):
             final_job = spack_ci_ir["jobs"]["reindex"]["attributes"]
             final_job["stage"] = "stage-rebuild-index"

-            target_mirror = remote_mirror_override or remote_mirror_url
-            if buildcache_destination:
-                target_mirror = buildcache_destination.push_url
+            target_mirror = buildcache_destination.push_url

             final_job["script"] = _unpack_script(
                 final_job["script"],
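The final ``rebuild-index`` job therefore always targets the push url of ``buildcache-destination``. A sketch of the emitted job, with a placeholder mirror url, consistent with the updated test expectation later in this commit:

.. code-block:: yaml

   rebuild-index:
     stage: stage-rebuild-index
     script:
     - spack buildcache update-index --keys s3://example-bucket/mirror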
@@ -1359,17 +1181,11 @@ def main_script_replacements(cmd):
         "SPACK_CONCRETE_ENV_DIR": rel_concrete_env_dir,
         "SPACK_VERSION": spack_version,
         "SPACK_CHECKOUT_VERSION": version_to_clone,
-        # TODO: Remove this line in Spack 0.23
-        "SPACK_REMOTE_MIRROR_URL": remote_mirror_url,
         "SPACK_JOB_LOG_DIR": rel_job_log_dir,
         "SPACK_JOB_REPRO_DIR": rel_job_repro_dir,
         "SPACK_JOB_TEST_DIR": rel_job_test_dir,
-        # TODO: Remove this line in Spack 0.23
-        "SPACK_LOCAL_MIRROR_DIR": rel_local_mirror_dir,
         "SPACK_PIPELINE_TYPE": str(spack_pipeline_type),
         "SPACK_CI_STACK_NAME": os.environ.get("SPACK_CI_STACK_NAME", "None"),
-        # TODO: Remove this line in Spack 0.23
-        "SPACK_CI_SHARED_PR_MIRROR_URL": shared_pr_mirror or "None",
         "SPACK_REBUILD_CHECK_UP_TO_DATE": str(prune_dag),
         "SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
         "SPACK_REQUIRE_SIGNING": os.environ.get("SPACK_REQUIRE_SIGNING", "False"),
@@ -1378,10 +1194,6 @@ def main_script_replacements(cmd):
     for item, val in output_vars.items():
         output_vars[item] = ensure_expected_target_path(val)

-    # TODO: Remove this block in Spack 0.23
-    if deprecated_mirror_config and remote_mirror_override:
-        (output_object["variables"]["SPACK_REMOTE_MIRROR_OVERRIDE"]) = remote_mirror_override
-
     spack_stack_name = os.environ.get("SPACK_CI_STACK_NAME", None)
     if spack_stack_name:
         output_object["variables"]["SPACK_CI_STACK_NAME"] = spack_stack_name
@@ -1408,13 +1220,6 @@ def main_script_replacements(cmd):
         noop_job["retry"] = 0
         noop_job["allow_failure"] = True

-        if copy_only_pipeline and config_deprecated:
-            tty.debug("Generating no-op job as copy-only is unsupported here.")
-            noop_job["script"] = [
-                'echo "copy-only pipelines are not supported with deprecated ci configs"'
-            ]
-            output_object = {"unsupported-copy": noop_job}
-        else:
-            tty.debug("No specs to rebuild, generating no-op job")
-            output_object = {"no-specs-to-rebuild": noop_job}
+        tty.debug("No specs to rebuild, generating no-op job")
+        output_object = {"no-specs-to-rebuild": noop_job}
@@ -2454,83 +2259,6 @@ def report_skipped(self, spec: spack.spec.Spec, report_dir: str, reason: Optional
     reporter.test_skipped_report(report_dir, spec, reason)

-def translate_deprecated_config(config):
-    # Remove all deprecated keys from config
-    mappings = config.pop("mappings", [])
-    match_behavior = config.pop("match_behavior", "first")
-
-    build_job = {}
-    if "image" in config:
-        build_job["image"] = config.pop("image")
-    if "tags" in config:
-        build_job["tags"] = config.pop("tags")
-    if "variables" in config:
-        build_job["variables"] = config.pop("variables")
-
-    # Scripts always override in old CI
-    if "before_script" in config:
-        build_job["before_script:"] = config.pop("before_script")
-    if "script" in config:
-        build_job["script:"] = config.pop("script")
-    if "after_script" in config:
-        build_job["after_script:"] = config.pop("after_script")
-
-    signing_job = None
-    if "signing-job-attributes" in config:
-        signing_job = {"signing-job": config.pop("signing-job-attributes")}
-
-    service_job_attributes = None
-    if "service-job-attributes" in config:
-        service_job_attributes = config.pop("service-job-attributes")
-
-    # If this config already has pipeline-gen do no more
-    if "pipeline-gen" in config:
-        return True if mappings or build_job or signing_job or service_job_attributes else False
-
-    config["target"] = "gitlab"
-
-    config["pipeline-gen"] = []
-    pipeline_gen = config["pipeline-gen"]
-
-    # Build Job
-    submapping = []
-    for section in mappings:
-        submapping_section = {"match": section["match"]}
-        if "runner-attributes" in section:
-            remapped_attributes = {}
-            if match_behavior == "first":
-                for key, value in section["runner-attributes"].items():
-                    # Scripts always override in old CI
-                    if key == "script":
-                        remapped_attributes["script:"] = value
-                    elif key == "before_script":
-                        remapped_attributes["before_script:"] = value
-                    elif key == "after_script":
-                        remapped_attributes["after_script:"] = value
-                    else:
-                        remapped_attributes[key] = value
-            else:
-                # Handle "merge" behavior by allowing scripts to merge in submapping section
-                remapped_attributes = section["runner-attributes"]
-            submapping_section["build-job"] = remapped_attributes
-
-        if "remove-attributes" in section:
-            # Old format only allowed tags in this section, so no extra checks are needed
-            submapping_section["build-job-remove"] = section["remove-attributes"]
-        submapping.append(submapping_section)
-    pipeline_gen.append({"submapping": submapping, "match_behavior": match_behavior})
-
-    if build_job:
-        pipeline_gen.append({"build-job": build_job})
-
-    # Signing Job
-    if signing_job:
-        pipeline_gen.append(signing_job)
-
-    # Service Jobs
-    if service_job_attributes:
-        pipeline_gen.append({"reindex-job": service_job_attributes})
-        pipeline_gen.append({"noop-job": service_job_attributes})
-        pipeline_gen.append({"cleanup-job": service_job_attributes})
-
-    return True
+class SpackCIError(spack.error.SpackError):
+    def __init__(self, msg):
+        super().__init__(msg)
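With ``translate_deprecated_config`` removed, old ``gitlab-ci`` sections are no longer rewritten on the fly. For anyone migrating by hand, this sketch shows the mapping the deleted function used to perform; the package name and tag are placeholders:

.. code-block:: yaml

   # old format, no longer accepted
   gitlab-ci:
     mappings:
     - match: [ zlib ]
       runner-attributes:
         tags: [ docker ]

   # new-style equivalent
   ci:
     target: gitlab
     pipeline-gen:
     - submapping:
       - match: [ zlib ]
         build-job:
           tags: [ docker ]
       match_behavior: first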


@@ -62,13 +62,6 @@ def setup_parser(subparser):
         "path to the file where generated jobs file should be written. "
         "default is .gitlab-ci.yml in the root of the repository",
     )
-    generate.add_argument(
-        "--copy-to",
-        default=None,
-        help="path to additional directory for job files\n\n"
-        "this option provides an absolute path to a directory where the generated "
-        "jobs yaml file should be copied. default is not to copy",
-    )
     generate.add_argument(
         "--optimize",
         action="store_true",
@@ -83,12 +76,6 @@ def setup_parser(subparser):
         default=False,
         help="(DEPRECATED) disable DAG scheduling (use 'plain' dependencies)",
     )
-    generate.add_argument(
-        "--buildcache-destination",
-        default=None,
-        help="override the mirror configured in the environment\n\n"
-        "allows for pushing binaries from the generated pipeline to a different location",
-    )
     prune_group = generate.add_mutually_exclusive_group()
     prune_group.add_argument(
         "--prune-dag",
@@ -214,20 +201,10 @@ def ci_generate(args):
     env = spack.cmd.require_active_env(cmd_name="ci generate")

-    if args.copy_to:
-        tty.warn("The flag --copy-to is deprecated and will be removed in Spack 0.23")
-
-    if args.buildcache_destination:
-        tty.warn(
-            "The flag --buildcache-destination is deprecated and will be removed in Spack 0.23"
-        )
-
     output_file = args.output_file
-    copy_yaml_to = args.copy_to
     prune_dag = args.prune_dag
     index_only = args.index_only
     artifacts_root = args.artifacts_root
-    buildcache_destination = args.buildcache_destination

     if not output_file:
         output_file = os.path.abspath(".gitlab-ci.yml")
@@ -245,15 +222,8 @@ def ci_generate(args):
         prune_dag=prune_dag,
         check_index_only=index_only,
         artifacts_root=artifacts_root,
-        remote_mirror_override=buildcache_destination,
     )

-    if copy_yaml_to:
-        copy_to_dir = os.path.dirname(copy_yaml_to)
-        if not os.path.exists(copy_to_dir):
-            os.makedirs(copy_to_dir)
-        shutil.copyfile(output_file, copy_yaml_to)
-

 def ci_reindex(args):
     """rebuild the buildcache index for the remote mirror
@@ -298,22 +268,13 @@ def ci_rebuild(args):
     job_log_dir = os.environ.get("SPACK_JOB_LOG_DIR")
     job_test_dir = os.environ.get("SPACK_JOB_TEST_DIR")
     repro_dir = os.environ.get("SPACK_JOB_REPRO_DIR")
-    # TODO: Remove this in Spack 0.23
-    local_mirror_dir = os.environ.get("SPACK_LOCAL_MIRROR_DIR")
     concrete_env_dir = os.environ.get("SPACK_CONCRETE_ENV_DIR")
-    ci_pipeline_id = os.environ.get("CI_PIPELINE_ID")
     ci_job_name = os.environ.get("CI_JOB_NAME")
     signing_key = os.environ.get("SPACK_SIGNING_KEY")
     job_spec_pkg_name = os.environ.get("SPACK_JOB_SPEC_PKG_NAME")
     job_spec_dag_hash = os.environ.get("SPACK_JOB_SPEC_DAG_HASH")
     spack_pipeline_type = os.environ.get("SPACK_PIPELINE_TYPE")
-    # TODO: Remove this in Spack 0.23
-    remote_mirror_override = os.environ.get("SPACK_REMOTE_MIRROR_OVERRIDE")
-    # TODO: Remove this in Spack 0.23
-    remote_mirror_url = os.environ.get("SPACK_REMOTE_MIRROR_URL")
     spack_ci_stack_name = os.environ.get("SPACK_CI_STACK_NAME")
-    # TODO: Remove this in Spack 0.23
-    shared_pr_mirror_url = os.environ.get("SPACK_CI_SHARED_PR_MIRROR_URL")
     rebuild_everything = os.environ.get("SPACK_REBUILD_EVERYTHING")
     require_signing = os.environ.get("SPACK_REQUIRE_SIGNING")
@@ -333,12 +294,10 @@ def ci_rebuild(args):
     job_log_dir = os.path.join(ci_project_dir, job_log_dir)
     job_test_dir = os.path.join(ci_project_dir, job_test_dir)
     repro_dir = os.path.join(ci_project_dir, repro_dir)
-    local_mirror_dir = os.path.join(ci_project_dir, local_mirror_dir)
     concrete_env_dir = os.path.join(ci_project_dir, concrete_env_dir)

     # Debug print some of the key environment variables we should have received
     tty.debug("pipeline_artifacts_dir = {0}".format(pipeline_artifacts_dir))
-    tty.debug("remote_mirror_url = {0}".format(remote_mirror_url))
     tty.debug("job_spec_pkg_name = {0}".format(job_spec_pkg_name))

     # Query the environment manifest to find out whether we're reporting to a
@@ -370,51 +329,11 @@ def ci_rebuild(args):
     full_rebuild = True if rebuild_everything and rebuild_everything.lower() == "true" else False

     pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
-    deprecated_mirror_config = False
     buildcache_destination = None
-    if "buildcache-destination" in pipeline_mirrors:
-        buildcache_destination = pipeline_mirrors["buildcache-destination"]
-    else:
-        deprecated_mirror_config = True
-        # TODO: This will be an error in Spack 0.23
+    if "buildcache-destination" not in pipeline_mirrors:
+        tty.die("spack ci rebuild requires a mirror named 'buildcache-destination")

-    # If no override url exists, then just push binary package to the
-    # normal remote mirror url.
-    # TODO: Remove in Spack 0.23
-    buildcache_mirror_url = remote_mirror_override or remote_mirror_url
-    if buildcache_destination:
-        buildcache_mirror_url = buildcache_destination.push_url
-
-    # Figure out what is our temporary storage mirror: Is it artifacts
-    # buildcache?  Or temporary-storage-url-prefix?  In some cases we need to
-    # force something or pipelines might not have a way to propagate build
-    # artifacts from upstream to downstream jobs.
-    # TODO: Remove this in Spack 0.23
-    pipeline_mirror_url = None
-
-    # TODO: Remove this in Spack 0.23
-    temp_storage_url_prefix = None
-    if "temporary-storage-url-prefix" in ci_config:
-        temp_storage_url_prefix = ci_config["temporary-storage-url-prefix"]
-        pipeline_mirror_url = url_util.join(temp_storage_url_prefix, ci_pipeline_id)
-
-    # TODO: Remove this in Spack 0.23
-    enable_artifacts_mirror = False
-    if "enable-artifacts-buildcache" in ci_config:
-        enable_artifacts_mirror = ci_config["enable-artifacts-buildcache"]
-    if enable_artifacts_mirror or (
-        spack_is_pr_pipeline and not enable_artifacts_mirror and not temp_storage_url_prefix
-    ):
-        # If you explicitly enabled the artifacts buildcache feature, or
-        # if this is a PR pipeline but you did not enable either of the
-        # per-pipeline temporary storage features, we force the use of
-        # artifacts buildcache.  Otherwise jobs will not have binary
-        # dependencies from previous stages available since we do not
-        # allow pushing binaries to the remote mirror during PR pipelines.
-        enable_artifacts_mirror = True
-        pipeline_mirror_url = url_util.path_to_file_url(local_mirror_dir)
-        mirror_msg = "artifact buildcache enabled, mirror url: {0}".format(pipeline_mirror_url)
-        tty.debug(mirror_msg)
+    buildcache_destination = pipeline_mirrors["buildcache-destination"]

     # Get the concrete spec to be built by this job.
     try:
@@ -489,48 +408,7 @@ def ci_rebuild(args):
         fd.write(spack_info.encode("utf8"))
         fd.write(b"\n")

-    pipeline_mirrors = []
-
-    # If we decided there should be a temporary storage mechanism, add that
-    # mirror now so it's used when we check for a hash match already
-    # built for this spec.
-    # TODO: Remove this block in Spack 0.23
-    if pipeline_mirror_url:
-        mirror = spack.mirror.Mirror(pipeline_mirror_url, name=spack_ci.TEMP_STORAGE_MIRROR_NAME)
-        spack.mirror.add(mirror, cfg.default_modify_scope())
-        pipeline_mirrors.append(pipeline_mirror_url)
-
-    # Check configured mirrors for a built spec with a matching hash
-    # TODO: Remove this block in Spack 0.23
-    mirrors_to_check = None
-    if remote_mirror_override:
-        if spack_pipeline_type == "spack_protected_branch":
-            # Passing "mirrors_to_check" below means we *only* look in the override
-            # mirror to see if we should skip building, which is what we want.
-            mirrors_to_check = {"override": remote_mirror_override}
-
-        # Adding this mirror to the list of configured mirrors means dependencies
-        # could be installed from either the override mirror or any other configured
-        # mirror (e.g. remote_mirror_url which is defined in the environment or
-        # pipeline_mirror_url), which is also what we want.
-        spack.mirror.add(
-            spack.mirror.Mirror(remote_mirror_override, name="mirror_override"),
-            cfg.default_modify_scope(),
-        )
-        pipeline_mirrors.append(remote_mirror_override)
-
-    # TODO: Remove this in Spack 0.23
-    if deprecated_mirror_config and spack_pipeline_type == "spack_pull_request":
-        if shared_pr_mirror_url != "None":
-            pipeline_mirrors.append(shared_pr_mirror_url)
-
-    matches = (
-        None
-        if full_rebuild
-        else bindist.get_mirrors_for_spec(
-            job_spec, mirrors_to_check=mirrors_to_check, index_only=False
-        )
-    )
+    matches = None if full_rebuild else bindist.get_mirrors_for_spec(job_spec, index_only=False)

     if matches:
         # Got a hash match on at least one configured mirror.  All
@@ -542,25 +420,10 @@ def ci_rebuild(args):
         tty.msg("No need to rebuild {0}, found hash match at: ".format(job_spec_pkg_name))
         for match in matches:
             tty.msg("    {0}".format(match["mirror_url"]))
-        # TODO: Remove this block in Spack 0.23
-        if enable_artifacts_mirror:
-            matching_mirror = matches[0]["mirror_url"]
-            build_cache_dir = os.path.join(local_mirror_dir, "build_cache")
-            tty.debug("Getting {0} buildcache from {1}".format(job_spec_pkg_name, matching_mirror))
-            tty.debug("Downloading to {0}".format(build_cache_dir))
-            bindist.download_single_spec(job_spec, build_cache_dir, mirror_url=matching_mirror)
-
         # Now we are done and successful
         return 0

-    # Before beginning the install, if this is a "rebuild everything" pipeline, we
-    # only want to keep the mirror being used by the current pipeline as it's binary
-    # package destination.  This ensures that the when we rebuild everything, we only
-    # consume binary dependencies built in this pipeline.
-    # TODO: Remove this in Spack 0.23
-    if deprecated_mirror_config and full_rebuild:
-        spack_ci.remove_other_mirrors(pipeline_mirrors, cfg.default_modify_scope())
-
     # No hash match anywhere means we need to rebuild spec

     # Start with spack arguments
@@ -681,17 +544,11 @@ def ci_rebuild(args):
             cdash_handler.copy_test_results(reports_dir, job_test_dir)

     if install_exit_code == 0:
-        # If the install succeeded, push it to one or more mirrors. Failure to push to any mirror
-        # will result in a non-zero exit code. Pushing is best-effort.
-        mirror_urls = [buildcache_mirror_url]
-
-        # TODO: Remove this block in Spack 0.23
-        if pipeline_mirror_url:
-            mirror_urls.append(pipeline_mirror_url)
-
+        # If the install succeeded, push it to the buildcache destination. Failure to push
+        # will result in a non-zero exit code. Pushing is best-effort.
         for result in spack_ci.create_buildcache(
             input_spec=job_spec,
-            destination_mirror_urls=mirror_urls,
+            destination_mirror_urls=[buildcache_destination.push_url],
             sign_binaries=spack_ci.can_sign_binaries(),
         ):
             if not result.success:


@@ -11,8 +11,6 @@
 from llnl.util.lang import union_dicts

-import spack.schema.gitlab_ci
-
 # Schema for script fields
 # List of lists and/or strings
 # This is similar to what is allowed in
@@ -137,39 +135,8 @@ def job_schema(name: str):
     }
 )

-# TODO: Remove in Spack 0.23
-ci_properties = {
-    "anyOf": [
-        {
-            "type": "object",
-            "additionalProperties": False,
-            # "required": ["mappings"],
-            "properties": union_dicts(
-                core_shared_properties, {"enable-artifacts-buildcache": {"type": "boolean"}}
-            ),
-        },
-        {
-            "type": "object",
-            "additionalProperties": False,
-            # "required": ["mappings"],
-            "properties": union_dicts(
-                core_shared_properties, {"temporary-storage-url-prefix": {"type": "string"}}
-            ),
-        },
-    ]
-}
-
 #: Properties for inclusion in other schemas
-properties: Dict[str, Any] = {
-    "ci": {
-        "oneOf": [
-            # TODO: Replace with core-shared-properties in Spack 0.23
-            ci_properties,
-            # Allow legacy format under `ci` for `config update ci`
-            spack.schema.gitlab_ci.gitlab_ci_properties,
-        ]
-    }
-}
+properties: Dict[str, Any] = {"ci": core_shared_properties}

 #: Full schema with metadata
 schema = {
@@ -179,21 +146,3 @@ def job_schema(name: str):
     "additionalProperties": False,
     "properties": properties,
 }
-
-
-def update(data):
-    import llnl.util.tty as tty
-
-    import spack.ci
-    import spack.environment as ev
-
-    # Warn if deprecated section is still in the environment
-    ci_env = ev.active_environment()
-    if ci_env:
-        env_config = ci_env.manifest[ev.TOP_LEVEL_KEY]
-        if "gitlab-ci" in env_config:
-            tty.die("Error: `gitlab-ci` section detected with `ci`, these are not compatible")
-
-    # Detect if the ci section is using the new pipeline-gen
-    # If it is, assume it has already been converted
-    return spack.ci.translate_deprecated_config(data)


@@ -12,7 +12,6 @@
 from llnl.util.lang import union_dicts

-import spack.schema.gitlab_ci  # DEPRECATED
 import spack.schema.merged

 from .spec_list import spec_list_schema
@@ -26,8 +25,6 @@
             "default": {},
             "additionalProperties": False,
             "properties": union_dicts(
-                # Include deprecated "gitlab-ci" section
-                spack.schema.gitlab_ci.properties,
                 # merged configuration scope schemas
                 spack.schema.merged.properties,
                 # extra environment schema properties
@@ -58,15 +55,6 @@ def update(data):
     Returns:
         True if data was changed, False otherwise
     """
-    import spack.ci
-
-    if "gitlab-ci" in data:
-        data["ci"] = data.pop("gitlab-ci")
-
-    if "ci" in data:
-        return spack.ci.translate_deprecated_config(data["ci"])
-
     # There are not currently any deprecated attributes in this section
     # that have not been removed
     return False


@ -1,125 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for gitlab-ci.yaml configuration file.
.. literalinclude:: ../spack/schema/gitlab_ci.py
:lines: 15-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
image_schema = {
"oneOf": [
{"type": "string"},
{
"type": "object",
"properties": {
"name": {"type": "string"},
"entrypoint": {"type": "array", "items": {"type": "string"}},
},
},
]
}
runner_attributes_schema_items = {
"image": image_schema,
"tags": {"type": "array", "items": {"type": "string"}},
"variables": {"type": "object", "patternProperties": {r"[\w\d\-_\.]+": {"type": "string"}}},
"before_script": {"type": "array", "items": {"type": "string"}},
"script": {"type": "array", "items": {"type": "string"}},
"after_script": {"type": "array", "items": {"type": "string"}},
}
runner_selector_schema = {
"type": "object",
"additionalProperties": True,
"required": ["tags"],
"properties": runner_attributes_schema_items,
}
remove_attributes_schema = {
"type": "object",
"additionalProperties": False,
"required": ["tags"],
"properties": {"tags": {"type": "array", "items": {"type": "string"}}},
}
core_shared_properties = union_dicts(
runner_attributes_schema_items,
{
"bootstrap": {
"type": "array",
"items": {
"anyOf": [
{"type": "string"},
{
"type": "object",
"additionalProperties": False,
"required": ["name"],
"properties": {
"name": {"type": "string"},
"compiler-agnostic": {"type": "boolean", "default": False},
},
},
]
},
},
"match_behavior": {"type": "string", "enum": ["first", "merge"], "default": "first"},
"mappings": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": False,
"required": ["match"],
"properties": {
"match": {"type": "array", "items": {"type": "string"}},
"remove-attributes": remove_attributes_schema,
"runner-attributes": runner_selector_schema,
},
},
},
"service-job-attributes": runner_selector_schema,
"signing-job-attributes": runner_selector_schema,
"rebuild-index": {"type": "boolean"},
"broken-specs-url": {"type": "string"},
"broken-tests-packages": {"type": "array", "items": {"type": "string"}},
},
)
gitlab_ci_properties = {
"anyOf": [
{
"type": "object",
"additionalProperties": False,
"required": ["mappings"],
"properties": union_dicts(
core_shared_properties, {"enable-artifacts-buildcache": {"type": "boolean"}}
),
},
{
"type": "object",
"additionalProperties": False,
"required": ["mappings"],
"properties": union_dicts(
core_shared_properties, {"temporary-storage-url-prefix": {"type": "string"}}
),
},
]
}
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {"gitlab-ci": gitlab_ci_properties}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack gitlab-ci configuration file schema",
"type": "object",
"additionalProperties": False,
"properties": properties,
}
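
The `anyOf` in the deleted schema above is what made `enable-artifacts-buildcache` and `temporary-storage-url-prefix` mutually exclusive: each branch sets `additionalProperties: False` and admits only one of the two keys. A standalone sketch of that mechanism with a pared-down schema (not the Spack schema itself):

    import jsonschema

    # Pared-down version of the anyOf above: each branch forbids the
    # other branch's extra key via additionalProperties: False.
    schema = {
        "anyOf": [
            {
                "type": "object",
                "additionalProperties": False,
                "required": ["mappings"],
                "properties": {
                    "mappings": {"type": "array"},
                    "enable-artifacts-buildcache": {"type": "boolean"},
                },
            },
            {
                "type": "object",
                "additionalProperties": False,
                "required": ["mappings"],
                "properties": {
                    "mappings": {"type": "array"},
                    "temporary-storage-url-prefix": {"type": "string"},
                },
            },
        ]
    }

    # Either key alone validates against its branch.
    jsonschema.validate({"mappings": [], "enable-artifacts-buildcache": True}, schema)
    jsonschema.validate({"mappings": [], "temporary-storage-url-prefix": "file:///t"}, schema)

    try:
        # Setting both keys fails both branches, hence the whole anyOf.
        jsonschema.validate(
            {
                "mappings": [],
                "enable-artifacts-buildcache": True,
                "temporary-storage-url-prefix": "file:///t",
            },
            schema,
        )
    except jsonschema.ValidationError:
        pass  # expected: the two options were mutually exclusive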

View File

@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details. # Spack Project Developers. See the top-level COPYRIGHT file for details.
# #
# SPDX-License-Identifier: (Apache-2.0 OR MIT) # SPDX-License-Identifier: (Apache-2.0 OR MIT)
import filecmp
import json import json
import os import os
import pathlib import pathlib
@ -27,7 +26,6 @@
import spack.util.spack_yaml as syaml import spack.util.spack_yaml as syaml
from spack.cmd.ci import FAILED_CREATE_BUILDCACHE_CODE from spack.cmd.ci import FAILED_CREATE_BUILDCACHE_CODE
from spack.schema.buildcache_spec import schema as specfile_schema from spack.schema.buildcache_spec import schema as specfile_schema
from spack.schema.ci import schema as ci_schema
from spack.schema.database_index import schema as db_idx_schema from spack.schema.database_index import schema as db_idx_schema
from spack.spec import Spec from spack.spec import Spec
@ -197,7 +195,7 @@ def test_ci_generate_with_env(ci_generate_test, tmp_path, mock_binary_index):
- matrix: - matrix:
- [$old-gcc-pkgs] - [$old-gcc-pkgs]
mirrors: mirrors:
some-mirror: {mirror_url} buildcache-destination: {mirror_url}
ci: ci:
pipeline-gen: pipeline-gen:
- submapping: - submapping:
@ -239,7 +237,9 @@ def test_ci_generate_with_env(ci_generate_test, tmp_path, mock_binary_index):
assert "rebuild-index" in yaml_contents assert "rebuild-index" in yaml_contents
rebuild_job = yaml_contents["rebuild-index"] rebuild_job = yaml_contents["rebuild-index"]
assert rebuild_job["script"][0] == f"spack buildcache update-index --keys {mirror_url}" assert (
rebuild_job["script"][0] == f"spack buildcache update-index --keys {mirror_url.as_uri()}"
)
assert rebuild_job["custom_attribute"] == "custom!" assert rebuild_job["custom_attribute"] == "custom!"
assert "variables" in yaml_contents assert "variables" in yaml_contents
@ -249,31 +249,28 @@ def test_ci_generate_with_env(ci_generate_test, tmp_path, mock_binary_index):
def test_ci_generate_with_env_missing_section(ci_generate_test, tmp_path, mock_binary_index): def test_ci_generate_with_env_missing_section(ci_generate_test, tmp_path, mock_binary_index):
"""Make sure we get a reasonable message if we omit gitlab-ci section""" """Make sure we get a reasonable message if we omit gitlab-ci section"""
_, _, output = ci_generate_test( env_yaml = f"""\
f"""\
spack: spack:
specs: specs:
- archive-files - archive-files
mirrors: mirrors:
some-mirror: {tmp_path / 'ci-mirror'} buildcache-destination: {tmp_path / 'ci-mirror'}
""", """
fail_on_error=False, expect = "Environment does not have a `ci` configuration"
) with pytest.raises(ci.SpackCIError, match=expect):
assert "Environment does not have `ci` a configuration" in output ci_generate_test(env_yaml)
def test_ci_generate_with_cdash_token(ci_generate_test, tmp_path, mock_binary_index, monkeypatch): def test_ci_generate_with_cdash_token(ci_generate_test, tmp_path, mock_binary_index, monkeypatch):
"""Make sure we it doesn't break if we configure cdash""" """Make sure we it doesn't break if we configure cdash"""
monkeypatch.setenv("SPACK_CDASH_AUTH_TOKEN", "notreallyatokenbutshouldnotmatter") monkeypatch.setenv("SPACK_CDASH_AUTH_TOKEN", "notreallyatokenbutshouldnotmatter")
backup_file = tmp_path / "backup-ci.yml"
spack_yaml_content = f"""\ spack_yaml_content = f"""\
spack: spack:
specs: specs:
- archive-files - archive-files
mirrors: mirrors:
some-mirror: {tmp_path / "ci-mirror"} buildcache-destination: {tmp_path / "ci-mirror"}
ci: ci:
enable-artifacts-buildcache: True
pipeline-gen: pipeline-gen:
- submapping: - submapping:
- match: - match:
@ -288,16 +285,15 @@ def test_ci_generate_with_cdash_token(ci_generate_test, tmp_path, mock_binary_in
project: Not used project: Not used
site: Nothing site: Nothing
""" """
spack_yaml, original_file, output = ci_generate_test( spack_yaml, original_file, output = ci_generate_test(spack_yaml_content)
spack_yaml_content, "--copy-to", str(backup_file) yaml_contents = syaml.load(original_file.read_text())
)
# That fake token should still have resulted in being unable to # That fake token should have resulted in being unable to
# register build group with cdash, but the workload should # register build group with cdash, but the workload should
# still have been generated. # still have been generated.
assert "Problem populating buildgroup" in output assert "Problem populating buildgroup" in output
assert backup_file.exists() expected_keys = ["rebuild-index", "stages", "variables", "workflow"]
assert filecmp.cmp(str(original_file), str(backup_file)) assert all([key in yaml_contents.keys() for key in expected_keys])
def test_ci_generate_with_custom_settings( def test_ci_generate_with_custom_settings(
@ -312,7 +308,7 @@ def test_ci_generate_with_custom_settings(
specs: specs:
- archive-files - archive-files
mirrors: mirrors:
some-mirror: {tmp_path / "ci-mirror"} buildcache-destination: {tmp_path / "ci-mirror"}
ci: ci:
pipeline-gen: pipeline-gen:
- submapping: - submapping:
@ -387,9 +383,8 @@ def test_ci_generate_pkg_with_deps(ci_generate_test, tmp_path, ci_base_environme
specs: specs:
- flatten-deps - flatten-deps
mirrors: mirrors:
some-mirror: {tmp_path / 'ci-mirror'} buildcache-destination: {tmp_path / 'ci-mirror'}
ci: ci:
enable-artifacts-buildcache: True
pipeline-gen: pipeline-gen:
- submapping: - submapping:
- match: - match:
@ -422,13 +417,8 @@ def test_ci_generate_pkg_with_deps(ci_generate_test, tmp_path, ci_base_environme
def test_ci_generate_for_pr_pipeline(ci_generate_test, tmp_path, monkeypatch): def test_ci_generate_for_pr_pipeline(ci_generate_test, tmp_path, monkeypatch):
"""Test that PR pipelines do not include a final stage job for """Test generation of a PR pipeline with disabled rebuild-index"""
rebuilding the mirror index, even if that job is specifically
configured.
"""
monkeypatch.setenv("SPACK_PIPELINE_TYPE", "spack_pull_request") monkeypatch.setenv("SPACK_PIPELINE_TYPE", "spack_pull_request")
monkeypatch.setenv("SPACK_PR_BRANCH", "fake-test-branch")
monkeypatch.setattr(spack.ci, "SHARED_PR_MIRROR_URL", f"{tmp_path / 'shared-pr-mirror'}")
spack_yaml, outputfile, _ = ci_generate_test( spack_yaml, outputfile, _ = ci_generate_test(
f"""\ f"""\
@ -436,9 +426,8 @@ def test_ci_generate_for_pr_pipeline(ci_generate_test, tmp_path, monkeypatch):
specs: specs:
- flatten-deps - flatten-deps
mirrors: mirrors:
some-mirror: {tmp_path / 'ci-mirror'} buildcache-destination: {tmp_path / 'ci-mirror'}
ci: ci:
enable-artifacts-buildcache: True
pipeline-gen: pipeline-gen:
- submapping: - submapping:
- match: - match:
@ -474,7 +463,7 @@ def test_ci_generate_with_external_pkg(ci_generate_test, tmp_path, monkeypatch):
- archive-files - archive-files
- externaltest - externaltest
mirrors: mirrors:
some-mirror: {tmp_path / "ci-mirror"} buildcache-destination: {tmp_path / "ci-mirror"}
ci: ci:
pipeline-gen: pipeline-gen:
- submapping: - submapping:
@ -540,7 +529,6 @@ def create_rebuild_env(
broken_specs_path = scratch / "naughty-list" broken_specs_path = scratch / "naughty-list"
mirror_url = mirror_dir.as_uri() mirror_url = mirror_dir.as_uri()
temp_storage_url = (tmp_path / "temp-storage").as_uri()
ci_job_url = "https://some.domain/group/project/-/jobs/42" ci_job_url = "https://some.domain/group/project/-/jobs/42"
ci_pipeline_url = "https://some.domain/group/project/-/pipelines/7" ci_pipeline_url = "https://some.domain/group/project/-/pipelines/7"
@ -555,11 +543,10 @@ def create_rebuild_env(
specs: specs:
- $packages - $packages
mirrors: mirrors:
test-mirror: {mirror_dir} buildcache-destination: {mirror_dir}
ci: ci:
broken-specs-url: {broken_specs_path.as_uri()} broken-specs-url: {broken_specs_path.as_uri()}
broken-tests-packages: {json.dumps([pkg_name] if broken_tests else [])} broken-tests-packages: {json.dumps([pkg_name] if broken_tests else [])}
temporary-storage-url-prefix: {temp_storage_url}
pipeline-gen: pipeline-gen:
- submapping: - submapping:
- match: - match:
@ -711,7 +698,7 @@ def test_ci_require_signing(
specs: specs:
- archive-files - archive-files
mirrors: mirrors:
test-mirror: {tmp_path / "ci-mirror"} buildcache-destination: {tmp_path / "ci-mirror"}
ci: ci:
pipeline-gen: pipeline-gen:
- submapping: - submapping:
@ -759,9 +746,8 @@ def test_ci_nothing_to_rebuild(
specs: specs:
- $packages - $packages
mirrors: mirrors:
test-mirror: {mirror_url} buildcache-destination: {mirror_url}
ci: ci:
enable-artifacts-buildcache: true
pipeline-gen: pipeline-gen:
- submapping: - submapping:
- match: - match:
@ -788,103 +774,20 @@ def test_ci_nothing_to_rebuild(
"SPACK_JOB_LOG_DIR": "log_dir", "SPACK_JOB_LOG_DIR": "log_dir",
"SPACK_JOB_REPRO_DIR": "repro_dir", "SPACK_JOB_REPRO_DIR": "repro_dir",
"SPACK_JOB_TEST_DIR": "test_dir", "SPACK_JOB_TEST_DIR": "test_dir",
"SPACK_LOCAL_MIRROR_DIR": str(mirror_dir),
"SPACK_CONCRETE_ENV_DIR": str(tmp_path), "SPACK_CONCRETE_ENV_DIR": str(tmp_path),
"SPACK_JOB_SPEC_DAG_HASH": env.concrete_roots()[0].dag_hash(), "SPACK_JOB_SPEC_DAG_HASH": env.concrete_roots()[0].dag_hash(),
"SPACK_JOB_SPEC_PKG_NAME": "archive-files", "SPACK_JOB_SPEC_PKG_NAME": "archive-files",
"SPACK_COMPILER_ACTION": "NONE", "SPACK_COMPILER_ACTION": "NONE",
"SPACK_REMOTE_MIRROR_URL": mirror_url,
} }
) )
def fake_dl_method(spec, *args, **kwargs):
print("fake download buildcache {0}".format(spec.name))
monkeypatch.setattr(spack.binary_distribution, "download_single_spec", fake_dl_method)
ci_out = ci_cmd("rebuild", output=str) ci_out = ci_cmd("rebuild", output=str)
assert "No need to rebuild archive-files" in ci_out assert "No need to rebuild archive-files" in ci_out
assert "fake download buildcache archive-files" in ci_out
env_cmd("deactivate") env_cmd("deactivate")
def test_ci_generate_mirror_override(
tmp_path: pathlib.Path,
mutable_mock_env_path,
install_mockery,
mock_fetch,
mock_binary_index,
ci_base_environment,
):
"""Ensure that protected pipelines using --buildcache-destination do not
skip building specs that are not in the override mirror when they are
found in the main mirror."""
os.environ.update({"SPACK_PIPELINE_TYPE": "spack_protected_branch"})
mirror_url = (tmp_path / "mirror").as_uri()
with open(tmp_path / "spack.yaml", "w") as f:
f.write(
f"""
spack:
definitions:
- packages: [patchelf]
specs:
- $packages
mirrors:
test-mirror: {mirror_url}
ci:
pipeline-gen:
- submapping:
- match:
- patchelf
build-job:
tags:
- donotcare
image: donotcare
- cleanup-job:
tags:
- nonbuildtag
image: basicimage
"""
)
with working_dir(tmp_path):
env_cmd("create", "test", "./spack.yaml")
first_ci_yaml = str(tmp_path / ".gitlab-ci-1.yml")
second_ci_yaml = str(tmp_path / ".gitlab-ci-2.yml")
with ev.read("test"):
install_cmd()
buildcache_cmd("push", "-u", mirror_url, "patchelf")
buildcache_cmd("update-index", mirror_url, output=str)
# This generate should not trigger a rebuild of patchelf, since it's in
# the main mirror referenced in the environment.
ci_cmd("generate", "--check-index-only", "--output-file", first_ci_yaml)
# Because we used a mirror override (--buildcache-destination) on a
# spack protected pipeline, we expect to only look in the override
# mirror for the spec, and thus the patchelf job should be generated in
# this pipeline
ci_cmd(
"generate",
"--check-index-only",
"--output-file",
second_ci_yaml,
"--buildcache-destination",
(tmp_path / "does-not-exist").as_uri(),
)
with open(first_ci_yaml) as fd1:
first_yaml = fd1.read()
assert "no-specs-to-rebuild" in first_yaml
with open(second_ci_yaml) as fd2:
second_yaml = fd2.read()
assert "no-specs-to-rebuild" not in second_yaml
@pytest.mark.disable_clean_stage_check @pytest.mark.disable_clean_stage_check
def test_push_to_build_cache( def test_push_to_build_cache(
tmp_path: pathlib.Path, tmp_path: pathlib.Path,
@ -911,9 +814,8 @@ def test_push_to_build_cache(
specs: specs:
- $packages - $packages
mirrors: mirrors:
test-mirror: {mirror_url} buildcache-destination: {mirror_url}
ci: ci:
enable-artifacts-buildcache: True
pipeline-gen: pipeline-gen:
- submapping: - submapping:
- match: - match:
@ -1049,7 +951,7 @@ def test_ci_generate_override_runner_attrs(
- flatten-deps - flatten-deps
- pkg-a - pkg-a
mirrors: mirrors:
some-mirror: {tmp_path / "ci-mirror"} buildcache-destination: {tmp_path / "ci-mirror"}
ci: ci:
pipeline-gen: pipeline-gen:
- match_behavior: {match_behavior} - match_behavior: {match_behavior}
@ -1189,7 +1091,7 @@ def test_ci_rebuild_index(
specs: specs:
- callpath - callpath
mirrors: mirrors:
test-mirror: {mirror_url} buildcache-destination: {mirror_url}
ci: ci:
pipeline-gen: pipeline-gen:
- submapping: - submapping:
@ -1245,7 +1147,7 @@ def fake_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
- archive-files - archive-files
- callpath - callpath
mirrors: mirrors:
some-mirror: {tmp_path / 'ci-mirror'} buildcache-destination: {tmp_path / 'ci-mirror'}
ci: ci:
pipeline-gen: pipeline-gen:
- build-job: - build-job:
@ -1308,101 +1210,15 @@ def test_ci_subcommands_without_mirror(
with ev.read("test"): with ev.read("test"):
# Check the 'generate' subcommand # Check the 'generate' subcommand
output = ci_cmd( expect = "spack ci generate requires a mirror named 'buildcache-destination'"
"generate", with pytest.raises(ci.SpackCIError, match=expect):
"--output-file", ci_cmd("generate", "--output-file", str(tmp_path / ".gitlab-ci.yml"))
str(tmp_path / ".gitlab-ci.yml"),
output=str,
fail_on_error=False,
)
assert "spack ci generate requires an env containing a mirror" in output
# Also check the 'rebuild-index' subcommand # Also check the 'rebuild-index' subcommand
output = ci_cmd("rebuild-index", output=str, fail_on_error=False) output = ci_cmd("rebuild-index", output=str, fail_on_error=False)
assert "spack ci rebuild-index requires an env containing a mirror" in output assert "spack ci rebuild-index requires an env containing a mirror" in output
def test_ensure_only_one_temporary_storage():
"""Make sure 'gitlab-ci' section of env does not allow specification of
both 'enable-artifacts-buildcache' and 'temporary-storage-url-prefix'."""
gitlab_ci_template = """
ci:
{0}
pipeline-gen:
- submapping:
- match:
- notcheckedhere
build-job:
tags:
- donotcare
"""
enable_artifacts = "enable-artifacts-buildcache: True"
temp_storage = "temporary-storage-url-prefix: file:///temp/mirror"
specify_both = f"{enable_artifacts}\n {temp_storage}"
specify_neither = ""
# User can specify "enable-artifacts-buildcache" (boolean)
yaml_obj = syaml.load(gitlab_ci_template.format(enable_artifacts))
jsonschema.validate(yaml_obj, ci_schema)
# User can also specify "temporary-storage-url-prefix" (string)
yaml_obj = syaml.load(gitlab_ci_template.format(temp_storage))
jsonschema.validate(yaml_obj, ci_schema)
# However, specifying both should fail to validate
yaml_obj = syaml.load(gitlab_ci_template.format(specify_both))
with pytest.raises(jsonschema.ValidationError):
jsonschema.validate(yaml_obj, ci_schema)
# Specifying neither should be fine too, as neither of these properties
# should be required
yaml_obj = syaml.load(gitlab_ci_template.format(specify_neither))
jsonschema.validate(yaml_obj, ci_schema)
def test_ci_generate_temp_storage_url(ci_generate_test, tmp_path, mock_binary_index):
"""Verify correct behavior when using temporary-storage-url-prefix"""
_, outputfile, _ = ci_generate_test(
f"""\
spack:
specs:
- archive-files
mirrors:
some-mirror: {(tmp_path / "ci-mirror").as_uri()}
ci:
temporary-storage-url-prefix: {(tmp_path / "temp-mirror").as_uri()}
pipeline-gen:
- submapping:
- match:
- archive-files
build-job:
tags:
- donotcare
image: donotcare
- cleanup-job:
custom_attribute: custom!
"""
)
yaml_contents = syaml.load(outputfile.read_text())
assert "cleanup" in yaml_contents
cleanup_job = yaml_contents["cleanup"]
assert cleanup_job["custom_attribute"] == "custom!"
assert "script" in cleanup_job
cleanup_task = cleanup_job["script"][0]
assert cleanup_task.startswith("spack -d mirror destroy")
assert "stages" in yaml_contents
stages = yaml_contents["stages"]
# Cleanup job should be 2nd to last, just before rebuild-index
assert "stage" in cleanup_job
assert cleanup_job["stage"] == stages[-2]
def test_ci_generate_read_broken_specs_url( def test_ci_generate_read_broken_specs_url(
tmp_path: pathlib.Path, tmp_path: pathlib.Path,
mutable_mock_env_path, mutable_mock_env_path,
@ -1439,7 +1255,7 @@ def test_ci_generate_read_broken_specs_url(
- flatten-deps - flatten-deps
- pkg-a - pkg-a
mirrors: mirrors:
some-mirror: {(tmp_path / "ci-mirror").as_uri()} buildcache-destination: {(tmp_path / "ci-mirror").as_uri()}
ci: ci:
broken-specs-url: "{broken_specs_url}" broken-specs-url: "{broken_specs_url}"
pipeline-gen: pipeline-gen:
@ -1484,9 +1300,8 @@ def test_ci_generate_external_signing_job(ci_generate_test, tmp_path, monkeypatc
specs: specs:
- archive-files - archive-files
mirrors: mirrors:
some-mirror: {(tmp_path / "ci-mirror").as_uri()} buildcache-destination: {(tmp_path / "ci-mirror").as_uri()}
ci: ci:
temporary-storage-url-prefix: {(tmp_path / "temp-mirror").as_uri()}
pipeline-gen: pipeline-gen:
- submapping: - submapping:
- match: - match:
@ -1541,7 +1356,7 @@ def test_ci_reproduce(
specs: specs:
- $packages - $packages
mirrors: mirrors:
test-mirror: {tmp_path / "ci-mirror"} buildcache-destination: {tmp_path / "ci-mirror"}
ci: ci:
pipeline-gen: pipeline-gen:
- submapping: - submapping:
@ -1672,106 +1487,6 @@ def test_cmd_first_line():
assert spack.cmd.first_line(doc) == first assert spack.cmd.first_line(doc) == first
legacy_spack_yaml_contents = """
spack:
definitions:
- old-gcc-pkgs:
- archive-files
- callpath
# specify ^openblas-with-lapack to ensure that builtin.mock repo flake8
# package (which can also provide lapack) is not chosen, as it violates
# a package-level check which requires exactly one fetch strategy (this
# is apparently not an issue for other tests that use it).
- hypre@0.2.15 ^openblas-with-lapack
specs:
- matrix:
- [$old-gcc-pkgs]
mirrors:
test-mirror: {mirror_url}
{key}:
match_behavior: first
mappings:
- match:
- arch=test-debian6-core2
runner-attributes:
tags:
- donotcare
image: donotcare
- match:
- arch=test-debian6-m1
runner-attributes:
tags:
- donotcare
image: donotcare
service-job-attributes:
image: donotcare
tags: [donotcare]
cdash:
build-group: Not important
url: https://my.fake.cdash
project: Not used
site: Nothing
"""
@pytest.mark.regression("36409")
def test_gitlab_ci_deprecated(
tmp_path: pathlib.Path,
mutable_mock_env_path,
install_mockery,
monkeypatch,
ci_base_environment,
mock_binary_index,
):
mirror_url = (tmp_path / "ci-mirror").as_uri()
with open(tmp_path / "spack.yaml", "w") as f:
f.write(legacy_spack_yaml_contents.format(mirror_url=mirror_url, key="gitlab-ci"))
with working_dir(tmp_path):
with ev.Environment("."):
ci_cmd("generate", "--output-file", "generated-pipeline.yaml")
with open("generated-pipeline.yaml") as f:
yaml_contents = syaml.load(f)
assert "stages" in yaml_contents
assert len(yaml_contents["stages"]) == 5
assert yaml_contents["stages"][0] == "stage-0"
assert yaml_contents["stages"][4] == "stage-rebuild-index"
assert "rebuild-index" in yaml_contents
rebuild_job = yaml_contents["rebuild-index"]
expected = f"spack buildcache update-index --keys {mirror_url}"
assert rebuild_job["script"][0] == expected
assert "variables" in yaml_contents
assert "SPACK_ARTIFACTS_ROOT" in yaml_contents["variables"]
artifacts_root = yaml_contents["variables"]["SPACK_ARTIFACTS_ROOT"]
assert artifacts_root == "jobs_scratch_dir"
@pytest.mark.regression("36045")
def test_gitlab_ci_update(
tmp_path: pathlib.Path,
mutable_mock_env_path,
install_mockery,
monkeypatch,
ci_base_environment,
mock_binary_index,
):
with open(tmp_path / "spack.yaml", "w") as f:
f.write(
legacy_spack_yaml_contents.format(mirror_url=(tmp_path / "mirror").as_uri(), key="ci")
)
env_cmd("update", "-y", str(tmp_path))
with open(tmp_path / "spack.yaml") as f:
yaml_contents = syaml.load(f)
ci_root = yaml_contents["spack"]["ci"]
assert "pipeline-gen" in ci_root
def test_gitlab_config_scopes(ci_generate_test, tmp_path): def test_gitlab_config_scopes(ci_generate_test, tmp_path):
"""Test pipeline generation with real configs included""" """Test pipeline generation with real configs included"""
configs_path = os.path.join(spack_paths.share_path, "gitlab", "cloud_pipelines", "configs") configs_path = os.path.join(spack_paths.share_path, "gitlab", "cloud_pipelines", "configs")
@ -1785,7 +1500,7 @@ def test_gitlab_config_scopes(ci_generate_test, tmp_path):
specs: specs:
- flatten-deps - flatten-deps
mirrors: mirrors:
some-mirror: {tmp_path / "ci-mirror"} buildcache-destination: {tmp_path / "ci-mirror"}
ci: ci:
pipeline-gen: pipeline-gen:
- build-job: - build-job:
@ -1858,7 +1573,7 @@ def dynamic_mapping_setup(tmpdir):
specs: specs:
- pkg-a - pkg-a
mirrors: mirrors:
some-mirror: https://my.fake.mirror buildcache-destination: https://my.fake.mirror
ci: ci:
pipeline-gen: pipeline-gen:
- dynamic-mapping: - dynamic-mapping:

View File

@ -75,8 +75,6 @@ default:
.base-job: .base-job:
variables: variables:
PIPELINE_MIRROR_TEMPLATE: "single-src-protected-mirrors.yaml.in" PIPELINE_MIRROR_TEMPLATE: "single-src-protected-mirrors.yaml.in"
# TODO: We can remove this when we drop the "deprecated" stack
PUSH_BUILDCACHE_DEPRECATED: "${PROTECTED_MIRROR_PUSH_DOMAIN}/${CI_COMMIT_REF_NAME}/${SPACK_CI_STACK_NAME}"
SPACK_CI_CONFIG_ROOT: "${CI_PROJECT_DIR}/share/spack/gitlab/cloud_pipelines/configs" SPACK_CI_CONFIG_ROOT: "${CI_PROJECT_DIR}/share/spack/gitlab/cloud_pipelines/configs"
SPACK_CI_SCRIPTS_ROOT: "${CI_PROJECT_DIR}/share/spack/gitlab/cloud_pipelines/scripts" SPACK_CI_SCRIPTS_ROOT: "${CI_PROJECT_DIR}/share/spack/gitlab/cloud_pipelines/scripts"
@ -106,8 +104,6 @@ default:
when: always when: always
variables: variables:
SPACK_PIPELINE_TYPE: "spack_pull_request" SPACK_PIPELINE_TYPE: "spack_pull_request"
# TODO: We can remove this when we drop the "deprecated" stack
PUSH_BUILDCACHE_DEPRECATED: "${PR_MIRROR_PUSH_DOMAIN}/${CI_COMMIT_REF_NAME}/${SPACK_CI_STACK_NAME}"
SPACK_PRUNE_UNTOUCHED: "True" SPACK_PRUNE_UNTOUCHED: "True"
SPACK_PRUNE_UNTOUCHED_DEPENDENT_DEPTH: "1" SPACK_PRUNE_UNTOUCHED_DEPENDENT_DEPTH: "1"
# TODO: Change sync script to include target in branch name. Then we could # TODO: Change sync script to include target in branch name. Then we could
@ -221,41 +217,6 @@ default:
tags: ["spack", "public", "medium", "x86_64-win"] tags: ["spack", "public", "medium", "x86_64-win"]
image: "ghcr.io/johnwparent/windows-server21h2:sha-1c12b61" image: "ghcr.io/johnwparent/windows-server21h2:sha-1c12b61"
.generate-deprecated:
extends: [ ".base-job" ]
stage: generate
script:
- uname -a || true
- grep -E 'vendor|model name' /proc/cpuinfo 2>/dev/null | sort -u || head -n10 /proc/cpuinfo 2>/dev/null || true
- nproc || true
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
- . "./share/spack/setup-env.sh"
- spack --version
- cd share/spack/gitlab/cloud_pipelines/stacks/${SPACK_CI_STACK_NAME}
- spack env activate --without-view .
- spack -v --color=always
ci generate --check-index-only
--buildcache-destination "${PUSH_BUILDCACHE_DEPRECATED}"
--artifacts-root "${CI_PROJECT_DIR}/jobs_scratch_dir"
--output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/cloud-ci-pipeline.yml"
after_script:
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
artifacts:
paths:
- "${CI_PROJECT_DIR}/jobs_scratch_dir"
variables:
KUBERNETES_CPU_REQUEST: 4000m
KUBERNETES_MEMORY_REQUEST: 16G
interruptible: true
timeout: 60 minutes
retry:
max: 2
when:
- always
tags: ["spack", "public", "medium", "x86_64"]
.build: .build:
extends: [ ".base-job" ] extends: [ ".base-job" ]
stage: build stage: build
@ -797,27 +758,6 @@ ml-darwin-aarch64-mps-build:
- artifacts: True - artifacts: True
job: ml-darwin-aarch64-mps-generate job: ml-darwin-aarch64-mps-generate
########################################
# Deprecated CI testing
########################################
.deprecated-ci:
variables:
SPACK_CI_STACK_NAME: deprecated
deprecated-ci-generate:
extends: [ ".generate-deprecated", ".deprecated-ci" ]
deprecated-ci-build:
extends: [ ".build", ".deprecated-ci" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: deprecated-ci-generate
strategy: depend
needs:
- artifacts: True
job: deprecated-ci-generate
######################################## ########################################
# AWS ParallelCluster # AWS ParallelCluster
######################################## ########################################

View File

@ -1,100 +0,0 @@
###
# Spack pipeline for testing deprecated gitlab-ci configuration
###
spack:
view: false
concretizer:
reuse: false
unify: false
config:
db_lock_timeout: 120
install_tree:
padded_length: 256
projections:
all: '{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'
deprecated: true
packages:
all:
require: target=x86_64
specs:
- readline
mirrors:
mirror: s3://spack-binaries/develop/deprecated
gitlab-ci:
broken-tests-packages:
- gptune
broken-specs-url: s3://spack-binaries/broken-specs
image: ghcr.io/spack/tutorial-ubuntu-18.04:v2021-11-02
before_script:
- uname -a || true
- grep -E "vendor|model name" /proc/cpuinfo 2>/dev/null | sort -u || head -n10
/proc/cpuinfo 2>/dev/null || true
- nproc
- . "./share/spack/setup-env.sh"
- spack --version
- spack arch
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
script:
- spack compiler find
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- if [ -n "$SPACK_BUILD_JOBS" ]; then spack config add "config:build_jobs:$SPACK_BUILD_JOBS";
fi
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
# AWS runners mount E4S public key (verification), UO runners mount public/private (signing/verification)
- if [[ -r /mnt/key/e4s.gpg ]]; then spack gpg trust /mnt/key/e4s.gpg; fi
# UO runners mount intermediate ci public key (verification), AWS runners mount public/private (signing/verification)
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg;
fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg;
fi
- spack --color=always --backtrace ci rebuild --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt)
2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
after_script:
- cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
match_behavior: first
mappings:
- match:
- '@:'
runner-attributes:
id_tokens:
GITLAB_OIDC_TOKEN:
aud: "${OIDC_TOKEN_AUDIENCE}"
tags: [spack, public, small, x86_64]
variables:
CI_JOB_SIZE: small
SPACK_BUILD_JOBS: '1'
KUBERNETES_CPU_REQUEST: 500m
KUBERNETES_MEMORY_REQUEST: 500M
signing-job-attributes:
id_tokens:
GITLAB_OIDC_TOKEN:
aud: "${OIDC_TOKEN_AUDIENCE}"
image: {name: 'ghcr.io/spack/notary:latest', entrypoint: ['']}
tags: [aws]
script:
- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
/tmp
- /sign.sh
- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
- aws s3 cp /tmp/public_keys ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache/_pgp
--recursive --exclude "*" --include "*.pub"
service-job-attributes:
id_tokens:
GITLAB_OIDC_TOKEN:
aud: "${OIDC_TOKEN_AUDIENCE}"
image: ghcr.io/spack/tutorial-ubuntu-18.04:v2021-11-02
before_script:
- . "./share/spack/setup-env.sh"
- spack --version
tags: [spack, public, x86_64]
cdash:
build-group: Spack Deprecated CI
url: https://cdash.spack.io
project: Spack Testing
site: Cloud Gitlab Infrastructure

View File

@ -693,7 +693,7 @@ _spack_ci() {
} }
_spack_ci_generate() { _spack_ci_generate() {
SPACK_COMPREPLY="-h --help --output-file --copy-to --optimize --dependencies --buildcache-destination --prune-dag --no-prune-dag --check-index-only --artifacts-root" SPACK_COMPREPLY="-h --help --output-file --optimize --dependencies --prune-dag --no-prune-dag --check-index-only --artifacts-root"
} }
_spack_ci_rebuild_index() { _spack_ci_rebuild_index() {

View File

@ -955,19 +955,15 @@ complete -c spack -n '__fish_spack_using_command ci' -s h -l help -f -a help
complete -c spack -n '__fish_spack_using_command ci' -s h -l help -d 'show this help message and exit' complete -c spack -n '__fish_spack_using_command ci' -s h -l help -d 'show this help message and exit'
# spack ci generate # spack ci generate
set -g __fish_spack_optspecs_spack_ci_generate h/help output-file= copy-to= optimize dependencies buildcache-destination= prune-dag no-prune-dag check-index-only artifacts-root= set -g __fish_spack_optspecs_spack_ci_generate h/help output-file= optimize dependencies prune-dag no-prune-dag check-index-only artifacts-root=
complete -c spack -n '__fish_spack_using_command ci generate' -s h -l help -f -a help complete -c spack -n '__fish_spack_using_command ci generate' -s h -l help -f -a help
complete -c spack -n '__fish_spack_using_command ci generate' -s h -l help -d 'show this help message and exit' complete -c spack -n '__fish_spack_using_command ci generate' -s h -l help -d 'show this help message and exit'
complete -c spack -n '__fish_spack_using_command ci generate' -l output-file -r -f -a output_file complete -c spack -n '__fish_spack_using_command ci generate' -l output-file -r -f -a output_file
complete -c spack -n '__fish_spack_using_command ci generate' -l output-file -r -d 'pathname for the generated gitlab ci yaml file' complete -c spack -n '__fish_spack_using_command ci generate' -l output-file -r -d 'pathname for the generated gitlab ci yaml file'
complete -c spack -n '__fish_spack_using_command ci generate' -l copy-to -r -f -a copy_to
complete -c spack -n '__fish_spack_using_command ci generate' -l copy-to -r -d 'path to additional directory for job files'
complete -c spack -n '__fish_spack_using_command ci generate' -l optimize -f -a optimize complete -c spack -n '__fish_spack_using_command ci generate' -l optimize -f -a optimize
complete -c spack -n '__fish_spack_using_command ci generate' -l optimize -d '(DEPRECATED) optimize the gitlab yaml file for size' complete -c spack -n '__fish_spack_using_command ci generate' -l optimize -d '(DEPRECATED) optimize the gitlab yaml file for size'
complete -c spack -n '__fish_spack_using_command ci generate' -l dependencies -f -a dependencies complete -c spack -n '__fish_spack_using_command ci generate' -l dependencies -f -a dependencies
complete -c spack -n '__fish_spack_using_command ci generate' -l dependencies -d '(DEPRECATED) disable DAG scheduling (use '"'"'plain'"'"' dependencies)' complete -c spack -n '__fish_spack_using_command ci generate' -l dependencies -d '(DEPRECATED) disable DAG scheduling (use '"'"'plain'"'"' dependencies)'
complete -c spack -n '__fish_spack_using_command ci generate' -l buildcache-destination -r -f -a buildcache_destination
complete -c spack -n '__fish_spack_using_command ci generate' -l buildcache-destination -r -d 'override the mirror configured in the environment'
complete -c spack -n '__fish_spack_using_command ci generate' -l prune-dag -f -a prune_dag complete -c spack -n '__fish_spack_using_command ci generate' -l prune-dag -f -a prune_dag
complete -c spack -n '__fish_spack_using_command ci generate' -l prune-dag -d 'skip up-to-date specs' complete -c spack -n '__fish_spack_using_command ci generate' -l prune-dag -d 'skip up-to-date specs'
complete -c spack -n '__fish_spack_using_command ci generate' -l no-prune-dag -f -a prune_dag complete -c spack -n '__fish_spack_using_command ci generate' -l no-prune-dag -f -a prune_dag