Pipelines: Support DAG scheduling and dynamic child pipelines

This change also adds a code path through the spack ci pipelines
infrastructure which supports PR testing on the Spack repository.
Gitlab pipelines triggered by a PR (either its creation or a push to
the PR branch) will only verify that the packages in the environment
build without error.  When the PR branch is merged to develop,
another pipeline runs which pushes the generated binaries to the
binary mirror.
Scott Wittenburg 2020-05-12 16:17:50 -06:00, committed by Todd Gamblin
parent 47b3dda1aa
commit e0572a7d96
7 changed files with 444 additions and 301 deletions


@@ -32,30 +32,46 @@ for setting up a build pipeline are as follows:

#. Create a repository on your gitlab instance
#. Add a ``spack.yaml`` at the root containing your pipeline environment (see
   below for details)
#. Add a ``.gitlab-ci.yml`` at the root containing two jobs (one to generate
   the pipeline dynamically, and one to run the generated jobs), similar to
   this one:

   .. code-block:: yaml

      stages: [generate, build]

      generate-pipeline:
        stage: generate
        tags:
          - <custom-tag>
        script:
          - spack ci generate
            --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
        artifacts:
          paths:
            - "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"

      build-jobs:
        stage: build
        trigger:
          include:
            - artifact: "jobs_scratch_dir/pipeline.yml"
              job: generate-pipeline
          strategy: depend

#. Add any secrets required by the CI process to environment variables using the
   CI web ui
#. Push a commit containing the ``spack.yaml`` and ``.gitlab-ci.yml`` mentioned above
   to the gitlab repository

The ``<custom-tag>``, above, is used to pick one of your configured runners to
run the pipeline generation phase (this is implemented in the ``spack ci generate``
command, which assumes the runner has an appropriate version of spack installed
and configured for use). Of course, there are many ways to customize the process.
You can configure CDash reporting on the progress of your builds, set up S3 buckets
to mirror binaries built by the pipeline, clone a custom spack repository/ref for
use by the pipeline, and more.

While it is possible to set up pipelines on gitlab.com, the builds there are
limited to 60 minutes and generic hardware. It is also possible to
@@ -64,15 +80,24 @@ Gitlab to Google Kubernetes Engine (`GKE <https://cloud.google.com/kubernetes-en

or Amazon Elastic Kubernetes Service (`EKS <https://aws.amazon.com/eks>`_), though those
topics are outside the scope of this document.

Spack's pipelines are now making use of the
`trigger <https://docs.gitlab.com/12.9/ee/ci/yaml/README.html#trigger>`_ syntax to run
dynamically generated
`child pipelines <https://docs.gitlab.com/12.9/ee/ci/parent_child_pipelines.html>`_.
Note that the use of dynamic child pipelines requires running Gitlab version
``>= 12.9``.

-----------------------------------
Spack commands supporting pipelines
-----------------------------------

Spack provides a command ``ci`` with two sub-commands: ``spack ci generate`` generates
a pipeline (a ``.gitlab-ci.yml`` file) from a spack environment, and ``spack ci rebuild``
checks a spec against a remote mirror and possibly rebuilds it from source and updates
the binary mirror with the latest built package. Both ``spack ci ...`` commands must
be run from within the same environment, as each one makes use of the environment for
different purposes. Additionally, some options to the commands (or conditions present
in the spack environment file) may require particular environment variables to be
set in order to function properly. Examples of these are typically secrets
needed for pipeline operation that should not be visible in a spack environment
file. These environment variables are described in more detail
@@ -87,15 +112,6 @@ file. These environment variables are described in more detail

Super-command for functionality related to generating pipelines and executing
pipeline jobs.

-.. _cmd_spack_ci_start:
-
-^^^^^^^^^^^^^^^^^^
-``spack ci start``
-^^^^^^^^^^^^^^^^^^
-
-Currently this command is a short-cut to first run ``spack ci generate``, followed
-by ``spack ci pushyaml``.

.. _cmd_spack_ci_generate:

^^^^^^^^^^^^^^^^^^^^^
@@ -105,39 +121,6 @@ by ``spack ci pushyaml``.

Concretizes the specs in the active environment, stages them (as described in
:ref:`staging_algorithm`), and writes the resulting ``.gitlab-ci.yml`` to disk.

-.. _cmd_spack_ci_pushyaml:
-
-^^^^^^^^^^^^^^^^^^^^^
-``spack ci pushyaml``
-^^^^^^^^^^^^^^^^^^^^^
-
-Generates a commit containing the generated ``.gitlab-ci.yml`` and pushes it to a
-``DOWNSTREAM_CI_REPO``, which is frequently the same repository. The branch
-created has the same name as the current branch being tested, but has ``multi-ci-``
-prepended to the branch name. Once Gitlab CI has full support for dynamically
-defined workloads, this command will be deprecated.
-
-Until this command is no longer needed and can be deprecated, there are
-a few gotchas to note. While you can embed your username and password in the
-``DOWNSTREAM_CI_REPO`` url, you may not be able to have Gitlab mask the value, as
-it will likely contain characters that Gitlab cannot currently mask. Another
-option is to set up an SSH token, but for this to work, the associated SSH
-key must be passphrase-less so that it can be provided in an automated manner.
-If you attempt to set up an SSH token that does require a passphrase, you may
-see a log message similar to::
-
-   fatal: https://<instance-url>/<org>/<project>:<port>/info/refs not valid: is this a git repository?
-
-In this case, you can try a passphrase-less SSH key, or else embed your gitlab
-username and password in the ``DOWNSTREAM_CI_REPO`` url, as in the following example::
-
-   https://<username>:<password>@<instance-url>/<org>/<project>.git

.. _cmd_spack_ci_rebuild:

^^^^^^^^^^^^^^^^^^^^
@@ -179,14 +162,14 @@ sections describing a build pipeline:

           - os=ubuntu18.04
         runner-attributes:
           tags:
             - spack-kube
           image: spack/ubuntu-bionic
       - match:
           - os=centos7
         runner-attributes:
           tags:
             - spack-kube
           image: spack/centos7
   cdash:
     build-group: Release Testing
     url: https://cdash.spack.io
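
The ``mappings`` section above pairs spec selectors (``match``) with
``runner-attributes``. A minimal sketch of how such a first-match lookup
behaves (the ``runner_attributes_for`` helper and dict shapes here are
illustrative, not Spack's implementation):

```python
# Toy first-match lookup: scan "mappings" and take the first entry whose
# every "match" string is satisfied by the spec's attribute set.
mappings = [
    {'match': ['os=ubuntu18.04'],
     'runner-attributes': {'tags': ['spack-kube'],
                           'image': 'spack/ubuntu-bionic'}},
    {'match': ['os=centos7'],
     'runner-attributes': {'tags': ['spack-kube'],
                           'image': 'spack/centos7'}},
]


def runner_attributes_for(spec_attributes, mappings):
    """Return runner-attributes of the first mapping whose every
    'match' string appears in the spec's attribute set, else None."""
    for entry in mappings:
        if all(m in spec_attributes for m in entry['match']):
            return entry['runner-attributes']
    return None
```

A spec that matches no entry simply gets no runner attributes in this sketch.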
@@ -389,22 +372,29 @@ containing the url and branch/tag you want to clone (calling them, for example,

``SPACK_REPO`` and ``SPACK_REF``), use them to clone spack in your pre-ci
``before_script``, and finally pass those same values along to the workload
generation process via the ``spack-repo`` and ``spack-ref`` cli args. Here's
the ``generate-pipeline`` job from the top of this document, updated to clone
a custom spack and make sure the generated rebuild jobs will clone it too:

.. code-block:: yaml

   generate-pipeline:
     tags:
       - <some-other-tag>
     before_script:
       - git clone ${SPACK_REPO} --branch ${SPACK_REF}
       - . ./spack/share/spack/setup-env.sh
     script:
       - spack ci generate
         --spack-repo ${SPACK_REPO} --spack-ref ${SPACK_REF}
         --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
     after_script:
       - rm -rf ./spack
     artifacts:
       paths:
         - "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"

If the ``spack ci generate`` command receives those extra command line arguments,
then it adds similar ``before_script`` and ``after_script`` sections for each of
the ``spack ci rebuild`` jobs it generates (cloning and sourcing a custom
spack in the ``before_script`` and removing it again in the ``after_script``).
@@ -451,11 +441,3 @@ SPACK_SIGNING_KEY

^^^^^^^^^^^^^^^^^

Needed to sign/verify binary packages from the remote binary mirror.

-^^^^^^^^^^^^^^^^^^
-DOWNSTREAM_CI_REPO
-^^^^^^^^^^^^^^^^^^
-
-Needed until Gitlab CI supports dynamic job generation. Can contain connection
-credentials embedded in the url, and could be the same repository or a different
-one.


@@ -24,7 +24,6 @@

import spack.compilers as compilers
import spack.config as cfg
import spack.environment as ev
-from spack.dependency import all_deptypes
from spack.error import SpackError
import spack.hash_types as ht
from spack.main import SpackCommand
@@ -34,6 +33,21 @@

import spack.util.web as web_util

JOB_RETRY_CONDITIONS = [
    'unknown_failure',
    'api_failure',
    'stuck_or_timeout_failure',
    'runner_system_failure',
    'missing_dependency_failure',
    'runner_unsupported',
    'stale_schedule',
    'job_execution_timeout',
    'archived_failure',
    'unmet_prerequisites',
    'scheduler_failure',
    'data_integrity_failure',
]
spack_gpg = SpackCommand('gpg')
spack_compiler = SpackCommand('compiler')
@@ -360,7 +374,6 @@ def compute_spec_deps(spec_list):
        }
    """
-   deptype = all_deptypes
    spec_labels = {}

    specs = []
@@ -380,7 +393,7 @@ def append_dep(s, d):
        rkey, rlabel = spec_deps_key_label(spec)

        for s in spec.traverse(deptype=all):
            if s.external:
                tty.msg('Will not stage external pkg: {0}'.format(s))
                continue
@@ -392,7 +405,7 @@ def append_dep(s, d):
            }

            append_dep(rlabel, slabel)

            for d in s.dependencies(deptype=all):
                dkey, dlabel = spec_deps_key_label(d)
                if d.external:
                    tty.msg('Will not stage external dep: {0}'.format(d))
@@ -400,11 +413,11 @@ def append_dep(s, d):
                append_dep(slabel, dlabel)

    for spec_label, spec_holder in spec_labels.items():
        specs.append({
            'label': spec_label,
            'spec': spec_holder['spec'],
            'root_spec': spec_holder['root'],
        })

    deps_json_obj = {
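
The loop above flattens ``spec_labels`` into the ``specs`` list that goes
into ``deps_json_obj``. A self-contained rehearsal with toy data (the
labels and spec strings here are made up):

```python
# Each spec_labels entry maps a label to a holder dict; the output list
# carries the label, the spec, and the root spec it was traversed from.
spec_labels = {
    'libelf/abc123': {'spec': 'libelf@0.8.13', 'root': 'dyninst@10.1.0'},
    'dyninst/def456': {'spec': 'dyninst@10.1.0', 'root': 'dyninst@10.1.0'},
}

specs = []
for spec_label, spec_holder in sorted(spec_labels.items()):
    specs.append({
        'label': spec_label,
        'spec': spec_holder['spec'],
        'root_spec': spec_holder['root'],
    })
```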
@@ -431,6 +444,21 @@ def pkg_name_from_spec_label(spec_label):
    return spec_label[:spec_label.index('/')]

def format_job_needs(phase_name, strip_compilers, dep_jobs,
                     osname, build_group, enable_artifacts_buildcache):
    needs_list = []
    for dep_job in dep_jobs:
        needs_list.append({
            'job': get_job_name(phase_name,
                                strip_compilers,
                                dep_job,
                                osname,
                                build_group),
            'artifacts': enable_artifacts_buildcache,
        })
    return needs_list
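
The new ``format_job_needs`` helper builds the ``needs`` entries used for
DAG scheduling. A self-contained sketch with a stub ``get_job_name``
(Spack's real job-naming scheme differs):

```python
# Stub naming function, only for illustration of the needs-list shape.
def get_job_name(phase_name, strip_compilers, dep_job, osname, build_group):
    return '{0}/{1}/{2}'.format(phase_name, dep_job, osname)


def format_job_needs(phase_name, strip_compilers, dep_jobs,
                     osname, build_group, enable_artifacts_buildcache):
    # One needs entry per dependency job: the job name plus a flag
    # telling gitlab whether to download that job's artifacts.
    needs_list = []
    for dep_job in dep_jobs:
        needs_list.append({
            'job': get_job_name(phase_name, strip_compilers, dep_job,
                                osname, build_group),
            'artifacts': enable_artifacts_buildcache,
        })
    return needs_list


needs = format_job_needs('specs', True, ['libelf', 'libdwarf'],
                         'ubuntu18.04', 'Release Testing', False)
```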
def generate_gitlab_ci_yaml(env, print_summary, output_file,
                            custom_spack_repo=None, custom_spack_ref=None):
    # FIXME: What's the difference between one that opens with 'spack'
@@ -466,6 +494,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
        tty.verbose("Using CDash auth token from environment")
        cdash_auth_token = os.environ.get('SPACK_CDASH_AUTH_TOKEN')

    is_pr_pipeline = (
        os.environ.get('SPACK_IS_PR_PIPELINE', '').lower() == 'true'
    )

    # Make sure we use a custom spack if necessary
    before_script = None
    after_script = None
@@ -538,6 +570,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
    stage_names = []

    max_length_needs = 0
    max_needs_job = ''

    for phase in phases:
        phase_name = phase['name']
        strip_compilers = phase['strip-compilers']
@@ -601,25 +636,35 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
                        root_spec, main_phase, strip_compilers),
                    'SPACK_JOB_SPEC_PKG_NAME': release_spec.name,
                    'SPACK_COMPILER_ACTION': compiler_action,
                    'SPACK_IS_PR_PIPELINE': str(is_pr_pipeline),
                }

                job_dependencies = []
                if spec_label in dependencies:
                    if enable_artifacts_buildcache:
                        dep_jobs = [
                            d for d in release_spec.traverse(deptype=all,
                                                             root=False)
                        ]
                    else:
                        dep_jobs = []
                        for dep_label in dependencies[spec_label]:
                            dep_pkg = pkg_name_from_spec_label(dep_label)
                            dep_root = spec_labels[dep_label]['rootSpec']
                            dep_jobs.append(dep_root[dep_pkg])

                    job_dependencies.extend(
                        format_job_needs(phase_name, strip_compilers, dep_jobs,
                                         osname, build_group,
                                         enable_artifacts_buildcache))

                # This next section helps gitlab make sure the right
                # bootstrapped compiler exists in the artifacts buildcache by
                # creating an artificial dependency between this spec and its
                # compiler. So, if we are in the main phase, and if the
                # compiler we are supposed to use is listed in any of the
                # bootstrap spec lists, then we will add more dependencies to
                # the job (that compiler and maybe its dependencies as well).
                if is_main_phase(phase_name):
                    compiler_pkg_spec = compilers.pkg_spec_for_compiler(
                        release_spec.compiler)
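
The branch above chooses which dependency jobs a spec ``needs``: all
transitive deps when the artifacts buildcache is enabled (so their
artifacts get pulled into the job), direct deps otherwise. A toy sketch
with a plain dict DAG standing in for ``Spec.traverse``:

```python
# Illustrative DAG: dyninst depends directly on libdwarf, which depends
# on libelf (this is a stand-in, not Spack's Spec API).
dag = {'dyninst': ['libdwarf'], 'libdwarf': ['libelf'], 'libelf': []}


def transitive_deps(pkg, dag):
    """Depth-first collection of all transitive dependencies of pkg."""
    seen = []
    for dep in dag[pkg]:
        if dep not in seen:
            seen.append(dep)
            seen.extend(d for d in transitive_deps(dep, dag)
                        if d not in seen)
    return seen


def dep_jobs_for(pkg, dag, enable_artifacts_buildcache):
    if enable_artifacts_buildcache:
        return transitive_deps(pkg, dag)   # every transitive dep
    return list(dag[pkg])                  # direct deps only
```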
@@ -627,12 +672,25 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
                        bs_arch = bs['spec'].architecture
                        if (bs['spec'].satisfies(compiler_pkg_spec) and
                            bs_arch == release_spec.architecture):
                            # We found the bootstrap compiler this release spec
                            # should be built with, so for DAG scheduling
                            # purposes, we will at least add the compiler spec
                            # to the jobs "needs". But if artifact buildcache
                            # is enabled, we'll have to add all transitive deps
                            # of the compiler as well.
                            dep_jobs = [bs['spec']]
                            if enable_artifacts_buildcache:
                                dep_jobs = [
                                    d for d in bs['spec'].traverse(deptype=all)
                                ]

                            job_dependencies.extend(
                                format_job_needs(bs['phase-name'],
                                                 bs['strip-compilers'],
                                                 dep_jobs,
                                                 str(bs_arch),
                                                 build_group,
                                                 enable_artifacts_buildcache))

                if enable_cdash_reporting:
                    cdash_build_name = get_cdash_build_name(
@@ -647,7 +705,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
                    job_vars['SPACK_CDASH_BUILD_NAME'] = cdash_build_name
                    job_vars['SPACK_RELATED_BUILDS_CDASH'] = ';'.join(
                        sorted(related_builds))

                variables.update(job_vars)
@@ -657,7 +715,12 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
                ]

                if enable_artifacts_buildcache:
                    bc_root = 'local_mirror/build_cache'
                    artifact_paths.extend([os.path.join(bc_root, p) for p in [
                        bindist.tarball_name(release_spec, '.spec.yaml'),
                        bindist.tarball_name(release_spec, '.cdashid'),
                        bindist.tarball_directory_name(release_spec),
                    ]])

                job_object = {
                    'stage': stage_name,
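
The block above narrows the artifact paths from the whole build cache to
just this spec's files. A sketch with toy stand-ins for the ``bindist``
helpers (the real functions derive these names from the concrete spec):

```python
import os


# Illustrative stand-ins for bindist.tarball_name / tarball_directory_name.
def tarball_name(spec, ext):
    return '{0}-{1}{2}'.format(spec['name'], spec['version'], ext)


def tarball_directory_name(spec):
    return os.path.join('linux-ubuntu18.04-x86_64', 'gcc-7.5.0',
                        '{0}-{1}'.format(spec['name'], spec['version']))


release_spec = {'name': 'zlib', 'version': '1.2.11'}
bc_root = 'local_mirror/build_cache'
# Only this spec's yaml, cdashid, and tarball directory are kept as
# artifacts, instead of the entire build cache.
artifact_paths = [os.path.join(bc_root, p) for p in [
    tarball_name(release_spec, '.spec.yaml'),
    tarball_name(release_spec, '.cdashid'),
    tarball_directory_name(release_spec),
]]
```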
@@ -668,9 +731,18 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
                        'paths': artifact_paths,
                        'when': 'always',
                    },
                    'needs': sorted(job_dependencies, key=lambda d: d['job']),
                    'retry': {
                        'max': 2,
                        'when': JOB_RETRY_CONDITIONS,
                    }
                }

                length_needs = len(job_dependencies)
                if length_needs > max_length_needs:
                    max_length_needs = length_needs
                    max_needs_job = job_name

                if before_script:
                    job_object['before_script'] = before_script
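
The job object now uses gitlab's DAG-scheduled ``needs`` key in place of
``dependencies``: entries are dicts (job name plus an artifacts flag),
sorted by job name for stable output, alongside a ``retry`` map. A
minimal sketch with made-up job names:

```python
# A short stand-in for the full retry-condition list defined above.
JOB_RETRY_CONDITIONS = ['runner_system_failure', 'stuck_or_timeout_failure']

job_dependencies = [
    {'job': '(specs) libdwarf/abc', 'artifacts': False},
    {'job': '(bootstrap) gcc/xyz', 'artifacts': False},
]

job_object = {
    'stage': 'stage-0',
    # Sorting by job name keeps the generated yaml deterministic.
    'needs': sorted(job_dependencies, key=lambda d: d['job']),
    'retry': {'max': 2, 'when': JOB_RETRY_CONDITIONS},
}
```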
@@ -691,6 +763,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
    tty.debug('{0} build jobs generated in {1} stages'.format(
        job_id, stage_id))

    tty.debug('The max_needs_job is {0}, with {1} needs'.format(
        max_needs_job, max_length_needs))

    # Use "all_job_names" to populate the build group for this set
    if enable_cdash_reporting and cdash_auth_token:
        try:
@@ -701,7 +776,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
    else:
        tty.warn('Unable to populate buildgroup without CDash credentials')

    if final_job_config and not is_pr_pipeline:
        # Add an extra, final job to regenerate the index
        final_stage = 'stage-rebuild-index'
        final_job = {
@@ -721,8 +796,12 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
    output_object['stages'] = stage_names

    sorted_output = {}
    for output_key, output_value in sorted(output_object.items()):
        sorted_output[output_key] = output_value

    with open(output_file, 'w') as outf:
        outf.write(syaml.dump_config(sorted_output, default_flow_style=True))


def url_encode_string(input_string):
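
The final step above emits top-level keys in sorted order, which makes
the generated yaml deterministic across runs. Sketch:

```python
# Toy output object with keys in arbitrary insertion order.
output_object = {'stages': ['stage-0'], 'rebuild-index': {},
                 '(specs) zlib': {}}

# Rebuild the dict with keys sorted, as the commit does before dumping.
sorted_output = {}
for output_key, output_value in sorted(output_object.items()):
    sorted_output[output_key] = output_value
```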


@@ -34,37 +34,6 @@ def setup_parser(subparser):
    setup_parser.parser = subparser
    subparsers = subparser.add_subparsers(help='CI sub-commands')

-    start = subparsers.add_parser('start', help=ci_start.__doc__)
-    start.add_argument(
-        '--output-file', default=None,
-        help="Absolute path to file where generated jobs file should be " +
-             "written. The default is .gitlab-ci.yml in the root of the " +
-             "repository.")
-    start.add_argument(
-        '--copy-to', default=None,
-        help="Absolute path of additional location where generated jobs " +
-             "yaml file should be copied. Default is not to copy.")
-    start.add_argument(
-        '--spack-repo', default=None,
-        help="Provide a url for this argument if a custom spack repo " +
-             "should be cloned as a step in each generated job.")
-    start.add_argument(
-        '--spack-ref', default=None,
-        help="Provide a git branch or tag if a custom spack branch " +
-             "should be checked out as a step in each generated job. " +
-             "This argument is ignored if no --spack-repo is provided.")
-    start.add_argument(
-        '--downstream-repo', default=None,
-        help="Url to repository where commit containing jobs yaml file " +
-             "should be pushed.")
-    start.add_argument(
-        '--branch-name', default='default-branch',
-        help="Name of current branch, used in generation of pushed commit.")
-    start.add_argument(
-        '--commit-sha', default='none',
-        help="SHA of current commit, used in generation of pushed commit.")
-    start.set_defaults(func=ci_start)

    # Dynamic generation of the jobs yaml from a spack environment
    generate = subparsers.add_parser('generate', help=ci_generate.__doc__)
    generate.add_argument(
@@ -87,20 +56,6 @@ def setup_parser(subparser):
             "This argument is ignored if no --spack-repo is provided.")
    generate.set_defaults(func=ci_generate)

-    # Commit and push jobs yaml to a downstream CI repo
-    pushyaml = subparsers.add_parser('pushyaml', help=ci_pushyaml.__doc__)
-    pushyaml.add_argument(
-        '--downstream-repo', default=None,
-        help="Url to repository where commit containing jobs yaml file " +
-             "should be pushed.")
-    pushyaml.add_argument(
-        '--branch-name', default='default-branch',
-        help="Name of current branch, used in generation of pushed commit.")
-    pushyaml.add_argument(
-        '--commit-sha', default='none',
-        help="SHA of current commit, used in generation of pushed commit.")
-    pushyaml.set_defaults(func=ci_pushyaml)

    # Check a spec against mirror. Rebuild, create buildcache and push to
    # mirror (if necessary).
    rebuild = subparsers.add_parser('rebuild', help=ci_rebuild.__doc__)
@@ -140,64 +95,6 @@ def ci_generate(args):
        shutil.copyfile(output_file, copy_yaml_to)

-def ci_pushyaml(args):
-    """Push the generated jobs yaml file to a remote repository. The file
-    (.gitlab-ci.yaml) is expected to be in the current directory, which
-    should be the root of the repository."""
-    downstream_repo = args.downstream_repo
-    branch_name = args.branch_name
-    commit_sha = args.commit_sha
-
-    if not downstream_repo:
-        tty.die('No downstream repo to push to, exiting')
-
-    working_dir = os.getcwd()
-    jobs_yaml = os.path.join(working_dir, '.gitlab-ci.yml')
-    git_dir = os.path.join(working_dir, '.git')
-
-    if not os.path.exists(jobs_yaml):
-        tty.die('.gitlab-ci.yml must exist in current directory')
-
-    if not os.path.exists(git_dir):
-        tty.die('.git directory must exist in current directory')
-
-    # Create a temporary working directory
-    with spack_ci.TemporaryDirectory() as temp_dir:
-        git = exe.which('git', required=True)
-
-        # Push a commit with the generated file to the downstream ci repo
-        saved_git_dir = os.path.join(temp_dir, 'original-git-dir')
-
-        shutil.move('.git', saved_git_dir)
-
-        git('init', '.')
-
-        git('config', 'user.email', 'robot@spack.io')
-        git('config', 'user.name', 'Spack Build Bot')
-
-        git('add', '.')
-
-        # If the environment contains a spack directory, do not commit
-        # or push it with any other generated products
-        if os.path.exists('./spack') and os.path.isdir('./spack'):
-            git('rm', '-rf', '--cached', 'spack')
-
-        tty.msg('git commit')
-        commit_message = '{0} {1} ({2})'.format(
-            'Auto-generated commit testing', branch_name, commit_sha)
-
-        git('commit', '-m', '{0}'.format(commit_message))
-
-        tty.msg('git push')
-        git('remote', 'add', 'downstream', downstream_repo)
-        push_to_branch = 'master:multi-ci-{0}'.format(branch_name)
-        git('push', '--force', 'downstream', push_to_branch)
-
-        shutil.rmtree('.git')
-        shutil.move(saved_git_dir, '.git')
-
-        git('reset', '--hard', 'HEAD')

def ci_rebuild(args):
    """This command represents a gitlab-ci job, corresponding to a single
    release spec. As such it must first decide whether or not the spec it
@@ -239,6 +136,7 @@ def ci_rebuild(args):
    compiler_action = get_env_var('SPACK_COMPILER_ACTION')
    cdash_build_name = get_env_var('SPACK_CDASH_BUILD_NAME')
    related_builds = get_env_var('SPACK_RELATED_BUILDS_CDASH')
    pr_env_var = get_env_var('SPACK_IS_PR_PIPELINE')

    gitlab_ci = None
    if 'gitlab-ci' in yaml_root:
@@ -291,11 +189,18 @@ def ci_rebuild(args):
    local_mirror_dir = os.path.join(ci_artifact_dir, 'local_mirror')
    build_cache_dir = os.path.join(local_mirror_dir, 'build_cache')

    spack_is_pr_pipeline = True if pr_env_var == 'True' else False

    enable_artifacts_mirror = False
    artifact_mirror_url = None
    if 'enable-artifacts-buildcache' in gitlab_ci:
        enable_artifacts_mirror = gitlab_ci['enable-artifacts-buildcache']
        if enable_artifacts_mirror or spack_is_pr_pipeline:
            # If this is a PR pipeline, we will override the setting to
            # make sure that artifacts buildcache is enabled. Otherwise
            # jobs will not have binary deps available since we do not
            # allow pushing binaries to remote mirror during PR pipelines
            enable_artifacts_mirror = True
            artifact_mirror_url = 'file://' + local_mirror_dir
            mirror_msg = 'artifact buildcache enabled, mirror url: {0}'.format(
                artifact_mirror_url)
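
The override above forces the per-pipeline artifacts buildcache on for
PR pipelines, since PR jobs never push binaries to the remote mirror. A
toy sketch of the decision (``should_enable_artifacts_mirror`` is an
illustrative name, not Spack's API):

```python
def should_enable_artifacts_mirror(gitlab_ci, is_pr_pipeline):
    """The configured setting is honored for non-PR pipelines, but PR
    pipelines always get the artifacts mirror so that jobs can still
    fetch the binary deps built earlier in the same pipeline."""
    enable = gitlab_ci.get('enable-artifacts-buildcache', False)
    return enable or is_pr_pipeline
```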
@@ -441,9 +346,12 @@ def ci_rebuild(args):
        spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)

        # 4) create buildcache on remote mirror, but not if this is
        # running to test a spack PR
        if not spack_is_pr_pipeline:
            spack_ci.push_mirror_contents(
                env, job_spec, job_spec_yaml_path, remote_mirror_url,
                cdash_build_id)

        # 5) create another copy of that buildcache on "local artifact
        # mirror" (only done if cdash reporting is enabled)
@@ -468,13 +376,6 @@ def ci_rebuild(args):
        job_spec, build_cache_dir, True, remote_mirror_url)

-def ci_start(args):
-    """Kicks off the CI process (currently just calls ci_generate() then
-    ci_pushyaml())"""
-    ci_generate(args)
-    ci_pushyaml(args)

def ci(parser, args):
    if args.func:
        args.func(args)


@@ -7,8 +7,6 @@
import os
import pytest

-import llnl.util.filesystem as fs

import spack
import spack.ci as ci
import spack.config
@@ -48,37 +46,6 @@ def env_deactivate():
    os.environ.pop('SPACK_ENV', None)

-def initialize_new_repo(repo_path, initial_commit=False):
-    if not os.path.exists(repo_path):
-        os.makedirs(repo_path)
-
-    with fs.working_dir(repo_path):
-        init_args = ['init', '.']
-        # if not initial_commit:
-        #     init_args.append('--bare')
-
-        git(*init_args)
-
-        if initial_commit:
-            readme_contents = "This is the project README\n"
-            readme_path = os.path.join(repo_path, 'README.md')
-            with open(readme_path, 'w') as fd:
-                fd.write(readme_contents)
-            git('add', '.')
-            git('commit', '-m', 'Project initial commit')
-
-def get_repo_status(repo_path):
-    with fs.working_dir(repo_path):
-        output = git('rev-parse', '--abbrev-ref', 'HEAD', output=str)
-        current_branch = output.split()[0]
-
-        output = git('rev-parse', 'HEAD', output=str)
-        current_sha = output.split()[0]
-
-        return current_branch, current_sha

def set_env_var(key, val):
    os.environ[key] = val
@@ -204,6 +171,144 @@ def test_ci_generate_with_env(tmpdir, mutable_mock_env_path, env_deactivate,
    assert(yaml_contents['stages'][5] == 'stage-rebuild-index')

def _validate_needs_graph(yaml_contents, needs_graph, artifacts):
    for job_name, job_def in yaml_contents.items():
        for needs_def_name, needs_list in needs_graph.items():
            if job_name.startswith(needs_def_name):
                # check job needs against the expected needs definition
                assert all([job_needs['job'][:job_needs['job'].index('/')]
                            in needs_list for job_needs in job_def['needs']])
                assert all([job_needs['artifacts'] == artifacts
                            for job_needs in job_def['needs']])
                break
def test_ci_generate_bootstrap_gcc(tmpdir, mutable_mock_env_path,
                                   env_deactivate, install_mockery,
                                   mock_packages):
    """Test that we can bootstrap a compiler and use it as the
    compiler for a spec in the environment"""
    filename = str(tmpdir.join('spack.yaml'))
    with open(filename, 'w') as f:
        f.write("""\
spack:
  definitions:
    - bootstrap:
        - gcc@3.0
  specs:
    - dyninst%gcc@3.0
  mirrors:
    some-mirror: https://my.fake.mirror
  gitlab-ci:
    bootstrap:
      - name: bootstrap
        compiler-agnostic: true
    mappings:
      - match:
          - arch=test-debian6-x86_64
        runner-attributes:
          tags:
            - donotcare
""")

    needs_graph = {
        '(bootstrap) conflict': [],
        '(bootstrap) gcc': [
            '(bootstrap) conflict',
        ],
        '(specs) libelf': [
            '(bootstrap) gcc',
        ],
        '(specs) libdwarf': [
            '(bootstrap) gcc',
            '(specs) libelf',
        ],
        '(specs) dyninst': [
            '(bootstrap) gcc',
            '(specs) libelf',
            '(specs) libdwarf',
        ],
    }

    with tmpdir.as_cwd():
        env_cmd('create', 'test', './spack.yaml')
        outputfile = str(tmpdir.join('.gitlab-ci.yml'))

        with ev.read('test'):
            ci_cmd('generate', '--output-file', outputfile)

            with open(outputfile) as f:
                contents = f.read()
                yaml_contents = syaml.load(contents)

                _validate_needs_graph(yaml_contents, needs_graph, False)
def test_ci_generate_bootstrap_artifacts_buildcache(tmpdir,
                                                    mutable_mock_env_path,
                                                    env_deactivate,
                                                    install_mockery,
                                                    mock_packages):
    """Test that we can bootstrap a compiler when artifacts buildcache
    is turned on"""
    filename = str(tmpdir.join('spack.yaml'))
    with open(filename, 'w') as f:
        f.write("""\
spack:
  definitions:
    - bootstrap:
      - gcc@3.0
  specs:
    - dyninst%gcc@3.0
  mirrors:
    some-mirror: https://my.fake.mirror
  gitlab-ci:
    bootstrap:
      - name: bootstrap
        compiler-agnostic: true
    mappings:
      - match:
          - arch=test-debian6-x86_64
        runner-attributes:
          tags:
            - donotcare
    enable-artifacts-buildcache: True
""")

    needs_graph = {
        '(bootstrap) conflict': [],
        '(bootstrap) gcc': [
            '(bootstrap) conflict',
        ],
        '(specs) libelf': [
            '(bootstrap) gcc',
            '(bootstrap) conflict',
        ],
        '(specs) libdwarf': [
            '(bootstrap) gcc',
            '(bootstrap) conflict',
            '(specs) libelf',
        ],
        '(specs) dyninst': [
            '(bootstrap) gcc',
            '(bootstrap) conflict',
            '(specs) libelf',
            '(specs) libdwarf',
        ],
    }

    with tmpdir.as_cwd():
        env_cmd('create', 'test', './spack.yaml')
        outputfile = str(tmpdir.join('.gitlab-ci.yml'))

        with ev.read('test'):
            ci_cmd('generate', '--output-file', outputfile)

        with open(outputfile) as f:
            contents = f.read()
            yaml_contents = syaml.load(contents)

            _validate_needs_graph(yaml_contents, needs_graph, True)
def test_ci_generate_with_env_missing_section(tmpdir, mutable_mock_env_path,
                                              env_deactivate, install_mockery,
                                              mock_packages):
@@ -282,6 +387,110 @@ def test_ci_generate_with_cdash_token(tmpdir, mutable_mock_env_path,
    assert(filecmp.cmp(orig_file, copy_to_file) is True)
def test_ci_generate_pkg_with_deps(tmpdir, mutable_mock_env_path,
                                   env_deactivate, install_mockery,
                                   mock_packages):
    """Test pipeline generation for a package w/ dependencies"""
    filename = str(tmpdir.join('spack.yaml'))
    with open(filename, 'w') as f:
        f.write("""\
spack:
  specs:
    - flatten-deps
  mirrors:
    some-mirror: https://my.fake.mirror
  gitlab-ci:
    enable-artifacts-buildcache: True
    mappings:
      - match:
          - flatten-deps
        runner-attributes:
          tags:
            - donotcare
      - match:
          - dependency-install
        runner-attributes:
          tags:
            - donotcare
""")

    with tmpdir.as_cwd():
        env_cmd('create', 'test', './spack.yaml')
        outputfile = str(tmpdir.join('.gitlab-ci.yml'))

        with ev.read('test'):
            ci_cmd('generate', '--output-file', outputfile)

        with open(outputfile) as f:
            contents = f.read()
            print('generated contents: ')
            print(contents)
            yaml_contents = syaml.load(contents)
            found = []
            for ci_key in yaml_contents.keys():
                ci_obj = yaml_contents[ci_key]
                if 'dependency-install' in ci_key:
                    assert('stage' in ci_obj)
                    assert(ci_obj['stage'] == 'stage-0')
                    found.append('dependency-install')
                if 'flatten-deps' in ci_key:
                    assert('stage' in ci_obj)
                    assert(ci_obj['stage'] == 'stage-1')
                    found.append('flatten-deps')

            assert('flatten-deps' in found)
            assert('dependency-install' in found)
def test_ci_generate_for_pr_pipeline(tmpdir, mutable_mock_env_path,
                                     env_deactivate, install_mockery,
                                     mock_packages):
    """Test that PR pipelines do not include a final stage job for
    rebuilding the mirror index, even if that job is specifically
    configured"""
    filename = str(tmpdir.join('spack.yaml'))
    with open(filename, 'w') as f:
        f.write("""\
spack:
  specs:
    - flatten-deps
  mirrors:
    some-mirror: https://my.fake.mirror
  gitlab-ci:
    enable-artifacts-buildcache: True
    mappings:
      - match:
          - flatten-deps
        runner-attributes:
          tags:
            - donotcare
      - match:
          - dependency-install
        runner-attributes:
          tags:
            - donotcare
    final-stage-rebuild-index:
      image: donotcare
      tags: [donotcare]
""")

    with tmpdir.as_cwd():
        env_cmd('create', 'test', './spack.yaml')
        outputfile = str(tmpdir.join('.gitlab-ci.yml'))

        with ev.read('test'):
            os.environ['SPACK_IS_PR_PIPELINE'] = 'True'
            ci_cmd('generate', '--output-file', outputfile)

        with open(outputfile) as f:
            contents = f.read()
            print('generated contents: ')
            print(contents)
            yaml_contents = syaml.load(contents)

            assert('rebuild-index' not in yaml_contents)
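The test above pins down the PR behavior from the commit message: when `SPACK_IS_PR_PIPELINE` is set, the generated pipeline omits the mirror index-rebuild job even if `final-stage-rebuild-index` is configured. A minimal sketch of that decision (the helper name and signature are hypothetical, not Spack's actual API):

```python
import os

def want_rebuild_index(gitlab_ci_config, environ=os.environ):
    """Decide whether the generated pipeline gets a final
    rebuild-index stage. PR pipelines never push binaries to the
    mirror, so the index job is skipped even when configured."""
    is_pr = environ.get('SPACK_IS_PR_PIPELINE', '').lower() == 'true'
    return 'final-stage-rebuild-index' in gitlab_ci_config and not is_pr

config = {'final-stage-rebuild-index': {'image': 'donotcare'}}
assert want_rebuild_index(config, environ={}) is True
assert want_rebuild_index(config,
                          environ={'SPACK_IS_PR_PIPELINE': 'True'}) is False
```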
def test_ci_generate_with_external_pkg(tmpdir, mutable_mock_env_path,
                                       env_deactivate, install_mockery,
                                       mock_packages):
@@ -458,49 +667,6 @@ def test_ci_rebuild_basic(tmpdir, mutable_mock_env_path, env_deactivate,
    print(rebuild_output)
def test_ci_pushyaml(tmpdir):
    fake_yaml_contents = """generate ci jobs:
  script:
    - "./share/spack/qa/gitlab/generate-gitlab-ci-yml.sh"
  tags:
    - "spack-pre-ci"
  artifacts:
    paths:
      - ci-generation
    when: always
"""
    local_repo_path = tmpdir.join('local_repo')
    initialize_new_repo(local_repo_path.strpath, True)

    remote_repo_path = tmpdir.join('remote_repo')
    initialize_new_repo(remote_repo_path.strpath)

    current_branch, current_sha = get_repo_status(local_repo_path.strpath)

    print('local repo info: {0}, {1}'.format(current_branch, current_sha))

    local_jobs_yaml = local_repo_path.join('.gitlab-ci.yml')
    with local_jobs_yaml.open('w') as f:
        f.write(fake_yaml_contents)

    pushyaml_args = [
        'pushyaml',
        '--downstream-repo', remote_repo_path.strpath,
        '--branch-name', current_branch,
        '--commit-sha', current_sha,
    ]

    with fs.working_dir(local_repo_path.strpath):
        ci_cmd(*pushyaml_args)

    with fs.working_dir(remote_repo_path.strpath):
        branch_to_checkout = 'multi-ci-{0}'.format(current_branch)
        git('checkout', branch_to_checkout)

        with open('.gitlab-ci.yml') as fd:
            pushed_contents = fd.read()

        assert pushed_contents == fake_yaml_contents
@pytest.mark.disable_clean_stage_check
@pytest.mark.skipif(not has_gpg(), reason='This test requires gpg')
def test_push_mirror_contents(tmpdir, mutable_mock_env_path, env_deactivate,


@@ -0,0 +1,20 @@
pr_pipeline:
  only:
    - external_pull_requests
  variables:
    SPACK_REPO: https://github.com/spack/spack.git
    SPACK_REF: ${CI_EXTERNAL_PULL_REQUEST_SOURCE_BRANCH_NAME}
    SPACK_IS_PR_PIPELINE: "True"
  trigger:
    project: spack/e4s
    strategy: depend

merge_pipeline:
  only:
    - develop
  variables:
    SPACK_REPO: https://github.com/spack/spack.git
    SPACK_REF: develop
  trigger:
    project: spack/e4s
    strategy: depend
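Read together, the two triggers above implement the flow from the commit message: a PR run passes the PR's source branch as `SPACK_REF` and sets `SPACK_IS_PR_PIPELINE` so the downstream pipeline only verifies that the packages build, while a merge to develop runs against plain `develop` and pushes the resulting binaries to the mirror. A sketch of how the triggered `spack/e4s` project might consume these variables (the job and script below are illustrative, not the actual e4s configuration):

```yaml
# Hypothetical generate job in the triggered spack/e4s project:
# check out the requested Spack ref, then generate the pipeline.
# With SPACK_IS_PR_PIPELINE set (by the pr_pipeline trigger above),
# `spack ci generate` omits the mirror index-rebuild job.
generate:
  script:
    - git clone ${SPACK_REPO} spack && cd spack
    - git checkout ${SPACK_REF}
    - . share/spack/setup-env.sh
    - spack ci generate --output-file pipeline.yml
```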


@@ -469,22 +469,14 @@ _spack_ci() {
    then
        SPACK_COMPREPLY="-h --help"
    else
        SPACK_COMPREPLY="start generate pushyaml rebuild"
        SPACK_COMPREPLY="generate rebuild"
    fi
}
_spack_ci_start() {
    SPACK_COMPREPLY="-h --help --output-file --copy-to --spack-repo --spack-ref --downstream-repo --branch-name --commit-sha"
}
_spack_ci_generate() {
    SPACK_COMPREPLY="-h --help --output-file --copy-to --spack-repo --spack-ref"
}
_spack_ci_pushyaml() {
    SPACK_COMPREPLY="-h --help --downstream-repo --branch-name --commit-sha"
}
_spack_ci_rebuild() {
    SPACK_COMPREPLY="-h --help"
}


@@ -14,6 +14,9 @@ class Gcc(Package):
    version('1.0', '0123456789abcdef0123456789abcdef')
    version('2.0', '2.0_a_hash')
    version('3.0', '3.0_a_hash')

    depends_on('conflict', when='@3.0')
    def install(self, spec, prefix):
        # Create the minimal compiler that will fool `spack compiler find`