Remove :<name> interpolation, add SPACK_VERSION variables

Also fix issues with documentation to reflect changes
Scott Wittenburg 2020-08-06 20:35:01 -06:00
parent bf90cdd6c7
commit 031490f6aa
3 changed files with 93 additions and 127 deletions

View File

@@ -223,10 +223,6 @@ takes a boolean and determines whether the pipeline uses artifacts to store and
 pass along the buildcaches from one stage to the next (the default if you don't
 provide this option is ``False``).
-The ``enable-debug-messages`` key takes a boolean
-and allows you to choose whether the pipeline build jobs are run as ``spack -d ci rebuild``
-or just ``spack ci rebuild`` (the default is not to enable debug messages).
 The
 ``final-stage-rebuild-index`` section controls whether an extra job is added to the
 end of your pipeline (in a stage by itself) which will regenerate the mirror's
@@ -281,20 +277,17 @@ as well as an ``entrypoint`` to override whatever the default for that image is)
 For other types of runners the ``variables`` key will be useful to pass any
 information on to the runner that it needs to do its work (e.g. scheduler
 parameters, etc.). Any ``variables`` provided here will be added, verbatim, to
-each job, unless the value takes on a special form. If the value has the form
-``$env:<var_value>``, then that variable will be inserted with ``var_value``
-first sampled from the environment during job generation. This can be useful
-for propagating generation-time variables which normally would not be available
-child build jobs.
+each job.
 
 The ``runner-attributes`` section also allows users to supply custom ``script``,
 ``before_script``, and ``after_script`` sections to be applied to every job
 scheduled on that runner. This allows users to do any custom preparation or
 cleanup tasks that fit their particular workflow, as well as completely
-customize the rebuilding of a spec if they so choose. We will never generate
+customize the rebuilding of a spec if they so choose. Spack will not generate
 a ``before_script`` or ``after_script`` for jobs, but if you do not provide
-a custom ``script``, we will generate one for you that invokes
-``spack ci rebuild``.
+a custom ``script``, spack will generate one for you that assumes your
+``spack.yaml`` is at the root of the repository, activates that environment for
+you, and invokes ``spack ci rebuild``.
 
 .. _staging_algorithm:
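The new wording above says that when no custom ``script`` is supplied, spack generates one that activates the environment at the repository root and runs ``spack ci rebuild``. A minimal sketch of what that amounts to, purely for illustration (this is not the literal script spack emits):

.. code-block:: bash

   # Illustrative sketch only -- not the literal generated script.
   cd "${CI_PROJECT_DIR}"                # spack.yaml is assumed to live here
   spack env activate --without-view .   # activate the environment at the repo root
   spack ci rebuild                      # rebuild the spec assigned to this job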
@@ -305,8 +298,8 @@ Summary of ``.gitlab-ci.yml`` generation algorithm
 All specs yielded by the matrix (or all the specs in the environment) have their
 dependencies computed, and the entire resulting set of specs are staged together
 before being run through the ``gitlab-ci/mappings`` entries, where each staged
-spec is assigned a runner. "Staging" is the name we have given to the process
-of figuring out in what order the specs should be built, taking into consideration
+spec is assigned a runner. "Staging" is the name given to the process of
+figuring out in what order the specs should be built, taking into consideration
 Gitlab CI rules about jobs/stages. In the staging process the goal is to maximize
 the number of jobs in any stage of the pipeline, while ensuring that the jobs in
 any stage only depend on jobs in previous stages (since those jobs are guaranteed
@@ -317,7 +310,7 @@ a runner, the ``.gitlab-ci.yml`` is written to disk.
 The short example provided above would result in the ``readline``, ``ncurses``,
 and ``pkgconf`` packages getting staged and built on the runner chosen by the
-``spack-k8s`` tag. In this example, we assume the runner is a Docker executor
+``spack-k8s`` tag. In this example, spack assumes the runner is a Docker executor
 type runner, and thus certain jobs will be run in the ``centos7`` container,
 and others in the ``ubuntu-18.04`` container. The resulting ``.gitlab-ci.yml``
 will contain 6 jobs in three stages. Once the jobs have been generated, the
@@ -376,12 +369,12 @@ Here's an example of what bootstrapping some compilers might look like:
      # mappings similar to the example higher up in this description
      ...
 
-In the example above, we have added a list to the ``definitions`` called
-``compiler-pkgs`` (you can add any number of these), which lists compiler packages
-we want to be staged ahead of the full matrix of release specs (which consists
-only of readline in our example). Then within the ``gitlab-ci`` section, we
-have added a ``bootstrap`` section, which can contain a list of items, each
-referring to a list in the ``definitions`` section. These items can either
+The example above adds a list to the ``definitions`` called ``compiler-pkgs``
+(you can add any number of these), which lists compiler packages that should
+be staged ahead of the full matrix of release specs (in this example, only
+readline). Then within the ``gitlab-ci`` section, note the addition of a
+``bootstrap`` section, which can contain a list of items, each referring to
+a list in the ``definitions`` section. These items can either
 be a dictionary or a string. If you supply a dictionary, it must have a name
 key whose value must match one of the lists in definitions and it can have a
 ``compiler-agnostic`` key whose value is a boolean. If you supply a string,
@@ -420,9 +413,9 @@ other reason you want to use a custom version of spack to run your pipelines,
 this section provides an example of how you could take advantage of
 user-provided pipeline scripts to accomplish this fairly simply. First, you
 could use the GitLab user interface to create CI environment variables
-containing the url and branch or tag you want to clone (calling them, for
-example, ``SPACK_REPO`` and ``SPACK_REF``), then use those in a custom shell
-script invoked both from your pipeline generation job, as well in your rebuild
+containing the url and branch or tag you want to use (calling them, for
+example, ``SPACK_REPO`` and ``SPACK_REF``), then refer to those in a custom shell
+script invoked both from your pipeline generation job, as well as in your rebuild
 jobs. Here's the ``generate-pipeline`` job from the top of this document,
 updated to invoke a custom shell script that will clone and source a custom
 spack:
@@ -444,7 +437,7 @@ spack:
       paths:
         - "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
 
-And that script could contain:
+And the ``cloneSpack.sh`` script could contain:
 
 .. code-block:: bash
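The body of ``cloneSpack.sh`` is not changed by this commit, so it does not appear in the hunk above. For orientation, a minimal sketch of such a script, assuming only the ``SPACK_REPO`` and ``SPACK_REF`` CI variables described earlier (the script actually shipped in the docs may differ):

.. code-block:: bash

   #!/bin/bash
   # Hedged sketch of a cloneSpack.sh helper -- not the exact script from the docs.
   git clone ${SPACK_REPO}                  # SPACK_REPO set in the GitLab UI
   pushd ./spack
   git checkout ${SPACK_REF}                # SPACK_REF set in the GitLab UI
   popd
   . "./spack/share/spack/setup-env.sh"     # make the spack command available
   spack --version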
@@ -482,84 +475,30 @@ of spack, so you would update your ``spack.yaml`` from above as follows:
           after_script:
             - rm -rf ./spack
 
-Now all the generated rebuild jobs will use the same shell script to clone
-spack before running their actual workload. Here we have also provided a
-custom ``script`` because we want to run ``spack ci rebuild`` in debug mode
-to get more information when builds fail.
+Now all of the generated rebuild jobs will use the same shell script to clone
+spack before running their actual workload. Note in the above example the
+provision of a custom ``script`` section. The reason for this is to run
+``spack ci rebuild`` in debug mode to get more information when builds fail.
 
 Now imagine you have long pipelines with many specs to be built, and you
-are worried about the branch of spack you're testing changing somewhere
-in the middle of the pipeline, resulting in half the jobs getting run
-with a different version of spack. In this situation, you can take
-advantage of "generation-time variable interpolation" to avoid this issue.
+are pointing to a spack repository and branch that has a tendency to change
+frequently, such as the main repo and its ``develop`` branch. If each child
+job checks out the ``develop`` branch, that could result in some jobs running
+with one SHA of spack, while later jobs run with another. To help avoid this
+issue, the pipeline generation process saves global variables called
+``SPACK_VERSION`` and ``SPACK_CHECKOUT_VERSION`` that capture the version
+of spack used to generate the pipeline. While the ``SPACK_VERSION`` variable
+simply contains the human-readable value produced by ``spack -V`` at pipeline
+generation time, the ``SPACK_CHECKOUT_VERSION`` variable can be used in a
+``git checkout`` command to make sure all child jobs checkout the same version
+of spack used to generate the pipeline. To take advantage of this, you could
+simply replace ``git checkout ${SPACK_REF}`` in the example ``cloneSpack.sh``
+script above with ``git checkout ${SPACK_CHECKOUT_VERSION}``.
 
-The basic idea is that before you run ``spack ci generate`` to generate
-your pipeline, you first capture the SHA of spack on the branch you want
-to test. Then in your ``spack.yaml`` you use the ``variables`` section
-of your ``runner-attributes`` to set the SHA as a variable (interpolated
-at pipeline generation time) which will be available to your generated child
-jobs when they run. Below we show a ``.gitlab-ci.yml`` for your environment
-repo, similar to the one above, but with the script elements right inline:
-
-.. code-block:: yaml
-
-   generate-pipeline:
-     tags:
-       - <some-other-tag>
-     before_script:
-       - git clone ${SPACK_REPO} --branch ${SPACK_REF}
-       - pushd ./spack && export CURRENT_SPACK_SHA=$(git rev-parse HEAD) && popd
-       - . "./spack/share/spack/setup-env.sh"
-       - spack --version
-     script:
-       - spack env activate --without-view .
-       - spack ci generate
-         --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
-     after_script:
-       - rm -rf ./spack
-     artifacts:
-       paths:
-         - "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
-
-Above we have just set an environment variable, ``CURRENT_SPACK_SHA``, which
-will be available when the pipeline generation job is running, but otherwise
-would not be available when the child jobs run. To make this available to
-those jobs, we can do something like this in our ``spack.yaml``:
-
-.. code-block:: yaml
-
-   spack:
-     ...
-     gitlab-ci:
-       mappings:
-         - match:
-             - os=ubuntu18.04
-           runner-attributes:
-             tags:
-               - spack-kube
-             image: spack/ubuntu-bionic
-             variables:
-               CAPTURED_SPACK_SHA: $env:CURRENT_SPACK_SHA
-             before_script:
-               - git clone ${SPACK_REPO}
-               - pushd ./spack && git checkout ${CAPTURED_SPACK_SHA} && popd
-               - . "./spack/share/spack/setup-env.sh"
-               - spack --version
-             script:
-               - spack env activate --without-view .
-               - spack -d ci rebuild
-             after_script:
-               - rm -rf ./spack
-
-The behavior of pipeline generation is just to copy your ``script`` elements
-verbatim into the job, leaving shell variable interopolation until the time
-jobs run. In this case, we have just used the ``$env:CURRENT_SPACK_SHA``
-syntax to interpolate that variable at job generation time, and put it in
-the generated yaml of the job as a variable we can get when the child job
-runs (``CAPTURED_SPACK_SHA``). Those variables can actually have the same name,
-but to avoid confusion, we've made them distinct here. Now all your child
-jobs will clone the same SHA of spack used to generate the pipeline, no matter
-whether more commits are pushed to the branch during that time.
+On the other hand, if you're pointing to a spack repository and branch under your
+control, there may be no benefit in using the captured ``SPACK_CHECKOUT_VERSION``,
+and you can instead just clone using the project CI variables you set (in the
+earlier example these were ``SPACK_REPO`` and ``SPACK_REF``).
 
 .. _ci_environment_variables:
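The new text above recommends swapping ``git checkout ${SPACK_REF}`` for ``git checkout ${SPACK_CHECKOUT_VERSION}`` in ``cloneSpack.sh`` so that child jobs pin to the exact spack used at generation time. Under the same assumptions as the earlier sketch, the updated script could look like:

.. code-block:: bash

   #!/bin/bash
   # Sketch only: pin child jobs to the spack that generated the pipeline.
   git clone ${SPACK_REPO}
   pushd ./spack
   # SPACK_CHECKOUT_VERSION is written into the generated .gitlab-ci.yml as a
   # global variable, e.g. a release tag such as v0.15.3 or a commit SHA.
   git checkout ${SPACK_CHECKOUT_VERSION}
   popd
   . "./spack/share/spack/setup-env.sh"
   spack --version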

View File

@@ -28,7 +28,7 @@
 import spack.environment as ev
 from spack.error import SpackError
 import spack.hash_types as ht
-from spack.main import SpackCommand
+import spack.main
 import spack.repo
 from spack.spec import Spec
 import spack.util.spack_yaml as syaml
@@ -39,10 +39,8 @@
     'always',
 ]
 
-spack_gpg = SpackCommand('gpg')
-spack_compiler = SpackCommand('compiler')
-runner_var_regex = re.compile('\\$env:(.+)$')
+spack_gpg = spack.main.SpackCommand('gpg')
+spack_compiler = spack.main.SpackCommand('compiler')
 
 
 class TemporaryDirectory(object):
@@ -619,13 +617,6 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
             variables = {}
             if 'variables' in runner_attribs:
                 variables.update(runner_attribs['variables'])
-            for name, value in variables.items():
-                m = runner_var_regex.search(value)
-                if m:
-                    env_var = m.group(1)
-                    interp_value = os.environ.get(env_var, None)
-                    if interp_value:
-                        variables[name] = interp_value
 
             image_name = None
             image_entry = None
@@ -830,6 +821,26 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
     output_object['stages'] = stage_names
 
+    # Capture the version of spack used to generate the pipeline, transform it
+    # into a value that can be passed to "git checkout", and save it in a
+    # global yaml variable
+    spack_version = spack.main.get_version()
+    version_to_clone = None
+    v_match = re.match(r"^\d+\.\d+\.\d+$", spack_version)
+    if v_match:
+        version_to_clone = 'v{0}'.format(v_match.group(0))
+    else:
+        v_match = re.match(r"^[^-]+-[^-]+-([a-f\d]+)$", spack_version)
+        if v_match:
+            version_to_clone = v_match.group(1)
+        else:
+            version_to_clone = spack_version
+
+    output_object['variables'] = {
+        'SPACK_VERSION': spack_version,
+        'SPACK_CHECKOUT_VERSION': version_to_clone,
+    }
+
     sorted_output = {}
     for output_key, output_value in sorted(output_object.items()):
         sorted_output[output_key] = output_value
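The logic added above turns the output of ``spack -V`` into something ``git checkout`` can consume; the concrete expectations appear in the tests further down in this commit. As a quick illustration of how a child job would consume the result (a sketch, not spack output):

.. code-block:: bash

   # Mapping taken from the test expectations in this commit:
   #   spack -V == "0.15.3"               =>  SPACK_CHECKOUT_VERSION == "v0.15.3"
   #   spack -V == "0.15.3-416-12ad69eb1" =>  SPACK_CHECKOUT_VERSION == "12ad69eb1"
   # A child job's clone script can then simply do:
   git -C ./spack checkout "${SPACK_CHECKOUT_VERSION}"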

View File

@@ -14,7 +14,7 @@
 import spack.config
 import spack.environment as ev
 import spack.hash_types as ht
-from spack.main import SpackCommand
+import spack.main
 import spack.paths as spack_paths
 import spack.repo as repo
 from spack.schema.buildcache_spec import schema as spec_yaml_schema
@@ -26,12 +26,12 @@
 import spack.util.gpg
 
-ci_cmd = SpackCommand('ci')
-env_cmd = SpackCommand('env')
-mirror_cmd = SpackCommand('mirror')
-gpg_cmd = SpackCommand('gpg')
-install_cmd = SpackCommand('install')
-buildcache_cmd = SpackCommand('buildcache')
+ci_cmd = spack.main.SpackCommand('ci')
+env_cmd = spack.main.SpackCommand('env')
+mirror_cmd = spack.main.SpackCommand('mirror')
+gpg_cmd = spack.main.SpackCommand('gpg')
+install_cmd = spack.main.SpackCommand('install')
+buildcache_cmd = spack.main.SpackCommand('buildcache')
 
 git = exe.which('git', required=True)
@@ -390,10 +390,10 @@ def test_ci_generate_with_cdash_token(tmpdir, mutable_mock_env_path,
     assert(filecmp.cmp(orig_file, copy_to_file) is True)
 
 
-def test_ci_generate_with_script_and_variables(tmpdir, mutable_mock_env_path,
-                                               env_deactivate, install_mockery,
-                                               mock_packages):
-    """Make sure we it doesn't break if we configure cdash"""
+def test_ci_generate_with_custom_scripts(tmpdir, mutable_mock_env_path,
+                                         env_deactivate, install_mockery,
+                                         mock_packages, monkeypatch):
+    """Test use of user-provided scripts"""
     filename = str(tmpdir.join('spack.yaml'))
     with open(filename, 'w') as f:
         f.write("""\
@@ -410,7 +410,7 @@ def test_ci_generate_with_script_and_variables(tmpdir, mutable_mock_env_path,
           tags:
             - donotcare
           variables:
-            ONE: $env:INTERP_ON_GENERATE
+            ONE: plain-string-value
             TWO: ${INTERP_ON_BUILD}
           before_script:
             - mkdir /some/path
@@ -430,7 +430,7 @@ def test_ci_generate_with_script_and_variables(tmpdir, mutable_mock_env_path,
     outputfile = str(tmpdir.join('.gitlab-ci.yml'))
 
     with ev.read('test'):
-        os.environ['INTERP_ON_GENERATE'] = 'success'
+        monkeypatch.setattr(spack.main, 'get_version', lambda: '0.15.3')
         ci_cmd('generate', '--output-file', outputfile)
 
     with open(outputfile) as f:
@@ -439,6 +439,13 @@ def test_ci_generate_with_script_and_variables(tmpdir, mutable_mock_env_path,
     found_it = False
 
+    assert('variables' in yaml_contents)
+    global_vars = yaml_contents['variables']
+    assert('SPACK_VERSION' in global_vars)
+    assert(global_vars['SPACK_VERSION'] == '0.15.3')
+    assert('SPACK_CHECKOUT_VERSION' in global_vars)
+    assert(global_vars['SPACK_CHECKOUT_VERSION'] == 'v0.15.3')
+
     for ci_key in yaml_contents.keys():
         ci_obj = yaml_contents[ci_key]
         if 'archive-files' in ci_key:
@@ -446,7 +453,7 @@ def test_ci_generate_with_script_and_variables(tmpdir, mutable_mock_env_path,
             assert('variables' in ci_obj)
             var_d = ci_obj['variables']
             assert('ONE' in var_d)
-            assert(var_d['ONE'] == 'success')
+            assert(var_d['ONE'] == 'plain-string-value')
             assert('TWO' in var_d)
             assert(var_d['TWO'] == '${INTERP_ON_BUILD}')
@@ -782,7 +789,7 @@ def test_push_mirror_contents(tmpdir, mutable_mock_env_path, env_deactivate,
 def test_ci_generate_override_runner_attrs(tmpdir, mutable_mock_env_path,
                                            env_deactivate, install_mockery,
-                                           mock_packages):
+                                           mock_packages, monkeypatch):
     """Test that we get the behavior we want with respect to the provision
     of runner attributes like tags, variables, and scripts, both when we
     inherit them from the top level, as well as when we override one or
@@ -844,6 +851,8 @@ def test_ci_generate_override_runner_attrs(tmpdir, mutable_mock_env_path,
     outputfile = str(tmpdir.join('.gitlab-ci.yml'))
 
     with ev.read('test'):
+        monkeypatch.setattr(
+            spack.main, 'get_version', lambda: '0.15.3-416-12ad69eb1')
         ci_cmd('generate', '--output-file', outputfile)
 
     with open(outputfile) as f:
@@ -852,6 +861,13 @@ def test_ci_generate_override_runner_attrs(tmpdir, mutable_mock_env_path,
     print(contents)
     yaml_contents = syaml.load(contents)
 
+    assert('variables' in yaml_contents)
+    global_vars = yaml_contents['variables']
+    assert('SPACK_VERSION' in global_vars)
+    assert(global_vars['SPACK_VERSION'] == '0.15.3-416-12ad69eb1')
+    assert('SPACK_CHECKOUT_VERSION' in global_vars)
+    assert(global_vars['SPACK_CHECKOUT_VERSION'] == '12ad69eb1')
+
     for ci_key in yaml_contents.keys():
         if '(specs) b' in ci_key:
             print('Should not have staged "b" w/out a match')