# Compare commits

44 commits: `develop-20...` → `releases/v...`
Commits (SHA1; author and date columns were empty in the source):

- 13e6f87ef6
- 45183d2b49
- 22a7f98141
- 1bbf2fa93e
- 67f2d64a3f
- 7405d18e98
- bff1de69a5
- 44f360556d
- 96b68a210f
- d3ee0b9c07
- ab1e04f1d0
- fbc2e59221
- 4f40c9aab9
- 618161075d
- 21fa564df8
- b3e9abc72a
- 488a513109
- 53cb6312a8
- 7371fef3ca
- 60d9e12594
- c4e492202d
- 2742e3d332
- 54f1ba9c3b
- a78944c9eb
- 6a4d1af5be
- 6baf171700
- 891bb7131b
- 3876367a75
- 19e4649979
- 3dd7619f73
- 2a7a040327
- 83bf44f2fe
- c527b43d18
- 4866c587e6
- c09bf37ff6
- deeebc4593
- eef202ea85
- 4c6564f10a
- 82919cb6a5
- 844c799299
- 9198ab63ae
- 8f9bc5bba4
- ca0c968639
- d99a1b1047
**.github/workflows/unit_tests.yaml** (vendored, 2 changed lines)

```diff
@@ -39,7 +39,7 @@ jobs:
         python-version: '3.10'
     - name: Install Python packages
       run: |
-        pip install --upgrade pip six setuptools types-six
+        pip install --upgrade pip six setuptools==62.3.4 types-six
     - name: Setup git configuration
       run: |
         # Need this for the git tests to succeed.
```
**.github/workflows/windows_python.yml** (vendored, 2 changed lines)

```diff
@@ -41,7 +41,7 @@ jobs:
         python-version: 3.9
     - name: Install Python packages
       run: |
-        python -m pip install --upgrade pip six setuptools flake8 isort>=4.3.5 mypy>=0.800 black pywin32 types-python-dateutil
+        python -m pip install --upgrade pip six setuptools==62.3.4 flake8 isort>=4.3.5 mypy>=0.800 black pywin32 types-python-dateutil
     - name: Create local develop
       run: |
         .\spack\.github\workflows\setup_git.ps1
```
**CHANGELOG.md** (222 changed lines; the first hunk adds the new release notes at the top)

@@ -1,3 +1,223 @@

# v0.18.1 (2022-07-19)

### Spack Bugfixes

* Fix several bugs related to bootstrapping (#30834, #31042, #31180)
* Fix a regression that was causing spec hashes to differ between
  Python 2 and Python 3 (#31092)
* Fixed compiler flags for oneAPI and DPC++ (#30856)
* Fixed several issues related to concretization (#31142, #31153, #31170, #31226)
* Improved support for the Cray manifest file and `spack external find`
  (#31144, #31201, #31173, #31186)
* Assign a version to openSUSE Tumbleweed according to the GLIBC version
  in the system (#19895)
* Improved Dockerfile generation for `spack containerize` (#29741, #31321)
* Fixed a few bugs related to concurrent execution of commands (#31509, #31493, #31477)

### Package updates

* WarpX: add v22.06, fixed libs property (#30866, #31102)
* openPMD: add v0.14.5, update recipe for @develop (#29484, #31023)

# v0.18.0 (2022-05-28)

`v0.18.0` is a major feature release.

## Major features in this release

1. **Concretizer now reuses by default**

   `spack install --reuse` was introduced in `v0.17.0`, and `--reuse`
   is now the default concretization mode. Spack will try hard to
   resolve dependencies using installed packages or binaries (#30396).

   To avoid reuse and to use the latest package configurations (the
   old default), you can use `spack install --fresh`, or add
   configuration like this to your environment or `concretizer.yaml`:

   ```yaml
   concretizer:
     reuse: false
   ```

2. **Finer-grained hashes**

   Spack hashes now include `link`, `run`, *and* `build` dependencies,
   as well as a canonical hash of package recipes. Previously, hashes
   only included `link` and `run` dependencies (though `build`
   dependencies were stored by environments). We coarsened the hash to
   reduce churn in user installations, but the new default concretizer
   behavior mitigates this concern and gets us reuse *and* provenance.
   You will be able to see the build dependencies of new installations
   with `spack find`. Old installations will not change and their
   hashes will not be affected. (#28156, #28504, #30717, #30861)

3. **Improved error messages**

   Error handling with the new concretizer is now done with
   optimization criteria rather than with unsatisfiable cores, and
   Spack reports many more details about conflicting constraints.
   (#30669)

4. **Unify environments when possible**

   Environments have thus far supported `concretization: together` or
   `concretization: separately`. These have been replaced by a new
   preference in `concretizer.yaml`:

   ```yaml
   concretizer:
     unify: [true|false|when_possible]
   ```

   `concretizer:unify:when_possible` will *try* to resolve a fully
   unified environment, but if it cannot, it will create multiple
   configurations of some packages where it has to. For large
   environments that previously had to be concretized separately, this
   can result in a huge speedup (40-50x). (#28941)

5. **Automatically find externals on Cray machines**

   Spack can now automatically discover installed packages in the Cray
   Programming Environment by running `spack external find` (or `spack
   external read-cray-manifest` to *only* query the PE). Packages from
   the PE (e.g., `cray-mpich`) are added to the database with full
   dependency information, and compilers from the PE are added to
   `compilers.yaml`. Available with the June 2022 release of the Cray
   Programming Environment. (#24894, #30428)

6. **New binary format and hardened signing**

   Spack now has an updated binary format, with improvements for
   security. The new format has a detached signature file, and Spack
   verifies the signature before untarring or decompressing the binary
   package. The previous format embedded the signature in a `tar`
   file, which required the client to run `tar` *before* verifying
   (#30750). Spack can still install from build caches using the old
   format, but we encourage users to switch to the new format going
   forward.

   Production GitLab pipelines have been hardened to securely sign
   binaries. There is now a separate signing stage so that signing
   keys are never exposed to build system code, and signing keys are
   ephemeral and only live as long as the signing pipeline stage.
   (#30753)

7. **Bootstrap mirror generation**

   The `spack bootstrap mirror` command can automatically create a
   mirror for bootstrapping the concretizer and other needed
   dependencies in an air-gapped environment. (#28556)

8. **Nascent Windows support**

   Spack now has initial support for Windows. Spack core has been
   refactored to run in the Windows environment, and a small number of
   packages can now build for Windows. More details are
   [in the documentation](https://spack.rtfd.io/en/latest/getting_started.html#spack-on-windows)
   (#27021, #28385, many more)

9. **Makefile generation**

   `spack env depfile` can be used to generate a `Makefile` from an
   environment, which can be used to build the packages in the
   environment in parallel on a single node, e.g.:

   ```console
   spack -e myenv env depfile > Makefile
   make
   ```

   Spack propagates `gmake` jobserver information to builds so that
   their jobs can share cores. (#30039, #30254, #30302, #30526)

10. **New variant features**

    In addition to being conditional themselves, variants can now have
    [conditional *values*](https://spack.readthedocs.io/en/latest/packaging_guide.html#conditional-possible-values)
    that are only possible for certain configurations of a package. (#29530)

    Variants can be
    [declared "sticky"](https://spack.readthedocs.io/en/latest/packaging_guide.html#sticky-variants),
    which prevents them from being enabled or disabled by the
    concretizer. Sticky variants must be set explicitly by users
    on the command line or in `packages.yaml`. (#28630)

## Other new features of note

* Environment views can optionally link only `run` dependencies
  with `link:run` (#29336)
* `spack external find --all` finds library-only packages in
  addition to build dependencies (#28005)
* Customizable `config:license_dir` option (#30135)
* `spack external find --path PATH` takes a custom search path (#30479)
* `spack spec` has a new `--format` argument like `spack find` (#27908)
* `spack concretize --quiet` skips printing concretized specs (#30272)
* `spack info` now has cleaner output and displays test info (#22097)
* Package-level submodule option for git commit versions (#30085, #30037)
* Using `/hash` syntax to refer to concrete specs in an environment
  now works even if `/hash` is not installed. (#30276)

## Major internal refactors

* full hash (see above)
* new develop versioning scheme `0.19.0-dev0`
* Allow for multiple dependencies/dependents from the same package (#28673)
* Splice differing virtual packages (#27919)

## Performance Improvements

* Concretization of large environments with `unify: when_possible` is
  much faster than concretizing separately (#28941, see above)
* Single-pass view generation algorithm is 2.6x faster (#29443)

## Archspec improvements

* `oneapi` and `dpcpp` flag support (#30783)
* better support for `M1` and `a64fx` (#30683)

## Removals and Deprecations

* Spack no longer supports Python `2.6` (#27256)
* Removed deprecated `--run-tests` option of `spack install`;
  use `spack test` (#30461)
* Removed deprecated `spack flake8`; use `spack style` (#27290)
* Deprecate `spack:concretization` config option; use
  `concretizer:unify` (#30038)
* Deprecate top-level module configuration; use module sets (#28659)
* `spack activate` and `spack deactivate` are deprecated in favor of
  environments; they will be removed in `0.19.0` (#29430; see also `link:run`
  in #29336 above)

## Notable Bugfixes

* Fix bug that broke locks with many parallel builds (#27846)
* Many bugfixes and consistency improvements for the new concretizer
  and `--reuse` (#30357, #30092, #29835, #29933, #28605, #29694, #28848)

## Packages

* `CMakePackage` uses `CMAKE_INSTALL_RPATH_USE_LINK_PATH` (#29703)
* Refactored `lua` support: the `lua-lang` virtual supports both
  `lua` and `luajit` via the new `LuaPackage` build system (#28854)
* `PythonPackage`: now installs packages with `pip` (#27798)
* Python: improve `site_packages_dir` handling (#28346)
* Extends: support spec, not just package name (#27754)
* `find_libraries`: search for both `.so` and `.dylib` on macOS (#28924)
* Use stable URLs and `?full_index=1` for all GitHub patches (#29239)

## Spack community stats

* 6,416 total packages, 458 new since `v0.17.0`
  * 219 new Python packages
  * 60 new R packages
* 377 people contributed to this release
  * 337 committers to packages
  * 85 committers to core

# v0.17.2 (2022-04-13)

### Spack bugfixes

@@ -11,7 +231,7 @@

* Fixed a few bugs affecting the spack ci command (#29518, #29419)
* Fix handling of Intel compiler environment (#29439)
* Fix a few edge cases when reindexing the DB (#28764)
* Remove "Known issues" from documentation (#29664)
* Other miscellaneous bugfixes (0b72e070583fc5bcd016f5adc8a84c99f2b7805f, #28403, #29261)

# v0.17.1 (2021-12-23)
A documentation file on build caches (file name not shown in the source) gains a warning:

```diff
@@ -50,6 +50,13 @@ build cache files for the "ninja" spec:
 Note that the targeted spec must already be installed. Once you have a build cache,
 you can add it as a mirror, discussed next.
 
+.. warning::
+
+   Spack improved the format used for binary caches in v0.18. The entire v0.18 series
+   will be able to verify and install binary caches both in the new and in the old format.
+   Support for using the old format is expected to end in v0.19, so we advise users to
+   recreate relevant buildcaches using Spack v0.18 or higher.
+
 ---------------------------------------
 Finding or installing build cache files
 ---------------------------------------
```
The environments documentation corrects the mapping between the old and new options:

```diff
@@ -545,8 +545,8 @@ environment and have a single view of it in the filesystem.
 
 The ``concretizer:unify`` config option was introduced in Spack 0.18 to
 replace the ``concretization`` property. For reference,
-``concretization: separately`` is replaced by ``concretizer:unify:true``,
-and ``concretization: together`` is replaced by ``concretizer:unify:false``.
+``concretization: together`` is replaced by ``concretizer:unify:true``,
+and ``concretization: separately`` is replaced by ``concretizer:unify:false``.
 
 .. admonition:: Re-concretization of user specs
```
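The corrected mapping is easier to see with a concrete before/after. A sketch (the `spack.yaml` skeleton and the spec list here are made up):

```yaml
# spack.yaml (hypothetical environment)
spack:
  specs: [zlib, hdf5]
  # Deprecated spelling, pre-0.18:
  # concretization: together
  # Equivalent spelling since 0.18:
  concretizer:
    unify: true
```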
```diff
@@ -799,7 +799,7 @@ directories.
       select: [^mpi]
       exclude: ['%pgi@18.5']
       projections:
-        all: {name}/{version}-{compiler.name}
+        all: '{name}/{version}-{compiler.name}'
       link: all
       link_type: symlink
```
```diff
@@ -1051,4 +1051,4 @@ the include is conditional.
 the ``--make-target-prefix`` flag and use the non-phony targets
 ``<target-prefix>/env`` and ``<target-prefix>/fetch`` as
 prerequisites, instead of the phony targets ``<target-prefix>/all``
-and ``<target-prefix>/fetch-all`` respectively.
+and ``<target-prefix>/fetch-all`` respectively.
```
**lib/spack/env/cc** (vendored, 26 changed lines)

```diff
@@ -401,8 +401,7 @@ input_command="$*"
 # command line and recombine them with Spack arguments later. We
 # parse these out so that we can make sure that system paths come
 # last, that package arguments come first, and that Spack arguments
-# are injected properly. Based on configuration, we also strip -Werror
-# arguments.
+# are injected properly.
 #
 # All other arguments, including -l arguments, are treated as
 # 'other_args' and left in their original order. This ensures that
@@ -441,29 +440,6 @@ while [ $# -ne 0 ]; do
         continue
     fi
 
-    if [ -n "${SPACK_COMPILER_FLAGS_KEEP}" ] ; then
-        # NOTE: the eval is required to allow `|` alternatives inside the variable
-        eval "\
-        case '$1' in
-            $SPACK_COMPILER_FLAGS_KEEP)
-                append other_args_list "$1"
-                shift
-                continue
-                ;;
-        esac
-        "
-    fi
-    if [ -n "${SPACK_COMPILER_FLAGS_REMOVE}" ] ; then
-        eval "\
-        case '$1' in
-            $SPACK_COMPILER_FLAGS_REMOVE)
-                shift
-                continue
-                ;;
-        esac
-        "
-    fi
-
     case "$1" in
         -isystem*)
             arg="${1#-isystem}"
```
**lib/spack/external/__init__.py** (vendored, 2 changed lines)

```diff
@@ -18,7 +18,7 @@
 
 * Homepage: https://pypi.python.org/pypi/archspec
 * Usage: Labeling, comparison and detection of microarchitectures
-* Version: 0.1.4 (commit 53fc4ac91e9b4c5e4079f15772503a80bece72ad)
+* Version: 0.1.4 (commit b8eea9df2b4204ff27d204452cd46f5199a0b423)
 
 argparse
 --------
```

The bundled archspec microarchitecture data changes along with it (file name not shown in the source):

```diff
@@ -85,21 +85,21 @@
     "intel": [
       {
         "versions": ":",
-        "name": "pentium4",
+        "name": "x86-64",
         "flags": "-march={name} -mtune=generic"
       }
     ],
     "oneapi": [
       {
         "versions": ":",
-        "name": "pentium4",
+        "name": "x86-64",
         "flags": "-march={name} -mtune=generic"
       }
     ],
     "dpcpp": [
       {
         "versions": ":",
-        "name": "pentium4",
+        "name": "x86-64",
         "flags": "-march={name} -mtune=generic"
       }
     ]
@@ -143,6 +143,20 @@
         "name": "x86-64",
         "flags": "-march={name} -mtune=generic -mcx16 -msahf -mpopcnt -msse3 -msse4.1 -msse4.2 -mssse3"
       }
     ],
+    "oneapi": [
+      {
+        "versions": "2021.2.0:",
+        "name": "x86-64-v2",
+        "flags": "-march={name} -mtune=generic"
+      }
+    ],
+    "dpcpp": [
+      {
+        "versions": "2021.2.0:",
+        "name": "x86-64-v2",
+        "flags": "-march={name} -mtune=generic"
+      }
+    ]
   }
 },
@@ -200,6 +214,20 @@
         "name": "x86-64",
         "flags": "-march={name} -mtune=generic -mcx16 -msahf -mpopcnt -msse3 -msse4.1 -msse4.2 -mssse3 -mavx -mavx2 -mbmi -mbmi2 -mf16c -mfma -mlzcnt -mmovbe -mxsave"
       }
     ],
+    "oneapi": [
+      {
+        "versions": "2021.2.0:",
+        "name": "x86-64-v3",
+        "flags": "-march={name} -mtune=generic"
+      }
+    ],
+    "dpcpp": [
+      {
+        "versions": "2021.2.0:",
+        "name": "x86-64-v3",
+        "flags": "-march={name} -mtune=generic"
+      }
+    ]
   }
 },
@@ -262,6 +290,20 @@
         "name": "x86-64",
         "flags": "-march={name} -mtune=generic -mcx16 -msahf -mpopcnt -msse3 -msse4.1 -msse4.2 -mssse3 -mavx -mavx2 -mbmi -mbmi2 -mf16c -mfma -mlzcnt -mmovbe -mxsave -mavx512f -mavx512bw -mavx512cd -mavx512dq -mavx512vl"
       }
     ],
+    "oneapi": [
+      {
+        "versions": "2021.2.0:",
+        "name": "x86-64-v4",
+        "flags": "-march={name} -mtune=generic"
+      }
+    ],
+    "dpcpp": [
+      {
+        "versions": "2021.2.0:",
+        "name": "x86-64-v4",
+        "flags": "-march={name} -mtune=generic"
+      }
+    ]
   }
 },
@@ -302,22 +344,19 @@
     "intel": [
       {
         "versions": "16.0:",
         "name": "pentium4",
-        "flags": "-march={name} -mtune=generic"
+        "flags": "-march={name} -mtune={name}"
       }
     ],
     "oneapi": [
       {
         "versions": ":",
         "name": "pentium4",
-        "flags": "-march={name} -mtune=generic"
+        "flags": "-march={name} -mtune={name}"
       }
     ],
     "dpcpp": [
       {
         "versions": ":",
         "name": "pentium4",
-        "flags": "-march={name} -mtune=generic"
+        "flags": "-march={name} -mtune={name}"
       }
     ]
   }
```
|
@@ -4,7 +4,7 @@
|
||||
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
||||
|
||||
#: (major, minor, micro, dev release) tuple
|
||||
spack_version_info = (0, 18, 0, 'dev0')
|
||||
spack_version_info = (0, 18, 1)
|
||||
|
||||
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
|
||||
spack_version = '.'.join(str(s) for s in spack_version_info)
|
||||
|
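The derived `spack_version` string can be checked in isolation; this is a standalone sketch, not Spack's actual module:

```python
# Sketch: how spack_version is derived from the info tuple after the bump.
spack_version_info = (0, 18, 1)
spack_version = '.'.join(str(s) for s in spack_version_info)
print(spack_version)  # -> 0.18.1
```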
```diff
@@ -298,7 +298,8 @@ def _check_build_test_callbacks(pkgs, error_cls):
 def _check_patch_urls(pkgs, error_cls):
     """Ensure that patches fetched from GitHub have stable sha256 hashes."""
     github_patch_url_re = (
-        r"^https?://github\.com/.+/.+/(?:commit|pull)/[a-fA-F0-9]*.(?:patch|diff)"
+        r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
+        ".+/.+/(?:commit|pull)/[a-fA-F0-9]*.(?:patch|diff)"
     )
 
     errors = []
```
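The widened pattern also accepts the `patch-diff.githubusercontent.com` host, which the old pattern rejected. A standalone check (the example URLs and repository names are made up):

```python
import re

# The tightened pattern from the hunk above, as the two concatenated pieces.
github_patch_url_re = (
    r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
    ".+/.+/(?:commit|pull)/[a-fA-F0-9]*.(?:patch|diff)"
)

# A plain commit patch URL still matches...
assert re.match(github_patch_url_re,
                "https://github.com/spack/spack/commit/abc123.patch")
# ...and so does the patch-diff host form that the old pattern rejected.
assert re.match(github_patch_url_re,
                "https://patch-diff.githubusercontent.com/raw/spack/spack/pull/1234.diff")
```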
The binary-cache index gains an optional mirror filter (file name not shown in the source):

```diff
@@ -210,7 +210,7 @@ def get_all_built_specs(self):
 
         return spec_list
 
-    def find_built_spec(self, spec):
+    def find_built_spec(self, spec, mirrors_to_check=None):
         """Look in our cache for the built spec corresponding to ``spec``.
 
         If the spec can be found among the configured binary mirrors, a
@@ -225,6 +225,8 @@ def find_built_spec(self, spec):
 
         Args:
             spec (spack.spec.Spec): Concrete spec to find
+            mirrors_to_check: Optional mapping containing mirrors to check. If
+                None, just assumes all configured mirrors.
 
         Returns:
             A list of objects containing the found specs and mirror url where
@@ -240,17 +242,23 @@ def find_built_spec(self, spec):
             ]
         """
         self.regenerate_spec_cache()
-        return self.find_by_hash(spec.dag_hash())
+        return self.find_by_hash(spec.dag_hash(), mirrors_to_check=mirrors_to_check)
 
-    def find_by_hash(self, find_hash):
+    def find_by_hash(self, find_hash, mirrors_to_check=None):
         """Same as find_built_spec but uses the hash of a spec.
 
         Args:
             find_hash (str): hash of the spec to search
+            mirrors_to_check: Optional mapping containing mirrors to check. If
+                None, just assumes all configured mirrors.
         """
         if find_hash not in self._mirrors_for_spec:
             return None
-        return self._mirrors_for_spec[find_hash]
+        results = self._mirrors_for_spec[find_hash]
+        if not mirrors_to_check:
+            return results
+        mirror_urls = mirrors_to_check.values()
+        return [r for r in results if r['mirror_url'] in mirror_urls]
 
     def update_spec(self, spec, found_list):
         """
@@ -1592,9 +1600,6 @@ def extract_tarball(spec, download_result, allow_root=False, unsigned=False,
         # Handle the older buildcache layout where the .spack file
         # contains a spec json/yaml, maybe an .asc file (signature),
         # and another tarball containing the actual install tree.
-        tty.warn("This binary package uses a deprecated layout, "
-                 "and support for it will eventually be removed "
-                 "from spack.")
         tmpdir = tempfile.mkdtemp()
         try:
             tarfile_path = _extract_inner_tarball(
@@ -1822,7 +1827,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
         tty.debug("No Spack mirrors are currently configured")
         return {}
 
-    results = binary_index.find_built_spec(spec)
+    results = binary_index.find_built_spec(spec, mirrors_to_check=mirrors_to_check)
 
     # Maybe we just didn't have the latest information from the mirror, so
     # try to fetch directly, unless we are only considering the indices.
```
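The `mirrors_to_check` filtering added to `find_by_hash` can be sketched on its own; the result dictionaries below are hypothetical stand-ins for Spack's binary-index entries:

```python
# Standalone sketch of the new filtering step in find_by_hash.
def filter_by_mirrors(results, mirrors_to_check=None):
    """Keep only results whose mirror_url is in mirrors_to_check's values."""
    if not mirrors_to_check:
        return results
    mirror_urls = mirrors_to_check.values()
    return [r for r in results if r['mirror_url'] in mirror_urls]

results = [
    {'mirror_url': 's3://mirror-a', 'spec': 'ninja'},
    {'mirror_url': 's3://mirror-b', 'spec': 'ninja'},
]
print(filter_by_mirrors(results, {'override': 's3://mirror-b'}))
# -> [{'mirror_url': 's3://mirror-b', 'spec': 'ninja'}]
```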
The bootstrap source check is renamed and fixed (file name not shown in the source):

```diff
@@ -456,9 +456,10 @@ def _make_bootstrapper(conf):
     return _bootstrap_methods[btype](conf)
 
 
-def _validate_source_is_trusted(conf):
+def source_is_enabled_or_raise(conf):
+    """Raise ValueError if the source is not enabled for bootstrapping"""
     trusted, name = spack.config.get('bootstrap:trusted'), conf['name']
-    if name not in trusted:
+    if not trusted.get(name, False):
         raise ValueError('source is not trusted')
 
 
@@ -529,7 +530,7 @@ def ensure_module_importable_or_raise(module, abstract_spec=None):
 
     for current_config in bootstrapping_sources():
         with h.forward(current_config['name']):
-            _validate_source_is_trusted(current_config)
+            source_is_enabled_or_raise(current_config)
 
             b = _make_bootstrapper(current_config)
             if b.try_import(module, abstract_spec):
@@ -571,7 +572,7 @@ def ensure_executables_in_path_or_raise(executables, abstract_spec):
 
     for current_config in bootstrapping_sources():
         with h.forward(current_config['name']):
-            _validate_source_is_trusted(current_config)
+            source_is_enabled_or_raise(current_config)
 
             b = _make_bootstrapper(current_config)
             if b.try_search_path(executables, abstract_spec):
```
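The renamed check also changes semantics: membership in the `bootstrap:trusted` mapping is no longer enough, the value must be truthy. A minimal sketch with a hypothetical trusted dict passed in explicitly:

```python
# Sketch of the corrected check; 'trusted' stands in for the
# spack.config.get('bootstrap:trusted') lookup.
def source_is_enabled_or_raise(conf, trusted):
    """Raise ValueError if the source is not enabled for bootstrapping."""
    if not trusted.get(conf['name'], False):
        raise ValueError('source is not trusted')

trusted = {'github-actions': True, 'spack-install': False}
source_is_enabled_or_raise({'name': 'github-actions'}, trusted)  # passes
# Under the old `name not in trusted` check, 'spack-install' would have
# passed despite being explicitly disabled; now it raises ValueError.
```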
The `-Werror` keep/remove machinery is removed from the build environment setup (file name not shown in the source):

```diff
@@ -242,17 +242,6 @@ def clean_environment():
     # show useful matches.
     env.set('LC_ALL', build_lang)
 
-    remove_flags = set()
-    keep_flags = set()
-    if spack.config.get('config:flags:keep_werror') == 'all':
-        keep_flags.add('-Werror*')
-    else:
-        if spack.config.get('config:flags:keep_werror') == 'specific':
-            keep_flags.add('-Werror=*')
-        remove_flags.add('-Werror*')
-    env.set('SPACK_COMPILER_FLAGS_KEEP', '|'.join(keep_flags))
-    env.set('SPACK_COMPILER_FLAGS_REMOVE', '|'.join(remove_flags))
-
     # Remove any macports installs from the PATH. The macports ld can
     # cause conflicts with the built-in linker on el capitan. Solves
     # assembler issues, e.g.:
```
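The deleted block computed shell glob patterns that `lib/spack/env/cc` consumed via `SPACK_COMPILER_FLAGS_KEEP` and `SPACK_COMPILER_FLAGS_REMOVE`. A standalone sketch of that logic (the `keep_werror` argument stands in for the `config:flags:keep_werror` lookup):

```python
# Sketch: map the keep_werror setting to the KEEP/REMOVE glob patterns
# that the compiler wrapper matched against each argument.
def werror_flag_patterns(keep_werror):
    remove_flags, keep_flags = set(), set()
    if keep_werror == 'all':
        keep_flags.add('-Werror*')          # keep every -Werror variant
    else:
        if keep_werror == 'specific':
            keep_flags.add('-Werror=*')     # keep only -Werror=<warning>
        remove_flags.add('-Werror*')        # strip the rest
    return '|'.join(keep_flags), '|'.join(remove_flags)

print(werror_flag_patterns('specific'))  # -> ('-Werror=*', '-Werror*')
```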
The CI module loses the hard-coded PR mirror constants and gains reserved-tag handling (file name not shown in the source; which import the first hunk drops is not recoverable from the page):

```diff
@@ -33,7 +33,6 @@
 import spack.util.executable as exe
 import spack.util.gpg as gpg_util
 import spack.util.spack_yaml as syaml
 import spack.util.url as url_util
 import spack.util.web as web_util
 from spack.error import SpackError
 from spack.spec import Spec
@@ -42,10 +41,8 @@
     'always',
 ]
 
-SPACK_PR_MIRRORS_ROOT_URL = 's3://spack-binaries-prs'
-SPACK_SHARED_PR_MIRROR_URL = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
-                                           'shared_pr_mirror')
 TEMP_STORAGE_MIRROR_NAME = 'ci_temporary_mirror'
+SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
 
 spack_gpg = spack.main.SpackCommand('gpg')
 spack_compiler = spack.main.SpackCommand('compiler')
@@ -199,6 +196,11 @@ def _get_cdash_build_name(spec, build_group):
         spec.name, spec.version, spec.compiler, spec.architecture, build_group)
 
 
+def _remove_reserved_tags(tags):
+    """Convenience function to strip reserved tags from jobs"""
+    return [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
+
+
 def _get_spec_string(spec):
     format_elements = [
         '{name}{@version}',
```
The rest of the CI changes thread `mirrors_to_check` and `remote_mirror_override` through pipeline generation:

```diff
@@ -231,8 +233,10 @@ def _add_dependency(spec_label, dep_label, deps):
     deps[spec_label].add(dep_label)
 
 
-def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False):
-    spec_deps_obj = _compute_spec_deps(specs, check_index_only=check_index_only)
+def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False,
+                           mirrors_to_check=None):
+    spec_deps_obj = _compute_spec_deps(specs, check_index_only=check_index_only,
+                                       mirrors_to_check=mirrors_to_check)
 
     if spec_deps_obj:
         dependencies = spec_deps_obj['dependencies']
@@ -249,7 +253,7 @@ def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False):
         _add_dependency(entry['spec'], entry['depends'], deps)
 
 
-def stage_spec_jobs(specs, check_index_only=False):
+def stage_spec_jobs(specs, check_index_only=False, mirrors_to_check=None):
     """Take a set of release specs and generate a list of "stages", where the
     jobs in any stage are dependent only on jobs in previous stages. This
     allows us to maximize build parallelism within the gitlab-ci framework.
@@ -261,6 +265,8 @@ def stage_spec_jobs(specs, check_index_only=False):
            are up to date on those mirrors. This flag limits that search to
            the binary cache indices on those mirrors to speed the process up,
            even though there is no guarantee the index is up to date.
+       mirrors_to_check: Optional mapping giving mirrors to check instead of
+           any configured mirrors.
 
     Returns: A tuple of information objects describing the specs, dependencies
     and stages:
@@ -297,8 +303,8 @@ def _remove_satisfied_deps(deps, satisfied_list):
     deps = {}
     spec_labels = {}
 
-    _get_spec_dependencies(
-        specs, deps, spec_labels, check_index_only=check_index_only)
+    _get_spec_dependencies(specs, deps, spec_labels, check_index_only=check_index_only,
+                           mirrors_to_check=mirrors_to_check)
 
     # Save the original deps, as we need to return them at the end of the
     # function. In the while loop below, the "dependencies" variable is
@@ -340,7 +346,7 @@ def _print_staging_summary(spec_labels, dependencies, stages):
                 _get_spec_string(s)))
 
 
-def _compute_spec_deps(spec_list, check_index_only=False):
+def _compute_spec_deps(spec_list, check_index_only=False, mirrors_to_check=None):
     """
     Computes all the dependencies for the spec(s) and generates a JSON
     object which provides both a list of unique spec names as well as a
@@ -413,7 +419,7 @@ def append_dep(s, d):
             continue
 
         up_to_date_mirrors = bindist.get_mirrors_for_spec(
-            spec=s, index_only=check_index_only)
+            spec=s, mirrors_to_check=mirrors_to_check, index_only=check_index_only)
 
         skey = _spec_deps_key(s)
         spec_labels[skey] = {
@@ -602,8 +608,8 @@ def get_spec_filter_list(env, affected_pkgs, dependencies=True, dependents=True)
 def generate_gitlab_ci_yaml(env, print_summary, output_file,
                             prune_dag=False, check_index_only=False,
                             run_optimizer=False, use_dependencies=False,
-                            artifacts_root=None):
-    """ Generate a gitlab yaml file to run a dynamic chile pipeline from
+                            artifacts_root=None, remote_mirror_override=None):
+    """ Generate a gitlab yaml file to run a dynamic child pipeline from
     the spec matrix in the active environment.
 
     Arguments:
@@ -629,6 +635,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
         artifacts_root (str): Path where artifacts like logs, environment
             files (spack.yaml, spack.lock), etc should be written. GitLab
             requires this to be within the project directory.
+        remote_mirror_override (str): Typically only needed when one spack.yaml
+            is used to populate several mirrors with binaries, based on some
+            criteria. Spack protected pipelines populate different mirrors based
+            on branch name, facilitated by this option.
     """
     with spack.concretize.disable_compiler_existence_check():
         with env.write_transaction():
@@ -678,17 +688,19 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
     for s in affected_specs:
         tty.debug('  {0}'.format(s.name))
 
-    generate_job_name = os.environ.get('CI_JOB_NAME', None)
-    parent_pipeline_id = os.environ.get('CI_PIPELINE_ID', None)
+    # Downstream jobs will "need" (depend on, for both scheduling and
+    # artifacts, which include spack.lock file) this pipeline generation
+    # job by both name and pipeline id. If those environment variables
+    # do not exist, then maybe this is just running in a shell, in which
+    # case, there is no expectation gitlab will ever run the generated
+    # pipeline and those environment variables do not matter.
+    generate_job_name = os.environ.get('CI_JOB_NAME', 'job-does-not-exist')
+    parent_pipeline_id = os.environ.get('CI_PIPELINE_ID', 'pipeline-does-not-exist')
 
     # Values: "spack_pull_request", "spack_protected_branch", or not set
     spack_pipeline_type = os.environ.get('SPACK_PIPELINE_TYPE', None)
     is_pr_pipeline = spack_pipeline_type == 'spack_pull_request'
 
-    spack_pr_branch = os.environ.get('SPACK_PR_BRANCH', None)
-    pr_mirror_url = None
-    if spack_pr_branch:
-        pr_mirror_url = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
-                                      spack_pr_branch)
+    spack_buildcache_copy = os.environ.get('SPACK_COPY_BUILDCACHE', None)
 
     if 'mirrors' not in yaml_root or len(yaml_root['mirrors'].values()) < 1:
         tty.die('spack ci generate requires an env containing a mirror')
@@ -743,14 +755,25 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
         'strip-compilers': False,
     })
 
-    # Add per-PR mirror (and shared PR mirror) if enabled, as some specs might
-    # be up to date in one of those and thus not need to be rebuilt.
-    if pr_mirror_url:
-        spack.mirror.add(
-            'ci_pr_mirror', pr_mirror_url, cfg.default_modify_scope())
-        spack.mirror.add('ci_shared_pr_mirror',
-                         SPACK_SHARED_PR_MIRROR_URL,
-                         cfg.default_modify_scope())
+    # If a remote mirror override (alternate buildcache destination) was
+    # specified, add it here in case it has already built hashes we might
+    # generate.
+    mirrors_to_check = None
+    if remote_mirror_override:
+        if spack_pipeline_type == 'spack_protected_branch':
+            # Overriding the main mirror in this case might result
+            # in skipping jobs on a release pipeline because specs are
+            # up to date in develop. Eventually we want to notice and take
+            # advantage of this by scheduling a job to copy the spec from
+            # develop to the release, but until we have that, this makes
+            # sure we schedule a rebuild job if the spec isn't already in
+            # override mirror.
+            mirrors_to_check = {
+                'override': remote_mirror_override
+            }
+        else:
+            spack.mirror.add(
+                'ci_pr_mirror', remote_mirror_override, cfg.default_modify_scope())
 
     pipeline_artifacts_dir = artifacts_root
     if not pipeline_artifacts_dir:
@@ -825,11 +848,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
                 phase_spec.concretize()
             staged_phases[phase_name] = stage_spec_jobs(
                 concrete_phase_specs,
-                check_index_only=check_index_only)
+                check_index_only=check_index_only,
+                mirrors_to_check=mirrors_to_check)
     finally:
         # Clean up PR mirror if enabled
         if pr_mirror_url:
            spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
```
|
||||
# Clean up remote mirror override if enabled
|
||||
if remote_mirror_override:
|
||||
if spack_pipeline_type != 'spack_protected_branch':
|
||||
spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
|
||||
|
||||
all_job_names = []
|
||||
output_object = {}
|
||||
@@ -889,6 +914,14 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
|
||||
|
||||
tags = [tag for tag in runner_attribs['tags']]
|
||||
|
||||
if spack_pipeline_type is not None:
|
||||
# For spack pipelines "public" and "protected" are reserved tags
|
||||
tags = _remove_reserved_tags(tags)
|
||||
if spack_pipeline_type == 'spack_protected_branch':
|
||||
tags.extend(['aws', 'protected'])
|
||||
elif spack_pipeline_type == 'spack_pull_request':
|
||||
tags.extend(['public'])
|
||||
|
||||
variables = {}
|
||||
if 'variables' in runner_attribs:
|
||||
variables.update(runner_attribs['variables'])
|
||||
@@ -1174,6 +1207,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
|
||||
service_job_config,
|
||||
cleanup_job)
|
||||
|
||||
if 'tags' in cleanup_job:
|
||||
service_tags = _remove_reserved_tags(cleanup_job['tags'])
|
||||
cleanup_job['tags'] = service_tags
|
||||
|
||||
cleanup_job['stage'] = 'cleanup-temp-storage'
|
||||
cleanup_job['script'] = [
|
||||
'spack -d mirror destroy --mirror-url {0}/$CI_PIPELINE_ID'.format(
|
||||
@@ -1181,9 +1218,74 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
|
||||
]
|
||||
cleanup_job['when'] = 'always'
|
||||
cleanup_job['retry'] = service_job_retries
|
||||
cleanup_job['interruptible'] = True
|
||||
|
||||
output_object['cleanup'] = cleanup_job
|
||||
|
||||
if ('signing-job-attributes' in gitlab_ci and
|
||||
spack_pipeline_type == 'spack_protected_branch'):
|
||||
# External signing: generate a job to check and sign binary pkgs
|
||||
stage_names.append('stage-sign-pkgs')
|
||||
signing_job_config = gitlab_ci['signing-job-attributes']
|
||||
signing_job = {}
|
||||
|
||||
signing_job_attrs_to_copy = [
|
||||
'image',
|
||||
'tags',
|
||||
'variables',
|
||||
'before_script',
|
||||
'script',
|
||||
'after_script',
|
||||
]
|
||||
|
||||
_copy_attributes(signing_job_attrs_to_copy,
|
||||
signing_job_config,
|
||||
signing_job)
|
||||
|
||||
signing_job_tags = []
|
||||
if 'tags' in signing_job:
|
||||
signing_job_tags = _remove_reserved_tags(signing_job['tags'])
|
||||
|
||||
for tag in ['aws', 'protected', 'notary']:
|
||||
if tag not in signing_job_tags:
|
||||
signing_job_tags.append(tag)
|
||||
signing_job['tags'] = signing_job_tags
|
||||
|
||||
signing_job['stage'] = 'stage-sign-pkgs'
|
||||
signing_job['when'] = 'always'
|
||||
signing_job['retry'] = {
|
||||
'max': 2,
|
||||
'when': ['always']
|
||||
}
|
||||
signing_job['interruptible'] = True
|
||||
|
||||
output_object['sign-pkgs'] = signing_job
|
||||
|
||||
if spack_buildcache_copy:
|
||||
# Generate a job to copy the contents from wherever the builds are getting
|
||||
# pushed to the url specified in the "SPACK_BUILDCACHE_COPY" environment
|
||||
# variable.
|
||||
src_url = remote_mirror_override or remote_mirror_url
|
||||
dest_url = spack_buildcache_copy
|
||||
|
||||
stage_names.append('stage-copy-buildcache')
|
||||
copy_job = {
|
||||
'stage': 'stage-copy-buildcache',
|
||||
'tags': ['spack', 'public', 'medium', 'aws', 'x86_64'],
|
||||
'image': 'ghcr.io/spack/python-aws-bash:0.0.1',
|
||||
'when': 'on_success',
|
||||
'interruptible': True,
|
||||
'retry': service_job_retries,
|
||||
'script': [
|
||||
'. ./share/spack/setup-env.sh',
|
||||
'spack --version',
|
||||
'aws s3 sync --exclude *index.json* --exclude *pgp* {0} {1}'.format(
|
||||
src_url, dest_url)
|
||||
]
|
||||
}
|
||||
|
||||
output_object['copy-mirror'] = copy_job
|
||||
|
||||
if rebuild_index_enabled:
|
||||
# Add a final job to regenerate the index
|
||||
stage_names.append('stage-rebuild-index')
|
||||
@@ -1194,9 +1296,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
|
||||
service_job_config,
|
||||
final_job)
|
||||
|
||||
if 'tags' in final_job:
|
||||
service_tags = _remove_reserved_tags(final_job['tags'])
|
||||
final_job['tags'] = service_tags
|
||||
|
||||
index_target_mirror = mirror_urls[0]
|
||||
if is_pr_pipeline:
|
||||
index_target_mirror = pr_mirror_url
|
||||
if remote_mirror_override:
|
||||
index_target_mirror = remote_mirror_override
|
||||
|
||||
final_job['stage'] = 'stage-rebuild-index'
|
||||
final_job['script'] = [
|
||||
@@ -1205,6 +1311,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
|
||||
]
|
||||
final_job['when'] = 'always'
|
||||
final_job['retry'] = service_job_retries
|
||||
final_job['interruptible'] = True
|
||||
|
||||
output_object['rebuild-index'] = final_job
|
||||
|
||||
@@ -1237,8 +1344,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
|
||||
'SPACK_PIPELINE_TYPE': str(spack_pipeline_type)
|
||||
}
|
||||
|
||||
if pr_mirror_url:
|
||||
output_object['variables']['SPACK_PR_MIRROR_URL'] = pr_mirror_url
|
||||
if remote_mirror_override:
|
||||
(output_object['variables']
|
||||
['SPACK_REMOTE_MIRROR_OVERRIDE']) = remote_mirror_override
|
||||
|
||||
spack_stack_name = os.environ.get('SPACK_CI_STACK_NAME', None)
|
||||
if spack_stack_name:
|
||||
|
@@ -8,7 +8,10 @@
import argparse
import os
import re
import shlex
import sys
from textwrap import dedent
from typing import List, Tuple

import ruamel.yaml as yaml
import six
@@ -147,6 +150,58 @@ def get_command(cmd_name):
    return getattr(get_module(cmd_name), pname)


class _UnquotedFlags(object):
    """Use a heuristic in `.extract()` to detect whether the user is trying to set
    multiple flags like the docker ENV attribute allows (e.g. 'cflags=-Os -pipe').

    If the heuristic finds a match (which can be checked with `__bool__()`), a warning
    message explaining how to quote multiple flags correctly can be generated with
    `.report()`.
    """

    flags_arg_pattern = re.compile(
        r'^({0})=([^\'"].*)$'.format(
            '|'.join(spack.spec.FlagMap.valid_compiler_flags()),
        ))

    def __init__(self, all_unquoted_flag_pairs):
        # type: (List[Tuple[re.Match, str]]) -> None
        self._flag_pairs = all_unquoted_flag_pairs

    def __bool__(self):
        # type: () -> bool
        return bool(self._flag_pairs)

    @classmethod
    def extract(cls, sargs):
        # type: (str) -> _UnquotedFlags
        all_unquoted_flag_pairs = []  # type: List[Tuple[re.Match, str]]
        prev_flags_arg = None
        for arg in shlex.split(sargs):
            if prev_flags_arg is not None:
                all_unquoted_flag_pairs.append((prev_flags_arg, arg))
            prev_flags_arg = cls.flags_arg_pattern.match(arg)
        return cls(all_unquoted_flag_pairs)

    def report(self):
        # type: () -> str
        single_errors = [
            '({0}) {1} {2} => {3}'.format(
                i + 1, match.group(0), next_arg,
                '{0}="{1} {2}"'.format(match.group(1), match.group(2), next_arg),
            )
            for i, (match, next_arg) in enumerate(self._flag_pairs)
        ]
        return dedent("""\
        Some compiler or linker flags were provided without quoting their arguments,
        which now causes spack to try to parse the *next* argument as a spec component
        such as a variant instead of an additional compiler or linker flag. If the
        intent was to set multiple flags, try quoting them together as described below.

        Possible flag quotation errors (with the correctly-quoted version after the =>):
        {0}""").format('\n'.join(single_errors))


def parse_specs(args, **kwargs):
    """Convenience function for parsing arguments from specs.  Handles common
    exceptions and dies if there are errors.
@@ -157,15 +212,28 @@ def parse_specs(args, **kwargs):

    sargs = args
    if not isinstance(args, six.string_types):
        sargs = ' '.join(spack.util.string.quote(args))
    specs = spack.spec.parse(sargs)
    for spec in specs:
        if concretize:
            spec.concretize(tests=tests)  # implies normalize
        elif normalize:
            spec.normalize(tests=tests)
        sargs = ' '.join(args)
    unquoted_flags = _UnquotedFlags.extract(sargs)

    return specs
    try:
        specs = spack.spec.parse(sargs)
        for spec in specs:
            if concretize:
                spec.concretize(tests=tests)  # implies normalize
            elif normalize:
                spec.normalize(tests=tests)
        return specs

    except spack.error.SpecError as e:

        msg = e.message
        if e.long_message:
            msg += e.long_message
        if unquoted_flags:
            msg += '\n\n'
            msg += unquoted_flags.report()

        raise spack.error.SpackError(msg)


def matching_spec_from_env(spec):
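The `_UnquotedFlags` heuristic above pairs an unquoted `cflags=...`-style word with the *next* shell word, which is exactly the word the spec parser would misread. A standalone sketch of the same idea, with the flag-name list hard-coded as a stand-in for `spack.spec.FlagMap.valid_compiler_flags()`:

```python
import re
import shlex

# Stand-in for spack.spec.FlagMap.valid_compiler_flags() (assumed list).
VALID_FLAGS = ['cflags', 'cxxflags', 'fflags', 'ldflags', 'ldlibs', 'cppflags']

# Matches e.g. "cflags=-Os" where the value does not start with a quote.
FLAGS_ARG_PATTERN = re.compile(
    r'^({0})=([^\'"].*)$'.format('|'.join(VALID_FLAGS)))


def extract_unquoted_flag_pairs(sargs):
    """Return (match, next_word) pairs that suggest missing quotes."""
    pairs = []
    prev = None
    for arg in shlex.split(sargs):
        if prev is not None:
            # The previous word was an unquoted flag assignment, so this
            # word would be misparsed as a spec component.
            pairs.append((prev, arg))
        prev = FLAGS_ARG_PATTERN.match(arg)
    return pairs


for match, next_arg in extract_unquoted_flag_pairs('zlib cflags=-Os -pipe'):
    # Suggest the correctly-quoted form, as .report() does in the diff.
    print('{0}="{1} {2}"'.format(match.group(1), match.group(2), next_arg))
```

With `'zlib cflags=-Os -pipe'` this flags the `-pipe` word and suggests `cflags="-Os -pipe"`; a properly quoted input produces no pairs because `shlex.split` keeps the quoted value as one word.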
@@ -379,7 +379,9 @@ def _remove(args):


def _mirror(args):
    mirror_dir = os.path.join(args.root_dir, LOCAL_MIRROR_DIR)
    mirror_dir = spack.util.path.canonicalize_path(
        os.path.join(args.root_dir, LOCAL_MIRROR_DIR)
    )

    # TODO: Here we are adding gnuconfig manually, but this can be fixed
    # TODO: as soon as we have an option to add to a mirror all the possible
@@ -64,6 +64,11 @@ def setup_parser(subparser):
        '--dependencies', action='store_true', default=False,
        help="(Experimental) disable DAG scheduling; use "
             ' "plain" dependencies.')
    generate.add_argument(
        '--buildcache-destination', default=None,
        help="Override the mirror configured in the environment (spack.yaml) " +
             "in order to push binaries from the generated pipeline to a " +
             "different location.")
    prune_group = generate.add_mutually_exclusive_group()
    prune_group.add_argument(
        '--prune-dag', action='store_true', dest='prune_dag',
@@ -127,6 +132,7 @@ def ci_generate(args):
    prune_dag = args.prune_dag
    index_only = args.index_only
    artifacts_root = args.artifacts_root
    buildcache_destination = args.buildcache_destination

    if not output_file:
        output_file = os.path.abspath(".gitlab-ci.yml")
@@ -140,7 +146,8 @@ def ci_generate(args):
    spack_ci.generate_gitlab_ci_yaml(
        env, True, output_file, prune_dag=prune_dag,
        check_index_only=index_only, run_optimizer=run_optimizer,
        use_dependencies=use_dependencies, artifacts_root=artifacts_root)
        use_dependencies=use_dependencies, artifacts_root=artifacts_root,
        remote_mirror_override=buildcache_destination)

    if copy_yaml_to:
        copy_to_dir = os.path.dirname(copy_yaml_to)
@@ -180,6 +187,9 @@ def ci_rebuild(args):
    if not gitlab_ci:
        tty.die('spack ci rebuild requires an env containing gitlab-ci cfg')

    tty.msg('SPACK_BUILDCACHE_DESTINATION={0}'.format(
        os.environ.get('SPACK_BUILDCACHE_DESTINATION', None)))

    # Grab the environment variables we need.  These either come from the
    # pipeline generation step ("spack ci generate"), where they were written
    # out as variables, or else provided by GitLab itself.
@@ -196,7 +206,7 @@ def ci_rebuild(args):
    compiler_action = get_env_var('SPACK_COMPILER_ACTION')
    cdash_build_name = get_env_var('SPACK_CDASH_BUILD_NAME')
    spack_pipeline_type = get_env_var('SPACK_PIPELINE_TYPE')
    pr_mirror_url = get_env_var('SPACK_PR_MIRROR_URL')
    remote_mirror_override = get_env_var('SPACK_REMOTE_MIRROR_OVERRIDE')
    remote_mirror_url = get_env_var('SPACK_REMOTE_MIRROR_URL')

    # Construct absolute paths relative to current $CI_PROJECT_DIR
@@ -244,6 +254,10 @@ def ci_rebuild(args):
    tty.debug('Pipeline type - PR: {0}, develop: {1}'.format(
        spack_is_pr_pipeline, spack_is_develop_pipeline))

    # If no override url exists, then just push binary package to the
    # normal remote mirror url.
    buildcache_mirror_url = remote_mirror_override or remote_mirror_url

    # Figure out what is our temporary storage mirror: Is it artifacts
    # buildcache?  Or temporary-storage-url-prefix?  In some cases we need to
    # force something or pipelines might not have a way to propagate build
@@ -373,7 +387,24 @@ def ci_rebuild(args):
        cfg.default_modify_scope())

    # Check configured mirrors for a built spec with a matching hash
    matches = bindist.get_mirrors_for_spec(job_spec, index_only=False)
    mirrors_to_check = None
    if remote_mirror_override and spack_pipeline_type == 'spack_protected_branch':
        # Passing "mirrors_to_check" below means we *only* look in the override
        # mirror to see if we should skip building, which is what we want.
        mirrors_to_check = {
            'override': remote_mirror_override
        }

        # Adding this mirror to the list of configured mirrors means dependencies
        # could be installed from either the override mirror or any other configured
        # mirror (e.g. remote_mirror_url which is defined in the environment or
        # pipeline_mirror_url), which is also what we want.
        spack.mirror.add('mirror_override',
                         remote_mirror_override,
                         cfg.default_modify_scope())

    matches = bindist.get_mirrors_for_spec(
        job_spec, mirrors_to_check=mirrors_to_check, index_only=False)

    if matches:
        # Got a hash match on at least one configured mirror.  All
@@ -517,13 +548,6 @@ def ci_rebuild(args):
        # any logs from the staging directory to artifacts now
        spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)

        # Create buildcache on remote mirror, either on pr-specific mirror or
        # on the main mirror defined in the gitlab-enabled spack environment
        if spack_is_pr_pipeline:
            buildcache_mirror_url = pr_mirror_url
        else:
            buildcache_mirror_url = remote_mirror_url

        # If the install succeeded, create a buildcache entry for this job spec
        # and push it to one or more mirrors.  If the install did not succeed,
        # print out some instructions on how to reproduce this build failure
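The mirror-selection rule the `spack ci rebuild` hunk introduces can be sketched in isolation: on protected-branch pipelines with an override configured, only the override mirror is consulted for the skip-rebuild check; otherwise all configured mirrors are checked (returning `None` signals "check everything", matching how `mirrors_to_check` is used above). Names here follow the diff; the function itself is an illustrative reconstruction:

```python
def select_mirrors_to_check(remote_mirror_override, spack_pipeline_type):
    """Decide which mirrors to consult before skipping a rebuild."""
    if remote_mirror_override and spack_pipeline_type == 'spack_protected_branch':
        # Only look in the override mirror, so a spec that is merely
        # present in develop still gets a rebuild job on the release
        # pipeline.
        return {'override': remote_mirror_override}
    # None means "check every configured mirror".
    return None


print(select_mirrors_to_check('s3://release-mirror', 'spack_protected_branch'))
```

The asymmetry is deliberate: dependencies may still be *installed* from any configured mirror, but the decision to skip *building* is restricted to the override on protected pipelines.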
@@ -5,6 +5,7 @@
from __future__ import print_function

import argparse
import errno
import os
import sys

@@ -93,6 +94,21 @@ def external_find(args):
        # It's fine to not find any manifest file if we are doing the
        # search implicitly (i.e. as part of 'spack external find')
        pass
    except Exception as e:
        # For most exceptions, just print a warning and continue.
        # Note that KeyboardInterrupt does not subclass Exception
        # (so CTRL-C will terminate the program as expected).
        skip_msg = ("Skipping manifest and continuing with other external "
                    "checks")
        if ((isinstance(e, IOError) or isinstance(e, OSError)) and
                e.errno in [errno.EPERM, errno.EACCES]):
            # The manifest file does not have sufficient permissions enabled:
            # print a warning and keep going
            tty.warn("Unable to read manifest due to insufficient "
                     "permissions.", skip_msg)
        else:
            tty.warn("Unable to read manifest, unexpected error: {0}"
                     .format(str(e)), skip_msg)

    # If the user didn't specify anything, search for build tools by default
    if not args.tags and not args.all and not args.packages:
@@ -125,7 +141,7 @@ def external_find(args):

    # If the list of packages is empty, search for every possible package
    if not args.tags and not packages_to_check:
        packages_to_check = spack.repo.path.all_packages()
        packages_to_check = list(spack.repo.path.all_packages())

    detected_packages = spack.detection.by_executable(
        packages_to_check, path_hints=args.path)
@@ -177,7 +193,10 @@ def _collect_and_consume_cray_manifest_files(

    for directory in manifest_dirs:
        for fname in os.listdir(directory):
            manifest_files.append(os.path.join(directory, fname))
            if fname.endswith('.json'):
                fpath = os.path.join(directory, fname)
                tty.debug("Adding manifest file: {0}".format(fpath))
                manifest_files.append(os.path.join(directory, fpath))

    if not manifest_files:
        raise NoManifestFileError(
@@ -185,6 +204,7 @@ def _collect_and_consume_cray_manifest_files(
        .format(cray_manifest.default_path))

    for path in manifest_files:
        tty.debug("Reading manifest file: " + path)
        try:
            cray_manifest.read(path, not dry_run)
        except (spack.compilers.UnknownCompilerError, spack.error.SpackError) as e:
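The error handling added to `external_find` distinguishes permission problems from other failures by inspecting `errno` on `IOError`/`OSError`, and deliberately catches only `Exception` so `KeyboardInterrupt` still aborts. A minimal sketch of the classification step (the function name is illustrative, not Spack's):

```python
import errno


def classify_manifest_error(e):
    """Mirror the diff's branching: permission errors vs. everything else."""
    if isinstance(e, (IOError, OSError)) and \
            e.errno in (errno.EPERM, errno.EACCES):
        # Manifest exists but is unreadable: warn specifically.
        return 'insufficient-permissions'
    # Any other Exception: warn generically and keep going.
    return 'unexpected-error'


print(classify_manifest_error(OSError(errno.EACCES, 'Permission denied')))
print(classify_manifest_error(ValueError('bad json')))
```

Constructing `OSError(errno_code, message)` sets the `.errno` attribute, which is why the two-argument form is used in the demonstration.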
@@ -80,7 +80,8 @@ def spec(parser, args):
    # Use command line specified specs, otherwise try to use environment specs.
    if args.specs:
        input_specs = spack.cmd.parse_specs(args.specs)
        specs = [(s, s.concretized()) for s in input_specs]
        concretized_specs = spack.cmd.parse_specs(args.specs, concretize=True)
        specs = list(zip(input_specs, concretized_specs))
    else:
        env = ev.active_environment()
        if env:
@@ -24,7 +24,7 @@


# tutorial configuration parameters
tutorial_branch = "releases/v0.17"
tutorial_branch = "releases/v0.18"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

@@ -252,6 +252,13 @@ def find_new_compilers(path_hints=None, scope=None):
    merged configuration.
    """
    compilers = find_compilers(path_hints)
    return select_new_compilers(compilers, scope)


def select_new_compilers(compilers, scope=None):
    """Given a list of compilers, remove those that are already defined in
    the configuration.
    """
    compilers_not_in_config = []
    for c in compilers:
        arch_spec = spack.spec.ArchSpec((None, c.operating_system, c.target))
@@ -105,15 +105,12 @@
        'build_stage': '$tempdir/spack-stage',
        'concretizer': 'clingo',
        'license_dir': spack.paths.default_license_dir,
        'flags': {
            'keep_werror': 'none',
        },
    }
}

#: metavar to use for commands that accept scopes
#: this is shorter and more readable than listing all choices
scopes_metavar = '{defaults,system,site,user}[/PLATFORM]'
scopes_metavar = '{defaults,system,site,user}[/PLATFORM] or env:ENVIRONMENT'

#: Base name for the (internal) overrides scope.
overrides_base_name = 'overrides-'
@@ -171,11 +171,12 @@ def strip(self):
    def paths(self):
        """Important paths in the image"""
        Paths = collections.namedtuple('Paths', [
            'environment', 'store', 'view'
            'environment', 'store', 'hidden_view', 'view'
        ])
        return Paths(
            environment='/opt/spack-environment',
            store='/opt/software',
            hidden_view='/opt/._view',
            view='/opt/view'
        )

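The container-image change above threads a new `hidden_view` path through a `collections.namedtuple`. One detail worth noting: the field order declared in the namedtuple fixes the positional order of the constructor, so inserting `hidden_view` mid-list is safe only because every call site uses keyword arguments, as the diff does. A self-contained reproduction:

```python
import collections

# Same shape as the diff's Paths namedtuple after the change.
Paths = collections.namedtuple('Paths', [
    'environment', 'store', 'hidden_view', 'view'
])

# Keyword construction is order-independent, so the new field slots in
# without disturbing existing call sites.
paths = Paths(
    environment='/opt/spack-environment',
    store='/opt/software',
    hidden_view='/opt/._view',
    view='/opt/view'
)
print(paths.hidden_view)
```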
@@ -20,6 +20,7 @@

compiler_name_translation = {
    'nvidia': 'nvhpc',
    'rocm': 'rocmcc',
}


@@ -38,10 +39,6 @@ def translated_compiler_name(manifest_compiler_name):
    elif manifest_compiler_name in spack.compilers.supported_compilers():
        return manifest_compiler_name
    else:
        # Try to fail quickly.  This can occur in two cases: (1) the compiler
        # definition (2) a spec can specify a compiler that doesn't exist; the
        # first will be caught when creating compiler definition.  The second
        # will result in Specs with associated undefined compilers.
        raise spack.compilers.UnknownCompilerError(
            "Manifest parsing - unknown compiler: {0}"
            .format(manifest_compiler_name))
@@ -185,6 +182,8 @@ def read(path, apply_updates):
    tty.debug("{0}: {1} compilers read from manifest".format(
        path,
        str(len(compilers))))
    # Filter out the compilers that already appear in the configuration
    compilers = spack.compilers.select_new_compilers(compilers)
    if apply_updates and compilers:
        spack.compilers.add_compilers_to_config(
            compilers, init_config=False)
@@ -1612,9 +1612,10 @@ def install_specs(self, specs=None, **install_args):
        # ensure specs already installed are marked explicit
        all_specs = specs or [cs for _, cs in self.concretized_specs()]
        specs_installed = [s for s in all_specs if s.installed]
        with spack.store.db.write_transaction():  # do all in one transaction
            for spec in specs_installed:
                spack.store.db.update_explicit(spec, True)
        if specs_installed:
            with spack.store.db.write_transaction():  # do all in one transaction
                for spec in specs_installed:
                    spack.store.db.update_explicit(spec, True)

        if not specs_to_install:
            tty.msg('All of the packages are already installed')
@@ -327,9 +327,7 @@ def fetch(self):
            continue

        try:
            partial_file, save_file = self._fetch_from_url(url)
            if save_file and (partial_file is not None):
                llnl.util.filesystem.rename(partial_file, save_file)
            self._fetch_from_url(url)
            break
        except FailedDownloadError as e:
            errors.append(str(e))
@@ -379,9 +377,7 @@ def _check_headers(self, headers):

    @_needs_stage
    def _fetch_urllib(self, url):
        save_file = None
        if self.stage.save_filename:
            save_file = self.stage.save_filename
        save_file = self.stage.save_filename
        tty.msg('Fetching {0}'.format(url))

        # Run urllib but grab the mime type from the http headers
@@ -391,16 +387,18 @@ def _fetch_urllib(self, url):
            # clean up archive on failure.
            if self.archive_file:
                os.remove(self.archive_file)
            if save_file and os.path.exists(save_file):
            if os.path.lexists(save_file):
                os.remove(save_file)
            msg = 'urllib failed to fetch with error {0}'.format(e)
            raise FailedDownloadError(url, msg)

        if os.path.lexists(save_file):
            os.remove(save_file)

        with open(save_file, 'wb') as _open_file:
            shutil.copyfileobj(response, _open_file)

        self._check_headers(str(headers))
        return None, save_file

    @_needs_stage
    def _fetch_curl(self, url):
@@ -461,7 +459,7 @@ def _fetch_curl(self, url):
            if self.archive_file:
                os.remove(self.archive_file)

            if partial_file and os.path.exists(partial_file):
            if partial_file and os.path.lexists(partial_file):
                os.remove(partial_file)

            if curl.returncode == 22:
@@ -488,7 +486,9 @@ def _fetch_curl(self, url):
                "Curl failed with error %d" % curl.returncode)

        self._check_headers(headers)
        return partial_file, save_file

        if save_file and (partial_file is not None):
            os.rename(partial_file, save_file)

    @property  # type: ignore # decorated properties unsupported in mypy
    @_needs_stage
@@ -642,7 +642,7 @@ def fetch(self):

        # remove old symlink if one is there.
        filename = self.stage.save_filename
        if os.path.exists(filename):
        if os.path.lexists(filename):
            os.remove(filename)

        # Symlink to local cached archive.
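Several hunks above swap `os.path.exists` for `os.path.lexists`. The motivation: `exists()` follows symlinks and returns `False` for a dangling link, so the old code never removed a stale symlink whose target had disappeared, and the subsequent symlink/rename then failed. `lexists()` reports on the link itself. A small POSIX demonstration (symlink creation on Windows may require extra privileges):

```python
import os
import tempfile

tmpdir = tempfile.mkdtemp()
link = os.path.join(tmpdir, 'archive')

# Create a dangling symlink: the target does not exist.
os.symlink(os.path.join(tmpdir, 'missing-target'), link)

print(os.path.exists(link))   # False: follows the link, target is gone
print(os.path.lexists(link))  # True: the link itself is present

# The pattern the diff adopts: remove any old link, dangling or not.
if os.path.lexists(link):
    os.remove(link)
```

After the removal both checks return `False`, and a fresh symlink or rename to that path can proceed cleanly.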
@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import platform as py_platform
import re
from subprocess import check_output

from spack.version import Version

@@ -51,6 +52,17 @@ def __init__(self):

        if 'ubuntu' in distname:
            version = '.'.join(version[0:2])
        # openSUSE Tumbleweed is a rolling release which can change
        # more than once in a week, so set version to tumbleweed$GLIBVERS
        elif 'opensuse-tumbleweed' in distname or 'opensusetumbleweed' in distname:
            distname = 'opensuse'
            output = check_output(["ldd", "--version"]).decode()
            libcvers = re.findall(r'ldd \(GNU libc\) (.*)', output)
            if len(libcvers) == 1:
                version = 'tumbleweed' + libcvers[0]
            else:
                version = 'tumbleweed' + version[0]

        else:
            version = version[0]

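The Tumbleweed branch above derives a pseudo-version from the glibc version, scraped out of `ldd --version`. The parsing step can be exercised offline by substituting a canned output string for the `check_output(["ldd", "--version"])` call (the sample text below is an assumed, typical first line of that output):

```python
import re

# Stand-in for check_output(["ldd", "--version"]).decode() on a real system.
sample_output = (
    "ldd (GNU libc) 2.35\n"
    "Copyright (C) 2022 Free Software Foundation, Inc.\n"
)

# Same regex as the diff: capture everything after "ldd (GNU libc) ".
libcvers = re.findall(r'ldd \(GNU libc\) (.*)', sample_output)
if len(libcvers) == 1:
    version = 'tumbleweed' + libcvers[0]
else:
    # The real code falls back to the distro-reported version here;
    # this fallback string is illustrative only.
    version = 'tumbleweed-unknown'

print(version)
```

Pinning the OS "version" to the glibc release makes binary-cache matching meaningful on a rolling distribution whose nominal version changes several times a week.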
@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from io import BufferedReader
from io import BufferedReader, IOBase

import six
import six.moves.urllib.error as urllib_error
@@ -23,11 +23,15 @@
# https://github.com/python/cpython/pull/3249
class WrapStream(BufferedReader):
    def __init__(self, raw):
        raw.readable = lambda: True
        raw.writable = lambda: False
        raw.seekable = lambda: False
        raw.closed = False
        raw.flush = lambda: None
        # In botocore >=1.23.47, StreamingBody inherits from IOBase, so we
        # only add missing attributes in older versions.
        # https://github.com/boto/botocore/commit/a624815eabac50442ed7404f3c4f2664cd0aa784
        if not isinstance(raw, IOBase):
            raw.readable = lambda: True
            raw.writable = lambda: False
            raw.seekable = lambda: False
            raw.closed = False
            raw.flush = lambda: None
        super(WrapStream, self).__init__(raw)

    def detach(self):
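The `WrapStream` fix guards the monkey-patching behind an `isinstance(raw, IOBase)` check: once botocore's `StreamingBody` inherits from `IOBase`, its attributes like `closed` are read-only properties and assigning to them raises. The guard in isolation, with a plain object standing in for an old-style `StreamingBody`:

```python
import io


class FakeRawStream(object):
    """Minimal non-IOBase stream, like an old botocore StreamingBody."""

    def read(self, size=-1):
        return b''


raw = FakeRawStream()

# Only patch the file-like protocol onto streams that lack it; an IOBase
# subclass already provides these (and would reject the assignments).
if not isinstance(raw, io.IOBase):
    raw.readable = lambda: True
    raw.writable = lambda: False
    raw.seekable = lambda: False
    raw.closed = False
    raw.flush = lambda: None

print(raw.readable())
```

With a modern `StreamingBody` (or any `io.IOBase` subclass such as `io.BytesIO`) the branch is skipped entirely, which is exactly what the diff's version check comment describes.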
@@ -91,16 +91,7 @@
            'additional_external_search_paths': {
                'type': 'array',
                'items': {'type': 'string'}
            },
            'flags': {
                'type': 'object',
                'properties': {
                    'keep_werror': {
                        'type': 'string',
                        'enum': ['all', 'specific', 'none'],
                    },
                },
            },
        }
    },
    'deprecatedProperties': {
        'properties': ['module_roots'],
@@ -26,6 +26,9 @@
        "cpe-version": {"type": "string", "minLength": 1},
        "system-type": {"type": "string", "minLength": 1},
        "schema-version": {"type": "string", "minLength": 1},
        # Older schemas did not have "cpe-version", just the
        # schema version; in that case it was just called "version"
        "version": {"type": "string", "minLength": 1},
    }
},
"compilers": {
@@ -110,6 +110,7 @@
        },
    },
    'service-job-attributes': runner_selector_schema,
    'signing-job-attributes': runner_selector_schema,
    'rebuild-index': {'type': 'boolean'},
    'broken-specs-url': {'type': 'string'},
},
@@ -631,6 +631,7 @@ def visit(node):
|
||||
|
||||
# Load the file itself
|
||||
self.control.load(os.path.join(parent_dir, 'concretize.lp'))
|
||||
self.control.load(os.path.join(parent_dir, "os_facts.lp"))
|
||||
self.control.load(os.path.join(parent_dir, "display.lp"))
|
||||
timer.phase("load")
|
||||
|
||||
@@ -716,6 +717,7 @@ def __init__(self, tests=False):
|
||||
self.variant_values_from_specs = set()
|
||||
self.version_constraints = set()
|
||||
self.target_constraints = set()
|
||||
self.default_targets = []
|
||||
self.compiler_version_constraints = set()
|
||||
self.post_facts = []
|
||||
|
||||
@@ -747,7 +749,13 @@ def key_fn(version):
|
||||
|
||||
pkg = packagize(pkg)
|
||||
declared_versions = self.declared_versions[pkg.name]
|
||||
most_to_least_preferred = sorted(declared_versions, key=key_fn)
|
||||
partially_sorted_versions = sorted(set(declared_versions), key=key_fn)
|
||||
|
||||
most_to_least_preferred = []
|
||||
for _, group in itertools.groupby(partially_sorted_versions, key=key_fn):
|
||||
most_to_least_preferred.extend(list(sorted(
|
||||
group, reverse=True, key=lambda x: spack.version.ver(x.version)
|
||||
)))
|
||||
|
||||
for weight, declared_version in enumerate(most_to_least_preferred):
|
||||
self.gen.fact(fn.version_declared(
|
||||
@@ -1164,24 +1172,26 @@ def preferred_variants(self, pkg_name):
|
||||
pkg_name, variant.name, value
|
||||
))
|
||||
|
||||
def preferred_targets(self, pkg_name):
|
||||
def target_preferences(self, pkg_name):
|
||||
key_fn = spack.package_prefs.PackagePrefs(pkg_name, 'target')
|
||||
|
||||
if not self.target_specs_cache:
|
||||
self.target_specs_cache = [
|
||||
spack.spec.Spec('target={0}'.format(target_name))
|
||||
for target_name in archspec.cpu.TARGETS
|
||||
for _, target_name in self.default_targets
|
||||
]
|
||||
|
||||
target_specs = self.target_specs_cache
|
||||
preferred_targets = [x for x in target_specs if key_fn(x) < 0]
|
||||
if not preferred_targets:
|
||||
return
|
||||
package_targets = self.target_specs_cache[:]
|
||||
package_targets.sort(key=key_fn)
|
||||
|
||||
preferred = preferred_targets[0]
|
||||
self.gen.fact(fn.package_target_weight(
|
||||
str(preferred.architecture.target), pkg_name, -30
|
||||
))
|
||||
offset = 0
|
||||
best_default = self.default_targets[0][1]
|
||||
for i, preferred in enumerate(package_targets):
|
||||
if str(preferred.architecture.target) == best_default and i != 0:
|
||||
offset = 100
|
||||
self.gen.fact(fn.target_weight(
|
||||
pkg_name, str(preferred.architecture.target), i + offset
|
||||
))
|
||||
|
||||
def flag_defaults(self):
|
||||
self.gen.h2("Compiler flag defaults")
|
||||
@@ -1572,7 +1582,7 @@ def target_defaults(self, specs):
                 compiler.name, compiler.version, uarch.family.name
             ))
 
-        i = 0
+        i = 0  # TODO compute per-target offset?
         for target in candidate_targets:
             self.gen.fact(fn.target(target.name))
             self.gen.fact(fn.target_family(target.name, target.family.name))
@@ -1581,12 +1591,15 @@ def target_defaults(self, specs):
 
             # prefer best possible targets; weight others poorly so
             # they're not used unless set explicitly
+            # these are stored to be generated as facts later offset by the
+            # number of preferred targets
             if target.name in best_targets:
-                self.gen.fact(fn.default_target_weight(target.name, i))
+                self.default_targets.append((i, target.name))
                 i += 1
             else:
-                self.gen.fact(fn.default_target_weight(target.name, 100))
+                self.default_targets.append((100, target.name))
 
+        self.default_targets = list(sorted(set(self.default_targets)))
         self.gen.newline()
 
     def virtual_providers(self):
@@ -1862,14 +1875,19 @@ def setup(self, driver, specs, reuse=None):
             self.pkg_rules(pkg, tests=self.tests)
             self.gen.h2('Package preferences: %s' % pkg)
             self.preferred_variants(pkg)
-            self.preferred_targets(pkg)
+            self.target_preferences(pkg)
 
         # Inject dev_path from environment
         env = ev.active_environment()
         if env:
             for spec in sorted(specs):
                 for dep in spec.traverse():
                     _develop_specs_from_env(dep, env)
             for name, info in env.dev_specs.items():
                 dev_spec = spack.spec.Spec(info['spec'])
                 dev_spec.constrain(
                     'dev_path=%s' % spack.util.path.canonicalize_path(info['path'])
                 )
 
                 self.condition(spack.spec.Spec(name), dev_spec,
                                msg="%s is a develop spec" % name)
 
         self.gen.h1('Spec Constraints')
         self.literal_specs(specs)
@@ -107,10 +107,28 @@ possible_version_weight(Package, Weight)
   :- version(Package, Version),
      version_declared(Package, Version, Weight).
 
-version_weight(Package, Weight)
+% we can't use the weight for an external version if we don't use the
+% corresponding external spec.
+:- version(Package, Version),
+   version_weight(Package, Weight),
+   version_declared(Package, Version, Weight, "external"),
+   not external(Package).
+
+% we can't use a weight from an installed spec if we are building it
+% and vice-versa
+:- version(Package, Version),
+   version_weight(Package, Weight),
+   version_declared(Package, Version, Weight, "installed"),
+   build(Package).
+
+:- version(Package, Version),
+   version_weight(Package, Weight),
+   not version_declared(Package, Version, Weight, "installed"),
+   not build(Package).
+
+1 { version_weight(Package, Weight) : version_declared(Package, Version, Weight) } 1
   :- version(Package, Version),
-     node(Package),
-     Weight = #min{W : version_declared(Package, Version, W)}.
+     node(Package).
 
 % node_version_satisfies implies that exactly one of the satisfying versions
 % is the package's version, and vice versa.
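The constraints above replace the old `#min`-based rule: instead of always taking the globally smallest declared weight, the chosen `version_weight` must come from a declaration whose provenance matches how the node is obtained (built vs. reused). A rough Python rendering of that filtering, under the simplifying assumption that no external spec is in play (the helper is illustrative, not Spack code):

```python
def admissible_weights(declared, building):
    # declared: list of (weight, origin) pairs for one package version.
    # Mirrors the three new integrity constraints: external weights need
    # an external spec (none in this sketch), installed weights are only
    # valid when reusing, and a reused node must take an installed weight.
    ok = []
    for weight, origin in declared:
        if origin == 'external':
            continue  # not external(Package) here
        if origin == 'installed' and building:
            continue  # built nodes can't take installed weights
        if origin != 'installed' and not building:
            continue  # reused nodes must take installed weights
        ok.append(weight)
    return ok

declared = [(0, 'package_py'), (2, 'installed')]
build_weights = admissible_weights(declared, building=True)
reuse_weights = admissible_weights(declared, building=False)
```

This is the fix behind the misleading "version badness" scores: a reused installed spec can no longer borrow the cheaper weight of its `package.py` declaration.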
@@ -120,6 +138,11 @@ version_weight(Package, Weight)
 { version(Package, Version) : version_satisfies(Package, Constraint, Version) }
   :- node_version_satisfies(Package, Constraint).
 
+% If there is at least a version that satisfy the constraint, impose a lower
+% bound on the choice rule to avoid false positives with the error below
+1 { version(Package, Version) : version_satisfies(Package, Constraint, Version) }
+  :- node_version_satisfies(Package, Constraint), version_satisfies(Package, Constraint, _).
+
 % More specific error message if the version cannot satisfy some constraint
 % Otherwise covered by `no_version_error` and `versions_conflict_error`.
 error(1, "No valid version for '{0}' satisfies '@{1}'", Package, Constraint)
@@ -481,13 +504,13 @@ variant(Package, Variant) :- variant_condition(ID, Package, Variant),
                              condition_holds(ID).
 
 % a variant cannot be set if it is not a variant on the package
-error(2, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Package, Variant)
+error(2, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
   :- variant_set(Package, Variant),
      not variant(Package, Variant),
      build(Package).
 
 % a variant cannot take on a value if it is not a variant of the package
-error(2, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Package, Variant)
+error(2, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
   :- variant_value(Package, Variant, _),
      not variant(Package, Variant),
      build(Package).
@@ -714,9 +737,14 @@ node_os_mismatch(Package, Dependency) :-
 % every OS is compatible with itself. We can use `os_compatible` to declare
 os_compatible(OS, OS) :- os(OS).
 
-% OS compatibility rules for reusing solves.
-% catalina binaries can be used on bigsur. Direction is package -> dependency.
-os_compatible("bigsur", "catalina").
+% Transitive compatibility among operating systems
+os_compatible(OS1, OS3) :- os_compatible(OS1, OS2), os_compatible(OS2, OS3).
+
+% We can select only operating systems compatible with the ones
+% for which we can build software. We need a cardinality constraint
+% since we might have more than one "buildable_os(OS)" fact.
+:- not 1 { os_compatible(CurrentOS, ReusedOS) : buildable_os(CurrentOS) },
+   node_os(Package, ReusedOS).
 
 % If an OS is set explicitly respect the value
 node_os(Package, OS) :- node_os_set(Package, OS), node(Package).
@@ -779,24 +807,6 @@ target_compatible(Descendent, Ancestor)
 #defined target_satisfies/2.
 #defined target_parent/2.
 
-% The target weight is either the default target weight
-% or a more specific per-package weight if set
-target_weight(Target, Package, Weight)
-  :- default_target_weight(Target, Weight),
-     node(Package),
-     not derive_target_from_parent(_, Package),
-     not package_target_weight(Target, Package, _).
-
-% TODO: Need to account for the case of more than one parent
-% TODO: each of which sets different targets
-target_weight(Target, Dependency, Weight)
-  :- depends_on(Package, Dependency),
-     derive_target_from_parent(Package, Dependency),
-     target_weight(Target, Package, Weight).
-
-target_weight(Target, Package, Weight)
-  :- package_target_weight(Target, Package, Weight).
-
 % can't use targets on node if the compiler for the node doesn't support them
 error(2, "{0} compiler '{2}@{3}' incompatible with 'target={1}'", Package, Target, Compiler, Version)
   :- node_target(Package, Target),
@@ -813,11 +823,7 @@ node_target(Package, Target)
 node_target_weight(Package, Weight)
   :- node(Package),
      node_target(Package, Target),
-     target_weight(Target, Package, Weight).
-
-derive_target_from_parent(Parent, Package)
-  :- depends_on(Parent, Package),
-     not package_target_weight(_, Package, _).
+     target_weight(Package, Target, Weight).
 
 % compatibility rules for targets among nodes
 node_target_match(Parent, Dependency)
@@ -935,6 +941,9 @@ compiler_weight(Package, 100)
     not node_compiler_preference(Package, Compiler, Version, _),
     not default_compiler_preference(Compiler, Version, _).
 
+% For the time being, be strict and reuse only if the compiler match one we have on the system
+:- node_compiler_version(Package, Compiler, Version), not compiler_version(Compiler, Version).
+
 #defined node_compiler_preference/4.
 #defined default_compiler_preference/3.
@@ -1238,11 +1247,12 @@ opt_criterion(1, "non-preferred targets").
 %-----------------
 #heuristic version(Package, Version) : version_declared(Package, Version, 0), node(Package). [10, true]
 #heuristic version_weight(Package, 0) : version_declared(Package, Version, 0), node(Package). [10, true]
-#heuristic node_target(Package, Target) : default_target_weight(Target, 0), node(Package). [10, true]
+#heuristic node_target(Package, Target) : package_target_weight(Target, Package, 0), node(Package). [10, true]
 #heuristic node_target_weight(Package, 0) : node(Package). [10, true]
 #heuristic variant_value(Package, Variant, Value) : variant_default_value(Package, Variant, Value), node(Package). [10, true]
 #heuristic provider(Package, Virtual) : possible_provider_weight(Package, Virtual, 0, _), virtual_node(Virtual). [10, true]
 #heuristic node(Package) : possible_provider_weight(Package, Virtual, 0, _), virtual_node(Virtual). [10, true]
+#heuristic node_os(Package, OS) : buildable_os(OS). [10, true]
 
 %-----------
 % Notes
22 lib/spack/spack/solver/os_facts.lp Normal file
@@ -0,0 +1,22 @@
+% Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
+% Spack Project Developers. See the top-level COPYRIGHT file for details.
+%
+% SPDX-License-Identifier: (Apache-2.0 OR MIT)
+
+% OS compatibility rules for reusing solves.
+% os_compatible(RecentOS, OlderOS)
+% OlderOS binaries can be used on RecentOS
+
+% macOS
+os_compatible("monterey", "bigsur").
+os_compatible("bigsur", "catalina").
+
+% Ubuntu
+os_compatible("ubuntu22.04", "ubuntu21.10").
+os_compatible("ubuntu21.10", "ubuntu21.04").
+os_compatible("ubuntu21.04", "ubuntu20.10").
+os_compatible("ubuntu20.10", "ubuntu20.04").
+os_compatible("ubuntu20.04", "ubuntu19.10").
+os_compatible("ubuntu19.10", "ubuntu19.04").
+os_compatible("ubuntu19.04", "ubuntu18.10").
+os_compatible("ubuntu18.10", "ubuntu18.04").
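The new `os_facts.lp` only lists adjacent OS pairs; the transitive rule in `concretize.lp` (`os_compatible(OS1, OS3) :- os_compatible(OS1, OS2), os_compatible(OS2, OS3).`) closes the chain, so older binaries remain reusable across several releases. A quick Python fixpoint sketch of that closure:

```python
def transitive_closure(pairs):
    # Repeatedly join compatible pairs until nothing new is derived,
    # mimicking the recursive os_compatible rule in concretize.lp.
    compat = set(pairs)
    changed = True
    while changed:
        changed = False
        for a, b in list(compat):
            for c, d in list(compat):
                if b == c and (a, d) not in compat:
                    compat.add((a, d))
                    changed = True
    return compat

facts = [('monterey', 'bigsur'), ('bigsur', 'catalina')]
closed = transitive_closure(facts)
# catalina binaries are therefore reusable on monterey as well
```

Keeping only adjacent pairs in the facts file and deriving the rest in the logic program keeps the fact list short as new OS releases are appended.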
@@ -180,3 +180,20 @@ def test_status_function_find_files(
 
     _, missing = spack.bootstrap.status_message('optional')
     assert missing is expected_missing
+
+
+@pytest.mark.regression('31042')
+def test_source_is_disabled(mutable_config):
+    # Get the configuration dictionary of the current bootstrapping source
+    conf = next(iter(spack.bootstrap.bootstrapping_sources()))
+
+    # The source is not explicitly enabled or disabled, so the following
+    # call should raise to skip using it for bootstrapping
+    with pytest.raises(ValueError):
+        spack.bootstrap.source_is_enabled_or_raise(conf)
+
+    # Try to explicitly disable the source and verify that the behavior
+    # is the same as above
+    spack.config.add('bootstrap:trusted:{0}:{1}'.format(conf['name'], False))
+    with pytest.raises(ValueError):
+        spack.bootstrap.source_is_enabled_or_raise(conf)
@@ -12,9 +12,6 @@
 
 import pytest
 
-import spack.build_environment
-import spack.config
 import spack.spec
 from spack.paths import build_env_path
 from spack.util.environment import set_env, system_dirs
 from spack.util.executable import Executable, ProcessError
@@ -132,9 +129,7 @@ def wrapper_environment():
         SPACK_TARGET_ARGS="-march=znver2 -mtune=znver2",
         SPACK_LINKER_ARG='-Wl,',
         SPACK_DTAGS_TO_ADD='--disable-new-dtags',
-        SPACK_DTAGS_TO_STRIP='--enable-new-dtags',
-        SPACK_COMPILER_FLAGS_KEEP='',
-        SPACK_COMPILER_FLAGS_REMOVE='-Werror*',):
+        SPACK_DTAGS_TO_STRIP='--enable-new-dtags'):
         yield
@@ -162,21 +157,6 @@ def check_args(cc, args, expected):
     assert expected == cc_modified_args
 
 
-def check_args_contents(cc, args, must_contain, must_not_contain):
-    """Check output arguments that cc produces when called with args.
-
-    This assumes that cc will print debug command output with one element
-    per line, so that we see whether arguments that should (or shouldn't)
-    contain spaces are parsed correctly.
-    """
-    with set_env(SPACK_TEST_COMMAND='dump-args'):
-        cc_modified_args = cc(*args, output=str).strip().split('\n')
-    for a in must_contain:
-        assert a in cc_modified_args
-    for a in must_not_contain:
-        assert a not in cc_modified_args
-
-
 def check_env_var(executable, var, expected):
     """Check environment variables updated by the passed compiler wrapper
@@ -662,63 +642,6 @@ def test_no_ccache_prepend_for_fc(wrapper_environment):
         common_compile_args)
 
 
-def test_keep_and_remove(wrapper_environment):
-    werror_specific = ['-Werror=meh']
-    werror = ['-Werror']
-    werror_all = werror_specific + werror
-    with set_env(
-        SPACK_COMPILER_FLAGS_KEEP='',
-        SPACK_COMPILER_FLAGS_REMOVE='-Werror*',
-    ):
-        check_args_contents(cc, test_args + werror_all, ['-Wl,--end-group'], werror_all)
-    with set_env(
-        SPACK_COMPILER_FLAGS_KEEP='-Werror=*',
-        SPACK_COMPILER_FLAGS_REMOVE='-Werror*',
-    ):
-        check_args_contents(cc, test_args + werror_all, werror_specific, werror)
-    with set_env(
-        SPACK_COMPILER_FLAGS_KEEP='-Werror=*',
-        SPACK_COMPILER_FLAGS_REMOVE='-Werror*|-llib1|-Wl*',
-    ):
-        check_args_contents(
-            cc,
-            test_args + werror_all,
-            werror_specific,
-            werror + ["-llib1", "-Wl,--rpath"]
-        )
-
-
-@pytest.mark.parametrize('cfg_override,initial,expected,must_be_gone', [
-    # Set and unset variables
-    ('config:flags:keep_werror:all',
-     ['-Werror', '-Werror=specific', '-bah'],
-     ['-Werror', '-Werror=specific', '-bah'],
-     [],
-     ),
-    ('config:flags:keep_werror:specific',
-     ['-Werror', '-Werror=specific', '-bah'],
-     ['-Werror=specific', '-bah'],
-     ['-Werror'],
-     ),
-    ('config:flags:keep_werror:none',
-     ['-Werror', '-Werror=specific', '-bah'],
-     ['-bah'],
-     ['-Werror', '-Werror=specific'],
-     ),
-])
-@pytest.mark.usefixtures('wrapper_environment', 'mutable_config')
-def test_flag_modification(cfg_override, initial, expected, must_be_gone):
-    spack.config.add(cfg_override)
-    env = spack.build_environment.clean_environment()
-    env.apply_modifications()
-    check_args_contents(
-        cc,
-        test_args + initial,
-        expected,
-        must_be_gone
-    )
-
-
 @pytest.mark.regression('9160')
 def test_disable_new_dtags(wrapper_environment, wrapper_flags):
     with set_env(SPACK_TEST_COMMAND='dump-args'):
@@ -635,10 +635,6 @@ def test_ci_generate_for_pr_pipeline(tmpdir, mutable_mock_env_path,
         outputfile = str(tmpdir.join('.gitlab-ci.yml'))
 
         with ev.read('test'):
-            monkeypatch.setattr(
-                ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
-            monkeypatch.setattr(
-                ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
             ci_cmd('generate', '--output-file', outputfile)
 
         with open(outputfile) as f:
@@ -683,10 +679,6 @@ def test_ci_generate_with_external_pkg(tmpdir, mutable_mock_env_path,
         outputfile = str(tmpdir.join('.gitlab-ci.yml'))
 
         with ev.read('test'):
-            monkeypatch.setattr(
-                ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
-            monkeypatch.setattr(
-                ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
             ci_cmd('generate', '--output-file', outputfile)
 
         with open(outputfile) as f:
@@ -920,6 +912,77 @@ def fake_dl_method(spec, *args, **kwargs):
         env_cmd('deactivate')
 
 
+def test_ci_generate_mirror_override(tmpdir, mutable_mock_env_path,
+                                     install_mockery_mutable_config, mock_packages,
+                                     mock_fetch, mock_stage, mock_binary_index,
+                                     ci_base_environment):
+    """Ensure that protected pipelines using --buildcache-destination do not
+    skip building specs that are not in the override mirror when they are
+    found in the main mirror."""
+    os.environ.update({
+        'SPACK_PIPELINE_TYPE': 'spack_protected_branch',
+    })
+
+    working_dir = tmpdir.join('working_dir')
+
+    mirror_dir = working_dir.join('mirror')
+    mirror_url = 'file://{0}'.format(mirror_dir.strpath)
+
+    spack_yaml_contents = """
+spack:
+ definitions:
+   - packages: [patchelf]
+ specs:
+   - $packages
+ mirrors:
+   test-mirror: {0}
+ gitlab-ci:
+   mappings:
+     - match:
+         - patchelf
+       runner-attributes:
+         tags:
+           - donotcare
+         image: donotcare
+   service-job-attributes:
+     tags:
+       - nonbuildtag
+     image: basicimage
+""".format(mirror_url)
+
+    filename = str(tmpdir.join('spack.yaml'))
+    with open(filename, 'w') as f:
+        f.write(spack_yaml_contents)
+
+    with tmpdir.as_cwd():
+        env_cmd('create', 'test', './spack.yaml')
+        first_ci_yaml = str(tmpdir.join('.gitlab-ci-1.yml'))
+        second_ci_yaml = str(tmpdir.join('.gitlab-ci-2.yml'))
+        with ev.read('test'):
+            install_cmd()
+            buildcache_cmd('create', '-u', '--mirror-url', mirror_url, 'patchelf')
+            buildcache_cmd('update-index', '--mirror-url', mirror_url, output=str)
+
+            # This generate should not trigger a rebuild of patchelf, since it's in
+            # the main mirror referenced in the environment.
+            ci_cmd('generate', '--check-index-only', '--output-file', first_ci_yaml)
+
+            # Because we used a mirror override (--buildcache-destination) on a
+            # spack protected pipeline, we expect to only look in the override
+            # mirror for the spec, and thus the patchelf job should be generated in
+            # this pipeline
+            ci_cmd('generate', '--check-index-only', '--output-file', second_ci_yaml,
+                   '--buildcache-destination', 'file:///mirror/not/exist')
+
+        with open(first_ci_yaml) as fd1:
+            first_yaml = fd1.read()
+            assert 'no-specs-to-rebuild' in first_yaml
+
+        with open(second_ci_yaml) as fd2:
+            second_yaml = fd2.read()
+            assert 'no-specs-to-rebuild' not in second_yaml
+
+
 @pytest.mark.disable_clean_stage_check
 def test_push_mirror_contents(tmpdir, mutable_mock_env_path,
                               install_mockery_mutable_config, mock_packages,
@@ -1151,10 +1214,6 @@ def test_ci_generate_override_runner_attrs(tmpdir, mutable_mock_env_path,
         with ev.read('test'):
             monkeypatch.setattr(
                 spack.main, 'get_version', lambda: '0.15.3-416-12ad69eb1')
-            monkeypatch.setattr(
-                ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
-            monkeypatch.setattr(
-                ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
             ci_cmd('generate', '--output-file', outputfile)
 
         with open(outputfile) as f:
@@ -1256,10 +1315,6 @@ def test_ci_generate_with_workarounds(tmpdir, mutable_mock_env_path,
         outputfile = str(tmpdir.join('.gitlab-ci.yml'))
 
         with ev.read('test'):
-            monkeypatch.setattr(
-                ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
-            monkeypatch.setattr(
-                ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
             ci_cmd('generate', '--output-file', outputfile, '--dependencies')
 
         with open(outputfile) as f:
@@ -1417,11 +1472,6 @@ def fake_get_mirrors_for_spec(spec=None, mirrors_to_check=None,
         outputfile = str(tmpdir.join('.gitlab-ci.yml'))
 
         with ev.read('test'):
-            monkeypatch.setattr(
-                ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
-            monkeypatch.setattr(
-                ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
 
             ci_cmd('generate', '--output-file', outputfile)
 
         with open(outputfile) as of:
@@ -1630,11 +1680,6 @@ def test_ci_generate_temp_storage_url(tmpdir, mutable_mock_env_path,
         env_cmd('create', 'test', './spack.yaml')
         outputfile = str(tmpdir.join('.gitlab-ci.yml'))
 
-        monkeypatch.setattr(
-            ci, 'SPACK_PR_MIRRORS_ROOT_URL', r"file:///fake/mirror")
-        monkeypatch.setattr(
-            ci, 'SPACK_SHARED_PR_MIRROR_URL', r"file:///fake/mirror_two")
-
         with ev.read('test'):
             ci_cmd('generate', '--output-file', outputfile)
@@ -1715,6 +1760,64 @@ def test_ci_generate_read_broken_specs_url(tmpdir, mutable_mock_env_path,
             assert(ex not in output)
 
 
+def test_ci_generate_external_signing_job(tmpdir, mutable_mock_env_path,
+                                          install_mockery,
+                                          mock_packages, monkeypatch,
+                                          ci_base_environment):
+    """Verify that in external signing mode: 1) each rebuild jobs includes
+    the location where the binary hash information is written and 2) we
+    properly generate a final signing job in the pipeline."""
+    os.environ.update({
+        'SPACK_PIPELINE_TYPE': 'spack_protected_branch'
+    })
+    filename = str(tmpdir.join('spack.yaml'))
+    with open(filename, 'w') as f:
+        f.write("""\
+spack:
+ specs:
+   - archive-files
+ mirrors:
+   some-mirror: https://my.fake.mirror
+ gitlab-ci:
+   temporary-storage-url-prefix: file:///work/temp/mirror
+   mappings:
+     - match:
+         - archive-files
+       runner-attributes:
+         tags:
+           - donotcare
+         image: donotcare
+   signing-job-attributes:
+     tags:
+       - nonbuildtag
+       - secretrunner
+     image:
+       name: customdockerimage
+       entrypoint: []
+     variables:
+       IMPORTANT_INFO: avalue
+     script:
+       - echo hello
+""")
+
+    with tmpdir.as_cwd():
+        env_cmd('create', 'test', './spack.yaml')
+        outputfile = str(tmpdir.join('.gitlab-ci.yml'))
+
+        with ev.read('test'):
+            ci_cmd('generate', '--output-file', outputfile)
+
+            with open(outputfile) as of:
+                pipeline_doc = syaml.load(of.read())
+
+                assert 'sign-pkgs' in pipeline_doc
+                signing_job = pipeline_doc['sign-pkgs']
+                assert 'tags' in signing_job
+                signing_job_tags = signing_job['tags']
+                for expected_tag in ['notary', 'protected', 'aws']:
+                    assert expected_tag in signing_job_tags
+
+
 def test_ci_reproduce(tmpdir, mutable_mock_env_path,
                       install_mockery, mock_packages, monkeypatch,
                       last_two_git_commits, ci_base_environment, mock_binary_index):
@@ -45,21 +45,22 @@ def test_negative_integers_not_allowed_for_parallel_jobs(job_parser):
     assert 'expected a positive integer' in str(exc_info.value)
 
 
-@pytest.mark.parametrize('specs,expected_variants,unexpected_variants', [
-    (['coreutils', 'cflags=-O3 -g'], [], ['g']),
-    (['coreutils', 'cflags=-O3', '-g'], ['g'], []),
+@pytest.mark.parametrize('specs,cflags,negated_variants', [
+    (['coreutils cflags="-O3 -g"'], ['-O3', '-g'], []),
+    (['coreutils', 'cflags=-O3 -g'], ['-O3'], ['g']),
+    (['coreutils', 'cflags=-O3', '-g'], ['-O3'], ['g']),
 ])
 @pytest.mark.regression('12951')
-def test_parse_spec_flags_with_spaces(
-        specs, expected_variants, unexpected_variants
-):
+def test_parse_spec_flags_with_spaces(specs, cflags, negated_variants):
     spec_list = spack.cmd.parse_specs(specs)
     assert len(spec_list) == 1
 
     s = spec_list.pop()
 
-    assert all(x not in s.variants for x in unexpected_variants)
-    assert all(x in s.variants for x in expected_variants)
+    assert s.compiler_flags['cflags'] == cflags
+    assert list(s.variants.keys()) == negated_variants
+    for v in negated_variants:
+        assert '~{0}'.format(v) in s
 
 
 @pytest.mark.usefixtures('config')
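The updated parametrization encodes the behavior under test: a quoted `cflags="-O3 -g"` keeps both flags, while an unquoted trailing `-g` is consumed as spec syntax and becomes the negated variant `~g`. A loose sketch of why the tokens split that way (this is an illustration of the whitespace re-splitting effect, not Spack's actual parser):

```python
def split_spec_tokens(args):
    # Command-line args are re-joined and whitespace-split, so a space
    # inside an unquoted cflags value starts a new token; a bare '-x'
    # token then reads as the negated variant '~x'.
    flags, negated = [], []
    for tok in ' '.join(args).split():
        if tok.startswith('cflags='):
            flags.append(tok[len('cflags='):])
        elif tok.startswith('-'):
            negated.append(tok.lstrip('-'))
    return flags, negated

# Matches the second and third parametrized cases above:
assert split_spec_tokens(['coreutils', 'cflags=-O3 -g']) == (['-O3'], ['g'])
assert split_spec_tokens(['coreutils', 'cflags=-O3', '-g']) == (['-O3'], ['g'])
```

The quoted first case never re-splits, which is exactly the fix the quoting tests elsewhere in this changeset verify.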
@@ -8,6 +8,8 @@
 
 import pytest
 
+from llnl.util.filesystem import getuid, touch
+
 import spack
 import spack.detection
 import spack.detection.path
@@ -194,6 +196,53 @@ def test_find_external_empty_default_manifest_dir(
         external('find')
 
 
+@pytest.mark.skipif(sys.platform == 'win32',
+                    reason="Can't chmod on Windows")
+@pytest.mark.skipif(getuid() == 0, reason='user is root')
+def test_find_external_manifest_with_bad_permissions(
+        mutable_config, working_env, mock_executable, mutable_mock_repo,
+        _platform_executables, tmpdir, monkeypatch):
+    """The user runs 'spack external find'; the default path for storing
+    manifest files exists but with insufficient permissions. Check that
+    the command does not fail.
+    """
+    test_manifest_dir = str(tmpdir.mkdir('manifest_dir'))
+    test_manifest_file_path = os.path.join(test_manifest_dir, 'badperms.json')
+    touch(test_manifest_file_path)
+    monkeypatch.setenv('PATH', '')
+    monkeypatch.setattr(spack.cray_manifest, 'default_path',
+                        test_manifest_dir)
+    try:
+        os.chmod(test_manifest_file_path, 0)
+        output = external('find')
+        assert 'insufficient permissions' in output
+        assert 'Skipping manifest and continuing' in output
+    finally:
+        os.chmod(test_manifest_file_path, 0o700)
+
+
+def test_find_external_manifest_failure(
+        mutable_config, mutable_mock_repo, tmpdir, monkeypatch):
+    """The user runs 'spack external find'; the manifest parsing fails with
+    some exception. Ensure that the command still succeeds (i.e. moves on
+    to other external detection mechanisms).
+    """
+    # First, create an empty manifest file (without a file to read, the
+    # manifest parsing is skipped)
+    test_manifest_dir = str(tmpdir.mkdir('manifest_dir'))
+    test_manifest_file_path = os.path.join(test_manifest_dir, 'test.json')
+    touch(test_manifest_file_path)
+
+    def fail():
+        raise Exception()
+
+    monkeypatch.setattr(
+        spack.cmd.external, '_collect_and_consume_cray_manifest_files', fail)
+    monkeypatch.setenv('PATH', '')
+    output = external('find')
+    assert 'Skipping manifest and continuing' in output
+
+
 def test_find_external_nonempty_default_manifest_dir(
         mutable_database, mutable_mock_repo,
         _platform_executables, tmpdir, monkeypatch,
@@ -4,10 +4,12 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 
 import re
+from textwrap import dedent
 
 import pytest
 
 import spack.environment as ev
+import spack.error
 import spack.spec
 import spack.store
 from spack.main import SpackCommand, SpackCommandError
@@ -55,6 +57,54 @@ def test_spec_concretizer_args(mutable_config, mutable_database):
     assert h in output
 
 
+def test_spec_parse_dependency_variant_value():
+    """Verify that we can provide multiple key=value variants to multiple separate
+    packages within a spec string."""
+    output = spec('multivalue-variant fee=barbaz ^ a foobar=baz')
+
+    assert 'fee=barbaz' in output
+    assert 'foobar=baz' in output
+
+
+def test_spec_parse_cflags_quoting():
+    """Verify that compiler flags can be provided to a spec from the command line."""
+    output = spec('--yaml', 'gcc cflags="-Os -pipe" cxxflags="-flto -Os"')
+    gh_flagged = spack.spec.Spec.from_yaml(output)
+
+    assert ['-Os', '-pipe'] == gh_flagged.compiler_flags['cflags']
+    assert ['-flto', '-Os'] == gh_flagged.compiler_flags['cxxflags']
+
+
+def test_spec_parse_unquoted_flags_report():
+    """Verify that a useful error message is produced if unquoted compiler flags are
+    provided."""
+    # This should fail during parsing, since /usr/include is interpreted as a spec hash.
+    with pytest.raises(spack.error.SpackError) as cm:
+        # We don't try to figure out how many following args were intended to be part of
+        # cflags, we just explain how to fix it for the immediate next arg.
+        spec('gcc cflags=-Os -pipe -other-arg-that-gets-ignored cflags=-I /usr/include')
+    # Verify that the generated error message is nicely formatted.
+    assert str(cm.value) == dedent('''\
+        No installed spec matches the hash: 'usr'
+
+        Some compiler or linker flags were provided without quoting their arguments,
+        which now causes spack to try to parse the *next* argument as a spec component
+        such as a variant instead of an additional compiler or linker flag. If the
+        intent was to set multiple flags, try quoting them together as described below.
+
+        Possible flag quotation errors (with the correctly-quoted version after the =>):
+        (1) cflags=-Os -pipe => cflags="-Os -pipe"
+        (2) cflags=-I /usr/include => cflags="-I /usr/include"''')
+
+    # Verify that the same unquoted cflags report is generated in the error message even
+    # if it fails during concretization, not just during parsing.
+    with pytest.raises(spack.error.SpackError) as cm:
+        spec('gcc cflags=-Os -pipe')
+    cm = str(cm.value)
+    assert cm.startswith('trying to set variant "pipe" in package "gcc", but the package has no such variant [happened during concretization of gcc cflags="-Os" ~pipe]')  # noqa: E501
+    assert cm.endswith('(1) cflags=-Os -pipe => cflags="-Os -pipe"')
+
+
 def test_spec_yaml():
     output = spec('--yaml', 'mpileaks')
 
@@ -125,14 +175,14 @@ def test_spec_returncode():
 
 
 def test_spec_parse_error():
-    with pytest.raises(spack.spec.SpecParseError) as e:
+    with pytest.raises(spack.error.SpackError) as e:
         spec("1.15:")
 
     # make sure the error is formatted properly
     error_msg = """\
1.15:
     ^"""
-    assert error_msg in e.value.long_message
+    assert error_msg in str(e.value)
 
 
 def test_env_aware_spec(mutable_mock_env_path):
@@ -1732,3 +1732,79 @@ def test_best_effort_coconcretize_preferences(
|
||||
if expected_spec in spec:
|
||||
counter += 1
|
||||
assert counter == occurances, concrete_specs
|
||||
|
||||
@pytest.mark.regression('30864')
|
||||
def test_misleading_error_message_on_version(self, mutable_database):
|
||||
# For this bug to be triggered we need a reusable dependency
|
||||
# that is not optimal in terms of optimization scores.
|
||||
# We pick an old version of "b"
|
||||
import spack.solver.asp
|
||||
if spack.config.get('config:concretizer') == 'original':
|
||||
pytest.skip('Original concretizer cannot reuse')
|
||||
|
||||
reusable_specs = [
|
||||
spack.spec.Spec('non-existing-conditional-dep@1.0').concretized()
|
||||
]
|
||||
root_spec = spack.spec.Spec('non-existing-conditional-dep@2.0')
|
||||
|
||||
with spack.config.override("concretizer:reuse", True):
|
||||
solver = spack.solver.asp.Solver()
|
||||
setup = spack.solver.asp.SpackSolverSetup()
|
||||
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError,
|
||||
match="'dep-with-variants' satisfies '@999'"):
|
||||
solver.driver.solve(setup, [root_spec], reuse=reusable_specs)
|
||||
|
||||
@pytest.mark.regression('31148')
|
||||
def test_version_weight_and_provenance(self):
|
||||
"""Test package preferences during coconcretization."""
|
||||
import spack.solver.asp
|
||||
if spack.config.get('config:concretizer') == 'original':
|
||||
pytest.skip('Original concretizer cannot reuse')
|
||||
|
||||
reusable_specs = [
|
||||
spack.spec.Spec(spec_str).concretized()
|
||||
for spec_str in ('b@0.9', 'b@1.0')
|
||||
]
|
||||
root_spec = spack.spec.Spec('a foobar=bar')
|
||||
|
||||
with spack.config.override("concretizer:reuse", True):
|
||||
solver = spack.solver.asp.Solver()
|
||||
setup = spack.solver.asp.SpackSolverSetup()
|
||||
result = solver.driver.solve(
|
||||
setup, [root_spec], reuse=reusable_specs, out=sys.stdout
|
||||
)
|
||||
# The result here should have a single spec to build ('a')
|
||||
# and it should be using b@1.0 with a version badness of 2
|
||||
# The provenance is:
|
||||
# version_declared("b","1.0",0,"package_py").
|
||||
# version_declared("b","0.9",1,"package_py").
|
||||
# version_declared("b","1.0",2,"installed").
|
||||
# version_declared("b","0.9",3,"installed").
|
||||
for criterion in [
|
||||
(1, None, 'number of packages to build (vs. reuse)'),
|
||||
(2, 0, 'version badness')
|
||||
]:
|
||||
assert criterion in result.criteria
|
||||
assert result.specs[0].satisfies('^b@1.0')
|
||||
|
||||
@pytest.mark.regression('31169')
|
||||
def test_not_reusing_incompatible_os_or_compiler(self):
|
||||
import spack.solver.asp
|
||||
if spack.config.get('config:concretizer') == 'original':
|
||||
pytest.skip('Original concretizer cannot reuse')
|
||||
|
||||
root_spec = spack.spec.Spec('b')
|
||||
s = root_spec.concretized()
|
||||
wrong_compiler, wrong_os = s.copy(), s.copy()
|
||||
wrong_compiler.compiler = spack.spec.CompilerSpec('gcc@12.1.0')
|
||||
wrong_os.architecture = spack.spec.ArchSpec('test-ubuntu2204-x86_64')
|
||||
reusable_specs = [wrong_compiler, wrong_os]
|
||||
with spack.config.override("concretizer:reuse", True):
|
||||
solver = spack.solver.asp.Solver()
|
||||
setup = spack.solver.asp.SpackSolverSetup()
|
||||
result = solver.driver.solve(
|
||||
setup, [root_spec], reuse=reusable_specs, out=sys.stdout
|
||||
)
|
||||
concrete_spec = result.specs[0]
|
||||
assert concrete_spec.satisfies('%gcc@4.5.0')
|
||||
assert concrete_spec.satisfies('os=debian6')
|
||||
|
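The "version badness" criterion asserted in `test_version_weight_and_provenance` falls out of the order in which versions are declared: earlier declarations get lower (better) weights, and reusing a spec costs the weight of the version it pins. A standalone sketch of that idea (`version_badness` is a hypothetical helper for illustration, not Spack's actual solver logic), numbered the same way as the `version_declared` facts in the test's comments:

```python
# Hypothetical helper (not Spack's solver): derive a version's "badness"
# weight from its position in the declared-provenance order.
def version_badness(provenance, name, version, origin):
    """Return the weight (0 = most preferred) of a (name, version, origin) fact."""
    for weight, fact in enumerate(provenance):
        if fact == (name, version, origin):
            return weight
    raise KeyError((name, version, origin))

# Mirrors the provenance comments in test_version_weight_and_provenance:
provenance = [
    ("b", "1.0", "package_py"),
    ("b", "0.9", "package_py"),
    ("b", "1.0", "installed"),
    ("b", "0.9", "installed"),
]

# Reusing installed b@1.0 carries a version badness of 2, as the test asserts.
assert version_badness(provenance, "b", "1.0", "installed") == 2
```

Under this ordering a preferred version from `package.py` always beats an installed copy of the same version, which is why the solver still reports a nonzero badness when it reuses `b@1.0`.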
@@ -144,7 +144,7 @@ def test_preferred_target(self, mutable_mock_repo):

         update_packages('mpileaks', 'target', [default])
         spec = concretize('mpileaks')
-        assert str(spec['mpich'].target) == default
+        assert str(spec['mpileaks'].target) == default
         assert str(spec['mpich'].target) == default

     def test_preferred_versions(self):
@@ -10,6 +10,7 @@
 logic needs to consume all related specs in a single pass).
 """
 import json
+import os

 import pytest

@@ -32,7 +33,7 @@
     },
     "compiler": {
         "name": "gcc",
-        "version": "10.2.0"
+        "version": "10.2.0.cray"
     },
     "dependencies": {
         "packagey": {

@@ -156,7 +157,7 @@ def spec_json(self):

 # Intended to match example_compiler_entry above
 _common_compiler = JsonCompilerEntry(
     name='gcc',
-    version='10.2.0',
+    version='10.2.0.cray',
     arch={
         "os": "centos8",
         "target": "x86_64"

@@ -309,8 +310,16 @@ def test_failed_translate_compiler_name():

 def create_manifest_content():
     return {
+        # Note: the cray_manifest module doesn't use the _meta section right
+        # now, but it is anticipated to be useful
+        '_meta': {
+            "file-type": "cray-pe-json",
+            "system-type": "test",
+            "schema-version": "1.3",
+            "cpe-version": "22.06"
+        },
         'specs': list(x.to_dict() for x in generate_openmpi_entries()),
-        'compilers': []
+        'compilers': [_common_compiler.compiler_json()]
     }

@@ -336,3 +345,45 @@ def test_read_cray_manifest(
         ' ^/openmpifakehasha'.split(),
         concretize=True)
     assert concretized_specs[0]['hwloc'].dag_hash() == 'hwlocfakehashaaa'
+
+
+def test_read_cray_manifest_twice_no_compiler_duplicates(
+        tmpdir, mutable_config, mock_packages, mutable_database):
+    if spack.config.get('config:concretizer') == 'clingo':
+        pytest.skip("The ASP-based concretizer is currently picky about "
+                    " OS matching and will fail.")
+
+    with tmpdir.as_cwd():
+        test_db_fname = 'external-db.json'
+        with open(test_db_fname, 'w') as db_file:
+            json.dump(create_manifest_content(), db_file)
+
+        # Read the manifest twice
+        cray_manifest.read(test_db_fname, True)
+        cray_manifest.read(test_db_fname, True)
+
+        compilers = spack.compilers.all_compilers()
+        filtered = list(c for c in compilers if
+                        c.spec == spack.spec.CompilerSpec('gcc@10.2.0.cray'))
+        assert(len(filtered) == 1)
+
+
+def test_read_old_manifest_v1_2(
+        tmpdir, mutable_config, mock_packages, mutable_database):
+    """Test reading a file using the older format
+    ('version' instead of 'schema-version').
+    """
+    manifest_dir = str(tmpdir.mkdir('manifest_dir'))
+    manifest_file_path = os.path.join(manifest_dir, 'test.json')
+    with open(manifest_file_path, 'w') as manifest_file:
+        manifest_file.write("""\
+{
+  "_meta": {
+    "file-type": "cray-pe-json",
+    "system-type": "EX",
+    "version": "1.3"
+  },
+  "specs": []
+}
+""")
+    cray_manifest.read(manifest_file_path, True)
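The duplicate-compiler test above asserts that reading the same manifest twice leaves exactly one `gcc@10.2.0.cray` entry registered. A minimal standalone sketch of that dedup-on-merge behavior (`merge_compilers` is an illustrative name, not the `cray_manifest` API):

```python
# Illustrative sketch only: re-registering manifest compiler entries must be
# idempotent, keyed on the compiler spec string.
def merge_compilers(existing, new_entries):
    """Append manifest compiler entries whose spec is not already known."""
    seen = {c["spec"] for c in existing}
    merged = list(existing)
    for entry in new_entries:
        if entry["spec"] not in seen:
            merged.append(entry)
            seen.add(entry["spec"])
    return merged

manifest_compilers = [{"spec": "gcc@10.2.0.cray"}]
registry = merge_compilers([], manifest_compilers)        # first read registers it
registry = merge_compilers(registry, manifest_compilers)  # second read is a no-op
assert len(registry) == 1
```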
@@ -15,6 +15,7 @@
     SpecFormatSigilError,
     SpecFormatStringError,
     UnconstrainableDependencySpecError,
+    UnsupportedCompilerError,
 )
 from spack.variant import (
     InvalidVariantValueError,

@@ -1320,3 +1321,8 @@ def test_concretize_partial_old_dag_hash_spec(mock_packages, config):

     # make sure package hash is NOT recomputed
     assert not getattr(spec["dt-diamond-bottom"], '_package_hash', None)
+
+
+def test_unsupported_compiler():
+    with pytest.raises(UnsupportedCompilerError):
+        Spec('gcc%fake-compiler').validate_or_raise()
@@ -8,7 +8,10 @@
 The YAML and JSON formats preserve DAG information in the spec.

 """
 from __future__ import print_function

+import ast
+import collections
+import inspect
 import os

@@ -433,3 +436,75 @@ def test_legacy_yaml(tmpdir, install_mockery, mock_packages):
     spec = Spec.from_yaml(yaml)
     concrete_spec = spec.concretized()
     assert concrete_spec.eq_dag(spec)
+
+
+#: A well ordered Spec dictionary, using ``OrderedDict``.
+#: Any operation that transforms Spec dictionaries should
+#: preserve this order.
+ordered_spec = collections.OrderedDict([
+    ("arch", collections.OrderedDict([
+        ("platform", "darwin"),
+        ("platform_os", "bigsur"),
+        ("target", collections.OrderedDict([
+            ("features", [
+                "adx",
+                "aes",
+                "avx",
+                "avx2",
+                "bmi1",
+                "bmi2",
+                "clflushopt",
+                "f16c",
+                "fma",
+                "mmx",
+                "movbe",
+                "pclmulqdq",
+                "popcnt",
+                "rdrand",
+                "rdseed",
+                "sse",
+                "sse2",
+                "sse4_1",
+                "sse4_2",
+                "ssse3",
+                "xsavec",
+                "xsaveopt"
+            ]),
+            ("generation", 0),
+            ("name", "skylake"),
+            ("parents", ["broadwell"]),
+            ("vendor", "GenuineIntel"),
+        ])),
+    ])),
+    ("compiler", collections.OrderedDict([
+        ("name", "apple-clang"),
+        ("version", "13.0.0"),
+    ])),
+    ("name", "zlib"),
+    ("namespace", "builtin"),
+    ("parameters", collections.OrderedDict([
+        ("cflags", []),
+        ("cppflags", []),
+        ("cxxflags", []),
+        ("fflags", []),
+        ("ldflags", []),
+        ("ldlibs", []),
+        ("optimize", True),
+        ("pic", True),
+        ("shared", True),
+    ])),
+    ("version", "1.2.11"),
+])
+
+
+@pytest.mark.regression("31092")
+def test_strify_preserves_order():
+    """Ensure that ``spack_json._strify()`` dumps dictionaries in the right order.
+
+    ``_strify()`` is used in ``spack_json.dump()``, which is used in
+    ``Spec.dag_hash()``, so if this goes wrong, ``Spec`` hashes can vary
+    between python versions.
+
+    """
+    strified = sjson._strify(ordered_spec)
+    assert list(ordered_spec.items()) == list(strified.items())
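The ordering requirement matters because the serialized form feeds a hash. A minimal illustration (using stdlib `hashlib`/`json` as stand-ins for Spack's hashing machinery) of why two equal-but-reordered dictionaries must not be allowed to occur:

```python
# json.dumps serializes keys in insertion order, so reordering the same
# items changes the dump -- and therefore any hash derived from it.
import hashlib
import json

def stable_hash(data):
    """Hash the JSON dump of `data`; stable only if key order is stable."""
    return hashlib.sha256(json.dumps(data).encode("utf-8")).hexdigest()

a = {"name": "zlib", "version": "1.2.11"}
b = {"version": "1.2.11", "name": "zlib"}
assert stable_hash(a) != stable_hash(b)        # same items, different order
assert stable_hash(a) == stable_hash(dict(a))  # order-preserving copy is stable
```

This is exactly the failure mode the regression test guards against: if `_strify()` ever stopped preserving order, `Spec.dag_hash()` could differ across Python versions for the same spec.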
@@ -55,9 +55,12 @@ def test_write_and_remove_cache_file(file_cache):

     file_cache.remove('test.yaml')

-    # After removal both the file and the lock file should not exist
+    # After removal the file should not exist
     assert not os.path.exists(file_cache.cache_path('test.yaml'))
-    assert not os.path.exists(file_cache._lock_path('test.yaml'))
+
+    # Whether the lock file exists is more of an implementation detail, on Linux they
+    # continue to exist, on Windows they don't.
+    # assert os.path.exists(file_cache._lock_path('test.yaml'))


 @pytest.mark.skipif(sys.platform == 'win32',

@@ -94,3 +97,10 @@ def test_cache_write_readonly_cache_fails(file_cache):

     with pytest.raises(CacheError, match='Insufficient permissions to write'):
         file_cache.write_transaction(filename)
+
+
+@pytest.mark.regression('31475')
+def test_delete_is_idempotent(file_cache):
+    """Deleting a non-existent key should be idempotent, to simplify life when
+    running delete with multiple processes"""
+    file_cache.remove('test.yaml')
@@ -3,6 +3,7 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

+import errno
 import os
 import shutil

@@ -170,13 +171,17 @@ def mtime(self, key):
         return sinfo.st_mtime

     def remove(self, key):
+        file = self.cache_path(key)
         lock = self._get_lock(key)
         try:
             lock.acquire_write()
-            os.unlink(self.cache_path(key))
+            os.unlink(file)
+        except OSError as e:
+            # File not found is OK, so remove is idempotent.
+            if e.errno != errno.ENOENT:
+                raise
         finally:
             lock.release_write()
             lock.cleanup()


 class CacheError(SpackError):
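The `remove` change above swallows only `ENOENT`, so deleting a key that is already gone becomes a no-op while every other filesystem error still propagates. A self-contained sketch of the same pattern (`remove_quietly` is an illustrative name, not Spack's API):

```python
# Idempotent-delete pattern: unlink, but treat "no such file" as success.
import errno
import os
import tempfile

def remove_quietly(path):
    """Unlink `path`; a missing file is fine, any other OSError propagates."""
    try:
        os.unlink(path)
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise

fd, path = tempfile.mkstemp()
os.close(fd)
remove_quietly(path)  # removes the file
remove_quietly(path)  # second call does nothing and raises nothing
assert not os.path.exists(path)
```

This is what makes concurrent deletes safe: if two processes race to remove the same cache entry, the loser simply observes the file already gone instead of crashing.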
@@ -4,6 +4,7 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

 """Simple wrapper around JSON to guarantee consistent use of load/dump. """
+import collections
 import json
 from typing import Any, Dict, Optional  # novm

@@ -72,9 +73,10 @@ def _strify(data, ignore_dicts=False):
     # if this is a dictionary, return dictionary of byteified keys and values
     # but only if we haven't already byteified it
     if isinstance(data, dict) and not ignore_dicts:
-        return dict((_strify(key, ignore_dicts=True),
-                     _strify(value, ignore_dicts=True)) for key, value in
-                    iteritems(data))
+        return collections.OrderedDict(
+            (_strify(key, ignore_dicts=True), _strify(value, ignore_dicts=True))
+            for key, value in iteritems(data)
+        )

     # if it's anything else, return it in its original form
     return data
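A simplified, order-preserving take on the `_strify` change above (names here are illustrative, and the Python 2 byte/unicode handling is omitted): recurse through dicts and lists, returning an `OrderedDict` so key order survives the round trip deterministically on every interpreter.

```python
# Minimal sketch of an order-preserving recursive key/value conversion.
import collections

def strify(data):
    """Recursively rebuild dicts as OrderedDicts, preserving insertion order."""
    if isinstance(data, dict):
        return collections.OrderedDict(
            (strify(k), strify(v)) for k, v in data.items()
        )
    if isinstance(data, list):
        return [strify(x) for x in data]
    return data

src = collections.OrderedDict([("b", 1), ("a", 2)])
out = strify(src)
assert list(out) == ["b", "a"]  # insertion order preserved, not sorted
```

Returning a plain `dict` would also preserve order on CPython 3.7+, but `OrderedDict` makes the ordering contract explicit, which is the point of the fix.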
@@ -1,4 +1,4 @@
-stages: [ "generate", "build" ]
+stages: [ "generate", "build", "publish" ]

 default:
   image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }

@@ -9,16 +9,25 @@ default:

 .pr:
   only:
     - /^pr[\d]+_.*$/
     - /^github\/pr[\d]+_.*$/
   variables:
     SPACK_PR_BRANCH: ${CI_COMMIT_REF_NAME}
     SPACK_BUILDCACHE_DESTINATION: "s3://spack-binaries-prs/${CI_COMMIT_REF_NAME}"
     SPACK_PIPELINE_TYPE: "spack_pull_request"
     SPACK_PRUNE_UNTOUCHED: "True"

-.develop:
+.protected-refs:
   only:
     - /^develop$/
     - /^releases\/v.*/
     - /^v.*/
     - /^github\/develop$/
+
+.protected:
+  extends: [ ".protected-refs" ]
   variables:
     SPACK_BUILDCACHE_DESTINATION: "s3://spack-binaries/${CI_COMMIT_REF_NAME}/${SPACK_CI_STACK_NAME}"
+    SPACK_COPY_BUILDCACHE: "s3://spack-binaries/${CI_COMMIT_REF_NAME}"
     SPACK_PIPELINE_TYPE: "spack_protected_branch"

 .generate:

@@ -29,12 +38,13 @@ default:
     - cd share/spack/gitlab/cloud_pipelines/stacks/${SPACK_CI_STACK_NAME}
     - spack env activate --without-view .
     - spack ci generate --check-index-only
+      --buildcache-destination "${SPACK_BUILDCACHE_DESTINATION}"
       --artifacts-root "${CI_PROJECT_DIR}/jobs_scratch_dir"
       --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/cloud-ci-pipeline.yml"
   artifacts:
     paths:
       - "${CI_PROJECT_DIR}/jobs_scratch_dir"
-  tags: ["spack", "public", "medium", "x86_64"]
+  tags: ["spack", "aws", "public", "medium", "x86_64"]
   interruptible: true
   retry:
     max: 2

@@ -42,11 +52,21 @@ default:
       - runner_system_failure
       - stuck_or_timeout_failure

+.generate-aarch64:
+  extends: [ ".generate" ]
+  tags: ["spack", "aws", "public", "medium", "aarch64"]
+
 .pr-generate:
   extends: [ ".pr", ".generate" ]

-.develop-generate:
-  extends: [ ".develop", ".generate" ]
+.pr-generate-aarch64:
+  extends: [ ".pr", ".generate-aarch64" ]
+
+.protected-generate:
+  extends: [ ".protected", ".generate" ]
+
+.protected-generate-aarch64:
+  extends: [ ".protected", ".generate-aarch64" ]

 .build:
   stage: build

@@ -57,12 +77,24 @@ default:
     AWS_ACCESS_KEY_ID: ${PR_MIRRORS_AWS_ACCESS_KEY_ID}
     AWS_SECRET_ACCESS_KEY: ${PR_MIRRORS_AWS_SECRET_ACCESS_KEY}

-.develop-build:
-  extends: [ ".develop", ".build" ]
+.protected-build:
+  extends: [ ".protected", ".build" ]
   variables:
     AWS_ACCESS_KEY_ID: ${PROTECTED_MIRRORS_AWS_ACCESS_KEY_ID}
     AWS_SECRET_ACCESS_KEY: ${PROTECTED_MIRRORS_AWS_SECRET_ACCESS_KEY}
     SPACK_SIGNING_KEY: ${PACKAGE_SIGNING_KEY}

+protected-publish:
+  stage: publish
+  extends: [ ".protected-refs" ]
+  image: "ghcr.io/spack/python-aws-bash:0.0.1"
+  tags: ["spack", "public", "medium", "aws", "x86_64"]
+  variables:
+    AWS_ACCESS_KEY_ID: ${PROTECTED_MIRRORS_AWS_ACCESS_KEY_ID}
+    AWS_SECRET_ACCESS_KEY: ${PROTECTED_MIRRORS_AWS_SECRET_ACCESS_KEY}
+  script:
+    - . "./share/spack/setup-env.sh"
+    - spack --version
+    - spack buildcache update-index --mirror-url "s3://spack-binaries/${CI_COMMIT_REF_NAME}"

 ########################################
 # TEMPLATE FOR ADDING ANOTHER PIPELINE

@@ -83,8 +115,8 @@ default:
 # my-super-cool-stack-pr-generate:
 #   extends: [ ".my-super-cool-stack", ".pr-generate"]
 #
-# my-super-cool-stack-develop-generate:
-#   extends: [ ".my-super-cool-stack", ".develop-generate"]
+# my-super-cool-stack-protected-generate:
+#   extends: [ ".my-super-cool-stack", ".protected-generate"]
 #
 # my-super-cool-stack-pr-build:
 #   extends: [ ".my-super-cool-stack", ".pr-build" ]

@@ -94,24 +126,62 @@ default:
 #       job: my-super-cool-stack-pr-generate
 #     strategy: depend
 #
-# my-super-cool-stack-develop-build:
-#   extends: [ ".my-super-cool-stack", ".develop-build" ]
+# my-super-cool-stack-protected-build:
+#   extends: [ ".my-super-cool-stack", ".protected-build" ]
 #   trigger:
 #     include:
 #       - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
-#         job: my-super-cool-stack-develop-generate
+#         job: my-super-cool-stack-protected-generate
 #     strategy: depend

 ########################################
 # E4S Mac Stack
 #
 # With no near-future plans to have
 # protected aws runners running mac
 # builds, it seems best to decouple
 # them from the rest of the stacks for
 # the time being. This way they can
 # still run on UO runners and be signed
 # using the previous approach.
 ########################################
 .e4s-mac:
   variables:
     SPACK_CI_STACK_NAME: e4s-mac
   allow_failure: True

+.mac-pr:
+  only:
+    - /^pr[\d]+_.*$/
+    - /^github\/pr[\d]+_.*$/
+  variables:
+    SPACK_BUILDCACHE_DESTINATION: "s3://spack-binaries-prs/${CI_COMMIT_REF_NAME}"
+    SPACK_PRUNE_UNTOUCHED: "True"
+
+.mac-protected:
+  only:
+    - /^develop$/
+    - /^releases\/v.*/
+    - /^v.*/
+    - /^github\/develop$/
+  variables:
+    SPACK_BUILDCACHE_DESTINATION: "s3://spack-binaries/${CI_COMMIT_REF_NAME}/${SPACK_CI_STACK_NAME}"
+
+.mac-pr-build:
+  extends: [ ".mac-pr", ".build" ]
+  variables:
+    AWS_ACCESS_KEY_ID: ${PR_MIRRORS_AWS_ACCESS_KEY_ID}
+    AWS_SECRET_ACCESS_KEY: ${PR_MIRRORS_AWS_SECRET_ACCESS_KEY}
+
+.mac-protected-build:
+  extends: [ ".mac-protected", ".build" ]
+  variables:
+    AWS_ACCESS_KEY_ID: ${PROTECTED_MIRRORS_AWS_ACCESS_KEY_ID}
+    AWS_SECRET_ACCESS_KEY: ${PROTECTED_MIRRORS_AWS_SECRET_ACCESS_KEY}
+    SPACK_SIGNING_KEY: ${PACKAGE_SIGNING_KEY}
+
 e4s-mac-pr-generate:
-  extends: [".e4s-mac", ".pr"]
+  extends: [".e4s-mac", ".mac-pr"]
   stage: generate
   script:
     - tmp="$(mktemp -d)"; export SPACK_USER_CONFIG_PATH="$tmp"; export SPACK_USER_CACHE_PATH="$tmp"

@@ -135,8 +205,8 @@ e4s-mac-pr-generate:
       - stuck_or_timeout_failure
   timeout: 60 minutes

-e4s-mac-develop-generate:
-  extends: [".e4s-mac", ".develop"]
+e4s-mac-protected-generate:
+  extends: [".e4s-mac", ".mac-protected"]
   stage: generate
   script:
     - tmp="$(mktemp -d)"; export SPACK_USER_CONFIG_PATH="$tmp"; export SPACK_USER_CACHE_PATH="$tmp"

@@ -161,7 +231,7 @@ e4s-mac-develop-generate:
   timeout: 60 minutes

 e4s-mac-pr-build:
-  extends: [ ".e4s-mac", ".pr-build" ]
+  extends: [ ".e4s-mac", ".mac-pr-build" ]
   trigger:
     include:
       - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml

@@ -171,16 +241,16 @@ e4s-mac-pr-build:
     - artifacts: True
       job: e4s-mac-pr-generate

-e4s-mac-develop-build:
-  extends: [ ".e4s-mac", ".develop-build" ]
+e4s-mac-protected-build:
+  extends: [ ".e4s-mac", ".mac-protected-build" ]
   trigger:
     include:
       - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
-        job: e4s-mac-develop-generate
+        job: e4s-mac-protected-generate
     strategy: depend
   needs:
     - artifacts: True
-      job: e4s-mac-develop-generate
+      job: e4s-mac-protected-generate

 ########################################
 # E4S pipeline

@@ -192,8 +262,8 @@ e4s-mac-develop-build:
 e4s-pr-generate:
   extends: [ ".e4s", ".pr-generate"]

-e4s-develop-generate:
-  extends: [ ".e4s", ".develop-generate"]
+e4s-protected-generate:
+  extends: [ ".e4s", ".protected-generate"]

 e4s-pr-build:
   extends: [ ".e4s", ".pr-build" ]

@@ -206,16 +276,16 @@ e4s-pr-build:
     - artifacts: True
       job: e4s-pr-generate

-e4s-develop-build:
-  extends: [ ".e4s", ".develop-build" ]
+e4s-protected-build:
+  extends: [ ".e4s", ".protected-build" ]
   trigger:
     include:
       - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
-        job: e4s-develop-generate
+        job: e4s-protected-generate
     strategy: depend
   needs:
     - artifacts: True
-      job: e4s-develop-generate
+      job: e4s-protected-generate

 ########################################
 # E4S on Power

@@ -231,8 +301,8 @@ e4s-develop-build:
 # e4s-on-power-pr-generate:
 #   extends: [ ".e4s-on-power", ".pr-generate", ".power-e4s-generate-tags-and-image"]

-# e4s-on-power-develop-generate:
-#   extends: [ ".e4s-on-power", ".develop-generate", ".power-e4s-generate-tags-and-image"]
+# e4s-on-power-protected-generate:
+#   extends: [ ".e4s-on-power", ".protected-generate", ".power-e4s-generate-tags-and-image"]

 # e4s-on-power-pr-build:
 #   extends: [ ".e4s-on-power", ".pr-build" ]

@@ -245,16 +315,16 @@ e4s-develop-build:
 #     - artifacts: True
 #       job: e4s-on-power-pr-generate

-# e4s-on-power-develop-build:
-#   extends: [ ".e4s-on-power", ".develop-build" ]
+# e4s-on-power-protected-build:
+#   extends: [ ".e4s-on-power", ".protected-build" ]
 #   trigger:
 #     include:
 #       - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
-#         job: e4s-on-power-develop-generate
+#         job: e4s-on-power-protected-generate
 #     strategy: depend
 #   needs:
 #     - artifacts: True
-#       job: e4s-on-power-develop-generate
+#       job: e4s-on-power-protected-generate

 #########################################
 # Build tests for different build-systems

@@ -266,8 +336,8 @@ e4s-develop-build:
 build_systems-pr-generate:
   extends: [ ".build_systems", ".pr-generate"]

-build_systems-develop-generate:
-  extends: [ ".build_systems", ".develop-generate"]
+build_systems-protected-generate:
+  extends: [ ".build_systems", ".protected-generate"]

 build_systems-pr-build:
   extends: [ ".build_systems", ".pr-build" ]

@@ -280,16 +350,16 @@ build_systems-pr-build:
     - artifacts: True
       job: build_systems-pr-generate

-build_systems-develop-build:
-  extends: [ ".build_systems", ".develop-build" ]
+build_systems-protected-build:
+  extends: [ ".build_systems", ".protected-build" ]
   trigger:
     include:
       - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
-        job: build_systems-develop-generate
+        job: build_systems-protected-generate
     strategy: depend
   needs:
     - artifacts: True
-      job: build_systems-develop-generate
+      job: build_systems-protected-generate

 #########################################
 # RADIUSS

@@ -313,20 +383,20 @@ radiuss-pr-build:
     - artifacts: True
       job: radiuss-pr-generate

-# --------- Develop ---------
-radiuss-develop-generate:
-  extends: [ ".radiuss", ".develop-generate" ]
+# --------- Protected ---------
+radiuss-protected-generate:
+  extends: [ ".radiuss", ".protected-generate" ]

-radiuss-develop-build:
-  extends: [ ".radiuss", ".develop-build" ]
+radiuss-protected-build:
+  extends: [ ".radiuss", ".protected-build" ]
   trigger:
     include:
       - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
-        job: radiuss-develop-generate
+        job: radiuss-protected-generate
     strategy: depend
   needs:
     - artifacts: True
-      job: radiuss-develop-generate
+      job: radiuss-protected-generate

 ########################################
 # ECP Data & Vis SDK

@@ -338,8 +408,8 @@ radiuss-develop-build:
 data-vis-sdk-pr-generate:
   extends: [ ".data-vis-sdk", ".pr-generate"]

-data-vis-sdk-develop-generate:
-  extends: [ ".data-vis-sdk", ".develop-generate"]
+data-vis-sdk-protected-generate:
+  extends: [ ".data-vis-sdk", ".protected-generate"]

 data-vis-sdk-pr-build:
   extends: [ ".data-vis-sdk", ".pr-build" ]

@@ -352,16 +422,174 @@ data-vis-sdk-pr-build:
     - artifacts: True
       job: data-vis-sdk-pr-generate

-data-vis-sdk-develop-build:
-  extends: [ ".data-vis-sdk", ".develop-build" ]
+data-vis-sdk-protected-build:
+  extends: [ ".data-vis-sdk", ".protected-build" ]
   trigger:
     include:
       - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
-        job: data-vis-sdk-develop-generate
+        job: data-vis-sdk-protected-generate
     strategy: depend
   needs:
     - artifacts: True
-      job: data-vis-sdk-develop-generate
+      job: data-vis-sdk-protected-generate

+########################################
+# AWS AHUG Applications (x86_64)
+########################################
+
+# Call this AFTER .*-generate
+.aws-ahug-overrides:
+  # This controls image for generate step; build step is controlled by spack.yaml
+  # Note that generator emits OS info for build so these should be the same.
+  image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
+
+.aws-ahug:
+  variables:
+    SPACK_CI_STACK_NAME: aws-ahug
+
+aws-ahug-pr-generate:
+  extends: [ ".aws-ahug", ".pr-generate", ".aws-ahug-overrides" ]
+  tags: ["spack", "public", "medium", "x86_64_v4"]
+
+aws-ahug-protected-generate:
+  extends: [ ".aws-ahug", ".protected-generate", ".aws-ahug-overrides" ]
+  tags: ["spack", "public", "medium", "x86_64_v4"]
+
+aws-ahug-pr-build:
+  extends: [ ".aws-ahug", ".pr-build" ]
+  trigger:
+    include:
+      - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
+        job: aws-ahug-pr-generate
+    strategy: depend
+  needs:
+    - artifacts: True
+      job: aws-ahug-pr-generate
+
+aws-ahug-protected-build:
+  extends: [ ".aws-ahug", ".protected-build" ]
+  trigger:
+    include:
+      - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
+        job: aws-ahug-protected-generate
+    strategy: depend
+  needs:
+    - artifacts: True
+      job: aws-ahug-protected-generate
+
+
+# Parallel Pipeline for aarch64 (reuses override image, but generates and builds on aarch64)
+.aws-ahug-aarch64:
+  variables:
+    SPACK_CI_STACK_NAME: aws-ahug-aarch64
+
+aws-ahug-aarch64-pr-generate:
+  extends: [ ".aws-ahug-aarch64", ".pr-generate-aarch64", ".aws-ahug-overrides" ]
+
+aws-ahug-aarch64-protected-generate:
+  extends: [ ".aws-ahug-aarch64", ".protected-generate-aarch64", ".aws-ahug-overrides" ]
+
+aws-ahug-aarch64-pr-build:
+  extends: [ ".aws-ahug-aarch64", ".pr-build" ]
+  trigger:
+    include:
+      - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
+        job: aws-ahug-aarch64-pr-generate
+    strategy: depend
+  needs:
+    - artifacts: True
+      job: aws-ahug-aarch64-pr-generate
+
+aws-ahug-aarch64-protected-build:
+  extends: [ ".aws-ahug-aarch64", ".protected-build" ]
+  trigger:
+    include:
+      - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
+        job: aws-ahug-aarch64-protected-generate
+    strategy: depend
+  needs:
+    - artifacts: True
+      job: aws-ahug-aarch64-protected-generate
+
+########################################
+# AWS ISC Applications (x86_64)
+########################################
+
+# Call this AFTER .*-generate
+.aws-isc-overrides:
+  # This controls image for generate step; build step is controlled by spack.yaml
+  # Note that generator emits OS info for build so these should be the same.
+  image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
+
+.aws-isc:
+  variables:
+    SPACK_CI_STACK_NAME: aws-isc
+
+aws-isc-pr-generate:
+  extends: [ ".aws-isc", ".pr-generate", ".aws-isc-overrides" ]
+  tags: ["spack", "public", "medium", "x86_64_v4"]
+
+aws-isc-protected-generate:
+  extends: [ ".aws-isc", ".protected-generate", ".aws-isc-overrides" ]
+  tags: ["spack", "public", "medium", "x86_64_v4"]
+
+aws-isc-pr-build:
+  extends: [ ".aws-isc", ".pr-build" ]
+  trigger:
+    include:
+      - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
+        job: aws-isc-pr-generate
+    strategy: depend
+  needs:
+    - artifacts: True
+      job: aws-isc-pr-generate
+
+aws-isc-protected-build:
+  extends: [ ".aws-isc", ".protected-build" ]
+  trigger:
+    include:
+      - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
+        job: aws-isc-protected-generate
+    strategy: depend
+  needs:
+    - artifacts: True
+      job: aws-isc-protected-generate
+
+
+# Parallel Pipeline for aarch64 (reuses override image, but generates and builds on aarch64)
+
+.aws-isc-aarch64:
+  variables:
+    SPACK_CI_STACK_NAME: aws-isc-aarch64
+
+aws-isc-aarch64-pr-generate:
+  extends: [ ".aws-isc-aarch64", ".pr-generate-aarch64", ".aws-isc-overrides" ]
+
+aws-isc-aarch64-protected-generate:
+  extends: [ ".aws-isc-aarch64", ".protected-generate-aarch64", ".aws-isc-overrides" ]
+
+aws-isc-aarch64-pr-build:
+  extends: [ ".aws-isc-aarch64", ".pr-build" ]
+  trigger:
+    include:
+      - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
+        job: aws-isc-aarch64-pr-generate
+    strategy: depend
+  needs:
+    - artifacts: True
+      job: aws-isc-aarch64-pr-generate
+
+aws-isc-aarch64-protected-build:
+  extends: [ ".aws-isc-aarch64", ".protected-build" ]
+  trigger:
+    include:
+      - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
+        job: aws-isc-aarch64-protected-generate
+    strategy: depend
+  needs:
+    - artifacts: True
+      job: aws-isc-aarch64-protected-generate

 ########################################
 # Spack Tutorial

@@ -373,8 +601,8 @@ data-vis-sdk-develop-build:
 tutorial-pr-generate:
   extends: [ ".tutorial", ".pr-generate"]

-tutorial-develop-generate:
-  extends: [ ".tutorial", ".develop-generate"]
+tutorial-protected-generate:
+  extends: [ ".tutorial", ".protected-generate"]

 tutorial-pr-build:
   extends: [ ".tutorial", ".pr-build" ]

@@ -387,13 +615,13 @@ tutorial-pr-build:
     - artifacts: True
       job: tutorial-pr-generate

-tutorial-develop-build:
-  extends: [ ".tutorial", ".develop-build" ]
+tutorial-protected-build:
+  extends: [ ".tutorial", ".protected-build" ]
   trigger:
     include:
       - artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
-        job: tutorial-develop-generate
+        job: tutorial-protected-generate
     strategy: depend
   needs:
     - artifacts: True
-      job: tutorial-develop-generate
+      job: tutorial-protected-generate
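The develop-to-protected rename in this pipeline hinges on GitLab's `extends` merging: splitting the old `.develop` anchor into `.protected-refs` (ref filters only) and `.protected` (filters plus variables) lets the new `protected-publish` job opt into the branch rules without inheriting build-cache variables. A minimal sketch of that two-level layering (job names here are illustrative, not part of the pipeline above):

```yaml
# Two-level layering: ref filters alone, then variables stacked on top.
.protected-refs:
  only:
    - /^develop$/

.protected:
  extends: [ ".protected-refs" ]
  variables:
    SPACK_PIPELINE_TYPE: "spack_protected_branch"

# Build jobs take refs + variables; a publish job takes refs only.
example-build:
  extends: [ ".protected" ]
  script: [ "echo build" ]

example-publish:
  extends: [ ".protected-refs" ]
  script: [ "echo publish" ]
```

Because `extends` performs a reverse-deep hash merge, `example-build` ends up with both the `only` patterns and the variables, while `example-publish` matches the same refs but carries no protected-branch variables.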
@@ -0,0 +1,329 @@
spack:
  view: false

  concretizer:
    reuse: false
    unify: false

  config:
    concretizer: clingo
    install_tree:
      root: /home/software/spack
      padded_length: 384
      projections:
        all: '{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'

  packages:
    all:
      providers:
        blas:
          - openblas
        mkl:
          - intel-oneapi-mkl
        mpi:
          - openmpi
          - mpich
      variants: +mpi
    binutils:
      variants: +ld +gold +headers +libiberty ~nls
      version:
        - 2.36.1
    doxygen:
      version:
        - 1.8.20
    elfutils:
      variants: +bzip2 ~nls +xz
    hdf5:
      variants: +fortran +hl +shared
    libfabric:
      variants: fabrics=efa,tcp,udp,sockets,verbs,shm,mrail,rxd,rxm
    libunwind:
      variants: +pic +xz
    #m4:
    #  version:
    #    - 1.4.18
    mesa:
      variants: ~llvm
    mesa18:
      variants: ~llvm
    mpich:
      #variants: ~wrapperrpath pmi=pmi netmod=ofi device=ch4
      variants: ~wrapperrpath netmod=ofi device=ch4
    #munge:
    #  variants: localstatedir=/var
    ncurses:
      variants: +termlib
    openblas:
      variants: threads=openmp
    openmpi:
      #variants: +pmi +internal-hwloc fabrics=ofi schedulers=slurm
      variants: fabrics=ofi
    openturns:
      version: [1.18]
    #slurm:
    #  variants: +pmix sysconfdir=/opt/slurm/etc
    trilinos:
      variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    xz:
      variants: +pic

  definitions:

  - compiler_specs:
    - gcc
    #- nvhpc@22.1

  # - compilers:
  #   - '%gcc@7.5.0'
  #   - '%arm@21.0.0.879'
  #   - '%nvhpc@21.2'
  #   - 'arm@21.0.0.879'
  #- when: arch.satisfies('os=ubuntu18.04')
  #  compilers: ['gcc@7.5.0', $bootstrap-compilers]
  #- when: arch.satisfies('os=amzn2')
  #  compilers: ['gcc@7.3.1', $bootstrap-compilers]

  # Note skipping spot since no spack package for it
  - ahug_miniapps:
    - cloverleaf
    # - coevp
    # Bad code pushed to github; needs specific versioning
    # - cohmm
    # - examinimd
    # - exampm
    # depends on openblas version conflicting on gcc@7.3.1
    # - exasp2
    # depends on openblas version conflicting on gcc@7.3.1
    # - gamess-ri-mp2-miniapp
    - hpcg
    # depends on openblas version conflicting on gcc@7.3.1
    # - laghos
    - lulesh
    # - miniaero
    - miniamr
    - minife
    - minighost
    # fails due to x86 specific timer (asm instructions)
    # - minigmg
    - minimd
    # depends on openblas version conflicting on gcc@7.3.1
    # - miniqmc
    - minismac2d
    - minitri
    - minivite
    - minixyce
    - pennant
    - picsarlite
    - quicksilver
    # - remhos
    - rsbench
    - simplemoc
    - snap
    - snappy
    - tealeaf
    # depends on openblas version conflicting on gcc@7.3.1
    # - thornado-mini
    - tycho2
    - xsbench

  - ahug_fullapps:
    # depends on openblas version conflicting on gcc@7.3.1
    # - abinit
    - abyss
    # conflicts on trilinos
    # - albany
    # - amber
    - amg2013
    # Bad variant fftw
    # - aoflagger
    # - athena
    # - bowtie2
    - branson
    # - camx
    # Bad variant gpu
    # - candle-benchmarks
    - cbench
    - cgm
    - chatterbug
    # - cistem
    - comd
    # old version of openmpi
    # - converge
    # bad variant tensor ops mpi
    # - cosmoflow-benchmark
    # - cosmomc
    - cosp2
    # libxsmm not avail on arm
    # - cp2k
    # - dock
    # - elk
    - elmerfem
    # - exabayes
    # - examl
    # - flecsph
    # trilinos variant mumps
    # - frontistr
    - gatk
    - graph500
    - hpgmg
    - lammps
    - latte
    - macsio
    # - meep
    - meme
    # - modylas
    # - mrbayes
    # - mrchem
    # cudnn dependency
    # - mxnet
    # trilinos variant mumps
    # - nalu
    # - nalu-wind
    # - namd
    - nek5000
    - nekbone
    # - nektar
    # - nest
    - nut
    # - nwchem
    - octopus
    - openmm
    - pathfinder
    # - picsar
    - pism
    # meson version
    # - qbox
    - qmcpack
    - quantum-espresso
    # - relion
    # - siesta
    - snbone
    - star
    - su2
    - swfft
    - tinker
    # gfortran lt 9 unsupported
    # - vasp
    - vpfft
    - vpic
    - warpx
    # - yambo

  - compiler:
    - '%gcc@7.3.1'

  - target:
    - 'target=aarch64'
    - 'target=graviton2'


  specs:

  - matrix:
    - - $ahug_miniapps
    - - $compiler
    - - $target

  - matrix:
    - - $ahug_fullapps
    - - $compiler
    - - $target

  # Build compilers to stage in binary cache
  - matrix:
    - - $compiler_specs
    - - $compiler
    - - $target

  mirrors: { "mirror": "s3://spack-binaries/develop/aws-ahug-aarch64" }

  gitlab-ci:

    script:
      - . "./share/spack/setup-env.sh"
      - spack --version
      - cd ${SPACK_CONCRETE_ENV_DIR}
      - spack env activate --without-view .
      - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
      - mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
      - if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
      - if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
      - spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)

    image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
    mappings:
      - match:
          - llvm
          - llvm-amdgpu
          - paraview
        runner-attributes:
          tags: [ "spack", "huge", "aarch64" ]
          variables:
            CI_JOB_SIZE: huge
            KUBERNETES_CPU_REQUEST: 11000m
            KUBERNETES_MEMORY_REQUEST: 42G

      - match:
          - ascent
          - axom
          - cuda
          - dyninst
          - gcc
          - ginkgo
          - hpx
          - kokkos-kernels
          - kokkos-nvcc-wrapper
          - magma
          - mfem
          - mpich
          - openturns
          - precice
          - raja
          - rocblas
          - rocsolver
          - rust
          - slate
          - strumpack
          - sundials
          - trilinos
          - umpire
          - vtk-h
          - vtk-m
          - warpx
        runner-attributes:
          tags: [ "spack", "large", "aarch64" ]
          variables:
            CI_JOB_SIZE: large
            KUBERNETES_CPU_REQUEST: 8000m
            KUBERNETES_MEMORY_REQUEST: 12G

      - match: ['os=amzn2']
        runner-attributes:
          tags: ["spack", "aarch64"]
          variables:
            CI_JOB_SIZE: "default"

    broken-specs-url: "s3://spack-binaries/broken-specs"

    service-job-attributes:
      before_script:
        - . "./share/spack/setup-env.sh"
        - spack --version
      image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
      tags: ["spack", "public", "aarch64"]

    signing-job-attributes:
      image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
      tags: ["spack", "aws"]
      script:
        - aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
        - /sign.sh
        - aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache

  cdash:
    build-group: AHUG ARM HPC User Group
    url: https://cdash.spack.io
    project: Spack Testing
    site: Cloud Gitlab Infrastructure
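A spack.yaml `matrix` entry like the ones in the `specs:` section above expands to the cross product of its rows. A minimal sketch of that expansion, assuming trimmed, illustrative spec lists (the `expand_matrix` helper is not Spack's implementation):

```python
from itertools import product

def expand_matrix(rows):
    """Join every combination of one fragment per row into a spec string,
    mirroring how a spack.yaml `matrix` entry expands."""
    return [" ".join(combo) for combo in product(*rows)]

# Rows modeled on the aws-ahug-aarch64 stack above (a small slice only).
rows = [
    ["lulesh", "hpcg"],                       # part of $ahug_miniapps
    ["%gcc@7.3.1"],                           # $compiler
    ["target=aarch64", "target=graviton2"],   # $target
]
specs = expand_matrix(rows)
# 2 packages x 1 compiler x 2 targets -> 4 concrete spec strings
```

Each of the three matrices in the stack is expanded this way and the results are concatenated into one spec list for the pipeline.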
330  share/spack/gitlab/cloud_pipelines/stacks/aws-ahug/spack.yaml  Normal file
@@ -0,0 +1,330 @@
spack:
  view: false

  concretizer:
    reuse: false
    unify: false

  config:
    concretizer: clingo
    install_tree:
      root: /home/software/spack
      padded_length: 384
      projections:
        all: '{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'

  packages:
    all:
      providers:
        blas:
          - openblas
        mkl:
          - intel-oneapi-mkl
        mpi:
          - openmpi
          - mpich
      variants: +mpi
    binutils:
      variants: +ld +gold +headers +libiberty ~nls
      version:
        - 2.36.1
    doxygen:
      version:
        - 1.8.20
    elfutils:
      variants: +bzip2 ~nls +xz
    hdf5:
      variants: +fortran +hl +shared
    libfabric:
      variants: fabrics=efa,tcp,udp,sockets,verbs,shm,mrail,rxd,rxm
    libunwind:
      variants: +pic +xz
    #m4:
    #  version:
    #    - 1.4.18
    mesa:
      variants: ~llvm
    mesa18:
      variants: ~llvm
    mpich:
      #variants: ~wrapperrpath pmi=pmi netmod=ofi device=ch4
      variants: ~wrapperrpath netmod=ofi device=ch4
    #munge:
    #  variants: localstatedir=/var
    ncurses:
      variants: +termlib
    openblas:
      variants: threads=openmp
    openmpi:
      #variants: +pmi +internal-hwloc fabrics=ofi schedulers=slurm
      variants: fabrics=ofi
    openturns:
      version: [1.18]
    #slurm:
    #  variants: +pmix sysconfdir=/opt/slurm/etc
    trilinos:
      variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    xz:
      variants: +pic

  definitions:

  - compiler_specs:
    - gcc
    #- nvhpc@22.1

  # - compilers:
  #   - '%gcc@7.5.0'
  #   - '%arm@21.0.0.879'
  #   - '%nvhpc@21.2'
  #   - 'arm@21.0.0.879'
  #- when: arch.satisfies('os=ubuntu18.04')
  #  compilers: ['gcc@7.5.0', $bootstrap-compilers]
  #- when: arch.satisfies('os=amzn2')
  #  compilers: ['gcc@7.3.1', $bootstrap-compilers]

  # Note skipping spot since no spack package for it
  - ahug_miniapps:
    - cloverleaf
    # - coevp
    # Bad code pushed to github; needs specific versioning
    # - cohmm
    # - examinimd
    # - exampm
    # depends on openblas version conflicting on gcc@7.3.1
    # - exasp2
    # depends on openblas version conflicting on gcc@7.3.1
    # - gamess-ri-mp2-miniapp
    - hpcg
    # depends on openblas version conflicting on gcc@7.3.1
    # - laghos
    - lulesh
    # - miniaero
    - miniamr
    - minife
    - minighost
    # fails due to x86 specific timer (asm instructions)
    # - minigmg
    - minimd
    # depends on openblas version conflicting on gcc@7.3.1
    # - miniqmc
    - minismac2d
    - minitri
    - minivite
    - minixyce
    - pennant
    - picsarlite
    - quicksilver
    # - remhos
    - rsbench
    - simplemoc
    - snap
    - snappy
    - tealeaf
    # depends on openblas version conflicting on gcc@7.3.1
    # - thornado-mini
    - tycho2
    - xsbench

  - ahug_fullapps:
    # depends on openblas version conflicting on gcc@7.3.1
    # - abinit
    - abyss
    # conflicts on trilinos
    # - albany
    # - amber
    - amg2013
    # Bad variant fftw
    # - aoflagger
    # - athena
    # - bowtie2
    - branson
    # - camx
    # Bad variant gpu
    # - candle-benchmarks
    - cbench
    - cgm
    - chatterbug
    # - cistem
    - comd
    # old version of openmpi
    # - converge
    # bad variant tensor ops mpi
    # - cosmoflow-benchmark
    # - cosmomc
    - cosp2
    # libxsmm not avail on arm
    # - cp2k
    # - dock
    # - elk
    - elmerfem
    # - exabayes
    # - examl
    # - flecsph
    # trilinos variant mumps
    # - frontistr
    - gatk
    - graph500
    - hpgmg
    - lammps
    - latte
    - macsio
    # - meep
    - meme
    # - modylas
    # - mrbayes
    # - mrchem
    # cudnn dependency
    # - mxnet
    # trilinos variant mumps
    # - nalu
    # - nalu-wind
    # - namd
    - nek5000
    - nekbone
    # - nektar
    # - nest
    - nut
    # - nwchem
    - octopus
    - openmm
    - pathfinder
    # - picsar
    - pism
    # meson version
    # - qbox
    - qmcpack
    - quantum-espresso
    # - relion
    # - siesta
    - snbone
    - star
    - su2
    - swfft
    - tinker
    # gfortran lt 9 unsupported
    # - vasp
    - vpfft
    - vpic
    - warpx
    # - yambo

  - compiler:
    - '%gcc@7.3.1'

  - target:
    #- 'target=x86_64'
    - 'target=x86_64_v3'
    - 'target=x86_64_v4'


  specs:

  - matrix:
    - - $ahug_miniapps
    - - $compiler
    - - $target

  - matrix:
    - - $ahug_fullapps
    - - $compiler
    - - $target

  # Build compilers to stage in binary cache
  - matrix:
    - - $compiler_specs
    - - $compiler
    - - $target

  mirrors: { "mirror": "s3://spack-binaries/develop/aws-ahug" }

  gitlab-ci:

    script:
      - . "./share/spack/setup-env.sh"
      - spack --version
      - cd ${SPACK_CONCRETE_ENV_DIR}
      - spack env activate --without-view .
      - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
      - mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
      - if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
      - if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
      - spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)

    image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
    mappings:
      - match:
          - llvm
          - llvm-amdgpu
          - paraview
        runner-attributes:
          tags: [ "spack", "huge", "x86_64_v4" ]
          variables:
            CI_JOB_SIZE: huge
            KUBERNETES_CPU_REQUEST: 11000m
            KUBERNETES_MEMORY_REQUEST: 42G

      - match:
          - ascent
          - axom
          - cuda
          - dyninst
          - gcc
          - ginkgo
          - hpx
          - kokkos-kernels
          - kokkos-nvcc-wrapper
          - magma
          - mfem
          - mpich
          - openturns
          - precice
          - raja
          - rocblas
          - rocsolver
          - rust
          - slate
          - strumpack
          - sundials
          - trilinos
          - umpire
          - vtk-h
          - vtk-m
          - warpx
        runner-attributes:
          tags: [ "spack", "large", "x86_64_v4" ]
          variables:
            CI_JOB_SIZE: large
            KUBERNETES_CPU_REQUEST: 8000m
            KUBERNETES_MEMORY_REQUEST: 12G

      - match: ['os=amzn2']
        runner-attributes:
          tags: ["spack", "x86_64_v4"]
          variables:
            CI_JOB_SIZE: "default"

    broken-specs-url: "s3://spack-binaries/broken-specs"

    service-job-attributes:
      before_script:
        - . "./share/spack/setup-env.sh"
        - spack --version
      image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
      tags: ["spack", "public", "x86_64_v4"]

    signing-job-attributes:
      image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
      tags: ["spack", "aws"]
      script:
        - aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
        - /sign.sh
        - aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache

  cdash:
    build-group: AHUG ARM HPC User Group
    url: https://cdash.spack.io
    project: Spack Testing
    site: Cloud Gitlab Infrastructure
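The `mappings` blocks above assign runner tags to each build job from the first `match` entry that covers the package, falling back to the catch-all entry. A simplified sketch of that first-match selection (real Spack CI matches full spec constraints such as `os=amzn2`, not just package names; the helper name and trimmed entries are illustrative):

```python
def runner_tags(pkg_name, mappings, default_tags):
    """Return the tags of the first mappings entry whose `match` list
    names the package; otherwise return the fallback tags.
    Simplified: real Spack CI matches spec constraints, not only names."""
    for entry in mappings:
        if pkg_name in entry["match"]:
            return entry["runner-attributes"]["tags"]
    return default_tags

# Entries modeled on the aws-ahug stack above.
mappings = [
    {"match": ["llvm", "llvm-amdgpu", "paraview"],
     "runner-attributes": {"tags": ["spack", "huge", "x86_64_v4"]}},
    {"match": ["trilinos", "mpich", "warpx"],
     "runner-attributes": {"tags": ["spack", "large", "x86_64_v4"]}},
]
```

Because entries are scanned in order, the expensive "huge" runners only pick up the few packages listed first, and everything else falls through to smaller runners.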
@@ -0,0 +1,256 @@
spack:
  view: false

  concretizer:
    reuse: false
    unify: false

  config:
    concretizer: clingo
    install_tree:
      root: /home/software/spack
      padded_length: 384
      projections:
        all: '{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'

  packages:
    all:
      providers:
        blas:
          - openblas
        mkl:
          - intel-oneapi-mkl
        mpi:
          - openmpi
          - mpich
      variants: +mpi
    binutils:
      variants: +ld +gold +headers +libiberty ~nls
      version:
        - 2.36.1
    doxygen:
      version:
        - 1.8.20
    elfutils:
      variants: +bzip2 ~nls +xz
    hdf5:
      variants: +fortran +hl +shared
    libfabric:
      variants: fabrics=efa,tcp,udp,sockets,verbs,shm,mrail,rxd,rxm
    libunwind:
      variants: +pic +xz
    mesa:
      variants: ~llvm
    mesa18:
      variants: ~llvm
    mpich:
      variants: ~wrapperrpath netmod=ofi device=ch4
    ncurses:
      variants: +termlib
    openblas:
      variants: threads=openmp
    openmpi:
      variants: fabrics=ofi +legacylaunchers
    openturns:
      version: [1.18]
    relion:
      variants: ~mklfft
    # texlive:
    #   version: [20210325]
    trilinos:
      variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    xz:
      variants: +pic

  definitions:

  - compiler_specs:
    - gcc@11.2
    # Licensing OK?
    # - intel-oneapi-compilers@2022.1
    # - nvhpc

  - cuda_specs:
    # Depends on ctffind which embeds fsincos (x86-specific asm) within code. Will not build on ARM
    #- relion +cuda cuda_arch=70
    - raja +cuda cuda_arch=70
    - mfem +cuda cuda_arch=70

  - app_specs:
    - bwa
    # Depends on simde which requires newer compiler?
    #- bowtie2
    # Requires x86_64 specific ASM
    #- cistem
    - cromwell
    - fastqc
    - flux-sched
    - flux-core
    - flux-pmix
    - gatk
    - gromacs
    - lammps
    - wrf build_type=dm+sm
    - mfem
    - mpas-model ^parallelio+pnetcdf
    - nextflow
    - octave
    - openfoam
    - osu-micro-benchmarks
    - parallel
    - paraview
    - picard
    - quantum-espresso
    - raja
    # Depends on bowtie2 -> simde which requires newer compiler?
    #- rsem
    # Errors on texlive
    #- rstudio
    - salmon
    - samtools
    - seqtk
    - snakemake
    - star
    # Requires gcc@9:
    #- ufs-weather-model
    # requires LLVM which fails without constraint
    #- visit

  - lib_specs:
    - openmpi fabrics=ofi
    - openmpi fabrics=ofi +legacylaunchers
    - openmpi fabrics=auto
    - mpich
    - libfabric

  - compiler:
    - '%gcc@7.3.1'

  - target:
    - 'target=aarch64'
    - 'target=graviton2'


  specs:

  - matrix:
    - - $cuda_specs
    - - $compiler
    - - $target

  - matrix:
    - - $app_specs
    - - $compiler
    - - $target

  - matrix:
    - - $lib_specs
    - - $compiler
    - - $target

  - matrix:
    - - $compiler_specs
    - - $compiler
    - - $target


  mirrors: { "mirror": "s3://spack-binaries/develop/aws-isc-aarch64" }

  gitlab-ci:

    script:
      - . "./share/spack/setup-env.sh"
      - spack --version
      - cd ${SPACK_CONCRETE_ENV_DIR}
      - spack env activate --without-view .
      - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
      - mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
      - if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
      - if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
      - spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)

    image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
    mappings:
      - match:
          - llvm
          - llvm-amdgpu
          - paraview
        runner-attributes:
          tags: [ "spack", "huge", "aarch64" ]
          variables:
            CI_JOB_SIZE: huge
            KUBERNETES_CPU_REQUEST: 15000m
            KUBERNETES_MEMORY_REQUEST: 62G

      - match:
          - ascent
          - atk
          - axom
          - cistem
          - ctffind
          - cuda
          - dyninst
          - gcc
          - ginkgo
          - hdf5
          - hpx
          - kokkos-kernels
          - kokkos-nvcc-wrapper
          - magma
          - mfem
          - mpich
          - openturns
          - parallelio
          - precice
          - raja
          - relion
          - rocblas
          - rocsolver
          - rust
          - slate
          - strumpack
          - sundials
          - trilinos
          - umpire
          - vtk
          - vtk-h
          - vtk-m
          - warpx
          - wrf
          - wxwidgets
        runner-attributes:
          tags: [ "spack", "large", "aarch64" ]
          variables:
            CI_JOB_SIZE: large
            KUBERNETES_CPU_REQUEST: 8000m
            KUBERNETES_MEMORY_REQUEST: 12G

      - match: ['os=amzn2']
        runner-attributes:
          tags: ["spack", "aarch64"]
          variables:
            CI_JOB_SIZE: "default"

    broken-specs-url: "s3://spack-binaries/broken-specs"

    service-job-attributes:
      before_script:
        - . "./share/spack/setup-env.sh"
        - spack --version
      image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
      tags: ["spack", "public", "aarch64"]

    signing-job-attributes:
      image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
      tags: ["spack", "aws"]
      script:
        - aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
        - /sign.sh
        - aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache

  cdash:
    build-group: AWS Packages
    url: https://cdash.spack.io
    project: Spack Testing
    site: Cloud Gitlab Infrastructure
258  share/spack/gitlab/cloud_pipelines/stacks/aws-isc/spack.yaml  Normal file
@@ -0,0 +1,258 @@
spack:
  view: false

  concretizer:
    reuse: false
    unify: false

  config:
    concretizer: clingo
    install_tree:
      root: /home/software/spack
      padded_length: 384
      projections:
        all: '{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'

  packages:
    all:
      providers:
        blas:
          - openblas
        mkl:
          - intel-oneapi-mkl
        mpi:
          - openmpi
          - mpich
      variants: +mpi
    binutils:
      variants: +ld +gold +headers +libiberty ~nls
      version:
        - 2.36.1
    doxygen:
      version:
        - 1.8.20
    elfutils:
      variants: +bzip2 ~nls +xz
    hdf5:
      variants: +fortran +hl +shared
    libfabric:
      variants: fabrics=efa,tcp,udp,sockets,verbs,shm,mrail,rxd,rxm
    libunwind:
      variants: +pic +xz
    mesa:
      variants: ~llvm
    mesa18:
      variants: ~llvm
    mpich:
      variants: ~wrapperrpath netmod=ofi device=ch4
    ncurses:
      variants: +termlib
    openblas:
      variants: threads=openmp
    openmpi:
      variants: fabrics=ofi +legacylaunchers
    openturns:
      version: [1.18]
    relion:
      variants: ~mklfft
    # texlive:
    #   version: [20210325]
    trilinos:
      variants: +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
    xz:
      variants: +pic

  definitions:

  - compiler_specs:
    - gcc@11.2
    # Licensing OK?
    # - intel-oneapi-compilers@2022.1
    # - nvhpc

  - cuda_specs:
    # Disabled for consistency with aarch64
    #- relion +cuda cuda_arch=70
    - raja +cuda cuda_arch=70
    - mfem +cuda cuda_arch=70

  - app_specs:
    - bwa
    # Disabled for consistency with aarch64
    #- bowtie2
    # Disabled for consistency with aarch64
    #- cistem
    - cromwell
    - fastqc
    - flux-sched
    - flux-core
    - flux-pmix
    - gatk
    - gromacs
    - lammps
    - wrf build_type=dm+sm
    - mfem
    - mpas-model ^parallelio+pnetcdf
    - nextflow
    - octave
    - openfoam
    - osu-micro-benchmarks
    - parallel
    - paraview
    - picard
    - quantum-espresso
    # Build broken for gcc@7.3.1 x86_64_v4 (error: '_mm512_loadu_epi32' was not declared in this scope)
    #- raja
    # Disabled for consistency with aarch64
    #- rsem
    # Errors on texlive
    #- rstudio
    - salmon
    - samtools
    - seqtk
    - snakemake
    - star
    # Requires gcc@9:
    #- ufs-weather-model
    # Disabled for consistency with aarch64
    #- visit

  - lib_specs:
    - openmpi fabrics=ofi
    - openmpi fabrics=ofi +legacylaunchers
    - openmpi fabrics=auto
    - mpich
    - libfabric

  - compiler:
    - '%gcc@7.3.1'

  - target:
    - 'target=x86_64_v3'
    - 'target=x86_64_v4'


  specs:

  - matrix:
    - - $cuda_specs
    - - $compiler
    - - $target

  - matrix:
    - - $app_specs
    - - $compiler
    - - $target

  - matrix:
    - - $lib_specs
    - - $compiler
    - - $target

  - matrix:
    - - $compiler_specs
    - - $compiler
    - - $target


  mirrors: { "mirror": "s3://spack-binaries/develop/aws-isc" }

  gitlab-ci:

    script:
      - . "./share/spack/setup-env.sh"
      - spack --version
      - cd ${SPACK_CONCRETE_ENV_DIR}
      - spack env activate --without-view .
      - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
      - mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
      - if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
      - if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
      - spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)

    image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
    mappings:
      - match:
          - llvm
          - llvm-amdgpu
          - pango
          - paraview
        runner-attributes:
          tags: [ "spack", "huge", "x86_64_v4" ]
          variables:
            CI_JOB_SIZE: huge
            KUBERNETES_CPU_REQUEST: 11000m
            KUBERNETES_MEMORY_REQUEST: 42G

      - match:
          - ascent
          - atk
          - axom
          - cistem
          - ctffind
          - cuda
          - dyninst
          - gcc
          - ginkgo
          - hdf5
          - hpx
          - kokkos-kernels
          - kokkos-nvcc-wrapper
          - magma
          - mfem
          - mpich
          - openturns
          - parallelio
          - precice
          - raja
          - relion
          - rocblas
          - rocsolver
          - rust
          - slate
          - strumpack
          - sundials
          - trilinos
          - umpire
          - vtk
          - vtk-h
          - vtk-m
          - warpx
          - wrf
          - wxwidgets
        runner-attributes:
          tags: [ "spack", "large", "x86_64_v4" ]
          variables:
            CI_JOB_SIZE: large
            KUBERNETES_CPU_REQUEST: 8000m
            KUBERNETES_MEMORY_REQUEST: 12G

      - match: ['os=amzn2']
        runner-attributes:
          tags: ["spack", "x86_64_v4"]
          variables:
            CI_JOB_SIZE: "default"

    broken-specs-url: "s3://spack-binaries/broken-specs"

    service-job-attributes:
      before_script:
        - . "./share/spack/setup-env.sh"
        - spack --version
      image: { "name": "ghcr.io/spack/e4s-amazonlinux-2:v2022-03-21", "entrypoint": [""] }
      tags: ["spack", "public", "x86_64_v4"]

    signing-job-attributes:
      image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
      tags: ["spack", "aws"]
      script:
        - aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
        - /sign.sh
        - aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache

  cdash:
    build-group: AWS Packages
    url: https://cdash.spack.io
    project: Spack Testing
    site: Cloud Gitlab Infrastructure
@@ -29,7 +29,7 @@ spack:
     - - $default_specs
     - - $arch
 
-  mirrors: { "mirror": "s3://spack-binaries/build_systems" }
+  mirrors: { "mirror": "s3://spack-binaries/develop/build_systems" }
 
   gitlab-ci:
     script:
@@ -38,6 +38,8 @@ spack:
       - cd ${SPACK_CONCRETE_ENV_DIR}
       - spack env activate --without-view .
       - spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
+      - if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
+      - if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
       - spack -d ci rebuild
 
     image:
@@ -48,7 +50,7 @@ spack:
       - match:
           - cmake
         runner-attributes:
-          tags: [ "spack", "public", "large", "x86_64"]
+          tags: [ "spack", "large", "x86_64"]
           variables:
             CI_JOB_SIZE: large
             KUBERNETES_CPU_REQUEST: 8000m
@@ -61,7 +63,7 @@ spack:
           - openjpeg
           - sqlite
         runner-attributes:
-          tags: [ "spack", "public", "medium", "x86_64" ]
+          tags: [ "spack", "medium", "x86_64" ]
           variables:
             CI_JOB_SIZE: "medium"
             KUBERNETES_CPU_REQUEST: "2000m"
@@ -85,7 +87,7 @@ spack:
           - xz
           - zlib
         runner-attributes:
-          tags: [ "spack", "public", "medium", "x86_64" ]
+          tags: [ "spack", "medium", "x86_64" ]
           variables:
             CI_JOB_SIZE: "small"
             KUBERNETES_CPU_REQUEST: "500m"
@@ -94,18 +96,27 @@ spack:
       - match:
           - 'os=ubuntu18.04'
         runner-attributes:
-          tags: ["spack", "public", "x86_64"]
+          tags: ["spack", "x86_64"]
           variables:
             CI_JOB_SIZE: "default"
 
-  broken-specs-url: "s3://spack-binaries-develop/broken-specs"
+  broken-specs-url: "s3://spack-binaries/broken-specs"
 
   service-job-attributes:
     before_script:
       - . "./share/spack/setup-env.sh"
       - spack --version
     image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
     tags: ["spack", "public", "x86_64"]
 
+  signing-job-attributes:
+    image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
+    tags: ["spack", "aws"]
+    script:
+      - aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
+      - /sign.sh
+      - aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache
 
   cdash:
     build-group: Build tests for different build systems
     url: https://cdash.spack.io
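The lines added to the CI scripts guard `spack gpg trust` behind a readability check, so jobs running without the key volume mounted still proceed. The same pattern in isolation (the function name and `echo` stand-ins are illustrative, not part of the pipeline):

```shell
#!/bin/sh
# Run a trust step only when the key file exists and is readable,
# mirroring the `if [[ -r ... ]]; then spack gpg trust ...; fi` guards above.
trust_if_present() {
    key="$1"
    if [ -r "$key" ]; then
        echo "trusting $key"    # stand-in for: spack gpg trust "$key"
    else
        echo "skipping $key"
    fi
}

trust_if_present /tmp/no_such_key.gpg
```

Because the guard never fails the script, the same job definition works on runners with and without the signing-key mount.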
@@ -42,7 +42,7 @@ spack:
+zfp
+visit

mirrors: { "mirror": "s3://spack-binaries/data-vis-sdk" }
mirrors: { "mirror": "s3://spack-binaries/develop/data-vis-sdk" }

gitlab-ci:
image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
@@ -52,13 +52,15 @@ spack:
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild
mappings:
- match:
- llvm
- qt
runner-attributes:
tags: [ "spack", "public", "huge", "x86_64" ]
tags: [ "spack", "huge", "x86_64" ]
variables:
CI_JOB_SIZE: huge
KUBERNETES_CPU_REQUEST: 11000m
@@ -72,7 +74,7 @@ spack:
- visit
- vtk-m
runner-attributes:
tags: [ "spack", "public", "large", "x86_64" ]
tags: [ "spack", "large", "x86_64" ]
variables:
CI_JOB_SIZE: large
KUBERNETES_CPU_REQUEST: 8000m
@@ -98,7 +100,7 @@ spack:
- raja
- vtk-h
runner-attributes:
tags: [ "spack", "public", "medium", "x86_64" ]
tags: [ "spack", "medium", "x86_64" ]
variables:
CI_JOB_SIZE: "medium"
KUBERNETES_CPU_REQUEST: "2000m"
@@ -133,7 +135,7 @@ spack:
- util-linux-uuid

runner-attributes:
tags: [ "spack", "public", "small", "x86_64" ]
tags: [ "spack", "small", "x86_64" ]
variables:
CI_JOB_SIZE: "small"
KUBERNETES_CPU_REQUEST: "500m"
@@ -141,11 +143,12 @@ spack:

- match: ['@:']
runner-attributes:
tags: ["spack", "public", "x86_64"]
tags: ["spack", "x86_64"]
variables:
CI_JOB_SIZE: "default"

broken-specs-url: "s3://spack-binaries-develop/broken-specs"
broken-specs-url: "s3://spack-binaries/broken-specs"

service-job-attributes:
image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
before_script:
@@ -153,6 +156,14 @@ spack:
- spack --version
tags: ["spack", "public", "medium", "x86_64"]

signing-job-attributes:
image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
tags: ["spack", "aws"]
script:
- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
- /sign.sh
- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache

cdash:
build-group: Data and Vis SDK
url: https://cdash.spack.io

@@ -32,7 +32,7 @@ spack:
- - $easy_specs
- - $arch

mirrors: { "mirror": "s3://spack-binaries/e4s-mac" }
mirrors: { "mirror": "s3://spack-binaries/develop/e4s-mac" }

gitlab-ci:

@@ -51,7 +51,9 @@ spack:
runner-attributes:
tags:
- omicron

broken-specs-url: "s3://spack-binaries-develop/broken-specs"
broken-specs-url: "s3://spack-binaries/broken-specs"

service-job-attributes:
before_script:
- . "./share/spack/setup-env.sh"

@@ -222,7 +222,7 @@ spack:
- - $cuda_specs
- - $arch

mirrors: { "mirror": "s3://spack-binaries/e4s" }
mirrors: { "mirror": "s3://spack-binaries/develop/e4s" }

gitlab-ci:

@@ -233,6 +233,8 @@ spack:
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)

image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
@@ -240,7 +242,7 @@ spack:
- match:
- llvm
runner-attributes:
tags: [ "spack", "public", "huge", "x86_64" ]
tags: [ "spack", "huge", "x86_64" ]
variables:
CI_JOB_SIZE: huge
KUBERNETES_CPU_REQUEST: 11000m
@@ -265,7 +267,7 @@ spack:
- vtk-m
- warpx
runner-attributes:
tags: [ "spack", "public", "large", "x86_64" ]
tags: [ "spack", "large", "x86_64" ]
variables:
CI_JOB_SIZE: large
KUBERNETES_CPU_REQUEST: 8000m
@@ -333,7 +335,7 @@ spack:
- vtk-h
- zfp
runner-attributes:
tags: [ "spack", "public", "medium", "x86_64" ]
tags: [ "spack", "medium", "x86_64" ]
variables:
CI_JOB_SIZE: "medium"
KUBERNETES_CPU_REQUEST: "2000m"
@@ -394,7 +396,7 @@ spack:
- zlib
- zstd
runner-attributes:
tags: [ "spack", "public", "small", "x86_64" ]
tags: [ "spack", "small", "x86_64" ]
variables:
CI_JOB_SIZE: "small"
KUBERNETES_CPU_REQUEST: "500m"
@@ -402,11 +404,12 @@ spack:

- match: ['os=ubuntu18.04']
runner-attributes:
tags: ["spack", "public", "x86_64"]
tags: ["spack", "x86_64"]
variables:
CI_JOB_SIZE: "default"

broken-specs-url: "s3://spack-binaries-develop/broken-specs"
broken-specs-url: "s3://spack-binaries/broken-specs"

service-job-attributes:
before_script:
- . "./share/spack/setup-env.sh"
@@ -414,6 +417,14 @@ spack:
image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
tags: ["spack", "public", "x86_64"]

signing-job-attributes:
image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
tags: ["spack", "aws"]
script:
- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
- /sign.sh
- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache

cdash:
build-group: New PR testing workflow
url: https://cdash.spack.io

@@ -54,7 +54,7 @@ spack:
- zfp

mirrors:
mirror: "s3://spack-binaries/radiuss"
mirror: "s3://spack-binaries/develop/radiuss"

specs:
- matrix:
@@ -69,6 +69,8 @@ spack:
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild
mappings:
- match:
@@ -76,7 +78,7 @@ spack:
- openblas
- rust
runner-attributes:
tags: ["spack", "public", "large", "x86_64"]
tags: ["spack", "large", "x86_64"]
variables:
CI_JOB_SIZE: large
KUBERNETES_CPU_REQUEST: 8000m
@@ -96,7 +98,7 @@ spack:
- vtk-h
- vtk-m
runner-attributes:
tags: ["spack", "public", "medium", "x86_64"]
tags: ["spack", "medium", "x86_64"]
variables:
CI_JOB_SIZE: "medium"
KUBERNETES_CPU_REQUEST: "2000m"
@@ -150,7 +152,7 @@ spack:
- zfp
- zlib
runner-attributes:
tags: ["spack", "public", "small", "x86_64"]
tags: ["spack", "small", "x86_64"]
variables:
CI_JOB_SIZE: "small"
KUBERNETES_CPU_REQUEST: "500m"
@@ -158,10 +160,12 @@ spack:

- match: ['os=ubuntu18.04']
runner-attributes:
tags: ["spack", "public", "x86_64"]
tags: ["spack", "x86_64"]
variables:
CI_JOB_SIZE: "default"

broken-specs-url: "s3://spack-binaries/broken-specs"

service-job-attributes:
before_script:
- . "./share/spack/setup-env.sh"
@@ -169,6 +173,14 @@ spack:
image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
tags: ["spack", "public", "x86_64"]

signing-job-attributes:
image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
tags: ["spack", "aws"]
script:
- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
- /sign.sh
- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache

cdash:
build-group: RADIUSS
url: https://cdash.spack.io

@@ -59,7 +59,7 @@ spack:
- $gcc_spack_built_packages

mirrors:
mirror: 's3://spack-binaries/tutorial'
mirror: 's3://spack-binaries/develop/tutorial'

gitlab-ci:
script:
@@ -69,6 +69,8 @@ spack:
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild

image: { "name": "ghcr.io/spack/tutorial-ubuntu-18.04:v2021-11-02", "entrypoint": [""] }
@@ -81,7 +83,7 @@ spack:
- netlib-lapack
- trilinos
runner-attributes:
tags: ["spack", "public", "large", "x86_64"]
tags: ["spack", "large", "x86_64"]
variables:
CI_JOB_SIZE: large
KUBERNETES_CPU_REQUEST: 8000m
@@ -99,7 +101,7 @@ spack:
- py-scipy
- slurm
runner-attributes:
tags: ["spack", "public", "medium", "x86_64"]
tags: ["spack", "medium", "x86_64"]
variables:
CI_JOB_SIZE: "medium"
KUBERNETES_CPU_REQUEST: "2000m"
@@ -129,7 +131,7 @@ spack:
- tar
- util-linux-uuid
runner-attributes:
tags: ["spack", "public", "small", "x86_64"]
tags: ["spack", "small", "x86_64"]
variables:
CI_JOB_SIZE: "small"
KUBERNETES_CPU_REQUEST: "500m"
@@ -137,11 +139,12 @@ spack:

- match: ['@:']
runner-attributes:
tags: ["spack", "public", "x86_64"]
tags: ["spack", "x86_64"]
variables:
CI_JOB_SIZE: default

broken-specs-url: "s3://spack-binaries-develop/broken-specs"
broken-specs-url: "s3://spack-binaries/broken-specs"

service-job-attributes:
image: { "name": "ghcr.io/spack/tutorial-ubuntu-18.04:v2021-11-02", "entrypoint": [""] }
before_script:
@@ -149,6 +152,14 @@ spack:
- spack --version
tags: ["spack", "public", "x86_64"]

signing-job-attributes:
image: { "name": "ghcr.io/spack/notary:latest", "entrypoint": [""] }
tags: ["spack", "aws"]
script:
- aws s3 sync --exclude "*" --include "*spec.json*" ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache /tmp
- /sign.sh
- aws s3 sync --exclude "*" --include "*spec.json.sig*" /tmp ${SPACK_REMOTE_MIRROR_OVERRIDE}/build_cache

cdash:
build-group: Spack Tutorial
url: https://cdash.spack.io

@@ -1,37 +1,164 @@
-----BEGIN PGP PUBLIC KEY BLOCK-----

mQENBGATaEABCAC//LQBgVfSPb46EwV+JLwxdfGhbk6EwJFoxiYh5lACqJple+Hg
8louI+h5KFGxTGflhmVNCx3VfHS2QsPySp+qu3uCsmanQdkHyQkaji4YO5XzJEhI
IWt6kKfUcttvBSlfXHhtPNvO0PPeRTQypqMlHx4QfJqmuH7jDMZUf3Kg6eLWOuEb
cW49KeAPfXdGXdq91a7XCNsurvyz8xFxgH/CywXF8pG8LVLB4dfN8b7M+RWxA023
UZR1f1tYg9eSPRwh7V4E69PcjF7WqvWRw+Uhkes7rUdDR2ZWXhFSn2BvL6p1MdI8
ZHCFHO6l6QwuYmSIhsCyh21YSwOb/GC0Z8bxABEBAAG0MFNwYWNrIEJ1aWxkIFBp
cGVsaW5lIChEZW1vIEtleSkgPGtleUBzcGFjay5kZW1vPokBTgQTAQoAOBYhBImQ
4GrBi1LDHJkUx5Mo0B3n+1WHBQJgE2hAAhsvBQsJCAcCBhUKCQgLAgQWAgMBAh4B
AheAAAoJEJMo0B3n+1WH7cMH/1Ay1GUB5V9/K+LJGFDLXzCVTqwJbJUB2IeXNpef
qQfhZWQzOi8qXFCYAIqRlJH3c+rYoQTpR+l7uPS87Q24MLHT/mN6ZP+mI4JLM00T
CUhs18wN5owNBM7FOs4FTlRmvWhlTCjicCXa0UuH6pB/T35Z/OQVisAUQY82kNu0
CUkWLfmfNfm9lVOWWMbceJ49sDGWsApYv4ihzzIvnDSS6n5Fg1p8+BEoDbzk2+f5
+Jr0lNXZmQvTx7kGUnwRfuUxJifB8SNbABWL0En2scaE/QACQXkbaNTPMdI8+l59
ucmvDDsQHlBRXPGRM1ut+1DHSkdkKqjor3TnLkJDz+rOL+K5AQ0EYBNoQAEIAMin
weK4wrqZhWOotTD3FS6IUh46Jd57jxd9dWBl5YkjvTwHQJQ54csneY/jWaMSoxhJ
CEuEnb4P/6P0g5lCVYflkXLrCPLrYPJazW0EtTXQ5YRxFT7ISytsQDNgfSQO6irs
rJlD+OWUGQYeIpa58hB+N9GnM8eka7lxKfay9lM3rn5Nz3E4x10mdgxYY9RzrFHv
1MTGvNe/wRO67e9s0yJT+JEJ5No5h/c6J0dcrAiegiOvbhUtAYaygCpaxryTz9Bt
CsSrBOXadzxIEnyp2pJE4vyxCVyHWve2EBk7Fagh45Z+JdA5QhGmNS3tHQmZ6Nyu
CPAEjzn4k3jjHgoDfTUAEQEAAYkCbAQYAQoAIBYhBImQ4GrBi1LDHJkUx5Mo0B3n
+1WHBQJgE2hAAhsuAUAJEJMo0B3n+1WHwHQgBBkBCgAdFiEEwbGjdbck4hC1jzBi
DDW/ourdPjAFAmATaEAACgkQDDW/ourdPjDaqgf+Oav1oC+TfOvPyIJgfuZK3Of6
hvTPW81udyKgmZ/pbwJ4rSAkX2BRe0k11OXSc0V4TEkUPG62lfyNbrb54FsZAaPk
s2C8G5LW2xlJ91JXIxsFQJcGlWTTrd7IFMe+YtcBHYSBJNmtRXodXO0sYUXCcaMk
Au/6y6x3m9nhfpqCsc3XZ6C0QxVMMhgrdSfnEPhHV/84m8mqDobU34+eDjnY/l6V
7ZYycH7Ihtv7z4Ed5Ahasr2FmrMOA9y6VFHeFxmUPhRi2QwFl2TZJ0z8sMosUUg0
0X2yMfkxnMUrym12NdLYrIMGCPo8vm6UhqY7TZis7N5esBEqmMMsqSH0xuGp8d+e
B/99W1lQjtdhtE/UEW/wRMQHFoDC2Hyd9jA+NpFK0l7ryft0Jq284b/H4reFffSj
ctQL123KtOLNFQsG5w2Theo2XtC9tvhYTAK8736bf7CWJFw3oW5OSpvfXntzJpmw
qcISIERXJPMENLSwUwg7YfpgmSKdrafWSaQEr/e5t2fjf0O3rJfagWH9s0+BetlY
NhAwpSf7Tm1X+rcep/8rKAsxwhgEQpfn88+2NTzVJDrTSt7CjcbV7nVIdUcK5ki9
2+262W2wWFNnZ3ofWutFl9yTKEY3RVbxpkzYAIM7Q1vUEpP+rYSCYlUwYudl5Dwb
BX2VXKOmi9HIn475ykM23BDR
=magh
mQINBGKOOXIBEACy2KAwhV/qjObbfLc0StT0u8SaSUcubHE3DXpQOo69qyGxWM+e
2cfOt7cDuyw8yEYUzpKmRhUgVbUchCtyDQZlzx2nZ0IfuAudsUl0RANk7nbZdjeG
F4C21NvA69gtT4sGrDqwTGpKCLxcAUwwpYw2WUcyyz5e7mlGdxA4DmJ8uThDFHhd
Yoq2X8YHHvBRIIS8q+T1de2NeFsSIEV2DqYx/L+z6IWkgE60mJy+5mfcuT/+mRpX
iZ+w0JAUJDbATndp24TahLo60B+S/G2oIWN5WbKYfsJmHbU5EgjbbC/H8cITt8wS
ZTGm+ZnSH6QMPGc8A1w/n/77JAAyVpQW907gLnUOX8qRypkmpNUGVw5gmQj7jvR0
JyCO0z3V2W8DCvxzQR+Mci13ZTGw53pNiHQNn0K1iMT2IJ6XmTcKXrBy37yEQClx
06h3DxSWfNQlQXBO5lnvwetMrU3OuwztYfsrnlqM3byLW21ZRCft7OCSzwiNbWu/
qg8eyA9xiH/ElduuA1Z5dKcRY4dywHUy3GvbqkqJBlRCyySZlzeXkm3aBelFDaQ8
rviKZ9Bc5AIAgjUG6Sz5rZu2VgHkxPo8ZzVJXAR5BPzRcY9UFmGH5c2jja/Hf2kd
zP43wWAtXLM8Oci0fb5nizohTmgQq0JJNYRtZOEW0fSRd3tzh/eGTqQWOQARAQAB
tDZTcGFjayBQcm9qZWN0IE9mZmljaWFsIEJpbmFyaWVzIDxtYWludGFpbmVyc0Bz
cGFjay5pbz6JAk4EEwEKADgWIQQsjdMiTvNXOkK9Ih+o4Mo8HCraLwUCYo45cgIb
AwULCQgHAgYVCgkICwIEFgIDAQIeAQIXgAAKCRCo4Mo8HCraL4vfD/9EQ5sotTYj
83YmyJP/MdRB358QzmKV7+ewcE2+f184kx7z4IK/gBFYz9t32rnBBgm6/aQASHD+
scm/mqV5+26B1nLUr5GbAEAm5yFDJulTBXX/TZpc5FyC+AgSh4j47kc9ZDM+YHFU
uPLNqhJI19LVMSyleifqd8Xbqo9/aCPw6eRZMGmw8estV+QgHCT5oD1Y4SSRm2th
/CUspVEWr8Dg0j3n7N/3y+pIBlf5lQ3wBeRgvH2c2ty28f8HauHiXAi0tkdCTU1W
509avGE8a5XWYL0zjBrcintEfeaf7e4zOW+YQdVmIp1HDvBkBrpuPSUwecZ5jtCX
1w74wa0XQOYy3J19Vhc/t2DN63wYCKV9BxznvOFOWDn9foo2kUkx7E5gnYcV/P6n
cGG5iikikRvK0IKGGmaYa4W0VNDK819npsTM0EFxE+6j9VWsGBXHSSdPk1vIRYuw
tWJi79/v5wpnsqwTycYv1v4V0PGeIi+dtCpFWj/4BFpqPLVVT4SswmHmEikAuFaC
APczWR8H62Y+qrLyMlk66GdyU0j0+SkWmLZVhqAinskcpnnwXzHS6cQtIePjygXy
09whvSRq1BZtiNqcND/hz6cJR/v29brxXhQplPLCu942OixvG+DUjTPn/GhMfA8R
mXvx0MFYcw2T0m8JfqYOXra7Tbi8UOulP4kCMwQQAQgAHRYhBFZuUukEjpZald+h
cYfpC6CmUwNPBQJijnY4AAoJEIfpC6CmUwNPcKYP/22pYBPIIUZwBeD8q/DileiX
L8YgHU+ziy1qxHPiVXuwWEvQUwnCG5BeyvsCViQrs1670OCqtsKk/sBJJvp17kkt
tKmBjUoZDS+YbD6d4q9QgHVN8jW1HzotWPGKrVV4oDdnNBBKoS5h6i4tpMca3kXq
d7Ow3qzxSYkHCmJqqXNCBQzgIlha2FijMNWe1cnJr+IpG6eJ5wfeQm3llIVHbeCj
YZyAorjRSzsopfWwXrQJXLTBJzgj4JDUWMmk6IdKcF41NOWB6f04Eky0zEHX02Xl
eRxJygan9bSayELz3vjWQu2aBR4NxDdsvRy2E35wfHHruqLKTzaC1VU3nYGDPcby
W+A/fA6+sY7Gh5y41xiD6hyDJxKkhEn9SfuF47YghFtW0r5fPe4ktwjI5e0sHWI6
WFdNLF51F6B7xSbzNwRPw+XSWNvLvXvOu94FSPV1aoT5Vgq79377q/wlqmv/SSk/
FcZFmglmIwJiLj/qUt2TlAkXb9UxQCfP3Sn98wGMOtZ6Hm0Uga3zjmw9dCTY/anL
n/RZYTJSXlEDAtnT/aU708zsz5skKHfLWTk4/1wLg/mUqImLJGRQ26J1C1njl9RK
p5SRAObb7qvoyM730avvGkiom5/GwxAzXiRs2c9gbuTEeCYhcBrwMSK1QiHZbG8y
IAe8fagtFxvtN5oJVmqpiQIzBBABCgAdFiEEE6CQhKr7XHIVX7vh1XGr4uJd+5wF
AmKOeGUACgkQ1XGr4uJd+5xphRAAmEINmSJVDApYyhlDRCZAE39oD+FUlbOyOkr+
yGXsCSz2cnMXdypBms3hVhs5ut6zmiekFmGJ1X2SrHWcm/NoZZAZDu+8LT0HF90W
MG+u4hP66lsglwlVYhzX8XjQpoSCunFHUb7HAShNVPSC30aWlVJZTyowLIuXiMmV
pwB5lhtHLoMjjAEpsy3l7lEey/+clL03fW3FzZWAwY2RgORNlz6ctaIPlN6Tjoya
iO6iFWE8/DiiATJ4fass3FijmfERD8o4H6nKKwEQzYTG5MoMuKC6fbokCL58wQ5k
jJ8wFpiajFKFsKfuk5+0q+LZ3FuLY0TUAOR7AJUELqTsBMUM3qf9EFU7UHN4KHoK
+3FrqCouT90pUjq8KoYXi0nfOqROuXJBAcx+g9G7H6x8yEPISE4la6T9aNujavip
jxMP7D3fY5leqxRozVtUZ5rHAvN1s6DtzPnrQVsn9RiwRJO3lYj64wvSRPHkRAVL
U9RtZXonlHsx1PVbx4XlAuSFOxLHApBWM+CqemyhXdtC1MCpTgqOpt7rTendZnqc
T2tDcaZIHw3+KAdXrU9zvvEqkk/dfnckdDQZTSd96r16HGLtYj6ILzd17/j/qTnq
BYeZl9CbYtyF117zyIjzaq8oTZxj97Tu5a9WXNOyaeB4e6A9AQeSJVA9vFM9vUCM
5pJRJzqJAjMEEAEKAB0WIQR802Mbz07k2wBqg1zqYglGK5OFuQUCYo54fAAKCRDq
YglGK5OFufWXEACH06ybO/LwTX4I+aoj19d5OSe8xaSZBHu2WVh2KfbPxInXNIhQ
1/W44hnQNGS0B6a/fN80xPD+tWdBLF1fAl7tgz+KBECc9fTsebeXMmY/FJZP60uL
1Da1RMEOd3lg7DgNfLjgIiXi4eqqYHHRwCSFww4VhZJ3lxnXrcuFwLXDAvGzB/5x
mK23fhCQ0tK9I/jcKyzrN3a/tcU7bXuQ0ewRDtVvfCnimGAjcayIpb3bhn0CByxJ
B4fH3mIw1/nMzBZr6PtNEPwSRLEnVsWwmUs9sCHWftgAqDpF3dnC5nk5XZdr+auo
3MeP47gbMW6xQcGY6XJgoWNIeNUpVJg12e0AsaOdEJJO4NU+IEmb5d5M889Kr3e2
sor5/6kbRAjh5RUShOLrae15Gkzd13K5oOlhBcyTTEqf4QSnOtvnvpKghHjaBkSh
em2N+HfZ8hmahRv3DI79rVx/vjLSFwc9Y/GYw56Xu8bBFmHP4rdFZP4VC87EjOwa
zNb/0XfpUJkFsGXyUdrvd+Ma4z8PA3Got9ZO2ZV/yAMnCpyTX5pUvF6RBA2zlv2Z
8EqafabD/iGJJFubHuoDLzTCSgD0hmO6l8r/lR5TmjHH2fw8kYXOCpRe2rbtyS1f
ttwP1IvoFb7mVQ6/Q7ghMxPiOxg/cGtW5fK+TYGU5cyYFu0Ad7dL11rgI4kCMwQQ
AQoAHRYhBBmw8mZl6mDaDDnZvOI98LWuJhGfBQJijnd2AAoJEOI98LWuJhGfXIEP
/0Nk2P2rMrX4MrfxACkHnRFS35GKbs2EqQGy2mxok5s+VDE/neKLozzBU/2x4ub8
P8UrXKBHAyW2MwZ1etx27ARoWcbGaOICbIMUCgmGSCqMlfo18SJVyssRPDvKxy/+
S7PIwgdFlRb79UxEMYi7L5Ig0H5nHYaHPAqvzTOssy+OXub0oU+sCK4s9WD9OPBf
vA9dfGJMnLyi4wTs0/6LXKAf5BwGOzeXhWL4GQmpRqb8Kw40BgBXhye9xUwr72BI
iAVVfUG5LTY9K8b9eK6DB78fdaZsvtfgY85Ou+OiMjEPitYCQF1mIt0qb9GcaC1b
VujdvM/ifpxXpyTdC95KUf773kTrv+v8842U99gccBNQp6rYUHkDT3bZXISAQEpd
c22iclcr6dCKRTRnaQpEkfDcidTEOnpadEDjl0EeZOeAS333awNe4ABP7pR7AvRW
2vg1cY7z52FEoLG20SkXbLb3Iaf5t3AOqS5z0kS59ALy6hxuuDJG8gb1KfzmJgMn
k9PJE8LdBVwsz346GLNUxLKzqBVFRX4N6WKDYURq30h/68p+U24K9/IeA5BTWsUQ
p7q11dk/JpkbyrK74V2hThwEyTv9hjMQuALTNr9sh3aUqT5XbhCgMnJcb1mkxhU9
ADKN+4h+tfuz6C0+wf3nFE08sUkStlN3Vbh/w+nJaII1iQEzBBABCgAdFiEEuNcs
eEKe1WcgoStyNsSqUuTK0GUFAmKOXFQACgkQNsSqUuTK0GWOmAf/RNODdqbdXkyP
8J4ePZDcPbzEhGNW6piSCbyJ7cVOQHLYuwV7i6gJ3L+h9AGii2fI4YBQkEzjiGJm
8o0HazR76R2iT7QcFQ4vLkX8wYgO5q4SFGlzOfhHr+OOrd2r2D6Pid/oCADUfa+x
NJt9V9naVCjkQq6rAFUK3cpwVCC8pB4at5+sL503u718Ce4u0uzuKwGXqmhsRruF
9ZIn1hkoneKKFDb7C1zT08oegy484LcjTfbzKuBHZOXW0cNkqtSCuL9lBmrD2t+r
KZWwNX8DLIOZtfxux+jCxks2dp903Zd9H5PlaZOldMbXGIEINfcUurl5H79vAXqW
EUOtsYYX74kCMwQQAQgAHRYhBPMiP8Tk+ZH+vALLH8FnKfGqz2bGBQJijpCCAAoJ
EMFnKfGqz2bGrZoP/jobYML0JDIQT22TyNfz2Q38WhLtdaEnKeEMU5HDq9uEOjjz
BZewMB4LLTgE3uvXngL7K/2R6i6fu1YAOX2RaySG4VfxNd7z1TTHnTWmeKd+rtB/
o5/iH4hp3uLYFPvWqKjr7PPuXzi1JE99lEThuyqM88GcKfuNvldJtjhALZL548St
es6D82tGumbWzFEeyDbCxJRBOWfX6vkVBR7w3Q2NRxEOtvc58mhXiHOs2/vXMMzr
1RMYzzYvq8jXi8uaa5Esmeo6r1Md67oaNfPNulhYUe4mKYwPuphcBSNCfRGQnRlU
8oToURyRcXI6Bd9dJSvtznMHrsWO+Zm4O752cvfi/GKHUPVJ/FvO5L0qo2546+tn
nIDPVhvAnhWO5+95ooRIXsxa2mzYtaudAu6pcI6OaMANjJ8fUTxFmedN9HqlkKPF
ghvcwqJdpmpRs05nAuNzHmnKkMVI8R7uBavB6F5cAfonNVgoCKCsFHpG2jGViMRj
/OtovngNpYrweyIiPxRYGhKiP3rNzNjMT2HfVz8xiTttIeMXU+JrBem40CagxBFa
JYsZjATkKDJ3tXlRs1JGqKiI3Veia4elCCLv6uzfKVGAg4xMWjtKbdxpi3ku8JSh
Ntzc928pJkHprP1WPGVbZ3xPPJ3N+WTawGYnblcLlRFeVErdKSJeQOknbH/EiQIz
BBABCAAdFiEEikTOi7BILoKkJagWI2JUH20U7YQFAmKOvW0ACgkQI2JUH20U7YRW
Aw//dBsV+CCqb0i9s8O6l/L7kZG///jxqobv6SLlUOKuFIckGMKBVi7QSLC11Wgv
Z8ETswrSDSP9YzP46TP4Ad2tZQoulhB+sEfNIsRu1doYXPmr23T68Jof4dinCTVO
rgoU8XboKjzQzy27ziziJ4OZxRl9c4zIaSw4FyEzt4BLKAByi9NT5CtJN5Sr3v9X
CncuKnekqpTpLltbLJYYK+DT+Vy8+FT9XehQbndKtM9i4FXvp6xzM61GfL3s3MA8
Xosol+8OrwYLKhUM5mbg0sqreqVRcmeiRCBO5MfCfrpukCbqBmwi/E7qyw+S4IAl
HdU8JVRQvCoJCCy8pZqfMAdgx35E3CM7/GIlb5Dk2teKPlmSBXu7ckMhhFauiDFK
ImX6ThCe/uExvK5npiowKvQsEjhDeUU4zt9N8UxgaRpPYr2tyHnqqRTeEtk9/K9j
O8WH825DeKhjwU6Eg4Qtb8HmlA0fnZ//L826KC8mTkFSbkdKtIMvlq8u6nAgHsMF
GoUbBvtDbvenZhkndQpuDd2tXpSob+9f1TqZGfWv2nbOtfEfXEf9BwayX+iJOH2F
cC6bJbIG5UTirbjxDmVmKn71CxgJTHRqSUULKE4rimpnDpN6S2qw4ZEK4EIkSH6b
qmlDWCsiprIc2KwiTOz8wCCsKNwXnc3PfFtM8bGO6N9Yz965Ag0EYo45cgEQAOnf
WNZhXdeapCMfs+YxuySSfCw8X2ucHegKV0tKCg1ozKY9Z/CQ4apcNxJD1ZS++0y8
38OjNo7JayAp4joCT+/lFN+OzFuMf5xc5E5pQeF1UAsi27FJFJWX9NIvdZ0BmTzy
E0GJGg9CSUPOfImV1fW7uVWkkzi+UE09pe8llmkY3JCX5ViIH0bTFzF52BZL06np
0MxNFwBVm4sZXyPOxInqOm66gICrbxLDriz3EYa2bJm17I6Kclvw/X/ohCeGU9WW
KEzWTE03OxRMqLlPfxgqVshIz2dO77u47yehI6BOsOhpp4Ag7rRgLpRs1Iemg02/
Oa82FiiLw0S4g/4UXyi4cZKF4vCefNs2z+IaSEe3l9z9Gg19gPsanrYP+CfZRsXk
2EYxwt0Ffmma2rQ24zQ9yJ5NvqiINX9xd/LjOv8pmkNArKXsbtY4KwtH7zVkiS4Q
9FsU5C/9BaM9z/fMpQNUq6mx9FCw0C+NntWYvfXn4PFNPC7klYgM/2VFvxq+vBk6
CVpbcSoYy1+7uZZ+hskyQek8Dbtnk7fLBRC4gHK9T2gbroo/eS0u9b1PFlaka0HG
1zKwU1u6Iq19r2qapKoMf3SGStEilh8x0eyCdEqqCHEi/HKYU4zGa54zBGlpkmMy
Q7VZmeozNpY/F02KZByMWIstjKZGEINXhaI/2F5HABEBAAGJAjYEGAEKACAWIQQs
jdMiTvNXOkK9Ih+o4Mo8HCraLwUCYo45cgIbDAAKCRCo4Mo8HCraL5bAD/9bS+7u
0Xq+kt8/sQpWnmwevgcnsSxVwENf1In9l/ZSShtaLvBUaj2Ot80bnfTl9dVruH8v
+Nh1HrzklGwwcqNP5jQZEVFaqQO/jA/c84gKvXoQPUA555rcTonZYquEBMqEMlJe
jil+Lb+pfII+BVD0wvQDCnpzni8u7r0cEjPEMevLoTJdcgNhn1WuvDRFp8+HtlTx
198wcZbAPgFHRpE1NQjrP2CBket/ZIxuvuAEf9TpYifsjG0NZcdxeN0JZ3HOKZUT
kKgNanm+PxqXRynnrdEEH6I2vPR+KMr5+ZqFcXbamvDy64Xewi0EVYecQk1SllfC
lCuDih5ZqcjmZqdcqoFxc+tc7gcb509Fo/+mBCq6nXEVorKPJqdoW3IGbz29Nkqc
ZczFyeOpCk3UaPCz3kxebVfaDydiRkFnWlFEZNkAidZGOKs5ykEmEvq8c9+dSyaS
3Y7xcx/SaGyF/a4+9cdd974/HcPKcRHRi7nXrn+yEVQq8CZAvKWVYyme461isPkz
loWb1AKXK5kHR0CFq4HTXMZrrNsdWoU2lP+BNVg0dQG4Z0QpcOrgwbLjXrqTVrbB
PITOx6cYB7RafdBmhBF+8qOHmr7wwI92DV0vYeEjlGT9FazAquCAMVqBnqz/52GM
Vzr8QG9fSxBmTeoMQkkdXah/sex7zWVfUqoJLrkCDQRijjtHARAAmPVZ6gtFeFcI
NzMD3iZVkFglIM91SfdglmgjxdC4mAjSGQk5dBxrqFJrT6Tn5b/i8JLGgrOpaDvb
O50chDmys+VtWEmoxa9kl4BOjzEbr7e0UymoK1GTK6ZAIIrHytFkSjcP3LSi5Am9
aCwVhZ3NoH3LKBzj+ast7I0QJT/mt5ti2h/4qEw3qJ6ImKUmjfLTOkTjt5NfWgT9
McdnV74XpOi3OdIL8vTp+1S5Dm3pByVYJdR0mfD0uRg9+OHN04t4D3k2hPovBxJN
E1uKE5IPt6RJ+E1POCfA1dM1PsaSf1S4zyyIlUK7HM99oXUg00JXHBUD5z12hXuy
5sQZE+lFaMRej+uwO/uO2YiebansrQc+MMnrkAKElCzb0esuRNWff5CPnQ2K0ZL/
x+xNfziRNvTAICz6ir4bPONa201V6rDFoe4HNLxL0u+mLnag5i/4xiE/eWtzEBSp
F1HNn+LSrNn65JDjU1y3o0iDwZZo1hVzG2zgx8f/7gXDJpMcVHpLWz5Why77643l
NoR+qdUwoofzC5Soz+m1SoOOoEfCTiZZaukaOSFDeh/ZQ7M/MvQ0ytd8HZwFm6po
QJYQiwJUV9Es4szqndr8bxHxoy55mJqewe+cTvqB2Nqy7OjXNFZD37TuLLw9clJK
MLKFrz2VitRyRADhg11oCmGp6GAZBLEAEQEAAYkEbAQYAQoAIBYhBCyN0yJO81c6
Qr0iH6jgyjwcKtovBQJijjtHAhsCAkAJEKjgyjwcKtovwXQgBBkBCgAdFiEE0sfr
PysF+oZZDSk8BAAbLj2wxyMFAmKOO0cACgkQBAAbLj2wxyNbCw//ah/m1jrdV254
WAEt800l9brvONRU5YwjZE9HuHaTXxaRU5jd55lreFGWqzYJDe1vKkR0BdCGHIB8
KERNGXq6oQUJ4+oVrLWeV11ojen7xmbSYwEvd5VEK2ihYHyq5n0n2mwOG3+9HPkm
5N1QqjykePr7xqkBn3xgMJgB4KGydZi/zk5mNM3R23T061gn0G3TntnGWjppzRzx
a4CUz4b+ys6yz6I6LQ1pIG3pYeXgb6p9McdWP3+gec1xYPTgR001AbcbuMAXzjRI
GNGblsy0CAXTPju1451129wTx9l5x4sLLscmHv/TDRT9/YpEPfhA0xtL/XdNgG9o
lndhi6UsC8dg68sKI6MZzbFJBUmzwvThZi9DjvS7tI4IynQEENB0rEpNwBgNpE3w
OvoJBB+Ykr7Lyg7M/AdymBu4sHTW6nUuLlDo45gHAaFkKdM+WCRllvdRDI6/CnDh
dqSnqrfcyFFgzPgrA3fqoQ1TX8pgoMWasnBShaZn2vmXBUcfImxKCCVSpqhzfSIx
lmkneo3WC2hqkMDTfcx8z77u37pJYPWMiHidGqkJCRr7P4K112SoH5SVa0R94yNu
5zOolbyvt1lgKYmS/UDpxfHkUHL1WVJo0Ki/LTADNYCMYUKz/4E3Em5T9DIBRrkq
F7BWxpCF3kogEnpOQ5Qj9cTOZsfBEqM4jA/9H233pFPKvgzYcYst3l65Ootx+dsh
5gHIbp0aWBGGxLMbwkjSXdIbGxZrmxTng+9CWgpAX9j5lkjCNJxEpxzYGiRwJE2N
p7O+dGnfO9VTsVWqcCc73T4s6HVnmxX8ZxGSW2LhoI1na+rqnz6rzz1Rdj+RnG57
HHDzGKvzjbFfbQREweduG+M4JbtOMLCCooojwzxyCRTbNsQEh0uleMyse1PjmnYz
dII0l+arVaOg73i+1KkMyCKvxd+zPw5/gPM62LcUxkqMfnFgBHyzh/W199b6ukZP
DODeXOzKhiJiIVtxztl6L+hpXY+yE60iPgcbIiP4qMZxFGGolM3LfDzzD57evJWH
6SuDy3yv6vmUcZgmEpDSU/wbByNN7FNTHblrImRDGHg9Xs/9NQV0ngA96jPvILsu
BUw4y1ybVwu5GgNct3VbDGzlaSpUpYnSyp04id3iZJRIYKBln+wJ7q73oo8vltlX
QpOENRJHJoDECq0R/UDg1j+mwEw/p1A9xcQ8caw9p0Y2YyIlcSboqeKbjXrOv4oi
rEUyfGBlSvk+8Deg7TIW4fGQ/uW4sthkRNLpSxWgV+t8VRTEtQ6+ONL+u2ehQkbF
7+kvlN1LTQNITLpJ+8DGlYke8qlloGY2ROPR+INyNbJCiJOLba2CjRBu48w30iXy
cHFsNYXiYw5O7lA=
=cjRy
-----END PGP PUBLIC KEY BLOCK-----

@@ -626,7 +626,7 @@ _spack_ci() {
}

_spack_ci_generate() {
SPACK_COMPREPLY="-h --help --output-file --copy-to --optimize --dependencies --prune-dag --no-prune-dag --check-index-only --artifacts-root"
SPACK_COMPREPLY="-h --help --output-file --copy-to --optimize --dependencies --buildcache-destination --prune-dag --no-prune-dag --check-index-only --artifacts-root"
}

_spack_ci_rebuild_index() {

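The completion change above only swaps one word into a flat, space-separated word list. As a hedged illustration (the filtering step is an assumption about how such lists are normally consumed, e.g. via bash's `compgen -W`; it is not spack's exact code), completing against such a list reduces to prefix matching:

```python
# Word list taken from the diff above; complete() mimics `compgen -W "$list" -- cur`.
SPACK_CI_GENERATE_COMPREPLY = (
    "-h --help --output-file --copy-to --optimize --dependencies "
    "--buildcache-destination --prune-dag --no-prune-dag "
    "--check-index-only --artifacts-root"
)

def complete(word_list, cur):
    """Return every word in the list that could complete `cur`."""
    return [w for w in word_list.split() if w.startswith(cur)]

print(complete(SPACK_CI_GENERATE_COMPREPLY, "--build"))
# prints: ['--buildcache-destination']
```

With the old word list, typing `spack ci generate --build<TAB>` had nothing to offer; after this change it completes to the new `--buildcache-destination` flag.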
@@ -47,6 +47,7 @@ FROM {{ run.image }}

COPY --from=builder {{ paths.environment }} {{ paths.environment }}
COPY --from=builder {{ paths.store }} {{ paths.store }}
COPY --from=builder {{ paths.hidden_view }} {{ paths.hidden_view }}
COPY --from=builder {{ paths.view }} {{ paths.view }}
COPY --from=builder /etc/profile.d/z10_spack_environment.sh /etc/profile.d/z10_spack_environment.sh

@@ -62,5 +63,6 @@ RUN {% if os_package_update %}{{ os_packages_final.update }} \
{% for label, value in labels.items() %}
LABEL "{{ label }}"="{{ value }}"
{% endfor %}
ENTRYPOINT ["/bin/bash", "--rcfile", "/etc/profile", "-l"]
ENTRYPOINT ["/bin/bash", "--rcfile", "/etc/profile", "-l", "-c", "$*", "--" ]
CMD [ "/bin/bash" ]
{% endif %}

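The new ENTRYPOINT is subtle: bash is handed the literal script `$*`, the `--` lands in `$0`, and the container's command becomes the positional parameters. At run time `$*` expands to those parameters joined by spaces and is executed, so arbitrary `docker run image <cmd>` invocations now run inside a login shell that has sourced `/etc/profile` (and thus the spack environment). A minimal sketch of that expansion, runnable outside any container (assumes a `bash` binary on PATH):

```python
import subprocess

# Mirrors: ENTRYPOINT ["/bin/bash", ..., "-c", "$*", "--"] with CMD args appended.
# bash -c '$*' -- echo hello ... : $0 is '--', $1.. are the command words,
# and the script body '$*' expands to them and executes.
out = subprocess.run(
    ["bash", "-c", "$*", "--", "echo", "hello", "from", "the", "entrypoint"],
    capture_output=True, text=True,
)
print(out.stdout.strip())
# prints: hello from the entrypoint
```

Without the `-c "$*" --` tail, the original ENTRYPOINT could only drop the user into an interactive shell; it could not forward a one-shot command.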
@@ -56,6 +56,7 @@ Stage: final
%files from build
{{ paths.environment }} /opt
{{ paths.store }} /opt
{{ paths.hidden_view }} /opt
{{ paths.view }} /opt
{{ paths.environment }}/environment_modifications.sh {{ paths.environment }}/environment_modifications.sh

@@ -0,0 +1,16 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *


class NonExistingConditionalDep(Package):
    """Simple package with no source and one dependency"""

    homepage = "http://www.example.com"

    version('2.0')
    version('1.0')

    depends_on('dep-with-variants@999', when='@2.0')

@@ -25,8 +25,8 @@ class Assimp(CMakePackage):
version('5.0.1', sha256='11310ec1f2ad2cd46b95ba88faca8f7aaa1efe9aa12605c55e3de2b977b3dbfc')
version('4.0.1', sha256='60080d8ab4daaab309f65b3cffd99f19eb1af8d05623fff469b9b652818e286e')

patch('https://patch-diff.githubusercontent.com/raw/assimp/assimp/pull/4203.patch',
sha256='a227714a215023184536e38b4bc7f8341f635e16bfb3b0ea029d420c29aacd2d',
patch('https://patch-diff.githubusercontent.com/raw/assimp/assimp/pull/4203.patch?full_index=1',
sha256='24135e88bcef205e118f7a3f99948851c78d3f3e16684104dc603439dd790d74',
when='@5.1:5.2.2')

variant('shared', default=True,

@@ -13,12 +13,14 @@ class OpenpmdApi(CMakePackage):
url = "https://github.com/openPMD/openPMD-api/archive/0.14.2.tar.gz"
git = "https://github.com/openPMD/openPMD-api.git"

maintainers = ['ax3l']
maintainers = ['ax3l', 'franzpoeschel']

tags = ['e4s']

# C++14 up until here
# C++17 up until here
version('develop', branch='dev')
# C++14 up until here
version('0.14.5', sha256='e3f509098e75014394877e0dc91f833e57ced5552b110c7339a69e9dbe49bf62')
version('0.14.4', sha256='42b7bcd043e772d63f0fe0e5e70da411f001db10096d5b8be797ffc88e786379')
version('0.14.3', sha256='57282455e0fb1873b4def1894fadadd6425dfc8349eac7fcc68daf677c48b7ce')
version('0.14.2', sha256='25c6b4bcd0ae1ba668b633b8514e66c402da54901c26861fc754fca55717c836')
@@ -47,11 +49,12 @@ class OpenpmdApi(CMakePackage):
description='Enable Python bindings')

depends_on('cmake@3.15.0:', type='build')
depends_on('mpark-variant@1.4.0:')
depends_on('catch2@2.6.1:', type='test')
depends_on('catch2@2.13.4:', type='test', when='@0.14.0:')
depends_on('mpi@2.3:', when='+mpi')  # might become MPI 3.0+
depends_on('nlohmann-json@3.9.1:')
depends_on('mpark-variant@1.4.0:', when='@:0.14.99')  # pre C++17 releases
depends_on('toml11@3.7.1:', when='@0.15.0:')
with when('+hdf5'):
depends_on('hdf5@1.8.13:')
depends_on('hdf5@1.8.13: ~mpi', when='~mpi')
@@ -105,10 +108,12 @@ def cmake_args(self):
self.define('openPMD_USE_INTERNAL_PYBIND11', False)
]

args += [
self.define('openPMD_USE_INTERNAL_JSON', False),
self.define('openPMD_USE_INTERNAL_VARIANT', False)
]
args.append(self.define('openPMD_USE_INTERNAL_JSON', False))
if spec.satisfies('@:0.14'):  # pre C++17 releases
args.append(self.define('openPMD_USE_INTERNAL_VARIANT', False))
if spec.satisfies('@0.15:'):
args.append(self.define('openPMD_USE_INTERNAL_TOML11', False))

if self.run_tests:
args.append(self.define('openPMD_USE_INTERNAL_CATCH', False))

@@ -117,8 +122,9 @@ def cmake_args(self):
def setup_run_environment(self, env):
spec = self.spec
# pre-load dependent CMake-PUBLIC header-only libs
env.prepend_path('CMAKE_PREFIX_PATH', spec['mpark-variant'].prefix)
env.prepend_path('CPATH', spec['mpark-variant'].prefix.include)
if spec.satisfies('@:0.14'):  # pre C++17 releases
env.prepend_path('CMAKE_PREFIX_PATH', spec['mpark-variant'].prefix)
env.prepend_path('CPATH', spec['mpark-variant'].prefix.include)

# more deps searched in openPMDConfig.cmake
if spec.satisfies("+mpi"):
@@ -132,10 +138,12 @@ def setup_run_environment(self, env):
env.prepend_path('CMAKE_PREFIX_PATH', spec['hdf5'].prefix)

def setup_dependent_build_environment(self, env, dependent_spec):
spec = self.spec
# pre-load dependent CMake-PUBLIC header-only libs
env.prepend_path('CMAKE_PREFIX_PATH',
self.spec['mpark-variant'].prefix)
env.prepend_path('CPATH', self.spec['mpark-variant'].prefix.include)
if spec.satisfies('@:0.14'):  # pre C++17 releases
env.prepend_path('CMAKE_PREFIX_PATH',
spec['mpark-variant'].prefix)
env.prepend_path('CPATH', spec['mpark-variant'].prefix.include)

def check(self):
"""CTest checks after the build phase"""

@@ -27,6 +27,7 @@ class PyWarpx(PythonPackage):

     # NOTE: if you update the versions here, also see warpx
     version('develop', branch='development')
+    version('22.06', sha256='e78398e215d3fc6bc5984f5d1c2ddeac290dcbc8a8e9d196e828ef6299187db9')
     version('22.05', sha256='2fa69e6a4db36459b67bf663e8fbf56191f6c8c25dc76301dbd02a36f9b50479')
     version('22.04', sha256='9234d12e28b323cb250d3d2cefee0b36246bd8a1d1eb48e386f41977251c028f')
     version('22.03', sha256='ddbef760c8000f2f827dfb097ca3359e7aecbea8766bec5c3a91ee28d3641564')

@@ -45,7 +46,7 @@ class PyWarpx(PythonPackage):
     variant('mpi', default=True,
             description='Enable MPI support')

-    for v in ['22.05', '22.04', '22.03', '22.02', '22.01',
+    for v in ['22.06', '22.05', '22.04', '22.03', '22.02', '22.01',
               '21.12', '21.11', '21.10', '21.09', '21.08', '21.07', '21.06',
               '21.05', '21.04',
               'develop']:
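The `for v in [...]` loop above is cut off before its body in this hunk; the point of the list (and of adding '22.06' to it) is presumably to pin each py-warpx release to the matching warpx release in lockstep. The pairing such a loop has to produce can be sketched as (function and names hypothetical, not the package's actual directives):

```python
def lockstep_pins(versions, package='warpx'):
    """For each version v, emit a (dependency_spec, condition) pair that
    pins the dependency to the same version as the depending package."""
    return [('{0}@{1}'.format(package, v), '@{0}'.format(v)) for v in versions]


pins = lockstep_pins(['22.06', '22.05', 'develop'])
# Each pair would feed a directive like depends_on(dep, when=cond).
```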
@@ -18,7 +18,7 @@
     is_nonsymlink_exe_with_shebang,
     path_contains_subdirectory,
 )
-from llnl.util.lang import match_predicate
+from llnl.util.lang import dedupe, match_predicate

 from spack import *
 from spack.build_environment import dso_suffix
@@ -1019,16 +1019,13 @@ def home(self):
         """
         return Prefix(self.config_vars['prefix'])

-    @property
-    def libs(self):
-        # Spack installs libraries into lib, except on openSUSE where it
-        # installs them into lib64. If the user is using an externally
-        # installed package, it may be in either lib or lib64, so we need
-        # to ask Python where its LIBDIR is.
+    def find_library(self, library):
+        # Spack installs libraries into lib, except on openSUSE where it installs them
+        # into lib64. If the user is using an externally installed package, it may be
+        # in either lib or lib64, so we need to ask Python where its LIBDIR is.
         libdir = self.config_vars['LIBDIR']

-        # In Ubuntu 16.04.6 and python 2.7.12 from the system, lib could be
-        # in LBPL
+        # In Ubuntu 16.04.6 and python 2.7.12 from the system, lib could be in LBPL
         # https://mail.python.org/pipermail/python-dev/2013-April/125733.html
         libpl = self.config_vars['LIBPL']
@@ -1044,50 +1041,74 @@ def libs(self):
         else:
             macos_developerdir = ''

-        if '+shared' in self.spec:
-            ldlibrary = self.config_vars['LDLIBRARY']
-            win_bin_dir = self.config_vars['BINDIR']
-            if os.path.exists(os.path.join(libdir, ldlibrary)):
-                return LibraryList(os.path.join(libdir, ldlibrary))
-            elif os.path.exists(os.path.join(libpl, ldlibrary)):
-                return LibraryList(os.path.join(libpl, ldlibrary))
-            elif os.path.exists(os.path.join(frameworkprefix, ldlibrary)):
-                return LibraryList(os.path.join(frameworkprefix, ldlibrary))
-            elif macos_developerdir and \
-                    os.path.exists(os.path.join(macos_developerdir, ldlibrary)):
-                return LibraryList(os.path.join(macos_developerdir, ldlibrary))
-            elif is_windows and \
-                    os.path.exists(os.path.join(win_bin_dir, ldlibrary)):
-                return LibraryList(os.path.join(win_bin_dir, ldlibrary))
-            else:
-                msg = 'Unable to locate {0} libraries in {1}'
-                raise RuntimeError(msg.format(ldlibrary, libdir))
-        else:
-            library = self.config_vars['LIBRARY']
-            # Windows libraries are installed directly to BINDIR
-            win_bin_dir = self.config_vars['BINDIR']
-
-            if os.path.exists(os.path.join(libdir, library)):
-                return LibraryList(os.path.join(libdir, library))
-            elif os.path.exists(os.path.join(frameworkprefix, library)):
-                return LibraryList(os.path.join(frameworkprefix, library))
-            else:
-                msg = 'Unable to locate {0} libraries in {1}'
-                raise RuntimeError(msg.format(library, libdir))
+        # Windows libraries are installed directly to BINDIR
+        win_bin_dir = self.config_vars['BINDIR']
+
+        directories = [libdir, libpl, frameworkprefix, macos_developerdir, win_bin_dir]
+        for directory in directories:
+            path = os.path.join(directory, library)
+            if os.path.exists(path):
+                return LibraryList(path)
+
+    @property
+    def libs(self):
+        # The +shared variant isn't always reliable, as `spack external find`
+        # currently can't detect it. If +shared, prefer the shared libraries, but check
+        # for static if those aren't found. Vice versa for ~shared.
+
+        # The values of LDLIBRARY and LIBRARY also aren't reliable. Intel Python uses a
+        # static binary but installs shared libraries, so sysconfig reports
+        # libpythonX.Y.a but only libpythonX.Y.so exists.
+        shared_libs = [
+            self.config_vars['LDLIBRARY'],
+            'libpython{}.{}'.format(self.version.up_to(2), dso_suffix),
+        ]
+        static_libs = [
+            self.config_vars['LIBRARY'],
+            'libpython{}.a'.format(self.version.up_to(2)),
+        ]
+        if '+shared' in self.spec:
+            libraries = shared_libs + static_libs
+        else:
+            libraries = static_libs + shared_libs
+        libraries = dedupe(libraries)
+
+        for library in libraries:
+            lib = self.find_library(library)
+            if lib:
+                return lib
+
+        msg = 'Unable to locate {} libraries in {}'
+        libdir = self.config_vars['LIBDIR']
+        raise spack.error.NoLibrariesError(msg.format(self.name, libdir))
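The rewritten `libs` property builds an ordered candidate list (shared names first under `+shared`, static names first otherwise), deduplicates it while keeping that order, and probes each candidate through `find_library`. The order-preserving deduplication, which is what `llnl.util.lang.dedupe` supplies, can be sketched standalone as:

```python
def dedupe(items):
    """Drop duplicate items while preserving first-seen order.

    A plain set() would lose the shared-before-static preference order
    that the libs property relies on.
    """
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out


# LDLIBRARY and the constructed libpythonX.Y name can coincide, so the
# combined candidate list may contain repeats.
shared_libs = ['libpython3.10.so', 'libpython3.10.so']
static_libs = ['libpython3.10.a']
candidates = dedupe(shared_libs + static_libs)
```

The search then tries `candidates` in order and returns the first hit, so a `+shared` spec still falls back to the static library when no shared one exists.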
     @property
     def headers(self):
+        directory = self.config_vars['include']
         config_h = self.config_vars['config_h_filename']

         if os.path.exists(config_h):
             headers = HeaderList(config_h)
         else:
-            headers = find_headers(
-                'pyconfig', self.prefix.include, recursive=True)
-            config_h = headers[0]
+            headers = find_headers('pyconfig', directory)
+            if headers:
+                config_h = headers[0]
+            else:
+                msg = 'Unable to locate {} headers in {}'
+                raise spack.error.NoHeadersError(msg.format(self.name, directory))

         headers.directories = [os.path.dirname(config_h)]
         return headers

+    # https://docs.python.org/3/library/sysconfig.html#installation-paths
+    # https://discuss.python.org/t/understanding-site-packages-directories/12959
+    # https://github.com/pypa/pip/blob/22.1/src/pip/_internal/locations/__init__.py
+    # https://github.com/pypa/installer/pull/103
+
+    # NOTE: XCode Python's sysconfig module was incorrectly patched, and hard-codes
+    # everything to be installed in /Library/Python. Therefore, we need to use a
+    # fallback in the following methods. For more information, see:
+    # https://github.com/pypa/pip/blob/22.1/src/pip/_internal/locations/__init__.py#L486

     @property
     def platlib(self):
@@ -1104,8 +1125,12 @@ def platlib(self):
         Returns:
             str: platform-specific site-packages directory
         """
-        return self.config_vars['platlib'].replace(
-            self.config_vars['platbase'] + os.sep, ''
-        )
+        prefix = self.config_vars['platbase'] + os.sep
+        path = self.config_vars['platlib']
+        if path.startswith(prefix):
+            return path.replace(prefix, '')
+        return os.path.join(
+            'lib64', 'python{}'.format(self.version.up_to(2)), 'site-packages'
+        )

     @property
@@ -1122,8 +1147,12 @@ def purelib(self):
         Returns:
             str: platform-independent site-packages directory
         """
-        return self.config_vars['purelib'].replace(
-            self.config_vars['base'] + os.sep, ''
-        )
+        prefix = self.config_vars['base'] + os.sep
+        path = self.config_vars['purelib']
+        if path.startswith(prefix):
+            return path.replace(prefix, '')
+        return os.path.join(
+            'lib', 'python{}'.format(self.version.up_to(2)), 'site-packages'
+        )

     @property
@@ -1142,9 +1171,11 @@ def include(self):
         Returns:
             str: platform-independent header file directory
         """
-        return self.config_vars['include'].replace(
-            self.config_vars['installed_base'] + os.sep, ''
-        )
+        prefix = self.config_vars['installed_base'] + os.sep
+        path = self.config_vars['include']
+        if path.startswith(prefix):
+            return path.replace(prefix, '')
+        return os.path.join('include', 'python{}'.format(self.version.up_to(2)))

     @property
     def easy_install_file(self):
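The `platlib`, `purelib`, and `include` hunks all apply the same fix: only strip the installation prefix when sysconfig's reported path actually lives under it, and otherwise fall back to a conventional relative layout (the broken-XCode case described in the NOTE above). The shared pattern, pulled out as a standalone function (`relative_or_default` is an illustrative name, not an API from this package):

```python
import os


def relative_or_default(path, prefix, default):
    """Return `path` relative to `prefix` when it lives under it;
    otherwise return a conventional relative fallback."""
    prefix = prefix + os.sep
    if path.startswith(prefix):
        return path.replace(prefix, '')
    return default


# Normal case: sysconfig path is under the install base.
site = relative_or_default(
    os.path.join(os.sep, 'usr', 'lib', 'python3.10', 'site-packages'),
    os.path.join(os.sep, 'usr'),
    os.path.join('lib', 'python3.10', 'site-packages'))

# Pathological case: path hard-coded elsewhere, so the fallback is used.
odd = relative_or_default(
    os.path.join(os.sep, 'Library', 'Python', '3.8', 'site-packages'),
    os.path.join(os.sep, 'usr'),
    os.path.join('lib', 'python3.8', 'site-packages'))
```

The old code's unconditional `replace` silently returned the absolute path unchanged in the pathological case, which is exactly what these hunks fix.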
|
@@ -18,6 +18,7 @@ class Simde(MesonPackage):
|
||||
url = "https://github.com/simd-everywhere/simde/archive/v0.6.0.tar.gz"
|
||||
git = "https://github.com/simd-everywhere/simde.git"
|
||||
|
||||
version('0.7.2', sha256='366d5e9a342c30f1e40d1234656fb49af5ee35590aaf53b3c79b2afb906ed4c8')
|
||||
version('0.6.0', sha256='25a8b8c69c17ddc2f6209e86caa6b12d4ed91c0f841617efc56e5675eea84915')
|
||||
|
||||
patch('sve-gcc.patch', when='@0.6.0 %gcc')
|
||||
|
@@ -391,8 +391,8 @@ class Trilinos(CMakePackage, CudaPackage, ROCmPackage):
|
||||
patch('fix_clang_errors_12_18_1.patch', when='@12.18.1%clang')
|
||||
patch('cray_secas_12_12_1.patch', when='@12.12.1%cce')
|
||||
patch('cray_secas.patch', when='@12.14.1:12%cce')
|
||||
patch('https://patch-diff.githubusercontent.com/raw/trilinos/Trilinos/pull/10545.patch',
|
||||
sha256='7f446d8bdcdc7ec29e1caeb0faf8d9fd85bd470fc52d3a955c144ab14bb16b90',
|
||||
patch('https://patch-diff.githubusercontent.com/raw/trilinos/Trilinos/pull/10545.patch?full_index=1',
|
||||
sha256='62272054f7cc644583c269e692c69f0a26af19e5a5bd262db3ea3de3447b3358',
|
||||
when='@:13.2.0 +complex')
|
||||
|
||||
# workaround an NVCC bug with c++14 (https://github.com/trilinos/Trilinos/issues/6954)
|
||||
|
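This Trilinos change (like the WarpX patch changes below) switches the patch URL to `?full_index=1`: GitHub's default `.patch` output abbreviates blob hashes, and the abbreviation width can change as the repository grows, which silently changes the file bytes and breaks the pinned `sha256`. Full-index patches are byte-stable. The pin itself is just a file digest, e.g.:

```python
import hashlib


def sha256_of(data):
    """Hex SHA-256 of raw patch bytes, as pinned in a patch(...) directive."""
    return hashlib.sha256(data).hexdigest()


digest = sha256_of(b'example patch contents\n')
```

If even one byte of the fetched patch differs (such as a re-abbreviated blob hash in the `index` line), the digest no longer matches and the fetch is rejected.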
@@ -1,34 +0,0 @@
-From 9785e706229622626133c4b03c7abd004f62023f Mon Sep 17 00:00:00 2001
-From: Axel Huebl <axel.huebl@plasma.ninja>
-Date: Sat, 4 Dec 2021 15:28:13 -0800
-Subject: [PATCH] Fix: Installed Symlink LIB
-
-The latest patch to these routines broke our library alias in installs.
-
-By default, this variable is relative and needs the prefix appended.
-In some cases, e.g., if externally set, it can already be absolute. In that
-case, we skip adding the prefix.
----
- CMakeLists.txt | 7 ++++++-
- 1 file changed, 6 insertions(+), 1 deletion(-)
-
-diff --git a/CMakeLists.txt b/CMakeLists.txt
-index 04092ba962..a549546ab9 100644
---- a/CMakeLists.txt
-+++ b/CMakeLists.txt
-@@ -343,9 +343,14 @@ if(WarpX_LIB)
-     else()
-         set(mod_ext "so")
-     endif()
-+    if(IS_ABSOLUTE ${CMAKE_INSTALL_LIBDIR})
-+        set(ABS_INSTALL_LIB_DIR ${CMAKE_INSTALL_LIBDIR})
-+    else()
-+        set(ABS_INSTALL_LIB_DIR ${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_LIBDIR})
-+    endif()
-     install(CODE "file(CREATE_LINK
-         $<TARGET_FILE_NAME:shared>
--        ${CMAKE_INSTALL_LIBDIR}/libwarpx.${lib_suffix}.${mod_ext}
-+        ${ABS_INSTALL_LIB_DIR}/libwarpx.${lib_suffix}.${mod_ext}
-        COPY_ON_ERROR SYMBOLIC)")
-endif()
@@ -25,6 +25,7 @@ class Warpx(CMakePackage):

     # NOTE: if you update the versions here, also see py-warpx
     version('develop', branch='development')
+    version('22.06', sha256='e78398e215d3fc6bc5984f5d1c2ddeac290dcbc8a8e9d196e828ef6299187db9')
     version('22.05', sha256='2fa69e6a4db36459b67bf663e8fbf56191f6c8c25dc76301dbd02a36f9b50479')
     version('22.04', sha256='9234d12e28b323cb250d3d2cefee0b36246bd8a1d1eb48e386f41977251c028f')
     version('22.03', sha256='ddbef760c8000f2f827dfb097ca3359e7aecbea8766bec5c3a91ee28d3641564')

@@ -132,7 +133,28 @@ class Warpx(CMakePackage):
     # The symbolic aliases for our +lib target were missing in the install
     # location
     # https://github.com/ECP-WarpX/WarpX/pull/2626
-    patch('2626.patch', when='@21.12')
+    patch('https://github.com/ECP-WarpX/WarpX/pull/2626.patch?full_index=1',
+          sha256='a431d4664049d6dcb6454166d6a948d8069322a111816ca5ce01553800607544',
+          when='@21.12')
+
+    # Workaround for AMReX<=22.06 no-MPI Gather
+    # https://github.com/ECP-WarpX/WarpX/pull/3134
+    # https://github.com/AMReX-Codes/amrex/pull/2793
+    patch('https://github.com/ECP-WarpX/WarpX/pull/3134.patch?full_index=1',
+          sha256='b786ce64a3c2c2b96ff2e635f0ee48532e4ae7ad9637dbf03f11c0768c290690',
+          when='@22.02:22.05')
+
+    # Forgot to install ABLASTR library
+    # https://github.com/ECP-WarpX/WarpX/pull/3141
+    patch('https://github.com/ECP-WarpX/WarpX/pull/3141.patch?full_index=1',
+          sha256='dab6fb44556ee1fd466a4cb0e20f89bde1ce445c9a51a2c0f59d1740863b5e7d',
+          when='@22.04,22.05')
+
+    # Fix failing 1D CUDA build
+    # https://github.com/ECP-WarpX/WarpX/pull/3162
+    patch('https://github.com/ECP-WarpX/WarpX/pull/3162.patch?full_index=1',
+          sha256='0ae573d1390ed8063f84e3402d30d34e522e65dc5dfeea3d07e165127ab373e9',
+          when='@22.06')

     def cmake_args(self):
         spec = self.spec
@@ -168,10 +190,15 @@ def cmake_args(self):
     def libs(self):
         libsuffix = {'1': '1d', '2': '2d', '3': '3d', 'rz': 'rz'}
         dims = self.spec.variants['dims'].value
-        return find_libraries(
+        libs = find_libraries(
             ['libwarpx.' + libsuffix[dims]], root=self.prefix, recursive=True,
             shared=True
         )
+        libs += find_libraries(
+            ['libablastr'], root=self.prefix, recursive=True,
+            shared=self.spec.variants['shared']
+        )
+        return libs

     # WarpX has many examples to serve as a suitable smoke check. One
     # that is typical was chosen here
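`find_libraries(..., root=self.prefix, recursive=True)` walks the install prefix for matching library names; the hunk above simply runs it twice and concatenates the results so that the dimension-specific `libwarpx` library and the newly installed `libablastr` land in one list. A rough standalone model of that recursive search (`find_libs` is an illustrative stand-in, not Spack's `find_libraries`):

```python
import fnmatch
import os
import tempfile


def find_libs(names, root):
    """Recursively collect files under `root` whose basename matches any
    of the given library stems, with any extension (.so, .a, ...)."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for filename in filenames:
            if any(fnmatch.fnmatch(filename, name + '.*') for name in names):
                found.append(os.path.join(dirpath, filename))
    return sorted(found)


# Build a fake install prefix with the two libraries the package expects.
prefix = tempfile.mkdtemp()
libdir = os.path.join(prefix, 'lib')
os.makedirs(libdir)
for lib in ('libwarpx.3d.so', 'libablastr.a'):
    open(os.path.join(libdir, lib), 'w').close()

libs = find_libs(['libwarpx.3d'], prefix) + find_libs(['libablastr'], prefix)
```

Concatenating two searches, rather than one search for both names, keeps the WarpX library first in the list regardless of filesystem ordering.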