Compare commits

..

69 Commits

Author SHA1 Message Date
Gregory Becker
9479be5618 improved again 2020-12-02 18:22:46 -08:00
Gregory Becker
4dd854d31b improve detection of shared variant 2020-12-02 18:21:46 -08:00
Gregory Becker
25fd25a77d default pythoncmd true for python2 detection 2020-12-02 09:46:33 -08:00
Gregory Becker
842867dd89 uncomment gettext 2020-11-25 17:55:35 -08:00
Gregory Becker
49866c9013 improved python detection 2020-11-25 17:34:55 -08:00
Massimiliano Culpo
983fb11dee concretizer: treat conditional providers correctly (#20086)
refers #20040

This modification emits rules like:

provides_virtual("netlib-lapack","blas") :- variant_value("netlib-lapack","external-blas","False").

for packages that provide virtual dependencies conditionally instead
of a fact that doesn't account for the condition.
2020-11-25 22:03:42 +01:00
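The rule quoted above can be sketched as a string template (a hypothetical helper for illustration only; the actual concretizer generates these ASP facts internally):

```python
def conditional_provider_rule(pkg, virtual, variant, value):
    """Render an ASP rule that makes a 'provides' fact conditional
    on a variant value, as in the rule quoted in the commit message."""
    return (
        'provides_virtual("{p}","{v}") :- '
        'variant_value("{p}","{var}","{val}").'
    ).format(p=pkg, v=virtual, var=variant, val=value)

print(conditional_provider_rule("netlib-lapack", "blas", "external-blas", "False"))
```

The point of the change is that the `provides_virtual` head only holds when the body (the variant condition) holds, instead of being emitted as an unconditional fact.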
Martin Aumüller
b33969598a intel-tbb: patch for arm64 on macOS (#20039)
* intel-tbb: patch for arm64 on macOS

as submitted upstream and used in homebrew

* intel-tbb: check patchable versions

* intel-tbb: avoid patch breakage when 2021.1 is released

2021.1-beta05 would be considered newer than 2021.1
2020-11-25 10:05:50 -06:00
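The ordering pitfall behind that last point can be illustrated with plain string comparison (a sketch; Spack's actual version ordering is richer, but the direction of the surprise is the same):

```python
# "2021.1" is a prefix of "2021.1-beta05", so under naive lexicographic
# ordering the pre-release sorts *after* (i.e. "newer than") the release.
versions = sorted(["2021.1-beta05", "2021.1"])
print(versions)  # ['2021.1', '2021.1-beta05'] -- the beta sorts last
```

This is why a version range that is meant to exclude 2021.1 can accidentally pick up 2021.1-beta05 unless the patch's applicable range is pinned carefully.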
downloadico
8b2c7a6c65 Add the 'exciting' package. (#20060)
* Add the 'exciting' package.
Version 14 (latest available) is defined.
An as-yet-unpublished patch (dfgather.patch) from the developers is also
included.

* fixed flake8 errors (I *thought* I had already gotten them!  OOPS!)

* Update var/spack/repos/builtin/packages/exciting/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fixed install method to just do the install; no build method is needed.

* *Actually* added the lapack dependency!

* removed variant from blas dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-11-25 09:59:35 -06:00
Harmen Stoppels
f40492b7d4 concretizer: remove debug statement (#20085) 2020-11-25 14:09:52 +01:00
Tomoyasu Nojiri
164fc4ee95 powertop: Add depend ncurses (#20080) 2020-11-25 10:15:51 +01:00
Erik Schnetter
408824f365 h5cpp: Correct checksum and build instructions (#20053)
* h5cpp: Correct checksum and build instructions

Closes https://github.com/spack/spack/issues/20046.

* h5cpp: Update to 1.10.4-6

* Update var/spack/repos/builtin/packages/h5cpp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* h5cpp: Correct formatting

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-11-24 13:37:29 -06:00
Dr. Christian Tacke
f6549849e5 root: Add 6.22.02, remove preferred label (#20002)
Drop the preferred label from 6.20.x.
Let's just default to the latest (production) version.
2020-11-24 11:08:56 -06:00
Scott Wittenburg
348cbe143c phist needs sbang fix for build script (#20063) 2020-11-24 07:07:34 -08:00
Ricardo Silva
87689f7cc8 sqlcipher: new package at v4.4.1 (#20009)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-11-24 15:54:27 +01:00
Ben Morgan
d9f6ef9df4 clhep: new version 2.4.4.0 (#20066) 2020-11-24 08:53:22 -06:00
Ben Morgan
2c0091df3f vecgeom: new version 1.1.8 (#20067) 2020-11-24 08:32:33 -05:00
vvolkl
28a3b30c53 [root] fix rootfit/roofit variant (#20051)
* [root] fix rootfit/roofit variant

fix typo

* Update var/spack/repos/builtin/packages/root/package.py

Co-authored-by: Hadrien G. <knights_of_ni@gmx.com>

Co-authored-by: Hadrien G. <knights_of_ni@gmx.com>
2020-11-24 11:19:10 +00:00
vvolkl
4e35df4b61 [gaudi] clhep is not optional (#20052) 2020-11-24 11:17:28 +00:00
Justin S
c468d6bed2 r-stanheaders: add 2.21.0-6 (#20057) 2020-11-23 20:56:48 -06:00
Andrew W Elble
3f0984e5e1 py-tensorboard: force use of spack's python, force build of ijar (#20059)
(same fix for py-tensorboard-plugin-wit)
2020-11-23 20:56:33 -06:00
Martin Oberzalek
fd07decd27 Also build static library (#20016) 2020-11-23 20:46:14 -06:00
Dmitri Smirnov
aa8dd782cd mysql: Add sasl as dependency for versions <5.7.999 (#20027) 2020-11-23 20:43:37 -06:00
Matthias Diener
617f2ac714 charmpp: fix tests (#20047) 2020-11-23 20:27:55 -06:00
Justin S
aee3b4a1e8 r-bh: add 1.72.0-3 (#20058) 2020-11-23 20:18:14 -06:00
Axel Huebl
01c9f3edc3 pybind11: 2.6.1 (#20061)
Add the latest pybind11 release.
Since release 2.6.0 was a major release with many changes in the
build system, prefer 2.5.0 for now.
2020-11-23 20:12:24 -06:00
Seth R. Johnson
5c623b03b1 flibcpp: update version and deps (#20048) 2020-11-23 10:40:57 -05:00
iulian787
cb4a08b3e0 moab: added v5.2.1, v5.2.0, v5.1.0 (#20010)
Also added maintainers for the package recipe.
2020-11-23 13:17:52 +01:00
vvolkl
fa66d683e4 delphes: updated pre-release version (#20011) 2020-11-23 13:16:14 +01:00
Adam J. Stewart
fb2ac2077d Docs: remove duplication in Command Reference (#20021) 2020-11-23 12:38:34 +01:00
Martin Aumüller
b62401ec8f ispc: external find support, added master branch version (#20033)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2020-11-23 12:28:11 +01:00
Cyrus Harrison
8a54817d4e vtk-h: added v0.6.6 (#20026) 2020-11-23 11:21:42 +01:00
Martin Aumüller
5088d799eb openscenegraph: remove dependency on Qt for >= 3.5.4 (#20032) 2020-11-23 11:18:41 +01:00
Michael Kuhn
c25f15b7d5 libx11: added v1.7.0 (#20035) 2020-11-23 11:13:33 +01:00
Michael Kuhn
d7db6068c5 libxcb, xcb-proto: added v1.14 versions (#20036)
At least xcb-proto 1.14.1 is only available from the new URL, so change
both to be future-proof.
2020-11-23 11:13:09 +01:00
Martin Aumüller
236796577d botan: added v2.17.0, v2.17.1 and v2.17.2 (#20037) 2020-11-23 11:11:51 +01:00
Martin Aumüller
b490d65f28 recognize macOS 11.1 as big sur (#20038)
Big Sur versions go 11.0, 11.0.1, 11.1 (vs. prior versions that
only used the minor component)

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2020-11-23 08:37:40 +01:00
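A minimal sketch of the idea (a hypothetical helper, not Spack's actual implementation): Big Sur can be recognized from the major component alone, since 11.x releases vary in how many components they carry, while earlier releases were all 10.&lt;minor&gt;:

```python
def macos_codename(version: str) -> str:
    """Map a macOS version string to a release name (illustrative only)."""
    parts = version.split(".")
    # Big Sur versions go 11.0, 11.0.1, 11.1, ... so match on the major
    # component; prior releases are distinguished by the minor component.
    if parts[0] == "11":
        return "bigsur"
    names = {"15": "catalina", "14": "mojave", "13": "highsierra"}
    return names.get(parts[1], "unknown")

print(macos_codename("11.1"))     # bigsur
print(macos_codename("10.15.7"))  # catalina
```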
Pramod Kumbhar
92d540fde7 Add sionlib and linktest packages (#20034)
* Add sionlib and linktest packages

* fix flake8
2020-11-22 09:22:26 -06:00
Thomas Gruber
4609a126ba Add new release 5.1.0 and change homepage (#20022) 2020-11-21 00:02:08 +01:00
Shahzeb Siddiqui
f30aeb35ae [WIP] nersc e4s pipeline trigger (#19688)
* nersc e4s pipeline trigger

* Update nersc_pipeline.yml

* Update nersc_pipeline.yml
2020-11-20 13:31:25 -08:00
Martin Oberzalek
439b329c38 openldap: enable creation of static libraries (#20013) 2020-11-20 13:51:25 +01:00
psakievich
f613e10f24 Trilinos: Add CUDA relocatable code flag (#19993)
* Add relocatable code flag to trilinos

* Make CUDA RDC a variant

* adjust default of cuda_rdc
2020-11-19 19:24:42 -05:00
Josh Essman
f92e52cdc8 mfem: Add support for AmgX, fix to version extensions (#19990)
* fix: leading . is not needed in extension kwarg

* mfem: add support for NVIDIA AmgX

fix: proper spacing

* mfem: use conflict to indicate that AmgX is expected to depend on CUDA
2020-11-19 14:03:22 -06:00
Axel Huebl
16d5cc2c99 ADIOS2: ~dataman default (#20003)
Disable dataman by default. It pulls in heavy dependencies (ZMQ)
that are often not needed for HPC, and it currently does not link
with popular compilers.
2020-11-19 13:56:51 -06:00
Massimiliano Culpo
d6e44b94d6 globalarrays: added v5.8 and earlier, simplified recipe (#19999)
fixes #19966

Global Arrays supports GCC 10 since version 5.7.1, so a conflict
has been added to prevent older releases from failing at build time.

Removed the 'blas' and 'lapack' variants, since BLAS and LAPACK
are always dependencies; if they are not specified during
configure, versions of these APIs vendored with Global Arrays
are built.

Fixed a few options in configuration.
2020-11-19 11:58:54 -06:00
Brian Van Essen
5015635506 Removed accidental command to not expand the tarball. (#20001) 2020-11-19 11:58:25 -06:00
Dr. Christian Tacke
c417827954 cmake: Add Version 3.19.0 (#19996) 2020-11-19 10:43:52 -06:00
Sreenivasa Murthy Kolam
e75b76f433 bump up version for rocm 3.9.0 (#19995) 2020-11-19 07:40:24 -06:00
Toyohisa Kameyama
1522d1fac6 simde: New package (#19992)
* simde: New package

* remove 0.5.0.
2020-11-19 07:38:50 -06:00
Nithin Senthil Kumar
5129d84304 mvapich2: extended the fabrics variant description (#19860)
The point of this variant is to give the end user the option to use
system-installed fabrics such as mofed instead of upstream fabrics such as
rdma-core. This was found to avoid run-time errors on some systems.

Co-authored-by: nithintsk <nithintsk@github.com>
2020-11-19 13:47:57 +01:00
Adam J. Stewart
14a9359395 spack debug report: print concretizer (#19983) 2020-11-19 11:12:28 +01:00
Tomoki, Karatsu
8f3594564c fujitsu compiler: added / fixed support for compiler flags (#19967)
Added flags for:
- Debug symbols
- C++17 standard

Fixed the list of flags for generic optimizations
2020-11-19 11:09:34 +01:00
Michael Kuhn
1b7a5e53a6 clang/llvm: fix version detection (#19978)
This PR fixes two problems with clang/llvm's version detection. clang's
version output looks like this:

```
clang version 11.0.0
Target: x86_64-unknown-linux-gnu
```

This caused clang's version to be misdetected as:

```
clang@11.0.0
Target:
```

This resulted in errors when trying to actually use it as a compiler.

When using `spack external find`, we couldn't determine the compiler
version, resulting in errors like this:

```
==> Warning: "llvm@11.0.0+clang+lld+lldb" has been detected on the system but will not be added to packages.yaml [reason=c compiler not found for llvm@11.0.0+clang+lld+lldb]
```

Changing the regex to only match until the end of the line fixes these
problems.

Fixes: #19473
2020-11-19 11:06:45 +01:00
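The misdetection can be reproduced with a small regex sketch (patterns simplified from the actual change):

```python
import re

output = "clang version 11.0.0\nTarget: x86_64-unknown-linux-gnu\n"

# The old character class excluded only spaces and ')', so the match
# happily consumed the newline and ran into the next line of output.
old = re.search(r'clang version ([^ )]+)', output)
# Adding '\n' to the exclusions stops the match at the end of the line.
new = re.search(r'clang version ([^ )\n]+)', output)

print(repr(old.group(1)))  # '11.0.0\nTarget:'
print(new.group(1))        # 11.0.0
```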
Michael Kuhn
dd54cb4c7a llvm: add missing pkgconfig dependency (#19982)
When building llvm with CUDA support, it needs to find libffi. Without
pkg-config, libffi will not be found.
2020-11-19 10:34:13 +01:00
Brian Van Essen
db9b7a509a cuDNN Refactor to accommodate architecture and CUDA version (#19989)
* Updated the cuDNN recipe to generate the proper version names for only
the architecture that you are on. This prevents the concretizer from
selecting a source-code version that is incompatible with your current
architecture. Additionally, added constraints to ensure that the
corresponding CUDA version is properly set as well.

* Added maintainer

* Fixed renaming for darwin systems

* Fixed flake8

* Fixed flake8

* Fixed range typo

* Update var/spack/repos/builtin/packages/cudnn/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fixed style issues

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-11-18 22:56:39 -06:00
eugeneswalker
a2801a1384 openblas@0.3.11 conflicts with gcc less than 8.3.0 (#19975) 2020-11-18 22:52:31 -06:00
Andreas Baumbach
cb22bcf6f1 drop unnecessary tk dependency of py-git-review (#19969)
* seems to have been introduced erroneously by users with gitk-based
  workflows; this should be handled by the git package
* fixes build problems on macOS Big Sur
2020-11-18 22:51:47 -06:00
t-nojiri
c9aac3e221 openloops: Fix for aarch64 (#19965) 2020-11-18 22:49:04 -06:00
arjun-raj-kuppala
a680df8453 AMD ROCm 3.9.0 release: Bump up version for aomp, roctracer-dev (#19957)
* AMD ROCm 3.9.0 release: Bump up version for aomp, roctracer-dev and updates to hip/hip-rocclr

* Update package.py
2020-11-18 22:47:10 -06:00
Matthias Diener
932f128bc8 charmpp: various fixes (#19956)
* charmpp: various fixes

- change URLs to https
- address deprecated/renamed versions
- make it build with the cmake build system

* flake8

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2020-11-18 22:45:57 -06:00
Adam J. Stewart
95f5419502 py-ipykernel: fix bug in phase method (#19986)
* py-ipykernel: fix bug in phase method

* Fix bug in executable calling
2020-11-18 18:45:29 -08:00
Enrico Usai
bc5c475909 aws-parallelcluster: 2.10.0 release (#19976)
Updated boto3 dependency and removed useless comments.
2020-11-18 19:27:04 -06:00
Greg Becker
10f784338b fix error handling for spack test results command (#19987) 2020-11-18 16:16:34 -08:00
Danny Taller
3b9155239b hip support for umpire, chai, raja, camp (#19715)
* create HipPackage base class and do some refactoring

* comments and added conflict to raja for openmp with hip
2020-11-18 11:52:21 -08:00
Cyrus Harrison
676d68a979 add 0.6.0 conduit release and update for branch changes (#19696) 2020-11-18 12:34:14 -06:00
vvolkl
3069631f37 Add "hep" label to high energy physics packages (#19968)
* [hep] add hep tag to relevant packages

* [lcio] add hep label
2020-11-18 17:07:35 +00:00
Dr. Christian Tacke
eca1370abc root: Add +spectrum variant to enable TSpectrum (#19971) 2020-11-18 16:37:37 +00:00
Massimiliano Culpo
b1dc3e787b Merge tag 'v0.16.0' into develop 2020-11-18 15:23:35 +01:00
Axel Huebl
8b431d1774 py-ipykernel: fix install (#19617)
There is a post-install routine in `ipykernel` that needs to be
called for proper registration with jupyter.
2020-11-18 07:34:12 -06:00
Wouter Deconinck
a0a15b5cd0 qt: patch missing includes when +opengl %gcc@10: (#19963) 2020-11-18 07:51:51 -05:00
223 changed files with 2307 additions and 2895 deletions

View File

@@ -1,32 +1,3 @@
# v0.16.1 (2021-02-22)
This minor release includes a new feature and associated fixes:
* intel-oneapi support through new packages (#20411, #20686, #20693, #20717,
#20732, #20808, #21377, #21448)
This release also contains bug fixes/enhancements for:
* HIP/ROCm support (#19715, #20095)
* concretization (#19988, #20020, #20082, #20086, #20099, #20102, #20128,
#20182, #20193, #20194, #20196, #20203, #20247, #20259, #20307, #20362,
#20383, #20423, #20473, #20506, #20507, #20604, #20638, #20649, #20677,
#20680, #20790)
* environment install reporting fix (#20004)
* avoid import in ABI compatibility info (#20236)
* restore ability of dev-build to skip patches (#20351)
* spack find -d spec grouping (#20028)
* spack smoke test support (#19987, #20298)
* macOS fixes (#20038, #21662)
* abstract spec comparisons (#20341)
* continuous integration (#17563)
* performance improvements for binary relocation (#19690, #20768)
* additional sanity checks for variants in builtin packages (#20373)
* do not pollute auto-generated configuration files with empty lists or
dicts (#20526)
plus assorted documentation (#20021, #20174) and package bug fixes/enhancements
(#19617, #19933, #19986, #20006, #20097, #20198, #20794, #20906, #21411).
# v0.16.0 (2020-11-18)
`v0.16.0` is a major feature release.

View File

@@ -324,21 +324,21 @@ mentions that Python 3 is required, this can be specified as:
.. code-block:: python
depends_on('python@3:', type=('build', 'run'))
depends_on('python@3:', type=('build', 'run')
If Python 2 is required, this would look like:
.. code-block:: python
depends_on('python@:2', type=('build', 'run'))
depends_on('python@:2', type=('build', 'run')
If Python 2.7 is the only version that works, you can use:
.. code-block:: python
depends_on('python@2.7:2.8', type=('build', 'run'))
depends_on('python@2.7:2.8', type=('build', 'run')
The documentation may not always specify supported Python versions.

View File

@@ -673,13 +673,6 @@ def uniq(sequence):
return uniq_list
def star(func):
"""Unpacks arguments for use with Multiprocessing mapping functions"""
def _wrapper(args):
return func(*args)
return _wrapper
class Devnull(object):
"""Null stream with less overhead than ``os.devnull``.

View File

@@ -5,7 +5,7 @@
#: major, minor, patch version for Spack, in a tuple
spack_version_info = (0, 16, 1)
spack_version_info = (0, 16, 0)
#: String containing Spack version joined with .'s
spack_version = '.'.join(str(v) for v in spack_version_info)

View File

@@ -8,6 +8,7 @@
from llnl.util.lang import memoized
import spack.spec
from spack.build_environment import dso_suffix
from spack.spec import CompilerSpec
from spack.util.executable import Executable, ProcessError
from spack.compilers.clang import Clang
@@ -29,7 +30,6 @@ def architecture_compatible(self, target, constraint):
def _gcc_get_libstdcxx_version(self, version):
"""Returns gcc ABI compatibility info by getting the library version of
a compiler's libstdc++ or libgcc_s"""
from spack.build_environment import dso_suffix
spec = CompilerSpec("gcc", version)
compilers = spack.compilers.compilers_for_spec(spec)
if not compilers:

View File

@@ -12,7 +12,6 @@
import tempfile
import hashlib
import glob
from ordereddict_backport import OrderedDict
from contextlib import closing
import ruamel.yaml as yaml
@@ -599,9 +598,7 @@ def write_buildinfo_file(spec, workdir, rel=False):
text_to_relocate.append(rel_path_name)
# Create buildinfo data and write it to disk
import spack.hooks.sbang as sbang
buildinfo = {}
buildinfo['sbang_install_path'] = sbang.sbang_install_path()
buildinfo['relative_rpaths'] = rel
buildinfo['buildpath'] = spack.store.layout.root
buildinfo['spackprefix'] = spack.paths.prefix
@@ -1087,10 +1084,6 @@ def relocate_package(spec, allow_root):
new_prefix = str(spec.prefix)
new_rel_prefix = str(os.path.relpath(new_prefix, new_layout_root))
new_spack_prefix = str(spack.paths.prefix)
old_sbang_install_path = None
if 'sbang_install_path' in buildinfo:
old_sbang_install_path = str(buildinfo['sbang_install_path'])
old_layout_root = str(buildinfo['buildpath'])
old_spack_prefix = str(buildinfo.get('spackprefix'))
old_rel_prefix = buildinfo.get('relative_prefix')
@@ -1112,32 +1105,11 @@ def relocate_package(spec, allow_root):
new_deps = spack.build_environment.get_rpath_deps(spec.package)
for d in new_deps:
hash_to_prefix[d.format('{hash}')] = str(d.prefix)
# Spurious replacements (e.g. sbang) will cause issues with binaries
# For example, the new sbang can be longer than the old one.
# Hence 2 dictionaries are maintained here.
prefix_to_prefix_text = OrderedDict({})
prefix_to_prefix_bin = OrderedDict({})
if old_sbang_install_path:
import spack.hooks.sbang as sbang
prefix_to_prefix_text[old_sbang_install_path] = \
sbang.sbang_install_path()
prefix_to_prefix_text[old_prefix] = new_prefix
prefix_to_prefix_bin[old_prefix] = new_prefix
prefix_to_prefix_text[old_layout_root] = new_layout_root
prefix_to_prefix_bin[old_layout_root] = new_layout_root
prefix_to_prefix = dict()
for orig_prefix, hash in prefix_to_hash.items():
prefix_to_prefix_text[orig_prefix] = hash_to_prefix.get(hash, None)
prefix_to_prefix_bin[orig_prefix] = hash_to_prefix.get(hash, None)
# This is vestigial code for the *old* location of sbang. Previously,
# sbang was a bash script, and it lived in the spack prefix. It is
# now a POSIX script that lives in the install prefix. Old packages
# will have the old sbang location in their shebangs.
import spack.hooks.sbang as sbang
orig_sbang = '#!/bin/bash {0}/bin/sbang'.format(old_spack_prefix)
new_sbang = sbang.sbang_shebang_line()
prefix_to_prefix_text[orig_sbang] = new_sbang
prefix_to_prefix[orig_prefix] = hash_to_prefix.get(hash, None)
prefix_to_prefix[old_prefix] = new_prefix
prefix_to_prefix[old_layout_root] = new_layout_root
tty.debug("Relocating package from",
"%s to %s." % (old_layout_root, new_layout_root))
@@ -1165,14 +1137,15 @@ def is_backup_file(file):
relocate.relocate_macho_binaries(files_to_relocate,
old_layout_root,
new_layout_root,
prefix_to_prefix_bin, rel,
prefix_to_prefix, rel,
old_prefix,
new_prefix)
if 'elf' in platform.binary_formats:
relocate.relocate_elf_binaries(files_to_relocate,
old_layout_root,
new_layout_root,
prefix_to_prefix_bin, rel,
prefix_to_prefix, rel,
old_prefix,
new_prefix)
# Relocate links to the new install prefix
@@ -1183,7 +1156,12 @@ def is_backup_file(file):
# For all buildcaches
# relocate the install prefixes in text files including dependencies
relocate.relocate_text(text_names, prefix_to_prefix_text)
relocate.relocate_text(text_names,
old_layout_root, new_layout_root,
old_prefix, new_prefix,
old_spack_prefix,
new_spack_prefix,
prefix_to_prefix)
paths_to_relocate = [old_prefix, old_layout_root]
paths_to_relocate.extend(prefix_to_hash.keys())
@@ -1193,13 +1171,22 @@ def is_backup_file(file):
map(lambda filename: os.path.join(workdir, filename),
buildinfo['relocate_binaries'])))
# relocate the install prefixes in binary files including dependencies
relocate.relocate_text_bin(files_to_relocate, prefix_to_prefix_bin)
relocate.relocate_text_bin(files_to_relocate,
old_prefix, new_prefix,
old_spack_prefix,
new_spack_prefix,
prefix_to_prefix)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
else:
if old_spack_prefix != new_spack_prefix:
relocate.relocate_text(text_names, prefix_to_prefix_text)
relocate.relocate_text(text_names,
old_layout_root, new_layout_root,
old_prefix, new_prefix,
old_spack_prefix,
new_spack_prefix,
prefix_to_prefix)
def extract_tarball(spec, filename, allow_root=False, unsigned=False,

View File

@@ -412,8 +412,7 @@ def set_build_environment_variables(pkg, env, dirty):
# directory. Add that to the path too.
env_paths = []
compiler_specific = os.path.join(
spack.paths.build_env_path,
os.path.dirname(pkg.compiler.link_paths['cc']))
spack.paths.build_env_path, pkg.compiler.name)
for item in [spack.paths.build_env_path, compiler_specific]:
env_paths.append(item)
ci = os.path.join(item, 'case-insensitive')
@@ -751,9 +750,6 @@ def setup_package(pkg, dirty, context='build'):
elif context == 'test':
import spack.user_environment as uenv # avoid circular import
env.extend(uenv.environment_modifications_for_spec(pkg.spec))
env.extend(
modifications_from_dependencies(pkg.spec, context=context)
)
set_module_variables_for_package(pkg)
env.prepend_path('PATH', '.')
@@ -818,8 +814,7 @@ def modifications_from_dependencies(spec, context):
}
deptype, method = deptype_and_method[context]
root = context == 'test'
for dspec in spec.traverse(order='post', root=root, deptype=deptype):
for dspec in spec.traverse(order='post', root=False, deptype=deptype):
dpkg = dspec.package
set_module_variables_for_package(dpkg)
# Allow dependencies to modify the module

View File

@@ -19,7 +19,7 @@ class CudaPackage(PackageBase):
# https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#gpu-feature-list
# https://developer.nvidia.com/cuda-gpus
# https://en.wikipedia.org/wiki/CUDA#GPUs_supported
cuda_arch_values = (
cuda_arch_values = [
'10', '11', '12', '13',
'20', '21',
'30', '32', '35', '37',
@@ -27,7 +27,7 @@ class CudaPackage(PackageBase):
'60', '61', '62',
'70', '72', '75',
'80', '86'
)
]
# FIXME: keep cuda and cuda_arch separate to make usage easier until
# Spack has depends_on(cuda, when='cuda_arch!=None') or alike

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# Troubleshooting advice for +rocm builds:
# Troubleshooting advice for +hip builds:
#
# 1. When building with clang, go your compilers.yaml,
# add an entry for the amd version of clang, as below.
@@ -73,11 +73,9 @@
from spack.package import PackageBase
from spack.directives import depends_on, variant, conflicts
import spack.variant
class ROCmPackage(PackageBase):
"""Auxiliary class which contains ROCm variant, dependencies and conflicts
class HipPackage(PackageBase):
"""Auxiliary class which contains HIP variant, dependencies and conflicts
and is meant to unify and facilitate its usage. Closely mimics CudaPackage.
Maintainers: dtaller
@@ -88,26 +86,24 @@ class ROCmPackage(PackageBase):
amdgpu_targets = (
'gfx701', 'gfx801', 'gfx802', 'gfx803',
'gfx900', 'gfx906', 'gfx908', 'gfx1010',
'gfx1011', 'gfx1012'
'gfx1011', 'gfx1012', 'none'
)
variant('rocm', default=False, description='Enable ROCm support')
variant('hip', default=False, description='Enable HIP support')
# possible amd gpu targets for rocm builds
variant('amdgpu_target',
description='AMD GPU architecture',
values=spack.variant.any_combination_of(*amdgpu_targets))
# possible amd gpu targets for hip builds
variant('amdgpu_target', default='none', values=amdgpu_targets)
depends_on('llvm-amdgpu', when='+rocm')
depends_on('hsa-rocr-dev', when='+rocm')
depends_on('hip', when='+rocm')
depends_on('llvm-amdgpu', when='+hip')
depends_on('hsa-rocr-dev', when='+hip')
depends_on('hip', when='+hip')
# need amd gpu type for rocm builds
conflicts('amdgpu_target=none', when='+rocm')
# need amd gpu type for hip builds
conflicts('amdgpu_target=none', when='+hip')
# Make sure amdgpu_targets cannot be used without +rocm
for value in amdgpu_targets:
conflicts('~rocm', when='amdgpu_target=' + value)
# Make sure non-'none' amdgpu_targets cannot be used without +hip
for value in amdgpu_targets[:-1]:
conflicts('~hip', when='amdgpu_target=' + value)
# https://github.com/ROCm-Developer-Tools/HIP/blob/master/bin/hipcc
# It seems that hip-clang does not (yet?) accept this flag, in which case
@@ -115,8 +111,17 @@ class ROCmPackage(PackageBase):
# hip package file. But I will leave this here for future development.
@staticmethod
def hip_flags(amdgpu_target):
archs = ",".join(amdgpu_target)
return '--amdgpu-target={0}'.format(archs)
return '--amdgpu-target={0}'.format(amdgpu_target)
# https://llvm.org/docs/AMDGPUUsage.html
# Possible architectures (not including 'none' option)
@staticmethod
def amd_gputargets_list():
return (
'gfx701', 'gfx801', 'gfx802', 'gfx803',
'gfx900', 'gfx906', 'gfx908', 'gfx1010',
'gfx1011', 'gfx1012'
)
# HIP version vs Architecture

View File

@@ -1,80 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Common utilities for managing intel oneapi packages.
"""
from os.path import dirname, isdir
from spack.package import Package
from spack.util.executable import Executable
from llnl.util.filesystem import find_headers, find_libraries
class IntelOneApiPackage(Package):
"""Base class for Intel oneAPI packages."""
homepage = 'https://software.intel.com/oneapi'
phases = ['install']
def component_info(self,
dir_name,
components,
releases,
url_name):
self._dir_name = dir_name
self._components = components
self._releases = releases
self._url_name = url_name
def url_for_version(self, version):
release = self._release(version)
return 'https://registrationcenter-download.intel.com/akdlm/irc_nas/%s/%s' % (
release['irc_id'], self._oneapi_file(version, release))
def install(self, spec, prefix):
bash = Executable('bash')
# Installer writes files in ~/intel set HOME so it goes to prefix
bash.add_default_env('HOME', prefix)
version = spec.versions.lowest()
release = self._release(version)
bash('./%s' % self._oneapi_file(version, release),
'-s', '-a', '-s', '--action', 'install',
'--eula', 'accept',
'--components',
self._components,
'--install-dir', prefix)
#
# Helper functions
#
def _release(self, version):
return self._releases[str(version)]
def _oneapi_file(self, version, release):
return 'l_%s_p_%s.%s_offline.sh' % (
self._url_name, version, release['build'])
class IntelOneApiLibraryPackage(IntelOneApiPackage):
"""Base class for Intel oneAPI library packages."""
@property
def headers(self):
include_path = '%s/%s/latest/include' % (
self.prefix, self._dir_name)
return find_headers('*', include_path, recursive=True)
@property
def libs(self):
lib_path = '%s/%s/latest/lib/intel64' % (self.prefix, self._dir_name)
lib_path = lib_path if isdir(lib_path) else dirname(lib_path)
return find_libraries('*', root=lib_path, shared=True, recursive=True)

View File

@@ -435,7 +435,7 @@ def format_list(specs):
out = ''
if groups:
for specs in iter_groups(specs, indent, all_headers):
output.write(format_list(specs))
out += format_list(specs)
else:
out = format_list(sorted(specs))

View File

@@ -15,6 +15,7 @@
from llnl.util.filesystem import working_dir
import spack.architecture as architecture
import spack.config
import spack.paths
from spack.main import get_version
from spack.util.executable import which
@@ -89,6 +90,7 @@ def report(args):
print('* **Python:**', platform.python_version())
print('* **Platform:**', architecture.Arch(
architecture.platform(), 'frontend', 'frontend'))
print('* **Concretizer:**', spack.config.get('config:concretizer'))
def debug(parser, args):

View File

@@ -112,7 +112,6 @@ def dev_build(self, args):
verbose=not args.quiet,
dirty=args.dirty,
stop_before=args.before,
skip_patch=args.skip_patch,
stop_at=args.until)
# drop into the build environment of the package?

View File

@@ -255,7 +255,7 @@ def install(parser, args, **kwargs):
reporter.specs = specs
tty.msg("Installing environment {0}".format(env.name))
with reporter('build'):
with reporter:
env.install_all(args, **kwargs)
tty.debug("Regenerating environment views for {0}"

View File

@@ -39,8 +39,7 @@
_compiler_cache = {}
_compiler_to_pkg = {
'clang': 'llvm+clang',
'oneapi': 'intel-oneapi-compilers'
'clang': 'llvm+clang'
}

View File

@@ -159,11 +159,11 @@ def extract_version_from_output(cls, output):
match = re.search(
# Normal clang compiler versions are left as-is
r'clang version ([^ )]+)-svn[~.\w\d-]*|'
r'clang version ([^ )\n]+)-svn[~.\w\d-]*|'
# Don't include hyphenated patch numbers in the version
# (see https://github.com/spack/spack/pull/14365 for details)
r'clang version ([^ )]+?)-[~.\w\d-]*|'
r'clang version ([^ )]+)',
r'clang version ([^ )\n]+?)-[~.\w\d-]*|'
r'clang version ([^ )\n]+)',
output
)
if match:

View File

@@ -34,9 +34,13 @@ class Fj(spack.compiler.Compiler):
def verbose_flag(self):
return "-v"
@property
def debug_flags(self):
return "-g"
@property
def opt_flags(self):
return ['-O', '-O0', '-O1', '-O2', '-O3', '-O4']
return ['-O0', '-O1', '-O2', '-O3', '-Ofast']
@property
def openmp_flag(self):
@@ -54,6 +58,10 @@ def cxx11_flag(self):
def cxx14_flag(self):
return "-std=c++14"
@property
def cxx17_flag(self):
return "-std=c++17"
@property
def c99_flag(self):
return "-std=c99"

View File

@@ -29,14 +29,13 @@ class Oneapi(Compiler):
PrgEnv_compiler = 'oneapi'
version_argument = '--version'
version_regex = r'(?:(?:oneAPI DPC\+\+ Compiler)|(?:ifx \(IFORT\))) (\S+)'
version_regex = r'\((?:IFORT|ICC)\)|DPC\+\+ [^ ]+ [^ ]+ [^ ]+ \(([^ ]+)\)'
@property
def verbose_flag(self):
return "-v"
required_libs = ['libirc', 'libifcore', 'libifcoremt', 'libirng',
'libsvml', 'libintlc', 'libimf']
required_libs = ['libirc', 'libifcore', 'libifcoremt', 'libirng']
@property
def debug_flags(self):

View File

@@ -253,7 +253,8 @@ def concretize_architecture(self, spec):
if spec.architecture is None:
spec.architecture = spack.spec.ArchSpec()
if spec.architecture.concrete:
if spec.architecture.platform and \
(spec.architecture.os and spec.architecture.target):
return False
# Get platform of nearest spec with a platform, including spec
@@ -293,58 +294,22 @@ def concretize_architecture(self, spec):
# Get the nearest spec with relevant platform and a target
# Generally, same algorithm as finding os
curr_target = None
if spec.architecture.target:
curr_target = spec.architecture.target
if spec.architecture.target and spec.architecture.target_concrete:
new_target = spec.architecture.target
else:
new_target_spec = find_spec(
spec, lambda x: (x.architecture and
x.architecture.platform == str(new_plat) and
x.architecture.target and
x.architecture.target != curr_target)
x.architecture.target)
)
if new_target_spec:
if curr_target:
# constrain one target by the other
new_target_arch = spack.spec.ArchSpec(
(None, None, new_target_spec.architecture.target))
curr_target_arch = spack.spec.ArchSpec(
(None, None, curr_target))
curr_target_arch.constrain(new_target_arch)
new_target = curr_target_arch.target
else:
new_target = new_target_spec.architecture.target
new_target = new_target_spec.architecture.target
else:
# To get default platform, consider package prefs
if PackagePrefs.has_preferred_targets(spec.name):
new_target = self.target_from_package_preferences(spec)
else:
new_target = new_plat.target('default_target')
if curr_target:
# convert to ArchSpec to compare satisfaction
new_target_arch = spack.spec.ArchSpec(
(None, None, str(new_target)))
curr_target_arch = spack.spec.ArchSpec(
(None, None, str(curr_target)))
if not new_target_arch.satisfies(curr_target_arch):
# new_target is an incorrect guess based on preferences
# and/or default
valid_target_ranges = str(curr_target).split(',')
for target_range in valid_target_ranges:
t_min, t_sep, t_max = target_range.partition(':')
if not t_sep:
new_target = t_min
break
elif t_max:
new_target = t_max
break
elif t_min:
# TODO: something better than picking first
new_target = t_min
break
# Construct new architecture, compute whether spec changed
arch_spec = (str(new_plat), str(new_os), str(new_target))
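The target-range fallback above walks a comma-separated constraint like `x86_64:haswell,icelake` and reduces it to a single concrete target with `str.partition`. A standalone sketch of that logic (plain strings rather than Spack's `Target` objects; `pick_target` is a hypothetical helper name):

```python
def pick_target(constraint):
    """Reduce a comma-separated target constraint to one concrete target,
    mirroring the partition() logic above: an exact name wins outright, a
    bounded range yields its upper end, an open range its lower end."""
    for target_range in constraint.split(','):
        t_min, t_sep, t_max = target_range.partition(':')
        if not t_sep:   # exact target, e.g. "haswell"
            return t_min
        elif t_max:     # bounded range, e.g. "x86_64:haswell"
            return t_max
        elif t_min:     # open-ended range, e.g. "x86_64:"
            return t_min
    return None

print(pick_target('haswell'))         # haswell
print(pick_target('x86_64:haswell'))  # haswell
print(pick_target('x86_64:'))         # x86_64
```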
@@ -419,7 +384,7 @@ def concretize_compiler(self, spec):
"""
# Pass on concretizing the compiler if the target or operating system
# is not yet determined
if not spec.architecture.concrete:
if not (spec.architecture.os and spec.architecture.target):
# We haven't changed, but other changes need to happen before we
# continue. `return True` here to force concretization to keep
# running.
@@ -517,7 +482,7 @@ def concretize_compiler_flags(self, spec):
"""
# Pass on concretizing the compiler flags if the target or operating
# system is not set.
if not spec.architecture.concrete:
if not (spec.architecture.os and spec.architecture.target):
# We haven't changed, but other changes need to happen before we
# continue. `return True` here to force concretization to keep
# running.


@@ -685,7 +685,7 @@ def _read_manifest(self, f, raw_yaml=None):
else:
self.spec_lists[name] = user_specs
spec_list = config_dict(self.yaml).get(user_speclist_name, [])
spec_list = config_dict(self.yaml).get(user_speclist_name)
user_specs = SpecList(user_speclist_name, [s for s in spec_list if s],
self.spec_lists.copy())
self.spec_lists[user_speclist_name] = user_specs
@@ -707,11 +707,10 @@ def _read_manifest(self, f, raw_yaml=None):
self.views = {}
# Retrieve the current concretization strategy
configuration = config_dict(self.yaml)
# default concretization to separately
self.concretization = configuration.get('concretization', 'separately')
self.concretization = configuration.get('concretization')
# Retrieve dev-build packages:
self.dev_specs = configuration.get('develop', {})
self.dev_specs = configuration['develop']
for name, entry in self.dev_specs.items():
# spec must include a concrete version
assert Spec(entry['spec']).version.concrete


@@ -8,7 +8,6 @@
import re
import shutil
import sys
from ordereddict_backport import OrderedDict
from llnl.util.link_tree import LinkTree, MergeConflictError
from llnl.util import tty
@@ -66,35 +65,32 @@ def view_copy(src, dst, view, spec=None):
# Not metadata, we have to relocate it
# Get information on where to relocate from/to
# This is vestigial code for the *old* location of sbang. Previously,
# sbang was a bash script, and it lived in the spack prefix. It is
# now a POSIX script that lives in the install prefix. Old packages
# will have the old sbang location in their shebangs.
# TODO: Not sure which one to use...
import spack.hooks.sbang as sbang
orig_sbang = '#!/bin/bash {0}/bin/sbang'.format(spack.paths.spack_root)
new_sbang = sbang.sbang_shebang_line()
prefix_to_projection = OrderedDict({
spec.prefix: view.get_projection_for_spec(spec),
spack.paths.spack_root: view._root})
for dep in spec.traverse():
prefix_to_projection[dep.prefix] = \
view.get_projection_for_spec(dep)
prefix_to_projection = dict(
(dep.prefix, view.get_projection_for_spec(dep))
for dep in spec.traverse()
)
if spack.relocate.is_binary(dst):
# relocate binaries
spack.relocate.relocate_text_bin(
binaries=[dst],
prefixes=prefix_to_projection
orig_install_prefix=spec.prefix,
new_install_prefix=view.get_projection_for_spec(spec),
orig_spack=spack.paths.spack_root,
new_spack=view._root,
new_prefixes=prefix_to_projection
)
else:
prefix_to_projection[spack.store.layout.root] = view._root
prefix_to_projection[orig_sbang] = new_sbang
# relocate text
spack.relocate.relocate_text(
files=[dst],
prefixes=prefix_to_projection
orig_layout_root=spack.store.layout.root,
new_layout_root=view._root,
orig_install_prefix=spec.prefix,
new_install_prefix=view.get_projection_for_spec(spec),
orig_spack=spack.paths.spack_root,
new_spack=view._root,
new_prefixes=prefix_to_projection
)


@@ -182,7 +182,7 @@ def _do_fake_install(pkg):
dump_packages(pkg.spec, packages_dir)
def _packages_needed_to_bootstrap_compiler(compiler, architecture, pkgs):
def _packages_needed_to_bootstrap_compiler(pkg):
"""
Return a list of packages required to bootstrap ``pkg``'s compiler
@@ -190,11 +190,7 @@ def _packages_needed_to_bootstrap_compiler(compiler, architecture, pkgs):
matches the package spec.
Args:
compiler (CompilerSpec): the compiler to bootstrap
architecture (ArchSpec): the architecture for which to bootstrap the
compiler
pkgs (list of PackageBase): the packages that may need their compiler
installed
pkg (Package): the package that may need its compiler installed
Return:
(list) list of tuples, (PackageBase, bool), for concretized compiler-
@@ -203,27 +199,21 @@ def _packages_needed_to_bootstrap_compiler(compiler, architecture, pkgs):
(``True``) or one of its dependencies (``False``). The list
will be empty if there are no compilers.
"""
tty.debug('Bootstrapping {0} compiler'.format(compiler))
tty.debug('Bootstrapping {0} compiler for {1}'
.format(pkg.spec.compiler, package_id(pkg)))
compilers = spack.compilers.compilers_for_spec(
compiler, arch_spec=architecture)
pkg.spec.compiler, arch_spec=pkg.spec.architecture)
if compilers:
return []
dep = spack.compilers.pkg_spec_for_compiler(compiler)
# Set the architecture for the compiler package in a way that allows the
# concretizer to back off if needed for the older bootstrapping compiler
dep.constrain('platform=%s' % str(architecture.platform))
dep.constrain('os=%s' % str(architecture.os))
dep.constrain('target=%s:' %
architecture.target.microarchitecture.family.name)
dep = spack.compilers.pkg_spec_for_compiler(pkg.spec.compiler)
dep.architecture = pkg.spec.architecture
# concrete CompilerSpec has less info than concrete Spec
# concretize as Spec to add that information
dep.concretize()
# mark compiler as depended-on by the packages that use it
for pkg in pkgs:
dep._dependents[pkg.name] = spack.spec.DependencySpec(
pkg.spec, dep, ('build',))
# mark compiler as depended-on by the package that uses it
dep._dependents[pkg.name] = spack.spec.DependencySpec(
pkg.spec, dep, ('build',))
packages = [(s.package, False) for
s in dep.traverse(order='post', root=False)]
packages.append((dep.package, True))
@@ -657,21 +647,17 @@ def __str__(self):
return '{0}: {1}; {2}; {3}; {4}'.format(
self.pid, requests, tasks, installed, failed)
def _add_bootstrap_compilers(
self, compiler, architecture, pkgs, request, all_deps):
def _add_bootstrap_compilers(self, pkg, request, all_deps):
"""
Add bootstrap compilers and dependencies to the build queue.
Args:
compiler: the compiler to bootstrap
architecture: the architecture for which to bootstrap the compiler
pkgs (PackageBase): the package with possible compiler dependencies
pkg (PackageBase): the package with possible compiler dependencies
request (BuildRequest): the associated install request
all_deps (defaultdict(set)): dictionary of all dependencies and
associated dependents
"""
packages = _packages_needed_to_bootstrap_compiler(
compiler, architecture, pkgs)
packages = _packages_needed_to_bootstrap_compiler(pkg)
for (comp_pkg, is_compiler) in packages:
if package_id(comp_pkg) not in self.build_tasks:
self._add_init_task(comp_pkg, request, is_compiler, all_deps)
@@ -1011,42 +997,14 @@ def _add_tasks(self, request, all_deps):
'config:install_missing_compilers', False)
install_deps = request.install_args.get('install_deps')
# Bootstrap compilers first
if install_deps and install_compilers:
packages_per_compiler = {}
for dep in request.traverse_dependencies():
dep_pkg = dep.package
compiler = dep_pkg.spec.compiler
arch = dep_pkg.spec.architecture
if compiler not in packages_per_compiler:
packages_per_compiler[compiler] = {}
if arch not in packages_per_compiler[compiler]:
packages_per_compiler[compiler][arch] = []
packages_per_compiler[compiler][arch].append(dep_pkg)
compiler = request.pkg.spec.compiler
arch = request.pkg.spec.architecture
if compiler not in packages_per_compiler:
packages_per_compiler[compiler] = {}
if arch not in packages_per_compiler[compiler]:
packages_per_compiler[compiler][arch] = []
packages_per_compiler[compiler][arch].append(request.pkg)
for compiler, archs in packages_per_compiler.items():
for arch, packages in archs.items():
self._add_bootstrap_compilers(
compiler, arch, packages, request, all_deps)
if install_deps:
for dep in request.traverse_dependencies():
dep_pkg = dep.package
# First push any missing compilers (if requested)
if install_compilers:
self._add_bootstrap_compilers(dep_pkg, request, all_deps)
dep_id = package_id(dep_pkg)
if dep_id not in self.build_tasks:
self._add_init_task(dep_pkg, request, False, all_deps)
@@ -1056,9 +1014,13 @@ def _add_tasks(self, request, all_deps):
# of the spec.
spack.store.db.clear_failure(dep, force=False)
# Push any missing compilers (if requested) as part of the
# package dependencies.
if install_compilers:
self._add_bootstrap_compilers(request.pkg, request, all_deps)
install_package = request.install_args.get('install_package')
if install_package and request.pkg_id not in self.build_tasks:
# Be sure to clear any previous failure
spack.store.db.clear_failure(request.spec, force=True)
@@ -1790,11 +1752,6 @@ def __init__(self, pkg, request, compiler, start, attempts, status,
# to support tracking of parallel, multi-spec, environment installs.
self.dependents = set(get_dependent_ids(self.pkg.spec))
tty.debug(
'Pkg id {0} has the following dependents:'.format(self.pkg_id))
for dep_id in self.dependents:
tty.debug('- {0}'.format(dep_id))
# Set of dependencies
#
# Be consistent wrt use of dependents and dependencies. That is,
@@ -1815,10 +1772,7 @@ def __init__(self, pkg, request, compiler, start, attempts, status,
arch_spec=arch_spec):
# The compiler is in the queue, identify it as dependency
dep = spack.compilers.pkg_spec_for_compiler(compiler_spec)
dep.constrain('platform=%s' % str(arch_spec.platform))
dep.constrain('os=%s' % str(arch_spec.os))
dep.constrain('target=%s:' %
arch_spec.target.microarchitecture.family.name)
dep.architecture = arch_spec
dep.concretize()
dep_id = package_id(dep.package)
self.dependencies.add(dep_id)


@@ -20,9 +20,7 @@
from spack.build_systems.autotools import AutotoolsPackage
from spack.build_systems.cmake import CMakePackage
from spack.build_systems.cuda import CudaPackage
from spack.build_systems.oneapi import IntelOneApiPackage
from spack.build_systems.oneapi import IntelOneApiLibraryPackage
from spack.build_systems.rocm import ROCmPackage
from spack.build_systems.hip import HipPackage
from spack.build_systems.qmake import QMakePackage
from spack.build_systems.maven import MavenPackage
from spack.build_systems.scons import SConsPackage


@@ -6,8 +6,6 @@
import platform
import re
import shutil
import multiprocessing.pool
from ordereddict_backport import OrderedDict
import llnl.util.lang
import llnl.util.tty as tty
@@ -451,26 +449,36 @@ def needs_text_relocation(m_type, m_subtype):
return m_type == 'text'
def _replace_prefix_text(filename, compiled_prefixes):
def _replace_prefix_text(filename, old_dir, new_dir):
"""Replace all the occurrences of the old install prefix with a
new install prefix in text files that are utf-8 encoded.
Args:
filename (str): target text file (utf-8 encoded)
compiled_prefixes (OrderedDict): OrderedDict whose keys are
precompiled regexes of the old prefixes and whose values are the
new prefixes (utf-8 encoded)
old_dir (str): directory to be searched in the file
new_dir (str): substitute for the old directory
"""
# TODO: cache regexes globally to speedup computation
with open(filename, 'rb+') as f:
data = f.read()
f.seek(0)
for orig_prefix_rexp, new_bytes in compiled_prefixes.items():
data = orig_prefix_rexp.sub(new_bytes, data)
f.write(data)
# Replace old_dir with new_dir if it appears at the beginning of a path
# Negative lookbehind for a character legal in a path
# Then a match group for any characters legal in a compiler flag
# Then old_dir
# Then characters legal in a path
# Ensures we only match the old_dir if it's preceded by a flag or by
# characters not legal in a path, but not if it's preceded by other
# components of a path.
old_bytes = old_dir.encode('utf-8')
pat = b'(?<![\\w\\-_/])([\\w\\-_]*?)%s([\\w\\-_/]*)' % old_bytes
repl = b'\\1%s\\2' % new_dir.encode('utf-8')
ndata = re.sub(pat, repl, data)
f.write(ndata)
f.truncate()
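The regex described in the comments above can be exercised on its own. This sketch (a hypothetical `replace_prefix` helper, not Spack's function) shows why the negative lookbehind matters: a prefix attached to a compiler flag like `-I` is rewritten, while the same string appearing as an interior path component is left alone:

```python
import re

def replace_prefix(data: bytes, old_dir: str, new_dir: str) -> bytes:
    """Rewrite old_dir -> new_dir only where old_dir begins a path,
    optionally preceded by flag-like characters (e.g. -I, -L)."""
    old_bytes = old_dir.encode('utf-8')
    # Negative lookbehind for a path character, then flag characters,
    # then the old prefix, then the rest of the path.
    pat = b'(?<![\\w\\-_/])([\\w\\-_]*?)%s([\\w\\-_/]*)' % old_bytes
    repl = b'\\1%s\\2' % new_dir.encode('utf-8')
    return re.sub(pat, repl, data)

# The flag-prefixed occurrence is rewritten; the interior component is not.
print(replace_prefix(b'-I/old/prefix/include', '/old/prefix', '/new/prefix'))
print(replace_prefix(b'/opt/old/prefix/lib', '/old/prefix', '/new/prefix'))
```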
def _replace_prefix_bin(filename, byte_prefixes):
def _replace_prefix_bin(filename, old_dir, new_dir):
"""Replace all the occurrences of the old install prefix with a
new install prefix in binary files.
@@ -479,34 +487,33 @@ def _replace_prefix_bin(filename, byte_prefixes):
Args:
filename (str): target binary file
byte_prefixes (OrderedDict): OrderedDict whose keys are the old
prefixes (as bytes) and whose values are the new prefixes
(utf-8 encoded)
old_dir (str): directory to be searched in the file
new_dir (str): substitute for the old directory
"""
def replace(match):
occurrences = match.group().count(old_dir.encode('utf-8'))
olen = len(old_dir.encode('utf-8'))
nlen = len(new_dir.encode('utf-8'))
padding = (olen - nlen) * occurrences
if padding < 0:
return data
return match.group().replace(
old_dir.encode('utf-8'),
os.sep.encode('utf-8') * padding + new_dir.encode('utf-8')
)
with open(filename, 'rb+') as f:
data = f.read()
f.seek(0)
for orig_bytes, new_bytes in byte_prefixes.items():
original_data_len = len(data)
# Skip this hassle if not found
if orig_bytes not in data:
continue
# We only care about this problem if we are about to replace
length_compatible = len(new_bytes) <= len(orig_bytes)
if not length_compatible:
raise BinaryTextReplaceError(orig_bytes, new_bytes)
pad_length = len(orig_bytes) - len(new_bytes)
padding = os.sep * pad_length
padding = padding.encode('utf-8')
data = data.replace(orig_bytes, new_bytes + padding)
# Really needs to be the same length
if not len(data) == original_data_len:
print('Length of pad:', pad_length, 'should be', len(padding))
print(new_bytes, 'was to replace', orig_bytes)
raise BinaryStringReplacementError(
filename, original_data_len, len(data))
f.write(data)
original_data_len = len(data)
pat = re.compile(old_dir.encode('utf-8'))
if not pat.search(data):
return
ndata = pat.sub(replace, data)
if not len(ndata) == original_data_len:
raise BinaryStringReplacementError(
filename, original_data_len, len(ndata))
f.write(ndata)
f.truncate()
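The padding trick above keeps binary rewrites length-preserving: null-terminated strings inside a binary cannot grow, so a shorter replacement is left-padded with extra `os.sep` characters, which POSIX path resolution ignores. A simplified sketch of the idea (hypothetical `replace_prefix_bin` helper, using a plain `bytes.replace` rather than Spack's regex-based version):

```python
import os

def replace_prefix_bin(data: bytes, old_dir: str, new_dir: str) -> bytes:
    """Replace old_dir with new_dir in binary data without changing its
    length, padding the difference with leading path separators."""
    old, new = old_dir.encode('utf-8'), new_dir.encode('utf-8')
    if len(new) > len(old):
        raise ValueError('new prefix must not be longer than the old one')
    # Pad so the replacement has exactly the same byte length as old.
    padded = os.sep.encode('utf-8') * (len(old) - len(new)) + new
    out = data.replace(old, padded)
    assert len(out) == len(data)  # binary layout must be preserved
    return out

print(replace_prefix_bin(b'A\x00/spack/opt/pkg\x00B',
                         '/spack/opt/pkg', '/view/pkg'))
```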
@@ -779,88 +786,86 @@ def relocate_links(links, orig_layout_root,
tty.warn(msg.format(link_target, abs_link, new_install_prefix))
def relocate_text(files, prefixes, concurrency=32):
"""Relocate text file from the original installation prefix to the
new prefix.
def relocate_text(
files, orig_layout_root, new_layout_root, orig_install_prefix,
new_install_prefix, orig_spack, new_spack, new_prefixes
):
"""Relocate text file from the original ``install_tree`` to the new one.
Relocation also affects the path in Spack's sbang script.
This also handles relocating Spack's sbang scripts to point at the
new install tree.
Args:
files (list): text files to be relocated
orig_layout_root (str): original layout root
new_layout_root (str): new layout root
orig_install_prefix (str): install prefix of the original installation
new_install_prefix (str): install prefix where we want to relocate
orig_spack (str): path to the original Spack
new_spack (str): path to the new Spack
new_prefixes (dict): dictionary that maps the original prefixes to
where they should be relocated
Args:
files (list): Text files to be relocated
prefixes (OrderedDict): String prefixes which need to be changed
concurrency (int): Preferred degree of parallelism
"""
# TODO: reduce the number of arguments (8 seems too much)
# This now needs to be handled by the caller in all cases
# orig_sbang = '#!/bin/bash {0}/bin/sbang'.format(orig_spack)
# new_sbang = '#!/bin/bash {0}/bin/sbang'.format(new_spack)
compiled_prefixes = OrderedDict({})
for orig_prefix, new_prefix in prefixes.items():
if orig_prefix != new_prefix:
orig_bytes = orig_prefix.encode('utf-8')
orig_prefix_rexp = re.compile(
b'(?<![\\w\\-_/])([\\w\\-_]*?)%s([\\w\\-_/]*)' % orig_bytes)
new_bytes = b'\\1%s\\2' % new_prefix.encode('utf-8')
compiled_prefixes[orig_prefix_rexp] = new_bytes
# This is vestigial code for the *old* location of sbang. Previously,
# sbang was a bash script, and it lived in the spack prefix. It is
# now a POSIX script that lives in the install prefix. Old packages
# will have the old sbang location in their shebangs.
import spack.hooks.sbang as sbang
orig_sbang = '#!/bin/bash {0}/bin/sbang'.format(orig_spack)
new_sbang = sbang.sbang_shebang_line()
# Do relocations on text that refers to the install tree
# multiprocessing.pool.ThreadPool.map requires a single argument
args = []
for filename in files:
args.append((filename, compiled_prefixes))
_replace_prefix_text(filename, orig_install_prefix, new_install_prefix)
for orig_dep_prefix, new_dep_prefix in new_prefixes.items():
_replace_prefix_text(filename, orig_dep_prefix, new_dep_prefix)
_replace_prefix_text(filename, orig_layout_root, new_layout_root)
tp = multiprocessing.pool.ThreadPool(processes=concurrency)
try:
tp.map(llnl.util.lang.star(_replace_prefix_text), args)
finally:
tp.terminate()
tp.join()
# Point old packages at the new sbang location. Packages that
# already use the new sbang location will already have been
# handled by the prior call to _replace_prefix_text
_replace_prefix_text(filename, orig_sbang, new_sbang)
def relocate_text_bin(binaries, prefixes, concurrency=32):
def relocate_text_bin(
binaries, orig_install_prefix, new_install_prefix,
orig_spack, new_spack, new_prefixes
):
"""Replace null terminated path strings hard coded into binaries.
The new install prefix must be shorter than the original one.
Args:
binaries (list): binaries to be relocated
prefixes (OrderedDict): String prefixes which need to be changed.
concurrency (int): Desired degree of parallelism.
orig_install_prefix (str): install prefix of the original installation
new_install_prefix (str): install prefix where we want to relocate
orig_spack (str): path to the original Spack
new_spack (str): path to the new Spack
new_prefixes (dict): dictionary that maps the original prefixes to
where they should be relocated
Raises:
BinaryTextReplaceError: when the new path is longer than the old path
"""
byte_prefixes = OrderedDict({})
for orig_prefix, new_prefix in prefixes.items():
if orig_prefix != new_prefix:
if isinstance(orig_prefix, bytes):
orig_bytes = orig_prefix
else:
orig_bytes = orig_prefix.encode('utf-8')
if isinstance(new_prefix, bytes):
new_bytes = new_prefix
else:
new_bytes = new_prefix.encode('utf-8')
byte_prefixes[orig_bytes] = new_bytes
# Do relocations on text in binaries that refers to the install tree
# multiprocessing.pool.ThreadPool.map requires a single argument
args = []
# Raise if the new install prefix is longer than the
# original one, since it means we can't change the original
# binary to relocate it
new_prefix_is_shorter = len(new_install_prefix) <= len(orig_install_prefix)
if not new_prefix_is_shorter and len(binaries) > 0:
raise BinaryTextReplaceError(orig_install_prefix, new_install_prefix)
for binary in binaries:
args.append((binary, byte_prefixes))
for old_dep_prefix, new_dep_prefix in new_prefixes.items():
if len(new_dep_prefix) <= len(old_dep_prefix):
_replace_prefix_bin(binary, old_dep_prefix, new_dep_prefix)
_replace_prefix_bin(binary, orig_install_prefix, new_install_prefix)
tp = multiprocessing.pool.ThreadPool(processes=concurrency)
try:
tp.map(llnl.util.lang.star(_replace_prefix_bin), args)
finally:
tp.terminate()
tp.join()
# Note: Replacement of spack directory should not be done. This causes
# an incorrect replacement path in the case where the install root is a
# subdirectory of the spack directory.
def is_relocatable(spec):


@@ -4,6 +4,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This module contains jsonschema files for all of Spack's YAML formats."""
import copy
import re
import six
import llnl.util.lang
@@ -15,6 +18,45 @@
# and increases the start-up time
def _make_validator():
import jsonschema
_validate_properties = jsonschema.Draft4Validator.VALIDATORS["properties"]
_validate_pattern_properties = jsonschema.Draft4Validator.VALIDATORS[
"patternProperties"
]
def _set_defaults(validator, properties, instance, schema):
"""Adds support for the 'default' attribute in 'properties'.
``jsonschema`` does not handle this out of the box -- it only
validates. This allows us to set default values for configs
where certain fields are `None` b/c they're deleted or
commented out.
"""
for property, subschema in six.iteritems(properties):
if "default" in subschema:
instance.setdefault(
property, copy.deepcopy(subschema["default"]))
for err in _validate_properties(
validator, properties, instance, schema):
yield err
def _set_pp_defaults(validator, properties, instance, schema):
"""Adds support for the 'default' attribute in 'patternProperties'.
``jsonschema`` does not handle this out of the box -- it only
validates. This allows us to set default values for configs
where certain fields are `None` b/c they're deleted or
commented out.
"""
for property, subschema in six.iteritems(properties):
if "default" in subschema:
if isinstance(instance, dict):
for key, val in six.iteritems(instance):
if re.match(property, key) and val is None:
instance[key] = copy.deepcopy(subschema["default"])
for err in _validate_pattern_properties(
validator, properties, instance, schema):
yield err
def _validate_spec(validator, is_spec, instance, schema):
"""Check if the attributes on instance are valid specs."""
@@ -59,6 +101,8 @@ def _deprecated_properties(validator, deprecated, instance, schema):
return jsonschema.validators.extend(
jsonschema.Draft4Validator, {
"validate_spec": _validate_spec,
"properties": _set_defaults,
"patternProperties": _set_pp_defaults,
"deprecatedProperties": _deprecated_properties
}
)
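The extended validator above injects `default` values into the config instance while validating, so fields that are deleted or commented out come back as their schema defaults. The same effect, sketched without a `jsonschema` dependency (hypothetical `fill_defaults` helper and toy schema, not Spack's actual schema files):

```python
import copy

def fill_defaults(instance, schema):
    """Apply 'default' entries from a JSON-schema-like dict, mirroring
    what the extended validator does during validation: missing or None
    fields are replaced by a deep copy of their declared default."""
    for prop, subschema in schema.get('properties', {}).items():
        if 'default' in subschema and instance.get(prop) is None:
            instance[prop] = copy.deepcopy(subschema['default'])
        if isinstance(instance.get(prop), dict):
            fill_defaults(instance[prop], subschema)
    return instance

schema = {'properties': {'concretization': {'default': 'separately'},
                         'develop': {'default': {}}}}
# 'concretization' was commented out (None); 'develop' is absent entirely.
print(fill_defaults({'concretization': None}, schema))
```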

File diff suppressed because it is too large


@@ -1,5 +1,5 @@
%=============================================================================
% This logic program implements Spack's concretizer
% Generate
%=============================================================================
%-----------------------------------------------------------------------------
@@ -21,15 +21,7 @@ version_weight(Package, Weight)
version_weight(Package, Weight)
:- version(Package, Version), preferred_version_declared(Package, Version, Weight).
% version_satisfies implies that exactly one of the satisfying versions
% is the package's version, and vice versa.
1 { version(Package, Version) : version_satisfies(Package, Constraint, Version) } 1
:- version_satisfies(Package, Constraint).
version_satisfies(Package, Constraint)
:- version(Package, Version), version_satisfies(Package, Constraint, Version).
#defined preferred_version_declared/3.
#defined version_satisfies/3.
%-----------------------------------------------------------------------------
% Dependency semantics
@@ -40,148 +32,34 @@ depends_on(Package, Dependency) :- depends_on(Package, Dependency, _).
% declared dependencies are real if they're not virtual AND
% the package is not an external
depends_on(Package, Dependency, Type)
:- dependency_conditions(Package, Dependency, Type),
:- declared_dependency(Package, Dependency, Type),
node(Package),
not virtual(Dependency),
not external(Package).
% every root must be a node
node(Package) :- root(Package).
% if you declare a dependency on a virtual AND the package is not an external,
% you depend on one of its providers
1 {
depends_on(Package, Provider, Type)
: provides_virtual(Provider, Virtual)
} 1
:- declared_dependency(Package, Virtual, Type),
virtual(Virtual),
not external(Package),
node(Package).
% dependencies imply new nodes
node(Dependency) :- node(Package), depends_on(Package, Dependency).
% all nodes in the graph must be reachable from some root
% this ensures a user can't say `zlib ^libiconv` (neither of which have any
% dependencies) and get a two-node unconnected graph
needed(Package) :- root(Package).
needed(Dependency) :- needed(Package), depends_on(Package, Dependency).
:- node(Package), not needed(Package).
% Avoid cycles in the DAG
% some combinations of conditional dependencies can result in cycles;
% this ensures that we solve around them
path(Parent, Child) :- depends_on(Parent, Child).
path(Parent, Descendant) :- path(Parent, A), depends_on(A, Descendant).
:- path(A, B), path(B, A).
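The `path/2` rules above compute the transitive closure of `depends_on`, and the integrity constraint rejects any model in which two packages reach each other. An imperative sketch of the same check (hypothetical `has_cycle` helper on an adjacency dict, not part of the solver):

```python
def has_cycle(depends_on):
    """depends_on: dict mapping package -> set of direct dependencies.
    Computes the transitive closure (the 'path' relation above) by
    fixpoint iteration, then reports mutual reachability."""
    path = {p: set(deps) for p, deps in depends_on.items()}
    changed = True
    while changed:  # repeat the path rules until nothing new is derived
        changed = False
        for parent, reach in path.items():
            for mid in list(reach):
                for dest in path.get(mid, ()):
                    if dest not in reach:
                        reach.add(dest)
                        changed = True
    # :- path(A, B), path(B, A).  -- any mutually reachable pair is a cycle
    return any(b in path.get(a, ()) and a in path.get(b, ())
               for a in path for b in path)

print(has_cycle({'a': {'b'}, 'b': {'c'}, 'c': {'a'}}))   # True
print(has_cycle({'a': {'b'}, 'b': {'c'}, 'c': set()}))   # False
```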
%-----------------------------------------------------------------------------
% Conditional dependencies
%
% This takes care of `when=SPEC` in `depends_on("foo@1.0+bar", when="SPEC")`.
%-----------------------------------------------------------------------------
% if any individual condition below is true, trigger the dependency.
dependency_conditions(Package, Dependency, Type) :-
dependency_conditions_hold(ID, Package, Dependency),
dependency_type(ID, Type).
#defined dependency_type/2.
% collect all the dependency conditions into a single conditional rule
% distinguishing between Parent and Package (Arg1) is needed to account
% for conditions like:
%
% depends_on('patchelf@0.9', when='@1.0:1.1 ^python@:2')
%
% that include dependencies
dependency_conditions_hold(ID, Parent, Dependency) :-
attr(Name, Arg1) : required_dependency_condition(ID, Name, Arg1);
attr(Name, Arg1, Arg2) : required_dependency_condition(ID, Name, Arg1, Arg2);
attr(Name, Arg1, Arg2, Arg3) : required_dependency_condition(ID, Name, Arg1, Arg2, Arg3);
dependency_condition(ID, Parent, Dependency);
% There must be at least a dependency type declared,
% otherwise the dependency doesn't hold
dependency_type(ID, _);
node(Parent);
not external(Parent).
#defined dependency_condition/3.
#defined required_dependency_condition/3.
#defined required_dependency_condition/4.
#defined required_dependency_condition/5.
%-----------------------------------------------------------------------------
% Imposed constraints on dependencies
%
% This handles the `@1.0+bar` in `depends_on("foo@1.0+bar", when="SPEC")`, or
% the `mpi@2:` in `provides("mpi@2:", when="@1.9:")`.
%-----------------------------------------------------------------------------
% NOTE: `attr(Name, Arg1)` is omitted here b/c the only single-arg attribute is
% NOTE: `node()`, which is handled above under "Dependency Semantics"
attr(Name, Arg1, Arg2) :-
dependency_conditions_hold(ID, Package, Dependency),
imposed_dependency_condition(ID, Name, Arg1, Arg2).
attr(Name, Arg1, Arg2, Arg3) :-
dependency_conditions_hold(ID, Package, Dependency),
imposed_dependency_condition(ID, Name, Arg1, Arg2, Arg3).
#defined imposed_dependency_condition/4.
#defined imposed_dependency_condition/5.
%-----------------------------------------------------------------------------
% Conflicts
%-----------------------------------------------------------------------------
:- not external(Package) : conflict_condition(ID, "node", Package);
attr(Name, Arg1) : conflict_condition(ID, Name, Arg1);
attr(Name, Arg1, Arg2) : conflict_condition(ID, Name, Arg1, Arg2);
attr(Name, Arg1, Arg2, Arg3) : conflict_condition(ID, Name, Arg1, Arg2, Arg3);
conflict(ID, Package).
#defined conflict/2.
#defined conflict_condition/3.
#defined conflict_condition/4.
#defined conflict_condition/5.
%-----------------------------------------------------------------------------
% Virtual dependencies
%-----------------------------------------------------------------------------
% if a package depends on a virtual, it's not external and we have a
% provider for that virtual then it depends on the provider
depends_on(Package, Provider, Type)
:- dependency_conditions(Package, Virtual, Type),
provides_virtual(Provider, Virtual),
not external(Package).
% if there's a virtual node, we must select one provider
1 { provides_virtual(Package, Virtual) : possible_provider(Package, Virtual) } 1
% if a virtual was required by some root spec, one provider is in the DAG
1 { node(Package) : provides_virtual(Package, Virtual) } 1
:- virtual_node(Virtual).
% virtual roots imply virtual nodes, and that one provider is a root
virtual_node(Virtual) :- virtual_root(Virtual).
1 { root(Package) : provides_virtual(Package, Virtual) } 1
:- virtual_root(Virtual).
% all virtual providers come from provider conditions like this
dependency_conditions_hold(ID, Provider, Virtual) :-
attr(Name, Arg1) : required_provider_condition(ID, Name, Arg1);
attr(Name, Arg1, Arg2) : required_provider_condition(ID, Name, Arg1, Arg2);
attr(Name, Arg1, Arg2, Arg3) : required_provider_condition(ID, Name, Arg1, Arg2, Arg3);
virtual(Virtual);
provider_condition(ID, Provider, Virtual).
% The provider provides the virtual if some provider condition holds.
provides_virtual(Provider, Virtual) :-
provider_condition(ID, Provider, Virtual),
dependency_conditions_hold(ID, Provider, Virtual),
virtual(Virtual).
% a node that provides a virtual is a provider
provider(Package, Virtual)
:- node(Package), provides_virtual(Package, Virtual).
% dependencies on virtuals also imply that the virtual is a virtual node
virtual_node(Virtual)
:- dependency_conditions(Package, Virtual, Type),
virtual(Virtual), not external(Package).
% for any virtual, there can be at most one provider in the DAG
0 { node(Package) : provides_virtual(Package, Virtual) } 1 :- virtual(Virtual).
0 { provider(Package, Virtual) :
node(Package), provides_virtual(Package, Virtual) } 1 :- virtual(Virtual).
%-----------------------------------------------------------------------------
% Virtual dependency weights
%-----------------------------------------------------------------------------
% give dependents the virtuals they want
provider_weight(Dependency, 0)
:- virtual(Virtual), depends_on(Package, Dependency),
@@ -222,52 +100,23 @@ provider_weight(Package, 100)
provider(Package, Virtual),
not default_provider_preference(Virtual, Package, _).
#defined provider_condition/3.
#defined required_provider_condition/3.
#defined required_provider_condition/4.
#defined required_provider_condition/5.
% all nodes must be reachable from some root
node(Package) :- root(Package).
%-----------------------------------------------------------------------------
% Spec Attributes
%-----------------------------------------------------------------------------
% Equivalencies of the form:
%
% name(Arg1, Arg2, ...) :- attr("name", Arg1, Arg2, ...).
% attr("name", Arg1, Arg2, ...) :- name(Arg1, Arg2, ...).
%
% These allow us to easily define conditional dependency and conflict rules
% without enumerating all spec attributes every time.
node(Package) :- attr("node", Package).
version(Package, Version) :- attr("version", Package, Version).
version_satisfies(Package, Constraint) :- attr("version_satisfies", Package, Constraint).
node_platform(Package, Platform) :- attr("node_platform", Package, Platform).
node_os(Package, OS) :- attr("node_os", Package, OS).
node_target(Package, Target) :- attr("node_target", Package, Target).
node_target_satisfies(Package, Target) :- attr("node_target_satisfies", Package, Target).
variant_value(Package, Variant, Value) :- attr("variant_value", Package, Variant, Value).
variant_set(Package, Variant, Value) :- attr("variant_set", Package, Variant, Value).
node_flag(Package, FlagType, Flag) :- attr("node_flag", Package, FlagType, Flag).
node_compiler(Package, Compiler) :- attr("node_compiler", Package, Compiler).
node_compiler_version(Package, Compiler, Version)
:- attr("node_compiler_version", Package, Compiler, Version).
node_compiler_version_satisfies(Package, Compiler, Version)
:- attr("node_compiler_version_satisfies", Package, Compiler, Version).
1 { root(Package) : provides_virtual(Package, Virtual) } 1
:- virtual_root(Virtual).
attr("node", Package) :- node(Package).
attr("version", Package, Version) :- version(Package, Version).
attr("version_satisfies", Package, Constraint) :- version_satisfies(Package, Constraint).
attr("node_platform", Package, Platform) :- node_platform(Package, Platform).
attr("node_os", Package, OS) :- node_os(Package, OS).
attr("node_target", Package, Target) :- node_target(Package, Target).
attr("node_target_satisfies", Package, Target) :- node_target_satisfies(Package, Target).
attr("variant_value", Package, Variant, Value) :- variant_value(Package, Variant, Value).
attr("variant_set", Package, Variant, Value) :- variant_set(Package, Variant, Value).
attr("node_flag", Package, FlagType, Flag) :- node_flag(Package, FlagType, Flag).
attr("node_compiler", Package, Compiler) :- node_compiler(Package, Compiler).
attr("node_compiler_version", Package, Compiler, Version)
:- node_compiler_version(Package, Compiler, Version).
attr("node_compiler_version_satisfies", Package, Compiler, Version)
:- node_compiler_version_satisfies(Package, Compiler, Version).
needed(Package) :- root(Package).
needed(Dependency) :- needed(Package), depends_on(Package, Dependency).
:- node(Package), not needed(Package).
% real dependencies imply new nodes.
node(Dependency) :- node(Package), depends_on(Package, Dependency).
% Avoid cycles in the DAG
path(Parent, Child) :- depends_on(Parent, Child).
path(Parent, Descendant) :- path(Parent, A), depends_on(A, Descendant).
:- path(A, B), path(B, A).
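The path/2 rules above compute the transitive closure of depends_on edges, and the integrity constraint rejects any pair reachable in both directions. A hypothetical Python sketch of the same check (illustration only, not Spack code) looks like:

```python
def has_cycle(depends_on):
    """depends_on: a set of (parent, child) edges.

    Mirrors the ASP rules: path/2 is the transitive closure of
    depends_on/2, and ':- path(A, B), path(B, A).' forbids cycles.
    """
    path = set(depends_on)
    changed = True
    while changed:  # fixed-point iteration, like rule saturation in ASP
        changed = False
        for (a, b) in list(path):
            for (c, d) in depends_on:
                if b == c and (a, d) not in path:
                    path.add((a, d))
                    changed = True
    # a cycle exists if some pair is reachable in both directions
    return any((b, a) in path for (a, b) in path)
```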
% do not warn if the generated program contains none of these.
#defined depends_on/3.
@@ -282,8 +131,6 @@ attr("node_compiler_version_satisfies", Package, Compiler, Version)
#defined external_only/1.
#defined pkg_provider_preference/4.
#defined default_provider_preference/3.
#defined version_satisfies/2.
#defined node_compiler_version_satisfies/3.
#defined root/1.
%-----------------------------------------------------------------------------
@@ -308,30 +155,9 @@ external(Package) :-
version(Package, Version), version_weight(Package, Weight),
external_version_declared(Package, Version, Weight, ID).
% determine if an external spec has been selected
external_spec_selected(ID, Package, LocalIndex) :-
external_spec(Package, ID) :-
version(Package, Version), version_weight(Package, Weight),
external_spec_index(ID, Package, LocalIndex),
external_version_declared(Package, Version, Weight, LocalIndex),
external_spec_conditions_hold(ID, Package).
% determine if all the conditions on an external spec hold. If they do
% the spec can be selected.
external_spec_conditions_hold(ID, Package) :-
attr(Name, Arg1) : external_spec_condition(ID, Name, Arg1);
attr(Name, Arg1, Arg2) : external_spec_condition(ID, Name, Arg1, Arg2);
attr(Name, Arg1, Arg2, Arg3) : external_spec_condition(ID, Name, Arg1, Arg2, Arg3);
external_spec(ID, Package);
node(Package).
% it cannot happen that a spec is external, but none of the external specs
% conditions hold.
:- external(Package), not external_spec_conditions_hold(_, Package).
#defined external_spec_index/3.
#defined external_spec_condition/3.
#defined external_spec_condition/4.
#defined external_spec_condition/5.
external_version_declared(Package, Version, Weight, ID).
%-----------------------------------------------------------------------------
% Variant semantics
@@ -357,9 +183,6 @@ external_spec_conditions_hold(ID, Package) :-
% if a variant is set to anything, it is considered 'set'.
variant_set(Package, Variant) :- variant_set(Package, Variant, _).
% A variant cannot have a value that is not also a possible value
:- variant_value(Package, Variant, Value), not variant_possible_value(Package, Variant, Value).
% variant_set is an explicitly set variant value. If a variant is not 'set',
% we revert to its default value; if it is set, we force the set value.
variant_value(Package, Variant, Value)
@@ -371,7 +194,6 @@ variant_value(Package, Variant, Value)
variant_not_default(Package, Variant, Value, 1)
:- variant_value(Package, Variant, Value),
not variant_default_value(Package, Variant, Value),
not variant_set(Package, Variant, Value),
node(Package).
variant_not_default(Package, Variant, Value, 0)
@@ -379,12 +201,6 @@ variant_not_default(Package, Variant, Value, 0)
variant_default_value(Package, Variant, Value),
node(Package).
variant_not_default(Package, Variant, Value, 0)
:- variant_value(Package, Variant, Value),
variant_set(Package, Variant, Value),
node(Package).
% The default value for a variant in a package is what is written
% in the package.py file, unless some preference is set in packages.yaml
variant_default_value(Package, Variant, Value)
@@ -399,16 +215,6 @@ variant_default_value(Package, Variant, Value)
:- 2 {variant_value(Package, Variant, Value): variant_possible_value(Package, Variant, Value)},
variant_value(Package, Variant, "none").
% patches and dev_path are special variants -- they don't have to be
% declared in the package, so we just allow them to spring into existence
% when assigned a value.
auto_variant("dev_path").
auto_variant("patches").
variant(Package, Variant)
:- variant_set(Package, Variant, _), auto_variant(Variant).
variant_single_value(Package, "dev_path")
:- variant_set(Package, "dev_path", _).
% suppress warnings about this atom being unset. It's only set if some
% spec or some package sets it, and without this, clingo will give
% warnings like 'info: atom does not occur in any rule head'.
@@ -423,9 +229,9 @@ variant_single_value(Package, "dev_path")
%-----------------------------------------------------------------------------
% Platform semantics
%-----------------------------------------------------------------------------
% one platform per node
:- M = #count { Platform : node_platform(Package, Platform) }, M !=1, node(Package).
1 { node_platform(Package, Platform) : node_platform(Packagee, Platform) } 1
:- node(Package).
% if no platform is set, fall back to the default
node_platform(Package, Platform)
@@ -473,13 +279,6 @@ node_os(Package, OS)
% one target per node -- optimization will pick the "best" one
1 { node_target(Package, Target) : target(Target) } 1 :- node(Package).
% node_target_satisfies semantics
1 { node_target(Package, Target) : node_target_satisfies(Package, Constraint, Target) } 1
:- node_target_satisfies(Package, Constraint).
node_target_satisfies(Package, Constraint)
:- node_target(Package, Target), node_target_satisfies(Package, Constraint, Target).
#defined node_target_satisfies/3.
% The target weight is either the default target weight
% or a more specific per-package weight if set
target_weight(Target, Package, Weight)
@@ -515,21 +314,16 @@ node_target_weight(Package, Weight)
target_weight(Target, Package, Weight).
% compatibility rules for targets among nodes
node_target_match_pref(Package, Target) :- node_target_set(Package, Target).
node_target_match_pref(Dependency, Target)
:- depends_on(Package, Dependency),
node_target_match_pref(Package, Target),
not node_target_set(Dependency, _).
node_target_match_pref(Dependency, Target)
:- depends_on(Package, Dependency),
node_target_set(Package, Target),
not node_target_match_pref(Package, Target),
:- depends_on(Package, Dependency), node_target_match_pref(Package, Target),
not node_target_set(Dependency, _).
node_target_match_pref(Dependency, Target)
:- depends_on(Package, Dependency),
root(Package), node_target(Package, Target),
not node_target_match_pref(Package, _).
not node_target_match_pref(Package, _),
not node_target_set(Dependency, _).
node_target_match(Package, 1)
:- node_target(Package, Target), node_target_match_pref(Package, Target).
@@ -544,36 +338,17 @@ derive_target_from_parent(Parent, Package)
%-----------------------------------------------------------------------------
% Compiler semantics
%-----------------------------------------------------------------------------
compiler(Compiler) :- compiler_version(Compiler, _).
% There must be only one compiler set per node. The compiler
% is chosen among available versions.
% one compiler per node
1 { node_compiler(Package, Compiler) : compiler(Compiler) } 1 :- node(Package).
1 { node_compiler_version(Package, Compiler, Version)
: compiler_version(Compiler, Version) } 1 :- node(Package).
% Sometimes we just need to know the compiler and not the version
node_compiler(Package, Compiler) :- node_compiler_version(Package, Compiler, _).
% We can't have a compiler be enforced and select the version from another compiler
:- node_compiler(Package, Compiler1),
node_compiler_version(Package, Compiler2, _),
Compiler1 != Compiler2.
% define node_compiler_version_satisfies/3 from node_compiler_version_satisfies/4
% version_satisfies implies that exactly one of the satisfying versions
% is the package's version, and vice versa.
1 { node_compiler_version(Package, Compiler, Version)
: node_compiler_version_satisfies(Package, Compiler, Constraint, Version) } 1
:- node_compiler_version_satisfies(Package, Compiler, Constraint).
node_compiler_version_satisfies(Package, Compiler, Constraint)
:- node_compiler_version(Package, Compiler, Version),
node_compiler_version_satisfies(Package, Compiler, Constraint, Version).
#defined node_compiler_version_satisfies/4.
1 { compiler_weight(Package, Weight) : compiler_weight(Package, Weight) } 1
:- node(Package).
% If the compiler version was set from the command line,
% respect it verbatim
node_compiler_version(Package, Compiler, Version) :- node_compiler_version_set(Package, Compiler, Version).
node_compiler_version(Package, Compiler, Version) :- node_compiler_version_hard(Package, Compiler, Version).
% Cannot select a compiler if it is not supported on the OS
% Compilers that are explicitly marked as allowed
@@ -587,7 +362,7 @@ node_compiler_version(Package, Compiler, Version) :- node_compiler_version_set(P
% Compiler prescribed in the root spec
node_compiler_version_match_pref(Package, Compiler, V)
:- node_compiler_set(Package, Compiler),
:- node_compiler_hard(Package, Compiler),
node_compiler_version(Package, Compiler, V),
not external(Package).
@@ -596,34 +371,37 @@ node_compiler_version_match_pref(Dependency, Compiler, V)
:- depends_on(Package, Dependency),
node_compiler_version_match_pref(Package, Compiler, V),
node_compiler_version(Dependency, Compiler, V),
not node_compiler_set(Dependency, Compiler).
not node_compiler_hard(Dependency, Compiler).
% Compiler inherited from the root package
node_compiler_version_match_pref(Dependency, Compiler, V)
:- depends_on(Package, Dependency),
node_compiler_version(Package, Compiler, V), root(Package),
node_compiler_version(Dependency, Compiler, V),
not node_compiler_set(Dependency, Compiler).
not node_compiler_hard(Dependency, Compiler).
compiler_version_match(Package, 1)
:- node_compiler_version(Package, Compiler, V),
node_compiler_version_match_pref(Package, Compiler, V).
#defined node_compiler_set/2.
#defined node_compiler_version_set/3.
#defined node_compiler_hard/2.
#defined node_compiler_version_hard/3.
#defined compiler_supports_os/3.
#defined allow_compiler/2.
% compilers weighted by preference according to packages.yaml
compiler_weight(Package, Weight)
:- node_compiler_version(Package, Compiler, V),
:- node_compiler(Package, Compiler),
node_compiler_version(Package, Compiler, V),
node_compiler_preference(Package, Compiler, V, Weight).
compiler_weight(Package, Weight)
:- node_compiler_version(Package, Compiler, V),
:- node_compiler(Package, Compiler),
node_compiler_version(Package, Compiler, V),
not node_compiler_preference(Package, Compiler, V, _),
default_compiler_preference(Compiler, V, Weight).
compiler_weight(Package, 100)
:- node_compiler_version(Package, Compiler, Version),
:- node_compiler(Package, Compiler),
node_compiler_version(Package, Compiler, Version),
not node_compiler_preference(Package, Compiler, Version, _),
not default_compiler_preference(Compiler, Version, _).
@@ -657,6 +435,7 @@ node_flag_source(Dependency, Q)
node_flag(Package, FlagType, Flag)
:- not node_flag_set(Package),
compiler_version_flag(Compiler, Version, FlagType, Flag),
node_compiler(Package, Compiler),
node_compiler_version(Package, Compiler, Version),
flag_type(FlagType),
compiler(Compiler),
@@ -665,6 +444,7 @@ node_flag(Package, FlagType, Flag)
node_flag_compiler_default(Package)
:- not node_flag_set(Package),
compiler_version_flag(Compiler, Version, FlagType, Flag),
node_compiler(Package, Compiler),
node_compiler_version(Package, Compiler, Version),
flag_type(FlagType),
compiler(Compiler),
@@ -721,34 +501,44 @@ root(Dependency, 1) :- not root(Dependency), node(Dependency).
% need to maximize their number below to ensure they're all set
#maximize {
1@13,Package,Variant,Value
: variant_not_default(Package, Variant, Value, Weight),
not variant_single_value(Package, Variant),
root(Package)
: variant_not_default(Package, Variant, Value, Weight), root(Package)
}.
#minimize{
Weight@13,Provider
: provider_weight(Provider, Weight), root(Provider)
}.
% Try to use default variants or variants that have been set
#minimize {
Weight@11,Package,Variant,Value
: variant_not_default(Package, Variant, Value, Weight), not root(Package)
}.
% Minimize the weights of the providers, i.e. use as much as
% possible the most preferred providers
% Next, we want to minimize the weights of the providers
% i.e. use as much as possible the most preferred providers
#minimize{
Weight@9,Provider
Weight@11,Provider
: provider_weight(Provider, Weight), not root(Provider)
}.
% For external packages it's more important than for others
% to match the compiler with their parent node
#maximize{
Weight@10,Package
: compiler_version_match(Package, Weight), external(Package)
}.
% Then try to use as much as possible:
% 1. Default variants
% 2. Latest versions
% of all the other nodes in the DAG
#minimize {
Weight@9,Package,Variant,Value
: variant_not_default(Package, Variant, Value, Weight), not root(Package)
}.
% If the value is a multivalued variant there could be multiple
% values set as default. Since a default value has a weight of 0 we
% need to maximize their number below to ensure they're all set
#maximize {
1@8,Package,Variant,Value
: variant_not_default(Package, Variant, Value, Weight),
not variant_single_value(Package, Variant),
not root(Package)
: variant_not_default(Package, Variant, Value, Weight), not root(Package)
}.
#minimize{
Weight@8,Package : version_weight(Package, Weight)
}.
% Try to maximize the number of compiler matches in the DAG,
@@ -758,15 +548,10 @@ root(Dependency, 1) :- not root(Dependency), node(Dependency).
#minimize{ 1@7,Package : node(Package) }.
#maximize{ Weight@7,Package : compiler_version_match(Package, Weight) }.
% Choose more recent versions for nodes
#minimize{
Weight@6,Package : version_weight(Package, Weight)
}.
% Try to use preferred compilers
#minimize{ Weight@5,Package : compiler_weight(Package, Weight) }.
#minimize{ Weight@6,Package : compiler_weight(Package, Weight) }.
% Maximize the number of matches for targets in the DAG, try
% to select the preferred target.
#maximize{ Weight@4,Package : node_target_match(Package, Weight) }.
#minimize{ Weight@3,Package : node_target_weight(Package, Weight) }.
#maximize{ Weight@5,Package : node_target_match(Package, Weight) }.
#minimize{ Weight@4,Package : node_target_weight(Package, Weight) }.


@@ -24,4 +24,4 @@
#show compiler_weight/2.
#show node_target_match/2.
#show node_target_weight/2.
#show external_spec_selected/3.
#show external_spec/2.


@@ -359,11 +359,13 @@ def satisfies(self, other, strict=False):
return False
# Check target
return self.target_satisfies(other, strict=strict)
return self._satisfies_target(other.target, strict=strict)
def target_satisfies(self, other, strict):
need_to_check = bool(other.target) if strict or self.concrete \
else bool(other.target and self.target)
def _satisfies_target(self, other_target, strict):
self_target = self.target
need_to_check = bool(other_target) if strict or self.concrete \
else bool(other_target and self_target)
# If there's no need to check we are fine
if not need_to_check:
@@ -373,68 +375,24 @@ def target_satisfies(self, other, strict):
if self.target is None:
return False
return bool(self.target_intersection(other))
for target_range in str(other_target).split(','):
t_min, sep, t_max = target_range.partition(':')
def target_constrain(self, other):
if not other.target_satisfies(self, strict=False):
raise UnsatisfiableArchitectureSpecError(self, other)
# Checking against a single specific target
if not sep and self_target == t_min:
return True
if self.target_concrete:
return False
elif other.target_concrete:
self.target = other.target
return True
if not sep and self_target != t_min:
return False
# Compute the intersection of every combination of ranges in the lists
results = self.target_intersection(other)
# Do we need to dedupe here?
self.target = ','.join(results)
# Check against a range
min_ok = self_target.microarchitecture >= t_min if t_min else True
max_ok = self_target.microarchitecture <= t_max if t_max else True
def target_intersection(self, other):
results = []
if min_ok and max_ok:
return True
if not self.target or not other.target:
return results
for s_target_range in str(self.target).split(','):
s_min, s_sep, s_max = s_target_range.partition(':')
for o_target_range in str(other.target).split(','):
o_min, o_sep, o_max = o_target_range.partition(':')
if not s_sep:
# s_target_range is a concrete target
# get a microarchitecture reference for at least one side
# of each comparison so we can use archspec comparators
s_comp = spack.architecture.Target(s_min).microarchitecture
if not o_sep:
if s_min == o_min:
results.append(s_min)
elif (not o_min or s_comp >= o_min) and (
not o_max or s_comp <= o_max):
results.append(s_min)
elif not o_sep:
# "cast" to microarchitecture
o_comp = spack.architecture.Target(o_min).microarchitecture
if (not s_min or o_comp >= s_min) and (
not s_max or o_comp <= s_max):
results.append(o_min)
else:
# Take intersection of two ranges
# Lots of comparisons needed
_s_min = spack.architecture.Target(s_min).microarchitecture
_s_max = spack.architecture.Target(s_max).microarchitecture
_o_min = spack.architecture.Target(o_min).microarchitecture
_o_max = spack.architecture.Target(o_max).microarchitecture
n_min = s_min if _s_min >= _o_min else o_min
n_max = s_max if _s_max <= _o_max else o_max
_n_min = spack.architecture.Target(n_min).microarchitecture
_n_max = spack.architecture.Target(n_max).microarchitecture
if _n_min == _n_max:
results.append(n_min)
elif not n_min or not n_max or _n_min < _n_max:
results.append('%s:%s' % (n_min, n_max))
return results
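The range-intersection logic above relies on archspec's microarchitecture comparators to order targets. The core idea can be sketched over plain integer endpoints (a simplified, hypothetical helper, not the Spack implementation):

```python
def intersect_ranges(a, b):
    """Intersect two closed ranges given as (lo, hi) tuples of integers,
    where None stands for an open endpoint (like 'x86_64:' or ':haswell').
    Returns the overlapping range, or None if the intersection is empty."""
    a_lo, a_hi = a
    b_lo, b_hi = b
    # take the tighter bound on each side, treating None as unbounded
    lo = a_lo if (b_lo is None or (a_lo is not None and a_lo >= b_lo)) else b_lo
    hi = a_hi if (b_hi is None or (a_hi is not None and a_hi <= b_hi)) else b_hi
    if lo is not None and hi is not None and lo > hi:
        return None  # ranges do not overlap
    return (lo, hi)
```

In the real code each endpoint is a target name "cast" to a microarchitecture so the same tighter-bound comparison can be made.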
return False
def constrain(self, other):
"""Projects all architecture fields that are specified in the given
@@ -451,18 +409,16 @@ def constrain(self, other):
"""
other = self._autospec(other)
if not other.satisfies(self):
raise UnsatisfiableArchitectureSpecError(other, self)
if not self.satisfies(other):
raise UnsatisfiableArchitectureSpecError(self, other)
constrained = False
for attr in ('platform', 'os'):
for attr in ('platform', 'os', 'target'):
svalue, ovalue = getattr(self, attr), getattr(other, attr)
if svalue is None and ovalue is not None:
setattr(self, attr, ovalue)
constrained = True
self.target_constrain(other)
return constrained
def copy(self):
@@ -475,13 +431,7 @@ def copy(self):
def concrete(self):
"""True if the spec is concrete, False otherwise"""
# return all(v for k, v in six.iteritems(self.to_cmp_dict()))
return (self.platform and self.os and self.target and
self.target_concrete)
@property
def target_concrete(self):
"""True if the target is not a range or list."""
return ':' not in str(self.target) and ',' not in str(self.target)
return self.platform and self.os and self.target
def to_dict(self):
d = syaml.syaml_dict([
@@ -2492,9 +2442,6 @@ def _new_concretize(self, tests=False):
raise spack.error.SpecError(
"Spec has no name; cannot concretize an anonymous spec")
if self._concrete:
return
result = spack.solver.asp.solve([self], tests=tests)
if not result.satisfiable:
result.print_cores()
@@ -2516,14 +2463,8 @@ def _new_concretize(self, tests=False):
self._dup(concretized)
self._mark_concrete()
#: choose your concretizer here.
def concretize(self, tests=False):
"""Concretize the current spec.
Args:
tests (bool or list): if False disregard 'test' dependencies,
if a list of names activate them for the packages in the list,
if True activate 'test' dependencies for all packages.
"""
if spack.config.get('config:concretizer') == "clingo":
self._new_concretize(tests)
else:
@@ -2541,19 +2482,12 @@ def _mark_concrete(self, value=True):
s._normal = value
s._concrete = value
def concretized(self, tests=False):
"""This is a non-destructive version of concretize().
First clones, then returns a concrete version of this package
without modifying this package.
Args:
tests (bool or list): if False disregard 'test' dependencies,
if a list of names activate them for the packages in the list,
if True activate 'test' dependencies for all packages.
"""
def concretized(self):
"""This is a non-destructive version of concretize(). First clones,
then returns a concrete version of this package without modifying
this package. """
clone = self.copy(caches=False)
clone.concretize(tests=tests)
clone.concretize()
return clone
def flat_dependencies(self, **kwargs):
@@ -2943,40 +2877,6 @@ def ensure_valid_variants(spec):
if not_existing:
raise vt.UnknownVariantError(spec, not_existing)
def update_variant_validate(self, variant_name, values):
"""If it is not already there, adds the variant named
`variant_name` to the spec `spec` based on the definition
contained in the package metadata. Validates the variant and
values before returning.
Used to add values to a variant without being sensitive to the
variant being single or multi-valued. If the variant already
exists on the spec it is assumed to be multi-valued and the
values are appended.
Args:
variant_name: the name of the variant to add or append to
values: the value or values (as a tuple) to add/append
to the variant
"""
if not isinstance(values, tuple):
values = (values,)
pkg_variant = self.package_class.variants[variant_name]
for value in values:
if self.variants.get(variant_name):
msg = ("Cannot append a value to a single-valued "
"variant with an already set value")
assert pkg_variant.multi, msg
self.variants[variant_name].append(value)
else:
variant = pkg_variant.make_variant(value)
self.variants[variant_name] = variant
pkg_variant.validate_or_raise(
self.variants[variant_name], self.package)
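The single- versus multi-valued append behavior that update_variant_validate guards can be sketched standalone (hypothetical names and dict-based storage, not the Spack API):

```python
def append_variant_value(variants, name, value, multi):
    """variants: dict mapping a variant name to a list of its values.

    New variants are created with a single value; appending to an
    existing variant is only legal when the variant is multi-valued.
    """
    if name in variants:
        if not multi:
            raise ValueError("cannot append a value to a single-valued "
                             "variant with an already set value")
        variants[name].append(value)
    else:
        variants[name] = [value]
    return variants
```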
def constrain(self, other, deps=True):
"""Merge the constraints of other with self.
@@ -3567,11 +3467,8 @@ def ne_dag(self, other, deptypes=True):
def _cmp_node(self):
"""Comparison key for just *this node* and not its deps."""
# Name or namespace None will lead to invalid comparisons for abstract
# specs. Replace them with the empty string, which is not a valid spec
# name nor namespace so it will not create spurious equalities.
return (self.name or '',
self.namespace or '',
return (self.name,
self.namespace,
tuple(self.versions),
self.variants,
self.architecture,


@@ -12,7 +12,6 @@
import pytest
import spack.architecture
import spack.concretize
from spack.spec import Spec
from spack.platforms.cray import Cray
from spack.platforms.linux import Linux
@@ -224,29 +223,3 @@ def test_satisfy_strict_constraint_when_not_concrete(
architecture = spack.spec.ArchSpec(architecture_tuple)
constraint = spack.spec.ArchSpec(constraint_tuple)
assert not architecture.satisfies(constraint, strict=True)
@pytest.mark.parametrize('root_target_range,dep_target_range,result', [
(('x86_64:nocona', 'x86_64:core2', 'nocona')), # pref not in intersection
(('x86_64:core2', 'x86_64:nocona', 'nocona')),
(('x86_64:haswell', 'x86_64:mic_knl', 'core2')), # pref in intersection
(('ivybridge', 'nocona:skylake', 'ivybridge')), # one side concrete
(('haswell:icelake', 'broadwell', 'broadwell')),
# multiple ranges in lists with multiple overlaps
(('x86_64:nocona,haswell:broadwell', 'nocona:haswell,skylake:',
'nocona')),
# lists with concrete targets, lists compared to ranges
(('x86_64,haswell', 'core2:broadwell', 'haswell'))
])
@pytest.mark.usefixtures('mock_packages', 'config')
def test_concretize_target_ranges(
root_target_range, dep_target_range, result
):
# use foobar=bar to make the problem simpler for the old concretizer
# the new concretizer should not need that help
spec = Spec('a %%gcc@10 foobar=bar target=%s ^b target=%s' %
(root_target_range, dep_target_range))
with spack.concretize.disable_compiler_existence_check():
spec.concretize()
assert str(spec).count('arch=test-debian6-%s' % result) == 2


@@ -19,7 +19,6 @@
import spack.cmd.install as install
import spack.cmd.uninstall as uninstall
import spack.cmd.mirror as mirror
import spack.hooks.sbang as sbang
from spack.main import SpackCommand
import spack.mirror
import spack.util.gpg
@@ -81,15 +80,6 @@ def mirror_directory_rel(session_mirror_rel):
yield(session_mirror_rel)
@pytest.fixture(scope='function')
def function_mirror(tmpdir):
mirror_dir = str(tmpdir.join('mirror'))
mirror_cmd('add', '--scope', 'site', 'test-mirror-func',
'file://%s' % mirror_dir)
yield mirror_dir
mirror_cmd('rm', '--scope=site', 'test-mirror-func')
@pytest.fixture(scope='session')
def config_directory(tmpdir_factory):
tmpdir = tmpdir_factory.mktemp('test_configs')
@@ -681,78 +671,3 @@ def mock_list_url(url, recursive=False):
err = capfd.readouterr()[1]
expect = 'Encountered problem listing packages at {0}'.format(test_url)
assert expect in err
@pytest.mark.usefixtures('mock_fetch')
def test_update_sbang(tmpdir, install_mockery, function_mirror):
"""
Test the creation and installation of buildcaches with default rpaths
into the non-default directory layout scheme, triggering an update of the
sbang.
"""
# Save the original store and layout before we touch ANYTHING.
real_store = spack.store.store
real_layout = spack.store.layout
# Concretize a package with some old-fashioned sbang lines.
sspec = Spec('old-sbang')
sspec.concretize()
# Need a fake mirror with *function* scope.
mirror_dir = function_mirror
# Assumes all commands will concretize sspec the same way.
install_cmd('--no-cache', sspec.name)
# Create a buildcache with the installed spec.
buildcache_cmd('create', '-u', '-a', '-d', mirror_dir,
'/%s' % sspec.dag_hash())
# Need to force an update of the buildcache index
buildcache_cmd('update-index', '-d', 'file://%s' % mirror_dir)
# Uninstall the original package.
uninstall_cmd('-y', '/%s' % sspec.dag_hash())
try:
# New install tree locations...
# Too fine-grained to do be done in a fixture
spack.store.store = spack.store.Store(str(tmpdir.join('newtree')))
spack.store.layout = YamlDirectoryLayout(str(tmpdir.join('newtree')),
path_scheme=ndef_install_path_scheme) # noqa: E501
# Install package from buildcache
buildcache_cmd('install', '-a', '-u', '-f', sspec.name)
# Continue blowing away caches
bindist.clear_spec_cache()
spack.stage.purge()
# test that the sbang was updated by the move
sbang_style_1_expected = '''{0}
#!/usr/bin/env python
{1}
'''.format(sbang.sbang_shebang_line(), sspec.prefix.bin)
sbang_style_2_expected = '''{0}
#!/usr/bin/env python
{1}
'''.format(sbang.sbang_shebang_line(), sspec.prefix.bin)
installed_script_style_1_path = \
sspec.prefix.bin.join('sbang-style-1.sh')
assert sbang_style_1_expected == \
open(str(installed_script_style_1_path)).read()
installed_script_style_2_path = \
sspec.prefix.bin.join('sbang-style-2.sh')
assert sbang_style_2_expected == \
open(str(installed_script_style_2_path)).read()
uninstall_cmd('-y', '/%s' % sspec.dag_hash())
finally:
spack.store.store = real_store
spack.store.layout = real_layout


@@ -11,6 +11,7 @@
import os.path
import spack.architecture as architecture
import spack.config
from spack.main import SpackCommand, get_version
from spack.util.executable import which
@@ -53,3 +54,4 @@ def test_report():
assert get_version() in out
assert platform.python_version() in out
assert str(arch) in out
assert spack.config.get('config:concretizer') in out


@@ -23,8 +23,7 @@ def test_immediate_dependents(mock_packages):
'libdwarf',
'patch-a-dependency',
'patch-several-dependencies',
'quantum-espresso',
'conditionally-patch-dependency'
'quantum-espresso'
])
@@ -39,8 +38,7 @@ def test_transitive_dependents(mock_packages):
'multivalue-variant',
'singlevalue-variant-dependent',
'patch-a-dependency', 'patch-several-dependencies',
'quantum-espresso',
'conditionally-patch-dependency'
'quantum-espresso'
])


@@ -2162,40 +2162,6 @@ def test_env_write_only_non_default():
assert yaml == ev.default_manifest_yaml
@pytest.mark.regression('20526')
def test_env_write_only_non_default_nested(tmpdir):
# setup an environment file
# the environment includes configuration because nested configs proved the
# most difficult to avoid writing.
filename = 'spack.yaml'
filepath = str(tmpdir.join(filename))
contents = """\
env:
specs:
- matrix:
- [mpileaks]
packages:
mpileaks:
compiler: [gcc]
view: true
"""
# create environment with some structure
with open(filepath, 'w') as f:
f.write(contents)
env('create', 'test', filepath)
# concretize
with ev.read('test') as e:
concretize()
e.write()
with open(e.manifest_path, 'r') as f:
manifest = f.read()
assert manifest == contents
@pytest.fixture
def packages_yaml_v015(tmpdir):
"""Return the path to an existing manifest in the v0.15.x format


@@ -471,14 +471,16 @@ def test_fj_flags():
supported_flag_test("cxx98_flag", "-std=c++98", "fj@4.0.0")
supported_flag_test("cxx11_flag", "-std=c++11", "fj@4.0.0")
supported_flag_test("cxx14_flag", "-std=c++14", "fj@4.0.0")
supported_flag_test("cxx17_flag", "-std=c++17", "fj@4.0.0")
supported_flag_test("c99_flag", "-std=c99", "fj@4.0.0")
supported_flag_test("c11_flag", "-std=c11", "fj@4.0.0")
supported_flag_test("cc_pic_flag", "-KPIC", "fj@4.0.0")
supported_flag_test("cxx_pic_flag", "-KPIC", "fj@4.0.0")
supported_flag_test("f77_pic_flag", "-KPIC", "fj@4.0.0")
supported_flag_test("fc_pic_flag", "-KPIC", "fj@4.0.0")
supported_flag_test("opt_flags", ['-O', '-O0', '-O1', '-O2', '-O3', '-O4'],
supported_flag_test("opt_flags", ['-O0', '-O1', '-O2', '-O3', '-Ofast'],
'fj@4.0.0')
supported_flag_test("debug_flags", "-g", "fj@4.0.0")
def test_gcc_flags():


@@ -96,7 +96,11 @@ def test_apple_clang_version_detection(
('clang version 8.0.0-3 (tags/RELEASE_800/final)\n'
'Target: aarch64-unknown-linux-gnu\n'
'Thread model: posix\n'
'InstalledDir: /usr/bin\n', '8.0.0')
'InstalledDir: /usr/bin\n', '8.0.0'),
('clang version 11.0.0\n'
'Target: aarch64-unknown-linux-gnu\n'
'Thread model: posix\n'
'InstalledDir: /usr/bin\n', '11.0.0')
])
def test_clang_version_detection(version_str, expected_version):
version = spack.compilers.clang.Clang.extract_version_from_output(
@@ -152,18 +156,28 @@ def test_intel_version_detection(version_str, expected_version):
@pytest.mark.parametrize('version_str,expected_version', [
( # ICX/ICPX
'Intel(R) oneAPI DPC++ Compiler 2021.1 (2020.10.0.1113)\n'
( # ICX
'Intel(R) oneAPI DPC++ Compiler Pro 2021.1 (2020.8.0.0827)\n'
'Target: x86_64-unknown-linux-gnu\n'
'Thread model: posix\n'
'InstalledDir: /made/up/path',
'2021.1'
'InstalledDir: /soft/restricted/CNDA/sdk/\n'
'2020.9.15.1/oneapi/compiler/2021.1-beta09/linux/bin',
'2020.8.0.0827'
),
( # IFX
'ifx (IFORT) 2021.1 Beta 20201113\n'
'Copyright (C) 1985-2020 Intel Corporation. All rights reserved.',
'2021.1'
( # ICPX
'Intel(R) oneAPI DPC++ Compiler Pro 2021.1 (2020.8.0.0827)\n'
'Target: x86_64-unknown-linux-gnu\n'
'Thread model: posix\n'
'InstalledDir: /soft/restricted/CNDA/sdk/\n'
'2020.9.15.1/oneapi/compiler/2021.1-beta09/linux/bin',
'2020.8.0.0827'
)
# Detection will fail for ifx because the version cannot be parsed
# from this output.
# ( # IFX
# 'ifx (IFORT) 2021.1 Beta 20200827\n'
# 'Copyright (C) 1985-2020 Intel Corporation. All rights reserved.',
# '2020.8.0.0827'
# )
])
def test_oneapi_version_detection(version_str, expected_version):
version = spack.compilers.oneapi.Oneapi.extract_version_from_output(


@@ -5,7 +5,6 @@
import sys
import pytest
import jinja2
import archspec.cpu
@@ -115,64 +114,6 @@ def current_host(request, monkeypatch):
spack.architecture.get_platform.cache.clear()
@pytest.fixture()
def repo_with_changing_recipe(tmpdir_factory, mutable_mock_repo):
repo_namespace = 'changing'
repo_dir = tmpdir_factory.mktemp(repo_namespace)
repo_dir.join('repo.yaml').write("""
repo:
namespace: changing
""", ensure=True)
packages_dir = repo_dir.ensure('packages', dir=True)
root_pkg_str = """
class Root(Package):
homepage = "http://www.example.com"
url = "http://www.example.com/root-1.0.tar.gz"
version(1.0, sha256='abcde')
depends_on('changing')
"""
packages_dir.join('root', 'package.py').write(
root_pkg_str, ensure=True
)
changing_template = """
class Changing(Package):
homepage = "http://www.example.com"
url = "http://www.example.com/changing-1.0.tar.gz"
version(1.0, sha256='abcde')
{% if not delete_variant %}
variant('fee', default=True, description='nope')
{% endif %}
variant('foo', default=True, description='nope')
{% if add_variant %}
variant('fum', default=True, description='nope')
{% endif %}
"""
repo = spack.repo.Repo(str(repo_dir))
mutable_mock_repo.put_first(repo)
class _ChangingPackage(object):
def change(self, context):
# To ensure we get the changed package we need to
# invalidate the cache
repo._modules = {}
t = jinja2.Template(changing_template)
changing_pkg_str = t.render(**context)
packages_dir.join('changing', 'package.py').write(
changing_pkg_str, ensure=True
)
_changing_pkg = _ChangingPackage()
_changing_pkg.change({'delete_variant': False, 'add_variant': False})
return _changing_pkg
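The fixture above regenerates the `changing` package from a Jinja2 template whose conditionals add or delete variants between concretizations. The same conditional-rendering idea can be sketched in plain Python (a stand-in for the template, not Spack code; names mirror the fixture):

```python
def render_changing(delete_variant=False, add_variant=False):
    # Plain-Python stand-in for the fixture's Jinja2 template: the two
    # flags control which variant() lines end up in the package recipe.
    lines = ["class Changing(Package):"]
    if not delete_variant:
        lines.append("    variant('fee', default=True, description='nope')")
    lines.append("    variant('foo', default=True, description='nope')")
    if add_variant:
        lines.append("    variant('fum', default=True, description='nope')")
    return "\n".join(lines)

recipe = render_changing(add_variant=True)
```

Each call produces a different recipe text, which is why the fixture must also invalidate the repo's module cache before re-reading the package.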
# This must use the mutable_config fixture because the test
# adjusting_default_target_based_on_compiler uses the current_host fixture,
# which changes the config.
@@ -323,10 +264,6 @@ def concretize_multi_provider(self):
s.concretize()
assert s['mpi'].version == ver('1.10.3')
def test_concretize_dependent_with_singlevalued_variant_type(self):
s = Spec('singlevalue-variant-dependent-type')
s.concretize()
@pytest.mark.parametrize("spec,version", [
('dealii', 'develop'),
('xsdk', '0.4.0'),
@@ -564,11 +501,6 @@ def test_conflicts_in_spec(self, conflict_spec):
with pytest.raises(spack.error.SpackError):
s.concretize()
def test_conflict_in_all_directives_true(self):
s = Spec('when-directives-true')
with pytest.raises(spack.error.SpackError):
s.concretize()
@pytest.mark.parametrize('spec_str', [
'conflict@10.0%clang+foo'
])
@@ -963,137 +895,3 @@ def test_conditional_provides_or_depends_on(self):
assert 'v1-provider' in s
assert s['v1'].name == 'v1-provider'
assert s['v2'].name == 'conditional-provider'
@pytest.mark.regression('20079')
@pytest.mark.parametrize('spec_str,tests_arg,with_dep,without_dep', [
# Check that True is treated correctly and attaches test deps
# to all nodes in the DAG
('a', True, ['a'], []),
('a foobar=bar', True, ['a', 'b'], []),
# Check that a list of names activates the dependency only for
# packages in that list
('a foobar=bar', ['a'], ['a'], ['b']),
('a foobar=bar', ['b'], ['b'], ['a']),
# Check that False disregards test dependencies
('a foobar=bar', False, [], ['a', 'b']),
])
def test_activating_test_dependencies(
self, spec_str, tests_arg, with_dep, without_dep
):
s = Spec(spec_str).concretized(tests=tests_arg)
for pkg_name in with_dep:
msg = "Cannot find test dependency in package '{0}'"
node = s[pkg_name]
assert node.dependencies(deptype='test'), msg.format(pkg_name)
for pkg_name in without_dep:
msg = "Test dependency in package '{0}' is unexpected"
node = s[pkg_name]
assert not node.dependencies(deptype='test'), msg.format(pkg_name)
@pytest.mark.regression('20019')
def test_compiler_match_is_preferred_to_newer_version(self):
if spack.config.get('config:concretizer') == 'original':
pytest.xfail('Known failure of the original concretizer')
# This spec depends on openblas. Openblas has a conflict
# that doesn't allow newer versions with gcc@4.4.0. Check
# that an old version of openblas is selected, rather than
# a different compiler for just that node.
spec_str = 'simple-inheritance+openblas %gcc@4.4.0 os=redhat6'
s = Spec(spec_str).concretized()
assert 'openblas@0.2.13' in s
assert s['openblas'].satisfies('%gcc@4.4.0')
@pytest.mark.regression('19981')
def test_target_ranges_in_conflicts(self):
with pytest.raises(spack.error.SpackError):
Spec('impossible-concretization').concretized()
@pytest.mark.regression('20040')
def test_variant_not_default(self):
s = Spec('ecp-viz-sdk').concretized()
# Check default variant value for the package
assert '+dep' in s['conditional-constrained-dependencies']
# Check that non-default variant values are forced on the dependency
d = s['dep-with-variants']
assert '+foo+bar+baz' in d
@pytest.mark.regression('20055')
def test_custom_compiler_version(self):
if spack.config.get('config:concretizer') == 'original':
pytest.xfail('Known failure of the original concretizer')
s = Spec('a %gcc@foo os=redhat6').concretized()
assert '%gcc@foo' in s
def test_all_patches_applied(self):
uuidpatch = 'a60a42b73e03f207433c5579de207c6ed61d58e4d12dd3b5142eb525728d89ea'
localpatch = 'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855'
spec = spack.spec.Spec('conditionally-patch-dependency+jasper')
spec.concretize()
assert ((uuidpatch, localpatch) ==
spec['libelf'].variants['patches'].value)
def test_dont_select_version_that_brings_more_variants_in(self):
s = Spec('dep-with-variants-if-develop-root').concretized()
assert s['dep-with-variants-if-develop'].satisfies('@1.0')
@pytest.mark.regression('20244,20736')
@pytest.mark.parametrize('spec_str,is_external,expected', [
# These are all externals, and 0_8 is a version not in package.py
('externaltool@1.0', True, '@1.0'),
('externaltool@0.9', True, '@0.9'),
('externaltool@0_8', True, '@0_8'),
# This external package is buildable, has a custom version
# in packages.yaml that is greater than the ones in package.py
# and specifies a variant
('external-buildable-with-variant +baz', True, '@1.1.special +baz'),
('external-buildable-with-variant ~baz', False, '@1.0 ~baz'),
('external-buildable-with-variant@1.0: ~baz', False, '@1.0 ~baz'),
# This uses an external version that meets the condition for
# having an additional dependency, but the dependency shouldn't
# appear in the answer set
('external-buildable-with-variant@0.9 +baz', True, '@0.9'),
])
def test_external_package_versions(self, spec_str, is_external, expected):
s = Spec(spec_str).concretized()
assert s.external == is_external
assert s.satisfies(expected)
@pytest.mark.regression('20292')
@pytest.mark.parametrize('context', [
{'add_variant': True, 'delete_variant': False},
{'add_variant': False, 'delete_variant': True},
{'add_variant': True, 'delete_variant': True}
])
@pytest.mark.xfail()
def test_reuse_installed_packages(
self, context, mutable_database, repo_with_changing_recipe
):
# Install a spec
root = Spec('root').concretized()
dependency = root['changing'].copy()
root.package.do_install(fake=True, explicit=True)
# Modify package.py
repo_with_changing_recipe.change(context)
# Try to concretize with the spec installed previously
new_root = Spec('root ^/{0}'.format(
dependency.dag_hash())
).concretized()
assert root.dag_hash() == new_root.dag_hash()
@pytest.mark.regression('20784')
def test_concretization_of_test_dependencies(self):
# With clingo we emit dependency_conditions regardless of the type
# of the dependency. We need to ensure that there's at least one
# dependency type declared to infer that the dependency holds.
s = Spec('test-dep-with-imposed-conditions').concretized()
assert 'c' not in s


@@ -82,9 +82,7 @@ class TestConcretizePreferences(object):
{'debug': True, 'opt': True, 'shared': False, 'static': False}),
# Check a multivalued variant with multiple values set
('multivalue-variant', ['foo=bar,baz', 'fee=bar'],
{'foo': ('bar', 'baz'), 'fee': 'bar'}),
('singlevalue-variant', ['fum=why'],
{'fum': 'why'})
{'foo': ('bar', 'baz'), 'fee': 'bar'})
])
def test_preferred_variants(
self, package_name, variant_value, expected_results
@@ -373,13 +371,3 @@ def test_config_perms_fail_write_gt_read(self, configure_permissions):
spec = Spec('callpath')
with pytest.raises(ConfigError):
spack.package_prefs.get_package_permissions(spec)
@pytest.mark.regression('20040')
def test_variant_not_flipped_to_pull_externals(self):
"""Test that a package doesn't prefer pulling in an
external to using the default value of a variant.
"""
s = Spec('vdefault-or-external-root').concretized()
assert '~external' in s['vdefault-or-external']
assert 'externaltool' not in s


@@ -656,36 +656,6 @@ def mock_store(tmpdir_factory, mock_repo_path, mock_configuration,
store_path.join('.spack-db').chmod(mode=0o755, rec=1)
@pytest.fixture(scope='function')
def mutable_mock_store(tmpdir_factory, mock_repo_path, mock_configuration,
_store_dir_and_cache):
"""Creates a read-only mock database with some packages installed. Note
that the ref count for dyninst here will be 3, as it's recycled
across each install.
This does not actually activate the store for use by Spack -- see the
``database`` fixture for that.
"""
store_path, store_cache = _store_dir_and_cache
store = spack.store.Store(str(store_path))
# If the cache does not exist populate the store and create it
if not os.path.exists(str(store_cache.join('.spack-db'))):
with use_configuration(mock_configuration):
with use_store(store):
with use_repo(mock_repo_path):
_populate(store.db)
store_path.copy(store_cache, mode=True, stat=True)
# Make the DB filesystem read-only to ensure we can't modify entries
store_path.join('.spack-db').chmod(mode=0o555, rec=1)
yield store
store_path.join('.spack-db').chmod(mode=0o755, rec=1)
@pytest.fixture(scope='function')
def database(mock_store, mock_packages, config, monkeypatch):
"""This activates the mock store, packages, AND config."""


@@ -8,7 +8,6 @@ compilers:
f77: None
fc: None
modules: 'None'
target: x86_64
- compiler:
spec: gcc@4.5.0
operating_system: {0.name}{0.version}
@@ -18,7 +17,6 @@ compilers:
f77: None
fc: None
modules: 'None'
target: x86_64
- compiler:
spec: clang@3.3
operating_system: CNL
@@ -37,7 +35,6 @@ compilers:
f77: None
fc: None
modules: 'None'
target: x86_64
- compiler:
spec: clang@3.3
operating_system: yosemite
@@ -47,7 +44,6 @@ compilers:
f77: None
fc: None
modules: 'None'
target: x86_64
- compiler:
paths:
cc: /path/to/gcc
@@ -66,7 +62,6 @@ compilers:
operating_system: SuSE11
spec: gcc@4.5.0
modules: 'None'
target: x86_64
- compiler:
paths:
cc: /path/to/gcc
@@ -76,7 +71,6 @@ compilers:
operating_system: yosemite
spec: gcc@4.5.0
modules: 'None'
target: x86_64
- compiler:
paths:
cc: /path/to/gcc
@@ -86,7 +80,6 @@ compilers:
operating_system: elcapitan
spec: gcc@4.5.0
modules: 'None'
target: x86_64
- compiler:
spec: clang@3.3
operating_system: elcapitan
@@ -96,7 +89,6 @@ compilers:
f77: None
fc: None
modules: 'None'
target: x86_64
- compiler:
spec: gcc@4.7.2
operating_system: redhat6
@@ -110,16 +102,6 @@ compilers:
cxxflags: -O0 -g
fflags: -O0 -g
modules: 'None'
target: x86_64
- compiler:
spec: gcc@4.4.0
operating_system: redhat6
paths:
cc: /path/to/gcc440
cxx: /path/to/g++440
f77: /path/to/gfortran440
fc: /path/to/gfortran440
modules: 'None'
- compiler:
spec: clang@3.5
operating_system: redhat6
@@ -132,7 +114,6 @@ compilers:
cflags: -O3
cxxflags: -O3
modules: 'None'
target: x86_64
- compiler:
spec: clang@8.0.0
operating_system: redhat7
@@ -145,7 +126,6 @@ compilers:
cflags: -O3
cxxflags: -O3
modules: 'None'
target: x86_64
- compiler:
spec: apple-clang@9.1.0
operating_system: elcapitan
@@ -155,7 +135,6 @@ compilers:
f77: None
fc: None
modules: 'None'
target: x86_64
- compiler:
spec: gcc@foo
operating_system: redhat6
@@ -165,7 +144,6 @@ compilers:
f77: /path/to/gfortran
fc: /path/to/gfortran
modules: 'None'
target: x86_64
- compiler:
spec: gcc@4.4.0-special
operating_system: redhat6
@@ -175,4 +153,3 @@ compilers:
f77: /path/to/gfortran
fc: /path/to/gfortran
modules: 'None'
target: x86_64


@@ -9,8 +9,6 @@ packages:
prefix: /path/to/external_tool
- spec: externaltool@0.9%gcc@4.5.0
prefix: /usr
- spec: externaltool@0_8%gcc@4.5.0
prefix: /usr
externalvirtual:
buildable: False
externals:
@@ -29,10 +27,3 @@ packages:
externals:
- spec: requires-virtual@2.0
prefix: /usr
'external-buildable-with-variant':
buildable: True
externals:
- spec: external-buildable-with-variant@1.1.special +baz
prefix: /usr
- spec: external-buildable-with-variant@0.9 +baz
prefix: /usr


@@ -450,8 +450,7 @@ def test_packages_needed_to_bootstrap_compiler_none(install_mockery):
spec.concretize()
assert spec.concrete
packages = inst._packages_needed_to_bootstrap_compiler(
spec.compiler, spec.architecture, [spec.package])
packages = inst._packages_needed_to_bootstrap_compiler(spec.package)
assert not packages
@@ -469,8 +468,7 @@ def _conc_spec(compiler):
monkeypatch.setattr(spack.compilers, 'pkg_spec_for_compiler', _conc_spec)
monkeypatch.setattr(spack.spec.Spec, 'concretize', _noop)
packages = inst._packages_needed_to_bootstrap_compiler(
spec.compiler, spec.architecture, [spec.package])
packages = inst._packages_needed_to_bootstrap_compiler(spec.package)
assert packages
@@ -628,7 +626,7 @@ def test_check_deps_status_upstream(install_mockery, monkeypatch):
def test_add_bootstrap_compilers(install_mockery, monkeypatch):
from collections import defaultdict
def _pkgs(compiler, architecture, pkgs):
def _pkgs(pkg):
spec = spack.spec.Spec('mpi').concretized()
return [(spec.package, True)]
@@ -638,8 +636,7 @@ def _pkgs(compiler, architecture, pkgs):
all_deps = defaultdict(set)
monkeypatch.setattr(inst, '_packages_needed_to_bootstrap_compiler', _pkgs)
installer._add_bootstrap_compilers(
'fake', 'fake', [request.pkg], request, all_deps)
installer._add_bootstrap_compilers(request.pkg, request, all_deps)
ids = list(installer.build_tasks)
assert len(ids) == 1


@@ -2,6 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This test does sanity checks on Spack's builtin package database."""
import os.path
import re
@@ -13,7 +14,6 @@
import spack.paths
import spack.repo
import spack.util.executable as executable
import spack.variant
# A few functions from this module are used to
# do sanity checks only on packages modified by a PR
import spack.cmd.flake8 as flake8
@@ -257,15 +257,3 @@ def test_variant_defaults_are_parsable_from_cli():
if not default_is_parsable:
failing.append((pkg.name, variant_name))
assert not failing
def test_variant_defaults_listed_explicitly_in_values():
failing = []
for pkg in spack.repo.path.all_packages():
for variant_name, variant in pkg.variants.items():
vspec = variant.make_default()
try:
variant.validate_or_raise(vspec, pkg=pkg)
except spack.variant.InvalidVariantValueError:
failing.append((pkg.name, variant.name))
assert not failing


@@ -196,8 +196,10 @@ def test_relocate_text(tmpdir):
script.close()
filenames = [filename]
new_dir = '/opt/rh/devtoolset/'
# With a single-entry dict, ordering does not matter
relocate_text(filenames, {old_dir: new_dir})
relocate_text(filenames, old_dir, new_dir,
old_dir, new_dir,
old_dir, new_dir,
{old_dir: new_dir})
with open(filename, "r") as script:
for line in script:
assert(new_dir in line)


@@ -12,6 +12,7 @@
import pytest
import spack.architecture
import spack.concretize
import spack.hooks.sbang as sbang
import spack.paths
import spack.relocate
import spack.spec
@@ -280,7 +281,7 @@ def test_replace_prefix_bin(hello_world):
executable = hello_world(rpaths=['/usr/lib', '/usr/lib64'])
# Relocate the RPATHs
spack.relocate._replace_prefix_bin(str(executable), {b'/usr': b'/foo'})
spack.relocate._replace_prefix_bin(str(executable), '/usr', '/foo')
# Some compilers add rpaths so ensure changes included in final result
assert '/foo/lib:/foo/lib64' in rpaths_for(executable)
@@ -381,12 +382,11 @@ def test_relocate_text_bin(hello_world, copy_binary, tmpdir):
assert not text_in_bin(str(new_binary.dirpath()), new_binary)
# Check that this call succeeds
orig_path_bytes = str(orig_binary.dirpath()).encode('utf-8')
new_path_bytes = str(new_binary.dirpath()).encode('utf-8')
spack.relocate.relocate_text_bin(
[str(new_binary)],
{orig_path_bytes: new_path_bytes}
str(orig_binary.dirpath()), str(new_binary.dirpath()),
spack.paths.spack_root, spack.paths.spack_root,
{str(orig_binary.dirpath()): str(new_binary.dirpath())}
)
# Check original directory is not there anymore and it was
@@ -395,13 +395,55 @@ def test_relocate_text_bin(hello_world, copy_binary, tmpdir):
assert text_in_bin(str(new_binary.dirpath()), new_binary)
def test_relocate_text_bin_raise_if_new_prefix_is_longer(tmpdir):
short_prefix = b'/short'
long_prefix = b'/much/longer'
fpath = str(tmpdir.join('fakebin'))
with open(fpath, 'w') as f:
f.write('/short')
def test_relocate_text_bin_raise_if_new_prefix_is_longer():
short_prefix = '/short'
long_prefix = '/much/longer'
with pytest.raises(spack.relocate.BinaryTextReplaceError):
spack.relocate.relocate_text_bin(
[fpath], {short_prefix: long_prefix}
['item'], short_prefix, long_prefix, None, None, None
)
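The test above checks that binary relocation refuses a replacement prefix longer than the original. The underlying constraint is that a binary cannot grow in place: string offsets are fixed, so a shorter prefix is null-padded back to the original length, while a longer one must be rejected. A rough sketch of that idea (illustrative only, not Spack's implementation):

```python
def replace_prefix_cstring(data, old, new):
    """Rewrite null-terminated strings in a byte blob so each one that
    starts with ``old`` starts with ``new`` instead, null-padding each
    rewritten string so the blob's total size is unchanged. Raises when
    ``new`` is longer, mirroring the behavior the test checks for."""
    if len(new) > len(old):
        raise ValueError("new prefix is longer than old; cannot patch in place")
    chunks = []
    for chunk in data.split(b"\0"):
        if chunk.startswith(old):
            chunk = (new + chunk[len(old):]).ljust(len(chunk), b"\0")
        chunks.append(chunk)
    return b"\0".join(chunks)

blob = b"/short/lib\0unrelated\0"
patched = replace_prefix_cstring(blob, b"/short", b"/a")
```

Note how `patched` keeps exactly the same byte length as `blob`, which is what makes in-place patching of an executable safe.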
@pytest.mark.parametrize("sbang_line", [
"#!/bin/bash /path/to/orig/spack/bin/sbang",
"#!/bin/sh /orig/layout/root/bin/sbang"
])
def test_relocate_text_old_sbang(tmpdir, sbang_line):
"""Ensure that old and new sbang styles are relocated."""
old_install_prefix = "/orig/layout/root/orig/install/prefix"
new_install_prefix = os.path.join(
spack.store.layout.root, "new", "install", "prefix"
)
# input file with an sbang line
original = """\
{0}
#!/usr/bin/env python
/orig/layout/root/orig/install/prefix
""".format(sbang_line)
# expected relocation
expected = """\
{0}
#!/usr/bin/env python
{1}
""".format(sbang.sbang_shebang_line(), new_install_prefix)
path = tmpdir.ensure("path", "to", "file")
with path.open("w") as f:
f.write(original)
spack.relocate.relocate_text(
[str(path)],
"/orig/layout/root", spack.store.layout.root,
old_install_prefix, new_install_prefix,
"/path/to/orig/spack", spack.paths.spack_root,
{
old_install_prefix: new_install_prefix
}
)
assert expected == open(str(path)).read()


@@ -2,7 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import os
import pytest
import shlex
@@ -806,26 +806,3 @@ def test_kv_with_spaces(self):
])
def test_target_tokenization(self, expected_tokens, spec_string):
self.check_lex(expected_tokens, spec_string)
@pytest.mark.regression('20310')
def test_compare_abstract_specs(self):
"""Spec comparisons must be valid for abstract specs.
Check that the spec cmp_key appropriately handles comparing specs for
which some attributes are None in exactly one of two specs"""
# Add fields in order they appear in `Spec._cmp_node`
constraints = [
None,
'foo',
'foo.foo',
'foo.foo@foo',
'foo.foo@foo+foo',
'foo.foo@foo+foo arch=foo-foo-foo',
'foo.foo@foo+foo arch=foo-foo-foo %foo',
'foo.foo@foo+foo arch=foo-foo-foo %foo cflags=foo',
]
specs = [Spec(s) for s in constraints]
for a, b in itertools.product(specs, repeat=2):
# Check that we can compare without raising an error
assert a <= b or b < a
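The comparison this test exercises must handle attributes that are `None` on exactly one side, which would raise `TypeError` under Python 3's ordering. One conventional trick — a sketch of the general technique, not Spack's actual `_cmp_node` — is to key each field as a tuple whose first element records presence, so `None` always sorts first:

```python
def none_low_key(value):
    # (0,) sorts before any (1, value), so a field that is missing on one
    # side compares consistently instead of raising TypeError.
    return (0,) if value is None else (1, value)

ordered = sorted(['foo', None, 'bar', None], key=none_low_key)
```

Applying the same keying field-by-field gives a total order over partially specified (abstract) objects.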


@@ -102,9 +102,6 @@ def repo_for_pkg(self, name):
Repo = collections.namedtuple('Repo', ['namespace'])
return Repo('mockrepo')
def __contains__(self, item):
return item in self.spec_to_pkg
def add_package(self, name, dependencies=None, dependency_types=None,
conditions=None):
"""Factory method for creating mock packages.


@@ -41,7 +41,6 @@ class HTMLParseError(Exception):
import spack.util.crypto
import spack.util.s3 as s3_util
import spack.util.url as url_util
import llnl.util.lang
from spack.util.compression import ALLOWED_ARCHIVE_TYPES
@@ -425,6 +424,12 @@ def _spider(url, collect_nested):
return pages, links, subcalls
# TODO: Needed until we drop support for Python 2.X
def star(func):
def _wrapper(args):
return func(*args)
return _wrapper
if isinstance(root_urls, six.string_types):
root_urls = [root_urls]
@@ -445,7 +450,7 @@ def _spider(url, collect_nested):
tty.debug("SPIDER: [depth={0}, max_depth={1}, urls={2}]".format(
current_depth, depth, len(spider_args))
)
results = tp.map(llnl.util.lang.star(_spider), spider_args)
results = tp.map(star(_spider), spider_args)
spider_args = []
collect = current_depth < depth
for sub_pages, sub_links, sub_spider_args in results:
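The `star` helper moved into this module adapts a multi-argument function for `ThreadPool.map`, which hands each task a single argument (Python 2's pools have no `starmap`, hence the TODO). A standalone sketch of the same adapter with a stand-in worker function:

```python
from multiprocessing.pool import ThreadPool

def star(func):
    """Wrap func so it can be called with one tuple of positional args,
    making it usable with map() on a thread pool."""
    def _wrapper(args):
        return func(*args)
    return _wrapper

def fetch(url, depth):
    # Stand-in for a spider worker taking multiple arguments.
    return (url, depth)

with ThreadPool(2) as tp:
    results = tp.map(star(fetch), [('a', 1), ('b', 2)])
```

Each `(url, depth)` tuple in the argument list is unpacked into a separate `fetch` call by the wrapper.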


@@ -82,7 +82,8 @@ def isa_type(v):
else:
# Otherwise assume values is the set of allowed explicit values
self.values = values
self.single_value_validator = lambda x: x in tuple(self.values)
allowed = tuple(self.values) + (self.default,)
self.single_value_validator = lambda x: x in allowed
self.multi = multi
self.group_validator = validator
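The change above widens the validator so the default is always accepted even when it is not listed among the explicit `values`. A minimal reproduction of the idea, with illustrative names:

```python
def make_validator(values, default):
    # Accept any explicitly listed value plus the default itself, so a
    # default that is not in `values` does not fail its own validation.
    allowed = tuple(values) + (default,)
    return lambda x: x in allowed

validator = make_validator(values=('bar', 'baz'), default='none')
```

Without including the default, a package declaring `default='none'` with `values=('bar', 'baz')` would reject its own default at concretization time.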


@@ -0,0 +1,9 @@
merge_pipeline:
only:
- develop
variables:
SPACK_REPO: ${CI_PROJECT_URL}
SPACK_REF: ${CI_COMMIT_SHA}
trigger:
project: ecp/e4s/e4s
strategy: depend


@@ -31,7 +31,6 @@ class A(AutotoolsPackage):
variant('bvv', default=True, description='The good old BV variant')
depends_on('b', when='foobar=bar')
depends_on('test-dependency', type='test')
parallel = False


@@ -13,5 +13,3 @@ class B(Package):
url = "http://www.example.com/b-1.0.tar.gz"
version('1.0', '0123456789abcdef0123456789abcdef')
depends_on('test-dependency', type='test')


@@ -1,16 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class ConditionalConstrainedDependencies(Package):
"""Package that has a variant which adds a dependency forced to
use non-default values.
"""
homepage = "https://dev.null"
version('1.0')
# This variant is on by default and attaches a dependency
# with a lot of variants set at their non-default values
variant('dep', default=True, description='nope')
depends_on('dep-with-variants+foo+bar+baz', when='+dep')


@@ -1,16 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class ConditionallyPatchDependency(Package):
"""Package that conditionally requires a patched version
of a dependency."""
homepage = "http://www.example.com"
url = "http://www.example.com/patch-a-dependency-1.0.tar.gz"
version('1.0', '0123456789abcdef0123456789abcdef')
variant('jasper', default=False)
depends_on('libelf@0.8.10', patches=[patch('uuid.patch')], when='+jasper')


@@ -1,11 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class DepWithVariantsIfDevelopRoot(Package):
"""Package that adds a dependency with many variants only at @develop"""
homepage = "https://dev.null"
version('1.0')
depends_on('dep-with-variants-if-develop')


@@ -1,12 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class DepWithVariantsIfDevelop(Package):
"""Package that adds a dependency with many variants only at @develop"""
homepage = "https://dev.null"
version('develop')
version('1.0')
depends_on('dep-with-variants', when='@develop')


@@ -1,15 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class DepWithVariants(Package):
"""Package that has a variant which adds a dependency forced to
use non-default values.
"""
homepage = "https://dev.null"
version('1.0')
variant('foo', default=False, description='nope')
variant('bar', default=False, description='nope')
variant('baz', default=False, description='nope')


@@ -1,14 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class EcpVizSdk(Package):
"""Package that has a dependency with a variant which
adds a transitive dependency forced to use non-default
values.
"""
homepage = "https://dev.null"
version('1.0')
depends_on('conditional-constrained-dependencies')


@@ -1,18 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class ExternalBuildableWithVariant(Package):
homepage = "http://somewhere.com"
url = "http://somewhere.com/module-1.0.tar.gz"
version('1.0', '1234567890abcdef1234567890abcdef')
version('0.9', '1234567890abcdef1234567890abcdef')
variant('baz', default=False, description='nope')
depends_on('c@1.0', when='@0.9')


@@ -12,6 +12,5 @@ class Externaltool(Package):
version('1.0', '1234567890abcdef1234567890abcdef')
version('0.9', '1234567890abcdef1234567890abcdef')
version('0.8.1', '1234567890abcdef1234567890abcdef')
depends_on('externalprereq')


@@ -1,15 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class ImpossibleConcretization(Package):
"""Package that should be impossible to concretize due to a conflict
with target ranges. See Issue 19981.
"""
homepage = "http://www.example.com"
url = "http://www.example.com/example-1.0.tar.gz"
version(1.0, 'foobarbaz')
conflicts('target=x86_64:')


@@ -1,35 +0,0 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class OldSbang(Package):
"""Toy package for testing the old sbang replacement problem"""
homepage = "https://www.example.com"
url = "https://www.example.com/old-sbang.tar.gz"
version('1.0.0', '0123456789abcdef0123456789abcdef')
def install(self, spec, prefix):
mkdirp(prefix.bin)
sbang_style_1 = '''#!/bin/bash {0}/bin/sbang
#!/usr/bin/env python
{1}
'''.format(spack.paths.prefix, prefix.bin)
sbang_style_2 = '''#!/bin/sh {0}/bin/sbang
#!/usr/bin/env python
{1}
'''.format(spack.store.unpadded_root, prefix.bin)
with open('%s/sbang-style-1.sh' % self.prefix.bin, 'w') as f:
f.write(sbang_style_1)
with open('%s/sbang-style-2.sh' % self.prefix.bin, 'w') as f:
f.write(sbang_style_2)


@@ -12,10 +12,5 @@ class Openblas(Package):
url = "http://github.com/xianyi/OpenBLAS/archive/v0.2.15.tar.gz"
version('0.2.15', 'b1190f3d3471685f17cfd1ec1d252ac9')
version('0.2.14', 'b1190f3d3471685f17cfd1ec1d252ac9')
version('0.2.13', 'b1190f3d3471685f17cfd1ec1d252ac9')
# See #20019 for this conflict
conflicts('%gcc@:4.4.99', when='@0.2.14:')
provides('blas')


@@ -1,16 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class SinglevalueVariantDependentType(Package):
"""Simple package with one dependency that has a single-valued
variant with values=str"""
homepage = "http://www.example.com"
url = "http://www.example.com/archive-1.0.tar.gz"
version('1.0', '0123456789abcdef0123456789abcdef')
depends_on('singlevalue-variant fum=nope')


@@ -1,19 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class SinglevalueVariant(Package):
homepage = "http://www.llnl.gov"
url = "http://www.llnl.gov/mpileaks-1.0.tar.gz"
version(1.0, 'foobarbaz')
variant(
'fum',
description='Single-valued variant with type in values',
default='bar',
values=str,
multi=False
)


@@ -1,17 +0,0 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class TestDepWithImposedConditions(Package):
"""Simple package with a test-only dependency"""
homepage = "http://www.example.com"
url = "http://www.example.com/e-1.0.tar.gz"
version('1.0', '0123456789abcdef0123456789abcdef')
depends_on('c@1.0', type='test')


@@ -1,12 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class TestDependency(Package):
"""Represent a dependency that is pulled-in to allow testing other
packages.
"""
homepage = "http://www.example.com"
url = "http://www.example.com/tdep-1.0.tar.gz"
version('1.0', '0123456789abcdef0123456789abcdef')


@@ -1,15 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class VdefaultOrExternalRoot(Package):
"""Test that we don't prefer adding an external to using
a default variant value.
"""
homepage = 'https://www.example.org'
url = 'https://example.org/files/v3.4/cmake-3.4.3.tar.gz'
version('1.0', '4cb3ff35b2472aae70f542116d616e63')
depends_on('vdefault-or-external')


@@ -1,17 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
class VdefaultOrExternal(Package):
"""Test that we don't prefer adding an external to using
a default variant value.
"""
homepage = 'https://www.example.org'
url = 'https://example.org/files/v3.4/cmake-3.4.3.tar.gz'
version('1.0', '4cb3ff35b2472aae70f542116d616e63')
variant('external', default=False, description='nope')
depends_on('externaltool', when='+external')


@@ -33,6 +33,8 @@ class Acts(CMakePackage, CudaPackage):
git = "https://github.com/acts-project/acts.git"
maintainers = ['HadrienG2']
tags = ['hep']
# Supported Acts versions
version('master', branch='master')
version('2.00.0', commit='8708eae2b2ccdf57ab7b451cfbba413daa1fc43c')


@@ -51,7 +51,7 @@ class Adios2(CMakePackage):
# transport engines
variant('sst', default=True,
description='Enable the SST staging engine')
variant('dataman', default=True,
variant('dataman', default=False,
description='Enable the DataMan engine for WAN transports')
variant('dataspaces', default=False,
description='Enable support for DATASPACES')


@@ -12,6 +12,8 @@ class Aida(Package):
homepage = "http://aida.freehep.org/"
url = "ftp://ftp.slac.stanford.edu/software/freehep/AIDA/v3.2.1/aida-3.2.1.tar.gz"
tags = ['hep']
version('3.2.1', sha256='c51da83e99c0985a7ef3e8bc5a60c3ae61f3ca603b61100c2438b4cdadd5bb2e')
def install(self, spec, prefix):


@@ -0,0 +1,28 @@
From 526efe86427a4d49da38773534d84025dd4246c3 Mon Sep 17 00:00:00 2001
From: Ethan Stewart <ethan.stewart@amd.com>
Date: Tue, 10 Nov 2020 15:32:59 -0600
Subject: [PATCH] Add cmake option for copying source for debugging.
---
openmp/CMakeLists.txt | 8 ++++++++
1 file changed, 8 insertions(+)
diff --git a/openmp/CMakeLists.txt b/openmp/CMakeLists.txt
index a86e83c50212..51962b561a3b 100644
--- a/openmp/CMakeLists.txt
+++ b/openmp/CMakeLists.txt
@@ -103,3 +103,11 @@ endif()
# Now that we have seen all testsuites, create the check-openmp target.
construct_check_openmp_target()
+
+option(DEBUG_COPY_SOURCE "Enable source code copy for openmp debug build."
+ ${ENABLE_SOURCE_COPY})
+if (${ENABLE_SOURCE_COPY})
+ install(DIRECTORY runtime/src DESTINATION ${OPENMP_INSTALL_LIBDIR}/src/openmp/runtime)
+ install(DIRECTORY libomptarget/src libomptarget/plugins DESTINATION ${OPENMP_INSTALL_LIBDIR}/src/openmp/libomptarget)
+ install(DIRECTORY libompd/src DESTINATION ${OPENMP_INSTALL_LIBDIR}/src/openmp/libompd)
+endif()
--
2.17.1

View File

@@ -8,6 +8,22 @@
tools_url = 'https://github.com/ROCm-Developer-Tools'
compute_url = 'https://github.com/RadeonOpenCompute'
# 3.9 SHA Keys
aomp39 = dict()
aomp39 = {
"aomp":
"14fc6867af0b17e3bff8cb42cb36f509c95a29b7a933a106bf6778de21f6c123",
"devlib":
"c99f45dacf5967aef9a31e3731011b9c142446d4a12bac69774998976f2576d7",
"llvm":
"f0a0b9fec0626878340a15742e73a56f155090011716461edcb069dcf05e6b30",
"flang":
"43d57bcc87fab092ac242e36da62588a87b6fa91f9e81fdb330159497afdecb3",
"extras":
"014fca1fba54997c6db0e84822df274fb6807698b6856da4f737f38f10ab0e5d"
}
# 3.8 SHA Keys
aomp38 = dict()
aomp38 = {
@@ -65,17 +81,19 @@ class Aomp(Package):
"""llvm openmp compiler from AMD."""
homepage = tools_url + "/aomp"
url = tools_url + "/aomp/archive/rocm-3.8.0.tar.gz"
url = tools_url + "/aomp/archive/rocm-3.9.0.tar.gz"
maintainers = ['srekolam', 'arjun-raj-kuppala', 'estewart08']
version('3.9.0', sha256=aomp38['aomp'])
version('3.8.0', sha256=aomp38['aomp'])
version('3.7.0', sha256=aomp37['aomp'])
version('3.5.0', sha256=aomp35['aomp'])
depends_on('cmake@3.5.2:3.13.4', type='build')
depends_on('mesa~llvm@18.3:', type=('build', 'link'))
depends_on('mesa18~llvm@18.3:', type=('build', 'link'))
depends_on('py-setuptools@44.1.0', type='build')
depends_on('python@2.7.18', type='build')
depends_on('python@2.7.18', type='build', when='@3.5.0:3.8.0')
depends_on('python@3.6.9', type='build', when='@3.9.0:')
depends_on('py-pip', when='@3.8.0:', type='build')
depends_on('py-wheel@0.29.0', when='@3.8.0:', type=('build', 'run'))
depends_on('perl-data-dumper', type='build')
@@ -83,11 +101,60 @@ class Aomp(Package):
depends_on('elfutils', type=('build', 'link'))
depends_on('libffi', type=('build', 'link'))
for ver in ['3.5.0', '3.7.0', '3.8.0']:
depends_on('rocm-device-libs@' + ver, type='build', when='@' + ver)
depends_on('hsakmt-roct@' + ver, type='build', when='@' + ver)
depends_on('hsa-rocr-dev@' + ver, type='build', when='@' + ver)
for ver in ['3.5.0', '3.7.0', '3.8.0', '3.9.0']:
depends_on('hsakmt-roct@' + ver, type=('build', 'run'), when='@' + ver)
depends_on('comgr@' + ver, type='build', when='@' + ver)
depends_on('hsa-rocr-dev@' + ver, type=('build', 'run'),
when='@' + ver)
depends_on('rocm-device-libs@' + ver, type=('build', 'run'),
when='@' + ver)
if ver != '3.5.0':
depends_on('hip@' + ver, type=('build', 'run'), when='@' + ver)
depends_on('hip-rocclr@' + ver, type='build', when='@' + ver)
if ver == '3.9.0':
depends_on('rocm-gdb@' + ver, type=('build', 'run'),
when='@' + ver)
# 3.9.0 Resources
if ver == '3.9.0':
resource(
name='rocm-device-libs',
url=compute_url +
'/ROCm-Device-Libs/archive/rocm-3.9.0.tar.gz',
sha256=aomp39['devlib'],
expand=True,
destination='aomp-dir',
placement='rocm-device-libs',
when='@3.9.0')
resource(
name='amd-llvm-project',
url=tools_url + '/amd-llvm-project/archive/rocm-3.9.0.tar.gz',
sha256=aomp39['llvm'],
expand=True,
destination='aomp-dir',
placement='amd-llvm-project',
when='@3.9.0')
resource(
name='flang',
url=tools_url + '/flang/archive/rocm-3.9.0.tar.gz',
sha256=aomp39['flang'],
expand=True,
destination='aomp-dir',
placement='flang',
when='@3.9.0')
resource(
name='aomp-extras',
url=tools_url + '/aomp-extras/archive/rocm-3.9.0.tar.gz',
sha256=aomp39['extras'],
expand=True,
destination='aomp-dir',
placement='aomp-extras',
when='@3.9.0')
# 3.8.0 Resources
if ver == '3.8.0':
@@ -233,6 +300,11 @@ class Aomp(Package):
destination='aomp-dir',
placement='opencl-on-vdi',
when='@3.5.0')
# Copy source files over for debug build in 3.9.0
patch('0001-Add-cmake-option-for-copying-source-for-debugging.patch',
working_dir='aomp-dir/amd-llvm-project', when='@3.9.0:')
# Revert back to .amdgcn.bc naming scheme for 3.8.0
patch('0001-Add-amdgcn-to-devicelibs-bitcode-names-3.8.patch',
working_dir='aomp-dir/amd-llvm-project', when='@3.8.0')
@@ -256,8 +328,14 @@ def patch(self):
src = self.stage.source_path
libomptarget = '{0}/aomp-dir/amd-llvm-project/openmp/libomptarget'
aomp_extras = '{0}/aomp-dir/aomp-extras/aomp-device-libs'
flang = '{0}/aomp-dir/flang/'
if self.spec.version == Version('3.8.0'):
if self.spec.version >= Version('3.9.0'):
filter_file(
'ADDITIONAL_VERSIONS 2.7', 'ADDITIONAL_VERSIONS 3',
flang.format(src) + 'CMakeLists.txt')
if self.spec.version >= Version('3.8.0'):
filter_file(
'{CMAKE_INSTALL_PREFIX}', '{HSA_INCLUDE}',
libomptarget.format(src) + '/hostrpc/services/CMakeLists.txt')
@@ -330,6 +408,13 @@ def patch(self):
'-Wl,-rpath,${COMGR_LIB}',
libomptarget.format(src) + '/plugins/hsa/CMakeLists.txt')
def setup_run_environment(self, env):
devlibs_prefix = self.spec['rocm-device-libs'].prefix
aomp_prefix = self.spec['aomp'].prefix
env.set('HIP_DEVICE_LIB_PATH',
'{0}/amdgcn/bitcode'.format(devlibs_prefix))
env.set('AOMP', '{0}'.format(aomp_prefix))
def setup_build_environment(self, env):
aomp_prefix = self.spec['aomp'].prefix
env.set('AOMP', '{0}'.format(aomp_prefix))
@@ -347,7 +432,15 @@ def install(self, spec, prefix):
hsakmt_prefix = self.spec['hsakmt-roct'].prefix
comgr_prefix = self.spec['comgr'].prefix
opencl_src = '/aomp-dir/opencl-on-vdi/api/opencl'
omp_src = '/aomp-dir/amd-llvm-project/openmp'
debug_map = '-fdebug-prefix-map={0}{1}={2}'
debug_map_format = debug_map.format(src, omp_src, aomp_prefix)
components = dict()
if self.spec.version >= Version('3.9.0'):
bitcode_dir = '/amdgcn/bitcode'
else:
bitcode_dir = '/lib'
components['amd-llvm-project'] = [
'../aomp-dir/amd-llvm-project/llvm',
'-DLLVM_ENABLE_PROJECTS=clang;lld;compiler-rt',
@@ -388,15 +481,16 @@ def install(self, spec, prefix):
components['aomp-extras'] = [
'../aomp-dir/aomp-extras',
'-DROCM_PATH=$ROCM_DIR ',
'-DDEVICE_LIBS_DIR={0}/lib'.format(devlibs_prefix),
'-DDEVICE_LIBS_DIR={0}{1}'.format(devlibs_prefix, bitcode_dir),
'-DAOMP_STANDALONE_BUILD=0',
'-DDEVICELIBS_ROOT={0}/aomp-dir/rocm-device-libs'.format(src)
'-DDEVICELIBS_ROOT={0}/aomp-dir/rocm-device-libs'.format(src),
'-DCMAKE_VERBOSE_MAKEFILE=1'
]
components['openmp'] = [
'../aomp-dir/amd-llvm-project/openmp',
'-DROCM_DIR={0}'.format(hsa_prefix),
'-DDEVICE_LIBS_DIR={0}/lib'.format(devlibs_prefix),
'-DDEVICE_LIBS_DIR={0}{1}'.format(devlibs_prefix, bitcode_dir),
'-DAOMP_STANDALONE_BUILD=0',
'-DDEVICELIBS_ROOT={0}/aomp-dir/rocm-device-libs'.format(src),
'-DOPENMP_TEST_C_COMPILER=$AOMP/bin/clang',
@@ -416,7 +510,7 @@ def install(self, spec, prefix):
components['openmp-debug'] = [
'../aomp-dir/amd-llvm-project/openmp',
'-DROCM_DIR={0}'.format(hsa_prefix),
'-DDEVICE_LIBS_DIR={0}/lib'.format(devlibs_prefix),
'-DDEVICE_LIBS_DIR={0}{1}'.format(devlibs_prefix, bitcode_dir),
'-DAOMP_STANDALONE_BUILD=0',
'-DDEVICELIBS_ROOT={0}/aomp-dir/rocm-device-libs'.format(src),
'-DOPENMP_TEST_C_COMPILER=$AOMP/bin/clang',
@@ -434,7 +528,13 @@ def install(self, spec, prefix):
'-DOPENMP_ENABLE_LIBOMPTARGET_HSA=1'
]
if self.spec.version == Version('3.8.0'):
if self.spec.version >= Version('3.9.0'):
components['openmp-debug'] += [
'-DENABLE_SOURCE_COPY=ON',
'-DOPENMP_SOURCE_DEBUG_MAP={0}'.format(debug_map_format),
]
if self.spec.version >= Version('3.8.0'):
components['openmp-debug'] += [
'-DLIBOMP_ARCH=x86_64',
'-DLIBOMP_OMP_VERSION=50',
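The repeated `-DDEVICE_LIBS_DIR` edits in this hunk all hang on one version check: ROCm 3.9.0 moved the device-libs bitcode from `lib/` to `amdgcn/bitcode/`. A minimal standalone sketch of that selection (plain tuples stand in for Spack's `Version` class; the prefix and helper name are illustrative, not Spack API):

```python
def bitcode_dir(version):
    """ROCm 3.9.0 moved device-lib bitcode from lib/ to amdgcn/bitcode/."""
    return '/amdgcn/bitcode' if version >= (3, 9, 0) else '/lib'

# Mirrors '-DDEVICE_LIBS_DIR={0}{1}'.format(devlibs_prefix, bitcode_dir)
prefix = '/opt/rocm-device-libs'
flag = '-DDEVICE_LIBS_DIR={0}{1}'.format(prefix, bitcode_dir((3, 9, 0)))
print(flag)  # -DDEVICE_LIBS_DIR=/opt/rocm-device-libs/amdgcn/bitcode
print(bitcode_dir((3, 8, 0)))  # /lib
```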

View File

@@ -12,7 +12,7 @@ class AwsParallelcluster(PythonPackage):
tool to deploy and manage HPC clusters in the AWS cloud."""
homepage = "https://github.com/aws/aws-parallelcluster"
url = "https://pypi.io/packages/source/a/aws-parallelcluster/aws-parallelcluster-2.9.1.tar.gz"
url = "https://pypi.io/packages/source/a/aws-parallelcluster/aws-parallelcluster-2.10.0.tar.gz"
maintainers = [
'sean-smith', 'demartinofra', 'enrico-usai', 'lukeseawalker', 'rexcsn',
@@ -23,6 +23,7 @@ class AwsParallelcluster(PythonPackage):
'pcluster.config', 'pcluster.networking'
]
version('2.10.0', sha256='a7a27871b4f54cb913b0c1233e675131e9b2099549af0840d32c36b7e91b104b')
version('2.9.1', sha256='12dc22286cd447a16931f1f8619bdd47d4543fd0de7905d52b6c6f83ff9db8a3')
version('2.9.0', sha256='e98a8426bc46aca0860d9a2be89bbc4a90aab3ed2f60ca6c385b595fbbe79a78')
version('2.8.1', sha256='c183dc3f053bc2445db724e561cea7f633dd5e7d467a7b3f9b2f2f703f7d5d49')
@@ -33,29 +34,33 @@ class AwsParallelcluster(PythonPackage):
version('2.5.1', sha256='4fd6e14583f8cf81f9e4aa1d6188e3708d3d14e6ae252de0a94caaf58be76303')
version('2.5.0', sha256='3b0209342ea0d9d8cc95505456103ad87c2d4e35771aa838765918194efd0ad3')
# common deps
depends_on('python@2.7:', type=('build', 'run'))
depends_on('py-future@0.16.0:0.18.2', type=('build', 'run'))
depends_on('py-ipaddress@1.0.22:', type=('build', 'run'))
depends_on('py-configparser@3.5.0:3.8.1', when='^python@:2', type=('build', 'run'))
# 2.9.x changes
depends_on('py-tabulate@0.8.2:0.8.3', when='@:2.8', type=('build', 'run'))
depends_on('py-tabulate@0.8.5', when='@2.9: ^python@3.0:3.4', type=('build', 'run'))
depends_on('py-tabulate@0.8.2:0.8.7', when='@2.9: ^python@:2,3.5:', type=('build', 'run'))
depends_on('py-pyyaml@5.2', when='@2.6:2.8 ^python@3.0:3.4', type=('build', 'run'))
depends_on('py-pyyaml@5.3.1:', when='@2.9: ^python@:2,3.5:', type=('build', 'run'))
depends_on('py-jinja2@2.10.1', when='@2.9: ^python@3.0:3.4', type=('build', 'run'))
depends_on('py-jinja2@2.11.0:', when='@2.9: ^python@:2,3.5:', type=('build', 'run'))
# 2.8.x changes
depends_on('py-boto3@1.14.3:', when='@2.8:', type=('build', 'run'))
depends_on('py-boto3@1.16.14:', when='@2.10:', type=('build', 'run'))
depends_on('py-boto3@1.14.3:', when='@2.8:2.9', type=('build', 'run'))
depends_on('py-boto3@1.10.15:', when='@:2.7', type=('build', 'run'))
# 2.6.x changes
depends_on('py-setuptools', when='@2.6:', type=('build', 'run'))
depends_on('py-enum34@1.1.6:', when='@2.6: ^python@:3.3', type=('build', 'run'))
depends_on('py-enum34@1.1.6:', when='@:2.5', type=('build', 'run'))
depends_on('py-pyyaml@5.1.2', when='@2.6: ^python@:2,3.5:', type=('build', 'run'))
depends_on('py-pyyaml@5.1.2:', when='@:2.5', type=('build', 'run'))
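The three `py-boto3` lines above use `when=` ranges to partition package versions into non-overlapping lower bounds. A standalone sketch of that selection logic (plain tuples stand in for Spack version ranges; the helper name is illustrative):

```python
def boto3_floor(pkg_version):
    """Pick the py-boto3 lower bound matching the when= ranges above."""
    if pkg_version >= (2, 10):      # when='@2.10:'
        return '1.16.14'
    if pkg_version >= (2, 8):       # when='@2.8:2.9'
        return '1.14.3'
    return '1.10.15'                # when='@:2.7'

print(boto3_floor((2, 10, 0)))  # 1.16.14
print(boto3_floor((2, 9, 1)))   # 1.14.3
print(boto3_floor((2, 7, 0)))   # 1.10.15
```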

View File

@@ -539,16 +539,3 @@ def install(self, spec, prefix):
def setup_run_environment(self, env):
env.set('BOOST_ROOT', self.prefix)
def setup_dependent_package(self, module, dependent_spec):
# Disable find package's config mode for versions of Boost that
# didn't provide it. See https://github.com/spack/spack/issues/20169
# and https://cmake.org/cmake/help/latest/module/FindBoost.html
is_cmake = isinstance(dependent_spec.package, CMakePackage)
if self.spec.satisfies('boost@:1.69.0') and is_cmake:
args_fn = type(dependent_spec.package).cmake_args
def _cmake_args(self):
return ['-DBoost_NO_BOOST_CMAKE=ON'] + args_fn(self)
type(dependent_spec.package).cmake_args = _cmake_args
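The hook above rebinds the dependent package class's `cmake_args` so the wrapped version prepends `-DBoost_NO_BOOST_CMAKE=ON`. A self-contained sketch of that monkeypatching pattern (`FakePackage` is a stand-in, not a Spack class):

```python
class FakePackage:
    def cmake_args(self):
        return ['-DFOO=ON']

# Capture the original method, then install a wrapper on the class so
# every instance also receives the Boost flag.
args_fn = FakePackage.cmake_args

def _cmake_args(self):
    return ['-DBoost_NO_BOOST_CMAKE=ON'] + args_fn(self)

FakePackage.cmake_args = _cmake_args

print(FakePackage().cmake_args())
# ['-DBoost_NO_BOOST_CMAKE=ON', '-DFOO=ON']
```

Note that applying the wrapper twice would stack the flag, which is why hooks like this need to run at most once per dependent.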

View File

@@ -14,6 +14,9 @@ class Botan(Package):
maintainers = ['aumuell']
version('2.17.2', sha256='ebe27dfe2b55d7e02bf520e926606c48b76b22facb483256b13ab38e018e1e6c')
version('2.17.1', sha256='741358b3f1638ed7d9b2f59b4e344aa46f4966b15958b5434c0ac1580df0c0c1')
version('2.17.0', sha256='b97044b312aa718349af7851331b064bc7bd5352400d5f80793bace427d01343')
version('2.16.0', sha256='92ed6ebc918d86bd1b04221ca518af4cf29cc326c4760740bd2d22e61cea2628')
version('2.15.0', sha256='d88af1307f1fefac79aa4f2f524699478d69ce15a857cf2d0a90ac6bf2a50009')
version('2.14.0', sha256='0c10f12b424a40ee19bde00292098e201d7498535c062d8d5b586d07861a54b5')

View File

@@ -6,7 +6,7 @@
from spack import *
class Camp(CMakePackage, CudaPackage, ROCmPackage):
class Camp(CMakePackage, CudaPackage, HipPackage):
"""
Compiler agnostic metaprogramming library providing concepts,
type operations and tuples for C++ and cuda
@@ -40,17 +40,12 @@ def cmake_args(self):
else:
options.append('-DENABLE_CUDA=OFF')
if '+rocm' in spec:
if '+hip' in spec:
arch = self.spec.variants['amdgpu_target'].value
options.extend([
'-DENABLE_HIP=ON',
'-DHIP_ROOT_DIR={0}'.format(spec['hip'].prefix)
])
archs = self.spec.variants['amdgpu_target'].value
if archs != 'none':
arch_str = ",".join(archs)
options.append(
'-DHIP_HIPCC_FLAGS=--amdgpu-target={0}'.format(arch_str)
)
'-DHIP_ROOT_DIR={0}'.format(spec['hip'].prefix),
'-DHIP_HIPCC_FLAGS=--amdgpu-target={0}'.format(arch)])
else:
options.append('-DENABLE_HIP=OFF')
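Both sides of the hunk above reduce to turning the `amdgpu_target` variant value into a single hipcc flag; multi-valued variants arrive as a tuple and are comma-joined, while `'none'` disables the flag entirely. A standalone sketch (arch names illustrative):

```python
def hipcc_flags(archs):
    """Build the --amdgpu-target CMake flag from a variant value."""
    if archs == 'none':
        return []
    if isinstance(archs, str):
        archs = (archs,)
    return ['-DHIP_HIPCC_FLAGS=--amdgpu-target={0}'.format(','.join(archs))]

print(hipcc_flags(('gfx900', 'gfx906')))
# ['-DHIP_HIPCC_FLAGS=--amdgpu-target=gfx900,gfx906']
print(hipcc_flags('none'))  # []
```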

View File

@@ -27,7 +27,7 @@ class CbtfArgonavis(CMakePackage):
to point to target build.")
variant('runtime', default=False,
description="build only the runtime libraries and collectors.")
variant('build_type', default='None', values=('None',),
variant('build_type', default='None', values=('None'),
description='CMake build type')
depends_on("cmake@3.0.2:", type='build')
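The only textual difference in the `build_type` hunk above (and in the identical hunks in the other cbtf packages below) is a trailing comma, and in Python it matters: `('None',)` is a one-element tuple, while `('None')` is just the string `'None'` in grouping parentheses. A quick demonstration:

```python
# ('None') is just the string 'None'; ('None',) is a one-element tuple.
assert ('None') == 'None'
assert isinstance(('None',), tuple)

# Iterating the string yields characters, not the single allowed value:
print(list('None'))     # ['N', 'o', 'n', 'e']
print(list(('None',)))  # ['None']
```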

View File

@@ -39,7 +39,7 @@ class CbtfKrell(CMakePackage):
description="Build mpi experiment collector for mpich MPI.")
variant('runtime', default=False,
description="build only the runtime libraries and collectors.")
variant('build_type', default='None', values=('None',),
variant('build_type', default='None', values=('None'),
description='CMake build type')
variant('cti', default=False,
description="Build MRNet with the CTI startup option")

View File

@@ -20,7 +20,7 @@ class CbtfLanl(CMakePackage):
version('1.9.1.1', branch='1.9.1.1')
version('1.9.1.0', branch='1.9.1.0')
variant('build_type', default='None', values=('None',),
variant('build_type', default='None', values=('None'),
description='CMake build type')
variant('runtime', default=False,

View File

@@ -29,7 +29,7 @@ class Cbtf(CMakePackage):
variant('runtime', default=False,
description="build only the runtime libraries and collectors.")
variant('build_type', default='None', values=('None',),
variant('build_type', default='None', values=('None'),
description='CMake build type')
depends_on("cmake@3.0.2:", type='build')

View File

@@ -31,6 +31,8 @@ class CcsQcd(MakefilePackage):
homepage = "https://github.com/fiber-miniapp/ccs-qcd"
git = "https://github.com/fiber-miniapp/ccs-qcd.git"
tags = ['hep']
version('master', branch='master')
version('1.2.1', commit='d7c6b6923f35a824e997ba8db5bd12dc20dda45c')

View File

@@ -6,7 +6,7 @@
from spack import *
class Chai(CMakePackage, CudaPackage, ROCmPackage):
class Chai(CMakePackage, CudaPackage, HipPackage):
"""
Copy-hiding array interface for data migration between memory spaces
"""
@@ -36,11 +36,12 @@ class Chai(CMakePackage, CudaPackage, ROCmPackage):
depends_on('umpire+cuda', when="+cuda")
depends_on('raja+cuda', when="+raja+cuda")
# variants +rocm and amdgpu_targets are not automatically passed to
# variants +hip and amdgpu_targets are not automatically passed to
# dependencies, so do it manually.
depends_on('umpire+rocm', when='+rocm')
depends_on('raja+rocm', when="+raja+rocm")
for val in ROCmPackage.amdgpu_targets:
amdgpu_targets = HipPackage.amd_gputargets_list()
depends_on('umpire+hip', when='+hip')
depends_on('raja+hip', when="+raja+hip")
for val in amdgpu_targets:
depends_on('umpire amdgpu_target=%s' % val, when='amdgpu_target=%s' % val)
depends_on('raja amdgpu_target=%s' % val, when='+raja amdgpu_target=%s' % val)
@@ -62,17 +63,12 @@ def cmake_args(self):
else:
options.append('-DENABLE_CUDA=OFF')
if '+rocm' in spec:
if '+hip' in spec:
arch = self.spec.variants['amdgpu_target'].value
options.extend([
'-DENABLE_HIP=ON',
'-DHIP_ROOT_DIR={0}'.format(spec['hip'].prefix)
])
archs = self.spec.variants['amdgpu_target'].value
if archs != 'none':
arch_str = ",".join(archs)
options.append(
'-DHIP_HIPCC_FLAGS=--amdgpu-target={0}'.format(arch_str)
)
'-DHIP_ROOT_DIR={0}'.format(spec['hip'].prefix),
'-DHIP_HIPCC_FLAGS=--amdgpu-target={0}'.format(arch)])
else:
options.append('-DENABLE_HIP=OFF')

View File

@@ -17,11 +17,13 @@ class Charmpp(Package):
allows programs to run portably from small multicore computers
(your laptop) to the largest supercomputers."""
homepage = "http://charmplusplus.org"
url = "http://charm.cs.illinois.edu/distrib/charm-6.8.2.tar.gz"
homepage = "https://charmplusplus.org"
url = "https://charm.cs.illinois.edu/distrib/charm-6.8.2.tar.gz"
git = "https://github.com/UIUC-PPL/charm.git"
version("develop", branch="master")
maintainers = ["matthiasdiener"]
version("master", branch="master")
version('6.10.2', sha256='7abb4cace8aebdfbb8006eac03eb766897c009cfb919da0d0a33f74c3b4e6deb')
version('6.10.1', sha256='ab96198105daabbb8c8bdf370f87b0523521ce502c656cb6cd5b89f69a2c70a8')
@@ -37,7 +39,7 @@ class Charmpp(Package):
version("6.5.1", sha256="68aa43e2a6e476e116a7e80e385c25c6ac6497807348025505ba8bfa256ed34a")
# Support OpenMPI; see
# <https://charm.cs.illinois.edu/redmine/issues/1206>
# <https://github.com/UIUC-PPL/charm/issues/1206>
# Patch is no longer needed in versions 6.8.0+
patch("mpi.patch", when="@:6.7.1")
@@ -147,17 +149,12 @@ def charmarch(self):
("darwin", "x86_64", "mpi"): "mpi-darwin-x86_64",
("darwin", "x86_64", "multicore"): "multicore-darwin-x86_64",
("darwin", "x86_64", "netlrts"): "netlrts-darwin-x86_64",
("linux", "i386", "mpi"): "mpi-linux",
("linux", "i386", "multicore"): "multicore-linux",
("linux", "i386", "netlrts"): "netlrts-linux",
("linux", "i386", "uth"): "uth-linux",
("linux", "x86_64", "mpi"): "mpi-linux-x86_64",
("linux", "x86_64", "multicore"): "multicore-linux-x86_64",
("linux", "x86_64", "netlrts"): "netlrts-linux-x86_64",
("linux", "x86_64", "verbs"): "verbs-linux-x86_64",
("linux", "x86_64", "ofi"): "ofi-linux-x86_64",
("linux", "x86_64", "ucx"): "ucx-linux-x86_64",
("linux", "x86_64", "uth"): "uth-linux-x86_64",
("linux", "ppc", "mpi"): "mpi-linux-ppc",
("linux", "ppc", "multicore"): "multicore-linux-ppc",
("linux", "ppc", "netlrts"): "netlrts-linux-ppc",
@@ -173,6 +170,21 @@ def charmarch(self):
("cnl", "x86_64", "gni"): "gni-crayxc",
("cnl", "x86_64", "mpi"): "mpi-crayxc",
}
# Some versions were renamed/removed in 6.11
if self.spec.version < Version("6.11.0"):
versions.update({("linux", "i386", "mpi"): "mpi-linux"})
versions.update({("linux", "i386", "multicore"):
"multicore-linux"})
versions.update({("linux", "i386", "netlrts"): "netlrts-linux"})
versions.update({("linux", "i386", "uth"): "uth-linux"})
else:
versions.update({("linux", "i386", "mpi"): "mpi-linux-i386"})
versions.update({("linux", "i386", "multicore"):
"multicore-linux-i386"})
versions.update({("linux", "i386", "netlrts"):
"netlrts-linux-i386"})
if (plat, mach, comm) not in versions:
raise InstallError(
"The communication mechanism %s is not supported "
@@ -301,7 +313,10 @@ def install(self, spec, prefix):
os.rename(tmppath, filepath)
except (IOError, OSError):
pass
shutil.rmtree(join_path(builddir, "tmp"))
tmp_path = join_path(builddir, "tmp")
if not os.path.islink(tmp_path):
shutil.rmtree(tmp_path)
if self.spec.satisfies('@6.9.99'):
# A broken 'doc' link in the prefix can break the build.
@@ -315,8 +330,8 @@ def install(self, spec, prefix):
@run_after('install')
@on_package_attributes(run_tests=True)
def check_build(self):
make('-C', join_path(self.stage.source_path, 'charm/tests'),
'test', parallel=False)
make('-C', join_path(self.stage.source_path, 'tests'),
'test', 'TESTOPTS=++local', parallel=False)
def setup_dependent_build_environment(self, env, dependent_spec):
env.set('MPICC', self.prefix.bin.ampicc)
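The `charmarch` change above keys a table on a (platform, machine, communication-layer) triple and patches the i386 entries by version, since Charm++ 6.11 added an explicit `-i386` suffix. A standalone sketch with a trimmed table (plain tuples stand in for Spack's `Version`):

```python
def charmarch(plat, mach, comm, version):
    """Resolve a Charm++ build name, with the 6.11 i386 suffix rename."""
    versions = {
        ('linux', 'x86_64', 'mpi'): 'mpi-linux-x86_64',
        ('linux', 'x86_64', 'multicore'): 'multicore-linux-x86_64',
    }
    if version < (6, 11, 0):
        versions[('linux', 'i386', 'mpi')] = 'mpi-linux'
    else:
        versions[('linux', 'i386', 'mpi')] = 'mpi-linux-i386'
    if (plat, mach, comm) not in versions:
        raise ValueError('unsupported combination')
    return versions[(plat, mach, comm)]

print(charmarch('linux', 'i386', 'mpi', (6, 10, 2)))  # mpi-linux
print(charmarch('linux', 'i386', 'mpi', (6, 11, 0)))  # mpi-linux-i386
```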

View File

@@ -14,8 +14,11 @@ class Clhep(CMakePackage):
list_url = "https://proj-clhep.web.cern.ch/proj-clhep/"
list_depth = 1
tags = ['hep']
maintainers = ['drbenmorgan']
version('2.4.4.0', sha256='5df78c11733a091da9ae5a24ce31161d44034dd45f20455587db85f1ca1ba539')
version('2.4.1.3', sha256='27c257934929f4cb1643aa60aeaad6519025d8f0a1c199bc3137ad7368245913')
version('2.4.1.2', sha256='ff96e7282254164380460bc8cf2dff2b58944084eadcd872b5661eb5a33fa4b8')
version('2.4.1.0', sha256='d14736eb5c3d21f86ce831dc1afcf03d423825b35c84deb6f8fd16773528c54d')

View File

@@ -11,11 +11,12 @@ class Cmake(Package):
tools designed to build, test and package software.
"""
homepage = 'https://www.cmake.org'
url = 'https://github.com/Kitware/CMake/releases/download/v3.15.5/cmake-3.15.5.tar.gz'
url = 'https://github.com/Kitware/CMake/releases/download/v3.19.0/cmake-3.19.0.tar.gz'
maintainers = ['chuckatkins']
executables = ['^cmake$']
version('3.19.0', sha256='fdda688155aa7e72b7c63ef6f559fca4b6c07382ea6dca0beb5f45aececaf493')
version('3.18.4', sha256='597c61358e6a92ecbfad42a9b5321ddd801fc7e7eca08441307c9138382d4f77')
version('3.18.3', sha256='2c89f4e30af4914fd6fb5d00f863629812ada848eee4e2d29ec7e456d7fa32e5')
version('3.18.2', sha256='5d4e40fc775d3d828c72e5c45906b4d9b59003c9433ff1b36a1cb552bbd51d7e')

View File

@@ -12,6 +12,8 @@ class Collier(CMakePackage):
homepage = "https://collier.hepforge.org"
url = "https://collier.hepforge.org/downloads/?f=collier-1.2.5.tar.gz"
tags = ['hep']
version('1.2.5', sha256='3ec58a975ff0c3b1ca870bc38973476c923ff78fd3dd5850e296037852b94a8b')
version('1.2.4', sha256='92ae8f61461b232fbd47a6d8e832e1a726d504f9390b7edc49a68fceedff8857')
version('1.2.3', sha256='e6f72df223654df59113b0067a4bebe9f8c20227bb81371d3193e1557bdf56fb')

View File

@@ -37,7 +37,11 @@ class Conduit(Package):
url = "https://github.com/LLNL/conduit/releases/download/v0.3.0/conduit-v0.3.0-src-with-blt.tar.gz"
git = "https://github.com/LLNL/conduit.git"
version('master', branch='master', submodules=True, preferred=True)
version('develop', branch='develop', submodules=True, preferred=True)
# note: the main branch in conduit was renamed to develop, this next entry
# is to bridge any spack dependencies that are still using the name master
version('master', branch='develop', submodules=True)
version('0.6.0', sha256='078f086a13b67a97e4ab6fe1063f2fef2356df297e45b43bb43d74635f80475d')
version('0.5.1', sha256='68a3696d1ec6d3a4402b44a464d723e6529ec41016f9b44c053676affe516d44')
version('0.5.0', sha256='7efac668763d02bd0a2c0c1b134d9f5ee27e99008183905bb0512e5502b8b4fe')
version('0.4.0', sha256='c228e6f0ce5a9c0ffb98e0b3d886f2758ace1a4b40d00f3f118542c0747c1f52')

View File

@@ -5,6 +5,137 @@
import os
from spack import *
import platform
_versions = {
# cuDNN 8.0.4
'8.0.4.30-11.1': {
'Linux-x86_64': '8f4c662343afce5998ce963500fe3bb167e9a508c1a1a949d821a4b80fa9beab',
'Linux-ppc64le': 'b4ddb51610cbae806017616698635a9914c3e1eb14259f3a39ee5c84e7106712'},
'8.0.4.30-11.0': {
'Linux-x86_64': '38a81a28952e314e21577432b0bab68357ef9de7f6c8858f721f78df9ee60c35',
'Linux-ppc64le': '8da8ed689b1a348182ddd3f59b6758a502e11dc6708c33f96e3b4a40e033d2e1'},
'8.0.4.30-10.2': {
'Linux-x86_64': 'c12c69eb16698eacac40aa46b9ce399d4cd86efb6ff0c105142f8a28fcfb980e',
'Linux-ppc64le': '32a5b92f9e1ef2be90e10f220c4ab144ca59d215eb6a386e93597f447aa6507e'},
'8.0.4.30-10.1': {
'Linux-x86_64': 'eb4b888e61715168f57a0a0a21c281ada6856b728e5112618ed15f8637487715',
'Linux-ppc64le': '690811bbf04adef635f4a6f480575fc2a558c4a2c98c85c7090a3a8c60dacea9'},
# cuDNN 8.0.3
'8.0.3.33-11.0': {
'Linux-x86_64': '8924bcc4f833734bdd0009050d110ad0c8419d3796010cf7bc515df654f6065a',
'Linux-ppc64le': 'c2d0519831137b43d0eebe07522edb4ef5d62320e65e5d5fa840a9856f25923d'},
'8.0.3.33-10.2': {
'Linux-x86_64': 'b3d487c621e24b5711983b89bb8ad34f0378bdbf8a1a4b86eefaa23b19956dcc',
'Linux-ppc64le': 'ff22c9c37af191c9104989d784427cde744cdde879bfebf3e4e55ca6a9634a11'},
'8.0.3.33-10.1': {
'Linux-x86_64': '4752ac6aea4e4d2226061610d6843da6338ef75a93518aa9ce50d0f58df5fb07',
'Linux-ppc64le': 'c546175f6ec86a11ee8fb9ab5526fa8d854322545769a87d35b1a505992f89c3'},
# cuDNN 8.0.2
'8.0.2.39-11.0': {
'Linux-x86_64': '672f46288b8edd98f8d156a4f1ff518201ca6de0cff67915ceaa37f6d6d86345',
'Linux-ppc64le': 'b7c1ce5b1191eb007ba3455ea5f497fdce293a646545d8a6ed93e9bb06d7f057'},
'8.0.2.39-10.2': {
'Linux-x86_64': 'c9cbe5c211360f3cfbc0fb104f0e9096b37e53f89392525679f049276b2f701f',
'Linux-ppc64le': 'c32325ff84a8123491f2e58b3694885a9a672005bc21764b38874688c0e43262'},
'8.0.2.39-10.1': {
'Linux-x86_64': '82148a68bd6bdaab93af5e05bb1842b8ccb3ab7de7bed41f609a7616c102213d',
'Linux-ppc64le': '8196ec4f031356317baeccefbc4f61c8fccb2cf0bdef0a6431438918ddf68fb9'},
# cuDNN 8.0
'8.0.0.180-11.0': {
'Linux-x86_64': '9e75ea70280a77de815e0bdc85d08b67e081bc99a708b574092142344d2ba07e',
'Linux-ppc64le': '1229e94731bbca63ee7f5a239f4e1838a51a301d896f3097fbf7377d74704060'},
'8.0.0.180-10.2': {
'Linux-x86_64': '0c87c12358ee2b99d57c2a8c7560e3bb93e54bb929f5f8bec4964a72a2bb261d',
'Linux-ppc64le': '59e4ad6db15fcc374976e8052fe39e3f30f34079710fb3c7751a64c853d9243f'},
# cuDNN 7.6.5
'7.6.5.32-10.2': {
'Linux-x86_64': '600267f2caaed2fd58eb214ba669d8ea35f396a7d19b94822e6b36f9f7088c20',
'Linux-ppc64le': '7dc08b6ab9331bfd12207d4802c61db1ad7cace7395b67a6e7b16efa0335668b'},
'7.6.5.32-10.1': {
'Linux-x86_64': '7eaec8039a2c30ab0bc758d303588767693def6bf49b22485a2c00bf2e136cb3',
'Darwin-x86_64': '8ecce28a5ed388a2b9b2d239e08d7c550f53b79288e6d9e5eb4c152bfc711aff',
'Linux-ppc64le': '97b2faf73eedfc128f2f5762784d21467a95b2d5ba719825419c058f427cbf56'},
'7.6.5.32-10.0': {
'Linux-x86_64': '28355e395f0b2b93ac2c83b61360b35ba6cd0377e44e78be197b6b61b4b492ba',
'Darwin-x86_64': '6fa0b819374da49102e285ecf7fcb8879df4d0b3cc430cc8b781cdeb41009b47',
'Linux-ppc64le': 'b1717f4570083bbfc6b8b59f280bae4e4197cc1cb50e9d873c05adf670084c5b'},
'7.6.5.32-9.2': {
'Linux-x86_64': 'a2a2c7a8ba7b16d323b651766ee37dcfdbc2b50d920f73f8fde85005424960e4',
'Linux-ppc64le': 'a11f44f9a827b7e69f527a9d260f1637694ff7c1674a3e46bd9ec054a08f9a76'},
'7.6.5.32-9.0': {
'Linux-x86_64': 'bd0a4c0090d5b02feec3f195738968690cc2470b9bc6026e6fe8ff245cd261c8'},
# cuDNN 7.6.4
'7.6.4.38-10.1': {
'Linux-x86_64': '32091d115c0373027418620a09ebec3658a6bc467d011de7cdd0eb07d644b099',
'Darwin-x86_64': 'bfced062c3689ced2c1fb49c7d5052e6bc3da6974c1eb707e4dcf8cd209d4236',
'Linux-ppc64le': 'f3615fea50986a4dfd05d7a0cf83396dfdceefa9c209e8bf9691e20a48e420ce'},
'7.6.4.38-10.0': {
'Linux-x86_64': '417bb5daf51377037eb2f5c87649000ca1b9cec0acb16cfe07cb1d3e9a961dbf',
'Darwin-x86_64': 'af01ab841caec25087776a6b8fc7782883da12e590e24825ad1031f9ae0ed4b1',
'Linux-ppc64le': 'c1725ad6bd7d7741e080a1e6da4b62eac027a94ac55c606cce261e3f829400bb'},
'7.6.4.38-9.2': {
'Linux-x86_64': 'c79156531e641289b6a6952888b9637059ef30defd43c3cf82acf38d67f60a27',
'Linux-ppc64le': '98d8aae2dcd851558397a9a30b73242f257e1556be17c83650e63a0685969884'},
'7.6.4.38-9.0': {
'Linux-x86_64': '8db78c3623c192d4f03f3087b41c32cb0baac95e13408b5d9dabe626cb4aab5d'},
# cuDNN 7.6.3
'7.6.3.30-10.1': {
'Linux-x86_64': '352557346d8111e2f954c494be1a90207103d316b8777c33e62b3a7f7b708961',
'Linux-ppc64le': 'f274735a8fc31923d3623b1c3d2b1d0d35bb176687077c6a4d4353c6b900d8ee'},
# cuDNN 7.5.1
'7.5.1.10-10.1': {
'Linux-x86_64': '2c833f43c9147d9a25a20947a4c5a5f5c33b2443240fd767f63b330c482e68e0',
'Linux-ppc64le': 'a9e23bc83c970daec20874ccd1d8d80b648adf15440ecd0164818b330b1e2663'},
'7.5.1.10-10.0': {
'Linux-x86_64': 'c0a4ec438920aa581dd567117b9c316745b4a451ac739b1e04939a3d8b229985',
'Linux-ppc64le': 'd9205718da5fbab85433476f9ff61fcf4b889d216d6eea26753bbc24d115dd70'},
# cuDNN 7.5.0
'7.5.0.56-10.1': {
'Linux-x86_64': 'c31697d6b71afe62838ad2e57da3c3c9419c4e9f5635d14b683ebe63f904fbc8',
'Linux-ppc64le': '15415eb714ab86ab6c7531f2cac6474b5dafd989479b062776c670b190e43638'},
'7.5.0.56-10.0': {
'Linux-x86_64': '701097882cb745d4683bb7ff6c33b8a35c7c81be31bac78f05bad130e7e0b781',
'Linux-ppc64le': 'f0c1cbd9de553c8e2a3893915bd5fff57b30e368ef4c964d783b6a877869e93a'},
# cuDNN 7.3.0
'7.3.0.29-9.0': {
'Linux-x86_64': '403f9043ff2c7b2c5967454872275d07bca11fd41dfc7b21995eadcad6dbe49b'},
# cuDNN 7.2.1
'7.2.1.38-9.0': {
'Linux-x86_64': 'cf007437b9ac6250ec63b89c25f248d2597fdd01369c80146567f78e75ce4e37'},
# cuDNN 7.1.3
'7.1.3-9.1': {
'Linux-x86_64': 'dd616d3794167ceb923d706bf73e8d6acdda770751492b921ee6827cdf190228',
'Linux-ppc64le': 'e3b4837f711b98a52faacc872a68b332c833917ef3cf87c0108f1d01af9b2931'},
# cuDNN 6.0
'6.0-8.0': {
'Linux-x86_64': '9b09110af48c9a4d7b6344eb4b3e344daa84987ed6177d5c44319732f3bb7f9c'},
# cuDNN 5.1
'5.1-8.0': {
'Linux-x86_64': 'c10719b36f2dd6e9ddc63e3189affaa1a94d7d027e63b71c3f64d449ab0645ce'},
}
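The refactor replaces the long flat `version()` list with this keyed dict; the point is that a checksum can now be looked up from a `'<cudnn_version>-<cuda_version>'` key plus a `'<system>-<machine>'` key (in the style of `platform.system()`/`platform.machine()`). A standalone sketch over a small subset of the table (helper name illustrative, not Spack API):

```python
# Subset of the _versions table above; sha256 values copied from it.
_versions = {
    '8.0.4.30-11.0': {
        'Linux-x86_64':
            '38a81a28952e314e21577432b0bab68357ef9de7f6c8858f721f78df9ee60c35',
        'Linux-ppc64le':
            '8da8ed689b1a348182ddd3f59b6758a502e11dc6708c33f96e3b4a40e033d2e1'},
}

def sha_for(cudnn_version, cuda_version, system, machine):
    """Return the sha256 recorded for one cudnn/cuda pair on one platform."""
    key = '{0}-{1}'.format(cudnn_version, cuda_version)
    return _versions[key]['{0}-{1}'.format(system, machine)]

print(sha_for('8.0.4.30', '11.0', 'Linux', 'x86_64')[:8])  # 38a81a28
```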
class Cudnn(Package):
@@ -20,214 +151,26 @@ class Cudnn(Package):
# Note that download links don't work from command line,
# need to use modified URLs like in url_for_version.
maintainers = ['adamjstewart']
maintainers = ['adamjstewart', 'bvanessen']
# cuDNN 8.0.4
version('8.0.4.30-11.1-linux-x64',
sha256='8f4c662343afce5998ce963500fe3bb167e9a508c1a1a949d821a4b80fa9beab')
version('8.0.4.30-11.1-linux-ppc64le',
sha256='b4ddb51610cbae806017616698635a9914c3e1eb14259f3a39ee5c84e7106712')
version('8.0.4.30-11.0-linux-x64',
sha256='38a81a28952e314e21577432b0bab68357ef9de7f6c8858f721f78df9ee60c35',
preferred=True)
version('8.0.4.30-11.0-linux-ppc64le',
sha256='8da8ed689b1a348182ddd3f59b6758a502e11dc6708c33f96e3b4a40e033d2e1')
version('8.0.4.30-10.2-linux-x64',
sha256='c12c69eb16698eacac40aa46b9ce399d4cd86efb6ff0c105142f8a28fcfb980e')
version('8.0.4.30-10.2-linux-ppc64le',
sha256='32a5b92f9e1ef2be90e10f220c4ab144ca59d215eb6a386e93597f447aa6507e')
version('8.0.4.30-10.1-linux-x64',
sha256='eb4b888e61715168f57a0a0a21c281ada6856b728e5112618ed15f8637487715')
version('8.0.4.30-10.1-linux-ppc64le',
sha256='690811bbf04adef635f4a6f480575fc2a558c4a2c98c85c7090a3a8c60dacea9')
# cuDNN 8.0.3
version('8.0.3.33-11.0-linux-x64',
sha256='8924bcc4f833734bdd0009050d110ad0c8419d3796010cf7bc515df654f6065a')
version('8.0.3.33-11.0-linux-ppc64le',
sha256='c2d0519831137b43d0eebe07522edb4ef5d62320e65e5d5fa840a9856f25923d')
version('8.0.3.33-10.2-linux-x64',
sha256='b3d487c621e24b5711983b89bb8ad34f0378bdbf8a1a4b86eefaa23b19956dcc')
version('8.0.3.33-10.2-linux-ppc64le',
sha256='ff22c9c37af191c9104989d784427cde744cdde879bfebf3e4e55ca6a9634a11')
version('8.0.3.33-10.1-linux-x64',
sha256='4752ac6aea4e4d2226061610d6843da6338ef75a93518aa9ce50d0f58df5fb07')
version('8.0.3.33-10.1-linux-ppc64le',
sha256='c546175f6ec86a11ee8fb9ab5526fa8d854322545769a87d35b1a505992f89c3')
# cuDNN 8.0.2
version('8.0.2.39-11.0-linux-x64',
sha256='672f46288b8edd98f8d156a4f1ff518201ca6de0cff67915ceaa37f6d6d86345')
version('8.0.2.39-11.0-linux-ppc64le',
sha256='b7c1ce5b1191eb007ba3455ea5f497fdce293a646545d8a6ed93e9bb06d7f057')
version('8.0.2.39-10.2-linux-x64',
sha256='c9cbe5c211360f3cfbc0fb104f0e9096b37e53f89392525679f049276b2f701f')
version('8.0.2.39-10.2-linux-ppc64le',
sha256='c32325ff84a8123491f2e58b3694885a9a672005bc21764b38874688c0e43262')
version('8.0.2.39-10.1-linux-x64',
sha256='82148a68bd6bdaab93af5e05bb1842b8ccb3ab7de7bed41f609a7616c102213d')
version('8.0.2.39-10.1-linux-ppc64le',
sha256='8196ec4f031356317baeccefbc4f61c8fccb2cf0bdef0a6431438918ddf68fb9')
# cuDNN 8.0
version('8.0.0.180-11.0-linux-x64',
sha256='9e75ea70280a77de815e0bdc85d08b67e081bc99a708b574092142344d2ba07e')
version('8.0.0.180-11.0-linux-ppc64le',
sha256='1229e94731bbca63ee7f5a239f4e1838a51a301d896f3097fbf7377d74704060')
version('8.0.0.180-10.2-linux-x64',
sha256='0c87c12358ee2b99d57c2a8c7560e3bb93e54bb929f5f8bec4964a72a2bb261d')
version('8.0.0.180-10.2-linux-ppc64le',
sha256='59e4ad6db15fcc374976e8052fe39e3f30f34079710fb3c7751a64c853d9243f')
# cuDNN 7.6.5
version('7.6.5.32-10.2-linux-x64',
sha256='600267f2caaed2fd58eb214ba669d8ea35f396a7d19b94822e6b36f9f7088c20')
version('7.6.5.32-10.2-linux-ppc64le',
sha256='7dc08b6ab9331bfd12207d4802c61db1ad7cace7395b67a6e7b16efa0335668b')
version('7.6.5.32-10.1-linux-x64',
sha256='7eaec8039a2c30ab0bc758d303588767693def6bf49b22485a2c00bf2e136cb3')
version('7.6.5.32-10.1-osx-x64',
sha256='8ecce28a5ed388a2b9b2d239e08d7c550f53b79288e6d9e5eb4c152bfc711aff')
version('7.6.5.32-10.1-linux-ppc64le',
sha256='97b2faf73eedfc128f2f5762784d21467a95b2d5ba719825419c058f427cbf56')
version('7.6.5.32-10.0-linux-x64',
sha256='28355e395f0b2b93ac2c83b61360b35ba6cd0377e44e78be197b6b61b4b492ba')
version('7.6.5.32-10.0-osx-x64',
sha256='6fa0b819374da49102e285ecf7fcb8879df4d0b3cc430cc8b781cdeb41009b47')
version('7.6.5.32-10.0-linux-ppc64le',
sha256='b1717f4570083bbfc6b8b59f280bae4e4197cc1cb50e9d873c05adf670084c5b')
version('7.6.5.32-9.2-linux-x64',
sha256='a2a2c7a8ba7b16d323b651766ee37dcfdbc2b50d920f73f8fde85005424960e4')
version('7.6.5.32-9.2-linux-ppc64le',
sha256='a11f44f9a827b7e69f527a9d260f1637694ff7c1674a3e46bd9ec054a08f9a76')
version('7.6.5.32-9.0-linux-x64',
sha256='bd0a4c0090d5b02feec3f195738968690cc2470b9bc6026e6fe8ff245cd261c8')
# cuDNN 7.6.4
version('7.6.4.38-10.1-linux-x64',
sha256='32091d115c0373027418620a09ebec3658a6bc467d011de7cdd0eb07d644b099')
version('7.6.4.38-10.1-osx-x64',
sha256='bfced062c3689ced2c1fb49c7d5052e6bc3da6974c1eb707e4dcf8cd209d4236')
version('7.6.4.38-10.1-linux-ppc64le',
sha256='f3615fea50986a4dfd05d7a0cf83396dfdceefa9c209e8bf9691e20a48e420ce')
version('7.6.4.38-10.0-linux-x64',
sha256='417bb5daf51377037eb2f5c87649000ca1b9cec0acb16cfe07cb1d3e9a961dbf')
version('7.6.4.38-10.0-osx-x64',
sha256='af01ab841caec25087776a6b8fc7782883da12e590e24825ad1031f9ae0ed4b1')
version('7.6.4.38-10.0-linux-ppc64le',
sha256='c1725ad6bd7d7741e080a1e6da4b62eac027a94ac55c606cce261e3f829400bb')
version('7.6.4.38-9.2-linux-x64',
sha256='c79156531e641289b6a6952888b9637059ef30defd43c3cf82acf38d67f60a27')
version('7.6.4.38-9.2-linux-ppc64le',
sha256='98d8aae2dcd851558397a9a30b73242f257e1556be17c83650e63a0685969884')
version('7.6.4.38-9.0-linux-x64',
sha256='8db78c3623c192d4f03f3087b41c32cb0baac95e13408b5d9dabe626cb4aab5d')
# cuDNN 7.6.3
version('7.6.3.30-10.1-linux-x64',
sha256='352557346d8111e2f954c494be1a90207103d316b8777c33e62b3a7f7b708961')
version('7.6.3.30-10.1-linux-ppc64le',
sha256='f274735a8fc31923d3623b1c3d2b1d0d35bb176687077c6a4d4353c6b900d8ee')
# cuDNN 7.5.1
version('7.5.1.10-10.1-linux-x64',
sha256='2c833f43c9147d9a25a20947a4c5a5f5c33b2443240fd767f63b330c482e68e0')
version('7.5.1.10-10.1-linux-ppc64le',
sha256='a9e23bc83c970daec20874ccd1d8d80b648adf15440ecd0164818b330b1e2663')
version('7.5.1.10-10.0-linux-x64',
sha256='c0a4ec438920aa581dd567117b9c316745b4a451ac739b1e04939a3d8b229985')
version('7.5.1.10-10.0-linux-ppc64le',
sha256='d9205718da5fbab85433476f9ff61fcf4b889d216d6eea26753bbc24d115dd70')
# cuDNN 7.5.0
version('7.5.0.56-10.1-linux-x64',
sha256='c31697d6b71afe62838ad2e57da3c3c9419c4e9f5635d14b683ebe63f904fbc8')
version('7.5.0.56-10.1-linux-ppc64le',
sha256='15415eb714ab86ab6c7531f2cac6474b5dafd989479b062776c670b190e43638')
version('7.5.0.56-10.0-linux-x64',
sha256='701097882cb745d4683bb7ff6c33b8a35c7c81be31bac78f05bad130e7e0b781')
version('7.5.0.56-10.0-linux-ppc64le',
sha256='f0c1cbd9de553c8e2a3893915bd5fff57b30e368ef4c964d783b6a877869e93a')
# cuDNN 7.3.0
version('7.3.0.29-9.0-linux-x64',
sha256='403f9043ff2c7b2c5967454872275d07bca11fd41dfc7b21995eadcad6dbe49b')
# cuDNN 7.2.1
version('7.2.1.38-9.0-linux-x64',
sha256='cf007437b9ac6250ec63b89c25f248d2597fdd01369c80146567f78e75ce4e37')
# cuDNN 7.1.3
version('7.1.3-9.1-linux-x64',
sha256='dd616d3794167ceb923d706bf73e8d6acdda770751492b921ee6827cdf190228')
version('7.1.3-9.1-linux-ppc64le',
sha256='e3b4837f711b98a52faacc872a68b332c833917ef3cf87c0108f1d01af9b2931')
# cuDNN 6.0
version('6.0-8.0-linux-x64',
sha256='9b09110af48c9a4d7b6344eb4b3e344daa84987ed6177d5c44319732f3bb7f9c')
# cuDNN 5.1
version('5.1-8.0-linux-x64',
sha256='c10719b36f2dd6e9ddc63e3189affaa1a94d7d027e63b71c3f64d449ab0645ce')
# CUDA 10.2
depends_on('cuda@10.2.0:10.2.999', when='@7.6.5.32-10.2-linux-x64')
# CUDA 10.1
depends_on('cuda@10.1.0:10.1.999', when='@7.6.5.32-10.1-osx-x64')
depends_on('cuda@10.1.0:10.1.999', when='@7.6.5.32-10.1-linux-x64')
depends_on('cuda@10.1.0:10.1.999', when='@7.6.5.32-10.1-linux-ppc64le')
depends_on('cuda@10.1.0:10.1.999', when='@7.6.4.38-10.1-osx-x64')
depends_on('cuda@10.1.0:10.1.999', when='@7.6.4.38-10.1-linux-x64')
depends_on('cuda@10.1.0:10.1.999', when='@7.6.4.38-10.1-linux-ppc64le')
depends_on('cuda@10.1.0:10.1.999', when='@7.6.3.30-10.1-linux-x64')
depends_on('cuda@10.1.0:10.1.999', when='@7.6.3.30-10.1-linux-ppc64le')
depends_on('cuda@10.1.0:10.1.999', when='@7.5.0.56-10.1-linux-x64')
depends_on('cuda@10.1.0:10.1.999', when='@7.5.0.56-10.1-linux-ppc64le')
# CUDA 10.0
depends_on('cuda@10.0.0:10.0.999', when='@7.6.5.32-10.0-osx-x64')
depends_on('cuda@10.0.0:10.0.999', when='@7.6.5.32-10.0-linux-x64')
depends_on('cuda@10.0.0:10.0.999', when='@7.6.5.32-10.0-linux-ppc64le')
depends_on('cuda@10.0.0:10.0.999', when='@7.6.4.38-10.0-osx-x64')
depends_on('cuda@10.0.0:10.0.999', when='@7.6.4.38-10.0-linux-x64')
depends_on('cuda@10.0.0:10.0.999', when='@7.6.4.38-10.0-linux-ppc64le')
depends_on('cuda@10.0.0:10.0.999', when='@7.5.1.10-10.0-linux-x64')
depends_on('cuda@10.0.0:10.0.999', when='@7.5.1.10-10.0-linux-ppc64le')
depends_on('cuda@10.0.0:10.0.999', when='@7.5.0.56-10.0-linux-x64')
depends_on('cuda@10.0.0:10.0.999', when='@7.5.0.56-10.0-linux-ppc64le')
# CUDA 9.2
depends_on('cuda@9.2.0:9.2.999', when='@7.6.5.32-9.2-linux-x64')
depends_on('cuda@9.2.0:9.2.999', when='@7.6.5.32-9.2-linux-ppc64le')
depends_on('cuda@9.2.0:9.2.999', when='@7.6.4.38-9.2-linux-x64')
depends_on('cuda@9.2.0:9.2.999', when='@7.6.4.38-9.2-linux-ppc64le')
# CUDA 9.1
depends_on('cuda@9.1.0:9.1.999', when='@7.1.3-9.1-linux-x64')
depends_on('cuda@9.1.0:9.1.999', when='@7.1.3-9.1-linux-ppc64le')
# CUDA 9.0
depends_on('cuda@9.0.0:9.0.999', when='@7.6.5.32-9.0-linux-x64')
depends_on('cuda@9.0.0:9.0.999', when='@7.6.4.38-9.0-linux-x64')
depends_on('cuda@9.0.0:9.0.999', when='@7.3.0.29-9.0-linux-x64')
depends_on('cuda@9.0.0:9.0.999', when='@7.2.1.38-9.0-linux-x64')
# CUDA 8.0
depends_on('cuda@8.0.0:8.0.999', when='@6.0-8.0-linux-x64')
depends_on('cuda@8.0.0:8.0.999', when='@5.1-8.0-linux-x64')
for ver, packages in _versions.items():
key = "{0}-{1}".format(platform.system(), platform.machine())
pkg = packages.get(key)
cudnn_ver, cuda_ver = ver.split('-')
long_ver = "{0}-{1}".format(cudnn_ver, cuda_ver)
if pkg:
version(long_ver, sha256=pkg)
# Add constraints matching CUDA version to cuDNN version
cuda_req = 'cuda@{0}.0:{0}.999'.format(cuda_ver)
cudnn_ver_req = '@{0}'.format(long_ver)
depends_on(cuda_req, when=cudnn_ver_req)
def url_for_version(self, version):
-url = 'https://developer.download.nvidia.com/compute/redist/cudnn/v{0}/cudnn-{1}-v{2}.tgz'
+url = 'https://developer.download.nvidia.com/compute/redist/cudnn/v{0}/cudnn-{1}-{2}-v{3}.tgz'
# Get the system and machine arch for building the file path
sys = "{0}-{1}".format(platform.system(), platform.machine())
# Munge it to match Nvidia's naming scheme
sys_key = sys.lower().replace('x86_64', 'x64').replace('darwin', 'osx')
if version >= Version('7.2'):
directory = version[:3]
@@ -246,7 +189,7 @@ def url_for_version(self, version):
ver = version[:2]
cuda = version[2:]
-return url.format(directory, cuda, ver)
+return url.format(directory, cuda, sys_key, ver)
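The new URL scheme inserts the munged platform key between the CUDA version and the cuDNN version. A rough standalone sketch, assuming a spec version string like `'7.6.5.32-10.2-linux-x64'` (plain string splitting stands in for Spack's `Version` slicing, and the pre-7.2 directory handling is omitted):

```python
URL = ('https://developer.download.nvidia.com/compute/redist/cudnn/'
       'v{0}/cudnn-{1}-{2}-v{3}.tgz')

def cudnn_url(version_str):
    cudnn_ver, cuda, system, arch = version_str.split('-')
    # Munge the platform to match NVIDIA's naming scheme:
    # x86_64 -> x64, darwin -> osx
    sys_key = '{0}-{1}'.format(system, arch).lower()
    sys_key = sys_key.replace('x86_64', 'x64').replace('darwin', 'osx')
    # Directory is the three-component cuDNN release, e.g. '7.6.5'
    directory = '.'.join(cudnn_ver.split('.')[:3])
    return URL.format(directory, cuda, sys_key, cudnn_ver)

print(cudnn_url('7.6.5.32-10.2-linux-x64'))
```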
def setup_run_environment(self, env):
if 'target=ppc64le: platform=linux' in self.spec:


@@ -21,6 +21,8 @@ class Dd4hep(CMakePackage):
maintainers = ['vvolkl', 'drbenmorgan']
tags = ['hep']
version('master', branch='master')
version('1.14.1', sha256='5b5742f1e23c2b36d3174cca95f810ce909c0eb66f3d6d7acb0ba657819e6717')
version('1.14.0', sha256='b603aa3c0db8dda392253aa71fa4a0f0c3c9715d47df0b895d45c1e8849f4895')


@@ -15,11 +15,12 @@ class Delphes(CMakePackage):
git = "https://github.com/delphes/delphes.git"
url = "http://cp3.irmp.ucl.ac.be/downloads/Delphes-3.4.2.tar.gz"
tags = ['hep']
maintainers = ['drbenmorgan', 'vvolkl', 'selvaggi']
version('master', branch='master')
version('3.4.3pre05', tag='3.4.3pre05')
version('3.4.3pre04', tag='3.4.3pre04')
version('3.4.3pre06', tag='3.4.3pre06')
version('3.4.2', sha256='d46a7c5474de650befdb89377115feee31f1743107ceb3d8da699be9d48c097b', preferred=True)
version('3.4.1', sha256='4b5a2aeac326643f45b6d45c39ba2302e323eeb86d8cb58843c6e73949b1208a')
version('3.4.0', sha256='c0f9500663a0c3a5c1eddcee598a67b5bcfc9318303195c6cacc0590b4023fa1')


@@ -16,6 +16,8 @@ class Dire(Package):
git = "http://gitlab.com/dire/direforpythia"
list_url = "http://dire.gitlab.io/Downloads.html"
tags = ['hep']
maintainers = ['mdiefent']
version('2.004', sha256='8cc1213b58fec744fdaa50834560a14b141de99efb2c3e3d3d47f3d6d84b179f')


@@ -25,7 +25,7 @@ class Elsi(CMakePackage):
)
variant(
'elpa2_kernel', default="none", description="ELPA2 Kernel",
-values=('none', 'AVX', 'AVX2', 'AVX512'), multi=False
+values=('AVX', 'AVX2', 'AVX512'), multi=False
)
variant(
'enable_pexsi', default=False, description='Enable PEXSI support'


@@ -13,6 +13,8 @@ class Evtgen(AutotoolsPackage):
homepage = "https://evtgen.hepforge.org/"
url = "http://lcgpackages.web.cern.ch/lcgpackages/tarFiles/sources/MCGeneratorsTarFiles/evtgen-R01-07-00.tar.gz"
tags = ['hep']
maintainers = ['vvolkl']
version('02-00-00', sha256='02372308e1261b8369d10538a3aa65fe60728ab343fcb64b224dac7313deb719')


@@ -0,0 +1,27 @@
--- dfgather.F90_old 2020-09-16 17:19:18.000000000 -0600
+++ dfgather.F90 2020-09-16 17:19:21.000000000 -0600
@@ -27,6 +27,11 @@
! loop over q-points
Do iq = 1, nqpt
tq0 = tqgamma (iq)
+
+ call genfilname(basename='X0', bzsampl=bzsampl,&
+ & acont=input%xs%tddft%acont, nar= .not. input%xs%tddft%aresdf,&
+ & tord=input%xs%tddft%torddf, markfxcbse=tfxcbse, iqmt=iq, filnam=fnchi0)
+
! calculate k+q and G+k+q related variables
Call init1offs (qvkloff(1, iq))
! size of local field effects
@@ -54,6 +59,12 @@
call mpi_bcast(chi0hd,3*3,MPI_DOUBLE_COMPLEX,iproc,mpi_comm_world,ierr)
#endif
if(rank.eq.0.or.(firstinnode.and. .not.input%sharedfs))then
+ if (iw == wpari) then
+ print *, 'procs=', procs
+ print *, 'rank=', rank
+ print *, 'iq=', iq, '/', nqpt
+ print *, 'fnchi0=',trim(fnchi0)
+ end if
Call putx0 (tq0, iq, iw, trim(fnchi0), '', chi0, chi0wg, &
& chi0hd)
endif


@@ -0,0 +1,117 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class Exciting(MakefilePackage):
"""
exciting is a full-potential all-electron density-functional-theory package
implementing the families of linearized augmented planewave methods. It can
be applied to all kinds of materials, irrespective of the atomic species
involved, and also allows for exploring the physics of core electrons. A
particular focus is on excited states within many-body perturbation theory.
"""
homepage = "http://exciting-code.org/"
url = "http://exciting.wdfiles.com/local--files/nitrogen-14/exciting.nitrogen-14.tar.gz"
version('14', sha256='a7feaffdc23881d6c0737d2f79f94d9bf073e85ea358a57196d7f7618a0a3eff')
# as-of-yet unpublished fix to version 14
patch('dfgather.patch', when='@14', working_dir='src/src_xs', level=0)
variant('mpi', default=False, description='Use MPI')
variant('mkl', default=False, description='Use MKL')
variant('omp', default=True, description='Use OpenMP')
variant('scalapack', default=False, description='Use ScaLAPACK')
depends_on('blas')
depends_on('lapack')
depends_on('fftw', when='~mkl')
depends_on('mkl', when='+mkl')
depends_on('mpi', when='+mpi')
depends_on('scalapack', when='+scalapack')
conflicts('%gcc@10:', msg='exciting cannot be built with GCC 10')
for __compiler in spack.compilers.supported_compilers():
if __compiler != 'intel':
conflicts('%{0}'.format(__compiler), when='^mkl',
msg='MKL only works with the Intel compiler')
def edit(self, spec, prefix):
opts = {}
opts['BUILDSMP'] = 'true'
opts['F90_OPTS'] = '-cpp '
opts['F77_OPTS'] = '-cpp -O3 '
opts['CPP_ON_OPTS'] = '-cpp -DXS -DISO -DLIBXC'
opts['LIB_ARP'] = 'libarpack.a'
opts['F90'] = spack_fc
opts['F77'] = spack_f77
if '+omp' in spec:
opts['LDFLAGS'] = self.compiler.openmp_flag + ' -DUSEOMP'
opts['F90_OPTS'] += self.compiler.openmp_flag + ' -DUSEOMP'
opts['F77_OPTS'] += self.compiler.openmp_flag + ' -DUSEOMP'
if '%intel' in spec:
opts['F90_OPTS'] += ' -O3 -cpp -ip -unroll -scalar_rep '
opts['CPP_ON_OPTS'] += ' -DIFORT -DFFTW'
if '%gcc' in spec:
opts['F90_OPTS'] += '-O3 -march=native -ffree-line-length-0'
filter_file('FCFLAGS = @FCFLAGS@',
' '.join(['FCFLAGS = @FCFLAGS@', '-cpp',
self.compiler.openmp_flag]),
'src/libXC/src/Makefile.in')
if '+mkl' in spec:
if '%intel' in spec:
opts['LIB_LPK'] = '-mkl=parallel'
opts['INC_MKL'] = spec['mkl'].headers.include_flags
opts['LIB_MKL'] = spec['mkl'].libs.ld_flags
else:
opts['LIB_LPK'] = ' '.join([spec['lapack'].libs.ld_flags,
spec['blas'].libs.ld_flags,
self.compiler.openmp_flag])
if '+mpi' in spec:
opts['BUILDMPI'] = 'true'
opts['MPIF90'] = spec['mpi'].mpifc
opts['MPIF90_CPP_OPTS'] = self.compiler.openmp_flag
opts['MPIF90_CPP_OPTS'] += ' -DMPI -DMPIRHO -DMPISEC '
opts['MPIF90_OPTS'] = ' '.join(['$(F90_OPTS)', '$(CPP_ON_OPTS) '
'$(MPIF90_CPP_OPTS)'])
opts['MPIF90MT'] = '$(MPIF90)'
else:
opts['BUILDMPI'] = 'false'
if '+scalapack' in spec:
opts['LIB_SCLPK'] = spec['scalapack'].libs.ld_flags
opts['LIB_SCLPK'] += ' ' + self.compiler.openmp_flag
opts['CPP_SCLPK'] = ' -DSCAL '
opts['LIBS_MPI'] = '$(LIB_SCLPK)'
opts['MPIF90_CPP_OPTS'] += ' $(CPP_SCLPK) '
opts['USE_SYS_LAPACK'] = 'true'
opts['LIB_FFT'] = 'fftlib.a'
opts['LIB_BZINT'] = 'libbzint.a'
opts['LIBS'] = '$(LIB_ARP) $(LIB_LPK) $(LIB_FFT) $(LIB_BZINT)'
with open('build/make.inc', 'a') as inc:
for key in opts:
inc.write('{0} = {1}\n'.format(key, opts[key]))
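The `edit` method accumulates all Makefile settings in the `opts` dict and appends them to `build/make.inc` as `KEY = value` assignments. A minimal sketch of that write pattern (the `write_make_inc` helper is hypothetical; `io.StringIO` stands in for the real file):

```python
import io

def write_make_inc(opts, stream):
    # One Makefile assignment per option, in insertion order.
    for key, value in opts.items():
        stream.write('{0} = {1}\n'.format(key, value))

buf = io.StringIO()
write_make_inc({'BUILDSMP': 'true', 'LIB_ARP': 'libarpack.a'}, buf)
print(buf.getvalue())
```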
def install(self, spec, prefix):
install_tree('bin', prefix)
install_tree('species', prefix.species)
install_tree('tools', prefix.tools)
def setup_run_environment(self, env):
env.set('WNHOME', self.prefix)
env.set('EXCITINGROOT', self.prefix)
env.set('EXCITINGBIN', self.prefix.bin)
env.set('EXCITINGTOOLS', self.prefix.tools)
env.set('EXCITINGSTM', self.prefix.tools.stm)
env.set('EXCITINGVISUAL', self.prefix.xml.visualizationtemplates)
env.set('EXCITINGCONVERT', self.prefix.xml.inputfileconverter)
env.set('TIMEFORMAT', ' Elapsed time = %0lR')
env.set('WRITEMINMAX', '1')
env.append_path('PYTHONPATH', self.prefix.tools.stm)
env.append_path('PATH', self.prefix.tools)
env.append_path('PATH', self.prefix)
env.append_path('PATH', self.prefix.tools.stm)


@@ -39,7 +39,9 @@ def url_for_version(self, version):
def configure_args(self):
spec = self.spec
-args = ['--without-docbook']
+args = ['--without-docbook',
+'--enable-static',
+]
if '+libbsd' in spec and '@2.2.1:' in spec:
args.append('--with-libbsd')
return args


@@ -36,7 +36,7 @@ class Fairlogger(CMakePackage):
multi=False,
description='CMake build type')
variant('cxxstd', default='default',
-values=('default', '11', '14', '17'),
+values=('11', '14', '17'),
multi=False,
description='Use the specified C++ standard when building.')
variant('pretty',
