Merge tag 'v0.15.0' into features/shared
1  .github/workflows/linux_build_tests.yaml (vendored)
@@ -5,6 +5,7 @@ on:
     branches:
       - master
       - develop
       - releases/**
   pull_request:
     branches:
       - master
1  .github/workflows/linux_unit_tests.yaml (vendored)
@@ -5,6 +5,7 @@ on:
     branches:
       - master
       - develop
       - releases/**
   pull_request:
     branches:
       - master
1  .github/workflows/macos_unit_tests.yaml (vendored)
@@ -5,6 +5,7 @@ on:
     branches:
       - master
       - develop
       - releases/**
   pull_request:
     branches:
       - master
@@ -5,6 +5,7 @@ on:
     branches:
       - master
       - develop
       - releases/**
   pull_request:
     branches:
       - master
@@ -40,6 +40,7 @@ jobs:
             - ninja-build
             - realpath
             - zsh
+            - fish
       env: [ TEST_SUITE=unit, COVERAGE=true ]
     - python: '3.8'
       os: linux
@@ -73,6 +74,7 @@ addons:
       - ninja-build
       - patchelf
       - zsh
+      - fish
     update: true

 # ~/.ccache needs to be cached directly as Travis is not taking care of it
98  CHANGELOG.md
@@ -1,4 +1,98 @@
-# v0.14.2 (2019-04-15)
+# v0.15.0 (2020-06-28)

`v0.15.0` is a major feature release.

## Major Features in this release

1. **Cray support.** Spack now works properly on Cray "Cluster" systems
   (non-XC systems) and after a `module purge` command on Cray systems.
   See #12989.

2. **Virtual package configuration.** Virtual packages are allowed in
   `packages.yaml` configuration. This allows users to specify a virtual
   package as non-buildable without needing to specify it for each
   implementation. See #14934.

3. **New config subcommands.** This release adds `spack config add` and
   `spack config remove` commands to add to and remove from YAML
   configuration files from the CLI. See #13920.

4. **Environment activation.** Anonymous environments are **no longer**
   automatically activated in the current working directory. To activate
   an environment from a `spack.yaml` file in the current directory, use
   the `spack env activate .` command. This removes a concern that users
   were too easily polluting their anonymous environments with accidental
   installations. See #17258.

5. **Apple clang compiler.** The clang compiler and the apple-clang
   compiler are now separate compilers in Spack. This allows Spack to
   improve support for the apple-clang compiler. See #17110.

6. **Finding external packages.** Spack packages can now support an API
   for finding external installations. This allows the `spack external
   find` command to automatically add installations of those packages to
   the user's configuration. See #15158.

## Additional new features of note

* support for using Spack with the fish shell (#9279)
* `spack load --first` option to load the first match instead of prompting the user (#15622)
* support for both new and classic versions of the Cray cce compiler (#17256, #12989)
* `spack dev-build` command:
  * supports stopping before a specified phase (#14699)
  * supports automatically launching a shell in the build environment (#14887)
* `spack install --fail-fast` stops a build at the first error rather than continuing best-effort (#15295)
* environments: SpecList references can be dereferenced as compiler or dependency constraints (#15245)
* `spack view` command: new support for a copy/relocate view type (#16480)
* ci pipelines: see the documentation for several improvements
* `spack mirror -a` command now supports excluding packages (#14154)
* `spack buildcache create` is now environment-aware (#16580)
* module generation: more flexible format for specifying naming schemes (#16629)
* lmod module generation: packages can be configured as core specs for the lmod hierarchy (#16517)

## Deprecations and Removals

The following commands were deprecated in v0.13.0 and have now been removed:

* `spack configure`
* `spack build`
* `spack diy`

The following command was deprecated in v0.14.0 and will be removed in the next major release:

* `spack bootstrap`

## Bugfixes

Some of the most notable bugfixes in this release include:

* Spack environments can now contain the string `-h` (#15429)
* `spack install` gracefully handles being backgrounded (#15723, #14682)
* Spack uses `-isystem` instead of `-I` in cases where the underlying build system does as well (#16077)
* Spack no longer prints any specs that cannot be safely copied into a Spack command (#16462)
* incomplete Spack environments containing python no longer cause problems (#16473)
* several improvements to binary package relocation

## Package Improvements

The Spack project is constantly engaged in routine maintenance, bugfixes,
and improvements for the package ecosystem. Of particular note in this
release:

* Spack now contains 4,339 packages; 430 of them are newly supported in v0.15.0
* GCC now builds properly on ARM architectures (#17280)
* Python: patched to support compiling mixed C/C++ python modules through distutils (#16856)
* improvements to the pytorch and py-tensorflow packages
* improvements to major MPI implementations: mvapich2, mpich, openmpi, and others

## Spack Project Management

* Much of the Spack CI infrastructure has moved from Travis to GitHub Actions (#16610, #14220, #16345)
* All merges to the `develop` branch run the E4S CI pipeline (#16338)
* The new `spack debug report` command makes reporting bugs easier (#15834)

# v0.14.2 (2020-04-15)

This is a minor release on the `0.14` series. It includes performance
improvements and bug fixes:
@@ -13,7 +107,7 @@ improvements and bug fixes:
 * Avoid adding spurious `LMOD` env vars to Intel modules (#15778)
 * Don't output [+] for mock installs run during tests (#15609)

-# v0.14.1 (2019-03-20)
+# v0.14.1 (2020-03-20)

This is a bugfix release on top of `v0.14.0`. Specific fixes include:
@@ -165,7 +165,7 @@ of environments:
       # Extra instructions
       extra_instructions:
         final: |
-          RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ \[$(tput sgr0)\]"' >> ~/.bashrc
+          RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ \[$(tput sgr0)\]"' >> ~/.bashrc
       # Labels for the image
       labels:
@@ -167,15 +167,6 @@ Any directory can be treated as an environment if it contains a file

    $ spack env activate -d /path/to/directory

-Spack commands that are environment sensitive will also act on the
-environment any time the current working directory contains a
-``spack.yaml`` file. Changing working directory to a directory
-containing a ``spack.yaml`` file is equivalent to the command:
-
-.. code-block:: console
-
-   $ spack env activate -d /path/to/dir --without-view
-
 Anonymous specs can be created in place using the command:

 .. code-block:: console
@@ -45,6 +45,7 @@ for setting up a build pipeline are as follows:
     tags:
       - <custom-tag>
     script:
+      - spack env activate .
       - spack ci generate
         --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
     artifacts:
@@ -384,6 +385,7 @@ a custom spack and make sure the generated rebuild jobs will clone it too:
       - git clone ${SPACK_REPO} --branch ${SPACK_REF}
       - . ./spack/share/spack/setup-env.sh
     script:
+      - spack env activate .
       - spack ci generate
         --spack-repo ${SPACK_REPO} --spack-ref ${SPACK_REF}
         --output-file "${CI_PROJECT_DIR}/jobs_scratch_dir/pipeline.yml"
@@ -5,7 +5,7 @@

 #: major, minor, patch version for Spack, in a tuple
-spack_version_info = (0, 14, 2)
+spack_version_info = (0, 15, 0)

 #: String containing Spack version joined with .'s
 spack_version = '.'.join(str(v) for v in spack_version_info)
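The version bump above also determines the derived `spack_version` string. A minimal, self-contained sketch of the same join (names mirror the diff; this is illustrative, not Spack itself):

```python
# Derive the dotted version string from the version-info tuple, exactly as
# the module above does after the bump to v0.15.0.
spack_version_info = (0, 15, 0)
spack_version = '.'.join(str(v) for v in spack_version_info)

print(spack_version)  # → 0.15.0
```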
@@ -25,6 +25,7 @@

 import spack.cmd
 import spack.config as config
+import spack.database as spack_db
 import spack.fetch_strategy as fs
 import spack.util.gpg
 import spack.relocate as relocate
@@ -32,7 +33,6 @@
 import spack.mirror
 import spack.util.url as url_util
 import spack.util.web as web_util

 from spack.spec import Spec
 from spack.stage import Stage
-from spack.util.gpg import Gpg
@@ -282,31 +282,47 @@ def sign_tarball(key, force, specfile_path):
 def generate_package_index(cache_prefix):
     """Create the build cache index page.

-    Creates (or replaces) the "index.html" page at the location given in
+    Creates (or replaces) the "index.json" page at the location given in
     cache_prefix. This page contains a link for each binary package (*.yaml)
     and public key (*.key) under cache_prefix.
     """
     tmpdir = tempfile.mkdtemp()
     try:
-        index_html_path = os.path.join(tmpdir, 'index.html')
-        file_list = (
-            entry
-            for entry in web_util.list_url(cache_prefix)
-            if (entry.endswith('.yaml')
-                or entry.endswith('.key')))
+        db_root_dir = os.path.join(tmpdir, 'db_root')
+        db = spack_db.Database(None, db_dir=db_root_dir,
+                               enable_transaction_locking=False,
+                               record_fields=['spec', 'ref_count'])

-        with open(index_html_path, 'w') as f:
-            f.write(BUILD_CACHE_INDEX_TEMPLATE.format(
-                title='Spack Package Index',
-                path_list='\n'.join(
-                    BUILD_CACHE_INDEX_ENTRY_TEMPLATE.format(path=path)
-                    for path in file_list)))
+        file_list = (
+            entry
+            for entry in web_util.list_url(cache_prefix)
+            if entry.endswith('.yaml'))
+
+        tty.debug('Retrieving spec.yaml files from {0} to build index'.format(
+            cache_prefix))
+        for file_path in file_list:
+            try:
+                yaml_url = url_util.join(cache_prefix, file_path)
+                tty.debug('fetching {0}'.format(yaml_url))
+                _, _, yaml_file = web_util.read_from_url(yaml_url)
+                yaml_contents = codecs.getreader('utf-8')(yaml_file).read()
+                s = Spec.from_yaml(yaml_contents)
+                db.add(s, None)
+            except (URLError, web_util.SpackWebError) as url_err:
+                tty.error('Error reading spec.yaml: {0}'.format(file_path))
+                tty.error(url_err)

-        web_util.push_to_url(
-            index_html_path,
-            url_util.join(cache_prefix, 'index.html'),
-            keep_original=False,
-            extra_args={'ContentType': 'text/html'})
+        try:
+            index_json_path = os.path.join(db_root_dir, 'index.json')
+            with open(index_json_path, 'w') as f:
+                db._write_to_file(f)
+
+            web_util.push_to_url(
+                index_json_path,
+                url_util.join(cache_prefix, 'index.json'),
+                keep_original=False,
+                extra_args={'ContentType': 'application/json'})
     finally:
         shutil.rmtree(tmpdir)
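The rewritten `generate_package_index` filters the mirror listing down to `*.yaml` entries, parses each one, and aggregates the results into a single `index.json`. A hedged, self-contained sketch of that flow, with plain `json` and a stand-in `parse_spec` in place of `Spec.from_yaml` and the Spack database:

```python
import json

def build_index(entries, parse_spec):
    # Keep only the spec.yaml entries, parse each one, and aggregate the
    # parsed specs into one JSON index document (sorted for determinism).
    specs = [parse_spec(e) for e in entries if e.endswith('.yaml')]
    return json.dumps({'specs': specs}, sort_keys=True)

# Hypothetical mirror listing: two spec files and one public key.
entries = ['zlib-1.2.11.spec.yaml', 'public.key', 'bzip2-1.0.8.spec.yaml']
index = build_index(entries,
                    parse_spec=lambda name: name.rsplit('.spec.yaml', 1)[0])
```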
@@ -825,49 +841,55 @@ def get_spec(spec=None, force=False):
     return try_download_specs(urls=urls, force=force)


-def get_specs(force=False, allarch=False):
+def get_specs(allarch=False):
     """
     Get spec.yaml's for build caches available on mirror
     """
     global _cached_specs
     arch = architecture.Arch(architecture.platform(),
                              'default_os', 'default_target')
     arch_pattern = ('([^-]*-[^-]*-[^-]*)')
     if not allarch:
         arch_pattern = '(%s-%s-[^-]*)' % (arch.platform, arch.os)

     regex_pattern = '%s(.*)(spec.yaml$)' % (arch_pattern)
     arch_re = re.compile(regex_pattern)

     if not spack.mirror.MirrorCollection():
         tty.debug("No Spack mirrors are currently configured")
         return {}

-    urls = set()
     for mirror in spack.mirror.MirrorCollection().values():
         fetch_url_build_cache = url_util.join(
             mirror.fetch_url, _build_cache_relative_path)

-        mirror_dir = url_util.local_file_path(fetch_url_build_cache)
-        if mirror_dir:
-            tty.msg("Finding buildcaches in %s" % mirror_dir)
-            if os.path.exists(mirror_dir):
-                files = os.listdir(mirror_dir)
-                for file in files:
-                    m = arch_re.search(file)
-                    if m:
-                        link = url_util.join(fetch_url_build_cache, file)
-                        urls.add(link)
-        else:
-            tty.msg("Finding buildcaches at %s" %
-                    url_util.format(fetch_url_build_cache))
-            p, links = web_util.spider(
-                url_util.join(fetch_url_build_cache, 'index.html'))
-            for link in links:
-                m = arch_re.search(link)
-                if m:
-                    urls.add(link)
+        tty.msg("Finding buildcaches at %s" %
+                url_util.format(fetch_url_build_cache))

-    return try_download_specs(urls=urls, force=force)
+        index_url = url_util.join(fetch_url_build_cache, 'index.json')
+
+        try:
+            _, _, file_stream = web_util.read_from_url(
+                index_url, 'application/json')
+            index_object = codecs.getreader('utf-8')(file_stream).read()
+        except (URLError, web_util.SpackWebError) as url_err:
+            tty.error('Failed to read index {0}'.format(index_url))
+            tty.debug(url_err)
+            # Just return whatever specs we may already have cached
+            return _cached_specs
+
+        tmpdir = tempfile.mkdtemp()
+        index_file_path = os.path.join(tmpdir, 'index.json')
+        with open(index_file_path, 'w') as fd:
+            fd.write(index_object)
+
+        db_root_dir = os.path.join(tmpdir, 'db_root')
+        db = spack_db.Database(None, db_dir=db_root_dir,
+                               enable_transaction_locking=False)
+
+        db._read_from_file(index_file_path)
+        spec_list = db.query_local(installed=False)
+
+        for indexed_spec in spec_list:
+            spec_arch = architecture.arch_for_spec(indexed_spec.architecture)
+            if (allarch is True or spec_arch == arch):
+                _cached_specs.add(indexed_spec)
+
+    return _cached_specs


 def get_keys(install=False, trust=False, force=False):
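The architecture filter retained above builds a regex from a `platform-os-target` triple and matches it against `spec.yaml` file names. A small self-contained sketch of the same pattern, with made-up platform and OS values:

```python
import re

def arch_regex(platform=None, os_name=None):
    # Match any "platform-os-target" triple by default, or pin the
    # platform and OS when a specific architecture is requested.
    arch_pattern = '([^-]*-[^-]*-[^-]*)'
    if platform and os_name:
        arch_pattern = '(%s-%s-[^-]*)' % (platform, os_name)
    return re.compile('%s(.*)(spec.yaml$)' % arch_pattern)

arch_re = arch_regex('linux', 'ubuntu18.04')
names = ['linux-ubuntu18.04-x86_64-gcc-9.3.0-zlib.spec.yaml',
         'darwin-catalina-x86_64-clang-11-zlib.spec.yaml']
matches = [n for n in names if arch_re.search(n)]
```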
@@ -198,6 +198,9 @@ def set_compiler_environment_variables(pkg, env):
     compiler = pkg.compiler
     spec = pkg.spec

+    # Make sure the executables for this compiler exist
+    compiler.verify_executables()
+
     # Set compiler variables used by CMake and autotools
     assert all(key in compiler.link_paths for key in (
         'cc', 'cxx', 'f77', 'fc'))
@@ -118,13 +118,15 @@ def _do_patch_config_files(self):
         config_file = 'config.{0}'.format(config_name)
         if os.path.exists(config_file):
             # First search the top-level source directory
-            my_config_files[config_name] = config_file
+            my_config_files[config_name] = os.path.join(
+                self.configure_directory, config_file)
         else:
             # Then search in all sub directories recursively.
             # We would like to use AC_CONFIG_AUX_DIR, but not all packages
             # ship with their configure.in or configure.ac.
             config_path = next((os.path.join(r, f)
-                                for r, ds, fs in os.walk('.') for f in fs
+                                for r, ds, fs in os.walk(
+                                    self.configure_directory) for f in fs
                                 if f == config_file), None)
             my_config_files[config_name] = config_path
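The fallback branch above uses `next()` over an `os.walk` generator to find the first file with the target name under a chosen root (now `self.configure_directory` rather than the current directory), defaulting to `None` when nothing matches. A self-contained sketch of that pattern:

```python
import os
import tempfile

def find_config_file(root, config_file):
    # Walk the tree under `root` and return the first path whose basename
    # equals `config_file`, or None if no such file exists.
    return next((os.path.join(r, f)
                 for r, ds, fs in os.walk(root) for f in fs
                 if f == config_file), None)

# Hypothetical source tree with a config.guess in a subdirectory.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'build-aux'))
open(os.path.join(root, 'build-aux', 'config.guess'), 'w').close()

found = find_config_file(root, 'config.guess')
missing = find_config_file(root, 'config.sub')
```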
@@ -612,7 +612,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
         if 'enable-debug-messages' in gitlab_ci:
             debug_flag = '-d '

-    job_scripts = ['spack {0}ci rebuild'.format(debug_flag)]
+    job_scripts = [
+        'spack env activate .',
+        'spack {0}ci rebuild'.format(debug_flag),
+    ]

     compiler_action = 'NONE'
     if len(phases) > 1:
@@ -1025,9 +1028,9 @@ def read_cdashid_from_mirror(spec, mirror_url):
 def push_mirror_contents(env, spec, yaml_path, mirror_url, build_id):
     if mirror_url:
         tty.debug('Creating buildcache')
-        buildcache._createtarball(env, yaml_path, None, True, False,
-                                  mirror_url, None, True, False, False, True,
-                                  False)
+        buildcache._createtarball(env, spec_yaml=yaml_path, add_deps=False,
+                                  output_location=mirror_url, force=True,
+                                  allow_root=True)
     if build_id:
         tty.debug('Writing cdashid ({0}) to remote mirror: {1}'.format(
             build_id, mirror_url))
@@ -17,6 +17,19 @@
 import spack.util.spack_yaml as syaml


+def sort_yaml_obj(obj):
+    if isinstance(obj, collections_abc.Mapping):
+        return syaml.syaml_dict(
+            (k, sort_yaml_obj(v))
+            for k, v in
+            sorted(obj.items(), key=(lambda item: str(item[0]))))
+
+    if isinstance(obj, collections_abc.Sequence) and not isinstance(obj, str):
+        return syaml.syaml_list(sort_yaml_obj(x) for x in obj)
+
+    return obj
+
+
 def matches(obj, proto):
     """Returns True if the test object "obj" matches the prototype object
     "proto".
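The new `sort_yaml_obj` recursively orders mapping keys while leaving sequence order alone, so two structurally equal objects always serialize identically. A plain-`dict` sketch of the same idea (the real code returns `syaml` types):

```python
import json

def sort_obj(obj):
    # Recursively rebuild mappings with sorted keys; recurse into
    # sequences without reordering them; leave scalars untouched.
    if isinstance(obj, dict):
        return {k: sort_obj(obj[k]) for k in sorted(obj, key=str)}
    if isinstance(obj, (list, tuple)):
        return [sort_obj(x) for x in obj]
    return obj

# Same content, different insertion order:
a = {'b': [2, 1], 'a': {'y': 1, 'x': 2}}
b = {'a': {'x': 2, 'y': 1}, 'b': [2, 1]}
canonical_a = json.dumps(sort_obj(a))
canonical_b = json.dumps(sort_obj(b))
```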
@@ -235,8 +248,10 @@ def try_optimization_pass(name, yaml, optimization_pass, *args, **kwargs):
         # pass was not applied
         return (yaml, new_yaml, False, other_results)

-    pre_size = len(syaml.dump_config(yaml, default_flow_style=True))
-    post_size = len(syaml.dump_config(new_yaml, default_flow_style=True))
+    pre_size = len(syaml.dump_config(
+        sort_yaml_obj(yaml), default_flow_style=True))
+    post_size = len(syaml.dump_config(
+        sort_yaml_obj(new_yaml), default_flow_style=True))

     # pass makes the size worse: not applying
     applied = (post_size <= pre_size)
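Dumping a canonically sorted copy makes the serialized form, and therefore its length and hash, independent of key insertion order, which keeps the before/after comparison reproducible. A sketch using `json` in place of `syaml`:

```python
import json

def canonical_dump(obj):
    # Serialize with sorted keys so equal objects produce identical bytes.
    return json.dumps(obj, sort_keys=True)

a = {'x': 1, 'y': 2}
b = {'y': 2, 'x': 1}

plain_differs = json.dumps(a) != json.dumps(b)   # insertion order leaks through
canonical_same = canonical_dump(a) == canonical_dump(b)
```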
@@ -281,7 +296,7 @@ def build_histogram(iterator, key):
         continue

     value_hash = hashlib.sha1()
-    value_hash.update(syaml.dump_config(val).encode())
+    value_hash.update(syaml.dump_config(sort_yaml_obj(val)).encode())
     value_hash = value_hash.hexdigest()

     buckets[value_hash] += 1
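`build_histogram` buckets values by the SHA-1 of their serialized form, so sorting before dumping ensures that equal values land in the same bucket regardless of key order. A sketch with `json` standing in for `syaml.dump_config`:

```python
import hashlib
import json
from collections import defaultdict

def bucket_key(val):
    # Hash the canonical (key-sorted) serialization of the value.
    digest = hashlib.sha1()
    digest.update(json.dumps(val, sort_keys=True).encode())
    return digest.hexdigest()

buckets = defaultdict(int)
for val in [{'a': 1, 'b': 2}, {'b': 2, 'a': 1}, {'c': 3}]:
    buckets[bucket_key(val)] += 1

counts = sorted(buckets.values())
```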
@@ -292,7 +307,8 @@ def build_histogram(iterator, key):


 def optimizer(yaml):
-    original_size = len(syaml.dump_config(yaml, default_flow_style=True))
+    original_size = len(syaml.dump_config(
+        sort_yaml_obj(yaml), default_flow_style=True))

     # try factoring out commonly repeated portions
     common_job = {
@@ -369,7 +385,8 @@ def optimizer(yaml):
         common_subobject,
         {'variables': {'SPACK_ROOT_SPEC': spec}})

-    new_size = len(syaml.dump_config(yaml, default_flow_style=True))
+    new_size = len(syaml.dump_config(
+        sort_yaml_obj(yaml), default_flow_style=True))

     print('\n')
     print_delta('overall summary', original_size, new_size)
@@ -1,45 +0,0 @@
-# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
-# Spack Project Developers. See the top-level COPYRIGHT file for details.
-#
-# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-
-import spack.cmd.configure as cfg
-import llnl.util.tty as tty
-
-from spack.build_systems.autotools import AutotoolsPackage
-from spack.build_systems.cmake import CMakePackage
-from spack.build_systems.qmake import QMakePackage
-from spack.build_systems.scons import SConsPackage
-from spack.build_systems.waf import WafPackage
-from spack.build_systems.python import PythonPackage
-from spack.build_systems.perl import PerlPackage
-from spack.build_systems.meson import MesonPackage
-from spack.build_systems.sip import SIPPackage
-
-description = 'DEPRECATED: stops at build stage when installing a package'
-section = "build"
-level = "long"
-
-
-build_system_to_phase = {
-    AutotoolsPackage: 'build',
-    CMakePackage: 'build',
-    QMakePackage: 'build',
-    SConsPackage: 'build',
-    WafPackage: 'build',
-    PythonPackage: 'build',
-    PerlPackage: 'build',
-    MesonPackage: 'build',
-    SIPPackage: 'build',
-}
-
-
-def setup_parser(subparser):
-    cfg.setup_parser(subparser)
-
-
-def build(parser, args):
-    tty.warn("This command is deprecated. Use `spack install --until` to"
-             " select an end phase instead. The `spack build` command will be"
-             " removed in a future version of Spack")
-    cfg._stop_at_phase_during_install(args, build, build_system_to_phase)
@@ -68,9 +68,9 @@ def setup_parser(subparser):
                         type=str,
                         help="URL of the mirror where " +
                              "buildcaches will be written.")
-    create.add_argument('--no-rebuild-index', action='store_true',
-                        default=False, help="skip rebuilding index after " +
-                                            "building package(s)")
+    create.add_argument('--rebuild-index', action='store_true',
+                        default=False, help="Regenerate buildcache index " +
+                                            "after building package(s)")
     create.add_argument('-y', '--spec-yaml', default=None,
                         help='Create buildcache entry for spec from yaml file')
     create.add_argument('--only', default='package,dependencies',
@@ -108,8 +108,6 @@ def setup_parser(subparser):
                            action='store_true',
                            dest='variants',
                            help='show variants in output (can be long)')
-    listcache.add_argument('-f', '--force', action='store_true',
-                           help="force new download of specs")
     listcache.add_argument('-a', '--allarch', action='store_true',
                            help="list specs for all available architectures" +
                                 " instead of default platform and OS")
@@ -291,7 +289,7 @@ def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False,
     specs_from_cli = []
     has_errors = False
     allarch = other_arch
-    specs = bindist.get_specs(force, allarch)
+    specs = bindist.get_specs(allarch)
     for pkg in pkgs:
         matches = []
         tty.msg("buildcache spec(s) matching %s \n" % pkg)
@@ -323,9 +321,10 @@ def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False,
     return specs_from_cli


-def _createtarball(env, spec_yaml, packages, add_spec, add_deps,
-                   output_location, key, force, rel, unsigned, allow_root,
-                   no_rebuild_index):
+def _createtarball(env, spec_yaml=None, packages=None, add_spec=True,
+                   add_deps=True, output_location=os.getcwd(),
+                   signing_key=None, force=False, make_relative=False,
+                   unsigned=False, allow_root=False, rebuild_index=False):
     if spec_yaml:
         packages = set()
         with open(spec_yaml, 'r') as fd:
@@ -355,10 +354,6 @@ def _createtarball(env, spec_yaml, packages, add_spec, add_deps,
     msg = 'Buildcache files will be output to %s/build_cache' % outdir
     tty.msg(msg)

-    signkey = None
-    if key:
-        signkey = key
-
     matches = find_matching_specs(pkgs, env=env)

     if matches:
@@ -398,9 +393,9 @@ def _createtarball(env, spec_yaml, packages, add_spec, add_deps,

     for spec in specs:
         tty.debug('creating binary cache file for package %s ' % spec.format())
-        bindist.build_tarball(spec, outdir, force, rel,
-                              unsigned, allow_root, signkey,
-                              not no_rebuild_index)
+        bindist.build_tarball(spec, outdir, force, make_relative,
+                              unsigned, allow_root, signing_key,
+                              rebuild_index)


 def createtarball(args):
@@ -447,9 +442,12 @@ def createtarball(args):
     add_spec = ('package' in args.things_to_install)
     add_deps = ('dependencies' in args.things_to_install)

-    _createtarball(env, args.spec_yaml, args.specs, add_spec, add_deps,
-                   output_location, args.key, args.force, args.rel,
-                   args.unsigned, args.allow_root, args.no_rebuild_index)
+    _createtarball(env, spec_yaml=args.spec_yaml, packages=args.specs,
+                   add_spec=add_spec, add_deps=add_deps,
+                   output_location=output_location, signing_key=args.key,
+                   force=args.force, make_relative=args.rel,
+                   unsigned=args.unsigned, allow_root=args.allow_root,
+                   rebuild_index=args.rebuild_index)


 def installtarball(args):
@@ -458,8 +456,7 @@ def installtarball(args):
         tty.die("build cache file installation requires" +
                 " at least one package spec argument")
     pkgs = set(args.specs)
-    matches = match_downloaded_specs(pkgs, args.multiple, args.force,
-                                     args.otherarch)
+    matches = match_downloaded_specs(pkgs, args.multiple, args.otherarch)

     for match in matches:
         install_tarball(match, args)
@@ -491,7 +488,7 @@ def install_tarball(spec, args):

 def listspecs(args):
     """list binary packages available from mirrors"""
-    specs = bindist.get_specs(args.force, args.allarch)
+    specs = bindist.get_specs(args.allarch)
     if args.specs:
         constraints = set(args.specs)
         specs = [s for s in specs if any(s.satisfies(c) for c in constraints)]
@@ -38,7 +38,7 @@ def setup_parser(subparser):
     generate = subparsers.add_parser('generate', help=ci_generate.__doc__)
     generate.add_argument(
         '--output-file', default=None,
-        help="Absolute path to file where generated jobs file should be " +
+        help="Path to file where generated jobs file should be " +
              "written. The default is .gitlab-ci.yml in the root of the " +
              "repository.")
     generate.add_argument(
@@ -88,10 +88,10 @@ def ci_generate(args):
     use_dependencies = args.dependencies

     if not output_file:
-        gen_ci_dir = os.getcwd()
-        output_file = os.path.join(gen_ci_dir, '.gitlab-ci.yml')
+        output_file = os.path.abspath(".gitlab-ci.yml")
     else:
-        gen_ci_dir = os.path.dirname(output_file)
+        output_file_path = os.path.abspath(output_file)
+        gen_ci_dir = os.path.dirname(output_file_path)
         if not os.path.exists(gen_ci_dir):
             os.makedirs(gen_ci_dir)
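The corrected handling resolves the output file to an absolute path (so relative paths no longer need to be rejected) and creates the parent directory on demand. A self-contained sketch with a hypothetical helper name:

```python
import os
import tempfile

def resolve_output_file(output_file=None):
    # Default to .gitlab-ci.yml in the current directory; otherwise make
    # the given path absolute and ensure its parent directory exists.
    if not output_file:
        return os.path.abspath('.gitlab-ci.yml')
    output_file_path = os.path.abspath(output_file)
    gen_ci_dir = os.path.dirname(output_file_path)
    if not os.path.exists(gen_ci_dir):
        os.makedirs(gen_ci_dir)
    return output_file_path

base = tempfile.mkdtemp()
target = resolve_output_file(os.path.join(base, 'nested', 'pipeline.yml'))
```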
@@ -37,7 +37,7 @@ def setup_parser(subparser):
     find_parser.add_argument('add_paths', nargs=argparse.REMAINDER)
     find_parser.add_argument(
         '--scope', choices=scopes, metavar=scopes_metavar,
-        default=spack.config.default_modify_scope(),
+        default=spack.config.default_modify_scope('compilers'),
         help="configuration scope to modify")

     # Remove
@@ -49,7 +49,7 @@ def setup_parser(subparser):
     remove_parser.add_argument('compiler_spec')
     remove_parser.add_argument(
         '--scope', choices=scopes, metavar=scopes_metavar,
-        default=spack.config.default_modify_scope(),
+        default=spack.config.default_modify_scope('compilers'),
         help="configuration scope to modify")

     # List
@@ -1,85 +0,0 @@
-# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
-# Spack Project Developers. See the top-level COPYRIGHT file for details.
-#
-# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-
-import argparse
-
-import llnl.util.tty as tty
-import spack.cmd
-import spack.cmd.common.arguments as arguments
-import spack.cmd.install as inst
-
-from spack.build_systems.autotools import AutotoolsPackage
-from spack.build_systems.cmake import CMakePackage
-from spack.build_systems.qmake import QMakePackage
-from spack.build_systems.waf import WafPackage
-from spack.build_systems.perl import PerlPackage
-from spack.build_systems.intel import IntelPackage
-from spack.build_systems.meson import MesonPackage
-from spack.build_systems.sip import SIPPackage
-
-description = 'DEPRECATED: stage and configure a package but do not install'
-section = "build"
-level = "long"
-
-
-build_system_to_phase = {
-    AutotoolsPackage: 'configure',
-    CMakePackage: 'cmake',
-    QMakePackage: 'qmake',
-    WafPackage: 'configure',
-    PerlPackage: 'configure',
-    IntelPackage: 'configure',
-    MesonPackage: 'meson',
-    SIPPackage: 'configure',
-}
-
-
-def setup_parser(subparser):
-    subparser.add_argument(
-        '-v', '--verbose',
-        action='store_true',
-        help="print additional output during builds"
-    )
-    arguments.add_common_arguments(subparser, ['spec'])
-
-
-def _stop_at_phase_during_install(args, calling_fn, phase_mapping):
-    if not args.package:
-        tty.die("configure requires at least one package argument")
-
-    # TODO: to be refactored with code in install
-    specs = spack.cmd.parse_specs(args.package, concretize=True)
-    if len(specs) != 1:
-        tty.error('only one spec can be installed at a time.')
-    spec = specs.pop()
-    pkg = spec.package
-    try:
-        key = [cls for cls in phase_mapping if isinstance(pkg, cls)].pop()
-        phase = phase_mapping[key]
-        # Install package dependencies if needed
-        parser = argparse.ArgumentParser()
-        inst.setup_parser(parser)
-        tty.msg('Checking dependencies for {0}'.format(args.spec[0]))
-        cli_args = ['-v'] if args.verbose else []
-        install_args = parser.parse_args(cli_args + ['--only=dependencies'])
-        install_args.spec = args.spec
-        inst.install(parser, install_args)
-        # Install package and stop at the given phase
-        cli_args = ['-v'] if args.verbose else []
-        install_args = parser.parse_args(cli_args + ['--only=package'])
-        install_args.spec = args.spec
-        inst.install(parser, install_args, stop_at=phase)
-    except IndexError:
-        tty.error(
-            'Package {0} has no {1} phase, or its {1} phase is not separated from install'.format(  # NOQA: ignore=E501
-                spec.name, calling_fn.__name__)
-        )
-
-
-def configure(parser, args):
-    tty.warn("This command is deprecated. Use `spack install --until` to"
-             " select an end phase instead. The `spack configure` command will"
-             " be removed in a future version of Spack.")
-    _stop_at_phase_during_install(args, configure, build_system_to_phase)
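The removed commands dispatched from a package instance to a phase name via an `isinstance` scan of `build_system_to_phase`, with the `IndexError` raised by `.pop()` on an empty list signalling "no matching build system". A minimal sketch of that dispatch with stand-in classes:

```python
# Stand-in build-system base classes (hypothetical, for illustration).
class AutotoolsLike:
    pass

class CMakeLike:
    pass

phase_mapping = {AutotoolsLike: 'configure', CMakeLike: 'cmake'}

def phase_for(pkg):
    # Find the first mapping class the package is an instance of; an
    # empty match list makes .pop() raise IndexError, meaning "no phase".
    try:
        key = [cls for cls in phase_mapping if isinstance(pkg, cls)].pop()
        return phase_mapping[key]
    except IndexError:
        return None

cmake_phase = phase_for(CMakeLike())
no_phase = phase_for(object())
```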
@@ -1,20 +0,0 @@
-# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
-# Spack Project Developers. See the top-level COPYRIGHT file for details.
-#
-# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import spack.cmd.dev_build
-import llnl.util.tty as tty
-
-description = "DEPRECATED: do-it-yourself: build from local source directory"
-section = "build"
-level = "long"
-
-
-def setup_parser(subparser):
-    spack.cmd.dev_build.setup_parser(subparser)
-
-
-def diy(self, args):
-    tty.warn("`spack diy` has been renamed to `spack dev-build`."
-             "The `diy` command will be removed in a future version of Spack")
-    spack.cmd.dev_build.dev_build(self, args)
@@ -52,6 +52,9 @@ def env_activate_setup_parser(subparser):
     shells.add_argument(
         '--csh', action='store_const', dest='shell', const='csh',
         help="print csh commands to activate the environment")
+    shells.add_argument(
+        '--fish', action='store_const', dest='shell', const='fish',
+        help="print fish commands to activate the environment")

     view_options = subparser.add_mutually_exclusive_group()
     view_options.add_argument(
@@ -127,6 +130,9 @@ def env_deactivate_setup_parser(subparser):
     shells.add_argument(
         '--csh', action='store_const', dest='shell', const='csh',
         help="print csh commands to deactivate the environment")
+    shells.add_argument(
+        '--fish', action='store_const', dest='shell', const='fish',
+        help="print fish commands to deactivate the environment")


 def env_deactivate(args):
@@ -32,6 +32,9 @@ def setup_parser(subparser):
    shells.add_argument(
        '--csh', action='store_const', dest='shell', const='csh',
        help="print csh commands to load the package")
    shells.add_argument(
        '--fish', action='store_const', dest='shell', const='fish',
        help="print fish commands to load the package")

    subparser.add_argument(
        '--first',

@@ -39,13 +39,6 @@ def setup_parser(subparser):
    arguments.add_common_arguments(cd_group, ['clean', 'dirty'])


def spack_transitive_include_path():
    return ';'.join(
        os.path.join(dep, 'include')
        for dep in os.environ['SPACK_DEPENDENCIES'].split(os.pathsep)
    )


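The helper being removed above builds a ';'-separated list of dependency include directories for consumption by CMake. A minimal standalone sketch of the same idea, with the dependency prefixes passed explicitly instead of read from `SPACK_DEPENDENCIES` (that parameterization is an assumption for illustration):

```python
import os


def transitive_include_path(dependency_prefixes):
    # Join the 'include' subdirectory of each dependency prefix with ';',
    # the separator CMake uses for list-valued variables.
    return ';'.join(
        os.path.join(prefix, 'include') for prefix in dependency_prefixes
    )


print(transitive_include_path(['/opt/zlib', '/opt/bzip2']))
```

On a POSIX system this prints `/opt/zlib/include;/opt/bzip2/include`.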
def write_spconfig(package, dirty):
    # Set-up the environment
    spack.build_environment.setup_package(package, dirty)
@@ -57,8 +50,8 @@ def write_spconfig(package, dirty):
    paths = os.environ['PATH'].split(':')
    paths = [item for item in paths if 'spack/env' not in item]
    env['PATH'] = ':'.join(paths)
    env['SPACK_TRANSITIVE_INCLUDE_PATH'] = spack_transitive_include_path()
    env['CMAKE_PREFIX_PATH'] = os.environ['CMAKE_PREFIX_PATH']
    env['SPACK_INCLUDE_DIRS'] = os.environ['SPACK_INCLUDE_DIRS']
    env['CC'] = os.environ['SPACK_CC']
    env['CXX'] = os.environ['SPACK_CXX']
    env['FC'] = os.environ['SPACK_FC']
@@ -84,7 +77,7 @@ def cmdlist(str):
    if name.find('PATH') < 0:
        fout.write('env[%s] = %s\n' % (repr(name), repr(val)))
    else:
        if name == 'SPACK_TRANSITIVE_INCLUDE_PATH':
        if name == 'SPACK_INCLUDE_DIRS':
            sep = ';'
        else:
            sep = ':'

@@ -31,6 +31,9 @@ def setup_parser(subparser):
    shells.add_argument(
        '--csh', action='store_const', dest='shell', const='csh',
        help="print csh commands to activate the environment")
    shells.add_argument(
        '--fish', action='store_const', dest='shell', const='fish',
        help="print fish commands to load the package")

    subparser.add_argument('-a', '--all', action='store_true',
                           help='unload all loaded Spack packages.')

@@ -1,214 +0,0 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

# TODO: This will be merged into the buildcache command once
# everything is working.

import os
import re
import sys

try:
    import boto3
    import botocore
    have_boto3_support = True
except ImportError:
    have_boto3_support = False

import llnl.util.tty as tty

from spack.error import SpackError
import spack.tengine as template_engine
from spack.spec import Spec


import spack.binary_distribution as bindist


description = "temporary command to upload buildcaches to 's3.spack.io'"
section = "packaging"
level = "long"


def setup_parser(subparser):
    setup_parser.parser = subparser
    subparsers = subparser.add_subparsers(help='upload-s3 sub-commands')

    # sub-command to upload a built spec to s3
    spec = subparsers.add_parser('spec', help=upload_spec.__doc__)

    spec.add_argument('-s', '--spec', default=None,
                      help='Spec to upload')

    spec.add_argument('-y', '--spec-yaml', default=None,
                      help='Path to spec yaml file containing spec to upload')

    spec.add_argument('-b', '--base-dir', default=None,
                      help='Path to root of buildcaches')

    spec.add_argument('-e', '--endpoint-url',
                      default='https://s3.spack.io', help='URL of mirror')

    spec.set_defaults(func=upload_spec)

    # sub-command to update the index of a buildcache on s3
    index = subparsers.add_parser('index', help=update_index.__doc__)

    index.add_argument('-e', '--endpoint-url',
                       default='https://s3.spack.io', help='URL of mirror')

    index.set_defaults(func=update_index)


def get_s3_session(endpoint_url):
    if not have_boto3_support:
        raise SpackError('boto3 module not available')

    session = boto3.Session()
    s3 = session.resource('s3', endpoint_url=endpoint_url)

    bucket_names = []
    for bucket in s3.buckets.all():
        bucket_names.append(bucket.name)

    if len(bucket_names) > 1:
        raise SpackError('More than one bucket associated with credentials')

    bucket_name = bucket_names[0]

    return s3, bucket_name


def update_index(args):
    """Update the index of an s3 buildcache"""
    s3, bucket_name = get_s3_session(args.endpoint_url)

    bucket = s3.Bucket(bucket_name)
    exists = True

    try:
        s3.meta.client.head_bucket(Bucket=bucket_name)
    except botocore.exceptions.ClientError as e:
        # If a client error is thrown, then check that it was a 404 error.
        # If it was a 404 error, then the bucket does not exist.
        error_code = e.response['Error']['Code']
        if error_code == '404':
            exists = False

    if not exists:
        tty.error('S3 bucket "{0}" does not exist'.format(bucket_name))
        sys.exit(1)

    build_cache_dir = os.path.join(
        'mirror', bindist.build_cache_relative_path())

    spec_yaml_regex = re.compile('{0}/(.+\\.spec\\.yaml)$'.format(
        build_cache_dir))
    spack_regex = re.compile('{0}/([^/]+)/.+\\.spack$'.format(
        build_cache_dir))

    top_level_keys = set()

    for key in bucket.objects.all():
        m = spec_yaml_regex.search(key.key)
        if m:
            top_level_keys.add(m.group(1))
            print(m.group(1))
            continue

        m = spack_regex.search(key.key)
        if m:
            top_level_keys.add(m.group(1))
            print(m.group(1))
            continue

    index_data = {
        'top_level_keys': top_level_keys,
    }

    env = template_engine.make_environment()
    template_dir = 'misc'
    index_template = os.path.join(template_dir, 'buildcache_index.html')
    t = env.get_template(index_template)
    contents = t.render(index_data)

    index_key = os.path.join(build_cache_dir, 'index.html')

    tty.debug('Generated index:')
    tty.debug(contents)
    tty.debug('Pushing it to {0} -> {1}'.format(bucket_name, index_key))

    s3_obj = s3.Object(bucket_name, index_key)
    s3_obj.put(Body=contents, ACL='public-read')


def upload_spec(args):
    """Upload a spec to s3 bucket"""
    if not args.spec and not args.spec_yaml:
        tty.error('Cannot upload spec without spec arg or path to spec yaml')
        sys.exit(1)

    if not args.base_dir:
        tty.error('No base directory for buildcache specified')
        sys.exit(1)

    if args.spec:
        try:
            spec = Spec(args.spec)
            spec.concretize()
        except Exception as e:
            tty.debug(e)
            tty.error('Unable to concretize spec from string {0}'.format(
                args.spec))
            sys.exit(1)
    else:
        try:
            with open(args.spec_yaml, 'r') as fd:
                spec = Spec.from_yaml(fd.read())
        except Exception as e:
            tty.debug(e)
            tty.error('Unable to concretize spec from yaml {0}'.format(
                args.spec_yaml))
            sys.exit(1)

    s3, bucket_name = get_s3_session(args.endpoint_url)

    build_cache_dir = bindist.build_cache_relative_path()

    tarball_key = os.path.join(
        build_cache_dir, bindist.tarball_path_name(spec, '.spack'))
    tarball_path = os.path.join(args.base_dir, tarball_key)

    specfile_key = os.path.join(
        build_cache_dir, bindist.tarball_name(spec, '.spec.yaml'))
    specfile_path = os.path.join(args.base_dir, specfile_key)

    cdashidfile_key = os.path.join(
        build_cache_dir, bindist.tarball_name(spec, '.cdashid'))
    cdashidfile_path = os.path.join(args.base_dir, cdashidfile_key)

    tty.msg('Uploading {0}'.format(tarball_key))
    s3.meta.client.upload_file(
        tarball_path, bucket_name,
        os.path.join('mirror', tarball_key),
        ExtraArgs={'ACL': 'public-read'})

    tty.msg('Uploading {0}'.format(specfile_key))
    s3.meta.client.upload_file(
        specfile_path, bucket_name,
        os.path.join('mirror', specfile_key),
        ExtraArgs={'ACL': 'public-read'})

    if os.path.exists(cdashidfile_path):
        tty.msg('Uploading {0}'.format(cdashidfile_key))
        s3.meta.client.upload_file(
            cdashidfile_path, bucket_name,
            os.path.join('mirror', cdashidfile_key),
            ExtraArgs={'ACL': 'public-read'})


def upload_s3(parser, args):
    if args.func:
        args.func(args)
@@ -27,12 +27,6 @@
__all__ = ['Compiler']


def _verify_executables(*paths):
    for path in paths:
        if not os.path.isfile(path) and os.access(path, os.X_OK):
            raise CompilerAccessError(path)


@llnl.util.lang.memoized
def get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
    """Invokes the compiler at a given path passing a single
@@ -158,6 +152,10 @@ def _parse_non_system_link_dirs(string):
    """
    link_dirs = _parse_link_paths(string)

    # Remove directories that do not exist. Some versions of the Cray compiler
    # report nonexistent directories
    link_dirs = [d for d in link_dirs if os.path.isdir(d)]

    # Return set of directories containing needed compiler libs, minus
    # system paths. Note that 'filter_system_paths' only checks for an
    # exact match, while 'in_system_subdirectory' checks if a path contains
@@ -271,20 +269,16 @@ def __init__(self, cspec, operating_system, target,
        self.extra_rpaths = extra_rpaths
        self.enable_implicit_rpaths = enable_implicit_rpaths

        def check(exe):
            if exe is None:
                return None
            _verify_executables(exe)
            return exe

        self.cc = check(paths[0])
        self.cxx = check(paths[1])
        self.cc = paths[0]
        self.cxx = paths[1]
        self.f77 = None
        self.fc = None
        if len(paths) > 2:
            self.f77 = check(paths[2])
            self.f77 = paths[2]
            if len(paths) == 3:
                self.fc = self.f77
            else:
                self.fc = check(paths[3])
                self.fc = paths[3]

        self.environment = environment
        self.extra_rpaths = extra_rpaths or []
@@ -298,6 +292,31 @@ def check(exe):
            if value is not None:
                self.flags[flag] = tokenize_flags(value)

    def verify_executables(self):
        """Raise an error if any of the compiler executables is not valid.

        This method confirms that for all of the compilers (cc, cxx, f77, fc)
        that have paths, those paths exist and are executable by the current
        user.
        Raises a CompilerAccessError if any of the non-null paths for the
        compiler are not accessible.
        """
        def accessible_exe(exe):
            # compilers may contain executable names (on Cray or user edited)
            if not os.path.isabs(exe):
                exe = spack.util.executable.which_string(exe)
                if not exe:
                    return False
            return os.path.isfile(exe) and os.access(exe, os.X_OK)

        # setup environment before verifying in case we have executable names
        # instead of absolute paths
        with self._compiler_environment():
            missing = [cmp for cmp in (self.cc, self.cxx, self.f77, self.fc)
                       if cmp and not accessible_exe(cmp)]
            if missing:
                raise CompilerAccessError(self, missing)

    @property
    def version(self):
        return self.spec.version
@@ -575,10 +594,10 @@ def _compiler_environment(self):


class CompilerAccessError(spack.error.SpackError):

    def __init__(self, path):
        super(CompilerAccessError, self).__init__(
            "'%s' is not a valid compiler." % path)
    def __init__(self, compiler, paths):
        msg = "Compiler '%s' has executables that are missing" % compiler.spec
        msg += " or are not executable: %s" % paths
        super(CompilerAccessError, self).__init__(msg)


class InvalidCompilerError(spack.error.SpackError):

@@ -91,16 +91,24 @@ def c11_flag(self):

    @property
    def cc_pic_flag(self):
        if self.is_clang_based:
            return "-fPIC"
        return "-h PIC"

    @property
    def cxx_pic_flag(self):
        if self.is_clang_based:
            return "-fPIC"
        return "-h PIC"

    @property
    def f77_pic_flag(self):
        if self.is_clang_based:
            return "-fPIC"
        return "-h PIC"

    @property
    def fc_pic_flag(self):
        if self.is_clang_based:
            return "-fPIC"
        return "-h PIC"

@@ -11,7 +11,8 @@
                "0.14.0": "0.14.0",
                "0.14.1": "0.14.1",
                "0.14.2": "0.14.2",
                "0.14.3": "0.14.3"
                "0.15": "0.15",
                "0.15.0": "0.15.0"
            }
        },
        "ubuntu:16.04": {
@@ -26,7 +27,8 @@
                "0.14.0": "0.14.0",
                "0.14.1": "0.14.1",
                "0.14.2": "0.14.2",
                "0.14.3": "0.14.3"
                "0.15": "0.15",
                "0.15.0": "0.15.0"
            }
        },
        "centos:7": {
@@ -41,7 +43,8 @@
                "0.14.0": "0.14.0",
                "0.14.1": "0.14.1",
                "0.14.2": "0.14.2",
                "0.14.3": "0.14.3"
                "0.15": "0.15",
                "0.15.0": "0.15.0"
            }
        },
        "centos:6": {
@@ -56,7 +59,8 @@
                "0.14.0": "0.14.0",
                "0.14.1": "0.14.1",
                "0.14.2": "0.14.2",
                "0.14.3": "0.14.3"
                "0.15": "0.15",
                "0.15.0": "0.15.0"
            }
        }
    }
}

@@ -48,6 +48,12 @@
from spack.util.crypto import bit_length
from spack.version import Version


@contextlib.contextmanager
def nullcontext(*args, **kwargs):
    yield


# TODO: Provide an API automatically retrying a build after detecting and
# TODO: clearing a failure.

@@ -87,6 +93,17 @@
# Types of dependencies tracked by the database
_tracked_deps = ('link', 'run')

# Default list of fields written for each install record
default_install_record_fields = [
    'spec',
    'ref_count',
    'path',
    'installed',
    'explicit',
    'installation_time',
    'deprecated_for',
]


def _now():
    """Returns the time since the epoch"""
@@ -187,17 +204,17 @@ def install_type_matches(self, installed):
        else:
            return InstallStatuses.MISSING in installed

    def to_dict(self):
        rec_dict = {
            'spec': self.spec.to_node_dict(),
            'path': self.path,
            'installed': self.installed,
            'ref_count': self.ref_count,
            'explicit': self.explicit,
            'installation_time': self.installation_time,
        }
        if self.deprecated_for:
            rec_dict.update({'deprecated_for': self.deprecated_for})
    def to_dict(self, include_fields=default_install_record_fields):
        rec_dict = {}

        for field_name in include_fields:
            if field_name == 'spec':
                rec_dict.update({'spec': self.spec.to_node_dict()})
            elif field_name == 'deprecated_for' and self.deprecated_for:
                rec_dict.update({'deprecated_for': self.deprecated_for})
            else:
                rec_dict.update({field_name: getattr(self, field_name)})

        return rec_dict
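The new `include_fields` keyword above makes record serialization field-selectable, so a buildcache index writer can omit fields such as `ref_count`. A simplified sketch of the pattern (the `Record` class and its fields are stand-ins for illustration, not Spack's `InstallRecord`, and the optional-field handling is reduced to `deprecated_for`):

```python
default_fields = ['spec', 'ref_count', 'path', 'installed', 'deprecated_for']


class Record:
    def __init__(self, spec, path, installed, ref_count, deprecated_for=None):
        self.spec = spec
        self.path = path
        self.installed = installed
        self.ref_count = ref_count
        self.deprecated_for = deprecated_for

    def to_dict(self, include_fields=default_fields):
        # Emit only the requested fields; 'deprecated_for' stays optional
        # and is skipped entirely when it is unset.
        rec = {}
        for name in include_fields:
            if name == 'deprecated_for' and not self.deprecated_for:
                continue
            rec[name] = getattr(self, name)
        return rec


r = Record('zlib@1.2.11', '/opt/zlib', True, 2)
print(r.to_dict(include_fields=['spec', 'installed']))
# → {'spec': 'zlib@1.2.11', 'installed': True}
```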

    @classmethod
@@ -206,9 +223,12 @@ def from_dict(cls, spec, dictionary):
        d.pop('spec', None)

        # Old databases may have "None" for path for externals
        if d['path'] == 'None':
        if 'path' not in d or d['path'] == 'None':
            d['path'] = None

        if 'installed' not in d:
            d['installed'] = False

        return InstallRecord(spec, **d)


@@ -275,7 +295,8 @@ class Database(object):
    _prefix_failures = {}

    def __init__(self, root, db_dir=None, upstream_dbs=None,
                 is_upstream=False):
                 is_upstream=False, enable_transaction_locking=True,
                 record_fields=default_install_record_fields):
        """Create a Database for Spack installations under ``root``.

        A Database is a cache of Specs data from ``$prefix/spec.yaml``
@@ -293,6 +314,12 @@ def __init__(self, root, db_dir=None, upstream_dbs=None,
        Caller may optionally provide a custom ``db_dir`` parameter
        where data will be stored. This is intended to be used for
        testing the Database class.

        This class supports writing buildcache index files, in which case
        certain fields are not needed in each install record, and no
        transaction locking is required. To use this feature, provide
        ``enable_transaction_locking=False``, and specify a list of needed
        fields in ``record_fields``.
        """
        self.root = root

@@ -375,14 +402,23 @@ def __init__(self, root, db_dir=None, upstream_dbs=None,
        # message)
        self._fail_when_missing_deps = False

        if enable_transaction_locking:
            self._write_transaction_impl = lk.WriteTransaction
            self._read_transaction_impl = lk.ReadTransaction
        else:
            self._write_transaction_impl = nullcontext
            self._read_transaction_impl = nullcontext

        self._record_fields = record_fields

    def write_transaction(self):
        """Get a write lock context manager for use in a `with` block."""
        return lk.WriteTransaction(
        return self._write_transaction_impl(
            self.lock, acquire=self._read, release=self._write)

    def read_transaction(self):
        """Get a read lock context manager for use in a `with` block."""
        return lk.ReadTransaction(self.lock, acquire=self._read)
        return self._read_transaction_impl(self.lock, acquire=self._read)
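Swapping the transaction class for `nullcontext` is what lets `enable_transaction_locking=False` skip locking entirely while callers keep using the same `with db.read_transaction():` idiom. A toy sketch of that substitution (the `DB` class and `fake_lock` are illustrative stand-ins, not Spack's API):

```python
import contextlib


@contextlib.contextmanager
def nullcontext(*args, **kwargs):
    # No-op stand-in for a lock transaction when locking is disabled.
    yield


class DB:
    def __init__(self, enable_locking=True, lock_cls=None):
        # Real code would store lk.ReadTransaction here; we allow injecting
        # a fake so the behavior is observable.
        self._read_impl = (lock_cls or nullcontext) if enable_locking \
            else nullcontext

    def read_transaction(self):
        return self._read_impl()


events = []


@contextlib.contextmanager
def fake_lock():
    events.append('locked')
    yield
    events.append('unlocked')


with DB(enable_locking=False).read_transaction():
    pass  # nullcontext: no lock taken, events unchanged
with DB(enable_locking=True, lock_cls=fake_lock).read_transaction():
    pass  # the (fake) lock is acquired and released
print(events)
# → ['locked', 'unlocked']
```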

    def _failed_spec_path(self, spec):
        """Return the path to the spec's failure file, which may not exist."""
@@ -592,7 +628,8 @@ def _write_to_file(self, stream):
        This function does not do any locking or transactions.
        """
        # map from per-spec hash code to installation record.
        installs = dict((k, v.to_dict()) for k, v in self._data.items())
        installs = dict((k, v.to_dict(include_fields=self._record_fields))
                        for k, v in self._data.items())

        # database includes installation list and version.

@@ -726,7 +763,8 @@ def check(cond, msg):

        self.reindex(spack.store.layout)
        installs = dict(
            (k, v.to_dict()) for k, v in self._data.items()
            (k, v.to_dict(include_fields=self._record_fields))
            for k, v in self._data.items()
        )

        def invalid_record(hash_key, error):

@@ -115,7 +115,7 @@ def activate(
        use_env_repo (bool): use the packages exactly as they appear in the
            environment's repository
        add_view (bool): generate commands to add view to path variables
        shell (string): One of `sh`, `csh`.
        shell (string): One of `sh`, `csh`, `fish`.
        prompt (string): string to add to the users prompt, or None

    Returns:
@@ -141,6 +141,19 @@ def activate(
            cmds += 'if (! $?SPACK_OLD_PROMPT ) '
            cmds += 'setenv SPACK_OLD_PROMPT "${prompt}";\n'
            cmds += 'set prompt="%s ${prompt}";\n' % prompt
    elif shell == 'fish':
        if os.getenv('TERM') and 'color' in os.getenv('TERM') and prompt:
            prompt = colorize('@G{%s} ' % prompt, color=True)

        cmds += 'set -gx SPACK_ENV %s;\n' % env.path
        cmds += 'function despacktivate;\n'
        cmds += ' spack env deactivate;\n'
        cmds += 'end;\n'
        #
        # NOTE: We're not changing the fish_prompt function (which is fish's
        # solution to the PS1 variable) here. This is a bit fiddly, and easy to
        # screw up => spend time researching a solution. Feedback welcome.
        #
    else:
        if os.getenv('TERM') and 'color' in os.getenv('TERM') and prompt:
            prompt = colorize('@G{%s} ' % prompt, color=True)
@@ -156,6 +169,12 @@ def activate(
            cmds += 'fi;\n'
            cmds += 'export PS1="%s ${PS1}";\n' % prompt

    #
    # NOTE in the fish-shell: Path variables are a special kind of variable
    # used to support colon-delimited path lists including PATH, CDPATH,
    # MANPATH, PYTHONPATH, etc. All variables that end in PATH (case-sensitive)
    # become PATH variables.
    #
    if add_view and default_view_name in env.views:
        with spack.store.db.read_transaction():
            cmds += env.add_default_view_to_shell(shell)
@@ -167,7 +186,7 @@ def deactivate(shell='sh'):
    """Undo any configuration or repo settings modified by ``activate()``.

    Arguments:
        shell (string): One of `sh`, `csh`. Shell style to use.
        shell (string): One of `sh`, `csh`, `fish`. Shell style to use.

    Returns:
        (string): shell commands for `shell` to undo environment variables
@@ -191,6 +210,12 @@ def deactivate(shell='sh'):
        cmds += 'set prompt="$SPACK_OLD_PROMPT" && '
        cmds += 'unsetenv SPACK_OLD_PROMPT;\n'
        cmds += 'unalias despacktivate;\n'
    elif shell == 'fish':
        cmds += 'set -e SPACK_ENV;\n'
        cmds += 'functions -e despacktivate;\n'
        #
        # NOTE: Not changing fish_prompt (above) => no need to restore it here.
        #
    else:
        cmds += 'if [ ! -z ${SPACK_ENV+x} ]; then\n'
        cmds += 'unset SPACK_ENV; export SPACK_ENV;\n'
@@ -247,18 +272,13 @@ def find_environment(args):
    # at env_dir (env and env_dir are mutually exclusive)
    env = getattr(args, 'env_dir', None)

    # if no argument, look for a manifest file
    # if no argument, look for the environment variable
    if not env:
        if os.path.exists(manifest_name):
            env = os.getcwd()
        env = os.environ.get(spack_env_var)

    # if no env, env_dir, or manifest try the environment
    # nothing was set; there's no active environment
    if not env:
        env = os.environ.get(spack_env_var)

    # nothing was set; there's no active environment
    if not env:
        return None
    return None

    # if we get here, env isn't the name of a spack environment; it has
    # to be a path to an environment, or there is something wrong.

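The rewritten lookup above drops the cwd manifest check: an environment now comes either from the command line or from the `SPACK_ENV` variable. A minimal sketch of that precedence (the function signature and the injected `environ` dict are illustrative, not Spack's actual interface):

```python
import os


def find_environment(cli_env, env_dir=None, environ=None):
    # Precedence: an explicit name or directory from the command line wins,
    # otherwise fall back to SPACK_ENV; None means no active environment.
    environ = environ if environ is not None else os.environ
    env = cli_env or env_dir
    if not env:
        env = environ.get('SPACK_ENV')
    return env or None


print(find_environment(None, environ={'SPACK_ENV': '/path/to/myenv'}))
# → /path/to/myenv
```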
@@ -146,7 +146,8 @@ def detect_version(self, detect_version_args):
            compiler_cls.PrgEnv_compiler
        )
        matches = re.findall(version_regex, output)
        version = tuple(version for _, version in matches)
        version = tuple(version for _, version in matches
                        if 'classic' not in version)
        compiler_id = detect_version_args.id
        value = detect_version_args._replace(
            id=compiler_id._replace(version=version)

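The added filter above drops Cray "classic" compiler modules from the detected versions. The same regex and comprehension, run against made-up sample `module avail` output:

```python
import re

# Hypothetical 'module avail' output for the cce programming environment.
output = "PrgEnv-cce cce/9.0.2 cce/10.0.1 cce/10.0.1-classic"

version_regex = r'({0})/([\d\.]+[\d]-?[\w]*)'.format('cce')
matches = re.findall(version_regex, output)
# Keep only non-"classic" versions, as in the change above.
versions = tuple(version for _, version in matches
                 if 'classic' not in version)
print(versions)
# → ('9.0.2', '10.0.1')
```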
@@ -5,8 +5,11 @@

import contextlib
import os
import re

import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty

from spack.operating_systems.linux_distro import LinuxDistro
from spack.util.environment import get_path
@@ -60,6 +63,43 @@ def compiler_search_paths(self):
        This prevents from detecting Cray compiler wrappers and avoids
        possible false detections.
        """
        import spack.compilers

        with unload_programming_environment():
            search_paths = fs.search_paths_for_executables(*get_path('PATH'))
        return search_paths
        search_paths = get_path('PATH')

        extract_path_re = re.compile(r'prepend-path[\s]*PATH[\s]*([/\w\.:-]*)')

        for compiler_cls in spack.compilers.all_compiler_types():
            # Check if the compiler class is supported on Cray
            prg_env = getattr(compiler_cls, 'PrgEnv', None)
            compiler_module = getattr(compiler_cls, 'PrgEnv_compiler', None)
            if not (prg_env and compiler_module):
                continue

            # It is supported, check which versions are available
            output = module('avail', compiler_cls.PrgEnv_compiler)
            version_regex = r'({0})/([\d\.]+[\d]-?[\w]*)'.format(
                compiler_cls.PrgEnv_compiler
            )
            matches = re.findall(version_regex, output)
            versions = tuple(version for _, version in matches
                             if 'classic' not in version)

            # Now inspect the modules and add to paths
            msg = "[CRAY FE] Detected FE compiler [name={0}, versions={1}]"
            tty.debug(msg.format(compiler_module, versions))
            for v in versions:
                try:
                    current_module = compiler_module + '/' + v
                    out = module('show', current_module)
                    match = extract_path_re.search(out)
                    search_paths += match.group(1).split(':')
                except Exception as e:
                    msg = ("[CRAY FE] An unexpected error occurred while "
                           "detecting FE compiler [compiler={0}, "
                           " version={1}, error={2}]")
                    tty.debug(msg.format(compiler_cls.name, v, str(e)))

        search_paths = list(llnl.util.lang.dedupe(search_paths))
        return fs.search_paths_for_executables(*search_paths)
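`extract_path_re` above pulls the `prepend-path PATH ...` line out of `module show` output so the front-end compiler's bin directory can be added to the search path. The same pattern, run against made-up module text:

```python
import re

# Hypothetical 'module show' output for a front-end compiler module.
module_show_output = """
conflict gcc
prepend-path PATH /opt/gcc/8.3.0/bin
prepend-path MANPATH /opt/gcc/8.3.0/share/man
"""

extract_path_re = re.compile(r'prepend-path[\s]*PATH[\s]*([/\w\.:-]*)')
# 'MANPATH' does not match the literal 'PATH' token, so only the PATH
# line's directory list is captured.
match = extract_path_re.search(module_show_output)
print(match.group(1))
# → /opt/gcc/8.3.0/bin
```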
@@ -54,6 +54,7 @@ def __init__(self):
            '10.13': 'highsierra',
            '10.14': 'mojave',
            '10.15': 'catalina',
            '11.0': 'bigsur',
        }

        mac_ver = str(macos_version().up_to(2))

42 lib/spack/spack/schema/buildcache_spec.py (Normal file)
@@ -0,0 +1,42 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

"""Schema for a buildcache spec.yaml file

.. literalinclude:: _spack_root/lib/spack/spack/schema/buildcache_spec.py
   :lines: 14-
"""
import spack.schema.spec


schema = {
    '$schema': 'http://json-schema.org/schema#',
    'title': 'Spack buildcache spec.yaml schema',
    'type': 'object',
    # 'additionalProperties': True,
    'properties': {
        'buildinfo': {
            'type': 'object',
            'additionalProperties': False,
            'required': ['relative_prefix'],
            'properties': {
                'relative_prefix': {'type': 'string'},
                'relative_rpaths': {'type': 'boolean'},
            },
        },
        'full_hash': {'type': 'string'},
        'spec': {
            'type': 'array',
            'items': spack.schema.spec.properties,
        },
        'binary_cache_checksum': {
            'type': 'object',
            'properties': {
                'hash_algorithm': {'type': 'string'},
                'hash': {'type': 'string'},
            },
        },
    },
}
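In production the schema above would be enforced with a JSON-Schema validator; as a dependency-free illustration, here is a hand-rolled check of the same required `relative_prefix` key and checksum field types (the sample document is invented):

```python
def validate_spec_yaml(doc):
    # Mirror the schema's hard requirement: 'buildinfo' must be a mapping
    # containing 'relative_prefix'.
    info = doc.get('buildinfo')
    if not isinstance(info, dict) or 'relative_prefix' not in info:
        return False
    # Checksum fields, when present, must be strings.
    checksum = doc.get('binary_cache_checksum', {})
    return all(isinstance(checksum.get(k, ''), str)
               for k in ('hash_algorithm', 'hash'))


spec_yaml = {
    'buildinfo': {'relative_prefix': 'linux-x86_64/gcc-9.3.0/zlib-1.2.11'},
    'full_hash': 'abcdef',
    'binary_cache_checksum': {'hash_algorithm': 'sha256', 'hash': '0' * 64},
}
print(validate_spec_yaml(spec_yaml))
# → True
```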
@@ -31,7 +31,8 @@
                'type': 'string',
                'enum': [
                    'develop',
                    '0.14', '0.14.0', '0.14.1', '0.14.2', '0.14.3'
                    '0.14', '0.14.0', '0.14.1', '0.14.2',
                    '0.15', '0.15.0',
                ]
            }
        },

58 lib/spack/spack/schema/database_index.py (Normal file)
@@ -0,0 +1,58 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

"""Schema for database index.json file

.. literalinclude:: _spack_root/lib/spack/spack/schema/database_index.py
   :lines: 36-
"""
import spack.schema.spec

# spack.schema.spec.properties

#: Full schema with metadata
schema = {
    '$schema': 'http://json-schema.org/schema#',
    'title': 'Spack spec schema',
    'type': 'object',
    'required': ['database'],
    'additionalProperties': False,
    'properties': {
        'database': {
            'type': 'object',
            'required': ['installs', 'version'],
            'additionalProperties': False,
            'properties': {
                'installs': {
                    'type': 'object',
                    'patternProperties': {
                        r'^[\w\d]{32}$': {
                            'type': 'object',
                            'properties': {
                                'spec': spack.schema.spec.properties,
                                'path': {
                                    'oneOf': [
                                        {'type': 'string'},
                                        {'type': 'null'},
                                    ],
                                },
                                'installed': {'type': 'boolean'},
                                'ref_count': {
                                    'type': 'integer',
                                    'minimum': 0,
                                },
                                'explicit': {'type': 'boolean'},
                                'installation_time': {
                                    'type': 'number',
                                }
                            },
                        },
                    },
                },
                'version': {'type': 'string'},
            }
        },
    },
}
159
lib/spack/spack/schema/spec.py
Normal file
159
lib/spack/spack/schema/spec.py
Normal file
@@ -0,0 +1,159 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

"""Schema for a spec found in spec.yaml or database index.json files

.. literalinclude:: _spack_root/lib/spack/spack/schema/spec.py
   :lines: 13-
"""


target = {
    'oneOf': [
        {
            'type': 'string',
        }, {
            'type': 'object',
            'additionalProperties': False,
            'required': [
                'name',
                'vendor',
                'features',
                'generation',
                'parents',
            ],
            'properties': {
                'name': {'type': 'string'},
                'vendor': {'type': 'string'},
                'features': {
                    'type': 'array',
                    'items': {'type': 'string'},
                },
                'generation': {'type': 'integer'},
                'parents': {
                    'type': 'array',
                    'items': {'type': 'string'},
                },
            },
        },
    ],
}


arch = {
    'type': 'object',
    'additionalProperties': False,
    'properties': {
        'platform': {},
        'platform_os': {},
        'target': target,
    },
}


dependencies = {
    'type': 'object',
    'patternProperties': {
        r'\w[\w-]*': {  # package name
            'type': 'object',
            'properties': {
                'hash': {'type': 'string'},
                'type': {
                    'type': 'array',
                    'items': {'type': 'string'},
                },
            },
        },
    },
}


#: Properties for inclusion in other schemas
properties = {
    r'\w[\w-]*': {  # package name
        'type': 'object',
        'additionalProperties': False,
        'required': [
            'version',
            'arch',
            'compiler',
            'namespace',
            'parameters',
        ],
        'properties': {
            'hash': {'type': 'string'},
            'version': {
                'oneOf': [
                    {'type': 'string'},
                    {'type': 'number'},
                ],
            },
            'arch': arch,
            'compiler': {
                'type': 'object',
                'additionalProperties': False,
                'properties': {
                    'name': {'type': 'string'},
                    'version': {'type': 'string'},
                },
            },
            'namespace': {'type': 'string'},
            'parameters': {
                'type': 'object',
                'required': [
                    'cflags',
                    'cppflags',
                    'cxxflags',
                    'fflags',
                    'ldflags',
                    'ldlibs',
                ],
                'additionalProperties': True,
                'properties': {
                    'patches': {
                        'type': 'array',
                        'items': {'type': 'string'},
                    },
                    'cflags': {
                        'type': 'array',
                        'items': {'type': 'string'},
                    },
                    'cppflags': {
                        'type': 'array',
                        'items': {'type': 'string'},
                    },
                    'cxxflags': {
                        'type': 'array',
                        'items': {'type': 'string'},
                    },
                    'fflags': {
                        'type': 'array',
                        'items': {'type': 'string'},
                    },
                    'ldflags': {
                        'type': 'array',
                        'items': {'type': 'string'},
                    },
                    'ldlibs': {
                        'type': 'array',
                        'items': {'type': 'string'},
                    },
                },
            },
            'patches': {
                'type': 'array',
                'items': {},
            },
            'dependencies': dependencies,
        },
    },
}


#: Full schema with metadata
schema = {
    '$schema': 'http://json-schema.org/schema#',
    'title': 'Spack spec schema',
    'type': 'object',
    'additionalProperties': False,
    'patternProperties': properties,
}
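The object form of `target` above mirrors microarchitecture records (name, vendor, feature flags, parents). Even without `jsonschema`, a reader can sanity-check such a mapping against the schema's `required` list. A minimal stdlib-only sketch; the example target data below is hypothetical, not taken from Spack:

```python
def check_required(obj, required):
    """Return the required keys missing from a mapping, in schema order."""
    return [key for key in required if key not in obj]


# The 'required' list from the object branch of the target schema above.
required = ['name', 'vendor', 'features', 'generation', 'parents']

# Hypothetical microarchitecture record in the shape the schema expects.
target = {
    'name': 'haswell',
    'vendor': 'GenuineIntel',
    'features': ['sse2', 'avx2'],
    'generation': 0,
    'parents': ['ivybridge'],
}

print(check_required(target, required))           # → []
print(check_required({'name': 'x86_64'}, required))
```

Real validation in Spack goes through `jsonschema.validate` against the full `schema` object, which also enforces types and `additionalProperties`.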
@@ -188,8 +188,6 @@ def test_ci_workarounds():
        'SPACK_IS_PR_PIPELINE': 'False',
    }

    common_script = ['spack ci rebuild']

    common_before_script = [
        'git clone "https://github.com/spack/spack"',
        ' && '.join((
@@ -219,14 +217,14 @@ def make_build_job(name, deps, stage, use_artifact_buildcache, optimize,
            },
            'retry': {'max': 2, 'when': ['always']},
            'after_script': ['rm -rf "./spack"'],
            'image': {'name': 'spack/centos7', 'entrypoint': ['']},
            'script': ['spack ci rebuild'],
            'image': {'name': 'spack/centos7', 'entrypoint': ['']}
        }

        if optimize:
            result['extends'] = ['.c0', '.c1', '.c2']
            result['extends'] = ['.c0', '.c1']
        else:
            variables['SPACK_ROOT_SPEC'] = fake_root_spec
            result['script'] = common_script
            result['before_script'] = common_before_script

        result['variables'] = variables
@@ -254,7 +252,7 @@ def make_rebuild_index_job(
        }

        if optimize:
            result['extends'] = '.c1'
            result['extends'] = '.c0'
        else:
            result['before_script'] = common_before_script

@@ -262,11 +260,16 @@ def make_rebuild_index_job(

    def make_factored_jobs(optimize):
        return {
            '.c0': {'script': common_script},
            '.c1': {'before_script': common_before_script},
            '.c2': {'variables': {'SPACK_ROOT_SPEC': fake_root_spec}}
            '.c0': {'before_script': common_before_script},
            '.c1': {'variables': {'SPACK_ROOT_SPEC': fake_root_spec}}
        } if optimize else {}

    def make_stage_list(num_build_stages):
        return {
            'stages': (
                ['-'.join(('stage', str(i))) for i in range(num_build_stages)]
                + ['stage-rebuild-index'])}

    def make_yaml_obj(use_artifact_buildcache, optimize, use_dependencies):
        result = {}

@@ -287,22 +290,10 @@ def make_yaml_obj(use_artifact_buildcache, optimize, use_dependencies):

        result.update(make_factored_jobs(optimize))

        result.update(make_stage_list(3))

        return result

    def sort_yaml_obj(obj):
        if isinstance(obj, collections_abc.Mapping):
            result = syaml.syaml_dict()
            for k in sorted(obj.keys(), key=str):
                result[k] = sort_yaml_obj(obj[k])
            return result

        if (isinstance(obj, collections_abc.Sequence) and
                not isinstance(obj, str)):
            return syaml.syaml_list(sorted(
                (sort_yaml_obj(x) for x in obj), key=str))

        return obj

    # test every combination of:
    # use artifact buildcache: true or false
    # run optimization pass: true or false
@@ -331,8 +322,8 @@ def sort_yaml_obj(obj):
    actual = cinw.needs_to_dependencies(actual)

    predicted = syaml.dump_config(
        sort_yaml_obj(predicted), default_flow_style=True)
        ci_opt.sort_yaml_obj(predicted), default_flow_style=True)
    actual = syaml.dump_config(
        sort_yaml_obj(actual), default_flow_style=True)
        ci_opt.sort_yaml_obj(actual), default_flow_style=True)

    assert(predicted == actual)
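The `sort_yaml_obj` helper above (moved into `ci_opt` by this change) exists so two YAML documents can be dumped and compared as strings regardless of key or element ordering. A stdlib-only sketch of the same idea, using plain `dict`/`list` instead of the `syaml` container types:

```python
def sort_obj(obj):
    """Recursively canonicalize nested mappings and sequences."""
    if isinstance(obj, dict):
        # dicts preserve insertion order, so rebuilding in sorted-key
        # order gives a canonical repr.
        return {k: sort_obj(obj[k]) for k in sorted(obj, key=str)}
    if isinstance(obj, (list, tuple)):
        return sorted((sort_obj(x) for x in obj), key=str)
    return obj


a = {'b': [3, 1, 2], 'a': {'y': 1, 'x': 2}}
b = {'a': {'x': 2, 'y': 1}, 'b': [2, 3, 1]}

print(repr(a) == repr(b))                      # → False
print(repr(sort_obj(a)) == repr(sort_obj(b)))  # → True
```

The test uses the same trick: it canonicalizes both the predicted and actual CI pipelines before dumping them with `syaml.dump_config`, so the string comparison is order-insensitive.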
@@ -24,7 +24,7 @@
def mock_get_specs(database, monkeypatch):
    specs = database.query_local()
    monkeypatch.setattr(
        spack.binary_distribution, 'get_specs', lambda x, y: specs
        spack.binary_distribution, 'get_specs', lambda x: specs
    )
@@ -4,8 +4,10 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import filecmp
import json
import os
import pytest
from jsonschema import validate

import spack
import spack.ci as ci
@@ -15,6 +17,8 @@
from spack.main import SpackCommand
import spack.paths as spack_paths
import spack.repo as repo
from spack.schema.buildcache_spec import schema as spec_yaml_schema
from spack.schema.database_index import schema as db_idx_schema
from spack.spec import Spec
from spack.util.mock_package import MockPackageMultiRepo
import spack.util.executable as exe
@@ -717,10 +721,28 @@ def test_push_mirror_contents(tmpdir, mutable_mock_env_path, env_deactivate,
    ci.push_mirror_contents(
        env, concrete_spec, yaml_path, mirror_url, '42')

    buildcache_list_output = buildcache_cmd('list', output=str)
    buildcache_path = os.path.join(mirror_dir.strpath, 'build_cache')

    # Test generating buildcache index while we have bin mirror
    buildcache_cmd('update-index', '--mirror-url', mirror_url)
    index_path = os.path.join(buildcache_path, 'index.json')
    with open(index_path) as idx_fd:
        index_object = json.load(idx_fd)
        validate(index_object, db_idx_schema)

    # Now that index is regenerated, validate "buildcache list" output
    buildcache_list_output = buildcache_cmd('list', output=str)
    assert('patchelf' in buildcache_list_output)

    # Also test buildcache_spec schema
    bc_files_list = os.listdir(buildcache_path)
    for file_name in bc_files_list:
        if file_name.endswith('.spec.yaml'):
            spec_yaml_path = os.path.join(buildcache_path, file_name)
            with open(spec_yaml_path) as yaml_fd:
                yaml_object = syaml.load(yaml_fd)
                validate(yaml_object, spec_yaml_schema)

    logs_dir = working_dir.join('logs_dir')
    if not os.path.exists(logs_dir.strpath):
        os.makedirs(logs_dir.strpath)
@@ -159,7 +159,13 @@ def __init__(self):
                 default_compiler_entry['paths']['f77']],
            environment={})

    _get_compiler_link_paths = Compiler._get_compiler_link_paths
    def _get_compiler_link_paths(self, paths):
        # Mock os.path.isdir so the link paths don't have to exist
        old_isdir = os.path.isdir
        os.path.isdir = lambda x: True
        ret = super(MockCompiler, self)._get_compiler_link_paths(paths)
        os.path.isdir = old_isdir
        return ret

    @property
    def name(self):
@@ -222,6 +228,7 @@ def call_compiler(exe, *args, **kwargs):
    ('f77', 'fflags'),
    ('f77', 'cppflags'),
])
@pytest.mark.enable_compiler_link_paths
def test_get_compiler_link_paths(monkeypatch, exe, flagname):
    # create fake compiler that emits mock verbose output
    compiler = MockCompiler()
@@ -261,6 +268,7 @@ def test_get_compiler_link_paths_no_verbose_flag():
    assert dirs == []


@pytest.mark.enable_compiler_link_paths
def test_get_compiler_link_paths_load_env(working_env, monkeypatch, tmpdir):
    gcc = str(tmpdir.join('gcc'))
    with open(gcc, 'w') as f:
@@ -377,6 +385,10 @@ def test_cce_flags():
    supported_flag_test("cxx_pic_flag", "-h PIC", "cce@1.0")
    supported_flag_test("f77_pic_flag", "-h PIC", "cce@1.0")
    supported_flag_test("fc_pic_flag", "-h PIC", "cce@1.0")
    supported_flag_test("cc_pic_flag", "-fPIC", "cce@9.1.0")
    supported_flag_test("cxx_pic_flag", "-fPIC", "cce@9.1.0")
    supported_flag_test("f77_pic_flag", "-fPIC", "cce@9.1.0")
    supported_flag_test("fc_pic_flag", "-fPIC", "cce@9.1.0")
    supported_flag_test("debug_flags", ['-g', '-G0', '-G1', '-G2', '-Gfast'],
                        'cce@1.0')

@@ -823,3 +835,33 @@ class MockPackage(object):
    pkg = MockPackage()
    with pytest.raises(OSError):
        compiler.setup_custom_environment(pkg, env)


@pytest.mark.enable_compiler_verification
def test_compiler_executable_verification_raises(tmpdir):
    compiler = MockCompiler()
    compiler.cc = '/this/path/does/not/exist'

    with pytest.raises(spack.compiler.CompilerAccessError):
        compiler.verify_executables()


@pytest.mark.enable_compiler_verification
def test_compiler_executable_verification_success(tmpdir):
    def prepare_executable(name):
        real = str(tmpdir.join('cc').ensure())
        fs.set_executable(real)
        setattr(compiler, name, real)

    # setup mock compiler with real paths
    compiler = MockCompiler()
    for name in ('cc', 'cxx', 'f77', 'fc'):
        prepare_executable(name)

    # testing that this doesn't raise an error because the paths exist and
    # are executable
    compiler.verify_executables()

    # Test that null entries don't fail
    compiler.cc = None
    compiler.verify_executables()
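The `verify_executables` behavior exercised by these tests checks, before a build, that each configured compiler path exists and is executable, and raises `CompilerAccessError` otherwise; `None` entries are tolerated. A stdlib sketch of that check — the helper name and list-returning convention here are illustrative, not Spack's API:

```python
import os


def find_missing_executables(*paths):
    """Return the given paths that are missing or not executable.

    None entries are skipped, mirroring the "null entries don't fail"
    behavior in the tests above.
    """
    return [
        p for p in paths
        if p is not None and not (os.path.isfile(p) and os.access(p, os.X_OK))
    ]


print(find_missing_executables(None, '/this/path/does/not/exist'))
```

A caller would raise an error when the returned list is non-empty, which is essentially what the first test asserts via `pytest.raises`.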
@@ -4,6 +4,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Test detection of compiler version"""
import pytest
import os

import llnl.util.filesystem as fs

import spack.compilers.arm
import spack.compilers.cce
@@ -16,6 +19,9 @@
import spack.compilers.xl
import spack.compilers.xl_r

from spack.operating_systems.cray_frontend import CrayFrontend
import spack.util.module_cmd


@pytest.mark.parametrize('version_str,expected_version', [
    ('Arm C/C++/Fortran Compiler version 19.0 (build number 73) (based on LLVM 7.0.2)\n'  # NOQA
@@ -189,3 +195,41 @@ def test_xl_version_detection(version_str, expected_version):

    version = spack.compilers.xl_r.XlR.extract_version_from_output(version_str)
    assert version == expected_version


@pytest.mark.parametrize('compiler,version', [
    ('gcc', '8.1.0'),
    ('gcc', '1.0.0-foo'),
    ('pgi', '19.1'),
    ('pgi', '19.1a'),
    ('intel', '9.0.0'),
    ('intel', '0.0.0-foobar')
])
def test_cray_frontend_compiler_detection(
        compiler, version, tmpdir, monkeypatch, working_env
):
    """Test that the Cray frontend properly finds compilers from modules"""
    # setup the fake compiler directory
    compiler_dir = tmpdir.join(compiler)
    compiler_exe = compiler_dir.join('cc').ensure()
    fs.set_executable(str(compiler_exe))

    # mock modules
    def _module(cmd, *args):
        module_name = '%s/%s' % (compiler, version)
        module_contents = 'prepend-path PATH %s' % compiler_dir
        if cmd == 'avail':
            return module_name if compiler in args[0] else ''
        if cmd == 'show':
            return module_contents if module_name in args else ''
    monkeypatch.setattr(spack.operating_systems.cray_frontend, 'module',
                        _module)

    # remove PATH variable
    os.environ.pop('PATH', None)

    # get a CrayFrontend object
    cray_fe_os = CrayFrontend()

    paths = cray_fe_os.compiler_search_paths
    assert paths == [str(compiler_dir)]
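The mock above works because `CrayFrontend` discovers compiler directories by scanning `module show` output for `prepend-path PATH` lines. A stdlib sketch of that parsing step; the function name and sample modulefile text are invented for illustration:

```python
def paths_from_module_show(text):
    """Extract directories that a modulefile prepends to PATH."""
    paths = []
    for line in text.splitlines():
        parts = line.split()
        # Tcl modulefile syntax: prepend-path <variable> <value>
        if len(parts) == 3 and parts[0] == 'prepend-path' and parts[1] == 'PATH':
            paths.append(parts[2])
    return paths


sample = """setenv CC gcc
prepend-path PATH /opt/gcc/8.1.0/bin
prepend-path MANPATH /opt/gcc/8.1.0/man"""

print(paths_from_module_show(sample))  # → ['/opt/gcc/8.1.0/bin']
```

This is why the test only needs `_module('show', ...)` to return a single `prepend-path PATH` line for `compiler_search_paths` to yield the fake compiler directory.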
@@ -70,6 +70,25 @@ def clean_user_environment():
        ev.activate(active)


#
# Disable checks on compiler executable existence
#
@pytest.fixture(scope='function', autouse=True)
def mock_compiler_executable_verification(request, monkeypatch):
    """Mock the compiler executable verification to allow missing executables.

    This fixture can be disabled for tests of the compiler verification
    functionality by::

        @pytest.mark.enable_compiler_verification

    If a test is marked in that way this is a no-op."""
    if 'enable_compiler_verification' not in request.keywords:
        monkeypatch.setattr(spack.compiler.Compiler,
                            'verify_executables',
                            lambda x: None)


# Hooks to add command line options or set other custom behaviors.
# They must be placed here to be found by pytest. See:
#
@@ -600,18 +619,26 @@ def dirs_with_libfiles(tmpdir_factory):


@pytest.fixture(scope='function', autouse=True)
def disable_compiler_execution(monkeypatch):
    def noop(*args):
        return []
def disable_compiler_execution(monkeypatch, request):
    """
    This fixture can be disabled for tests of the compiler link path
    functionality by::

    # Compiler.determine_implicit_rpaths actually runs the compiler. So this
    # replaces that function with a noop that simulates finding no implicit
    # RPATHs
    monkeypatch.setattr(
        spack.compiler.Compiler,
        '_get_compiler_link_paths',
        noop
    )
        @pytest.mark.enable_compiler_link_paths

    If a test is marked in that way this is a no-op."""
    if 'enable_compiler_link_paths' not in request.keywords:
        def noop(*args):
            return []

        # Compiler.determine_implicit_rpaths actually runs the compiler. So
        # replace that function with a noop that simulates finding no implicit
        # RPATHs
        monkeypatch.setattr(
            spack.compiler.Compiler,
            '_get_compiler_link_paths',
            noop
        )


@pytest.fixture(scope='function')
@@ -21,6 +21,8 @@
    _use_uuid = False
    pass

from jsonschema import validate

import llnl.util.lock as lk
from llnl.util.tty.colify import colify

@@ -31,6 +33,7 @@
import spack.spec
from spack.util.mock_package import MockPackageMultiRepo
from spack.util.executable import Executable
from spack.schema.database_index import schema


pytestmark = pytest.mark.db
@@ -438,6 +441,10 @@ def test_005_db_exists(database):
    assert os.path.exists(str(index_file))
    assert os.path.exists(str(lock_file))

    with open(index_file) as fd:
        index_object = json.load(fd)
        validate(index_object, schema)


def test_010_all_install_sanity(database):
    """Ensure that the install layout reflects what we think it does."""
@@ -730,6 +737,8 @@ def test_old_external_entries_prefix(mutable_database):
    with open(spack.store.db._index_path, 'r') as f:
        db_obj = json.loads(f.read())

    validate(db_obj, schema)

    s = spack.spec.Spec('externaltool')
    s.concretize()
@@ -2,7 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import pytest
import os

import spack.paths
@@ -13,6 +13,13 @@
                       'compiler_verbose_output')


@pytest.fixture(autouse=True)
def allow_nonexistent_paths(monkeypatch):
    # Allow nonexistent paths to be detected as part of the output
    # for testing purposes.
    monkeypatch.setattr(os.path, 'isdir', lambda x: True)


def check_link_paths(filename, paths):
    with open(os.path.join(datadir, filename)) as file:
        output = file.read()
@@ -108,6 +108,8 @@ def test_buildcache(mock_archive, tmpdir):
    else:
        create_args.insert(create_args.index('-a'), '-u')

    create_args.insert(create_args.index('-a'), '--rebuild-index')

    args = parser.parse_args(create_args)
    buildcache.buildcache(parser, args)
    # trigger overwrite warning
@@ -165,7 +167,7 @@ def test_buildcache(mock_archive, tmpdir):
    args = parser.parse_args(['list'])
    buildcache.buildcache(parser, args)

    args = parser.parse_args(['list', '-f'])
    args = parser.parse_args(['list'])
    buildcache.buildcache(parser, args)

    args = parser.parse_args(['list', 'trivial'])
@@ -32,12 +32,14 @@
_shell_set_strings = {
    'sh': 'export {0}={1};\n',
    'csh': 'setenv {0} {1};\n',
    'fish': 'set -gx {0} {1};\n'
}


_shell_unset_strings = {
    'sh': 'unset {0};\n',
    'csh': 'unsetenv {0};\n',
    'fish': 'set -e {0};\n',
}
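These per-shell templates let one environment modification be rendered for whichever shell sourced Spack (`sh`, `csh`, or the newly supported `fish`). A short sketch of how such a table is used; the `set_env_command` helper is illustrative, not Spack's API:

```python
# Per-shell templates for setting a variable, as in the hunk above.
_shell_set_strings = {
    'sh': 'export {0}={1};\n',
    'csh': 'setenv {0} {1};\n',
    'fish': 'set -gx {0} {1};\n',
}


def set_env_command(shell, name, value):
    """Render a variable assignment in the requested shell's syntax."""
    return _shell_set_strings[shell].format(name, value)


print(set_env_command('fish', 'SPACK_ENV', '/tmp/env'))  # → set -gx SPACK_ENV /tmp/env;
```

Emitting shell text this way is what makes `spack load --fish` (tested later in this diff) print `set -gx ...` lines instead of `export ...`.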
@@ -25,6 +25,9 @@ check_dependencies $coverage git hg svn
export PATH="$ORIGINAL_PATH"
unset spack

# Convert QA_DIR to absolute path before changing directory
export QA_DIR=$(realpath $QA_DIR)

# Start in the spack root directory
cd "$SPACK_ROOT"

@@ -41,3 +44,6 @@ fi
# Run the test scripts for their output (these will print nicely)
zsh "$QA_DIR/setup-env-test.sh"
dash "$QA_DIR/setup-env-test.sh"

# Run fish tests
fish "$QA_DIR/setup-env-test.fish"
395
share/spack/qa/setup-env-test.fish
Executable file
@@ -0,0 +1,395 @@
#!/usr/bin/env fish
#
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

#
# This script tests that Spack's setup-env.fish init script works.
#


function allocate_testing_global -d "allocate global variables used for testing"

    # Colors for output
    set -gx __spt_red '\033[1;31m'
    set -gx __spt_cyan '\033[1;36m'
    set -gx __spt_green '\033[1;32m'
    set -gx __spt_reset '\033[0m'

    # counts of test successes and failures.
    set -gx __spt_success 0
    set -gx __spt_errors 0
end


function delete_testing_global -d "deallocate global variables used for testing"

    set -e __spt_red
    set -e __spt_cyan
    set -e __spt_green
    set -e __spt_reset

    set -e __spt_success
    set -e __spt_errors
end

# ------------------------------------------------------------------------
# Functions for color output.
# ------------------------------------------------------------------------

function echo_red
    printf "$__spt_red$argv$__spt_reset\n"
end

function echo_green
    printf "$__spt_green$argv$__spt_reset\n"
end

function echo_msg
    printf "$__spt_cyan$argv$__spt_reset\n"
end


# ------------------------------------------------------------------------
# Generic functions for testing fish code.
# ------------------------------------------------------------------------

# Print out a header for a group of tests.
function title
    echo
    echo_msg "$argv"
    echo_msg "---------------------------------"
end

# echo FAIL in red text; increment failures
function fail
    echo_red FAIL
    set __spt_errors (math $__spt_errors+1)
end

# echo SUCCESS in green; increment successes
function pass
    echo_green SUCCESS
    set __spt_success (math $__spt_success+1)
end


#
# Run a command and suppress output unless it fails.
# On failure, echo the exit code and output.
#
function spt_succeeds
    printf "'$argv' succeeds ... "

    set -l output (eval $argv 2>&1)

    if test $status -ne 0
        fail
        echo_red "Command failed with error $status"
        if test -n "$output"
            echo_msg "Output:"
            echo "$output"
        else
            echo_msg "No output."
        end
    else
        pass
    end
end


#
# Run a command and suppress output unless it succeeds.
# If the command succeeds, echo the output.
#
function spt_fails
    printf "'$argv' fails ... "

    set -l output (eval $argv 2>&1)

    if test $status -eq 0
        fail
        echo_red "Command succeeded, but it should have failed"
        if test -n "$output"
            echo_msg "Output:"
            echo "$output"
        else
            echo_msg "No output."
        end
    else
        pass
    end
end


#
# Ensure that a string is in the output of a command.
# Suppresses output on success.
# On failure, echo the exit code and output.
#
function spt_contains
    set -l target_string $argv[1]
    set -l remaining_args $argv[2..-1]

    printf "'$remaining_args' output contains '$target_string' ... "

    set -l output (eval $remaining_args 2>&1)

    if not echo "$output" | string match -q -r ".*$target_string.*"
        fail
        echo_red "Command exited with error $status"
        echo_red "'$target_string' was not in output."
        if test -n "$output"
            echo_msg "Output:"
            echo "$output"
        else
            echo_msg "No output."
        end
    else
        pass
    end
end


#
# Ensure that a variable is set.
#
function is_set
    printf "'$argv[1]' is set ... "

    if test -z "$$argv[1]"
        fail
        echo_msg "'$argv[1]' was not set!"
    else
        pass
    end
end


#
# Ensure that a variable is not set.
# Fails and prints the value of the variable if it is set.
#
function is_not_set
    printf "'$argv[1]' is not set ... "

    if test -n "$$argv[1]"
        fail
        echo_msg "'$argv[1]' was set!"
        echo " $$argv[1]"
    else
        pass
    end
end


# -----------------------------------------------------------------------
# Setup test environment and do some preliminary checks
# -----------------------------------------------------------------------

# Make sure no environment is active
set -e SPACK_ENV
true # ignore failing `set -e`

# Source setup-env.fish before tests
set -gx QA_DIR (dirname (status --current-filename))
source $QA_DIR/../setup-env.fish


# -----------------------------------------------------------------------
# Instead of invoking the module and cd commands, we print the arguments that
# Spack invokes the command with, so we can check that Spack passes the expected
# arguments in the tests below.
#
# We make that happen by defining the fish functions below. NOTE: these overwrite
# existing functions => define them last
# -----------------------------------------------------------------------

function module
    echo "module $argv"
end

function cd
    echo "cd $argv"
end


allocate_testing_global


# -----------------------------------------------------------------------
# Let the testing begin!
# -----------------------------------------------------------------------

title "Testing setup-env.fish with $_sp_shell"

# spack command is now available
spt_succeeds which spack

# create a fake mock package install and store its location for later
title "Setup"
echo "Creating a mock package installation"
spack -m install --fake a

# create a test environment for testing environment commands
echo "Creating a mock environment"
spack env create spack_test_env

# ensure that we uninstall b on exit
function spt_cleanup

    set trapped_error false
    if test $status -ne 0
        set trapped_error true
    end

    echo "Removing test environment before exiting."
    spack env deactivate 2>&1 > /dev/null
    spack env rm -y spack_test_env

    title "Cleanup"
    echo "Removing test packages before exiting."
    spack -m uninstall -yf b a

    echo
    echo "$__spt_success tests succeeded."
    echo "$__spt_errors tests failed."

    if test "$trapped_error" = true
        echo "Exited due to an error."
    end

    if test "$__spt_errors" -eq 0
        if test "$trapped_error" = false
            pass
            exit 0
        else
            fail
            exit 1
        end
    else
        fail
        exit 1
    end

    delete_testing_global
end

trap spt_cleanup EXIT


# -----------------------------------------------------------------------
# Test all spack commands with special env support
# -----------------------------------------------------------------------
title 'Testing `spack`'
spt_contains 'usage: spack ' spack
spt_contains "usage: spack " spack -h
spt_contains "usage: spack " spack help
spt_contains "usage: spack " spack -H
spt_contains "usage: spack " spack help --all

title 'Testing `spack cd`'
spt_contains "usage: spack cd " spack cd -h
spt_contains "usage: spack cd " spack cd --help
spt_contains "cd $b_install" spack cd -i b

title 'Testing `spack module`'
spt_contains "usage: spack module " spack -m module -h
spt_contains "usage: spack module " spack -m module --help
spt_contains "usage: spack module " spack -m module

title 'Testing `spack load`'
set _b_loc (spack -m location -i b)
set _b_ld $_b_loc"/lib"
set _a_loc (spack -m location -i a)
set _a_ld $_a_loc"/lib"

spt_contains "set -gx LD_LIBRARY_PATH $_b_ld" spack -m load --only package --fish b
spt_succeeds spack -m load b
# test a variable MacOS clears and one it doesn't for recursive loads
spt_contains "set -gx LD_LIBRARY_PATH $_a_ld:$_b_ld" spack -m load --fish a
spt_contains "set -gx LIBRARY_PATH $_a_ld:$_b_ld" spack -m load --fish a
spt_succeeds spack -m load --only dependencies a
spt_succeeds spack -m load --only package a
spt_fails spack -m load d
spt_contains "usage: spack load " spack -m load -h
spt_contains "usage: spack load " spack -m load -h d
spt_contains "usage: spack load " spack -m load --help

title 'Testing `spack unload`'
spack -m load b a # setup
# spt_contains "module unload $b_module" spack -m unload b
spt_succeeds spack -m unload b
spt_succeeds spack -m unload --all
spack -m unload --all # cleanup
spt_fails spack -m unload -l
# spt_contains "module unload -l --arg $b_module" spack -m unload -l --arg b
spt_fails spack -m unload d
spt_contains "usage: spack unload " spack -m unload -h
spt_contains "usage: spack unload " spack -m unload -h d
spt_contains "usage: spack unload " spack -m unload --help

title 'Testing `spack env`'
spt_contains "usage: spack env " spack env -h
spt_contains "usage: spack env " spack env --help

title 'Testing `spack env list`'
spt_contains " spack env list " spack env list -h
spt_contains " spack env list " spack env list --help

title 'Testing `spack env activate`'
spt_contains "No such environment:" spack env activate no_such_environment
spt_contains "usage: spack env activate " spack env activate
spt_contains "usage: spack env activate " spack env activate -h
spt_contains "usage: spack env activate " spack env activate --help

title 'Testing `spack env deactivate`'
spt_contains "Error: No environment is currently active" spack env deactivate
spt_contains "usage: spack env deactivate " spack env deactivate no_such_environment
spt_contains "usage: spack env deactivate " spack env deactivate -h
spt_contains "usage: spack env deactivate " spack env deactivate --help

title 'Testing activate and deactivate together'
echo "Testing 'spack env activate spack_test_env'"
spack env activate spack_test_env
is_set SPACK_ENV

echo "Testing 'spack env deactivate'"
spack env deactivate
is_not_set SPACK_ENV

echo "Testing 'spack env activate spack_test_env'"
spack env activate spack_test_env
is_set SPACK_ENV

echo "Testing 'despacktivate'"
despacktivate
is_not_set SPACK_ENV

#
# NOTE: `--prompt` on fish does nothing => currently not implemented.
#

# echo "Testing 'spack env activate --prompt spack_test_env'"
# spack env activate --prompt spack_test_env
# is_set SPACK_ENV
# is_set SPACK_OLD_PS1
#
# echo "Testing 'despacktivate'"
# despacktivate
# is_not_set SPACK_ENV
# is_not_set SPACK_OLD_PS1
723
share/spack/setup-env.fish
Executable file
@@ -0,0 +1,723 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)


#################################################################################
#
# This file is part of Spack and sets up the spack environment for the friendly
# interactive shell (fish). This includes module support, and it also puts spack
# in your path. The script also checks that at least module support exists, and
# provides suggestions if it doesn't. Source it like this:
#
#    source /path/to/spack/share/spack/setup-env.fish
#
#################################################################################
# This is a wrapper around the spack command that forwards calls to 'spack load'
# and 'spack unload' to shell functions. This in turn allows them to be used to
# invoke environment modules functions.
#
# 'spack load' is smarter than just 'load' because it converts its arguments into
# a unique spack spec that is then passed to module commands. This allows the
# user to load packages without knowing all their installation details.
#
# e.g., rather than requiring a full spec for libelf, the user can type:
#
#     spack load libelf
#
# This will first find the available libelf modules and load a matching one. If
# there are two versions of libelf, the user would need to be more specific,
# e.g.:
#
#     spack load libelf@0.8.13
#
# This is very similar to how regular spack commands work and it avoids the need
# to come up with a user-friendly naming scheme for spack dotfiles.
#################################################################################


#
# Test for the STDERR-NOCARET feature: if this is off, fish will redirect stderr
# to a file named in the string after `^`
#

if status test-feature stderr-nocaret
else
    echo "WARNING: you have not enabled the 'stderr-nocaret' feature."
    echo "This means that you have to escape the caret (^) character when defining specs."
    echo "Consider enabling stderr-nocaret: https://fishshell.com/docs/current/index.html#featureflags"
end
#
# SPACK wrapper function, preprocessing arguments and flags.
#

function spack -d "wrapper for the `spack` command"

    #
    # DEFINE SUPPORT FUNCTIONS HERE
    #

    #
    # ALLOCATE_SP_SHARED and DELETE_SP_SHARED allocate (and delete) temporary
    # global variables
    #

    function allocate_sp_shared -d "allocate shared (global) variables"
        set -gx __sp_remaining_args
        set -gx __sp_subcommand_args
        set -gx __sp_module_args
        set -gx __sp_stat
        set -gx __sp_stdout
        set -gx __sp_stderr
    end


    function delete_sp_shared -d "deallocate shared (global) variables"
        set -e __sp_remaining_args
        set -e __sp_subcommand_args
        set -e __sp_module_args
        set -e __sp_stat
        set -e __sp_stdout
        set -e __sp_stderr
    end


    #
    # STREAM_ARGS and SHIFT_ARGS: helper functions manipulating the `argv` array:
    # -> STREAM_ARGS: echoes the `argv` array element-by-element
    # -> SHIFT_ARGS:  echoes the `argv` array element-by-element starting with
    #                 the second element. If `argv` has only one element, echoes
    #                 the empty string `""`.
    # NOTE: while `stream_args` is not strictly necessary, it adds a nice
    #       symmetry to `shift_args`
    #

    function stream_args -d "echoes args as a stream"
        # return the elements of `$argv` as an array
        # -> since we want to be able to call it as part of `set x (shift_args
        #    $x)`, we return these one-at-a-time using echo... this means that the
        #    sub-command stream will correctly concatenate the output into an array
        for elt in $argv
            echo $elt
        end
    end


    function shift_args -d "simulates bash shift"
        #
        # Returns argv[2..-1] (as an array)
        # -> if argv has only 1 element, then returns the empty string. This
        #    simulates the behavior of bash `shift`
        #

        if test -z "$argv[2]"
            # there are no more elements; return the empty string
            echo ""
        else
            # return the next elements `$argv[2..-1]` as an array
            # -> since we want to be able to call it as part of `set x (shift_args
            #    $x)`, we return these one-at-a-time using echo... this means that
            #    the sub-command stream will correctly concatenate the output into
            #    an array
            for elt in $argv[2..-1]
                echo $elt
            end
        end
    end
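As an illustration (hypothetical values, not part of the script), the two helpers compose like this:

```fish
# Hypothetical sketch of the helpers above; `args`, `all`, and `rest`
# are illustration-only variable names.
set -l args one two three
set -l all (stream_args $args)   # all  => one two three
set -l rest (shift_args $args)   # rest => two three
```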
    #
    # CAPTURE_ALL: helper function used to capture stdout, stderr, and status
    # -> CAPTURE_ALL: there is a bug in fish that prevents stderr re-capture
    #    from nested command substitution:
    #    https://github.com/fish-shell/fish-shell/issues/6459
    #

    function capture_all
        begin;
            begin;
                eval $argv[1]
                set $argv[2] $status # `read` sets the `status` flag => capture here
            end 2>| read -z __err
        end 1>| read -z __out

        # output arrays
        set $argv[3] (echo $__out | string split \n)
        set $argv[4] (echo $__err | string split \n)

        return 0
    end
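A minimal usage sketch of `capture_all` (hypothetical invocation; the variable names mirror the globals set up by `allocate_sp_shared`):

```fish
# Hypothetical: run a command and inspect its status, stdout, and stderr
# separately, working around the nested-substitution stderr bug above.
capture_all "command spack --version" __sp_stat __sp_stdout __sp_stderr
echo "status: $__sp_stat"
echo "stdout: $__sp_stdout"
echo "stderr: $__sp_stderr"
```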
    #
    # GET_SP_FLAGS and GET_MOD_ARGS: support functions for extracting arguments
    # and flags. Note that bash's `shift` operation is simulated by the
    # `__sp_remaining_args` array, which is roughly equivalent to `$@` in bash.
    #

    function get_sp_flags -d "return leading flags"
        #
        # Accumulate initial flags for the main spack command. NOTE: sets the
        # external array `__sp_remaining_args` containing all unprocessed
        # arguments.
        #

        # initialize argument counter
        set -l i 1

        # iterate over elements (`elt`) in `argv` array
        for elt in $argv
            # match element `elt` of `argv` array to check if it has a leading dash
            if echo $elt | string match -r -q "^-"
                # by echoing the current `elt`, the calling stream accumulates the
                # list of valid flags. NOTE that this could also be done by adding
                # to an array, but fish functions can only return integers, so this
                # is the most elegant solution.
                echo $elt
            else
                # bash compatibility: stop when the match first fails. Upon
                # failure, pack the remainder of `argv` into the global
                # `__sp_remaining_args` array (`i` tracks the index of the next
                # element).
                set __sp_remaining_args (stream_args $argv[$i..-1])
                return
            end

            # increment argument counter: used in place of bash's `shift` command
            set -l i (math $i+1)
        end

        # if all elements in `argv` are matched, make sure that
        # `__sp_remaining_args` is deleted (this might be overkill...).
        set -e __sp_remaining_args
    end
    #
    # CHECK_SP_FLAGS, CONTAINS_HELP_FLAGS, CHECK_ENV_ACTIVATE_FLAGS, and
    # CHECK_ENV_DEACTIVATE_FLAGS: support functions for checking arguments and
    # flags.
    #

    function check_sp_flags -d "check spack flags for h/V flags"
        #
        # Check if inputs contain h or V flags.
        #

        # combine argument array into a single string (space separated), to be
        # passed to regular expression matching (`string match -r`)
        set -l _a "$argv"

        # skip if called with blank input. Notes: [1] (cf. EOF)
        if test -n "$_a"
            if echo $_a | string match -r -q ".*h.*"
                return 0
            end
            if echo $_a | string match -r -q ".*V.*"
                return 0
            end
        end

        return 1
    end


    function check_env_activate_flags -d "check spack env subcommand flags for -h, --sh, --csh, or --fish"
        #
        # Check if inputs contain -h, --sh, --csh, or --fish
        #

        # combine argument array into a single string (space separated), to be
        # passed to regular expression matching (`string match -r`)
        set -l _a "$argv"

        # skip if called with blank input. Notes: [1] (cf. EOF)
        if test -n "$_a"
            # looks for a single `-h` (possibly surrounded by spaces)
            if echo $_a | string match -r -q " *-h *"
                return 0
            end

            # looks for a single `--sh` (possibly surrounded by spaces)
            if echo $_a | string match -r -q " *--sh *"
                return 0
            end

            # looks for a single `--csh` (possibly surrounded by spaces)
            if echo $_a | string match -r -q " *--csh *"
                return 0
            end

            # looks for a single `--fish` (possibly surrounded by spaces)
            if echo $_a | string match -r -q " *--fish *"
                return 0
            end
        end

        return 1
    end
    function check_env_deactivate_flags -d "check spack env subcommand flags for --sh, --csh, or --fish"
        #
        # Check if inputs contain --sh, --csh, or --fish
        #

        # combine argument array into a single string (space separated), to be
        # passed to regular expression matching (`string match -r`)
        set -l _a "$argv"

        # skip if called with blank input. Notes: [1] (cf. EOF)
        if test -n "$_a"
            # TODO: should this crash (we're clearly using fish, not bash, here)?
            # looks for a single `--sh` (possibly surrounded by spaces)
            if echo $_a | string match -r -q " *--sh *"
                return 0
            end

            # TODO: should this crash (we're clearly using fish, not csh, here)?
            # looks for a single `--csh` (possibly surrounded by spaces)
            if echo $_a | string match -r -q " *--csh *"
                return 0
            end

            # looks for a single `--fish` (possibly surrounded by spaces)
            if echo $_a | string match -r -q " *--fish *"
                return 0
            end
        end

        return 1
    end
    #
    # SPACK RUNNER function, this does all the work!
    #

    function spack_runner -d "Runner function for the `spack` wrapper"
        #
        # Accumulate initial flags for main spack command
        #

        set __sp_remaining_args                # remaining (unparsed) arguments
        set -l sp_flags (get_sp_flags $argv)   # sets __sp_remaining_args

        #
        # h and V flags don't require further output parsing.
        #

        if check_sp_flags $sp_flags
            command spack $sp_flags $__sp_remaining_args
            return 0
        end

        #
        # Isolate subcommand and subcommand specs. Notes: [1] (cf. EOF)
        #

        set -l sp_subcommand ""

        if test -n "$__sp_remaining_args[1]"
            set sp_subcommand $__sp_remaining_args[1]
            set __sp_remaining_args (shift_args $__sp_remaining_args) # simulates bash shift
        end

        set -l sp_spec $__sp_remaining_args

        #
        # Filter out cd, env, load, and unload. For any other commands, just run
        # the spack command as is.
        #

        switch $sp_subcommand

            # CASE: spack subcommand is `cd`: if the subcommand arg is `-h`,
            # nothing further needs to be done. Otherwise, resolve the location
            # referred to by the subcommand and cd there (if it exists).

            case "cd"
                set -l sp_arg ""

                # Extract the first subcommand argument. Notes: [1] (cf. EOF)
                if test -n "$__sp_remaining_args[1]"
                    set sp_arg $__sp_remaining_args[1]
                    set __sp_remaining_args (shift_args $__sp_remaining_args) # simulates bash shift
                end

                # Notes: [2] (cf. EOF)
                if test "x$sp_arg" = "x-h"; or test "x$sp_arg" = "x--help"
                    # nothing more needs to be done for `-h` or `--help`
                    command spack cd -h
                else
                    # extract location using the subcommand (fish `(...)`)
                    set -l LOC (command spack location $sp_arg $__sp_remaining_args)

                    # test location and cd if it exists:
                    if test -d "$LOC"
                        cd $LOC
                    else
                        return 1
                    end
                end

                return 0

            # CASE: spack subcommand is `env`. Here we get the spack runtime to
            # supply the appropriate shell commands for setting the environment
            # variables. These commands are then run by fish (using the
            # `capture_all` function, instead of a command substitution).

            case "env"
                set -l sp_arg ""

                # Extract the first subcommand argument. Notes: [1] (cf. EOF)
                if test -n "$__sp_remaining_args[1]"
                    set sp_arg $__sp_remaining_args[1]
                    set __sp_remaining_args (shift_args $__sp_remaining_args) # simulates bash shift
                end

                # Notes: [2] (cf. EOF)
                if test "x$sp_arg" = "x-h"; or test "x$sp_arg" = "x--help"
                    # nothing more needs to be done for `-h` or `--help`
                    command spack env -h
                else
                    switch $sp_arg
                        case "activate"
                            set -l _a (stream_args $__sp_remaining_args)

                            if check_env_activate_flags $_a
                                # no args, or args contain -h/--help, --sh,
                                # --csh, or --fish: just execute
                                command spack env activate $_a
                            else
                                # actual call to activate: source the output
                                set -l sp_env_cmd "command spack $sp_flags env activate --fish $__sp_remaining_args"
                                capture_all $sp_env_cmd __sp_stat __sp_stdout __sp_stderr
                                eval $__sp_stdout
                                if test -n "$__sp_stderr"
                                    echo -s \n$__sp_stderr 1>&2 # current fish bug: handle stderr manually
                                end
                            end

                        case "deactivate"
                            set -l _a (stream_args $__sp_remaining_args)

                            if check_env_deactivate_flags $_a
                                # just execute the command if --sh, --csh, or
                                # --fish are provided
                                command spack env deactivate $_a

                            # Test for further (unparsed) arguments. Any other
                            # arguments are an error or help, so just run help
                            # -> TODO: this should throw an error, but is left
                            #    as-is for compatibility with setup-env.sh
                            # -> Notes: [1] (cf. EOF).
                            else if test -n "$__sp_remaining_args"
                                command spack env deactivate -h
                            else
                                # no args: source the output of the command
                                set -l sp_env_cmd "command spack $sp_flags env deactivate --fish"
                                capture_all $sp_env_cmd __sp_stat __sp_stdout __sp_stderr
                                eval $__sp_stdout
                                if test $__sp_stat -ne 0
                                    if test -n "$__sp_stderr"
                                        echo -s \n$__sp_stderr 1>&2 # current fish bug: handle stderr manually
                                    end
                                    return 1
                                end
                            end

                        case "*"
                            # if $__sp_remaining_args is empty, then don't
                            # include it as an argument (otherwise it will be
                            # confused with a blank string input!)
                            if test -n "$__sp_remaining_args"
                                command spack env $sp_arg $__sp_remaining_args
                            else
                                command spack env $sp_arg
                            end
                    end
                end

            # CASE: spack subcommand is either `load` or `unload`. These
            # statements deal with the technical details of actually using
            # modules, especially substituting the latest version numbers into
            # the module command.

            case "load" "unload"
                set -l _a (stream_args $__sp_remaining_args)

                if check_env_activate_flags $_a
                    # no args, or args contain -h/--help, --sh, --csh, or
                    # --fish: just execute
                    command spack $sp_flags $sp_subcommand $__sp_remaining_args
                else
                    # actual call to load/unload: source the output
                    set -l sp_env_cmd "command spack $sp_flags $sp_subcommand --fish $__sp_remaining_args"
                    capture_all $sp_env_cmd __sp_stat __sp_stdout __sp_stderr
                    eval $__sp_stdout
                    if test $__sp_stat -ne 0
                        if test -n "$__sp_stderr"
                            echo -s \n$__sp_stderr 1>&2 # current fish bug: handle stderr manually
                        end
                        return 1
                    end
                end

            # CASE: catch-all

            case "*"
                command spack $argv

        end

        return 0
    end
    #
    # RUN SPACK_RUNNER HERE
    #

    #
    # Allocate temporary global variables used to return extra arguments from
    # functions. NOTE: remember to call delete_sp_shared whenever returning from
    # this function.
    #

    allocate_sp_shared

    #
    # Run spack command using the spack_runner.
    #

    spack_runner $argv
    # Capture state of spack_runner (returned below)
    set -l stat $status

    #
    # Delete temporary global variables allocated in `allocate_sp_shared`.
    #

    delete_sp_shared

    return $stat
end
#################################################################################
# Prepends directories to path, if they exist.
#      pathadd /path/to/dir            # add to PATH
# or   pathadd OTHERPATH /path/to/dir  # add to OTHERPATH
#################################################################################
function spack_pathadd -d "Add path to specified variable (defaults to PATH)"
    #
    # Adds (existing only) paths to the specified variable (defaults to PATH).
    # Does not warn when attempting to add a non-existing path. This is not a
    # bug, because the MODULEPATH setup tries to add all possible compatible
    # systems, and therefore sp_multi_pathadd relies on this function failing
    # silently.
    #

    # If no variable name is supplied, just append to PATH; otherwise append to
    # that variable.
    # -> Notes: [1] (cf. EOF).
    if test -n "$argv[2]"
        set pa_varname $argv[1]
        set pa_new_path $argv[2]
    else
        true # this is a bit of a strange hack! Notes: [3] (cf. EOF).
        set pa_varname PATH
        set pa_new_path $argv[1]
    end

    set pa_oldvalue $$pa_varname

    # skip if path is not an existing directory
    # -> Notes: [1] (cf. EOF).
    if test -d "$pa_new_path"
        # combine argument array into a single string (space separated), to be
        # passed to regular expression matching (`string match -r`)
        set -l _a "$pa_oldvalue"

        # skip path if it is already contained in the variable
        # note the spaces in the regular expression: we're matching against a
        # space-delimited list of paths
        if not echo $_a | string match -q -r " *$pa_new_path *"
            if test -n "$pa_oldvalue"
                set $pa_varname $pa_new_path $pa_oldvalue
            else
                true # this is a bit of a strange hack! Notes: [3] (cf. EOF)
                set $pa_varname $pa_new_path
            end
        end
    end
end
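A sketch of the two calling conventions from the header comment (the paths are hypothetical, for illustration only):

```fish
# Hypothetical paths for illustration only.
spack_pathadd /opt/tools/bin               # prepended to PATH (if it exists)
spack_pathadd MODULEPATH /opt/modulefiles  # prepended to MODULEPATH instead
```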
function sp_multi_pathadd -d "Helper for adding module-style paths by incorporating compatible systems into pathadd" --inherit-variable _sp_compatible_sys_types
    #
    # Calls spack_pathadd on path inputs, adding all compatible system types
    # (sourced from $_sp_compatible_sys_types) to input paths.
    #
    for pth in $argv[2]
        for systype in $_sp_compatible_sys_types
            spack_pathadd $argv[1] "$pth/$systype"
        end
    end
end


#
# Figure out where this file is. The code below only needs to work in fish.
#
set -l sp_source_file (status -f) # name of current file


#
# Find root directory and add bin to path.
#
set -l sp_share_dir (realpath (dirname $sp_source_file))
set -l sp_prefix (realpath (dirname (dirname $sp_share_dir)))
spack_pathadd PATH "$sp_prefix/bin"
set -xg SPACK_ROOT $sp_prefix


#
# No need to determine which shell is being used (obviously it's fish)
#
set -xg SPACK_SHELL "fish"
set -xg _sp_shell "fish"


#
# Check whether we need environment modules (i.e. neither the `use` nor the
# `module` command is available)
#
set -l need_module "no"
if not functions -q use; and not functions -q module
    set need_module "yes"
end


#
# Make environment-modules available to the shell
#
function sp_apply_shell_vars -d "applies expressions of the type `a='b'` as `set a b`"
    # convert `a='b'` to the array variable `a b`
    set -l expr_token (string trim -c "'" (string split "=" $argv))

    # run the set command, converting lists of type `a:b:c` to array variables
    # `a b c` by splitting on the `:` character
    set -xg $expr_token[1] (string split ":" $expr_token[2])
end
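For example (hypothetical variable name), an expression of the form `a='b:c'` becomes a fish array:

```fish
# Hypothetical: MY_DIRS is illustration-only.
sp_apply_shell_vars "MY_DIRS='/a:/b:/c'"
# MY_DIRS is now the array: /a /b /c
```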
if test "$need_module" = "yes"
    set -l sp_shell_vars (command spack --print-shell-vars sh,modules)

    for sp_var_expr in $sp_shell_vars
        sp_apply_shell_vars $sp_var_expr
    end

    # _sp_module_prefix is set by spack --print-shell-vars
    if test "$_sp_module_prefix" != "not_installed"
        set -xg MODULE_PREFIX $_sp_module_prefix
        spack_pathadd PATH "$MODULE_PREFIX/bin"
    end

else
    set -l sp_shell_vars (command spack --print-shell-vars sh)

    for sp_var_expr in $sp_shell_vars
        sp_apply_shell_vars $sp_var_expr
    end
end

if test "$need_module" = "yes"
    function module -d "wrapper for the `module` command to point at Spack's modules instance" --inherit-variable MODULE_PREFIX
        eval $MODULE_PREFIX/bin/modulecmd $SPACK_SHELL $argv
    end
end


#
# set module system roots
#

# Set up MODULEPATH by trying all possible compatible system types as module
# roots.
if test -z "$MODULEPATH"
    set -gx MODULEPATH
end
sp_multi_pathadd MODULEPATH $_sp_tcl_roots


#
# NOTES
#
# [1]: `test -n` requires exactly 1 argument. If `argv` is undefined, or if it
#      is an array, `test -n $argv` is unpredictable. Instead, encapsulate
#      `argv` in a string, and test the string.
#
# [2]: `test "$a" = "$b"` is dangerous if `a` and `b` contain flags at index 1,
#      as `test $a` can be interpreted as `test $a[1] $a[2..-1]`. The solution
#      is to prepend a non-flag character, e.g.: `test "x$a" = "x$b"`.
#
# [3]: When the test in the if statement fails, the `status` flag is set to 1.
#      `true` here manually resets the value of `status` to 0. Since `set`
#      passes `status` along, we thus avoid the function returning 1 by mistake.
@@ -320,7 +320,7 @@ _spack() {
    then
        SPACK_COMPREPLY="-h --help -H --all-help --color -C --config-scope -d --debug --timestamp --pdb -e --env -D --env-dir -E --no-env --use-env-repo -k --insecure -l --enable-locks -L --disable-locks -m --mock -p --profile --sorted-profile --lines -v --verbose --stacktrace -V --version --print-shell-vars"
    else
        SPACK_COMPREPLY="activate add arch blame build build-env buildcache cd checksum ci clean clone commands compiler compilers concretize config configure containerize create deactivate debug dependencies dependents deprecate dev-build diy docs edit env extensions external fetch find flake8 gc gpg graph help info install license list load location log-parse maintainers mirror module patch pkg providers pydoc python reindex remove rm repo resource restage setup spec stage test uninstall unload upload-s3 url verify versions view"
        SPACK_COMPREPLY="activate add arch blame build-env buildcache cd checksum ci clean clone commands compiler compilers concretize config containerize create deactivate debug dependencies dependents deprecate dev-build docs edit env extensions external fetch find flake8 gc gpg graph help info install license list load location log-parse maintainers mirror module patch pkg providers pydoc python reindex remove rm repo resource restage setup spec stage test uninstall unload url verify versions view"
    fi
}

@@ -355,15 +355,6 @@ _spack_blame() {
    fi
}

_spack_build() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -v --verbose"
    else
        _all_packages
    fi
}

_spack_build_env() {
    if $list_options
    then
@@ -385,7 +376,7 @@ _spack_buildcache() {
_spack_buildcache_create() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -r --rel -f --force -u --unsigned -a --allow-root -k --key -d --directory -m --mirror-name --mirror-url --no-rebuild-index -y --spec-yaml --only"
        SPACK_COMPREPLY="-h --help -r --rel -f --force -u --unsigned -a --allow-root -k --key -d --directory -m --mirror-name --mirror-url --rebuild-index -y --spec-yaml --only"
    else
        _all_packages
    fi
@@ -403,7 +394,7 @@ _spack_buildcache_install() {
_spack_buildcache_list() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -l --long -L --very-long -v --variants -f --force -a --allarch"
        SPACK_COMPREPLY="-h --help -l --long -L --very-long -v --variants -a --allarch"
    else
        _all_packages
    fi
@@ -641,15 +632,6 @@ _spack_config_rm() {
    fi
}

_spack_configure() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -v --verbose"
    else
        _all_packages
    fi
}

_spack_containerize() {
    SPACK_COMPREPLY="-h --help"
}
@@ -725,15 +707,6 @@ _spack_dev_build() {
    fi
}

_spack_diy() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -j --jobs -d --source-path -i --ignore-dependencies -n --no-checksum --keep-prefix --skip-patch -q --quiet --drop-in -b --before -u --until --clean --dirty"
    else
        _all_packages
    fi
}

_spack_docs() {
    SPACK_COMPREPLY="-h --help"
}
@@ -759,14 +732,14 @@ _spack_env() {
_spack_env_activate() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help --sh --csh -v --with-view -V --without-view -d --dir -p --prompt"
        SPACK_COMPREPLY="-h --help --sh --csh --fish -v --with-view -V --without-view -d --dir -p --prompt"
    else
        _environments
    fi
}

_spack_env_deactivate() {
    SPACK_COMPREPLY="-h --help --sh --csh"
    SPACK_COMPREPLY="-h --help --sh --csh --fish"
}

_spack_env_create() {
@@ -1024,7 +997,7 @@ _spack_list() {
_spack_load() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -r --dependencies --sh --csh --first --only"
        SPACK_COMPREPLY="-h --help -r --dependencies --sh --csh --fish --first --only"
    else
        _installed_packages
    fi
@@ -1472,29 +1445,12 @@ _spack_uninstall() {
_spack_unload() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help --sh --csh -a --all"
        SPACK_COMPREPLY="-h --help --sh --csh --fish -a --all"
    else
        _installed_packages
    fi
}

_spack_upload_s3() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help"
    else
        SPACK_COMPREPLY="spec index"
    fi
}

_spack_upload_s3_spec() {
    SPACK_COMPREPLY="-h --help -s --spec -y --spec-yaml -b --base-dir -e --endpoint-url"
}

_spack_upload_s3_index() {
    SPACK_COMPREPLY="-h --help -e --endpoint-url"
}

_spack_url() {
    if $list_options
    then
@@ -7,7 +7,7 @@ RUN mkdir {{ paths.environment }} \
    {{ manifest }} > {{ paths.environment }}/spack.yaml

# Install the software, remove unnecessary deps
RUN cd {{ paths.environment }} && spack install && spack gc -y
RUN cd {{ paths.environment }} && spack env activate . && spack install && spack gc -y
{% if strip %}

# Strip all the binaries

@@ -11,8 +11,10 @@ EOF

# Install all the required software
. /opt/spack/share/spack/setup-env.sh
spack env activate .
spack install
spack gc -y
spack env deactivate
spack env activate --sh -d . >> {{ paths.environment }}/environment_modifications.sh
{% if strip %}

@@ -87,4 +89,4 @@ Stage: final
{% for label, value in labels.items() %}
{{ label }} {{ value }}
{% endfor %}
{% endif %}
{% endif %}
@@ -12,8 +12,14 @@ class Catch2(CMakePackage):

    homepage = "https://github.com/catchorg/Catch2"
    url = "https://github.com/catchorg/Catch2/archive/v2.9.1.tar.gz"
    maintainers = ['ax3l']
    git = "https://github.com/catchorg/Catch2.git"
    maintainers = ["ax3l", "AndrewGaspar"]

    # In-Development
    version('master', branch='master')

    # Releases
    version('2.12.3', sha256='78425e7055cea5bad1ff8db7ea0d6dfc0722ece156be1ccf3597c15e674e6943')
    version('2.12.1', sha256='e5635c082282ea518a8dd7ee89796c8026af8ea9068cd7402fb1615deacd91c3')
    version('2.12.0', sha256='6606b754363d3a4521bfecf717dc1972c50dca282bd428dfb1370ec8b9c26918')
    version('2.11.3', sha256='9a6967138062688f04374698fce4ce65908f907d8c0fe5dfe8dc33126bd46543')

@@ -0,0 +1,10 @@
--- spack-src/CMakeLists.txt.org	2020-06-29 14:38:00.737544597 +0900
+++ spack-src/CMakeLists.txt	2020-06-29 14:40:33.758297327 +0900
@@ -24,6 +24,6 @@

 add_definitions(-DVERSION="git" -DHOST_NAME="cab")

-set(CMAKE_SHARED_LIBRARY_LINK_CXX_FLAGS "")
+set(CMAKE_SHARED_LIBRARY_LINK_CXX_FLAGS "--linkfortran")

 add_subdirectory (src)
@@ -24,6 +24,10 @@ class Cleverleaf(CMakePackage):
    depends_on('boost')
    depends_on('cmake@3.1:', type='build')

    # The Fujitsu compiler requires the '--linkfortran'
    # option to combine C++ and Fortran programs.
    patch('fujitsu_add_link_flags.patch', when='%fj')

    def flag_handler(self, name, flags):
        if self.spec.satisfies('%intel') and name in ['cppflags', 'cxxflags']:
            flags.append(self.compiler.cxx11_flag)

@@ -18,6 +18,7 @@ class Cmake(Package):

    executables = ['cmake']

    version('3.17.3', sha256='0bd60d512275dc9f6ef2a2865426a184642ceb3761794e6b65bff233b91d8c40')
    version('3.17.1', sha256='3aa9114485da39cbd9665a0bfe986894a282d5f0882b1dea960a739496620727')
    version('3.17.0', sha256='b74c05b55115eacc4fa2b77a814981dbda05cdc95a53e279fe16b7b272f00847')
    version('3.16.5', sha256='5f760b50b8ecc9c0c37135fae5fbf00a2fef617059aa9d61c1bb91653e5a8bfc')

@@ -14,10 +14,12 @@ class CodarCheetah(PythonPackage):
    maintainers = ['kshitij-v-mehta']

    homepage = "https://github.com/CODARcode/cheetah"
    url = "https://github.com/CODARcode/cheetah/archive/v0.1.tar.gz"
    url = "https://github.com/CODARcode/cheetah/archive/v1.1.0.tar.gz"
    git = "https://github.com/CODARcode/cheetah.git"

    version('develop', branch='dev')
    version('1.1.0', sha256='519a47e4fc5b124b443839fde10b8b72120ab768398628df43e0b570a266434c')
    version('1.0.0', sha256='1f935fbc1475a654f3b6d2140d8b2a6079a65c8701655e544ba1fab3a7c1bc19')
    version('0.5', sha256='f37a554741eff4bb8407a68f799dd042dfc4df525e84896cad70fccbd6aca6ee')

    depends_on('python@3.5:', type=('build', 'run'))

51	var/spack/repos/builtin/packages/coin3d/package.py	Normal file
@@ -0,0 +1,51 @@
|
||||
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
|
||||
# Spack Project Developers. See the top-level COPYRIGHT file for details.
|
||||
#
|
||||
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
||||
|
||||
from spack import *
|
||||
|
||||
|
||||
class Coin3d(AutotoolsPackage):
|
||||
"""Coin is an OpenGL-based, 3D graphics library that has its roots in the
|
||||
Open Inventor 2.1 API, which Coin still is compatible with."""
|
||||
|
||||
homepage = "https://github.com/coin3d/coin"
|
||||
url = "https://github.com/coin3d/coin/archive/Coin-4.0.0.tar.gz"
|
||||
|
||||
version('3.1.0', sha256='70dd5ef39406e1d9e05eeadd54a5b51884a143e127530876a97744ca54173dc3')
|
||||
version('3.0.0', sha256='d5c2eb0ecaa5c83d93daf0e9e275e58a6a8dfadc74c873d51b0c939011f81bfa')
|
||||
version('2.0.0', sha256='6d26435aa962d085b7accd306a0b478069a7de1bc5ca24e22344971852dd097c')
|
||||
|
||||
depends_on('boost@1.45.0:', type='build')
|
||||
depends_on('doxygen', when='+html', type='build')
|
||||
depends_on('perl', when='+html', type='build')
|
||||
depends_on('openglu', type='link')
|
||||
depends_on('opengl', type='link')
|
||||
depends_on('libsm', type='link')
|
||||
depends_on('libxext', type='link')
|
||||
depends_on('libice', type='link')
|
||||
depends_on('libuuid', type='link')
|
||||
depends_on('libxcb', type='link')
|
||||
depends_on('libxau', type='link')
|
||||
|
||||
variant('html', default=False, description='Build and install Coin HTML documentation')
|
||||
variant('man', default=False, description='Build and install Coin man pages')
|
||||
|
||||
variant('framework', default=False, description="Do 'UNIX-style' installation on Mac OS X")
|
||||
variant('shared', default=True, description='Build shared library (off: build static library)')
|
||||
variant('debug', default=False, description='Make debug build')
|
||||
variant('symbols', default=False, description='Enable debug symbols')
|
||||
|
||||
def configure_args(self):
|
||||
args = []
|
||||
args += self.enable_or_disable('framework')
|
||||
args += self.enable_or_disable('shared')
|
||||
args += self.enable_or_disable('html')
|
||||
args += self.enable_or_disable('man')
|
||||
args += self.enable_or_disable('symbols')
|
||||
args += self.enable_or_disable('debug')
|
||||
args.append("--with-boost=" + self.spec['boost'].prefix)
|
||||
args.append("--with-boost-libdir=" + self.spec['boost'].prefix.lib)
|
||||
|
||||
return args
|
||||
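The `configure_args` in the coin3d recipe leans on Spack's `enable_or_disable` helper, which maps a boolean variant to the matching `--enable-X`/`--disable-X` autotools flag. A minimal standalone sketch of that mapping (the helper name and behavior follow Spack's `AutotoolsPackage`; this simplified version, taking an explicit variant dict, is for illustration only):

```python
def enable_or_disable(name, variants):
    """Turn a boolean variant into an autotools flag, mirroring the
    spirit of AutotoolsPackage.enable_or_disable (simplified sketch)."""
    prefix = 'enable' if variants[name] else 'disable'
    return ['--{0}-{1}'.format(prefix, name)]


# Roughly what `spack install coin3d+shared~html` would feed configure:
variants = {'shared': True, 'html': False}
args = []
for name in ('shared', 'html'):
    args += enable_or_disable(name, variants)
print(args)  # ['--enable-shared', '--disable-html']
```

The real helper also supports value-to-flag customization and multi-valued variants; this sketch covers only the boolean case used above.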
@@ -68,7 +68,7 @@ class Cp2k(MakefilePackage, CudaPackage):
    variant('lmax',
            description='Maximum supported angular momentum (HFX and others)',
            default='5',
            values=list(HFX_LMAX_RANGE),
            values=map(str, HFX_LMAX_RANGE),
            multi=False)

    depends_on('python', type='build')
@@ -77,6 +77,7 @@ class Cuda(Package):
    depends_on('libxml2', when='@10.1.243:')

    def setup_build_environment(self, env):
        env.set('CUDAHOSTCXX', self.compiler.cxx)
        if self.spec.satisfies('@10.1.243:'):
            libxml2_home = self.spec['libxml2'].prefix
            env.set('LIBXML2HOME', libxml2_home)
@@ -84,6 +85,7 @@ def setup_build_environment(self, env):

    def setup_run_environment(self, env):
        env.set('CUDA_HOME', self.prefix)
        env.set('CUDAHOSTCXX', self.compiler.cxx)

    def install(self, spec, prefix):
        if os.path.exists('/tmp/cuda-installer.log'):
32 var/spack/repos/builtin/packages/dssp/package.py Normal file
@@ -0,0 +1,32 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Dssp(AutotoolsPackage):
    """'mkdssp' utility. (dictionary of protein secondary structure)"""

    homepage = "https://github.com/cmbi/dssp"
    url      = "https://github.com/cmbi/dssp/archive/3.1.4.tar.gz"

    version('3.1.4', sha256='496282b4b5defc55d111190ab9f1b615a9574a2f090e7cf5444521c747b272d4')
    version('2.3.0', sha256='4c95976d86dc64949cb0807fbd58c7bee5393df0001999405863dc90f05846c6')

    depends_on('autoconf', type='build')
    depends_on('automake', type='build')
    depends_on('libtool', type='build')
    depends_on('m4', type='build')
    depends_on('boost@1.48:')

    def configure_args(self):
        args = [
            "--with-boost=%s" % self.spec['boost'].prefix]
        return args

    @run_after('configure')
    def edit(self):
        makefile = FileFilter(join_path(self.stage.source_path, 'Makefile'))
        makefile.filter('.*-Werror .*', ' -Wno-error \\')
@@ -47,3 +47,13 @@ class Fastjet(AutotoolsPackage):
    version('2.3.1', sha256='16c32b420e1aa7d0b6fecddd980ea0f2b7e3c2c66585e06f0eb3142677ab6ccf')
    version('2.3.0', sha256='e452fe4a9716627bcdb726cfb0917f46a7ac31f6006330a6ccc1abc43d9c2d53')
    # older versions use .tar instead of .tar.gz extension, to be added

    variant('shared', default=True, description='Builds a shared version of the library')
    variant('auto-ptr', default=False, description='Use auto_ptr')

    def configure_args(self):
        extra_args = ["--enable-allplugins"]
        extra_args += self.enable_or_disable('shared')
        extra_args += self.enable_or_disable('auto-ptr')

        return extra_args
@@ -1,10 +0,0 @@
--- FFB8.org/util/xvx2gf/XVX.h	2013-03-26 10:09:49.000000000 +0900
+++ FFB8.new/util/xvx2gf/XVX.h	2020-06-16 16:00:06.408500236 +0900
@@ -5,6 +5,7 @@
 #include <algorithm>
 #include <cassert>
 #include <vector>
+#include <stdint.h>

 #define MAX_LEVEL 32
60 var/spack/repos/builtin/packages/ffb/fortran-format.patch Normal file
@@ -0,0 +1,60 @@
diff --git a/util/convert/gfmavs.f b/util/convert/gfmavs.f
index debe3ec..34e6f66 100644
--- a/util/convert/gfmavs.f
+++ b/util/convert/gfmavs.f
@@ -422,7 +422,7 @@ C
 C
 WRITE(IUT6,*) 'GFMAVS: OUTPUTING GRID COORDINATES...'
 DO 1100 IP = 1,NP
- WRITE(IUTAVS,'(I8,P3E13.5)') IP,X(IP),Y(IP),Z(IP)
+ WRITE(IUTAVS,'(I8,1P3E13.5)') IP,X(IP),Y(IP),Z(IP)
 1100 CONTINUE
 C
 WRITE(IUT6,*) 'GFMAVS: OUTPUTING ELEMENT DATA...'
@@ -448,7 +448,7 @@ C
 C
 WRITE(IUT6,*) 'GFMAVS: OUTPUTING VARIABLES DATA...'
 DO 1400 IP = 1,NP
- WRITE(IUTAVS,'(I8,P4E13.5)')
+ WRITE(IUTAVS,'(I8,1P4E13.5)')
 * IP,U(IP),V(IP),W(IP),P(IP)
 1400 CONTINUE
 C
@@ -470,7 +470,7 @@ C
 C
 DO 2100 JB = 1,NPTMP
 IP = LPBOUN(JB,ITYPE)
- WRITE(IUTAVS,'(I8,P3E13.5)')
+ WRITE(IUTAVS,'(I8,1P3E13.5)')
 * IP,X(IP),Y(IP),Z(IP)
 2100 CONTINUE
 C
@@ -498,7 +498,7 @@ C
 C
 DO 2400 JB = 1,NPTMP
 IP = LPBOUN(JB,ITYPE)
- WRITE(IUTAVS,'(I8,P7E13.5)')
+ WRITE(IUTAVS,'(I8,1P7E13.5)')
 * IP,U(IP),V(IP),W(IP),P(IP),F(IP),T(IP),VIS(IP)
 2400 CONTINUE
 C
diff --git a/util/multi/partdx_rcap.f b/util/multi/partdx_rcap.f
index a0962b7..aab1876 100644
--- a/util/multi/partdx_rcap.f
+++ b/util/multi/partdx_rcap.f
@@ -448,11 +448,11 @@ C
 C
 CLOSE(IUTPT)
 C
- 700 FORMAT(A,1X,I)
+ 700 FORMAT(A,1X,I8)
 701 FORMAT(A,1X,A)
- 702 FORMAT(A,1X,I)
- 703 FORMAT(I,1X,A,1X,8(I,1X))
- 704 FORMAT(I,3(1X,F))
+ 702 FORMAT(A,1X,I8)
+ 703 FORMAT(I8,1X,A,1X,8(I8,1X))
+ 704 FORMAT(I8,3(1X,F13.5))
 CC
 CC 700 FORMAT(A,1X,I8)
 CC 701 FORMAT(A,1X,A)
33 var/spack/repos/builtin/packages/ffb/gffv3tr.patch Normal file
@@ -0,0 +1,33 @@
--- FFB8.org/util/tetra/gffv3tr.f	2013-03-26 10:10:51.000000000 +0900
+++ FFB8.new/util/tetra/gffv3tr.f	2020-06-23 10:32:13.673071588 +0900
@@ -832,7 +832,7 @@ C
 LIST2(3)=LEBUF(5,IS2)
 C
 IMATCH=0
- CALL MATCH4(LIST1,LIST2,IMATCH)
+ CALL MATCH3(LIST1,LIST2,IMATCH)
 IF(IMATCH.EQ.1)THEN
 LEBUF(6,IS1)=0
 LEBUF(6,IS2)=0
@@ -859,17 +859,17 @@ C
 RETURN
 END
 CCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCC
- SUBROUTINE MATCH4(LIST1,LIST2,IMATCH)
- INTEGER LIST1(4), LIST2(4)
+ SUBROUTINE MATCH3(LIST1,LIST2,IMATCH)
+ INTEGER LIST1(3), LIST2(3)
 INTEGER IMATCH
 C [INPUT] LIST1,LIST2 :VECTORS TO BE CHECK
 C [OUTPUT] IMATCH :0 NOT MATCH ,1 MATCH
 C
 IMATCH = 0
 C
- DO 100 I=1,4
+ DO 100 I=1,3
 IBUF=1
- DO 200 J=1,4
+ DO 200 J=1,3
 IBUF=IBUF*(LIST1(I)-LIST2(J))
 200 CONTINUE
 IF(IBUF.NE.0)RETURN
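The corrected `MATCH3` above treats two 3-vertex faces as matching only when every entry of `LIST1` cancels against some entry of `LIST2`: for each element the running product `IBUF` of differences must hit zero, i.e. an exact match must exist. A hypothetical Python transcription of that test, for illustration only:

```python
def match3(list1, list2):
    """Return 1 if every entry of list1 occurs in list2, else 0,
    mirroring the product-of-differences test in the Fortran MATCH3."""
    for a in list1:
        ibuf = 1
        for b in list2:
            # A zero factor means `a` found its partner in list2.
            ibuf *= (a - b)
        if ibuf != 0:
            return 0
    return 1


print(match3([1, 2, 3], [3, 1, 2]))  # 1: same triangle, different vertex order
print(match3([1, 2, 3], [3, 1, 4]))  # 0: vertex 2 has no partner
```

The patch shrinks the comparison from 4 entries to 3 because shared tetrahedron faces are triangles; the fourth slot compared by the old `MATCH4` was not part of the face.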
@@ -15,7 +15,10 @@ class Ffb(MakefilePackage):
    version('8.1', sha256='1ad008c909152b6c27668bafbad820da3e6ec3309c7e858ddb785f0a3d6e43ae')

    patch('revocap_refiner.patch')
    patch('fj_compiler.patch', when='%fj')
    patch('revocap_refiner-size_t.patch')
    patch('fortran-format.patch')
    patch('xvx.patch')
    patch('gffv3tr.patch')

    depends_on('mpi')
    depends_on('blas')
@@ -79,7 +82,7 @@ def edit(self, spec, prefix):
        m.write('#!/bin/csh -f\n')
        m.write('setenv LES3DHOME {0}\n'.format(workdir))
        m.write('cd {0}\n'.format(dd_mpi_dir))
        m.write('make lib\n')
        m.write('make lib FCOM={0}\n'.format(spec['mpi'].mpifc))
        os.chmod(makeall, 0o755)

        makeall = join_path('.', 'Makeall.les')
@@ -143,16 +146,16 @@ def edit(self, spec, prefix):
        m.filter(r'LIBS = -lfort -lgf2 -ldd_mpi -lmpi_f77',
                 'LIBS = -lfort -lgf2 -ldd_mpi')

        editfile = join_path('util', 'xvx2gf', 'FILES')
        cxx_fortran_flags = []
        if spec.satisfies('%gcc'):
            editfile = join_path('util', 'xvx2gf', 'FILES')
            m = FileFilter(editfile)
            m.filter(r'LIBS = -lgf2 -lz -lifcore -limf -ldl',
                     'LIBS = -lgf2 -lz -ldl')
            cxx_fortran_flags.append('-lgfortran')
        elif spec.satisfies('%intel'):
            cxx_fortran_flags.extend(['-lifcore', '-limf'])
        elif spec.satisfies('%fj'):
            editfile = join_path('util', 'xvx2gf', 'FILES')
            m = FileFilter(editfile)
            m.filter(r'LIBS = -lgf2 -lz -lifcore -limf -ldl',
                     'LIBS = -lgf2 -lz -ldl -linkfortran')
            cxx_fortran_flags.append('--linkfortran')
        m = FileFilter(editfile)
        m.filter('-lifcore -limf', ' '.join(cxx_fortran_flags))

    def build(self, spec, prefix):
        for m in [join_path('make', 'Makeall'),
@@ -0,0 +1,225 @@
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainer.h b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainer.h
index b5b8525..3e95332 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainer.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainer.h
@@ -38,6 +38,8 @@

 namespace kmb{

+using std::size_t;
+
 class Matrix4x4;

 class Point3DContainer
diff --git a/spack-src/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Bucket.h b/spack-src/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Bucket.h
index cd1d5df..a2fff46 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Bucket.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Bucket.h
@@ -194,7 +194,7 @@
 return itClone;
 };

- size_t getCount(int i,int j,int k) const{
+ std::size_t getCount(int i,int j,int k) const{
 return regions.count(getIndex(i,j,k));
 };


diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_BLArray.h b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_BLArray.h
index 4761262..b9a0c8c 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_BLArray.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_BLArray.h
@@ -30,6 +30,8 @@

 namespace kmb{

+using std::size_t;
+
 class BLArrayIndex;

 class BLArrayBase
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Classification.h b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Classification.h
index 81b3b9f..51c06cc 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Classification.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Classification.h
@@ -39,6 +39,8 @@

 namespace kmb{

+using std::size_t;
+
 typedef int classificationType;

 template<typename T>
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point2DContainer.h b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point2DContainer.h
index 0f7e548..39278e6 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point2DContainer.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point2DContainer.h
@@ -38,6 +38,8 @@

 namespace kmb{

+using std::size_t;
+
 class Point2DContainer
 {
 protected:
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerArray.h b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerArray.h
index c4c0dc9..8852677 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerArray.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerArray.h
@@ -17,6 +17,8 @@

 namespace kmb{

+using std::size_t;
+
 class Point3DContainerArray : public Point3DContainer
 {
 private:
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerMArray.h b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerMArray.h
index 51bb315..8bb557f 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerMArray.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerMArray.h
@@ -19,6 +19,8 @@

 namespace kmb{

+using std::size_t;
+
 class Point3DContainerMArray : public Point3DContainer
 {
 private:
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerVect.h b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerVect.h
index 1254a5c..57b1763 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerVect.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerVect.h
@@ -31,6 +31,8 @@

 namespace kmb{

+using std::size_t;
+
 class Point3DContainerVect : public Point3DContainer
 {
 private:
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DOctree.h b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DOctree.h
index 778e253..b743069 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DOctree.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DOctree.h
@@ -32,6 +32,8 @@

 namespace kmb{

+using std::size_t;
+
 class Point3DOctree
 {
 private:
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DOctree.cpp b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DOctree.cpp
index 6a8bb38..d29e6fa 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DOctree.cpp
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DOctree.cpp
@@ -85,10 +85,10 @@ kmb::Point3DOctree::getLocalCount(void) const
 return count;
 }

-size_t
+std::size_t
 kmb::Point3DOctree::getCount(void) const
 {
- size_t sum = 0;
+ std::size_t sum = 0;
 if( children != NULL ){
 for(int i=0;i<8;++i){
 sum += children[i]->getCount();
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point2DContainerMap.cpp b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point2DContainerMap.cpp
index 6838486..2d4874c 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point2DContainerMap.cpp
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point2DContainerMap.cpp
@@ -33,6 +33,8 @@
 #pragma warning(disable:4100)
 #endif

+using std::size_t;
+


 const char* kmb::Point2DContainerMap::CONTAINER_TYPE = "stl::map<id,Point2D*>";
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerArray.cpp b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerArray.cpp
index 02362fc..b9bf597 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerArray.cpp
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerArray.cpp
@@ -19,6 +19,8 @@
 #pragma warning(disable:4100)
 #endif

+using std::size_t;
+
 const char* kmb::Point3DContainerArray::CONTAINER_TYPE = "double_array";

 kmb::Point3DContainerArray::Point3DContainerArray(void)
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerMap.cpp b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerMap.cpp
index 9619325..24997b6 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerMap.cpp
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerMap.cpp
@@ -33,6 +33,7 @@
 #pragma warning(disable:4100)
 #endif

+using std::size_t;


 const char* kmb::Point3DContainerMap::CONTAINER_TYPE = "stl::map<id,Point3D*>";
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerVect.cpp b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerVect.cpp
index 06add56..ec55033 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerVect.cpp
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Geometry/kmb_Point3DContainerVect.cpp
@@ -32,6 +32,8 @@
 #pragma warning(disable:4100)
 #endif

+using std::size_t;
+
 const char* kmb::Point3DContainerVect::CONTAINER_TYPE = "stl::vector<Point3D*>";

 kmb::Point3DContainerVect::Point3DContainerVect(void)
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/MeshDB/kmbElementContainerNArray.h b/lib/src/REVOCAP_Refiner-0.4.3/MeshDB/kmbElementContainerNArray.h
index f3335d0..138aa5e 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/MeshDB/kmbElementContainerNArray.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/MeshDB/kmbElementContainerNArray.h
@@ -27,6 +27,8 @@

 namespace kmb{

+using std::size_t;
+
 class ElementContainerNArray : public ElementContainer
 {
 protected:
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/MeshDB/kmbNodeNeighborPtrInfo.h b/lib/src/REVOCAP_Refiner-0.4.3/MeshDB/kmbNodeNeighborPtrInfo.h
index 70cdc5e..6a7a4b7 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/MeshDB/kmbNodeNeighborPtrInfo.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/MeshDB/kmbNodeNeighborPtrInfo.h
@@ -24,6 +24,8 @@

 namespace kmb{

+using std::size_t;
+
 class Element;
 class ElementContainer;
 class MeshData;
diff --git a/lib/src/REVOCAP_Refiner-0.4.3/Shape/kmbFittingToSurface.h b/lib/src/REVOCAP_Refiner-0.4.3/Shape/kmbFittingToSurface.h
index 3a6d7dd..5910b77 100644
--- a/lib/src/REVOCAP_Refiner-0.4.3/Shape/kmbFittingToSurface.h
+++ b/lib/src/REVOCAP_Refiner-0.4.3/Shape/kmbFittingToSurface.h
@@ -19,6 +19,8 @@

 namespace kmb{

+using std::size_t;
+
 class MeshData;
 class Surface3D;
 template <typename> class Vector2WithIntBindings;
37 var/spack/repos/builtin/packages/ffb/xvx.patch Normal file
@@ -0,0 +1,37 @@
--- FFB8.org/util/xvx2gf/XVX.h	2013-03-26 10:09:49.000000000 +0900
+++ FFB8.new/util/xvx2gf/XVX.h	2020-06-24 14:27:32.000000000 +0900
@@ -5,6 +5,7 @@
 #include <algorithm>
 #include <cassert>
 #include <vector>
+#include <inttypes.h>

 #define MAX_LEVEL 32

@@ -182,25 +183,7 @@ public:
 void *_ptr;
 unsigned char volptr[3];// (Shift-JIS comment, mangled in transit)

- #if _WIN64 // (Shift-JIS comment, mangled in transit)
- typedef unsigned long long int PtrAsInt;
- #elif _WIN32
- typedef unsigned int PtrAsInt;
- #elif __APPLE__
- #if x86_64
- typedef unsigned long long int PtrAsInt;
- #else
- typedef unsigned int PtrAsInt;
- #endif
- #elif __linux__
- #if __x86_64__
- typedef unsigned long long int PtrAsInt;
- #else
- typedef unsigned int PtrAsInt;
- #endif
- #else
- #error you must typedef 'PtrAsInt'
- #endif
+ typedef size_t PtrAsInt;
 }ALIGNMENT;
 #ifdef __GNUC__
 #pragma pack(pop)
@@ -30,11 +30,3 @@ class Filebench(AutotoolsPackage):
    depends_on('m4', type='build')
    depends_on('flex', type='build')
    depends_on('bison', type='build')

    def autoreconf(self, spec, prefix):
        sh = which('sh')
        sh('libtoolize')
        sh('aclocal')
        sh('autoheader')
        sh('automake', '--add-missing')
        sh('autoconf')
85 var/spack/repos/builtin/packages/fjcontrib/package.py Normal file
@@ -0,0 +1,85 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import inspect
from spack import *


class Fjcontrib(AutotoolsPackage):
    """3rd party extensions of FastJet"""

    homepage = "https://fastjet.hepforge.org/contrib/"
    url      = "http://fastjet.hepforge.org/contrib/downloads/fjcontrib-1.044.tar.gz"

    version('1.044', sha256='de3f45c2c1bed6d7567483e4a774575a504de8ddc214678bac7f64e9d2e7e7a7')
    version('1.043', sha256='ef0f586b19ffd12f392b7facc890a73d31fc11b9f5bb727cf3743d6eb59e9993')
    version('1.042', sha256='5b052e93a371c557557fa4a293cca4b08f88ccfb6c43a4df15b2a9f38c6d8831')
    version('1.041', sha256='31e244728de7787a582479cbed606a1c2fe5fcf3ad3ba1b0134742de5107510c')
    version('1.040', sha256='8fcd261f4dec41d3fcf2d550545d7b6e8e5f54e1604615489e7a2c0e45888663')
    version('1.039', sha256='a3b88de5e43edb13cde37ed5821aaed6562eec23a774092547f397b6e0b3f088')
    version('1.038', sha256='e90d079706a0db516cfc1170b4c34aa197320d019d71db1f726764f293561b15')
    version('1.037', sha256='9d69a4a91ef60557e9cebe14952db64f35ea75c51c14e57c1d998bbed0ad6342')
    version('1.036', sha256='ecb93ae43e76dbd485b50d37c234aa039fe455d26c273247bc3c93b273dc24f2')
    version('1.035', sha256='bc33e5c8132258213cf569789cbcfc838dbed1c242bd31decc0be1f099d405b6')
    version('1.034', sha256='8aa9ff609b678c363d682afde91765f7f1145cdf203c18c3fa15c3e1a9a5bcfa')
    version('1.033', sha256='d964370b3db0703e223b42a793b07eafbb3658db1266956fd86e69ea5abab7ea')
    version('1.032', sha256='125bae5ab04384596c4bcde1c84a3189034f1e94fd99b6c1025cfac1c1795831')
    version('1.031', sha256='edb692e8e4e4bbe5401d8891ccf81bd6d26fd7c41d1ec47bd40f2c0e39be9545')
    version('1.030', sha256='e4c47bf808f2c078d6fe332ccff00716fdec618e10b7adbf9344e44c8364dba9')
    version('1.029', sha256='9147f533bf806dbef5c80f38a26f2002f4c8dfe5c2e01ce1c8edfef27ed527ed')
    version('1.028', sha256='cbc24d6e28b03f3c198103c7ae449f401f2e3b7c8a58d241e87f6a8edd30effa')
    version('1.027', sha256='8d983c1622d4163459a67d7208961061699e52e417893c045357cd0d2d6048c4')
    version('1.026', sha256='6bfe538f17fb3fbbf2c0010640ed6eeedfe4317e6b56365e727d44639f76e3d7')
    version('1.025', sha256='504ff45770e8160f1d3b842ea5962c84faeb9acd2821195b08c7e21a84402dcc')
    version('1.024', sha256='be13d4b6dfc3c6a157c521247576d01abaa90413af07a5f13b1d0e3622225116')
    version('1.023', sha256='ca8f5520ab8c934ea4458016f66d4d7b08e25e491f8fdb5393572cd411ddde03')
    version('1.022', sha256='f505c60a87283f380f083736ac411bd865750e81e59876583a9d5c4085d927ce')
    version('1.021', sha256='f4bcd9ca39399a3b7719358e757462ecc7937f73a6b625cbb164173a26847d33')
    version('1.020', sha256='1ec2041cb5e7f050e20823a31111c7e61d24c4d7ee546c9714861b16eca83a04')
    version('1.019', sha256='adec9e1d3c48fec8d6786b64d9c4e95e79129a156f0b0ed3df7e98d57882ffb2')
    version('1.018', sha256='fe9d9661cefba048b19002abec7e75385ed73a8a3f45a29718e110beab142c2c')
    version('1.017', sha256='6f711b8b7f63a9a1bdf83cc1df6f7785bd3394e96dddba12186dfb5de352362c')
    version('1.016', sha256='9301027a71098ef75c37742a2f60631b2e39eb4d6b73b18db48d31e6aed830a0')
    version('1.015', sha256='4b9d56d2b6ae56a894b64920ac9ef59013b1d0a54454e716dd98e724f9ea82c0')
    version('1.014', sha256='c8de7a0cc19be7684dcfbc3101f9cd4c9426daa3009726acd5a5cbe5e699544b')
    version('1.013', sha256='c2b4838a41431cd1c28d3a836f10c00bb84eb6eb2e6586a28badab66518107a6')
    version('1.012', sha256='9215ee5b6465fa64296a0139e727a2eb61255d50faccafaa5b241e408ae54821')
    version('1.011', sha256='7e4e62755d5905e9dc4a6ef29728c03da538cbbff381eed32a35f06934304b1d')
    version('1.010', sha256='1ae74ef44e3d8ce96fd7eb61571819e8891ff708fd468a71780e693c2081176c')
    version('1.009', sha256='dc12cce6f0468b4fab9ec4d9fbb2c3d76e942b3b6965cf609963f65c69c08d2f')
    version('1.008', sha256='8cc98d837b0579b605a4731fb69882084948e1f1d2ee20af5c8adade2434e4ed')
    version('1.007', sha256='70b9d57e5c690d9af8b5be96eefeea9f8c4c4d1a7f000fe1dac30e5aae36077f')
    version('1.006', sha256='e2b6d2c5a666eb4db6bef89ea176df4bc1cdec316f8d18594ab54eb0ce8dcbb6')
    version('1.005', sha256='265e682a05c1f5e5cf1560cc4efa99e07b211cb2add5f3a09b5be4363ab9cc7f')
    version('1.004', sha256='b397e6f822a1d539ac1f63dc8c0709a7a00eed25f4c011114be71f0b38f8f514')
    version('1.003', sha256='79b2d42b1b7a99686fa2e0d95ad39050fd974fdf5ea5066271185ae308299b72')
    version('1.002', sha256='b48d2cc0a9c2fcfe3bb7faee3217a50191be4150a62ed2f6419fd0025d2cec92')
    version('1.001', sha256='6030dfee59040d2ada4ee582254e60b9ecf4b62598129e7b0a114672cf267491')
    version('1.000', sha256='95fa36ae48f03cb941d632b0a537995fb7148a6bd028c4978fab8b7b04332c3b')
    version('0.001', sha256='51f24ad55e28fb1f9d698270602e5077c920fcf58d8ccfd274efbe829d7dd821')
    version('0.000', sha256='9486b11201e6b6e181b8a3abecd929403ae9aa67de0eb8b7353fb82ab4b89f41')

    depends_on('fastjet')

    build_targets = ['all', 'fragile-shared']
    install_targets = ['install', 'fragile-shared-install']

    def configure_args(self):
        args = ['--fastjet-config=' +
                self.spec['fastjet'].prefix.bin +
                '/fastjet-config',
                "CXXFLAGS=-O3 -Wall -g " +
                self.compiler.cxx_pic_flag]
        return args

    def build(self, spec, prefix):
        with working_dir(self.build_directory):
            for target in self.build_targets:
                inspect.getmodule(self).make(target)

    def install(self, spec, prefix):
        with working_dir(self.build_directory):
            for target in self.install_targets:
                inspect.getmodule(self).make(target)
@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *
import os


class FujitsuMpi(Package):
@@ -28,6 +29,24 @@ def install(self, spec, prefix):
        raise InstallError(
            'Fujitsu MPI is not installable; it is vendor supplied')

    @property
    def headers(self):
        hdrs = find_headers('mpi', self.prefix.include, recursive=True)
        hdrs.directories = os.path.dirname(hdrs[0])
        return hdrs or None

    @property
    def libs(self):
        query_parameters = self.spec.last_query.extra_parameters
        libraries = ['libmpi']

        if 'cxx' in query_parameters:
            libraries = ['libmpi_cxx'] + libraries

        return find_libraries(
            libraries, root=self.prefix, shared=True, recursive=True
        )

    def setup_dependent_package(self, module, dependent_spec):
        self.spec.mpicc = self.prefix.bin.mpifcc
        self.spec.mpicxx = self.prefix.bin.mpiFCC
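The `libs` property above honors dependent queries such as `spec['mpi:cxx'].libs` by prepending the C++ binding library before the base library. Pulled out as a standalone sketch (library names as in the recipe; the query mechanism itself belongs to Spack and is replaced here by a plain parameter list):

```python
def mpi_library_names(query_parameters):
    """Select library names the way the fujitsu-mpi libs property does:
    the base C library always, the C++ binding prepended on request."""
    libraries = ['libmpi']
    if 'cxx' in query_parameters:
        libraries = ['libmpi_cxx'] + libraries
    return libraries


print(mpi_library_names([]))       # ['libmpi']
print(mpi_library_names(['cxx']))  # ['libmpi_cxx', 'libmpi']
```

Prepending (rather than appending) matters for link order: the C++ binding depends on symbols in the base library, so it must come first on the link line.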
@@ -16,6 +16,7 @@ class Gaudi(CMakePackage):
    version('master', branch='master')
    # major cmake config overhaul already in use by some
    version('develop', git='https://gitlab.cern.ch/clemenci/Gaudi.git', branch='cmake-modernisation')
    version('33.2', sha256='26aaf9c4ff237a60ec79af9bd18ad249fc91c16e297ba77e28e4a256123db6e5')
    version('33.1', sha256='7eb6b2af64aeb965228d4b6ea66c7f9f57f832f93d5b8ad55c9105235af5b042')
    version('33.0', sha256='76a967c41f579acc432593d498875dd4dc1f8afd5061e692741a355a9cf233c8')
    version('32.2', sha256='e9ef3eb57fd9ac7b9d5647e278a84b2e6263f29f0b14dbe1321667d44d969d2e')
@@ -32,17 +33,20 @@ class Gaudi(CMakePackage):
           description='Build with Intel VTune profiler support')

    # only build subdirectory GaudiExamples when +optional
    patch("build_testing.patch", when="@:33.1")
    patch("build_testing.patch", when="@:33.2")
    # fix for the new cmake config, should be merged in branch
    patch('python2.patch', when="@develop")
    # fixes for the cmake config which could not find newer boost versions
    patch("link_target_fixes.patch", when="@33.0:33.1")
    patch("link_target_fixes.patch", when="@33.0:33.2")
    patch("link_target_fixes32.patch", when="@:32.2")

    # These dependencies are needed for a minimal Gaudi build
    depends_on('aida')
    depends_on('boost@1.67.0: +python')
    depends_on('clhep')
    depends_on('cmake', type='build')
    depends_on('cppgsl')
    depends_on('fmt', when='@33.2:')
    depends_on('intel-tbb')
    depends_on('libuuid')
    # some bugs with python 3.8
@@ -60,8 +64,6 @@ class Gaudi(CMakePackage):
    depends_on('py-nose', when="@develop", type=('build', 'run'))

    # Adding these dependencies triggers the build of most optional components
    depends_on('aida', when='+optional')
    depends_on('clhep', when='+optional')
    depends_on('cppgsl', when='+optional')
    depends_on('cppunit', when='+optional')
    depends_on('doxygen +graphviz', when='+docs')
@@ -87,16 +89,6 @@ class Gaudi(CMakePackage):
    def cmake_args(self):
        args = [
            self.define_from_variant("BUILD_TESTING", "optional"),
            self.define_from_variant("GAUDI_USE_AIDA", "optional"),
            self.define_from_variant("GAUDI_USE_XERCESC", "optional"),
            self.define_from_variant("GAUDI_USE_CLHEP", "optional"),
            self.define_from_variant("GAUDI_USE_HEPPDT", "optional"),
            self.define_from_variant("GAUDI_USE_CPPUNIT", "optional"),
            self.define_from_variant("GAUDI_USE_UNWIND", "optional"),
            self.define_from_variant("GAUDI_USE_GPERFTOOLS", "optional"),
            self.define_from_variant("GAUDI_USE_DOXYGEN", "docs"),
            self.define_from_variant("GAUDI_USE_INTELAMPLIFIER", "optional"),
            self.define_from_variant("GAUDI_USE_JEMALLOC", "optional"),
            # this is not really used in spack builds, but needs to be set
            "-DHOST_BINARY_TAG=x86_64-linux-gcc9-opt",
        ]
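The gaudi `cmake_args` above is built almost entirely from `define_from_variant`, a Spack `CMakePackage` helper that derives a `-D` cache entry from a variant's value. A simplified boolean-only sketch of that mapping (the real helper reads the variant from the package's spec and also handles string and multi-valued variants; the exact cache-entry formatting here is an assumption for illustration):

```python
def define_from_variant(variants, cmake_var, variant_name):
    """Boolean variant -> ON/OFF CMake cache entry, a simplified
    sketch of CMakePackage.define_from_variant."""
    value = 'ON' if variants[variant_name] else 'OFF'
    return '-D{0}:BOOL={1}'.format(cmake_var, value)


# Roughly what `spack install gaudi~optional+docs` would produce:
variants = {'optional': False, 'docs': True}
print(define_from_variant(variants, 'BUILD_TESTING', 'optional'))
print(define_from_variant(variants, 'GAUDI_USE_DOXYGEN', 'docs'))
```

The payoff of the helper is that every `GAUDI_USE_*` switch tracks its variant automatically, so the recipe never has to hand-maintain parallel if/else blocks per option.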
@@ -335,15 +335,9 @@ def configure_args(self):

        # Binutils
        if spec.satisfies('+binutils'):
            stage1_ldflags = str(self.rpath_args)
            boot_ldflags = stage1_ldflags + ' -static-libstdc++ -static-libgcc'
            if '%gcc' in spec:
                stage1_ldflags = boot_ldflags
            binutils = spec['binutils'].prefix.bin
            options.extend([
                '--with-sysroot=/',
                '--with-stage1-ldflags=' + stage1_ldflags,
                '--with-boot-ldflags=' + boot_ldflags,
                '--with-gnu-ld',
                '--with-ld=' + binutils.ld,
                '--with-gnu-as',
@@ -376,6 +370,14 @@ def configure_args(self):
                '--with-libiconv-prefix={0}'.format(spec['iconv'].prefix)
            ])

        # enable appropriate bootstrapping flags
        stage1_ldflags = str(self.rpath_args)
        boot_ldflags = stage1_ldflags + ' -static-libstdc++ -static-libgcc'
        if '%gcc' in spec:
            stage1_ldflags = boot_ldflags
        options.append('--with-stage1-ldflags=' + stage1_ldflags)
        options.append('--with-boot-ldflags=' + boot_ldflags)

        return options

    # run configure/make/make(install) for the nvptx-none target
@@ -18,7 +18,7 @@ class Gdal(AutotoolsPackage):
     """

     homepage   = "https://www.gdal.org/"
-    url        = "https://download.osgeo.org/gdal/3.0.4/gdal-3.0.4.tar.xz"
+    url        = "https://download.osgeo.org/gdal/3.1.1/gdal-3.1.1.tar.xz"
     list_url   = "https://download.osgeo.org/gdal/"
     list_depth = 1

@@ -29,6 +29,7 @@ class Gdal(AutotoolsPackage):
         'osgeo.gdal_array', 'osgeo.gdalconst'
     ]

+    version('3.1.1', sha256='97154a606339a6c1d87c80fb354d7456fe49828b2ef9a3bc9ed91771a03d2a04')
     version('3.1.0', sha256='e754a22242ccbec731aacdb2333b567d4c95b9b02d3ba1ea12f70508d244fcda')
     version('3.0.4', sha256='5569a4daa1abcbba47a9d535172fc335194d9214fdb96cd0f139bb57329ae277')
     version('3.0.3', sha256='e20add5802265159366f197a8bb354899e1693eab8dbba2208de14a457566109')
@@ -88,7 +88,7 @@ class Geant4(CMakePackage):
     depends_on("libx11", when='+x11')
     depends_on("libxmu", when='+x11')
     depends_on("motif", when='+motif')
-    depends_on("qt@5:", when="+qt")
+    depends_on("qt@5: +opengl", when="+qt")

     # As released, 10.03.03 has issues with respect to using external
     # CLHEP.
42  var/spack/repos/builtin/packages/graphblast/package.py  Normal file
@@ -0,0 +1,42 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Graphblast(MakefilePackage, CudaPackage):
    """High-Performance Linear Algebra-based Graph Primitives on GPUs"""

    homepage = "https://github.com/gunrock/graphblast"
    git      = "https://github.com/gunrock/graphblast.git"

    version('master', submodules=True)
    version('2020-05-07', submodules=True, commit='1a052558a71f2cd67f5d6fe9db3b274c303ef8f6', preferred=True)

    variant('cuda', default=True, description="Build with Cuda support")

    depends_on('boost +program_options')

    # This package is confirmed to compile with:
    # gcc@:5.4.0,7.5.0 , boost@1.58.0:1.60.0 , cuda@9:

    # TODO: the package doesn't compile as CMakePackage;
    # once that is fixed it should be converted to a CMakePackage type.

    conflicts('cuda_arch=none', when='+cuda',
              msg='Must specify CUDA compute capabilities of your GPU. \
See "spack info graphblast"')

    def install(self, spec, prefix):
        install_tree(self.build_directory, self.prefix)

    def patch(self):
        cuda_arch_list = self.spec.variants['cuda_arch'].value
        arches = 'ARCH = '
        for i in cuda_arch_list:
            arches = arches + \
                ' -gencode arch=compute_{0},code=compute_{0}'.format(i)
        makefile = FileFilter('common.mk')
        makefile.filter(r'^ARCH =.*', arches)
23  var/spack/repos/builtin/packages/half/f16fix.patch  Normal file
@@ -0,0 +1,23 @@
--- a/include/half.hpp.orig	2020-06-26 07:46:50.861595808 -0400
+++ b/include/half.hpp	2020-06-26 07:47:31.353286730 -0400
@@ -266,9 +266,6 @@
 #if HALF_ENABLE_CPP11_HASH
 	#include <functional>
 #endif
-#if HALF_ENABLE_F16C_INTRINSICS
-	#include <immintrin.h>
-#endif


 #ifndef HALF_ENABLE_F16C_INTRINSICS
@@ -281,6 +278,10 @@
 	#define HALF_ENABLE_F16C_INTRINSICS __F16C__
 #endif

+#if HALF_ENABLE_F16C_INTRINSICS
+	#include <immintrin.h>
+#endif
+
 #ifdef HALF_DOXYGEN_ONLY
 /// Type for internal floating-point computations.
 /// This can be predefined to a built-in floating-point type (`float`, `double` or `long double`) to override the internal
@@ -23,6 +23,8 @@ class Half(Package):

     version('2.1.0', sha256='ad1788afe0300fa2b02b0d1df128d857f021f92ccf7c8bddd07812685fa07a25')

+    patch('f16fix.patch')
+
     def install(self, spec, prefix):
         mkdirp(prefix.include)
         install_tree('include', prefix.include)
58  var/spack/repos/builtin/packages/hepmcanalysis/lcg.patch  Normal file
@@ -0,0 +1,58 @@
--- spack-src/examples/hepmcreader/reader.cc.orig	2013-10-31 11:11:16.000000000 +0100
+++ spack-src/examples/hepmcreader/reader.cc	2013-10-31 11:11:59.000000000 +0100
@@ -172,7 +172,7 @@
   for (vector<baseAnalysis*>::const_iterator i(analysis.begin()); i != analysis.end(); ++i) {
     (*i)->Finalize(f);
   }
-
+f->Close();
   // clean up analysis modules and configuration
   for (vector<baseAnalysis*>::const_iterator i(analysis.begin()); i != analysis.end(); ++i) {
     delete (*i);
--- spack-src/Makefile.orig	2013-09-03 16:52:14.000000000 +0200
+++ spack-src/Makefile	2013-09-03 16:54:04.000000000 +0200
@@ -28,7 +28,7 @@
 	@mkdir -p $(LIBDIR)
 	$(CXX) $(CXXFLAGS) $(INCLUDES) \
 	-shared -o $(LIBDIR)/libHepMCAnalysis.so $(OBJECTS) \
-	$(HepMClib) $(HepPDTlib) $(FastJetlib) \
+	$(HepMClib) $(HepPDTlib) $(FastJetlib) $(CLHEPlib) \
 	$(ROOTGLIBS) $(LINK_LIBS)

--- spack-src/src/baseAnalysis.cc	2013-03-04 19:19:55.000000000 +0100
+++ spack-src/src/baseAnalysis.cc	2014-10-03 07:36:47.000000000 +0200
@@ -281,8 +281,8 @@
   double pe = (*p)->momentum().e();

   // remove garbage
-  if ( isnan( px ) || isnan( py ) || isnan( pz ) || isnan( pe ) ||
-       isinf( px ) || isinf( py ) || isinf( pz ) || isinf( pe ) ) {
+  if ( std::isnan( px ) || std::isnan( py ) || std::isnan( pz ) || std::isnan( pe ) ||
+       std::isinf( px ) || std::isinf( py ) || std::isinf( pz ) || std::isinf( pe ) ) {
     cout << "baseAnalysis: NaN (Not A Number) found in the event record! Particle will be skipped." << endl;
     ++m_NanNum;
     continue;
--- spack-src/config.mk.orig	2020-06-25 17:05:30.142357100 +0200
+++ spack-src/config.mk	2020-06-25 17:06:24.219319861 +0200
@@ -10,16 +10,16 @@
 #	-I$(HepPDTdir)/include -I$(ROOTSYS)/include
 INCLUDES = $(INCLUDEH) $(INCLUDEF)

-ROOTGLIBS  = `root-config --glibs`
-ROOTCFLAGS = `root-config --cflags` -Wno-long-long
+ROOTGLIBS  = `$(ROOTSYS)/bin/root-config --glibs`
+ROOTCFLAGS = `$(ROOTSYS)/bin/root-config --cflags` -Wno-long-long

 #CLHEP = -L$(CLHEPdir)/lib -lCLHEP # causes problems in HepMCAnalysis_i interface!
-CLHEPlib =
+CLHEPlib = -L$(CLHEPdir)/lib -lCLHEP

 # common libraries
 HepPDTlib =
 #HepPDTlib = -L$(HepPDTdir)/lib -lHepPDT -lHepPID
-HepMClib = $(HepMCdir)/lib/libHepMC.so $(HepMCdir)/lib/libHepMCfio.a
+HepMClib = -L$(HepMCdir)/lib -lHepMC
 #FastJetlib = -L$(FastJetdir)/lib -lfastjet -lSISConePlugin -lCDFConesPlugin -lsiscone -lsiscone_spherical
 FastJetlib = -L$(FastJetdir)/lib -lfastjet -lfastjetplugins -lsiscone -lsiscone_spherical
47  var/spack/repos/builtin/packages/hepmcanalysis/package.py  Normal file
@@ -0,0 +1,47 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Hepmcanalysis(MakefilePackage):
    """The HepMCAnalysis Tool is a tool for generator
    validation and comparisons."""

    homepage = "https://hepmcanalysistool.desy.de/"
    url      = "https://hepmcanalysistool.desy.de/releases/HepMCAnalysis-00-03-04-13.tar.gz"

    version('3.4.13', sha256='be9937c6de493a5671258919493b0caa0cecca77853a2075f5cecce1071e0029')

    depends_on('hepmc')
    depends_on('fastjet')
    depends_on('root')
    depends_on('clhep')

    variant('cxxstd',
            default='11',
            values=('11', '14', '17'),
            multi=False,
            description='Use the specified C++ standard when building.')

    patch('lcg.patch')

    def edit(self, spec, prefix):
        filter_file(r"CXXFLAGS(.*)", r"CXXFLAGS\1 -std=c++" +
                    self.spec.variants['cxxstd'].value, "config.mk")

    def setup_build_environment(self, env):
        env.set("HepMCdir", self.spec['hepmc'].prefix)
        env.set("FastJetdir", self.spec['fastjet'].prefix)
        env.set("CLHEPdir", self.spec['clhep'].prefix)

    def url_for_version(self, version):
        parts = [int(x) for x in str(version).split('.')]
        root = "https://hepmcanalysistool.desy.de/releases/HepMCAnalysis-00-"
        return root + "{0:02d}-{1:02d}-{2:02d}.tar.gz".format(*parts)

    def install(self, spec, prefix):
        install_tree('lib', prefix.lib)
        install_tree('include', prefix.include)
@@ -65,6 +65,10 @@ class Hpx(CMakePackage, CudaPackage):
     depends_on('boost@1.55.0:', when='@:1.1.0')
     depends_on('hwloc@1.6:', when='@:1.1.0')

+    # boost 1.73.0 build problem with HPX 1.4.0 and 1.4.1
+    # https://github.com/STEllAR-GROUP/hpx/issues/4728#issuecomment-640685308
+    depends_on('boost@:1.72.0', when='@:1.4')
+
     # CXX Standard
     depends_on('boost cxxstd=11', when='cxxstd=11')
     depends_on('boost cxxstd=14', when='cxxstd=14')
@@ -6,12 +6,17 @@
 from spack import *


-class Xssp(AutotoolsPackage):
-    """The source code for building the mkdssp, mkhssp, hsspconv, and hsspsoap
-    programs is bundled in the xssp project"""
+class Hssp(AutotoolsPackage):
+    """The source code for building the mkhssp and hsspconv programs is bundled
+    in the hssp project.
+
+    The mkhssp executable creates stockholm files with hssp annotations in
+    them. The hsspconv executable converts stockholm to the original hssp
+    format.
+    """

-    homepage = "https://github.com/cmbi/xssp"
-    url      = "https://github.com/cmbi/xssp/archive/3.0.10.tar.gz"
+    homepage = "https://github.com/cmbi/hssp"
+    url      = "https://github.com/cmbi/hssp/archive/3.0.10.tar.gz"

     version('3.0.10', sha256='b475d6fa62098df0e54c8dbdaa0b32de93bf5a393335f73f9b5a7e95f3090d2a')
     version('3.0.9', sha256='42a9a93c48d22478212dcaf6ceb3feb64443e4cb2e8cccdd402b47a595d16658')
@@ -65,7 +65,7 @@ def setup_dependent_build_environment(self, *args):
         })

     def setup_run_environment(self, env):
-        super(self, IntelMpi).setup_run_environment(env)
+        super(IntelMpi, self).setup_run_environment(env)

         for name, value in self.mpi_compiler_wrappers.items():
             env.set(name, value)
@@ -227,7 +227,7 @@ def setup_dependent_build_environment(self, *args):
         })

     def setup_run_environment(self, env):
-        super(self, IntelParallelStudio).setup_run_environment(env)
+        super(IntelParallelStudio, self).setup_run_environment(env)

         for name, value in self.mpi_compiler_wrappers.items():
             env.set(name, value)
@@ -13,6 +13,8 @@ class Intel(IntelPackage):

     # Same as in ../intel-parallel-studio/package.py, Composer Edition,
     # but the version numbering in Spack differs.
     version('20.0.1', sha256='26c7e7da87b8a83adfd408b2a354d872be97736abed837364c1bf10f4469b01e', url='http://registrationcenter-download.intel.com/akdlm/irc_nas/tec/16530/parallel_studio_xe_2020_update1_composer_edition.tgz')
     version('20.0.0', sha256='9168045466139b8e280f50f0606b9930ffc720bbc60bc76f5576829ac15757ae', url='http://registrationcenter-download.intel.com/akdlm/irc_nas/tec/16229/parallel_studio_xe_2020_composer_edition.tgz')
     version('19.1.1', sha256='26c7e7da87b8a83adfd408b2a354d872be97736abed837364c1bf10f4469b01e', url='http://registrationcenter-download.intel.com/akdlm/irc_nas/tec/16530/parallel_studio_xe_2020_update1_composer_edition.tgz')
     version('19.1.0', sha256='9168045466139b8e280f50f0606b9930ffc720bbc60bc76f5576829ac15757ae', url='http://registrationcenter-download.intel.com/akdlm/irc_nas/tec/16229/parallel_studio_xe_2020_composer_edition.tgz')
     version('19.0.5', sha256='e8c8e4b9b46826a02c49325c370c79f896858611bf33ddb7fb204614838ad56c', url='http://registrationcenter-download.intel.com/akdlm/irc_nas/tec/15813/parallel_studio_xe_2019_update5_composer_edition.tgz')
@@ -11,10 +11,10 @@ class Lapackpp(CMakePackage):
     of Tennessee)"""

     homepage = "https://bitbucket.org/icl/lapackpp"
-    hg       = "https://bitbucket.org/icl/lapackpp"
+    git      = "https://bitbucket.org/icl/lapackpp"
     maintainers = ['teonnik', 'Sely85']

-    version('develop', hg=hg, revision="7ffa486")
+    version('develop', commit="f878fad")

     variant('shared', default=True,
             description='Build a shared version of the library')
@@ -25,3 +25,11 @@ class Libffi(AutotoolsPackage, SourcewarePackage):
     def headers(self):
         # The headers are probably in self.prefix.lib but we search everywhere
         return find_headers('ffi', self.prefix, recursive=True)
+
+    def configure_args(self):
+        args = []
+        if self.spec.version >= Version('3.3'):
+            # Spack adds its own target flags, so tell libffi not to
+            # second-guess us
+            args.append('--without-gcc-arch')
+        return args
@@ -31,7 +31,7 @@ class Libnetworkit(CMakePackage):
     depends_on('libtlx')
     depends_on('py-sphinx', when='+doc', type='build')

-    patch('0001-Name-agnostic-import-of-tlx-library.patch', when='@6.1')
+    patch('0001-Name-agnostic-import-of-tlx-library.patch', when='@6.1:')

     def cmake_args(self):
         spec = self.spec
@@ -14,10 +14,11 @@ class Libxsmm(MakefilePackage):
     and deep learning primitives."""

     homepage = 'https://github.com/hfp/libxsmm'
-    url      = 'https://github.com/hfp/libxsmm/archive/1.16.tar.gz'
+    url      = 'https://github.com/hfp/libxsmm/archive/1.16.1.tar.gz'
     git      = 'https://github.com/hfp/libxsmm.git'

     version('master', branch='master')
+    version('1.16.1', sha256='93dc7a3ec40401988729ddb2c6ea2294911261f7e6cd979cf061b5c3691d729d')
     version('1.16', sha256='4f4f2ad97815413af80821d2e306eb6f00541941ad412662da05c02361a20e07')
     version('1.15', sha256='499e5adfbf90cd3673309243c2b56b237d54f86db2437e1ac06c8746b55ab91c')
     version('1.14', sha256='9c0af4509ea341d1ee2c6c19fc6f19289318c3bd4b17844efeb9e7f9691abf76')
@@ -18,6 +18,7 @@ class NfsUtils(AutotoolsPackage):
     version('2.4.1', sha256='c0dda96318af554881f4eb1590bfe91f1aba2fba59ed2ac3ba099f80fdf838e9')
     version('2.3.4', sha256='36e70b0a583751ead0034ebe5d8826caf2dcc7ee7c0beefe94d6ee5a3b0b2484')

     depends_on('pkgconfig', type='build')
     depends_on('libtirpc')
     depends_on('libevent')
     depends_on('libdmx')
@@ -265,6 +265,7 @@ class Openfoam(Package):

     version('develop', branch='develop', submodules='True')
     version('master', branch='master', submodules='True')
+    version('2006', sha256='30c6376d6f403985fc2ab381d364522d1420dd58a42cb270d2ad86f8af227edc')
     version('1912_200506', sha256='831a39ff56e268e88374d0a3922479fd80260683e141e51980242cc281484121')
     version('1912_200403', sha256='1de8f4ddd39722b75f6b01ace9f1ba727b53dd999d1cd2b344a8c677ac2db4c0')
     version('1912', sha256='437feadf075419290aa8bf461673b723a60dc39525b23322850fb58cb48548f2')
22  var/spack/repos/builtin/packages/perl-date-manip/package.py  Normal file
@@ -0,0 +1,22 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class PerlDateManip(PerlPackage):
    """Date::Manip - Date manipulation routines

    Date::Manip is a series of modules designed to make any common date/time
    operation easy to do. Operations such as comparing two times,
    determining a date a given amount of time from another, or parsing
    international times are all easily done. It deals with time as it is
    used in the Gregorian calendar (the one currently in use) with full
    support for time changes due to daylight saving time."""

    homepage = "https://metacpan.org/release/Date-Manip"
    url      = "https://cpan.metacpan.org/authors/id/S/SB/SBECK/Date-Manip-6.82.tar.gz"

    version('6.82', sha256='fa96bcf94c6b4b7d3333f073f5d0faad59f546e5aec13ac01718f2e6ef14672a')
81  var/spack/repos/builtin/packages/postgis/package.py  Normal file
@@ -0,0 +1,81 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Postgis(AutotoolsPackage):
    """
    PostGIS is a spatial database extender for the PostgreSQL object-relational
    database. It adds support for geographic objects, allowing location
    queries to be run in SQL.
    """

    homepage = "https://postgis.net/"
    url      = "https://download.osgeo.org/postgis/source/postgis-2.5.3.tar.gz"

    version('3.0.1', sha256='5a5432f95150d9bae9215c6d1c7bb354e060482a7c379daa9b8384e1d03e6353')
    version('3.0.0', sha256='c06fd2cd5cea0119106ffe17a7235d893c2bbe6f4b63c8617c767630973ba594')
    version('2.5.3', sha256='72e8269d40f981e22fb2b78d3ff292338e69a4f5166e481a77b015e1d34e559a')

    variant('gui', default=False, description='Build with GUI support, creating shp2pgsql-gui graphical interface to shp2pgsql')

    # Refs:
    # https://postgis.net/docs/postgis_installation.html
    # https://postgis.net/source/

    depends_on('postgresql')
    depends_on('geos')
    depends_on('proj')
    depends_on('gdal')
    depends_on('libxml2')
    depends_on('json-c')

    depends_on('sfcgal')
    depends_on('pcre')
    depends_on('perl', type=('build', 'run'))
    depends_on('protobuf-c')

    depends_on('gtkplus@:2.24.32', when='+gui')

    def setup_build_environment(self, env):
        env.set('POSTGIS_GDAL_ENABLED_DRIVERS', 'ENABLE_ALL')

    def setup_run_environment(self, env):
        env.set('POSTGIS_GDAL_ENABLED_DRIVERS', 'ENABLE_ALL')

    def configure_args(self):
        args = []
        args.append('--with-sfcgal=' + str(self.spec['sfcgal'].prefix.bin) +
                    '/sfcgal-config')
        if '+gui' in self.spec:
            args.append('--with-gui')
        return args

    # By default the package installs under the postgresql prefix.
    # Apparently this is a known bug:
    # https://postgis.net/docs/postgis_installation.html
    # The following modifications that fixed this issue are found in the
    # Guix recipe for postgis:
    # https://git.savannah.gnu.org/cgit/guix.git/tree/gnu/packages/geo.scm#n720

    def build(self, spec, prefix):
        make('bindir=' + prefix.bin, 'libdir=' + prefix.lib,
             'pkglibdir=' + prefix.lib, 'datadir=' + prefix.share,
             'docdir=' + prefix.share.doc)

    def install(self, spec, prefix):
        make('install', 'bindir=' + prefix.bin, 'libdir=' + prefix.lib,
             'pkglibdir=' + prefix.lib, 'datadir=' + prefix.share,
             'docdir=' + prefix.share.doc)

    @run_before('build')
    def fix_raster_bindir(self):
        makefile = FileFilter('raster/loader/Makefile')
        makefile.filter('$(DESTDIR)$(PGSQL_BINDIR)', self.prefix.bin,
                        string=True)
        makefile = FileFilter('raster/scripts/Makefile')
        makefile.filter('$(DESTDIR)$(PGSQL_BINDIR)', self.prefix.bin,
                        string=True)
@@ -13,6 +13,7 @@ class PyCrossmap(PythonPackage, SourceforgePackage):
     homepage = "http://crossmap.sourceforge.net/"
     sourceforge_mirror_path = "crossmap/CrossMap-0.3.3.tar.gz"

+    version('0.3.9', sha256='e20a4653e9fc313ac0f5a6cfc37b42e83c3cf2b42f9483706cfb9ec9ff72c74c')
     version('0.3.3', sha256='56d99fd606e13e399b83438953d0d89fc281df1c1e8e47eed7d773e7ec9c88f8')
     version('0.2.9', sha256='57243ee5051352c93088874c797ceac0426f249704ba897360fb628b3365d0af')

@@ -20,8 +21,7 @@ class PyCrossmap(PythonPackage, SourceforgePackage):
     depends_on('python@2.7:2.8', type=('build', 'run'), when='@:0.2.9')

     depends_on('py-setuptools', type='build')
     depends_on('py-numpy', type=('build', 'run'))
-    depends_on('py-cython', type=('build', 'run'))
+    depends_on('py-cython@0.17:', type=('build', 'run'))
     depends_on('py-pysam', type=('build', 'run'))
     depends_on('py-bx-python', type=('build', 'run'))
@@ -16,6 +16,8 @@ class PyMdanalysis(PythonPackage):
     homepage = "http://www.mdanalysis.org"
     url      = "https://pypi.io/packages/source/M/MDAnalysis/MDAnalysis-0.19.2.tar.gz"

+    version('1.0.0', sha256='f45a024aca45e390ff1c45ca90beb2180b78881be377e2a1aa9cd6c109bcfa81')
+    version('0.20.1', sha256='d04b71b193b9716d2597ffb9938b93f43487fa535da1bb5c1f2baccf356d7df9')
     version('0.19.2', sha256='c5395bbafa5efca2e1aee4715d26129844140c47cb8301da0293106cb969de7d')
     version('0.19.1', sha256='ff1d694f8598c0833ec340de6a6adb3b5e62b92d0fa94ee6401718ba972db3cc')
     version('0.19.0', sha256='248e3b37fc6150e31c609cc18a3927c32aee37b76d29cbfedf635e7e1aa982cf')
@@ -31,18 +33,42 @@ class PyMdanalysis(PythonPackage):
     variant('amber', default=False,
             description='Support AMBER netcdf format.')

-    depends_on('python@2.7:')
-    depends_on('py-setuptools', type='build')
-    depends_on('py-cython@0.16:', type='build')
-    depends_on('py-numpy@1.5.0:', type=('build', 'run'))
-    depends_on('py-six@1.4.0:', type=('build', 'run'))
-    depends_on('py-biopython@1.59:', type=('build', 'run'))
-    depends_on('py-networkx@1.0:', type=('build', 'run'))
-    depends_on('py-griddataformats@0.3.2:', type=('build', 'run'))
+    depends_on('python@2.7:', type=('build', 'run'))

-    depends_on('py-matplotlib', when='+analysis', type=('build', 'run'))
-    depends_on('py-scipy', when='+analysis', type=('build', 'run'))
-    depends_on('py-seaborn', when='+analysis', type=('build', 'run'))
+    depends_on('py-setuptools', type='build')
+    depends_on('py-cython@0.16:', type='build')
+
+    depends_on('py-six@1.4.0:', type=('build', 'run'))
+    depends_on('py-networkx@1.0:', type=('build', 'run'))
+
+    depends_on('py-gsd@1.4.0:', when='@1.17.0:', type=('build', 'run'))
+    depends_on('py-mmtf-python@1.0.0:', when='@0.16.0:', type=('build', 'run'))
+    depends_on('py-mock', when='@0.18.0:', type=('build', 'run'))
+    depends_on('py-tqdm@4.43.0:', when='@1.0.0:', type=('build', 'run'))
+
+    depends_on('py-joblib', when='@0.16.0:0.20.1', type=('build', 'run'))
+    depends_on('py-joblib@0.12:', when='@1.0.0:', type=('build', 'run'))
+
+    depends_on('py-numpy@1.5.0:', when='@:0.15.0', type=('build', 'run'))
+    depends_on('py-numpy@1.10.4:', when='@0.16.0:0.19.2', type=('build', 'run'))
+    depends_on('py-numpy@1.13.3:', when='@0.20.1:', type=('build', 'run'))
+
+    depends_on('py-biopython@1.59:', when='@:0.17.0', type=('build', 'run'))
+    depends_on('py-biopython@1.71:', when='@0.18.0:', type=('build', 'run'))
+
+    depends_on('py-griddataformats@0.3.2:', when='@:0.16.2', type=('build', 'run'))
+    depends_on('py-griddataformats@0.4:', when='@0.17.0:', type=('build', 'run'))
+
+    depends_on('py-matplotlib', when='@:0.15.0+analysis', type=('build', 'run'))
+    depends_on('py-matplotlib@1.5.1:', when='@0.16.0:0.16.1+analysis', type=('build', 'run'))
+    depends_on('py-matplotlib@1.5.1:', when='@0.16.2:', type=('build', 'run'))
+
+    depends_on('py-scipy', when='@:0.16.1+analysis', type=('build', 'run'))
+    depends_on('py-scipy', when='@0.16.2:0.17.0', type=('build', 'run'))
+    depends_on('py-scipy@1.0.0:', when='@0.18.0:', type=('build', 'run'))
+
+    depends_on('py-scikit-learn', when='@0.16.0:+analysis', type=('build', 'run'))
+    depends_on('py-seaborn', when='+analysis', type=('build', 'run'))
+
+    depends_on('py-netcdf4@1.0:', when='+amber', type=('build', 'run'))
+    depends_on('hdf5', when='+amber', type=('run'))
@@ -30,3 +30,8 @@ class PyMpi4py(PythonPackage):

     def build_args(self, spec, prefix):
         return ['--mpicc=%s -shared' % spec['mpi'].mpicc]
+
+    @property
+    def headers(self):
+        headers = find_all_headers(self.prefix.lib)
+        return headers
42  var/spack/repos/builtin/packages/py-networkit/package.py  Normal file
@@ -0,0 +1,42 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class PyNetworkit(PythonPackage):
    """NetworKit is a growing open-source toolkit for large-scale network
    analysis. Its aim is to provide tools for the analysis of large networks
    in the size range from thousands to billions of edges. For this purpose,
    it implements efficient graph algorithms, many of them parallel to
    utilize multicore architectures. These are meant to compute standard
    measures of network analysis, such as degree sequences, clustering
    coefficients, and centrality measures. In this respect, NetworKit is
    comparable to packages such as NetworkX, albeit with a focus on
    parallelism and scalability."""

    homepage = "https://networkit.github.io/"
    url      = "https://pypi.io/packages/source/n/networkit/networkit-6.1.tar.gz"

    version('7.0', sha256='eea4b5e565d6990b674e1c7f4d598be9377d57b61d0d82883ecc39edabaf3631')
    version('6.1', sha256='f7fcb50dec66a8253f85c10ff9314100de013c7578d531c81d3f71bc6cf8f093')

    # Required dependencies
    depends_on('cmake', type='build')
    depends_on('libnetworkit@7.0', type=('build', 'link'), when='@7.0')
    depends_on('libnetworkit@6.1', type=('build', 'link'), when='@6.1')
    depends_on('llvm-openmp', when='%apple-clang')
    depends_on('ninja', type='build')
    depends_on('py-cython', type='build')
    depends_on('py-numpy', type=('build', 'run'))
    depends_on('py-scipy', type=('build', 'run'))
    depends_on('py-setuptools', type='build')

    phases = ['build_ext', 'install']

    # Overwrite build_ext to enable ext. core-library + parallel build
    def build_ext_args(self, spec, prefix):
        args = ['--networkit-external-core', '-j{0}'.format(make_jobs)]
        return args
@@ -17,10 +17,16 @@ class Qgis(CMakePackage):

     maintainers = ['adamjstewart', 'Sinan81']

+    version('3.14.0', sha256='1b76c5278def0c447c3d354149a2afe2562ac26cf0bcbe69b9e0528356d407b8')
     version('3.12.3', sha256='c2b53815f9b994e1662995d1f25f90628156b996758f5471bffb74ab29a95220')
     version('3.12.2', sha256='501f81715672205afd2c1a289ffc765aff96eaa8ecb49d079a58ef4d907467b8')
     version('3.12.1', sha256='a7dc7af768b8960c08ce72a06c1f4ca4664f4197ce29c7fe238429e48b2881a8')
     version('3.12.0', sha256='19e9c185dfe88cad7ee6e0dcf5ab7b0bbfe1672307868a53bf771e0c8f9d5e9c')
     # Prefer latest long term release
-    version('3.10.4', sha256='a032e2b8144c2fd825bc26766f586cfb1bd8574bc72efd1aa8ce18dfff8b6c9f', preferred=True)
+    version('3.10.7', sha256='f6c02489e065bae355d2f4374b84a1624379634c34a770b6d65bf38eb7e71564', preferred=True)
+    version('3.10.6', sha256='a96791bf6615e4f8ecdbbb9a90a8ef14a12459d8c5c374ab22eb5f776f864bb5')
+    version('3.10.5', sha256='f3e1cc362941ec69cc21062eeaea160354ef71382b21dc4b3191c315447b4ce1')
+    version('3.10.4', sha256='a032e2b8144c2fd825bc26766f586cfb1bd8574bc72efd1aa8ce18dfff8b6c9f')
     version('3.10.3', sha256='0869704df9120dd642996ff1ed50213ac8247650aa0640b62f8c9c581c05d7a7')
     version('3.10.2', sha256='381cb01a8ac2f5379a915b124e9c830d727d2c67775ec49609c7153fe765a6f7')
     version('3.10.1', sha256='466ac9fad91f266cf3b9d148f58e2adebd5b9fcfc03e6730eb72251e6c34c8ab')
Some files were not shown because too many files have changed in this diff.