Merge branch 'develop' of https://github.com/spack/spack into develop

Gilbert Brietzke 2019-07-16 14:06:23 +02:00
commit 7d76020080
97 changed files with 1532 additions and 442 deletions


@@ -4,12 +4,13 @@ coverage:
range: 60...90
status:
project:
default:
threshold: 0.5
default: yes
ignore:
- lib/spack/spack/test/.*
- lib/spack/docs/.*
- lib/spack/external/.*
- share/spack/qa/.*
- share/spack/spack-completion.bash
comment: off


@@ -19,57 +19,32 @@ jobs:
include:
- stage: 'style checks'
python: '2.7'
sudo: required
os: linux
language: python
env: TEST_SUITE=flake8
- stage: 'unit tests + documentation'
python: '2.6'
dist: trusty
sudo: required
os: linux
language: python
env: [ TEST_SUITE=unit, COVERAGE=true ]
- python: '2.7'
sudo: required
os: linux
language: python
env: [ TEST_SUITE=unit, COVERAGE=true ]
- python: '3.5'
sudo: required
os: linux
language: python
env: TEST_SUITE=unit
- python: '3.6'
sudo: required
os: linux
language: python
env: TEST_SUITE=unit
- python: '3.7'
sudo: required
os: linux
language: python
env: [ TEST_SUITE=unit, COVERAGE=true ]
addons:
apt:
packages:
- cmake
- gfortran
- graphviz
- gnupg2
- kcov
- mercurial
- ninja-build
- perl
- perl-base
- realpath
- patchelf
- r-base
- r-base-core
- r-base-dev
- python: '3.7'
sudo: required
os: linux
language: python
env: TEST_SUITE=doc
@@ -119,7 +94,6 @@ jobs:
env: [ TEST_SUITE=build, 'SPEC=mpich' ]
- python: '3.6'
stage: 'docker build'
sudo: required
os: linux
language: python
env: TEST_SUITE=docker
@@ -137,8 +111,6 @@ stages:
#=============================================================================
# Environment
#=============================================================================
# Use new Travis infrastructure (Docker can't sudo yet)
sudo: false
# Docs need graphviz to build
addons:
@@ -159,6 +131,7 @@ addons:
- r-base
- r-base-core
- r-base-dev
- zsh
# for Mac builds, we use Homebrew
homebrew:
packages:
@@ -166,6 +139,8 @@ addons:
- gcc
- gnupg2
- ccache
- dash
- kcov
update: true
# ~/.ccache needs to be cached directly as Travis is not taking care of it
@@ -223,11 +198,13 @@ script:
after_success:
- ccache -s
- if [[ "$TEST_SUITE" == "unit" || "$TEST_SUITE" == "build" ]]; then
codecov --env PYTHON_VERSION
--required
--flags "${TEST_SUITE}${TRAVIS_OS_NAME}";
fi
- case "$TEST_SUITE" in
unit)
codecov --env PYTHON_VERSION
--required
--flags "${TEST_SUITE}${TRAVIS_OS_NAME}";
;;
esac
#=============================================================================
# Notifications


@@ -10,10 +10,9 @@ Tutorial: Spack 101
=============================
This is a full-day introduction to Spack with lectures and live demos.
It was last presented at the `International Supercomputing Conference
<https://isc-hpc.com>`_ on June 16, 2019. That instance was a half day
tutorial, but the materials here include all the material you need for a
full day.
It was last presented at the `First Workshop on NSF and DOE High
Performance Computing Tools
<http://oaciss.uoregon.edu/NSFDOE19/agenda.html>`_ on July 10, 2019.
You can use these materials to teach a course on Spack at your own site,
or you can just skip ahead and read the live demo scripts to see how
@@ -24,17 +23,17 @@ Spack is used in practice.
.. rubric:: Slides
.. figure:: tutorial/sc16-tutorial-slide-preview.png
:target: https://spack.io/slides/Spack-ISC19-Tutorial.pdf
:target: https://spack.io/slides/Spack-DOE-NSF-Tutorial-2019.pdf
:height: 72px
:align: left
:alt: Slide Preview
`Download Slides <https://spack.io/slides/Spack-RIKEN19-Tutorial.pdf>`_.
`Download Slides <https://spack.io/slides/Spack-DOE-NSF-Tutorial-2019.pdf>`_.
**Full citation:** Todd Gamblin, Gregory Becker, Massimiliano Culpo, and
Michael Kuhn. Managing HPC Software Complexity with Spack. Tutorial
presented at International Supercomputing Conference (ISC'19). June 16, 2019.
Frankfurt, Germany.
**Full citation:** Todd Gamblin and Gregory Becker. Managing HPC Software
Complexity with Spack. Tutorial presented at the First Workshop on NSF
and DOE High Performance Computing Tools. July 9, 2019. Eugene, Oregon,
USA.
.. _sc16-live-demos:


@@ -347,7 +347,7 @@ we'll notice that this time the installation won't complete:
11 options.extend([
See build log for details:
/usr/local/var/spack/stage/arpack-ng-3.5.0-bloz7cqirpdxj33pg7uj32zs5likz2un/arpack-ng-3.5.0/spack-build.out
/usr/local/var/spack/stage/arpack-ng-3.5.0-bloz7cqirpdxj33pg7uj32zs5likz2un/arpack-ng-3.5.0/spack-build-out.txt
Unlike ``openblas`` which provides a library named ``libopenblas.so``,
``netlib-lapack`` provides ``liblapack.so``, so it needs to implement
@@ -459,7 +459,7 @@ Let's look at an example and try to install ``netcdf ^mpich``:
56 config_args.append('--enable-pnetcdf')
See build log for details:
/usr/local/var/spack/stage/netcdf-4.4.1.1-gk2xxhbqijnrdwicawawcll4t3c7dvoj/netcdf-4.4.1.1/spack-build.out
/usr/local/var/spack/stage/netcdf-4.4.1.1-gk2xxhbqijnrdwicawawcll4t3c7dvoj/netcdf-4.4.1.1/spack-build-out.txt
We can see from the error that ``netcdf`` needs to know how to link the *high-level interface*
of ``hdf5``, and thus passes the extra parameter ``hl`` after the request to retrieve it.


@@ -123,7 +123,7 @@ to build this package:
>> 3 make: *** No targets specified and no makefile found. Stop.
See build log for details:
/home/ubuntu/packaging/spack/var/spack/stage/mpileaks-1.0-sv75n3u5ev6mljwcezisz3slooozbbxu/mpileaks-1.0/spack-build.out
/home/ubuntu/packaging/spack/var/spack/stage/mpileaks-1.0-sv75n3u5ev6mljwcezisz3slooozbbxu/mpileaks-1.0/spack-build-out.txt
This obviously didn't work; we need to fill in the package-specific
information. Specifically, Spack didn't try to build any of mpileaks'
@@ -256,7 +256,7 @@ Now when we try to install this package a lot more happens:
>> 3 make: *** No targets specified and no makefile found. Stop.
See build log for details:
/home/ubuntu/packaging/spack/var/spack/stage/mpileaks-1.0-csoikctsalli4cdkkdk377gprkc472rb/mpileaks-1.0/spack-build.out
/home/ubuntu/packaging/spack/var/spack/stage/mpileaks-1.0-csoikctsalli4cdkkdk377gprkc472rb/mpileaks-1.0/spack-build-out.txt
Note that this command may take a while to run and produce more output if
you don't have an MPI already installed or configured in Spack.
@@ -319,14 +319,15 @@ If we re-run we still get errors:
>> 31 configure: error: unable to locate adept-utils installation
See build log for details:
/home/ubuntu/packaging/spack/var/spack/stage/mpileaks-1.0-csoikctsalli4cdkkdk377gprkc472rb/mpileaks-1.0/spack-build.out
/home/ubuntu/packaging/spack/var/spack/stage/mpileaks-1.0-csoikctsalli4cdkkdk377gprkc472rb/mpileaks-1.0/spack-build-out.txt
Again, the problem may be obvious. But let's pretend we're not
all intelligent developers and use this opportunity to spend some
time debugging. We have a few options that can tell us about
what's going wrong:
As per the error message, Spack has given us a ``spack-build.out`` debug log:
As per the error message, Spack has given us a ``spack-build-out.txt`` debug
log:
.. code-block:: console


@@ -1389,7 +1389,25 @@ def find_libraries(libraries, root, shared=True, recursive=False):
# List of libraries we are searching with suffixes
libraries = ['{0}.{1}'.format(lib, suffix) for lib in libraries]
return LibraryList(find(root, libraries, recursive))
if not recursive:
# If not recursive, look for the libraries directly in root
return LibraryList(find(root, libraries, False))
# To speedup the search for external packages configured e.g. in /usr,
# perform first non-recursive search in root/lib then in root/lib64 and
# finally search all of root recursively. The search stops when the first
# match is found.
for subdir in ('lib', 'lib64'):
dirname = join_path(root, subdir)
if not os.path.isdir(dirname):
continue
found_libs = find(dirname, libraries, False)
if found_libs:
break
else:
found_libs = find(root, libraries, True)
return LibraryList(found_libs)
@memoized
@@ -1443,6 +1461,7 @@ def search_paths_for_executables(*path_hints):
if not os.path.isdir(path):
continue
path = os.path.abspath(path)
executable_paths.append(path)
bin_dir = os.path.join(path, 'bin')


@@ -406,6 +406,7 @@ def _set_variables_for_single_module(pkg, module):
"""Helper function to set module variables for single module."""
jobs = spack.config.get('config:build_jobs') if pkg.parallel else 1
jobs = min(jobs, multiprocessing.cpu_count())
assert jobs is not None, "no default set for config:build_jobs"
m = module
@@ -993,7 +994,7 @@ def long_message(self):
if self.build_log and os.path.exists(self.build_log):
out.write('See build log for details:\n')
out.write(' %s' % self.build_log)
out.write(' %s\n' % self.build_log)
return out.getvalue()


@@ -25,11 +25,11 @@
from spack.util.prefix import Prefix
from spack.build_environment import dso_suffix
# A couple of utility functions that might be useful in general. If so, they
# should really be defined elsewhere, unless deemed heretical.
# (Or naïve on my part).
def debug_print(msg, *args):
'''Prints a message (usu. a variable) and the callers' names for a couple
of stack frames.
@@ -115,6 +115,14 @@ class IntelPackage(PackageBase):
'intel-mpi@5.1:5.99': 2016,
}
# Below is the list of possible values for setting auto dispatch functions
# for the Intel compilers. Using these allows for the building of fat
# binaries that will detect the CPU SIMD capabilities at run time and
# activate the appropriate extensions.
auto_dispatch_options = ('COMMON-AVX512', 'MIC-AVX512', 'CORE-AVX512',
'CORE-AVX2', 'CORE-AVX-I', 'AVX', 'SSE4.2',
'SSE4.1', 'SSSE3', 'SSE3', 'SSE2')
@property
def license_required(self):
# The Intel libraries are provided without requiring a license as of
@@ -1241,6 +1249,30 @@ def configure_rpath(self):
with open(compiler_cfg, 'w') as fh:
fh.write('-Xlinker -rpath={0}\n'.format(compilers_lib_dir))
@run_after('install')
def configure_auto_dispatch(self):
if self._has_compilers:
if ('auto_dispatch=none' in self.spec):
return
compilers_bin_dir = self.component_bin_dir('compiler')
for compiler_name in 'icc icpc ifort'.split():
f = os.path.join(compilers_bin_dir, compiler_name)
if not os.path.isfile(f):
raise InstallError(
'Cannot find compiler command to configure '
'auto_dispatch:\n\t' + f)
ad = []
for x in IntelPackage.auto_dispatch_options:
if 'auto_dispatch={0}'.format(x) in self.spec:
ad.append(x)
compiler_cfg = os.path.abspath(f + '.cfg')
with open(compiler_cfg, 'a') as fh:
fh.write('-ax{0}\n'.format(','.join(ad)))
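The `.cfg` line written above is just the selected dispatch targets joined into a single `-ax` option for `icc`/`icpc`/`ifort`. A standalone sketch, assuming the variant selections arrive as a plain set of strings rather than a Spack `Spec`:

```python
# Same canonical ordering as IntelPackage.auto_dispatch_options above.
AUTO_DISPATCH_OPTIONS = ('COMMON-AVX512', 'MIC-AVX512', 'CORE-AVX512',
                         'CORE-AVX2', 'CORE-AVX-I', 'AVX', 'SSE4.2',
                         'SSE4.1', 'SSSE3', 'SSE3', 'SSE2')


def auto_dispatch_flag(selected):
    """Build the '-ax' line appended to the compiler .cfg file,
    keeping the targets in canonical option order."""
    chosen = [x for x in AUTO_DISPATCH_OPTIONS if x in selected]
    return '-ax{0}'.format(','.join(chosen))
```

So selecting `CORE-AVX2` and `AVX` yields `-axCORE-AVX2,AVX`, producing a fat binary that picks the best SIMD code path at run time.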
@run_after('install')
def filter_compiler_wrappers(self):
if (('+mpi' in self.spec or self.provides('mpi')) and


@@ -264,7 +264,7 @@ def get_arg(name, default=None):
hashes = True
hlen = None
nfmt = '{namespace}{name}' if namespace else '{name}'
nfmt = '{namespace}.{name}' if namespace else '{name}'
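The one-character fix above restores the `.` separator between a package's namespace and its name. A sketch of the intended behavior, using plain `str.format` in place of Spack's spec format tokens (function name is illustrative):

```python
def format_name(name, namespace=None):
    """Render a package name, optionally qualified by its namespace.

    Without the '.' in the template, 'builtin.mock' + 'zmpi' would
    render as 'builtin.mockzmpi' instead of 'builtin.mock.zmpi'.
    """
    nfmt = '{namespace}.{name}' if namespace else '{name}'
    return nfmt.format(namespace=namespace, name=name)
```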
ffmt = ''
if full_compiler or flags:
ffmt += '{%compiler.name}'


@@ -27,6 +27,14 @@ def setup_parser(subparser):
'-v', '--verbose', action='store_true', dest='verbose',
help="display verbose build output while installing")
cache_group = subparser.add_mutually_exclusive_group()
cache_group.add_argument(
'--use-cache', action='store_true', dest='use_cache', default=True,
help="check for pre-built Spack packages in mirrors (default)")
cache_group.add_argument(
'--no-cache', action='store_false', dest='use_cache', default=True,
help="do not check for pre-built Spack packages in mirrors")
cd_group = subparser.add_mutually_exclusive_group()
arguments.add_common_arguments(cd_group, ['clean', 'dirty'])
@@ -37,7 +45,8 @@ def bootstrap(parser, args, **kwargs):
'keep_stage': args.keep_stage,
'install_deps': 'dependencies',
'verbose': args.verbose,
'dirty': args.dirty
'dirty': args.dirty,
'use_cache': args.use_cache
})
# Define requirement dictionary defining general specs which need


@@ -393,6 +393,9 @@ def setup_parser(subparser):
subparser.add_argument(
'-f', '--force', action='store_true',
help="overwrite any existing package file with the same name")
subparser.add_argument(
'--skip-editor', action='store_true',
help="skip the edit session for the package (e.g., automation)")
class BuildSystemGuesser:
@@ -671,5 +674,6 @@ def create(parser, args):
package.write(pkg_path)
tty.msg("Created package file: {0}".format(pkg_path))
# Open up the new package file in your $EDITOR
editor(pkg_path)
# Optionally open up the new package file in your $EDITOR
if not args.skip_editor:
editor(pkg_path)


@@ -9,6 +9,7 @@
import spack.cmd
import spack.environment as ev
import spack.error
import spack.package
import spack.cmd.common.arguments as arguments
import spack.repo
@@ -126,9 +127,10 @@ def installed_dependents(specs, env):
env_hashes = set(env.all_hashes()) if env else set()
all_specs_in_db = spack.store.db.query()
for spec in specs:
installed = spack.store.db.installed_relatives(
spec, direction='parents', transitive=True)
installed = [x for x in all_specs_in_db if spec in x]
# separate installed dependents into dpts in this environment and
# dpts that are outside this environment
@@ -212,16 +214,23 @@ def do_uninstall(env, specs, force):
if env:
_remove_from_env(item, env)
# Sort packages to be uninstalled by the number of installed dependents
# This ensures we do things in the right order
def num_installed_deps(pkg):
dependents = spack.store.db.installed_relatives(
pkg.spec, 'parents', True)
return len(dependents)
# A package is ready to be uninstalled when nothing else references it,
# unless we are requested to force uninstall it.
is_ready = lambda x: not spack.store.db.query_by_spec_hash(x)[1].ref_count
if force:
is_ready = lambda x: True
packages.sort(key=num_installed_deps)
for item in packages:
item.do_uninstall(force=force)
while packages:
ready = [x for x in packages if is_ready(x.spec.dag_hash())]
if not ready:
msg = 'unexpected error [cannot proceed uninstalling specs with' \
' remaining dependents {0}]'
msg = msg.format(', '.join(x.name for x in packages))
raise spack.error.SpackError(msg)
packages = [x for x in packages if x not in ready]
for item in ready:
item.do_uninstall(force=force)
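The loop above replaces a one-shot sort with wave-based removal: each pass uninstalls every package that nothing else still references, and errors out if no progress can be made (e.g. when a remaining dependent sits outside the uninstall set). A self-contained sketch using an explicit dependents map instead of Spack's database (all names are illustrative):

```python
def uninstall_in_order(packages, dependents, force=False):
    """Remove packages in waves and return the removal order.

    `dependents` maps a package to the set of packages that require
    it. A package is ready once no *remaining* package depends on it,
    or immediately when `force` is set.
    """
    remaining = set(packages)
    order = []
    while remaining:
        ready = [p for p in remaining
                 if force or not (dependents.get(p, set()) & remaining)]
        if not ready:
            # Mirrors the diff's error path: stuck with dependents left.
            raise RuntimeError('cannot proceed uninstalling specs with '
                               'remaining dependents: '
                               '{0}'.format(', '.join(sorted(remaining))))
        for p in sorted(ready):
            order.append(p)
        remaining -= set(ready)
    return order
```

For a chain `app -> libdwarf -> libelf`, the waves remove `app` first, then `libdwarf`, then `libelf`, which is exactly the dependent-before-dependency order a safe uninstall needs.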
# write any changes made to the active environment
if env:
@@ -332,5 +341,5 @@ def uninstall(parser, args):
' Use `spack uninstall --all` to uninstall ALL packages.')
# [any] here handles the --all case by forcing all specs to be returned
uninstall_specs(
args, spack.cmd.parse_specs(args.packages) if args.packages else [any])
specs = spack.cmd.parse_specs(args.packages) if args.packages else [any]
uninstall_specs(args, specs)


@@ -46,4 +46,4 @@ def cxx14_flag(self):
@property
def pic_flag(self):
return "-fPIC"
return "-KPIC"


@@ -193,9 +193,7 @@ def __init__(self, root, **kwargs):
self.metadata_dir = '.spack'
self.spec_file_name = 'spec.yaml'
self.extension_file_name = 'extensions.yaml'
self.build_log_name = 'build.txt' # build log.
self.build_env_name = 'build.env' # build environment
self.packages_dir = 'repos' # archive of package.py files
self.packages_dir = 'repos' # archive of package.py files
@property
def hidden_file_paths(self):
@@ -242,12 +240,6 @@ def disable_upstream_check(self):
def metadata_path(self, spec):
return os.path.join(spec.prefix, self.metadata_dir)
def build_log_path(self, spec):
return os.path.join(self.metadata_path(spec), self.build_log_name)
def build_env_path(self, spec):
return os.path.join(self.metadata_path(spec), self.build_env_name)
def build_packages_path(self, spec):
return os.path.join(self.metadata_path(spec), self.packages_dir)


@@ -16,6 +16,7 @@
import inspect
import pstats
import argparse
import traceback
from six import StringIO
import llnl.util.tty as tty
@@ -705,10 +706,14 @@ def main(argv=None):
tty.die(e)
except KeyboardInterrupt:
if spack.config.get('config:debug'):
raise
sys.stderr.write('\n')
tty.die("Keyboard interrupt.")
except SystemExit as e:
if spack.config.get('config:debug'):
traceback.print_exc()
return e.code


@@ -63,6 +63,13 @@
from spack.package_prefs import get_package_dir_permissions, get_package_group
# Filename for the Spack build/install log.
_spack_build_logfile = 'spack-build-out.txt'
# Filename for the Spack build/install environment file.
_spack_build_envfile = 'spack-build-env.txt'
class InstallPhase(object):
"""Manages a single phase of the installation.
@@ -432,8 +439,9 @@ class PackageBase(with_metaclass(PackageMeta, PackageViewMixin, object)):
#: List of glob expressions. Each expression must either be
#: absolute or relative to the package source path.
#: Matching artifacts found at the end of the build process will
#: be copied in the same directory tree as build.env and build.txt.
#: Matching artifacts found at the end of the build process will be
#: copied in the same directory tree as _spack_build_logfile and
#: _spack_build_envfile.
archive_files = []
#
@@ -782,11 +790,55 @@ def stage(self, stage):
@property
def env_path(self):
return os.path.join(self.stage.path, 'spack-build.env')
"""Return the build environment file path associated with staging."""
# Backward compatibility: Return the name of an existing log path;
# otherwise, return the current install env path name.
old_filename = os.path.join(self.stage.path, 'spack-build.env')
if os.path.exists(old_filename):
return old_filename
else:
return os.path.join(self.stage.path, _spack_build_envfile)
@property
def install_env_path(self):
"""
Return the build environment file path on successful installation.
"""
install_path = spack.store.layout.metadata_path(self.spec)
# Backward compatibility: Return the name of an existing log path;
# otherwise, return the current install env path name.
old_filename = os.path.join(install_path, 'build.env')
if os.path.exists(old_filename):
return old_filename
else:
return os.path.join(install_path, _spack_build_envfile)
@property
def log_path(self):
return os.path.join(self.stage.path, 'spack-build.txt')
"""Return the build log file path associated with staging."""
# Backward compatibility: Return the name of an existing log path.
for filename in ['spack-build.out', 'spack-build.txt']:
old_log = os.path.join(self.stage.path, filename)
if os.path.exists(old_log):
return old_log
# Otherwise, return the current log path name.
return os.path.join(self.stage.path, _spack_build_logfile)
@property
def install_log_path(self):
"""Return the build log file path on successful installation."""
install_path = spack.store.layout.metadata_path(self.spec)
# Backward compatibility: Return the name of an existing install log.
for filename in ['build.out', 'build.txt']:
old_log = os.path.join(install_path, filename)
if os.path.exists(old_log):
return old_log
# Otherwise, return the current install log path name.
return os.path.join(install_path, _spack_build_logfile)
def _make_fetcher(self):
# Construct a composite fetcher that always contains at least
@@ -1394,7 +1446,6 @@ def bootstrap_compiler(self, **kwargs):
)
def do_install(self, **kwargs):
"""Called by commands to install a package and its dependencies.
Package implementations should override install() to describe
@@ -1617,6 +1668,9 @@ def build_process():
if mode != perms:
os.chmod(self.prefix, perms)
# Ensure the metadata path exists as well
mkdirp(spack.store.layout.metadata_path(self.spec), mode=perms)
# Fork a child to do the actual installation
# we preserve verbosity settings across installs.
PackageBase._verbose = spack.build_environment.fork(
@@ -1724,24 +1778,24 @@ def _do_install_pop_kwargs(self, kwargs):
.format(self.last_phase, self.name))
def log(self):
# Copy provenance into the install directory on success
log_install_path = spack.store.layout.build_log_path(self.spec)
env_install_path = spack.store.layout.build_env_path(self.spec)
"""Copy provenance into the install directory on success."""
packages_dir = spack.store.layout.build_packages_path(self.spec)
# Remove first if we're overwriting another build
# (can happen with spack setup)
try:
# log_install_path and env_install_path are inside this
# log and env install paths are inside this
shutil.rmtree(packages_dir)
except Exception as e:
# FIXME : this potentially catches too many things...
tty.debug(e)
# Archive the whole stdout + stderr for the package
install(self.log_path, log_install_path)
install(self.log_path, self.install_log_path)
# Archive the environment used for the build
install(self.env_path, env_install_path)
install(self.env_path, self.install_env_path)
# Finally, archive files that are specific to each package
with working_dir(self.stage.path):
errors = StringIO()
@@ -1816,10 +1870,12 @@ def check_paths(path_list, filetype, predicate):
@property
def build_log_path(self):
if self.installed:
return spack.store.layout.build_log_path(self.spec)
else:
return self.log_path
"""
Return the expected (or current) build log file path. The path points
to the staging build file until the software is successfully installed,
when it points to the file in the installation directory.
"""
return self.install_log_path if self.installed else self.log_path
@classmethod
def inject_flags(cls, name, flags):


@@ -725,28 +725,18 @@ def _libs_default_handler(descriptor, spec, cls):
if not name.startswith('lib'):
name = 'lib' + name
# To speedup the search for external packages configured e.g. in /usr,
# perform first non-recursive search in prefix.lib then in prefix.lib64 and
# finally search all of prefix recursively. The search stops when the first
# match is found.
prefix = spec.prefix
search_paths = [(prefix.lib, False), (prefix.lib64, False), (prefix, True)]
# If '+shared' search only for shared library; if '~shared' search only for
# static library; otherwise, first search for shared and then for static.
search_shared = [True] if ('+shared' in spec) else \
([False] if ('~shared' in spec) else [True, False])
for shared in search_shared:
for path, recursive in search_paths:
libs = find_libraries(
name, root=path, shared=shared, recursive=recursive
)
if libs:
return libs
libs = find_libraries(name, spec.prefix, shared=shared, recursive=True)
if libs:
return libs
msg = 'Unable to recursively locate {0} libraries in {1}'
raise NoLibrariesError(msg.format(spec.name, prefix))
raise NoLibrariesError(msg.format(spec.name, spec.prefix))
class ForwardQueryToPackage(object):
@@ -2017,13 +2007,18 @@ def concretize(self, tests=False):
# Now that the spec is concrete we should check if
# there are declared conflicts
#
# TODO: this needs rethinking, as currently we can only express
# TODO: internal configuration conflicts within one package.
matches = []
for x in self.traverse():
for conflict_spec, when_list in x.package_class.conflicts.items():
if x.satisfies(conflict_spec, strict=True):
for when_spec, msg in when_list:
if x.satisfies(when_spec, strict=True):
matches.append((x, conflict_spec, when_spec, msg))
when = when_spec.copy()
when.name = x.name
matches.append((x, conflict_spec, when, msg))
if matches:
raise ConflictsInSpecError(self, matches)


@@ -0,0 +1,57 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import os
import pytest
import spack.cmd.create
import spack.util.editor
from spack.main import SpackCommand
create = SpackCommand('create')
@pytest.fixture("module")
def cmd_create_repo(tmpdir_factory):
repo_namespace = 'cmd_create_repo'
repodir = tmpdir_factory.mktemp(repo_namespace)
repodir.ensure(spack.repo.packages_dir_name, dir=True)
yaml = repodir.join('repo.yaml')
yaml.write("""
repo:
namespace: cmd_create_repo
""")
repo = spack.repo.RepoPath(str(repodir))
with spack.repo.swap(repo):
yield repo, repodir
@pytest.fixture(scope='module')
def parser():
"""Returns the parser for the module"""
prs = argparse.ArgumentParser()
spack.cmd.create.setup_parser(prs)
return prs
def test_create_template(parser, cmd_create_repo):
"""Test template creation."""
repo, repodir = cmd_create_repo
name = 'test-package'
args = parser.parse_args(['--skip-editor', name])
spack.cmd.create.create(parser, args)
filename = repo.filename_for_package_name(name)
assert os.path.exists(filename)
with open(filename, 'r') as package_file:
content = ' '.join(package_file.readlines())
for entry in [r'TestPackage(Package)', r'def install(self']:
assert content.find(entry) > -1


@@ -31,6 +31,19 @@
find = SpackCommand('find')
def check_mpileaks_install(viewdir):
"""Check that the expected install directories exist."""
assert os.path.exists(str(viewdir.join('.spack', 'mpileaks')))
# Check that dependencies got in too
assert os.path.exists(str(viewdir.join('.spack', 'libdwarf')))
def check_viewdir_removal(viewdir):
"""Check that the uninstall/removal worked."""
assert (not os.path.exists(str(viewdir.join('.spack'))) or
os.listdir(str(viewdir.join('.spack'))) == ['projections.yaml'])
def test_add():
e = ev.create('test')
e.add('mpileaks')
@@ -606,22 +619,18 @@ def test_uninstall_removes_from_env(mock_stage, mock_fetch, install_mockery):
def test_env_updates_view_install(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
view_dir = tmpdir.mkdir('view')
env('create', '--with-view=%s' % view_dir, 'test')
with ev.read('test'):
add('mpileaks')
install('--fake')
assert os.path.exists(str(view_dir.join('.spack/mpileaks')))
# Check that dependencies got in too
assert os.path.exists(str(view_dir.join('.spack/libdwarf')))
check_mpileaks_install(view_dir)
def test_env_without_view_install(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
# Test enabling a view after installing specs
env('create', '--without-view', 'test')
@@ -639,13 +648,11 @@ def test_env_without_view_install(
# After enabling the view, the specs should be linked into the environment
# view dir
assert os.path.exists(str(view_dir.join('.spack/mpileaks')))
assert os.path.exists(str(view_dir.join('.spack/libdwarf')))
check_mpileaks_install(view_dir)
def test_env_config_view_default(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
# This config doesn't mention whether a view is enabled
test_config = """\
env:
@@ -665,21 +672,17 @@ def test_env_config_view_default(
def test_env_updates_view_install_package(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
view_dir = tmpdir.mkdir('view')
env('create', '--with-view=%s' % view_dir, 'test')
with ev.read('test'):
install('--fake', 'mpileaks')
assert os.path.exists(str(view_dir.join('.spack/mpileaks')))
# Check that dependencies got in too
assert os.path.exists(str(view_dir.join('.spack/libdwarf')))
check_mpileaks_install(view_dir)
def test_env_updates_view_add_concretize(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
view_dir = tmpdir.mkdir('view')
env('create', '--with-view=%s' % view_dir, 'test')
install('--fake', 'mpileaks')
@@ -687,33 +690,26 @@ def test_env_updates_view_add_concretize(
add('mpileaks')
concretize()
assert os.path.exists(str(view_dir.join('.spack/mpileaks')))
# Check that dependencies got in too
assert os.path.exists(str(view_dir.join('.spack/libdwarf')))
check_mpileaks_install(view_dir)
def test_env_updates_view_uninstall(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
view_dir = tmpdir.mkdir('view')
env('create', '--with-view=%s' % view_dir, 'test')
with ev.read('test'):
install('--fake', 'mpileaks')
assert os.path.exists(str(view_dir.join('.spack/mpileaks')))
# Check that dependencies got in too
assert os.path.exists(str(view_dir.join('.spack/libdwarf')))
check_mpileaks_install(view_dir)
with ev.read('test'):
uninstall('-ay')
assert (not os.path.exists(str(view_dir.join('.spack'))) or
os.listdir(str(view_dir.join('.spack'))) == ['projections.yaml'])
check_viewdir_removal(view_dir)
def test_env_updates_view_uninstall_referenced_elsewhere(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
view_dir = tmpdir.mkdir('view')
env('create', '--with-view=%s' % view_dir, 'test')
install('--fake', 'mpileaks')
@@ -721,20 +717,16 @@ def test_env_updates_view_uninstall_referenced_elsewhere(
add('mpileaks')
concretize()
assert os.path.exists(str(view_dir.join('.spack/mpileaks')))
# Check that dependencies got in too
assert os.path.exists(str(view_dir.join('.spack/libdwarf')))
check_mpileaks_install(view_dir)
with ev.read('test'):
uninstall('-ay')
assert (not os.path.exists(str(view_dir.join('.spack'))) or
os.listdir(str(view_dir.join('.spack'))) == ['projections.yaml'])
check_viewdir_removal(view_dir)
def test_env_updates_view_remove_concretize(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
view_dir = tmpdir.mkdir('view')
env('create', '--with-view=%s' % view_dir, 'test')
install('--fake', 'mpileaks')
@@ -742,40 +734,32 @@ def test_env_updates_view_remove_concretize(
add('mpileaks')
concretize()
assert os.path.exists(str(view_dir.join('.spack/mpileaks')))
# Check that dependencies got in too
assert os.path.exists(str(view_dir.join('.spack/libdwarf')))
check_mpileaks_install(view_dir)
with ev.read('test'):
remove('mpileaks')
concretize()
assert (not os.path.exists(str(view_dir.join('.spack'))) or
os.listdir(str(view_dir.join('.spack'))) == ['projections.yaml'])
check_viewdir_removal(view_dir)
def test_env_updates_view_force_remove(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
view_dir = tmpdir.mkdir('view')
env('create', '--with-view=%s' % view_dir, 'test')
with ev.read('test'):
install('--fake', 'mpileaks')
assert os.path.exists(str(view_dir.join('.spack/mpileaks')))
# Check that dependencies got in too
assert os.path.exists(str(view_dir.join('.spack/libdwarf')))
check_mpileaks_install(view_dir)
with ev.read('test'):
remove('-f', 'mpileaks')
assert (not os.path.exists(str(view_dir.join('.spack'))) or
os.listdir(str(view_dir.join('.spack'))) == ['projections.yaml'])
check_viewdir_removal(view_dir)
def test_env_activate_view_fails(
tmpdir, mock_stage, mock_fetch, install_mockery
):
tmpdir, mock_stage, mock_fetch, install_mockery):
"""Sanity check on env activate to make sure it requires shell support"""
out = env('activate', 'test')
assert "To initialize spack's shell commands:" in out


@@ -7,9 +7,13 @@
import pytest
import spack.cmd.find
from spack.main import SpackCommand
from spack.util.pattern import Bunch
find = SpackCommand('find')
@pytest.fixture(scope='module')
def parser():
"""Returns the parser for the module command"""
@@ -98,3 +102,12 @@ def test_tag2_tag3(parser, specs):
spack.cmd.find.find(parser, args)
assert len(specs) == 0
@pytest.mark.db
def test_namespaces_shown_correctly(database):
out = find()
assert 'builtin.mock.zmpi' not in out
out = find('--namespace')
assert 'builtin.mock.zmpi' in out


@@ -125,7 +125,7 @@ def test_package_output(tmpdir, capsys, install_mockery, mock_fetch):
pkg = spec.package
pkg.do_install(verbose=True)
log_file = os.path.join(spec.prefix, '.spack', 'build.txt')
log_file = pkg.build_log_path
with open(log_file) as f:
out = f.read()


@@ -37,7 +37,7 @@ def test_installed_dependents():
@pytest.mark.db
@pytest.mark.usefixtures('database')
@pytest.mark.usefixtures('mutable_database')
def test_recursive_uninstall():
"""Test recursive uninstall."""
uninstall('-y', '-a', '--dependents', 'callpath')
@@ -52,3 +52,32 @@ def test_recursive_uninstall():
assert len(mpileaks_specs) == 0
assert len(callpath_specs) == 0
assert len(mpi_specs) == 3
@pytest.mark.db
@pytest.mark.regression('3690')
@pytest.mark.usefixtures('mutable_database')
@pytest.mark.parametrize('constraint,expected_number_of_specs', [
('dyninst', 7), ('libelf', 5)
])
def test_uninstall_spec_with_multiple_roots(
constraint, expected_number_of_specs
):
uninstall('-y', '-a', '--dependents', constraint)
all_specs = spack.store.layout.all_specs()
assert len(all_specs) == expected_number_of_specs
@pytest.mark.db
@pytest.mark.usefixtures('mutable_database')
@pytest.mark.parametrize('constraint,expected_number_of_specs', [
('dyninst', 13), ('libelf', 13)
])
def test_force_uninstall_spec_with_ref_count_not_zero(
constraint, expected_number_of_specs
):
uninstall('-f', '-y', constraint)
all_specs = spack.store.layout.all_specs()
assert len(all_specs) == expected_number_of_specs

View File

@@ -274,11 +274,11 @@ def test_xl_r_flags():
def test_fj_flags():
supported_flag_test("openmp_flag", "-Kopenmp", "fj@1.2.0")
supported_flag_test("cxx98_flag", "-std=c++98", "fj@1.2.0")
supported_flag_test("cxx11_flag", "-std=c++11", "fj@1.2.0")
supported_flag_test("cxx14_flag", "-std=c++14", "fj@1.2.0")
supported_flag_test("pic_flag", "-fPIC", "fj@1.2.0")
supported_flag_test("openmp_flag", "-Kopenmp", "fj@4.0.0")
supported_flag_test("cxx98_flag", "-std=c++98", "fj@4.0.0")
supported_flag_test("cxx11_flag", "-std=c++11", "fj@4.0.0")
supported_flag_test("cxx14_flag", "-std=c++14", "fj@4.0.0")
supported_flag_test("pic_flag", "-KPIC", "fj@4.0.0")
@pytest.mark.regression('10191')

View File

@@ -29,9 +29,7 @@ def layout_and_dir(tmpdir):
spack.store.layout = old_layout
def test_yaml_directory_layout_parameters(
tmpdir, config
):
def test_yaml_directory_layout_parameters(tmpdir, config):
"""This tests the various parameters that can be used to configure
the install location """
spec = Spec('python')
@@ -84,9 +82,7 @@ def test_yaml_directory_layout_parameters(
path_scheme=scheme_package7)
def test_read_and_write_spec(
layout_and_dir, config, mock_packages
):
def test_read_and_write_spec(layout_and_dir, config, mock_packages):
"""This goes through each package in spack and creates a directory for
it. It then ensures that the spec for the directory's
installed package can be read back in consistently, and
@@ -162,9 +158,7 @@ def test_read_and_write_spec(
assert not os.path.exists(install_dir)
def test_handle_unknown_package(
layout_and_dir, config, mock_packages
):
def test_handle_unknown_package(layout_and_dir, config, mock_packages):
"""This test ensures that spack can at least do *some*
operations with packages that are installed but that it
does not know about. This is actually not such an uncommon
@@ -234,3 +228,14 @@ def test_find(layout_and_dir, config, mock_packages):
for name, spec in found_specs.items():
assert name in found_specs
assert found_specs[name].eq_dag(spec)
def test_yaml_directory_layout_build_path(tmpdir, config):
"""This tests build path method."""
spec = Spec('python')
spec.concretize()
layout = YamlDirectoryLayout(str(tmpdir))
rel_path = os.path.join(layout.metadata_dir, layout.packages_dir)
assert layout.build_packages_path(spec) == os.path.join(spec.prefix,
rel_path)

View File

@@ -5,11 +5,15 @@
import os
import pytest
import shutil
from llnl.util.filesystem import mkdirp, touch, working_dir
import spack.patch
import spack.repo
import spack.store
from spack.spec import Spec
from spack.package import _spack_build_envfile, _spack_build_logfile
def test_install_and_uninstall(install_mockery, mock_fetch, monkeypatch):
@@ -269,3 +273,97 @@ def test_failing_build(install_mockery, mock_fetch):
class MockInstallError(spack.error.SpackError):
pass
def test_pkg_build_paths(install_mockery):
# Get a basic concrete spec for the trivial install package.
spec = Spec('trivial-install-test-package').concretized()
log_path = spec.package.log_path
assert log_path.endswith(_spack_build_logfile)
env_path = spec.package.env_path
assert env_path.endswith(_spack_build_envfile)
# Backward compatibility checks
log_dir = os.path.dirname(log_path)
mkdirp(log_dir)
with working_dir(log_dir):
# Start with the older of the previous log filenames
older_log = 'spack-build.out'
touch(older_log)
assert spec.package.log_path.endswith(older_log)
# Now check the newer log filename
last_log = 'spack-build.txt'
os.rename(older_log, last_log)
assert spec.package.log_path.endswith(last_log)
# Check the old environment file
last_env = 'spack-build.env'
os.rename(last_log, last_env)
assert spec.package.env_path.endswith(last_env)
# Cleanup
shutil.rmtree(log_dir)
def test_pkg_install_paths(install_mockery):
# Get a basic concrete spec for the trivial install package.
spec = Spec('trivial-install-test-package').concretized()
log_path = os.path.join(spec.prefix, '.spack', _spack_build_logfile)
assert spec.package.install_log_path == log_path
env_path = os.path.join(spec.prefix, '.spack', _spack_build_envfile)
assert spec.package.install_env_path == env_path
# Backward compatibility checks
log_dir = os.path.dirname(log_path)
mkdirp(log_dir)
with working_dir(log_dir):
# Start with the older of the previous install log filenames
older_log = 'build.out'
touch(older_log)
assert spec.package.install_log_path.endswith(older_log)
# Now check the newer install log filename
last_log = 'build.txt'
os.rename(older_log, last_log)
assert spec.package.install_log_path.endswith(last_log)
# Check the old install environment file
last_env = 'build.env'
os.rename(last_log, last_env)
assert spec.package.install_env_path.endswith(last_env)
# Cleanup
shutil.rmtree(log_dir)
def test_pkg_install_log(install_mockery):
# Get a basic concrete spec for the trivial install package.
spec = Spec('trivial-install-test-package').concretized()
# Attempt installing log without the build log file
with pytest.raises(IOError, match="No such file or directory"):
spec.package.log()
# Set up mock build files and try again
log_path = spec.package.log_path
log_dir = os.path.dirname(log_path)
mkdirp(log_dir)
with working_dir(log_dir):
touch(log_path)
touch(spec.package.env_path)
install_path = os.path.dirname(spec.package.install_log_path)
mkdirp(install_path)
spec.package.log()
assert os.path.exists(spec.package.install_log_path)
assert os.path.exists(spec.package.install_env_path)
# Cleanup
shutil.rmtree(log_dir)

View File

@@ -17,7 +17,7 @@ def test_possible_dependencies(mock_packages):
mpileaks = spack.repo.get('mpileaks')
mpi_names = [spec.name for spec in spack.repo.path.providers_for('mpi')]
assert mpileaks.possible_dependencies() == {
assert mpileaks.possible_dependencies(expand_virtuals=True) == {
'callpath': set(['dyninst'] + mpi_names),
'dyninst': set(['libdwarf', 'libelf']),
'fake': set(),
@@ -30,6 +30,15 @@ def test_possible_dependencies(mock_packages):
'zmpi': set(['fake']),
}
assert mpileaks.possible_dependencies(expand_virtuals=False) == {
'callpath': set(['dyninst']),
'dyninst': set(['libdwarf', 'libelf']),
'libdwarf': set(['libelf']),
'libelf': set(),
'mpi': set(),
'mpileaks': set(['callpath']),
}
def test_possible_dependencies_with_deptypes(mock_packages):
dtbuild1 = spack.repo.get('dtbuild1')

View File

@@ -16,6 +16,12 @@
# Optionally add one or more unit tests
# to only run these tests.
#
#-----------------------------------------------------------
# Run a few initial commands and set up test environment
#-----------------------------------------------------------
ORIGINAL_PATH="$PATH"
. "$(dirname $0)/setup.sh"
check_dependencies ${coverage} git hg svn
@@ -33,9 +39,33 @@ bin/spack help -a
# Profile and print top 20 lines for a simple call to spack spec
bin/spack -p --lines 20 spec mpileaks%gcc ^elfutils@0.170
#-----------------------------------------------------------
# Run unit tests with code coverage
#-----------------------------------------------------------
extra_args=""
if [[ -n "$*" ]]; then
extra_args="-k $*"
fi
${coverage_run} bin/spack test --verbose "$extra_args"
#-----------------------------------------------------------
# Run tests for setup-env.sh
#-----------------------------------------------------------
# Clean the environment by removing Spack from the path and getting rid of
# the spack shell function
export PATH="$ORIGINAL_PATH"
unset spack
# start in the spack root directory
cd $SPACK_ROOT
# Run bash tests with coverage enabled, but pipe output to /dev/null
# because kcov seems to undo the script's redirection
if [ "$BASH_COVERAGE" = true ]; then
${QA_DIR}/bashcov ${QA_DIR}/setup-env-test.sh &> /dev/null
fi
# run the test scripts for their output (these will print nicely)
bash ${QA_DIR}/setup-env-test.sh
zsh ${QA_DIR}/setup-env-test.sh
dash ${QA_DIR}/setup-env-test.sh
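The three invocations above run the same POSIX test script under bash, zsh, and dash in turn. A minimal sketch of how such a multi-shell run could be aggregated, assuming the helper name `run_under_shells` and the script path are illustrative (shells that are not installed are simply skipped):

```shell
#!/bin/sh
# Run one POSIX test script under several shells, skipping shells
# that are not installed, and report failure if any run fails.
run_under_shells() {
    script="$1"
    failed=0
    for sh_name in sh bash zsh dash; do
        if command -v "$sh_name" > /dev/null 2>&1; then
            echo "==> running $script under $sh_name"
            "$sh_name" "$script" || failed=1
        else
            echo "==> $sh_name not found, skipping"
        fi
    done
    return "$failed"
}
```

The aggregated return status makes the helper usable directly in CI, where a single non-zero exit marks the whole job as failed.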

share/spack/qa/setup-env-test.sh Executable file
View File

@@ -0,0 +1,292 @@
#!/bin/sh
#
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#
# This script tests that Spack's setup-env.sh init script works.
#
# The tests are portable to bash, zsh, and bourne shell, and can be run
# in any of these shells.
#
# ------------------------------------------------------------------------
# Functions for color output.
# ------------------------------------------------------------------------
# Colors for output
red='\033[1;31m'
cyan='\033[1;36m'
green='\033[1;32m'
reset='\033[0m'
echo_red() {
printf "${red}$*${reset}\n"
}
echo_green() {
printf "${green}$*${reset}\n"
}
echo_msg() {
printf "${cyan}$*${reset}\n"
}
# ------------------------------------------------------------------------
# Generic functions for testing shell code.
# ------------------------------------------------------------------------
# counts of test successes and failures.
success=0
errors=0
# Print out a header for a group of tests.
title() {
echo
echo_msg "$@"
echo_msg "---------------------------------"
}
# echo FAIL in red text; increment failures
fail() {
echo_red FAIL
errors=$((errors+1))
}
#
# Echo SUCCESS in green; increment successes
#
pass() {
echo_green SUCCESS
success=$((success+1))
}
#
# Run a command and suppress output unless it fails.
# On failure, echo the exit code and output.
#
succeeds() {
printf "'%s' succeeds ... " "$*"
output=$("$@" 2>&1)
err="$?"
if [ "$err" != 0 ]; then
fail
echo_red "Command failed with error $err."
if [ -n "$output" ]; then
echo_msg "Output:"
echo "$output"
else
echo_msg "No output."
fi
else
pass
fi
}
#
# Run a command that is expected to fail; suppress output unless it
# unexpectedly succeeds. On unexpected success, echo the exit code and output.
#
fails() {
printf "'%s' fails ... " "$*"
output=$("$@" 2>&1)
err="$?"
if [ "$err" = 0 ]; then
fail
echo_red "Command succeeded when it should have failed."
if [ -n "$output" ]; then
echo_msg "Output:"
echo "$output"
else
echo_msg "No output."
fi
else
pass
fi
}
#
# Ensure that a string is in the output of a command.
# Suppresses output on success.
# On failure, echo the exit code and output.
#
contains() {
string="$1"
shift
printf "'%s' output contains '$string' ... " "$*"
output=$("$@" 2>&1)
err="$?"
if [ "${output#*$string}" = "${output}" ]; then
fail
echo_red "Command exited with status $err."
echo_red "'$string' was not in output."
if [ -n "$output" ]; then
echo_msg "Output:"
echo "$output"
else
echo_msg "No output."
fi
else
pass
fi
}
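The `contains` helper above avoids the bash-only `[[ $output == *$string* ]]` by using POSIX parameter expansion: stripping the shortest prefix that matches `*$string` leaves the value unchanged exactly when the substring is absent. A standalone sketch of the idiom (the function name `has_substring` is illustrative, not part of the script):

```shell
#!/bin/sh
# Return 0 if $2 occurs anywhere in $1, using only POSIX
# parameter expansion (no bash [[ ]], no grep).
has_substring() {
    haystack="$1"
    needle="$2"
    # ${haystack#*$needle} strips the shortest leading match of
    # "*$needle"; if nothing was stripped, the needle is absent.
    [ "${haystack#*$needle}" != "$haystack" ]
}

if has_substring "usage: spack load" "spack"; then
    echo "found"
fi
```

Note that the needle is expanded unquoted inside the pattern, so glob characters in it are interpreted as wildcards; the test script's own check has the same property.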
# -----------------------------------------------------------------------
# Instead of invoking the module/use/dotkit commands, we print the
# arguments that Spack invokes the command with, so we can check that
# Spack passes the expected arguments in the tests below.
#
# We make that happen by defining the sh functions below.
# -----------------------------------------------------------------------
module() {
echo module "$@"
}
use() {
echo use "$@"
}
unuse() {
echo unuse "$@"
}
# -----------------------------------------------------------------------
# Setup test environment and do some preliminary checks
# -----------------------------------------------------------------------
# Make sure no environment is active
unset SPACK_ENV
# fail with undefined variables
set -u
# Source setup-env.sh before tests
. share/spack/setup-env.sh
title "Testing setup-env.sh with $_sp_shell"
# spack command is now available
succeeds which spack
# mock cd command (intentionally define only AFTER setup-env.sh)
cd() {
echo cd "$@"
}
# create a fake mock package install and store its location for later
title "Setup"
echo "Creating a mock package installation"
spack -m install --fake a
a_install=$(spack location -i a)
a_module=$(spack -m module tcl find a)
a_dotkit=$(spack -m module dotkit find a)
b_install=$(spack location -i b)
b_module=$(spack -m module tcl find b)
b_dotkit=$(spack -m module dotkit find b)
# ensure that we uninstall b on exit
cleanup() {
title "Cleanup"
echo "Removing test package before exiting test script."
spack -m uninstall -yf b
spack -m uninstall -yf a
echo
echo "$success tests succeeded."
echo "$errors tests failed."
if [ "$errors" = 0 ]; then
pass
exit 0
else
fail
exit 1
fi
}
trap cleanup EXIT
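The `trap cleanup EXIT` line above guarantees the uninstalls run no matter how the script terminates, including early exits from a failed check. A minimal sketch of the same idiom, using a temporary file as the resource to release (file and function names are illustrative):

```shell
#!/bin/sh
# Guarantee cleanup runs on any exit path, the same trap-on-EXIT
# idiom the test script uses to uninstall its mock packages.
tmpfile=$(mktemp)

cleanup() {
    rm -f "$tmpfile"
    echo "cleaned up"
}
trap cleanup EXIT

echo "working with $tmpfile"
# ... tests would run here; cleanup fires even if one exits early
```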
# -----------------------------------------------------------------------
# Test all spack commands with special env support
# -----------------------------------------------------------------------
title 'Testing `spack`'
contains 'usage: spack ' spack
contains "usage: spack " spack -h
contains "usage: spack " spack help
contains "usage: spack " spack -H
contains "usage: spack " spack help --all
title 'Testing `spack cd`'
contains "usage: spack cd " spack cd -h
contains "usage: spack cd " spack cd --help
contains "cd $b_install" spack cd -i b
title 'Testing `spack module`'
contains "usage: spack module " spack -m module -h
contains "usage: spack module " spack -m module --help
contains "usage: spack module " spack -m module
title 'Testing `spack load`'
contains "module load $b_module" spack -m load b
fails spack -m load -l
contains "module load -l --arg $b_module" spack -m load -l --arg b
contains "module load $b_module $a_module" spack -m load -r a
contains "module load $b_module $a_module" spack -m load --dependencies a
fails spack -m load d
contains "usage: spack load " spack -m load -h
contains "usage: spack load " spack -m load -h d
contains "usage: spack load " spack -m load --help
title 'Testing `spack unload`'
contains "module unload $b_module" spack -m unload b
fails spack -m unload -l
contains "module unload -l --arg $b_module" spack -m unload -l --arg b
fails spack -m unload d
contains "usage: spack unload " spack -m unload -h
contains "usage: spack unload " spack -m unload -h d
contains "usage: spack unload " spack -m unload --help
title 'Testing `spack use`'
contains "use $b_dotkit" spack -m use b
fails spack -m use -l
contains "use -l --arg $b_dotkit" spack -m use -l --arg b
contains "use $b_dotkit $a_dotkit" spack -m use -r a
contains "use $b_dotkit $a_dotkit" spack -m use --dependencies a
fails spack -m use d
contains "usage: spack use " spack -m use -h
contains "usage: spack use " spack -m use -h d
contains "usage: spack use " spack -m use --help
title 'Testing `spack unuse`'
contains "unuse $b_dotkit" spack -m unuse b
fails spack -m unuse -l
contains "unuse -l --arg $b_dotkit" spack -m unuse -l --arg b
fails spack -m unuse d
contains "usage: spack unuse " spack -m unuse -h
contains "usage: spack unuse " spack -m unuse -h d
contains "usage: spack unuse " spack -m unuse --help
title 'Testing `spack env`'
contains "usage: spack env " spack env -h
contains "usage: spack env " spack env --help
title 'Testing `spack env activate`'
contains "No such environment:" spack env activate no_such_environment
contains "usage: spack env activate " spack env activate
contains "usage: spack env activate " spack env activate -h
contains "usage: spack env activate " spack env activate --help
title 'Testing `spack env deactivate`'
contains "Error: No environment is currently active" spack env deactivate
contains "usage: spack env deactivate " spack env deactivate no_such_environment
contains "usage: spack env deactivate " spack env deactivate -h
contains "usage: spack env deactivate " spack env deactivate --help
title 'Testing `spack env list`'
contains " spack env list " spack env list -h
contains " spack env list " spack env list --help

View File

@@ -20,18 +20,24 @@ export SPACK_ROOT=$(realpath "$QA_DIR/../../..")
coverage=""
coverage_run=""
# bash coverage depends on some other factors -- there are issues with
# kcov for Python 2.6, unit tests, and build tests.
if [[ $TEST_SUITE == unit && # kcov segfaults for the MPICH build test
$TRAVIS_OS_NAME == linux &&
$TRAVIS_PYTHON_VERSION != 2.6 ]];
then
BASH_COVERAGE="true"
else
BASH_COVERAGE="false"
fi
# Set up some variables for running coverage tests.
if [[ "$COVERAGE" == "true" ]]; then
# these set up coverage for Python
coverage=coverage
coverage_run="coverage run"
# make a coverage directory for kcov, and patch cc to use our bashcov
# script instead of plain bash
if [[ $TEST_SUITE == unit && # kcov segfaults for the MPICH build test
$TRAVIS_OS_NAME == linux &&
$TRAVIS_PYTHON_VERSION != 2.6 ]];
then
if [ "$BASH_COVERAGE" = true ]; then
mkdir -p coverage
cc_script="$SPACK_ROOT/lib/spack/env/cc"
bashcov=$(realpath ${QA_DIR}/bashcov)

View File

@@ -6,11 +6,11 @@
########################################################################
#
# This file is part of Spack and sets up the spack environment for
# bash and zsh. This includes dotkit support, module support, and
# it also puts spack in your path. The script also checks that
# at least module support exists, and provides suggestions if it
# doesn't. Source it like this:
# This file is part of Spack and sets up the spack environment for bash,
# zsh, and dash (sh). This includes dotkit support, module support, and
# it also puts spack in your path. The script also checks that at least
# module support exists, and provides suggestions if it doesn't. Source
# it like this:
#
# . /path/to/spack/share/spack/setup-env.sh
#
@@ -39,34 +39,38 @@
# spack dotfiles.
########################################################################
function spack {
# Zsh does not do word splitting by default, this enables it for this function only
spack() {
# Zsh does not do word splitting by default, this enables it for this
# function only
if [ -n "${ZSH_VERSION:-}" ]; then
emulate -L sh
fi
# save raw arguments into an array before butchering them
args=( "$@" )
# accumulate initial flags for main spack command
# accumulate flags meant for the main spack command
# the loop condition is unreadable, but it means:
# while $1 is set (while there are arguments)
# and $1 starts with '-' (and the arguments are flags)
_sp_flags=""
while [[ "$1" =~ ^- ]]; do
while [ ! -z ${1+x} ] && [ "${1#-}" != "${1}" ]; do
_sp_flags="$_sp_flags $1"
shift
done
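The loop condition above packs two portable tests into one line: `${1+x}` expands to `x` only while an argument remains, and `${1#-}` differs from `$1` only when `$1` starts with `-`. A standalone sketch of the same flag-splitting loop (the name `collect_flags` is illustrative):

```shell
#!/bin/sh
# Split leading dash-flags from the remaining arguments, the same
# way the spack() wrapper accumulates _sp_flags.
collect_flags() {
    flags=""
    # ${1+x} is non-empty only while an argument remains;
    # ${1#-} differs from $1 only when $1 starts with '-'.
    while [ ! -z "${1+x}" ] && [ "${1#-}" != "$1" ]; do
        flags="$flags $1"
        shift
    done
    echo "flags:$flags rest: $*"
}

collect_flags -d -m install zlib
# prints: flags: -d -m rest: install zlib
```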
# h and V flags don't require further output parsing.
if [[ (! -z "$_sp_flags") && ("$_sp_flags" =~ '.*h.*' || "$_sp_flags" =~ '.*V.*') ]]; then
if [ -n "$_sp_flags" ] && \
[ "${_sp_flags#*h}" != "${_sp_flags}" ] || \
[ "${_sp_flags#*V}" != "${_sp_flags}" ];
then
command spack $_sp_flags "$@"
return
fi
# set the subcommand if there is one (if $1 is set)
_sp_subcommand=""
if [ -n "$1" ]; then
if [ ! -z ${1+x} ]; then
_sp_subcommand="$1"
shift
fi
_sp_spec=("$@")
# Filter out use and unuse. For any other commands, just run the
# command.
@@ -77,11 +81,11 @@ function spack {
_sp_arg="$1"
shift
fi
if [[ "$_sp_arg" = "-h" || "$_sp_arg" = "--help" ]]; then
if [ "$_sp_arg" = "-h" ] || [ "$_sp_arg" = "--help" ]; then
command spack cd -h
else
LOC="$(spack location $_sp_arg "$@")"
if [[ -d "$LOC" ]] ; then
if [ -d "$LOC" ] ; then
cd "$LOC"
else
return 1
@@ -96,31 +100,41 @@ function spack {
shift
fi
if [[ "$_sp_arg" = "-h" || "$_sp_arg" = "--help" ]]; then
if [ "$_sp_arg" = "-h" ] || [ "$_sp_arg" = "--help" ]; then
command spack env -h
else
case $_sp_arg in
activate)
_a="$@"
if [ -z "$1" -o "${_a#*--sh}" != "$_a" -o "${_a#*--csh}" != "$_a" -o "${_a#*-h}" != "$_a" ]; then
if [ -z ${1+x} ] || \
[ "${_a#*--sh}" != "$_a" ] || \
[ "${_a#*--csh}" != "$_a" ] || \
[ "${_a#*-h}" != "$_a" ];
then
# no args or args contain -h/--help, --sh, or --csh: just execute
command spack "${args[@]}"
command spack env activate "$@"
else
# actual call to activate: source the output
eval $(command spack $_sp_flags env activate --sh "$@")
fi
;;
deactivate)
if [ -n "$1" ]; then
# with args: execute the command
command spack "${args[@]}"
_a="$@"
if [ "${_a#*--sh}" != "$_a" ] || \
[ "${_a#*--csh}" != "$_a" ];
then
# just execute the command if --sh or --csh are provided
command spack env deactivate "$@"
elif [ -n "$*" ]; then
# any other arguments are an error or help, so just run help
command spack env deactivate -h
else
# no args: source the output.
# no args: source the output of the command
eval $(command spack $_sp_flags env deactivate --sh)
fi
;;
*)
command spack "${args[@]}"
command spack env $_sp_arg "$@"
;;
esac
fi
@@ -130,8 +144,11 @@ function spack {
# Shift any other args for use off before parsing spec.
_sp_subcommand_args=""
_sp_module_args=""
while [[ "$1" =~ ^- ]]; do
if [ "$1" = "-r" -o "$1" = "--dependencies" ]; then
while [ "${1#-}" != "${1}" ]; do
if [ "$1" = "-h" ] || [ "$1" = "--help" ]; then
command spack $_sp_flags $_sp_subcommand $_sp_subcommand_args "$@"
return
elif [ "$1" = "-r" ] || [ "$1" = "--dependencies" ]; then
_sp_subcommand_args="$_sp_subcommand_args $1"
else
_sp_module_args="$_sp_module_args $1"
@@ -139,51 +156,54 @@ function spack {
shift
done
_sp_spec=("$@")
# Here the user has run use or unuse with a spec. Find a matching
# spec using 'spack module find', then use the appropriate module
# tool's commands to add/remove the result from the environment.
# If spack module command comes back with an error, do nothing.
case $_sp_subcommand in
"use")
if _sp_full_spec=$(command spack $_sp_flags module dotkit find $_sp_subcommand_args "${_sp_spec[@]}"); then
if _sp_full_spec=$(command spack $_sp_flags module dotkit find $_sp_subcommand_args "$@"); then
use $_sp_module_args $_sp_full_spec
else
$(exit 1)
fi ;;
fi
;;
"unuse")
if _sp_full_spec=$(command spack $_sp_flags module dotkit find $_sp_subcommand_args "${_sp_spec[@]}"); then
if _sp_full_spec=$(command spack $_sp_flags module dotkit find $_sp_subcommand_args "$@"); then
unuse $_sp_module_args $_sp_full_spec
else
$(exit 1)
fi ;;
fi
;;
"load")
if _sp_full_spec=$(command spack $_sp_flags module tcl find $_sp_subcommand_args "${_sp_spec[@]}"); then
if _sp_full_spec=$(command spack $_sp_flags module tcl find $_sp_subcommand_args "$@"); then
module load $_sp_module_args $_sp_full_spec
else
$(exit 1)
fi ;;
fi
;;
"unload")
if _sp_full_spec=$(command spack $_sp_flags module tcl find $_sp_subcommand_args "${_sp_spec[@]}"); then
if _sp_full_spec=$(command spack $_sp_flags module tcl find $_sp_subcommand_args "$@"); then
module unload $_sp_module_args $_sp_full_spec
else
$(exit 1)
fi ;;
fi
;;
esac
;;
*)
command spack "${args[@]}"
command spack $_sp_flags $_sp_subcommand "$@"
;;
esac
}
########################################################################
# Prepends directories to path, if they exist.
# pathadd /path/to/dir # add to PATH
# or pathadd OTHERPATH /path/to/dir # add to OTHERPATH
########################################################################
function _spack_pathadd {
_spack_pathadd() {
# If no variable name is supplied, just append to PATH
# otherwise append to that variable.
_pa_varname=PATH
@@ -196,7 +216,10 @@ function _spack_pathadd {
# Do the actual prepending here.
eval "_pa_oldvalue=\${${_pa_varname}:-}"
if [ -d "$_pa_new_path" ] && [[ ":$_pa_oldvalue:" != *":$_pa_new_path:"* ]]; then
_pa_canonical=":$_pa_oldvalue:"
if [ -d "$_pa_new_path" ] && \
[ "${_pa_canonical#*:${_pa_new_path}:}" = "${_pa_canonical}" ];
then
if [ -n "$_pa_oldvalue" ]; then
eval "export $_pa_varname=\"$_pa_new_path:$_pa_oldvalue\""
else
@@ -205,59 +228,84 @@ fi
fi
}
# Export spack function so it is available in subshells (only works with bash)
if [ -n "${BASH_VERSION:-}" ]; then
export -f spack
fi
#
# Figure out where this file is. Below code needs to be portable to
# bash and zsh.
#
_sp_source_file="${BASH_SOURCE[0]:-}" # Bash's location of last sourced file.
if [ -z "$_sp_source_file" ]; then
_sp_source_file="${(%):-%N}" # zsh way to do it
if [ -z "$_sp_source_file" ]; then
# Not zsh either... bail out with plain old $0,
# which WILL NOT work if this is sourced indirectly.
_sp_source_file="$0"
fi
fi
#
# Determine which shell is being used
#
function _spack_determine_shell() {
# This logic is derived from the cea-hpc/modules profile.sh example at
# https://github.com/cea-hpc/modules/blob/master/init/profile.sh.in
#
# The objective is to correctly detect the shell type even when setup-env
# is sourced within a script itself rather than a login terminal.
_spack_determine_shell() {
if [ -n "${BASH:-}" ]; then
echo ${BASH##*/}
echo bash
elif [ -n "${ZSH_NAME:-}" ]; then
echo $ZSH_NAME
echo zsh
else
PS_FORMAT= ps -p $$ | tail -n 1 | awk '{print $4}' | sed 's/^-//' | xargs basename
fi
}
export SPACK_SHELL=$(_spack_determine_shell)
_sp_shell=$(_spack_determine_shell)
# Export spack function so it is available in subshells (only works with bash)
if [ "$_sp_shell" = bash ]; then
export -f spack
fi
#
# Figure out where this file is.
#
if [ "$_sp_shell" = bash ]; then
_sp_source_file="${BASH_SOURCE[0]:-}"
elif [ "$_sp_shell" = zsh ]; then
_sp_source_file="${(%):-%N}"
else
# Try to read the /proc filesystem (works on linux without lsof)
# In dash, the sourced file is the last one opened (and it's kept open)
_sp_source_file_fd="$(\ls /proc/$$/fd 2>/dev/null | sort -n | tail -1)"
if ! _sp_source_file="$(readlink /proc/$$/fd/$_sp_source_file_fd)"; then
# Last resort: try lsof. This works in dash on macos -- same reason.
# macos has lsof installed by default; some linux containers don't.
_sp_lsof_output="$(lsof -p $$ -Fn0 | tail -1)"
_sp_source_file="${_sp_lsof_output#*n}"
fi
# If we can't find this script's path after all that, bail out with
# plain old $0, which WILL NOT work if this is sourced indirectly.
if [ ! -f "$_sp_source_file" ]; then
_sp_source_file="$0"
fi
fi
#
# Find root directory and add bin to path.
#
# We send cd output to /dev/null because a lot of users set up
# their shell so that cd prints things out to the tty.
#
_sp_share_dir="$(cd "$(dirname $_sp_source_file)" > /dev/null && pwd)"
_sp_prefix="$(cd "$(dirname $(dirname $_sp_share_dir))" > /dev/null && pwd)"
if [ -x "$_sp_prefix/bin/spack" ]; then
export SPACK_ROOT="${_sp_prefix}"
else
# If the shell couldn't find the sourced script, fall back to
# whatever the user set SPACK_ROOT to.
if [ -n "$SPACK_ROOT" ]; then
_sp_prefix="$SPACK_ROOT"
_sp_share_dir="$_sp_prefix/share/spack"
fi
# If SPACK_ROOT didn't work, fail. We should need this rarely, as
# the tricks above for finding the sourced file are pretty robust.
if [ ! -x "$_sp_prefix/bin/spack" ]; then
echo "==> Error: SPACK_ROOT must point to spack's prefix when using $_sp_shell"
echo "Run this with the correct prefix before sourcing setup-env.sh:"
echo " export SPACK_ROOT=</path/to/spack>"
return 1
fi
fi
_spack_pathadd PATH "${_sp_prefix%/}/bin"
export SPACK_ROOT="${_sp_prefix}"
#
# Check whether a function of the given name is defined
#
function _spack_fn_exists() {
_spack_fn_exists() {
LANG= type $1 2>&1 | grep -q 'function'
}
@@ -266,7 +314,6 @@ if ! _spack_fn_exists use && ! _spack_fn_exists module; then
need_module="yes"
fi;
#
# make available environment-modules
#
@@ -277,25 +324,29 @@ if [ "${need_module}" = "yes" ]; then
if [ "${_sp_module_prefix}" != "not_installed" ]; then
# activate it!
# environment-modules@4: has a bin directory inside its prefix
MODULE_PREFIX_BIN="${_sp_module_prefix}/bin"
if [ ! -d "${MODULE_PREFIX_BIN}" ]; then
_sp_module_bin="${_sp_module_prefix}/bin"
if [ ! -d "${_sp_module_bin}" ]; then
# environment-modules@3 has a nested bin directory
MODULE_PREFIX_BIN="${_sp_module_prefix}/Modules/bin"
_sp_module_bin="${_sp_module_prefix}/Modules/bin"
fi
export MODULE_PREFIX_BIN
_spack_pathadd PATH "${MODULE_PREFIX_BIN}"
module() { eval `${MODULE_PREFIX_BIN}/modulecmd ${SPACK_SHELL} $*`; }
# _sp_module_bin and _sp_shell are evaluated here; the quoted
# eval statement and $* are deferred.
_sp_cmd="module() { eval \`${_sp_module_bin}/modulecmd ${_sp_shell} \$*\`; }"
eval "$_sp_cmd"
_spack_pathadd PATH "${_sp_module_bin}"
fi;
else
eval `spack --print-shell-vars sh`
fi;
#
# set module system roots
#
_sp_multi_pathadd() {
local IFS=':'
if [[ -n "${ZSH_VERSION:-}" ]]; then
if [ "$_sp_shell" = zsh ]; then
setopt sh_word_split
fi
for pth in $2; do
@@ -307,6 +358,6 @@ _sp_multi_pathadd DK_NODE "$_sp_dotkit_roots"
# Add programmable tab completion for Bash
#
if [ -n "${BASH_VERSION:-}" ]; then
if [ "$_sp_shell" = bash ]; then
source $_sp_share_dir/spack-completion.bash
fi

View File

@@ -140,7 +140,8 @@ function _spack_blame {
function _spack_bootstrap {
compgen -W "-h --help -j --jobs --keep-prefix --keep-stage
-n --no-checksum -v --verbose --clean --dirty" -- "$cur"
-n --no-checksum -v --verbose --use-cache --no-cache
--clean --dirty" -- "$cur"
}
function _spack_build {
@@ -352,7 +353,7 @@ function _spack_create {
if $list_options
then
compgen -W "-h --help --keep-stage -n --name -t --template -r --repo
-N --namespace -f --force" -- "$cur"
-N --namespace -f --force --skip-editor" -- "$cur"
fi
}

View File

@@ -78,5 +78,5 @@ unsetenv {{ cmd.name }}
{% endblock %}
{% block footer %}
{# In case he module needs to be extended with custom TCL code #}
{# In case the module needs to be extended with custom TCL code #}
{% endblock %}

View File

@@ -13,7 +13,7 @@ class Amg(MakefilePackage):
"""
tags = ['proxy-app', 'ecp-proxy-app']
homepage = "https://computation.llnl.gov/projects/co-design/amg2013"
homepage = "https://computing.llnl.gov/projects/co-design/amg2013"
git = "https://github.com/LLNL/AMG.git"
version('develop', branch='master')

View File

@@ -14,11 +14,11 @@ class Amg2013(MakefilePackage):
in the Center for Applied Scientific Computing (CASC) at LLNL.
"""
tags = ['proxy-app']
homepage = "https://computation.llnl.gov/projects/co-design/amg2013"
url = "https://computation.llnl.gov/projects/co-design/download/amg2013.tgz"
homepage = "https://computing.llnl.gov/projects/co-design/amg2013"
url = "https://computing.llnl.gov/projects/co-design/download/amg2013.tgz"
version('master', '9d918d2a69528b83e6e0aba6ba601fef',
url='https://computation.llnl.gov/projects/co-design/download/amg2013.tgz')
url='https://computing.llnl.gov/projects/co-design/download/amg2013.tgz')
variant('openmp', default=True, description='Build with OpenMP support')
variant('assumedpartition', default=False, description='Use assumed partition (for thousands of processors)')

View File

@@ -91,3 +91,12 @@ def install_headers(self):
for current_file in glob.glob(join_path(self.build_directory,
'bfd', '*.h')):
install(current_file, extradir)
def flag_handler(self, name, flags):
# To ignore the errors of narrowing conversions for
# the Fujitsu compiler
if name == 'cxxflags'\
and (self.compiler.name == 'fj' or self.compiler.name == 'clang')\
and self.version <= ver('2.31.1'):
flags.append('-Wno-narrowing')
return (flags, None, None)

View File

@@ -201,9 +201,11 @@ def determine_toolset(self, spec):
toolsets = {'g++': 'gcc',
'icpc': 'intel',
'clang++': 'clang',
'armclang++': 'clang',
'xlc++': 'xlcpp',
'xlc++_r': 'xlcpp',
'pgc++': 'pgi'}
'pgc++': 'pgi',
'FCC': 'clang'}
if spec.satisfies('@1.47:'):
toolsets['icpc'] += '-linux'
@@ -228,7 +230,12 @@ def bjam_python_line(self, spec):
def determine_bootstrap_options(self, spec, with_libs, options):
boost_toolset_id = self.determine_toolset(spec)
options.append('--with-toolset=%s' % boost_toolset_id)
# Arm compiler bootstraps with 'gcc' (but builds as 'clang')
if spec.satisfies('%arm'):
options.append('--with-toolset=gcc')
else:
options.append('--with-toolset=%s' % boost_toolset_id)
options.append("--with-libraries=%s" % ','.join(with_libs))
if '+python' in spec:

View File

@@ -14,6 +14,7 @@ class Bowtie2(Package):
homepage = "bowtie-bio.sourceforge.net/bowtie2/index.shtml"
url = "http://downloads.sourceforge.net/project/bowtie-bio/bowtie2/2.3.1/bowtie2-2.3.1-source.zip"
version('2.3.5.1', sha256='335c8dafb1487a4a9228ef922fbce4fffba3ce8bc211e2d7085aac092155a53f')
version('2.3.5', sha256='2b6b2c46fbb5565ba6206b47d07ece8754b295714522149d92acebefef08347b')
version('2.3.4.1', '8371bbb6eb02ae99c5cf633054265cb9')
version('2.3.1', 'b4efa22612e98e0c23de3d2c9f2f2478')
@@ -29,9 +30,18 @@ class Bowtie2(Package):
patch('bowtie2-2.2.5.patch', when='@2.2.5', level=0)
patch('bowtie2-2.3.1.patch', when='@2.3.1', level=0)
patch('bowtie2-2.3.0.patch', when='@2.3.0', level=0)
resource(name='simde', git="https://github.com/nemequ/simde",
destination='.', when='target=aarch64')
# seems to have trouble with GCC 6's -std=gnu++14
conflicts('%gcc@6:', when='@:2.3.1')
conflicts('@:2.3.5.0', when='target=aarch64')
def patch(self):
if self.spec.satisfies('target=aarch64'):
copy_tree('simde', 'third_party/simde')
if self.spec.satisfies('%gcc@:4.8.9 target=aarch64'):
filter_file('-fopenmp-simd', '', 'Makefile')
@run_before('install')
def filter_sbang(self):
@ -55,7 +65,10 @@ def filter_sbang(self):
filter_file(match, substitute, *files, **kwargs)
def install(self, spec, prefix):
make()
make_arg = []
if self.spec.satisfies('target=aarch64'):
make_arg.append('POPCNT_CAPABILITY=0')
make(*make_arg)
mkdirp(prefix.bin)
for bow in glob("bowtie2*"):
install(bow, prefix.bin)

View File

@ -13,12 +13,11 @@ class Bzip2(Package):
compressors), whilst being around twice as fast at compression
and six times faster at decompression."""
# FIXME: The bzip.org domain has expired:
# https://lwn.net/Articles/762264/
# This package will need to be updated when a new home is found.
homepage = "https://sourceware.org/bzip2/"
url = "https://fossies.org/linux/misc/bzip2-1.0.6.tar.gz"
url = "https://sourceware.org/pub/bzip2/bzip2-1.0.8.tar.gz"
version('1.0.8', 'ab5a03176ee106d3f0fa90e381da478ddae405918153cca248e682cd0c4a2269')
version('1.0.7', 'e768a87c5b1a79511499beb41500bcc4caf203726fff46a6f5f9ad27fe08ab2b')
version('1.0.6', '00b516f4704d4a7cb50a1d97e6e8e15b')
variant('shared', default=True, description='Enables the build of shared libraries.')

View File

@ -22,6 +22,7 @@ class Cfitsio(AutotoolsPackage):
variant('bzip2', default=True, description='Enable bzip2 support')
variant('shared', default=True, description='Build shared libraries')
depends_on('curl')
depends_on('bzip2', when='+bzip2')
def url_for_version(self, version):

View File

@ -13,6 +13,7 @@ class Cmake(Package):
url = 'https://github.com/Kitware/CMake/releases/download/v3.13.0/cmake-3.13.0.tar.gz'
maintainers = ['chuckatkins']
version('3.14.5', sha256='505ae49ebe3c63c595fa5f814975d8b72848447ee13b6613b0f8b96ebda18c06')
version('3.14.4', sha256='00b4dc9b0066079d10f16eed32ec592963a44e7967371d2f5077fd1670ff36d9')
version('3.14.3', sha256='215d0b64e81307182b29b63e562edf30b3875b834efdad09b3fcb5a7d2f4b632')
version('3.14.2', sha256='a3cbf562b99270c0ff192f692139e98c605f292bfdbc04d70da0309a5358e71e')

View File

@ -63,14 +63,8 @@ def install(self, spec, prefix):
@property
def libs(self):
prefix = self.prefix
search_paths = [(prefix.lib, False), (prefix.lib64, False),
(prefix, True)]
for search_root, recursive in search_paths:
libs = find_libraries(
'libcuda', root=search_root, shared=True, recursive=recursive)
if libs:
break
libs = find_libraries('libcuda', root=self.prefix, shared=True,
recursive=True)
filtered_libs = []
# CUDA 10.0 provides Compatibility libraries for running newer versions

View File

@ -17,6 +17,11 @@ class Diffsplice(MakefilePackage):
version('0.1.2beta', 'a1df6e0b50968f2c229d5d7f97327336')
version('0.1.1', 'be90e6c072402d5aae0b4e2cbb8c10ac')
def edit(self, spec, prefix):
if spec.satisfies('target=aarch64'):
makefile = FileFilter(join_path(self.build_directory, 'Makefile'))
makefile.filter('-m64', '')
def install(self, spec, prefix):
mkdirp(prefix.bin)
install('diffsplice', prefix.bin)

View File

@ -0,0 +1,34 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class Dimemas(AutotoolsPackage):
"""High-abstracted network simulator for message-passing programs."""
homepage = "https://tools.bsc.es/dimemas"
url = "https://github.com/bsc-performance-tools/dimemas/archive/5.4.1.tar.gz"
version('5.4.1', sha256='10ddca3745a56ebab5c1ba180f6f4bce5832c4deac50c1b1dc08271db5c7cafa')
depends_on('autoconf', type='build')
depends_on('automake', type='build')
depends_on('libtool', type='build')
depends_on('m4', type='build')
depends_on('bison', type=('build', 'link', 'run'))
depends_on('flex', type=('build', 'link', 'run'))
depends_on('boost@1.65.0+program_options cxxstd=11', type=('build', 'link'))
def autoreconf(self, spec, prefix):
autoreconf('--install', '--verbose', '--force')
def configure_args(self):
args = ["--with-boost=%s" % self.spec['boost'].prefix,
"--with-boost-libdir=%s" % self.spec['boost'].prefix.lib,
"LEXLIB=-l:libfl.a"]
return args

View File

@ -16,6 +16,7 @@ class EnvironmentModules(Package):
maintainers = ['xdelaruelle']
version('4.2.5', sha256='3375b454568e7bbec7748cd6173516ef9f30a3d8e13c3e99c02794a6a3bc3c8c')
version('4.2.4', sha256='416dda94141e4778356e2aa9ba8687e6522a1eb197b3caf00a82e5fa2707709a')
version('4.2.3', sha256='f667134cca8e2c75e12a00b50b5605d0d8068efa5b821f5b3c3f0df06fd79411')
version('4.2.2', sha256='481fe8d03eec6806c1b401d764536edc2c233ac9d82091a10094de6101867afc')

View File

@ -15,13 +15,15 @@ class Ferret(Package):
homepage = "http://ferret.pmel.noaa.gov/Ferret/home"
url = "ftp://ftp.pmel.noaa.gov/ferret/pub/source/fer_source.v696.tar.gz"
version('7.2', '21c339b1bafa6939fc869428d906451f130f7e77e828c532ab9488d51cf43095')
version('6.96', '51722027c864369f41bab5751dfff8cc')
depends_on("hdf5~mpi~fortran")
depends_on("netcdf~mpi")
depends_on("hdf5+hl")
depends_on("netcdf")
depends_on("netcdf-fortran")
depends_on("readline")
depends_on("zlib")
depends_on("libx11")
def url_for_version(self, version):
return "ftp://ftp.pmel.noaa.gov/ferret/pub/source/fer_source.v{0}.tar.gz".format(
@ -57,36 +59,27 @@ def patch(self):
filter_file(r'-lm',
'-lgfortran -lm',
'FERRET/platform_specific.mk.x86_64-linux')
filter_file(r'\$\(NETCDF4_DIR\)/lib64/libnetcdff.a',
"-L%s -lnetcdff" % self.spec['netcdf-fortran'].prefix.lib,
'FERRET/platform_specific.mk.x86_64-linux')
filter_file(r'\$\(NETCDF4_DIR\)/lib64/libnetcdf.a',
"-L%s -lnetcdf" % self.spec['netcdf'].prefix.lib,
'FERRET/platform_specific.mk.x86_64-linux')
filter_file(r'\$\(HDF5_DIR\)/lib64/libhdf5_hl.a',
"-L%s -lhdf5_hl" % self.spec['hdf5'].prefix.lib,
'FERRET/platform_specific.mk.x86_64-linux')
filter_file(r'\$\(HDF5_DIR\)/lib64/libhdf5.a',
"-L%s -lhdf5" % self.spec['hdf5'].prefix.lib,
'FERRET/platform_specific.mk.x86_64-linux')
def install(self, spec, prefix):
hdf5_prefix = spec['hdf5'].prefix
netcdff_prefix = spec['netcdf-fortran'].prefix
netcdf_prefix = spec['netcdf'].prefix
libz_prefix = spec['zlib'].prefix
ln = which('ln')
ln('-sf',
hdf5_prefix + '/lib',
hdf5_prefix + '/lib64')
ln('-sf',
netcdff_prefix + '/lib',
netcdff_prefix + '/lib64')
ln('-sf',
netcdf_prefix + '/lib/libnetcdf.a',
netcdff_prefix + '/lib/libnetcdf.a')
ln('-sf',
netcdf_prefix + '/lib/libnetcdf.la',
netcdff_prefix + '/lib/libnetcdf.la')
ln('-sf',
libz_prefix + '/lib',
libz_prefix + '/lib64')
if 'LDFLAGS' in env and env['LDFLAGS']:
env['LDFLAGS'] += ' ' + '-lquadmath'
else:
env['LDFLAGS'] = '-lquadmath'
with working_dir('FERRET', create=False):
os.environ['LD_X11'] = '-L/usr/lib/X11 -lX11'
os.environ['LD_X11'] = '-L%s -lX11' % spec['libx11'].prefix.lib
os.environ['HOSTTYPE'] = 'x86_64-linux'
make(parallel=False)
make("install")

View File

@ -425,7 +425,7 @@ def install_links(self):
# Make build log visible - it contains OpenFOAM-specific information
with working_dir(self.projectdir):
os.symlink(
join_path('.spack', 'build.out'),
join_path(os.path.relpath(self.install_log_path)),
join_path('log.' + str(self.foam_arch)))
# -----------------------------------------------------------------------------

View File

@ -15,6 +15,7 @@ class Gdb(AutotoolsPackage):
homepage = "https://www.gnu.org/software/gdb"
url = "https://ftpmirror.gnu.org/gdb/gdb-7.10.tar.gz"
version('8.3', sha256='b2266ec592440d0eec18ee1790f8558b3b8a2845b76cc83a872e39b501ce8a28')
version('8.2.1', sha256='0107985f1edb8dddef6cdd68a4f4e419f5fec0f488cc204f0b7d482c0c6c9282')
version('8.2', '0783c6d86775c5aff06cccc8a3d7cad8')
version('8.1', '0c85ecbb43569ec43b1c9230622e84ab')

View File

@ -13,6 +13,7 @@ class Hpx(CMakePackage, CudaPackage):
homepage = "http://stellar.cct.lsu.edu/tag/hpx/"
url = "https://github.com/STEllAR-GROUP/hpx/archive/1.2.1.tar.gz"
version('1.3.0', sha256='cd34da674064c4cc4a331402edbd65c5a1f8058fb46003314ca18fa08423c5ad')
version('1.2.1', sha256='8cba9b48e919035176d3b7bbfc2c110df6f07803256626f1dad8d9dde16ab77a')
version('1.2.0', sha256='20942314bd90064d9775f63b0e58a8ea146af5260a4c84d0854f9f968077c170')
version('1.1.0', sha256='1f28bbe58d8f0da600d60c3a74a644d75ac777b20a018a5c1c6030a470e8a1c9')

View File

@ -13,7 +13,7 @@ class Hypre(Package):
features parallel multigrid methods for both structured and
unstructured grid problems."""
homepage = "http://computation.llnl.gov/project/linear_solvers/software.php"
homepage = "http://computing.llnl.gov/project/linear_solvers/software.php"
url = "https://github.com/hypre-space/hypre/archive/v2.14.0.tar.gz"
git = "https://github.com/hypre-space/hypre.git"
@ -71,7 +71,7 @@ def url_for_version(self, version):
if version >= Version('2.12.0'):
url = 'https://github.com/hypre-space/hypre/archive/v{0}.tar.gz'
else:
url = 'http://computation.llnl.gov/project/linear_solvers/download/hypre-{0}.tar.gz'
url = 'http://computing.llnl.gov/project/linear_solvers/download/hypre-{0}.tar.gz'
return url.format(version)
@ -150,12 +150,7 @@ def libs(self):
"""Export the hypre library.
Sample usage: spec['hypre'].libs.ld_flags
"""
search_paths = [[self.prefix.lib, False], [self.prefix.lib64, False],
[self.prefix, True]]
is_shared = '+shared' in self.spec
for path, recursive in search_paths:
libs = find_libraries('libHYPRE', root=path,
shared=is_shared, recursive=recursive)
if libs:
return libs
return None
libs = find_libraries('libHYPRE', root=self.prefix, shared=is_shared,
recursive=True)
return libs or None

View File

@ -129,6 +129,13 @@ class IntelParallelStudio(IntelPackage):
multi=False
)
auto_dispatch_options = IntelPackage.auto_dispatch_options
variant(
'auto_dispatch',
values=any_combination_of(*auto_dispatch_options),
description='Enable generation of multiple auto-dispatch code paths'
)
# Components available in all editions
variant('daal', default=True,
description='Install the Intel DAAL libraries')
@ -186,6 +193,12 @@ class IntelParallelStudio(IntelPackage):
conflicts('+daal', when='@cluster.0:cluster.2015.7')
conflicts('+daal', when='@composer.0:composer.2015.7')
# MacOS does not support some of the auto dispatch settings
conflicts('auto_dispatch=SSE2', 'platform=darwin',
msg='SSE2 is not supported on MacOS')
conflicts('auto_dispatch=SSE3', 'platform=darwin target=x86_64',
msg='SSE3 is not supported on MacOS x86_64')
def setup_dependent_environment(self, *args):
# Handle in callback, conveying client's compilers in additional arg.
# CAUTION - DUP code in:

View File

@ -43,5 +43,18 @@ class Intel(IntelPackage):
variant('rpath', default=True, description='Add rpath to .cfg files')
auto_dispatch_options = IntelPackage.auto_dispatch_options
variant(
'auto_dispatch',
values=any_combination_of(*auto_dispatch_options),
description='Enable generation of multiple auto-dispatch code paths'
)
# MacOS does not support some of the auto dispatch settings
conflicts('auto_dispatch=SSE2', 'platform=darwin',
msg='SSE2 is not supported on MacOS')
conflicts('auto_dispatch=SSE3', 'platform=darwin target=x86_64',
msg='SSE3 is not supported on MacOS x86_64')
# Since the current package is a subset of 'intel-parallel-studio',
# all remaining Spack actions are handled in the package class.

View File

@ -36,6 +36,8 @@ class Jdk(Package):
# found in a link above. The build number can be deciphered from the URL.
# Alternatively, run `bin/java -version` after extracting. Replace '+'
# symbol in version with '_', otherwise it will be interpreted as a variant
version('12.0.1_12', sha256='9fd6dcdaf2cfca7da59e39b009a0f5bcd53bec2fb16105f7ca8d689cdab68d75', curl_options=curl_options,
url='https://download.oracle.com/otn-pub/java/jdk/12.0.1+12/69cfe15208a647278a19ef0990eea691/jdk-12.0.1_linux-x64_bin.tar.gz')
version('11.0.2_9', sha256='7b4fd8ffcf53e9ff699d964a80e4abf9706b5bdb5644a765c2b96f99e3a2cdc8', curl_options=curl_options,
url='http://download.oracle.com/otn-pub/java/jdk/11.0.2+9/f51449fcd52f4d52b93a989c5c56ed3c/jdk-11.0.2_linux-x64_bin.tar.gz')
version('11.0.1_13', sha256='e7fd856bacad04b6dbf3606094b6a81fa9930d6dbb044bbd787be7ea93abc885', curl_options=curl_options,
@ -60,6 +62,7 @@ class Jdk(Package):
version('1.7.0_80-b0', '6152f8a7561acf795ca4701daa10a965', curl_options=curl_options)
provides('java')
provides('java@12', when='@12.0:12.999')
provides('java@11', when='@11.0:11.999')
provides('java@10', when='@10.0:10.999')
provides('java@9', when='@9.0:9.999')

View File

@ -0,0 +1,41 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class Jube(PythonPackage):
"""The Juelich benchmarking environment (JUBE) provides a script based
framework to easily create benchmark sets, run those sets on different
computer systems and evaluate the results."""
homepage = "https://www.fz-juelich.de/jsc/jube/"
url = "https://apps.fz-juelich.de/jsc/jube/jube2/download.php?version=2.2.2"
version('2.2.2', sha256='135bc03cf07c4624ef2cf581ba5ec52eb44ca1dac15cffb83637e86170980477', extension="tar.gz")
version('2.2.1', sha256='68751bf2e17766650ccddb7a5321dd1ac8b34ffa3585db392befbe9ff180ddd9', extension="tar.gz")
version('2.2.0', sha256='bc825884fc8506d0fb7b3b5cbb5ad4c7e82b1fe1d7ec861ca33699adfc8100f1', extension="tar.gz")
version('2.1.4', sha256='13da3213db834ed2f3a04fedf20a24c4a11b76620e18fed0a0bbcb7484f980bb', extension="tar.gz")
version('2.1.3', sha256='ccc7af95eb1e3f63c52a26db08ef9986f7cc500df7a51af0c5e14ed4e7431ad6', extension="tar.gz")
version('2.1.2', sha256='4ff1c4eabaaa71829e46e4fb4092a88675f8c2b6708d5ec2b12f991dd9a4de2d', extension="tar.gz")
version('2.1.1', sha256='0c48ce4cb9300300d115ae428b1843c4229a54eb286ab0ced953e96ed3f2b9b2', extension="tar.gz")
version('2.1.0', sha256='eb9c542b9eb760ea834459a09f8be55891e993a40e277d325bc093b040921e23', extension="tar.gz")
version('2.0.1', sha256='ef3c4de8b2353ec0ee229428b1ef1912bc3019b72d4e78be00eecd1f384aeec0', extension="tar.gz")
version('2.0.0', sha256='ecfe8717bc61f35f333bc24d27b39e78e67c596e23512bdd97c9b4f28491f0b3', extension="tar.gz")
variant(
'resource_manager', default='slurm',
description='Select resource manager templates',
values=('loadleveler', 'lsf', 'moab', 'pbs', 'slurm'), multi=False
)
depends_on('py-setuptools', type='build')
def setup_environment(self, spack_env, run_env):
run_env.prepend_path(
'JUBE_INCLUDE_PATH',
prefix + "/platform/" +
self.spec.variants['resource_manager'].value
)

View File

@ -10,8 +10,8 @@ class Kripke(CMakePackage):
"""Kripke is a simple, scalable, 3D Sn deterministic particle
transport proxy/mini app.
"""
homepage = "https://computation.llnl.gov/projects/co-design/kripke"
url = "https://computation.llnl.gov/projects/co-design/download/kripke-openmp-1.1.tar.gz"
homepage = "https://computing.llnl.gov/projects/co-design/kripke"
url = "https://computing.llnl.gov/projects/co-design/download/kripke-openmp-1.1.tar.gz"
tags = ['proxy-app']
version('1.1', '7fe6f2b26ed983a6ce5495ab701f85bf')

View File

@ -14,7 +14,7 @@ class Laghos(MakefilePackage):
"""
tags = ['proxy-app', 'ecp-proxy-app']
homepage = "https://computation.llnl.gov/projects/co-design/laghos"
homepage = "https://computing.llnl.gov/projects/co-design/laghos"
url = "https://github.com/CEED/Laghos/archive/v1.0.tar.gz"
git = "https://github.com/CEED/Laghos.git"

View File

@ -16,8 +16,8 @@ class Lcals(MakefilePackage):
by Frank H. McMahon, UCRL-53745.). The suite contains facilities to
generate timing statistics and reports."""
homepage = "https://computation.llnl.gov/projects/co-design/lcals"
url = "https://computation.llnl.gov/projects/co-design/download/lcals-v1.0.2.tgz"
homepage = "https://computing.llnl.gov/projects/co-design/lcals"
url = "https://computing.llnl.gov/projects/co-design/download/lcals-v1.0.2.tgz"
tags = ['proxy-app']

View File

@ -36,7 +36,8 @@ class Libfabric(AutotoolsPackage):
'udp',
'rxm',
'rxd',
'mlx')
'mlx',
'tcp')
variant(
'fabrics',

View File

@ -6,7 +6,7 @@
from spack import *
class Luit(Package):
class Luit(AutotoolsPackage):
"""Luit is a filter that can be run between an arbitrary application and
a UTF-8 terminal emulator such as xterm. It will convert application
output from the locale's encoding into UTF-8, and convert terminal
@ -23,10 +23,6 @@ class Luit(Package):
depends_on('pkgconfig', type='build')
depends_on('util-macros', type='build')
def install(self, spec, prefix):
configure('--prefix={0}'.format(prefix),
# see http://www.linuxquestions.org/questions/linux-from-scratch-13/can't-compile-luit-xorg-applications-4175476308/ # noqa
'CFLAGS=-U_XOPEN_SOURCE -D_XOPEN_SOURCE=600')
make()
make('install')
# see http://www.linuxquestions.org/questions/linux-from-scratch-13/can't-compile-luit-xorg-applications-4175476308/ # noqa
def configure_args(self):
return ['CFLAGS=-U_XOPEN_SOURCE -D_XOPEN_SOURCE=600']

View File

@ -12,7 +12,7 @@ class Lulesh(MakefilePackage):
code to only solve a Sedov blast problem with analytic answer
"""
tags = ['proxy-app']
homepage = "https://computation.llnl.gov/projects/co-design/lulesh"
homepage = "https://computing.llnl.gov/projects/co-design/lulesh"
git = "https://github.com/LLNL/LULESH.git"
version('2.0.3', tag='2.0.3')

View File

@ -11,7 +11,7 @@ class Macsio(CMakePackage):
tags = ['proxy-app', 'ecp-proxy-app']
homepage = "https://computation.llnl.gov/projects/co-design/macsio"
homepage = "https://computing.llnl.gov/projects/co-design/macsio"
url = "https://github.com/LLNL/MACSio/archive/v1.1.tar.gz"
git = "https://github.com/LLNL/MACSio.git"

View File

@ -29,11 +29,3 @@ def install(self, spec, prefix):
'-targetdir={0}'.format(prefix),
'-execdir={0}'.format(prefix.bin),
'-selinux=y')
# after the install phase completes, spack tries to install build.out
# into <prefix>/.spack, but the .spack dir will not exist, causing the
# build to fail. package.py:1690 seems to show that the dir is created
# right before writing build.out -- possible bug?
# creating the .spack dir right after installing prevents explosions
mkdirp(join_path(prefix, '.spack'))

View File

@ -0,0 +1,20 @@
Common subdirectories: spack-src/cub and spack-src.mod/cub
Common subdirectories: spack-src/example and spack-src.mod/example
Common subdirectories: spack-src/lib_idba and spack-src.mod/lib_idba
diff -u spack-src/Makefile spack-src.mod/Makefile
--- spack-src/Makefile 2018-11-02 10:14:27.000000000 +0900
+++ spack-src.mod/Makefile 2019-07-09 15:53:34.685959951 +0900
@@ -48,6 +48,12 @@
CPU_ARCH_SUFFIX = ppc64
CPU_ARCH = -mpowerpc64
endif
+IS_AARCH64 := $(shell echo `$(CXX) -v 2>&1 | grep aarch64 | wc -l`)
+ifneq (0, $(IS_AARCH64))
+ CPU_ARCH_SUFFIX = aarch64
+ CPU_ARCH =
+ disablempopcnt=1
+endif
#-------------------------------------------------------------------------------
# Includes
Common subdirectories: spack-src/tools and spack-src.mod/tools
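The patch's architecture probe can be reproduced in plain shell. Here the compiler banner is faked with a literal string (an assumption of this sketch) instead of running `$CXX -v 2>&1` as the Makefile does:

```shell
# Count how many lines of the compiler's verbose banner mention aarch64,
# exactly as the Makefile's $(shell ...) probe does.
banner="Target: aarch64-unknown-linux-gnu"
IS_AARCH64=$(printf '%s\n' "$banner" | grep aarch64 | wc -l)
if [ "$IS_AARCH64" -ne 0 ]; then
    CPU_ARCH_SUFFIX=aarch64
    CPU_ARCH=""
fi
echo "suffix=$CPU_ARCH_SUFFIX"
```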

View File

@ -18,6 +18,8 @@ class Megahit(MakefilePackage):
depends_on('zlib')
patch('amd.patch', when='target=aarch64')
def install(self, spec, prefix):
mkdirp(prefix.bin)
install('megahit', prefix.bin)

View File

@ -12,7 +12,7 @@ class Motif(AutotoolsPackage):
specification and the widget toolkit
"""
homepage = "http://motif.ics.com/"
url = "http://cfhcable.dl.sourceforge.net/project/motif/Motif 2.3.8 Source Code/motif-2.3.8.tar.gz"
url = "http://cfhcable.dl.sourceforge.net/project/motif/Motif%202.3.8%20Source%20Code/motif-2.3.8.tar.gz"
version('2.3.8', '7572140bb52ba21ec2f0c85b2605e2b1')

View File

@ -19,6 +19,7 @@ class Mpich(AutotoolsPackage):
list_depth = 1
version('develop', submodules=True)
version('3.3.1', 'fe551ef29c8eea8978f679484441ed8bb1d943f6ad25b63c235d4b9243d551e5')
version('3.3', '574af413dc0dc7fbb929a761822beb06')
version('3.2.1', 'e175452f4d61646a52c73031683fc375')
version('3.2', 'f414cfa77099cd1fa1a5ae4e22db508a')
@ -78,9 +79,11 @@ class Mpich(AutotoolsPackage):
# Fix SLURM node list parsing
# See https://github.com/pmodels/mpich/issues/3572
# and https://github.com/pmodels/mpich/pull/3578
# Even though there is no version 3.3.0, we need to specify 3.3:3.3.0 in
# the when clause, otherwise the patch will be applied to 3.3.1, too.
patch('https://github.com/pmodels/mpich/commit/b324d2de860a7a2848dc38aefb8c7627a72d2003.patch',
sha256='c7d4ecf865dccff5b764d9c66b6a470d11b0b1a5b4f7ad1ffa61079ad6b5dede',
when='@3.3')
when='@3.3:3.3.0')
depends_on('findutils', type='build')
depends_on('pkgconfig', type='build')
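The comment in the hunk above hinges on how Spack version ranges match: a range endpoint also matches any version it is a prefix of, so `@3.3` alone would match 3.3.1 while `@3.3:3.3.0` does not. A simplified model of that rule (not Spack's actual implementation), with versions as plain tuples:

```python
def satisfies(version, lo, hi):
    """True if `version` lies in the range [lo, hi], where an endpoint
    also matches any version it is a prefix of (simplified model)."""
    low_ok = version >= lo or version[:len(lo)] == lo
    high_ok = version <= hi or version[:len(hi)] == hi
    return low_ok and high_ok
```

Under this model `satisfies((3, 3, 1), (3, 3), (3, 3))` is true — which is why the stricter upper bound 3.3.0 is needed to exclude 3.3.1.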

View File

@ -0,0 +1,41 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class Nim(Package):
"""Nim is a statically typed compiled systems programming language.
It combines successful concepts from mature languages like Python,
Ada and Modula."""
homepage = "https://nim-lang.org/"
url = "https://nim-lang.org/download/nim-0.20.0.tar.xz"
version('0.20.0', sha256='4be00d7dd47220da508b30ce3b4deade4ba39800ea4a1e574018b3c733359780')
version('0.19.6', sha256='a09f0c58d29392434d4fd6d15d4059cf7e013ae948413cb9233b8233d67e3a29')
version('0.19.9', sha256='154c440cb8f27da20b3d6b1a8cc03f63305073fb995bbf26ec9bc6ad891ce276',
url='https://github.com/nim-lang/nightlies/releases/download/2019-06-02-devel-1255b3c/nim-0.19.9-linux_x64.tar.xz')
depends_on('pcre', type='build')
depends_on('openssl', type='build')
def install(self, spec, prefix):
bash = which('bash')
bash('./build.sh')
nim = Executable(join_path('.', 'bin/nim'))
nim('c', 'koch')
koch = Executable(join_path('.', 'koch'))
koch('tools')
install_tree('bin', prefix.bin)
install_tree('lib', prefix.lib)
install_tree('compiler', prefix.compiler)
install_tree('config', prefix.config)
install_tree('doc', prefix.doc)
install('compiler.nimble', prefix)

View File

@ -344,7 +344,7 @@ def install_links(self):
# Make build log visible - it contains OpenFOAM-specific information
with working_dir(self.projectdir):
os.symlink(
join_path('.spack', 'build.out'),
join_path(os.path.relpath(self.install_log_path)),
join_path('log.' + str(self.foam_arch)))
if not self.config['link']:

View File

@ -258,9 +258,9 @@ class Openfoam(Package):
maintainers = ['olesenm']
homepage = "http://www.openfoam.com/"
url = "https://sourceforge.net/projects/openfoamplus/files/v1706/OpenFOAM-v1706.tgz"
url = "https://sourceforge.net/projects/openfoam/files/v1906/OpenFOAM-v1906.tgz"
git = "https://develop.openfoam.com/Development/OpenFOAM-plus.git"
list_url = "https://sourceforge.net/projects/openfoamplus/files/"
list_url = "https://sourceforge.net/projects/openfoam/files/"
list_depth = 2
version('develop', branch='develop', submodules='True')
@ -724,7 +724,7 @@ def install_links(self):
# Make build log visible - it contains OpenFOAM-specific information
with working_dir(self.projectdir):
os.symlink(
join_path('.spack', 'build.out'),
join_path(os.path.relpath(self.install_log_path)),
join_path('log.' + str(self.foam_arch)))
if not self.config['link']:

View File

@ -42,7 +42,7 @@ class Picard(Package):
def install(self, spec, prefix):
mkdirp(prefix.bin)
# The list of files to install varies with release...
# ... but skip the spack-{build.env}.out files.
# ... but skip the spack-build-{env}out.txt files.
files = [x for x in glob.glob("*") if not re.match("^spack-", x)]
for f in files:
install(f, prefix.bin)

View File

@ -0,0 +1,51 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class Powerapi(AutotoolsPackage):
"""This software is a reference implementation of the PowerAPI"""
homepage = "https://powerapi.sandia.gov/"
git = "https://github.com/pwrapi/pwrapi-ref.git"
version('1.1.1', commit='93f66dfa29f014067823f2b790a1862e5841a11c')
variant('hwloc', default=False, description='Build hwloc support')
variant('debug', default=False, description='Enable debug support')
variant('mpi', default=False, description='Enable MPI support')
variant('gnu-ld', default=False, description='Assume the GNU compiler uses gnu-ld')
depends_on('autoconf')
depends_on('automake')
depends_on('libtool')
depends_on('m4')
depends_on('hwloc', when='+hwloc')
depends_on('mpi', when='+mpi')
def autoreconf(self, spec, prefix):
bash = which('bash')
bash('./autogen.sh')
def configure_args(self):
spec = self.spec
args = ['--prefix={0}'.format(self.prefix)]
if '+hwloc' in spec:
args.append('--with-hwloc={0}'.format(spec['hwloc'].prefix))
if '+mpi' in spec:
args.append('--with-mpi={0}'.format(spec['mpi'].prefix))
args.extend([
'--with%s-gnu-ld' % ('' if '+gnu-ld' in spec else 'out'),
'--%sable-debug' % ('en' if '+debug' in spec else 'dis')
])
return args
def install(self, spec, prefix):
make('install')
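The `configure_args` in the new package above builds `--with/--without` and `--enable/--disable` switches with `%`-format ternaries. A standalone sketch of just that string construction, using a set of enabled variant names as a stand-in for Spack's `'+debug' in spec` checks:

```python
def toggle_args(enabled):
    """Build autoconf-style toggle flags from enabled variant names."""
    return [
        # '' -> --with-gnu-ld, 'out' -> --without-gnu-ld
        '--with%s-gnu-ld' % ('' if 'gnu-ld' in enabled else 'out'),
        # 'en' -> --enable-debug, 'dis' -> --disable-debug
        '--%sable-debug' % ('en' if 'debug' in enabled else 'dis'),
    ]
```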

View File

@ -19,6 +19,7 @@ class Precice(CMakePackage):
maintainers = ['fsimonis', 'MakisH']
version('develop', branch='develop')
version('1.5.2', sha256='051e0d7655a91f8681901e5c92812e48f33a5779309e2f104c99f5a687e1a418')
version('1.5.1', sha256='fbe151f1a9accf9154362c70d15254935d4f594d189982c3a99fdb3dd9d9e665')
version('1.5.0', sha256='a2a794becd08717e3049252134ae35692fed71966ed32e22cca796a169c16c3e')
version('1.4.1', sha256='dde4882edde17882340f9f601941d110d5976340bd71af54c6e6ea22ae56f1a5')

View File

@ -0,0 +1,21 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class PyAntlr4Python3Runtime(PythonPackage):
"""This package provides runtime libraries required to use
parsers generated for the Python3 language by version 4 of
ANTLR (ANother Tool for Language Recognition).
"""
homepage = "https://www.antlr.org"
url = "https://pypi.io/packages/source/a/antlr4-python3-runtime/antlr4-python3-runtime-4.7.2.tar.gz"
version('4.7.2', sha256='168cdcec8fb9152e84a87ca6fd261b3d54c8f6358f42ab3b813b14a7193bb50b')
depends_on('python@3:', type=('build', 'run'))
depends_on('py-setuptools', type='build')

View File

@ -14,6 +14,7 @@ class PyAstropy(PythonPackage):
homepage = 'http://www.astropy.org/'
url = 'https://pypi.io/packages/source/a/astropy/astropy-1.1.2.tar.gz'
version('3.2.1', '706c0457789c78285e5464a5a336f5f0b058d646d60f4e5f5ba1f7d5bf424b28')
version('1.1.2', 'cbe32023b5b1177d1e2498a0d00cda51')
version('1.1.post1', 'b52919f657a37d45cc45f5cb0f58c44d')

View File

@ -19,9 +19,10 @@ class PyBasemap(PythonPackage):
# Per Github issue #3813, setuptools is required at runtime in order
# to make mpl_toolkits a namespace package that can span multiple
# directories (i.e., matplotlib and basemap)
depends_on('py-setuptools', type=('run'))
depends_on('py-setuptools', type=('build', 'run'))
depends_on('py-numpy', type=('build', 'run'))
depends_on('py-matplotlib', type=('build', 'run'))
depends_on('py-pyproj@:1.99', type=('build', 'run'), when='@:1.2.0')
depends_on('py-pyproj', type=('build', 'run'))
depends_on('py-pyshp', type=('build', 'run'))
depends_on('pil', type=('build', 'run'))

View File

@ -10,8 +10,9 @@ class PyJsonschema(PythonPackage):
"""Jsonschema: An(other) implementation of JSON Schema for Python."""
homepage = "http://github.com/Julian/jsonschema"
url = "https://pypi.io/packages/source/j/jsonschema/jsonschema-2.5.1.tar.gz"
url = "https://pypi.io/packages/source/j/jsonschema/jsonschema-2.6.0.tar.gz"
version('2.6.0', sha256='6ff5f3180870836cae40f06fa10419f557208175f13ad7bc26caa77beb1f6e02')
version('2.5.1', '374e848fdb69a3ce8b7e778b47c30640')
depends_on('py-setuptools', type='build')

View File

@ -12,6 +12,7 @@ class PyLarkParser(PythonPackage):
homepage = "https://github.com/lark-parser/lark/"
url = "https://pypi.io/packages/source/l/lark-parser/lark-parser-0.6.2.tar.gz"
version('0.7.1', sha256='8455e05d062fa7f9d59a2735583cf02291545f944955c4056bf1144c4e625344')
version('0.6.2', '675058937a7f41e661bcf2b3bfdb7ceb')
depends_on('py-setuptools', type='build')

View File

@ -11,10 +11,11 @@ class PyLibensemble(PythonPackage):
"""Library for managing ensemble-like collections of computations."""
homepage = "https://libensemble.readthedocs.io"
url = "https://pypi.io/packages/source/l/libensemble/libensemble-0.5.0.tar.gz"
url = "https://pypi.io/packages/source/l/libensemble/libensemble-0.5.1.tar.gz"
git = "https://github.com/Libensemble/libensemble.git"
version('develop', branch='develop')
version('0.5.1', sha256='522e0cc086a3ed75a101b704c0fe01eae07f2684bd8d6da7bdfe9371d3187362')
version('0.5.0', sha256='c4623171dee049bfaa38a9c433609299a56b1afb774db8b71321247bc7556b8f')
version('0.4.1', sha256='282c32ffb79d84cc80b5cc7043c202d5f0b8ebff10f63924752f092e3938db5e')
version('0.4.0', sha256='9384aa3a58cbc20bbd1c6fddfadb5e6a943d593a3a81c8665f030dbc6d76e76e')

View File

@ -19,10 +19,13 @@ class PyScikitImage(PythonPackage):
extends('python', ignore=r'bin/.*\.py$')
depends_on('py-dask', type=('build', 'run'))
depends_on('pil', type=('build', 'run'))
depends_on('py-pillow', type=('build', 'run'))
depends_on('py-networkx', type=('build', 'run'))
depends_on('py-six', type=('build', 'run'))
depends_on('py-numpy', type=('build', 'run'))
depends_on('py-scipy', type=('build', 'run'))
depends_on('py-matplotlib', type=('build', 'run'))
depends_on('py-pywavelets', type=('build', 'run'), when='@0.14:')
depends_on('py-cloudpickle', type=('build', 'run'), when='@0.14:')
depends_on('py-setuptools', type='build')
depends_on('py-cython@0.23.4:', type='build')

View File

@ -0,0 +1,22 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class PyStratify(PythonPackage):
"""Vectorized interpolators that are especially useful for
Nd vertical interpolation/stratification of atmospheric and
oceanographic datasets.
"""
homepage = "https://github.com/SciTools-incubator/python-stratify"
url = "https://pypi.io/packages/source/s/stratify/stratify-0.1.tar.gz"
version('0.1', sha256='5426f3b66e45e1010952d426e5a7be42cd45fe65f1cd73a98fee1eb7c110c6ee')
depends_on('py-setuptools', type='build')
depends_on('py-numpy', type=('build', 'run'))
depends_on('py-cython', type=('build', 'run'))

View File

@ -82,7 +82,7 @@ class Qt(Package):
# Fix build failure with newer versions of GCC
patch('https://github.com/qt/qtbase/commit/a52d7861edfb5956de38ba80015c4dd0b596259b.patch',
sha256='e10c871033568a9aed982628ed627356761f72f63c5fdaf11882dc147528e9ed',
sha256='c49b228c27e3ad46ec3af4bac0e9985af5b5b28760f238422d32e14f98e49b1e',
working_dir='qtbase',
when='@5.10:5.12.0 %gcc@9:')

View File

@ -12,9 +12,11 @@ class RBiocgenerics(RPackage):
homepage = "https://www.bioconductor.org/packages/BiocGenerics/"
git = "https://git.bioconductor.org/packages/BiocGenerics.git"
version('0.30.0', commit='fc7c3af4a5635a30988a062ed09332c13ca1d1a8')
version('0.26.0', commit='5b2a6df639e48c3cd53789e0b174aec9dda6b67d')
version('0.24.0', commit='3db111e8c1f876267da89f4f0c5406a9d5c31cd1')
version('0.22.1', commit='9c90bb8926885289d596a81ff318ee3745cbb6ad')
depends_on('r@3.4.0:3.4.9', when='@0.22.1', type=('build', 'run'))
depends_on('r@3.5.0:3.5.9', when='@0.26.0', type=('build', 'run'))
depends_on('r@3.6.0:', when='@0.30.0', type=('build', 'run'))

View File

@ -14,5 +14,5 @@ class RGraph(RPackage):
version('1.54.0', commit='2a8b08520096241620421078fc1098f4569c7301')
depends_on('r@3.4.0:3.4.9', when='@1.54.0')
depends_on('r@2.10:', type=('build', 'run'), when='@1.54.0')
depends_on('r-biocgenerics', type=('build', 'run'))

View File

@ -0,0 +1,26 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class RGrbase(RPackage):
"""gRbase: A Package for Graphical Modelling in R"""
homepage = "http://people.math.aau.dk/~sorenh/software/gR/"
url = "https://cran.r-project.org/src/contrib/gRbase_1.8-3.4.tar.gz"
list_url = "https://cran.r-project.org/src/contrib/Archive/gRbase"
version('1.8-3.4', sha256='d35f94c2fb7cbd4ce3991570424dfe6723a849658da32e13df29f53b6ea2cc2c')
depends_on('r@3.0.2:', type=('build', 'run'))
depends_on('r-graph', type=('build', 'run'))
depends_on('r-igraph', type=('build', 'run'))
depends_on('r-magrittr', type=('build', 'run'))
depends_on('r-matrix', type=('build', 'run'))
depends_on('r-rbgl', type=('build', 'run'))
depends_on('r-rcpp@0.11.1:', type=('build', 'run'))
depends_on('r-rcpparmadillo', type=('build', 'run'))
depends_on('r-rcppeigen', type=('build', 'run'))

View File

@@ -13,7 +13,10 @@ class RRbgl(RPackage):
homepage = "https://www.bioconductor.org/packages/RBGL/"
git = "https://git.bioconductor.org/packages/RBGL.git"
version('1.60.0', commit='ef24c17c411659b8f030602bd9781c534d6ec93b')
version('1.52.0', commit='93e8fcfafec8f1cd5638fe30dc0f9506d15b49c0')
depends_on('r@3.4.0:3.4.9', when='@1.52.0')
depends_on('r@3.6.0:', when='@1.60.0', type=('build', 'run'))
depends_on('r-bh', when='@1.60.0', type=('build', 'run'))
depends_on('r-graph', type=('build', 'run'))

View File

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
from spack import *
@@ -48,6 +49,8 @@ class R(AutotoolsPackage):
description='Enable X11 support (call configure --with-x)')
variant('memory_profiling', default=False,
description='Enable memory profiling')
variant('rmath', default=False,
description='Build standalone Rmath library')
# Virtual dependencies
depends_on('blas', when='+external-lapack')
@@ -70,9 +73,10 @@ class R(AutotoolsPackage):
depends_on('pango~X', when='~X')
depends_on('freetype')
depends_on('tcl')
depends_on('tk')
depends_on('tk', when='+X')
depends_on('libx11', when='+X')
depends_on('libxt', when='+X')
depends_on('libxmu', when='+X')
depends_on('curl')
depends_on('pcre')
depends_on('java')
@@ -87,12 +91,23 @@ class R(AutotoolsPackage):
def etcdir(self):
return join_path(prefix, 'rlib', 'R', 'etc')
@run_after('build')
def build_rmath(self):
if '+rmath' in self.spec:
with working_dir('src/nmath/standalone'):
make()
@run_after('install')
def install_rmath(self):
if '+rmath' in self.spec:
with working_dir('src/nmath/standalone'):
make('install')
def configure_args(self):
spec = self.spec
prefix = self.prefix
tcl_config_path = join_path(spec['tcl'].prefix.lib, 'tclConfig.sh')
tk_config_path = join_path(spec['tk'].prefix.lib, 'tkConfig.sh')
config_args = [
'--libdir={0}'.format(join_path(prefix, 'rlib')),
@@ -100,14 +115,27 @@ def configure_args(self):
'--enable-BLAS-shlib',
'--enable-R-framework=no',
'--with-tcl-config={0}'.format(tcl_config_path),
'--with-tk-config={0}'.format(tk_config_path),
'LDFLAGS=-L{0} -Wl,-rpath,{0}'.format(join_path(prefix, 'rlib',
'R', 'lib')),
]
if '^tk' in spec:
tk_config_path = join_path(spec['tk'].prefix.lib, 'tkConfig.sh')
config_args.append('--with-tk-config={0}'.format(tk_config_path))
if '+external-lapack' in spec:
config_args.extend([
'--with-blas={0}'.format(spec['blas'].libs),
'--with-lapack'
])
if '^mkl' in spec and 'gfortran' in self.compiler.fc:
mkl_re = re.compile(r'(mkl_)intel(_i?lp64\b)')
config_args.extend([
mkl_re.sub(r'\g<1>gf\g<2>',
'--with-blas={0}'.format(
spec['blas'].libs.ld_flags)),
'--with-lapack'
])
else:
config_args.extend([
'--with-blas={0}'.format(spec['blas'].libs.ld_flags),
'--with-lapack'
])
if '+X' in spec:
config_args.append('--with-x')
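The MKL-specific branch above rewrites the BLAS link line with a regular expression. A standalone sketch of that substitution (the library path is hypothetical): when R is compiled with gfortran against MKL, the Fortran interface layer must be the GNU-convention library (mkl_gf_*) rather than the Intel-convention one (mkl_intel_*).

```python
import re

# Same pattern as in configure_args: rewrite mkl_intel_(i)lp64 to
# mkl_gf_(i)lp64 while leaving the rest of the link line untouched.
mkl_re = re.compile(r'(mkl_)intel(_i?lp64\b)')

# Hypothetical value of spec['blas'].libs.ld_flags
ld_flags = '-L/opt/intel/mkl/lib -lmkl_intel_lp64 -lmkl_sequential -lmkl_core'
fixed = mkl_re.sub(r'\g<1>gf\g<2>', ld_flags)
print(fixed)  # -L/opt/intel/mkl/lib -lmkl_gf_lp64 -lmkl_sequential -lmkl_core
```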

View File

@@ -14,8 +14,8 @@ class Samrai(AutotoolsPackage):
(SAMR) technology in large-scale parallel application development.
"""
homepage = "https://computation.llnl.gov/projects/samrai"
url = "https://computation.llnl.gov/projects/samrai/download/SAMRAI-v3.11.2.tar.gz"
homepage = "https://computing.llnl.gov/projects/samrai"
url = "https://computing.llnl.gov/projects/samrai/download/SAMRAI-v3.11.2.tar.gz"
list_url = homepage
version('3.12.0', '07364f6e209284e45ac0e9caf1d610f6')

View File

@@ -18,7 +18,9 @@ class Scalasca(AutotoolsPackage):
homepage = "http://www.scalasca.org"
url = "http://apps.fz-juelich.de/scalasca/releases/scalasca/2.1/dist/scalasca-2.1.tar.gz"
list_url = "https://scalasca.org/scalasca/front_content.php?idart=1072"
version('2.5', sha256='7dfa01e383bfb8a4fd3771c9ea98ff43772e415009d9f3c5f63b9e05f2dde0f6')
version('2.4', '4a895868258030f700a635eac93d36764f60c8c63673c7db419ea4bcc6b0b760')
version('2.3.1', 'a83ced912b9d2330004cb6b9cefa7585')
version('2.2.2', '2bafce988b0522d18072f7771e491ab9')
@@ -26,14 +28,16 @@ class Scalasca(AutotoolsPackage):
depends_on("mpi")
# version 2.4
# version 2.4+
depends_on('cubew@4.4:', when='@2.4:')
# version 2.3+
depends_on('otf2@2:', when='@2.3:')
# version 2.3
depends_on('cube@4.3', when='@2.3:2.3.99')
depends_on('otf2@2:', when='@2.3:')
# version 2.1+
# version 2.1 - 2.2
depends_on('cube@4.2', when='@2.1:2.2.999')
depends_on('otf2@1.4', when='@2.1:2.2.999')

View File

@@ -13,7 +13,7 @@ class Scr(CMakePackage):
Linux cluster to provide a fast, scalable checkpoint/restart
capability for MPI codes"""
homepage = "http://computation.llnl.gov/projects/scalable-checkpoint-restart-for-mpi"
homepage = "http://computing.llnl.gov/projects/scalable-checkpoint-restart-for-mpi"
url = "https://github.com/LLNL/scr/archive/v1.2.0.tar.gz"
git = "https://github.com/llnl/scr.git"

View File

@@ -12,7 +12,7 @@ class Spindle(AutotoolsPackage):
overload on a shared file system when loading dynamically
linked libraries, causing site-wide performance problems.
"""
homepage = "https://computation.llnl.gov/project/spindle/"
homepage = "https://computing.llnl.gov/project/spindle/"
url = "https://github.com/hpc/Spindle/archive/v0.8.1.tar.gz"
version('0.8.1', 'f11793a6b9d8df2cd231fccb2857d912')

View File

@@ -12,8 +12,8 @@ class Sundials(CMakePackage):
"""SUNDIALS (SUite of Nonlinear and DIfferential/ALgebraic equation
Solvers)"""
homepage = "https://computation.llnl.gov/projects/sundials"
url = "https://computation.llnl.gov/projects/sundials/download/sundials-2.7.0.tar.gz"
homepage = "https://computing.llnl.gov/projects/sundials"
url = "https://computing.llnl.gov/projects/sundials/download/sundials-2.7.0.tar.gz"
maintainers = ['cswoodward', 'gardner48', 'balos1']
# ==========================================================================
@@ -500,12 +500,9 @@ def libs(self):
# Q: should the result be ordered by dependency?
else:
sun_libs = ['libsundials_' + p for p in query_parameters]
search_paths = [[self.prefix.lib, False], [self.prefix.lib64, False],
[self.prefix, True]]
is_shared = '+shared' in self.spec
for path, recursive in search_paths:
libs = find_libraries(sun_libs, root=path, shared=is_shared,
recursive=recursive)
if libs:
return libs
return None # Raise an error
libs = find_libraries(sun_libs, root=self.prefix, shared=is_shared,
recursive=True)
return libs or None # Raise an error if no libs are found
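The simplified libs property above derives the library names from the query parameters and collapses an empty search result to None so the caller can raise. A minimal sketch of those two pieces of logic, with a plain list standing in for a find_libraries() result:

```python
# Library names are built from the spec query parameters (example values,
# not the full SUNDIALS solver list).
query_parameters = ['cvode', 'arkode']
sun_libs = ['libsundials_' + p for p in query_parameters]
print(sun_libs)  # ['libsundials_cvode', 'libsundials_arkode']

found = []            # stand-in for a find_libraries() miss
result = found or None
print(result)         # None  -> signals the caller to raise an error
```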

View File

@@ -80,6 +80,10 @@ class Tau(Package):
depends_on('cuda', when='+cuda')
depends_on('gasnet', when='+gasnet')
# Elf only required from 2.28.1 on
conflicts('+libelf', when='@:2.28.0')
conflicts('+libdwarf', when='@:2.28.0')
filter_compiler_wrappers('tau_cc.sh', 'Makefile.tau', relative_root='bin')
def set_compiler_options(self):

View File

@@ -23,10 +23,17 @@ class Tk(AutotoolsPackage):
version('8.6.3', '85ca4dbf4dcc19777fd456f6ee5d0221')
version('8.5.19', 'e89df710447cce0fc0bde65667c12f85')
variant('xft', default=True,
description='Enable X FreeType')
variant('xss', default=True,
description='Enable X Screen Saver')
extends('tcl')
depends_on('tcl@8.6:', when='@8.6:')
depends_on('libx11')
depends_on('libxft', when='+xft')
depends_on('libxscrnsaver', when='+xss')
configure_directory = 'unix'
@@ -65,9 +72,15 @@ def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
def configure_args(self):
spec = self.spec
return ['--with-tcl={0}'.format(spec['tcl'].prefix.lib),
'--x-includes={0}'.format(spec['libx11'].prefix.include),
'--x-libraries={0}'.format(spec['libx11'].prefix.lib)]
config_args = [
'--with-tcl={0}'.format(spec['tcl'].prefix.lib),
'--x-includes={0}'.format(spec['libx11'].prefix.include),
'--x-libraries={0}'.format(spec['libx11'].prefix.lib)
]
config_args += self.enable_or_disable('xft')
config_args += self.enable_or_disable('xss')
return config_args
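The enable_or_disable calls above turn each boolean variant into the matching --enable-X/--disable-X switch. A simplified model of that mapping (not Spack's actual implementation, which also handles multi-valued variants and activation callbacks):

```python
def enable_or_disable(name, enabled):
    # Boolean variant -> a one-element list holding the autotools switch,
    # so it can be concatenated onto config_args as in the package above.
    return ['--{0}-{1}'.format('enable' if enabled else 'disable', name)]

config_args = ['--with-tcl=/path/to/tcl/lib']  # hypothetical tcl prefix
config_args += enable_or_disable('xft', True)   # '+xft' in the spec
config_args += enable_or_disable('xss', False)  # '~xss' in the spec
print(config_args[1:])  # ['--enable-xft', '--disable-xss']
```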
@run_after('install')
def symlink_wish(self):

View File

@@ -0,0 +1,35 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class Umap(CMakePackage):
"""Umap is a library that provides an mmap()-like interface to a
simple, user-space page fault handler based on the userfaultfd Linux
feature (starting with 4.3 linux kernel)."""
homepage = "https://github.com/LLNL/umap"
url = "https://github.com/LLNL/umap/archive/v2.0.0.tar.gz"
git = "https://github.com/LLNL/umap.git"
version('develop', branch='develop')
version('2.0.0', sha256='85c4bc68e8790393847a84eb54eaf6fc321acade382a399a2679d541b0e34150')
version('1.0.0', sha256='c746de3fae5bfc5bbf36234d5e888ea45eeba374c26cd8b5a817d0c08e454ed5')
version('0.0.4', sha256='bffaa03668c95b608406269cba6543f5e0ba37b04ac08a3fc4593976996bc273')
version('0.0.3', sha256='8e80835a85ad69fcd95f963822b1616c782114077d79c350017db4d82871455c')
version('0.0.2', sha256='eccc987b414bc568bd33b569ab6e18c328409499f11e65ac5cd5c3e1a8b47509')
version('0.0.1', sha256='49020adf55aa3f8f03757373b21ff229d2e8cf4155d54835019cd4745c1291ef')
variant('logging', default=False, description='Build with logging enabled.')
variant('tests', default=False, description='Build test programs.')
def cmake_args(self):
spec = self.spec
args = [
"-DENABLE_LOGGING=%s" % ('On' if '+logging' in spec else 'Off'),
"-DENABLE_TESTS=%s" % ('On' if '+tests' in spec else 'Off'),
]
return args
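The cmake_args method above maps each boolean variant to an On/Off CMake cache entry. The same mapping, with a plain dict standing in for the spec:

```python
def cmake_args(variants):
    # variants is a hypothetical {'logging': bool, 'tests': bool} dict
    # standing in for '+logging' / '+tests' checks against a Spack spec.
    return [
        "-DENABLE_LOGGING=%s" % ('On' if variants.get('logging') else 'Off'),
        "-DENABLE_TESTS=%s" % ('On' if variants.get('tests') else 'Off'),
    ]

print(cmake_args({'logging': True, 'tests': False}))
# ['-DENABLE_LOGGING=On', '-DENABLE_TESTS=Off']
```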

View File

@@ -17,6 +17,11 @@ class Velvet(MakefilePackage):
depends_on('zlib')
def edit(self, spec, prefix):
if spec.satisfies('target=aarch64'):
makefile = FileFilter('Makefile')
makefile.filter('-m64', '')
def install(self, spec, prefix):
mkdirp(prefix.bin)
install('velvetg', prefix.bin)
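FileFilter rewrites a file in place; the -m64 removal for aarch64 amounts to a regex substitution like the one below, shown here on a hypothetical Makefile fragment as a string rather than the real file:

```python
import re

# -m64 is an x86-only compiler flag; stripping it lets the hard-coded
# Makefile work with aarch64 compilers.
makefile_text = 'CFLAGS = -Wall -O3 -m64 -D_GNU_SOURCE\n'
filtered = re.sub('-m64', '', makefile_text)
print(filtered.strip())
```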

View File

@@ -11,8 +11,8 @@
class Xbraid(MakefilePackage):
"""XBraid: Parallel time integration with Multigrid"""
homepage = "https://computation.llnl.gov/projects/parallel-time-integration-multigrid/software"
url = "https://computation.llnl.gov/projects/parallel-time-integration-multigrid/download/braid_2.2.0.tar.gz"
homepage = "https://computing.llnl.gov/projects/parallel-time-integration-multigrid/software"
url = "https://computing.llnl.gov/projects/parallel-time-integration-multigrid/download/braid_2.2.0.tar.gz"
version('2.2.0', '0a9c2fc3eb8f605f73cce78ab0d8a7d9')

View File

@@ -12,8 +12,8 @@ class Zfp(MakefilePackage):
arrays.
"""
homepage = 'http://computation.llnl.gov/projects/floating-point-compression'
url = 'http://computation.llnl.gov/projects/floating-point-compression/download/zfp-0.5.2.tar.gz'
homepage = 'http://computing.llnl.gov/projects/floating-point-compression'
url = 'http://computing.llnl.gov/projects/floating-point-compression/download/zfp-0.5.2.tar.gz'
version('0.5.4', sha256='768a05ed9bf10e54ac306f90b81dd17b0e7b13782f01823d7da4394fd2da8adb')
version('0.5.2', '2f0a77aa34087219a6e10b8b7d031e77')

View File

@@ -6,11 +6,9 @@
from spack import *
import re
import os
import glob
class Zoltan(Package):
class Zoltan(AutotoolsPackage):
"""The Zoltan library is a toolkit of parallel combinatorial algorithms
for parallel, unstructured, and/or adaptive scientific
applications. Zoltan's largest component is a suite of dynamic
@@ -43,15 +41,43 @@ class Zoltan(Package):
depends_on('parmetis@4:', when='+parmetis')
depends_on('metis', when='+parmetis')
depends_on('perl@:5.21', type='build', when='@:3.6')
depends_on('autoconf', type='build')
depends_on('automake', type='build')
depends_on('m4', type='build')
conflicts('+parmetis', when='~mpi')
def install(self, spec, prefix):
build_directory = 'spack-build'
@property
def configure_directory(self):
spec = self.spec
# FIXME: The older Zoltan versions fail to compile the F90 MPI wrappers
# because of some complicated generic type problem.
if spec.satisfies('@:3.6+fortran+mpi'):
raise RuntimeError(('Cannot build Zoltan v{0} with +fortran and '
'+mpi; please disable one of these features '
'or upgrade versions.').format(self.version))
if spec.satisfies('@:3.6'):
zoltan_path = 'Zoltan_v{0}'.format(self.version)
return zoltan_path
return '.'
@property
def parallel(self):
# NOTE: Earlier versions of Zoltan cannot be built in parallel
# because they contain nested Makefile dependency bugs.
return not self.spec.satisfies('@:3.6+fortran')
def autoreconf(self, spec, prefix):
autoreconf = which('autoreconf')
with working_dir(self.configure_directory):
autoreconf('-ivf')
def configure_args(self):
spec = self.spec
config_args = [
self.get_config_flag('f90interface', 'fortran'),
@@ -63,8 +89,10 @@ def install(self, spec, prefix):
]
if '+shared' in spec:
config_args.append('RANLIB=echo')
config_args.append('--with-ar=$(CXX) -shared $(LDFLAGS) -o')
config_args.extend([
'RANLIB=echo',
'--with-ar=$(CXX) -shared $(LDFLAGS) -o'
])
config_cflags.append(self.compiler.pic_flag)
if spec.satisfies('%gcc'):
config_args.append('--with-libs=-lgfortran')
@@ -72,66 +100,54 @@ def install(self, spec, prefix):
config_args.append('--with-libs=-lifcore')
if '+parmetis' in spec:
config_args.append('--with-parmetis')
config_args.append('--with-parmetis-libdir={0}'
.format(spec['parmetis'].prefix.lib))
config_args.append('--with-parmetis-incdir={0}'
.format(spec['parmetis'].prefix.include))
config_args.append('--with-incdirs=-I{0}'
.format(spec['metis'].prefix.include))
config_args.append('--with-ldflags=-L{0}'
.format(spec['metis'].prefix.lib))
parmetis_prefix = spec['parmetis'].prefix
config_args.extend([
'--with-parmetis',
'--with-parmetis-libdir={0}'.format(parmetis_prefix.lib),
'--with-parmetis-incdir={0}'.format(parmetis_prefix.include),
'--with-incdirs=-I{0}'.format(spec['metis'].prefix.include),
'--with-ldflags=-L{0}'.format(spec['metis'].prefix.lib)
])
if '+int64' in spec['metis']:
config_args.append('--with-id-type=ulong')
else:
config_args.append('--with-id-type=uint')
if '+mpi' in spec:
config_args.append('CC={0}'.format(spec['mpi'].mpicc))
config_args.append('CXX={0}'.format(spec['mpi'].mpicxx))
config_args.append('FC={0}'.format(spec['mpi'].mpifc))
config_args.append('--with-mpi={0}'.format(spec['mpi'].prefix))
# NOTE: Zoltan assumes that it's linking against an MPI library
# that can be found with '-lmpi' which isn't the case for many
# MPI packages. We rely on the MPI-wrappers to automatically add
# what is required for linking and thus pass an empty list of libs
config_args.append('--with-mpi-libs= ')
config_args.extend([
'CC={0}'.format(spec['mpi'].mpicc),
'CXX={0}'.format(spec['mpi'].mpicxx),
'FC={0}'.format(spec['mpi'].mpifc),
'--with-mpi={0}'.format(spec['mpi'].prefix),
# NOTE: Zoltan assumes that it's linking against an MPI library
# that can be found with '-lmpi' which isn't the case for many
# MPI packages. We rely on the MPI-wrappers to automatically
# add what is required for linking and thus pass an empty
# list of libs
'--with-mpi-libs= '
])
# NOTE: Early versions of Zoltan come packaged with a few embedded
# library packages (e.g. ParMETIS, Scotch), which messes with Spack's
# ability to descend directly into the package's source directory.
source_directory = self.stage.source_path
if spec.satisfies('@:3.6'):
zoltan_directory = 'Zoltan_v{0}'.format(self.version)
source_directory = join_path(source_directory, zoltan_directory)
config_args.extend([
'--with-cflags={0}'.format(' '.join(config_cflags)),
'--with-cxxflags={0}'.format(' '.join(config_cflags)),
'--with-fcflags={0}'.format(' '.join(config_cflags))
])
return config_args
build_directory = join_path(source_directory, 'build')
with working_dir(build_directory, create=True):
config = Executable(join_path(source_directory, 'configure'))
config(
'--prefix={0}'.format(prefix),
'--with-cflags={0}'.format(' '.join(config_cflags)),
'--with-cxxflags={0}'.format(' '.join(config_cflags)),
'--with-fcflags={0}'.format(' '.join(config_cflags)),
*config_args
)
# NOTE: Earlier versions of Zoltan cannot be built in parallel
# because they contain nested Makefile dependency bugs.
make(parallel=not spec.satisfies('@:3.6+fortran'))
make('install')
# NOTE: Unfortunately, Zoltan doesn't provide any configuration
# options for the extension of the output library files, so this
# script must change these extensions as a post-processing step.
if '+shared' in spec:
for lib_path in glob.glob(join_path(prefix, 'lib', '*.a')):
lib_static_name = os.path.basename(lib_path)
lib_shared_name = re.sub(r'\.a$', '.{0}'.format(dso_suffix),
lib_static_name)
move(lib_path, join_path(prefix, 'lib', lib_shared_name))
# NOTE: Unfortunately, Zoltan doesn't provide any configuration
# options for the extension of the output library files, so this
# script must change these extensions as a post-processing step.
@run_after('install')
def solib_install(self):
if '+shared' in self.spec:
for lib_path in find(self.spec.prefix.lib, 'lib*.a'):
lib_shared_name = re.sub(r'\.a$', '.{0}'.format(dso_suffix),
lib_path)
move(lib_path, lib_shared_name)
def get_config_flag(self, flag_name, flag_variant):
flag_pre = 'en' if '+{0}'.format(flag_variant) in self.spec else 'dis'
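get_config_flag builds an --enable/--disable switch from a variant name; since the diff is cut off here, the return statement below is an assumption. A standalone sketch of the helper:

```python
def get_config_flag(flag_name, variant_enabled):
    # Standalone model of the helper above: '+variant' in the spec selects
    # the 'en' prefix, otherwise 'dis', yielding --enable-X / --disable-X.
    # The return expression is assumed, not taken from the truncated diff.
    flag_pre = 'en' if variant_enabled else 'dis'
    return '--{0}able-{1}'.format(flag_pre, flag_name)

print(get_config_flag('f90interface', True))   # --enable-f90interface
print(get_config_flag('mpi', False))           # --disable-mpi
```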