Merge branch 'develop' of https://github.com/LLNL/spack into features/env_objects_flying_around

alalazo 2016-03-16 09:02:35 +01:00
commit b2c98feea4
23 changed files with 611 additions and 152 deletions

View File

@ -16,7 +16,10 @@ before_install:
script:
- . share/spack/setup-env.sh
- spack compilers
- spack config get compilers
- spack test
- spack install -v libdwarf
notifications:
email:

View File

@ -18,7 +18,7 @@ configurations can coexist on the same system.
Most importantly, Spack is *simple*. It offers a simple *spec* syntax
so that users can specify versions and configuration options
concisely. Spack is also simple for package authors: package files
are writtin in pure Python, and specs allow package authors to
are written in pure Python, and specs allow package authors to
maintain a single file for many different builds of the same package.
See the :doc:`features` for examples and highlights.
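As a hedged aside (not part of this diff, and the package and versions below are made up), the same spec syntax can be used from the command line or built programmatically:

from spack.spec import Spec

# 'mpileaks' at 1.2, built with gcc 4.9, +debug variant, on top of openmpi 1.6.5
s = Spec('mpileaks@1.2%gcc@4.9+debug ^openmpi@1.6.5')
print s.versions, s.compiler, s.variants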

View File

@ -419,7 +419,7 @@ directory to the directory containing the downloaded archive before it
calls your ``install`` method. Within ``install``, the path to the
downloaded archive is available as ``self.stage.archive_file``.
Here is an example snippet for packages distribuetd as self-extracting
Here is an example snippet for packages distributed as self-extracting
archives. The example sets permissions on the downloaded file to make
it executable, then runs it with some arguments.
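The snippet itself falls outside this hunk; a minimal, hedged sketch of what such an install method could look like (the package name, URL, and installer flags are placeholders, not taken from the docs):

import os
import stat

from spack import *
from spack.util.executable import Executable


class SelfExtracting(Package):
    """Hypothetical package distributed as a self-extracting archive."""
    homepage = "http://www.example.com"
    url      = "http://www.example.com/installer-1.0.run"

    version('1.0')

    def install(self, spec, prefix):
        installer = self.stage.archive_file
        # Make the downloaded file executable ...
        os.chmod(installer, os.stat(installer).st_mode | stat.S_IXUSR)
        # ... then run it with some (purely illustrative) arguments.
        Executable(installer)('--prefix=%s' % prefix, '--skip-license')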
@ -1556,12 +1556,12 @@ you ask for a particular spec.
``Concretization Policies``
~~~~~~~~~~~~~~~~~~~~~~~~~~~
A user may have certain perferrences for how packages should
A user may have certain preferences for how packages should
be concretized on their system. For example, one user may prefer packages
built with OpenMPI and the Intel compiler. Another user may prefer
packages be built with MVAPICH and GCC.
Spack can be configurated to prefer certain compilers, package
Spack can be configured to prefer certain compilers, package
versions, depends_on, and variants during concretization.
The preferred configuration can be controlled via the
``~/.spack/packages.yaml`` file for user configurations, or the
@ -1588,16 +1588,16 @@ At a high level, this example is specifying how packages should be
concretized. The dyninst package should prefer using gcc 4.9 and
be built with debug options. The gperftools package should prefer version
2.2 over 2.4. Every package on the system should prefer mvapich for
its MPI and gcc 4.4.7 (except for Dyninst, which overrides this by perfering gcc 4.9).
its MPI and gcc 4.4.7 (except for Dyninst, which overrides this by preferring gcc 4.9).
These options are used to fill in implicit defaults. Any of them can be overwritten
on the command line if explicitly requested.
Each packages.yaml file begin with the string ``packages:`` and
Each packages.yaml file begins with the string ``packages:`` and
package names are specified on the next level. The special string ``all``
applies settings to each package. Underneath each package name is
one or more components: ``compiler``, ``variants``, ``version``,
or ``providers``. Each component has an ordered list of spec
``constraints``, with earlier entries in the list being prefered over
``constraints``, with earlier entries in the list being preferred over
later entries.
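A hedged sketch of how such preferences look to Python code once parsed (the dictionary layout in the comment is an assumption of this sketch; the values mirror the dyninst/mvapich example described above):

import spack.config

prefs = spack.config.get_config('packages')
# Roughly: {'all':     {'compiler': ['gcc@4.4.7'], 'providers': {'mpi': ['mvapich']}},
#           'dyninst': {'compiler': ['gcc@4.9'],   'variants': ['+debug']}}
for name, settings in prefs.items():
    print name, settings.get('compiler', []), settings.get('variants', [])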
Sometimes a package installation may have constraints that forbid

View File

@ -78,8 +78,7 @@ This example lists three installations of OpenMPI, one built with gcc,
one built with gcc and debug information, and another built with Intel.
If Spack is asked to build a package that uses one of these MPIs as a
dependency, it will use the pre-installed OpenMPI in
the given directory. This example also specifies that Spack should never
build its own OpenMPI via the ``nobuild: True`` option.
the given directory.
Each ``packages.yaml`` begins with a ``packages:`` token, followed
by a list of package names. To specify externals, add a ``paths``
@ -111,16 +110,16 @@ be:
paths:
openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib: /opt/openmpi-1.4.3
openmpi@1.4.3%gcc@4.4.7=chaos_5_x86_64_ib+debug: /opt/openmpi-1.4.3-debug
openmpi@1.6.5%intel@10.1=chaos_5_x86_64_ib: /opt/openmpi-1.6.5-intel
nobuild: True
openmpi@1.6.5%intel@10.1=chaos_5_x86_64_ib: /opt/openmpi-1.6.5-intel
buildable: False
The addition of the ``nobuild`` flag tells Spack that it should never build
The addition of the ``buildable`` flag tells Spack that it should never build
its own version of OpenMPI, and it will instead always rely on a pre-built
OpenMPI. Similar to ``paths``, ``nobuild`` is specified as a property under
OpenMPI. Similar to ``paths``, ``buildable`` is specified as a property under
a package name.
The ``nobuild`` does not need to be paired with external packages.
It could also be used alone to forbid packages that may be
The ``buildable`` does not need to be paired with external packages.
It could also be used alone to forbid packages that may be
buggy or otherwise undesirable.

View File

@ -235,11 +235,11 @@ def setter(name, value):
if not has_method(cls, '_cmp_key'):
raise TypeError("'%s' doesn't define _cmp_key()." % cls.__name__)
setter('__eq__', lambda s,o: o is not None and s._cmp_key() == o._cmp_key())
setter('__eq__', lambda s,o: (s is o) or (o is not None and s._cmp_key() == o._cmp_key()))
setter('__lt__', lambda s,o: o is not None and s._cmp_key() < o._cmp_key())
setter('__le__', lambda s,o: o is not None and s._cmp_key() <= o._cmp_key())
setter('__ne__', lambda s,o: o is None or s._cmp_key() != o._cmp_key())
setter('__ne__', lambda s,o: (s is not o) and (o is None or s._cmp_key() != o._cmp_key()))
setter('__gt__', lambda s,o: o is None or s._cmp_key() > o._cmp_key())
setter('__ge__', lambda s,o: o is None or s._cmp_key() >= o._cmp_key())
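Assuming this hunk is the ``key_ordering`` decorator in ``llnl.util.lang``, here is a minimal sketch of a class that opts into the generated comparisons; the added identity checks make ``x == x`` true and ``x != None`` safe without ever calling ``_cmp_key()``:

from llnl.util.lang import key_ordering


@key_ordering
class Interval(object):
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def _cmp_key(self):
        return (self.lo, self.hi)


a = Interval(1, 2)
print a == a                 # True via the identity short-circuit
print a < Interval(1, 3)     # True, compares _cmp_key() tuples
print a != None              # True, and None is never asked for _cmp_key()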

View File

@ -208,7 +208,7 @@ def find_repository(spec, args):
return repo
def fetch_tarballs(url, name, args):
def fetch_tarballs(url, name, version):
"""Try to find versions of the supplied archive by scraping the web.
Prompts the user to select how many to download if many are found.
@ -222,7 +222,7 @@ def fetch_tarballs(url, name, args):
archives_to_fetch = 1
if not versions:
# If the fetch failed for some reason, revert to what the user provided
versions = { "version" : url }
versions = { version : url }
elif len(versions) > 1:
tty.msg("Found %s versions of %s:" % (len(versions), name),
*spack.cmd.elide_list(
@ -256,7 +256,7 @@ def create(parser, args):
tty.msg("Creating template for package %s" % name)
# Fetch tarballs (prompting user if necessary)
versions, urls = fetch_tarballs(url, name, args)
versions, urls = fetch_tarballs(url, name, version)
# Try to guess what configure system is used.
guesser = ConfigureGuesser()

View File

@ -75,8 +75,8 @@ def diy(self, args):
edit_package(spec.name, spack.repo.first_repo(), None, True)
return
if not spec.version.concrete:
tty.die("spack diy spec must have a single, concrete version.")
if not spec.versions.concrete:
tty.die("spack diy spec must have a single, concrete version. Did you forget a package version number?")
spec.concretize()
package = spack.repo.get(spec)

View File

@ -51,10 +51,10 @@ class DefaultConcretizer(object):
"""
def _valid_virtuals_and_externals(self, spec):
"""Returns a list of spec/external-path pairs for both virtuals and externals
that can concretize this spec."""
# Get a list of candidate packages that could satisfy this spec
packages = []
"""Returns a list of candidate virtual dep providers and external
packages that could be used to concretize a spec."""
# First construct a list of concrete candidates to replace spec with.
candidates = [spec]
if spec.virtual:
providers = spack.repo.providers_for(spec)
if not providers:
@ -64,96 +64,72 @@ def _valid_virtuals_and_externals(self, spec):
if not spec_w_preferred_providers:
spec_w_preferred_providers = spec
provider_cmp = partial(spack.pkgsort.provider_compare, spec_w_preferred_providers.name, spec.name)
packages = sorted(providers, cmp=provider_cmp)
else:
packages = [spec]
candidates = sorted(providers, cmp=provider_cmp)
# For each candidate package, if it has externals add those to the candidates
# if it's a nobuild, then only add the externals.
candidates = []
all_compilers = spack.compilers.all_compilers()
for pkg in packages:
externals = spec_externals(pkg)
buildable = not is_spec_nobuild(pkg)
if buildable:
candidates.append((pkg, None))
# For each candidate package, if it has externals, add those to the usable list.
# if it's not buildable, then *only* add the externals.
usable = []
for cspec in candidates:
if is_spec_buildable(cspec):
usable.append(cspec)
externals = spec_externals(cspec)
for ext in externals:
if ext[0].satisfies(spec):
candidates.append(ext)
if not candidates:
if ext.satisfies(spec):
usable.append(ext)
# If nothing is in the usable list now, it's because we aren't
# allowed to build anything.
if not usable:
raise NoBuildError(spec)
def cmp_externals(a, b):
if a[0].name != b[0].name:
#We're choosing between different providers. Maintain order from above sort
if a.name != b.name:
# We're choosing between different providers, so
# maintain order from provider sort
return candidates.index(a) - candidates.index(b)
result = cmp_specs(a[0], b[0])
result = cmp_specs(a, b)
if result != 0:
return result
if not a[1] and b[1]:
return 1
if not b[1] and a[1]:
return -1
return cmp(a[1], b[1])
candidates = sorted(candidates, cmp=cmp_externals)
return candidates
# prefer external packages to internal packages.
if a.external is None or b.external is None:
return -cmp(a.external, b.external)
else:
return cmp(a.external, b.external)
usable.sort(cmp=cmp_externals)
return usable
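As a standalone illustration of the new externals-first tiebreak (Python 2 ``cmp`` semantics, with throwaway stand-ins rather than real Specs):

class FakeSpec(object):
    def __init__(self, name, external=None):
        self.name, self.external = name, external


internal = FakeSpec('openmpi')                             # external is None
external = FakeSpec('openmpi', '/opt/openmpi-1.6.5-intel')

# In Python 2, None sorts below any string, so negating cmp() puts specs
# that carry an external path ahead of purely internal ones.
ordered = sorted([internal, external],
                 cmp=lambda a, b: -cmp(a.external, b.external))
print [s.external for s in ordered]    # ['/opt/openmpi-1.6.5-intel', None]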
def concretize_virtual_and_external(self, spec):
"""From a list of candidate virtual and external packages, concretize to one that
is ABI compatible with the rest of the DAG."""
def choose_virtual_or_external(self, spec):
"""Given a list of candidate virtual and external packages, try to
find one that is most ABI compatible.
"""
candidates = self._valid_virtuals_and_externals(spec)
if not candidates:
return False
return candidates
# Find the nearest spec in the dag that has a compiler. We'll use that
# spec to test compiler compatibility.
other_spec = find_spec(spec, lambda(x): x.compiler)
if not other_spec:
other_spec = spec.root
# Find the nearest spec in the dag that has a compiler. We'll
# use that spec to calibrate compiler compatibility.
abi_exemplar = find_spec(spec, lambda(x): x.compiler)
if not abi_exemplar:
abi_exemplar = spec.root
# Choose an ABI-compatible candidate, or the first match otherwise.
candidate = None
if other_spec:
candidate = next((c for c in candidates if spack.abi.compatible(c[0], other_spec)), None)
if not candidate:
# Try a looser ABI matching
candidate = next((c for c in candidates if spack.abi.compatible(c[0], other_spec, loose=True)), None)
if not candidate:
# No ABI matches. Pick the top choice based on the orignal preferences.
candidate = candidates[0]
candidate_spec = candidate[0]
external = candidate[1]
changed = False
# Make a list including ABI compatibility of specs with the exemplar.
strict = [spack.abi.compatible(c, abi_exemplar) for c in candidates]
loose = [spack.abi.compatible(c, abi_exemplar, loose=True) for c in candidates]
keys = zip(strict, loose, candidates)
# If we're external then trim the dependencies
if external:
if (spec.dependencies):
changed = True
spec.dependencies = DependencyMap()
candidate_spec.dependencies = DependencyMap()
# Sort candidates from most to least compatibility.
# Note:
# 1. We reverse because True > False.
# 2. Sort is stable, so c's keep their order.
keys.sort(key=lambda k:k[:2], reverse=True)
def fequal(candidate_field, spec_field):
return (not candidate_field) or (candidate_field == spec_field)
if (fequal(candidate_spec.name, spec.name) and
fequal(candidate_spec.versions, spec.versions) and
fequal(candidate_spec.compiler, spec.compiler) and
fequal(candidate_spec.architecture, spec.architecture) and
fequal(candidate_spec.dependencies, spec.dependencies) and
fequal(candidate_spec.variants, spec.variants) and
fequal(external, spec.external)):
return changed
# Refine this spec to the candidate.
if spec.virtual:
spec._replace_with(candidate_spec)
changed = True
if spec._dup(candidate_spec, deps=False, cleardeps=False):
changed = True
spec.external = external
return changed
# Pull the candidates back out and return them in order
candidates = [c for s,l,c in keys]
return candidates
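The sort above leans on the two facts called out in the comments; a standalone sketch with made-up providers shows the resulting order (strict matches first, then loose-only matches, with the original preference order preserved inside each group):

candidates = ['mvapich2', 'openmpi', 'mpich', 'zmpi']   # hypothetical preference order
strict = [False, True, False, False]                    # strictly ABI compatible?
loose  = [True,  True, True,  False]                    # loosely ABI compatible?

keys = zip(strict, loose, candidates)
keys.sort(key=lambda k: k[:2], reverse=True)            # True > False; sort is stable

print [c for s, l, c in keys]    # ['openmpi', 'mvapich2', 'mpich', 'zmpi']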
def concretize_version(self, spec):
@ -238,7 +214,7 @@ def concretize_variants(self, spec):
the default variants from the package specification.
"""
changed = False
for name, variant in spec.package.variants.items():
for name, variant in spec.package_class.variants.items():
if name not in spec.variants:
spec.variants[name] = spack.spec.VariantSpec(name, variant.default)
changed = True
@ -369,8 +345,8 @@ def __init__(self, spec):
class NoBuildError(spack.error.SpackError):
"""Raised when a package is configured with the nobuild option, but
"""Raised when a package is configured with the buildable option False, but
no satisfactory external versions can be found"""
def __init__(self, spec):
super(NoBuildError, self).__init__(
"The spec '%s' is configured as nobuild, and no matching external installs were found" % spec.name)
"The spec '%s' is configured as not buildable, and no matching external installs were found" % spec.name)

View File

@ -220,9 +220,9 @@
'type' : 'array',
'default' : [],
'items' : { 'type' : 'string' } }, #compiler specs
'nobuild': {
'buildable': {
'type': 'boolean',
'default': False,
'default': True,
},
'providers': {
'type': 'object',
@ -539,33 +539,36 @@ def print_section(section):
def spec_externals(spec):
"""Return a list of spec, directory pairs for each external location for spec"""
"""Return a list of external specs (with external directory path filled in),
one for each known external installation."""
allpkgs = get_config('packages')
name = spec.name
spec_locations = []
external_specs = []
pkg_paths = allpkgs.get(name, {}).get('paths', None)
if not pkg_paths:
return []
for pkg,path in pkg_paths.iteritems():
if not spec.satisfies(pkg):
continue
for external_spec, path in pkg_paths.iteritems():
if not path:
# skip entries without paths (avoid creating extra Specs)
continue
spec_locations.append( (spack.spec.Spec(pkg), path) )
return spec_locations
external_spec = spack.spec.Spec(external_spec, external=path)
if external_spec.satisfies(spec):
external_specs.append(external_spec)
return external_specs
def is_spec_nobuild(spec):
"""Return true if the spec pkgspec is configured as nobuild"""
def is_spec_buildable(spec):
"""Return true if the spec pkgspec is configured as buildable"""
allpkgs = get_config('packages')
name = spec.name
if not spec.name in allpkgs:
return False
if not 'nobuild' in allpkgs[spec.name]:
return False
return allpkgs[spec.name]['nobuild']
return True
if not 'buildable' in allpkgs[spec.name]:
return True
return allpkgs[spec.name]['buildable']
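A hedged usage sketch of the two reworked helpers, assuming a ``packages.yaml`` like the one in the docs above (external OpenMPI prefixes plus ``buildable: False``):

import spack.spec
from spack.config import spec_externals, is_spec_buildable

spec = spack.spec.Spec('openmpi%intel')

# Each returned Spec now carries its install prefix in .external.
for ext in spec_externals(spec):
    print ext, '->', ext.external

if not is_spec_buildable(spec):
    print "Spack may only use external installs of %s" % spec.name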
class ConfigError(SpackError): pass

View File

@ -316,6 +316,11 @@ def get(self, spec, new=False):
return self.repo_for_pkg(spec).get(spec)
def get_pkg_class(self, pkg_name):
"""Find a class for the spec's package and return the class object."""
return self.repo_for_pkg(pkg_name).get_pkg_class(pkg_name)
@_autospec
def dump_provenance(self, spec, path):
"""Dump provenance information for a spec to a particular path.
@ -550,7 +555,7 @@ def get(self, spec, new=False):
key = hash(spec)
if new or key not in self._instances:
package_class = self._get_pkg_class(spec.name)
package_class = self.get_pkg_class(spec.name)
try:
copy = spec.copy() # defensive copy. Package owns its spec.
self._instances[key] = package_class(copy)
@ -715,7 +720,7 @@ def _get_pkg_module(self, pkg_name):
return self._modules[pkg_name]
def _get_pkg_class(self, pkg_name):
def get_pkg_class(self, pkg_name):
"""Get the class for the package out of its module.
First loads (or fetches from cache) a module for the

View File

@ -353,7 +353,7 @@ def constrain(self, other):
@property
def concrete(self):
return self.spec._concrete or all(
v in self for v in self.spec.package.variants)
v in self for v in self.spec.package_class.variants)
def copy(self):
@ -418,9 +418,11 @@ def __init__(self, spec_like, *dep_like, **kwargs):
# cases we've read them from a file want to assume normal.
# This allows us to manipulate specs that Spack doesn't have
# package.py files for.
self._normal = kwargs.get('normal', False)
self._normal = kwargs.get('normal', False)
self._concrete = kwargs.get('concrete', False)
self.external = None
# Allow a spec to be constructed with an external path.
self.external = kwargs.get('external', None)
# This allows users to construct a spec DAG with literals.
# Note that given two specs a and b, Spec(a) copies a, but
@ -498,6 +500,14 @@ def package(self):
return spack.repo.get(self)
@property
def package_class(self):
"""Internal package call gets only the class object for a package.
Use this to just get package metadata.
"""
return spack.repo.get_pkg_class(self.name)
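A short usage sketch of the new property; metadata lookups no longer need a concrete spec or a Package instance:

from spack.spec import Spec

s = Spec('cmake')                               # not concretized, no Package built
print sorted(s.package_class.variants.keys())   # e.g. ['doc', 'ncurses', 'qt']

# s.package, by contrast, constructs the full Package bound to this spec.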
@property
def virtual(self):
"""Right now, a spec is virtual if no package exists with its name.
@ -786,8 +796,30 @@ def _replace_with(self, concrete):
"""Replace this virtual spec with a concrete spec."""
assert(self.virtual)
for name, dependent in self.dependents.items():
# remove self from all dependents.
del dependent.dependencies[self.name]
dependent._add_dependency(concrete)
# add the replacement, unless it is already a dep of dependent.
if concrete.name not in dependent.dependencies:
dependent._add_dependency(concrete)
def _replace_node(self, replacement):
"""Replace this spec with another.
Connects all dependents of this spec to its replacement, and
disconnects this spec from any dependencies it has. New spec
will have any dependencies the replacement had, and may need
to be normalized.
"""
for name, dependent in self.dependents.items():
del dependent.dependencies[self.name]
dependent._add_dependency(replacement)
for name, dep in self.dependencies.items():
del dep.dependents[self.name]
del self.dependencies[dep.name]
def _expand_virtual_packages(self):
@ -807,18 +839,80 @@ def _expand_virtual_packages(self):
this are infrequent, but should implement this before it is
a problem.
"""
# Make an index of stuff this spec already provides
self_index = ProviderIndex(self.traverse(), restrict=True)
changed = False
done = False
while not done:
done = True
for spec in list(self.traverse()):
if spack.concretizer.concretize_virtual_and_external(spec):
done = False
replacement = None
if spec.virtual:
replacement = self._find_provider(spec, self_index)
if replacement:
# TODO: may break if in-place on self but
# shouldn't happen if root is traversed first.
spec._replace_with(replacement)
done=False
break
if not replacement:
# Get a list of possible replacements in order of preference.
candidates = spack.concretizer.choose_virtual_or_external(spec)
# Try the replacements in order, skipping any that cause
# satisfiability problems.
for replacement in candidates:
if replacement is spec:
break
# Replace spec with the candidate and normalize
copy = self.copy()
copy[spec.name]._dup(replacement.copy(deps=False))
try:
# If there are duplicate providers or duplicate provider
# deps, consolidate them and merge constraints.
copy.normalize(force=True)
break
except SpecError as e:
# On error, we'll try the next replacement.
continue
# If replacement is external then trim the dependencies
if replacement.external:
if (spec.dependencies):
changed = True
spec.dependencies = DependencyMap()
replacement.dependencies = DependencyMap()
# TODO: could this and the stuff in _dup be cleaned up?
def feq(cfield, sfield):
return (not cfield) or (cfield == sfield)
if replacement is spec or (feq(replacement.name, spec.name) and
feq(replacement.versions, spec.versions) and
feq(replacement.compiler, spec.compiler) and
feq(replacement.architecture, spec.architecture) and
feq(replacement.dependencies, spec.dependencies) and
feq(replacement.variants, spec.variants) and
feq(replacement.external, spec.external)):
continue
# Refine this spec to the candidate. This uses
# replace_with AND dup so that it can work in
# place. TODO: make this more efficient.
if spec.virtual:
spec._replace_with(replacement)
changed = True
if spec._dup(replacement, deps=False, cleardeps=False):
changed = True
# If there are duplicate providers or duplicate provider deps, this
# consolidates them and merge constraints.
changed |= self.normalize(force=True)
self_index.update(spec)
done=False
break
return changed
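The control flow above reduces to a reusable pattern: try each candidate on a copy, keep the first whose normalization succeeds, and only then touch the real spec. A generic, hedged sketch (the names here are illustrative, not Spack API):

import copy


def first_workable(state, candidates, apply_to, validate):
    """Commit and return the first candidate that survives validation,
    or None if none of them do."""
    for cand in candidates:
        trial = copy.deepcopy(state)
        apply_to(trial, cand)
        try:
            validate(trial)           # plays the role of copy.normalize(force=True)
        except ValueError:
            continue                  # this candidate breaks constraints; try the next
        apply_to(state, cand)         # safe now: apply to the live state
        return cand
    return None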
@ -842,7 +936,7 @@ def concretize(self):
force = False
while changed:
changes = (self.normalize(force=force),
changes = (self.normalize(force),
self._expand_virtual_packages(),
self._concretize_helper())
changed = any(changes)
@ -968,8 +1062,8 @@ def _evaluate_dependency_conditions(self, name):
def _find_provider(self, vdep, provider_index):
"""Find provider for a virtual spec in the provider index.
Raise an exception if there is a conflicting virtual
dependency already in this spec.
Raise an exception if there is a conflicting virtual
dependency already in this spec.
"""
assert(vdep.virtual)
providers = provider_index.providers_for(vdep)
@ -1010,17 +1104,14 @@ def _merge_dependency(self, dep, visited, spec_deps, provider_index):
"""
changed = False
# If it's a virtual dependency, try to find a provider and
# merge that.
# If it's a virtual dependency, try to find an existing
# provider in the spec, and merge that.
if dep.virtual:
visited.add(dep.name)
provider = self._find_provider(dep, provider_index)
if provider:
dep = provider
else:
# if it's a real dependency, check whether it provides
# something already required in the spec.
index = ProviderIndex([dep], restrict=True)
for vspec in (v for v in spec_deps.values() if v.virtual):
if index.providers_for(vspec):
@ -1117,13 +1208,14 @@ def normalize(self, force=False):
# Get all the dependencies into one DependencyMap
spec_deps = self.flat_dependencies(copy=False)
# Initialize index of virtual dependency providers
index = ProviderIndex(spec_deps.values(), restrict=True)
# Initialize index of virtual dependency providers if
# concretize didn't pass us one already
provider_index = ProviderIndex(spec_deps.values(), restrict=True)
# traverse the package DAG and fill out dependencies according
# to package files & their 'when' specs
visited = set()
any_change = self._normalize_helper(visited, spec_deps, index)
any_change = self._normalize_helper(visited, spec_deps, provider_index)
# If there are deps specified but not visited, they're not
# actually deps of this package. Raise an error.
@ -1161,7 +1253,7 @@ def validate_names(self):
# Ensure that variants all exist.
for vname, variant in spec.variants.items():
if vname not in spec.package.variants:
if vname not in spec.package_class.variants:
raise UnknownVariantError(spec.name, vname)
@ -1402,13 +1494,12 @@ def _dup(self, other, **kwargs):
Whether deps should be copied too. Set to false to copy a
spec but not its dependencies.
"""
# We don't count dependencies as changes here
changed = True
if hasattr(self, 'name'):
changed = (self.name != other.name and self.versions != other.versions and \
self.architecture != other.architecture and self.compiler != other.compiler and \
self.variants != other.variants and self._normal != other._normal and \
changed = (self.name != other.name and self.versions != other.versions and
self.architecture != other.architecture and self.compiler != other.compiler and
self.variants != other.variants and self._normal != other._normal and
self.concrete != other.concrete and self.external != other.external)
# Local node attributes get copied first.

View File

@ -142,6 +142,34 @@ def test_concretize_with_provides_when(self):
for spec in spack.repo.providers_for('mpi@3')))
def test_concretize_two_virtuals(self):
"""Test a package with multiple virtual dependencies."""
s = Spec('hypre').concretize()
def test_concretize_two_virtuals_with_one_bound(self):
"""Test a package with multiple virtual dependencies and one preset."""
s = Spec('hypre ^openblas').concretize()
def test_concretize_two_virtuals_with_two_bound(self):
"""Test a package with multiple virtual dependencies and two of them preset."""
s = Spec('hypre ^openblas ^netlib-lapack').concretize()
def test_concretize_two_virtuals_with_dual_provider(self):
"""Test a package with multiple virtual dependencies and force a provider
that provides both."""
s = Spec('hypre ^openblas-with-lapack').concretize()
def test_concretize_two_virtuals_with_dual_provider_and_a_conflict(self):
"""Test a package with multiple virtual dependencies and force a provider
that provides both, and another conflicting package that provides one."""
s = Spec('hypre ^openblas-with-lapack ^netlib-lapack')
self.assertRaises(spack.spec.MultipleProviderError, s.concretize)
def test_virtual_is_fully_expanded_for_callpath(self):
# force dependence on fake "zmpi" by asking for MPI 10.0
spec = Spec('callpath ^mpi@10.0')
@ -281,4 +309,3 @@ def test_find_spec_none(self):
Spec('d')),
Spec('e'))
self.assertEqual(None, find_spec(s['b'], lambda s: '+foo' in s))

View File

@ -52,11 +52,11 @@
mock_packages_config = """\
packages:
externaltool:
nobuild: True
buildable: False
paths:
externaltool@1.0%gcc@4.5.0: /path/to/external_tool
externalvirtual:
nobuild: True
buildable: False
paths:
externalvirtual@2.0%clang@3.3: /path/to/external_virtual_clang
externalvirtual@1.0%gcc@4.5.0: /path/to/external_virtual_gcc

View File

@ -0,0 +1,39 @@
##############################################################################
# Copyright (c) 2013-2015, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class Hypre(Package):
"""Hypre is included here as an example of a package that depends on
both LAPACK and BLAS."""
homepage = "http://www.openblas.net"
url = "http://github.com/xianyi/OpenBLAS/archive/v0.2.15.tar.gz"
version('0.2.15', 'b1190f3d3471685f17cfd1ec1d252ac9')
depends_on('lapack')
depends_on('blas')
def install(self, spec, prefix):
pass

View File

@ -0,0 +1,38 @@
##############################################################################
# Copyright (c) 2013-2015, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class OpenblasWithLapack(Package):
"""Dummy version of OpenBLAS that also provides LAPACK, for testing."""
homepage = "http://www.openblas.net"
url = "http://github.com/xianyi/OpenBLAS/archive/v0.2.15.tar.gz"
version('0.2.15', 'b1190f3d3471685f17cfd1ec1d252ac9')
provides('lapack')
provides('blas')
def install(self, spec, prefix):
pass

View File

@ -30,6 +30,7 @@ class Cmake(Package):
homepage = 'https://www.cmake.org'
url = 'https://cmake.org/files/v3.4/cmake-3.4.3.tar.gz'
version('3.5.0', '33c5d09d4c33d4ffcc63578a6ba8777e')
version('3.4.3', '4cb3ff35b2472aae70f542116d616e63')
version('3.4.0', 'cd3034e0a44256a0917e254167217fc8')
version('3.3.1', '52638576f4e1e621fed6c3410d3a1b12')
@ -37,16 +38,48 @@ class Cmake(Package):
version('2.8.10.2', '097278785da7182ec0aea8769d06860c')
variant('ncurses', default=True, description='Enables the build of the ncurses gui')
variant('qt', default=False, description='Enables the build of cmake-gui')
variant('doc', default=False, description='Enables the generation of html and man page documentation')
depends_on('ncurses', when='+ncurses')
depends_on('qt', when='+qt')
depends_on('python@2.7.11:', when='+doc')
depends_on('py-sphinx', when='+doc')
def url_for_version(self, version):
"""Handle CMake's version-based custom URLs."""
return 'https://cmake.org/files/v%s/cmake-%s.tar.gz' % (version.up_to(2), version)
def validate(self, spec):
"""
Checks if incompatible versions of qt were specified
:param spec: spec of the package
:raises RuntimeError: in case of inconsistencies
"""
if '+qt' in spec and spec.satisfies('^qt@5.4.0'):
msg = 'qt-5.4.0 has broken CMake modules.'
raise RuntimeError(msg)
def install(self, spec, prefix):
configure('--prefix=' + prefix,
'--parallel=' + str(make_jobs),
'--', '-DCMAKE_USE_OPENSSL=ON')
# Consistency check
self.validate(spec)
# configure, build, install:
options = ['--prefix=%s' % prefix]
options.append('--parallel=%s' % str(make_jobs))
if '+qt' in spec:
options.append('--qt-gui')
if '+doc' in spec:
options.append('--sphinx-html')
options.append('--sphinx-man')
options.append('--')
options.append('-DCMAKE_USE_OPENSSL=ON')
configure(*options)
make()
make('install')

View File

@ -0,0 +1,21 @@
from spack import *
class Emacs(Package):
"""The Emacs programmable text editor."""
homepage = "https://www.gnu.org/software/emacs"
url = "http://ftp.gnu.org/gnu/emacs/emacs-24.5.tar.gz"
version('24.5', 'd74b597503a68105e61b5b9f6d065b44')
depends_on('ncurses')
# Emacs also depends on:
# GTK or other widget library
# libtiff, png, etc.
# For now, we assume the system provides all that stuff.
# For Ubuntu 14.04 LTS:
# sudo apt-get install libgtk-3-dev libxpm-dev libtiff5-dev libjpeg8-dev libgif-dev libpng12-dev
def install(self, spec, prefix):
configure('--prefix=%s' % prefix)
make()
make("install")

View File

@ -0,0 +1,18 @@
from spack import *
class Gl2ps(Package):
"""GL2PS is a C library providing high quality vector output for any
OpenGL application."""
homepage = "http://www.geuz.org/gl2ps/"
url = "http://geuz.org/gl2ps/src/gl2ps-1.3.9.tgz"
version('1.3.9', '377b2bcad62d528e7096e76358f41140')
depends_on("libpng")
def install(self, spec, prefix):
cmake('.', *std_cmake_args)
make()
make("install")

View File

@ -12,6 +12,7 @@ class NetlibLapack(Package):
homepage = "http://www.netlib.org/lapack/"
url = "http://www.netlib.org/lapack/lapack-3.5.0.tgz"
version('3.6.0', 'f2f6c67134e851fe189bb3ca1fbb5101')
version('3.5.0', 'b1d3e3e425b2e44a06760ff173104bdf')
version('3.4.2', '61bf1a8a4469d4bdb7604f5897179478')
version('3.4.1', '44c3869c38c8335c2b9c2a8bb276eb55')

View File

@ -0,0 +1,183 @@
from spack import *
class Octave(Package):
"""GNU Octave is a high-level language, primarily intended for numerical
computations. It provides a convenient command line interface for solving
linear and nonlinear problems numerically, and for performing other
numerical experiments using a language that is mostly compatible with
Matlab. It may also be used as a batch-oriented language."""
homepage = "https://www.gnu.org/software/octave/"
url = "ftp://ftp.gnu.org/gnu/octave/octave-4.0.0.tar.gz"
version('4.0.0' , 'a69f8320a4f20a8480c1b278b1adb799')
# Variants
variant('readline', default=True)
variant('arpack', default=False)
variant('curl', default=False)
variant('fftw', default=False)
variant('fltk', default=False)
variant('fontconfig', default=False)
variant('freetype', default=False)
variant('glpk', default=False)
variant('gl2ps', default=False)
variant('gnuplot', default=False)
variant('magick', default=False)
variant('hdf5', default=False)
variant('jdk', default=False)
variant('llvm', default=False)
variant('opengl', default=False)
variant('qhull', default=False)
variant('qrupdate', default=False)
variant('qscintilla', default=False)
variant('qt', default=False)
variant('suitesparse', default=False)
variant('zlib', default=False)
# Required dependencies
depends_on('blas')
depends_on('lapack')
depends_on('pcre')
# Strongly recommended dependencies
depends_on('readline', when='+readline')
# Optional dependencies
depends_on('arpack', when='+arpack')
depends_on('curl', when='+curl')
depends_on('fftw', when='+fftw')
depends_on('fltk', when='+fltk')
depends_on('fontconfig', when='+fontconfig')
depends_on('freetype', when='+freetype')
depends_on('glpk', when='+glpk')
depends_on('gl2ps', when='+gl2ps')
depends_on('gnuplot', when='+gnuplot')
depends_on('ImageMagick', when='+magick')
depends_on('hdf5', when='+hdf5')
depends_on('jdk', when='+jdk')
depends_on('llvm', when='+llvm')
#depends_on('opengl', when='+opengl') # TODO: add package
depends_on('qhull', when='+qhull')
depends_on('qrupdate', when='+qrupdate')
#depends_on('qscintilla', when='+qscintilla) # TODO: add package
depends_on('qt', when='+qt')
depends_on('SuiteSparse', when='+suitesparse')
depends_on('zlib', when='+zlib')
def install(self, spec, prefix):
config_args = [
"--prefix=%s" % prefix
]
# Required dependencies
config_args.extend([
"--with-blas=%s" % spec['blas'].prefix.lib,
"--with-lapack=%s" % spec['lapack'].prefix.lib
])
# Strongly recommended dependencies
if '+readline' in spec:
config_args.append('--enable-readline')
else:
config_args.append('--disable-readline')
# Optional dependencies
if '+arpack' in spec:
config_args.extend([
"--with-arpack-includedir=%s" % spec['arpack'].prefix.include,
"--with-arpack-libdir=%s" % spec['arpack'].prefix.lib
])
else:
config_args.append("--without-arpack")
if '+curl' in spec:
config_args.extend([
"--with-curl-includedir=%s" % spec['curl'].prefix.include,
"--with-curl-libdir=%s" % spec['curl'].prefix.lib
])
else:
config_args.append("--without-curl")
if '+fftw' in spec:
config_args.extend([
"--with-fftw3-includedir=%s" % spec['fftw'].prefix.include,
"--with-fftw3-libdir=%s" % spec['fftw'].prefix.lib,
"--with-fftw3f-includedir=%s" % spec['fftw'].prefix.include,
"--with-fftw3f-libdir=%s" % spec['fftw'].prefix.lib
])
else:
config_args.extend([
"--without-fftw3",
"--without-fftw3f"
])
if '+fltk' in spec:
config_args.extend([
"--with-fltk-prefix=%s" % spec['fltk'].prefix,
"--with-fltk-exec-prefix=%s" % spec['fltk'].prefix
])
else:
config_args.append("--without-fltk")
if '+glpk' in spec:
config_args.extend([
"--with-glpk-includedir=%s" % spec['glpk'].prefix.include,
"--with-glpk-libdir=%s" % spec['glpk'].prefix.lib
])
else:
config_args.append("--without-glpk")
if '+magick' in spec:
config_args.append("--with-magick=%s" % spec['ImageMagick'].prefix.lib)
if '+hdf5' in spec:
config_args.extend([
"--with-hdf5-includedir=%s" % spec['hdf5'].prefix.include,
"--with-hdf5-libdir=%s" % spec['hdf5'].prefix.lib
])
else:
config_args.append("--without-hdf5")
if '+jdk' in spec:
config_args.extend([
"--with-java-homedir=%s" % spec['jdk'].prefix,
"--with-java-includedir=%s" % spec['jdk'].prefix.include,
"--with-java-libdir=%s" % spec['jdk'].prefix.lib
])
if '~opengl' in spec:
config_args.extend([
"--without-opengl",
"--without-framework-opengl"
])
if '+qhull' in spec:
config_args.extend([
"--with-qhull-includedir=%s" % spec['qhull'].prefix.include,
"--with-qhull-libdir=%s" % spec['qhull'].prefix.lib
])
else:
config_args.append("--without-qhull")
if '+qrupdate' in spec:
config_args.extend([
"--with-qrupdate-includedir=%s" % spec['qrupdate'].prefix.include,
"--with-qrupdate-libdir=%s" % spec['qrupdate'].prefix.lib
])
else:
config_args.append("--without-qrupdate")
if '+zlib' in spec:
config_args.extend([
"--with-z-includedir=%s" % spec['zlib'].prefix.include,
"--with-z-libdir=%s" % spec['zlib'].prefix.lib
])
else:
config_args.append("--without-z")
configure(*config_args)
make()
make("install")

View File

@ -0,0 +1,18 @@
from spack import *
class Qrupdate(Package):
"""qrupdate is a Fortran library for fast updates of QR and
Cholesky decompositions."""
homepage = "http://sourceforge.net/projects/qrupdate/"
url = "https://downloads.sourceforge.net/qrupdate/qrupdate-1.1.2.tar.gz"
version('1.1.2', '6d073887c6e858c24aeda5b54c57a8c4')
depends_on("blas")
depends_on("lapack")
def install(self, spec, prefix):
# Build static and dynamic libraries
make("lib", "solib")
make("install", "PREFIX=%s" % prefix)

View File

@ -8,6 +8,9 @@ class Qt(Package):
list_url = 'http://download.qt-project.org/official_releases/qt/'
list_depth = 2
version('5.4.2', 'fa1c4d819b401b267eb246a543a63ea5',
url='http://download.qt-project.org/official_releases/qt/5.4/5.4.2/single/qt-everywhere-opensource-src-5.4.2.tar.gz')
version('5.4.0', 'e8654e4b37dd98039ba20da7a53877e6',
url='http://download.qt-project.org/official_releases/qt/5.4/5.4.0/single/qt-everywhere-opensource-src-5.4.0.tar.gz')

View File

@ -7,10 +7,11 @@ class Tmux(Package):
do a lot more.
"""
homepage = "http://tmux.sourceforge.net"
url = "http://downloads.sourceforge.net/project/tmux/tmux/tmux-1.9/tmux-1.9a.tar.gz"
homepage = "http://tmux.github.io"
url = "https://github.com/tmux/tmux/releases/download/2.1/tmux-2.1.tar.gz"
version('1.9a', 'b07601711f96f1d260b390513b509a2d')
version('2.1', '74a2855695bccb51b6e301383ad4818c')
depends_on('libevent')
depends_on('ncurses')