Merge branch 'develop' of https://github.com/LLNL/spack into features/custom_modulefile_from_config

alalazo
2016-05-10 11:13:19 +02:00
148 changed files with 3692 additions and 767 deletions


@@ -372,25 +372,32 @@ how this is done is in :ref:`sec-specs`.
``spack compiler add``
~~~~~~~~~~~~~~~~~~~~~~~
An alias for ``spack compiler find``.
.. _spack-compiler-find:
``spack compiler find``
~~~~~~~~~~~~~~~~~~~~~~~
If you do not see a compiler in this list, but you want to use it with
Spack, you can simply run ``spack compiler add`` with the path to
Spack, you can simply run ``spack compiler find`` with the path to
where the compiler is installed. For example::
$ spack compiler add /usr/local/tools/ic-13.0.079
$ spack compiler find /usr/local/tools/ic-13.0.079
==> Added 1 new compiler to /Users/gamblin2/.spack/compilers.yaml
intel@13.0.079
Or you can run ``spack compiler add`` with no arguments to force
Or you can run ``spack compiler find`` with no arguments to force
auto-detection. This is useful if you do not know where compilers are
installed, but you know that new compilers have been added to your
``PATH``. For example, using dotkit, you might do this::
$ module load gcc-4.9.0
$ spack compiler add
$ spack compiler find
==> Added 1 new compiler to /Users/gamblin2/.spack/compilers.yaml
gcc@4.9.0
This loads the environment module for gcc-4.9.0 to get it into the
This loads the environment module for gcc-4.9.0 to add it to
``PATH``, and then it adds the compiler to Spack.
.. _spack-compiler-info:
@@ -807,17 +814,22 @@ Environment Modules, you can get it with Spack:
1. Install with::
.. code-block:: sh
spack install environment-modules
2. Activate with::
MODULES_HOME=`spack location -i environment-modules`
MODULES_VERSION=`ls -1 $MODULES_HOME/Modules | head -1`
${MODULES_HOME}/Modules/${MODULES_VERSION}/bin/add.modules
Add the following two lines to your ``.bashrc`` profile (or similar):
.. code-block:: sh
MODULES_HOME=`spack location -i environment-modules`
source ${MODULES_HOME}/Modules/init/bash
If you use a Unix shell other than bash, substitute ``bash`` with
the appropriate file in ``${MODULES_HOME}/Modules/init/``.
This adds to your ``.bashrc`` (or similar) files, enabling Environment
Modules when you log in. It will ask your permission before changing
any files.
Spack and Environment Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


@@ -1803,15 +1803,15 @@ Compile-time library search paths
* ``-L$dep_prefix/lib``
* ``-L$dep_prefix/lib64``
Runtime library search paths (RPATHs)
* ``-Wl,-rpath,$dep_prefix/lib``
* ``-Wl,-rpath,$dep_prefix/lib64``
* ``$rpath_flag$dep_prefix/lib``
* ``$rpath_flag$dep_prefix/lib64``
Include search paths
* ``-I$dep_prefix/include``
An example of this would be the ``libdwarf`` build, which has one
dependency: ``libelf``. Every call to ``cc`` in the ``libdwarf``
build will have ``-I$LIBELF_PREFIX/include``,
``-L$LIBELF_PREFIX/lib``, and ``-Wl,-rpath,$LIBELF_PREFIX/lib``
``-L$LIBELF_PREFIX/lib``, and ``$rpath_flag$LIBELF_PREFIX/lib``
inserted on the command line. This is done transparently to the
project's build system, which will just think it's using a system
where ``libelf`` is readily available. Because of this, you **do
@@ -1831,6 +1831,48 @@ successfully find ``libdwarf.h`` and ``libdwarf.so``, without the
packager having to provide ``--with-libdwarf=/path/to/libdwarf`` on
the command line.
.. note::
For most compilers, ``$rpath_flag`` is ``-Wl,-rpath,``. However, NAG
passes its flags to GCC instead of passing them directly to the linker.
Therefore, its ``$rpath_flag`` is doubly wrapped: ``-Wl,-Wl,,-rpath,``.
``$rpath_flag`` can be overridden on a compiler-specific basis in
``lib/spack/spack/compilers/$compiler.py``.
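As a purely illustrative sketch (the ``MyCompiler`` class below is hypothetical
and not part of Spack), such an override is just a redefinition of the
per-language properties, mirroring the doubly wrapped NAG flag described above:

.. code-block:: python

   from spack.compiler import *

   class MyCompiler(Compiler):
       # Hypothetical subclass: pass the rpath through an extra -Wl layer,
       # as is needed for compilers that forward their options to GCC.
       @property
       def f77_rpath_arg(self):
           return '-Wl,-Wl,,-rpath,'

       @property
       def fc_rpath_arg(self):
           return '-Wl,-Wl,,-rpath,'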
Compiler flags
~~~~~~~~~~~~~~
In rare circumstances, such as compiling and running small unit tests, a
package developer may need to know the appropriate compiler flags to enable
features like ``OpenMP``, ``c++11``, ``c++14`` and the like. To that end, the
compiler classes in ``spack`` implement the following *properties*:
``openmp_flag``, ``cxx11_flag`` and ``cxx14_flag``, which can be accessed in a
package as ``self.compiler.cxx11_flag`` and so on. Note that if a given
compiler version does not support the requested feature, an error is raised.
Package developers can therefore also use these properties to assert that a
compiler supports the requested feature. This is handy when a package supports
additional variants like
.. code-block:: python
variant('openmp', default=True, description="Enable OpenMP support.")
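As a purely illustrative sketch (the package name and configure arguments are
hypothetical), such a variant might consult these properties inside
``install()`` as follows:

.. code-block:: python

   from spack import *

   class Foo(Package):
       """Hypothetical package, shown only to illustrate the flag properties."""

       variant('openmp', default=True, description="Enable OpenMP support.")

       def install(self, spec, prefix):
           cxxflags = [self.compiler.cxx11_flag]
           if '+openmp' in spec:
               # Aborts with an informative error if the compiler lacks OpenMP.
               cxxflags.append(self.compiler.openmp_flag)
           configure('--prefix=%s' % prefix,
                     'CXXFLAGS=%s' % ' '.join(cxxflags))
           make()
           make('install')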
Message Passing Interface (MPI)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
It is common for high performance computing software to use ``MPI``. As a
result of concretization, a given package can be built with any of several
MPI implementations, such as ``OpenMPI``, ``MPICH`` or ``IntelMPI``.
In some scenarios, configuring a package requires passing it the appropriate
MPI compiler wrappers, such as ``mpicc`` or ``mpic++``.
However, different ``MPI`` implementations may name those wrappers differently.
To make a package's ``install()`` method independent of the chosen ``MPI``
implementation, each package that provides ``MPI`` sets
``self.spec.mpicc``, ``self.spec.mpicxx``, ``self.spec.mpifc`` and ``self.spec.mpif77``
on its own spec to point to the ``C``, ``C++``, ``Fortran 90`` and ``Fortran 77``
``MPI`` wrappers. Package developers are advised to use these variables, for
example ``self.spec['mpi'].mpicc``, instead of hard-coding
``join_path(self.spec['mpi'].prefix.bin, 'mpicc')``, for the reasons outlined above.
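For illustration only, here is a hedged sketch of a dependent package (the
package name and configure arguments are hypothetical) that forwards the
wrappers to its build system:

.. code-block:: python

   from spack import *

   class Bar(Package):
       """Hypothetical MPI-dependent package, for illustration only."""

       depends_on('mpi')

       def install(self, spec, prefix):
           # The same code works whichever MPI implementation was concretized.
           configure('--prefix=%s' % prefix,
                     'CC=%s' % spec['mpi'].mpicc,
                     'CXX=%s' % spec['mpi'].mpicxx,
                     'F77=%s' % spec['mpi'].mpif77,
                     'FC=%s' % spec['mpi'].mpifc)
           make()
           make('install')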
Forking ``install()``
~~~~~~~~~~~~~~~~~~~~~

lib/spack/env/cc

@@ -38,15 +38,20 @@
# -Wl,-rpath arguments for dependency /lib directories.
#
# This is the list of environment variables that need to be set before
# This is an array of environment variables that need to be set before
# the script runs. They are set by routines in spack.build_environment
# as part of spack.package.Package.do_install().
parameters="
SPACK_PREFIX
SPACK_ENV_PATH
SPACK_DEBUG_LOG_DIR
SPACK_COMPILER_SPEC
SPACK_SHORT_SPEC"
parameters=(
SPACK_PREFIX
SPACK_ENV_PATH
SPACK_DEBUG_LOG_DIR
SPACK_COMPILER_SPEC
SPACK_CC_RPATH_ARG
SPACK_CXX_RPATH_ARG
SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_SHORT_SPEC
)
# The compiler input variables are checked for sanity later:
# SPACK_CC, SPACK_CXX, SPACK_F77, SPACK_FC
@@ -64,7 +69,7 @@ function die {
exit 1
}
for param in $parameters; do
for param in ${parameters[@]}; do
if [[ -z ${!param} ]]; then
die "Spack compiler must be run from Spack! Input '$param' is missing."
fi
@@ -85,6 +90,7 @@ done
# ccld compile & link
command=$(basename "$0")
comp="CC"
case "$command" in
cpp)
mode=cpp
@@ -92,18 +98,22 @@ case "$command" in
cc|c89|c99|gcc|clang|icc|pgcc|xlc)
command="$SPACK_CC"
language="C"
comp="CC"
;;
c++|CC|g++|clang++|icpc|pgc++|xlc++)
command="$SPACK_CXX"
language="C++"
comp="CXX"
;;
f90|fc|f95|gfortran|ifort|pgfortran|xlf90|nagfor)
command="$SPACK_FC"
language="Fortran 90"
comp="FC"
;;
f77|gfortran|ifort|pgfortran|xlf|nagfor)
command="$SPACK_F77"
language="Fortran 77"
comp="F77"
;;
ld)
mode=ld
@@ -142,6 +152,9 @@ if [[ -z $mode ]]; then
done
fi
# Set up rpath variable according to language.
eval rpath=\$SPACK_${comp}_RPATH_ARG
# Dump the version and exit if we're in testing mode.
if [[ $SPACK_TEST_COMMAND == dump-mode ]]; then
echo "$mode"
@@ -162,7 +175,7 @@ fi
# It doesn't work with -rpath.
# This variable controls whether they are added.
add_rpaths=true
if [[ mode == ld && $OSTYPE == darwin* ]]; then
if [[ $mode == ld && "$SPACK_SHORT_SPEC" =~ "darwin" ]]; then
for arg in "$@"; do
if [[ $arg == -r ]]; then
add_rpaths=false
@@ -188,7 +201,7 @@ for dep in "${deps[@]}"; do
# Prepend lib and RPATH directories
if [[ -d $dep/lib ]]; then
if [[ $mode == ccld ]]; then
$add_rpaths && args=("-Wl,-rpath,$dep/lib" "${args[@]}")
$add_rpaths && args=("$rpath$dep/lib" "${args[@]}")
args=("-L$dep/lib" "${args[@]}")
elif [[ $mode == ld ]]; then
$add_rpaths && args=("-rpath" "$dep/lib" "${args[@]}")
@@ -199,7 +212,7 @@ for dep in "${deps[@]}"; do
# Prepend lib64 and RPATH directories
if [[ -d $dep/lib64 ]]; then
if [[ $mode == ccld ]]; then
$add_rpaths && args=("-Wl,-rpath,$dep/lib64" "${args[@]}")
$add_rpaths && args=("$rpath$dep/lib64" "${args[@]}")
args=("-L$dep/lib64" "${args[@]}")
elif [[ $mode == ld ]]; then
$add_rpaths && args=("-rpath" "$dep/lib64" "${args[@]}")
@@ -210,9 +223,11 @@ done
# Include all -L's and prefix/whatever dirs in rpath
if [[ $mode == ccld ]]; then
$add_rpaths && args=("-Wl,-rpath,$SPACK_PREFIX/lib" "-Wl,-rpath,$SPACK_PREFIX/lib64" "${args[@]}")
$add_rpaths && args=("$rpath$SPACK_PREFIX/lib64" "${args[@]}")
$add_rpaths && args=("$rpath$SPACK_PREFIX/lib" "${args[@]}")
elif [[ $mode == ld ]]; then
$add_rpaths && args=("-rpath" "$SPACK_PREFIX/lib" "-rpath" "$SPACK_PREFIX/lib64" "${args[@]}")
$add_rpaths && args=("-rpath" "$SPACK_PREFIX/lib64" "${args[@]}")
$add_rpaths && args=("-rpath" "$SPACK_PREFIX/lib" "${args[@]}")
fi
#


@@ -98,21 +98,27 @@ def set_compiler_environment_variables(pkg, env):
# and return it
# TODO : add additional kwargs for better diagnostics, like requestor, ttyout, ttyerr, etc.
link_dir = spack.build_env_path
env.set('CC', join_path(link_dir, pkg.compiler.link_paths['cc']))
env.set('CC', join_path(link_dir, pkg.compiler.link_paths['cc']))
env.set('CXX', join_path(link_dir, pkg.compiler.link_paths['cxx']))
env.set('F77', join_path(link_dir, pkg.compiler.link_paths['f77']))
env.set('FC', join_path(link_dir, pkg.compiler.link_paths['fc']))
env.set('FC', join_path(link_dir, pkg.compiler.link_paths['fc']))
# Set SPACK compiler variables so that our wrapper knows what to call
compiler = pkg.compiler
if compiler.cc:
env.set('SPACK_CC', compiler.cc)
env.set('SPACK_CC', compiler.cc)
if compiler.cxx:
env.set('SPACK_CXX', compiler.cxx)
if compiler.f77:
env.set('SPACK_F77', compiler.f77)
if compiler.fc:
env.set('SPACK_FC', compiler.fc)
env.set('SPACK_FC', compiler.fc)
# Set SPACK compiler rpath flags so that our wrapper knows what to use
env.set('SPACK_CC_RPATH_ARG', compiler.cc_rpath_arg)
env.set('SPACK_CXX_RPATH_ARG', compiler.cxx_rpath_arg)
env.set('SPACK_F77_RPATH_ARG', compiler.f77_rpath_arg)
env.set('SPACK_FC_RPATH_ARG', compiler.fc_rpath_arg)
env.set('SPACK_COMPILER_SPEC', str(pkg.spec.compiler))
return env
@@ -175,8 +181,8 @@ def set_build_environment_variables(pkg, env):
# Add any pkgconfig directories to PKG_CONFIG_PATH
pkg_config_dirs = []
for p in dep_prefixes:
for libdir in ('lib', 'lib64'):
pcdir = join_path(p, libdir, 'pkgconfig')
for maybe in ('lib', 'lib64', 'share'):
pcdir = join_path(p, maybe, 'pkgconfig')
if os.path.isdir(pcdir):
pkg_config_dirs.append(pcdir)
env.set_path('PKG_CONFIG_PATH', pkg_config_dirs)


@@ -22,19 +22,18 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
import argparse
import sys
import llnl.util.tty as tty
from llnl.util.tty.color import colorize
from llnl.util.tty.colify import colify
from llnl.util.lang import index_by
import spack.compilers
import spack.spec
import spack.config
from spack.util.environment import get_path
import spack.spec
from llnl.util.lang import index_by
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
from spack.spec import CompilerSpec
from spack.util.environment import get_path
description = "Manage compilers"
@@ -44,10 +43,10 @@ def setup_parser(subparser):
scopes = spack.config.config_scopes
# Add
add_parser = sp.add_parser('add', help='Add compilers to the Spack configuration.')
add_parser.add_argument('add_paths', nargs=argparse.REMAINDER)
add_parser.add_argument('--scope', choices=scopes, default=spack.cmd.default_modify_scope,
# Find
find_parser = sp.add_parser('find', aliases=['add'], help='Search the system for compilers to add to the Spack configuration.')
find_parser.add_argument('add_paths', nargs=argparse.REMAINDER)
find_parser.add_argument('--scope', choices=scopes, default=spack.cmd.default_modify_scope,
help="Configuration scope to modify.")
# Remove
@@ -70,7 +69,7 @@ def setup_parser(subparser):
help="Configuration scope to read from.")
def compiler_add(args):
def compiler_find(args):
"""Search either $PATH or a list of paths for compilers and add them
to Spack's configuration."""
paths = args.add_paths
@@ -136,7 +135,8 @@ def compiler_list(args):
def compiler(parser, args):
action = { 'add' : compiler_add,
action = { 'add' : compiler_find,
'find' : compiler_find,
'remove' : compiler_remove,
'rm' : compiler_remove,
'info' : compiler_info,


@@ -124,10 +124,12 @@ def __call__(self, stage):
autotools = "configure('--prefix=%s' % prefix)"
cmake = "cmake('.', *std_cmake_args)"
python = "python('setup.py', 'install', '--prefix=%s' % prefix)"
r = "R('CMD', 'INSTALL', '--library=%s' % self.module.r_lib_dir, '%s' % self.stage.archive_file)"
config_lines = ((r'/configure$', 'autotools', autotools),
(r'/CMakeLists.txt$', 'cmake', cmake),
(r'/setup.py$', 'python', python))
(r'/setup.py$', 'python', python),
(r'/NAMESPACE$', 'r', r))
# Peek inside the tarball.
tar = which('tar')
@@ -272,6 +274,10 @@ def create(parser, args):
if guesser.build_system == 'python':
name = 'py-%s' % name
# Prepend 'r-' to R package names, by convention.
if guesser.build_system == 'r':
name = 'r-%s' % name
# Create a directory for the new package.
pkg_path = repo.filename_for_package_name(name)
if os.path.exists(pkg_path) and not args.force:


@@ -23,87 +23,106 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
import xml.etree.ElementTree as ET
import itertools
import re
import os
import codecs
import os
import time
import xml.dom.minidom
import xml.etree.ElementTree as ET
import llnl.util.tty as tty
from llnl.util.filesystem import *
import spack
import spack.cmd
from llnl.util.filesystem import *
from spack.build_environment import InstallError
from spack.fetch_strategy import FetchError
import spack.cmd
description = "Run package installation as a unit test, output formatted results."
def setup_parser(subparser):
subparser.add_argument(
'-j', '--jobs', action='store', type=int,
help="Explicitly set number of make jobs. Default is #cpus.")
subparser.add_argument('-j',
'--jobs',
action='store',
type=int,
help="Explicitly set number of make jobs. Default is #cpus.")
subparser.add_argument(
'-n', '--no-checksum', action='store_true', dest='no_checksum',
help="Do not check packages against checksum")
subparser.add_argument('-n',
'--no-checksum',
action='store_true',
dest='no_checksum',
help="Do not check packages against checksum")
subparser.add_argument(
'-o', '--output', action='store', help="test output goes in this file")
subparser.add_argument('-o', '--output', action='store', help="test output goes in this file")
subparser.add_argument(
'package', nargs=argparse.REMAINDER, help="spec of package to install")
class JunitResultFormat(object):
def __init__(self):
self.root = ET.Element('testsuite')
self.tests = []
def add_test(self, buildId, testResult, buildInfo=None):
self.tests.append((buildId, testResult, buildInfo))
def write_to(self, stream):
self.root.set('tests', '{0}'.format(len(self.tests)))
for buildId, testResult, buildInfo in self.tests:
testcase = ET.SubElement(self.root, 'testcase')
testcase.set('classname', buildId.name)
testcase.set('name', buildId.stringId())
if testResult == TestResult.FAILED:
failure = ET.SubElement(testcase, 'failure')
failure.set('type', "Build Error")
failure.text = buildInfo
elif testResult == TestResult.SKIPPED:
skipped = ET.SubElement(testcase, 'skipped')
skipped.set('type', "Skipped Build")
skipped.text = buildInfo
ET.ElementTree(self.root).write(stream)
subparser.add_argument('package', nargs=argparse.REMAINDER, help="spec of package to install")
class TestResult(object):
PASSED = 0
FAILED = 1
SKIPPED = 2
ERRORED = 3
class BuildId(object):
def __init__(self, spec):
self.name = spec.name
self.version = spec.version
self.hashId = spec.dag_hash()
class TestSuite(object):
def __init__(self, filename):
self.filename = filename
self.root = ET.Element('testsuite')
self.tests = []
def stringId(self):
return "-".join(str(x) for x in (self.name, self.version, self.hashId))
def __enter__(self):
return self
def __hash__(self):
return hash((self.name, self.version, self.hashId))
def append(self, item):
if not isinstance(item, TestCase):
raise TypeError('only TestCase instances may be appended to a TestSuite instance')
self.tests.append(item) # Append the item to the list of tests
def __eq__(self, other):
if not isinstance(other, BuildId):
return False
def __exit__(self, exc_type, exc_val, exc_tb):
# Prepare the header for the entire test suite
number_of_errors = sum(x.result_type == TestResult.ERRORED for x in self.tests)
self.root.set('errors', str(number_of_errors))
number_of_failures = sum(x.result_type == TestResult.FAILED for x in self.tests)
self.root.set('failures', str(number_of_failures))
self.root.set('tests', str(len(self.tests)))
return ((self.name, self.version, self.hashId) ==
(other.name, other.version, other.hashId))
for item in self.tests:
self.root.append(item.element)
with open(self.filename, 'wb') as file:
xml_string = ET.tostring(self.root)
xml_string = xml.dom.minidom.parseString(xml_string).toprettyxml()
file.write(xml_string)
class TestCase(object):
results = {
TestResult.PASSED: None,
TestResult.SKIPPED: 'skipped',
TestResult.FAILED: 'failure',
TestResult.ERRORED: 'error',
}
def __init__(self, classname, name, time=None):
self.element = ET.Element('testcase')
self.element.set('classname', str(classname))
self.element.set('name', str(name))
if time is not None:
self.element.set('time', str(time))
self.result_type = None
def set_result(self, result_type, message=None, error_type=None, text=None):
self.result_type = result_type
result = TestCase.results[self.result_type]
if result is not None and result is not TestResult.PASSED:
subelement = ET.SubElement(self.element, result)
if error_type is not None:
subelement.set('type', error_type)
if message is not None:
subelement.set('message', str(message))
if text is not None:
subelement.text = text
def fetch_log(path):
@@ -114,46 +133,76 @@ def fetch_log(path):
def failed_dependencies(spec):
return set(childSpec for childSpec in spec.dependencies.itervalues() if not
spack.repo.get(childSpec).installed)
return set(item for item in spec.dependencies.itervalues() if not spack.repo.get(item).installed)
def create_test_output(topSpec, newInstalls, output, getLogFunc=fetch_log):
# Post-order traversal is not strictly required but it makes sense to output
# tests for dependencies first.
for spec in topSpec.traverse(order='post'):
if spec not in newInstalls:
continue
def get_top_spec_or_die(args):
specs = spack.cmd.parse_specs(args.package, concretize=True)
if len(specs) > 1:
tty.die("Only 1 top-level package can be specified")
top_spec = iter(specs).next()
return top_spec
failedDeps = failed_dependencies(spec)
package = spack.repo.get(spec)
if failedDeps:
result = TestResult.SKIPPED
dep = iter(failedDeps).next()
depBID = BuildId(dep)
errOutput = "Skipped due to failed dependency: {0}".format(
depBID.stringId())
elif (not package.installed) and (not package.stage.source_path):
result = TestResult.FAILED
errOutput = "Failure to fetch package resources."
elif not package.installed:
result = TestResult.FAILED
lines = getLogFunc(package.build_log_path)
errMessages = list(line for line in lines if
re.search('error:', line, re.IGNORECASE))
errOutput = errMessages if errMessages else lines[-10:]
errOutput = '\n'.join(itertools.chain(
[spec.to_yaml(), "Errors:"], errOutput,
["Build Log:", package.build_log_path]))
else:
result = TestResult.PASSED
errOutput = None
bId = BuildId(spec)
output.add_test(bId, result, errOutput)
def install_single_spec(spec, number_of_jobs):
package = spack.repo.get(spec)
# If it is already installed, skip the test
if spack.repo.get(spec).installed:
testcase = TestCase(package.name, package.spec.short_spec, time=0.0)
testcase.set_result(TestResult.SKIPPED, message='Skipped [already installed]', error_type='already_installed')
return testcase
# If it relies on dependencies that did not install, skip
if failed_dependencies(spec):
testcase = TestCase(package.name, package.spec.short_spec, time=0.0)
testcase.set_result(TestResult.SKIPPED, message='Skipped [failed dependencies]', error_type='dep_failed')
return testcase
# Otherwise try to install the spec
try:
start_time = time.time()
package.do_install(keep_prefix=False,
keep_stage=True,
ignore_deps=False,
make_jobs=number_of_jobs,
verbose=True,
fake=False)
duration = time.time() - start_time
testcase = TestCase(package.name, package.spec.short_spec, duration)
testcase.set_result(TestResult.PASSED)
except InstallError:
# An InstallError is considered a failure (the recipe didn't work correctly)
duration = time.time() - start_time
# Try to get the log
lines = fetch_log(package.build_log_path)
text = '\n'.join(lines)
testcase = TestCase(package.name, package.spec.short_spec, duration)
testcase.set_result(TestResult.FAILED, message='Installation failure', text=text)
except FetchError:
# A FetchError is considered an error (we didn't even start building)
duration = time.time() - start_time
testcase = TestCase(package.name, package.spec.short_spec, duration)
testcase.set_result(TestResult.ERRORED, message='Unable to fetch package')
return testcase
def get_filename(args, top_spec):
if not args.output:
fname = 'test-{x.name}-{x.version}-{hash}.xml'.format(x=top_spec, hash=top_spec.dag_hash())
output_directory = join_path(os.getcwd(), 'test-output')
if not os.path.exists(output_directory):
os.mkdir(output_directory)
output_filename = join_path(output_directory, fname)
else:
output_filename = args.output
return output_filename
def test_install(parser, args):
# Check the input
if not args.package:
tty.die("install requires a package argument")
@@ -162,50 +211,15 @@ def test_install(parser, args):
tty.die("The -j option must be a positive integer!")
if args.no_checksum:
spack.do_checksum = False # TODO: remove this global.
spack.do_checksum = False # TODO: remove this global.
specs = spack.cmd.parse_specs(args.package, concretize=True)
if len(specs) > 1:
tty.die("Only 1 top-level package can be specified")
topSpec = iter(specs).next()
newInstalls = set()
for spec in topSpec.traverse():
package = spack.repo.get(spec)
if not package.installed:
newInstalls.add(spec)
if not args.output:
bId = BuildId(topSpec)
outputDir = join_path(os.getcwd(), "test-output")
if not os.path.exists(outputDir):
os.mkdir(outputDir)
outputFpath = join_path(outputDir, "test-{0}.xml".format(bId.stringId()))
else:
outputFpath = args.output
for spec in topSpec.traverse(order='post'):
# Calling do_install for the top-level package would be sufficient but
# this attempts to keep going if any package fails (other packages which
# are not dependents may succeed)
package = spack.repo.get(spec)
if (not failed_dependencies(spec)) and (not package.installed):
try:
package.do_install(
keep_prefix=False,
keep_stage=True,
ignore_deps=False,
make_jobs=args.jobs,
verbose=True,
fake=False)
except InstallError:
pass
except FetchError:
pass
jrf = JunitResultFormat()
handled = {}
create_test_output(topSpec, newInstalls, jrf)
with open(outputFpath, 'wb') as F:
jrf.write_to(F)
# Get the one and only top spec
top_spec = get_top_spec_or_die(args)
# Get the filename of the test
output_filename = get_filename(args, top_spec)
# TEST SUITE
with TestSuite(output_filename) as test_suite:
# Traverse in post order : each spec is a test case
for spec in top_spec.traverse(order='post'):
test_case = install_single_spec(spec, args.jobs)
test_suite.append(test_case)


@@ -91,11 +91,22 @@ class Compiler(object):
# version suffix for gcc.
suffixes = [r'-.*']
# Names of generic arguments used by this compiler
arg_rpath = '-Wl,-rpath,%s'
# Default flags used by a compiler to set an rpath
@property
def cc_rpath_arg(self):
return '-Wl,-rpath,'
# argument used to get C++11 options
cxx11_flag = "-std=c++11"
@property
def cxx_rpath_arg(self):
return '-Wl,-rpath,'
@property
def f77_rpath_arg(self):
return '-Wl,-rpath,'
@property
def fc_rpath_arg(self):
return '-Wl,-rpath,'
def __init__(self, cspec, cc, cxx, f77, fc):
@@ -117,6 +128,37 @@ def check(exe):
def version(self):
return self.spec.version
# This property should be overridden in the compiler subclass if
# OpenMP is supported by that compiler
@property
def openmp_flag(self):
# If it is not overridden, assume it is not supported and warn the user
tty.die("The compiler you have chosen does not currently support OpenMP.",
"If you think it should, please edit the compiler subclass and",
"submit a pull request or issue.")
# This property should be overridden in the compiler subclass if
# C++11 is supported by that compiler
@property
def cxx11_flag(self):
# If it is not overridden, assume it is not supported and warn the user
tty.die("The compiler you have chosen does not currently support C++11.",
"If you think it should, please edit the compiler subclass and",
"submit a pull request or issue.")
# This property should be overridden in the compiler subclass if
# C++14 is supported by that compiler
@property
def cxx14_flag(self):
# If it is not overridden, assume it is not supported and warn the user
tty.die("The compiler you have chosen does not currently support C++14.",
"If you think it should, please edit the compiler subclass and",
"submit a pull request or issue.")
#
# Compiler classes have methods for querying the version of
# specific compiler executables. This is used when discovering compilers.
@@ -202,6 +244,10 @@ def check(key):
return None
successful = [key for key in parmap(check, checks) if key is not None]
# The 'successful' list is ordered like the input paths.
# Reverse it here so that the dict creation (last insert wins)
# does not spoil the intended precedence.
successful.reverse()
return dict(((v, p, s), path) for v, p, s, path in successful)
@classmethod


@@ -26,6 +26,8 @@
import spack.compiler as cpr
from spack.compiler import *
from spack.util.executable import *
import llnl.util.tty as tty
from spack.version import ver
class Clang(Compiler):
# Subclasses use possible names of C compiler
@@ -47,6 +49,29 @@ class Clang(Compiler):
'f77' : 'f77',
'fc' : 'f90' }
@property
def is_apple(self):
ver_string = str(self.version)
return ver_string.endswith('-apple')
@property
def openmp_flag(self):
if self.is_apple:
tty.die("Clang from Apple does not support OpenMP yet.")
else:
return "-fopenmp"
@property
def cxx11_flag(self):
if self.is_apple:
# FIXME: figure out from which version Apple's clang supports c++11
return "-std=c++11"
else:
if self.version < ver('3.3'):
tty.die("Only Clang 3.3 and above support c++11.")
else:
return "-std=c++11"
@classmethod
def default_version(self, comp):
"""The '--version' option works for clang compilers.


@@ -49,14 +49,25 @@ class Gcc(Compiler):
'f77' : 'gcc/gfortran',
'fc' : 'gcc/gfortran' }
@property
def openmp_flag(self):
return "-fopenmp"
@property
def cxx11_flag(self):
if self.version < ver('4.3'):
tty.die("Only gcc 4.3 and above support c++11.")
elif self.version < ver('4.7'):
return "-std=gnu++0x"
return "-std=c++0x"
else:
return "-std=gnu++11"
return "-std=c++11"
@property
def cxx14_flag(self):
if self.version < ver('4.8'):
tty.die("Only gcc 4.8 and above support c++14.")
else:
return "-std=c++14"
@classmethod
def fc_version(cls, fc):


@@ -23,6 +23,8 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack.compiler import *
import llnl.util.tty as tty
from spack.version import ver
class Intel(Compiler):
# Subclasses use possible names of C compiler
@@ -43,6 +45,13 @@ class Intel(Compiler):
'f77' : 'intel/ifort',
'fc' : 'intel/ifort' }
@property
def openmp_flag(self):
if self.version < ver('16.0'):
return "-openmp"
else:
return "-qopenmp"
@property
def cxx11_flag(self):
if self.version < ver('11.1'):
@@ -68,5 +77,3 @@ def default_version(cls, comp):
"""
return get_compiler_version(
comp, '--version', r'\((?:IFORT|ICC)\) ([^ ]+)')


@@ -1,4 +1,5 @@
from spack.compiler import *
import llnl.util.tty as tty
class Nag(Compiler):
# Subclasses use possible names of C compiler
@@ -20,6 +21,27 @@ class Nag(Compiler):
'f77' : 'nag/nagfor',
'fc' : 'nag/nagfor' }
@property
def openmp_flag(self):
return "-openmp"
@property
def cxx11_flag(self):
# NAG does not have a C++ compiler
# However, it can be mixed with a compiler that does support it
return "-std=c++11"
# Unlike other compilers, the NAG compiler passes options to GCC, which
# then passes them to the linker. Therefore, we need to doubly wrap the
# options with '-Wl,-Wl,,'
@property
def f77_rpath_arg(self):
return '-Wl,-Wl,,-rpath,'
@property
def fc_rpath_arg(self):
return '-Wl,-Wl,,-rpath,'
@classmethod
def default_version(self, comp):
"""The '-V' option works for nag compilers.


@@ -23,6 +23,7 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack.compiler import *
import llnl.util.tty as tty
class Pgi(Compiler):
# Subclasses use possible names of C compiler
@@ -43,6 +44,15 @@ class Pgi(Compiler):
'f77' : 'pgi/pgfortran',
'fc' : 'pgi/pgfortran' }
@property
def openmp_flag(self):
return "-mp"
@property
def cxx11_flag(self):
return "-std=c++11"
@classmethod
def default_version(cls, comp):
"""The '-V' option works for all the PGI compilers.
@@ -54,4 +64,3 @@ def default_version(cls, comp):
"""
return get_compiler_version(
comp, '-V', r'pg[^ ]* ([^ ]+) \d\d\d?-bit target')


@@ -24,6 +24,8 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack.compiler import *
import llnl.util.tty as tty
from spack.version import ver
class Xl(Compiler):
# Subclasses use possible names of C compiler
@@ -44,6 +46,10 @@ class Xl(Compiler):
'f77' : 'xl/xlf',
'fc' : 'xl/xlf90' }
@property
def openmp_flag(self):
return "-qsmp=omp"
@property
def cxx11_flag(self):
if self.version < ver('13.1'):


@@ -618,14 +618,16 @@ def update_config(section, update_data, scope=None):
other yaml-ish structure.
"""
# read in the config to ensure we've got current data
get_config(section)
validate_section_name(section) # validate section name
scope = validate_scope(scope) # get ConfigScope object from string.
validate_section_name(section) # validate section name
scope = validate_scope(scope) # get ConfigScope object from string.
# read in the config to ensure we've got current data
configuration = get_config(section)
configuration.update(update_data)
# read only the requested section's data.
scope.sections[section] = { section : update_data }
scope.sections[section] = {section: configuration}
scope.write_section(section)


@@ -157,12 +157,26 @@ def fetch(self):
tty.msg("Already downloaded %s" % self.archive_file)
return
possible_files = self.stage.expected_archive_files
save_file = None
partial_file = None
if possible_files:
save_file = self.stage.expected_archive_files[0]
partial_file = self.stage.expected_archive_files[0] + '.part'
tty.msg("Trying to fetch from %s" % self.url)
curl_args = ['-O', # save file to disk
if partial_file:
save_args = ['-C', '-', # continue partial downloads
'-o', partial_file] # use a .part file
else:
save_args = ['-O']
curl_args = save_args + [
'-f', # fail on >400 errors
'-D', '-', # print out HTTP headers
'-L', self.url, ]
'-L', # resolve 3xx redirects
self.url, ]
if sys.stdout.isatty():
curl_args.append('-#') # status bar when using a tty
@@ -178,6 +192,9 @@ def fetch(self):
if self.archive_file:
os.remove(self.archive_file)
if partial_file and os.path.exists(partial_file):
os.remove(partial_file)
if spack.curl.returncode == 22:
# This is a 404. Curl will print the error.
raise FailedDownloadError(
@@ -209,6 +226,9 @@ def fetch(self):
"'spack clean <package>' to remove the bad archive, then fix",
"your internet gateway issue and install again.")
if save_file:
os.rename(partial_file, save_file)
if not self.archive_file:
raise FailedDownloadError(self.url)


@@ -210,6 +210,18 @@ def _need_to_create_path(self):
return False
@property
def expected_archive_files(self):
"""Possible archive file paths."""
paths = []
if isinstance(self.fetcher, fs.URLFetchStrategy):
paths.append(os.path.join(self.path, os.path.basename(self.fetcher.url)))
if self.mirror_path:
paths.append(os.path.join(self.path, os.path.basename(self.mirror_path)))
return paths
@property
def archive_file(self):
"""Path to the source archive within this stage directory."""


@@ -62,14 +62,14 @@
'optional_deps',
'make_executable',
'configure_guess',
'unit_install',
'lock',
'database',
'namespace_trie',
'yaml',
'sbang',
'environment',
'cmd.uninstall']
'cmd.uninstall',
'cmd.test_install']
def list_tests():


@@ -67,6 +67,11 @@ def setUp(self):
os.environ['SPACK_COMPILER_SPEC'] = "gcc@4.4.7"
os.environ['SPACK_SHORT_SPEC'] = "foo@1.2"
os.environ['SPACK_CC_RPATH_ARG'] = "-Wl,-rpath,"
os.environ['SPACK_CXX_RPATH_ARG'] = "-Wl,-rpath,"
os.environ['SPACK_F77_RPATH_ARG'] = "-Wl,-rpath,"
os.environ['SPACK_FC_RPATH_ARG'] = "-Wl,-rpath,"
# Make some fake dependencies
self.tmp_deps = tempfile.mkdtemp()
self.dep1 = join_path(self.tmp_deps, 'dep1')
@@ -219,3 +224,27 @@ def test_ld_deps(self):
' '.join(test_command))
def test_ld_deps_reentrant(self):
"""Make sure ld -r is handled correctly on OS's where it doesn't
support rpaths."""
os.environ['SPACK_DEPENDENCIES'] = ':'.join([self.dep1])
os.environ['SPACK_SHORT_SPEC'] = "foo@1.2=linux-x86_64"
reentrant_test_command = ['-r'] + test_command
self.check_ld('dump-args', reentrant_test_command,
'ld ' +
'-rpath ' + self.prefix + '/lib ' +
'-rpath ' + self.prefix + '/lib64 ' +
'-L' + self.dep1 + '/lib ' +
'-rpath ' + self.dep1 + '/lib ' +
'-r ' +
' '.join(test_command))
os.environ['SPACK_SHORT_SPEC'] = "foo@1.2=darwin-x86_64"
self.check_ld('dump-args', reentrant_test_command,
'ld ' +
'-L' + self.dep1 + '/lib ' +
'-r ' +
' '.join(test_command))


@@ -0,0 +1,190 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import collections
from contextlib import contextmanager
import StringIO
FILE_REGISTRY = collections.defaultdict(StringIO.StringIO)
# Monkey-patch open to write module files to a StringIO instance
@contextmanager
def mock_open(filename, mode):
if not mode == 'wb':
raise RuntimeError('test.test_install : unexpected opening mode for monkey-patched open')
FILE_REGISTRY[filename] = StringIO.StringIO()
try:
yield FILE_REGISTRY[filename]
finally:
handle = FILE_REGISTRY[filename]
FILE_REGISTRY[filename] = handle.getvalue()
handle.close()
import os
import itertools
import unittest
import spack
import spack.cmd
# The use of __import__ is necessary to maintain a name containing a hyphen (which cannot be a Python identifier)
test_install = __import__("spack.cmd.test-install", fromlist=['test_install'])
class MockSpec(object):
def __init__(self, name, version, hashStr=None):
self.dependencies = {}
self.name = name
self.version = version
self.hash = hashStr if hashStr else hash((name, version))
def traverse(self, order=None):
for _, spec in self.dependencies.items():
yield spec
yield self
#allDeps = itertools.chain.from_iterable(i.traverse() for i in self.dependencies.itervalues())
#return set(itertools.chain([self], allDeps))
def dag_hash(self):
return self.hash
@property
def short_spec(self):
return '-'.join([self.name, str(self.version), str(self.hash)])
class MockPackage(object):
def __init__(self, spec, buildLogPath):
self.name = spec.name
self.spec = spec
self.installed = False
self.build_log_path = buildLogPath
def do_install(self, *args, **kwargs):
self.installed = True
class MockPackageDb(object):
def __init__(self, init=None):
self.specToPkg = {}
if init:
self.specToPkg.update(init)
def get(self, spec):
return self.specToPkg[spec]
def mock_fetch_log(path):
return []
specX = MockSpec('X', "1.2.0")
specY = MockSpec('Y', "2.3.8")
specX.dependencies['Y'] = specY
pkgX = MockPackage(specX, 'logX')
pkgY = MockPackage(specY, 'logY')
class MockArgs(object):
def __init__(self, package):
self.package = package
self.jobs = None
self.no_checksum = False
self.output = None
# TODO: add test(s) where Y fails to install
class TestInstallTest(unittest.TestCase):
"""
Tests test-install where X->Y
"""
def setUp(self):
super(TestInstallTest, self).setUp()
# Monkey patch parse specs
def monkey_parse_specs(x, concretize):
if x == 'X':
return [specX]
elif x == 'Y':
return [specY]
return []
self.parse_specs = spack.cmd.parse_specs
spack.cmd.parse_specs = monkey_parse_specs
# Monkey patch os.mkdirp
self.os_mkdir = os.mkdir
os.mkdir = lambda x: True
# Monkey patch open
test_install.open = mock_open
# Clean FILE_REGISTRY
FILE_REGISTRY = collections.defaultdict(StringIO.StringIO)
pkgX.installed = False
pkgY.installed = False
# Monkey patch pkgDb
self.saved_db = spack.repo
pkgDb = MockPackageDb({specX: pkgX, specY: pkgY})
spack.repo = pkgDb
def tearDown(self):
# Remove the monkey patched test_install.open
test_install.open = open
# Remove the monkey patched os.mkdir
os.mkdir = self.os_mkdir
del self.os_mkdir
# Remove the monkey patched parse_specs
spack.cmd.parse_specs = self.parse_specs
del self.parse_specs
super(TestInstallTest, self).tearDown()
spack.repo = self.saved_db
def test_installing_both(self):
test_install.test_install(None, MockArgs('X') )
self.assertEqual(len(FILE_REGISTRY), 1)
for _, content in FILE_REGISTRY.items():
self.assertTrue('tests="2"' in content)
self.assertTrue('failures="0"' in content)
self.assertTrue('errors="0"' in content)
def test_dependency_already_installed(self):
pkgX.installed = True
pkgY.installed = True
test_install.test_install(None, MockArgs('X'))
self.assertEqual(len(FILE_REGISTRY), 1)
for _, content in FILE_REGISTRY.items():
self.assertTrue('tests="2"' in content)
self.assertTrue('failures="0"' in content)
self.assertTrue('errors="0"' in content)
self.assertEqual(sum('skipped' in line for line in content.split('\n')), 2)


@@ -33,7 +33,7 @@
# Some sample compiler config data
a_comps = {
"all": {
"x86_64_E5v2_IntelIB": {
"gcc@4.7.3" : {
"cc" : "/gcc473",
"cxx": "/g++473",
@@ -53,7 +53,7 @@
}
b_comps = {
"all": {
"x86_64_E5v3": {
"icc@10.0" : {
"cc" : "/icc100",
"cxx": "/icc100",
@@ -85,27 +85,24 @@ def tearDown(self):
super(ConfigTest, self).tearDown()
shutil.rmtree(self.tmp_dir, True)
def check_config(self, comps, *compiler_names):
def check_config(self, comps, arch, *compiler_names):
"""Check that named compilers in comps match Spack's config."""
config = spack.config.get_config('compilers')
compiler_list = ['cc', 'cxx', 'f77', 'fc']
for key in compiler_names:
for c in compiler_list:
expected = comps['all'][key][c]
actual = config['all'][key][c]
expected = comps[arch][key][c]
actual = config[arch][key][c]
self.assertEqual(expected, actual)
def test_write_key_in_memory(self):
# Write b_comps "on top of" a_comps.
spack.config.update_config('compilers', a_comps, 'test_low_priority')
spack.config.update_config('compilers', b_comps, 'test_high_priority')
# Make sure the config looks how we expect.
self.check_config(a_comps, 'gcc@4.7.3', 'gcc@4.5.0')
self.check_config(b_comps, 'icc@10.0', 'icc@11.1', 'clang@3.3')
self.check_config(a_comps, 'x86_64_E5v2_IntelIB', 'gcc@4.7.3', 'gcc@4.5.0')
self.check_config(b_comps, 'x86_64_E5v3', 'icc@10.0', 'icc@11.1', 'clang@3.3')
def test_write_key_to_disk(self):
# Write b_comps "on top of" a_comps.
@@ -116,5 +113,17 @@ def test_write_key_to_disk(self):
spack.config.clear_config_caches()
# Same check again, to ensure consistency.
self.check_config(a_comps, 'gcc@4.7.3', 'gcc@4.5.0')
self.check_config(b_comps, 'icc@10.0', 'icc@11.1', 'clang@3.3')
self.check_config(a_comps, 'x86_64_E5v2_IntelIB', 'gcc@4.7.3', 'gcc@4.5.0')
self.check_config(b_comps, 'x86_64_E5v3', 'icc@10.0', 'icc@11.1', 'clang@3.3')
def test_write_to_same_priority_file(self):
# Write b_comps in the same file as a_comps.
spack.config.update_config('compilers', a_comps, 'test_low_priority')
spack.config.update_config('compilers', b_comps, 'test_low_priority')
# Clear caches so we're forced to read from disk.
spack.config.clear_config_caches()
# Same check again, to ensure consistency.
self.check_config(a_comps, 'x86_64_E5v2_IntelIB', 'gcc@4.7.3', 'gcc@4.5.0')
self.check_config(b_comps, 'x86_64_E5v3', 'icc@10.0', 'icc@11.1', 'clang@3.3')


@@ -1,126 +0,0 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import itertools
import unittest
import spack
test_install = __import__("spack.cmd.test-install",
fromlist=["BuildId", "create_test_output", "TestResult"])
class MockOutput(object):
def __init__(self):
self.results = {}
def add_test(self, buildId, passed=True, buildInfo=None):
self.results[buildId] = passed
def write_to(self, stream):
pass
class MockSpec(object):
def __init__(self, name, version, hashStr=None):
self.dependencies = {}
self.name = name
self.version = version
self.hash = hashStr if hashStr else hash((name, version))
def traverse(self, order=None):
allDeps = itertools.chain.from_iterable(i.traverse() for i in
self.dependencies.itervalues())
return set(itertools.chain([self], allDeps))
def dag_hash(self):
return self.hash
def to_yaml(self):
return "<<<MOCK YAML {0}>>>".format(test_install.BuildId(self).stringId())
class MockPackage(object):
def __init__(self, buildLogPath):
self.installed = False
self.build_log_path = buildLogPath
specX = MockSpec("X", "1.2.0")
specY = MockSpec("Y", "2.3.8")
specX.dependencies['Y'] = specY
pkgX = MockPackage('logX')
pkgY = MockPackage('logY')
bIdX = test_install.BuildId(specX)
bIdY = test_install.BuildId(specY)
class UnitInstallTest(unittest.TestCase):
"""Tests test-install where X->Y"""
def setUp(self):
super(UnitInstallTest, self).setUp()
pkgX.installed = False
pkgY.installed = False
self.saved_db = spack.repo
pkgDb = MockPackageDb({specX:pkgX, specY:pkgY})
spack.repo = pkgDb
def tearDown(self):
super(UnitInstallTest, self).tearDown()
spack.repo = self.saved_db
def test_installing_both(self):
mo = MockOutput()
pkgX.installed = True
pkgY.installed = True
test_install.create_test_output(specX, [specX, specY], mo, getLogFunc=mock_fetch_log)
self.assertEqual(mo.results,
{bIdX:test_install.TestResult.PASSED,
bIdY:test_install.TestResult.PASSED})
def test_dependency_already_installed(self):
mo = MockOutput()
pkgX.installed = True
pkgY.installed = True
test_install.create_test_output(specX, [specX], mo, getLogFunc=mock_fetch_log)
self.assertEqual(mo.results, {bIdX:test_install.TestResult.PASSED})
#TODO: add test(s) where Y fails to install
class MockPackageDb(object):
def __init__(self, init=None):
self.specToPkg = {}
if init:
self.specToPkg.update(init)
def get(self, spec):
return self.specToPkg[spec]
def mock_fetch_log(path):
return []


@@ -206,6 +206,9 @@ def parse_version_offset(path):
# e.g. lame-398-1
(r'-((\d)+-\d)', stem),
# e.g. foobar_1.2-3
(r'_((\d+\.)+\d+(-\d+)?[a-z]?)', stem),
# e.g. foobar-4.5.1
(r'-((\d+\.)*\d+)$', stem),


@@ -144,7 +144,7 @@ def streamify(arg, mode):
cmd = self.exe + list(args)
cmd_line = ' '.join(cmd)
cmd_line = "'%s'" % "' '".join(map(lambda arg: arg.replace("'", "'\"'\"'"), cmd))
tty.debug(cmd_line)
try: