Merge branch 'develop' of https://github.com/LLNL/spack into fixes/intel_openmpi
commit 4887936809
@@ -19,7 +19,7 @@ written in pure Python, and specs allow package authors to write a
 single build script for many different builds of the same package.
 
 See the
-[Feature Overview](http://llnl.github.io/spack/features.html)
+[Feature Overview](http://software.llnl.gov/spack/features.html)
 for examples and highlights.
 
 To install spack and install your first package:
@@ -31,7 +31,7 @@ To install spack and install your first package:
 
 Documentation
 ----------------
 
-[**Full documentation**](http://llnl.github.io/spack) for Spack is
+[**Full documentation**](http://software.llnl.gov/spack) for Spack is
 the first place to look.
 
 See also:
bin/sbang  (new executable file, 84 lines)
@@ -0,0 +1,84 @@
+#!/bin/bash
+#
+# `sbang`: Run scripts with long shebang lines.
+#
+# Many operating systems limit the length of shebang lines, making it
+# hard to use interpreters that are deep in the directory hierarchy.
+# `sbang` can run such scripts, either as a shebang interpreter, or
+# directly on the command line.
+#
+# Usage
+# -----------------------------
+# Suppose you have a script, long-shebang.sh, like this:
+#
+#     1    #!/very/long/path/to/some/interpreter
+#     2
+#     3    echo "success!"
+#
+# Invoking this script will result in an error on some OS's.  On
+# Linux, you get this:
+#
+#     $ ./long-shebang.sh
+#     -bash: ./long: /very/long/path/to/some/interp: bad interpreter:
+#            No such file or directory
+#
+# On Mac OS X, the system simply assumes the interpreter is the shell
+# and tries to run with it, which is likely not what you want.
+#
+#
+# `sbang` on the command line
+# -----------------------------
+# You can use `sbang` in two ways.  The first is to use it directly,
+# from the command line, like this:
+#
+#     $ sbang ./long-shebang.sh
+#     success!
+#
+#
+# `sbang` as the interpreter
+# -----------------------------
+# You can also use `sbang` *as* the interpreter for your script.  Put
+# `#!/bin/bash /path/to/sbang` on line 1, and move the original
+# shebang to line 2 of the script:
+#
+#     1    #!/bin/bash /path/to/sbang
+#     2    #!/long/path/to/real/interpreter with arguments
+#     3
+#     4    echo "success!"
+#
+#     $ ./long-shebang.sh
+#     success!
+#
+# On Linux, you could shorten line 1 to `#!/path/to/sbang`, but other
+# operating systems like Mac OS X require the interpreter to be a
+# binary, so it's best to use `sbang` as a `bash` argument.
+# Obviously, for this to work, `sbang` needs to have a short enough
+# path that *it* will run without hitting OS limits.
+#
+#
+# How it works
+# -----------------------------
+# `sbang` is a very simple bash script.  It looks at the first two
+# lines of a script argument and runs the last line starting with
+# `#!`, with the script as an argument.  It also forwards arguments.
+#
+
+# First argument is the script we want to actually run.
+script="$1"
+
+# Search the first two lines of script for interpreters.
+lines=0
+while read line && ((lines < 2)) ; do
+    if [[ "$line" = '#!'* ]]; then
+        interpreter="${line#\#!}"
+    fi
+    lines=$((lines+1))
+done < "$script"
+
+# Invoke any interpreter found, or raise an error if none was found.
+if [ -n "$interpreter" ]; then
+    exec $interpreter "$@"
+else
+    echo "error: sbang found no interpreter in $script"
+    exit 1
+fi
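The detection rule described in the script's header comment (run the last `#!` line found among the script's first two lines) can be sketched in Python. This is an illustrative re-implementation for clarity, not part of the commit:

```python
import os
import tempfile

def find_interpreter(path):
    """Mimic sbang: return the last '#!' line among the first two
    lines of the script, with the '#!' prefix stripped."""
    interpreter = None
    with open(path) as f:
        for i, line in enumerate(f):
            if i >= 2:
                break
            if line.startswith("#!"):
                interpreter = line[2:].strip()
    return interpreter

# A script whose real shebang sits on line 2, behind the sbang stub.
with tempfile.NamedTemporaryFile('w', suffix='.sh', delete=False) as f:
    f.write("#!/bin/bash /path/to/sbang\n"
            "#!/long/path/to/real/interpreter with arguments\n"
            "echo success\n")
print(find_interpreter(f.name))  # -> /long/path/to/real/interpreter with arguments
os.unlink(f.name)
```

Because the sbang stub itself starts with `#!`, taking the *last* matching line is what lets the real interpreter on line 2 win.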
@@ -896,7 +896,7 @@ Or, similarly with modules, you could type:
 
     $ spack load mpich %gcc@4.4.7
 
 These commands will add appropriate directories to your ``PATH``,
-``MANPATH``, and ``LD_LIBRARY_PATH``.  When you no longer want to use
+``MANPATH``, ``CPATH``, and ``LD_LIBRARY_PATH``.  When you no longer want to use
 a package, you can type unload or unuse similarly:
 
 .. code-block:: sh
@@ -73,19 +73,32 @@ with a high level view of Spack's directory structure::
 
    spack/                  <- installation root
       bin/
         spack              <- main spack executable
 
      etc/
        spack/              <- Spack config files.
                               Can be overridden by files in ~/.spack.
 
      var/
        spack/              <- build & stage directories
+          repos/           <- contains package repositories
+             builtin/      <- pkg repository that comes with Spack
+                repo.yaml  <- descriptor for the builtin repository
+                packages/  <- directories under here contain packages
 
      opt/
        spack/              <- packages are installed here
 
      lib/
        spack/
           docs/            <- source for this documentation
           env/             <- compiler wrappers for build environment
 
           external/        <- external libs included in Spack distro
           llnl/            <- some general-use libraries
 
           spack/           <- spack module; contains Python code
              cmd/          <- each file in here is a spack subcommand
              compilers/    <- compiler description files
              packages/     <- each file in here is a spack package
              test/         <- unit test modules
              util/         <- common code
@@ -103,7 +103,7 @@ creates a simple python file:
 It doesn't take much python coding to get from there to a working
 package:
 
-.. literalinclude:: ../../../var/spack/packages/libelf/package.py
+.. literalinclude:: ../../../var/spack/repos/builtin/packages/libelf/package.py
    :lines: 25-
 
 Spack also provides wrapper functions around common commands like
@@ -22,7 +22,7 @@ go:
 
     $ spack install libelf
 
 For a richer experience, use Spack's `shell support
-<http://llnl.github.io/spack/basic_usage.html#environment-modules>`_:
+<http://software.llnl.gov/spack/basic_usage.html#environment-modules>`_:
 
 .. code-block:: sh
@@ -84,7 +84,7 @@ always choose to download just one tarball initially, and run
 
 If it fails entirely, you can get minimal boilerplate by using
 :ref:`spack-edit-f`, or you can manually create a directory and
-``package.py`` file for the package in ``var/spack/packages``.
+``package.py`` file for the package in ``var/spack/repos/builtin/packages``.
 
 .. note::
@@ -203,7 +203,7 @@ edit`` command:
 So, if you used ``spack create`` to create a package, then saved and
 closed the resulting file, you can get back to it with ``spack edit``.
 The ``cmake`` package actually lives in
-``$SPACK_ROOT/var/spack/packages/cmake/package.py``, but this provides
+``$SPACK_ROOT/var/spack/repos/builtin/packages/cmake/package.py``, but this provides
 a much simpler shortcut and saves you the trouble of typing the full
 path.
@@ -269,18 +269,18 @@ live in Spack's directory structure.  In general, `spack-create`_ and
 `spack-edit`_ handle creating package files for you, so you can skip
 most of the details here.
 
-``var/spack/packages``
-~~~~~~~~~~~~~~~~~~~~~~~
+``var/spack/repos/builtin/packages``
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 A Spack installation directory is structured like a standard UNIX
 install prefix (``bin``, ``lib``, ``include``, ``var``, ``opt``,
 etc.).  Most of the code for Spack lives in ``$SPACK_ROOT/lib/spack``.
-Packages themselves live in ``$SPACK_ROOT/var/spack/packages``.
+Packages themselves live in ``$SPACK_ROOT/var/spack/repos/builtin/packages``.
 
 If you ``cd`` to that directory, you will see directories for each
 package:
 
-.. command-output:: cd $SPACK_ROOT/var/spack/packages; ls -CF
+.. command-output:: cd $SPACK_ROOT/var/spack/repos/builtin/packages; ls -CF
    :shell:
    :ellipsis: 10
@@ -288,7 +288,7 @@ Each directory contains a file called ``package.py``, which is where
 all the python code for the package goes.  For example, the ``libelf``
 package lives in::
 
-   $SPACK_ROOT/var/spack/packages/libelf/package.py
+   $SPACK_ROOT/var/spack/repos/builtin/packages/libelf/package.py
 
 Alongside the ``package.py`` file, a package may contain extra
 directories or files (like patches) that it needs to build.
@@ -301,7 +301,7 @@ Packages are named after the directory containing ``package.py``.  So,
 ``libelf``'s ``package.py`` lives in a directory called ``libelf``.
 The ``package.py`` file defines a class called ``Libelf``, which
 extends Spack's ``Package`` class.  For example, here is
-``$SPACK_ROOT/var/spack/packages/libelf/package.py``:
+``$SPACK_ROOT/var/spack/repos/builtin/packages/libelf/package.py``:
 
 .. code-block:: python
    :linenos:
@@ -328,7 +328,7 @@ these:
 
     $ spack install libelf@0.8.13
 
 Spack sees the package name in the spec and looks for
-``libelf/package.py`` in ``var/spack/packages``.  Likewise, if you say
+``libelf/package.py`` in ``var/spack/repos/builtin/packages``.  Likewise, if you say
 ``spack install py-numpy``, then Spack looks for
 ``py-numpy/package.py``.
@@ -401,6 +401,35 @@ construct the new one for ``8.2.1``.
    When you supply a custom URL for a version, Spack uses that URL
    *verbatim* and does not perform extrapolation.
 
+Skipping the expand step
+~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Spack normally expands archives automatically after downloading
+them.  If you want to skip this step (e.g., for self-extracting
+executables and other custom archive types), you can add
+``expand=False`` to a ``version`` directive.
+
+.. code-block:: python
+
+   version('8.2.1', '4136d7b4c04df68b686570afa26988ac',
+           url='http://example.com/foo-8.2.1-special-version.tar.gz', expand=False)
+
+When ``expand`` is set to ``False``, Spack sets the current working
+directory to the directory containing the downloaded archive before it
+calls your ``install`` method.  Within ``install``, the path to the
+downloaded archive is available as ``self.stage.archive_file``.
+
+Here is an example snippet for packages distributed as self-extracting
+archives.  The example sets permissions on the downloaded file to make
+it executable, then runs it with some arguments.
+
+.. code-block:: python
+
+   def install(self, spec, prefix):
+       set_executable(self.stage.archive_file)
+       installer = Executable(self.stage.archive_file)
+       installer('--prefix=%s' % prefix, 'arg1', 'arg2', 'etc.')
+
 Checksums
 ~~~~~~~~~~~~~~~~~
@@ -703,7 +732,7 @@ supply is a filename, then the patch needs to live within the spack
 source tree.  For example, the patch above lives in a directory
 structure like this::
 
-   $SPACK_ROOT/var/spack/packages/
+   $SPACK_ROOT/var/spack/repos/builtin/packages/
       mvapich2/
          package.py
          ad_lustre_rwcontig_open_source.patch
@@ -1533,7 +1562,7 @@ The last element of a package is its ``install()`` method.  This is
 where the real work of installation happens, and it's the main part of
 the package you'll need to customize for each piece of software.
 
-.. literalinclude:: ../../../var/spack/packages/libelf/package.py
+.. literalinclude:: ../../../var/spack/repos/builtin/packages/libelf/package.py
    :start-after: 0.8.12
    :linenos:
@@ -1711,15 +1740,15 @@ Compile-time library search paths
   * ``-L$dep_prefix/lib``
   * ``-L$dep_prefix/lib64``
 Runtime library search paths (RPATHs)
-  * ``-Wl,-rpath=$dep_prefix/lib``
-  * ``-Wl,-rpath=$dep_prefix/lib64``
+  * ``-Wl,-rpath,$dep_prefix/lib``
+  * ``-Wl,-rpath,$dep_prefix/lib64``
 Include search paths
   * ``-I$dep_prefix/include``
 
 An example of this would be the ``libdwarf`` build, which has one
 dependency: ``libelf``.  Every call to ``cc`` in the ``libdwarf``
 build will have ``-I$LIBELF_PREFIX/include``,
-``-L$LIBELF_PREFIX/lib``, and ``-Wl,-rpath=$LIBELF_PREFIX/lib``
+``-L$LIBELF_PREFIX/lib``, and ``-Wl,-rpath,$LIBELF_PREFIX/lib``
 inserted on the command line.  This is done transparently to the
 project's build system, which will just think it's using a system
 where ``libelf`` is readily available.  Because of this, you **do
@@ -2108,6 +2137,15 @@ Filtering functions
 
    Examples:
 
+   #. Filtering a Makefile to force it to use Spack's compiler wrappers:
+
+      .. code-block:: python
+
+         filter_file(r'^CC\s*=.*', spack_cc, 'Makefile')
+         filter_file(r'^CXX\s*=.*', spack_cxx, 'Makefile')
+         filter_file(r'^F77\s*=.*', spack_f77, 'Makefile')
+         filter_file(r'^FC\s*=.*', spack_fc, 'Makefile')
+
    #. Replacing ``#!/usr/bin/perl`` with ``#!/usr/bin/env perl`` in ``bib2xhtml``:
 
       .. code-block:: python
lib/spack/env/cc  (vendored, 50 lines changed)
@@ -90,15 +90,15 @@ case "$command" in
         command="$SPACK_CC"
         language="C"
         ;;
-    c++|CC|g++|clang++|icpc|pgCC|xlc++)
+    c++|CC|g++|clang++|icpc|pgc++|xlc++)
         command="$SPACK_CXX"
         language="C++"
         ;;
-    f90|fc|f95|gfortran|ifort|pgf90|xlf90|nagfor)
+    f90|fc|f95|gfortran|ifort|pgfortran|xlf90|nagfor)
         command="$SPACK_FC"
         language="Fortran 90"
         ;;
-    f77|gfortran|ifort|pgf77|xlf|nagfor)
+    f77|gfortran|ifort|pgfortran|xlf|nagfor)
         command="$SPACK_F77"
         language="Fortran 77"
         ;;
@@ -138,7 +138,7 @@ if [ -z "$mode" ]; then
     done
 fi
 
-# Dump the version and exist if we're in testing mode.
+# Dump the version and exit if we're in testing mode.
 if [ "$SPACK_TEST_COMMAND" = "dump-mode" ]; then
     echo "$mode"
     exit
@@ -187,32 +187,44 @@ while [ -n "$1" ]; do
             ;;
         -Wl,*)
             arg="${1#-Wl,}"
             if [ -z "$arg" ]; then shift; arg="$1"; fi
-            if [[ "$arg" = -rpath=* ]]; then
-                rpaths+=("${arg#-rpath=}")
-            elif [[ "$arg" = -rpath ]]; then
+            # TODO: Handle multiple -Wl, continuations of -Wl,-rpath
+            if [[ $arg == -rpath=* ]]; then
+                arg="${arg#-rpath=}"
+                for rpath in ${arg//,/ }; do
+                    rpaths+=("$rpath")
+                done
+            elif [[ $arg == -rpath,* ]]; then
+                arg="${arg#-rpath,}"
+                for rpath in ${arg//,/ }; do
+                    rpaths+=("$rpath")
+                done
+            elif [[ $arg == -rpath ]]; then
                 shift; arg="$1"
-                if [[ "$arg" != -Wl,* ]]; then
+                if [[ $arg != '-Wl,'* ]]; then
                     die "-Wl,-rpath was not followed by -Wl,*"
                 fi
-                rpaths+=("${arg#-Wl,}")
+                arg="${arg#-Wl,}"
+                for rpath in ${arg//,/ }; do
+                    rpaths+=("$rpath")
+                done
             else
                 other_args+=("-Wl,$arg")
             fi
             ;;
-        -Xlinker,*)
-            arg="${1#-Xlinker,}"
-            if [ -z "$arg" ]; then shift; arg="$1"; fi
-            if [[ "$arg" = -rpath=* ]]; then
+        -Xlinker)
+            shift; arg="$1";
+            if [[ $arg = -rpath=* ]]; then
                 rpaths+=("${arg#-rpath=}")
-            elif [[ "$arg" = -rpath ]]; then
+            elif [[ $arg = -rpath ]]; then
                 shift; arg="$1"
-                if [[ "$arg" != -Xlinker,* ]]; then
-                    die "-Xlinker,-rpath was not followed by -Xlinker,*"
+                if [[ $arg != -Xlinker ]]; then
+                    die "-Xlinker -rpath was not followed by -Xlinker <arg>"
                 fi
-                rpaths+=("${arg#-Xlinker,}")
+                shift; arg="$1"
+                rpaths+=("$arg")
             else
-                other_args+=("-Xlinker,$arg")
+                other_args+=("-Xlinker")
+                other_args+=("$arg")
             fi
             ;;
         *)
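The rewritten wrapper above splits comma-joined `-Wl,-rpath` values into individual rpaths. The core of that splitting can be sketched in Python; `extract_rpaths` is a hypothetical helper for illustration only, and it omits the two-token `-Wl,-rpath` / `-Wl,<path>` continuation case that the bash version also handles:

```python
def extract_rpaths(args):
    """Collect rpaths from -Wl,-rpath=... and -Wl,-rpath,... forms,
    splitting comma-joined values as the wrapper's bash loop does."""
    rpaths, other = [], []
    for a in args:
        if a.startswith("-Wl,"):
            body = a[len("-Wl,"):]
            if body.startswith("-rpath="):
                rpaths.extend(body[len("-rpath="):].split(","))
            elif body.startswith("-rpath,"):
                rpaths.extend(body[len("-rpath,"):].split(","))
            else:
                other.append(a)
        else:
            other.append(a)
    return rpaths, other

print(extract_rpaths(["-O2", "-Wl,-rpath,/a/lib,/b/lib", "-Wl,-rpath=/c/lib"])[0])
# -> ['/a/lib', '/b/lib', '/c/lib']
```

The key behavior the commit adds is the `split(",")` step: `-Wl,-rpath,/a/lib,/b/lib` carries two paths in one argument, which the old code would have treated as a single (bogus) path.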
lib/spack/env/pgi/case-insensitive/pgCC  (vendored, deleted)
@@ -1 +0,0 @@
-../../cc
@@ -25,7 +25,9 @@
 __all__ = ['set_install_permissions', 'install', 'install_tree', 'traverse_tree',
            'expand_user', 'working_dir', 'touch', 'touchp', 'mkdirp',
            'force_remove', 'join_path', 'ancestor', 'can_access', 'filter_file',
-           'FileFilter', 'change_sed_delimiter', 'is_exe', 'force_symlink']
+           'FileFilter', 'change_sed_delimiter', 'is_exe', 'force_symlink',
+           'set_executable', 'copy_mode', 'unset_executable_mode',
+           'remove_dead_links', 'remove_linked_tree']
 
 import os
 import sys
@@ -152,15 +154,28 @@ def set_install_permissions(path):
 def copy_mode(src, dest):
     src_mode = os.stat(src).st_mode
     dest_mode = os.stat(dest).st_mode
-    if src_mode | stat.S_IXUSR: dest_mode |= stat.S_IXUSR
-    if src_mode | stat.S_IXGRP: dest_mode |= stat.S_IXGRP
-    if src_mode | stat.S_IXOTH: dest_mode |= stat.S_IXOTH
+    if src_mode & stat.S_IXUSR: dest_mode |= stat.S_IXUSR
+    if src_mode & stat.S_IXGRP: dest_mode |= stat.S_IXGRP
+    if src_mode & stat.S_IXOTH: dest_mode |= stat.S_IXOTH
     os.chmod(dest, dest_mode)
 
 
+def unset_executable_mode(path):
+    mode = os.stat(path).st_mode
+    mode &= ~stat.S_IXUSR
+    mode &= ~stat.S_IXGRP
+    mode &= ~stat.S_IXOTH
+    os.chmod(path, mode)
+
+
 def install(src, dest):
     """Manually install a file to a particular location."""
     tty.debug("Installing %s to %s" % (src, dest))
 
     # Expand dest to its eventual full path if it is a directory.
     if os.path.isdir(dest):
         dest = join_path(dest, os.path.basename(src))
 
     shutil.copy(src, dest)
     set_install_permissions(dest)
     copy_mode(src, dest)
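The `copy_mode` fix above replaces bitwise OR with AND in the mode-bit tests: `src_mode | stat.S_IXUSR` is always nonzero, so the old code treated every source file as executable. A quick demonstration of why only `&` works as a bit test:

```python
import stat

src_mode = 0o644  # rw-r--r--: no execute bits set

# Old (buggy) test: OR merges the bit in, so the result is always
# nonzero and the condition is always true.
assert (src_mode | stat.S_IXUSR) != 0

# Fixed test: AND isolates the bit, correctly reporting it unset here.
assert (src_mode & stat.S_IXUSR) == 0

src_mode = 0o755  # rwxr-xr-x: execute bits set
assert (src_mode & stat.S_IXUSR) != 0
print("bit tests behave as expected")
```

With the old code, `copy_mode` would mark non-executable files executable on every `install`, which is exactly what the one-character change corrects.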
@@ -235,7 +250,7 @@ def touchp(path):
 def force_symlink(src, dest):
     try:
         os.symlink(src, dest)
-    except OSError, e:
+    except OSError as e:
         os.remove(dest)
         os.symlink(src, dest)
@@ -339,3 +354,41 @@ def traverse_tree(source_root, dest_root, rel_path='', **kwargs):
 
         if order == 'post':
             yield (source_path, dest_path)
+
+
+def set_executable(path):
+    st = os.stat(path)
+    os.chmod(path, st.st_mode | stat.S_IEXEC)
+
+
+def remove_dead_links(root):
+    """
+    Removes any dead link that is present in root
+
+    Args:
+        root: path where to search for dead links
+    """
+    for file in os.listdir(root):
+        path = join_path(root, file)
+        if os.path.islink(path):
+            real_path = os.path.realpath(path)
+            if not os.path.exists(real_path):
+                os.unlink(path)
+
+
+def remove_linked_tree(path):
+    """
+    Removes a directory and its contents.  If the directory is a
+    symlink, follows the link and removes the real directory before
+    removing the link.
+
+    Args:
+        path: directory to be removed
+    """
+    if os.path.exists(path):
+        if os.path.islink(path):
+            shutil.rmtree(os.path.realpath(path), True)
+            os.unlink(path)
+        else:
+            shutil.rmtree(path, True)
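The new `remove_dead_links` helper can be exercised standalone. This sketch re-implements it with only the stdlib (assuming Spack's `join_path` is equivalent to `os.path.join`) and shows a dangling symlink being swept while a live one survives:

```python
import os
import shutil
import tempfile

def remove_dead_links(root):
    """Delete symlinks under root whose targets no longer exist."""
    for name in os.listdir(root):
        path = os.path.join(root, name)
        if os.path.islink(path):
            if not os.path.exists(os.path.realpath(path)):
                os.unlink(path)

root = tempfile.mkdtemp()
target = os.path.join(root, "target.txt")
open(target, "w").close()
os.symlink(target, os.path.join(root, "live"))                   # valid link
os.symlink(os.path.join(root, "gone"), os.path.join(root, "dead"))  # dangling link

remove_dead_links(root)
print(sorted(os.listdir(root)))  # -> ['live', 'target.txt']
shutil.rmtree(root)
```

Note that `os.path.exists` already follows symlinks and returns `False` for a dangling one; the explicit `realpath` step mirrors the shape of the code in the commit.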
@@ -177,8 +177,6 @@ def set_module_variables_for_package(pkg, m):
     """Populate the module scope of install() with some useful functions.
        This makes things easier for package writers.
     """
-    m = pkg.module
-
     # number of jobs spack will to build with.
     jobs = multiprocessing.cpu_count()
     if not pkg.parallel:
@@ -214,6 +212,13 @@ def set_module_variables_for_package(pkg, m):
     m.std_cmake_args.append('-DCMAKE_INSTALL_RPATH_USE_LINK_PATH=FALSE')
     m.std_cmake_args.append('-DCMAKE_INSTALL_RPATH=%s' % ":".join(get_rpaths(pkg)))
 
+    # Put spack compiler paths in module scope.
+    link_dir = spack.build_env_path
+    m.spack_cc  = join_path(link_dir, pkg.compiler.link_paths['cc'])
+    m.spack_cxx = join_path(link_dir, pkg.compiler.link_paths['cxx'])
+    m.spack_f77 = join_path(link_dir, pkg.compiler.link_paths['f77'])
+    m.spack_f90 = join_path(link_dir, pkg.compiler.link_paths['fc'])
+
     # Emulate some shell commands for convenience
     m.pwd = os.getcwd
     m.cd = os.chdir
@@ -22,23 +22,18 @@
 # along with this program; if not, write to the Free Software Foundation,
 # Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
 ##############################################################################
 import os
 import re
 import argparse
 import hashlib
-from pprint import pprint
-from subprocess import CalledProcessError
 
 import llnl.util.tty as tty
 from llnl.util.tty.colify import colify
 
 import spack
 import spack.cmd
 import spack.util.crypto
 from spack.stage import Stage, FailedDownloadError
 from spack.version import *
 
-description ="Checksum available versions of a package."
+description = "Checksum available versions of a package."
 
 
 def setup_parser(subparser):
     subparser.add_argument(
@@ -58,25 +53,23 @@ def get_checksums(versions, urls, **kwargs):
 
     tty.msg("Downloading...")
     hashes = []
-    for i, (url, version) in enumerate(zip(urls, versions)):
-        stage = Stage(url)
+    i = 0
+    for url, version in zip(urls, versions):
         try:
-            stage.fetch()
-            if i == 0 and first_stage_function:
-                first_stage_function(stage)
+            with Stage(url, keep=keep_stage) as stage:
+                stage.fetch()
+                if i == 0 and first_stage_function:
+                    first_stage_function(stage)
 
-            hashes.append(
-                spack.util.crypto.checksum(hashlib.md5, stage.archive_file))
-        except FailedDownloadError, e:
+                hashes.append((version,
+                               spack.util.crypto.checksum(hashlib.md5, stage.archive_file)))
+            i += 1
+        except FailedDownloadError as e:
             tty.msg("Failed to fetch %s" % url)
-            continue
-
-        finally:
-            if not keep_stage:
-                stage.destroy()
-
-    return zip(versions, hashes)
+        except Exception as e:
+            tty.msg('Something failed on %s, skipping.\n    (%s)' % (url, e))
 
     return hashes
 
 
 def checksum(parser, args):
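The refactor above moves stage cleanup out of a manual `try`/`finally` and into `with Stage(url, keep=keep_stage) as stage:`, so teardown happens automatically even when an exception escapes. The pattern can be sketched with a toy context manager; the names and `keep` semantics here are an illustrative stand-in, not Spack's actual `Stage` implementation:

```python
class Stage(object):
    """Toy stand-in for spack.stage.Stage showing the keep semantics."""
    def __init__(self, url, keep=False):
        self.url = url
        self.keep = keep
        self.destroyed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # Cleanup now runs automatically, even on exceptions,
        # unless the caller asked to keep the stage around.
        if not self.keep:
            self.destroyed = True
        return False  # never swallow exceptions

with Stage("http://example.com/foo.tar.gz") as s:
    pass
print(s.destroyed)  # -> True
```

This is why the explicit `finally: stage.destroy()` block could be deleted: the `with` statement guarantees the same cleanup path.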
@@ -95,13 +88,13 @@ def checksum(parser, args):
     else:
         versions = pkg.fetch_remote_versions()
         if not versions:
-            tty.die("Could not fetch any versions for %s." % pkg.name)
+            tty.die("Could not fetch any versions for %s" % pkg.name)
 
     sorted_versions = sorted(versions, reverse=True)
 
-    tty.msg("Found %s versions of %s." % (len(versions), pkg.name),
+    tty.msg("Found %s versions of %s" % (len(versions), pkg.name),
             *spack.cmd.elide_list(
                 ["%-10s%s" % (v, versions[v]) for v in sorted_versions]))
     print
     archives_to_fetch = tty.get_number(
         "How many would you like to checksum?", default=5, abort='q')
@@ -116,7 +109,7 @@ def checksum(parser, args):
         keep_stage=args.keep_stage)
 
     if not version_hashes:
-        tty.die("Could not fetch any versions for %s." % pkg.name)
+        tty.die("Could not fetch any versions for %s" % pkg.name)
 
     version_lines = ["  version('%s', '%s')" % (v, h) for v, h in version_hashes]
     tty.msg("Checksummed new versions of %s:" % pkg.name, *version_lines)
@@ -96,7 +96,7 @@ def compiler_remove(args):
     compilers = spack.compilers.compilers_for_spec(cspec, scope=args.scope)
 
     if not compilers:
-        tty.die("No compilers match spec %s." % cspec)
+        tty.die("No compilers match spec %s" % cspec)
     elif not args.all and len(compilers) > 1:
         tty.error("Multiple compilers match spec %s.  Choose one:" % cspec)
         colify(reversed(sorted([c.spec for c in compilers])), indent=4)
@@ -105,7 +105,7 @@ def compiler_remove(args):
 
     for compiler in compilers:
         spack.compilers.remove_compiler_from_config(compiler.spec, scope=args.scope)
-        tty.msg("Removed compiler %s." % compiler.spec)
+        tty.msg("Removed compiler %s" % compiler.spec)
 
 
 def compiler_info(args):
@@ -114,7 +114,7 @@ def compiler_info(args):
     compilers = spack.compilers.compilers_for_spec(cspec, scope=args.scope)
 
     if not compilers:
-        tty.error("No compilers match spec %s." % cspec)
+        tty.error("No compilers match spec %s" % cspec)
     else:
         for c in compilers:
             print str(c.spec) + ":"
@@ -156,7 +156,7 @@ def guess_name_and_version(url, args):
     # Try to deduce name and version of the new package from the URL
     version = spack.url.parse_version(url)
     if not version:
-        tty.die("Couldn't guess a version string from %s." % url)
+        tty.die("Couldn't guess a version string from %s" % url)
 
     # Try to guess a name.  If it doesn't work, allow the user to override.
     if args.alternate_name:
@@ -189,7 +189,7 @@ def find_repository(spec, args):
         try:
             repo = Repo(repo_path)
             if spec.namespace and spec.namespace != repo.namespace:
-                tty.die("Can't create package with namespace %s in repo with namespace %s."
+                tty.die("Can't create package with namespace %s in repo with namespace %s"
                         % (spec.namespace, repo.namespace))
         except RepoError as e:
             tty.die(str(e))
@@ -252,7 +252,7 @@ def create(parser, args):
     name = spec.name  # factors out namespace, if any
     repo = find_repository(spec, args)
 
-    tty.msg("This looks like a URL for %s version %s." % (name, version))
+    tty.msg("This looks like a URL for %s version %s" % (name, version))
     tty.msg("Creating template for package %s" % name)
 
     # Fetch tarballs (prompting user if necessary)
@@ -266,7 +266,7 @@ def create(parser, args):
         keep_stage=args.keep_stage)
 
     if not ver_hash_tuples:
-        tty.die("Could not fetch any tarballs for %s." % name)
+        tty.die("Could not fetch any tarballs for %s" % name)
 
     # Prepend 'py-' to python package names, by convention.
     if guesser.build_system == 'python':
@@ -291,4 +291,4 @@ def create(parser, args):
 
     # If everything checks out, go ahead and edit.
     spack.editor(pkg_path)
-    tty.msg("Created package %s." % pkg_path)
+    tty.msg("Created package %s" % pkg_path)
@@ -45,6 +45,9 @@ def setup_parser(subparser):
     subparser.add_argument(
         '--skip-patch', action='store_true',
         help="Skip patching for the DIY build.")
+    subparser.add_argument(
+        '-q', '--quiet', action='store_true', dest='quiet',
+        help="Do not display verbose build output while installing.")
     subparser.add_argument(
         'spec', nargs=argparse.REMAINDER,
         help="specs to use for install.  Must contain package AND verison.")
@@ -92,4 +95,5 @@ def diy(self, args):
         package.do_install(
             keep_prefix=args.keep_prefix,
             ignore_deps=args.ignore_deps,
+            verbose=not args.quiet,
             keep_stage=True)   # don't remove source dir for DIY.
@@ -22,51 +22,51 @@
 # along with this program; if not, write to the Free Software Foundation,
 # Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
 ##############################################################################
-import os
 import argparse
 import hashlib
 
-from contextlib import contextmanager
+import os
 
 import llnl.util.tty as tty
 from llnl.util.filesystem import *
 
 import spack.util.crypto
 from spack.stage import Stage, FailedDownloadError
 
 description = "Calculate md5 checksums for files/urls."
 
-@contextmanager
-def stager(url):
-    _cwd = os.getcwd()
-    _stager = Stage(url)
-    try:
-        _stager.fetch()
-        yield _stager
-    except FailedDownloadError:
-        tty.msg("Failed to fetch %s" % url)
-    finally:
-        _stager.destroy()
-        os.chdir(_cwd)  # the Stage class changes the current working dir so it has to be restored
-
 def setup_parser(subparser):
     setup_parser.parser = subparser
     subparser.add_argument('files', nargs=argparse.REMAINDER,
                            help="Files to checksum.")
 
 
+def compute_md5_checksum(url):
+    if not os.path.isfile(url):
+        with Stage(url) as stage:
+            stage.fetch()
+            value = spack.util.crypto.checksum(hashlib.md5, stage.archive_file)
+    else:
+        value = spack.util.crypto.checksum(hashlib.md5, url)
+    return value
+
+
 def md5(parser, args):
     if not args.files:
         setup_parser.parser.print_help()
         return 1
 
-    for f in args.files:
-        if not os.path.isfile(f):
-            with stager(f) as stage:
-                checksum = spack.util.crypto.checksum(hashlib.md5, stage.archive_file)
-                print "%s %s" % (checksum, f)
-        else:
-            if not can_access(f):
-                tty.die("Cannot read file: %s" % f)
-
-            checksum = spack.util.crypto.checksum(hashlib.md5, f)
-            print "%s %s" % (checksum, f)
+    results = []
+    for url in args.files:
+        try:
+            checksum = compute_md5_checksum(url)
+            results.append((checksum, url))
+        except FailedDownloadError as e:
+            tty.warn("Failed to fetch %s" % url)
+            tty.warn("%s" % e)
+        except IOError as e:
+            tty.warn("Error when reading %s" % url)
+            tty.warn("%s" % e)
+
+    # Dump the MD5s at last without interleaving them with downloads
+    tty.msg("%d MD5 checksums:" % len(results))
+    for checksum, url in results:
+        print "%s %s" % (checksum, url)
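For a local file, the new `compute_md5_checksum` path reduces to hashing the file's bytes. A self-contained sketch of that case, using `hashlib` directly in place of `spack.util.crypto.checksum` (the streaming helper name here is illustrative, not Spack's API):

```python
import hashlib
import os
import tempfile

def md5_of_file(path, block_size=65536):
    """Stream the file through MD5 rather than reading it whole."""
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(block_size), b""):
            md5.update(block)
    return md5.hexdigest()

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello\n")
print(md5_of_file(f.name))  # -> b1946ac92492d2347c6235b4d2611184
os.unlink(f.name)
```

Reading in fixed-size blocks keeps memory use constant for arbitrarily large tarballs, which is the same reason checksum helpers typically avoid `f.read()` in one shot.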
@@ -126,7 +126,7 @@ def mirror_remove(args):
 
     old_value = mirrors.pop(name)
     spack.config.update_config('mirrors', mirrors, scope=args.scope)
-    tty.msg("Removed mirror %s with url %s." % (name, old_value))
+    tty.msg("Removed mirror %s with url %s" % (name, old_value))
 
 
 def mirror_list(args):
@@ -203,7 +203,7 @@ def mirror_create(args):
 
     verb = "updated" if existed else "created"
     tty.msg(
-        "Successfully %s mirror in %s." % (verb, directory),
+        "Successfully %s mirror in %s" % (verb, directory),
         "Archive stats:",
         "  %-4d already present" % p,
         "  %-4d added" % m,
@@ -58,7 +58,7 @@ def module_find(mtype, spec_array):
        should type to use that package's module.
     """
     if mtype not in module_types:
-        tty.die("Invalid module type: '%s'.  Options are %s." % (mtype, comma_or(module_types)))
+        tty.die("Invalid module type: '%s'.  Options are %s" % (mtype, comma_or(module_types)))
 
     specs = spack.cmd.parse_specs(spec_array)
     if len(specs) > 1:
@@ -78,7 +78,7 @@ def module_find(mtype, spec_array):
     mt = module_types[mtype]
     mod = mt(specs[0])
     if not os.path.isfile(mod.file_name):
-        tty.die("No %s module is installed for %s." % (mtype, spec))
+        tty.die("No %s module is installed for %s" % (mtype, spec))
 
     print mod.use_name
 
@@ -94,7 +94,7 @@ def module_refresh():
         shutil.rmtree(cls.path, ignore_errors=False)
         mkdirp(cls.path)
         for spec in specs:
-            tty.debug("  Writing file for %s." % spec)
+            tty.debug("  Writing file for %s" % spec)
             cls(spec).write()
@ -6,7 +6,7 @@
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://llnl.github.io/spack
# For details, see https://software.llnl.gov/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
@ -74,51 +74,7 @@ def setup_parser(subparser):

def repo_create(args):
    """Create a new package repository."""
    root = canonicalize_path(args.directory)
    namespace = args.namespace

    if not args.namespace:
        namespace = os.path.basename(root)

    if not re.match(r'\w[\.\w-]*', namespace):
        tty.die("'%s' is not a valid namespace." % namespace)

    existed = False
    if os.path.exists(root):
        if os.path.isfile(root):
            tty.die('File %s already exists and is not a directory' % root)
        elif os.path.isdir(root):
            if not os.access(root, os.R_OK | os.W_OK):
                tty.die('Cannot create new repo in %s: cannot access directory.' % root)
            if os.listdir(root):
                tty.die('Cannot create new repo in %s: directory is not empty.' % root)
        existed = True

    full_path = os.path.realpath(root)
    parent = os.path.dirname(full_path)
    if not os.access(parent, os.R_OK | os.W_OK):
        tty.die("Cannot create repository in %s: can't access parent!" % root)

    try:
        config_path = os.path.join(root, repo_config_name)
        packages_path = os.path.join(root, packages_dir_name)

        mkdirp(packages_path)
        with open(config_path, 'w') as config:
            config.write("repo:\n")
            config.write("  namespace: '%s'\n" % namespace)

    except (IOError, OSError) as e:
        tty.die('Failed to create new repository in %s.' % root,
                "Caused by %s: %s" % (type(e), e))

        # try to clean up.
        if existed:
            shutil.rmtree(config_path, ignore_errors=True)
            shutil.rmtree(packages_path, ignore_errors=True)
        else:
            shutil.rmtree(root, ignore_errors=True)

    full_path, namespace = create_repo(args.directory, args.namespace)
    tty.msg("Created repo with namespace '%s'." % namespace)
    tty.msg("To register it with spack, run this command:",
            'spack repo add %s' % full_path)
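The namespace validation in `repo_create` uses `re.match(r'\w[\.\w-]*', namespace)`. A quick sketch of how that pattern behaves; note that `re.match` anchors only at the start of the string, so this check accepts any name that merely *begins* with a valid namespace:

```python
import re

NAMESPACE_RE = re.compile(r'\w[\.\w-]*')

def looks_like_namespace(name):
    # A match means the name starts with a word character followed
    # by word characters, dots, or dashes. re.match does not anchor
    # at the end, so trailing junk is not rejected by this test alone.
    return NAMESPACE_RE.match(name) is not None
```

For example, `looks_like_namespace('builtin.mock')` and `looks_like_namespace('my-repo')` hold, while `looks_like_namespace('.hidden')` does not.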
@ -133,11 +89,11 @@ def repo_add(args):

    # check if the path exists
    if not os.path.exists(canon_path):
        tty.die("No such file or directory: '%s'." % path)
        tty.die("No such file or directory: %s" % path)

    # Make sure the path is a directory.
    if not os.path.isdir(canon_path):
        tty.die("Not a Spack repository: '%s'." % path)
        tty.die("Not a Spack repository: %s" % path)

    # Make sure it's actually a spack repository by constructing it.
    repo = Repo(canon_path)

@ -147,7 +103,7 @@ def repo_add(args):
    if not repos: repos = []

    if repo.root in repos or path in repos:
        tty.die("Repository is already registered with Spack: '%s'" % path)
        tty.die("Repository is already registered with Spack: %s" % path)

    repos.insert(0, canon_path)
    spack.config.update_config('repos', repos, args.scope)

@ -166,7 +122,7 @@ def repo_remove(args):
        if canon_path == repo_canon_path:
            repos.remove(repo_path)
            spack.config.update_config('repos', repos, args.scope)
            tty.msg("Removed repository '%s'." % repo_path)
            tty.msg("Removed repository %s" % repo_path)
            return

    # If it is a namespace, remove corresponding repo

@ -176,13 +132,13 @@ def repo_remove(args):
            if repo.namespace == path_or_namespace:
                repos.remove(path)
                spack.config.update_config('repos', repos, args.scope)
                tty.msg("Removed repository '%s' with namespace %s."
                tty.msg("Removed repository %s with namespace '%s'."
                        % (repo.root, repo.namespace))
                return
        except RepoError as e:
            continue

    tty.die("No repository with path or namespace: '%s'"
    tty.die("No repository with path or namespace: %s"
            % path_or_namespace)
@ -256,12 +256,12 @@ def find(cls, *path):

    def __repr__(self):
        """Return a string represntation of the compiler toolchain."""
        """Return a string representation of the compiler toolchain."""
        return self.__str__()

    def __str__(self):
        """Return a string represntation of the compiler toolchain."""
        """Return a string representation of the compiler toolchain."""
        return "%s(%s)" % (
            self.name, '\n     '.join((str(s) for s in (self.cc, self.cxx, self.f77, self.fc))))
@ -29,28 +29,28 @@ class Pgi(Compiler):
    cc_names = ['pgcc']

    # Subclasses use possible names of C++ compiler
    cxx_names = ['pgCC']
    cxx_names = ['pgc++', 'pgCC']

    # Subclasses use possible names of Fortran 77 compiler
    f77_names = ['pgf77']
    f77_names = ['pgfortran', 'pgf77']

    # Subclasses use possible names of Fortran 90 compiler
    fc_names = ['pgf95', 'pgf90']
    fc_names = ['pgfortran', 'pgf95', 'pgf90']

    # Named wrapper links within spack.build_env_path
    link_paths = { 'cc'  : 'pgi/pgcc',
                   'cxx' : 'pgi/case-insensitive/pgCC',
                   'f77' : 'pgi/pgf77',
                   'fc'  : 'pgi/pgf90' }
                   'cxx' : 'pgi/pgc++',
                   'f77' : 'pgi/pgfortran',
                   'fc'  : 'pgi/pgfortran' }

    @classmethod
    def default_version(cls, comp):
        """The '-V' option works for all the PGI compilers.
        Output looks like this::

            pgf95 10.2-0 64-bit target on x86-64 Linux -tp nehalem-64
            Copyright 1989-2000, The Portland Group, Inc.  All Rights Reserved.
            Copyright 2000-2010, STMicroelectronics, Inc.  All Rights Reserved.
            pgcc 15.10-0 64-bit target on x86-64 Linux -tp sandybridge
            The Portland Group - PGI Compilers and Tools
            Copyright (c) 2015, NVIDIA CORPORATION.  All rights reserved.
        """
        return get_compiler_version(
            comp, '-V', r'pg[^ ]* ([^ ]+) \d\d\d?-bit target')
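The version regex at the end of that hunk can be exercised directly against the sample banners quoted in the docstring. A small sketch (the function name is illustrative; Spack wraps this in `get_compiler_version`):

```python
import re

# Same pattern as the PGI default_version hunk above.
VERSION_RE = re.compile(r'pg[^ ]* ([^ ]+) \d\d\d?-bit target')

def parse_pgi_version(output):
    # Pull the version token out of `pgcc -V` / `pgf95 -V` banners.
    match = VERSION_RE.search(output)
    return match.group(1) if match else None
```

The `\d\d\d?` part accepts both 32- and 64-bit target banners, and the captured group is the dash-separated PGI version token.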
@ -205,7 +205,7 @@
def validate_section_name(section):
    """Raise a ValueError if the section is not a valid section."""
    if section not in section_schemas:
        raise ValueError("Invalid config section: '%s'. Options are %s."
        raise ValueError("Invalid config section: '%s'. Options are %s"
                         % (section, section_schemas))

@ -335,7 +335,7 @@ def validate_scope(scope):
        return config_scopes[scope]

    else:
        raise ValueError("Invalid config scope: '%s'. Must be one of %s."
        raise ValueError("Invalid config scope: '%s'. Must be one of %s"
                         % (scope, config_scopes.keys()))

@ -350,7 +350,7 @@ def _read_config_file(filename, schema):
            "Invalid configuration. %s exists but is not a file." % filename)

    elif not os.access(filename, os.R_OK):
        raise ConfigFileError("Config file is not readable: %s." % filename)
        raise ConfigFileError("Config file is not readable: %s" % filename)

    try:
        tty.debug("Reading config file %s" % filename)
@ -330,7 +330,7 @@ def _check_ref_counts(self):
            found = rec.ref_count
            if not expected == found:
                raise AssertionError(
                    "Invalid ref_count: %s: %d (expected %d), in DB %s."
                    "Invalid ref_count: %s: %d (expected %d), in DB %s"
                    % (key, found, expected, self._index_path))
@ -125,7 +125,7 @@ def __init__(self, dicts=None):
            dicts = (dicts,)
        elif type(dicts) not in (list, tuple):
            raise TypeError(
                "dicts arg must be list, tuple, or string. Found %s."
                "dicts arg must be list, tuple, or string. Found %s"
                % type(dicts))

        self.dicts = dicts
@ -174,7 +174,11 @@ def version(pkg, ver, checksum=None, **kwargs):

def _depends_on(pkg, spec, when=None):
    if when is None:
    # If when is False do nothing
    if when is False:
        return
    # If when is None or True make sure the condition is always satisfied
    if when is None or when is True:
        when = pkg.name
    when_spec = parse_anonymous_spec(when, pkg.name)

@ -313,5 +317,5 @@ class CircularReferenceError(DirectiveError):
    def __init__(self, directive, package):
        super(CircularReferenceError, self).__init__(
            directive,
            "Package '%s' cannot pass itself to %s." % (package, directive))
            "Package '%s' cannot pass itself to %s" % (package, directive))
        self.package = package
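The `_depends_on` change above gives `when` three-valued semantics: `False` skips the directive entirely, while `None` and `True` make the dependency unconditional. That normalization can be factored into a tiny helper; a sketch with an illustrative name, not Spack's actual API:

```python
def normalize_when(when, default_condition):
    # False disables the directive entirely; None and True both mean
    # "always", expressed as the package's own (anonymous) condition;
    # anything else is an explicit spec condition string.
    if when is False:
        return None
    if when is None or when is True:
        return default_condition
    return when
```

Returning `None` for the disabled case lets the caller use a single `if condition is None: return` guard, mirroring the early `return` in the diff.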
@ -85,6 +85,16 @@ def create_install_directory(self, spec):
        raise NotImplementedError()

    def check_installed(self, spec):
        """Checks whether a spec is installed.

        Return the spec's prefix, if it is installed, None otherwise.

        Raise an exception if the install is inconsistent or corrupt.
        """
        raise NotImplementedError()

    def extension_map(self, spec):
        """Get a dict of currently installed extension packages for a spec.
@ -173,7 +183,9 @@ def __init__(self, root, **kwargs):

        self.spec_file_name = 'spec.yaml'
        self.extension_file_name = 'extensions.yaml'
        self.build_log_name = 'build.out'  # TODO: use config file.
        self.build_log_name = 'build.out'  # build log.
        self.build_env_name = 'build.env'  # build environment
        self.packages_dir = 'repos'  # archive of package.py files

        # Cache of already written/read extension maps.
        self._extension_maps = {}
@ -231,29 +243,49 @@ def build_log_path(self, spec):
                         self.build_log_name)

    def build_env_path(self, spec):
        return join_path(self.path_for_spec(spec), self.metadata_dir,
                         self.build_env_name)

    def build_packages_path(self, spec):
        return join_path(self.path_for_spec(spec), self.metadata_dir,
                         self.packages_dir)

    def create_install_directory(self, spec):
        _check_concrete(spec)

        prefix = self.check_installed(spec)
        if prefix:
            raise InstallDirectoryAlreadyExistsError(prefix)

        mkdirp(self.metadata_path(spec))
        self.write_spec(spec, self.spec_file_path(spec))

    def check_installed(self, spec):
        _check_concrete(spec)
        path = self.path_for_spec(spec)
        spec_file_path = self.spec_file_path(spec)

        if os.path.isdir(path):
            if not os.path.isfile(spec_file_path):
                raise InconsistentInstallDirectoryError(
                    'No spec file found at path %s' % spec_file_path)
        if not os.path.isdir(path):
            return None

            installed_spec = self.read_spec(spec_file_path)
            if installed_spec == self.spec:
                raise InstallDirectoryAlreadyExistsError(path)
        if not os.path.isfile(spec_file_path):
            raise InconsistentInstallDirectoryError(
                'Inconsistent state: install prefix exists but contains no spec.yaml:',
                "  " + path)

            if spec.dag_hash() == installed_spec.dag_hash():
                raise SpecHashCollisionError(installed_hash, spec_hash)
            else:
                raise InconsistentInstallDirectoryError(
                    'Spec file in %s does not match hash!' % spec_file_path)
        installed_spec = self.read_spec(spec_file_path)
        if installed_spec == spec:
            return path

        mkdirp(self.metadata_path(spec))
        self.write_spec(spec, spec_file_path)
        if spec.dag_hash() == installed_spec.dag_hash():
            raise SpecHashCollisionError(installed_hash, spec_hash)
        else:
            raise InconsistentInstallDirectoryError(
                'Spec file in %s does not match hash!' % spec_file_path)

    def all_specs(self):
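The rewritten `check_installed` above has three outcomes: `None` when nothing is installed, the prefix when the exact spec is installed, and an error for an inconsistent or mismatched install directory. A condensed, filesystem-backed sketch of that control flow (helper names, the `.spack` subdirectory, and the `read_spec` callback are illustrative):

```python
import os

def check_installed(prefix, spec, read_spec):
    """Sketch of the three outcomes: None when nothing is installed,
    the prefix when the matching spec is installed, and an error for
    an inconsistent or mismatched install directory."""
    spec_file = os.path.join(prefix, '.spack', 'spec.yaml')

    if not os.path.isdir(prefix):
        return None

    if not os.path.isfile(spec_file):
        raise RuntimeError(
            'Install prefix exists but contains no spec.yaml: ' + prefix)

    installed_spec = read_spec(spec_file)
    if installed_spec == spec:
        return prefix

    raise RuntimeError('Spec file in %s does not match!' % spec_file)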
@ -323,7 +355,7 @@ def _extension_map(self, spec):

            if not dag_hash in by_hash:
                raise InvalidExtensionSpecError(
                    "Spec %s not found in %s." % (dag_hash, prefix))
                    "Spec %s not found in %s" % (dag_hash, prefix))

            ext_spec = by_hash[dag_hash]
            if not prefix == ext_spec.prefix:
@ -387,8 +419,8 @@ def remove_extension(self, spec, ext_spec):

class DirectoryLayoutError(SpackError):
    """Superclass for directory layout errors."""
    def __init__(self, message):
        super(DirectoryLayoutError, self).__init__(message)
    def __init__(self, message, long_msg=None):
        super(DirectoryLayoutError, self).__init__(message, long_msg)

class SpecHashCollisionError(DirectoryLayoutError):

@ -410,8 +442,8 @@ def __init__(self, installed_spec, prefix, error):

class InconsistentInstallDirectoryError(DirectoryLayoutError):
    """Raised when a package seems to be installed to the wrong place."""
    def __init__(self, message):
        super(InconsistentInstallDirectoryError, self).__init__(message)
    def __init__(self, message, long_msg=None):
        super(InconsistentInstallDirectoryError, self).__init__(message, long_msg)

class InstallDirectoryAlreadyExistsError(DirectoryLayoutError):

@ -438,7 +470,7 @@ class ExtensionConflictError(DirectoryLayoutError):
    """Raised when an extension is added to a package that already has it."""
    def __init__(self, spec, ext_spec, conflict):
        super(ExtensionConflictError, self).__init__(
            "%s cannot be installed in %s because it conflicts with %s."% (
            "%s cannot be installed in %s because it conflicts with %s"% (
                ext_spec.short_spec, spec.short_spec, conflict.short_spec))
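The change above threads an optional `long_msg` through the error hierarchy instead of silently dropping it. A minimal sketch of the pattern; the base class here stands in for Spack's `SpackError`, whose real attribute names are an assumption:

```python
class SpackError(Exception):
    """Base error carrying a short message plus optional detail text."""
    def __init__(self, message, long_message=None):
        super(SpackError, self).__init__(message)
        self.message = message
        self.long_message = long_message

class DirectoryLayoutError(SpackError):
    """Superclass for directory layout errors."""
    def __init__(self, message, long_msg=None):
        # Forward the detail text to the base class rather than
        # truncating the signature to message-only.
        super(DirectoryLayoutError, self).__init__(message, long_msg)
```

Subclasses that add the same `long_msg=None` parameter, as in the diff, can then pass multi-line detail (e.g. the offending path) without changing every raise site.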
@ -82,7 +82,6 @@ class FetchStrategy(object):

    class __metaclass__(type):
        """This metaclass registers all fetch strategies in a list."""

        def __init__(cls, name, bases, dict):
            type.__init__(cls, name, bases, dict)
            if cls.enabled: all_strategies.append(cls)

@ -145,6 +144,8 @@ def __init__(self, url=None, digest=None, **kwargs):
        self.digest = kwargs.get('md5', None)
        if not self.digest: self.digest = digest

        self.expand_archive = kwargs.get('expand', True)

        if not self.url:
            raise ValueError("URLFetchStrategy requires a url for fetching.")
@ -153,7 +154,7 @@ def fetch(self):
        self.stage.chdir()

        if self.archive_file:
            tty.msg("Already downloaded %s." % self.archive_file)
            tty.msg("Already downloaded %s" % self.archive_file)
            return

        tty.msg("Trying to fetch from %s" % self.url)

@ -218,6 +219,10 @@ def archive_file(self):

    @_needs_stage
    def expand(self):
        if not self.expand_archive:
            tty.msg("Skipping expand step for %s" % self.archive_file)
            return

        tty.msg("Staging archive: %s" % self.archive_file)

        self.stage.chdir()
@ -275,8 +280,8 @@ def check(self):
        checker = crypto.Checker(self.digest)
        if not checker.check(self.archive_file):
            raise ChecksumError(
                "%s checksum failed for %s." % (checker.hash_name, self.archive_file),
                "Expected %s but got %s." % (self.digest, checker.sum))
                "%s checksum failed for %s" % (checker.hash_name, self.archive_file),
                "Expected %s but got %s" % (self.digest, checker.sum))

    @_needs_stage
    def reset(self):

@ -312,7 +317,7 @@ def __init__(self, name, *rev_types, **kwargs):
        # Ensure that there's only one of the rev_types
        if sum(k in kwargs for k in rev_types) > 1:
            raise FetchStrategyError(
                "Supply only one of %s to fetch with %s." % (
                "Supply only one of %s to fetch with %s" % (
                    comma_or(rev_types), name))

        # Set attributes for each rev type.
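The `sum(k in kwargs for k in rev_types) > 1` guard above counts how many mutually exclusive revision keywords were supplied to a VCS fetcher. A standalone sketch of the same check (the function name and return convention are illustrative):

```python
def exclusive_keyword(rev_types, kwargs):
    # Count how many of the mutually exclusive revision keywords
    # (e.g. 'tag', 'branch', 'commit') were actually passed.
    supplied = [k for k in rev_types if k in kwargs]
    if len(supplied) > 1:
        raise ValueError("Supply only one of %s" % ', '.join(rev_types))
    return supplied[0] if supplied else None
```

Collecting the supplied keys in a list (rather than only summing booleans) also lets the error message or caller name the offending keyword.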
@ -321,7 +326,7 @@ def __init__(self, name, *rev_types, **kwargs):

    @_needs_stage
    def check(self):
        tty.msg("No checksum needed when fetching with %s." % self.name)
        tty.msg("No checksum needed when fetching with %s" % self.name)

    @_needs_stage
    def expand(self):

@ -395,7 +400,7 @@ def fetch(self):
        self.stage.chdir()

        if self.stage.source_path:
            tty.msg("Already fetched %s." % self.stage.source_path)
            tty.msg("Already fetched %s" % self.stage.source_path)
            return

        args = []

@ -505,7 +510,7 @@ def fetch(self):
        self.stage.chdir()

        if self.stage.source_path:
            tty.msg("Already fetched %s." % self.stage.source_path)
            tty.msg("Already fetched %s" % self.stage.source_path)
            return

        tty.msg("Trying to check out svn repository: %s" % self.url)

@ -584,7 +589,7 @@ def fetch(self):
        self.stage.chdir()

        if self.stage.source_path:
            tty.msg("Already fetched %s." % self.stage.source_path)
            tty.msg("Already fetched %s" % self.stage.source_path)
            return

        args = []
77
lib/spack/spack/hooks/sbang.py
Normal file
@ -0,0 +1,77 @@
##############################################################################
# Copyright (c) 2013-2015, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os

from llnl.util.filesystem import *
import llnl.util.tty as tty

import spack
import spack.modules
# Character limit for shebang line.  Using Linux's 127 characters
# here, as it is the shortest I could find on a modern OS.
shebang_limit = 127

def shebang_too_long(path):
    """Detects whether a file has a shebang line that is too long."""
    with open(path, 'r') as script:
        bytes = script.read(2)
        if bytes != '#!':
            return False

        line = bytes + script.readline()
        return len(line) > shebang_limit
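The detection above reads just the two magic bytes first, so non-script binaries are rejected cheaply before the whole first line is measured. It can be exercised against throwaway files; a self-contained sketch with the same logic:

```python
import os
import tempfile

SHEBANG_LIMIT = 127  # Linux's limit, the shortest on a modern OS

def shebang_too_long(path):
    # Same logic as the hook: check the '#!' magic, then measure
    # the full first line only for actual scripts.
    with open(path, 'r') as script:
        magic = script.read(2)
        if magic != '#!':
            return False
        return len(magic + script.readline()) > SHEBANG_LIMIT
```

A file whose interpreter path pushes the first line past 127 characters is flagged; plain data files and short shebangs are not.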
def filter_shebang(path):
    """Adds a second shebang line, using sbang, at the beginning of a file."""
    backup = path + ".shebang.bak"
    os.rename(path, backup)

    with open(backup, 'r') as bak_file:
        original = bak_file.read()

    with open(path, 'w') as new_file:
        new_file.write('#!/bin/bash %s/bin/sbang\n' % spack.spack_root)
        new_file.write(original)

    copy_mode(backup, path)
    unset_executable_mode(backup)

    tty.warn("Patched overly long shebang in %s" % path)

def post_install(pkg):
    """This hook edits scripts so that they call /bin/bash
       $spack_prefix/bin/sbang instead of something longer than the
       shebang limit."""
    if not os.path.isdir(pkg.prefix.bin):
        return

    for file in os.listdir(pkg.prefix.bin):
        path = os.path.join(pkg.prefix.bin, file)
        if shebang_too_long(path):
            filter_shebang(path)
@ -51,13 +51,20 @@ def mirror_archive_filename(spec, fetcher):
        raise ValueError("mirror.path requires spec with concrete version.")

    if isinstance(fetcher, fs.URLFetchStrategy):
        # If we fetch this version with a URLFetchStrategy, use URL's archive type
        ext = url.downloaded_file_extension(fetcher.url)
        if fetcher.expand_archive:
            # If we fetch this version with a URLFetchStrategy, use URL's archive type
            ext = url.downloaded_file_extension(fetcher.url)
        else:
            # If the archive shouldn't be expanded, don't check for its extension.
            ext = None
    else:
        # Otherwise we'll make a .tar.gz ourselves
        ext = 'tar.gz'

    return "%s-%s.%s" % (spec.package.name, spec.version, ext)
    filename = "%s-%s" % (spec.package.name, spec.version)
    if ext:
        filename += ".%s" % ext
    return filename
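The new filename construction above only appends an extension when the fetcher could determine one (for unexpanded archives, `ext` is `None`). Distilled into a standalone helper:

```python
def mirror_archive_filename(name, version, ext):
    # Build '<name>-<version>' and only append '.<ext>' when an
    # archive extension is actually known; unexpanded downloads
    # keep their bare name.
    filename = "%s-%s" % (name, version)
    if ext:
        filename += ".%s" % ext
    return filename
```

This avoids the old behavior of formatting `None` straight into the name, which would have produced filenames like `pkg-1.0.None`.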
def mirror_archive_path(spec, fetcher):

@ -73,7 +80,7 @@ def get_matching_versions(specs, **kwargs):

        # Skip any package that has no known versions.
        if not pkg.versions:
            tty.msg("No safe (checksummed) versions for package %s." % pkg.name)
            tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
            continue

        num_versions = kwargs.get('num_versions', 0)

@ -110,7 +117,6 @@ def suggest_archive_basename(resource):
    return basename


def create(path, specs, **kwargs):
    """Create a directory to be used as a spack mirror, and fill it with
    package archives.
@ -158,17 +164,29 @@ def create(path, specs, **kwargs):
            "Cannot create directory '%s':" % mirror_root, str(e))

    # Things to keep track of while parsing specs.
    present = []
    mirrored = []
    error = []
    categories = {
        'present': [],
        'mirrored': [],
        'error': []
    }

    # Iterate through packages and download all the safe tarballs for each of them
    everything_already_exists = True
    for spec in version_specs:
        pkg = spec.package
        tty.msg("Adding package {pkg} to mirror".format(pkg=spec.format("$_$@")))
        try:
            for ii, stage in enumerate(pkg.stage):
        add_single_spec(spec, mirror_root, categories, **kwargs)

    return categories['present'], categories['mirrored'], categories['error']
def add_single_spec(spec, mirror_root, categories, **kwargs):
    tty.msg("Adding package {pkg} to mirror".format(pkg=spec.format("$_$@")))
    spec_exists_in_mirror = True
    try:
        with spec.package.stage:
            # fetcher = stage.fetcher
            # fetcher.fetch()
            # ...
            # fetcher.archive(archive_path)
            for ii, stage in enumerate(spec.package.stage):
                fetcher = stage.fetcher
                if ii == 0:
                    # create a subdirectory for the current package@version

@ -184,7 +202,7 @@ def create(path, specs, **kwargs):
                if os.path.exists(archive_path):
                    tty.msg("{name} : already added".format(name=name))
                else:
                    everything_already_exists = False
                    spec_exists_in_mirror = False
                    fetcher.fetch()
                    if not kwargs.get('no_checksum', False):
                        fetcher.check()

@ -195,20 +213,16 @@ def create(path, specs, **kwargs):
                    fetcher.archive(archive_path)
                    tty.msg("{name} : added".format(name=name))

        if everything_already_exists:
            present.append(spec)
        else:
            mirrored.append(spec)
    except Exception, e:
        if spack.debug:
            sys.excepthook(*sys.exc_info())
        else:
            tty.warn("Error while fetching %s." % spec.format('$_$@'), e.message)
        error.append(spec)
    finally:
        pkg.stage.destroy()

    return (present, mirrored, error)
        if spec_exists_in_mirror:
            categories['present'].append(spec)
        else:
            categories['mirrored'].append(spec)
    except Exception as e:
        if spack.debug:
            sys.excepthook(*sys.exc_info())
        else:
            tty.warn("Error while fetching %s" % spec.format('$_$@'), e.message)
        categories['error'].append(spec)


class MirrorError(spack.error.SpackError):
@ -33,6 +33,7 @@

  * /bin directories to be appended to PATH
  * /lib* directories for LD_LIBRARY_PATH
  * /include directories for CPATH
  * /man* and /share/man* directories for MANPATH
  * the package prefix for CMAKE_PREFIX_PATH

@ -121,6 +122,7 @@ def add_path(path_name, directory):
                ('LIBRARY_PATH', self.spec.prefix.lib64),
                ('LD_LIBRARY_PATH', self.spec.prefix.lib),
                ('LD_LIBRARY_PATH', self.spec.prefix.lib64),
                ('CPATH', self.spec.prefix.include),
                ('PKG_CONFIG_PATH', join_path(self.spec.prefix.lib, 'pkgconfig')),
                ('PKG_CONFIG_PATH', join_path(self.spec.prefix.lib64, 'pkgconfig'))]:
@ -138,7 +138,7 @@ class when(object):
    methods like install() that depend on the package's spec.
    For example:

    .. code-block::
    .. code-block:: python

       class SomePackage(Package):
           ...
@ -163,26 +163,28 @@ def install(self, prefix):
    if you only have part of the install that is platform specific, you
    could do this:

    class SomePackage(Package):
        ...
        # virtual dependence on MPI.
        # could resolve to mpich, mpich2, OpenMPI
        depends_on('mpi')
    .. code-block:: python

        def setup(self):
            # do nothing in the default case
            pass
       class SomePackage(Package):
           ...
           # virtual dependence on MPI.
           # could resolve to mpich, mpich2, OpenMPI
           depends_on('mpi')

        @when('^openmpi')
        def setup(self):
            # do something special when this is built with OpenMPI for
            # its MPI implementations.
           def setup(self):
               # do nothing in the default case
               pass

           @when('^openmpi')
           def setup(self):
               # do something special when this is built with OpenMPI for
               # its MPI implementations.

        def install(self, prefix):
            # Do common install stuff
            self.setup()
            # Do more common install stuff
           def install(self, prefix):
               # Do common install stuff
               self.setup()
               # Do more common install stuff

    There must be one (and only one) @when clause that matches the
    package's spec. If there is more than one, or if none match,

@ -193,10 +195,11 @@ def install(self, prefix):
    platform-specific versions. There's not much we can do to get
    around this because of the way decorators work.
    """
class when(object):
    def __init__(self, spec):
        pkg = get_calling_module_name()
        self.spec = parse_anonymous_spec(spec, pkg)
        if spec is True:
            spec = pkg
        self.spec = parse_anonymous_spec(spec, pkg) if spec is not False else None

    def __call__(self, method):
        # Get the first definition of the method in the calling scope

@ -207,7 +210,9 @@ def __call__(self, method):
        if not type(original_method) == SpecMultiMethod:
            original_method = SpecMultiMethod(original_method)

        original_method.register(self.spec, method)
        if self.spec is not None:
            original_method.register(self.spec, method)

        return original_method
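The `@when` change above makes `when(True)` mean "always applies" and `when(False)` mean "never registered at all" (the decorator becomes a no-op). A toy dispatcher illustrating the same tri-state idea; the class names and `attach` method are illustrative, not Spack's real `SpecMultiMethod` API:

```python
class MultiMethod(object):
    """Toy stand-in for a condition-dispatched method table."""
    def __init__(self):
        self.handlers = []  # ordered list of (condition, function)

    def register(self, condition, fn):
        self.handlers.append((condition, fn))

    def __call__(self, context):
        # First handler whose condition matches the context wins.
        for condition, fn in self.handlers:
            if condition(context):
                return fn(context)
        raise LookupError('no matching handler')

class when(object):
    def __init__(self, condition):
        # True -> always matches; False -> never registered at all.
        if condition is True:
            condition = lambda ctx: True
        self.condition = None if condition is False else condition

    def attach(self, table, fn):
        # Mirrors the `if self.spec is not None:` guard in the diff.
        if self.condition is not None:
            table.register(self.condition, fn)
```

Registering with `when(False)` leaves the table untouched, while `when(True)` installs an unconditional fallback, exactly the two new cases the diff adds.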
@ -58,6 +58,7 @@
import spack.mirror
import spack.hooks
import spack.directives
import spack.repository
import spack.build_environment
import spack.url
import spack.util.web

@ -66,6 +67,7 @@
from spack.stage import Stage, ResourceStage, StageComposite
from spack.util.compression import allowed_archive, extension
from spack.util.executable import ProcessError
from spack.util.environment import dump_environment

"""Allowed URL schemes for spack packages."""
_ALLOWED_URL_SCHEMES = ["http", "https", "ftp", "file", "git"]
@ -454,7 +456,7 @@ def _make_stage(self):
        # Construct a composite stage on top of the composite FetchStrategy
        composite_fetcher = self.fetcher
        composite_stage = StageComposite()
        resources = self._get_resources()
        resources = self._get_needed_resources()
        for ii, fetcher in enumerate(composite_fetcher):
            if ii == 0:
                # Construct root stage first

@ -465,6 +467,11 @@ def _make_stage(self):
                stage = self._make_resource_stage(composite_stage[0], fetcher, resource)
            # Append the item to the composite
            composite_stage.append(stage)

        # Create stage on first access.  Needed because fetch, stage,
        # patch, and install can be called independently of each
        # other, so `with self.stage:` in do_install isn't sufficient.
        composite_stage.create()
        return composite_stage

    @property
@ -483,12 +490,14 @@ def stage(self, stage):

    def _make_fetcher(self):
        # Construct a composite fetcher that always contains at least one element (the root package). In case there
        # are resources associated with the package, append their fetcher to the composite.
        # Construct a composite fetcher that always contains at least
        # one element (the root package). In case there are resources
        # associated with the package, append their fetcher to the
        # composite.
        root_fetcher = fs.for_package_version(self, self.version)
        fetcher = fs.FetchStrategyComposite()  # Composite fetcher
        fetcher.append(root_fetcher)  # Root fetcher is always present
        resources = self._get_resources()
        resources = self._get_needed_resources()
        for resource in resources:
            fetcher.append(resource.fetcher)
        return fetcher
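`_make_fetcher` above builds a composite that always holds the root package's fetcher plus one fetcher per needed resource. The shape of that composite, sketched with placeholder classes (Spack's real `FetchStrategyComposite` forwards `fetch`, `check`, etc., rather than the illustrative `fetch_all` shown here):

```python
class FetchStrategyComposite(object):
    """Minimal composite: an ordered list of fetchers that bulk
    operations are forwarded to."""
    def __init__(self):
        self.components = []

    def append(self, fetcher):
        self.components.append(fetcher)

    def fetch_all(self):
        # Run every component in registration order.
        return [fetch() for fetch in self.components]

def make_fetcher(root_fetcher, resource_fetchers):
    # Root fetcher is always present; resource fetchers are optional.
    fetcher = FetchStrategyComposite()
    fetcher.append(root_fetcher)
    for rf in resource_fetchers:
        fetcher.append(rf)
    return fetcher
```

Keeping the root fetcher at index 0 matters: the `for ii, fetcher in enumerate(...)` loop in `_make_stage` treats the first element specially when constructing the root stage.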
@ -685,7 +694,7 @@ def do_fetch(self, mirror_only=False):

        if not ignore_checksum:
            raise FetchError(
                "Will not fetch %s." % self.spec.format('$_$@'), checksum_msg)
                "Will not fetch %s" % self.spec.format('$_$@'), checksum_msg)

        self.stage.fetch(mirror_only)

@ -705,6 +714,7 @@ def do_stage(self, mirror_only=False):
        self.stage.expand_archive()
        self.stage.chdir_to_source()

    def do_patch(self):
        """Calls do_stage(), then applies patches to the expanded tarball if they
           haven't been applied already."""

@ -719,7 +729,7 @@ def do_patch(self):

        # If there are no patches, note it.
        if not self.patches and not has_patch_fun:
            tty.msg("No patches needed for %s." % self.name)
            tty.msg("No patches needed for %s" % self.name)
            return
# Construct paths to special files in the archive dir used to
|
||||
@ -732,7 +742,7 @@ def do_patch(self):
|
||||
# If we encounter an archive that failed to patch, restage it
|
||||
# so that we can apply all the patches again.
|
||||
if os.path.isfile(bad_file):
|
||||
tty.msg("Patching failed last time. Restaging.")
|
||||
tty.msg("Patching failed last time. Restaging.")
|
||||
self.stage.restage()
|
||||
|
||||
self.stage.chdir_to_source()
|
||||
@@ -742,7 +752,7 @@ def do_patch(self):
             tty.msg("Already patched %s" % self.name)
             return
         elif os.path.isfile(no_patches_file):
-            tty.msg("No patches needed for %s." % self.name)
+            tty.msg("No patches needed for %s" % self.name)
             return

         # Apply all the patches for specs that match this one
@@ -763,10 +773,10 @@ def do_patch(self):
         if has_patch_fun:
             try:
                 self.patch()
-                tty.msg("Ran patch() for %s." % self.name)
+                tty.msg("Ran patch() for %s" % self.name)
                 patched = True
             except:
-                tty.msg("patch() function failed for %s." % self.name)
+                tty.msg("patch() function failed for %s" % self.name)
                 touch(bad_file)
                 raise

@@ -797,7 +807,7 @@ def do_fake_install(self):
         mkdirp(self.prefix.man1)
-    def _get_resources(self):
+    def _get_needed_resources(self):
         resources = []
         # Select the resources that are needed for this build
         for when_spec, resource_list in self.resources.items():
@@ -815,7 +825,7 @@ def _resource_stage(self, resource):

     def do_install(self,
-                   keep_prefix=False, keep_stage=False, ignore_deps=False,
+                   keep_prefix=False, keep_stage=None, ignore_deps=False,
                    skip_patch=False, verbose=False, make_jobs=None, fake=False):
         """Called by commands to install a package and its dependencies.

@@ -824,7 +834,8 @@ def do_install(self,

         Args:
         keep_prefix -- Keep install prefix on failure. By default, destroys it.
-        keep_stage  -- Keep stage on successful build. By default, destroys it.
+        keep_stage  -- Set to True or False to always keep or always delete stage.
+                       By default, stage is destroyed only if there are no exceptions.
         ignore_deps -- Do not install dependencies before installing this package.
         fake        -- Don't really build -- install fake stub files instead.
         skip_patch  -- Skip patch stage of build if True.
@@ -834,99 +845,98 @@ def do_install(self,
         if not self.spec.concrete:
             raise ValueError("Can only install concrete packages.")

-        if os.path.exists(self.prefix):
-            tty.msg("%s is already installed in %s." % (self.name, self.prefix))
+        # Ensure package is not already installed
+        if spack.install_layout.check_installed(self.spec):
+            tty.msg("%s is already installed in %s" % (self.name, self.prefix))
             return

         tty.msg("Installing %s" % self.name)

         # First, install dependencies recursively.
         if not ignore_deps:
             self.do_install_dependencies(
                 keep_prefix=keep_prefix, keep_stage=keep_stage, ignore_deps=ignore_deps,
-                fake=fake, skip_patch=skip_patch, verbose=verbose,
-                make_jobs=make_jobs)
-
-        start_time = time.time()
-        if not fake:
-            if not skip_patch:
-                self.do_patch()
-            else:
-                self.do_stage()
-
-        # create the install directory. The install layout
-        # handles this in case so that it can use whatever
-        # package naming scheme it likes.
-        spack.install_layout.create_install_directory(self.spec)
-
-        def cleanup():
-            if not keep_prefix:
-                # If anything goes wrong, remove the install prefix
-                self.remove_prefix()
-            else:
-                tty.warn("Keeping install prefix in place despite error.",
-                         "Spack will think this package is installed." +
-                         "Manually remove this directory to fix:",
-                         self.prefix, wrap=True)
-
-        def real_work():
-            try:
-                tty.msg("Building %s." % self.name)
-
-                # Run the pre-install hook in the child process after
-                # the directory is created.
-                spack.hooks.pre_install(self)
-
-                # Set up process's build environment before running install.
-                if fake:
-                    self.do_fake_install()
-                else:
-                    # Do the real install in the source directory.
-                    self.stage.chdir_to_source()
-
-                    # This redirects I/O to a build log (and optionally to the terminal)
-                    log_path = join_path(os.getcwd(), 'spack-build.out')
-                    log_file = open(log_path, 'w')
-                    with log_output(log_file, verbose, sys.stdout.isatty(), True):
-                        self.install(self.spec, self.prefix)
-
-                # Ensure that something was actually installed.
-                self._sanity_check_install()
-
-                # Move build log into install directory on success
-                if not fake:
-                    log_install_path = spack.install_layout.build_log_path(self.spec)
-                    install(log_path, log_install_path)
-
-                # On successful install, remove the stage.
-                if not keep_stage:
-                    self.stage.destroy()
-
-                # Stop timer.
-                self._total_time = time.time() - start_time
-                build_time = self._total_time - self._fetch_time
-
-                tty.msg("Successfully installed %s." % self.name,
-                        "Fetch: %s. Build: %s. Total: %s."
-                        % (_hms(self._fetch_time), _hms(build_time), _hms(self._total_time)))
-                print_pkg(self.prefix)
-
-            except ProcessError, e:
-                # Annotate with location of build log.
-                e.build_log = log_path
-                cleanup()
-                raise e
-
-            except:
-                # other exceptions just clean up and raise.
-                cleanup()
-                raise
+                fake=fake, skip_patch=skip_patch, verbose=verbose, make_jobs=make_jobs)

         # Set parallelism before starting build.
         self.make_jobs = make_jobs

-        # Do the build.
-        spack.build_environment.fork(self, real_work)
+        # Then install the package itself.
+        def build_process():
+            """Forked for each build. Has its own process and python
+               module space set up by build_environment.fork()."""
+            start_time = time.time()
+            if not fake:
+                if not skip_patch:
+                    self.do_patch()
+                else:
+                    self.do_stage()
+
+            tty.msg("Building %s" % self.name)
+
+            self.stage.keep = keep_stage
+            with self.stage:
+                # Run the pre-install hook in the child process after
+                # the directory is created.
+                spack.hooks.pre_install(self)
+
+                if fake:
+                    self.do_fake_install()
+                else:
+                    # Do the real install in the source directory.
+                    self.stage.chdir_to_source()
+
+                    # Save the build environment in a file before building.
+                    env_path = join_path(os.getcwd(), 'spack-build.env')
+
+                    try:
+                        # Redirect I/O to a build log (and optionally to the terminal)
+                        log_path = join_path(os.getcwd(), 'spack-build.out')
+                        log_file = open(log_path, 'w')
+                        with log_output(log_file, verbose, sys.stdout.isatty(), True):
+                            dump_environment(env_path)
+                            self.install(self.spec, self.prefix)
+
+                    except ProcessError as e:
+                        # Annotate ProcessErrors with the location of the build log.
+                        e.build_log = log_path
+                        raise e
+
+                    # Ensure that something was actually installed.
+                    self._sanity_check_install()
+
+                    # Copy provenance into the install directory on success
+                    log_install_path = spack.install_layout.build_log_path(self.spec)
+                    env_install_path = spack.install_layout.build_env_path(self.spec)
+                    packages_dir = spack.install_layout.build_packages_path(self.spec)
+
+                    install(log_path, log_install_path)
+                    install(env_path, env_install_path)
+                    dump_packages(self.spec, packages_dir)
+
+            # Stop timer.
+            self._total_time = time.time() - start_time
+            build_time = self._total_time - self._fetch_time
+
+            tty.msg("Successfully installed %s" % self.name,
+                    "Fetch: %s. Build: %s. Total: %s."
+                    % (_hms(self._fetch_time), _hms(build_time), _hms(self._total_time)))
+            print_pkg(self.prefix)
+
+        try:
+            # Create the install prefix and fork the build process.
+            spack.install_layout.create_install_directory(self.spec)
+            spack.build_environment.fork(self, build_process)
+        except:
+            # remove the install prefix if anything went wrong during install.
+            if not keep_prefix:
+                self.remove_prefix()
+            else:
+                tty.warn("Keeping install prefix in place despite error.",
+                         "Spack will think this package is installed. " +
+                         "Manually remove this directory to fix:",
+                         self.prefix, wrap=True)
+            raise

         # note: PARENT of the build process adds the new package to
         # the database, so that we don't need to re-read from file.
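The control flow above, where `build_process` runs in a forked child and the parent removes the install prefix if anything goes wrong, can be sketched with the standard library. This is a simplified stand-in for `spack.build_environment.fork`, which in Spack also sets up the child's build environment and module space:

```python
import multiprocessing

# Use fork explicitly, matching the fork-based model in the hunk above.
_ctx = multiprocessing.get_context('fork')


def run_in_child(work):
    """Run `work` in a forked child process; raise in the parent on failure."""
    proc = _ctx.Process(target=work)
    proc.start()
    proc.join()
    if proc.exitcode != 0:
        raise RuntimeError("child build process failed")


def good_build():
    pass  # stand-in for a package build that succeeds


def bad_build():
    raise ValueError("simulated build failure")
```

A caller would wrap `run_in_child` in try/except and delete the install prefix on failure, mirroring the bare `except:` cleanup block above.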
@@ -1013,7 +1023,7 @@ def do_uninstall(self, force=False):
         # Uninstalling in Spack only requires removing the prefix.
         self.remove_prefix()
         spack.installed_db.remove(self.spec)
-        tty.msg("Successfully uninstalled %s." % self.spec.short_spec)
+        tty.msg("Successfully uninstalled %s" % self.spec.short_spec)

         # Once everything else is done, run post install hooks
         spack.hooks.post_uninstall(self)
@@ -1060,7 +1070,7 @@ def do_activate(self, force=False):
         self.extendee_spec.package.activate(self, **self.extendee_args)

         spack.install_layout.add_extension(self.extendee_spec, self.spec)
-        tty.msg("Activated extension %s for %s."
+        tty.msg("Activated extension %s for %s"
                 % (self.spec.short_spec, self.extendee_spec.format("$_$@$+$%@")))


@@ -1112,7 +1122,7 @@ def do_deactivate(self, **kwargs):
         if self.activated:
             spack.install_layout.remove_extension(self.extendee_spec, self.spec)

-        tty.msg("Deactivated extension %s for %s."
+        tty.msg("Deactivated extension %s for %s"
                 % (self.spec.short_spec, self.extendee_spec.format("$_$@$+$%@")))

@@ -1140,8 +1150,7 @@ def do_restage(self):

     def do_clean(self):
         """Removes the package's build stage and source tarball."""
-        if os.path.exists(self.stage.path):
-            self.stage.destroy()
+        self.stage.destroy()


     def format_doc(self, **kwargs):
@@ -1180,7 +1189,7 @@ def fetch_remote_versions(self):
         try:
             return spack.util.web.find_versions_of_archive(
                 *self.all_urls, list_url=self.list_url, list_depth=self.list_depth)
-        except spack.error.NoNetworkConnectionError, e:
+        except spack.error.NoNetworkConnectionError as e:
             tty.die("Package.fetch_versions couldn't connect to:",
                     e.url, e.message)

@@ -1198,8 +1207,8 @@ def rpath(self):

     @property
     def rpath_args(self):
-        """Get the rpath args as a string, with -Wl,-rpath= for each element."""
-        return " ".join("-Wl,-rpath=%s" % p for p in self.rpath)
+        """Get the rpath args as a string, with -Wl,-rpath, for each element."""
+        return " ".join("-Wl,-rpath,%s" % p for p in self.rpath)


 def validate_package_url(url_string):
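The `rpath_args` change above swaps `-Wl,-rpath=DIR` for the comma form `-Wl,-rpath,DIR`; both spell a linker rpath, but the comma form is accepted more widely across compilers and wrappers. A standalone restatement of the new property (written as a free function for illustration):

```python
def rpath_args(rpath_dirs):
    """Join rpath directories into -Wl,-rpath,DIR flags, as a single string."""
    return " ".join("-Wl,-rpath,%s" % p for p in rpath_dirs)
```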
@@ -1212,6 +1221,52 @@ def validate_package_url(url_string):
         tty.die("Invalid file type in URL: '%s'" % url_string)


+def dump_packages(spec, path):
+    """Dump all package information for a spec and its dependencies.
+
+       This creates a package repository within path for every
+       namespace in the spec DAG, and fills the repos with package
+       files and patch files for every node in the DAG.
+    """
+    mkdirp(path)
+
+    # Copy in package.py files from any dependencies.
+    # Note that we copy them in as they are in the *install* directory
+    # NOT as they are in the repository, because we want a snapshot of
+    # how *this* particular build was done.
+    for node in spec.traverse():
+        if node is not spec:
+            # Locate the dependency package in the install tree and find
+            # its provenance information.
+            source = spack.install_layout.build_packages_path(node)
+            source_repo_root = join_path(source, node.namespace)
+
+            # There's no provenance installed for the source package. Skip it.
+            # User can always get something current from the builtin repo.
+            if not os.path.isdir(source_repo_root):
+                continue
+
+            # Create a source repo and get the pkg directory out of it.
+            try:
+                source_repo = spack.repository.Repo(source_repo_root)
+                source_pkg_dir = source_repo.dirname_for_package_name(node.name)
+            except RepoError as e:
+                tty.warn("Warning: Couldn't copy in provenance for %s" % node.name)
+
+        # Create a destination repository
+        dest_repo_root = join_path(path, node.namespace)
+        if not os.path.exists(dest_repo_root):
+            spack.repository.create_repo(dest_repo_root)
+        repo = spack.repository.Repo(dest_repo_root)
+
+        # Get the location of the package in the dest repo.
+        dest_pkg_dir = repo.dirname_for_package_name(node.name)
+        if node is not spec:
+            install_tree(source_pkg_dir, dest_pkg_dir)
+        else:
+            spack.repo.dump_provenance(node, dest_pkg_dir)
+
+
 def print_pkg(message):
     """Outputs a message with a package icon."""
     from llnl.util.tty.color import cwrite
@@ -1262,7 +1317,7 @@ class PackageVersionError(PackageError):
     """Raised when a version URL cannot automatically be determined."""
     def __init__(self, version):
         super(PackageVersionError, self).__init__(
-            "Cannot determine a URL automatically for version %s." % version,
+            "Cannot determine a URL automatically for version %s" % version,
             "Please provide a url for this version in the package.py file.")

@@ -6,7 +6,7 @@
 # Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
 # LLNL-CODE-647188
 #
-# For details, see https://llnl.github.io/spack
+# For details, see https://software.llnl.gov/spack
 # Please also see the LICENSE file for our notice and the LGPL.
 #
 # This program is free software; you can redistribute it and/or modify
@@ -33,7 +33,7 @@
 from external import yaml

 import llnl.util.tty as tty
-from llnl.util.filesystem import join_path
+from llnl.util.filesystem import *

 import spack.error
 import spack.config
@@ -156,7 +156,7 @@ def _add(self, repo):

         if repo.namespace in self.by_namespace:
             raise DuplicateRepoError(
-                "Package repos '%s' and '%s' both provide namespace %s."
+                "Package repos '%s' and '%s' both provide namespace %s"
                 % (repo.root, self.by_namespace[repo.namespace].root, repo.namespace))

         # Add repo to the pkg indexes
@@ -316,6 +316,16 @@ def get(self, spec, new=False):
         return self.repo_for_pkg(spec).get(spec)


+    @_autospec
+    def dump_provenance(self, spec, path):
+        """Dump provenance information for a spec to a particular path.
+
+           This dumps the package file and any associated patch files.
+           Raises UnknownPackageError if not found.
+        """
+        return self.repo_for_pkg(spec).dump_provenance(spec, path)
+
+
     def dirname_for_package_name(self, pkg_name):
         return self.repo_for_pkg(pkg_name).dirname_for_package_name(pkg_name)
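The new `RepoPath.dump_provenance` follows the same pattern as `get`: look up the repository that owns the spec's namespace, then forward the call. The dispatch itself can be sketched with stand-in classes (`TinyRepoPath` and `TinyRepo` are illustrative, not Spack's API):

```python
class TinyRepo(object):
    """Stand-in for a package repository with a namespace."""

    def __init__(self, namespace):
        self.namespace = namespace


class TinyRepoPath(object):
    """Sketch of RepoPath-style delegation: each operation is forwarded
    to the repository that owns the package's namespace."""

    def __init__(self):
        self.by_namespace = {}

    def put(self, repo):
        # Mirrors the DuplicateRepoError check in _add above.
        if repo.namespace in self.by_namespace:
            raise ValueError("duplicate namespace %s" % repo.namespace)
        self.by_namespace[repo.namespace] = repo

    def repo_for_pkg(self, fullname):
        # 'builtin.mpich' -> repo registered under 'builtin'
        namespace, _, _name = fullname.rpartition('.')
        return self.by_namespace[namespace]
```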
@@ -535,7 +545,7 @@ def get(self, spec, new=False):
             raise UnknownPackageError(spec.name)

         if spec.namespace and spec.namespace != self.namespace:
-            raise UnknownPackageError("Repository %s does not contain package %s."
+            raise UnknownPackageError("Repository %s does not contain package %s"
                                       % (self.namespace, spec.fullname))

         key = hash(spec)
@@ -552,6 +562,35 @@ def get(self, spec, new=False):
         return self._instances[key]


+    @_autospec
+    def dump_provenance(self, spec, path):
+        """Dump provenance information for a spec to a particular path.
+
+           This dumps the package file and any associated patch files.
+           Raises UnknownPackageError if not found.
+        """
+        # Some preliminary checks.
+        if spec.virtual:
+            raise UnknownPackageError(spec.name)
+
+        if spec.namespace and spec.namespace != self.namespace:
+            raise UnknownPackageError("Repository %s does not contain package %s."
+                                      % (self.namespace, spec.fullname))
+
+        # Install any patch files needed by packages.
+        mkdirp(path)
+        for spec, patches in spec.package.patches.items():
+            for patch in patches:
+                if patch.path:
+                    if os.path.exists(patch.path):
+                        install(patch.path, path)
+                    else:
+                        tty.warn("Patch file did not exist: %s" % patch.path)
+
+        # Install the package.py file itself.
+        install(self.filename_for_package_name(spec), path)
+
+
     def purge(self):
         """Clear entire package instance cache."""
         self._instances.clear()
@@ -705,6 +744,58 @@ def __contains__(self, pkg_name):
         return self.exists(pkg_name)


+def create_repo(root, namespace=None):
+    """Create a new repository in root with the specified namespace.
+
+       If the namespace is not provided, use basename of root.
+       Return the canonicalized path and the namespace of the created repository.
+    """
+    root = canonicalize_path(root)
+    if not namespace:
+        namespace = os.path.basename(root)
+
+    if not re.match(r'\w[\.\w-]*', namespace):
+        raise InvalidNamespaceError("'%s' is not a valid namespace." % namespace)
+
+    existed = False
+    if os.path.exists(root):
+        if os.path.isfile(root):
+            raise BadRepoError('File %s already exists and is not a directory' % root)
+        elif os.path.isdir(root):
+            if not os.access(root, os.R_OK | os.W_OK):
+                raise BadRepoError('Cannot create new repo in %s: cannot access directory.' % root)
+            if os.listdir(root):
+                raise BadRepoError('Cannot create new repo in %s: directory is not empty.' % root)
+        existed = True
+
+    full_path = os.path.realpath(root)
+    parent = os.path.dirname(full_path)
+    if not os.access(parent, os.R_OK | os.W_OK):
+        raise BadRepoError("Cannot create repository in %s: can't access parent!" % root)
+
+    try:
+        config_path = os.path.join(root, repo_config_name)
+        packages_path = os.path.join(root, packages_dir_name)
+
+        mkdirp(packages_path)
+        with open(config_path, 'w') as config:
+            config.write("repo:\n")
+            config.write("  namespace: '%s'\n" % namespace)
+
+    except (IOError, OSError) as e:
+        raise BadRepoError('Failed to create new repository in %s.' % root,
+                           "Caused by %s: %s" % (type(e), e))
+
+        # try to clean up.
+        if existed:
+            shutil.rmtree(config_path, ignore_errors=True)
+            shutil.rmtree(packages_path, ignore_errors=True)
+        else:
+            shutil.rmtree(root, ignore_errors=True)
+
+    return full_path, namespace
+
+
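One subtlety in `create_repo`: `re.match(r'\w[\.\w-]*', namespace)` is unanchored, so it only requires a valid *prefix*; a namespace like `'ok but bad'` still passes the check. The sketch below reproduces the check as written, plus an anchored variant (the stricter form is my assumption of the intent, not what the diff does):

```python
import re

NAMESPACE_RE = r'\w[\.\w-]*'


def prefix_valid(ns):
    # Mirrors the check in create_repo above: unanchored, so any string
    # whose *prefix* matches is accepted.
    return re.match(NAMESPACE_RE, ns) is not None


def fully_valid(ns):
    # Anchored variant that validates the entire string.
    return re.match(NAMESPACE_RE + r'\Z', ns) is not None
```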
 class RepoError(spack.error.SpackError):
     """Superclass for repository-related errors."""

@@ -713,6 +804,10 @@ class NoRepoConfiguredError(RepoError):
     """Raised when there are no repositories configured."""


+class InvalidNamespaceError(RepoError):
+    """Raised when an invalid namespace is encountered."""
+
+
 class BadRepoError(RepoError):
     """Raised when repo layout is invalid."""

@@ -730,7 +825,7 @@ class UnknownPackageError(PackageLoadError):
     def __init__(self, name, repo=None):
         msg = None
         if repo:
-            msg = "Package %s not found in repository %s." % (name, repo)
+            msg = "Package %s not found in repository %s" % (name, repo)
         else:
             msg = "Package %s not found." % name
         super(UnknownPackageError, self).__init__(msg)
@@ -6,7 +6,7 @@
 # Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
 # LLNL-CODE-647188
 #
-# For details, see https://llnl.github.io/spack
+# For details, see https://software.llnl.gov/spack
 # Please also see the LICENSE file for our notice and the LGPL.
 #
 # This program is free software; you can redistribute it and/or modify

@@ -42,36 +42,53 @@
 class Stage(object):
-    """A Stage object manages a directory where some source code is
-       downloaded and built before being installed. It handles
-       fetching the source code, either as an archive to be expanded
-       or by checking it out of a repository. A stage's lifecycle
-       looks like this:
+    """Manages a temporary stage directory for building.

-       Stage()
-         Constructor creates the stage directory.
-       fetch()
-         Fetch a source archive into the stage.
-       expand_archive()
-         Expand the source archive.
-       <install>
-         Build and install the archive. This is handled by the Package class.
-       destroy()
-         Remove the stage once the package has been installed.
+       A Stage object is a context manager that handles a directory where
+       some source code is downloaded and built before being installed.
+       It handles fetching the source code, either as an archive to be
+       expanded or by checking it out of a repository. A stage's
+       lifecycle looks like this:

-       If spack.use_tmp_stage is True, spack will attempt to create stages
-       in a tmp directory. Otherwise, stages are created directly in
-       spack.stage_path.
+       ```
+       with Stage() as stage:      # Context manager creates and destroys the stage directory
+           stage.fetch()           # Fetch a source archive into the stage.
+           stage.expand_archive()  # Expand the source archive.
+           <install>               # Build and install the archive. (handled by user of Stage)
+       ```

-       There are two kinds of stages: named and unnamed. Named stages can
-       persist between runs of spack, e.g. if you fetched a tarball but
-       didn't finish building it, you won't have to fetch it again.
+       When used as a context manager, the stage is automatically
+       destroyed if no exception is raised by the context. If an
+       exception is raised, the stage is left in the filesystem and NOT
+       destroyed, for potential reuse later.

-       Unnamed stages are created using standard mkdtemp mechanisms or
-       similar, and are intended to persist for only one run of spack.
+       You can also use the stage's create/destroy functions manually,
+       like this:
+
+       ```
+       stage = Stage()
+       try:
+           stage.create()          # Explicitly create the stage directory.
+           stage.fetch()           # Fetch a source archive into the stage.
+           stage.expand_archive()  # Expand the source archive.
+           <install>               # Build and install the archive. (handled by user of Stage)
+       finally:
+           stage.destroy()         # Explicitly destroy the stage directory.
+       ```
+
+       If spack.use_tmp_stage is True, spack will attempt to create
+       stages in a tmp directory. Otherwise, stages are created directly
+       in spack.stage_path.
+
+       There are two kinds of stages: named and unnamed. Named stages
+       can persist between runs of spack, e.g. if you fetched a tarball
+       but didn't finish building it, you won't have to fetch it again.
+
+       Unnamed stages are created using standard mkdtemp mechanisms or
+       similar, and are intended to persist for only one run of spack.
     """

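The tri-state `keep` behavior described in the new docstring (None: destroy only on success; True: always keep; False: always destroy) can be sketched as a minimal context manager. `TinyStage` is an illustrative stand-in, not Spack's `Stage`:

```python
class TinyStage(object):
    """Illustrative stand-in for Stage's context-manager protocol."""

    def __init__(self, keep=None):
        self.keep = keep
        self.destroyed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if self.keep is None:
            # Default: destroy only when no exception was raised.
            if exc_type is None:
                self.destroy()
        elif not self.keep:
            # Overridden: always destroy, success or failure.
            self.destroy()
        return False  # never swallow exceptions

    def destroy(self):
        self.destroyed = True
```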
-    def __init__(self, url_or_fetch_strategy, **kwargs):
+    def __init__(self, url_or_fetch_strategy, name=None, mirror_path=None, keep=None):
         """Create a stage object.
            Parameters:
              url_or_fetch_strategy
@@ -83,6 +100,18 @@ def __init__(self, url_or_fetch_strategy, **kwargs):
                and will persist between runs (or if you construct another
                stage object later). If name is not provided, then this
                stage will be given a unique name automatically.
+
+             mirror_path
+               If provided, Stage will search Spack's mirrors for
+               this archive at the mirror_path, before using the
+               default fetch strategy.
+
+             keep
+               By default, when used as a context manager, the Stage
+               is cleaned up when everything goes well, and it is
+               kept intact when an exception is raised. You can
+               override this behavior by setting keep to True
+               (always keep) or False (always delete).
         """
         # TODO: fetch/stage coupling needs to be reworked -- the logic
         # TODO: here is convoluted and not modular enough.
@@ -96,21 +125,55 @@ def __init__(self, url_or_fetch_strategy, **kwargs):
         self.default_fetcher = self.fetcher  # self.fetcher can change with mirrors.
         self.skip_checksum_for_mirror = True  # used for mirrored archives of repositories.

-        self.name = kwargs.get('name')
-        self.mirror_path = kwargs.get('mirror_path')
+        # TODO : this uses a protected member of tempfile, but seemed the only way to get a temporary name
+        # TODO : besides, the temporary link name won't be the same as the temporary stage area in tmp_root
+        self.name = name
+        if name is None:
+            self.name = STAGE_PREFIX + next(tempfile._get_candidate_names())
+        self.mirror_path = mirror_path
         self.tmp_root = find_tmp_root()

-        self.path = None
-        self._setup()
+        # Try to construct here a temporary name for the stage directory
+        # If this is a named stage, then construct a named path.
+        self.path = join_path(spack.stage_path, self.name)
+
+        # Flag to decide whether to delete the stage folder on exit or not
+        self.keep = keep
+
+    def __enter__(self):
+        """
+        Entering a stage context will create the stage directory
+
+        Returns:
+            self
+        """
+        self.create()
+        return self
+
+    def __exit__(self, exc_type, exc_val, exc_tb):
+        """
+        Exiting from a stage context will delete the stage directory unless:
+        - it was explicitly requested not to do so
+        - an exception has been raised
+
+        Args:
+            exc_type: exception type
+            exc_val: exception value
+            exc_tb: exception traceback
+
+        Returns:
+            Boolean
+        """
+        if self.keep is None:
+            # Default: delete when there are no exceptions.
+            if exc_type is None:
+                self.destroy()
+        elif not self.keep:
+            # Overridden. Either always keep or always delete.
+            self.destroy()

     def _cleanup_dead_links(self):
         """Remove any dead links in the stage directory."""
         for file in os.listdir(spack.stage_path):
             path = join_path(spack.stage_path, file)
             if os.path.islink(path):
                 real_path = os.path.realpath(path)
                 if not os.path.exists(path):
                     os.unlink(path)
     def _need_to_create_path(self):
         """Makes sure nothing weird has happened since the last time we
@@ -148,54 +211,6 @@ def _need_to_create_path(self):

         return False

-    def _setup(self):
-        """Creates the stage directory.
-           If spack.use_tmp_stage is False, the stage directory is created
-           directly under spack.stage_path.
-
-           If spack.use_tmp_stage is True, this will attempt to create a
-           stage in a temporary directory and link it into spack.stage_path.
-           Spack will use the first writable location in spack.tmp_dirs to
-           create a stage. If there is no valid location in tmp_dirs, fall
-           back to making the stage inside spack.stage_path.
-        """
-        # Create the top-level stage directory
-        mkdirp(spack.stage_path)
-        self._cleanup_dead_links()
-
-        # If this is a named stage, then construct a named path.
-        if self.name is not None:
-            self.path = join_path(spack.stage_path, self.name)
-
-        # If this is a temporary stage, then make the temp directory
-        tmp_dir = None
-        if self.tmp_root:
-            if self.name is None:
-                # Unnamed tmp root. Link the path in
-                tmp_dir = tempfile.mkdtemp('', STAGE_PREFIX, self.tmp_root)
-                self.name = os.path.basename(tmp_dir)
-                self.path = join_path(spack.stage_path, self.name)
-                if self._need_to_create_path():
-                    os.symlink(tmp_dir, self.path)
-
-            else:
-                if self._need_to_create_path():
-                    tmp_dir = tempfile.mkdtemp('', STAGE_PREFIX, self.tmp_root)
-                    os.symlink(tmp_dir, self.path)
-
-        # if we're not using a tmp dir, create the stage directly in the
-        # stage dir, rather than linking to it.
-        else:
-            if self.name is None:
-                self.path = tempfile.mkdtemp('', STAGE_PREFIX, spack.stage_path)
-                self.name = os.path.basename(self.path)
-            else:
-                if self._need_to_create_path():
-                    mkdirp(self.path)
-
-        # Make sure we can actually do something with the stage we made.
-        ensure_access(self.path)

     @property
     def archive_file(self):
         """Path to the source archive within this stage directory."""
@@ -214,13 +229,22 @@ def archive_file(self):

     @property
     def source_path(self):
-        """Returns the path to the expanded/checked out source code
-           within this fetch strategy's path.
+        """Returns the path to the expanded/checked out source code.

-           This assumes nothing else is going to be put in the
-           FetchStrategy's path. It searches for the first
-           subdirectory of the path it can find, then returns that.
+           To find the source code, this method searches for the first
+           subdirectory of the stage that it can find, and returns it.
+           This assumes nothing besides the archive file will be in the
+           stage path, but it has the advantage that we don't need to
+           know the name of the archive or its contents.
+
+           If the fetch strategy is not supposed to expand the downloaded
+           file, it will just return the stage path. If the archive needs
+           to be expanded, it will return None when no archive is found.
         """
+        if isinstance(self.fetcher, fs.URLFetchStrategy):
+            if not self.fetcher.expand_archive:
+                return self.path
+
         for p in [os.path.join(self.path, f) for f in os.listdir(self.path)]:
             if os.path.isdir(p):
                 return p
@@ -231,7 +255,7 @@ def chdir(self):
         if os.path.isdir(self.path):
             os.chdir(self.path)
         else:
-            tty.die("Setup failed: no such directory: " + self.path)
+            raise ChdirError("Setup failed: no such directory: " + self.path)

     def fetch(self, mirror_only=False):
         """Downloads an archive or checks out code from a repository."""

@@ -276,7 +300,7 @@ def fetch(self, mirror_only=False):
                 self.fetcher = fetcher
                 self.fetcher.fetch()
                 break
-            except spack.error.SpackError, e:
+            except spack.error.SpackError as e:
                 tty.msg("Fetching from %s failed." % fetcher)
                 tty.debug(e)
                 continue
@@ -306,9 +330,9 @@ def expand_archive(self):
         archive_dir = self.source_path
         if not archive_dir:
             self.fetcher.expand()
-            tty.msg("Created stage in %s." % self.path)
+            tty.msg("Created stage in %s" % self.path)
         else:
-            tty.msg("Already staged %s in %s." % (self.name, self.path))
+            tty.msg("Already staged %s in %s" % (self.name, self.path))

     def chdir_to_source(self):
         """Changes directory to the expanded archive directory.
@@ -328,8 +352,35 @@ def restage(self):
         """
         self.fetcher.reset()

def create(self):
|
||||
"""
|
||||
Creates the stage directory
|
||||
|
||||
If self.tmp_root evaluates to False, the stage directory is
|
||||
created directly under spack.stage_path, otherwise this will
|
||||
attempt to create a stage in a temporary directory and link it
|
||||
into spack.stage_path.
|
||||
|
||||
Spack will use the first writable location in spack.tmp_dirs
|
||||
to create a stage. If there is no valid location in tmp_dirs,
|
||||
fall back to making the stage inside spack.stage_path.
|
||||
"""
|
||||
# Create the top-level stage directory
|
||||
mkdirp(spack.stage_path)
|
||||
remove_dead_links(spack.stage_path)
|
||||
# If a tmp_root exists then create a directory there and then link it in the stage area,
|
||||
# otherwise create the stage directory in self.path
|
||||
if self._need_to_create_path():
|
||||
if self.tmp_root:
|
||||
tmp_dir = tempfile.mkdtemp('', STAGE_PREFIX, self.tmp_root)
|
||||
os.symlink(tmp_dir, self.path)
|
||||
else:
|
||||
mkdirp(self.path)
|
||||
# Make sure we can actually do something with the stage we made.
|
||||
ensure_access(self.path)
|
||||
|
||||
def destroy(self):
|
||||
"""Remove this stage directory."""
|
||||
"""Removes this stage directory."""
|
||||
remove_linked_tree(self.path)
|
||||
|
||||
# Make sure we don't end up in a removed directory
|
||||
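The new `create()` above has two paths: make the real directory under a writable tmp root and symlink it into the stage area, or create the stage directory directly. A simplified standalone sketch of that logic; the function name and `prefix` default are made up for illustration:

```python
import os
import tempfile


def create_stage(stage_path, tmp_root=None, prefix='spack-stage-'):
    """Simplified sketch of the Stage.create() logic shown above: with a
    tmp_root, make the real directory there and symlink it into place;
    otherwise create the stage directory directly."""
    if tmp_root:
        tmp_dir = tempfile.mkdtemp('', prefix, tmp_root)
        os.symlink(tmp_dir, stage_path)
    else:
        os.makedirs(stage_path)
    # Rough equivalent of ensure_access(): fail early if the stage is unusable.
    if not os.access(stage_path, os.R_OK | os.W_OK):
        raise OSError("Insufficient permissions for %s" % stage_path)
    return stage_path
```

Either way the caller sees one usable path; only `os.path.islink()` reveals which branch ran.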
@@ -374,13 +425,28 @@ def expand_archive(self):
        shutil.move(source_path, destination_path)


@pattern.composite(method_list=['fetch', 'check', 'expand_archive', 'restage', 'destroy'])
@pattern.composite(method_list=['fetch', 'create', 'check', 'expand_archive', 'restage', 'destroy'])
class StageComposite:
    """
    Composite for Stage type objects. The first item in this composite is considered to be the root package, and
    operations that return a value are forwarded to it.
    """
    #
    # __enter__ and __exit__ delegate to all stages in the composite.
    #
    def __enter__(self):
        for item in self:
            item.__enter__()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        for item in reversed(self):
            item.keep = getattr(self, 'keep', None)
            item.__exit__(exc_type, exc_val, exc_tb)

    #
    # Below functions act only on the *first* stage in the composite.
    #
    @property
    def source_path(self):
        return self[0].source_path

@@ -392,6 +458,10 @@ def path(self):
    def chdir_to_source(self):
        return self[0].chdir_to_source()

    @property
    def archive_file(self):
        return self[0].archive_file


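The composite's `__enter__`/`__exit__` above enter every member in order and exit them in reverse, so the last stage entered is the first torn down. A minimal stand-in showing that delegation pattern; the `Member` and `Composite` classes are hypothetical, not Spack's:

```python
class Member(object):
    """Records enter/exit events into a shared log."""
    def __init__(self, name, log):
        self.name, self.log = name, log

    def __enter__(self):
        self.log.append('enter ' + self.name)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.log.append('exit ' + self.name)


class Composite(list):
    """Enter members first-to-last, exit them last-to-first, mirroring
    the StageComposite.__enter__/__exit__ delegation above."""
    def __enter__(self):
        for item in self:
            item.__enter__()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        for item in reversed(self):
            item.__exit__(exc_type, exc_val, exc_tb)


log = []
with Composite([Member('a', log), Member('b', log)]):
    pass
# log is now: enter a, enter b, exit b, exit a
```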
class DIYStage(object):
    """Simple class that allows any directory to be a spack stage."""
@@ -405,12 +475,16 @@ def chdir(self):
        if os.path.isdir(self.path):
            os.chdir(self.path)
        else:
            tty.die("Setup failed: no such directory: " + self.path)
            raise ChdirError("Setup failed: no such directory: " + self.path)

    # DIY stages do nothing as context managers.
    def __enter__(self): pass
    def __exit__(self, exc_type, exc_val, exc_tb): pass

    def chdir_to_source(self):
        self.chdir()

    def fetch(self):
    def fetch(self, mirror_only):
        tty.msg("No need to fetch for DIY.")

    def check(self):
@@ -439,19 +513,6 @@ def ensure_access(file=spack.stage_path):
    tty.die("Insufficient permissions for %s" % file)

def remove_linked_tree(path):
    """Removes a directory and its contents. If the directory is a symlink,
    follows the link and removes the real directory before removing the
    link.
    """
    if os.path.exists(path):
        if os.path.islink(path):
            shutil.rmtree(os.path.realpath(path), True)
            os.unlink(path)
        else:
            shutil.rmtree(path, True)

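`remove_linked_tree` above exists for the tmp-root case: when the stage path is a symlink into a temporary directory, a plain `rmtree` would only follow or fail on the link, so the real tree behind the link is removed first, then the link itself. A quick self-contained check of that behavior, with throwaway paths:

```python
import os
import shutil
import tempfile


def remove_linked_tree(path):
    """Same logic as the function above: follow a symlink and remove the
    real directory before removing the link itself."""
    if os.path.exists(path):
        if os.path.islink(path):
            shutil.rmtree(os.path.realpath(path), True)
            os.unlink(path)
        else:
            shutil.rmtree(path, True)


# A real directory, plus a symlink to it standing in for a staged path.
real = tempfile.mkdtemp()
link = os.path.join(tempfile.mkdtemp(), 'stage')
os.symlink(real, link)

remove_linked_tree(link)
# Both the link and the directory it pointed at are now gone.
```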
def purge():
    """Remove all build directories in the top-level stage path."""
    if os.path.isdir(spack.stage_path):
@@ -480,19 +541,15 @@ def find_tmp_root():


class StageError(spack.error.SpackError):
    def __init__(self, message, long_message=None):
        super(self, StageError).__init__(message, long_message)
    """Superclass for all errors encountered during staging."""


class RestageError(StageError):
    def __init__(self, message, long_msg=None):
        super(RestageError, self).__init__(message, long_msg)
    """Error encountered during restaging."""


class ChdirError(StageError):
    def __init__(self, message, long_msg=None):
        super(ChdirError, self).__init__(message, long_msg)

    """Raised when Spack can't change directories."""

# Keep this in namespace for convenience
FailedDownloadError = fs.FailedDownloadError
@@ -39,11 +39,11 @@
    'arg1',
    '-Wl,--start-group',
    'arg2',
    '-Wl,-rpath=/first/rpath', 'arg3', '-Wl,-rpath', '-Wl,/second/rpath',
    '-Wl,-rpath,/first/rpath', 'arg3', '-Wl,-rpath', '-Wl,/second/rpath',
    '-llib1', '-llib2',
    'arg4',
    '-Wl,--end-group',
    '-Xlinker,-rpath', '-Xlinker,/third/rpath', '-Xlinker,-rpath=/fourth/rpath',
    '-Xlinker', '-rpath', '-Xlinker', '/third/rpath', '-Xlinker', '-rpath', '-Xlinker', '/fourth/rpath',
    '-llib3', '-llib4',
    'arg5', 'arg6']

@@ -95,13 +95,13 @@ def test_cpp_mode(self):
    def test_ccld_mode(self):
        self.check_cc('dump-mode', [], "ccld")
        self.check_cc('dump-mode', ['foo.c', '-o', 'foo'], "ccld")
        self.check_cc('dump-mode', ['foo.c', '-o', 'foo', '-Wl,-rpath=foo'], "ccld")
        self.check_cc('dump-mode', ['foo.o', 'bar.o', 'baz.o', '-o', 'foo', '-Wl,-rpath=foo'], "ccld")
        self.check_cc('dump-mode', ['foo.c', '-o', 'foo', '-Wl,-rpath,foo'], "ccld")
        self.check_cc('dump-mode', ['foo.o', 'bar.o', 'baz.o', '-o', 'foo', '-Wl,-rpath,foo'], "ccld")

    def test_ld_mode(self):
        self.check_ld('dump-mode', [], "ld")
        self.check_ld('dump-mode', ['foo.o', 'bar.o', 'baz.o', '-o', 'foo', '-Wl,-rpath=foo'], "ld")
        self.check_ld('dump-mode', ['foo.o', 'bar.o', 'baz.o', '-o', 'foo', '-Wl,-rpath,foo'], "ld")

    def test_includes(self):
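The test changes above move from `-Wl,-rpath=dir` to `-Wl,-rpath,dir` and from fused `-Xlinker,arg` to paired `-Xlinker arg`. The likely reason: a compiler driver splits a `-Wl,` option on commas and forwards each piece to the linker, so the comma form delivers `-rpath` and the directory as two arguments (portable), while `-rpath=dir` is a single fused token that GNU ld happens to accept but other linkers may not; `-Xlinker` forwards exactly one following argument. A sketch of that forwarding, with a hypothetical helper name:

```python
def expand_linker_args(args):
    """Hypothetical sketch of how a compiler driver forwards linker
    arguments: '-Wl,a,b' splits on commas into separate linker arguments,
    while each '-Xlinker' forwards exactly the next argument unchanged."""
    out = []
    it = iter(args)
    for arg in it:
        if arg.startswith('-Wl,'):
            out.extend(arg.split(',')[1:])  # drop the '-Wl' prefix piece
        elif arg == '-Xlinker':
            out.append(next(it))            # forward the next token as-is
        else:
            out.append(arg)
    return out


linker_args = expand_linker_args(
    ['-Wl,-rpath,/first/rpath', '-Xlinker', '-rpath', '-Xlinker', '/second/rpath'])
```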
@@ -22,8 +22,6 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import unittest

import spack
from spack.spec import Spec, CompilerSpec
from spack.test.mock_packages_test import *

@@ -22,13 +22,13 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import unittest
import shutil
import os
import shutil
from tempfile import mkdtemp
from ordereddict_backport import OrderedDict

import spack
import spack.config
from ordereddict_backport import OrderedDict
from spack.test.mock_packages_test import *

# Some sample compiler config data
@@ -23,20 +23,15 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import unittest
import shutil
import tempfile
import unittest

from llnl.util.filesystem import *

from spack.cmd.create import ConfigureGuesser
from spack.stage import Stage

from spack.fetch_strategy import URLFetchStrategy
from spack.directory_layout import YamlDirectoryLayout
from spack.util.executable import which
from spack.test.mock_packages_test import *
from spack.test.mock_repo import MockArchive
from spack.util.executable import which


class InstallTest(unittest.TestCase):
@@ -52,8 +47,6 @@ def setUp(self):

    def tearDown(self):
        shutil.rmtree(self.tmpdir, ignore_errors=True)
        if self.stage:
            self.stage.destroy()
        os.chdir(self.orig_dir)


@@ -64,12 +57,12 @@ def check_archive(self, filename, system):

        url = 'file://' + join_path(os.getcwd(), 'archive.tar.gz')
        print url
        self.stage = Stage(url)
        self.stage.fetch()
        with Stage(url) as stage:
            stage.fetch()

        guesser = ConfigureGuesser()
        guesser(self.stage)
        self.assertEqual(system, guesser.build_system)
            guesser = ConfigureGuesser()
            guesser(stage)
            self.assertEqual(system, guesser.build_system)


    def test_python(self):
@@ -26,19 +26,18 @@
These tests check the database is functioning properly,
both in memory and in its file
"""
import tempfile
import shutil
import multiprocessing

from llnl.util.lock import *
from llnl.util.filesystem import join_path
import shutil
import tempfile

import spack
from llnl.util.filesystem import join_path
from llnl.util.lock import *
from llnl.util.tty.colify import colify
from spack.database import Database
from spack.directory_layout import YamlDirectoryLayout
from spack.test.mock_packages_test import *

from llnl.util.tty.colify import colify

def _print_ref_counts():
    """Print out all ref counts for the graph used here, for debugging"""
@@ -25,20 +25,17 @@
"""\
This test verifies that the Spack directory layout works properly.
"""
import unittest
import tempfile
import shutil
import os

from llnl.util.filesystem import *
import shutil
import tempfile

import spack
from spack.spec import Spec
from spack.repository import RepoPath
from llnl.util.filesystem import *
from spack.directory_layout import YamlDirectoryLayout
from spack.repository import RepoPath
from spack.spec import Spec
from spack.test.mock_packages_test import *


# number of packages to test (to reduce test time)
max_packages = 10

@@ -23,19 +23,12 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import unittest
import shutil
import tempfile

from llnl.util.filesystem import *

import spack
from spack.version import ver
from spack.stage import Stage
from spack.util.executable import which

from llnl.util.filesystem import *
from spack.test.mock_packages_test import *
from spack.test.mock_repo import MockGitRepo
from spack.version import ver


class GitFetchTest(MockPackagesTest):
@@ -52,19 +45,15 @@ def setUp(self):
        spec.concretize()
        self.pkg = spack.repo.get(spec, new=True)


    def tearDown(self):
        """Destroy the stage space used by this test."""
        super(GitFetchTest, self).tearDown()
        self.repo.destroy()
        self.pkg.do_clean()


    def assert_rev(self, rev):
        """Check that the current git revision is equal to the supplied rev."""
        self.assertEqual(self.repo.rev_hash('HEAD'), self.repo.rev_hash(rev))


    def try_fetch(self, rev, test_file, args):
        """Tries to:
        1. Fetch the repo using a fetch strategy constructed with
@@ -76,26 +65,27 @@ def try_fetch(self, rev, test_file, args):
        """
        self.pkg.versions[ver('git')] = args

        self.pkg.do_stage()
        self.assert_rev(rev)
        with self.pkg.stage:
            self.pkg.do_stage()
            self.assert_rev(rev)

        file_path = join_path(self.pkg.stage.source_path, test_file)
        self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
        self.assertTrue(os.path.isfile(file_path))
            file_path = join_path(self.pkg.stage.source_path, test_file)
            self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
            self.assertTrue(os.path.isfile(file_path))

        os.unlink(file_path)
        self.assertFalse(os.path.isfile(file_path))
            os.unlink(file_path)
            self.assertFalse(os.path.isfile(file_path))

        untracked_file = 'foobarbaz'
        touch(untracked_file)
        self.assertTrue(os.path.isfile(untracked_file))
        self.pkg.do_restage()
        self.assertFalse(os.path.isfile(untracked_file))
            untracked_file = 'foobarbaz'
            touch(untracked_file)
            self.assertTrue(os.path.isfile(untracked_file))
            self.pkg.do_restage()
            self.assertFalse(os.path.isfile(untracked_file))

        self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
        self.assertTrue(os.path.isfile(file_path))
            self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
            self.assertTrue(os.path.isfile(file_path))

        self.assert_rev(rev)
            self.assert_rev(rev)


    def test_fetch_master(self):
@@ -23,16 +23,12 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import unittest

from llnl.util.filesystem import *

import spack

from spack.version import ver
from spack.stage import Stage
from spack.util.executable import which
from spack.test.mock_packages_test import *
from spack.test.mock_repo import MockHgRepo
from llnl.util.filesystem import *
from spack.test.mock_packages_test import *


class HgFetchTest(MockPackagesTest):
@@ -49,13 +45,10 @@ def setUp(self):
        spec.concretize()
        self.pkg = spack.repo.get(spec, new=True)


    def tearDown(self):
        """Destroy the stage space used by this test."""
        super(HgFetchTest, self).tearDown()
        self.repo.destroy()
        self.pkg.do_clean()


    def try_fetch(self, rev, test_file, args):
        """Tries to:
@@ -68,26 +61,27 @@ def try_fetch(self, rev, test_file, args):
        """
        self.pkg.versions[ver('hg')] = args

        self.pkg.do_stage()
        self.assertEqual(self.repo.get_rev(), rev)
        with self.pkg.stage:
            self.pkg.do_stage()
            self.assertEqual(self.repo.get_rev(), rev)

        file_path = join_path(self.pkg.stage.source_path, test_file)
        self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
        self.assertTrue(os.path.isfile(file_path))
            file_path = join_path(self.pkg.stage.source_path, test_file)
            self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
            self.assertTrue(os.path.isfile(file_path))

        os.unlink(file_path)
        self.assertFalse(os.path.isfile(file_path))
            os.unlink(file_path)
            self.assertFalse(os.path.isfile(file_path))

        untracked = 'foobarbaz'
        touch(untracked)
        self.assertTrue(os.path.isfile(untracked))
        self.pkg.do_restage()
        self.assertFalse(os.path.isfile(untracked))
            untracked = 'foobarbaz'
            touch(untracked)
            self.assertTrue(os.path.isfile(untracked))
            self.pkg.do_restage()
            self.assertFalse(os.path.isfile(untracked))

        self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
        self.assertTrue(os.path.isfile(file_path))
            self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
            self.assertTrue(os.path.isfile(file_path))

        self.assertEqual(self.repo.get_rev(), rev)
            self.assertEqual(self.repo.get_rev(), rev)


    def test_fetch_default(self):
@@ -22,18 +22,13 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import unittest
import shutil
import tempfile

from llnl.util.filesystem import *

import spack
from spack.stage import Stage
from spack.fetch_strategy import URLFetchStrategy, FetchStrategyComposite
from llnl.util.filesystem import *
from spack.directory_layout import YamlDirectoryLayout
from spack.util.executable import which
from spack.fetch_strategy import URLFetchStrategy, FetchStrategyComposite
from spack.test.mock_packages_test import *
from spack.test.mock_repo import MockArchive

@@ -24,8 +24,6 @@
##############################################################################
import os
import unittest
import shutil
import tempfile

from llnl.util.filesystem import *
from llnl.util.link_tree import LinkTree
@@ -38,6 +36,7 @@ class LinkTreeTest(unittest.TestCase):

    def setUp(self):
        self.stage = Stage('link-tree-test')
        self.stage.create()

        with working_dir(self.stage.path):
            touchp('source/1')
@@ -51,10 +50,8 @@ def setUp(self):
        source_path = os.path.join(self.stage.path, 'source')
        self.link_tree = LinkTree(source_path)


    def tearDown(self):
        if self.stage:
            self.stage.destroy()
        self.stage.destroy()


    def check_file_link(self, filename):
@@ -25,15 +25,13 @@
"""
These tests ensure that our lock works correctly.
"""
import unittest
import os
import tempfile
import shutil
import tempfile
import unittest
from multiprocessing import Process

from llnl.util.lock import *
from llnl.util.filesystem import join_path, touch

from llnl.util.lock import *
from spack.util.multiproc import Barrier

# This is the longest a failed test will take, as the barriers will
@@ -28,13 +28,13 @@
This just tests whether the right args are getting passed to make.
"""
import os
import unittest
import tempfile
import shutil
import tempfile
import unittest

from llnl.util.filesystem import *
from spack.util.environment import path_put_first
from spack.build_environment import MakeExecutable
from spack.util.environment import path_put_first


class MakeExecutableTest(unittest.TestCase):
@@ -23,11 +23,10 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
from filecmp import dircmp

import spack
import spack.mirror
from spack.util.compression import decompressor_for

from filecmp import dircmp
from spack.test.mock_packages_test import *
from spack.test.mock_repo import *

@@ -74,14 +73,14 @@ def set_up_package(self, name, MockRepoClass, url_attr):


    def check_mirror(self):
        stage = Stage('spack-mirror-test')
        mirror_root = join_path(stage.path, 'test-mirror')
        with Stage('spack-mirror-test') as stage:
            mirror_root = join_path(stage.path, 'test-mirror')

        # register mirror with spack config
        mirrors = { 'spack-mirror-test' : 'file://' + mirror_root }
        spack.config.update_config('mirrors', mirrors)

            # register mirror with spack config
            mirrors = { 'spack-mirror-test' : 'file://' + mirror_root }
            spack.config.update_config('mirrors', mirrors)

            try:
                os.chdir(stage.path)
                spack.mirror.create(
                    mirror_root, self.repos, no_checksum=True)
@@ -97,38 +96,28 @@ def check_mirror(self):
                files = os.listdir(subdir)
                self.assertEqual(len(files), 1)

        # Now try to fetch each package.
        for name, mock_repo in self.repos.items():
            spec = Spec(name).concretized()
            pkg = spec.package
            # Now try to fetch each package.
            for name, mock_repo in self.repos.items():
                spec = Spec(name).concretized()
                pkg = spec.package

                pkg._stage = None
            saved_checksum_setting = spack.do_checksum
            try:
                # Stage the archive from the mirror and cd to it.
                spack.do_checksum = False
                pkg.do_stage(mirror_only=True)

                # Compare the original repo with the expanded archive
                original_path = mock_repo.path
                if 'svn' in name:
                    # have to check out the svn repo to compare.
                    original_path = join_path(mock_repo.path, 'checked_out')
                    svn('checkout', mock_repo.url, original_path)

                dcmp = dircmp(original_path, pkg.stage.source_path)

                # make sure there are no new files in the expanded tarball
                self.assertFalse(dcmp.right_only)

                # and that all original files are present.
                self.assertTrue(all(l in exclude for l in dcmp.left_only))

            finally:
                spack.do_checksum = saved_checksum_setting
                pkg.do_clean()
        finally:
            stage.destroy()
                saved_checksum_setting = spack.do_checksum
                with pkg.stage:
                    # Stage the archive from the mirror and cd to it.
                    spack.do_checksum = False
                    pkg.do_stage(mirror_only=True)
                    # Compare the original repo with the expanded archive
                    original_path = mock_repo.path
                    if 'svn' in name:
                        # have to check out the svn repo to compare.
                        original_path = join_path(mock_repo.path, 'checked_out')
                        svn('checkout', mock_repo.url, original_path)
                    dcmp = dircmp(original_path, pkg.stage.source_path)
                    # make sure there are no new files in the expanded tarball
                    self.assertFalse(dcmp.right_only)
                    # and that all original files are present.
                    self.assertTrue(all(l in exclude for l in dcmp.left_only))
                spack.do_checksum = saved_checksum_setting


    def test_git_mirror(self):
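The mirror check above compares the original repo against the expanded archive with `filecmp.dircmp`, asserting that `right_only` (files only in the expanded tarball) is empty. A small self-contained sketch of those two attributes, using throwaway directories and made-up file names:

```python
import os
import tempfile
from filecmp import dircmp


def write(path, text):
    with open(path, 'w') as f:
        f.write(text)


# Two directories sharing one file; each side also has one unique file.
left = tempfile.mkdtemp()
right = tempfile.mkdtemp()
write(os.path.join(left, 'common.txt'), 'same')
write(os.path.join(right, 'common.txt'), 'same')
write(os.path.join(left, 'only-left.txt'), 'l')
write(os.path.join(right, 'only-right.txt'), 'r')

dcmp = dircmp(left, right)
# dcmp.left_only lists names present only in `left`;
# dcmp.right_only lists names present only in `right`.
```

In the test above, an empty `right_only` means the expanded archive introduced no files the original repo did not have.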
@@ -22,17 +22,15 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
import os
import shutil
import unittest
import tempfile
from ordereddict_backport import OrderedDict

from llnl.util.filesystem import mkdirp
import unittest

import spack
import spack.config
from llnl.util.filesystem import mkdirp
from ordereddict_backport import OrderedDict
from spack.repository import RepoPath
from spack.spec import Spec

@@ -26,13 +26,9 @@
import shutil

from llnl.util.filesystem import *

import spack
from spack.version import ver
from spack.stage import Stage
from spack.util.executable import which


#
# VCS Systems used by mock repo code.
#
@@ -25,14 +25,11 @@
"""
Test for multi_method dispatch.
"""
import unittest

import spack
from spack.multimethod import *
from spack.version import *
from spack.spec import Spec
from spack.multimethod import when
from spack.test.mock_packages_test import *
from spack.version import *


class MultiMethodTest(MockPackagesTest):
@@ -6,7 +6,7 @@
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://llnl.github.io/spack
# For details, see https://software.llnl.gov/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
@@ -23,6 +23,7 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import unittest

from spack.util.naming import NamespaceTrie


@@ -22,10 +22,8 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import unittest

import spack
from spack.spec import Spec, CompilerSpec
from spack.spec import Spec
from spack.test.mock_packages_test import *

class ConcretizeTest(MockPackagesTest):

@@ -22,14 +22,12 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import unittest

from llnl.util.filesystem import join_path

import spack
from llnl.util.filesystem import join_path
from spack.repository import Repo
from spack.util.naming import mod_to_class
from spack.test.mock_packages_test import *
from spack.util.naming import mod_to_class


class PackagesTest(MockPackagesTest):
@@ -28,12 +28,11 @@
Spack was originally 2.7, but enough systems in 2014 are still using
2.6 on their frontend nodes that we need 2.6 to get adopted.
"""
import unittest
import os
import re
import unittest

import llnl.util.tty as tty

import pyqver2
import spack


@@ -31,8 +31,6 @@
import spack
import spack.package

from llnl.util.lang import list_modules

from spack.spec import Spec
from spack.test.mock_packages_test import *

@@ -22,7 +22,6 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import unittest
from spack.spec import *
from spack.test.mock_packages_test import *


@@ -23,9 +23,10 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import unittest

import spack.spec
from spack.spec import *
from spack.parse import Token
from spack.spec import *

# Sample output for a complex lexing.
complex_lex = [Token(ID, 'mvapich_foo'),
@ -25,15 +25,13 @@
|
||||
"""\
|
||||
Test that the Stage class works correctly.
|
||||
"""
|
||||
import unittest
|
||||
import shutil
|
||||
import os
|
||||
import getpass
|
||||
import shutil
|
||||
import unittest
|
||||
from contextlib import *
|
||||
|
||||
from llnl.util.filesystem import *
|
||||
|
||||
import spack
|
||||
from llnl.util.filesystem import *
|
||||
from spack.stage import Stage
|
||||
from spack.util.executable import which
|
||||
|
||||
@ -192,116 +190,90 @@ def check_destroy(self, stage, stage_name):
|
||||
|
||||
def test_setup_and_destroy_name_with_tmp(self):
|
||||
with use_tmp(True):
|
||||
stage = Stage(archive_url, name=stage_name)
|
||||
self.check_setup(stage, stage_name)
|
||||
|
||||
stage.destroy()
|
||||
with Stage(archive_url, name=stage_name) as stage:
|
||||
self.check_setup(stage, stage_name)
|
||||
self.check_destroy(stage, stage_name)
|
||||
|
||||
|
||||
def test_setup_and_destroy_name_without_tmp(self):
|
||||
with use_tmp(False):
|
||||
stage = Stage(archive_url, name=stage_name)
|
||||
self.check_setup(stage, stage_name)
|
||||
|
||||
stage.destroy()
|
||||
with Stage(archive_url, name=stage_name) as stage:
|
||||
self.check_setup(stage, stage_name)
|
||||
self.check_destroy(stage, stage_name)
|
||||
|
||||
|
||||
def test_setup_and_destroy_no_name_with_tmp(self):
|
||||
with use_tmp(True):
|
||||
stage = Stage(archive_url)
|
||||
self.check_setup(stage, None)
|
||||
|
||||
stage.destroy()
|
||||
with Stage(archive_url) as stage:
|
||||
self.check_setup(stage, None)
|
||||
self.check_destroy(stage, None)
|
||||
|
||||
|
||||
def test_setup_and_destroy_no_name_without_tmp(self):
|
||||
with use_tmp(False):
|
||||
stage = Stage(archive_url)
|
||||
self.check_setup(stage, None)
|
||||
|
||||
stage.destroy()
|
||||
with Stage(archive_url) as stage:
|
||||
self.check_setup(stage, None)
|
||||
self.check_destroy(stage, None)
|
||||
|
||||
|
||||
def test_chdir(self):
|
||||
stage = Stage(archive_url, name=stage_name)
|
||||
|
||||
stage.chdir()
|
||||
self.check_setup(stage, stage_name)
|
||||
self.check_chdir(stage, stage_name)
|
||||
|
||||
stage.destroy()
|
||||
with Stage(archive_url, name=stage_name) as stage:
|
||||
stage.chdir()
|
||||
self.check_setup(stage, stage_name)
|
||||
self.check_chdir(stage, stage_name)
|
||||
self.check_destroy(stage, stage_name)
|
||||
|
||||
|
||||
def test_fetch(self):
|
||||
stage = Stage(archive_url, name=stage_name)
|
||||
|
||||
stage.fetch()
|
||||
self.check_setup(stage, stage_name)
|
||||
self.check_chdir(stage, stage_name)
|
||||
self.check_fetch(stage, stage_name)
|
||||
|
||||
stage.destroy()
|
||||
with Stage(archive_url, name=stage_name) as stage:
|
||||
stage.fetch()
|
||||
self.check_setup(stage, stage_name)
|
||||
self.check_chdir(stage, stage_name)
|
||||
self.check_fetch(stage, stage_name)
|
||||
self.check_destroy(stage, stage_name)
|
||||
|
||||
|
||||
def test_expand_archive(self):
|
||||
stage = Stage(archive_url, name=stage_name)
|
||||
|
||||
stage.fetch()
|
||||
self.check_setup(stage, stage_name)
|
||||
self.check_fetch(stage, stage_name)
|
||||
|
||||
stage.expand_archive()
self.check_expand_archive(stage, stage_name)

stage.destroy()
with Stage(archive_url, name=stage_name) as stage:
stage.fetch()
self.check_setup(stage, stage_name)
self.check_fetch(stage, stage_name)
stage.expand_archive()
self.check_expand_archive(stage, stage_name)
self.check_destroy(stage, stage_name)


def test_expand_archive(self):
stage = Stage(archive_url, name=stage_name)

stage.fetch()
self.check_setup(stage, stage_name)
self.check_fetch(stage, stage_name)

stage.expand_archive()
stage.chdir_to_source()
self.check_expand_archive(stage, stage_name)
self.check_chdir_to_source(stage, stage_name)

stage.destroy()
with Stage(archive_url, name=stage_name) as stage:
stage.fetch()
self.check_setup(stage, stage_name)
self.check_fetch(stage, stage_name)
stage.expand_archive()
stage.chdir_to_source()
self.check_expand_archive(stage, stage_name)
self.check_chdir_to_source(stage, stage_name)
self.check_destroy(stage, stage_name)


def test_restage(self):
stage = Stage(archive_url, name=stage_name)
with Stage(archive_url, name=stage_name) as stage:
stage.fetch()
stage.expand_archive()
stage.chdir_to_source()
self.check_expand_archive(stage, stage_name)
self.check_chdir_to_source(stage, stage_name)

stage.fetch()
stage.expand_archive()
stage.chdir_to_source()
self.check_expand_archive(stage, stage_name)
self.check_chdir_to_source(stage, stage_name)
# Try to make a file in the old archive dir
with open('foobar', 'w') as file:
file.write("this file is to be destroyed.")

# Try to make a file in the old archive dir
with open('foobar', 'w') as file:
file.write("this file is to be destroyed.")
self.assertTrue('foobar' in os.listdir(stage.source_path))

self.assertTrue('foobar' in os.listdir(stage.source_path))

# Make sure the file is not there after restage.
stage.restage()
self.check_chdir(stage, stage_name)
self.check_fetch(stage, stage_name)

stage.chdir_to_source()
self.check_chdir_to_source(stage, stage_name)
self.assertFalse('foobar' in os.listdir(stage.source_path))

stage.destroy()
# Make sure the file is not there after restage.
stage.restage()
self.check_chdir(stage, stage_name)
self.check_fetch(stage, stage_name)
stage.chdir_to_source()
self.check_chdir_to_source(stage, stage_name)
self.assertFalse('foobar' in os.listdir(stage.source_path))
self.check_destroy(stage, stage_name)
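The hunks above replace manual `stage.destroy()` cleanup with a `with Stage(...)` block, so teardown runs even when the test body raises. A minimal sketch of that pattern, using a hypothetical `StageLike` class (not Spack's actual `Stage`):

```python
# A context manager guarantees cleanup without an explicit destroy() call.
# StageLike is a made-up stand-in for illustration only.
import os
import shutil
import tempfile

class StageLike(object):
    """Creates a scratch directory on entry and destroys it on exit."""
    def __enter__(self):
        self.path = tempfile.mkdtemp()
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.destroy()

    def destroy(self):
        shutil.rmtree(self.path, ignore_errors=True)

with StageLike() as stage:
    assert os.path.isdir(stage.path)
    saved = stage.path
# Cleanup ran automatically when the block exited.
assert not os.path.exists(saved)
```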
@ -24,18 +24,12 @@
##############################################################################
import os
import re
import unittest
import shutil
import tempfile

from llnl.util.filesystem import *

import spack
from spack.version import ver
from spack.stage import Stage
from spack.util.executable import which
from spack.test.mock_packages_test import *

from spack.test.mock_repo import svn, MockSvnRepo
from spack.version import ver
from spack.test.mock_packages_test import *
from llnl.util.filesystem import *


class SvnFetchTest(MockPackagesTest):
@ -51,13 +45,10 @@ def setUp(self):
spec.concretize()
self.pkg = spack.repo.get(spec, new=True)


def tearDown(self):
"""Destroy the stage space used by this test."""
super(SvnFetchTest, self).tearDown()
self.repo.destroy()
self.pkg.do_clean()


def assert_rev(self, rev):
"""Check that the current revision is equal to the supplied rev."""
@ -70,7 +61,6 @@ def get_rev():
return match.group(1)
self.assertEqual(get_rev(), rev)


def try_fetch(self, rev, test_file, args):
"""Tries to:
1. Fetch the repo using a fetch strategy constructed with
@ -82,26 +72,27 @@ def try_fetch(self, rev, test_file, args):
"""
self.pkg.versions[ver('svn')] = args

self.pkg.do_stage()
self.assert_rev(rev)
with self.pkg.stage:
self.pkg.do_stage()
self.assert_rev(rev)

file_path = join_path(self.pkg.stage.source_path, test_file)
self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))
file_path = join_path(self.pkg.stage.source_path, test_file)
self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))

os.unlink(file_path)
self.assertFalse(os.path.isfile(file_path))
os.unlink(file_path)
self.assertFalse(os.path.isfile(file_path))

untracked = 'foobarbaz'
touch(untracked)
self.assertTrue(os.path.isfile(untracked))
self.pkg.do_restage()
self.assertFalse(os.path.isfile(untracked))
untracked = 'foobarbaz'
touch(untracked)
self.assertTrue(os.path.isfile(untracked))
self.pkg.do_restage()
self.assertFalse(os.path.isfile(untracked))

self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))
self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))

self.assert_rev(rev)
self.assert_rev(rev)


def test_fetch_default(self):
@ -6,7 +6,7 @@
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# For details, see https://scalability-software.llnl.gov/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
@ -22,10 +22,10 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from nose.plugins import Plugin

import os

from nose.plugins import Plugin

class Tally(Plugin):
name = 'tally'

@ -34,7 +34,7 @@ def __init__(self):
self.successCount = 0
self.failCount = 0
self.errorCount = 0


@property
def numberOfTestsRun(self):
"""Excludes skipped tests"""
@ -48,10 +48,10 @@ def configure(self, options, conf):

def addSuccess(self, test):
self.successCount += 1


def addError(self, test, err):
self.errorCount += 1


def addFailure(self, test, err):
self.failCount += 1


@ -22,10 +22,11 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import unittest
import itertools
import unittest

import spack

test_install = __import__("spack.cmd.test-install",
fromlist=["BuildId", "create_test_output", "TestResult"])


@ -25,10 +25,7 @@
"""\
Tests ability of spack to extrapolate URL versions from existing versions.
"""
import spack
import spack.url as url
from spack.spec import Spec
from spack.version import ver
from spack.test.mock_packages_test import *


@ -27,8 +27,8 @@
detection in Homebrew.
"""
import unittest

import spack.url as url
from pprint import pprint


class UrlParseTest(unittest.TestCase):

@ -27,7 +27,6 @@
"""
import unittest

import spack
import spack.url as url


@ -28,6 +28,7 @@
where it makes sense.
"""
import unittest

from spack.version import *


@ -26,6 +26,7 @@
Test Spack's custom YAML format.
"""
import unittest

import spack.util.spack_yaml as syaml

test_file = """\

@ -225,7 +225,7 @@ def parse_version_offset(path):
(r'_((\d+\.)+\d+[a-z]?)[.]orig$', stem),

# e.g. http://www.openssl.org/source/openssl-0.9.8s.tar.gz
(r'-([^-]+(-alpha|-beta)?)', stem),
(r'-v?([^-]+(-alpha|-beta)?)', stem),

# e.g. astyle_1.23_macosx.tar.gz
(r'_([^_]+(_alpha|_beta)?)', stem),

@ -63,3 +63,10 @@ def pop_keys(dictionary, *keys):
for key in keys:
if key in dictionary:
dictionary.pop(key)


def dump_environment(path):
"""Dump the current environment out to a file."""
with open(path, 'w') as env_file:
for key, val in sorted(os.environ.items()):
env_file.write("%s=%s\n" % (key, val))
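The new `dump_environment` helper above writes one `KEY=value` line per variable, sorted by key. A self-contained sketch of what it produces (the `SPACK_DEMO_VAR` variable and the output path are made up for the demo):

```python
# Reproduces the dump_environment helper shown in the hunk above and
# checks its output format: sorted KEY=value lines.
import os
import tempfile

def dump_environment(path):
    """Dump the current environment out to a file."""
    with open(path, 'w') as env_file:
        for key, val in sorted(os.environ.items()):
            env_file.write("%s=%s\n" % (key, val))

os.environ['SPACK_DEMO_VAR'] = 'hello'   # hypothetical variable for the demo
path = os.path.join(tempfile.mkdtemp(), 'env.txt')
dump_environment(path)

with open(path) as f:
    lines = f.read().splitlines()
assert 'SPACK_DEMO_VAR=hello' in lines
assert lines == sorted(lines)   # output is sorted by key
```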
@ -86,12 +86,12 @@ def _spider(args):

if not "Content-type" in resp.headers:
tty.debug("ignoring page " + url)
return pages
return pages, links

if not resp.headers["Content-type"].startswith('text/html'):
tty.debug("ignoring page " + url + " with content type " +
resp.headers["Content-type"])
return pages
return pages, links

# Do the real GET request when we know it's just HTML.
req.get_method = lambda: "GET"
@ -173,7 +173,7 @@ def spider(root_url, **kwargs):
performance over a sequential fetch.
"""
max_depth = kwargs.setdefault('depth', 1)
pages, links = _spider((root_url, set(), root_url, None, 1, max_depth, False))
pages, links = _spider((root_url, set(), root_url, None, 1, max_depth, False))
return pages, links
15
var/spack/repos/builtin/packages/blitz/package.py
Normal file
@ -0,0 +1,15 @@
from spack import *

class Blitz(Package):
"""N-dimensional arrays for C++"""
homepage = "http://github.com/blitzpp/blitz"
url = "https://github.com/blitzpp/blitz/tarball/1.0.0"

version('1.0.0', '9f040b9827fe22228a892603671a77af')

# No dependencies

def install(self, spec, prefix):
configure('--prefix=%s' % prefix)
make()
make("install")

@ -46,6 +46,7 @@ class Cgal(Package):
depends_on('mpfr')
depends_on('gmp')
depends_on('zlib')
depends_on('cmake')

# FIXME : Qt5 dependency missing (needs Qt5 and OpenGL)
# FIXME : Optional third party libraries missing

@ -7,6 +7,7 @@ class Expat(Package):

version('2.1.0', 'dd7dab7a5fea97d2a6a43f511449b7cd')

depends_on('cmake')

def install(self, spec, prefix):

@ -47,6 +47,8 @@ class Fftw(Package):

depends_on('mpi', when='+mpi')

# TODO : add support for architecture specific optimizations as soon as targets are supported

def install(self, spec, prefix):
options = ['--prefix=%s' % prefix,
'--enable-shared',

@ -6,7 +6,7 @@
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://llnl.github.io/spack
# For details, see https://software.llnl.gov/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify

56
var/spack/repos/builtin/packages/gromacs/package.py
Normal file
@ -0,0 +1,56 @@
from spack import *


class Gromacs(Package):
"""
GROMACS (GROningen MAchine for Chemical Simulations) is a molecular dynamics package primarily designed for
simulations of proteins, lipids and nucleic acids. It was originally developed in the Biophysical Chemistry
department of University of Groningen, and is now maintained by contributors in universities and research centers
across the world.

GROMACS is one of the fastest and most popular software packages available and can run on CPUs as well as GPUs.
It is free, open source released under the GNU General Public License. Starting from version 4.6, GROMACS is
released under the GNU Lesser General Public License.
"""

homepage = 'http://www.gromacs.org'
url = 'ftp://ftp.gromacs.org/pub/gromacs/gromacs-5.1.2.tar.gz'

version('5.1.2', '614d0be372f1a6f1f36382b7a6fcab98')

variant('mpi', default=True, description='Activate MPI support')
variant('shared', default=True, description='Enables the build of shared libraries')
variant('debug', default=False, description='Enables debug mode')
variant('double', default=False, description='Produces a double precision version of the executables')

depends_on('mpi', when='+mpi')

depends_on('fftw')

# TODO : add GPU support

def install(self, spec, prefix):

options = []

if '+mpi' in spec:
options.append('-DGMX_MPI:BOOL=ON')

if '+double' in spec:
options.append('-DGMX_DOUBLE:BOOL=ON')

if '~shared' in spec:
options.append('-DBUILD_SHARED_LIBS:BOOL=OFF')

if '+debug' in spec:
options.append('-DCMAKE_BUILD_TYPE:STRING=Debug')
else:
options.append('-DCMAKE_BUILD_TYPE:STRING=Release')

options.extend(std_cmake_args)

with working_dir('spack-build', create=True):

cmake('..', *options)
make()
make('install')
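The new GROMACS `install()` above maps spec variants onto CMake `-D` flags. A standalone sketch of that mapping, with a hypothetical `cmake_options` helper that takes a plain set of active variant names in place of a Spack spec:

```python
# Standalone sketch of the variant-to-flag mapping used in the GROMACS
# install() method above. cmake_options is a made-up helper for illustration.
def cmake_options(variants):
    """Translate a set of active variant names into CMake -D flags."""
    options = []
    if 'mpi' in variants:
        options.append('-DGMX_MPI:BOOL=ON')
    if 'double' in variants:
        options.append('-DGMX_DOUBLE:BOOL=ON')
    if 'shared' not in variants:
        options.append('-DBUILD_SHARED_LIBS:BOOL=OFF')
    # Debug and Release are mutually exclusive, so one is always emitted.
    build_type = 'Debug' if 'debug' in variants else 'Release'
    options.append('-DCMAKE_BUILD_TYPE:STRING=%s' % build_type)
    return options

assert cmake_options({'mpi', 'shared'}) == \
    ['-DGMX_MPI:BOOL=ON', '-DCMAKE_BUILD_TYPE:STRING=Release']
assert '-DBUILD_SHARED_LIBS:BOOL=OFF' in cmake_options(set())
```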
@ -28,7 +28,7 @@ class Jdk(Package):
'-H', # specify required License Agreement cookie
'Cookie: oraclelicense=accept-securebackup-cookie']

def do_fetch(self):
def do_fetch(self, mirror_only=False):
# Add our custom curl commandline options
tty.msg(
"[Jdk] Adding required commandline options to curl " +
@ -39,7 +39,7 @@ def do_fetch(self):
spack.curl.add_default_arg(option)

# Now perform the actual fetch
super(Jdk, self).do_fetch()
super(Jdk, self).do_fetch(mirror_only)


def install(self, spec, prefix):
@ -22,9 +22,16 @@ class Libevent(Package):
version('2.0.13', 'af786b4b3f790c9d3279792edf7867fc')
version('2.0.12', '42986228baf95e325778ed328a93e070')

variant('openssl', default=True, description="Build with encryption enabled at the libevent level.")
depends_on('openssl', when='+openssl')

def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
configure_args = []
if '+openssl' in spec:
configure_args.append('--enable-openssl')
else:
configure_args.append('--disable-openssl')

configure("--prefix=%s" % prefix, *configure_args)
make()
make("install")
15
var/spack/repos/builtin/packages/libsigsegv/package.py
Normal file
@ -0,0 +1,15 @@
from spack import *

class Libsigsegv(Package):
"""GNU libsigsegv is a library for handling page faults in user mode."""
homepage = "https://www.gnu.org/software/libsigsegv/"
url = "ftp://ftp.gnu.org/gnu/libsigsegv/libsigsegv-2.10.tar.gz"

version('2.10', '7f96fb1f65b3b8cbc1582fb7be774f0f')

def install(self, spec, prefix):
configure('--prefix=%s' % prefix,
'--enable-shared')

make()
make("install")

@ -1,5 +1,5 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Copyright (c) 2016, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.

@ -34,7 +34,7 @@ class Llvm(Package):
it is the full name of the project.
"""
homepage = 'http://llvm.org/'
url = 'http://llvm.org/releases/3.7.0/llvm-3.7.0.src.tar.xz'
url = 'http://llvm.org/releases/3.7.1/llvm-3.7.1.src.tar.xz'

version('3.0', 'a8e5f5f1c1adebae7b4a654c376a6005', url='http://llvm.org/releases/3.0/llvm-3.0.tar.gz') # currently required by mesa package

@ -117,6 +117,36 @@ class Llvm(Package):
},
}
releases = [
{
'version' : 'trunk',
'repo' : 'http://llvm.org/svn/llvm-project/llvm/trunk',
'resources' : {
'compiler-rt' : 'http://llvm.org/svn/llvm-project/compiler-rt/trunk',
'openmp' : 'http://llvm.org/svn/llvm-project/openmp/trunk',
'polly' : 'http://llvm.org/svn/llvm-project/polly/trunk',
'libcxx' : 'http://llvm.org/svn/llvm-project/libcxx/trunk',
'libcxxabi' : 'http://llvm.org/svn/llvm-project/libcxxabi/trunk',
'clang' : 'http://llvm.org/svn/llvm-project/cfe/trunk',
'clang-tools-extra' : 'http://llvm.org/svn/llvm-project/clang-tools-extra/trunk',
'lldb' : 'http://llvm.org/svn/llvm-project/lldb/trunk',
'llvm-libunwind' : 'http://llvm.org/svn/llvm-project/libunwind/trunk',
}
},
{
'version' : '3.7.1',
'md5':'bf8b3a2c79e61212c5409041dfdbd319',
'resources' : {
'compiler-rt' : '1c6975daf30bb3b0473b53c3a1a6ff01',
'openmp' : 'b4ad08cda4e5c22e42b66062b140438e',
'polly' : '3a2a7367002740881637f4d47bca4dc3',
'libcxx' : 'f9c43fa552a10e14ff53b94d04bea140',
'libcxxabi' : '52d925afac9f97e9dcac90745255c169',
'clang' : '0acd026b5529164197563d135a8fd83e',
'clang-tools-extra' : '5d49ff745037f061a7c86aeb6a24c3d2',
'lldb' : 'a106d8a0d21fc84d76953822fbaf3398',
'llvm-libunwind' : '814bd52c9247c5d04629658fbcb3ab8c',
}
},
{
'version' : '3.7.0',
'md5':'b98b9495e5655a672d6cb83e1a180f8e',
@ -161,34 +191,25 @@ class Llvm(Package):
]

for release in releases:
version(release['version'], release['md5'], url=llvm_url % release)

for name, md5 in release['resources'].items():
resource(name=name,
url=resources[name]['url'] % release,
md5=md5,
destination=resources[name]['destination'],
when='@%(version)s' % release,
placement=resources[name].get('placement', None))

# SVN - current develop
version('develop', svn='http://llvm.org/svn/llvm-project/llvm/trunk')
resource(name='clang', svn='http://llvm.org/svn/llvm-project/cfe/trunk',
destination='tools', when='@develop', placement='clang')
resource(name='compiler-rt', svn='http://llvm.org/svn/llvm-project/compiler-rt/trunk',
destination='projects', when='@develop', placement='compiler-rt')
resource(name='openmp', svn='http://llvm.org/svn/llvm-project/openmp/trunk',
destination='projects', when='@develop', placement='openmp')
resource(name='libcxx', svn='http://llvm.org/svn/llvm-project/libcxx/trunk',
destination='projects', when='@develop', placement='libcxx')
resource(name='libcxxabi', svn='http://llvm.org/svn/llvm-project/libcxxabi/trunk',
destination='projects', when='@develop', placement='libcxxabi')
resource(name='polly', svn='http://llvm.org/svn/llvm-project/polly/trunk',
destination='tools', when='@develop', placement='polly')
resource(name='lldb', svn='http://llvm.org/svn/llvm-project/lldb/trunk',
destination='tools', when='@develop', placement='lldb')
if release['version'] == 'trunk' :
version(release['version'], svn=release['repo'])

for name, repo in release['resources'].items():
resource(name=name,
svn=repo,
destination=resources[name]['destination'],
when='@%(version)s' % release,
placement=resources[name].get('placement', None))
else:
version(release['version'], release['md5'], url=llvm_url % release)

for name, md5 in release['resources'].items():
resource(name=name,
url=resources[name]['url'] % release,
md5=md5,
destination=resources[name]['destination'],
when='@%(version)s' % release,
placement=resources[name].get('placement', None))

def install(self, spec, prefix):
env['CXXFLAGS'] = self.compiler.cxx11_flag
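The LLVM refactor above drives all `version()` and `resource()` registration from a single `releases` table, branching on whether an entry is SVN trunk or a tarball with MD5 checksums. A minimal sketch of that table-driven pattern, with hypothetical `version()`/`resource()` stand-ins that just record their calls (Spack's real directives register metadata on the package class):

```python
# Table-driven registration, as in the LLVM hunk above. version() and
# resource() here are recording stubs, not Spack's actual directives.
calls = []

def version(*args, **kwargs):
    calls.append(('version', args, kwargs))

def resource(**kwargs):
    calls.append(('resource', kwargs))

releases = [
    {'version': 'trunk',
     'repo': 'http://llvm.org/svn/llvm-project/llvm/trunk',
     'resources': {'clang': 'http://llvm.org/svn/llvm-project/cfe/trunk'}},
    {'version': '3.7.1',
     'md5': 'bf8b3a2c79e61212c5409041dfdbd319',
     'resources': {'clang': '0acd026b5529164197563d135a8fd83e'}},
]

llvm_url = 'http://llvm.org/releases/%(version)s/llvm-%(version)s.src.tar.xz'

for release in releases:
    if release['version'] == 'trunk':
        # Trunk is fetched from SVN; resources map name -> repo URL.
        version(release['version'], svn=release['repo'])
        for name, repo in release['resources'].items():
            resource(name=name, svn=repo, when='@%(version)s' % release)
    else:
        # Numbered releases are tarballs; resources map name -> md5.
        version(release['version'], release['md5'], url=llvm_url % release)
        for name, md5 in release['resources'].items():
            resource(name=name, md5=md5, when='@%(version)s' % release)

assert len(calls) == 4   # two version() calls, two resource() calls
```

Adding a new release then means adding one dictionary to the table rather than a block of copy-pasted directives, which is what the diff removes.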
@ -7,7 +7,17 @@ class M4(Package):

version('1.4.17', 'a5e9954b1dae036762f7b13673a2cf76')

variant('sigsegv', default=True, description="Build the libsigsegv dependency")

depends_on('libsigsegv', when='+sigsegv')

def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
configure_args = []
if 'libsigsegv' in spec:
configure_args.append('--with-libsigsegv-prefix=%s' % spec['libsigsegv'].prefix)
else:
configure_args.append('--without-libsigsegv-prefix')

configure("--prefix=%s" % prefix, *configure_args)
make()
make("install")

@ -37,6 +37,12 @@ class Mpc(Package):
depends_on("gmp")
depends_on("mpfr")

def url_for_version(self, version):
if version < Version("1.0.1"):
return "http://www.multiprecision.org/mpc/download/mpc-%s.tar.gz" % version
else:
return "ftp://ftp.gnu.org/gnu/mpc/mpc-%s.tar.gz" % version

def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
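The mpc `url_for_version` above picks a download host by release age: pre-1.0.1 tarballs live at multiprecision.org, newer ones on the GNU FTP mirror. A self-contained sketch of the same logic, using a plain tuple comparison in place of Spack's `Version` class (the helper below is hypothetical):

```python
# Version-dependent URL selection, as in the mpc hunk above. A dotted
# version string is compared as an integer tuple instead of Spack's Version.
def url_for_version(version):
    """Old mpc releases were hosted at multiprecision.org; newer at GNU."""
    if tuple(int(p) for p in version.split('.')) < (1, 0, 1):
        return "http://www.multiprecision.org/mpc/download/mpc-%s.tar.gz" % version
    return "ftp://ftp.gnu.org/gnu/mpc/mpc-%s.tar.gz" % version

assert url_for_version('0.8.2').startswith('http://www.multiprecision.org')
assert url_for_version('1.0.3').startswith('ftp://ftp.gnu.org')
```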
15
var/spack/repos/builtin/packages/netcdf-cxx4/package.py
Normal file
@ -0,0 +1,15 @@
from spack import *

class NetcdfCxx4(Package):
"""C++ interface for NetCDF4"""
homepage = "http://www.unidata.ucar.edu/software/netcdf"
url = "http://www.unidata.ucar.edu/downloads/netcdf/ftp/netcdf-cxx4-4.2.tar.gz"

version('4.2', 'd019853802092cf686254aaba165fc81')

depends_on('netcdf')

def install(self, spec, prefix):
configure('--prefix=%s' % prefix)
make()
make("install")

16
var/spack/repos/builtin/packages/netcdf-fortran/package.py
Normal file
@ -0,0 +1,16 @@
from spack import *

class NetcdfFortran(Package):
"""Fortran interface for NetCDF4"""

homepage = "http://www.unidata.ucar.edu/software/netcdf"
url = "http://www.unidata.ucar.edu/downloads/netcdf/ftp/netcdf-fortran-4.4.3.tar.gz"

version('4.4.3', 'bfd4ae23a34635b273d3eb0d91cbde9e')

depends_on('netcdf')

def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")

@ -6,14 +6,13 @@ class Netcdf(Package):
data formats that support the creation, access, and sharing of array-oriented
scientific data."""

homepage = "http://www.unidata.ucar.edu/software/netcdf/"
homepage = "http://www.unidata.ucar.edu/software/netcdf"
url = "ftp://ftp.unidata.ucar.edu/pub/netcdf/netcdf-4.3.3.tar.gz"

version('4.4.0', 'cffda0cbd97fdb3a06e9274f7aef438e')
version('4.3.3', '5fbd0e108a54bd82cb5702a73f56d2ae')

variant('mpi', default=True, description='Enables MPI parallelism')
variant('fortran', default=False, description="Download and install NetCDF-Fortran")
variant('hdf4', default=False, description="Enable HDF4 support")

# Dependencies:
@ -66,11 +65,7 @@ def install(self, spec, prefix):

# Fortran support
# In version 4.2+, NetCDF-C and NetCDF-Fortran have split.
# They can be installed separately, but this bootstrap procedure
# should be able to install both at the same time.
# Note: this is a new experimental feature.
if '+fortran' in spec:
config_args.append("--enable-remote-fortran-bootstrap")
# Use the netcdf-fortran package to install Fortran support.

config_args.append('CPPFLAGS=%s' % ' '.join(CPPFLAGS))
config_args.append('LDFLAGS=%s' % ' '.join(LDFLAGS))
@ -79,8 +74,3 @@ def install(self, spec, prefix):
configure(*config_args)
make()
make("install")

# After installing NetCDF-C, install NetCDF-Fortran
if '+fortran' in spec:
make("build-netcdf-fortran")
make("install-netcdf-fortran")

@ -17,6 +17,7 @@ class Openssl(Package):
version('1.0.2d', '38dd619b2e77cbac69b99f52a053d25a')
version('1.0.2e', '5262bfa25b60ed9de9f28d5d52d77fc5')
version('1.0.2f', 'b3bf73f507172be9292ea2a8c28b659d')
version('1.0.2g', 'f3c710c045cdee5fd114feb69feba7aa')

depends_on("zlib")
parallel = False

@ -16,4 +16,4 @@ class Pango(Package):
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")
make("install", parallel=False)

@ -2,9 +2,11 @@

class Paraview(Package):
homepage = 'http://www.paraview.org'
url = 'http://www.paraview.org/files/v4.4/ParaView-v4.4.0-source.tar.gz'
url = 'http://www.paraview.org/files/v5.0/ParaView-v'
_url_str = 'http://www.paraview.org/files/v%s/ParaView-v%s-source.tar.gz'

version('4.4.0', 'fa1569857dd680ebb4d7ff89c2227378', url='http://www.paraview.org/files/v4.4/ParaView-v4.4.0-source.tar.gz')
version('4.4.0', 'fa1569857dd680ebb4d7ff89c2227378')
version('5.0.0', '4598f0b421460c8bbc635c9a1c3bdbee')

variant('python', default=False, description='Enable Python support')

@ -25,8 +27,8 @@ class Paraview(Package):

depends_on('bzip2')
depends_on('freetype')
depends_on('hdf5')
depends_on('hdf5+mpi', when='+mpi')
depends_on('hdf5~mpi', when='~mpi')
depends_on('jpeg')
depends_on('libpng')
depends_on('libtiff')
@ -35,6 +37,11 @@ class Paraview(Package):
#depends_on('protobuf') # version mismatches?
#depends_on('sqlite') # external version not supported
depends_on('zlib')

def url_for_version(self, version):
"""Handle ParaView version-based custom URLs."""
return self._url_str % (version.up_to(2), version)


def install(self, spec, prefix):
with working_dir('spack-build', create=True):
20
var/spack/repos/builtin/packages/proj/package.py
Normal file
@ -0,0 +1,20 @@
from spack import *

class Proj(Package):
"""Cartographic Projections"""
homepage = "https://github.com/OSGeo/proj.4/wiki"
url = "http://download.osgeo.org/proj/proj-4.9.2.tar.gz"

version('4.9.2', '9843131676e31bbd903d60ae7dc76cf9')
version('4.9.1', '3cbb2a964fd19a496f5f4265a717d31c')
version('4.8.0', 'd815838c92a29179298c126effbb1537')
version('4.7.0', '927d34623b52e0209ba2bfcca18fe8cd')
version('4.6.1', '7dbaab8431ad50c25669fd3fb28dc493')

# No dependencies

def install(self, spec, prefix):
configure('--prefix=%s' % prefix)

make()
make("install")

@ -55,6 +55,20 @@ def install(self, spec, prefix):
make()
make("install")

# Modify compiler paths in configuration files. This is necessary for
# building site packages outside of spack
filter_file(r'([/s]=?)([\S=]*)/lib/spack/env(/[^\s/]*)?/(\S*)(\s)',
(r'\4\5'),
join_path(prefix.lib, 'python%d.%d' % self.version[:2], '_sysconfigdata.py'))

python3_version = ''
if spec.satisfies('@3:'):
python3_version = '-%d.%dm' % self.version[:2]
makefile_filepath = join_path(prefix.lib, 'python%d.%d' % self.version[:2], 'config%s' % python3_version, 'Makefile')
filter_file(r'([/s]=?)([\S=]*)/lib/spack/env(/[^\s/]*)?/(\S*)(\s)',
(r'\4\5'),
makefile_filepath)


# ========================================================================
# Set up environment to make install easy for python extensions.
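The `filter_file` calls above strip Spack's compiler-wrapper paths out of Python's `_sysconfigdata.py` and `Makefile`, leaving just the bare compiler name. The same substitution can be demonstrated with plain `re.sub`; the sample `/opt/spack/...` line below is hypothetical:

```python
# The substitution performed by the filter_file() calls above: collapse
# any .../lib/spack/env/<compiler>/<tool> path down to just <tool>.
import re

pattern = r'([/s]=?)([\S=]*)/lib/spack/env(/[^\s/]*)?/(\S*)(\s)'
line = "CC = /opt/spack/lib/spack/env/gcc/gcc -pthread\n"   # made-up example
cleaned = re.sub(pattern, r'\4\5', line)
assert cleaned == "CC = gcc -pthread\n"
```

Group 4 captures the final path component (the real tool name) and group 5 the trailing whitespace, so site packages built outside Spack pick up the plain compiler instead of the wrapper.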
@ -1,19 +1,28 @@
from spack import *

class Silo(Package):
"""Silo is a library for reading and writing a wide variety of scientific data to binary, disk files."""
"""Silo is a library for reading and writing a wide variety of scientific
data to binary, disk files."""

homepage = "http://wci.llnl.gov/simulation/computer-codes/silo"
url = "https://wci.llnl.gov/content/assets/docs/simulation/computer-codes/silo/silo-4.8/silo-4.8.tar.gz"

#version('4.9', 'a83eda4f06761a86726e918fc55e782a')
version('4.8', 'b1cbc0e7ec435eb656dc4b53a23663c9')

depends_on("hdf5@:1.8.12")
variant('fortran', default=True, description='Enable Fortran support')

depends_on("hdf5")

def install(self, spec, prefix):
configure("--prefix=%s" % prefix,
"--with-hdf5=%s" % spec['hdf5'].prefix)
config_args = [
'--enable-fortran' if '+fortran' in spec else '--disable-fortran',
]

configure(
"--prefix=%s" % prefix,
"--with-hdf5=%s,%s" % (spec['hdf5'].prefix.include, spec['hdf5'].prefix.lib),
"--with-zlib=%s,%s" % (spec['zlib'].prefix.include, spec['zlib'].prefix.lib),
*config_args)

make()
make("install")
@ -6,7 +6,7 @@
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://llnl.github.io/spack
# For details, see https://software.llnl.gov/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify

16
var/spack/repos/builtin/packages/udunits2/package.py
Normal file
@ -0,0 +1,16 @@
from spack import *

class Udunits2(Package):
"""Automated units conversion"""

homepage = "http://www.unidata.ucar.edu/software/udunits"
url = "ftp://ftp.unidata.ucar.edu/pub/udunits/udunits-2.2.20.tar.gz"

version('2.2.20', '1586b70a49dfe05da5fcc29ef239dce0')

depends_on('expat')

def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")
26
var/spack/repos/builtin/packages/zfp/package.py
Normal file
@ -0,0 +1,26 @@
from spack import *

class Zfp(Package):
"""zfp is an open source C library for compressed floating-point arrays that supports
very high throughput read and write random access, target error bounds or bit rates.
Although bit-for-bit lossless compression is not always possible, zfp is usually
accurate to within machine epsilon in near-lossless mode, and is often orders of
magnitude more accurate than other lossy compressors.
"""

homepage = "http://computation.llnl.gov/projects/floating-point-compression"
url = "http://computation.llnl.gov/projects/floating-point-compression/download/zfp-0.5.0.tar.gz"

version('0.5.0', '2ab29a852e65ad85aae38925c5003654')

def install(self, spec, prefix):
make("shared")

# No install target provided by the zfp makefile; copy files by hand.
mkdirp(prefix.lib)
mkdirp(prefix.include)
install('lib/libzfp.so', prefix.lib)
install('inc/zfp.h', prefix.include)
install('inc/types.h', prefix.include)
install('inc/bitstream.h', prefix.include)
install('inc/system.h', prefix.include)