Release v0.8.10

This commit is contained in:
Todd Gamblin 2014-10-27 23:04:47 -07:00
commit 84aa69fb0d
93 changed files with 4824 additions and 1266 deletions

LICENSE
View File

@ -1,4 +1,4 @@
Copyright (c) 2013, Lawrence Livermore National Security, LLC.
Copyright (c) 2013-2014, Lawrence Livermore National Security, LLC.
Produced at the Lawrence Livermore National Laboratory.
This file is part of Spack.
@ -55,22 +55,22 @@ Modification
0. This License Agreement applies to any software library or other
program which contains a notice placed by the copyright holder or
other authorized party saying it may be distributed under the terms of
this Lesser General Public License (also called “this License”). Each
licensee is addressed as “you”.
this Lesser General Public License (also called "this License"). Each
licensee is addressed as "you".
A “library” means a collection of software functions and/or data
A "library" means a collection of software functions and/or data
prepared so as to be conveniently linked with application programs
(which use some of those functions and data) to form executables.
The “Library”, below, refers to any such software library or work
which has been distributed under these terms. A work based on the
Library means either the Library or any derivative work under
The "Library", below, refers to any such software library or work
which has been distributed under these terms. A "work based on the
Library" means either the Library or any derivative work under
copyright law: that is to say, a work containing the Library or a
portion of it, either verbatim or with modifications and/or translated
straightforwardly into another language. (Hereinafter, translation is
included without limitation in the term “modification”.)
included without limitation in the term "modification".)
“Source code” for a work means the preferred form of the work for
"Source code" for a work means the preferred form of the work for
making modifications to it. For a library, complete source code means
all the source code for all modules it contains, plus any associated
interface definition files, plus the scripts used to control
@ -83,7 +83,7 @@ covered only if its contents constitute a work based on the Library
it). Whether that is true depends on what the Library does and what
the program that uses the Library does.
1. You may copy and distribute verbatim copies of the Librarys
1. You may copy and distribute verbatim copies of the Library's
complete source code as you receive it, in any medium, provided that
you conspicuously and appropriately publish on each copy an
appropriate copyright notice and disclaimer of warranty; keep intact
@ -170,17 +170,17 @@ source along with the object code.
5. A program that contains no derivative of any portion of the
Library, but is designed to work with the Library by being compiled or
linked with it, is called a “work that uses the Library”. Such a work,
linked with it, is called a "work that uses the Library". Such a work,
in isolation, is not a derivative work of the Library, and therefore
falls outside the scope of this License.
However, linking a “work that uses the Library” with the Library
However, linking a "work that uses the Library" with the Library
creates an executable that is a derivative of the Library (because it
contains portions of the Library), rather than a work that uses the
library. The executable is therefore covered by this License. Section
contains portions of the Library), rather than a "work that uses the
library". The executable is therefore covered by this License. Section
6 states terms for distribution of such executables.
When a “work that uses the Library” uses material from a header file
When a "work that uses the Library" uses material from a header file
that is part of the Library, the object code for the work may be a
derivative work of the Library even though the source code is
not. Whether this is true is especially significant if the work can be
@ -200,10 +200,10 @@ distribute the object code for the work under the terms of Section
whether or not they are linked directly with the Library itself.
6. As an exception to the Sections above, you may also combine or link
a “work that uses the Library” with the Library to produce a work
a "work that uses the Library" with the Library to produce a work
containing portions of the Library, and distribute that work under
terms of your choice, provided that the terms permit modification of
the work for the customers own use and reverse engineering for
the work for the customer's own use and reverse engineering for
debugging such modifications.
You must give prominent notice with each copy of the work that the
@ -218,7 +218,7 @@ a) Accompany the work with the complete corresponding machine-readable
source code for the Library including whatever changes were used in
the work (which must be distributed under Sections 1 and 2 above);
and, if the work is an executable linked with the Library, with the
complete machine-readable “work that uses the Library”, as object code
complete machine-readable "work that uses the Library", as object code
and/or source code, so that the user can modify the Library and then
relink to produce a modified executable containing the modified
Library. (It is understood that the user who changes the contents of
@ -227,7 +227,7 @@ recompile the application to use the modified definitions.)
b) Use a suitable shared library mechanism for linking with the
Library. A suitable mechanism is one that (1) uses at run time a copy
of the library already present on the users computer system, rather
of the library already present on the user's computer system, rather
than copying library functions into the executable, and (2) will
operate properly with a modified version of the library, if the user
installs one, as long as the modified version is interface-compatible
@ -245,8 +245,8 @@ specified materials from the same place.
e) Verify that the user has already received a copy of these materials
or that you have already sent this user a copy.
For an executable, the required form of the work that uses the
Library must include any data and utility programs needed for
For an executable, the required form of the "work that uses the
Library" must include any data and utility programs needed for
reproducing the executable from it. However, as a special exception,
the materials to be distributed need not include anything that is
normally distributed (in either source or binary form) with the major
@ -296,7 +296,7 @@ the Library or works based on it.
Library), the recipient automatically receives a license from the
original licensor to copy, distribute, link with or modify the Library
subject to these terms and conditions. You may not impose any further
restrictions on the recipients exercise of the rights granted
restrictions on the recipients' exercise of the rights granted
herein. You are not responsible for enforcing compliance by third
parties with this License.
@ -347,7 +347,7 @@ differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Library
specifies a version number of this License which applies to it and
“any later version”, you have the option of following the terms and
"any later version", you have the option of following the terms and
conditions either of that version or of any later version published by
the Free Software Foundation. If the Library does not specify a
license version number, you may choose any version ever published by
@ -367,7 +367,7 @@ NO WARRANTY
15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT
WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER
PARTIES PROVIDE THE LIBRARY “AS IS” WITHOUT WARRANTY OF ANY KIND,
PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND,
EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE

View File

@ -45,6 +45,7 @@ people:
* Matt Legendre
* Greg Lee
* Adam Moody
* Bob Robey
Release
----------------

View File

@ -96,7 +96,7 @@ if args.mock:
# If the user asked for it, don't check ssl certs.
if args.insecure:
tty.warn("You asked for --insecure, which does not check SSL certificates.")
tty.warn("You asked for --insecure, which does not check SSL certificates or checksums.")
spack.curl.add_default_arg('-k')
# Try to load the particular command asked for and run it

View File

@ -1,2 +1,3 @@
package_list.rst
spack*.rst
_build

View File

@ -21,6 +21,12 @@ I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
all: html
#
# This autogenerates a package list.
#
package_list:
spack info -r > package_list.rst
#
# This creates a git repository and commits generated html docs.
# It then pushes the new branch into THIS repository as gh-pages.
@ -36,12 +42,14 @@ gh-pages: _build/html
touch .nojekyll && \
git init && \
git add . && \
git commit -m "Initial commit" && \
git commit -m "Spack Documentation" && \
git push -f $$root master:gh-pages && \
rm -rf .git
upload:
rsync -avz --rsh=ssh --delete _build/html/ cab:/usr/global/web-pages/lc/www/adept/docs/spack
git push -f origin gh-pages
git push -f github gh-pages
apidoc:
sphinx-apidoc -T -o . $(PYTHONPATH)/spack
@ -69,9 +77,10 @@ help:
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
clean:
-rm -f package_list.rst
-rm -rf $(BUILDDIR)/* $(APIDOC_FILES)
html: apidoc
html: apidoc package_list
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

View File

@ -13,7 +13,7 @@
<hr/>
<p>
&copy; Copyright 2013,
&copy; Copyright 2013-2014,
<a href="https://scalability.llnl.gov/">Lawrence Livermore National Laboratory</a>.
<br/>
Written by Todd Gamblin, <a href="mailto:tgamblin@llnl.gov">tgamblin@llnl.gov</a>, LLNL-CODE-647188

View File

@ -10,25 +10,6 @@ Only a small subset of commands are needed for typical usage.
This section covers a small set of subcommands that should cover most
general use cases for Spack.
Getting Help
-----------------------
``spack help``
~~~~~~~~~~~~~~~~~~~~~~
The ``help`` subcommand will print out a list of all of
``spack``'s options and subcommands:
.. command-output:: spack help
Adding an argument, e.g. ``spack help <subcommand>``, will print out
usage information for a particular subcommand:
.. command-output:: spack help install
Alternately, you can use ``spack -h`` in place of ``spack help``, or
``spack <subcommand> -h`` to get help on a particular subcommand.
Listing available packages
------------------------------
@ -44,7 +25,12 @@ Spack can install:
.. command-output:: spack list
The packages are listed by name in alphabetical order.
The packages are listed by name in alphabetical order. You can also
do wildcard searches using ``*``:
.. command-output:: spack list m*
.. command-output:: spack list *util*
``spack info``
@ -327,19 +313,19 @@ completely remove the directory in which the package was installed.
spack uninstall mpich
If there are still installed packages that depend on the package to be
uninstalled, spack will issue a warning. In general, it is safer to
remove dependent packages *before* removing their dependencies. Not
doing so risks breaking packages on your system. To remove a package
without regard for its dependencies, run ``spack uninstall -f
<package>`` to override the warning.
uninstalled, spack will refuse to uninstall. If you know what you're
doing, you can override this with ``spack uninstall -f <package>``.
However, running this risks breaking other installed packages. In
general, it is safer to remove dependent packages *before* removing
their dependencies.
A line like ``spack uninstall mpich`` may be ambiguous, if multiple
``mpich`` configurations are installed. For example, if both
``mpich@3.0.2`` and ``mpich@3.1`` are installed, it could refer to
either one, and Spack cannot determine which one to uninstall. Spack
will ask you to provide a version number to remove any ambiguity. For
example, ``spack uninstall mpich@3.1`` is unambiguous in the
above scenario.
will ask you to provide a version number to remove the ambiguity. For
example, ``spack uninstall mpich@3.1`` is unambiguous in the above
scenario.
.. _sec-specs:
@ -657,3 +643,236 @@ add a version specifier to the spec:
Notice that the package versions that provide insufficient MPI
versions are now filtered out.
.. _shell-support:
Environment Modules
-------------------------------
.. note::
Environment module support is currently experimental and should not
be considered a stable feature of Spack. In particular, the
interface and/or generated module names may change in future
versions.
Spack provides some limited integration with environment module
systems to make it easier to use the packages it provides.
You can enable shell support by sourcing some files in the
``/share/spack`` directory.
For ``bash`` or ``ksh``, run:
.. code-block:: sh
. $SPACK_ROOT/share/spack/setup-env.sh
For ``csh`` and ``tcsh`` run:
.. code-block:: csh
setenv SPACK_ROOT /path/to/spack
source $SPACK_ROOT/share/spack/setup-env.csh
You can put the above code in your ``.bashrc`` or ``.cshrc``, and
Spack's shell support will be available on the command line.
-------------------------------
When you install a package with Spack, it automatically generates an
environment module that lets you add the package to your environment.
Currently, Spack supports the generation of `TCL Modules
<http://wiki.tcl.tk/12999>`_ and `Dotkit
<https://computing.llnl.gov/?set=jobs&page=dotkit>`_. Generated
module files for each of these systems can be found in these
directories:
* ``$SPACK_ROOT/share/spack/modules``
* ``$SPACK_ROOT/share/spack/dotkit``
The directories are automatically added to your ``MODULEPATH`` and
``DK_NODE`` environment variables when you enable Spack's `shell
support <shell-support_>`_.
Using Modules & Dotkits
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
If you have shell support enabled you should be able to run either
``module avail`` or ``use -l spack`` to see what modules/dotkits have
been installed. Here is sample output of those programs, showing lots
of installed packages.
.. code-block:: sh
$ module avail
------- /g/g21/gamblin2/src/spack/share/spack/modules/chaos_5_x86_64_ib --------
adept-utils@1.0%gcc@4.4.7-5adef8da libelf@0.8.13%gcc@4.4.7
automaded@1.0%gcc@4.4.7-d9691bb0 libelf@0.8.13%intel@15.0.0
boost@1.55.0%gcc@4.4.7 mpc@1.0.2%gcc@4.4.7-559607f5
callpath@1.0.1%gcc@4.4.7-5dce4318 mpfr@3.1.2%gcc@4.4.7
dyninst@8.1.2%gcc@4.4.7-b040c20e mpich@3.0.4%gcc@4.4.7
gcc@4.9.1%gcc@4.4.7-93ab98c5 mpich@3.0.4%gcc@4.9.0
gmp@6.0.0a%gcc@4.4.7 mrnet@4.1.0%gcc@4.4.7-72b7881d
graphlib@2.0.0%gcc@4.4.7 netgauge@2.4.6%gcc@4.9.0-27912b7b
launchmon@1.0.1%gcc@4.4.7 stat@2.1.0%gcc@4.4.7-51101207
libNBC@1.1.1%gcc@4.9.0-27912b7b sundials@2.5.0%gcc@4.9.0-27912b7b
libdwarf@20130729%gcc@4.4.7-b52fac98
.. code-block:: sh
$ use -l spack
spack ----------
adept-utils@1.0%gcc@4.4.7-5adef8da - adept-utils @1.0
automaded@1.0%gcc@4.4.7-d9691bb0 - automaded @1.0
boost@1.55.0%gcc@4.4.7 - boost @1.55.0
callpath@1.0.1%gcc@4.4.7-5dce4318 - callpath @1.0.1
dyninst@8.1.2%gcc@4.4.7-b040c20e - dyninst @8.1.2
gmp@6.0.0a%gcc@4.4.7 - gmp @6.0.0a
libNBC@1.1.1%gcc@4.9.0-27912b7b - libNBC @1.1.1
libdwarf@20130729%gcc@4.4.7-b52fac98 - libdwarf @20130729
libelf@0.8.13%gcc@4.4.7 - libelf @0.8.13
libelf@0.8.13%intel@15.0.0 - libelf @0.8.13
mpc@1.0.2%gcc@4.4.7-559607f5 - mpc @1.0.2
mpfr@3.1.2%gcc@4.4.7 - mpfr @3.1.2
mpich@3.0.4%gcc@4.4.7 - mpich @3.0.4
mpich@3.0.4%gcc@4.9.0 - mpich @3.0.4
netgauge@2.4.6%gcc@4.9.0-27912b7b - netgauge @2.4.6
sundials@2.5.0%gcc@4.9.0-27912b7b - sundials @2.5.0
The names here should look familiar; they're the same ones from
``spack find``. You *can* use the names here directly. For example,
you could type either of these commands to load the callpath module
(assuming dotkit and modules are installed):
.. code-block:: sh
use callpath@1.0.1%gcc@4.4.7-5dce4318
.. code-block:: sh
module load callpath@1.0.1%gcc@4.4.7-5dce4318
Neither of these is particularly pretty, easy to remember, or
easy to type. Luckily, Spack has its own interface for using modules
and dotkits. You can use the same spec syntax you're used to:
========================= ==========================
Modules Dotkit
========================= ==========================
``spack load <spec>`` ``spack use <spec>``
``spack unload <spec>`` ``spack unuse <spec>``
========================= ==========================
And you can use the same shortened names you use everywhere else in
Spack. For example, this will add the ``mpich`` package built with
``gcc`` to your path:
.. code-block:: sh
$ spack install mpich %gcc@4.4.7
# ... wait for install ...
$ spack use mpich %gcc@4.4.7
Prepending: mpich@3.0.4%gcc@4.4.7 (ok)
$ which mpicc
~/src/spack/opt/chaos_5_x86_64_ib/gcc@4.4.7/mpich@3.0.4/bin/mpicc
Or, similarly with modules, you could type:
.. code-block:: sh
$ spack load mpich %gcc@4.4.7
These commands will add appropriate directories to your ``PATH``,
``MANPATH``, and ``LD_LIBRARY_PATH``. When you no longer want to use
a package, you can type unload or unuse similarly:
.. code-block:: sh
$ spack unload mpich %gcc@4.4.7 # modules
$ spack unuse mpich %gcc@4.4.7 # dotkit
.. note::
These ``use``, ``unuse``, ``load``, and ``unload`` subcommands are
only available if you have enabled Spack's shell support *and* you
have dotkit or modules installed on your machine.
Ambiguous module names
~~~~~~~~~~~~~~~~~~~~~~~~
If a spec used with load/unload or use/unuse is ambiguous (i.e. more
than one installed package matches it), then Spack will warn you:
.. code-block:: sh
$ spack load libelf
==> Error: Multiple matches for spec libelf. Choose one:
libelf@0.8.13%gcc@4.4.7=chaos_5_x86_64_ib
libelf@0.8.13%intel@15.0.0=chaos_5_x86_64_ib
You can either type the ``spack load`` command again with a fully
qualified argument, or you can add just enough extra constraints to
identify one package. For example, above, the key differentiator is
that one ``libelf`` is built with the Intel compiler, while the other
used ``gcc``. You could therefore just type:
.. code-block:: sh
$ spack load libelf %intel
To identify just the one built with the Intel compiler.
Regenerating Module files
~~~~~~~~~~~~~~~~~~~~~~~~~~~
Module and dotkit files are generated when packages are installed, and
are placed in the following directories under the Spack root:
* ``$SPACK_ROOT/share/spack/modules``
* ``$SPACK_ROOT/share/spack/dotkit``
Sometimes you may need to regenerate the module files. For example,
if newer, fancier module support is added to Spack at some later date,
you may want to regenerate all the modules to take advantage of these
new features.
``spack module refresh``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Running ``spack module refresh`` will remove the
``share/spack/modules`` and ``share/spack/dotkit`` directories, then
regenerate all module and dotkit files from scratch:
.. code-block:: sh
$ spack module refresh
==> Regenerating tcl module files.
==> Regenerating dotkit module files.
Getting Help
-----------------------
``spack help``
~~~~~~~~~~~~~~~~~~~~~~
If you don't find what you need here, the ``help`` subcommand will
print out a list of *all* of ``spack``'s options and subcommands:
.. command-output:: spack help
Adding an argument, e.g. ``spack help <subcommand>``, will print out
usage information for a particular subcommand:
.. command-output:: spack help install
Alternately, you can use ``spack -h`` in place of ``spack help``, or
``spack <subcommand> -h`` to get help on a particular subcommand.

View File

@ -90,7 +90,7 @@
# General information about the project.
project = u'Spack'
copyright = u'2013, Lawrence Livermore National Laboratory'
copyright = u'2013-2014, Lawrence Livermore National Laboratory'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the

View File

@ -4,7 +4,7 @@ Developer Guide
=====================
This guide is intended for people who want to work on Spack itself.
If you just want to develop pacakges, see the :ref:`packaging-guide`.
If you just want to develop packages, see the :ref:`packaging-guide`.
It is assumed that you've read the :ref:`basic-usage` and
:ref:`packaging-guide` sections, and that you're familiar with the

View File

@ -93,7 +93,7 @@ creates a simple python file:
homepage = "http://www.example.com/"
url = "http://www.mr511.de/software/libelf-0.8.13.tar.gz"
versions = { '0.8.13' : '4136d7b4c04df68b686570afa26988ac' }
version('0.8.13', '4136d7b4c04df68b686570afa26988ac')
def install(self, prefix):
configure("--prefix=%s" % prefix)

View File

@ -48,6 +48,7 @@ Table of Contents
packaging_guide
site_configuration
developer_guide
package_list
API Docs <spack>
Indices and tables

File diff suppressed because it is too large

View File

@ -108,6 +108,8 @@
import sys as _sys
import textwrap as _textwrap
from llnl.util.tty.colify import colified
from gettext import gettext as _
try:
@ -2285,8 +2287,8 @@ def _get_value(self, action, arg_string):
def _check_value(self, action, value):
# converted value must be one of the choices (if specified)
if action.choices is not None and value not in action.choices:
tup = value, ', '.join(map(repr, action.choices))
msg = _('invalid choice: %r (choose from %s)') % tup
cols = colified(sorted(action.choices), indent=4, tty=True)
msg = _('invalid choice: %r choose from:\n%s') % (value, cols)
raise ArgumentError(action, msg)
# =======================

View File

@ -38,7 +38,7 @@
from spack.util.compression import ALLOWED_ARCHIVE_TYPES
def filter_file(regex, repl, *filenames):
def filter_file(regex, repl, *filenames, **kwargs):
"""Like sed, but uses python regular expressions.
Filters every line of file through regex and replaces the file
@ -49,16 +49,31 @@ def filter_file(regex, repl, *filenames):
return a suitable replacement string. If it is a string, it
can contain ``\1``, ``\2``, etc. to represent back-substitution
as sed would allow.
Keyword Options:
string[=False] If True, treat regex as a plain string.
backup[=True] Make a backup file suffixed with ~
ignore_absent[=False] Ignore any files that don't exist.
"""
# Keep callables intact
if not hasattr(repl, '__call__'):
# Allow strings to use \1, \2, etc. for replacement, like sed
string = kwargs.get('string', False)
backup = kwargs.get('backup', True)
ignore_absent = kwargs.get('ignore_absent', False)
# Allow strings to use \1, \2, etc. for replacement, like sed
if not callable(repl):
unescaped = repl.replace(r'\\', '\\')
repl = lambda m: re.sub(
r'\\([0-9])', lambda x: m.group(int(x.group(1))), unescaped)
if string:
regex = re.escape(regex)
for filename in filenames:
backup = filename + "~"
if ignore_absent and not os.path.exists(filename):
continue
shutil.copy(filename, backup)
try:
with closing(open(backup)) as infile:
@ -71,6 +86,10 @@ def filter_file(regex, repl, *filenames):
shutil.move(backup, filename)
raise
finally:
if not backup:
shutil.rmtree(backup, ignore_errors=True)
def change_sed_delimiter(old_delim, new_delim, *filenames):
"""Find all sed search/replace commands and change the delimiter.
@ -125,6 +144,7 @@ def expand_user(path):
def mkdirp(*paths):
"""Creates a directory, as well as parent directories if needed."""
for path in paths:
if not os.path.exists(path):
os.makedirs(path)
@ -144,6 +164,7 @@ def working_dir(dirname, **kwargs):
def touch(path):
"""Creates an empty file at the specified path."""
with closing(open(path, 'a')) as file:
os.utime(path, None)
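Taken together, the new keyword options give `filter_file` sed-like behavior with a few escape hatches. A minimal, self-contained sketch of the intended semantics (an illustration of the options documented above, not Spack's verbatim implementation — it uses `re.sub`'s native backreference handling rather than the lambda conversion in the diff):

```python
import os
import re
import shutil


def filter_file(regex, repl, *filenames, **kwargs):
    """Sed-like in-place filtering, sketching the new keyword options:
    string=True treats `regex` as a literal string, backup=False drops
    the "~" backup file, and ignore_absent=True skips missing files."""
    string = kwargs.get('string', False)
    backup = kwargs.get('backup', True)
    ignore_absent = kwargs.get('ignore_absent', False)

    if string:
        regex = re.escape(regex)

    for filename in filenames:
        if ignore_absent and not os.path.exists(filename):
            continue
        # copy aside, then rewrite the original line by line
        backup_name = filename + "~"
        shutil.copy(filename, backup_name)
        with open(backup_name) as infile, open(filename, 'w') as outfile:
            for line in infile:
                outfile.write(re.sub(regex, repl, line))
        if not backup:
            os.remove(backup_name)
```

With `backup=True` (the default) the original content survives in `filename~`, which is what lets an error path restore the file, as the `try`/`finally` in the diff does.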

View File

@ -37,9 +37,11 @@
import fcntl
import termios
import struct
from StringIO import StringIO
from llnl.util.tty import terminal_size
class ColumnConfig:
def __init__(self, cols):
self.cols = cols
@ -102,16 +104,20 @@ def colify(elts, **options):
output = options.get("output", sys.stdout)
indent = options.get("indent", 0)
padding = options.get("padding", 2)
tty = options.get('tty', None)
# elts needs to be an array of strings so we can count the elements
elts = [str(elt) for elt in elts]
if not elts:
return
return (0, ())
if not isatty(output):
for elt in elts:
output.write("%s\n" % elt)
return
if not tty:
if tty is False or not isatty(output):
for elt in elts:
output.write("%s\n" % elt)
maxlen = max(len(str(s)) for s in elts)
return (1, (maxlen,))
console_cols = options.get("cols", None)
if not console_cols:
@ -146,6 +152,17 @@ def colify(elts, **options):
if row == rows_last_col:
cols -= 1
return (config.cols, tuple(config.widths))
def colified(elts, **options):
"""Invokes the colify() function but returns the result as a string
instead of writing it to an output stream."""
sio = StringIO()
options['output'] = sio
colify(elts, **options)
return sio.getvalue()
if __name__ == "__main__":
import optparse
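The pattern behind the new `colified()` helper is generic: point a stream-writing function at a `StringIO` buffer and return the buffer's value. A standalone sketch, with a deliberately toy column printer standing in for `colify()` (all names here are illustrative, and `io.StringIO` replaces the Python 2 `StringIO` module the diff imports):

```python
import sys
from io import StringIO  # the 2014-era code uses Python 2's StringIO module


def columnize(elts, cols=2, width=10, output=sys.stdout):
    """Toy stand-in for colify(): fixed-width columns, row-major order."""
    for i in range(0, len(elts), cols):
        row = elts[i:i + cols]
        output.write("".join(e.ljust(width) for e in row).rstrip() + "\n")


def columnized(elts, **options):
    """Same trick as the new colified(): redirect the output stream
    into a string buffer and hand back its contents."""
    sio = StringIO()
    options['output'] = sio
    columnize(elts, **options)
    return sio.getvalue()
```

This is why `colify()` grew an explicit `tty` option in the same commit: when writing into a buffer there is no terminal to inspect, so the caller has to say whether columnar layout is wanted.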

View File

@ -137,9 +137,9 @@
# TODO: it's not clear where all the stuff that needs to be included in packages
# should live. This file is overloaded for spack core vs. for packages.
#
__all__ = ['Package', 'Version', 'when']
__all__ = ['Package', 'Version', 'when', 'ver']
from spack.package import Package
from spack.version import Version
from spack.version import Version, ver
from spack.multimethod import when
import llnl.util.filesystem

View File

@ -96,7 +96,7 @@ def checksum(parser, args):
if not versions:
tty.die("Could not fetch any available versions for %s." % pkg.name)
versions = list(reversed(versions))
versions = list(reversed(sorted(versions)))
urls = [pkg.url_for_version(v) for v in versions]
@ -117,7 +117,5 @@ def checksum(parser, args):
if not version_hashes:
tty.die("Could not fetch any available versions for %s." % pkg.name)
dict_string = [" '%s' : '%s'," % (v, h) for v, h in version_hashes]
dict_string = ['{'] + dict_string + ["}"]
tty.msg("Checksummed new versions of %s:" % pkg.name, *dict_string)
version_lines = [" version('%s', '%s')" % (v, h) for v, h in version_hashes]
tty.msg("Checksummed new versions of %s:" % pkg.name, *version_lines)
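The `reversed(sorted(versions))` fix matters because fetched version lists arrive in page order, not version order, and naive string sorting misorders dotted versions. A rough sketch of the idea (Spack's real `Version` class implements the comparison; the numeric key below is a simplification that assumes purely dotted-integer versions):

```python
def version_key(v):
    # crude dotted-version sort key; Spack's Version class handles
    # richer cases (letter suffixes, mixed separators, etc.)
    return tuple(int(part) for part in v.split('.'))


def newest_first(versions):
    """Sort versions ascending, then flip so the newest comes first."""
    return list(reversed(sorted(versions, key=version_key)))
```

Note that a plain lexicographic sort would rank `0.8.9` above `0.8.13`, which is exactly the kind of mistake the numeric key avoids.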

View File

@ -52,7 +52,15 @@ def clean(parser, args):
package = spack.db.get(spec)
if args.dist:
package.do_clean_dist()
tty.msg("Cleaned %s" % package.name)
elif args.work:
package.do_clean_work()
tty.msg("Restaged %s" % package.name)
else:
package.do_clean()
try:
package.do_clean()
except subprocess.CalledProcessError, e:
tty.warn("Warning: 'make clean' didn't work. Consider 'spack clean --work'.")
tty.msg("Made clean for %s" % package.name)

View File

@ -72,6 +72,9 @@ class ${class_name}(Package):
${versions}
# FIXME: Add dependencies if this package requires them.
# depends_on("foo")
def install(self, spec, prefix):
# FIXME: Modify the configure line to suit your build system here.
${configure}

View File

@ -27,9 +27,10 @@
from contextlib import closing
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
from llnl.util.filesystem import mkdirp, join_path
import spack
import spack.cmd
from spack.util.naming import mod_to_class
description = "Open package files in $EDITOR"
@ -57,6 +58,9 @@ def setup_parser(subparser):
subparser.add_argument(
'-f', '--force', dest='force', action='store_true',
help="Open a new file in $EDITOR even if package doesn't exist.")
subparser.add_argument(
'-c', '--command', dest='edit_command', action='store_true',
help="Edit the command with the supplied name instead of a package.")
subparser.add_argument(
'name', nargs='?', default=None, help="name of package to edit")
@ -64,25 +68,34 @@ def setup_parser(subparser):
def edit(parser, args):
name = args.name
# By default open the directory where packages live.
if not name:
path = spack.packages_path
else:
path = spack.db.filename_for_package_name(name)
if os.path.exists(path):
if not os.path.isfile(path):
tty.die("Something's wrong. '%s' is not a file!" % path)
if not os.access(path, os.R_OK|os.W_OK):
tty.die("Insufficient permissions on '%s'!" % path)
elif not args.force:
tty.die("No package '%s'. Use spack create, or supply -f/--force "
"to edit a new file." % name)
if args.edit_command:
if not name:
path = spack.cmd.command_path
else:
mkdirp(os.path.dirname(path))
with closing(open(path, "w")) as pkg_file:
pkg_file.write(
package_template.substitute(name=name, class_name=mod_to_class(name)))
path = join_path(spack.cmd.command_path, name + ".py")
if not os.path.exists(path):
tty.die("No command named '%s'." % name)
else:
# By default open the directory where packages or commands live.
if not name:
path = spack.packages_path
else:
path = spack.db.filename_for_package_name(name)
if os.path.exists(path):
if not os.path.isfile(path):
tty.die("Something's wrong. '%s' is not a file!" % path)
if not os.access(path, os.R_OK|os.W_OK):
tty.die("Insufficient permissions on '%s'!" % path)
elif not args.force:
tty.die("No package '%s'. Use spack create, or supply -f/--force "
"to edit a new file." % name)
else:
mkdirp(os.path.dirname(path))
with closing(open(path, "w")) as pkg_file:
pkg_file.write(
package_template.substitute(name=name, class_name=mod_to_class(name)))
# If everything checks out, go ahead and edit.
spack.editor(path)

View File

@ -24,52 +24,137 @@
##############################################################################
import re
import textwrap
from llnl.util.tty.colify import colify
from StringIO import StringIO
from llnl.util.tty.colify import *
import spack
import spack.fetch_strategy as fs
description = "Get detailed information on a particular package"
def setup_parser(subparser):
subparser.add_argument('name', metavar="PACKAGE", help="name of packages to get info on")
subparser.add_argument('-r', '--rst', action='store_true',
help="List all packages in reStructured text, for docs.")
subparser.add_argument('name', metavar="PACKAGE", nargs='?', help="name of packages to get info on")
def info(parser, args):
package = spack.db.get(args.name)
print "Package: ", package.name
print "Homepage: ", package.homepage
def format_doc(pkg, **kwargs):
"""Wrap doc string at 72 characters and format nicely"""
indent = kwargs.get('indent', 0)
if not pkg.__doc__:
return ""
doc = re.sub(r'\s+', ' ', pkg.__doc__)
lines = textwrap.wrap(doc, 72)
results = StringIO()
for line in lines:
results.write((" " * indent) + line + "\n")
return results.getvalue()
def github_url(pkg):
"""Link to a package file on github."""
return ("https://github.com/scalability-llnl/spack/blob/master/var/spack/packages/%s/package.py" %
pkg.name)
def rst_table(elts):
"""Print out a RST-style table."""
cols = StringIO()
ncol, widths = colify(elts, output=cols, tty=True)
header = " ".join("=" * (w-1) for w in widths)
return "%s\n%s%s" % (header, cols.getvalue(), header)
def info_rst():
"""Print out information on all packages in restructured text."""
pkgs = sorted(spack.db.all_packages(), key=lambda s:s.name.lower())
print "Package List"
print "=================="
print "This is a list of things you can install using Spack. It is"
print "automatically generated based on the packages in the latest Spack"
print "release."
print
print "Spack currently has %d mainline packages:" % len(pkgs)
print
print rst_table("`%s`_" % p.name for p in pkgs)
print
print "-----"
# Output some text for each package.
for pkg in pkgs:
print
print ".. _%s:" % pkg.name
print
print pkg.name
print "-" * len(pkg.name)
print "Links"
print " * `Homepage <%s>`__" % pkg.homepage
print " * `%s/package.py <%s>`__" % (pkg.name, github_url(pkg))
print
if pkg.versions:
print "Versions:"
print " " + ", ".join(str(v) for v in reversed(sorted(pkg.versions)))
if pkg.dependencies:
print "Dependencies"
print " " + ", ".join("`%s`_" % d if d != "mpi" else d
for d in pkg.dependencies)
print
print "Description"
print format_doc(pkg, indent=2)
print
print "-----"
def info_text(pkg):
"""Print out a plain text description of a package."""
print "Package: ", pkg.name
print "Homepage: ", pkg.homepage
print
print "Safe versions: "
if not package.versions:
if not pkg.versions:
print("None.")
else:
maxlen = max(len(str(v)) for v in package.versions)
maxlen = max(len(str(v)) for v in pkg.versions)
fmt = "%%-%ss" % maxlen
for v in reversed(sorted(package.versions)):
print " " + (fmt % v) + " " + package.url_for_version(v)
for v in reversed(sorted(pkg.versions)):
f = fs.for_package_version(pkg, v)
print " " + (fmt % v) + " " + str(f)
print
print "Dependencies:"
if package.dependencies:
colify(package.dependencies, indent=4)
if pkg.dependencies:
colify(pkg.dependencies, indent=4)
else:
print " None"
print
print "Virtual packages: "
if package.provided:
for spec, when in package.provided.items():
if pkg.provided:
for spec, when in pkg.provided.items():
print " %s provides %s" % (when, spec)
else:
print " None"
print
print "Description:"
if package.__doc__:
doc = re.sub(r'\s+', ' ', package.__doc__)
lines = textwrap.wrap(doc, 72)
for line in lines:
print " " + line
if pkg.__doc__:
print format_doc(pkg, indent=4)
else:
print " None"
def info(parser, args):
if args.rst:
info_rst()
else:
if not args.name:
tty.die("You must supply a package name.")
pkg = spack.db.get(args.name)
info_text(pkg)


@@ -22,15 +22,44 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys
import llnl.util.tty as tty
from external import argparse
from llnl.util.tty.colify import colify
import spack
import fnmatch
description = "List available spack packages"
def setup_parser(subparser):
pass
subparser.add_argument(
'filter', nargs=argparse.REMAINDER,
help='Optional glob patterns to filter results.')
subparser.add_argument(
'-i', '--insensitive', action='store_true', default=False,
help='Filtering will be case insensitive.')
def list(parser, args):
# Start with all package names.
pkgs = spack.db.all_package_names()
# filter if a filter arg was provided
if args.filter:
def match(p, f):
if args.insensitive:
p = p.lower()
f = f.lower()
return fnmatch.fnmatchcase(p, f)
pkgs = [p for p in pkgs if any(match(p, f) for f in args.filter)]
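The glob filtering above can be exercised in isolation; the package names here are illustrative:

```python
import fnmatch

def filter_names(names, patterns, insensitive=False):
    """Keep names matching any glob pattern, optionally case-insensitively."""
    def match(name, pat):
        if insensitive:
            name, pat = name.lower(), pat.lower()
        return fnmatch.fnmatchcase(name, pat)
    return [n for n in names if any(match(n, p) for p in patterns)]

pkgs = ["mpich", "openmpi", "Libelf", "zlib"]
print(filter_names(pkgs, ["*mpi*"]))               # mpich and openmpi
print(filter_names(pkgs, ["lib*"], insensitive=True))  # Libelf only
```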
# sort before displaying.
sorted_packages = sorted(pkgs, key=lambda s:s.lower())
# Print all the package names in columns
colify(spack.db.all_package_names())
indent=0
if sys.stdout.isatty():
tty.msg("%d packages." % len(sorted_packages))
indent=2
colify(sorted_packages, indent=indent)


@@ -48,11 +48,14 @@ def setup_parser(subparser):
directories.add_argument(
'-p', '--package-dir', action='store_true',
help="Directory enclosing a spec's package.py file.")
directories.add_argument(
'-P', '--packages', action='store_true',
help="Top-level packages directory for Spack.")
directories.add_argument(
'-s', '--stage-dir', action='store_true', help="Stage directory for a spec.")
directories.add_argument(
'-b', '--build-dir', action='store_true',
help="Expanded archive directory for a spec (requires it to be staged first).")
help="Checked out or expanded source directory for a spec (requires it to be staged first).")
subparser.add_argument(
'spec', nargs=argparse.REMAINDER, help="spec of package to fetch directory for.")
@@ -65,29 +68,47 @@ def location(parser, args):
elif args.spack_root:
print spack.prefix
elif args.packages:
print spack.db.root
else:
specs = spack.cmd.parse_specs(args.spec, concretize=True)
specs = spack.cmd.parse_specs(args.spec)
if not specs:
tty.die("You must supply a spec.")
if len(specs) != 1:
tty.die("Too many specs. Need only one.")
tty.die("Too many specs. Supply only one.")
spec = specs[0]
if args.install_dir:
print spec.prefix
# install_dir command matches against installed specs.
matching_specs = spack.db.get_installed(spec)
if not matching_specs:
tty.die("Spec '%s' matches no installed packages." % spec)
elif len(matching_specs) > 1:
args = ["%s matches multiple packages." % spec,
"Matching packages:"]
args += [" " + str(s) for s in matching_specs]
args += ["Use a more specific spec."]
tty.die(*args)
print matching_specs[0].prefix
elif args.package_dir:
# This one just needs the spec name.
print join_path(spack.db.root, spec.name)
else:
# These versions need concretized specs.
spec.concretize()
pkg = spack.db.get(spec)
if args.stage_dir:
print pkg.stage.path
else: # args.build_dir is the default.
if not os.listdir(pkg.stage.path):
if not pkg.stage.source_path:
tty.die("Build directory does not exist yet. Run this to create it:",
"spack stage " + " ".join(args.spec))
print pkg.stage.expanded_archive_path
print pkg.stage.source_path


@@ -0,0 +1,52 @@
##############################################################################
# Copyright (c) 2013-2014, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import hashlib
from external import argparse
import llnl.util.tty as tty
from llnl.util.filesystem import *
import spack.util.crypto
description = "Calculate md5 checksums for files."
def setup_parser(subparser):
setup_parser.parser = subparser
subparser.add_argument('files', nargs=argparse.REMAINDER,
help="Files to checksum.")
def md5(parser, args):
if not args.files:
setup_parser.parser.print_help()
for f in args.files:
if not os.path.isfile(f):
tty.die("Not a file: %s" % f)
if not can_access(f):
tty.die("Cannot read file: %s" % f)
checksum = spack.util.crypto.checksum(hashlib.md5, f)
print "%s %s" % (checksum, f)


@@ -23,23 +23,19 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import shutil
import sys
from datetime import datetime
from contextlib import closing
from external import argparse
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
from llnl.util.filesystem import mkdirp, join_path
import spack
import spack.cmd
import spack.config
import spack.mirror
from spack.spec import Spec
from spack.error import SpackError
from spack.stage import Stage
from spack.util.compression import extension
description = "Manage mirrors."
@@ -58,6 +54,9 @@ def setup_parser(subparser):
'specs', nargs=argparse.REMAINDER, help="Specs of packages to put in mirror")
create_parser.add_argument(
'-f', '--file', help="File with specs of packages to put in mirror.")
create_parser.add_argument(
'-o', '--one-version-per-spec', action='store_const', const=1, default=0,
help="Only fetch one 'preferred' version per spec, not all known versions.")
add_parser = sp.add_parser('add', help=mirror_add.__doc__)
add_parser.add_argument('name', help="Mnemonic name for mirror.")
@@ -72,8 +71,12 @@ def setup_parser(subparser):
def mirror_add(args):
"""Add a mirror to Spack."""
url = args.url
if url.startswith('/'):
url = 'file://' + url
config = spack.config.get_config('user')
config.set_value('mirror', args.name, 'url', args.url)
config.set_value('mirror', args.name, 'url', url)
config.write()
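The local-path normalization added to mirror_add amounts to:

```python
def normalize_mirror_url(url):
    """Prefix absolute local paths with file:// so they parse as URLs."""
    if url.startswith('/'):
        return 'file://' + url
    return url

print(normalize_mirror_url('/data/spack-mirror'))
print(normalize_mirror_url('http://example.com/mirror'))
```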
@@ -105,112 +108,63 @@ def mirror_list(args):
print fmt % (name, val)
def _read_specs_from_file(filename):
specs = []
with closing(open(filename, "r")) as stream:
for i, string in enumerate(stream):
try:
s = Spec(string)
s.package
specs.append(s)
except SpackError, e:
tty.die("Parse error in %s, line %d:" % (filename, i+1),
">>> " + string, str(e))
return specs
def mirror_create(args):
"""Create a directory to be used as a spack mirror, and fill it with
package archives."""
# try to parse specs from the command line first.
args.specs = spack.cmd.parse_specs(args.specs)
specs = spack.cmd.parse_specs(args.specs)
# If there is a file, parse each line as a spec and add it to the list.
if args.file:
with closing(open(args.file, "r")) as stream:
for i, string in enumerate(stream):
try:
s = Spec(string)
s.package
args.specs.append(s)
except SpackError, e:
tty.die("Parse error in %s, line %d:" % (args.file, i+1),
">>> " + string, str(e))
if specs:
tty.die("Cannot pass specs on the command line with --file.")
specs = _read_specs_from_file(args.file)
if not args.specs:
args.specs = [Spec(n) for n in spack.db.all_package_names()]
# If nothing is passed, use all packages.
if not specs:
specs = [Spec(n) for n in spack.db.all_package_names()]
specs.sort(key=lambda s: s.format("$_$@").lower())
# Default name for directory is spack-mirror-<DATESTAMP>
if not args.directory:
directory = args.directory
if not directory:
timestamp = datetime.now().strftime("%Y-%m-%d")
args.directory = 'spack-mirror-' + timestamp
directory = 'spack-mirror-' + timestamp
# Make sure nothing is in the way.
if os.path.isfile(args.directory):
tty.error("%s already exists and is a file." % args.directory)
existed = False
if os.path.isfile(directory):
tty.error("%s already exists and is a file." % directory)
elif os.path.isdir(directory):
existed = True
# Create a directory if none exists
if not os.path.isdir(args.directory):
mkdirp(args.directory)
tty.msg("Created new mirror in %s" % args.directory)
else:
tty.msg("Adding to existing mirror in %s" % args.directory)
# Actually do the work to create the mirror
present, mirrored, error = spack.mirror.create(
directory, specs, num_versions=args.one_version_per_spec)
p, m, e = len(present), len(mirrored), len(error)
# Things to keep track of while parsing specs.
working_dir = os.getcwd()
num_mirrored = 0
num_error = 0
# Iterate through packages and download all the safe tarballs for each of them
for spec in args.specs:
pkg = spec.package
# Skip any package that has no checksummed versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s."
% pkg.name)
continue
# create a subdir for the current package.
pkg_path = join_path(args.directory, pkg.name)
mkdirp(pkg_path)
# Download all the tarballs using Stages, then move them into place
for version in pkg.versions:
# Skip versions that don't match the spec
vspec = Spec('%s@%s' % (pkg.name, version))
if not vspec.satisfies(spec):
continue
mirror_path = "%s/%s-%s.%s" % (
pkg.name, pkg.name, version, extension(pkg.url))
os.chdir(working_dir)
mirror_file = join_path(args.directory, mirror_path)
if os.path.exists(mirror_file):
tty.msg("Already fetched %s." % mirror_file)
num_mirrored += 1
continue
# Get the URL for the version and set up a stage to download it.
url = pkg.url_for_version(version)
stage = Stage(url)
try:
# fetch changes directory into the stage
stage.fetch()
if not args.no_checksum and version in pkg.versions:
digest = pkg.versions[version]
stage.check(digest)
tty.msg("Checksum passed for %s@%s" % (pkg.name, version))
# change back and move the new archive into place.
os.chdir(working_dir)
shutil.move(stage.archive_file, mirror_file)
tty.msg("Added %s to mirror" % mirror_file)
num_mirrored += 1
except Exception, e:
tty.warn("Error while fetching %s." % url, e.message)
num_error += 1
finally:
stage.destroy()
# If nothing happened, try to say why.
if not num_mirrored:
if num_error:
tty.error("No packages added to mirror.",
"All packages failed to fetch.")
else:
tty.error("No packages added to mirror. No versions matched specs:")
colify(args.specs, indent=4)
verb = "updated" if existed else "created"
tty.msg(
"Successfully %s mirror in %s." % (verb, directory),
"Archive stats:",
" %-4d already present" % p,
" %-4d added" % m,
" %-4d failed to fetch." % e)
if error:
tty.error("Failed downloads:")
colify(s.format("$_$@") for s in error)
def mirror(parser, args):
@@ -218,4 +172,5 @@ def mirror(parser, args):
'add' : mirror_add,
'remove' : mirror_remove,
'list' : mirror_list }
action[args.mirror_command](args)


@@ -190,6 +190,12 @@ def check(key):
except ProcessError, e:
tty.debug("Couldn't get version for compiler %s" % full_path, e)
return None
except Exception, e:
# Catching "Exception" here is fine because it just
# means something went wrong running a candidate executable.
tty.debug("Error while executing candidate compiler %s" % full_path,
"%s: %s" %(e.__class__.__name__, e))
return None
successful = [key for key in parmap(check, checks) if key is not None]
return dict(((v, p, s), path) for v, p, s, path in successful)


@@ -56,7 +56,7 @@ def fc_version(cls, fc):
return get_compiler_version(
fc, '-dumpversion',
# older gfortran versions don't have simple dumpversion output.
r'(?:GNU Fortran \(GCC\))?(\d+\.\d+\.\d+)')
r'(?:GNU Fortran \(GCC\))?(\d+\.\d+(?:\.\d+)?)')
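The effect of widening the version regex, making the patch component optional, can be checked directly (the sample compiler outputs are illustrative):

```python
import re

# Accepts both three-part ("4.4.7") and two-part ("4.9") versions.
pattern = r'(?:GNU Fortran \(GCC\))?(\d+\.\d+(?:\.\d+)?)'

for output in ["GNU Fortran (GCC)4.4.7", "4.9"]:
    print(re.search(pattern, output).group(1))
```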
@classmethod


@@ -68,11 +68,13 @@ def concretize_version(self, spec):
# If there are known available versions, return the most recent
# version that satisfies the spec
pkg = spec.package
valid_versions = pkg.available_versions.intersection(spec.versions)
valid_versions = [v for v in pkg.available_versions
if any(v.satisfies(sv) for sv in spec.versions)]
if valid_versions:
spec.versions = ver([valid_versions[-1]])
else:
raise NoValidVerionError(spec)
raise NoValidVersionError(spec)
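The new version-selection logic can be sketched with a stand-in satisfies() (Spack's real version semantics are richer than this prefix check):

```python
def satisfies(version, constraint):
    # Toy stand-in for Spack's Version.satisfies().
    return version.startswith(constraint)

available = ["1.0.2", "1.1.0", "1.2.1", "2.0.0"]  # sorted ascending
spec_versions = ["1."]

# Keep versions that satisfy any constraint; pick the most recent.
valid = [v for v in available
         if any(satisfies(v, sv) for sv in spec_versions)]
if not valid:
    raise ValueError("No available version matches")
chosen = valid[-1]
print(chosen)
```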
def concretize_architecture(self, spec):
@@ -160,9 +162,9 @@ def __init__(self, compiler_spec):
"Run 'spack compilers' to see available compiler options.")
class NoValidVerionError(spack.error.SpackError):
class NoValidVersionError(spack.error.SpackError):
"""Raised when there is no available version for a package that
satisfies a spec."""
def __init__(self, spec):
super(NoValidVerionError, self).__init__(
super(NoValidVersionError, self).__init__(
"No available version of %s matches '%s'" % (spec.name, spec.versions))


@@ -0,0 +1,650 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
Fetch strategies are used to download source code into a staging area
in order to build it. They need to define the following methods:
* fetch()
This should attempt to download/check out source from somewhere.
* check()
Apply a checksum to the downloaded source code, e.g. for an archive.
May not do anything if the fetch method was safe to begin with.
* expand()
Expand the downloaded file (e.g., an archive) into the source directory.
* reset()
Restore original state of downloaded code. Used by clean commands.
This may just remove the expanded source and re-expand an archive,
or it may run something like git reset --hard.
* archive()
Archive a source directory, e.g. for creating a mirror.
"""
import os
import re
import shutil
from functools import wraps
import llnl.util.tty as tty
import spack
import spack.error
import spack.util.crypto as crypto
from spack.util.executable import *
from spack.util.string import *
from spack.version import Version, ver
from spack.util.compression import decompressor_for, extension
"""List of all fetch strategies, created by FetchStrategy metaclass."""
all_strategies = []
def _needs_stage(fun):
"""Many methods on fetch strategies require a stage to be set
using set_stage(). This decorator adds a check for self.stage."""
@wraps(fun)
def wrapper(self, *args, **kwargs):
if not self.stage:
raise NoStageError(fun)
return fun(self, *args, **kwargs)
return wrapper
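The guard-decorator pattern behind _needs_stage, shown standalone with a toy Fetcher class:

```python
from functools import wraps

class NoStageError(Exception):
    pass

def _needs_stage(fun):
    """Raise if no stage has been set before calling the wrapped method."""
    @wraps(fun)
    def wrapper(self, *args, **kwargs):
        if not self.stage:
            raise NoStageError(fun.__name__)
        return fun(self, *args, **kwargs)
    return wrapper

class Fetcher:
    def __init__(self):
        self.stage = None

    @_needs_stage
    def fetch(self):
        return "fetched into %s" % self.stage

f = Fetcher()
try:
    f.fetch()
except NoStageError as e:
    print("no stage:", e)
f.stage = "/tmp/stage"
print(f.fetch())
```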
class FetchStrategy(object):
"""Superclass of all fetch strategies."""
enabled = False # Non-abstract subclasses should be enabled.
required_attributes = None # Attributes required in version() args.
class __metaclass__(type):
"""This metaclass registers all fetch strategies in a list."""
def __init__(cls, name, bases, dict):
type.__init__(cls, name, bases, dict)
if cls.enabled: all_strategies.append(cls)
def __init__(self):
# The stage is initialized late, so that fetch strategies can be constructed
# at package construction time. This is where things will be fetched.
self.stage = None
def set_stage(self, stage):
"""This is called by Stage before any of the fetching
methods are called on the stage."""
self.stage = stage
# Subclasses need to implement these methods
def fetch(self): pass # Return True on success, False on fail.
def check(self): pass # Do checksum.
def expand(self): pass # Expand archive.
def reset(self): pass # Revert to freshly downloaded state.
def archive(self, destination): pass # Used to create tarball for mirror.
def __str__(self): # Should be human readable URL.
return "FetchStrategy.__str__"
# This method is used to match fetch strategies to version()
# arguments in packages.
@classmethod
def matches(cls, args):
return any(k in args for k in cls.required_attributes)
class URLFetchStrategy(FetchStrategy):
"""FetchStrategy that pulls source code from a URL for an archive,
checks the archive against a checksum, and decompresses the archive.
"""
enabled = True
required_attributes = ['url']
def __init__(self, url=None, digest=None, **kwargs):
super(URLFetchStrategy, self).__init__()
# If URL or digest are provided in the kwargs, then prefer
# those values.
self.url = kwargs.get('url', None)
if not self.url: self.url = url
self.digest = kwargs.get('md5', None)
if not self.digest: self.digest = digest
if not self.url:
raise ValueError("URLFetchStrategy requires a url for fetching.")
@_needs_stage
def fetch(self):
self.stage.chdir()
if self.archive_file:
tty.msg("Already downloaded %s." % self.archive_file)
return
tty.msg("Trying to fetch from %s" % self.url)
# Run curl but grab the mime type from the http headers
headers = spack.curl('-#', # status bar
'-O', # save file to disk
'-f', # fail on >400 errors
'-D', '-', # print out HTML headers
'-L', self.url,
return_output=True, fail_on_error=False)
if spack.curl.returncode != 0:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
if spack.curl.returncode == 22:
# This is a 404. Curl will print the error.
raise FailedDownloadError(self.url)
if spack.curl.returncode == 60:
# This is a certificate error. Suggest spack -k
raise FailedDownloadError(
self.url,
"Curl was unable to fetch due to invalid certificate. "
"This is either an attack, or your cluster's SSL configuration "
"is bad. If you believe your SSL configuration is bad, you "
"can try running spack -k, which will not check SSL certificates."
"Use this at your own risk.")
# Check if we somehow got an HTML file rather than the archive we
# asked for. We only look at the last content type, to handle
# redirects properly.
content_types = re.findall(r'Content-Type:[^\r\n]+', headers)
if content_types and 'text/html' in content_types[-1]:
tty.warn("The contents of " + self.archive_file + " look like HTML.",
"The checksum will likely be bad. If it is, you can use",
"'spack clean --dist' to remove the bad archive, then fix",
"your internet gateway issue and install again.")
if not self.archive_file:
raise FailedDownloadError(self.url)
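The redirect-aware HTML detection in fetch() can be demonstrated on a synthetic header blob, only the last Content-Type counts:

```python
import re

headers = (
    "HTTP/1.1 302 Found\r\n"
    "Content-Type: text/html; charset=utf-8\r\n"
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: application/x-gzip\r\n"
)

# Earlier Content-Type headers belong to redirect responses.
content_types = re.findall(r'Content-Type:[^\r\n]+', headers)
is_html = bool(content_types) and 'text/html' in content_types[-1]
print(is_html)  # the final response is a gzip archive, not HTML
```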
@property
def archive_file(self):
"""Path to the source archive within this stage directory."""
return self.stage.archive_file
@_needs_stage
def expand(self):
tty.msg("Staging archive: %s" % self.archive_file)
self.stage.chdir()
if not self.archive_file:
raise NoArchiveFileError("URLFetchStrategy couldn't find archive file",
"Failed on expand() for URL %s" % self.url)
decompress = decompressor_for(self.archive_file)
decompress(self.archive_file)
def archive(self, destination):
"""Just moves this archive to the destination."""
if not self.archive_file:
raise NoArchiveFileError("Cannot call archive() before fetching.")
if not extension(destination) == extension(self.archive_file):
raise ValueError("Cannot archive without matching extensions.")
shutil.move(self.archive_file, destination)
@_needs_stage
def check(self):
"""Check the downloaded archive against a checksum digest.
No-op if this stage checks code out of a repository."""
if not self.digest:
raise NoDigestError("Attempt to check URLFetchStrategy with no digest.")
checker = crypto.Checker(self.digest)
if not checker.check(self.archive_file):
raise ChecksumError(
"%s checksum failed for %s." % (checker.hash_name, self.archive_file),
"Expected %s but got %s." % (self.digest, checker.sum))
@_needs_stage
def reset(self):
"""Removes the source path if it exists, then re-expands the archive."""
if not self.archive_file:
raise NoArchiveFileError("Tried to reset URLFetchStrategy before fetching",
"Failed on reset() for URL %s" % self.url)
if self.stage.source_path:
shutil.rmtree(self.stage.source_path, ignore_errors=True)
self.expand()
def __repr__(self):
url = self.url if self.url else "no url"
return "URLFetchStrategy<%s>" % url
def __str__(self):
if self.url:
return self.url
else:
return "[no url]"
class VCSFetchStrategy(FetchStrategy):
def __init__(self, name, *rev_types, **kwargs):
super(VCSFetchStrategy, self).__init__()
self.name = name
# Set a URL based on the type of fetch strategy.
self.url = kwargs.get(name, None)
if not self.url: raise ValueError(
"%s requires %s argument." % (self.__class__, name))
# Ensure that there's only one of the rev_types
if sum(k in kwargs for k in rev_types) > 1:
raise FetchStrategyError(
"Supply only one of %s to fetch with %s." % (
comma_or(rev_types), name))
# Set attributes for each rev type.
for rt in rev_types:
setattr(self, rt, kwargs.get(rt, None))
@_needs_stage
def check(self):
tty.msg("No checksum needed when fetching with %s." % self.name)
@_needs_stage
def expand(self):
tty.debug("Source fetched with %s is already expanded." % self.name)
@_needs_stage
def archive(self, destination, **kwargs):
assert(extension(destination) == 'tar.gz')
assert(self.stage.source_path.startswith(self.stage.path))
tar = which('tar', required=True)
patterns = kwargs.get('exclude', None)
if patterns is not None:
if isinstance(patterns, basestring):
patterns = [patterns]
for p in patterns:
tar.add_default_arg('--exclude=%s' % p)
self.stage.chdir()
tar('-czf', destination, os.path.basename(self.stage.source_path))
def __str__(self):
return "VCS: %s" % self.url
def __repr__(self):
return "%s<%s>" % (self.__class__, self.url)
class GitFetchStrategy(VCSFetchStrategy):
"""Fetch strategy that gets source code from a git repository.
Use like this in a package:
version('name', git='https://github.com/project/repo.git')
Optionally, you can provide a branch, or commit to check out, e.g.:
version('1.1', git='https://github.com/project/repo.git', tag='v1.1')
You can use these three optional attributes in addition to ``git``:
* ``branch``: Particular branch to build from (default is master)
* ``tag``: Particular tag to check out
* ``commit``: Particular commit hash in the repo
"""
enabled = True
required_attributes = ('git',)
def __init__(self, **kwargs):
super(GitFetchStrategy, self).__init__(
'git', 'tag', 'branch', 'commit', **kwargs)
self._git = None
# For git fetch branches and tags the same way.
if not self.branch:
self.branch = self.tag
@property
def git_version(self):
git = which('git', required=True)
vstring = git('--version', return_output=True).lstrip('git version ')
return Version(vstring)
@property
def git(self):
if not self._git:
self._git = which('git', required=True)
return self._git
@_needs_stage
def fetch(self):
self.stage.chdir()
if self.stage.source_path:
tty.msg("Already fetched %s." % self.stage.source_path)
return
args = []
if self.commit:
args.append('at commit %s' % self.commit)
elif self.tag:
args.append('at tag %s' % self.tag)
elif self.branch:
args.append('on branch %s' % self.branch)
tty.msg("Trying to clone git repository:", self.url, *args)
if self.commit:
# Need to do a regular clone and check out everything if
# they asked for a particular commit.
self.git('clone', self.url)
self.stage.chdir_to_source()
self.git('checkout', self.commit)
else:
# Can be more efficient if not checking out a specific commit.
args = ['clone']
# If we want a particular branch ask for it.
if self.branch:
args.extend(['--branch', self.branch])
# Try to be efficient if we're using a new enough git.
# This checks out only one branch's history
if self.git_version > ver('1.7.10'):
args.append('--single-branch')
args.append(self.url)
self.git(*args)
self.stage.chdir_to_source()
def archive(self, destination):
super(GitFetchStrategy, self).archive(destination, exclude='.git')
@_needs_stage
def reset(self):
self.stage.chdir_to_source()
self.git('checkout', '.')
self.git('clean', '-f')
def __str__(self):
return "[git] %s" % self.url
class SvnFetchStrategy(VCSFetchStrategy):
"""Fetch strategy that gets source code from a subversion repository.
Use like this in a package:
version('name', svn='http://www.example.com/svn/trunk')
Optionally, you can provide a revision for the URL:
version('name', svn='http://www.example.com/svn/trunk',
revision='1641')
"""
enabled = True
required_attributes = ['svn']
def __init__(self, **kwargs):
super(SvnFetchStrategy, self).__init__(
'svn', 'revision', **kwargs)
self._svn = None
if self.revision is not None:
self.revision = str(self.revision)
@property
def svn(self):
if not self._svn:
self._svn = which('svn', required=True)
return self._svn
@_needs_stage
def fetch(self):
self.stage.chdir()
if self.stage.source_path:
tty.msg("Already fetched %s." % self.stage.source_path)
return
tty.msg("Trying to check out svn repository: %s" % self.url)
args = ['checkout', '--force']
if self.revision:
args += ['-r', self.revision]
args.append(self.url)
self.svn(*args)
self.stage.chdir_to_source()
def _remove_untracked_files(self):
"""Removes untracked files in an svn repository."""
status = self.svn('status', '--no-ignore', return_output=True)
self.svn('status', '--no-ignore')
for line in status.split('\n'):
if not re.match('^[I?]', line):
continue
path = line[8:].strip()
if os.path.isfile(path):
os.unlink(path)
elif os.path.isdir(path):
shutil.rmtree(path, ignore_errors=True)
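The svn status parsing can be tried on sample output: untracked files are marked '?', ignored ones 'I', and the path begins at column 8.

```python
import re

status = "\n".join([
    "M       CHANGES.txt",
    "?       scratch.py",
    "I       build.log",
])

# Select only ignored/untracked entries; strip the status columns.
untracked = [line[8:].strip()
             for line in status.split('\n')
             if re.match('^[I?]', line)]
print(untracked)
```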
def archive(self, destination):
super(SvnFetchStrategy, self).archive(destination, exclude='.svn')
@_needs_stage
def reset(self):
self.stage.chdir_to_source()
self._remove_untracked_files()
self.svn('revert', '.', '-R')
def __str__(self):
return "[svn] %s" % self.url
class HgFetchStrategy(VCSFetchStrategy):
"""Fetch strategy that gets source code from a Mercurial repository.
Use like this in a package:
version('name', hg='https://jay.grs.rwth-aachen.de/hg/lwm2')
Optionally, you can provide a branch, or revision to check out, e.g.:
version('torus', hg='https://jay.grs.rwth-aachen.de/hg/lwm2', branch='torus')
You can use the optional 'revision' attribute to check out a
branch, tag, or particular revision in hg. To prevent
non-reproducible builds, using a moving target like a branch is
discouraged.
* ``revision``: Particular revision, branch, or tag.
"""
enabled = True
required_attributes = ['hg']
def __init__(self, **kwargs):
super(HgFetchStrategy, self).__init__(
'hg', 'revision', **kwargs)
self._hg = None
@property
def hg(self):
if not self._hg:
self._hg = which('hg', required=True)
return self._hg
@_needs_stage
def fetch(self):
self.stage.chdir()
if self.stage.source_path:
tty.msg("Already fetched %s." % self.stage.source_path)
return
args = []
if self.revision:
args.append('at revision %s' % self.revision)
tty.msg("Trying to clone Mercurial repository:", self.url, *args)
args = ['clone', self.url]
if self.revision:
args += ['-r', self.revision]
self.hg(*args)
def archive(self, destination):
super(HgFetchStrategy, self).archive(destination, exclude='.hg')
@_needs_stage
def reset(self):
self.stage.chdir()
source_path = self.stage.source_path
scrubbed = "scrubbed-source-tmp"
args = ['clone']
if self.revision:
args += ['-r', self.revision]
args += [source_path, scrubbed]
self.hg(*args)
shutil.rmtree(source_path, ignore_errors=True)
shutil.move(scrubbed, source_path)
self.stage.chdir_to_source()
def __str__(self):
return "[hg] %s" % self.url
def from_url(url):
"""Given a URL, find an appropriate fetch strategy for it.
Currently just gives you a URLFetchStrategy that uses curl.
TODO: make this return appropriate fetch strategies for other
types of URLs.
"""
return URLFetchStrategy(url)
def args_are_for(args, fetcher):
return fetcher.matches(args)
def for_package_version(pkg, version):
"""Determine a fetch strategy based on the arguments supplied to
version() in the package description."""
# If it's not a known version, extrapolate one.
if version not in pkg.versions:
url = pkg.url_for_version(version)
if not url:
raise InvalidArgsError(pkg, version)
return URLFetchStrategy(url)
# Grab a dict of args out of the package version dict
args = pkg.versions[version]
# Test all strategies against per-version arguments.
for fetcher in all_strategies:
if fetcher.matches(args):
return fetcher(**args)
# If nothing matched for a *specific* version, test all strategies
# against the package's own attributes (its url, git, etc.), updated
# with the per-version args.
for fetcher in all_strategies:
attrs = dict((attr, getattr(pkg, attr, None))
for attr in fetcher.required_attributes)
if 'url' in attrs:
attrs['url'] = pkg.url_for_version(version)
attrs.update(args)
if fetcher.matches(attrs):
return fetcher(**attrs)
raise InvalidArgsError(pkg, version)
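The strategy matching relies on required_attributes membership tests; a toy registry (class names illustrative, not Spack's real fetchers) sketches the dispatch:

```python
class URLFetch:
    required_attributes = ('url',)

class GitFetch:
    required_attributes = ('git',)

all_strategies = [URLFetch, GitFetch]

def matches(cls, args):
    """A strategy matches if any required attribute appears in args."""
    return any(k in args for k in cls.required_attributes)

def strategy_for(args):
    for cls in all_strategies:
        if matches(cls, args):
            return cls
    raise ValueError("no fetch strategy matches %r" % (args,))

print(strategy_for({'git': 'https://example.com/r.git', 'tag': 'v1.1'}).__name__)
```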
class FetchError(spack.error.SpackError):
def __init__(self, msg, long_msg):
super(FetchError, self).__init__(msg, long_msg)
class FailedDownloadError(FetchError):
"""Raised when a download fails."""
def __init__(self, url, msg=""):
super(FailedDownloadError, self).__init__(
"Failed to fetch file from URL: %s" % url, msg)
self.url = url
class NoArchiveFileError(FetchError):
def __init__(self, msg, long_msg):
super(NoArchiveFileError, self).__init__(msg, long_msg)
class NoDigestError(FetchError):
def __init__(self, msg, long_msg):
super(NoDigestError, self).__init__(msg, long_msg)
class InvalidArgsError(FetchError):
def __init__(self, pkg, version):
msg = "Could not construct a fetch strategy for package %s at version %s"
msg %= (pkg.name, version)
super(InvalidArgsError, self).__init__(msg)
class ChecksumError(FetchError):
"""Raised when archive fails to checksum."""
def __init__(self, message, long_msg=None):
super(ChecksumError, self).__init__(message, long_msg)
class NoStageError(FetchError):
"""Raised when fetch operations are called before set_stage()."""
def __init__(self, method):
super(NoStageError, self).__init__(
"Must call FetchStrategy.set_stage() before calling %s" % method.__name__)

lib/spack/spack/mirror.py

@@ -0,0 +1,189 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
This file contains code for creating spack mirror directories. A
mirror is an organized hierarchy containing specially named archive
files. This enables spack to know where to find files in a mirror if
the main server for a particular package is down. Or, if the computer
where spack is run is not connected to the internet, it allows spack
to download packages directly from a mirror (e.g., on an intranet).
"""
import sys
import os
import llnl.util.tty as tty
from llnl.util.filesystem import *
import spack
import spack.error
import spack.fetch_strategy as fs
from spack.spec import Spec
from spack.stage import Stage
from spack.version import *
from spack.util.compression import extension
def mirror_archive_filename(spec):
"""Get the path that this spec will live at within a mirror."""
if not spec.version.concrete:
raise ValueError("mirror.path requires spec with concrete version.")
fetcher = spec.package.fetcher
if isinstance(fetcher, fs.URLFetchStrategy):
# If we fetch this version with a URLFetchStrategy, use URL's archive type
ext = extension(fetcher.url)
else:
# Otherwise we'll make a .tar.gz ourselves
ext = 'tar.gz'
return "%s-%s.%s" % (spec.package.name, spec.version, ext)
def get_matching_versions(specs, **kwargs):
"""Get a spec for EACH known version matching any spec in the list."""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s." % pkg.name)
continue
num_versions = kwargs.get('num_versions', 0)
for i, v in enumerate(reversed(sorted(pkg.versions))):
# Generate no more than num_versions versions for each spec.
if num_versions and i >= num_versions:
break
# Generate only versions that satisfy the spec.
if v.satisfies(spec.versions):
s = Spec(pkg.name)
s.versions = VersionList([v])
matching.append(s)
return matching
def create(path, specs, **kwargs):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path Path to create a mirror directory hierarchy in.
specs Any package versions matching these specs will be added
to the mirror.
Keyword args:
no_checksum: If True, do not checksum when fetching (default False)
num_versions: Max number of versions to fetch per spec,
if spec is ambiguous (default is 0 for all of them)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
This routine iterates through all known package versions, and
it creates specs for those versions. If the version satisfies any spec
in the specs list, it is downloaded and added to the mirror.
"""
# Make sure nothing is in the way.
if os.path.isfile(path):
raise MirrorError("%s already exists and is a file." % path)
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, Spec) else Spec(s) for s in specs]
# Get concrete specs for each matching version of these specs.
version_specs = get_matching_versions(
specs, num_versions=kwargs.get('num_versions', 0))
for s in version_specs:
s.concretize()
# Get the absolute path of the root before we start jumping around.
mirror_root = os.path.abspath(path)
if not os.path.isdir(mirror_root):
mkdirp(mirror_root)
# Things to keep track of while parsing specs.
present = []
mirrored = []
error = []
# Iterate through packages and download all the safe tarballs for each of them
for spec in version_specs:
pkg = spec.package
stage = None
try:
# create a subdirectory for the current package@version
subdir = join_path(mirror_root, pkg.name)
mkdirp(subdir)
archive_file = mirror_archive_filename(spec)
archive_path = join_path(subdir, archive_file)
if os.path.exists(archive_path):
tty.msg("Already added %s" % spec.format("$_$@"))
present.append(spec)
continue
# Set up a stage and a fetcher for the download
unique_fetch_name = spec.format("$_$@")
fetcher = fs.for_package_version(pkg, pkg.version)
stage = Stage(fetcher, name=unique_fetch_name)
fetcher.set_stage(stage)
# Do the fetch and checksum if necessary
fetcher.fetch()
if not kwargs.get('no_checksum', False):
fetcher.check()
tty.msg("Checksum passed for %s@%s" % (pkg.name, pkg.version))
# Fetchers have to know how to archive their files. Use
# that to move/copy/create an archive in the mirror.
fetcher.archive(archive_path)
tty.msg("Added %s." % spec.format("$_$@"))
mirrored.append(spec)
except Exception, e:
if spack.debug:
sys.excepthook(*sys.exc_info())
else:
tty.warn("Error while fetching %s." % spec.format('$_$@'), e.message)
error.append(spec)
finally:
if stage:
stage.destroy()
return (present, mirrored, error)
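The bookkeeping in `create()` guarantees every spec lands in exactly one of the three returned lists. A minimal model of that triage loop, with a stand-in `fetch` callable in place of the real stage/fetcher machinery and hypothetical spec names:

```python
def triage(specs, already_present, fetch):
    """Sort specs into (present, mirrored, error), mirroring create()'s flow."""
    present, mirrored, error = [], [], []
    for spec in specs:
        if spec in already_present:
            present.append(spec)       # archive already in the mirror
            continue
        try:
            fetch(spec)                # download + checksum + archive
            mirrored.append(spec)
        except Exception:
            error.append(spec)         # any failure is recorded, not fatal
    return present, mirrored, error

def flaky_fetch(spec):
    if spec == 'badpkg@1.0':
        raise IOError("download failed")

print(triage(['libelf@0.8.13', 'badpkg@1.0', 'zlib@1.2.8'],
             {'libelf@0.8.13'}, flaky_fetch))
# (['libelf@0.8.13'], ['zlib@1.2.8'], ['badpkg@1.0'])
```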
class MirrorError(spack.error.SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super(MirrorError, self).__init__(msg, long_msg)


@ -52,6 +52,7 @@
import spack.hooks
import spack.build_environment as build_env
import spack.url as url
import spack.fetch_strategy as fs
from spack.version import *
from spack.stage import Stage
from spack.util.web import get_pages
@ -114,13 +115,13 @@ def install(self, spec, prefix):
2. The class name, "Cmake". This is formed by converting `-` or
``_`` in the module name to camel case. If the name starts with
a number, we prefix the class name with ``Num_``. Examples:
a number, we prefix the class name with ``_``. Examples:
Module Name Class Name
foo_bar FooBar
docbook-xml DocbookXml
FooBar Foobar
3proxy Num_3proxy
3proxy _3proxy
The class name is what spack looks for when it loads a package module.
@ -300,7 +301,7 @@ class SomePackage(Package):
# These variables are defaults for the various "relations".
#
"""Map of information about Versions of this package.
Map goes: Version -> VersionDescriptor"""
Map goes: Version -> dict of attributes"""
versions = {}
"""Specs of dependency packages, keyed by name."""
@ -337,7 +338,7 @@ def __init__(self, spec):
# Sanity check some required variables that could be
# overridden by package authors.
def sanity_check_dict(attr_name):
def ensure_has_dict(attr_name):
if not hasattr(self, attr_name):
raise PackageError("Package %s must define %s" % attr_name)
@ -345,10 +346,10 @@ def sanity_check_dict(attr_name):
if not isinstance(attr, dict):
raise PackageError("Package %s has non-dict %s attribute!"
% (self.name, attr_name))
sanity_check_dict('versions')
sanity_check_dict('dependencies')
sanity_check_dict('conflicted')
sanity_check_dict('patches')
ensure_has_dict('versions')
ensure_has_dict('dependencies')
ensure_has_dict('conflicted')
ensure_has_dict('patches')
# Check versions in the versions dict.
for v in self.versions:
@ -356,22 +357,23 @@ def sanity_check_dict(attr_name):
# Check version descriptors
for v in sorted(self.versions):
vdesc = self.versions[v]
assert(isinstance(vdesc, spack.relations.VersionDescriptor))
assert(isinstance(self.versions[v], dict))
# Version-ize the keys in versions dict
try:
self.versions = dict((Version(v), h) for v,h in self.versions.items())
except ValueError:
raise ValueError("Keys of versions dict in package %s must be versions!"
% self.name)
except ValueError, e:
raise ValueError("In package %s: %s" % (self.name, e.message))
# stage used to build this package.
self._stage = None
# patch up self.url based on the actual version
if self.spec.concrete:
self.url = self.url_for_version(self.version)
# If there's no default URL provided, set this package's url to None
if not hasattr(self, 'url'):
self.url = None
# Init fetch strategy to None
self._fetcher = None
# Set a default list URL (place to find available versions)
if not hasattr(self, 'list_url'):
@ -383,29 +385,102 @@ def sanity_check_dict(attr_name):
@property
def version(self):
if not self.spec.concrete:
raise ValueError("Can only get version of concrete package.")
if not self.spec.versions.concrete:
raise ValueError("Can only get of package with concrete version.")
return self.spec.versions[0]
@memoized
def version_urls(self):
"""Return a list of URLs for different versions of this
package, sorted by version. A version's URL only appears
in this list if it has an explicitly defined URL."""
version_urls = {}
for v in sorted(self.versions):
args = self.versions[v]
if 'url' in args:
version_urls[v] = args['url']
return version_urls
def nearest_url(self, version):
"""Finds the URL for the next lowest version with a URL.
If there is no lower version with a URL, uses the
package url property. If that isn't there, uses a
*higher* URL, and if that isn't there raises an error.
"""
version_urls = self.version_urls()
url = self.url
for v in version_urls:
if v > version and url:
break
if version_urls[v]:
url = version_urls[v]
return url
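The `nearest_url` rule above is subtle enough to deserve a worked example. The standalone sketch below substitutes a plain dict for `version_urls()` and tuples for `Version` objects: prefer the highest version at or below the requested one that has a URL; with no lower URL, fall back to the package URL, then to a higher version's URL.

```python
def nearest_url(version, version_urls, package_url=None):
    url = package_url
    for v in sorted(version_urls):
        if v > version and url:
            break                      # already have a candidate at or below
        if version_urls[v]:
            url = version_urls[v]
    return url

urls = {(1, 0): 'http://example.com/pkg-1.0.tar.gz',
        (2, 0): 'http://example.com/pkg-2.0.tar.gz'}
print(nearest_url((1, 5), urls))   # version 1.5 inherits 1.0's URL
```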
def has_url(self):
"""Returns whether there is a URL available for this package.
If there isn't, it's probably fetched some other way (version
control, etc.)"""
return self.url or self.version_urls()
# TODO: move this out of here and into some URL extrapolation module?
def url_for_version(self, version):
"""Returns a URL that you can download a new version of this package from."""
if not isinstance(version, Version):
version = Version(version)
if not self.has_url():
raise NoURLError(self.__class__)
# If we have a specific URL for this version, don't extrapolate.
version_urls = self.version_urls()
if version in version_urls:
return version_urls[version]
# If we have no idea, try to substitute the version.
return url.substitute_version(self.nearest_url(version),
self.url_version(version))
@property
def stage(self):
if not self.spec.concrete:
raise ValueError("Can only get a stage for a concrete package.")
if self._stage is None:
if not self.url:
raise PackageVersionError(self.version)
# TODO: move this logic into a mirror module.
mirror_path = "%s/%s" % (self.name, "%s-%s.%s" % (
self.name, self.version, extension(self.url)))
self._stage = Stage(
self.url, mirror_path=mirror_path, name=self.spec.short_spec)
self._stage = Stage(self.fetcher,
mirror_path=self.mirror_path(),
name=self.spec.short_spec)
return self._stage
@property
def fetcher(self):
if not self.spec.versions.concrete:
raise ValueError(
"Can only get a fetcher for a package with concrete versions.")
if not self._fetcher:
self._fetcher = fs.for_package_version(self, self.version)
return self._fetcher
@fetcher.setter
def fetcher(self, f):
self._fetcher = f
def mirror_path(self):
"""Get path to this package's archive in a mirror."""
filename = "%s-%s." % (self.name, self.version)
filename += extension(self.url) if self.has_url() else "tar.gz"
return "%s/%s" % (self.name, filename)
def preorder_traversal(self, visited=None, **kwargs):
"""This does a preorder traversal of the package's dependence DAG."""
virtual = kwargs.get("virtual", False)
@ -524,47 +599,6 @@ def url_version(self, version):
return str(version)
def url_for_version(self, version):
"""Returns a URL that you can download a new version of this package from."""
if not isinstance(version, Version):
version = Version(version)
def nearest_url(version):
"""Finds the URL for the next lowest version with a URL.
If there is no lower version with a URL, uses the
package url property. If that isn't there, uses a
*higher* URL, and if that isn't there raises an error.
"""
url = getattr(self, 'url', None)
for v in sorted(self.versions):
if v > version and url:
break
if self.versions[v].url:
url = self.versions[v].url
return url
if version in self.versions:
vdesc = self.versions[version]
if not vdesc.url:
base_url = nearest_url(version)
vdesc.url = url.substitute_version(
base_url, self.url_version(version))
return vdesc.url
else:
return nearest_url(version)
@property
def default_url(self):
if self.concrete:
return self.url_for_version(self.version)
else:
url = getattr(self, 'url', None)
if url:
return url
def remove_prefix(self):
"""Removes the prefix for a package along with any empty parent directories."""
spack.install_layout.remove_path_for_spec(self.spec)
@ -587,9 +621,7 @@ def do_fetch(self):
self.stage.fetch()
if spack.do_checksum and self.version in self.versions:
digest = self.versions[self.version].checksum
self.stage.check(digest)
tty.msg("Checksum passed for %s@%s" % (self.name, self.version))
self.stage.check()
def do_stage(self):
@ -600,14 +632,13 @@ def do_stage(self):
self.do_fetch()
archive_dir = self.stage.expanded_archive_path
archive_dir = self.stage.source_path
if not archive_dir:
tty.msg("Staging archive: %s" % self.stage.archive_file)
self.stage.expand_archive()
tty.msg("Created stage directory in %s." % self.stage.path)
tty.msg("Created stage in %s." % self.stage.path)
else:
tty.msg("Already staged %s in %s." % (self.name, self.stage.path))
self.stage.chdir_to_archive()
self.stage.chdir_to_source()
def do_patch(self):
@ -616,11 +647,17 @@ def do_patch(self):
if not self.spec.concrete:
raise ValueError("Can only patch concrete packages.")
# Kick off the stage first.
self.do_stage()
# If there are no patches, note it.
if not self.patches:
tty.msg("No patches needed for %s." % self.name)
return
# Construct paths to special files in the archive dir used to
# keep track of whether patches were successfully applied.
archive_dir = self.stage.expanded_archive_path
archive_dir = self.stage.source_path
good_file = join_path(archive_dir, '.spack_patched')
bad_file = join_path(archive_dir, '.spack_patch_failed')
@ -630,7 +667,7 @@ def do_patch(self):
tty.msg("Patching failed last time. Restaging.")
self.stage.restage()
self.stage.chdir_to_archive()
self.stage.chdir_to_source()
# If this file exists, then we already applied all the patches.
if os.path.isfile(good_file):
@ -790,19 +827,15 @@ def do_uninstall(self, **kwargs):
def do_clean(self):
if self.stage.expanded_archive_path:
self.stage.chdir_to_archive()
self.stage.chdir_to_source()
self.clean()
def clean(self):
"""By default just runs make clean. Override if this isn't good."""
try:
# TODO: should we really call make clean, or just blow away the directory?
make = build_env.MakeExecutable('make', self.parallel)
make('clean')
tty.msg("Successfully cleaned %s" % self.name)
except subprocess.CalledProcessError, e:
tty.warn("Warning: 'make clean' didn't work. Consider 'spack clean --work'.")
# TODO: should we really call make clean, or just blow away the directory?
make = build_env.MakeExecutable('make', self.parallel)
make('clean')
def do_clean_work(self):
@ -814,7 +847,6 @@ def do_clean_dist(self):
"""Removes the stage directory where this package was built."""
if os.path.exists(self.stage.path):
self.stage.destroy()
tty.msg("Successfully cleaned %s" % self.name)
def fetch_available_versions(self):
@ -859,7 +891,7 @@ def find_versions_of_archive(archive_url, **kwargs):
list_depth = kwargs.get('list_depth', 1)
if not list_url:
list_url = os.path.dirname(archive_url)
list_url = url.find_list_url(archive_url)
# This creates a regex from the URL with a capture group for the
# version part of the URL. The capture group is converted to a
@ -946,3 +978,10 @@ def __init__(self, cls):
super(VersionFetchError, self).__init__(
"Cannot fetch version for package %s " % cls.__name__ +
"because it does not define a default url.")
class NoURLError(PackageError):
"""Raised when someone tries to build a URL for a package with no URLs."""
def __init__(self, cls):
super(NoURLError, self).__init__(
"Package %s has no version with a URL." % cls.__name__)


@ -47,10 +47,10 @@
def _autospec(function):
"""Decorator that automatically converts the argument of a single-arg
function to a Spec."""
def converter(self, spec_like):
def converter(self, spec_like, **kwargs):
if not isinstance(spec_like, spack.spec.Spec):
spec_like = spack.spec.Spec(spec_like)
return function(self, spec_like)
return function(self, spec_like, **kwargs)
return converter
@ -63,10 +63,14 @@ def __init__(self, root):
@_autospec
def get(self, spec):
def get(self, spec, **kwargs):
if spec.virtual:
raise UnknownPackageError(spec.name)
if kwargs.get('new', False):
if spec in self.instances:
del self.instances[spec]
if not spec in self.instances:
package_class = self.get_class_for_package_name(spec.name)
try:
@ -77,6 +81,17 @@ def get(self, spec):
return self.instances[spec]
@_autospec
def delete(self, spec):
"""Force a package to be recreated."""
del self.instances[spec]
def purge(self):
"""Clear entire package instance cache."""
self.instances.clear()
@_autospec
def get_installed(self, spec):
"""Get all the installed specs that satisfy the provided spec constraint."""


@ -64,7 +64,7 @@ def apply(self, stage):
"""Fetch this patch, if necessary, and apply it to the source
code in the supplied stage.
"""
stage.chdir_to_archive()
stage.chdir_to_source()
patch_stage = None
try:


@ -79,32 +79,28 @@ class Mpileaks(Package):
import spack.spec
import spack.error
import spack.url
from spack.version import Version
from spack.patch import Patch
from spack.spec import Spec, parse_anonymous_spec
class VersionDescriptor(object):
"""A VersionDescriptor contains information to describe a
particular version of a package. That currently includes a URL
for the version along with a checksum."""
def __init__(self, checksum, url):
self.checksum = checksum
self.url = url
def version(ver, checksum, **kwargs):
"""Adds a version and associated metadata to the package."""
def version(ver, checksum=None, **kwargs):
"""Adds a version and metadata describing how to fetch it.
Metadata is just stored as a dict in the package's versions
dictionary. Package must turn it into a valid fetch strategy
later.
"""
pkg = caller_locals()
versions = pkg.setdefault('versions', {})
patches = pkg.setdefault('patches', {})
ver = Version(ver)
url = kwargs.get('url', None)
# special case checksum for backward compatibility
if checksum:
kwargs['md5'] = checksum
versions[ver] = VersionDescriptor(checksum, url)
# Store the kwargs for the package to use later when constructing
# a fetch strategy.
versions[Version(ver)] = kwargs
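What the `version()` directive records is just its kwargs, keyed by version, with a bare positional checksum folded in as `'md5'` for backward compatibility. The sketch below replaces `Version` objects and `caller_locals()` with plain strings and an explicit dict for self-containment; the checksums and URLs are hypothetical.

```python
def record_version(versions, ver, checksum=None, **kwargs):
    """Store per-version fetch metadata, mirroring the version() directive."""
    if checksum:
        kwargs['md5'] = checksum      # backward-compatible positional checksum
    versions[ver] = kwargs

versions = {}
record_version(versions, '0.8.13', '4136d7b4c04df68b686570afa26988ac')
record_version(versions, 'git', git='https://example.com/repo.git', tag='v1.0')
```

A package later hands each of these dicts to `fs.for_package_version()`, which picks a fetch strategy whose required attributes (`md5`+`url`, `git`, etc.) the dict satisfies.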
def depends_on(*specs):


@ -1619,7 +1619,7 @@ def __init__(self, provided, required, constraint_type):
class UnsatisfiableSpecNameError(UnsatisfiableSpecError):
"""Raised when two specs aren't even for the same package."""
def __init__(self, provided, required):
super(UnsatisfiableVersionSpecError, self).__init__(
super(UnsatisfiableSpecNameError, self).__init__(
provided, required, "name")


@ -32,18 +32,19 @@
import spack
import spack.config
import spack.fetch_strategy as fs
import spack.error
import spack.util.crypto as crypto
from spack.util.compression import decompressor_for
STAGE_PREFIX = 'spack-stage-'
class Stage(object):
"""A Stage object manaages a directory where an archive is downloaded,
expanded, and built before being installed. It also handles downloading
the archive. A stage's lifecycle looks like this:
"""A Stage object manaages a directory where some source code is
downloaded and built before being installed. It handles
fetching the source code, either as an archive to be expanded
or by checking it out of a repository. A stage's lifecycle
looks like this:
Stage()
Constructor creates the stage directory.
@ -68,21 +69,31 @@ class Stage(object):
similar, and are intended to persist for only one run of spack.
"""
def __init__(self, url, **kwargs):
def __init__(self, url_or_fetch_strategy, **kwargs):
"""Create a stage object.
Parameters:
url URL of the archive to be downloaded into this stage.
url_or_fetch_strategy
URL of the archive to be downloaded into this stage, OR
a valid FetchStrategy.
name If a name is provided, then this stage is a named stage
and will persist between runs (or if you construct another
stage object later). If name is not provided, then this
stage will be given a unique name automatically.
name
If a name is provided, then this stage is a named stage
and will persist between runs (or if you construct another
stage object later). If name is not provided, then this
stage will be given a unique name automatically.
"""
if isinstance(url_or_fetch_strategy, basestring):
self.fetcher = fs.from_url(url_or_fetch_strategy)
elif isinstance(url_or_fetch_strategy, fs.FetchStrategy):
self.fetcher = url_or_fetch_strategy
else:
raise ValueError("Can't construct Stage without url or fetch strategy")
self.fetcher.set_stage(self)
self.name = kwargs.get('name')
self.mirror_path = kwargs.get('mirror_path')
self.tmp_root = find_tmp_root()
self.url = url
self.path = None
self._setup()
@ -187,7 +198,10 @@ def _setup(self):
@property
def archive_file(self):
"""Path to the source archive within this stage directory."""
paths = [os.path.join(self.path, os.path.basename(self.url))]
if not isinstance(self.fetcher, fs.URLFetchStrategy):
return None
paths = [os.path.join(self.path, os.path.basename(self.fetcher.url))]
if self.mirror_path:
paths.append(os.path.join(self.path, os.path.basename(self.mirror_path)))
@ -198,17 +212,17 @@ def archive_file(self):
@property
def expanded_archive_path(self):
"""Returns the path to the expanded archive directory if it's expanded;
None if the archive hasn't been expanded.
"""
if not self.archive_file:
return None
def source_path(self):
"""Returns the path to the expanded/checked out source code
within this fetch strategy's path.
for file in os.listdir(self.path):
archive_path = join_path(self.path, file)
if os.path.isdir(archive_path):
return archive_path
This assumes nothing else is going to be put in the
FetchStrategy's path. It searches for the first
subdirectory of the path it can find, then returns that.
"""
for p in [os.path.join(self.path, f) for f in os.listdir(self.path)]:
if os.path.isdir(p):
return p
return None
@ -220,71 +234,37 @@ def chdir(self):
tty.die("Setup failed: no such directory: " + self.path)
def fetch_from_url(self, url):
# Run curl but grab the mime type from the http headers
headers = spack.curl('-#', # status bar
'-O', # save file to disk
'-D', '-', # print out HTML headers
'-L', url,
return_output=True, fail_on_error=False)
if spack.curl.returncode != 0:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
if spack.curl.returncode == 60:
# This is a certificate error. Suggest spack -k
raise FailedDownloadError(
url,
"Curl was unable to fetch due to invalid certificate. "
"This is either an attack, or your cluster's SSL configuration "
"is bad. If you believe your SSL configuration is bad, you "
"can try running spack -k, which will not check SSL certificates."
"Use this at your own risk.")
# Check if we somehow got an HTML file rather than the archive we
# asked for. We only look at the last content type, to handle
# redirects properly.
content_types = re.findall(r'Content-Type:[^\r\n]+', headers)
if content_types and 'text/html' in content_types[-1]:
tty.warn("The contents of " + self.archive_file + " look like HTML.",
"The checksum will likely be bad. If it is, you can use",
"'spack clean --dist' to remove the bad archive, then fix",
"your internet gateway issue and install again.")
def fetch(self):
"""Downloads the file at URL to the stage. Returns true if it was downloaded,
false if it already existed."""
"""Downloads an archive or checks out code from a repository."""
self.chdir()
if self.archive_file:
tty.msg("Already downloaded %s." % self.archive_file)
else:
urls = [self.url]
if self.mirror_path:
urls = ["%s/%s" % (m, self.mirror_path) for m in _get_mirrors()] + urls
fetchers = [self.fetcher]
for url in urls:
tty.msg("Trying to fetch from %s" % url)
self.fetch_from_url(url)
if self.archive_file:
break
# TODO: move mirror logic out of here and clean it up!
if self.mirror_path:
urls = ["%s/%s" % (m, self.mirror_path) for m in _get_mirrors()]
if not self.archive_file:
raise FailedDownloadError(url)
digest = None
if isinstance(self.fetcher, fs.URLFetchStrategy):
digest = self.fetcher.digest
fetchers = [fs.URLFetchStrategy(url, digest)
for url in urls] + fetchers
for f in fetchers:
f.set_stage(self)
return self.archive_file
for fetcher in fetchers:
try:
fetcher.fetch()
break
except spack.error.SpackError, e:
tty.msg("Fetching %s failed." % fetcher)
continue
def check(self, digest):
"""Check the downloaded archive against a checksum digest"""
checker = crypto.Checker(digest)
if not checker.check(self.archive_file):
raise ChecksumError(
"%s checksum failed for %s." % (checker.hash_name, self.archive_file),
"Expected %s but got %s." % (digest, checker.sum))
def check(self):
"""Check the downloaded archive against a checksum digest.
No-op if this stage checks code out of a repository."""
self.fetcher.check()
def expand_archive(self):
@ -292,19 +272,14 @@ def expand_archive(self):
archive. Fail if the stage is not set up or if the archive is not yet
downloaded.
"""
self.chdir()
if not self.archive_file:
tty.die("Attempt to expand archive before fetching.")
decompress = decompressor_for(self.archive_file)
decompress(self.archive_file)
self.fetcher.expand()
def chdir_to_archive(self):
def chdir_to_source(self):
"""Changes directory to the expanded archive directory.
Dies with an error if there was no expanded archive.
"""
path = self.expanded_archive_path
path = self.source_path
if not path:
tty.die("Attempt to chdir before expanding archive.")
else:
@ -317,12 +292,7 @@ def restage(self):
"""Removes the expanded archive path if it exists, then re-expands
the archive.
"""
if not self.archive_file:
tty.die("Attempt to restage when not staged.")
if self.expanded_archive_path:
shutil.rmtree(self.expanded_archive_path, True)
self.expand_archive()
self.fetcher.reset()
def destroy(self):
@ -393,15 +363,20 @@ def find_tmp_root():
return None
class FailedDownloadError(spack.error.SpackError):
"""Raised wen a download fails."""
def __init__(self, url, msg=""):
super(FailedDownloadError, self).__init__(
"Failed to fetch file from URL: %s" % url, msg)
self.url = url
class StageError(spack.error.SpackError):
def __init__(self, message, long_message=None):
super(StageError, self).__init__(message, long_message)
class ChecksumError(spack.error.SpackError):
"""Raised when archive fails to checksum."""
def __init__(self, message, long_msg):
super(ChecksumError, self).__init__(message, long_msg)
class RestageError(StageError):
def __init__(self, message, long_msg=None):
super(RestageError, self).__init__(message, long_msg)
class ChdirError(StageError):
def __init__(self, message, long_msg=None):
super(ChdirError, self).__init__(message, long_msg)
# Keep this in namespace for convenience
FailedDownloadError = fs.FailedDownloadError


@ -36,6 +36,7 @@
"""Names of tests to be included in Spack's test suite"""
test_names = ['versions',
'url_parse',
'url_substitution',
'packages',
'stage',
'spec_syntax',
@ -47,7 +48,12 @@
'package_sanity',
'config',
'directory_layout',
'python_version']
'python_version',
'git_fetch',
'svn_fetch',
'hg_fetch',
'mirror',
'url_extrapolate']
def list_tests():


@ -0,0 +1,133 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import unittest
import shutil
import tempfile
from contextlib import closing
from llnl.util.filesystem import *
import spack
from spack.version import ver
from spack.stage import Stage
from spack.util.executable import which
from spack.test.mock_packages_test import *
from spack.test.mock_repo import MockGitRepo
class GitFetchTest(MockPackagesTest):
"""Tests fetching from a dummy git repository."""
def setUp(self):
"""Create a git repository with master and two other branches,
and one tag, so that we can experiment on it."""
super(GitFetchTest, self).setUp()
self.repo = MockGitRepo()
spec = Spec('git-test')
spec.concretize()
self.pkg = spack.db.get(spec, new=True)
def tearDown(self):
"""Destroy the stage space used by this test."""
super(GitFetchTest, self).tearDown()
if self.repo.stage is not None:
self.repo.stage.destroy()
self.pkg.do_clean_dist()
def assert_rev(self, rev):
"""Check that the current git revision is equal to the supplied rev."""
self.assertEqual(self.repo.rev_hash('HEAD'), self.repo.rev_hash(rev))
def try_fetch(self, rev, test_file, args):
"""Tries to:
1. Fetch the repo using a fetch strategy constructed with
supplied args.
2. Check if the test_file is in the checked out repository.
3. Assert that the repository is at the revision supplied.
4. Add and remove some files, then reset the repo, and
ensure it's all there again.
"""
self.pkg.versions[ver('git')] = args
self.pkg.do_stage()
self.assert_rev(rev)
file_path = join_path(self.pkg.stage.source_path, test_file)
self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))
os.unlink(file_path)
self.assertFalse(os.path.isfile(file_path))
untracked_file = 'foobarbaz'
touch(untracked_file)
self.assertTrue(os.path.isfile(untracked_file))
self.pkg.do_clean_work()
self.assertFalse(os.path.isfile(untracked_file))
self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))
self.assert_rev(rev)
def test_fetch_master(self):
"""Test a default git checkout with no commit or tag specified."""
self.try_fetch('master', self.repo.r0_file, {
'git' : self.repo.path
})
def ztest_fetch_branch(self):
"""Test fetching a branch."""
self.try_fetch(self.repo.branch, self.repo.branch_file, {
'git' : self.repo.path,
'branch' : self.repo.branch
})
def ztest_fetch_tag(self):
"""Test fetching a tag."""
self.try_fetch(self.repo.tag, self.repo.tag_file, {
'git' : self.repo.path,
'tag' : self.repo.tag
})
def ztest_fetch_commit(self):
"""Test fetching a particular commit."""
self.try_fetch(self.repo.r1, self.repo.r1_file, {
'git' : self.repo.path,
'commit' : self.repo.r1
})


@ -0,0 +1,111 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import unittest
import shutil
import tempfile
from contextlib import closing
from llnl.util.filesystem import *
import spack
from spack.version import ver
from spack.stage import Stage
from spack.util.executable import which
from spack.test.mock_packages_test import *
from spack.test.mock_repo import MockHgRepo
class HgFetchTest(MockPackagesTest):
"""Tests fetching from a dummy hg repository."""
def setUp(self):
"""Create an hg repository with two revisions, so that we can experiment on it."""
super(HgFetchTest, self).setUp()
self.repo = MockHgRepo()
spec = Spec('hg-test')
spec.concretize()
self.pkg = spack.db.get(spec, new=True)
def tearDown(self):
"""Destroy the stage space used by this test."""
super(HgFetchTest, self).tearDown()
if self.repo.stage is not None:
self.repo.stage.destroy()
self.pkg.do_clean_dist()
def try_fetch(self, rev, test_file, args):
"""Tries to:
1. Fetch the repo using a fetch strategy constructed with
supplied args.
2. Check if the test_file is in the checked out repository.
3. Assert that the repository is at the revision supplied.
4. Add and remove some files, then reset the repo, and
ensure it's all there again.
"""
self.pkg.versions[ver('hg')] = args
self.pkg.do_stage()
self.assertEqual(self.repo.get_rev(), rev)
file_path = join_path(self.pkg.stage.source_path, test_file)
self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))
os.unlink(file_path)
self.assertFalse(os.path.isfile(file_path))
untracked = 'foobarbaz'
touch(untracked)
self.assertTrue(os.path.isfile(untracked))
self.pkg.do_clean_work()
self.assertFalse(os.path.isfile(untracked))
self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))
self.assertEqual(self.repo.get_rev(), rev)
def test_fetch_default(self):
"""Test a default hg checkout with no revision specified."""
self.try_fetch(self.repo.r1, self.repo.r1_file, {
'hg' : self.repo.path
})
def test_fetch_rev0(self):
"""Test fetching an older revision (0)."""
self.try_fetch(self.repo.r0, self.repo.r0_file, {
'hg' : self.repo.path,
'revision' : self.repo.r0
})
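The version dicts passed to `try_fetch` above select a fetch strategy by which key they contain. A minimal sketch of that dispatch (`fetch_strategy_for` is a hypothetical helper for illustration, not Spack's actual API):

```python
def fetch_strategy_for(version_args):
    """Pick a fetcher name from Spack-style version args: e.g.
    {'hg': path, 'revision': rev} selects the mercurial fetcher."""
    for key in ('url', 'git', 'hg', 'svn'):
        if key in version_args:
            return key
    raise ValueError("no known fetch strategy in %r" % version_args)

# the extra keys ('revision', 'branch', 'tag') refine the chosen fetcher
assert fetch_strategy_for({'hg': '/tmp/mock-hg-repo', 'revision': '0'}) == 'hg'
```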

@@ -32,42 +32,21 @@
import spack
from spack.stage import Stage
from spack.fetch_strategy import URLFetchStrategy
from spack.directory_layout import SpecHashDirectoryLayout
from spack.util.executable import which
from spack.test.mock_packages_test import *
from spack.test.mock_repo import MockArchive
dir_name = 'trivial-1.0'
archive_name = 'trivial-1.0.tar.gz'
install_test_package = 'trivial_install_test_package'
class InstallTest(MockPackagesTest):
"""Tests install and uninstall on a trivial package."""
def setUp(self):
super(InstallTest, self).setUp()
self.stage = Stage('not_a_real_url')
archive_dir = join_path(self.stage.path, dir_name)
dummy_configure = join_path(archive_dir, 'configure')
mkdirp(archive_dir)
with closing(open(dummy_configure, 'w')) as configure:
configure.write(
"#!/bin/sh\n"
"prefix=$(echo $1 | sed 's/--prefix=//')\n"
"cat > Makefile <<EOF\n"
"all:\n"
"\techo Building...\n\n"
"install:\n"
"\tmkdir -p $prefix\n"
"\ttouch $prefix/dummy_file\n"
"EOF\n")
os.chmod(dummy_configure, 0755)
with working_dir(self.stage.path):
tar = which('tar')
tar('-czf', archive_name, dir_name)
# create a simple installable package directory and tarball
self.repo = MockArchive()
# We use a fake package, so skip the checksum.
spack.do_checksum = False
@@ -82,8 +61,8 @@ def setUp(self):
def tearDown(self):
super(InstallTest, self).tearDown()
if self.stage is not None:
self.stage.destroy()
if self.repo.stage is not None:
self.repo.stage.destroy()
# Turn checksumming back on
spack.do_checksum = True
@@ -95,7 +74,7 @@ def tearDown(self):
def test_install_and_uninstall(self):
# Get a basic concrete spec for the trivial install package.
spec = Spec(install_test_package)
spec = Spec('trivial_install_test_package')
spec.concretize()
self.assertTrue(spec.concrete)
@@ -103,8 +82,7 @@ def test_install_and_uninstall(self):
pkg = spack.db.get(spec)
# Fake the URL for the package so it downloads from a file.
archive_path = join_path(self.stage.path, archive_name)
pkg.url = 'file://' + archive_path
pkg.fetcher = URLFetchStrategy(self.repo.url)
try:
pkg.do_install()

@@ -0,0 +1,156 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
from filecmp import dircmp
import spack
import spack.mirror
from spack.util.compression import decompressor_for
from spack.test.mock_packages_test import *
from spack.test.mock_repo import *
# paths in repos that shouldn't be in the mirror tarballs.
exclude = ['.hg', '.git', '.svn']
class MirrorTest(MockPackagesTest):
def setUp(self):
"""Sets up a mock package and a mock repo for each fetch strategy, to
ensure that the mirror can create archives for each of them.
"""
super(MirrorTest, self).setUp()
self.repos = {}
def set_up_package(self, name, mock_repo_class, url_attr):
"""Use this to set up a mock package to be mirrored.
Each package needs us to:
1. Set up a mock repo/archive to fetch from.
2. Point the package's version args at that repo.
"""
# Set up packages to point at mock repos.
spec = Spec(name)
spec.concretize()
# Get the package and fix its fetch args to point to a mock repo
pkg = spack.db.get(spec)
repo = mock_repo_class()
self.repos[name] = repo
# change the fetch args of the first (only) version.
assert len(pkg.versions) == 1
v = next(iter(pkg.versions))
pkg.versions[v][url_attr] = repo.url
def tearDown(self):
"""Destroy all the stages created by the repos in setup."""
super(MirrorTest, self).tearDown()
for name, repo in self.repos.items():
if repo.stage:
repo.stage.destroy()
self.repos.clear()
def check_mirror(self):
stage = Stage('spack-mirror-test')
mirror_root = join_path(stage.path, 'test-mirror')
try:
os.chdir(stage.path)
spack.mirror.create(
mirror_root, self.repos, no_checksum=True)
# Stage directory exists
self.assertTrue(os.path.isdir(mirror_root))
# subdirs for each package
for name in self.repos:
subdir = join_path(mirror_root, name)
self.assertTrue(os.path.isdir(subdir))
files = os.listdir(subdir)
self.assertEqual(len(files), 1)
# Decompress archive in the mirror
archive = files[0]
archive_path = join_path(subdir, archive)
decomp = decompressor_for(archive_path)
with working_dir(subdir):
decomp(archive_path)
# Find the untarred archive directory.
files = os.listdir(subdir)
self.assertEqual(len(files), 2)
self.assertTrue(archive in files)
files.remove(archive)
expanded_archive = join_path(subdir, files[0])
self.assertTrue(os.path.isdir(expanded_archive))
# Compare the original repo with the expanded archive
repo = self.repos[name]
if 'svn' not in name:
original_path = repo.path
else:
co = 'checked_out'
svn('checkout', repo.url, co)
original_path = join_path(subdir, co)
dcmp = dircmp(original_path, expanded_archive)
# make sure there are no new files in the expanded tarball
self.assertFalse(dcmp.right_only)
self.assertTrue(all(l in exclude for l in dcmp.left_only))
finally:
stage.destroy()
def test_git_mirror(self):
self.set_up_package('git-test', MockGitRepo, 'git')
self.check_mirror()
def test_svn_mirror(self):
self.set_up_package('svn-test', MockSvnRepo, 'svn')
self.check_mirror()
def test_hg_mirror(self):
self.set_up_package('hg-test', MockHgRepo, 'hg')
self.check_mirror()
def test_url_mirror(self):
self.set_up_package('trivial_install_test_package', MockArchive, 'url')
self.check_mirror()
def test_all_mirror(self):
self.set_up_package('git-test', MockGitRepo, 'git')
self.set_up_package('svn-test', MockSvnRepo, 'svn')
self.set_up_package('hg-test', MockHgRepo, 'hg')
self.set_up_package('trivial_install_test_package', MockArchive, 'url')
self.check_mirror()
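The final comparison in `check_mirror` relies on `filecmp.dircmp`: nothing new may appear in the expanded tarball, and anything missing from it must be VCS metadata. A self-contained sketch of that invariant, using throwaway temp directories:

```python
import os
import shutil
import tempfile
from filecmp import dircmp

# paths in repos that shouldn't be in the mirror tarballs
EXCLUDE = ['.hg', '.git', '.svn']

def mirror_matches(original, mirrored):
    """True if the mirror adds no files and omits only VCS metadata."""
    dcmp = dircmp(original, mirrored)
    return not dcmp.right_only and all(l in EXCLUDE for l in dcmp.left_only)

original = tempfile.mkdtemp()
mirrored = tempfile.mkdtemp()
try:
    # same payload file on both sides
    for d in (original, mirrored):
        open(os.path.join(d, 'r0_file'), 'w').close()
    # VCS metadata exists only in the original repo
    os.mkdir(os.path.join(original, '.git'))
    assert mirror_matches(original, mirrored)
finally:
    shutil.rmtree(original)
    shutil.rmtree(mirrored)
```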

@@ -0,0 +1,197 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import shutil
from contextlib import closing
from llnl.util.filesystem import *
import spack
from spack.version import ver
from spack.stage import Stage
from spack.util.executable import which
#
# VCS Systems used by mock repo code.
#
git = which('git', required=True)
svn = which('svn', required=True)
svnadmin = which('svnadmin', required=True)
hg = which('hg', required=True)
tar = which('tar', required=True)
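These `which` calls come from `spack.util.executable`, and `required=True` makes them raise when nothing is found on `PATH`, so mock repo setup fails fast on machines without git/svn/hg. A rough stdlib equivalent of the lookup (a hypothetical stand-in, not Spack's `Executable` wrapper, which returns a callable):

```python
import shutil

def which(name, required=False):
    """Return the path to an executable on PATH, or None; raise if required."""
    path = shutil.which(name)
    if path is None and required:
        raise RuntimeError("%s is required but was not found on PATH" % name)
    return path

assert which('definitely-not-a-real-tool-xyz') is None
```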
class MockRepo(object):
def __init__(self, stage_name, repo_name):
"""This creates a stage where some archive/repo files can be staged
for testing spack's fetch strategies."""
# Stage where this repo has been created
self.stage = Stage(stage_name)
# Full path to the repo within the stage.
self.path = join_path(self.stage.path, repo_name)
mkdirp(self.path)
class MockArchive(MockRepo):
"""Creates a very simple archive directory with a configure script and a
makefile that installs to a prefix. Tars it up into an archive."""
def __init__(self):
repo_name = 'mock-archive-repo'
super(MockArchive, self).__init__('mock-archive-stage', repo_name)
with working_dir(self.path):
configure = join_path(self.path, 'configure')
with closing(open(configure, 'w')) as cfg_file:
cfg_file.write(
"#!/bin/sh\n"
"prefix=$(echo $1 | sed 's/--prefix=//')\n"
"cat > Makefile <<EOF\n"
"all:\n"
"\techo Building...\n\n"
"install:\n"
"\tmkdir -p $prefix\n"
"\ttouch $prefix/dummy_file\n"
"EOF\n")
os.chmod(configure, 0755)
with working_dir(self.stage.path):
archive_name = "%s.tar.gz" % repo_name
tar('-czf', archive_name, repo_name)
self.archive_path = join_path(self.stage.path, archive_name)
self.url = 'file://' + self.archive_path
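`MockArchive` shells out to the external `tar` command; the same archive can be produced portably with the `tarfile` module. A sketch under the same layout assumptions (a `mock-archive-repo` directory containing one configure script):

```python
import os
import shutil
import tarfile
import tempfile

def create_mock_archive(stage):
    """Create mock-archive-repo/configure under stage and tar it up."""
    repo = os.path.join(stage, 'mock-archive-repo')
    os.mkdir(repo)
    configure = os.path.join(repo, 'configure')
    with open(configure, 'w') as f:
        f.write("#!/bin/sh\necho configured\n")
    os.chmod(configure, 0o755)
    archive = os.path.join(stage, 'mock-archive-repo.tar.gz')
    with tarfile.open(archive, 'w:gz') as tar:
        # arcname keeps paths in the tarball relative to the stage
        tar.add(repo, arcname='mock-archive-repo')
    return archive

stage = tempfile.mkdtemp()
try:
    with tarfile.open(create_mock_archive(stage)) as tar:
        assert 'mock-archive-repo/configure' in tar.getnames()
finally:
    shutil.rmtree(stage)
```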
class MockVCSRepo(MockRepo):
def __init__(self, stage_name, repo_name):
"""This creates a stage and a repo directory within the stage."""
super(MockVCSRepo, self).__init__(stage_name, repo_name)
# Names for the rev0 & rev1 files to be created in the repo
self.r0_file = 'r0_file'
self.r1_file = 'r1_file'
class MockGitRepo(MockVCSRepo):
def __init__(self):
super(MockGitRepo, self).__init__('mock-git-stage', 'mock-git-repo')
with working_dir(self.path):
git('init')
# r0 is just the first commit
touch(self.r0_file)
git('add', self.r0_file)
git('commit', '-m', 'mock-git-repo r0')
self.branch = 'test-branch'
self.branch_file = 'branch_file'
git('branch', self.branch)
self.tag_branch = 'tag-branch'
self.tag_file = 'tag_file'
git('branch', self.tag_branch)
# Check out first branch
git('checkout', self.branch)
touch(self.branch_file)
git('add', self.branch_file)
git('commit', '-m', 'r1 test branch')
# Check out a second branch and tag it
git('checkout', self.tag_branch)
touch(self.tag_file)
git('add', self.tag_file)
git('commit', '-m', 'tag test branch')
self.tag = 'test-tag'
git('tag', self.tag)
git('checkout', 'master')
# r1 is the same commit as the tip of the test branch
self.r1 = self.rev_hash(self.branch)
self.r1_file = self.branch_file
self.url = self.path
def rev_hash(self, rev):
return git('rev-parse', rev, return_output=True).strip()
class MockSvnRepo(MockVCSRepo):
def __init__(self):
super(MockSvnRepo, self).__init__('mock-svn-stage', 'mock-svn-repo')
self.url = 'file://' + self.path
with working_dir(self.stage.path):
svnadmin('create', self.path)
tmp_path = join_path(self.stage.path, 'tmp-path')
mkdirp(tmp_path)
with working_dir(tmp_path):
touch(self.r0_file)
svn('import', tmp_path, self.url, '-m', 'Initial import r0')
shutil.rmtree(tmp_path)
svn('checkout', self.url, tmp_path)
with working_dir(tmp_path):
touch(self.r1_file)
svn('add', self.r1_file)
svn('ci', '-m', 'second revision r1')
shutil.rmtree(tmp_path)
self.r0 = '1'
self.r1 = '2'
class MockHgRepo(MockVCSRepo):
def __init__(self):
super(MockHgRepo, self).__init__('mock-hg-stage', 'mock-hg-repo')
self.url = 'file://' + self.path
with working_dir(self.path):
hg('init')
touch(self.r0_file)
hg('add', self.r0_file)
hg('commit', '-m', 'revision 0', '-u', 'test')
self.r0 = self.get_rev()
touch(self.r1_file)
hg('add', self.r1_file)
hg('commit', '-m', 'revision 1', '-u', 'test')
self.r1 = self.get_rev()
def get_rev(self):
"""Get current mercurial revision."""
return hg('id', '-i', return_output=True).strip()

@@ -73,11 +73,11 @@ def test_mpi_version(self):
def test_undefined_mpi_version(self):
# This currently fails because provides() doesn't do
# the right thing with undefined version ranges.
# TODO: fix this.
pkg = spack.db.get('multimethod^mpich@0.4')
self.assertEqual(pkg.mpi_version(), 0)
self.assertEqual(pkg.mpi_version(), 1)
pkg = spack.db.get('multimethod^mpich@1.4')
self.assertEqual(pkg.mpi_version(), 1)
def test_default_works(self):

@@ -56,8 +56,8 @@ def test_get_all_mock_packages(self):
def test_url_versions(self):
"""Check URLs for regular packages, if they are explicitly defined."""
for pkg in spack.db.all_packages():
for v, vdesc in pkg.versions.items():
if vdesc.url:
for v, vattrs in pkg.versions.items():
if 'url' in vattrs:
# If there is a url for the version check it.
v_url = pkg.url_for_version(v)
self.assertEqual(vdesc.url, v_url)
self.assertEqual(vattrs['url'], v_url)

@@ -146,7 +146,7 @@ def check_fetch(self, stage, stage_name):
stage_path = self.get_stage_path(stage, stage_name)
self.assertTrue(archive_name in os.listdir(stage_path))
self.assertEqual(join_path(stage_path, archive_name),
stage.archive_file)
stage.fetcher.archive_file)
def check_expand_archive(self, stage, stage_name):
@@ -156,7 +156,7 @@ def check_expand_archive(self, stage, stage_name):
self.assertEqual(
join_path(stage_path, archive_dir),
stage.expanded_archive_path)
stage.source_path)
readme = join_path(stage_path, archive_dir, readme_name)
self.assertTrue(os.path.isfile(readme))
@@ -170,7 +170,7 @@ def check_chdir(self, stage, stage_name):
self.assertEqual(os.path.realpath(stage_path), os.getcwd())
def check_chdir_to_archive(self, stage, stage_name):
def check_chdir_to_source(self, stage, stage_name):
stage_path = self.get_stage_path(stage, stage_name)
self.assertEqual(
join_path(os.path.realpath(stage_path), archive_dir),
@@ -271,9 +271,9 @@ def test_expand_archive(self):
self.check_fetch(stage, stage_name)
stage.expand_archive()
stage.chdir_to_archive()
stage.chdir_to_source()
self.check_expand_archive(stage, stage_name)
self.check_chdir_to_archive(stage, stage_name)
self.check_chdir_to_source(stage, stage_name)
stage.destroy()
self.check_destroy(stage, stage_name)
@@ -284,24 +284,24 @@ def test_restage(self):
stage.fetch()
stage.expand_archive()
stage.chdir_to_archive()
stage.chdir_to_source()
self.check_expand_archive(stage, stage_name)
self.check_chdir_to_archive(stage, stage_name)
self.check_chdir_to_source(stage, stage_name)
# Try to make a file in the old archive dir
with closing(open('foobar', 'w')) as file:
file.write("this file is to be destroyed.")
self.assertTrue('foobar' in os.listdir(stage.expanded_archive_path))
self.assertTrue('foobar' in os.listdir(stage.source_path))
# Make sure the file is not there after restage.
stage.restage()
self.check_chdir(stage, stage_name)
self.check_fetch(stage, stage_name)
stage.chdir_to_archive()
self.check_chdir_to_archive(stage, stage_name)
self.assertFalse('foobar' in os.listdir(stage.expanded_archive_path))
stage.chdir_to_source()
self.check_chdir_to_source(stage, stage_name)
self.assertFalse('foobar' in os.listdir(stage.source_path))
stage.destroy()
self.check_destroy(stage, stage_name)

@@ -0,0 +1,123 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import re
import unittest
import shutil
import tempfile
from contextlib import closing
from llnl.util.filesystem import *
import spack
from spack.version import ver
from spack.stage import Stage
from spack.util.executable import which
from spack.test.mock_packages_test import *
from spack.test.mock_repo import svn, MockSvnRepo
class SvnFetchTest(MockPackagesTest):
"""Tests fetching from a dummy svn repository."""
def setUp(self):
"""Create an svn repository with two revisions."""
super(SvnFetchTest, self).setUp()
self.repo = MockSvnRepo()
spec = Spec('svn-test')
spec.concretize()
self.pkg = spack.db.get(spec, new=True)
def tearDown(self):
"""Destroy the stage space used by this test."""
super(SvnFetchTest, self).tearDown()
if self.repo.stage is not None:
self.repo.stage.destroy()
self.pkg.do_clean_dist()
def assert_rev(self, rev):
"""Check that the current revision is equal to the supplied rev."""
def get_rev():
output = svn('info', return_output=True)
self.assertTrue("Revision" in output)
for line in output.split('\n'):
match = re.match(r'Revision: (\d+)', line)
if match:
return match.group(1)
self.assertEqual(get_rev(), rev)
def try_fetch(self, rev, test_file, args):
"""Tries to:
1. Fetch the repo using a fetch strategy constructed with
supplied args.
2. Check if the test_file is in the checked out repository.
3. Assert that the repository is at the revision supplied.
4. Add and remove some files, then reset the repo, and
ensure it's all there again.
"""
self.pkg.versions[ver('svn')] = args
self.pkg.do_stage()
self.assert_rev(rev)
file_path = join_path(self.pkg.stage.source_path, test_file)
self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))
os.unlink(file_path)
self.assertFalse(os.path.isfile(file_path))
untracked = 'foobarbaz'
touch(untracked)
self.assertTrue(os.path.isfile(untracked))
self.pkg.do_clean_work()
self.assertFalse(os.path.isfile(untracked))
self.assertTrue(os.path.isdir(self.pkg.stage.source_path))
self.assertTrue(os.path.isfile(file_path))
self.assert_rev(rev)
def test_fetch_default(self):
"""Test a default checkout and make sure it's on rev 1"""
self.try_fetch(self.repo.r1, self.repo.r1_file, {
'svn' : self.repo.url
})
def test_fetch_r0(self):
"""Test fetching an older revision (0)."""
self.try_fetch(self.repo.r0, self.repo.r0_file, {
'svn' : self.repo.url,
'revision' : self.repo.r0
})
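`assert_rev` above scans `svn info` output for its `Revision` line; the parsing can be exercised on canned output without a real repository:

```python
import re

def parse_svn_revision(info_output):
    """Extract the revision number from `svn info` output, as a string."""
    for line in info_output.split('\n'):
        match = re.match(r'Revision: (\d+)', line)
        if match:
            return match.group(1)
    return None

sample = ("Path: .\n"
          "URL: file:///tmp/mock-svn-repo\n"
          "Revision: 2\n"
          "Node Kind: directory\n")
assert parse_svn_revision(sample) == '2'
```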

@@ -0,0 +1,91 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""\
Tests ability of spack to extrapolate URL versions from existing versions.
"""
import spack
import spack.url as url
from spack.spec import Spec
from spack.version import ver
from spack.test.mock_packages_test import *
class UrlExtrapolateTest(MockPackagesTest):
def test_known_version(self):
d = spack.db.get('dyninst')
self.assertEqual(
d.url_for_version('8.2'), 'http://www.paradyn.org/release8.2/DyninstAPI-8.2.tgz')
self.assertEqual(
d.url_for_version('8.1.2'), 'http://www.paradyn.org/release8.1.2/DyninstAPI-8.1.2.tgz')
self.assertEqual(
d.url_for_version('8.1.1'), 'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.tgz')
def test_extrapolate_version(self):
d = spack.db.get('dyninst')
# Nearest URL for 8.1.1.5 is 8.1.1, and the URL there is
# release8.1/DyninstAPI-8.1.1.tgz. Only the last part matches
# the version, so only extrapolate the last part. Obviously
# dyninst has ambiguous URL versions, but we want to make sure
# extrapolation works in a well-defined way.
self.assertEqual(
d.url_for_version('8.1.1.5'), 'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.5.tgz')
# 8.2 matches both the release8.2 component and the DyninstAPI-8.2 component.
# Extrapolation should replace both with the new version.
# TODO: figure out a consistent policy for this.
# self.assertEqual(
# d.url_for_version('8.2.3'), 'http://www.paradyn.org/release8.2.3/DyninstAPI-8.2.3.tgz')
def test_with_package(self):
d = spack.db.get('dyninst@8.2')
self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.2/DyninstAPI-8.2.tgz')
d = spack.db.get('dyninst@8.1.2')
self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.1.2/DyninstAPI-8.1.2.tgz')
d = spack.db.get('dyninst@8.1.1')
self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.tgz')
def test_concrete_package(self):
s = Spec('dyninst@8.2')
s.concretize()
d = spack.db.get(s)
self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.2/DyninstAPI-8.2.tgz')
s = Spec('dyninst@8.1.2')
s.concretize()
d = spack.db.get(s)
self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.1.2/DyninstAPI-8.1.2.tgz')
s = Spec('dyninst@8.1.1')
s.concretize()
d = spack.db.get(s)
self.assertEqual(d.fetcher.url, 'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.tgz')
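The extrapolation rule these tests pin down, replace only the trailing (version-matching) part of the nearest known URL, reduces to a last-occurrence substitution. A sketch of that rule (not the real `spack.url` implementation, which works from parsed version indices):

```python
def extrapolate(known_url, known_version, new_version):
    """Replace only the last occurrence of the known version, so
    'release8.1/DyninstAPI-8.1.1.tgz' keeps its release8.1 directory."""
    head, sep, tail = known_url.rpartition(known_version)
    if not sep:
        raise ValueError("version %s not in %s" % (known_version, known_url))
    return head + new_version + tail

assert extrapolate(
    'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.tgz',
    '8.1.1', '8.1.1.5') == \
    'http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.5.tgz'
```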

@@ -281,11 +281,16 @@ def test_synergy_version(self):
'synergy', '1.3.6p2',
'http://synergy.googlecode.com/files/synergy-1.3.6p2-MacOSX-Universal.zip')
def test_mvapich2_version(self):
def test_mvapich2_19_version(self):
self.check(
'mvapich2', '1.9',
'http://mvapich.cse.ohio-state.edu/download/mvapich2/mv2/mvapich2-1.9.tgz')
def test_mvapich2_20_version(self):
self.check(
'mvapich2', '2.0',
'http://mvapich.cse.ohio-state.edu/download/mvapich/mv2/mvapich2-2.0.tar.gz')
def test_hdf5_version(self):
self.check(
'hdf5', '1.8.13',

@@ -0,0 +1,73 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""\
This test does sanity checks on substituting new versions into URLs
"""
import unittest
import spack
import spack.url as url
from spack.packages import PackageDB
class PackageSanityTest(unittest.TestCase):
def test_hypre_url_substitution(self):
base = "https://computation-rnd.llnl.gov/linear_solvers/download/hypre-2.9.0b.tar.gz"
self.assertEqual(url.substitute_version(base, '2.9.0b'), base)
self.assertEqual(
url.substitute_version(base, '2.8.0b'),
"https://computation-rnd.llnl.gov/linear_solvers/download/hypre-2.8.0b.tar.gz")
self.assertEqual(
url.substitute_version(base, '2.7.0b'),
"https://computation-rnd.llnl.gov/linear_solvers/download/hypre-2.7.0b.tar.gz")
self.assertEqual(
url.substitute_version(base, '2.6.0b'),
"https://computation-rnd.llnl.gov/linear_solvers/download/hypre-2.6.0b.tar.gz")
self.assertEqual(
url.substitute_version(base, '1.14.0b'),
"https://computation-rnd.llnl.gov/linear_solvers/download/hypre-1.14.0b.tar.gz")
self.assertEqual(
url.substitute_version(base, '1.13.0b'),
"https://computation-rnd.llnl.gov/linear_solvers/download/hypre-1.13.0b.tar.gz")
self.assertEqual(
url.substitute_version(base, '2.0.0'),
"https://computation-rnd.llnl.gov/linear_solvers/download/hypre-2.0.0.tar.gz")
self.assertEqual(
url.substitute_version(base, '1.6.0'),
"https://computation-rnd.llnl.gov/linear_solvers/download/hypre-1.6.0.tar.gz")
def test_otf2_url_substitution(self):
base = "http://www.vi-hps.org/upload/packages/otf2/otf2-1.4.tar.gz"
self.assertEqual(url.substitute_version(base, '1.4'), base)
self.assertEqual(
url.substitute_version(base, '1.3.1'),
"http://www.vi-hps.org/upload/packages/otf2/otf2-1.3.1.tar.gz")
self.assertEqual(
url.substitute_version(base, '1.2.1'),
"http://www.vi-hps.org/upload/packages/otf2/otf2-1.2.1.tar.gz")
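`spack.url.substitute_version` locates the version via parsed string indices; the behavior these tests pin down can be approximated with a regex over the archive name (an illustrative sketch, not the real implementation):

```python
import re

def substitute_version(url, new_version):
    """Swap the version token that immediately precedes the archive
    extension; a rough stand-in for spack.url.substitute_version."""
    return re.sub(r'\d+(?:\.\d+)*[a-z]?(?=\.tar\.gz$|\.tgz$|\.zip$)',
                  new_version, url)

base = "http://www.vi-hps.org/upload/packages/otf2/otf2-1.4.tar.gz"
assert substitute_version(base, '1.3.1') == \
    "http://www.vi-hps.org/upload/packages/otf2/otf2-1.3.1.tar.gz"
```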

@@ -95,6 +95,10 @@ def check_intersection(self, expected, a, b):
self.assertEqual(ver(expected), ver(a).intersection(ver(b)))
def check_union(self, expected, a, b):
self.assertEqual(ver(expected), ver(a).union(ver(b)))
def test_two_segments(self):
self.assert_ver_eq('1.0', '1.0')
self.assert_ver_lt('1.0', '2.0')
@@ -217,12 +221,16 @@ def test_contains(self):
self.assert_in('1.3.5-7', '1.2:1.4')
self.assert_not_in('1.1', '1.2:1.4')
self.assert_not_in('1.5', '1.2:1.4')
self.assert_not_in('1.4.2', '1.2:1.4')
self.assert_in('1.4.2', '1.2:1.4')
self.assert_not_in('1.4.2', '1.2:1.4.0')
self.assert_in('1.2.8', '1.2.7:1.4')
self.assert_in('1.2.7:1.4', ':')
self.assert_not_in('1.2.5', '1.2.7:1.4')
self.assert_not_in('1.4.1', '1.2.7:1.4')
self.assert_in('1.4.1', '1.2.7:1.4')
self.assert_not_in('1.4.1', '1.2.7:1.4.0')
def test_in_list(self):
@@ -254,6 +262,17 @@ def test_ranges_overlap(self):
self.assert_overlaps('1.6:1.9', ':')
def test_overlap_with_containment(self):
self.assert_in('1.6.5', '1.6')
self.assert_in('1.6.5', ':1.6')
self.assert_overlaps('1.6.5', ':1.6')
self.assert_overlaps(':1.6', '1.6.5')
self.assert_not_in(':1.6', '1.6.5')
self.assert_in('1.6.5', ':1.6')
def test_lists_overlap(self):
self.assert_overlaps('1.2b:1.7,5', '1.6:1.9,1')
self.assert_overlaps('1,2,3,4,5', '3,4,5,6,7')
@@ -311,6 +330,32 @@ def test_intersection(self):
self.check_intersection(['0:1'], [':'], ['0:1'])
def test_intersect_with_containment(self):
self.check_intersection('1.6.5', '1.6.5', ':1.6')
self.check_intersection('1.6.5', ':1.6', '1.6.5')
self.check_intersection('1.6:1.6.5', ':1.6.5', '1.6')
self.check_intersection('1.6:1.6.5', '1.6', ':1.6.5')
def test_union_with_containment(self):
self.check_union(':1.6', '1.6.5', ':1.6')
self.check_union(':1.6', ':1.6', '1.6.5')
self.check_union(':1.6', ':1.6.5', '1.6')
self.check_union(':1.6', '1.6', ':1.6.5')
def test_union_of_ranges(self):
self.check_union(':', '1.0:', ':2.0')
self.check_union('1:4', '1:3', '2:4')
self.check_union('1:4', '2:4', '1:3')
# Tests successor/predecessor case.
self.check_union('1:4', '1:2', '3:4')
def test_basic_version_satisfaction(self):
self.assert_satisfies('4.7.3', '4.7.3')
@@ -326,6 +371,7 @@ def test_basic_version_satisfaction(self):
self.assert_does_not_satisfy('4.8', '4.9')
self.assert_does_not_satisfy('4', '4.9')
def test_basic_version_satisfaction_in_lists(self):
self.assert_satisfies(['4.7.3'], ['4.7.3'])
@@ -341,6 +387,7 @@ def test_basic_version_satisfaction_in_lists(self):
self.assert_does_not_satisfy(['4.8'], ['4.9'])
self.assert_does_not_satisfy(['4'], ['4.9'])
def test_version_range_satisfaction(self):
self.assert_satisfies('4.7b6', '4.3:4.7')
self.assert_satisfies('4.3.0', '4.3:4.7')
@@ -352,6 +399,7 @@ def test_version_range_satisfaction(self):
self.assert_satisfies('4.7b6', '4.3:4.7')
self.assert_does_not_satisfy('4.8.0', '4.3:4.7')
def test_version_range_satisfaction_in_lists(self):
self.assert_satisfies(['4.7b6'], ['4.3:4.7'])
self.assert_satisfies(['4.3.0'], ['4.3:4.7'])


@ -78,6 +78,26 @@ def __init__(self, path):
"Couldn't parse package name in: " + path, path)
def find_list_url(url):
"""Finds a good list URL for the supplied URL. This depends on
the site. By default, just assumes that a good list URL is the
dirname of an archive path. For GitHub URLs, this returns the
URL of the project's releases page.
"""
url_types = [
# e.g. https://github.com/scalability-llnl/callpath/archive/v1.0.1.tar.gz
(r'^(https://github.com/[^/]+/[^/]+)/archive/', lambda m: m.group(1) + '/releases')
]
for pattern, fun in url_types:
match = re.search(pattern, url)
if match:
return fun(match)
else:
return os.path.dirname(url)
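The GitHub special case above can be exercised in isolation. This standalone sketch restates the same heuristic (the function body is reproduced outside Spack, so the surrounding module context is an assumption):

```python
import os
import re

def find_list_url(url):
    # GitHub archive URLs map to the project's releases page; any
    # other URL falls back to the dirname of the archive path.
    m = re.search(r'^(https://github.com/[^/]+/[^/]+)/archive/', url)
    if m:
        return m.group(1) + '/releases'
    return os.path.dirname(url)
```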
def parse_version_string_with_indices(path):
"""Try to extract a version string from a filename or URL. This is taken
largely from Homebrew's Version class."""


@ -23,6 +23,7 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import re
import os
from itertools import product
from spack.util.executable import which
@ -60,6 +61,11 @@ def strip_extension(path):
def extension(path):
"""Get the archive extension for a path."""
# Strip sourceforge suffix.
if re.search(r'((?:sourceforge.net|sf.net)/.*)/download$', path):
path = os.path.dirname(path)
for type in ALLOWED_ARCHIVE_TYPES:
suffix = r'\.%s$' % type
if re.search(suffix, path):
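The SourceForge case can be checked on its own; this sketch (helper name is an assumption, extracted from the logic above) shows why the trailing `/download` must be dropped before the extension match can succeed:

```python
import os
import re

def strip_download_suffix(path):
    # SourceForge mirrors serve archives at '<file>.tar.gz/download';
    # drop the suffix so the real archive extension is the last component.
    if re.search(r'((?:sourceforge.net|sf.net)/.*)/download$', path):
        path = os.path.dirname(path)
    return path
```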


@ -50,10 +50,6 @@
from functools import wraps
from external.functools import total_ordering
import llnl.util.compare.none_high as none_high
import llnl.util.compare.none_low as none_low
import spack.error
# Valid version characters
VALID_VERSION = r'[A-Za-z0-9_.-]'
@ -256,18 +252,39 @@ def __hash__(self):
@coerced
def __contains__(self, other):
return self == other
if other is None:
return False
return other.version[:len(self.version)] == self.version
def is_predecessor(self, other):
"""True if the other version is the immediate predecessor of this one.
That is, NO versions v exist such that:
(self < v < other and v not in self).
"""
if len(self.version) != len(other.version):
return False
sl = self.version[-1]
ol = other.version[-1]
return type(sl) == int and type(ol) == int and (ol - sl == 1)
def is_successor(self, other):
return other.is_predecessor(self)
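On version tuples the predecessor test reduces to integer arithmetic on the final component; a minimal sketch operating on plain tuples rather than Version objects:

```python
def is_predecessor(a, b):
    # b immediately follows a only if both tuples have the same length
    # and their final components are consecutive integers: nothing can
    # fall strictly between (1, 2) and (1, 3), but (1, 2, 1) does fall
    # between (1, 2) and (1, 2, 2).
    if len(a) != len(b):
        return False
    return (isinstance(a[-1], int) and isinstance(b[-1], int)
            and b[-1] - a[-1] == 1)
```

This is what lets `union()` below merge the disjoint ranges `1:2` and `3:4` into `1:4`.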
@coerced
def overlaps(self, other):
return self == other
return self in other or other in self
@coerced
def union(self, other):
if self == other:
if self == other or other in self:
return self
elif self in other:
return other
else:
return VersionList([self, other])
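Prefix containment drives this union: the less specific version absorbs the more specific one. A tuple-level sketch of the same rule (list return type is a simplification of VersionList):

```python
def union_versions(a, b):
    # '1.6' contains '1.6.5' (prefix match), so their union is just
    # '1.6'; unrelated versions stay as a two-element list.
    if a == b or b[:len(a)] == a:
        return [a]
    if a[:len(b)] == b:
        return [b]
    return [a, b]
```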
@ -290,7 +307,7 @@ def __init__(self, start, end):
self.start = start
self.end = end
if start and end and end < start:
raise ValueError("Invalid Version range: %s" % self)
@ -312,9 +329,12 @@ def __lt__(self, other):
if other is None:
return False
return (none_low.lt(self.start, other.start) or
(self.start == other.start and
none_high.lt(self.end, other.end)))
s, o = self, other
if s.start != o.start:
return s.start is None or (o.start is not None and s.start < o.start)
return ((s.end != o.end and o.end is None) or
(s.end is not None and s.end < o.end))
@coerced
@ -335,8 +355,23 @@ def concrete(self):
@coerced
def __contains__(self, other):
return (none_low.ge(other.start, self.start) and
none_high.le(other.end, self.end))
if other is None:
return False
in_lower = (self.start == other.start or
self.start is None or
(other.start is not None and (
self.start < other.start or
other.start in self.start)))
if not in_lower:
return False
in_upper = (self.end == other.end or
self.end is None or
(other.end is not None and (
self.end > other.end or
other.end in self.end)))
return in_upper
@coerced
@ -372,27 +407,75 @@ def satisfies(self, other):
@coerced
def overlaps(self, other):
return (other in self or self in other or
((self.start == None or other.end is None or
self.start <= other.end) and
(other.start is None or self.end == None or
other.start <= self.end)))
return ((self.start is None or other.end is None or
self.start <= other.end or
other.end in self.start or self.start in other.end) and
(other.start is None or self.end is None or
other.start <= self.end or
other.start in self.end or self.end in other.start))
@coerced
def union(self, other):
if self.overlaps(other):
return VersionRange(none_low.min(self.start, other.start),
none_high.max(self.end, other.end))
else:
if not self.overlaps(other):
if (self.end is not None and other.start is not None and
self.end.is_predecessor(other.start)):
return VersionRange(self.start, other.end)
if (other.end is not None and self.start is not None and
other.end.is_predecessor(self.start)):
return VersionRange(other.start, self.end)
return VersionList([self, other])
# if we're here, then we know the ranges overlap.
if self.start is None or other.start is None:
start = None
else:
start = self.start
# TODO: See note in intersection() about < and in discrepancy.
if self.start in other.start or other.start < self.start:
start = other.start
if self.end is None or other.end is None:
end = None
else:
end = self.end
# TODO: See note in intersection() about < and in discrepancy.
if other.end not in self.end:
if end in other.end or other.end > self.end:
end = other.end
return VersionRange(start, end)
@coerced
def intersection(self, other):
if self.overlaps(other):
return VersionRange(none_low.max(self.start, other.start),
none_high.min(self.end, other.end))
if self.start is None:
start = other.start
else:
start = self.start
if other.start is not None:
if other.start > start or other.start in start:
start = other.start
if self.end is None:
end = other.end
else:
end = self.end
# TODO: does this make sense?
# This is tricky:
# 1.6.5 in 1.6 = True (1.6.5 is more specific)
# 1.6 < 1.6.5 = True (lexicographic)
# Should 1.6 NOT be less than 1.6.5? Hm.
# Here we test (not end in other.end) first to avoid paradox.
if other.end is not None and end not in other.end:
if other.end < end or other.end in end:
end = other.end
return VersionRange(start, end)
else:
return VersionList()
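The end-bound "paradox" noted in the comment can be reproduced on plain tuples: lexicographically `(1, 6) < (1, 6, 5)` even though 1.6.5 lies *inside* 1.6, which is why containment is tested before `<`. A sketch of just the upper-bound step (helper names are illustrative):

```python
def contains(outer, inner):
    # 1.6.5 lies inside 1.6: the less specific tuple is a prefix.
    return inner[:len(outer)] == outer

def tighter_end(end, other_end):
    # Mirrors the upper-bound step of intersection(): only consider
    # replacing 'end' when it is not already inside 'other_end', then
    # prefer the smaller or the more specific bound.
    if not contains(other_end, end):
        if other_end < end or contains(end, other_end):
            end = other_end
    return end
```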


@ -46,8 +46,17 @@ set _sp_spec=""
set _sp_modtype = ""
switch ($_sp_subcommand)
case cd:
shift _sp_args # get rid of 'cd'
set _sp_arg=""
[ $#_sp_args -gt 0 ] && set _sp_arg = ($_sp_args[1])
shift _sp_args
cd `spack location $_sp_args`
if ( "$_sp_arg" == "-h" ) then
\spack cd -h
else
cd `\spack location $_sp_arg $_sp_args`
endif
breaksw
case use:
case unuse:


@ -76,7 +76,12 @@ function spack {
# command.
case $_sp_subcommand in
"cd")
cd $(spack location "$@")
_sp_arg="$1"; shift
if [ "$_sp_arg" = "-h" ]; then
command spack cd -h
else
cd $(spack location $_sp_arg "$@")
fi
return
;;
"use"|"unuse"|"load"|"unload")


@ -26,11 +26,14 @@
class Dyninst(Package):
homepage = "https://paradyn.org"
url = "http://www.dyninst.org/sites/default/files/downloads/dyninst/8.1.2/DyninstAPI-8.1.2.tgz"
list_url = "http://www.dyninst.org/downloads/dyninst-8.x"
url = "http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.tgz"
version('8.1.2', 'bf03b33375afa66fe0efa46ce3f4b17a')
version('8.1.1', '1f8743e3a5662b25ce64a7edf647e77d')
version('8.2', 'cxyzab',
url='http://www.paradyn.org/release8.2/DyninstAPI-8.2.tgz')
version('8.1.2', 'bcxyza',
url='http://www.paradyn.org/release8.1.2/DyninstAPI-8.1.2.tgz')
version('8.1.1', 'abcxyz',
url='http://www.paradyn.org/release8.1/DyninstAPI-8.1.1.tgz')
depends_on("libelf")
depends_on("libdwarf")


@ -0,0 +1,10 @@
from spack import *
class GitTest(Package):
"""Mock package that uses git for fetching."""
homepage = "http://www.git-fetch-example.com"
version('git', git='to-be-filled-in-by-test')
def install(self, spec, prefix):
pass


@ -0,0 +1,10 @@
from spack import *
class HgTest(Package):
"""Test package that does fetching with mercurial."""
homepage = "http://www.hg-fetch-example.com"
version('hg', hg='to-be-filled-in-by-test')
def install(self, spec, prefix):
pass


@ -37,7 +37,7 @@ class Mpich(Package):
version('3.0', 'foobarbaz')
provides('mpi@:3', when='@3:')
provides('mpi@:1', when='@1:')
provides('mpi@:1', when='@:1')
def install(self, spec, prefix):
pass


@ -0,0 +1,10 @@
from spack import *
class SvnTest(Package):
"""Mock package that uses svn for fetching."""
url = "http://www.example.com/svn-test-1.0.tar.gz"
version('svn', 'to-be-filled-in-by-test')
def install(self, spec, prefix):
pass


@ -0,0 +1,21 @@
from spack import *
class Imagemagick(Package):
"""ImageMagick is a image processing library"""
homepage = "http://www.imagemagic.org"
url = "http://www.imagemagick.org/download/ImageMagick-6.8.9-9.tar.gz"
version('6.8.9-9', 'e63fed3e3550851328352c708f800676')
depends_on('libtool')
depends_on('jpeg')
depends_on('libpng')
depends_on('freetype')
depends_on('fontconfig')
# depends_on('libtiff')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -0,0 +1,14 @@
from spack import *
class Autoconf(Package):
"""Autoconf -- system configuration part of autotools"""
homepage = "https://www.gnu.org/software/autoconf/"
url = "http://ftp.gnu.org/gnu/autoconf/autoconf-2.69.tar.gz"
version('2.69', '82d05e03b93e45f5a39b828dc9c6c29b')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -22,49 +22,30 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
Functions for comparing values that may potentially be None.
These none_low functions consider None as less than all other values.
"""
from spack import *
# Preserve builtin min and max functions
_builtin_min = min
_builtin_max = max
class Automaded(Package):
"""AutomaDeD (Automata-based Debugging for Dissimilar parallel
tasks) is a tool for automatic diagnosis of performance and
correctness problems in MPI applications. It creates
control-flow models of each MPI process and, when a failure
occurs, these models are leveraged to find the origin of
problems automatically. MPI calls are intercepted (using
wrappers) to create the models. When an MPI application hangs,
AutomaDeD creates a progress-dependence graph that helps
find the process (or group of processes) that caused the hang.
"""
homepage = "https://github.com/scalability-llnl/AutomaDeD"
url = "https://github.com/scalability-llnl/AutomaDeD/archive/v1.0.tar.gz"
def lt(lhs, rhs):
"""Less-than comparison. None is lower than any value."""
return lhs != rhs and (lhs is None or (rhs is not None and lhs < rhs))
version('1.0', '16a3d4def2c4c77d0bc4b21de8b3ab03')
depends_on('mpi')
depends_on('boost')
depends_on('callpath')
def le(lhs, rhs):
"""Less-than-or-equal comparison. None is less than any value."""
return lhs == rhs or lt(lhs, rhs)
def gt(lhs, rhs):
"""Greater-than comparison. None is less than any value."""
return lhs != rhs and not lt(lhs, rhs)
def ge(lhs, rhs):
"""Greater-than-or-equal comparison. None is less than any value."""
return lhs == rhs or gt(lhs, rhs)
def min(lhs, rhs):
"""Minimum function where None is less than any value."""
if lhs is None or rhs is None:
return None
else:
return _builtin_min(lhs, rhs)
def max(lhs, rhs):
"""Maximum function where None is less than any value."""
if lhs is None:
return rhs
elif rhs is None:
return lhs
else:
return _builtin_max(lhs, rhs)
def install(self, spec, prefix):
cmake("-DSTATE_TRACKER_WITH_CALLPATH=ON", *std_cmake_args)
make()
make("install")


@ -0,0 +1,16 @@
from spack import *
class Automake(Package):
"""Automake -- make file builder part of autotools"""
homepage = "http://www.gnu.org/software/automake/"
url = "http://ftp.gnu.org/gnu/automake/automake-1.14.tar.gz"
version('1.14.1', 'd052a3e884631b9c7892f2efce542d75')
depends_on('autoconf')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -0,0 +1,27 @@
from spack import *
from glob import glob
class Bib2xhtml(Package):
"""bib2xhtml is a program that converts BibTeX files into HTML."""
homepage = "http://www.spinellis.gr/sw/textproc/bib2xhtml/"
url='http://www.spinellis.gr/sw/textproc/bib2xhtml/bib2xhtml-v3.0-15-gf506.tar.gz'
version('3.0-15-gf506', 'a26ba02fe0053bbbf2277bdf0acf8645')
def url_for_version(self, v):
return ('http://www.spinellis.gr/sw/textproc/bib2xhtml/bib2xhtml-v%s.tar.gz' % v)
def install(self, spec, prefix):
# Add the bst include files to the install directory
bst_include = join_path(prefix.share, 'bib2xhtml')
mkdirp(bst_include)
for bstfile in glob('html-*bst'):
install(bstfile, bst_include)
# Install the script and point it at the user's favorite perl
# and the bst include directory.
mkdirp(prefix.bin)
install('bib2xhtml', prefix.bin)
filter_file(r'#!/usr/bin/perl',
'#!/usr/bin/env BSTINPUTS=%s perl' % bst_include,
join_path(prefix.bin, 'bib2xhtml'))


@ -31,8 +31,11 @@ class Callpath(Package):
homepage = "https://github.com/scalability-llnl/callpath"
url = "https://github.com/scalability-llnl/callpath/archive/v1.0.1.tar.gz"
version('1.0.2', 'b1994d5ee7c7db9d27586fc2dcf8f373')
version('1.0.1', '0047983d2a52c5c335f8ba7f5bab2325')
depends_on("libelf")
depends_on("libdwarf")
depends_on("dyninst")
depends_on("adept-utils")
depends_on("mpi")


@ -25,10 +25,18 @@
from spack import *
class Cmake(Package):
"""A cross-platform, open-source build system. CMake is a family of
tools designed to build, test and package software."""
homepage = 'https://www.cmake.org'
url = 'http://www.cmake.org/files/v2.8/cmake-2.8.10.2.tar.gz'
version('2.8.10.2', '097278785da7182ec0aea8769d06860c')
version('2.8.10.2', '097278785da7182ec0aea8769d06860c',
url = 'http://www.cmake.org/files/v2.8/cmake-2.8.10.2.tar.gz')
version('3.0.2', 'db4c687a31444a929d2fdc36c4dfb95f',
url = 'http://www.cmake.org/files/v3.0/cmake-3.0.2.tar.gz')
# version('3.0.1', 'e2e05d84cb44a42f1371d9995631dcf5')
# version('3.0.0', '21a1c85e1a3b803c4b48e7ff915a863e')
def install(self, spec, prefix):
configure('--prefix=' + prefix,


@ -0,0 +1,17 @@
from spack import *
class Coreutils(Package):
"""The GNU Core Utilities are the basic file, shell and text
manipulation utilities of the GNU operating system. These are
the core utilities which are expected to exist on every
operating system.
"""
homepage = "http://www.gnu.org/software/coreutils/"
url = "http://ftp.gnu.org/gnu/coreutils/coreutils-8.23.tar.xz"
version('8.23', 'abed135279f87ad6762ce57ff6d89c41')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -25,6 +25,8 @@
from spack import *
class Dyninst(Package):
"""API for dynamic binary instrumentation. Modify programs while they
are executing without recompiling, re-linking, or re-executing."""
homepage = "https://paradyn.org"
url = "http://www.dyninst.org/sites/default/files/downloads/dyninst/8.1.2/DyninstAPI-8.1.2.tgz"
list_url = "http://www.dyninst.org/downloads/dyninst-8.x"


@ -0,0 +1,16 @@
from spack import *
class Fontconfig(Package):
"""Fontconfig customizing font access"""
homepage = "http://www.freedesktop.org/wiki/Software/fontconfig/"
url = "http://www.freedesktop.org/software/fontconfig/release/fontconfig-2.11.1.tar.gz"
version('2.11.1' , 'e75e303b4f7756c2b16203a57ac87eba')
depends_on('freetype')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -0,0 +1,16 @@
from spack import *
class Freetype(Package):
"""Font package"""
homepage = "http://http://www.freetype.org"
url = "http://download.savannah.gnu.org/releases/freetype/freetype-2.5.3.tar.gz"
version('2.5.3' , 'cafe9f210e45360279c730d27bf071e9')
depends_on('libpng')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -0,0 +1,14 @@
from spack import *
class Jpeg(Package):
"""jpeg library"""
homepage = "http://www.ijg.org"
url = "http://www.ijg.org/files/jpegsrc.v9a.tar.gz"
version('9', 'b397211ddfd506b92cd5e02a22ac924d')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -25,6 +25,8 @@
from spack import *
class Launchmon(Package):
"""Software infrastructure that enables HPC run-time tools to
co-locate tool daemons with a parallel job."""
homepage = "http://sourceforge.net/projects/launchmon"
url = "http://downloads.sourceforge.net/project/launchmon/launchmon/1.0.1%20release/launchmon-1.0.1.tar.gz"


@ -22,49 +22,22 @@
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
Functions for comparing values that may potentially be None.
These none_high functions consider None as greater than all other values.
"""
from spack import *
# Preserve builtin min and max functions
_builtin_min = min
_builtin_max = max
class Libnbc(Package):
"""LibNBC is a prototypic implementation of a nonblocking
interface for MPI collective operations. Based on ANSI C and
MPI-1, it supports all MPI-1 collective operations in a
nonblocking manner. LibNBC is distributed under the BSD license.
"""
homepage = "http://unixer.de/research/nbcoll/libnbc/"
url = "http://unixer.de/research/nbcoll/libnbc/libNBC-1.1.1.tar.gz"
version('1.1.1', 'ece5c94992591a9fa934a90e5dbe50ce')
def lt(lhs, rhs):
"""Less-than comparison. None is greater than any value."""
return lhs != rhs and (rhs is None or (lhs is not None and lhs < rhs))
depends_on("mpi")
def le(lhs, rhs):
"""Less-than-or-equal comparison. None is greater than any value."""
return lhs == rhs or lt(lhs, rhs)
def gt(lhs, rhs):
"""Greater-than comparison. None is greater than any value."""
return lhs != rhs and not lt(lhs, rhs)
def ge(lhs, rhs):
"""Greater-than-or-equal comparison. None is greater than any value."""
return lhs == rhs or gt(lhs, rhs)
def min(lhs, rhs):
"""Minimum function where None is greater than any value."""
if lhs is None:
return rhs
elif rhs is None:
return lhs
else:
return _builtin_min(lhs, rhs)
def max(lhs, rhs):
"""Maximum function where None is greater than any value."""
if lhs is None or rhs is None:
return None
else:
return _builtin_max(lhs, rhs)
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -27,9 +27,8 @@
class Libmonitor(Package):
"""Libmonitor is a library for process and thread control."""
homepage = "http://hpctoolkit.org"
url = "file:///g/g0/legendre/tools/oss/openspeedshop-release-2.1/SOURCES/libmonitor-20130218.tar.gz"
version('20130218', 'aa85c2c580e2dafb823cc47b09374279')
version('20130218', svn='https://outreach.scidac.gov/svn/libmonitor/trunk', revision=146)
def install(self, spec, prefix):
configure("--prefix=" + prefix)


@ -0,0 +1,14 @@
from spack import *
class Libpng(Package):
"""libpng graphics file format"""
homepage = "http://www.libpng.org/pub/png/libpng.html"
url = "http://sourceforge.net/projects/libpng/files/libpng16/1.6.14/libpng-1.6.14.tar.gz/download"
version('1.6.14', '2101b3de1d5f348925990f9aa8405660')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -0,0 +1,16 @@
from spack import *
class Libtiff(Package):
"""libtiff graphics format library"""
homepage = "http://www.remotesensing.org/libtiff/"
url = "http://download.osgeo.org/libtiff/tiff-4.0.3.tar.gz"
version('4.0.3', '051c1068e6a0627f461948c365290410')
depends_on('jpeg')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -0,0 +1,14 @@
from spack import *
class Libtool(Package):
"""libtool -- library building part of autotools"""
homepage = "https://www.gnu.org/software/libtool/"
url = "http://ftpmirror.gnu.org/libtool/libtool-2.4.2.tar.gz"
version('2.4.2' , 'd2f3b7d4627e69e13514a40e72a24d50')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -25,6 +25,8 @@
from spack import *
class Libunwind(Package):
"""A portable and efficient C programming interface (API) to determine
the call-chain of a program."""
homepage = "http://www.nongnu.org/libunwind/"
url = "http://download.savannah.gnu.org/releases/libunwind/libunwind-1.1.tar.gz"


@ -0,0 +1,18 @@
from spack import *
class Lwm2(Package):
"""LWM2: Light Weight Measurement Module. This is a PMPI module
that can collect a number of time-sliced MPI and POSIX I/O
measurements from a program.
"""
homepage = "https://jay.grs.rwth-aachen.de/redmine/projects/lwm2"
version('torus', hg='https://jay.grs.rwth-aachen.de/hg/lwm2', revision='torus')
depends_on("papi")
depends_on("mpi")
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -0,0 +1,12 @@
diff -rupN mpe2-1.3.0/src/graphics/src/mpe_graphics.c mpe2-1.3.0.new/src/graphics/src/mpe_graphics.c
--- mpe2-1.3.0/src/graphics/src/mpe_graphics.c 2009-06-15 10:36:22.000000000 -0600
+++ mpe2-1.3.0.new/src/graphics/src/mpe_graphics.c 2014-10-25 00:11:22.000000000 -0600
@@ -982,7 +982,7 @@ char *string;
return MPE_ERR_BAD_ARGS;
}
- printf("color = %d, string = %s\n",(int) color, string);
+//printf("color = %d, string = %s\n",(int) color, string);
XBSetPixVal( graph->xwin, graph->xwin->cmapping[color] );
returnVal = XDrawString( graph->xwin->disp, XBDrawable(graph->xwin),


@ -0,0 +1,28 @@
from spack import *
class Mpe2(Package):
"""Message Passing Extensions (MPE) -- Parallel, shared X window graphics"""
homepage = "http://www.mcs.anl.gov/research/projects/perfvis/software/MPE/"
url = "ftp://ftp.mcs.anl.gov/pub/mpi/mpe/mpe2-1.3.0.tar.gz"
version('1.3.0', '67bf0c7b2e573df3ba0d2059a96c2f7b')
patch('mpe2.patch')
depends_on("mpi")
provides("mpe")
def install(self, spec, prefix):
configure("--prefix=" + prefix,
"--x-includes=/usr/X11R6/include",
"--x-libraries=/usr/X11R6/lib",
"--enable-mpe_graphics=yes",
"--disable-f77",
"--enable-viewers=no",
"--enable-slog2=no",
"--with-mpicc=mpicc")
make()
make("install")


@ -23,6 +23,7 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
import os
class Mpich(Package):
"""MPICH is a high performance and widely portable implementation of
@ -38,8 +39,41 @@ class Mpich(Package):
provides('mpi@:1', when='@1:')
def install(self, spec, prefix):
configure(
"--prefix=" + prefix,
"--enable-shared")
config_args = ["--prefix=" + prefix,
"--enable-shared"]
# TODO: Spack should make it so that you can't actually find
# these compilers if they're "disabled" for the current
# compiler configuration.
if not self.compiler.f77:
config_args.append("--disable-f77")
if not self.compiler.fc:
config_args.append("--disable-fc")
configure(*config_args)
make()
make("install")
self.filter_compilers()
def filter_compilers(self):
"""Run after install to make the MPI compilers use the
compilers that Spack built the package with.
If this isn't done, they'll have CC, CXX, F77, and FC set
to Spack's generic cc, c++, f77, and f90. We want them to
be bound to whatever compiler they were built with.
"""
bin = self.prefix.bin
mpicc = os.path.join(bin, 'mpicc')
mpicxx = os.path.join(bin, 'mpicxx')
mpif77 = os.path.join(bin, 'mpif77')
mpif90 = os.path.join(bin, 'mpif90')
kwargs = { 'ignore_absent' : True, 'backup' : False, 'string' : True }
filter_file('CC="cc"', 'CC="%s"' % self.compiler.cc, mpicc, **kwargs)
filter_file('CXX="c++"', 'CXX="%s"' % self.compiler.cxx, mpicxx, **kwargs)
filter_file('F77="f77"', 'F77="%s"' % self.compiler.f77, mpif77, **kwargs)
filter_file('FC="f90"', 'FC="%s"' % self.compiler.fc, mpif90, **kwargs)


@ -0,0 +1,43 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class Netgauge(Package):
"""Netgauge is a high-precision network parameter measurement
tool. It supports benchmarking of many different network protocols
and communication patterns. The main focus lies on accuracy,
statistical analysis and easy extensibility.
"""
homepage = "http://unixer.de/research/netgauge/"
url = "http://unixer.de/research/netgauge/netgauge-2.4.6.tar.gz"
version('2.4.6', 'e0e040ec6452e93ca21ccc54deac1d7f')
depends_on("mpi")
def install(self, spec, prefix):
configure("--prefix=%s" % prefix)
make()
make("install")


@ -10,22 +10,32 @@ class Openmpi(Package):
"""
homepage = "http://www.open-mpi.org"
url = "http://www.open-mpi.org/software/ompi/v1.6/downloads/openmpi-1.6.5.tar.bz2"
version('1.6.5', '03aed2a4aa4d0b27196962a2a65fc475')
version('1.8.2', 'ab538ed8e328079d566fc797792e016e',
url='http://www.open-mpi.org/software/ompi/v1.8/downloads/openmpi-1.8.2.tar.gz')
version('1.6.5', '03aed2a4aa4d0b27196962a2a65fc475',
url = "http://www.open-mpi.org/software/ompi/v1.6/downloads/openmpi-1.6.5.tar.bz2")
patch('ad_lustre_rwcontig_open_source.patch', when="@1.6.5")
patch('llnl-platforms.patch', when="@1.6.5")
provides('mpi@:2')
patch('ad_lustre_rwcontig_open_source.patch')
patch('llnl-platforms.patch')
def install(self, spec, prefix):
configure("--prefix=%s" % prefix,
"--with-platform=contrib/platform/lanl/tlcc2/optimized-nopanasas")
config_args = ["--prefix=%s" % prefix]
# TODO: implement variants next, so we can have LLNL and LANL options.
# use above for LANL builds, but for LLNL builds, we need this
# "--with-platform=contrib/platform/llnl/optimized")
# TODO: use variants for this, e.g. +lanl, +llnl, etc.
# use this for LANL builds, but for LLNL builds, we need:
# "--with-platform=contrib/platform/llnl/optimized"
if self.version == ver("1.6.5"):
config_args.append("--with-platform=contrib/platform/lanl/tlcc2/optimized-nopanasas")
# TODO: Spack should make it so that you can't actually find
# these compilers if they're "disabled" for the current
# compiler configuration.
if not self.compiler.f77 and not self.compiler.fc:
config_args.append("--enable-mpi-fortran=no")
configure(*config_args)
make()
make("install")


@ -7,9 +7,9 @@ class Paraver(Package):
is expressed on its input trace format. Traces for parallel MPI,
OpenMP and other programs can be generated with Extrae."""
homepage = "http://www.bsc.es/computer-sciences/performance-tools/paraver"
url = "http://www.bsc.es/ssl/apps/performanceTools/files/paraver-sources-4.5.2.tar.gz"
url = "http://www.bsc.es/ssl/apps/performanceTools/files/paraver-sources-4.5.3.tar.gz"
version('4.5.2', 'ea463dd494519395c99ebae294edee17')
version('4.5.3', '625de9ec0d639acd18d1aaa644b38f72')
depends_on("boost")
#depends_on("extrae")


@ -11,25 +11,47 @@ class Scalasca(Package):
# FIXME: add a proper url for your package's homepage here.
homepage = "http://www.scalasca.org"
url = "http://apps.fz-juelich.de/scalasca/releases/scalasca/2.1/dist/scalasca-2.1-rc2.tar.gz"
url = "http://apps.fz-juelich.de/scalasca/releases/scalasca/2.1/dist/scalasca-2.1.tar.gz"
version('2.1-rc2', '1a95a39e5430539753e956a7524a756b')
version('2.1', 'bab9c2b021e51e2ba187feec442b96e6',
url = 'http://apps.fz-juelich.de/scalasca/releases/scalasca/2.1/dist/scalasca-2.1.tar.gz' )
depends_on("mpi")
depends_on("otf2@1.4")
depends_on("cube")
depends_on("cube@4.2.3")
backend_user_provided = """\
CC=cc
CXX=c++
F77=f77
FC=f90
CFLAGS=-fPIC
CXXFLAGS=-fPIC
"""
frontend_user_provided = """\
CC_FOR_BUILD=cc
CXX_FOR_BUILD=c++
F77_FOR_BUILD=f77
FC_FOR_BUILD=f90
CFLAGS_FOR_BUILD=-fPIC
CXXFLAGS_FOR_BUILD=-fPIC
"""
mpi_user_provided = """\
MPICC=mpicc
MPICXX=mpicxx
MPIF77=mpif77
MPIFC=mpif90
MPI_CFLAGS=-fPIC
MPI_CXXFLAGS=-fPIC
"""
def install(self, spec, prefix):
configure_args = ["--prefix=%s" % prefix,
"--with-custom-compilers",
"--with-otf2=%s" % spec['otf2'].prefix.bin,
"--with-cube=%s" % spec['cube'].prefix.bin,
"--enable-shared"]
if spec.satisfies('%gcc'):
configure_args.append('--with-nocross-compiler-suite=gcc')
if spec.satisfies('%intel'):
configure_args.append('--with-nocross-compiler-suite=intel')
configure(*configure_args)
make()


@ -25,6 +25,9 @@
from spack import *
class Scr(Package):
"""SCR caches checkpoint data in storage on the compute nodes of a
Linux cluster to provide a fast, scalable checkpoint/restart
capability for MPI codes"""
homepage = "https://computation-rnd.llnl.gov/scr"
url = "http://downloads.sourceforge.net/project/scalablecr/releases/scr-1.1-7.tar.gz"


@ -25,6 +25,11 @@
from spack import *
class Spindle(Package):
"""Spindle improves the library-loading performance of dynamically
linked HPC applications. Without Spindle, large MPI jobs can
overload a shared file system when loading dynamically
linked libraries, causing site-wide performance problems.
"""
homepage = "https://computation-rnd.llnl.gov/spindle"
url = "https://github.com/hpc/Spindle/archive/v0.8.1.tar.gz"
list_url = "https://github.com/hpc/Spindle/releases"


@ -5,8 +5,8 @@ class Stat(Package):
homepage = "http://paradyn.org/STAT/STAT.html"
url = "https://github.com/lee218llnl/stat/archive/v2.0.0.tar.gz"
version('2.0.0', 'c7494210b0ba26b577171b92838e1a9b')
version('2.1.0', 'ece26beaf057aa9134d62adcdda1ba91')
version('2.0.0', 'c7494210b0ba26b577171b92838e1a9b')
depends_on('libdwarf')
depends_on('dyninst')
@ -27,9 +27,5 @@ def install(self, spec, prefix):
"--with-stackwalker=%s" % spec['dyninst'].prefix,
"--with-libdwarf=%s" % spec['libdwarf'].prefix)
# TODO: remove once SPACK-19 is fixed
import shutil
shutil.copy2('/usr/bin/libtool', 'libtool')
make(parallel=False)
make("install")


@ -0,0 +1,39 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *

class Sundials(Package):
    """SUNDIALS (SUite of Nonlinear and DIfferential/ALgebraic equation Solvers)"""
    homepage = "http://computation.llnl.gov/casc/sundials/"
    url      = "http://computation.llnl.gov/casc/sundials/download/code/sundials-2.5.0.tar.gz"

    version('2.5.0', 'aba8b56eec600de3109cfb967aa3ba0f')

    depends_on("mpi")

    def install(self, spec, prefix):
        configure("--prefix=%s" % prefix)
        make()
        make("install")
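Each `version()` call in these packages pairs a release with an md5 digest that Spack checks after downloading the tarball, so a corrupted or tampered archive is rejected before the build starts. A minimal sketch of that verification step (the helper name is illustrative, not Spack's internal API):

```python
import hashlib

def md5_matches(path, expected_md5, chunk_size=65536):
    """Return True if the file's md5 digest equals the expected hex string.
    Reads in chunks so large tarballs do not have to fit in memory."""
    digest = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest() == expected_md5.lower()
```

A mismatch would typically abort the install with an error naming the expected and actual digests.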

View File

@ -0,0 +1,44 @@
##############################################################################
# Copyright (c) 2014, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Matthew LeGendre, legendre1@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://scalability-llnl.github.io/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *

class Swig(Package):
    """SWIG is an interface compiler that connects programs written in
       C and C++ with scripting languages such as Perl, Python, Ruby,
       and Tcl. It works by taking the declarations found in C/C++
       header files and using them to generate the wrapper code that
       scripting languages need to access the underlying C/C++
       code. In addition, SWIG provides a variety of customization
       features that let you tailor the wrapping process to suit your
       application."""
    homepage = "http://www.swig.org"
    url      = "http://prdownloads.sourceforge.net/swig/swig-3.0.2.tar.gz"

    version('3.0.2', '62f9b0d010cef36a13a010dc530d0d41')

    def install(self, spec, prefix):
        configure("--prefix=%s" % prefix)
        make()
        make("install")
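The `install()` methods in this commit all drive the same classic autotools flow: `configure --prefix=...`, `make`, `make install`, with Spack supplying `configure` and `make` as callable wrappers. Stripped of those wrappers, the commands being run reduce to the sketch below (the function is mine, for illustration only):

```python
def autotools_steps(prefix, configure_args=()):
    """Return the command lines for the configure/make/install flow
    that install() methods like the ones above execute in order."""
    return [
        ["./configure", "--prefix=%s" % prefix] + list(configure_args),
        ["make"],
        ["make", "install"],
    ]
```

Extra flags, like stat's `--with-stackwalker` and `--with-libdwarf`, slot in as additional `configure` arguments; everything else stays the same.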