Merge branch 'custom_modulefiles_from_config' of https://github.com/epfl-scitas/spack into epfl-scitas-custom_modulefiles_from_config

This commit is contained in:
Todd Gamblin 2016-05-11 08:59:23 -07:00
commit fd5d89b61c
10 changed files with 1150 additions and 396 deletions

View File

@ -5,4 +5,14 @@
# although users can override these settings in their ~/.spack/modules.yaml.
# -------------------------------------------------------------------------
modules:
prefix_inspections: {
bin: ['PATH'],
man: ['MANPATH'],
lib: ['LIBRARY_PATH', 'LD_LIBRARY_PATH'],
lib64: ['LIBRARY_PATH', 'LD_LIBRARY_PATH'],
include: ['CPATH'],
lib/pkgconfig: ['PKGCONFIG'],
lib64/pkgconfig: ['PKGCONFIG'],
'': ['CMAKE_PREFIX_PATH']
}
enable: ['tcl', 'dotkit']

View File

@ -788,7 +788,7 @@ versions are now filtered out.
.. _shell-support:
Environment modules
Integration with module systems
-------------------------------
.. note::
@ -798,42 +798,50 @@ Environment modules
interface and/or generated module names may change in future
versions.
Spack provides some limited integration with environment module
systems to make it easier to use the packages it provides.
Spack provides some integration with
`Environment Modules <http://modules.sourceforge.net/>`_
and `Dotkit <https://computing.llnl.gov/?set=jobs&page=dotkit>`_ to make
it easier to use the packages it installed.
Installing Environment Modules
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In order to use Spack's generated environment modules, you must have
installed the *Environment Modules* package. On many Linux
distributions, this can be installed from the vendor's repository.
For example: ```yum install environment-modules``
(Fedora/RHEL/CentOS). If your Linux distribution does not have
Environment Modules, you can get it with Spack:
distributions, this can be installed from the vendor's repository:
1. Install with::
.. code-block:: sh
yum install environment-modules # (Fedora/RHEL/CentOS)
apt-get install environment-modules # (Ubuntu/Debian)
If your Linux distribution does not have
Environment Modules, you can get it with Spack:
.. code-block:: sh
spack install environment-modules
2. Activate with::
Add the following two lines to your ``.bashrc`` profile (or similar):
In this case to activate it automatically you need to add the following two
lines to your ``.bashrc`` profile (or similar):
.. code-block:: sh
MODULES_HOME=`spack location -i environment-modules`
source ${MODULES_HOME}/Modules/init/bash
In case you use a Unix shell other than bash, substitute ``bash`` by
the appropriate file in ``${MODULES_HOME}/Modules/init/``.
If you use a Unix shell other than ``bash``, modify the commands above
accordingly and source the appropriate file in
``${MODULES_HOME}/Modules/init/``.
Spack and Environment Modules
.. TODO : Add a similar section on how to install dotkit ?
Spack and module systems
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
You can enable shell support by sourcing some files in the
``/share/spack`` directory.
@ -841,7 +849,7 @@ For ``bash`` or ``ksh``, run:
.. code-block:: sh
. $SPACK_ROOT/share/spack/setup-env.sh
. ${SPACK_ROOT}/share/spack/setup-env.sh
For ``csh`` and ``tcsh`` run:
@ -853,17 +861,19 @@ For ``csh`` and ``tcsh`` run:
You can put the above code in your ``.bashrc`` or ``.cshrc``, and
Spack's shell support will be available on the command line.
When you install a package with Spack, it automatically generates an
environment module that lets you add the package to your environment.
When you install a package with Spack, it automatically generates a module file
that lets you add the package to your environment.
Currently, Spack supports the generation of `TCL Modules
Currently, Spack supports the generation of `Environment Modules
<http://wiki.tcl.tk/12999>`_ and `Dotkit
<https://computing.llnl.gov/?set=jobs&page=dotkit>`_. Generated
module files for each of these systems can be found in these
directories:
* ``$SPACK_ROOT/share/spack/modules``
* ``$SPACK_ROOT/share/spack/dotkit``
.. code-block:: sh
${SPACK_ROOT}/share/spack/modules
${SPACK_ROOT}/share/spack/dotkit
The directories are automatically added to your ``MODULEPATH`` and
``DK_NODE`` environment variables when you enable Spack's `shell
@ -919,8 +929,7 @@ of installed packages.
The names here should look familiar, they're the same ones from
``spack find``. You *can* use the names here directly. For example,
you could type either of these commands to load the callpath module
(assuming dotkit and modules are installed):
you could type either of these commands to load the callpath module:
.. code-block:: sh
@ -935,7 +944,7 @@ easy to type. Luckily, Spack has its own interface for using modules
and dotkits. You can use the same spec syntax you're used to:
========================= ==========================
Modules Dotkit
Environment Modules Dotkit
========================= ==========================
``spack load <spec>`` ``spack use <spec>``
``spack unload <spec>`` ``spack unuse <spec>``
@ -1002,15 +1011,216 @@ used ``gcc``. You could therefore just type:
To identify just the one built with the Intel compiler.
Module files generation and customization
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Regenerating Module files
~~~~~~~~~~~~~~~~~~~~~~~~~~~
Environment Modules and Dotkit files are generated when packages are installed,
and are placed in the following directories under the Spack root:
Module and dotkit files are generated when packages are installed, and
are placed in the following directories under the Spack root:
.. code-block:: sh
* ``$SPACK_ROOT/share/spack/modules``
* ``$SPACK_ROOT/share/spack/dotkit``
${SPACK_ROOT}/share/spack/modules
${SPACK_ROOT}/share/spack/dotkit
The content that gets written in each module file can be customized in two ways:
1. overriding part of the ``spack.Package`` API within a ``package.py``
2. writing dedicated configuration files
Override ``Package`` API
^^^^^^^^^^^^^^^^^^^^^^^^
There are currently two methods in ``spack.Package`` that may affect the content
of module files:
.. code-block:: python
def setup_environment(self, spack_env, run_env):
"""Set up the compile and runtime environments for a package."""
pass
.. code-block:: python
def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
"""Set up the environment of packages that depend on this one"""
pass
As briefly stated in the comments, the first method lets you customize the
module file content for the package you are currently writing, the second
allows for modifications to your dependees module file. In both cases one
needs to fill ``run_env`` with the desired list of environment modifications.
Example : ``builtin/packages/python/package.py``
""""""""""""""""""""""""""""""""""""""""""""""""
The ``python`` package that comes with the ``builtin`` Spack repository
overrides ``setup_dependent_environment`` in the following way:
.. code-block:: python
def setup_dependent_environment(self, spack_env, run_env, extension_spec):
# ...
if extension_spec.package.extends(self.spec):
run_env.prepend_path('PYTHONPATH', os.path.join(extension_spec.prefix, self.site_packages_dir))
to insert the appropriate ``PYTHONPATH`` modifications in the module
files of python packages.
Configuration files
^^^^^^^^^^^^^^^^^^^
Another way of modifying the content of module files is writing a
``modules.yaml`` configuration file. Following usual Spack conventions, this
file can be placed either at *site* or *user* scope.
The default site configuration reads:
.. literalinclude:: ../../../etc/spack/modules.yaml
:language: yaml
It basically inspects the installation prefixes for the
existence of a few folders and, if they exist, it prepends a path to a given
list of environment variables.
For each module system that can be enabled a finer configuration is possible:
.. code-block:: yaml
modules:
tcl:
# contains environment modules specific customizations
dotkit:
# contains dotkit specific customizations
The structure under the ``tcl`` and ``dotkit`` keys is almost equal, and will
be showcased in the following by some examples.
Select module files by spec constraints
"""""""""""""""""""""""""""""""""""""""
Using spec syntax it's possible to have different customizations for different
groups of module files.
Considering :
.. code-block:: yaml
modules:
tcl:
all: # Default addition for every package
environment:
set:
BAR: 'bar'
^openmpi:: # A double ':' overrides previous rules
environment:
set:
BAR: 'baz'
zlib:
environment:
prepend_path:
LD_LIBRARY_PATH: 'foo'
zlib%gcc@4.8:
environment:
unset:
- FOOBAR
what will happen is that:
- every module file will set ``BAR=bar``
- unless the associated spec satisfies ``^openmpi`` in which case ``BAR=baz``
- any spec that satisfies ``zlib`` will additionally prepend ``foo`` to ``LD_LIBRARY_PATH``
- any spec that satisfies ``zlib%gcc@4.8`` will additionally unset ``FOOBAR``
.. note::
Order does matter
The modifications associated with the ``all`` keyword are always evaluated
first, no matter where they appear in the configuration file. All the other
spec constraints are instead evaluated top to bottom.
Filter modifications out of module files
""""""""""""""""""""""""""""""""""""""""
Modifications to certain environment variables in module files are generated by
default. Suppose you would like to avoid having ``CPATH`` and ``LIBRARY_PATH``
modified by your dotkit modules. Then :
.. code-block:: yaml
modules:
dotkit:
all:
filter:
environment_blacklist: ['CPATH', 'LIBRARY_PATH'] # Exclude changes to any of these variables
will generate dotkit module files that will not contain modifications to either
``CPATH`` or ``LIBRARY_PATH`` and environment module files that instead will
contain those modifications.
Autoload dependencies
"""""""""""""""""""""
The following lines in ``modules.yaml``:
.. code-block:: yaml
modules:
tcl:
all:
autoload: 'direct'
will produce environment module files that will automatically load their direct
dependencies.
.. note::
Allowed values for ``autoload`` statements
Allowed values for ``autoload`` statements are either ``none``, ``direct``
or ``all``. In ``tcl`` configuration it is possible to use the option
``prerequisites`` that accepts the same values and will add ``prereq``
statements instead of automatically loading other modules.
Blacklist or whitelist the generation of specific module files
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Sometimes it is desirable not to generate module files, a common use case being
not providing the users with software built using the system compiler.
A configuration file like:
.. code-block:: yaml
modules:
tcl:
whitelist: ['gcc', 'llvm'] # Whitelist will have precedence over blacklist
blacklist: ['%gcc@4.4.7'] # Assuming gcc@4.4.7 is the system compiler
will skip module file generation for anything that satisfies ``%gcc@4.4.7``,
with the exception of specs that satisfy ``gcc`` or ``llvm``.
Customize the naming scheme and insert conflicts
""""""""""""""""""""""""""""""""""""""""""""""""
A configuration file like:
.. code-block:: yaml
modules:
tcl:
naming_scheme: '{name}/{version}-{compiler.name}-{compiler.version}'
all:
conflict: ['{name}', 'intel/14.0.1']
will create module files that will conflict with ``intel/14.0.1`` and with the
base directory of the same module, effectively preventing the possibility to
load two or more versions of the same software at the same time.
.. note::
Tokens available for the naming scheme
currently only the tokens shown in the example are available to construct
the naming scheme
.. note::
The ``conflict`` option is ``tcl`` specific
Regenerating module files
^^^^^^^^^^^^^^^^^^^^^^^^^
Sometimes you may need to regenerate the modules files. For example,
if newer, fancier module support is added to Spack at some later date,
@ -1020,7 +1230,7 @@ new features.
.. _spack-module:
``spack module refresh``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
""""""""""""""""""""""""
Running ``spack module refresh`` will remove the
``share/spack/modules`` and ``share/spack/dotkit`` directories, then

View File

@ -32,18 +32,21 @@
from spack.modules import module_types
from spack.util.string import *
description ="Manipulate modules and dotkits."
description = "Manipulate modules and dotkits."
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar='SUBCOMMAND', dest='module_command')
refresh_parser = sp.add_parser('refresh', help='Regenerate all module files.')
sp.add_parser('refresh', help='Regenerate all module files.')
find_parser = sp.add_parser('find', help='Find module files for packages.')
find_parser.add_argument(
'module_type', help="Type of module to find file for. [" + '|'.join(module_types) + "]")
find_parser.add_argument('spec', nargs='+', help='spec to find a module file for.')
find_parser.add_argument('module_type',
help="Type of module to find file for. [" +
'|'.join(module_types) + "]")
find_parser.add_argument('spec',
nargs='+',
help='spec to find a module file for.')
def module_find(mtype, spec_array):
@ -53,7 +56,8 @@ def module_find(mtype, spec_array):
should type to use that package's module.
"""
if mtype not in module_types:
tty.die("Invalid module type: '%s'. Options are %s" % (mtype, comma_or(module_types)))
tty.die("Invalid module type: '%s'. Options are %s" %
(mtype, comma_or(module_types)))
specs = spack.cmd.parse_specs(spec_array)
if len(specs) > 1:
@ -89,7 +93,6 @@ def module_refresh():
shutil.rmtree(cls.path, ignore_errors=False)
mkdirp(cls.path)
for spec in specs:
tty.debug(" Writing file for %s" % spec)
cls(spec).write()

View File

@ -118,21 +118,20 @@
the site configuration will be ignored.
"""
import copy
import os
import re
import sys
import copy
import jsonschema
from jsonschema import Draft4Validator, validators
import yaml
from yaml.error import MarkedYAMLError
from ordereddict_backport import OrderedDict
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack
import yaml
from jsonschema import Draft4Validator, validators
from llnl.util.filesystem import mkdirp
from ordereddict_backport import OrderedDict
from spack.error import SpackError
from yaml.error import MarkedYAMLError
# Hacked yaml for configuration files preserves line numbers.
import spack.util.spack_yaml as syaml
@ -146,7 +145,7 @@
'type': 'object',
'additionalProperties': False,
'patternProperties': {
'compilers:?': { # optional colon for overriding site config.
'compilers:?': { # optional colon for overriding site config.
'type': 'object',
'default': {},
'additionalProperties': False,
@ -195,6 +194,7 @@
'default': [],
'items': {
'type': 'string'},},},},
'packages': {
'$schema': 'http://json-schema.org/schema#',
'title': 'Spack package configuration file schema',
@ -238,24 +238,120 @@
'default' : {},
}
},},},},},},
'modules': {
'$schema': 'http://json-schema.org/schema#',
'title': 'Spack module file configuration file schema',
'type': 'object',
'additionalProperties': False,
'definitions': {
'array_of_strings': {
'type': 'array',
'default': [],
'items': {
'type': 'string'
}
},
'dictionary_of_strings': {
'type': 'object',
'patternProperties': {
r'\w[\w-]*': { # key
'type': 'string'
}
}
},
'dependency_selection': {
'type': 'string',
'enum': ['none', 'direct', 'all']
},
'module_file_configuration': {
'type': 'object',
'default': {},
'additionalProperties': False,
'properties': {
'filter': {
'type': 'object',
'default': {},
'additionalProperties': False,
'properties': {
'environment_blacklist': {
'type': 'array',
'default': [],
'items': {
'type': 'string'
}
}
}
},
'autoload': {'$ref': '#/definitions/dependency_selection'},
'prerequisites': {'$ref': '#/definitions/dependency_selection'},
'conflict': {'$ref': '#/definitions/array_of_strings'},
'environment': {
'type': 'object',
'default': {},
'additionalProperties': False,
'properties': {
'set': {'$ref': '#/definitions/dictionary_of_strings'},
'unset': {'$ref': '#/definitions/array_of_strings'},
'prepend_path': {'$ref': '#/definitions/dictionary_of_strings'},
'append_path': {'$ref': '#/definitions/dictionary_of_strings'}
}
}
}
},
'module_type_configuration': {
'type': 'object',
'default': {},
'anyOf': [
{
'properties': {
'whitelist': {'$ref': '#/definitions/array_of_strings'},
'blacklist': {'$ref': '#/definitions/array_of_strings'},
'naming_scheme': {
'type': 'string' # Can we be more specific here?
}
}
},
{
'patternProperties': {r'\w[\w-]*': {'$ref': '#/definitions/module_file_configuration'}}
}
]
}
},
'patternProperties': {
r'modules:?': {
'type': 'object',
'default': {},
'additionalProperties': False,
'properties': {
'prefix_inspections': {
'type': 'object',
'patternProperties': {
r'\w[\w-]*': { # path to be inspected for existence (relative to prefix)
'$ref': '#/definitions/array_of_strings'
}
}
},
'enable': {
'type': 'array',
'default': [],
'items': {
'type': 'string'
'type': 'string',
'enum': ['tcl', 'dotkit']
}
}
},
'tcl': {
'allOf': [
{'$ref': '#/definitions/module_type_configuration'}, # Base configuration
{} # Specific tcl extensions
]
},
'dotkit': {
'allOf': [
{'$ref': '#/definitions/module_type_configuration'}, # Base configuration
{} # Specific dotkit extensions
]
},
}
},
},
@ -411,7 +507,7 @@ def _read_config_file(filename, schema):
elif not os.path.isfile(filename):
raise ConfigFileError(
"Invlaid configuration. %s exists but is not a file." % filename)
"Invalid configuration. %s exists but is not a file." % filename)
elif not os.access(filename, os.R_OK):
raise ConfigFileError("Config file is not readable: %s" % filename)

View File

@ -1,7 +1,7 @@
import os
import os.path
import collections
import inspect
import os
import os.path
class NameModifier(object):
@ -26,7 +26,8 @@ def execute(self):
class UnsetEnv(NameModifier):
def execute(self):
os.environ.pop(self.name, None) # Avoid throwing if the variable was not set
# Avoid throwing if the variable was not set
os.environ.pop(self.name, None)
class SetPath(NameValueModifier):
@ -55,7 +56,9 @@ class RemovePath(NameValueModifier):
def execute(self):
environment_value = os.environ.get(self.name, '')
directories = environment_value.split(':') if environment_value else []
directories = [os.path.normpath(x) for x in directories if x != os.path.normpath(self.value)]
directories = [os.path.normpath(x)
for x in directories
if x != os.path.normpath(self.value)]
os.environ[self.name] = ':'.join(directories)
@ -63,7 +66,8 @@ class EnvironmentModifications(object):
"""
Keeps track of requests to modify the current environment.
Each call to a method to modify the environment stores the extra information on the caller in the request:
Each call to a method to modify the environment stores the extra
information on the caller in the request:
- 'filename' : filename of the module where the caller is defined
- 'lineno': line number where the request occurred
- 'context' : line of code that issued the request that failed
@ -71,10 +75,10 @@ class EnvironmentModifications(object):
def __init__(self, other=None):
"""
Initializes a new instance, copying commands from other if it is not None
Initializes a new instance, copying commands from other if not None
Args:
other: another instance of EnvironmentModifications from which (optional)
other: another instance of EnvironmentModifications (optional)
"""
self.env_modifications = []
if other is not None:
@ -93,7 +97,8 @@ def extend(self, other):
@staticmethod
def _check_other(other):
if not isinstance(other, EnvironmentModifications):
raise TypeError('other must be an instance of EnvironmentModifications')
raise TypeError(
'other must be an instance of EnvironmentModifications')
def _get_outside_caller_attributes(self):
stack = inspect.stack()
@ -101,12 +106,10 @@ def _get_outside_caller_attributes(self):
_, filename, lineno, _, context, index = stack[2]
context = context[index].strip()
except Exception:
filename, lineno, context = 'unknown file', 'unknown line', 'unknown context'
args = {
'filename': filename,
'lineno': lineno,
'context': context
}
filename = 'unknown file'
lineno = 'unknown line'
context = 'unknown context'
args = {'filename': filename, 'lineno': lineno, 'context': context}
return args
def set(self, name, value, **kwargs):
@ -170,7 +173,8 @@ def prepend_path(self, name, path, **kwargs):
def remove_path(self, name, path, **kwargs):
"""
Stores in the current object a request to remove a path from a path list
Stores in the current object a request to remove a path from a path
list
Args:
name: name of the path list in the environment
@ -185,7 +189,8 @@ def group_by_name(self):
Returns a dict of the modifications grouped by variable name
Returns:
dict mapping the environment variable name to the modifications to be done on it
dict mapping the environment variable name to the modifications to
be done on it
"""
modifications = collections.defaultdict(list)
for item in self:
@ -203,7 +208,7 @@ def apply_modifications(self):
Applies the modifications and clears the list
"""
modifications = self.group_by_name()
# Apply the modifications to the environment variables one variable at a time
# Apply modifications one variable at a time
for name, actions in sorted(modifications.items()):
for x in actions:
x.execute()
@ -224,13 +229,19 @@ def concatenate_paths(paths):
def set_or_unset_not_first(variable, changes, errstream):
"""
Check if we are going to set or unset something after other modifications have already been requested
Check if we are going to set or unset something after other modifications
have already been requested
"""
indexes = [ii for ii, item in enumerate(changes) if ii != 0 and type(item) in [SetEnv, UnsetEnv]]
indexes = [ii
for ii, item in enumerate(changes)
if ii != 0 and type(item) in [SetEnv, UnsetEnv]]
if indexes:
good = '\t \t{context} at {filename}:{lineno}'
nogood = '\t--->\t{context} at {filename}:{lineno}'
errstream('Suspicious requests to set or unset the variable \'{var}\' found'.format(var=variable))
message = 'Suspicious requests to set or unset the variable \'{var}\' found' # NOQA: ignore=E501
errstream(
message.format(
var=variable))
for ii, item in enumerate(changes):
print_format = nogood if ii in indexes else good
errstream(print_format.format(**item.args))
@ -238,8 +249,8 @@ def set_or_unset_not_first(variable, changes, errstream):
def validate(env, errstream):
"""
Validates the environment modifications to check for the presence of suspicious patterns. Prompts a warning for
everything that was found
Validates the environment modifications to check for the presence of
suspicious patterns. Prompts a warning for everything that was found
Current checks:
- set or unset variables after other changes on the same variable
@ -250,3 +261,20 @@ def validate(env, errstream):
modifications = env.group_by_name()
for variable, list_of_changes in sorted(modifications.items()):
set_or_unset_not_first(variable, list_of_changes, errstream)
def filter_environment_blacklist(env, variables):
"""
Generator that filters out any change to environment variables present in
the input list
Args:
env: list of environment modifications
variables: list of variable names to be filtered
Yields:
items in env if they are not in variables
"""
for item in env:
if item.name not in variables:
yield item

View File

@ -23,33 +23,35 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
This module contains code for creating environment modules, which can include dotkits, tcl modules, lmod, and others.
This module contains code for creating environment modules, which can include
dotkits, tcl modules, lmod, and others.
The various types of modules are installed by post-install hooks and removed after an uninstall by post-uninstall hooks.
This class consolidates the logic for creating an abstract description of the information that module systems need.
Currently that includes a number of directories to be appended to paths in the user's environment:
The various types of modules are installed by post-install hooks and removed
after an uninstall by post-uninstall hooks. This class consolidates the logic
for creating an abstract description of the information that module systems
need.
* /bin directories to be appended to PATH
* /lib* directories for LD_LIBRARY_PATH
* /include directories for CPATH
* /man* and /share/man* directories for MANPATH
* the package prefix for CMAKE_PREFIX_PATH
This module also includes logic for coming up with unique names for the module
files so that they can be found by the various shell-support files in
$SPACK/share/spack/setup-env.*.
This module also includes logic for coming up with unique names for the module files so that they can be found by the
various shell-support files in $SPACK/share/spack/setup-env.*.
Each hook in hooks/ implements the logic for writing its specific type of module file.
Each hook in hooks/ implements the logic for writing its specific type of
module file.
"""
import copy
import datetime
import os
import os.path
import re
import shutil
import string
import textwrap
import llnl.util.tty as tty
import spack
import spack.config
from llnl.util.filesystem import join_path, mkdirp
from spack.build_environment import parent_class_modules
from spack.build_environment import set_module_variables_for_package
from spack.environment import *
__all__ = ['EnvModule', 'Dotkit', 'TclModule']
@ -61,56 +63,175 @@
def print_help():
"""For use by commands to tell user how to activate shell support."""
tty.msg("This command requires spack's shell integration.",
"",
"""
For use by commands to tell user how to activate shell support.
"""
tty.msg("This command requires spack's shell integration.", "",
"To initialize spack's shell commands, you must run one of",
"the commands below. Choose the right command for your shell.",
"",
"For bash and zsh:",
" . %s/setup-env.sh" % spack.share_path,
"",
"For csh and tcsh:",
" setenv SPACK_ROOT %s" % spack.prefix,
" source %s/setup-env.csh" % spack.share_path,
"")
"", "For bash and zsh:",
" . %s/setup-env.sh" % spack.share_path, "",
"For csh and tcsh:", " setenv SPACK_ROOT %s" % spack.prefix,
" source %s/setup-env.csh" % spack.share_path, "")
def inspect_path(prefix):
"""
Inspects the prefix of an installation to search for common layouts. Issues a request to modify the environment
accordingly when an item is found.
Inspects the prefix of an installation to search for common layouts. Issues
a request to modify the environment accordingly when an item is found.
Args:
prefix: prefix of the installation
Returns:
instance of EnvironmentModifications containing the requested modifications
instance of EnvironmentModifications containing the requested
modifications
"""
env = EnvironmentModifications()
# Inspect the prefix to check for the existence of common directories
prefix_inspections = {
'bin': ('PATH',),
'man': ('MANPATH',),
'lib': ('LIBRARY_PATH', 'LD_LIBRARY_PATH'),
'lib64': ('LIBRARY_PATH', 'LD_LIBRARY_PATH'),
'include': ('CPATH',)
}
for attribute, variables in prefix_inspections.items():
expected = getattr(prefix, attribute)
prefix_inspections = CONFIGURATION.get('prefix_inspections', {})
for relative_path, variables in prefix_inspections.items():
expected = join_path(prefix, relative_path)
if os.path.isdir(expected):
for variable in variables:
env.prepend_path(variable, expected)
# PKGCONFIG
for expected in (join_path(prefix.lib, 'pkgconfig'), join_path(prefix.lib64, 'pkgconfig')):
if os.path.isdir(expected):
env.prepend_path('PKG_CONFIG_PATH', expected)
# CMake related variables
env.prepend_path('CMAKE_PREFIX_PATH', prefix)
return env
def dependencies(spec, request='all'):
"""
Returns the list of dependent specs for a given spec, according to the
given request
Args:
spec: target spec
request: either 'none', 'direct' or 'all'
Returns:
empty list if 'none', direct dependency list if 'direct', all
dependencies if 'all'
"""
if request not in ('none', 'direct', 'all'):
message = "Wrong value for argument 'request' : "
message += "should be one of ('none', 'direct', 'all')"
raise tty.error(message + " [current value is '%s']" % request)
if request == 'none':
return []
if request == 'direct':
return [xx for _, xx in spec.dependencies.items()]
# FIXME : during module file creation nodes seem to be visited multiple
# FIXME : times even if cover='nodes' is given. This work around permits
# FIXME : to get a unique list of spec anyhow. Do we miss a merge
# FIXME : step among nodes that refer to the same package?
seen = set()
seen_add = seen.add
l = [xx
for xx in sorted(
spec.traverse(order='post',
depth=True,
cover='nodes',
root=False),
reverse=True)]
return [xx for ii, xx in l if not (xx in seen or seen_add(xx))]
def update_dictionary_extending_lists(target, update):
for key in update:
value = target.get(key, None)
if isinstance(value, list):
target[key].extend(update[key])
elif isinstance(value, dict):
update_dictionary_extending_lists(target[key], update[key])
else:
target[key] = update[key]
def parse_config_options(module_generator):
"""
Parse the configuration file and returns a bunch of items that will be
needed during module file generation
Args:
module_generator: module generator for a given spec
Returns:
autoloads: list of specs to be autoloaded
prerequisites: list of specs to be marked as prerequisite
filters: list of environment variables whose modification is
blacklisted in module files
env: list of custom environment modifications to be applied in the
module file
"""
# Get the configuration for this kind of generator
module_configuration = copy.deepcopy(CONFIGURATION.get(
module_generator.name, {}))
#####
# Merge all the rules
#####
module_file_actions = module_configuration.pop('all', {})
for spec, conf in module_configuration.items():
override = False
if spec.endswith(':'):
spec = spec.strip(':')
override = True
if module_generator.spec.satisfies(spec):
if override:
module_file_actions = {}
update_dictionary_extending_lists(module_file_actions, conf)
#####
# Process the common rules
#####
# Automatic loading loads
module_file_actions['autoload'] = dependencies(
module_generator.spec, module_file_actions.get('autoload', 'none'))
# Prerequisites
module_file_actions['prerequisites'] = dependencies(
module_generator.spec, module_file_actions.get('prerequisites',
'none'))
# Environment modifications
environment_actions = module_file_actions.pop('environment', {})
env = EnvironmentModifications()
def process_arglist(arglist):
if method == 'unset':
for x in arglist:
yield (x, )
else:
for x in arglist.iteritems():
yield x
for method, arglist in environment_actions.items():
for args in process_arglist(arglist):
getattr(env, method)(*args)
return module_file_actions, env
def filter_blacklisted(specs, module_name):
"""
Given a sequence of specs, filters the ones that are blacklisted in the
module configuration file.
Args:
specs: sequence of spec instances
module_name: type of module file objects
Yields:
non blacklisted specs
"""
for x in specs:
if module_types[module_name](x).blacklisted:
tty.debug('\tFILTER : %s' % x)
continue
yield x
class EnvModule(object):
name = 'env_module'
formats = {}
@ -118,7 +239,8 @@ class EnvModule(object):
class __metaclass__(type):
def __init__(cls, name, bases, dict):
type.__init__(cls, name, bases, dict)
if cls.name != 'env_module' and cls.name in CONFIGURATION['enable']:
if cls.name != 'env_module' and cls.name in CONFIGURATION[
'enable']:
module_types[cls.name] = cls
def __init__(self, spec=None):
@ -134,8 +256,41 @@ def __init__(self, spec=None):
# long description is the docstring with reduced whitespace.
self.long_description = None
if self.spec.package.__doc__:
self.long_description = re.sub(r'\s+', ' ', self.spec.package.__doc__)
self.long_description = re.sub(r'\s+', ' ',
self.spec.package.__doc__)
@property
def naming_scheme(self):
try:
naming_scheme = CONFIGURATION[self.name]['naming_scheme']
except KeyError:
naming_scheme = self.default_naming_format
return naming_scheme
@property
def tokens(self):
tokens = {
'name': self.spec.name,
'version': self.spec.version,
'compiler': self.spec.compiler
}
return tokens
@property
def use_name(self):
"""
Subclasses should implement this to return the name the module command
uses to refer to the package.
"""
naming_tokens = self.tokens
naming_scheme = self.naming_scheme
name = naming_scheme.format(**naming_tokens)
name += '-' + self.spec.dag_hash(
) # Always append the hash to make the module file unique
# Not everybody is working on linux...
parts = name.split('/')
name = join_path(*parts)
return name
@property
def category(self):
@ -144,13 +299,51 @@ def category(self):
return self.pkg.category
# Extensions
for extendee in self.pkg.extendees:
return '{extendee} extension'.format(extendee=extendee)
return '{extendee}_extension'.format(extendee=extendee)
# Not very descriptive fallback
return 'spack installed package'
return 'spack'
@property
def blacklisted(self):
configuration = CONFIGURATION.get(self.name, {})
whitelist_matches = [x
for x in configuration.get('whitelist', [])
if self.spec.satisfies(x)]
blacklist_matches = [x
for x in configuration.get('blacklist', [])
if self.spec.satisfies(x)]
if whitelist_matches:
message = '\tWHITELIST : %s [matches : ' % self.spec.cshort_spec
for rule in whitelist_matches:
message += '%s ' % rule
message += ' ]'
tty.debug(message)
if blacklist_matches:
message = '\tBLACKLIST : %s [matches : ' % self.spec.cshort_spec
for rule in blacklist_matches:
message += '%s ' % rule
message += ' ]'
tty.debug(message)
if not whitelist_matches and blacklist_matches:
return True
return False
def write(self):
"""Write out a module file for this object."""
"""
Writes out a module file for this object.
This method employs a template pattern and expects derived classes to:
- override the header property
- provide formats for autoload, prerequisites and environment changes
"""
if self.blacklisted:
return
tty.debug("\tWRITE : %s [%s]" %
(self.spec.cshort_spec, self.file_name))
module_dir = os.path.dirname(self.file_name)
if not os.path.exists(module_dir):
mkdirp(module_dir)
@ -159,42 +352,73 @@ def write(self):
# installation prefix
env = inspect_path(self.spec.prefix)
# Let the extendee modify their extensions before asking for
# package-specific modifications
# Let the extendee/dependency modify their extensions/dependencies
# before asking for package-specific modifications
spack_env = EnvironmentModifications()
for item in self.pkg.extendees:
try:
package = self.spec[item].package
package.setup_dependent_package(self.pkg.module, self.spec)
package.setup_dependent_environment(spack_env, env, self.spec)
except:
# The extends was conditional, so it doesn't count here
# eg: extends('python', when='+python')
pass
# TODO : the code down below is quite similar to
# TODO : build_environment.setup_package and needs to be factored out
# TODO : to a single place
for item in dependencies(self.spec, 'all'):
package = self.spec[item.name].package
modules = parent_class_modules(package.__class__)
for mod in modules:
set_module_variables_for_package(package, mod)
set_module_variables_for_package(package, package.module)
package.setup_dependent_package(self.pkg.module, self.spec)
package.setup_dependent_environment(spack_env, env, self.spec)
# Package-specific environment modifications
set_module_variables_for_package(self.pkg, self.pkg.module)
self.spec.package.setup_environment(spack_env, env)
# TODO : implement site-specific modifications and filters
if not env:
return
# Parse configuration file
module_configuration, conf_env = parse_config_options(self)
env.extend(conf_env)
filters = module_configuration.get('filter', {}).get(
'environment_blacklist', {})
# Build up the module file content
module_file_content = self.header
for x in filter_blacklisted(
module_configuration.pop('autoload', []), self.name):
module_file_content += self.autoload(x)
for x in filter_blacklisted(
module_configuration.pop('prerequisites', []), self.name):
module_file_content += self.prerequisite(x)
for line in self.process_environment_command(
filter_environment_blacklist(env, filters)):
module_file_content += line
for line in self.module_specific_content(module_configuration):
module_file_content += line
# Dump to file
with open(self.file_name, 'w') as f:
self.write_header(f)
for line in self.process_environment_command(env):
f.write(line)
f.write(module_file_content)
def write_header(self, stream):
@property
def header(self):
raise NotImplementedError()
def module_specific_content(self, configuration):
return tuple()
def autoload(self, spec):
m = type(self)(spec)
return self.autoload_format.format(module_file=m.use_name)
def prerequisite(self, spec):
m = type(self)(spec)
return self.prerequisite_format.format(module_file=m.use_name)
def process_environment_command(self, env):
for command in env:
try:
yield self.formats[type(command)].format(**command.args)
yield self.environment_modifications_formats[type(
command)].format(**command.args)
except KeyError:
tty.warn('Cannot handle command of type {command} : skipping request'.format(command=type(command)))
tty.warn('{context} at {filename}:{lineno}'.format(**command.args))
message = 'Cannot handle command of type {command} : skipping request' # NOQA: ignore=E501
details = '{context} at {filename}:{lineno}'
tty.warn(message.format(command=type(command)))
tty.warn(details.format(**command.args))
@property
def file_name(self):
@ -202,62 +426,65 @@ def file_name(self):
where this module lives."""
raise NotImplementedError()
@property
def use_name(self):
"""Subclasses should implement this to return the name the
module command uses to refer to the package."""
raise NotImplementedError()
def remove(self):
mod_file = self.file_name
if os.path.exists(mod_file):
try:
os.remove(mod_file) # Remove the module file
os.removedirs(os.path.dirname(mod_file)) # Remove all the empty directories from the leaf up
os.removedirs(
os.path.dirname(mod_file)
) # Remove all the empty directories from the leaf up
except OSError:
pass # removedirs throws OSError on first non-empty directory found
# removedirs throws OSError on first non-empty directory found
pass
class Dotkit(EnvModule):
name = 'dotkit'
path = join_path(spack.share_path, "dotkit")
formats = {
environment_modifications_formats = {
PrependPath: 'dk_alter {name} {value}\n',
SetEnv: 'dk_setenv {name} {value}\n'
}
autoload_format = 'dk_op {module_file}\n'
default_naming_format = '{name}-{version}-{compiler.name}-{compiler.version}' # NOQA: ignore=E501
@property
def file_name(self):
return join_path(Dotkit.path, self.spec.architecture, '%s.dk' % self.use_name)
return join_path(Dotkit.path, self.spec.architecture,
'%s.dk' % self.use_name)
@property
def use_name(self):
return "%s-%s-%s-%s-%s" % (self.spec.name, self.spec.version,
self.spec.compiler.name,
self.spec.compiler.version,
self.spec.dag_hash())
def write_header(self, dk_file):
def header(self):
# Category
header = ''
if self.category:
dk_file.write('#c %s\n' % self.category)
header += '#c %s\n' % self.category
# Short description
if self.short_description:
dk_file.write('#d %s\n' % self.short_description)
header += '#d %s\n' % self.short_description
# Long description
if self.long_description:
for line in textwrap.wrap(self.long_description, 72):
dk_file.write("#h %s\n" % line)
header += '#h %s\n' % line
return header
def prerequisite(self, spec):
tty.warn('prerequisites: not supported by dotkit module files')
tty.warn('\tYou may want to check ~/.spack/modules.yaml')
return ''
class TclModule(EnvModule):
name = 'tcl'
path = join_path(spack.share_path, "modules")
formats = {
environment_modifications_formats = {
PrependPath: 'prepend-path {name} \"{value}\"\n',
AppendPath: 'append-path {name} \"{value}\"\n',
RemovePath: 'remove-path {name} \"{value}\"\n',
@ -265,28 +492,61 @@ class TclModule(EnvModule):
UnsetEnv: 'unsetenv {name}\n'
}
autoload_format = ('if ![ is-loaded {module_file} ] {{\n'
' puts stderr "Autoloading {module_file}"\n'
' module load {module_file}\n'
'}}\n\n')
prerequisite_format = 'prereq {module_file}\n'
default_naming_format = '{name}-{version}-{compiler.name}-{compiler.version}' # NOQA: ignore=E501
@property
def file_name(self):
return join_path(TclModule.path, self.spec.architecture, self.use_name)
@property
def use_name(self):
return "%s-%s-%s-%s-%s" % (self.spec.name, self.spec.version,
self.spec.compiler.name,
self.spec.compiler.version,
self.spec.dag_hash())
def write_header(self, module_file):
def header(self):
timestamp = datetime.datetime.now()
# TCL Modulefile header
module_file.write('#%Module1.0\n')
header = '#%Module1.0\n'
header += '## Module file created by spack (https://github.com/LLNL/spack) on %s\n' % timestamp # NOQA: ignore=E501
header += '##\n'
header += '## %s\n' % self.spec.short_spec
header += '##\n'
# TODO : category ?
# Short description
if self.short_description:
module_file.write('module-whatis \"%s\"\n\n' % self.short_description)
header += 'module-whatis \"%s\"\n\n' % self.short_description
# Long description
if self.long_description:
module_file.write('proc ModulesHelp { } {\n')
header += 'proc ModulesHelp { } {\n'
for line in textwrap.wrap(self.long_description, 72):
module_file.write("puts stderr \"%s\"\n" % line)
module_file.write('}\n\n')
header += 'puts stderr "%s"\n' % line
header += '}\n\n'
return header
def module_specific_content(self, configuration):
naming_tokens = self.tokens
# Conflict
conflict_format = configuration.get('conflict', [])
f = string.Formatter()
for item in conflict_format:
line = 'conflict ' + item + '\n'
if len([x for x in f.parse(line)
]) > 1: # We do have placeholder to substitute
for naming_dir, conflict_dir in zip(
self.naming_scheme.split('/'), item.split('/')):
if naming_dir != conflict_dir:
message = 'conflict scheme does not match naming scheme [{spec}]\n\n' # NOQA: ignore=E501
message += 'naming scheme : "{nformat}"\n'
message += 'conflict scheme : "{cformat}"\n\n'
message += '** You may want to check your `modules.yaml` configuration file **\n' # NOQA: ignore=E501
tty.error(message.format(spec=self.spec,
nformat=self.naming_scheme,
cformat=item))
raise SystemExit('Module generation aborted.')
line = line.format(**naming_tokens)
yield line

View File

@ -37,7 +37,6 @@
import re
import textwrap
import time
import glob
import llnl.util.tty as tty
import spack
@ -62,7 +61,6 @@
from spack.util.executable import ProcessError
from spack.version import *
from urlparse import urlparse
"""Allowed URL schemes for spack packages."""
_ALLOWED_URL_SCHEMES = ["http", "https", "ftp", "file", "git"]
@ -305,26 +303,21 @@ class SomePackage(Package):
#
"""By default we build in parallel. Subclasses can override this."""
parallel = True
"""# jobs to use for parallel make. If set, overrides default of ncpus."""
make_jobs = None
"""Most packages are NOT extendable. Set to True if you want extensions."""
"""Most packages are NOT extendable. Set to True if you want extensions."""
extendable = False
"""List of prefix-relative file paths (or a single path). If these do
not exist after install, or if they exist but are not files,
sanity checks fail.
"""
sanity_check_is_file = []
"""List of prefix-relative directory paths (or a single path). If
these do not exist after install, or if they exist but are not
directories, sanity checks will fail.
"""
sanity_check_is_dir = []
def __init__(self, spec):
# this determines how the package should be built.
self.spec = spec
@ -336,23 +329,24 @@ def __init__(self, spec):
self.name = self.name[self.name.rindex('.') + 1:]
# Allow custom staging paths for packages
self.path=None
self.path = None
# Sanity check attributes required by Spack directives.
spack.directives.ensure_dicts(type(self))
# Check versions in the versions dict.
for v in self.versions:
assert(isinstance(v, Version))
assert (isinstance(v, Version))
# Check version descriptors
for v in sorted(self.versions):
assert(isinstance(self.versions[v], dict))
assert (isinstance(self.versions[v], dict))
# Version-ize the keys in versions dict
try:
self.versions = dict((Version(v), h) for v,h in self.versions.items())
except ValueError, e:
self.versions = dict((Version(v), h)
for v, h in self.versions.items())
except ValueError as e:
raise ValueError("In package %s: %s" % (self.name, e.message))
# stage used to build this package.
@ -366,9 +360,9 @@ def __init__(self, spec):
# This makes self.url behave sanely.
if self.spec.versions.concrete:
# TODO: this is a really roundabout way of determining the type
# TODO: of fetch to do. figure out a more sane fetch strategy/package
# TODO: init order (right now it's conflated with stage, package, and
# TODO: the tests make assumptions)
# TODO: of fetch to do. figure out a more sane fetch
# TODO: strategy/package init order (right now it's conflated with
# TODO: stage, package, and the tests make assumptions)
f = fs.for_package_version(self, self.version)
if isinstance(f, fs.URLFetchStrategy):
self.url = self.url_for_version(self.spec.version)
@ -387,14 +381,12 @@ def __init__(self, spec):
if self.is_extension:
spack.repo.get(self.extendee_spec)._check_extendable()
@property
def version(self):
if not self.spec.versions.concrete:
raise ValueError("Can only get of package with concrete version.")
return self.spec.versions[0]
@memoized
def version_urls(self):
"""Return a list of URLs for different versions of this
@ -407,7 +399,6 @@ def version_urls(self):
version_urls[v] = args['url']
return version_urls
def nearest_url(self, version):
"""Finds the URL for the next lowest version with a URL.
If there is no lower version with a URL, uses the
@ -424,10 +415,11 @@ def nearest_url(self, version):
url = version_urls[v]
return url
# TODO: move this out of here and into some URL extrapolation module?
def url_for_version(self, version):
"""Returns a URL that you can download a new version of this package from."""
"""
Returns a URL that you can download a new version of this package from.
"""
if not isinstance(version, Version):
version = Version(version)
@ -441,14 +433,17 @@ def url_for_version(self, version):
return version_urls[version]
# If we have no idea, try to substitute the version.
return spack.url.substitute_version(self.nearest_url(version),
self.url_version(version))
return spack.url.substitute_version(
self.nearest_url(version), self.url_version(version))
def _make_resource_stage(self, root_stage, fetcher, resource):
resource_stage_folder = self._resource_stage(resource)
resource_mirror = join_path(self.name, os.path.basename(fetcher.url))
stage = ResourceStage(resource.fetcher, root=root_stage, resource=resource,
name=resource_stage_folder, mirror_path=resource_mirror,
stage = ResourceStage(resource.fetcher,
root=root_stage,
resource=resource,
name=resource_stage_folder,
mirror_path=resource_mirror,
path=self.path)
return stage
@ -474,7 +469,8 @@ def _make_stage(self):
else:
# Construct resource stage
resource = resources[ii - 1] # ii == 0 is root!
stage = self._make_resource_stage(composite_stage[0], fetcher, resource)
stage = self._make_resource_stage(composite_stage[0], fetcher,
resource)
# Append the item to the composite
composite_stage.append(stage)
@ -492,13 +488,11 @@ def stage(self):
self._stage = self._make_stage()
return self._stage
@stage.setter
def stage(self, stage):
"""Allow a stage object to be set to override the default."""
self._stage = stage
def _make_fetcher(self):
# Construct a composite fetcher that always contains at least
# one element (the root package). In case there are resources
@ -515,7 +509,8 @@ def _make_fetcher(self):
@property
def fetcher(self):
if not self.spec.versions.concrete:
raise ValueError("Can only get a fetcher for a package with concrete versions.")
raise ValueError(
"Can only get a fetcher for a package with concrete versions.")
if not self._fetcher:
self._fetcher = self._make_fetcher()
return self._fetcher
@ -524,10 +519,11 @@ def fetcher(self):
def fetcher(self, f):
self._fetcher = f
@property
def extendee_spec(self):
"""Spec of the extendee of this package, or None if it is not an extension."""
"""
Spec of the extendee of this package, or None if it is not an extension
"""
if not self.extendees:
return None
@ -549,10 +545,11 @@ def extendee_spec(self):
spec, kwargs = self.extendees[name]
return spec
@property
def extendee_args(self):
"""Spec of the extendee of this package, or None if it is not an extension."""
"""
Spec of the extendee of this package, or None if it is not an extension
"""
if not self.extendees:
return None
@ -560,7 +557,6 @@ def extendee_args(self):
name = next(iter(self.extendees))
return self.extendees[name][1]
@property
def is_extension(self):
# if it is concrete, it's only an extension if it actually
@ -571,22 +567,20 @@ def is_extension(self):
# If not, then it's an extension if it *could* be an extension
return bool(self.extendees)
def extends(self, spec):
if not spec.name in self.extendees:
if spec.name not in self.extendees:
return False
s = self.extendee_spec
return s and s.satisfies(spec)
@property
def activated(self):
if not self.is_extension:
raise ValueError("is_extension called on package that is not an extension.")
raise ValueError(
"is_extension called on package that is not an extension.")
exts = spack.install_layout.extension_map(self.extendee_spec)
return (self.name in exts) and (exts[self.name] == self.spec)
def preorder_traversal(self, visited=None, **kwargs):
"""This does a preorder traversal of the package's dependence DAG."""
virtual = kwargs.get("virtual", False)
@ -605,36 +599,35 @@ def preorder_traversal(self, visited=None, **kwargs):
spec = self.dependencies[name]
# currently, we do not descend into virtual dependencies, as this
# makes doing a sensible traversal much harder. We just assume that
# ANY of the virtual deps will work, which might not be true (due to
# conflicts or unsatisfiable specs). For now this is ok but we might
# want to reinvestigate if we start using a lot of complicated virtual
# dependencies
# makes doing a sensible traversal much harder. We just assume
# that ANY of the virtual deps will work, which might not be true
# (due to conflicts or unsatisfiable specs). For now this is ok
# but we might want to reinvestigate if we start using a lot of
# complicated virtual dependencies
# TODO: reinvestigate this.
if spec.virtual:
if virtual:
yield spec
continue
for pkg in spack.repo.get(name).preorder_traversal(visited, **kwargs):
for pkg in spack.repo.get(name).preorder_traversal(visited,
**kwargs):
yield pkg
def provides(self, vpkg_name):
"""True if this package provides a virtual package with the specified name."""
"""
True if this package provides a virtual package with the specified name
"""
return any(s.name == vpkg_name for s in self.provided)
def virtual_dependencies(self, visited=None):
for spec in sorted(set(self.preorder_traversal(virtual=True))):
yield spec
@property
def installed(self):
return os.path.isdir(self.prefix)
@property
def installed_dependents(self):
"""Return a list of the specs of all installed packages that depend
@ -651,60 +644,62 @@ def installed_dependents(self):
dependents.append(spec)
return dependents
@property
def prefix(self):
"""Get the prefix into which this package should be installed."""
return self.spec.prefix
@property
def compiler(self):
"""Get the spack.compiler.Compiler object used to build this package."""
"""Get the spack.compiler.Compiler object used to build this package"""
if not self.spec.concrete:
raise ValueError("Can only get a compiler for a concrete package.")
return spack.compilers.compiler_for_spec(self.spec.compiler)
def url_version(self, version):
"""Given a version, this returns a string that should be substituted into the
package's URL to download that version.
By default, this just returns the version string. Subclasses may need to
override this, e.g. for boost versions where you need to ensure that there
are _'s in the download URL.
"""
Given a version, this returns a string that should be substituted
into the package's URL to download that version.
By default, this just returns the version string. Subclasses may need
to override this, e.g. for boost versions where you need to ensure that
there are _'s in the download URL.
"""
return str(version)
def remove_prefix(self):
"""Removes the prefix for a package along with any empty parent directories."""
"""
Removes the prefix for a package along with any empty parent
directories
"""
spack.install_layout.remove_install_directory(self.spec)
def do_fetch(self, mirror_only=False):
"""Creates a stage directory and downloads the tarball for this package.
Working directory will be set to the stage directory.
"""
Creates a stage directory and downloads the tarball for this package.
Working directory will be set to the stage directory.
"""
if not self.spec.concrete:
raise ValueError("Can only fetch concrete packages.")
start_time = time.time()
if spack.do_checksum and not self.version in self.versions:
tty.warn("There is no checksum on file to fetch %s safely."
% self.spec.format('$_$@'))
if spack.do_checksum and self.version not in self.versions:
tty.warn("There is no checksum on file to fetch %s safely." %
self.spec.format('$_$@'))
# Ask the user whether to skip the checksum if we're
# interactive, but just fail if non-interactive.
checksum_msg = "Add a checksum or use --no-checksum to skip this check."
checksum_msg = "Add a checksum or use --no-checksum to skip this check." # NOQA: ignore=E501
ignore_checksum = False
if sys.stdout.isatty():
ignore_checksum = tty.get_yes_or_no(" Fetch anyway?", default=False)
ignore_checksum = tty.get_yes_or_no(" Fetch anyway?",
default=False)
if ignore_checksum:
tty.msg("Fetching with no checksum.", checksum_msg)
if not ignore_checksum:
raise FetchError(
"Will not fetch %s" % self.spec.format('$_$@'), checksum_msg)
raise FetchError("Will not fetch %s" %
self.spec.format('$_$@'), checksum_msg)
self.stage.fetch(mirror_only)
@ -723,7 +718,6 @@ def do_stage(self, mirror_only=False):
self.stage.expand_archive()
self.stage.chdir_to_source()
def do_patch(self):
"""Calls do_stage(), then applied patches to the expanded tarball if they
haven't been applied already."""
@ -743,10 +737,10 @@ def do_patch(self):
# Construct paths to special files in the archive dir used to
# keep track of whether patches were successfully applied.
archive_dir = self.stage.source_path
good_file = join_path(archive_dir, '.spack_patched')
archive_dir = self.stage.source_path
good_file = join_path(archive_dir, '.spack_patched')
no_patches_file = join_path(archive_dir, '.spack_no_patches')
bad_file = join_path(archive_dir, '.spack_patch_failed')
bad_file = join_path(archive_dir, '.spack_patch_failed')
# If we encounter an archive that failed to patch, restage it
# so that we can apply all the patches again.
@ -801,13 +795,11 @@ def do_patch(self):
else:
touch(no_patches_file)
@property
def namespace(self):
namespace, dot, module = self.__module__.rpartition('.')
return namespace
def do_fake_install(self):
"""Make a fake install directory contaiing a 'fake' file in bin."""
mkdirp(self.prefix.bin)
@ -815,15 +807,15 @@ def do_fake_install(self):
mkdirp(self.prefix.lib)
mkdirp(self.prefix.man1)
def _get_needed_resources(self):
resources = []
# Select the resources that are needed for this build
for when_spec, resource_list in self.resources.items():
if when_spec in self.spec:
resources.extend(resource_list)
# Sorts the resources by the length of the string representing their destination. Since any nested resource
# must contain another resource's name in its path, it seems that should work
# Sorts the resources by the length of the string representing their
# destination. Since any nested resource must contain another
# resource's name in its path, it seems that should work
resources = sorted(resources, key=lambda res: len(res.destination))
return resources
@ -832,10 +824,14 @@ def _resource_stage(self, resource):
resource_stage_folder = '-'.join(pieces)
return resource_stage_folder
def do_install(self,
keep_prefix=False, keep_stage=False, ignore_deps=False,
skip_patch=False, verbose=False, make_jobs=None, fake=False):
keep_prefix=False,
keep_stage=False,
ignore_deps=False,
skip_patch=False,
verbose=False,
make_jobs=None,
fake=False):
"""Called by commands to install a package and its dependencies.
Package implementations should override install() to describe
@ -846,18 +842,20 @@ def do_install(self,
keep_stage -- By default, stage is destroyed only if there are no
exceptions during build. Set to True to keep the stage
even with exceptions.
ignore_deps -- Do not install dependencies before installing this package.
ignore_deps -- Don't install dependencies before installing this
package
fake -- Don't really build -- install fake stub files instead.
skip_patch -- Skip patch stage of build if True.
verbose -- Display verbose build output (by default, suppresses it)
make_jobs -- Number of make jobs to use for install. Default is ncpus.
make_jobs -- Number of make jobs to use for install. Default is ncpus
"""
if not self.spec.concrete:
raise ValueError("Can only install concrete packages.")
# No installation needed if package is external
if self.spec.external:
tty.msg("%s is externally installed in %s" % (self.name, self.spec.external))
tty.msg("%s is externally installed in %s" %
(self.name, self.spec.external))
return
# Ensure package is not already installed
@ -869,9 +867,13 @@ def do_install(self,
# First, install dependencies recursively.
if not ignore_deps:
self.do_install_dependencies(keep_prefix=keep_prefix,
                             keep_stage=keep_stage,
                             ignore_deps=ignore_deps,
                             fake=fake,
                             skip_patch=skip_patch,
                             verbose=verbose,
                             make_jobs=make_jobs)
# Set parallelism before starting build.
self.make_jobs = make_jobs
@ -899,35 +901,41 @@ def build_process():
self.do_fake_install()
else:
# Do the real install in the source directory.
self.stage.chdir_to_source()

# Save the build environment in a file before building.
env_path = join_path(os.getcwd(), 'spack-build.env')
try:
    # Redirect I/O to a build log (and optionally to
    # the terminal)
log_path = join_path(os.getcwd(), 'spack-build.out')
log_file = open(log_path, 'w')
with log_output(log_file, verbose, sys.stdout.isatty(),
                True):
dump_environment(env_path)
self.install(self.spec, self.prefix)
except ProcessError as e:
    # Annotate ProcessErrors with the location of
    # the build log
    e.build_log = log_path
    raise e
# Ensure that something was actually installed.
self.sanity_check_prefix()
# Copy provenance into the install directory on success
log_install_path = spack.install_layout.build_log_path(
    self.spec)
env_install_path = spack.install_layout.build_env_path(
    self.spec)
packages_dir = spack.install_layout.build_packages_path(
    self.spec)
install(log_path, log_install_path)
install(env_path, env_install_path)
dump_packages(self.spec, packages_dir)
# Run post install hooks before build stage is removed.
spack.hooks.post_install(self)
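The pattern above — record the environment, tee build output to a log, then copy both into the install prefix — can be sketched generically like this (helper names are illustrative, not Spack's):

import os
import subprocess

def run_and_log(cmd, workdir):
    # Capture a full transcript of the build in a spack-build.out-style file.
    log_path = os.path.join(workdir, 'build.out')
    with open(log_path, 'w') as log:
        proc = subprocess.Popen(cmd, cwd=workdir, stdout=subprocess.PIPE,
                                stderr=subprocess.STDOUT,
                                universal_newlines=True)
        for line in proc.stdout:
            log.write(line)  # persist every line for later inspection
        if proc.wait() != 0:
            raise RuntimeError("build failed; see %s" % log_path)
    return log_path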
@ -937,8 +945,9 @@ def build_process():
build_time = self._total_time - self._fetch_time
tty.msg("Successfully installed %s" % self.name,
"Fetch: %s. Build: %s. Total: %s."
% (_hms(self._fetch_time), _hms(build_time), _hms(self._total_time)))
"Fetch: %s. Build: %s. Total: %s." %
(_hms(self._fetch_time), _hms(build_time),
_hms(self._total_time)))
print_pkg(self.prefix)
try:
@ -953,16 +962,17 @@ def build_process():
tty.warn("Keeping install prefix in place despite error.",
"Spack will think this package is installed. " +
"Manually remove this directory to fix:",
self.prefix,
wrap=True)
raise
# note: PARENT of the build process adds the new package to
# the database, so that we don't need to re-read from file.
spack.installed_db.add(self.spec, self.prefix)
def sanity_check_prefix(self):
"""This function checks whether install succeeded."""
def check_paths(path_list, filetype, predicate):
if isinstance(path_list, basestring):
path_list = [path_list]
@ -970,8 +980,9 @@ def check_paths(path_list, filetype, predicate):
for path in path_list:
abs_path = os.path.join(self.prefix, path)
if not predicate(abs_path):
raise InstallError("Install failed for %s. No such %s in prefix: %s"
% (self.name, filetype, path))
raise InstallError(
"Install failed for %s. No such %s in prefix: %s" %
(self.name, filetype, path))
check_paths(self.sanity_check_is_file, 'file', os.path.isfile)
check_paths(self.sanity_check_is_dir, 'directory', os.path.isdir)
@ -982,13 +993,11 @@ def check_paths(path_list, filetype, predicate):
raise InstallError(
"Install failed for %s. Nothing was installed!" % self.name)
def do_install_dependencies(self, **kwargs):
# Pass along paths of dependencies here
for dep in self.spec.dependencies.values():
dep.package.do_install(**kwargs)
@property
def build_log_path(self):
if self.installed:
@ -996,7 +1005,6 @@ def build_log_path(self):
else:
return join_path(self.stage.source_path, 'spack-build.out')
@property
def module(self):
"""Use this to add variables to the class's module's scope.
@ -1006,7 +1014,7 @@ def module(self):
fromlist=[self.__class__.__name__])
def setup_environment(self, spack_env, run_env):
"""Set up the compile and runtime environemnts for a package.
"""Set up the compile and runtime environments for a package.
`spack_env` and `run_env` are `EnvironmentModifications`
objects. Package authors can call methods on them to alter
@ -1037,7 +1045,6 @@ def setup_environment(self, spack_env, run_env):
"""
pass
def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
"""Set up the environment of packages that depend on this one.
@ -1077,7 +1084,6 @@ def setup_dependent_environment(self, spack_env, run_env, dependent_spec):
"""
self.setup_environment(spack_env, run_env)
def setup_dependent_package(self, module, dependent_spec):
"""Set up Python module-scope variables for dependent packages.
@ -1123,8 +1129,11 @@ def setup_dependent_package(self, module, dependent_spec):
pass
def install(self, spec, prefix):
"""Package implementations override this with their own build configuration."""
raise InstallError("Package %s provides no install method!" % self.name)
"""
Package implementations override this with their own configuration
"""
raise InstallError("Package %s provides no install method!" %
self.name)
def do_uninstall(self, force=False):
if not self.installed:
@ -1146,12 +1155,10 @@ def do_uninstall(self, force=False):
# Once everything else is done, run post install hooks
spack.hooks.post_uninstall(self)
def _check_extendable(self):
if not self.extendable:
raise ValueError("Package %s is not extendable!" % self.name)
def _sanity_check_extension(self):
if not self.is_extension:
raise ActivationError("This package is not an extension.")
@ -1160,12 +1167,13 @@ def _sanity_check_extension(self):
extendee_package._check_extendable()
if not extendee_package.installed:
raise ActivationError("Can only (de)activate extensions for installed packages.")
raise ActivationError(
"Can only (de)activate extensions for installed packages.")
if not self.installed:
raise ActivationError("Extensions must first be installed.")
if self.extendee_spec.name not in self.extendees:
    raise ActivationError("%s does not extend %s!" %
                          (self.name, self.extendee.name))
def do_activate(self, force=False):
"""Called on an etension to invoke the extendee's activate method.
@ -1175,8 +1183,8 @@ def do_activate(self, force=False):
"""
self._sanity_check_extension()
spack.install_layout.check_extension_conflict(self.extendee_spec,
                                              self.spec)
# Activate any package dependencies that are also extensions.
if not force:
@ -1188,9 +1196,8 @@ def do_activate(self, force=False):
self.extendee_spec.package.activate(self, **self.extendee_args)
spack.install_layout.add_extension(self.extendee_spec, self.spec)
tty.msg("Activated extension %s for %s"
% (self.spec.short_spec, self.extendee_spec.format("$_$@$+$%@")))
tty.msg("Activated extension %s for %s" %
(self.spec.short_spec, self.extendee_spec.format("$_$@$+$%@")))
def activate(self, extension, **kwargs):
"""Symlinks all files from the extension into extendee's install dir.
@ -1201,6 +1208,7 @@ def activate(self, extension, **kwargs):
always executed.
"""
def ignore(filename):
return (filename in spack.install_layout.hidden_file_paths or
kwargs.get('ignore', lambda f: False)(filename))
@ -1212,7 +1220,6 @@ def ignore(filename):
tree.merge(self.prefix, ignore=ignore)
def do_deactivate(self, **kwargs):
"""Called on the extension to invoke extendee's deactivate() method."""
self._sanity_check_extension()
@ -1230,7 +1237,7 @@ def do_deactivate(self, **kwargs):
for dep in aspec.traverse():
if self.spec == dep:
raise ActivationError(
"Cannot deactivate %s beacuse %s is activated and depends on it."
"Cannot deactivate %s because %s is activated and depends on it." # NOQA: ignore=E501
% (self.spec.short_spec, aspec.short_spec))
self.extendee_spec.package.deactivate(self, **self.extendee_args)
@ -1238,11 +1245,11 @@ def do_deactivate(self, **kwargs):
# redundant activation check -- makes SURE the spec is not
# still activated even if something was wrong above.
if self.activated:
spack.install_layout.remove_extension(self.extendee_spec,
                                      self.spec)

tty.msg("Deactivated extension %s for %s" %
        (self.spec.short_spec, self.extendee_spec.format("$_$@$+$%@")))
def deactivate(self, extension, **kwargs):
"""Unlinks all files from extension out of this package's install dir.
@ -1253,6 +1260,7 @@ def deactivate(self, extension, **kwargs):
always executed.
"""
def ignore(filename):
return (filename in spack.install_layout.hidden_file_paths or
kwargs.get('ignore', lambda f: False)(filename))
@ -1260,17 +1268,14 @@ def ignore(filename):
tree = LinkTree(extension.prefix)
tree.unmerge(self.prefix, ignore=ignore)
def do_restage(self):
"""Reverts expanded/checked out source to a pristine state."""
self.stage.restage()
def do_clean(self):
"""Removes the package's build stage and source tarball."""
self.stage.destroy()
def format_doc(self, **kwargs):
"""Wrap doc string at 72 characters and format nicely"""
indent = kwargs.get('indent', 0)
@ -1285,7 +1290,6 @@ def format_doc(self, **kwargs):
results.write((" " * indent) + line + "\n")
return results.getvalue()
@property
def all_urls(self):
urls = []
@ -1297,7 +1301,6 @@ def all_urls(self):
urls.append(args['url'])
return urls
def fetch_remote_versions(self):
"""Try to find remote versions of this package using the
list_url and any other URLs described in the package file."""
@ -1306,26 +1309,30 @@ def fetch_remote_versions(self):
try:
return spack.util.web.find_versions_of_archive(
*self.all_urls,
list_url=self.list_url,
list_depth=self.list_depth)
except spack.error.NoNetworkConnectionError as e:
tty.die("Package.fetch_versions couldn't connect to:",
e.url, e.message)
tty.die("Package.fetch_versions couldn't connect to:", e.url,
e.message)
@property
def rpath(self):
"""Get the rpath this package links with, as a list of paths."""
rpaths = [self.prefix.lib, self.prefix.lib64]
rpaths.extend(d.prefix.lib
              for d in self.spec.traverse(root=False)
              if os.path.isdir(d.prefix.lib))
rpaths.extend(d.prefix.lib64
              for d in self.spec.traverse(root=False)
              if os.path.isdir(d.prefix.lib64))
return rpaths
@property
def rpath_args(self):
"""Get the rpath args as a string, with -Wl,-rpath, for each element."""
"""
Get the rpath args as a string, with -Wl,-rpath, for each element
"""
return " ".join("-Wl,-rpath,%s" % p for p in self.rpath)
@ -1333,6 +1340,7 @@ def install_dependency_symlinks(pkg, spec, prefix):
"""Execute a dummy install and flatten dependencies"""
flatten_dependencies(spec, prefix)
def flatten_dependencies(spec, flat_dir):
"""Make each dependency of spec present in dir via symlink."""
for dep in spec.traverse(root=False):
@ -1341,13 +1349,13 @@ def flatten_dependencies(spec, flat_dir):
dep_path = spack.install_layout.path_for_spec(dep)
dep_files = LinkTree(dep_path)
os.mkdir(flat_dir + '/' + name)

conflict = dep_files.find_conflict(flat_dir + '/' + name)
if conflict:
    raise DependencyConflictError(conflict)

dep_files.merge(flat_dir + '/' + name)
def validate_package_url(url_string):
@ -1388,9 +1396,11 @@ def dump_packages(spec, path):
# Create a source repo and get the pkg directory out of it.
try:
source_repo = spack.repository.Repo(source_repo_root)
source_pkg_dir = source_repo.dirname_for_package_name(
    node.name)
except RepoError:
    tty.warn("Warning: Couldn't copy in provenance for %s" %
             node.name)
# Create a destination repository
dest_repo_root = join_path(path, node.namespace)
@ -1410,7 +1420,7 @@ def print_pkg(message):
"""Outputs a message with a package icon."""
from llnl.util.tty.color import cwrite
cwrite('@*g{[+]} ')
print(message)
def _hms(seconds):
@ -1419,20 +1429,25 @@ def _hms(seconds):
h, m = divmod(m, 60)
parts = []
if h:
    parts.append("%dh" % h)
if m:
    parts.append("%dm" % m)
if s:
    parts.append("%.2fs" % s)
return ' '.join(parts)
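For example, _hms(3725.5) returns '1h 2m 5.50s'; zero-valued components are dropped, so _hms(45) gives '45.00s' and _hms(3600) gives '1h'.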
class FetchError(spack.error.SpackError):
"""Raised when something goes wrong during fetch."""
def __init__(self, message, long_msg=None):
super(FetchError, self).__init__(message, long_msg)
class InstallError(spack.error.SpackError):
"""Raised when something goes wrong during install or uninstall."""
def __init__(self, message, long_msg=None):
super(InstallError, self).__init__(message, long_msg)
@ -1443,21 +1458,24 @@ class ExternalPackageError(InstallError):
class PackageStillNeededError(InstallError):
"""Raised when package is still needed by another on uninstall."""
def __init__(self, spec, dependents):
super(PackageStillNeededError, self).__init__("Cannot uninstall %s" %
                                              spec)
self.spec = spec
self.dependents = dependents
class PackageError(spack.error.SpackError):
"""Raised when something is wrong with a package definition."""
def __init__(self, message, long_msg=None):
super(PackageError, self).__init__(message, long_msg)
class PackageVersionError(PackageError):
"""Raised when a version URL cannot automatically be determined."""
def __init__(self, version):
super(PackageVersionError, self).__init__(
"Cannot determine a URL automatically for version %s" % version,
@ -1466,6 +1484,7 @@ def __init__(self, version):
class VersionFetchError(PackageError):
"""Raised when a version URL cannot automatically be determined."""
def __init__(self, cls):
super(VersionFetchError, self).__init__(
"Cannot fetch versions for package %s " % cls.__name__ +
@ -1474,12 +1493,15 @@ def __init__(self, cls):
class NoURLError(PackageError):
"""Raised when someone tries to build a URL for a package with no URLs."""
def __init__(self, cls):
super(NoURLError, self).__init__(
"Package %s has no version with a URL." % cls.__name__)
class ExtensionError(PackageError):
    pass
class ExtensionConflictError(ExtensionError):
@ -1495,7 +1517,8 @@ def __init__(self, msg, long_msg=None):
class DependencyConflictError(spack.error.SpackError):
"""Raised when the dependencies cannot be flattened as asked for."""
def __init__(self, conflict):
super(DependencyConflictError, self).__init__(
"%s conflicts with another file in the flattened directory." %(
"%s conflicts with another file in the flattened directory." % (
conflict))

View File

@ -23,52 +23,23 @@
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import sys

import llnl.util.tty as tty
import nose
import spack
from llnl.util.filesystem import join_path
from llnl.util.tty.colify import colify
from spack.test.tally_plugin import Tally
"""Names of tests to be included in Spack's test suite"""
test_names = ['versions', 'url_parse', 'url_substitution', 'packages', 'stage',
              'spec_syntax', 'spec_semantics', 'spec_dag', 'concretize',
              'multimethod', 'install', 'package_sanity', 'config',
              'directory_layout', 'pattern', 'python_version', 'git_fetch',
              'svn_fetch', 'hg_fetch', 'mirror', 'modules', 'url_extrapolate',
              'cc', 'link_tree', 'spec_yaml', 'optional_deps',
              'make_executable', 'configure_guess', 'lock', 'database',
              'namespace_trie', 'yaml', 'sbang', 'environment',
              'cmd.uninstall', 'cmd.test_install']
def list_tests():
@ -79,8 +50,6 @@ def list_tests():
def run(names, outputDir, verbose=False):
"""Run tests with the supplied names. Names should be a list. If
it's empty, run ALL of Spack's tests."""
verbosity = 1 if not verbose else 2
if not names:
names = test_names
else:
@ -94,7 +63,7 @@ def run(names, outputDir, verbose=False):
tally = Tally()
for test in names:
module = 'spack.test.' + test
print(module)
tty.msg("Running test: %s" % test)
@ -104,15 +73,13 @@ def run(names, outputDir, verbose=False):
xmlOutputFname = "unittests-{0}.xml".format(test)
xmlOutputPath = join_path(outputDir, xmlOutputFname)
runOpts += ["--with-xunit",
"--xunit-file={0}".format(xmlOutputPath)]
"--xunit-file={0}".format(xmlOutputPath)]
argv = [""] + runOpts + [module]
nose.run(argv=argv, addplugins=[tally])
succeeded = not tally.failCount and not tally.errorCount
tty.msg("Tests Complete.",
"%5d tests run" % tally.numberOfTestsRun,
"%5d failures" % tally.failCount,
"%5d errors" % tally.errorCount)
tty.msg("Tests Complete.", "%5d tests run" % tally.numberOfTestsRun,
"%5d failures" % tally.failCount, "%5d errors" % tally.errorCount)
if succeeded:
tty.info("OK", format='g')

View File

@ -0,0 +1,157 @@
import collections
from contextlib import contextmanager
import StringIO
import spack.modules
from spack.test.mock_packages_test import MockPackagesTest
FILE_REGISTRY = collections.defaultdict(StringIO.StringIO)
# Monkey-patch open to write module files to a StringIO instance
@contextmanager
def mock_open(filename, mode):
if not mode == 'w':
raise RuntimeError(
'test.modules : unexpected opening mode for monkey-patched open')
FILE_REGISTRY[filename] = StringIO.StringIO()
try:
yield FILE_REGISTRY[filename]
finally:
handle = FILE_REGISTRY[filename]
FILE_REGISTRY[filename] = handle.getvalue()
handle.close()
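A quick illustration of what the monkey-patch captures (the path is made up; in the tests the registry is keyed by the generated module file name):

with mock_open('/fake/modulefile', 'w') as f:
    f.write('module-whatis "demo"\n')
# After the block, the registry holds the final text, not a live handle:
assert FILE_REGISTRY['/fake/modulefile'] == 'module-whatis "demo"\n'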
configuration_autoload_direct = {
'enable': ['tcl'],
'tcl': {
'all': {
'autoload': 'direct'
}
}
}
configuration_autoload_all = {
'enable': ['tcl'],
'tcl': {
'all': {
'autoload': 'all'
}
}
}
configuration_alter_environment = {
'enable': ['tcl'],
'tcl': {
'all': {
'filter': {'environment_blacklist': ['CMAKE_PREFIX_PATH']}
},
'=x86-linux': {
'environment': {'set': {'FOO': 'foo'},
'unset': ['BAR']}
}
}
}
configuration_blacklist = {
'enable': ['tcl'],
'tcl': {
'blacklist': ['callpath'],
'all': {
'autoload': 'direct'
}
}
}
configuration_conflicts = {
'enable': ['tcl'],
'tcl': {
'naming_scheme': '{name}/{version}-{compiler.name}',
'all': {
'conflict': ['{name}', 'intel/14.0.1']
}
}
}
class TclTests(MockPackagesTest):
def setUp(self):
super(TclTests, self).setUp()
self.configuration_obj = spack.modules.CONFIGURATION
spack.modules.open = mock_open
# Make sure that a non-mocked configuration will trigger an error
spack.modules.CONFIGURATION = None
def tearDown(self):
del spack.modules.open
spack.modules.CONFIGURATION = self.configuration_obj
super(TclTests, self).tearDown()
def get_modulefile_content(self, spec):
spec.concretize()
generator = spack.modules.TclModule(spec)
generator.write()
content = FILE_REGISTRY[generator.file_name].split('\n')
return content
def test_simple_case(self):
spack.modules.CONFIGURATION = configuration_autoload_direct
spec = spack.spec.Spec('mpich@3.0.4=x86-linux')
content = self.get_modulefile_content(spec)
self.assertTrue('module-whatis "mpich @3.0.4"' in content)
def test_autoload(self):
spack.modules.CONFIGURATION = configuration_autoload_direct
spec = spack.spec.Spec('mpileaks=x86-linux')
content = self.get_modulefile_content(spec)
self.assertEqual(len([x for x in content if 'is-loaded' in x]), 2)
self.assertEqual(len([x for x in content if 'module load ' in x]), 2)
spack.modules.CONFIGURATION = configuration_autoload_all
spec = spack.spec.Spec('mpileaks=x86-linux')
content = self.get_modulefile_content(spec)
self.assertEqual(len([x for x in content if 'is-loaded' in x]), 5)
self.assertEqual(len([x for x in content if 'module load ' in x]), 5)
def test_alter_environment(self):
spack.modules.CONFIGURATION = configuration_alter_environment
spec = spack.spec.Spec('mpileaks=x86-linux')
content = self.get_modulefile_content(spec)
self.assertEqual(
len([x
for x in content
if x.startswith('prepend-path CMAKE_PREFIX_PATH')]), 0)
self.assertEqual(
len([x for x in content if 'setenv FOO "foo"' in x]), 1)
self.assertEqual(len([x for x in content if 'unsetenv BAR' in x]), 1)
spec = spack.spec.Spec('libdwarf=x64-linux')
content = self.get_modulefile_content(spec)
self.assertEqual(
len([x
for x in content
if x.startswith('prepend-path CMAKE_PREFIX_PATH')]), 0)
self.assertEqual(
len([x for x in content if 'setenv FOO "foo"' in x]), 0)
self.assertEqual(len([x for x in content if 'unsetenv BAR' in x]), 0)
def test_blacklist(self):
spack.modules.CONFIGURATION = configuration_blacklist
spec = spack.spec.Spec('mpileaks=x86-linux')
content = self.get_modulefile_content(spec)
self.assertEqual(len([x for x in content if 'is-loaded' in x]), 1)
self.assertEqual(len([x for x in content if 'module load ' in x]), 1)
def test_conflicts(self):
spack.modules.CONFIGURATION = configuration_conflicts
spec = spack.spec.Spec('mpileaks=x86-linux')
content = self.get_modulefile_content(spec)
self.assertEqual(
len([x for x in content if x.startswith('conflict')]), 2)
self.assertEqual(
len([x for x in content if x == 'conflict mpileaks']), 1)
self.assertEqual(
len([x for x in content if x == 'conflict intel/14.0.1']), 1)

View File

@ -41,7 +41,7 @@
# commands. This allows the user to use packages without knowing all
# their installation details.
#
# e.g., rather than requiring a full spec for libelf, the user can type:
#
# spack use libelf
#
@ -113,11 +113,11 @@ function spack {
unuse $_sp_module_args $_sp_full_spec
fi ;;
"load")
if _sp_full_spec=$(command spack $_sp_flags module find tcl $_sp_spec); then
module load $_sp_module_args $_sp_full_spec
fi ;;
"unload")
if _sp_full_spec=$(command spack $_sp_flags module find tcl $_sp_spec); then
module unload $_sp_module_args $_sp_full_spec
fi ;;
esac