Update packages.yaml format and support configuration updates

The YAML config for paths and modules of external packages has
changed: the new format allows a single spec to load multiple
modules. Spack will automatically convert from the old format
when reading the configs (the updates do not add new essential
properties, so this change in Spack is backwards-compatible).

With this update, Spack can no longer modify existing configs/environments
without first updating them (e.g. "spack config add" will fail if the
configuration is in a format that predates this PR). The user is
prompted to perform the update explicitly, and commands are provided
to do so ("spack config update" / "spack env update"). All config
scopes can be updated at once; each environment must be updated one
at a time.
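For instance, an old-format entry such as the Cray ``mpich`` one changed in this commit becomes a single ``externals:`` list entry that can carry several modules:

```yaml
# Old format: one attribute per kind of external, one value per spec
packages:
  mpich:
    modules:
      mpich@7.3.1%gcc@5.2.0: cray-mpich

# New format: a list of externals; each entry may load multiple modules
packages:
  mpich:
    externals:
    - spec: "mpich@7.3.1%gcc@5.2.0"
      modules:
      - cray-mpich
```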
Massimiliano Culpo 2020-07-31 12:58:48 +02:00 committed by Peter Scheibel
parent 1398038bee
commit 193e8333fa
35 changed files with 1134 additions and 240 deletions


@@ -21,11 +21,14 @@ packages:
   - gcc
   - intel
   providers:
-    elf: [libelf]
-    unwind: [apple-libunwind]
+    elf:
+    - libelf
+    unwind:
+    - apple-libunwind
   apple-libunwind:
-    paths:
+    buildable: false
+    externals:
     # Apple bundles libunwind version 35.3 with macOS 10.9 and later,
     # although the version number used here isn't critical
-    apple-libunwind@35.3: /usr
-    buildable: False
+    - spec: apple-libunwind@35.3
+      prefix: /usr


@@ -57,10 +57,13 @@ directory. Here's an example of an external configuration:
    packages:
      openmpi:
-       paths:
-         openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64: /opt/openmpi-1.4.3
-         openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug: /opt/openmpi-1.4.3-debug
-         openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64: /opt/openmpi-1.6.5-intel
+       externals:
+       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
+         prefix: /opt/openmpi-1.4.3
+       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
+         prefix: /opt/openmpi-1.4.3-debug
+       - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
+         prefix: /opt/openmpi-1.6.5-intel
 
 This example lists three installations of OpenMPI, one built with GCC,
 one built with GCC and debug information, and another built with Intel.
@@ -76,13 +79,15 @@ of the installation prefixes. The following example says that module
 .. code-block:: yaml
 
    cmake:
-     modules:
-       cmake@3.7.2: CMake/3.7.2
+     externals:
+     - spec: cmake@3.7.2
+       modules:
+       - CMake/3.7.2
 
-Each ``packages.yaml`` begins with a ``packages:`` token, followed
-by a list of package names. To specify externals, add a ``paths`` or ``modules``
-token under the package name, which lists externals in a
-``spec: /path`` or ``spec: module-name`` format. Each spec should be as
+Each ``packages.yaml`` begins with a ``packages:`` attribute, followed
+by a list of package names. To specify externals, add an ``externals:``
+attribute under the package name, which lists externals.
+Each external should specify a ``spec:`` string that should be as
 well-defined as reasonably possible. If a
 package lacks a spec component, such as missing a compiler or
 package version, then Spack will guess the missing component based
@@ -106,10 +111,13 @@ be:
    packages:
      openmpi:
-       paths:
-         openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64: /opt/openmpi-1.4.3
-         openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug: /opt/openmpi-1.4.3-debug
-         openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64: /opt/openmpi-1.6.5-intel
+       externals:
+       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
+         prefix: /opt/openmpi-1.4.3
+       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
+         prefix: /opt/openmpi-1.4.3-debug
+       - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
+         prefix: /opt/openmpi-1.6.5-intel
       buildable: False
 
 The addition of the ``buildable`` flag tells Spack that it should never build
@@ -137,10 +145,13 @@ but more conveniently:
    mpi:
      buildable: False
    openmpi:
-     paths:
-       openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64: /opt/openmpi-1.4.3
-       openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug: /opt/openmpi-1.4.3-debug
-       openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64: /opt/openmpi-1.6.5-intel
+     externals:
+     - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
+       prefix: /opt/openmpi-1.4.3
+     - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
+       prefix: /opt/openmpi-1.4.3-debug
+     - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
+       prefix: /opt/openmpi-1.6.5-intel
 
 Implementations can also be listed immediately under the virtual they provide:
@@ -172,8 +183,9 @@ After running this command your ``packages.yaml`` may include new entries:
    packages:
      cmake:
-       paths:
-         cmake@3.17.2: /usr
+       externals:
+       - spec: cmake@3.17.2
+         prefix: /usr
 
 Generally this is useful for detecting a small set of commonly-used packages;
 for now this is generally limited to finding build-only dependencies.


@@ -418,9 +418,13 @@ Adapt the following example. Be sure to maintain the indentation:
    # other content ...
    intel-mkl:
-     modules:
-       intel-mkl@2018.2.199 arch=linux-centos6-x86_64: intel-mkl/18/18.0.2
-       intel-mkl@2018.3.222 arch=linux-centos6-x86_64: intel-mkl/18/18.0.3
+     externals:
+     - spec: "intel-mkl@2018.2.199 arch=linux-centos6-x86_64"
+       modules:
+       - intel-mkl/18/18.0.2
+     - spec: "intel-mkl@2018.3.222 arch=linux-centos6-x86_64"
+       modules:
+       - intel-mkl/18/18.0.3
 
 The version numbers for the ``intel-mkl`` specs defined here correspond to file
 and directory names that Intel uses for its products because they were adopted
@@ -451,12 +455,16 @@ mechanism.
    packages:
      intel-parallel-studio:
-       modules:
-         intel-parallel-studio@cluster.2018.2.199 +mkl+mpi+ipp+tbb+daal arch=linux-centos6-x86_64: intel/18/18.0.2
-         intel-parallel-studio@cluster.2018.3.222 +mkl+mpi+ipp+tbb+daal arch=linux-centos6-x86_64: intel/18/18.0.3
+       externals:
+       - spec: "intel-parallel-studio@cluster.2018.2.199 +mkl+mpi+ipp+tbb+daal arch=linux-centos6-x86_64"
+         modules:
+         - intel/18/18.0.2
+       - spec: "intel-parallel-studio@cluster.2018.3.222 +mkl+mpi+ipp+tbb+daal arch=linux-centos6-x86_64"
+         modules:
+         - intel/18/18.0.3
       buildable: False
 
-One additional example illustrates the use of ``paths:`` instead of
+One additional example illustrates the use of ``prefix:`` instead of
 ``modules:``, useful when external modulefiles are not available or not
 suitable:
@@ -464,13 +472,15 @@ suitable:
    packages:
      intel-parallel-studio:
-       paths:
-         intel-parallel-studio@cluster.2018.2.199 +mkl+mpi+ipp+tbb+daal: /opt/intel
-         intel-parallel-studio@cluster.2018.3.222 +mkl+mpi+ipp+tbb+daal: /opt/intel
+       externals:
+       - spec: "intel-parallel-studio@cluster.2018.2.199 +mkl+mpi+ipp+tbb+daal"
+         prefix: /opt/intel
+       - spec: "intel-parallel-studio@cluster.2018.3.222 +mkl+mpi+ipp+tbb+daal"
+         prefix: /opt/intel
       buildable: False
 
 Note that for the Intel packages discussed here, the directory values in the
-``paths:`` entries must be the high-level and typically version-less
+``prefix:`` entries must be the high-level and typically version-less
 "installation directory" that has been used by Intel's product installer.
 Such a directory will typically accumulate various product versions. Amongst
 them, Spack will select the correct version-specific product directory based on


@@ -712,8 +712,9 @@ an OpenMPI installed in /opt/local, one would use:
    packages:
      openmpi:
-       paths:
-         openmpi@1.10.1: /opt/local
+       externals:
+       - spec: openmpi@1.10.1
+         prefix: /opt/local
       buildable: False
 
 In general, Spack is easier to use and more reliable if it builds all of
@@ -775,8 +776,9 @@ Then add the following to ``~/.spack/packages.yaml``:
    packages:
      openssl:
-       paths:
-         openssl@1.0.2g: /usr
+       externals:
+       - spec: openssl@1.0.2g
+         prefix: /usr
       buildable: False
@@ -791,8 +793,9 @@ to add the following to ``packages.yaml``:
    packages:
      netlib-lapack:
-       paths:
-         netlib-lapack@3.6.1: /usr
+       externals:
+       - spec: netlib-lapack@3.6.1
+         prefix: /usr
       buildable: False
    all:
      providers:
@@ -1181,9 +1184,13 @@ Here's an example of an external configuration for cray modules:
    packages:
      mpich:
-       modules:
-         mpich@7.3.1%gcc@5.2.0 arch=cray_xc-haswell-CNL10: cray-mpich
-         mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-haswell-CNL10: cray-mpich
+       externals:
+       - spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-haswell-CNL10"
+         modules:
+         - cray-mpich
+       - spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-haswell-CNL10"
+         modules:
+         - cray-mpich
    all:
      providers:
        mpi: [mpich]
@@ -1195,7 +1202,7 @@ via module load.
 .. note::
 
-   For Cray-provided packages, it is best to use ``modules:`` instead of ``paths:``
+   For Cray-provided packages, it is best to use ``modules:`` instead of ``prefix:``
    in ``packages.yaml``, because the Cray Programming Environment heavily relies on
    modules (e.g., loading the ``cray-mpich`` module adds MPI libraries to the
    compiler wrapper link line).
@@ -1211,19 +1218,31 @@ Here is an example of a full packages.yaml used at NERSC
    packages:
      mpich:
-       modules:
-         mpich@7.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge: cray-mpich
-         mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-SuSE11-ivybridge: cray-mpich
+       externals:
+       - spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
+         modules:
+         - cray-mpich
+       - spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-SuSE11-ivybridge"
+         modules:
+         - cray-mpich
       buildable: False
      netcdf:
-       modules:
-         netcdf@4.3.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge: cray-netcdf
-         netcdf@4.3.3.1%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge: cray-netcdf
+       externals:
+       - spec: "netcdf@4.3.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
+         modules:
+         - cray-netcdf
+       - spec: "netcdf@4.3.3.1%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
+         modules:
+         - cray-netcdf
       buildable: False
      hdf5:
-       modules:
-         hdf5@1.8.14%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge: cray-hdf5
-         hdf5@1.8.14%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge: cray-hdf5
+       externals:
+       - spec: "hdf5@1.8.14%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
+         modules:
+         - cray-hdf5
+       - spec: "hdf5@1.8.14%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
+         modules:
+         - cray-hdf5
       buildable: False
    all:
      compiler: [gcc@5.2.0, intel@16.0.0.109]


@@ -1545,8 +1545,9 @@ Avoid double-installing CUDA by adding, e.g.
    packages:
      cuda:
-       paths:
-         cuda@9.0.176%gcc@5.4.0 arch=linux-ubuntu16-x86_64: /usr/local/cuda
+       externals:
+       - spec: "cuda@9.0.176%gcc@5.4.0 arch=linux-ubuntu16-x86_64"
+         prefix: /usr/local/cuda
       buildable: False
 
 to your ``packages.yaml``.


@@ -2,7 +2,6 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 
 import collections
 import errno
 import hashlib
@@ -377,17 +376,17 @@ def install(src, dest):
     copy(src, dest, _permissions=True)
 
-def resolve_link_target_relative_to_the_link(l):
+def resolve_link_target_relative_to_the_link(link):
     """
     os.path.isdir uses os.path.exists, which for links will check
     the existence of the link target. If the link target is relative to
     the link, we need to construct a pathname that is valid from
     our cwd (which may not be the same as the link's directory)
     """
-    target = os.readlink(l)
+    target = os.readlink(link)
     if os.path.isabs(target):
         return target
-    link_dir = os.path.dirname(os.path.abspath(l))
+    link_dir = os.path.dirname(os.path.abspath(link))
     return os.path.join(link_dir, target)
@@ -1570,6 +1569,19 @@ def can_access_dir(path):
     return os.path.isdir(path) and os.access(path, os.R_OK | os.X_OK)
 
+@memoized
+def can_write_to_dir(path):
+    """Return True if the argument is a directory in which we can write.
+
+    Args:
+        path: path to be tested
+
+    Returns:
+        True if ``path`` is a writable directory, else False
+    """
+    return os.path.isdir(path) and os.access(path, os.R_OK | os.X_OK | os.W_OK)
+
+
 @memoized
 def files_in(*search_paths):
     """Returns all the files in paths passed as arguments.
@@ -1683,3 +1695,18 @@ def prefixes(path):
             pass
 
     return paths
+
+
+def md5sum(file):
+    """Compute the MD5 sum of a file.
+
+    Args:
+        file (str): file to be checksummed
+
+    Returns:
+        MD5 sum of the file's content
+    """
+    md5 = hashlib.md5()
+    with open(file, "rb") as f:
+        md5.update(f.read())
+    return md5.digest()


@@ -62,7 +62,7 @@
 from spack.util.environment import system_dirs
 from spack.error import NoLibrariesError, NoHeadersError
 from spack.util.executable import Executable
-from spack.util.module_cmd import load_module, get_path_from_module, module
+from spack.util.module_cmd import load_module, path_from_modules, module
 from spack.util.log_parse import parse_log_events, make_log_context

@@ -642,7 +642,7 @@ def get_rpaths(pkg):
     # Second module is our compiler mod name. We use that to get rpaths from
     # module show output.
     if pkg.compiler.modules and len(pkg.compiler.modules) > 1:
-        rpaths.append(get_path_from_module(pkg.compiler.modules[1]))
+        rpaths.append(path_from_modules(pkg.compiler.modules[1]))
 
     return list(dedupe(filter_system_paths(rpaths)))

@@ -706,8 +706,9 @@ def load_external_modules(pkg):
         pkg (PackageBase): package to load deps for
     """
     for dep in list(pkg.spec.traverse()):
-        if dep.external_module:
-            load_module(dep.external_module)
+        external_modules = dep.external_modules or []
+        for external_module in external_modules:
+            load_module(external_module)
 
 def setup_package(pkg, dirty):
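The `load_external_modules` change above switches from a single `external_module` attribute to a list-valued `external_modules`. A standalone sketch of the new control flow (plain dicts stand in for Spack specs; nothing here is the actual Spack API):

```python
def load_external_modules(specs, load_module):
    """Load every module declared by external deps; 'external_modules'
    is now a list (or None) rather than a single module name."""
    for dep in specs:
        for external_module in (dep.get('external_modules') or []):
            load_module(external_module)

# Record which modules would be loaded, in order
loaded = []
load_external_modules(
    [{'external_modules': ['cray-mpich', 'cray-libsci']},
     {'external_modules': None}],
    loaded.append)
assert loaded == ['cray-mpich', 'cray-libsci']
```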


@@ -2,16 +2,20 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 from __future__ import print_function
 
+import collections
 import os
 import re
+import shutil
 
+import llnl.util.filesystem as fs
 import llnl.util.tty as tty
 
 import spack.config
+import spack.cmd.common.arguments
 import spack.schema.env
 import spack.environment as ev
+import spack.schema.packages
 import spack.util.spack_yaml as syaml
 from spack.util.editor import editor

@@ -80,6 +84,19 @@ def setup_parser(subparser):
     # Make the add parser available later
     setup_parser.add_parser = add_parser
 
+    update = sp.add_parser(
+        'update', help='update configuration files to the latest format'
+    )
+    spack.cmd.common.arguments.add_common_arguments(update, ['yes_to_all'])
+    update.add_argument('section', help='section to update')
+
+    revert = sp.add_parser(
+        'revert',
+        help='revert configuration files to their state before update'
+    )
+    spack.cmd.common.arguments.add_common_arguments(revert, ['yes_to_all'])
+    revert.add_argument('section', help='section to revert')
 
 def _get_scope_and_section(args):
     """Extract config scope and section from arguments."""
@@ -275,12 +292,164 @@ def config_remove(args):
     set_config(args, path, existing, scope)
 
+def _can_update_config_file(scope_dir, cfg_file):
+    dir_ok = fs.can_write_to_dir(scope_dir)
+    cfg_ok = fs.can_access(cfg_file)
+    return dir_ok and cfg_ok
+
+
+def config_update(args):
+    # Read the configuration files
+    spack.config.config.get_config(args.section, scope=args.scope)
+    updates = spack.config.config.format_updates[args.section]
+
+    cannot_overwrite, skip_system_scope = [], False
+    for scope in updates:
+        cfg_file = spack.config.config.get_config_filename(
+            scope.name, args.section
+        )
+        scope_dir = scope.path
+        can_be_updated = _can_update_config_file(scope_dir, cfg_file)
+        if not can_be_updated:
+            if scope.name == 'system':
+                skip_system_scope = True
+                msg = ('Not enough permissions to write to "system" scope. '
+                       'Skipping update at that location [cfg={0}]')
+                tty.warn(msg.format(cfg_file))
+                continue
+            cannot_overwrite.append((scope, cfg_file))
+
+    if cannot_overwrite:
+        msg = 'Detected permission issues with the following scopes:\n\n'
+        for scope, cfg_file in cannot_overwrite:
+            msg += '\t[scope={0}, cfg={1}]\n'.format(scope.name, cfg_file)
+        msg += ('\nEither ensure that you have sufficient permissions to '
+                'modify these files or do not include these scopes in the '
+                'update.')
+        tty.die(msg)
+
+    if skip_system_scope:
+        updates = [x for x in updates if x.name != 'system']
+
+    # Report if there are no updates to be done
+    if not updates:
+        msg = 'No updates needed for "{0}" section.'
+        tty.msg(msg.format(args.section))
+        return
+
+    proceed = True
+    if not args.yes_to_all:
+        msg = ('The following configuration files are going to be updated to'
+               ' the latest schema format:\n\n')
+        for scope in updates:
+            cfg_file = spack.config.config.get_config_filename(
+                scope.name, args.section
+            )
+            msg += '\t[scope={0}, file={1}]\n'.format(scope.name, cfg_file)
+        msg += ('\nIf the configuration files are updated, versions of Spack '
+                'that are older than this version may not be able to read '
+                'them. Spack stores backups of the updated files which can '
+                'be retrieved with "spack config revert"')
+        tty.msg(msg)
+        proceed = tty.get_yes_or_no('Do you want to proceed?', default=False)
+
+    if not proceed:
+        tty.die('Operation aborted.')
+
+    # Get a function to update the format
+    update_fn = spack.config.ensure_latest_format_fn(args.section)
+    for scope in updates:
+        cfg_file = spack.config.config.get_config_filename(
+            scope.name, args.section
+        )
+        with open(cfg_file) as f:
+            data = syaml.load(f) or {}
+            data = data.pop(args.section, {})
+        update_fn(data)
+
+        # Make a backup copy and rewrite the file
+        bkp_file = cfg_file + '.bkp'
+        shutil.copy(cfg_file, bkp_file)
+        spack.config.config.update_config(
+            args.section, data, scope=scope.name, force=True
+        )
+        msg = 'File "{0}" updated [backup={1}]'
+        tty.msg(msg.format(cfg_file, bkp_file))
+
+
+def _can_revert_update(scope_dir, cfg_file, bkp_file):
+    dir_ok = fs.can_write_to_dir(scope_dir)
+    cfg_ok = not os.path.exists(cfg_file) or fs.can_access(cfg_file)
+    bkp_ok = fs.can_access(bkp_file)
+    return dir_ok and cfg_ok and bkp_ok
+
+
+def config_revert(args):
+    scopes = [args.scope] if args.scope else [
+        x.name for x in spack.config.config.file_scopes
+    ]
+
+    # Search for backup files in the configuration scopes
+    Entry = collections.namedtuple('Entry', ['scope', 'cfg', 'bkp'])
+    to_be_restored, cannot_overwrite = [], []
+    for scope in scopes:
+        cfg_file = spack.config.config.get_config_filename(scope, args.section)
+        bkp_file = cfg_file + '.bkp'
+
+        # If the backup file doesn't exist move to the next scope
+        if not os.path.exists(bkp_file):
+            continue
+
+        # If it exists and we don't have write access in this scope
+        # keep track of it and report a comprehensive error later
+        entry = Entry(scope, cfg_file, bkp_file)
+        scope_dir = os.path.dirname(bkp_file)
+        can_be_reverted = _can_revert_update(scope_dir, cfg_file, bkp_file)
+        if not can_be_reverted:
+            cannot_overwrite.append(entry)
+            continue
+
+        to_be_restored.append(entry)
+
+    # Report errors if we can't revert a configuration
+    if cannot_overwrite:
+        msg = 'Detected permission issues with the following scopes:\n\n'
+        for e in cannot_overwrite:
+            msg += '\t[scope={0.scope}, cfg={0.cfg}, bkp={0.bkp}]\n'.format(e)
+        msg += ('\nEither ensure that you have the right permissions before '
+                'retrying or be more specific on the scope to revert.')
+        tty.die(msg)
+
+    proceed = True
+    if not args.yes_to_all:
+        msg = ('The following scopes will be restored from the corresponding'
+               ' backup files:\n')
+        for entry in to_be_restored:
+            msg += '\t[scope={0.scope}, bkp={0.bkp}]\n'.format(entry)
+        msg += 'This operation cannot be undone.'
+        tty.msg(msg)
+        proceed = tty.get_yes_or_no('Do you want to proceed?', default=False)
+
+    if not proceed:
+        tty.die('Operation aborted.')
+
+    for _, cfg_file, bkp_file in to_be_restored:
+        shutil.copy(bkp_file, cfg_file)
+        os.unlink(bkp_file)
+        msg = 'File "{0}" reverted to old state'
+        tty.msg(msg.format(cfg_file))
+
+
 def config(parser, args):
-    action = {'get': config_get,
-              'blame': config_blame,
-              'edit': config_edit,
-              'list': config_list,
-              'add': config_add,
-              'rm': config_remove,
-              'remove': config_remove}
+    action = {
+        'get': config_get,
+        'blame': config_blame,
+        'edit': config_edit,
+        'list': config_list,
+        'add': config_add,
+        'rm': config_remove,
+        'remove': config_remove,
+        'update': config_update,
+        'revert': config_revert
+    }
     action[args.config_command](args)
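The update/revert pair above follows a simple backup-then-rewrite discipline: copy the file to `<file>.bkp` before rewriting it, and revert by restoring the backup and deleting it. A minimal self-contained sketch of that flow (helper names hypothetical, YAML handling reduced to plain strings):

```python
import os
import shutil
import tempfile

def update_with_backup(cfg_file, transform):
    """Back up cfg_file to cfg_file + '.bkp', then rewrite it in place."""
    bkp_file = cfg_file + '.bkp'
    shutil.copy(cfg_file, bkp_file)      # keep the pre-update state
    with open(cfg_file) as f:
        data = f.read()
    with open(cfg_file, 'w') as f:
        f.write(transform(data))         # write the updated format
    return bkp_file

def revert_from_backup(cfg_file):
    """Restore cfg_file from its backup and remove the backup."""
    bkp_file = cfg_file + '.bkp'
    shutil.copy(bkp_file, cfg_file)
    os.unlink(bkp_file)                  # the revert is one-shot

# Usage: update an old-style file, then revert it
d = tempfile.mkdtemp()
cfg = os.path.join(d, 'packages.yaml')
with open(cfg, 'w') as f:
    f.write('paths: {}\n')
update_with_backup(cfg, lambda s: s.replace('paths', 'externals'))
with open(cfg) as f:
    assert f.read() == 'externals: {}\n'
revert_from_backup(cfg)
with open(cfg) as f:
    assert f.read() == 'paths: {}\n'
```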


@@ -4,6 +4,7 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 
 import os
+import shutil
 import sys
 from collections import namedtuple

@@ -14,6 +15,7 @@
 import spack.config
 import spack.schema.env
+import spack.cmd.common.arguments
 import spack.cmd.install
 import spack.cmd.uninstall
 import spack.cmd.modules

@@ -37,6 +39,8 @@
     ['status', 'st'],
     'loads',
     'view',
+    'update',
+    'revert'
 ]

@@ -394,6 +398,80 @@ def env_loads(args):
     print('   source %s' % loads_file)
 
+def env_update_setup_parser(subparser):
+    """update environments to the latest format"""
+    subparser.add_argument(
+        metavar='env', dest='env',
+        help='name or directory of the environment to update'
+    )
+    spack.cmd.common.arguments.add_common_arguments(subparser, ['yes_to_all'])
+
+
+def env_update(args):
+    manifest_file = ev.manifest_file(args.env)
+    backup_file = manifest_file + ".bkp"
+    needs_update = not ev.is_latest_format(manifest_file)
+
+    if not needs_update:
+        tty.msg('No update needed for the environment "{0}"'.format(args.env))
+        return
+
+    proceed = True
+    if not args.yes_to_all:
+        msg = ('The environment "{0}" is going to be updated to the latest '
+               'schema format.\nIf the environment is updated, versions of '
+               'Spack that are older than this version may not be able to '
+               'read it. Spack stores backups of the updated environment '
+               'which can be retrieved with "spack env revert"')
+        tty.msg(msg.format(args.env))
+        proceed = tty.get_yes_or_no('Do you want to proceed?', default=False)
+
+    if not proceed:
+        tty.die('Operation aborted.')
+
+    ev.update_yaml(manifest_file, backup_file=backup_file)
+    msg = 'Environment "{0}" has been updated [backup={1}]'
+    tty.msg(msg.format(args.env, backup_file))
+
+
+def env_revert_setup_parser(subparser):
+    """restore environments to their state before update"""
+    subparser.add_argument(
+        metavar='env', dest='env',
+        help='name or directory of the environment to revert'
+    )
+    spack.cmd.common.arguments.add_common_arguments(subparser, ['yes_to_all'])
+
+
+def env_revert(args):
+    manifest_file = ev.manifest_file(args.env)
+    backup_file = manifest_file + ".bkp"
+
+    # Check that both the spack.yaml and the backup exist, then inform
+    # the user what is going to happen and ask for confirmation
+    if not os.path.exists(manifest_file):
+        msg = 'cannot find the manifest file of the environment [file={0}]'
+        tty.die(msg.format(manifest_file))
+
+    if not os.path.exists(backup_file):
+        msg = 'cannot find the old manifest file to be restored [file={0}]'
+        tty.die(msg.format(backup_file))
+
+    proceed = True
+    if not args.yes_to_all:
+        msg = ('Spack is going to overwrite the current manifest file'
+               ' with a backup copy [manifest={0}, backup={1}]')
+        tty.msg(msg.format(manifest_file, backup_file))
+        proceed = tty.get_yes_or_no('Do you want to proceed?', default=False)
+
+    if not proceed:
+        tty.die('Operation aborted.')
+
+    shutil.copy(backup_file, manifest_file)
+    os.remove(backup_file)
+    msg = 'Environment "{0}" reverted to old state'
+    tty.msg(msg.format(manifest_file))
+
+
 #: Dictionary mapping subcommand names and aliases to functions
 subcommand_functions = {}


@@ -74,19 +74,28 @@ def _generate_pkg_config(external_pkg_entries):
     This does not generate the entire packages.yaml. For example, given some
     external entries for the CMake package, this could return::
 
-        { 'paths': {
-            'cmake@3.17.1': '/opt/cmake-3.17.1/',
-            'cmake@3.16.5': '/opt/cmake-3.16.5/'
-          }
+        {
+            'externals': [{
+                'spec': 'cmake@3.17.1',
+                'prefix': '/opt/cmake-3.17.1/'
+            }, {
+                'spec': 'cmake@3.16.5',
+                'prefix': '/opt/cmake-3.16.5/'
+            }]
         }
     """
-    paths_dict = syaml.syaml_dict()
+    pkg_dict = syaml.syaml_dict()
+    pkg_dict['externals'] = []
     for e in external_pkg_entries:
         if not _spec_is_valid(e.spec):
             continue
-        paths_dict[str(e.spec)] = e.base_dir
 
-    pkg_dict = syaml.syaml_dict()
-    pkg_dict['paths'] = paths_dict
+        external_items = [('spec', str(e.spec)), ('prefix', e.base_dir)]
+        external_items.extend(e.spec.extra_attributes.items())
+        pkg_dict['externals'].append(
+            syaml.syaml_dict(external_items)
+        )
 
     return pkg_dict

@@ -259,6 +268,14 @@ def _get_external_packages(packages_to_check, system_path_to_exe=None):
             else:
                 resolved_specs[spec] = prefix
 
+            try:
+                spec.validate_detection()
+            except Exception as e:
+                msg = ('"{0}" has been detected on the system but will '
+                       'not be added to packages.yaml [{1}]')
+                tty.warn(msg.format(spec, str(e)))
+                continue
+
             pkg_to_entries[pkg.name].append(
                 ExternalPackageEntry(spec=spec, base_dir=pkg_prefix))
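The docstring change above shows the new shape of `_generate_pkg_config`'s return value. A simplified sketch of that construction (hypothetical function name, without syaml ordered dicts or the extra attributes):

```python
def generate_pkg_config(entries):
    """Build the new-format 'externals' list from (spec, prefix) pairs."""
    return {'externals': [{'spec': spec, 'prefix': prefix}
                          for spec, prefix in entries]}

# Two detected CMake installs become two list entries
cfg = generate_pkg_config([('cmake@3.17.1', '/opt/cmake-3.17.1/'),
                           ('cmake@3.16.5', '/opt/cmake-3.16.5/')])
assert cfg['externals'][0] == {'spec': 'cmake@3.17.1',
                               'prefix': '/opt/cmake-3.17.1/'}
```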


@ -30,6 +30,7 @@
""" """
import collections
import copy import copy
import os import os
import re import re
@ -352,6 +353,7 @@ def __init__(self, *scopes):
self.scopes = OrderedDict() self.scopes = OrderedDict()
for scope in scopes: for scope in scopes:
self.push_scope(scope) self.push_scope(scope)
self.format_updates = collections.defaultdict(list)
def push_scope(self, scope): def push_scope(self, scope):
"""Add a higher precedence scope to the Configuration.""" """Add a higher precedence scope to the Configuration."""
@ -440,7 +442,7 @@ def clear_caches(self):
for scope in self.scopes.values(): for scope in self.scopes.values():
scope.clear() scope.clear()
def update_config(self, section, update_data, scope=None): def update_config(self, section, update_data, scope=None, force=False):
"""Update the configuration file for a particular scope. """Update the configuration file for a particular scope.
Overwrites contents of a section in a scope with update_data, Overwrites contents of a section in a scope with update_data,
@ -449,7 +451,26 @@ def update_config(self, section, update_data, scope=None):
update_data should have the top-level section name stripped off update_data should have the top-level section name stripped off
(it will be re-added). Data itself can be a list, dict, or any (it will be re-added). Data itself can be a list, dict, or any
other yaml-ish structure. other yaml-ish structure.
Configuration scopes that are still written in an old schema
format will fail to update unless ``force`` is True.
Args:
section (str): section of the configuration to be updated
update_data (dict): data to be used for the update
scope (str): scope to be updated
force (str): force the update
""" """
if self.format_updates.get(section) and not force:
msg = ('The "{0}" section of the configuration needs to be written'
' to disk, but is currently using a deprecated format. '
'Please update it using:\n\n'
'\tspack config [--scope=<scope] update {0}\n\n'
'Note that previous versions of Spack will not be able to '
'use the updated configuration.')
msg = msg.format(section)
raise RuntimeError(msg)
_validate_section_name(section) # validate section name _validate_section_name(section) # validate section name
scope = self._validate_scope(scope) # get ConfigScope object scope = self._validate_scope(scope) # get ConfigScope object
@ -514,6 +535,15 @@ def get_config(self, section, scope=None):
if section not in data: if section not in data:
continue continue
# We might be reading configuration files in an old format,
# thus read data and update it in memory if need be.
changed = _update_in_memory(data, section)
if changed:
self.format_updates[section].append(scope)
msg = ('OUTDATED CONFIGURATION FILE '
'[section={0}, scope={1}, dir={2}]')
tty.debug(msg.format(section, scope.name, scope.path))
merged_section = merge_yaml(merged_section, data) merged_section = merge_yaml(merged_section, data)
# no config files -- empty config. # no config files -- empty config.
@ -723,7 +753,7 @@ def get(path, default=None, scope=None):
def set(path, value, scope=None): def set(path, value, scope=None):
"""Convenience function for getting single values in config files. """Convenience function for setting single values in config files.
Accepts the path syntax described in ``get()``. Accepts the path syntax described in ``get()``.
""" """
@ -999,6 +1029,41 @@ def default_list_scope():
return None return None
def _update_in_memory(data, section):
"""Update the format of the configuration data in memory.
This function assumes the section is valid (i.e. validation
is responsibility of the caller)
Args:
data (dict): configuration data
section (str): section of the configuration to update
Returns:
True if the data was changed, False otherwise
"""
update_fn = ensure_latest_format_fn(section)
changed = update_fn(data[section])
return changed
def ensure_latest_format_fn(section):
"""Return a function that takes as input a dictionary read from
a configuration file and update it to the latest format.
The function returns True if there was any update, False otherwise.
Args:
section (str): section of the configuration e.g. "packages",
"config", etc.
"""
# The line below is based on the fact that every module we need
# is already imported at the top level
section_module = getattr(spack.schema, section)
update_fn = getattr(section_module, 'update', lambda x: False)
return update_fn
class ConfigError(SpackError):
    """Superclass for all Spack config related errors."""
@@ -1473,6 +1473,18 @@ def write(self, regenerate_views=True):
            writing if True.
        """
# Intercept environment not using the latest schema format and prevent
# them from being modified
manifest_exists = os.path.exists(self.manifest_path)
if manifest_exists and not is_latest_format(self.manifest_path):
msg = ('The environment "{0}" needs to be written to disk, but '
'is currently using a deprecated format. Please update it '
'using:\n\n'
'\tspack env update {0}\n\n'
'Note that previous versions of Spack will not be able to '
'use the updated configuration.')
raise RuntimeError(msg.format(self.name))
        # ensure path in var/spack/environments
        fs.mkdirp(self.path)
@@ -1723,5 +1735,92 @@ def deactivate_config_scope(env):
    spack.config.config.remove_scope(scope.name)
def manifest_file(env_name_or_dir):
"""Return the absolute path to a manifest file given the environment
name or directory.
Args:
env_name_or_dir (str): either the name of a valid environment
or a directory where a manifest file resides
Raises:
AssertionError: if the environment is not found
"""
env_dir = None
if is_env_dir(env_name_or_dir):
env_dir = os.path.abspath(env_name_or_dir)
elif exists(env_name_or_dir):
env_dir = os.path.abspath(root(env_name_or_dir))
assert env_dir, "environment not found [env={0}]".format(env_name_or_dir)
return os.path.join(env_dir, manifest_name)
def update_yaml(manifest, backup_file):
"""Update a manifest file from an old format to the current one.
Args:
manifest (str): path to a manifest file
backup_file (str): file where to copy the original manifest
Returns:
True if the manifest was updated, False otherwise.
Raises:
AssertionError: in case anything goes wrong during the update
"""
# Check if the environment needs update
with open(manifest) as f:
data = syaml.load(f)
top_level_key = _top_level_key(data)
needs_update = spack.schema.env.update(data[top_level_key])
if not needs_update:
msg = "No update needed [manifest={0}]".format(manifest)
tty.debug(msg)
return False
# Copy environment to a backup file and update it
msg = ('backup file "{0}" already exists on disk. Check its content '
'and remove it before trying to update again.')
assert not os.path.exists(backup_file), msg.format(backup_file)
shutil.copy(manifest, backup_file)
with open(manifest, 'w') as f:
_write_yaml(data, f)
return True
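`update_yaml` refuses to clobber an existing backup and only rewrites the manifest when the schema updater reports a change. The same copy-then-rewrite pattern, reduced to plain text files (function and variable names here are illustrative, not Spack API):

```python
import os
import shutil


def update_with_backup(path, update_fn, backup_path):
    """Rewrite `path` in place, preserving the original at `backup_path`.

    Returns True if the file changed, False if no update was needed.
    """
    with open(path) as f:
        text = f.read()
    new_text = update_fn(text)
    if new_text == text:
        return False  # already up to date, leave the file alone
    # Refuse to overwrite a stale backup from a previous run
    assert not os.path.exists(backup_path), 'backup already exists'
    shutil.copy(path, backup_path)
    with open(path, 'w') as f:
        f.write(new_text)
    return True
```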
def _top_level_key(data):
"""Return the top level key used in this environment
Args:
data (dict): raw yaml data of the environment
Returns:
Either 'spack' or 'env'
"""
    msg = ('cannot find top level attribute "spack" or "env" '
           'in the environment')
assert any(x in data for x in ('spack', 'env')), msg
if 'spack' in data:
return 'spack'
return 'env'
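Both manifest spellings stay readable: `env:` from older releases and the current `spack:`. A self-contained version of the lookup (a sketch mirroring the helper above, not the Spack function itself):

```python
def top_level_key(data):
    # A manifest must be rooted at either 'spack' (current spelling)
    # or 'env' (older releases); prefer 'spack' when both appear.
    assert any(x in data for x in ('spack', 'env')), \
        'cannot find top level attribute "spack" or "env"'
    return 'spack' if 'spack' in data else 'env'
```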
def is_latest_format(manifest):
"""Return True if the manifest file is at the latest schema format,
False otherwise.
Args:
manifest (str): manifest file to be analyzed
"""
with open(manifest) as f:
data = syaml.load(f)
top_level_key = _top_level_key(data)
changed = spack.schema.env.update(data[top_level_key])
return not changed
class SpackEnvironmentError(spack.error.SpackError):
    """Superclass for all errors to do with Spack environments."""
@@ -272,9 +272,9 @@ def _process_external_package(pkg, explicit):
    pre = '{s.name}@{s.version} :'.format(s=pkg.spec)
    spec = pkg.spec
-    if spec.external_module:
+    if spec.external_modules:
        tty.msg('{0} has external module in {1}'
-                .format(pre, spec.external_module))
+                .format(pre, spec.external_modules))
        tty.debug('{0} is actually installed in {1}'
                  .format(pre, spec.external_path))
    else:
@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import stat
from six import string_types
@@ -158,7 +157,7 @@ def spec_externals(spec):
    """Return a list of external specs (w/external directory path filled in),
    one for each known external installation."""
    # break circular import.
-    from spack.util.module_cmd import get_path_from_module  # NOQA: ignore=F401
+    from spack.util.module_cmd import path_from_modules  # NOQA: ignore=F401
    allpkgs = spack.config.get('packages')
    names = set([spec.name])
@@ -167,24 +166,24 @@ def spec_externals(spec):
    external_specs = []
    for name in names:
        pkg_config = allpkgs.get(name, {})
-        pkg_paths = pkg_config.get('paths', {})
-        pkg_modules = pkg_config.get('modules', {})
-        if (not pkg_paths) and (not pkg_modules):
-            continue
-        for external_spec, path in pkg_paths.items():
-            external_spec = spack.spec.Spec(
-                external_spec, external_path=canonicalize_path(path))
-            if external_spec.satisfies(spec):
-                external_specs.append(external_spec)
-        for external_spec, module in pkg_modules.items():
-            external_spec = spack.spec.Spec(
-                external_spec, external_module=module)
+        pkg_externals = pkg_config.get('externals', [])
+        for entry in pkg_externals:
+            spec_str = entry['spec']
+            external_path = entry.get('prefix', None)
+            if external_path:
+                external_path = canonicalize_path(external_path)
+            external_modules = entry.get('modules', None)
+            external_spec = spack.spec.Spec.from_detection(
+                spack.spec.Spec(
+                    spec_str,
+                    external_path=external_path,
+                    external_modules=external_modules
+                ), extra_attributes=entry.get('extra_attributes', {})
+            )
            if external_spec.satisfies(spec):
                external_specs.append(external_spec)
-    # defensively copy returned specs
+    # Defensively copy returned specs
    return [s.copy() for s in external_specs]
@@ -90,11 +90,15 @@ def _deprecated_properties(validator, deprecated, instance, schema):
    is_error = deprecated['error']
    if not is_error:
        for entry in deprecated_properties:
-            llnl.util.tty.warn(msg.format(property=entry))
+            llnl.util.tty.warn(
+                msg.format(property=entry, entry=instance[entry])
+            )
    else:
        import jsonschema
        for entry in deprecated_properties:
-            yield jsonschema.ValidationError(msg.format(property=entry))
+            yield jsonschema.ValidationError(
+                msg.format(property=entry, entry=instance[entry])
+            )
return jsonschema.validators.extend(
    jsonschema.Draft4Validator, {
@@ -8,9 +8,12 @@
.. literalinclude:: _spack_root/lib/spack/spack/schema/env.py
    :lines: 36-
"""
+import warnings
+
from llnl.util.lang import union_dicts
import spack.schema.merged
+import spack.schema.packages
import spack.schema.projections
#: legal first keys in the schema
@@ -133,3 +136,22 @@
    }
}
}
def update(data):
"""Update the data in place to remove deprecated properties.
Args:
data (dict): dictionary to be updated
Returns:
True if data was changed, False otherwise
"""
if 'include' in data:
msg = ("included configuration files should be updated manually"
" [files={0}]")
warnings.warn(msg.format(', '.join(data['include'])))
if 'packages' in data:
return spack.schema.packages.update(data['packages'])
return False
@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for packages.yaml configuration files.
.. literalinclude:: _spack_root/lib/spack/spack/schema/packages.py
@@ -59,10 +58,6 @@
                },
            },
        },
-        'modules': {
-            'type': 'object',
-            'default': {},
-        },
        'providers': {
            'type': 'object',
            'default': {},
@@ -72,17 +67,39 @@
            'type': 'array',
            'default': [],
            'items': {'type': 'string'}, }, }, },
-        'paths': {
-            'type': 'object',
-            'default': {},
-        },
        'variants': {
            'oneOf': [
                {'type': 'string'},
                {'type': 'array',
                 'items': {'type': 'string'}}],
        },
'externals': {
'type': 'array',
'items': {
'type': 'object',
'properties': {
'spec': {'type': 'string'},
'prefix': {'type': 'string'},
'modules': {'type': 'array',
'items': {'type': 'string'}},
'extra_attributes': {'type': 'object'}
},
'additionalProperties': True,
'required': ['spec']
}
},
# Deprecated properties, will trigger an error with a
# message telling how to update.
'paths': {'type': 'object'},
'modules': {'type': 'object'},
},
'deprecatedProperties': {
'properties': ['modules', 'paths'],
'message': 'the attribute "{property}" in the "packages" '
'section of the configuration has been '
'deprecated [entry={entry}]',
'error': False
}
},
},
},
@@ -97,3 +114,33 @@
    'additionalProperties': False,
    'properties': properties,
}
def update(data):
"""Update in-place the data to remove deprecated properties.
Args:
data (dict): dictionary to be updated
Returns:
True if data was changed, False otherwise
"""
changed = False
for cfg_object in data.values():
externals = []
paths = cfg_object.pop('paths', {})
for spec, prefix in paths.items():
externals.append({
'spec': str(spec),
'prefix': str(prefix)
})
modules = cfg_object.pop('modules', {})
for spec, module in modules.items():
externals.append({
'spec': str(spec),
'modules': [str(module)]
})
if externals:
changed = True
cfg_object['externals'] = externals
return changed
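Applied to a v0.15-style `packages.yaml` dictionary, the conversion above folds each `paths`/`modules` mapping into a single `externals` list. A standalone sketch exercising the same transformation on sample data (a re-derivation for illustration, not an import of the Spack function):

```python
def update_packages(data):
    """Move deprecated 'paths'/'modules' entries into 'externals'."""
    changed = False
    for cfg_object in data.values():
        externals = []
        for spec, prefix in cfg_object.pop('paths', {}).items():
            externals.append({'spec': str(spec), 'prefix': str(prefix)})
        for spec, module in cfg_object.pop('modules', {}).items():
            # One old-style module entry becomes a one-element list,
            # since the new format allows multiple modules per spec
            externals.append({'spec': str(spec), 'modules': [str(module)]})
        if externals:
            changed = True
            cfg_object['externals'] = externals
    return changed


old = {
    'cmake': {'paths': {'cmake@3.14.0': '/usr'}},
    'gcc': {'modules': {'gcc@8.3.0': 'gcc-8'}},
}
update_packages(old)  # mutates `old` in place
```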
@@ -959,7 +959,7 @@ class Spec(object):
    def __init__(self, spec_like=None,
                 normal=False, concrete=False, external_path=None,
-                 external_module=None, full_hash=None):
+                 external_modules=None, full_hash=None):
        """Create a new Spec.
        Arguments:
@@ -988,8 +988,6 @@ def __init__(self, spec_like=None,
        self.variants = vt.VariantMap(self)
        self.architecture = None
        self.compiler = None
-        self.external_path = None
-        self.external_module = None
        self.compiler_flags = FlagMap(self)
        self._dependents = DependencyMap()
        self._dependencies = DependencyMap()
@@ -1010,9 +1008,13 @@ def __init__(self, spec_like=None,
        self._normal = normal
        self._concrete = concrete
        self.external_path = external_path
-        self.external_module = external_module
+        self.external_modules = external_modules
        self._full_hash = full_hash
+        # This attribute is used to store custom information for
+        # external specs. None signals that it was not set yet.
+        self.extra_attributes = None
+
        if isinstance(spec_like, six.string_types):
            spec_list = SpecParser(self).parse(spec_like)
            if len(spec_list) > 1:
@@ -1025,7 +1027,7 @@ def __init__(self, spec_like=None,
    @property
    def external(self):
-        return bool(self.external_path) or bool(self.external_module)
+        return bool(self.external_path) or bool(self.external_modules)
    def get_dependency(self, name):
        dep = self._dependencies.get(name)
@@ -1526,7 +1528,8 @@ def to_node_dict(self, hash=ht.dag_hash):
        if self.external:
            d['external'] = syaml.syaml_dict([
                ('path', self.external_path),
-                ('module', self.external_module),
+                ('module', self.external_modules),
+                ('extra_attributes', self.extra_attributes)
            ])
        if not self._concrete:
@@ -1695,21 +1698,21 @@ def from_node_dict(node):
        for name in FlagMap.valid_compiler_flags():
            spec.compiler_flags[name] = []
+        spec.external_path = None
+        spec.external_modules = None
        if 'external' in node:
-            spec.external_path = None
-            spec.external_module = None
            # This conditional is needed because sometimes this function is
            # called with a node already constructed that contains a 'versions'
            # and 'external' field. Related to virtual packages provider
            # indexes.
            if node['external']:
                spec.external_path = node['external']['path']
-                spec.external_module = node['external']['module']
-                if spec.external_module is False:
-                    spec.external_module = None
-            else:
-                spec.external_path = None
-                spec.external_module = None
+                spec.external_modules = node['external']['module']
+                if spec.external_modules is False:
+                    spec.external_modules = None
+                spec.extra_attributes = node['external'].get(
+                    'extra_attributes', syaml.syaml_dict()
+                )
        # specs read in are concrete unless marked abstract
        spec._concrete = node.get('concrete', True)
@@ -1970,6 +1973,44 @@ def from_json(stream):
            tty.debug(e)
            raise sjson.SpackJSONError("error parsing JSON spec:", str(e))
@staticmethod
def from_detection(spec_str, extra_attributes=None):
"""Construct a spec from a spec string determined during external
detection and attach extra attributes to it.
Args:
spec_str (str): spec string
extra_attributes (dict): dictionary containing extra attributes
Returns:
spack.spec.Spec: external spec
"""
s = Spec(spec_str)
extra_attributes = syaml.sorted_dict(extra_attributes or {})
# This is needed to be able to validate multi-valued variants,
# otherwise they'll still be abstract in the context of detection.
vt.substitute_abstract_variants(s)
s.extra_attributes = extra_attributes
return s
def validate_detection(self):
"""Validate the detection of an external spec.
This method is used as part of Spack's detection protocol, and is
not meant for client code use.
"""
# Assert that _extra_attributes is a Mapping and not None,
# which likely means the spec was created with Spec.from_detection
msg = ('cannot validate "{0}" since it was not created '
'using Spec.from_detection'.format(self))
assert isinstance(self.extra_attributes, collections.Mapping), msg
# Validate the spec calling a package specific method
validate_fn = getattr(
self.package, 'validate_detected_spec', lambda x, y: None
)
validate_fn(self, self.extra_attributes)
    def _concretize_helper(self, concretizer, presets=None, visited=None):
        """Recursive helper function for concretize().
        This concretizes everything bottom-up. As things are
@@ -2115,8 +2156,8 @@ def feq(cfield, sfield):
                feq(replacement.variants, spec.variants) and
                feq(replacement.external_path,
                    spec.external_path) and
-                feq(replacement.external_module,
-                    spec.external_module)):
+                feq(replacement.external_modules,
+                    spec.external_modules)):
                continue
            # Refine this spec to the candidate. This uses
            # replace_with AND dup so that it can work in
@@ -2250,7 +2291,7 @@ def concretize(self, tests=False):
            t[-1] for t in ordered_hashes)
        for s in self.traverse():
-            if s.external_module and not s.external_path:
+            if s.external_modules and not s.external_path:
                compiler = spack.compilers.compiler_for_spec(
                    s.compiler, s.architecture)
                for mod in compiler.modules:
@@ -2259,8 +2300,8 @@ def concretize(self, tests=False):
                # get the path from the module
                # the package can override the default
                s.external_path = getattr(s.package, 'external_prefix',
-                                          md.get_path_from_module(
-                                              s.external_module))
+                                          md.path_from_modules(
+                                              s.external_modules))
        # Mark everything in the spec as concrete, as well.
        self._mark_concrete()
@@ -3046,7 +3087,7 @@ def _dup(self, other, deps=True, cleardeps=True, caches=None):
                self._normal != other._normal and
                self.concrete != other.concrete and
                self.external_path != other.external_path and
-                self.external_module != other.external_module and
+                self.external_modules != other.external_modules and
                self.compiler_flags != other.compiler_flags)
        self._package = None
@@ -3074,7 +3115,8 @@ def _dup(self, other, deps=True, cleardeps=True, caches=None):
        self.variants.spec = self
        self.external_path = other.external_path
-        self.external_module = other.external_module
+        self.external_modules = other.external_modules
+        self.extra_attributes = other.extra_attributes
        self.namespace = other.namespace
        # Cached fields are results of expensive operations.
@@ -527,14 +527,10 @@ def test_ci_generate_with_external_pkg(tmpdir, mutable_mock_env_path,
    ci_cmd('generate', '--output-file', outputfile)
    with open(outputfile) as f:
-        contents = f.read()
-    print('generated contents: ')
-    print(contents)
-    yaml_contents = syaml.load(contents)
-    for ci_key in yaml_contents.keys():
-        if 'externaltool' in ci_key:
-            print('Erroneously staged "externaltool" pkg')
-            assert(False)
+        yaml_contents = syaml.load(f)
+
+    # Check that the "externaltool" package was not erroneously staged
+    assert not any('externaltool' in key for key in yaml_contents)
def test_ci_generate_debug_with_custom_spack(tmpdir, mutable_mock_env_path,
@@ -2,17 +2,40 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import pytest
import os
-from llnl.util.filesystem import mkdirp
+import pytest
+
+import llnl.util.filesystem as fs
import spack.config
import spack.environment as ev
-from spack.main import SpackCommand
+import spack.main
+import spack.util.spack_yaml as syaml
-config = SpackCommand('config')
-env = SpackCommand('env')
+config = spack.main.SpackCommand('config')
+env = spack.main.SpackCommand('env')
@pytest.fixture()
def packages_yaml_v015(mutable_config):
"""Create a packages.yaml in the old format"""
def _create(scope=None):
old_data = {
'packages': {
'cmake': {
'paths': {'cmake@3.14.0': '/usr'}
},
'gcc': {
'modules': {'gcc@8.3.0': 'gcc-8'}
}
}
}
scope = scope or spack.config.default_modify_scope()
cfg_file = spack.config.config.get_config_filename(scope, 'packages')
with open(cfg_file, 'w') as f:
syaml.dump(old_data, stream=f)
return cfg_file
return _create
def test_get_config_scope(mock_low_high_config):
@@ -23,8 +46,8 @@ def test_get_config_scope_merged(mock_low_high_config):
    low_path = mock_low_high_config.scopes['low'].path
    high_path = mock_low_high_config.scopes['high'].path
-    mkdirp(low_path)
-    mkdirp(high_path)
+    fs.mkdirp(low_path)
+    fs.mkdirp(high_path)
    with open(os.path.join(low_path, 'repos.yaml'), 'w') as f:
        f.write('''\
@@ -403,3 +426,104 @@ def test_config_remove_from_env(mutable_empty_config, mutable_mock_env_path):
    """
    assert output == expected
def test_config_update_packages(packages_yaml_v015):
"""Test Spack updating old packages.yaml format for externals
to new format. Ensure that data is preserved and converted
properly.
"""
packages_yaml_v015()
config('update', '-y', 'packages')
# Check the entries have been transformed
data = spack.config.get('packages')
check_update(data)
def test_config_update_not_needed(mutable_config):
data_before = spack.config.get('repos')
config('update', '-y', 'repos')
data_after = spack.config.get('repos')
assert data_before == data_after
def test_config_update_fail_on_permission_issue(
packages_yaml_v015, monkeypatch
):
# The first time it will update and create the backup file
packages_yaml_v015()
# Mock a global scope where we cannot write
monkeypatch.setattr(
spack.cmd.config, '_can_update_config_file', lambda x, y: False
)
with pytest.raises(spack.main.SpackCommandError):
config('update', '-y', 'packages')
def test_config_revert(packages_yaml_v015):
cfg_file = packages_yaml_v015()
bkp_file = cfg_file + '.bkp'
config('update', '-y', 'packages')
# Check that the backup file exists, compute its md5 sum
assert os.path.exists(bkp_file)
md5bkp = fs.md5sum(bkp_file)
config('revert', '-y', 'packages')
# Check that the backup file does not exist anymore and
# that the md5 sum of the configuration file is the same
# as that of the old backup file
assert not os.path.exists(bkp_file)
assert md5bkp == fs.md5sum(cfg_file)
def test_config_revert_raise_if_cant_write(packages_yaml_v015, monkeypatch):
packages_yaml_v015()
config('update', '-y', 'packages')
# Mock a global scope where we cannot write
monkeypatch.setattr(
spack.cmd.config, '_can_revert_update', lambda x, y, z: False
)
    # The command raises with a helpful error if a configuration
    # file is to be deleted and we don't have sufficient permissions
with pytest.raises(spack.main.SpackCommandError):
config('revert', '-y', 'packages')
def test_updating_config_implicitly_raises(packages_yaml_v015):
# Trying to write implicitly to a scope with a configuration file
# in the old format raises an exception
packages_yaml_v015()
with pytest.raises(RuntimeError):
config('add', 'packages:cmake:buildable:false')
def test_updating_multiple_scopes_at_once(packages_yaml_v015):
# Create 2 config files in the old format
packages_yaml_v015(scope='user')
packages_yaml_v015(scope='site')
# Update both of them at once
config('update', '-y', 'packages')
for scope in ('user', 'site'):
data = spack.config.get('packages', scope=scope)
check_update(data)
def check_update(data):
"""Check that the data from the packages_yaml_v015
has been updated.
"""
assert 'externals' in data['cmake']
externals = data['cmake']['externals']
assert {'spec': 'cmake@3.14.0', 'prefix': '/usr'} in externals
assert 'paths' not in data['cmake']
assert 'externals' in data['gcc']
externals = data['gcc']['externals']
assert {'spec': 'gcc@8.3.0', 'modules': ['gcc-8']} in externals
assert 'modules' not in data['gcc']
@@ -448,8 +448,9 @@ def test_env_view_external_prefix(tmpdir_factory, mutable_database,
    external_config = StringIO("""\
packages:
  a:
-    paths:
-      a: {a_prefix}
+    externals:
+    - spec: a
+      prefix: {a_prefix}
    buildable: false
""".format(a_prefix=str(fake_prefix)))
    external_config_dict = spack.util.spack_yaml.load_config(external_config)
@@ -2041,3 +2042,73 @@ def test_env_write_only_non_default():
        yaml = f.read()
    assert yaml == ev.default_manifest_yaml
@pytest.fixture
def packages_yaml_v015(tmpdir):
"""Return the path to an existing manifest in the v0.15.x format
and the path to a non yet existing backup file.
"""
raw_yaml = """
spack:
specs:
- mpich
packages:
cmake:
paths:
cmake@3.17.3: /usr
"""
manifest = tmpdir.ensure('spack.yaml')
backup_file = tmpdir.join('spack.yaml.bkp')
manifest.write(raw_yaml)
return manifest, backup_file
def test_update_anonymous_env(packages_yaml_v015):
manifest, backup_file = packages_yaml_v015
env('update', '-y', str(manifest.dirname))
# The environment is now at the latest format
assert ev.is_latest_format(str(manifest))
# A backup file has been created and it's not at the latest format
assert os.path.exists(str(backup_file))
assert not ev.is_latest_format(str(backup_file))
def test_double_update(packages_yaml_v015):
manifest, backup_file = packages_yaml_v015
# Update the environment
env('update', '-y', str(manifest.dirname))
# Try to read the environment (it should not error)
ev.create('test', str(manifest))
# Updating again does nothing since the manifest is up-to-date
env('update', '-y', str(manifest.dirname))
# The environment is at the latest format
assert ev.is_latest_format(str(manifest))
# A backup file has been created and it's not at the latest format
assert os.path.exists(str(backup_file))
assert not ev.is_latest_format(str(backup_file))
def test_update_and_revert(packages_yaml_v015):
manifest, backup_file = packages_yaml_v015
# Update the environment
env('update', '-y', str(manifest.dirname))
assert os.path.exists(str(backup_file))
assert not ev.is_latest_format(str(backup_file))
assert ev.is_latest_format(str(manifest))
# Revert to previous state
env('revert', '-y', str(manifest.dirname))
assert not os.path.exists(str(backup_file))
assert not ev.is_latest_format(str(manifest))
def test_old_format_cant_be_updated_implicitly(packages_yaml_v015):
manifest, backup_file = packages_yaml_v015
env('activate', str(manifest.dirname))
with pytest.raises(spack.main.SpackCommandError):
add('hdf5')
@@ -70,21 +70,20 @@ def test_find_external_two_instances_same_package(create_exe):
def test_find_external_update_config(mutable_config):
-    pkg_to_entries = {
-        'cmake': [
-            ExternalPackageEntry(Spec('cmake@1.foo'), '/x/y1/'),
-            ExternalPackageEntry(Spec('cmake@3.17.2'), '/x/y2/'),
-        ]
-    }
+    entries = [
+        ExternalPackageEntry(Spec.from_detection('cmake@1.foo'), '/x/y1/'),
+        ExternalPackageEntry(Spec.from_detection('cmake@3.17.2'), '/x/y2/'),
+    ]
+    pkg_to_entries = {'cmake': entries}
    spack.cmd.external._update_pkg_config(pkg_to_entries, False)
    pkgs_cfg = spack.config.get('packages')
    cmake_cfg = pkgs_cfg['cmake']
-    cmake_paths_cfg = cmake_cfg['paths']
-    assert cmake_paths_cfg['cmake@1.foo'] == '/x/y1/'
-    assert cmake_paths_cfg['cmake@3.17.2'] == '/x/y2/'
+    cmake_externals = cmake_cfg['externals']
+    assert {'spec': 'cmake@1.foo', 'prefix': '/x/y1/'} in cmake_externals
+    assert {'spec': 'cmake@3.17.2', 'prefix': '/x/y2/'} in cmake_externals
def test_get_executables(working_env, create_exe):
@@ -103,15 +102,16 @@ def test_find_external_cmd(mutable_config, working_env, create_exe):
    which restricts the set of packages that Spack looks for.
    """
    cmake_path1 = create_exe("cmake", "cmake version 1.foo")
+    prefix = os.path.dirname(os.path.dirname(cmake_path1))
    os.environ['PATH'] = ':'.join([os.path.dirname(cmake_path1)])
    external('find', 'cmake')
    pkgs_cfg = spack.config.get('packages')
    cmake_cfg = pkgs_cfg['cmake']
-    cmake_paths_cfg = cmake_cfg['paths']
-    assert 'cmake@1.foo' in cmake_paths_cfg
+    cmake_externals = cmake_cfg['externals']
+    assert {'spec': 'cmake@1.foo', 'prefix': prefix} in cmake_externals
def test_find_external_cmd_not_buildable(
@@ -134,16 +134,18 @@ def test_find_external_cmd_full_repo(
    """
    exe_path1 = create_exe(
-        "find-externals1-exe", "find-externals1 version 1.foo")
+        "find-externals1-exe", "find-externals1 version 1.foo"
+    )
+    prefix = os.path.dirname(os.path.dirname(exe_path1))
    os.environ['PATH'] = ':'.join([os.path.dirname(exe_path1)])
    external('find')
    pkgs_cfg = spack.config.get('packages')
    pkg_cfg = pkgs_cfg['find-externals1']
-    pkg_paths_cfg = pkg_cfg['paths']
-    assert 'find-externals1@1.foo' in pkg_paths_cfg
+    pkg_externals = pkg_cfg['externals']
+    assert {'spec': 'find-externals1@1.foo', 'prefix': prefix} in pkg_externals
def test_find_external_merge(mutable_config, mutable_mock_repo):
@@ -152,26 +154,31 @@ def test_find_external_merge(mutable_config, mutable_mock_repo):
    """
    pkgs_cfg_init = {
        'find-externals1': {
-            'paths': {
-                'find-externals1@1.1': '/preexisting-prefix/'
-            },
+            'externals': [{
+                'spec': 'find-externals1@1.1',
+                'prefix': '/preexisting-prefix/'
+            }],
            'buildable': False
        }
    }
    mutable_config.update_config('packages', pkgs_cfg_init)
-    pkg_to_entries = {
-        'find-externals1': [
-            ExternalPackageEntry(Spec('find-externals1@1.1'), '/x/y1/'),
-            ExternalPackageEntry(Spec('find-externals1@1.2'), '/x/y2/'),
-        ]
-    }
+    entries = [
+        ExternalPackageEntry(
+            Spec.from_detection('find-externals1@1.1'), '/x/y1/'
+        ),
+        ExternalPackageEntry(
+            Spec.from_detection('find-externals1@1.2'), '/x/y2/'
+        )
+    ]
+    pkg_to_entries = {'find-externals1': entries}
    spack.cmd.external._update_pkg_config(pkg_to_entries, False)
    pkgs_cfg = spack.config.get('packages')
    pkg_cfg = pkgs_cfg['find-externals1']
-    pkg_paths_cfg = pkg_cfg['paths']
-    assert pkg_paths_cfg['find-externals1@1.1'] == '/preexisting-prefix/'
-    assert pkg_paths_cfg['find-externals1@1.2'] == '/x/y2/'
+    pkg_externals = pkg_cfg['externals']
+    assert {'spec': 'find-externals1@1.1',
+            'prefix': '/preexisting-prefix/'} in pkg_externals
+    assert {'spec': 'find-externals1@1.2',
+            'prefix': '/x/y2/'} in pkg_externals

View File

@@ -373,7 +373,7 @@ def test_external_package_module(self):
         spec = Spec('externalmodule')
         spec.concretize()
-        assert spec['externalmodule'].external_module == 'external-module'
+        assert spec['externalmodule'].external_modules == ['external-module']
         assert 'externalprereq' not in spec
         assert spec['externalmodule'].compiler.satisfies('gcc')

View File

@@ -198,8 +198,9 @@ def test_external_mpi(self):
       mpi: [mpich]
     mpich:
       buildable: false
-      paths:
-        mpich@3.0.4: /dummy/path
+      externals:
+      - spec: mpich@3.0.4
+        prefix: /dummy/path
 """)
         spack.config.set('packages', conf, scope='concretize')
@@ -229,8 +230,9 @@ def mock_module(cmd, module):
       mpi: [mpich]
     mpi:
       buildable: false
-      modules:
-        mpich@3.0.4: dummy
+      externals:
+      - spec: mpich@3.0.4
+        modules: [dummy]
 """)
         spack.config.set('packages', conf, scope='concretize')

View File

@@ -4,15 +4,21 @@ packages:
   mpi: [openmpi, mpich]
   externaltool:
     buildable: False
-    paths:
-      externaltool@1.0%gcc@4.5.0: /path/to/external_tool
-      externaltool@0.9%gcc@4.5.0: /usr
+    externals:
+    - spec: externaltool@1.0%gcc@4.5.0
+      prefix: /path/to/external_tool
+    - spec: externaltool@0.9%gcc@4.5.0
+      prefix: /usr
   externalvirtual:
     buildable: False
-    paths:
-      externalvirtual@2.0%clang@3.3: /path/to/external_virtual_clang
-      externalvirtual@1.0%gcc@4.5.0: /path/to/external_virtual_gcc
+    externals:
+    - spec: externalvirtual@2.0%clang@3.3
+      prefix: /path/to/external_virtual_clang
+    - spec: externalvirtual@1.0%gcc@4.5.0
+      prefix: /path/to/external_virtual_gcc
   externalmodule:
     buildable: False
-    modules:
-      externalmodule@1.0%gcc@4.5.0: external-module
+    externals:
+    - spec: externalmodule@1.0%gcc@4.5.0
+      modules:
+      - external-module

View File

@@ -689,17 +689,17 @@ def test_115_reindex_with_packages_not_in_repo(mutable_database):
 def test_external_entries_in_db(mutable_database):
     rec = mutable_database.get_record('mpileaks ^zmpi')
     assert rec.spec.external_path is None
-    assert rec.spec.external_module is None
+    assert not rec.spec.external_modules

     rec = mutable_database.get_record('externaltool')
     assert rec.spec.external_path == '/path/to/external_tool'
-    assert rec.spec.external_module is None
+    assert not rec.spec.external_modules
     assert rec.explicit is False

     rec.spec.package.do_install(fake=True, explicit=True)
     rec = mutable_database.get_record('externaltool')
     assert rec.spec.external_path == '/path/to/external_tool'
-    assert rec.spec.external_module is None
+    assert not rec.spec.external_modules
     assert rec.explicit is True
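
The assertions above switch from `is None` to `not ...` because the attribute is now a list: a truthiness test covers both `None` (never set) and `[]` (set but empty), which an identity check against `None` would miss. A minimal illustration:

```python
# 'external_modules' may now be None (never set) or [] (set but empty);
# both mean "no external modules" and both are falsy.
for no_modules in (None, []):
    assert not no_modules

# An 'is None' check would miss the empty-list representation:
empty = []
assert empty is not None
assert not empty
```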

View File

@@ -157,11 +157,11 @@ def test_process_external_package_module(install_mockery, monkeypatch, capfd):
     monkeypatch.setattr(spack.database.Database, 'get_record', _none)

     spec.external_path = '/actual/external/path/not/checked'
-    spec.external_module = 'unchecked_module'
+    spec.external_modules = ['unchecked_module']
     inst._process_external_package(spec.package, False)

     out = capfd.readouterr()[0]
-    assert 'has external module in {0}'.format(spec.external_module) in out
+    assert 'has external module in {0}'.format(spec.external_modules) in out


 def test_process_binary_cache_tarball_none(install_mockery, monkeypatch,
@@ -257,15 +257,15 @@ def test_installer_ensure_ready_errors(install_mockery):
     fmt = r'cannot be installed locally.*{0}'

     # Force an external package error
-    path, module = spec.external_path, spec.external_module
+    path, modules = spec.external_path, spec.external_modules
     spec.external_path = '/actual/external/path/not/checked'
-    spec.external_module = 'unchecked_module'
+    spec.external_modules = ['unchecked_module']
     msg = fmt.format('is external')
     with pytest.raises(inst.ExternalPackageError, match=msg):
         installer._ensure_install_ready(spec.package)

     # Force an upstream package error
-    spec.external_path, spec.external_module = path, module
+    spec.external_path, spec.external_modules = path, modules
     spec.package._installed_upstream = True
     msg = fmt.format('is upstream')
     with pytest.raises(inst.UpstreamPackageError, match=msg):

View File

@@ -9,7 +9,7 @@
 from spack.util.module_cmd import (
     module,
-    get_path_from_module,
+    path_from_modules,
     get_path_args_from_module_line,
     get_path_from_module_contents
 )
@@ -55,7 +55,7 @@ def fake_module(*args):
         return line
     monkeypatch.setattr(spack.util.module_cmd, 'module', fake_module)

-    path = get_path_from_module('mod')
+    path = path_from_modules(['mod'])

     assert path == '/path/to'
@@ -116,10 +116,10 @@ def test_get_argument_from_module_line():
     bad_lines = ['prepend_path(PATH,/lib/path)',
                  'prepend-path (LD_LIBRARY_PATH) /lib/path']

-    assert all(get_path_args_from_module_line(l) == ['/lib/path']
-               for l in simple_lines)
-    assert all(get_path_args_from_module_line(l) == ['/lib/path', '/pkg/path']
-               for l in complex_lines)
+    assert all(get_path_args_from_module_line(x) == ['/lib/path']
+               for x in simple_lines)
+    assert all(get_path_args_from_module_line(x) == ['/lib/path', '/pkg/path']
+               for x in complex_lines)
     for bl in bad_lines:
         with pytest.raises(ValueError):
             get_path_args_from_module_line(bl)

View File

@@ -135,18 +135,32 @@ def get_path_args_from_module_line(line):
     return paths


-def get_path_from_module(mod):
-    """Inspects a TCL module for entries that indicate the absolute path
-    at which the library supported by said module can be found.
-    """
-    # Read the module
-    text = module('show', mod).split('\n')
-
-    p = get_path_from_module_contents(text, mod)
-    if p and not os.path.exists(p):
-        tty.warn("Extracted path from module does not exist:"
-                 "\n\tExtracted path: " + p)
-    return p
+def path_from_modules(modules):
+    """Inspect a list of TCL modules for entries that indicate the absolute
+    path at which the library supported by said module can be found.
+
+    Args:
+        modules (list): module files to be loaded to get an external package
+
+    Returns:
+        Guess of the prefix path where the package is installed
+    """
+    best_choice = None
+    for module_name in modules:
+        # Read the current module and return a candidate path
+        text = module('show', module_name).split('\n')
+        candidate_path = get_path_from_module_contents(text, module_name)
+        if candidate_path and not os.path.exists(candidate_path):
+            msg = ("Extracted path from module does not exist "
+                   "[module={0}, path={1}]")
+            tty.warn(msg.format(module_name, candidate_path))
+
+        # If anything is found, then it's the best choice. This means
+        # that we give preference to the last module to be loaded
+        # for packages requiring to load multiple modules in sequence
+        best_choice = candidate_path or best_choice
+    return best_choice


 def get_path_from_module_contents(text, module_name):
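
The loop in `path_from_modules` deliberately lets the last module that yields a path win, so packages that load several modules in sequence resolve to the final one. Stripped of the module inspection itself, the selection logic reduces to this sketch (the helper name `best_prefix` is illustrative only):

```python
def best_prefix(candidate_paths):
    """Pick the last truthy candidate, mirroring the 'last module
    loaded wins' preference used by path_from_modules."""
    best_choice = None
    for candidate in candidate_paths:
        # A later non-empty candidate overrides any earlier choice
        best_choice = candidate or best_choice
    return best_choice


assert best_prefix(['/opt/a', None, '/opt/b']) == '/opt/b'
assert best_prefix([None, '/opt/x', None]) == '/opt/x'
assert best_prefix([]) is None
```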

View File

@@ -13,7 +13,7 @@
 """
 import ctypes
+import collections

 from ordereddict_backport import OrderedDict
 from six import string_types, StringIO
@@ -332,6 +332,22 @@ def dump_annotated(data, stream=None, *args, **kwargs):
     return getvalue()


+def sorted_dict(dict_like):
+    """Return an ordered dict with all the fields sorted recursively.
+
+    Args:
+        dict_like (dict): dictionary to be sorted
+
+    Returns:
+        dictionary sorted recursively
+    """
+    result = syaml_dict(sorted(dict_like.items()))
+    for key, value in result.items():
+        if isinstance(value, collections.Mapping):
+            result[key] = sorted_dict(value)
+    return result
+
+
 class SpackYAMLError(spack.error.SpackError):
     """Raised when there are issues with YAML parsing."""
     def __init__(self, msg, yaml_error):
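
A plain-stdlib stand-in shows what `sorted_dict` does: keys are sorted at every nesting level, so equivalent configs always serialize identically. (This sketch substitutes `OrderedDict` for Spack's `syaml_dict` and uses `collections.abc.Mapping`, the Python 3 spelling of the ABC.)

```python
import collections.abc
from collections import OrderedDict


def sorted_dict(dict_like):
    """Return an ordered dict with all fields sorted recursively."""
    result = OrderedDict(sorted(dict_like.items()))
    for key, value in result.items():
        # Recurse into nested mappings so every level is sorted
        if isinstance(value, collections.abc.Mapping):
            result[key] = sorted_dict(value)
    return result


cfg = {'zlib': {'buildable': False},
       'cmake': {'externals': [], 'buildable': True}}
assert list(sorted_dict(cfg)) == ['cmake', 'zlib']
assert list(sorted_dict(cfg)['cmake']) == ['buildable', 'externals']
```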

View File

@@ -1,26 +1,32 @@
 packages:
   cmake:
     buildable: False
-    paths:
-      cmake@3.12.4: /usr
+    externals:
+    - spec: cmake@3.12.4
+      prefix: /usr
   r:
     buildable: False
-    paths:
-      r@3.4.4: /usr
+    externals:
+    - spec: r@3.4.4
+      prefix: /usr
   perl:
     buildable: False
-    paths:
-      perl@5.26.1: /usr
+    externals:
+    - spec: perl@5.26.1
+      prefix: /usr
   findutils:
     buildable: False
-    paths:
-      findutils@4.6.0: /usr
+    externals:
+    - spec: findutils@4.6.0
+      prefix: /usr
   openssl:
     buildable: False
-    paths:
-      openssl@1.1.1: /usr
+    externals:
+    - spec: openssl@1.1.1
+      prefix: /usr
   libpciaccess:
     buildable: False
-    paths:
-      libpciaccess@0.13.5: /usr
+    externals:
+    - spec: libpciaccess@0.13.5
+      prefix: /usr

View File

@@ -570,7 +570,7 @@ _spack_config() {
     then
         SPACK_COMPREPLY="-h --help --scope"
     else
-        SPACK_COMPREPLY="get blame edit list add remove rm"
+        SPACK_COMPREPLY="get blame edit list add remove rm update revert"
     fi
 }
@@ -632,6 +632,24 @@ _spack_config_rm() {
     fi
 }

+_spack_config_update() {
+    if $list_options
+    then
+        SPACK_COMPREPLY="-h --help -y --yes-to-all"
+    else
+        _config_sections
+    fi
+}
+
+_spack_config_revert() {
+    if $list_options
+    then
+        SPACK_COMPREPLY="-h --help -y --yes-to-all"
+    else
+        _config_sections
+    fi
+}
+
 _spack_containerize() {
     SPACK_COMPREPLY="-h --help"
 }
@@ -725,7 +743,7 @@ _spack_env() {
     then
         SPACK_COMPREPLY="-h --help"
     else
-        SPACK_COMPREPLY="activate deactivate create remove rm list ls status st loads view"
+        SPACK_COMPREPLY="activate deactivate create remove rm list ls status st loads view update revert"
     fi
 }
@@ -803,6 +821,24 @@ _spack_env_view() {
     fi
 }

+_spack_env_update() {
+    if $list_options
+    then
+        SPACK_COMPREPLY="-h --help -y --yes-to-all"
+    else
+        _environments
+    fi
+}
+
+_spack_env_revert() {
+    if $list_options
+    then
+        SPACK_COMPREPLY="-h --help -y --yes-to-all"
+    else
+        _environments
+    fi
+}
+
 _spack_extensions() {
     if $list_options
     then

View File

@@ -2,12 +2,11 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
-from spack import *
-
 import os
 import re

+import spack.package


 class FindExternals1(AutotoolsPackage):
     executables = ['find-externals1-exe']
@@ -31,4 +30,6 @@ def determine_spec_details(cls, prefix, exes_in_prefix):
         match = re.search(r'find-externals1.*version\s+(\S+)', output)
         if match:
             version_str = match.group(1)
-            return Spec('find-externals1@{0}'.format(version_str))
+            return Spec.from_detection(
+                'find-externals1@{0}'.format(version_str)
+            )

View File

@@ -187,7 +187,7 @@ def setup_run_environment(self, env):
         # their run environments the code to make the compilers available.
         # For Cray MPIs, the regular compiler wrappers *are* the MPI wrappers.
         # Cray MPIs always have cray in the module name, e.g. "cray-mpich"
-        if self.spec.external_module and 'cray' in self.spec.external_module:
+        if self.spec.external_modules and 'cray' in self.spec.external_modules:
             env.set('MPICC', spack_cc)
             env.set('MPICXX', spack_cxx)
             env.set('MPIF77', spack_fc)
@@ -210,7 +210,7 @@ def setup_dependent_build_environment(self, env, dependent_spec):
     def setup_dependent_package(self, module, dependent_spec):
         # For Cray MPIs, the regular compiler wrappers *are* the MPI wrappers.
         # Cray MPIs always have cray in the module name, e.g. "cray-mpich"
-        if self.spec.external_module and 'cray' in self.spec.external_module:
+        if self.spec.external_modules and 'cray' in self.spec.external_modules:
             self.spec.mpicc = spack_cc
             self.spec.mpicxx = spack_cxx
             self.spec.mpifc = spack_fc
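
One subtlety worth noting about the hunks above: against the old scalar attribute, `'cray' in self.spec.external_module` was a substring search inside the module name, while against the new list it is an exact-membership test, so a module named `cray-mpich` only matches if the check iterates over the names themselves. The Python semantics in isolation:

```python
module_name = 'cray-mpich'

# Old scalar attribute: 'in' is a substring search on the string
assert 'cray' in module_name

# New list attribute: 'in' is exact element membership on the list
external_modules = [module_name]
assert 'cray' not in external_modules

# A substring test over each element restores the old behaviour
assert any('cray' in m for m in external_modules)
```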

View File

@@ -235,7 +235,7 @@ def setup_dependent_build_environment(self, env, dependent_spec):
     def setup_compiler_environment(self, env):
         # For Cray MPIs, the regular compiler wrappers *are* the MPI wrappers.
         # Cray MPIs always have cray in the module name, e.g. "cray-mvapich"
-        if self.spec.external_module and 'cray' in self.spec.external_module:
+        if self.spec.external_modules and 'cray' in self.spec.external_modules:
             env.set('MPICC', spack_cc)
             env.set('MPICXX', spack_cxx)
             env.set('MPIF77', spack_fc)
@@ -249,7 +249,7 @@ def setup_compiler_environment(self, env):
     def setup_dependent_package(self, module, dependent_spec):
         # For Cray MPIs, the regular compiler wrappers *are* the MPI wrappers.
        # Cray MPIs always have cray in the module name, e.g. "cray-mvapich"
-        if self.spec.external_module and 'cray' in self.spec.external_module:
+        if self.spec.external_modules and 'cray' in self.spec.external_modules:
             self.spec.mpicc = spack_cc
             self.spec.mpicxx = spack_cxx
             self.spec.mpifc = spack_fc