Create, install and relocate tarballs of installed packages

Adds the "buildcache" command to spack. The buildcache command is
used to create gpg signatures for archives of installed spack
packages; the signatures and archives are placed together in a
directory that can be added to a spack mirror. A user can retrieve
the archives from a mirror and verify their integrity using the
buildcache command. It is often the case that the user's Spack
instance is located in a different path compared to the Spack
instance used to generate the package archive and signature, so
this includes logic to relocate the RPATHs generated by Spack.
Patrick Gartung 2017-08-14 16:32:27 -05:00 committed by scheibelp
parent 5184edb15f
commit ab56c742ca
7 changed files with 1467 additions and 0 deletions


@ -0,0 +1,108 @@
.. _binary_caches:
Build caches
============
Some sites may encourage users to set up their own test environments
before carrying out central installations, and some users prefer to set
up these environments on their own initiative. To avoid recompiling
otherwise identical package specs in different installations,
installed packages can be put into build cache tarballs, uploaded to
your spack mirror, and then downloaded and installed by others.
Creating build cache files
--------------------------
A compressed tarball of an installed package is created, along with tarballs
for all of its link and run dependencies. Each compressed tarball is signed
with gpg, and the signature and tarball are bundled together in a ".spack"
file. Optionally, the rpaths (and, on macOS, the install names and dependency
paths) can be changed to paths relative to the spack install tree before the
tarball is created.
Build caches are created via:
.. code-block:: sh
$ spack buildcache create
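Conceptually, creating a cache entry amounts to archiving the install prefix and recording a SHA-256 checksum of the archive, which is later used to verify the download. A minimal, dependency-free sketch of that step (the file names and directory layout here are illustrative, not Spack's exact conventions):

```python
import hashlib
import os
import tarfile
import tempfile

def checksum_tarball(path, blocksize=65536):
    # Stream the file through sha256 in fixed-size blocks, so even
    # large tarballs are hashed without loading them into memory.
    hasher = hashlib.sha256()
    with open(path, 'rb') as f:
        for block in iter(lambda: f.read(blocksize), b''):
            hasher.update(block)
    return hasher.hexdigest()

# Throwaway directory standing in for an install prefix.
prefix = tempfile.mkdtemp()
with open(os.path.join(prefix, 'hello.txt'), 'w') as f:
    f.write('hello\n')

# Compress the prefix into a .tar.gz, keeping only its basename
# in the archive so it can be unpacked under a different root.
tarball = os.path.join(tempfile.mkdtemp(), 'pkg.tar.gz')
with tarfile.open(tarball, 'w:gz') as tar:
    tar.add(prefix, arcname=os.path.basename(prefix))

print(checksum_tarball(tarball))  # 64 hex characters
```

In Spack's implementation the digest is recorded in the package's spec.yaml, and the tarball plus the (optionally gpg-signed) spec file are then bundled into a single ".spack" archive.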
Finding or installing build cache files
---------------------------------------
To find or install build caches, a spack mirror must first be configured with
``spack mirror add <name> <url>``.
Build caches are found via:
.. code-block:: sh
$ spack buildcache list
Build caches are installed via:
.. code-block:: sh
$ spack buildcache install
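Installation reverses the creation step: before anything is unpacked, the downloaded archive's SHA-256 digest is recomputed and compared against the digest recorded when the cache entry was created. A small sketch of that guard (the exception name here is illustrative; Spack raises its own checksum error):

```python
import hashlib

class ChecksumMismatch(Exception):
    """Raised when a downloaded archive does not match its recorded digest."""
    pass

def sha256_of(data):
    return hashlib.sha256(data).hexdigest()

def verify_before_install(archive_bytes, recorded_digest):
    # Refuse to unpack if the recomputed digest differs from the one
    # recorded at creation time: a mismatch means corruption or tampering.
    if sha256_of(archive_bytes) != recorded_digest:
        raise ChecksumMismatch('tarball failed checksum verification')

good = b'pretend tarball contents'
verify_before_install(good, sha256_of(good))      # passes silently
try:
    verify_before_install(b'tampered bytes', sha256_of(good))
except ChecksumMismatch as e:
    print('rejected:', e)
```

Signature verification with gpg happens in addition to this check; only if both succeed is the tarball extracted and relocated.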
Relocation
----------
The initial build and a later installation do not necessarily happen at the
same location. Spack provides a relocation capability that corrects RPATHs and
non-relocatable scripts. However, many packages compile paths into their binary
artifacts directly. In such cases, a package's build instructions would need to
be adjusted for better relocatability.
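On ELF platforms this fix-up can take two forms: substituting the new install root for the old one in each RPATH entry, or rewriting entries relative to the binary's own location using ``$ORIGIN`` so the tree relocates freely. A dependency-free sketch of the relative form (the paths are illustrative):

```python
import os

def relative_rpaths(binary_path, old_root, rpaths):
    # Rewrite any RPATH entry under old_root as $ORIGIN-relative, so the
    # dynamic loader resolves it relative to the binary's own directory.
    # Entries outside the install tree (e.g. system paths) are kept as-is.
    out = []
    for rpath in rpaths:
        if rpath.startswith(old_root):
            rel = os.path.relpath(rpath, start=os.path.dirname(binary_path))
            out.append('$ORIGIN/%s' % rel)
        else:
            out.append(rpath)
    return out

print(relative_rpaths(
    '/opt/spack/pkg-1.0/bin/tool',
    '/opt/spack',
    ['/opt/spack/dep-2.0/lib', '/usr/lib']))
# → ['$ORIGIN/../../dep-2.0/lib', '/usr/lib']
```

The rewritten list would then be applied with ``patchelf --set-rpath`` on Linux; macOS uses ``install_name_tool`` with ``@loader_path`` prefixes instead.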
Usage
-----
spack buildcache create <>
^^^^^^^^^^^^^^^^^^^^^^^^^^
Create a tarball of an installed spack package and all its dependencies.
Each tarball is checksummed, and signed if gpg2 is available.
The tarballs are placed in a "build_cache" directory that can be copied to a mirror.
Commands like "spack buildcache install" search this directory for pre-compiled packages.
options:
-d <path> : directory in which the "build_cache" directory is created; defaults to "."
-f : overwrite the ".spack" file in the "build_cache" directory if it exists
-k <key> : the key to sign the package with. If multiple keys exist, the package will be left unsigned unless -k is used.
-r : make paths in binaries relative before creating the tarball
-y : answer yes to all questions about creating unsigned build caches
<> : list of package specs or package hashes with leading /
spack buildcache list <>
^^^^^^^^^^^^^^^^^^^^^^^^
Retrieves all specs for build caches available on a spack mirror.
options:
<> : string matched against the beginning of the listed concretized short
specs; e.g., "spack buildcache list gcc" prints only the commands to install
gcc package(s)
spack buildcache install <>
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Retrieves all specs for build caches available on a spack mirror and installs
the build caches whose specs match the given specs or hashes.
options:
-f : remove the install directory if it exists before unpacking the tarball
-y : answer yes to all questions about installing packages without gpg verification
<> : list of package specs or package hashes with leading /
spack buildcache keys
^^^^^^^^^^^^^^^^^^^^^
List the public keys available on the spack mirror.
options:
-i : trust the downloaded keys, with a prompt for each
-y : answer yes to all prompts and trust all downloaded keys


@ -65,6 +65,7 @@ or refer to the full manual below.
repositories
command_index
package_list
binary_caches
.. toctree::
:maxdepth: 2


@ -0,0 +1,494 @@
##############################################################################
# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the NOTICE and LICENSE files for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import re
import tarfile
import yaml
import shutil
import llnl.util.tty as tty
from spack.util.gpg import Gpg
from llnl.util.filesystem import mkdirp, join_path, install_tree
from spack.util.web import spider
import spack.cmd
import spack
from spack.stage import Stage
import spack.fetch_strategy as fs
from contextlib import closing
import spack.util.gpg as gpg_util
import hashlib
from spack.util.executable import ProcessError
import spack.relocate as relocate
class NoOverwriteException(Exception):
pass
class NoGpgException(Exception):
pass
class PickKeyException(Exception):
pass
class NoKeyException(Exception):
pass
class NoVerifyException(Exception):
pass
class NoChecksumException(Exception):
pass
def has_gnupg2():
try:
gpg_util.Gpg.gpg()('--version', output=os.devnull)
return True
except ProcessError:
return False
def buildinfo_file_name(prefix):
"""
Filename of the binary package meta-data file
"""
name = prefix + "/.spack/binary_distribution"
return name
def read_buildinfo_file(prefix):
"""
Read buildinfo file
"""
filename = buildinfo_file_name(prefix)
with open(filename, 'r') as inputfile:
content = inputfile.read()
buildinfo = yaml.load(content)
return buildinfo
def write_buildinfo_file(prefix):
"""
Create a cache file containing information
required for the relocation
"""
text_to_relocate = []
binary_to_relocate = []
blacklist = (".spack", "man")
# Do this during tarball creation to save time when the tarball is unpacked.
# Used by make_package_relative to determine binaries to change.
for root, dirs, files in os.walk(prefix, topdown=True):
dirs[:] = [d for d in dirs if d not in blacklist]
for filename in files:
path_name = os.path.join(root, filename)
filetype = relocate.get_filetype(path_name)
if relocate.needs_binary_relocation(filetype):
rel_path_name = os.path.relpath(path_name, prefix)
binary_to_relocate.append(rel_path_name)
elif relocate.needs_text_relocation(filetype):
rel_path_name = os.path.relpath(path_name, prefix)
text_to_relocate.append(rel_path_name)
# Create buildinfo data and write it to disk
buildinfo = {}
buildinfo['buildpath'] = spack.store.layout.root
buildinfo['relocate_textfiles'] = text_to_relocate
buildinfo['relocate_binaries'] = binary_to_relocate
filename = buildinfo_file_name(prefix)
with open(filename, 'w') as outfile:
outfile.write(yaml.dump(buildinfo, default_flow_style=True))
def tarball_directory_name(spec):
"""
Return name of the tarball directory according to the convention
<os>-<architecture>/<compiler>/<package>-<version>/
"""
return "%s/%s/%s-%s" % (spack.architecture.sys_type(),
str(spec.compiler).replace("@", "-"),
spec.name, spec.version)
def tarball_name(spec, ext):
"""
Return the name of the tarfile according to the convention
<os>-<architecture>-<compiler>-<package>-<version>-<dag_hash><ext>
"""
return "%s-%s-%s-%s-%s%s" % (spack.architecture.sys_type(),
str(spec.compiler).replace("@", "-"),
spec.name,
spec.version,
spec.dag_hash(),
ext)
def tarball_path_name(spec, ext):
"""
Return the full path+name for a given spec according to the convention
<tarball_directory_name>/<tarball_name>
"""
return os.path.join(tarball_directory_name(spec),
tarball_name(spec, ext))
def checksum_tarball(file):
# calculate sha256 hash of tar file
BLOCKSIZE = 65536
hasher = hashlib.sha256()
with open(file, 'rb') as tfile:
buf = tfile.read(BLOCKSIZE)
while len(buf) > 0:
hasher.update(buf)
buf = tfile.read(BLOCKSIZE)
return hasher.hexdigest()
def sign_tarball(yes_to_all, key, force, specfile_path):
# Sign the packages if keys available
if not has_gnupg2():
raise NoGpgException()
else:
if key is None:
keys = Gpg.signing_keys()
if len(keys) == 1:
key = keys[0]
if len(keys) > 1:
raise PickKeyException()
if len(keys) == 0:
raise NoKeyException()
if os.path.exists('%s.asc' % specfile_path):
if force:
os.remove('%s.asc' % specfile_path)
else:
raise NoOverwriteException('%s.asc' % specfile_path)
Gpg.sign(key, specfile_path, '%s.asc' % specfile_path)
def generate_index(outdir, indexfile_path):
f = open(indexfile_path, 'w')
header = """<html>
<head></head>
<list>"""
footer = "</list></html>"
paths = os.listdir(outdir + '/build_cache')
f.write(header)
for path in paths:
rel = os.path.basename(path)
f.write('<li><a href="%s">%s</a></li>' % (rel, rel))
f.write(footer)
f.close()
def build_tarball(spec, outdir, force=False, rel=False, yes_to_all=False,
key=None):
"""
Build a tarball from given spec and put it into the directory structure
used at the mirror (following <tarball_directory_name>).
"""
# set up some paths
tarfile_name = tarball_name(spec, '.tar.gz')
tarfile_dir = join_path(outdir, "build_cache",
tarball_directory_name(spec))
tarfile_path = join_path(tarfile_dir, tarfile_name)
mkdirp(tarfile_dir)
spackfile_path = os.path.join(
outdir, "build_cache", tarball_path_name(spec, '.spack'))
if os.path.exists(spackfile_path):
if force:
os.remove(spackfile_path)
else:
raise NoOverwriteException(str(spackfile_path))
# need to copy the spec file so the build cache can be downloaded
# without concretizing with the current spack packages
# and preferences
spec_file = join_path(spec.prefix, ".spack", "spec.yaml")
specfile_name = tarball_name(spec, '.spec.yaml')
specfile_path = os.path.realpath(
join_path(outdir, "build_cache", specfile_name))
indexfile_path = join_path(outdir, "build_cache", "index.html")
if os.path.exists(specfile_path):
if force:
os.remove(specfile_path)
else:
raise NoOverwriteException(str(specfile_path))
# make a copy of the install directory to work with
prefix = join_path(outdir, os.path.basename(spec.prefix))
if os.path.exists(prefix):
shutil.rmtree(prefix)
install_tree(spec.prefix, prefix)
# create info for later relocation and create tar
write_buildinfo_file(prefix)
# optionally make the paths in the binaries relative to each other
# in the spack install tree before creating tarball
if rel:
make_package_relative(prefix)
# create compressed tarball of the install prefix
with closing(tarfile.open(tarfile_path, 'w:gz')) as tar:
tar.add(name='%s' % prefix,
arcname='%s' % os.path.basename(prefix))
# remove copy of install directory
shutil.rmtree(prefix)
# get the sha256 checksum of the tarball
checksum = checksum_tarball(tarfile_path)
# add sha256 checksum to spec.yaml
spec_dict = {}
with open(spec_file, 'r') as inputfile:
content = inputfile.read()
spec_dict = yaml.load(content)
bchecksum = {}
bchecksum['hash_algorithm'] = 'sha256'
bchecksum['hash'] = checksum
spec_dict['binary_cache_checksum'] = bchecksum
with open(specfile_path, 'w') as outfile:
outfile.write(yaml.dump(spec_dict))
signed = False
if not yes_to_all:
# sign the tarball and spec file with gpg
try:
sign_tarball(yes_to_all, key, force, specfile_path)
signed = True
except NoGpgException:
raise NoGpgException()
except PickKeyException:
raise PickKeyException()
except NoKeyException:
raise NoKeyException()
# put tarball, spec and signature files in .spack archive
with closing(tarfile.open(spackfile_path, 'w')) as tar:
tar.add(name='%s' % tarfile_path, arcname='%s' % tarfile_name)
tar.add(name='%s' % specfile_path, arcname='%s' % specfile_name)
if signed:
tar.add(name='%s.asc' % specfile_path,
arcname='%s.asc' % specfile_name)
# cleanup file moved to archive
os.remove(tarfile_path)
if signed:
os.remove('%s.asc' % specfile_path)
# create an index.html for the build_cache directory so specs can be found
if os.path.exists(indexfile_path):
os.remove(indexfile_path)
generate_index(outdir, indexfile_path)
return None
def download_tarball(spec):
"""
Download binary tarball for given package into stage area
Return True if successful
"""
mirrors = spack.config.get_config('mirrors')
if len(mirrors) == 0:
tty.die("Please add a spack mirror to allow " +
"download of pre-compiled packages.")
tarball = tarball_path_name(spec, '.spack')
for key in mirrors:
url = mirrors[key] + "/build_cache/" + tarball
# stage the tarball into standard place
stage = Stage(url, name="build_cache", keep=True)
try:
stage.fetch()
return stage.save_filename
except fs.FetchError:
continue
return None
def make_package_relative(prefix):
"""
Change paths in binaries to relative paths
"""
buildinfo = read_buildinfo_file(prefix)
old_path = buildinfo['buildpath']
for filename in buildinfo['relocate_binaries']:
path_name = os.path.join(prefix, filename)
relocate.make_binary_relative(path_name, old_path)
def relocate_package(prefix):
"""
Relocate the given package
"""
buildinfo = read_buildinfo_file(prefix)
new_path = spack.store.layout.root
old_path = buildinfo['buildpath']
if new_path == old_path:
return
tty.msg("Relocating package from",
"%s to %s." % (old_path, new_path))
for filename in buildinfo['relocate_binaries']:
path_name = os.path.join(prefix, filename)
relocate.relocate_binary(path_name, old_path, new_path)
for filename in buildinfo['relocate_textfiles']:
path_name = os.path.join(prefix, filename)
relocate.relocate_text(path_name, old_path, new_path)
def extract_tarball(spec, filename, yes_to_all=False, force=False):
"""
extract binary tarball for given package into install area
"""
installpath = spec.prefix
if os.path.exists(installpath):
if force:
shutil.rmtree(installpath)
else:
raise NoOverwriteException(str(installpath))
mkdirp(installpath)
stagepath = os.path.dirname(filename)
spackfile_name = tarball_name(spec, '.spack')
spackfile_path = os.path.join(stagepath, spackfile_name)
tarfile_name = tarball_name(spec, '.tar.gz')
tarfile_path = os.path.join(stagepath, tarfile_name)
specfile_name = tarball_name(spec, '.spec.yaml')
specfile_path = os.path.join(stagepath, specfile_name)
with closing(tarfile.open(spackfile_path, 'r')) as tar:
tar.extractall(stagepath)
if os.path.exists('%s.asc' % specfile_path):
Gpg.verify('%s.asc' % specfile_path, specfile_path)
os.remove(specfile_path + '.asc')
else:
if not yes_to_all:
raise NoVerifyException()
# get the sha256 checksum of the tarball
checksum = checksum_tarball(tarfile_path)
# get the sha256 checksum recorded at creation
spec_dict = {}
with open(specfile_path, 'r') as inputfile:
content = inputfile.read()
spec_dict = yaml.load(content)
bchecksum = spec_dict['binary_cache_checksum']
# if the checksums don't match don't install
if bchecksum['hash'] != checksum:
raise NoChecksumException()
with closing(tarfile.open(tarfile_path, 'r')) as tar:
tar.extractall(path=join_path(installpath, '..'))
os.remove(tarfile_path)
os.remove(specfile_path)
relocate_package(installpath)
def get_specs():
"""
Get spec.yaml's for build caches available on mirror
"""
mirrors = spack.config.get_config('mirrors')
if len(mirrors) == 0:
tty.die("Please add a spack mirror to allow " +
"download of build caches.")
path = str(spack.architecture.sys_type())
specs = set()
urls = set()
from collections import defaultdict
durls = defaultdict(list)
for key in mirrors:
url = mirrors[key]
if url.startswith('file'):
mirror = url.replace('file://', '') + '/build_cache'
tty.msg("Finding buildcaches in %s" % mirror)
files = os.listdir(mirror)
for file in files:
if re.search('spec.yaml', file):
link = 'file://' + mirror + '/' + file
urls.add(link)
else:
tty.msg("Finding buildcaches on %s" % url)
p, links = spider(url + "/build_cache")
for link in links:
if re.search("spec.yaml", link) and re.search(path, link):
urls.add(link)
for link in urls:
with Stage(link, name="build_cache", keep=True) as stage:
try:
stage.fetch()
except fs.FetchError:
continue
with open(stage.save_filename, 'r') as f:
spec = spack.spec.Spec.from_yaml(f)
specs.add(spec)
durls[spec].append(link)
return specs, durls
def get_keys(install=False, yes_to_all=False):
"""
Get pgp public keys available on mirror
"""
mirrors = spack.config.get_config('mirrors')
if len(mirrors) == 0:
tty.die("Please add a spack mirror to allow " +
"download of build caches.")
keys = set()
for key in mirrors:
url = mirrors[key]
if url.startswith('file'):
mirror = url.replace('file://', '') + '/build_cache'
tty.msg("Finding public keys in %s" % mirror)
files = os.listdir(mirror)
for file in files:
if re.search(r'\.key', file):
link = 'file://' + mirror + '/' + file
keys.add(link)
else:
tty.msg("Finding public keys on %s" % url)
p, links = spider(url + "/build_cache", depth=1)
for link in links:
if re.search(r"\.key", link):
keys.add(link)
for link in keys:
with Stage(link, name="build_cache", keep=True) as stage:
try:
stage.fetch()
except fs.FetchError:
continue
tty.msg('Found key %s' % link)
if install:
if yes_to_all:
Gpg.trust(stage.save_filename)
tty.msg('Added this key to trusted keys.')
else:
tty.msg('Will not add this key to trusted keys. '
'Use -y to override')


@ -0,0 +1,230 @@
##############################################################################
# Copyright (c) 2013, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Written by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License (as published by
# the Free Software Foundation) version 2.1 dated February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public License
# along with this program; if not, write to the Free Software Foundation,
# Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import argparse
import os
import re
import llnl.util.tty as tty
import spack
import spack.cmd
import spack.binary_distribution as bindist
from spack.binary_distribution import NoOverwriteException, NoGpgException
from spack.binary_distribution import NoKeyException, PickKeyException
from spack.binary_distribution import NoVerifyException, NoChecksumException
description = "Create, download and install build cache files."
section = "caching"
level = "long"
def setup_parser(subparser):
setup_parser.parser = subparser
subparsers = subparser.add_subparsers(help='buildcache sub-commands')
create = subparsers.add_parser('create')
create.add_argument('-r', '--rel', action='store_true',
help="make all rpaths relative" +
" before creating tarballs.")
create.add_argument('-f', '--force', action='store_true',
help="overwrite tarball if it exists.")
create.add_argument('-y', '--yes-to-all', action='store_true',
help="answer yes to all create unsigned " +
"buildcache questions")
create.add_argument('-k', '--key', metavar='key',
type=str, default=None,
help="Key for signing.")
create.add_argument('-d', '--directory', metavar='directory',
type=str, default='.',
help="directory in which to save the tarballs.")
create.add_argument(
'packages', nargs=argparse.REMAINDER,
help="specs of packages to create buildcache for")
create.set_defaults(func=createtarball)
install = subparsers.add_parser('install')
install.add_argument('-f', '--force', action='store_true',
help="overwrite install directory if it exists.")
install.add_argument('-y', '--yes-to-all', action='store_true',
help="answer yes to all install unsigned " +
"buildcache questions")
install.add_argument(
'packages', nargs=argparse.REMAINDER,
help="specs of packages to install buildcache for")
install.set_defaults(func=installtarball)
listcache = subparsers.add_parser('list')
listcache.add_argument(
'packages', nargs=argparse.REMAINDER,
help="specs of packages to search for")
listcache.set_defaults(func=listspecs)
dlkeys = subparsers.add_parser('keys')
dlkeys.add_argument(
'-i', '--install', action='store_true',
help="install keys pulled from the mirror")
dlkeys.add_argument(
'-y', '--yes-to-all', action='store_true',
help="answer yes to all trust questions")
dlkeys.set_defaults(func=getkeys)
def createtarball(args):
if not args.packages:
tty.die("build cache file creation requires at least one" +
" installed package argument")
pkgs = set(args.packages)
specs = set()
outdir = os.getcwd()
if args.directory:
outdir = args.directory
signkey = None
if args.key:
signkey = args.key
yes_to_all = False
force = False
relative = False
if args.yes_to_all:
yes_to_all = True
if args.force:
force = True
if args.rel:
relative = True
for pkg in pkgs:
for spec in spack.cmd.parse_specs(pkg, concretize=True):
specs.add(spec)
tty.msg('recursing dependencies')
for d, node in spec.traverse(order='post',
depth=True,
deptype=('link', 'run'),
deptype_query='run'):
if not node.external:
tty.msg('adding dependency %s' % node.format())
specs.add(node)
for spec in specs:
tty.msg('creating binary cache file for package %s ' % spec.format())
try:
bindist.build_tarball(spec, outdir, force,
relative, yes_to_all, signkey)
except NoOverwriteException as e:
tty.warn("%s exists, use -f to force overwrite." % e)
except NoGpgException:
tty.warn("gpg2 is not available,"
" use -y to create unsigned build caches")
except NoKeyException:
tty.warn("no default key available for signing,"
" use -y to create unsigned build caches"
" or spack gpg init to create a default key")
except PickKeyException:
tty.warn("multi keys available for signing,"
" use -y to create unsigned build caches"
" or -k <key hash> to pick a key")
def installtarball(args):
if not args.packages:
tty.die("build cache file installation requires" +
" at least one package spec argument")
pkgs = set(args.packages)
specs, links = bindist.get_specs()
matches = set()
for spec in specs:
for pkg in pkgs:
if re.match(re.escape(pkg), str(spec)):
matches.add(spec)
if re.match(re.escape(pkg), '/%s' % spec.dag_hash()):
matches.add(spec)
for match in matches:
install_tarball(match, args)
def install_tarball(spec, args):
s = spack.spec.Spec(spec)
yes_to_all = False
force = False
if args.yes_to_all:
yes_to_all = True
if args.force:
force = True
for d in s.dependencies():
tty.msg("Installing buildcache for dependency spec %s" % d)
install_tarball(d, args)
package = spack.repo.get(spec)
if s.concrete and package.installed and not force:
tty.warn("Package for spec %s already installed." % spec.format(),
" Use -f flag to overwrite.")
else:
tarball = bindist.download_tarball(spec)
if tarball:
tty.msg('Installing buildcache for spec %s' % spec.format())
try:
bindist.extract_tarball(spec, tarball, yes_to_all, force)
except NoOverwriteException as e:
tty.warn("%s exists. use -f to force overwrite." % e.args)
except NoVerifyException:
tty.die("Package spec file failed signature verification,"
" use -y flag to install build cache")
except NoChecksumException:
tty.die("Package tarball failed checksum verification,"
" use -y flag to install build cache")
finally:
spack.store.db.reindex(spack.store.layout)
else:
tty.die('Download of binary cache file for spec %s failed.' %
spec.format())
def listspecs(args):
specs, links = bindist.get_specs()
if args.packages:
pkgs = set(args.packages)
for pkg in pkgs:
tty.msg("buildcache spec(s) matching %s \n" % pkg)
for spec in sorted(specs):
if re.search("^" + re.escape(pkg), str(spec)):
tty.msg('run "spack buildcache install /%s"' %
spec.dag_hash(7) + ' to install %s\n' %
spec.format())
else:
tty.msg("buildcache specs ")
for spec in sorted(specs):
tty.msg('run "spack buildcache install /%s" to install %s\n' %
(spec.dag_hash(7), spec.format()))
def getkeys(args):
install = False
if args.install:
install = True
yes_to_all = False
if args.yes_to_all:
yes_to_all = True
bindist.get_keys(install, yes_to_all)
def buildcache(parser, args):
if args.func:
args.func(args)

lib/spack/spack/relocate.py

@ -0,0 +1,289 @@
##############################################################################
# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the NOTICE and LICENSE files for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
import os
import platform
import re
import spack
import spack.cmd
from spack.util.executable import Executable
from llnl.util.filesystem import filter_file
import llnl.util.tty as tty
def get_patchelf():
"""
Builds and installs spack patchelf package on linux platforms
using the first concretized spec.
Returns the full patchelf binary path.
"""
# as we may need patchelf, find out where it is
if platform.system() == 'Darwin':
return None
patchelf_spec = spack.cmd.parse_specs("patchelf", concretize=True)[0]
patchelf = spack.repo.get(patchelf_spec)
if not patchelf.installed:
patchelf.do_install()
patchelf_executable = os.path.join(patchelf.prefix.bin, "patchelf")
return patchelf_executable
def get_existing_elf_rpaths(path_name):
"""
Return the RPATHS returned by patchelf --print-rpath path_name
as a list of strings.
"""
if platform.system() == 'Linux':
command = Executable(get_patchelf())
output = command('--print-rpath', '%s' %
path_name, output=str, err=str)
return output.rstrip('\n').split(':')
else:
tty.die('relocation not supported for this platform')
return
def get_relative_rpaths(path_name, orig_dir, orig_rpaths):
"""
Replaces orig_dir with relative path from dirname(path_name) if an rpath
in orig_rpaths contains orig_path. Prefixes $ORIGIN
to relative paths and returns replacement rpaths.
"""
rel_rpaths = []
for rpath in orig_rpaths:
if re.match(orig_dir, rpath):
rel = os.path.relpath(rpath, start=os.path.dirname(path_name))
rel_rpaths.append('$ORIGIN/%s' % rel)
else:
rel_rpaths.append(rpath)
return rel_rpaths
def macho_get_paths(path_name):
"""
Examines the output of otool -l path_name for these three fields:
LC_ID_DYLIB, LC_LOAD_DYLIB, LC_RPATH and parses out the rpaths,
dependiencies and library id.
Returns these values.
"""
otool = Executable('otool')
output = otool("-l", path_name, output=str, err=str)
last_cmd = None
idpath = ''
rpaths = []
deps = []
for line in output.split('\n'):
match = re.search('( *[a-zA-Z]+ )(.*)', line)
if match:
lhs = match.group(1).lstrip().rstrip()
rhs = match.group(2)
match2 = re.search(r'(.*) \(.*\)', rhs)
if match2:
rhs = match2.group(1)
if lhs == 'cmd':
last_cmd = rhs
if lhs == 'path' and last_cmd == 'LC_RPATH':
rpaths.append(rhs)
if lhs == 'name' and last_cmd == 'LC_ID_DYLIB':
idpath = rhs
if lhs == 'name' and last_cmd == 'LC_LOAD_DYLIB':
deps.append(rhs)
return rpaths, deps, idpath
def macho_make_paths_relative(path_name, old_dir, rpaths, deps, idpath):
"""
Replace old_dir with a relative path from dirname(path_name)
in rpaths and deps; idpath is replaced with @rpath/basename(path_name);
the replacements are returned.
"""
id = None
nrpaths = []
ndeps = []
if idpath:
id = '@rpath/%s' % os.path.basename(idpath)
for rpath in rpaths:
if re.match(old_dir, rpath):
rel = os.path.relpath(rpath, start=os.path.dirname(path_name))
nrpaths.append('@loader_path/%s' % rel)
else:
nrpaths.append(rpath)
for dep in deps:
if re.match(old_dir, dep):
rel = os.path.relpath(dep, start=os.path.dirname(path_name))
ndeps.append('@loader_path/%s' % rel)
else:
ndeps.append(dep)
return nrpaths, ndeps, id
def macho_replace_paths(old_dir, new_dir, rpaths, deps, idpath):
"""
Replace old_dir with new_dir in rpaths, deps and idpath
and return replacements
"""
id = None
nrpaths = []
ndeps = []
if idpath:
id = idpath.replace(old_dir, new_dir)
for rpath in rpaths:
nrpath = rpath.replace(old_dir, new_dir)
nrpaths.append(nrpath)
for dep in deps:
ndep = dep.replace(old_dir, new_dir)
ndeps.append(ndep)
return nrpaths, ndeps, id
def modify_macho_object(path_name, old_dir, new_dir, relative):
"""
Modify MachO binary path_name by replacing old_dir with new_dir
or the relative path to spack install root.
The old install dir in LC_ID_DYLIB is replaced with the new install dir
using install_name_tool -id newid binary
The old install dir in LC_LOAD_DYLIB is replaced with the new install dir
using install_name_tool -change old new binary
The old install dir in LC_RPATH is replaced with the new install dir using
install_name_tool -rpath old new binary
"""
# avoid error message for libgcc_s
if 'libgcc_' in path_name:
return
rpaths, deps, idpath = macho_get_paths(path_name)
id = None
nrpaths = []
ndeps = []
if relative:
nrpaths, ndeps, id = macho_make_paths_relative(path_name,
old_dir, rpaths,
deps, idpath)
else:
nrpaths, ndeps, id = macho_replace_paths(old_dir, new_dir, rpaths,
deps, idpath)
install_name_tool = Executable('install_name_tool')
if id:
install_name_tool('-id', id, path_name, output=str, err=str)
for orig, new in zip(deps, ndeps):
install_name_tool('-change', orig, new, path_name)
for orig, new in zip(rpaths, nrpaths):
install_name_tool('-rpath', orig, new, path_name)
return
def get_filetype(path_name):
"""
Return the output of file path_name as a string to identify file type.
"""
file = Executable('file')
file.add_default_env('LC_ALL', 'C')
output = file('-b', '-h', '%s' % path_name,
output=str, err=str)
return output.strip()
def modify_elf_object(path_name, orig_rpath, new_rpath):
"""
Replace orig_rpath with new_rpath in RPATH of elf object path_name
"""
if platform.system() == 'Linux':
new_joined = ':'.join(new_rpath)
patchelf = Executable(get_patchelf())
patchelf('--force-rpath', '--set-rpath', '%s' % new_joined,
'%s' % path_name, output=str, err=str)
else:
tty.die('relocation not supported for this platform')
def needs_binary_relocation(filetype):
"""
Check whether the given filetype is a binary that may need relocation.
"""
    if "relocatable" in filetype:
        return False
    if platform.system() == 'Darwin':
        return 'Mach-O' in filetype
    elif platform.system() == 'Linux':
        return 'ELF' in filetype
    else:
        tty.die("Relocation not implemented for %s" % platform.system())
def needs_text_relocation(filetype):
"""
Check whether the given filetype is text that may need relocation.
"""
return ("text" in filetype)
def relocate_binary(path_name, old_dir, new_dir):
"""
Change old_dir to new_dir in RPATHs of elf or mach-o file path_name
"""
if platform.system() == 'Darwin':
modify_macho_object(path_name, old_dir, new_dir, relative=False)
elif platform.system() == 'Linux':
orig_rpaths = get_existing_elf_rpaths(path_name)
new_rpaths = substitute_rpath(orig_rpaths, old_dir, new_dir)
modify_elf_object(path_name, orig_rpaths, new_rpaths)
else:
tty.die("Relocation not implemented for %s" % platform.system())
def make_binary_relative(path_name, old_dir):
"""
Make RPATHs relative to old_dir in given elf or mach-o file path_name
"""
if platform.system() == 'Darwin':
new_dir = ''
modify_macho_object(path_name, old_dir, new_dir, relative=True)
elif platform.system() == 'Linux':
orig_rpaths = get_existing_elf_rpaths(path_name)
new_rpaths = get_relative_rpaths(path_name, old_dir, orig_rpaths)
modify_elf_object(path_name, orig_rpaths, new_rpaths)
else:
tty.die("Prelocation not implemented for %s" % platform.system())
def relocate_text(path_name, old_dir, new_dir):
"""
Replace old path with new path in text file path_name
"""
filter_file("r'%s'" % old_dir, "r'%s'" % new_dir, path_name)
def substitute_rpath(orig_rpath, topdir, new_root_path):
"""
Replace topdir with new_root_path RPATH list orig_rpath
"""
new_rpaths = []
for path in orig_rpath:
new_rpath = path.replace(topdir, new_root_path)
new_rpaths.append(new_rpath)
return new_rpaths
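The substitution above is a plain string replace per entry; a minimal sketch with the same values the commit's test uses:

```python
# Each rpath entry is rewritten by simple substring replacement; entries
# that do not contain the old topdir pass through unchanged.
orig_rpaths = ['/usr/lib', '/usr/lib64', '/opt/local/lib']
new_rpaths = [p.replace('/usr', '/opt') for p in orig_rpaths]
assert new_rpaths == ['/opt/lib', '/opt/lib64', '/opt/local/lib']
```

Note that `str.replace` substitutes the old prefix anywhere in the string, not only at the start, so a path like `/my/usr/lib` would also be rewritten.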


@@ -0,0 +1,304 @@
##############################################################################
# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the NOTICE and LICENSE files for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
"""
This test checks the binary packaging infrastructure
"""
import pytest
import spack
import spack.store
from spack.fetch_strategy import URLFetchStrategy, FetchStrategyComposite
from spack.spec import Spec
import spack.binary_distribution as bindist
from llnl.util.filesystem import join_path, mkdirp
import argparse
import spack.cmd.buildcache as buildcache
from spack.relocate import *
import os
import stat
import sys
import shutil
from spack.util.executable import ProcessError
@pytest.fixture(scope='function')
def testing_gpg_directory(tmpdir):
old_gpg_path = spack.util.gpg.GNUPGHOME
spack.util.gpg.GNUPGHOME = str(tmpdir.join('gpg'))
yield
spack.util.gpg.GNUPGHOME = old_gpg_path
def has_gnupg2():
try:
spack.util.gpg.Gpg.gpg()('--version', output=os.devnull)
return True
except ProcessError:
return False
def fake_fetchify(url, pkg):
"""Fake the URL for a package so it downloads from a file."""
fetcher = FetchStrategyComposite()
fetcher.append(URLFetchStrategy(url))
pkg.fetcher = fetcher
@pytest.mark.usefixtures('install_mockery', 'testing_gpg_directory')
def test_packaging(mock_archive, tmpdir):
# tweak patchelf to only do a download
spec = Spec("patchelf")
spec.concretize()
pkg = spack.repo.get(spec)
fake_fetchify(pkg.fetcher, pkg)
mkdirp(join_path(pkg.prefix, "bin"))
patchelfscr = join_path(pkg.prefix, "bin", "patchelf")
f = open(patchelfscr, 'w')
body = """#!/bin/bash
echo $PATH"""
f.write(body)
f.close()
st = os.stat(patchelfscr)
os.chmod(patchelfscr, st.st_mode | stat.S_IEXEC)
# Install the test package
spec = Spec('trivial-install-test-package')
spec.concretize()
assert spec.concrete
pkg = spack.repo.get(spec)
fake_fetchify(mock_archive.url, pkg)
pkg.do_install()
# Put some non-relocatable file in there
filename = join_path(spec.prefix, "dummy.txt")
with open(filename, "w") as script:
script.write(spec.prefix)
# Create the build cache and
# put it directly into the mirror
mirror_path = join_path(tmpdir, 'test-mirror')
specs = [spec]
spack.mirror.create(
mirror_path, specs, no_checksum=True
)
# register mirror with spack config
mirrors = {'spack-mirror-test': 'file://' + mirror_path}
spack.config.update_config('mirrors', mirrors)
stage = spack.stage.Stage(
mirrors['spack-mirror-test'], name="build_cache", keep=True)
stage.create()
# setup argument parser
parser = argparse.ArgumentParser()
buildcache.setup_parser(parser)
# Create a private key to sign package with if gpg2 available
if has_gnupg2():
spack.util.gpg.Gpg.create(name='test key 1', expires='0',
email='spack@googlegroups.com',
comment='Spack test key')
# Create build cache with signing
args = parser.parse_args(['create', '-d', mirror_path, str(spec)])
buildcache.buildcache(parser, args)
# Uninstall the package
pkg.do_uninstall(force=True)
# test overwrite install
args = parser.parse_args(['install', '-f', str(spec)])
buildcache.buildcache(parser, args)
# create build cache with relative path and signing
args = parser.parse_args(
['create', '-d', mirror_path, '-f', '-r', str(spec)])
buildcache.buildcache(parser, args)
# Uninstall the package
pkg.do_uninstall(force=True)
# install build cache with verification
args = parser.parse_args(['install', str(spec)])
buildcache.install_tarball(spec, args)
# test overwrite install
args = parser.parse_args(['install', '-f', str(spec)])
buildcache.buildcache(parser, args)
else:
# create build cache without signing
args = parser.parse_args(
['create', '-d', mirror_path, '-y', str(spec)])
buildcache.buildcache(parser, args)
# Uninstall the package
pkg.do_uninstall(force=True)
# install build cache without verification
args = parser.parse_args(['install', '-y', str(spec)])
buildcache.install_tarball(spec, args)
# test overwrite install without verification
args = parser.parse_args(['install', '-f', '-y', str(spec)])
buildcache.buildcache(parser, args)
# create build cache with relative path
args = parser.parse_args(
['create', '-d', mirror_path, '-f', '-r', '-y', str(spec)])
buildcache.buildcache(parser, args)
# Uninstall the package
pkg.do_uninstall(force=True)
# install build cache
args = parser.parse_args(['install', '-y', str(spec)])
buildcache.install_tarball(spec, args)
# test overwrite install
args = parser.parse_args(['install', '-f', '-y', str(spec)])
buildcache.buildcache(parser, args)
# Validate the relocation information
buildinfo = bindist.read_buildinfo_file(spec.prefix)
    assert buildinfo['relocate_textfiles'] == ['dummy.txt']
args = parser.parse_args(['list'])
buildcache.buildcache(parser, args)
args = parser.parse_args(['list', 'trivial'])
buildcache.buildcache(parser, args)
# Copy a key to the mirror to have something to download
shutil.copyfile(spack.mock_gpg_keys_path + '/external.key',
mirror_path + '/external.key')
args = parser.parse_args(['keys'])
buildcache.buildcache(parser, args)
# unregister mirror with spack config
mirrors = {}
spack.config.update_config('mirrors', mirrors)
shutil.rmtree(mirror_path)
stage.destroy()
def test_relocate():
    assert needs_binary_relocation('relocatable') is False
out = macho_make_paths_relative('/Users/Shares/spack/pkgC/lib/libC.dylib',
'/Users/Shared/spack',
('/Users/Shared/spack/pkgA/lib',
'/Users/Shared/spack/pkgB/lib',
'/usr/local/lib'),
('/Users/Shared/spack/pkgA/libA.dylib',
'/Users/Shared/spack/pkgB/libB.dylib',
'/usr/local/lib/libloco.dylib'),
'/Users/Shared/spack/pkgC/lib/libC.dylib')
assert out == (['@loader_path/../../../../Shared/spack/pkgA/lib',
'@loader_path/../../../../Shared/spack/pkgB/lib',
'/usr/local/lib'],
['@loader_path/../../../../Shared/spack/pkgA/libA.dylib',
'@loader_path/../../../../Shared/spack/pkgB/libB.dylib',
'/usr/local/lib/libloco.dylib'],
'@rpath/libC.dylib')
out = macho_make_paths_relative('/Users/Shared/spack/pkgC/bin/exeC',
'/Users/Shared/spack',
('/Users/Shared/spack/pkgA/lib',
'/Users/Shared/spack/pkgB/lib',
'/usr/local/lib'),
('/Users/Shared/spack/pkgA/libA.dylib',
'/Users/Shared/spack/pkgB/libB.dylib',
'/usr/local/lib/libloco.dylib'), None)
assert out == (['@loader_path/../../pkgA/lib',
'@loader_path/../../pkgB/lib',
'/usr/local/lib'],
['@loader_path/../../pkgA/libA.dylib',
'@loader_path/../../pkgB/libB.dylib',
'/usr/local/lib/libloco.dylib'], None)
out = macho_replace_paths('/Users/Shared/spack',
'/Applications/spack',
('/Users/Shared/spack/pkgA/lib',
'/Users/Shared/spack/pkgB/lib',
'/usr/local/lib'),
('/Users/Shared/spack/pkgA/libA.dylib',
'/Users/Shared/spack/pkgB/libB.dylib',
'/usr/local/lib/libloco.dylib'),
'/Users/Shared/spack/pkgC/lib/libC.dylib')
assert out == (['/Applications/spack/pkgA/lib',
'/Applications/spack/pkgB/lib',
'/usr/local/lib'],
['/Applications/spack/pkgA/libA.dylib',
'/Applications/spack/pkgB/libB.dylib',
'/usr/local/lib/libloco.dylib'],
'/Applications/spack/pkgC/lib/libC.dylib')
out = macho_replace_paths('/Users/Shared/spack',
'/Applications/spack',
('/Users/Shared/spack/pkgA/lib',
'/Users/Shared/spack/pkgB/lib',
'/usr/local/lib'),
('/Users/Shared/spack/pkgA/libA.dylib',
'/Users/Shared/spack/pkgB/libB.dylib',
'/usr/local/lib/libloco.dylib'),
None)
assert out == (['/Applications/spack/pkgA/lib',
'/Applications/spack/pkgB/lib',
'/usr/local/lib'],
['/Applications/spack/pkgA/libA.dylib',
'/Applications/spack/pkgB/libB.dylib',
'/usr/local/lib/libloco.dylib'],
None)
out = get_relative_rpaths(
'/usr/bin/test', '/usr',
('/usr/lib', '/usr/lib64', '/opt/local/lib'))
assert out == ['$ORIGIN/../lib', '$ORIGIN/../lib64', '/opt/local/lib']
out = substitute_rpath(
('/usr/lib', '/usr/lib64', '/opt/local/lib'), '/usr', '/opt')
assert out == ['/opt/lib', '/opt/lib64', '/opt/local/lib']
@pytest.mark.skipif(sys.platform != 'darwin',
reason="only works with Mach-o objects")
def test_relocate_macho():
get_patchelf()
    assert needs_binary_relocation('Mach-O') is True
macho_get_paths('/bin/bash')
shutil.copyfile('/bin/bash', 'bash')
modify_macho_object('bash', '/usr', '/opt', False)
modify_macho_object('bash', '/usr', '/opt', True)
shutil.copyfile('/usr/lib/libncurses.5.4.dylib', 'libncurses.5.4.dylib')
modify_macho_object('libncurses.5.4.dylib', '/usr', '/opt', False)
modify_macho_object('libncurses.5.4.dylib', '/usr', '/opt', True)
@pytest.mark.skipif(sys.platform != 'linux2',
reason="only works with Elf objects")
def test_relocate_elf():
    assert needs_binary_relocation('ELF') is True


@@ -0,0 +1,41 @@
##############################################################################
# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
# Produced at the Lawrence Livermore National Laboratory.
#
# This file is part of Spack.
# Created by Todd Gamblin, tgamblin@llnl.gov, All rights reserved.
# LLNL-CODE-647188
#
# For details, see https://github.com/llnl/spack
# Please also see the LICENSE file for our notice and the LGPL.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU Lesser General Public License (as
# published by the Free Software Foundation) version 2.1, February 1999.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
# conditions of the GNU Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
##############################################################################
from spack import *
class Patchelf(AutotoolsPackage):
"""
PatchELF is a small utility to modify the
dynamic linker and RPATH of ELF executables.
"""
homepage = "https://nixos.org/patchelf.html"
url = "http://nixos.org/releases/patchelf/patchelf-0.8/patchelf-0.8.tar.gz"
list_url = "http://nixos.org/releases/patchelf/"
list_depth = 1
version('0.9', '3c265508526760f233620f35d79c79fc')
version('0.8', '407b229e6a681ffb0e2cdd5915cb2d01')