Merge branch 'develop' into features/shared
@@ -1,3 +1,12 @@
# v0.13.4 (2020-02-07)

This release contains several bugfixes:

* bugfixes for invoking python in various environments (#14349, #14496, #14569)
* brought tab completion up to date (#14392)
* bugfix for removing extensions from views in order (#12961)
* bugfix for nondeterministic hashing for specs with externals (#14390)

# v0.13.3 (2019-12-23)

This release contains more major performance improvements for Spack
@@ -30,11 +30,21 @@ Default is ``$spack/opt/spack``.
``install_hash_length`` and ``install_path_scheme``
---------------------------------------------------

The default Spack installation path can be very long and can create
problems for scripts with hardcoded shebangs. There are two parameters
to help with that. Firstly, the ``install_hash_length`` parameter can
set the length of the hash in the installation path from 1 to 32. The
default path uses the full 32 characters.
The default Spack installation path can be very long and can create problems
for scripts with hardcoded shebangs. Additionally, when using the Intel
compiler, and if there is also a long list of dependencies, the compiler may
segfault. If you see the following:

.. code-block:: console

   : internal error: ** The compiler has encountered an unexpected problem.
   ** Segmentation violation signal raised. **
   Access violation or stack overflow. Please contact Intel Support for assistance.

it may be because variables containing dependency specs are too long. There
are two parameters to help with long path names. Firstly, the
``install_hash_length`` parameter can set the length of the hash in the
installation path from 1 to 32. The default path uses the full 32 characters.

Secondly, it is also possible to modify the entire installation
scheme. By default Spack uses
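To make the effect of ``install_hash_length`` concrete, here is an illustrative sketch of how truncating the hash shortens every install prefix. The hash value and path are made up for illustration; Spack embeds its own DAG hash in prefixes, not an md5 of the spec name.

```python
import hashlib

# Stand-in for a 32-character hash like the ones Spack embeds in
# install paths (illustrative value, not Spack's real DAG hash).
full_hash = hashlib.md5(b"zlib@1.2.11").hexdigest()
assert len(full_hash) == 32

# With install_hash_length: 7, only a prefix of the hash is kept.
short_hash = full_hash[:7]

long_prefix = "/opt/spack/linux-centos7-x86_64/gcc-9.2.0/zlib-1.2.11-" + full_hash
short_prefix = "/opt/spack/linux-centos7-x86_64/gcc-9.2.0/zlib-1.2.11-" + short_hash
print(len(long_prefix) - len(short_prefix))  # → 25
```

Every prefix in the tree shrinks by the same amount, which is often enough to keep shebang lines under the kernel's length limit.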
307
lib/spack/docs/containers.rst
Normal file
@@ -0,0 +1,307 @@
.. Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
   Spack Project Developers. See the top-level COPYRIGHT file for details.

   SPDX-License-Identifier: (Apache-2.0 OR MIT)

.. _containers:

================
Container Images
================

Spack can be an ideal tool to set up images for containers since all the
features discussed in :ref:`environments` can greatly help to manage
the installation of software during the image build process. Nonetheless,
building a production image from scratch still requires a lot of
boilerplate to:

- Get Spack working within the image, possibly running as root
- Minimize the physical size of the software installed
- Properly update the system software in the base image

To help users with these tedious tasks, Spack provides a command
to automatically generate recipes for container images based on
Environments:

.. code-block:: console

   $ ls
   spack.yaml

   $ spack containerize
   # Build stage with Spack pre-installed and ready to be used
   FROM spack/centos7:latest as builder

   # What we want to install and how we want to install it
   # is specified in a manifest file (spack.yaml)
   RUN mkdir /opt/spack-environment \
   &&  (echo "spack:" \
   &&   echo "  specs:" \
   &&   echo "  - gromacs+mpi" \
   &&   echo "  - mpich" \
   &&   echo "  concretization: together" \
   &&   echo "  config:" \
   &&   echo "    install_tree: /opt/software" \
   &&   echo "  view: /opt/view") > /opt/spack-environment/spack.yaml

   # Install the software, remove unnecessary deps
   RUN cd /opt/spack-environment && spack install && spack gc -y

   # Strip all the binaries
   RUN find -L /opt/view/* -type f -exec readlink -f '{}' \; | \
       xargs file -i | \
       grep 'charset=binary' | \
       grep 'x-executable\|x-archive\|x-sharedlib' | \
       awk -F: '{print $1}' | xargs strip -s

   # Modifications to the environment that are necessary to run
   RUN cd /opt/spack-environment && \
       spack env activate --sh -d . >> /etc/profile.d/z10_spack_environment.sh


   # Bare OS image to run the installed executables
   FROM centos:7

   COPY --from=builder /opt/spack-environment /opt/spack-environment
   COPY --from=builder /opt/software /opt/software
   COPY --from=builder /opt/view /opt/view
   COPY --from=builder /etc/profile.d/z10_spack_environment.sh /etc/profile.d/z10_spack_environment.sh

   RUN yum update -y && yum install -y epel-release && yum update -y \
       && yum install -y libgomp \
       && rm -rf /var/cache/yum && yum clean all

   RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ \[$(tput sgr0)\]"' >> ~/.bashrc


   LABEL "app"="gromacs"
   LABEL "mpi"="mpich"

   ENTRYPOINT ["/bin/bash", "--rcfile", "/etc/profile", "-l"]

The bits that make this automation possible are discussed in detail
below. All the images generated in this way will be based on
multi-stage builds with:

- A fat ``build`` stage containing common build tools and Spack itself
- A minimal ``final`` stage containing only the software requested by the user

-----------------
Spack Base Images
-----------------

Docker images with Spack preinstalled and ready to be used are
built on `Docker Hub <https://hub.docker.com/u/spack>`_
at every push to ``develop`` or to a release branch. The operating
systems currently supported are summarized in the table below:

.. _containers-supported-os:

.. list-table:: Supported operating systems
   :header-rows: 1

   * - Operating System
     - Base Image
     - Spack Image
   * - Ubuntu 16.04
     - ``ubuntu:16.04``
     - ``spack/ubuntu-xenial``
   * - Ubuntu 18.04
     - ``ubuntu:18.04``
     - ``spack/ubuntu-bionic``
   * - CentOS 6
     - ``centos:6``
     - ``spack/centos6``
   * - CentOS 7
     - ``centos:7``
     - ``spack/centos7``

All the images are tagged with the corresponding release of Spack:

.. image:: dockerhub_spack.png

with the exception of the ``latest`` tag, which points to the HEAD
of the ``develop`` branch. These images are available for anyone
to use and take care of all the repetitive tasks that are necessary
to set up Spack within a container. All the container recipes generated
automatically by Spack use them as base images for their ``build`` stage.

-------------------------
Environment Configuration
-------------------------

Any Spack Environment can be used for the automatic generation of container
recipes. Sensible defaults are provided for things like the base image or the
version of Spack used in the image. If finer tuning is needed, it can be
obtained by adding the relevant metadata under the ``container`` attribute
of environments:

.. code-block:: yaml

   spack:
     specs:
     - gromacs+mpi
     - mpich

     container:
       # Select the format of the recipe e.g. docker,
       # singularity or anything else that is currently supported
       format: docker

       # Select from a valid list of images
       base:
         image: "centos:7"
         spack: develop

       # Whether or not to strip binaries
       strip: true

       # Additional system packages that are needed at runtime
       os_packages:
       - libgomp

       # Extra instructions
       extra_instructions:
         final: |
           RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ \[$(tput sgr0)\]"' >> ~/.bashrc

       # Labels for the image
       labels:
         app: "gromacs"
         mpi: "mpich"

The tables below describe the configuration options that are currently supported:

.. list-table:: General configuration options for the ``container`` section of ``spack.yaml``
   :header-rows: 1

   * - Option Name
     - Description
     - Allowed Values
     - Required
   * - ``format``
     - The format of the recipe
     - ``docker`` or ``singularity``
     - Yes
   * - ``base:image``
     - Base image for ``final`` stage
     - See :ref:`containers-supported-os`
     - Yes
   * - ``base:spack``
     - Version of Spack
     - Valid tags for ``base:image``
     - Yes
   * - ``strip``
     - Whether to strip binaries
     - ``true`` (default) or ``false``
     - No
   * - ``os_packages``
     - System packages to be installed
     - Valid packages for the ``final`` OS
     - No
   * - ``extra_instructions:build``
     - Extra instructions (e.g. ``RUN``, ``COPY``, etc.) at the end of the ``build`` stage
     - Anything understood by the current ``format``
     - No
   * - ``extra_instructions:final``
     - Extra instructions (e.g. ``RUN``, ``COPY``, etc.) at the end of the ``final`` stage
     - Anything understood by the current ``format``
     - No
   * - ``labels``
     - Labels to tag the image
     - Pairs of key-value strings
     - No

.. list-table:: Configuration options specific to Singularity
   :header-rows: 1

   * - Option Name
     - Description
     - Allowed Values
     - Required
   * - ``singularity:runscript``
     - Content of ``%runscript``
     - Any valid script
     - No
   * - ``singularity:startscript``
     - Content of ``%startscript``
     - Any valid script
     - No
   * - ``singularity:test``
     - Content of ``%test``
     - Any valid script
     - No
   * - ``singularity:help``
     - Description of the image
     - Description string
     - No

Once the Environment is properly configured, a recipe for a container
image can be printed to standard output by issuing the following
command from the directory where the ``spack.yaml`` resides:

.. code-block:: console

   $ spack containerize

The example ``spack.yaml`` above would produce, for instance, the
following ``Dockerfile``:

.. code-block:: docker

   # Build stage with Spack pre-installed and ready to be used
   FROM spack/centos7:latest as builder

   # What we want to install and how we want to install it
   # is specified in a manifest file (spack.yaml)
   RUN mkdir /opt/spack-environment \
   &&  (echo "spack:" \
   &&   echo "  specs:" \
   &&   echo "  - gromacs+mpi" \
   &&   echo "  - mpich" \
   &&   echo "  concretization: together" \
   &&   echo "  config:" \
   &&   echo "    install_tree: /opt/software" \
   &&   echo "  view: /opt/view") > /opt/spack-environment/spack.yaml

   # Install the software, remove unnecessary deps
   RUN cd /opt/spack-environment && spack install && spack gc -y

   # Strip all the binaries
   RUN find -L /opt/view/* -type f -exec readlink -f '{}' \; | \
       xargs file -i | \
       grep 'charset=binary' | \
       grep 'x-executable\|x-archive\|x-sharedlib' | \
       awk -F: '{print $1}' | xargs strip -s

   # Modifications to the environment that are necessary to run
   RUN cd /opt/spack-environment && \
       spack env activate --sh -d . >> /etc/profile.d/z10_spack_environment.sh


   # Bare OS image to run the installed executables
   FROM centos:7

   COPY --from=builder /opt/spack-environment /opt/spack-environment
   COPY --from=builder /opt/software /opt/software
   COPY --from=builder /opt/view /opt/view
   COPY --from=builder /etc/profile.d/z10_spack_environment.sh /etc/profile.d/z10_spack_environment.sh

   RUN yum update -y && yum install -y epel-release && yum update -y \
       && yum install -y libgomp \
       && rm -rf /var/cache/yum && yum clean all

   RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ \[$(tput sgr0)\]"' >> ~/.bashrc


   LABEL "app"="gromacs"
   LABEL "mpi"="mpich"

   ENTRYPOINT ["/bin/bash", "--rcfile", "/etc/profile", "-l"]

.. note::
   Spack can also produce Singularity definition files to build the image. The
   minimum version of Singularity required to build a SIF (Singularity Image Format)
   from them is ``3.5.3``.

BIN
lib/spack/docs/dockerhub_spack.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 88 KiB
@@ -49,6 +49,8 @@ Spack uses a "manifest and lock" model similar to `Bundler gemfiles
managers. The user input file is named ``spack.yaml`` and the lock
file is named ``spack.lock``.

.. _environments-using:

------------------
Using Environments
------------------
@@ -382,11 +384,12 @@ the Environment.
Loading
^^^^^^^

Once an environment has been installed, the following creates a load script for it:
Once an environment has been installed, the following creates a load
script for it:

.. code-block:: console

   $ spack env myenv loads -r
   $ spack env loads -r

This creates a file called ``loads`` in the environment directory.
Sourcing that file in Bash will make the environment available to the
@@ -66,6 +66,7 @@ or refer to the full manual below.
   config_yaml
   build_settings
   environments
   containers
   mirrors
   module_file_support
   repositories
@@ -929,6 +929,9 @@ Git fetching supports the following parameters to ``version``:
* ``tag``: Name of a tag to fetch.
* ``commit``: SHA hash (or prefix) of a commit to fetch.
* ``submodules``: Also fetch submodules recursively when checking out this repository.
* ``submodules_delete``: A list of submodules to forcibly delete from the repository
  after fetching. Useful if a version in the repository has submodules that
  have disappeared or are no longer accessible.
* ``get_full_repo``: Ensure the full git history is checked out with all remote
  branch information. Normally (``get_full_repo=False``, the default), the git
  option ``--depth 1`` will be used if the version of git and the specified
@@ -1989,6 +1992,28 @@ inject the dependency's ``prefix/lib`` directory, but the package needs to
be in ``PATH`` and ``PYTHONPATH`` during the build process and later when
a user wants to run the package.

^^^^^^^^^^^^^^^^^^^^^^^^
Conditional dependencies
^^^^^^^^^^^^^^^^^^^^^^^^

You may have a package that only requires a dependency under certain
conditions. For example, you may have a package that has optional MPI support;
MPI is only a dependency when you want to enable MPI support for the
package. In that case, you could say something like:

.. code-block:: python

   variant('mpi', default=False)
   depends_on('mpi', when='+mpi')

``when`` can include constraints on the variant, version, compiler, etc. and
the :mod:`syntax<spack.spec>` is the same as for Specs written on the command
line.

If a dependency/feature of a package isn't typically used, you can save time
by making it conditional (since Spack will not build the dependency unless it
is required for the Spec).

.. _dependency_dependency_patching:

^^^^^^^^^^^^^^^^^^^
@@ -1095,6 +1095,248 @@ or filesystem views. However, it has some drawbacks:
   integrate Spack explicitly in their workflow. Not all users are
   willing to do this.

-------------------------------------
Using Spack to Replace Homebrew/Conda
-------------------------------------

Spack is an incredibly powerful package manager, designed for supercomputers
where users have diverse installation needs. But Spack can also be used to
handle simple single-user installations on your laptop. Most macOS users are
already familiar with package managers like Homebrew and Conda, where all
installed packages are symlinked to a single central location like ``/usr/local``.
In this section, we will show you how to emulate the behavior of Homebrew/Conda
using :ref:`environments`!
^^^^^
Setup
^^^^^

First, let's create a new environment. We'll assume that Spack is already set up
correctly, and that you've already sourced the setup script for your shell.
To create a new environment, simply run:

.. code-block:: console

   $ spack env create myenv
   ==> Updating view at /Users/me/spack/var/spack/environments/myenv/.spack-env/view
   ==> Created environment 'myenv' in /Users/me/spack/var/spack/environments/myenv
   $ spack env activate myenv

Here, *myenv* can be anything you want to name your environment. Next, we can add
a list of packages we would like to install into our environment. Let's say we
want a newer version of Bash than the one that comes with macOS, and we want a
few Python libraries. We can run:

.. code-block:: console

   $ spack add bash
   ==> Adding bash to environment myenv
   ==> Updating view at /Users/me/spack/var/spack/environments/myenv/.spack-env/view
   $ spack add python@3:
   ==> Adding python@3: to environment myenv
   ==> Updating view at /Users/me/spack/var/spack/environments/myenv/.spack-env/view
   $ spack add py-numpy py-scipy py-matplotlib
   ==> Adding py-numpy to environment myenv
   ==> Adding py-scipy to environment myenv
   ==> Adding py-matplotlib to environment myenv
   ==> Updating view at /Users/me/spack/var/spack/environments/myenv/.spack-env/view

Each package can be listed on a separate line, or combined into a single line.
Notice that we're explicitly asking for Python 3 here. You can use any spec
you would normally use on the command line with other Spack commands.

Next, we want to manually configure a couple of things. In the ``myenv``
directory, we can find the ``spack.yaml`` that actually defines our environment.

.. code-block:: console

   $ vim ~/spack/var/spack/environments/myenv/spack.yaml

.. code-block:: yaml

   # This is a Spack Environment file.
   #
   # It describes a set of packages to be installed, along with
   # configuration settings.
   spack:
     # add package specs to the `specs` list
     specs: [bash, 'python@3:', py-numpy, py-scipy, py-matplotlib]
     view:
       default:
         root: /Users/me/spack/var/spack/environments/myenv/.spack-env/view
         projections: {}
     config: {}
     mirrors: {}
     modules:
       enable: []
     packages: {}
     repos: []
     upstreams: {}
     definitions: []
     concretization: separately

You can see the packages we added earlier in the ``specs:`` section. If you
ever want to add more packages, you can either use ``spack add`` or manually
edit this file.

We also need to change the ``concretization:`` option. By default, Spack
concretizes each spec *separately*, allowing multiple versions of the same
package to coexist. Since we want a single consistent environment, we want to
concretize all of the specs *together*.

Here is what your ``spack.yaml`` looks like with these new settings, and with
some of the sections we don't plan on using removed:

.. code-block:: diff

   spack:
   - specs: [bash, 'python@3:', py-numpy, py-scipy, py-matplotlib]
   + specs:
   + - bash
   + - 'python@3:'
   + - py-numpy
   + - py-scipy
   + - py-matplotlib
   - view:
   -   default:
   -     root: /Users/me/spack/var/spack/environments/myenv/.spack-env/view
   -     projections: {}
   + view: /Users/me/spack/var/spack/environments/myenv/.spack-env/view
   - config: {}
   - mirrors: {}
   - modules:
   -   enable: []
   - packages: {}
   - repos: []
   - upstreams: {}
   - definitions: []
   - concretization: separately
   + concretization: together

""""""""""""""""
|
||||
Symlink location
|
||||
""""""""""""""""
|
||||
|
||||
In the ``spack.yaml`` file above, you'll notice that by default, Spack symlinks
|
||||
all installations to ``/Users/me/spack/var/spack/environments/myenv/.spack-env/view``.
|
||||
You can actually change this to any directory you want. For example, Homebrew
|
||||
uses ``/usr/local``, while Conda uses ``/Users/me/anaconda``. In order to access
|
||||
files in these locations, you need to update ``PATH`` and other environment variables
|
||||
to point to them. Activating the Spack environment does this automatically, but
|
||||
you can also manually set them in your ``.bashrc``.
|
||||
|
||||
.. warning::
|
||||
|
||||
There are several reasons why you shouldn't use ``/usr/local``:
|
||||
|
||||
1. If you are on macOS 10.11+ (El Capitan and newer), Apple makes it hard
|
||||
for you. You may notice permissions issues on ``/usr/local`` due to their
|
||||
`System Integrity Protection <https://support.apple.com/en-us/HT204899>`_.
|
||||
By default, users don't have permissions to install anything in ``/usr/local``,
|
||||
and you can't even change this using ``sudo chown`` or ``sudo chmod``.
|
||||
2. Other package managers like Homebrew will try to install things to the
|
||||
same directory. If you plan on using Homebrew in conjunction with Spack,
|
||||
don't symlink things to ``/usr/local``.
|
||||
3. If you are on a shared workstation, or don't have sudo priveleges, you
|
||||
can't do this.
|
||||
|
||||
If you still want to do this anyway, there are several ways around SIP.
You could disable SIP by booting into recovery mode and running
``csrutil disable``, but this is not recommended, as it can open up your OS
to security vulnerabilities. Another technique is to run ``spack concretize``
and ``spack install`` using ``sudo``. This is also not recommended.

The safest way I've found is to create your installation directories using
sudo, then change ownership back to the user like so:

.. code-block:: bash

   for directory in .spack bin contrib include lib man share
   do
       sudo mkdir -p /usr/local/$directory
       sudo chown $(id -un):$(id -gn) /usr/local/$directory
   done

Depending on the packages you install in your environment, the exact list of
directories you need to create may vary. You may also find some packages
like Java libraries that install a single file to the installation prefix
instead of in a subdirectory. In this case, the action is the same, just replace
``mkdir -p`` with ``touch`` in the for-loop above.

But again, it's safer just to use the default symlink location.


^^^^^^^^^^^^
Installation
^^^^^^^^^^^^

To actually concretize the environment, run:

.. code-block:: console

   $ spack concretize

This will tell you which packages, if any, are already installed, and alert you
to any conflicting specs.

To actually install these packages and symlink them to your ``view:``
directory, simply run:

.. code-block:: console

   $ spack install

Now, when you type ``which python3``, it should find the one you just installed.

In order to change the default shell to our newer Bash installation, we first
need to add it to the list of acceptable shells. Run:

.. code-block:: console

   $ sudo vim /etc/shells

and add the absolute path to your Bash executable. Then run:

.. code-block:: console

   $ chsh -s /path/to/bash

Now, when you log out and log back in, ``echo $SHELL`` should point to the
newer version of Bash.

^^^^^^^^^^^^^^^^^^^^^^^^^^^
Updating Installed Packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^

Let's say you upgraded to a new version of macOS, or a new version of Python
was released, and you want to rebuild your entire software stack. To do this,
simply run the following commands:

.. code-block:: console

   $ spack env activate myenv
   $ spack concretize --force
   $ spack install

The ``--force`` flag tells Spack to overwrite its previous concretization
decisions, allowing you to choose a new version of Python. If any of the
packages, like Bash, are already installed, ``spack install`` won't re-install
them; it will keep the symlinks in place.

^^^^^^^^^^^^^^
Uninstallation
^^^^^^^^^^^^^^

If you decide that Spack isn't right for you, uninstallation is simple.
Just run:

.. code-block:: console

   $ spack env activate myenv
   $ spack uninstall --all

This will uninstall all packages in your environment and remove the symlinks.

------------------------
Using Spack on Travis-CI
------------------------
@@ -1298,6 +1298,20 @@
        "ppc64"
      ]
    },
    "vsx": {
      "reason": "VSX altivec extensions are supported by PowerISA from v2.06 (Power7+), but might not be listed in features",
      "families": [
        "ppc64le",
        "ppc64"
      ]
    },
    "fma": {
      "reason": "FMA has been supported by PowerISA since Power1, but might not be listed in features",
      "families": [
        "ppc64le",
        "ppc64"
      ]
    },
    "sse4.1": {
      "reason": "permits to refer to sse4_1 also as sse4.1",
      "any_of": [
@@ -201,7 +201,6 @@ def groupid_to_group(x):
            output_file.writelines(input_file.readlines())

    except BaseException:
        os.remove(tmp_filename)
        # clean up the original file on failure.
        shutil.move(backup_filename, filename)
        raise
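The cleanup path in this hunk follows a common safe-rewrite pattern: write to a temporary file, and on any failure delete the partial result and restore the backup. A minimal self-contained sketch of the pattern (illustrative names, not Spack's actual helper):

```python
import os
import shutil

def safe_rewrite(filename, transform):
    """Rewrite `filename` via `transform(text) -> text`, restoring the
    original contents if anything goes wrong."""
    backup = filename + '.bak'
    tmp = filename + '.tmp'
    shutil.copy(filename, backup)
    try:
        with open(filename) as src, open(tmp, 'w') as dst:
            dst.write(transform(src.read()))
        shutil.move(tmp, filename)
        os.remove(backup)          # success: the backup is no longer needed
    except BaseException:
        if os.path.exists(tmp):
            os.remove(tmp)         # drop the partial result
        shutil.move(backup, filename)  # restore the original file
        raise
```

Catching ``BaseException`` (rather than ``Exception``) matters here: it restores the file even on ``KeyboardInterrupt``, which is exactly why the Spack code above does the same.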
@@ -5,7 +5,7 @@
#: major, minor, patch version for Spack, in a tuple
spack_version_info = (0, 13, 3)
spack_version_info = (0, 13, 4)

#: String containing Spack version joined with .'s
spack_version = '.'.join(str(v) for v in spack_version_info)
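The dotted version string is derived from the tuple exactly as the hunk shows; evaluated in isolation:

```python
# Same derivation as in the diff: version tuple -> dotted string.
spack_version_info = (0, 13, 4)
spack_version = '.'.join(str(v) for v in spack_version_info)
print(spack_version)  # → 0.13.4
```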
@@ -33,6 +33,7 @@
from spack.spec import Spec
from spack.stage import Stage
from spack.util.gpg import Gpg
import spack.architecture as architecture

_build_cache_relative_path = 'build_cache'
@@ -659,19 +660,85 @@ def extract_tarball(spec, filename, allow_root=False, unsigned=False,
    shutil.rmtree(tmpdir)


#: Internal cache for get_specs
_cached_specs = None
# Internal cache for downloaded specs
_cached_specs = set()


def get_specs(force=False, use_arch=False):
def try_download_specs(urls=None, force=False):
    '''
    Try to download the urls and cache them
    '''
    global _cached_specs
    if urls is None:
        return {}
    for link in urls:
        with Stage(link, name="build_cache", keep=True) as stage:
            if force and os.path.exists(stage.save_filename):
                os.remove(stage.save_filename)
            if not os.path.exists(stage.save_filename):
                try:
                    stage.fetch()
                except fs.FetchError:
                    continue
            with open(stage.save_filename, 'r') as f:
                # read the spec from the build cache file. All specs
                # in build caches are concrete (as they are built) so
                # we need to mark this spec concrete on read-in.
                spec = Spec.from_yaml(f)
                spec._mark_concrete()
                _cached_specs.add(spec)

    return _cached_specs
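Switching ``_cached_specs`` from ``None`` to a ``set()`` lets repeated download calls accumulate results while duplicates collapse automatically. The pattern in isolation (``fetch`` is an illustrative stand-in for the real staged download):

```python
_cached = set()

def fetch_all(urls, fetch):
    """Accumulate results across calls; duplicates collapse via the set."""
    for url in urls:
        try:
            _cached.add(fetch(url))
        except OSError:
            continue  # skip unreachable urls, keep what we already have
    return _cached

first = fetch_all(['a', 'b'], str.upper)
second = fetch_all(['b', 'c'], str.upper)
print(sorted(second))  # → ['A', 'B', 'C']
```

Both calls return the same set object, so callers always see the union of everything fetched so far, matching how ``try_download_specs`` above keeps returning the growing ``_cached_specs``.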
def get_spec(spec=None, force=False):
    """
    Check if spec.yaml exists on mirrors and return it if it does
    """
    global _cached_specs
    urls = set()
    if spec is None:
        return {}
    specfile_name = tarball_name(spec, '.spec.yaml')

    if not spack.mirror.MirrorCollection():
        tty.debug("No Spack mirrors are currently configured")
        return {}

    if spec in _cached_specs:
        return _cached_specs

    for mirror in spack.mirror.MirrorCollection().values():
        fetch_url_build_cache = url_util.join(
            mirror.fetch_url, _build_cache_relative_path)

        mirror_dir = url_util.local_file_path(fetch_url_build_cache)
        if mirror_dir:
            tty.msg("Finding buildcaches in %s" % mirror_dir)
            link = url_util.join(fetch_url_build_cache, specfile_name)
            urls.add(link)

        else:
            tty.msg("Finding buildcaches at %s" %
                    url_util.format(fetch_url_build_cache))
            link = url_util.join(fetch_url_build_cache, specfile_name)
            urls.add(link)

    return try_download_specs(urls=urls, force=force)
def get_specs(force=False, allarch=False):
    """
    Get spec.yaml's for build caches available on mirror
    """
    global _cached_specs
    arch = architecture.Arch(architecture.platform(),
                             'default_os', 'default_target')
    arch_pattern = ('([^-]*-[^-]*-[^-]*)')
    if not allarch:
        arch_pattern = '(%s-%s-[^-]*)' % (arch.platform, arch.os)

    if _cached_specs:
        tty.debug("Using previously-retrieved specs")
        return _cached_specs
    regex_pattern = '%s(.*)(spec.yaml$)' % (arch_pattern)
    arch_re = re.compile(regex_pattern)

    if not spack.mirror.MirrorCollection():
        tty.debug("No Spack mirrors are currently configured")
@@ -688,14 +755,9 @@ def get_specs(force=False, use_arch=False):
        if os.path.exists(mirror_dir):
            files = os.listdir(mirror_dir)
            for file in files:
                if re.search('spec.yaml', file):
                    m = arch_re.search(file)
                    if m:
                        link = url_util.join(fetch_url_build_cache, file)
                    if use_arch and re.search('%s-%s' %
                                              (spack.architecture.platform,
                                               spack.architecture.os),
                                              file):
                        urls.add(link)
                    else:
                        urls.add(link)
        else:
            tty.msg("Finding buildcaches at %s" %
@@ -703,34 +765,11 @@ def get_specs(force=False, use_arch=False):
            p, links = web_util.spider(
                url_util.join(fetch_url_build_cache, 'index.html'))
            for link in links:
                if re.search("spec.yaml", link):
                    if use_arch and re.search('%s-%s' %
                                              (spack.architecture.platform,
                                               spack.architecture.os),
                                              link):
                        urls.add(link)
                    else:
                        m = arch_re.search(link)
                        if m:
                            urls.add(link)

    _cached_specs = []
    for link in urls:
        with Stage(link, name="build_cache", keep=True) as stage:
            if force and os.path.exists(stage.save_filename):
                os.remove(stage.save_filename)
            if not os.path.exists(stage.save_filename):
                try:
                    stage.fetch()
                except fs.FetchError:
                    continue
            with open(stage.save_filename, 'r') as f:
                # read the spec from the build cache file. All specs
                # in build caches are concrete (as they are built) so
                # we need to mark this spec concrete on read-in.
                spec = Spec.from_yaml(f)
                spec._mark_concrete()
                _cached_specs.append(spec)

    return _cached_specs
    return try_download_specs(urls=urls, force=force)


def get_keys(install=False, trust=False, force=False):
@@ -25,6 +25,7 @@ def setup_parser(subparser):
def add(parser, args):
    env = ev.get_env(args, 'add', required=True)

    with env.write_transaction():
        for spec in spack.cmd.parse_specs(args.specs):
            if not env.add(spec, args.list_name):
                tty.msg("Package {0} was already added to {1}"

@@ -87,6 +87,9 @@ def setup_parser(subparser):
                           help='show variants in output (can be long)')
    listcache.add_argument('-f', '--force', action='store_true',
                           help="force new download of specs")
    listcache.add_argument('-a', '--allarch', action='store_true',
                           help="list specs for all available architectures" +
                                " instead of default platform and OS")
    arguments.add_common_arguments(listcache, ['specs'])
    listcache.set_defaults(func=listspecs)

@@ -263,7 +266,8 @@ def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False):
    # List of specs that match expressions given via command line
    specs_from_cli = []
    has_errors = False
    specs = bindist.get_specs(force)
    allarch = False
    specs = bindist.get_specs(force, allarch)
    for pkg in pkgs:
        matches = []
        tty.msg("buildcache spec(s) matching %s \n" % pkg)
@@ -415,7 +419,7 @@ def install_tarball(spec, args):

def listspecs(args):
    """list binary packages available from mirrors"""
    specs = bindist.get_specs(args.force)
    specs = bindist.get_specs(args.force, args.allarch)
    if args.specs:
        constraints = set(args.specs)
        specs = [s for s in specs if any(s.satisfies(c) for c in constraints)]

@@ -18,6 +18,7 @@ def setup_parser(subparser):

def concretize(parser, args):
    env = ev.get_env(args, 'concretize', required=True)
    with env.write_transaction():
        concretized_specs = env.concretize(force=args.force)
        ev.display_specs(concretized_specs)
        env.write()

25
lib/spack/spack/cmd/containerize.py
Normal file
@@ -0,0 +1,25 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import spack.container

description = ("creates recipes to build images for different"
               " container runtimes")
section = "container"
level = "long"


def containerize(parser, args):
    config_dir = args.env_dir or os.getcwd()
    config_file = os.path.abspath(os.path.join(config_dir, 'spack.yaml'))
    if not os.path.exists(config_file):
        msg = 'file not found: {0}'
        raise ValueError(msg.format(config_file))

    config = spack.container.validate(config_file)

    recipe = spack.container.recipe(config)
    print(recipe)
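The new command reads the environment's `spack.yaml` and expects a `container` section. A minimal manifest that would drive it might look like the following sketch; the spec list and `os_packages` entries are illustrative, while the `format`, `base`, and `strip` keys correspond to what the writers in this diff consume:

```yaml
# Hypothetical spack.yaml used as input to `spack containerize`.
spack:
  specs:
  - zlib                      # illustrative spec, not part of this diff
  container:
    format: docker            # or "singularity"
    base:
      image: ubuntu:18.04     # must be a key in images.json
      spack: develop          # must match a tag in "build_tags"
    strip: true
    os_packages:
    - libgomp1                # illustrative run-time system package
```

Running `spack containerize` in the directory containing this file would print the generated recipe to stdout.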
@@ -44,7 +44,8 @@ def update_kwargs_from_args(args, kwargs):
        'upstream': args.upstream,
        'cache_only': args.cache_only,
        'explicit': True,  # Always true for install command
        'stop_at': args.until
        'stop_at': args.until,
        'unsigned': args.unsigned,
    })

    kwargs.update({
@@ -100,6 +101,10 @@ def setup_parser(subparser):
        '--cache-only', action='store_true', dest='cache_only', default=False,
        help="only install package from binary mirrors")

    subparser.add_argument(
        '--no-check-signature', action='store_true',
        dest='unsigned', default=False,
        help="do not check signatures of binary packages")
    subparser.add_argument(
        '--show-log-on-error', action='store_true',
        help="print full build log to stderr if build fails")
@@ -237,8 +242,13 @@ def install_spec(cli_args, kwargs, abstract_spec, spec):
        env = ev.get_env(cli_args, 'install')

        if env:
            env.install(abstract_spec, spec, **kwargs)
            env.write()
            with env.write_transaction():
                concrete = env.concretize_and_add(
                    abstract_spec, spec)
                env.write(regenerate_views=False)
            env._install(concrete, **kwargs)
            with env.write_transaction():
                env.regenerate_views()
        else:
            spec.package.do_install(**kwargs)
            spack.config.set('config:active_tree', '~/.spack/opt/spack',
@@ -301,6 +311,7 @@ def install(parser, args, **kwargs):
    env = ev.get_env(args, 'install')
    if env:
        if not args.only_concrete:
            with env.write_transaction():
                concretized_specs = env.concretize()
                ev.display_specs(concretized_specs)

@@ -310,6 +321,9 @@ def install(parser, args, **kwargs):

        tty.msg("Installing environment %s" % env.name)
        env.install_all(args)
        with env.write_transaction():
            # It is not strictly required to synchronize view regeneration
            # but doing so can prevent redundant work in the filesystem.
            env.regenerate_views()
        return
    else:

@@ -12,6 +12,7 @@
import spack.environment as ev
import spack.util.environment
import spack.user_environment as uenv
import spack.store

description = "add package to the user environment"
section = "user environment"
@@ -63,10 +64,12 @@ def load(parser, args):
        tty.msg(*msg)
        return 1

    with spack.store.db.read_transaction():
        if 'dependencies' in args.things_to_load:
            include_roots = 'package' in args.things_to_load
            specs = [dep for spec in specs
                     for dep in spec.traverse(root=include_roots, order='post')]
                     for dep in
                     spec.traverse(root=include_roots, order='post')]

        env_mod = spack.util.environment.EnvironmentModifications()
        for spec in specs:

@@ -31,6 +31,7 @@ def setup_parser(subparser):
def remove(parser, args):
    env = ev.get_env(args, 'remove', required=True)

    with env.write_transaction():
        if args.all:
            env.clear()
        else:

@@ -8,6 +8,7 @@
import argparse
import copy
import sys
import itertools

import spack.cmd
import spack.environment as ev
@@ -253,9 +254,6 @@ def do_uninstall(env, specs, force):
        # want to uninstall.
        spack.package.Package.uninstall_by_spec(item, force=True)

        if env:
            _remove_from_env(item, env)

    # A package is ready to be uninstalled when nothing else references it,
    # unless we are requested to force uninstall it.
    is_ready = lambda x: not spack.store.db.query_by_spec_hash(x)[1].ref_count
@@ -375,9 +373,13 @@ def uninstall_specs(args, specs):
    if not args.yes_to_all:
        confirm_removal(anything_to_do)

    # just force-remove things in the remove list
    for spec in remove_list:
        if env:
            # Remove all the specs that are supposed to be uninstalled or just
            # removed.
            with env.write_transaction():
                for spec in itertools.chain(remove_list, uninstall_list):
                    _remove_from_env(spec, env)
                env.write()

    # Uninstall everything on the list
    do_uninstall(env, uninstall_list, args.force)

@@ -61,3 +61,7 @@ def c11_flag(self):
    @property
    def pic_flag(self):
        return "-KPIC"

    def setup_custom_environment(self, pkg, env):
        env.append_flags('fcc_ENV', '-Nclang')
        env.append_flags('FCC_ENV', '-Nclang')

81
lib/spack/spack/container/__init__.py
Normal file
@@ -0,0 +1,81 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Package that provides functions and classes to
generate container recipes from a Spack environment
"""
import warnings

import spack.environment
import spack.schema.env as env
import spack.util.spack_yaml as syaml
from .writers import recipe

__all__ = ['validate', 'recipe']


def validate(configuration_file):
    """Validate a Spack environment YAML file that is being used to generate a
    recipe for a container.

    Since a few attributes of the configuration must have specific values for
    the container recipe, this function returns a sanitized copy of the
    configuration in the input file. If any modification is needed, a warning
    will be issued.

    Args:
        configuration_file (str): path to the Spack environment YAML file

    Returns:
        A sanitized copy of the configuration stored in the input file
    """
    import jsonschema
    with open(configuration_file) as f:
        config = syaml.load(f)

    # Ensure we have a "container" attribute with sensible defaults set
    env_dict = spack.environment.config_dict(config)
    env_dict.setdefault('container', {
        'format': 'docker',
        'base': {'image': 'ubuntu:18.04', 'spack': 'develop'}
    })
    env_dict['container'].setdefault('format', 'docker')
    env_dict['container'].setdefault(
        'base', {'image': 'ubuntu:18.04', 'spack': 'develop'}
    )

    # Remove attributes that are not needed / allowed in the
    # container recipe
    for subsection in ('cdash', 'gitlab_ci', 'modules'):
        if subsection in env_dict:
            msg = ('the subsection "{0}" in "{1}" is not used when generating'
                   ' container recipes and will be discarded')
            warnings.warn(msg.format(subsection, configuration_file))
            env_dict.pop(subsection)

    # Set the default value of the concretization strategy to "together" and
    # warn if the user explicitly set another value
    env_dict.setdefault('concretization', 'together')
    if env_dict['concretization'] != 'together':
        msg = ('the "concretization" attribute of the environment is set '
               'to "{0}" [the advised value is instead "together"]')
        warnings.warn(msg.format(env_dict['concretization']))

    # Check if the install tree was explicitly set to a custom value and warn
    # that it will be overridden
    environment_config = env_dict.get('config', {})
    if environment_config.get('install_tree', None):
        msg = ('the "config:install_tree" attribute has been set explicitly '
               'and will be overridden in the container image')
        warnings.warn(msg)

    # Likewise for the view
    environment_view = env_dict.get('view', None)
    if environment_view:
        msg = ('the "view" attribute has been set explicitly '
               'and will be overridden in the container image')
        warnings.warn(msg)

    jsonschema.validate(config, schema=env.schema)
    return config
50
lib/spack/spack/container/images.json
Normal file
@@ -0,0 +1,50 @@
{
  "ubuntu:18.04": {
    "update": "apt-get -yqq update && apt-get -yqq upgrade",
    "install": "apt-get -yqq install",
    "clean": "rm -rf /var/lib/apt/lists/*",
    "environment": [],
    "build": "spack/ubuntu-bionic",
    "build_tags": {
      "develop": "latest",
      "0.14": "0.14",
      "0.14.0": "0.14.0"
    }
  },
  "ubuntu:16.04": {
    "update": "apt-get -yqq update && apt-get -yqq upgrade",
    "install": "apt-get -yqq install",
    "clean": "rm -rf /var/lib/apt/lists/*",
    "environment": [],
    "build": "spack/ubuntu-xenial",
    "build_tags": {
      "develop": "latest",
      "0.14": "0.14",
      "0.14.0": "0.14.0"
    }
  },
  "centos:7": {
    "update": "yum update -y && yum install -y epel-release && yum update -y",
    "install": "yum install -y",
    "clean": "rm -rf /var/cache/yum && yum clean all",
    "environment": [],
    "build": "spack/centos7",
    "build_tags": {
      "develop": "latest",
      "0.14": "0.14",
      "0.14.0": "0.14.0"
    }
  },
  "centos:6": {
    "update": "yum update -y && yum install -y epel-release && yum update -y",
    "install": "yum install -y",
    "clean": "rm -rf /var/cache/yum && yum clean all",
    "environment": [],
    "build": "spack/centos6",
    "build_tags": {
      "develop": "latest",
      "0.14": "0.14",
      "0.14.0": "0.14.0"
    }
  }
}
72
lib/spack/spack/container/images.py
Normal file
@@ -0,0 +1,72 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Manages the details on the images used in the build and the run stage."""
import json
import os.path

#: Global variable used to cache in memory the content of images.json
_data = None


def data():
    """Returns a dictionary with the static data on the images.

    The dictionary is read from a JSON file lazily the first time
    this function is called.
    """
    global _data
    if not _data:
        json_dir = os.path.abspath(os.path.dirname(__file__))
        json_file = os.path.join(json_dir, 'images.json')
        with open(json_file) as f:
            _data = json.load(f)
    return _data


def build_info(image, spack_version):
    """Returns the name of the build image and its tag.

    Args:
        image (str): image to be used at run-time. Should be of the form
            <image_name>:<image_tag> e.g. "ubuntu:18.04"
        spack_version (str): version of Spack that we want to use to build

    Returns:
        A tuple with (image_name, image_tag) for the build image
    """
    # Don't handle error here, as a wrong image should have been
    # caught by the JSON schema
    image_data = data()[image]
    build_image = image_data['build']

    # Try to check if we have a tag for this Spack version
    try:
        build_tag = image_data['build_tags'][spack_version]
    except KeyError:
        msg = ('the image "{0}" has no tag for Spack version "{1}" '
               '[valid versions are {2}]')
        msg = msg.format(build_image, spack_version,
                         ', '.join(image_data['build_tags'].keys()))
        raise ValueError(msg)

    return build_image, build_tag


def package_info(image):
    """Returns the commands used to update system repositories, install
    system packages and clean afterwards.

    Args:
        image (str): image to be used at run-time. Should be of the form
            <image_name>:<image_tag> e.g. "ubuntu:18.04"

    Returns:
        A tuple of (update, install, clean) commands.
    """
    image_data = data()[image]
    update = image_data['update']
    install = image_data['install']
    clean = image_data['clean']
    return update, install, clean
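The lookup performed by `build_info` can be sketched standalone against a small inline excerpt of the images.json data above (the dictionary below is an excerpt for illustration, not the module itself):

```python
# Minimal sketch of the build_info lookup, using an excerpt of images.json.
images = {
    "ubuntu:18.04": {
        "build": "spack/ubuntu-bionic",
        "build_tags": {"develop": "latest", "0.14": "0.14"},
    }
}


def build_info(image, spack_version):
    """Map a run-time image and Spack version to (build_image, build_tag)."""
    image_data = images[image]
    build_image = image_data['build']
    try:
        build_tag = image_data['build_tags'][spack_version]
    except KeyError:
        valid = ', '.join(image_data['build_tags'].keys())
        raise ValueError('the image "{0}" has no tag for Spack version "{1}" '
                         '[valid versions are {2}]'
                         .format(build_image, spack_version, valid))
    return build_image, build_tag


print(build_info('ubuntu:18.04', 'develop'))  # ('spack/ubuntu-bionic', 'latest')
```

An unknown Spack version raises `ValueError` listing the tags that do exist, mirroring the behavior of the module.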
154
lib/spack/spack/container/writers/__init__.py
Normal file
@@ -0,0 +1,154 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Writers for different kind of recipes and related
convenience functions.
"""
import collections
import copy

import spack.environment
import spack.schema.env
import spack.tengine as tengine
import spack.util.spack_yaml as syaml

from spack.container.images import build_info, package_info

#: Caches all the writers that are currently supported
_writer_factory = {}


def writer(name):
    """Decorator to register a factory for a recipe writer.

    Each factory should take a configuration dictionary and return a
    properly configured writer that, when called, prints the
    corresponding recipe.
    """
    def _decorator(factory):
        _writer_factory[name] = factory
        return factory
    return _decorator


def create(configuration):
    """Returns a writer that conforms to the configuration passed as input.

    Args:
        configuration: how to generate the current recipe
    """
    name = spack.environment.config_dict(configuration)['container']['format']
    return _writer_factory[name](configuration)


def recipe(configuration):
    """Returns a recipe that conforms to the configuration passed as input.

    Args:
        configuration: how to generate the current recipe
    """
    return create(configuration)()


class PathContext(tengine.Context):
    """Generic context used to instantiate templates of recipes that
    install software in a common location and make it available
    directly via PATH.
    """
    def __init__(self, config):
        self.config = spack.environment.config_dict(config)
        self.container_config = self.config['container']

    @tengine.context_property
    def run(self):
        """Information related to the run image."""
        image = self.container_config['base']['image']
        Run = collections.namedtuple('Run', ['image'])
        return Run(image=image)

    @tengine.context_property
    def build(self):
        """Information related to the build image."""

        # Map the final image to the correct build image
        run_image = self.container_config['base']['image']
        spack_version = self.container_config['base']['spack']
        image, tag = build_info(run_image, spack_version)

        Build = collections.namedtuple('Build', ['image', 'tag'])
        return Build(image=image, tag=tag)

    @tengine.context_property
    def strip(self):
        """Whether or not to strip binaries in the image"""
        return self.container_config.get('strip', True)

    @tengine.context_property
    def paths(self):
        """Important paths in the image"""
        Paths = collections.namedtuple('Paths', [
            'environment', 'store', 'view'
        ])
        return Paths(
            environment='/opt/spack-environment',
            store='/opt/software',
            view='/opt/view'
        )

    @tengine.context_property
    def manifest(self):
        """The spack.yaml file that should be used in the image"""
        import jsonschema
        # Copy in the part of spack.yaml prescribed in the configuration file
        manifest = copy.deepcopy(self.config)
        manifest.pop('container')

        # Ensure that a few paths are where they need to be
        manifest.setdefault('config', syaml.syaml_dict())
        manifest['config']['install_tree'] = self.paths.store
        manifest['view'] = self.paths.view
        manifest = {'spack': manifest}

        # Validate the manifest file
        jsonschema.validate(manifest, schema=spack.schema.env.schema)

        return syaml.dump(manifest, default_flow_style=False).strip()

    @tengine.context_property
    def os_packages(self):
        """Additional system packages that are needed at run-time."""
        package_list = self.container_config.get('os_packages', None)
        if not package_list:
            return package_list

        image = self.container_config['base']['image']
        update, install, clean = package_info(image)
        Packages = collections.namedtuple(
            'Packages', ['update', 'install', 'list', 'clean']
        )
        return Packages(update=update, install=install,
                        list=package_list, clean=clean)

    @tengine.context_property
    def extra_instructions(self):
        Extras = collections.namedtuple('Extra', ['build', 'final'])
        extras = self.container_config.get('extra_instructions', {})
        build, final = extras.get('build', None), extras.get('final', None)
        return Extras(build=build, final=final)

    @tengine.context_property
    def labels(self):
        return self.container_config.get('labels', {})

    def __call__(self):
        """Returns the recipe as a string"""
        env = tengine.make_environment()
        t = env.get_template(self.template_name)
        return t.render(**self.to_dict())


# Import after function definition all the modules in this package,
# so that registration of writers will happen automatically
import spack.container.writers.singularity  # noqa
import spack.container.writers.docker  # noqa

30
lib/spack/spack/container/writers/docker.py
Normal file
@@ -0,0 +1,30 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.tengine as tengine

from . import writer, PathContext


@writer('docker')
class DockerContext(PathContext):
    """Context used to instantiate a Dockerfile"""
    #: Name of the template used for Dockerfiles
    template_name = 'container/Dockerfile'

    @tengine.context_property
    def manifest(self):
        manifest_str = super(DockerContext, self).manifest
        # Docker doesn't support HEREDOC so we need to resort to
        # a horrible echo trick to have the manifest in the Dockerfile
        echoed_lines = []
        for idx, line in enumerate(manifest_str.split('\n')):
            if idx == 0:
                echoed_lines.append('&& (echo "' + line + '" \\')
                continue
            echoed_lines.append('&& echo "' + line + '" \\')

        echoed_lines[-1] = echoed_lines[-1].replace(' \\', ')')

        return '\n'.join(echoed_lines)
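The "echo trick" above can be exercised on its own: each manifest line becomes an `echo` inside a parenthesized group with line continuations, and the trailing continuation on the last line is replaced by the closing parenthesis. A sketch of just the transformation, independent of the tengine templates:

```python
# Sketch of the Dockerfile "echo trick" applied to a small manifest string.
def echo_manifest(manifest_str):
    echoed_lines = []
    for idx, line in enumerate(manifest_str.split('\n')):
        if idx == 0:
            # Open the parenthesized group on the first line
            echoed_lines.append('&& (echo "' + line + '" \\')
            continue
        echoed_lines.append('&& echo "' + line + '" \\')

    # Close the group instead of continuing the last line
    echoed_lines[-1] = echoed_lines[-1].replace(' \\', ')')
    return '\n'.join(echoed_lines)


print(echo_manifest('spack:\n  specs: []'))
```

For the two-line input above this yields `&& (echo "spack:" \` followed by `&& echo "  specs: []")`, which can be pasted directly into a Dockerfile `RUN` chain.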
33
lib/spack/spack/container/writers/singularity.py
Normal file
@@ -0,0 +1,33 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.tengine as tengine
from . import writer, PathContext


@writer('singularity')
class SingularityContext(PathContext):
    """Context used to instantiate a Singularity definition file"""
    #: Name of the template used for Singularity definition files
    template_name = 'container/singularity.def'

    @property
    def singularity_config(self):
        return self.container_config.get('singularity', {})

    @tengine.context_property
    def runscript(self):
        return self.singularity_config.get('runscript', '')

    @tengine.context_property
    def startscript(self):
        return self.singularity_config.get('startscript', '')

    @tengine.context_property
    def test(self):
        return self.singularity_config.get('test', '')

    @tengine.context_property
    def help(self):
        return self.singularity_config.get('help', '')
@@ -1145,7 +1145,27 @@ def activated_extensions_for(self, extendee_spec, extensions_layout=None):
                continue
            # TODO: conditional way to do this instead of catching exceptions

    def get_by_hash_local(self, dag_hash, default=None, installed=any):
    def _get_by_hash_local(self, dag_hash, default=None, installed=any):
        # hash is a full hash and is in the data somewhere
        if dag_hash in self._data:
            rec = self._data[dag_hash]
            if rec.install_type_matches(installed):
                return [rec.spec]
            else:
                return default

        # check if hash is a prefix of some installed (or previously
        # installed) spec.
        matches = [record.spec for h, record in self._data.items()
                   if h.startswith(dag_hash) and
                   record.install_type_matches(installed)]
        if matches:
            return matches

        # nothing found
        return default

    def get_by_hash_local(self, *args, **kwargs):
        """Look up a spec in *this DB* by DAG hash, or by a DAG hash prefix.

        Arguments:
@@ -1169,24 +1189,7 @@ def get_by_hash_local(self, dag_hash, default=None, installed=any):

        """
        with self.read_transaction():
            # hash is a full hash and is in the data somewhere
            if dag_hash in self._data:
                rec = self._data[dag_hash]
                if rec.install_type_matches(installed):
                    return [rec.spec]
                else:
                    return default

            # check if hash is a prefix of some installed (or previously
            # installed) spec.
            matches = [record.spec for h, record in self._data.items()
                       if h.startswith(dag_hash) and
                       record.install_type_matches(installed)]
            if matches:
                return matches

            # nothing found
            return default
            return self._get_by_hash_local(*args, **kwargs)

    def get_by_hash(self, dag_hash, default=None, installed=any):
        """Look up a spec by DAG hash, or by a DAG hash prefix.

@@ -1211,9 +1214,14 @@ def get_by_hash(self, dag_hash, default=None, installed=any):
            (list): a list of specs matching the hash or hash prefix

        """
        search_path = [self] + self.upstream_dbs
        for db in search_path:
            spec = db.get_by_hash_local(

        spec = self.get_by_hash_local(
            dag_hash, default=default, installed=installed)
        if spec is not None:
            return spec

        for upstream_db in self.upstream_dbs:
            spec = upstream_db._get_by_hash_local(
                dag_hash, default=default, installed=installed)
            if spec is not None:
                return spec
@@ -1273,7 +1281,8 @@ def _query(
            if not (start_date < inst_date < end_date):
                continue

            if query_spec is any or rec.spec.satisfies(query_spec):
            if (query_spec is any or
                rec.spec.satisfies(query_spec, strict=True)):
                results.append(rec.spec)

        return results

@@ -36,6 +36,7 @@
from spack.spec import Spec
from spack.spec_list import SpecList, InvalidSpecConstraintError
from spack.variant import UnknownVariantError
import spack.util.lock as lk

#: environment variable used to indicate the active environment
spack_env_var = 'SPACK_ENV'
@@ -557,12 +558,18 @@ def __init__(self, path, init_file=None, with_view=None):
            path to the view.
        """
        self.path = os.path.abspath(path)

        self.txlock = lk.Lock(self._transaction_lock_path)

        # This attribute will be set properly from configuration
        # during concretization
        self.concretization = None
        self.clear()

        if init_file:
            # If we are creating the environment from an init file, we don't
            # need to lock, because there are no Spack operations that alter
            # the init file.
            with fs.open_if_filename(init_file) as f:
                if hasattr(f, 'name') and f.name.endswith('.lock'):
                    self._read_manifest(default_manifest_yaml)
@@ -571,6 +578,30 @@ def __init__(self, path, init_file=None, with_view=None):
                else:
                    self._read_manifest(f, raw_yaml=default_manifest_yaml)
        else:
            with lk.ReadTransaction(self.txlock):
                self._read()

        if with_view is False:
            self.views = {}
        elif with_view is True:
            self.views = {
                default_view_name: ViewDescriptor(self.view_path_default)}
        elif isinstance(with_view, six.string_types):
            self.views = {default_view_name: ViewDescriptor(with_view)}
        # If with_view is None, then defer to the view settings determined by
        # the manifest file

    def _re_read(self):
        """Reinitialize the environment object if it has been written (this
        may not be true if the environment was just created in this running
        instance of Spack)."""
        if not os.path.exists(self.manifest_path):
            return

        self.clear()
        self._read()

    def _read(self):
        default_manifest = not os.path.exists(self.manifest_path)
        if default_manifest:
            # No manifest, use default yaml
@@ -592,15 +623,9 @@ def __init__(self, path, init_file=None, with_view=None):
                self.lock_path, self._lock_backup_v1_path))
            shutil.copy(self.lock_path, self._lock_backup_v1_path)

        if with_view is False:
            self.views = {}
        elif with_view is True:
            self.views = {
                default_view_name: ViewDescriptor(self.view_path_default)}
        elif isinstance(with_view, six.string_types):
            self.views = {default_view_name: ViewDescriptor(with_view)}
        # If with_view is None, then defer to the view settings determined by
        # the manifest file
    def write_transaction(self):
        """Get a write lock context manager for use in a `with` block."""
        return lk.WriteTransaction(self.txlock, acquire=self._re_read)

    def _read_manifest(self, f, raw_yaml=None):
        """Read manifest file and set up user specs."""
@@ -694,6 +719,13 @@ def manifest_path(self):
        """Path to spack.yaml file in this environment."""
        return os.path.join(self.path, manifest_name)

    @property
    def _transaction_lock_path(self):
        """The location of the lock file used to synchronize multiple
        processes updating the same environment.
        """
        return os.path.join(self.env_subdir_path, 'transaction_lock')

    @property
    def lock_path(self):
        """Path to spack.lock file in this environment."""
@@ -986,11 +1018,18 @@ def _concretize_separately(self):
            concretized_specs.append((uspec, concrete))
        return concretized_specs

    def install(self, user_spec, concrete_spec=None, **install_args):
        """Install a single spec into an environment.
    def concretize_and_add(self, user_spec, concrete_spec=None):
        """Concretize and add a single spec to the environment.

        This will automatically concretize the single spec, but it won't
        affect other as-yet unconcretized specs.
        Concretize the provided ``user_spec`` and add it along with the
        concretized result to the environment. If the given ``user_spec`` was
        already present in the environment, this does not add a duplicate.
        The concretized spec will be added unless the ``user_spec`` was
        already present and an associated concrete spec was already present.

        Args:
            concrete_spec: if provided, then it is assumed that it is the
                result of concretizing the provided ``user_spec``
        """
        if self.concretization == 'together':
            msg = 'cannot install a single spec in an environment that is ' \
@@ -1001,7 +1040,6 @@ def install(self, user_spec, concrete_spec=None, **install_args):

        spec = Spec(user_spec)

        with spack.store.db.read_transaction():
        if self.add(spec):
            concrete = concrete_spec or spec.concretized()
            self._add_concrete_spec(spec, concrete)
@@ -1016,22 +1054,7 @@ def install(self, user_spec, concrete_spec=None, **install_args):
            concrete = spec.concretized()
            self._add_concrete_spec(spec, concrete)

        self._install(concrete, **install_args)

    def _install(self, spec, **install_args):
        spec.package.do_install(**install_args)

        # Make sure log directory exists
        log_path = self.log_path
        fs.mkdirp(log_path)

        with fs.working_dir(self.path):
            # Link the resulting log file into logs dir
            build_log_link = os.path.join(
                log_path, '%s-%s.log' % (spec.name, spec.dag_hash(7)))
            if os.path.lexists(build_log_link):
                os.remove(build_log_link)
            os.symlink(spec.package.build_log_path, build_log_link)
        return concrete

    @property
    def default_view(self):
@@ -1131,6 +1154,33 @@ def _add_concrete_spec(self, spec, concrete, new=True):
        self.concretized_order.append(h)
        self.specs_by_hash[h] = concrete

    def install(self, user_spec, concrete_spec=None, **install_args):
        """Install a single spec into an environment.
|
||||
|
||||
This will automatically concretize the single spec, but it won't
|
||||
affect other as-yet unconcretized specs.
|
||||
"""
|
||||
concrete = self.concretize_and_add(user_spec, concrete_spec)
|
||||
|
||||
self._install(concrete, **install_args)
|
||||
|
||||
def _install(self, spec, **install_args):
|
||||
# "spec" must be concrete
|
||||
spec.package.do_install(**install_args)
|
||||
|
||||
if not spec.external:
|
||||
# Make sure log directory exists
|
||||
log_path = self.log_path
|
||||
fs.mkdirp(log_path)
|
||||
|
||||
with fs.working_dir(self.path):
|
||||
# Link the resulting log file into logs dir
|
||||
build_log_link = os.path.join(
|
||||
log_path, '%s-%s.log' % (spec.name, spec.dag_hash(7)))
|
||||
if os.path.lexists(build_log_link):
|
||||
os.remove(build_log_link)
|
||||
os.symlink(spec.package.build_log_path, build_log_link)
|
||||
|
||||
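The build-log bookkeeping above follows a replace-if-exists symlink pattern. A minimal standalone sketch of just that pattern (the `relink` helper is hypothetical, not part of Spack):

```python
import os


def relink(target, link_path):
    # Replace any existing link before creating the new one, as the
    # build-log linking above does with lexists/remove/symlink.
    if os.path.lexists(link_path):
        os.remove(link_path)
    os.symlink(target, link_path)
```

Note that `os.path.lexists` is true even for a dangling symlink, which plain `os.path.exists` would miss; that is why the code above checks it before removing.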
    def install_all(self, args=None):
        """Install all concretized specs in an environment.

@@ -1138,10 +1188,20 @@ def install_all(self, args=None):
        that needs to be done separately with a call to write().

        """

        # If "spack install" is invoked repeatedly for a large environment
        # where all specs are already installed, the operation can take
        # a large amount of time due to repeatedly acquiring and releasing
        # locks, this does an initial check across all specs within a single
        # DB read transaction to reduce time spent in this case.
        uninstalled_specs = []
        with spack.store.db.read_transaction():
            for concretized_hash in self.concretized_order:
                spec = self.specs_by_hash[concretized_hash]
                if not spec.package.installed:
                    uninstalled_specs.append(spec)

        for spec in uninstalled_specs:
            # Parse cli arguments and construct a dictionary
            # that will be passed to Package.do_install API
            kwargs = dict()
@@ -1150,14 +1210,6 @@ def install_all(self, args=None):

            self._install(spec, **kwargs)

            if not spec.external:
                # Link the resulting log file into logs dir
                log_name = '%s-%s' % (spec.name, spec.dag_hash(7))
                build_log_link = os.path.join(self.log_path, log_name)
                if os.path.lexists(build_log_link):
                    os.remove(build_log_link)
                os.symlink(spec.package.build_log_path, build_log_link)

    def all_specs_by_hash(self):
        """Map of hashes to spec for all specs in this environment."""
        # Note this uses dag-hashes calculated without build deps as keys,
@@ -1424,8 +1476,8 @@ def write(self, regenerate_views=True):

        # Remove yaml sections that are shadowing defaults
        # construct garbage path to ensure we don't find a manifest by accident
        bare_env = Environment(os.path.join(self.manifest_path, 'garbage'),
                               with_view=self.view_path_default)
        with fs.temp_cwd() as env_dir:
            bare_env = Environment(env_dir, with_view=self.view_path_default)
        keys_present = list(yaml_dict.keys())
        for key in keys_present:
            if yaml_dict[key] == config_dict(bare_env.yaml).get(key, None):
@@ -714,7 +714,8 @@ class GitFetchStrategy(VCSFetchStrategy):
    Repositories are cloned into the standard stage source path directory.
    """
    url_attr = 'git'
    optional_attrs = ['tag', 'branch', 'commit', 'submodules', 'get_full_repo']
    optional_attrs = ['tag', 'branch', 'commit', 'submodules',
                      'get_full_repo', 'submodules_delete']

    def __init__(self, **kwargs):
        # Discards the keywords in kwargs that may conflict with the next call
@@ -725,6 +726,7 @@ def __init__(self, **kwargs):

        self._git = None
        self.submodules = kwargs.get('submodules', False)
        self.submodules_delete = kwargs.get('submodules_delete', False)
        self.get_full_repo = kwargs.get('get_full_repo', False)

    @property
@@ -858,6 +860,14 @@ def fetch(self):
            git(*pull_args, ignore_errors=1)
            git(*co_args)

        if self.submodules_delete:
            with working_dir(self.stage.source_path):
                for submodule_to_delete in self.submodules_delete:
                    args = ['rm', submodule_to_delete]
                    if not spack.config.get('config:debug'):
                        args.insert(1, '--quiet')
                    git(*args)

        # Init submodules if the user asked for them.
        if self.submodules:
            with working_dir(self.stage.source_path):
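The `git rm` argument construction in the new `submodules_delete` handling can be isolated as a tiny helper; this sketch mirrors the logic above, with a plain `debug` flag standing in for `spack.config.get('config:debug')`:

```python
def git_rm_args(submodule, debug=False):
    # Build the argument list for `git rm`: '--quiet' is inserted right
    # after 'rm' unless debug output is enabled, matching the code above.
    args = ['rm', submodule]
    if not debug:
        args.insert(1, '--quiet')
    return args
```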
@@ -13,6 +13,7 @@
import sys
import re
import os
import os.path
import inspect
import pstats
import argparse
@@ -21,6 +22,7 @@
from six import StringIO

import llnl.util.cpu
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.tty.log import log_output
@@ -35,6 +37,7 @@
import spack.store
import spack.util.debug
import spack.util.path
import spack.util.executable as exe
from spack.error import SpackError


@@ -107,6 +110,35 @@ def add_all_commands(parser):
        parser.add_command(cmd)


def get_version():
    """Get a descriptive version of this instance of Spack.

    If this is a git repository, and if it is not on a release tag,
    return a string like:

        release_version-commits_since_release-commit

    If we *are* at a release tag, or if this is not a git repo, return
    the real spack release number (e.g., 0.13.3).

    """
    git_path = os.path.join(spack.paths.prefix, ".git")
    if os.path.exists(git_path):
        git = exe.which("git")
        if git:
            with fs.working_dir(spack.paths.prefix):
                desc = git(
                    "describe", "--tags", output=str, fail_on_error=False)

            if git.returncode == 0:
                match = re.match(r"v([^-]+)-([^-]+)-g([a-f\d]+)", desc)
                if match:
                    v, n, commit = match.groups()
                    return "%s-%s-%s" % (v, n, commit)

    return spack.spack_version
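The string `get_version()` builds is parsed out of `git describe --tags` output with the regular expression above. A self-contained sketch of just that parsing step (the sample describe strings below are illustrative):

```python
import re


def parse_git_describe(desc):
    # Turn "v<release>-<commits_since_release>-g<sha>" into
    # "<release>-<commits_since_release>-<sha>", the same transformation
    # applied inside get_version() above.
    match = re.match(r"v([^-]+)-([^-]+)-g([a-f\d]+)", desc)
    if match:
        v, n, commit = match.groups()
        return "%s-%s-%s" % (v, n, commit)
    # No match: we are exactly on a release tag (or not a describe string),
    # so get_version() falls back to the plain release number.
    return None
```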
def index_commands():
    """create an index of commands by section for this help level"""
    index = {}
@@ -680,7 +712,7 @@ def main(argv=None):
    # -h, -H, and -V are special as they do not require a command, but
    # all the other options do nothing without a command.
    if args.version:
        print(spack.spack_version)
        print(get_version())
        return 0
    elif args.help:
        sys.stdout.write(parser.format_help(level=args.help))
@@ -1262,7 +1262,10 @@ def content_hash(self, content=None):
                raise spack.error.SpackError(err_msg)

        hash_content = list()
        try:
            source_id = fs.for_package_version(self, self.version).source_id()
        except fs.ExtrapolationError:
            source_id = None
        if not source_id:
            # TODO? in cases where a digest or source_id isn't available,
            # should this attempt to download the source and set one? This
@@ -1505,9 +1508,10 @@ def _update_explicit_entry_in_db(self, rec, explicit):
                message = '{s.name}@{s.version} : marking the package explicit'
                tty.msg(message.format(s=self))

    def try_install_from_binary_cache(self, explicit):
    def try_install_from_binary_cache(self, explicit, unsigned=False):
        tty.msg('Searching for binary cache of %s' % self.name)
        specs = binary_distribution.get_specs(use_arch=True)
        specs = binary_distribution.get_spec(spec=self.spec,
                                             force=False)
        binary_spec = spack.spec.Spec.from_dict(self.spec.to_dict())
        binary_spec._mark_concrete()
        if binary_spec not in specs:
@@ -1521,7 +1525,7 @@ def try_install_from_binary_cache(self, explicit):
        tty.msg('Installing %s from binary cache' % self.name)
        binary_distribution.extract_tarball(
            binary_spec, tarball, allow_root=False,
            unsigned=False, force=False)
            unsigned=unsigned, force=False)
        self.installed_from_binary_cache = True
        spack.store.db.add(
            self.spec, spack.store.layout, explicit=explicit)
@@ -1662,7 +1666,8 @@ def do_install(self, **kwargs):
        tty.msg(colorize('@*{Installing} @*g{%s}' % self.name))

        if kwargs.get('use_cache', True):
            if self.try_install_from_binary_cache(explicit):
            if self.try_install_from_binary_cache(
                    explicit, unsigned=kwargs.get('unsigned', False)):
                tty.msg('Successfully installed %s from binary cache'
                        % self.name)
                print_pkg(self.prefix)
@@ -932,7 +932,7 @@ def dump_provenance(self, spec, path):
                tty.warn("Patch file did not exist: %s" % patch.path)

        # Install the package.py file itself.
        install(self.filename_for_package_name(spec), path)
        install(self.filename_for_package_name(spec.name), path)

    def purge(self):
        """Clear entire package instance cache."""
@@ -974,20 +974,12 @@ def providers_for(self, vpkg_spec):
    def extensions_for(self, extendee_spec):
        return [p for p in self.all_packages() if p.extends(extendee_spec)]

    def _check_namespace(self, spec):
        """Check that the spec's namespace is the same as this repository's."""
        if spec.namespace and spec.namespace != self.namespace:
            raise UnknownNamespaceError(spec.namespace)

    @autospec
    def dirname_for_package_name(self, spec):
    def dirname_for_package_name(self, pkg_name):
        """Get the directory name for a particular package. This is the
        directory that contains its package.py file."""
        self._check_namespace(spec)
        return os.path.join(self.packages_path, spec.name)
        return os.path.join(self.packages_path, pkg_name)

    @autospec
    def filename_for_package_name(self, spec):
    def filename_for_package_name(self, pkg_name):
        """Get the filename for the module we should load for a particular
        package. Packages for a Repo live in
        ``$root/<package_name>/package.py``
@@ -996,8 +988,7 @@ def filename_for_package_name(self, spec):
        package doesn't exist yet, so callers will need to ensure
        the package exists before importing.
        """
        self._check_namespace(spec)
        pkg_dir = self.dirname_for_package_name(spec.name)
        pkg_dir = self.dirname_for_package_name(pkg_name)
        return os.path.join(pkg_dir, package_file_name)

    @property
lib/spack/spack/schema/container.py (new file, 82 lines)
@@ -0,0 +1,82 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for the 'container' subsection of Spack environments."""

#: Schema for the container attribute included in Spack environments
container_schema = {
    'type': 'object',
    'additionalProperties': False,
    'properties': {
        # The recipe formats that are currently supported by the command
        'format': {
            'type': 'string',
            'enum': ['docker', 'singularity']
        },
        # Describes the base image to start from and the version
        # of Spack to be used
        'base': {
            'type': 'object',
            'additionalProperties': False,
            'properties': {
                'image': {
                    'type': 'string',
                    'enum': ['ubuntu:18.04',
                             'ubuntu:16.04',
                             'centos:7',
                             'centos:6']
                },
                'spack': {
                    'type': 'string',
                    'enum': ['develop', '0.14', '0.14.0']
                }
            },
            'required': ['image', 'spack']
        },
        # Whether or not to strip installed binaries
        'strip': {
            'type': 'boolean',
            'default': True
        },
        # Additional system packages that are needed at runtime
        'os_packages': {
            'type': 'array',
            'items': {
                'type': 'string'
            }
        },
        # Add labels to the image
        'labels': {
            'type': 'object',
        },
        # Add a custom extra section at the bottom of a stage
        'extra_instructions': {
            'type': 'object',
            'additionalProperties': False,
            'properties': {
                'build': {'type': 'string'},
                'final': {'type': 'string'}
            }
        },
        # Reserved for properties that are specific to each format
        'singularity': {
            'type': 'object',
            'additionalProperties': False,
            'default': {},
            'properties': {
                'runscript': {'type': 'string'},
                'startscript': {'type': 'string'},
                'test': {'type': 'string'},
                'help': {'type': 'string'}
            }
        },
        'docker': {
            'type': 'object',
            'additionalProperties': False,
            'default': {},
        }
    }
}

properties = {'container': container_schema}
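The `base` section is the only part of the schema with required keys. A minimal hand-rolled check mirroring those constraints (a sketch only; Spack validates against this schema with a JSON Schema validator, not code like this):

```python
def base_is_valid(base):
    # Mirror the 'base' section of container_schema above: 'image' and
    # 'spack' are both required and must come from the enumerated values.
    allowed_images = {'ubuntu:18.04', 'ubuntu:16.04', 'centos:7', 'centos:6'}
    allowed_spack = {'develop', '0.14', '0.14.0'}
    return base.get('image') in allowed_images and \
        base.get('spack') in allowed_spack
```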
@@ -13,6 +13,7 @@
import spack.schema.cdash
import spack.schema.compilers
import spack.schema.config
import spack.schema.container
import spack.schema.gitlab_ci
import spack.schema.mirrors
import spack.schema.modules
@@ -26,6 +27,7 @@
    spack.schema.cdash.properties,
    spack.schema.compilers.properties,
    spack.schema.config.properties,
    spack.schema.container.properties,
    spack.schema.gitlab_ci.properties,
    spack.schema.mirrors.properties,
    spack.schema.modules.properties,
@@ -1117,6 +1117,18 @@ def _add_dependency(self, spec, deptypes):
        self._dependencies[spec.name] = dspec
        spec._dependents[self.name] = dspec

    def _add_default_platform(self):
        """If a spec has an os or a target and no platform, give it
        the default platform.

        This is private because it is used by the parser -- it's not
        expected to be used outside of ``spec.py``.

        """
        arch = self.architecture
        if arch and not arch.platform and (arch.os or arch.target):
            self._set_architecture(platform=spack.architecture.platform().name)

    #
    # Public interface
    #
@@ -4053,14 +4065,6 @@ def do_parse(self):
        except spack.parse.ParseError as e:
            raise SpecParseError(e)

        # If the spec has an os or a target and no platform, give it
        # the default platform
        platform_default = spack.architecture.platform().name
        for spec in specs:
            for s in spec.traverse():
                if s.architecture and not s.architecture.platform and \
                        (s.architecture.os or s.architecture.target):
                    s._set_architecture(platform=platform_default)
        return specs

    def spec_from_file(self):
@@ -4192,6 +4196,7 @@ def spec(self, name):
            else:
                break

        spec._add_default_platform()
        return spec

    def variant(self, name=None):
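The predicate guarding `_add_default_platform` can be read in isolation: a default platform is filled in only when an os or target is present without one. A sketch with a stand-in `Arch` tuple (hypothetical; the real architecture object lives in `spack.architecture`):

```python
from collections import namedtuple

# Stand-in for the architecture object a Spec carries (illustrative only).
Arch = namedtuple('Arch', ['platform', 'os', 'target'])


def needs_default_platform(arch):
    # Same condition as _add_default_platform above: an os or a target
    # was given, but no platform.
    return bool(arch and not arch.platform and (arch.os or arch.target))
```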
@@ -17,7 +17,7 @@
def mock_get_specs(database, monkeypatch):
    specs = database.query_local()
    monkeypatch.setattr(
        spack.binary_distribution, 'get_specs', lambda x: specs
        spack.binary_distribution, 'get_specs', lambda x, y: specs
    )
@@ -95,12 +95,16 @@ def test_create_template(parser, mock_test_repo, args, name, expected):
    (' ', 'name must be provided'),
    ('bad#name', 'name can only contain'),
])
def test_create_template_bad_name(parser, mock_test_repo, name, expected):
def test_create_template_bad_name(
        parser, mock_test_repo, name, expected, capsys):
    """Test template creation with bad name options."""
    constr_args = parser.parse_args(['--skip-editor', '-n', name])
    with pytest.raises(SystemExit, matches=expected):
    with pytest.raises(SystemExit):
        spack.cmd.create.create(parser, constr_args)

    captured = capsys.readouterr()
    assert expected in str(captured)


def test_build_system_guesser_no_stage(parser):
    """Test build system guesser when stage not provided."""
@@ -108,7 +112,7 @@ def test_build_system_guesser_no_stage(parser):

    # Ensure get the expected build system
    with pytest.raises(AttributeError,
                       matches="'NoneType' object has no attribute"):
                       match="'NoneType' object has no attribute"):
        guesser(None, '/the/url/does/not/matter')


@@ -142,7 +146,7 @@ def test_get_name_urls(parser, url, expected):
    assert name == expected


def test_get_name_error(parser, monkeypatch):
def test_get_name_error(parser, monkeypatch, capsys):
    """Test get_name UndetectableNameError exception path."""
    def _parse_name_offset(path, v):
        raise UndetectableNameError(path)
@@ -152,5 +156,7 @@ def _parse_name_offset(path, v):
    url = 'downloads.sourceforge.net/noapp/'
    args = parser.parse_args([url])

    with pytest.raises(SystemExit, matches="Couldn't guess a name"):
    with pytest.raises(SystemExit):
        spack.cmd.create.get_name(args)
    captured = capsys.readouterr()
    assert "Couldn't guess a name" in str(captured)
@@ -30,7 +30,9 @@ def test_packages_are_removed(config, mutable_database, capsys):


@pytest.mark.db
def test_gc_with_environment(config, mutable_database, capsys):
def test_gc_with_environment(
        config, mutable_database, mutable_mock_env_path, capsys
):
    s = spack.spec.Spec('simple-inheritance')
    s.concretize()
    s.package.do_install(fake=True, explicit=True)
@@ -1,4 +1,4 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -170,7 +170,7 @@ def ignore_stage_files():
    Used to track which leftover files in the stage have been seen.
    """
    # to start with, ignore the .lock file at the stage root.
    return set(['.lock', spack.stage._source_path_subdir])
    return set(['.lock', spack.stage._source_path_subdir, 'build_cache'])


def remove_whatever_it_is(path):
@@ -744,11 +744,31 @@ def mock_archive(request, tmpdir_factory):

@pytest.fixture(scope='session')
def mock_git_repository(tmpdir_factory):
    """Creates a very simple git repository with two branches and
    two commits.
    """Creates a simple git repository with two branches,
    two commits and two submodules. Each submodule has one commit.
    """
    git = spack.util.executable.which('git', required=True)

    suburls = []
    for submodule_count in range(2):
        tmpdir = tmpdir_factory.mktemp('mock-git-repo-submodule-dir-{0}'
                                       .format(submodule_count))
        tmpdir.ensure(spack.stage._source_path_subdir, dir=True)
        repodir = tmpdir.join(spack.stage._source_path_subdir)
        suburls.append((submodule_count, 'file://' + str(repodir)))

        # Initialize the repository
        with repodir.as_cwd():
            git('init')
            git('config', 'user.name', 'Spack')
            git('config', 'user.email', 'spack@spack.io')

            # r0 is just the first commit
            submodule_file = 'r0_file_{0}'.format(submodule_count)
            repodir.ensure(submodule_file)
            git('add', submodule_file)
            git('commit', '-m', 'mock-git-repo r0 {0}'.format(submodule_count))

    tmpdir = tmpdir_factory.mktemp('mock-git-repo-dir')
    tmpdir.ensure(spack.stage._source_path_subdir, dir=True)
    repodir = tmpdir.join(spack.stage._source_path_subdir)
@@ -759,6 +779,9 @@ def mock_git_repository(tmpdir_factory):
        git('config', 'user.name', 'Spack')
        git('config', 'user.email', 'spack@spack.io')
        url = 'file://' + str(repodir)
        for number, suburl in suburls:
            git('submodule', 'add', suburl,
                'third_party/submodule{0}'.format(number))

        # r0 is just the first commit
        r0_file = 'r0_file'
lib/spack/spack/test/container/cli.py (new file, 16 lines)
@@ -0,0 +1,16 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.filesystem as fs
import spack.main


containerize = spack.main.SpackCommand('containerize')


def test_command(configuration_dir, capsys):
    with capsys.disabled():
        with fs.working_dir(configuration_dir):
            output = containerize()
    assert 'FROM spack/ubuntu-bionic' in output
lib/spack/spack/test/container/conftest.py (new file, 43 lines)
@@ -0,0 +1,43 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest

import spack.util.spack_yaml as syaml


@pytest.fixture()
def minimal_configuration():
    return {
        'spack': {
            'specs': [
                'gromacs',
                'mpich',
                'fftw precision=float'
            ],
            'container': {
                'format': 'docker',
                'base': {
                    'image': 'ubuntu:18.04',
                    'spack': 'develop'
                }
            }
        }
    }


@pytest.fixture()
def config_dumper(tmpdir):
    """Function that dumps an environment config in a temporary folder."""
    def dumper(configuration):
        content = syaml.dump(configuration, default_flow_style=False)
        config_file = tmpdir / 'spack.yaml'
        config_file.write(content)
        return str(tmpdir)
    return dumper


@pytest.fixture()
def configuration_dir(minimal_configuration, config_dumper):
    return config_dumper(minimal_configuration)
lib/spack/spack/test/container/docker.py (new file, 74 lines)
@@ -0,0 +1,74 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import spack.container.writers as writers


def test_manifest(minimal_configuration):
    writer = writers.create(minimal_configuration)
    manifest_str = writer.manifest
    for line in manifest_str.split('\n'):
        assert 'echo' in line


def test_build_and_run_images(minimal_configuration):
    writer = writers.create(minimal_configuration)

    # Test the output of run property
    run = writer.run
    assert run.image == 'ubuntu:18.04'

    # Test the output of the build property
    build = writer.build
    assert build.image == 'spack/ubuntu-bionic'
    assert build.tag == 'latest'


def test_packages(minimal_configuration):
    # In this minimal configuration we don't have packages
    writer = writers.create(minimal_configuration)
    assert writer.os_packages is None

    # If we add them a list should be returned
    pkgs = ['libgomp1']
    minimal_configuration['spack']['container']['os_packages'] = pkgs
    writer = writers.create(minimal_configuration)
    p = writer.os_packages
    assert p.update
    assert p.install
    assert p.clean
    assert p.list == pkgs


def test_ensure_render_works(minimal_configuration):
    # Here we just want to ensure that nothing is raised
    writer = writers.create(minimal_configuration)
    writer()


def test_strip_is_set_from_config(minimal_configuration):
    writer = writers.create(minimal_configuration)
    assert writer.strip is True

    minimal_configuration['spack']['container']['strip'] = False
    writer = writers.create(minimal_configuration)
    assert writer.strip is False


def test_extra_instructions_is_set_from_config(minimal_configuration):
    writer = writers.create(minimal_configuration)
    assert writer.extra_instructions == (None, None)

    test_line = 'RUN echo Hello world!'
    e = minimal_configuration['spack']['container']
    e['extra_instructions'] = {}
    e['extra_instructions']['build'] = test_line
    writer = writers.create(minimal_configuration)
    assert writer.extra_instructions == (test_line, None)

    e['extra_instructions']['final'] = test_line
    del e['extra_instructions']['build']
    writer = writers.create(minimal_configuration)
    assert writer.extra_instructions == (None, test_line)
lib/spack/spack/test/container/images.py (new file, 58 lines)
@@ -0,0 +1,58 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path

import pytest

import spack.container


@pytest.mark.parametrize('image,spack_version,expected', [
    ('ubuntu:18.04', 'develop', ('spack/ubuntu-bionic', 'latest')),
    ('ubuntu:18.04', '0.14.0', ('spack/ubuntu-bionic', '0.14.0')),
])
def test_build_info(image, spack_version, expected):
    output = spack.container.images.build_info(image, spack_version)
    assert output == expected
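The parametrized cases above pin down the mapping `build_info` is expected to implement. A hypothetical re-implementation consistent with just those two cases (the real function looks its data up inside `spack.container.images`, not a hard-coded dict like this):

```python
def build_info(image, spack_version):
    # The 'ubuntu:18.04' base maps to the 'spack/ubuntu-bionic' build
    # image; the 'develop' version maps to the 'latest' tag, while a
    # release version is used as the tag directly.
    build_images = {'ubuntu:18.04': 'spack/ubuntu-bionic'}
    tag = 'latest' if spack_version == 'develop' else spack_version
    return build_images[image], tag
```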
@pytest.mark.parametrize('image,spack_version', [
    ('ubuntu:18.04', 'doesnotexist')
])
def test_build_info_error(image, spack_version):
    with pytest.raises(ValueError, match=r"has no tag for"):
        spack.container.images.build_info(image, spack_version)


@pytest.mark.parametrize('image', [
    'ubuntu:18.04'
])
def test_package_info(image):
    update, install, clean = spack.container.images.package_info(image)
    assert update
    assert install
    assert clean


@pytest.mark.parametrize('extra_config,expected_msg', [
    ({'modules': {'enable': ['tcl']}}, 'the subsection "modules" in'),
    ({'concretization': 'separately'}, 'the "concretization" attribute'),
    ({'config': {'install_tree': '/some/dir'}},
     'the "config:install_tree" attribute has been set'),
    ({'view': '/some/dir'}, 'the "view" attribute has been set')
])
def test_validate(
        extra_config, expected_msg, minimal_configuration, config_dumper
):
    minimal_configuration['spack'].update(extra_config)
    spack_yaml_dir = config_dumper(minimal_configuration)
    spack_yaml = os.path.join(spack_yaml_dir, 'spack.yaml')

    with pytest.warns(UserWarning) as w:
        spack.container.validate(spack_yaml)

    # Tests are designed to raise only one warning
    assert len(w) == 1
    assert expected_msg in str(w.pop().message)
lib/spack/spack/test/container/schema.py (new file, 16 lines)
@@ -0,0 +1,16 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import spack.container
import spack.schema.container


def test_images_in_schema():
    properties = spack.schema.container.container_schema['properties']
    allowed_images = set(
        properties['base']['properties']['image']['enum']
    )
    images_in_json = set(x for x in spack.container.images.data())
    assert images_in_json == allowed_images
42
lib/spack/spack/test/container/singularity.py
Normal file
42
lib/spack/spack/test/container/singularity.py
Normal file
@@ -0,0 +1,42 @@
|
||||
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest

import spack.container.writers as writers


@pytest.fixture
def singularity_configuration(minimal_configuration):
    minimal_configuration['spack']['container']['format'] = 'singularity'
    return minimal_configuration


def test_ensure_render_works(singularity_configuration):
    container_config = singularity_configuration['spack']['container']
    assert container_config['format'] == 'singularity'
    # Here we just want to ensure that nothing is raised
    writer = writers.create(singularity_configuration)
    writer()


@pytest.mark.parametrize('properties,expected', [
    ({'runscript': '/opt/view/bin/h5ls'},
     {'runscript': '/opt/view/bin/h5ls',
      'startscript': '',
      'test': '',
      'help': ''})
])
def test_singularity_specific_properties(
        properties, expected, singularity_configuration
):
    # Set the property in the configuration
    container_config = singularity_configuration['spack']['container']
    for name, value in properties.items():
        container_config.setdefault('singularity', {})[name] = value

    # Assert the properties return the expected values
    writer = writers.create(singularity_configuration)
    for name, value in expected.items():
        assert getattr(writer, name) == value
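The parametrized test above pins down the defaulting behavior for singularity-specific sections: a property set under the `singularity` key of the container configuration is returned as-is, and anything unset falls back to an empty string. A standalone sketch of that lookup (the helper `singularity_property` is hypothetical, not Spack's actual implementation):

```python
def singularity_property(container_config, name):
    # Hypothetical helper: return the singularity-specific property if it
    # was set in the configuration, otherwise default to an empty string,
    # mirroring what the test above expects from the writer's attributes.
    return container_config.get('singularity', {}).get(name, '')


config = {'singularity': {'runscript': '/opt/view/bin/h5ls'}}
print(singularity_property(config, 'runscript'))   # the value that was set
print(repr(singularity_property(config, 'help')))  # unset -> ''
```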
@@ -14,4 +14,3 @@ config:
  dirty: false
  module_roots:
    tcl: $spack/share/spack/modules
    lmod: $spack/share/spack/lmod
@@ -55,10 +55,11 @@ def test_global_db_initializtion():
@pytest.fixture()
def upstream_and_downstream_db(tmpdir_factory, gen_mock_layout):
    mock_db_root = str(tmpdir_factory.mktemp('mock_db_root'))
    upstream_db = spack.database.Database(mock_db_root)
    upstream_write_db = spack.database.Database(mock_db_root)
    upstream_db = spack.database.Database(mock_db_root, is_upstream=True)
    # Generate initial DB file to avoid reindex
    with open(upstream_db._index_path, 'w') as db_file:
        upstream_db._write_to_file(db_file)
    with open(upstream_write_db._index_path, 'w') as db_file:
        upstream_write_db._write_to_file(db_file)
    upstream_layout = gen_mock_layout('/a/')

    downstream_db_root = str(
@@ -69,13 +70,14 @@ def upstream_and_downstream_db(tmpdir_factory, gen_mock_layout):
        downstream_db._write_to_file(db_file)
    downstream_layout = gen_mock_layout('/b/')

    yield upstream_db, upstream_layout, downstream_db, downstream_layout
    yield upstream_write_db, upstream_db, upstream_layout,\
        downstream_db, downstream_layout


@pytest.mark.usefixtures('config')
def test_installed_upstream(upstream_and_downstream_db):
    upstream_db, upstream_layout, downstream_db, downstream_layout = (
        upstream_and_downstream_db)
    upstream_write_db, upstream_db, upstream_layout,\
        downstream_db, downstream_layout = (upstream_and_downstream_db)

    default = ('build', 'link')
    x = MockPackage('x', [], [])
@@ -89,7 +91,14 @@ def test_installed_upstream(upstream_and_downstream_db):
    spec.concretize()

    for dep in spec.traverse(root=False):
        upstream_db.add(dep, upstream_layout)
        upstream_write_db.add(dep, upstream_layout)
    upstream_db._read()

    for dep in spec.traverse(root=False):
        record = downstream_db.get_by_hash(dep.dag_hash())
        assert record is not None
        with pytest.raises(spack.database.ForbiddenLockError):
            record = upstream_db.get_by_hash(dep.dag_hash())

    new_spec = spack.spec.Spec('w')
    new_spec.concretize()
@@ -110,8 +119,8 @@ def test_installed_upstream(upstream_and_downstream_db):

@pytest.mark.usefixtures('config')
def test_removed_upstream_dep(upstream_and_downstream_db):
    upstream_db, upstream_layout, downstream_db, downstream_layout = (
        upstream_and_downstream_db)
    upstream_write_db, upstream_db, upstream_layout,\
        downstream_db, downstream_layout = (upstream_and_downstream_db)

    default = ('build', 'link')
    z = MockPackage('z', [], [])
@@ -122,13 +131,15 @@ def test_removed_upstream_dep(upstream_and_downstream_db):
    spec = spack.spec.Spec('y')
    spec.concretize()

    upstream_db.add(spec['z'], upstream_layout)
    upstream_write_db.add(spec['z'], upstream_layout)
    upstream_db._read()

    new_spec = spack.spec.Spec('y')
    new_spec.concretize()
    downstream_db.add(new_spec, downstream_layout)

    upstream_db.remove(new_spec['z'])
    upstream_write_db.remove(new_spec['z'])
    upstream_db._read()

    new_downstream = spack.database.Database(
        downstream_db.root, upstream_dbs=[upstream_db])
@@ -143,8 +154,8 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db):
    DB. When a package is recorded as installed in both, the results should
    refer to the downstream DB.
    """
    upstream_db, upstream_layout, downstream_db, downstream_layout = (
        upstream_and_downstream_db)
    upstream_write_db, upstream_db, upstream_layout,\
        downstream_db, downstream_layout = (upstream_and_downstream_db)

    x = MockPackage('x', [], [])
    mock_repo = MockPackageMultiRepo([x])
@@ -155,7 +166,8 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db):

    downstream_db.add(spec, downstream_layout)

    upstream_db.add(spec, upstream_layout)
    upstream_write_db.add(spec, upstream_layout)
    upstream_db._read()

    upstream, record = downstream_db.query_by_spec_hash(spec.dag_hash())
    # Even though the package is recorded as installed in the upstream DB,
@@ -731,3 +743,23 @@ def test_query_unused_specs(mutable_database):
    unused = spack.store.db.unused_specs
    assert len(unused) == 1
    assert unused[0].name == 'cmake'


@pytest.mark.regression('10019')
def test_query_spec_with_conditional_dependency(mutable_database):
    # The issue is triggered by having dependencies that are
    # conditional on a Boolean variant
    s = spack.spec.Spec('hdf5~mpi')
    s.concretize()
    s.package.do_install(fake=True, explicit=True)

    results = spack.store.db.query_local('hdf5 ^mpich')
    assert not results


@pytest.mark.regression('10019')
def test_query_spec_with_non_conditional_virtual_dependency(database):
    # Ensure the same issue doesn't come up for virtual
    # dependency that are not conditional on variants
    results = spack.store.db.query_local('mpileaks ^mpich')
    assert len(results) == 1
@@ -19,7 +19,6 @@
from spack.fetch_strategy import GitFetchStrategy
from spack.util.executable import which


pytestmark = pytest.mark.skipif(
    not which('git'), reason='requires git to be installed')

@@ -169,7 +168,7 @@ def test_git_extra_fetch(tmpdir):
def test_needs_stage():
    """Trigger a NoStageError when attempt a fetch without a stage."""
    with pytest.raises(spack.fetch_strategy.NoStageError,
                       matches=_mock_transport_error):
                       match=r"set_stage.*before calling fetch"):
        fetcher = GitFetchStrategy(git='file:///not-a-real-git-repo')
        fetcher.fetch()

@@ -217,3 +216,59 @@ def test_get_full_repo(get_full_repo, git_version, mock_git_repository,
    else:
        assert(nbranches == 2)
        assert(ncommits == 1)


@pytest.mark.disable_clean_stage_check
@pytest.mark.parametrize("submodules", [True, False])
def test_gitsubmodule(submodules, mock_git_repository, config,
                      mutable_mock_repo):
    """
    Test GitFetchStrategy behavior with submodules
    """
    type_of_test = 'tag-branch'
    t = mock_git_repository.checks[type_of_test]

    # Construct the package under test
    spec = Spec('git-test')
    spec.concretize()
    pkg = spack.repo.get(spec)
    args = copy.copy(t.args)
    args['submodules'] = submodules
    pkg.versions[ver('git')] = args
    pkg.do_stage()
    with working_dir(pkg.stage.source_path):
        for submodule_count in range(2):
            file_path = os.path.join(pkg.stage.source_path,
                                     'third_party/submodule{0}/r0_file_{0}'
                                     .format(submodule_count))
            if submodules:
                assert os.path.isfile(file_path)
            else:
                assert not os.path.isfile(file_path)


@pytest.mark.disable_clean_stage_check
def test_gitsubmodules_delete(mock_git_repository, config, mutable_mock_repo):
    """
    Test GitFetchStrategy behavior with submodules_delete
    """
    type_of_test = 'tag-branch'
    t = mock_git_repository.checks[type_of_test]

    # Construct the package under test
    spec = Spec('git-test')
    spec.concretize()
    pkg = spack.repo.get(spec)
    args = copy.copy(t.args)
    args['submodules'] = True
    args['submodules_delete'] = ['third_party/submodule0',
                                 'third_party/submodule1']
    pkg.versions[ver('git')] = args
    pkg.do_stage()
    with working_dir(pkg.stage.source_path):
        file_path = os.path.join(pkg.stage.source_path,
                                 'third_party/submodule0')
        assert not os.path.isdir(file_path)
        file_path = os.path.join(pkg.stage.source_path,
                                 'third_party/submodule1')
        assert not os.path.isdir(file_path)
@@ -316,14 +316,14 @@ def test_uninstall_by_spec_errors(mutable_database):
    """Test exceptional cases with the uninstall command."""

    # Try to uninstall a spec that has not been installed
    rec = mutable_database.get_record('zmpi')
    with pytest.raises(InstallError, matches="not installed"):
        PackageBase.uninstall_by_spec(rec.spec)
    spec = Spec('dependent-install')
    spec.concretize()
    with pytest.raises(InstallError, match="is not installed"):
        PackageBase.uninstall_by_spec(spec)

    # Try an unforced uninstall of a spec with dependencies
    rec = mutable_database.get_record('mpich')

    with pytest.raises(PackageStillNeededError, matches="cannot uninstall"):
    with pytest.raises(PackageStillNeededError, match="Cannot uninstall"):
        PackageBase.uninstall_by_spec(rec.spec)
@@ -245,7 +245,7 @@ def test_unsupported_optimization_flags(target_name, compiler, version):
    target = llnl.util.cpu.targets[target_name]
    with pytest.raises(
        llnl.util.cpu.UnsupportedMicroarchitecture,
        matches='cannot produce optimized binary'
        match='cannot produce optimized binary'
    ):
        target.optimization_flags(compiler, version)

@@ -287,5 +287,5 @@ def test_invalid_family():
        vendor='Imagination', features=[], compilers={}, generation=0
    )
    with pytest.raises(AssertionError,
                       matches='a target is expected to belong'):
                       match='a target is expected to belong'):
        multi_parents.family
@@ -116,13 +116,13 @@ def test_parent_dir(self, stage):

        # Make sure we get the right error if we try to copy a parent into
        # a descendent directory.
        with pytest.raises(ValueError, matches="Cannot copy"):
        with pytest.raises(ValueError, match="Cannot copy"):
            with fs.working_dir(str(stage)):
                fs.copy_tree('source', 'source/sub/directory')

        # Only point with this check is to make sure we don't try to perform
        # the copy.
        with pytest.raises(IOError, matches="No such file or directory"):
        with pytest.raises(IOError, match="No such file or directory"):
            with fs.working_dir(str(stage)):
                fs.copy_tree('foo/ba', 'foo/bar')
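Several hunks in this merge replace the misspelled `matches=` keyword with pytest's `match=` keyword, which checks the exception message with a regular-expression search. A standalone sketch of that semantics, using only the standard library (the `message_matches` helper is hypothetical, for illustration):

```python
import re


def message_matches(exc, pattern):
    # Sketch of what pytest.raises(..., match=pattern) verifies: the
    # pattern is re.search-ed against the string form of the exception,
    # so a substring-style regex anywhere in the message is enough.
    return re.search(pattern, str(exc)) is not None


err = ValueError("Cannot copy a parent into a descendent directory")
print(message_matches(err, "Cannot copy"))   # pattern found in message
print(message_matches(err, "No such file"))  # pattern absent
```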
63	lib/spack/spack/test/main.py	Normal file
@@ -0,0 +1,63 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import os

import llnl.util.filesystem as fs

import spack.paths
from spack.main import get_version, main


def test_get_version_no_match_git(tmpdir, working_env):
    git = str(tmpdir.join("git"))
    with open(git, "w") as f:
        f.write("""#!/bin/sh
echo v0.13.3
""")
    fs.set_executable(git)

    os.environ["PATH"] = str(tmpdir)
    assert spack.spack_version == get_version()


def test_get_version_match_git(tmpdir, working_env):
    git = str(tmpdir.join("git"))
    with open(git, "w") as f:
        f.write("""#!/bin/sh
echo v0.13.3-912-g3519a1762
""")
    fs.set_executable(git)

    os.environ["PATH"] = str(tmpdir)
    assert "0.13.3-912-3519a1762" == get_version()


def test_get_version_no_repo(tmpdir, monkeypatch):
    monkeypatch.setattr(spack.paths, "prefix", str(tmpdir))
    assert spack.spack_version == get_version()


def test_get_version_no_git(tmpdir, working_env):
    os.environ["PATH"] = str(tmpdir)
    assert spack.spack_version == get_version()


def test_main_calls_get_version(tmpdir, capsys, working_env):
    os.environ["PATH"] = str(tmpdir)
    main(["-V"])
    assert spack.spack_version == capsys.readouterr()[0].strip()


def test_get_version_bad_git(tmpdir, working_env):
    bad_git = str(tmpdir.join("git"))
    with open(bad_git, "w") as f:
        f.write("""#!/bin/sh
exit 1
""")
    fs.set_executable(bad_git)

    os.environ["PATH"] = str(tmpdir)
    assert spack.spack_version == get_version()
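The tests above pin down `get_version`'s fallback behavior with fake `git` executables on `PATH`: a descriptor like `v0.13.3-912-g3519a1762` becomes `0.13.3-912-3519a1762`, and anything else (a bare tag, a failing git, no git at all) falls back to the static `spack_version`. A hedged sketch of just the parsing step (`version_from_git_describe` is a hypothetical helper, not the real implementation, which shells out to `git describe`):

```python
import re

spack_version = "0.13.3"  # static fallback, as in the tests above


def version_from_git_describe(desc):
    # Sketch: accept `git describe` output of the form
    # v<version>-<commits>-g<hash> and rewrite it without the v/g prefixes;
    # anything that does not match falls back to the static version.
    m = re.match(r"^v([0-9.]+)-([0-9]+)-g([0-9a-f]+)$", desc.strip())
    if m:
        return "{0}-{1}-{2}".format(*m.groups())
    return spack_version


print(version_from_git_describe("v0.13.3-912-g3519a1762"))
print(version_from_git_describe("v0.13.3"))
```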
@@ -214,7 +214,7 @@ def test_buildcache(mock_archive, tmpdir):
    stage.destroy()

    # Remove cached binary specs since we deleted the mirror
    bindist._cached_specs = None
    bindist._cached_specs = set()


def test_relocate_text(tmpdir):
@@ -927,8 +927,11 @@ def test_stage_create_replace_path(tmp_build_stage_dir):
    assert os.path.isdir(stage.path)


def test_cannot_access():
def test_cannot_access(capsys):
    """Ensure can_access dies with the expected error."""
    with pytest.raises(SystemExit, matches='Insufficient permissions'):
    with pytest.raises(SystemExit):
        # It's far more portable to use a non-existent filename.
        spack.stage.ensure_access('/no/such/file')

    captured = capsys.readouterr()
    assert 'Insufficient permissions' in str(captured)
@@ -313,7 +313,7 @@ _spack() {
    then
        SPACK_COMPREPLY="-h --help -H --all-help --color -C --config-scope -d --debug --timestamp --pdb -e --env -D --env-dir -E --no-env --use-env-repo -k --insecure -l --enable-locks -L --disable-locks -m --mock -p --profile --sorted-profile --lines -v --verbose --stacktrace -V --version --print-shell-vars"
    else
        SPACK_COMPREPLY="activate add arch blame bootstrap build build-env buildcache cd checksum ci clean clone commands compiler compilers concretize config configure create deactivate debug dependencies dependents deprecate dev-build diy docs edit env extensions fetch find flake8 gc gpg graph help info install license list load location log-parse maintainers mirror module patch pkg providers pydoc python reindex remove rm repo resource restage setup spec stage test uninstall unload upload-s3 url verify versions view"
        SPACK_COMPREPLY="activate add arch blame bootstrap build build-env buildcache cd checksum ci clean clone commands compiler compilers concretize config configure containerize create deactivate debug dependencies dependents deprecate dev-build diy docs edit env extensions fetch find flake8 gc gpg graph help info install license list load location log-parse maintainers mirror module patch pkg providers pydoc python reindex remove rm repo resource restage setup spec stage test uninstall unload upload-s3 url verify versions view"
    fi
}

@@ -400,7 +400,7 @@ _spack_buildcache_install() {
_spack_buildcache_list() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -l --long -L --very-long -v --variants -f --force"
        SPACK_COMPREPLY="-h --help -l --long -L --very-long -v --variants -f --force -a --allarch"
    else
        _all_packages
    fi
@@ -628,6 +628,10 @@ _spack_configure() {
    fi
}

_spack_containerize() {
    SPACK_COMPREPLY="-h --help"
}

_spack_create() {
    if $list_options
    then
@@ -941,7 +945,7 @@ _spack_info() {
_spack_install() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help --only -u --until -j --jobs --overwrite --keep-prefix --keep-stage --dont-restage --use-cache --no-cache --cache-only --show-log-on-error --source -n --no-checksum -v --verbose --fake --only-concrete -f --file --upstream -g --global --clean --dirty --test --run-tests --log-format --log-file --help-cdash --cdash-upload-url --cdash-build --cdash-site --cdash-track --cdash-buildstamp -y --yes-to-all"
        SPACK_COMPREPLY="-h --help --only -u --until -j --jobs --overwrite --keep-prefix --keep-stage --dont-restage --use-cache --no-cache --cache-only --no-check-signature --show-log-on-error --source -n --no-checksum -v --verbose --fake --only-concrete -f --file --clean --dirty --test --run-tests --log-format --log-file --help-cdash --cdash-upload-url --cdash-build --cdash-site --cdash-track --cdash-buildstamp -y --yes-to-all"
    else
        _all_packages
    fi
@@ -1415,7 +1419,7 @@ _spack_test() {
_spack_uninstall() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -f --force -R --dependents -y --yes-to-all -a --all -u --upstream -g --global"
        SPACK_COMPREPLY="-h --help -f --force -R --dependents -y --yes-to-all -a --all"
    else
        _installed_packages
    fi
51	share/spack/templates/container/Dockerfile	Normal file
@@ -0,0 +1,51 @@
# Build stage with Spack pre-installed and ready to be used
FROM {{ build.image }}:{{ build.tag }} as builder

# What we want to install and how we want to install it
# is specified in a manifest file (spack.yaml)
RUN mkdir {{ paths.environment }} \
{{ manifest }} > {{ paths.environment }}/spack.yaml

# Install the software, remove unecessary deps
RUN cd {{ paths.environment }} && spack install && spack gc -y
{% if strip %}

# Strip all the binaries
RUN find -L {{ paths.view }}/* -type f -exec readlink -f '{}' \; | \
    xargs file -i | \
    grep 'charset=binary' | \
    grep 'x-executable\|x-archive\|x-sharedlib' | \
    awk -F: '{print $1}' | xargs strip -s
{% endif %}

# Modifications to the environment that are necessary to run
RUN cd {{ paths.environment }} && \
    spack env activate --sh -d . >> /etc/profile.d/z10_spack_environment.sh

{% if extra_instructions.build %}
{{ extra_instructions.build }}
{% endif %}

# Bare OS image to run the installed executables
FROM {{ run.image }}

COPY --from=builder {{ paths.environment }} {{ paths.environment }}
COPY --from=builder {{ paths.store }} {{ paths.store }}
COPY --from=builder {{ paths.view }} {{ paths.view }}
COPY --from=builder /etc/profile.d/z10_spack_environment.sh /etc/profile.d/z10_spack_environment.sh

{% if os_packages %}
RUN {{ os_packages.update }} \
 && {{ os_packages.install }}{% for pkg in os_packages.list %} {{ pkg }}{% endfor %} \
 && {{ os_packages.clean }}
{% endif %}

{% if extra_instructions.final %}
{{ extra_instructions.final }}
{% endif %}

{% for label, value in labels.items() %}
LABEL "{{ label }}"="{{ value }}"
{% endfor %}

ENTRYPOINT ["/bin/bash", "--rcfile", "/etc/profile", "-l"]
90	share/spack/templates/container/singularity.def	Normal file
@@ -0,0 +1,90 @@
Bootstrap: docker
From: {{ build.image }}:{{ build.tag }}
Stage: build

%post
  # Create the manifest file for the installation in /opt/spack-environment
  mkdir {{ paths.environment }} && cd {{ paths.environment }}
  cat << EOF > spack.yaml
{{ manifest }}
EOF

  # Install all the required software
  . /opt/spack/share/spack/setup-env.sh
  spack install
  spack gc -y
  spack env activate --sh -d . >> {{ paths.environment }}/environment_modifications.sh
{% if strip %}

  # Strip the binaries to reduce the size of the image
  find -L {{ paths.view }}/* -type f -exec readlink -f '{}' \; | \
    xargs file -i | \
    grep 'charset=binary' | \
    grep 'x-executable\|x-archive\|x-sharedlib' | \
    awk -F: '{print $1}' | xargs strip -s
{% endif %}
{% if extra_instructions.build %}
{{ extra_instructions.build }}
{% endif %}


{% if apps %}
{% for application, help_text in apps.items() %}

%apprun {{ application }}
    exec /opt/view/bin/{{ application }} "$@"

%apphelp {{ application }}
    {{help_text }}
{% endfor %}
{% endif %}

Bootstrap: docker
From: {{ run.image }}
Stage: final

%files from build
  {{ paths.environment }} /opt
  {{ paths.store }} /opt
  {{ paths.view }} /opt
  {{ paths.environment }}/environment_modifications.sh {{ paths.environment }}/environment_modifications.sh

%post
{% if os_packages.list %}
  # Update, install and cleanup of system packages
  {{ os_packages.update }}
  {{ os_packages.install }} {{ os_packages.list | join | replace('\n', ' ') }}
  {{ os_packages.clean }}
{% endif %}
  # Modify the environment without relying on sourcing shell specific files at startup
  cat {{ paths.environment }}/environment_modifications.sh >> $SINGULARITY_ENVIRONMENT
{% if extra_instructions.final %}
{{ extra_instructions.final }}
{% endif %}

{% if runscript %}
%runscript
{{ runscript }}
{% endif %}

{% if startscript %}
%startscript
{{ startscript }}
{% endif %}

{% if test %}
%test
{{ test }}
{% endif %}

{% if help %}
%help
{{ help }}
{% endif %}

{% if labels %}
%labels
{% for label, value in labels.items() %}
{{ label }} {{ value }}
{% endfor %}
{% endif %}
15	var/spack/repos/builtin.mock/packages/hdf5/package.py	Normal file
@@ -0,0 +1,15 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)


class Hdf5(Package):
    homepage = "http://www.llnl.gov"
    url = "http://www.llnl.gov/hdf5-1.0.tar.gz"

    version(2.3, 'foobarbaz')

    variant('mpi', default=True, description='Debug variant')

    depends_on('mpi', when='mpi')
@@ -71,7 +71,7 @@ class Abinit(AutotoolsPackage):
    # depends_on('elpa~openmp', when='+elpa+mpi~openmp')
    # depends_on('elpa+openmp', when='+elpa+mpi+openmp')

    depends_on('fftw precision=float')
    depends_on('fftw precision=float,double')
    depends_on('fftw~openmp', when='~openmp')
    depends_on('fftw+openmp', when='+openmp')
@@ -34,6 +34,7 @@ class ActsCore(CMakePackage):
    maintainers = ['HadrienG2']

    version('develop', branch='master')
    version('0.16.0', commit='b3d965fe0b8ae335909d79114ef261c6b996773a')
    version('0.15.0', commit='267c28f69c561e64369661a6235b03b5a610d6da')
    version('0.14.0', commit='38d678fcb205b77d60326eae913fbb1b054acea1')
    version('0.13.0', commit='b33f7270ddbbb33050b7ec60b4fa255dc2bfdc88')
@@ -58,16 +59,18 @@ class ActsCore(CMakePackage):
    version('0.08.0', commit='99eedb38f305e3a1cd99d9b4473241b7cd641fa9')

    # Variants that affect the core ACTS library
    variant('legacy', default=False, description='Build the Legacy package')
    variant('benchmarks', default=False, description='Build the performance benchmarks')
    variant('examples', default=False, description='Build the examples')
    variant('tests', default=False, description='Build the unit tests')
    variant('integration_tests', default=False, description='Build the integration tests')

    # Variants the enable / disable ACTS plugins
    variant('digitization', default=False, description='Build the geometric digitization plugin')
    variant('dd4hep', default=False, description='Build the DD4hep plugin')
    variant('digitization', default=False, description='Build the geometric digitization plugin')
    variant('fatras', default=False, description='Build the FAst TRAcking Simulation package')
    variant('identification', default=False, description='Build the Identification plugin')
    variant('json', default=False, description='Build the Json plugin')
    variant('legacy', default=False, description='Build the Legacy package')
    variant('tgeo', default=False, description='Build the TGeo plugin')

    depends_on('cmake @3.11:', type='build')
@@ -76,8 +79,8 @@ class ActsCore(CMakePackage):
    depends_on('eigen @3.2.9:', type='build')
    depends_on('nlohmann-json @3.2.0:', when='@0.14.0: +json')
    depends_on('root @6.10: cxxstd=14', when='+tgeo @:0.8.0')
    depends_on('root @6.10:', when='+tgeo @0.8.1:')
    depends_on('dd4hep @1.2:', when='+dd4hep')
    depends_on('root @6.10: cxxstd=17', when='+tgeo @0.8.1:')
    depends_on('dd4hep @1.2: +xercesc', when='+dd4hep')

    def cmake_args(self):
        spec = self.spec
@@ -86,15 +89,23 @@ def cmake_variant(cmake_label, spack_variant):
            enabled = spec.satisfies('+' + spack_variant)
            return "-DACTS_BUILD_{0}={1}".format(cmake_label, enabled)

        integration_tests_label = "INTEGRATIONTESTS"
        tests_label = "UNITTESTS"
        if spec.satisfies('@:0.15.99'):
            integration_tests_label = "INTEGRATION_TESTS"
            tests_label = "TESTS"

        args = [
            cmake_variant("LEGACY", "legacy"),
            cmake_variant("BENCHMARKS", "benchmarks"),
            cmake_variant("EXAMPLES", "examples"),
            cmake_variant("TESTS", "tests"),
            cmake_variant("INTEGRATION_TESTS", "integration_tests"),
            cmake_variant(tests_label, "tests"),
            cmake_variant(integration_tests_label, "integration_tests"),
            cmake_variant("DIGITIZATION_PLUGIN", "digitization"),
            cmake_variant("DD4HEP_PLUGIN", "dd4hep"),
            cmake_variant("FATRAS", "fatras"),
            cmake_variant("IDENTIFICATION_PLUGIN", "identification"),
            cmake_variant("JSON_PLUGIN", "json"),
            cmake_variant("LEGACY", "legacy"),
            cmake_variant("TGEO_PLUGIN", "tgeo")
        ]
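The hunk above computes the CMake label at runtime because older ACTS releases (up to 0.15.x) named the options `TESTS` / `INTEGRATION_TESTS` while newer ones use `UNITTESTS` / `INTEGRATIONTESTS`. A standalone sketch of the flag formatting (here `enabled` is passed in directly; in the package it comes from `spec.satisfies('+' + spack_variant)`):

```python
def cmake_variant(cmake_label, enabled):
    # Standalone version of the nested helper in the hunk above: turn a
    # CMake option label and a boolean into a -DACTS_BUILD_* definition.
    return "-DACTS_BUILD_{0}={1}".format(cmake_label, enabled)


# Newer releases use the UNITTESTS / INTEGRATIONTESTS spellings,
# which is why the labels are selected before building the args list.
print(cmake_variant("UNITTESTS", True))
print(cmake_variant("INTEGRATIONTESTS", False))
```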
92	var/spack/repos/builtin/packages/akantu/package.py	Normal file
@@ -0,0 +1,92 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Akantu(CMakePackage):
    """
    Akantu means a little element in Kinyarwanda, a Bantu language. From now
    on it is also an opensource object-oriented Finite Element library which
    has the ambition to be generic and efficient.

    """
    homepage = "https://akantu.ch"
    url = "https://gitlab.com/akantu/akantu/-/archive/v3.0.0/akantu-v3.0.0.tar.gz"
    git = "https://gitlab.com/akantu/akantu.git"

    maintainers = ['nrichart']

    version('master', branch='master')
    version('3.0.0', sha256='7e8f64e25956eba44def1b2d891f6db8ba824e4a82ff0d51d6b585b60ab465db')

    variant('external_solvers', values=any_combination_of('mumps', 'petsc'),
            description="Activates the implicit solver")
    variant('mpi', default=True,
            description="Activates parallel capabilities")
    variant('python', default=False,
            description="Activates python bindings")

    depends_on('boost@:1.66', when='@:3.0.99')
    depends_on('boost')
    depends_on('lapack')
    depends_on('cmake@3.5.1:', type='build')
    depends_on('python', when='+python', type=('build', 'run'))
    depends_on('py-numpy', when='+python', type=('build', 'run'))
    depends_on('py-scipy', when='+python', type=('build', 'run'))
    depends_on('py-pybind11', when='@3.1:+python', type=('build', 'run'))

    depends_on('mumps', when='~mpi external_solvers=mumps')
    depends_on('mumps+mpi', when='+mpi external_solvers=mumps')
    depends_on('netlib-scalapack', when='+mpi external_solvers=mumps')
    depends_on('petsc+double', when='~mpi external_solvers=petsc')
    depends_on('petsc+double+mpi', when='+mpi external_solvers=petsc')

    depends_on('mpi', when='+mpi')
    depends_on('scotch', when='+mpi')

    extends('python', when='+python')

    conflicts('gcc@:5.3.99')
    conflicts('@:3.0.99 external_solvers=petsc')
    conflicts('@:3.0.99 +python')

    def cmake_args(self):
        spec = self.spec

        args = [
            '-DAKANTU_COHESIVE_ELEMENT:BOOL=ON',
            '-DAKANTU_DAMAGE_NON_LOCAL:BOOL=ON',
            '-DAKANTU_HEAT_TRANSFER:BOOL=ON',
            '-DAKANTU_SOLID_MECHANICS:BOOL=ON',
            '-DAKANTU_STRUCTURAL_MECHANICS:BOOL=OFF',
            '-DAKANTU_PARALLEL:BOOL={0}'.format(
                'ON' if spec.satisfies('+mpi') else 'OFF'),
            '-DAKANTU_PYTHON_INTERFACE:BOOL={0}'.format(
                'ON' if spec.satisfies('+python') else 'OFF'),
        ]

        if spec.satisfies('@:3.0.99'):
            args.extend(['-DCMAKE_CXX_FLAGS=-Wno-class-memaccess',
                         '-DAKANTU_TRACTION_AT_SPLIT_NODE_CONTACT:BOOL=OFF'])
        else:
            args.append('-DAKANTU_TRACTION_AT_SPLIT_NODE_CONTACT:BOOL=ON')

        solvers = []
        if spec.satisfies('external_solvers=mumps'):
            solvers.append('Mumps')
            args.append('-DMUMPS_DIR:PATH=${0}'.format(spec['mumps'].prefix))
        if spec.satisfies('external_solvers=petsc'):
            solvers.append('PETSc')

        if len(solvers) > 0:
            args.extend([
                '-DAKANTU_IMPLICIT_SOLVER:STRING={0}'.format(
                    '+'.join(solvers)),
                '-DAKANTU_IMPLICIT:BOOL=ON'])
        else:
            args.append('-DAKANTU_IMPLICIT:BOOL=OFF')

        return args
@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import os
from spack import *


@@ -45,4 +46,13 @@ def cmake_args(self):
            '-DALUMINUM_ENABLE_CUDA:BOOL=%s' % ('+gpu' in spec),
            '-DALUMINUM_ENABLE_MPI_CUDA:BOOL=%s' % ('+mpi_cuda' in spec),
            '-DALUMINUM_ENABLE_NCCL:BOOL=%s' % ('+nccl' in spec)]

        # Add support for OS X to find OpenMP
        if (self.spec.satisfies('%clang platform=darwin')):
            clang = self.compiler.cc
            clang_bin = os.path.dirname(clang)
            clang_root = os.path.dirname(clang_bin)
            args.extend([
                '-DOpenMP_DIR={0}'.format(clang_root)])

        return args
22	var/spack/repos/builtin/packages/amdblis/package.py	Normal file
@@ -0,0 +1,22 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.pkg.builtin.blis import BlisBase


class Amdblis(BlisBase):
    """AMD Optimized BLIS.

    BLIS is a portable software framework for instantiating high-performance
    BLAS-like dense linear algebra libraries. The framework was designed to
    isolate essential kernels of computation that, when optimized, immediately
    enable optimized implementations of most of its commonly used and
    computationally intensive operations.
    """

    homepage = "https://developer.amd.com/amd-aocl/blas-library/"
    url = "https://github.com/amd/blis/archive/2.1.tar.gz"
    git = "https://github.com/amd/blis.git"

    version('2.1', sha256='3b1d611d46f0f13b3c0917e27012e0f789b23dbefdddcf877b20327552d72fb3')
@@ -15,6 +15,13 @@ class Ant(Package):
    homepage = "http://ant.apache.org/"
    url = "https://archive.apache.org/dist/ant/source/apache-ant-1.9.7-src.tar.gz"

    version('1.10.7', sha256='2f9c4ef094581663b41a7412324f65b854f17622e5b2da9fcb9541ca8737bd52')
    version('1.10.6', sha256='c641721ae844196b28780e7999d2ae886085b89433438ab797d531413a924311')
    version('1.10.5', sha256='5937cf11d74d75d6e8927402950b012e037e362f9f728262ce432ad289b9f6ca')
    version('1.10.4', sha256='b0718c6c1b2b8d3bc77cd1e30ea183cd7741cfb52222a97c754e02b8e38d1948')
    version('1.10.3', sha256='988b0cac947559f7347f314b9a3dae1af0dfdcc254de56d1469de005bf281c5a')
    version('1.10.2', sha256='f3cf217b9befae2fef7198b51911e33a8809d98887cc971c8957596f459c6285')
    version('1.10.1', sha256='68f7ced0aa15d1f9f672f23d67c86deaf728e9576936313cfbff4f7a0e6ce382')
    version('1.10.0', sha256='1f78036c38753880e16fb755516c8070187a78fe4b2e99b59eda5b81b58eccaf')
    version('1.9.9', sha256='d6a0c93777ab27db36212d77c5733ac80d17fe24e83f947df23a8e0ad4ac48cc')
    version('1.9.8', sha256='5f4daf56e66fc7a71de772920ca27c15eac80cf1fcf41f3b4f2d535724942681')

@@ -11,6 +11,10 @@ class Atop(Package):
    homepage = "http://www.atoptool.nl/index.php"
    url = "http://www.atoptool.nl/download/atop-2.2-3.tar.gz"

    version('2.5.0', sha256='4b911057ce50463b6e8b3016c5963d48535c0cddeebc6eda817e292b22f93f33')
    version('2.4.0', sha256='be1c010a77086b7d98376fce96514afcd73c3f20a8d1fe01520899ff69a73d69')
    version('2.3.0', sha256='73e4725de0bafac8c63b032e8479e2305e3962afbe977ec1abd45f9e104eb264')
    version('2.2.6', sha256='d0386840ee4df36e5d0ad55f144661b434d9ad35d94deadc0405b514485db615')
    version('2.2-3', sha256='c785b8a2355be28b3de6b58a8ea4c4fcab8fadeaa57a99afeb03c66fac8e055d')

    depends_on('zlib')

@@ -0,0 +1,17 @@
diff -Naur a/setup.py b/setup.py
--- a/setup.py	2020-02-06 15:40:26.000000000 -0600
+++ b/setup.py	2020-02-06 15:41:17.000000000 -0600
@@ -27,10 +27,12 @@
     "future>=0.16.0,<=0.18.2",
     "tabulate>=0.8.2,<=0.8.3",
     "ipaddress>=1.0.22",
-    "enum34>=1.1.6",
     "PyYAML>=5.1.2",
 ]

+if sys.version_info < (3, 4):
+    REQUIRES.append("enum34>=1.1.6")
+
 if sys.version_info[0] == 2:
     REQUIRES.append("configparser>=3.5.0,<=3.8.1")

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *
import os


class AwsParallelcluster(PythonPackage):
@@ -12,14 +13,21 @@ class AwsParallelcluster(PythonPackage):

    homepage = "https://github.com/aws/aws-parallelcluster"
    url = "https://pypi.io/packages/source/a/aws-parallelcluster/aws-parallelcluster-2.5.1.tar.gz"
    maintainers = ['sean-smith', 'demartinofra', 'enrico-usai',
                   'lukeseawalker', 'rexcsn', 'ddeidda', 'tilne']

    maintainers = [
        'sean-smith', 'demartinofra', 'enrico-usai', 'lukeseawalker', 'rexcsn',
        'ddeidda', 'tilne'
    ]
    import_modules = [
        'pcluster', 'awsbatch', 'pcluster.dcv', 'pcluster.configure',
        'pcluster.config', 'pcluster.networking'
    ]

    version('2.5.1', sha256='4fd6e14583f8cf81f9e4aa1d6188e3708d3d14e6ae252de0a94caaf58be76303')
    version('2.5.0', sha256='3b0209342ea0d9d8cc95505456103ad87c2d4e35771aa838765918194efd0ad3')

    depends_on('python@2.7:', type=('build', 'run'))
    depends_on('py-setuptools', type='build')
    depends_on('py-setuptools', type=('build', 'run'))
    depends_on('py-boto3@1.10.15:', type=('build', 'run'))
    depends_on('py-future@0.16.0:0.18.2', type=('build', 'run'))
    depends_on('py-tabulate@0.8.2:0.8.3', type=('build', 'run'))
@@ -27,3 +35,15 @@ class AwsParallelcluster(PythonPackage):
    depends_on('py-enum34@1.1.6:', when='^python@:3.3', type=('build', 'run'))
    depends_on('py-pyyaml@5.1.2:', type=('build', 'run'))
    depends_on('py-configparser@3.5.0:3.8.1', when='^python@:2', type=('build', 'run'))

    # https://github.com/aws/aws-parallelcluster/pull/1633
    patch('enum34.patch', when='@:2.5.1')

    @run_after('install')
    @on_package_attributes(run_tests=True)
    def install_test(self):
        # Make sure executables work
        for exe in ['awsbhosts', 'awsbkill', 'awsbout', 'awsbqueues',
                    'awsbstat', 'awsbsub', 'pcluster']:
            exe = Executable(os.path.join(self.prefix.bin, exe))
            exe('--help')

@@ -3,38 +3,16 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *

# Although this looks like an Autotools package, it's not one. Refer to:
# https://github.com/flame/blis/issues/17
# https://github.com/flame/blis/issues/195
# https://github.com/flame/blis/issues/197


class Blis(Package):
    """BLIS is a portable software framework for instantiating high-performance
    BLAS-like dense linear algebra libraries. The framework was designed to
    isolate essential kernels of computation that, when optimized, immediately
    enable optimized implementations of most of its commonly used and
    computationally intensive operations. BLIS is written in ISO C99 and
    available under a new/modified/3-clause BSD license. While BLIS exports a
    new BLAS-like API, it also includes a BLAS compatibility layer which gives
    application developers access to BLIS implementations via traditional BLAS
    routine calls. An object-based API unique to BLIS is also available."""

    homepage = "https://github.com/flame/blis"
    url = "https://github.com/flame/blis/archive/0.4.0.tar.gz"
    git = "https://github.com/flame/blis.git"

    version('master', branch='master')
    version('0.6.0', sha256='ad5765cc3f492d0c663f494850dafc4d72f901c332eb442f404814ff2995e5a9')
    version('0.5.0', sha256='1a004d69c139e8a0448c6a6007863af3a8c3551b8d9b8b73fe08e8009f165fa8')
    version('0.4.0', sha256='9c7efd75365a833614c01b5adfba93210f869d92e7649e0b5d9edc93fc20ea76')
    version('0.3.2', sha256='b87e42c73a06107d647a890cbf12855925777dc7124b0c7698b90c5effa7f58f')
    version('0.3.1', sha256='957f28d47c5cf71ffc62ce8cc1277e17e44d305b1c2fa8506b0b55617a9f28e4')
    version('0.3.0', sha256='d34d17df7bdc2be8771fe0b7f867109fd10437ac91e2a29000a4a23164c7f0da')
    version('0.2.2', sha256='4a7ecb56034fb20e9d1d8b16e2ef587abbc3d30cb728e70629ca7e795a7998e8')

class BlisBase(Package):
    """Base class for building BLIS, shared with the AMD optimized version
    of the library in the 'amdblis' package.
    """
    depends_on('python@2.7:2.8,3.4:', type=('build', 'run'))

    variant(
@@ -73,10 +51,6 @@ class Blis(Package):
    provides('blas', when="+blas")
    provides('blas', when="+cblas")

    # Problems with permissions on installed libraries:
    # https://github.com/flame/blis/issues/343
    patch('Makefile_0.6.0.patch', when='@0.4.0:0.6.0')

    phases = ['configure', 'build', 'install']

    def configure(self, spec, prefix):
@@ -127,3 +101,36 @@ def darwin_fix(self):
        # The shared library is not installed correctly on Darwin; fix this
        if self.spec.satisfies('platform=darwin'):
            fix_darwin_install_name(self.prefix.lib)


class Blis(BlisBase):
    """BLIS is a portable software framework for instantiating high-performance
    BLAS-like dense linear algebra libraries.

    The framework was designed to isolate essential kernels of computation
    that, when optimized, immediately enable optimized implementations of
    most of its commonly used and computationally intensive operations. BLIS
    is written in ISO C99 and available under a new/modified/3-clause BSD
    license. While BLIS exports a new BLAS-like API, it also includes a
    BLAS compatibility layer which gives application developers access to
    BLIS implementations via traditional BLAS routine calls.
    An object-based API unique to BLIS is also available.
    """

    homepage = "https://github.com/flame/blis"
    url = "https://github.com/flame/blis/archive/0.4.0.tar.gz"
    git = "https://github.com/flame/blis.git"

    version('master', branch='master')
    version('0.6.1', sha256='76b22f29b7789cf117c0873d2a6b2a6d61f903869168148f2e7306353c105c37')
    version('0.6.0', sha256='ad5765cc3f492d0c663f494850dafc4d72f901c332eb442f404814ff2995e5a9')
    version('0.5.0', sha256='1a004d69c139e8a0448c6a6007863af3a8c3551b8d9b8b73fe08e8009f165fa8')
    version('0.4.0', sha256='9c7efd75365a833614c01b5adfba93210f869d92e7649e0b5d9edc93fc20ea76')
    version('0.3.2', sha256='b87e42c73a06107d647a890cbf12855925777dc7124b0c7698b90c5effa7f58f')
    version('0.3.1', sha256='957f28d47c5cf71ffc62ce8cc1277e17e44d305b1c2fa8506b0b55617a9f28e4')
    version('0.3.0', sha256='d34d17df7bdc2be8771fe0b7f867109fd10437ac91e2a29000a4a23164c7f0da')
    version('0.2.2', sha256='4a7ecb56034fb20e9d1d8b16e2ef587abbc3d30cb728e70629ca7e795a7998e8')

    # Problems with permissions on installed libraries:
    # https://github.com/flame/blis/issues/343
    patch('Makefile_0.6.0.patch', when='@0.4.0:0.6.0')

@@ -22,8 +22,11 @@ class Boost(Package):
    git = "https://github.com/boostorg/boost.git"
    list_url = "http://sourceforge.net/projects/boost/files/boost/"
    list_depth = 1
    maintainers = ['hainest']

    version('develop', branch='develop', submodules=True)
    version('1.72.0', sha256='59c9b274bc451cf91a9ba1dd2c7fdcaf5d60b1b3aa83f2c9fa143417cc660722')
    version('1.71.0', sha256='d73a8da01e8bf8c7eda40b4c84915071a8c8a0df4a6734537ddde4a8580524ee')
    version('1.70.0', sha256='430ae8354789de4fd19ee52f3b1f739e1fba576f0aded0897c3c2bc00fb38778')
    version('1.69.0', sha256='8f32d4617390d1c2d16f26a27ab60d97807b35440d45891fa340fc2648b04406')
    version('1.68.0', sha256='7f6130bc3cf65f56a618888ce9d5ea704fa10b462be126ad053e80e553d6d8b7')
@@ -205,7 +208,7 @@ def libs(self):

    # Add option to C/C++ compile commands in clang-linux.jam
    patch('clang-linux_add_option.patch', when='@1.56.0:1.63.0')
    patch('clang-linux_add_option2.patch', when='@:1.55.0')
    patch('clang-linux_add_option2.patch', when='@1.47.0:1.55.0')

    def url_for_version(self, version):
        if version >= Version('1.63.0'):
@@ -216,9 +219,6 @@ def url_for_version(self, version):
        return url.format(version.dotted, version.underscored)

    def determine_toolset(self, spec):
        if spec.satisfies("platform=darwin"):
            return 'darwin'

        toolsets = {'g++': 'gcc',
                    'icpc': 'intel',
                    'clang++': 'clang',

@@ -19,6 +19,8 @@ class Clingo(CMakePackage):
    homepage = "https://potassco.org/clingo/"
    url = "https://github.com/potassco/clingo/archive/v5.2.2.tar.gz"

    version('5.4.0', sha256='e2de331ee0a6d254193aab5995338a621372517adcf91568092be8ac511c18f3')
    version('5.3.0', sha256='b0d406d2809352caef7fccf69e8864d55e81ee84f4888b0744894977f703f976')
    version('5.2.2', sha256='da1ef8142e75c5a6f23c9403b90d4f40b9f862969ba71e2aaee9a257d058bfcf')

    depends_on('doxygen', type=('build'))

@@ -16,6 +16,8 @@ class Cln(AutotoolsPackage):
    homepage = "https://www.ginac.de/CLN/"
    git = "git://www.ginac.de/cln.git"

    version('1.3.6', commit='d4ba1cc869be2c647c4ab48ac571b1fc9c2021a9')
    version('1.3.5', commit='b221c033c082b462455502b7e63702a9c466aede')
    version('1.3.4', commit='9b86a7fc69feb1b288469982001af565f73057eb')
    version('1.3.3', commit='1c9bd61ff0b89b0bf8030e44cb398e8f75112222')
    version('1.3.2', commit='00817f7b60a961b860f6d305ac82dd51b70d6ba6')

@@ -101,7 +101,8 @@ class Cmake(Package):
    depends_on('zlib', when='~ownlibs')
    depends_on('bzip2', when='~ownlibs')
    depends_on('xz', when='~ownlibs')
    depends_on('libarchive', when='~ownlibs')
    depends_on('libarchive@3.1.0:', when='~ownlibs')
    depends_on('libarchive@3.3.3:', when='@3.15.0:~ownlibs')
    depends_on('libuv@1.0.0:1.10.99', when='@3.7.0:3.10.3~ownlibs')
    depends_on('libuv@1.10.0:1.10.99', when='@3.11.0:3.11.99~ownlibs')
    depends_on('libuv@1.10.0:', when='@3.12.0:~ownlibs')

@@ -15,6 +15,8 @@ class Coreutils(AutotoolsPackage, GNUMirrorPackage):
    homepage = "http://www.gnu.org/software/coreutils/"
    gnu_mirror_path = "coreutils/coreutils-8.26.tar.xz"

    version('8.31', sha256='ff7a9c918edce6b4f4b2725e3f9b37b0c4d193531cac49a48b56c4d0d3a9e9fd')
    version('8.30', sha256='e831b3a86091496cdba720411f9748de81507798f6130adeaef872d206e1b057')
    version('8.29', sha256='92d0fa1c311cacefa89853bdb53c62f4110cdfda3820346b59cbd098f40f955e')
    version('8.26', sha256='155e94d748f8e2bc327c66e0cbebdb8d6ab265d2f37c3c928f7bf6c3beba9a8e')
    version('8.23', sha256='ec43ca5bcfc62242accb46b7f121f6b684ee21ecd7d075059bf650ff9e37b82d')

18
var/spack/repos/builtin/packages/cpio/package.py
Normal file
@@ -0,0 +1,18 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Cpio(AutotoolsPackage, GNUMirrorPackage):
    """GNU cpio copies files into or out of a cpio or tar archive. The
    archive can be another file on the disk, a magnetic tape, or a pipe.
    """
    homepage = "https://www.gnu.org/software/cpio/"
    gnu_mirror_path = "cpio/cpio-2.13.tar.gz"

    version('2.13', sha256='e87470d9c984317f658567c03bfefb6b0c829ff17dbf6b0de48d71a4c8f3db88')

    build_directory = 'spack-build'
43
var/spack/repos/builtin/packages/cray-libsci/package.py
Executable file
@@ -0,0 +1,43 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from llnl.util.filesystem import LibraryList
from spack import *
import os


class CrayLibsci(Package):
    """The Cray Scientific Libraries package, LibSci, is a collection of
    numerical routines optimized for best performance on Cray systems."""

    homepage = "http://www.nersc.gov/users/software/programming-libraries/math-libraries/libsci/"
    url = "http://www.nersc.gov/users/software/programming-libraries/math-libraries/libsci/"

    version("18.11.1.2")
    version("16.11.1")
    version("16.09.1")
    version('16.07.1')
    version("16.06.1")
    version("16.03.1")

    provides("blas")
    provides("lapack")
    provides("scalapack")

    # NOTE: Cray compiler wrappers already include linking for the following
    @property
    def blas_libs(self):
        return LibraryList(os.path.join(self.prefix.lib, 'libsci.so'))

    @property
    def lapack_libs(self):
        return self.blas_libs

    @property
    def scalapack_libs(self):
        return self.blas_libs

    def install(self, spec, prefix):
        raise NoBuildError(spec)
@@ -22,6 +22,10 @@ class Dd4hep(CMakePackage):
    version('1.11.0', commit='280c7d748d56a704699408ac8e57815d029b169a')
    version('1.10.0', commit='9835d1813c07d9d5850d1e68276c0171d1726801')

    # Workarounds for various TBB issues in DD4hep v1.11
    # See https://github.com/AIDASoft/DD4hep/pull/613 .
    patch('tbb-workarounds.patch', when='@1.11.0')

    variant('xercesc', default=False, description="Enable 'Detector Builders' based on XercesC")
    variant('geant4', default=False, description="Enable the simulation part based on Geant4")
    variant('testing', default=False, description="Enable and build tests")

@@ -0,0 +1,41 @@
diff --git a/DDDigi/CMakeLists.txt b/DDDigi/CMakeLists.txt
index e6fb1096..88eb5c92 100644
--- a/DDDigi/CMakeLists.txt
+++ b/DDDigi/CMakeLists.txt
@@ -34,12 +34,10 @@ target_include_directories(DDDigi

FIND_PACKAGE(TBB QUIET)
if(TBB_FOUND)
-  dd4hep_print( "|++> TBB_INCLUDE_DIR --> ${TBB_INCLUDE_DIR}")
-  dd4hep_print( "|++> TBB_LIBRARY --> ${TBB_LIBRARY}")
+  dd4hep_print( "|++> TBB_IMPORTED_TARGETS --> ${TBB_IMPORTED_TARGETS}")
  dd4hep_print( "|++> TBB found. DDDigi will run multi threaded.")
  target_compile_definitions(DDDigi PUBLIC DD4HEP_USE_TBB)
-  target_link_libraries(DDDigi ${TBB_LIBRARY})
-  target_include_directories(DDDigi ${TBB_INCLUDE_DIRS})
+  target_link_libraries(DDDigi PUBLIC ${TBB_IMPORTED_TARGETS})
else()
  dd4hep_print( "|++> TBB not found. DDDigi will only work single threaded.")
endif()
diff --git a/DDDigi/src/DigiKernel.cpp b/DDDigi/src/DigiKernel.cpp
index d62c6694..f2c2e86c 100644
--- a/DDDigi/src/DigiKernel.cpp
+++ b/DDDigi/src/DigiKernel.cpp
@@ -91,7 +91,7 @@ public:
    DigiEventAction* action = 0;
    Wrapper(DigiContext& c, DigiEventAction* a)
      : context(c), action(a) {}
-   Wrapper(Wrapper&& copy) = delete;
+   Wrapper(Wrapper&& copy) = default;
    Wrapper(const Wrapper& copy) = default;
    Wrapper& operator=(Wrapper&& copy) = delete;
    Wrapper& operator=(const Wrapper& copy) = delete;
@@ -111,7 +111,7 @@ class DigiKernel::Processor {
  DigiKernel& kernel;
public:
  Processor(DigiKernel& k) : kernel(k) {}
- Processor(Processor&& l) = delete;
+ Processor(Processor&& l) = default;
  Processor(const Processor& l) = default;
  void operator()() const {
    int todo = 1;

@@ -180,6 +180,12 @@ class Dealii(CMakePackage, CudaPackage):
          sha256='61f217744b70f352965be265d2f06e8c1276685e2944ca0a88b7297dd55755da',
          when='@9.0.1 ^boost@1.70.0:')

    # Fix TBB version check
    # https://github.com/dealii/dealii/pull/9208
    patch('https://github.com/dealii/dealii/commit/80b13fe5a2eaefc77fa8c9266566fa8a2de91edf.patch',
          sha256='6f876dc8eadafe2c4ec2a6673864fb451c6627ca80511b6e16f3c401946fdf33',
          when='@9.0.0:9.1.1')

    # check that the combination of variants makes sense
    # 64-bit BLAS:
    for p in ['openblas', 'intel-mkl', 'intel-parallel-studio+mkl']:

@@ -19,3 +19,8 @@ class Diffutils(AutotoolsPackage, GNUMirrorPackage):
    build_directory = 'spack-build'

    depends_on('libiconv')

    def setup_build_environment(self, env):
        if self.spec.satisfies('%fj'):
            env.append_flags('CFLAGS',
                             '-Qunused-arguments')

@@ -16,6 +16,10 @@ class Diy(CMakePackage):
    version('3.5.0', sha256='b3b5490441d521b6e9b33471c782948194bf95c7c3df3eb97bc5cf4530b91576')
    version('master', branch='master')

    depends_on('mpi')

    def cmake_args(self):
        args = ['-Dbuild_examples=off', '-Dbuild_tests=off']
        args = ['-Dbuild_examples=off',
                '-Dbuild_tests=off',
                '-DCMAKE_CXX_COMPILER=%s' % self.spec['mpi'].mpicxx]
        return args

62
var/spack/repos/builtin/packages/draco/d710-python2.patch
Normal file
@@ -0,0 +1,62 @@
diff --git a/config/ApplicationUnitTest.cmake b/config/ApplicationUnitTest.cmake
index a0a79858..0c47b72a 100644
--- a/config/ApplicationUnitTest.cmake
+++ b/config/ApplicationUnitTest.cmake
@@ -249,7 +249,7 @@ macro( aut_register_test )
  endif(VERBOSE_DEBUG)

  # Look for python, which is used to drive application unit tests
- if( NOT PYTHONINTERP_FOUND )
+ if( NOT Python_Interpreter_FOUND )
    # python should have been found when vendor_libraries.cmake was run.
    message( FATAL_ERROR "Draco requires python. Python not found in PATH.")
  endif()
@@ -289,7 +289,7 @@ macro( aut_register_test )
  if (${PYTHON_TEST})
    add_test(
      NAME ${ctestname_base}${argname}
-     COMMAND "${PYTHON_EXECUTABLE}"
+     COMMAND "${Python_EXECUTABLE}"
      ${aut_DRIVER}
      ${SHARED_ARGUMENTS}
      )
diff --git a/config/draco-config-install.cmake.in b/config/draco-config-install.cmake.in
index c5bf1c75..a16f72f4 100644
--- a/config/draco-config-install.cmake.in
+++ b/config/draco-config-install.cmake.in
@@ -107,8 +107,9 @@ set( WITH_CUDA "@WITH_CUDA@" )
#endif()

# Python
-set( PYTHONINTERP_FOUND "@PYTHONINTERP_FOUND@" )
-set( PYTHON_EXECUTABLE "@PYTHON_EXECUTABLE@" )
+set( Python_FOUND "@Python_FOUND@" )
+set( Python_Interpreter_FOUND "@Python_Interpreter_FOUND@" )
+set( Python_EXECUTABLE "@Python_EXECUTABLE@" )

## ---------------------------------------------------------------------------
## Set useful general variables
diff --git a/config/vendor_libraries.cmake b/config/vendor_libraries.cmake
index c3e079bc..6b393eb4 100644
--- a/config/vendor_libraries.cmake
+++ b/config/vendor_libraries.cmake
@@ -16,7 +16,7 @@ include( setupMPI ) # defines the macros setupMPILibrariesUnix|Windows
macro( setupPython )

  message( STATUS "Looking for Python...." )
- find_package(PythonInterp 2.7 QUIET REQUIRED)
+ find_package(Python 2.7 QUIET REQUIRED COMPONENTS Interpreter)
  # PYTHONINTERP_FOUND - Was the Python executable found
  # PYTHON_EXECUTABLE - path to the Python interpreter
  set_package_properties( PythonInterp PROPERTIES
@@ -25,8 +25,8 @@ macro( setupPython )
    TYPE REQUIRED
    PURPOSE "Required for running tests and accessing features that rely on matplotlib."
    )
- if( PYTHONINTERP_FOUND )
-   message( STATUS "Looking for Python....found ${PYTHON_EXECUTABLE}" )
+ if( Python_Interpreter_FOUND )
+   message( STATUS "Looking for Python....found ${Python_EXECUTABLE}" )
  else()
    message( STATUS "Looking for Python....not found" )
  endif()
62
var/spack/repos/builtin/packages/draco/d710.patch
Normal file
@@ -0,0 +1,62 @@
diff --git a/config/ApplicationUnitTest.cmake b/config/ApplicationUnitTest.cmake
index a0a79858..0c47b72a 100644
--- a/config/ApplicationUnitTest.cmake
+++ b/config/ApplicationUnitTest.cmake
@@ -249,7 +249,7 @@ macro( aut_register_test )
  endif(VERBOSE_DEBUG)

  # Look for python, which is used to drive application unit tests
- if( NOT PYTHONINTERP_FOUND )
+ if( NOT Python_Interpreter_FOUND )
    # python should have been found when vendor_libraries.cmake was run.
    message( FATAL_ERROR "Draco requires python. Python not found in PATH.")
  endif()
@@ -289,7 +289,7 @@ macro( aut_register_test )
  if (${PYTHON_TEST})
    add_test(
      NAME ${ctestname_base}${argname}
-     COMMAND "${PYTHON_EXECUTABLE}"
+     COMMAND "${Python_EXECUTABLE}"
      ${aut_DRIVER}
      ${SHARED_ARGUMENTS}
      )
diff --git a/config/draco-config-install.cmake.in b/config/draco-config-install.cmake.in
index c5bf1c75..a16f72f4 100644
--- a/config/draco-config-install.cmake.in
+++ b/config/draco-config-install.cmake.in
@@ -107,8 +107,9 @@ set( WITH_CUDA "@WITH_CUDA@" )
#endif()

# Python
-set( PYTHONINTERP_FOUND "@PYTHONINTERP_FOUND@" )
-set( PYTHON_EXECUTABLE "@PYTHON_EXECUTABLE@" )
+set( Python_FOUND "@Python_FOUND@" )
+set( Python_Interpreter_FOUND "@Python_Interpreter_FOUND@" )
+set( Python_EXECUTABLE "@Python_EXECUTABLE@" )

## ---------------------------------------------------------------------------
## Set useful general variables
diff --git a/config/vendor_libraries.cmake b/config/vendor_libraries.cmake
index c3e079bc..6b393eb4 100644
--- a/config/vendor_libraries.cmake
+++ b/config/vendor_libraries.cmake
@@ -16,7 +16,7 @@ include( setupMPI ) # defines the macros setupMPILibrariesUnix|Windows
macro( setupPython )

  message( STATUS "Looking for Python...." )
- find_package(PythonInterp 2.7 QUIET REQUIRED)
+ find_package(Python 3.5 QUIET REQUIRED COMPONENTS Interpreter)
  # PYTHONINTERP_FOUND - Was the Python executable found
  # PYTHON_EXECUTABLE - path to the Python interpreter
  set_package_properties( PythonInterp PROPERTIES
@@ -25,8 +25,8 @@ macro( setupPython )
    TYPE REQUIRED
    PURPOSE "Required for running tests and accessing features that rely on matplotlib."
    )
- if( PYTHONINTERP_FOUND )
-   message( STATUS "Looking for Python....found ${PYTHON_EXECUTABLE}" )
+ if( Python_Interpreter_FOUND )
+   message( STATUS "Looking for Python....found ${Python_EXECUTABLE}" )
  else()
    message( STATUS "Looking for Python....not found" )
  endif()
34
var/spack/repos/builtin/packages/draco/d730.patch
Normal file
@@ -0,0 +1,34 @@
diff --git a/config/platform_checks.cmake b/config/platform_checks.cmake
index c9841b0d..84bf07f5 100644
--- a/config/platform_checks.cmake
+++ b/config/platform_checks.cmake
@@ -85,6 +85,7 @@ macro( query_craype )
    set( CRAY_PE ON CACHE BOOL
      "Are we building in a Cray Programming Environment?")

+   if (FALSE)
    # We expect developers to use the Cray compiler wrappers (especially in
    # setupMPI.cmake). See also
    # https://cmake.org/cmake/help/latest/module/FindMPI.html
@@ -111,6 +112,7 @@ macro( query_craype )
      "Otherwise please email this error message and other related information to"
      " draco@lanl.gov.\n" )
    endif()
+   endif()
  message( STATUS
    "Looking to see if we are building in a Cray Environment..."
    "found version $ENV{CRAYPE_VERSION}.")

diff --git a/config/setupMPI.cmake b/config/setupMPI.cmake
index da522499..5b5e27c5 100644
--- a/config/setupMPI.cmake
+++ b/config/setupMPI.cmake
@@ -51,7 +51,7 @@ function( setMPIflavorVer )
    if( DEFINED ENV{CRAY_MPICH2_VER} )
      set( MPI_VERSION $ENV{CRAY_MPICH2_VER} )
    endif()
-  elseif( ${MPI_FLAVOR} STREQUAL "spectrum" )
+  elseif( "${MPI_FLAVOR}" STREQUAL "spectrum" )
    if( DEFINED ENV{LMOD_MPI_VERSION} )
      set( LMOD_MPI_VERSION $ENV{LMOD_MPI_VERSION} )
    endif()
21
var/spack/repos/builtin/packages/draco/d740.patch
Normal file
@@ -0,0 +1,21 @@
diff --git a/config/platform_checks.cmake b/config/platform_checks.cmake
index c9841b0d..aeecc767 100644
--- a/config/platform_checks.cmake
+++ b/config/platform_checks.cmake
@@ -88,6 +88,8 @@ macro( query_craype )
    # We expect developers to use the Cray compiler wrappers (especially in
    # setupMPI.cmake). See also
    # https://cmake.org/cmake/help/latest/module/FindMPI.html
+   if( NOT "$ENV{CXX}" MATCHES "/lib/spack/env/" )
+   # skip this check if building from within spack.
    if( NOT "$ENV{CXX}" MATCHES "CC$" OR
        NOT "$ENV{CC}" MATCHES "cc$" OR
        NOT "$ENV{FC}" MATCHES "ftn$" OR
@@ -110,6 +112,7 @@ macro( query_craype )
      " export CRAYPE_LINK_TYPE=dynamic\n"
      "Otherwise please email this error message and other related information to"
      " draco@lanl.gov.\n" )
+   endif()
    endif()
  message( STATUS
    "Looking to see if we are building in a Cray Environment..."
@@ -15,42 +15,52 @@ class Draco(CMakePackage):
    homepage = "https://github.com/lanl/draco"
    url = "https://github.com/lanl/Draco/archive/draco-7_1_0.zip"
    git = "https://github.com/lanl/Draco.git"
    maintainers = ['KineticTheory']

    version('develop', branch='develop')
    version('7.4.0', sha256='61da2c3feace0e92c5410c9e9e613708fdf8954b1367cdc62c415329b0ddab6e')
    version('7.3.0', sha256='dc47ef6c1e04769ea177a10fc6ddf506f3e1e8d36eb5d49f4bc38cc509e24f10')
    version('7.2.0', sha256='ac4eac03703d4b7344fa2390a54140533c5e1f6ea0d59ef1f1d525c434ebe639')
    version('7_1_0', sha256='eca6bb86eb930837fb5e09b76c85c200b2c1522267cc66f81f2ec11a8262b5c9')
    version('6_25_0', sha256='e27eba44f397e7d111ff9a45b518b186940f75facfc6f318d76bd0e72f987440')
    version('6_23_0', sha256='edf20308746c06647087cb4e6ae7656fd057a89091a22bcba8f17a52e28b7849')
    version('6_22_0', sha256='4d1ed54944450c4ec7d00d7ba371469506c6985922f48f780bae2580c9335b86')
    version('6_21_0', sha256='f1ac88041606cdb1dfddd3bc74db0f1e15d8fc9d0a1eed939c8aa0fa63a85b55')
    version('6_20_1', sha256='b1c51000c9557e0818014713fce70d681869c50ed9c4548dcfb2e9219c354ebe')
    version('6_20_0', sha256='a6e3142c1c90b09c4ff8057bfee974369b815122b01d1f7b57888dcb9b1128f6')
    version('7.1.0', sha256='eca6bb86eb930837fb5e09b76c85c200b2c1522267cc66f81f2ec11a8262b5c9')
    version('6.25.0', sha256='e27eba44f397e7d111ff9a45b518b186940f75facfc6f318d76bd0e72f987440')
    version('6.23.0', sha256='edf20308746c06647087cb4e6ae7656fd057a89091a22bcba8f17a52e28b7849')
    version('6.22.0', sha256='4d1ed54944450c4ec7d00d7ba371469506c6985922f48f780bae2580c9335b86')
    version('6.21.0', sha256='f1ac88041606cdb1dfddd3bc74db0f1e15d8fc9d0a1eed939c8aa0fa63a85b55')
    version('6.20.1', sha256='b1c51000c9557e0818014713fce70d681869c50ed9c4548dcfb2e9219c354ebe')
    version('6.20.0', sha256='a6e3142c1c90b09c4ff8057bfee974369b815122b01d1f7b57888dcb9b1128f6')

    variant('build_type', default='Release', description='CMake build type',
            values=('Debug', 'Release', 'RelWithDebInfo', 'MinSizeRel'))
    variant('eospac', default=False, description='Enable EOSPAC Support')
    variant('lapack', default=False, description='Enable LAPACK Wrapper')
    variant('parmetis', default=False, description='Enable Parmetis Support')
    variant('qt', default=False, description='Enable Qt Support')
    variant('superlu_dist', default=False, description='Enable SuperLU-DIST Support')
    variant('eospac', default=True, description='Enable EOSPAC support')
    variant('lapack', default=True, description='Enable LAPACK wrapper')
    variant('libquo', default=True, description='Enable Quo wrapper')
    variant('parmetis', default=True, description='Enable Parmetis support')
    variant('qt', default=False, description='Enable Qt support')
    variant('superlu_dist', default=True, description='Enable SuperLU-DIST support')

    depends_on('gsl', type=('build', 'link'))
    depends_on('gsl')
    depends_on('mpi@3:', type=('build', 'link', 'run'))
    depends_on('numdiff', type='build')
    depends_on('python@2.7:', type=('build', 'run'))
    depends_on('random123', type='build')

    depends_on('cmake@3.9:', when='@:6.99', type='build')
    depends_on('cmake@3.11:', when='@7.0.0:7.1.9', type='build')
    depends_on('cmake@3.11:', when='@7.0.0:7.1.99', type='build')
    depends_on('cmake@3.14:', when='@7.2:', type='build')
    depends_on('eospac@6.3:', when='+eospac', type=('build', 'link'))
    depends_on('lapack', when='+lapack', type=('build', 'link'))
    depends_on('metis', when='+parmetis', type=('build', 'link'))
    depends_on('parmetis', when='+parmetis', type=('build', 'link'))
    depends_on('eospac@6.3:', when='+eospac')
    depends_on('lapack', when='+lapack')
    depends_on('libquo', when='@7.4.0:+libquo')
    depends_on('metis', when='+parmetis')
    depends_on('parmetis', when='+parmetis')
    depends_on('qt', when='+qt',
               type=('build', 'link', 'run'))
    depends_on('superlu-dist@:5.99', when='+superlu-dist',
               type=('build', 'link'))
    depends_on('superlu-dist@:5.99', when='+superlu_dist')

    # Fix python discovery.
    patch('d710.patch', when='@7.1.0^python@3:')
    patch('d710-python2.patch', when='@7.1.0^python@2.7:2.99')
    patch('d730.patch', when='@7.3.0:7.3.99')
    patch('d740.patch', when='@7.4.0:7.4.99')

    def url_for_version(self, version):
        url = "https://github.com/lanl/Draco/archive/draco-{0}.zip"
@@ -63,11 +73,3 @@ def cmake_args(self):
            '-DBUILD_TESTING={0}'.format('ON' if self.run_tests else 'OFF')
|
||||
])
|
||||
return options
|
||||
|
||||
@run_after('build')
|
||||
@on_package_attributes(run_tests=True)
|
||||
def check(self):
|
||||
"""Run ctest after building project."""
|
||||
|
||||
with working_dir(self.build_directory):
|
||||
ctest()
|
||||
|
||||
@@ -15,6 +15,8 @@ class Enchant(AutotoolsPackage):
homepage = "https://abiword.github.io/enchant/"
url = "https://github.com/AbiWord/enchant/releases/download/v2.2.5/enchant-2.2.5.tar.gz"

version('2.2.7', sha256='1b22976135812b35cb5b8d21a53ad11d5e7c1426c93f51e7a314a2a42cab3a09')
version('2.2.6', sha256='8048c5bd26190b21279745cfecd05808c635bc14912e630340cd44a49b87d46d')
version('2.2.5', sha256='ffce4ea00dbda1478d91c3e1538cadfe5761d9d6c0ceb27bc3dba51882fe1c47')
version('2.2.4', sha256='f5d6b689d23c0d488671f34b02d07b84e408544b2f9f6e74fb7221982b1ecadc')
version('2.2.3', sha256='abd8e915675cff54c0d4da5029d95c528362266557c61c7149d53fa069b8076d')
@@ -0,0 +1,19 @@
--- spack-src/src/libfastx/fastx.h.org	2019-12-19 12:05:37.497936486 +0900
+++ spack-src/src/libfastx/fastx.h	2019-12-19 13:44:55.481837853 +0900
@@ -58,7 +58,7 @@
 	OUTPUT_SAME_AS_INPUT=3
 } OUTPUT_FILE_TYPE;

-#pragma pack(1)
+#pragma pack(push, 1)
 typedef struct
 {
 	/* Record data - common for FASTA/FASTQ */
@@ -115,6 +115,7 @@
 	FILE* input;
 	FILE* output;
 } FASTX ;
+#pragma pack(pop)


 void fastx_init_reader(FASTX *pFASTX, const char* filename,
@@ -19,3 +19,5 @@ class FastxToolkit(AutotoolsPackage):

# patch implicit fallthrough
patch("pr-22.patch")
# fix error [-Werror,-Wpragma-pack]
patch('fix_pragma_pack.patch', when='%fj')
@@ -13,6 +13,7 @@ class Flatbuffers(CMakePackage):
homepage = "http://google.github.io/flatbuffers/"
url = "https://github.com/google/flatbuffers/archive/v1.9.0.tar.gz"

version('1.11.0', sha256='3f4a286642094f45b1b77228656fbd7ea123964f19502f9ecfd29933fd23a50b')
version('1.10.0', sha256='3714e3db8c51e43028e10ad7adffb9a36fc4aa5b1a363c2d0c4303dd1be59a7c')
version('1.9.0', sha256='5ca5491e4260cacae30f1a5786d109230db3f3a6e5a0eb45d0d0608293d247e3')
version('1.8.0', sha256='c45029c0a0f1a88d416af143e34de96b3091642722aa2d8c090916c6d1498c2e')
@@ -18,7 +18,7 @@ class Gdal(AutotoolsPackage):
"""

homepage = "https://www.gdal.org/"
url = "https://download.osgeo.org/gdal/3.0.3/gdal-3.0.3.tar.xz"
url = "https://download.osgeo.org/gdal/3.0.4/gdal-3.0.4.tar.xz"
list_url = "https://download.osgeo.org/gdal/"
list_depth = 1

@@ -29,6 +29,7 @@ class Gdal(AutotoolsPackage):
    'osgeo.gdal_array', 'osgeo.gdalconst'
]

version('3.0.4', sha256='5569a4daa1abcbba47a9d535172fc335194d9214fdb96cd0f139bb57329ae277')
version('3.0.3', sha256='e20add5802265159366f197a8bb354899e1693eab8dbba2208de14a457566109')
version('3.0.2', sha256='c3765371ce391715c8f28bd6defbc70b57aa43341f6e94605f04fe3c92468983')
version('3.0.1', sha256='45b4ae25dbd87282d589eca76481c426f72132d7a599556470d5c38263b09266')
@@ -71,6 +71,7 @@ class Geant4(CMakePackage):

depends_on('geant4-data@10.03.p03', when='@10.03.p03 ~data')
depends_on('geant4-data@10.04', when='@10.04 ~data')
depends_on('geant4-data@10.05.p01', when='@10.05.p01 ~data')

# As released, 10.03.03 has issues with respect to using external
# CLHEP.
@@ -13,6 +13,8 @@ class Ghostscript(AutotoolsPackage):
homepage = "http://ghostscript.com/"
url = "https://github.com/ArtifexSoftware/ghostpdl-downloads/releases/download/gs926/ghostscript-9.26.tar.gz"

version('9.50', sha256='0f53e89fd647815828fc5171613e860e8535b68f7afbc91bf89aee886769ce89')
version('9.27', sha256='9760e8bdd07a08dbd445188a6557cb70e60ccb6a5601f7dbfba0d225e28ce285')
version('9.26', sha256='831fc019bd477f7cc2d481dc5395ebfa4a593a95eb2fe1eb231a97e450d7540d')
version('9.21', sha256='02bceadbc4dddeb6f2eec9c8b1623d945d355ca11b8b4df035332b217d58ce85')
version('9.18', sha256='5fc93079749a250be5404c465943850e3ed5ffbc0d5c07e10c7c5ee8afbbdb1b')
@@ -24,6 +24,11 @@ class Git(AutotoolsPackage):
# You can find the source here: https://mirrors.edge.kernel.org/pub/software/scm/git/sha256sums.asc

releases = [
    {
        'version': '2.25.0',
        'sha256': 'a98c9b96d91544b130f13bf846ff080dda2867e77fe08700b793ab14ba5346f6',
        'sha256_manpages': '22b2380842ef75e9006c0358de250ead449e1376d7e5138070b9a3073ef61d44'
    },
    {
        'version': '2.21.0',
        'sha256': '85eca51c7404da75e353eba587f87fea9481ba41e162206a6f70ad8118147bee',
@@ -175,7 +180,7 @@ class Git(AutotoolsPackage):
depends_on('libiconv')
depends_on('openssl')
depends_on('pcre', when='@:2.13')
depends_on('pcre+jit', when='@2.14:')
depends_on('pcre2', when='@2.14:')
depends_on('perl')
depends_on('zlib')

@@ -216,12 +221,17 @@ def configure_args(self):
    '--with-curl={0}'.format(spec['curl'].prefix),
    '--with-expat={0}'.format(spec['expat'].prefix),
    '--with-iconv={0}'.format(spec['libiconv'].prefix),
    '--with-libpcre={0}'.format(spec['pcre'].prefix),
    '--with-openssl={0}'.format(spec['openssl'].prefix),
    '--with-perl={0}'.format(spec['perl'].command.path),
    '--with-zlib={0}'.format(spec['zlib'].prefix),
]

if '^pcre' in self.spec:
    configure_args.append('--with-libpcre={0}'.format(
        spec['pcre'].prefix))
if '^pcre2' in self.spec:
    configure_args.append('--with-libpcre2={0}'.format(
        spec['pcre2'].prefix))
if '+tcltk' in self.spec:
    configure_args.append('--with-tcltk={0}'.format(
        self.spec['tk'].prefix.bin.wish))
@@ -13,6 +13,7 @@ class Gl2ps(CMakePackage):
homepage = "http://www.geuz.org/gl2ps/"
url = "http://geuz.org/gl2ps/src/gl2ps-1.3.9.tgz"

version('1.4.0', sha256='03cb5e6dfcd87183f3b9ba3b22f04cd155096af81e52988cc37d8d8efe6cf1e2')
version('1.3.9', sha256='8a680bff120df8bcd78afac276cdc38041fed617f2721bade01213362bcc3640')

variant('png', default=True, description='Enable PNG support')
@@ -26,6 +26,7 @@ class Gnuplot(AutotoolsPackage):
# dependency of readline. Fix it with a small patch
patch('term_include.patch')

version('5.2.8', sha256='60a6764ccf404a1668c140f11cc1f699290ab70daa1151bb58fed6139a28ac37')
version('5.2.7', sha256='97fe503ff3b2e356fe2ae32203fc7fd2cf9cef1f46b60fe46dc501a228b9f4ed')
version('5.2.5', sha256='039db2cce62ddcfd31a6696fe576f4224b3bc3f919e66191dfe2cdb058475caa')
version('5.2.2', sha256='a416d22f02bdf3873ef82c5eb7f8e94146795811ef808e12b035ada88ef7b1a1')
@@ -35,6 +35,7 @@ class Go(Package):

extendable = True

version('1.13.7', sha256='e4ad42cc5f5c19521fbbbde3680995f2546110b5c6aa2b48c3754ff7af9b41f4')
version('1.13.6', sha256='aae5be954bdc40bcf8006eb77e8d8a5dde412722bc8effcdaf9772620d06420c')
version('1.13.5', sha256='27d356e2a0b30d9983b60a788cf225da5f914066b37a6b4f69d457ba55a626ff')
version('1.13.4', sha256='95dbeab442ee2746b9acf0934c8e2fc26414a0565c008631b04addb8c02e7624')
@@ -18,6 +18,11 @@ class Graphicsmagick(AutotoolsPackage):
homepage = "http://www.graphicsmagick.org/"
url = "https://sourceforge.net/projects/graphicsmagick/files/graphicsmagick/1.3.29/GraphicsMagick-1.3.29.tar.xz/download"

version('1.3.34', sha256='df009d5173ed0d6a0c6457234256c5a8aeaace782afa1cbab015d5a12bd4f7a4')
version('1.3.33', sha256='130cb330a633580b5124eba5c125bbcbc484298423a97b9bed37ccd50d6dc778')
version('1.3.32', sha256='b842a5a0d6c84fd6c5f161b5cd8e02bbd210b0c0b6728dd762b7c53062ba94e1')
version('1.3.31', sha256='096bbb59d6f3abd32b562fc3b34ea90d88741dc5dd888731d61d17e100394278')
version('1.3.30', sha256='d965e5c6559f55eec76c20231c095d4ae682ea0cbdd8453249ae8771405659f1')
version('1.3.29', sha256='e18df46a6934c8c12bfe274d09f28b822f291877f9c81bd9a506f879a7610cd4')

depends_on('bzip2')
@@ -16,3 +16,5 @@ class Graphite2(CMakePackage):
url = "https://github.com/silnrsi/graphite/releases/download/1.3.13/graphite2-1.3.13.tgz"

version('1.3.13', sha256='dd63e169b0d3cf954b397c122551ab9343e0696fb2045e1b326db0202d875f06')

patch('regparm.patch')
Some files were not shown because too many files have changed in this diff.