Merge branch 'develop' into features/shared
29  .github/ISSUE_TEMPLATE/bug_report.md (vendored)
@@ -1,14 +1,21 @@
---
-name: Bug report
+name: "\U0001F41E Bug report"
about: Report a bug in the core of Spack (command not working as expected, etc.)
-labels: bug
+labels: "bug,triage"
---

+<!--
*Explain, in a clear and concise way, the command you ran and the result you were trying to achieve.
Example: "I ran Spack find to list all the installed packages and..."*
+-->

+### Spack version
+<!-- Add the output to the command below -->
+```console
+$ spack --version
+
+```

### Steps to reproduce the issue

@@ -20,7 +27,7 @@ $ spack <command2> <spec>

### Error Message

-If Spack reported an error, provide the error message. If it did not report an error
+<!--If Spack reported an error, provide the error message. If it did not report an error
but the output appears incorrect, provide the incorrect output. If there was no error
message and no output but the result is incorrect, describe how it does not match
what you expect. To provide more information you might re-run the commands with
@@ -31,19 +38,27 @@ $ spack -d --stacktrace <command2> <spec>
...
```
that activate the full debug output.
-->

### Information on your system

<!--
This includes:

1. which platform you are using
2. any relevant configuration detail (custom `packages.yaml` or `modules.yaml`, etc.)

-----
-->

### General information

- [ ] I have run `spack --version` and reported the version of Spack
- [ ] I have searched the issues of this repo and believe this is not a duplicate
- [ ] I have run the failing commands in debug mode and reported the output

<!--
We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively!

If you want to ask a question about the tool (how to use it, what it can currently do, etc.), try the `#general` channel on our Slack first. We have a welcoming community and chances are you'll get your reply faster and without opening an issue.

Other than that, thanks for taking the time to contribute to Spack!
-->
25  .github/ISSUE_TEMPLATE/build_error.md (vendored)
@@ -1,19 +1,24 @@
---
-name: Build error
+name: "\U0001F4A5 Build error"
about: Some package in Spack didn't build correctly
labels: "build-error"
---

-*Thanks for taking the time to report this build failure. To proceed with the
+<!--*Thanks for taking the time to report this build failure. To proceed with the
report please:*
1. Title the issue "Installation issue: <name-of-the-package>".
2. Provide the information required below.
3. Remove the template instructions before posting the issue.

+We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively!
+-->

---
+### Spack version
+<!-- Add the output to the command below -->
+```console
+$ spack --version
+
+```

### Steps to reproduce the issue

@@ -24,7 +29,7 @@ $ spack install <spec> # Fill in the exact spec you are using

### Platform and user environment

-Please report your OS here:
+<!-- Please report your OS here:
```commandline
$ uname -a
Linux nuvolari 4.15.0-29-generic #31-Ubuntu SMP Tue Jul 17 15:39:52 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
@@ -37,10 +42,11 @@ and, if relevant, post or attach:
- `compilers.yaml`

to the issue
+-->

### Additional information

-Sometimes the issue benefits from additional details. In these cases there are
+<!--Sometimes the issue benefits from additional details. In these cases there are
a few things we can suggest doing. First of all, you can post the full output of:
```console
$ spack spec --install-status <spec>
@@ -75,4 +81,9 @@ will provide additional debug information. After the failure you will find two files
failed object after Spack's compiler wrapper did its processing

You can post or attach those files to provide maintainers with more information on what
-is causing the failure.
+is causing the failure.-->

### General information

- [ ] I have run `spack --version` and reported the version of Spack
- [ ] I have searched the issues of this repo and believe this is not a duplicate
21  .github/ISSUE_TEMPLATE/feature_request.md (vendored)
@@ -1,28 +1,33 @@
---
-name: Feature request
+name: "\U0001F38A Feature request"
about: Suggest adding a feature that is not yet in Spack
labels: feature

---

-*Please add a concise summary of your suggestion here.*
+<!--*Please add a concise summary of your suggestion here.*-->

### Rationale

-*Is your feature request related to a problem? Please describe it!*
+<!--*Is your feature request related to a problem? Please describe it!*-->

### Description

-*Describe the solution you'd like and the alternatives you have considered.*
+<!--*Describe the solution you'd like and the alternatives you have considered.*-->


### Additional information
-*Add any other context about the feature request here.*
+<!--*Add any other context about the feature request here.*-->


-----
### General information

- [ ] I have run `spack --version` and reported the version of Spack
- [ ] I have searched the issues of this repo and believe this is not a duplicate


-If you want to ask a question about the tool (how to use it, what it can currently do, etc.), try the `#general` channel on our Slack first. We have a welcoming community and chances are you'll get your reply faster and without opening an issue.
-
-Other than that, thanks for taking the time to contribute to Spack!
+<!--If you want to ask a question about the tool (how to use it, what it can currently do, etc.), try the `#general` channel on our Slack first. We have a welcoming community and chances are you'll get your reply faster and without opening an issue.
+
+Other than that, thanks for taking the time to contribute to Spack!
+-->
2  .github/workflows/linux_build_tests.yaml (vendored)
@@ -28,7 +28,7 @@ jobs:
      matrix:
        package: [lz4, mpich, tut, py-setuptools, openjpeg, r-rcpp]
    steps:
-    - uses: actions/checkout@v1
+    - uses: actions/checkout@v2
    - name: Cache ccache's store
      uses: actions/cache@v1
      with:
@@ -75,6 +75,12 @@ config:
  misc_cache: ~/.spack/cache


+  # Timeout in seconds used for downloading sources etc. This only applies
+  # to the connection phase and can be increased for slow connections or
+  # servers. 0 means no timeout.
+  connect_timeout: 10
+
+
  # If this is false, tools like curl that use SSL will not verify
  # certificates. (e.g., curl will use the -k option)
  verify_ssl: true
@@ -25,6 +25,14 @@ It is recommended that the following be put in your ``.bashrc`` file:

    alias less='less -R'

+If you do not see colorized output when using ``less -R`` it is because color
+is being disabled in the piped output. In this case, tell spack to force
+colorized output.
+
+.. code-block:: console
+
+   $ spack --color always | less -R
+
--------------------------
Listing available packages
--------------------------
@@ -45,7 +53,7 @@ can install:
.. command-output:: spack list
   :ellipsis: 10

-There are thosands of them, so we've truncated the output above, but you
+There are thousands of them, so we've truncated the output above, but you
can find a :ref:`full list here <package-list>`.
Packages are listed by name in alphabetical order.
A pattern to match with no wildcards, ``*`` or ``?``,
@@ -267,7 +275,7 @@ the ``spack gc`` ("garbage collector") command, which will uninstall all unneeded
   -- linux-ubuntu18.04-broadwell / gcc@9.0.1 ----------------------
   hdf5@1.10.5  libiconv@1.16  libpciaccess@0.13.5  libszip@2.1.1  libxml2@2.9.9  mpich@3.3.2  openjpeg@2.3.1  xz@5.2.4  zlib@1.2.11

-In the example above Spack went through all the packages in the DB
+In the example above Spack went through all the packages in the package database
and removed everything that is not either:

1. A package installed upon explicit request of the user
@@ -854,7 +862,7 @@ Variants are named options associated with a particular package. They are
optional, as each package must provide default values for each variant it
makes available. Variants can be specified using
a flexible parameter syntax ``name=<value>``. For example,
-``spack install libelf debug=True`` will install libelf build with debug
+``spack install libelf debug=True`` will install libelf built with debug
flags. The names of particular variants available for a package depend on
what was provided by the package author. ``spack info <package>`` will
provide information on what build variants are available.
@@ -1067,13 +1075,13 @@ of failing:
In the snippet above, for instance, the microarchitecture was demoted to ``haswell`` when
compiling with ``gcc@4.8`` since support to optimize for ``broadwell`` starts from ``gcc@4.9:``.

-Finally if Spack has no information to match compiler and target, it will
+Finally, if Spack has no information to match compiler and target, it will
proceed with the installation but avoid injecting any microarchitecture
specific flags.

.. warning::

-   Currently Spack doesn't print any warning to the user if it has no information
+   Currently, Spack doesn't print any warning to the user if it has no information
   on which optimization flags should be used for a given compiler. This behavior
   might change in the future.
@@ -1083,7 +1091,7 @@ specific flags.
Virtual dependencies
--------------------

-The dependence graph for ``mpileaks`` we saw above wasn't *quite*
+The dependency graph for ``mpileaks`` we saw above wasn't *quite*
accurate. ``mpileaks`` uses MPI, which is an interface that has many
different implementations. Above, we showed ``mpileaks`` and
``callpath`` depending on ``mpich``, which is one *particular*
@@ -1410,12 +1418,12 @@ packages listed as activated:
   py-nose@1.3.4  py-numpy@1.9.1  py-setuptools@11.3.1

Now, when a user runs python, ``numpy`` will be available for import
-*without* the user having to explicitly loaded. ``python@2.7.8`` now
+*without* the user having to explicitly load it. ``python@2.7.8`` now
acts like a system Python installation with ``numpy`` installed inside
of it.

Spack accomplishes this by symbolically linking the *entire* prefix of
-the ``py-numpy`` into the prefix of the ``python`` package. To the
+the ``py-numpy`` package into the prefix of the ``python`` package. To the
python interpreter, it looks like ``numpy`` is installed in the
``site-packages`` directory.
@@ -2913,7 +2913,7 @@ discover its dependencies.

If you want to see the environment that a package will build with, or
if you want to run commands in that environment to test them out, you
-can use the :ref:`cmd-spack-env` command, documented
+can use the :ref:`cmd-spack-build-env` command, documented
below.

^^^^^^^^^^^^^^^^^^^^^
@@ -4332,31 +4332,31 @@ directory, install directory, package directory) and others change to
core spack locations. For example, ``spack cd --module-dir`` will take you to
the main python source directory of your spack install.

-.. _cmd-spack-env:
+.. _cmd-spack-build-env:

-^^^^^^^^^^^^^
-``spack env``
-^^^^^^^^^^^^^
+^^^^^^^^^^^^^^^^^^^
+``spack build-env``
+^^^^^^^^^^^^^^^^^^^

-``spack env`` functions much like the standard unix ``env`` command,
-but it takes a spec as an argument. You can use it to see the
+``spack build-env`` functions much like the standard unix ``env``
+command, but it takes a spec as an argument. You can use it to see the
environment variables that will be set when a particular build runs,
for example:

.. code-block:: console

-   $ spack env mpileaks@1.1%intel
+   $ spack build-env mpileaks@1.1%intel

This will display the entire environment that will be set when the
``mpileaks@1.1%intel`` build runs.

To run commands in a package's build environment, you can simply
-provide them after the spec argument to ``spack env``:
+provide them after the spec argument to ``spack build-env``:

.. code-block:: console

   $ spack cd mpileaks@1.1%intel
-   $ spack env mpileaks@1.1%intel ./configure
+   $ spack build-env mpileaks@1.1%intel ./configure

This will cd to the build directory and then run ``configure`` in the
package's build environment.
2  lib/spack/env/cc (vendored)
@@ -43,7 +43,7 @@ parameters=(
# The compiler input variables are checked for sanity later:
#   SPACK_CC, SPACK_CXX, SPACK_F77, SPACK_FC
# The default compiler flags are passed from these variables:
-# SPACK_CFLAGS, SPACK_CXXFLAGS, SPACK_FCFLAGS, SPACK_FFLAGS,
+# SPACK_CFLAGS, SPACK_CXXFLAGS, SPACK_FFLAGS,
# SPACK_LDFLAGS, SPACK_LDLIBS
# Debug env var is optional; set to "TRUE" for debug logging:
#   SPACK_DEBUG
1  lib/spack/external/distro.py (vendored)
@@ -64,6 +64,7 @@
    'enterpriseenterprise': 'oracle',  # Oracle Enterprise Linux
    'redhatenterpriseworkstation': 'rhel',  # RHEL 6, 7 Workstation
    'redhatenterpriseserver': 'rhel',  # RHEL 6, 7 Server
+    'redhatenterprisecomputenode': 'rhel',  # RHEL 6 ComputeNode
}

#: Translation table for normalizing the distro ID derived from the file name
@@ -308,7 +308,7 @@ def build_tarball(spec, outdir, force=False, rel=False, unsigned=False,
    tmpdir = tempfile.mkdtemp()
    cache_prefix = build_cache_prefix(tmpdir)

-    tarfile_name = tarball_name(spec, '.tar.bz2')
+    tarfile_name = tarball_name(spec, '.tar.gz')
    tarfile_dir = os.path.join(cache_prefix, tarball_directory_name(spec))
    tarfile_path = os.path.join(tarfile_dir, tarfile_name)
    spackfile_path = os.path.join(
@@ -377,8 +377,8 @@ def build_tarball(spec, outdir, force=False, rel=False, unsigned=False,
        shutil.rmtree(tmpdir)
        tty.die(e)

-    # create compressed tarball of the install prefix
-    with closing(tarfile.open(tarfile_path, 'w:bz2')) as tar:
+    # create gzip compressed tarball of the install prefix
+    with closing(tarfile.open(tarfile_path, 'w:gz')) as tar:
        tar.add(name='%s' % workdir,
                arcname='%s' % os.path.basename(spec.prefix))
    # remove copy of install directory
@@ -433,6 +433,9 @@ def build_tarball(spec, outdir, force=False, rel=False, unsigned=False,
    web_util.push_to_url(
        specfile_path, remote_specfile_path, keep_original=False)

+    tty.msg('Buildcache for "%s" written to \n   %s' %
+            (spec, remote_spackfile_path))
+
    try:
        # create an index.html for the build_cache directory so specs can be
        # found
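The hunks above switch buildcache archives from bzip2 to gzip by changing `tarfile`'s mode string from `'w:bz2'` to `'w:gz'`. A standalone sketch of that mechanism (not Spack's actual helper; the file and directory names here are made up for illustration):

```python
import os
import tarfile
import tempfile

# Create a small install-prefix-like directory to archive.
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "hello.txt"), "w") as f:
    f.write("hello\n")

tarball = os.path.join(tempfile.mkdtemp(), "demo.tar.gz")

# 'w:gz' writes a gzip-compressed tarball (the old code used 'w:bz2').
with tarfile.open(tarball, "w:gz") as tar:
    tar.add(name=workdir, arcname=os.path.basename(workdir))

# gzip streams start with the magic bytes 0x1f 0x8b.
with open(tarball, "rb") as f:
    magic = f.read(2)
print(magic == b"\x1f\x8b")  # True
```

gzip is a reasonable choice here: it decompresses faster than bzip2, which matters when installing many binary packages from a cache.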
@@ -589,16 +592,16 @@ def extract_tarball(spec, filename, allow_root=False, unsigned=False,
    stagepath = os.path.dirname(filename)
    spackfile_name = tarball_name(spec, '.spack')
    spackfile_path = os.path.join(stagepath, spackfile_name)
-    tarfile_name = tarball_name(spec, '.tar.bz2')
+    tarfile_name = tarball_name(spec, '.tar.gz')
    tarfile_path = os.path.join(tmpdir, tarfile_name)
    specfile_name = tarball_name(spec, '.spec.yaml')
    specfile_path = os.path.join(tmpdir, specfile_name)

    with closing(tarfile.open(spackfile_path, 'r')) as tar:
        tar.extractall(tmpdir)
-    # older buildcache tarfiles use gzip compression
+    # some buildcache tarfiles use bzip2 compression
    if not os.path.exists(tarfile_path):
-        tarfile_name = tarball_name(spec, '.tar.gz')
+        tarfile_name = tarball_name(spec, '.tar.bz2')
        tarfile_path = os.path.join(tmpdir, tarfile_name)
    if not unsigned:
        if os.path.exists('%s.asc' % specfile_path):
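The fallback above keeps older bzip2-compressed buildcaches readable: look for the new `.tar.gz` name first, and only fall back to `.tar.bz2` when it is absent. A minimal stand-in for that logic (the `pick_tarfile` helper and `base` naming are hypothetical, not Spack's API):

```python
import os
import tempfile

def pick_tarfile(tmpdir, base):
    """Prefer the gzip tarball; fall back to the older bzip2 name.

    `base` stands in for Spack's tarball_name(spec, ext) result.
    """
    path = os.path.join(tmpdir, base + ".tar.gz")
    if not os.path.exists(path):
        # older buildcaches were compressed with bzip2
        path = os.path.join(tmpdir, base + ".tar.bz2")
    return path
```

Note the ordering: the preferred (new) name is checked first, so old and new caches can coexist in the same mirror during a transition.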
@@ -799,6 +802,7 @@ def get_specs(force=False, allarch=False):
def get_keys(install=False, trust=False, force=False):
    """
    Get pgp public keys available on mirror
+    with suffix .key or .pub
    """
    if not spack.mirror.MirrorCollection():
        tty.die("Please add a spack mirror to allow " +
@@ -815,16 +819,18 @@ def get_keys(install=False, trust=False, force=False):
            tty.msg("Finding public keys in %s" % mirror_dir)
            files = os.listdir(mirror_dir)
            for file in files:
-                if re.search(r'\.key', file):
+                if re.search(r'\.key', file) or re.search(r'\.pub', file):
                    link = url_util.join(fetch_url_build_cache, file)
                    keys.add(link)
        else:
            tty.msg("Finding public keys at %s" %
                    url_util.format(fetch_url_build_cache))
-            p, links = web_util.spider(fetch_url_build_cache, depth=1)
+            # For s3 mirror need to request index.html directly
+            p, links = web_util.spider(
+                url_util.join(fetch_url_build_cache, 'index.html'), depth=1)

            for link in links:
-                if re.search(r'\.key', link):
+                if re.search(r'\.key', link) or re.search(r'\.pub', link):
                    keys.add(link)

        for link in keys:
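The widened filter above accepts key files published under either a `.key` or a `.pub` name. A small sketch of the same `re.search` test, wrapped in a hypothetical helper for clarity:

```python
import re

def looks_like_key(name):
    # Match filenames containing a .key or .pub suffix, as the
    # updated get_keys() filter does.
    return bool(re.search(r'\.key', name) or re.search(r'\.pub', name))

print(looks_like_key("build_cache/abc123.pub"))   # True
print(looks_like_key("abc123.key"))               # True
print(looks_like_key("linux-x86_64.spec.yaml"))   # False
```

Because `re.search` matches anywhere in the string, a name like `foo.keyring` would also pass; anchoring the pattern as `r'\.(key|pub)$'` would be stricter if that ever mattered.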
@@ -263,6 +263,12 @@ def flags_to_build_system_args(self, flags):
            if values:
                values_str = '{0}={1}'.format(flag.upper(), ' '.join(values))
                self.configure_flag_args.append(values_str)
+        # Spack's fflags are meant for both F77 and FC, therefore we
+        # additionally set FCFLAGS if required.
+        values = flags.get('fflags', None)
+        if values:
+            values_str = 'FCFLAGS={0}'.format(' '.join(values))
+            self.configure_flag_args.append(values_str)

    def configure(self, spec, prefix):
        """Runs configure with the arguments specified in
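The addition above duplicates `fflags` into `FCFLAGS` because autoconf distinguishes the F77 and FC compilers while Spack exposes a single `fflags` setting. A simplified, standalone stand-in for that logic (`flags_to_configure_args` is a hypothetical free function, not the real `AutotoolsPackage` method):

```python
def flags_to_configure_args(flags):
    """Build configure-style VAR=value arguments from a flags dict.

    `flags` maps names like 'cflags'/'fflags' to lists of flag strings.
    """
    args = []
    for flag, values in flags.items():
        if values:
            args.append('{0}={1}'.format(flag.upper(), ' '.join(values)))
    # fflags apply to both F77 and FC, so also emit FCFLAGS
    values = flags.get('fflags')
    if values:
        args.append('FCFLAGS={0}'.format(' '.join(values)))
    return args

print(flags_to_configure_args({'cflags': ['-O2'], 'fflags': ['-g']}))
# ['CFLAGS=-O2', 'FFLAGS=-g', 'FCFLAGS=-g']
```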
@@ -13,39 +13,65 @@ class CudaPackage(PackageBase):
    """Auxiliary class which contains CUDA variant, dependencies and conflicts
    and is meant to unify and facilitate its usage.
    """
    maintainers = ['ax3l', 'svenevs']

-    # FIXME: keep cuda and cuda_arch separate to make usage easier untill
+    # https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#gpu-feature-list
+    # https://developer.nvidia.com/cuda-gpus
+    # https://en.wikipedia.org/wiki/CUDA#GPUs_supported
+    cuda_arch_values = [
+        '10', '11', '12', '13',
+        '20', '21',
+        '30', '32', '35', '37',
+        '50', '52', '53',
+        '60', '61', '62',
+        '70', '72', '75',
+    ]
+
+    # FIXME: keep cuda and cuda_arch separate to make usage easier until
    # Spack has depends_on(cuda, when='cuda_arch!=None') or alike
    variant('cuda', default=False,
            description='Build with CUDA')
-    # see http://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#gpu-feature-list
-    # https://developer.nvidia.com/cuda-gpus

    variant('cuda_arch',
            description='CUDA architecture',
-            values=spack.variant.any_combination_of(
-                '20', '30', '32', '35', '50', '52', '53', '60', '61',
-                '62', '70', '72', '75'
-            ))
+            values=spack.variant.any_combination_of(*cuda_arch_values))

-    # see http://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#nvcc-examples
-    # and http://llvm.org/docs/CompileCudaWithLLVM.html#compiling-cuda-code
+    # https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#nvcc-examples
+    # https://llvm.org/docs/CompileCudaWithLLVM.html#compiling-cuda-code
    @staticmethod
    def cuda_flags(arch_list):
        return [('--generate-code arch=compute_{0},code=sm_{0} '
                 '--generate-code arch=compute_{0},code=compute_{0}').format(s)
                for s in arch_list]

-    depends_on("cuda@7:", when='+cuda')
+    depends_on('cuda', when='+cuda')

    # CUDA version vs Architecture
-    depends_on("cuda@8:", when='cuda_arch=60')
-    depends_on("cuda@8:", when='cuda_arch=61')
-    depends_on("cuda@8:", when='cuda_arch=62')
-    depends_on("cuda@9:", when='cuda_arch=70')
-    depends_on("cuda@9:", when='cuda_arch=72')
-    depends_on("cuda@10:", when='cuda_arch=75')
+    # https://en.wikipedia.org/wiki/CUDA#GPUs_supported
+    depends_on('cuda@:6.0', when='cuda_arch=10')
+    depends_on('cuda@:6.5', when='cuda_arch=11')
+    depends_on('cuda@2.1:6.5', when='cuda_arch=12')
+    depends_on('cuda@2.1:6.5', when='cuda_arch=13')
+
-    depends_on('cuda@:8', when='cuda_arch=20')
+    depends_on('cuda@3.0:8.0', when='cuda_arch=20')
+    depends_on('cuda@3.2:8.0', when='cuda_arch=21')
+
+    depends_on('cuda@5.0:10.2', when='cuda_arch=30')
+    depends_on('cuda@5.0:10.2', when='cuda_arch=32')
+    depends_on('cuda@5.0:10.2', when='cuda_arch=35')
+    depends_on('cuda@6.5:10.2', when='cuda_arch=37')
+
+    depends_on('cuda@6.0:', when='cuda_arch=50')
+    depends_on('cuda@6.5:', when='cuda_arch=52')
+    depends_on('cuda@6.5:', when='cuda_arch=53')
+
+    depends_on('cuda@8.0:', when='cuda_arch=60')
+    depends_on('cuda@8.0:', when='cuda_arch=61')
+    depends_on('cuda@8.0:', when='cuda_arch=62')
+
+    depends_on('cuda@9.0:', when='cuda_arch=70')
+    depends_on('cuda@9.0:', when='cuda_arch=72')
+    depends_on('cuda@10.0:', when='cuda_arch=75')

    # There are at least three cases to be aware of for compiler conflicts
    # 1. Linux x86_64
@@ -130,18 +156,8 @@ def cuda_flags(arch_list):
    # `clang-apple@x.y.z as a possible fix.
    # Compiler conflicts will be eventually taken from here:
    # https://docs.nvidia.com/cuda/cuda-installation-guide-mac-os-x/index.html#abstract
    conflicts('platform=darwin', when='+cuda ^cuda@11.0:')

    # Make sure cuda_arch can not be used without +cuda
-    conflicts('~cuda', when='cuda_arch=20')
-    conflicts('~cuda', when='cuda_arch=30')
-    conflicts('~cuda', when='cuda_arch=32')
-    conflicts('~cuda', when='cuda_arch=35')
-    conflicts('~cuda', when='cuda_arch=50')
-    conflicts('~cuda', when='cuda_arch=52')
-    conflicts('~cuda', when='cuda_arch=53')
-    conflicts('~cuda', when='cuda_arch=60')
-    conflicts('~cuda', when='cuda_arch=61')
-    conflicts('~cuda', when='cuda_arch=62')
-    conflicts('~cuda', when='cuda_arch=70')
-    conflicts('~cuda', when='cuda_arch=72')
-    conflicts('~cuda', when='cuda_arch=75')
+    for value in cuda_arch_values:
+        conflicts('~cuda', when='cuda_arch=' + value)
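The second hunk replaces a hand-written list of `conflicts()` calls with a loop over `cuda_arch_values`, so new architecture values stay in sync automatically. A standalone sketch of the pattern (the `conflicts` function here is a hypothetical recorder standing in for Spack's directive):

```python
# Hypothetical stand-in for Spack's conflicts() directive: just record
# the (constraint, when) pairs the loop would register.
registered = []

def conflicts(constraint, when=None):
    registered.append((constraint, when))

cuda_arch_values = [
    '10', '11', '12', '13', '20', '21', '30', '32', '35', '37',
    '50', '52', '53', '60', '61', '62', '70', '72', '75',
]

# One loop replaces thirteen hand-written conflicts() calls and stays
# in sync with cuda_arch_values automatically.
for value in cuda_arch_values:
    conflicts('~cuda', when='cuda_arch=' + value)

print(len(registered))   # 19
print(registered[0])     # ('~cuda', 'cuda_arch=10')
```

Generating directives in a loop also covers the older architectures (10-13, 21, 37) that the hand-written list had silently omitted.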
@@ -51,8 +51,9 @@ def _fetch_cache():


class MirrorCache(object):
-    def __init__(self, root):
+    def __init__(self, root, skip_unstable_versions):
        self.root = os.path.abspath(root)
+        self.skip_unstable_versions = skip_unstable_versions

    def store(self, fetcher, relative_dest):
        """Fetch and relocate the fetcher's target into our mirror cache."""
@@ -85,5 +86,3 @@ def symlink(self, mirror_ref):

#: Spack's local cache for downloaded source archives
fetch_cache = llnl.util.lang.Singleton(_fetch_cache)
-
-mirror_cache = None
@@ -947,8 +947,9 @@ def read_cdashid_from_mirror(spec, mirror_url):
def push_mirror_contents(env, spec, yaml_path, mirror_url, build_id):
    if mirror_url:
        tty.debug('Creating buildcache')
-        buildcache._createtarball(env, yaml_path, None, mirror_url, None,
-                                  True, True, False, False, True, False)
+        buildcache._createtarball(env, yaml_path, None, True, False,
+                                  mirror_url, None, True, False, False, True,
+                                  False)
    if build_id:
        tty.debug('Writing cdashid ({0}) to remote mirror: {1}'.format(
            build_id, mirror_url))
@@ -52,16 +52,35 @@ def setup_parser(subparser):
    create.add_argument('-k', '--key', metavar='key',
                        type=str, default=None,
                        help="Key for signing.")
-    create.add_argument('-d', '--directory', metavar='directory',
-                        type=str, default='.',
-                        help="directory in which to save the tarballs.")
+    output = create.add_mutually_exclusive_group(required=True)
+    output.add_argument('-d', '--directory',
+                        metavar='directory',
+                        type=str,
+                        help="local directory where " +
+                             "buildcaches will be written.")
+    output.add_argument('-m', '--mirror-name',
+                        metavar='mirror-name',
+                        type=str,
+                        help="name of the mirror where " +
+                             "buildcaches will be written.")
+    output.add_argument('--mirror-url',
+                        metavar='mirror-url',
+                        type=str,
+                        help="URL of the mirror where " +
+                             "buildcaches will be written.")
    create.add_argument('--no-rebuild-index', action='store_true',
                        default=False, help="skip rebuilding index after " +
                                            "building package(s)")
    create.add_argument('-y', '--spec-yaml', default=None,
                        help='Create buildcache entry for spec from yaml file')
-    create.add_argument('--no-deps', action='store_true', default='false',
-                        help='Create buildcache entry wo/ dependencies')
+    create.add_argument('--only', default='package,dependencies',
+                        dest='things_to_install',
+                        choices=['package', 'dependencies'],
+                        help=('Select the buildcache mode. the default is to'
+                              ' build a cache for the package along with all'
+                              ' its dependencies. Alternatively, one can'
+                              ' decide to build a cache for only the package'
+                              ' or only the dependencies'))
    arguments.add_common_arguments(create, ['specs'])
    create.set_defaults(func=createtarball)
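The hunk above turns three competing output destinations into an argparse mutually exclusive group, so the parser itself rejects ambiguous invocations. A self-contained sketch of the same pattern (a toy parser, not Spack's actual `setup_parser`):

```python
import argparse

parser = argparse.ArgumentParser(prog='buildcache-demo')
# Exactly one output destination must be chosen, mirroring the
# -d/--directory, -m/--mirror-name, --mirror-url group above.
output = parser.add_mutually_exclusive_group(required=True)
output.add_argument('-d', '--directory', type=str,
                    help='local directory for buildcaches')
output.add_argument('-m', '--mirror-name', type=str,
                    help='name of a configured mirror')
output.add_argument('--mirror-url', type=str,
                    help='URL of an anonymous mirror')

args = parser.parse_args(['-m', 'my-mirror'])
print(args.mirror_name)   # my-mirror
print(args.directory)     # None
```

With `required=True`, passing none of the three options (or two at once) makes `parse_args` print a usage error and exit, which is exactly the behavior the new `buildcache create` interface relies on.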
@@ -76,6 +95,10 @@ def setup_parser(subparser):
    install.add_argument('-u', '--unsigned', action='store_true',
                         help="install unsigned buildcache" +
                              " tarballs for testing")
+    install.add_argument('-o', '--otherarch', action='store_true',
+                         help="install specs from other architectures" +
+                              " instead of default platform and OS")

    arguments.add_common_arguments(install, ['specs'])
    install.set_defaults(func=installtarball)
@@ -252,7 +275,8 @@ def find_matching_specs(pkgs, allow_multiple_matches=False, env=None):
    return specs_from_cli


-def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False):
+def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False,
+                           other_arch=False):
    """Returns a list of specs matching the not necessarily
    concretized specs given from cli
@@ -266,7 +290,7 @@ def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False):
    # List of specs that match expressions given via command line
    specs_from_cli = []
    has_errors = False
-    allarch = False
+    allarch = other_arch
    specs = bindist.get_specs(force, allarch)
    for pkg in pkgs:
        matches = []
@@ -299,8 +323,9 @@ def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False):
    return specs_from_cli


-def _createtarball(env, spec_yaml, packages, directory, key, no_deps, force,
-                   rel, unsigned, allow_root, no_rebuild_index):
+def _createtarball(env, spec_yaml, packages, add_spec, add_deps,
+                   output_location, key, force, rel, unsigned, allow_root,
+                   no_rebuild_index):
    if spec_yaml:
        packages = set()
        with open(spec_yaml, 'r') as fd:
@@ -320,13 +345,12 @@ def _createtarball(env, spec_yaml, packages, directory, key, no_deps, force,
    pkgs = set(packages)
    specs = set()

-    outdir = '.'
-    if directory:
-        outdir = directory
-
-    mirror = spack.mirror.MirrorCollection().lookup(outdir)
+    mirror = spack.mirror.MirrorCollection().lookup(output_location)
    outdir = url_util.format(mirror.push_url)

    msg = 'Buildcache files will be output to %s/build_cache' % outdir
    tty.msg(msg)

    signkey = None
    if key:
        signkey = key
@@ -342,14 +366,23 @@ def _createtarball(env, spec_yaml, packages, directory, key, no_deps, force,
            tty.debug('skipping external or virtual spec %s' %
                      match.format())
        else:
-            tty.debug('adding matching spec %s' % match.format())
-            specs.add(match)
-            if no_deps is True:
+            if add_spec:
+                tty.debug('adding matching spec %s' % match.format())
+                specs.add(match)
+            else:
+                tty.debug('skipping matching spec %s' % match.format())
+
+            if not add_deps:
                continue
+
            tty.debug('recursing dependencies')
            for d, node in match.traverse(order='post',
                                          depth=True,
                                          deptype=('link', 'run')):
+                # skip root, since it's handled above
+                if d == 0:
+                    continue
+
                if node.external or node.virtual:
                    tty.debug('skipping external or virtual dependency %s' %
                              node.format())
@@ -360,14 +393,10 @@ def _createtarball(env, spec_yaml, packages, directory, key, no_deps, force,
    tty.debug('writing tarballs to %s/build_cache' % outdir)

    for spec in specs:
-        tty.msg('creating binary cache file for package %s ' % spec.format())
-        try:
-            bindist.build_tarball(spec, outdir, force, rel,
-                                  unsigned, allow_root, signkey,
-                                  not no_rebuild_index)
-        except Exception as e:
-            tty.warn('%s' % e)
-            pass
+        tty.debug('creating binary cache file for package %s ' % spec.format())
+        bindist.build_tarball(spec, outdir, force, rel,
+                              unsigned, allow_root, signkey,
+                              not no_rebuild_index)


def createtarball(args):
@@ -376,9 +405,47 @@ def createtarball(args):
    # restrict matching to current environment if one is active
    env = ev.get_env(args, 'buildcache create')

-    _createtarball(env, args.spec_yaml, args.specs, args.directory,
-                   args.key, args.no_deps, args.force, args.rel, args.unsigned,
-                   args.allow_root, args.no_rebuild_index)
+    output_location = None
+    if args.directory:
+        output_location = args.directory
+
+        # User meant to provide a path to a local directory.
+        # Ensure that they did not accidentally pass a URL.
+        scheme = url_util.parse(output_location, scheme='<missing>').scheme
+        if scheme != '<missing>':
+            raise ValueError(
+                '"--directory" expected a local path; got a URL, instead')
+
+        # User meant to provide a path to a local directory.
+        # Ensure that the mirror lookup does not mistake it for a named mirror.
+        output_location = 'file://' + output_location
+
+    elif args.mirror_name:
+        output_location = args.mirror_name
+
+        # User meant to provide the name of a preconfigured mirror.
+        # Ensure that the mirror lookup actually returns a named mirror.
+        result = spack.mirror.MirrorCollection().lookup(output_location)
+        if result.name == "<unnamed>":
+            raise ValueError(
+                'no configured mirror named "{name}"'.format(
+                    name=output_location))
+
+    elif args.mirror_url:
+        output_location = args.mirror_url
+
+        # User meant to provide a URL for an anonymous mirror.
+        # Ensure that they actually provided a URL.
+        scheme = url_util.parse(output_location, scheme='<missing>').scheme
+        if scheme == '<missing>':
+            raise ValueError(
+                '"{url}" is not a valid URL'.format(url=output_location))
+
+    add_spec = ('package' in args.things_to_install)
+    add_deps = ('dependencies' in args.things_to_install)
+
+    _createtarball(env, args.spec_yaml, args.specs, add_spec, add_deps,
+                   output_location, args.key, args.force, args.rel,
+                   args.unsigned, args.allow_root, args.no_rebuild_index)
def installtarball(args):
|
||||
@@ -387,7 +454,8 @@ def installtarball(args):
|
||||
tty.die("build cache file installation requires" +
|
||||
" at least one package spec argument")
|
||||
pkgs = set(args.specs)
|
||||
matches = match_downloaded_specs(pkgs, args.multiple, args.force)
|
||||
matches = match_downloaded_specs(pkgs, args.multiple, args.force,
|
||||
args.otherarch)
|
||||
|
||||
for match in matches:
|
||||
install_tarball(match, args)
|
||||
|
||||
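The destination dispatch added to `createtarball` above (local directory vs. named mirror vs. anonymous URL) can be sketched standalone. Here `urllib.parse` stands in for Spack's `url_util` helper and a plain tuple stands in for `MirrorCollection`; both substitutions, and the function name, are assumptions for illustration only:

```python
from urllib.parse import urlparse

def resolve_output_location(directory=None, mirror_name=None, mirror_url=None,
                            known_mirrors=()):
    """Resolve the three mutually exclusive destination options."""
    if directory:
        # A local path must not look like a URL.
        if urlparse(directory).scheme:
            raise ValueError(
                '"--directory" expected a local path; got a URL, instead')
        # Prefix with file:// so a later mirror lookup cannot mistake
        # the path for a named mirror.
        return 'file://' + directory
    elif mirror_name:
        # A named mirror must actually be configured.
        if mirror_name not in known_mirrors:
            raise ValueError('no configured mirror named "%s"' % mirror_name)
        return mirror_name
    elif mirror_url:
        # An anonymous mirror must be a real URL (have a scheme).
        if not urlparse(mirror_url).scheme:
            raise ValueError('"%s" is not a valid URL' % mirror_url)
        return mirror_url
    return None
```

For example, `resolve_output_location(directory='/tmp/cache')` yields a `file://` URL, while passing an `https://` value as `directory` raises.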
@@ -45,7 +45,10 @@ def setup_parser(subparser):
             " (this requires significant time and space)")
    create_parser.add_argument(
        '-f', '--file', help="file with specs of packages to put in mirror")
    create_parser.add_argument(
        '--skip-unstable-versions', action='store_true',
        help="don't cache versions unless they identify a stable (unchanging)"
             " source code")
    create_parser.add_argument(
        '-D', '--dependencies', action='store_true',
        help="also fetch all dependencies")
@@ -308,7 +311,8 @@ def mirror_create(args):
    existed = web_util.url_exists(directory)

    # Actually do the work to create the mirror
    present, mirrored, error = spack.mirror.create(directory, mirror_specs)
    present, mirrored, error = spack.mirror.create(
        directory, mirror_specs, args.skip_unstable_versions)
    p, m, e = len(present), len(mirrored), len(error)

    verb = "updated" if existed else "created"

@@ -154,7 +154,7 @@ def test(parser, args, unknown_args):

    # The default is to test the core of Spack. If the option `--extension`
    # has been used, then test that extension.
    pytest_root = spack.paths.test_path
    pytest_root = spack.paths.spack_root
    if args.extension:
        target = args.extension
        extensions = spack.config.get('config:extensions')

@@ -97,6 +97,7 @@
config_defaults = {
    'config': {
        'debug': False,
        'connect_timeout': 10,
        'verify_ssl': True,
        'checksum': True,
        'dirty': False,
@@ -279,6 +280,7 @@ def __init__(self, name, data=None):
        self.sections = syaml.syaml_dict()

        if data:
            data = InternalConfigScope._process_dict_keyname_overrides(data)
            for section in data:
                dsec = data[section]
                validate({section: dsec}, section_schemas[section])
@@ -305,6 +307,25 @@ def write_section(self, section):
    def __repr__(self):
        return '<InternalConfigScope: %s>' % self.name

    @staticmethod
    def _process_dict_keyname_overrides(data):
        """Turn a trailing `:' in a key name into an override attribute."""
        result = {}
        for sk, sv in iteritems(data):
            if sk.endswith(':'):
                key = syaml.syaml_str(sk[:-1])
                key.override = True
            else:
                key = sk

            if isinstance(sv, dict):
                result[key] \
                    = InternalConfigScope._process_dict_keyname_overrides(sv)
            else:
                result[key] = copy.copy(sv)

        return result


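The trailing-colon convention added in `_process_dict_keyname_overrides` can be sketched without Spack's YAML machinery. A plain `str` subclass stands in for `syaml_str` (an assumption for illustration; the real type also carries YAML source marks):

```python
class OverridableStr(str):
    """str that can carry an 'override' flag, standing in for syaml_str."""
    override = False

def process_keyname_overrides(data):
    """Strip a trailing ':' from dict keys and flag those keys as overrides."""
    result = {}
    for key, value in data.items():
        if key.endswith(':'):
            # 'packages:' becomes the key 'packages' with override=True,
            # meaning this subtree replaces (not merges with) the parent's.
            new_key = OverridableStr(key[:-1])
            new_key.override = True
        else:
            new_key = key
        result[new_key] = (process_keyname_overrides(value)
                           if isinstance(value, dict) else value)
    return result
```

Because the flag lives on the key object itself, later merge code can consult it without changing the dictionary's shape.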
class Configuration(object):
    """A full Spack configuration, from a hierarchy of config files.
@@ -504,14 +525,14 @@ def set(self, path, value, scope=None):

        Accepts the path syntax described in ``get()``.
        """
        section, _, rest = path.partition(':')
        parts = _process_config_path(path)
        section = parts.pop(0)

        if not rest:
        if not parts:
            self.update_config(section, value, scope=scope)
        else:
            section_data = self.get_config(section, scope=scope)

            parts = rest.split(':')
            data = section_data
            while len(parts) > 1:
                key = parts.pop(0)
@@ -611,7 +632,7 @@ def _config():
    """Singleton Configuration instance.

    This constructs one instance associated with this module and returns
    it. It is bundled inside a function so that configuratoin can be
    it. It is bundled inside a function so that configuration can be
    initialized lazily.

    Return:
@@ -762,17 +783,12 @@ def _merge_yaml(dest, source):
    Config file authors can optionally end any attribute in a dict
    with `::` instead of `:`, and the key will override that of the
    parent instead of merging.

    """
    def they_are(t):
        return isinstance(dest, t) and isinstance(source, t)

    # If both are None, handle specially and return None.
    if source is None and dest is None:
        return None

    # If source is None, overwrite with source.
    elif source is None:
    if source is None:
        return None

    # Source list is prepended (for precedence)
@@ -798,8 +814,9 @@ def they_are(t):
            # to copy mark information on source keys to dest.
            key_marks[sk] = sk

    # ensure that keys are marked in the destination. the key_marks dict
    # ensures we can get the actual source key objects from dest keys
    # ensure that keys are marked in the destination. The
    # key_marks dict ensures we can get the actual source key
    # objects from dest keys
    for dk in list(dest.keys()):
        if dk in key_marks and syaml.markable(dk):
            syaml.mark(dk, key_marks[dk])
@@ -811,9 +828,34 @@ def they_are(t):

        return dest

    # In any other case, overwrite with a copy of the source value.
    else:
        return copy.copy(source)
    # If we reach here source and dest are either different types or are
    # not both lists or dicts: replace with source.
    return copy.copy(source)


#
# Process a path argument to config.set() that may contain overrides ('::' or
# trailing ':')
#
def _process_config_path(path):
    result = []
    if path.startswith(':'):
        raise syaml.SpackYAMLError("Illegal leading `:' in path `{0}'".
                                   format(path), '')
    seen_override_in_path = False
    while path:
        front, sep, path = path.partition(':')
        if (sep and not path) or path.startswith(':'):
            if seen_override_in_path:
                raise syaml.SpackYAMLError("Meaningless second override"
                                           " indicator `::' in path `{0}'".
                                           format(path), '')
            path = path.lstrip(':')
            front = syaml.syaml_str(front)
            front.override = True
            seen_override_in_path = True
        result.append(front)
    return result


#
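The `'::'` path parsing added in `_process_config_path` can be exercised in isolation. As above, a plain `str` subclass stands in for `syaml_str` (an assumption for illustration):

```python
class _Key(str):
    """str that can carry an 'override' flag, standing in for syaml_str."""
    override = False

def process_config_path(path):
    """Split 'config:a::b' into keys; '::' flags the key before it as an override."""
    if path.startswith(':'):
        raise ValueError("Illegal leading ':' in path %r" % path)
    result, seen_override = [], False
    while path:
        front, sep, path = path.partition(':')
        # A trailing ':' or a remaining leading ':' means the key we just
        # split off was written with '::' (or ended in ':').
        if (sep and not path) or path.startswith(':'):
            if seen_override:
                raise ValueError("Meaningless second override '::' in path")
            path = path.lstrip(':')
            front = _Key(front)
            front.override = True
            seen_override = True
        result.append(front)
    return result
```

So `config::install_tree` marks `config` as an override key, while a plain `config:debug` produces unmarked keys.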
@@ -18,32 +18,27 @@
as the authoritative database of packages in Spack. This module
provides a cache and a sanity checking mechanism for what is in the
filesystem.

"""
import datetime
import time
import os
import sys
import socket
import contextlib
from six import string_types
from six import iteritems

from ruamel.yaml.error import MarkedYAMLError, YAMLError
import contextlib
import datetime
import os
import socket
import sys
import time

import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp

import spack.store
import six
import spack.repo
import spack.spec
import spack.store
import spack.util.lock as lk
import spack.util.spack_yaml as syaml
import spack.util.spack_json as sjson
from spack.filesystem_view import YamlFilesystemView
from spack.util.crypto import bit_length
from llnl.util.filesystem import mkdirp
from spack.directory_layout import DirectoryLayoutError
from spack.error import SpackError
from spack.filesystem_view import YamlFilesystemView
from spack.util.crypto import bit_length
from spack.version import Version

# TODO: Provide an API automatically retyring a build after detecting and
@@ -284,27 +279,20 @@ def __init__(self, root, db_dir=None, upstream_dbs=None,
        exist. This is the ``db_dir``.

        The Database will attempt to read an ``index.json`` file in
        ``db_dir``. If it does not find one, it will fall back to read
        an ``index.yaml`` if one is present. If that does not exist, it
        will create a database when needed by scanning the entire
        Database root for ``spec.yaml`` files according to Spack's
        ``DirectoryLayout``.
        ``db_dir``. If that does not exist, it will create a database
        when needed by scanning the entire Database root for ``spec.yaml``
        files according to Spack's ``DirectoryLayout``.

        Caller may optionally provide a custom ``db_dir`` parameter
        where data will be stored. This is intended to be used for
        where data will be stored. This is intended to be used for
        testing the Database class.

        """
        self.root = root
        if db_dir is None:
            # If the db_dir is not provided, default to within the db root.
            self._db_dir = os.path.join(self.root, _db_dirname)
        else:
            # Allow customizing the database directory location for testing.
            self._db_dir = db_dir

        # If the db_dir is not provided, default to within the db root.
        self._db_dir = db_dir or os.path.join(self.root, _db_dirname)

        # Set up layout of database files within the db dir
        self._old_yaml_index_path = os.path.join(self._db_dir, 'index.yaml')
        self._index_path = os.path.join(self._db_dir, 'index.json')
        self._lock_path = os.path.join(self._db_dir, 'lock')

@@ -358,8 +346,8 @@ def __init__(self, root, db_dir=None, upstream_dbs=None,
                }
                try:
                    sjson.dump(database, f)
                except YAMLError as e:
                    raise syaml.SpackYAMLError(
                except Exception as e:
                    raise Exception(
                        "error writing YAML database:", str(e))
            self.lock = ForbiddenLock()
        else:
@@ -572,7 +560,8 @@ def prefix_write_lock(self, spec):
            prefix_lock.release_write()

    def _write_to_file(self, stream):
        """Write out the databsae to a JSON file.
        """Write out the database in JSON format to the stream passed
        as argument.

        This function does not do any locking or transactions.
        """
@@ -594,9 +583,8 @@ def _write_to_file(self, stream):

        try:
            sjson.dump(database, stream)
        except YAMLError as e:
            raise syaml.SpackYAMLError(
                "error writing YAML database:", str(e))
        except (TypeError, ValueError) as e:
            raise sjson.SpackJSONError("error writing JSON database:", str(e))

    def _read_spec_from_dict(self, hash_key, installs):
        """Recursively construct a spec from a hash in a YAML database.
@@ -667,28 +655,15 @@ def _assign_dependencies(self, hash_key, installs, data):

            spec._add_dependency(child, dtypes)

    def _read_from_file(self, stream, format='json'):
        """
        Fill database from file, do not maintain old data
        Translate the spec portions from node-dict form to spec form
    def _read_from_file(self, filename):
        """Fill database from file, do not maintain old data.
        Translate the spec portions from node-dict form to spec form.

        Does not do any locking.
        """
        if format.lower() == 'json':
            load = sjson.load
        elif format.lower() == 'yaml':
            load = syaml.load
        else:
            raise ValueError("Invalid database format: %s" % format)

        try:
            if isinstance(stream, string_types):
                with open(stream, 'r') as f:
                    fdata = load(f)
            else:
                fdata = load(stream)
        except MarkedYAMLError as e:
            raise syaml.SpackYAMLError("error parsing YAML database:", str(e))
            with open(filename, 'r') as f:
                fdata = sjson.load(f)
        except Exception as e:
            raise CorruptDatabaseError("error parsing database:", str(e))

@@ -700,12 +675,12 @@ def check(cond, msg):
            raise CorruptDatabaseError(
                "Spack database is corrupt: %s" % msg, self._index_path)

        check('database' in fdata, "No 'database' attribute in YAML.")
        check('database' in fdata, "no 'database' attribute in JSON DB.")

        # High-level file checks
        db = fdata['database']
        check('installs' in db, "No 'installs' in YAML DB.")
        check('version' in db, "No 'version' in YAML DB.")
        check('installs' in db, "no 'installs' in JSON DB.")
        check('version' in db, "no 'version' in JSON DB.")

        installs = db['installs']

@@ -781,7 +756,6 @@ def reindex(self, directory_layout):
        """Build database index from scratch based on a directory layout.

        Locks the DB if it isn't locked already.

        """
        if self.is_upstream:
            raise UpstreamDatabaseLockingError(
@@ -945,7 +919,6 @@ def _write(self, type, value, traceback):
        after the start of the next transaction, when it read from disk again.

        This routine does no locking.

        """
        # Do not write if exceptions were raised
        if type is not None:
@@ -970,35 +943,23 @@ def _read(self):
        """Re-read Database from the data in the set location.

        This does no locking, with one exception: it will automatically
        migrate an index.yaml to an index.json if possible. This requires
        taking a write lock.

        try to regenerate a missing DB if local. This requires taking a
        write lock.
        """
        if os.path.isfile(self._index_path):
            # Read from JSON file if a JSON database exists
            self._read_from_file(self._index_path, format='json')
            # Read from file if a database exists
            self._read_from_file(self._index_path)
            return
        elif self.is_upstream:
            raise UpstreamDatabaseLockingError(
                "No database index file is present, and upstream"
                " databases cannot generate an index file")

        elif os.path.isfile(self._old_yaml_index_path):
            if (not self.is_upstream) and os.access(
                    self._db_dir, os.R_OK | os.W_OK):
                # if we can write, then read AND write a JSON file.
                self._read_from_file(self._old_yaml_index_path, format='yaml')
                with lk.WriteTransaction(self.lock):
                    self._write(None, None, None)
            else:
                # Read chck for a YAML file if we can't find JSON.
                self._read_from_file(self._old_yaml_index_path, format='yaml')

        else:
            if self.is_upstream:
                raise UpstreamDatabaseLockingError(
                    "No database index file is present, and upstream"
                    " databases cannot generate an index file")
        # The file doesn't exist, try to traverse the directory.
        # reindex() takes its own write lock, so no lock here.
        with lk.WriteTransaction(self.lock):
            self._write(None, None, None)
        self.reindex(spack.store.layout)
            # The file doesn't exist, try to traverse the directory.
            # reindex() takes its own write lock, so no lock here.
            with lk.WriteTransaction(self.lock):
                self._write(None, None, None)
            self.reindex(spack.store.layout)

    def _add(
            self,
@@ -1078,7 +1039,9 @@ def _add(
        )

        # Connect dependencies from the DB to the new copy.
        for name, dep in iteritems(spec.dependencies_dict(_tracked_deps)):
        for name, dep in six.iteritems(
                spec.dependencies_dict(_tracked_deps)
        ):
            dkey = dep.spec.dag_hash()
            upstream, record = self.query_by_spec_hash(dkey)
            new_spec._add_dependency(record.spec, dep.deptypes)
@@ -1151,8 +1114,7 @@ def _increment_ref_count(self, spec):
        rec.ref_count += 1

    def _remove(self, spec):
        """Non-locking version of remove(); does real work.
        """
        """Non-locking version of remove(); does real work."""
        key = self._get_matching_spec_key(spec)
        rec = self._data[key]


@@ -257,7 +257,7 @@ def __init__(self, url=None, checksum=None, **kwargs):
            self.digest = kwargs[h]

        self.expand_archive = kwargs.get('expand', True)
        self.extra_curl_options = kwargs.get('curl_options', [])
        self.extra_options = kwargs.get('fetch_options', [])
        self._curl = None

        self.extension = kwargs.get('extension', None)
@@ -326,8 +326,6 @@ def _fetch_from_url(self, url):
            '-D',
            '-',  # print out HTML headers
            '-L',  # resolve 3xx redirects
            # Timeout if can't establish a connection after 10 sec.
            '--connect-timeout', '10',
            url,
        ]

@@ -339,7 +337,22 @@ def _fetch_from_url(self, url):
        else:
            curl_args.append('-sS')  # just errors when not.

        curl_args += self.extra_curl_options
        connect_timeout = spack.config.get('config:connect_timeout')

        if self.extra_options:
            cookie = self.extra_options.get('cookie')
            if cookie:
                curl_args.append('-j')  # junk cookies
                curl_args.append('-b')  # specify cookie
                curl_args.append(cookie)

            timeout = self.extra_options.get('timeout')
            if timeout:
                connect_timeout = max(connect_timeout, int(timeout))

        if connect_timeout > 0:
            # Timeout if can't establish a connection after n sec.
            curl_args.extend(['--connect-timeout', str(connect_timeout)])

        # Run curl but grab the mime type from the http headers
        curl = self.curl
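The curl argument assembly above moves the connection timeout from a hard-coded `'10'` into configuration, and lets a package's `fetch_options` raise (never lower) it. A standalone sketch, with plain arguments standing in for Spack's config lookup (function and parameter names are illustrative assumptions):

```python
def build_curl_args(url, connect_timeout=10, fetch_options=None):
    """Assemble curl options; fetch_options may add a cookie and raise
    (never lower) the configured connection timeout."""
    args = ['-D', '-',   # print HTTP headers
            '-L',        # follow 3xx redirects
            url]
    fetch_options = fetch_options or {}

    cookie = fetch_options.get('cookie')
    if cookie:
        args += ['-j',        # junk (discard) stored session cookies
                 '-b', cookie]  # send this cookie string instead

    timeout = fetch_options.get('timeout')
    if timeout:
        # A package may need longer than the global default, never shorter.
        connect_timeout = max(connect_timeout, int(timeout))
    if connect_timeout > 0:
        args += ['--connect-timeout', str(connect_timeout)]
    return args
```

A package requesting `{'timeout': 30}` widens the limit to 30 seconds, while `{'timeout': 5}` leaves the configured 10 in place.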
@@ -1166,6 +1179,15 @@ def fetch(self):
            raise FailedDownloadError(self.url)


def stable_target(fetcher):
    """Returns whether the fetcher target is expected to have a stable
    checksum. This is only true if the target is a preexisting archive
    file."""
    if isinstance(fetcher, URLFetchStrategy) and fetcher.cachable:
        return True
    return False


def from_url(url):
    """Given a URL, find an appropriate fetch strategy for it.
    Currently just gives you a URLFetchStrategy that uses curl.

lib/spack/spack/installer.py
Executable file → Normal file
@@ -651,6 +651,66 @@ def _add_bootstrap_compilers(self, pkg):
            if package_id(comp_pkg) not in self.build_tasks:
                self._push_task(comp_pkg, is_compiler, 0, 0, STATUS_ADDED)

    def _check_db(self, spec):
        """Determine if the spec is flagged as installed in the database

        Args:
            spec (Spec): spec whose database install status is being checked

        Return:
            (rec, installed_in_db) tuple where rec is the database record, or
            None, if there is no matching spec, and installed_in_db is
            ``True`` if the spec is considered installed and ``False``
            otherwise
        """
        try:
            rec = spack.store.db.get_record(spec)
            installed_in_db = rec.installed if rec else False
        except KeyError:
            # KeyError is raised if there is no matching spec in the database
            # (versus no matching specs that are installed).
            rec = None
            installed_in_db = False
        return rec, installed_in_db

    def _check_deps_status(self):
        """Check the install status of the explicit spec's dependencies"""

        err = 'Cannot proceed with {0}: {1}'
        for dep in self.spec.traverse(order='post', root=False):
            dep_pkg = dep.package
            dep_id = package_id(dep_pkg)

            # Check for failure since a prefix lock is not required
            if spack.store.db.prefix_failed(dep):
                action = "'spack install' the dependency"
                msg = '{0} is marked as an install failure: {1}' \
                    .format(dep_id, action)
                raise InstallError(err.format(self.pkg_id, msg))

            # Attempt to get a write lock to ensure another process does not
            # uninstall the dependency while the requested spec is being
            # installed
            ltype, lock = self._ensure_locked('write', dep_pkg)
            if lock is None:
                msg = '{0} is write locked by another process'.format(dep_id)
                raise InstallError(err.format(self.pkg_id, msg))

            # Flag external and upstream packages as being installed
            if dep_pkg.spec.external or dep_pkg.installed_upstream:
                form = 'external' if dep_pkg.spec.external else 'upstream'
                tty.debug('Flagging {0} {1} as installed'.format(form, dep_id))
                self.installed.add(dep_id)
                continue

            # Check the database to see if the dependency has been installed
            # and flag as such if appropriate
            rec, installed_in_db = self._check_db(dep)
            if installed_in_db:
                tty.debug('Flagging {0} as installed per the database'
                          .format(dep_id))
                self.installed.add(dep_id)

    def _prepare_for_install(self, task, keep_prefix, keep_stage,
                             restage=False):
        """
@@ -680,14 +740,7 @@ def _prepare_for_install(self, task, keep_prefix, keep_stage,
            return

        # Determine if the spec is flagged as installed in the database
        try:
            rec = spack.store.db.get_record(task.pkg.spec)
            installed_in_db = rec.installed if rec else False
        except KeyError:
            # KeyError is raised if there is no matching spec in the database
            # (versus no matching specs that are installed).
            rec = None
            installed_in_db = False
        rec, installed_in_db = self._check_db(task.pkg.spec)

        # Make sure the installation directory is in the desired state
        # for uninstalled specs.
@@ -929,6 +982,11 @@ def _init_queue(self, install_deps, install_package):
            # Be sure to clear any previous failure
            spack.store.db.clear_failure(self.pkg.spec, force=True)

        # If not installing dependencies, then determine their
        # installation status before proceeding
        if not install_deps:
            self._check_deps_status()

        # Now add the package itself, if appropriate
        self._push_task(self.pkg, False, 0, 0, STATUS_ADDED)

@@ -1022,7 +1080,8 @@ def build_process():
                configure_args = getattr(pkg, attr)()
                configure_args = ' '.join(configure_args)

                with open(pkg.configure_args_path, 'w') as args_file:
                with open(pkg.configure_args_path, 'w') as \
                        args_file:
                    args_file.write(configure_args)

                break
@@ -1095,7 +1154,7 @@ def build_process():
        except StopIteration as e:
            # A StopIteration exception means that do_install was asked to
            # stop early from clients.
            tty.msg('{0} {1}'.format(self.pid, e.message))
            tty.msg('{0} {1}'.format(self.pid, str(e)))
            tty.msg('Package stage directory : {0}'
                    .format(pkg.stage.source_path))


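The `_check_db` contract factored out above (return the record plus an installed flag, swallowing the "unknown spec" `KeyError`) can be sketched with a plain dict standing in for `spack.store.db` (all names here are illustrative assumptions):

```python
class FakeRecord:
    """Minimal stand-in for a database record."""
    def __init__(self, installed):
        self.installed = installed

class FakeDB:
    """Stand-in for spack.store.db: get_record raises KeyError when the
    spec is unknown, mirroring the behavior the except clause relies on."""
    def __init__(self, records):
        self._records = records

    def get_record(self, spec):
        return self._records[spec]  # KeyError if spec is unknown

def check_db(db, spec):
    """Return (record, installed_in_db) without raising for unknown specs."""
    try:
        rec = db.get_record(spec)
        installed_in_db = rec.installed if rec else False
    except KeyError:
        # No matching spec in the database at all (versus a known spec
        # that simply is not installed).
        rec, installed_in_db = None, False
    return rec, installed_in_db
```

This is the shape both `_check_deps_status` and `_prepare_for_install` now share instead of duplicating the try/except.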
@@ -401,7 +401,7 @@ def get_matching_versions(specs, num_versions=1):
    return matching


def create(path, specs):
def create(path, specs, skip_unstable_versions=False):
    """Create a directory to be used as a spack mirror, and fill it with
    package archives.

@@ -409,6 +409,9 @@ def create(path, specs):
        path: Path to create a mirror directory hierarchy in.
        specs: Any package versions matching these specs will be added \
            to the mirror.
        skip_unstable_versions: if true, this skips adding resources when
            they do not have a stable archive checksum (as determined by
            ``fetch_strategy.stable_target``)

    Return Value:
        Returns a tuple of lists: (present, mirrored, error)
@@ -440,16 +443,14 @@ def create(path, specs):
        raise MirrorError(
            "Cannot create directory '%s':" % mirror_root, str(e))

    mirror_cache = spack.caches.MirrorCache(mirror_root)
    mirror_cache = spack.caches.MirrorCache(
        mirror_root, skip_unstable_versions=skip_unstable_versions)
    mirror_stats = MirrorStats()
    try:
        spack.caches.mirror_cache = mirror_cache
        # Iterate through packages and download all safe tarballs for each
        for spec in specs:
            mirror_stats.next_spec(spec)
            add_single_spec(spec, mirror_root, mirror_stats)
    finally:
        spack.caches.mirror_cache = None

    # Iterate through packages and download all safe tarballs for each
    for spec in specs:
        mirror_stats.next_spec(spec)
        _add_single_spec(spec, mirror_cache, mirror_stats)

    return mirror_stats.stats()

@@ -495,7 +496,7 @@ def error(self):
        self.errors.add(self.current_spec)


def add_single_spec(spec, mirror_root, mirror_stats):
def _add_single_spec(spec, mirror, mirror_stats):
    tty.msg("Adding package {pkg} to mirror".format(
        pkg=spec.format("{name}{@version}")
    ))
@@ -503,10 +504,10 @@ def add_single_spec(spec, mirror_root, mirror_stats):
    while num_retries > 0:
        try:
            with spec.package.stage as pkg_stage:
                pkg_stage.cache_mirror(mirror_stats)
                pkg_stage.cache_mirror(mirror, mirror_stats)
                for patch in spec.package.all_patches():
                    if patch.cache():
                        patch.cache().cache_mirror(mirror_stats)
                    if patch.stage:
                        patch.stage.cache_mirror(mirror, mirror_stats)
                    patch.clean()
            exception = None
            break

@@ -344,7 +344,11 @@ def get_module(module_type, spec, get_full_path, required=True):
        The module name or path. May return ``None`` if the module is not
        available.
    """
    if spec.package.installed_upstream:
    try:
        upstream = spec.package.installed_upstream
    except spack.repo.UnknownPackageError:
        upstream, record = spack.store.db.query_by_spec_hash(spec.dag_hash())
    if upstream:
        module = (spack.modules.common.upstream_module_index
                  .upstream_module(spec, module_type))
        if not module:
@@ -426,6 +430,7 @@ def suffixes(self):
        for constraint, suffix in self.conf.get('suffixes', {}).items():
            if constraint in self.spec:
                suffixes.append(suffix)
        suffixes = sorted(set(suffixes))
        if self.hash:
            suffixes.append(self.hash)
        return suffixes

@@ -1135,8 +1135,8 @@ def do_fetch(self, mirror_only=False):

        for patch in self.spec.patches:
            patch.fetch()
            if patch.cache():
                patch.cache().cache_local()
            if patch.stage:
                patch.stage.cache_local()

    def do_stage(self, mirror_only=False):
        """Unpacks and expands the fetched tarball."""

@@ -85,7 +85,8 @@ def apply(self, stage):

        apply_patch(stage, self.path, self.level, self.working_dir)

    def cache(self):
    @property
    def stage(self):
        return None

    def to_dict(self):
@@ -248,9 +249,6 @@ def stage(self):
        self._stage.create()
        return self._stage

    def cache(self):
        return self.stage

    def clean(self):
        self.stage.destroy()


@@ -398,10 +398,44 @@ def replace_prefix_text(path_name, old_dir, new_dir):


def replace_prefix_bin(path_name, old_dir, new_dir):
    """
    Attempt to replace old install prefix with new install prefix
    in binary files by prefixing new install prefix with os.sep
    until the lengths of the prefixes are the same.
    """

    def replace(match):
        occurances = match.group().count(old_dir.encode('utf-8'))
        olen = len(old_dir.encode('utf-8'))
        nlen = len(new_dir.encode('utf-8'))
        padding = (olen - nlen) * occurances
        if padding < 0:
            return data
        return match.group().replace(old_dir.encode('utf-8'),
                                     os.sep.encode('utf-8') * padding +
                                     new_dir.encode('utf-8'))

    with open(path_name, 'rb+') as f:
        data = f.read()
        f.seek(0)
        original_data_len = len(data)
        pat = re.compile(old_dir.encode('utf-8') + b'([^\0]*?)\0')
        if not pat.search(data):
            return
        ndata = pat.sub(replace, data)
        if not len(ndata) == original_data_len:
            raise BinaryStringReplacementException(
                path_name, original_data_len, len(ndata))
        f.write(ndata)
        f.truncate()


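The padding trick in `replace_prefix_bin` can be sketched in isolation: when the new prefix is shorter, repeat the path separator in front of it so every rewritten null-terminated string keeps its original byte length (extra leading `/` characters are harmless in POSIX paths). This is a simplified sketch, not the function above: it pads each occurrence by a fixed amount rather than computing an occurrence count:

```python
import os
import re

def pad_replace(data, old_dir, new_dir):
    """Length-preserving prefix rewrite in null-terminated binary strings."""
    old, new = old_dir.encode('utf-8'), new_dir.encode('utf-8')
    pad = len(old) - len(new)
    if pad < 0:
        # New prefix is longer: padding is impossible, leave data untouched.
        return data

    def replace(match):
        # Pad each occurrence so its length stays exactly len(old).
        return match.group().replace(old, os.sep.encode('utf-8') * pad + new)

    # Match the old prefix up to (and including) the terminating NUL.
    pattern = re.compile(re.escape(old) + b'[^\x00]*?\x00')
    return pattern.sub(replace, data)
```

Preserving the total length is what lets the caller verify `len(ndata) == original_data_len` and write the result back in place without shifting any file offsets.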
def replace_prefix_nullterm(path_name, old_dir, new_dir):
    """
    Attempt to replace old install prefix with new install prefix
    in binary files by replacing with null terminated string
    that is the same length unless the old path is shorter
    Used on linux to replace mach-o rpaths
    """

    def replace(match):
@@ -413,7 +447,6 @@ def replace(match):
            return data
        return match.group().replace(old_dir.encode('utf-8'),
                                     new_dir.encode('utf-8')) + b'\0' * padding

    with open(path_name, 'rb+') as f:
        data = f.read()
        f.seek(0)
@@ -466,8 +499,7 @@ def relocate_macho_binaries(path_names, old_dir, new_dir, allow_root):
            modify_object_macholib(path_name, placeholder, new_dir)
            modify_object_macholib(path_name, old_dir, new_dir)
            if len(new_dir) <= len(old_dir):
                replace_prefix_bin(path_name, old_dir,
                                   new_dir)
                replace_prefix_nullterm(path_name, old_dir, new_dir)
            else:
                tty.warn('Cannot do a binary string replacement'
                         ' with padding for %s'

@@ -55,6 +55,7 @@
    },
    'source_cache': {'type': 'string'},
    'misc_cache': {'type': 'string'},
    'connect_timeout': {'type': 'integer', 'minimum': 0},
    'verify_ssl': {'type': 'boolean'},
    'suppress_gpg_warnings': {'type': 'boolean'},
    'install_missing_compilers': {'type': 'boolean'},

@@ -369,6 +369,10 @@ def _satisfies_target(self, other_target, strict):
        if not need_to_check:
            return True

        # self is not concrete, but other_target is there and strict=True
        if self.target is None:
            return False

        for target_range in str(other_target).split(','):
            t_min, sep, t_max = target_range.partition(':')
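The `partition(':')` in the hunk above splits each comma-separated target range into its bounds; a quick illustrative sketch (the constraint string here is made up):

```python
# An open-ended range like 'x86_64:' has an empty upper bound; a bare
# target name keeps itself as the lower bound with no ':' separator.
bounds = []
for target_range in 'x86_64:,haswell'.split(','):
    t_min, sep, t_max = target_range.partition(':')
    bounds.append((t_min, t_max))

assert bounds == [('x86_64', ''), ('haswell', '')]
```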
@@ -1919,9 +1923,7 @@ def from_dict(data):

            yaml_deps = node[name]['dependencies']
            for dname, dhash, dtypes in Spec.read_yaml_dep_specs(yaml_deps):
                # Fill in dependencies by looking them up by name in deps dict
                deps[name]._dependencies[dname] = DependencySpec(
                    deps[name], deps[dname], dtypes)
                deps[name]._add_dependency(deps[dname], dtypes)

        return spec

@@ -494,8 +494,14 @@ def cache_local(self):
        spack.caches.fetch_cache.store(
            self.fetcher, self.mirror_paths.storage_path)

    def cache_mirror(self, stats):
        """Perform a fetch if the resource is not already cached"""
    def cache_mirror(self, mirror, stats):
        """Perform a fetch if the resource is not already cached

        Arguments:
            mirror (MirrorCache): the mirror to cache this Stage's resource in
            stats (MirrorStats): this is updated depending on whether the
                caching operation succeeded or failed
        """
        if isinstance(self.default_fetcher, fs.BundleFetchStrategy):
            # BundleFetchStrategy has no source to fetch. The associated
            # fetcher does nothing but the associated stage may still exist.
@@ -506,20 +512,23 @@ def cache_mirror(self, stats):
            # must examine the type of the fetcher.
            return

        dst_root = spack.caches.mirror_cache.root
        if (mirror.skip_unstable_versions and
                not fs.stable_target(self.default_fetcher)):
            return

        absolute_storage_path = os.path.join(
            dst_root, self.mirror_paths.storage_path)
            mirror.root, self.mirror_paths.storage_path)

        if os.path.exists(absolute_storage_path):
            stats.already_existed(absolute_storage_path)
        else:
            self.fetch()
            self.check()
            spack.caches.mirror_cache.store(
            mirror.store(
                self.fetcher, self.mirror_paths.storage_path)
            stats.added(absolute_storage_path)

        spack.caches.mirror_cache.symlink(self.mirror_paths)
        mirror.symlink(self.mirror_paths)

    def expand_archive(self):
        """Changes to the stage directory and attempt to expand the downloaded

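The fetch-or-record control flow of `cache_mirror` above can be sketched with stand-in objects. All names below are hypothetical stand-ins (the real classes are `MirrorCache` and `MirrorStats`); only the branching mirrors the hunk: an existing storage path is recorded as such, otherwise the resource is fetched and recorded as newly added.

```python
import os

class Stats:
    """Toy stand-in for the stats object updated by cache_mirror."""
    def __init__(self):
        self.existing, self.added_paths = [], []

    def already_existed(self, path):
        self.existing.append(path)

    def added(self, path):
        self.added_paths.append(path)

def cache_if_missing(path, fetch, stats):
    # Record a hit if the entry is already in the mirror, otherwise
    # fetch it and record the addition.
    if os.path.exists(path):
        stats.already_existed(path)
    else:
        fetch()
        stats.added(path)

stats = Stats()
cache_if_missing('/nonexistent/mirror/entry', lambda: None, stats)
assert stats.added_paths == ['/nonexistent/mirror/entry']
```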
@@ -214,3 +214,16 @@ def test_optimization_flags_with_custom_versions(
    )
    opt_flags = target.optimization_flags(compiler)
    assert opt_flags == expected_flags


@pytest.mark.regression('15306')
@pytest.mark.parametrize('architecture_tuple,constraint_tuple', [
    (('linux', 'ubuntu18.04', None), ('linux', None, 'x86_64')),
    (('linux', 'ubuntu18.04', None), ('linux', None, 'x86_64:')),
])
def test_satisfy_strict_constraint_when_not_concrete(
        architecture_tuple, constraint_tuple
):
    architecture = spack.spec.ArchSpec(architecture_tuple)
    constraint = spack.spec.ArchSpec(constraint_tuple)
    assert not architecture.satisfies(constraint, strict=True)

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import errno
import platform

import pytest
@@ -11,6 +12,7 @@
import spack.binary_distribution

buildcache = spack.main.SpackCommand('buildcache')
install = spack.main.SpackCommand('install')


@pytest.fixture()
@@ -41,3 +43,16 @@ def test_buildcache_list_duplicates(mock_get_specs, capsys):
    output = buildcache('list', 'mpileaks', '@2.3')

    assert output.count('mpileaks') == 3


def test_buildcache_create_fail_on_perm_denied(
        install_mockery, mock_fetch, monkeypatch, tmpdir):
    """Ensure that buildcache create fails on permission denied error."""
    install('trivial-install-test-package')

    tmpdir.chmod(0)
    with pytest.raises(OSError) as error:
        buildcache('create', '-d', str(tmpdir),
                   '--unsigned', 'trivial-install-test-package')
    assert error.value.errno == errno.EACCES
    tmpdir.chmod(0o700)

@@ -672,6 +672,30 @@ def test_install_only_dependencies(tmpdir, mock_fetch, install_mockery):
    assert not os.path.exists(root.prefix)


def test_install_only_package(tmpdir, mock_fetch, install_mockery, capfd):
    msg = ''
    with capfd.disabled():
        try:
            install('--only', 'package', 'dependent-install')
        except spack.installer.InstallError as e:
            msg = str(e)

    assert 'Cannot proceed with dependent-install' in msg
    assert '1 uninstalled dependency' in msg


def test_install_deps_then_package(tmpdir, mock_fetch, install_mockery):
    dep = Spec('dependency-install').concretized()
    root = Spec('dependent-install').concretized()

    install('--only', 'dependencies', 'dependent-install')
    assert os.path.exists(dep.prefix)
    assert not os.path.exists(root.prefix)

    install('--only', 'package', 'dependent-install')
    assert os.path.exists(root.prefix)


@pytest.mark.regression('12002')
def test_install_only_dependencies_in_env(tmpdir, mock_fetch, install_mockery,
                                          mutable_mock_env_path):

@@ -66,6 +66,29 @@ def test_mirror_from_env(tmpdir, mock_packages, mock_fetch, config,
    assert mirror_res == expected


@pytest.fixture
def source_for_pkg_with_hash(mock_packages, tmpdir):
    pkg = spack.repo.get('trivial-pkg-with-valid-hash')
    local_url_basename = os.path.basename(pkg.url)
    local_path = os.path.join(str(tmpdir), local_url_basename)
    with open(local_path, 'w') as f:
        f.write(pkg.hashed_content)
    local_url = "file://" + local_path
    pkg.versions[spack.version.Version('1.0')]['url'] = local_url


def test_mirror_skip_unstable(tmpdir_factory, mock_packages, config,
                              source_for_pkg_with_hash):
    mirror_dir = str(tmpdir_factory.mktemp('mirror-dir'))

    specs = [spack.spec.Spec(x).concretized() for x in
             ['git-test', 'trivial-pkg-with-valid-hash']]
    spack.mirror.create(mirror_dir, specs, skip_unstable_versions=True)

    assert (set(os.listdir(mirror_dir)) - set(['_source-cache']) ==
            set(['trivial-pkg-with-valid-hash']))


def test_mirror_crud(tmp_scope, capsys):
    with capsys.disabled():
        mirror('add', '--scope', tmp_scope, 'mirror', 'http://spack.io')

@@ -6,6 +6,7 @@
from spack.main import SpackCommand

spack_test = SpackCommand('test')
cmd_test_py = 'lib/spack/spack/test/cmd/test.py'


def test_list():
@@ -16,13 +17,13 @@ def test_list():


def test_list_with_pytest_arg():
    output = spack_test('--list', 'cmd/test.py')
    assert output.strip() == "cmd/test.py"
    output = spack_test('--list', cmd_test_py)
    assert output.strip() == cmd_test_py


def test_list_with_keywords():
    output = spack_test('--list', '-k', 'cmd/test.py')
    assert output.strip() == "cmd/test.py"
    assert output.strip() == cmd_test_py


def test_list_long(capsys):
@@ -44,7 +45,7 @@ def test_list_long(capsys):

def test_list_long_with_pytest_arg(capsys):
    with capsys.disabled():
        output = spack_test('--list-long', 'cmd/test.py')
        output = spack_test('--list-long', cmd_test_py)
    assert "test.py::\n" in output
    assert "test_list" in output
    assert "test_list_with_pytest_arg" in output
@@ -74,7 +75,7 @@ def test_list_names():


def test_list_names_with_pytest_arg():
    output = spack_test('--list-names', 'cmd/test.py')
    output = spack_test('--list-names', cmd_test_py)
    assert "test.py::test_list\n" in output
    assert "test.py::test_list_with_pytest_arg\n" in output
    assert "test.py::test_list_with_keywords\n" in output

@@ -15,15 +15,15 @@


@pytest.fixture()
def concretize_scope(config, tmpdir):
def concretize_scope(mutable_config, tmpdir):
    """Adds a scope for concretization preferences"""
    tmpdir.ensure_dir('concretize')
    config.push_scope(
    mutable_config.push_scope(
        ConfigScope('concretize', str(tmpdir.join('concretize'))))

    yield

    config.pop_scope()
    mutable_config.pop_scope()
    spack.repo.path._provider_index = None


@@ -84,16 +84,24 @@ def test_preferred_variants(self):
            'mpileaks', debug=True, opt=True, shared=False, static=False
        )

    def test_preferred_compilers(self, mutable_mock_repo):
    def test_preferred_compilers(self):
        """Test preferred compilers are applied correctly
        """
        update_packages('mpileaks', 'compiler', ['clang@3.3'])
        spec = concretize('mpileaks')
        assert spec.compiler == spack.spec.CompilerSpec('clang@3.3')
        # Need to make sure the test uses an available compiler
        compiler_list = spack.compilers.all_compiler_specs()
        assert compiler_list

        update_packages('mpileaks', 'compiler', ['gcc@4.5.0'])
        # Try the first available compiler
        compiler = str(compiler_list[0])
        update_packages('mpileaks', 'compiler', [compiler])
        spec = concretize('mpileaks')
        assert spec.compiler == spack.spec.CompilerSpec('gcc@4.5.0')
        assert spec.compiler == spack.spec.CompilerSpec(compiler)

        # Try the last available compiler
        compiler = str(compiler_list[-1])
        update_packages('mpileaks', 'compiler', [compiler])
        spec = concretize('mpileaks')
        assert spec.compiler == spack.spec.CompilerSpec(compiler)

    def test_preferred_target(self, mutable_mock_repo):
        """Test preferred compilers are applied correctly

@@ -46,7 +46,19 @@

config_override_list = {
    'config': {
        'build_stage:': ['patha', 'pathb']}}
        'build_stage:': ['pathd', 'pathe']}}

config_merge_dict = {
    'config': {
        'info': {
            'a': 3,
            'b': 4}}}

config_override_dict = {
    'config': {
        'info:': {
            'a': 7,
            'c': 9}}}


@pytest.fixture()
@@ -382,7 +394,7 @@ def test_read_config_override_list(mock_low_high_config, write_config_file):
    write_config_file('config', config_override_list, 'high')
    assert spack.config.get('config') == {
        'install_tree': 'install_tree_path',
        'build_stage': ['patha', 'pathb']
        'build_stage': config_override_list['config']['build_stage:']
    }

@@ -857,3 +869,74 @@ def test_dotkit_in_config_does_not_raise(
    # we throw a deprecation warning without raising
    assert '_sp_sys_type' in captured[0]  # stdout
    assert 'Warning' in captured[1]  # stderr


def test_internal_config_section_override(mock_low_high_config,
                                          write_config_file):
    write_config_file('config', config_merge_list, 'low')
    wanted_list = config_override_list['config']['build_stage:']
    mock_low_high_config.push_scope(spack.config.InternalConfigScope
                                    ('high', {
                                        'config:': {
                                            'build_stage': wanted_list
                                        }
                                    }))
    assert mock_low_high_config.get('config:build_stage') == wanted_list


def test_internal_config_dict_override(mock_low_high_config,
                                       write_config_file):
    write_config_file('config', config_merge_dict, 'low')
    wanted_dict = config_override_dict['config']['info:']
    mock_low_high_config.push_scope(spack.config.InternalConfigScope
                                    ('high', config_override_dict))
    assert mock_low_high_config.get('config:info') == wanted_dict


def test_internal_config_list_override(mock_low_high_config,
                                       write_config_file):
    write_config_file('config', config_merge_list, 'low')
    wanted_list = config_override_list['config']['build_stage:']
    mock_low_high_config.push_scope(spack.config.InternalConfigScope
                                    ('high', config_override_list))
    assert mock_low_high_config.get('config:build_stage') == wanted_list


def test_set_section_override(mock_low_high_config, write_config_file):
    write_config_file('config', config_merge_list, 'low')
    wanted_list = config_override_list['config']['build_stage:']
    with spack.config.override('config::build_stage', wanted_list):
        assert mock_low_high_config.get('config:build_stage') == wanted_list
    assert config_merge_list['config']['build_stage'] == \
        mock_low_high_config.get('config:build_stage')


def test_set_list_override(mock_low_high_config, write_config_file):
    write_config_file('config', config_merge_list, 'low')
    wanted_list = config_override_list['config']['build_stage:']
    with spack.config.override('config:build_stage:', wanted_list):
        assert wanted_list == mock_low_high_config.get('config:build_stage')
    assert config_merge_list['config']['build_stage'] == \
        mock_low_high_config.get('config:build_stage')


def test_set_dict_override(mock_low_high_config, write_config_file):
    write_config_file('config', config_merge_dict, 'low')
    wanted_dict = config_override_dict['config']['info:']
    with spack.config.override('config:info:', wanted_dict):
        assert wanted_dict == mock_low_high_config.get('config:info')
    assert config_merge_dict['config']['info'] == \
        mock_low_high_config.get('config:info')


def test_set_bad_path(config):
    with pytest.raises(syaml.SpackYAMLError, match='Illegal leading'):
        with spack.config.override(':bad:path', ''):
            pass


def test_bad_path_double_override(config):
    with pytest.raises(syaml.SpackYAMLError,
                       match='Meaningless second override'):
        with spack.config.override('bad::double:override::directive', ''):
            pass

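The tests above exercise Spack's `:`-suffixed override keys: a key like `'build_stage:'` replaces the lower-scope value outright, while the plain key merges with it. A simplified model of that behavior, under the assumption that the merge puts higher-priority list items first (this is a sketch, not Spack's implementation):

```python
def merge(low, high):
    """Toy two-scope merge: keys ending in ':' override, others merge."""
    out = dict(low)
    for key, val in high.items():
        if key.endswith(':'):          # override: replace entirely
            out[key.rstrip(':')] = val
        elif isinstance(val, list):    # merge: high-priority items first
            out[key] = val + out.get(key, [])
        else:
            out[key] = val
    return out

assert merge({'build_stage': ['patha']},
             {'build_stage:': ['pathd']}) == {'build_stage': ['pathd']}
assert merge({'build_stage': ['patha']},
             {'build_stage': ['pathb']}) == {'build_stage': ['pathb', 'patha']}
```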
@@ -5,3 +5,4 @@ tcl:
  suffixes:
    '+debug': foo
    '~debug': bar
    '^mpich': foo

@@ -769,7 +769,7 @@ def test_query_spec_with_non_conditional_virtual_dependency(database):
def test_failed_spec_path_error(database):
    """Ensure spec not concrete check is covered."""
    s = spack.spec.Spec('a')
    with pytest.raises(ValueError, matches='Concrete spec required'):
    with pytest.raises(ValueError, match='Concrete spec required'):
        spack.store.db._failed_spec_path(s)


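Many hunks in these test files change `matches=` to `match=`. `match` is the actual `pytest.raises` keyword and performs a regex search over the string form of the raised exception; the misspelled keyword meant nothing was being checked. A small self-contained sketch of that check, using a hypothetical helper rather than pytest itself:

```python
import re

def raises_with_match(func, exc_type, pattern):
    # Mimics pytest.raises(exc_type, match=pattern): catch the exception
    # and re.search the pattern against its message.
    try:
        func()
    except exc_type as e:
        return re.search(pattern, str(e)) is not None
    return False

def boom():
    raise ValueError('Concrete spec required for spec a')

assert raises_with_match(boom, ValueError, 'Concrete spec required')
assert not raises_with_match(boom, ValueError, 'no such message')
```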
@@ -27,6 +27,12 @@ def _none(*args, **kwargs):
    return None


def _not_locked(installer, lock_type, pkg):
    """Generic monkeypatch function for _ensure_locked to return no lock"""
    tty.msg('{0} locked {1}' .format(lock_type, pkg.spec.name))
    return lock_type, None


def _true(*args, **kwargs):
    """Generic monkeypatch function that always returns True."""
    return True
@@ -239,7 +245,7 @@ def test_ensure_locked_have(install_mockery, tmpdir):
def test_package_id(install_mockery):
    """Test to cover package_id functionality."""
    pkg = spack.repo.get('trivial-install-test-package')
    with pytest.raises(ValueError, matches='spec is not concretized'):
    with pytest.raises(ValueError, match='spec is not concretized'):
        inst.package_id(pkg)

    spec = spack.spec.Spec('trivial-install-test-package')
@@ -274,7 +280,7 @@ def _no_compilers(pkg, arch_spec):

    # Test up to the dependency check
    monkeypatch.setattr(spack.compilers, 'compilers_for_spec', _no_compilers)
    with pytest.raises(spack.repo.UnknownPackageError, matches='not found'):
    with pytest.raises(spack.repo.UnknownPackageError, match='not found'):
        inst._packages_needed_to_bootstrap_compiler(spec.package)


@@ -285,6 +291,57 @@ def test_dump_packages_deps(install_mockery, tmpdir):
    inst.dump_packages(spec, '.')


@pytest.mark.tld
def test_check_deps_status_errs(install_mockery, monkeypatch):
    """Test to cover _check_deps_status failures."""
    spec, installer = create_installer('a')

    # Make sure the package is identified as failed
    orig_fn = spack.database.Database.prefix_failed
    monkeypatch.setattr(spack.database.Database, 'prefix_failed', _true)

    with pytest.raises(inst.InstallError, match='install failure'):
        installer._check_deps_status()

    monkeypatch.setattr(spack.database.Database, 'prefix_failed', orig_fn)

    # Ensure we do not acquire the lock
    monkeypatch.setattr(inst.PackageInstaller, '_ensure_locked', _not_locked)

    with pytest.raises(inst.InstallError, match='write locked by another'):
        installer._check_deps_status()


@pytest.mark.tld
def test_check_deps_status_external(install_mockery, monkeypatch):
    """Test to cover _check_deps_status for external."""
    spec, installer = create_installer('a')

    deps = spec.dependencies()
    assert len(deps) > 0
    dep_id = 'b'

    # Ensure the known dependent is installed if flagged as external
    monkeypatch.setattr(spack.spec.Spec, 'external', True)
    installer._check_deps_status()
    assert dep_id in installer.installed


@pytest.mark.tld
def test_check_deps_status_upstream(install_mockery, monkeypatch):
    """Test to cover _check_deps_status for upstream."""
    spec, installer = create_installer('a')

    deps = spec.dependencies()
    assert len(deps) > 0
    dep_id = 'b'

    # Ensure the known dependent, b, is installed if flagged as upstream
    monkeypatch.setattr(spack.package.PackageBase, 'installed_upstream', True)
    installer._check_deps_status()
    assert dep_id in installer.installed


def test_add_bootstrap_compilers(install_mockery, monkeypatch):
    """Test to cover _add_bootstrap_compilers."""
    def _pkgs(pkg):
@@ -334,6 +391,27 @@ def test_install_task_use_cache(install_mockery, monkeypatch):
    assert spec.package.name in installer.installed


def test_install_task_stop_iter(install_mockery, monkeypatch, capfd):
    """Test _install_task to cover the StopIteration exception."""
    mock_err_msg = 'mock stop iteration'

    def _raise(installer, pkg):
        raise StopIteration(mock_err_msg)

    spec, installer = create_installer('a')
    task = create_build_task(spec.package)

    monkeypatch.setattr(spack.package.PackageBase, 'unit_test_check', _true)
    monkeypatch.setattr(inst.PackageInstaller, '_setup_install_dir', _raise)

    installer._install_task(task)
    out = capfd.readouterr()[0]

    assert mock_err_msg in out
    assert 'Package stage directory' in out
    assert spec.package.stage.source_path in out


def test_release_lock_write_n_exception(install_mockery, tmpdir, capsys):
    """Test _release_lock for supposed write lock with exception."""
    spec, installer = create_installer('trivial-install-test-package')
@@ -428,7 +506,7 @@ def test_install_uninstalled_deps(install_mockery, monkeypatch, capsys):
    monkeypatch.setattr(inst.PackageInstaller, '_update_failed', _noop)

    msg = 'Cannot proceed with dependent-install'
    with pytest.raises(spack.installer.InstallError, matches=msg):
    with pytest.raises(spack.installer.InstallError, match=msg):
        installer.install()

    out = str(capsys.readouterr())
@@ -446,7 +524,7 @@ def test_install_failed(install_mockery, monkeypatch, capsys):
    monkeypatch.setattr(inst.PackageInstaller, '_install_task', _noop)

    msg = 'Installation of b failed'
    with pytest.raises(spack.installer.InstallError, matches=msg):
    with pytest.raises(spack.installer.InstallError, match=msg):
        installer.install()

    out = str(capsys.readouterr())
@@ -458,10 +536,6 @@ def test_install_lock_failures(install_mockery, monkeypatch, capfd):
    def _requeued(installer, task):
        tty.msg('requeued {0}' .format(task.pkg.spec.name))

    def _not_locked(installer, lock_type, pkg):
        tty.msg('{0} locked {1}' .format(lock_type, pkg.spec.name))
        return lock_type, None

    spec, installer = create_installer('b')

    # Ensure we never acquire a lock
@@ -485,10 +559,6 @@ def test_install_lock_installed_requeue(install_mockery, monkeypatch, capfd):
    def _install(installer, task, **kwargs):
        tty.msg('{0} installing'.format(task.pkg.spec.name))

    def _not_locked(installer, lock_type, pkg):
        tty.msg('{0} locked {1}' .format(lock_type, pkg.spec.name))
        return lock_type, None

    def _prep(installer, task, keep_prefix, keep_stage, restage):
        installer.installed.add('b')
        tty.msg('{0} is installed' .format(task.pkg.spec.name))
@@ -573,7 +643,7 @@ def _install(installer, task, **kwargs):

    spec, installer = create_installer('b')

    with pytest.raises(dl.InstallDirectoryAlreadyExistsError, matches=err):
    with pytest.raises(dl.InstallDirectoryAlreadyExistsError, match=err):
        installer.install()

    assert 'b' in installer.installed

@@ -1272,7 +1272,7 @@ def test_downgrade_write_fails(tmpdir):
    lock = lk.Lock('lockfile')
    lock.acquire_read()
    msg = 'Cannot downgrade lock from write to read on file: lockfile'
    with pytest.raises(lk.LockDowngradeError, matches=msg):
    with pytest.raises(lk.LockDowngradeError, match=msg):
        lock.downgrade_write_to_read()


@@ -1292,5 +1292,5 @@ def test_upgrade_read_fails(tmpdir):
    lock = lk.Lock('lockfile')
    lock.acquire_write()
    msg = 'Cannot upgrade lock from read to write on file: lockfile'
    with pytest.raises(lk.LockUpgradeError, matches=msg):
    with pytest.raises(lk.LockUpgradeError, match=msg):
        lock.upgrade_read_to_write()

@@ -32,7 +32,8 @@ def test_log_python_output_with_fd_stream(capfd, tmpdir):
    with open('foo.txt') as f:
        assert f.read() == 'logged\n'

    assert capfd.readouterr() == ('', '')
    # Coverage is cluttering stderr during tests
    assert capfd.readouterr()[0] == ''


def test_log_python_output_and_echo_output(capfd, tmpdir):
@@ -42,7 +43,8 @@ def test_log_python_output_and_echo_output(capfd, tmpdir):
        print('echo')
        print('logged')

    assert capfd.readouterr() == ('echo\n', '')
    # Coverage is cluttering stderr during tests
    assert capfd.readouterr()[0] == 'echo\n'

    with open('foo.txt') as f:
        assert f.read() == 'echo\nlogged\n'
@@ -75,7 +77,8 @@ def test_log_subproc_and_echo_output(capfd, tmpdir):
        echo('echo')
        print('logged')

    assert capfd.readouterr() == ('echo\n', '')
    # Coverage is cluttering stderr during tests
    assert capfd.readouterr()[0] == 'echo\n'

    with open('foo.txt') as f:
        assert f.read() == 'logged\n'

@@ -213,7 +213,7 @@ def test_mirror_cache_symlinks(tmpdir):
    """
    cosmetic_path = 'zlib/zlib-1.2.11.tar.gz'
    global_path = '_source-cache/archive/c3/c3e5.tar.gz'
    cache = spack.caches.MirrorCache(str(tmpdir))
    cache = spack.caches.MirrorCache(str(tmpdir), False)
    reference = spack.mirror.MirrorReference(cosmetic_path, global_path)

    cache.store(MockFetcher(), reference.storage_path)

@@ -11,6 +11,7 @@
import spack.spec
import spack.modules.tcl
from spack.modules.common import UpstreamModuleIndex
from spack.spec import Spec

import spack.error

@@ -183,3 +184,33 @@ def test_get_module_upstream():
        assert m1_path == '/path/to/a'
    finally:
        spack.modules.common.upstream_module_index = old_index


def test_load_installed_package_not_in_repo(install_mockery, mock_fetch,
                                            monkeypatch):
    # Get a basic concrete spec for the trivial install package.
    spec = Spec('trivial-install-test-package')
    spec.concretize()
    assert spec.concrete

    # Get the package
    pkg = spec.package

    def find_nothing(*args):
        raise spack.repo.UnknownPackageError(
            'Repo package access is disabled for test')

    try:
        pkg.do_install()

        spec._package = None
        monkeypatch.setattr(spack.repo, 'get', find_nothing)
        with pytest.raises(spack.repo.UnknownPackageError):
            spec.package

        module_path = spack.modules.common.get_module('tcl', spec, True)
        assert module_path
        pkg.do_uninstall()
    except Exception:
        pkg.remove_prefix()
        raise

@@ -215,9 +215,10 @@ def test_suffixes(self, module_configuration, factory):

        writer, spec = factory('mpileaks+debug arch=x86-linux')
        assert 'foo' in writer.layout.use_name
        assert 'foo-foo' not in writer.layout.use_name

        writer, spec = factory('mpileaks~debug arch=x86-linux')
        assert 'bar' in writer.layout.use_name
        assert 'bar-foo' in writer.layout.use_name

    def test_setup_environment(self, modulefile_content, module_configuration):
        """Tests the internal set-up of run-time environment."""

@@ -205,6 +205,8 @@ def push_to_url(
            # needs to be done in separate steps.
            shutil.copy2(local_file_path, remote_file_path)
            os.remove(local_file_path)
        else:
            raise

    elif remote_url.scheme == 's3':
        if extra_args is None:

@@ -1,7 +1,7 @@
# content of pytest.ini
[pytest]
addopts = --durations=20 -ra
testpaths = .
testpaths = lib/spack/spack/test
python_files = *.py
markers =
    db: tests that require creating a DB
@@ -37,16 +37,12 @@ bin/spack -h
bin/spack help -a

# Profile and print top 20 lines for a simple call to spack spec
bin/spack -p --lines 20 spec mpileaks%gcc ^elfutils@0.170
spack -p --lines 20 spec mpileaks%gcc ^elfutils@0.170

#-----------------------------------------------------------
# Run unit tests with code coverage
#-----------------------------------------------------------
extra_args=""
if [[ -n "$@" ]]; then
    extra_args="-k $@"
fi
$coverage_run bin/spack test -x --verbose "$extra_args"
$coverage_run $(which spack) test -x --verbose

#-----------------------------------------------------------
# Run tests for setup-env.sh

@@ -382,7 +382,7 @@ _spack_buildcache() {
_spack_buildcache_create() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -r --rel -f --force -u --unsigned -a --allow-root -k --key -d --directory --no-rebuild-index -y --spec-yaml --no-deps"
        SPACK_COMPREPLY="-h --help -r --rel -f --force -u --unsigned -a --allow-root -k --key -d --directory -m --mirror-name --mirror-url --no-rebuild-index -y --spec-yaml --only"
    else
        _all_packages
    fi
@@ -391,7 +391,7 @@ _spack_buildcache_create() {
_spack_buildcache_install() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -f --force -m --multiple -a --allow-root -u --unsigned"
        SPACK_COMPREPLY="-h --help -f --force -m --multiple -a --allow-root -u --unsigned -o --otherarch"
    else
        _all_packages
    fi
@@ -1025,7 +1025,7 @@ _spack_mirror() {
_spack_mirror_create() {
    if $list_options
    then
        SPACK_COMPREPLY="-h --help -d --directory -a --all -f --file -D --dependencies -n --versions-per-spec"
        SPACK_COMPREPLY="-h --help -d --directory -a --all -f --file --skip-unstable-versions -D --dependencies -n --versions-per-spec"
    else
        _all_packages
    fi

@@ -0,0 +1,17 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class TrivialPkgWithValidHash(Package):
    url = "http://www.unit-test-should-replace-this-url/trivial_install-1.0"

    version('1.0', '6ae8a75555209fd6c44157c0aed8016e763ff435a19cf186f76863140143ff72', expand=False)

    hashed_content = "test content"

    def install(self, spec, prefix):
        pass
@@ -16,7 +16,7 @@ class Adios2(CMakePackage):

    maintainers = ['ax3l', 'chuckatkins', 'williamfgc']

    version('develop', branch='master')
    version('master', branch='master')
    version('2.5.0', sha256='7c8ff3bf5441dd662806df9650c56a669359cb0185ea232ecb3578de7b065329')
    version('2.4.0', sha256='50ecea04b1e41c88835b4b3fd4e7bf0a0a2a3129855c9cc4ba6cf6a1575106e2')
    version('2.3.1', sha256='3bf81ccc20a7f2715935349336a76ba4c8402355e1dc3848fcd6f4c3c5931893')
@@ -100,8 +100,12 @@ class Adios2(CMakePackage):

    extends('python', when='+python')
    depends_on('python@2.7:2.8,3.5:',
               when='@:2.4.0 +python', type=('build', 'run'))
    depends_on('python@3.5:', when='@2.5.0: +python', type=('build', 'run'))
               when='@:2.4.0 +python',
               type=('build', 'run'))
    depends_on('python@3.5:', when='@2.5.0: +python',
               type=('build', 'run'))
    depends_on('python@2.7:2.8,3.5:', when='@:2.4.0', type='test')
    depends_on('python@3.5:', when='@2.5.0:', type='test')
    depends_on('py-numpy@1.6.1:', type=('build', 'run'), when='+python')
    depends_on('py-mpi4py@2.0.0:', type=('build', 'run'), when='+mpi +python')

@@ -141,9 +145,11 @@ def cmake_args(self):
                'ON' if '+fortran' in spec else 'OFF'),
            '-DADIOS2_USE_Endian_Reverse={0}'.format(
                'ON' if '+endian_reverse' in spec else 'OFF'),
            '-DBUILD_TESTING:BOOL={0}'.format(
                'ON' if self.run_tests else 'OFF'),
        ]

        if self.spec.version >= Version('2.4.0'):
        if spec.version >= Version('2.4.0'):
            args.append('-DADIOS2_USE_Blosc={0}'.format(
                'ON' if '+blosc' in spec else 'OFF'))
            args.append('-DADIOS2_USE_BZip2={0}'.format(
@@ -153,7 +159,7 @@ def cmake_args(self):
            args.append('-DADIOS2_USE_SSC={0}'.format(
                'ON' if '+ssc' in spec else 'OFF'))

        if self.spec.version >= Version('2.5.0'):
        if spec.version >= Version('2.5.0'):
            args.append('-DADIOS2_USE_DataSpaces={0}'.format(
                'ON' if '+dataspaces' in spec else 'OFF'))

@@ -173,8 +179,8 @@ def cmake_args(self):
        args.append('-DCMAKE_POSITION_INDEPENDENT_CODE:BOOL={0}'.format(
            'ON' if '+pic' in spec else 'OFF'))

        if spec.satisfies('+python'):
        if spec.satisfies('+python') or self.run_tests:
            args.append('-DPYTHON_EXECUTABLE:FILEPATH=%s'
                        % self.spec['python'].command.path)
                        % spec['python'].command.path)

        return args

@@ -18,6 +18,7 @@ class Amrex(CMakePackage):
    maintainers = ['mic84', 'asalmgren']

    version('develop', branch='development')
    version('20.03', sha256='a535dcc016f0d38b55d0ab8e9067c1c53e3686961f6a1fb471cb18a0ebc909e6')
    version('20.02', sha256='33529a23694283d12eb37d4682aa86c9cc1240bd50124efcf4464747a7554147')
    version('20.01', sha256='f7026d267ca5de79ec7e740264d54230f419776d40feae705e939be0b1d8e0d3')
    version('19.10', commit='52844b32b7da11e9733b9a7f4a782e51de7f5e1e')  # tag:19.10

@@ -108,7 +108,11 @@ def install(self, spec, prefix):
|
||||
if '+mpi' in spec:
|
||||
options.append('-DMPI=ON')
|
||||
|
||||
# TODO: -DINTERFACE64=ON
|
||||
# If 64-bit BLAS is used:
|
||||
if (spec.satisfies('^openblas+ilp64') or
|
||||
spec.satisfies('^intel-mkl+ilp64') or
|
||||
spec.satisfies('^intel-parallel-studio+mkl+ilp64')):
|
||||
options.append('-DINTERFACE64=1')
|
||||
|
||||
if '+shared' in spec:
|
||||
options.append('-DBUILD_SHARED_LIBS=ON')
|
||||
|
||||
@@ -12,7 +12,7 @@ class AwsParallelcluster(PythonPackage):
|
||||
tool to deploy and manage HPC clusters in the AWS cloud."""
|
||||
|
||||
homepage = "https://github.com/aws/aws-parallelcluster"
|
||||
url = "https://pypi.io/packages/source/a/aws-parallelcluster/aws-parallelcluster-2.5.1.tar.gz"
|
||||
url = "https://pypi.io/packages/source/a/aws-parallelcluster/aws-parallelcluster-2.6.0.tar.gz"
|
||||
|
||||
maintainers = [
|
||||
'sean-smith', 'demartinofra', 'enrico-usai', 'lukeseawalker', 'rexcsn',
|
||||
@@ -23,6 +23,7 @@ class AwsParallelcluster(PythonPackage):
|
||||
'pcluster.config', 'pcluster.networking'
|
||||
]
|
||||
|
||||
version('2.6.0', sha256='aaed6962cf5027206834ac24b3d312da91e0f96ae8607f555e12cb124b869f0c')
|
||||
version('2.5.1', sha256='4fd6e14583f8cf81f9e4aa1d6188e3708d3d14e6ae252de0a94caaf58be76303')
|
||||
version('2.5.0', sha256='3b0209342ea0d9d8cc95505456103ad87c2d4e35771aa838765918194efd0ad3')
|
||||
|
||||
|
||||
29
var/spack/repos/builtin/packages/bat/package.py
Normal file
29
var/spack/repos/builtin/packages/bat/package.py
Normal file
@@ -0,0 +1,29 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Bat(Package):
    """A cat(1) clone with wings."""

    homepage = "https://github.com/sharkdp/bat"
    url = "https://github.com/sharkdp/bat/archive/v0.10.0.tar.gz"

    version('0.12.1', sha256='1dd184ddc9e5228ba94d19afc0b8b440bfc1819fef8133fe331e2c0ec9e3f8e2')

    depends_on('rust')

    def install(self, spec, prefix):
        cargo = which('cargo')
        cargo('install', '--root', prefix, '--path', '.')

    # cargo seems to need these to be set so that when it's building
    # onig_sys it can run llvm-config and link against libclang.
    def setup_build_environment(self, env):
        env.append_flags('LLVM_CONFIG_PATH',
                         join_path(self.spec['llvm'].prefix.libexec.llvm,
                                   'llvm-config'))
        env.append_flags('LIBCLANG_PATH', self.spec['llvm'].prefix.lib)
@@ -12,6 +12,8 @@ class Bmi(AutotoolsPackage):
    homepage = 'https://xgitlab.cels.anl.gov/sds/bmi'
    git = 'https://xgitlab.cels.anl.gov/sds/bmi.git'

    maintainers = ['carns']

    version('develop', branch='master')

    depends_on('autoconf', type='build')

@@ -96,8 +96,12 @@ class Boost(Package):
    # mpi/python are not installed by default because they pull in many
    # dependencies and/or because there is a great deal of customization
    # possible (and it would be difficult to choose sensible defaults)
    #
    # Boost.Container can be both header-only and compiled. '+container'
    # indicates the compiled version which requires Extended Allocator
    # support. The header-only library is installed when no variant is given.
    default_noinstall_libs\
        = set(['context', 'coroutine', 'fiber', 'mpi', 'python'])
        = set(['container', 'context', 'coroutine', 'fiber', 'mpi', 'python'])

    all_libs = default_install_libs | default_noinstall_libs

@@ -174,6 +178,9 @@ def libs(self):
    conflicts('+taggedlayout', when='+versionedlayout')
    conflicts('+numpy', when='~python')

    # Container's Extended Allocators were not added until 1.56.0
    conflicts('+container', when='@:1.55.99')

    # Patch fix from https://svn.boost.org/trac/boost/ticket/11856
    patch('boost_11856.patch', when='@1.60.0%gcc@4.4.7')


@@ -11,7 +11,7 @@ class Bowtie2(Package):
    """Bowtie 2 is an ultrafast and memory-efficient tool for aligning
    sequencing reads to long reference sequences"""

    homepage = "bowtie-bio.sourceforge.net/bowtie2/index.shtml"
    homepage = "http://bowtie-bio.sourceforge.net/bowtie2/index.shtml"
    url = "http://downloads.sourceforge.net/project/bowtie-bio/bowtie2/2.3.1/bowtie2-2.3.1-source.zip"

    version('2.3.5.1', sha256='335c8dafb1487a4a9228ef922fbce4fffba3ce8bc211e2d7085aac092155a53f')

@@ -16,9 +16,12 @@ class Bzip2(Package):
    homepage = "https://sourceware.org/bzip2/"
    url = "https://sourceware.org/pub/bzip2/bzip2-1.0.8.tar.gz"

    version('1.0.8', sha256='ab5a03176ee106d3f0fa90e381da478ddae405918153cca248e682cd0c4a2269')
    version('1.0.7', sha256='e768a87c5b1a79511499beb41500bcc4caf203726fff46a6f5f9ad27fe08ab2b')
    version('1.0.6', sha256='a2848f34fcd5d6cf47def00461fcb528a0484d8edef8208d6d2e2909dc61d9cd')
    # The server is sometimes a bit slow to respond
    fetch_options = {'timeout': 60}

    version('1.0.8', sha256='ab5a03176ee106d3f0fa90e381da478ddae405918153cca248e682cd0c4a2269', fetch_options=fetch_options)
    version('1.0.7', sha256='e768a87c5b1a79511499beb41500bcc4caf203726fff46a6f5f9ad27fe08ab2b', fetch_options=fetch_options)
    version('1.0.6', sha256='a2848f34fcd5d6cf47def00461fcb528a0484d8edef8208d6d2e2909dc61d9cd', fetch_options=fetch_options)

    variant('shared', default=True, description='Enables the build of shared libraries.')


@@ -64,12 +64,12 @@ class Caliper(CMakePackage):
    depends_on('libpfm4@4.8:4.99', when='+libpfm')

    depends_on('mpi', when='+mpi')
    depends_on('unwind@2018.10.12,1.2:1.99', when='+callpath')
    depends_on('unwind@1.2:1.99', when='+callpath')

    depends_on('sosflow@spack', when='@1.0:1.99+sosflow')

    depends_on('cmake', type='build')
    depends_on('python', type='build')
    depends_on('python@3:', type='build')

    # sosflow support not yet in 2.0
    conflicts('+sosflow', '@2.0.0:2.2.99')
@@ -79,6 +79,8 @@ def cmake_args(self):
        spec = self.spec

        args = [
            ('-DPYTHON_EXECUTABLE=%s' %
             spec['python'].command.path),
            '-DBUILD_TESTING=Off',
            '-DBUILD_DOCS=Off',
            '-DBUILD_SHARED_LIBS=%s' % ('On' if '+shared' in spec else 'Off'),

@@ -14,6 +14,8 @@ class Clhep(CMakePackage):
    list_url = "https://proj-clhep.web.cern.ch/proj-clhep/"
    list_depth = 1

    maintainers = ['drbenmorgan']

    version('2.4.1.3', sha256='27c257934929f4cb1643aa60aeaad6519025d8f0a1c199bc3137ad7368245913')
    version('2.4.1.2', sha256='ff96e7282254164380460bc8cf2dff2b58944084eadcd872b5661eb5a33fa4b8')
    version('2.4.1.0', sha256='d14736eb5c3d21f86ce831dc1afcf03d423825b35c84deb6f8fd16773528c54d')

@@ -10,8 +10,9 @@ class Cscope(AutotoolsPackage):
    """Cscope is a developer's tool for browsing source code."""

    homepage = "http://cscope.sourceforge.net/"
    url = "http://downloads.sourceforge.net/project/cscope/cscope/15.8b/cscope-15.8b.tar.gz"
    url = "https://sourceforge.net/projects/cscope/files/cscope/v15.9/cscope-15.9.tar.gz"

    version('15.9', sha256='c5505ae075a871a9cd8d9801859b0ff1c09782075df281c72c23e72115d9f159')
    version('15.8b', sha256='4889d091f05aa0845384b1e4965aa31d2b20911fb2c001b2cdcffbcb7212d3af')

    depends_on('ncurses')
@@ -20,4 +21,11 @@ class Cscope(AutotoolsPackage):
    depends_on('bison', type='build')
    depends_on('pkgconfig', type='build')

    build_targets = ['CURSES_LIBS=-lncursesw']
    build_targets = ['CURSES_LIBS=-lncursesw -ltinfo']

    def url_for_version(self, version):
        url = "https://sourceforge.net/projects/cscope/files/cscope/{0}{1}/cscope-{1}.tar.gz"
        if version >= Version('15.9'):
            return url.format('v', version)
        else:
            return url.format('', version)

@@ -18,6 +18,8 @@ class DarshanRuntime(Package):
    url = "http://ftp.mcs.anl.gov/pub/darshan/releases/darshan-3.1.0.tar.gz"
    git = "https://xgitlab.cels.anl.gov/darshan/darshan.git"

    maintainers = ['shanedsnyder', 'carns']

    version('develop', branch='master')
    version('3.1.7', sha256='9ba535df292727ac1e8025bdf2dc42942715205cad8319d925723fd88709e8d6')
    version('3.1.6', sha256='21cb24e2a971c45e04476e00441b7fbea63d2afa727a5cf8b7a4a9d9004dd856')

@@ -16,6 +16,8 @@ class DarshanUtil(Package):
    url = "http://ftp.mcs.anl.gov/pub/darshan/releases/darshan-3.1.0.tar.gz"
    git = "https://xgitlab.cels.anl.gov/darshan/darshan.git"

    maintainers = ['shanedsnyder', 'carns']

    version('develop', branch='master')
    version('3.1.7', sha256='9ba535df292727ac1e8025bdf2dc42942715205cad8319d925723fd88709e8d6')
    version('3.1.6', sha256='21cb24e2a971c45e04476e00441b7fbea63d2afa727a5cf8b7a4a9d9004dd856')

@@ -151,6 +151,7 @@ class Dealii(CMakePackage, CudaPackage):
    depends_on('slepc@:3.6.3', when='@:8.4.1+slepc+petsc+mpi')
    depends_on('slepc~arpack', when='+slepc+petsc+mpi+int64')
    depends_on('sundials@:3~pthread', when='@9.0:+sundials')
    depends_on('trilinos gotype=int', when='+trilinos')
    # Both Trilinos and SymEngine bundle the Teuchos RCP library.
    # This leads to conflicts between macros defined in the included
    # headers when they are not compiled in the same mode.

@@ -4,31 +4,33 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)


import os
from spack import *


class EcpProxyApps(Package):
class EcpProxyApps(BundlePackage):
    """This is a collection of packages that represents the official suite of
    DOE/ECP proxy applications. This is a Spack bundle package that
    installs the ECP proxy application suite.
    """

    tags = ['proxy-app', 'ecp-proxy-app']
    maintainers = ['bhatele']
    maintainers = ['rspavel']

    homepage = "https://proxyapps.exascaleproject.org"
    # Dummy url
    url = 'https://github.com/exascaleproject/proxy-apps/archive/v1.0.tar.gz'

    version('2.1', sha256='604da008fc4ef3bdbc25505088d610333249e3e9745eac7dbfd05b91e33e218d')
    version('2.0', sha256='5f3cb3a772224e738c1dab42fb34d40f6b313af51ab1c575fb334e573e41e09a')
    version('1.1', sha256='8537e03588c0f46bebf5b7f07146c79812f2ebfb77d29e184baa4dd5f4603ee3')
    version('1.0', sha256='13d9795494dabdb4c724d2c0f322c2149b2507d2fd386ced12b54292b7ecf595')
    version('3.0')
    version('2.1')
    version('2.0')
    version('1.1')
    version('1.0')

    variant('candle', default=False,
            description='Also build CANDLE Benchmarks')

    # Added with release 3.0
    depends_on('miniamr@1.4.4', when='@3.0:')
    depends_on('xsbench@19', when='@3.0:')

    # Added with release 2.1
    depends_on('amg@1.2', when='@2.1:')
    depends_on('miniamr@1.4.3', when='@2.1:')
@@ -40,15 +42,15 @@ class EcpProxyApps(Package):
    depends_on('picsarlite@0.1', when='@2.0:')
    depends_on('thornado-mini@1.0', when='@2.0:')

    depends_on('candle-benchmarks@0.1', when='+candle @2.0:')
    depends_on('candle-benchmarks@0.1', when='+candle @2.0:2.1')
    depends_on('laghos@2.0', when='@2.0:')
    depends_on('macsio@1.1', when='@2.0:')
    depends_on('sw4lite@1.1', when='@2.0:')
    depends_on('xsbench@18', when='@2.0:')
    depends_on('xsbench@18', when='@2.0:2.1')

    # Dependencies for version 2.0
    depends_on('amg@1.1', when='@2.0')
    depends_on('miniamr@1.4.1', when='@2.0')
    depends_on('miniamr@1.4.1', when='@2.0:2.1')

    # Added with release 1.1
    depends_on('examinimd@1.0', when='@1.1:')
@@ -71,12 +73,3 @@ class EcpProxyApps(Package):

    # Removed after release 1.0
    depends_on('comd@1.1', when='@1.0')

    # Dummy install for now, will be removed when metapackage is available
    def install(self, spec, prefix):
        with open(os.path.join(spec.prefix, 'package-list.txt'), 'w') as out:
            for dep in spec.dependencies(deptype='build'):
                out.write("%s\n" % dep.format(
                    format_string='${PACKAGE} ${VERSION}'))
                os.symlink(dep.prefix, os.path.join(spec.prefix, dep.name))
            out.close()

@@ -1,27 +0,0 @@
Version 3.3.4 contained a bug that prevented it from finding scotch~mpi.

diff --git a/tmp/FindPTSCOTCH.cmake b/cmake/FindPTSCOTCH.cmake
index 1396d05..23451b1 100644
--- a/tmp/FindPTSCOTCH.cmake
+++ b/cmake/FindPTSCOTCH.cmake
@@ -167,11 +167,11 @@ endif()

 # If found, add path to cmake variable
 # ------------------------------------
+unset(PTSCOTCH_INCLUDE_DIRS)
 foreach(ptscotch_hdr ${PTSCOTCH_hdrs_to_find})
   if (PTSCOTCH_${ptscotch_hdr}_DIRS)
     list(APPEND PTSCOTCH_INCLUDE_DIRS "${PTSCOTCH_${ptscotch_hdr}_DIRS}")
   else ()
-    set(PTSCOTCH_INCLUDE_DIRS "PTSCOTCH_INCLUDE_DIRS-NOTFOUND")
     if (NOT PTSCOTCH_FIND_QUIETLY)
       message(STATUS "Looking for ptscotch -- ${ptscotch_hdr} not found")
     endif()
@@ -255,7 +255,6 @@ foreach(ptscotch_lib ${PTSCOTCH_libs_to_find})
     list(APPEND PTSCOTCH_LIBRARIES "${PTSCOTCH_${ptscotch_lib}_LIBRARY}")
     list(APPEND PTSCOTCH_LIBRARY_DIRS "${${ptscotch_lib}_lib_path}")
   else ()
-    list(APPEND PTSCOTCH_LIBRARIES "${PTSCOTCH_${ptscotch_lib}_LIBRARY}")
     if (NOT PTSCOTCH_FIND_QUIETLY)
       message(STATUS "Looking for ptscotch -- lib ${ptscotch_lib} not found")
     endif()
@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Eigen(CMakePackage):
    """Eigen is a C++ template library for linear algebra matrices,
@@ -12,41 +10,32 @@ class Eigen(CMakePackage):
    """

    homepage = 'http://eigen.tuxfamily.org/'
    url = 'https://bitbucket.org/eigen/eigen/get/3.3.4.tar.bz2'
    url = 'https://gitlab.com/libeigen/eigen/-/archive/3.3.7/eigen-3.3.7.tar.gz'

    version('3.3.7', sha256='9f13cf90dedbe3e52a19f43000d71fdf72e986beb9a5436dddcd61ff9d77a3ce')
    version('3.3.5', sha256='7352bff3ea299e4c7d7fbe31c504f8eb9149d7e685dec5a12fbaa26379f603e2')
    version('3.3.4', sha256='dd254beb0bafc695d0f62ae1a222ff85b52dbaa3a16f76e781dce22d0d20a4a6')
    version('3.3.3', sha256='a4143fc45e4454b4b98fcea3516b3a79b8cdb3bc7fadf996d088c6a0d805fea1')
    version('3.3.1', sha256='a0b4cebaabd8f371d1b364f9723585fbcc7c9640ca60273b99835e6cf115f056')
    version('3.2.10', sha256='760e6656426fde71cc48586c971390816f456d30f0b5d7d4ad5274d8d2cb0a6d')
    version('3.2.9', sha256='4d1e036ec1ed4f4805d5c6752b76072d67538889f4003fadf2f6e00a825845ff')
    version('3.2.8', sha256='722a63d672b70f39c271c5e2a4a43ba14d12015674331790414fcb167c357e55')
    version('3.2.7', sha256='e58e1a11b23cf2754e32b3c5990f318a8461a3613c7acbf6035870daa45c2f3e')
    version('3.3.7', sha256='d56fbad95abf993f8af608484729e3d87ef611dd85b3380a8bad1d5cbc373a57')
    version('3.3.6', sha256='e7cd8c94d6516d1ada9893ccc7c9a400fcee99927c902f15adba940787104dba')
    version('3.3.5', sha256='383407ab3d0c268074e97a2cbba84ac197fd24532f014aa2adc522355c1aa2d0')
    version('3.3.4', sha256='c5ca6e3442fb48ae75159ca7568854d9ba737bc351460f27ee91b6f3f9fd1f3d')
    version('3.3.3', sha256='fd72694390bd8e81586205717d2cf823e718f584b779a155db747d1e68481a2e')
    version('3.3.2', sha256='8d7611247fba1236da4dee7a64607017b6fb9ca5e3f0dc44d480e5d33d5663a5')
    version('3.3.1', sha256='50dd21a8997fce0857b27a126811ae8ee7619984ab5425ecf33510cec649e642')
    version('3.3.0', sha256='de82e01f97e1a95f121bd3ace87aa1237818353c14e38f630a65f5ba2c92f0e1')
    version('3.2.10', sha256='0920cb60ec38de5fb509650014eff7cc6d26a097c7b38c7db4b1aa5df5c85042')
    version('3.2.9', sha256='f683b20259ad72c3d384c00278166dd2a42d99b78dcd589ed4a6ca74bbb4ca07')
    version('3.2.8', sha256='64c54781cfe9eefef2792003ab04b271d4b2ec32eda6e9cdf120d7aad4ebb282')
    version('3.2.7', sha256='0ea9df884873275bf39c2965d486fa2d112f3a64b97b60b45b8bc4bb034a36c1')
    version('3.2.6', sha256='e097b8dcc5ad30d40af4ad72d7052e3f78639469baf83cffaadc045459cda21f')
    version('3.2.5', sha256='8068bd528a2ff3885eb55225c27237cf5cda834355599f05c2c85345db8338b4')

    variant('metis', default=False,
            description='Enables metis permutations in sparse algebra')
    variant('scotch', default=False,
            description='Enables scotch/pastix sparse factorization methods')
    variant('fftw', default=False,
            description='Enables FFTW backend for the FFT plugin')
    variant('suitesparse', default=False,
            description='Enables SuiteSparse sparse factorization methods')
    variant('mpfr', default=False,
            description='Enables the multi-precisions floating-point plugin')
    # From http://eigen.tuxfamily.org/index.php?title=Main_Page#Requirements
    # "Eigen doesn't have any dependencies other than the C++ standard
    # library."
    variant('build_type', default='RelWithDebInfo',
            description='The build type to build',
            values=('Debug', 'Release', 'RelWithDebInfo'))

    # TODO : dependency on googlehash, superlu, adolc missing
    depends_on('metis@5:', when='+metis')
    depends_on('scotch', when='+scotch')
    depends_on('fftw', when='+fftw')
    depends_on('suite-sparse', when='+suitesparse')
    depends_on('mpfr@2.3.0:', when='+mpfr')
    depends_on('gmp', when='+mpfr')

    patch('find-ptscotch.patch', when='@3.3.4')
    # TODO: latex and doxygen needed to produce docs with make doc
    # TODO: Other dependencies might be needed to test this package

    def setup_run_environment(self, env):
        env.prepend_path('CPATH', self.prefix.include.eigen3)

@@ -21,15 +21,18 @@ class Elfutils(AutotoolsPackage):
    list_url = "https://sourceware.org/elfutils/ftp"
    list_depth = 1

    version('0.178', sha256='31e7a00e96d4e9c4bda452e1f2cdac4daf8abd24f5e154dee232131899f3a0f2')
    version('0.177', sha256='fa489deccbcae7d8c920f60d85906124c1989c591196d90e0fd668e3dc05042e')
    version('0.176', sha256='eb5747c371b0af0f71e86215a5ebb88728533c3a104a43d4231963f308cd1023')
    version('0.175', sha256='f7ef925541ee32c6d15ae5cb27da5f119e01a5ccdbe9fe57bf836730d7b7a65b')
    version('0.174', sha256='cdf27e70076e10a29539d89e367101d516bc4aa11b0d7777fe52139e3fcad08a')
    version('0.173', sha256='b76d8c133f68dad46250f5c223482c8299d454a69430d9aa5c19123345a000ff')
    version('0.170', sha256='1f844775576b79bdc9f9c717a50058d08620323c1e935458223a12f249c9e066')
    version('0.168', sha256='b88d07893ba1373c7dd69a7855974706d05377766568a7d9002706d5de72c276')
    version('0.163', sha256='7c774f1eef329309f3b05e730bdac50013155d437518a2ec0e24871d312f2e23')
    # Sourceware is often slow to respond.
    timeout = {'timeout': 60}

    version('0.178', sha256='31e7a00e96d4e9c4bda452e1f2cdac4daf8abd24f5e154dee232131899f3a0f2', fetch_options=timeout)
    version('0.177', sha256='fa489deccbcae7d8c920f60d85906124c1989c591196d90e0fd668e3dc05042e', fetch_options=timeout)
    version('0.176', sha256='eb5747c371b0af0f71e86215a5ebb88728533c3a104a43d4231963f308cd1023', fetch_options=timeout)
    version('0.175', sha256='f7ef925541ee32c6d15ae5cb27da5f119e01a5ccdbe9fe57bf836730d7b7a65b', fetch_options=timeout)
    version('0.174', sha256='cdf27e70076e10a29539d89e367101d516bc4aa11b0d7777fe52139e3fcad08a', fetch_options=timeout)
    version('0.173', sha256='b76d8c133f68dad46250f5c223482c8299d454a69430d9aa5c19123345a000ff', fetch_options=timeout)
    version('0.170', sha256='1f844775576b79bdc9f9c717a50058d08620323c1e935458223a12f249c9e066', fetch_options=timeout)
    version('0.168', sha256='b88d07893ba1373c7dd69a7855974706d05377766568a7d9002706d5de72c276', fetch_options=timeout)
    version('0.163', sha256='7c774f1eef329309f3b05e730bdac50013155d437518a2ec0e24871d312f2e23', fetch_options=timeout)

    # Libraries for reading compressed DWARF sections.
    variant('bzip2', default=False,

@@ -15,3 +15,6 @@ class Exiv2(CMakePackage):
    url = "https://github.com/Exiv2/exiv2/archive/v0.27.2.tar.gz"

    version('0.27.2', sha256='3dbcaf01fbc5b98d42f091d1ff0d4b6cd9750dc724de3d9c0d113948570b2934')

    depends_on('zlib', type='link')
    depends_on('expat@2.2.6:', type='link')

@@ -66,7 +66,8 @@ def darwin_fix(self):
        fix_darwin_install_name(
            join_path(self.prefix.lib, 'expect{0}'.format(self.version)))

        old = 'libexpect{0}.dylib'.format(self.version)
        new = glob.glob(join_path(self.prefix.lib, 'expect*', 'libexpect*'))[0]
        install_name_tool = Executable('install_name_tool')
        install_name_tool('-change', old, new, self.prefix.bin.expect)
        old = 'libexpect{0}.dylib'.format(self.version)
        new = glob.glob(join_path(self.prefix.lib, 'expect*',
                                  'libexpect*'))[0]
        install_name_tool = Executable('install_name_tool')
        install_name_tool('-change', old, new, self.prefix.bin.expect)

@@ -8,7 +8,7 @@

class ExuberantCtags(AutotoolsPackage):
    """The canonical ctags generator"""
    homepage = "ctags.sourceforge.net"
    homepage = "http://ctags.sourceforge.net"
    url = "http://downloads.sourceforge.net/project/ctags/ctags/5.8/ctags-5.8.tar.gz"

    version('5.8', sha256='0e44b45dcabe969e0bbbb11e30c246f81abe5d32012db37395eb57d66e9e99c7')

@@ -15,10 +15,12 @@ class F77Zmq(MakefilePackage):

    maintainers = ['scemama']

    version('4.3.2', sha256='f1fb7544d38d9bb7235f98c96f241875ddcb0d37ed950618c23d4e4d666a73ca')
    version('4.3.1', sha256='a15d72d93022d3e095528d2808c7767cece974a2dc0e2dd95e4c122f60fcf0a8')

    depends_on('libzmq')
    depends_on('python', type='build')
    depends_on('python@3:', type='build', when="@:4.3.1")
    depends_on('python', type='build', when="@4.3.2:")

    def setup_build_environment(self, env):
        env.append_flags('CFLAGS', '-O3')

@@ -17,7 +17,8 @@ class Fairlogger(CMakePackage):
    maintainers = ['dennisklein', 'ChristianTackeGSI']
    # generator = 'Ninja'

    version('develop', branch='dev')
    version('develop', branch='dev', get_full_repo=True)
    version('1.6.2', sha256='5c6ef0c0029eb451fee71756cb96e6c5011040a9813e8889667b6f3b6b04ed03')
    version('1.6.1', sha256='3894580f4c398d724ba408e410e50f70c9f452e8cfaf7c3ff8118c08df28eaa8')
    version('1.6.0', sha256='721e8cadfceb2f63014c2a727e098babc6deba653baab8866445a772385d0f5b')
    version('1.5.0', sha256='8e74e0b1e50ee86f4fca87a44c6b393740b32099ac3880046bf252c31c58dd42')
@@ -31,19 +32,22 @@ class Fairlogger(CMakePackage):
            values=('Debug', 'Release', 'RelWithDebInfo'),
            multi=False,
            description='CMake build type')
    variant('cxxstd', default='11',
    variant('cxxstd', default='default',
            values=('11', '14', '17'),
            multi=False,
            description='Use the specified C++ standard when building.')
    variant('pretty',
            default=False,
            description='Use BOOST_PRETTY_FUNCTION macro.')
            description='Use BOOST_PRETTY_FUNCTION macro (Supported by 1.4+).')
    conflicts('+pretty', when='@:1.3.99')

    depends_on('cmake@3.9.4:', type='build')
    depends_on('git', type='build', when='@develop')

    depends_on('boost', when='+pretty')
    depends_on('fmt@5.3.0:', when='@1.6.0:')
    conflicts('^boost@1.70:', when='^cmake@:3.14')
    depends_on('fmt@5.3.0:5.99', when='@1.6.0:1.6.1')
    depends_on('fmt@5.3.0:', when='@1.6.2:')

    def patch(self):
        """FairLogger gets its version number from git.
@@ -55,9 +59,10 @@ def patch(self):
                    'CMakeLists.txt')

    def cmake_args(self):
        cxxstd = self.spec.variants['cxxstd'].value
        args = []
        args.append('-DCMAKE_CXX_STANDARD=%s' % cxxstd)
        cxxstd = self.spec.variants['cxxstd'].value
        if cxxstd != 'default':
            args.append('-DCMAKE_CXX_STANDARD=%s' % cxxstd)
        args.append('-DUSE_BOOST_PRETTY_FUNCTION=%s' %
                    ('ON' if '+pretty' in self.spec else 'OFF'))
        if self.spec.satisfies('@1.6:'):

@@ -18,6 +18,7 @@ class FastGlobalFileStatus(AutotoolsPackage):
    depends_on('mrnet')
    depends_on('mount-point-attributes')
    depends_on('mpi')
    depends_on('openssl')

    def configure_args(self):
        spec = self.spec

21 var/spack/repos/builtin/packages/fd-find/package.py (Normal file)
@@ -0,0 +1,21 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class FdFind(Package):
    """A simple, fast and user-friendly alternative to 'find'."""

    homepage = "https://github.com/sharkdp/fd"
    url = "https://github.com/sharkdp/fd/archive/v7.3.0.tar.gz"

    version('7.4.0', sha256='33570ba65e7f8b438746cb92bb9bc4a6030b482a0d50db37c830c4e315877537')

    depends_on('rust')

    def install(self, spec, prefix):
        cargo = which('cargo')
        cargo('install', '--root', prefix, '--path', '.')
@@ -17,3 +17,16 @@ class Flatbuffers(CMakePackage):
    version('1.10.0', sha256='3714e3db8c51e43028e10ad7adffb9a36fc4aa5b1a363c2d0c4303dd1be59a7c')
    version('1.9.0', sha256='5ca5491e4260cacae30f1a5786d109230db3f3a6e5a0eb45d0d0608293d247e3')
    version('1.8.0', sha256='c45029c0a0f1a88d416af143e34de96b3091642722aa2d8c090916c6d1498c2e')

    variant('shared', default=True,
            description='Build shared instead of static libraries')

    def cmake_args(self):
        args = []
        args.append('-DFLATBUFFERS_BUILD_SHAREDLIB={0}'.format(
            'ON' if '+shared' in self.spec else 'OFF'))
        args.append('-DFLATBUFFERS_BUILD_FLATLIB={0}'.format(
            'ON' if '+shared' not in self.spec else 'OFF'))
        if 'darwin' in self.spec.architecture:
            args.append('-DCMAKE_MACOSX_RPATH=ON')
        return args

@@ -11,6 +11,7 @@ class G4abla(Package):
    """Geant4 data for nuclear shell effects in INCL/ABLA hadronic mode"""
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4ABLA.3.0.tar.gz"
    maintainers = ['drbenmorgan']

    version(
        '3.0', sha256='99fd4dcc9b4949778f14ed8364088e45fa4ff3148b3ea36f9f3103241d277014')

@@ -11,6 +11,7 @@ class G4emlow(Package):
    """Geant4 data files for low energy electromagnetic processes."""
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4EMLOW.6.50.tar.gz"
    maintainers = ['drbenmorgan']

    version(
        '6.50', sha256='c97be73fece5fb4f73c43e11c146b43f651c6991edd0edf8619c9452f8ab1236')

@@ -12,6 +12,8 @@ class G4ensdfstate(Package):
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4ENSDFSTATE.2.1.tar.gz"

    maintainers = ['drbenmorgan']

    version('2.1', sha256='933e7f99b1c70f24694d12d517dfca36d82f4e95b084c15d86756ace2a2790d9')
    version('2.2', sha256='dd7e27ef62070734a4a709601f5b3bada6641b111eb7069344e4f99a01d6e0a6')


@@ -12,6 +12,8 @@ class G4ndl(Package):
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4NDL.4.5.tar.gz"

    maintainers = ['drbenmorgan']

    version('4.5', sha256='cba928a520a788f2bc8229c7ef57f83d0934bb0c6a18c31ef05ef4865edcdf8e')

    def install(self, spec, prefix):

@@ -13,6 +13,8 @@ class G4neutronxs(Package):
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4NEUTRONXS.1.4.tar.gz"

    maintainers = ['drbenmorgan']

    version('1.4', sha256='57b38868d7eb060ddd65b26283402d4f161db76ed2169437c266105cca73a8fd')

    def install(self, spec, prefix):

@@ -12,6 +12,8 @@ class G4photonevaporation(Package):
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4PhotonEvaporation.4.3.2.tar.gz"

    maintainers = ['drbenmorgan']

    version('4.3.2', sha256='d4641a6fe1c645ab2a7ecee09c34e5ea584fb10d63d2838248bfc487d34207c7')
    version('5.2', sha256='83607f8d36827b2a7fca19c9c336caffbebf61a359d0ef7cee44a8bcf3fc2d1f')


@@ -12,6 +12,8 @@ class G4pii(Package):
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4PII.1.3.tar.gz"

    maintainers = ['drbenmorgan']

    version('1.3', sha256='6225ad902675f4381c98c6ba25fc5a06ce87549aa979634d3d03491d6616e926')

    def install(self, spec, prefix):

@@ -12,6 +12,8 @@ class G4radioactivedecay(Package):
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4RadioactiveDecay.5.1.1.tar.gz"

    maintainers = ['drbenmorgan']

    version('5.1.1', sha256='f7a9a0cc998f0d946359f2cb18d30dff1eabb7f3c578891111fc3641833870ae')
    version('5.2', sha256='99c038d89d70281316be15c3c98a66c5d0ca01ef575127b6a094063003e2af5d')


@@ -12,6 +12,8 @@ class G4realsurface(Package):
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/RealSurface.1.0.tar.gz"

    maintainers = ['drbenmorgan']

    version('1.0', sha256='3e2d2506600d2780ed903f1f2681962e208039329347c58ba1916740679020b1')
    version('2.1', sha256='2a287adbda1c0292571edeae2082a65b7f7bd6cf2bf088432d1d6f889426dcf3')
    version('2.1.1', sha256='90481ff97a7c3fa792b7a2a21c9ed80a40e6be386e581a39950c844b2dd06f50')

@@ -12,6 +12,8 @@ class G4saiddata(Package):
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4SAIDDATA.1.1.tar.gz"

    maintainers = ['drbenmorgan']

    version('1.1', sha256='a38cd9a83db62311922850fe609ecd250d36adf264a88e88c82ba82b7da0ed7f')

    def install(self, spec, prefix):

@@ -12,6 +12,8 @@ class G4tendl(Package):
    homepage = "http://geant4.web.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/datasets/G4TENDL.1.3.tar.gz"

    maintainers = ['drbenmorgan']

    version('1.3', sha256='52ad77515033a5d6f995c699809b464725a0e62099b5e55bf07c8bdd02cd3bce')
    version('1.3.2', sha256='3b2987c6e3bee74197e3bd39e25e1cc756bb866c26d21a70f647959fc7afb849')


@@ -28,6 +28,7 @@ class Gcc(AutotoolsPackage, GNUMirrorPackage):
    version('9.2.0', sha256='ea6ef08f121239da5695f76c9b33637a118dcf63e24164422231917fa61fb206')
    version('9.1.0', sha256='79a66834e96a6050d8fe78db2c3b32fb285b230b855d0a66288235bc04b327a0')

    version('8.4.0', sha256='e30a6e52d10e1f27ed55104ad233c30bd1e99cfb5ff98ab022dc941edd1b2dd4')
    version('8.3.0', sha256='64baadfe6cc0f4947a84cb12d7f0dfaf45bb58b7e92461639596c21e02d97d2c')
    version('8.2.0', sha256='196c3c04ba2613f893283977e6011b2345d1cd1af9abeac58e916b1aab3e0080')
    version('8.1.0', sha256='1d1866f992626e61349a1ccd0b8d5253816222cdc13390dcfaa74b093aa2b153')

@@ -193,6 +194,11 @@ class Gcc(AutotoolsPackage, GNUMirrorPackage):
    # Binutils can't build ld on macOS
    conflicts('+binutils', when='platform=darwin')

    # Newer binutils than RHEL's is required to run `as` on some instructions
    # generated by new GCC (see https://github.com/spack/spack/issues/12235)
    conflicts('~binutils', when='@7: os=rhel6',
              msg='New GCC cannot use system assembler on RHEL6')

    if sys.platform == 'darwin':
        # Fix parallel build on APFS filesystem
        # https://gcc.gnu.org/bugzilla/show_bug.cgi?id=81797
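The two `conflicts` directives above declare configurations that Spack's concretizer must reject: Spack-built binutils on Darwin, and the system assembler with GCC 7+ on RHEL6. As a toy illustration of the second rule's semantics (a simplification for this note, not Spack's actual spec solver):

```python
def violates_rhel6_rule(gcc_major, os_name, plus_binutils):
    """True when the spec would hit conflicts('~binutils', when='@7: os=rhel6').

    GCC 7 and newer emit instructions that RHEL6's stock `as` cannot
    assemble, so such builds must use the Spack-built binutils instead.
    """
    return gcc_major >= 7 and os_name == 'rhel6' and not plus_binutils


# GCC 9 with ~binutils on rhel6 is rejected; with +binutils it is fine.
print(violates_rhel6_rule(9, 'rhel6', False))
print(violates_rhel6_rule(9, 'rhel6', True))
```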
@@ -14,6 +14,8 @@ class Geant4Data(Package):
    homepage = "http://geant4.cern.ch"
    url = "http://geant4-data.web.cern.ch/geant4-data/ReleaseNotes/ReleaseNotes4.10.3.html"

    maintainers = ['drbenmorgan']

    version('10.03.p03', sha256='3e0d4d4e6854c8667d930fd5beaec09b7e6ec41f4847935e5d6a2720d0094b30', expand=False)
    version('10.04', sha256='f67fb899b99348a1a7e471a05f249f972e7e303c78238fc5f693b99968642255', expand=False)
@@ -17,6 +17,8 @@ class Geant4(CMakePackage):
    homepage = "http://geant4.cern.ch/"
    url = "http://geant4.cern.ch/support/source/geant4.10.01.p03.tar.gz"

    maintainers = ['drbenmorgan']

    version('10.05.p01', sha256='f4a292220500fad17e0167ce3153e96e3410ecbe96284e572dc707f63523bdff')
    version('10.04', sha256='f6d883132f110eb036c69da2b21df51f13c585dc7b99d4211ddd32f4ccee1670')
    version('10.03.p03', sha256='a164f49c038859ab675eec474d08c9d02be8c4be9c0c2d3aa8e69adf89e1e138')
36  var/spack/repos/builtin/packages/gengetopt/package.py  Normal file
@@ -0,0 +1,36 @@
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Gengetopt(AutotoolsPackage):
    """Tool to write command line option parsing code for C programs"""

    homepage = "https://www.gnu.org/software/gengetopt/gengetopt.html"
    url = "ftp://ftp.gnu.org/gnu/gengetopt/gengetopt-2.23.tar.xz"

    maintainers = ['rblake-llnl']

    version('2.23', sha256='b941aec9011864978dd7fdeb052b1943535824169d2aa2b0e7eae9ab807584ac')
    version('2.22.6', sha256='30b05a88604d71ef2a42a2ef26cd26df242b41f5b011ad03083143a31d9b01f7')
    version('2.22.5', sha256='3b6fb3240352b0eb0c5b8583b58b62cbba58167cef5a7e82fa08a7f968ed2137')
    version('2.22.4', sha256='4edf6b24ec8085929c86835c51d5bf904052cc671530c15f9314d9b87fe54421')
    version('2.22.3', sha256='8ce6b3df49cefea97bd522dc054ede2037939978bf23754d5c17311e5a1df3dc')
    version('2.22.2', sha256='4bf96bea9f80ac85c716cd07f5fe68602db7f380f6dc2d025f17139aa0b56455')
    version('2.22.1', sha256='e8d1de4f8c102263844886a2f2b57d82c291c1eae6307ea406fb96f29a67c3a7')
    version('2.22', sha256='b605555e41e9bf7e852a37b051e6a49014e561f21290680e3a60c279488d417e')
    version('2.21', sha256='355a32310b2fee5e7289d6d6e89eddd13275a7c85a243dc5dd293a6cb5bb047e')
    version('2.20', sha256='4c8b3b42cecff579f5f9de5ccad47e0849e0245e325a04ff5985c248141af1a4')

    parallel = False

    def url_for_version(self, version):
        url = 'ftp://ftp.gnu.org/gnu/gengetopt/gengetopt-{0}.tar.{1}'
        if version >= Version('2.23'):
            suffix = 'xz'
        else:
            suffix = 'gz'
        return url.format(version, suffix)
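The `url_for_version` hook above switches the tarball suffix at the 2.23 release, since GNU started publishing gengetopt as `.tar.xz` at that point. A standalone sketch of that logic (the `Version` class below is a simplified stand-in for Spack's real version type, assumed here only so the snippet runs outside Spack):

```python
class Version(tuple):
    """Simplified stand-in for spack.version.Version (numeric parts only)."""

    def __new__(cls, s):
        return super().__new__(cls, (int(p) for p in s.split('.')))

    def __str__(self):
        return '.'.join(str(p) for p in self)


def url_for_version(version):
    # Releases 2.23 and later are distributed as .tar.xz, older ones as .tar.gz.
    url = 'ftp://ftp.gnu.org/gnu/gengetopt/gengetopt-{0}.tar.{1}'
    suffix = 'xz' if version >= Version('2.23') else 'gz'
    return url.format(version, suffix)


print(url_for_version(Version('2.23')))    # ...gengetopt-2.23.tar.xz
print(url_for_version(Version('2.22.6')))  # ...gengetopt-2.22.6.tar.gz
```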
39  var/spack/repos/builtin/packages/glfw/package.py  Normal file
@@ -0,0 +1,39 @@
# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack import *


class Glfw(CMakePackage):
    """GLFW is an Open Source, multi-platform library for
    OpenGL, OpenGL ES and Vulkan development on the desktop. It
    provides a simple API for creating windows, contexts and
    surfaces, receiving input and events."""

    homepage = "https://www.glfw.org/"
    url = "https://github.com/glfw/glfw/archive/3.3.2.tar.gz"

    version('3.3.2', sha256='98768e12e615fbe9f3386f5bbfeb91b5a3b45a8c4c77159cef06b1f6ff749537')
    version('3.3.1', sha256='6bca16e69361798817a4b62a5239a77253c29577fcd5d52ae8b85096e514177f')
    version('3.3', sha256='81bf5fde487676a8af55cb317830703086bb534c53968d71936e7b48ee5a0f3e')
    version('3.2.1', sha256='e10f0de1384d75e6fc210c53e91843f6110d6c4f3afbfb588130713c2f9d8fe8')
    version('3.2', sha256='cb3aab46757981a39ae108e5207a1ecc4378e68949433a2b040ce2e17d8f6aa6')
    version('3.1.2', sha256='6ac642087682aaf7f8397761a41a99042b2c656498217a1c63ba9706d1eef122')
    version('3.1.1', sha256='4de311ec9bf43bfdc8423ddf93b91dc54dc73dcfbedfb0991b6fbb3a9baf245f')
    version('3.1', sha256='2140f4c532e7ce4c84cb7e4c419d0979d5954fa1ce204b7646491bd2cc5bf308')
    version('3.0.4', sha256='a4e7c57db2086803de4fc853bd472ff8b6d2639b9aa16e6ac6b19ffb53958caf')
    version('3.0.3', sha256='7a182047ba6b1fdcda778b79aac249bb2328b6d141188cb5df29560715d01693')

    depends_on('libxrandr')
    depends_on('libxinerama')
    depends_on('libxcursor')
    depends_on('libxdamage')
    depends_on('libxft')
    depends_on('libxi')
    depends_on('libxmu')
    depends_on('freetype')
    depends_on('fontconfig')
    depends_on('doxygen', type='build')
    depends_on('pkgconfig', type='build')
@@ -7,38 +7,49 @@

class Gnupg(AutotoolsPackage):
    """GnuPG is a complete and free implementation of the OpenPGP
    standard as defined by RFC4880 """
    """GNU Pretty Good Privacy (PGP) package."""

    homepage = "https://gnupg.org/index.html"
    url = "https://gnupg.org/ftp/gcrypt/gnupg/gnupg-2.2.17.tar.bz2"
    url = "https://gnupg.org/ftp/gcrypt/gnupg/gnupg-2.2.19.tar.bz2"

    version('2.2.19', sha256='242554c0e06f3a83c420b052f750b65ead711cc3fddddb5e7274fcdbb4e9dec0')
    version('2.2.17', sha256='afa262868e39b651a2db4c071fba90415154243e83a830ca00516f9a807fd514')
    version('2.2.15', sha256='cb8ce298d7b36558ffc48aec961b14c830ff1783eef7a623411188b5e0f5d454')
    version('2.2.3', sha256='cbd37105d139f7aa74f92b6f65d136658682094b0e308666b820ae4b984084b4')
    version('2.1.21', sha256='7aead8a8ba75b69866f583b6c747d91414d523bfdfbe9a8e0fe026b16ba427dd')

    depends_on('npth@1.2:')
    depends_on('libgpg-error@1.24:')
    depends_on('libgcrypt@1.7.0:')
    depends_on('libksba@1.3.4:')
    depends_on('libassuan@2.4:', when='@:2.2.3')
    depends_on('libassuan@2.5:', when='@2.2.15:')
    depends_on('libksba@1.3.4:')
    depends_on('libgpg-error@1.24:')
    depends_on('npth@1.2:')
    depends_on('zlib')
    depends_on('pinentry', type='run')
    depends_on('libiconv')
    depends_on('zlib')

    def configure_args(self):
        return [
            '--without-tar',
        args = [
            '--disable-bzip2',
            '--disable-sqlite',
            '--disable-ntbtls',
            '--disable-gnutls',
            '--disable-ldap',
            '--disable-regex',
            '--with-pinentry-pgm=' + self.spec['pinentry'].command.path,
            '--with-libgpg-error-prefix=' + self.spec['libgpg-error'].prefix,
            '--with-libgcrypt-prefix=' + self.spec['libgcrypt'].prefix,
            '--with-libassuan-prefix=' + self.spec['libassuan'].prefix,
            '--with-ksba-prefix=' + self.spec['libksba'].prefix,
            '--with-npth-prefix=' + self.spec['npth'].prefix,
            '--without-ldap',
            '--with-libiconv-prefix=' + self.spec['libiconv'].prefix,
            '--without-regex',
            '--with-zlib=' + self.spec['zlib'].prefix,
            '--without-bzip2',
            '--without-tar',
            '--without-libiconv-prefix',
            '--without-readline',
        ]

        if self.run_tests:
            args.append('--enable-all-tests')

        return args
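The reworked `configure_args` above follows a common Spack pattern: build a base list of flags pointing configure at each dependency's prefix, then append extras conditionally (here, on `self.run_tests`). A minimal standalone sketch of that pattern, with `deps` as an assumed stand-in mapping dependency names to install prefixes:

```python
def configure_args(deps, run_tests=False):
    """Assemble configure flags; `deps` maps dependency name -> prefix path.

    `run_tests` mirrors Spack's self.run_tests flag; both parameter names
    are assumptions of this sketch, not Spack API.
    """
    args = [
        '--disable-ldap',
        '--disable-regex',
        '--with-npth-prefix=' + deps['npth'],
        '--with-zlib=' + deps['zlib'],
    ]
    if run_tests:
        # Only request the full test suite when tests were asked for.
        args.append('--enable-all-tests')
    return args


print(configure_args({'npth': '/opt/npth', 'zlib': '/opt/zlib'}, run_tests=True))
```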
@@ -35,6 +35,7 @@ class Go(Package):

    extendable = True

    version('1.14', sha256='6d643e46ad565058c7a39dac01144172ef9bd476521f42148be59249e4b74389')
    version('1.13.8', sha256='b13bf04633d4d8cf53226ebeaace8d4d2fd07ae6fa676d0844a688339debec34')
    version('1.13.7', sha256='e4ad42cc5f5c19521fbbbde3680995f2546110b5c6aa2b48c3754ff7af9b41f4')
    version('1.13.6', sha256='aae5be954bdc40bcf8006eb77e8d8a5dde412722bc8effcdaf9772620d06420c')
@@ -24,8 +24,10 @@ class Gromacs(CMakePackage):
    git = 'https://github.com/gromacs/gromacs.git'
    maintainers = ['junghans', 'marvinbernhardt']

    version('develop', branch='master')
    version('master', branch='master')
    version('2020.1', sha256='e1666558831a3951c02b81000842223698016922806a8ce152e8f616e29899cf')
    version('2020', sha256='477e56142b3dcd9cb61b8f67b24a55760b04d1655e8684f979a75a5eec40ba01')
    version('2019.6', sha256='bebe396dc0db11a9d4cc205abc13b50d88225617642508168a2195324f06a358')
    version('2019.5', sha256='438061a4a2d45bbb5cf5c3aadd6c6df32d2d77ce8c715f1c8ffe56156994083a')
    version('2019.4', sha256='ba4366eedfc8a1dbf6bddcef190be8cd75de53691133f305a7f9c296e5ca1867')
    version('2019.3', sha256='4211a598bf3b7aca2b14ad991448947da9032566f13239b1a05a2d4824357573')
323  var/spack/repos/builtin/packages/gsl/gsl-2.3-cblas.patch  Normal file
@@ -0,0 +1,323 @@
 Makefile.am               |  8 +-
 ax_cblas.m4               | 69 +++++
 bspline/Makefile.am       |  2 +-
 configure.ac              | 10 +
 eigen/Makefile.am         |  2 +-
 gsl-config.in             |  4 +-
 gsl.pc.in                 |  2 +-
 interpolation/Makefile.am |  2 +-
 linalg/Makefile.am        |  2 +-
 multifit/Makefile.am      |  4 +-
 multimin/Makefile.am      |  4 +-
 multiroots/Makefile.am    |  2 +-
 ode-initval/Makefile.am   |  2 +-
 poly/Makefile.am          |  2 +-
 specfunc/Makefile.am      |  2 +-
 wavelet/Makefile.am       |  2 +-
 31 files changed, 1157 insertions(+), 19 deletions(-)
diff --git a/Makefile.am b/Makefile.am
index c522001..4513bc8 100644
--- a/Makefile.am
+++ b/Makefile.am
@@ -19,7 +19,7 @@ EXTRA_DIST = autogen.sh gsl-config.in gsl.pc.in configure.ac THANKS BUGS gsl.spe

 lib_LTLIBRARIES = libgsl.la
 libgsl_la_SOURCES = version.c
-libgsl_la_LIBADD = $(GSL_LIBADD) $(SUBLIBS)
+libgsl_la_LIBADD = $(GSL_LIBADD) $(SUBLIBS) @CBLAS_LINK_LIBS@
 libgsl_la_LDFLAGS = $(GSL_LDFLAGS) -version-info $(GSL_LT_VERSION)
 noinst_HEADERS = templates_on.h templates_off.h build.h

@@ -29,10 +29,10 @@ m4data_DATA = gsl.m4
 bin_PROGRAMS = gsl-randist gsl-histogram

 gsl_randist_SOURCES = gsl-randist.c
-gsl_randist_LDADD = libgsl.la cblas/libgslcblas.la
+gsl_randist_LDADD = libgsl.la

 gsl_histogram_SOURCES = gsl-histogram.c
-gsl_histogram_LDADD = libgsl.la cblas/libgslcblas.la
+gsl_histogram_LDADD = libgsl.la

 check_SCRIPTS = test_gsl_histogram.sh pkgconfig.test
 TESTS = test_gsl_histogram.sh pkgconfig.test
@@ -51,6 +51,8 @@ edit = $(SED) \
  -e 's|@GSL_CFLAGS[@]|$(GSL_CFLAGS)|g' \
  -e 's|@GSL_LIBM[@]|$(GSL_LIBM)|g' \
  -e 's|@GSL_LIBS[@]|$(GSL_LIBS)|g' \
+ -e 's|@CBLAS_CFLAGS[@]|$(CBLAS_CFLAGS)|g' \
+ -e 's|@CBLAS_LIBS[@]|$(CBLAS_LIBS)|g' \
  -e 's|@LIBS[@]|$(LIBS)|g' \
  -e 's|@VERSION[@]|$(VERSION)|g'

diff --git a/ax_cblas.m4 b/ax_cblas.m4
new file mode 100644
index 0000000..6ef143a
--- /dev/null
+++ b/ax_cblas.m4
@@ -0,0 +1,69 @@
+AC_DEFUN([AX_CBLAS],[
+
+  ext_cblas=no
+  ext_cblas_libs="-lcblas"
+  ext_cblas_cflags=""
+
+  AC_ARG_WITH(cblas-external,
+    [AS_HELP_STRING([--with-cblas-external],
+      [Use external CBLAS library (default is no)])],
+    [with_ext_cblas=$withval],
+    [with_ext_cblas=no])
+
+  case $with_ext_cblas in
+    no) ext_cblas=no ;;
+    yes) ext_cblas=yes ;;
+    -* | */* | *.a | *.so | *.so.* | *.o)
+      ext_cblas=yes
+      ext_cblas_libs="$with_cblas" ;;
+    *) ext_cblas=yes
+      ext_cblas_libs="-l$with_cblas" ;;
+  esac
+
+  AC_ARG_WITH(cblas-external-libs,
+    [AS_HELP_STRING([--with-cblas-external-libs=<libs>],
+      [External cblas libraries to link with (default is "$ext_cblas_libs")])],
+    [ext_cblas_libs=$withval],
+    [])
+
+  AC_ARG_WITH(cblas-external-cflags,
+    [AS_HELP_STRING([--with-cblas-external-cflags=<flags>],
+      [Pre-processing flags to compile with external cblas ("-I<dir>")])],
+    [ext_cblas_cflags=$withval],
+    [])
+
+  if test x$ext_cblas != xno; then
+    if test "x$CBLAS_LIBS" = x; then
+      CBLAS_LIBS="$ext_cblas_libs"
+    fi
+    if test "x$CBLAS_CFLAGS" = x; then
+      CBLAS_CFLAGS="$ext_cblas_cflags"
+    fi
+
+    CFLAGS_sav="$CFLAGS"
+    CFLAGS="$CFLAGS $CBLAS_CFLAGS"
+    AC_CHECK_HEADER(cblas.h, ,
+      [AC_MSG_ERROR([
+      *** Header file cblas.h not found.
+      *** If you installed cblas header in a non standard place,
+      *** specify its install prefix using the following option
+      *** --with-cblas-external-cflags="-I<include_dir>"])
+      ])
+    CFLAGS="$CFLAGS_sav"
+
+    LIBS_sav="$LIBS"
+    LIBS="$LIBS $CBLAS_LIBS -lm"
+    AC_MSG_CHECKING([for cblas_sgemm in $CBLAS_LIBS])
+    AC_TRY_LINK_FUNC(cblas_sgemm, [ext_cblas=yes],
+      [AC_MSG_ERROR([
+      *** Linking with cblas with $LIBS failed.
+      *** If you installed cblas library in a non standard place,
+      *** specify its install prefix using the following option
+      *** --with-cblas-external-libs="-L<lib_dir> -l<lib>"])
+      ])
+    AC_MSG_RESULT($ext_cblas)
+    LIBS="$LIBS_sav"
+    AC_SUBST([CBLAS_CFLAGS])
+    AC_SUBST([CBLAS_LIBS])
+  fi
+])
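The `case` statement in the `AX_CBLAS` macro above decides, from the value passed to `--with-cblas-external`, both whether an external CBLAS is used and which link flags to emit: `yes` falls back to `-lcblas`, a value that looks like flags or a file path is used verbatim, and a bare name is turned into `-l<name>`. A Python rendering of that dispatch (an illustrative translation of the shell patterns, not part of the patch):

```python
def parse_with_cblas(withval):
    """Return (use_external, link_libs) following the m4 case statement.

    Mirrors the patterns `no | yes | -* | */* | *.a | *.so | *.so.* | *.o | *`
    from ax_cblas.m4; the function itself is a sketch for illustration.
    """
    if withval in (None, 'no'):
        return False, None
    if withval == 'yes':
        return True, '-lcblas'          # default external library name
    if (withval.startswith('-') or '/' in withval or
            withval.endswith(('.a', '.o')) or '.so' in withval):
        return True, withval            # explicit flags or a library file
    return True, '-l' + withval         # bare library name -> -l<name>


print(parse_with_cblas('yes'))       # (True, '-lcblas')
print(parse_with_cblas('mycblas'))   # (True, '-lmycblas')
```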
diff --git a/bspline/Makefile.am b/bspline/Makefile.am
index 3f4f950..d413036 100644
--- a/bspline/Makefile.am
+++ b/bspline/Makefile.am
@@ -12,6 +12,6 @@ check_PROGRAMS = test

 TESTS = $(check_PROGRAMS)

-test_LDADD = libgslbspline.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../cblas/libgslcblas.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la ../statistics/libgslstatistics.la
+test_LDADD = libgslbspline.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la @CBLAS_LINK_LIBS@ ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la ../statistics/libgslstatistics.la

 test_SOURCES = test.c
diff --git a/configure.ac b/configure.ac
index a26fc1e..564d426 100644
--- a/configure.ac
+++ b/configure.ac
@@ -208,6 +208,16 @@ if test "x$LIBS" = "x" ; then
 AC_CHECK_LIB(m, cos)
 fi

+sinclude(ax_cblas.m4)
+AX_CBLAS
+if test "x$CBLAS_LIBS" != "x"; then
+  CBLAS_LINK_LIBS="$CBLAS_LIBS"
+else
+  CBLAS_LINK_LIBS="\$(top_builddir)/cblas/libgslcblas.la"
+  CBLAS_LIBS="-lgslcblas"
+fi
+AC_SUBST(CBLAS_LINK_LIBS)
+
 dnl Remember to put a definition in acconfig.h for each of these
 AC_CHECK_DECLS(feenableexcept,,,[#define _GNU_SOURCE 1
 #include <fenv.h>])
diff --git a/eigen/Makefile.am b/eigen/Makefile.am
index c28bfde..14197a4 100644
--- a/eigen/Makefile.am
+++ b/eigen/Makefile.am
@@ -11,7 +11,7 @@ noinst_HEADERS = qrstep.c

 TESTS = $(check_PROGRAMS)

-test_LDADD = libgsleigen.la ../test/libgsltest.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../sys/libgslsys.la ../err/libgslerr.la ../utils/libutils.la ../rng/libgslrng.la ../sort/libgslsort.la
+test_LDADD = libgsleigen.la ../test/libgsltest.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../sys/libgslsys.la ../err/libgslerr.la ../utils/libutils.la ../rng/libgslrng.la ../sort/libgslsort.la

 test_SOURCES = test.c

diff --git a/gsl-config.in b/gsl-config.in
old mode 100755
new mode 100644
index 3f3fa61..c9c4262
--- a/gsl-config.in
+++ b/gsl-config.in
@@ -58,11 +58,11 @@ while test $# -gt 0; do
 ;;

 --cflags)
- echo @GSL_CFLAGS@
+ echo @GSL_CFLAGS@ @CBLAS_CFLAGS@
 ;;

 --libs)
- : ${GSL_CBLAS_LIB=-lgslcblas}
+ : ${GSL_CBLAS_LIB=@CBLAS_LIBS@}
 echo @GSL_LIBS@ $GSL_CBLAS_LIB @GSL_LIBM@
 ;;

diff --git a/gsl.pc.in b/gsl.pc.in
index 5e9ef21..5a7a0f3 100644
--- a/gsl.pc.in
+++ b/gsl.pc.in
@@ -2,7 +2,7 @@ prefix=@prefix@
 exec_prefix=@exec_prefix@
 libdir=@libdir@
 includedir=@includedir@
-GSL_CBLAS_LIB=-lgslcblas
+GSL_CBLAS_LIB=@CBLAS_LIBS@

 Name: GSL
 Description: GNU Scientific Library
diff --git a/interpolation/Makefile.am b/interpolation/Makefile.am
index 1d80755..e45bd51 100644
--- a/interpolation/Makefile.am
+++ b/interpolation/Makefile.am
@@ -12,7 +12,7 @@ AM_CPPFLAGS = -I$(top_srcdir)

 TESTS = $(check_PROGRAMS)

-test_LDADD = libgslinterpolation.la ../poly/libgslpoly.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../cblas/libgslcblas.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la
+test_LDADD = libgslinterpolation.la ../poly/libgslpoly.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la @CBLAS_LINK_LIBS@ ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la

 test_SOURCES = test.c

diff --git a/linalg/Makefile.am b/linalg/Makefile.am
index a6c15b0..447ebbe 100644
--- a/linalg/Makefile.am
+++ b/linalg/Makefile.am
@@ -13,4 +13,4 @@ TESTS = $(check_PROGRAMS)
 check_PROGRAMS = test

 test_SOURCES = test.c
-test_LDADD = libgsllinalg.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../permutation/libgslpermutation.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la ../rng/libgslrng.la
+test_LDADD = libgsllinalg.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../permutation/libgslpermutation.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la ../rng/libgslrng.la
diff --git a/multifit/Makefile.am b/multifit/Makefile.am
index 988614e..793b485 100644
--- a/multifit/Makefile.am
+++ b/multifit/Makefile.am
@@ -67,8 +67,8 @@ check_PROGRAMS = test #demo
 TESTS = $(check_PROGRAMS)

 test_SOURCES = test.c
-test_LDADD = libgslmultifit.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../matrix/libgslmatrix.la ../sort/libgslsort.la ../statistics/libgslstatistics.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../utils/libutils.la ../sys/libgslsys.la ../rng/libgslrng.la ../specfunc/libgslspecfunc.la ../min/libgslmin.la
+test_LDADD = libgslmultifit.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../matrix/libgslmatrix.la ../sort/libgslsort.la ../statistics/libgslstatistics.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../utils/libutils.la ../sys/libgslsys.la ../rng/libgslrng.la ../specfunc/libgslspecfunc.la ../min/libgslmin.la

 #demo_SOURCES = demo.c
-#demo_LDADD = libgslmultifit.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../randist/libgslrandist.la ../rng/libgslrng.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../utils/libutils.la ../sys/libgslsys.la
+#demo_LDADD = libgslmultifit.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../randist/libgslrandist.la ../rng/libgslrng.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../utils/libutils.la ../sys/libgslsys.la

diff --git a/multimin/Makefile.am b/multimin/Makefile.am
index 7071359..65a488a 100644
--- a/multimin/Makefile.am
+++ b/multimin/Makefile.am
@@ -13,8 +13,8 @@ check_PROGRAMS = test #demo
 TESTS = $(check_PROGRAMS)

 test_SOURCES = test.c test_funcs.c test_funcs.h
-test_LDADD = libgslmultimin.la ../min/libgslmin.la ../poly/libgslpoly.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la
+test_LDADD = libgslmultimin.la ../min/libgslmin.la ../poly/libgslpoly.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../linalg/libgsllinalg.la ../permutation/libgslpermutation.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la

 #demo_SOURCES = demo.c
-#demo_LDADD = libgslmultimin.la ../min/libgslmin.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../linalg/libgsllinalg.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la
+#demo_LDADD = libgslmultimin.la ../min/libgslmin.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../linalg/libgsllinalg.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la

diff --git a/multiroots/Makefile.am b/multiroots/Makefile.am
index a351c3f..6178448 100644
--- a/multiroots/Makefile.am
+++ b/multiroots/Makefile.am
@@ -15,5 +15,5 @@ check_PROGRAMS = test
 TESTS = $(check_PROGRAMS)

 test_SOURCES = test.c test_funcs.c test_funcs.h
-test_LDADD = libgslmultiroots.la ../linalg/libgsllinalg.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../permutation/libgslpermutation.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la
+test_LDADD = libgslmultiroots.la ../linalg/libgsllinalg.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../permutation/libgslpermutation.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la

diff --git a/ode-initval/Makefile.am b/ode-initval/Makefile.am
index 9c774b5..346c381 100644
--- a/ode-initval/Makefile.am
+++ b/ode-initval/Makefile.am
@@ -12,7 +12,7 @@ check_PROGRAMS = test

 TESTS = $(check_PROGRAMS)

-test_LDADD = libgslodeiv.la ../linalg/libgsllinalg.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../matrix/libgslmatrix.la ../permutation/libgslpermutation.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la
+test_LDADD = libgslodeiv.la ../linalg/libgsllinalg.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../matrix/libgslmatrix.la ../permutation/libgslpermutation.la ../vector/libgslvector.la ../block/libgslblock.la ../complex/libgslcomplex.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la

 test_SOURCES = test.c

diff --git a/poly/Makefile.am b/poly/Makefile.am
index f1dae5d..e0f8e83 100644
--- a/poly/Makefile.am
+++ b/poly/Makefile.am
@@ -10,7 +10,7 @@ noinst_HEADERS = balance.c companion.c qr.c

 TESTS = $(check_PROGRAMS)

-check_PROGRAMS = test
+#check_PROGRAMS = test

 test_SOURCES = test.c
 test_LDADD = libgslpoly.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la ../sort/libgslsort.la
diff --git a/specfunc/Makefile.am b/specfunc/Makefile.am
index eba9ab2..772cc7e 100644
--- a/specfunc/Makefile.am
+++ b/specfunc/Makefile.am
@@ -12,7 +12,7 @@ TESTS = $(check_PROGRAMS)

 check_PROGRAMS = test

-test_LDADD = libgslspecfunc.la ../eigen/libgsleigen.la ../linalg/libgsllinalg.la ../sort/libgslsort.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../block/libgslblock.la ../complex/libgslcomplex.la ../poly/libgslpoly.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la
+test_LDADD = libgslspecfunc.la ../eigen/libgsleigen.la ../linalg/libgsllinalg.la ../sort/libgslsort.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../block/libgslblock.la ../complex/libgslcomplex.la ../poly/libgslpoly.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la

 test_SOURCES = test_sf.c test_sf.h test_airy.c test_bessel.c test_coulomb.c test_dilog.c test_gamma.c test_hyperg.c test_legendre.c test_mathieu.c

diff --git a/wavelet/Makefile.am b/wavelet/Makefile.am
index 9da20d8..8cdbd77 100644
--- a/wavelet/Makefile.am
+++ b/wavelet/Makefile.am
@@ -10,7 +10,7 @@ check_PROGRAMS = test

 TESTS = $(check_PROGRAMS)

-test_LDADD = libgslwavelet.la ../blas/libgslblas.la ../cblas/libgslcblas.la ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la
+test_LDADD = libgslwavelet.la ../blas/libgslblas.la @CBLAS_LINK_LIBS@ ../matrix/libgslmatrix.la ../vector/libgslvector.la ../block/libgslblock.la ../ieee-utils/libgslieeeutils.la ../err/libgslerr.la ../test/libgsltest.la ../sys/libgslsys.la ../utils/libutils.la

 test_SOURCES = test.c
@@ -24,3 +24,31 @@ class Gsl(AutotoolsPackage, GNUMirrorPackage):
    version('2.1', sha256='59ad06837397617f698975c494fe7b2b698739a59e2fcf830b776428938a0c66')
    version('2.0', sha256='e361f0b19199b5e6c21922e9f16adf7eca8dd860842802424906d0f83485ca2d')
    version('1.16', sha256='73bc2f51b90d2a780e6d266d43e487b3dbd78945dd0b04b14ca5980fe28d2f53')

    variant('external-cblas', default=False, description='Build against external blas')

    # from https://dev.gentoo.org/~mgorny/dist/gsl-2.3-cblas.patch.bz2
    patch('gsl-2.3-cblas.patch', when="+external-cblas")

    conflicts('+external-cblas', when="@:2.2.9999")
    depends_on('m4', type='build', when='+external-cblas')
    depends_on('autoconf', type='build', when='+external-cblas')
    depends_on('automake', type='build', when='+external-cblas')
    depends_on('libtool', type='build', when='+external-cblas')
    depends_on('blas', when='+external-cblas')

    @property
    def force_autoreconf(self):
        # The external cblas patch touches configure
        return self.spec.satisfies('+external-cblas')

    def configure_args(self):
        configure_args = []
        if self.spec.satisfies('+external-cblas'):
            configure_args.append('--with-external-cblas')
            configure_args.append('CBLAS_CFLAGS=%s'
                                  % self.spec['blas'].headers.include_flags)
            configure_args.append('CBLAS_LIBS=%s'
                                  % self.spec['blas'].libs.ld_flags)

        return configure_args
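The `configure_args` above wires the concretized BLAS into the patched build by passing its header flags and link flags as the `CBLAS_CFLAGS` / `CBLAS_LIBS` variables that `ax_cblas.m4` consumes. A standalone sketch of that assembly (the `include_flags` and `ld_flags` parameters stand in for Spack's `spec['blas'].headers.include_flags` and `spec['blas'].libs.ld_flags`; the example values below are made up):

```python
def gsl_configure_args(external_cblas, include_flags='', ld_flags=''):
    """Build the configure argument list for the +external-cblas case."""
    args = []
    if external_cblas:
        args.append('--with-external-cblas')
        # These two variables are picked up by the AX_CBLAS macro.
        args.append('CBLAS_CFLAGS=%s' % include_flags)
        args.append('CBLAS_LIBS=%s' % ld_flags)
    return args


print(gsl_configure_args(True,
                         '-I/opt/openblas/include',
                         '-L/opt/openblas/lib -lopenblas'))
print(gsl_configure_args(False))  # []
```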
31  var/spack/repos/builtin/packages/gunrock/package.py  Normal file
@@ -0,0 +1,31 @@
|
||||
# Copyright 2013-2020 Lawrence Livermore National Security, LLC and other
|
||||
# Spack Project Developers. See the top-level COPYRIGHT file for details.
|
||||
#
|
||||
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
|
||||
|
||||
from spack import *
|
||||
|
||||
|
||||
class Gunrock(CMakePackage, CudaPackage):
|
||||
"""High-Performance Graph Primitives on GPUs"""
|
||||
|
||||
homepage = "https://gunrock.github.io/docs/"
|
||||
git = "https://github.com/gunrock/gunrock.git"
|
||||
|
||||
version('master', submodules=True)
|
||||
version('1.1', submodules=True, tag='v1.1')
|
||||
version('1.0', submodules=True, tag='v1.0')
|
||||
version('0.5.1', submodules=True, tag='v0.5.1')
|
||||
version('0.5', submodules=True, tag='v0.5')
|
||||
version('0.4', submodules=True, tag='v0.4')
|
||||
version('0.3.1', submodules=True, tag='v0.3.1')
|
||||
version('0.3', submodules=True, tag='v0.3')
|
||||
version('0.2', submodules=True, tag='v0.2')
|
||||
version('0.1', submodules=True, tag='v0.1')
|
||||
|
||||
depends_on('cuda')
|
||||
|
||||
def install(self, spec, prefix):
|
||||
with working_dir(self.build_directory):
|
||||
install_tree('bin', prefix.bin)
|
||||
install_tree('lib', prefix.lib)
|
||||
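The `install` override above copies the built `bin` and `lib` trees into the install prefix. A rough standalone equivalent of that copy step using only the standard library (the function name and paths are illustrative; Spack's `install_tree` additionally handles permissions and symlinks):

```python
import os
import shutil

def install_trees(build_dir, prefix):
    # Copy the built bin/ and lib/ directories into the install prefix,
    # roughly what the Spack package's install() step does above.
    for sub in ('bin', 'lib'):
        src = os.path.join(build_dir, sub)
        dst = os.path.join(prefix, sub)
        shutil.copytree(src, dst)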
@@ -19,6 +19,7 @@ class Hpctoolkit(AutotoolsPackage):
    maintainers = ['mwkrentel']

    version('master', branch='master')
    version('2020.03.01', commit='94ede4e6fa1e05e6f080be8dc388240ea027f769')
    version('2019.12.28', commit='b4e1877ff96069fd8ed0fdf0e36283a5b4b62240')
    version('2019.08.14', commit='6ea44ed3f93ede2d0a48937f288a2d41188a277c')
    version('2018.12.28', commit='8dbf0d543171ffa9885344f32f23cc6f7f6e39bc')
@@ -52,7 +53,7 @@ class Hpctoolkit(AutotoolsPackage):
            'for the compute nodes.')

    variant('cuda', default=False,
            description='Support CUDA on NVIDIA GPUs (master branch).')
            description='Support CUDA on NVIDIA GPUs (2020.03.01 or later).')

    boost_libs = (
        '+atomic +chrono +date_time +filesystem +system +thread +timer'
@@ -61,18 +62,18 @@ class Hpctoolkit(AutotoolsPackage):

    depends_on('binutils+libiberty~nls', type='link')
    depends_on('boost' + boost_libs)
    depends_on('bzip2', type='link')
    depends_on('dyninst')
    depends_on('elfutils~nls', type='link')
    depends_on('intel-tbb')
    depends_on('bzip2+shared', type='link')
    depends_on('dyninst@9.3.2:')
    depends_on('elfutils+bzip2+xz~nls', type='link')
    depends_on('intel-tbb+shared')
    depends_on('libdwarf')
    depends_on('libmonitor+hpctoolkit')
    depends_on('libmonitor+bgq', when='+bgq')
    depends_on('libunwind@1.4:')
    depends_on('libunwind@1.4: +xz')
    depends_on('mbedtls+pic')
    depends_on('xerces-c transcoder=iconv')
    depends_on('xz', type='link')
    depends_on('zlib')
    depends_on('zlib+shared')

    depends_on('cuda', when='+cuda')
    depends_on('intel-xed', when='target=x86_64:')

@@ -83,11 +84,14 @@ class Hpctoolkit(AutotoolsPackage):
    conflicts('%gcc@:4.7.99', when='^dyninst@10.0.0:',
              msg='hpctoolkit requires gnu gcc 4.8.x or later')

    conflicts('%gcc@:4.99.99', when='@master',
              msg='the master branch requires gnu gcc 5.x or later')
    conflicts('%gcc@:4.99.99', when='@2020.03.01:',
              msg='hpctoolkit requires gnu gcc 5.x or later')

    conflicts('+cuda', when='@2018.0.0:2019.99.99',
              msg='cuda is only available on the master branch')
              msg='cuda requires 2020.03.01 or later')

    conflicts('+bgq', when='@2020.03.01:',
              msg='blue gene requires 2019.12.28 or earlier')

    flag_handler = AutotoolsPackage.build_system_flags
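The `conflicts` directives above fence the `+cuda` variant into a version range written as `@2018.0.0:2019.99.99`. A toy sketch of that inclusive-range check (an illustration only, not Spack's actual `Version`/`VersionRange` API):

```python
# Toy version-range check, loosely mimicking Spack's '@low:high' notation.
# Versions are compared as tuples of integers; real Spack versions also
# handle letters, dev suffixes, and open-ended ranges more carefully.
def parse(v):
    return tuple(int(x) for x in v.split('.'))

def in_range(version, low=None, high=None):
    v = parse(version)
    if low is not None and v < parse(low):
        return False
    if high is not None and v > parse(high):
        return False
    return True

# '+cuda' conflicts with releases in @2018.0.0:2019.99.99
print(in_range('2019.12.28', '2018.0.0', '2019.99.99'))  # True  -> conflict
print(in_range('2020.03.01', '2018.0.0', '2019.99.99'))  # False -> allowed
```

This is why the diff rewords the message from "only available on the master branch" to "requires 2020.03.01 or later": the first tagged release inside CUDA support falls just outside the conflict range.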
@@ -37,6 +37,9 @@ class Hpcviewer(Package):
    maintainers = ['mwkrentel']

    viewer_sha = {
        ('2020.02', 'x86_64'):  'af1f514547a9325aee30eb891b31e38c7ea3f33d2d1978b44f83e7daa3d5de6b',
        ('2020.02', 'ppc64'):   '7bb4926202db663aedd5a6830778c5f73f6b08a65d56861824ea95ba83b1f59c',
        ('2020.02', 'ppc64le'): 'cfcebb7ba301affd6d21d2afd43c540e6dd4c5bc39b0d20e8bd1e4fed6aa3481',
        ('2020.01', 'x86_64'):  '3cd5a2a382cec1d64c8bd0abaf2b1461dcd4092a4b4074ddbdc1b96d2a0b4220',
        ('2020.01', 'ppc64'):   '814394a5f410033cc1019526c268ef98b5b381e311fcd39ae8b2bde6c6ff017c',
        ('2020.01', 'ppc64le'): 'e830e956b8088c415fb25ef44a8aca16ebcb27bcd34536866612343217e3f9e4',

@@ -61,6 +64,9 @@ class Hpcviewer(Package):
    }

    trace_sha = {
        ('2020.02', 'x86_64'):  'b7b634e91108aa50a2e8647ac6bac87df775ae38aff078545efaa84735e0a666',
        ('2020.02', 'ppc64'):   'a3e845901689e1b32bc6ab2826c6ac6ed352df4839090fa530b20f747e6e0957',
        ('2020.02', 'ppc64le'): 'a64a283f61e706d988952a7cede9fac0328b09d2d0b64e4c08acc54e38781c98',
        ('2020.01', 'x86_64'):  '9459177a2445e85d648384e2ccee20524592e91a74d615262f32d0876831cd7c',
        ('2020.01', 'ppc64'):   '02366a2ba30b9b2450d50cf44933288f04fae5bf9868eef7bb2ae1b49d4f454e',
        ('2020.01', 'ppc64le'): '39970e84e397ed96bc994e7b8db3b7b3aab4e3155fa7ca8e68b9274bb58115f0',
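The hpcviewer tables above key each checksum by a `(release, architecture)` tuple. A small sketch of selecting the right checksum from such a table (the `sha_for` helper is hypothetical, added for illustration; the two sample entries are copied from the diff above):

```python
# Two sample entries copied from the viewer_sha table in the diff above.
viewer_sha = {
    ('2020.02', 'x86_64'):  'af1f514547a9325aee30eb891b31e38c7ea3f33d2d1978b44f83e7daa3d5de6b',
    ('2020.02', 'ppc64le'): 'cfcebb7ba301affd6d21d2afd43c540e6dd4c5bc39b0d20e8bd1e4fed6aa3481',
}

def sha_for(version, arch, table=viewer_sha):
    # Look up the checksum for one (release, architecture) pair; an
    # unsupported combination fails loudly rather than downloading blindly.
    try:
        return table[(version, arch)]
    except KeyError:
        raise ValueError('no checksum for %s on %s' % (version, arch))

print(sha_for('2020.02', 'x86_64')[:8])  # af1f5145
```

Keying on the tuple keeps one flat table per artifact (`viewer_sha`, `trace_sha`) instead of nesting dicts per architecture.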
Some files were not shown because too many files have changed in this diff.