Separate spack instances installing to separate install trees can fight
over the same stage directory, because we do not currently make stage
paths unique per instance.
- [x] add a new `$instance` substitution that gives an 8-digit hash
unique to the spack instance
- [x] make the default stage directory use `$instance`
- [x] rework `spack.util.path.substitute_config_variables()` so that
expensive operations like hashing are done lazily, not at module
load time.
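For illustration, the new substitution could appear in a stage path in `config.yaml` roughly like this (the exact default path is an assumption):
```yaml
config:
  build_stage:
    - $tempdir/$user/spack-stage/$instance
```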
* fix issue #22228 build of gdk-pixbuf
* Update var/spack/repos/builtin/packages/gdk-pixbuf/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This package used to be a part of rpm, but is now being developed separately.
It will supposedly be moved to a sourceware branch (it is maintained by
redhat) but I do not know if this will happen soon. We need it in order
to change locations in binaries that are built in /tmp and then moved
elsewhere. I will ping @woodard, who might be able to tell us whether we
should include this development repository or wait for it to be moved
elsewhere. Once this is merged, we will want to use the bootstrap
approach to install and use the library from spack.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
This change accounts for platform specific configuration scopes,
like ~/.spack/linux, during bootstrapping. These scopes were
previously not accounted for and that was causing issues e.g.
when searching for compilers.
* Replace URL computation in base IntelOneApiPackage class with
defining URLs in component packages (this is expected to be
simpler for now)
* Add component_dir property that all oneAPI component packages must
define. This property names a directory that should exist after
installation completes (useful for making sure the install was
successful) and also defines the search location for the
component's environment update script.
* Add needed dependencies for components (e.g. intel-oneapi-dnn
requires intel-oneapi-tbb). The compilers provided by
intel-oneapi-compilers need some components under certain
circumstances (e.g. when enabling SYCL support) but these were
omitted since the libraries should only be linked when a
dependent package requests that feature
* Remove individual setup_run_environment implementations and use
IntelOneApiPackage superclass method which sources vars.sh
(located in a subdirectory of component_dir)
* Add documentation for IntelOneApiPackage build system
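As a sketch of the `component_dir` property described above (the package and directory names are illustrative assumptions, not an exact recipe):
```python
# in a hypothetical component's package.py
class IntelOneapiDnn(IntelOneApiPackage):
    """Hypothetical oneAPI component package."""

    @property
    def component_dir(self):
        # directory expected to exist under the prefix after installation;
        # the environment update script (vars.sh) is searched for beneath it
        return 'dnnl'
```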
Co-authored-by: Vasily Danilin <vasily.danilin@yandex.ru>
This is to help debug situations like #22383, where python3.4 is
accidentally preferred over python2. It will also help on systems where
no python2 is available, or where some other issue arises.
* QA: reduce number of unit tests for jobs not in the matrix
* Fixup for CentOS6 dependencies
* Put correct conditions back in place
* Add dependency on changes
* Change default FFT implementation to FFTW
To account for the default changing with casacore v3.4.0, as well as the
CMake logic for getting the FFTPack implementation.
* Switch to using spec.satisfies() for Python CMake values
This PR updates the URLs to drop www (not needed) and changes the
repository user from sylabs to hpcng (technically GitHub maintains the
old links, but this might not be forever). It also adds the newly
released 3.7.1 and 3.7.2 versions of Singularity.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Beginning with version 2.4.1, the python interpreter line changed from
"#!/usr/bin/env python" to "#!/usr/bin/env python3"
That caused the bowtie2-build and bowtie2-inspect scripts to have a
trailing '3' at the end of the interpreter line. This PR fixes that. I
also observed that older versions do not build with intel-oneapi-tbb,
so I added a conflicts statement for that.
PRs that change only package recipes will only run tests under "package_sanity.py" and without coverage. This should result in a huge drop in the CPU time spent in CI for most PRs.
* updated deps to get gtkplus to build
* gtk-doc requires docbook-xml 4.3
* patch gtk-doc build to find xml catalogs
* add new version, fix macOS build error
* reorder docbook versions from newest to oldest
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Added 2 new configure patch files to build WRF 3.9.1.1 and 4.2
with aocc@3.0
* Renamed patch files used for building WRF 3.9.1.1 and 4.2 with
aocc@2.3 (mostly, this also removes -march=native from AOCCOPT
and updates LIBMVEC options for aocc@2.3)
* unit tests: mark slow tests as "maybeslow"
This commit also removes the "network" marker and
marks every "network" test as "maybeslow". Tests
marked as db are maintained, but they're not slow
anymore.
* GA: require style tests to pass before running unit-tests
* GA: make MacOS unit tests fail fast
* GA: move all unit tests into the same workflow, run style tests as a prerequisite
All the unit tests have been moved into the same workflow so that a single
run of the dorny/paths-filter action can be used to ask for coverage based
on the files that have been changed in a PR. The basic idea is that
coverage is not necessary for PRs that only change packages, which
results in faster execution of the tests.
Also, for package only PRs slow unit tests are skipped.
Finally, MacOS and linux unit tests are now conditional on style tests passing
meaning that e.g. we won't waste a MacOS worker if we know that the PR has
flake8 issues.
* Addressed review comments
* Skipping slow tests on MacOS for package only recipes
* QA: make tests on changes correct before merging
In most cases, we want condition_holds(ID) to imply any imposed
constraints associated with the ID. However, the dependency relationship
in Spack is special because it's "extra" conditional -- a dependency
*condition* may hold, but we have decided that externals will not have
dependencies, so we need a way to avoid having imposed constraints appear
for nodes that don't exist.
This introduces a new rule that says that constraints are imposed
*unless* we define `do_not_impose(ID)`. This allows rules like
dependencies, which rely on more than just spec conditions, to cancel
imposed constraints.
We add one special case for this: dependencies of externals.
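A sketch of what such a rule might look like (the exact encoding in `concretize.lp` may differ):
```prolog
% impose constraints only when no do_not_impose(ID) fact is derived
attr(Name, A1) :- condition_holds(ID), not do_not_impose(ID), imposed_constraint(ID, Name, A1).
```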
We only consider test dependencies some of the time. Some packages are
*only* test dependencies. Spack's algorithm was previously generating
dependency conditions that could hold, *even* if there was no potential
dependency type.
- [x] change asp.py so that this can't happen -- we now only generate
dependency types for possible dependencies.
This builds on #20638 by unifying all the places in the concretizer where
things are conditional on specs. Previously, we duplicated a common spec
conditional pattern for dependencies, virtual providers, conflicts, and
externals. That was introduced in #20423 and refined in #20507, and
roughly looked as follows.
Given some directives in a package like:
```python
depends_on("foo@1.0+bar", when="@2.0+variant")
provides("mpi@2:", when="@1.9:")
```
We handled the `@2.0+variant` and `@1.9:` parts by generating
`dependency_condition()`, `required_dependency_condition()`, and
`imposed_dependency_condition()` facts to trigger rules like this:
```prolog
dependency_conditions_hold(ID, Parent, Dependency) :-
  attr(Name, Arg1)             : required_dependency_condition(ID, Name, Arg1);
  attr(Name, Arg1, Arg2)       : required_dependency_condition(ID, Name, Arg1, Arg2);
  attr(Name, Arg1, Arg2, Arg3) : required_dependency_condition(ID, Name, Arg1, Arg2, Arg3);
  dependency_condition(ID, Parent, Dependency);
  node(Parent).
```
And we handled `foo@1.0+bar` and `mpi@2:` parts ("imposed constraints")
like this:
```prolog
attr(Name, Arg1, Arg2) :-
  dependency_conditions_hold(ID, Package, Dependency),
  imposed_dependency_condition(ID, Name, Arg1, Arg2).

attr(Name, Arg1, Arg2, Arg3) :-
  dependency_conditions_hold(ID, Package, Dependency),
  imposed_dependency_condition(ID, Name, Arg1, Arg2, Arg3).
```
These rules were repeated with different input predicates for
requirements (e.g., `required_dependency_condition`) and imposed
constraints (e.g., `imposed_dependency_condition`) throughout
`concretize.lp`. In #20638 it got to be a bit confusing, because we used
the same `dependency_condition_holds` predicate to impose constraints on
conditional dependencies and virtual providers. So, even though the
pattern was repeated, some of the conditional rules were conjoined in a
weird way.
Instead of repeating this pattern everywhere, we now have *one* set of
consolidated rules for conditions:
```prolog
condition_holds(ID) :-
  condition(ID);
  attr(Name, A1)         : condition_requirement(ID, Name, A1);
  attr(Name, A1, A2)     : condition_requirement(ID, Name, A1, A2);
  attr(Name, A1, A2, A3) : condition_requirement(ID, Name, A1, A2, A3).

attr(Name, A1)         :- condition_holds(ID), imposed_constraint(ID, Name, A1).
attr(Name, A1, A2)     :- condition_holds(ID), imposed_constraint(ID, Name, A1, A2).
attr(Name, A1, A2, A3) :- condition_holds(ID), imposed_constraint(ID, Name, A1, A2, A3).
```
This allows us to use `condition(ID)` and `condition_holds(ID)` to
encapsulate the conditional logic on specs in all the scenarios where we
need it. Instead of defining predicates for the requirements and imposed
constraints, we generate the condition inputs with generic facts, and
define predicates to associate the condition ID with a particular
scenario. So, now, the generated facts for a condition look like this:
```prolog
condition(121).
condition_requirement(121,"node","cairo").
condition_requirement(121,"variant_value","cairo","fc","True").
imposed_constraint(121,"version_satisfies","fontconfig","2.10.91:").
dependency_condition(121,"cairo","fontconfig").
dependency_type(121,"build").
dependency_type(121,"link").
```
The requirements and imposed constraints are generic, and we associate
them with their meaning via the id. Here, `dependency_condition(121,
"cairo", "fontconfig")` tells us that condition 121 has to do with the
dependency of `cairo` on `fontconfig`, and the conditional dependency
rules just become:
```prolog
dependency_holds(Package, Dependency, Type) :-
  dependency_condition(ID, Package, Dependency),
  dependency_type(ID, Type),
  condition_holds(ID).
```
Dependencies, virtuals, conflicts, and externals all now use similar
patterns, and the logic for generating condition facts is common to all
of them on the python side, as well. The more specific routines like
`package_dependencies_rules` just call `self.condition(...)` to get an id
and generate requirements and imposed constraints, then they generate
their extra facts with the returned id, like this:
```python
def package_dependencies_rules(self, pkg, tests):
    """Translate 'depends_on' directives into ASP logic."""
    for _, conditions in sorted(pkg.dependencies.items()):
        for cond, dep in sorted(conditions.items()):
            # create a condition and get its id
            condition_id = self.condition(cond, dep.spec, pkg.name)
            # associate specifics about the dependency with the id
            self.gen.fact(fn.dependency_condition(
                condition_id, pkg.name, dep.spec.name
            ))
            # etc.
```
- [x] unify generation and logic for conditions
- [x] use unified logic for dependencies
- [x] use unified logic for virtuals
- [x] use unified logic for conflicts
- [x] use unified logic for externals
* Rewrite relative dev_spec paths internally to absolute paths in case of relocation of the environment file
* Test relative paths for dev_path in environments
* Add a --keep-relative flag to spack env create
This ensures that relative paths of develop paths are not expanded to
absolute paths when initializing the environment in a different location
from the spack.yaml init file.
Currently, regardless of a spec being concrete or not, we validate its variants in `spec_clauses` (part of `SpackSolverSetup`).
This PR skips the check if the spec is concrete.
The reason we want to do this is so that the solver setup class (really, `spec_clauses`) can be used for cases when we just want the logic statements / facts (is that what they are called?) and we don't need to re-validate an already concrete spec. We can't change existing concrete specs, and we have to be able to handle them *even if they violate constraints in the current spack*. This happens in practice if we are doing the validation for a spec produced by a different spack install.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
As of OpenBLAS 0.3.13, leaving off `TARGET` by default optimizes most
code for the host system -- adding flags that cause the resulting
library to fail (SIGILL) on older systems. This change should ensure
that a "x86_64" target for example will work across deployment systems.
https://github.com/xianyi/OpenBLAS/issues/3139
This pull request adds the ability for a user to add a configuration argument on the fly, on the command line, e.g.:
```bash
$ spack -c config:install_tree:root:/path/to/config.yaml -c packages:all:compiler:[gcc] list --help
```
The above command doesn't do anything (I'm just getting help for list) but you can imagine having another root of packages, and updating it on the fly for a command (something I'd like to do in the near future!)
I've moved the logic for config_add that used to be in spack/cmd/config.py into spack/config.py proper, and now both main.py (where spack commands live) and spack/cmd/config.py use these functions. I only needed spack config add, so I didn't move the others. We can move the others if they are also needed in multiple places.
Was getting the following error:
```
$ spack test list
==> Error: issubclass() arg 1 must be a class
```
This PR adds a check in `has_test_method` (in case it is re-used elsewhere such as #22097) and ensures a class is passed to the method from `spack test list`.
Updated the versions for DiHydrogen and Aluminum. Added new constraints on versions of Aluminum that are used across the software stack. Cleaned up the dependency on DiHydrogen for LBANN.
* py-chainer: Add test method for ChainerMN (continued #21848, #21940)
* py-chainer: Fixed the word in the message
* py-chainer: Delete unnecessary imports
* py-chainer: Incorporation of the measures pointed out in #21940 was insufficient.
Adds several EpetraExt_BUILD_* options as well as an Amesos2_ENABLE_Basker option. Adds `none` as an option to `gotype=`, which should be among the options since 'none' is specifically handled later in the package definition.
Adds `stokhos` and `trilinoscouplings` as options in spack which already are available in CMake for Trilinos (e.g. Trilinos_ENABLE_Stokhos:BOOL=)
* py-pytest-html recipe
* added missing deps + copyright
* Update var/spack/repos/builtin/packages/py-pytest-html/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pytest-metadata/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Patch eospac's Makefile.-linux-gnu.hashes to consider only `$(notdir
$(F90))` when constructing a key to look up compiler flags in the
_F90-CPUINFO_COMP_FLAGS associative array.
This patch was accepted into eospac itself after the release of
6.4.2beta, so apply it only to 6.4.2beta and earlier releases.
- Fix faulty patch
- Only use GEARSHIFFT_BACKEND_FFTW_PTHREADS if ~openmp
- Explicitly disable float16 support
- Use correct minimum required Boost version
- Add variants for Intel MKL and ROCm rocfft
This is a workaround for an issue with how "spack install" is invoked from within "spack ci rebuild". The fact that we don't get an exception or even the actual returncode when using the object returned by spack.util.executable.which('spack') to install the target spec means we get no indication of failures about the install command itself. Instead we rely on the subsequent buildcache creation failure to fail the job.
In the past, we only had the binutils variant, which included the
bootstrapping flag. Now that we have a separate bootstrap variant, fix
the nvptx conflict accordingly.
* Add intel cluster package update2 for 2020
* add pacifica cli tools, and pager
* remove boilerplate code
* update flake8 lints
* update flake8 lint, missed one
* add a description for pager
* Shorten a line
* Remove whitespace
* check on dependencies and move urls to proper place
* Remove import package as it seems it is not required
* add requests to the uploader config
* remove blank Line
* change to build and run for packages
* add run and build to the packages
* move from url method to pypi method
* adjust requirements based on feedback from adamjstewart
* remove python 3 requirement, and add setuptools-scm
* remove dependence on python
Co-authored-by: Evan Felix <evan.felix@pnnl.gov>
Unlike the other commands of the `R CMD` interface, the `INSTALL` command
will read `Renviron` files. This can potentially break builds of r-
packages, depending on what is set in the `Renviron` file. This PR adds
the `--vanilla` flag to ensure that neither `Rprofile` nor `Renviron` files
are read during Spack builds of r- packages.
Cray added necessary functionality for CMake to support Fortran preprocessing using crayftn. This patch is necessary for the current release of CMake (3.19); the patch is expected to be in the 3.20 release of CMake. The included patch is from Kitware.
see https://gitlab.kitware.com/cmake/cmake/-/merge_requests/5882
Co-authored-by: James Elliott <jjellio@sandia.govv>
This adds a `--path` option to `spack python` that shows the `python`
interpreter that Spack is using.
e.g.:
```console
$ spack python --path
/Users/gamblin2/src/spack/var/spack/environments/default/.spack-env/view/bin/python
```
This is useful for debugging, and we can ask users to run it to
understand what python Spack is picking up via preferences in `bin/spack`
and via the `SPACK_PYTHON` environment variable introduced in #21222.
`spack test list` will show you which *installed* packages can be tested
but it won't show you which packages have tests.
- [x] add `spack test list --all` to show which packages have test methods
- [x] update `has_test_method()` to handle package instances *and*
package classes.
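A minimal sketch of how `has_test_method()` might handle both cases (the class-membership check is an assumption):
```python
import inspect

def has_test_method(pkg):
    """Accept a package class or a package instance (sketch)."""
    pkg_cls = pkg if inspect.isclass(pkg) else type(pkg)
    # assumption: a package "has tests" when `test` is defined on the
    # class itself rather than inherited from the PackageBase default
    return 'test' in pkg_cls.__dict__
```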
* adding package for libabigail, which we will likely need for an analysis!
* includes variant for documentation (doxygen and pysphinx are associated dependencies)
* Improve R package creation
This PR adds the `list_url` attribute to CRAN R packages when using
`spack create`. It also adds the `git` attribute to R Bioconductor
packages upon creation.
* Switch over to using cran/bioc attributes
The cran/bioc entries are set to have the '=' line up with homepage
entry, but homepage does not need to exist in the package file. If it
does not, that could affect the alignment.
* Do not have to split bioc
* Edit R package documentation
Explain Bioconductor packages and add `cran` and `bioc` attributes.
* Update lib/spack/docs/build_systems/rpackage.rst
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update lib/spack/docs/build_systems/rpackage.rst
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Simplify the cran attribute
The version can be faked so that the cran attribute is simply equal to
the CRAN package name.
* Edit the docs to reflect new `cran` attribute format
* Use the first element of self.versions() for url
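For example, with the simplified format a CRAN recipe might declare just (a sketch; the class, version, and checksum are illustrative):
```python
class RAbc(RPackage):
    """Hypothetical CRAN package using the simplified attribute."""

    cran = 'abc'  # the CRAN package name; urls are derived from it

    version('2.1', sha256='0000...')  # placeholder checksum
```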
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Remove casacore's old version of the file with a package patch()
function, and depend on a modern CMake for the build.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
1. Add version 2021.03.01.
2. Cleanup the binutils dependencies now that 2.35.2 and 2.36 are available.
3. Require gcc 7.x or later for current 2021 version.
4. Simplify the xz depends to always require +pic.
This works around a glitch in the original concretizer.
This allows users to use relative paths for mirrors and repos and other things that may be part of a Spack environment. There are two ways to do it.
1. Relative to the file
```yaml
spack:
repos:
- local_dir/my_repository
```
Which will refer to a repository like this in the directory where `spack.yaml` lives:
```
env/
spack.yaml <-- the config file above
local_dir/
my_repository/ <-- this repository
repo.yaml
packages/
```
2. Relative to the environment
```yaml
spack:
repos:
- $env/local_dir/my_repository
```
Both of these would refer to the same directory, but they differ for included files. For example, if you had this layout:
```
env/
spack.yaml
repository/
includes/
repos.yaml
repository/
```
And this `spack.yaml`:
```yaml
spack:
include: includes/repos.yaml
```
Then, these two `repos.yaml` files are functionally different:
```yaml
repos:
- $env/repository # refers to env/repository/ above
repos:
- repository # refers to env/includes/repository/ above
```
The $env variable will not be evaluated if there is no active environment. This generally means that it should not be used outside of an environment's spack.yaml file. However, if other aspects of your workflow guarantee that there is always an active environment, it may be used in other config scopes.
For opt-in packages in Spack, it's common that the `cuda` variant
is disabled by default.
This also simplifies downstream usage in multi-variants for
backends in user code.
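In recipes this is the familiar directive pattern (a sketch, not a change to any particular package):
```python
variant('cuda', default=False, description='Build with CUDA support')
```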
* Allow the bootstrapping of clingo from sources
Allow python builds with system python as external
for MacOS
* Ensure consistent configuration when bootstrapping clingo
This commit uses context managers to ensure we can
bootstrap clingo using a consistent configuration
regardless of the use case being managed.
* Github actions: test clingo with bootstrapping from sources
* Add command to inspect and clean the bootstrap store
Prevent users from setting the install tree root to the bootstrap store
* clingo: documented how to bootstrap from sources
Co-authored-by: Gregory Becker <becker33@llnl.gov>
- as outlined in merge-request #21336 some clang compilers
can trigger erroneous floating point exceptions.
OpenFOAM normally traps FPE, but disable this in the etc/controlDict
for specific compilers:
change "trapFpe digit;" -> "trapFpe 0;"
Eliminate previous use of the FOAM_SGIFPE env variable in favour of
using the etc/controlDict setting - cleaner and more robust.
Co-authored-by: Mark Olesen <Mark.Olesen@esi-group.com>
* py-importlib: Python 2.7 is needed to build.
added depends_on('python@2.7.0:2.7.99')
* Update var/spack/repos/builtin/packages/py-importlib/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This check is performed in cmake_args rather than with a 'conflicts'
statement because matching on !clang (i.e. any compiler that is not
clang) cannot currently be done with our spec syntax.
If a user creates a wrapper for the ifx binary called ifx_orig,
this causes the `ifx --version` command to produce:
```console
$ ifx --version
ifx_orig (IFORT) 2021.1 Beta 20201113
Copyright (C) 1985-2020 Intel Corporation. All rights reserved.
```
The regex for ifx currently expects the output to begin with
"ifx (IFORT)..." so the wrapper would not be detected as ifx. This
PR removes the need for the static "ifx" string which allows wrappers
to be detected as ifx.
In general, the Intel compiler regexes do not include the invoked
executable name (i.e., ifort, icc, icx, etc.), so this is not
expected to cause any issues.
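A sketch of the relaxation (the exact pattern Spack uses is an assumption):
```python
import re

# before (sketch): anchored on the literal executable name
old = re.compile(r'ifx \(IFORT\) (\S+)')
# after (sketch): key off the "(IFORT)" marker so wrappers like ifx_orig still match
new = re.compile(r'\(IFORT\) (\S+)')

out = 'ifx_orig (IFORT) 2021.1 Beta 20201113'
assert old.search(out) is None
assert new.search(out).group(1) == '2021.1'
```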
* asciidoc: current sourceforge a2x needs python2, new github release python3
* asciidoc: making python 2.3 to 2.7 able to cope with asciidoc
* Fix sensei@develop
Should work with all options but libsim.
Current releases don't work with ~catalyst
See
https://gitlab.kitware.com/sensei/sensei/-/merge_requests/240
for the fix for develop.
Current releases work only with paraview 5.7 and 5.6
See
https://gitlab.kitware.com/sensei/sensei/-/merge_requests/239
for the fix for develop (which works with 5.9)
* Fix libsim.
* Fix warnings.
* Fix python runtime.
* Many changes:
* Reworked cmake options to use the CMakePackage option helpers
* Simplified and consolidated options
* Replaced adios with adios2 variant
* Added vtkm variant (not yet working)
* paraview: Fix downstream consumers getting the wrong FindMPI
* vtk: Fix downstream consumers getting the wrong FindMPI
* Add +ascent, +adios2; remove +adios; variants off by default
* Fix catalyst python logic
* sensei: cleanup formatting
Co-authored-by: Chuck Atkins <chuck.atkins@kitware.com>
* make `spack fetch` work with environments
* previously: `spack fetch` required the explicit statement of
the specs to be fetched, even when in an environment
* now: if no specs are provided to `spack fetch` we check
whether an environment is active and, if so, fetch all of its
uninstalled specs (see the sketch below).
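A sketch of the fallback (API names are assumptions based on the description):
```python
import spack.environment as ev

def specs_to_fetch(args):
    """Fall back to the active environment when no specs are given (sketch)."""
    env = ev.get_env(args, 'fetch')  # assumed helper: active environment or None
    if not args.specs and env:
        return [s for s in env.all_specs() if not s.package.installed]
    return args.specs
```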
When using an external package with the old concretizer, all
dependencies of that external package were severed. This was not
performed bidirectionally though, so for an external package W with
a dependency on Z, if some other package Y depended on Z, Z could
still pull properties (e.g. compiler) from W since it was not
severed as a parent dependency.
This performs the severing bidirectionally, and adds tests to
confirm expected behavior when using config from DAG-adjacent
packages during concretization.
Allow libfuse to build without setuid binary and bump versions of both
libfuse and fuse-overlayfs.
Still doesn't solve the issue where this package tries to install things
into /etc/init.d though.
kcov's CMakeLists.txt generates the "kcov" executable only if
certain dependencies are found. These dependencies are
"libbfd", "libopcodes" and "libiberty", hence the dependency
on binutils.
The clingo-cffi job has two issues to be solved:
1. It uses the default concretizer
2. It requires a package from https://test.pypi.org/simple/
The former can be fixed by setting the SPACK_TEST_SOLVER
environment variable to "clingo".
The latter though requires clingo-cffi to be pushed to a
more stable package index (since https://test.pypi.org/simple/
is meant as a scratch version of PyPI that can be wiped at
any time).
For the time being run the tests in a container. Switch back to
PyPI whenever a new official version of clingo is released.
This allows for quickly configuring a spack install/env to use upstream packages by default. This is particularly important when upstreaming from a set of officially supported spack installs on a production cluster. By configuring such that package preferences match the upstream, you ensure maximal reuse of existing package installations.
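For example, pointing a local Spack at a production install tree looks roughly like this in `upstreams.yaml` (the name and path are placeholders):
```yaml
upstreams:
  cluster-installs:
    install_tree: /path/to/official/spack/opt/spack
```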
* n2p2: Add new package
* remove ,
* Resurrection of , and changed " to single
* changed example.command to example.co
* n2p2: Added v2.1.1
* n2p2: Changed the type of depends_on.
Since there are many variables being set I thought it would be a good idea to document them better and slightly simplify the logic for external vs not-external.
Fixes for gitlab pipelines
* Remove accidentally retained testing branch name
* Generate pipeline w/out debug mode
* Make jobs interruptible for auto-cancel pending
* Work around concretization conflicts
* Support clingo when used with cffi
Clingo recently merged in a new Python module option based on cffi.
Compatibility with this module requires a few changes to Spack: it does
not automatically convert strings/ints/etc. to `Symbol`, and
`clingo.Symbol.string` throws on failure.
- manually convert str/int to clingo.Symbol types
- catch stringify exceptions
- add a job for clingo-cffi to Spack CI
- switch to the potassco-vendored wheel for clingo-cffi CI
- pass the on_unsat argument only when using cffi
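A sketch of the manual conversion described above (the helper name is an assumption):
```python
from clingo import Number, String, Symbol

def to_symbol(value):
    """Convert plain Python values to clingo Symbols by hand (sketch)."""
    if isinstance(value, Symbol):
        return value
    if isinstance(value, bool):   # check bool before int: bool is a subclass of int
        return String(str(value))
    if isinstance(value, int):
        return Number(value)
    return String(str(value))
```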
* Spec.splice feature
Construct a new spec with a dependency swapped out. Currently can only swap dependencies of the same name, and can only apply to concrete specs.
This feature is not yet attached to any install functionality, but will eventually allow us to "rewire" a package to depend on a different set of dependencies.
Docstring is reformatted for git below
Splices dependency "other" into this ("target") Spec, and return the result as a concrete Spec.
If transitive, then other and its dependencies will be extrapolated to a list of Specs and spliced in accordingly.
For example, let there exist a dependency graph as follows:
```
T
| \
Z<-H
```
In this example, Spec T depends on H and Z, and H also depends on Z.
Suppose, however, that we wish to use a differently-built H, known as H'. This function will splice in the new H' in one of two ways:
1. transitively, where H' depends on the Z' it was built with, and the new T* also directly depends on this new Z', or
2. intransitively, where the new T* and H' both depend on the original Z.
Since the Spec returned by this splicing function is no longer deployed the same way it was built, any such changes are tracked by setting the build_spec to point to the corresponding dependency from the original Spec.
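A hedged usage sketch, assuming concrete specs `t` and `h_prime` for T and H':
```python
# splice a differently-built H' into T's dependency graph
spliced = t.splice(h_prime, transitive=True)
assert spliced.concrete
# build_spec records how the spliced spec was originally built
original = spliced.build_spec
```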
Co-authored-by: Nathan Hanford <hanford1@llnl.gov>
If you install packages using spack install in an environment with
complex spec constraints, and the install fails, you may want to
test out the build using spack build-env; one issue (particularly
if you use `concretization: together`) is that it may be hard to pass
the appropriate spec that matches what the environment is
attempting to install.
This updates the build-env command to default to pulling a matching
spec from the environment rather than concretizing what the user
provides on the command line independently.
This makes a similar change to spack cd.
If the user-provided spec matches multiple specs in the environment,
then these commands will now report an error and display all
matching specs (to help the user specify).
Co-authored-by: Gregory Becker <becker33@llnl.gov>
* Made DiHydrogen a required dependency for newer versions of LBANN.
Added an explicit variant for enabling Boost-dependent callbacks.
Updated the separation for embedded Python and the Python front end
code and associated dependencies.
* Bugfix on ROCm include in DiHydrogen
Drops:
* C_INCLUDE_PATH
* CPLUS_INCLUDE_PATH
* LIBRARY_PATH
* INCLUDE
We already decided to use C_INCLUDE_PATH, CPLUS_INCLUDE_PATH, INCLUDE over CPATH here:
https://github.com/spack/spack/pull/14749
However, none of these flags apply to Fortran on Linux. So for consistency it seems better to make the user use -I and -L flags by hand or through pkgconfig.
BlasPP by ECP SLATE will fail to install by default
(`spack install blaspp`) because:
- the default BLAS installation in Spack is OpenBLAS
- BlasPP conflicts with `threads=none` for all recent OpenBLAS releases
OpenBLAS introduced a threadsafe compile option
with 0.3.7+ aka `USE_LOCKING`:
```
# If you want to build a single-threaded OpenBLAS, but expect to call this
# from several concurrent threads in some other program, comment this in for
# thread safety. (This is done automatically for USE_THREAD=1 , and should not
# be necessary when USE_OPENMP=1)
# USE_LOCKING = 1
```
According to tests, with `spack install --test root blaspp`,
this exactly addresses the issues in BlasPP tests.
It also seems to be a good option to set by default for OpenBLAS and
users that do not need this safety net can always disable it.
Solve issues with newer OpenBLAS by requiring
`+locking` over none-default threading options.
* Improve error message for inconsistencies in package.py
Sometimes directives refer to variants that do not exist.
Make it such that:
1. The name of the variant
2. The name of the package which is supposed to have
such variant
3. The name of the package making this assumption
are all printed in the error message for easier debugging.
* Add unit tests
* Also removed LBANN CUDA CMake flags that are set by the
version of Hydrogen that is compiled against.
* Updated recipes to use HWLOC 2.3 with ROCm to enable
topology awareness.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
* genesis: New package.
* fujitsu-ssl2: fix unit test error
* genesis: Fix for comments and add test method
* genesis: Fix for comments
* genesis: Fix for comments
* libblastrampoline: new package
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
It turns out there are certain cases where having Open MPI use an external hwloc messes up other
applications that also rely on hwloc, but a different version.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
* VTK-m: No `pic` variant
A leftover conflict between `shared` and `pic` variants, the
latter is not part of the package anymore, leads to a solver
error with clingo.
This removes the outdated conflict section.
* VTK-m: Kokkos AMD GPU variant changed
Set the minimum C++ standard for LBANN, Hydrogen, and DiHydrogen to
C++17. The minimum C++ standard for Aluminum is C++14. Add new
versions for Aluminum, Hydrogen, and DiHydrogen. Added support for
high performance linkers in LBANN recipe (gold and lld). Added
variants to LBANN for enabling embedded Python support independently
from the Python front end.
* py-fenics-instant: new package for legacy fenics 2016 and 2017 versions
* Update var/spack/repos/builtin/packages/py-fenics-instant/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The signature for configure_args in the template for new
RPackage packages was incorrect (different than what is
defined and used in lib/spack/spack/build_systems/r.py)
See issue #21774
Keep spack.store.store and spack.store.db consistent in unit tests
* Remove calls to monkeypatch for spack.store.store and spack.store.db:
tests that used these called one or the other, which led to
inconsistencies (the tests passed regardless but were fragile as a
result)
* Fixtures making use of monkeypatch with mock_store now use the
updated use_store function, which sets store.store and store.db
consistently
* subprocess_context.TestState now serializes and
restores spack.store.store (without the monkeypatch changes this
would have created inconsistencies)
Since signals are fundamentally racy, we can't bound the amount of time
that the `test_foreground_background_output` test will take to get to
'on'; we can only observe that it transitions to 'on'. So instead of
using an arbitrary limit, just adjust the test to allow either 'on' or
'off' followed by 'on'.
This should eliminate the spurious errors we see in CI.
Follow-up to #17110
### Before
```bash
CC=/Users/Adam/spack/lib/spack/env/clang/clang; export CC
SPACK_CC=/usr/bin/clang; export SPACK_CC
PATH=...:/Users/Adam/spack/lib/spack/env/apple-clang:/Users/Adam/spack/lib/spack/env/case-insensitive:/Users/Adam/spack/lib/spack/env:...; export PATH
```
### After
```bash
CC=/Users/Adam/spack/lib/spack/env/clang/clang; export CC
SPACK_CC=/usr/bin/clang; export SPACK_CC
PATH=...:/Users/Adam/spack/lib/spack/env/clang:/Users/Adam/spack/lib/spack/env/case-insensitive:/Users/Adam/spack/lib/spack/env:...; export PATH
```
`CC` and `SPACK_CC` were being set correctly, but `PATH` was using the name of the compiler `apple-clang` instead of `clang`. For most packages, since `CC` was set correctly, nothing broke. But for packages using `Makefiles` that set `CC` based on `which clang`, it was using the system compilers instead of the compiler wrappers. Discovered when working on `py-xgboost@0.90`.
An alternative fix would be to copy the symlinks in `env/clang` to `env/apple-clang`. Let me know if you think there's a better way to do this, or to test this.
The dependencies needed a little clean up as several dependencies are
only needed for the +X variant. This PR consolidates all of the
dependencies that actually require +X and explicitly disables them when
~X to prevent accidentally picking up system libraries.
- modified the description of the +X variant
- arranged dependencies to group them
- added missing dependency on xz
- removed unneeded dependencies
  - freetype
  - glib
- set dependencies when +X
  - cairo
  - jpeg
  - libpng
  - libtiff
  - tcl/tk
    - R uses tcl/tk together, so only tk needs to be depended on, and
      only when +X
    - moved tcl/tk resources to the with/without-x test
- added explicit with/without settings for
  - cairo
  - jpeglib
  - libpng
  - libtiff
  - tcltk
The fixture was introduced in #19690, maybe accidentally.
It's not used in unit tests, and though it should be
mutable, it seems an exact copy of its immutable version.
Before this change, in pipeline environments where runners do not have access
to persistent shared file-system storage, the only way to pass buildcaches to
dependents in later stages was by using the "enable-artifacts-buildcache" flag
in the gitlab-ci section of the spack.yaml. This change supports a second
mechanism, named "temporary-storage-url-prefix", which can be provided instead
of the "enable-artifacts-buildcache" feature, but the two cannot be used at the
same time. If this prefix is provided (only "file://" and "s3://" urls are
supported), the gitlab "CI_PIPELINE_ID" will be appended to it to create a url
for a mirror where pipeline jobs will write buildcache entries for use by jobs
in subsequent stages. If this prefix is provided, a cleanup job will be
generated to run after all the rebuild jobs have finished that will delete the
contents of the temporary mirror. To support this behavior a new mirror
sub-command has been added: "spack mirror destroy" which can take either a
mirror name or url.
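A sketch of the new option in the gitlab-ci section (the url is a placeholder):
```yaml
gitlab-ci:
  temporary-storage-url-prefix: s3://my-bucket/tmp-mirrors
```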
This change also fixes a bug in generation of the "needs" list for each
job. Each job's "needs" list is supposed to contain only direct
dependencies for scheduling purposes, unless "enable-artifacts-buildcache"
is specified; only in that case are the needs lists supposed to contain
all transitive dependencies. This change fixes a bug that caused the
needs lists to always contain all transitive dependencies, regardless of
whether or not "enable-artifacts-buildcache" was specified.
* py-typing: new version, avoid issues with newer versions of python
https://pypi.org/project/typing/
"For package maintainers, it is preferred to use
typing;python_version<"3.5" if your package requires it to support
earlier Python versions."
* update conflict version / more message detail
* change the depends_on, leave a comment suggesting correct usage
The actual, documented minimum version of the cfitsio dependency,
v3.181, is now neither available for (easy) download from NASA, nor as
a Spack package. No upper bound on version number exists (at this time).
Pipelines: DAG pruning
During the pipeline generation staging process we check each spec against
all configured mirrors to determine whether it is up to date on any of
them. By default, and with the --prune-dag argument to "spack ci
generate", any spec already up to date on at least one remote mirror is
omitted from the generated pipeline. To generate jobs for up-to-date
specs instead of omitting them, use the --no-prune-dag argument.

To speed up the pipeline generation process, pass the --check-index-only
argument. This will cause spack to check only remote buildcache indices
and avoid directly fetching any spec.yaml files from mirrors. The
drawback is that if a remote buildcache index is out of date, spec
rebuild jobs may be scheduled unnecessarily.

This change removes the final-stage-rebuild-index block from the
gitlab-ci section of spack.yaml. Rebuilding the buildcache index of the
mirror specified in the spack.yaml is now the default, unless
"rebuild-index: False" is set. Spack assigns the generated rebuild-index
job runner attributes from an optional new "service-job-attributes"
block, which is also used as the source of runner attributes for another
generated non-build job: a no-op job, which spack generates to avoid
gitlab errors when DAG pruning results in empty pipelines.
Add `manual_download = True` to packages that need to do manual
downloads but do not have the `manual_download` attribute set. This
provides a message when installing these packages rather than a generic
fetch error.
Add versions 2020.12 and 2021.01. The viewer and trace viewer are now
integrated into a single program and one tar ball. Now available on
arm/aarch64 and now uses Java 11.
Update some things in hpctoolkit to prepare for a 2021.02.x release:
1. allow binutils to be built with +nls.
2. require libmonitor to be built with +dlopen.
3. allow rocm in more than just develop branch.
4. remove some conflicting setenv's in hpctoolkit module.
The SPACK_PYTHON environment variable can be set to a python interpreter to be
used by the spack command. This allows the spack command itself to use a
consistent and separate interpreter from whatever python might be used for package
building.
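Usage sketch:
```console
$ SPACK_PYTHON=/usr/bin/python3 spack find
```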
* add new flag when compiling mumps with %gcc@10.
* Fix style
* Try to fix formatting
* Use the flag_handler approach suggested by @michaelkuhn
in the PR review (see the sketch after this list)
* Delete former approach
* Another style issue
* Add another space
* More fixes
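A sketch of the flag_handler approach (the specific flag is an assumption based on the usual gcc@10 Fortran argument-mismatch problem):
```python
def flag_handler(self, name, flags):
    # assumption: gcc 10+ needs -fallow-argument-mismatch for legacy Fortran
    if name == 'fflags' and self.spec.satisfies('%gcc@10:'):
        flags.append('-fallow-argument-mismatch')
    return (flags, None, None)  # (injected, env, build-system) flag groups
```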
We still need mesa18 for some of our builds.
Those builds require python@2, while normal mesa only works with
python@3.
* Remove the deprecation tag
* Add myself as a maintainer: I volunteer to help with this
package for the time being.
* There is only one version, no need to prefer it.
Modifications:
- Make use of SpackCommand objects wherever possible
- Deduplicated code when possible
- Moved cleaning of mirrors to fixtures
- Ensure mock configuration has a clear initialization order
* fixed install with ver 3 and python 3.0
* replaced @3 with @2.999
* [py-pyspark] added version requirements for py-py4j
* [py-pyspark] all versions require at least version 2.7 of python
* [py-pyspark] fixed comma syntax
Co-authored-by: Sid Pendelberry <sid@rit.edu>
The GROMACS package embeds references to its build tool chain.
Use the Spack utilities to make sure these references are correct
outside of the isolated Spack build environment.
`query()` calls `datetime.datetime.fromtimestamp` regardless of whether a
date query is being done. Guard this with an if statement to avoid the
unnecessary work.
Constructing a spec from a name instead of setting name directly forces
from_node_dict to call Spec.parse(), which is slow. Avoid this by using a
zero-arg constructor and setting name directly.
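A sketch of the change:
```python
# before (sketch): Spec(name) triggers Spec.parse(), which is slow
# spec = Spec(name)

# after (sketch): zero-arg constructor, then set the name directly
spec = Spec()
spec.name = name
```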
cmake was added as a runtime dependency to meson in #20449. This
introduces an unnecessary implicit cmake dependency, which increases
build time for meson considerably. cmake is only one of many methods for
finding dependencies (pkg-config, qmake etc.), which are also not
runtime dependencies of meson. Add cmake as a build dependency to mesa
instead.
This solves a few FIXMEs in conftest.py, where
we were manipulating globals and seeing side
effects prior to registering fixtures.
This commit solves the FIXMEs, but introduces
a performance regression on tests that may need
to be investigated
The method is now called "use_repositories" and
makes it clear in the docstring that it accepts
as arguments either Repo objects or paths.
Since there was some duplication between this
contextmanager and "use_repo" in the testing framework,
remove the latter and use spack.repo.use_repositories
across the entire code base.
Make a few adjustment to MockPackageMultiRepo, since it was
stating in the docstring that it was supposed to mock
spack.repo.Repo and was instead mocking spack.repo.RepoPath.
Some compilers, such as the NV compilers, do not recognize `-isystem
dir` when specified without a space.
Works: `-isystem ../include`
Does not work: `-isystem../include`
This PR updates the compiler wrapper to include the space with `-isystem`.
Environment views fail when the tmpdir used for view generation is
on a separate mount from the install_tree, because the files cannot
be symlinked between the two. The fix is to use an alternative
tmpdir located alongside the view.
* [py-moviepy] created template
* [py-moviepy] added dependencies
* [py-moviepy] removed fixmes, added homepage and description
* [py-moviepy] updated to pypi and updated checksum
* [py-moviepy] added setuptools dependency
* [py-moviepy] more specific version limit
* [py-moviepy] added checksum for version 1.0.1
* [py-moviepy] numpy restriction not necessary here
* 3DTK: add new package
* Add missing opencv variants
Co-authored-by: Michael Kuhn <michael@ikkoku.de>
* Fix cmake version req, add eigen dep
* Prefer trunk version
* Tell 3dtk where to find eigen
* Fix installation
Co-authored-by: Michael Kuhn <michael@ikkoku.de>
This PR fixes the case where groff fails to build if the spack install
path is really long. There are a couple of perl scripts that get built,
and used, during the build phase that will fail when the perl
interpreter line is too long. Filtering the lines will not work because
the files do not exist after the configure phase, and patching after the
build phase is too late. This PR runs the scripts explicitly with the
spack perl via the $(PERL) variable in the call to the script.
* Procedure to deprecate old versions of software
* Add documentation
* Fix bug in logic
* Update tab completion
* Deprecate legacy packages
* Deprecate old mxnet as well
* More explicit docs
* py-dvc: new package
* Update var/spack/repos/builtin/packages/py-dvc/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-dvc: add version dependency for py-networkx
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Clarify relaxed double precision option
This is only intended for use on the Fujitsu PRIMEHPC platform
* Fix typo
* Shorten line to keep linter happy
Added conflict for macsio@1.1~mpi after investigating source code. As of
the 1.1 tag, macsio does not properly guard out MPI commands. This is
verified as corrected in @develop.
* mxnet: convert to CMakePackage
* Package isn't installed yet, can't find libs
* Fix bug with GCC 8+ and CUDA 10 on PowerPC
* Add space
* Add patch to fix cmake cuda flags
* Space no longer needed
* Add patch to fix OpenBLAS linking
* Add missing CMake flag
* Fix env set, default to Distribution
* Add new version, patch
* added py-python-benedict recipe
* Update var/spack/repos/builtin/packages/py-python-benedict/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
+ Provide optional variant `pythontools` (default False) that adds a run-time dependency on
`py-matplotlib`.
+ Latest versions require `cmake@3.18:` to support cuda features.
+ Enable a cmake option that forcibly disables qt support. Previously, draco would enable qt
support if it was available in the local build environment (outside of spack).
* graphviz: Remove ghostscript requirement when ~ghostscript
* Add doc variant and patch for 2.44.1
* Patch does not apply
* Update graphviz versions, using archives rather than git hash
* Complete implementation of doc variant
* Fix typo
This commit adds an option to the `external find`
command that allows it to search by tags. In this
way group of executables with common purposes can
be grouped under a single name and a simple command
can be used to detect all of them.
As an example introduce the 'build-tools' tag to
search for common development tools on a system
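Usage sketch (the flag spelling is an assumption):
```console
$ spack external find --tag build-tools
```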
* add to LD_LIBRARY_PATH so that it finds libimf.so
* amrex: fix handling of CUDA arch (#20786)
* amrex: fix handling of CUDA arch
* amrex: fix style
* amrex: fix bug
* Update var/spack/repos/builtin/packages/amrex/package.py
* Update var/spack/repos/builtin/packages/amrex/package.py
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
* ecp-data-vis-sdk: Combine the vis and io SDK packages (#20737)
This better enables the collective set to be deployed together,
satisfying each other's dependencies
* r-sf: fix dependency error (#20898)
* improve documentation for Rocm (hip amd builds) (#20812)
* improve documentation
* astyle: Fix makefile for install parameter (#20899)
* llvm-doe: added new package (#20719)
The package contains duplicated code from llvm/package.py,
which will be consolidated later.
* r-e1071: added v1.7-4 (#20891)
* r-diffusionmap: added v1.2.0 (#20881)
* r-covr: added v3.5.1 (#20868)
* r-class: added v7.3-17 (#20856)
* py-h5py: HDF5_DIR is needed for ~mpi too (#20905)
For the `~mpi` variant, the environment variable `HDF5_DIR` is still required. I moved this command out of the `+mpi` conditional.
* py-horovod: fix typo on variant name in conflicts directive (#20906)
* fujitsu-fftw: Add new package (#20824)
* pocl: added v1.6 (#20932)
Made version 1.5 or lower conflict with a64fx.
* PCL: add new package (#20933)
* r-rle: new package (#20916)
Common 'base' and 'stats' methods for 'rle' objects, aiming to make it
possible to treat them transparently as vectors.
* r-ellipsis: added v0.3.1 (#20913)
* libconfig: add build dependency on texinfo (#20930)
* r-flexmix: add v2.3-17 (#20924)
* r-fitdistrplus: add v1.1-3 (#20923)
* r-fit-models: add v0.64 (#20922)
* r-fields: add v11.6 (#20921)
* r-fftwtools: add v0.9-9 (#20920)
* r-farver: add v2.0.3 (#20919)
* r-expm: add v0.999-6 (#20918)
* cln: add build dependency on texinfo (#20928)
* r-expint: add v0.1-6 (#20917)
* r-envstats: add v2.4.0 (#20915)
* r-energy: add v1.7-7 (#20914)
* r-ellipse: add v0.4.2 (#20912)
* py-fiscalyear: add v0.3.0 (#20911)
* r-ecp: add v3.1.3 (#20910)
* r-plotmo: add v3.6.0 (#20909)
* Improve gcc detection in llvm. (#20189)
Co-authored-by: Tom Scogland <tom.scogland@gmail.com>
Co-authored-by: Thomas Green <ca-tgreen@gw4a64fxlogin00.head.gw4.metoffice.gov.uk>
* hatchet: updated urls (#20908)
* py-anuga: add new package (#20782)
* libvips: added v8.10.5 (#20902)
* libzmq: add platform conditions to libbsd dependency (#20893)
* r-dtw: add v1.22-3 (#20890)
* r-dt: add v0.17 (#20889)
* r-dosnow: add v1.0.19 (#20888)
* add version 1.0.16 to r-doparallel (#20886)
* add version 1.3.7 to r-domc (#20885)
* add version 0.9-15 to r-diversitree (#20884)
* add version 1.3-3 to r-dismo (#20883)
* add version 0.6.27 to r-digest (#20882)
* add version 1.5 to r-rngtools (#20887)
* add version 1.5.8 to r-dicekriging (#20877)
* add version 1.4.2 to r-httr (#20876)
* add version 1.28 to r-desolve (#20875)
* add version 2.2-5 to r-deoptim (#20874)
* add version 0.2-3 to r-deldir (#20873)
* add version 1.0.0 to r-crul (#20870)
* add version 1.1.0.1 to r-crosstalk (#20869)
* add version 1.0-1 to r-copula (#20867)
* add version 5.0.2 to r-rcppparallel (#20866)
* add version 2.0-1 to r-compositions (#20865)
* add version 0.4.10 to r-rlang (#20796)
* add version 0.3.6 to r-vctrs (#20878)
* amrex: add ROCm support (#20809)
* add version 2.0-0 to r-colorspace (#20864)
* add version 1.3-1 to r-coin (#20863)
* add version 0.19-4 to r-coda (#20862)
* add version 1.3.7 to r-clustergeneration (#20861)
* add version 0.3-58 to r-clue (#20860)
* add version 0.7.1 to r-clipr (#20859)
* add version 2.2.0 to r-cli (#20858)
* add version 0.4-3 to r-classint (#20857)
* add version 0.1.2 to r-globaloptions (#20855)
* add version 2.3-56 to r-chron (#20854)
* add version 0.4.10 to r-checkpoint (#20853)
* add version 2.0.0 to r-checkmate (#20852)
* add version 1.18.1 to r-catools (#20850)
* add version 1.2.2.2 to r-modelmetrics (#20849)
* add version 3.0-4 to r-cardata (#20847)
* add version 1.0.1 to r-caracas (#20846)
* r-lifecycle: new package at v0.2.0 (#20845)
* add version 3.0-10 to r-car (#20844)
* add version 3.4.5 to r-processx (#20843)
* add version 1.5-12.2 to r-cairo (#20842)
* add version 0.2.3 to r-cubist (#20841)
* add version 2.6 to r-rmarkdown (#20838)
* add version 1.2.1 to r-blob (#20819)
* add version 4.0.4 to r-bit (#20818)
* add version 2.4-1 to r-bio3d (#20816)
* add version 0.4.2.3 to r-bibtex (#20815)
* add version 3.1-4 to r-bayesm (#20807)
* add version 1.2.1 to r-backports (#20806)
* add version 2.0.3 to r-argparse (#20805)
* add version 5.4-1 to r-ape (#20804)
* add version 0.8-18 to r-amap (#20803)
* r-pixmap: added new package (#20795)
* zoltan: source code location change (#20787)
* refactor path logic
* added some paths to make compilers and libs discoverable
* add to LD_LIBRARY_PATH so that it finds libimf.so
and cleanup PEP8
* refactor path logic
* adding paths to LIBRARY_PATH so compiler wrappers will find -lmpi
* added vals for CC=icx, CXX=icpx, FC=ifx to generated module
* back out changes to intel-oneapi-mpi, save for separate PR
* Update var/spack/repos/builtin/packages/intel-oneapi-compilers/package.py
path is joined in _ld_library_path()
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
* set absolute paths to icx,icpx,ifx
* dang close parenthesis
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
Co-authored-by: mic84 <mrosso@lbl.gov>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Chuck Atkins <chuck.atkins@kitware.com>
Co-authored-by: darmac <xiaojun2@hisilicon.com>
Co-authored-by: Danny Taller <66029857+dtaller@users.noreply.github.com>
Co-authored-by: Tomoyasu Nojiri <68096132+t-nojiri@users.noreply.github.com>
Co-authored-by: Shintaro Iwasaki <siwasaki@anl.gov>
Co-authored-by: Glenn Johnson <glenn-johnson@uiowa.edu>
Co-authored-by: Kelly (KT) Thompson <KineticTheory@users.noreply.github.com>
Co-authored-by: Henrique Mendonça <henrique@users.noreply.github.com>
Co-authored-by: h-denpo <57649496+h-denpo@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Thomas Green <tomgreen66@hotmail.com>
Co-authored-by: Tom Scogland <tom.scogland@gmail.com>
Co-authored-by: Thomas Green <ca-tgreen@gw4a64fxlogin00.head.gw4.metoffice.gov.uk>
Co-authored-by: Abhinav Bhatele <bhatele@cs.umd.edu>
Co-authored-by: a-saitoh-fj <63334055+a-saitoh-fj@users.noreply.github.com>
Co-authored-by: QuellynSnead <quellyn@lanl.gov>
The "fact" method before was dealing with multiple facts
registered per call, which was used when we were emitting
grounded rules from knowledge of the problem instance.
Now that the encoding is changed we can simplify the method
to deal only with a single fact per call.
* py-dictdiffer: fix offline dependencies
* Update var/spack/repos/builtin/packages/py-dictdiffer/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-flatten-dict: new recipe
* Update var/spack/repos/builtin/packages/py-flatten-dict/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-flatten-dict: fix dependencies
* py-flatten-dict: fix dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* clingo/clingo-bootstrap: added a package with an option for bootstrapping clingo
  - package builds in Release mode
  - uses GCC options to link libstdc++ and libgcc statically
* clingo-bootstrap: apple-clang options to bootstrap statically on darwin
* clingo: fix the path of the Python interpreter
In case multiple Python versions are in the same prefix
(e.g. when clingo is built against an external Python),
it may happen that the Python used by CMake does not
match the corresponding node in the current spec.
This is fixed here by defining "Python_EXECUTABLE"
properly as a hint to CMake.
* clingo: the commit for "spack" version has been updated.
- add variants for build targets, language bindings, backends
- ensure selected variants are compatible with zfp version
- point to GitHub (not LLNL) tar balls
- add dependencies
- update link to homepage
- add maintainers
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Patch provided by @Billae
Avoid the following error:
```
  File "/home/danlipsa/projects/spack/lib/spack/llnl/util/tty/log.py", line 768, in _writer_daemon
    line = _retry(in_pipe.readline)()
  File "/home/danlipsa/projects/spack/lib/spack/llnl/util/tty/log.py", line 830, in wrapped
    return function(*args, **kwargs)
  File "/usr/lib/python3.8/codecs.py", line 322, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0x97 in position 220: invalid start byte
```
This PR adds:
1. A patch that fixes a bug in version 2.70
(will be fixed upstream in the next release: https://savannah.gnu.org/support/?110396).
2. A fix for the way we patch shebang in bin/autom4te.in.
For 2, we need to keep the original modification timestamp of the file.
Otherwise, we either get an empty man page for autom4te (versions 2.69 and before)
or a failure at build time (versions 2.70 and after).
The difference has to do with the update of the missing script: https://git.savannah.gnu.org/cgit/automake.git/commit/lib/missing?id=a22717dffe37f30ef2ad2c355b68c9b3b5e4b8c7
It will take time until developers of Autotools-based packages adjust their scripts
to the new version, therefore, 2.69 is marked as preferred.
* New interface reconstruction package
* forgot to put in CMake option for Jali
* cleanup whitespace
* fix lines with more than 79 chars
* more long line cleanup
* fix typo WONTON_ENABLE_Kokkos ---> TANGRAM_ENABLE_Kokkos
* fix bugs in CMake section
* more compact cmake block
* update hash for 1.2.10 and add 1.2.11
* update recipe for Portage 3.0.0
* removing old versions - they won't build with the new recipe and the url specification doesn't work for them
* update version to 3.3.6
Co-authored-by: Rao Garimella <rao@abyzou.lanl.gov>
* added pytest-benchmark recipe
* Update var/spack/repos/builtin/packages/py-pytest-benchmark/package.py
Added Python 2 dependency.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* added package py-pytest-cpp
* Update var/spack/repos/builtin/packages/py-pytest-cpp/package.py
upstream requires pytest != 5.4.0, so use @:5.3.999
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* added package py-pytest-timeout
* Update var/spack/repos/builtin/packages/py-pytest-timeout/package.py
Added Python2.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* added package py-openmc
* Update var/spack/repos/builtin/packages/py-openmc/package.py
specify branch when using branch names for versions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-openmc/package.py
use run after fixture to install openmc lib
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-openmc/package.py
Simplify copying openmc library to py-openmc prefix using install
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-openmc/package.py
NumPy should be 1.9+
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fix missing paren
* Update var/spack/repos/builtin/packages/py-openmc/package.py
fixed parens
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-openmc/package.py
use v0.11.0 in URL
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Sometimes we need to patch a file that is a dependency for some other
automatically generated file that comes in a release tarball. As a
result, make tries to regenerate the dependent file using additional
tools (e.g. help2man), which would not be needed otherwise.
In some cases, it's preferable to avoid that (e.g. see #21255). A way
to do that is to save the modification timestamps before patching and
restoring them afterwards. This PR introduces a context wrapper that
does that.
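A minimal sketch of such a wrapper, assuming it takes the paths whose timestamps must survive the body; the helper name is hypothetical:
```python
import os
from contextlib import contextmanager

@contextmanager
def keep_modification_time(*filenames):
    # Record (atime, mtime) for each existing file before the body runs
    saved = {}
    for f in filenames:
        if os.path.exists(f):
            st = os.stat(f)
            saved[f] = (st.st_atime, st.st_mtime)
    try:
        yield
    finally:
        # Restore the recorded times so make does not try to regenerate
        # files that depend on the ones we patched
        for f, times in saved.items():
            if os.path.exists(f):
                os.utime(f, times)
```
A package could then wrap its filter_file calls, e.g. `with keep_modification_time('bin/autom4te.in'): ...`, so make does not try to regenerate dependent files.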
Python extensions use CC and LDSHARED from the sysconfig module to
build. When Spack installs Python, it replaces the Spack compiler
wrappers in these values with the underlying compilers (since these
wrappers are not useful outside of the context of running Spack).
In order to use the Spack compiler wrappers when building Python
extensions with Spack, Spack sets the LDSHARED environment variable
when running `Python.setup_py` (which overrides sysconfig). However,
many Python extensions use an alternative method to build (namely
PythonPackage.setup_py), which meant that LDSHARED was not set (and
RPATHs were not inserted for dependencies).
This commit makes the following changes:
* Sets LDSHARED in the environment: this applies to all commands
executed during the build, rather than for a single command
invocation
* Updates the logic to set LDSHARED: this replaces the compiler
executable in LDSHARED with the Spack compiler wrapper. This
means that for some externally-built instances of Python,
Spack will now switch to using the Spack wrappers when building
extensions. The behavior is expected to be the same for Spack-
built instances of Python.
* Performs similar modifications for LDCXXSHARED (to ensure RPATHs
are included for C++ codes); a sketch of the rewrite follows this list
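A hedged sketch of that rewrite: keep the sysconfig link line but swap its leading compiler for the Spack wrapper (the helper name and the wrapper path in the comment are illustrative):
```python
import sysconfig

def spack_ldshared(wrapper_cc):
    # e.g. LDSHARED == "gcc -pthread -shared"
    parts = sysconfig.get_config_var("LDSHARED").split()
    parts[0] = wrapper_cc  # replace the real compiler with the wrapper
    return " ".join(parts)

# Exported once for the whole build, e.g.:
# env.set("LDSHARED", spack_ldshared("/path/to/spack/env/cc"))
```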
On ppc64le and aarch64, Spack tries to execute any "config.guess" and
"config.sub" scripts it finds in the source package.
However, in the libsodium tarball, these files are present but not
executable. This causes the following error when trying to install
libsodium with spack:
Error: RuntimeError: Failed to find suitable substitutes for config.sub, config.guess
Fix this by chmod-ing the scripts in the patch() function of libsodium.
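A minimal sketch of that patch() fix, assuming the scripts sit in the unpacked source root (the real tarball may keep them in a subdirectory):
```python
import os
import stat

from spack import *

class Libsodium(AutotoolsPackage):
    # ... versions elided ...

    def patch(self):
        # The tarball ships these scripts without the execute bit set
        for script in ("config.guess", "config.sub"):
            st = os.stat(script)
            os.chmod(script, st.st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
```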
* recipe: add version 6.1.1 for pytest
add recipe for new dependency py-iniconfig
recipe: add version 6.1.1 for pytest
add recipe for new dependency py-iniconfig
* fix: 'SyntaxError: invalid syntax' during unittests
* requested changes on the pull request done
* requested changes on dep for py-pytest
* change constraint on python for importlib-metadata
* undo change on py-importlib-metadata as requested
* bug fix
* bug fix on py-wcwidth
* fix as requested
* forgot @ in when param
* forgot a colon
* add new versions py-pytest and py-py
* fix setuptools* version
* add rule for more-itertools
* [py-intel-openmp] created template
* [py-intel-openmp] is wheel
* [py-intel-openmp] fixed version for linux
* [py-intel-openmp] removed fixmes, added homepage and description
* [py-intel-openmp] added macos support
* [py-intel-openmp] style fix
* petsc: add a +mkl-pardiso variant
mkl_pardiso solver is distributed with intel-mkl
* petsc: depend on mkl instead of intel-mkl
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The first of my two upstream patches to mypy landed in the 0.800 tag released this morning; it lets us use module and package parameters with a .mypy.ini file that has a files key. This change uses those parameters to check all of spack in `spack style`, but leaves the packages out for now since they are still very, very broken. If no package has been modified, the packages are not checked; if one has been, they are. Also includes some fixes for the log tests, since they were not type checking.
Should also fix all failures related to "duplicate module named package" errors.
Hopefully the next drop of mypy will include my other patch so we can just specify the modules and packages in the config file to begin with, but for now we'll have to live with a bare mypy doing a check of the libs but not the packages.
* use module and package flags to check packages properly
* stop checking package files, use package flag for libs
The packages are not type checkable yet, need to finish out another PR
before they can be. The previous commit also didn't check the libraries
properly, this one does.
Add versions 4.12.6 and 5.0.3.
I think the preferred version was there to keep version 4.
But that's why we have Spack: people can install
whatever version they want.
And root has a properly versioned dependency.
* mumps: Fix for problematic src/makefile patch (#20590)
Minor change in src/Makefile between 5.2.0 and 5.3.3 causing patch to
break. Split into 2 patchfiles
* mumps: Additional patch for fixing #20590
This is to fix issue wherein build fails on Ubuntu due to undefined
symbols, despite symbols being included in other libraries referenced
on the compilation line. I believe the issue is that the inclusion
of libsmumps.so was (due to my original patch) causing
libmumps_common.so to be automatically loaded, but since libpords.so
was not also required, the error was occurring. I have added libpords.so
along with libmumps_common.so to be explicit dependencies of
libsmumps.so, etc., which seems to resolve the issue.
* ArrayFire: Add version 3.7.2.
* ArrayFire: Allow using MKL as the FFTW provider.
* ArrayFire: Ensure the libraries are properly found.
The required backend(s) can be specified in the library query.
* openssl: remove preprocessor flags incompatible with NVIDIA HPC SDK
* Update var/spack/repos/builtin/packages/openssl/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Scott McMillan <smcmillan@nvidia.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [py-statsmodels] added version 0.12.1 and updated dependencies accordingly
* [py-statsmodels] added python requirements for new version and fixed formatting for readability
* added m4 dep to PVM recipe
* added libtirpc dep to PVM recipe
* decode str or bytestr string to unicode
* Resolved comments from @adamjstewart on setup_build_environment
* When the SCR spec specifies a resource_manager=SLURM or LSF flag, propagate the spec through to
the libyogrt scheduler=slurm or lsf
* Use libyogrt default scheduler option when the SCR spec does not specify LSF or SLURM
* updated relion for new versions
* Switched to checksum versions
* Enabled spack tracking for MKL and TBB when CPU optimizations are enabled
* Added variants to control the MKL FFT and patent features
* Replaced tags with sha256 for older versions and switched to virtual packages
* py-funcy: new recipe
* Update var/spack/repos/builtin/packages/py-funcy/package.py
add build and run python dependencies
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* sbang pushed back to callers;
star moved to util.lang
* updated unit test
* sbang test moved; local tests pass
Co-authored-by: Nathan Hanford <hanford1@llnl.gov>
* fixing outdated metis link
* updated the URL to the official website, since the previous URL pointed to an unofficial GitHub mirror that only contains the latest version
* py-dictdiffer: new recipe
* Update var/spack/repos/builtin/packages/py-dictdiffer/package.py
add correct setuptools dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update NEURON simulator package
- update recipe to support autoconf as well as cmake
- new versions >=7.8 support cmake
- remove old variants
- added patch for latest bug fix release 7.8.2
Co-authored-by: Kumbhar Pramod Shivaji <kumbhar@bbpv1.epfl.ch>
Co-authored-by: Kumbhar Pramod Shivaji <kumbhar@bb-c02vf1h0hv2r.epfl.ch>
* NAMD: FIX build +cuda
Hi,
If I try to compile NAMD with CUDA support, it fails because it cannot find the file "{self.arch}.cuda", which is under the "arch" folder.
* NAMD: FIX mpi ~smp
Fix `spack install namd ^charmpp backend=mpi ~smp`
* ssht: New version 1.3.4
ssht changed its configuration mechanism from "home-grown" to CMake. The previously current version 1.2b1 (a beta release) is thus unfortunately no longer available.
* ssht: Don't set build type
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* Don't use CUDA for hipblas
* old versions use TRY_CUDA
* Update var/spack/repos/builtin/packages/hipblas/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add version 2.2-2 to r-gwmodel
* Update var/spack/repos/builtin/packages/r-gwmodel/package.py
Fix comma, space issue.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add version 0.3.17 to r-inline
* Drop R version constraint
A really old version of R was specified in the 0.3.14 and 0.3.15
versions of r-inline. This constraint was dropped in the 0.3.17 version.
Drop it from the spack recipe as well.
* vasp: fix build with gfortran 10
Avoid Error: Type mismatch between actual argument at (1) and actual argument at (2)
* vasp: add version 6.1.1
* vasp 6: allow building without CUDA
* Adding PiP recipe
* pip@1 recipe (it seems working)
* change install dir hierarchy
* installing PiP man pages
* add pip-glibc & pip-gdb
* fix configure option designations, fix dependency types
* fix dependency type of pip
* use AutotoolsPackage in pip recipe
* add patch for pip-glibc & pip-gdb to enable 'disable-werror'
* change glibc install directory
* add linux distro check to pip-gdb
* create process-in-process package
* use flag_handler and join_path
* add gcc version constraint, change install-test to check-installed
* fix gcc version designations on conflicts()
* add constraint of target cpu, fix flake8 warnings
* add version constraint to resource()
* Some fixes to adapt the current version
not to execute 'piplnlibs'
change documentation install command
* Update
new branch name of PiP-gdb
adapting PiP-Testsuite
* update pip-gdb github urls
* The very first commit of Process-in-Process (PiP)
details can be found at https://github.com/RIKEN-SysSoft/PiP
* Fix comment style issues
* New Package: Process-in-Process (PiP) -- 2nd trial
* fix style issue
* change inline comments style (required to have two spaces)
Co-authored-by: Daiki Matsunaga <daikim@axe.bz>
Imagemagick-7.0.8 needs to link against libltdl. Otherwise, the build will fail with:
```
2 errors found in build log:
503 checking for libltdl...
504 checking ltdl.h usability... no
505 checking ltdl.h presence... no
506 checking for ltdl.h... no
507 checking for lt_dlinit in -lltdl... no
508 checking if libltdl package is complete... no
>> 509 configure: error: in `/tmp/gpjohnsn/spack-stage/spack-stage-imagemagick-7.0.8-7-4y44gaklhhciiwjzhfpxjfwdj5q
ltjp3/spack-src':
>> 510 configure: error: libltdl is required for modules and OpenCL builds
511 See `config.log' for more details
```
* add version 3.8.2 to r-gtools
* Improve formatting of description
In case the list gets formatted as a non-list:
- added semicolons to end of list items
- replaced dashes with [#]
* add version 1.30 to r-knitr
* Fix version constraints
- r-digest
- r-formatr
The version constraints on those packages should actually be in the `when`
clause.
'date' is a C++ header library offering extensive date and time
functionality for the C++11, C++14 and C++17 standards written by Howard
Hinnant and released under the MIT license. A slightly modified version
has been accepted (along with 'tz.h') as part of C++20. This package
regroups all header files from the upstream repository by Howard Hinnant
so that other R packages can use them in their C++ code. At present, few
of the types have explicit 'Rcpp' wrapper though these may be added as
needed.
Designed to ease the application and comparison of multiple hypothesis
testing procedures for FWER, gFWER, FDR and FDX. Methods are
standardized and usable by the accompanying 'mutossGUI'.
Utility functions that enhance the 'parallel' package and support the
built-in parallel backends of the 'future' package. For example,
availableCores() gives the number of CPU cores available to your R
process as given by the operating system, 'cgroups' and Linux
containers, R options, and environment variables, including those set by
job schedulers on high-performance compute clusters. If none is set, it
will fall back to parallel::detectCores(). Another example is
makeClusterPSOCK(), which is backward compatible with
parallel::makePSOCKcluster() while doing a better job in setting up
remote cluster workers without the need for configuring the firewall to
do port-forwarding to your local computer.
Contains third-party map tile provider information from 'Leaflet.js',
<https://github.com/leaflet-extras/leaflet-providers>, to be used with
the 'leaflet' R package. Additionally, 'leaflet.providers' enables users
to retrieve up-to-date provider information between package updates.
Provides a header only, C++11 interface to R's C interface. Compared to
other approaches 'cpp11' strives to be safe against long jumps from the
C API as well as C++ exceptions, conform to normal R function semantics
and supports interaction with 'ALTREP' vectors.
Query, set, delete credentials from the 'git' credential store. Manage
'GitHub' tokens and other 'git' credentials. This package is to be used
by other packages that need to authenticate to 'GitHub' and/or other
'git' repositories.
Importance sampling from the truncated multivariate normal using the GHK
(Geweke-Hajivassiliou-Keane) simulator. Unlike Gibbs sampling which can
get stuck in one truncation sub-region depending on initial values, this
package allows truncation based on disjoint regions that are created by
truncation of absolute values. The GHK algorithm uses simple Cholesky
transformation followed by recursive simulation of univariate truncated
normals hence there are also no convergence issues. Importance sample is
returned along with sampling weights, based on which, one can calculate
integrals over truncated regions for multivariate normals.
This release also includes the HDF5 VOL plugin, so I've added an additional
function to ensure the HDF5_PLUGIN_PATH env var gets updated with the adios
install prefix.
* intel-xed: add version 12.0.1
Rework the version numbers for intel-xed, now that xed has actual
releases and tags. Add releases 11.2.0 and 12.0.1. Rename 2019.03.01
to 10.2019.03 as a legacy version that fits in the new order.
Add variant +pic to compile libxed.a with PIC code so that it can be
linked into another shared library.
Add conflict for aarch64.
Add mwkrentel as maintainer.
* py-pyfiglet: new recipe
* Update var/spack/repos/builtin/packages/py-pyfiglet/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pyfiglet: use pypi url
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
fixes #20736
Before this one line fix we were erroneously deducing
that dependency conditions hold even if a package
was external.
This may result in answer sets that contain imposed
conditions on a node without the node being present
in the DAG, hence #20736.
fixes #20611
The conflict was triggered by an invalid value of the
'scheduler' variant. This causes Spack to error when libyogrt
facts are validated by the ASP-based concretizer.
At some point in the past, the skip_patch argument was removed
from the call to package.do_install(); this broke the --skip-patch
flag on the dev-build command.
There are two issues with hip: it tries to autodetect the patch
version number from git (when installed), but it does not check whether it
is even inside a git repo. The result is that we end up with a shared lib
with a trailing dash in the library suffix: `libamd64.so.x.y.z-`, which
confuses GCC. The patch checks whether the `.git` folder exists and, if
it does not, handles version numbering the same as when git is not
installed (see the sketch below).
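In Python terms the guard looks roughly like the sketch below; the actual fix is a patch to hip's build scripts, so every name here is illustrative:
```python
import os
import subprocess

def detect_patch_version(source_dir):
    # Only ask git for a version when a .git directory actually exists
    if os.path.isdir(os.path.join(source_dir, ".git")):
        result = subprocess.run(
            ["git", "-C", source_dir, "describe", "--tags"],
            capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout.strip()
    # Fall back to the same numbering used when git is not installed,
    # avoiding the trailing "-" in the library suffix
    return ""
```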
* opencl-c-headers: add new version 2020.12.18
* opencl-clhpp: add new version 2.0.13
* opencl-headers: now supports OpenCL 3.0 with new versions of opencl-c-headers and opencl-clhpp
* ocl-icd: add new version 2.2.14; it can now provide OpenCL 3.0
PaRSEC: the Parallel Runtime Scheduler and Execution Controller for micro-tasks on distributed heterogeneous systems.
Signed-off-by: Aurelien Bouteiller <bouteill@icl.utk.edu>
* py-tensorflow: 2.4.0 and dependency updates
* minor version updates
* fix numpy dependency
* dependency rework: compatible release issues, start to clarify cuda versions
* --incompatible_no_support_tools_in_action_inputs was removed in bazel 3.6
* adjustment to versions of cuda dependency, also make sure that
patches/filters still apply to certain release trains.
* python 3.8 and tf < 2.2 have issues
* missed py-grpcio version bump
Set up environment and dependent packages properly when building
with intel-oneapi-mpi as a dependency MPI provider (e.g. point to
mpicc compiler wrapper).
* eospac: add version 6.4.2beta
* eospac: clarify EOSPAC "beta" versions
Compared to 6.4.1, EOSPAC 6.4.2beta contains only one change, a fix
for an inability to read some SESAME files in ASCII format. From the
release announcement,
EOSPAC 6.4.2beta has been released for general use as the latest
(i.e., eospac6-latest) versions. This is a small patch to the
previously-released version 6.4.1, which was requested by an
affected user.
But the "beta" label can cause confusion, especially when a beta
version is the new preferred version, as is the case here. As
suggested by reviewers, add a comment clarifying EOSPAC's use of
"beta".
This properly sets PATH/CPATH/LIBRARY_PATH etc. to make the
Spack-generated module file for intel-oneapi-compilers useful
(without this, 'icx' would not be found after loading the module
file for intel-oneapi-compilers).
The C library for the current compiler should already be used by the compiler, so there is no point in returning any libs for this package.
Without this patch, if one uses this as an external package (as intended), it can inject system library paths into the build process at the wrong place.
fixes #20679
In this refactor we have a single cardinality rule on the
provider, which triggers a rule transforming a dependency
on a virtual package into a dependency on the provider of
the virtual.
This adds a -i option to "spack python" which allows use of the
IPython interpreter; it can be used with "spack python -i ipython".
This assumes it is available in the Python instance used to run
Spack (i.e. that you can "import IPython").
* Update recipe for AOMP.
Reduced repetition with version hashes.
Expanded dependency versioning.
Reduced repetition with cmake args.
Added version 3.10.0
* Update dependency versions and remove unneeded quotes.
* Update var/spack/repos/builtin/packages/aomp/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update of Eccodes to 2.19.1
* PEP8
* PEP8
* PEP8-whitespace
* Update var/spack/repos/builtin/packages/eccodes/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Michael Blaschek <michael.blaschek@univie.ac.at>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Every other predicate in the concretizer uses a `_set` suffix to
implement user- or package-supplied settings, but compiler settings use a
`_hard` suffix for this. There's no difference in how they're used, so
make the names the same.
- [x] change `node_compiler_hard` to `node_compiler_set`
- [x] change `node_compiler_version_hard` to `node_compiler_version_set`
* OpenMPI: Depends on hwloc & libevent
Both hwloc & libevent are required dependencies of Open MPI.
While they are also shipped internally, newer releases (>=4.0)
will start looking for external packages by default.
This caused build issues of Open MPI 4.0.5 with Fortran on macOS
10.15.
* Open MPI 4.0: libevent external
Internally shipped libevent just works fine for prior releases.
#20076 moved Cray-specific MPICH support from the Spack MPICH package
to a new cray-mpich Package. This broke existing package installs
using external mpich on Cray systems. This PR keeps the cray-mpich
package but restores the Cray-specific MPICH support for older
installations.
In the future this support should be removed from the Spack mpich
package and users should be directed to use cray-mpich on Cray.
Previously, the concretizer handled version constraints by comparing all
pairs of constraints and ensuring they satisfied each other. This led to
inconsistent results from clingo, due to ambiguous semantics like:
version_constraint_satisfies("mpi", ":1", ":3")
version_constraint_satisfies("mpi", ":3", ":1")
To get around this, we introduce possible (fake) versions for virtuals,
based on their constraints. Essentially, we add any Versions,
VersionRange endpoints, and all such Versions and endpoints from
VersionLists to the constraint. Virtuals will have one of these synthetic
versions "picked" by the solver. This also allows us to remove a special
case from handling of `version_satisfies/3` -- virtuals now work just
like regular packages.
This converts the virtual handling in the new concretizer from
already-ground rules to facts. This is the last thing that needs to be
refactored, and it converts the entire concretizer to just use facts.
The previous way of handling virtuals hinged on rules involving
`single_provider_for` facts that were tied to the virtual and a version
range. The new method uses the condition pattern we've been using for
dependencies, externals, and conflicts.
To handle virtuals as conditions, we impose constraints on "fake" virtual
specs in the logic program. i.e., `version_satisfies("mpi", "2.0:",
"2.0")` is legal whereas before we wouldn't have seen something like
this. Currently, constraints are only handled on versions -- we don't
handle variants or anything else yet, but the key change here is that we
*could*. For a long time, virtual handling in Spack has only dealt with
versions, and we'd like to be able to handle variants as well. We could
easily add an integrity constraint to handle variants like the one we use
for versions.
One issue with the implementation here is that virtual packages don't
actually declare possible versions like regular packages do. To get
around that, we implement an integrity constraint like this:
:- virtual_node(Virtual),
version_satisfies(Virtual, V1), version_satisfies(Virtual, V2),
not version_constraint_satisfies(Virtual, V1, V2).
This requires us to compare every version constraint to every other, both
in program generation and within the concretizer -- so there's a
potentially quadratic evaluation time on virtual constraints because we
don't have a real version to "anchor" things to. We just say that all the
constraints need to agree for the virtual constraint to hold.
We can investigate adding synthetic versions for virtuals in the future,
to speed this up.
This code in `SpecBuilder.build_specs()` introduced in #20203, can loop
seemingly interminably for very large specs:
```python
set([spec.root for spec in self._specs.values()])
```
It's deceptive, because it seems like there must be an issue with
`spec.root`, but that works fine. It's building the set afterwards that
takes forever, at least on `r-rminer`. Currently if you try running
`spack solve r-rminer`, it loops infinitely and spins up your fan.
The issue (I think) is that the spec is not yet complete when this is
run, and something is going wrong when constructing and comparing so many
values produced by `_cmp_key()`. We can investigate the efficiency of
`_cmp_key()` separately, but for now, the fix is:
```python
roots = [spec.root for spec in self._specs.values()]
roots = dict((id(r), r) for r in roots)
```
We know the specs in `self._specs` are distinct (they just came out of
the solver), so we can just use their `id()` to unique them here. This
gets rid of the infinite loop.
- [x] add `concretize.lp`, `spack.yaml`, etc. to licensed files
- [x] update all licensed files to say 2013-2021 using
`spack license update-copyright-year`
- [x] appease mypy with some additions to package.py that were needed
for oneapi.py
This adds a new subcommand to `spack license` that automatically updates
the copyright year in files that should have a license header.
- [x] add `spack license update-copyright-year` command
- [x] add test
This adds two lines to `.gitattributes`:
- [x] exclude vendored code from GitHub's language calculation
- [x] recognize `.lp` files as Prolog (closest language to ASP that
linguist supports)
It looks like there have been two attempts
(https://github.com/github/linguist/issues/3867,
https://github.com/github/linguist/issues/4860) to add ASP as a language
to Linguist, but it's not widespread enough to be standard yet (or at
least the people who submitted the PRs haven't been able to show enough
stats to prove it). We'll settle for calling ASP "Prolog" for now as
that'll get us some syntax highlighting for `concretize.lp`.
* hdf-eos5: new package (HDF for Earth Observing System using hdf v5)
* hdf-eos5: flake8 fixes
* hdf-eos5: trying to fix flake8 errors
* hdf-eos5: flake8 fix
* hdf-eos5: Fix to support Fortran codes
The -Df2cFortran compilation flag needed to support Fortran
* hdf-eos2: new package (HDF for Earth Observing System using HDF4)
* hdf-eos2: flake8 fixes
* hdf-eos2: fix to support Fortran
Need the compilation flag -Df2cFortran to allow support for Fortran
codes
libuuid is currently contained in util-linux, libuuid and uuid. This
change introduces a new virtual provider `uuid` and renames the existing
`uuid` package to `ossp-uuid`.
util-linux's libuuid is provided in the form of a separate package
util-linux-uuid to make sure that packages depending on uuid and
util-linux can use a separate uuid implementation, which the concretizer
does not allow if libuuid is contained in util-linux.
- added several patches
- added some missing dependencies
- remove unneeded dependencies
- add CUDA support
- disable queue support, which was limited, and broken anyway
- move package text that was specific to the package to a comment, so it
does not show up in the environment module
- set conflicts for cuda and compilers
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* OpenMPI: Add version 4.1.0
* OpenMPI: Prefer version 4.0.5.
* OpenMPI: Update links
The download links changed, there is currently a redirection but it might not work forever. The website also switched to https.
Previously compiler-rt didn't correctly pass through CMake
variables for Python when building the various sanitizers.
This patch passes these variables through.
This patch may also apply correctly to any version of LLVM
that uses the newer monorepo-style organization,
and to any older LLVM newer than 7.0.0 as long as the paths are set
appropriately. However, this was not done, because it was not
tested with older LLVM releases.
Fixes #19908
See also: https://bugs.llvm.org/show_bug.cgi?id=48180
This updates the UnifyFS packages to account for the latest v0.9.1
release.
Updates required and optional dependencies for the respective
releases.
Locks margo and mercury dependencies at specific versions while
integration with their latest versions is still in progress.
* PGI compiler has trouble with avx2 SIMD support
(https://github.com/FFTW/fftw3/issues/78)
* Hew to the project's preferred indentation standard.
* Expand '%nvhpc' logic to include '%pgi'.
* Exceeded the max line-length.
* Break up the long compound statement into nested if's.
* Inadvertently picked up an extraneous file.
* PGI compiler has trouble with avx2/avx-512 SIMD support, too.
* Add PGI runtime libs to LDFLAGS when '%pgi' in spec.
* Revert "Add PGI runtime libs to LDFLAGS when '%pgi' in spec."
This reverts commit 31c3ef8ea2.
* Add PGI runtime libs to LDFLAGS when '%pgi' in spec.
GCC looks for included files based on several env vars.
Remove C_INCLUDE_PATH, CPLUS_INCLUDE_PATH, and OBJC_INCLUDE_PATH
from the build environment to ensure it's clean and prevent
accidental clobbering.
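A sketch of the scrub, assuming an EnvironmentModifications-style `env` object as used elsewhere in Spack; the function name is hypothetical:
```python
def clean_compiler_search_paths(env):
    # GCC consults these variables for extra header search paths;
    # drop them so stray system headers cannot leak into the build
    for var in ("C_INCLUDE_PATH", "CPLUS_INCLUDE_PATH", "OBJC_INCLUDE_PATH"):
        env.unset(var)
```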
* Adding support for the CMake flags in LBANN that are missing.
* Added new flag to OpenCV dependency and removed negative variants
since OpenCV no longer turns on everything by default. Removed CMake
flags in LBANN that have been deprecated.
* Removed type='build' flags from dependencies so that they get linked
into an environment's view.
* Removed type='build' flags from dependencies so that they get linked
into an environment's view. Fixed DiHydrogen variant to enable
DistConv feature, renamed to +distconv from +legacy. Added conflicts
line to indicate that DistConv and ROCm don't work with +half
support.
* Fixed Flake8 and cleaned up ordering of variants.
* Flake8
* Backed out changes to not mark and cmake and ninja as build
dependencies, which was introduced to make sure that they appear in
a spack environment.
* Backed out changes to not mark doc related packages as build
dependencies, which was introduced to make sure that they appear
in a spack environment.
* Fixed how recipe communicates the intent to build and run tests to the
package CMake.
This is to make sure that the build system doesn't pick up a library that
would happen to be available.
Co-authored-by: Baptiste Jonglez <git@bitsofnetworks.org>
Environment yaml files should not have default values written to them.
To accomplish this, we change the validator to not add the default values to yaml. We rely on the code to set defaults for all values (and use defaulting getters like dict.get(key, default)).
Includes regression test.
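The defaulting-getter pattern mentioned above amounts to this minimal sketch (keys and defaults are hypothetical):
```python
# Parsed environment YAML with no defaults written back to it
config = {"specs": ["hdf5"]}

# Defaults live in the code and are supplied at read time
view = config.get("view", True)
concretization = config.get("concretization", "separately")
```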
This creates a set of packages which all use the same script to install
components of Intel oneAPI; a sketch of a component package follows the list. This includes:
* An inheritable IntelOneApiPackage which knows how to invoke the
installation script based on which components are requested
* For components which include headers/libraries, an inheritable
IntelOneApiLibraryPackage is provided to locate them
* Individual packages for DAL, DNN, TBB, etc.
* A package for the Intel oneAPI compilers (icx/ifx). This also includes
icc/ifortran but these are not currently detected in this PR
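A hedged sketch of what a component package might look like under this scheme; the class names follow the description above, but the URL and body are illustrative, not the exact Spack code:
```python
from spack import *

class IntelOneapiTbb(IntelOneApiLibraryPackage):
    """Intel oneAPI Threading Building Blocks component."""

    # URL defined directly in the component package, per the list above
    url = "https://example.com/oneapi/tbb-installer.sh"  # hypothetical URL

    @property
    def component_dir(self):
        # Must exist after install; also where vars.sh is searched for
        return "tbb"
```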
I lost my mind a bit after getting the completion stuff working and
decided to get Mypy working for spack as well. This adds a
`.mypy.ini` that checks all of the spack and llnl modules, though
not yet packages, and fixes all of the identified missing types and
type issues for the spack library.
In addition to these changes, this includes:
* rename `spack flake8` to `spack style`
Aliases flake8 to style, and just runs flake8 as before, but with
a warning. The style command runs both `flake8` and `mypy`,
in sequence. Added --no-<tool> options to turn off one or the
other, they are on by default. Fixed two issues caught by the tools.
* stub typing module for python2.x
We don't support typing in Spack for python 2.x. To allow 2.x to
support `import typing` and `from typing import ...` without a
try/except dance to support old versions, this adds a stub module
*just* for python 2.x. Doing it this way means we can only reliably
use all type hints in python3.7+, and .mypy.ini has been updated to
reflect that.
* add non-default black check to spack style
This is a first step to requiring black. It doesn't enforce it by
default, but it will check it if requested. Currently enforcing the
line length of 79 since that's what flake8 requires, but it's a bit odd
for a black formatted project to be quite that narrow. All settings are
in the style command since spack has no pyproject.toml and I don't
want to add one until more discussion happens. Also re-format
`style.py` since it no longer passed the black style check
with the new length.
* use style check in github action
Update the style and docs action to use `spack style`, adding in mypy
and black to the action even if it isn't running black right now.
We have to repeat all the spec attributes in a number of places in
`concretize.lp`, and Spack has a fair number of spec attributes. If we
instead add some rules up front that establish equivalencies like this:
```
node(Package) :- attr("node", Package).
attr("node", Package) :- node(Package).
version(Package, Version) :- attr("version", Package, Version).
attr("version", Package, Version) :- version(Package, Version).
```
We can rewrite most of the repetitive conditions with `attr` and repeat
only for each arity (there are only 3 arities for spec attributes so far)
as opposed to each spec attribute. This makes the logic easier to read
and the rules easier to follow.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
This patch logic resolves a linking issue with ncurses in the mesa
package. This appears to be a recurring problem that was identified in
the mesa gitlab issues here:
https://gitlab.freedesktop.org/mesa/mesa/-/issues/2843
Using `_llvm_method = 'auto'` is broken. This patch replaces that with
`_llvm_method = 'config-tool'`, which is a hack, but makes it possible
to build.
I have commented on the closed issue (2843), referencing the original
author of the bug, and one of the mesa developers, so perhaps they will
fix the problem.
This PR does three related things to try to improve developer tooling quality of life:
1. Adds new options to `.flake8` so it applies the rules of both `.flake8` and `.flake_package` based on paths in the repository.
2. Adds a re-factoring of the `spack flake8` logic into a flake8 plugin so using flake8 directly, or through editor or language server integration, only reports errors that `spack flake8` would.
3. Allows star import of `spack.pkgkit` in packages, since this is now the thing that needs to be imported for completion to work correctly in package files, it's nice to be able to do that.
I'm sorely tempted to sed over the whole repository and put `from spack.pkgkit import *` in every package, but at least being allowed to do it on a per-package basis helps.
As an example of what the result of this is:
```
~/Workspace/Projects/spack/spack develop* ⇣
❯ flake8 --format=pylint ./var/spack/repos/builtin/packages/kripke/package.py
./var/spack/repos/builtin/packages/kripke/package.py:6: [F403] 'from spack.pkgkit import *' used; unable to detect undefined names
./var/spack/repos/builtin/packages/kripke/package.py:25: [E501] line too long (88 > 79 characters)
~/Workspace/Projects/spack/spack refactor-flake8*
1 ❯ flake8 --format=spack ./var/spack/repos/builtin/packages/kripke/package.py
~/Workspace/Projects/spack/spack refactor-flake8*
❯ flake8 ./var/spack/repos/builtin/packages/kripke/package.py
```
* qa/flake8: update .flake8, spack formatter plugin
Adds:
* Modern flake8 settings for per-path/glob error ignores, allows
packages to use the same `.flake8` as the rest of spack
* A spack formatter plugin to flake8 that implements the behavior of
`spack flake8` for direct invocations. Makes integration with
developer tooling nicer, linting with flake8 reports only errors that
`spack flake8` would report. Using pyls and pyls-flake8, or any other
non-format-dependent flake8 integration, now works with spack's rules.
* qa/flake8: allow star import of spack.pkgkit
To get working completion of directives and spack components it's
necessary to import the contents of spack.pkgkit. At the moment doing
this makes flake8 displeased. For now, allow spack.pkgkit and spack
both, next step is to ban spack * and require spack.pkgkit *.
* first cut at refactoring spack flake8
This version still copies all of the files to be checked as before, and
some other things that probably aren't necessary, but it relies on the
spack formatter plugin to implement the ignore logic.
* keep flake8 from rejecting itself
* remove separate packages flake8 config
* fix failures from too many files
I ran into this in the PR converting pkgkit to std. The solution in
that branch does not work in all cases as it turns out, and all the
workarounds I tried to use generated configs to get a single invocation
of flake8 with a filename option to work failed. It's an astonishingly
frustrating config option.
Regardless, this removes all temporary file creation from the command
and relies on the plugin instead. To work around the huge number of
files in spack and still allow the command to control what gets checked,
it scans files in batches of 100. This is a completely arbitrary number
but was chosen to be safely under common line-length limits. One
side-effect of this is that every 100 files the command will produce
output, rather than only at the end, which doesn't seem like a terrible
thing.
* Dependencies of Go will now correctly set the GOPATH for the
appropriate spec to avoid using the user's default path (see the sketch below).
* Bumped version to latest releases (1.15.6 & 1.14.13).
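A hedged sketch of the GOPATH change, assuming Spack's dependent-build-environment hook; the stage-path choice is illustrative:
```python
from spack import *

class Go(Package):
    # ... versions and dependencies elided ...

    def setup_dependent_build_environment(self, env, dependent_spec):
        # Give each dependent its own GOPATH inside its build stage so
        # the user's default ~/go is never consulted
        env.set("GOPATH", dependent_spec.package.stage.path)
```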
Most people installing `clingo` with Spack are going to be doing it to
use the new concretizer, and that requires the `master` branch.
- [x] make `master` the default so we don't have to keep telling people
to install `clingo@master`. We'll update the preferred version when
there's a new release.
Continuing to convert everything in `asp.py` into facts, make the
generation of ground rules for conditional dependencies use facts, and
move the semantics into `concretize.lp`.
This is probably the most complex logic in Spack, as dependencies can be
conditional on anything, and we need conditional ASP rules to accumulate
and map all the dependency conditions to spec attributes.
The logic looks complicated, but essentially it accumulates any
constraints associated with particular conditions into a fact associated
with the condition by id. Then, if *any* condition id's fact is True, we
trigger the dependency.
This simplifies the way `declared_dependency()` works -- the dependency
is now declared regardless of whether it is conditional, and the
conditions are handled by `dependency_condition()` facts.
There are currently no places where we do not want to traverse
dependencies in `spec_clauses()`, so simplify the logic by consolidating
`spec_traverse_clauses()` with `spec_clauses()`.
`version_satisfies/2` and `node_compiler_version_satisfies/3` are
generated but need `#defined` directives to avoid " info: atom does not
occur in any rule head:" warnings.
Since zsh can load bash completion files natively, seems reasonable to just turn this on.
The only changes are to switch from `type -t` which zsh doesn't support to using `type`
with a regex and adding a new arm to the sourcing of the completions to allow it to work
for zsh as well as bash.
Could use more bash/dash/etc testing probably, but everything I've thought to try has
worked so far.
Notes:
* unit-test zsh support, fix issues
Specifically fixed word splitting in completion-test, use a different
method to apply sh emulation to zsh loaded bash completion, and fixed
an incompatibility in regex operator quoting requirements.
* compinit now ignores insecure directories
Completion isn't meant to be enabled in non-interactive environments, so
by default compinit will ask the user if they want to ignore insecure
directories or load them anyway. To pass the spack unit tests in GH
actions, this prompt must be disabled, so ignore explicitly until a
better solution can be found.
* debug functions test also requires bash emulation
COMP_WORDS is a bash-ism that zsh doesn't natively support, turn on
emulation for just that section of tests to allow the comparison to
work. Does not change the behavior of the functions themselves since
they are already pinned to sh emulation elsewhere.
* propagate change to .in file
* fix comment and update script based on .in
This PR addresses a number of issues related to compiler bootstrapping.
Specifically:
1. Collect compilers to be bootstrapped while queueing in installer
Compiler tasks currently have an incomplete list in their task.dependents,
making those packages fail to install as they think they have not all their
dependencies installed. This PR collects the dependents and sets them on
compiler tasks.
2. allow bootstrapped compilers to back off target
Bootstrapped compilers may be built with a compiler that doesn't support
the target used by the rest of the spec. Allow them to build with less
aggressive target optimization settings.
3. Support for target ranges
Backing off the target necessitates computing target ranges, so make Spack
handle those properly. Notably, this adds an intersection method for target
ranges and fixes the way ranges are satisfied and constrained on Spec objects.
This PR also:
- adds testing
- improves concretizer handling of target ranges
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Cray's version of MPICH uses a different versioning system than
MPICH, so it has been split into its own package. It is an
external-only package (always provided by the system, never
installed by Spack).
* Kluge to get the gfortran linker to work correctly on Big Sur.
* Fixed formatting error; setting the other.
* Removed spaces.
* Added comment, mainly to re-trigger Spack CI.
Currently, version range constraints, compiler version range constraints,
and target range constraints are implemented by generating ground rules
from `asp.py`, via `one_of_iff()`. The rules look like this:
```
version_satisfies("python", "2.6:") :- 1 { version("python", "2.4"); ... } 1.
1 { version("python", "2.4"); ... } 1. :- version_satisfies("python", "2.6:").
```
So, `version_satisfies(Package, Constraint)` is true if and only if the
package is assigned a version that satisfies the constraint. We
precompute the set of known versions that satisfy the constraint, and
generate the rule in `SpackSolverSetup`.
We shouldn't need to generate already-ground rules for this. Rather, we
should leave it to the grounder to do the grounding, and generate facts
so that the constraint semantics can be defined in `concretize.lp`.
We can replace rules like the ones above with facts like this:
```
version_satisfies("python", "2.6:", "2.4")
```
And ground them in `concretize.lp` with rules like this:
```
1 { version(Package, Version) : version_satisfies(Package, Constraint, Version) } 1
:- version_satisfies(Package, Constraint).
version_satisfies(Package, Constraint)
:- version(Package, Version), version_satisfies(Package, Constraint, Version).
```
The top rule is the same as before. It makes conditional dependencies and
other places where version constraints are used work properly. Note that
we do not need the cardinality constraint for the second rule -- we
already have rules saying there can be only one version assigned to a
package, so we can just infer from `version/2` `version_satisfies/3`.
This form is also safe for grounding -- If we used the original form we'd
have unsafe variables like `Constraint` and `Package` -- the original
form only really worked when specified as ground to begin with.
- [x] use facts instead of generating rules for package version constraints
- [x] use facts instead of generating rules for compiler version constraints
- [x] use facts instead of generating rules for target range constraints
- [x] remove `one_of_iff()` and `iff()` as they're no longer needed
* ParaView: add new ParaView-5.9.0-RC2 release
Signed-off-by: Vicente Adolfo Bolea Sanchez <vicente.bolea@kitware.com>
* Update var/spack/repos/builtin/packages/paraview/package.py
Indeed, I misunderstood the previous review. This looks good to me too.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
I was keeping the old `clingo` driver code around in case we had to run
using the command line tool instead of through the Python interface.
So far, the command line is faster than running through Python, but I'm
working on fixing that. I found that if I do this:
```python
control = clingo.Control()
control.load("concretize.lp")
control.load("hdf5.lp") # code from spack solve --show asp hdf5
control.load("display.lp")
control.ground([("base", [])])
control.solve(...)
```
It's just as fast as the command line tool. So we can always generate the
code and load it manually if we need to -- we don't need two drivers for
clingo. Given that the python interface is also the only way to get unsat
cores, I think we pretty much have to use it.
So, I'm removing the old command line driver and other unused code. We
can dig it up again from the history if it is needed.
This fixes a logging error observed on macOS 11.0.1 (Big Sur).
When performing a Spack install in debugging mode (e.g.
`spack -d install py-scipy`) Spack is supposed to write a log of
compiler wrapper command line invocations to the current working
directory.
Due to a regression error introduced by #18205, these files were
no longer generated, and Spack was printing errors such as
"No such file or directory: None/." This is because the log file
directory gets set from `spack.main.spack_working_dir`, but that
variable is not set in the spawned process.
This PR ensures that the working directory (at the time of the
"spack install" invocation) is persisted to the subprocess.
Fixed hard tab in flux-sched edit and unbound hwloc in flux-core after
testing to better support modern MPIs in spack environments
Verified that flux-core@0.17 is when hwloc@2: became viable
Track all the variant values mentioned when emitting constraints, validate them
and emit a fact that allows them as possible values.
This modification ensures that open-ended variants (variants accepting any string
or any integer) are projected to the finite set of values that are relevant for this
concretization.
2020.10.0 is the latest stable release, and the preferred version
for general use (when the user does not specify otherwise).
2020.11.0 is a prototype for the memory kinds feature that is also
available when requested.
Other parts of the concretizer code build up lists of things we can't
know without traversing all specs and packages, and they output these
lists at the very end.
The code for this for variant values from spec literals was intertwined
with the code for traversing the input specs. This only covers the input
specs and misses variant values that might come from directives in
packages.
- [x] move ad-hoc value handling code into spec_clauses so we do it in
one place for CLI and packages
- [x] move handling of `variant_possible_value`, etc. into
`concretize.lp`, where we can automatically infer variant existence
more concisely.
- [x] simplify/clarify some of the code for variants in `spec_clauses()`
* [cmd versions] add spack versions --new flag to only fetch new versions
format
[cmd versions] rename --latest to --newest and add --remote-only
[cmd versions] add tests for --remote-only and --new
format
[cmd versions] update shell tab completion
[cmd versions] remove test for --remote-only --new which gives empty output
[cmd versions] final rename
format
* add brillig mock package
* add test for spack versions --new
* [brillig] format
* [versions] increase test coverage
* Update lib/spack/spack/cmd/versions.py
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* Update lib/spack/spack/cmd/versions.py
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Update geant4-data and individual datasets for Geant4 versions 10.6.3
and 10.7.0.
Update geant4 package with new versions 10.6.3 and 10.7.0. Update
dependencies on CLHEP and VecGeom with versions required for Geant4
10.7.
Add GEANT4_INSTALL_PACKAGE_CACHE=OFF to CMake args for 10.6 onwards.
Prevents install of the "package cache" file that contains hard-coded
paths for dependencies, improving relocatability. It relies on Spack
setting CMAKE_PREFIX_PATH correctly in build/use environments that
consume the geant4 package.
`cmake @3.17:` is necessary to handle `cuda @11:` correctly. Earlier versions of `cmake` do not know that `cuda @11:` does not support `compute_30` any more, and list that compute capability as supported. This is handled in `cmake`'s file `Modules/FindCUDA/select_compute_arch.cmake`.
The bowtie2 Makefile uses `prefix`, not `PREFIX`, for versions before v2.4 (see the sketch below).
Credit to @tkameyama
Co-authored-by: george.hartzell <george.hartzell@sana.com>
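A minimal, MakefilePackage-style sketch of handling the rename (the real recipe may differ):
```python
from spack import *

class Bowtie2(MakefilePackage):
    # ... versions elided ...

    @property
    def install_targets(self):
        # The Makefile variable was renamed to PREFIX in v2.4
        var = "PREFIX" if self.spec.satisfies("@2.4:") else "prefix"
        return ["install", "{0}={1}".format(var, self.prefix)]
```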
* allow install of build-deps from cache via --include-build-deps switch
* make clear that --include-build-deps is useful for CI pipeline troubleshooting
fixes #20055
Compilers with custom versions like gcc@foo are not currently
matched to the appropriate targets. This is because the
version in the spec doesn't match the "real" version of the
compiler.
This PR replicates the strategy used in the original
concretizer to deal with that and tries to detect the real
version of compilers if the version in the spec returns no
results.
* bump up version for rocm-3.10.0 release
* bump up version for rocm-3.10.0
* remove duplicate version addition for 3.9.0
* bump up version for rocm-3.10.0 release
* bump up version for rocm-3.10.0 release
* bump up version for rocm-debug-agent and rocm-dbgapi
* bump up version for rocm-bandwidth-test,rocm-gdb,rocprofiler,roctracer for rocm-3.10.0
* add smoke test
* remove whitespaces
* fix minimum version issue
* reorder decorators & replace make with cmake build
* merge cmake build into one line
* reorganize smoke test function
Co-authored-by: Jieyang Chen <chenj3@ornl.gov>
* added dockerfile for opensuse leap 15
* updated maintainer info
* Update share/spack/docker/leap-15.dockerfile
* move copies and symlinks after package install
also use ${SPACK_ROOT} for spack calls as
this works with buildah
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* New package: py-qsymm
* py-qsymm: Convert to using tarballs from PyPI instead of git checkouts
* py-qsymm: add missing dependencies
* Update var/spack/repos/builtin/packages/py-qsymm/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-qsymm: Fix url to use pypi hidden download interface
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* AOCC-2.3.0 is now added to spack
Change-Id: I18fd9606e6fd9a288cc7dc6c6ead11ea17839a7c
* Added flag and version tests for AOCC-2.3.0
* Addressed review comments
Co-authored-by: vkallesh <Vijay-teekinavar.Kallesh@amd.com>
fixes #20040
Matching compilers among nodes has been prioritized
in #20020. Selection of default variants has been
tuned in #20182. With this setup there is no need
to have an ad-hoc rule for external packages. On
the contrary it should be removed to prefer having
default variant values over more external nodes in
the DAG.
refers #20040
Before this PR optimization rules would have selected default
providers at a higher priority than default variants. Here we
swap this priority and we consider variants that are forced by
any means (root spec or spec in depends_on clause) the same as
if they were with a default value.
This prevents the solver from avoiding expected configurations
just because they contain directives like:
depends_on('pkg+foo')
and `+foo` is not the default variant value for pkg.
* OpenBLAS: More Precise GCC Conflicts
Add more precise GCC conflicts so e.g. GCC 6 and GCC 7.5 don't fail.
* Compact syntax
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
As part of pull request #19452, a patch method was added to the mfem
package to delete byte order marks from 3 mfem source files. These
files first appeared in a stable release of mfem as of version
4.1. Consequently, attempts to install mfem 3.4 or mfem 4.0 fail
because no files exist at the path arguments of the filter_file
commands used to execute this operation. Decorating the patch method
so it runs only on mfem versions 4.1 and later resolves the errors
that were thrown due to files not found.
This commit adds that decorator.
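A hedged sketch of the gated patch method using Spack's `@when` multimethod decorator; the file name below is a placeholder for the three affected sources:
```python
from spack import *

class Mfem(Package):
    # ... versions and install elided ...

    @when("@4.1:")
    def patch(self):
        # Strip UTF-8 byte order marks; only versions 4.1+ ship the
        # affected files, so the method is skipped on older releases
        filter_file("^\ufeff", "", "fem/some_affected_source.cpp")
```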
* Qt: add options to disable docs and gui
- Add `~gui` option for minimal build
- Add `+doc` option to install docs, and attempt to disable the implicit
llvm dependency if not
- Removes the 'freetype' option which hasn't worked reliably in qt5, as
many of the gui components implicitly rely on freetype.
- Add and test version 5.15 (and skip qtlocation if disabling opengl)
- Refactor some of the dependency logic
I've tested this on linux with 5.15.2 and 4.8.7 in a couple of different
configurations.
* Address reviewer feedback and correctly disable llvm
* Fix qt doc generation
* py-rosdep: add new package
* setuptools needed at run-time
Co-authored-by: Andrew W Elble <aweits@rit.edu>
Co-authored-by: Andrew W Elble <aweits@rit.edu>
* py-rospkg: add new package
* setuptools needed at run-time
Co-authored-by: Andrew W Elble <aweits@rit.edu>
Co-authored-by: Andrew W Elble <aweits@rit.edu>
* py-catkin-pkg: add new package
* setuptools is needed at run-time
Co-authored-by: Andrew W Elble <aweits@rit.edu>
Co-authored-by: Andrew W Elble <aweits@rit.edu>
fixes #19981
This commit adds support for target ranges in directives,
for instance:
conflicts('+foo', when='target=x86_64:,aarch64:')
If any target in a spec body is not a known target the
following clause will be emitted:
node_target_satisfies(Package, TargetConstraint)
when traversing the spec and a definition of
the clause will then be printed at the end similarly
to what is done for package and compiler versions.
* spack recipe for gromacs with aocc compiler support
Change-Id: I364aab4a0aa2dcd44bc47eb50c81b2d94c99cfbd
* Removed arch and other associated compilers flags
Added cycle_subcounters variant
Co-authored-by: vkallesh <Vijay-teekinavar.Kallesh@amd.com>
fixes #20019
Before this modification having a newer version of a node came
at higher priority in the optimization than having matching
compilers. This could result in unexpected configurations for
packages with conflict directives on compilers of the type:
conflicts('%gcc@X.Y:', when='@:A.B')
where changing the compiler for just that node is preferred to
lower the node version to less than 'A.B'. Now the priority has
been switched so the solver will try to lower the version of the
nodes in question before changing their compiler.
* llvm-amdgpu: fix the build for version 3.9.0
Adapt the fix-system-zlib-ncurses.patch for version 3.9.0. Without
the patch, llvm-amdgpu builds, but then rocm-device-libs fails with
"cannot find -ltinfo."
Tighten the version requirements for cmake according to the
llvm/CMakeLists.txt file.
* Add a conflict for cmake 3.19.0.
refers #20079
Added docstrings to 'concretize' and 'concretized' to
document the format for tests.
Added tests for the activation of test dependencies.
refers #20040
This modification emits rules like:
provides_virtual("netlib-lapack","blas") :- variant_value("netlib-lapack","external-blas","False").
for packages that provide virtual dependencies conditionally,
instead of a single fact that doesn't account for the condition.
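At the package level this corresponds to a `when=` clause on the `provides` directive; roughly (a sketch, see the netlib-lapack package for the real declaration):
```
variant('external-blas', default=False,
        description='Use an external BLAS instead of the vendored one')

# Only provide blas when the vendored BLAS is actually built.
provides('blas', when='~external-blas')
```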
* intel-tbb: patch for arm64 on macOS
as submitted upstream and used in homebrew
* intel-tbb: check patchable versions
* intel-tbb: avoid patch breakage when 2021.1 is released
2021.1-beta05 would be considered newer than 2021.1
* Add the 'exciting' package.
Version 14 (latest available) is defined.
An as-yet-unpublished patch (dfgather.patch) from the developers is also
included.
* fixed flake8 errors (I *thought* I had already gotten them! OOPS!)
* Update var/spack/repos/builtin/packages/exciting/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fixed the install method to just do the install; no build method is needed.
* *Actually* added the lapack dependency!
* removed variant from blas dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fix: leading . is not needed in extension kwarg
* mfem: add support for NVIDIA AmgX
fix: proper spacing
* mfem: use conflict to indicate that AmgX is expected to depend on CUDA
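A sketch of that relationship in directives (argument spellings assumed, not copied from the mfem package):
```
variant('amgx', default=False, description='Enable NVIDIA AmgX support')

depends_on('amgx', when='+amgx')
conflicts('+amgx', when='~cuda', msg='AmgX support requires CUDA')
```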
fixes #19966
Global Arrays has supported GCC 10 since version 5.7.1,
therefore a conflict has been added to prevent old
releases from erroring at build time.
Removed the 'blas' and 'lapack' variants, since
BLAS and LAPACK are always dependencies; if they
are not specified during configure, a version
of these APIs vendored with Global Arrays is
built.
Fixed a few configure options.
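The GCC 10 guard would look roughly like this; the boundary version follows from the text above, while the message is assumed:
```
# Releases before 5.7.1 fail to build with GCC 10 and newer.
conflicts('%gcc@10:', when='@:5.7.0',
          msg='Global Arrays supports GCC 10 starting with 5.7.1')
```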
The point of this variant is to give the end user the option to use
system-installed fabrics such as MOFED instead of upstream fabrics such as
rdma-core. This was found to avoid run-time errors on some systems.
Co-authored-by: nithintsk <nithintsk@github.com>
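One way to express such a choice is a single-valued variant with a conditional dependency; the value names here are assumptions:
```
# Sketch: let the user pick the system fabric stack (e.g. MOFED) over
# the upstream one built by Spack.
variant('fabrics', default='rdma-core', values=('rdma-core', 'mofed'),
        multi=False, description='Fabric stack to build against')

depends_on('rdma-core', when='fabrics=rdma-core')
# fabrics=mofed expects the system-installed MOFED, typically
# registered in packages.yaml as an external package.
```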
This PR fixes two problems with clang/llvm's version detection. clang's
version output looks like this:
```
clang version 11.0.0
Target: x86_64-unknown-linux-gnu
```
This caused clang's version to be misdetected as:
```
clang@11.0.0
Target:
```
This resulted in errors when trying to actually use it as a compiler.
When using `spack external find`, we couldn't determine the compiler
version, resulting in errors like this:
```
==> Warning: "llvm@11.0.0+clang+lld+lldb" has been detected on the system but will not be added to packages.yaml [reason=c compiler not found for llvm@11.0.0+clang+lld+lldb]
```
Changing the regex to only match until the end of the line fixes these
problems.
Fixes: #19473
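A minimal sketch of the fix; the patterns below are illustrative, while the real one lives in Spack's clang compiler detection:
```
import re

output = "clang version 11.0.0\nTarget: x86_64-unknown-linux-gnu\n"

# A character class that doesn't exclude newlines runs past the version:
re.search(r'clang version ([^ )]+)', output).group(1)
# -> '11.0.0\nTarget:'

# Excluding '\n' stops the match at the end of the line:
re.search(r'clang version ([^ )\n]+)', output).group(1)
# -> '11.0.0'
```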
* Updated the cuDNN recipe to generate the proper version names for only
the architecture that you are on. This prevents the concretizer from
selecting a source code version that is incompatible with your current
architecture. Additionally, added constraints to ensure that the
corresponding CUDA version is properly set as well.
* Added maintainer
* Fixed renaming for darwin systems
* Fixed flake8
* Fixed flake8
* Fixed range typo
* Update var/spack/repos/builtin/packages/cudnn/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fixed style issues
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
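A sketch of the approach, with hypothetical version strings and a placeholder checksum: register only the tarballs for the machine you are on, and pin the matching CUDA release per cuDNN version.
```
import platform

from spack import *


class Cudnn(Package):
    # Register versions only for the current architecture so the
    # concretizer cannot pick another machine's tarball.
    if platform.machine() == 'x86_64':
        version('8.0.4.30-11.1', sha256='0' * 64)  # placeholder checksum

    # Each cuDNN build targets one CUDA release (versions hypothetical).
    depends_on('cuda@11.0:11.1', when='@8.0.4.30-11.1')
```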
* seems to have been introduced erroneously by users using gitk-based
workflows. This should be handled by the git package
* fixes build problems on macOS Big Sur
* charmpp: various fixes
- change URLs to https
- address deprecated/renamed versions
- make it build with the cmake build system
* flake8
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)