It can be hard to tell when the concretizer has made decisions that are not the defaults
for packages. This introduces a `spack spec -d` option that will highlight the
nondefault options in red.
You can use this to see:
- When a selected version is not the most preferred;
- When a selected compiler is not the most preferred;
- When a nondefault variant is selected;
- When the chosen target is not the native microarchitecture;
- When a chosen virtual provider is not the most preferred.
With `--reuse`, you'll see clearly which versions and options on reused packages are not
current with the latest `package.py` files. With `--fresh`, you can see which options
are forced by particular package preferences. e.g., `spack spec -d tar` shows `zstd`'s
`+programs` option in red because it's off by default, but `tar` requires it via
`depends_on("zstd+programs")`.
This should be useful for debugging strange/unintuitive concretizations.
- [x] Modify concretizer to return optimization weights
- [x] Add `_weights` metadata to the `Spec` object
- [x] Modify `Spec.format()` to colorize nondefaults when weights are nonzero
- [x] Add tests
- [x] Remove `colorize_spec()`, which uses older, now incorrect colorization logic. It
is older than the current `Spec.format()` implementation and it doesn't work with
`@=`. The only usages in `cmd/info.py` can be replaced with `Spec.cformat()`.
- [x] Remove `_SEPARATORS` and `COLOR_FORMATS`, which are no longer needed.
- [x] Simplify `Spec.format()` code to use more readable color constants directly.
- [x] Remove unused color constants.
- [x] Remove unused parameter for `write_attribute()`
- [x] Remove unused `Spec.colorized()` method
A patch allowing Clingo to build with VS22 has landed both in Spack
and in Clingo upstream; update Spack's bootstrap constraints to handle
this.
Additionally, properly scope the patch application in the clingo
package to account for the upstream patch.
Currently (outside of this PR), when you `spack develop` a path, that path is treated as the staging
directory (meaning, for example, that all build artifacts are placed in the develop path).
This PR creates a separate staging directory for all `spack develop`ed builds. It looks like
```
# the stage root
/the-stage-root-for-all-spack-builds/
spack-stage-<hash>
# Spack packages inheriting CMakePackage put their build artifacts here
spack-build-<hash>/
```
Unlike non-develop builds, there is no `spack-src` directory; `source_path` is the provided `dev_path`.
Instead, separately, in the `dev_path`, we have:
```
/dev/path/for/foo/
build-{arch}-<hash> -> /the-stage-root-for-all-spack-builds/spack-stage-<hash>/
```
The main benefit of this is that build artifacts for out-of-source builds that are relative to
`Stage.path` are easily identified (and you can delete them with `spack clean`).
Other behavior added here:
- [x] A symlink is made from the `dev_path` to the stage directory. This symlink name incorporates
spec details, so that multiple Spack environments that develop the same path will not conflict
with one another
- [x] `spack cd` and `spack location` now accept a `-c` shorthand for `--source-dir`
Spack builds can still change the develop path (in particular to keep track of applied patches),
and for in-source builds, this doesn't change much (although logs would not be written into
the develop path). Packages inheriting from `CMakePackage` should get this benefit
automatically though.
Add current versions of the 17 and 18 releases.
Stop making it nearly impossible to compose this correctly with code built with gcc.
Build for compatibility by default, as we do in every other llvm package.
The `patch()` directive can now be invoked with `reverse=True` to apply a patch in reverse.
This is useful for reverting commits that caused errors in projects, even if only the forward
patch is available, e.g. via a GitHub commit patch URL.
`patch(..., reverse=True)` runs `patch -R` behind the scenes. This is a POSIX option, so we
can expect it to be available in any `patch` implementation.
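A minimal sketch of how a package might use the new flag; the package name, URL, checksum, and version below are hypothetical placeholders, not taken from a real recipe:
```python
class Foo(AutotoolsPackage):
    """Hypothetical package that reverts a problematic upstream commit."""

    # Apply the forward patch of the offending commit in reverse,
    # i.e. run `patch -R` under the hood.
    patch(
        "https://github.com/example/foo/commit/abcdef0.patch?full_index=1",
        sha256="0000000000000000000000000000000000000000000000000000000000000000",
        reverse=True,
        when="@1.2.0",
    )
```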
---------
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Set the FC variable to the MPI Fortran compiler, and also set the F90 variable
to the same compiler for versions 9.8 and up. FC needs to be set because the
configure script still uses FC.
* unmaintained packages: add new versions
* Fix parallel and numactl
* Revert numactl changes
* rollback lua-sol2 version
* Update alluxio version format
* py-psyclone and py-fparser new releases added
* py-fparser: add missing releases for py-psyclone
* py-psyclone: actioned @adamjstewart comments
* py-psyclone: removed py-pytest-pylint
* py-pytest-pylint: added package @ latest version
* py-pytest-pylint: reformatted
* Update var/spack/repos/builtin/packages/py-psyclone/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pytest-pylint: added build deps and runtime dep versions
* py-pytest-pylint: removed version from setuptools
* py-psyclone: add py-pytest-pylint test dep and alphabetize deps
* Update var/spack/repos/builtin/packages/py-pytest-pylint/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-psyclone: deps ordered
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Fixes #43097
Before this PR, when mixins were used together with builders,
the callbacks defined by the class coming later in the MRO were
completely masked.
Here we fix the behavior by accumulating all callbacks,
and de-duplicating them later.
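A generic sketch of the idea (not Spack's actual data structures or attribute names): walk the MRO, accumulate every registered callback, then drop duplicates while preserving order.
```python
def gather_callbacks(cls, attr):
    """Collect callbacks registered anywhere in the MRO, de-duplicated."""
    seen, ordered = set(), []
    for base in cls.__mro__:
        for callback in vars(base).get(attr, []):
            if callback not in seen:  # keep the first occurrence only
                seen.add(callback)
                ordered.append(callback)
    return ordered
```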
* package/qgis: add new version
* improve Qsci.pro
* improve
* fix undefined symbol qsciprinter error
* add import test
* fix bug
* add version 3.36
* [@spackbot] updating style on behalf of Sinan81
* fix long line
* only run import test when +python
* first attempt at stand-alone test
* add TODO
---------
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
Co-authored-by: Sinan81 <Sinan@world>
* proj: correct CMake arg for shared build with proj older than 7.0.0
* Actually use new CMake arg
* Update var/spack/repos/builtin/packages/proj/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This adds Spack packages for these Perl distributions:
- Bio::DB::EUtilities and its dependencies:
- Bio::ASN1::EntrezGene
- Bio::Cluster
- Bio::Variation
* Properly specify Curl builder interface for static vs shared curl
with NMake system to ensure all built curls export expected
symbols.
* Symlinks curl library build artifact to more idiomatic name for
FindCurl.cmake implementations and other NMake consumers.
* Deprecating py-pylint@2.3 as it cannot build with python@3.8:
* Style fix
* Removed versions that cannot build with python@3.7
---------
Co-authored-by: Gava, Francesco <francesco.gava@mclaren.com>
* py_cheap_repr: add initial package.py
* Update var/spack/repos/builtin/packages/py-cheap-repr/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-cheap-repr: use pypi link instead
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Added new versions @1.5.1 and @1.4.1 (sfcgal moved from GitHub to GitLab).
Placed restrictions on which versions of cgal are supported by different
sfcgal versions. These restrictions are based on what was found in the
version history at https://gitlab.com/sfcgal/SFCGAL/-/blob/v1.5.1/NEWS?ref_type=tags
as well as the CMakeLists.txt for different versions.
@1.4 and @1.5 seem to require a specific version of cgal based on CMakeLists.txt.
Earlier versions (@1.3.8:1.3.10) claim to support cgal@4.3: (but the Spack recipe
claims that did not work until @4.7, so sticking with that as the minimum). CMakeLists.txt
suggests they support cgal@5 as well, but the version history suggests otherwise.
* Expanding the duckdb package to fix the version number (required for external extensions to work) being pulled from git, and to add variants for the built-in extensions at build time. This also changes the build system from CMakePackage to MakefilePackage (as advised by upstream).
* - Reorganized and cleaned up variants
- Updated the patch to work for 0.10.0+
- Removed (or made non-default) some unnecessary variants
- Added ninja as the default generator
- Set up some shared library dependencies
* enable tensorflow-2.11 support for ROCm
* add latest sha for mesa and limit the patches to older versions. Similar
changes in #37910 to enable gitlab-ci to pass
* address review comments
Remove dependency on `importlib_metadata` and `pkg_resources`, which can be problematic if the version in PYTHONPATH is incompatible with the interpreter Spack is running under.
* Kokkos Kernels: adding missing TPLs and pre-conditions
Adding variants and dependencies for rocBLAS and rocSPARSE.
Also adding a "when=" clause to the TPL variants that prevents
enabling the TPLs in versions of the library where they were not
yet available.
* Kokkos Kernels: remove comment for better format
* Kokkos Kernels: fix issue with unpacking wrong number of args
After changing the tpls dictionary we need to update the unpacking
logic to catch the right number of outputs out of it!
* Kokkos Kernels: updating doc string for tpls var and using f-string
Improving comment a bit and switching to f-string for more readability.
When setup to do more testing will try to use f-string in the CMake
options generation part of the package.
* Style change
* sst-core now effectively depends on ncurses
* use --with-curses
* sst-core: update comment about ncurses
* should have curses for build, link, and run
Closes #43052.
Maybe moving the argument to the `find` subcommand is a good idea, but I
just wanted to get the docs fix out.
Co-authored-by: Patrice Peterson <patrice.peterson@itz.uni-halle.de>
This PR adds the ability to load spack extensions through `importlib.metadata` entry
points, in addition to the regular configuration variable.
It requires Python 3.8 or greater to be properly supported.
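A rough sketch of what discovering such extensions looks like with `importlib.metadata`; the entry-point group name used here is an assumption for illustration, not necessarily the one Spack registers:
```python
import sys
import importlib.metadata

GROUP = "spack.extensions"  # assumed group name, for illustration only

if sys.version_info >= (3, 10):
    entry_points = importlib.metadata.entry_points(group=GROUP)
else:  # on Python 3.8/3.9, entry_points() returns a dict keyed by group
    entry_points = importlib.metadata.entry_points().get(GROUP, [])

for entry_point in entry_points:
    print(entry_point.name, entry_point.value)
```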
#42878 adds a post install filter of the netCDFConfig.cmake file
that replaces a valid CMake target on Windows with an invalid one.
Don't do this replacement on Windows.
* Updates for migraphx 6.0.0 & 6.0.2
* Style check error and audit check error fix
* Adding patch for half-include-directory
* The parameter GPU_TARGETS is used from 5.7 in migraphx
* Adding rocmlir dependency in migraphx and 6.0 updates in rocmlir
* Applying upcoming changes to make CK JIT optional and enable
compilation on Windows in order to build without ck dependency
* mpfr: missing dependency for version 4.0.1
mpfr 4.0.1 (like 4.0.2) needs autoconf-archive, which is where it takes the
AX_PTHREAD macro from
* autoconf-archive is also required for mpfr@4.0.0
New changes have been made to nvptx-tools that address dropping support
for sm_30 in later CUDA versions (12.0+).
Also refactor gcc to make nvptx-tools a dependency instead of a resource.
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Adds CHI::Driver::Memcached.
Modified perl-chi to make the tests work. Testing modules in perl-chi
were not loaded when testing CHI::Driver::Memcached, so added the "run"
type to these.
* ASP-based solver: improve reusing nodes with gcc-runtime
This PR skips emitting dependency constraints on "gcc-runtime",
for concrete specs that are considered for reuse.
Instead, an appropriate version of gcc-runtime is recomputed
considering also the concrete nodes from reused specs.
This ensures that root nodes in a DAG always have a runtime
at a version greater than or equal to that of their dependencies.
* Add unit-test for view with multiple runtimes
* Select latest version of runtimes in views
* Construct result keeping track of latest
* Keep ordering stable, just in case
1) Support for version @3.0.
Unfortunately, the download seems to require registration now,
so using the manual_download mechanism for @3:
2) Copying the hdf-eos5 patch from @vanderwb to enable
use of Spack compiler wrappers instead of h4cc.
3) Patching an issue in the hdf-eos2 configure script. The
script will test for the jpeg and libz libraries, succeed,
append HAVE_LIBJPEG=1, etc. to confdefs.h, and then abort
because HAVE_LIBJPEG is not set in the running environment.
4) Add some LDFLAGS to the build environment. Otherwise it
seems to fail to build the test script due to the rpc dependence
in HDF4.
Adds Net::Server::SS::PreFork and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Server::Starter
- Net::Server
- HTTP::Server::Simple
Adds Catalyst::Action::RenderView and its dependencies.
Installed OK with build-time tests.
* package/imagemagick add new version, improve
* confirmed that the build fails when libsm is missing on linux
---------
Co-authored-by: Sinan81 <Sinan@world>
* perl-chi and deps: new packages
Adds CHI and its dependencies.
Installed OK with build-time tests.
* Add license
* legion: Add 23.09.0 and 23.12.0, remove control_replication.
The branch control_replication has been merged to master and should no
longer be used.
* flecsi: Switch to Legion master branch.
Legion control_replication has been merged to master.
* Fix Legion 23.09.0 and 23.12.0 build for ROCm 6.
* Add ncvis and opengl option for wxwidgets
* Style fixes for ncvis
* Replace in with satisfies for opengl constraint
Co-authored-by: Alec Scott <alec@bcs.sh>
---------
Co-authored-by: Alec Scott <alec@bcs.sh>
* when built with cmake, libogg does not build a shared library by default. This resolves that
* spack style fixes
* Clean up imports
* enforce +pic when +shared
* Execute `args.help` after setting main options so that extension commands will show with `spack -h`
---------
Co-authored-by: psakievich <psakiev@sandia.gov>
* Update package.py
1. add one compiler type named 'musl'
2. add a variant name 'multilib'
3. add a variant name 'cmodel'
* Added one compiler type named 'musl'.
Added a variant named 'multilib'.
Added a variant named 'cmodel'.
Added several versions.
* aarch64 is not supported.
The deps were added in #40945 to make it work on macOS 11, because the
old configure scripts only detect macOS 10. Apparently people reported that the
autoreconf script caused issues, later fixed in #41057. However, even
with that fix, things are incorrect, because people now report:
```
libtool: You should recreate aclocal.m4 with macros from libtool 2.4.7
libtool: and run autoconf again.
```
However, all this is unnecessary, because the underlying issue was
already fixed long ago; it just regressed at some point, but
it's back in place since #41205.
This adds Readonly::XS. Since this module cannot be used by itself, the
Spack package comes with a test override. This anticipates that the perl
builder will one day have a generic standalone module usage test.
This should fix issue #40780.
We explicitly cast self.spec["python"].command to str in the filter_file
call in _fix_dtrace_shebang to avoid the error
==> Error: TypeError: expected str, bytes or os.PathLike object, not Executable
Not sure why the error is appearing (is it only for specific python versions, etc?),
but the fix should be quite safe.
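The shape of the fix is roughly the following (the regex and file name are illustrative, not the exact ones used in the package):
```python
# Cast the Executable wrapper to a plain string before interpolating it
# into the replacement text handed to filter_file.
python_cmd = str(self.spec["python"].command)
filter_file("^#!.*python.*$", "#!{0}".format(python_cmd), "bin/dtrace")
```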
* e4s: new packages: glvis, laghos
* gl: require: osmesa
* be explicit: glvis ^llvm so that llvm-amdgpu not chosen
* glvis fails on oneapi stack due to issue 42839
Spack merges ranges and concrete versions if they have a non-empty
intersection. That is not enough for adjacent version ranges.
This commit ensures that disjoint ranges in version lists are simplified
if their union is not disjoint:
```python
"@1.0:2.0,2.1,2.2:3,4:6" # simplifies to "@1.0:6"
```
Libraries for openexr are named libOpenEXR*.so, etc., so the default libs
handler in spec does not find them.
Add a custom libs property to address this.
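A sketch of what such a property can look like in the openexr package (the exact glob and shared/static handling in the real change may differ):
```python
@property
def libs(self):
    # The default handler derives library names from the package name and
    # misses the capitalized "libOpenEXR*" files, so search for them explicitly.
    return find_libraries("libOpenEXR*", root=self.prefix, recursive=True)
```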
Partial fix for #42273
Co-authored-by: payerle <payerle@users.noreply.github.com>
For various reasons, the pmix dependency of 5.0.2 had to be advanced to at least
pmix 4.2.4; 5.0.1 and 5.0.0 can also build with pmix 4.2.4 or newer.
Related to #42651
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Refactoring `SpackSolverSetup` is a bit easier with type annotations, so I started
adding some. This adds annotations for the (many) instance variables on
`SpackSolverSetup` as well as a few other places.
This also refactors `condition()` to reduce redundancy and to allow
`_get_condition_id()` to be called independently of the larger condition
function.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* py-ipyvuetify: new package
* Limit py-jupyter-packing version to 0.7.x
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix py-jupyterlab version and type
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix py-ipyvue version range to exclude 2
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* rm py-wheel, already considered for PythonPackage
* fix: pynpm only required for build, reorder dependencies as in the pyproject.toml
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* snakemake: add Snakemake 8 with dependencies
* snakemake: add missing description
* Whitespace
* Whitespace
* Whitespace
* Whitespace
* py-conda-inject: add constraint for Python
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-executor-plugin-azure-batch: add constraint for Python
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-executor-plugin-cluster-generic: add constraint for Python
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake: add upper bound for Python
* py-snakemake-executor-plugin-drmaa: specify dependency type
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-executor-plugin-googlebatch: correct dependency version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-executor-plugin-tes: correct dependency version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-storage-plugin-s3: reorder
* snakemake: remove newly added variants
* snakemake: remove newly added variants
* snakemake: remove newly added variant
* snakemake: update version
* snakemake: update version
* snakemake: whitespace
* py-snakemake-storage-plugin-s3: update version
* snakemake: use newer version
* snakemake: whitespace
* snakemake: update interfaces
* py-snakemake-storage-plugin-gcs: link issue
* snakemake: update versions
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* package/libspatialite: add conflict, new version
* depends on new version of freexl
* fix bug
* remove manual download stuff
* improve style
* first deprecate
* [@spackbot] updating style on behalf of Sinan81
* get rid of conflict, reorder deps
* remove manual download
---------
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
* Allow awscli-v2 to be installed without examples/ dir
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
* Update var/spack/repos/builtin/packages/awscli-v2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The CMake-based build is anticipated to work in all cases where the
Autotools-based build did, and to address all prior issues with less
maintenance of the package. In detail:
* Fixes #42735 (CMake's find_package helps with linking to proper
netcdf-c)
* Replaces older Autotools-based build
* All preexisting variants are handled
* Record hdf5 as an explicit dependency (was missing before)
* Add +tests option
Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
* proj: apply stdint.h patch in version 8
* Update var/spack/repos/builtin/packages/proj/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Some builds on Windows break when encountering paths with spaces. This
reencodes some paths in Windows 8.3 filename format (when on Windows):
this serves as an equivalent identifier for the file, but in a form that
does not have spaces.
8.3 filenames are also truncated in length, which could be helpful, but
that is not the primary intended purpose of using this format.
Overall
* nmake/msbuild packages do this generally for the install prefix
* curl/perl require additional modifications (as written now, each package
may require calls to `windows_sfn` to work when the Spack
root/install/staging prefixes contain spaces)
Some items for follow-up:
* Spack itself does not create paths with spaces "on top" of whatever
the user configures or where it is placed (e.g. the Spack root, the
staging directory, etc.), so it might be possible to edit some of these
paths once and avoid a proliferation of individual `windows_sfn`
calls in individual packages.
* This approach may result in the insertion of 8.3-style paths into
build artifacts (on Windows), handling this may require additional
bookkeeping (e.g. when relocating).
* Move spec_list into its own file, instead of __init__.py
* Remove spack.schema.spack
This module was introduced in #33960. It's almost an exact duplicate of
spack.schema.env, and is not used anywhere.
* Fix typo
* adds the spack recipe for py-jwcrypto
* split long line to fix E501
* Specify versions for py-cryptography and py-typing-extensions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* adds the spack recipe for reacton python package
* Fix versions for ipywidgets and typing-extensions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-find-libpython: new package
* Update var/spack/repos/builtin/packages/py-find-libpython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add spack recipe for ipyvue
* Specify version for ipywidgets
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
fix bug.
* Update var/spack/repos/builtin/packages/pdal/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-ansible: add v2.16.3
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add specific python version requirements from setup.cfg
* Add additional ranges for py-setuptools
* Update var/spack/repos/builtin/packages/py-ansible/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Compiling version 2.2.4 fails (on a Debian system with only a minimum set of packages installed) with an error because `INT_MAX` is undeclared:
```
gd_gd2.c: In function '_gd2GetHeader':
gd_gd2.c:212:54: error: 'INT_MAX' undeclared (first use in this function)
  212 |   if (*ncx <= 0 || *ncy <= 0 || *ncx > INT_MAX / *ncy) {
      |                                         ^~~~~~~
gd_gd2.c:87:1: note: 'INT_MAX' is defined in header '<limits.h>'; did you forget to '#include <limits.h>'?
```
* pythia6: deal with dead pythiasix.hepforge.org
* pythia6: rm main81.f from CMakeLists.txt
* [@spackbot] updating style on behalf of wdconinc
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Any package `X` used as `depends_on("x", type="build")` will have
`X.setup_run_environment(env)` called, because it has to be able to
"run" in the build environment.
So there is no point in calling `setup_run_environment` from
`setup_dependent_build_environment`.
It is also redundant to call `setup_run_environment` in
`setup_dependent_run_environment`, because (a) the latter is called _for
every parent edge_ instead of once per node, and (b) it's only called
after `setup_run_environment` is called anyway. Better to call
`setup_run_environment` once and only once.
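In package terms, the kind of redundancy being removed looks like this sketch:
```python
def setup_dependent_build_environment(self, env, dependent_spec):
    # Redundant: for build-type dependencies Spack already applies this
    # package's setup_run_environment() to the build environment.
    self.setup_run_environment(env)

def setup_dependent_run_environment(self, env, dependent_spec):
    # Redundant and repeated per parent edge: setup_run_environment() has
    # already been called once for this node.
    self.setup_run_environment(env)
```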
The fftw object was originally created with spec["fftw:openmp"], but
referencing spec["fftw"] overwrites the 'last_query' in the spec object,
so later use of fftw.libs was not returning FFTW OpenMP libs.
Also allow the post-install fixup to support amdfftw as well as fftw.
Co-authored-by: Branden Moore <branden.moore@amd.com>
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
* new builtin package: py-biobb-model
* Update var/spack/repos/builtin/packages/py-biobb-model/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add package py-jacobi
* fix: add description
* fix: add description
* fix: add description
* [@spackbot] updating style on behalf of jonas-eschle
* Update package.py
* Update package.py
* Update var/spack/repos/builtin/packages/py-jacobi/package.py
I don't think that numpy is used in "build"? But not important
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-google-cloud-storage: add new versions
* py-google-api-core: add new versions
* py-proto-plus: add new package
* py-google-api-core: add grpc variant
* py-google-api-core: add grpc variant
* py-google-api-core: add missing prefix
* py-google-cloud-batch: add new package
* py-google-cloud-logging: add new package
* py-google-cloud-appengine-logging: add new package
* py-google-cloud-audit-log: add new package
* py-grpc-google-iam-v1: add new package
* py-proto-plus: remove obvious dependency
* Whitespace
* Whitespace
* py-google-cloud-audit-log: correct conflict
* py-proto-plus: correct dependency type
* Whitespace
* py-google-auth: add new version
* py-google-resumable-media: add new version
* py-google-cloud-storage: constrain version of dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-grpcio-status: use newer version
* py-google-resumable-media: add upper bound of dependency
* Add types of dependencies.
* py-grpcio: add new version
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-smote-variants: Added package py-smote-variants
Also added py-minisom and py-metric-learn as dependencies
* py-metric-learn: Added build dependency on setuptools
* py-smote-variants: Added a dependency on py-pytest-runner
As well as a comment about why statistics isn't included
* [@spackbot] updating style on behalf of alex391
---------
Co-authored-by: Alex C Leute <aclrc@rit.edu>
* py-charm4py: needs Cython<3.0
* Update var/spack/repos/builtin/packages/py-charm4py/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Kokkos Ecosystem: update for release 4.2.01
Will rebase this on top of develop once Kokkos Core PR merges.
* Kokkos Ecosystem: update license statement to reflect current license
* helics: Add version 3.5.0
* helics: define CMAKE_CXX_STANDARD=20 when GCC>=13 is used to compile
---------
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Ryan Mast <mast9@llnl.gov>
* Add support for clang in oneapi packages with OpenMP
* Add fallback search for libomp in OneApi package with OpenMP threading
* Add requires for the compiler when using threads=openmp in intel-oneapi-mkl
* Cosmetic changes to messages in oneapi.py
* Update error message in oneapi.py
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
* Update another error message in oneapi.py
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
* Inline helper error function in oneapi.py
* Update one more error message in oneapi.py
* Wrap long line in oneapi.py
---------
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
* allow packages to request that no submodules be updated when self.submodules is a
callable function (see the sketch below)
* Extend the test added in "Allow more fine-grained control over what submodules are
updated: part 2" (#27293) to include this case
* Update the type signature for the submodules arg of version() in directives.py
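A sketch of the new behavior (package and submodule names are hypothetical): a callable passed as `submodules` may now return an empty list to request that no submodules be updated at all.
```python
def submodules(package):
    if package.spec.satisfies("+extras"):
        return ["extern/extra-data"]
    return []  # request that no submodules be updated


class MyPackage(Package):
    """Hypothetical package using a callable submodules value."""

    git = "https://github.com/example/mypackage.git"

    version("main", branch="main", submodules=submodules)
```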
---------
Co-authored-by: tjfulle <tjfulle@users.noreply.github.com>
* cmake: Enable CMAKE_EXPORT_COMPILE_COMMANDS
Enabling this option causes CMake to generate a compile_commands.json file
containing a compilation database that can be used to drive third-party tools.
CMAKE_EXPORT_COMPILE_COMMANDS only exists for CMake >= 3.5
Exporting compilation databases is only supported for Makefile and Ninja
generators, so check these conditions as well.
CMAKE_EXPORT_COMPILE_COMMANDS is only enabled in supported configurations
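A rough sketch of the guard described above (not the exact builder code):
```python
def maybe_export_compile_commands(pkg, generator, cmake_args):
    """Append the flag only when both the CMake version and the generator support it."""
    if generator in ("Ninja", "Unix Makefiles") and pkg.spec.satisfies("^cmake@3.5:"):
        cmake_args.append("-DCMAKE_EXPORT_COMPILE_COMMANDS:BOOL=ON")
```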
* khmer: new package @2.1.1
* rationalising patch
* adding dep, modifying patch
* Update var/spack/repos/builtin/packages/khmer/package.py
Indeed!
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* possvm: new package @1.2
* black!
* appeasing flake8
* Updating commit ID
* Adding graphing dep
* Update var/spack/repos/builtin/packages/py-markov-clustering/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pysftp: add new package
* py-pysftp: correct version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Adds the spack recipe for building the pynpm python package
* fix license header
* Update var/spack/repos/builtin/packages/py-pynpm/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-fastremap: new package @1.14.1
* py-pyqtgraph: new package @0.13.3
* py-roifile: new package @2024.1.10
* py-superqt: new package @0.6.1
* cellpose: new package @2.2.3
* Appeasing black ...
* Dropping the cuda variant
* Update var/spack/repos/builtin/packages/py-fastremap/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Adding python version dependency
* Update var/spack/repos/builtin/packages/py-pyqtgraph/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pyqtgraph/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-roifile/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-superqt/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Adding missing dep & conflict
* Appeasing black
* Adding missing py- prefix for dep
* Switching over to py-pyqt6
* Switching over to py-pyqt6
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* new builtin package: py-biobb-structure-checking
* Update var/spack/repos/builtin/packages/py-biobb-structure-checking/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* dependencies are also build dependencies
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* new builtin package: py-biobb-io
* Update var/spack/repos/builtin/packages/py-biobb-io/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* new builtin package: py-biobb-gromacs
* Update var/spack/repos/builtin/packages/py-biobb-gromacs/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* added setuptools dependency
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Bump up the version for rocm-6.0.2 release
* extend the patches that were created for apps for rocm-6.0.0 and rocm-6.0.2 releases
(but apply hipfft patch for only 6.0.0)
* gitlab: remove requests for unreferenced packages
The packages removed in this commit are not built by any of
our current GitLab CI stacks.
* gitlab: update memory requests for "huge" packages
* gitlab: reduce memory requests for overprovisioned packages
* gitlab: more memory for py-torch (again)
* gitlab: update memory but keep CPU the same
This fixes bugs, performance issues, and removes no longer necessary code.
Short version:
1. Creating views from Python extensions would error if the Spack `opt` dir itself was in some symlinked directory. Use of `realpath` would expand those, and keying into `merge_map` would fail.
2. Creating views from Python extensions (and Python itself, potentially) could fail if the `bin/` dir contains symlinks pointing outside the package prefix -- Spack keyed into `merge_map[target_of_symlink]` incorrectly.
3. In the `python` package the `remove_files_from_view` function was broken after a breaking API change two years ago (#24355). However, the entire function body was redundant anyways, so solved it by removing it.
4. Notions of "global view" (i.e. python extensions being linked into Python's own prefix instead of into a view) are completely outdated, and removed. It used to be supported but was removed years ago.
5. Views for Python extension would _always_ copy non-symlinks in `./bin/*`, which is a big mistake, since all we care about is rewriting shebangs of scripts; we don't want to copy binaries. Now we first check if the file is executable, and then read two bytes to check if it has a shebang, and only if so, copy the entire file and patch up shebangs.
The bug fixes for (1) and (2) basically consist of getting rid of `realpath` entirely, and instead simply keep track of file identifiers of files that are copied/modified in the view. Only after patching up regular files do we iterate over symlinks and check if they target one of those. If so, retarget it to the modified file in the view.
--with-ch4-max-vcis=default is no longer accepted by MPICH configure
since the 4.1 release. Just omit the option from the +vci variant, since
configure will select the default value in its absence.
Problem: Older versions of the flux-core package require czmq and
jsonschema, but these dependencies have been dropped in recent
versions.
Add `when=` arguments to drop these requirements for the appropriate
versions of flux-core.
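Inside the flux-core package this amounts to directives along these lines (the version cutoff shown here is illustrative, not the exact one):
```python
# Only older flux-core versions still require these packages.
depends_on("czmq", when="@:0.55", type=("build", "link"))
depends_on("py-jsonschema", when="@:0.55", type=("build", "run"))
```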
* fix wrong versioning
use doc version and not the one extrapolated from the path (i.e. 0.2.0.1)
* nvpl-lapack requires nvpl-blas
propagate matching variants to nvpl-blas dependency
Co-authored-by: albestro <albestro@users.noreply.github.com>
* rct: update packages (RE, RG, RP, RS, RU) with new versions
* rct: updated style
* rct: 1.46 release
* rct: RP 1.46.1 (hotfix applied)
* rct: deprecated old versions
* Update var/spack/repos/builtin/packages/py-radical-entk/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-radical-pilot/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-radical-saga/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* rct: brought back dependencies for deprecated versions
* rct: RP hotfix release 1.46.2
* rct: new stack release 1.47.0
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-ucsf-pyem: updating to commit e92cd4d
* py-pyfftw: updating to @0.13:1
* py-ucsf-pyem: correcting install of scripts
* Upper limits
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Upper limits
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* updates from github review
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add new package libsixel
* reorder to group variants and dependencies together
* Switch to using source from fork
The original developer of libsixel, Hayaki Saito (@saitoha), disappeared
in early 2020 and has not updated the repository or been seen since.
However, a fork of the project has been created at libsixel/libsixel,
and that fork has been getting much more regular updates. In this commit
I switch the package to using the better-maintained, now apparently
canonical fork of the project.
That project switched the build system from autotools to Meson, so much
of the build arguments code needed to be rewritten. The newest official
release also contained an error in meson.build which would break on
newer versions of Meson. That error only applied to the libjpeg and
libpng variants, though, so I switched the default to use libgd which
has broader format support anyway.
* blacken
* broke description into multiple lines
* allow libjpeg, etc to be loaded automatically
* apply patch to all compilers as per spack/issues/42007
* [@spackbot] updating style on behalf of Chrismarsh
* Resolve flake8 style issue from spackbot
* fix flake8 trailing whitespace
---------
Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
These 7 hooks were not used.
- Six of them, related to install phases, were unused after `spack monitor`
was removed, and the code seems to have bit-rotted, as there
were reports they were not (always?) triggered when they should be.
- The post environment one was made redundant after `spack install` for
environments started following the common code path for generating
module files in #42147.
It should not be a breaking change to remove these, since users cannot define
hooks in extensions; they would have to fork Spack.
If we ever _were_ to make those hooks extendable outside of core Spack,
it would also be better to start with fewer rather than more, because
everything you expose gets relied upon...
Removing those also allows us to rethink what hooks we really need, and
in particular it seems like we need a hook that runs post install also when
the spec is inserted into the database.
The lack of a rule to avoid enforcing requirements on multi-valued variants, when the condition activating the requirement was not met, resulted in multiple optimal solutions. The fix is to prevent imposing a requirement if the when= rule activating it is not met.
The section was highly outdated as it referred to old defaults, and
failed to mention `hide_implicits: true`.
This commit restructures it, moves some deeply nested sections a level
up, and promotes `hide_implicits: true` + `autoload: direct` before
talking about `exclude`.
It is useful to enable/disable stacks in order to handle turning
specific stacks on/off based on runner availability, stack stability,
testing requirements, etc.
The disabled stack list takes precedence over the enable stack list. The
assumption is that stacks that are disabled are so due to some
functionality missing or broken for that stack.
The enable stack list implicitly disables all stacks not listed in the
enable list.
* Registry queries can fail due to simultaneous access from other
processes. Wrap some of these accesses and retry when this may
be the cause (the generated exceptions don't allow pinpointing
this as the reason, but we add logic to identify cases which
are definitely unrecoverable, and retry if it is not one of
these).
* Make recursion optional for most registry search functions;
disable recursive search in a case where it was originally always
recursive.
* py-pythran: apply patch to fix compilation with GCC 13
* Update var/spack/repos/builtin/packages/py-pythran/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fix ipyrad numpy dependencies
ipyrad versions through 0.9.90 use np.int, which is deprecated in numpy 1.20.
* fix whitespace
* Correct numpy dependency restrictions
Fix two separate problems:
1. We want to always visit parents before children while creating views
(when it comes to ignoring conflicts, the first instance generated in
the view is chosen, and we want the parent instance to have precedence).
Our preorder traversal does not guarantee that, but our topological-order traversal does.
2. For copy style views with packages x depending on y, where
<x-prefix>/foo is a symlink to <y-prefix>/foo, we want to guarantee
that:
* A conflict is not registered
* <y-prefix>/foo is chosen (otherwise, the "foo" symlink would become
self-referential if relocated relative to the view root)
Note that
* This is an exception to [1] (in this case the dependency instance
overrides the dependent)
* Prior to this change, if "foo" was ignored as a conflict, it was
possible to create this self-referential symlink
Add tests for each of these cases
* Sqlite requires the user to provide a command line arg (DYNAMIC_SHELL)
to export shared symbols to import lib from .def
* Add other options recommended by Sqlite docs:
https://github.com/sqlite/sqlite/blob/master/doc/compile-for-windows.md
* Some of these options mean we can restore variants that were
disabled for Windows (fts, functions, rtree).
* Updated with generic test functions
* Added level_zero test
As the compiler is not installed by TAU, it should be loaded first and run with --dirty. Will check if there is a way to make it work without --dirty.
* Update var/spack/repos/builtin/packages/tau/package.py
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
---------
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
The main script body is over-written for power. Putting the timing
aggregation in the after script allows it to be called on all of the
current pipelines.
* Update package.py
Add support for OpenSubdiv 3.5.x
* Fixed Dependencies
I was getting errors about cmake not being able to find `xf86vm`, and adding a dependency on `libxxf86vm` fixes it.
Sometimes the logs are too long and the copy & paste command is not
shown. In that case I'd like to just copy the failing GitLab job URL from
my browser and pass it to `spack reproduce-build <url>`.
Currently, the `SpackSolverSetup` and the `PyclingoDriver` are more coupled than necessary:
1. The driver object needs a setup object to be injected during a solve,
2. And the setup object will get a reference back to the driver
This design is necessary because we use the low-level `clingo.backend` interface to setup our problem. This interface though is meant to bypass the grounder and add symbols directly in the grounded table, which is a feature we don't currently use.
The PR simplifies the encoding by having the setup object returning the problem-specific facts / rules as a list of strings, and the driver ingesting them using the [clingo.Control.add](https://potassco.org/clingo/python-api/5.6/clingo/control.html#clingo.control.Control.add) method. This removes any use of the low level interface.
Using this encoding makes it easy to hash the output of the setup phase, since it is returned as a string.
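A sketch of the new flow using only the documented high-level clingo API (function and variable names are illustrative):
```python
from clingo import Control

def solve_program(asp_lines):
    """Ingest the setup phase's output as plain ASP text and solve it."""
    ctl = Control()
    ctl.add("base", [], "\n".join(asp_lines))  # no low-level backend involved
    ctl.ground([("base", [])])
    return ctl.solve()
```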
* py-nbconvert: avoid install-time downloads
* install css files as resources for py-nbconvert
* Update var/spack/repos/builtin/packages/py-nbconvert/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* py-nbconvert: update dependencies for 7.14.1
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
* Update var/spack/repos/builtin/packages/py-nbconvert/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update resources & remove style.min.css file
* Update package.py
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This "breaks" the deprecated schema by allowing unknown attributes
to the attributes section of the job types. The breaking change here is
that deprecated stacks will no longer ignore attributes that are unknown
but rather assume the new CI schema behavior of injecting them into the
generated CI configuration. This change is required to secure
authentication in Spack CI.
Improve naming, so it's clear that file "extensions" are not taken in the
`PurePath(path).suffix` sense as the original function name suggests,
but rather that the files are opened and their magic bytes are
classified.
Add type hints.
Fix a bug where `stream.read(num_bytes)` was run on the compressed
stream instead of the uncompressed stream, which can potentially break
detection of tar.bz2 files.
Ensure that when peeking into streams for magic bytes, they are reset to
their original position upon return.
Use new API in `spack logs`.
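The reset behavior described above amounts to a helper of this shape (a sketch, not the exact code):
```python
import io

def peek(stream: io.BufferedIOBase, num_bytes: int) -> bytes:
    """Read magic bytes, then restore the stream to its original position."""
    offset = stream.tell()
    try:
        return stream.read(num_bytes)
    finally:
        stream.seek(offset)
```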
Relocation of `PT_INTERP` in ELF files already happens to work from long to short path, thanks to generic binary relocation (i.e. find and replace). This PR improves it:
1. Adds logic to grow `PT_INTERP` strings through patchelf (which is only useful if the interpreter and rpath paths are the _only_ paths in the binary that need to be relocated)
2. Makes shrinking `PT_INTERP` cleaner. Before this PR when you would use Spack-built glibc as link dep, and relocate
executables using its dynamic linker, you'd end up with
```
$ file exe
exe: ELF 64-bit LSB pie executable, ..., interpreter /////////////////////////////////////////////////path/to/glibc/lib/ld-linux.so
```
With this PR you get something sensible:
```
$ file exe
exe: ELF 64-bit LSB pie executable, ..., interpreter /path/to/glibc/lib/ld-linux.so
```
When Spack cannot modify the interpreter or rpath strings in-place, it errors out without modifying the file, and leaves both tasks to patchelf instead.
Also add type hints to `elf.py`.
Certain versions of ifx (the majority of those available) have an issue
where they are not compatible with TMP directories containing dot characters.
This precludes their use with CMake.
Remap TMP to point to the stage directory rather than whatever the TMP
default is
* Add fenics development version and ufl-legacy
* Make sure python and setuptools are added to ufl-legacy and add version 2022.3.0 as well
* Run black and add maintainer to fenics package
* Fix typo
* Use Fiat version 2019.1.0
* Run black
* Add back master branch of fiat
* Remove master from the list of dolfin versions and add one extra line of each deps instead
* Run black
* Do not specify python version in ufl-legacy
* Remove python dependency from ufl-legacy
* Remove python dependency from ffc
* Add special case for master in ffc
* Run black
* Remove master from loop in ffc
* Run black again
* geopm: Mark all as deprecated
- This recipe will be removed in a future release.
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* Add py-sphinx-emoji
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* Add py-dasbus
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* py-pygobject: Add v3.46.0
- Previous versions error during build phase.
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* py-sphinx-tabs: Add new versions
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* Add geopm-service
- Previous geopm package is now 2 packages:
geopm-service and geopm-runtime.
- The GEOPM service is designed as a systemd/dbus
service providing a userspace interface to
privileged hardware telemetry and configurations.
- Installing via spack will enable some userspace
testing, but generally most users will want to
install the GEOPM service via the system package
manager as root to get full functionality.
- This recipe will enable the creation of the fully
userspace geopm-runtime recipe which will replace
the old geopm recipe.
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
---------
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* py-textual: New package py-textual
* py-textual: Depend on py-mdit-py-plugins
* py-textual: Added dependency on python@3.8:3
* py-textual: Added a comment about why there is a dependency on
py-mdit-py-plugins
* py-textual: Ran black
---------
Co-authored-by: Alex C Leute <aclrc@rit.edu>
* add pytest-aiohttp
* black
* py-setuptools -> py-setuptools-scm
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
VTK struggles to consume some Spack-derived packages on Windows:
patch VTK to allow a smoother integration.
Also add an install for the examples, as they are not part of the install
interface.
gl2ps tries to build static and shared libs simultaneously with
the same target name on the generator side. This causes a name
clash issue for Ninja on Windows (where the extension is .lib
in both cases).
Add a variant on Windows to force building only one of shared
or static, and patch the CMake build to enable use of this
variant.
* gitlab: remove commented-out duplicate entries
* gitlab: reclassify some packages from "huge" to "large"
Our observed max memory usage for these packages is as follows:
hipblas: 7.7G
qt: 6.6G
visit: 9.7G
All of these should fit within a "large" request (currently 12G).
* gitlab: remove pango from list of huge packages
This package is not currently built by any of our CI stacks.
* gitlab: update requests for high memory packages
Refine resource requests for memory-intensive packages based on
max memory usage data.
- Adds perl-sql-translator and its missing deps:
- Adds perl-import-into
- Adds perl-package-variant
- Adds perl-strictures
Built with build-time tests and comes with a simple run-time test.
Add the empty deptype `spack.deptypes.NONE`.
Test that `traverse_nodes(deptype=spack.deptypes.NONE)` does not
traverse dependencies, only de-duplicates.
Use the construct in environment views that otherwise would branch on
whether deps are enabled or not.
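For example, the tested behavior is roughly the following (assuming the usual `spack.traverse` entry point):
```python
import spack.deptypes as dt
import spack.traverse as traverse

def unique_nodes(specs):
    """De-duplicate the given specs without following any dependency edges."""
    return list(traverse.traverse_nodes(specs, deptype=dt.NONE))
```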
* ensure umpire~cuda~rocm when ~cuda~rocm
* MDAnalysis: add v2.7.0
* Apply suggestions from code review
* license
* review changes
* add gsd and py-cmake versions
* Update package.py
* py-cmake
* Update package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fix and reorder
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add py-vl-convert-python
* code format
* Update package.py
add homepage
* Update var/spack/repos/builtin/packages/py-vl-convert-python/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* remove old versions, add todo
* Update var/spack/repos/builtin/packages/py-vl-convert-python/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* remove blank line
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* New package: py-metrics
* [py-metrics] cleaned up extra comment
* Updated copyright and ran black
* py-pathspec: Added version 0.5.5
---------
Co-authored-by: vehrc <vehrc@rit.edu>
Co-authored-by: Jen Herting <jen@herting.cc>
Co-authored-by: Alex C Leute <aclrc@rit.edu>
* py-frozendict: patch up for Python 3.11
See also Marco-Sulla/python-frozendict#68, rely on a pure Python
implementation when 3.11+ is used.
* mention related Github issue
* new builtin package: biobb-common
* removed whitespace for the style test
* replaced ' by " for the style test
* changed import line for the style test
* Update var/spack/repos/builtin/packages/biobb-common/package.py
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* Update var/spack/repos/builtin/packages/biobb-common/package.py
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* Update var/spack/repos/builtin/packages/biobb-common/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* removed test
* using pypi instead of git
I hope this is declared the right way.
I tried to test it locally but I am having problems with a dependency (openblas), so I cannot even check whether this works.
* upgraded version
* Added py prefix
* removed biopython version restriction
* directory with new py-prefixed name
* Delete old non-prefixed package
* Update var/spack/repos/builtin/packages/py-biobb-common/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* update tarball urls, add new versions and update maintainers
* remove unnecessary variant like cross-compile
* use self.define and self.define_from_variant
* improve regex / bug that matches with -DMPICH_SKIP_MPICXX=1
Co-authored-by: Matthias Wolf <matthias.wolf@epfl.ch>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Previously, for abstract specs like:
```
foo ^[virtuals=a] bar ^[virtuals=b] bar
```
the second requirement was silently discarded on concretization. Now they're merged, and the abstract spec is equivalent to:
```
foo ^[virtuals=a,b] bar
```
* Added countdown repo
* Added fixed version of COUNTDOWN
* Style fixes
* Changed maintainer syntax
* Variants listed in alphabetical order by name
* Added conflicts and some reordering
* Fixed conflicts syntax
* Style fixes
* [@spackbot] updating style on behalf of danielecesarini
---------
Co-authored-by: danielecesarini <danielecesarini@users.noreply.github.com>
CMake may write and read from `~/.cmake` through `export(...)` and read `find_package(...)` respectively. We don't want this as it may influence the build in a non-deterministic way, so disable it for all versions of `cmake`.
* Trilinos: Add AMD to SuiteSparse TPL list
When Trilinos is built with SuiteSparse support, it should enable AMD as
one of the TPLs. It was previously enabling only UMFPACK. The Xyce
package uses AMD (but not UMFPACK).
* Xyce: Add 7.8 release and various improvements
In addition to adding the latest Xyce release (7.8), all the earlier
releases were deprecated, with the exception of 7.7.
The Trilinos specification was updated to remove unneeded packages,
explicitly enable all needed packages, and specify additional
do-not-build packages.
The serial build is now the default, with MPI still being an option.
I also expanded the explanation for one of the patches; and, finally, I
took the opportunity to update the Xyce description to better match the
current Xyce README description.
Fixes a bug where Spack did not generate module files of non-roots during
spack install with an active environment.
The reason is that Environment.new_installs only contained roots.
This PR:
Drops special casing of automatic module generation in post-install hooks
When `use_view`, compute environment variable modifications like always, and
applies a view projection to them afterwards, like we do for spack env activate.
This ensures we don't have to delay module generation until after the view is
created.
Fixes another bug in use_view where prefixes of dependencies would not be
projected -- previously Spack would only temporarily set the current spec's prefix.
Removes the one and only use of the post_env_write hook (but doesn't drop it to
make this backportable w/o changes)
Previously `std_args` was called on non-roots in a build context, which is redundant, and also leads to issues when `std_args` expects build deps of the `pkg` to be installed.
HPCToolkit `develop` now can optionally be built via Meson. This PR adds the `build_system=(autotools|meson)` variant and splits the build-system-dependent pieces into `AutotoolsBuilder` and `MesonBuilder`. The default is to build with `autotools`.
As of writing, the Meson build is simply a wrapper around the original Autotools build, hence the build requires a native file listing the install prefixes of all dependencies (which are internally mapped to `--with-{pkg}={prefix}` arguments for `./configure`). This is an unconventional but temporary state of affairs until the build system is fully ported to Meson and conventional dependency acquisition techniques like `pkg-config` and `cmake` are practically available.
* Environments: fix environment config
* Don't change the lockfile manifest path
* Update activate's comments to tie the manifest to the configuration
* Add spec_list override method
* Remove type checking from 'activate' since already have built-in check
* Refactor global methods tied to the manifest to be in its class
* Restore Environment's 'env_subdir_path' and pass its config_stage_dir to EnvironmentManifestFile
* Restore global env_subdir_path for use by Environment and EnvironmentManifestFile
The CUDA target should be specified at build time, otherwise
by default `autodock-gpu` will be built for compute capabilities
52, 60, 61, 70, which may cause errors on unsupported cards.
Currently requirements allow to express "strong preferences" and "conflicts" from
configuration using a convoluted syntax:
```yaml
packages:
zlib-ng:
require:
# conflict on %clang
- one_of: ["%clang", "@:"]
# Strong preference for +shared
- any_of: ["+shared", "@:"]
```
This PR adds syntactic sugar so that the same can be written as:
```yaml
packages:
zlib-ng:
conflict:
- "%clang"
prefer:
- "+shared"
```
Preferences written in this way are "stronger" that the ones documented at:
- https://spack.readthedocs.io/en/latest/packages_yaml.html#package-preferences
`spack install` early exit behavior was sometimes convenient, except
that it had and has bugs:
1. prior bug: we didn't mark env roots of already installed specs as
explicit in the db
2. current bug: `spack install --overwrite` is ignored
So this PR simplifies by letting the installer do its thing even if
everything is supposedly installed.
* Bump up the version for ROCm-6.0.0
* Adding patch files
* Style check failure fix
* Style check fixes
* Style check error fixes
* Patch to remove hipblas client file installation in 6.0
* Patch needs to be applied on all 5.7 releases
* 6.0 update for math libs and other packages, new github url etc
* Correct package-audit failures
* Correcting shasum for rocfft patch and limiting patch in rocblas
* Reverting updates in rocprofiler-dev due to ci-gitlab failure
* Fixes for ci-gitlab failure due to disabling hip backward compatibility
* Adding patch file to Change HIP_PLATFORM from HCC to AMD and NVCC to NVIDIA
* Use gcnArchName in place of gcnArch, as gcnArch is deprecated from rocm-6.0.0
* Patches to fix magma and blaspp build error with rocm 6.0.0
* Patch for mfem and arborx for rocm 6.0
* Style check error fix
* Correcting style check errors
* Updating dependent version
* Update for petsc to build with rocm 6.0
Need reverting-operator-mixup-fix-for-slate.patch for rocm 6.0
* Reverting the change in url for 2.7.4-rocm-enhanced
* hip-tensor 6.0.0 update
This commit ensures that CMake packages that also have Python as a build/link dep get a couple defines for the Python path so that CMake's builtin `FindPython3`, `FindPython`, `FindPythonInterp` modules can locate Python correctly.
The main problem with those CMake modules is that they first search for Python versions known at the time of release, meaning that an old CMake may find an older system Python 3.8 even though Python 3.11 comes first in `CMAKE_PREFIX_PATH` and `PATH`.
Package maintainers can opt out of this by overriding the `find_python_hints = False` attribute in the package class.
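Below is a minimal, hypothetical sketch of the opt-out described above; the class name and dependency constraint are illustrative, not taken from a real package:
```python
from spack.package import *


class Mytool(CMakePackage):
    """Hypothetical package that handles its own Python discovery."""

    # Opt out of the automatic Python path hints that Spack now passes to
    # CMake's FindPython3/FindPython/FindPythonInterp modules.
    find_python_hints = False

    depends_on("python@3.8:", type=("build", "link"))
```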
* Add nglview package
* Use slightly older version
* py-nglview: Correct py-versioneer version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-nglview: Correct version of py-jupyter-packaging dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add py-versioneer-518 package
* py-versioneer-518: Correct version
* py-nglview: Numpy is needed during build for the tests
* py-nglview: dependency needed for tests
* py-nglview: Correct dependency types
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* e4s ci: use latest intel/hpckit 2024 based image
* use latest container image: ecpe4s/ubuntu22.04-runner-amd64-oneapi-2024.0.0:2023.12.01
* comment out failing specs
* update to use patched container
* remove generalized package preference for intel-oneapi-mkl@2023
* change packages commented out
* VecGeom: new version 1.2.7 and fix URLs
* vecgeom: remove deprecated ancient RC version with incorrect hash
* geant4: remove support for @10.3+vecgeom
The piece of code that is removed in this PR predates environment views.
Spack would symlink build logs in `<env>/.spack-env/logs/*`, but this is
redundant because:
1. Views already add `<prefix>/.spack` (and there's logic there to avoid
clashes)
2. The code was broken anyways: it would only symlink the logs of
environment roots, not their deps, even if they were just built.
If users disable views, I'm pretty sure they're not waiting for
`.spack-env/logs` either. So, imo we can delete this code, and it was
probably overlooked in the past.
For a requirement like
```
packages:
  foo:
    require:
      - "+debug"
```
(not `one_of:`, `any_of:`, or `spec:`)
`spack config change` would ignore the string. This was particularly evident if toggling a variant for a previously unmentioned package:
```
$ spack config change packages:foo:require:+debug
$ spack config change packages:foo:require:~debug
```
This fixes that and adds a test for it.
```
$ curl 'https://thepeg.hepforge.org/downloads/?f=ThePEG-2.3.0.tar.bz2' | sha256sum
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 1904k 0 1904k 0 0 1075k 0 --:--:-- 0:00:01 --:--:-- 1074k
ac35979ae89c29608ca92c156a49ff68aace7a5a12a0c92f0a01a833d2d34572 -
```
* Reduce the size on disk for logs
This PR does two things:
1. Store a compressed `spack-build-out.txt.gz`
2. Get rid of phase logs, as they are duplicates of
`spack-build-out.txt`
The logs are not compressed in the stage dir, so on build failure the
workflow for users is no different.
It's just that on install the logs are rarely used, and if needed, users
can easily `gzip -d` or `zgrep` them.
In the case of GCC installs, the compressed logs are <5% of the original
size, which is typically dozens of MBs.
* get rid of "backwards compat" of file names in stage dirs
Sbangs don't exist on Native Windows, and the hook is causing errors
due to the file comparison + behavior of os.rename on Windows. Skip
the hook on Windows.
New patch 7.5.3p5, new bugfix 7.7.2, new minor 7.8.0.
Only possible impact on spack is the potential addition of a variant to select the memory manager in 7.8.0, see [diff](https://github.com/Open-Cascade-SAS/OCCT/compare/V7_7_2...V7_8_0). Not adding a variant at this time.
Like `spack change` for specs in environments, this can e.g. replace `examplespec+debug` with `examplespec~debug` in a `require:` section.
Example behavior for a config like:
```
packages:
  foo:
    require:
      - spec: +debug
```
* `spack config change packages:foo:require:~debug` replaces `+debug` with `~debug`
* `spack config change packages:foo:require:@1.1` adds a requirement to the list
* `spack config change packages:bar:require:~debug` adds a requirement
As observed in #40944, when using `spack config add <path>`, the `path` might
contain keys that are enclosed in quotes.
This was broken in https://github.com/spack/spack/pull/39831, which assumed that
only the value (if present, the final element of the path) would use quotes.
This preserves the primary intended behavior of #39931 (allowing ":" in values when
using `spack config add`) while also allowing quotes on keys.
This has complicated the function `process_config_path`, but:
* It is not used outside of `config.py`
* The docstring has been updated to account for this
* Created an object to formalize the DSL, added a test for that, and
refactored parsing to make use of regular expressions as well.
* Updated the parsing and also updated the `config_path_dsl` test with an explicit check.
At a higher level, split the parsing to check if something is either a key or not:
* in the first case, it is covered by a regex
* in the second, it may be a YAML value, but then it has to be the last
entry of x:y:z, so I attempt to use the YAML handling logic to parse it as such
- Use MakefilePackage and simplified package.py
- Deprecate old versions - they did not build for me with OCaml 4.13.1
that is currently in Spack. Also, the changes from the previous
versions seem to be quite significant.
Spack packages may not have a public download option, and can implement
`download_instr` to inform users how to obtain the artifacts needed to
build. `spack checksum` however did not account for this and would print
out a confusing error message when invoked on such packages ("Could not
find any remote versions").
This PR updates the error message to output the manual download instructions
if `spack checksum` is invoked on a package with `manual_download = True`.
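For reference, a hedged sketch of the package-side attributes involved; the package name, instructions, and checksum are placeholders:
```python
from spack.package import *


class VendorCode(Package):
    """Hypothetical package whose sources must be fetched manually."""

    manual_download = True

    version("1.0", sha256="0" * 64)  # placeholder checksum

    @property
    def download_instr(self):
        # This is the text `spack checksum` now prints for this package
        # instead of the confusing "could not find any remote versions" error.
        return "Request vendor-code-1.0.tar.gz from the vendor and add it to a local mirror."
```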
Currently when you repeatedly create a bootstrap mirror that includes
`clingo-bootstrap@spack` you get different tarballs every time.
This is a general problem with mirroring checkouts from version control
as tarballs. I think it's best to create tarballs ourselves, since that way we
have more control over their contents.
This PR normalizes tarballs the same way we do for build caches:
- normalize file permissions (in fact that was already inspired by git, so
should be good)
- normalized file creation/modification time (timestamp 0)
- uid / gid = 0, no usernames
- normalized gzip header
- dir entries are ordered by `(is_dir, name)` where strings are not locale aware ;)
- POSIX says st_mode of symlinks is unspecified, so work around it and
force mode to `0o755`
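A rough sketch of those normalization rules in plain Python follows; it is illustrative only and not the code Spack uses:
```python
import gzip
import os
import tarfile


def reproducible_tarball(src_dir, dest):
    """Create a tarball whose bytes do not depend on uid, mtime, or locale."""

    def normalize(info):
        info.uid = info.gid = 0           # uid / gid = 0
        info.uname = info.gname = ""      # no usernames
        info.mtime = 0                    # fixed timestamps
        if info.issym():
            info.mode = 0o755             # st_mode of symlinks is unspecified
        else:
            # git-style permission normalization: executable or not
            info.mode = 0o755 if info.mode & 0o100 else 0o644
        return info

    # mtime=0 in the gzip header makes the compressed stream reproducible too
    with gzip.GzipFile(dest, "wb", mtime=0) as gz:
        with tarfile.open(fileobj=gz, mode="w") as tar:
            for root, dirs, files in os.walk(src_dir):
                dirs.sort()  # deterministic, locale-independent traversal
                for name in sorted(files) + dirs:
                    path = os.path.join(root, name)
                    tar.add(path, arcname=os.path.relpath(path, src_dir),
                            recursive=False, filter=normalize)
```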
R embeds an absolute path to the `which` executable in the sources for
`Sys.which`. This gets ultimately stored as serialized byte code in some
custom database format, which uses compression for entries.
As a result, Spack cannot relocate `<prefix which>/bin/which` when
installing from a build cache.
The patch works around this by making R create a symlink to `which` in
its own prefix and having the R sources call that, so that relocation works
again.
See https://github.com/r-devel/r-svn/pull/151
Explicitly requested namespaces are annotated during
the setup phase, and used to retrieve the correct package
class.
An attribute for the namespace has been added for each node.
Currently, a single namespace per package is allowed
during concretization.
* add new versions of py-altair
* fix year
* Update var/spack/repos/builtin/packages/py-altair/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-altair/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* reorder dependencies
* remove rc
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add py-nanoplot and py-nanostat
* Add myself as spack package maintainer
* Remove python version requirement
* Remove python dependency
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py: remove python dependency
* Set dependency types
* Update py-nanomath package.py
* Update py-nanoplot package.py
* Update py-nanostat package.py
* Add missing py-python-deprecated dependency
* Make kaleido a source package
* Fix py-nanoget deps
* Kaleido lint
* Nanoget lint
* Nanomath lint
* Nanoplot lint
* Nanostat lint
* Another kaleido lint I missed
* py-nanoplot missed lint
* py-nanostat missed lint
* py-kaleido even more missed lint
* The linter really can't make up its mind
* The linter REALLY can't make up its mind
* Add py-python-deprecated package
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* perl-class-accessor-lvalue: New package
Adds Class::Accessor::Lvalue
* perl-date-utils and deps: New packages
This adds:
- perl-date-utils and its dependencies:
- perl-date-exception
- perl-term-ansicolor-markup
The current `mkdir {{ paths.environment }}` will generate an error if:
* `{{ paths.environment }}` already exists, or
* `{{ paths.environment }}` is nested in non-existing dirs.
Adding `-p` to the command will make this robust to both possibilities.
Set noclobber bash option when writing manifest.
* Add GASNet-EX release 2023.9.0
Add level_zero variant
Deprecate old versions that are no longer supported
Add version enforcement to accelerator variants
* Add UPC++ release 2023.9.0
Deprecate old versions that are no longer supported
Add version enforcement to accelerator variants
* perl-rose-datetime: New package plus updates
- perl-rose-datetime: New package
- perl-datetime: New version, updated dependencies to enable build
time tests and added a runtime test
* Remove copyright line
* New year
* bigdft: convert the use of format strings to fstrings everywhere
* Fix formatting
* reformat with black
* Update compiler flags in bigdft-core
* Revert "Update compiler flags in bigdft-core"
This reverts commit f7524ed784.
* Bump the build cache layout version from 1 to 2
* Version 2 lists parent directories of the prefix in the tarball too, which is required by some container runtimes
* Move in vs. satisfies to a note and mention special cases of in
* Address feedback: overlap -> intersect
* Re-word the satisfies versus in note.
---------
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
This fixes an issue where pkg.stage throws because a patch cannot be found,
but the patch is redundant because the spec is reused from a build cache and
will be installed from existing binaries.
Part 4 of reworking all package metadata to key by `when` conditions.
Changes the provider (virtual package) dictionary structure from this:
{ provided_spec: {when_spec, ...} }
to this:
{ when_spec: {provided_spec, ...} }
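The change is purely structural; a hedged illustration of the inversion (not the actual Spack code):
```python
from collections import defaultdict


def key_by_when(provided):
    """Turn {provided_spec: {when_spec, ...}} into {when_spec: {provided_spec, ...}}."""
    by_when = defaultdict(set)
    for provided_spec, when_specs in provided.items():
        for when_spec in when_specs:
            by_when[when_spec].add(provided_spec)
    return dict(by_when)
```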
`make_when_spec()` was being used in the solver, but it has semantics that are specific
to parsing when specs from `package.py`. In particular, it returns `None` when the
`when` spec is `False`, and directives are responsible for ignoring that case and not
adding requirements, deps, etc. when there's an actual `False` passed in from
`package.py`.
In `asp.py`, we know that there won't ever be a raw boolean when spec or constraint, so
we know we can parse them without any of the special boolean handling. However, we
should report where in the file the problem occurred when there is an error, so this adds some parsing
logic to extract the `mark` from YAML and alert the user where the bad parse is.
- [x] refactor `config.py` so that basic `spack_yaml` mark info is in its own method
- [x] refactor `asp.py` so that it uses the smarter YAML parsing routine
- [x] refactor `asp.py` so that YAML input validation for requirements is done up front
Part 3 of reworking all package metadata to key by `when` conditions.
Changes the requirements dictionary structure from this:
{ (requirement_spec, ...): [(when_spec, policy, msg)] }
to this:
{ when_spec: [((requirement_spec, ...), policy, msg), ...] }
Part 2 of reworking all package metadata to key by `when` conditions.
Changes conflict dictionary structure from this:
{ conflict_spec: [(when_spec, msg), ...] }
to this:
{ when_spec: [(conflict_spec, msg), ...] }
Also attempts to consistently name the variables used to iterate over conflict
dictionaries.
Part 1 of making all package metadata indexed by `when` condition. This
will allow us to handle all the dictionaries on `PackageBase` consistently.
Convert the current dependency dictionary structure from this:
{ name: { when_spec: [Dependency ...] } }
to this:
{ when_spec: { name: [Dependency ...] } }
On an M1 mac, this actually shaves 5% off the time it takes to load all
packages, I think because we're able to trade off lookups by spec key
for more lookups by name.
* Boost: add version 1.84.0
* Conflict with 98/03
* Set C++11 as default
Starting with 1.84.0, the minimum required is c++11. It has been a very
long time since 98/03 has been required. It's time to bump the minimum.
* Add OpenMPI 5.0.0/5.0.1 release
* Fix a problem with dlopen syms with 5.0.0
* Crank up lex buffer to 1MB so that Open MPI's compiler wrapper can parse the enormously long lines present in, for example, mpicc-wrapper-data.txt when the spack install is utilizing Spack's path padding feature.
* Disable romio by default for 5.0.0 and beyond owing to problems compiling the romio package when using the Intel OneAPI compiler.
* Patch for adding cuda lib location in case of non-standard location of libcuda.so
* build accel components as DSOs. It appears from looking at some of the spack CI that it implicitly assumes that Open MPI is built with components as DSOs. The default behavior for Open MPI changed between the 4.1.x and 5.0.x release streams, so this premise is now incorrect.
Turns out that starting with Open MPI 5.0.0 building static
does not work when using a now very important variant, namely cuda.
In older versions of Open MPI the libcuda.so was dlopened at
run time when needed, but now libcuda is linked in to the cuda
components of openmpi directly. This works when using Open MPI's
dynamically loadable component option, but doesn't work now for
a lot of the Spack CI pipelines because they don't include libcuda.so
in LD_LIBRARY_PATH of packages that don't think they are using
cuda themselves.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Co-authored-by: Jack Morrison <jack.morrison@cornelisnetworks.com>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Needed for #40326, which can change the iteration order over package dependencies during concretization.
While clingo doesn't have this problem, the original concretizer (which we still use for bootstrapping) can be sensitive to iteration order when evaluating dependency constraints in `when` conditions. This can cause it to ignore conditional dependencies unless the dependencies in the condition are listed first in the package.
The issue was in the way the original concretizer would disconnect specs *every* time `normalize()` ran. When specs were disconnected, `^dependency` constraints wouldn't see the dependency in the dependency condition loop.
We now only disconnect *all* dependencies at the start of `concretize()` and `normalize()`, and we disconnect any leftover dependents from replaced externals at the *end* of `normalize()`. This trims stale connections while keeping the ones that are needed to trigger dependency conditions.
- [x] refactor `flat_dependencies()` to not disconnect the spec by default.
- [x] `flat_dependencies()` is never called with `copy=True` -- remove the `copy` kwarg.
- [x] disconnect only once at the beginning of `normalize()` or `concretize()`.
- [x] add a test that perturbs dependency iteration order to ensure this doesn't regress.
- [x] disconnect unused dependents at end of `normalize()`
* setup-env: Fix back and forth between two instances
* setup-env.csh: Fix SPACK_ROOT when switch to a different instance
i.e. Always look for the current SPACK_ROOT
* setup-env: Update comments
This adds options to `spack list` that allow you to list only packages from specific
repositories/namespaces, e.g.:
```console
spack list -r builtin
```
only lists packages from the `builtin` repo, while:
```console
spack list -r myrepo -r myrepo2
```
would list packages from `myrepo` and `myrepo2`, but not from `builtin`. Note that you
can use the same argument multiple times.
You can use either `-r` / `--repo` or `-N` / `--namespace`. `-N` is there to match the
corresponding option on `spack find`.
- [x] add `-r` / `--repo` / `-N` / `--namespace` argument
- [x] add test
* Add rust-analyzer as variant to rust build
* Expose cargo module only when +cargo
* rust: add v1.74.0 and v1.75.0 and remove variants in favor of +dev
* [@spackbot] updating style on behalf of alecbcs
* Fix variant typo
---------
Co-authored-by: alecbcs <alecbcs@users.noreply.github.com>
This method is vestigial; the only arg we ever used was `ignore=`, and that was
eliminated in #29317 and #35588.
The `kwargs` field of the extensions dictionary is actually completely unused now. Add a
note for future removal.
Literal compiler config in `test_requires_directive` specifically lists `target:
x86_64`, but it doesn't need to, and the unnecessary target makes the test fail on
non-`x86_64` machines.
- [x] Remove target from config yaml in `test_requires_directive`
* py-torch: set env OpenBLAS_HOME
Because [`FindOpenBLAS.cmake`](https://github.com/pytorch/pytorch/blob/main/cmake/Modules/FindOpenBLAS.cmake) uses a hardcoded list of search paths for includes and libraries, we have to pass the `OpenBLAS_HOME` environment variable (a sketch of this pattern follows the list below).
* py-torch: patch for ${OpenBLAS_HOME}/include/openblas
The context of this patch is unchanged since v0.4.0.
* py-torch: move patch before def patch
* py-torch: also set Atlas_ROOT_DIR and BLIS_HOME
* py-torch: fix openblas patch range to @:2.1
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
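A hedged sketch of the pattern from the first bullet above, written as a build-environment hook inside the package class (simplified; not the package's exact code):
```python
def setup_build_environment(self, env):
    # Point the hardcoded FindOpenBLAS.cmake search paths at Spack's
    # OpenBLAS prefix so CMake picks up the right library.
    if self.spec.satisfies("^openblas"):
        env.set("OpenBLAS_HOME", self.spec["openblas"].prefix)
```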
* Xorg pkgs: updated version to current latest
This updates the versions of multiple Xorg packages to their current
latest version. Verified the requirements and version dependencies, and
updated where needed. Modified one homepage (xkbcomp) in the interest of
conformity with the other packages.
Summary of dependency changes:
- libsm:
- depends_on("libice@1.1.0:", when="@1.2.4:")
- libx11:
- depends_on("libxcb@1.11.1:", when="@1.6.4:")
- libxcomposite:
- depends_on("xproto@7.0.22:", when="@0.4.6")
- libxfixes:
- depends_on("fixesproto@5.0:", when="@5")
- depends_on("fixesproto@6.0:", when="@6")
- libxi:
- depends_on("inputproto@2.2.99.1:", when="@1.7:")
- depends_on("inputproto@2.3.99.1:", when="@1.8:")
* xcb-proto, libxcb: new version 1.15
* xorg libs: additional new versions
New minor version upgrades:
- libXcursor (no change needed)
- libXres
- depends_on("resourceproto@1.0:", when="@1.0")
- depends_on("resourceproto@1.2:", when="@1.2")
* libxpm: ... depends_on ncompress only when 3.5.15
* Xorg libs: add maintainer
* xtrans: new version 1.5.0
* xcb-proto: new version 1.16.0
* libxt: new version 1.3.0
* libxrandr: new version 1.5.4
* libxpm: new versions 3.5.16, 3.5.17
* libxi: new version 1.8.1
* libxft: new version 2.3.8
* libxfixes: new version 6.0.1
* libxcb: new version 1.16
* libx11: new version 1.8.5, 1.8.6, 1.8.7
* libxfixes: comment out problematic fixesproto versions
* libxi: comment out problematic inputproto versions
* libxfixes, libxi: add reference to issue that blocks updates
* patch on variant fftw3; fix for #41577
* a line of doubtful blas/fftw +openmp. Are they needed with nwchem+openmp?
* Update var/spack/repos/builtin/packages/nwchem/package.py
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
---------
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
* gaudi: add a patch for catch2
* Fix indentation
* Add a diff at the end of the path
* gaudi: canonicalize patch url
* gaudi: canonicalize patch url for gitlab diff
---------
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
PAPI CI checks a `spack install` of `papi@master`, and the open version range here breaks their
CI because `master` already contains the fix and the patch is no longer needed (see #26784, #27625
for why it's difficult to avoid this).
The patch issue is going to be fixed in PAPI upstream with whatever release is after
`7.1.0`, so we can restrict the patch to `7.1.0` and avoid this issue.
* shell: fix zsh color formatting for PS1 in environments
The `colorize` function in `llnl.util.tty.color` only applies proper formatting for Bash
ANSI and for console output, but this is not what zsh expects for environment variables.
In particular, when using `zsh`, `spack env activate -p` produces a `PS1` prompt that
looks like this:
```
\[\033[0;92m\][ENVIRONMENT]\[\033[0m\]
```
For zsh the formatting should be:
```
\e[0;92m[ENVIRONMENT]\e0;m
```
- [x] Add a `zsh` option to `colorize()` to enable zsh color formatting
- [x] Add conditional to choose the right `PS1` for `zsh`, `bash`, and `sh`
- [x] Don't use color escapes for `sh`, as they don't print properly
* convert lots of += lines to triple quotes
No online release notes or repository, but the new tarball has the following in `NEWS`:
* ThePEG-2.3.0 release: 2023-12-11
** gcc-12/c++17/c++20 compatibility added
** hepmc3 compatibility added
** rivet interface improved
** inforstructure for dark interaction added
PAPI 7.1.0 unconditionally adds `FFLAGS = -ffree-form` in the sysdetect tests,
regardless of the compiler.
This was added in https://github.com/icl-utk-edu/papi/pull/108 to make a build with
`armflang` work, but it breaks CCE (and our `develop` pipeline).
- [x] Add a patch that fixes both problems
- [x] Patch PAPI when at 7.1.0 or higher
Add a "checked_by" field to the `license()` directive so that we can track who verified
the license for a project. also check the license of 18 or so projects and mark them
checked.
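A hedged example of the directive with the new field; the package and identifiers are placeholders:
```python
from spack.package import *


class Example(Package):
    """Hypothetical package showing the new license metadata."""

    # SPDX expression plus the handle of whoever verified it against the sources
    license("Apache-2.0 OR MIT", checked_by="your-github-handle")
```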
This patch adds license information for about 5,300 packages from automated sources.
The license information was obtained from Alpine Linux and PyPI and processed
using tooling available in https://github.com/boomanaiden154/spack-license-utils.
The license field was added after all other directives in an automated fashion.
Note that while this license information is probably fairly accurate, it is not
guaranteed to be accurate. In addition some of the license strings from Alpine Linux
might not be valid SPDX license strings. Invalid SPDX identifiers can be picked up
and fixed once we have validation/parsing infrastructure in place for the solver,
and issues can be fixed as they come up.
This adds a few options to `spack gc`.
One to give you a little more control over dependencies:
* `-b` / `--keep-build-dependencies`: By default, `spack gc` considers build dependencies to be "no longer needed" once their dependents are installed. With this option, we'll keep build dependencies of needed installations as well.
And two more to make working with environments easier:
* `-E` / `--except-any-environment`: Garbage collect anything NOT needed by an environment. `spack gc -E` and `spack gc -bE` are now easy ways to get rid of everything not used by some environment.
* `-e` / `--except-environment` `ENV`: Instead of considering all environments, garbage collect everything not needed by a *specific* environment. Note that you can use this with `-E` to add directory environments to the list of considered envs, e.g.:
spack gc -E -e /path/to/direnv1 -e /path/to/direnv2 #...
- [x] rework `unused_specs()` method on DB to add options for roots and deptypes
- [x] add `all_hashes()` method on DB
- [x] rework `spack gc` command to add 3 more options
- [x] tests
* Add all versions back to 0.20; add more depends_on (flex, bison, libffi, and ccache); add the ability to enable or disable both abc and ccache (abc is enabled by default, ccache is disabled by default)
* Fixed style with black
* Removed unused f-string setups
* Fixed style with black (again)
* Add enum34, numdifftools, and updated pyomo packages
* Syntax error
* Apply black style
* Trying to get around Python spec issue
* All SHAs were somehow wrong
* Change enum version
* Change optional dependencies to be on run, not build
* Add Pyomo 6.7.0
* Update SHA and version mismatch
* Remove py-enum34
* Add three new packages to address comments
* Fix linting errors; move casadi to py-casadi
* Update license; add in dependency
* Update setuptools version
* Update class name to python class
* Remove other boilerplate stuff
* Update homepage addresses; update py-casadi
* Add PAPI 7.0.1
* Add comment about skipping PAPI 7.0.0
* Add patch to avoid adding Intel ifort/ifx flag on Cray ftn
* Modify patch to include Cray-specific flags
* Adjust recipe to always apply patch for 7.0.1
* Expand Cray compiler checks in patch
* Forgot to update recipe
* Adjust recipe so it looks for hipcc in the correct path
* Revert "Adjust recipe so it looks for hipcc in the correct path"
This reverts commit 0db3df4fe2.
* Patch HIP_PATH to work with Spack-built HIP
* Patch LDFLAGS with llvm-amdgpu path
* Forgot the depends_on line
* libomptarget only builds with clang
* Try a self-consistent build of llvm-amdgpu
* Try making llvm-amdgpu depend on llvm for llvmoffloadarch library
* Update prereq to use rocm-openmp-extras instead
* Refactor llvm-amdgpu to use a version dict
* Fix typo
* Hack to exclude older versions without matching rocm-openmp-extras
* Add PAPI 7.1.0
* Revert changes to llvm-amdgpu
* Fix PAPI 7.1.0 checksum
For now, this only includes packages that I personally maintain.
Notable removals:
* Anaconda 2
* Catalyst
* Ancient numpy/scipy
* Ancient PyTorch
* Ancient Bazel/TF
To work properly, Spack requires a few directories from its repository to be added to
`sys.path`. Previously these were buried in `spack_installable.main.main()`, but it's
sometimes useful to get the paths separately, e.g., if you want to set up your own
functioning spack environment.
With this change, adding the paths is much simpler:
```python
import spack_installable
sys.path[:0] = get_spack_sys_paths(spack_prefix)
```
- [x] Add `get_spack_sys_paths()` method with extra paths in order.
- [x] Refactor `spack_installable.main.main()` to use it.
With an improper/incomplete/broken installation of Clingo, it can be
importable but not have any of the expected attributes.
Improve error reporting in this case.
* Restore PackageBase class, and modify only ASP
This prevents a noticeable slowdown in concretization
due to the number of directives involved.
* Fix issue with 'clang' being preferred to 'gcc',
due to runtime version weights
* Constraints on runtimes are declared by compilers
The declaration of available runtime versions, and of
their compatibility constraints are in the associated
compiler class.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
The gcc-runtime package adds a separate node for gcc's dynamic runtime
libraries.
This should help with:
1. binary caches where rpaths for compiler support libs cannot be
relocated because the compiler is missing on the target system
2. creating "minimal" container images
The package is versioned like `gcc` (in principle it could be
unversioned, but Spack doesn't always guarantee not mixing compilers)
If you are calling Spack from the python API, you might have written something like this
before #41529:
```
find = SpackCommand("find")
find('--format={name}', 'saxpy@1.0.0', '+rocm', 'amdgpu_target="gfx90a"')
```
But with the breaking change in #41529, you should write:
```
find = SpackCommand("find")
find('--format={name}', 'gromacs', '+rocm', 'amdgpu_target=gfx90a')
```
Note that we don't need quotes in Python strings, and that this is what would come in
via argv if you typed a quoted variant on the CLI.
The error messages for strings like this are not great -- you get something like this:
```
==> No package matches the query: gromacs+rocm amdgpu_target="gfx90a"
```
Which doesn't indicate that the issue might be your quoting. This is because we were
simply outputting the argv we got, instead of using spec.format() to output the error
message. This PR fixes such errors to use `spec.format()` and to look like this:
```
==> No package matches the query: gromacs+rocm amdgpu_target='"gfx90a"'
```
So users should have an easier time understanding that Spack considers the variant value
to contain quotes here.
- [x] update ConstraintAction to store parsed Specs
- [x] refactor commands to display formatted parsed Specs instead of raw input
Users expect that changes to the externals sections in packages.yaml config apply immediately, but reuse concretization caused this not to be the case. With this commit, the concretizer is only allowed to reuse externals previously imported from config if identical config exists.
This PR adds a flag `--tag/-t` to `buildcache push`, which you can use like
```
$ spack mirror add my-oci-registry oci://example.com/hello/world
$ spack -e my_env buildcache push --base-image ubuntu:22.04 --tag my_custom_tag my-oci-registry
```
and lets users ship a full, installed environment as a minimal container image where each image layer is one Spack package, on top of a base image of choice. The image can then be used as
```
$ docker run -it --rm example.com/hello/world:my_custom_tag
```
Apart from environments, users can also pick arbitrary installed specs from their database, for instance:
```
$ spack buildcache push --base-image ubuntu:22.04 --tag some_specs my-oci-registry gcc@12 cmake
$ docker run -it --rm example.com/hello/world:some_specs
```
It has many advantages over `spack containerize`:
1. No external tools required (`docker`, `buildah`, ...)
2. Creates images from locally installed Spack packages (No need to rebuild inside `docker build`, where troubleshooting build failures is notoriously hard)
3. No need for multistage builds (Spack just tarballs existing installations of runtime deps)
4. Reduced storage size / composability: when pushing multiple environments with common specs, container image layers are shared.
5. Automatic build cache: later `spack install` of the env elsewhere speeds up since the containerized environment is a build cache
These packages were written before the "requires" directive,
and so they conflict with all compilers but Fujitsu
to express that they _require_ `%fj`.
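With the directive available, the intent can be stated directly; a minimal sketch with a made-up package name:
```python
from spack.package import *


class FujitsuOnlyLib(Package):
    """Hypothetical package that only builds with the Fujitsu compiler."""

    # replaces a long list of conflicts("%gcc"), conflicts("%clang"), ...
    requires("%fj", msg="this package can only be built with the Fujitsu compiler")
```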
* Ensure a spack libproj is used instead of a system libproj when libproj < 8.
spack/spack/issues/41299
* Fix style as per ci-bot
* Fix style as per ci-bot
* Ensure 3.5:3.8.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add trim function to `Spec` and `--ignore` option to 'spack diff'
Allows user to compare two specs while ignoring the sub-DAG of a particular dependency, e.g.
spack diff --ignore=mpi --ignore=zlib trilinos/abcdef trilinos/fedcba
to focus on differences closer to the root of the software stack
Sometimes env variables computed in `setup_run_environment` depend on checks
against files in `spec.prefix`, but Spack temporarily projects `spec.prefix` to
the view.
This is problematic for two reasons:
1. Some packages iterate over `<prefix>/bin`: they expect only the current
package's executables, but find everything linked into the view, leading to false
positives.
2. Some packages test for `os.path.islink(...)`, which is always true in a view.
`gcc` is an example that does both.
This PR lets Spack compute the environment modifications using the original
prefix, and projects to the view afterwards
Currently, a virtual spec is composed of just a name and a version. When a virtual spec contains other components, such as variants, Spack won't emit warnings or errors but will silently drop them - which is unexpected by users.
This adds three new versions in the 31.* series. Release notes of 31.0.0 at https://github.com/acts-project/acts/releases/tag/v31.0.0. No changes to the CMakeLists.txt files that need addressing in the package recipe.
The only new feature I'm a bit concerned about is https://github.com/acts-project/acts/pull/2626, which replaces testing for C++20 concepts support by the feature-testing macro `__cpp_concepts`, which is also a C++20 feature. So technically we now should require `cxxstd=20` even though Acts itself still allows (and defaults to) 17. Judging by https://en.cppreference.com/w/cpp/compiler_support/20, the support for feature-testing macros was added very early by most compilers.
This PR changes the default behavior of `spack config get` and `spack config blame`
to print a flattened version of the entire spack configuration, including any active
environment, if the commands are invoked with no section arguments.
The new behavior is used in Gitlab CI to help debug CI configuration, but it can also
be useful when asking for more information in issues, or when simply debugging Spack.
Convert the 'develop' section of an environment to a dedicated configuration section.
This means for example that instead of having to define `develop` specs in the
`spack.yaml`, the environment can `include:` another `develop.yaml` configuration
which specifies which specs should be developed in the environment.
This change is not expected to be disruptive given that existing environment `spack.yaml`
files will conform to the new schema.
(Update 11/28/2023) I have implemented the `develop`/`undevelop` commands in terms
of more-generic modification functions added to the `config` module: `change_or_add`
and `update_all`. It is assumed that the semantics added here (described in 11/18 update)
would be desirable to extend to other config update actions (e.g. adding compilers,
changing package requirements, adding mirrors).
(Update 11/18/2023) I have updated this such that `spack develop`, and
`spack undevelop` to potentially modify all writable scopes, like
https://github.com/spack/spack/pull/41147. https://github.com/spack/spack/pull/35307
will be useful for modifying included scopes, but generally speaking specifying a
`--scope` will not be required for `spack develop`: `spack develop` will add new
develop specs to whatever scope already has develop specs defined, or to the
highest-priority writable scope (which should be the env scope).
TODOs:
- [x] If you `spack undevelop` a package which is mentioned at multiple layers of
configuration, then currently this would only modify one of them. That's not
technically a new issue (has always existed for configuration modification), but
may be confusing to users when presented via an interface other than `spack config set`
- [x] Need to add (or confirm) the ability to modify individual config files by providing
a path (rather than using a scope identifier as a key to retrieve associated config).
- [x] `spack develop` adds new develop specs to the scope that defines them
(potentially skipping higher priority scopes to e.g. augment included scope files)
---------
Co-authored-by: scheibelp <scheibelp@users.noreply.github.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
* py-plum-dispatch: add new package
* Update var/spack/repos/builtin/packages/py-plum-dispatch/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-htgettoken: use os.environ, avoid AttributeError
This avoids the following error:
```
Warning: could not load runtime environment due to AttributeError: 'EnvironmentModifications' object has no attribute 'get'
```
* py-htgettoken: allow for undefined variables
* py-htgettoken: use dict get()
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* mpifileutils: add DAOS variant
* mpifileutils: Add daos dep when +daos
Add dependency on DAOS when +daos
Pass DAOS prefix to ensure the correct DAOS is found during configuration
* Change in to satisfies for boolean variants
---------
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
* perl-datetime-format-strptime: New package
Adds package:
- perl-datetime-format-strptime
And adds these because they are test dependencies:
- perl-test-file-sharedir
- perl-test2-plugin-nowarnings
- perl-test2-suite
And modifies these to enable build time tests:
- perl-b-hooks-endofscope
- perl-class-singleton
- perl-datetime-locale
- perl-datetime-timezone
- perl-file-sharedir
- perl-namespace-autoclean
- perl-namespace-clean
- perl-params-validationcompiler
- perl-specio
* Add myself as maintainer
* add new cpp compiler version
* empty ftn for 2023.2.3
* OLD ftn in 2023.2.3 version
* tolerate missing fortran compiler
---------
Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
* geant4: new version 11.2.0
* geant4: depends_on geant4-data@11.2:
* geant4-data: new version 11.2.0
* g4abla: new version 3.3
* g4emlow: new version 8.4
* g4incl: new version 1.1
* geant4: depends_on vecgeom@1.2.6:
* geant4: depends_on qt@5.9: when @11.2: +qt
* vecgeom: new version 1.2.6
This PR does several things:
- [x] Allow any character to appear in the quoted values of variants and flags.
- [x] Allow easier passing of quoted flags on the command line, e.g. `cflags="-O2 -g"`.
- [x] Handle quoting better in spec output, using single quotes around double
quotes and vice versa.
- [x] Disallow spaces around `=` and `==` when parsing variants and flags.
## Motivation
This PR is motivated by the issues above and by ORNL's
[tips for launching at scale on Frontier](https://docs.olcf.ornl.gov/systems/frontier_user_guide.html#tips-for-launching-at-scale).
ORNL recommends using `sbcast --send-libs` to broadcast executables and their
libraries to compute nodes when running large jobs (e.g., 80k ranks). For an
executable named `exe`, `sbcast --send-libs` stores the needed libraries in a
directory alongside the executable called `exe_libs`. ORNL recommends pointing
`LD_LIBRARY_PATH` at that directory so that `exe` will find the local libraries and
not overwhelm the filesystem.
There are other ways to mitigate this problem:
* You could build with `RUNPATH` using `spack config add config:shared_linking:type:runpath`,
which would make `LD_LIBRARY_PATH` take precedence over Spack's `RUNPATHs`.
I don't recommend this one because `RUNPATH` can cause many other things to go wrong.
* You could use `spack config add config:shared_linking:bind:true`, added in #31948, which
will greatly reduce the filesystem load for large jobs by pointing `DT_NEEDED` entries in
ELF *directly* at the needed `.so` files instead of relying on `RPATH` search via soname.
I have not experimented with this at 80,000 ranks, but it should help quite a bit.
* You could use [Spindle](https://github.com/hpc/Spindle) (as LLNL does on its machines)
which should transparently fix this without any changes to your executable and without
any need to use `sbcast` or other tools.
But we want to support the `sbcast` use case as well.
## `sbcast` and Spack
Spack's `RPATHs` break the `sbcast` fix because they're considered with higher precedence
than `LD_LIBRARY_PATH`. So Spack applications will still end up hitting the shared filesystem
when searching for libraries. We can avoid this by injecting some `ldflags` into the build, e.g.,
if we were going to launch, say, `LAMMPS` at scale, we could add another `RPATH`
specifically for use with `sbcast`:
spack install lammps ldflags='-Wl,-rpath=$ORIGIN/lmp_libs'
This will put the `lmp_libs` directory alongside `LAMMPS`'s `lmp` executable first in the
`RPATH`, so it will be searched before any directories on the shared filesystem.
## Issues with quoting
Before this PR, the command above would've errored out for two reasons:
1. `$` wasn't an allowed character in our spec parser.
2. You would've had to double quote the flags to get them to pass through correctly:
spack install lammps ldflags='"-Wl,-rpath=$ORIGIN/lmp_libs"'
This is ugly and I don't think many users will easily figure it out. The behavior was added in
#29282, and it improved parsing of specs passed as a single string, e.g.:
spack install 'lammps ldflags="-Wl,-rpath=$ORIGIN/lmp_libs"'
but a lot of users are naturally going to try to quote arguments *directly* on the command
line, without quoting their entire spec. #29282 used a heuristic to detect unquoted flags
and warn the user, but the warning could be confusing. In particular, if you wrote
`cflags="-O2 -g"` on the command line, it would break the flags up, warn, and tell you
that you could fix the issue by writing `cflags="-O2 -g"` even though you just wrote
that. It's telling you to *quote* that value, but the user has to know to double quote.
## New heuristic for quoted arguments from the CLI
There are only two places where we allow arbitrary quoted strings in specs: flags and
variant values, so this PR adds a simpler heuristic to the CLI parser: if an argument in
`sys.argv` starts with `name=...`, then we assume the whole argument is quoted.
This means you can write:
spack install bzip2 cflags="-O2 -g"
directly on the command line, without multiple levels of quoting. This also works:
spack install 'bzip2 cflags="-O2 -g"'
The only place where this heuristic runs into ambiguity is if you attempt to pass
anonymous specs that start with `name=...` as one large string. e.g., this will be
interpreted as one large flag value:
spack find 'cflags="-O2 -g" ~bar +baz'
This sets `cflags` to `"-O2 -g" ~bar +baz`, which is likely not what you wanted. You
can fix this easily by either removing the quotes:
spack find cflags="-O2 -g" ~bar +baz
Or by adding a space at the start, which has the same effect:
spack find ' cflags="-O2 -g" ~bar +baz'
You may wonder why we don't just look for quotes inside of flag arguments, and the
reason is that you *might* want them there. If you are passing arguments like:
spack install zlib cppflags="-D DEBUG_MSG1='quick fox' -D DEBUG_MSG2='lazy dog'"
You *need* the quotes there. So we've opted for one potentially confusing, but easily
fixed outcome vs. limiting what you can put in your quoted strings.
## Quotes in formatted spec output
In addition to being more lenient about characters accepted in quoted strings, this PR fixes
up spec formatting a bit. We now format quoted strings in specs with single quotes, unless
the string has a single quote in it, in which case we JSON-escape the string (i.e., we add
`\` before `"` and `\`).
zlib cflags='-D FOO="bar"'
zlib cflags="-D FOO='bar'"
zlib cflags="-D FOO='bar' BAR=\"baz\""
* adding necessary headers, to fix https://github.com/spack/spack/issues/41398
* deleting something imported by accident
* [@spackbot] updating style on behalf of yizeyi18
* undo commit 7688fed according to suggestion from @msimberg
* patching camp@:2022.10.1 for compatibility with gcc-13
* adding the patch
* fixing paths in the patch
* [@spackbot] updating style on behalf of yizeyi18
* Update camp patch using LLNL/camp@05e1c35
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* changing patch name
---------
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
`setup_dependent_package` is not a build phase, it should just set
globals for a package.
It's called during setup of the runtime environment of packages, and there
have been reports of it actually failing due to a read-only file system
(not sure under what exact conditions that is possible).
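For reference, a hedged sketch of what such a hook should be limited to; the package and attribute names are made up:
```python
from spack.package import *


class Mytool(Package):
    """Hypothetical provider that exposes its binary to dependent packages."""

    def setup_dependent_package(self, module, dependent_spec):
        # Only set module-level globals for the dependent's package module;
        # no filesystem writes, since this also runs when the prefix (or the
        # whole store) may be read-only.
        module.mytool = Executable(self.prefix.bin.mytool)
```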
This license annotation is currently invalid as it specifies a URL
rather than an SPDX expression. Remove it for now until we have a
consensus on how to represent this case.
* New package Model Coupling Toolkit (MCT)
* Remove ~mpi variant from mct, build is not working correctly
* Remove boilerplate stuff from var/spack/repos/builtin/packages/mct/package.py
MySQL was performing a core API call to `Spec.flat_dependencies`
when setting up the build environment. This function is an
implementation detail of the old concretizer, where multiple nodes
from the same package are not allowed.
This PR uses a more idiomatic way to check if "python" is
in the DAG.
For reference, see #11356 to check why the call was introduced.
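A hedged sketch of the more idiomatic check (the environment variable is illustrative, not what the mysql package actually sets):
```python
from spack.package import *


class ExamplePackage(Package):
    """Sketch only: asking the concretized DAG whether python is present."""

    def setup_build_environment(self, env):
        # containment check on the Spec instead of Spec.flat_dependencies()
        if "python" in self.spec:
            env.set("PYTHON_PREFIX", self.spec["python"].prefix)
```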
* [protobuf] New versions, explicit cxxstd variant
* New versions 3.15.8, 3.25.0, 3.25.1.
* New explicit variant `cxxstd` with support for older Protobuf
versions.
* Support testing.
* Use Protobuf's `protobuf_BUILD_SHARED_LIBS` instead of
`BUILD_SHARED_LIBS`.
* Support building with LLVM/Clang's `libc++`.
* Address audit issue
* Variant default does not honor `when` clause
* Use `self.spec.satisfies()` instead of `with when()`
* Fix silliness; improve consistency
* Today was apparently a go-back-to-bed day
* initial commit for rocm-5.7.0 and 5.7.1 releases
* bump up the version for 5.7.0 and 5.7.1 releases
* update recipes to support 5.7.0 and 5.7.1 releases
* bump up the version for ROCm 5.7.0 and ROCm-5.7.1 releases
* bump up the version for composable-kernel amd miopen-hip
* fix style errors
* fix style errors in hip etc
* renaming composable-kernel recipe
* changes for composable_kernel
* Revert "renaming composable-kernel recipe"
This reverts commit 0cf6c6debf.
* Revert "changes for composable_kernel"
This reverts commit 05272a10a7.
* bump up the version for hiprand
* using the checksum for hiprand-5.7.1
* bump up the version for 5.7.0 and 5.7.1 releases
* fix style errors
* fix merge conflicts with the develop.
* temp workaround for the error seen with rocm-5.7.0 when trying
to generate the dependency file for runtime/legion/legion_redop.cu
* fix build issue(work around) with legion
* add patch for migraphx package to turn off ck
* update to hip recipe
* fix hip-path detection inside llvm clang driver
* update llvm-amdgpu and rocm-validation-suite recipes
* fix style errors
* bump up the version for amdsmi for rocm-5.7.0 release
* add support for gfx941,gfx942 for rocm-5.7.0 release onwards
* revert changes to rocm.py file
* added gfx941 and gfx942 to rocm.py and add the gfx942 to kokkos and new checksum
the new version seem to support gfx942
* bump up the version for rccl for 5.7.1
* update the patch for rocm-openmp-extras for 5.7.0
* update mivisionx recipe for 5.7.0 release
* add new dependencies for rocfft tests
* port the fix for avx build, the start address of values_ buffer in KernelParameters is not
correct as it is computed based on 16-byte alignment
* set HIP_PATH=ROCM_PATH for 5.7.0 onwards
* address review comments
* revert adding xnack- and xnack+ to gfx940,gfx941,gfx942 as the prechecks were failing
* Add `signed` property to mirror config
* make unsigned a tri-state: true/false overrides mirror config, none takes mirror config
* test commands
* Document this
* add a test
The new feq-parse version includes fixes for the ifort and ifx compilers.
Additionally, evaluation of parser objects with multidimensional arrays is
now supported.
* samtools/htslib add latest version
* Given that I work at the same institute as the authors, I think it fair that I am willing to review changes; if it's complex I can ask them over tea.
---------
Co-authored-by: James Beal <jb23@sanger.ac.uk>
* New variants:
- `tmvz-cpu`
- `tmvz-gpu`
- `tmvz-pymva`
- `tmvz-sofie`
* Improve X-related dependencies.
* Improve TMVA-related dependencies with more specificity.
* Patch possible missing standard header include in Eve7.
* Patch Protobuf handling to support new Protobuf-provided CMake config
files required to handle transitive `abseil-cpp` dependence.
* Add missing terminal newline to `webgui` patch to remove patch
warning.
* Handle deprecated/removed build options.
* geant4/geant4-data: add builtin_clhep variant and v10.0.4.
* geant4: revert addition of builtin_clhep variant.
* geant4: fix vecgeom variant only being available for v10.3 and above.
Fix filter_compiler_wrappers for cases where the compiler returned is None; this happens on some installed gcc systems that do not have Fortran built into them as standard, e.g. gcc@11.4.0 on Ubuntu 22.04.
Before (hard to read, doesn't fit on small terminals):
```console
-I, --install-status show install status of packages
packages can be: installed [+], missing and needed by an installed package [-], installed in an upstream instance [^], or not installed (no annotation)
```
After (fits in 80 columns):
```console
-I, --install-status show install status of packages
[+] installed [^] installed in an upstream
- not installed [-] missing dep of installed package
```
* Fix cdash reporter time stamps (#38818).
The cdash reporter is created before packages are installed so save the
starttime then instead of the endtime.
* Use endtime instead of starttime for the endtime of update
---------
Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Participation in the venerable Spack google group has dwindled, though we still have
540+ subscribers there. I've made the mailing list announcement-only, and I've given
a few maintainers posting privileges.
This PR adds some notes to the README indicating that the mailing list is only for
announcements.
* Ensure that additional environment variables are set when a module
file is generated.
* Fixed the detection of the opal_prefix / MPI_ROOT field to use ompi_info.
---------
Co-authored-by: Greg Becker <becker33@llnl.gov>
* depend_on python
There is an ill-named variant "python" that enables the pytrilinos1
variant. This made it through our testing but broke on our actual
CI test machines.
* adjust "python" variant based on Trilinos version
For Trilinos <= 14, enable PyTrilinos(1). For later versions
of Trilinos, enable PyTrilinos2.
We still support directly enabling PyTrilinos2 via the "pytrilinos2"
variant.
* remove pytrilinos2 variant
* correct depends_on constraints
- we don't have a fallback if make is not installed
- we assume file system locking works
- we don't verify that make is GNU make (bootstrapping fails on FreeBSD as a result)
- there are some weird race conditions in writing spack.yaml on concurrent spack install
- the view is updated after every package install instead of post environment install.
Forbid nested dependencies in `depends_on` declarations by running an audit in CI (a sketch of the audited pattern follows the list below).
Fix the packages not passing the new audit:
- amd-aocl
- exago
- palace
- shapemapper
- xsdk-examples
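A hedged illustration of what the audit flags and one common shape of the fix; the specs are made up:
```python
from spack.package import *


class Example(Package):
    """Hypothetical package showing the audited pattern."""

    # flagged by the audit: a nested (transitive) constraint inside depends_on
    # depends_on("superlu-dist ^parmetis+int64")

    # preferred: state the direct dependency, and constrain the transitive
    # one with its own conditional depends_on
    depends_on("superlu-dist")
    depends_on("parmetis+int64", when="^parmetis")
```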
ginkgo: add a commit sha to v1.5.0.glu_experimental
* New variants:
- `tmvz-cpu`
- `tmvz-gpu`
- `tmvz-pymva`
- `tmvz-sofie`
* Improve X-related dependencies.
* Improve TMVA-related dependencies with more specificity.
* Patch possible missing standard header include in Eve7.
* Patch Protobuf handling to support new Protobuf-provided CMake config
files required to handle transitive `abseil-cpp` dependence.
* Add missing terminal newline to `webgui` patch to remove patch
warning.
* Handle deprecated/removed build options.
* Handle unwanted system paths in various `PATH`-like environment
variables.
The condition on swig can be interpreted as "true if true,
false if false" and gives clingo the option to add swig
or not.
If no other optimization criteria break the tie, then
the concretization is non-deterministic.
* added external libxc/elpa choice
* fixed formatting issues and 1 unused variant found by reviewer
* try to fix a string formatting issue
* try to fix some other string formatting issues
* fixed 1 flake8 style issue
* use explicit fftw-api@3
* Add `url_list` to facilitate finding new versions.
* `cxxstd` is not meaningful when `@:2.99.99` as it was a header-only
package before v3.
* Support C++20/23, remove C++14 support.
* Add @greenc-FNAL to maintainers.
* Add CMake arguments to support testing, build of extras and examples.
* Only build tests for proj package if required
Even if tests are not explicitly required to be built, proj builds them
anyway and tries to download Google Test.
* proj: fix name of test activation flag
* proj: Always set test activation flag
* proj: Patch test activation logic for versions 5.x
* py-python-pptx: new package
* py-python-pptx: use pil instead of pillow, remove version constraint on python
---------
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
* py-tldextract: new package
* py-tldextract: add version 5.1.1
* py-tldextract: fix version constraint on py-setuptools-scm
---------
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
* Add a new version of ruff
* Add a comment about where the dependency can be found
---------
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
* Update py-werkzeug version dependency for py-graphene-tornado@2.6.1
* Add note on diverging version requirements for py-werkzeug in py-graphene-tornado
* legion: correct cuda dependency for cr version
* Update var/spack/repos/builtin/packages/legion/package.py
---------
Co-authored-by: Davis Herring <herring@lanl.gov>
* py-gidgethub: add new package
* Add main branch version and scope flit/flit-core dependency
* Update var/spack/repos/builtin/packages/py-gidgethub/package.py
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
* Add optional dependencies as variants of package
* Add git url for main version
* Fix variant and dependency ordering
---------
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
* gidgetlab: add new package
* Convert both cachetools and aiohttp to optional deps with variants
* Fix forgotten variant conditional on cachetools dependency
* Add git url and main version for dev workflows
* Fix variant and dependency ordering
* Remove cachetools variant and merge dependency with aiohttp variant
* grep WM_PROJECT_VERSION from etc/bashrc
This fixes the problem when building from a manually checked-out repo
which might have a different version with respect to the one defined in the
Spack package (e.g. anything later than 5.0 is known as 5.x by the build
system).
* patch applies to just 5.0, in newer versions it is already addressed
In `5.20171030` there's a commit
c66fba323c
very similar (almost identical) to what the patch `50-etc.patch` does.
So the patch should not be applied to versions other than `5.0`, otherwise it errors.
References:
- https://github.com/OpenFOAM/OpenFOAM-5.x/commits/20171030/etc/bashrc
- 197d9d3bf2/etc/bashrc (L45-L47)
This was missed while backporting the new `spack info` command from #40326.
Variants should be sorted by name when invoking `spack info --variants-by-name`.
Intel made an incompatible change in XED in 2023.08.21 that breaks
hpctoolkit (at run time). Hpctoolkit develop can adapt (soon will),
but older versions must use xed @:2023.07.09.
* Updating the LBANN, Hydrogen, and DiHydrogen recipes for both new
variants and to make sure that RPATHs are properly setup.
Co-authored-by: bvanessen <bvanessen@users.noreply.github.com>
Deprecating intel package, which contains intel classic compilers. This package has not been updated in 3 years. Please use intel-oneapi-compilers instead.
* gipaw.x installed by cmake if version >= 5c4a4ce.
gipaw.x will only be installed with cmake if the qe-gipaw version
is >= 5c4a4ce. Currently, QE source uses the older f5823521 one.
Here is a patch to submodule_commit_hash_records to use a newer
qe-gipaw version.
* Update package.py
* Delete var/spack/repos/builtin/packages/quantum-espresso/gipaw-eccee44.patch
* Update package.py
* Restoring gipaw-eccee44 patch
* Update package.py
* Add fox variant in quantum-espresso
* Fix an issue introduced in #36484. Patches are 7.1 only.
* Change plugin handling.
* formatting.
* Typo correction
* Refine conflict
---------
Co-authored-by: S. Alexis Paz <alexis.paz@gmail.com>
* Add EGL support to ParaView and Glew
add a package for egl that provides GL but also adds
EGL libs and headers for projects that need them
Fix a header problem with the opengl package
Format files using black
* better description for egl variant description
Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>
* better check/setup of non-egl variant dependencies
Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>
* Add biddisco as maintainer
* Fix unused var style warning
* Add egl conflicts for other gl providers
---------
Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>
* py-pyglet: version bump
* py-pyglet: use zip instead of whl, update dependencies
* py-pyglet: 2.0.9 and 2.0.10 zips should be downloaded from github
* py-pyglet: style
* py-pyglet: use virtual packages in dependencies
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
* py-pyglet: doesn't depend on py-future any more
* py-pyglet: remove glx dependency
* py-pyglet: back to the pypi zipfiles with patch instead
---------
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
* Add initial version of verible to spack
* Update to use an explicit url path for each release, as the release tag includes extra data; also added the lowest supported gcc, gcc9
* py-pycma: new package
* rename py-pycma in py-cma; py-cma: use pypi instead of github sources
---------
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
* lammps: add new stable version 20230802.1
* lammps: add missing potential download for +mesont
* lammps: fix python package install
* Update var/spack/repos/builtin/packages/lammps/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* lammps: py-numpy and py-mpi4py should be build and run deps
* lammps: add new 20231121 release
- MPIIO package has been removed -> disable mpiio variant
- LAMMPS_EXCEPTIONS is now always on -> disable exceptions variant
- CMake 3.16+ is now required
- Kokkos 4.1.0 is now supported
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add missing runtime dependency on py-colorama to py-ansimarkup
* Add py-metomi-isodatetime@3.1.0
* New package py-graphql-relay
* Update py-cylc-flow, add version 8.2.3
* Fix merge conflict
* Revert mistake in var/spack/repos/builtin/packages/py-cylc-flow/package.py
* Update py-metomi-isodatetime dependencies for py-cylc-flow
* Add 'climbfuji' to list of maintainers for py-cylc-flow
* py-beartype: new package with version 0.15.0
* Update var/spack/repos/builtin/packages/py-beartype/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-beartype: depend on python 3.8 or higher
* py-beartype: add new version 0.16.2
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-mpi4py: fix build with Apple Clang
* [@spackbot] updating style on behalf of adamjstewart
---------
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
* mochi-thallium: added a few newer versions
* mochi-thallium: added constraint on the version of margo required
* removed thallium 0.12 which needs updating
* mochi-thallium: fixed hash for version 0.12.0
* removed thallium 0.12 which needs updating (again)
When using `spack external find acfl`, we get the full version string
with 4 components in `packages.yaml`. This PR truncates the version
number when finding the `armpl` component to be able to run without
intervention.
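A minimal illustration of the truncation idea, assuming Spack's `spack.version.Version` class and its `up_to()` helper; the version string is made up and the actual acfl/armpl lookup is not shown:
```python
from spack.version import Version

# Hypothetical 4-component version as reported by `spack external find acfl`.
full = Version("23.04.1.1")

# Truncate to the first three components when locating the armpl component.
print(full.up_to(3))  # -> 23.04.1
```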
* developer tools stack try 2
This version is actually in use locally and has largely stabilized, at
least on x86. Some packages are still a challenge on ppc64le, but it may be
worth keeping this working as a set.
* add packages, try to get container with newer gcc
* remove reuse: true
* try to get cmake to build on medium, 25 minutes is too long
* add lsd package and add to dev tools stack
* clean up fzf dependency and sorting
* Update share/spack/gitlab/cloud_pipelines/stacks/developer_tools/spack.yaml
* cuda: add 12.3.0 (#40827)
* Switch to dashes
* yet more underscores
---------
Co-authored-by: Paul R. C. Kent <kentpr@ornl.gov>
This looks to me like the best compromise regarding externals in a
build cache. I wouldn't want `spack install` on my machine to install
specs that were marked external on another. At the same time there are
centers that control the target systems on which Spack is used, and
would want to use externals in buildcaches.
As a solution, reuse concretization will now consider those externals
used in buildcaches that match a locally configured external in
packages.yaml.
So for example person A installs and pushes specs with this config:
```yaml
packages:
  ncurses:
    externals:
    - spec: ncurses@6.0.12345 +feature
      prefix: /usr
```
and person B concretizes and installs using that buildcache with the
following config:
```yaml
packages:
  ncurses:
    externals:
    - spec: ncurses@6
      prefix: /usr
```
the spec will be reused (or rather, will be considered for reuse...)
* gipaw.x installed by cmake if version >= 5c4a4ce.
gipaw.x will only be installed with cmake if the qe-gipaw version
is >= 5c4a4ce. Currently, QE source uses the older f5823521 one.
Here is a patch to the submodule_commit_hash_records to use a newer
qe-gipaw version.
* initial commit to update hipblas rocalution, rocsolver, rocsparse to new syntax
* add rocblas test changes and fixes for hipblas and rocsolver tests
* fix styling
* remove updates for rocblas
* solver: use a unique counter for condition, triggers and effects
* Do not reset counters when re-running setup
What we need is just a unique ID; it doesn't need
to start from zero every time.
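A minimal sketch of the idea (not the actual solver code): keep a single module-level counter so that condition, trigger and effect IDs stay unique across repeated setup runs instead of being reset each time.
```python
import itertools

# One counter for the lifetime of the process: IDs drawn from it never collide,
# even when setup() runs more than once.
_id_counter = itertools.count()

def next_id() -> int:
    return next(_id_counter)

print(next_id(), next_id(), next_id())  # -> 0 1 2
```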
* Update SST packages to 13.1.0
* Allow mismatch between sst-core dependency and current macro version
* SST does not work with Python 3.12 yet
* Sanity check install binaries for sst-core
* Elements compiles with OTF2 but not OTF
* Version bounds in specs are inclusive
* Remove not-strictly-necessary file check
* oneapi 2024.0.0 release
* oneapi v2 directory support and some cleanups
* sycl abi change requires 2024 compilers for packages that use sycl
---------
Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
* update spack recipe
* [@spackbot] updating style on behalf of toxa81
* change from @develop to @7.5.0
* return dependency on boost_filesystem
* return dependency on boost_filesystem
* remove boost filesystem as agreed by @RMeli and @simonpintarelli
---------
Co-authored-by: toxa81 <toxa81@users.noreply.github.com>
* Update to latest version
* Add dependency
* revert
* address PR comments
* Correct dependencies for 0.7 to 0.8 transition
* Fix cmake line.
* Update nanobind dep
---------
Co-authored-by: Matt Archer <ma595@cam.ac.uk>
Co-authored-by: Jack S. Hale <mail@jackhale.co.uk>
Co-authored-by: Garth N. Wells <gnw20@cam.ac.uk>
* add 0.12.0
* remove whitespace
* update deps
* Update var/spack/repos/builtin/packages/py-neo/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add dep for python 3.8+
* add dep for python 3.8+ with 0.12.0
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
PR #40929 reverted the argument parsing to make `spack --verbose
install` work again. It looks like `--verbose` is the only instance
where this kind of argument inheritance is used since all other commands
override arguments with the same name instead. For instance, `spack
--bootstrap clean` does not invoke `spack clean --bootstrap`.
Therefore, fix multi-line aliases again by parsing the resolved
arguments and instead explicitly pass down `args.verbose` to commands.
This commit discards type mismatches or failures to validate a package preference during concretization. The values discarded are logged as debug level messages. It also adds a config audit to help users spot misconfigurations in packages.yaml preferences.
This roughly restores the order of operations from Spack 0.20,
where `AutotoolsPackage.setup_build_environment` would
override the env variable set in `setup_platform_environment` on
macOS.
When improving the error messages, we started `#show`ing a lot more
symbols in the answer set, but we forgot to suppress the debug
messages warning about UNKNOWN SYMBOLs.
* onnxruntime: fix the call to as_string() operator
* Update var/spack/repos/builtin/packages/py-onnxruntime/package.py
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
* Update var/spack/repos/builtin/packages/py-onnxruntime/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-onnxruntime/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-onnxruntime: rm now-unused stringpiece_1_10.patch
---------
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* packages/hub: add new version, update to module
Hub now uses a Go module to build and needs different env vars, and before
that we were on a very, very old version. Deprecate the old ones so we
can clean out that old build logic once a Spack version has passed.
* cleanup suggested by @adamjstewart
* Update to latest version
* Fix linebreak
* Make suggested changes
* bumped to 0.6.1
* Update var/spack/repos/builtin/packages/py-scikit-build-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Chris Richardson <cnr12@cam.ac.uk>
Co-authored-by: Matt Archer <ma595@cam.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
In practice, one can only compile for the Intel Data Center Max GPU
via a SYCL build and the oneAPI compiler. This is unlikely to change,
so we can be explicit about that.
* elbencho add new version and git master branch
* Update var/spack/repos/builtin/packages/elbencho/package.py
Co-authored-by: Alec Scott <alec@bcs.sh>
* formatting fix requested by @alecbcs
* remove whitespace added in blank line by github auto resolve
---------
Co-authored-by: Ethan W <mail@ethanwilliams.xyz>
Co-authored-by: Alec Scott <alec@bcs.sh>
* gromacs: Add new variants and clarify existing ones
Add new variants that reflect existing capabilities and defaults in
the upstream build system. Add other existing constraints that were
not yet specified.
* conform to style
* Fix missing hyphens
* Correct cmake variable names
* add elephant version v0.12.0 and 0.13.0
* update copyright
* reformat according to black format errors
* restore maintainers directive
* Update var/spack/repos/builtin/packages/py-elephant/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add dependency python 3.8+
* sorted dependencies
* sort dependencies from newest to oldest
* add deps for @master
* removed dependency for master, since it is included in 0.12.0:
* removed dependency for python 3.7+, since 3.7+ is the lowest supported version anyway
* removed specific deps for master, since master is always newer than all stable releases
* updated numpy dependency for Elephant 0.12.0:
* Update var/spack/repos/builtin/packages/py-elephant/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-elephant/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* removed upper bounds for py-quantities, omitting v0.14.0
* add elephant v0.14.0
* update required quantities version
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-cleo: add versions 2.0.0 2.0.1; add maintainers
* py-cleo: add forgotten dependency
* py-cleo: update from review: remove preferred version, remove a dependency, fix py-rapidfuzz version
* py-cleo: deprecate version 1.0.0a5; add version 1.0.0; update dependencies
* py-cleo: add version 2.1.0; update version range of dependencies
* py-crashtest: add version 0.4.1, a dependency of py-cleo
* py-cleo: update dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-cleo: update dependency py-clikit
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-cleo: update dependency py-rapidfuzz
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-rapidfuzz: add version 2.2.0, a dependency of py-cleo@2
* py-cleo: fix version range of py-crashtest
* py-rapidfuzz: fix dependencies; add py-rapidfuzz-capi and py-jarowinkler
---------
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Using Python 3.12 in a freshly cloned Spack repository results in
warnings such as this:
```
==> Warning: invalid escape sequence '\$'
==> Warning: invalid escape sequence '\('
==> Warning: invalid escape sequence '\.'
==> Warning: invalid escape sequence '\.'
```
These will turn into errors in 3.13, so fix them. All of them actually
do not need to be regexes, so convert them into normal strings.
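An illustrative (made-up) example of the two ways such a warning goes away, depending on whether the string is really used as a regex:
```python
import re

# "\$\{PREFIX\}" as a *normal* string literal warns on Python 3.12+ (error in 3.13).
pattern = r"\$\{PREFIX\}"   # raw string: fine when a regex is genuinely needed
literal = "${PREFIX}"       # plain string without escapes: the route taken here,
                            # since these strings were never regexes to begin with

print(re.search(pattern, "prefix=${PREFIX}/bin") is not None)  # True
print(literal in "prefix=${PREFIX}/bin")                       # True
```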
* add version 0.14.1
* formatting
* style checks
* fix style errors
* remove old versions
* fix typo
* style
* update maintainers directive
* sort dependencies from newest to oldest
* Update var/spack/repos/builtin/packages/py-quantities/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-quantities/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* sort dependencies from newest to oldest
* removed upper bounds for python version
* Update var/spack/repos/builtin/packages/py-quantities/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* remove dependency on Python 3.7+, since 3.7 is the lowest supported version anyway
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Added recent versions to ecflow/package.py, as well as a cxxstd variant that is
needed to set BOOST_NO_CXX98_FUNCTION_BASE appropriately when building with the C++17 standard.
* Fixed pep8 style error in the ecflow package.py script.
* Removed cxxstd variant since the ecflow cmake configuration was already specifying
to use the c++17 standard for newer versions. The use of the BOOST_NO_CXX98_FUNCTION_BASE
define is now triggered by the ecflow version.
* Permit packages that depend on Intel oneAPI packages to access sdk
* Implement and use IntelOneapiLibraryPackageWithSdk
* Restore libs property to IntelOneapiLibraryPackage
* Conform to style
* Provide new class to infrastructure
* Treat sdk/include as the main include
* root: new version 6.30.00
There is a new release of ROOT, v6.30.00, with release notes at https://root.cern/doc/v630/release-notes.html.
In addition to some deprecations of build options, this updates the C++ standard to 17 or higher (well, 20), and increases the vc minimum version.
* vc: new version 1.4.4
* [@spackbot] updating style on behalf of wdconinc
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Improves the warning for deprecated preferences, and adds a configuration
audit to get files:lines details of the issues.
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* package/lemon: improve
* fix bug
* final improvements
* use f strings for boolean options, add soplex as TODO
* leave +coin as TODO
* depends on bzip2 when +coin
* tidy
---------
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
* initial commit to enable rocAL and add MIVisionX tests
* fix styling
* updated checksum for libjpeg patches
* update for 5.6
* use satisfies for checking spec version
xsdk: add +sycl variant - with amrex, arborx, ginkgo, petsc, sundials
xsdk: add +pflotran variant
xsdk: enable hypre+rocm
xsdk: enable superlu-dist for GPU - but use trilinos~superlu-dist [as that breaks builds]
xsdk: dealii: disable oce as it can cause intel-tbb-2017.6 to be picked up for some builds (for ex: gcc=13) and result in subsequent build failures
Some environments may have `dd4hep` as a concretized package without having it installed (yet). For those environments, `dd4hep` has a `libs` property that is an empty list. Nevertheless, it can be added to a run environment (for example when `dd4hep` is part of an environment). This results in an IndexError:
```
==> Warning: couldn't load runtime environment due to IndexError: list index out of range
```
To avoid the IndexError, only prepend the `dd4hep` libs if there are actually libs found.
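A minimal sketch of the guard, assuming a typical `setup_dependent_run_environment` hook; the exact hook and variable used by the real dd4hep recipe may differ:
```python
from spack.package import *

class Dd4hepLike(Package):
    """Hypothetical sketch: guard against an empty library list."""

    def setup_dependent_run_environment(self, env, dependent_spec):
        libs = self.libs  # can be empty when concretized but not (yet) installed
        if libs:          # only prepend when libraries were actually found
            env.prepend_path("LD_LIBRARY_PATH", libs.directories[0])
```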
Tests didn't cover the new `--variants-by-name` parameter in #40998.
Add some parameterization to hit that.
This changeset makes me think that the main section-printing loop in `spack info` isn't
factored so well. It makes it difficult to pass different arguments to different helper
functions. I could break it out into if statements if folks think that would be cleaner.
* dealii: 9.5.0
* kokkos+cuda_lambda
* dealii ^kokkos@3.7: require +cuda +cuda_lambda +wrapper
* Added 9.5.1, try ~cgal when +cuda
* Forward Cuda architecture request
* Remove workaround
* Try not enforcing the Kokkos compiler
* Enforce using nvcc_wrapper with Trilinos+Cuda
* Don't define CMAKE_*_COMPILER to point to MPI wrappers
* Use the same compiler as Trilinos/Kokkos
* Only check for Trilinos compiler
* Disable Trilinos+Cuda
* Disable Cuda support
* Try CUDA build without ninja
* Combined examples and examples_compile
* Use f-string for cuda_arch
* p -> _package
* Indentation
* Fix up f-string
---------
Co-authored-by: Luca Heltai <luca.heltai@sissa.it>
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
We have two ways to concretize now:
* `spack concretize` concretizes only the root specs that are not concrete in the environment.
* `spack concretize -f` eliminates all cached concretization data and reconcretizes the *entire* environment.
This PR adds `spack deconcretize`, which eliminates cached concretization data for a spec. This allows
users greater control over what is preserved from their `spack.lock` file and what is reused when not
using `spack concretize -f`. If you want to update a spec installed in your environment, you can call
`spack deconcretize` on it, and that spec and any relevant dependents will be removed from the lock file.
`spack deconcretize` has two options:
* `--root`: limits deconcretized specs to *specific* roots in the environment. You can use this to
deconcretize exactly one root in a `unify: false` environment, i.e., if roots `foo` and `bar` are both
in the environment and `foo` depends on `bar`, `spack deconcretize bar` will *not* deconcretize `foo`.
* `--all`: deconcretize *all* specs that match the input spec. By default `spack deconcretize`
will complain about multiple matches, like `spack uninstall`.
The ^mkl pattern was used to refer to three packages
even though none of the software using it actually depended
on "mkl".
This pattern, which follows Hyrum's law, is now being
removed in favor of a more explicit one.
In this PR gromacs, abinit, lammps, and quantum-espresso
are modified.
Intel packages are also modified to provide "lapack"
and "blas" together.
And improve the error message (load vs unload).
Of course you could have some uninstalled dependency too, but as long as
it doesn't implement `setup_run_environment` etc., I don't think it hurts
to attempt to load the root anyway, given that failure to do so is a
warning, not a fatal error.
This changes variant display to use a much more legible format, and to use screen space
much better (particularly on narrow terminals). It also adds color the variant display
to match other parts of `spack info`.
Descriptions and variant value lists that were frequently squished into a tiny column
before now have closer to the full terminal width.
This change also preserves any whitespace formatting present in `package.py`, so package
maintainers can write easier-to-read descriptions of variant values if they want. For
example, `gasnet` has had a nice description of the `conduits` variant for a while, but
it was wrapped and made illegible by `spack info`. That is now fixed and the original
newlines are kept.
Conditional variants are grouped by their when clauses by default, but if you do not
like the grouping, you can display all the variants in order with `--variants-by-name`.
I'm not sure when people will prefer this, but it makes it easier to tell that a
particular variant is/isn't there. I do think grouping by `when` is the better default.
* [lcov] Add build and runtime deps necessary for lcov@2.0.0:
+ Many additional Perl package dependencies are required for the new version of lcov.
+ Some of the new dependencies were not known to spack until now.
* Style fix
This commit improves forward compatibility of Spack with newer build cache metadata formats.
Before this commit, invalid or unrecognized metadata would be fatal errors, now they just cause
a mirror to be skipped.
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Libgit2 requires Python as a build dependency. I was getting an error because it was falling back to the system Python, which is compiled with Intel compilers; `libgit2` then failed because it couldn't find `libimf.so` (which doesn't make sense).
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
The libevent release tarballs ship with a `configure` script generated by an old `libtool`. The `libtool` generated by `configure` is not compatible with `MACOSX_DEPLOYMENT_VERSION` > 10. Regeneration of the `configure` scripts fixes build on macOS.
Original configure contains:
```
case $host_os in
  rhapsody* | darwin1.[012])
    _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;;
  darwin1.*)
    _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
  darwin*) # darwin 5.x on
    # if running on 10.5 or later, the deployment target defaults
    # to the OS version, if on x86, and 10.4, the deployment
    # target defaults to 10.4. Don't you love it?
    case ${MACOSX_DEPLOYMENT_TARGET-10.0},$host in
      10.0,*86*-darwin8*|10.0,*-darwin[91]*)
        _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
      10.[012][,.]*)
        _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
      10.*)
        _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
    esac
```
After re-running `autogen.sh`:
```
case $host_os in
  rhapsody* | darwin1.[012])
    _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;;
  darwin1.*)
    _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
  darwin*)
    case $MACOSX_DEPLOYMENT_TARGET,$host in
      10.[012],*|,*powerpc*-darwin[5-8]*)
        _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
      *)
        _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
    esac
```
* lcov: add version2, perl dep at build and runtime
* lcov: add runtime deps
* namespace-autoclean: new perl package
* datetime: dep on autoclean
* formatting
* abinit: add v9.10.3
Changed configure arguments for specifying how to use Wannier90 for versions
after 9.8.
When the mpi variant is requested, set the F90 environment variable to point
to the MPI Fortran wrapper when building versions after 9.8 instead of FC.
---------
Co-authored-by: Alec Scott <hi@alecbcs.com>
* exago: fix v1.5.1 tag; only allow python up to 3.10 for @:1.5 (#40676)
* exago: fix v1.5.1 tag; only allow python up to 3.10 for @:1.5 due to pybind error with py 3.11
* hiop@:1.0 +cuda: constrain to cuda@:11.9
* fixes syntax of maintainers
---------
Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
* intel-xed: fix git hash for mbuild, add version 2023.10.11
Fixes #40912
* Fix the git commit hash for mbuild 2022.04.17. This was broken in
commit eef9939c21 by mixing up the hashes for xed versus mbuild.
* Add versions 2023.08.21 and 2023.10.11.
* fix style
Before this PR, variants were not propagated to leaf nodes that could accept
the propagated value, if some intermediate node couldn't accept it.
This PR fixes that issue by marking nodes as "candidate" for propagation
and by setting the variant only if it can be accepted by the node.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Problem: the current configure arguments are added as lists to a list,
but strings need to be added to the same list.
Solution: ensure we add each item (string) separately.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
* Use `major.minor.patch`, `major.minor`, `major` in tags
* Ensure `latest` is the semver largest version, and not "latest in time"
* Remove Ubuntu 18.04 from the list of images
Modify the packages.yaml schema so that soft-preferences on targets,
compilers and providers can only be specified under the "all" attribute.
This makes them effectively global preferences.
Version preferences instead can only be specified under a package
specific section.
If a preference attribute is found in a section where it should
not be, it will be ignored and a warning is printed to screen.
Most queries will end up calling `spec.satisfies(query)` on everything in the DB, which
will cause Spack to ask whether the query spec is virtual if its name doesn't match the
target spec's. This can be expensive, because it can cause Spack to check if any new
virtuals showed up in *all* the packages it knows about. That can currently trigger
thousands of `stat()` calls.
We can avoid the virtual check for most successful queries if we consider that if there
*is* a match by name, the query spec *can't* be virtual. This PR adds an optimization to
the query loop to save any comparisons that would trigger a virtual check for last.
- [x] Add a `deferred` list to the `query()` loop.
- [x] First run through the `query()` loop *only* checks for name matches.
- [x] Query loop now returns early if there's a name match, skipping most `satisfies()` calls.
- [x] Second run through the `deferred()` list only runs if query spec is virtual.
- [x] Fix up handling of concrete specs.
- [x] Add test for querying virtuals in DB.
- [x] Avoid allocating deferred if not necessary.
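A rough sketch of the deferral idea (simplified, not the actual `Database.query()` code):
```python
def query(specs_in_db, query_spec):
    matches, deferred = [], []
    # First pass: cheap name comparison only; a name match can never be a
    # virtual query, so satisfies() here won't trigger the virtual check.
    for spec in specs_in_db:
        if spec.name == query_spec.name:
            if spec.satisfies(query_spec):
                matches.append(spec)
        else:
            deferred.append(spec)
    # Second pass: only needed when the query spec may be virtual.
    if query_spec.virtual:
        matches.extend(s for s in deferred if s.satisfies(query_spec))
    return matches
```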
---------
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Currently there's some hacky logic in the AppleClang compiler that makes
it also accept `gfortran` as a fortran compiler if `flang` is not found.
This is guarded by `if sys.platform` checks so that it only applies to
Darwin.
But on Linux the feature of detecting mixed toolchains is highly
requested too, because it's rather annoying to run into a failed build of
`openblas` after dozens of minutes of compiling its dependencies, just
because clang doesn't have a fortran compiler.
In particular in CI where the system compilers may change during system
updates, it's typically impossible to fix compilers in a hand-written
compilers.yaml config file: the config will almost certainly be outdated
sooner or later, and maintaining one config file per target machine and
writing logic to select the correct config is rather undesirable too.
---
This PR introduces a flag `spack compiler find --mixed-toolchain` that
fills out missing `fc` and `f77` entries in `clang` / `apple-clang` by
picking the best matching `gcc`.
It is enabled by default on macOS, but not on Linux, matching current
behavior of `spack compiler find`.
The "best matching gcc" logic and compiler path updates are identical to
how compiler path dictionaries are currently flattened "horizontally"
(per compiler id). This just adds logic to do the same "vertically"
(across different compiler ids).
So, with this change on Ubuntu 22.04:
```
$ spack compiler find --mixed-toolchain
==> Added 6 new compilers to /home/harmen/.spack/linux/compilers.yaml
gcc@13.1.0 gcc@12.3.0 gcc@11.4.0 gcc@10.5.0 clang@16.0.0 clang@15.0.7
==> Compilers are defined in the following files:
/home/harmen/.spack/linux/compilers.yaml
```
you finally get:
```
compilers:
- compiler:
    spec: clang@=15.0.7
    paths:
      cc: /usr/bin/clang
      cxx: /usr/bin/clang++
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    flags: {}
    operating_system: ubuntu23.04
    target: x86_64
    modules: []
    environment: {}
    extra_rpaths: []
- compiler:
    spec: clang@=16.0.0
    paths:
      cc: /usr/bin/clang-16
      cxx: /usr/bin/clang++-16
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    flags: {}
    operating_system: ubuntu23.04
    target: x86_64
    modules: []
    environment: {}
    extra_rpaths: []
```
The "best gcc" is automatically default system gcc, since it has no
suffixes / prefixes.
Add a new config section: `config:aliases`, which is a dictionary mapping aliases
to commands.
For instance:
```yaml
config:
  aliases:
    sp: spec -I
```
will define a new command `sp` that will execute `spec` with the `-I`
argument.
Aliases cannot override existing commands, and this is ensured with a test.
We cannot currently alias subcommands. Spack will warn about any aliases
containing a space, but will not error, which leaves room for subcommand
aliases in the future.
---------
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
* Test that setup_run_environment changes to CC/CXX/FC/F77 are dropped in build env
* compilers set in run env shouldn't impact build
Adds `drop` to EnvironmentModifications courtesy of @haampie, and uses
it to clear modifications of CC, CXX, F77 and FC made by
`setup_{,dependent_}run_environment` routines when producing an
environment in BUILD context.
* comment / style
* comment
---------
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
This adds a rather trivial context manager that lets you deduplicate repeated
arguments in directives, e.g.
```python
depends_on("py-x@1", when="@1", type=("build", "run"))
depends_on("py-x@2", when="@2", type=("build", "run"))
depends_on("py-x@3", when="@3", type=("build", "run"))
depends_on("py-x@4", when="@4", type=("build", "run"))
```
can be condensed to
```python
with default_args(type=("build", "run")):
depends_on("py-x@1", when="@1")
depends_on("py-x@2", when="@2")
depends_on("py-x@3", when="@3")
depends_on("py-x@4", when="@4")
```
The advantage is that it's clearer for humans; the downside is that it's less clear for type checkers due to type erasure.
Create chains of causation for error messages.
The current implementation is only completed for some of the many errors presented by the concretizer. The rest will need to be filled out over time, but this demonstrates the capability.
The basic idea is to associate conditions in the solver with one another in causal relationships, and to associate errors with the proximate causes of their facts in the condition graph. Then we can construct causal trees to explain errors, which will hopefully present users with useful information to avoid the error or report issues.
Technically, this is implemented as a secondary solve. The concretizer computes the optimal model, and if the optimal model contains an error, then a secondary solve computes causation information about the error(s) in the concretizer output.
Examples:
$ spack solve hdf5 ^cmake@3.0.1
==> Error: concretization failed for the following reasons:
1. Cannot satisfy 'cmake@3.0.1'
2. Cannot satisfy 'cmake@3.0.1'
required because hdf5 ^cmake@3.0.1 requested from CLI
3. Cannot satisfy 'cmake@3.18:' and 'cmake@3.0.1
required because hdf5 ^cmake@3.0.1 requested from CLI
required because hdf5 depends on cmake@3.18: when @1.13:
required because hdf5 ^cmake@3.0.1 requested from CLI
4. Cannot satisfy 'cmake@3.12:' and 'cmake@3.0.1
required because hdf5 depends on cmake@3.12:
required because hdf5 ^cmake@3.0.1 requested from CLI
required because hdf5 ^cmake@3.0.1 requested from CLI
$ spack spec cmake ^curl~ldap # <-- with curl configured non-buildable and an external with `+ldap`
==> Error: concretization failed for the following reasons:
1. Attempted to use external for 'curl' which does not satisfy any configured external spec
2. Attempted to build package curl which is not buildable and does not have a satisfying external
attr('variant_value', 'curl', 'ldap', 'True') is an external constraint for curl which was not satisfied
3. Attempted to build package curl which is not buildable and does not have a satisfying external
attr('variant_value', 'curl', 'gssapi', 'True') is an external constraint for curl which was not satisfied
4. Attempted to build package curl which is not buildable and does not have a satisfying external
'curl+ldap' is an external constraint for curl which was not satisfied
'curl~ldap' required
required because cmake ^curl~ldap requested from CLI
$ spack solve yambo+mpi ^hdf5~mpi
==> Error: concretization failed for the following reasons:
1. 'hdf5' required multiple values for single-valued variant 'mpi'
2. 'hdf5' required multiple values for single-valued variant 'mpi'
Requested '~mpi' and '+mpi'
required because yambo depends on hdf5+mpi when +mpi
required because yambo+mpi ^hdf5~mpi requested from CLI
required because yambo+mpi ^hdf5~mpi requested from CLI
3. 'hdf5' required multiple values for single-valued variant 'mpi'
Requested '~mpi' and '+mpi'
required because netcdf-c depends on hdf5+mpi when +mpi
required because netcdf-fortran depends on netcdf-c
required because yambo depends on netcdf-fortran
required because yambo+mpi ^hdf5~mpi requested from CLI
required because netcdf-fortran depends on netcdf-c@4.7.4: when @4.5.3:
required because yambo depends on netcdf-fortran
required because yambo+mpi ^hdf5~mpi requested from CLI
required because yambo depends on netcdf-c
required because yambo+mpi ^hdf5~mpi requested from CLI
required because yambo depends on netcdf-c+mpi when +mpi
required because yambo+mpi ^hdf5~mpi requested from CLI
required because yambo+mpi ^hdf5~mpi requested from CLI
Future work:
In addition to fleshing out the causes of other errors, I would like to find a way to associate different components of the error messages with different causes. In this example it's pretty easy to infer which part is which, but I'm not confident that will always be the case.
See the previous PR #34500 for discussion of how the condition chains are incomplete. In the future, we may need custom logic for individual attributes to associate some important choice rules with conditions such that clingo choices or other derivations can be part of the explanation.
---------
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
This PR implements the concept of "default environment", which doesn't have to be
created explicitly. The aim is to lower the barrier for adopting environments.
To (create and) activate the default environment, run
```
$ spack env activate
```
This mimics the behavior of
```
$ cd
```
which brings you to your home directory.
This is not a breaking change, since `spack env activate` without arguments
currently errors. It is similar to the already existing `spack env activate --temp`
command which always creates an env in a temporary directory, the difference
is that the default environment is a managed / named environment named `default`.
The name `default` is not a reserved name, it's just that `spack env activate`
creates it for you if you don't have it already.
With this change, you can get started with environments faster:
```
$ spack env activate [--prompt]
$ spack install --add x y z
```
instead of
```
$ spack env create default
==> Created environment 'default in /Users/harmenstoppels/spack/var/spack/environments/default
==> You can activate this environment with:
==> spack env activate default
$ spack env activate [--prompt] default
$ spack install --add x y z
```
Notice that Spack supports switching (but not stacking) environments, so the
parallel with `cd` is pretty clear:
```
$ spack env activate named_env
$ spack env status
==> In environment named_env
$ spack env activate
$ spack env status
==> In environment default
```
* Add command suggestions
This adds suggestions of similar commands in case users mistype a
command. Before:
```
$ spack spack
==> Error: spack is not a recognized Spack command or extension command; check with `spack commands`.
```
After:
```
$ spack spack
==> Error: spack is not a recognized Spack command or extension command; check with `spack commands`.
Did you mean one of the following commands?
spec
patch
```
* Add package name suggestions
* Remove suggestion to run spack clean -m
This completes to `spack concretize`:
```
spack conc<tab>
```
but this still gets hung up on the difference between `concretize` and `concretise`:
```
spack -e . conc<tab>
```
We were checking `"$COMP_CWORD" = 1`, which tracks the word on the command line
including any flags and their args, but we should track `"$COMP_CWORD_NO_FLAGS" = 1` to
figure out if the arg we're completing is the first real command.
This PR adds support for including separate definitions from `spack.yaml`.
Supporting the inclusion of files with definitions enables user to make
curated/standardized collections of packages that can re-used by others.
Currently module globals aren't set before running
`setup_[dependent_]run_environment` to compute environment modifications
for module files. This commit fixes that.
Looking at the memory profiles of concurrent solves
for environment with unify:false, it seems memory
is only ramping up.
This exchange in the potassco mailing list:
https://sourceforge.net/p/potassco/mailman/potassco-users/thread/b55b5b8c2e8945409abb3fa3c935c27e%40lohn.at/#msg36517698
Seems to suggest that clingo doesn't release memory
until end of the application.
Since when unify:false we distribute work to processes,
here we give a maxtaskperchild=1, so we clean memory
after each solve.
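A minimal sketch of the workaround, assuming the per-spec solve is wrapped in a function handed to a process pool:
```python
from multiprocessing import Pool

def solve_one(spec_string):
    ...  # placeholder: concretize a single root spec here

if __name__ == "__main__":
    # maxtasksperchild=1: each worker exits after one solve, so whatever memory
    # clingo is holding onto is returned to the OS between solves.
    with Pool(processes=4, maxtasksperchild=1) as pool:
        pool.map(solve_one, ["hdf5", "zlib", "cmake"])
```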
Some providers must provide virtuals "together", i.e.
if they provide one virtual of a set, they must be the
providers also of the others.
There was a bug though, where we were not checking if
the other virtuals in the set were needed at all in
the DAG.
This commit fixes the bug.
* libtheora: regenerate Makefile.in during autoreconf
The patch to inhibit running of configure would exit autogen.sh so early
that it did not yet run autoconf/automake/...
Instead of patching autogen.sh, just pass -V as argument, as this is
passed on to configure and lets it just print its version instead of
configuring the build tree.
Also drop arguments from autogen.sh, as they are unused when configure
does not run.
* libtheora: fix build on macos
Apply upstream patches in order to avoid unresolved symbols during building of libtheoraenc.
These patches require re-running automake/autoconf/...
Error messages:
libtool: link: /Users/ma/git/spack/lib/spack/env/clang/clang -dynamiclib -o .libs/libtheoraenc.1.dylib .libs/apiwrapper.o .libs/fragment.o .libs/idct.o .libs/internal.o .libs/state.o .libs/quant.o .l
ibs/analyze.o .libs/fdct.o .libs/encfrag.o .libs/encapiwrapper.o .libs/encinfo.o .libs/encode.o .libs/enquant.o .libs/huffenc.o .libs/mathops.o .libs/mcenc.o .libs/rate.o .libs/tokenize.o -L/opt/spac
k/darwin-sonoma-m1/apple-clang-15.0.0/libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/lib -logg -lm -Wl,-exported_symbols_list -Wl,/var/folders/zv/qr55pmd9065glf0mcltpx5bm000102/T/ma/spack-stage/spac
k-stage-libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/spack-src/lib/theoraenc.exp -install_name /opt/spack/darwin-sonoma-m1/apple-clang-15.0.0/libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/lib
/libtheoraenc.1.dylib -compatibility_version 3 -current_version 3.2
ld: warning: search path '/opt/spack/darwin-sonoma-m1/apple-clang-15.0.0/libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/lib' not found
ld: Undefined symbols:
_th_comment_add, referenced from:
_theora_comment_add in apiwrapper.o
_th_comment_add_tag, referenced from:
_theora_comment_add_tag in apiwrapper.o
_th_comment_clear, referenced from:
_theora_comment_clear in apiwrapper.o
_th_comment_init, referenced from:
_theora_comment_init in apiwrapper.o
_th_comment_query, referenced from:
_theora_comment_query in apiwrapper.o
_th_comment_query_count, referenced from:
_theora_comment_query_count in apiwrapper.o
* libtheora: add git versions
`stable` was chosen as the version name for the theora-1.1 branch so that it sorts between 1.1.x and master
* libtheora: remove unused patch
thanks to @michaelkuhn for noticing
* Add 2023.09-0 for x86_64, aarch64, and ppc64le
extend the anaconda3 package.py to support aarch64 and ppc64le.
add the latest version of anaconda3 to each new platform, including the existing x86_64
* formatting
* Fix py-pyside2 to build with newer llvm and to use spack libglx and libxcb headers where system headers are missing
pyside2 needs LLVM_INSTALL_DIR to be set when using llvm 11: and expects system headers for libglx and libxcb and won't build otherwise.
* Fix styling
* remove raw string type
* Update var/spack/repos/builtin/packages/py-pyside2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This PR makes it possible to select only a subset of virtual dependencies from a spec that _may_ provide more. To select providers, a syntax to specify edge attributes is introduced:
```
hdf5 ^[virtuals=mpi] mpich
```
With that syntax we can concretize specs like:
```console
$ spack spec strumpack ^[virtuals=mpi] intel-parallel-studio+mkl ^[virtuals=lapack] openblas
```
On `develop` this would currently fail with:
```console
$ spack spec strumpack ^intel-parallel-studio+mkl ^openblas
==> Error: Spec cannot include multiple providers for virtual 'blas'
Requested 'intel-parallel-studio' and 'openblas'
```
In package recipes, virtual specs that are declared in the same `provides` directive need to be provided _together_. This means that e.g. `openblas`, which has:
```python
provides("blas", "lapack")
```
needs to provide both `lapack` and `blas` when requested to provide at least one of them.
## Additional notes
This capability is needed to model compilers. Assuming that languages are treated like virtual dependencies, we might want e.g. to use LLVM to compile C/C++ and Gnu GCC to compile Fortran. This can be accomplished by the following[^1]:
```
hdf5 ^[virtuals=c,cxx] llvm ^[virtuals=fortran] gcc
```
[^1]: We plan to add some syntactic sugar around this syntax, and reuse the `%` sigil to avoid having a lot of boilerplate around compilers.
Modifications:
- [x] Add syntax to interact with edge attributes from spec literals
- [x] Add concretization logic to be able to cherry-pick virtual dependencies
- [x] Extend semantic of the `provides` directive to express when virtuals need to be provided together
- [x] Add unit-tests and documentation
* vcftools: adding new version 0.1.16
* Update var/spack/repos/builtin/packages/vcftools/package.py
Co-authored-by: Alec Scott <alec@bcs.sh>
---------
Co-authored-by: Alec Scott <alec@bcs.sh>
* Make sure sio is in dependent build env for podio
* podio: Fix likely(?) typo in root dependency
* podio: Add latest tag and new variants + dependencies
* podio: Add v00-16-07 tag
* podio: Fix dependencies flagged by package audit
* podio: Simplify root dependency
* podio: Add 0.17.1 tag
Allowing white space around `:` in version ranges introduces an ambiguity:
```
a@1: b
```
parses as `a@1:b` but should really be parsed as two separate specs `a@1:` and `b`.
With white space disallowed around `:` in ranges, the ambiguity is resolved.
Call setup_dependent_run_environment on both link and run edges,
instead of only run edges, which restores old behavior.
Move setup_build_environment into get_env_modifications
Also call setup_run_environment on direct build deps, since their run
environment has to be set up.
* update SPERR package
* remove blank line
* update SPERR to be version 0.7.1
* a little clean up
* bound versions that require zstd
* add USE_ZSTD
* add libpressio-sperr version upbound
* update libpressio-sperr
* address review comments
* improve format
---------
Co-authored-by: Samuel Li <Sam@Navada>
Co-authored-by: Samuel Li <sam@cisl-m121a>
* Add tests to ensure variant propagation syntax can round-trip to/from string
* Add a regression test for the bug in 35298
* Reconstruct the spec constraints in the worker process
Specs do not preserve any information on propagation of variants
when round-tripping to/from JSON (which we use to pickle), but
preserve it when round-tripping to/from strings.
Therefore, we pass a spec literal to the worker and reconstruct
the Spec objects there.
* Added NVML support to the slurm package
* dbus package is required for cgroup support
* Fixing formatting
* Style fix
* Added PAM support
* Added ROCm SMI support
- [x] Add links to information people are going to want to know when adding license
information to their packages (namely OSI licenses and SPDX identifiers).
- [x] Update the packaging docs for `license()` with Spack as an example for `when=`.
After all, it's a dual-licensed package that changed once in the past.
- [x] Add link to https://spdx.org/licenses/ in the `spack create` boilerplate as well.
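A hypothetical `package.py` fragment showing the directive; the package name, SPDX expressions and version ranges below are illustrative, not taken from a real recipe:
```python
from spack.package import *

class Example(Package):
    """Hypothetical package used only to illustrate license()."""

    # SPDX identifiers come from https://spdx.org/licenses/; `when=` scopes a
    # license to the releases it applies to (useful when a project relicenses).
    license("LGPL-2.1-only", when="@:2.0")
    license("Apache-2.0 OR MIT", when="@2.1:")
```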
* Load the script file during environment setup so that all the environment variables are set properly
* Patch csh/tcsh so that scripts use Spack's csh/tcsh via env
* Update SHA for latest version
* Extend shebang to perl and fix up the regex
* Add liggghts patched for newer compilers
Add C++ 17 support
Add Clang and Oneapi support
* Add maintainers
* Fix format in liggghts
* Fix maintainers before versions
Co-authored-by: Alec Scott <alec@bcs.sh>
* Fix style and user to usr
* Update package.py
---------
Co-authored-by: Alec Scott <alec@bcs.sh>
* [add] py-moarchiving: new package
* py-moarchiving: update from review: description, variant default value is False, switch when and type
---------
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
* py-bluepyemodel: new package with dependencies
* py-morphio: add MPI as dependency to avoid failing builds
* Formatting
* py-bluepyefe: no need to set NEURON_INIT_MPI
* py-morphio: unifurcation branch is ancient history
* py-bluepyopt: only set NEURON_INIT_MPI with +neuron
* py-efel: get rid of old version
* py-morph{-tool,io}: rename develop to master to match branch
* py-bluepyefe: unset PMI_RANK is also neuron-related
* py-bluepyopt: PMI_RANK is also neuron-related
* Implement review remarks
* py-morph-tool, py-neurom: small fixes
* py-morphio: reword dependencies
Typically MSVC is detected via the VSWhere program. However, this may
not be available, or may be installed in an unpredictable location.
This PR adds an additional approach via Windows Registry queries to
determine VS install location root.
Additionally:
* Construct vs_install_paths after class-definition time (move it to
variable-access time).
* Skip over keys for which a user does not have read permissions
when performing searches (previously the presence of these keys
would have caused an error, regardless of whether they were
needed).
* Extend helper functionality with option for regex matching on
registry keys vs. exact string matching.
* Some internal refactoring: remove boolean parameters in some cases
where the function was always called with the same value
(e.g. `find_subkey`)
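An illustrative registry lookup of the kind described above (key and value names are examples, and unreadable keys are skipped rather than treated as errors):
```python
import winreg  # Windows-only standard library module

def read_value(root, key_path, value_name):
    """Return a registry value, or None if the key/value is absent or unreadable."""
    try:
        with winreg.OpenKey(root, key_path) as key:
            value, _type = winreg.QueryValueEx(key, value_name)
            return value
    except (FileNotFoundError, PermissionError):
        return None

print(read_value(winreg.HKEY_LOCAL_MACHINE,
                 r"SOFTWARE\Microsoft\VisualStudio\SxS\VS7", "15.0"))
```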
* e4s ci: add exago +cuda, +rocm builds
* exago: rename 5-18-2022-snapshot to snapshot.5-18-2022
* disable exago +rocm for non-external rocm ci install
* note that hiop +rocm fails to find hip libraries when they are spack-installed
.bat or .exe files can be considered executable on Windows. This PR
expands the regex for detectable packages to allow for the detection
of packages that vendor .bat wrappers (intel mpi for example).
Additional changes:
* Outside of Windows, when searching for executables `path_hints=None`
was used to indicate that default path hints should be provided,
and `[]` was taken to mean that no defaults should be chosen
(in that case, nothing is searched); behavior on Windows has
now been updated to match.
* Above logic for handling of `path_hints=[]` has also been extended
to library search (for both Linux and Windows).
* All exceptions for external packages were documented as timeout
errors: this commit adds a distinction for other types of errors
in warning messages to the user.
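An illustrative version of the broadened matching (not the actual regex used in Spack):
```python
import re

# On Windows, both .exe binaries and vendored .bat wrappers count as executables.
WINDOWS_EXECUTABLE = re.compile(r".+\.(exe|bat)$", re.IGNORECASE)

candidates = ["mpicc.bat", "mpiexec.exe", "README.txt"]
print([c for c in candidates if WINDOWS_EXECUTABLE.match(c)])
# -> ['mpicc.bat', 'mpiexec.exe']
```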
Credits to @ChristianKniep for advocating the idea of OCI image layers
being identical to spack buildcache tarballs.
With this you can configure an OCI registry as a buildcache:
```console
$ spack mirror add my_registry oci://user/image # Dockerhub
$ spack mirror add my_registry oci://ghcr.io/haampie/spack-test # GHCR
$ spack mirror set --push --oci-username ... --oci-password ... my_registry # set login credentials
```
which should result in this config:
```yaml
mirrors:
  my_registry:
    url: oci://ghcr.io/haampie/spack-test
    push:
      access_pair: [<username>, <password>]
```
It can be used like any other registry
```
spack buildcache push my_registry [specs...]
```
It will upload the Spack tarballs in parallel, as well as manifest + config
files s.t. the binaries are compatible with `docker pull` or `skopeo copy`.
In fact, a base image can be added to get a _runnable_ image:
```console
$ spack buildcache push --base-image ubuntu:23.04 my_registry python
Pushed ... as [image]:python-3.11.2-65txfcpqbmpawclvtasuog4yzmxwaoia.spack
$ docker run --rm -it [image]:python-3.11.2-65txfcpqbmpawclvtasuog4yzmxwaoia.spack
```
which should really be a game changer for sharing binaries.
Further, all content-addressable blobs that are downloaded and verified
will be cached in Spack's download cache. This should make repeated
`push` commands faster, as well as `push` followed by a separate
`update-index` command.
An end to end example of how to use this in Github Actions is here:
**https://github.com/haampie/spack-oci-buildcache-example**
TODO:
- [x] Generate environment modifications in config so PATH is set up
- [x] Enrich config with Spack's `spec` json (this is allowed in the OCI specification)
- [x] When ^ is done, add logic to create an index in say `<image>:index` by fetching all config files (using OCI distribution discovery API)
- [x] Add logic to use object storage in an OCI registry in `spack install`.
- [x] Make the user pick the base image for generated OCI images.
- [x] Update buildcache install logic to deal with absolute paths in tarballs
- [x] Merge with `spack buildcache` command
- [x] Merge #37441 (included here)
- [x] Merge #39077 (included here)
- [x] #39187 + #39285
- [x] #39341
- [x] Not a blocker: #35737 fixes correctness run env for the generated container images
NOTE:
1. `oci://` is unfortunately taken, so it's being abused in this PR to mean "oci type mirror". `skopeo` uses `docker://` which I'd like to avoid, given that classical docker v1 registries are not supported.
2. this is currently `https`-only, given that basic auth is used to login. I _could_ be convinced to allow http, but I'd prefer not to, given that for a `spack buildcache push` command multiple domains can be involved (auth server, source of base image, destination registry). Right now, no urllib http handler is added, so redirects to https and auth servers with http urls will simply result in a hard failure.
CAVEATS:
1. Signing is not implemented in this PR. `gpg --clearsign` is not the nicest solution, since (a) the spec.json is merged into the image config, which must be valid json, and (b) it would be better to sign the manifest (referencing both config/spec file and tarball) using more conventional image signing tools
2. `spack.binary_distribution.push` is not yet implemented for the OCI buildcache, only `spack buildcache push` is. This is because I'd like to always push images + deps to the registry, so that it's `docker pull`-able, whereas in `spack ci` we really wanna push an individual package without its deps to say `pr-xyz`, while its deps reside in some `develop` buildcache.
3. The `push -j ...` flag only works for OCI buildcache, not for others
* itk: patch missing include for newer compilers
* itk: The package doesn't use MPI
* itk: package requires the high-level hdf5 api
* itk: patch url with ?full_index=1
* itk: point to 4041 commit in master
* itk: don't constrain hdf5 with ~mpi
* rtmpdump: New package
* curl: Fix librtmp variant
Add the previously missing dependency required for rtmp support.
The variant has been broken since its addition in PR #25166.
Fixes one of the two issues reported in #26887.
* spack checksum pkg@1.2, use as version filter
Currently pkg@1.2 splits on @ and looks for 1.2 specifically, with this
PR pkg@1.2 is a filter so any matching 1.2, 1.2.1, ..., 1.2.10 version
is displayed.
* fix tests
* fix style
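A rough illustration of the filtering behavior described above (not the actual `spack checksum` implementation):
```python
def matches_filter(version: str, version_filter: str) -> bool:
    # "1.2" matches 1.2 itself plus anything in the 1.2.x series.
    return version == version_filter or version.startswith(version_filter + ".")

known = ["1.1.0", "1.2", "1.2.1", "1.2.10", "1.3.0"]
print([v for v in known if matches_filter(v, "1.2")])
# -> ['1.2', '1.2.1', '1.2.10']
```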
Update Tcl modulefile template to simplify generated `append-path`,
`prepend-path` and `remove-path` commands and improve their readability.
If path element delimiter is colon character, do not set the `--delim`
option as it is the default delimiter value.
Renames exclude_implicits to hide_implicits
When the hide_implicits option is enabled, generate modulefiles of
implicitly installed software and hide them. Even if implicit, those
modulefiles may be referred to as dependencies in other modulefiles, so they
should still be generated so that the module tool can properly load dependent modules.
A new hidden property is added to BaseConfiguration class.
To hide modulefiles, modulercs are generated along modulefiles. Such rc
files contain specific module command to indicate a module should be
hidden (for instance when using "module avail").
A modulerc property is added to TclFileLayout and LmodFileLayout classes
to get fully qualified path name of the modulerc associated to a given
modulefile.
Modulerc files will be located in each module directory, next to the
version modulefiles. This scheme is supported by both module tool
implementations.
modulerc_header and hide_cmd_format attributes are added to
TclModulefileWriter and LmodModulefileWriter. They help to know how to
generate a modulerc file with hidden commands for each module tool.
Tcl modulerc file requires an header. As we use a command introduced on
Modules 4.7 (module-hide --hidden-loaded), a version requirement is
added to header string.
For lmod, modules that open up a hierarchy are never hidden, even if
they are implicitly installed.
Modulerc is created, updated or removed when associated modulefile is
written or removed. If an implicit modulefile becomes explicit, hidden
command in modulerc for this modulefile is removed. If modulerc becomes
empty, this file is removed. Modulerc file is not rewritten when no
content change is detected.
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
* paraview: rebase the adios2 patch for 5.12-to-be
* paraview: disable fastfloat and token for 5.12-to-be
* paraview: require older protobuf for 5.12 as well
* paraview: require C++11-supporting protobuf for `master` too
* ci: don't register detectable compilers
Because they go out of sync...
* remove intel compiler, it can be detected too
* Do not run spack compiler find since compilers are registered in concretize job already
* trilinos: work around +stokhos +cuda +superlu-dist bug due to EMPTY macro
Previously, we only searched for `patch` inside of whatever Git
installation was available because the most common installation of Git
available on Windows had `patch`. That's not true for all possible
installations of Git though, so this updates the search to also check
PATH.
GitLab's .patch URLs only provide abbreviated hashes, while .diff URLs
provide full hashes. There does not seem to be a parameter to force
.patch URLs to also return full hashes, so we should make sure to use
the .diff ones.
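A hypothetical `package.py` fragment; the URL, checksum and version constraint are placeholders, shown only to illustrate pinning the `.diff` form of a GitLab patch:
```python
from spack.package import *

class Example(Package):
    """Hypothetical package used only to illustrate the .diff URL form."""

    # .diff URLs reference full commit hashes, so the downloaded content is stable
    # and can be pinned with a checksum as usual.
    patch(
        "https://gitlab.example.com/group/project/-/commit/<full-commit-hash>.diff",
        sha256="0000000000000000000000000000000000000000000000000000000000000000",
        when="@1.2.3",
    )
```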
While e.g. GNU patch 2.7.6 (as provided by homebrew) would apply the previous
version of this patch without problems, Apple's patch 2.0-12u11-Apple fails
to find out which file to patch.
Adding two lines to the patch fixes that. Renamed the patch in order to
not require a `spack clean -m`.
* gromacs +cp2k: build in CI
* libxsmm: x86 only
* attempt to fix dbcsr + new mpich
* use c11 standard
* gromacs: does not depend on dbcsr
* cp2k: build with cmake in CI, s.t. dbcsr is a separate package
* cp2k: cmake patches for config files and C/C++ std
* cp2k: remove unnecessary constraints due to patch
With the introduction of multiple build dependencies from the same package in the DAG, we need to minimize a few weights accounting for edges rather than nodes. If we don't do that we might have multiple "optimal" solutions that differ only in how the same nodes are connected together. This commit ensures optimal versions are picked per parent in case of multiple choices for a dependency.
* py-statsmodels: add 0.14.0
* Fix style
* Update var/spack/repos/builtin/packages/py-statsmodels/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-statsmodels/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Remove python limits
* Remove comment
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-cykhash: adding new package py-cykhash
* py-hmmlearn: adding new package py-hmmlearn
* py-macs3: adding new package py-macs3
* py-macs3: adding python version restriction and other changes.
Fix the following syntax which validates only the first array entry:
```python
"compilers": {
"type": "array",
"items": [
{
"type": ...
}
]
}
```
to
```python
"compilers": {
"type": "array",
"items": {
"type": ...
}
}
```
which validates the entire array.
Oops...
On some systems, multiple pythonx.y are placed in the same prefix as
pythonx (where only one of them is associated with that pythonx).
Spack external detection for Python was willing to register all of
these as external versions. Moreover, the `package.py` for Python
was able to distinguish these.
This can cause an issue for some build systems, which will just look
for python3 for example, so if that python3 is actually python3.6,
and the build system needs 3.7 (which spack may have found in the
same prefix, and offered as a suitable external), it will fail when
invoking python3.
To avoid that issue, we simply avoid treating pythonx.y as external
candidates. In the above case, Spack would only detect a Python 3.6
external, and the build would be forced to use a Spack-built Python
3.7 (which we consider a good thing).
This adds a `SetupContext` class which is responsible for setting
package.py module globals, and computing the changes to environment
variables for the build, test or run context.
The class uses `effective_deptypes`, which takes a list of specs (e.g. a single
spec to build, or a list of environment roots) and a context
(build, run, test), and outputs a flat list of specs that affect the
environment together with a flag in what way they do so. This list is
topologically ordered from root to leaf, so that one can be assured that
dependents override variables set by dependencies, not the other way
around.
This is used to replace the logic in `modifications_from_dependencies`,
which has several issues: missing calls to `setup_run_environment`, and
the order in which operations are applied.
Further, it should improve performance a bit in certain cases, since
`effective_deptypes` runs in O(v + e) time, whereas `spack env activate`
currently can take up to O(v^2 + e) time due to loops over roots. Each
edge in the DAG is visited once by calling `effective_deptypes` with
`env.concrete_roots()`.
By marking and propagating flags through the DAG, this commit also fixes
a bug where Spack wouldn't call `setup_run_environment` for runtime
dependencies of link dependencies. And this PR ensures that Spack
correctly sets up the runtime environment of direct build dependencies.
Regarding test dependencies: in a build context they are build-time
test deps, whereas in a test context they are install-time test deps.
Since there is no way to distinguish the build/install type test deps,
they are treated as both.
Further changes:
- all `package.py` module globals are guaranteed to be set before any of the
`setup_(dependent)_(run|build)_env` functions is called
- traversal order during setup: first the group of externals, then the group
of non-externals, with specs in each group traversed topologically (dependencies
are set up before dependents)
- modules: only ever call `setup_dependent_run_environment` of *direct* link/run
type deps
- the marker in `set_module_variables_for_package` is dropped, since we should
call the method once per spec. This allows us to set only a cheap subset of
globals on the module: for example, it's not necessary to compute the expensive
`cmake_args` and the like if the spec under consideration is not the root node to be
built.
- `spack load`'s `--only` is deprecated (it has no effect now), and `spack load x`
now means: do everything that's required for `x` to work at runtime, which
requires runtime deps to be setup -- just like `spack env activate`.
- `spack load` no longer loads build deps (of build deps) ...
- `spack env activate` on partially installed or broken environments: this is all
or nothing now. If some spec errors during setup of its runtime env, you'll only
get the unconditional variables + a warning that says the runtime changes for
specs couldn't be applied.
- Remove traversal in upward direction from `setup_dependent_*` in packages.
Upward traversal may iterate to specs that aren't children of the roots
(e.g. zlib / python have hundreds of dependents, and only a small fraction is
reachable from the roots). Packages should only modify the direct dependent
they receive as an argument.
The ability to select the top N versions got removed in the checksum overhaul,
because initially numbers were used for commands.
Now that we have settled on characters for commands, let's make numbers pick the top
N again.
This flag was only relevant when targeting powerpc from apple-clang,
which we don't do. The flag is removed from apple-clang@15. Let's drop
it unconditionally.
Improve how mirrors are used in gitlab ci, where until now we have thought
of them as only a string.
By configuring ci mirrors ahead of time using the proposed mirror templates,
and by taking advantage of the expressiveness that spack now has for mirrors,
this PR will allow us to easily switch the protocol/url we use for fetching
binary dependencies.
This change also deprecates some gitlab functionality and marks it for
removal in Spack 0.23:
- arguments to "spack ci generate":
* --buildcache-destination
* --copy-to
- gitlab configuration options:
* enable-artifacts-buildcache
* temporary-storage-url-prefix
* petsc: add variant +sycl
* petsc: add gmake as a dependency, so that a consistent make is used between the petsc and slepc builds (which can have a different env for each build)
Reused specs used to be referenced directly in the built spec.
This might cause issues like the one in issue 39570, where two objects in
memory represent the same node, because two reused specs were
loaded from different sources but referred to the same spec
by DAG hash.
The issue is solved by copying concrete specs to a dictionary keyed
by dag hash.
`spack dev-build` would incorrectly set `keep_stage=True` for the
entire DAG, including for non-dev specs, even though the dev specs
have a DIYStage which never deletes sources.
py-werkzeug@:0.12 does not work with python@3.10:
Test with py-werkzeug 0.12.2 and python 3.10:
```
$ python3.10 -c 'import werkzeug'
py-werkzeug-0.12.2/lib/python3.11/site-packages/werkzeug/datastructures.py", line 16, in <module>
from collections import Container, Iterable, MutableSet
ImportError: cannot import name 'Container' from 'collections'
```
Test with py-werkzeug 0.12.2 and python 3.9:
```
python3.9 -c "from collections import Container"
<string>:1: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.10 it will stop working
```
This patch adds a license directive to get the ball rolling on adding license
information about packages to Spack. I'm primarily interested in just getting
licenses into Spack, but this would also help with other efforts that people are
interested in, such as adding license information to the ASP solve for
concretization to make sure licenses are compatible.
Usage:
Specifying the specific license that a package is released under in a project's
`package.py` is good practice. To specify a license, find the SPDX identifier for
a project and then add it using the license directive:
```python
license("<SPDX Identifier HERE>")
```
For example, for Apache 2.0, you might write:
```python
license("Apache-2.0")
```
Note that specifying a license without a when clause makes it apply to all
versions and variants of the package, which might not actually be the case.
For example, a project might have switched licenses at some point or have
certain build configurations that include files that are licensed differently.
To account for this, you can specify when licenses should be applied. For
example, to specify that a specific license identifier should only apply
to versions up to and including 1.5, you could write the following directive:
```python
license("MIT", when="@:1.5")
```
`xmllint` is called by `xmlto` during generation of `libzmq`'s docs, so
adding `libxml2`.
The docbook deps and the patches are taken from
https://src.fedoraproject.org/rpms/xmlto/blob/rawhide/f/xmlto.spec
There are still many more dependencies missing, but this is out of scope
of this patch (which is only concerned about the use case of `libzmq`).
This commit allows version specifiers to refer to git branches that contain
forward slashes. For example, the following is valid syntax now:
pkg@git.releases/1.0
It also adds a new method `Spec.format_path(fmt)` which is like `Spec.format`,
but also maps unsafe characters to `_` after interpolation. The difference is
as follows:
>>> Spec("pkg@git.releases/1.0").format("{name}/{version}")
'pkg/git.releases/1.0'
>>> Spec("pkg@git.releases/1.0").format_path("{name}/{version}")
'pkg/git.releases_1.0'
The `format_path` method is used in all projections. Notice that this method
also maps `=` to `_`
>>> Spec("pkg@git.main=1.0").format_path("{name}/{version}")
'pkg/git.main_1.0'
which should avoid syntax issues when `Spec.prefix` is literally copied into a
Makefile, as sometimes happens in AutotoolsPackage or MakefilePackage.
Currently `spack env activate --with-view` exists, but is a no-op.
So, it is not too much of a breaking change to make this redundant flag
accept a value `spack env activate --with-view <name>` which activates
a particular view by name.
The view name is stored in `SPACK_ENV_VIEW`.
This also fixes an issue where deactivating a view that was activated
with `--without-view` possibly removes entries from PATH, since now we
keep track of whether the default view was "enabled" or not.
A few packages have encoded an idiom that pre-dates the introduction
of the 'requires' directive, and they cycle over all compilers
to conflict with the ones that are not supported.
Here instead we reverse the logic, and require the ones that
are supported.
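Roughly, the change looks like this inside a package class (a sketch with made-up compiler names, not taken from a specific package):
```python
# Old idiom: enumerate every unsupported compiler and conflict with it.
for unsupported in ["%gcc", "%intel", "%nvhpc", "%xl"]:
    conflicts(unsupported, msg="this package only builds with clang-based compilers")

# New idiom: state what *is* supported with the requires directive.
requires(
    "%clang",
    "%apple-clang",
    policy="one_of",
    msg="this package only builds with clang-based compilers",
)
```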
* Added initial package for building Beatnik with spack
* Fixed github ID for Jason as a maintainer.
* Major revision of beatnik spack package to properly support GPU spack builds with CUDA (and ROCm, though that is untested)
* Marked that beatnik 1.0 will require cabana 0.6.0. We will wait for the cabana 0.6.0 release before we release beatnik
* Update to beatnik package spec to compile with hipcc when +rocm
* Updated spack package for cabana for version 0.6.0 and appropriate heffte dependency
* Updated beatnik package to require cabana 0.6.0
* More updates to cabana and beatnik to build with cabana 0.6.0
* Finish removing BLT dependence from beatnik
* More updates to beatnik package for compiling on cray systems
* Updated beatnik package for new cabana package
* Changes to silo package for new silo version
* Fixed version specs for heffte to be able to concretize and build
* Fixed spack style issues for beatnik and silo packages
* More spack formatting fixes to beatnik and silo
* Patrick adopting silo package as maintainer for now
* Should address final style changes to beatnik package spec
* Yet more style fixes.
* Perhaps this is the final style fixes? :)
* Minor fix to cabana package on required versions
* Updating patch to add flag mcode-object-version=none when
device libs are built as part of llvm-amdgpu
* Limiting patch to +rocm-device-libs variant and adding
appropriate comment for the patch
* Updating llvm patch as per the mainline code
Updating hsa-rocr patch as per the latest code
Updating the if/elif condition for the hip test src path
* Updating flags for 5.5 releases and above
* Updating build flags and patches
* Fix version incompatibilities of py-pandas and py-openpyxl
* Add variant excel for py-pandas
* Add package py-pyxlsb
* Add versions for py-xlsxwriter
* Define excel dependencies for py-pandas 1.4, 1.5, 2.0, 2.1
* Fix variant excel in py-pandas
* Add package py-odfpy, which is also a dependency for py-pandas@2.0:
* Rearrange excel dependencies for py-pandas
* Change url to pypi
* Add missing newline to fix style in py-odfpy
* Update for Basix 0.7
* Version fix for nanobind dependency
* Simplification
* Version update and simplify dependencies
* Add comment on location of pyproject.toml
* Update var/spack/repos/builtin/packages/py-fenics-basix/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This tweaks the matrix description to indicate that it's bridged with Slack. So people
don't think they're missing out (even though the icon says there are only 3 users on
Matrix).
* Bug fix in var/spack/repos/builtin/packages/py-awscrt/package.py: on Linux, tell aws-crt-python to use libcrypto from spack (openssl)
* Bug fix in var/spack/repos/builtin/packages/py-awscrt/package.py: add missing build dependencies cmake (for all), openssl (for linux)
* Update var/spack/repos/builtin/packages/py-awscrt/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [add] py-graphene-tornado: new recipe, required by py-cylc-uiserver
* py-graphene-tornado: Taking review into account
* py-graphene-tornado: add type run to dependencies py-jinja, py-tornado and py-werkzeug
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-urllib3: add 2.0.5
* Add py-brotli package
* Group brotli dependencies and make limits more specific
* Add minimum version limits to variants
* Remove python upper limit for py-brotli
* Fix restrictions for py-brotli dependency
* Fix py-brotli dependency
* py-urllib3: add 2.0.6
* New package: cpr
* Support libcpr version 1.9
* Fix build phase for git
* Update var/spack/repos/builtin/packages/cpr/package.py
Co-authored-by: Alec Scott <alec@bcs.sh>
* [@spackbot] updating style on behalf of sethrj
---------
Co-authored-by: Alec Scott <alec@bcs.sh>
* wayland: dot is a build dependency
otherwise this build failure happens:
../spack-src/doc/meson.build:5:6: ERROR: Program 'dot' not found or not executable
* wayland: make building of documentation optional
renders several dependencies optional
* spack checksum: improve interactive filtering
* fix signature of executable
* Fix restart when using editor
* Don't show [x version(s) are new] when no known versions (e.g. in spack create <url>)
* Test ^D in test_checksum_interactive_quit_from_ask_each
* formatting
* colorize / skip header on invalid command
* show original total, not modified total
* use colify for command list
* Warn about possible URL changes
* show possible URL change as comment
* make mypy happy
* drop numbers
* [o]pen editor -> [e]dit
* Fpocket: fix edit() positional args + add install()
* Remove comments
* Fix line too long
* Fix line too long
* Remove extension specification in version
Co-authored-by: Alec Scott <alec@bcs.sh>
* Use f-strings
Co-authored-by: Alec Scott <alec@bcs.sh>
* Fix styling
* Use the default MakefilePackage install stage
---------
Co-authored-by: Alec Scott <alec@bcs.sh>
A complete texinfo install includes both `info` and `makeinfo`. Some
system installations of texinfo may exclude one or the other. This
updates the external finding logic to require both.
* openimagedenoise: checksum 2.0.1
* ospray: new versions 2.11.0 and 2.12.0
- both depend on embree@4
- also update dependency versions for rkcommon, openvkl, openimagedenoise and ispc
- expose that dependency on openvkl is optional since @2.11 with variant "volumes"
* ospray: limit embree to @3 for ospray @:2.10
After the merge of #37957 (Add static and pic variants), if a gettext install
from a build before that merge is present, building any package using gettext
fails with KeyError: "shared", because self.spec.variants["shared"] is accessed
without checking whether the new variant is present in the old installation;
the code expects the key variants["shared"] to always exist.
Fix it with a fallback to the default of True, and update gettext to v22.3.
Co-authored-by: Bernharad Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
* initial commit to fix ck build for 5.6.1
* disable mlir for miopen-hip
* use satisfies for checking specs and add nlohmann-json dependency for 5.4 onwards
Because those end up being passed to ar which does not understand linker
arguments. This was making ldflags largely unusable for statically
linked cmake packages.
* py-jupyter-packaging: remove duplicate packages
* Allow py-jupyter-packaging to be duplicated in DAG
* Deprecate version of py-jupyterlab that requires py-jupyter-packaging at run-time
* [py-torch-sparse] New version 0.6.17
* [py-torch-sparse] added dependency on parallel-hashmap
* [py-torch-sparse]
- spack only supports python@3.7:
- py-pytest-runner only needed with old versions
You will note the `Cite this repository` link is not working.
This commit fixes the underlying file...
* `authors` was not indented
* `authors` required by `preferred-citation`
* `authors` list required at top level (I simply duplicated)
* `"USA"` not correct country code
* `month` requires an integer month number
* Added URL to the actual pdf of the cited paper
* Used `identifiers` for doi and LLNL doc number
* added `abstract` copied from paper
Various fixes were confirmed by `cffconvert` using `docker run -v "$(pwd)":/app citationcff/cffconvert --validate`
* [add] py-metomi-rose: new recipe, required by py-cylc-rose
* py-metomi-rose: remove version constraint on python
---------
Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
* Allow branching out of the "generic build" unification set
For cases like the one in https://github.com/spack/spack/pull/39661
we need to relax rules on unification sets.
The issue is that, right now, nodes in the "generic build" unification
set are unified together with their build dependencies. This was done
out of caution to avoid the risk of circular dependencies, which would
ultimately cause a very slow solve.
For build-tools like Cython, however, the build dependency is masked
by a long chain of "build, run" dependencies that belong in the
"generic build" unification space.
To allow splitting on cases like this, we relax the rule disallowing
branching out of the "generic build" unification set.
* Fix issue with pure build virtual dependencies
Pure build virtual dependencies were not accounted for properly in the
list of possible virtuals. This caused some facts connecting virtuals
to the corresponding providers to not be emitted, and in the end
led to unsat problems.
* Fixed a few issues in packages
py-gevent: restore dependency on py-cython@3
jsoncpp: fix typo in build dependency
ecp-data-vis-sdk: update spack.yaml and cmake recipe
py-statsmodels: add v0.13.5
* Make dependency on "blt" of type "build"
We run pip with `--no-build-isolation` because we don't want to let pip
install build deps.
As a consequence, when pip runs hooks, it runs hooks of *any* package it
can find in `sys.path`.
For Spack-built Python this includes user site packages -- there
shouldn't be any system site packages. So in this case it suffices to
set the environment variable PYTHONNOUSERSITE=1.
For external Python, more needs to be done, because there is no env
variable that disables both system and user site packages; setting the
`python -S` flag doesn't work, because pip runs subprocesses that don't
inherit this flag (and there is no API to know whether -S was passed).
So, for external Python, an empty venv is created before invoking pip in
Spack's build env, which ensures that pip can no longer see anything but
the standard library and `PYTHONPATH`.
The downside of this is that pip will generate shebangs that point to
the python executable from the venv. So, for external Python, an extra
step is necessary where we fix up shebangs post install.
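A rough sketch of the workaround for the external-Python case (paths and the exact pip invocation are made up for illustration; this is not Spack's actual build code):
```python
import os
import subprocess
import venv

# create an empty venv: no system site-packages, no user site-packages, no pip
build_venv = "/tmp/spack-build-stage/venv"            # hypothetical path
venv.EnvBuilder(with_pip=False, clear=True).create(build_venv)

env = dict(os.environ)
env["PYTHONNOUSERSITE"] = "1"
# pip itself and all build deps come from Spack via PYTHONPATH
env["PYTHONPATH"] = "/hypothetical/spack/py-pip/site-packages"

subprocess.check_call(
    [os.path.join(build_venv, "bin", "python"), "-m", "pip",
     "install", "--no-build-isolation", "--no-deps", "."],
    env=env,
)
```
The shebang fix-up mentioned above then happens as a separate post-install step.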
* Add new package awscli-v2 and its missing dependency awscrt
* Remove boilerplate comments from awscli-v2 and awscrt packages
* Fix typos in var/spack/repos/builtin/packages/awscli-v2/package.py
* Update var/spack/repos/builtin/packages/awscli-v2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/awscli-v2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/awscli-v2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/awscli-v2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/awscli-v2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Address reviewer comments
* Remove py-pip version dependency from var/spack/repos/builtin/packages/awscli-v2/package.py
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-isort: needs setuptools build dep before v5
Detected in #40224.
In the past, system setuptools could be picked up when using an external
python, so py-isort@4 would install fine. With the linked PR, pip can
only consider packages that Spack controls from PYTHONPATH, so the issue
of missing py-setuptools showed up.
* py-importlib-metadata: fix lowerbounds on python
* review
* py-isort unconditionally add optional setuptools dep to prevent picking up user package at runtime
* style
* drop optional py-setuptools run dep
* e4s amd64 gcc ci stack: sync with e4s-23.08
* e4s amd64 oneapi ci stack: sync with e4s-23.08
* e4s ppc64 gcc ci stack: sync with e4s-23.08
* add new ci stack: e4s amd64 gcc w/ external rocm
* add new ci stack: e4s arm gcc ci
* updates
* py-scipy: -fvisibility issue is resolved in 2023.1.0: #39464
* paraview oneapi fails
* comment out pkgs that fail to build on power
* fix arm stack name
* fix cabana +cuda specification
* comment out failing specs
* visit fails to build on arm
* comment out slepc arm builds due to make issue
* comment out failing dealii arm builds
* Fix python versioning issue for py-biom-format
* Update deps according to feedback
* Remove version requirement for py-cython
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Only depend on py-six for versions >=2.1.10
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add py-future as non-dependency for 2.1.15
* There we are. Everything anyone could ever want
* Missed cython version change
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-dipy: Update version to support python@3.10
py-dipy only adds support for python@3.10 in py-dipy@1.5.0
See #40228
* py-dipy: fix formatting issues
* py-dipy: another formatting fix
* py-dipy: Use depends instead of conflicts
* py-dipy: formatting fix
* py-dipy: Updating for @1.7.0
Added new minimum version requirements for
py-cython
py-numpy
py-scipy
py-h5py
as suggested by @manuelakuhn
(py-nibabel min version unchanged for @1.7.0)
* [add] py-graphql-relay: new package
* py-graphql-relay: Update package.py
Remove leftovers from version 3.2.0:
* archive name in pypi
* dependencies
* [fix] py-graphql-relay: remove constraint on python version; add dependence py-rx because of ModuleNotFoundError during spack test
* [fix] py-graphql-relay: remove py-rx dependence; py-graphql-core: add dependencies for version 2.3.2
* py-graphql-core: Update from review; set build backend, py-poetry for version 3: and py-setuptools for version 2
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Two changes in this PR:
1. Register absolute paths in tarballs, which makes it easier
to use them as container image layers, or rootfs in general, outside
of Spack. Spack supports this already on develop.
2. Assemble the tarfile entries "by hand", which has a few advantages:
   1. Avoid reading `/etc/passwd`, `/etc/groups`, `/etc/nsswitch.conf`,
      which `tar.add(dir)` does _for each file it adds_
   2. Reduce the number of stat calls per file added by a factor of two,
      compared to `tar.add`, which should help with slow, shared filesystems
      where these calls are expensive
   3. Create normalized `TarInfo` entries from the start, instead of letting
      Python create them and patching them after the fact
   4. Don't recurse into subdirs before processing files, to avoid
      keeping nested directories opened. (This changes the tar entry
      order slightly; it's like sorting by `(not is_dir, name)`.)
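A minimal sketch of what assembling entries "by hand" looks like with Python's `tarfile` (regular files only; the helper name and normalization choices here are illustrative, not the exact implementation):
```python
import os
import stat
import tarfile

def add_regular_file(tar: tarfile.TarFile, abs_path: str, arcname: str) -> None:
    st = os.lstat(abs_path)                      # single stat call per file
    info = tarfile.TarInfo(name=arcname)
    info.size = st.st_size
    info.mode = 0o755 if st.st_mode & stat.S_IXUSR else 0o644
    info.mtime = 0                               # normalized for reproducibility
    info.uid = info.gid = 0                      # no /etc/passwd or /etc/group lookups
    info.uname = info.gname = ""
    with open(abs_path, "rb") as f:
        tar.addfile(info, fileobj=f)
```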
* 5.6.0 updates
* Rocm 5.6.0 updates
* Style and audit corrections for 5.6
* Patching smi path for tests.
* Style correction
* 5.6.1 updates
* Updated hip tests for ci build failure
Updated hiprand with the release tag
Taken care the review comment rocsolver
* Adding rocm-smi path for 5.6
* Adding the patch file
* Setting library directory uniform
* gl depends on mesa but it should not be llvm variant
* Fix for the issue 39520 by setting CMAKE_INSTALL_LIBDIR=lib
* i1 muls can sometimes happen after SCEV. They resulted in
ISel failures because we were missing the patterns for them.
* 5.6.0 & 5.6.1 updates for migraphx, miopen-hip, mivisionx
* Revert "5.6.0 & 5.6.1 updates for migraphx, miopen-hip, mivisionx"
This reverts commit f54c9c6c67.
* Revert operator mixup fix
* Splitting compiler-rt-linkage-for-host and operator mixup patch
* Adding missing patch for reverting operator mixup
* 5.6 update for composable-kernel,migraphx,miopen-hip and mivisionx
* Updating rvs, rcd and rccl for 5.6.1. adding comment for llvm patch
* SEACAS: Update package.py to handle new SEACAS project name
The base project name for the SEACAS project has changed from
"SEACASProj" to "SEACAS" as of @2022-10-14, so the package
needed to be updated to use the new project name when needed.
The refactor also changes several arguments of the form
"-DSome_CMAKE_Option:BOOL=ON"
to
define("Some_CMAKE_Option", True)
* CGNS: If fortran not enabled, do not specify parallel fortran compiler
* Update package formatting as suggested by black
* Accept suggested change
Fix issue in configure_args which resulted in duplicate "--with-ldflags" arguments (with different values) being passed to configure, and extend the fix to similar arguments.
Also, duplicated some flags from "--with-libs" into "--with-ldflags": when the flags were only in "--with-libs", they did not seem to be picked up everywhere. I suspect this is a bug in the configure script, but adding them in both locations seems to solve it and should not have any adverse effects.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
- upstream patch does not apply cleanly to older versions, not required
for newer ones
- also add conflict for older versions, except for 3.13.3 which works by
chance
For a long time, the docs have generated a huge, static HTML package list. It has some
disadvantages:
* It's slow to load
* It's slow to build
* It's hard to search
We now have a nice website that can tell us about Spack packages, and it's searchable so
users can easily find the one or two packages out of 7400 that they're looking for. We
should link to this instead of including a static package list page in the docs.
- [x] Replace package list link with link to packages.spack.io
- [x] Remove `package_list.html` generation from `conf.py`.
- [x] Add a new section for "Links" to the docs.
- [x] Remove docstring notes from contribution guide (we haven't generated RST
for package docstrings for a while)
- [x] Remove references to `package-list` from docs.
Currently, Windows SDK detection will only pick up SDK versions
related to the current version of Windows Spack is running on.
However, in some circumstances, we want to detect other version
of the SDK, for example, for compiling on Windows 11 for Windows
10 to ensure an API is compatible with Win10.
* Make use of `prefix` in the Cray manifest schema (prepend it to
the relative CC etc.) - this was a Spack error.
* Warn people when wrong-looking compilers are found in the manifest
(i.e. non-existent CC path).
* Bypass compilers that we fail to add (don't allow a single bad
compiler to terminate the entire read-cray-manifest action).
* Refactor Cray manifest tests: module-level variables have been
replaced with fixtures, specifically using the `test_platform`
fixture, which allows the unit tests to run with the new
concretizer.
* Add unit test to check case where adding a compiler raises an
exception (check that this doesn't prevent processing the
rest of the manifest).
If you `spack install x ^y` where `y` is a pure build dep of `x`, and
then uninstall `y`, and then `spack install --overwrite x ^y`, the build
fails because `y` is not re-installed.
Same can happen when you install a develop spec, run `spack gc`,
modify sources, and install again; develop specs rely on overwrite
install to work correctly.
-- Performing formatting changes
-- Formatting file to conform with spack style
-- Adding updates from review
-- Removing old release candidates from the specification
-- Adding external conduit support for Catalyst
-- Adding Catalyst to `CMAKE_PREFIX_PATH` for the test to find
This PR adds a new audit sub-command to check that detection of relevant packages
is performed correctly in a few scenarios mocking real use-cases. The data for each
package being tested is in a YAML file called detection_test.yaml alongside the
corresponding package.py file.
This is to allow encoding detection tests for compilers and other widely used tools,
in preparation for compilers as dependencies.
Add version 2023.08.1 and branch 2023.08.stable. Note: the version
numbers are a little different. Here, 2023.08.1 means release no. 1
from the release/2023.08 branch.
Modifications:
- [x] Move `spack.util.string` to `llnl.string`
- [x] Remove dependency of `llnl` on `spack.error`
- [x] Move path of `spack.util.path` to `llnl.path`
- [x] Move `spack.util.environment.get_host_*` to `spack.spec`
This resolves an interesting circular dependency between gcc and glibc:
1. glibc < 2.17 depends on libgcc.a and libgcc_eh.a
2. libgcc_eh.a is only built when gcc is configured with
--enable-shared
3. but building shared libraries requires crt*.o and libc.so
Backport AT_RANDOM auxval changes to avoid dealing with wrong inline
assembly (fallback code fails on ubuntu 23.04)
Update Trilinos and dependencies to build a limited version of Trilinos
on Windows.
* Support trilinos~mpi~shared on Windows
* superlu: force CMake build on Windows
* boost: update to build on Windows (proper option formatting and
build tool names)
* pcre, openblas: add CMake-based build (keep prior build system
as default on platforms other than Windows)
* openblas: add patch when using Intel Fortran compiler (currently
this is included as part of the hybrid %msvc compiler in Spack)
Co-authored-by: John Parent <john.parent@kitware.com>
As reported in #40159, a shared library build of ffmpeg 6.0 fails with the linker that was added with XCode 15:
ld: building exports trie: duplicate symbol '_av_ac3_parse_header'
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Forcing the old linker with -Wl,-ld_classic works around this.
VTK's (and therefore Paraview's) FindFreetype module required patching to
handle static import libraries from Freetype. However it did not cover
shared libraries. This adds support for importing shared freetype into the VTK build
* Rename var/spack/repos/builtin/packages/python/cray-rpath-3.1.patch as var/spack/repos/builtin/packages/python/rpath-non-gcc.patch and apply unconditionally
* Update var/spack/repos/builtin/packages/python/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-numcodecs: drop upperbound, add new version, avoid native compilation
* py-numcodecs: add entrypoints
* Remove another upperbound on python
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The fix for the compile issue was improved by Bernhard Kaindl.
He also added fixes for two other classes of build failures:
- add missing openssl dependency version limit to older openssh versions
- add missing -fcommon for new compilers building old openssh versions
Co-authored-by: snehring <snehring@iastate.edu>
* e4s cray rhel stack: expand to full spec list
* comment out gasnet; require %gcc for unzip
* require openblas@0.3.20 to get around %cce error; follow up with issue report
* comment out failing specs;
* comment out axom and xyce due to errors
* improve clarity of failing specs
* msvc.py: don't import distutils
Introduced in #27021, this import makes Spack forward-incompatible with newer
Python versions (distutils is removed in Python 3.12).
The module was already deprecated at the time of the PR.
* update spack package
* py-gpy: drop cython induced python upperbound
* py-gpy: bump scipy
* py-fn-py: python bounds for old version, new version w/o
* py-statsmodels: force recythonization
* py-gpy: clarifying comment about cython build type
* py-aiofiles: Add version 0.7.0 required by py-cylc-flow
* py-aiofiles: update from review
* depends on py-setuptools before 0.6
* depends on py-poetry-core after 0.7
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
* py-aiofiles: Taking review into account; add an upper version constraint for Python.
* py-aiofiles: update from review, set lower version of python
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-fenics-dolfinx: add upper bound on Python version
* Small fix
* Update var/spack/repos/builtin/packages/py-fenics-dolfinx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Fixes #39622
Add a timeout to compiler detection and allow Spack to proceed when
this timeout occurs.
In all cases, the timeout is 120 seconds: it is assumed any compiler
invocation we do for the purposes of verifying it would resolve in
that amount of time.
Also refine executables that are tested as being possible MSVC
instances, and limit where we try to detect MSVC. In more detail:
* Compiler detection should timeout after a certain period of time.
Because compiler detection executes arbitrary executables on the
system, we could encounter a program that just hangs, or even a
compiler that hangs on a license key or similar. A timeout
prevents this from hanging Spack.
* Prevents things like cl-.* from being detected as potential MSVC
installs. cl is always just cl in all cases that Spack supports.
Change the MSVC class to indicate this.
* Prevent compilers unsupported on certain platforms from being
detected there (i.e. don't look for MSVC on systems other than
Windows).
The first point alone is sufficient to address #39622, but the
next two reduce the likelihood of timeouts (which is useful since
those slow down the user even if they are survivable).
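A minimal sketch of the timeout guard described above (illustrative only; Spack's actual detection code is more involved):
```python
import subprocess
from typing import Optional

COMPILER_PROBE_TIMEOUT = 120  # seconds, per the description above

def probe_compiler(exe: str) -> Optional[str]:
    """Return the compiler's version output, or None if it fails or hangs."""
    try:
        result = subprocess.run(
            [exe, "--version"],
            capture_output=True,
            text=True,
            timeout=COMPILER_PROBE_TIMEOUT,
        )
    except (subprocess.TimeoutExpired, OSError):
        return None
    return result.stdout or None
```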
Put back normalization of the "virtuals" input as a sorted tuple.
Without this we might get edges that differ just for the order of
virtuals, or that have lists, which are not hashable.
Add unit-tests to prevent regressions.
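The normalization itself is tiny; roughly (a sketch, not the actual code):
```python
def normalize_virtuals(virtuals):
    """Order-insensitive, hashable representation of an edge's virtuals."""
    return tuple(sorted(virtuals))

assert normalize_virtuals(["mpi", "blas"]) == normalize_virtuals(("blas", "mpi"))
hash(normalize_virtuals(["lapack"]))  # tuples are hashable; lists are not
```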
By default, do not let deprecated versions enter the solve.
Previously you could concretize to something deprecated, only to get errors on install.
With this commit, we get errors on concretization, so the issue is caught earlier.
* initial commit to add composable kernel package
* change dependencies to type build and add amdgpu_target variant
* fix spacing
* fix styling
* remove rocmmlir from miopen-hip recipe
* enable miopen with ck after 5.5.1
* fix typo
Fix permissions for configure file in var/spack/repos/builtin/packages/hdf-eos2/package.py, fix dependencies to match what hdf provides, update compiler flags for apple-clang
* [chore] py-aniso8601: Add version 3.0.2 required by py-graphene
* py-aniso8601: Add version 7.0.0 and constraints on python version
* py-aniso8601: fix style
* [fix] py-aniso8601: remove version 3.0.2 which depends on removed version of python; set the minimal version of python to 3.7
* py-aniso8601: remove version constraint on python
* Add Scine QCMaquis recipe to builtin packages
* Format scine-qcmaquis recipe using black
* Remove import os from scine-qcmaquis recipe
* Reduce length of symmetries line in scine-qcmaquis
* Add myself as a maintainer of the scine-qcmaquis recipe
* Update Scine-QCMaquis recipe following the review
PR URL: https://github.com/spack/spack/pull/39709
Changes:
- Updated Symmetries variant to use the multi-value feature
- Added a dependency on boost+chrono since it was undocumented
- Use define_from_variant to setup CMake args
- Make version 3.1.2 be preferred since 3.1.3 build is broken
* Change build_tests boolean condition
---------
Co-authored-by: Adam Grofe <adamgrofe@microsoft.com>
PythonExtension is a base class for PythonPackage, and
is meant to be used for any package that is a Python
extension but is not built using "python_pip".
The "update_external_dependency" method in the base
class calls another method that is defined in the derived
class.
Push "get_external_python_for_prefix" up in the hierarchy
to make method calls consistent.
* Add OIDC tokens to gitlab-ci jobs
This should allow us to start issuing just-in-time generated
credentials for CI jobs that need to modify binary mirrors. The "aud"
claim of the token describes what the token is allowed to do. The
claim is verified against a set of rules on the IAM role using signed
information from GitLab. See spack-infrastructure for the claim
verification logic.
---------
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
This commit replaces the internal representation of deptypes with `int`, which is more compact
and faster to operate with.
Double loops like:
```
any(x in ys for x in xs)
```
are replaced by constant-time operations like `bool(xs & ys)`, where `xs` and `ys` are dependency types.
Global constants are exposed for convenience in `spack.deptypes`
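A toy illustration of the bitmask idea (the constant names below are made up, not the ones `spack.deptypes` actually exposes):
```python
# each dependency type is one bit; an edge's deptype is an OR of bits
BUILD, LINK, RUN, TEST = 1, 2, 4, 8

edge_deptype = BUILD | LINK         # a build+link edge
wanted = LINK | RUN                 # "does this edge matter at link/run time?"

# old: any(x in ys for x in xs) over tuples of strings
# new: one constant-time bitwise test
print(bool(edge_deptype & wanted))  # True
```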
Currently the configuration of the pipeline is such that
there are multiple "optimal" solutions. This is due to
the pipeline making ~lld the default for LLVM, but leaving
+libomptarget at its default from package.py.
Since the LLVM recipe has a
`conflicts("~lld", "+libomptarget")`,
clingo is forced to change one of the two defaults, and
the penalty for doing so is the same for each. Hence, it
is random whether we'll get +libomptarget+lld
or ~libomptarget~lld.
This fixes it by changing also the default for libomptarget.
* scorep version 8.1 added
* configure finds cudart and cupti in the nvhpcsdk suite
* style fixed
* changes to find libcuda.so in cuda directory
---------
Co-authored-by: Laura Bellentani <lbellen1@login01.leonardo.local>
* cp2k: patch several old versions to help newer compilers
* cp2k: use -O2 optimization for AOCC compiler
* cp2k: do not support old AOCC compilers
* cp2k: simplify when clause due to conflicting out old compilers
* cp2k: give a more meaningful message for conflicts
Co-authored-by: Ning Li <ning.li@amd.com>
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
* libfirefly: Add cmake package for v2.1.0 and master (git) versions
* Separate git URL from version declaration
Co-authored-by: Alec Scott <alec@bcs.sh>
Currently, the concretizer emits facts for all versions known to Spack, including deprecated versions, and has a specific optimization objective to minimize their use.
This commit simplifies how deprecated versions are handled by considering possible versions for a spec only if they appear in a spec literal, or if `config:deprecated:true` is set, either directly or through the `--deprecated` flag. The optimization objective has also been removed, in favor of just ordering versions and putting deprecated ones last.
This results in:
a) no delayed errors on install, but concretization errors when deprecated versions would be the only option. This is in particular relevant for CI where it's better to get errors early
b) a slight concretization speed-up due to fewer facts
c) a simplification of the logic program.
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
* unifyfs: drop upperbound on deprecated openssl
The package uses deprecated MD5 functions from OpenSSL, which causes
warnings, but (a) Spack by default disables -Werror, and (b) those
functions will continue to exist in OpenSSL 3.
* unifyfs: enable parallel build, only make check sequential
* unifyfs: order class methods by install phases
* py-maestrowf: add new version 1.1.9, deprecate development releases
* py-maestrowf: drop py-cryptography in 1.1.9
* py-maestrowf: drop py-cryptography dependency entirely, since it is not a direct dependency
* py-merlin: new version, ensure openssl 3 compat
* py-merlin: drop py-coloredlogs@10: lowerbound
* py-maestrowf: add py-rich, reorder deps
* py-celery: explain why upperbound is in spack but not in requirements.txt
* openexr: add 2.4.3, 2.5.9, 3.1.11 & 3.2.0
- 2.5.9 is the latest version compatible with OpenSceneGraph
- improved compatibility with GCC 13
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* openexr: drop unsupported debug variant
--disable/enable-debug is not supported by ./configure in 1.3, 1.7, 2.0, 2.2 and 2.3
* openexr: transform into multi-build system package
simplifies package considerably, as nothing special seems to be required
* openexr: pkg-config is also used by @3
* openexr: use system libdeflate instead of internal
if no libdeflate is found, openexr would download and build its own starting with 3.2.0
* openexr: disable tests
would download lots of data during cmake, and make build times noticeably longer
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Make the state of the Python Front End (PFE) and Python data reader
support sticky so that the concretizer does not arbitrarily disable
them.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* package_qscintilla_build_with_qt6
* Update var/spack/repos/builtin/packages/qscintilla/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* improve
* fix Qsci.abi3.so install
* simplify, fix, tidy
* fix style
* fix style
* fix style
* Update var/spack/repos/builtin/packages/qscintilla/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/qscintilla/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/qscintilla/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/qscintilla/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/qscintilla/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* improve
* improve style
* fix style
* make black happy
* add ver 2.14.1
* update
* make black happy
* Update var/spack/repos/builtin/packages/qscintilla/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* improve
---------
Co-authored-by: Sinan81 <Sinan@world>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* simple build of py-openai
* added variants to py-openai
* py-pandas-stubs is a dependency for py-openai
* fixed format and flake8 errors for py-openai
* black format error for py-pandas-stubs
* [@spackbot] updating style on behalf of sidpbury
* made style and format changes to py-openai
* made style and format changes to py-pandas-stubs
* py-types-pytz is a dependency for py-openai
* [@spackbot] updating style on behalf of sidpbury
* updated py-openpyxl for ver 3.0.7 and 3.1.2
* Update var/spack/repos/builtin/packages/py-pandas-stubs/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ajs requested changes for py-openai
* updated py-openpyxl for supported python
* [@spackbot] updating style on behalf of sidpbury
* updated py-openpyxl
* removed requirement.txt dependencies in py-openpyxl
* removed python depends on from openpyxl
* updated package to support newer versions
* updated version of py-pygit2
* py-fairscale is a new package in support of torch
* py-pgzip is a dependency for py-fairscale
* switch fairscale pypi, added extra variant for convenience
* removed python dependency
* changed multiple requirement versions
* changes for upstream py-fairscale
* changes for upsteam py-pygit2
* Update package.py
* Update package.py
* sorted out some of the dependency versions
* removed version 1.12.2 because dependency could not be met
* updated py-cached-property dependency
* suggested changes from adamjstewart
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ensure umpire~cuda~rocm when ~cuda~rocm
* update mdanalysis
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-hatchling: add 1.18.0
* py-pipdeptree: add new package
* py-hatchling: add Python 3.8 dependency
* Apply suggestion from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Fábio de Andrade Barboza <f168817@dac.unicamp.br>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* glibc: add missing deps, drop < 2.17
Building glibc 2.17 requires linking libgcc_{s,eh}.a, which themselves
depend on whatever glibc the current gcc uses for its runtime libraries.
Newer gcc depends on gnu extensions of libdl it seems, so that means you
simply cannot build old glibc with gcc-using-new-glibc.
The relevant fix in glibc is this commit:
commit 95f5a9a866695da4e038aa4e6ccbbfd5d9cf63b7
Author: Joseph Myers <joseph@codesourcery.com>
Date: Tue Jul 3 19:14:59 2012 +0000
Avoid use of libgcc_s and libgcc_eh when building glibc.
See also references to that commit in the glibc mailing list.
* update the gmake bound
* add --without-selinux
SIRIUS was introduced in version 7 of cp2k, but could be used
in practice only from version 9 (input format and functionalities).
SIRIUS versions 6 and below are marked as a dependency
conflict until CP2K version 9.
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Instead of pointing to the image on DockerHub, which rate limits us and
causes pipeline failures during busy times, use the version at ghcr.
And we might as well use the ghcr version everywhere else too.
NMake makefiles are still called makefiles. The corresponding builder
variable was called "nmakefile", which is a bit unintuitive and led
to a few easy-to-make, hard-to-notice mistakes when creating packages.
This commit renames the builder property to be "makefile"
Extensionless archives requiring two-stage decompression and extraction
require intermediate archives to be renamed after decompression/extraction
to prevent collisions. Prior behavior attempted to clean up the intermediate
archive under its original name; this PR ensures the renamed folder is
cleaned instead.
Co-authored-by: Dan Lipsa <dan.lipsa@khq.kitware.com>
Co-authored-by: John Parent <john.parent@kitware.com>
Smart alias completion introduced in #39499 wasn't as smart as it needed to be, and
would complete any invalid command prefix and some env names with alias names.
- [x] don't complete aliases if there are no potential completions
e.g., don't convert `spack isnotacommand` -> `spack concretize`
- [x] don't complete with an alias if we're not looking at a top-level subcommand.
* Perform external spec detection with multiple workers
The logic to perform external spec detection has been refactored
into classes. These classes use the GoF "template" pattern to account
for the small differences between searching for "executables" and
for "libraries", while unifying the larger part of the algorithm.
A ProcessPoolExecutor is used to parallelize the work.
* Speed-up external find by tagging detectable packages automatically
Querying packages by tag is much faster than inspecting the repository,
since tags are cached. This commit adds a "detectable" tag to every
package that implements the detection protocol, and external detection
uses it to search for packages.
* Pass package names instead of package classes to workers
The slowest part of the search is importing the Python modules
associated with candidate packages. The import is done serially
before we distribute the work to the pool of executors.
This commit pushes the import of the Python module to the job
performed by the workers, and passes just the names of the packages
to the executors.
In this way imports can be done in parallel (see the sketch after this list).
* Rework unit-tests for Windows
Some unit tests were doing a full e2e run of a command
just to check input handling. Make the test more
focused by just stressing a specific function.
Mark as xfailed 2 tests on Windows, that will be fixed
by a PR in the queue. The tests are failing because we
monkeypatch internals in the parent process, but the
monkeypatching is not done in the "spawned" child
process.
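A small, generic sketch of the "pass names, import in the worker" idea from the second bullet above (the module names below are placeholders, not Spack packages):
```python
import importlib
from concurrent.futures import ProcessPoolExecutor

def detect(module_name: str):
    # the expensive import now happens inside the worker process
    module = importlib.import_module(module_name)
    detect_fn = getattr(module, "detect", None)   # hypothetical per-package hook
    return module_name, detect_fn() if detect_fn else []

if __name__ == "__main__":
    candidates = ["json", "csv"]                  # placeholder module names
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(detect, candidates))
```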
* Write timing information for installs from cache
* CI: aggregate and upload install_times.json to artifacts
* CI: Don't change root directory for artifact generation
* Flat event based timer variation
Event based timer allows for easily starting and stopping timers without
wiping sub-timer data. It also requires less branching logic when
tracking time.
The json output is non-hierarchical in this version and hierarchy is
less rigidly enforced between starting and stopping.
* Add and write timers for top level install
* Update completion
* remove unused subtimer api
* Fix unit tests
* Suppress timing summary option
* Save timers summaries to user_data artifacts
* Remove completion from fish
* Move spack python to script section
* Write timer correctly for non-cache installs
* Re-add hash to timer file
* Fish completion updates
* Fix null timer yield value
* fix type hints
* Remove timer-summary-file option
* Add "." in front of non-package timer name
---------
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
* seacas: Set TPL_ENABLE_Pthread=ON when +thread_safe
This should fix #39702
Basically, following suggestion of error message and setting
TPL_ENABLE_Pthread to the value of the boolean spack variant thread_safe
* seacas: Fix style issue
Add space after comment #
* cgns: Add -fPIC to Fortan/C compilation flags
This should fix #39699
* cgns: Add -fPIC compilation flags when +fortran
Incorporating suggestions of @aumuell with addition of making +pic required when +fortran
* r-sets: Add r-sets.
* r-sets: Fix R dependency for first version.
* r-sets: Remove package versions for which a compatible version of R cannot be installed.
The spiral-software package had a number of extensions, but does not
work unless they actually exist in the spiral software prefix (creating
a view is not sufficient). With the removal of "spack activate"
(different from "spack env activate"), a new approach is needed to
support optional components of `spiral-software`. This commit updates
the spiral-software package to copy the dependency installations into
its own prefix.
This commit also adds versions for `fftx` and `spiral-software`, as
well as an optional `spiral-software-jit` package.
This is a fixed version of b72a268
* That commit would discard the final key component (so if you set
"config:install_tree:root", it would discard "root" and just set
install tree).
* When setting key:"value", with the quotes, that commit would
discard the quotes, which would confuse the system if adding a
value like "{example}" (the "{" character indicates a dictionary).
This commit retains the quotes.
* adding bacio to g2 dependencies
* edited documentation
* added version 3.4.6
* starting with 3.4.6, can use any version of jasper
* adding w3emc dependency for versions up to 3.4.5
* removed t-brown as maintainer at his request
* Add hpx-kokkos 0.4.0
* Make git global package property in hpx-kokkos instead of having it version-specific
* Add variant for choosing future type in hpx-kokkos
* Add support for testing hpx-kokkos
These commands are currently broken on powershell (Windows) due to
improper use of the InvokeCommand commandlet and a lack of direct
support for the `--pwsh` argument in `spack load`, `spack unload`,
and `spack env deactivate`.
If you wanted to set a configuration option like
`config:install_tree:root` to "C:/path/to/config.yaml", Spack had
trouble parsing this because of the ":" in the value. This adds
logic to allow using quotes to enclose the value, so you can add
`config:install_tree:root:"C:/path/to/config.yaml"`.
Configuration keys should never contain a quote character, so the
presence of any quote is taken to mean that the rest of the string
is specifying the value.
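A toy version of the quoting rule described above (not Spack's actual parser):
```python
def split_path_and_value(arg: str):
    """Split 'a:b:c:"value"' into (['a', 'b', 'c'], 'value')."""
    quote = min((i for i in (arg.find('"'), arg.find("'")) if i != -1), default=-1)
    if quote != -1:
        # the first quote means everything from here on is the value
        path, value = arg[:quote].rstrip(":"), arg[quote:].strip("'\"")
    else:
        path, _, value = arg.rpartition(":")
    return path.split(":"), value

print(split_path_and_value('config:install_tree:root:"C:/path/to/root"'))
# (['config', 'install_tree', 'root'], 'C:/path/to/root')
```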
* add a possibility to control default cuda version
* fix stype
* style fix
* resolve comment
* resolve comment
* Fix style in nvhpc package.py
---------
Co-authored-by: antonk <antonk@cscs.ch>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Setting the undocumented variable SPACK_CONCRETIZER_REQUIRE_CHECKSUM
now causes the solver to avoid accounting for versions that are not checksummed.
This feature is used in CI to avoid spurious concretization against e.g. develop branches.
* The flang project does not support exceptions enabled in the core llvm
library (and developer guidelines explicitly state they should not be
used). For this reason, when the flang variant is selected,
LLVM_ENABLE_EH needs to be disabled. In the current main branch of
llvm (and thus future releases), enabling flang and setting
LLVM_ENABLE_EH will cause the overall build to fail.
* Update var/spack/repos/builtin/packages/llvm/package.py
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
* fix syntax
---------
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
Currently, OneAPI's setvars scripts effectively disregard any arguments
we're passing to the MSVC vcvars env setup script, and additionally,
completely ignore the requested version of OneAPI, defaulting to whatever
the latest installed on the system is.
This leads to a scenario where we have improperly constructed Windows
native development environments, with potentially multiple versions of
MSVC and OneAPI being loaded or called in the same env. Obviously this is
far from ideal and leads to some fairly inscrutable errors such as
overlapping header files between MSVC and OneAPI and a different version
of OneAPI being called than the env was setup for.
This PR solves this issue by creating a structured invocation of each
relevant script in an order that ensures the correct values are set in
the resultant build env.
The order needs to be:
1. MSVC vcvarsall
2. The compiler specific env.bat script for the relevant version of
the oneapi compiler we're looking for. The root setvars script seems
to respect this as well, although it is less explicit.
3. The root oneapi setvars script, which sets up everything else the
oneapi env needs and seems to respect previous env invocations.
`compgen -W` does not behave the same way in zsh as it does in bash; it seems not to
actually generate the completions we want.
- [x] add a zsh equivalent and `_compgen_w` to abstract it away
- [x] use `_compgen_w` instead of `compgen -W`
Bash completion is now smarter about handling aliases. In particular, if all completions
for some input command are aliased to the same thing, we'll just complete with that thing.
If you've already *typed* the full alias for a command, we'll complete the alias.
So, for example, here there's more than one real command involved, so all aliases are
shown:
```console
$ spack con
concretise concretize config containerise containerize
```
Here, there are two possibilities: `concretise` and `concretize`, but both map to
`concretize` so we just complete that:
```console
$ spack conc
concretize
```
And here, the user has already typed `concretis`, so we just go with it as there is only
one option:
```console
spack concretis
concretise
```
From a user:
> Aargh.
> ```
> ==> Error: concretise is not a recognized Spack command or extension command; check with `spack commands`.
> ```
To make things easier for our friends in the UK, this adds `concretise` and
`containerise` aliases for the `spack concretize` and `spack containerize` commands.
- [x] add aliases
- [x] update completions
Switch from makefile to CMake-based build. CMake support is currently
in a specific branch of the amg2023 repo, so add this branch as a
version in the package.
Set BL_USE_PARTICLES to 1, which should cause the boxlib build to include the Particles classes
according to CMakeLists.txt.
This seems to fix #18172
The aforementioned error seems to occur in cmake phase while processing
CMakeLists.txt in Src/C_ParticleLib, and appears to be due to the
variable containing the list of src files for the add_library() call
being empty unless BL_USE_PARTICLES is set to 1.
Use full length commit sha instead of short prefixes, to improve
reproducibility (future clashes) and guard against compromised repos and
man in the middle attacks.
Abbreviated commit shas are expanded to full length, to guard against future
clashes on short hash. It also guards against compromised repos and
man in the middle attacks, where attackers can easily fabricate a malicious
commit with a shasum prefix collision.
Versions with just tags now also get a commit sha, which can later be used to
check for retagged commits.
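Expanding an abbreviated sha to its full form can be done with a plain `git rev-parse` call, roughly like this (a sketch; it assumes the commit is already present in the local clone):
```python
import subprocess

def expand_sha(repo_dir: str, short_sha: str) -> str:
    """Return the full 40-character sha for an abbreviated commit sha."""
    return subprocess.run(
        ["git", "-C", repo_dir, "rev-parse", short_sha],
        check=True, capture_output=True, text=True,
    ).stdout.strip()
```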
* VTK: Add patch for python 3.8 support
* CI: Re-enable VisIt in CI
* Configure spec matrix for stack with VisIt
* Add pugixml dep for 8.2.0
* Make VTK and ParaView consistent on proj dep
* OpenMPI 3: provides MP support by default
* Add details on proj dep in ParaView
* Add python 3.8 to test mock repo
* Patches to get VisIt VTK interface
* CI: Disable VisIt with GUI in DAV
GitHub's beta private security issue reporting feature is enabled on the Spack repo now,
so we can change `SECURITY.md` to recommend using it instead of `maintainers@spack.io`.
- [x] Update `SECURITY.md` to direct people to the GitHub security tab.
- [x] Update wording in `SECURITY.md` to say "last two major releases" with a link to
the releases page, instead of explicitly listing release names. This way we don't have
to update it (which we keep forgetting to do).
This reapplies 66f7540, which adds supports for hardlinks/junctions on
Windows systems where developer mode is not enabled.
The commit was reverted on account of multiple issues:
* Checks added to prevent dangling symlinks were interfering with
existing CI builds on Linux (i.e. builds that otherwise succeed were
failing for creating dangling symlinks).
* The logic also updated symlinking to perform redirection of relative
paths, which led to malformed symlinks.
This commit fixes these issues.
There are other ways to enforce cray-pmi being loaded in environments
that use cray-mpich. This avoids breaking environments where this was
already the case and avoids forcing them to declare an external.
* AOCC and AOCL spack recipes for 4.1 release
* Fix broken checksum
* remove blank line
* Add missing `@when` for 4.1 only function
---------
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
* Create a smartsim package
* rm ss 0.4.2
* no py upper bound, add build dep
* add setup_build_env
* add comment to find ml deps lower bounds
* Apply suggestions from code review
Correct dep versions, use `python_purelib`
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* remove the cuda/rocm vars
* point editors to bin deps version constraints
* Apply suggestions from code review
Loosen `py-smartredis` constraint, enforce `setup.cfg` py version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* style
* rm rai lower bound
* lower bound setuptools
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add magma variant to strumpack
* clarify conflicts to be exclusive to rocm/cuda; without it, +magma+cuda fails as it is ~rocm
* add missing depends_on for magma variant
#35042 introduced lazy hash parsing, but didn't remove a
few attributes from the parser that were needed only for
concrete specs
This commit removes them, since they are effectively
dead code.
Clone is a hard dependency as of HTTP-Message v6.44. This causes
problems in packages like Roary which depend on perl-http-message:
```console
$ spack load roary
$ roary
Use of uninitialized value in require at /var/scratch/spack/opt/spack/linux-centos8-x86_64_v3/gcc-8.5.0/perl-http-message-6.44-lzp5th4jddd3gojkjfli4hljgem2nl26/lib/perl5/HTTP/Headers.pm line 8.
Can't locate Clone.pm in @INC (you may need to install the Clone module) (@INC contains: /home/aorth/lib ...
```
See: https://github.com/libwww-perl/HTTP-Message/blob/master/Changes
While spack does not yet provide binutils 2.41, it might still be
installed. However, building ffmpeg on x86_64 fails with multiple errors like
this:
./libavcodec/x86/mathops.h:125: Error: operand type mismatch for `shr'
also reported here: https://trac.ffmpeg.org/ticket/10405
Add amg2023 package
Consolidate existing amg and amg2013 packages (they reference the
same code) under the amg2013 name to minimize confusion between
amg2023 and amg2013.
Co-authored-by: Riyaz Haque <haque1@llnl.gov>
The heuristic for duplicate nodes contains a few typos, and
apparently slows down the solve for specs that have a lot of
sub-optimal choices to be taken.
This is likely because with a lot of sub-optimal choices, the
low priority, flawed heuristic is being used by clingo.
Here I split the heuristic, so complex rules that matter only
if we allow multiple nodes from the same package are used
only in that case.
* use full cuda_arch list
* update compiler requirement
* update boost requirements
* propagate +kokkos to legion in non-GPU cases
* add missing graphviz dependency for +doc
Spack installs the hsa-rocr-dev and rocprofiler packages into different
directories. However, PAPI typically expects those to be under the same
directory and locates such directory through the PAPI_ROCM_ROOT env
variable.
The PAPI rocm component also allows users to override PAPI_ROCM_ROOT and
locate librocprofiler64.so directly through the HSA_TOOLS_LIB env
variable, which acts directly on the HSA runtime tools mechanism.
Hence, in order to account for the decoupling of hsa and rocprofiler,
this patch sets HSA_TOOLS_LIB to the full path of librocprofiler64.so.
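In package terms this boils down to something like the following sketch (the hook and the `rocprofiler-dev` dependency name are assumptions here, not a quote of the patch):
```python
from spack.package import *


class Papi(AutotoolsPackage):
    # ... existing recipe omitted ...

    def setup_build_environment(self, env):
        # hsa-rocr-dev and rocprofiler live in different prefixes under Spack,
        # so point the rocm component directly at the profiler library
        lib_dir = self.spec["rocprofiler-dev"].prefix.lib
        env.set("HSA_TOOLS_LIB", join_path(lib_dir, "librocprofiler64.so"))
```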
* Update pennylane-lightning.
* Update Lightning-Kokkos to v0.31
* Constrain scipy version.
* Update var/spack/repos/builtin/packages/py-pennylane-lightning-kokkos/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix PLK kokkos dep versioning.
* Move kokkos ver outside backend loop and reformat.
* Update package.py
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Since #34821 we are annotating virtual dependencies on
DAG edges, and reconstructing virtuals in memory when
we read a concrete spec from previous formats.
Therefore, we can remove a TODO in asp.py, and rely on
"virtual_on_edge" facts to be imposed.
Add support for PGO and LTO for gcc, clang and apple-clang, and add a
patch to allow mimalloc as an allocator in operator new/delete, which
reduces clingo runtime by about 30%.
Previous changes to this file stopped directly processing CL args to
stop batch `for` from interpolating batch reserved characters needed in
arguments like URLs. In doing so, we relied on `for` for an easy
"split" operation; however, this incorrectly splits paths with spaces in
certain cases. Processing everything ourselves with manual looping via
`goto` statements allows for full control over CL parsing and handling
of both paths with spaces and reserved characters.
* WarpX 23.08
Update WarpX and related Python packages to the latest releases.
* fix style
---------
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
In the HPC package manager, we want the fastest `zlib` implementation by default. `zlib-ng` is up to 4x faster than stock `zlib`, and it can do things like take advantage of AVX-512 instructions. This PR makes `zlib-ng` the default `zlib-api` provider (`zlib-api` was introduced earlier, in #37372).
As far as I can see, the only issues you can encounter are:
1. Build issues with packages that heavily rely on `zlib` internals. In Gitlab CI only one out of hundreds of packages had that issue (it extended zlib with deflate stuff, and used its own copy of zlib sources).
2. Packages that like to detect `zlib-ng` separately and rely on `zlib-ng` internals. The only issue I've found with this among the hundreds of packages built in CI is `perl`, which tries to report more specific zlib-ng version details and relied on some internals that got refactored. But yeah... that warrants a patch / conflict and is nothing special.
At runtime, you cannot really have any issues, given that zlib and zlib-ng export the exact same symbols (and zlib-ng tests this in their CI).
You can't really have issues with externals when using zlib-ng either. The only type of issue is when system zlib is rather new, and not marked as external; if another external uses new symbols, and Spack builds an older zlib/zlib-ng, then the external might not find the new symbols. But this is a configuration issue, and it's not an issue caused by zlib-ng, as the same would happen with older Spack zlib.
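If you prefer the stock implementation anyway, the provider preference can be flipped back in `packages.yaml`; a minimal sketch:
```yaml
packages:
  all:
    providers:
      zlib-api: [zlib, zlib-ng]
```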
* zlib-api: use zlib-ng +compat by default
* make a trivial change to zlib-ng to trigger a rebuild
* add `haampie` as maintainer
Computing str(spec) is faster than computing hash(spec), and
since all the abstract specs we deal with come from user configuration
they cannot cover DAG structures that are not captured by str() but
are captured by hash()
Delay lookup for abstract hashes until concretization time, instead of
until Spec comparison. This has a few advantages:
1. `satisfies` / `intersects` etc don't always know where to resolve the
abstract hash (in some cases it's wrong to look in the current env,
db, buildcache, ...). Better to let the call site dictate it.
2. Allows search by abstract hash without triggering a database lookup,
causing quadratic complexity issues (accidental nested loop during
search)
3. Simplifies queries against the buildcache, they can now use Spec
instances instead of strings.
The rules are straightforward (a sketch follows the list):
1. a satisfies b when b's hash is prefix of a's hash
2. a intersects b when either a's or b's hash is a prefix of b's or a's
hash respectively
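As a plain-Python illustration of these two rules (function names are made up for the example; this is not Spack's implementation):
```python
def hash_satisfies(full_hash: str, abstract_hash: str) -> bool:
    # a satisfies b when b's (possibly abbreviated) hash is a prefix of a's hash
    return full_hash.startswith(abstract_hash)


def hash_intersects(hash_a: str, hash_b: str) -> bool:
    # a intersects b when either hash is a prefix of the other
    return hash_a.startswith(hash_b) or hash_b.startswith(hash_a)
```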
The median length of this list is 1. For reasons I don't know, `.sort()`
still likes to call the key function.
This saves ~9% of total database read time, and the number of calls
goes from 5305 -> 1715.
* Do not impose provider conditions, if the node is not a provider
fixes #39455
When a node can be a provider of a spec, but is not selected as
a provider, we should not be imposing provider conditions on the
virtual.
* Adjust the integrity constraint, by using the correct atom
* Add "only_clingo", "only_original" and "not_on_windows" markers
* Modify tests to use the "not_on_windows" marker
* Mark tests that run only with clingo
* Mark tests that run only with the original concretizer
* Fixed HeFFTe package spec to not do the smoke test prior to 2.2.0, where it breaks
* Convert test return to 'raise SkipTest'
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
To avoid paying the cost of setup and of a full grounding again,
move cycle detection into a separate program and check first if
the solution has cycles.
If it has, ground only the integrity constraint preventing cycles
and solve again.
The "concretizer" section has been extended with a "duplicates:strategy"
attribute, that can take three values:
- "none": only 1 node per package
- "minimal": allow multiple nodes opf specific packages
- "full": allow full duplication for a build tool
This refactor introduces extra indices for triggers and
effect of a condition, so that the corresponding clauses
are evaluated once for every condition they apply to.
All the solution modes we use imply that we have to solve for all
the literals, except for "when possible".
Here we remove a minimization on the number of literals not
solved, and emit directly a fact when a literal *has* to be
solved.
Introduce the concept of "condition sets", i.e. the set of packages on which
a package can require / impose conditions. This currently maps to the link/run
sub-dag of each package + its direct build dependencies.
Parametrize the "condition" and "requirement" logic to multiple nodes.
So far the encoding has a single ID per package, i.e. all the
facts will be node(0, Package). This will prepare the stage for
extending this logic and having multiple nodes from the same
package in a DAG.
Each fact that is deduced from package rules, and starts with
a bare package atom, is transformed into a "facts" atom containing
a nested function.
For instance, we transformed
version_declared(Package, ...) -> facts(Package, version_declared(...))
This allows us to clearly mark facts that represent a rule on the package,
and will be of help later when we'll have to distinguish the cases where
the atom "Package" is used to refer to package rules rather than to a
node in the DAG.
Windows executable paths can have spaces in them, which was leading to
errors when constructing Executable objects: the parser was intended
to handle cases where users could provide an executable along with one
or more space-delimited arguments.
* Executable now assumes that it is constructed with a string argument
that represents the path to the executable, but no additional arguments.
* Invocations of Executable.__init__ that depended on this have been
updated (this includes the core, tests, and one instance of builtin
repository package).
* The error handling for failed invocations of Executable.__call__ now
includes a check for whether the executable name/path contains a
space, to help users debug cases where they (now incorrectly)
concatenate the path and the arguments.
* The module-level skip for tests in `cmd.install` on Windows is removed.
A few classes of errors still persist:
* Cdash tests are not working on Windows
* Tests for failed installs are also not working (this will require
investigating bugs in output redirection)
* Environments are not yet supported on Windows
overall though, this enables testing of most basic uses of "spack install"
* Git repositories cached for version lookups were using a layout that
mimicked the URL as much as possible. This was useful for listing the
cache directory and understanding what was present at a glance, but
the paths were overly long on Windows. On all systems, the layout is
now a single directory based on a hash of the Git URL and is shortened
(which ensures a consistent and acceptable length, and also avoids
special characters).
* In particular, this removes util.url.parse_git_url and its associated
test, which were used exclusively for generating the git cache layout
* Bootstrapping is now enabled for unit tests on Windows
Change the shebang in mkinc from /usr/bin/perl to /usr/bin/env perl for
portability to systems that don't necessarily have perl in /usr/bin.
Also adds perl as a build-time dependency.
#36770 added git as a dependency to `setuptools-scm`. This in turn makes `git` a
transitive dependency for our bootstrapping process.
Since `git` may take a long time to build, and is found on most systems, try to
detect it as an external.
* WarpX 23.07
Update WarpX and related Python packages to the latest releases.
* py-picmistandard: refresh hashes
Keeping even a single `git` version in here confuses the fetcher.
Remove broken old versions.
* Remove `py-warpx` versions with broken PICMI
* py-cryptography: does not run-depend on py-setuptools-rust
* py-cryptography: depends_on py-setuptools-rust when @3.4.2:
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-cryptography: re-add depends_on type=run for narrow range
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This makes the name of the global variable representing
the repository currently in use uppercase. Doing so is advised
by pylint rules, and helps to identify where the global is used.
* fix(sgpp): Fix installation phase scons args
* fix(sgpp): Workaround for distutils deprecation
The distutils deprecation warning in Python 3.10 - 3.11 caused
problems within the SGpp SConfigure checks by causing failures
when looking for Python.h. This commit works around this by adding a
patch that simply disables the warning. It also puts limits on the
python dependency version until distutils is removed from SGpp.
* fix(sgpp): cleanup and simplify
* fix(sgpp): Fix style
`cmake+qt` depends on `qt`, which depends on `libmng`, which is a CMake
package, and has been for 4 years. Nobody ever complained about
`cmake+qt` not concretizing... so why pay the solve cost.
Before:
```
setup 3.779s
load 0.018s
ground 2.625s
solve 4.511s
total 11.236s
```
After:
```
setup 3.734s
load 0.018s
ground 0.468s
solve 0.560s
total 5.080s
```
* Prefix conflict messages with package name
This patch prefixes all conflict messages with the package name to
alleviate what was otherwise a very manual process. Note that this patch
is a one line change but has a fairly outsized impact.
* same for requires directive
---------
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
* AMReX: 23.06+ Multi-Dim Support
This updated the Spack package to allow to install AMReX, modules of
AMReX in E4S deployments and dependent packages with support for
multiple dimensions. Due to an upstream change in AMReX, we no
longer need to ship three binary-incompatible package variants.
* [E4S] oneAPI AMReX < 23.06 Variant
Work-around the auto-concretization to the multi-dim of `dimensions`,
which only in 23.06+ became a multi-variant.
* e4s cray rhel ci: temporarily disable amrex build until spurious ci failure can be resolved
---------
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
The package won't compile with newer compilers because warnings
are converted to errors. Hence, disable such conversion.
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
With the previous version of the recipe I would get linking errors due
to missing `-lfftw3f`.
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
* Ensure that all variants have a description
* Update mock packages too
* Fix test invocations
* Black fix
* mgard: update variant descriptions
* flake8 fix
* black fix
* Add to audit tests
* Relax type hints
* Older Python support
* Undo all changes to mock packages
* Flake8 fix
This adds the latest C-Blosc2 release, the first to add CMake
CONFIG install files for easier downstream usage of the installed
library and CMake targets.
* include tests for h5bench
* update to new test format
* update path
* improve description and detect MPI runner
* fix test name
* Include new release
* update with corrections
* include SLURM dependency and reduce test to one process
* fixes on filters
* fix on oversubscribe
* suggested fixes
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Refactor gitlab ci configs so that mac and cray jobs can reuse as much higher level
configuration as possible.
* CI: remove redundant sections
* CI: Include base linux CI configs in cray stacks
Relocation and runner mapping is consistent between cray and linux runners.
* Export user cache path in before script
* CI: add GPG root for mac runners
* Disable user configs
Metal runners share a ~ directory
* Disable user config and add configs in activate env
This PR extracts two responsibilities from the `Database` class:
1. Managing locks for prefixes during an installation
2. Marking installation failures
and pushes them into their own class (`SpecLocker` and `FailureMarker`). These responsibilities are also pushed up into the `Store`, leaving to `Database` only the duty to manage `index.json` files.
`SpecLocker` classes no longer share a global list of locks, but locks are per instance. Their identifier is simply `(dag hash, package name)`, and not the spec prefix path, to avoid circular dependencies across Store / Database / Spec.
* Fixed the LD_LIBRARY_PATH to use the lib64 directory.
* On AMD systems the llvm/lib directory is not properly put into the LD_LIBRARY_PATH.
* Added both lib and lib64 paths for libfabric.
* Split the prepend statements.
* Add support to export the LD_LIBRARY_PATH for the libfabric package
and subsequent module files.
Fix the AWS OFI RCCL package so that it prepends the environment
variables.
* Fixed comment
* Create a spack package for smartredis python client
* make py-SR deps versions match docs
* tie SR v0.4.0 to redis-plus-plus v1.3.5
* looser extension lib deps for concretization
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Address reviewer feedback
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-sphinx-rtd-theme: avoid concretizing 0.5 with Sphinx 7.0
* Update var/spack/repos/builtin/packages/py-sphinx-rtd-theme/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* vecgeom: new version 1.2.5
* Mark previous versions as deprecated
* [@spackbot] updating style on behalf of sethrj
* vecgeom: fix 1.2.4 checksum for change to git describe
FastPackageChecker.modified_since should use a default number < 0
When the repo cache does not exist, Spack uses mtime 0. This causes the repo
cache not to be generated when the repo has mtime 0.
Some popular package managers such as spack use 0 mtime normalization for
reproducible tarballs. So when installing spack with spack from a buildcache, the
repo cache doesn't get generated.
Also add some type hints
* Inform mypy that tty.die is noreturn
* avoid temporary allocation in env
* update spack buildcache save-specfile
* fix spack buildcache check/download/get-buildcache-name
- ensure that required args and mutually exclusive ones are marked as
such in argparse for better error messages
- deprecate --spec-file everywhere
- use disambiguate for better error messages
* py-paralleltask: new package @0.2.2
* adding hidden dependency
* nextdenovo: new package @2.5.2
* style
* Update var/spack/repos/builtin/packages/py-paralleltask/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-melissa-core: add new versions and py-iterative-stats dependency
* Enhance dependency specification
* Fix dependency specification
* Fix comment alignment
* Improve dependency specification style
* Enhance dependencies
* Fix mpi4py, py-cloudpickle and py-python-hostlist dependencies
* Update var/spack/repos/builtin/packages/py-melissa-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-melissa-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-melissa-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Set develop version as preferred
* Update var/spack/repos/builtin/packages/py-melissa-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Marc Schouler <marc.schouler@inria.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* libEnsemble: add v0.10.2
* Make setuptools build only dep
* Update var/spack/repos/builtin/packages/py-libensemble/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* tximeta: add package
Signed-off-by: Pablo <pablo.aledo@seqera.io>
* [@spackbot] updating style on behalf of pabloaledo
---------
Signed-off-by: Pablo <pablo.aledo@seqera.io>
* CI: Refactor ci reproducer
* Autostart container
* Reproducer paths match CI paths
* Generate start scripts for docker and reproducer
* CI: Add interactive and gpg options to reproduce-build
* Interactive will determine if the docker container persists
after running reproduction.
* GPG path/url allow downloading GPG keys needed for binary
cache download validation. This is important for running
reproducer for protected CI jobs.
* Add exit_on_failure option to CI scripts
* CI: Add runtime option for reproducer
- Run `mkdirp` on `spec.prefix`
- Extract directly into `spec.prefix`
1. No need for `$store/tmp.xxx` where we extract the tarball directly, pray that it has one subdir `<name>-<version>-<hash>`, and then `rm -rf` the package prefix followed by `mv`.
2. No need to clean up this temp dir in `spack clean`.
3. Instead, figure out the package directory prefix from the tarball contents, and strip the tarinfo entries accordingly (kinda like `tar --strip-components`, but more strict; see the sketch after this list)
- Set package dir permissions
- Don't error during error handling when files cannot removed
- No need to "enrich" spec.json with this tarball-toplevel-path
After this PR, we can in fact tarball packages relative to `/` instead of `spec.prefix/..`, which makes it possible to use Spack tarballs as container layers, where relocation is impossible, and rootfs tarballs are expected.
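A rough sketch of the "find the top-level directory and strip it while extracting" idea from item 3 above (illustrative only, not the code in this PR; `tarball` and `prefix` are placeholder arguments):
```python
import tarfile


def extract_into_prefix(tarball: str, prefix: str) -> None:
    with tarfile.open(tarball) as tar:
        members = tar.getmembers()
        # require exactly one top-level directory, e.g. "<name>-<version>-<hash>/"
        top_levels = {m.name.split("/", 1)[0] for m in members}
        if len(top_levels) != 1:
            raise ValueError("expected a single top-level directory in the tarball")
        top = top_levels.pop()
        for m in members:
            if m.name == top:
                continue
            # strip the leading "<top>/" so contents land directly in prefix
            m.name = m.name[len(top) + 1 :]
            tar.extract(m, path=prefix)
```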
`spack buildcache list` did not have a way to display the namespace of
packages in the buildcache. This PR adds that functionality.
For consistency's sake, it moves the `-N/--namespace` arg definition to
the `common/arguments.py` and modifies `find`, `solve`, `spec` to use
the common definition.
Previously, `find` was using `--namespace` (singular) to control whether
to display the namespace (it doesn't restrict the search to that
namespace). The other commands were using `--namespaces` (plural). For
backwards compatibility and for consistency with `--deps`, `--tags`,
etc, the plural `--namespaces` was chosen. The argument parser ensures
that `find --namespace` will continue to behave as before.
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
* dcmtk: checksum 3.6.7
compiles on macos
* dcmtk: support pic configuration
* [@spackbot] updating style on behalf of aumuell
* dcmtk: use define_from_variant for shorter code
* dcmtk: refine conflict
it appears that dcmtk < 3.6.7 only fails on macos/aarch64:
/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/include/xmmintrin.h:14:2:
error: "This header is only meant to be used on x86 and x64 architecture"
---------
Co-authored-by: aumuell <aumuell@users.noreply.github.com>
* py-gevent: add 23.7.0 and py-greenlet: add 3.0.0a1
* Update var/spack/repos/builtin/packages/py-gevent/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Remove version 1.3.a2
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* simple build of py-openai
* added variants to py-openai
* py-pandas-stubs is a dependency for py-openai
* fixed format and flake8 errors for py-openai
* black format error for py-pandas-stubs
* [@spackbot] updating style on behalf of sidpbury
* made style and format changes to py-openai
* made style and format changes to py-pandas-stubs
* py-types-pytz is a dependency for py-openai
* [@spackbot] updating style on behalf of sidpbury
* updated py-openpyxl for ver 3.0.7 and 3.1.2
* Update var/spack/repos/builtin/packages/py-pandas-stubs/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ajs requested changes for py-openai
* updated py-openpyxl for supported python
* [@spackbot] updating style on behalf of sidpbury
* updated py-openpyxl
* removed requirement.txt dependencies in py-openpyxl
* removed python depends on from openpyxl
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix Python package.py for Debian and derivatives to find the system Python library location
When bootstrapping from source, find_library() does not contain any paths that work for Debian and derivatives. Fixes #36666
* Update to pass styling
* Update to styling
* Update python package.py with fake config value for LIBPL
* Update python package.py libpl config_vars entry to follow double quote standard
* styling update
* Add rewrite of spack checksum to include --verify and better add versions to package.py files
* Fix formatting and remove unused import
* Update checksum unit-tests to correctly test multiple versions and add to package
* Remove references to latest in stage.py
* Update bash-completion scripts to fix unit tests failures
* Fix docs generation
* Remove unused url_dict argument from methods
* Reduce chance of redundant remote_versions work
* Add print() before tty.die() to increase error readability
* Update version regular expression to allow for multi-line versions
* Add a few unit tests to improve test coverage
* Update command completion
* Add type hints to added functions and fix a few py-lint suggestions
* Add @no_type_check to prevent mypy from failing on pkg.versions
* Add type hints to format.py and fix unit test
* Black format lib/spack/spack/package_base.py
* Attempt ignoring type errors
* Add optional dict type hint and declare versions in PackageBase
* Refactor util/format.py to allow for url_dict as an optional parameter
* Directly reference PackageBase class instead of using TypeVar
* Fix comment typo
---------
Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
* py-cupy updates: add +rocm and +all
* rocm deps are link only
* set parallelism for both +rocm and +cuda
* add missing deps; remove unnecessary deps; uncomment maintainers; get hipcc properly
* py-rdflib: add 6.3.2
* Update var/spack/repos/builtin/packages/py-rdflib/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Remove python dependency
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add recipe for py-hostlist
* Fix style
* Fix style
* Add homepage, fix version url and remove unnecessary dependency
* Fix version and remove url
* Rename package and fix git link
---------
Co-authored-by: Marc Schouler <marc.schouler@inria.fr>
* sourmash: new package @4.8.2
* sourmash: new package @4.8.2
* py-bitarray: add 2.7.6, 2.7.4
* Update var/spack/repos/builtin/packages/py-bitstring/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update setuptools dependency
* Adding missing deps
* Update var/spack/repos/builtin/packages/sourmash/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Correcting maturin dep
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/sourmash/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/sourmash/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/sourmash/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Adding dependency types
* Add `@0.14.17` as the last pre-`@1:` release
* Switch to use `python_platlib`
* Update package.py
* Update var/spack/repos/builtin/packages/py-screed/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Adding optional hip test
* Modifications to run every samples test
* Skipping test directories without a Makefile
* fix styling and cleaning code
* fix styling and changed method of iterating through sample folders
* changed to new syntax for standalone tests
* Updates for changes in syntax
Currently lm-sensors defaults to the compiler specified in the Makefile
(gcc) rather than the one specified in the spec. This patch appends the
CC flag to the make invocation with the spec compiler to fix this
behavior.
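In package terms, the idea looks roughly like this (a sketch, not the exact diff; `spack_cc` is the compiler wrapper Spack injects into the build environment):
```python
from spack.package import *


class LmSensors(MakefilePackage):
    # ... existing directives omitted ...

    def build(self, spec, prefix):
        # override the Makefile's hardcoded gcc with the compiler set up by Spack
        make("CC={0}".format(spack_cc))
```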
* initial commit for rocm-5.5.0 release
* fix the hipsparse build error for 5.5.0
* fix build error for amrex .add hiprand as a dependency
* modify the patch for rocprofiler-dev
* add hiprand for +rocm build
* initial commit for rocm-5.5.1 release
* bump up the version for rocm-5.5.1 release.
* bump up the version for rocmlir.miopen to use this backend only till 5.5
* add new recipe py-barectf and add it as dependency for rocprofiler-dev
* revert the changes for rocprofiler-dev for 5.5.0/1 for now, as it depends
on hsa-amdaqlprofile.so, which is closed source and for which no spack recipe is
available for now.
* add rocm-core as dependency for rocm packages from 5.5.0 onwards
* avoid download of the gtest for building unit tests
* feature (packages): implement `odgi` package
This commit re-implements odgi package (superseded by #38741)
* fix (packages): remove redundant `requires()` from odgi package
This commit removes redundant use of `requires()` for gcc version in the `odgi` package
libelf fails to build with clang16+ due to Wimplicit-int and
Wimplicit-function-declarations becoming errors by default. This breaks
the configuration stage, so no build takes place. This patch fixes this
by passing -Wno-error=implicit-int and
-Wno-error=implicit-function-declarations as cflags.
gcoff uses the register keyword in a couple different places which
causes errors when building with c++17, which is the default in clang
16. This patch adds the -Wno-register flag to ignore these errors
when building with clang 16.
Clang 16's change to erroring out by default on implicit function
declarations and implicit integers causes the build script for unzip to
break. Since this project hasn't had a release since 2010, we need to
patch it downstream/pass additional flags to get the build to succeed.
With the release of clang 16, clang now treats implicit function
declarations and implicit integers as errors rather than warnings,
causing the build to fail. This patch adds flags to prevent build
failures.
The sanitization function is completely bogus as it tries to replace /
on unix after ... splitting on it. The way it's implemented is very
questionable: the input is a file name, not a path. It doesn't make
sense to interpret the input as a path and then make the components
valid -- you'll interpret / in a filename as a dir separator.
It also fails to deal with path components that contain just unsupported
characters (resulting in an empty component).
The correct way to deal with this is to have a function that takes a
potential file name and replaces unsupported characters.
I'm not going to fix the other issues on Windows, such as reserved file
names, but left a note, and hope that @johnwparent can fix that
separately.
(Obviously we wouldn't have this problem at all if we just fixed the
filename in a safe way instead of trying to derive something from
the url; we could use the content digest when available for example)
* py-jaxlib: add conflict for missing cuda cuda_arch specification
* Update var/spack/repos/builtin/packages/py-jaxlib/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-jaxlib: conflict missing cuda_arch value when with cuda
* Update var/spack/repos/builtin/packages/py-jaxlib/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* commands: provide more information to Command
* fish: Add script to generate fish completion
* fish: auto prepend `spack` command to avoid duplication
* fish: improve completion generation code readability
* commands: replace match-case with if-else
* fish: fix optspec variable name prefix
* fish: fix return value in get_optspecs
* fish: fix return value in get_optspecs
* format: split long line and trim trailing space
* bugfix: replace f-string with interpolation
* fish: complete more specs and some fixes
* fish: complete hash spec starts with /
* fish: improve compatibility
* style: trim trailing whitespace
* commands: add fish to update args and update tests
* commands: add fish completion file
* style: merge imports
* fish: source completion in setup-env
* fish: caret only completes dependencies
* fish: make sure we always get same order of output
* fish: spack activate
only show installed packages that have extensions
* fish: update completion file
* fish: make dict keys sorted
* Blacken code
* Fix bad merge
* Undo style changes to setup-env.fish
* Fix unit tests
* Style fix
* Compatible with fish_indent
* Use list for stability of order
* Sort one more place
* Sort more things
* Sorting unneeded
* Unsort
* Print difference
* Style fix
* Help messages need quotes
* Arguments to -a must be quoted
* Update types
* Update types
* Update types
* Add type hints
* Change order of positionals
* Always expand help
* Remove shared base class
* Fix type hints
* Remove platform-specific choices
* First line of help only
* Remove unused maps
* Remove suppress
* Remove debugging comments
* Better quoting
* Fish completions have no double dash
* Remove test for deleted class
* Fix grammar in header file
* Use single quotes in most places
* Better support for remainder nargs
* No magic strings
* * and + can also complete multiple
* lower case, no period
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* tinygltf: new versions and release branch
for each minor release available, the newest patch release has been added
---------
Co-authored-by: aumuell <aumuell@users.noreply.github.com>
- drop use_xcode = True, as this would lead to an attempt to install Xcode (#34064)
- don't automatically build Qt Location with +opengl, as this is
still broken
This built successfully with qt@5.15.10+opengl+dbus+phonon on ventura/arm without
Xcode installed (only command line tools) - I did not check with Xcode installed.
The user store is lazily evaluated. The change
in #38975 made it such that the first evaluation
was happening in the middle of swapping to user
configuration.
Ensure we construct the user store before that.
Use curly braces instead of quotes to enclose value or text in Tcl
modulefile. Within curly braces Tcl special characters like [, ] or $
are treated verbatim whereas they are evaluated within quotes.
Curly braces are Tcl's recommended way to enclose verbatim content [1].
Note: if curly brace characters are used within content, they must be
balanced. This point has been checked against the current repository and no
unbalanced curly braces have been spotted.
Fixes #24243
[1] https://wiki.tcl-lang.org/page/Tcl+Minimal+Escaping+Style
* Fetching patches wouldn't result in acquiring a stage lock during install
* The installer would acquire a stage lock *after* fetching instead of
before, leading to races
* The name of the stage for patches was random, so on build failure
(where stage dirs are not removed), these directories would continue
to exist after a second successful install.
* There was this redundant "composite fetch" object -- there's already
a composite stage. Remove this.
* For some reason we do *double* shasum validation of patches, before
and after compression -- that's just too much? I removed it.
Spack heuristically adds `<install prefix>/lib` and `<install prefix>/lib64` as rpath entries, as it doesn't know what the install dir is going to be ahead of the build. This PR cleans up non-existing, absolute paths[^1], which
1. avoids redundant stat calls at runtime
2. drops redundant rpaths in `patchelf`, making it relocatable -- you don't need patchelf recursively then.
[^1]: It also removes relative paths not starting with `$` (so, `$ORIGIN/../lib` is retained -- we _could_ interpolate `$ORIGIN`, but that's hard to get right when symlinks have to be taken into account). Relative paths _are_ supported in glibc, but are relative to _the current working directory_, which is madness, and it would be better to drop those paths.
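The filtering rule itself can be summed up in a small predicate (an illustration, not the actual implementation):
```python
import os


def keep_rpath(entry: str) -> bool:
    # keep "$ORIGIN"-style entries untouched; keep absolute entries only if
    # they exist; drop everything else (missing dirs, relative paths without "$")
    if entry.startswith("$"):
        return True
    return os.path.isabs(entry) and os.path.isdir(entry)
```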
A LazyReference object is a reference to an attribute of a
lazily evaluated singleton. Its only purpose is to let developers
use shorter names to refer to such attribute.
This class does more harm than good, as it obfuscates the fact
that we are using the attribute of a global object. Also, it can easily
go out of sync with the singleton it refers to if, for instance, the
singleton is updated but the references are not.
This commit removes the LazyReference class entirely, and accesses
the attributes explicitly, going through the global object to which
they are attached.
Without the package name being present in the conflict messages, it is
significantly more difficult to debug concretization failures in
environments that contain many packages.
mesa-glu still has a couple instances of the register keyword which
causes build failures with clang on my platform. This patch removes the
register keyword which doesn't have any impact on correctness.
gperf still uses the register keyword in one place which makes
compilation fail with c++17. This patch adds in a patch file to remove
the usage of the register keyword so that it compiles properly.
In late 2021 elfutils was patched to make it build with clang, and these
patches ended up in version 0.186. This commit updates the conflicts to
specify this so elfutils can be built with clang.
These tests now work without any changes to core. Furthermore, it is
surprising that they had to be disabled (at least, as long as the
installer.py tests are run on Windows: these tests are more basic
and their functionality would have been exercised automatically).
* py-poetry-core: add 1.6.1 and fix url
* Update var/spack/repos/builtin/packages/py-poetry-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Re-add python upper bound for older versions
* Update var/spack/repos/builtin/packages/py-poetry-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
In Clang 15, -Wint-conversion became an error instead of a warning,
breaking the fftw build for clang versions >= 15. This patch fixes fftw
builds with clang 15+ by passing -Wno-error=int-conversion as a cflag.
Without --allow-root, spack cannot push binaries that contain install paths.
This flag is almost always needed, so there is no point in
requiring users to spell it out.
Even without --allow-root, rpaths would still have to be patched, so the
flag is not there to guarantee binaries are not modified on install.
This commit makes --allow-root the default, and drops the code
required for it. It also deprecates `spack buildcache preview`, since
the command made sense only with --allow-root.
As a side effect, Spack no longer depends on binutils for relocation
Add support for conflict directives in Lua modulefile like done for Tcl
modulefile.
Note that conflicts are correctly honored on Lmod and Environment
Modules <4.2 only if mutually expressed on both modulefiles that
conflict with each other.
Migrate conflict code from Tcl-specific classes to the common part. Add
tests for Lmod and split the conflict test case in two.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* When using system tools to unpack a .gz file, the input file needs a
different name than the output file. Normally, we generate this new
name by stripping off the .gz extension off of the file name.
This was not sufficient if the file name did not have an extension,
so we temporarily rename the file in that case.
* When using system tar utility to untar on Windows, we were (erroneously)
skipping the actual untar step if the filename was lacking a .tar
extension
* For foo.txz, we were not changing the extension of the decompressed file
(i.e. we would decompress foo.txz to foo.txz). This did not cause any
problems, but is confusing, so has been updated such that the output
filename reflects its decompressed state (i.e. foo.tar).
* Added test for strip_compression_extension
* Update test_native_unpacking to test each archive type with and without
an extension as part of the file name (i.e. we test "foo.tar.gz", but
also make sure we decompress properly if it is named "foo").
* py-pyside: fix build for version 1.2.2
* Remove check for python version
* Fix style
* Remove unnecessary patch
* Update var/spack/repos/builtin/packages/py-pyside/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pyside/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Remove py-markupsafe conflict
* Update var/spack/repos/builtin/packages/py-pyside/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pyside/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Move python check removal below subprocess patch
* Remove preference of 1.2.2
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* soapdenovo2: strip optimization flags from injected flags
* soapdenovo2: add maintainer
* soapdenovo2: only append on cflags
* soapdenovo2: clean up some wording and implementation
Running `spack test run <python package>` resulted in the error
```
'str' object is not callable
```
because the python executable was not set correctly.
* qt-base: always link to GSS framework on macOS
On macos, the code in src/network/kernel/qauthenticator.cpp
unconditionally includes the header from the GSS framework, so we should
link against it.
This applies two patches from the dev branch. They are to be cherry-picked
into the 6.5 (probably released with 6.5.2) and 6.6 branches, but they
apply against 6.3.2 as well.
* qt-base: disable libproxy on macOS
src/network/CMakeLists.txt disables it on MACOS anyway. And as it is not
found without pkg-config, building with +network would break because of
the feature being explicitly enabled.
* qt-base: don't depend on pkgconfig on macOS
On macOS, usage of pkg-config is disabled by unsetting
PKG_CONFIG_EXECUTABLE, unless the feature pkg-config is requested explicitly.
* qt-base: don't depend on at-spi2-core on macOS
Does not build on macOS and seems to be targeted at linux. Qt6 on
homebrew does not depend on it, either.
* qt-base: fix long lines
* qt-base: restrict use of pkgconfig to linux
yes, probably not needed on windows, either
Co-authored-by: Alec Scott <alec@bcs.sh>
* qt-base: disable libproxy on Windows as well
according to src/network/CMakeLists.txt it's only used on Unix
* qt-base: improvements based on reviewer suggestions
---------
Co-authored-by: Alec Scott <alec@bcs.sh>
* legion: Add 23.06.0, variants for UCX, max nodes, update CUDA version.
* legion: Make newer CUDA versions dependent on newer Legion.
* legion: Update CUDA arch list so that we can stop tracking manually.
* Added py-eprosima-fastdds package
* Fixed python extension and dependency version
* Added build type for swig
* Added minimum cmake support
* Added py-test dependency
* Added suggestion on python extension
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Added suggestion on build type for cmake
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Lock objects can now be instantiated independently,
without being tied to the global configuration. The
same is true for database and store objects.
The database __init__ method has been simplified to
take a single lock configuration object. Some common
lock configurations (e.g. NO_LOCK or NO_TIMEOUT) have
been named and are provided as globals.
The use_store context manager keeps the configuration
consistent by pushing and popping an internal scope.
It can also be tuned by passing extra data to set up
e.g. upstreams or anything else that might be related
to the store.
Not having the package name in the conflict messages can make debugging
conflicts exceedingly hard when trying to concretize an environment with
a sufficient number of packages. This patch adds the package name to all
of the conflict messages so that it is easy to tell just from the
message which package is causing conflicts.
* py-jinja2: add conflict for py-markupsafe@2.0.2
* Update var/spack/repos/builtin/packages/py-jinja2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-shiboken: fix build by restricting dependencies
* Update var/spack/repos/builtin/packages/py-shiboken/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Remove conflict
* Remove py-markupsafe conflict
* Update var/spack/repos/builtin/packages/py-shiboken/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add a phist patch to avoid trying to compile SSE code if that is not available.
* phist: make the avoid-sse patch more robust, because the compiler on an ARM system still tried to compile SSE code
* Bugfix: spack.yaml concretizer:unify needs to be read and used
* Optional: add environment test to ensure configuration scheme is used
* Activate environment in unit tests
A more proper solution would be to keep
an environment instance configuration as
an attribute, but that is a bigger refactor
* Delay evaluation of Environment.unify
* Slightly simplify unit tests
---------
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* Update TensorFlow ecosystem
* Re-add +cpp
* Do not use system protobuf
* Let bazel auto-detect macOS SDK version
* Unnecessary duplicated dep
* Remove unused import
Allow the following formats:
```yaml
mirrors:
name: <url>
```
```yaml
mirrors:
name:
url: s3://xyz
access_pair: [x, y]
```
```yaml
mirrors:
name:
fetch: http://xyz
push:
url: s3://xyz
access_pair: [x, y]
```
And reserve two new properties to indicate the mirror type (e.g.
mirror.spack.io is a source mirror, not a binary cache)
```yaml
mirrors:
spack-public:
source: true
binary: false
url: https://mirror.spack.io
```
A few packages have version directives evaluated
within if statements, conditional on the value of
`platform.platform()`.
Sometimes there are no cases for e.g. platform=darwin and that
causes a lot of spurious failures with version existence
audits.
This PR allows expressing conditions to skip version
existence checks in audits and avoid these spurious reports.
### Rationale
While working on #29549, I noticed a lot of inconsistencies in our argparse help messages. This is important for fish where these help messages end up as descriptions in the tab completion menu. See https://github.com/spack/spack/pull/29549#issuecomment-1627596477 for some examples of longer or more stylized help messages.
### Implementation
This PR makes the following changes:
- [x] help messages start with a lowercase letter.
- [x] Help messages do not end with a period
- [x] the first line of a help message is short and simple
longer text is separated by an empty line
- [x] "help messages do not use triple quotes"
"""(except docstrings)"""
- [x] Parentheses not needed for string concatenation inside function call
- [x] Remove "..." "..." string concatenation leftover from black reformatting
- [x] Remove Sphinx argument docs from help messages
The first 2 choices aren't very controversial, and are designed to match the syntax of the `--help` flag automatically added by argparse. The 3rd choice is more up for debate, and is designed to match our package/module docstrings. The 4th choice is designed to avoid excessive newline characters and indentation. We may actually want to go even further and disallow docstrings altogether.
### Alternatives
Choice 3 in particular has a lot of alternatives. My goal is solely to ensure that fish tab completion looks reasonable. Alternatives include:
1. Get rid of long help messages, only allow short simple messages
2. Move longer help messages to epilog
3. Separate by 2 newline characters instead of 1
4. Separate by period instead of newline. First sentence goes into tab completion description
The number of commands with long help text is actually rather small, and is mostly relegated to `spack ci` and `spack buildcache`. So 1 isn't actually as ridiculous as it sounds.
Let me know if there are any other standardizations or alternatives you would like to suggest.
* ci: run spack list in power ci
Let's see if Spack itself is the bottleneck in CI...
* rebuild curl in CI
* more of the same please!
* drop the profiler
* undo rebuildme test in ci variant
* add comment for posterity
* enable profiling
* trigger CI
* See how it goes now that perf regressions are fixed on develop
* try shorter poll intervals
* Revert "try shorter poll intervals"
This reverts commit d60c34ad3eceead0c13a5277cf8e783fd42b7458.
* Remove spec.format call in Database._get_matching_spec_key
* once more in ci please
* undo irrelevant changes
* run spack list in before script
* test in ci
* -:
* Undo CI testing
The spdlog project precisely states which fmt version should
be used for compatibility. The latest version, 1.11.0, depends explicitly on
fmt 9.1.0. Without a fixed version, the micromamba build fails when using spack
install micromamba on e.g. Rocky Linux 8.5.
The 'bison' executable requires libtextstyle to run. I think this was
usually satisfied because gettext is often installed with the OS, or
brought in accidentally via perl/m4.
Looks like the libtextstyle library dependency started in Bison 3.4
Refactor `TermTitle` into `InstallStatus` and use it to show progress
information both in the terminal title as well as inline. This also
turns on the terminal title status by default.
The inline output will look like the following after this change:
```
==> Installing m4-1.4.19-w2fxrpuz64zdq63woprqfxxzc3tzu7p3 [4/4]
```
* llvm: fix build with libcxx=none
* ispc: checksum 1.20.0
* ispc: ensure that it does not crash immediately
this would happen if linked to the wrong libc++
* ispc: fix build on macos
find ncurses instead of curses and link against tinfo in order to avoid
unresolved references to _del_curterm, _set_curterm, _setupterm, and
_tigetnum
* ispc: enable arm targets, if building on arm
* ispc: remove double cmake argument
I forgot to remove the constant -DARM_ENABLED=FALSE when adding
-DARM_ENABLED with a value depending on target architecture
* ispc: fix linux build
since 1.20, linux build uses TBB as default tasking system and thus
needs to depend on it
* ispc: try to fix link error on linux
link against both curses (as before) and tinfo (added because of macos)
* ispc: update for recent llvm changes
libcxx=none instead of ~libcxx
`mypy` will check *all* imported packages, even optional dependencies outside your
project, and this can cause issues if you are targeting python versions *older* than the
one you're running in. `mypy` will report issues in the latest versions of dependencies
as errors even if installing on some older python would have installed an older version
of the dependency.
We saw this problem before with `numpy` in #34732. We've started seeing it with IPython
in #38704. This fixes the issue by exempting `IPython` and a number of other imports of
Spack's from `mypy` checking.
* Setting library path as lib similar to other rocm packages.
* Fix style check failure
* Restricting changes to 5.4.3 and above
* Including comgr change
* initial commit for adding hip-examples package
* adding test to hip-examples
* fixed compile error on add4
* change standalone test to use new syntax
Spack-installed Perl always has opcode support, but external Perl
installations might not. This commit adds a +opcode variant and
updates the external detection logic to check for opcode support.
The postgresql package is updated to require perl+opcode (in
combination with the above, this helps detect when an external
Perl instance is sufficient for a Spack build of postgresql, or
if Spack needs to build its own Perl).
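A hypothetical sketch of what such a detection check could look like (not the exact code added here):
```python
from spack.package import *


class Perl(Package):
    # ... existing recipe omitted ...

    variant("opcode", default=True, description="Include support for the Opcode module")

    @classmethod
    def determine_variants(cls, exes, version_str):
        variants = []
        for exe in exes:
            perl = Executable(exe)
            # an external perl gets +opcode only if "use Opcode" succeeds
            perl("-e", "use Opcode", output=str, error=str, fail_on_error=False)
            variants.append("+opcode" if perl.returncode == 0 else "~opcode")
        return variants
```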
* Update cp2k recipe to use cmake or the current build system
Offers the possibility to build cp2k with the new cmake build system. Commands like
`spack install cp2k@master build_system=cmake +.....` are now supported.
The recipe supports the following optional functionalities:
- superlu, cosma, sirius, spglib, metis, libxc, libint, cuda/rocm, mkl/openblas/sci (and others), mpi, openmp, dbcsr
- dbcsr is built separately using the currently available recipe.
Two PRs need to be merged to be fully functional (cosma update in spack + one PR in cp2k github).
* Fix indentation
* Fix indentation
* Update libvori
* More typos
* Simplify BLAS/LAPACK
* Simplify BLAS/LAPACK
* Add A100 gpu value
* Fix typo
* Add the enable_regtests option
if -DCP2K_ENABLE_REGTESTS=ON (+enable_regtests with spack) then the location of the binary executables will be in the cp2k root directory under exe/build-cmake-*. This option is needed to run the regtests afterwards.
* Minor update
* more fixes
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* small changes
* Remove any reference to nvidia architecture in the rocm list
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Final reformating
* Update py-fypp
---------
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
People frequently ask us how to pipe `spack find` output to other commands, and we tell
them to do things like this:
```console
$ spack find --format "/{hash}" | spack uninstall -ay
```
Sometimes users don't know about hash references and come up with potentially ambiguous
formulations like this:
```console
spack find --format {name}@{version}%{compiler} | spack uninstall -ay
```
Since this is a common enough thing to want to do, and to make it more obvious how, this
PR adds a `-H` / `--hashes` as a shortcut, so you can now just do:
```console
spack find -H | spack uninstall -ay
```
* Added packages bitstruct, callmonitor, and PYnvtx
* Revert "Added packages bitstruct, callmonitor, and PYnvtx"
This reverts commit 76d25aa76b.
* py-bitstruct: This module is intended to have a similar interface to the Python struct module, but working on bits instead of primitive data types (char, int, …)
* Update package.py
To pass the style prechecks
* PyNVTX: new package
* Delete package.py
Accidentally added this package.
* Update var/spack/repos/builtin/packages/py-bitstruct/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* openssl: prefer 3.x
This PR is not intended to be merged immediately, but it would be good
to see what packages fail to build in CI so that we can get proper
version constraints on openssl (before all packages update and support
both openssl 1 and 3)
* Disable assembly for 3.x %oneapi
* cmake: depend on spack curl, to deal with curl - openssl compat
* also make zlib external
* remove overly strict & unsafe requirement on py-cryptography patch version number
* update openssl compat bounds in py-cryptography
* smaller diff
* Make libssh2 an autotools/cmake package
* fix weird upper bound in libssh2, as there is no openssl v2
* libssh2: pc file lists plain -lssl -lcrypto w/o leading -L flag, confusing libgit2 parsing of pkg-config output
* Actually fix the issue in libssh2: its pc file looks broken
`"%s" % spec` formats the spec with deps included, which produces sometimes KBs
of data and is slow to run in pure Python. It can delay otherwise very short-lived
read/write locks on the database.
Discovered in #38762 where profile output showed about 2 seconds is spent in
`spec.format`, which is significant overhead when using multiprocessing to install
from binary cache in parallel (installation often takes <5s for small packages). With
this change, `spec.format` no longer shows up in profile output.
(This line hasn't changed since Spack v0.9 ;p)
* move format() call to custom NoSuchSpecError exception
* add a comment saying why, so we can eventually change `Spec.__str__`
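A rough sketch of the idea behind that change (class and method bodies here are illustrative, not the exact Spack code): keep the spec on the exception and only format it when the message is actually rendered.
```python
# Sketch: defer expensive spec formatting to the error path.
class NoSuchSpecError(KeyError):
    def __init__(self, spec):
        super().__init__()
        self.spec = spec

    def __str__(self):
        # Formatting with all dependencies is expensive; do it lazily here
        # instead of on the hot, lock-holding database code path.
        return f"No such spec in database: {self.spec}"
```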
* qt-base: new version 6.5.0
* qt-declarative: new version 6.5.0
* qt-quick3d: new version 6.5.0
* qt-quicktimeline: new version 6.5.0
* qt-shadertools: new version 6.5.0
* qt-*: new version 6.5.1
* qt-base: new version 6.5.1
* py-pyarrow: enable parquet variant by default
* Disable parquet variant by default
* Add conflict to enable parquet when dataset is active
* Disable dataset variant by default
* initial commit of nanobind package
* style fixes
* Update package.py
Typo
* addressed PR comments
* add v1.4.0
* Update var/spack/repos/builtin/packages/py-nanobind/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Matthew Archer <ma595@cam.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-astropy: fix import tests and restrict py-pip version
* Fix --install-option name in comments
* Rename variant and fix variant dependencies
* Remove parquet variant from py-pyarrow
1. Fix O(n^2) iteration in `_get_overwrite_specs`
2. Early exit `get_by_hash` on full hash
3. Fix O(n^2) double lookup in `all_matching_specs` with hashes
4. Fix some legibility issues
* mlpack: new package
mlpack is an intuitive, fast, and flexible header-only C++ machine learning library with bindings to other languages. It is meant to be a machine learning analog to LAPACK, and aims to implement a wide array of machine learning methods and functions as a "swiss army knife" for machine learning researchers.
* mlpack: upstream merged patch to allow python installation in spack
* Added v5.0.0 of PyAMG. This required v7.1.0 of setuptools_scm due to a bug in 7.0.5.
* Added comment about version requirement.
* Loosened dependency based on build experiments.
* Updated tomli deps.
* Update var/spack/repos/builtin/packages/py-setuptools-scm/package.py
Dependence for 7.0 only.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pyamg/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Swapped lines.
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pip: add 23.1.2
* Restrict py-pip version for py-protobuf
* Restrict py-pip version for straightforward packages
* Restrict py-pip version for nrm
* Fix --install-option name in comments
* Simplify py-pip restriction for py-scs
* nrm: fix wrong comment
* py-spglib: add 2.0.2
* Update var/spack/repos/builtin/packages/py-spglib/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Remove py-setuptools as run dependency
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add maintainers
* Updated cosma archive checksum and costa version
- updated cosma version (in the cosma build system)
- updated costa version
- use the default generic url for downloading packages
- do not build tiled-mm when the cpu only version is needed
Signed-off-by: Dr. Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
NAMD users expect the Tcl scripting interface to be enabled as it is used in many examples and tutorials in addition to being required for features such as multi-copy algorithms.
* When installing a package Spack will attempt to set group permissions on
the install prefix even when the configuration does not specify a group.
Co-authored-by: David Gomez <dvdgomez@users.noreply.github.com>
From the configure.ac file:
> H5VL_log is built on top of MPI. Configure option --without-mpi or
> --with-mpi=no should not be used. Abort.
This currently fails to build in the oneAPI pipeline on `develop`
* py-userpath: new package
* pipx: new package
* Update var/spack/repos/builtin/packages/pipx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* pipx: Remove incorrect dependency on py-platformdirs
* Update var/spack/repos/builtin/packages/pipx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-userpath: Remove version requirements to match upstream
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This PR removes deprecated versions for all packages that I'm maintaining. In future Spack releases, I'm planning to do this on a much larger scale, but we can hold off until we have better reproducibility.
I'm hoping that this will improve the maintainability of these packages. If any other maintainers of these recipes would like to retain any of these deprecated versions, or add new versions, speak now or forever hold your peace 😄
---------
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
Windows runtime library loading searches PATH, and therefore bin/ is
the appropriate place to put .dll files. Prior to this change, XZ was
installing both .dll and .lib files to the lib/ directory.
Move the logic checking which mirrors have the specs we need closer
to where that information is needed. Also update the staging summary
to contain a brief description of why we scheduled or pruned each
job. If a spec was found on any mirrors, regardless of whether
we scheduled a job for it, print those mirrors.
* Change maintainer, add new version and deprecate old one
* Fix style issue
* Revert deprecation
---------
Co-authored-by: Marc Schouler <marc.schouler@inria.fr>
PowerShell requires explicit shell and env support in Spack.
This is due to the distinct differences in shell interactions between
cmd and pwsh. Add a doskey in pwsh piping 'spack' commands to a
powershell script similar to the sh function 'spack'. Add
support for PowerShell-specific shell interactions from Spack
(set/unset shell variables).
* py-ruamel-yaml: add 0.17.32 and py-ruamel-yaml-clib: add 0.2.7
* Update var/spack/repos/builtin/packages/py-ruamel-yaml/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix style
* Fix python dependency
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* HDF5: is_enabled helper (ON)
Slightly generalize the `is_enabled` helper in the HDF5 package.
`ON` is the most common CMake boolean value passed, among many other
possible `true` spellings, and should be recognized by the check
(see the sketch after this list).
* Simplify
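A minimal sketch of such a generalized helper (the accepted spellings below are an assumption, not the package's exact list):
```python
# Sketch of a generalized helper; the accepted spellings are an assumption.
def is_enabled(text):
    """Return True if a CMake-style boolean value means 'enabled'."""
    return str(text).strip().upper() in ("ON", "1", "YES", "TRUE", "Y")
```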
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Support hardlinks/junctions on Windows systems without developer
mode enabled
* Generally, use of llnl.util.symlink.symlink is preferred over
os.symlink since it handles this automatically
* Generally an error is now reported if a user attempts to create a
symlink to a file that does not exist (this was previously allowed
on Linux/Mac).
* One exception to this: when Spack installs files from the source
into their final prefix, dangling symlinks are allowed (on
Linux/Mac - Windows does not allow this in any circumstance).
The intent behind this is to avoid generating failures for
installations on Linux/Mac that were succeeding before.
* Because Windows is strict about forbidding dangling symlinks,
`traverse_tree` has been updated to skip creating symlinks if they
would point to a file that is ignored. This check is not
transitive (i.e., a symlink to a symlink to an ignored file would
not be caught appropriately)
* Relocate function: resolve_link_target_relative_to_the_link
(this is not otherwise modified)
Co-authored-by: jamessmillie <smillie@txcorp.com>
Update the list of excluded variables in the `from_sourcing_file` function to
cover all variables specific to Environment Modules or Lmod. Specifically, add
the variables related to the definition of the `module()`, `ml()` and
`_module_raw()` Bash functions.
Fixes #13504
* Add new versions of Qthreads
* Add version URLs explicitly as the URL scheme has recently changed
* Use function to extrapolate version URL for older versions
* Fix url formatter
llvm @13-15 is required for ispc, but fails to build with GCC 13.
14.0.6 and 15.0.7 build successfully with the upstream patch; 13.0.1
still fails. Thus the upstream patch is applied to 14 and 15 only.
Update the `env.set` command and underlying `SetEnv` object to add a `raw`
boolean attribute. `raw` is optional and set to False by default. When
set to True, value formatting is skipped for the object when generating
environment modifications.
With this change it is now possible to define environment variables
whose value contains variable reference syntax (like `{foo}` or `{}`)
that should be set as-is.
Fixes #29578
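A hedged sketch of how a package recipe might use this, assuming the `raw=True` keyword form described above:
```python
from spack.package import *


class MyPackage(Package):
    """Hypothetical package using the new raw flag described above."""

    def setup_run_environment(self, env):
        # With raw=True, "{placeholder}" is kept verbatim instead of being
        # treated as a format-style reference.
        env.set("MY_TEMPLATE", "prefix-{placeholder}-suffix", raw=True)
```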
By default, `find_package(Python)` searches from highest version to lowest version, identifying the highest version that satisfies the requirements. This means that `/usr/bin/python3.11` will be found before `$(spack location -i python)/bin/python3.10`, even when other packages have been built with the `python` in spack.
This ensures that the `python` dependency is explicitly the `python` version that is used.
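In recipe terms this roughly amounts to passing the concretized python to CMake explicitly; a sketch (package name is illustrative):
```python
from spack.package import *


class SomeCMakePackage(CMakePackage):
    """Hypothetical dependent passing Spack's python to CMake explicitly."""

    depends_on("python", type=("build", "run"))

    def cmake_args(self):
        # Point find_package(Python) at the concretized python, so a newer
        # /usr/bin/python3.x cannot win on version alone.
        return [self.define("Python_EXECUTABLE", self.spec["python"].command.path)]
```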
* libEnsemble: add v0.10.0
* Make new deps required
* Fixes to deps
* Update var/spack/repos/builtin/packages/py-libensemble/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix build, run
* Reorder required deps
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pytest: add 7.3.2
* [@spackbot] updating style on behalf of manuelakuhn
* Swap py-importlib-metadata dependency order
* Restrict python version for older versions
* Add recipe for iterative-stats
* Fix branch name and remove comment
* Add git link
* Add package maintainer
* Enforce multiple requested changes
* Update var/spack/repos/builtin/packages/py-iterative-stats/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update checksum
* Fix openturns dependency specification
* Add python variant spec to openturns
---------
Co-authored-by: Marc Schouler <marc.schouler@inria.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add rust v1.70.0 and simplify package logic by moving bootstrap to dedicated package
* Fix formatting of rust-bootstrap package file
* Re-enable Rust as extendable
* Add nightly version to rust and rust-bootstrap
* Manually inject openssl certs into environment
* Add master and beta versions to rust
* Add additional documentation for using rust development releases
* Remove @AndrewGaspar as maintainer
* py-notebook: add 6.5.4
* [@spackbot] updating style on behalf of manuelakuhn
* Update var/spack/repos/builtin/packages/py-notebook/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix version of py-nbclassic dependency
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
It installs the LFortran runtime library, and LFortran can compile code to
binaries. The interactive mode does not yet work with LLVM > 11; that has to
be fixed upstream.
Co-authored-by: Wileam Y. Phan <50928756+wyphan@users.noreply.github.com>
No changes to the build system, no changes to `package.py` needed.
Changelog: https://github.com/qt/qtbase/compare/v5.15.9-lts-lgpl...v5.15.10-lts-lgpl
Main change taking up space:
- bundled 3rdparty/pcre2 updated from 10.39 to 10.40 (spack now includes 10.42, and we don't put specific version requirements in `package.py`)
* py-networkx: add 3.1
* Update var/spack/repos/builtin/packages/py-networkx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add default variant
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pycairo: add 1.24.0
* Change python dependency to 3.8
* Remove upper bound for python dependency
* Update var/spack/repos/builtin/packages/py-pycairo/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add darshan 3.4.3 releases
* darshan-runtime 3.4.3
* darshan-util 3.4.3
* py-darshan 3.4.3.0
- add py-humanize as new dependency
* py-darshan has strict darshan-util version reqs
The required darshan-util version is based on the first 3 parts of
the py-darshan version string (see the sketch after this list)
* remove support for python3.6
* py-humanize dependency for 3.4.3+ versions
* only enforce scipy dependency for 3.4.0.1
* drop optional lxml dependency
* drop matplotlib pinning
* importlib-resources not a dep in python-3.7+
* drop unnecessary numpy pin
* add build dep for pytest-runner
* fix typo in pytest-runner package name
* pip setuptools to match pydarshan setup.py
* spack style fix
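As referenced above, a sketch of the version-pinning pattern (the version listed and the checksum are placeholders):
```python
from spack.package import *


class PyDarshan(PythonPackage):
    version("3.4.3.0", sha256="<placeholder>")

    # darshan-util must match the first three parts of the py-darshan version.
    for pyd in ["3.4.3.0"]:
        depends_on(
            "darshan-util@" + ".".join(pyd.split(".")[:3]),
            when="@" + pyd,
            type=("build", "run"),
        )
```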
* py-gsutil: add 5.24, fix and add dependencies
* Update var/spack/repos/builtin/packages/py-httplib2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add httplib2@0.20.4 and pin it in py-gsutil
* Add py-cryptography conflict
* Update var/spack/repos/builtin/packages/py-httplib2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pyopenssl: fix py-cryptography conflict
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* update mda and mdatests
* black
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* polish
* Update var/spack/repos/builtin/packages/py-mdanalysistests/package.py
* fixes
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
`FFLAGS` and `FCFLAGS` are ignored by the WRF build system, not only in version
`3.9.1.1` but also in `4.x`.
Also, I see no reason to explicitly add `-w` and `-O2` to compile lines when
using `gcc@10:`. Tested for versions `3.9.1.1`, `4.2.2`, and `4.5.0`.
Tagging original authors of this part @MichaelLaufer and @giordano in case they
want to chime in.
* ncbi-rmblastn: patching to support building with %gcc@13:
* ncbi-rmblastn: patching to build with %gcc@13:
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
* py-pre-commit: add 3.3.3
* Update var/spack/repos/builtin/packages/py-pre-commit/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Spack flags supplied by users should supersede flags from package build systems and
other places in Spack. However, Spack currently adds user-supplied flags to the
beginning of the compile line, which means that in some cases build system flags will
supersede user-supplied ones.
The right place to add a flag to ensure it has highest precedence for the compiler really
depends on the type of flag. For example, search paths like `-L` and `-I` are examined
in order, so adding them first is highest precedence. Compilers take the *last* occurrence
of optimization flags like `-O2`, so those should be placed *after* other such flags. Shim
libraries with `-l` should go *before* other libraries on the command line, so we want
user-supplied libs to go first, etc.
`lib/spack/env/cc` already knows how to split arguments into categories like `libs_list`,
`rpath_dirs_list`, etc., so we can leverage that functionality to merge user flags into
the arg list correctly.
The general rules for injected flags are:
1. All `-L`, `-I`, `-isystem`, `-l`, and `*-rpath` flags from `spack_flags_*` appear
before their regular counterparts.
2. All other injected flags are ordered after their regular counterparts,
i.e. `other_flags` before `spack_flags_other_flags`
- [x] Generalize argument categorization into its own function in the `cc` shell script
- [x] Apply the same splitting logic to injected flags and flags from the original compile line.
- [x] Use the resulting flag lists to merge user- and build-system-supplied flags by category.
- [x] Add tests.
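For illustration only, the ordering rules could be expressed roughly as follows in Python; the category names are assumptions, and the real logic lives in the `cc` shell script:
```python
# Python illustration of the ordering rules above (category names are
# assumptions); the real implementation lives in lib/spack/env/cc.
def merge_flags(spack_flags, regular_flags):
    """Merge per-category argument lists from the wrapper."""
    goes_first = {"include_dirs", "lib_dirs", "isystem_dirs", "libs", "rpaths"}
    merged = {}
    for category, regular in regular_flags.items():
        injected = spack_flags.get(category, [])
        if category in goes_first:
            # -I/-L/-isystem/-l/-rpath: Spack-injected flags come first.
            merged[category] = injected + regular
        else:
            # Everything else: injected flags come last, so they win for
            # "last one wins" options such as -O2.
            merged[category] = regular + injected
    return merged
```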
Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>
Co-authored-by: iermolae <igor.ermolaev@intel.com>
* py-vermin: add latest version 1.5.2
* Removed obsolete dep; setuptools is only needed at build time
- setuptools is not used at runtime
- py27 isn't strictly necessary
The `unparser` that Spack uses for package hashing had several tweaks to ensure compatibility
with Python 2.7:
1. Currently, the unparser automatically moves `*` and `**` args to the end to preserve
compatibility with `python@:3.4`
2. `print a, b, c` statements and single-tuple `print((a, b, c))` function calls were
remapped to `print(a, b, c)` in the unparsed output for consistency across versions.
(1) is causing issues in our tests because a recent patch to the Python source code
(https://github.com/python/cpython/pull/102953/files#diff-7972dffec6674d5f09410c71766ac6caacb95b9bccbf032061806ae304519c9bR813-R823)
has a `**` arg before a named argument, and we round-trip the core python source code
as a test of our unparser. This isn't actually a break with our consistent unparsing -- it's still
consistent, the python source just doesn't unparse to the same thing anymore. It does make
it harder to test, so it's not worth maintaining the Python2-specific stuff anymore.
Since we only support `python@3.6:`, this PR removes (1) and (2) from the unparser, but keeps
one last tweak for unicode AST inconsistencies, as it's still needed for Python 3.5-3.7.
This fixes the CI error we've been seeing on `python@3.11.4` and `python@3.10.12`. Again, that
bug exists only in the test system and doesn't affect our canonical hashing of Python code.
* WarpX 23.06
Update WarpX and related Python packages to the latest releases.
WarpX 23.06 introduces multi-dimension support in a single package,
which will ease deployment in E4S et al., which can now ship a single,
full-featured module/package that is no longer incompatible with itself.
* e4s ci stacks: multiple specs for each dim variant no longer required
* [@spackbot] updating style on behalf of ax3l
* WarpX: Update CMake CLI and Test/Check
* Add Missing `build-directory`
* [@spackbot] updating style on behalf of ax3l
* Remove `build_directory` again
---------
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: ax3l <ax3l@users.noreply.github.com>
* star: add 2.7.10
* star: fix building for non-avx2 arch processors
* convert to MakefilePackage, second take at fixing for aarch64
* style
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
* DependencySpec: add virtuals attribute on edges
This works for both the new and the old concretizer. Also,
added type hints to involved functions.
* Improve virtual reconstruction from old format
* Reconstruct virtuals when reading from Cray manifest
* Reconstruct virtual information on test dependencies
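Assuming the new attribute is exposed on dependency edges as described, it can be inspected along these lines:
```python
# Sketch: inspecting the new per-edge virtuals information.
import spack.spec

spec = spack.spec.Spec("hdf5+mpi").concretized()
for edge in spec.edges_to_dependencies():
    # edge.virtuals lists virtual packages (e.g. "mpi") provided along this edge.
    print(edge.spec.name, edge.virtuals)
```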
* openradioss-starter,engine: new package
* openradioss-engine: change version name develop to main
* openradioss-starter: change version name develop to main
Update Tcl modulefile template to use the `depends-on` command to
autoload modules if Lmod is the current module tool.
Autoloading modules with the `module load` command in a Tcl modulefile does
not work well with Lmod to some extent. An attempt to unload then load the
designated module is performed each time such a command is encountered. This
may lead to a load storm that may not end correctly with a large number of
module dependencies.
The `depends-on` command should be used for Lmod instead of `module load`,
as it checks whether the module is already loaded and does not attempt to
reload it.
The Lua modulefile template already uses the `depends_on` command to autoload
dependencies. Thus it is already assumed that Lmod used with Spack supports
the `depends_on` command (version 7.6+).
Environment Modules copes well with the `module load` command to autoload
dependencies (version 3.2+). The `depends-on` command is supported starting
with version 5.1 (as an alias of the `prereq-all` command), which was
released last year.
This change introduces a test to determine whether the module tool evaluating
the modulefile is Lmod. If so, autoloaded dependencies are defined with the
`depends-on` command; otherwise the `module load` command is used.
The test is based on the `LMOD_VERSION_MAJOR` environment variable, which is
set by Lmod starting with version 5.1.
Fixes #36764
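The decision itself is simple; in rough Python terms (the actual check is performed in the generated Tcl modulefile template, not in Python):
```python
import os


def autoload_directive():
    """Illustration of the decision only; the real check happens in the Tcl
    modulefile template when it is evaluated by the module tool."""
    # Lmod sets LMOD_VERSION_MAJOR starting with version 5.1.
    if os.environ.get("LMOD_VERSION_MAJOR"):
        return "depends-on"  # checks whether the module is already loaded
    return "module load"  # works well for Environment Modules
```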
When interpreting local paths as relative URL endpoints, they were
formatted as Windows paths on Windows (i.e. with '\'). URLs should
always be POSIX-style.
Update modulefile templates to append a trailing delimiter to MANPATH
environment variable, if the modulefile sets it.
With a trailing delimiter at the end of MANPATH's value, man will search
the system man pages after searching the specific paths set.
Using append-path/append_path to add this element, the module tool
ensures it is appended only once. When the modulefile is unloaded, the
number of append attempts is decreased, so the trailing delimiter is
removed only if this number equals 0.
Disclaimer: no path element should be appended to MANPATH by generated
modulefiles. It should always be prepended to ensure this variable's
value ends with the trailing delimiter.
Fixes #11355.
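In terms of Spack's environment-modification API, the idea corresponds roughly to the following sketch (whether the template renders exactly these operations is an assumption):
```python
# Sketch in terms of Spack's environment-modification API.
from spack.util.environment import EnvironmentModifications

env = EnvironmentModifications()
env.prepend_path("MANPATH", "/opt/pkg/share/man")  # package man pages first
env.append_path("MANPATH", "")  # trailing ":" keeps system man pages searchable
```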
* opencascade: new variants
OpenCascade has several major modules and not every
application needs all of them. This adds variants for
the various modules.
It also updates the 3rdparty dependency treatment.
* [@spackbot] updating style on behalf of wdconinc
* Update var/spack/repos/builtin/packages/opencascade/package.py
* opencascade: remove variant foundation_classes (always true)
* [@spackbot] updating style on behalf of wdconinc
* Update var/spack/repos/builtin/packages/opencascade/package.py
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* Deprecate R packages for spatial analysis
* [@spackbot] updating style on behalf of adamjstewart
---------
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
No changes to build system required. Changelog: https://github.com/xiaoyeli/superlu/compare/v5.3.0...v6.0.0
Since this new version adds "64-bit indexing support", and since at least one dependent package (`armadillo`) requires "32-bit integers" (faa6cbf895), the previous version remains preferred.
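Keeping the previous release preferred looks roughly like this in the recipe (a hypothetical sketch; checksums are placeholders):
```python
from spack.package import *


class Superlu(CMakePackage):
    version("6.0.0", sha256="<placeholder>")
    # 6.0.0 adds 64-bit indexing, which breaks dependents such as armadillo
    # that expect 32-bit integers, so the previous release stays preferred.
    version("5.3.0", sha256="<placeholder>", preferred=True)
```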
* [@spackbot] updating style on behalf of wdconinc
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* New version for openCARP packages
* Update carputils dependencies
* Update types of openCARP dependencies
* Add type "run" to setuptools dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add package py-common as carputils dependency
* Add setuptools dependency for py-common
* Remove spaces on blank line
* Restrict type of dependency setuptools to "build"
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: openCARP consortium <info@opencarp.org>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [gromacs] Fix intel (classic) libstdc++ path
Gromacs's `cmake` run will look for `--gcc-toolchain` (e.g. LLVM, icpx) or
`--gcc-name` (e.g. icpc) in `CMAKE_CXX_FLAGS`. Only if it does not find a good
g++ candidate there will it look for `GMX_GPLUSPLUS_PATH`:
cb6b311c39/cmake/FindLibStdCpp.cmake (L97)
Spack-installed Intel compilers already define a g++ for the standard libraries,
but in `icp{c,x}.cfg` instead of on the compile line. If we use the pre-defined
g++ we not only have less chance of mixing g++ versions, but also don't need to
explicitly add `gcc` as a dependency to `gromacs`.
* Fix format
* Use a variant
As there is no way to check if a file exists at the depends_on stage
* Fix format
* New name and fail if variant is used with other compiler
* Line too long.
The pcluster image has an internal buildcache without an index.
Also, we need to force reuse to avoid rebuilding GCC, since the default is
to only reuse dependencies - and that is subject to changes in the GCC
recipe.
* bedtools2: patching to build with gcc@13
* bedtools2: patching to build with gcc@13
* Update var/spack/repos/builtin/packages/bedtools2/package.py
Yep, sure. Makes sense.
Co-authored-by: Alec Scott <alec@bcs.sh>
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Alec Scott <alec@bcs.sh>
* ensmallen: new package
ensmallen is a high-quality C++ library for non-linear numerical optimization.
* r-rcppensmallen: new package
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* Improve lib/spack/spack/test/cmd/compiler.py
* Use "tmp_path" in the "mock_executable" fixture
* Return a pathlib.Path from mock_executable
* Fix mock_executable fixture on Windows
"mock_gcc" was very similar to mock_executable, so use the latter to reduce code duplication
* Remove wrong compiler cache, fix compiler removal
fixes #37996
_CACHE_CONFIG_FILES was both unneeded and wrong, if called
subsequently with different scopes.
Here we remove that cache, and we fix an issue with compiler
removal triggered by having the same compiler spec in multiple
scopes.
* e4s oneapi ci: use official intel oneapi-derived runner image
* update oneapi image
* tau builds ok, but only with libdrm - comment out for now, follow up with pr later
* Guard for define in netcdf 4.9.0 and later.
This code is already available in ParaView 5.11.0 so no patching
needed there.
* Add latest needed version (even if not in spack).
---------
Co-authored-by: Dan Lipsa <dan.lipsa@khq.kitware.com>
* llvm: add new versions and set default for libomptarget according to os
modified: var/spack/repos/builtin/packages/llvm/package.py
* Incorporate reviewer suggestions
Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
---------
Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
* e4s cray ci stack
* e4s ci: add cray
* add zen4 tag
* WIP: new definitions just for cray
* updates
* remove ci signing job overrride, not necessary
* echo $PATH and show modules loaded
* add mirror
* add external def for cray-libsci
* comment out quantum-espresso
* use /etc/protected-runner as key path
* cray ci stack: do not remove tags: [spack, public]
* make cray stack composable
* generate job should run on public tagged runner, override default config:install_tree:root
* CI: Use relative path in default script
* CI: Use relative includes paths for shell runners
* Use concrete_env_dir for relpath
* ml-darwin-aarch64-mps: jax has bazel codesign issue
---------
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
#37592 updated cached cmake packages to set CMAKE_CUDA_ARCHITECTURES.
The condition `if archs != "none"` led to `CMAKE_CUDA_ARCHITECTURES=none`
when cuda_arch=none (an incorrect check on the value of a multi-valued
variant), i.e. CMAKE_CUDA_ARCHITECTURES was always set. This PR updates
the condition to `if archs[0] != "none"` to ensure CMAKE_CUDA_ARCHITECTURES
is only set if cuda_arch is not none (which seems to be the pattern used
in other packages).
This does the same for HIP (although in general ROCmPackage disallows
amdgpu_target=none when +rocm).
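A sketch of the corrected guard in a hypothetical package (names are illustrative):
```python
from spack.package import *


class ExampleCudaPackage(CMakePackage, CudaPackage):
    """Hypothetical package showing the corrected multi-valued variant check."""

    def cmake_args(self):
        args = []
        archs = self.spec.variants["cuda_arch"].value  # tuple, e.g. ("none",) or ("70", "80")
        if archs[0] != "none":
            # Only emit CMAKE_CUDA_ARCHITECTURES when real architectures are requested.
            args.append(self.define("CMAKE_CUDA_ARCHITECTURES", ";".join(archs)))
        return args
```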
* fastqc: add 0.12.1
* fastqc: add 0.12.1
* Update var/spack/repos/builtin/packages/fastqc/package.py
Yeah, had considered doing the same, I'd just opted to maintain the status quo. All good.
Co-authored-by: Alec Scott <alec@bcs.sh>
* Update package.py
Style fiddles to make the bot contented.
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Alec Scott <alec@bcs.sh>
* py-htseq: add 0.12.3, switching over to new GitHub repo
* py-htseq: add 0.12.3, switching over to new GitHub repo
Style fixes
* py-htseq: add 2.0.3, switch to PyPI
* py-htseq: add 2.0.3, switch to PyPI
* Update package.py
* Update var/spack/repos/builtin/packages/py-htseq/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Removing SWIG
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* llvm: replace +omp_as_runtime with omp=runtime
* llvm: fetch 'libomp-libflags-as-list.patch' from upstream repo
* llvm: replace 'llvm14-hwloc-ompd.patch' with the official fix from upstream repo
* llvm: fix-up for the black reformatting
* llvm: fetch 'constexpr_longdouble.patch' from upstream repo
* llvm: optionally build libcxx as a runtime
* llvm: fetch 'llvm5-sanitizer-ustat.patch' from upstream repo
* llvm: update 'sanitizer-ipc_perm_mode.patch'
* llvm: refactor compiler conflicts when libcxx=project
* llvm: fetch 'llvm_python_path.patch' from upstream repo
* llvm: update comments and condition for 'xray_buffer_queue-cstddef.patch'
* llvm: optionally build compiler-rt as a runtime
* llvm: fetch 'lldb_external_ncurses-10.patch' from upstream repo
* llvm: fetch 'llvm_py37.patch' from upstream repo
* llvm: rename variant 'internal_unwind' to 'libunwind'
* llvm: optionally build libunwind as a runtime
* llvm: extend the list of maintainers
* llvm: allow for explicit '~clang~flang~libomptarget~lldb~omp_debug~z3'
* llvm: fetch 'llvm5-lld-ELF-Symbols.patch' from FreeBSD port repo
* llvm: fetch most of 'missing-includes.patch' from upstream repo and reuse 'llvm-gcc11.patch'
* llvm: regroup patches for missing include directives and drop compiler constraints for them
* llvm: fetch 'llvm-gcc11.patch' from upstream repo
* llvm: fetch 'no_cyclades.patch' from upstream repo
* llvm: update comments and condition for 'no_cyclades9.patch'
* llvm: rename variant 'omp' to 'openmp'
* llvm: constrain and rename variant 'omp_tsan' to 'libomp_tsan'
* llvm: rename variant 'omp_debug' to 'libomptarget_debug'
* llvm: do not apply same patch twice
* llvm: constrain and document the '*-thread.patch' patches
* llvm: document the '~lld+libomptarget' conflict
* llvm: update comments for the 'D133513.diff' patch
* py-future: add 0.18.3
* Update var/spack/repos/builtin/packages/py-future/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Adding libpsm3 package
* Make changes suggested by flake8
* Make one more flake8-suggested change, blank line after 'import os'
* Change to standard header to pass flake8 tests
* Update doc string, remove unnecessary comments
* Reviewer-recommended changes
* Alphabetize variants
* Use helper functions
* Change quotes to pass spack style check
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
- Add pkgconfig dependency from 1.23.0 onward.
- Add conflict of old versions with new gcc due to missing includes.
- Deprecate odd-numbered minor versions because they are not regarded as stable.
- Add maintainer
* [xsdk-examples] Initial commit for v0.4.0
* [xsdk-examples] v0.4.0 depends on xsdk@0.8.0
* add in missing xsdk dependencies
* [xsdk-examples] remove repeated 'depends_on' directive
* [xsdk-examples] simplify and extend a bit the package
[mfem] process more optional dependencies of HiOp
[strumpack, superlu-dist] add a workaround for an issue on Mac
* [mfem] fix the handling of the hiop dependency
* [@spackbot] updating style on behalf of v-dobrev
* [xsdk-examples] enable 'heffte' and 'tasmanian' if enabled in 'xsdk'
* [xsdk-examples] Add PUMI dependency
* [xsdk-examples] Add preCICE dependency
* [xsdk-examples] add +rocm
* heffte: add in a backport fix for building xsdk-examples with cuda
* [xsdk] Remove the explicit requirement for deal.II to be built +hdf5
* ENABLE_ROCM -> ENABLE_HIP
* [hiop] Workaround for CMake not finding Cray's BLAS (libsci)
[xsdk-examples] Set CUDA/HIP architectures; sync cuda/rocm variants with xsdk
* [@spackbot] updating style on behalf of v-dobrev
* [exago] Workaround for CMake not finding Cray's LAPACK/BLAS, libsci
[mfem] Tweaks for running tests under Flux and PBS
* [slate] Pass CUDA/HIP architectures to CMake
* [heffte] For newer CMake versions, set CMAKE_CUDA_ARCHITECTURES
* [hypre] Patch v2.26.0 to fix sequential compilation in 'src/seq_mv'
* [xsdk-examples] Some tweaks in dependencies and compilers used
* [xsdk] Make the 'trilinos' variant sticky
[xsdk-examples] Tweak dependencies
* [slate] Fix copy-paste error
* [xsdk-examples] Workaround for CMakePackage not having the legacy
property 'build_directory'
* [xsdk-examples] Replace the testing branch used temporarily for v0.4.0 with
the official release
---------
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
* Add CMake options for building with CUDA/HIP support to
CachedCMakePackages (intended to reduce duplication across packages
building with +hip/+cuda and using CachedCMakePackage)
* Define generic variables like CMAKE_PREFIX_PATH for
CachedCMakePackages (so that a user may invoke "cmake" themselves
without needing to set them on the command line).
* Make `lbann` a CachedCMakePackage.
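For illustration, the kind of cache entries involved might look like this (the exact entries emitted by a given package will differ; the helpers are Spack's cached-CMake functions):
```python
# Illustration only: cache entries of the kind CachedCMakePackage can generate.
from spack.package import cmake_cache_option, cmake_cache_string

entries = [
    cmake_cache_option("ENABLE_CUDA", True),
    cmake_cache_string("CMAKE_CUDA_ARCHITECTURES", "80"),
    cmake_cache_string("CMAKE_PREFIX_PATH", "/path/to/dependency/prefixes"),
]
```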
Co-authored-by: Chris White <white238@llnl.gov>
fa7719a changed syntax for specifying exact versions, which are
required for some compiler specs (including those read as part
of parsing a Cray manifest). This fixes that and also makes a
couple other improvements to manifest parsing.
* Instantiate compiler specs with exact versions (fixes #37893)
* fix slingshot network detection (CPE 22.10+ has libcxi.so
in /usr/lib64)
* "spack external find": add arg to ignore default dir for cray
manifests
This will build flux-security separately to have a flux-imp that can be
defined in a flux broker.toml. Note that a user who wants a multi-user setup
is recommended to create a view, and then a system/broker.toml in the flux
config directory that points to it.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Sphinx is used to build Open MPI manpages, etc. as part of the make dist
process to create release tarballs. There should be no need/use to do
this within Spack. Also some sites have older Sphinx installs which
aren't compatible with the needs of the Open MPI documentation.
For example, attempts to install openmpi@main fail at NERSC owing to
such a situation.
Since Spack normally is used to build from release tarballs, in which
the docs have already been installed, this should present no issues.
This configuration option will be ignored for Open MPI releases older than 5.0.0.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
This option is needed for DFT FE - or more accurately the check needs to
be checked off for a number of platforms or else the code doesn't work.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Change default naming scheme for tcl modules for a more user-friendly
experience.
Change from flat projection to "per software name" projection.
A flat naming scheme restricts module selection capabilities. The
`{name}/{version}...` scheme makes it possible to use user-friendly
mechanisms:
* implicit defaults (`module load git`)
* extended default (`module load git/2`)
* advanced version specifiers (`module load git@2:`)
* py-cutadapt: add 4.4, 4.3, 4.2 versions
* Update var/spack/repos/builtin/packages/py-cutadapt/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-cutadapt/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update PennyLane ecosystem for 0.30 release
* Update package dep versions
* Fix formatting
* Update dep versions
* Remove PL hard pin and rely on PLQ to define version
* Update var/spack/repos/builtin/packages/py-pennylane-lightning-kokkos/package.py
Co-authored-by: Vincent Michaud-Rioux <vincent.michaud-rioux@xanadu.ai>
* Convert pybind11 from build to link dep, and PL ver limit
---------
Co-authored-by: Vincent Michaud-Rioux <vincent.michaud-rioux@xanadu.ai>
Note the win-sdk package is not installable and reports an error
which instructs the user how to add it. Without this fix, a
(more confusing) error occurs before this message can be generated.
"spack build-env" was not generating proper environment variable
definitions on Windows; this commit updates the generated commands
to succeed with batch/PowerShell.
Add a nightly job to attempt building all Paraview dependencies and
upload the results to cdash. This check doesn't affect the reported
build/test status of Spack. We are using this to monitor the state of
Windows support while working on more-robust checks (eventually the
Windows build will have to succeed to merge PRs to Spack).
* Dyninst: add standalone test
* Add docstring with description
* Don't use join_path for builtin path objects
* Whitespace
* Update format of docstring
Some requirements for @main version of environment-modules were missing:
* python (to build ChangeLog documentation file)
* py-sphinx@1.0: (to build man-pages, etc)
Also adding gzip, which is now required to build ChangeLog.gz (which is
now shipped instead of ChangeLog).
Other versions are not requiring these tools (as documentation is
pre-built in dist tarball).
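In recipe terms this roughly corresponds to (a sketch, not the exact package.py):
```python
# Rough recipe-level sketch of the added requirements for @main only.
from spack.package import *


class EnvironmentModules(Package):
    depends_on("python", type="build", when="@main")  # ChangeLog generation
    depends_on("py-sphinx@1.0:", type="build", when="@main")  # man pages, docs
    depends_on("gzip", type="build", when="@main")  # ChangeLog.gz
```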
* [devito] Move to version 4.8.1
* Fix: Adding patch file
* Update var/spack/repos/builtin/packages/py-devito/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-devito/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Addressing @adamjstewart comments
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add macOS ML CI stacks
* torchmeta is no longer maintained and requires ancient PyTorch
* Add MXNet
* update darwin aarch64 stacks
* add darwin-aarch64 scoped config.yaml
* remove unnecessary cleanup job
* fix specifications
* fix labels
* fix labels
* fix indent on tags specification
* no tags for trigger jobs
* try overriding tags in stack spack.yaml
* do not use CI_STACK_CONFIG_SCOPES
* incorporate config:install_tree:root: overrides and compiler defs
* copy relevant ci-scoped config settings directly into stack spack.yaml
* remove build-job-remove
* spack ci generate: add debug flag
* include cdash config directly in stack spack.yaml
* customize build-job script section to avoid absolute paths
* add any-job specification
* tags: use aarch64-macos instead of aarch64
* generate tags: use aarch64-macos instead of aarch64
* do not add morepadding
* use shared mirror; comment out known failures
* remove any-job
* nproc || true
* comment out specs failing due to bazel from cache codesign issue
---------
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
* py-babel: add 2.12.1
* Update var/spack/repos/builtin/packages/py-babel/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
When attempting to build paraview@5.10.1 using a recent Intel
compiler (Classic or OneAPI) or the IBM XL compiler, the build
fails if the version of protobuf used is > 3.18
* [pcluster pipeline] Use local buildcache instead of upstream spack
Spack currently does not relocate compiler references from upstream spack
installations. When using a buildcache we don't need an upstream spack.
* gcc needs to be installed via postinstall to get correct deps
* quantum-espresso %gcc@12.3.0 returns an ICE on neoverse_{n,v}1
* Force gitlab to pull the new container
* Revert "Force gitlab to pull the new container"
This reverts commit 3af5f4cd88.
Seems the gitlab version does not yet support "pull_policy" in .gitlab-ci.yml
* Gitlab keeps picking up wrong container. Renaming
* Update containers once more after failed build
* add version 1.48.0 to bioconductor package r-a4
* add version 1.48.0 to bioconductor package r-a4base
* add version 1.48.0 to bioconductor package r-a4classif
* add version 1.48.0 to bioconductor package r-a4core
* add version 1.48.0 to bioconductor package r-a4preproc
* add version 1.48.0 to bioconductor package r-a4reporting
* add version 1.54.0 to bioconductor package r-absseq
* add version 1.30.0 to bioconductor package r-acde
* add version 1.78.0 to bioconductor package r-acgh
* add version 2.56.0 to bioconductor package r-acme
* add version 1.70.0 to bioconductor package r-adsplit
* add version 1.72.0 to bioconductor package r-affxparser
* add version 1.78.0 to bioconductor package r-affy
* add version 1.76.0 to bioconductor package r-affycomp
* add version 1.58.0 to bioconductor package r-affycontam
* add version 1.72.0 to bioconductor package r-affycoretools
* add version 1.48.0 to bioconductor package r-affydata
* add version 1.52.0 to bioconductor package r-affyilm
* add version 1.70.0 to bioconductor package r-affyio
* add version 1.76.0 to bioconductor package r-affyplm
* add version 1.46.0 to bioconductor package r-affyrnadegradation
* add version 1.48.0 to bioconductor package r-agdex
* add version 3.32.0 to bioconductor package r-agilp
* add version 2.50.0 to bioconductor package r-agimicrorna
* add version 1.32.0 to bioconductor package r-aims
* add version 1.32.0 to bioconductor package r-aldex2
* add version 1.38.0 to bioconductor package r-allelicimbalance
* add version 1.26.0 to bioconductor package r-alpine
* add version 2.62.0 to bioconductor package r-altcdfenvs
* add version 2.24.0 to bioconductor package r-anaquin
* add version 1.28.0 to bioconductor package r-aneufinder
* add version 1.28.0 to bioconductor package r-aneufinderdata
* add version 1.72.0 to bioconductor package r-annaffy
* add version 1.78.0 to bioconductor package r-annotate
* add version 1.62.0 to bioconductor package r-annotationdbi
* add version 1.24.0 to bioconductor package r-annotationfilter
* add version 1.42.0 to bioconductor package r-annotationforge
* add version 3.8.0 to bioconductor package r-annotationhub
* add version 3.30.0 to bioconductor package r-aroma-light
* add version 1.32.0 to bioconductor package r-bamsignals
* add version 2.16.0 to bioconductor package r-beachmat
* add version 2.60.0 to bioconductor package r-biobase
* add version 2.8.0 to bioconductor package r-biocfilecache
* add version 0.46.0 to bioconductor package r-biocgeneric
* add version 1.10.0 to bioconductor package r-biocio
* add version 1.18.0 to bioconductor package r-biocneighbors
* add version 1.34.0 to bioconductor package r-biocparallel
* add version 1.16.0 to bioconductor package r-biocsingular
* add version 2.28.0 to bioconductor package r-biocstyle
* add version 3.17.1 to bioconductor package r-biocversion
* add version 2.56.0 to bioconductor package r-biomart
* add version 1.28.0 to bioconductor package r-biomformat
* add version 2.68.0 to bioconductor package r-biostrings
* add version 1.48.0 to bioconductor package r-biovizbase
* add version 1.10.0 to bioconductor package r-bluster
* add version 1.68.0 to bioconductor package r-bsgenome
* add version 1.36.0 to bioconductor package r-bsseq
* add version 1.42.0 to bioconductor package r-bumphunter
* add version 2.66.0 to bioconductor package r-category
* add version 2.30.0 to bioconductor package r-champ
* add version 2.32.0 to bioconductor package r-champdata
* add version 1.50.0 to bioconductor package r-chipseq
* add version 4.8.0 to bioconductor package r-clusterprofiler
* add version 1.36.0 to bioconductor package r-cner
* add version 1.32.0 to bioconductor package r-codex
* add version 2.16.0 to bioconductor package r-complexheatmap
* add version 1.74.0 to bioconductor package r-ctc
* add version 2.28.0 to bioconductor package r-decipher
* add version 0.26.0 to bioconductor package r-delayedarray
* add version 1.22.0 to bioconductor package r-delayedmatrixstats
* add version 1.40.0 to bioconductor package r-deseq2
* add version 1.46.0 to bioconductor package r-dexseq
* add version 1.42.0 to bioconductor package r-dirichletmultinomial
* add version 2.14.0 to bioconductor package r-dmrcate
* add version 1.74.0 to bioconductor package r-dnacopy
* add version 3.26.0 to bioconductor package r-dose
* add version 2.48.0 to bioconductor package r-dss
* add version 3.42.0 to bioconductor package r-edger
* add version 1.20.0 to bioconductor package r-enrichplot
* add version 2.24.0 to bioconductor package r-ensembldb
* add version 1.46.0 to bioconductor package r-exomecopy
* add version 2.8.0 to bioconductor package r-experimenthub
* add version 1.26.0 to bioconductor package r-fgsea
* add version 2.72.0 to bioconductor package r-gcrma
* add version 1.36.0 to bioconductor package r-gdsfmt
* add version 1.82.0 to bioconductor package r-genefilter
* add version 1.36.0 to bioconductor package r-genelendatabase
* add version 1.72.0 to bioconductor package r-genemeta
* add version 1.78.0 to bioconductor package r-geneplotter
* add version 1.22.0 to bioconductor package r-genie3
* add version 1.36.0 to bioconductor package r-genomeinfodb
* update r-genomeinfodbdata
* add version 1.36.0 to bioconductor package r-genomicalignments
* add version 1.52.0 to bioconductor package r-genomicfeatures
* add version 1.52.0 to bioconductor package r-genomicranges
* add version 2.68.0 to bioconductor package r-geoquery
* add version 1.48.0 to bioconductor package r-ggbio
* add version 3.8.0 to bioconductor package r-ggtree
* add version 2.10.0 to bioconductor package r-glimma
* add version 1.12.0 to bioconductor package r-glmgampoi
* add version 5.54.0 to bioconductor package r-globaltest
* update r-go-db
* add version 1.20.0 to bioconductor package r-gofuncr
* add version 2.26.0 to bioconductor package r-gosemsim
* add version 1.52.0 to bioconductor package r-goseq
* add version 2.66.0 to bioconductor package r-gostats
* add version 1.78.0 to bioconductor package r-graph
* add version 1.62.0 to bioconductor package r-gseabase
* add version 1.32.0 to bioconductor package r-gtrellis
* add version 1.44.0 to bioconductor package r-gviz
* add version 1.28.0 to bioconductor package r-hdf5array
* add version 1.72.0 to bioconductor package r-hypergraph
* add version 1.36.0 to bioconductor package r-illumina450probevariants-db
* add version 0.42.0 to bioconductor package r-illuminaio
* add version 1.74.0 to bioconductor package r-impute
* add version 1.38.0 to bioconductor package r-interactivedisplaybase
* add version 2.34.0 to bioconductor package r-iranges
* add version 1.60.0 to bioconductor package r-kegggraph
* add version 1.40.0 to bioconductor package r-keggrest
* add version 3.56.0 to bioconductor package r-limma
* add version 2.52.0 to bioconductor package r-lumi
* add version 1.76.0 to bioconductor package r-makecdfenv
* add version 1.78.0 to bioconductor package r-marray
* add version 1.12.0 to bioconductor package r-matrixgenerics
* add version 1.8.0 to bioconductor package r-metapod
* add version 2.46.0 to bioconductor package r-methylumi
* add version 1.46.0 to bioconductor package r-minfi
* add version 1.34.0 to bioconductor package r-missmethyl
* add version 1.80.0 to bioconductor package r-mlinterfaces
* add version 1.12.0 to bioconductor package r-mscoreutils
* add version 2.26.0 to bioconductor package r-msnbase
* add version 2.56.0 to bioconductor package r-multtest
* add version 1.38.0 to bioconductor package r-mzid
* add version 2.34.0 to bioconductor package r-mzr
* add version 1.62.0 to bioconductor package r-oligoclasses
* update r-org-hs-eg-db
* add version 1.42.0 to bioconductor package r-organismdbi
* add version 1.40.0 to bioconductor package r-pathview
* add version 1.92.0 to bioconductor package r-pcamethods
* update r-pfam-db
* add version 1.44.0 to bioconductor package r-phyloseq
* add version 1.62.0 to bioconductor package r-preprocesscore
* add version 1.32.0 to bioconductor package r-protgenerics
* add version 1.34.0 to bioconductor package r-quantro
* add version 2.32.0 to bioconductor package r-qvalue
* add version 1.76.0 to bioconductor package r-rbgl
* add version 2.40.0 to bioconductor package r-reportingtools
* add version 2.44.0 to bioconductor package r-rgraphviz
* add version 2.44.0 to bioconductor package r-rhdf5
* add version 1.12.0 to bioconductor package r-rhdf5filters
* add version 1.22.0 to bioconductor package r-rhdf5lib
* add version 1.76.0 to bioconductor package r-roc
* add version 1.28.0 to bioconductor package r-rots
* add version 2.16.0 to bioconductor package r-rsamtools
* add version 1.60.0 to bioconductor package r-rtracklayer
* add version 0.38.0 to bioconductor package r-s4vectors
* add version 1.8.0 to bioconductor package r-scaledmatrix
* add version 1.28.0 to bioconductor package r-scater
* add version 1.14.0 to bioconductor package r-scdblfinder
* add version 1.28.0 to bioconductor package r-scran
* add version 1.10.0 to bioconductor package r-scuttle
* add version 1.66.0 to bioconductor package r-seqlogo
* add version 1.58.0 to bioconductor package r-shortread
* add version 1.74.0 to bioconductor package r-siggenes
* add version 1.22.0 to bioconductor package r-singlecellexperiment
* add version 1.34.0 to bioconductor package r-snprelate
* add version 1.50.0 to bioconductor package r-snpstats
* add version 2.36.0 to bioconductor package r-somaticsignatures
* add version 1.12.0 to bioconductor package r-sparsematrixstats
* add version 1.40.0 to bioconductor package r-spem
* add version 1.38.0 to bioconductor package r-sseq
* add version 1.30.0 to bioconductor package r-summarizedexperiment
* add version 3.48.0 to bioconductor package r-sva
* add version 1.38.0 to bioconductor package r-tfbstools
* add version 1.22.0 to bioconductor package r-tmixclust
* add version 2.52.0 to bioconductor package r-topgo
* add version 1.24.0 to bioconductor package r-treeio
* add version 1.28.0 to bioconductor package r-tximport
* add version 1.28.0 to bioconductor package r-tximportdata
* add version 1.46.0 to bioconductor package r-variantannotation
* add version 3.68.0 to bioconductor package r-vsn
* add version 2.6.0 to bioconductor package r-watermelon
* add version 2.46.0 to bioconductor package r-xde
* add version 1.58.0 to bioconductor package r-xmapbridge
* add version 0.40.0 to bioconductor package r-xvector
* add version 1.26.0 to bioconductor package r-yapsa
* add version 1.26.0 to bioconductor package r-yarn
* add version 1.46.0 to bioconductor package r-zlibbioc
* Revert "add version 1.82.0 to bioconductor package r-genefilter"
This reverts commit 1702071c6d.
* Revert "add version 0.38.0 to bioconductor package r-s4vectors"
This reverts commit 58a7df2387.
* add version 0.38.0 to bioconductor package r-s4vectors
* Revert "add version 1.28.0 to bioconductor package r-aneufinder"
This reverts commit 0a1f59de6c.
* add version 1.28.0 to bioconductor package r-aneufinder
* Revert "add version 2.16.0 to bioconductor package r-beachmat"
This reverts commit cd49fb8e4c.
* add version 2.16.0 to bioconductor package r-beachmat
* Revert "add version 4.8.0 to bioconductor package r-clusterprofiler"
This reverts commit 6e9a951cbe.
* add version 4.8.0 to bioconductor package r-clusterprofiler
* Fix syntax error
* r-genefilter: add version 1.82.0
* new package: r-basilisk-utils
* new package: r-basilisk
* new package: r-densvis
* new package: r-dir-expiry
* r-affyplm: add zlib dependency
* r-cner: add zlib dependency
* r-mzr: add zlib dependency
* r-rhdf5filters: add zstd dependency
* r-shortread: add zlib dependency
* r-snpstats: add zlib dependency
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
macOS doesn't have `getrandom`, and 1.10.2 fails to compile because of this.
There's an upstream fix at https://dev.gnupg.org/T6442 that will be in the next
`libgcrypt` release, but the patch is available now.
* py-argcomplete: add 3.0.8
* Update var/spack/repos/builtin/packages/py-argcomplete/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [@spackbot] updating style on behalf of manuelakuhn
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* aml: v0.2.1
* add version 0.2.1
* fix hip variant bug
* [fix] pkgconf required for all builds
On top of needing pkgconf for autoreconf builds, the release configure
script needs pkgconf to detect dependencies if any of the hwloc, ze, or
opencl variants are active.
* Remove deprecation for v0.2.0 based on PR advice.
* intel-xed: add version 2023.04.16
1. add version 2023.04.16
2. adjust the mbuild resource to better match the xed version at the time
3. replace three conflicts() with one new requires() for x86_64 target
4. add patch for libxed-ild for some new avx512 instructions
* [@spackbot] updating style on behalf of mwkrentel
* Fix the build for 2023.04.16. XED requires its source directory to be exactly 'xed', so add a symlink.
5. move the mbuild resource up one level, xed wants it to be in the same directory as the xed source dir
6. deprecate 10.2019.03
* semantic style fix: add OSError to except
* [@spackbot] updating style on behalf of mwkrentel
---------
Co-authored-by: mwkrentel <mwkrentel@users.noreply.github.com>
* py-jarvis-util: add a new package
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update py-nltk
* [@spackbot] updating style on behalf of meyersbs
* Update var/spack/repos/builtin/packages/py-nltk/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add version 1.28.3 to r-hexbin
* add version 5.0-1 to r-hmisc
* add version 0.5.5 to r-htmltools
* add version 1.6.2 to r-htmlwidgets
* add version 1.4.2 to r-igraph
* add version 0.42.19 to r-imager
* add version 1.0-5 to r-inum
* add version 0.9-14 to r-ipred
* add version 1.3.2 to r-irkernel
* add version 2.2.0 to r-janitor
* add version 0.1-10 to r-jpeg
* add version 1.2.2 to r-jsonify
* add version 0.9-32 to r-kernlab
* add version 1.7-2 to r-klar
* add version 1.42 to r-knitr
* add version 1.14.0 to r-ks
* add version 2.11.0 to r-labelled
* add version 1.7.2.1 to r-lava
* add version 0.6-15 to r-lavaan
* add version 2.1.2 to r-leaflet
* add version 2.9-0 to r-lfe
* add version 1.1.6 to r-lhs
* add version 1.1-33 to r-lme4
* add version 1.5-9.7 to r-locfit
* add version 0.4.3 to r-log4r
* add version 5.6.18 to r-lpsolve
* add version 0.2-11 to r-lwgeom
* add version 2.7.4 to r-magick
* add version 1.22.1 to r-maldiquant
* add version 1.2.11 to r-mapproj
* add version 1.6 to r-markdown
* add version 7.3-59 to r-mass
* add version 1.5-4 to r-matrix
* add version 0.63.0 to r-matrixstats
* add version 4.2-3 to r-memuse
* add version 4.0-0 to r-metafor
* add version 1.8-42 to r-mgcv
* add version 3.15.0 to r-mice
* add version 0.4-5 to r-mitml
* add version 2.0.0 to r-mixtools
* add version 0.1.11 to r-modelr
* add version 1.4-23 to r-multcomp
* add version 0.1-9 to r-multcompview
* add version 0.1-13 to r-mutoss
* add version 1.18.1 to r-network
* add version 3.3.4 to r-nleqslv
* add version 3.1-162 to r-nlme
* add version 0.26 to r-nmf
* add version 0.60-17 to r-np
* add version 4.2.5.2 to r-openxlsx
* add version 2022.11-16 to r-ordinal
* add version 0.6.0.8 to r-osqp
* add version 0.9.1 to r-packrat
* add version 1.35.0 to r-parallelly
* add version 1.3-13 to r-party
* add version 1.2-20 to r-partykit
* add version 1.7-0 to r-pbapply
* add version 0.3-9 to r-pbdzmq
* add version 1.2 to r-pegas
* add version 1.5-1 to r-phytools
* add version 1.9.0 to r-pillar
* add version 1.4.0 to r-pkgbuild
* add version 2.1.0 to r-pkgcache
* add version 0.5.0 to r-pkgdepends
* add version 2.0.7 to r-pkgdown
* add version 1.3.2 to r-pkgload
* add version 0.1-8 to r-png
* add version 1.1.22 to r-polspline
* add version 1.0.1 to r-pool
* add version 1.4.1 to r-posterior
* add version 3.8.1 to r-processx
* add version 2023.03.31 to r-prodlim
* add version 1.0-12 to r-proj4
* add version 2.5.0 to r-projpred
* add version 0.1.6 to r-pryr
* add version 1.7.5 to r-ps
* add version 1.0.1 to r-purrr
* add version 1.3.2 to r-qqconf
* add version 0.25.5 to r-qs
* add version 1.60 to r-qtl
* add version 0.4.22 to r-quantmod
* add version 5.95 to r-quantreg
* add version 0.7.8 to r-questionr
* add version 1.2.5 to r-ragg
* add version 0.15.1 to r-ranger
* add version 3.6-20 to r-raster
* add version 2.2.13 to r-rbibutils
* add version 1.0.10 to r-rcpp
* add version 0.12.2.0.0 to r-rcpparmadillo
* add version 0.1.7 to r-rcppde
* add version 0.3.13 to r-rcppgsl
* add version 1.98-1.12 to r-rcurl
* add version 1.2-1 to r-rda
* add version 2.1.4 to r-readr
* add version 1.4.2 to r-readxl
* add version 1.0.6 to r-recipes
* add version 1.1.6 to r-repr
* add version 1.2.16 to r-reproducible
* add version 0.3.0 to r-require
* add version 1.28 to r-reticulate
* add version 2.0.7 to r-rfast
* add version 1.6-6 to r-rgdal
* add version 0.6-2 to r-rgeos
* add version 1.1.3 to r-rgl
* add version 0.2.18 to r-rinside
* add version 4-14 to r-rjags
* add version 1.3-1.8 to r-rjsonio
* add version 2.21 to r-rmarkdown
* add version 0.9-2 to r-rmpfr
* add version 0.7-1 to r-rmpi
* add version 6.6-0 to r-rms
* add version 0.10.25 to r-rmysql
* add version 0.8.7 to r-rncl
* add version 2.4.11 to r-rnexml
* add version 0.95-1 to r-robustbase
* add version 1.3-20 to r-rodbc
* add version 7.2.3 to r-roxygen2
* add version 1.4.5 to r-rpostgres
* add version 0.7-5 to r-rpostgresql
* add version 0.8.29 to r-rsconnect
* add version 0.4-15 to r-rsnns
* add version 2.3.1 to r-rsqlite
* add version 0.7.2 to r-rstatix
* add version 1.1.2 to r-s2
* add version 0.4.5 to r-sass
* add version 0.1.9 to r-scatterpie
* add version 0.3-43 to r-scatterplot3d
* add version 3.2.4 to r-scs
* add version 1.6-4 to r-segmented
* add version 4.2-30 to r-seqinr
* add version 0.26 to r-servr
* add version 4.3.0 to r-seurat
* add version 1.0-12 to r-sf
* add version 0.4.2 to r-sfheaders
* add version 1.1-15 to r-sfsmisc
* add version 1.7.4 to r-shiny
* add version 1.9.0 to r-signac
* add version 1.6.0.3 to r-smoof
* add version 0.1.7-1 to r-sourcetools
* add version 1.6-0 to r-sp
* add version 1.3-0 to r-spacetime
* add version 7.3-16 to r-spatial
* add version 2.0-0 to r-spatialeco
* add version 1.2-8 to r-spatialreg
* add version 3.0-5 to r-spatstat
* add version 3.0-1 to r-spatstat-data
* add version 3.1-0 to r-spatstat-explore
* add version 3.1-0 to r-spatstat-geom
* add version 3.1-0 to r-spatstat-linnet
* add version 3.1-4 to r-spatstat-random
* add version 3.0-1 to r-spatstat-sparse
* add version 3.0-2 to r-spatstat-utils
* add version 2.2.2 to r-spdata
* add version 1.2-8 to r-spdep
* add version 0.6-1 to r-stars
* add version 1.5.0 to r-statmod
* add version 4.8.0 to r-statnet-common
* add version 1.7.12 to r-stringi
* add version 1.5.0 to r-stringr
* add version 1.9.1 to r-styler
* add version 3.5-5 to r-survival
* add version 1.5-4 to r-tclust
* add version 1.7-29 to r-terra
* add version 3.1.7 to r-testthat
* add version 1.1-2 to r-th-data
* add version 1.2 to r-tictoc
* add version 1.3.2 to r-tidycensus
* add version 1.2.3 to r-tidygraph
* add version 1.3.0 to r-tidyr
* add version 2.0.0 to r-tidyverse
* add version 0.2.0 to r-timechange
* add version 0.45 to r-tinytex
* add version 0.4.1 to r-triebeard
* add version 1.0-9 to r-truncnorm
* add version 0.10-53 to r-tseries
* add version 0.8-1 to r-units
* add version 4.3.0 to r-v8
* add version 1.4-11 to r-vcd
* add version 1.14.0 to r-vcfr
* add version 0.6.2 to r-vctrs
* add version 1.1-8 to r-vgam
* add version 0.4.0 to r-vioplot
* add version 1.6.1 to r-vroom
* add version 1.72-1 to r-wgcna
* add version 0.4.1 to r-whisker
* add version 0.7.2 to r-wk
* add version 0.39 to r-xfun
* add version 1.7.5.1 to r-xgboost
* add version 1.0.7 to r-xlconnect
* add version 3.99-0.14 to r-xml
* add version 0.13.1 to r-xts
* add version 2.3.7 to r-yaml
* add version 2.3.0 to r-zip
* add version 1.8-12 to r-zoo
* r-bigmem: dependency on uuid
* r-bio3d: dependency on zlib
* r-devtools: dependency cleanup
* r-dose: dependency cleanup
* r-dss: dependency cleanup
* r-enrichplot: dependency cleanup
* r-fgsea: dependency cleanup
* r-geor: dependency cleanup
* r-ggridges: dependency cleanup
* r-lobstr: dependency cleanup
* r-lubridate: dependency cleanup
* r-mnormt: dependency cleanup
* r-sctransform: version format correction
* r-seuratobject: dependency cleanup
* r-tidyselect: dependency cleanup
* r-tweenr: dependency cleanup
* r-uwot: dependency cleanup
* new package: r-clock
* new package: r-conflicted
* new package: r-diagram
* new package: r-doby
* new package: r-httr2
* new package: r-kableextra
* new package: r-mclogit
* new package: r-memisc
* new package: r-spatstat-model
* r-rmysql: use mariadb-client
* r-snpstats: add zlib dependency
* r-qs: add zstd dependency
* r-rcppcnpy: add zlib dependency
* black reformatting
* Revert "r-dose: dependency cleanup"
This reverts commit 4c8ae8f5615ee124fff01ce43eddd3bb5d06b9bc.
* Revert "r-dss: dependency cleanup"
This reverts commit a6c5c15c617a9a688fdcfe2b70c501c3520d4706.
* Revert "r-enrichplot: dependency cleanup"
This reverts commit 65e116c18a94d885bc1a0ae667c1ef07d1fe5231.
* Revert "r-fgsea: dependency cleanup"
This reverts commit ffe2cdcd1f73f69d66167b941970ede0281b56d7.
* r-rda: this package is back in CRAN
* r-sctransform: fix copyright
* r-seurat: fix copyright
* r-seuratobject: fix copyright
* Revert "add version 6.0-94 to r-caret"
This reverts commit 236260597de97a800bfc699aec1cd1d0e3d1ac60.
* add version 6.0-94 to r-caret
* Revert "add version 1.8.5 to r-emmeans"
This reverts commit 64a129beb0bd88d5c88fab564cade16c03b956ec.
* add version 1.8.5 to r-emmeans
* Revert "add version 5.0-1 to r-hmisc"
This reverts commit 517643f4fd8793747365dfcfc264b894d2f783bd.
* add version 5.0-1 to r-hmisc
* Revert "add version 1.42 to r-knitr"
This reverts commit 2a0d9a4c1f0ba173f7423fed59ba725bac902c37.
* add version 1.42 to r-knitr
* Revert "add version 1.6 to r-markdown"
This reverts commit 4b5565844b5704559b819d2e775fe8dec625af99.
* add version 1.6 to r-markdown
* Revert "add version 0.26 to r-nmf"
This reverts commit 4c44a788b17848f2cda67b32312a342c0261caec.
* add version 0.26 to r-nmf
* Revert "add version 2.3.1 to r-rsqlite"
This reverts commit 5722ee2297276e4db8beee461d39014b0b17e420.
* add version 2.3.1 to r-rsqlite
* Revert "add version 1.0-12 to r-sf"
This reverts commit ee1734fd62cc02ca7a9359a87ed734f190575f69.
* add version 1.0-12 to r-sf
* fix syntax error
* Add FNAL Spack team to maintainers
* New variants and configuration improvements
* Version-dependent "no-systemd" patches.
* New variants `client_only` and `davix`
* Better handling of `cxxstd` for different versions, including
improved patching and CMake options.
* Version-specific CMake requirements.
* Better version-specific handling of `openssl` dependency.
* `py-setuptools` required for `+python` build.
* Specific enable/disable of CMake options and use of
`-DFORCE_ENABLED=TRUE` to prevent unwanted/non-portable activation
of features.
* Better handling of `+python` configuration.
* New version 5.5.5
Add aws-pcluster[-aarch64] stacks. These stacks build packages defined in
https://github.com/spack/spack-configs/tree/main/AWS/parallelcluster
They use a custom container from https://github.com/spack/gitlab-runners which
includes the necessary ParallelCluster software to link and build, as well as an
upstream Spack installation with current GCC and dependencies.
Intel and ARM software is installed and used during the build stage but removed
from the buildcache before the signing stage.
Files `configs/linux/{arch}/ci.yaml` select the necessary providers in order to
build for specific architectures (icelake, skylake, neoverse_{n,v}1).
Make it clear that copy-only pipelines are not supported while still
using the deprecated ci config format. Also ensure that the deprecated
stack does not fail on spack pipelines for tags.
* Fix reporting of packageless specs as having no tests
* Add test_test_output_multiple_specs with update to simple-standalone-test (and tests)
* Refactored test status summary; added more tests or checks
MSVC compiler logic was using string parsing to extract the version
from the compiler spec, which was fragile. This broke in #37572, so it has
been fixed and made more robust by using attribute access.
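A hypothetical illustration of the difference (the `Spec` stand-in below is not the real Spack class):
```python
# Stand-in class for illustration only; not the real spack.spec.Spec.
class Spec:
    def __init__(self, name, version):
        self.name = name
        self.version = version

    def __str__(self):
        return f"{self.name}@{self.version}"


spec = Spec("msvc", "19.36")

fragile = str(spec).split("@")[-1]  # breaks if the string format ever changes
robust = spec.version               # unaffected by formatting changes

assert fragile == robust == "19.36"
```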
* CI: Expand E4S ROCm stack to include missing DaV packages
Ascent: Fixup for VTK-m with Kokkos backend
* DaV SDK: Removed duplicated openmp variant for ascent
* Drop visit and add conflict for Kokkos
* E4S: Drop ascent from CUDA builds
Ensure that requirements `packages:*:require:@x` and preferences `packages:*:version:[x]`
fail concretization when no version defined in the package satisfies `x`. This always holds
except for git versions -- they are defined on the fly.
In the past, Spack did not allow two different versions of the
same package within a DAG. That led to difficulties with packages
that still required Python 2 while other packages had already
switched to Python 3.
The libxcb and xcb-proto packages did not have Python 3 support
for a time. To get around this issue, Spack maintainers disabled
their dependency on an internal (i.e., Spack-provided) Python
(see #4145), forcing these packages to look for a system-provided
Python (see #7646).
This has worked for us all right, but with the arrival of our most
recent platform we seem to be missing the critical xcbgen Python
module on the system. Since most software has largely moved on to
Python 3 now, let's re-enable internal Spack dependencies for the
libxcb and xcb-proto packages.
Two bugs came in from #37438
1. `unify: when_possible` was broken because of an incorrect assertion: abstract/concrete
spec pairs were compared against the results that were in the process of being computed,
rather than against the previous results.
2. `unify: true` had an ordering bug that could mix up the association between abstract and
concrete specs.
- [x] 1 is resolved by creating a lookup from old concrete specs to old abstract specs,
and we use that to associate the "new" concrete specs that happen to be the old
ones with their abstract specs (since those are stripped out for concretization)
- [x] 2 is resolved by combining the new and old abstract specs as lists instead of combining
them as sets. This is important because `set() | set()` does not make any ordering
promises, whereas lists preserve insertion order.
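For illustration only (hypothetical spec names, not Spack code), combining as sets discards insertion order while combining as lists keeps it:
```python
old = ["zlib", "openssl", "cmake"]
new = ["python", "zlib"]

as_set = set(old) | set(new)                      # iteration order is arbitrary
as_list = old + [s for s in new if s not in old]  # insertion order is preserved

assert set(as_list) == as_set  # same elements either way
print(as_list)                 # ['zlib', 'openssl', 'cmake', 'python'] -- deterministic
```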
* Upgrading kosh to 3.0.
* Accidentally regressed the package, changing back.
* Updating py-hdbscan versions for kosh.
* Fixing bug in patch.
* Adding 3.0.1
* Removing 3.0.
* Updating package deps for hdbscan to match requirements.txt.
* Version reqs for 3.0.*, need newer numpy and networkx
* spack style
* Reordering to match setup.py, adding "type" to python depends.
* trilinos@develop fixes
* Update var/spack/repos/builtin/packages/trilinos/package.py
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
---------
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Spack displays package code context when it shouldn't (e.g., on `FetchError`s)
and doesn't display it when it should (e.g., when errors occur in builder classes).
The line attribution can sometimes be off by one as well.
- [x] Display package context when errors occur in a subclass of `PackageBase`
- [x] Display package context when errors occur in a subclass of `BaseBuilder`
- [x] Do not display package context when errors occur in `PackageBase`,
`BaseBuilder` or other core code that is not in a `package.py` file.
- [x] Fix off-by-one error for core code (don't subtract one from the line number *unless*
it's in an actual `package.py` file).
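A rough sketch of that line-number rule (hypothetical helper, not the actual Spack implementation):
```python
import traceback


def frame_locations(exc):
    """Format traceback frames, adjusting line numbers only for package.py files."""
    locations = []
    for frame in traceback.extract_tb(exc.__traceback__):
        lineno = frame.lineno
        if frame.filename.endswith("package.py"):
            lineno -= 1  # subtract one only for actual package recipes
        locations.append(f"{frame.filename}:{lineno}")
    return locations
```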
---------
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
We currently throw a nasty error if you try to reuse packages from some other namespace
(e.g., OLCF), but we should be able to reuse patched local versions of builtin packages.
Right now the only obstacle to that is that we try to look up virtual info for unknown
namespaces, and we can't get the package from the repo to do that. We *can* assume that
a package with a known namespace is similar, and that its virtual provider information
is reasonably accurate, so we now do that. This isn't 100% accurate, but neither is
relying on the package itself, as it may have gone out of date.
The real solution here is virtual edge information, but this is a stopgap until we have
that.
`spec_clauses()` attempts to look up package information for concrete specs in order to
determine which virtuals they may provide. This fails for renamed/deleted dependencies
of buildcaches and installed packages.
This will eventually be fixed by #35258, which adds virtual information on edges, but we
need a workaround to make older buildcaches usable.
- [x] make an exception for renamed packages and omit their virtual constraints
- [x] add a note that this will be solved by adding virtuals to edges
The concretizer can fail with `reuse:true` if a buildcache or installation contains a
package with a dependency that has been renamed or deleted in the main repo (e.g.,
`netcdf` was refactored to `netcdf-c`, `netcdf-fortran`, etc., but there are still
binary packages with dependencies called `netcdf`).
We should still be able to install things for which we are missing `package.py` files.
`Spec.inject_patches_variant()` was failing this requirement by attempting to look up
the package class for concrete specs. This isn't needed -- we can skip it.
- [x] swap two conditions in `Spec.inject_patches_variant()`
I will follow this up with a flux-core variant to add flux-security, and then automation in the flux-framework/spack repository.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Clone the `spack-configs <https://github.com/spack/spack-configs>`_ repo and activate the Intel oneAPI CPU environment::

   git clone https://github.com/spack/spack-configs
   spack env activate spack-configs/INTEL/CPU
   spack concretize -f

The `Intel oneAPI CPU environment <https://github.com/spack/spack-configs/blob/main/INTEL/CPU/spack.yaml>`_ contains applications tested and validated by Intel; this list is constantly extended. Currently it supports:
.. Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
   Spack Project Developers. See the top-level COPYRIGHT file for details.

   SPDX-License-Identifier: (Apache-2.0 OR MIT)
…in case you want to skip directly to specific docs:

* :ref:`config.yaml <config-yaml>`
* :ref:`mirrors.yaml <mirrors>`
* :ref:`modules.yaml <modules>`
* :ref:`packages.yaml <packages-config>`
* :ref:`repos.yaml <repositories>`
You can also add any of these as inline configuration in the YAML
…are six configuration scopes. From lowest to highest:

   Spack instance per project) or for site-wide settings on a multi-user
   machine (e.g., for a common Spack instance).

#. **plugin**: Read from a Python project's entry points. Settings here affect
   all instances of Spack running with the same Python installation. This scope
   takes higher precedence than site, system, and default scopes.

#. **user**: Stored in the home directory: ``~/.spack/``. These settings
   affect all instances of Spack and take higher precedence than site,
   system, plugin, or defaults scopes.

#. **custom**: Stored in a custom directory specified by ``--config-scope``.
   If multiple scopes are listed on the command line, they are ordered
…with MPICH. You can create different configuration scopes for use with

   mpi: [mpich]
.. _plugin-scopes:

^^^^^^^^^^^^^
Plugin scopes
^^^^^^^^^^^^^

.. note::

   Python version >= 3.8 is required to enable plugin configuration.

Spack can be made aware of configuration scopes that are installed as part of a Python package. To do so, register a function that returns the scope's path to the ``"spack.config"`` entry point. Consider the Python package ``my_package`` that includes Spack configurations:

.. code-block:: console
   my-package/
   ├── src
   │   ├── my_package
   │   │   ├── __init__.py
   │   │   └── spack/
   │   │       └── config.yaml
   └── pyproject.toml
Adding the following to ``my_package``'s ``pyproject.toml`` will make ``my_package``'s ``spack/`` configurations visible to Spack when ``my_package`` is installed:

.. code-block:: toml

   [project.entry_points."spack.config"]
   my_package = "my_package:get_config_path"

The function ``my_package.get_config_path`` in ``my_package/__init__.py`` might look like:
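A minimal sketch of such a function, assuming the ``spack/`` directory ships alongside ``__init__.py`` as in the layout above (an illustration, not the exact code from the Spack docs):

.. code-block:: python

   import pathlib


   def get_config_path():
       """Return the path of the bundled spack/ configuration directory."""
       config_dir = pathlib.Path(__file__).parent / "spack"
       if config_dir.exists():
           return str(config_dir)
       return None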
---------------------------------------
Registering Extensions via Entry Points
---------------------------------------
.. note::

   Python version >= 3.8 is required to register extensions via entry points.

Spack can be made aware of extensions that are installed as part of a Python package. To do so, register a function that returns the extension path, or paths, to the ``"spack.extensions"`` entry point. Consider the Python package ``my_package`` that includes a Spack extension:

.. code-block:: console
   my-package/
   ├── src
   │   ├── my_package
   │   │   └── __init__.py
   │   └── spack-scripting/ # the spack extensions
   └── pyproject.toml
Adding the following to ``my_package``'s ``pyproject.toml`` will make the ``spack-scripting`` extension visible to Spack when ``my_package`` is installed:

.. code-block:: toml

   [project.entry_points."spack.extensions"]
   my_package = "my_package:get_extension_path"

The function ``my_package.get_extension_path`` in ``my_package/__init__.py`` might look like:
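A minimal sketch, assuming the ``spack-scripting`` directory sits next to ``my_package`` under ``src`` as in the layout above (an illustration, not the exact code from the Spack docs):

.. code-block:: python

   import pathlib


   def get_extension_path():
       """Return the path of the bundled spack-scripting extension."""
       scripting_dir = pathlib.Path(__file__).parent.parent / "spack-scripting"
       if scripting_dir.exists():
           return str(scripting_dir)
       return None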
Currently the repository contains the following JSON files:
```console
.
├── COPYRIGHT
└── cpu/
    ├── cpuid.json                      # Contains information on CPUID calls to retrieve vendor and features on x86_64
    ├── cpuid_schema.json               # Schema for the file above
    ├── microarchitectures.json         # Contains information on CPU microarchitectures
    └── microarchitectures_schema.json  # Schema for the file above
```