Allow the following formats:
```yaml
mirrors:
  name: <url>
```
```yaml
mirrors:
  name:
    url: s3://xyz
    access_pair: [x, y]
```
```yaml
mirrors:
  name:
    fetch: http://xyz
    push:
      url: s3://xyz
      access_pair: [x, y]
```
Also reserve two new properties to indicate the mirror type (e.g.
mirror.spack.io is a source mirror, not a binary cache):
```yaml
mirrors:
  spack-public:
    source: true
    binary: false
    url: https://mirror.spack.io
```
A few packages have version directives evaluated
within if statements, conditional on the value of
`platform.platform()`.
Sometimes there is no case for e.g. platform=darwin, and that
causes a lot of spurious failures with version existence
audits.
This PR allows expressing conditions to skip version
existence checks in audits, avoiding these spurious reports.
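A hypothetical illustration of the failure mode (the package name, URL, and checksum are made up, and this shows the problem rather than the new skip mechanism): when every `version` directive sits inside a platform check, the package ends up with no versions on the other platforms.
```python
import platform

from spack.package import *


class Example(Package):
    """Hypothetical package whose versions are only defined on some platforms."""

    homepage = "https://example.com"
    url = "https://example.com/example-1.0.tar.gz"

    # No branch for darwin: on macOS this package has no versions at all,
    # so the version existence audit reports a spurious failure for it.
    if platform.platform().startswith("Linux"):
        version("1.0", sha256="0" * 64)
```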
### Rationale
While working on #29549, I noticed a lot of inconsistencies in our argparse help messages. This is important for fish where these help messages end up as descriptions in the tab completion menu. See https://github.com/spack/spack/pull/29549#issuecomment-1627596477 for some examples of longer or more stylized help messages.
### Implementation
This PR makes the following changes:
- [x] help messages start with a lowercase letter
- [x] help messages do not end with a period
- [x] the first line of a help message is short and simple; longer text is separated by an empty line
- [x] "help messages do not use triple quotes"
"""(except docstrings)"""
- [x] Parentheses are not needed for string concatenation inside a function call
- [x] Remove "..." "..." string concatenation leftover from black reformatting
- [x] Remove Sphinx argument docs from help messages
The first 2 choices aren't very controversial, and are designed to match the syntax of the `--help` flag automatically added by argparse. The 3rd choice is more up for debate, and is designed to match our package/module docstrings. The 4th choice is designed to avoid excessive newline characters and indentation. We may actually want to go even further and disallow docstrings altogether.
### Alternatives
Choice 3 in particular has a lot of alternatives. My goal is solely to ensure that fish tab completion looks reasonable. Alternatives include:
1. Get rid of long help messages, only allow short simple messages
2. Move longer help messages to epilog
3. Separate by 2 newline characters instead of 1
4. Separate by period instead of newline. First sentence goes into tab completion description
The number of commands with long help text is actually rather small, and is mostly relegated to `spack ci` and `spack buildcache`. So 1 isn't actually as ridiculous as it sounds.
Let me know if there are any other standardizations or alternatives you would like to suggest.
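As a rough illustration of these conventions (the command option below is made up, not one from this PR), a compliant help string would look like:
```python
import argparse


def setup_parser(subparser):
    # lowercase start, no trailing period, short first line; the longer
    # explanation is separated from it by a blank line
    subparser.add_argument(
        "--frobnicate",
        action="store_true",
        help="frobnicate widgets before installing\n\n"
        "only the short first line reads well in fish tab-completion menus",
    )


parser = argparse.ArgumentParser()
setup_parser(parser)
parser.parse_args(["--frobnicate"])
```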
* ci: run spack list in power ci
Let's see if Spack itself is the bottleneck in CI...
* rebuild curl in CI
* more of the same please!
* drop the profiler
* undo rebuildme test in ci variant
* add comment for posterity
* enable profiling
* trigger CI
* See how it goes now that perf regressions are fixed on develop
* try shorter poll intervals
* Revert "try shorter poll intervals"
This reverts commit d60c34ad3eceead0c13a5277cf8e783fd42b7458.
* Remove spec.format call in Database._get_matching_spec_key
* once more in ci please
* undo irrelevant changes
* run spack list in before script
* test in ci
* -:
* Undo CI testing
The spdlog project states precisely which fmt version it depends on for
compatibility. The latest version, 1.11.0, depends explicitly on
fmt 9.1.0. Without a fixed version, the micromamba build fails when running
`spack install micromamba` on e.g. Rockylinux 8.5.
The 'bison' executable requires libtextstyle to run. I think this was
usually satisfied because gettext is often installed with the OS, or
brought in accidentally via perl/m4.
It looks like the libtextstyle library dependency started in Bison 3.4.
Refactor `TermTitle` into `InstallStatus` and use it to show progress
information both in the terminal title as well as inline. This also
turns on the terminal title status by default.
The inline output will look like the following after this change:
```
==> Installing m4-1.4.19-w2fxrpuz64zdq63woprqfxxzc3tzu7p3 [4/4]
```
* llvm: fix build with libcxx=none
* ispc: checksum 1.20.0
* ispc: ensure that it does not crash immediately
this would happen if linked to the wrong libc++
* ispc: fix build on macos
find ncurses instead of curses and link against tinfo in order to avoid
unresolved references to _del_curterm, _set_curterm, _setupterm, and
_tigetnum
* ispc: enable arm targets, if building on arm
* ispc: remove double cmake argument
I forgot to remove the constant -DARM_ENABLED=FALSE when adding
-DARM_ENABLED with a value depending on target architecture
* ispc: fix linux build
since 1.20, linux build uses TBB as default tasking system and thus
needs to depend on it
* ispc: try to fix link error on linux
link against both curses (as before) and tinfo (added because of macos)
* ispc: update for recent llvm changes
libcxx=none instead of ~libcxx
`mypy` will check *all* imported packages, even optional dependencies outside your
project, and this can cause issues if you are targeting python versions *older* than the
one you're running. `mypy` will report issues in the latest versions of dependencies
as errors, even if installing on some older python would have installed an older version
of the dependency.
We saw this problem before with `numpy` in #34732. We've started seeing it with IPython
in #38704. This fixes the issue by exempting `IPython` and a number of other of Spack's
imports from `mypy` checking.
* Setting library path as lib similar to other rocm packages.
* Fix style check failure
* Restricting changes to 5.4.3 and above
* Including comgr change
* initial commit for adding hip-examples package
* adding test to hip-examples
* fixed compile error on add4
* change standalone test to use new syntax
Spack-installed Perl always has opcode support, but external Perl
installations might not. This commit adds a +opcode variant and
updates the external detection logic to check for opcode support.
The postgresql package is updated to require perl+opcode (in
combination with the above, this helps detect when an external
Perl instance is sufficient for a Spack build of postgresql, or
if Spack needs to build its own Perl).
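A rough sketch of how the external detection could probe for opcode support (this is an assumption about the shape of the change, not the exact code; `determine_variants` is Spack's hook for computing variants of detected externals):
```python
from spack.package import *
from spack.util.executable import Executable


class Perl(Package):
    # illustrative subset of the recipe
    variant("opcode", default=True, description="enable the Opcode module")

    @classmethod
    def determine_variants(cls, exes, version_str):
        perl = Executable(exes[0])
        # run a trivial program that imports Opcode; a non-zero exit code
        # means the external interpreter lacks the module
        perl("-e", "use Opcode;", output=str, error=str, fail_on_error=False)
        return "+opcode" if perl.returncode == 0 else "~opcode"
```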
* Update cp2k recipe to use cmake or the current build system
Offers the possibility to build cp2k with the new CMake build system. Commands like this are now supported:
spack install cp2k@master build_system=cmake +.....
The recipe supports the following optional functionalities:
- superlu, cosma, sirius, spglib, metis, libxc, libint, cuda/rocm, mkl/openblas/sci (and others), mpi, openmp, dbcsr
- dbcsr is built separately using the currently available recipe.
Two PRs need to be merged to be fully functional (cosma update in spack + one PR in cp2k github).
* Fix indentation
* Fix indentation
* Update libvori
* More typos
* Simplify BLAS/LAPACK
* Simplify BLAS/LAPACK
* Add A100 gpu value
* Fix typo
* Add the enable_regtests option
If -DCP2K_ENABLE_REGTESTS=ON is given (+enable_regtests with Spack), the binary executables are placed in the cp2k root directory under exe/build-cmake-*. This option is needed to run the regtests afterwards.
* Minor update
* more fixes
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
* small changes
* Remove any reference to nvidia architecture in the rocm list
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/cp2k/package.py
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
* Final reformating
* Update py-fypp
---------
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
People frequently ask us how to pipe `spack find` output to other commands, and we tell
them to do things like this:
```console
$ spack find --format "/{hash}" | spack uninstall -ay
```
Sometimes users don't know about hash references and come up with potentially ambiguous
formulations like this:
```console
spack find --format {name}@{version}%{compiler} | spack uninstall -ay
```
Since this is a common enough thing to want to do, and to make it more obvious how, this
PR adds a `-H` / `--hashes` shortcut, so you can now just do:
```console
spack find -H | spack uninstall -ay
```
* Added packages bitstruct, callmonitor, and PYnvtx
* Revert "Added packages bitstruct, callmonitor, and PYnvtx"
This reverts commit 76d25aa76b.
* py-bitstruct: This module is intended to have a similar interface as the python struct module, but working on bits instead of primitive data types (char, int, …)
* Update package.py
To pass the style prechecks
* PyNVTX: new package
* Delete package.py
Accidentally added this package.
* Update var/spack/repos/builtin/packages/py-bitstruct/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* openssl: prefer 3.x
This PR is not intended to be merged immediately, but it would be good
to see what packages fail to build in CI so that we can get proper
version constraints on openssl (before all packages update and support
both openssl 1 and 3)
* Disable assembly for 3.x %oneapi
* cmake: depend on spack curl, to deal with curl - openssl compat
* also make zlib external
* remove overly strict & unsafe requirement on py-cryptography patch version number
* update openssl compat bounds in py-cryptography
* smaller diff
* Make libssh2 an autotools/cmake package
* fix weird upper bound in libssh2, as there is no openssl v2
* libssh2: pc file lists plain -lssl -lcrypto w/o leading -L flag, confusing libgit2 parsing of pkg-config output
* Actually fix the issue in libssh2: its pc file looks broken
`"%s" % spec` formats the spec with deps included, which produces sometimes KBs
of data and is slow to run in pure Python. It can delay otherwise very short-lived
read/write locks on the database.
Discovered in #38762 where profile output showed about 2 seconds is spent in
`spec.format`, which is significant overhead when using multiprocessing to install
from binary cache in parallel (installation often takes <5s for small packages). With
this change, `spec.format` no longer shows up in profile output.
(This line hasn't changed since Spack v0.9 ;p)
* move format() call to custom NoSuchSpecError exception
* add a comment saying why, so we can eventually change `Spec.__str__`
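A sketch of the idea (names simplified, not the exact Spack code): keep the spec object on the exception and format it only when the message is rendered, so no formatting cost is paid on the hot database-lookup path.
```python
class NoSuchSpecError(KeyError):
    """Raised when a spec is not found in the database."""

    def __init__(self, spec):
        super().__init__(spec)
        self.spec = spec

    def __str__(self):
        # the (potentially expensive) formatting happens only if the
        # error message is actually printed
        return f"no such spec in database: {self.spec}"
```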
* qt-base: new version 6.5.0
* qt-declarative: new version 6.5.0
* qt-quick3d: new version 6.5.0
* qt-quicktimeline: new version 6.5.0
* qt-shadertools: new version 6.5.0
* qt-*: new version 6.5.1
* qt-base: new version 6.5.1
* py-pyarrow: enable parquet variant by default
* Disable parquet variant by default
* Add conflict to enable parquet when dataset is active
* Disable dataset variant by default
* initial commit of nanobind package
* style fixes
* Update package.py
Typo
* addressed PR comments
* add v1.4.0
* Update var/spack/repos/builtin/packages/py-nanobind/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Matthew Archer <ma595@cam.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-astropy: fix import tests and restrict py-pip version
* Fix --install-option name in comments
* Rename variant and fix variant dependencies
* Remove parquet variant from py-pyarrow
1. Fix O(n^2) iteration in `_get_overwrite_specs`
2. Early exit `get_by_hash` on full hash
3. Fix O(n^2) double lookup in `all_matching_specs` with hashes
4. Fix some legibility issues
* mlpack: new package
mlpack is an intuitive, fast, and flexible header-only C++ machine learning library with bindings to other languages. It is meant to be a machine learning analog to LAPACK, and aims to implement a wide array of machine learning methods and functions as a "swiss army knife" for machine learning researchers.
* mlpack: upstream merged patch to allow python installation in spack
* Added v5.0.0 of PyAMG. This required v7.1.0 of setuptools_scm due to a bug in 7.0.5.
* Added comment about version requirement.
* Loosened dependency based on build experiments.
* Updated tomli deps.
* Update var/spack/repos/builtin/packages/py-setuptools-scm/package.py
Dependence for 7.0 only.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pyamg/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Swapped lines.
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pip: add 23.1.2
* Restrict py-pip version for py-protobuf
* Restrict py-pip version for straightforward packages
* Restrict py-pip version for nrm
* Fix --install-option name in comments
* Simplify py-pip restriction for py-scs
* nrm: fix wrong comment
* py-spglib: add 2.0.2
* Update var/spack/repos/builtin/packages/py-spglib/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Remove py-setuptools as run dependency
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add maintainers
* Updated cosma archive checksum and costa version
- updated cosma version (in the cosma build system)
- updated costa version
- use the default generic url for downloading packages
- do not build tiled-mm when the cpu only version is needed
Signed-off-by: Dr. Mathieu Taillefumier <mathieu.taillefumier@free.fr>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
NAMD users expect the Tcl scripting interface to be enabled as it is used in many examples and tutorials in addition to being required for features such as multi-copy algorithms.
* When installing a package Spack will attempt to set group permissions on
the install prefix even when the configuration does not specify a group.
Co-authored-by: David Gomez <dvdgomez@users.noreply.github.com>
From the configure.ac file:
> H5VL_log is built on top of MPI. Configure option --without-mpi or
> --with-mpi=no should not be used. Abort.
This currently fails to build in the oneAPI pipeline on `develop`
* py-userpath: new package
* pipx: new package
* Update var/spack/repos/builtin/packages/pipx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* pipx: Remove incorrect dependency on py-platformdirs
* Update var/spack/repos/builtin/packages/pipx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-userpath: Remove version requirements to match upstream
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This PR removes deprecated versions for all packages that I'm maintaining. In future Spack releases, I'm planning to do this on a much larger scale, but we can hold off until we have better reproducibility.
I'm hoping that this will improve the maintainability of these packages. If any other maintainers of these recipes would like to retain any of these deprecated versions, or add new versions, speak now or forever hold your peace 😄
---------
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
Windows runtime library loading searches PATH, and therefore bin/ is
the appropriate place to put .dll files. Prior to this change, XZ was
installing both .dll and .lib files to the lib/ directory.
Move the logic checking which mirrors have the specs we need closer
to where that information is needed. Also update the staging summary
to contain a brief description of why we scheduled or pruned each
job. If a spec was found on any mirrors, regardless of whether
we scheduled a job for it, print those mirrors.
* Change maintainer, add new version and deprecate old one
* Fix style issue
* Revert deprecation
---------
Co-authored-by: Marc Schouler <marc.schouler@inria.fr>
PowerShell requires explicit shell and env support in Spack.
This is due to the distinct differences in shell interactions between
cmd and pwsh. Add a doskey in pwsh piping 'spack' commands to a
powershell script similar to the sh function 'spack'. Add
support for PowerShell-specific shell interactions from Spack
(set/unset shell variables).
* py-ruamel-yaml: add 0.17.32 and py-ruamel-yaml-clib: add 0.2.7
* Update var/spack/repos/builtin/packages/py-ruamel-yaml/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix style
* Fix python dependency
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* HDF5: is_enabled helper (ON)
Slightly generalize the `is_enabled` helper in the HDF5 package.
`ON` is the most common CMake boolean value passed, besides many
other possible `true` spellings, and should be included in the
values the helper checks against.
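A rough sketch of what the generalized helper could look like (assumed shape, not the exact package code):
```python
def is_enabled(text):
    """Return True if a CMake-style boolean value means 'enabled'."""
    return str(text).upper() in ("ON", "TRUE", "YES", "Y", "1")
```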
* Simplify
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Support hardlinks/junctions on Windows systems without developer
mode enabled
* Generally, use of llnl.util.symlink.symlink is preferred over
os.symlink since it handles this automatically
* Generally an error is now reported if a user attempts to create a
symlink to a file that does not exist (this was previously allowed
on Linux/Mac).
* One exception to this: when Spack installs files from the source
into their final prefix, dangling symlinks are allowed (on
Linux/Mac - Windows does not allow this in any circumstance).
The intent behind this is to avoid generating failures for
installations on Linux/Mac that were succeeding before.
* Because Windows is strict about forbidding dangling symlinks,
`traverse_tree` has been updated to skip creating symlinks if they
would point to a file that is ignored. This check is not
transitive (i.e., a symlink to a symlink to an ignored file would
not be caught appropriately)
* Relocate function: resolve_link_target_relative_to_the_link
(this is not otherwise modified)
Co-authored-by: jamessmillie <smillie@txcorp.com>
Update the list of excluded variables in the `from_sourcing_file` function to
cover all variables specific to Environment Modules or Lmod. Specifically, add
the variables related to the definition of the `module()`, `ml()`
and `_module_raw()` Bash functions.
Fixes #13504
* Add new versions of Qthreads
* Add version URLs explicitly as it has recently changed
* Use function to extrapolate version URL for older versions
* Fix url formatter
llvm @13-15 is required for ispc, but fails to build with GCC 13.
14.0.6 and 15.0.7 built successfully with an upstream patch; 13.0.1
still fails. Thus the upstream patch is applied to 14 and 15 only.
Update the `env.set` command and the underlying `SetEnv` object to add a `raw`
boolean attribute. `raw` is optional and set to False by default. When
set to True, value formatting is skipped for that object when generating
environment modifications.
With this change it is now possible to define an environment variable
whose value contains variable reference syntax (like `{foo}` or `{}`)
that should be set as-is.
Fixes #29578
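A minimal sketch of the new attribute in use, assuming the Python-side entry point is `EnvironmentModifications.set` (the variable name and value are illustrative):
```python
from spack.util.environment import EnvironmentModifications

env = EnvironmentModifications()
# with raw=True the braces are kept verbatim instead of being treated
# as format placeholders when the modifications are generated
env.set("MY_TEMPLATE", "{name}-{version}", raw=True)
```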
By default, `find_package(Python)` searches from highest version to lowest version, identifying the highest version that satisfies the requirements. This means that `/usr/bin/python3.11` will be found before `$(spack location -i python)/bin/python3.10`, even when other packages have been built with the `python` in spack.
This ensures that the `python` dependency is explicitly the `python` version that is used.
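A hedged sketch of the usual fix in a dependent package's `cmake_args` (the exact CMake variable can vary per project; `Python_EXECUTABLE` follows the FindPython convention, and the package here is hypothetical):
```python
from spack.package import *


class SomeCMakePackage(CMakePackage):
    """Hypothetical package whose CMake should use Spack's python."""

    depends_on("python", type=("build", "run"))

    def cmake_args(self):
        # pin the interpreter so find_package(Python) does not pick the
        # highest-version python found on the system (e.g. /usr/bin/python3.11)
        return [self.define("Python_EXECUTABLE", self.spec["python"].command.path)]
```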
* libEnsemble: add v0.10.0
* Make new deps required
* Fixes to deps
* Update var/spack/repos/builtin/packages/py-libensemble/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix build, run
* Reorder required deps
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pytest: add 7.3.2
* [@spackbot] updating style on behalf of manuelakuhn
* Swap py-importlib-metadata dependency order
* Restrict python version for older versions
* Add recipe for iterative-stats
* Fix branch name and remove comment
* Add git link
* Add package maintainer
* Enforce multiple requested changes
* Update var/spack/repos/builtin/packages/py-iterative-stats/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update checksum
* Fix openturns dependency specification
* Add python variant spec to openturns
---------
Co-authored-by: Marc Schouler <marc.schouler@inria.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add rust v1.70.0 and simplify package logic by moving bootstrap to dedicated package
* Fix formatting of rust-bootstrap package file
* Re-enable Rust as extendable
* Add nightly version to rust and rust-bootstrap
* Manually inject openssl certs into environment
* Add master and beta versions to rust
* Add additional documentation for using rust development releases
* Remove @AndrewGaspar as maintainer
* py-notebook: add 6.5.4
* [@spackbot] updating style on behalf of manuelakuhn
* Update var/spack/repos/builtin/packages/py-notebook/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix version of py-nbclassic dependency
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
It installs the LFortran runtime library, and
LFortran can compile codes to binaries. The interactive mode does not
work yet with LLVM > 11; that has to be fixed upstream.
Co-authored-by: Wileam Y. Phan <50928756+wyphan@users.noreply.github.com>
No changes to the build system, no changes to `package.py` needed.
Changelog: https://github.com/qt/qtbase/compare/v5.15.9-lts-lgpl...v5.15.10-lts-lgpl
Main change taking up space:
- bundled 3rdparty/pcre2 updated from 10.39 to 10.40 (spack now includes 10.42, and we don't put specific version requirements in `package.py`)
* py-networkx: add 3.1
* Update var/spack/repos/builtin/packages/py-networkx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add default variant
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pycairo: add 1.24.0
* Change python dependency to 3.8
* Remove upper bound for python dependency
* Update var/spack/repos/builtin/packages/py-pycairo/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add darshan 3.4.3 releases
* darshan-runtime 3.4.3
* darshan-util 3.4.3
* py-darshan 3.4.3.0
- add py-humanize as new dependency
* py-darshan has strict darshan-util version reqs
darshan-util version required is based on the first 3 parts of
the py-darshan version string
* remove support for python3.6
* py-humanize dependency for 3.4.3+ versions
* only enforce scipy dependency for 3.4.0.1
* drop optional lxml dependency
* drop matplotlib pinning
* importlib-resources not a dep in python-3.7+
* drop unnecessary numpy pin
* add build dep for pytest-runner
* fix typo in pytest-runner package name
* pip setuptools to match pydarshan setup.py
* spack style fix
* py-gsutil: add 5.24, fix and add dependencies
* Update var/spack/repos/builtin/packages/py-httplib2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add httplib2@0.20.4 and pin it in py-gsutil
* Add py-cryptography conflict
* Update var/spack/repos/builtin/packages/py-httplib2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pyopenssl: fix py-cryptography conflict
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* update mda and mdatests
* black
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* polish
* Update var/spack/repos/builtin/packages/py-mdanalysistests/package.py
* fixes
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
`FFLAGS` and `FCFLAGS` are being ignored by the WRF build system, not only in version
`3.9.1.1` but also in `4.x`.
Also, I see no reason to explicitly add `-w` and `-O2` to compile lines when
using `gcc@10:`. Tested for versions `3.9.1.1`, `4.2.2`, and `4.5.0`.
Tagging the original authors of this part, @MichaelLaufer and @giordano, in case they
want to chime in.
* ncbi-rmblastn: patching to support building with %gcc@13:
* ncbi-rmblastn: patching to build with %gcc@13:
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
* py-pre-commit: add 3.3.3
* Update var/spack/repos/builtin/packages/py-pre-commit/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Spack flags supplied by users should supersede flags from package build systems and
other places in Spack. However, Spack currently adds user-supplied flags to the
beginning of the compile line, which means that in some cases build system flags will
supersede user-supplied ones.
The right place to add a flag to ensure it has highest precedence for the compiler really
depends on the type of flag. For example, search paths like `-L` and `-I` are examined
in order, so adding them first is highest precedence. Compilers take the *last* occurrence
of optimization flags like `-O2`, so those should be placed *after* other such flags. Shim
libraries with `-l` should go *before* other libraries on the command line, so we want
user-supplied libs to go first, etc.
`lib/spack/env/cc` already knows how to split arguments into categories like `libs_list`,
`rpath_dirs_list`, etc., so we can leverage that functionality to merge user flags into
the arg list correctly.
The general rules for injected flags are (a toy sketch after the checklist below illustrates them):
1. All `-L`, `-I`, `-isystem`, `-l`, and `*-rpath` flags from `spack_flags_*` appear
   before their regular counterparts.
2. All other flags are ordered after their regular counterparts,
   i.e. `other_flags` comes before `spack_flags_other_flags`.
- [x] Generalize argument categorization into its own function in the `cc` shell script
- [x] Apply the same splitting logic to injected flags and flags from the original compile line.
- [x] Use the resulting flag lists to merge user- and build-system-supplied flags by category.
- [x] Add tests.
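A toy illustration of the two rules in Python (the real implementation lives in the `lib/spack/env/cc` shell script; the flag values here are made up):
```python
SEARCH_AND_LIB_PREFIXES = ("-I", "-isystem", "-L", "-l", "-Wl,-rpath")


def split(flags):
    search = [f for f in flags if f.startswith(SEARCH_AND_LIB_PREFIXES)]
    other = [f for f in flags if not f.startswith(SEARCH_AND_LIB_PREFIXES)]
    return search, other


def merge(spack_flags, original_flags):
    spack_search, spack_other = split(spack_flags)
    orig_search, orig_other = split(original_flags)
    # rule 1: Spack's search/lib flags come first; rule 2: Spack's other
    # flags (e.g. -O levels) come after the build system's
    return spack_search + orig_search + orig_other + spack_other


print(merge(["-L/custom/lib", "-O3"], ["-L/pkg/lib", "-lfoo", "-O2"]))
# ['-L/custom/lib', '-L/pkg/lib', '-lfoo', '-O2', '-O3']
```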
Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>
Co-authored-by: iermolae <igor.ermolaev@intel.com>
* py-vermin: add latest version 1.5.2
* Removed obsolete dep; setuptools is only needed at build time
- setuptools is not used at runtime
- py27 isn't strictly necessary
The `unparser` that Spack uses for package hashing had several tweaks to ensure compatibility
with Python 2.7:
1. Currently, the unparser automatically moves `*` and `**` args to the end to preserve
compatibility with `python@:3.4`
2. `print a, b, c` statements and single-tuple `print((a, b, c))` function calls were
remapped to `print(a, b, c)` in the unparsed output for consistency across versions.
(1) is causing issues in our tests because a recent patch to the Python source code
(https://github.com/python/cpython/pull/102953/files#diff-7972dffec6674d5f09410c71766ac6caacb95b9bccbf032061806ae304519c9bR813-R823)
has a `**` arg before a named argument, and we round-trip the core python source code
as a test of our unparser. This isn't actually a break in our consistent unparsing -- it's still
consistent, the python source just doesn't unparse to the same thing anymore. It does make
it harder to test, so it's not worth maintaining the Python2-specific stuff anymore.
Since we only support `python@3.6:`, this PR removes (1) and (2) from the unparser, but keeps
one last tweak for unicode AST inconsistencies, as it's still needed for Python 3.5-3.7.
This fixes the CI error we've been seeing on `python@3.11.4` and `python@3.10.12`. Again, that
bug exists only in the test system and doesn't affect our canonical hashing of Python code.
* WarpX 23.06
Update WarpX and related Python packages to the latest releases.
WarpX 23.06 introduces multi-dimension support in a single package,
which will ease deployment in E4S et al., which can now ship a single,
full-featured module/package that is no longer incompatible with itself.
* e4s ci stacks: multiple specs for each dim variant no longer required
* [@spackbot] updating style on behalf of ax3l
* WarpX: Update CMake CLI and Test/Check
* Add Missing `build-directory`
* [@spackbot] updating style on behalf of ax3l
* Remove `build_directory` again
---------
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: ax3l <ax3l@users.noreply.github.com>
* star: add 2.7.10
* star: fix building for non-avx2 arch processors
* convert to MakefilePackage, second take at fixing for aarch64
* style
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
* DependencySpec: add virtuals attribute on edges
This works for both the new and the old concretizer. Also,
added type hints to involved functions.
* Improve virtual reconstruction from old format
* Reconstruct virtuals when reading from Cray manifest
* Reconstruct virtual information on test dependencies
* openradioss-starter,engine: new package
* openradioss-engine: change version name develop to main
* openradioss-starter: change version name develop to main
Update Tcl modulefile template to use the `depends-on` command to
autoload modules if Lmod is the current module tool.
Autoloading modules with the `module load` command in a Tcl modulefile does
not work well for Lmod to some extent. An attempt to unload then load the
designated module is performed each time such a command is encountered. It
may lead to a load storm that may not end correctly with a large number of
module dependencies.
The `depends-on` command should be used for Lmod instead of `module load`,
as it checks whether the module is already loaded and does not attempt to
reload it.
The Lua modulefile template already uses the `depends_on` command to autoload
dependencies. Thus it is already assumed that, to use Lmod with Spack, it
must support the `depends_on` command (version 7.6+).
Environment Modules copes well with the `module load` command to autoload
dependencies (version 3.2+). The `depends-on` command is supported starting
with version 5.1 (as an alias of the `prereq-all` command), which was
released last year.
This change introduces a test to determine whether the module tool that
evaluates the modulefile is Lmod. If so, autoloaded dependencies are defined
with the `depends-on` command. Otherwise the `module load` command is used.
The test is based on the `LMOD_VERSION_MAJOR` environment variable, which is
set by Lmod starting with version 5.1.
Fixes #36764
When interpreting local paths as relative URL endpoints, they were
formatted as Windows paths on Windows (i.e. with '\'). URLs should
always be POSIX-style.
Update modulefile templates to append a trailing delimiter to the MANPATH
environment variable, if the modulefile sets it.
With a trailing delimiter at the end of MANPATH's value, man will search
the system man pages after searching the specific paths set.
Using append-path/append_path to add this element, the module tool
ensures it is appended only once. When the modulefile is unloaded, the
number of append attempts is decreased, so the trailing delimiter is
removed only when this number reaches 0.
Disclaimer: no path element should be appended to MANPATH by generated
modulefiles. It should always be prepended, to ensure this variable's
value ends with the trailing delimiter.
Fixes #11355.
* opencascade: new variants
OpenCascade has several major modules and not every
application needs all of them. This adds variants for
the various modules.
It also updates the 3rdparty dependency treatment.
* [@spackbot] updating style on behalf of wdconinc
* Update var/spack/repos/builtin/packages/opencascade/package.py
* opencascade: remove variant foundation_classes (always true)
* [@spackbot] updating style on behalf of wdconinc
* Update var/spack/repos/builtin/packages/opencascade/package.py
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* Deprecate R packages for spatial analysis
* [@spackbot] updating style on behalf of adamjstewart
---------
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
No changes to build system required. Changelog: https://github.com/xiaoyeli/superlu/compare/v5.3.0...v6.0.0
Since this new version adds "64-bit indexing support", and since at least one dependent package (`armadillo`) requires "32-bit integers" (faa6cbf895), the previous version remains preferred.
* [@spackbot] updating style on behalf of wdconinc
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* New version for openCARP packages
* Update carputils dependencies
* Update types of openCARP dependencies
* Add type "run" to setuptools dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add package py-common as carputils dependency
* Add setuptools dependency for py-common
* Remove spaces on blank line
* Restrict type of dependency setuptools to "build"
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: openCARP consortium <info@opencarp.org>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [gromacs] Fix intel (classic) libstdc++ path
Gromacs's `cmake` run will look for `--gcc-toolchain` (e.g. LLVM, icpx) or
`--gcc-name` (e.g. icpc) in `CMAKE_CXX_FLAGS`. Only if it does not find a good
g++ candidate there it will look for `GMX_GPLUSPLUS_PATH`:
cb6b311c39/cmake/FindLibStdCpp.cmake (L97)
Spack-installed Intel compilers already define a g++ for the standard libraries,
but in `icp{c,x}.cfg` instead of on the compile line. If we use the pre-defined g++ we not
only have less chance of mixing g++ versions, but also don't need to explicitly
add `gcc` as a dependency of `gromacs`.
* Fix format
* Use a variant
As there is no way to check if a file exists at depends_on stage
* Fix format
* New name and fail if variant is used with other compiler
* Line too long.
The pcluster image has an internal buildcache without an index.
Also, we need to force reuse to avoid rebuilding GCC, since the default is
to only reuse dependencies - and that is subject to changes in the GCC
recipe.
* bedtools2: patching to build with gcc@13
* bedtools2: patching to build with gcc@13
* Update var/spack/repos/builtin/packages/bedtools2/package.py
Yep, sure. Makes sense.
Co-authored-by: Alec Scott <alec@bcs.sh>
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Alec Scott <alec@bcs.sh>
* ensmallen: new package
ensmallen is a high-quality C++ library for non-linear numerical optimization.
* r-rcppensmallen: new package
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* Improve lib/spack/spack/test/cmd/compiler.py
* Use "tmp_path" in the "mock_executable" fixture
* Return a pathlib.Path from mock_executable
* Fix mock_executable fixture on Windows
"mock_gcc" was very similar to mock_executable, so use the latter to reduce code duplication
* Remove wrong compiler cache, fix compiler removal
fixes #37996
_CACHE_CONFIG_FILES was both unneeded and wrong, if called
subsequently with different scopes.
Here we remove that cache, and we fix an issue with compiler
removal triggered by having the same compiler spec in multiple
scopes.
* e4s oneapi ci: use official intel oneapi-derived runner image
* update oneapi image
* tau builds ok, but only with libdrm - comment out for now, follow up with pr later
* Guard for define in netcdf 4.9.0 and later.
This code is already available in ParaView 5.11.0 so no patching
needed there.
* Add latest needed version (even if not in spack).
---------
Co-authored-by: Dan Lipsa <dan.lipsa@khq.kitware.com>
* llvm: add new versions and set default for libomptarget according to os
modified: var/spack/repos/builtin/packages/llvm/package.py
* Incorporate reviewer suggestions
Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
---------
Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
* e4s cray ci stack
* e4s ci: add cray
* add zen4 tag
* WIP: new definitions just for cray
* updates
* remove ci signing job overrride, not necessary
* echo $PATH and show modules loaded
* add mirror
* add external def for cray-libsci
* comment out quantum-espresso
* use /etc/protected-runner as key path
* cray ci stack: do not remove tags: [spack, public]
* make cray stack composable
* generate job should run on public tagged runner, override default config:install_tree:root
* CI: Use relative path in default script
* CI: Use relative includes paths for shell runners
* Use concrete_env_dir for relpath
* ml-darwin-aarch64-mps: jax has bazel codesign issue
---------
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
#37592 updated cached cmake packages to set CMAKE_CUDA_ARCHITECTURES.
The condition `if archs != "none"` led to `CMAKE_CUDA_ARCHITECTURES=none`
when cuda_arch=none (an incorrect check on the value of a multi-valued
variant), i.e. CMAKE_CUDA_ARCHITECTURES was always set. This PR updates
the condition to `if archs[0] != "none"` to ensure CMAKE_CUDA_ARCHITECTURES
is only set if cuda_arch is not none (which seems to be the pattern used
in other packages).
This does the same for HIP (although in general ROCmPackage disallows
amdgpu_target=none when +rocm).
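A sketch of the corrected check (the method and helper follow CachedCMakePackage conventions, but the package and surrounding code here are illustrative rather than the exact diff):
```python
from spack.package import *


class ExamplePackage(CachedCMakePackage, CudaPackage):
    """Hypothetical package showing the corrected cuda_arch check."""

    def initconfig_hardware_entries(self):
        entries = super().initconfig_hardware_entries()
        archs = self.spec.variants["cuda_arch"].value
        # cuda_arch is multi-valued, so its value is a tuple like ("none",);
        # compare the first element rather than the whole tuple to "none"
        if archs[0] != "none":
            arch_str = ";".join(archs)
            entries.append(cmake_cache_string("CMAKE_CUDA_ARCHITECTURES", arch_str))
        return entries
```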
* fastqc: add 0.12.1
* fastqc: add 0.12.1
* Update var/spack/repos/builtin/packages/fastqc/package.py
Yeah, had considered doing the same, I'd just opted to maintain the status quo. All good.
Co-authored-by: Alec Scott <alec@bcs.sh>
* Update package.py
Style fiddles to make the bot contented.
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Alec Scott <alec@bcs.sh>
* py-htseq: add 0.12.3, switching over to new GitHub repo
* py-htseq: add 0.12.3, switching over to new GitHub repo
Style fixes
* py-htseq: add 2.0.3, switch to PyPI
* py-htseq: add 2.0.3, switch to PyPI
* Update package.py
* Update var/spack/repos/builtin/packages/py-htseq/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Removing SWIG
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* llvm: replace +omp_as_runtime with omp=runtime
* llvm: fetch 'libomp-libflags-as-list.patch' from upstream repo
* llvm: replace 'llvm14-hwloc-ompd.patch' with the official fix from upstream repo
* llvm: fix-up for the black reformatting
* llvm: fetch 'constexpr_longdouble.patch' from upstream repo
* llvm: optionally build libcxx as a runtime
* llvm: fetch 'llvm5-sanitizer-ustat.patch' from upstream repo
* llvm: update 'sanitizer-ipc_perm_mode.patch'
* llvm: refactor compiler conflicts when libcxx=project
* llvm: fetch 'llvm_python_path.patch' from upstream repo
* llvm: update comments and condition for 'xray_buffer_queue-cstddef.patch'
* llvm: optionally build compiler-rt as a runtime
* llvm: fetch 'lldb_external_ncurses-10.patch' from upstream repo
* llvm: fetch 'llvm_py37.patch' from upstream repo
* llvm: rename variant 'internal_unwind' to 'libunwind'
* llvm: optionally build libunwind as a runtime
* llvm: extend the list of maintainers
* llvm: allow for explicit '~clang~flang~libomptarget~lldb~omp_debug~z3'
* llvm: fetch 'llvm5-lld-ELF-Symbols.patch' from FreeBSD port repo
* llvm: fetch most of 'missing-includes.patch' from upstream repo and reuse 'llvm-gcc11.patch'
* llvm: regroup patches for missing include directives and drop compiler constraints for them
* llvm: fetch 'llvm-gcc11.patch' from upstream repo
* llvm: fetch 'no_cyclades.patch' from upstream repo
* llvm: update comments and condition for 'no_cyclades9.patch'
* llvm: rename variant 'omp' to 'openmp'
* llvm: constrain and rename variant 'omp_tsan' to 'libomp_tsan'
* llvm: rename variant 'omp_debug' to 'libomptarget_debug'
* llvm: do not apply same patch twice
* llvm: constrain and document the '*-thread.patch' patches
* llvm: document the '~lld+libomptarget' conflict
* llvm: update comments for the 'D133513.diff' patch
* py-future: add 0.18.3
* Update var/spack/repos/builtin/packages/py-future/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Adding libpsm3 package
* Make changes suggested by flake8
* Make one more flake8-suggested change, blank line after 'import os'
* Change to standard header to pass flake8 tests
* Update doc string, remove unnecessary comments
* Reviewer-recommended changes
* Alphabetize variants
* Use helper functions
* Change quotes to pass spack style check
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
- Add pkgconfig dependency from 1.23.0 onward.
- Add conflict of old versions with new gcc due to missing includes.
- Deprecate uneven minor versions because they are not regarded as stable.
- Add maintainer
* [xsdk-examples] Initial commit for v0.4.0
* [xsdk-examples] v0.4.0 depends on xsdk@0.8.0
* add in missing xsdk dependencies
* [xsdk-examples] remove repeated 'depends_on' directive
* [xsdk-examples] simplify and extend a bit the package
[mfem] process more optional dependencies of HiOp
[strumpack, superlu-dist] add a workaround for an issue on Mac
* [mfem] fix the handling of the hiop dependency
* [@spackbot] updating style on behalf of v-dobrev
* [xsdk-examples] enable 'heffte' and 'tasmanian' if enabled in 'xsdk'
* [xsdk-examples] Add PUMI dependency
* [xsdk-examples] Add preCICE dependency
* [xsdk-examples] add +rocm
* heffte: add in a backport fix for building xsdk-examples with cuda
* [xsdk] Remove the explicit requirement for deal.II to be built +hdf5
* ENABLE_ROCM -> ENABLE_HIP
* [hiop] Workaround for CMake not finding Cray's BLAS (libsci)
[xsdk-examples] Set CUDA/HIP architectures; sync cuda/rocm variants with xsdk
* [@spackbot] updating style on behalf of v-dobrev
* [exago] Workaround for CMake not finding Cray's LAPACK/BLAS, libsci
[mfem] Tweaks for running tests under Flux and PBS
* [slate] Pass CUDA/HIP architectures to CMake
* [heffte] For newer CMake versions, set CMAKE_CUDA_ARCHITECTURES
* [hypre] Patch v2.26.0 to fix sequential compilation in 'src/seq_mv'
* [xsdk-examples] Some tweaks in dependencies and compilers used
* [xsdk] Make the 'trilinos' variant sticky
[xsdk-examples] Tweak dependencies
* [slate] Fix copy-paste error
* [xsdk-examples] Workaround for CMakePackage not having the legacy
property 'build_directory'
* [xsdk-examples] Replace the testing branch used temporarily for v0.4.0 with
the official release
---------
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
* Add CMake options for building with CUDA/HIP support to
CachedCMakePackages (intended to reduce duplication across packages
building with +hip/+cuda and using CachedCMakePackage)
* Define generic variables like CMAKE_PREFIX_PATH for
CachedCMakePackages (so that a user may invoke "cmake" themselves
without needing to set them on the command line).
* Make `lbann` a CachedCMakePackage.
Co-authored-by: Chris White <white238@llnl.gov>
fa7719a changed syntax for specifying exact versions, which are
required for some compiler specs (including those read as part
of parsing a Cray manifest). This fixes that and also makes a
couple other improvements to manifest parsing.
* Instantiate compiler specs with exact versions (fixes #37893)
* fix slingshot network detection (CPE 22.10+ has libcxi.so
in /usr/lib64)
* "spack external find": add arg to ignore default dir for cray
manifests
This will build flux-security separately to have a flux-imp
that can be defined in a flux broker.toml. Note that a user who
wants a multi-user setup is recommended to create a view,
and then a system broker.toml in the flux config directory that
points to it.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Sphinx is used to build Open MPI manpages, etc. as part of the make dist
process to create release tarballs. There should be no need/use to do
this within Spack. Also some sites have older Sphinx installs which
aren't compatible with the needs of the Open MPI documentation.
For example, attempts to install openmpi@main fail at NERSC owing to
such a situation.
Since Spack is normally used to build from release tarballs, in which
the docs have already been installed, this should present no issues.
This configuration option will be ignored for Open MPI releases older than 5.0.0.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
This option is needed for DFT FE - or more accurately the check needs to
be checked off for a number of platforms or else the code doesn't work.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Change default naming scheme for tcl modules for a more user-friendly
experience.
Change from flat projection to "per software name" projection.
The flat naming scheme restrains module selection capabilities. The
`{name}/{version}...` scheme makes it possible to use user-friendly
mechanisms:
* implicit defaults (`module load git`)
* extended default (`module load git/2`)
* advanced version specifiers (`module load git@2:`)
* py-cutadapt: add 4.4, 4.3, 4.2 versions
* Update var/spack/repos/builtin/packages/py-cutadapt/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-cutadapt/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
---------
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update PennyLane ecosystem for 0.30 release
* Update package dep versions
* Fix formatting
* Update dep versions
* Remove PL hard pin and rely on PLQ to define version
* Update var/spack/repos/builtin/packages/py-pennylane-lightning-kokkos/package.py
Co-authored-by: Vincent Michaud-Rioux <vincent.michaud-rioux@xanadu.ai>
* Convert pybind11 from build to link dep, and PL ver limit
---------
Co-authored-by: Vincent Michaud-Rioux <vincent.michaud-rioux@xanadu.ai>
Note the win-sdk package is not installable and reports an error
which instructs the user how to add it. Without this fix, a
(more confusing) error occurs before this message can be generated.
"spack build-env" was not generating proper environment variable
definitions on Windows; this commit updates the generated commands
to succeed with batch/PowerShell.
Add a nightly job to attempt building all Paraview dependencies and
upload the results to cdash. This check doesn't affect the reported
build/test status of Spack. We are using this to monitor the state of
Windows support while working on more-robust checks (eventually the
Windows build will have to succeed to merge PRs to Spack).
* Dyninst: add standalone test
* Add docstring with description
* Don't use join_path for builtin path objects
* Whitespace
* Update format of docstring
Some requirements for the @main version of environment-modules were missing:
* python (to build the ChangeLog documentation file)
* py-sphinx@1.0: (to build man pages, etc.)
Also adding gzip, which is now required to build ChangeLog.gz (which is
now shipped instead of ChangeLog).
Other versions do not require these tools (documentation is
pre-built in the dist tarball).
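A sketch of the corresponding directives (the exact `when` constraints in the real recipe may differ):
```python
from spack.package import *


class EnvironmentModules(Package):
    """Illustrative subset of the recipe showing the added requirements."""

    # documentation tooling only needed when building the @main version
    depends_on("python", type="build", when="@main")
    depends_on("py-sphinx@1.0:", type="build", when="@main")
    depends_on("gzip", type="build", when="@main")
```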
* [devito] Move to version 4.8.1
* Fix: Adding patch file
* Update var/spack/repos/builtin/packages/py-devito/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-devito/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Addressing @adamjstewart comments
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add macOS ML CI stacks
* torchmeta is no longer maintained and requires ancient PyTorch
* Add MXNet
* update darwin aarch64 stacks
* add darwin-aarch64 scoped config.yaml
* remove unnecessary cleanup job
* fix specifications
* fix labels
* fix labels
* fix indent on tags specification
* no tags for trigger jobs
* try overriding tags in stack spack.yaml
* do not use CI_STACK_CONFIG_SCOPES
* incorporate config:install_tree:root: overrides and compiler defs
* copy relevant ci-scoped config settings directly into stack spack.yaml
* remove build-job-remove
* spack ci generate: add debug flag
* include cdash config directly in stack spack.yaml
* customize build-job script section to avoid absolute paths
* add any-job specification
* tags: use aarch64-macos instead of aarch64
* generate tags: use aarch64-macos instead of aarch64
* do not add morepadding
* use shared mirror; comment out known failures
* remove any-job
* nproc || true
* comment out specs failing due to bazel from cache codesign issue
---------
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
* py-babel: add 2.12.1
* Update var/spack/repos/builtin/packages/py-babel/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
When attempting to build paraview@5.10.1 using a recent Intel
compiler (Classic or OneAPI) or the IBM XL compiler, the build
fails if the version of protobuf used is > 3.18
* [pcluster pipeline] Use local buildcache instead of upstream spack
Spack currently does not relocate compiler references from upstream spack
installations. When using a buildcache we don't need an upstream spack.
* gcc needs to be installed via postinstall to get correct deps
* quantum-espresso@gcc@12.3.0 returns ICE on neoverse_{n,v}1
* Force gitlab to pull the new container
* Revert "Force gitlab to pull the new container"
This reverts commit 3af5f4cd88.
Seems the gitlab version does not yet support "pull_policy" in .gitlab-ci.yml
* Gitlab keeps picking up wrong container. Renaming
* Update containers once more after failed build
* add version 1.48.0 to bioconductor package r-a4
* add version 1.48.0 to bioconductor package r-a4base
* add version 1.48.0 to bioconductor package r-a4classif
* add version 1.48.0 to bioconductor package r-a4core
* add version 1.48.0 to bioconductor package r-a4preproc
* add version 1.48.0 to bioconductor package r-a4reporting
* add version 1.54.0 to bioconductor package r-absseq
* add version 1.30.0 to bioconductor package r-acde
* add version 1.78.0 to bioconductor package r-acgh
* add version 2.56.0 to bioconductor package r-acme
* add version 1.70.0 to bioconductor package r-adsplit
* add version 1.72.0 to bioconductor package r-affxparser
* add version 1.78.0 to bioconductor package r-affy
* add version 1.76.0 to bioconductor package r-affycomp
* add version 1.58.0 to bioconductor package r-affycontam
* add version 1.72.0 to bioconductor package r-affycoretools
* add version 1.48.0 to bioconductor package r-affydata
* add version 1.52.0 to bioconductor package r-affyilm
* add version 1.70.0 to bioconductor package r-affyio
* add version 1.76.0 to bioconductor package r-affyplm
* add version 1.46.0 to bioconductor package r-affyrnadegradation
* add version 1.48.0 to bioconductor package r-agdex
* add version 3.32.0 to bioconductor package r-agilp
* add version 2.50.0 to bioconductor package r-agimicrorna
* add version 1.32.0 to bioconductor package r-aims
* add version 1.32.0 to bioconductor package r-aldex2
* add version 1.38.0 to bioconductor package r-allelicimbalance
* add version 1.26.0 to bioconductor package r-alpine
* add version 2.62.0 to bioconductor package r-altcdfenvs
* add version 2.24.0 to bioconductor package r-anaquin
* add version 1.28.0 to bioconductor package r-aneufinder
* add version 1.28.0 to bioconductor package r-aneufinderdata
* add version 1.72.0 to bioconductor package r-annaffy
* add version 1.78.0 to bioconductor package r-annotate
* add version 1.62.0 to bioconductor package r-annotationdbi
* add version 1.24.0 to bioconductor package r-annotationfilter
* add version 1.42.0 to bioconductor package r-annotationforge
* add version 3.8.0 to bioconductor package r-annotationhub
* add version 3.30.0 to bioconductor package r-aroma-light
* add version 1.32.0 to bioconductor package r-bamsignals
* add version 2.16.0 to bioconductor package r-beachmat
* add version 2.60.0 to bioconductor package r-biobase
* add version 2.8.0 to bioconductor package r-biocfilecache
* add version 0.46.0 to bioconductor package r-biocgeneric
* add version 1.10.0 to bioconductor package r-biocio
* add version 1.18.0 to bioconductor package r-biocneighbors
* add version 1.34.0 to bioconductor package r-biocparallel
* add version 1.16.0 to bioconductor package r-biocsingular
* add version 2.28.0 to bioconductor package r-biocstyle
* add version 3.17.1 to bioconductor package r-biocversion
* add version 2.56.0 to bioconductor package r-biomart
* add version 1.28.0 to bioconductor package r-biomformat
* add version 2.68.0 to bioconductor package r-biostrings
* add version 1.48.0 to bioconductor package r-biovizbase
* add version 1.10.0 to bioconductor package r-bluster
* add version 1.68.0 to bioconductor package r-bsgenome
* add version 1.36.0 to bioconductor package r-bsseq
* add version 1.42.0 to bioconductor package r-bumphunter
* add version 2.66.0 to bioconductor package r-category
* add version 2.30.0 to bioconductor package r-champ
* add version 2.32.0 to bioconductor package r-champdata
* add version 1.50.0 to bioconductor package r-chipseq
* add version 4.8.0 to bioconductor package r-clusterprofiler
* add version 1.36.0 to bioconductor package r-cner
* add version 1.32.0 to bioconductor package r-codex
* add version 2.16.0 to bioconductor package r-complexheatmap
* add version 1.74.0 to bioconductor package r-ctc
* add version 2.28.0 to bioconductor package r-decipher
* add version 0.26.0 to bioconductor package r-delayedarray
* add version 1.22.0 to bioconductor package r-delayedmatrixstats
* add version 1.40.0 to bioconductor package r-deseq2
* add version 1.46.0 to bioconductor package r-dexseq
* add version 1.42.0 to bioconductor package r-dirichletmultinomial
* add version 2.14.0 to bioconductor package r-dmrcate
* add version 1.74.0 to bioconductor package r-dnacopy
* add version 3.26.0 to bioconductor package r-dose
* add version 2.48.0 to bioconductor package r-dss
* add version 3.42.0 to bioconductor package r-edger
* add version 1.20.0 to bioconductor package r-enrichplot
* add version 2.24.0 to bioconductor package r-ensembldb
* add version 1.46.0 to bioconductor package r-exomecopy
* add version 2.8.0 to bioconductor package r-experimenthub
* add version 1.26.0 to bioconductor package r-fgsea
* add version 2.72.0 to bioconductor package r-gcrma
* add version 1.36.0 to bioconductor package r-gdsfmt
* add version 1.82.0 to bioconductor package r-genefilter
* add version 1.36.0 to bioconductor package r-genelendatabase
* add version 1.72.0 to bioconductor package r-genemeta
* add version 1.78.0 to bioconductor package r-geneplotter
* add version 1.22.0 to bioconductor package r-genie3
* add version 1.36.0 to bioconductor package r-genomeinfodb
* update r-genomeinfodbdata
* add version 1.36.0 to bioconductor package r-genomicalignments
* add version 1.52.0 to bioconductor package r-genomicfeatures
* add version 1.52.0 to bioconductor package r-genomicranges
* add version 2.68.0 to bioconductor package r-geoquery
* add version 1.48.0 to bioconductor package r-ggbio
* add version 3.8.0 to bioconductor package r-ggtree
* add version 2.10.0 to bioconductor package r-glimma
* add version 1.12.0 to bioconductor package r-glmgampoi
* add version 5.54.0 to bioconductor package r-globaltest
* update r-go-db
* add version 1.20.0 to bioconductor package r-gofuncr
* add version 2.26.0 to bioconductor package r-gosemsim
* add version 1.52.0 to bioconductor package r-goseq
* add version 2.66.0 to bioconductor package r-gostats
* add version 1.78.0 to bioconductor package r-graph
* add version 1.62.0 to bioconductor package r-gseabase
* add version 1.32.0 to bioconductor package r-gtrellis
* add version 1.44.0 to bioconductor package r-gviz
* add version 1.28.0 to bioconductor package r-hdf5array
* add version 1.72.0 to bioconductor package r-hypergraph
* add version 1.36.0 to bioconductor package r-illumina450probevariants-db
* add version 0.42.0 to bioconductor package r-illuminaio
* add version 1.74.0 to bioconductor package r-impute
* add version 1.38.0 to bioconductor package r-interactivedisplaybase
* add version 2.34.0 to bioconductor package r-iranges
* add version 1.60.0 to bioconductor package r-kegggraph
* add version 1.40.0 to bioconductor package r-keggrest
* add version 3.56.0 to bioconductor package r-limma
* add version 2.52.0 to bioconductor package r-lumi
* add version 1.76.0 to bioconductor package r-makecdfenv
* add version 1.78.0 to bioconductor package r-marray
* add version 1.12.0 to bioconductor package r-matrixgenerics
* add version 1.8.0 to bioconductor package r-metapod
* add version 2.46.0 to bioconductor package r-methylumi
* add version 1.46.0 to bioconductor package r-minfi
* add version 1.34.0 to bioconductor package r-missmethyl
* add version 1.80.0 to bioconductor package r-mlinterfaces
* add version 1.12.0 to bioconductor package r-mscoreutils
* add version 2.26.0 to bioconductor package r-msnbase
* add version 2.56.0 to bioconductor package r-multtest
* add version 1.38.0 to bioconductor package r-mzid
* add version 2.34.0 to bioconductor package r-mzr
* add version 1.62.0 to bioconductor package r-oligoclasses
* update r-org-hs-eg-db
* add version 1.42.0 to bioconductor package r-organismdbi
* add version 1.40.0 to bioconductor package r-pathview
* add version 1.92.0 to bioconductor package r-pcamethods
* update r-pfam-db
* add version 1.44.0 to bioconductor package r-phyloseq
* add version 1.62.0 to bioconductor package r-preprocesscore
* add version 1.32.0 to bioconductor package r-protgenerics
* add version 1.34.0 to bioconductor package r-quantro
* add version 2.32.0 to bioconductor package r-qvalue
* add version 1.76.0 to bioconductor package r-rbgl
* add version 2.40.0 to bioconductor package r-reportingtools
* add version 2.44.0 to bioconductor package r-rgraphviz
* add version 2.44.0 to bioconductor package r-rhdf5
* add version 1.12.0 to bioconductor package r-rhdf5filters
* add version 1.22.0 to bioconductor package r-rhdf5lib
* add version 1.76.0 to bioconductor package r-roc
* add version 1.28.0 to bioconductor package r-rots
* add version 2.16.0 to bioconductor package r-rsamtools
* add version 1.60.0 to bioconductor package r-rtracklayer
* add version 0.38.0 to bioconductor package r-s4vectors
* add version 1.8.0 to bioconductor package r-scaledmatrix
* add version 1.28.0 to bioconductor package r-scater
* add version 1.14.0 to bioconductor package r-scdblfinder
* add version 1.28.0 to bioconductor package r-scran
* add version 1.10.0 to bioconductor package r-scuttle
* add version 1.66.0 to bioconductor package r-seqlogo
* add version 1.58.0 to bioconductor package r-shortread
* add version 1.74.0 to bioconductor package r-siggenes
* add version 1.22.0 to bioconductor package r-singlecellexperiment
* add version 1.34.0 to bioconductor package r-snprelate
* add version 1.50.0 to bioconductor package r-snpstats
* add version 2.36.0 to bioconductor package r-somaticsignatures
* add version 1.12.0 to bioconductor package r-sparsematrixstats
* add version 1.40.0 to bioconductor package r-spem
* add version 1.38.0 to bioconductor package r-sseq
* add version 1.30.0 to bioconductor package r-summarizedexperiment
* add version 3.48.0 to bioconductor package r-sva
* add version 1.38.0 to bioconductor package r-tfbstools
* add version 1.22.0 to bioconductor package r-tmixclust
* add version 2.52.0 to bioconductor package r-topgo
* add version 1.24.0 to bioconductor package r-treeio
* add version 1.28.0 to bioconductor package r-tximport
* add version 1.28.0 to bioconductor package r-tximportdata
* add version 1.46.0 to bioconductor package r-variantannotation
* add version 3.68.0 to bioconductor package r-vsn
* add version 2.6.0 to bioconductor package r-watermelon
* add version 2.46.0 to bioconductor package r-xde
* add version 1.58.0 to bioconductor package r-xmapbridge
* add version 0.40.0 to bioconductor package r-xvector
* add version 1.26.0 to bioconductor package r-yapsa
* add version 1.26.0 to bioconductor package r-yarn
* add version 1.46.0 to bioconductor package r-zlibbioc
* Revert "add version 1.82.0 to bioconductor package r-genefilter"
This reverts commit 1702071c6d.
* Revert "add version 0.38.0 to bioconductor package r-s4vectors"
This reverts commit 58a7df2387.
* add version 0.38.0 to bioconductor package r-s4vectors
* Revert "add version 1.28.0 to bioconductor package r-aneufinder"
This reverts commit 0a1f59de6c.
* add version 1.28.0 to bioconductor package r-aneufinder
* Revert "add version 2.16.0 to bioconductor package r-beachmat"
This reverts commit cd49fb8e4c.
* add version 2.16.0 to bioconductor package r-beachmat
* Revert "add version 4.8.0 to bioconductor package r-clusterprofiler"
This reverts commit 6e9a951cbe.
* add version 4.8.0 to bioconductor package r-clusterprofiler
* Fix syntax error
* r-genefilter: add version 1.82.0
* new package: r-basilisk-utils
* new package: r-basilisk
* new package: r-densvis
* new package: r-dir-expiry
* r-affyplm: add zlib dependency
* r-cner: add zlib dependency
* r-mzr: add zlib dependency
* r-rhdf5filters: add zstd dependency
* r-shortread: add zlib dependency
* r-snpstats: add zlib dependency
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
macOS doesn't have `getrandom`, and 1.10.2 fails to compile because of this.
There's an upstream fix at https://dev.gnupg.org/T6442 that will be in the next
`libgcrypt` release, but the patch is available now.
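A minimal sketch of how such a backport could be carried in the recipe until the next release; the patch file name and constraint below are illustrative, not the actual package contents:
```python
from spack.package import *


class Libgcrypt(AutotoolsPackage):
    """Sketch only: carry the upstream getrandom fix as a local patch."""

    homepage = "https://gnupg.org/software/libgcrypt/"

    # hypothetical patch file holding the fix from https://dev.gnupg.org/T6442,
    # applied only where the build actually breaks
    patch("getrandom-macos.patch", when="@1.10.2 platform=darwin")
```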
* py-argcomplete: add 3.0.8
* Update var/spack/repos/builtin/packages/py-argcomplete/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [@spackbot] updating style on behalf of manuelakuhn
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* aml: v0.2.1
* add version 0.2.1
* fix hip variant bug
* [fix] pkgconf required for all builds
On top of needing pkgconf for autoreconf builds, the release configure
script needs pkgconf to detect dependencies if any of the hwloc, ze, or
opencl variants are active.
* Remove deprecation for v0.2.0 based on PR advice.
* intel-xed: add version 2023.04.16
1. add version 2023.04.16
2. adjust the mbuild resource to better match the xed version at the time
3. replace three conflicts() with one new requires() for x86_64 target
4. add patch for libxed-ild for some new avx512 instructions
* [@spackbot] updating style on behalf of mwkrentel
* Fix the build for 2023.04.16. XED requires its source directory to be exactly 'xed', so add a symlink.
5. move the mbuild resource up one level, xed wants it to be in the same directory as the xed source dir
6. deprecate 10.2019.03
* semantic style fix: add OSError to except
* [@spackbot] updating style on behalf of mwkrentel
---------
Co-authored-by: mwkrentel <mwkrentel@users.noreply.github.com>
* py-jarvis-util: add a new package
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update py-nltk
* [@spackbot] updating style on behalf of meyersbs
* Update var/spack/repos/builtin/packages/py-nltk/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add version 1.28.3 to r-hexbin
* add version 5.0-1 to r-hmisc
* add version 0.5.5 to r-htmltools
* add version 1.6.2 to r-htmlwidgets
* add version 1.4.2 to r-igraph
* add version 0.42.19 to r-imager
* add version 1.0-5 to r-inum
* add version 0.9-14 to r-ipred
* add version 1.3.2 to r-irkernel
* add version 2.2.0 to r-janitor
* add version 0.1-10 to r-jpeg
* add version 1.2.2 to r-jsonify
* add version 0.9-32 to r-kernlab
* add version 1.7-2 to r-klar
* add version 1.42 to r-knitr
* add version 1.14.0 to r-ks
* add version 2.11.0 to r-labelled
* add version 1.7.2.1 to r-lava
* add version 0.6-15 to r-lavaan
* add version 2.1.2 to r-leaflet
* add version 2.9-0 to r-lfe
* add version 1.1.6 to r-lhs
* add version 1.1-33 to r-lme4
* add version 1.5-9.7 to r-locfit
* add version 0.4.3 to r-log4r
* add version 5.6.18 to r-lpsolve
* add version 0.2-11 to r-lwgeom
* add version 2.7.4 to r-magick
* add version 1.22.1 to r-maldiquant
* add version 1.2.11 to r-mapproj
* add version 1.6 to r-markdown
* add version 7.3-59 to r-mass
* add version 1.5-4 to r-matrix
* add version 0.63.0 to r-matrixstats
* add version 4.2-3 to r-memuse
* add version 4.0-0 to r-metafor
* add version 1.8-42 to r-mgcv
* add version 3.15.0 to r-mice
* add version 0.4-5 to r-mitml
* add version 2.0.0 to r-mixtools
* add version 0.1.11 to r-modelr
* add version 1.4-23 to r-multcomp
* add version 0.1-9 to r-multcompview
* add version 0.1-13 to r-mutoss
* add version 1.18.1 to r-network
* add version 3.3.4 to r-nleqslv
* add version 3.1-162 to r-nlme
* add version 0.26 to r-nmf
* add version 0.60-17 to r-np
* add version 4.2.5.2 to r-openxlsx
* add version 2022.11-16 to r-ordinal
* add version 0.6.0.8 to r-osqp
* add version 0.9.1 to r-packrat
* add version 1.35.0 to r-parallelly
* add version 1.3-13 to r-party
* add version 1.2-20 to r-partykit
* add version 1.7-0 to r-pbapply
* add version 0.3-9 to r-pbdzmq
* add version 1.2 to r-pegas
* add version 1.5-1 to r-phytools
* add version 1.9.0 to r-pillar
* add version 1.4.0 to r-pkgbuild
* add version 2.1.0 to r-pkgcache
* add version 0.5.0 to r-pkgdepends
* add version 2.0.7 to r-pkgdown
* add version 1.3.2 to r-pkgload
* add version 0.1-8 to r-png
* add version 1.1.22 to r-polspline
* add version 1.0.1 to r-pool
* add version 1.4.1 to r-posterior
* add version 3.8.1 to r-processx
* add version 2023.03.31 to r-prodlim
* add version 1.0-12 to r-proj4
* add version 2.5.0 to r-projpred
* add version 0.1.6 to r-pryr
* add version 1.7.5 to r-ps
* add version 1.0.1 to r-purrr
* add version 1.3.2 to r-qqconf
* add version 0.25.5 to r-qs
* add version 1.60 to r-qtl
* add version 0.4.22 to r-quantmod
* add version 5.95 to r-quantreg
* add version 0.7.8 to r-questionr
* add version 1.2.5 to r-ragg
* add version 0.15.1 to r-ranger
* add version 3.6-20 to r-raster
* add version 2.2.13 to r-rbibutils
* add version 1.0.10 to r-rcpp
* add version 0.12.2.0.0 to r-rcpparmadillo
* add version 0.1.7 to r-rcppde
* add version 0.3.13 to r-rcppgsl
* add version 1.98-1.12 to r-rcurl
* add version 1.2-1 to r-rda
* add version 2.1.4 to r-readr
* add version 1.4.2 to r-readxl
* add version 1.0.6 to r-recipes
* add version 1.1.6 to r-repr
* add version 1.2.16 to r-reproducible
* add version 0.3.0 to r-require
* add version 1.28 to r-reticulate
* add version 2.0.7 to r-rfast
* add version 1.6-6 to r-rgdal
* add version 0.6-2 to r-rgeos
* add version 1.1.3 to r-rgl
* add version 0.2.18 to r-rinside
* add version 4-14 to r-rjags
* add version 1.3-1.8 to r-rjsonio
* add version 2.21 to r-rmarkdown
* add version 0.9-2 to r-rmpfr
* add version 0.7-1 to r-rmpi
* add version 6.6-0 to r-rms
* add version 0.10.25 to r-rmysql
* add version 0.8.7 to r-rncl
* add version 2.4.11 to r-rnexml
* add version 0.95-1 to r-robustbase
* add version 1.3-20 to r-rodbc
* add version 7.2.3 to r-roxygen2
* add version 1.4.5 to r-rpostgres
* add version 0.7-5 to r-rpostgresql
* add version 0.8.29 to r-rsconnect
* add version 0.4-15 to r-rsnns
* add version 2.3.1 to r-rsqlite
* add version 0.7.2 to r-rstatix
* add version 1.1.2 to r-s2
* add version 0.4.5 to r-sass
* add version 0.1.9 to r-scatterpie
* add version 0.3-43 to r-scatterplot3d
* add version 3.2.4 to r-scs
* add version 1.6-4 to r-segmented
* add version 4.2-30 to r-seqinr
* add version 0.26 to r-servr
* add version 4.3.0 to r-seurat
* add version 1.0-12 to r-sf
* add version 0.4.2 to r-sfheaders
* add version 1.1-15 to r-sfsmisc
* add version 1.7.4 to r-shiny
* add version 1.9.0 to r-signac
* add version 1.6.0.3 to r-smoof
* add version 0.1.7-1 to r-sourcetools
* add version 1.6-0 to r-sp
* add version 1.3-0 to r-spacetime
* add version 7.3-16 to r-spatial
* add version 2.0-0 to r-spatialeco
* add version 1.2-8 to r-spatialreg
* add version 3.0-5 to r-spatstat
* add version 3.0-1 to r-spatstat-data
* add version 3.1-0 to r-spatstat-explore
* add version 3.1-0 to r-spatstat-geom
* add version 3.1-0 to r-spatstat-linnet
* add version 3.1-4 to r-spatstat-random
* add version 3.0-1 to r-spatstat-sparse
* add version 3.0-2 to r-spatstat-utils
* add version 2.2.2 to r-spdata
* add version 1.2-8 to r-spdep
* add version 0.6-1 to r-stars
* add version 1.5.0 to r-statmod
* add version 4.8.0 to r-statnet-common
* add version 1.7.12 to r-stringi
* add version 1.5.0 to r-stringr
* add version 1.9.1 to r-styler
* add version 3.5-5 to r-survival
* add version 1.5-4 to r-tclust
* add version 1.7-29 to r-terra
* add version 3.1.7 to r-testthat
* add version 1.1-2 to r-th-data
* add version 1.2 to r-tictoc
* add version 1.3.2 to r-tidycensus
* add version 1.2.3 to r-tidygraph
* add version 1.3.0 to r-tidyr
* add version 2.0.0 to r-tidyverse
* add version 0.2.0 to r-timechange
* add version 0.45 to r-tinytex
* add version 0.4.1 to r-triebeard
* add version 1.0-9 to r-truncnorm
* add version 0.10-53 to r-tseries
* add version 0.8-1 to r-units
* add version 4.3.0 to r-v8
* add version 1.4-11 to r-vcd
* add version 1.14.0 to r-vcfr
* add version 0.6.2 to r-vctrs
* add version 1.1-8 to r-vgam
* add version 0.4.0 to r-vioplot
* add version 1.6.1 to r-vroom
* add version 1.72-1 to r-wgcna
* add version 0.4.1 to r-whisker
* add version 0.7.2 to r-wk
* add version 0.39 to r-xfun
* add version 1.7.5.1 to r-xgboost
* add version 1.0.7 to r-xlconnect
* add version 3.99-0.14 to r-xml
* add version 0.13.1 to r-xts
* add version 2.3.7 to r-yaml
* add version 2.3.0 to r-zip
* add version 1.8-12 to r-zoo
* r-bigmem: dependency on uuid
* r-bio3d: dependency on zlib
* r-devtools: dependency cleanup
* r-dose: dependency cleanup
* r-dss: dependency cleanup
* r-enrichplot: dependency cleanup
* r-fgsea: dependency cleanup
* r-geor: dependency cleanup
* r-ggridges: dependency cleanup
* r-lobstr: dependency cleanup
* r-lubridate: dependency cleanup
* r-mnormt: dependency cleanup
* r-sctransform: version format correction
* r-seuratobject: dependency cleanup
* r-tidyselect: dependency cleanup
* r-tweenr: dependency cleanup
* r-uwot: dependency cleanup
* new package: r-clock
* new package: r-conflicted
* new package: r-diagram
* new package: r-doby
* new package: r-httr2
* new package: r-kableextra
* new package: r-mclogit
* new package: r-memisc
* new package: r-spatstat-model
* r-rmysql: use mariadb-client
* r-snpstats: add zlib dependency
* r-qs: add zstd dependency
* r-rcppcnpy: add zlib dependency
* black reformatting
* Revert "r-dose: dependency cleanup"
This reverts commit 4c8ae8f5615ee124fff01ce43eddd3bb5d06b9bc.
* Revert "r-dss: dependency cleanup"
This reverts commit a6c5c15c617a9a688fdcfe2b70c501c3520d4706.
* Revert "r-enrichplot: dependency cleanup"
This reverts commit 65e116c18a94d885bc1a0ae667c1ef07d1fe5231.
* Revert "r-fgsea: dependency cleanup"
This reverts commit ffe2cdcd1f73f69d66167b941970ede0281b56d7.
* r-rda: this package is back in CRAN
* r-sctransform: fix copyright
* r-seurat: fix copyright
* r-seuratobject: fix copyright
* Revert "add version 6.0-94 to r-caret"
This reverts commit 236260597de97a800bfc699aec1cd1d0e3d1ac60.
* add version 6.0-94 to r-caret
* Revert "add version 1.8.5 to r-emmeans"
This reverts commit 64a129beb0bd88d5c88fab564cade16c03b956ec.
* add version 1.8.5 to r-emmeans
* Revert "add version 5.0-1 to r-hmisc"
This reverts commit 517643f4fd8793747365dfcfc264b894d2f783bd.
* add version 5.0-1 to r-hmisc
* Revert "add version 1.42 to r-knitr"
This reverts commit 2a0d9a4c1f0ba173f7423fed59ba725bac902c37.
* add version 1.42 to r-knitr
* Revert "add version 1.6 to r-markdown"
This reverts commit 4b5565844b5704559b819d2e775fe8dec625af99.
* add version 1.6 to r-markdown
* Revert "add version 0.26 to r-nmf"
This reverts commit 4c44a788b17848f2cda67b32312a342c0261caec.
* add version 0.26 to r-nmf
* Revert "add version 2.3.1 to r-rsqlite"
This reverts commit 5722ee2297276e4db8beee461d39014b0b17e420.
* add version 2.3.1 to r-rsqlite
* Revert "add version 1.0-12 to r-sf"
This reverts commit ee1734fd62cc02ca7a9359a87ed734f190575f69.
* add version 1.0-12 to r-sf
* fix syntax error
* Add FNAL Spack team to maintainers
* New variants and configuration improvements
* Version dependent "no-systemd" patches.
* New variants `client_only`, and `davix`
* Better handling of `cxxstd` for different versions, including
improved patching and CMake options.
* Version-specific CMake requirements.
* Better version-specific handling of `openssl` dependency.
* `py-setuptools` required for `+python` build.
* Specific enable/disable of CMake options and use of
`-DFORCE_ENABLED=TRUE` to prevent unwanted/non-portable activation
of features.
* Better handling of `+python` configuration.
* New version 5.5.5
Add aws-pcluster[-aarch64] stacks. These stacks build packages defined in
https://github.com/spack/spack-configs/tree/main/AWS/parallelcluster
They use a custom container from https://github.com/spack/gitlab-runners which
includes necessary ParallelCluster software to link and build as well as an
upstream spack installation with current GCC and dependencies.
Intel and ARM software is installed and used during the build stage but removed
from the buildcache before the signing stage.
Files `configs/linux/{arch}/ci.yaml` select the necessary providers in order to
build for specific architectures (icelake, skylake, neoverse_{n,v}1).
Make it clear that copy-only pipelines are not supported while still
using the deprecated ci config format. Also ensure that the deprecated
stack does not fail on spack pipelines for tags.
* Fix reporting of packageless specs as having no tests
* Add test_test_output_multiple_specs with update to simple-standalone-test (and tests)
* Refactored test status summary; added more tests or checks
MSVC compiler logic was using string parsing to extract the version
from the compiler spec, which was fragile. This broke in #37572, so it has
been fixed and made more robust by using attribute access.
* CI: Expand E4S ROCm stack to include missing DaV packages
Ascent: Fixup for VTK-m with Kokkos backend
* DaV SDK: Removed duplicated openmp variant for ascent
* Drop visit and add conflict for Kokkos
* E4S: Drop ascent from CUDA builds
Ensure that requirements `packages:*:require:@x` and preferences `packages:*:version:[x]`
fail concretization when no version defined in the package satisfies `x`. This always holds
except for git versions -- they are defined on the fly.
In the past, Spack did not allow two different versions of the
same package within a DAG. That led to difficulties with packages
that still required Python 2 while other packages had already
switched to Python 3.
The libxcb and xcb-proto packages did not have Python 3 support
for a time. To get around this issue, Spack maintainers disabled
their dependency on an internal (i.e., Spack-provided) Python
(see #4145), forcing these packages to look for a system-provided
Python (see #7646).
This has worked for us all right, but with the arrival of our most
recent platform we seem to be missing the critical xcbgen Python
module on the system. Since most software has largely moved on to
Python 3 now, let's re-enable internal Spack dependencies for the
libxcb and xcb-proto packages.
Two bugs came in from #37438
1. `unify: when_possible` was broken because of an incorrect assertion: abstract/concrete
spec pairs were compared against the results that were in the process of being computed,
rather than against the previous results.
2. `unify: true` had an ordering bug that could mix the association between abstract and
concrete specs
- [x] 1 is resolved by creating a lookup from old concrete specs to old abstract specs,
and we use that to associate the "new" concrete specs that happen to be the old
ones with their abstract specs (since those are stripped out for concretization)
- [x] 2 is resolved by combining the new and old abstract as lists instead of combining
them as sets. This is important because `set() | set()` does not make any ordering
promises, even though set ordering is otherwise guaranteed in `python@3.7:`
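A rough illustration of the two fixes; the function and variable names are invented for this sketch, not taken from the concretizer code:
```python
def merge_concretization_results(old_pairs, new_abstract):
    """Sketch only: mirror the two fixes described above."""
    # fix 1: a lookup from old concrete specs back to their abstract specs, so
    # "new" concrete specs that are really the old ones keep the right association
    old_abstract_for = {concrete: abstract for abstract, concrete in old_pairs}

    # fix 2: combine abstract specs as ordered lists instead of sets, since
    # set union makes no ordering promises
    old_abstract = [abstract for abstract, _ in old_pairs]
    merged = old_abstract + [a for a in new_abstract if a not in old_abstract]
    return old_abstract_for, merged
```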
* Upgrading kosh to 3.0.
* Accidentally regressed the package, changing back.
* Updating py-hdbscan versions for kosh.
* Fixing bug in patch.
* Adding 3.0.1
* Removing 3.0.
* Updating package deps for hdbscan to match requirements.txt.
* Version reqs for 3.0.*, need newer numpy and networkx
* spack style
* Reordering to match setup.py, adding "type" to python depends.
* trilinos@develop fixes
* Update var/spack/repos/builtin/packages/trilinos/package.py
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
---------
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Spack displays package code context when it shouldn't (e.g., on `FetchError`s)
and doesn't display it when it should (e.g., when errors occur in builder classes).
The line attribution can sometimes be off by one, as well.
- [x] Display package context when errors occur in a subclass of `PackageBase`
- [x] Display package context when errors occur in a subclass of `BaseBuilder`
- [x] Do not display package context when errors occur in `PackageBase`,
`BaseBuilder` or other core code that is not in a `package.py` file.
- [x] Fix off-by-one error for core code (don't subtract one from the line number *unless*
it's in an actual `package.py` file).
---------
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
We currently throw a nasty error if you try to reuse packages from some other namespace
(e.g., OLCF), but we should be able to reuse patched local versions of builtin packages.
Right now the only obstacle to that is that we try to look up virtual info for unknown
namespaces, and we can't get the package from the repo to do that. We *can* assume that
a package with a known namespace is similar, and that its virtual provider information
is reasonably accurate, so we now do that. This isn't 100% accurate, but neither is
relying on the package itself, as it may have gone out of date.
The real solution here is virtual edge information, but this is a stopgap until we have
that.
`spec_clauses()` attempts to look up package information for concrete specs in order to
determine which virtuals they may provide. This fails for renamed/deleted dependencies
of buildcaches and installed packages.
This will eventually be fixed by #35258, which adds virtual information on edges, but we
need a workaround to make older buildcaches usable.
- [x] make an exception for renamed packages and omit their virtual constraints
- [x] add a note that this will be solved by adding virtuals to edges
The concretizer can fail with `reuse:true` if a buildcache or installation contains a
package with a dependency that has been renamed or deleted in the main repo (e.g.,
`netcdf` was refactored to `netcdf-c`, `netcdf-fortran`, etc., but there are still
binary packages with dependencies called `netcdf`).
We should still be able to install things for which we are missing `package.py` files.
`Spec.inject_patches_variant()` was failing this requirement by attempting to look up
the package class for concrete specs. This isn't needed -- we can skip it.
- [x] swap two conditions in `Spec.inject_patches_variant()`
I will follow this up with a variant to flux-core to add flux-security, and then automation in the flux-framework/spack repository.
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
The @= in `spack find` output adds a bit of noise. Remove it as we
did for `spack spec` and `spack concretize`.
This modifies display_specs so it also covers other places where we use that
routine, e.g., `spack buildcache list`.
before:
```
-- linux-ubuntu20.04-aarch64 / gcc@=11.1.0 -----------------------
ofdlcpi libpressio@0.88.0
```
after:
```
-- linux-ubuntu20.04-aarch64 / gcc@11.1.0 -----------------------
ofdlcpi libpressio@0.88.0
```
If a user does not explicitly `--force` the concretization of an entire environment,
Spack will try to reuse the concrete specs that are already in the lockfile.
---------
Co-authored-by: becker33 <becker33@users.noreply.github.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
* gitlab ci: release fixes and improvements
- use rules to reduce boilerplate in .gitlab-ci.yml
- support copy-only pipeline jobs
- make pipelines for release branches rebuild everything
- make pipelines for protected tags copy-only
* gitlab ci: remove url changes used in testing
* gitlab ci: tag mirrors need public key
Make sure that mirrors associated with release branches and tags
contain the public key needed to verify the signed binaries. This
also ensures that when stack-specific mirror contents are copied
to the root, the root mirror has the public key as well.
* review: be more specific about tags, curl flags
* Make the check in ci.yaml consistent with the .gitlab-ci.yml
---------
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Currently, specs on buildcache mirrors must be referenced by their full description. This PR allows buildcache specs to be referenced by their hashes, rather than their full description.
### How it works
Hash resolution has been moved from `SpecParser` into `Spec`, and now includes the ability to execute a `BinaryCacheQuery` after checking the local store, but before concluding that the hash doesn't exist.
### Side-effects of Proposed Changes
Failures will take longer when nonexistent hashes are parsed, as mirrors will now be scanned.
### Other Changes
- `BinaryCacheIndex.update` has been modified to fail appropriately only when mirrors have been configured.
- Tests of hash failures have been updated to use `mutable_empty_config` so they don't needlessly search mirrors.
- Documentation has been clarified for `BinaryCacheQuery`, and more documentation has been added to the hash resolution functions added to `Spec`.
* py-rsatoolbox: add 0.0.5, 0.1.0 and 0.1.2 from wheels
* py-setuptools: add 63.4.3
* remove wheels and open up requirements
* Fix style
* Update var/spack/repos/builtin/packages/py-rsatoolbox/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-rsatoolbox/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Change version for python restriction
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-mne: add 1.4.0 and py-importlib-resources: add 5.12.0
* Fix style
* Update var/spack/repos/builtin/packages/py-mne/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This PR ensures that we'll get a comprehensible error message whenever an old
version of Spack tries to use a DB or a lockfile that is "too new".
* Fix error message when using a too new DB
* Add a unit-test to ensure we have a comprehensible error message
* py-pysam: adding version 0.21.0
* Update var/spack/repos/builtin/packages/py-pysam/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Add a section to the lock file to track the Spack version/commit that produced
an environment. This should (eventually) enhance reproducibility, though we
do not currently do anything with the information. It just adds to provenance
at the moment.
Changes include:
- [x] adding the version/commit to `spack.lock`
- [x] refactor `spack.main.get_version()`
- [x] fix a couple of environment lock file-related typos
* add a virtual dependency name instead of complete package name
* add OneAPI components as providers of virtual packages
* Revert the default of tbb
---------
Co-authored-by: Nisarg Patel <nisarg.patel@lrz.de>
The flags --mirror-name / --mirror-url / --directory were deprecated in
favor of just passing a positional name, url or directory, and letting spack
figure it out.
---------
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
Prior to this PR, the HOMEDRIVE environment variable was used to
detect what drive we are operating in. This variable is not available
for service account logins (like what is used for CI), so switch to
extracting the drive from PROGRAMFILES (which is more-widely defined).
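A small sketch of the idea (not the actual implementation):
```python
import os

# PROGRAMFILES is defined even for service-account logins, unlike HOMEDRIVE,
# so derive the drive letter from it
program_files = os.environ.get("PROGRAMFILES", r"C:\Program Files")
drive = os.path.splitdrive(program_files)[0]  # e.g. "C:"
```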
On Windows, several commonly available system tools for decompression
are unreliable (gz/bz2/xz). This commit refactors `decompressor_for`
to call out to a Windows or Unix-specific method:
* The decompressor_for_nix method behaves the same as before and
generally treats the Python/system support options for decompression
as interchangeable (although it avoids using Python's built-in tar
support since that has had issues with permissions).
* The decompressor_for_win method can only use Python support for
gz/bz2/xz, although for a tar.gz it does use system support for
untar (after the decompression step). .zip uses the system tar
utility, and .Z depends on external support (i.e. that the user
has installed 7zip).
A naming scheme has been introduced for the various _decompression
methods:
* _system_gunzip means to use a system tool (and fail if it's not
available)
* _py_gunzip means to use Python's built-in support for decompressing
.gzip files (and fail if it's not available)
* _gunzip is a method that can do either
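A sketch of the dispatch this describes; the helper bodies are stubs and the exact signatures are assumptions:
```python
import sys


def decompressor_for(path, extension=None):
    # route to the platform-specific strategy described above
    if sys.platform == "win32":
        return decompressor_for_win(path, extension)
    return decompressor_for_nix(path, extension)


def decompressor_for_nix(path, extension):
    """Treat Python/system decompression support as largely interchangeable."""


def decompressor_for_win(path, extension):
    """Use Python for gz/bz2/xz; use the system tar only for the un-tar step."""
```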
* fix(hdf5): h5pfc link failure
develop branch doesn't need linking any more.
See: acb186f6e5
* [@spackbot] updating style on behalf of hyoklee
---------
Co-authored-by: hyoklee <hyoklee@users.noreply.github.com>
This is a refactor of Spack's stand-alone test process to be more spack- and pytest-like.
It is more spack-like in that test parts are no longer "hidden" in a package's run_test()
method and pytest-like in that any package method whose name starts with test_
(i.e., a "test" method) is a test part. We also support the ability to embed test parts in a
test method when that makes sense.
Test methods are now implicit test parts. The docstring is the purpose for the test part.
The name of the method is the name of the test part. The working directory is the active
spec's test stage directory. You can embed test parts using the test_part context manager.
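A minimal sketch of what this looks like in a recipe (the package and executable names are illustrative):
```python
from spack.package import *


class Example(Package):
    """Sketch only: one implicit test part plus one embedded test part."""

    def test_version_flag(self):
        """check that the installed binary reports its version"""
        example = Executable(self.prefix.bin.example)
        example("--version")

        # an embedded test part inside the test method
        with test_part(self, "test_version_flag_help", purpose="run example --help"):
            example("--help")
```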
Functionality added by this commit:
* Adds support for multiple test_* stand-alone package test methods, each of which is
an implicit test_part for execution and reporting purposes;
* Deprecates package use of run_test();
* Exposes some functionality from run_test() as optional helper methods;
* Adds a SkipTest exception that can be used to flag stand-alone tests as being skipped;
* Updates the packaging guide section on stand-alone tests to provide more examples;
* Restores the ability to run tests "inherited" from provided virtual packages;
* Prints the test log path (like we currently do for build log paths);
* Times and reports the post-install process (since it can include post-install tests);
* Corrects context-related error message to distinguish test recipes from build recipes.
* hip: get_paths for hipify-clang
* fix: need to actually use get_paths now to get hipify-clang path
* set hipify-clang path differently for external vs spack-installed case
* [@spackbot] updating style on behalf of eugeneswalker
fixes #22341
Using double quotes creates issues with shell variable substitutions,
in particular when the manifest has "definitions:" in it. Use single
quotes instead.
Add a "require" directive to packages, which functions exactly like
requirements specified in packages.yaml (uses the same fact-generation
logic); update both to allow making the requirement conditional (see the sketch after the list below).
* Packages may now use "require" to add constraints. This can be useful
for something like "require(%gcc)" (where before we had to add a
conflict for every compiler except gcc).
* Requirements (in packages.yaml or in a "require" directive) can be
conditional on a spec, e.g. "require(%gcc, when=@1.0.0)" (version
1.0.0 can only build with gcc).
* Requirements may include a message which clarifies why they are needed.
The concretizer assigns a high priority to errors which generate these
messages (in particular over errors for unsatisfied requirements that
do not produce messages, but also over a number of more-generic
errors).
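A sketch of what this looks like in a recipe, using the names as written above (the exact directive and keyword spellings are assumptions):
```python
from spack.package import *


class Example(Package):
    """Sketch only: requirements expressed directly in the package."""

    # unconditional: this package can only be built with gcc
    require("%gcc")

    # conditional, with a message the concretizer can surface on failure
    require("%gcc", when="@1.0.0", msg="example@1.0.0 can only be built with gcc")
```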
## Version types, parsing and printing
- The version classes have changed: `VersionBase` is removed, there is now a
`ConcreteVersion` base class. `StandardVersion` and `GitVersion` both inherit
from this.
- The public api (`Version`, `VersionRange`, `ver`) has changed a bit:
1. `Version` produces either `StandardVersion` or `GitVersion` instances.
2. `VersionRange` produces a `ClosedOpenRange`, but this shouldn't affect the user.
3. `ver` produces any of `VersionList`, `ClosedOpenRange`, `StandardVersion`
or `GitVersion`.
- No unexpected type promotion, so that the following is no longer an identity:
`Version(x) != VersionRange(x, x)`.
- `VersionList.concrete` now returns a version if it contains only a single element
subtyping `ConcreteVersion` (i.e. `StandardVersion(...)` or `GitVersion(...)`)
- In version lists, the parser turns `@x` into `VersionRange(x, x)` instead
of `Version(x)`.
- The above also means that `ver("x")` produces a range, whereas
`ver("=x")` produces a `StandardVersion`. The `=` is part of _VersionList_
syntax (see the sketch after this list).
- `VersionList.__str__` now outputs `=x.y.z` for specific version entries,
and `x.y.z` as a short-hand for ranges `x.y.z:x.y.z`.
- `Spec.format` no longer aliases `{version}` to `{versions}`, but pulls the
concrete version out of the list and prints that -- except when the list is
not concrete, then it falls back to `{versions}` to avoid a pedantic error.
For projections of concrete specs, `{version}` should be used to render
`1.2.3` instead of `=1.2.3` (which you would get with `{versions}`).
The default `Spec` format string used in `Spec.__str__` now uses
`{versions}` so that `str(Spec(string)) == string` holds.
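A small sketch of the parsing rules described above:
```python
from spack.version import ver

r = ver("1.2.3")   # a range: matches 1.2.3 and suffixed releases like 1.2.3.1
v = ver("=1.2.3")  # the specific version 1.2.3 only

print(type(r).__name__)  # ClosedOpenRange, per the description above
print(type(v).__name__)  # StandardVersion
```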
## Changes to `GitVersion`
- `GitVersion` is a small wrapper around `StandardVersion` which enriches it
with a git ref. It no longer inherits from it.
- `GitVersion` _always_ needs to be able to look up an associated Spack version
if it was not assigned (yet). It throws a `VersionLookupError` whenever `ref_version`
is accessed but it has no means to look up the ref; in the past Spack would
not error and use the commit sha as a literal version, which was incorrect.
- `GitVersion` is never equal to `StandardVersion`, nor is satisfied by it. This
is such that we don't lose transitivity. This fixes the following bug on `develop`
where `git_version_a == standard_version == git_version_b` does not imply
`git_version_a == git_version_b`. It also ensures equality always implies equal
hash, which is also currently broken on develop; inclusion tests of a set of
versions + git versions would behave differently from inclusion tests of a
list of the same objects.
- The above means `ver("ref=1.2.3") != ver("=1.2.3")` could break packages that branch
on specific versions, but that was brittle already, since the same happens with
externals: `pkg@1.2.3-external` suffixes wouldn't be exactly equal either. Instead,
those checks should be `x.satisfies("@1.2.3")` which works both for git versions and
custom version suffixes.
- `GitVersion` from commit will now print as `<hash>=<version>` once the
git ref is resolved to a spack version. This is for reliability -- version is frozen
when added to the database and queried later. It also improves performance
since there is no need to clone all repos of all git versions after `spack clean -m`
is run and something queries the database, triggering version comparison (for
example during reuse concretization).
- The "empty VerstionStrComponent trick" for `GitVerison` is dropped since it wasn't
representable as a version string (by design). Instead, it's replaced by `git`,
so you get `1.2.3.git.4` (which reads 4 commits after a tag 1.2.3). This means
that there's an edge case for version schemes `1.1.1`, `1.1.1a`, since the
generated git version `1.1.1.git.1` (1 commit after `1.1.1`) compares larger
than `1.1.1a`, since `a` and `git` are compared as strings. This is currently a
won't-fix edge case, but if really required, it could be fixed by special-casing
the `git` string.
- Saved, concrete specs (database, lock file, ...) that only had a git sha as their
version, but have no means to look up the effective Spack version anymore, will
now see their version mapped to `hash=develop`. Previously these specs
would always have their sha literally interpreted as a version string (even when
it _could_ be looked up). This only applies to databases, lock files and spec.json
files created before Spack 0.20; after this PR, we always have a Spack version
associated with the relevant GitVersion.
- Fixes a bug where previously `to_dict` / `from_dict` (de)serialization would not
reattach the repo to the GitVersion, causing the git hash to be used as a literal
(bogus) version instead of the resolved version. This was in particular breaking
version comparison in the build process on macOS/Windows.
## Installing or matching specific versions
- In the past, `spack install pkg@3.2` would install `pkg@=3.2` if it was a
known specific version defined in the package, even when newer patch releases
`3.2.1`, `3.2.2`, `...` were available. This behavior was only there because
there was no syntax to distinguish between `3.2` and `3.2.1`. Since there is
syntax for this now through `pkg@=3.2`, the old exact matching behavior is
removed. This means that `spack install pkg@3.2` constrains the `pkg` version
to the range `3.2`, and `spack install pkg@=3.2` constrains it to the specific
version `3.2`.
- Also in directives such as `depends_on("pkg@2.3")` and their when
conditions `conflicts("...", when="@2.3")` ranges are ranges, and specific
version matches require `@=2.3`.
- No matching version: in the case `pkg@3.2` matches nothing, concretization
errors. However, if you run `spack install pkg@=3.2` and this version
doesn't exist, Spack will define it; this allows you to install non-registered
versions.
- For consistency, you can now do `%gcc@10` and let it match a configured
`10.x.y` compiler. It errors when there is no matching compiler.
In the past it was interpreted like a specific `gcc@=10` version, which
would get bootstrapped.
- When compiler _bootstrapping_ is enabled, `%gcc@=10.2.0` can be used to
bootstrap a specific compiler version.
## Other changes
- Externals, compilers, and develop spec definitions are backwards compatible.
They are typically defined as `pkg@3.2.1` even though they should be
saying `pkg@=3.2.1`. Spack now transforms `pkg@3` into `pkg@=3` in those cases.
- Finally, fix strictness of `version(...)` directive/declaration. It just does a simple
type check, and now requires strings/integers. Floats are not allowed because
they are ambiguous `str(3.10) == "3.1"`.
`spack buildcache create` is a misnomer because it's the only way to push to
an existing buildcache (and it in fact calls binary_distribution.push).
Also we have `spack buildcache update-index` but for create the flag is
`--rebuild-index`, which is confusing (and also... why "rebuild"
something if the command is "create" in the first place, that implies it
wasn't there to begin with).
So, after this PR, you can use either
```
spack buildcache create --rebuild-index
```
or
```
spack buildcache push --update-index
```
Also, alias `spack buildcache rebuild-index` to `spack buildcache
update-index`.
Spack never parsed `nagfor` linker arguments put on the compiler line:
```
nagfor -Wl,-Wl,,-rpath,,/path
```
so, let's continue not attempting to parse that.
`buildcache create --rel`: deprecate this because there is no point in
making things relative before tarballing; on install you need to expand
`$ORIGIN` / `@loader_path` / relative symlinks anyway because some
dependencies may actually be in an upstream, or have different
projections.
`buildcache install --allow-root`: this flag was propagated through a
lot of functions but was ultimately unused.
* py-amici, py-python-libsbml: new packages
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Swig and cmake are build-only dependencies
* cmake as a run dependency after all
* py-amici: default boost and hdf5 variants to True
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
When building perl with the POSIX jobserver, it seems to eat jobs, which
reduces parallelism to 1 in many cases, and is rather annoying. This is
solved in GNU Make 4.4 (fifo is more stable than file descriptors), but
that version is typically not available.
So, fix this issue by simply unsetting MAKEFLAGS for the duration of
./Configure. That's enough, and the build phase runs perfectly in
parallel again.
This switches the default Make build type to `build_type=Release`.
This offers:
- higher optimization level, including loop vectorization on older GCC
- adds NDEBUG define, which disables assertions, which could cause speedups if assertions are in loops etc
- no `-g` means smaller install size
Downsides are:
- worse backtraces (though this does NOT strip symbols)
- perf reports may be useless
- no function arguments / local variables in debugger (could be of course)
- no file path / line numbers in debugger
The downsides can be mitigated by overriding to `build_type=RelWithDebInfo` in `packages.yaml`,
if needed. The upside is that builds will be MUCH smaller (and faster) with this change.
---------
Co-authored-by: Gregory Becker <becker33@llnl.gov>
* Vendor ruamel.yaml v0.17.21
* Add unit test for whitespace regression
* Add an abstraction layer in Spack to wrap ruamel.yaml
All YAML operations are routed through spack.util.spack_yaml
The custom classes have been adapted to the new ruamel.yaml
class hierarchy.
Fixed line annotation issue in "spack config blame"
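A usage sketch of the wrapper layer; the function names are assumed from typical use, not checked against the new module:
```python
import spack.util.spack_yaml as syaml

# all YAML reads and writes go through the wrapper, never through ruamel directly
with open("spack.yaml") as f:
    data = syaml.load(f)

with open("spack.yaml", "w") as f:
    syaml.dump(data, stream=f)
```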
This ensures that:
a) no externals are added to the tarball metadata file
b) no externals are added to the prefix to prefix map on install, also
for old tarballs that did include externals
c) ensure that the prefix -> prefix map is always string to string, and
doesn't contain None in case a hash is missing for some reason
* libiconv can be provided by libc, so update packages which depend on
libiconv to require the iconv virtual instead
* Many packages need special consideration when locating iconv depending
on whether it is provided by libc (no prefix provided) or the libiconv
package (in that case we want to provide a prefix)
* It was also noticed that when an iconv external was provided, there
was interference with linking (this should generally be handled
by Spack's compiler wrappers and bears further investigation)
* Like iconv, libintl can be provided by libc or another package, namely
gettext. It is not converted to a provider like libiconv because it
provides additional routines. The logic is similar to that of iconv
but instead of checking the provider, we check whether the gettext
installation includes libintl.
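A hedged sketch of the kind of check described for libintl (mirroring the `libs.names` test mentioned later for glib); the package name and configure arguments are illustrative:
```python
from spack.package import *


class Dependent(AutotoolsPackage):
    """Sketch only: detect whether gettext actually provides libintl."""

    depends_on("iconv")    # the virtual, which libc or libiconv may provide
    depends_on("gettext")

    def configure_args(self):
        args = []
        # if the gettext installation ships libintl, link against it explicitly;
        # otherwise libintl comes from libc and nothing extra is needed
        if "intl" in self.spec["gettext"].libs.names:
            args.append("LIBS=" + self.spec["gettext"].libs.ld_flags)
        return args
```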
* Provide openmp from rocm-openmp-extras for rocblas test
* Addressing the prechecks/audit/package-audits check
* Correcting style check errors.
* rocm-openmp-extras path variable restricted for tests
* Correcting the env variable to run_tests
* Guard use of OpenMP to make it optional in rocblas test
* Removing unused patch
* Patch to handle file reorg changes for the tests
* Correcting patch file name
* Limiting hipify-clang path to 5.4 and later
* Set hipify-clang path env in CMake
* Disable module generation by default (#35564)
a) It's used by site administrators, so it's niche
b) If it's used by site administrators, they likely need to modify the config anyhow, so the default config only serves as an example to get started
c) it's too arbitrary to enable tcl, but disable lmod
* Remove leftover from old module file schema
* Warn if module file config is detected and generation is disabled
---------
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Change the signature of the Environment.__init__ method to have
a single argument, i.e. the directory where the environment manifest
is located. Initializing that directory is now delegated to a function
taking care of all the error handling upfront. Environment objects
require a "spack.yaml" to be available to be constructed.
Add a class to manage the environment manifest file. The environment
now delegates to an attribute of that class the responsibility of keeping
track of changes modifying the manifest. This allows simplifying the
updates of the manifest file, and helps keep the spec lists in memory in sync
with the spack.yaml on disk.
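A minimal sketch of the new construction pattern (the path is illustrative):
```python
import spack.environment as ev

# the constructor takes only the directory holding the manifest; creating or
# validating that directory is handled by a separate function beforehand
env = ev.Environment("/path/to/my-env")  # fails if no spack.yaml is present there
```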
* generax: adding new package generax
* muscle5: adding new package muscle5
* py-custom-inherit: adding new package py-custom-inherit
* py-ete3: adding new package py-ete3
* py-itolapi: adding new package py-itolapi
* py-opentree: adding new package py-opentree
* py-pypng: adding new package py-pypng
* py-toyplot: adding new package py-toyplot
* py-toytree: adding new package py-toytree
* py-pastml: adding new package py-pastml
* raxml-ng: adding new version 1.1.0
* py-topiary: adding new package py-topiary
* generax: adding master branch version
generax: adding version 2.0.1
generax: add mpi variant
* py-topiary: add main
* generax: correcting commit for 2.0.1
* Update var/spack/repos/builtin/packages/py-itolapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-opentree/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-topiary-asr: rename package, requested changes.
* Update var/spack/repos/builtin/packages/py-topiary-asr/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Make sure to append additional flags needed for specific compilers
in the flag_handler instead of adding them as separate cmake define
lines that override the main spack cflags.
Spack comes to a crawl post-install of nvhpc, which is partly thanks to
this post install hook which has a lot of redundancy, and isn't correct.
1. There's no need to store "type" because that _is_ "mode".
2. There are more file types than "symlink", "dir", "file".
3. Don't checksum device type things
4. Don't run 3 stat calls (exists, stat, isdir/islink), but one lstat
call
5. Don't read entire files into memory
I also don't know why `spack.crypto` wasn't used for checksumming, but I
guess it's too late for that now. Finally, md5 would've been the faster
algorithm, which would've been fine given that a non-cryptographic
checksum was used anyway.
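A sketch of the cheaper traversal described above; the hash choice and names are illustrative, not the hook's actual code:
```python
import hashlib
import os
import stat


def file_metadata(path):
    # a single lstat call instead of separate exists/stat/isdir/islink checks
    st = os.lstat(path)
    mode = st.st_mode

    digest = None
    if stat.S_ISREG(mode):
        # stream the file instead of reading it into memory all at once
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        digest = h.hexdigest()

    # directories, symlinks and device files are not checksummed
    return {"mode": mode, "sha256": digest}
```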
The bricks package uses header from the opencl-clhpp package when built with
the cuda variant activated. In order to find the header files, the bricks
CMakeLists.txt uses the `find_package(OpenCL 2.0)` statement. The CMake
FindOpenCL module searches several paths to find the header files. Eventually
it will search for header files in the local /usr/include directories. If
OpenCL headers are found, but CUDA is not installed locally, then the build
will fail.
One of the CMake variables searched for a path to the OpenCL headers is
OCL_ROOT. This fix utilizes the OCL_ROOT variable to identify the correct path
to the installed opencl-clhpp package within Spack. Also, if the cuda variant is
not used, then the OpenCL build is disabled to prevent a build failure due to
improperly-identified locally-installed OpenCL header files.
The default behavior of the build process has not changed. External variable
definitions must be made to activate these features. Specifically, to disable
the OpenCL build, this flag must be provided to CMake:
-DBRICK_USE_OPENCL=OFF
The Spack build process explicitly uses this option unless the cuda variant is
specified. If the cuda variant is specified, then the BRICK_USE_OPENCL variable
is set to ON and the OCL_ROOT variable is set to the path of the opencl-clhpp
include directory.
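A hedged sketch of the package logic this describes; the variant and CMake variable names follow the text, the rest is illustrative:
```python
from spack.package import *


class Bricks(CMakePackage):
    """Sketch only: tie the OpenCL build to the cuda variant."""

    variant("cuda", default=False, description="Build with CUDA and OpenCL headers")
    depends_on("opencl-clhpp", when="+cuda")

    def cmake_args(self):
        args = []
        if self.spec.satisfies("+cuda"):
            args.append(self.define("BRICK_USE_OPENCL", "ON"))
            # point CMake's FindOpenCL at the Spack-installed headers
            args.append(self.define("OCL_ROOT", self.spec["opencl-clhpp"].prefix.include))
        else:
            # avoid picking up locally installed /usr/include OpenCL headers
            args.append(self.define("BRICK_USE_OPENCL", "OFF"))
        return args
```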
* "new py-subword-nmt package"
* [@spackbot] updating style on behalf of Sangu-Mbekelu
* Update package.py
updating package based on review
* [@spackbot] updating style on behalf of Sangu-Mbekelu
---------
Co-authored-by: Sangu Mbekelu <s.mbekelu9@gmail.com>
* [py-breathe] New version 4.35.0
* Update var/spack/repos/builtin/packages/py-breathe/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [py-sphinx-design] New versions 0.4.0, 0.4.1
* conflicts() -> depends_on()
Per @adamjstewart
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Adding py-ipyrad for testing
* py-ipyrad: placating flake8
* py-ipyrad: adding version 0.9.90, fixing hard coded path.
* py-ipyrad: use join_path instead of hard coded linux path
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-ipyrad: Removing unneeded dependencies
* py-ipyrad: Readded future (see ipyrad setup.py)
* py-ipyrad: Switch to an anchored link in the docs
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-ipyrad: Removed patch decorator
---------
Co-authored-by: snehring <snehring@iastate.edu>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* elfutils cannot build against libarchive@3.62+iconv
* elfutils needs libmicrohttpd version 0.9.50 or older
* elfutils: explicitly depend on pkg-config
* libmicrohttpd: Add several new versions
* Add patch for MacOS M1/M2 machines to fix segfault when using gmp
* Update var/spack/repos/builtin/packages/gmp/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Restrict patch to v6.2.1
* Update var/spack/repos/builtin/packages/gmp/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The patch for the zziplib package applied for version 0.13.69 and
earlier includes a reference to Creative Commons
Attribution-NonCommercial-ShareAlike 2.0 Generic license, which
causes Flexera's ~expensive perl script~ FlexNet Code Insights open
source license and compliance tool to flag Spack as a non-commercial
product. This patch narrows the diff context in the patch to exclude
this text. The semantics of the patch file are unchanged.
We have successfully been building silo@4.10.2 against hdf5@1.10.4 for
some time. Refinement of #34275 (which was concerned with 4.11 but
unnecessarily restricted 4.10).
* py-numpy: set openblas `symbol_suffix` in site.cfg
This writes the correct `symbol_suffix` variant value from the `openblas` in the spec into the `site.cfg`. Fixes #37133.
* py-numpy: fix style
* py-numpy: handle symbol_suffix == "none"
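A rough sketch of the decision; the helper is invented and the surrounding site.cfg writing is elided:
```python
def openblas_symbol_suffix(spec):
    """Sketch only: pick the suffix to write into numpy's site.cfg, if any."""
    suffix = spec["openblas"].variants["symbol_suffix"].value
    # "none" means openblas was built without a suffix, so write nothing
    return None if suffix == "none" else suffix
```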
* py-codecov: deprecate since not on pypi anymore
* codecov: new package
* [@spackbot] updating style on behalf of wdconinc
* codecov: use github URL instead, multi-platform
* fix: install to prefix.bin.codecov
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* codecov: use versions lookup dict
* codecov: versions -> _versions, fix style
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Silence make
Set a fixed and large NUM_THREADS by default, to avoid it being initialized with the host CPU count.
Set OMP_NUM_THREADS/OPENBLAS_NUM_THREADS in terms of make_jobs so that tests don't need excessive CPU.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
If we modify both Path and PATH, on Windows they will clobber one
another. This PR updates the shell modification logic to automatically
convert variable names to upper-case on Windows.
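A small sketch of the normalization (not the actual shell-modification code):
```python
import sys


def normalize_env_var_name(name):
    # Windows treats environment variable names case-insensitively, so "Path"
    # and "PATH" would clobber one another; normalize to upper-case there
    return name.upper() if sys.platform == "win32" else name
```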
Paths with spaces are an issue on Windows and our current PowerShell
scripts are not sufficiently hardened against their use.
This PR removes problematic cmdlets that do not work well with paths
with spaces and adds escape quotes in other areas where this could be an
issue.
* glib: new version 2.76.1
This adds a new stable version of glib, 2.76.1 (skipping the 2.75 unstable series).
The `mkenums.py` check is now specified as a dict, after r62dca6c1cf. The `filter_file` should disable both the old and the new form. Better (maybe, but more complicated) would be to add the `can_fail` flag for this test.
The `iconv` argument was already deprecated and has now been removed. It is now resolved through meson itself, e71ecc8771.
Builds successfully on my system (and several dependents on top of it):
```console
==> glib: Successfully installed glib-2.76.1-7iy4mee2evabd357gviozbtyh5yxi27t
```
as does the previous 2.74.6 version
* glib: patch for 2.76.1, new version 2.74.7
Replace my initial libintl check with the much nicer check for
"intl" in self.spec["gettext"].libs.names. Thanks to Chris Green!
Co-authored-by: Bernhard Kaindl <bkaindl@gmail.com>
* initial commit for enabling test for rccl hsakmt-roct and rocm-opencl
* fix styling and cleaning code
* adding missing imports and minor fixes
* minor style fix
* modifying hsakmt-roct test to run right after installation
* osg-ca-certs: igtf link should point to version, not 'current'
* osg-ca-certs: new version 1.110.igtf.1.119
* [@spackbot] updating style on behalf of wdconinc
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* ants: add version 2.4.3
- add version 2.4.3
- deprecate old git version
* [@spackbot] updating style on behalf of glennpj
---------
Co-authored-by: glennpj <glennpj@users.noreply.github.com>
* Add new package MozJPEG
MozJPEG is a patched version of libjpeg-turbo which improves JPEG compression efficiency, achieving higher visual quality and smaller file sizes at the same time.
* MozJPEG: Add myself as a maintainer and fix style
* google-cloud-cli: add new package
* black fixes
* Less verbose
* [@spackbot] updating style on behalf of adamjstewart
* More robust if ver doesn't exist for platform
* Deprecate ancient GEE
* Fix ppc64le bug
---------
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
- [x] Replace `version(ver, checksum=None, **kwargs)` signature with
`version(ver, checksum=None, *, sha256=..., ...)` explicitly listing all arguments.
- [x] Fix various issues in packages:
- `tags` instead of `tag`
- `default` instead of `preferred`
- `sha26` instead of `sha256`
- etc
Also, use `sha256=...` consistently.
Note: setting `sha256` currently doesn't validate the checksum length, so you could do
`sha256="a"*32` and it would get checked as `md5`... but that's something for another PR.
* py-torch: define property cmake_prefix_paths
`py-torch` installs `libtorch` and a cmake config in a non-standard location. This points downstream code to the relevant locations. From there it should pick up the correct library and include paths for C++ projects.
* py-torch: python_platlib suggestion
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [@spackbot] updating style on behalf of wdconinc
* py-torch: back to self.spec["python"].package.platlib
* [@spackbot] updating style on behalf of wdconinc
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* Add mfem v4.5.2 and related updates/tweaks in other packages
* [mfem] Add the release source link for MFEM v4.5.2
* [mfem] Remove 'goxberry' (his request) from MFEM's maintainers list
This means that `spack install` will now build the minimal set of packages
required to install the root(s).
To opt out of build edge pruning, use `spack install --include-build-deps`.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Other tools like git support `GIT_EDITOR` which takes higher precedence than the
standard `VISUAL` or `EDITOR` variables. This adds similar support for Spack, in the
`SPACK_EDITOR` env var.
- [x] consolidate editor code from hooks into `spack.util.editor`
- [x] add more editor tests
- [x] add support for `SPACK_EDITOR`
- [x] add a documentation section for controlling the editor and reference it
Code from `spack.util.editor` was duplicated into our licensing hook in #11968. We
really only want one place where editor search logic is implemented. This consolidates
the logic into `spack.util.editor`, including a special case to run `gvim` with `-f`.
- [x] consolidate editor search logic in spack.util.editor
- [x] add tests for licensing case, where `Executable` is used instead of `os.execv`
- [x] make `_exec_func` argument of `editor()` into public `exec_fn` arg
- [x] add type annotations
fixes #36628
Fix using compilers that declare "target: any" in their
configuration. This should happen only on Cray with the
module based programming environment.
* Simplify test/cmd/ci.py::test_ci_generate_with_custom_scripts
* Rearrange the build-job logic in generate_gitlab_ci_yaml
* Preserve all unknown attributes in build jobs
* Slip tests for custom attributes into the tests for other job types
* Support custom artifacts
* [@spackbot] updating style on behalf of blue42u
* Don't bother sorting needs
---------
Co-authored-by: blue42u <blue42u@users.noreply.github.com>
* DaV SDK: Enable ParaView raytracing within the SDK
* CI: Drop swr testing from Data Vis SDK
* ISPC: extend LLVM requirement to main
* DaV SDK: Disallow concretizing develop unifyfs
No longer needed after mochi-margo patch
* New packages: py-ogb, py-outdated, py-littleutils
* Update var/spack/repos/builtin/packages/py-outdated/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* lmod modules: allow users to remove items from hierarchy per-spec
This allows MPI wrappers that depend on MPI to be removed from the MPI portion of
the hierarchy and be made available when the appropriate compiler is loaded.
module load gcc
module load mpi-wrapper # implicitly loads mpi
module load hdf5
This allows users to treat an mpi wrapper like an mpi program
This adds the new LTS version of Qt5. No build system changes needed.
The bundled libjpeg and sqlite versions were updated, but it is unclear if these are actual build requirements, and we have not been tracking these specific versions in the version dependencies (likely due to exactly this lack of clarity).
Compare: https://github.com/qt/qtbase/compare/v5.15.8-lts-lgpl...v5.15.9-lts-lgpl
* Add NetCDF95 package.
NetCDF95 is an alternative Fortran interface to the NetCDF library which uses Fortran 2003 features.
* [@spackbot] updating style on behalf of RemiLacroix-IDRIS
---------
Co-authored-by: RemiLacroix-IDRIS <RemiLacroix-IDRIS@users.noreply.github.com>
* libmypaint: change extend to append
For the same reason as #36939: `extend` takes a list as its argument, while `append` takes a single list entry, so `append` should be used here (see the short sketch after this list).
* libmypaint: depends_on intltool
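A minimal illustration of the difference (the flags here are made up, this is not the libmypaint recipe):
```python
args = []

args.append("--disable-docs")             # append adds exactly one entry
args.extend(["--without-introspection"])  # extend adds every element of a list

# Passing a bare string to extend adds one entry per character, which is
# the kind of bug this fixes:
args.extend("--foo")   # -> [..., '-', '-', 'f', 'o', 'o']
```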
Starting with the 5.0.x release stream, the CUDA-related configury
items in Open MPI once again need --with-cuda-libdir so that
libcuda.so can be found at configure time; otherwise there is no CUDA
support unless someone copies libcuda.so to
$CUDA_HOME/lib64.
related to #36760
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
* Updating torchgeo to 0.4.1
* Added some commas
* Update var/spack/repos/builtin/packages/py-torchgeo/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-torchgeo/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-torchgeo/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Changed fiona bounds
* Update var/spack/repos/builtin/packages/py-torchgeo/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* upcxx: Enhance auto-detection for HPE Cray EX platforms
1. Some Cray EX systems use ALPS instead of SLURM, ensure we default
the pmi-runcmd appropriately.
2. Some Cray EX systems run a stock kernel and lack a Cray PrgEnv
(yes, really), so add a check for libfabric CXI provider as a
last resort for detecting Cray EX, and ensure we don't choke on
a lack of `$CRAYPE_DIR`.
* upcxx: Cray XC improvements
1. Future-proof Cray XC detection, in case Spack ever starts reporting
it as "linux".
2. Revert cray-libsci workaround for ALCF Theta. The workaround no longer
appears to be necessary, and is actually causing failures on Theta now.
* upcxx: Add level_zero variant detection
This commit changes the environment modifications class to escape
strings with double quotes instead of single quotes.
Single quotes prevent the expansion of environment variables that are
nested within environment variable definitions.
Fixes #36689
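A small illustration of why the quoting matters once the generated definitions are sourced by a shell (the variable names are made up):
```python
nested = "$ROOT/bin:$PATH"

# Single quotes make the shell treat $ROOT and $PATH literally:
print(f"export PATH='{nested}'")    # export PATH='$ROOT/bin:$PATH'  (no expansion)

# Double quotes let the shell expand the nested variables when sourced:
print(f'export PATH="{nested}"')    # export PATH="$ROOT/bin:$PATH"  (expands)
```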
- The "base" builder class should be last in the MRO
- `filter_compiler_wrappers` needs to be moved to builders
- Decorating a function from a mixin class requires using
the correct metaclass for the mixin
* py-dask-mpi: remove jupyter-server-proxy
This dependency isn't a 'hard' one; it optionally simplifies getting access to the web consoles.
See: https://github.com/dask/dask-mpi/pull/102
* Add patch to remove unnecessary dependency
* review comments
* pass formatting
* Update m4
For %oneapi & %intel, we explicitly set -O0 so dependents of m4 do not break
# The default optimization level for icx/icpx is "-O2",
# but building m4 with this level breaks the build of dependents.
# So we set it explicitly to "-O0".
* [@spackbot] updating style on behalf of hpcnpatel
This aims to resolve #34164 by resolving the <include-fragment> tags
that GitHub has started using for their release pages, see
https://github.github.io/include-fragment-element/.
This feels a bit hacky but intended as a starting point for discussion.
After reading a page during spidering, it first parses for
include-fragments, gets them all, and treats them all as separate pages.
Then it looks for href links in both the page itself and the fragments.
Co-authored-by: Alec Scott <alec@bcs.sh>
* py-torch: Update conflicts for +/~tensorpipe
* [@spackbot] updating style on behalf of blue42u
* Update var/spack/repos/builtin/packages/py-torch/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [@spackbot] updating style on behalf of blue42u
---------
Co-authored-by: blue42u <blue42u@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
MSVC compilers rely on vcvars environment setup scripts to establish
build environment variables necessary for all projects to build
successfully. Prior to this we were only piping LIB, INCLUDE, and PATH
changes through.
Instead we need to propagate all changes to the env variables made by
vcvars in order to establish robust support for the MSVC compiler.
This most significantly impacts projects that need to be built with
NMake and MSBuild.
No changes to the build recipe required. Changelog at https://github.com/xrootd/xrootd/compare/v5.5.3...v5.5.4.
Built successfully on my test system:
```console
[+] /opt/software/linux-ubuntu23.04-skylake/gcc-12.2.0/xrootd-5.5.4-cgyz43ivwwqkc7bhdofnxhl2fusysg3m
```
* CI: Fixup docs for bootstrap.
* CI: Add compatibility shim
* Add an update method for CI
Update requires manually renaming section to `ci`. After
this patch, updating and using the deprecated `gitlab-ci` section
should be possible.
* Fix typos in generate warnings
* Fixup CI schema validation
* Add unit tests for legacy CI
* Add deprecated CI stack for continuous testing
* Allow updating gitlab-ci section directly with env update
* Make warning give good advice for updating gitlab-ci
* Fix typo in CI name
* Remove white space
* Remove unneeded component of deprecated-ci
* cppzmq: new versions 4.8.1, 4.9.0 (updated cmake dependency)
No important changes in the build system, https://github.com/zeromq/cppzmq/compare/v4.7.1...v4.9.0, other than the more recent cmake required starting with 4.8.0.
There is also a patch version 4.8.0, but presumably 4.8.1 is preferred.
* cppzmq: add maintainer
* Extract a method to warn when the manifest is not up-to-date
* Extract methods to update the repository and ensure dir exists
* Simplify further the write method, add failing unit-test
* Fix the function computing YAML equivalence between two instances
* ECP-SDK: enable hdf5 VOL adapters
- When +hdf5, enable VOL adapters suitable for the SDK.
- Each VOL package must prepend to the HDF5_PLUGIN_PATH.
- hdf5: 1.13.3 will break existing VOL packages, constrain
VOLs related to SDK and add note to keep 1.13.2 available.
- hdf5-vol-async:
- Do not set HDF5_VOL_CONNECTOR, consumers must opt-in.
- Enforce DAG constraints on MPI to require threaded version.
- Depend on an explicit version of argbots to relax
concretization issues in other spack environments.
- paraview: fix compiler flag usage for the 110 ABI (followup to #33617).
* ECP Data and ViS: Add constraits for HDF5 VOLS
* CI: HDF5 1.14 builds without VisIt
* hdf5-vol-async: Update docs string
---------
Co-authored-by: Stephen McDowell <stephen.mcdowell@kitware.com>
* Fix py-torch build on Linux >=6.0.3
* Update var/spack/repos/builtin/packages/py-torch/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
VeloC/SCR component releases needed for upcoming VeloC release.
* AXL v0.8.0
* ER v0.4.0
* KVTree v1.4.0
* Rankstr v0.3.0
* Redset v0.3.0
* Shuffile v0.3.0
* Spath v0.3.0
Added some dependency compatibility constraints, as the new component
versions changed how their cmake config works.
* add cuDNN variant and make RDKit optional
* [@spackbot] updating style on behalf of RMeli
* add newer version of rdkit
---------
Co-authored-by: RMeli <RMeli@users.noreply.github.com>
* archspec: add v0.2.0, deprecate old versions
* Simplify version ranges
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Remove py-setuptools
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Rename PIKA_WITH_P2300_REFERENCE_IMPLEMENTATION CMake option in pika package
* Remove unnecessary use of self in pika package
* Use append instead of list += for single options in pika package
* Add pika 0.14.0
* add tandem package
* apply black
* fix import
* fix year of license
* add version 1.0 and associated compile fix
* change git to property
* add conflict to intel
* Kokkos: add release 4.0.0
* Kokkos: updating default c++ standard requirement
Now Kokkos requires C++17 as its new minimum C++ standard.
* Kokkos: adding support for new GPU architectures
The new updates include NVIDIA Hopper and AMD Navi
* Kokkos: fixing style...
* paraview +rocm: constrain kokkos dep to @:3.7.01
---------
Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
* initial commit for enabling test for rocm-smi-lib, rocm-cmake and rocm-clang-ocl
* fix styling and cleaning code
* disabling some tests for rocm-smi-lib
* fix style errors
* msmpi has no wrappers so don't set MPI_CXX_COMPILER etc. for that
MPI implementation
* hdf5 on Windows does not have h5cc etc., so do not try to filter
them on Windows
* py-uproot: new versions 5.0.4, 5.0.5
No changes in dependency versions
* py-awkward-cpp: new versions
* py-awkward: new versions
* py-awkward: new version in 1.10.* series
Previously `spack -e bla config update <section>` would treat the
environment config scope as standard config file instead of a single
file config scope. This fixes that.
When app is uninstalled, if it matches a default, then remove the
default symlink targeting its modulefile.
Until now, when a default was uninstalled, the default symlink was
left pointing to a nonexistent modulefile.
- Update default image to Ubuntu 22.04 (previously was still Ubuntu 18.04)
- Optionally use depfiles to install the environment within the container
- Allow extending Dockerfile Jinja2 template
- Allow extending Singularity definition file Jinja2 template
- Deprecate previous options to add extra instructions
Unless the amdgpu_target is overridden, the libraries will default to
being built for cuda, since amdgpu_target=none is both default and in
conflict with +rocm. This requires a custom Disjoint set to include
both the 'auto' variant used by the rocm mathlibs and the 'none'
variant used by ROCmPackage.
* Fix search for hip+cuda in hipcub@5.1 and later
This patch is not strictly necessary, but it may fix the search for HIP
in certain environments.
* Backport fix for CUDA 11.5 to hipsparse
* Reduce effort on grounding by employing cardinality constraints
If we use a cardinality constraint instead of a rule
using pair of values, we'll end up grounding 1 rule
instead of all the possible pair combinations of the
allowed values.
* Display all errors from concretization, instead of just one
If clingo produces multiple "error" facts, we now print all
of them in the error message. Before we were printing just
the one with the least priority.
Consolidate a few common patterns in concretize.lp to ensure
that certain node attributes have one and only one value
assigned.
All errors are displayed, so use a single criterion
instead of three.
* Account for weights in concretize.lp
To recover the optimization order we had before, account
for weights of errors when minimizing.
The priority is mapped to powers of 10, so as to effectively
get back the same results as with priorities.
openblas likes to concurrently write to the same archive from different
targets, and solves that through `.NOTPARALLEL: all`.
We run `make x y z`, which is not affected by `.NOTPARALLEL`, running into
races.
When generating modulefile, correctly detect software installation asked
by user as explicit installation.
The explicit installation status was previously fetched from the
database record of the spec, which was only set after modulefile
generation.
Code is updated to pass down the explicit status of software
installation to the object that generates modulefiles.
Fixes #34730.
Fixes #12105.
A value for the explicit argument has to be set when creating a new
installation, but for operations on existing installation, this value is
retrieved from database. Such operations are: module rm, module refresh,
module setdefaults or when get_module function is used.
Tests that mimic an installation are updated accordingly, since the
explicit argument has to be set in such situations.
Original Author of this change: Chris Green <greenc@fnal.gov>
Two changes:
- Remove adding the library path using -L: It is obsolete now
that we have the library paths in before the system paths.
- Link with -lintl only if the gettext recipe provides it:
When we are on a glibc system, we can use external gettext,
which means we use the libintl inside libc.so: no -lintl then.
This change was already submitted in #35450 and reviewed, but is
stuck in this big PR, which is trying to do too much in a single PR.
This fixes a bug in the Windows build of Perl.
An attribute defined in the package class is inaccessible from the install
method due to the builder: refactor it to be a method.
Add `config:stage_name` which is a Spec format string that can
customize the names of stages created by Spack. This was primarily
created to allow generating shorter stage names on Windows (along
with `config:build_stage`, this can be used to create stages with
short absolute paths).
By default, this is not set and the prior name stage format is used.
This also removes the username component that is always added to
Stage paths on Windows (if users want to include this, they can
add it to the `build_stage`).
* py-aioitertools: add v0.11.0
* Update var/spack/repos/builtin/packages/py-aioitertools/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* compiler wrapper: fix -Xlinker parsing
* handle the case of -rpath without value; avoid that we drop the flag
* also handle the -Xlinker -rpath -Xlinker without further args case...
* fix test
* get rid of global $rp var, reduce branching
* Ascent: Drop VTK-h dependency for 0.9
* Ascent: Remove duplicate OpenMP constraints
* Ascent: 0.9.0 cannot build with vtk-m@2
* Ascent: Only needs vtkm when using vtkh
* Ascent: Require fides when building with ADIOS2
* QE v7.1 add post-processing tools installation
Quantum-Espresso@7.1 ships with an incomplete CMakeLists.txt that prevents the installation of post-processing tools.
Added patches; the issue is fixed in two different upstream commits.
* fixed style
* removed spaces
* added MR references
* Add new libfabric versions 1.17.1, 1.17.0, 1.16.0, 1.15.2.
* Add libfabric dependency on numactl and linux-headers when building
with OPX provider support.
* Set libfabric flag_handler to pass compiler flags as arguments to
configure.
Fix patching old boost versions to account for builders.
Add a proper version constraint on boost for recent dyninst.
The constraint can be found in dyninst source code under
"cmake/Boost.cmake" which contains:
set(_boost_min_version 1.70.0)
Co-authored-by: Greg Becker <becker33@llnl.gov>
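In package terms, the added constraint looks roughly like the sketch below; the when-range is illustrative, not the exact bounds used in the recipe:
```python
from spack.package import *

class Dyninst(CMakePackage):
    # ... existing directives ...
    # Mirror the minimum boost required by recent dyninst (cmake/Boost.cmake):
    depends_on("boost@1.70.0:", when="@12:")  # version bound in when= is illustrative
```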
* Current develop spack.bat file cannot handle any reserved characters
being passed via the CLI, particularly '=' and '?'. To address this,
re-do the CLI parsing for loop to use custom logic to allow for more
granular handling of CLI args.
* We take a less-than-ideal approach to escaping local scope and
handling unset variables as well as the actual parsing of CL
arguments. To address this, don't quote the args and then try to
parse the quotes we just added (resulting in spack flags being
undefined). Instead, leverage batch script features. Since we are
not unnecessarily quoting things, we don't need to think about
removing them, and in the case of paths with spaces, we should _not_
be removing the quotes as we currently do.
Corrects libs detection with a more specific root, otherwise there
can be inconsistencies between the version of WGL requested and the
version picked up by `find_libraries`.
Corrects headers detection - win-sdk, win-wdk, and WGL headers all
exist under the same directory, so we can compute the headers for WGL
without querying the spec for win-sdk (which causes errors).
This commit also removes the `plat` variant of `wgl`, which is
redundant with the Spec's target.
- [x] Specs that define 'new' versions in the require: section need to generate associated facts to indicate that those versions are valid.
- [x] add test to verify success with unknown versions.
- [x] remove unneeded check that was leading to extra complexity and test
failures (at this point, all `hash=version` does not require listing out that version
in `packages.yaml`)
- [x] unique index for origin (don't reuse 0)
Co-authored-by: Peter Josef Scheibel <scheibel1@llnl.gov>
* Add v8.4.1, and a few other changes.
Minor adjustments for better alignment between Spack and the ESMF native
build. For ESMF >= 8.3.1, Spack now defaults to using
external-parallelio. Before that the internal version was used, which was
PIO-1 all the way up to v8.3.0b10 anyway! Xerces is disabled by default.
* Deal with two long lines flagged by prechecks/style.
* Try to satisfy prechecks/style.
* Try to satisfy flake8 rules wrt indentation of continuation lines.
* Now trying to satisfy "black reformatting".
* For "black" formatting really put that ugly comma at the end before
closing parentheses. Interesting.
* Support building against external-parallelio even w/o mpi, but select the
external-parallelio dependency accordingly.
* Correct C compiler setting.
* Handle `pnetcdf` variant consistent with how `ParallelIO` does it. And
also pass the `pnetcdf` variant down to the `external-parallelio`
dependency if set.
* Long line formatting again.
* Simplify handling of tarball URL construction and update sha256
checksums.
* Align version check with recommended self.spec.satisfies().
* Deprecate v8.4.0 which has a bug that can cause memory corruption, fixed
in v8.4.1.
* Use double quotes vs single quotes as per style-check... although
https://spack-tutorial.readthedocs.io/en/latest/tutorial_packaging.html#querying-spec-versions
clearly shows it with single quotes.
* ispc: attempts at getting more recent versions to work
* ispc: more attempts to get newer versions to build
* ispc: cleanup
* llvm: remove ispc_patches variant again
* ispc: unpin ncurses
* ispc: satisfy style checks
* ispc: 1.19 is only compatible with LLVM 13-15
otherwise it would not build against develop, as this now has LLVM 16
* ispc: relax LLVM version to what ispc requires itself
verified that it builds against LLVM 13, 14, 15, but not 12 and 16
* ispc: use spec.satisfies instead of version comparison
according to suggestions from review and docs, this is the canonical way to do it
* ispc: checksum 1.18.1
just in order to include all versions, also checked that it builds
---------
Co-authored-by: Martin Aumüller <aumuell@reserv.at>
Add a `find_first` method that locates one instance of a file
that matches a specified pattern by recursively searching a directory
tree. Unlike other `find` methods, this only locates one file at most,
so it can use optimizations that avoid searching the entire tree.
Typically the relevant files are at low depth, so it makes sense to
locate files through iterative deepening and early exit.
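A rough sketch of the iterative-deepening idea (a hypothetical helper, not the actual `find_first` implementation):
```python
import fnmatch
import os

def find_first_sketch(root, pattern, max_depth=8):
    # Search depth 0, then depth 1, ..., so shallow matches are returned
    # without ever descending into deep subtrees.
    for depth in range(max_depth + 1):
        hit = _search_at_depth(root, pattern, depth)
        if hit is not None:
            return hit
    return None

def _search_at_depth(path, pattern, depth):
    try:
        entries = sorted(os.listdir(path))
    except OSError:
        return None
    if depth == 0:
        for name in entries:
            if fnmatch.fnmatch(name, pattern):
                return os.path.join(path, name)  # early exit on first match
        return None
    for name in entries:
        sub = os.path.join(path, name)
        if os.path.isdir(sub):
            hit = _search_at_depth(sub, pattern, depth - 1)
            if hit is not None:
                return hit
    return None
```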
Update tcl and lmod modulefile templates to provide more information in the
help message (name, version and target), as is done in whatis for lmod
modulefiles.
This resolves a loose end from #36241 (missed due to package name). `libpthread-stubs` is another package from the xcb project that is now tracked through https://gitlab.freedesktop.org/xorg/ instead. No new versions; no changed hashes.
* py-iniconfig: new version 2.0.0 (-> hatchling)
py-iniconfig switched to hatchling with v2.0.0:
https://github.com/pytest-dev/iniconfig/blob/v2.0.0/pyproject.toml
```
==> py-iniconfig: Successfully installed py-iniconfig-2.0.0-ttoip2aalmxqqybv3vnozcabk47vg2yn
```
* Update var/spack/repos/builtin/packages/py-iniconfig/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Adapt tcl and lmod modulefile templates to generate append-path or
remove-path commands in the modulefile when append_flags or
remove_flags commands, respectively, are defined in the package for the run
environment.
Fixes #10299.
* Add option to optionally build with CMake
* Autotools is preferred where available
* Unlike the autotools-based build, the CMake-based build creates
either static or shared libs, not both (the default is shared
and is controlled with a new "shared" variant that only exists
when building with cmake)
* Note that `cmake~ownlibs` depends on expat, so would require
`expat build_system=autotools` (to avoid a cyclic dependency)
Simplify environment modification block in modulefile Tcl template by
always setting a path delimiter to the prepend-path, append-path and
remove-path commands.
Remove the --delim option from the setenv command, as this command does not
allow such an option.
Update test_prepend_path_separator test to explicitly check the 6
path-like commands that should be present in generated modulefile.
* Add a pre-check job that just bootstraps the environment on Python 3.6
* py-typing-extension: restore information on Python 3.6 installation
* Fix job name, try to run quick test on installed python packages
fixes #36339
We were missing a rule that enforced a match between
the `node_compiler` and the compiler used to satisfy
a requirement.
Fix compiler with custom, made up version too
* gcc: fix for apple-clang conflict
* Update var/spack/repos/builtin/packages/gcc/package.py
Use variant by @adamjstewart
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Specs that define 'new' versions in the require: section need to generate
associated facts to indicate that those versions are valid.
* add test to verify success with unknown versions.
Since environment-modules has supported autoloading since 4.2,
and Spack builds of it enable it by default, use the same autoload
default for tcl as for lmod.
* add opppy-0_1_6 and opppy-0_1_7 releases to the spack recipes
* update urls
* remove sphinx from the dependency list
* cleanup OPPPY versions to capture OPPPY-0_1_1 tag discrepancy
* one more attempt at fixing the url for opppy-0_1_1 (simpler fix)
---------
Co-authored-by: Cleveland <cleveland@lanl.gov>
Co-authored-by: clevelam <clevelam@users.noreply.github.com>
* update python package
* change package inheritance
* small update
* enable cpp tests
* small update
* Add flaky package
* Restructure PennyLane deps and order
* Change Lightning defaults and add libomp support for MacOS
* Replace explicit git url with PyPI
* Add Flaky support
* Update PennyLane and PennyLane Lightning support
* fix format
* update packages versioning
* Add patching and default updates for lightning package
* Format
* fix patch version
* update py-flaky package
* update py-pennylane-lightning package
* update py-pennylane package
* remove explicit python dependency
* Remove redundant lines from patch-file
* Update SHA for new patch
* Initial commit for PLLKokkos.
* Comment verbose variant.
* Update develop commit version and restore verbose option.
* Add backends.
* Add mesa package dep (libxml2). Fix rocm install for py-pennylane-lightning-kokkos.
* Restore sycl backend.
* Revert mesa package.
* Make py-pe-li-kokkos into CudaPackage, ROCmPackage.
* Do not force kokkos+wrapper when +cuda
* Few mods following comments on py-pll.
* Update versions of py-pennylane*.
* Remove py-pennylane-lightning patch.
* Remove redundant preferred=True.
* Fix lint in py-pennylane-lightning-kokkos.
* Update var/spack/repos/builtin/packages/py-pennylane-lightning-kokkos/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Ninja and pip not required at runtime. Set lower bound on PL/PLL versions.
* Remove v0.29.0 from pennylane.
* Add AmintorDusko as maintainer.
---------
Co-authored-by: AmintorDusko <amintor_dusko@hotmail.com>
Co-authored-by: Lee J. O'Riordan <lee@xanadu.ai>
Co-authored-by: Amintor Dusko <87949283+AmintorDusko@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The xcb-utils have been migrated to the gitlab.freedesktop.org, from the
previous separate location. That means that a URL change is needed to
pick up newer versions
([ref](https://lists.freedesktop.org/archives/xcb/2022-October/011422.html)).
This replaces the `homepage` and `url` with the latest (to an `xz`
file), adds a `url_for_version` function to resolve past versions, and
add the latest versions. Because of the `url_for_version` I don't think
we can use the `xorg_mirror_path` approach here.
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
netcdf-cxx and netcdf-c now build with CMake rather than Autotools.
netcdf-c can still optionally build with Autotools (but defaults to
CMake). With some additional patches to the CMake files, netcdf-c
can use CMake to build on Windows.
* abinit: add version 9.8.3
* require hdf5 up to 1.8 and libxc up to version 5
* abinit: constrained versions of libxc and hdf5
* fixed bad syntax for format
* fixed error looking for fftw in spec.
* Changed to look for fftw-api in spec.
* Update var/spack/repos/builtin/packages/abinit/package.py
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Xorg apps: updated versions to current latest
This updates all xorg apps to the latest versions, adding updated
requirements where needed.
No major version increases in any packages.
Minor version increases in some packages (build changes, if any, are
indicated below):
- rgb
- xauth
- xcalc
- xclock
- xeyes: xi >= 1.7, x11-xcb xcb-present >= 1.9 xcb-xfixes xcb-damage
- xfontsel
- xfs: xfont2 >= 2.0.1
- xinit
- xpr
- xrdb
Bugfix version increases in many packages, with no expected impact on
dependencies or interfaces.
Summary of dependency changes:
- xeyes:
- depends_on("libxi@1.7:", when="@1.2:")
- depends_on("libxcb@1.9:", when="@1.2:")
- xfs:
- depends_on("libxfont@1.4.5:", when="@:1.1")
- depends_on("libxfont2@2.0.1:", when="@1.2:")
* setxkbmap: depends_on libxrandr when @1.3.3:
* constype: new version
Add support for building with CMake and make it the default build
system on all platforms. By doing this, lz4 can now be built on
Windows. The makefile-based build remains as an option.
* wayland: new versions, new build system (meson)
* wayland-protocols: new version, new build system (meson)
* [@spackbot] updating style on behalf of wdconinc
* wayland-protocols: added maintainer
* wayland: added maintainer
* wayland-protocols: no need to import build systems, per flake8
* wayland: no need to import build system, per flake8
* Update var/spack/repos/builtin/packages/wayland/package.py
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
If you have a "require:" section in your packages config, and you
use it to specify a list of requirements, the list elements can
now include strings (before this, each element in the list had to
be a `one_of` or `any_of` specification, which is awkward if you
wanted to apply just one spec with no alternatives).
* py-pdf2image: new package
* py-pdf2image: 1.16.3 source now available on pypi
* Update var/spack/repos/builtin/packages/py-pdf2image/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Example one:
```
spack install --add x y z
```
is equivalent to
```
spack add x y z
spack concretize
spack install --only-concrete
```
where `--only-concrete` installs without modifying spack.yaml/spack.lock
Example two:
```
spack install
```
concretizes current spack.yaml if outdated and installs all specs.
Example three:
```
spack install x y z
```
concretizes current spack.yaml if outdated and installs *only* concrete
specs in the environment that match abstract specs `x`, `y`, or `z`.
* update versions and arch flags
* style update
* more style issues
* fix hashes and testing problem
* return the old versions, but they are really bad
* fix style
---------
Co-authored-by: Gerald Ragghianti <gerald@ragghianti.com>
The `ignore` parameter was only used for `spack activate/deactivate`, and it isn't used
by Spack Environments which have their own handling of file conflicts. We should remove it.
Everything that handles `ignore=` was removed in #29317 and included in 0.19, when we
removed `spack activate` and `spack deactivate` in favor of environments. So all of these
usages removed here were already being ignored by Spack.
Adapt tcl modulefile template to call "module load" on autoload
dependency without testing if this dependency is already loaded or not.
The is-loaded test is not necessary, as module commands know how to cope
with an already loaded module. With environment-modules 4.2+ (released
in 2018) it is also important to have this "module load" command even if
dependency is already loaded in order to record that the modulefile
declares such dependency. This is important if you want to keep a
consistent environment when a dependent module is unloaded.
The "Autoloading" verbose message is also removed as recent module
commands will report such information to the user (depending on the
verbosity configured for the module command).
This change has been tested successfully with Modules 3.2 (EL7), 4.5 (EL8)
and 5.2 (latest) and also with Lmod 7 and 8 (as it is mentioned in the
Spack docs that Lmod can be used along with tcl modules). Dependencies
are correctly loaded or unloaded, whether or not they were already
loaded.
This change fixes Tcl quoting issue introduced in #32853.
Fixes #19155.
* py-setuptools-git-versioning: new package
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-hepunits: new versions 2.2.0, 2.2.1, 2.3.0, 2.3.1
Python 2 support dropped in 2.2 series.
Ref: https://github.com/scikit-hep/hepunits/compare/v2.1.1...v2.3.1
* py-hepunits: py-hatchling as of version 2.3
* [@spackbot] updating style on behalf of wdconinc
* py-hepunits: only depends_on toml through 2.1.1
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
In the Windows filesystem logic for creating a symlink, we intend to
fall back to a copy when the symlink cannot be created (for some
configuration settings on Windows it is not possible for the user
to create a symlink). It turns out we were overly-broad in which
exceptions lead to this fallback, and the subsequent copy would
also fail: at least one case where this occurred is when we
attempted to create a symlink that already existed.
The updated logic expressly avoids falling back to a copy when the
file/symlink already exists.
* Bazel: limit parallelism
* Patch packages that don't directly invoke bazel
* Style fixes
* flag comes after build, not bazel
* flag comes after build, not bazel
* command is only an attribute of specific packages
* ASP-based solver: use satisfies instead of intersects
They are semantically equivalent for concrete versions,
but the GitVersion.intersects implementation is buggy
* Mitigation for git version bug
fixes #36134
This commit works around the issue in #36134, by using
GitVersion.satisfies instead of GitVersion.intersects
There are still underlying issues when trying to infer the
"reference version" when no explicit one is given, but:
1. They are not reproducible with our synthetic repo
2. They occur only when the `git.<xxx>` form of Git version
is used
Here we just work around the user facing issue and ensure
the tests are correct with our synthetic repository.
* py-pytorch-lightning: add v2.0.0
* py-lightning-utilities: add v0.8.0
* Update all PyTorch packages
* Open-CE does not yet have patches for PyTorch 2 on ppc64le
For `spack install --test=all gromacs`
* remove the `test` target from the `check()` call and just use
the `check` target, in accordance with usual GROMACS test protocol
* build the test binaries explicitly during the build phase
Additional minor updates are necessary. This change
updates the package structure to the newer format with a
separate Builder class so we can override `check()`.
However, note that additional modernization should be
undertaken with care.
This PR does 2 unrelated things:
1. It changes the encoding of the compilers
2. It tweaks the heuristic for the solves in a0d8817907
Both were initially motivated by trying to get a performance gain but, while 2 showed significant speed-ups[^1], 1 instead didn't. I kept it anyhow, since I think the code related to compilers is more consolidated with the new encoding and we might get some performance improvement out of it if we can base our errors on the `node_compiler(Package, CompilerID)` atoms instead of `attrs`.
[^1]: In general the changes in the heuristic brought a ~10% speed-up on the tests I did. I'll post detailed results below.
Add a warning about compilers.yaml that is triggered if there are multiple compilers with the same spec, os and
target (since they can't be selected by users with the spec syntax only).
* [openmpi] 5.0.0.rc10 onwards needs munge
This is the error you will see when munge is missing from `PKG_CONFIG_PATH`:
```
configure:63942: checking for pmix pkg-config cflags
configure:63956: check_package_pkgconfig_run_results=Package munge was not found in the pkg-config search path.
Perhaps you should add the directory containing `munge.pc'
to the PKG_CONFIG_PATH environment variable
Package 'munge', required by 'pmix', not found
configure:63959: $? = 1
configure:63966: pkg-config output: Package munge was not found in the pkg-config search path.
Perhaps you should add the directory containing `munge.pc'
to the PKG_CONFIG_PATH environment variable
Package 'munge', required by 'pmix', not found
configure:63972: result: error
configure:63974: error: An error occurred retrieving pmix cppflags from pkg-config
```
* Use same PKG_CONFIG_PATH defaults for ompi+pmix+prrte
The issue I tried to fix in https://github.com/spack/spack/pull/36105 comes from
different default search paths in different `pkg-config` executables used in
`openmpi` and `pmix` packages. As these tools (`openmpi`, `pmix`, and `prrte`)
all use the same mechanisms to detect dependencies, the `pkg-config` environment
they use should also be equal.
Since GPG clear-sign cannot deal with lines longer than 19995 characters
and doesn't even error but simply truncates those lines (don't ask me
why...), we have to be careful not to hit that line limit when reducing
the filesize.
So, instead this PR sets the indent level to 0 and drops the whitespace
after `: `, which still reduces file size by 50% or so.
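In Python's json module terms, that corresponds roughly to the following (a sketch, assuming the metadata is serialized with `json.dumps`):
```python
import json

data = {"spec": {"name": "zlib", "version": "1.2.13"}}

# indent=0 puts every key on its own line (keeping lines short enough for
# GPG clear-signing), while separators drops the space after ":" to save bytes.
compact = json.dumps(data, indent=0, separators=(",", ":"))
print(compact)
```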
Per feedback from the UCX community, we rarely do update
releases to anything but the current and one previous main
release stream.
Update comments in the UCX spack file to reflect this.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
* py-pyshacl: patch dependency typo
* py-pyshacl: satisfy flake8
* Update var/spack/repos/builtin/packages/py-pyshacl/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Adds builders appropriate for building these packages on Windows.
It is intended that builds on other platforms are unaffected (e.g.
they build with Autotools as before on Linux).
* py-pint: new versions
* Update var/spack/repos/builtin/packages/py-pint/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pint/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pint/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-antspyx: new package
Also adds required dependencies.
Requires options to ITK to enable the right support libraries, and
patches to tune the setup and provide resources rather than
downloading libraries/"submodules" on the fly.
* Fix patch URL
* Style fixes.
* bump version and re-include `git clone ...` as resource
... and use colors in disambiguate message for clarity.
This commit avoids the loop:
```
for root in roots:
for dep in deps(root):
...
```
instead it ensures each node is visited once and only once.
Also adds a small optimization when searching for concrete specs, since
we can assume uniqueness of dag hash, so it's fine to early exit.
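A generic sketch of the visit-once pattern (names are illustrative, not Spack's traversal API):
```python
def visit_once(roots, children):
    """Yield every node reachable from any root exactly once."""
    seen = set()
    stack = list(roots)
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        yield node
        stack.extend(children(node))
```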
This adds a new mode for `concretizer:reuse` called `dependencies`,
which only reuses dependencies. Currently, `spack install foo` will
reuse older versions of `foo`, which might be surprising to users.
* update mda dependencies
* apply black
* mdanalysis draft
* update
* small fixes
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ci: version bump for ghcr.io/spack/e4s-amazonlinux-2
This new image comes with GnuPG v2.4.0
* py-cython: upperbounds for Python versions
* fix py-gevent nonsense
---------
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
* CI configuration boilerplate reduction and refactor
Configuration:
- New notation for list concatenation (prepend/append)
- New notation for string concatenation (prepend/append)
- Break out configuration files for: ci.yaml, cdash.yaml, view.yaml
- Spack CI section refactored to improve self-consistency and
composability
- Scripts are now lists of lists and/or lists of strings
- Job attributes are now listed under a precedence-ordered list that is
composed/merged using Spack config merge rules.
- "service-jobs" are identified explicitly rather than as a batch
CI:
- Consolidate common, platform, and architecture configurations for all CI stacks into composable configuration files
- Make padding consistent across all stacks (256)
- Merge all package -> runner mappings to be consistent across all
stacks
Unit Test:
- Refactor CI module unit-tests for refactor configuration
Docs:
- Add docs for new notations in configuration.rst
- Rewrite docs on CI pipelines to be consistent with refactored CI
workflow
* Script verbose environ, dev bootstrap
* Port #35409
By setting the traversal depth to 1, only specs matching the changed
package and direct dependents of those (and of course all dependencies
of that set) are removed from pruning candidacy.
* py-deap: newer version can use newer setuptools
* Update var/spack/repos/builtin/packages/py-deap/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The overlapping dependency version ranges caused the concretizer to pick
version 7.1 even though version 8.0 is available:
```
==> Error: No version for 'cubelib' satisfies '@4.7.1' and '@4.8'
```
Moreover, Score-P 8.0 requires libbfd:
```
configure: error: bfd.h required
```
* Provide openmp from rocm-openmp-extras when tensile uses openmp
* Correcting audit check failure in rocm-openmp-extras dependency
* Fixing style check error
* rocm-openmp-extras required instead of llvm-amdgpu for both variants
When untouched spec pruning is enabled, specs possibly affected
by a change cannot be pruned from a pipeline.
Previously spack looked at all specs matching changed package
names, and traversed dependents of each, all the way to the
environment root, to compute the set of environment specs
possibly affected by a change (and thus, not candidates for
pruning).
With this PR, when untouched spec pruning is enabled, a new
environment variable can control how far towards the root spack
traverses to compute the set of specs possibly affected by a
change. SPACK_UNTOUCHED_PRUNING_DEPENDENT_DEPTH can be set
to any numeric value before the "spack ci generate" command
is called to control this traversal depth parameter. Setting
it to "0" traverses only touched specs, setting it to "1"
traverses only touched specs and their direct dependents, and
so on. Omitting the variable results in the previous behavior
of traversing all the way to the root. Setting it to a negative
value means no traversal is done, and always yields an empty
set of possibly affected specs (which would result in the max
pruning possible).
Currently `spack buildcache create` creates compressed tarballs that
differ between each invocation, thanks to:
1. The gzip header containing mtime set to time.time()
2. The generated buildinfo file which has a different mtime every time.
To avoid this, you have to explicitly construct GzipFile yourself, since
the Python API doesn't expose the mtime arg, and we have to manually
create the tarinfo object for the buildinfo metadata file.
Normalize modes: regular files and hardlinks that are executable by the user, as well as dirs and symlinks, get 0o755 permissions in the tarfile; other files get 0o644.
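A sketch of the reproducible-gzip part (file names and contents are made up; the real code also normalizes the tarinfo of the generated buildinfo file):
```python
import gzip
import io
import tarfile

def write_reproducible_tarball(path, files):
    # gzip.GzipFile lets us pin mtime=0; gzip.open() would stamp time.time().
    with open(path, "wb") as f, gzip.GzipFile(
        filename="", mode="wb", fileobj=f, mtime=0
    ) as gz, tarfile.open(fileobj=gz, mode="w") as tar:
        for name, data in sorted(files.items()):
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            info.mtime = 0                 # constant mtime for every member
            info.mode = 0o644              # normalized permissions
            info.uid = info.gid = 0
            info.uname = info.gname = ""
            tar.addfile(info, io.BytesIO(data))

write_reproducible_tarball("buildcache.tar.gz", {"buildinfo": b"{}"})
```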
* "new py-thop package"
* [@spackbot] updating style on behalf of Sangu-Mbekelu
* Update package.py
modified the url and dependencies
---------
Co-authored-by: Sangu Mbekelu <s.mbekelu9@gmail.com>
* py-openmesh: new package
* Update var/spack/repos/builtin/packages/py-openmesh/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This commit formalizes `satisfies(lhs, rhs, strict=True/False)`
and splits it into two functions: `satisfies(lhs, rhs)` and
`intersects(lhs, rhs)`.
- `satisfies(lhs, rhs)` means: all concrete specs matching the
left hand side also match the right hand side
- `intersects(lhs, rhs)` means: there exist concrete specs
matching both lhs and rhs.
`intersects` now has the property that it's commutative,
which previously was not guaranteed.
For abstract specs, `intersects(lhs, rhs)` implies that
`constrain(lhs, rhs)` works.
What's *not* done in this PR is ensuring that
`intersects(concrete, abstract)` returns false when the
abstract spec has additional properties not present in the
concrete spec, but `constrain(concrete, abstract)` will
raise an error.
To accomplish this, some semantics have changed, as well
as bugfixes to ArchSpec:
- GitVersion is now interpreted as a more constrained
version
- Compiler flags are interpreted as strings since their
order is important
- Abstract specs respect variant type (bool / multivalued)
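For example (a sketch assuming the public `Spec.satisfies`/`Spec.intersects` methods; exact results depend on the Spack version):
```python
from spack.spec import Spec

lhs = Spec("zlib@1.2.13")
rhs = Spec("zlib@1.2:")

print(lhs.satisfies(rhs))   # True: every concrete zlib@1.2.13 also matches zlib@1.2:
print(rhs.satisfies(lhs))   # False: zlib@1.2: admits versions other than 1.2.13
print(lhs.intersects(rhs))  # True, and intersects() is commutative
print(rhs.intersects(lhs))  # True
```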
* New package: py-imbalanced-learn
* Fix typo
* [@spackbot] updating style on behalf of meyersbs
* Update var/spack/repos/builtin/packages/py-imbalanced-learn/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update/fix py-meldmd; update openmm
* Restrict filter_file based on openmm version
* Updates based on Adam's feedback
* [@spackbot] updating style on behalf of meyersbs
* Break up long filter_file
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The Cray fortran compiler names fortran modules in uppercase by
default. Compile with the "-ef" flag to produce the lowercase
name that singularity-eos is expecting.
Two fixes:
1. `-Wl,a,b,c,d` is a comma-separated list of linker arguments; we
incorrectly assumed key/value pairs, which runs into issues with, for
example, `-Wl,--enable-new-dtags,-rpath,/x`
2. `-Xlinker,xxx` is not a thing, so it shouldn't be parsed.
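A tiny illustration of the correct interpretation (a generic parsing sketch, not the wrapper code itself):
```python
flag = "-Wl,--enable-new-dtags,-rpath,/x"

# Everything after "-Wl," is a flat, comma-separated list of linker arguments,
# not key/value pairs:
linker_args = flag[len("-Wl,"):].split(",")
print(linker_args)  # ['--enable-new-dtags', '-rpath', '/x']
```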
* httpie: add v3.2.1
* Add additional 3.2.1 dependencies to httpie
* Add version condition to dependency
* Reorder dependencies for efficiency
* Update var/spack/repos/builtin/packages/httpie/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* awscli: add v1.27.84
* Add botocore dependency to awscli
* Update var/spack/repos/builtin/packages/awscli/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add py-botocore@1.29.84 dependency
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* update mda dependencies
* apply black
* Update var/spack/repos/builtin/packages/py-gsd/package.py
* Update var/spack/repos/builtin/packages/py-griddataformats/package.py
* Update var/spack/repos/builtin/packages/py-griddataformats/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* remove numpy upper bound
* Update var/spack/repos/builtin/packages/py-griddataformats/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-gsd/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* update proj
* re-add autotools support
* style
* Setup env in builders
* Drop direct windows conflict for older versions
* Default to CMake
Add new style class definition
* Proj: setup_run_environment in package not builder
* Proj: move run env changes to pkg, rm cmake arg
* Set PROJ_LIB during build
* Style
* Rm redundant configure arg
Currently, if two compilers with the same spec differ on the flags, the concretizer will:
1. mix both sets of flags for the spec in the ASP program
2. error noting that the set of flags from the compiler (both of them) doesn't match the set from the lower priority compiler
This PR fixes both -- only flags from the highest priority compiler with a given spec are considered.
* py-elasticsearch: new versions
Also add py-elastic-transport as a new dependency
* py-elasticsearch: py-urllib3 is no longer a dependency
* Update var/spack/repos/builtin/packages/py-elasticsearch/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
`mypy` only understands `sys.platform == "win32"`, not indirect assignments of that
value to things like `is_windows`. If we don't use the accepted platform checks, `mypy`
registers many Windows-only symbols as not present on Linux, when it should skip the
checks for platform-specific code.
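Concretely, mypy narrows on the literal check but not on an alias (sketch):
```python
import sys

is_windows = sys.platform == "win32"

if sys.platform == "win32":
    import winreg  # mypy knows this branch is Windows-only

if is_windows:
    # mypy does NOT narrow here, so Windows-only symbols in this branch
    # would be flagged as missing when checking on Linux.
    pass
```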
* add 2.14.2 py-astroid version
* add py-pylint 2.26.2
* fix black
* fix py-dill depends_on
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fix py-astroid minor versionning
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* modify typing_extensions depends_on
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Start using paths found in extra_rpaths in compilers.yaml when building
* running black and changing maintainer list
* changing import order to pass isort
---------
Co-authored-by: Matthew Lieber <lieber.31@osu.edu>
Update `spack.util.environment` to remove legacy idioms.
* Remove kwargs from method signature and use a class for traces
* Uppercase a few global variables
* spack.util.environment: add type-hints
* Improve docstrings
* Fixed most style issues reported by pylint
* py-ipdb: updating versions
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
* py-ipdb: fixing versions problem and deleting 10.1 which is too old for Python > 3.6
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
* Update var/spack/repos/builtin/packages/py-ipdb/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-ipdb: removed useless dependencies
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
* Update var/spack/repos/builtin/packages/py-ipdb/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-ipdb/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-ipdb: missing @
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
* Update var/spack/repos/builtin/packages/py-ipdb/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* e3sm-scorpio: add e3sm-scorpio package
This is the Scorpio package from the e3sm.org site.
* fixed style errors
* removed unneeded dependency on cmake
Use correct `shlib_symbol_version` for Julia 1.8, work around issue where libuv-julia's git checkout has arbitrary mtime, causing make to regenerate configure scripts, sometimes.
* Add a `py-gmxapi` package.
This package provides the Python package for the GROMACS
public API. The Python package is not strongly coupled to
a specific GROMACS _version_, but its compiled extension module
is strongly coupled to a specific GROMACS _installation_.
* Update conflict info.
In order to allow `^gromacs@2022.1` while rejecting `^gromacs@2022`,
we need to compare to `gromacs@2022.0`.
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Apply suggestions from code review.
* Simplify build system structure.
* Update dependencies for completeness.
* Update var/spack/repos/builtin/packages/py-gmxapi/package.py
Per code review, pretend gmxapi <0.4 doesn't exist, for simplicity.
* Update var/spack/repos/builtin/packages/py-gmxapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-gmxapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
a shared library /lib64/libcxi.so, which seems to also appear on other
non-slingshot systems. This patch also checks to make sure that there
is a Cray programming environment in /opt/cray/pe in addition to the
shared library.
* py-pygments 2.12; fix py-docutils, again
`2.12` is the latest for which our style hack works, beyond that we need
our own package to make a plugin.
Old docutils needs old setuptools
* py-setuptools is always a dep
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update the range
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The Intel OneAPI's extreme pickiness continues to bring out
buggy/noncompliant code.
This patch fixes an error in the configure.in embedded 'c' test code
and also in a file with an initialized, but unused, variable.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
* Updates to release 0.6.
* Dep updates
* Dep version fix
* Another version fix
* Fix typo
* UFL version fix
* Update var/spack/repos/builtin/packages/py-fenics-dolfinx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fenics-ffcx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Some updates following review
* Update var/spack/repos/builtin/packages/py-fenics-dolfinx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* More updates
* More updates
* build/run updates
* Small fix
* Fix version number.
* specify lower bounds for python dependencies
* address style issues
* address style issues
* address PR comments
* amend setuptools dependency to be of type build only
* amend setuptools dependency to be of type run and build for ffcx and ufl
* add build dependency to ensure import tests pass
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Matthew Archer <ma595@cam.ac.uk>
* py-dask-mpi: new package with dependencies
* py-hatch-jupyter-builder is not needed after all
* skip_modules seems cleaner
* Update var/spack/repos/builtin/packages/py-jupyter-server-proxy/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-simpervisor/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Flags `-fallow-argument-mismatch -fallow-invalid-boz` set in `FFLAGS`/`FCFLAGS`
environment variables don't really have an effect in older versions of WRF, so we need
to force them in the compiler wrappers.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
With the last merge request for OOMMF [1], the intention was to have version
20b0_20220930 as the preferred version, and provide 20b0_20220930-vanilla as an
additional version for the unlikely case anybody needed that.
I made the (wrong) assumption that the `version` listed first in the `package.py` file
would be the preferred version. This merge request is to correct that by
explicitly tagging the preferred version with `preferred=True`.
[1] https://github.com/spack/spack/pull/33072/files
If the docbook packages
- docbook-xml
- docbook-xsl
are installed in a spack environment view the catalog files will be in
conflict in the view directory. This PR resolves that by adding an
appropriate prefix to each catalog name so that they are unique in the
view. The resulting XML_CATALOG_FILES environment variable will then be
able to point to both of them.
1. add version 2023.03.01
2. add variant 'python' that supports unwinding python source
3. clean up some things with the cray variant
4. require the latest libmonitor
5. fix sha256 checksum for url patch
6. delete rocm 5.3 from older versions
* cleaned up style, linked to external htslib
* removed htslib/bcftools/samtools deps, use bundled libs instead
the pysam package includes the necessary libs to link to, so it wasn't even using linked libs when building
* fixed style
* revert to using external htslib
currently uses bundled samtools and bcftools, and there is no way to use external versions for those dependencies
* added libs property to htslibs package
added support for lib64
* added htslib name
* onednn: add variant to use Arm Compute Library on aarch64
* Update cmake version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Shorten macro definition
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update cpu/gpu_runtime variants
* Update acl variant when 1.7+
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [@spackbot] updating style on behalf of annop-w
* Add dependencies for new runtimes
* Fix dependency package name to oneapi-level-zero
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-awkward: new version 1.10.*, 2.0.*
Lots of changes in 2.0.*, see https://github.com/scikit-hep/awkward/releases. This will need some extra testing.
* py-awkward: hatchling
* Update var/spack/repos/builtin/packages/py-awkward/package.py
* Update var/spack/repos/builtin/packages/py-awkward/package.py
* py-scikit-build-core: new and improved py-scikit-build
* py-awkward-cpp: new package
* py-awkward: add depends_on py-awkward-cpp
* py-awkward: depends_on py-packaging
* py-awkward-cpp: new versions pinned by py-awkward
* py-scikit-build-core: additional depends_on
* py-awkward: branch master deprecated
* py-pytest-subprocess: new package
* py-pytest: new version 7.2.1
* py-scikit-build-core: add tests dependencies
* [@spackbot] updating style on behalf of wdconinc
* py-scikit-build-core: two more test dependencies
* py-pytest: depends_on py-exceptiongroup
* py-awkward: add pytest support
* py-pytest: suggestions from review
* py-scikit-build-core: suggestions from review
* Update var/spack/repos/builtin/packages/py-awkward-cpp/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-awkward: depends_on pyyaml when @:1, order old deps last
* [@spackbot] updating style on behalf of wdconinc
* py-awkward: move some opt deps to test, order test deps
* py-awkward: remove test dependencies
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
If the dump file already existed it was not truncated, resulting in a file with an
unaltered size: the new content was written at the beginning, "padded" with the tail
of the old content, since the new content was not enough to overwrite it.
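A minimal sketch of the failure mode and the fix (generic Python, not the actual Spack code):
```python
import os

def write_dump_buggy(path, content):
    # "r+" keeps existing bytes: shorter new content leaves the old tail behind.
    with open(path, "r+") as f:
        f.write(content)

def write_dump_fixed(path, content):
    # Truncate after writing (or simply open with "w") so only the new content remains.
    with open(path, "r+") as f:
        f.write(content)
        f.truncate()

with open("dump.txt", "w") as f:
    f.write("old content that is fairly long")
write_dump_buggy("dump.txt", "new")
print(open("dump.txt").read())   # "new content that is fairly long" (old tail kept)
write_dump_fixed("dump.txt", "new")
print(open("dump.txt").read())   # "new"
os.remove("dump.txt")
```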
`colify` is an old module in Spack that still uses `**kwargs` liberally.
We should be more explicit. Doing this eliminates the need for many
checks (can't pass the wrong arg if it isn't allowed) and makes the
function documentation more clear.
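A hedged illustration of the signature change (argument names are illustrative, not the exact colify API):
```python
# Before: options vanish into **kwargs, so a misspelled option passes silently.
def colify_old(elts, **kwargs):
    indent = kwargs.get("indent", 0)
    width = kwargs.get("width", 80)
    return indent, width

# After: explicit keyword-only arguments; unknown options raise a TypeError
# and the accepted options show up in the function's documentation.
def colify_new(elts, *, indent=0, width=80, cols=0, padding=2):
    return indent, width, cols, padding

colify_new(["a", "b"], indent=2)        # fine
# colify_new(["a", "b"], indnet=2)      # TypeError: unexpected keyword argument
```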
Fixes a bug introduced in 44ed0de8c0
where the push method of binary_distribution now takes named args
include_root and include_dependencies, to avoid the **kwargs hole.
But the call site wasn't updated and we passed a dict of keys/values instead
of arguments, which resulted in a call like this:
```
push(include_root={"include_root": True, "include_dependencies": False})
```
This commit fixes that, and adds a test to see if we push the correct packages.
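A small sketch of the fix, with a stub standing in for binary_distribution.push (the exact signature is an assumption based on the description above):
```python
def push(*specs, include_root=True, include_dependencies=True):
    """Stub for binary_distribution.push; the real signature may differ."""
    return include_root, include_dependencies

# Buggy call shape: a dict passed as the value of a single keyword argument.
push(include_root={"include_root": True, "include_dependencies": False})

# Fixed call: each option passed as its own named argument.
push(include_root=True, include_dependencies=False)
```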
This error shows up a lot, and typically it's harmless because an error
happened before the source build even started, in which case we don't
have build logs to copy. So, warn instead of error, because it distracts
from the actual CI error.
Currently we attempt to setup the build environment even when
dependencies are not installed, which typically results in error while
searching for libraries or executables in a dependency's prefix.
With this change, we get a more user friendly error:
```
$ spack build-env perl
==> Error: Not all dependencies of perl are installed, cannot setup build environment:
- qpj6dw5 perl@5.36.0%apple-clang@14.0.0+cpanm+open+shared+threads build_system=generic arch=darwin-ventura-m1
- jq2plbe ^berkeley-db@18.1.40%apple-clang@14.0.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=darwin-ventura-m1
...
$ echo $?
1
```
* Allow users to specify root env dir
Environments managed by spack have some advantages over anonymous Environments
but they are tucked away inside spack's directory tree. This PR gives
users the ability to specify where the environments should live.
See #32823
This is also taken as an opportunity to ensure that all references are to "managed environments",
rather than "named environments". Prior to this PR some references to the latter persisted.
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
* Update exago w/ 1.5.1 and small updates to hiop.
* Fix styling.
* Add RAJA back to ExaGO package.
* Update RAJA requirement for ExaGO and HiOp.
* Update last RAJA requirement in HiOp.
* Add new sphinx rtd theme release 1.2.0
The new release helps with supporting more recent version of docutils
* set docutils officially supported version
* add jquery dependency for sphinx-rtd-theme
* add conflict with jquery version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* correct dependency
* fix version dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* set sphinx version
* fix sha256
* add version for flit-core
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The call:
```
x.satisfies(y[, strict=False])
```
is commutative, and tests non-empty intersection, whereas:
```
x.satisfies(y, strict=True)
```
is not commutative, and tests set-inclusion.
There are 2 fast paths. When strict=False both self and other need to
be concrete, when strict=True we can optimize when other is concrete.
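A set-based analogy of the two semantics (not the actual Spec implementation):
```python
def satisfies(x, y, strict=False):
    """x and y stand for the sets of concrete specs each abstract spec admits."""
    if strict:
        return x <= y       # set inclusion: not commutative
    return bool(x & y)      # non-empty intersection: commutative

a = {"gcc@9", "gcc@10"}
b = {"gcc@10", "gcc@11"}
assert satisfies(a, b) and satisfies(b, a)            # intersection is symmetric
assert not satisfies(a, b, strict=True)               # a is not a subset of b
assert satisfies({"gcc@10"}, b, strict=True)          # {gcc@10} is a subset of b
```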
a) It's used by site administrators, so it's niche
b) If it's used by site administrators, they likely need to modify the config anyhow, so the default config only serves as an example to get started
c) it's too arbitrary to enable tcl, but disable lmod
Spack generally ignores file-file projection clashes in environment
views, but would eventually error when linking the `.spack` directory
for two specs of the same package.
This leads to obscure errors where users have no clue what the issue is
and how to fix it. On top of that, the error comes very late, since it
happens when the .spack dir contents are linked (which happens after
everything else)
This PR improves that by checking ahead of time whether clashes are
anticipated (by simply checking for clashes in the projection of each
spec's .spack metadir). If there are clashes, a human-readable error is
thrown which shows two of the conflicting specs, and tells users to use
unify:true, view:false, or set up custom projections.
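A minimal sketch of such an ahead-of-time check, assuming a hypothetical `projection_for(spec)` helper that returns where a spec's `.spack` metadata directory would land in the view:
```python
def find_metadata_clashes(specs, projection_for):
    """Return pairs of specs whose projected .spack dirs would collide."""
    seen = {}      # projected path -> first spec that claimed it
    clashes = []
    for spec in specs:
        path = projection_for(spec)
        if path in seen:
            clashes.append((seen[path], spec))
        else:
            seen[path] = spec
    return clashes

# Two specs of the same package project to the same metadata dir -> clash.
projection = lambda s: "view/.spack/" + s.split("@")[0]
print(find_metadata_clashes(["zlib@1.2.11", "zlib@1.2.13"], projection))
```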
* add pytng
* black
* add setuptools
* fix
* Update var/spack/repos/builtin/packages/py-pytng/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pytng/package.py
* Update var/spack/repos/builtin/packages/py-pytng/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Kokkos, when compiled by Spack without +wrapper, could potentially capture the Spack compiler wrappers, resulting in CMake configs and kokkos_launch_compiler trying to run the Spack compiler wrapper after installation.
The checksum exception was not detailed enough and not reraised when using cache only, resulting in useless error messages.
Now it dumps the file path, expected
hash, computed hash, and the downloaded file summary.
Batch scripts in general will not function without carriage return line
endings on Windows. We rely on these scripts to support cmd, so we
should not allow these scripts to be converted to lf.
Note: Windows 11 supports lf line endings due to the use of Windows
Terminal. Once support for Windows 10 is dropped, this change can be
reverted.
When running many concurrent spack install processes that need to write
to the db, Spack regularly times out. This is because writing to the DB
after another process has written to it requires deserialization of the
db, mutating it in memory, and serializing it again, which takes some
time. On top of that, I believe there's a 1 second retry when a write
lock cannot be obtained, so I think this means only 3 processes can
really write to the DB at the same time before timing out.
* Style: black 23, skip magic trailing commas
* isort should use same line length as black
* Fix unused import
* Update version of black used in CI
* Update new packages
* Update new packages
* Update package.py
Initial new stuff
* Update package.py
* Update package.py
* Update package.py
* fix targets
* non-llvm backends
* ooops
* fix style
* Somehow that was not caught?
Somehow that was not caught?
* style
* Last fix
make capitalization consistent with Halide not LLVM...
* py-cmake-format: new version, new variants
* Update var/spack/repos/builtin/packages/py-cmake-format/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-cufflinks: new package version with 0.17.3
* Update var/spack/repos/builtin/packages/py-cufflinks/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Specs that did not contribute any files to an env view caused a problem
where zip(specs, files grouped by prefix) got "out of sync", causing the
wrong merge map to be passed to a package's `add_files_to_view`, which
specifically caused an issue where *sometimes* bin/python ended up as a
symlink instead of a copy.
One such example is kokkos + kokkos-nvcc-wrapper, as the latter package
only provides the file bin/nvcc_wrapper, which is also added to view by
kokkos, causing kokkos-nvcc-wrapper to contribute 0 files.
The test feels a bit contrived, but it captures the problem... pkg a is
added first and has 0 files to contribute, pkg b adds a single file, and
we check if pkg b receives a merge map (and a does not).
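A toy reproduction of the "out of sync" pairing (the names only illustrate the description above):
```python
specs = ["kokkos-nvcc-wrapper", "python"]    # first spec contributes no files
files_by_prefix = [["bin/python"]]           # only one group: python's files

# Buggy pairing: zip lines the single file group up with the wrong spec,
# so "python" never receives its merge map.
print(list(zip(specs, files_by_prefix)))
# [('kokkos-nvcc-wrapper', ['bin/python'])]

# Safer pairing: keep an explicit (possibly empty) group per spec.
files_per_spec = {"kokkos-nvcc-wrapper": [], "python": ["bin/python"]}
for spec in specs:
    print(spec, "->", files_per_spec[spec])
```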
* pfunit: add v4.6.3
* pfunit: use CMakePackage methods to define arguments
* pfunit: deprecate v3.X, make a variant conditional
* pfunit: simplify setting up environment variables
Reading the docs it seems only v3
needs F90_VENDOR to be set
* pfunit: fix option names
The names set before were unused
* pfunit: shared libraries seem not to be supported
See https://github.com/Goddard-Fortran-Ecosystem/pFUnit/issues/308#issuecomment-874725759
* Add py-mlflow and its dependencies
* mlflow: fix syntax error in package.py
* py-mlflow: cleanup
Process review remarks, add missing dependencies, add skinny variant
* Apply suggestions from code review
* Fix flake8 issues
* More formatting fixes
* Fix py-waitress dependency version
* py-mlflow: platform-specific dependency
* Update var/spack/repos/builtin/packages/py-mlflow/package.py
* Update var/spack/repos/builtin/packages/py-mlflow/package.py
* Process review remarks
* Fix typo in dependency version
* py-shap: fix dependencies
* py-arrow: fix dependencies
* py-slicer: remove py-setuptools explicit version
* py-pyarrow: dataset variant and pass options through environment
It appears there are some issues when using `pip install` instead of
`python setup.py` - this setup_build_environment should fix that.
* py-pyarrow: review remark
* Decouple setup_build_environment from install_options
* py-pyarrow: style
* Bump licenses to 2023
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Matthias Wolf <matthias.wolf@epfl.ch>
`spack gc` removes build deps of explicitly installed specs, but somehow
if you take one of the specs that `spack gc` would remove, and feed it
to `spack uninstall /<hash>` by hash, it complains about all the
dependents that still rely on it.
This resolves the inconsistency by only following run/link type deps in
spack uninstall.
That way you can finally do `spack uninstall cmake` without having to
remove all packages built with cmake.
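A rough sketch of the dependents check that only follows link/run edges (the data shape here is hypothetical, not Spack's actual database API):
```python
def blocking_dependents(target, installed, deptypes=("link", "run")):
    """Installed specs that still need `target` through link or run edges."""
    blockers = []
    for name, deps in installed.items():
        if any(t in deptypes for t in deps.get(target, ())):
            blockers.append(name)
    return blockers

installed = {
    "foo": {"cmake": ("build",), "zlib": ("link",)},
    "bar": {"cmake": ("build",)},
}
print(blocking_dependents("cmake", installed))   # [] -> safe to uninstall
print(blocking_dependents("zlib", installed))    # ['foo'] -> would break foo
```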
Default package requirements might contain
variants that are not defined in each package,
so we shouldn't verify them when emitting facts
for the ASP solver.
Account for group when enforcing requirements
packages:all : don't emit facts for requirement conditions
that can't apply to current spec
* Update package.py
Several libraries need to be present at run time so that the code can be run in parallel.
I have added them as dependencies and to LD_LIBRARY_PATH. Orca comes as a binary so the libraries cannot be added as RPATH at compilation time.
Also, orca 5.0.3 was compiled against 4.1.1, not 4.1.2.
* fortls
* Update var/spack/repos/builtin/packages/py-fortls/package.py
* review
* Update var/spack/repos/builtin/packages/py-fortls/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fixes
* review
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* new py-amplpy package
* [@spackbot] updating style on behalf of sm2939
* Update package.py
* Rename var/spack/repos/builtin/py-amplpy/package.py to var/spack/repos/builtin/packages/py-amplpy/package.py
* Edited file to change copyright year/dependencies and changed the directory of the file
---------
Co-authored-by: Sangu Mbekelu <s.mbekelu9@gmail.com>
#35098 added the correct extraction of toolset version for the MSVC
compiler. This updates the associated method in MSBuilder to retrieve
the (now correct) property.
Meme 4.5.0 has the first occurrence of the string
```
use XML::Simple
```
I found this by doing a binary search manually extracting tarballs until `grep` came up empty.
* new ampltools package
* [@spackbot] updating style on behalf of sm2939
* Update and rename var/spack/repos/builtin/py-ampltools/package.py to var/spack/repos/builtin/packages/py-ampltools/package.py
Edited file to change copyright year/dependencies and edited directory
---------
Co-authored-by: Sangu Mbekelu <s.mbekelu9@gmail.com>
* Add Score-P 8.0 and Cube 4.7/4.8 packages.
* Score-P 8.0 requires 4.8, not 4.7, Cube packages
* Add maintainer
* Add CUDA and HIP variants. Add version checks for CUDA (Score-P 8 requires CUDA 7), ROCm (variant only valid as of Score-P 8), and MPI (Score-P 7 requires at least version 2.2 of the MPI standard).
* Deprecate everything pre-7.0.
* Fix HIP dependencies and enable CUDA and HIP variants for configure.
* Deprecate OTF2 pre-2.3 and Cube pre-4.6
* Add "fake" mpi compiler wrappers to msmpi: msmpi doesn't actually
provide wrappers, so this just assigns the wrappers to be whatever
compiler a dependent is using. Packages referencing the
wrappers would otherwise break. This is assumed to be workable
because build scripts will need to assemble appropriate information
to pass to the compiler anyway
* Fix msmpi detection stanza ('executable' is not the correct name of
the property)
* Fix compiler pkg dereference
* add initial package.
* Update package.py
* Update var/spack/repos/builtin/packages/tiramisu/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/tiramisu/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/tiramisu/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* Hopefully this will be fine.
* Update package.py
* Update package.py
* Update package.py
* Update var/spack/repos/builtin/packages/tiramisu/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* cvise: new package
* cvise: colordiff as optional dependency
* cvise: remove old versions and correctly name master version
* cvise: update license date
* cvise: use maintainers directive
* Remove @olupton as maintainer
After live discussion: it's been too long since he did anything with this package.
* add halide package.
* some style changes.
* small fix
* Update var/spack/repos/builtin/packages/halide/package.py
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Update var/spack/repos/builtin/packages/halide/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/halide/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
add comment to requirements.txt
* Update package.py
Fix version order.
* Update package.py
style
* Update package.py
Removed unneeded vars.
* Update var/spack/repos/builtin/packages/halide/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/halide/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
Fix some deps
* Update package.py
* Fix finding llvm cmake info
* Update package.py
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This PR enables the successful execution of the spack binary cache
tutorial on Windows. It assumes gnupg and file are available (they
can be installed with choco).
* Fix handling of args with quotes in spack.bat
* `file` utility can be installed on Windows (e.g. with choco): update
error message accordingly
I don't know if this is new in version 7.0, but to build `info`, which is a required executable at the end of the recipe, a terminal library is needed; otherwise you get
```
[...]
checking for tgetent in -ltinfo... no
checking for tgetent in -lncurses... no
checking for tgetent in -lcurses... no
checking for tgetent in -ltermlib... no
checking for tgetent in -ltermcap... no
checking for tgetent in -lterminfo... no
configure: WARNING: info needs a terminal library, one of: tinfo ncurses curses termlib termcap terminfo
[...]
configure: WARNING: Could not find a terminal library among tinfo ncurses curses termlib termcap terminfo
configure: WARNING: The programs from `info' directory will not be built.
```
Compilation then proceeds, `info` is not built, and installation fails according to Spack because the required executable is missing.
* Support packages for using scitokens on OSG
The Open Science Grid (OSG) encourages scitokens to provide
certain services (e.g. writing to xrootd). Spack already
supports this through scitokens-cpp and xrootd +scitokens-cpp.
This adds py-htgettoken, a python utility to get a scitoken
from a vault through web authentication. To support htgettoken,
this also adds py-gssapi.
This also adds the OSG CA cert collection which is typically
at /etc/grid-security but pointed to in user installations by
the X509_CERTS_DIR variable.
This makes this functionality available in user space through Spack,
where it otherwise depends on installing the RPMs provided by OSG.
* fine, I'll fix style myself then
* fix maintainers
* py-gssapi: version before depends_on
* remove list_url
* add documentation on reason for git describe version numbers
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* better BEARER_TOKEN definition
* import os
* remove older version that don't build with setuptools
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
At least with ZSH, prefix inspections containing `./bin` result in a
`$PREFIX/./bin` and result in strange `$PATH` handling.
I.e., `module load git` will prepend `/path/to/git/./bin`, `which git`
will find the right executable, but `git --version` will print the
system one. Normalize the relative path to avoid this behavior.
See also spack/spack#31867.
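A small sketch of the normalization, assuming plain POSIX path handling:
```python
import posixpath

prefix = "/path/to/git"
inspection = "./bin"

naive = posixpath.join(prefix, inspection)                           # /path/to/git/./bin
normalized = posixpath.join(prefix, posixpath.normpath(inspection))  # /path/to/git/bin

print(naive)
print(normalized)
```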
* changes to enable LLVM_ENABLE_RUNTIMES for libcxx and libcxxabi
* remove version update for 5.3.0 as it is done thru PR #33320 to enable
ci and reviews
* initial commit for rocm-5.4.0 release
* update the versions for more packages for 5.4.0 release
* update the gallium patch for mesa for libllvm-15 for ROCm-5.4.0 release
* update rocm-openmp-extras and rocwmma recipes for 5.4.0 release
* fix build error for rocfft for 5.4.0
* address review comments for rocfft for 5.4.0 change
* undo the removal of the older patch file
* bump up the version for hipfft for 5.4.0
* fix the failure after the merge with develop
* add recipes updates for 5.4.0 for migraphx.miopen-hip,miopen-opencl
* address the review comments on the mesa patch.update the rdc package for
5.4.0 release
* fix style errors
* acts: new versions 21.1.1, 22.0.1, 23.0.0
New versions:
- [major 23.0.0](https://github.com/acts-project/acts/compare/v22.0.0...v23.0.0):
- new option `ACTS_BUILD_PLUGIN_GEANT4` -> enabled with existing variant `geant4`
- new option `ACTS_BUILD_EXAMPLES_BINARIES`:
- it is my understanding that the binaries for examples are deprecated (in favor of python examples); warnings to this effect have been printed for a few versions, and now the building of binaries is disabled by default,
- rather than introducing a variant to enable deprecated behavior for only one or two versions, I propose that we just follow the default and keep this disabled.
- [bugfix 22.0.1](https://github.com/acts-project/acts/compare/v22.0.0...v22.0.1) (no build system changes)
- [bugfix 21.1.1](https://github.com/acts-project/acts/compare/v21.1.0...v21.1.1) (no build system changes)
* acts: correct 23.0.0 sha
Co-authored-by: Hadrien G. <knights_of_ni@gmx.com>
As of 2.4.113, the flag for man-pages is now a feature,
so true/false is now enabled/disabled. Other similarly
changed options are not used in the spack recipe (i.e.
experimental kms drivers).
* qt: new versions 6.4.0, 6.4.1
- New libpsl vendored dependency in qt-base.
- New embree and tinyexr dependency in qt-quick3d.
We need to figure out a better way to deal with these vendored
dependencies in src/3rdparty. Removing them was a way to make sure
they are not used unintentionally. Many of these dependencies cannot
be overridden with a QT_FEATURE_system_* flag and are included directly
in cpp files. Many change versions from release to release, so even if
they use system (i.e. spack-managed) versions we need to support this in
the depends_on lines.
What can we rely on?
- src/3rdparty is where vendored stuff is stored
- not much else...
Possible ways to deal with this:
- Change vendor_deps_to_keep to dict with versions, eg
```
vendor_deps_to_keep = {
"xatlas": "@6:",
"embree": "@6.4:",
"tinyexr": "@6.4:",
}
```
- Similarly introduce system_deps_to_use:
```
system_deps_to_use = {
"assimp@5.2:": "@6:",
}
```
and derive depends_on and QT_FEATURE_system_* from this dict.
* qt-*: new version 6.4.2, invert vendored pkgs logic
* qt-base: fix vendor_deps_to_avoid typo
* qt-*: move lots into QtPackage base layer
* py-minkowskiengine: new package (sparse tensor autodiff by Nvidia)
This python package (with cuda support) provides torch support for sparse
tensors. The `pybind11` headers are not found without the patch to `setup.py`.
* [@spackbot] updating style on behalf of wdconinc
* py-minkowskiengine: depends_on numpy, pybind11 type=link; no patch
* [@spackbot] updating style on behalf of wdconinc
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
The fastqc script was using the system perl. This PR sets the script to
use the spack built/provided perl. This PR also removes the code that
adds the java path. That should be handled by module loading as far as I
know.
* Add HDF5 version 1.13.3.
* Remove maintainers no longer with The HDFGroup.
* Add version hdf5-vol-async@1.4
* Add HDF5 version 1.14.0, develop-1.14, develop-1.15.
Add missing conflicts for api version and develop versions.
* Add conflicts statement to hdf5/package.py to avoid building hdf5 with
MPICH 4.0.x versions with a bug that causes the testphdf5 test to fail.
* Add patch to call find_package(MPI) for dependent packages not finding
it, not having called it themselves.
* Remove language components from find_package(MPI) in
hdf5_1_14_0_config_find_mpi.patch.
* Don't guard ParaView patch on HDF5 variant
ParaView always needs HDF5 and ignores the variant.
* py-h5py: Newer versions of HDF5 introduce breaking API changes
---------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
* Add trilinos-solvers variant to nalu-wind package.
This allows nalu-wind to be built against a trilinos installation
which doesn't have amesos2, belos, ifpack2, or muelu enabled, if
the nalu-wind user provides the spec 'nalu-wind@master~trilinos-solvers'
Support for these solver-packages remains on by default.
* Fixed a style issue reported by CI.
* Incorporate change in wording suggested from review comments.
... to clarify that at least one, or both, of hypre and/or
trilinos-solvers must be enabled. The error condition is if
both are disabled.
* That style checker is picky...
* It really did want a trailing comma...
* py-jinja2-cli: new package
* Update var/spack/repos/builtin/packages/py-jinja2-cli/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add py-docker@5:
* [@spackbot] updating style on behalf of spoutn1k
* Ignore `tls` variant
* Update var/spack/repos/builtin/packages/py-docker/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* `py-docker`: `py-paramiko` version fix
---------
Co-authored-by: spoutn1k <spoutn1k@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The gfx906:xnack- and gfx908:xnack- targets were introduced in ROCm 4.1
and replaced gfx906 and gfx908 as default build targets, but the library
can still be built for gfx906 and gfx908 if requested.
* e4s: restore builds
* gitlab ci: allow UO to build protected binaries for signing
* use newer image; comment out failing builds
* gitlab-ci: Some tweaks for e4s power builds
- fix tags (no longer require generate jobs to run on aws)
- fix resource requests for generation jobs
- remove SPACK_SIGNING_KEY from protected power build jobs
- update UO signing key path
- change the CDash build group to reflect stack name
- retry pipeline generation jobs *always*
* correct double packages: section
* gitlab-ci:script: modernize
* remove new gnu make, not for ppc64le
---------
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
* Added e4s-cl package
* Version order change
* Added e4s-cl dependencies
* Added python-sotools dependency
* [@spackbot] updating style on behalf of spoutn1k
* Add missing versions to py- packages
* Fix style
* [@spackbot] updating style on behalf of spoutn1k
* Update var/spack/repos/builtin/packages/e4s-cl/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/e4s-cl/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-python-sotools/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add docker removing patch for e4s-cl
Co-authored-by: spoutn1k <spoutn1k@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-nexusforge: add with dependencies
* py-pyshacl, py-sseclient: more style
* py-hjson, py-nexus-sdk, py-nexusforge, py-puremagic: more style
* py-pyshacl: license update
* py-nexusforge, py-prettytable, py-pyshacl: review remarks
* py-nexusforge: make the variant mean something
Too hasty to commit...
* py-ipyparallel: add 8.4.1, which builds with py-hatchling
* py-ipyparallel: copyright and redundant py-setuptools dependency
* py-ipyparallel: py-packaging was dropped after 8.0.0
fixes #34879
This commit adds a new maintainer directive,
which by default extend the list of maintainers
for a given package.
The directive is backward compatible with the current
practice of having a "maintainers" list declared at
the class level.
Move the relocation of binary text in its own class
Drop threaded text replacement, since the current bottleneck
is decompression. It would be better to parallelize over packages,
instead of over files per package.
A small improvement with separate classes for text replacement is that we
now compile the regex in the constructor; previously it was compiled per
binary to be relocated.
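A sketch of the separate-class design with the regex compiled once in the constructor (class and method names are illustrative):
```python
import re

class BinaryTextReplacer:
    """Compile the prefix-search regex once; reuse it for every binary."""

    def __init__(self, old_prefixes):
        pattern = b"|".join(re.escape(p.encode()) for p in old_prefixes)
        self.regex = re.compile(pattern)

    def apply(self, data: bytes, new_prefix: bytes) -> bytes:
        return self.regex.sub(new_prefix, data)

replacer = BinaryTextReplacer(["/old/install/prefix"])
for blob in (b"runpath=/old/install/prefix/lib", b"nothing to do"):
    print(replacer.apply(blob, b"/new/prefix"))
```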
The regex doesn't actually work because dollar signs and parentheses have to be
escaped. Also, compiling with OpenMPI requires defining the macro
`MPI2SUPPORT`.
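A generic illustration of the escaping issue (the actual pattern in the package is not shown here):
```python
import re

literal = "$(MPI_HOME)/bin/mpicc"   # meant as literal text, not a regex
text = "CC = $(MPI_HOME)/bin/mpicc"

print(re.search(literal, text))             # None: "$", "(" and ")" are metacharacters
print(re.search(re.escape(literal), text))  # matches the literal text
```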
This commit makes explicit the format version of the spec file
we are reading from.
Before there were different functions capable of reading some
part of the spec file at multiple format versions. The decision
was implicit, since checks were based on the structure of the
JSON without ever checking a format version number.
The refactor also makes explicit which spec file format is used
by which database and lockfile format, since the information is
stored in global mappings.
To ensure we don't change the hash of old specs, JSON representations
of specs have been added as data. A unit test checks that we read
the correct hash in, and that the hash stays the same when we
re-serialize the spec using the most recent format version.
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
```
File ".../spack/var/spack/environments/scale-mpi/.spack-env/._view/4yiorsdd4pefrnwgrwlwt3yzo5i235il/lib/python3.10/site-packages/h5py/_hl/base.py", line 19, in <module>
from collections import (Mapping, MutableMapping, KeysView,
ImportError: cannot import name 'Mapping' from 'collections' (.../spack/var/spack/environments/scale-mpi/.spack-env/._view/4yiorsdd4pefrnwgrwlwt3yzo5i235il/lib/python3.10/collections/__init__.py)
```
Fixed in https://github.com/h5py/h5py/pull/1069 which was first merged
in v2.9.
* py-flatten-dict: require poetry to build.
The sources seem to contain a bundled, auto-generated `setup.py`.
Building with `pip` insist on using Poetry as mentioned in
`pyproject.toml`, so require it as a build dependency.
* Update var/spack/repos/builtin/packages/py-flatten-dict/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-flatten-dict/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Currently we print "sha256 checksum failed for [file]. Expected X but
got Y".
This PR extends that message with file size and contents info:
"... but got Y. File size = 123456 bytes. Contents = b'abc...def'"
That way we can immediately see if the file was downloaded only
partially, or if we downloaded a text page instead of a binary, etc.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
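A sketch of how such a message could be assembled (the helper name and exact wording are illustrative):
```python
import os

def checksum_failure_message(path, expected, actual, preview_bytes=16):
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        head = f.read(preview_bytes)
    return (
        f"sha256 checksum failed for {path}. Expected {expected} but got {actual}. "
        f"File size = {size} bytes. Contents = {head!r}..."
    )

# A tiny HTML error page downloaded instead of a tarball is immediately obvious:
with open("download.tar.gz", "w") as f:
    f.write("<html>404 Not Found</html>")
print(checksum_failure_message("download.tar.gz", "abc123...", "def456..."))
os.remove("download.tar.gz")
```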
py-scipy 1.6 and older come with pre cython-ized files that
use the _PyGen_Send symbol that was removed from python 3.10.0.161,
so do not build these old versions with python 3.10.1 and later
Parts of libgcrypt should not be optimized with -O1/2/3, so it's best to
let the build system do that; the build system cannot know the compiler
wrapper would inject optimization flags
When running unit-test the test/ci.py module is leaving
garbage (help.sh, test.sh files) in the current working
directory.
This commit changes the current working directory to a
temporary path before those files are created.
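A minimal sketch of running the test body inside a throw-away working directory (generic Python, not the exact fixture used):
```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def working_dir(path):
    old = os.getcwd()
    os.chdir(path)
    try:
        yield
    finally:
        os.chdir(old)

# Any scripts the test writes (help.sh, test.sh, ...) land in a temporary
# directory instead of polluting the developer's current working directory.
with tempfile.TemporaryDirectory() as tmp, working_dir(tmp):
    open("help.sh", "w").close()

print(os.path.exists("help.sh"))   # False: nothing leaked into the original cwd
```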
* freeimage: fails to compile with c++17, use c++14
Only `opencascade` depends on `freeimage`, and only through a (non-default) variant; freeimage seems to have gone unmaintained. There are c++17 standard violations [[1]](https://en.cppreference.com/w/cpp/language/except_spec) in the code, so we can at most expect c++14. Since some compilers default to c++17 (gcc-12) we need to be explicit.
* freeimage: install directly in prefix
* freeimage: fix inverted patch
* environments: don't rewrite relative view path, expand path on cli ahead of time
Currently if you have a spack.yaml that specifies a view by relative
path, Spack expands it to an absolute path on `spack -e . install` and
persists that to disk.
This is rather annoying when you have a `spack.yaml` file inside a git
repo, because you want to use relative paths to make it relocatable, but
you constantly have to undo the changes made to spack.yaml by Spack.
So, as an alternative:
1. Always stick to paths as they are provided in spack.yaml, never
replace them with a canonicalized version
2. Turn relative paths on the command line into absolute paths before
storing to spack.yaml. This way you can do `spack env create --dir
./env --with-view ./view` and both `./env` and `./view` are resolved
to the current working dir, as expected (not `./env/view`). This
corresponds to the old behavior of `spack env create`.
* create --with-view always takes a value
All packages with explicit Windows support can be found with
`spack list --tags=windows`.
This also removes the documentation which explicitly lists
supported packages on Windows (which is currently out of date and
is now unnecessary with the added tags).
Note that if a package does not appear in this list, it *may*
still build on Windows, but it likely means that no explicit
attempt has been made to support it.
* nextflow recipe: added latest stable version
* tower-cli recipe: added latest release
* recipes tower-agent and tower-cli renamed to nf-tower-agent and nf-tower-cli
* recipes nf-tower-agent and nf-tower-cli: small fix
* nf-core-tools recipe: added most py- dependencies
* nf-core-tools: recipe without galaxy-tool-util (for testing)
* fixed typos in py-yacman recipe
* fixed typos in py-pytest-workflow recipe
* fixed typo in nf-core-tools recipe
* fixed typos in py-yacman recipe
* fixes in recipes for py-questionary and py-url-normalize
* fixes to py-yacman recipe
* style fixes to py- packages that are dependencies to nf-core-tools
* fix in py-requests-cache recipe
* added missing dep in py-requests-cache recipe
* nf-core-tools deps: removed redundant python dep for py packages oyaml and piper
* nf-core-tools recipe: final, incl dep on py-galaxy-tool-util
* nf-core-tools: new version with extra dependency
* added py-galaxy-util, draft: added some required dep versions, still have to add 40+ deps
* nextflow and nf-core-tools packages: added myself as maintainer
* style fixes
* style fix for nf-core-tools recipe
* added license to py-logmuse recipe
* audit fixes
* style fix after audit fix
* py-galaxy-tool-util: added deps 1st bunch
* audit/style fixes, including adding missing dep package
* more audit/style fixes
* more more audit/style fixes
* moooore audit fixes
* py-galaxy-tool-util: dependencies 2nd chunk
* silly audit fix
* py-galaxy-util deps: 3rd bunch - first 20 done
* fixes
* style fix
* py-galaxy-tool-util: 4th bunch of deps
* stashing dep recipe backbones for py-galaxy-tool-util
* nf-core-tools: using pre-built wheel for dependency py-galaxy-tool-util
* nf-core-tools: adding also py-galaxy-util, as wheel
* fix
* nextflow: added latest bugfix version
* Update var/spack/repos/builtin/packages/nf-core-tools/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/nf-core-tools/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* nf-core-tools pr: 1st bunch of review edits
* nf-core-tools: 2nd bunch of review edits
* adding back tower-agent and tower-cli as deprecated
* nf-core-tools: 3rd bunch of review edits
* small style fix
* prepping py-galaxy-tool-util for further work
* nf-core-tools: last bunch of deps, except for galaxy-tool-util and pulsar
* audit fixes
* updates to py-galaxy-tool-util and its deps, still 2 to work on
* one style fix
* updated recipe for py-galaxy-util
* updated recipe for py-pulsar-galaxy-lib
* typo fix
* shasum fixes
* updated py-sqlalchemy from develop
* added newest versions (today) for nf-tower-agent and nf-tower-cli
* Update var/spack/repos/builtin/packages/py-requests-cache/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-requests-cache/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* adding 2nd bunch of nf-core deps from update/nextflow-tools
* adding 3rd bunch of nf-core deps from update/nextflow-tools
* 4th chunk of nf-core deps from update/nextflow-tools
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pastedeploy/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pebble/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-gunicorn/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-parsley/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-paste/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-paste/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-gxformat2: added comment
* py-lagom: now using github tarballs
* fix for py-lagom
* adding missing deps to py-fastapi-utils
* another fix to py-lagom
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-lagom/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-supervisor/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-social-auth-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fixes from PR review
* adding missing deps, from PR review
* py-galaxy2cwl from github tarball, as per PR review
* fix to py-tuswsgi, as per PR review
* nf-tools: edits from PR review
* adding 3x more galaxy deps
* fix
* fixing circular dep of py-poetry-plugin-export with py-poetry
* added newest nf-core-tools version
* Update var/spack/repos/builtin/packages/py-galaxy-util/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fix in py-poetry-plugin-export
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Currently, the `python` package tries to set `CPATH` in `setup_run_environment()`.
We no longer set `CPATH` and other destructive environment variables (like
`LD_LIBRARY_PATH`) in modules, so we shouldn't do something special for Python.
Also, the way `python` sets `CPATH` causes issues. Because it does a header search to
find directories containing headers, if you bootstrap `mypy` or other style tools on a
fresh Ubuntu image with *no* python devel headers installed, you'll get an error like
this when trying to load the thing you just installed:
```console
[root@980de539843d /]# spack -b load py-mypy
==> Error: Unable to locate python headers in any of these locations:
/usr/include/python3.6m
/usr/include/3.6
/usr/Headers
```
The headers and includes aren't needed to get `mypy` in the path or for `mypy` to work,
so we're failing unnecessarily here.
- [x] remove `setup_run_environment()` from `python/package.py`
Since SPACK_PACKAGE_IDS is now also "namespaced" with <prefix>, it makes
more sense to call the flag `--make-prefix` and alias the old flag
`--make-target-prefix` to it.
1. add variant cray-static, older crays build hpcprof-mpi static,
newer ones build dynamic.
2. move URL patches from github to gitlab.
3. add workaround for a bug where a file is mistakenly overwritten.
4. add conflict for hpcprof-mpi at 2022.10.01.
* [@spackbot] updating style on behalf of mwkrentel
Co-authored-by: mwkrentel <mwkrentel@users.noreply.github.com>
Normally when using external packages in concretization, Spack ignores
all dependencies of the external. #33777 updated this logic to attach
a Python Spec to external Python extensions (most py-* packages), but
as implemented there were a couple issues:
* this did not account for concretization groups and could generate
multiple different python specs for a single DAG
* in some cases this created a fake Python spec with insufficient
details to be usable (concretization/installation of the
extension would fail)
This PR addresses both of these issues:
* For environment specs that are concretized together, external python
extensions in those specs will all be assigned the same Python spec
* If Spack needs to "invent" a Python spec, then it will have all the
needed details (e.g. compiler/architecture)
* npm: Add latest version, update build
The `npm` package had gotten a bit long in the tooth and only supported the last version
for which running `configure` / `make` / `make install` actually worked.
- [x] Update the package to support npm@9, in which `npm install .` works properly and
installation is easier.
- [x] Update the package so that `npm@6:8` also install successfully. The incantation
that is *supposed* to work on these versions is `node bin/npm-cli.js install $(node
bin/npm-cli.js pack . | tail -1)`, but depending on the version one of `npm install`
or `npm pack` will fail when run straight from the install directory. So now we just
manually copy things over.
This seems to make the `npm` install much more reliable for all of `npm@6:9` (at least
for me).
updates the `npm` build to support versions 6-9 and fixes the install for all of
them on macOS.
* update for review
With the new variable [prefix/]SPACK_PACKAGE_IDS you can conveniently execute
things after each successful install.
For example push just-built packages to a buildcache
```
SPACK ?= spack
export SPACK_COLOR = always
MAKEFLAGS += -Orecurse
MY_BUILDCACHE := $(CURDIR)/cache
.PHONY: all clean
all: push
ifeq (,$(filter clean,$(MAKECMDGOALS)))
include env.mk
endif
# the relevant part: push has *all* example/push/<pkg identifier> as prereqs
push: $(addprefix example/push/,$(example/SPACK_PACKAGE_IDS))
$(SPACK) -e . buildcache update-index --directory $(MY_BUILDCACHE)
$(info Pushed everything, yay!)
# and each example/push/<pkg identifier> has the install target as prereq,
# and the body can use target local $(HASH) and $(SPEC) variables to do
# things, such as pushing to a build cache
example/push/%: example/install/%
@mkdir -p $(dir $@)
$(SPACK) -e . buildcache create --allow-root --only=package --unsigned --directory $(MY_BUILDCACHE) /$(HASH) # push $(SPEC)
@touch $@
spack.lock: spack.yaml
$(SPACK) -e . concretize -f
env.mk: spack.lock
$(SPACK) -e . env depfile -o $@ --make-target-prefix example
clean:
rm -rf spack.lock env.mk example/
```
* [armpl-gcc] Make pkg-config files available
ARMpl pkgconfig files are located in a non-default location and do not have the
.pc extension. Changing those both helps pkgconfig pick them up correctly, e.g.
in the meson build of `py-scipy`.
* Address @annop-w comments
* symlink instead of cp.
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
In the past we checked remote binary mirrors for existence of a spec
before attempting to download it. That changed to only checking local
copies of index.jsons (if available) to prioritize certain mirrors where
we expect to find a tarball. That was faster for CI since fetching
index.json and loading it just to order mirrors takes more time than
just attempting to fetch tarballs -- and also if we have a direct hit
there's no point in looking at other mirrors.
Long story short: the info message only makes sense in the old version
of Spack, so it's better to remove it.
* py-sphinx-immaterial: new package
This is a new-ish theme for Sphinx that's based on MkDocs's `immaterial` theme. More on
the theme here: https://jbms.github.io/sphinx-immaterial/, but it seems to be very clear
and readable. We *might* consider switching to it for Spack's docs.
* Update var/spack/repos/builtin/packages/py-sphinx-immaterial/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-sphinx-immaterial/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-sphinx-immaterial/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-sphinx-immaterial/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add note about node.js requirements
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Forward lookup of "test_log_file" and "test_failures"
refers #34531, closes #34487, fixes #34440
* Add unit test
* py-libensemble: fix tests
* Support stand-alone tests with cached files as install-time tests
Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Ensure `spack mirror add <name> <url/path>` without further arguments translates to `<name>: <url>` key value pairs in mirrors.yaml. If --s3-* flags are provided, only store the provided ones.
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
This is only a work-around for the actual problem, which is that Python is used to install libraries instead of CMake, so we end up with BUILD_RPATHs instead of INSTALL_RPATHs.
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
* Add new file for MVAPICH 3.0a release
Creating this as a new package since it requires some new configuration
options and because we are moving to the name "MVAPICH" and dropping the
2 (following a similar move by MPICH).
Co-authored-by: Nat Shineman <shineman.5@osu.edu>
Co-authored-by: Matthew Lieber <lieber.31@osu.edu>
* OpenCV: checksum for 4.5.5, make contrib optional
* [@spackbot] updating style on behalf of iarspider
* Add conflicts for contrib modules
* Fix typo
* Implement changes from review
* Update var/spack/repos/builtin/packages/opencv/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: iarspider <iarspider@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Revert "Revert "4th chunk of nf-core deps from update/nextflow-tools (#34564)" (#34960)"
This reverts commit 891a63cae6.
* fix to py-python-multipart, as per PR review
* py-macs2: add version 2.2.7.1 and support python@3.10:
The tarball from PyPi includes the Cythonized C files. The tarball from
github does not. Remove the Cythonized C files from the source so that
they are rebuilt with the Spack Python/Cython combination. This is
necessary for python-3.10 but makes sense for other combinations as well.
* Edits based on review
- set python version constraint on version 2.2.4
- removed all python-2 versions and related constraints.
* Add package file for xtb and py-xtb
* Retain maintainers from PythonPackage
* Update package files
- use extends("python") instead of tampering with PYTHONPATH
- use PyPI for downloading sdist of py-xtb
- add simple-dftd3 0.7.0
- add dftd4 3.5.0
- remove --wrap-mode=nodownload from toml-f
* Remove logic for download URL
With this change we get the invariant that `mirror.fetch_url` and
`mirror.push_url` return valid URLs, even when the backing config
file is actually using (relative) paths with potentially `$spack` and
`$env` like variables.
Secondly it avoids expanding mirror path / URLs too early,
so if I say `spack mirror add name ./path`, it stays `./path` in my
config. When it's retrieved through MirrorCollection() we
expand it to say `file://<env dir>/path` if `./path` was set in an
environment scope.
Thirdly, the interface is simplified for the relevant buildcache
commands, so it's more like `git push`:
```
spack buildcache create [mirror] [specs...]
```
`mirror` is either a mirror name, a path, or a URL.
Resolving the relevant mirror goes as follows:
- If it contains either / or \ it is used as an anonymous mirror with
path or url.
- Otherwise, it's interpreted as a named mirror, which must exist.
This helps to guard against typos, e.g. typing `my-mirror` when there
is no such named mirror now errors with:
```
$ spack -e . buildcache create my-mirror
==> Error: no mirror named "my-mirror". Did you mean ./my-mirror?
```
instead of creating a directory in the current working directory. I
think this is reasonable, as the alternative (requiring that a local dir
exists) feels a bit pedantic in the general case -- spack is happy to
create the build cache dir when needed, saving a `mkdir`.
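A sketch of the resolution rule described above (the function name and data shape are illustrative):
```python
def resolve_mirror(arg, named_mirrors):
    """Return the destination for `spack buildcache create <arg> ...`."""
    if "/" in arg or "\\" in arg:
        return arg                      # anonymous mirror: used as path or URL
    if arg in named_mirrors:
        return named_mirrors[arg]       # named mirror from mirrors.yaml
    raise ValueError(f'no mirror named "{arg}". Did you mean ./{arg}?')

mirrors = {"my-cache": "s3://bucket/cache"}
print(resolve_mirror("./my-mirror", mirrors))   # treated as a local path
print(resolve_mirror("my-cache", mirrors))      # resolves to s3://bucket/cache
```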
The old (now deprecated) format will still be available in Spack 0.20,
but is scheduled to be removed in 0.21:
```
spack buildcache create (--directory | --mirror-url | --mirror-name) [specs...]
```
This PR also touches `tmp_scope` in tests, because it didn't really
work for me, since spack fixes the possible --scope values once and
for all across tests, so tests failed when run out of order.
Packages that use docbook-xml may specify a specific entity version.
When this is specified as a version constraint in the package recipe it
will cause problems when using `unify = True` in a Spack environment, as
there could be multiple versions of docbook-xml in the spec. In
practice, any entity version should work with any other version and
everything should work with the latest version. This PR maps all Spack
docbook-xml entity versions to the docbook-xml version in the spec.
Ideally, the version in the spec would be the latest version. With this
PR, even if a package specifies an older entity version, it will map
to the entity version (latest) in the spec. This means that there can be one
docbook-xml version in a Spack environment spec and packages requesting
older entity versions will still work.
To help facilitate this, docbook-xml version constraints for packages
that have them have been removed. Those packages are dbus and gtk-doc.
What's in AOCL 4.0:
1. amdblis
LPGEMM variants with post-ops support
AMD "Zen4" support for BLIS
2. amdlibflame
Upgrade to LAPACK 3.10.1 specification
Improvements in a few more variants of SVD and Eigen Value routines
Multithread support enabled for selected APIs
3. amdfftw
AVX-512 enablement of DFT kernels
AVX-512 optimization of copy and transpose routines
5. amdlibm
Black & Scholes support (logf, expf, erff, both scalar and vector)
AVX-512 variants of vector functions
6. aocl-sparse
New Iterative Solver APIs
AVX-512 support for SPMV API
7. amdscalapack
Upgrade to Netlib ScaLAPACK 2.2.0
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Sometimes I just want to know how many packages of a certain type there are.
- [x] add `--count` option to `spack list` that output the number of packages that
*would* be listed.
```console
> spack list --count
6864
> spack list --count py-
2040
> spack list --count r-
1162
```
* Add packages
* Style
* Respond to comments
* Change conflct dep type
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* paraview: add `rocm` variant
This conflicts with CUDA and requires at least ParaView 5.11.0. More
dependencies are also needed.
* E4S: Add ParaView for ROCm and CUDA stacks
* DAV SDK: Update ParaView version and GPU variants
* Verify using hipcc vs amdclang++ for newer hip
Co-authored-by: Ben Boeckel <ben.boeckel@kitware.com>
* adding 1st version of py-strawberryfields
* py-strawberryfields: minor edit
* added backbone for 4x new SF dependencies
* edits to SF and its 4x new deps
* added all deps for SF
* added one version to py-lark-parser
* py-quantum-xir with tarball from github
* Update var/spack/repos/builtin/packages/py-strawberryfields/package.py
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* updated version for py-quantum-xir : pypy fixed
* py-pennylane: added pythonpackage.maintainers, too
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Currently, all of the replacements in `spack.util.path.replacements()` get evaluated for
each replacement. This makes it easy to get bootstrap issues, because config is used
very early on in Spack.
Right now, if I run `test_autotools_gnuconfig_replacement_no_gnuconfig` on my M1 mac, I
get the circular reference error below. This fixes the issue by making all of the path
replacements lazy lambdas.
As a bonus, this cleans up the way we do substitution for `$env` -- it's consistent with
other substitutions now.
- [x] make all path `replacements()` lazy
- [x] clean up handling of `$env`
```console
> spack unit-test -k test_autotools_gnuconfig_replacement_no_gnuconfig
...
==> [2022-12-31-15:44:21.771459] Error: AttributeError:
The 'autotools-config-replacement' package cannot find an attribute while trying to build from sources. This might be due to a change in Spack's package format to support multiple build-systems for a single package. You can fix this by updating the build recipe, and you can also report the issue as a bug. More information at https://spack.readthedocs.io/en/latest/packaging_guide.html#installation-procedure
/Users/gamblin2/src/spack/lib/spack/spack/package_base.py:1332, in prefix:
1330 @property
1331 def prefix(self):
>> 1332 """Get the prefix into which this package should be installed."""
1333 return self.spec.prefix
Traceback (most recent call last):
File "/Users/gamblin2/src/spack/lib/spack/spack/build_environment.py", line 1030, in _setup_pkg_and_run
kwargs["env_modifications"] = setup_package(
^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/build_environment.py", line 757, in setup_package
set_module_variables_for_package(pkg)
File "/Users/gamblin2/src/spack/lib/spack/spack/build_environment.py", line 596, in set_module_variables_for_package
m.std_cmake_args = spack.build_systems.cmake.CMakeBuilder.std_args(pkg)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/build_systems/cmake.py", line 241, in std_args
define("CMAKE_INSTALL_PREFIX", pkg.prefix),
^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/package_base.py", line 1333, in prefix
return self.spec.prefix
^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/spec.py", line 1710, in prefix
self.prefix = spack.store.layout.path_for_spec(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/directory_layout.py", line 336, in path_for_spec
path = self.relative_path_for_spec(spec)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/directory_layout.py", line 106, in relative_path_for_spec
projection = spack.projections.get_projection(self.projections, spec)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/projections.py", line 13, in get_projection
if spec.satisfies(spec_like, strict=True):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/spec.py", line 3642, in satisfies
if not self.virtual and other.virtual:
^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/spec.py", line 1622, in virtual
return spack.repo.path.is_virtual(self.name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 890, in is_virtual
return have_name and pkg_name in self.provider_index
^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 770, in provider_index
self._provider_index.merge(repo.provider_index)
^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 1096, in provider_index
return self.index["providers"]
~~~~~~~~~~^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 592, in __getitem__
self._build_all_indexes()
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 607, in _build_all_indexes
self.indexes[name] = self._build_index(name, indexer)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 616, in _build_index
index_mtime = self.cache.mtime(cache_filename)
^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/llnl/util/lang.py", line 826, in __getattr__
return getattr(self.instance, name)
^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/llnl/util/lang.py", line 825, in __getattr__
raise AttributeError()
AttributeError
```
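A minimal sketch of the lazy-lambda approach described above, assuming a simple token table (not the real contents of `replacements()`):
```python
import getpass

# Eager: every value is computed when the table is built, even for tokens
# that never appear (this is what can trigger early config/repo lookups).
eager = {"$user": getpass.getuser(), "$tempdir": "/tmp"}

# Lazy: each value is a zero-argument callable, evaluated only when the
# token actually occurs in the path being substituted.
lazy = {"$user": lambda: getpass.getuser(), "$tempdir": lambda: "/tmp"}

def substitute(path, replacements):
    for token, value in replacements.items():
        if token in path:
            path = path.replace(token, value())
    return path

print(substitute("/scratch/$user/spack-stage", lazy))
```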
* Added support for libfabric to find an external installation and
identify variants supported.
* Change the fabrics definition to only include CXI when on a cray
system with a libfabric-based slingshot network.
* Added a conflict for building with the CXI fabric value, since it is
only available as closed source at this time.
#33128 Introduces a dependency on re2c into the Ninja build recipe.
This is problematic on Windows as we use CMake to build re2c, and
Ninja to drive the CMake build. This PR resolves this issue by
adding a variant to toggle the use of re2c with ninja.
* Added a more robust check for an external version of the library.
Included a guard to identify when the library gives no discernible
version information and, in that case, substitute the "unknown_ver"
identifier.
* gaudi: new versions 36.8, 36.9
As of 36.8, the tests use catch2 ([commit](https://gitlab.cern.ch/gaudi/Gaudi/-/commit/f2cafb5c9d04c9d497d49182258aa3a0440622c0)).
* gaudi: still depends_on fmt@:8
* unifyfs: new release v1.0.1
Add 1.0.1 release
Add new variant for new configure time option
Co-authored-by: CamStan <CamStan@users.noreply.github.com>
Now that the `tix` variant is conditional, it should also be detected
conditionally; otherwise the spec is invalid and cannot be used during
concretization.
ShellCheck is installed with a downloaded binary instead of being
compiled from source, and there should be comments to point out this
unorthodox approach.
autoconf 2.70 uses `use warnings` instead of `-w` so that `PERL=/usr/bin/env perl` can be passed, but we want to fix absolute paths anyhow through sbang upon install. So, we stick to patching the one perl script that's used during the build.
Gitlab does not merge lists when a job extends two other definitions
that include the same list (e.g. tags). Also, it merges dictionaries
as long as the keys are distinct, but just takes the last mentioned
value when there are key collisions.
This change makes sure that when different tags are needed by a
pipeline, the ones we want are actually provided. It also changes
the example stack to better follow this pattern so we do not lead
developers astray in the future.
Since we dropped support for Python 2.7, we can embrace using keyword-only arguments
for many functions in Spack that use `**kwargs` in the function signature. Here this is done
for the `llnl.util.filesystem` module.
There were a couple of bugs lurking in the code related to typo-like errors when retrieving
from kwargs. Those have been fixed as well.
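As a rough illustration of the pattern (the function and parameter names below are hypothetical, not the actual `llnl.util.filesystem` signatures):
```python
# Hypothetical example of the refactor; names are illustrative only.

# Before: a typo like copy_tree_old(src, dst, symlnks=False) is silently ignored.
def copy_tree_old(src, dst, **kwargs):
    symlinks = kwargs.get("symlinks", True)
    ignore = kwargs.get("ignore", None)
    return (src, dst, symlinks, ignore)

# After: keyword-only arguments turn the same typo into a TypeError at the call site.
def copy_tree_new(src, dst, *, symlinks=True, ignore=None):
    return (src, dst, symlinks, ignore)
```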
* Update xsbench to version 20
XSBench version 20 has implementations for new
architectures and accelerators.
* Added CUDA support for XSBench
* Fixed style issues
The code in FileCache for `write_transaction` attempts to delete the temporary file when an exception occurs under the context by calling `shutil.rmtree`. However, `rmtree` only operates on directories, while the rest of FileCache uses normal files. This caused an empty file to be written at the cache key even when it was not needed.
Use `os.remove` instead, which operates on normal files.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
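A minimal sketch of the corrected cleanup (illustrative only, not the exact FileCache code):
```python
import os

def cleanup_failed_cache_write(tmp_path):
    # The temporary cache entry is a regular file, so os.remove is the right
    # call; shutil.rmtree only operates on directories and would fail here.
    if os.path.lexists(tmp_path):
        os.remove(tmp_path)
```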
* cernlib: depends_on freetype, libnsl, libxcrypt, openssl; and patch
In addition to #34448, cernlib depends on these additional packages.
This also applies a patch to the current release in which crypto is
specified where libcrypt (in libxcrypt) is actually needed. Because
the upstream git repository is behind a CERN login, we cannot patch
by gitlab URL link.
* [@spackbot] updating style on behalf of wdconinc
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* update the version for 5.3.0 release
* update the rocwmma for 5.3.0 release
* fix the +hip variant
* update the version for rocm-openmp-extras package for 5.3.0 release
* update the hipsolver and hipfft as per review comments
* address review comments
* revert changes to mivisionx with regard to change added for clangrt
* fix for the petsc failure
Spack was running an external detection of Python during each invocation
of the setup script for Windows CMD/PWSH, which has dramatic performance
implications each time the script is invoked and is completely
unnecessary. Remove this operation.
The Windows CMD prompt does not automatically support ANSI color control
characters on the console from Python. Enable this behavior by
accessing the current console and allowing the interpretation of ANSI
control characters from Python via the win32 API.
* adding 2nd bunch of nf-core deps from update/nextflow-tools
* adding 3rd bunch of nf-core deps from update/nextflow-tools
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pastedeploy/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pebble/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-gunicorn/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-parsley/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-paste/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-paste/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-gxformat2: added comment
* py-lagom: now using github tarballs
* fix for py-lagom
* adding missing deps to py-fastapi-utils
* another fix to py-lagom
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-lagom/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-supervisor/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This commit allows (remote) spec.json files to be clearsigned and gzipped.
The idea is to reduce the number of requests and number of bytes transferred
* added recipes for py-qutip and py-qutip-qip
* small fix
* updated qutip 2x versions
* py-qutip-qip: tarball url from github
* style fix in py-qutip-qip
* Update var/spack/repos/builtin/packages/py-qutip-qip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-qutip-qip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-qutip-qip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-qutip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-qutip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* draft for py-pennylane recipe
* first draft for py-strawberryfields recipe
* minimal fix
* small fixes
* accounting for circular dep in py-pennylane and py-pennylane-lightning
* removing py-strawberryfields from this branch
* updated versions for py-pennylane 2x
* needs cmake
* py-pennylane-lightning using github tarball
* Update var/spack/repos/builtin/packages/py-autoray/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pennylane-lightning/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The Intel compiler isn't able to deal with noinline member functions of template classes defined in headers. As such it outputs
```
warning #2196: routine is both "inline" and "noinline"
```
cmake bootstrap will fail due to the word 'warning'.
See spack/var/spack/repos/builtin/packages/protobuf/intel-v2.patch for reference.
The issue does not appear with intel@2021.7.0 or later:
```
$~: compiler=/shared/spack/opt/spack/linux-amzn2-x86_64_v3/gcc-12.2.0/intel-oneapi-compilers-2022.2.0-uqvb2553zy5toeapvoopacndd27x6p5m/compiler/2022.2.0/linux/bin/intel64/icpc
$~: $compiler unique.c
icpc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
```
This is a clean version of https://github.com/spack/spack/pull/34167
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
* package/libproxy: fix py3 install
* improve readability
* fix bug
* also add extend
* make flake happy
* [@spackbot] updating style on behalf of Sinan81
* Update var/spack/repos/builtin/packages/libproxy/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* python dependency implied by extends const.
* disable python variant by default
* add run_env, add py conflict
* Update var/spack/repos/builtin/packages/libproxy/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* set env for macos as well
* generalize lib dir detection
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add_new_package: py-file-magic
* re-order depends...
* Update var/spack/repos/builtin/packages/py-file-magic/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [@spackbot] updating style on behalf of Sinan81
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
* Add py-svgpath and dependency
* Update copyright expiration
* [@spackbot] updating style on behalf of heerener
* Process review remarks
* Update var/spack/repos/builtin/packages/py-trimesh/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix style issue
* py-trimesh: cleanup and optional dependencies
* Fix formatting issue
* py-trimesh: complete dependency list for easy variant
Two new packages: py-mapbox-earcut and py-pycollada
* Some more missing dependencies
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* new package: py-kb-python + dependencies
- py-loompy
- py-ngs-tools
- py-numpy-groupies
* Update var/spack/repos/builtin/packages/py-kb-python/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-shortuuid: add version 1.0.11
* Update var/spack/repos/builtin/packages/py-shortuuid/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update packages config to indicate that MSVC is the preferred compiler
* Update packages config to indicate that msmpi is the preferred MPI provider
* Fix msmpi external detection
Spack imports `pytest`, which *can* import `numpy`. Recent versions of `numpy` require
Python 3.8 or higher, and they use 3.8 type annotations in their type stubs (`.pyi`
files). At the same time, we tell `mypy` to target Python 3.7, as we still support older
versions of Python.
What all this means is that if you run `mypy` on `spack`, `mypy` will follow all the
static import statements, and it ends up giving you this error when it finds numpy stuff
that is newer than the target Python version:
```
==> Running mypy checks
src/spack/var/spack/environments/default/.spack-env/._view/4g7jd4ibkg4gopv4rosq3kn2vsxrxm2f/lib/python3.11/site-packages/numpy/__init__.pyi:638: error: Positional-only parameters are only supported in Python 3.8 and greater [syntax]
Found 1 error in 1 file (errors prevented further checking)
mypy found errors
```
We can fix this by telling `mypy` to skip all imports of `numpy` in `pyproject.toml`:
```toml
[[tool.mypy.overrides]]
module = 'numpy'
follow_imports = 'skip'
follow_imports_for_stubs = true
```
- [x] don't follow imports from `numpy` in `mypy`
- [x] get rid of old rule not to follow `jinja2` imports, as we now require Python 3
The code in Spack to generate install and test reports currently suffers from unneeded complexity. For
instance, we have classes in Spack core packages, like `spack.reporters.CDash`, that need an
`argparse.Namespace` to be initialized and have "hard-coded" string literals on which they branch to
change their behavior:
```python
if do_fn.__name__ == "do_test" and skip_externals:
package["result"] = "skipped"
else:
package["result"] = "success"
package["stdout"] = fetch_log(pkg, do_fn, self.dir)
package["installed_from_binary_cache"] = pkg.installed_from_binary_cache
if do_fn.__name__ == "_install_task" and installed_already:
return
```
This PR attempts to polish the major issues encountered in both `spack.report` and `spack.reporters`.
Details:
- [x] `spack.reporters` is now a package that contains both the base class `Reporter` and all
the derived classes (`JUnit` and `CDash`)
- [x] Classes derived from `spack.reporters.Reporter` don't take an `argparse.Namespace` anymore
as argument to `__init__`. The rationale is that code for commands should be built upon Spack
core classes, not vice-versa.
- [x] An `argparse.Action` has been coded to create the correct `Reporter` object based on command
line arguments
- [x] The context managers to generate reports from either `spack install` or from `spack test` have
been greatly simplified, and have been made less "dynamic" in nature. In particular, the `collect_info`
class has been deleted in favor of two more specific context managers. This allows for a simpler
code structure and requires less knowledge from client code (in particular on which method to patch)
- [x] The `InfoCollector` class has been turned into a simple hierarchy, so as to avoid conditional statements
within methods that assume knowledge of the context in which the method is called.
On systems with remote groups, the primary user group may be remote and may not exist on
the local system (i.e., it might just be a number). On the CLI, it looks like this:
```console
> touch foo
> l foo
-rw-r--r-- 1 gamblin2 57095 0 Dec 29 22:24 foo
> chmod 2000 foo
chmod: changing permissions of 'foo': Operation not permitted
```
Here, the local machine doesn't know about per-user groups, so they appear as gids in
`ls` output. `57095` is also `gamblin2`'s uid, but the local machine doesn't know that
`gamblin2` is in the `57095` group.
Unfortunately, it seems that Python's `os.chmod()` just fails silently, setting
permissions to `0o0000` instead of `0o2000`. We can avoid this by ensuring that the file
has a group the user is known to be a member of.
- [x] Add `ensure_known_group()` in the permissions tests.
- [x] Call `ensure_known_group()` on tempfile in `test_chmod_real_entries_ignores_suid_sgid`.
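A rough sketch of what an `ensure_known_group()` helper could look like (the body here is an assumption, not the actual test code):
```python
import grp
import os

def ensure_known_group(path):
    # If the file's gid has no entry in the local group database (remote-only
    # group), chown it to the user's primary local group so chmod with setgid
    # bits behaves predictably. Sketch only; assumed behavior.
    gid = os.stat(path).st_gid
    try:
        grp.getgrgid(gid)
    except KeyError:
        os.chown(path, -1, os.getgid())
```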
There are a number of places in our docstrings where we write "list of X" as the type, even though napoleon doesn't actually support this. It ends up causing warnings when generating docs.
Now that we require Python 3, we don't have to rely on type hints in docs -- we can just use Python type hints and omit the types of args and return values from docstrings.
We should probably do this for all types in docstrings eventually, but this PR focuses on the ones that generate warnings during doc builds.
Some `mypy` annoyances we should consider in the future:
1. Adding some of these type annotations gets you:
```
note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs [annotation-unchecked]
```
because they are in unannotated functions (like constructors where we don't really need any annotations).
You can silence these with `disable_error_code = "annotation-unchecked"` in `pyproject.toml`
2. Right now we support running `mypy` in Python `3.6`. That means we have to support `mypy` `0.971`, which does not support `disable_error_code = "annotation-unchecked"`, so I just filter `[annotation-unchecked]` lines out in `spack style`.
3. I would rather just turn on `check_untyped_defs` and get more `mypy` coverage everywhere, but that will require about 1,000 fixes. We should probably do that eventually.
4. We could also consider only running `mypy` on newer python versions. This is not easy to do while supporting `3.6`, because you have to use `if TYPE_CHECKING` for a lot of things to ensure that 3.6 still parses correctly. If we only supported `3.7` and above we could use [`from __future__ import annotations`](https://mypy.readthedocs.io/en/stable/runtime_troubles.html#future-annotations-import-pep-563), but we have to support 3.6 for now. Sigh.
- [x] Convert a number of docstring types to Python type hints
- [x] Get rid of "list of" wherever it appears
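A small before/after illustration of the conversion (hypothetical function, not taken from the PR):
```python
from typing import List, Optional

# Before: "list of str" in the docstring is not understood by napoleon.
def find_libraries_old(names, root):
    """Search for libraries.

    Args:
        names (list of str): library names to look for
        root (str): directory to search

    Returns:
        list of str: paths of the libraries found
    """
    return []

# After: the types move into the signature; the docstring keeps only the prose.
def find_libraries_new(names: List[str], root: str) -> Optional[List[str]]:
    """Search for libraries under ``root`` and return their paths."""
    return []
```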
Based on the following lines in the top level `CMakeLists.txt` (I can't deep link since gitlab.cern.ch is not public), `cernlib` needs an explicit dependency on `libxaw` and `libxt`:
```cmake
find_package(X11 REQUIRED)
message(STATUS "CERNLIB: X11_Xt_LIB=${X11_Xt_LIB} X11_Xaw_LIB=${X11_Xaw_LIB} X11_LIBRARIES=${X11_LIBRARIES}")
```
Per https://github.com/spack/spack/issues/34192, apptainer does not support `--without-conmon`, so we introduce a base class `config_options` property that can be overridden in the `apptainer` package.
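A hedged sketch of the property-override pattern (simplified class names, not the actual recipes):
```python
# Simplified sketch of the override pattern; not the actual recipes.
class SingularityBase:
    @property
    def config_options(self):
        # singularity/singularityce still accept --without-conmon
        return ["--without-conmon"]

    def configure_args(self):
        return ["--prefix=/opt/example"] + self.config_options


class Apptainer(SingularityBase):
    @property
    def config_options(self):
        # apptainer removed --without-conmon, so drop it here
        return []
```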
`texinfo` depends on `gettext`, and it builds a perl module that uses gettext via an XS
module FFI. Unfortunately, the XS module build asks perl to tell it what compiler to
use instead of respecting the one passed to configure.
Without this change, the build fails with this error:
```
parsetexi/api.c:33:10: fatal error: 'libintl.h' file not found
^~~~~~~~~~~
```
We need the gettext dependency and the spack wrappers to ensure XS builds properly.
- [x] Add needed `gettext` dependency to `texinfo`
- [x] Override XS compiler with `PERL_EXT_CC`
Co-authored-by: Paul Kuberry <pakuber@sandia.gov>
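A sketch of the environment override (hedged; the exact hook used in the real `texinfo` recipe may differ):
```python
# Fragment as it could appear in a package.py recipe; the hook and value are assumptions.
class Texinfo:
    def setup_build_environment(self, env):
        # Make the perl XS module build use Spack's C compiler wrapper instead
        # of whatever compiler perl itself was built with.
        env.set("PERL_EXT_CC", "cc")
```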
Local `git` tests will fail with `fatal: transport 'file' not allowed` when using git 2.38.1 or higher, due to a fix for `CVE-2022-39253`.
This was fixed in CI in #33429, but that doesn't help the issue for anyone's local environment. Instead of fixing this with git config in CI, we should ensure that the tests run anywhere.
- [x] Introduce `spack.util.git`.
- [x] Use `spack.util.git.get_git()` to get a git executable, instead of `which("git")` everywhere.
- [x] Make all `git` tests use a `git` fixture that goes through `spack.util.git.get_git()`.
- [x] Add `-c protocol.file.allow=always` to all `git` invocations under `pytest`.
- [x] Revert changes from #33429, which are no longer needed.
`spack graph` has been reworked to use:
- Jinja templates
- builder objects to construct the template context when DOT graphs are requested.
This allowed to add a new colored output for DOT graphs that highlights both
the dependency types and the nodes that are needed at runtime for a given spec.
`spack solve` is supposed to show you times you can compare: setup, ground, solve, etc.,
all in a list. You're also supposed to be able to compare easily across runs. With
`pretty_seconds()` (introduced in #33900), it's easy to miss the units, e.g., spot the
bottleneck here:
```console
> spack solve --timers tcl
setup 22.125ms
load 16.083ms
ground 8.298ms
solve 848.055us
total 58.615ms
```
It's easier to see what matters if these are all in the same units, e.g.:
```
> spack solve --timers tcl
setup 0.0147s
load 0.0130s
ground 0.0078s
solve 0.0008s
total 0.0463s
```
And the units won't fluctuate from run to run as you make changes.
- [x] make `spack solve` timings consistent like before
* py-pytest-datadir: Init at 1.4.1
* py-pytest-data-dir: Fix missing dep
Co-authored-by: "Adam J. Stewart" <ajstewart426@gmail.com>
Co-authored-by: "Adam J. Stewart" <ajstewart426@gmail.com>
* ML CI: Linux x86_64
* Update comments
* Rename again
* Rename comments
* Update to match other arches
* No compiler
* Compiler was wrong anyway
* Faster TF
Avoid text decoding and encoding when combining log files; instead,
combine them in binary mode.
Also do a buffered copy, which is sometimes faster for large log files.
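A minimal sketch of the approach (not the exact Spack code):
```python
import shutil

def combine_logs(parts, combined):
    # Binary mode avoids any decode/encode round-trip; copyfileobj copies
    # in fixed-size chunks, which helps with very large log files.
    with open(combined, "wb") as out:
        for part in parts:
            with open(part, "rb") as src:
                shutil.copyfileobj(src, out, length=1024 * 1024)
```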
Currently, the Spack docs show documentation for submodules *before* the documentation for
the package itself on package doc pages. This means that if you put docs in `__init__.py` in
some package, the docs in there will be shown *after* the docs for all submodules of the
package instead of at the top as an intro to the package. See, e.g.,
[the lockfile docs](https://spack.readthedocs.io/en/latest/spack.environment.html#module-spack.environment),
which should be at the
[top of that page](https://spack.readthedocs.io/en/latest/spack.environment.html).
- [x] add the `--module-first` option to sphinx so that it generates module docs at top of page.
* librsvg: add 2.40.21, which does not require rust and has some security backports
https://download.gnome.org/sources/librsvg/2.40/librsvg-2.40.21.news
* librsvg: prevent finding broken gtkdoc binaries when ~doc is selected.
On my CentOS7 hosts, ./configure finds e.g. /bin/gtkdoc-rebase even when
~doc is selected. These tools use Python2, and fail with an error:
"ImportError: No module named site"
So prevent ./configure from finding these broken tools when not building
the +doc variant.
Without this patch, the build of paraview fails when reaching the third-party catalyst and other packages
with these types of errors:
335 /tmp/foo/spack-stage/spack-stage-paraview-5.10.1-gscoqxhhakjyyfirdefuhmi2bzw4scho/spack-src/VTK/ThirdParty/fmt/vtkfmt/vtkfmt/format.h:1732:11: error: cannot capture a bit-field by reference
336       if (sign) *it++ = static_cast<Char>(data::signs[sign]);
337                 ^
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
The sticky property will prevent clingo from changing the amdgpu_target
to work around conflicts. This is the same behaviour as was adopted for
cuda_arch in 055c9d125d.
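In package terms this corresponds roughly to a directive like the following (a fragment as it would sit inside a `package.py`; the value list is illustrative):
```python
# Illustrative fragment from a ROCm package recipe; the values are made up.
variant(
    "amdgpu_target",
    description="AMD GPU architecture targets",
    values=("gfx906", "gfx908", "gfx90a"),
    multi=True,
    sticky=True,  # the concretizer may not change this value to dodge a conflict
)
```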
Replace the filter_file applied to older configure scripts with an
upstream patch for rocm 5.3. Further, the patch is no longer needed for develop or
later releases.
Implement an alternative strategy to do index.json invalidation.
The current approach of pairs of index.json / index.json.hash is
problematic because it leads to races.
The standard solution for cache invalidation is etags, which are
supported by both http and s3 protocols, which allows one to do
conditional fetches.
This PR implements that for the http/https schemes. It should also work
for s3 schemes, but that requires other PRs to be merged.
Also it improves unit tests for index.json fetches.
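A hedged sketch of the conditional-fetch pattern with ETags (plain urllib, not the actual Spack code):
```python
import urllib.error
import urllib.request

def fetch_index_if_changed(url, cached_etag=None):
    """Return (body, etag) when the index changed, or (None, cached_etag) on a 304."""
    request = urllib.request.Request(url)
    if cached_etag:
        request.add_header("If-None-Match", cached_etag)
    try:
        with urllib.request.urlopen(request) as response:
            return response.read(), response.headers.get("ETag")
    except urllib.error.HTTPError as err:
        if err.code == 304:  # Not Modified: the locally cached index.json is still valid
            return None, cached_etag
        raise
```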
* py-scikit-image: add 0.19.3
* Update var/spack/repos/builtin/packages/py-scikit-image/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* LLVM: replace libelf dependency with elf
I didn't test this extensively, but in CMS LLVM builds just fine with elfutils.
* [@spackbot] updating style on behalf of iarspider
Co-authored-by: iarspider <iarspider@users.noreply.github.com>
* exiv2: add new versions
* babl: new package required to build GIMP
* gegl: new package required to build GIMP
* gexiv2: new package required to build GIMP
* libmypaint: new package required to build GIMP
* mypaint-brushes: new package required to build GIMP
* vala: new package required to build GIMP
* GIMP: new package definition for building GIMP-2.10 from source
* libjxl: update for 0.7.0
* libwmf: a library for reading vector images in Windows Metafile Format (WMF)
* libde265: an open source implementation of the h.265 video codec
* libwebp: add new versions
* GIMP: additional variants for building GIMP-2.10 from source
* libde265: remove boilerplate
* fixes for style precheck
* updates based on feedback
* fixes for style precheck
Update the depends_on("perl") to depends_on("perl+threads").
This and #34074 are needed to properly handle e.g. the perl-Thread-Queue
rpm package:
It may not be installed on RedHat-based hosts, which can lead to automake
build failures when `spack external find perl` or `spack external find --all`
was used to pick up the system-provided perl install.
* nextflow recipe: added latest stable version
* tower-cli recipe: added latest release
* recipes tower-agent and tower-cli renamed to nf-tower-agent and nf-tower-cli
* recipes nf-tower-agent and nf-tower-cli: small fix
* nf-core-tools recipe: added most py- dependencies
* nf-core-tools: recipe without galaxy-tool-util (for testing)
* fixed typos in py-yacman recipe
* fixed typos in py-pytest-workflow recipe
* fixed typo in nf-core-tools recipe
* fixed typos in py-yacman recipe
* fixes in recipes for py-questionary and py-url-normalize
* fixes to py-yacman recipe
* style fixes to py- packages that are dependencies to nf-core-tools
* fix in py-requests-cache recipe
* added missing dep in py-requests-cache recipe
* nf-core-tools deps: removed redundant python dep for py packages oyaml and piper
* nf-core-tools recipe: final, incl dep on py-galaxy-tool-util
* nf-core-tools: new version with extra dependency
* commit to merge packages on focus from update/nextflow-tools
* nf-core: commenting galaxy dep for this pr
* Update var/spack/repos/builtin/packages/py-requests-cache/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-requests-cache/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* removed nf-core-tools from this branch, will be back at the end
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Interim fix for #34559
Spack's MSVC compiler definition uses ifx as the Fortran compiler.
Prior to #33385, the Spack MSVC compiler definition required the
executable to be called "ifx.exe"; #33385 replaced this with just
"ifx", which inadvertently led to ifx falsely indicating the
presence of MSVC on non-Windows systems (which leads to future
errors when attempting to query/use those compiler objects).
This commit applies a short-term fix by updating MSVC Fortran
version detection to always indicate a failure on non-Windows.
fixes #34518
Fix an issue due to the MRO chain of the package wrapper
during build. Before this PR we were always returning
False when the builder object was created before the
run_tests method was monkey patched.
openPMD, a metadata standard on top of backends like ADIOS2 and HDF5,
is implemented in ParaView 5.9+ via a Python3 module.
Simplify Conflicts & Variant
Add to ECP Data Vis SDK
This reverts commit 8035eeb36d.
And also removes logic around an additional HEAD request to prevent
a more expensive GET request on wrong content-type. Since large files
are typically an attachment and only downloaded when reading the
stream, it's not an optimization that helps much, and in fact the logic
was broken since the GET request was done unconditionally.
* Add py-bmtk and py-neurotools
* py-bmtk: version bump
* [@spackbot] updating style on behalf of heerener
* Maybe the copyright needs to be extended to 2022 for the check to pass
* Process review remarks
* Update var/spack/repos/builtin/packages/py-neurotools/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The main issue that's fixed is that Spack passes paths (as strings) to
functions that require urls. That wasn't an issue on unix, since there
you can simply concatenate `file://` and `path` and all is good, but on
Windows that gives invalid file urls. Also on Unix, Spack would not deal with uri encoding like x%20y for file paths.
It also removes Spack's custom url.parse function, which had its own incorrect interpretation of file urls, taking file://x/y to mean the relative path x/y instead of hostname=x and path=/y. Also it automatically interpolated variables, which is surprising for a function that parses URLs.
Instead of all sorts of ad-hoc `if windows: fix_broken_file_url` this PR
adds two helper functions around Python's own path2url and reverse.
Also fixes a bug where some `spack buildcache` commands
used `-d` as a flag to mean `--mirror-url` requiring a URL, and others
`--directory`, requiring a path. It is now the latter consistently.
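An illustrative version of the two helpers built on the standard library (the names are made up; Spack's actual helpers may differ):
```python
import pathlib
import urllib.parse
import urllib.request

def path_to_file_url(path):
    # Handles Windows drive letters and percent-encodes characters such as spaces.
    return pathlib.Path(path).resolve().as_uri()

def file_url_to_path(url):
    # Reverses the percent-encoding and slash conventions for the current platform.
    return urllib.request.url2pathname(urllib.parse.urlparse(url).path)
```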
* Add GFE packages, Update pFUnit
* Remove citibeth as maintainer per her request
* Version 3.3.0 is an odd duck. Needs a v
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
When installing binary tarballs, Spack has to download from its
binary mirrors.
Sometimes Spack has cache available for these mirrors.
That cache helps to order mirrors to increase the likelihood of
getting a direct hit.
However, currently, when Spack can't find a spec in any local cache
of mirrors, it's very dumb:
- A while ago it used to query each mirror to see if it had a spec,
and use that information to order the mirror again, only to go
about and do exactly a part of what it just did: fetch the spec
from that mirror.
- Recently, it was changed to download a full index.json, which
can be multiple dozens of MBs of data and may take a minute to
process thanks to the blazing fast performance you get with
Python.
In a typical use case of concretizing with reuse, the full index.json
is already available, and it is likely that the local cache gives a perfect
mirror ordering on install. (There's typically no need to update any
caches).
However, in the use case of Gitlab CI, the build jobs don't have cache,
and it would be smart to just do direct fetches instead of all the
redundant work of (1) and/or (2).
Also, direct fetches from mirrors will soon be fast enough to
prefer these direct fetches over the excruciating slowness of
index.json files.
* include Drishti
* fix syntax
* Update var/spack/repos/builtin/packages/drishti/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* Update var/spack/repos/builtin/packages/drishti/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-sphinxcontrib-devhelp: add 1.0.2
* Update var/spack/repos/builtin/packages/py-sphinxcontrib-devhelp/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-sphinxcontrib-applehelp: add 1.0.2
* Update var/spack/repos/builtin/packages/py-sphinxcontrib-applehelp/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ROCm 5.3.0 updates
* New patches for 5.3.0 on hip and hsakmt
* Adding additional build arguments in hip and llvm
* RVS updates for 5.3.0 release
* New patches and rocm-tensile, rocprofiler-dev, roctracer-dev recipe updates for 5.3.0
* Reverting OPENMP fix from rocm-tensile
* Removing the patch to compile without git and adding without it
* Install library in to lib directory instead of lib64 across all platform
* Setting lib install directory to lib
* Disable gallivm coroutine for libllvm15
* Update llvm-amdgpu prefix path in hip-config.cmake.in
Removing libllvm15 from Mesa dependency
* hip-config.cmake.in update required from 5.2
* hip-config.cmake.in update required from 5.2 and above
* hip-config.cmake.in update required for all 5.2 release above
* Style check correction in hip update
* ginkgo: add missing include
* Patching hsa include path for rocm 5.3
* Restricting patch for llvm-15
* Style check error correction
* PIC flag required for the new test applications
* Passing -DCMAKE_POSITION_INDEPENDENT_CODE=ON in the cmake_args instead of setting -fPIC in CFLAGS
Co-authored-by: Cordell Bloor <Cordell.Bloor@amd.com>
gnulib/lib/malloca.c uses the single-value `static_assert()` that is only available with C11
syntax. `gcc` seems to be fine, but `icc` needs an extra flag.
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Since it's a header-only library there's nothing to build. However, the
default targets include tests and examples and there's no option to turn
them off during configuration time.
Per https://geant4.web.cern.ch/node/1837 the correct dependency for 10.6 is on `g4emlow@7.9.1`, not on both `g4emlow@7.9` and `g4emlow@7.9.1`.
This is a minor cosmetic fix. The concretization for 10.6 works just fine here. But this removes the duplicate entry.
This PR patches the f_check script to detect the ifort compiler and
ensure that F_COMPILER is set to INTEL. This problem was introduced with
openblas-0.3.21. Without this patch, the value of F_COMPILER falls back
to G77, and icc rather than ifort is used for the linking stage. That
results in the openblas library missing libifcore, which in turn means
many Fortran programs cannot be compiled with ifort.
Writing a long dependency like:
```python
depends_on(
"llvm"
"targets=amdgpu,bpf,nvptx,webassembly"
"version_suffix=jl +link_llvm_dylib ~internal_unwind"
)
```
when it should be formatted like this:
```python
depends_on(
"llvm"
" targets=amdgpu,bpf,nvptx,webassembly"
" version_suffix=jl +link_llvm_dylib ~internal_unwind"
)
```
can cause really subtle errors. Specifically, you'll get something like this in
the package sanity tests:
```
AttributeError: 'NoneType' object has no attribute 'rpartition'
```
because Spack happily constructs a class that has a dependency with name `None`.
We can catch this earlier by banning anonymous dependency specs directly in
`depends_on()`. This causes the package itself to fail to parse, and emits
a much better error message:
```
==> Error: Invalid dependency specification in package 'julia':
llvmtargets=amdgpu,bpf,nvptx,webassemblyversion_suffix=jl +link_llvm_dylib ~internal_unwind
```
* Only restrict CMake version in umpire when examples and rocm are enabled
* Add CMAKE_HIP_ARCHITECTURES to Umpire and lift cmake version restriction
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
This patch:
https://gcc.gnu.org/legacy-ml/gcc-patches/2018-01/msg01962.html
is actually in Amazon Linux GCC 7.3.1, which we use in CI.
So we should not hold openblas back because of it.
Old versions of OpenBLAS fail to detect the host arch of some of the
AVX512 cpus of build nodes, causing build failures.
Of course we should try to set ARCH properly in OpenBLAS to avoid that
it looks up the build arch, but that's quite some work.
* Debian-like distros use the multiarch implementation spec
https://wiki.ubuntu.com/MultiarchSpec
Instead of being limited to /usr/lib64, architecture-based
lib directories are used. For instance, under ubuntu a library package
on x86_64 installs binaries under /usr/lib/x86_64-linux-gnu.
Building pmix with external dependencies like hwloc or libevent
fails because, with the prefix set to /usr, that prefix works for
headers and binaries but does not work for libraries. The default
library location /usr/lib64 does not hold the installed libraries.
Pmix build options --with-libevent and --with-libhwloc allow us to
specify dependent library locations. This commit is an effort to
highlight and resolve such an issue when users want to use Debian-like
distro library packages and use spack to build pmix.
There maybe other packages that might be impacted in a similar way.
* Adding libs property to hwloc and libevent and some cleanups to pmix patch
* Fixing style and adding comment on Pmix' 32-bit hwloc version detection issue
* `url_exists` improvements (take 2)
Make `url_exists` do HEAD request for http/https/s3 protocols
Rework the opener: construct it once and only once, dynamically dispatch
to the right one based on config.
* geant4: version bumps for Geant4 11.1.0
- Version bumps for new data libraries
- g4ndl 4.7
- g4emlow 8.2
- Add geant4-data@11.1.0
- Checksum new Geant4 11.1.0 release
- Limit +python variant to maximum of :11.0 due to removal of
Geant4Py in 11.1
- Update CLHEP dependency to at least 2.4.6.0 for this release
- Update VecGeom dependency to at least 1.2.0 for this release,
closing version ranges for older releases to prevent multiple
versions satisfying requirement
* geant4: correct max version for python support
It's very common for us to tell users to grep through the existing Spack packages to
find examples of what they want, and it's also very common for package developers to do
it. Now, searching packages is even easier.
`spack pkg grep` runs grep on all `package.py` files in repos known to Spack. It has no
special options other than the search string; all options passed to it are forwarded
along to `grep`.
```console
> spack pkg grep --help
usage: spack pkg grep [--help] ...
positional arguments:
grep_args arguments for grep
options:
--help show this help message and exit
```
```console
> spack pkg grep CMakePackage | head -3
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/3dtk/package.py:class _3dtk(CMakePackage):
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/abseil-cpp/package.py:class AbseilCpp(CMakePackage):
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/accfft/package.py:class Accfft(CMakePackage, CudaPackage):
```
```console
> spack pkg grep -Eho '(\S*)\(PythonPackage\)' | head -3
AwsParallelcluster(PythonPackage)
Awscli(PythonPackage)
Bueno(PythonPackage)
```
## Return Value
This retains the return value semantics of `grep`:
* 0 for found,
* 1 for not found
* >1 for error
## Choosing a `grep`
You can set the ``SPACK_GREP`` environment variable to choose the ``grep``
executable this command should use.
Unit tests on Windows are supposed to pass for any PR to pass CI.
However, the return code for the unit test command was not being
checked, which meant this check was always passing (effectively
disabled). This PR
* Properly checks the result of the unit tests and fails if the
unit tests fail
* Fixes (or disables on Windows) a number of tests which have
"drifted" out of support on Windows since this check was
effectively disabled
At some point the `a` mock package became an `AutotoolsPackage`, and that means it
depends on `gnuconfig` on macOS. This was causing one of our shell tests to fail on
macOS because it was testing for `{a.prefix.bin}:{b.prefix.bin}` in `PATH`, but
`gnuconfig` shows up between them.
- [x] simplify the test to check `spack load --sh a` and `spack load --sh b` separately
* Add 20 as a valid option for cxxstd to fmt
* Add pika 0.11.0
* Fix version constraint for p2300 variant in pika package
* Add pika-algorithms package
* py-reportlab: add 3.6.12
* Update var/spack/repos/builtin/packages/py-reportlab/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* adding recipe for singularity-hpc - 1st go
* typo in singularity-hpc recipe
* singularity-hpc, spython recipes: added platform variant
* singularity-hpc, spython recipes: platform variant renamed to runtime
* style fix
* another style fix
* yet another style fix (why are they not reported altogether)
* singularity-hpc recipe: added Vanessa as maintainer
* singularity-hpc recipe: add podman variant
* singularity-hpc recipe: added variant for module system
* shpc recipe: add version for py-semver dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-spython recipe: no need to specify generic python dep for a python pkg
* py-spython: py-requests not needed
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
## Motivation
Our parser grew to be quite complex, with a 2-state lexer and logic in the parser
that has up to 5 levels of nested conditionals. In the future, to turn compilers into
proper dependencies, we'll have to increase the complexity further as we foresee
the need to add:
1. Edge attributes
2. Spec nesting
to the spec syntax (see https://github.com/spack/seps/pull/5 for an initial discussion of
those changes). The main attempt here is thus to _simplify the existing code_ before
we start extending it later. We try to do that by adopting a different token granularity,
and by using more complex regexes for tokenization. This allows us to have a "flatter"
encoding for the parser. i.e., it has fewer nested conditionals and a near-trivial lexer.
There are places, namely in `VERSION`, where we have to use negative lookahead judiciously
to avoid ambiguity. Specifically, this parse is ambiguous without `(?!\s*=)` in `VERSION_RANGE`
and an extra final `\b` in `VERSION`:
```
@ 1.2.3 : develop # This is a version range 1.2.3:develop
@ 1.2.3 : develop=foo # This is a version range 1.2.3: followed by a key-value pair
```
## Differences with the previous parser
~There are currently 2 known differences with the previous parser, which have been added on purpose:~
- ~No spaces allowed after a sigil (e.g. `foo @ 1.2.3` is invalid while `foo @1.2.3` is valid)~
- ~`/<hash> @1.2.3` can be parsed as a concrete spec followed by an anonymous spec (before was invalid)~
~We can recover the previous behavior on both ones but, especially for the second one, it seems the current behavior in the PR is more consistent.~
The parser is currently 100% backward compatible.
## Error handling
Being based on more complex regexes, we can possibly improve error
handling by adding regexes for common issues and hint users on that.
I'll leave that for a following PR, but there's a stub for this approach in the PR.
## Performance
To be sure we don't add any performance penalty with this new encoding, I measured:
```console
$ spack python -m timeit -s "import spack.spec" -c "spack.spec.Spec(<spec>)"
```
for different specs on my machine:
* **Spack:** 0.20.0.dev0 (c9db4e50ba045f5697816187accaf2451cb1aae7)
* **Python:** 3.8.10
* **Platform:** linux-ubuntu20.04-icelake
* **Concretizer:** clingo
results are:
| Spec | develop | this PR |
| ------------- | ------------- | ------- |
| `trilinos` | 28.9 usec | 13.1 usec |
| `trilinos @1.2.10:1.4.20,2.0.1` | 131 usec | 120 usec |
| `trilinos %gcc` | 44.9 usec | 20.9 usec |
| `trilinos +foo` | 44.1 usec | 21.3 usec |
| `trilinos foo=bar` | 59.5 usec | 25.6 usec |
| `trilinos foo=bar ^ mpich foo=baz` | 120 usec | 82.1 usec |
so this new parser seems to be consistently faster than the previous one.
## Modifications
In this PR we just substituted the Spec parser, which means:
- [x] Deleted in `spec.py` the `SpecParser` and `SpecLexer` classes. deleted `spack/parse.py`
- [x] Added a new parser in `spack/parser.py`
- [x] Hooked the new parser in all the places the previous one was used
- [x] Adapted unit tests in `test/spec_syntax.py`
## Possible future improvements
Random thoughts while working on the PR:
- Currently we transform hashes and files into specs during parsing. I think
we might want to introduce an additional step and parse special objects like
a `FileSpec` etc. in-between parsing and concretization.
* py-pywavelets: add 1.4.1
* Update var/spack/repos/builtin/packages/py-pywavelets/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pywavelets/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [quantum-espresso] Parallel make fails for 6.{6,7}
I run into a race condition in `make` with Intel compiler on icelake when building QE 6.6 and 6.7.
* Fix comment
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
* fix location of input for darshan-util tests
Darshan log file used for test input was removed from the Darshan
repo after the 3.4.0 release. This commit adds logic to use a
different log file as test input for later Darshan versions.
* libctl: add new version
Change-Id: I16f91cfab198c66b60407ab5bb2cb3ebeac6bc19
* New package: libgdsii
Change-Id: I34b52260ab68ecc857ddf8cc63b124adc2689a51
* New package: mpb
Change-Id: I6fdf5321c33d6bdbcaa1569026139a8483a3bcf8
* meep: add new version and variants
Change-Id: I0b60a9a4d9a329f7bde9027514467e17376e6a39
* meep: use with_or_without
Change-Id: I05584cb13df8ee153ed385e77d367cb34e39777e
* LibCatalyst: Fix version of pre-release develop version
* ParaView: Requires libcatalyst@2:
* ParaView: Apply adios2 module no kit patch to 5.11
This patch is still pending in VTK and didn't make it into 5.11 as anticipated.
* bcache: support external gettext when `libintl` is in glibc
Many glibc-based Linux systems don't have gettext's libintl because
libintl is included in the standard system's glibc (libc) itself.
When using `spack external find gettext` on those, packages like
`bcache`, which unconditionally link using `-lintl`, fail to link.
Description of the fix:
The libs property of spack's gettext recipe returns the list of libs,
so when gettext provides libintl, use it. When not, there is no
separate libintl library and the libintl API is provided by glibc.
Tested with `spack external find gettext` on glibc-based Linux and
in musl-based Alpine Linux to make sure that when -lintl is really
needed, it is really used and nothing breaks.
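A rough sketch of how a consumer can use that property (simplified; not the actual `bcache` recipe):
```python
# Simplified sketch; the real recipe and method names may differ.
class Bcache:
    def setup_build_environment(self, env):
        # gettext's libs property is empty when libintl lives inside glibc,
        # so only add -lintl when a separate library actually exists.
        if self.spec["gettext"].libs:
            env.append_flags("LDFLAGS", "-lintl")
```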
* Fix mercurial print_str failure.
* Perform same fix on py-pybind11 for print_string missing method.
Co-authored-by: Nicholas Cameron Sly <sly1@llnl.gov>
* Add new root versions and associated dependency constraints
* Please style guide
* Avoid conflicts where possible
* Untested prototype of macOS version detection
* Fixes for macOS version prototype
* More logical ordering
* More correctness and style fixes
* Try to use spack's macos_version
* Add some forgotten @s
* Actually, Spack can't build Python 3.6 anymore, and thus no older PyROOT
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
* Fix compile errors with latest HDF5 1.13.3
* format
* Update var/spack/repos/builtin/packages/hdf5-vol-async/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This commit reworks the bootstrapping procedure to use Spack environments
as much as possible.
The `spack.bootstrap` module has also been reorganized into a Python package.
A distinction is made among "core" Spack dependencies (clingo, GnuPG, patchelf)
and other dependencies. For a number of reasons, explained in the `spack.bootstrap.core`
module docstring, "core" dependencies are bootstrapped with the current ad-hoc
method.
All the other dependencies are instead bootstrapped using a Spack environment
that lives in a directory specific to the interpreter and the architecture being used.
The cray manifest shows dependency information for cray-mpich, which we never previously cared about
because it can only be used as an external. This updates Spack's dependency information to make cray-mpich
specs read in from the cray external manifest usable.
* initial changes for rocm recipes
* drop support for older releases
* drop support for older rocm releases - add more recipes
* drop support for older releases
* address style issues
* address style error
* fix errors
* address review comments
* Added plumed version 2.8.1 including gromacs compatibility
* Corrected ~mpi and +mpi variants in new depends
* Fixed regression logic plumed+gromacs@2020.6 support
All the vermin annotations we were using were for optional features introduced in early
Python 3 versions. We no longer need any of them, as we only support Python 3.6+. If we
start optionally using features from newer Pythons than 3.6, we will need more vermin
annotations.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
We no longer support Python <3.6, so we don't need to check whether Python supports SSL
verification in `spack.util.web`.
- [x] Remove a bunch of logic we needed to appease Python 2
We've stopped supporting Python 2, and contributors are noticing that our CI no longer
allows Python 2.7 comment type hints. They end up having to adapt them, but this adds
extra unrelated work to PRs.
- [x] Move to 3.6 type hints across the entire code base
* doxygen: add build-tool tag
This allows it to be included automatically as an external. No one links against doxygen so this should be ok.
* doxygen: add self as maintainer
* vecgeom: add new 1.2.1 version
* vecgeom: introduce conflict between gcc/cuda
Recent tests of vecgeom in Spack environments have shown that the build
with +cuda fails with GCC >= 11.3 and CUDA < 11.7 with error
...lib/gcc/x86_64-pc-linux-gnu/11.3.0/include/serializeintrin.h(41):
error: identifier "__builtin_ia32_serialize" is undefined
1 error detected in the compilation of
".../VecGeom/source/BVHManager.cu".
Other GCC/CUDA combinations appear o.k.
Avoid this error in spack, and document it for users, with a conflict
directive to express the restriction.
All Spec attributes are now represented as `attr(attribute_name, ... args ...)`, e.g.
`attr(node, "hdf5")` instead of `node("hdf5")`, as we *have* to maintain the `attr()`
form anyway, and it simplifies the encoding to just maintain one form of the Spec
information.
Background
----------
In #20644, we unified the way conditionals are done in the concretizer, but this
introduced a nasty aspect to the encoding: we have to maintain everything we want in
general conditions in two forms: `predicate(...)` and `attr("predicate", ...)`. For
example, here's the start of the table of spec attributes we had to maintain:
```prolog
node(Package) :- attr("node", Package).
virtual_node(Virtual) :- attr("virtual_node", Virtual).
hash(Package, Hash) :- attr("hash", Package, Hash).
version(Package, Version) :- attr("version", Package, Version).
...
```
```prolog
attr("node", Package) :- node(Package).
attr("virtual_node", Virtual) :- virtual_node(Virtual).
attr("hash", Package, Hash) :- hash(Package, Hash).
attr("version", Package, Version) :- version(Package, Version).
...
```
This adds cognitive load to understanding how the concretizer works, as you have to
understand the equivalence between the two forms of spec attributes. It also makes the
general condition logic in #20644 hard to explain, and it's easy to forget to add a new
equivalence to this list when adding new spec attributes (at least two people have been
bitten by this).
Solution
--------
- [x] remove the equivalence list from `concretize.lp`
- [x] simplify `spec_clauses()`, `condition()`, and other functions in `asp.py` that need
to deal with `Spec` attributes.
- [x] Convert all old-form spec attributes in `concretize.lp` to the `attr()` form
- [x] Simplify `display.lp`, where we also had to maintain a list of spec attributes. Now
we only need to show `attr/2`, `attr/3`, and `attr/4`.
- [x] Simplify model extraction logic in `asp.py`.
Performance
-----------
This seems to result in a smaller grounded problem (as there are no longer duplicated
`attr("foo", ...)` / `foo(...)` predicates in the program), but it also adds a slight
performance overhead vs. develop. Ultimately, simplifying the encoding will be a win,
particularly for improving error messages.
Notes
-----
This will simplify future node refactors in `concretize.lp` (e.g., not identifying nodes
by package name, which we need for separate build dependencies).
I'm still not entirely used to reading `attr()` notation, but I think it's ultimately
clearer than what we did before. We need more uniform naming, and it's now clear what is
part of a solution. We should probably continue making the encoding of `concretize.lp`
simpler and more self-explanatory. It may make sense to rename `attr` to something like
`node_attr` and to simplify the names of node attributes. It also might make sense to do
something similar for other types of predicates in `concretize.lp`.
* Add py-python-lsp-server and dependencies
* Update var/spack/repos/builtin/packages/py-python-lsp-server/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Relax version range constraints on py-python-lsp-jsonrpc and add missing dep
* Add runtime dependency flag to setuptools dependencies
* Remove unused python@3.6: dependency and move setuptools-scm to build dep only
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix recipe
* Update package.py
* Update recipe following review
* Update var/spack/repos/builtin/packages/py-onnx-runtime/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* remove unused imports
* Update package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* CI: Update Data and Vis SDK Stack
* Update image to match target deployments (E4S)
* Enable all packages
* Test supported variants of ParaView and VisIt
* Sensei: Update Python hint for newer cmake
* Sensei: add Python3 hint
* Add new version of snakemake
* Add myself as a maintainer
* py-retry -> py-reretry
* Added snakemake variants for storage systems
* Updated comments
* Responded to Adam's comments
* Fixed spack style
* Add build/run dependency types
* New package py-statmorph w/ dependencies. Add py-astropy@5.1
* [@spackbot] updating style on behalf of meyersbs
* [py-statmorph,py-astropy,py-pyerfa] minor fixes
* Add checksum for py-rsa 4.9
* Update package.py
* Update var/spack/repos/builtin/packages/py-rsa/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-fastpath: new package
* Update var/spack/repos/builtin/packages/py-fastpath/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Historically, development of the Octopus code was done on the "develop" branch
on https://gitlab.com/octopus-code/octopus but now development takes place on
"main" (since Q3 2022).
The suggestion in this PR to solve the issue is to keep the spack label
`octopus@develop` as this better indicates this is the development branch on git
than `octopus@main`, but of course to use the `main` branch (there is no choice
here - the `develop` branch is not touched anymore). Sticking to
`octopus@develop` as the version label also keeps backwards compatibility.
This reverts commit d06fd26c9a.
The problem is that Bitbucket's API forwards download requests to an S3 bucket using a temporary URL. This URL includes a signature for the request, which embeds the HTTP verb. That means only GET requests are allowed, and HEAD requests would fail verification, leading to 403 errors. The same is observed when using `curl -LI ...`
v0.5 no longer builds: the way `arch` is set changed in v0.5.2, and `arbor/package.py` did not keep compatibility with the older scheme. Since v0.5.2 works with `arbor/package.py`, and is API compatible with v0.5, any users relying on v0.5 can rely on v0.5.2 instead.
Using `-Werror` is good practice for development and testing, but causes us a great
deal of heartburn supporting multiple compiler versions, especially as newer compiler
versions add warnings for released packages. This PR adds support for suppressing
`-Werror` through spack's compiler wrappers. There are currently three modes for
the `flags:keep_werror` setting:
* `none`: (default) cancel all `-Werror`, `-Werror=*` and `-Werror-*` flags by
converting them to `-Wno-error[=]*` flags
* `specific`: preserve explicitly selected warnings as errors, such as
`-Werror=format-truncation`, but reverse the blanket `-Werror`
* `all`: keeps all `-Werror` flags
These can be set globally in config.yaml, through the config command-line flags, or
overridden by a particular package (some packages use Werror as a proxy for determining
support for other compiler features). We chose to use this approach because:
1. removing `-Werror` flags entirely broke *many* build systems, especially autoconf
based ones, because of things like checking `-Werror=feature` and making the
assumption that if that did not error other flags related to that feature would also work
2. Attempting to preserve `-Werror` in some phases but not others caused similar issues
3. The per-package setting came about because some packages, even with all these
protections, still use `-Werror` unsafely. Currently there are roughly 3 such packages
known.
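To make the `none` mode concrete, here is a small Python sketch of the flag rewriting (the real logic lives in Spack's compiler wrappers; this is only an illustration):
```python
def cancel_werror(flags, keep_werror="none"):
    """Rewrite -Werror flags according to flags:keep_werror (illustrative sketch)."""
    if keep_werror == "all":
        return list(flags)
    rewritten = []
    for flag in flags:
        if flag == "-Werror":
            rewritten.append("-Wno-error")  # reverse the blanket -Werror
        elif flag.startswith("-Werror=") and keep_werror == "specific":
            rewritten.append(flag)          # keep explicitly selected warnings-as-errors
        elif flag.startswith("-Werror="):
            rewritten.append("-Wno-error=" + flag[len("-Werror="):])
        else:
            rewritten.append(flag)
    return rewritten

# cancel_werror(["-O2", "-Werror", "-Werror=format-truncation"], "specific")
#   -> ["-O2", "-Wno-error", "-Werror=format-truncation"]
```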
* Working updates to py-antlr4-python3-runtime and py-omegaconf
* [py-antlr4-python3-runtime] added version 4.9.3
* [@spackbot] updating style on behalf of qwertos
Co-authored-by: Benjamin Meyers <bsmits@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
For reasons beyond me Python thinks it's a great idea to upgrade HEAD
requests to GET requests when following redirects. So, this PR adds a
better `HTTPRedirectHandler`, and also moves some ad-hoc logic around
for dealing with disabling SSL certs verification.
Also, I'm stumped by the fact that Spack's `url_exists` does not use
HEAD requests at all, so in certain cases Spack awkwardly downloads
something first to see if it can download it, and then downloads it
again because it knows it can download it. So, this PR ensures that both
urllib and botocore use HEAD requests.
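A minimal sketch of the idea using urllib's standard hook (not Spack's exact handler, just the verb-preserving part):
```python
import urllib.request

class SameMethodRedirectHandler(urllib.request.HTTPRedirectHandler):
    """Keep the original HTTP method (e.g. HEAD) when following a redirect."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        new_req = super().redirect_request(req, fp, code, msg, headers, newurl)
        if new_req is not None:
            # urllib would otherwise re-issue the redirected request as a GET
            new_req.method = req.get_method()
        return new_req

# opener = urllib.request.build_opener(SameMethodRedirectHandler())
```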
Finally, it also removes some things that were there to support currently
unsupported Python versions.
Notice that the HTTP spec [section 10.3.2](https://datatracker.ietf.org/doc/html/rfc2616.html#section-10.3.2) just talks about how to deal
with POST requests on redirect (whether to follow or not):
> If the 301 status code is received in response to a request other
> than GET or HEAD, the user agent MUST NOT automatically redirect the
> request unless it can be confirmed by the user, since this might
> change the conditions under which the request was issued.
> Note: When automatically redirecting a POST request after
> receiving a 301 status code, some existing HTTP/1.0 user agents
> will erroneously change it into a GET request.
Python has a comment about this; they chose to go with the "erroneous change".
But they then mess up the HEAD request while following the redirect, probably
because they were too busy discussing how to deal with POST.
See https://github.com/python/cpython/pull/99731
The package.py assumed "+mpi" in many places, without checking for the variant.
This problem went undetected, as a hard dependency on scalapack pulled an mpi
implementation into the dependency chain (this is also fixed).
Also, the +mpi variant is used to select between serial and parallel mode:
it has to enable both MPI and ScaLAPACK, since they are interdependent;
compilation fails when one is enabled without the other.
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
This adds super-lazy maintainer mode to `spack checksum`: Instead of
only printing the new checksums to the terminal, `-a` and
`--add-to-package` will add the new checksums to the `package.py` file
and open it in the editor afterwards for final checks.
This PR removes [end of life](https://endoflife.date/python) versions of Python from Spack. Specifically, this includes all versions of Python older than 3.7.
See https://github.com/spack/spack/discussions/31824 for rationale. Deprecated in #32615 and #28003.
For anyone using software that relies on Python 2, you have a few options:
* Upgrade the software to support Python 3. The `2to3` tool may get you most of the way there, although more complex libraries may need manual tweaking.
* Add Python 2 as an [external package](https://spack.readthedocs.io/en/latest/build_settings.html#external-packages). Many Python libraries do not support Python 2, but you may be able to add older versions that did once upon a time.
* Use Spack 0.19, the last release to officially support Python 3.6 and older.
* Create and maintain your own [custom repository](https://spack.readthedocs.io/en/latest/repositories.html). Basically, you would need a package for Python 2 and any other Python 2-specific libraries you need.
* Add a WindowsRegistryView class, which can query for existing
package installations on Windows. This is particularly important
because some Windows packages (including those added here)
do not allow two simultaneous installs, and this can be
queried in order to provide a clear error message.
* Consolidate external path detection logic for Windows into
WindowsKitExternalPaths and WindowsCompilerExternalPaths objects.
* Add external-only packages win-sdk and wgl
* Add win-wdk (including external detection) which depends on
win-sdk
* Replace prior msmpi implementation with a source-based install
(depends on win-wdk). This install can control the install
destination (unlike the binary installation).
* Update MSVC compiler to choose vcvars based on win-sdk dependency
* Provide "msbuild" module-level variable to packages during build
* When creating symlinks on Windows, need to explicitly specify when
a symlink target is a directory
* executables_in_path no-longer defaults to using PATH (this is
now expected to be taken care of by the caller)
Spec traversals can now specify a topological ordering. A topologically-
ordered traversal with input specs X1, X2... will
* include all of X1, X2... and their children
* be ordered such that a given node is guaranteed to appear before any
of its children in the traversal
Other notes:
* Input specs can be children of other input specs (this is useful if
a user specifies a set of specs to uninstall: some of those specs
might be children of others)
* `direction="parents"` will produce a reversed topological order
(children always come before parents).
* `cover="edges"` will generate a list of edges L such that (a) input
edges will always appear before output edges and (b) if you create
a list with the destination of each edge in L the result is
topologically ordered
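A hedged usage sketch for the ordering described above, assuming it is exposed through the `order` keyword of the traversal API:
```python
import spack.traverse as traverse

def topo_nodes(specs):
    # Every yielded node appears before any of its children, so reversing
    # the list gives a safe children-before-parents order.
    return list(traverse.traverse_nodes(specs, order="topo"))
```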
* test_suite.py: speed up slow test by using mock packages
* Don't resolve the sha during unit-tests
* Skip long-running test that fails, instead of executing it
* uninstall: fix accidental cubic complexity
Currently spack uninstall runs in worst case cubic time complexity
thanks to traversal during traversal during traversal while collecting
the specs to be uninstalled.
Also brings down the number of error messages printed to something
linear in the number of matching specs instead of quadratic.
* qt6: initial commit of several basic qt6 packages
* Qt6: fix style issues
* [qt6] fix style issues, trailing spaces
* [qt6] rename to qt-* ecosystem; remove imports
* [qt6] rename dependencies; change version strings
* [qt6] list_urls
* [qt6] homepage links
* [qt6] missing closing quotes failed style check
* qt-declarative: use private _versions
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
* qt-quick3d, qt-quicktimeline, qt-shadertools: use private _versions
* qt-base: rework feature defines and use run_tests
* qt: new version 6.2.4
* flake8 whitespace before comma
* qt-base: variant opengl when +gui
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
* qt6: rebase and apply new black style
* qt6: apply style isort fixes
* qt6: new version 6.3.0 and 6.3.1
* qt6: add 6.3.0 and 6.3.1 to versions list
* qt6: multi-argument join_path
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* qt-base: fix isort
* qt-shadertools: no cmake_args needed
* qt-declarative: imports up front
* qt-quick3d: fix import
* qt-declarative: remove useless cmake_args
* qt-shadertools: imports and join_path fixes
* qt-quick3d: join_path fixes
* qt-declarative: join_path fixes
* Update features based on gui usage
* Update dependencies, cmake args, mac support
* Update features based on linux
* More updates
* qt-base: fix style
* qt-base: archive_files join_path
* qt-base: new version 6.3.2
* qt-{declarative,quick3d,quicktimeline,shadertools}@6.3.2
* qt-base: require libxcb@1.13: and use system xcb_xinput when on linux
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update dask and related packages
* Update package dependency specs
* Run spack style
* Add new version of locket
* Respond to comments
* Added constraints
* Add version constraints for py-dask+distributed
* Run spack style
* Update var/spack/repos/builtin/packages/py-dask/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Deprecated dask versions
* Deprecated more dask and distirbuted
* spack style --fix
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* graphviz: remove cyclic dep to svg when pangocairo and poppler+glib failure
* graphviz: remove cyclic dep to svg when pangocairo and poppler+glib failure
* Add a regression test for 33928
* PackageBase should not set `(build|install)_time_test_callbacks`
* Fix audits by preserving the current semantic
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
gcc@10: Newer binutils than RHEL7/8's are required for guaranteed operation. Therefore, on RHEL7/8, reject ~binutils. You need to add +binutils to be sure the binutils are recent enough.
See this discussion with the OpenBLAS devs for reference:
https://github.com/xianyi/OpenBLAS/issues/3805#issuecomment-1319878852
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
Add a dependency on python versions less than 3.10 in order to work
around a bug in libxml2's configure script that fails to parse python
version strings with more than one character for the minor version.
The bug is present in v2.10.1, but has been fixed in 2.10.2.
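In package terms, this amounts to a constraint along these lines (a sketch; the recipe's exact spelling may differ):
```python
# Work around the configure parsing bug for releases that still carry it.
depends_on("python@:3.9", when="@:2.10.1")
```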
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
* add version 1.46.0 to bioconductor package r-a4
* add version 1.46.0 to bioconductor package r-a4base
* add version 1.46.0 to bioconductor package r-a4classif
* add version 1.46.0 to bioconductor package r-a4core
* add version 1.46.0 to bioconductor package r-a4preproc
* add version 1.46.0 to bioconductor package r-a4reporting
* add version 1.52.0 to bioconductor package r-absseq
* add version 1.28.0 to bioconductor package r-acde
* add version 1.76.0 to bioconductor package r-acgh
* add version 2.54.0 to bioconductor package r-acme
* add version 1.68.0 to bioconductor package r-adsplit
* add version 1.70.0 to bioconductor package r-affxparser
* add version 1.76.0 to bioconductor package r-affy
* add version 1.74.0 to bioconductor package r-affycomp
* add version 1.58.0 to bioconductor package r-affycompatible
* add version 1.56.0 to bioconductor package r-affycontam
* add version 1.70.0 to bioconductor package r-affycoretools
* add version 1.46.0 to bioconductor package r-affydata
* add version 1.50.0 to bioconductor package r-affyilm
* add version 1.68.0 to bioconductor package r-affyio
* add version 1.74.0 to bioconductor package r-affyplm
* add version 1.44.0 to bioconductor package r-affyrnadegradation
* add version 1.46.0 to bioconductor package r-agdex
* add version 3.30.0 to bioconductor package r-agilp
* add version 2.48.0 to bioconductor package r-agimicrorna
* add version 1.30.0 to bioconductor package r-aims
* add version 1.30.0 to bioconductor package r-aldex2
* add version 1.36.0 to bioconductor package r-allelicimbalance
* add version 1.24.0 to bioconductor package r-alpine
* add version 2.60.0 to bioconductor package r-altcdfenvs
* add version 2.22.0 to bioconductor package r-anaquin
* add version 1.26.0 to bioconductor package r-aneufinder
* add version 1.26.0 to bioconductor package r-aneufinderdata
* add version 1.70.0 to bioconductor package r-annaffy
* add version 1.76.0 to bioconductor package r-annotate
* add version 1.60.0 to bioconductor package r-annotationdbi
* add version 1.22.0 to bioconductor package r-annotationfilter
* add version 1.40.0 to bioconductor package r-annotationforge
* add version 3.6.0 to bioconductor package r-annotationhub
* add version 3.28.0 to bioconductor package r-aroma-light
* add version 1.30.0 to bioconductor package r-bamsignals
* add version 2.14.0 to bioconductor package r-beachmat
* add version 2.58.0 to bioconductor package r-biobase
* add version 2.6.0 to bioconductor package r-biocfilecache
* add version 0.44.0 to bioconductor package r-biocgenerics
* add version 1.8.0 to bioconductor package r-biocio
* add version 1.16.0 to bioconductor package r-biocneighbors
* add version 1.32.1 to bioconductor package r-biocparallel
* add version 1.14.0 to bioconductor package r-biocsingular
* add version 2.26.0 to bioconductor package r-biocstyle
* add version 3.16.0 to bioconductor package r-biocversion
* add version 2.54.0 to bioconductor package r-biomart
* add version 1.26.0 to bioconductor package r-biomformat
* add version 2.66.0 to bioconductor package r-biostrings
* add version 1.46.0 to bioconductor package r-biovizbase
* add version 1.8.0 to bioconductor package r-bluster
* add version 1.66.1 to bioconductor package r-bsgenome
* add version 1.34.0 to bioconductor package r-bsseq
* add version 1.40.0 to bioconductor package r-bumphunter
* add version 2.64.0 to bioconductor package r-category
* add version 2.28.0 to bioconductor package r-champ
* add version 2.30.0 to bioconductor package r-champdata
* add version 1.48.0 to bioconductor package r-chipseq
* add version 4.6.0 to bioconductor package r-clusterprofiler
* add version 1.34.0 to bioconductor package r-cner
* add version 1.30.0 to bioconductor package r-codex
* add version 2.14.0 to bioconductor package r-complexheatmap
* add version 1.72.0 to bioconductor package r-ctc
* add version 2.26.0 to bioconductor package r-decipher
* add version 0.24.0 to bioconductor package r-delayedarray
* add version 1.20.0 to bioconductor package r-delayedmatrixstats
* add version 1.38.0 to bioconductor package r-deseq2
* add version 1.44.0 to bioconductor package r-dexseq
* add version 1.40.0 to bioconductor package r-dirichletmultinomial
* add version 2.12.0 to bioconductor package r-dmrcate
* add version 1.72.0 to bioconductor package r-dnacopy
* add version 3.24.1 to bioconductor package r-dose
* add version 2.46.0 to bioconductor package r-dss
* add version 3.40.0 to bioconductor package r-edger
* add version 1.18.0 to bioconductor package r-enrichplot
* add version 2.22.0 to bioconductor package r-ensembldb
* add version 1.44.0 to bioconductor package r-exomecopy
* add version 2.6.0 to bioconductor package r-experimenthub
* add version 1.24.0 to bioconductor package r-fgsea
* add version 2.70.0 to bioconductor package r-gcrma
* add version 1.34.0 to bioconductor package r-gdsfmt
* add version 1.80.0 to bioconductor package r-genefilter
* add version 1.34.0 to bioconductor package r-genelendatabase
* add version 1.70.0 to bioconductor package r-genemeta
* add version 1.76.0 to bioconductor package r-geneplotter
* add version 1.20.0 to bioconductor package r-genie3
* add version 1.34.3 to bioconductor package r-genomeinfodb
* update r-genomeinfodbdata
* add version 1.34.0 to bioconductor package r-genomicalignments
* add version 1.50.2 to bioconductor package r-genomicfeatures
* add version 1.50.1 to bioconductor package r-genomicranges
* add version 2.66.0 to bioconductor package r-geoquery
* add version 1.46.0 to bioconductor package r-ggbio
* add version 3.6.2 to bioconductor package r-ggtree
* add version 2.8.0 to bioconductor package r-glimma
* add version 1.10.0 to bioconductor package r-glmgampoi
* add version 5.52.0 to bioconductor package r-globaltest
* update r-go-db
* add version 1.18.0 to bioconductor package r-gofuncr
* add version 2.24.0 to bioconductor package r-gosemsim
* add version 1.50.0 to bioconductor package r-goseq
* add version 2.64.0 to bioconductor package r-gostats
* add version 1.76.0 to bioconductor package r-graph
* add version 1.60.0 to bioconductor package r-gseabase
* add version 1.30.0 to bioconductor package r-gtrellis
* add version 1.42.0 to bioconductor package r-gviz
* add version 1.26.0 to bioconductor package r-hdf5array
* add version 1.70.0 to bioconductor package r-hypergraph
* add version 1.34.0 to bioconductor package r-illumina450probevariants-db
* add version 0.40.0 to bioconductor package r-illuminaio
* add version 1.72.0 to bioconductor package r-impute
* add version 1.36.0 to bioconductor package r-interactivedisplaybase
* add version 2.32.0 to bioconductor package r-iranges
* add version 1.58.0 to bioconductor package r-kegggraph
* add version 1.38.0 to bioconductor package r-keggrest
* add version 3.54.0 to bioconductor package r-limma
* add version 2.50.0 to bioconductor package r-lumi
* add version 1.74.0 to bioconductor package r-makecdfenv
* add version 1.76.0 to bioconductor package r-marray
* add version 1.10.0 to bioconductor package r-matrixgenerics
* add version 1.6.0 to bioconductor package r-metapod
* add version 2.44.0 to bioconductor package r-methylumi
* add version 1.44.0 to bioconductor package r-minfi
* add version 1.32.0 to bioconductor package r-missmethyl
* add version 1.78.0 to bioconductor package r-mlinterfaces
* add version 1.10.0 to bioconductor package r-mscoreutils
* add version 2.24.0 to bioconductor package r-msnbase
* add version 2.54.0 to bioconductor package r-multtest
* add version 1.36.0 to bioconductor package r-mzid
* add version 2.32.0 to bioconductor package r-mzr
* add version 1.60.0 to bioconductor package r-oligoclasses
* update r-org-hs-eg-db
* add version 1.40.0 to bioconductor package r-organismdbi
* add version 1.38.0 to bioconductor package r-pathview
* add version 1.90.0 to bioconductor package r-pcamethods
* update r-pfam-db
* add version 1.42.0 to bioconductor package r-phyloseq
* add version 1.60.0 to bioconductor package r-preprocesscore
* add version 1.30.0 to bioconductor package r-protgenerics
* add version 1.32.0 to bioconductor package r-quantro
* add version 2.30.0 to bioconductor package r-qvalue
* add version 1.74.0 to bioconductor package r-rbgl
* add version 2.38.0 to bioconductor package r-reportingtools
* add version 2.42.0 to bioconductor package r-rgraphviz
* add version 2.42.0 to bioconductor package r-rhdf5
* add version 1.10.0 to bioconductor package r-rhdf5filters
* add version 1.20.0 to bioconductor package r-rhdf5lib
* add version 2.0.0 to bioconductor package r-rhtslib
* add version 1.74.0 to bioconductor package r-roc
* add version 1.26.0 to bioconductor package r-rots
* add version 2.14.0 to bioconductor package r-rsamtools
* add version 1.58.0 to bioconductor package r-rtracklayer
* add version 0.36.0 to bioconductor package r-s4vectors
* add version 1.6.0 to bioconductor package r-scaledmatrix
* add version 1.26.0 to bioconductor package r-scater
* add version 1.12.0 to bioconductor package r-scdblfinder
* add version 1.26.0 to bioconductor package r-scran
* add version 1.8.0 to bioconductor package r-scuttle
* add version 1.64.0 to bioconductor package r-seqlogo
* add version 1.56.0 to bioconductor package r-shortread
* add version 1.72.0 to bioconductor package r-siggenes
* add version 1.20.0 to bioconductor package r-singlecellexperiment
* add version 1.32.0 to bioconductor package r-snprelate
* add version 1.48.0 to bioconductor package r-snpstats
* add version 2.34.0 to bioconductor package r-somaticsignatures
* add version 1.10.0 to bioconductor package r-sparsematrixstats
* add version 1.38.0 to bioconductor package r-spem
* add version 1.36.0 to bioconductor package r-sseq
* add version 1.28.0 to bioconductor package r-summarizedexperiment
* add version 3.46.0 to bioconductor package r-sva
* add version 1.36.0 to bioconductor package r-tfbstools
* add version 1.20.0 to bioconductor package r-tmixclust
* add version 2.50.0 to bioconductor package r-topgo
* add version 1.22.0 to bioconductor package r-treeio
* add version 1.26.0 to bioconductor package r-tximport
* add version 1.26.0 to bioconductor package r-tximportdata
* add version 1.44.0 to bioconductor package r-variantannotation
* add version 3.66.0 to bioconductor package r-vsn
* add version 2.4.0 to bioconductor package r-watermelon
* add version 2.44.0 to bioconductor package r-xde
* add version 1.56.0 to bioconductor package r-xmapbridge
* add version 0.38.0 to bioconductor package r-xvector
* add version 1.24.0 to bioconductor package r-yapsa
* add version 1.24.0 to bioconductor package r-yarn
* add version 1.44.0 to bioconductor package r-zlibbioc
* make version resource consistent for r-bsgenome-hsapiens-ucsc-hg19
* make version resource consistent for r-go-db
* make version resource consistent for r-kegg-db
* make version resource consistent for r-org-hs-eg-db
* make version resource consistent for r-pfam-db
* new package: r-ggrastr
* Patches not needed for new version
* new package: r-hdo-db
* new package: r-ggnewscale
* new package: r-gson
* Actually depends on ggplot2@3.4.0:
* Fix formatting of r-hdo-db
* Fix dependency version specifiers
* Clean up duplicate dependency references
* Enable hdf5 build (including +mpi) on Windows
* This includes updates to hdf5 dependencies openssl (minor edit) and
bzip2 (more-extensive edits)
* Add binary-based installation of msmpi (this is currently the only
supported MPI implementation in Spack for Windows). Note that this
does not install to the Spack-specified prefix. This implementation
will be replaced with a source-based implementation
Co-authored-by: John Parent <john.parent@kitware.com>
Setting PYTHONHOME is rarely needed (since each interpreter has
various ways of setting it automatically) and very often it is
difficult to get right manually.
For instance, the change done to set PYTHONHOME to
sysconfig["base_prefix"] broke bootstrapping dev dependencies
of Spack for me, when working inside a virtual environment in Linux.
* replace mpi as a variant instead of dependency
* separate serial and MPI dependencies
* configure args depending on serial or mpi variant
* reformat with black
In #3113, `https` was removed to ensure that `curl` can be bootstrapped
without SSL being present. This was lost in #25672 which aimed to use
`https` where possible.
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
#32942 fixed bootstrapping on Windows by having the core Spack
code explicitly add the Clingo package bin/ directory as a
DLL path.
Since then, #33400 has been merged, which ensures that the Python
module installed by the Spack `clingo` package can find the DLLs
in bin/.
Note that this only works for Spack instances which have been
bootstrapped after #33400: for installations bootstrapped before
then, you will need to run `spack clean -b` (this would only
be needed for Spack instances running on Windows).
Building neovim@0.8.0 complains (for me) about Lua's lpeg and mpack
packages not being available at build time. Removing the link-only
setting in the dependencies for these two packages fixes the build for
me.
Revamp the timer so we always have a designated begin and end.
Fix a bug where the phase timer was stopped before the phase started,
resulting in incorrect timing reports in timers.json.
Add spack.ld_so_conf.host_dynamic_linker_search_paths
Retrieve the current host runtime search paths for shared libraries;
for GNU and musl Linux we try to retrieve the dynamic linker from the
current Python interpreter and then find the corresponding config file
(e.g. ld.so.conf or ld-musl-<arch>.path). Similar can be done for
BSD and others, but this is not implemented yet. The default paths
are always returned. We don't check if the listed directories exist.
Use this in spack external find for libraries.
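A small usage sketch of the new helper:
```python
import spack.ld_so_conf

# Host directories searched by the dynamic linker; default paths such as
# /lib64 and /usr/lib64 are always included, existence is not checked.
search_dirs = spack.ld_so_conf.host_dynamic_linker_search_paths()
```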
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
While binaries built for PRs that get merged must still be rebuilt
in develop pipelines, they can be used by other PRs that find they
would otherwise need to rebuild them. Now that spackbot is
managing copying PR binaries from merged PRs into a shared location,
keeping it pruned to a reasonable size, and making sure the indices
are up to date, spack can use these mirrors as a potential source
of binaries.
I'm finding I often want the date in my paths and it would be nice if spack had a config variable for this.
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Remove CI jobs related to Python 2.7
* Remove Python 2.7 specific code from Spack core
* Remove externals for Python 2 only
* Remove llnl.util.compat
We added a hotfix to releases/v0.19 with a feature flag, but the flag
is incompatible with the config schema on `develop`.
- [x] Ensure schema is compatible on develop even though config option is unused.
* Speed-up bootstrap mirror unit test
The unit test doesn't need to concretize, since it checks
only metadata for the mirror.
* architecture.py: use "default_mock_concretization" for slow test
Environments and environment views have taken over the role of `spack activate/deactivate`, and we should deprecate these commands for several reasons:
- Global activation is a really poor idea:
- Install prefixes should be immutable, since they can have multiple, unrelated dependents; see below
- Added complexity elsewhere: verification of installations, tarballs for build caches, creation of environment views of packages with unrelated extensions "globally activated"... by removing the feature, it gets easier for people to contribute, and we'd end up with fewer bugs due to edge cases.
- Environments accomplish the same thing for non-global "activation", i.e. `spack view`, but better.
Also we write in the docs:
```
However, Spack global activations have two potential drawbacks:
#. Activated packages that involve compiled C extensions may still
need their dependencies to be loaded manually. For example,
``spack load openblas`` might be required to make ``py-numpy``
work.
#. Global activations "break" a core feature of Spack, which is that
multiple versions of a package can co-exist side-by-side. For example,
suppose you wish to run a Python package in two different
environments but the same basic Python --- one with
``py-numpy@1.7`` and one with ``py-numpy@1.8``. Spack extensions
will not support this potential debugging use case.
```
Now that environments are established and views can take over the role of activation
non-destructively, we can remove global activation/deactivation.
* add version 1.7-20 to r-ade4
* add version 1.1-13 to r-adephylo
* add version 0.3-20 to r-adespatial
* add version 1.2-0 to r-afex
* add version 0.8-19 to r-amap
* add version 0.1.8 to r-aplot
* add version 2.0.8 to r-biasedurn
* add version 2.4-4 to r-bio3d
* add version 1.30.19 to r-biocmanager
* add version 1.2-9 to r-brobdingnag
* add version 0.4.1 to r-bslib
* add version 3.7.3 to r-callr
* add version 3.1-1 to r-car
* add version 0.3-62 to r-clue
* add version 1.2.0 to r-colourpicker
* add version 1.8.1 to r-commonmark
* add version 0.4.3 to r-cpp11
* add version 1.14.4 to r-data-table
* add version 1.34 to r-desolve
* add version 2.4.5 to r-devtools
* add version 0.6.30 to r-digest
* add version 0.26 to r-dt
* add version 1.7-12 to r-e1071
* add version 1.8.2 to r-emmeans
* add version 1.1.16 to r-exomedepth
* add version 0.1-8 to r-expint
* add version 0.4.0 to r-fontawesome
* add version 1.5-2 to r-fracdiff
* add version 1.10.0 to r-future-apply
* add version 3.0.1 to r-ggmap
* add version 3.4.0 to r-ggplot2
* add version 2.1.0 to r-ggraph
* add version 0.6.4 to r-ggsignif
* add version 0.6-7 to r-gmp
* add version 2022.10-2 to r-gparotation
* add version 0.8.3 to r-graphlayouts
* add version 1.8.8 to r-grbase
* add version 2.1-0 to r-gstat
* add version 0.18.6 to r-insight
* add version 1.3.1 to r-irkernel
* add version 1.8.3 to r-jsonlite
* add version 1.7.0 to r-lava
* add version 1.1-31 to r-lme4
* add version 5.6.17 to r-lpsolve
* add version 5.5.2.0-17.9 to r-lpsolveapi
* add version 1.2.9 to r-mapproj
* add version 3.4.1 to r-maps
* add version 1.1-5 to r-maptools
* add version 1.3 to r-markdown
* add version 6.0.0 to r-mclust
* add version 4.2-2 to r-memuse
* add version 1.8-41 to r-mgcv
* add version 1.2.5 to r-minqa
* add version 0.3.7 to r-nanotime
* add version 2.4.1.1 to r-nfactors
* add version 3.1-160 to r-nlme
* add version 2.7-1 to r-nmof
* add version 0.60-16 to r-np
* add version 2.0.4 to r-openssl
* add version 4.2.5.1 to r-openxlsx
* add version 0.3-8 to r-pbdzmq
* add version 2.0-3 to r-pcapp
* add version 0.7.0 to r-philentropy
* add version 2.0.3 to r-pkgcache
* add version 1.3.1 to r-pkgload
* add version 1.10-4 to r-polyclip
* add version 3.8.0 to r-processx
* add version 1.7.2 to r-ps
* add version 2.12.1 to r-r-utils
* add version 1.2.4 to r-ragg
* add version 3.7 to r-rainbow
* add version 0.0.20 to r-rcppannoy
* add version 0.3.3.9.3 to r-rcppeigen
* add version 0.3.12 to r-rcppgsl
* add version 1.0.2 to r-recipes
* add version 1.1.10 to r-rmutil
* add version 0.10.24 to r-rmysql
* add version 2.4.8 to r-rnexml
* add version 4.1.19 to r-rpart
* add version 1.7-2 to r-rrcov
* add version 0.8.28 to r-rsconnect
* add version 0.25 to r-servr
* add version 1.7.3 to r-shiny
* add version 3.0-0 to r-spatstat-data
* add version 3.0-3 to r-spatstat-geom
* add version 3.0-1 to r-spatstat-random
* add version 3.0-0 to r-spatstat-sparse
* add version 3.0-1 to r-spatstat-utils
* add version 1.8.0 to r-styler
* add version 3.4.1 to r-sys
* add version 1.5-2 to r-tclust
* add version 0.0.9 to r-tfmpvalue
* add version 1.2.0 to r-tidyselect
* add version 0.10-52 to r-tseries
* add version 4.2.2 to r-v8
* add version 0.5.0 to r-vctrs
* add version 2.6-4 to r-vegan
* add version 0.7.0 to r-wk
* add version 0.34 to r-xfun
* add version 1.0.6 to r-xlconnect
* add version 3.99-0.12 to r-xml
* add version 0.12.2 to r-xts
* add version 1.0-33 to r-yaimpute
* add version 2.3.6 to r-yaml
* add version 2.2.2 to r-zip
* add version 2.2-7 to r-deoptim
* add version 4.3.1 to r-ergm
* add version 0.18 to r-evaluate
* add version 1.29.0 to r-future
* add version 0.0.8 to r-ggfun
* add version 0.9.2 to r-ggrepel
* add version 1.9.0 to r-lubridate
* add version 4.10.1 to r-plotly
* add version 0.2.12 to r-rcppcctz
* add version 1.2 to r-rook
* add version 1.6-1 to r-segmented
* add version 4.2.1 to r-seurat
* add version 4.1.3 to r-seuratobject
* add version 1.0-9 to r-sf
* add version 1.5-1 to r-sp
* add version 1.8.1 to r-styler
* new package: r-timechange
* new package: r-stars
* new package: r-sftime
* new package: r-spatstat-explore
Co-authored-by: glennpj <glennpj@users.noreply.github.com>
* updates and fixes for libpressio
* differentiate between standalone and build tests
* add e4s tags
Co-authored-by: Robert Underwood <runderwood@anl.gov>
* include py-darshan
* include requested changes
* fix required versions
* fix style
* fix style
* Update package.py
* Update var/spack/repos/builtin/packages/py-darshan/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Boost 1.64.0 has build errors when building the python and MPI modules. This was previously just a comment in the package.py which allowed broken specs to concretize. The comments are now expressed in conflicts to prevent this.
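A sketch of how such a comment turns into directives (variant names follow the description above; the recipe's exact spellings may differ):
```python
# In boost/package.py: keep broken specs from concretizing instead of
# relying on a comment.
conflicts("+python", when="@1.64.0", msg="Boost 1.64.0 fails to build the Python module")
conflicts("+mpi", when="@1.64.0", msg="Boost 1.64.0 fails to build the MPI module")
```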
Currently, external `PythonPackage`s cause install failures because the logic in `PythonPackage` assumes that it can ask for `spec["python"]`. Because we chop off externals' dependencies, an external Python extension may not have a `python` dependency.
This PR resolves the issue by guaranteeing that a `python` node is present in one of two ways:
1. If there is already a `python` node in the DAG, we wire the external up to it.
2. If there is no existing `python` node, we wire up a synthetic external `python` node, and we assume that it has the same prefix as the external.
The assumption in (2) isn't always valid, but it's better than leaving the user with a non-working `PythonPackage`.
The logic here is specific to `python`, but other types of extensions could take advantage of it. Packages need only define `update_external_dependencies(self)`, and this method will be called on externals after concretization. This likely needs to be fleshed out in the future so that any added nodes are included in concretization, but for now we only bolt on dependencies post-concretization.
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
* [mfem] updates related to building with cuda
* [hypre] tweak to support building with external ROCm/HIP
* [mfem] more tweaks related to building with +rocm
* [mfem] temporary (?) workaround for issue #33684
* [mfem] fix style
* [mfem] fix +shared+miniapps install
* Add checksum for py-protobuf 4.21.7, protobuf 21.7
* Update var/spack/repos/builtin/packages/protobuf/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* Update var/spack/repos/builtin/packages/protobuf/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/protobuf/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* Update package.py
* Delete protoc2.5.0_aarch64.patch
* Update package.py
* Restore but deprecate py-protobuf 3.0.0a/b; deprecate py-tensorflow 0.x
* Fix audit
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Spack currently creates a temporary sbang that is moved "atomically" in place,
but this temporary causes races when multiple processes start installing sbang.
Let's just stick to an idempotent approach. Notice that we only re-install sbang
if Spack updates it (since we do file compare), and sbang was only touched
18 times in the past 6 years, whereas we hit the sbang tempfile issue
frequently with parallel install on a fresh spack instance in CI.
Also fixes a bug where permissions weren't updated if config changed but
the latest version of the sbang file was already installed.
The `intel` compiler at versions > 20 is provided by the `intel-oneapi-compilers-classic`
package (a thin wrapper around the `intel-oneapi-compilers` package), and the `oneapi`
compiler is provided by the `intel-oneapi-compilers` package.
Prior to this work, neither of these compilers could be bootstrapped by Spack as part of
an install with `install_missing_compilers: True`.
Changes made to make these two packages bootstrappable:
1. The `intel-oneapi-compilers-classic` package includes a bin directory and symlinks
to the compiler executables, not just logical pointers in Spack.
2. Spack can look for bootstrapped compilers in directories other than `$prefix/bin`,
defined on a per-package basis
3. `intel-oneapi-compilers` specifies a non-default search directory for the
compiler executables.
4. The `spack.compilers` module now can make more advanced associations between
packages and compilers, not just simple name translations
5. Spack support for lmod hierarchies accounts for differences between package
names and the associated compiler names for `intel-oneapi-compilers/oneapi`,
`intel-oneapi-compilers-classic/intel@20:`, `llvm+clang/clang`, and
`llvm-amdgpu/rocmcc`.
- [x] full end-to-end testing
- [x] add unit tests
* Add zstd support for elfutils
Not defining `+zstd` implies passing the `--without-zstd` flag to configure.
This avoids automatic library detection and thus makes the build depend
only on Spack-installed dependencies.
* Use autotools helper "with_or_without"
* Revert use of with_or_without
Using `with_or_without()` with `variant` keyword does not seem to work.
"spack install foo" no longer adds package "foo" to the environment
(i.e. to the list of root specs) by default: you must specify "--add".
Likewise "spack uninstall foo" no longer removes package "foo" from
the environment: you must specify --remove. Generally this means
that install/uninstall commands will no longer modify the user's list
of root specs (which many users found problematic: they had to
deactivate an environment if they wanted to uninstall a spec without
changing their spack.yaml description).
In more detail: if you have environments e1 and e2, and specs [P, Q, R]
such that P depends on R, Q depends on R, [P, R] are in e1, and [Q, R]
are in e2:
* `spack uninstall --dependents --remove r` in e1: removes R from e1
(but does not uninstall it) and uninstalls (and removes) P
* `spack uninstall -f --dependents r` in e1: will uninstall P, Q, and
R (i.e. e2 will have dependent specs uninstalled as a side effect)
* `spack uninstall -f --dependents --remove r` in e1: this uninstalls
P, Q, and R, and removes [P, R] from e1
* `spack uninstall -f --remove r` in e1: uninstalls R (so it is
"missing" in both environments) and removes R from e1 (note that e1
would still install R as a dependency of P, but it would no longer
be listed as a root spec)
* `spack uninstall --dependents r` in e1: will fail because e2 needs R
Individual unit tests were created for each of these scenarios.
Somehow a network error when cloning the repo for ci gets
categorized by gitlab as a script failure. To make sure we retry
jobs that failed for that reason or a similar one, include
"script_failure" as one of the reasons for retrying service jobs
(which include "no specs to rebuild" jobs, update buildcache
index jobs, and temp storage cleanup jobs).
Add a `project` block to the toml config along with development and CI
dependencies and a minimal `build-system` block, doing basically
nothing, so that spack can be bootstrapped to a full development
environment with:
```shell
$ hatch -e dev shell
```
or for a minimal environment without hatch:
```shell
$ python3 -m venv venv
$ source venv/bin/activate
$ python3 -m pip install --upgrade pip
$ python3 -m pip install -e '.[dev]'
```
This means we can re-use the requirements list throughout the workflow
yaml files and otherwise maintain this list in *one place* rather than
several disparate ones. We may be stuck with a couple more temporarily
to continue supporting python2.7, but aside from that it's less places
to get out of sync and a couple new bootstrap options.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This change uses the aws cli, if available, to retrieve spec files
from the mirror to a local temp directory, then parallelizes the
reading of those files from disk using multiprocessing.ThreadPool.
If the aws cli is not available, then a ThreadPool is used to fetch
and read the spec files from the mirror.
Using the aws cli results in a ~16x speedup when recreating the binary
mirror index, while just parallelizing the fetching and reading
results in a ~3x speedup.
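An illustrative sketch of the parallel read step (not the exact buildcache code):
```python
import json
from multiprocessing.pool import ThreadPool

def read_spec_files(paths, nthreads=32):
    """Read already-fetched spec files from local disk in parallel."""

    def read_one(path):
        with open(path) as f:
            return json.load(f)

    with ThreadPool(nthreads) as pool:
        return pool.map(read_one, paths)
```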
The compiler bootstrapping logic currently does not add a task when the compiler package is already in the install task queue. This causes failures when the compiler package is added without the additional metadata telling the task to update the compilers list.
Solution: requeue compilers for bootstrapping when needed, to update `task.compiler` metadata.
Currently, develop specs that are not roots and are not explicitly listed dependencies
of the roots are not applied.
- [x] ensure dev specs are applied.
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
`spack env create` enables a view by default (in a weird hidden
directory, but well...). This is asking for trouble with the other
default of `concretizer:unify:false`, since having different flavors of
the same spec in an environment, leads to collision errors when
generating the view.
A change of defaults would improve the user experience: `unify:true` makes
the most sense, since whenever the issue is brought up in Slack, the user
ends up changing the concretization config anyway (having different flavors
of the same spec was never the intention), and install times are decreased.
Further, we improve the docs and drop the duplicate root spec limitation.
Dependencies specified by hash are unique in Spack in that the abstract
specs are created with internal structure. In this case, the constraint
generation for spec matrices fails due to flattening the structure.
It turns out that the dep_difference method for Spec.constrain does not
need to operate on transitive deps to ensure correctness. Removing transitive
deps from this method resolves the bug.
- [x] Includes regression test
Without this, Meson will use its Wraps to automatically download and
install dependencies. We want to manage dependencies explicitly,
therefore disable this functionality.
Currently, Spack can fail for a valid spec if the spec is constructed from overlapping, but not conflicting, concrete specs via the hash.
For example, if abcdef and ghijkl are the hashes of specs that both depend on zlib/mnopqr, then foo ^/abcdef ^/ghijkl will fail to construct a spec, with the error message "Cannot depend on zlib... twice".
This PR changes this behavior to check whether the specs are compatible before failing.
With this PR, foo ^/abcdef ^/ghijkl will concretize.
As a side-effect, so will foo ^zlib ^zlib and other specs that are redundant on their dependencies.
* ADD version 0.19.0 in py-gym recipe
* Fix py-gym download url and dependencies for v0.19.0
* Fix stupid error in previous commit: no change in py-cloudpickle dep
* Yes, I should've paid more attention! O:)
I think now it is right, thanks!
Argparse started raising ArgumentError exceptions
when the same parser is added twice. Therefore, we
perform the addition only if the parser is not there
already
Port match syntax to our unparser
Compilers and linker optimize string constants for space by aliasing
them when one is a suffix of another. For gcc / binutils this happens
already at -O1, due to -fmerge-constants. This means that we have
to take care during relocation to always preserve a certain length
of the suffix of those prefixes that are C-strings.
In this commit we pick length 7 as a safe suffix length, assuming the
suffix is typically the 7 characters from the hash (i.e. random), so
it's unlikely to alias with any string constant used in the sources.
In general we now pad shortened strings from the left with leading
directory separators, but in the case of C-strings that are much shorter
and don't share a common suffix (due to projections), we do allow
shrinking the C-string, appending a null, and retaining the old part
of the prefix.
Also when rewiring, we ensure that the new hash preserves the last
7 bytes of the old hash.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
A user may want to set some attributes on a package without actually modifying the package (e.g. if they want to git pull updates to the package without conflicts). This PR adds a per-package configuration section called "package_attributes", which is a dictionary of attribute names to desired values. For example:
```yaml
packages:
  openblas:
    package_attributes:
      submodules: true
      git: "https://github.com/myfork/openblas"
```
In this case, the package will always retrieve git submodules, and will use an alternate location for the git repo.
While git, url, and submodules are the attributes for which we envision the most usage, this allows any attribute to be overridden, and the acceptable values are any value parseable from yaml.
Newer versions of the CrayPE for EX systems have standalone compiler executables for CCE and compiler wrappers for Cray MPICH. With those, we can treat the cray systems as part of the linux platform rather than having a separate cray platform.
This PR:
- [x] Changes cray platform detection to ignore EX systems with Craype version 21.10 or later
- [x] Changes the cce compiler to be detectable via paths
- [x] Changes the spack compiler wrapper to understand the executable names for the standalone cce compiler (`craycc`, `crayCC`, `crayftn`).
For some instances of externally-provided Python (e.g. Homebrew),
the LDLIBRARY/LIBRARY config variables don't actually refer to
libraries and should therefore be excluded from ".libs".
Only enable the hdf5-vfd-gds package if it can compile.
- hdf5-vfd-gds needs cuda@11.7.1+ to be able to `find_library` for cuFile.
- Only enable hdf5-vfd-gds in the sdk if cuda@11.7.1+ is available.
If an earlier version of cuda is being used, do not depend on the
hdf5-vfd-gds package at all.
* take two
* Add missing import statement
* Group dependencies together
* Extract libtiff arguments
* Extract libpng arguments
* Push preamble variable into png_args and tiff_args
* Extract setting args associated with the screenshot variant
* Inlined a few variables
* Modify only build targets and install targets
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Whenever the rpath string actually _grows_, it falls back to patchelf;
when it stays the same length or gets shorter, we update it in-place,
padded with null bytes.
This PR only deals with absolute -> absolute rpath replacement. We don't
use `_build_tarball(relative=True)` in our CI. If `relative` then it falls
back to the old replacement code.
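The in-place update boils down to something like the following sketch (helper shape assumed; the real code locates the rpath string in the ELF dynamic string table):
```python
def overwrite_rpath_inplace(binary, offset, old_rpath: bytes, new_rpath: bytes) -> bool:
    """Overwrite the rpath at `offset` in an open binary file, if it fits."""
    if len(new_rpath) > len(old_rpath):
        return False  # caller falls back to patchelf
    binary.seek(offset)
    # Pad with null bytes so the string keeps its original length.
    binary.write(new_rpath + b"\0" * (len(old_rpath) - len(new_rpath)))
    return True
```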
With this PR, relocation time goes down significantly, likely because patchelf
does some odd things with mmap, causing lots of overhead. Example:
- `binutils`: 700MB installed, goes from `1.91s` to `0.57s`, or `3.4x` faster.
Relocation time: 27% -> 10% of total install time
- `llvm`: 6.8GB installed, goes from `28.56s` to `5.38s`, or `5.3x` faster.
Relocation time: 44% -> 13% of total install time
The bottleneck is now decompression.
Note: I'm somewhat confused about the "relative rpath" code paths. Right
now this PR only deals with absolute -> absolute replacement. As far as
I understand, if you embrace relative rpaths when uploading to the
buildcache, the whole point is you _don't_ want to patch rpaths on
install? So it seems fine to not expand `$ORIGIN` again imho.
When a package asks for non-parallel make, we need to force `make -j1` because just doing `make` will run in parallel under jobserver (e.g. `spack env depfile`).
We now always add `-j1` when asked for a non-parallel execution (even if there is no jobserver).
And each `MakeExecutable` can now ask for jobserver support or not. For example: the default `ninja` does not support jobserver so spack applies the default `-j`, but `ninja@kitware` or `ninja-fortran` does, so spack doesn't add `-j`.
Tips: you can run `SPACK_INSTALL_FLAGS=-j1 make -f spack-env-depfile.make -j8` to avoid massive job-spawning because of build tools that don't support jobserver (ninja).
* testing ssh key
* test
* LR : Creating the package to install the gegelati app
* LR : Gegelati, a TPG C++ library added and fully tested
* LR : adjusting for fork
* LR: taking out the boilerplates
* LR: taking out the rest
We try to avoid non-default variant values in the concretizer, but this doesn't make
sense for variants forced to take some non-default value by variant propagation.
Counting this as a penalty effectively biases the concretizer toward small dependency
graphs -- something we try very hard to avoid elsewhere because it can lead to very
strange decisions.
Example: with the penalty, `spack spec hdf5` will choose the default `openmpi` as its
`mpi` provider, but `spack spec hdf5 ~~shared` will choose `mpich` because it has to set
fewer non-default variant values because `mpich`'s DAG is smaller. That's not a good
reason to prefer a non-default virtual provider.
To fix this, if the user explicitly requests a non-default value to be propagated, there
shouldn't be a penalty. Variant values set on the CLI already don't count as default; we
just need to extend that to propagated values.
Adds another post install hook that loops over the install prefix, looking for shared libraries type of ELF files, and sets the soname to their own absolute paths.
The idea being, whenever somebody links against those libraries, the linker copies the soname (which is the absolute path to the library) as a "needed" library, so that at runtime the dynamic loader realizes the needed library is a path which should be loaded directly without searching.
As a result:
1. rpaths are not used for the fixed/static list of needed libraries in the dynamic section (only for _actually_ dynamically loaded libraries through `dlopen`), which largely solves the issue that Spack's rpaths are a heuristic (`<prefix>/lib` and `<prefix>/lib64` might not be where libraries really are...)
2. improved startup times (no library search required)
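One way to picture the effect, using the patchelf CLI (illustration only; not necessarily how the hook is implemented internally):
```python
import os
import subprocess

def set_sonames_to_absolute_paths(prefix):
    """Give every shared library under `prefix` a soname equal to its own path."""
    for root, _, files in os.walk(prefix):
        for name in files:
            if name.endswith(".so") or ".so." in name:  # crude check, fine for a sketch
                path = os.path.join(root, name)
                subprocess.run(["patchelf", "--set-soname", path, path], check=False)
```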
Untouched spec pruning was added to reduce the number of specs
developers see getting rebuilt in their PR pipelines that they
don't understand. Because the state of the develop mirror lags
quite far behind the tip of the develop branch, PRs often find
they need to rebuild things untouched by their PR.
Untouched spec pruning was previously implemented by finding all
specs in the environment with names of packages touched by the PR,
traversing in both directions the DAGS of those specs, and adding
all dependencies as well as dependents to a list of concrete specs
that should not be considered for pruning.
We found that this heuristic results in too many pruned specs, and
that dependents of touched specs must have all their dependencies
added to the list of specs that should not be considered for pruning.
* SEACAS: Update package.py to handle new SEACAS project name
The base project name for the SEACAS project has changed from
"SEACASProj" to "SEACAS" as of @2022-10-14, so the package
needed to be updated to use the new project name when needed.
The refactor also changes several options of the form
"-DSome_CMAKE_Option:BOOL=ON"
to
define("Some_CMAKE_Option", True)
* SEACAS: Additional refactorings
* Replaced all cmake "-Dsomething=other" lines with either `define`
or `define_from_variant` functions.
Consolidated the application (fortran, legacy, all) enabling lines
into loops over the code names. This makes it easier to see the categorization of
applications and also to add/move/remove an application.
Reordered some lines; general cleanup and restructuring.
* Address flake8 issues
* Remove trailing whitespace
* Reformat using black
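The pattern applied in the SEACAS refactor above, roughly (option and variant names here are placeholders, not the package's real ones):
```python
def cmake_args(self):
    return [
        # instead of "-DSome_CMAKE_Option:BOOL=ON"
        self.define("Some_CMAKE_Option", True),
        # instead of hand-building "-DTPL_ENABLE_MPI:BOOL=..." from the spec
        self.define_from_variant("TPL_ENABLE_MPI", "mpi"),
    ]
```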
* add new package: py-pylatex
* fix bugs
* add extras indicated in setup.py
* Update var/spack/repos/builtin/packages/py-pylatex/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pylatex/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* improvements
* remove git merge related lines
* tidy
* Update var/spack/repos/builtin/packages/py-pylatex/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* remove variant
* [@spackbot] updating style on behalf of Sinan81
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
This issue was introduced in #29761:
```
==> Installing ncurses-6.3-22hz6q6cvo3ep2uhrs3erpp2kogxncbn
==> No binary for ncurses-6.3-22hz6q6cvo3ep2uhrs3erpp2kogxncbn found: installing from source
==> Using cached archive: /spack/var/spack/cache/_source-cache/archive/97/97fc51ac2b085d4cde31ef4d2c3122c21abc217e9090a43a30fc5ec21684e059.tar.gz
==> No patches needed for ncurses
==> ncurses: Executing phase: 'autoreconf'
==> ncurses: Executing phase: 'configure'
==> ncurses: Executing phase: 'build'
==> ncurses: Executing phase: 'install'
==> Error: AttributeError: 'str' object has no attribute 'propagate'
The 'ncurses' package cannot find an attribute while trying to build from sources. This might be due to a change in Spack's package format to support multiple build-systems for a single package. You can fix this by updating the build recipe, and you can also report the issue as a bug. More information at https://spack.readthedocs.io/en/latest/packaging_guide.html#installation-procedure
/spack/lib/spack/spack/build_environment.py:1075, in _setup_pkg_and_run:
1072 tb_string = traceback.format_exc()
1073
1074 # build up some context from the offending package so we can
>> 1075 # show that, too.
1076 package_context = get_package_context(tb)
1077
1078 logfile = None
```
It turns out this was caused by a bug that had been around much longer, in which the flags were passed by reference to the flag_handler, and the flag_handler was modifying the spec object, not just the flags given to the build system. The scope of this bug was limited by the forking model in Spack, which is how it went under the radar for so long.
PR includes regression test.
* remove deptype_query remnants
* deptypes -> deptype
These arguments haven't existed since 2017, but `traverse` now fails on unknown **kwargs, so they have finally popped up.
This updates the propagation logic used in `concretize.lp` to avoid rules with `path()`
in the body and instead base propagation around `depends_on()`.
Currently, compiler flags and variants are inconsistent: compiler flags set for a
package are inherited by its dependencies, while variants are not. We should have these
be consistent by allowing for inheritance to be enabled or disabled for both variants
and compiler flags.
- [x] Make new (spec language) operators
- [x] Apply operators to variants and compiler flags
- [x] Conflicts currently result in an unsatisfiable spec
(i.e., you can't propagate two conflicting values)
What I propose is using two of the currently used sigils to signal that the variant
or compiler flag will be inherited:
Example syntax:
- `package ++variant`
enabled variant that will be propagated to dependencies
- `package +variant`
enabled variant that will NOT be propagated to dependencies
- `package ~~variant`
disabled variant that will be propagated to dependencies
- `package ~variant`
disabled variant that will NOT be propagated to dependencies
- `package cflags==True`
`cflags` will be propagated to dependencies
- `package cflags=True`
`cflags` will NOT be propagated to dependencies
Syntax for string-valued variants is similar to compiler flags.
Fixes an issue on the RHEL8 UBI container where this test would fail because `gr_mem`
was empty for every entry in the `grp` DB.
You have to check *both* the `pwd` database (which has primary groups) and `grp` (which
has other groups) to do this correctly.
- [x] update `llnl.util.filesystem.group_ids()` to do this
- [x] use it in the `sbang` test
This PR introduces breadth-first traversal, and moves depth-first traversal
logic out of Spec's member functions, into `traverse.py`.
It introduces a high-level API with three main methods:
```python
spack.traverse.traverse_edges(specs, kwargs...)
spack.traverse.traverse_nodes(specs, kwargs...)
spack.traverse.traverse_tree(specs, kwargs...)
```
with the usual `root`, `order`, `cover`, `direction`, `deptype`, `depth`, `key`,
`visited` kwargs for the first two.
What's new is that `order="breadth"` is added for breadth-first traversal.
The lower level API is not exported, but is certainly useful for advanced use
cases. The lower level API includes visitor classes for direction reversal and
edge pruning, which can be used to create more advanced traversal methods,
especially useful when the `deptype` is not constant but depends on the node
or depth.
---
There are a couple of nice use cases for breadth-first traversal:
- Sometimes roots have to be handled differently (e.g. follow build edges of
roots but not of deps). BFS ensures that root nodes are always discovered at
depth 0, instead of at any depth > 1 as a dep of another root.
- When printing a tree, it would be nice to reduce indent levels so it fits in the
terminal, and ensure that e.g. `zlib` is not printed at indent level 10 as a
dependency of a build dep of a build dep -- rather if it's a direct dep of my
package, I wanna see it at depth 1. This basically requires one breadth-first
traversal to construct a tree, which can then be printed with depth-first traversal.
- In environments in general, it's sometimes inconvenient to have a double
loop: first over the roots then over each root's deps, and maintain your own
`visited` set outside. With BFS, you can simply init the queue with the
environment root specs and it Just Works. [Example here](3ec7304699/lib/spack/spack/environment/environment.py (L1815-L1816))
Currently, many tests hardcode to older versions of gcc for comparisons of
concretization among compiler versions. Those versions are too old to concretize for
`aarch64`-family targets, which leads to failing tests on `aarch64`.
This PR fixes those tests by updating the compiler versions used for testing.
Currently, many tests hardcode the expected architecture result in concretization to the
`x86_64` family of architectures.
This PR generalizes the tests that can be generalized, to cover multiple architecture
families. For those that test specific relationships among `x86_64`-family targets, it
ensures that concretization uses the `x86_64`-family targets in those cases.
Currently, many tests rely on the fact that `AutotoolsPackage` imposes no dependencies
on the inheriting package. That is not true on `aarch64`-family architectures.
This PR ensures that the fact `AutotoolsPackage` on `aarch64` pulls in a dependency on
`gnuconfig` is ignored when testing for the appropriate relationships among dependencies.
Additionally, 5 tests currently prompt the user for input when `gpg` is available in the
user's path. This PR fixes that issue. And 7 tests fail currently when the user has a
yubikey available. This PR fixes the incorrect gpg argument causing those issues.
The `spack info <package>` command does not show the `Virtual Packages:` output unless the `--virtuals` command option is passed. Before this change, the information that the command is supposed to be illustrating is not shown in the example and is confusing.
* julia: don't look for the openlibm libraries when unneeded
Cause spack to *not* check for the existence of the openlibm libraries (by adding it to the pkgs list) when ~openlibm is specified.
* [@spackbot] updating style on behalf of downloadico
Co-authored-by: downloadico <downloadico@users.noreply.github.com>
- the updated OpenFOAM wmake rules now allow multiple locations for
compiler flags:
* wmake/General/common/c++Opt [central]
* wmake/linux64Gcc/c++Opt [traditional]
- match both '=' and ':=' make rule lines
Co-authored-by: Mark Olesen <Mark.Olesen@esi-group.com>
Changes to improve locating shared libraries on Windows, which in
turn enables the use of Clingo. This PR attempts to establish a
proper distinction between linking on Windows vs. Linux/Mac: on
Windows, linking is always done with .lib files (never .dll files).
This somewhat complicates the model since the Spec.libs method could
return libraries that were used for both linking and loading, but
since these are not always the same on Windows, it was decided to
treat Spec.libs as being for link-time libraries. Additional functions
are added to help dependents locate run-time libraries.
* Clingo is now the default concretizer on Windows
* Clingo is now the concretizer used for unit tests on Windows
* Fix a permissions issue that can occur while moving Git files during
fetching/staging
* Packages can now implement "win_add_library_dependent" to register
files/directories that include libraries that would need to link
to dependency dlls
* Packages can now implement "win_add_rpath" to register the locations
of dlls that dependents would want to load
* "Spec.libs" on Windows is updated to return link-time libraries
(i.e. .lib files, rather than .dll files)
* PackageBase.rpath on Windows is now updated to return the most-likely
locations where .dlls will be found (which is generally in the bin/
directory)
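Purely as an illustration of the hooks named above (the return values and signatures are assumptions, not the documented API):
```python
from spack.package import *


class MyLib(CMakePackage):
    """Hypothetical package illustrating the Windows run-time library hooks."""

    def win_add_library_dependent(self):
        # hypothetical: directories holding binaries that link against this
        # package and therefore need its dependency .dlls at run time
        return [self.prefix.bin]

    def win_add_rpath(self):
        # hypothetical: directories where this package's own .dlls live, so
        # dependents know where to load them from
        return [self.prefix.bin]
```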
* Nalu-Wind: Allow for standard versions of trilinos
This will allow us to utilize custom numeric versions for trilinos in `spack-manager` while we continue to develop `nalu-wind`.
Pinging @eugeneswalker @jrood-nrel @tasmith4
* Update var/spack/repos/builtin/packages/nalu-wind/package.py
Currently there's a slow sequential step in binary relocation where all
strings of a binary are collected, with rpaths removed, and then
filtered for the old install root.
This is completely unnecessary, and also incorrect, since we replace
more than just the old install root in the prefix to prefix mapping. And
in fact the prefix to prefix mapping is parallel, and a single pass. So
even as an optimization, this filter makes no sense anymore.
Therefore, we remove it. The new approach:
- single pass over the binary data matching all prefixes
- collect offsets and replacement strings
- do in-place updates with `fseek` / `fwrite`, since typically our
replacements touch O(few bytes) while the file is O(many megabytes)
- be nice: leave the file untouched if some string can't be
replaced
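A minimal sketch of the in-place update step, assuming each replacement string has already been padded to the length of the text it overwrites:
```python
def apply_inplace(path, replacements):
    """Apply (offset, new_bytes) pairs to a file in place.

    Each replacement is assumed to be exactly as long as the bytes it
    overwrites, so only a few bytes are touched per match.
    """
    with open(path, "r+b") as f:
        for offset, new_bytes in sorted(replacements):
            f.seek(offset)
            f.write(new_bytes)
```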
* Added py-medaka and dependencies
* fixed py-parasail build error
* medaka still doesn't have correct linked libdeflate
* fixed pyspoa deps
* added htslib.patch, confirmed builds and runs
* fixed style
* Update var/spack/repos/builtin/packages/py-auditwheel/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* made requested changes
* added targets for pyspoa dep
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
/home/xsdk/spack.x/lib/spack/env/oneapi/icx -DAdd_ -Dscalapack_EXPORTS -I/opt/intel/oneapi/mpi/2021.7.0/include -O3 -DNDEBUG -fPIC -MD -MT CMakeFiles/scalapack.dir/BLACS/SRC/dgamx2d_.c.o -MF CMakeFiles/scalapack.dir/BLACS/SRC/dgamx2d_.c.o.d -o CMakeFiles/scalapack.dir/BLACS/SRC/dgamx2d_.c.o -c /home/xsdk/spack.x/spack-stage/spack-stage-netlib-scalapack-2.2.0-uj3jepiowz5is4hmdmjrzjltetgdr3lx/spack-src/BLACS/SRC/dgamx2d_.c
/home/xsdk/spack.x/spack-stage/spack-stage-netlib-scalapack-2.2.0-uj3jepiowz5is4hmdmjrzjltetgdr3lx/spack-src/BLACS/SRC/igsum2d_.c:154:7: error: call to undeclared function 'BI_imvcopy'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
BI_imvcopy(Mpval(m), Mpval(n), A, tlda, bp->Buff);
^
Currently the vasp package always enables the use of shmem to reduce algorithm memory usage (see
https://www.vasp.at/wiki/index.php/Precompiler_options). This is great, but on some systems it gives
compile errors related to the interoperability of C and Fortran. This PR makes that shmem flag
optional, but retains the existing default-on behavior.
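A hedged sketch of what the optional, default-on flag could look like in `package.py` (the `use_shmem`-related precompiler flags come from the VASP wiki page above; the surrounding recipe details are illustrative):
```python
from spack.package import *


class Vasp(MakefilePackage):
    # default-on keeps today's behavior; ~shmem avoids the C/Fortran
    # interoperability compile errors seen on some systems
    variant("shmem", default=True, description="Enable use_shmem to reduce memory usage")

    def edit(self, spec, prefix):
        cpp_options = []  # illustrative: precompiler flags assembled elsewhere
        if spec.satisfies("+shmem"):
            cpp_options += ["-Duse_shmem", "-Dshmem_bcast_buffer", "-Dshmem_rproj"]
```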
* Added support for building the DiHydrogen package and LBANN extensions
to DiHydrogen with ROCm libraries.
Fixed a bug on Cray systems where CMake didn't try hard enough to find
an MPI-compatible compiler wrapper. Made it search more thoroughly.
Added support for the roctracer package when using ROCm libraries.
* Fixed how ROCm support is defined for pre-v0.3 versions.
- hdf5-vfd-gds:
- Add new version 1.0.2 compatible with hdf5@1.13.
- CMake is a build dependency.
- Set `HDF5_PLUGIN_PATH` in the runtime environment, this plugin
is loaded dynamically.
- SDK:
- The VFD GDS driver only has utility when CUDA is enabled.
- Require hdf5-vfd-gds@1.0.2+ (1.0.1 and earlier do not compile).
* Add patches for building clingo with MSVC
* Help python find clingo DLL
* If an executable is located in "C:\Program Files", the Executable class was
  running into issues with the space in the path. This quotes the executable path
  to ensure that it is treated as a single value.
Signed-off-by: Kiruya Momochi <65301509+KiruyaMomochi@users.noreply.github.com>
This commit extends the DSL that can be used in packages
to allow declaring that a package uses different build-systems
under different conditions.
It requires each spec to have a `build_system` single valued
variant. The variant can be used in many contexts to query, manipulate
or select the build system associated with a concrete spec.
The knowledge to build a package has been moved out of the
PackageBase hierarchy, into a new Builder hierarchy. Customization
of the default behavior for a given builder can be obtained by
coding a new derived builder in package.py.
The "run_after" and "run_before" decorators are now applied to
methods on the builder. They can also incorporate a "when="
argument to specify that a method is run only when certain
conditions apply.
For packages that do not define their own builder, forwarding logic
is added between the builder and package (methods not found in one
will be retrieved from the other); this PR is expected to be fully
backwards compatible with unmodified packages that use a single
build system.
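A rough sketch following this description (the `Example` package and its options are made up; the directive and builder class names follow the multi-build-system support described above):
```python
import spack.build_systems.cmake
from spack.package import *


class Example(CMakePackage, AutotoolsPackage):
    """Hypothetical package that can be built with either build system."""

    # the single-valued build_system variant records which builder is used
    build_system("cmake", "autotools", default="cmake")

    variant("docs", default=False, description="Install documentation")

    with when("build_system=cmake"):
        depends_on("cmake@3.18:", type="build")


class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
    # customizing the default behavior happens on the builder, not the package
    def cmake_args(self):
        return ["-DENABLE_FOO=ON"]

    @run_after("install", when="+docs")
    def install_docs(self):
        pass
```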
* Updating package file for osu-micro-benchmarks for the 6.2 release
* updating sha hash for 6.2 tarball
Co-authored-by: natshineman <shineman.5@osu.edu>
* Add netcdf-c 4.9.0 and netcdf-fortran 4.6.0
With v4.9.0, netcdf-c introduces a Zstandard compression option, which is added as a variant.
* Fix when= in dependency
* Turn on variant zstd by default
Co-authored-by: kgerheiser <kgerheiser@icloud.com>
* alquimia, pflotran, plasma, py-mpi4py, strumpack - add in new versions
* Fix hip CI failure
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Instead of looping over multiple regexes and the entire text file
contents, create a giant regex with all literal prefixes and do a single
pass over files to detect prefixes. Not only is a single pass faster,
it's also likely that the regex is compiled better, given that most
prefixes share a common ... prefix.
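A minimal sketch of the combined-regex idea (helper name and prefix handling are illustrative):
```python
import re


def prefix_offsets(data: bytes, prefixes):
    """Find every occurrence of any install prefix in one pass over `data`."""
    # longest-first so the alternation prefers the most specific prefix
    pattern = re.compile(
        b"|".join(re.escape(p) for p in sorted(prefixes, key=len, reverse=True))
    )
    return [(m.start(), m.group()) for m in pattern.finditer(data)]
```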
In the dfs code, flip edges so that `parent` means `from` and
`spec` means `to` in the direction of traversal. This makes it slightly
easier to write generic/composable code. For example when using visitors
where one visitor reverses direction, and another only cares about
accepting particular edges or not depending on whether the target node
is seen before, it would be good if the second visitor didn't have to
know whether the order was changed or not.
Use the same compression level as `gzip` (6) instead of what Python uses
(9).
The LLVM tarball takes 4m instead of 12m to create, and is <1% larger.
That's not worth the wait...
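For reference, a sketch of producing a gzip tarball at level 6 from Python (paths are placeholders):
```python
import tarfile

# compresslevel=6 mirrors the gzip CLI default; Python's tarfile default of 9
# is much slower for only a marginal reduction in size on large archives
with tarfile.open("llvm.tar.gz", "w:gz", compresslevel=6) as tar:
    tar.add("/path/to/llvm-install", arcname="llvm-install")
```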
* udunits: Update download URL
* udunits: Deprecate older versions
Unidata now only provides the latest version of each X.Y branch. Older 2.2 versions have been deprecated accordingly but are still available in the build cache.
Co-authored-by: RemiLacroix-IDRIS <RemiLacroix-IDRIS@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Update wgrib2 from JCSDA/NOAA-EMC fork
* var/spack/repos/builtin/packages/wgrib2/package.py: fix typo in comment, add conflict for variants netcdf3, netcdf4
* wget hdf5/netcdf4 internal dependencies for wgrib2
* Black-format var/spack/repos/builtin/packages/wgrib2/package.py
* More format changes in var/spack/repos/builtin/packages/wgrib2/package.py
#32137 added an option to update() a BinaryCacheIndex with a
cooldown: repeated attempts within this cooldown would not
actually retry. However, the cooldown was not properly
tracked for failures (which is common when the mirror
does not store any binaries and therefore has no index.json).
This commit ensures that update(..., with_cooldown=True) will
also skip the update even if a failure has occurred within the
cooldown period.
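A sketch of the intended bookkeeping, where failed attempts also stamp the cooldown (attribute and helper names are illustrative, not the actual implementation):
```python
import time


class BinaryCacheIndex:
    """Sketch of cooldown bookkeeping; names are illustrative."""

    def __init__(self, cooldown_seconds=300):
        self._cooldown = cooldown_seconds
        self._last_attempt = 0.0

    def update(self, with_cooldown=False):
        if with_cooldown and time.time() - self._last_attempt < self._cooldown:
            return  # a recent attempt -- successful or not -- already happened
        # stamp *before* fetching so that failures also consume the window
        self._last_attempt = time.time()
        try:
            self._fetch_index()  # hypothetical helper that may raise
        except Exception:
            pass  # mirror may have no index.json; do not retry immediately
```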
Due to reuse concretization, we may generate DAGs with two occurrences
of the same package corresponding to distinct specs. This happens when
build dependencies are reused, since their dependencies are ignored in
concretization.
This caused a regression, for example: `spec['openssl']` would take the
'openssl' of the build dep `cmake`, instead of the direct `openssl`
dependency, simply because the edge to `cmake` was traversed first and
we do depth-first traversal.
One solution that was discussed is to limit `spec[name]` to just direct
deps, or direct deps + transitive link deps, but this is too breaking.
Instead, this PR simply prioritizes transitive link and direct
build/run/test deps, and then falls back to a full DAG traversal. So,
it's just about order of iteration.
* Update py-sphinxcontrib-mermaid
* Add py-myst-parser
* Fix py-mdit-py-plugins and py-myst-parser dependencies
* Add py-exhale package
* Update var/spack/repos/builtin/packages/py-mdit-py-plugins/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-myst-parser/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-myst-parser/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-myst-parser/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update py-exhale and py-myst-parser dependencies
* Add @svenevs as py-exhale maintainer
* Update var/spack/repos/builtin/packages/py-mdit-py-plugins/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Scan the text files for relocatable prefixes *before* creating a tarball,
to reduce the amount of work to be done during install from binary
cache.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
* py-drep: new package
* fixed file extension
* added darwin conflict
* py-checkm-genome and py-pysam: bumped version and updated deps (#10)
added checkm and pysam deps
* added dep documentation and fixed style
* changed checkm and pysam back to dev version for upstreaming
* added url and perl run dep
* fixed style
Instead of showing
```
==> Error: Timed out waiting for a write lock.
```
show
```
==> Error: Timed out waiting for a write lock after 1.200ms and 4 attempts on file: /some/file
```
s.t. we actually get to see where acquiring a lock failed even when not
running in debug mode.
And use pretty time units everywhere, so we don't get 1.45e-9 seconds
but 1.450ns etc.
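A small sketch of the kind of pretty-time formatting described (the helper name is illustrative; Spack's own helper may differ in detail):
```python
def pretty_seconds(seconds):
    """Render a duration with a human-friendly unit, e.g. 1.45e-9 -> '1.450ns'."""
    multiplier, unit = 1, "s"
    if seconds < 1e-6:
        multiplier, unit = 1e9, "ns"
    elif seconds < 1e-3:
        multiplier, unit = 1e6, "us"
    elif seconds < 1:
        multiplier, unit = 1e3, "ms"
    return f"{seconds * multiplier:.3f}{unit}"
```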
* backtraces without --debug
Currently `--debug` is too verbose and not-`--debug` gives too little
context about where exceptions are coming from.
So, instead, it'd be nice to have `spack --backtrace` and
`SPACK_BACKTRACE=1` as methods to get something in between: no verbose
debug messages, but always a full backtrace.
This is useful for CI, where we don't want to drown in debug messages
when installing deps, but we do want to get details where something goes
wrong if it goes wrong.
* completion
* acts: new versions
In the 20.x release line, these are the changes, https://github.com/acts-project/acts/compare/v20.0.0...v20.2.0
- `option(ACTS_SETUP_ACTSVG "Build ActSVG display plugin" OFF)` introduced in v20.1.0
- `option(ACTS_USE_SYSTEM_ACTSVG "Use the ActSVG system library" OFF)` introduced in v20.1.0
- `option(ACTS_BUILD_PLUGIN_ACTSVG "Build SVG display plugin" OFF)` introduced in v20.1.0
- `option(ACTS_USE_EXAMPLES_TBB "Use Threading Building Blocks library in examples" ON)` introduced in v20.1.0
- `option(ACTS_EXATRKX_ENABLE_ONNX "Build the Onnx backend for the exatrkx plugin" OFF)` introduced in v20.2.0
- `option(ACTS_EXATRKX_ENABLE_TORCH "Build the torchscript backend for the exatrkx plugin" ON)` introduced in v20.2.0
In the 19.x release line, these are the changes: https://github.com/acts-project/acts/compare/v19.7.0...v19.9.0
- `option(ACTS_USE_EXAMPLES_TBB "Use Threading Building Blocks library in examples" ON)` introduced in v19.8.0
The new build options have not been implemented in this commit but will be implemented next.
* acts: new variant svg
* actsvg: new package
* actsvg: style fixes
* acts: new versions 20.3.0 and 19.10.0
* actsvg: depends_on boost googletest
* actsvg: new version 0.4.26 (and style fix)
Includes fix to build issue when +examples, https://github.com/acts-project/actsvg/pull/23
* acts: new variant tbb when +examples @19.8:19 @20.1:
* acts: set ACTS_USE_EXAMPLES_TBB
* acts: no need for ACTS_SETUP_ACTSVG
* acts: move tbb variant to examples block
* acts: ACTS_USE_SYSTEM_ACTSDD4HEP removed in 20.3
* acts: use new ACTS_USE_SYSTEM_LIBS
* acts-dd4hep: new version 1.0.1, maintainer handle fixed
* acts: simplify variant tbb condition
* py-checkm-genome and py-pysam: bumped version and updated deps
* updated setuptools dep type
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
When we lose a running pod (possibly loss of spot instance) or encounter
some other infrastructure-related failure of this job, we need to retry
it. This retries the job the maximum number of times in those cases.
Currently `relocate_text` and `relocate_text_bin` are unsafe in the
sense that they run in parallel, and lead to races when modifying
different items pointing to the same inode.
This leads to the issue observed in #33453.
This PR:
1. Renames those functions to `unsafe_*` so people are aware
2. Adds logic to deal with hardlinks in current binary packages
3. Adds logic to deal with hardlinks when creating new binary tarballs,
so the install side doesn't have to de-dupe hardlinks.
4. Adds a test for 3
The assumption is that all our relocation logic preserves inodes. That
is, we should never copy a file, modify it, and then move it back. I
quickly verified, and it seems like this is true for (binary) text
relocation, as well as rpath patching in patchelf (even when the file
grows) and mach-o fixes.
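A minimal sketch of grouping files by inode so each inode is relocated exactly once (assuming, as stated above, that relocation preserves inodes):
```python
import os


def one_path_per_inode(paths):
    """Keep a single representative path per (device, inode) pair.

    Relocating that one path in place also updates every hardlinked name,
    and avoids racy concurrent writes to the same inode.
    """
    representatives = {}
    for path in paths:
        st = os.lstat(path)
        representatives.setdefault((st.st_dev, st.st_ino), path)
    return list(representatives.values())
```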
* axom@0.7.0: require cmake@3.21:
* Update var/spack/repos/builtin/packages/axom/package.py
Co-authored-by: Chris White <white238@llnl.gov>
Co-authored-by: Chris White <white238@llnl.gov>
`reuse` and `when_possible` concretization broke the invariant that
`spec[pkg_name]` has unique keys. This invariant is relied on in tons of
places, such as when setting up the build environment.
When using `when_possible` concretization, one may end up with two or
more `perl`s or `python`s among the transitive deps of a spec, because
concretization does not consider build-only deps of reusable specs.
Until the code base is fixed not to rely on this broken property of
`__getitem__`, we should disable reuse in CI.
* fixed version numbers to python 2 and old biopython
* changed shortbred package to pypi, removed python 2 version
* added package description
* re-added shortbred package with deprecated flag
* fixed style and removed unnecessary python dep (it can't build with python 2 anyway)
* removed whitespace and re-added the python2.7.9+ dep
* fixed style
* gitlab: Do not use root_spec['pkg_name'] anymore
For a long time it was fine to index a concrete root spec with the name
of a dependency in order to access the concrete dependency spec. Since
pipelines started using `--use-buildcache dependencies:only,package:never`
though, it has exposed a scheduling issue in how pipelines are
generated. If a concrete root spec depends on two different hashes of
`openssl` for example, indexing that root with just the package name
is ambiguous, so we should no longer depend on that approach when
scheduling jobs.
* env: make sure exactly one spec in env matches hash
When installing some/all specs from a buildcache, build edges are pruned
from those specs. This can result in a much smaller effective DAG. Until
now, `spack env depfile` would always generate a full DAG.
This PR adds the `spack env depfile --use-buildcache` flag that was
introduced for `spack install` before. This way, not only can we drop
build edges, but also we can automatically set the right buildcache
related flags on the specific specs that are going to be installed.
This way we get parallel installs of binary deps without redundancy,
which is useful for Gitlab CI.
When downloading from binary cache not only replace RPATHs to dependencies, but
also text references to dependencies.
Example:
`autoconf@2.69` contains a text reference to the executable of its dependency
`perl`:
```
$ grep perl-5 /shared/spack/opt/spack/linux-amzn2-x86_64_v3/gcc-7.3.1/autoconf-2.69-q3lo/bin/autoreconf
eval 'case $# in 0) exec /shared/spack/opt/spack/linux-amzn2-x86_64_v3/gcc-7.3.1/perl-5.34.1-yphg/bin/perl -S "$0";; *) exec /shared/spack/opt/spack/linux-amzn2-x86_64_v3/gcc-7.3.1/perl-5.34.1-yphg/bin/perl -S "$0" "$@";; esac'
```
These references need to be replaced or any package using `autoreconf` will fail
as it cannot find the installed `perl`.
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
"spack install" will not update the binary index if given a concrete
spec, which causes it to fall back to direct fetches when a simple
index update would have helped. For S3 buckets in particular, this
significantly and needlessly slows down the install process.
This commit alters the logic so that the binary index is updated
whenever a by-hash lookup fails. The lookup is attempted again with
the updated index before falling back to direct fetches. To avoid
updating too frequently (potentially once for each spec being
installed), BinaryCacheIndex.update now includes a "cooldown"
option, and when this option is enabled it will not update more
than once in a cooldown window (set in config.yaml).
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Without this patch one hits this error trying to compiler papi with Intel OneAPI:
icx: error: Note that use of '-g' without any optimization-level option will turn off most compiler optimizations similar to use of '-O0' [-Werror,-Wdebug-disables-optimization]
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
* Fixed two bugs in the HIP package recipe. The first is that the
HIP_PATH was being set to the actual spec, and not the spec prefix.
The second bug is that HIP is expected to be in /opt/rocm-x.y.z/hip
but its libraries can exist at both /opt/rocm-x.y.z/hip/lib and
/opt/rocm-x.y.z/lib. This means that the external detection logic may
find it in either and it turns out that some modules only expose one
of those two locations. Logic is added to ensure that the internal
HIP_PATH and associated ROCM_PATH are correctly set in both cases.
* Added support for Aluminum to use the libfabric plugin with either
RCCL or NCCL.
* Add libpressio and dependencies; some of these packages are
maintained as forks of the original repositories and in those
cases the docstring mentions this.
* Add optional dependency in adios2 on libpressio
* cub package: set CUB_DIR environment variable for dependent
installations
* Clear R_HOME/R_ENVIRON before Spack installation (avoid sources
outside of Spack from affecting the installation in Spack)
* Rename dlib to dorian3d-dlib and update dependents; add new dlib
implementation. Pending an official policy on how to handle
packages with short names, reviewer unilaterally decided that
the rename was acceptable given that the new Spack dlib package
is referenced more widely (by orders of magnitude) than the
original
Co-authored-by: Samuel Li <shaomeng@users.noreply.github.com>
* Classic Intel compilers do not support gcc-toolchain
This fix removes `--gcc-toolchain=` from the `.cfg` files for the classic Intel
compilers. AFAIK this option is only supported for Clang-based compilers.
This led to an issue when installing cmake. Reproducer:
```
spack install cmake@3.24.2%intel@2021.7.0~doc+ncurses+ownlibs~qt
build_type=Release arch=linux-amzn2-skylake_avx512
```
Tagging maintainer @rscohn2
* Add `-gcc-name` for icc
.. and `-gxx-name` for icpc.
AFAIK this is used for modern C++ support, so we can ignore `ifort`.
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
When installing an individual spec `spack --only=package --cache-only /xyz`
from a buildcache, Spack currently issues tons of warnings about
missing build deps (and their deps) in the database.
This PR disables these warnings, since it's fine to have a spec without
its build deps in the db (they are just "missing").
Currently `traverse_dependencies` fixes deptypes to traverse once and
for all in the recursion, but this is incorrect, since deptypes depend
on the node (e.g. if it's a dependency and cache-only, don't follow
build type edges, even if the parent is build from sources and needs
build deps.)
Support spackbot rebuilding all specs from source when asked (with "rebuild everything")
- Allow overriding --prune-dag cli opt with env var
- Use job variable to optionally prevent rebuild jobs early exit behavior
- ci rebuild: Use new install argument to insist deps are always installed from binary, but
package is only installed from source.
- gitlab: fix bug w/ untouched pruning
- ci rebuild: install from hash rather than json file
- When doing a "rebuild everything" pipeline, make sure that each install job only consumes
binary dependencies from the mirror being populated by the current pipeline. This avoids
using, e.g. binaries from develop, when rebuilding everything on a PR.
- When running a pipeline to rebuild everything, do not die because we generated a hash on
the broken specs list. Instead only warn in that case.
- bugfix: Replace broken no-args tty.die() with sys.exit(1)
Print a message of the form
```
Fetch mm:ss. Build: mm:ss. Total: mm:ss
```
when installing from buildcache.
Previously this only happened for source builds.
Currently "spack ci generate" chooses the first matching entry in
gitlab-ci:mappings to fill attributes for a generated build-job,
requiring that the entire configuration matrix is listed out
explicitly. This unfortunately causes significant problems in
environments with large configuration spaces, for example the
environment in #31598 (spack.yaml) supports 5 operating systems,
3 architectures and 130 packages with explicit size requirements,
resulting in 1300 lines of configuration YAML.
This patch adds a configuration option to the gitlab-ci schema called
"match_behavior"; when it is set to "merge", all matching entries
are applied in order to the final build-job, allowing a few entries
to cover an entire matrix of configurations.
The default for "match_behavior" is "first", which behaves as before
this commit (only the runner attributes of the first match are used).
In addition, match entries may now include a "remove-attributes"
configuration, which allows matches to remove tags that have been
aggregated by prior matches. This only makes sense to use with
"match_behavior:merge". You can combine "runner-attributes" with
"remove-attributes" to effectively override prior tags.
* petsc,py-petsc4py,slepc,py-slepc4py: add version 3.18.0
* workaround for dealii build failure [with petsc version check]
* pism: add compatibility fix for petsc@3.18
* add in hipsolver dependency
* Add checksum for py-gitpython 3.1.27
* Update package.py
* Update var/spack/repos/builtin/packages/py-gitpython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
When a pipeline generation job is automatically failed because it
generated jobs for specs known to be broken on develop, print better
information about the broken specs that were encountered. Include
at a minimum the hash and the url of the job whose failure caused it
to be put on the broken specs list in the first place.
* new package + deps
* Update var/spack/repos/builtin/packages/py-about-time/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-alive-progress/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* removed unnecessary python version dep
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add new lua releases
* split install phase and move it into a build phase, remove hardcoded standard flag
* revert back to the original hardcoded std flag, guard patch against versions above 5.4
* py-pyopenssl: add version 22.1.0
* Update var/spack/repos/builtin/packages/py-pyopenssl/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add checksum for py-regex 2022.8.17
* Update var/spack/repos/builtin/packages/py-regex/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add checksum for py-async-lru 1.0.3
* [@spackbot] updating style on behalf of iarspider
* Update var/spack/repos/builtin/packages/py-async-lru/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
Co-authored-by: iarspider <iarspider@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* env depfile: allow deps only install
- Refactor `spack env depfile` to use a Jinja template, making it a bit
easier to follow as a human being.
- Add a layer of indirection in the generated Makefile through an
`<prefix>/.install-deps/<hash>` target, which allows one to specify
different options when installing dependencies. For example, only
verbose/debug mode on when installing some particular spec:
```
$ spack -e my_env env depfile -o Makefile --make-target-prefix example
$ make example/.install-deps/<hash> -j16
$ make example/.install/<hash> SPACK="spack -d" SPACK_INSTALL_FLAGS=--verbose -j16
```
This could be used to speed up `spack ci rebuild`:
- Parallel install of dependencies from buildcache
- Better readability of logs, e.g. reducing verbosity when installing
dependencies, and splitting logs into deps.log and current_spec.log
* Silence please!
* Add bioc attribute to r-do-db
* add version 1.38.1 to bioconductor package r-annotationforge
* add version 1.30.4 to bioconductor package r-biocparallel
* add version 2.64.1 to bioconductor package r-biostrings
* add version 4.4.4 to bioconductor package r-clusterprofiler
* add version 2.12.1 to bioconductor package r-complexheatmap
* add version 1.18.1 to bioconductor package r-delayedmatrixstats
* add version 3.22.1 to bioconductor package r-dose
* add version 3.38.4 to bioconductor package r-edger
* add version 1.16.2 to bioconductor package r-enrichplot
* add version 2.20.2 to bioconductor package r-ensembldb
* add version 1.32.4 to bioconductor package r-genomeinfodb
* add version 1.32.1 to bioconductor package r-genomicalignments
* add version 1.48.4 to bioconductor package r-genomicfeatures
* add version 1.44.1 to bioconductor package r-ggbio
* add version 3.4.4 to bioconductor package r-ggtree
* add version 1.24.2 to bioconductor package r-hdf5array
* add version 2.30.1 to bioconductor package r-iranges
* add version 1.36.3 to bioconductor package r-keggrest
* add version 3.52.4 to bioconductor package r-limma
* add version 1.8.1 to bioconductor package r-matrixgenerics
* update r-org-hs-eg-db
* add version 1.38.1 to bioconductor package r-organismdbi
* add version 1.36.1 to bioconductor package r-pathview
* add version 1.56.1 to bioconductor package r-rtracklayer
* add version 1.4.1 to bioconductor package r-scaledmatrix
* add version 1.24.1 to bioconductor package r-scran
* add version 1.6.3 to bioconductor package r-scuttle
* add version 1.18.1 to bioconductor package r-singlecellexperiment
* add version 1.20.2 to bioconductor package r-treeio
* Revert "Add bioc attribute to r-do-db"
This reverts commit 36be5c6072.
* Fix quotes on versions
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
For older versions of intel-oneapi-compilers, running the compiler in
preprocessor / compilation mode would trigger warnings that
`-Wl,-rpath,...` flags were unused.
This in turn caused certain configure scripts to fail as they did not
expect output from the compiler (it's treated as an error). Notably
cmake's bootstrap phase failed to detect c++ features of the compiler.
As a workaround, add this flag to silence the warning, since I don't
think we can scope the flags to compile+link mode.
* bump version for libvterm, required by neovim
* bump version for neovim and add related dep constraints
see release note:
d367ed9b23
in particular:
'deps: Bump required libvterm to v0.3'
https://github.com/neovim/neovim/pull/20222
* spack.compiler.Compiler: introduce prefix property
We currently don't really have something that gives the GCC install
path, which is used by many LLVM-based compilers (llvm, llvm-amdgpu,
nvhpc, ...) to fix the GCC toolchain once and for all.
This `prefix` property is dynamic in the sense that it queries the
compiler itself. This is necessary because it's not easy to deduce the
install path from the `cc` property (might be a symlink, might be a
filename like `gcc` which works by having the compiler load a module
that sets the PATH variable, might be a generic compiler wrapper based
on environment variables like on cray...).
With this property introduced, we can clean up some recipes that have
the logic repeated for GCC.
* intel-oneapi-compilers: set --gcc-sysroot to %gcc prefix
Caches used by repositories don't reference the global spack.repo.path instance
anymore, but get the repository they refer to during initialization.
Spec.virtual now use the index, and computation done to compute the index
use Repository.is_virtual_safe.
Code to construct mock packages and mock repository has been factored into
a unique MockRepositoryBuilder that is used throughout the codebase.
Add debug print for pushing and popping config scopes.
Changed spack.repo.use_repositories so that it can override or not previous repos
spack.repo.use_repositories updates spack.config.config according to the modifications done
Removed a peculiar behavior from spack.config.Configuration where push would always
bubble-up a scope named command_line if it existed.
Resolves #31782
With this change, if a spec is concrete after parsing (e.g. spec.yaml
or /hash-based), then it is not disambiguated (a process which requires
(a) that the spec be installed and (b) that it be part of the
currently-active environment).
This commit allows you to:
* Diff specs from an environment regardless of whether they have
been installed (more useful for projection/matrix-based envs)
* Diff specs read from .yaml files which may or may not be entirely
different installations of Spack
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* hpctoolkit: add version 2022.10.01
1. add version 2022.10.01
2. remove version for master branch, develop is now the main branch
3. add CPATH and LD_LIBRARY_PATH to module run environment,
this is for apps that want to use the start/stop interface
4. cleanup style in variants, depends and conflicts
5. remove all-static variant, nothing uses it
6. deprecate more old versions
* [@spackbot] updating style on behalf of mwkrentel
* Add when(+level_zero) to the gtpin variant.
* Test commit to see if this passes E4S.
* Another test commit to see if E4S succeeds.
* Add temporary hack to ignore +mpi for version 2022.10.01 and issue a
warning instead.
Co-authored-by: mwkrentel <mwkrentel@users.noreply.github.com>
* Add checksum for py-msgpack 1.0.4
* Update var/spack/repos/builtin/packages/py-msgpack/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add checksum for py-virtualenv 20.16.4
* Add checksum for py-werkzeug 2.2.2
* Restore py-virtualenv/package.py
* Update var/spack/repos/builtin/packages/py-werkzeug/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Docs: Getting Started Dependencies
Finally document what one needs to install to use Spack on
Linux and Mac :-)
With <3 for minimal container users and my colleagues with
their fancy Macs.
* Debian Update Packages: GCC, Python
- build-essential: includes gcc, g++ (thx Cory)
- Python: add python3-venv, python3-distutils (thx Pradyun)
* Add RHEL8 Dependencies
* Add checksum for py-skl2onnx 1.12
* Update var/spack/repos/builtin/packages/py-skl2onnx/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add checksum for py-tables 3.7.0
* Update package.py
* Update var/spack/repos/builtin/packages/py-tables/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
ROOT has a webgui which is available with the `+root7` variant. This is a fairly large part of a ROOT install (275MB out of 732MB on my system) which is not necessarily useful in all use cases (e.g. inside containers on network-restricted HPC/HTC compute nodes). This new variant adds the option to retain the ROOT7 functionality but not necessarily include the `webgui` aspects.
`__unused__` defined in `general.h` conflicts with the one defined by libc headers,
so change it to `__attribute__unused__` according to s.zharkoff:
https://bugs.gentoo.org/828550#c11
cmd:
`grep -rl "__unused__" . | xargs -n1 sed -i -e 's/\b__unused__\b/__attribute__unused__/g' -e 's/(unused)/(__unused__)/g'`
Basic stack of ML packages we would like to test and generate binaries for in CI.
Spack now has a large CI framework in GitLab for PR testing and public binary generation.
We should take advantage of this to test and distribute optimized binaries for popular ML
frameworks.
This is a pretty extensive initial set, including CPU, ROCm, and CUDA versions of a core
`x86_64_v4` stack.
### Core ML frameworks
These are all popular core ML frameworks already available in Spack.
- [x] PyTorch
- [x] TensorFlow
- [x] Scikit-learn
- [x] MXNet
- [x] CNTK
- [x] Caffe
- [x] Chainer
- [x] XGBoost
- [x] Theano
### ML extensions
These are domain libraries and wrappers that build on top of core ML libraries
- [x] Keras
- [x] TensorBoard
- [x] torchvision
- [x] torchtext
- [x] torchaudio
- [x] TorchGeo
- [x] PyTorch Lightning
- [x] torchmetrics
- [x] GPyTorch
- [x] Horovod
### ML-adjacent libraries
These are libraries that aren't specific to ML but are still core libraries used in ML pipelines
- [x] numpy
- [x] scipy
- [x] pandas
- [x] ONNX
- [x] bazel
Co-authored-by: Jonathon Anderson <17242663+blue42u@users.noreply.github.com>
* Add checksum for py-scikit-build 0.15.0 and use sources from pypi
* Update var/spack/repos/builtin/packages/py-scikit-build/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add checksum for py-terminado 0.15.0
* Update var/spack/repos/builtin/packages/py-terminado/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* added metaphlan v4, cleaned up phylophlan
* added iqtree2
* fixed phylophlan, builds now
* changed config.yaml to default
* fixed style
* py-jsonschema: add 4.16.0 and new package py-hatch-fancy-pypi-readme (#32929)
* acfl: add v22.1 (#32915)
Co-authored-by: Annop Wongwathanarat <annop.wongwathanarat@arm.com>
* Fixup errors introduced by Clingo Pr: (#32905)
* re2c depends on cmake on Windows
* Winbison properly added to bootstrap package search list
* Set CMAKE_HIP_ARCHITECTURES with the value of amdgpu_target (#32901)
* libtiff: default to +zlib+jpeg (#32945)
* octave: add version 7.2.0 (#32943)
* simgrid new releases (#32920)
* [rocksdb] Added rtti variant (#32918)
* rvs binary path updated for 5.2 rocm release (#32892)
* Add checksum for py-pytest-runner 6.0.0 (#32957)
* py-einops: add v0.5.0 (#32959)
* Replace repo with the NVIDIA one (#32951)
* Add checksum for py-tomli 2.0.1 (#32949)
* QMCPACK: add @3.15.0 (#32931)
* Tidied up configure arguments to use special spack autotools features. (#32930)
* casper: old domain fell off, adding github repo (#32928)
* unifyfs: pin mercury version; add boost variant (#32911)
Mercury has a new version (v2.2) releasing soon that UnifyFS does not build with and hasn't been tested with. This pins UnifyFS to the last version of Mercury used/tested.
Add a variant to avoid building/using boost
Append -std=gnu99 to cflags if building with gcc@4. Needed for mochi-margo to compile
* trilinos: constrain superlu-dist version (#32889)
* trilinos: constrain superlu-dist version for 13.x
* syntax
* FEniCSx: Updates for 0.5.1 (#32665)
* Updates for DOLFINx 0.5.1 and associated packages
* xtensor needed on anything less than main
* Switch back to Python 3.7 minimum.
* Might be good to point out in our README how to fix Python version?
* Fix basix, xtensor dep
* Add numba feature
* Fix checksum
* Make slepc optional
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* simgrid: add variant and remove flag (#32797)
* simgrid: remove std c++11 flag
* simgrid: add msg variant
* Axom: bring in changes from axom repo (#32643)
* bring in changes from axom repo
Co-authored-by: white238 <white238@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Add checksum for py-pyparsing 3.0.9 (#32952)
* rdma-core: fix syntax for external discoverability (#32962)
* Add checksum for py-flatbuffers 2.0.7 (#32955)
* amrex: add v22.10 (#32966)
* Remove CMakePackage.define alias from most packages (#32950)
* Bug fix for `ca-certificates-mozilla/package.py` to enable `spack install --source` (#32953)
* made suggested changes to iqtree2, py-dendropy, py-metaphlan, and py-pkgconfig. Poetry install still broken
* reverted py-pkgconfig deps to poetry-core
* made iqtree2 less redundant, changes to py-dendropy and py-pkgconfig deps
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
Co-authored-by: Annop Wongwathanarat <annop.wongwathanarat@gmail.com>
Co-authored-by: Annop Wongwathanarat <annop.wongwathanarat@arm.com>
Co-authored-by: John W. Parent <45471568+johnwparent@users.noreply.github.com>
Co-authored-by: Auriane R <48684432+aurianer@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Kai Torben Ohlhus <k.ohlhus@gmail.com>
Co-authored-by: Vinícius <viniciusvgp@gmail.com>
Co-authored-by: Matthieu Dorier <mdorier@anl.gov>
Co-authored-by: renjithravindrankannath <94420380+renjithravindrankannath@users.noreply.github.com>
Co-authored-by: iarspider <iarspider@gmail.com>
Co-authored-by: Paul R. C. Kent <kentpr@ornl.gov>
Co-authored-by: Brian Van Essen <vanessen1@llnl.gov>
Co-authored-by: snehring <7978778+snehring@users.noreply.github.com>
Co-authored-by: Cameron Stanavige <stanavige1@llnl.gov>
Co-authored-by: Cody Balos <balos1@llnl.gov>
Co-authored-by: Jack S. Hale <mail@jackhale.co.uk>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Lucas Nesi <lucas31nesi@hotmail.com>
Co-authored-by: Chris White <white238@llnl.gov>
Co-authored-by: white238 <white238@users.noreply.github.com>
Co-authored-by: Martin Pokorny <mpokorny@caltech.edu>
Co-authored-by: Weiqun Zhang <WeiqunZhang@lbl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Dom Heinzeller <dom.heinzeller@icloud.com>
* filter_file: introduce argument 'start_at'
* autotools: extend patching of the libtool script
* autotools: refactor _patch_usr_bin_file
* autotools: improve readability of the filtering
* autotools: keep the modification time of the configure scripts
* autotools: do not try to patch directories
* autotools: explain libtool patching for posterity
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
* Add checksum for py-onnxmltools 1.11.0
* Add checksum for py-onnxmltools 1.11.0
* Fix patch name
* Update var/spack/repos/builtin/packages/py-onnx-runtime/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add checksum for py-pkginfo 1.8.3
* Update var/spack/repos/builtin/packages/py-pkginfo/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* trilinos and xyce: fix fortran library handling
xyce:
- add pymi_static_blas variant and logic
handles blas and hdf5 conflicts for Xyce-PyMi
- make +isorropia and +zoltan conditional on mpi
* xyce: clean up CMake options
* xyce: change pymi_static_blas to pymi_static_tpls
* xyce: made pymi_static_tpls only when +pymi
Remove `module-info mode load` condition that prevents auto-unloading when autoloading is enabled. It looks like this condition was added to work around an issue in environment-modules, and the workaround is no longer necessary.
Add quotes to make is-loaded happy
* Add checksum for py-urllib3 1.26.12
* Update var/spack/repos/builtin/packages/py-urllib3/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fixes from review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add checksum for py-soupsieve 2.3.2.post1
* Update package.py
* Update var/spack/repos/builtin/packages/py-soupsieve/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update to PyFR v1.15.0
* allow unsupported compilers for cuda
* update requirement for py-gimmik version
* update requirement for cuda version
* update versions, style fix
* lib directory changed, style fixes
* PYFR_METIS_LIBRARY_PATH set
There is a new OpenBLAS release out that can be compiled w/o
a Fortran compiler.
macOS XCode developers, rejoice. Maybe at some point Spack
becomes a package manager that can be used without using
another package manager (to get gfortran) 🎉
phist: add conflict on reference netlib-lapack due to API change in lapack.h
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
* Added a package for the aws-ofi-nccl plug-in to enable NCCL to
use the libfabric communication library as a network provider.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
When concurrent misc_cache provider index rebuilds happen, try to
rebuild it only once, so we don't exceed misc_cache lock timeout.
For example, when using `spack env depfile`, with no previous
misc_cache, running `make -f depfile -j8` could run at most 8 concurrent
`spack install` locking on misc_cache to rebuild the provider index. If
one rebuild takes 30s, before this fix, the "worst" lock could wait up
to 30s * 7, easily exceeding misc_cache lock timeout. Now, the "worst"
lock would take 30s * 1 + ~1s * 6.
* Add checksums for Rivet 3.1.7 and YODA 1.9.7
* Add checksum for fastjet 3.4.0
* Add v3.1.7b...
... in which the version requirement for autoconf was lowered to 2.68
Currently, module changes from `setup_dependent_package` are applied only to the module of the package class, but not to any parent classes' modules between the package class module and `spack.package_base`.
In this PR, we create a custom class to accumulate module changes, and apply those changes to each class that requires it. This design allows us to code for a single module, while applying the changes to multiple modules as needed under the hood, without requiring the user to reason about package inheritance.
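A rough sketch of the accumulator idea (class, attribute, and method names are illustrative, not the actual implementation):
```python
import spack.package_base


class ModuleChangePropagator:
    """Accumulates attribute assignments and replays them on the package's
    module and on the modules of its parent classes, stopping at
    spack.package_base (sketch only)."""

    def __init__(self, package):
        modules = []
        for cls in type(package).__mro__:
            module = getattr(cls, "module", None)  # assumed class attribute
            if module is None or module is spack.package_base:
                break
            modules.append(module)
        # bypass our own __setattr__ for internal bookkeeping
        object.__setattr__(self, "_modules", modules)
        object.__setattr__(self, "_changes", {})

    def __setattr__(self, name, value):
        # record the change once, instead of writing to one module directly
        self._changes[name] = value

    def propagate_changes_to_mro(self):
        for module in self._modules:
            for name, value in self._changes.items():
                setattr(module, name, value)
```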
* find/list: display package counts last
We have over 6,600 packages now, and `spack list` still displays the number of packages
before it lists them all. This is useless for large sets of results (e.g., with no args)
as the number has scrolled way off the screen before you can see it. The same is true
for `spack find` with large installations.
This PR changes `spack find` and `spack list` so that they display the package count
last.
* add some quick testing
Co-authored-by: Danny McClanahan <1305167+cosmicexplorer@users.noreply.github.com>
Allow environment variables and spack-specific path substitution variables (e.g. `$spack`) to be
used in the paths associated with develop specs, while maintaining the ability to keep those
paths relative to the environment rather than the working directory.
* Added new versions
* New slate version
* Adding GPU support for lapackpp package
* Modified dependency on lapackpp
* Added rocblas and rocsolver to deps
* Testing with custom lapackpp repo
* Added chaining depends_on for +rocm
* Removing testing repo
* py-ipython: add 8.5.0 and py-win-unicode-console: new package
* Fix style
* Fix dependency version
* Deprecate version 2.3.1 and 3.1.0
* Always skip IPython.kernel
* Move skip_module definition to top
Install: Add use-buildcache option to install
* Allow differentiating between top level packages and dependencies when
determining whether to install from the cache or not.
* Add unit test for --use-buildcache
* Use metavar to display use-buildcache options.
* Update spack-completion
* Added a package for the aws-ofi-rccl plug-in from the ROCm software
stack. It allows RCCL to use the libfabric communication library.
Added support for using libfabric in Aluminum.
* Updated the run environment so that the plugin would get loaded.
* Added support for setting up the LD_LIBRARY_PATH for dependent packages.
* Added package for RCCL tests to assess the impact of OFI libfabric RCCL plug-in.
* Add tag for xgboost 1.6.2
* Update py-xgboost as well
* Update package.py
* Update var/spack/repos/builtin/packages/py-xgboost/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* intel-oneapi-compilers-classic: refactor setup_run_environment
* intel-oneapi-compilers-classic: extend setup_run_environment with the one from intel-oneapi-compilers
* py-interface-meta: add 1.3.0 (incl new dependency packages)
* Update var/spack/repos/builtin/packages/py-poetry-dynamic-versioning/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
superlu: oneapi: deal with non-ISO C99 compliance in the package.
The Intel oneAPI compilers are based on LLVM 14. A recent enhancement to LLVM,
https://reviews.llvm.org/D122983,
results in superlu-dist not compiling because of some non-ISO-C99-compliant code.
A workaround is to use an LLVM compile-line option noted in the above URL.
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Make it possible to install the Clingo package on Windows; this
also provides a means to use Clingo with Spack on Windows.
This includes
* A new "winbison" package: Windows has a port of bison and flex where
the two packages are grouped together. Clingo dependencies have been
updated to use winbison on Windows and bison elsewhere (this avoids
complicating the existing bison/flex packages until we can add support
for implied virtuals).
* The CMake build system was incorrectly converting CMAKE_INSTALL_PREFIX
to POSIX format.
* The re2c package has been modified to use CMake on Windows; for now
this is done by overloading the configure/build/install methods to
perform CMake-appropriate operations; the package should be refactored
once support for multiple build systems in one Package is available.
This commit fixes #27027.
The root cause of the issue is that the `SPACK_OLD_PROMPT` variable
was evaluated in string interpolation regardless of whether the
guard condition above evaluates to true or false. This commit uses
the `eval` keyword to defer evaluation until the command is executed.
Co-authored-by: Alexander Hornburg <alexande@xilinx.com>
The file rgb.txt is needed by many PGPLOT applications, such as the MESA
stellar evolution code. This change installs the file to the PGPLOT_DIR,
where the library expects it.
PGPLOT_DIR was previously set to the prefix itself, which is an odd
place to install rgb.txt. This commit changes it to lib/pgplot5,
following the convention used by Debian.
Co-authored-by: Philipp Edelmann <edelmann@fs.tum.de>
* py-datalad-metalad: add 0.4.5
* Update var/spack/repos/builtin/packages/py-datalad-metalad/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
In addition to the new release, made intel-oneapi-compilers-classic version number match the compiler version number, instead of the version of the package that contains it.
Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
* root: make X11 really optional on macOS
* Update var/spack/repos/builtin/packages/root/package.py
* remove when clauses in provides
Co-authored-by: Hadrien G. <knights_of_ni@gmx.com>
This modification breaks `develop` since it doesn't
pass audits with Python 2.7. It remains to be investigated
why audits pass in CI for the PR and the issue is
revealed only when the package is pushed to develop.
PR #32615 deprecated Python versions up to 3.6.X. Since
the "build-systems" pipeline requires Python 3.6.15 to
build "tut", it will fail on the first rebuild that
involves Python.
The "tut" package is meant to perform an end-to-end
test of the "Waf" build-system, which is scarcely
used. The fix therefore is just to remove it from
the pipeline.
Spack currently depends on parsing filenames of downloaded files to
determine what type of archive they are and how to decompress them.
This commit adds a preliminary check based on magic numbers to
determine archive type (but falls back on name parsing if the
extension type cannot be determined).
As part of this work, this commit also enables decompression of
.tar.xz-compressed archives on Windows.
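A minimal sketch of magic-number sniffing with a filename fallback (the magic byte values are the standard ones; the helper is illustrative):
```python
MAGIC_NUMBERS = {
    b"\x1f\x8b": "gzip",
    b"BZh": "bzip2",
    b"\xfd7zXZ\x00": "xz",
    b"PK\x03\x04": "zip",
}


def guess_archive_type(path):
    with open(path, "rb") as f:
        header = f.read(8)
    for magic, kind in MAGIC_NUMBERS.items():
        if header.startswith(magic):
            return kind
    # fall back to extension parsing when the header is not recognized
    return "tar" if path.endswith(".tar") else None
```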
* Update py-poetry to 1.2.1
* Update py-xattr
* Apply style from review
* Apply suggestions from code review (part 1)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Changes from review - 2
* Fix typo
* Fix style
* Add missing py-dulwich version
* Update var/spack/repos/builtin/packages/py-poetry-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-poetry/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-poetry/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Apply changes from review
* Add py-backports-cached-property and fix style
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-eccodes: ensure the minimal recommended shared version of libeccodes
* py-eccodes: set less general environment variables to enable location of libeccodes
* py-eccodes: add version 1.5.0
* py-eccodes: make flake8 happy
Amazon Linux 2 ships a glibc that is too old to work with the CUDA toolkit
for aarch64.
For example:
`libcurand.so.10.2.10.50` requires the symbol `logf@@GLIBC_2.27`, but
glibc is at 2.26.
So, these specs are removed.
* New packages: py-conan, py-node-semver, py-patch-ng
* Update var/spack/repos/builtin/packages/py-conan/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-conan/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-babel: add 2.10.3
* Update var/spack/repos/builtin/packages/py-babel/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* New package: py-jaraco-classes
* Update var/spack/repos/builtin/packages/py-jaraco-classes/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [py-tinydb] added package py-tinydb
* [py-tinydb] corrected version in dependency for py-tinydb
* [py-tinydb] update python dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [py-tinydb] update dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Include exception info related to url retrieval in debug messages
which otherwise would be swallowed. This is intended to be useful
for detecting if CA configuration interferes with downloads from
HTTPS links.
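A tiny sketch of the pattern (the logging callable is a stand-in, not the exact Spack API):
```python
import urllib.request


def fetch(url, debug=print):
    try:
        return urllib.request.urlopen(url)
    except Exception as exc:
        # surface the underlying reason (e.g. SSL/CA verification failures)
        # in debug output instead of swallowing it
        debug("Fetch of {0} failed: {1}: {2}".format(url, type(exc).__name__, exc))
        raise
```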
* add the 2 variants OPENCL and HIP and their dependencies correctly
for OPENCL - rocm-opencl, miopengemm and miopen-opencl
for HIP - miopen-hip
Earlier this was adding both dependencies (miopen-hip and miopen-opencl)
for both backends, which did not seem correct.
Also corrected the miopen-hip or miopen-opencl config.h in patch(), depending on the
backend.
Also added libjpeg-turbo, as it is required for building ROCAL.
AMDRpp is still required for ROCAL inclusion, but it currently does not build;
AMDRpp will be added as a new Spack recipe and mivisionx will refer to it as a
dependency in the future.
* fix style errors
* bump up the version for 5.2.3 release. tested +opencl, +hip and ~hip~opencl(cpu backend)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Initial opencatalyst proxy app spackage
Initial spackage for the opencatalyst proxy app. Includes exposing
versions in dependency spackages.
* Verified Functionality and Spack Style
Verified build of mlperf-opencatalyst and fixed lingering spack style
issues
* Making requested changes to py-ocp
Making requested changes to the spackage
* Further Requested Changes
Since fmt@9.0.0 and 9.1.0 were [added](6c4acfbf83) to spack a few days ago, gaudi fails to compile with default concretization. Since gaudi developers are usually paying attention to new versions of dependencies, I'm going to assume (perhaps optimistically) that the next bugfix version of gaudi will fix this (even though the issue has not been reported yet to Gaudi; I posted on the [key4hep public mirror](https://github.com/key4hep/Gaudi/issues/1)).
* Add fixed version 0.7.0 to cpu-features
* [@spackbot] updating style on behalf of iarspider
Co-authored-by: iarspider <iarspider@users.noreply.github.com>
* started updating multiqc package
* working now
* added py-rich-click
* fixed style
* changed py-matplotlib versions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* changed py-networkx versions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* changed py-coloredlogs versions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* changed python versions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* changed py-markdown versions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* changed py-pyyaml requirement
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* changed py-requests requirements
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* changed py-spectra requirements
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Bump up the version for rocm-5.2.1-initial commit
* Bump up the version for rocm-5.2.1 release
* Bump up the version for rocm-5.2.1 release
* correct the PROF_API_HEADER_PATH to include
* Bump up the version of rocm-openmp-extras for rocm-5.2.1 release
* bump up the version of rocwmma for 5.2.1
* imagemagick: new variant ghostscript
Ghostscript adds about 75 dependencies to an installation of
imagemagick (97 without, 172 with Ghostscript). This adds a
variant (defaulting to true for backward compatibility) that
allows users to turn off Ghostscript support.
* imagemagick: be explicit when `--with-gslib`
* imagemagick: fix suggestion fail
* imagemagick: use spec.prefix.share.font
* imagemagick: default ghostscript false
* imagemagick: no need for join_path anymore
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
This PR deprecates using Spack to install [EOL Python versions](https://endoflife.date/python), including Python 2.7, 3.1–3.6. It also deprecates running Spack with Python 2.7. Going forward, we expect Spack to have the following Python support timeline.
### Running Spack
* Spack 0.18 (spring 2022): Python 2.7, 3.5–3.10
* Spack 0.19 (fall 2022): Python 2.7 (deprecated), 3.6–3.11
* Spack 0.20 (spring 2023): Python 3.6–3.11
### Building with Spack
* Spack 0.18 (spring 2022): Python 2.7, 3.1–3.10
* Spack 0.19 (fall 2022): Python 2.7, 3.1–3.6 (deprecated), 3.7–3.11
* Spack 0.20 (spring 2023): Python 3.7–3.11
This is a reboot of #28003. See #31824 for a detailed discussion of the motivation for this PR.
If you have concerns about this change, please comment on #31824.
* py-datalad: add 0.17.5
* Fix description of py-types-urllib3
* Update var/spack/repos/builtin/packages/py-datalad/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-datalad/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add colorama dependency for windows
* Fix importlib-metadata dependency
* Update var/spack/repos/builtin/packages/py-pytest/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pytest/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pytest/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Use conflict to avoid dependency duplication
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Allow the nccl flag to be specified even for ROCm builds so that NCCL kernels are included in the build.
In this case the NCCL kernels will use RCCL as the backend implementation.
* Adding intel-oneapi-itac package
* Make black happy
* add rscohn2 as maintainer
* black prefers double quotes
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
* Don't run bootstrap on package only PRs
* Run bootstrap tests when ci.yaml is modified
* Test a package only PR
* Revert "Test a package only PR"
This reverts commit af96b1af60b0c31efcc9a2875ffb1a809ef97241.
Now that `podio` can support `cxxstd` variants 17 and 20, we can allow `edm4hep` to use `cxxstd=20` as well, but must ensure that `edm4hep` uses the same `cxxstd` variant as `podio`. Solution as in `celeritas`.
This fixes a bug where two installations that differ only by package hash will not show
up in `spack find`.
The bug arose because `_cmp_node` on `Spec` didn't include the package hash in its
yielded fields. So, any two `Spec` objects that were only different by package hash
would appear to be equal and would overwrite each other when inserted into the same
`dict`. Note that we could still *install* specs with different package hashes, and they
would appear in the database, but we code that needed to put them into data structures
that use `__hash__` would have issues.
This PR makes `Spec.__hash__` and `Spec.__eq__` include the `process_hash()`, and it
makes `Spec._cmp_node` include the package hash. All of these *should* include all
information in a spec so that we don't end up in a situation where we are blind to
particular field differences.
Eventually, we should unify the `_cmp_*` methods with `to_node_dict` so there aren't two
sources of truth, but this needs some thought, since the `_cmp_*` methods exist for
speed. We should benchmark whether it's really worth having two types of hashing now
that we use `json` instead of `yaml` for spec hashing.
- [x] Add `package_hash` to `Spec._cmp_node`
- [x] Add `package_hash` to `spack.solve.asp.spec_clauses` so that the `package_hash`
will show up in `spack diff`.
- [x] Add `package_hash` to the `process_hash` (which doesn't affect abstract specs
but will make concrete specs correct)
- [x] Make `_cmp_iter` report the dag_hash so that no two specs with different
process hashes will be considered equal.
* initial commit of 0.5.0 changes
* updated dependencies
* updated ffcx sha
* comment style
* llvm compilers
* introduce pugixml dependency for 0.5.0:
* update compilers to support C++20 features
* style fixes
* xtensor and xtl not needed for basix 0.5.0 and above
* Skip to Basix 0.5.1
The 0.5.1 release removes the C++ build dependency on Python that sneaked into the 0.5.0 build system.
* Improve depends on version ranges
* More dependency version improvements
Co-authored-by: Chris Richardson <chris@bpi.cam.ac.uk>
Co-authored-by: Garth N. Wells <gnw20@cam.ac.uk>
* added missing dependency for py-msgpack that breaks neovim
* Update var/spack/repos/builtin/packages/py-pynvim/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The issue is that we are not able to install (Fetch URL error) any
version of catalyst other than the one specified in the Spack package.py.
That version is accessible only because it is cached by Spack. The
real URL does not exist anymore; I believe the reason is that there used
to be a tag in catalyst that does not exist anymore.
53a7b49 created a reference error which broke `.libs` (and
`find_libraries`) for many packages. This fixes the reference
error and improves the testing for `find_libraries` by actually
checking the extension types of libraries that are retrieved by
the function.
* catch json schema errors and reraise as property of SpackError
* no need to catch subclass of given error
* Builtin json library for Python 2 uses more generic type
* Correct instantiation of SpackError (requires a string rather than an exception)
* Use exception chaining (where possible)
* Add libraqm package
* py-pillow: Add optional raqm dependency/variant
* Use sha256
* Use " instead of '
* Use more explicit import
* Only add raqm from @8.4.0:
* Make the docstring shorter to satisfy flake
* Add conflict, silence warning, adjust version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Add a post-install step which runs (only) on Windows to modify an
install prefix, adding symlinks to all dependency libraries.
Windows does not have the same concept of RPATHs as Linux, but when
resolving symbols will check the local directory for dependency
libraries; by placing a symlink to each dependency library in the
directory with the library that needs it, the package can then
use all Spack-built dependencies.
Note:
* This collects dependency libraries based on Package.rpath, which
includes only direct link dependencies
* There is no examination of libraries to check what dependencies
they require, so all libraries of dependencies are symlinked
into any directory of the package which contains libraries
UnifyFS:
- Add 1.0 release
- Deprecate older, unsupported versions
- Set fortran variant to true by default
- Update gotcha and mochi-margo dependency versions for unifyfs@1.0
and unifyfs@develop
- Add conflict of unifyfs with libfabric 1.13.* versions
- Update configure_args to use helper functions
GOTCHA: Hasn't been under active development for a couple years but
the develop branch has some fixes UnifyFS uses. To avoid having
UnifyFS v1.0 depend on a develop branch of a dependency, this creates
a new release in the Gotcha Spack package based on the most recent
commit of the develop branch.
* Adds hyperqueue package
* Adds maintainers
* Adds missing docstring
* Bump year for Copyright
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Fixes checksum, selects preferred version
Don't use a release candidate as the default version
* Switch maintainer to one of the developers for HQ
* Update package.py
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* Add new intel-mpi-benchmarks version
* Add new versions of intel mpi benchmarks
* Fix style bugs
* Fix style bugs
* Switch to using url_for_version formatting and improve patch ranges
* p2p benchmark is not included on older versions
* Set patch to proper version
* Add url field, improve patch versioning, improve version detection
* Add url field, improve patch versioning, improve version detection
* Bug fix
Syntax fix
* Remove 2019 from valid version on reorder_benchmark_macros patch
* OpenMPI isn't supported on older versions of the benchmark. Prevents OpenMPI from being selected on those versions
* Add new requirement of gmake for older versions
* Require intel-mpi for older versions of benchmark
* Minor changes to build directory for older versions
* Remove repeated conflict
* Minor style changes
* Minor change
* Correct fix for intel-mpi-benchmarks
* Bug fix
* Bug fix
* Attempted fix for install bug
* Attempted fix for install bug
* Remove duplicate build_directory setting
* new packages: py-arm-pyart and dependencies
- py-arm-pyart
- py-cylp
- rsl
* Update var/spack/repos/builtin/packages/py-cylp/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix dependencies
- xarray is not optional
- pandas is needed
- pylab is needed
- new package, py-pylab-sdk
- setuptools is needed at run time
* Patch for import of StringIO
* Update var/spack/repos/builtin/packages/py-arm-pyart/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix call to `StringIO` in patch
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ci: restore coverage computation
* Mark "test_foreground_background" as xfail
* Mark "test_foreground_background_output" as xfail
* Make number of processes explicit, remove verbosity on linux
* Run coverage on just 3 Python jobs for linux
* Run coverage on just 3 Python jobs for linux
* Run coverage on just 2 Python jobs for linux
* Add back verbose, since before we didn't encounter the xdist internal error
* Reduce the workers to 2
* Try to use command line
* Fix a version cmp bug in asp.py
* Fix submodule bug for git refs
* Add branch in logic for submodules
* Fix git version comparisons
main does not satisfy git.foo=main
git.foo=main does satisfy main
* [py-python-bioformats] New package
* [py-python-bioformats] Added version 4.0.0
* [py-python-bioformats] Added types
* [py-python-bioformats] setuptools is build only
* [py-python-bioformats] fixup import
* [@spackbot] updating style on behalf of qwertos
Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
* Fixed the py-protobuf recipe so that when cpp support is required,
it uses the same major and minor version range for the protobuf
package.
* Fixed the range bound for the 3.x py-protobuf packages.
Added mappings for 4.x py-protobuf packages to 3.x protobuf packages.
Removed a hash for v21.1 protobuf and replaced with v3.21.1 to keep a
standard versioning convention. Note that Google has started
releasing both a 3.x.y tag and a tag that drops the leading 3, so it is
just x.y. This provides the appearance of a new major version, but
really is just a new minor version. These packages still report
versions as 3.x.y, so switching to versions and hashes with that
convention.
* Simplified constraints based on reviewer comments.
* Fixed flake8 errors
* Update var/spack/repos/builtin/packages/py-protobuf/package.py
* Fixed constraints on v2. versions and addressed Flake8 comments.
* Fixed flake8
* Fixed range dependencies for version 2.x
* Update var/spack/repos/builtin/packages/py-protobuf/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fixed version ranges to skip unknown versions.
* Fixed the dependencies on protobuf to solve weird build issues.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ci: remove !docs from "core" filters
As written now, it causes package-only PRs
to run with coverage.
* Try to skip job under condition, see if the workflow proceed
* Try to cancel a running CI job
* Simplify linux unit-tests, skip windows unit-tests on package PRs
* Reduce the inputs to unit-tests workflow
* Move control logic to main workflow, remove inputs
* Revert "Move control logic to main workflow, remove inputs"
This reverts commit 0c46fece4c.
* Do not compute "with_coverage" since it's always == to "core"
* Remove workflow dispatch from unit tests
* Revert "Revert "Move control logic to main workflow, remove inputs""
This reverts commit dd4e4a4e61.
* Try to skip all from the main workflow
* Add back bootstrap to needed checks for "all"
* Restore the correct logic for conditionals
* Versions added for each dep, but I think I'll need to remove them
* py-tesseract now builds and will import in python
* Fixed flake style error as raised by pipeline
* changed to proper python dependency
* added pil as a dependency
* Fixed flake style errors
* [py-pytesseract] py-pillow and py-wheel are redundant
* [py-pytesseract]
- fixed spelling
- removed unneeded dependency
* [py-pytesseract] update import
Co-authored-by: Viv Eric Hafener <vehrc@sporcbuild.rc.rit.edu>
* Add two no-op jobs named "all-prechecks" and "all"
These are a suggestion from @tgamblin; they are stable, named markers we
can use from GitLab and possibly for required checks to make CI more
resilient to refactors changing the names of specific checks.
* Enable parallel testing using xdist for unit testing in CI
* Normalize tmp paths to deal with macos
* add -u flag compatibility to spack python
As of now, it is accepted and ignored. The usage with xdist, where spack
is invoked specifically by `python -u spack python` and then passed
`-u` by xdist, is the entire reason for doing this. It should never be
used without explicitly passing -u to the executing python interpreter.
* use spack python in xdist to support python 2
When running on python2, spack has many import cycles unless started
through main. To allow that, this uses `spack python` as the
interpreter, leveraging the `-u` support so xdist doesn't error out when
it unconditionally requests unbuffered binary IO.
* Use shutil.move to account for tmpdir being in a separate filesystem sometimes
update libflame to work with the crayCC, craycc, and crayftn compiler wrappers. These lightweight compiler drivers do not add `-L<lib_path>` like the CC/cc/ftn compiler drivers do. I've made a slight change to add the lib directories.
This change adds support for building the rocthrust tests and adds the `amdgpu_target`
variant to the `rocthrust` package.
- [x] rocthrust: add amdgpu_target and spack build test
- [x] Drop numactl as it is not a direct dependency
* add workaround for broken behavior in HIP
HIP has a longstanding CMake issue where it calculates include paths
incorrectly; this works around it for raja and adds an explicit rocprim
dependency.
* propagate openmp requirement and workaround to camp
* refactor and include umpire
* propagate openmp option to camp in umpire and use main camp for main and develop raja and umpire
* bump camp to new patch release
This patchset refactors our GitHub actions into a single top-level ci workflow that
invokes a series of reusable actions. The main goal of this is to be able to easily
control which tests run and in what order based on the success or failure of top-level
prechecks. Our previous workflows ran in three sets:
* nix tests: style and verification first, then linux and macos tests if successful
* windows tests: style and verification first, then windows tests if successful
* bootstrap tests
As a result, the bootstrap tests ran even if the style failed, and style and verification
had to run on two different platforms despite running identical checks. I'm relatively
sure that's because of the limitation on dependencies between steps in the jobs.
Reusable workflows allow us to run the style, verification and now audit checks once,
then depending on the results, and the files changed, run the appropriate nix, windows
and bootstrap tests. While it saves only a few minutes by itself, this makes it easier to
refactor checks to subset tests without having to replicate tests or other workflow
components in the future.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Fix GCC compiler warnings due to not using C99 mode
CC should be overridden with Spack's value, and the other flags needed
to be copied from the Makefile.
Move the copying of the buildcache to a root job that runs after all the child
pipelines have finished, so that the operation can be coordinated across all
child pipelines to remove the possibility of race conditions during potentially
simultaneous copies. This lets us ensure the .spec.json.sig and .spack files
for any spec in the root mirror always come from the same child pipeline
mirror (though which pipeline is arbitrary). It also allows us to avoid copying
of duplicates, which we now do.
If you have an environment like
```
$ cat spack.yaml
spack:
specs: [openmpi@4.1.0+cuda]
```
this PR provides a new command `spack change` that you can use to adjust environment specs from the command line:
```
$ spack change openmpi~cuda
$ cat spack.yaml
spack:
specs: [openmpi@4.1.0~cuda]
```
in other words, this allows you to tweak the details of environment specs from the command line.
Notes:
* This is only allowed for environments that do not define matrices
* Extending this to environments with matrices is possible but not anticipated to be needed immediately
* If this were done, it should probably only be done for "named"/not-anonymous specs (i.e. we can change `openmpi+cuda` but not spec like `+cuda` or `@4.0.1~cuda`)
The building of tests is optional [as of 2.42.9](801eef111d). This PR applies that option in the build.
The reason the option was added was to deal with test build failures in sandboxed environments and with certain glibc versions (caused by glib gresources). For example, with the latest version glibc and in the latest version of docker these tests [cannot be built](https://github.com/moby/moby/issues/43595).
* llvm: fix 15.0.0rc builds on MacOS with command-line-tools
Ref: https://github.com/llvm/llvm-project/issues/57037
i.e. use -DBUILTINS_CMAKE_ARGS=-DCOMPILER_RT_ENABLE_IOS=OFF. But this needs switching "compiler-rt" from "projects" to "runtimes".
Also, fixing the warnings below fixes compile errors:
CMake Warning at CMakeLists.txt:101 (message):
Using LLVM_ENABLE_PROJECTS=libcxx is deprecated now, please use
-DLLVM_ENABLE_RUNTIMES=libcxx or see the instructions at
https://libcxx.llvm.org/BuildingLibcxx.html for building the runtimes.
CMake Warning at CMakeLists.txt:101 (message):
Using LLVM_ENABLE_PROJECTS=libcxxabi is deprecated now, please use
-DLLVM_ENABLE_RUNTIMES=libcxxabi or see the instructions at
https://libcxx.llvm.org/BuildingLibcxx.html for building the runtimes.
CMake Warning at CMakeLists.txt:101 (message):
Using LLVM_ENABLE_PROJECTS=libunwind is deprecated now, please use
-DLLVM_ENABLE_RUNTIMES=libunwind or see the instructions at
https://libcxx.llvm.org/BuildingLibcxx.html for building the runtimes.
/private/var/folders/nt/_m1t_x7j76q6sl3xt91tqgs00000gn/T/balay/spack-stage/spack-stage-llvm-15.0.0-rc2-h2t5bohzyy7exz2ub3m42pfycjcmbndk/spack-build-h2t5boh/include/c++/v1/cstdlib:135:9: error: no member named 'at_quick_exit' in the global namespace
using ::at_quick_exit _LIBCPP_USING_IF_EXISTS;
~~^
* Update var/spack/repos/builtin/packages/llvm/package.py
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
fixes #31484
Before this change if anything was matching an external
condition, it was considered "external" and thus something
to be "built".
This was happening in particular to external packages
that were re-read from the DB, which then couldn't be
reused, causing the problems shown in #31484.
This PR fixes the issue by excluding specs with a
"hash" from being considered "external"
* Test that users have a way to select a virtual
This ought to be solved by extending the "require"
attribute to virtual packages, so that one can:
```yaml
mpi:
require: 'multi-provider-mpi'
```
* Prevent conflicts from being enforced on specs that can be reused.
* Rename the "external_only" fact to "buildable_false", to better reflect its origin
* py-breathe: add new version and improve version constraints
* py-breathe: everyone loves versions
```
py-breathe, py-breathe in the air
don't be afraid to care
```
* Update var/spack/repos/builtin/packages/py-breathe/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add comment
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Preliminary support for include URLs in spack.yaml (environment) files
This commit adds support in environments for external configuration files obtained from a URL, with a preference for grabbing raw text from GitHub and GitLab for efficient downloads of the relevant files. The URL can also be a link to a directory that contains multiple configuration files.
Remote configuration files are retrieved and cached for the environment. Configuration files with the same name will not be overwritten once cached.
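A minimal sketch of what such an environment might look like (the URL and spec are hypothetical placeholders):
```yaml
# spack.yaml -- hypothetical environment including remote configuration
spack:
  include:
  # raw-text URLs (e.g. raw.githubusercontent.com) are preferred for efficiency
  - https://raw.githubusercontent.com/example-org/example-configs/main/packages.yaml
  specs:
  - zlib
```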
Extend the semantics of package requirements to
allow using them also under a virtual package
attribute in packages.yaml
These requirements are enforced whenever that
virtual spec is present in the DAG.
Per #32214, the existing patch `92b2c...` for `gcc@11:` only applies
cleanly for `libzmq@4.3.3:4.3.4`. This PR adds a conflict for earlier
versions, as they cannot be patched due to a different context.
For `gcc@11`, this leaves the most recent two versions available (a
satisfactory compromise).
For `gcc@12`, however, there is another existing conflict that makes
these most recent two versions unavailable. This PR adds an upstream
patch for the single most recent version that allows compilation with
`gcc@12` for that most recent version.
Starting point:
- `gcc@11` concretizes on all versions, attempts to apply patch on
`@4.2.3:4.3.4`, and only succeeds to apply patch on `@4.3.3:4.3.4`,
- `gcc@12` concretizes on `@:4.3.1` (and `@master`), attempts to apply
patch on `@4.2.3:4.3.1`, fails to apply patch on all.
Ending point:
- `gcc@11` concretizes on `@4.3.3:4.3.4` (and `@master`), attempts and
succeeds to apply patch on `@4.3.3:4.3.4`,
- `gcc@12` concretizes on `@4.3.4` (and `@master`), attempts and
succeeds to apply patch on `@4.3.4`.
Verified with environment build:
```yaml
spack:
specs:
- libzmq@4.3.4%gcc@12.1.0
- libzmq@4.3.4%gcc@11.3.0
- libzmq@4.3.3%gcc@11.3.0
view: false
```
which returns the following:
```console
16:14:47 wdconinc@menelaos ~/git/spack (libzmq-patch-gcc12 *+$%=) $
spack install --fresh
==> Installing environment libzmq
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/libmd-1.0.4-egpgd6eoaqtsl5fja2iwsl6gyc4o32p5
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/libsodium-1.0.18-af3rsfnvck6anxf7eeog3f2bph44tjia
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/pkgconf-1.8.0-z5of2hj2c6ygd3kxr4cwv7u7t42sxair
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libmd-1.0.4-tec234gco2sd7n52dkwbrad43sdhaw4o
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libsodium-1.0.18-uljf675u3yrn5c7fdjdpa5c7qnnkynke
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/pkgconf-1.8.0-l4hzc2g4pnn7dwyttphmxivt3xghvpoq
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/libbsd-0.11.5-fi3ri64moy45ksr4sf5pcwd6k23dsa4o
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libbsd-0.11.5-2matmm7im7oygrr77k7wznttv4rbupfz
==> Installing libzmq-4.3.4-t7ad54q3atrnte4rzq7g7cfjcw5227pr
==> No binary for libzmq-4.3.4-t7ad54q3atrnte4rzq7g7cfjcw5227pr found:
installing from source
==> Fetching
c593001a89.tar.gz
==> Fetching
310b8aa57a
==> Fetching
https://github.com/zeromq/libzmq/pull/4334.patch?full_index=1
==> Applied patch
92b2c38a2c.patch?full_index=1
==> Applied patch
https://github.com/zeromq/libzmq/pull/4334.patch?full_index=1
==> libzmq: Executing phase: 'autoreconf'
==> libzmq: Executing phase: 'configure'
==> libzmq: Executing phase: 'build'
==> libzmq: Executing phase: 'install'
==> libzmq: Successfully installed
libzmq-4.3.4-t7ad54q3atrnte4rzq7g7cfjcw5227pr
Fetch: 0.61s. Build: 1m 31.57s. Total: 1m 32.18s.
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/libzmq-4.3.4-t7ad54q3atrnte4rzq7g7cfjcw5227pr
==> Installing libzmq-4.3.3-pxrd6piprucu65bkro2ixms6d3x2eudz
==> No binary for libzmq-4.3.3-pxrd6piprucu65bkro2ixms6d3x2eudz found:
installing from source
==> Fetching
9d9285db37.tar.gz
==> Using cached archive:
/home/wdconinc/.spack/cache/_source-cache/archive/31/310b8aa57a8ea77b7ac74debb3bf928cbafdef5e7ca35beaac5d9c61c7edd239
==> Applied patch
92b2c38a2c.patch?full_index=1
==> libzmq: Executing phase: 'autoreconf'
==> libzmq: Executing phase: 'configure'
==> libzmq: Executing phase: 'build'
==> libzmq: Executing phase: 'install'
==> libzmq: Successfully installed
libzmq-4.3.3-pxrd6piprucu65bkro2ixms6d3x2eudz
Fetch: 0.93s. Build: 11.55s. Total: 12.48s.
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libzmq-4.3.3-pxrd6piprucu65bkro2ixms6d3x2eudz
==> Installing libzmq-4.3.4-hiil6dzy2reb4nk555zztwh44rpbyv4h
==> No binary for libzmq-4.3.4-hiil6dzy2reb4nk555zztwh44rpbyv4h found:
installing from source
==> Using cached archive:
/home/wdconinc/.spack/cache/_source-cache/archive/c5/c593001a89f5a85dd2ddf564805deb860e02471171b3f204944857336295c3e5.tar.gz
==> Using cached archive:
/home/wdconinc/.spack/cache/_source-cache/archive/31/310b8aa57a8ea77b7ac74debb3bf928cbafdef5e7ca35beaac5d9c61c7edd239
==> Applied patch
92b2c38a2c.patch?full_index=1
==> libzmq: Executing phase: 'autoreconf'
==> libzmq: Executing phase: 'configure'
==> libzmq: Executing phase: 'build'
==> libzmq: Executing phase: 'install'
==> libzmq: Successfully installed
libzmq-4.3.4-hiil6dzy2reb4nk555zztwh44rpbyv4h
Fetch: 0.01s. Build: 10.77s. Total: 10.78s.
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libzmq-4.3.4-hiil6dzy2reb4nk555zztwh44rpbyv4h
```
* ddt: Initial commit
* ddt: Minor change to license date to appease the CI gods
* Get rid of unattractive extra line
* Switch to sha256 instead of md5
* Initial Draft of E3SM-Kernels Spackage
Initial draft of E3SM-Kernels. Currently no support for nested_loops due
to build system limitations
* Style Check and Fixed gfortran Check
Fixed style issues and changed gnu toolchain check to a gfortran check
due to hybrid compilers (e.g. clang+gfortran)
* Fixed Style Issues
* tixi: new variants (fortran,shared)
Since some tixi 3 versions, additional CMake flags are needed to build
tixi with shared libraries or with Fortran support.
* tixi: fix style
* Changes for ROCm-5.2.0 changes and new recipe rocwmma
* modify the maintainers for hipify-clang
* address review comments
* update the rocwmma new recipe as per latest syntax
* fix style errors
* modify the patch file to provide the details about the patch
* fix style errors
* whizard: Fix passing of build options, update versions
The dependency of whizard on libtirpc is now correctly passed down as
an autotools option.
Update known versions of package with 3.0.2 and 3.0.3.
* Express path to headers via spec object methods
Allow users to express default requirements in packages.yaml.
These requirements are overridden if more specific requirements
are present for a given package.
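A minimal sketch of the idea, with hypothetical compiler choices; the package-specific entry overrides the default set under `all`:
```yaml
packages:
  all:
    # default requirement applied to every package...
    require: "%gcc"
  cmake:
    # ...overridden by the more specific requirement for cmake
    require: "%clang"
```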
* cpu-features: fix fetch failure
`master` branch was renamed to `main`
* Update var/spack/repos/builtin/packages/cpu-features/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [py-loguru] New package
* [py-loguru] Removed commented out line
* [py-loguru] Added types removed extra dependencies
* [py-loguru] missing windows dependency. listing windows as a conflict for now
* [py-loguru] depends on py-colorama when platform=windows
* [py-loguru] flake8
* [py-loguru] Import update
* [py-loguru]
- python is a runtime dependency
- setuptools is a build dependency
* [@spackbot] updating style on behalf of qwertos
Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
* libfabric: add new versions
1.15.0, 1.15.1, main (previously named master)
* Add OPX fabric option, with conflict for versions before v1.15.0
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* geant4: use define function
* geant4: Change new feature from conflicts to when
* geant4: add support/conflicts for nvhpc
* fixup! geant4: add support/conflicts for nvhpc
This support requires adding the '--tests' option to 'spack ci rebuild'.
Packages whose stand-alone tests are broken (in the CI environment) can
be configured in gitlab-ci to be skipped by adding them to
broken-tests-packages.
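A minimal sketch of what such an entry might look like in the environment's gitlab-ci section (the package name is a hypothetical placeholder):
```yaml
gitlab-ci:
  broken-tests-packages:
  # stand-alone tests for the listed packages are skipped in CI
  - example-pkg
```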
Highlights include:
- Restructured 'spack ci' help to provide better subcommand summaries;
- Ensured only one InstallError (i.e., installer's) rather than allowing
build_environment to have its own; and
- Refactored CI and CDash reporting to keep CDash-related properties and
behavior in a separate class.
This allows stand-alone tests from `spack ci` to run when the `--tests`
option is used. With `--tests`, stand-alone tests are run **after** a
**successful** (re)build of the package. Test results are collected
and report(able) using CDash.
This PR adds the following features:
- Adds `-t` and `--tests` to `spack ci rebuild` to run stand-alone tests;
- Adds `--fail-fast` to stop stand-alone tests after the first failure;
- Ensures a *single* `InstallError` across packages
(i.e., removes second class from build environment);
- Captures skipping tests for externals and uninstalled packages
(for CDash reporting);
- Copies test logs and outputs to the CI artifacts directory to facilitate
debugging;
- Parses stand-alone test results to report outputs from each `run_test` as
separate test parts (CDash reporting);
- Logs a test completion message to allow capture of timing of the last
`run_test` part;
- Adds the runner description to the CDash site to better distinguish entries
in CDash tables;
- Adds `gitlab-ci` `broken-tests-packages` to CI configuration to skip
stand-alone testing for packages with known issues;
- Changes `spack ci --help` so description of each subcommand is a single line;
- Changes `spack ci <subcommand> --help` to provide the full description of
each command (versus no description); and
- Ensures `junit` test log file ends in an `.xml` extension (versus default where
it does not).
Tasks:
- [x] Include the equivalent of the architecture information, or at least the host target, in the CDash output
- [x] Upload stand-alone test results files as `test` artifacts
- [x] Confirm tests are run in GitLab
- [x] Ensure CDash results are uploaded as artifacts
- [x] Resolve issues with CDash build-and test results appearing on same row of the table
- [x] Add unit tests as needed
- [x] Investigate why some (dependency) packages don't have test results (e.g., related from other pipelines)
- [x] Ensure proper parsing and reporting of skipped tests (as `not run`) .. post- #28701 merge
- [x] Restore the proper CDash URL and/or mirror once out-of-band testing is completed
* gaudi: consistent test dependencies when +examples
Gaudi requires testing to be enabled for examples to be built, so all
test dependencies are also there for `+examples`. This PR fixes a
missing pytest dependency when `+examples` is used but no testing is
enabled. The construct with the loop is to ensure the identical
dependencies are always used, even as version ranges may start to differ.
Testing with gaudi_add_tests was added in v35r0. Some nosetests and
QMtests were in the tree before, but not accessible it seems. The
effective dependency since 35.0 is also applied for pytest, extending
the range that was there before disentangling `optional`, at
9d67d1e034/var/spack/repos/builtin/packages/gaudi/package.py
* gaudi: version 36.7 in other PR...
The "release" tarball provided by github lacks several files in
the SFconv/expat/xmlparse directory, including xmlparse. Using
tarballs based off of version tags solves the problem.
o Changes version() to use commits associated with version tags.
o Adds several additional versions.
o Adds myself as maintainer.
o Adds hook to execute autogen.sh.
o Adds autotools etc. dependencies.
o Removes expat dependence.
* openmc: add v0.13.1
* Add @paulromano as maintainer of openmc and py-openmc
* Address review comments from @adamjstewart
* Add back MPI variant in openmc package
In a fast-moving project with as many forks as LLVM, it's difficult to
accurately determine if a function exists just by checking the version
number. The existing version check fails, for example, with llvm-amdgpu
from ROCm 4.5. It is more robust to directly check if the function
exists.
Added the SHA256 for version 6.3.2 and added logic to deal with the change of naming pattern for the makefile.include files, which now appears to leave out the "linux_" prefix. (Changes should be backwards compatible.)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: alansill <alansill@users.noreply.github.com>
* modified list.py and added functionality for --tag
* Removed long and very long, shifted rest of code above return statement
* removed results variable
* added import statement at top
* added the line accidentally deleted
* added line accidentally deleted
* changed p.name to p, added line inside if statement
* line order switched
* [@spackbot] updating style on behalf of sparkyniner
* ran update completion command
* add tests
* Update lib/spack/spack/test/cmd/list.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [@spackbot] updating style on behalf of sparkyniner
* changed argument to mock_packages and moved code under filter by tag
* removed bad rebase code and added additional test
* [@spackbot] updating style on behalf of sparkyniner
* added line removed earlier
* added line removed earlier
* replaced function
* added more recommended changes
Co-authored-by: sairaj <sairaj@sairajs-MacBook-Pro.local>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* VTK-h: don't assume I have Fortran
Don't assume I have a working Fortran compiler in my toolchain :)
* Conduit: Do not Assume Fortran
* Ascent: Do not Assume Fortran
* fix style
* Build Tensorflow using the fork for rocm. Initial commit
* re-order the versions
* fix style errors
* address review comments
* add conflicts for rocm version
* address review comments
* remove rocm variant as its added by ROCmPackage
* add mothra tests
* py-globus-sdk: add new versions; unpin py-cryptography version constraint
* Update var/spack/repos/builtin/packages/py-globus-sdk/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pyjwt: fix py-cryptography version constraints
* Update var/spack/repos/builtin/packages/py-pyjwt/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ForTrilinos: new versions 2.0.1, 2.1.0
I also had to update the checksum for the released 2.0.0: see #32000 for
the explanation and solution to keep this from happening again.
* Soooo stylish
Assertions without messages, if/when hit, create a blank error message for users.
This PR adds error messages to all assertions in asp.py even
if it seems unlikely they will ever be needed.
* Sensei: Refactor package to work with v4.0.0
* Add missing MPI dependency
* Patch bug in libsim adapter
* Simplify conflicts with when-clauses
* Conflict variants that are incompatible (catalyst/libsim/ascent)
* Fix paraview version constraints to be more clear
* Add version constraints for VTK
* Drop unneeded visit restrictions
* Specify +vtkm dependency on ParaView's VTKm
* +hl is not needed for VTK, and is already specified in the VTK recipe
when it is needed
* Pass paths for adios2 and ascent packages
* ECP-SDK: Enable sensei
* CI: Add sensei to the data-vis-sdk pipeline
* Sensei: Change VISIT_DIR to work on linux
* Fixup: style check
* Sensei: Add patch for version detection
* CI: revert SDK pipeline in favor of new matrices
* Sensei: Formatting fixes
The argument is very likely a typo, and was meant to
be given to the fixture decorator. Since the value
being passed is the default, let's just remove it.
* metaeuk: new package
* sepp: new package
* busco: adding version 5.4.3
* busco: adding maintainers
* metaeuk: more accurate cmake dep
* sepp: adding missing dep and accurate python dep
* sepp: remove install_tree
* sepp: extend python
Ensure that build tools with module-level commands in spack use
the version built as part of their build graph if one exists.
This is now also required for mesa, scons, cmake and ctest; out-of-graph
versions of these tools in PATH will not be found unless
added as an external.
This bug appeared because a new version of rocprim needs cmake
3.16; while I have 3.14 in my path, I had added an external for
cmake 3.20 to the dag, but 3.14 was still used to configure
rocprim, causing it to fail. As far as I can tell, all the build
tools added in build_environment.py had this problem, despite the
fact that they should have been resolving these tools by name
with a path search and find the one in the dag that way. I'm
still investigating why the path searching and Executable logic
didn't do it, but this makes three of the build systems much more
explicit, and leaves only gmake and ninja as dependencies from
out in the system while ensuring the version in the dag is used
if there is one.
The additional sqlite version is to perturb the hash of python to
work around a relocation bug which will be fixed in a subsequent
PR.
The Python that comes with XCode command line tools is a bit weird.
This fixes two issues that were preventing it from bootstrapping:
- [x] CommandLineTools python does not come with a `python-config` executable. If you
don't have one of these, CMake's FindPython will, for some reason, assume that you
want Python 2, so you need to set `CLINGO_PYTHON_VERSION` to ensure that clingo
tells `find_package(Python <VERSION>)` the right thing.
- [x] We weren't setting `PYTHONHOME` correctly in Python's `package.py`. We were
setting it to the `prefix` from Python `sysconfig`, but `prefix` tells you where
the executable lives. CommandLineTools python reports its prefix as
/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.8
but that doesn't exist. It looks like Apple builds the full python and just copies
pieces of it into command line tools. PYTHONHOME is supposed to tell you where the
default python library location is. So you have to look at the `base`, which
`sysconfig` reports as
/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8
On most systems this is probably the same as `prefix`, but not with
CommandLineTools, which has its system library in a different place.
The second change here was cherry-picked to 0d981a012d before merging and doesn't show
up here.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
- [x] Rework `headers` to search a sequence of directories and to display all searched
locations on error, as opposed to handling each directory with a variable
- [x] Make `headers` and `libs` do the same type of search and raise the same sort of
errors.
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
* [py-bitshuffle] New package
* [py-bitshuffle] compiling against spack hdf5
* [py-bitshuffle] flake8
* [py-bitshuffle] update import line
* [@spackbot] updating style on behalf of qwertos
Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
We're seeing errors pop up with older versions of Kokkos and newer versions of
`oneapi`, specifically:
error: no member named 'ONEAPI' in namespace 'sycl'
This happens because `sycl::ONEAPI` is `sycl::ext::oneapi` since oneapi `2022.0.0`.
`kokkos@3.6:` uses the new namespace. A conflict was present for this, but it was too
specific -- both to `dpcpp` and to the `2022.0.0` version.
- [x] Expand version ranges in `kokkos` conflict
- [x] Add conflict for `oneapi` in addition to `dpcpp`
* filesystem: use lstat in recursive mtime
When a `develop` path contains a dead symlink, the `os.stat` in the recursive `mtime` determination trips up over it.
Closes #32165.
`requirement_policy/3` is generated and may not be in Spack's inputs to Clingo.
Currently this is causing warnings like:
```
$ spack spec zlib
/global/u2/t/tgamblin/src/spack/lib/spack/spack/solver/concretize.lp:510:3-43: info: atom does not occur in any rule head:
requirement_policy(Package,X,"one_of")
/global/u2/t/tgamblin/src/spack/lib/spack/spack/solver/concretize.lp:517:3-43: info: atom does not occur in any rule head:
requirement_policy(Package,X,"one_of")
/global/u2/t/tgamblin/src/spack/lib/spack/spack/solver/concretize.lp:523:3-43: info: atom does not occur in any rule head:
requirement_policy(Package,X,"any_of")
/global/u2/t/tgamblin/src/spack/lib/spack/spack/solver/concretize.lp:534:3-43: info: atom does not occur in any rule head:
requirement_policy(Package,X,"any_of")
Input spec
--------------------------------
zlib
Concretized
--------------------------------
zlib@1.2.11%gcc@7.5.0+optimize+pic+shared arch=cray-sles15-haswell
```
- [x] Silence warning with `#defined requirement_policy/3`
Spack doesn't have an easy way to say something like "If I build
package X, then I *need* version Y":
* If you specify something on the command line, then you ensure
that the constraints are applied, but the package is always built
* Likewise if you `spack add X...` to your environment, the
constraints are guaranteed to hold, but the environment always
builds the package
* You can add preferences to packages.yaml, but these are not
guaranteed to hold (Spack can choose other settings)
This commit adds a 'require' subsection to packages.yaml: the
specs added there are guaranteed to hold. The commit includes
documentation for the feature.
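A minimal sketch of the new subsection, with a hypothetical version and variant:
```yaml
packages:
  openmpi:
    # guaranteed to hold whenever openmpi is part of a build
    require: "@4.1.3 +cuda"
```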
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* celeritas: update checksum for git describe difference
Using `$Format:%h$` and the export-subst gitattribute property means that
the release checksum can change post release...
```
$ diff -ru spack-stage-celeritas-0.1.0-hv7ewpjouekqws2y5iaql2cnp6tn76iz/ spack-stage-celeritas-0.1.0.wtf-ynyck3df2a7kkgkqrpwapgz3l2i62omz/
diff -ru spack-stage-celeritas-0.1.0-hv7ewpjouekqws2y5iaql2cnp6tn76iz/spack-src/cmake/CgvFindVersion.cmake spack-stage-celeritas-0.1.0.wtf-ynyck3df2a7kkgkqrpwapgz3l2i62omz/spack-src/cmake/CgvFindVersion.cmake
--- spack-stage-celeritas-0.1.0-hv7ewpjouekqws2y5iaql2cnp6tn76iz/spack-src/cmake/CgvFindVersion.cmake 2022-07-31 19:45:03.000000000 -0400
+++ spack-stage-celeritas-0.1.0.wtf-ynyck3df2a7kkgkqrpwapgz3l2i62omz/spack-src/cmake/CgvFindVersion.cmake 2022-07-31 19:45:03.000000000 -0400
@@ -75,7 +75,7 @@
# Get a possible Git version generated using git-archive (see the
# .gitattributes file)
- set(_ARCHIVE_TAG "HEAD -> master, tag: v0.1.0")
+ set(_ARCHIVE_TAG "tag: v0.1.0")
set(_ARCHIVE_HASH "04fe945d9")
set(_TAG_REGEX "v([0-9.]+)(-dev[0-9.]+)?")
Binary files spack-stage-celeritas-0.1.0-hv7ewpjouekqws2y5iaql2cnp6tn76iz/v0.1.0.tar.gz and spack-stage-celeritas-0.1.0.wtf-ynyck3df2a7kkgkqrpwapgz3l2i62omz/v0.1.0.tar.gz differ
```
* celeritas: Use exported and re-uploaded tar.gz for reproducibility
All PRs are failing the docs build on account of an error with
pygments. These errors coincide with a new release of pygments
(2.13.0) and restricting to < 2.13 allows the doc tests to pass,
so this commit enforces that constraint for the docs build.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
A few attributes in packages are meant to be reserved for
Spack internal use. This audit checks packages to ensure none
of these attributes are overridden.
- [x] add additional audit check
In file included from /data/xsdkci/VS1mJQ1K/0/xsdk-project/spack-xsdk/spack-stage/balay/spack-stage-oce-0.18.3-rim5zhuacb7z4plxag52fjj3gbc4znv3/spack-src/src/OSD/OSD_Parallel.cxx:17:
/data/xsdkci/VS1mJQ1K/0/xsdk-project/spack-xsdk/spack-stage/balay/spack-stage-oce-0.18.3-rim5zhuacb7z4plxag52fjj3gbc4znv3/spack-src/src/OSD/OSD_Parallel.hxx: In static member function 'static void OSD_Parallel::ForEach(InputIterator, InputIterator, const Functor&, Standard_Boolean)':
/data/xsdkci/VS1mJQ1K/0/xsdk-project/spack-xsdk/spack-stage/balay/spack-stage-oce-0.18.3-rim5zhuacb7z4plxag52fjj3gbc4znv3/spack-src/src/OSD/OSD_Parallel.hxx:223:18: error: 'captured_exception' in namespace 'tbb' does not name a type
223 | catch ( tbb::captured_exception& anException )
| ^~~~~~~~~~~~~~~~~~
/data/xsdkci/VS1mJQ1K/0/xsdk-project/spack-xsdk/spack-stage/balay/spack-stage-oce-0.18.3-rim5zhuacb7z4plxag52fjj3gbc4znv3/spack-src/src/OSD/OSD_Parallel.hxx:225:38: error: 'anException' was not declared in this scope
225 | Standard_NotImplemented::Raise(anException.what());
| ^~~~~~~~~~~
/data/xsdkci/VS1mJQ1K/0/xsdk-project/spack-xsdk/spack-stage/balay/spack-stage-oce-0.18.3-rim5zhuacb7z4plxag52fjj3gbc4znv3/spack-src/src/OSD/OSD_Parallel.hxx: In static member function 'static void OSD_Parallel::For(Standard_Integer, Standard_Integer, const Functor&, Standard_Boolean)':
/data/xsdkci/VS1mJQ1K/0/xsdk-project/spack-xsdk/spack-stage/balay/spack-stage-oce-0.18.3-rim5zhuacb7z4plxag52fjj3gbc4znv3/spack-src/src/OSD/OSD_Parallel.hxx:272:18: error: 'captured_exception' in namespace 'tbb' does not name a type
272 | catch ( tbb::captured_exception& anException )
| ^~~~~~~~~~~~~~~~~~
/data/xsdkci/VS1mJQ1K/0/xsdk-project/spack-xsdk/spack-stage/balay/spack-stage-oce-0.18.3-rim5zhuacb7z4plxag52fjj3gbc4znv3/spack-src/src/OSD/OSD_Parallel.hxx:274:38: error: 'anException' was not declared in this scope
274 | Standard_NotImplemented::Raise(anException.what());
| ^~~~~~~~~~~
`setup_dependent_build_environment(self, env, dependent_spec)` does not have a variable `spec`.
This causes several issues right now:
```console
==> Installing gaudi-36.6-cjjrpjwpcqrtojyrdqml3jpzkbn55hpb
==> No binary for gaudi-36.6-cjjrpjwpcqrtojyrdqml3jpzkbn55hpb found: installing from source
==> Error: NameError: name 'spec' is not defined
/home/wdconinc/git/spack/var/spack/repos/builtin/packages/root/package.py:614, in setup_dependent_build_environment:
611 env.prepend_path("ROOT_INCLUDE_PATH", dependent_spec.prefix.include)
612 if "+rpath" not in self.spec:
613 env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib.root)
>> 614 if "platform=darwin" in spec:
615 # Newer deployment targets cause fatal errors in rootcling
616 env.unset("MACOSX_DEPLOYMENT_TARGET")
```
On PR pipelines we need to override the buildcache destination to
point to the "spack-binaries-prs" bucket, otherwise, those pipelines
try to push to the default mirror in a bucket for which they don't
have write permission.
This PR fixes the performance regression reported in #31985 and a few
other issues found while refactoring the spack mirror create command.
Modifications:
* (Primary) Do not require concretization for
`spack mirror create --all`
* Forbid using --versions-per-spec together with --all
* Fixed a few issues when reading specs from input file (specs were
not concretized, command would fail when trying to mirror
dependencies)
* Fix issue with default directory for spack mirror create not being
canonicalized
* Add more unit tests to poke spack mirror create
* Skip externals also when mirroring environments
* Changed slightly the wording for reporting (it was mentioning
"Successfully created" even in presence of errors)
* Fix issue with colify (was not called properly during error
reporting)
* [py-pynndescent] New package
* [py-pynndescent] Removed white space
* [py-pynndescent] fixed dependencies
* [py-pynndescent] flake8
* [py-pynndescent] Import update
Co-authored-by: James A Zilberman <jazrc@rit.edu>
- [x] Add release v0.2 of the AML package, deprecate v0.1, and add support for
OpenCL, HIP, and CUDA variants of the library. Also update repo and
release URL, as the previous one is not accessible anymore.
- [x] aml: add oneapi-level-zero support
- [x] Change openmp flags to force compatibility when compiling with
intel-oneapi compilers.
`LD_LIBRARY_PATH` can break system executables (e.g., when an environment is loaded) and isn't necessary thanks to `RPATH`s. Packages that require `LD_LIBRARY_PATH` can set this in `setup_run_environment`.
- [x] Prefix inspections no longer set `LD_LIBRARY_PATH` by default
- [x] Document changes and workarounds for people who want `LD_LIBRARY_PATH`
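A sketch of the kind of workaround referred to above, assuming the `modules` prefix-inspections configuration keeps its usual shape; users who still want the old behavior could opt back in along these lines:
```yaml
# user config sketch: re-enable LD_LIBRARY_PATH in generated environment modifications
modules:
  prefix_inspections:
    lib:
    - LD_LIBRARY_PATH
    lib64:
    - LD_LIBRARY_PATH
```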
These changes make many packages build on nixos where nearly nothing
comes from /bin or /usr/bin (the only things in "system locations" are
/bin/sh and /usr/bin/env, all the rest is found through PATH).
Many configuration scripts hardcode /usr/bin/file instead of using the
one from PATH. This patches them to use file from PATH.
* ADIOS2: ZFP<1.0
The tagged ADIOS2 releases in Spack (and develop) do not yet
work with ZFP 1.0. Express a valid dependency range before someone
updates the ZFP package and triggers the incompatibility.
* Use semver
* flux packages used in a spack view will already be unshallow,
and this particular command will error and break the entire build. Instead
we want to catch the ProcessError and allow for a regular fetch.
* final tweaks to add missing sqlite and fallback to fetch
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
fixes #31736
Catch errors when concretizing specs and report them as
debug messages. The corresponding spec is skipped.
Co-authored-by: Greg Becker <becker33@llnl.gov>
* Extracted two functions in cmd/install.py
* Extracted a function to perform installation from the active environment
* Rename a few functions, remove args from their arguments
* Rework conditional in install_from_active_environment to reduce nesting in the function
* Extract functions to parse specs from cli and files
* Extract functions to get user confirmation for overwrite
* Extract functions to install specs inside and outside environments
* Rename a couple of functions
* Fix outdated comment
* Add missing imports
* Split conditional to dedent one level
* Invert check and exit early to dedent one level when requiring user confirmation
* tassel: add version 3.0.174
fix naming of 5.2.39 from "2017-07-22"
* changed single quotes to double quotes to appease the gods ;)
* fixed style issues
The current use of git refs as versions requires a search algorithm to pick the right matching version based on the tags in the git history of the package.
This is less than ideal for the use case where users already know the specific version they want the git ref to be associated with. This PR adds a new version syntax [package]@[ref]=[version] to allow users to specify the exact hash they wish to use and the version to associate with it.
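A minimal sketch of the syntax described above, with a hypothetical package and commit hash pinned to behave as version 1.2.3:
```yaml
# hypothetical environment using the [package]@[ref]=[version] syntax
spack:
  specs:
  - example-pkg@abc123def4567890abc123def4567890abc123de=1.2.3
```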
* Add checksum for py-arrow 1.2.1 and 1.2.2
* Update package.py
* Update var/spack/repos/builtin/packages/py-arrow/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* Update package.py
* Update package.py
* Update package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [py-glymur] New Package: py-glymur
* [py-glymur] fixed copyright line
* [py-glymur] update import line
* [@spackbot] updating style on behalf of qwertos
Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
There is a problem with the git repo for rhtslib that apparently led to
a bad version entry during the previous round of package updates. A git
checkout of the commit also fails, so use the branch for the most recent version.
* new package: alphafold
and related dependencies, depends on #27138
* [@spackbot] updating style on behalf of aweits
* fix
Co-authored-by: aweits <aweits@users.noreply.github.com>
On some systems the shell in login mode wipes important parts of the
environment, such as PATH. This causes the build to fail since it can't
find `spack`.
For better robustness, don't use a login shell.
In a full CI job the final spack install is run in an environment formed by scripts running in this order:
export AWS_SECRET=... # 1. Load environment from GitLab project variables
source spack/share/spack/setup-env.sh # 2. Load Spack into the environment (PATH)
spack env activate -V concrete_env # 3. Activate the concrete environment
source /etc/profile # 4. Bash login shell (from -l)
spack install ...
Whereas when a user launches their own container with (docker|podman) run -it, they end up running spack install in an environment formed in this order:
source /etc/bash.bashrc # (not 4). Bash interactive shell (default with TTY)
export AWS_SECRET=... #~1. Manually load environment from GitLab project variables
source spack/share/spack/setup-env.sh # 2. Load Spack into the environment (PATH)
spack env activate -V concrete_env # 3. Activate the concrete environment
spack install ...
The big problem is that (4) has a completely different position and content (on Leap 15 and possibly other containers).
So in context, this PR removes (4) from the CI job case, leaving us with the simpler:
export AWS_SECRET=... # 1. Load environment from GitLab project variables
source spack/share/spack/setup-env.sh # 2. Load Spack into the environment (PATH)
spack env activate -V concrete_env # 3. Activate the concrete environment
spack install ...
* database: don't sort on return from query_local
* ASP-based solver: don't build the hash-lookup dictionary twice
Building this dictionary twice and traversing all the specs
might be time-consuming for large buildcaches.
This test relied on an old version of the `flake8_package` fixture that modified
the spack repository, but it doesn't do that anymore. There are other tests for
`changed_files()` that do a better job of mocking up a git repository with
changes, so we can just delete this one.
A GitHub rebase merge seems to rewrite commits even if it would be a
fast-forward, which means that the commit merged from #24718 is wrong.
- [x] update `.git-blame-ignore-revs` with real commit from `develop`
`spack style` tests were annoyingly brittle because we could not easily be
specific about which tools to run (we had to use `--no-black`, `--no-isort`,
`--no-flake8`, and `--no-mypy`). We should be able to specify what to run OR
what to skip.
Now you can run, e.g.:
spack style --tool black,flake8
or:
spack style --skip black,isort
- [x] Remove `--no-black`, `--no-isort`, `--no-flake8`, and `--no-mypy` args.
- [x] Add `--tool TOOL` argument.
- [x] Add `--skip TOOL` argument.
- [x] Allow either `--tool black --tool flake8` or `--tool black,flake8` syntax.
- [x] remove alignment spaces from templates
- [x] replace single with double quotes
- [x] Makefile template now generates parsable code
(function body is `pass` instead of just a comment)
- [x] template checks now run black to check output
Previously we'd accept any version for bootstrapping black, but we need <= 21.
- [x] modify bootstrapping code to check black version before accepting an
executable from `PATH`.
- [x] add `.git-blame-ignore-revs` to ignore black reformatting
- [x] make `spack blame` respect `.git-blame-ignore-revs`
(even if the user hasn't configured git to do so)
Some of our tests rely on single vs. double quotes, and others rely on specific
line numbers in the source. These needed fixing after the switch to Black.
Black will automatically fix a lot of the exceptions we previously allowed for
directives, so we don't need them in our custom `flake8_formatter` anymore.
- [x] remove `E501` (long line) exceptions for directives from `flake8_formatter`,
as they won't help us now.
- [x] Refine exceptions for long URLs in the `flake8_formatter`.
- [x] Adjust the mock `flake8-package` to exhibit the exceptions we still allow.
- [x] Update style tests for new `flake8-package`.
- [x] Blacken style test.
Many noqa's in the code are no longer necessary now that the column limit is 99
characters. Others can easily be eliminated, and still more can just be made more
specific if they do not have to do with line length.
The only E501's still in the code are in the tests for `spack.util.path` and the tests
for `spack style`.
This adds necessary configuration for flake8 and black to work together.
This also sets the line length to 99, per the data here:
* https://github.com/spack/spack/pull/24718#issuecomment-876933636
Given the data and the spirit of black's 88-character limit, we set the limit to 99
characters for all of Spack, because:
* 99 is one less than 100, a nice round number, and all lines will fit in a
100-character wide terminal (even when the text editor puts a \ at EOL).
* 99 is just past the knee of the file size curve for packages, and it means that packages
remain readable and not significantly longer than they are now.
* It doesn't seem to hurt core -- files in core might change length by a few percent but
seem like they'll be mostly the same as before -- just a bit more roomy.
- [x] set line length to 99
- [x] remove most exceptions from `.flake8` and add the ones black cares about
- [x] add `[tool.black]` to `pyproject.toml`
- [x] make `black` run if available in `spack style --fix`
Co-Authored-By: Tom Scogland <tscogland@llnl.gov>
* Add checksum for numba 0.55.2 and 0.56
* Add checksum for py-llvmlite 0.39.0
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* OpenGL: Restructures the OpenGL packages
This provides concrete glx and osmesa packages which delegate to
virtual libglx and libosmesa packages provided by mesa. This was
necessary because GLX and OSMesa both provide gl implementations, but
with mesa providing the virtual gl package there was no way to properly
distinguish which of the two OpenGL implementations was being requested
when querying the spec['gl'] dependency. This additional level of
indirection allows for that.
* OpenGL: Adjust downstream dependents of OpenGL for the restructure
This implements the necessary fixes in the packages that depend on
OpenGL to work with the restructuring. This also attempts to create a
consistent variant for specifying glx or osmesa.
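A hedged sketch of the pattern, using Spack's `provides()`/`depends_on()` directives (class, version, and virtual names are simplified, and the snippet assumes Spack's package API rather than being standalone):
```python
# Sketch of the structure described above; names are simplified placeholders.
from spack.package import *


class Glx(BundlePackage):
    """Concrete GLX shim: provides the generic `gl` virtual and delegates the
    actual implementation to a `libglx` virtual provided by mesa."""

    version("1.4")

    provides("gl")        # still satisfies the generic OpenGL virtual
    depends_on("libglx")  # the mesa-provided specific virtual


# A dependent can then distinguish the two implementations, e.g. with a
# hypothetical variant:
#     depends_on("glx", when="+glx")
#     depends_on("osmesa", when="~glx")
# and query spec["glx"] or spec["osmesa"] instead of the ambiguous spec["gl"].
```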
The patch URLs dynamically generate a diff, which includes metadata
about the git version used, meaning they are not content-addressable.
Instead, ship the patches with Spack.
In #31618 the idea was to determine the file extension heuristically by dropping query params etc. from a URL and then considering it as a file path. That broke for URLs that only have query params, like http://example.com/?patch=x, as it would result in an empty string as the basename. This PR reverts to the old behavior of saving files as ?patch=x in that case.
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
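A small standalone sketch of that fallback (a hypothetical helper, not Spack's actual fetcher code):
```python
import posixpath
from urllib.parse import urlparse


def filename_for_url(url):
    """Prefer the path basename; fall back to the raw query (e.g. '?patch=x')
    when the URL has no usable path component."""
    parsed = urlparse(url)
    basename = posixpath.basename(parsed.path)
    if basename:
        return basename
    return "?" + parsed.query if parsed.query else ""


print(filename_for_url("http://example.com/foo.patch?full_index=1"))  # foo.patch
print(filename_for_url("http://example.com/?patch=x"))                # ?patch=x
```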
* herwig3: change lhapdfsets dependency type to build
These data sets are needed for a check during the build, but due to the difficulty of versioning the datasets it is preferred not to keep them in the run environment.
* herwig3: explicitly state needed boost libs
* thepeg: explicitly state needed boost libs
* style
* stylestyle
* Tau must get GCC path from environment on Cray
self.compiler doesn't provide the path to the gcc compiler when using Cray cc, and the Spack internal compiler overrides the location in PATH. If possible, get the location from the GCC_PATH variable instead.
* Fix flake8 issues
* Update package.py
* llvm: Use variant when clauses for many of the expressed conflicts
* llvm: Remove the shared variant as it wasn't really used
* llvm: Remove unnecessary deps and make explicit the ones that are
* llvm: Cleanup patch conditions
* pocl: Update for llvm cleanup
* unit-test: update unparse package hash with the updated llvm package
* llvm: Fix ppc long double patching and add clarifying comments
`self.archive_file` can be (among other things) a symlink to a tarball. `extension()` on a
symlink will result in no extension. This patch fixes the behavior introduced in
https://github.com/spack/spack/pull/31618.
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
When
1. Spack installs libtool,
2. system libtool is installed too, and
3. system automake is used
Spack passes system automake's `-I <prefix>` flag to itself, even though
it's a default search path. This takes precedence over spack's libtool
prefix dir. This causes the wrong `libtool.m4` file to be used (since
system libtool is in the same prefix as system automake).
And that leads to error messages about an incompatible libtool (complaints
about LT_INIT).
Fixes #31627
spack.mirror.get_all_versions now uses the package class
instead of the package object in its implementation.
Ensure spec is concrete before staging for mirrors
* Update open-ce patches for py-torch to use immutable URLs. Update magma dependency specs to be more explicit.
* Address comments for PR regarding URLs and conflicting variants.
Co-authored-by: Nicholas Cameron Sly <sly1@llnl.gov>
* rocblas: make tensile dependencies conditional
* Remove rocm-smi from the rocblas dependency list
rocm-smi was added to the rocblas dependency list because Tensile was a
dependency of rocBLAS and rocm-smi was a dependency of Tensile. However,
that reasoning was not correct.
Tensile is composed of three components:
1. A command-line tool for generating kernels, benchmarking them, and
saving the parameters used for generating the best kernels
(a.k.a. "Solutions") in YAML files.
2. A build system component that reads YAML solution files, generates
kernel source files, and invokes the compiler to compile them into
code object files (*.co, *.hsco). An index of the kernels and their
associated parameters is also generated and stored in either YAML
or MessagePack format (TensileLibrary.yaml or TensileLibrary.dat).
3. A runtime library that will load the TensileLibrary and code object
files when asked to execute a GEMM and choose the ideal kernel for
your specific input parameters.
rocBLAS developers use (1) during rocBLAS development. This is when
Tensile depends on rocm-smi. The GPU clock speed and temperature must be
controlled to ensure consistency when doing the solution benchmarking.
That control is provided by rocm-smi. When building rocBLAS, Tensile is
used for (2). However, there is no need for control of the GPU at that
point and rocm-smi is not a dependency. At runtime, the rocBLAS library
uses Tensile for (3), but there is again no dependency on rocm-smi.
tl;dr: rocm-smi is a dependency of the tensile benchmarking tool,
which is not a build dependency or runtime dependency of rocblas.
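As a hedged illustration, conditional Tensile-related dependencies in a Spack recipe can look like the sketch below (the variant name, version, and extra dependency are placeholders, and the snippet assumes Spack's package API rather than being standalone):
```python
# Illustrative fragment; names and versions below are placeholders.
from spack.package import *


class Rocblas(CMakePackage):
    """Sketch of conditional Tensile-related build dependencies."""

    homepage = "https://github.com/ROCm/rocBLAS"

    version("5.2.0")

    variant("tensile", default=True, description="use Tensile to generate kernels")

    # Tensile is only used at build time to generate kernels (component 2 above),
    # so anything it needs is a conditional build-only dependency.
    depends_on("py-pyyaml", type="build", when="+tensile")

    # rocm-smi is intentionally NOT a dependency: it is only used by Tensile's
    # benchmarking tool (component 1), which rocBLAS never runs during its
    # build or at runtime.
```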
This PR contains several fixes for the kallisto package.
- create hdf5 variant as hdf5 is optional beginning with 0.46.2
- provide patch for 0.43 to link against libz
- provide patch for older versions to build against gcc-11 and up
- patch and code to use autoconf-2.70 and up
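A hedged sketch of how these fixes typically look in a Spack recipe (version ranges, patch file names, and the CMake option are placeholders, not the exact ones from this PR; the snippet assumes Spack's package API):
```python
# Illustrative fragment; patch names and option names are placeholders.
from spack.package import *


class Kallisto(CMakePackage):
    """Sketch of an optional hdf5 variant plus version-conditional patches."""

    homepage = "https://pachterlab.github.io/kallisto"

    version("0.48.0")
    version("0.46.2")
    version("0.43.1")

    # hdf5 is optional beginning with 0.46.2, so expose it as a variant
    variant("hdf5", default=False, description="build with HDF5 support")
    depends_on("hdf5", when="+hdf5")

    # apply fixes only to the versions/compilers that need them
    patch("link-libz.patch", when="@0.43")
    patch("gcc11-build.patch", when="%gcc@11:")

    def cmake_args(self):
        return [self.define_from_variant("USE_HDF5", "hdf5")]
```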
Newer versions of botocore (>=1.23.47) support the full IOBase
interface, so the hacks added to supplement the missing attributes are
no longer needed. Conditionally disable the hacks if they appear to be
unnecessary based on the class hierarchy found at runtime.
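A rough standalone sketch of that runtime check (the decision function is hypothetical; only the class-hierarchy test reflects the description above):
```python
import io

try:
    from botocore.response import StreamingBody
except ImportError:  # botocore not installed; nothing to check
    StreamingBody = None


def needs_iobase_shim():
    """Apply the old compatibility hack only when botocore's StreamingBody does
    not already implement the full IOBase interface (botocore < 1.23.47)."""
    return StreamingBody is not None and not issubclass(StreamingBody, io.IOBase)


if needs_iobase_shim():
    # ... install the legacy wrapper that supplies readable()/seekable()/etc. ...
    pass
```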
* py-panaroo: new package
* moving panaroo to branch
* updated mizani, plotnine, and pystan versions and requirements
* made suggested fixes
* adding more requested fixes
* added new versions of statsmodels and httpstan
* py-torch: add version 0.23.0 and fix to build on aarch64
* Add newer versions, fix build issues
* Fix tests
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add connection information to buildcache update command
Ensure that the S3 connection made when updating the contents of a
buildcache uses the extra connection information provided when the
mirror was created.
* Add unique help for endpoint URL argument
Fix copy/paste error for endpoint URL help which was the same as
the access token
* Re-work URL checking for S3 mirrors
Because nested bucket URLs would never match the string used
for checking that the mirror is the same, switch the check used.
Sort all mirror URLs by length to have the most specific cases first
and see if the desired URL "starts with" the mirror URL.
* Long line style fixes
Add exceptions for long lines and fix other style errors
* Use format() function to rebuild URL
Use the format() function to rebuild the URL instead of crafting a
formatted string out of known values
* Add early exit for URL checking
When a valid mirror is found, break from the loop
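A small standalone sketch of the matching logic described above: sort mirror URLs longest-first so nested bucket paths win, use a prefix test, and stop at the first hit.
```python
def find_matching_mirror(mirrors, url):
    """Return the (name, mirror_url) whose URL is a prefix of `url`,
    preferring the longest (most specific) mirror URL."""
    for name, mirror_url in sorted(
        mirrors.items(), key=lambda item: len(item[1]), reverse=True
    ):
        if url.startswith(mirror_url):
            return name, mirror_url  # early exit on the most specific match
    return None


mirrors = {
    "top": "s3://my-bucket",
    "nested": "s3://my-bucket/nightly/develop",
}
print(find_matching_mirror(mirrors, "s3://my-bucket/nightly/develop/build_cache/x.spec.json"))
# ('nested', 's3://my-bucket/nightly/develop')
```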
For a long time the module configuration has had a few settings that use
`blacklist`/`whitelist` terminology. We've been asked by some of our users to replace
this with more inclusive language. In addition to being non-inclusive, `blacklist` and
`whitelist` are inconsistent with the rest of Spack, which uses `include` and `exclude`
for the same concepts.
- [x] Deprecate `blacklist`, `whitelist`, `blacklist_implicits` and `environment_blacklist`
in favor of `exclude`, `include`, `exclude_implicits` and `exclude_env_vars` in module
configuration, to be removed in Spack v0.20.
- [x] Print deprecation warnings if any of the deprecated names are in module config.
- [x] Update tests to test old and new names.
- [x] Update docs.
- [x] Update `spack config update` to fix this automatically, and include a note in the error
that you can use this command.
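For illustration, a standalone sketch of the old-to-new key mapping that such an automatic update could apply (this is not Spack's actual `spack config update` implementation):
```python
# Sketch of the deprecated-to-new key mapping; illustrative only.
RENAMES = {
    "blacklist": "exclude",
    "whitelist": "include",
    "blacklist_implicits": "exclude_implicits",
    "environment_blacklist": "exclude_env_vars",
}


def update_module_config(config):
    """Recursively replace deprecated keys in a module config dictionary."""
    if isinstance(config, dict):
        return {RENAMES.get(key, key): update_module_config(value)
                for key, value in config.items()}
    if isinstance(config, list):
        return [update_module_config(item) for item in config]
    return config


old = {"modules": {"default": {"tcl": {"blacklist_implicits": True,
                                       "whitelist": ["gcc"]}}}}
print(update_module_config(old))
# {'modules': {'default': {'tcl': {'exclude_implicits': True, 'include': ['gcc']}}}}
```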
Thanks for taking the time to report this build failure. To proceed with the report please:
1. Title the issue `Installation issue: <name-of-the-package>`.
2. Provide the information required below.
We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively!
- type: textarea
id: reproduce
@@ -29,7 +29,9 @@ body:
description: |
Please post the error message from spack inside the `<details>` tag below:
value: |
<details><summary>Error message</summary><pre>
<details><summary>Error message</summary>
<pre>
...
</pre></details>
validations:
@@ -53,7 +55,7 @@ body:
Please upload the following files:
* **`spack-build-out.txt`**
* **`spack-build-env.txt`**
They should be present in the stage directory of the failing build. Also upload any `config.log` or similar file if one exists.
description: Suggest adding a feature that is not yet in Spack
labels: [feature]
body:
@@ -29,13 +29,11 @@ body:
attributes:
label: General information
options:
- label: I have run `spack --version` and reported the version of Spack
required: true
- label: I have searched the issues of this repo and believe this is not a duplicate
required: true
- type: markdown
attributes:
value: |
If you want to ask a question about the tool (how to use it, what it can currently do, etc.), try the `#general` channel on [our Slack](https://slack.spack.io/) first. We have a welcoming community and chances are you'll get your reply faster and without opening an issue.
Other than that, thanks for taking the time to contribute to Spack!
description: Some package in Spack had stand-alone tests that didn't pass
title: "Testing issue: "
labels: [test-error]
body:
- type: textarea
id: reproduce
attributes:
label: Steps to reproduce the failure(s) or link(s) to test output(s)
description: |
Fill in the test output from the exact spec that is having stand-alone test failures. Links to test outputs (e.g., CDash) can also be provided.
value: |
```console
$ spack spec -I <spec>
...
```
- type: textarea
id: error
attributes:
label: Error message
description: |
Please post the error message from spack inside the `<details>` tag below:
value: |
<details><summary>Error message</summary>
<pre>
...
</pre></details>
validations:
required: true
- type: textarea
id: information
attributes:
label: Information on your system or the test runner
description: Please include the output of `spack debug report` for your system.
validations:
required: true
- type: markdown
attributes:
value: |
If you have any relevant configuration detail (custom `packages.yaml` or `modules.yaml`, etc.) you can add that here as well.
- type: textarea
id: additional_information
attributes:
label: Additional information
description: |
Please upload test logs or any additional information about the problem.
- type: markdown
attributes:
value: |
Some packages have maintainers who have volunteered to debug build failures. Run `spack maintainers <name-of-the-package>` and **@mention** them here if they exist.
- type: checkboxes
id: checks
attributes:
label: General information
options:
- label: I have reported the version of Spack/Python/Platform/Runner
required: true
- label: I have run `spack maintainers <name-of-the-package>` and **@mentioned** any maintainers
required: true
- label: I have uploaded any available logs
required: true
- label: I have searched the issues of this repo and believe this is not a duplicate
Clone the `spack-configs <https://github.com/spack/spack-configs>`_ repo and activate the Intel oneAPI CPU environment::
git clone https://github.com/spack/spack-configs
spack env activate spack-configs/INTEL/CPU
spack concretize -f
The `Intel oneAPI CPU environment <https://github.com/spack/spack-configs/blob/main/INTEL/CPU/spack.yaml>`_ contains applications tested and validated by Intel; this list is constantly extended. It currently supports:
- `GROMACS <https://www.gromacs.org/>`_
- `HPCG <https://www.hpcg-benchmark.org/>`_
- `HPL <https://netlib.org/benchmark/hpl/>`_
- `LAMMPS <https://www.lammps.org/#gsc.tab=0>`_
- `OpenFOAM <https://www.openfoam.com/>`_
- `STREAM <https://www.cs.virginia.edu/stream/>`_
- `WRF <https://github.com/wrf-model/WRF>`_
To build lammps with the oneAPI compiler from this environment, just run::
spack install lammps
Compiled binaries can be found using::
spack cd -i lammps
You can do the same for all other applications from this environment.
==> Installing "patchelf@0.16.1%gcc@10.2.1 ldflags="-static-libstdc++ -static-libgcc" build_system=autotools arch=linux-centos7-x86_64" from a buildcache
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)