* Added e4s-cl package
* Version order change
* Added e4s-cl dependencies
* Added python-sotools dependency
* [@spackbot] updating style on behalf of spoutn1k
* Add missing versions to py- packages
* Fix style
* [@spackbot] updating style on behalf of spoutn1k
* Update var/spack/repos/builtin/packages/e4s-cl/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/e4s-cl/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-python-sotools/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add docker removing patch for e4s-cl
Co-authored-by: spoutn1k <spoutn1k@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-nexusforge: add with dependencies
* py-pyshacl, py-sseclient: more style
* py-hjson, py-nexus-sdk, py-nexusforge, py-puremagic: more style
* py-pyshacl: license update
* py-nexusforge, py-prettytable, py-pyshacl: review remarks
* py-nexusforge: make the variant mean something
Too hasty to commit...
* py-ipyparallel: add 8.4.1, which builds with py-hatchling
* py-ipyparallel: copyright and redundant py-setuptools dependency
* py-ipyparallel: py-packaging was dropped after 8.0.0
fixes #34879
This commit adds a new maintainer directive,
which by default extends the list of maintainers
for a given package.
The directive is backward compatible with the current
practice of having a "maintainers" list declared at
the class level.
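A minimal sketch of how the directive might look in a recipe, assuming it is spelled `maintainers(...)`; the package name and maintainer handles below are placeholders, and the class-level list remains valid:

```python
from spack.package import *


class Example(Package):
    """Hypothetical package illustrating the maintainers directive."""

    # old, still-supported practice: a class-level list
    # maintainers = ["alice"]

    # new directive form; repeated calls extend the list
    maintainers("alice", "bob")
```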
Move the relocation of binary text into its own class.
Drop threaded text replacement, since the current bottleneck
is decompression. It would be better to parallelize over packages,
instead of over files per package.
A small improvement with separate classes for text replacement is that we
now compile the regex in the constructor; previously it was compiled per
binary to be relocated.
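A minimal sketch of the pattern described above; the class and method names are hypothetical, not Spack's actual API:

```python
import re
from typing import Dict


class BinaryTextReplacer:
    """Replace old prefixes with new ones in binary blobs."""

    def __init__(self, prefix_map: Dict[bytes, bytes]):
        self.prefix_map = prefix_map
        # compile the regex once here, instead of once per relocated binary
        self.regex = re.compile(b"|".join(re.escape(p) for p in prefix_map))

    def replace(self, data: bytes) -> bytes:
        return self.regex.sub(lambda m: self.prefix_map[m.group(0)], data)
```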
The regex doesn't actually work because dollar signs and parentheses have to be
escaped. Also, compiling with OpenMPI requires defining the macro
`MPI2SUPPORT`.
This commit makes explicit the format version of the spec file
we are reading from.
Before there were different functions capable of reading some
part of the spec file at multiple format versions. The decision
was implicit, since checks were based on the structure of the
JSON without ever checking a format version number.
The refactor makes also explicit which spec file format is used
by which database and lockfile format, since the information is
stored in global mappings.
To ensure we don't change the hash of old specs, JSON representations
of specs have been added as data. A unit test checks that we read
the correct hash in, and that the hash stays the same when we
re-serialize the spec using the most recent format version.
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
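The shape of the idea, as a sketch; the version numbers and mapping names below are illustrative placeholders, not Spack's actual constants:

```python
# Each database / lockfile format declares which spec file format it contains,
# instead of guessing from the structure of the JSON.
SPECFILE_FORMAT_BY_DB_VERSION = {5: 1, 6: 2}
SPECFILE_FORMAT_BY_LOCKFILE_VERSION = {3: 1, 4: 2}


def specfile_format_for_database(db_version: int) -> int:
    try:
        return SPECFILE_FORMAT_BY_DB_VERSION[db_version]
    except KeyError:
        raise ValueError(f"unknown database format: {db_version}")
```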
```
File ".../spack/var/spack/environments/scale-mpi/.spack-env/._view/4yiorsdd4pefrnwgrwlwt3yzo5i235il/lib/python3.10/site-packages/h5py/_hl/base.py", line 19, in <module>
from collections import (Mapping, MutableMapping, KeysView,
ImportError: cannot import name 'Mapping' from 'collections' (.../spack/var/spack/environments/scale-mpi/.spack-env/._view/4yiorsdd4pefrnwgrwlwt3yzo5i235il/lib/python3.10/collections/__init__.py)
```
Fixed in https://github.com/h5py/h5py/pull/1069 which was first merged
in v2.9.
* py-flatten-dict: require poetry to build.
The sources seem to contain a bundled, auto-generated `setup.py`.
Building with `pip` insists on using Poetry as mentioned in
`pyproject.toml`, so require it as a build dependency.
* Update var/spack/repos/builtin/packages/py-flatten-dict/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-flatten-dict/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Currently we print "sha256 checksum failed for [file]. Expected X but
got Y".
This PR extends that message with file size and contents info:
"... but got Y. File size = 123456 bytes. Contents = b'abc...def'"
That way we can immediately see if the file was downloaded only
partially, or if we downloaded a text page instead of a binary, etc.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
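A sketch of what building the extended checksum message above could look like; the function and argument names are illustrative:

```python
import os


def checksum_mismatch_message(path, expected, actual, preview_bytes=200):
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        head = f.read(preview_bytes)
    return (
        f"sha256 checksum failed for {path}. Expected {expected} but got {actual}. "
        f"File size = {size} bytes. Contents = {head!r}"
    )
```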
py-scipy 1.6 and older come with pre-Cythonized files that
use the _PyGen_Send symbol that was removed from python 3.10.0.161,
so do not build these old versions with python 3.10.1 and later.
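One way such a constraint is typically expressed in a Spack recipe, as a sketch only; the exact version ranges belong to the package and are not verified here:

```python
# hypothetical directive in py-scipy's package.py
conflicts(
    "^python@3.10.1:",
    when="@:1.6",
    msg="pre-Cythonized sources rely on _PyGen_Send, removed in newer Python",
)
```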
Parts of libgcrypt should not be optimized with -O1/2/3, so it's best to
let the build system do that; the build system cannot know that the compiler
wrapper would inject optimization flags.
When running unit tests, the test/ci.py module leaves
garbage (help.sh, test.sh files) in the current working
directory.
This commit changes the current working directory to a
temporary path before those files are created.
* freeimage: fails to compile with c++17, use c++14
Only `opencascade` depends on `freeimage`, and only when a (non-default) variant is enabled; `freeimage` seems to have gone unmaintained. There are c++17 standard violations [[1]](https://en.cppreference.com/w/cpp/language/except_spec) in the code, so we can at most expect c++14. Since some compilers default to c++17 (gcc-12) we need to be explicit (see the sketch after this list).
* freeimage: install directly in prefix
* freeimage: fix inverted patch
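A sketch of how a recipe can pin the C++ standard when a compiler defaults to a newer one, using Spack's `flag_handler` hook; the package's actual approach may differ:

```python
# hypothetical snippet from a package.py
def flag_handler(self, name, flags):
    if name == "cxxflags":
        flags.append(self.compiler.cxx14_flag)
    return (flags, None, None)
```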
* environments: don't rewrite relative view path, expand path on cli ahead of time
Currently if you have a spack.yaml that specifies a view by relative
path, Spack expands it to an absolute path on `spack -e . install` and
persists that to disk.
This is rather annoying when you have a `spack.yaml` file inside a git
repo, because you want to use relative paths to make it relocatable, but
you constantly have to undo the changes Spack makes to spack.yaml.
So, as an alternative:
1. Always stick to paths as they are provided in spack.yaml, never
replace them with a canonicalized version
2. Turn relative paths on the command line into absolute paths before
storing to spack.yaml. This way you can do `spack env create --dir
./env --with-view ./view` and both `./env` and `./view` are resolved
to the current working dir, as expected (not `./env/view`). This
corresponds to the old behavior of `spack env create`.
* create --with-view always takes a value
All packages with explicit Windows support can be found with
`spack list --tags=windows`.
This also removes the documentation which explicitly lists
supported packages on Windows (which is currently out of date and
is now unnecessary with the added tags).
Note that if a package does not appear in this list, it *may*
still build on Windows, but it likely means that no explicit
attempt has been made to support it.
* nextflow recipe: added latest stable version
* tower-cli recipe: added latest release
* recipes tower-agent and tower-cli renamed to nf-tower-agent and nf-tower-cli
* recipes nf-tower-agent and nf-tower-cli: small fix
* nf-core-tools recipe: added most py- dependencies
* nf-core-tools: recipe without galaxy-tool-util (for testing)
* fixed typos in py-yacman recipe
* fixed typos in py-pytest-workflow recipe
* fixed typo in nf-core-tools recipe
* fixed typos in py-yacman recipe
* fixes in recipes for py-questionary and py-url-normalize
* fixes to py-yacman recipe
* style fixes to py- packages that are dependencies to nf-core-tools
* fix in py-requests-cache recipe
* added missing dep in py-requests-cache recipe
* nf-core-tools deps: removed redundant python dep for py packages oyaml and piper
* nf-core-tools recipe: final, incl dep on py-galaxy-tool-util
* nf-core-tools: new version with extra dependency
* added py-galaxy-util, draft: added some required dep versions, still have to add 40+ deps
* nextflow and nf-core-tools packages: added my self as maintainer
* style fixes
* style fix for nf-core-tools recipe
* added license to py-logmuse recipe
* audit fixes
* style fix after audit fix
* py-galaxy-tool-util: added deps 1st bunch
* audit/style fixes, including adding missing dep package
* more audit/style fixes
* more more audit/style fixes
* moooore audit fixes
* py-galaxy-tool-util: dependencies 2nd chunk
* silly audit fix
* py-galaxy-util deps: 3rd bunch - first 20 done
* fixes
* style fix
* py-galaxy-tool-util: 4th bunch of deps
* stashing dep recipe backbones for py-galaxy-tool-util
* nf-core-tools: using pre-built wheel for dependency py-galaxy-tool-util
* nf-core-tools: adding also py-galaxy-util, as wheel
* fix
* nextflow: added latest bugfix version
* Update var/spack/repos/builtin/packages/nf-core-tools/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/nf-core-tools/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* nf-core-tools pr: 1st bunch of review edits
* nf-core-tools: 2nd bunch of review edits
* adding back tower-agent and tower-cli as deprecated
* nf-core-tools: 3rd bunch of review edits
* small style fix
* prepping py-galaxy-tool-util for further work
* nf-core-tools: last bunch of deps, except for galaxy-tool-util and pulsar
* audit fixes
* updates to py-galaxy-tool-util and its deps, still 2 to work on
* one style fix
* updated recipe for py-galaxy-util
* updated recipe for py-pulsar-galaxy-lib
* typo fix
* shasum fixes
* updated py-sqlalchemy from develop
* added newest versions (today) for nf-tower-agent and nf-tower-cli
* Update var/spack/repos/builtin/packages/py-requests-cache/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-requests-cache/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* adding 2nd bunch of nf-core deps from update/nextflow-tools
* adding 3rd bunch of nf-core deps from update/nextflow-tools
* 4th chunk of nf-core deps from update/nextflow-tools
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pastedeploy/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pebble/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-gunicorn/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-parsley/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-paste/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-paste/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-gxformat2: added comment
* py-lagom: now using github tarballs
* fix for py-lagom
* adding missing deps to py-fastapi-utils
* another fix to py-lagom
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-lagom/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-supervisor/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-social-auth-core/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fixes from PR review
* adding missing deps, from PR review
* py-galaxy2cwl from github tarball, as per PR review
* fix to py-tuswsgi, as per PR review
* nf-tools: edits from PR review
* adding 3x more galaxy deps
* fix
* fixing circular dep of py-poetry-plugin-export with py-poetry
* added newest nf-core-tools version
* Update var/spack/repos/builtin/packages/py-galaxy-util/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* fix in py-poetry-plugin-export
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Currently, the `python` package tries to set `CPATH` in `setup_run_environment()`.
We no longer set `CPATH` and other destructive environment variables (like
`LD_LIBRARY_PATH`) in modules, so we shouldn't do something special for Python.
Also, the way `python` sets `CPATH` causes issues. Because it does a header search to
find directories containing headers, if you bootstrap `mypy` or other style tools on a
fresh Ubuntu image with *no* python devel headers installed, you'll get an error like
this when trying to load the thing you just installed:
```console
[root@980de539843d /]# spack -b load py-mypy
==> Error: Unable to locate python headers in any of these locations:
/usr/include/python3.6m
/usr/include/3.6
/usr/Headers
```
The headers and includes aren't needed to get `mypy` in the path or for `mypy` to work,
so we're failing unnecessarily here.
- [x] remove `setup_run_environment()` from `python/package.py`
Since SPACK_PACKAGE_IDS is now also "namespaced" with <prefix>, it makes
more sense to call the flag `--make-prefix` and alias the old flag
`--make-target-prefix` to it.
1. add variant cray-static, older crays build hpcprof-mpi static,
newer ones build dynamic.
2. move URL patches from github to gitlab.
3. add workaround for a bug where a file is mistakenly overwritten.
4. add conflict for hpcprof-mpi at 2022.10.01.
* [@spackbot] updating style on behalf of mwkrentel
Co-authored-by: mwkrentel <mwkrentel@users.noreply.github.com>
Normally when using external packages in concretization, Spack ignores
all dependencies of the external. #33777 updated this logic to attach
a Python Spec to external Python extensions (most py-* packages), but
as implemented there were a couple issues:
* this did not account for concretization groups and could generate
multiple different python specs for a single DAG
* in some cases this created a fake Python spec with insufficient
details to be usable (concretization/installation of the
extension would fail)
This PR addresses both of these issues:
* For environment specs that are concretized together, external python
extensions in those specs will all be assigned the same Python spec
* If Spack needs to "invent" a Python spec, then it will have all the
needed details (e.g. compiler/architecture)
* npm: Add latest version, update build
The `npm` package had gotten a bit long in the tooth and only supported the last version
for which running `configure` / `make` / `make install` actually worked.
- [x] Update the package to support npm@9, in which `npm install .` works properly and
installation is easier.
- [x] Update the package so that `npm@6:8` also installs successfully. The incantation
that is *supposed* to work on these versions is `node bin/npm-cli.js install $(node
bin/npm-cli.js pack . | tail -1)`, but depending on the version one of `npm install`
or `npm pack` will fail when run straight from the install directory. So now we just
manually copy things over.
This seems to make the `npm` install much more reliable for all of `npm@6:9` (at least
for me).
updates the `npm` build to support versions 6-9 and fixes the install for all of
them on macOS.
* update for review
With the new variable [prefix/]SPACK_PACKAGE_IDS you can conveniently execute
things after each successful install.
For example push just-built packages to a buildcache
```
SPACK ?= spack
export SPACK_COLOR = always
MAKEFLAGS += -Orecurse
MY_BUILDCACHE := $(CURDIR)/cache
.PHONY: all clean
all: push
ifeq (,$(filter clean,$(MAKECMDGOALS)))
include env.mk
endif
# the relevant part: push has *all* example/push/<pkg identifier> as prereqs
push: $(addprefix example/push/,$(example/SPACK_PACKAGE_IDS))
$(SPACK) -e . buildcache update-index --directory $(MY_BUILDCACHE)
$(info Pushed everything, yay!)
# and each example/push/<pkg identifier> has the install target as prereq,
# and the body can use target local $(HASH) and $(SPEC) variables to do
# things, such as pushing to a build cache
example/push/%: example/install/%
@mkdir -p $(dir $@)
$(SPACK) -e . buildcache create --allow-root --only=package --unsigned --directory $(MY_BUILDCACHE) /$(HASH) # push $(SPEC)
@touch $@
spack.lock: spack.yaml
$(SPACK) -e . concretize -f
env.mk: spack.lock
$(SPACK) -e . env depfile -o $@ --make-target-prefix example
clean:
rm -rf spack.lock env.mk example/
```
* [armpl-gcc] Make pkg-config files available
ARMpl pkg-config files are located in a non-default location and do not have the
.pc extension. Changing both helps pkg-config pick them up correctly, e.g.
in the meson build of `py-scipy`.
* Address @annop-w comments
* symlink instead of cp.
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
In the past we checked remote binary mirrors for existence of a spec
before attempting to download it. That changed to only checking local
copies of index.jsons (if available) to prioritize certain mirrors where
we expect to find a tarball. That was faster for CI since fetching
index.json and loading it just to order mirrors takes more time than
just attempting to fetch tarballs -- and also if we have a direct hit
there's no point in looking at other mirrors.
Long story short: the info message only makes sense in the old version
of Spack, so it's better to remove it.
* py-sphinx-immaterial: new package
This is a new-ish theme for Sphinx that's based on MkDocs's `immaterial` theme. More on
the theme here: https://jbms.github.io/sphinx-immaterial/, but it seems to be very clear
and readable. We *might* consider switching to it for Spack's docs.
* Update var/spack/repos/builtin/packages/py-sphinx-immaterial/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-sphinx-immaterial/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-sphinx-immaterial/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-sphinx-immaterial/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add note about node.js requirements
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Forward lookup of "test_log_file" and "test_failures"
refers to #34531, closes #34487, fixes #34440
* Add unit test
* py-libensemble: fix tests
* Support stand-alone tests with cached files as install-time tests
Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Ensure `spack mirror add <name> <url/path>` without further arguments translates to `<name>: <url>` key value pairs in mirrors.yaml. If --s3-* flags are provided, only store the provided ones.
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
This is only a work-around for the actual problem, namely that Python is used to install libraries instead of CMake, so we end up with BUILD_RPATHs, not INSTALL_RPATHs.
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
* Add new file for MVAPICH 3.0a release
Creating this as a new package since it requires some new configuration
options and because we are moving to the name "MVAPICH" and dropping the
2 (following a similar move by MPICH).
Co-authored-by: Nat Shineman <shineman.5@osu.edu>
Co-authored-by: Matthew Lieber <lieber.31@osu.edu>
* OpenCV: checksum for 4.5.5, make contrib optional
* [@spackbot] updating style on behalf of iarspider
* Add conflicts for contrib modules
* Fix typo
* Implement changes from review
* Update var/spack/repos/builtin/packages/opencv/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: iarspider <iarspider@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Revert "Revert "4th chunk of nf-core deps from update/nextflow-tools (#34564)" (#34960)"
This reverts commit 891a63cae6.
* fix to py-python-multipart, as per PR review
* py-macs2: add version 2.2.7.1 and support python@3.10:
The tarball from PyPI includes the Cythonized C files. The tarball from
GitHub does not. Remove the Cythonized C files from the source so that
they are rebuilt with the Spack Python/Cython combination. This is
necessary for python-3.10 but makes sense for other combinations as well.
* Edits based on review
- set python version constraint on version 2.2.4
- removed all python-2 versions and related constraints.
* Add package file for xtb and py-xtb
* Retain maintainers from PythonPackage
* Update package files
- use extends("python") instead of tampering with PYTHONPATH
- use PyPI for downloading sdist of py-xtb
- add simple-dftd3 0.7.0
- add dftd4 3.5.0
- remove --wrap-mode=nodownload from toml-f
* Remove logic for download URL
With this change we get the invariant that `mirror.fetch_url` and
`mirror.push_url` return valid URLs, even when the backing config
file is actually using (relative) paths with potentially `$spack` and
`$env` like variables.
Secondly it avoids expanding mirror path / URLs too early,
so if I say `spack mirror add name ./path`, it stays `./path` in my
config. When it's retrieved through MirrorCollection() we
expand it to say `file://<env dir>/path` if `./path` was set in an
environment scope.
Thirdly, the interface is simplified for the relevant buildcache
commands, so it's more like `git push`:
```
spack buildcache create [mirror] [specs...]
```
`mirror` is either a mirror name, a path, or a URL.
Resolving the relevant mirror goes as follows:
- If it contains either / or \ it is used as an anonymous mirror with
path or url.
- Otherwise, it's interpreted as a named mirror, which must exist.
This helps to guard against typos, e.g. typing `my-mirror` when there
is no such named mirror now errors with:
```
$ spack -e . buildcache create my-mirror
==> Error: no mirror named "my-mirror". Did you mean ./my-mirror?
```
instead of creating a directory in the current working directory. I
think this is reasonable, as the alternative (requiring that a local dir
exists) feels a bit pedantic in the general case -- spack is happy to
create the build cache dir when needed, saving a `mkdir`.
The old (now deprecated) format will still be available in Spack 0.20,
but is scheduled to be removed in 0.21:
```
spack buildcache create (--directory | --mirror-url | --mirror-name) [specs...]
```
This PR also touches `tmp_scope` in tests, because it didn't really
work for me, since spack fixes the possible --scope values once and
for all across tests, so tests failed when run out of order.
Packages that use docbook-xml may specify a specific entity version.
When this is specified as a version constraint in the package recipe it
will cause problems when using `unify = True` in a Spack environment, as
there could be multiple versions of docbook-xml in the spec. In
practice, any entity version should work with any other version and
everything should work with the latest version. This PR maps all Spack
docbook-xml entity versions to the docbook-xml version in the spec.
Ideally, the version in the spec would be the latest version. With this
PR, even if a package specifies an older entity version, it will be mapped
to the (latest) entity version in the spec. This means that there can be one
docbook-xml version in a Spack environment spec and packages requesting
older entity versions will still work.
To help facilitate this, docbook-xml version constraints for packages
that have them have been removed. Those packages are dbus and gtk-doc.
What's in AOCL 4.0:
1. amdblis
   LPGEMM variants with post-ops support
   AMD "Zen4" support for BLIS
2. amdlibflame
   Upgrade to LAPACK 3.10.1 specification
   Improvements in a few more variants of SVD and Eigen Value routines
   Multithread support enabled for selected APIs
3. amdfftw
   AVX-512 enablement of DFT kernels
   AVX-512 optimization of copy and transpose routines
4. amdlibm
   Black & Scholes support (logf, expf, erff, both scalar and vector)
   AVX-512 variants of vector functions
5. aocl-sparse
   New Iterative Solver APIs
   AVX-512 support for SPMV API
6. amdscalapack
   Upgrade to Netlib ScaLAPACK 2.2.0
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Sometimes I just want to know how many packages of a certain type there are.
- [x] add `--count` option to `spack list` that output the number of packages that
*would* be listed.
```console
> spack list --count
6864
> spack list --count py-
2040
> spack list --count r-
1162
```
* Add packages
* Style
* Respond to comments
* Change conflct dep type
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* paraview: add `rocm` variant
This conflicts with CUDA and requires at least ParaView 5.11.0. More
dependencies are also needed.
* E4S: Add ParaView for ROCm and CUDA stacks
* DAV SDK: Update ParaView version and GPU variants
* Verify using hipcc vs amdclang++ for newer hip
Co-authored-by: Ben Boeckel <ben.boeckel@kitware.com>
* adding 1st version of py-strawberryfields
* py-strawberryfields: minor edit
* added backbone for 4x new SF dependencies
* edits to SF and its 4x new deps
* added all deps for SF
* added one version to py-lark-parser
* py-quantum-xir with tarball from github
* Update var/spack/repos/builtin/packages/py-strawberryfields/package.py
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* updated version for py-quantum-xir : pypy fixed
* py-pennylane: added pythonpackage.maintainers, too
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Currently, all of the replacements in `spack.util.path.replacements()` get evaluated for
each replacement. This makes it easy to get bootstrap issues, because config is used
very early on in Spack.
Right now, if I run `test_autotools_gnuconfig_replacement_no_gnuconfig` on my M1 mac, I
get the circular reference error below. This fixes the issue by making all of the path
replacements lazy lambdas.
As a bonus, this cleans up the way we do substitution for `$env` -- it's consistent with
other substitutions now.
- [x] make all path `replacements()` lazy
- [x] clean up handling of `$env`
```console
> spack unit-test -k test_autotools_gnuconfig_replacement_no_gnuconfig
...
==> [2022-12-31-15:44:21.771459] Error: AttributeError:
The 'autotools-config-replacement' package cannot find an attribute while trying to build from sources. This might be due to a change in Spack's package format to support multiple build-systems for a single package. You can fix this by updating the build recipe, and you can also report the issue as a bug. More information at https://spack.readthedocs.io/en/latest/packaging_guide.html#installation-procedure
/Users/gamblin2/src/spack/lib/spack/spack/package_base.py:1332, in prefix:
1330 @property
1331 def prefix(self):
>> 1332 """Get the prefix into which this package should be installed."""
1333 return self.spec.prefix
Traceback (most recent call last):
File "/Users/gamblin2/src/spack/lib/spack/spack/build_environment.py", line 1030, in _setup_pkg_and_run
kwargs["env_modifications"] = setup_package(
^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/build_environment.py", line 757, in setup_package
set_module_variables_for_package(pkg)
File "/Users/gamblin2/src/spack/lib/spack/spack/build_environment.py", line 596, in set_module_variables_for_package
m.std_cmake_args = spack.build_systems.cmake.CMakeBuilder.std_args(pkg)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/build_systems/cmake.py", line 241, in std_args
define("CMAKE_INSTALL_PREFIX", pkg.prefix),
^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/package_base.py", line 1333, in prefix
return self.spec.prefix
^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/spec.py", line 1710, in prefix
self.prefix = spack.store.layout.path_for_spec(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/directory_layout.py", line 336, in path_for_spec
path = self.relative_path_for_spec(spec)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/directory_layout.py", line 106, in relative_path_for_spec
projection = spack.projections.get_projection(self.projections, spec)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/projections.py", line 13, in get_projection
if spec.satisfies(spec_like, strict=True):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/spec.py", line 3642, in satisfies
if not self.virtual and other.virtual:
^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/spec.py", line 1622, in virtual
return spack.repo.path.is_virtual(self.name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 890, in is_virtual
return have_name and pkg_name in self.provider_index
^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 770, in provider_index
self._provider_index.merge(repo.provider_index)
^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 1096, in provider_index
return self.index["providers"]
~~~~~~~~~~^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 592, in __getitem__
self._build_all_indexes()
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 607, in _build_all_indexes
self.indexes[name] = self._build_index(name, indexer)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/spack/repo.py", line 616, in _build_index
index_mtime = self.cache.mtime(cache_filename)
^^^^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/llnl/util/lang.py", line 826, in __getattr__
return getattr(self.instance, name)
^^^^^^^^^^^^^
File "/Users/gamblin2/src/spack/lib/spack/llnl/util/lang.py", line 825, in __getattr__
raise AttributeError()
AttributeError
```
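A minimal sketch of the lazy-replacement idea described before the traceback above; the lookup functions are stand-ins, and Spack's real table has more entries:

```python
def _spack_root():
    return "/opt/spack"  # stand-in for the real lookup


def _env_path():
    return "/opt/env"  # stand-in; $env handling is now consistent with the rest


# values are callables, so nothing is computed until it is actually needed
replacements = {"spack": _spack_root, "env": _env_path}


def substitute(path):
    for key, value in replacements.items():
        token = "$" + key
        if token in path:
            path = path.replace(token, value())  # evaluated lazily, on use
    return path
```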
* Added support for libfabric to find an external installation and
identify variants supported.
* Change the fabrics definition to only include CXI when on a Cray
  system with a libfabric-based Slingshot network.
* Added a conflict when trying to build the CXI fabric value since it is
only available as closed source at this time.
#33128 introduces a dependency on re2c into the Ninja build recipe.
This is problematic on Windows as we use CMake to build re2c, and
Ninja to drive the CMake build. This PR resolves this issue by
adding a variant to toggle the use of re2c with ninja.
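A sketch of the toggle described above; the variant name and default are assumptions:

```python
# hypothetical directives in ninja's package.py
variant("re2c", default=True, description="Enable bootstrapping with re2c")
depends_on("re2c", type="build", when="+re2c")
```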
* Added a more robust check for an external version of the library.
Included a guard to identify when the library gives no discernible
version information and then to substitute with "unknown_ver"
identifier.
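A sketch of such a guarded version check using Spack's external-detection hook; the regex and the tool's output format are assumptions:

```python
import re


@classmethod
def determine_version(cls, exe):
    output = Executable(exe)("--version", output=str, error=str)
    match = re.search(r"(\d+\.\d+\.\d+)", output)
    # fall back to a sentinel when the tool reports nothing usable
    return match.group(1) if match else "unknown_ver"
```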
* gaudi: new versions 36.8, 36.9
As of 36.8, the tests use catch2 ([commit](https://gitlab.cern.ch/gaudi/Gaudi/-/commit/f2cafb5c9d04c9d497d49182258aa3a0440622c0)).
* gaudi: still depends_on fmt@:8
* unifyfs: new release v1.0.1
Add 1.0.1 release
Add new variant for new configure time option
Co-authored-by: CamStan <CamStan@users.noreply.github.com>
Now that the `tix` variant is conditional, it should also be detected
conditionally, otherwise the spec is invalid and cannot be used during
concretization.
ShellCheck is installed with a downloaded binary instead of being
compiled from source, and there should be comments to point out this
unorthodox approach.
autoconf 2.70 uses `use warnings` instead of `-w` so that `PERL=/usr/bin/env perl` can be passed, but we want to fix absolute paths anyhow through sbang upon install. So, we stick to patching the one perl script that's used during the build.
Gitlab does not merge lists when a job extends two other definitions
that include the same list (e.g. tags). Also, it merges dictionaries
as long as the keys are distinct, but just takes the last mentioned
value when there are key collisions.
This change makes sure that when different tags are needed by a
pipeline, the ones we want are actually provided. It also changes
the example stack to better follow this pattern so we do not lead
developers astray in the future.
Since we dropped support for Python 2.7, we can embrace using keyword only arguments
for many functions in Spack that use **kwargs in the function signature. Here this is done
for the llnl.util.filesystem module.
There were a couple of bugs lurking in the code related to typo-like errors when retrieving
from kwargs. Those have been fixed as well.
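A sketch of the change in style; the names are illustrative of the `llnl.util.filesystem` API, not the exact signatures:

```python
# before: a mistyped key silently falls back to the default
def install_tree(src, dest, **kwargs):
    symlinks = kwargs.get("symlinks", True)
    ...


# after: keyword-only arguments make typos fail loudly with a TypeError
def install_tree(src, dest, *, symlinks=True, ignore=None):
    ...
```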
* Update xsbench to version 20
XSBench version 20 has implementations for new
architectures and accelerators.
* Added CUDA support for XSBench
* Fixed style issues
The code in FileCache for write_transaction attempts to delete the temporary file when an exception occurs under the context by calling shutil.rmtree. However, rmtree only operates on directories while the rest of FileCache uses normal files. This causes an empty file to be written to the cache key when unneeded.
Use os.remove instead which operates on normal files.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
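A sketch of the cleanup logic after the fix; the helper names are hypothetical:

```python
import os


def write_transaction(tmp_file, writer):
    """Write through `writer`, removing the partial file on failure (sketch)."""
    try:
        writer(tmp_file)
    except Exception:
        if os.path.exists(tmp_file):
            os.remove(tmp_file)  # regular file: os.remove, not shutil.rmtree
        raise
```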
* cernlib: depends_on freetype, libnsl, libxcrypt, openssl; and patch
In addition to #34448, cernlib depends on these additional packages.
This also applies a patch to the current release in which crypto is
specified where libcrypt (in libxcrypt) is actually needed. Because
the upstream git repository is behind a CERN login, we cannot patch
by gitlab URL link.
* [@spackbot] updating style on behalf of wdconinc
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* update the version for 5.3.0 release
* update the rocwmma for 5.3.0 release
* fix the +hip variant
* update the version for rocm-openmp-extras package for 5.3.0 release
* update the hipsolver and hipfft as per review comments
* address review comments
* revert changes to mivisionx with regard to change added for clangrt
* fix for the petsc failure
Spack was running an external detection of Python during each invocation
of the setup script for Windows CMD/PWSH, which has dramatic performance
implications each time the script is invoked, and is completely
unnecessary. Remove this operation.
Windows CMD prompt does not automatically support ANSI color control
characters on the console from Python. Enable this behavior by
accessing the current console and allowing the interpretation of ANSI
control characters from Python via the win32 API.
* adding 2nd bunch of nf-core deps from update/nextflow-tools
* adding 3rd bunch of nf-core deps from update/nextflow-tools
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pastedeploy/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pebble/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-gunicorn/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-starlette/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-parsley/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-paste/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-paste/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-gxformat2: added comment
* py-lagom: now using github tarballs
* fix for py-lagom
* adding missing deps to py-fastapi-utils
* another fix to py-lagom
* Update var/spack/repos/builtin/packages/py-dnspython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-fastapi-utils/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-lagom/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-supervisor/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This commit allows (remote) spec.json files to be clearsigned and gzipped.
The idea is to reduce the number of requests and the number of bytes transferred.
* added recipes for py-qutip and py-qutip-qip
* small fix
* updated qutip 2x versions
* py-qutip-qip: tarball url from github
* style fix in py-qutip-qip
* Update var/spack/repos/builtin/packages/py-qutip-qip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-qutip-qip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-qutip-qip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-qutip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-qutip/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* draft for py-pennylane recipe
* first draft for py-strawberryfields recipe
* minimal fix
* small fixes
* accounting for circular dep in py-pennylane and py-pennylane-lightning
* removing py-strawberryfields from this branch
* updated versions for py-pennylane 2x
* needs cmake
* py-pennylane-lightning using github tarball
* Update var/spack/repos/builtin/packages/py-autoray/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pennylane-lightning/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The Intel compiler isn't able to deal with noinline member functions of template classes defined in headers. As such it outputs
```
warning #2196: routine is both "inline" and "noinline"
```
cmake bootstrap will fail due to the word 'warning'.
See spack/var/spack/repos/builtin/packages/protobuf/intel-v2.patch for reference.
The issue does not appear with intel@2021.7.0 or later:
```
$~: compiler=/shared/spack/opt/spack/linux-amzn2-x86_64_v3/gcc-12.2.0/intel-oneapi-compilers-2022.2.0-uqvb2553zy5toeapvoopacndd27x6p5m/compiler/2022.2.0/linux/bin/intel64/icpc
$~: $compiler unique.c
icpc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
```
This is a clean version of https://github.com/spack/spack/pull/34167
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
* package/libproxy: fix py3 install
* improve readability
* fix bug
* also add extend
* make flake happy
* [@spackbot] updating style on behalf of Sinan81
* Update var/spack/repos/builtin/packages/libproxy/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* python dependency implied by extends const.
* disable python variant by default
* add run_env, add py conflict
* Update var/spack/repos/builtin/packages/libproxy/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* set env for macos as well
* generalize lib dir detection
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add_new_package: py-file-magic
* re-order depends...
* Update var/spack/repos/builtin/packages/py-file-magic/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [@spackbot] updating style on behalf of Sinan81
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
* Add py-svgpath and dependency
* Update copyright expiration
* [@spackbot] updating style on behalf of heerener
* Process review remarks
* Update var/spack/repos/builtin/packages/py-trimesh/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix style issue
* py-trimesh: cleanup and optional dependencies
* Fix formatting issue
* py-trimesh: complete dependency list for easy variant
Two new packages: py-mapbox-earcut and py-pycollada
* Some more missing dependencies
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* new package: py-kb-python + dependencies
- py-loompy
- py-ngs-tools
- py-numpy-groupies
* Update var/spack/repos/builtin/packages/py-kb-python/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-shortuuid: add version 1.0.11
* Update var/spack/repos/builtin/packages/py-shortuuid/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update packages config to indicate that MSVC is the preferred compiler
* Update packages config to indicate that msmpi is the preferred MPI provider
* Fix msmpi external detection
Spack imports `pytest`, which *can* import `numpy`. Recent versions of `numpy` require
Python 3.8 or higher, and they use 3.8 type annotations in their type stubs (`.pyi`
files). At the same time, we tell `mypy` to target Python 3.7, as we still support older
versions of Python.
What all this means is that if you run `mypy` on `spack`, `mypy` will follow all the
static import statements, and it ends up giving you this error when it finds numpy stuff
that is newer than the target Python version:
```
==> Running mypy checks
src/spack/var/spack/environments/default/.spack-env/._view/4g7jd4ibkg4gopv4rosq3kn2vsxrxm2f/lib/python3.11/site-packages/numpy/__init__.pyi:638: error: Positional-only parameters are only supported in Python 3.8 and greater [syntax]
Found 1 error in 1 file (errors prevented further checking)
mypy found errors
```
We can fix this by telling `mypy` to skip all imports of `numpy` in `pyproject.toml`:
```toml
[[tool.mypy.overrides]]
module = 'numpy'
follow_imports = 'skip'
follow_imports_for_stubs = true
```
- [x] don't follow imports from `numpy` in `mypy`
- [x] get rid of old rule not to follow `jinja2` imports, as we now require Python 3
The code in Spack to generate install and test reports currently suffers from unneeded complexity. For
instance, we have classes in Spack core packages, like `spack.reporters.CDash`, that need an
`argparse.Namespace` to be initialized and have "hard-coded" string literals on which they branch to
change their behavior:
```python
if do_fn.__name__ == "do_test" and skip_externals:
package["result"] = "skipped"
else:
package["result"] = "success"
package["stdout"] = fetch_log(pkg, do_fn, self.dir)
package["installed_from_binary_cache"] = pkg.installed_from_binary_cache
if do_fn.__name__ == "_install_task" and installed_already:
return
```
This PR attempts to polish the major issues encountered in both `spack.report` and `spack.reporters`.
Details:
- [x] `spack.reporters` is now a package that contains both the base class `Reporter` and all
the derived classes (`JUnit` and `CDash`)
- [x] Classes derived from `spack.reporters.Reporter` don't take an `argparse.Namespace` anymore
as argument to `__init__`. The rationale is that code for commands should be built upon Spack
core classes, not vice-versa.
- [x] An `argparse.Action` has been coded to create the correct `Reporter` object based on command
line arguments
- [x] The context managers to generate reports from either `spack install` or from `spack test` have
been greatly simplified, and have been made less "dynamic" in nature. In particular, the `collect_info`
class has been deleted in favor of two more specific context managers. This allows for a simpler
structure of the code, and less knowledge required of client code (in particular on which method to patch)
- [x] The `InfoCollector` class has been turned into a simple hierarchy, so to avoid conditional statements
within methods that assume a knowledge of the context in which the method is called.
On systems with remote groups, the primary user group may be remote and may not exist on
the local system (i.e., it might just be a number). On the CLI, it looks like this:
```console
> touch foo
> l foo
-rw-r--r-- 1 gamblin2 57095 0 Dec 29 22:24 foo
> chmod 2000 foo
chmod: changing permissions of 'foo': Operation not permitted
```
Here, the local machine doesn't know about per-user groups, so they appear as gids in
`ls` output. `57095` is also `gamblin2`'s uid, but the local machine doesn't know that
`gamblin2` is in the `57095` group.
Unfortunately, it seems that Python's `os.chmod()` just fails silently, setting
permissions to `0o0000` instead of `0o2000`. We can avoid this by ensuring that the file
has a group the user is known to be a member of.
- [x] Add `ensure_known_group()` in the permissions tests.
- [x] Call `ensure_known_group()` on tempfile in `test_chmod_real_entries_ignores_suid_sgid`.
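A minimal sketch of what such a helper could do; the real test helper may differ:

```python
import os


def ensure_known_group(path):
    """Give `path` a group the current user is actually a member of."""
    known_gids = os.getgroups()
    if known_gids and os.stat(path).st_gid not in known_gids:
        os.chown(path, -1, known_gids[0])  # keep the owner, change only the group
```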
There are a number of places in our docstrings where we write "list of X" as the type, even though napoleon doesn't actually support this. It ends up causing warnings when generating docs.
Now that we require Python 3, we don't have to rely on type hints in docs -- we can just use Python type hints and omit the types of args and return values from docstrings.
We should probably do this for all types in docstrings eventually, but this PR focuses on the ones that generate warnings during doc builds.
Some `mypy` annoyances we should consider in the future:
1. Adding some of these type annotations gets you:
```
note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs [annotation-unchecked]
```
because they are in unannotated functions (like constructors where we don't really need any annotations).
You can silence these with `disable_error_code = "annotation-unchecked"` in `pyproject.toml`
2. Right now we support running `mypy` in Python `3.6`. That means we have to support `mypy` 0.971, which does not support `disable_error_code = "annotation-unchecked"`, so I just filter `[annotation-unchecked]` lines out in `spack style`.
3. I would rather just turn on `check_untyped_defs` and get more `mypy` coverage everywhere, but that will require about 1,000 fixes. We should probably do that eventually.
4. We could also consider only running `mypy` on newer python versions. This is not easy to do while supporting `3.6`, because you have to use `if TYPE_CHECKING` for a lot of things to ensure that 3.6 still parses correctly. If we only supported `3.7` and above we could use [`from __future__ import annotations`](https://mypy.readthedocs.io/en/stable/runtime_troubles.html#future-annotations-import-pep-563), but we have to support 3.6 for now. Sigh.
- [x] Convert a number of docstring types to Python type hints
- [x] Get rid of "list of" wherever it appears
Based on the following lines in the top level `CMakeLists.txt` (I can't deep link since gitlab.cern.ch not public), `cernlib` needs an explicit dependency on `libxaw` and `libxt`:
```cmake
find_package(X11 REQUIRED)
message(STATUS "CERNLIB: X11_Xt_LIB=${X11_Xt_LIB} X11_Xaw_LIB=${X11_Xaw_LIB} X11_LIBRARIES=${X11_LIBRARIES}")
```
Per https://github.com/spack/spack/issues/34192, apptainer does not support `--without-conmon`, so we introduce a base class `config_options` property that can be overridden in the `apptainer` package.
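A sketch of the override described above; the class names follow the Spack packages involved, and the flag lists are illustrative:

```python
class SingularityBase(MakefilePackage):
    @property
    def config_options(self):
        # flags passed to mconfig by the shared build logic
        return ["--without-conmon"]


class Apptainer(SingularityBase):
    @property
    def config_options(self):
        # apptainer's mconfig does not accept --without-conmon
        return []
```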
`texinfo` depends on `gettext`, and it builds a perl module that uses gettext via XS
module FFI. Unfortunately, the XS module build asks perl to tell it what compiler to
use instead of respecting the one passed to configure.
Without this change, the build fails with this error:
```
parsetexi/api.c:33:10: fatal error: 'libintl.h' file not found
^~~~~~~~~~~
```
We need the gettext dependency and the spack wrappers to ensure XS builds properly.
- [x] Add needed `gettext` dependency to `texinfo`
- [x] Override XS compiler with `PERL_EXT_CC`
Co-authored-by: Paul Kuberry <pakuber@sandia.gov>
Local `git` tests will fail with `fatal: transport 'file' not allowed` when using git 2.38.1 or higher, due to a fix for `CVE-2022-39253`.
This was fixed in CI in #33429, but that doesn't help the issue for anyone's local environment. Instead of fixing this with git config in CI, we should ensure that the tests run anywhere.
- [x] Introduce `spack.util.git`.
- [x] Use `spack.util.git.get_git()` to get a git executable, instead of `which("git")` everywhere.
- [x] Make all `git` tests use a `git` fixture that goes through `spack.util.git.get_git()`.
- [x] Add `-c protocol.file.allow=always` to all `git` invocations under `pytest`.
- [x] Revert changes from #33429, which are no longer needed.
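The helper's shape, as a sketch; the module path comes from the checklist above, and the body is an assumption about how it wraps `which`:

```python
# spack/util/git.py (sketch)
import spack.util.executable


def get_git(required: bool = False):
    """Return a git Executable, or None if git is not available."""
    return spack.util.executable.which("git", required=required)
```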
`spack graph` has been reworked to use:
- Jinja templates
- builder objects to construct the template context when DOT graphs are requested.
This allowed adding a new colored output for DOT graphs that highlights both
the dependency types and the nodes that are needed at runtime for a given spec.
`spack solve` is supposed to show you times you can compare. setup, ground, solve, etc.
all in a list. You're also supposed to be able to compare easily across runs. With
`pretty_seconds()` (introduced in #33900), it's easy to miss the units, e.g., spot the
bottleneck here:
```console
> spack solve --timers tcl
setup 22.125ms
load 16.083ms
ground 8.298ms
solve 848.055us
total 58.615ms
```
It's easier to see what matters if these are all in the same units, e.g.:
```
> spack solve --timers tcl
setup 0.0147s
load 0.0130s
ground 0.0078s
solve 0.0008s
total 0.0463s
```
And the units won't fluctuate from run to run as you make changes.
- [x] make `spack solve` timings consistent like before
* py-pytest-datadir: Init at 1.4.1
* py-pytest-data-dir: Fix missing dep
Co-authored-by: "Adam J. Stewart" <ajstewart426@gmail.com>
Co-authored-by: "Adam J. Stewart" <ajstewart426@gmail.com>
* ML CI: Linux x86_64
* Update comments
* Rename again
* Rename comments
* Update to match other arches
* No compiler
* Compiler was wrong anyway
* Faster TF
Avoid text decoding and encoding when combining log files, instead
combine in binary mode.
Also do a buffered copy which is sometimes faster for large log files.
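A sketch of the binary-mode, buffered combination; the file names are placeholders:

```python
import shutil


def combine_logs(parts, combined):
    with open(combined, "wb") as out:
        for part in parts:
            with open(part, "rb") as f:
                # no decode/encode round-trip; copy in 1 MiB chunks
                shutil.copyfileobj(f, out, length=1024 * 1024)
```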
Currently, the Spack docs show documentation for submodules *before* documentation for
the package itself on package doc pages. This means that if you put docs in `__init__.py` in
some package, the docs in there will be shown *after* the docs for all submodules of the
package instead of at the top as an intro to the package. See, e.g.,
[the lockfile docs](https://spack.readthedocs.io/en/latest/spack.environment.html#module-spack.environment),
which should be at the
[top of that page](https://spack.readthedocs.io/en/latest/spack.environment.html).
- [x] add the `--module-first` option to sphinx so that it generates module docs at top of page.
* librsvg: add 2.40.21, which does not require rust and has some security backports
https://download.gnome.org/sources/librsvg/2.40/librsvg-2.40.21.news
* librsvg: prevent finding broken gtkdoc binaries when ~doc is selected.
On my CentOS7 hosts, ./configure finds e.g. /bin/gtkdoc-rebase even when
~doc is selected. These tools use Python2, and fail with an error:
"ImportError: No module named site"
So prevent ./configure from finding these broken tools when not building
the +doc variant.
without this patch, the build of paraview has a meltdown when reaching 3rd party catalyst and other packages,
with these types of errors:
```
/tmp/foo/spack-stage/spack-stage-paraview-5.10.1-gscoqxhhakjyyfirdefuhmi2bzw4scho/spack-src/VTK/ThirdParty/fmt/vtkfmt/vtkfmt/format.h:1732:11: error: cannot capture a bit-field by reference
  if (sign) *it++ = static_cast<Char>(data::signs[sign]);
          ^
```
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
The sticky property will prevent clingo from changing the amdgpu_target
to work around conflicts. This is the same behaviour as was adopted for
cuda_arch in 055c9d125d.
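A sketch of what the sticky declaration looks like; the value list here is illustrative:

```python
# hypothetical directive: with sticky=True the default can only be changed
# by an explicit user request, never by the concretizer to dodge a conflict
variant(
    "amdgpu_target",
    default="none",
    values=("none", "gfx908", "gfx90a"),
    multi=True,
    sticky=True,
    description="AMD GPU architecture",
)
```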
Replace the filter_file for older configure with rocm 5.3 with an
upstream patch. Further, the patch is no longer needed for develop or
later releases.
Implement an alternative strategy to do index.json invalidation.
The current approach of pairs of index.json / index.json.hash is
problematic because it leads to races.
The standard solution for cache invalidation is etags, which are
supported by both http and s3 protocols, which allows one to do
conditional fetches.
This PR implements that for the http/https schemes. It should also work
for s3 schemes, but that requires other prs to be merged.
Also it improves unit tests for index.json fetches.
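A sketch of an ETag-based conditional fetch with the standard library; the URL handling and return shape are illustrative:

```python
import urllib.error
import urllib.request


def fetch_index(url, etag=None):
    headers = {"If-None-Match": etag} if etag else {}
    request = urllib.request.Request(url, headers=headers)
    try:
        with urllib.request.urlopen(request) as response:
            return response.read(), response.headers.get("ETag")
    except urllib.error.HTTPError as e:
        if e.code == 304:  # Not Modified: the cached index.json is still valid
            return None, etag
        raise
```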
* py-scikit-image: add 0.19.3
* Update var/spack/repos/builtin/packages/py-scikit-image/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* LLVM: replace libelf dependency with elf
I didn't test this extensively, but in CMS LLVM builds just fine with elfutils.
* [@spackbot] updating style on behalf of iarspider
Co-authored-by: iarspider <iarspider@users.noreply.github.com>
* exiv2: add new versions
* babl: new package required to build GIMP
* gegl: new package required to build GIMP
* gexiv2: new package required to build GIMP
* libmypaint: new package required to build GIMP
* mypaint-brushes: new package required to build GIMP
* vala: new package required to build GIMP
* GIMP: new package definition for building GIMP-2.10 from source
* libjxl: update for 0.7.0
* libwmf: a library for reading vector images in Windows Metafile Format (WMF)
* libde265: an open source implementation of the h.265 video codec
* libwebp: add new versions
* GIMP: additional variants for building GIMP-2.10 from source
* libde265: remove boilerplate
* fixes for style precheck
* updates based on feedback
* fixes for style precheck
Update the depends_on("perl") to depends_on("perl+threads").
This and #34074 are needed to properly handle e.g. the perl-Thread-Queue
rpm package:
It may not be installed on RedHat-based hosts, which can lead to automake
build failures when `spack external find perl` or `spack external find --all`
is used to pick up the system-provided perl install.
* nextflow recipe: added latest stable version
* tower-cli recipe: added latest release
* recipes tower-agent and tower-cli renamed to nf-tower-agent and nf-tower-cli
* recipes nf-tower-agent and nf-tower-cli: small fix
* nf-core-tools recipe: added most py- dependencies
* nf-core-tools: recipe without galaxy-tool-util (for testing)
* fixed typos in py-yacman recipe
* fixed typos in py-pytest-workflow recipe
* fixed typo in nf-core-tools recipe
* fixed typos in py-yacman recipe
* fixes in recipes for py-questionary and py-url-normalize
* fixes to py-yacman recipe
* style fixes to py- packages that are dependencies to nf-core-tools
* fix in py-requests-cache recipe
* added missing dep in py-requests-cache recipe
* nf-core-tools deps: removed redundant python dep for py packages oyaml and piper
* nf-core-tools recipe: final, incl dep on py-galaxy-tool-util
* nf-core-tools: new version with extra dependency
* commit to merge packages on focus from update/nextflow-tools
* nf-core: commenting galaxy dep for this pr
* Update var/spack/repos/builtin/packages/py-requests-cache/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-requests-cache/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* removed nf-core-tools from this branch, will be back at the end
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Interim fix for #34559
Spack's MSVC compiler definition uses ifx as the Fortran compiler.
Prior to #33385, the Spack MSVC compiler definition required the
executable to be called "ifx.exe"; #33385 replaced this with just
"ifx", which inadvertently led to ifx falsely indicating the
presence of MSVC on non-Windows systems (which leads to future
errors when attempting to query/use those compiler objects).
This commit applies a short-term fix by updating MSVC Fortran
version detection to always indicate a failure on non-Windows.
fixes #34518
Fix an issue caused by the MRO chain of the package wrapper
during build. Before this PR, run_tests always evaluated to
False when the builder object was created before the
run_tests method was monkey-patched.
openPMD, a metadata standard on top of backends like ADIOS2 and HDF5,
is implemented in ParaView 5.9+ via a Python 3 module.
Simplify Conflicts & Variant
Add to ECP Data Vis SDK
This reverts commit 8035eeb36d.
It also removes the logic around an additional HEAD request used to prevent
a more expensive GET request on the wrong content-type. Since large files
are typically an attachment and only downloaded when reading the
stream, it's not an optimization that helps much, and in fact the logic
was broken since the GET request was done unconditionally.
* Add py-bmtk and py-neurotools
* py-bmtk: version bump
* [@spackbot] updating style on behalf of heerener
* Maybe the copyright needs to be extended to 2022 for the check to pass
* Process review remarks
* Update var/spack/repos/builtin/packages/py-neurotools/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The main issue that's fixed is that Spack passes paths (as strings) to
functions that require URLs. That wasn't an issue on Unix, since there
you can simply concatenate `file://` and `path` and all is good, but on
Windows that gives invalid file URLs. On Unix, Spack also failed to handle URI encoding like x%20y in file paths.
It also removes Spack's custom url.parse function, which had its own incorrect interpretation of file URLs, taking file://x/y to mean the relative path x/y instead of hostname=x and path=/y. It also automatically interpolated variables, which is surprising for a function that parses URLs.
Instead of all sorts of ad-hoc `if windows: fix_broken_file_url` fixes, this PR
adds two helper functions around Python's own path2url and its reverse.
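A minimal sketch of such helpers (assumed names, not Spack's exact implementation), wrapping Python's own conversions so they work on both Unix and Windows:
```python
import pathlib
import urllib.parse
import urllib.request

def path_to_file_url(path):
    # pathlib handles drive letters and percent-encodes spaces, e.g. "x y" -> "x%20y".
    return pathlib.Path(path).absolute().as_uri()

def file_url_to_path(url):
    # Inverse: keep only the path component and decode percent-escapes.
    return urllib.request.url2pathname(urllib.parse.urlparse(url).path)
```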
Also fixes a bug where some `spack buildcache` commands used `-d` to mean
`--mirror-url` (requiring a URL), while others used it to mean `--directory`
(requiring a path). It now consistently means the latter.
* Add GFE packages, Update pFUnit
* Remove citibeth as maintainer per her request
* Version 3.3.0 is an odd duck. Needs a v
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
When installing binary tarballs, Spack has to download from its
binary mirrors.
Sometimes Spack has cache available for these mirrors.
That cache helps to order mirrors to increase the likelihood of
getting a direct hit.
However, currently, when Spack can't find a spec in any local cache
of mirrors, it's very dumb:
- A while ago it used to query each mirror to see if it had the spec,
use that information to order the mirrors again, and then go and redo
part of what it had just done: fetch the spec from that mirror.
- Recently, it was changed to download a full index.json, which
can be multiple dozens of MBs of data and may take a minute to
process thanks to the blazing fast performance you get with
Python.
In a typical use case of concretizing with reuse, the full index.json
is already available, and it is likely that the local cache gives a perfect
mirror ordering on install. (There's typically no need to update any
caches.)
However, in the use case of Gitlab CI, the build jobs don't have cache,
and it would be smart to just do direct fetches instead of all the
redundant work of (1) and/or (2).
Also, direct fetches from mirrors will soon be fast enough to
prefer these direct fetches over the excruciating slowness of
index.json files.
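The gist of the direct-fetch strategy, as a hedged sketch (URL layout and names are illustrative, not Spack's API):
```python
import urllib.error
import urllib.request

def find_spec(spec_hash, mirror_urls):
    """Try a cheap direct fetch from each mirror before paying for a full index.json."""
    for mirror in mirror_urls:
        url = f"{mirror}/build_cache/{spec_hash}.spec.json"  # illustrative layout
        try:
            with urllib.request.urlopen(url) as response:
                return mirror, response.read()
        except urllib.error.HTTPError:
            continue  # not on this mirror; try the next one
    return None, None
```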
* include Drishti
* fix syntax
* Update var/spack/repos/builtin/packages/drishti/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
* Update var/spack/repos/builtin/packages/drishti/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-sphinxcontrib-devhelp: add 1.0.2
* Update var/spack/repos/builtin/packages/py-sphinxcontrib-devhelp/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-sphinxcontrib-applehelp: add 1.0.2
* Update var/spack/repos/builtin/packages/py-sphinxcontrib-applehelp/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ROCm 5.3.0 updates
* New patches for 5.3.0 on hip and hsakmt
* Adding additional build arguments in hip and llvm
* RVS updates for 5.3.0 release
* New patches and rocm-tensile, rocprofiler-dev, roctracer-dev recipe updates for 5.3.0
* Reverting OPENMP fix from rocm-tensile
* Removing the patch to compile without git and adding without it
* Install the library into the lib directory instead of lib64 across all platforms
* Setting lib install directory to lib
* Disable gallivm coroutine for libllvm15
* Update llvm-amdgpu prefix path in hip-config.cmake.in
Removing libllvm15 from the Mesa dependency
* hip-config.cmake.in update required from 5.2
* hip-config.cmake.in update required from 5.2 and above
* hip-config.cmake.in update required for all releases 5.2 and above
* Style check correction in hip update
* ginkgo: add missing include
* Patching hsa include path for rocm 5.3
* Restricting patch for llvm-15
* Style check error correction
* PIC flag required for the new test applications
* Passing -DCMAKE_POSITION_INDEPENDENT_CODE=ON in the cmake_args instead of setting -fPIC in CFLAGS
Co-authored-by: Cordell Bloor <Cordell.Bloor@amd.com>
gnulib/lib/malloca.c uses the single-argument `static_assert()`, which is only available in C11
syntax. `gcc` seems to be fine with it, but `icc` needs an extra flag.
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Since it's a header-only library, there's nothing to build. However, the
default targets include tests and examples, and there's no option to turn
them off at configure time.
Per https://geant4.web.cern.ch/node/1837 the correct dependency for 10.6 is on `g4emlow@7.9.1`, not on both `g4emlow@7.9` and `g4emlow@7.9.1`.
This is a minor cosmetic fix. The concretization for 10.6 works just fine here. But this removes the duplicate entry.
This PR patches the f_check script to detect the ifort compiler and
ensure that F_COMPILER is set to INTEL. This problem was introduced with
openblas-0.3.21. Without this patch, the value of F_COMPILER falls back
to G77, and icc rather than ifort is used for the linking stage. That
results in the openblas library missing libifcore, which in turn means
many Fortran programs cannot be compiled with ifort.
Writing a long dependency like:
```python
depends_on(
"llvm"
"targets=amdgpu,bpf,nvptx,webassembly"
"version_suffix=jl +link_llvm_dylib ~internal_unwind"
)
```
when it should be formatted like this:
```python
depends_on(
"llvm"
" targets=amdgpu,bpf,nvptx,webassembly"
" version_suffix=jl +link_llvm_dylib ~internal_unwind"
)
```
can cause really subtle errors. Specifically, you'll get something like this in
the package sanity tests:
```
AttributeError: 'NoneType' object has no attribute 'rpartition'
```
because Spack happily constructs a class that has a dependency with name `None`.
We can catch this earlier by banning anonymous dependency specs directly in
`depends_on()`. This causes the package itself to fail to parse, and emits
a much better error message:
```
==> Error: Invalid dependency specification in package 'julia':
llvmtargets=amdgpu,bpf,nvptx,webassemblyversion_suffix=jl +link_llvm_dylib ~internal_unwind
```
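For illustration, the root cause is Python's implicit string-literal concatenation; the missing leading spaces glue the lines into a single, nameless spec string:
```python
s = (
    "llvm"
    "targets=amdgpu,bpf,nvptx,webassembly"
    "version_suffix=jl +link_llvm_dylib ~internal_unwind"
)
# Prints the very string shown in the error above:
# llvmtargets=amdgpu,bpf,nvptx,webassemblyversion_suffix=jl +link_llvm_dylib ~internal_unwind
print(s)
```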
* Only restrict CMake version in umpire when examples and rocm are enabled
* Add CMAKE_HIP_ARCHITECTURES to Umpire and lift cmake version restriction
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
This patch:
https://gcc.gnu.org/legacy-ml/gcc-patches/2018-01/msg01962.html
is actually in Amazon Linux GCC 7.3.1, which we use in CI.
So we should not hold openblas back because of it.
Old versions of OpenBLAS fail to detect the host arch of some of the
AVX512 CPUs of build nodes, causing build failures.
Of course we should try to set ARCH properly in OpenBLAS so that it doesn't
look up the build arch, but that's quite some work.
* Debian-like distros use the multiarch implementation spec
https://wiki.ubuntu.com/MultiarchSpec
Instead of being limited to /usr/lib64, architecture-based
lib directories are used. For instance, under Ubuntu a library package
on x86_64 installs binaries under /usr/lib/x86_64-linux-gnu.
Building pmix with external dependencies like hwloc or libevent
fails: with the prefix set to /usr, that prefix works for
headers and binaries but does not work for libraries, since the default
library location /usr/lib64 does not hold the installed libraries.
Pmix's build options --with-libevent and --with-libhwloc allow us to
specify the dependent library locations (see the sketch after this list).
This commit is an effort to highlight and resolve such an issue when users
want to use Debian-like distro library packages and use Spack to build pmix.
Other packages may be impacted in a similar way.
* Adding libs property to hwloc and libevent and some cleanups to pmix patch
* Fixing style and adding comment on Pmix' 32-bit hwloc version detection issue
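The sketch referenced above (flag spellings follow this description; the actual recipe may differ):
```python
# Point pmix's configure at the dependency prefixes so multiarch library
# directories such as /usr/lib/x86_64-linux-gnu are found.
def configure_args(self):
    spec = self.spec
    return [
        "--with-libevent={0}".format(spec["libevent"].prefix),
        "--with-libhwloc={0}".format(spec["hwloc"].prefix),
    ]
```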
* `url_exists` improvements (take 2)
Make `url_exists` do HEAD request for http/https/s3 protocols
Rework the opener: construct it once and only once, dynamically dispatch
to the right one based on config.
* geant4: version bumps for Geant4 11.1.0
- Version bumps for new data libraries
- g4ndl 4.7
- g4emlow 8.2
- Add geant4-data@11.1.0
- Checksum new Geant4 11.1.0 release
- Limit +python variant to maximum of :11.0 due to removal of
Geant4Py in 11.1
- Update CLHEP dependency to at least 2.4.6.0 for this release
- Update VecGeom dependency to at least 1.2.0 for this release,
closing version ranges for older releases to prevent multiple
versions satisfying requirement
* geant4: correct max version for python support
It's very common for us to tell users to grep through the existing Spack packages to
find examples of what they want, and it's also very common for package developers to do
it. Now, searching packages is even easier.
`spack pkg grep` runs grep on all `package.py` files in repos known to Spack. It has no
special options other than the search string; all options passed to it are forwarded
along to `grep`.
```console
> spack pkg grep --help
usage: spack pkg grep [--help] ...
positional arguments:
grep_args arguments for grep
options:
--help show this help message and exit
```
```console
> spack pkg grep CMakePackage | head -3
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/3dtk/package.py:class _3dtk(CMakePackage):
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/abseil-cpp/package.py:class AbseilCpp(CMakePackage):
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/accfft/package.py:class Accfft(CMakePackage, CudaPackage):
```
```console
> spack pkg grep -Eho '(\S*)\(PythonPackage\)' | head -3
AwsParallelcluster(PythonPackage)
Awscli(PythonPackage)
Bueno(PythonPackage)
```
## Return Value
This retains the return value semantics of `grep`:
* 0 for found,
* 1 for not found
* >1 for error
## Choosing a `grep`
You can set the ``SPACK_GREP`` environment variable to choose the ``grep``
executable this command should use.
Unit tests on Windows are supposed to pass for any PR to pass CI.
However, the return code for the unit test command was not being
checked, which meant this check was always passing (effectively
disabled). This PR
* Properly checks the result of the unit tests and fails if the
unit tests fail
* Fixes (or disables on Windows) a number of tests which have
"drifted" out of support on Windows since this check was
effectively disabled
At some point the `a` mock package became an `AutotoolsPackage`, and that means it
depends on `gnuconfig` on macOS. This was causing one of our shell tests to fail on
macOS because it was testing for `{a.prefix.bin}:{b.prefix.bin}` in `PATH`, but
`gnuconfig` shows up between them.
- [x] simplify the test to check `spack load --sh a` and `spack load --sh b` separately
* Add 20 as a valid option for cxxstd to fmt
* Add pika 0.11.0
* Fix version constraint for p2300 variant in pika package
* Add pika-algorithms package
* py-reportlab: add 3.6.12
* Update var/spack/repos/builtin/packages/py-reportlab/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* adding recipe for singularity-hpc - 1st go
* typo in singularity-hpc recipe
* singularity-hpc, spython recipes: added platform variant
* singularity-hpc, spython recipes: platform variant renamed to runtime
* style fix
* another style fix
* yet another style fix (why are they not reported altogether)
* singularity-hpc recipe: added Vanessa as maintainer
* singularity-hpc recipe: add podman variant
* singularity-hpc recipe: added variant for module system
* shpc recipe: add version for py-semver dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-spython recipe: no need to specify generic python dep for a python pkg
* py-spython: py-requests not needed
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
## Motivation
Our parser grew to be quite complex, with a 2-state lexer and logic in the parser
that has up to 5 levels of nested conditionals. In the future, to turn compilers into
proper dependencies, we'll have to increase the complexity further as we foresee
the need to add:
1. Edge attributes
2. Spec nesting
to the spec syntax (see https://github.com/spack/seps/pull/5 for an initial discussion of
those changes). The main attempt here is thus to _simplify the existing code_ before
we start extending it later. We try to do that by adopting a different token granularity,
and by using more complex regexes for tokenization. This allows us to have a "flatter"
encoding for the parser, i.e., it has fewer nested conditionals and a near-trivial lexer.
There are places, namely in `VERSION`, where we have to use negative lookahead judiciously
to avoid ambiguity. Specifically, this parse is ambiguous without `(?!\s*=)` in `VERSION_RANGE`
and an extra final `\b` in `VERSION`:
```
@ 1.2.3 : develop # This is a version range 1.2.3:develop
@ 1.2.3 : develop=foo # This is a version range 1.2.3: followed by a key-value pair
```
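A toy re-creation of the lookahead trick (much simplified, not the actual token definitions in this PR):
```python
import re

VERSION = r"[A-Za-z0-9_.]+\b"
RANGE_NAIVE = rf"{VERSION}:(?:{VERSION})?"
RANGE_FIXED = rf"{VERSION}:(?:{VERSION}(?!\s*=))?"

print(re.match(RANGE_NAIVE, "1.2.3:develop=foo").group())  # '1.2.3:develop' -- wrong
print(re.match(RANGE_FIXED, "1.2.3:develop=foo").group())  # '1.2.3:' -- develop=foo stays a key-value pair
print(re.match(RANGE_FIXED, "1.2.3:develop").group())      # '1.2.3:develop' -- still a full range
```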
## Differences with the previous parser
~There are currently 2 known differences with the previous parser, which have been added on purpose:~
- ~No spaces allowed after a sigil (e.g. `foo @ 1.2.3` is invalid while `foo @1.2.3` is valid)~
- ~`/<hash> @1.2.3` can be parsed as a concrete spec followed by an anonymous spec (before was invalid)~
~We can recover the previous behavior on both ones but, especially for the second one, it seems the current behavior in the PR is more consistent.~
The parser is currently 100% backward compatible.
## Error handling
Being based on more complex regexes, we can possibly improve error
handling by adding regexes for common issues and hint users on that.
I'll leave that for a following PR, but there's a stub for this approach in the PR.
## Performance
To be sure we don't add any performance penalty with this new encoding, I measured:
```console
$ spack python -m timeit -s "import spack.spec" -c "spack.spec.Spec(<spec>)"
```
for different specs on my machine:
* **Spack:** 0.20.0.dev0 (c9db4e50ba045f5697816187accaf2451cb1aae7)
* **Python:** 3.8.10
* **Platform:** linux-ubuntu20.04-icelake
* **Concretizer:** clingo
results are:
| Spec | develop | this PR |
| ------------- | ------------- | ------- |
| `trilinos` | 28.9 usec | 13.1 usec |
| `trilinos @1.2.10:1.4.20,2.0.1` | 131 usec | 120 usec |
| `trilinos %gcc` | 44.9 usec | 20.9 usec |
| `trilinos +foo` | 44.1 usec | 21.3 usec |
| `trilinos foo=bar` | 59.5 usec | 25.6 usec |
| `trilinos foo=bar ^ mpich foo=baz` | 120 usec | 82.1 usec |
so this new parser seems to be consistently faster than the previous one.
## Modifications
In this PR we just substituted the Spec parser, which means:
- [x] Deleted the `SpecParser` and `SpecLexer` classes in `spec.py`, and deleted `spack/parse.py`
- [x] Added a new parser in `spack/parser.py`
- [x] Hooked the new parser in all the places the previous one was used
- [x] Adapted unit tests in `test/spec_syntax.py`
## Possible future improvements
Random thoughts while working on the PR:
- Currently we transform hashes and files into specs during parsing. I think
we might want to introduce an additional step and parse special objects like
a `FileSpec` etc. in-between parsing and concretization.
* py-pywavelets: add 1.4.1
* Update var/spack/repos/builtin/packages/py-pywavelets/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update var/spack/repos/builtin/packages/py-pywavelets/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* [quantum-espresso] Parallel make fails for 6.{6,7}
I ran into a race condition in `make` with the Intel compiler on icelake when building QE 6.6 and 6.7.
* Fix comment
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
* fix location of input for darshan-util tests
Darshan log file used for test input was removed from the Darshan
repo after the 3.4.0 release. This commit adds logic to use a
different log file as test input for later Darshan versions.
* libctl: add new version
Change-Id: I16f91cfab198c66b60407ab5bb2cb3ebeac6bc19
* New package: libgdsii
Change-Id: I34b52260ab68ecc857ddf8cc63b124adc2689a51
* New package: mpb
Change-Id: I6fdf5321c33d6bdbcaa1569026139a8483a3bcf8
* meep: add new version and variants
Change-Id: I0b60a9a4d9a329f7bde9027514467e17376e6a39
* meep: use with_or_without
Change-Id: I05584cb13df8ee153ed385e77d367cb34e39777e
* LibCatalyst: Fix version of pre-release develop version
* ParaView: Requires libcatalyst@2:
* ParaView: Apply adios2 module no kit patch to 5.11
This patch is still pending in VTK and didn't make it into 5.11 as anticipated.
* bcache: support external gettext when `libintl` is in glibc
Many glibc-based Linux systems don't have gettext's libintl because
libintl is included in the standard system's glibc (libc) itself.
When using `spack external find gettext` on those, packages like
`bcache` which unconditionally link using `-lintl` fail to link
with -lintl.
Description of the fix:
The libs property of Spack's gettext recipe returns the list of libs,
so when gettext provides libintl, use it. When it doesn't, there is no
separate libintl library and the libintl API is provided by glibc.
Tested with `spack external find gettext` on glibc-based Linux and
in musl-based Alpine Linux to make sure that when -lintl is really
needed, it is really used and nothing breaks.
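A hedged sketch of how a consumer like bcache can use that libs property (not necessarily the exact recipe change):
```python
# Only link -lintl when the gettext in use actually ships a separate libintl
# (e.g. on musl); on glibc the libintl API lives in libc and no flag is needed.
def setup_build_environment(self, env):
    intl_libs = self.spec["gettext"].libs
    if intl_libs:
        env.append_flags("LDFLAGS", intl_libs.search_flags)
        env.append_flags("LDFLAGS", intl_libs.link_flags)
```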
* Fix mercurial print_str failure.
* Perform same fix on py-pybind11 for print_string missing method.
Co-authored-by: Nicholas Cameron Sly <sly1@llnl.gov>
* Add new root versions and associated dependency constraints
* Please style guide
* Avoid conflicts where possible
* Untested prototype of macOS version detection
* Fixes for macOS version prototype
* More logical ordering
* More correctness and style fixes
* Try to use spack's macos_version
* Add some forgotten @s
* Actually, Spack can't build Python 3.6 anymore, and thus no older PyROOT
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
* Fix compile errors with latest HDF5 1.13.3
* format
* Update var/spack/repos/builtin/packages/hdf5-vol-async/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This commit reworks the bootstrapping procedure to use Spack environments
as much as possible.
The `spack.bootstrap` module has also been reorganized into a Python package.
A distinction is made between "core" Spack dependencies (clingo, GnuPG, patchelf)
and other dependencies. For a number of reasons, explained in the `spack.bootstrap.core`
module docstring, "core" dependencies are bootstrapped with the current ad-hoc
method.
All the other dependencies are instead bootstrapped using a Spack environment
that lives in a directory specific to the interpreter and the architecture being used.
The cray manifest shows dependency information for cray-mpich, which we never previously cared about
because it can only be used as an external. This updates Spack's dependency information to make cray-mpich
specs read in from the cray external manifest usable.
* initial changes for rocm recipes
* drop support for older releases
* drop support for older rocm releases - add more recipes
* drop support for older releases
* address style issues
* address style error
* fix errors
* address review comments
* Added plumed version 2.8.1 including gromacs compatibility
* Corrected ~mpi and +mpi variants in new depends
* Fixed regression in logic for plumed+gromacs@2020.6 support
All the vermin annotations we were using were for optional features introduced in early
Python 3 versions. We no longer need any of them, as we only support Python 3.6+. If we
start optionally using features from newer Pythons than 3.6, we will need more vermin
annotations.
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
We no longer support Python <3.6, so we don't need to check whether Python supports SSL
verification in `spack.util.web`.
- [x] Remove a bunch of logic we needed to appease Python 2
We've stopped supporting Python 2, and contributors are noticing that our CI no longer
allows Python 2.7 comment type hints. They end up having to adapt them, but this adds
extra unrelated work to PRs.
- [x] Move to 3.6 type hints across the entire code base
* doxygen: add build-tool tag
This allows it to be included automatically as an external. No one links against doxygen so this should be ok.
* doxygen: add self as maintainer
* vecgeom: add new 1.2.1 version
* vecgeom: introduce conflict between gcc/cuda
Recent tests of vecgeom in Spack environments have shown that the build
with +cuda fails with GCC >= 11.3 and CUDA < 11.7 with error
...lib/gcc/x86_64-pc-linux-gnu/11.3.0/include/serializeintrin.h(41):
error: identifier "__builtin_ia32_serialize" is undefined
1 error detected in the compilation of
".../VecGeom/source/BVHManager.cu".
Other GCC/CUDA combinations appear o.k.
Avoid this error in spack, and document it for users, with a conflict
directive to express the restriction.
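A hedged sketch of that conflict directive (version bounds taken from the report above; the exact expression in the recipe may differ):
```python
conflicts(
    "%gcc@11.3:",
    when="+cuda ^cuda@:11.6",
    msg="CUDA < 11.7 fails to compile with GCC >= 11.3 (__builtin_ia32_serialize undefined)",
)
```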