* compiler wrapper: prioritize spack managed paths in search order
This commit partitions search paths of -L, -I (and -rpath) into three
groups, from highest priority to lowest:
1. Spack managed directories: these include absolute paths such as
stores and the stage dir, as well as all relative paths since they
are relative to a Spack owned dir
2. Non-system dirs: these are for externals that live in non-system
locations
3. System dirs: your typical `/usr/lib` etc.
It's very easy for Spack to know the prefixes it owns, but much more
difficult to tell system dirs from non-system dirs. Before this commit
Spack tried to distinguish only system and non-system dirs, and failed
for very trivial cases like `/usr/lib/x/..` which comes up often, since
build systems sometimes copy search paths from `gcc -print-search-dirs`.
Potentially this implementation is even faster than the current state of
things, since a loop over paths is replaced with an eval'ed `case ...`.
* Trigger a pipeline
* Revert "Trigger a pipeline"
This reverts commit 5d7fa863de.
* remove redundant return statement
* Add conflicts for some blas implementations that don't build on
Windows (or with %msvc)
* Need to enclose CC/CXX variables in quotes in case those paths
have spaces, otherwise Meson runs into errors
* On Windows, Python dependencies now add <prefix>/Scripts to the
PATH (this is established as a standard in PEP 370)
* Later versions of oneAPI have moved, so update detection to find it
in both old and new location
* Remove reliance on ONEAPI_ROOT env variable when determining Fortran
compiler version for %msvc
* When finding a Fortran compiler for MSVC, there was logic enforcing
a maximum MSVC version for a given oneAPI Fortran version. This
mapping was out of date and excluding valid combinations, so has
been removed (the logic now just picks the latest available
oneAPI Fortran compiler for any given MSVC version).
On Windows, bootstrapping logic now searches for and adds the win-sdk
and wgl packages to the user's top scope as externals if they are not
present.
These packages are generally required to install most packages with
Spack on Windows, and are only available as externals, so it is
assumed that doing this automatically would be useful and avoid
a mandatory manual step for each new Spack instance.
Note this is the first case of bootstrapping logic modifying
configuration other than the bootstrap configuration.
* gxsview: compiles against system qt and vtk on rhel8
* Update gxsview/package.py for blanks around operator
* Update gxsview/package.py import blank line
* Update gxsview/package.py for style
* Update gxsview/package.py checking vtk version
Score-P does not accept "--with-foo=yes", but only "--with-foo" or "--with-foo=some-valid-specific-choice-or-path". This keeps Spack from generating config flags that will cause Score-P to barf.
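For example, a hedged package.py sketch of flag generation that respects this (the variant and dependency names are illustrative):
```python
def configure_args(self):
    args = []
    # Score-P rejects "--with-papi=yes"; pass either the bare flag
    # or a specific path, never a bare "yes"
    if "+papi" in self.spec:
        args.append("--with-papi=%s" % self.spec["papi"].prefix)
    return args
```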
This adds some improvements to `spack find` output when in environments based
around some thoughts about what users want to know when they're in an env.
If you're working in an environment, you mostly care about:
* What are the roots
* Which ones are installed / not installed
* What's been added that still needs to be concretized
So, this PR adds a couple tweaks to display that information more clearly:
- [x] We now display install status next to every root. You can easily see
which are installed and which aren't.
- [x] When you run `spack find -l` in an env, the roots now show their concrete
hash (if they've been concretized). They previously would show `-------`
(b/c the root spec itself is abstract), but showing the concretized root's
hash is a lot more useful.
- [x] Newly added/unconcretized specs still show `-------`, which now makes more
sense, b/c they are not concretized.
- [x] There is a new option, `-r` / `--only-roots` to *only* show env roots if
you don't want to look at all the installed specs.
- [x] Roots in the installed spec list are now highlighted as bold. This is
  actually an old feature from the first env implementation, but various
  refactors had disabled it inadvertently.
Reduce incidence of spurious errors by:
* Ensuring we're passing the buffer by reference
* Getting the correct short string size from the Windows API instead of computing it ourselves (see the sketch after this list)
* Ensuring sufficient space for the null terminator character
Add test for `windows_sfn`
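A minimal sketch of the fixed call sequence (names assumed; not the exact Spack implementation):
```python
import ctypes

def windows_sfn(path: str) -> str:
    k32 = ctypes.windll.kernel32
    # with a zero-length buffer, GetShortPathNameW returns the required
    # size *including* the null terminator, so we don't compute it ourselves
    needed = k32.GetShortPathNameW(path, None, 0)
    if needed == 0:
        return path  # no short name available; fall back to the input
    buf = ctypes.create_unicode_buffer(needed)  # room for the terminator
    k32.GetShortPathNameW(path, buf, needed)    # buffer passed by reference
    return buf.value
```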
Currently if you request pkg +example where example is a conditional
variant, and you have a pkg in the database for which the condition
did not hold (so no +example nor ~example), the solver would reuse it
regardless, not imposing +example.
The change rules out exactly one thing: variant_set without variant_value,
which in practice could only happen when not node_has_variant (i.e. when
under the current package.py rules the variant's when condition did not
trigger).
Add a "default" option that passes no option to configure. Existing
options changed in the MPICH 4.2.0 release, so update the package to
reflect those changes.
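A sketch of the pattern (the variant and flag names are illustrative, not necessarily MPICH's actual options):
```python
variant("datatype-engine", default="default",
        values=("default", "yaksa", "dataloop"), multi=False,
        description="controls the datatype engine to use")

def configure_args(self):
    args = []
    engine = self.spec.variants["datatype-engine"].value
    if engine != "default":
        # "default" passes no option at all, deferring to configure
        args.append("--with-datatype-engine={0}".format(engine))
    return args
```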
Currently, some of the tests in `spec_format` and `spec_semantics` fetch
the actual zlib repository when run, because they call `str()` on specs
like `zlib@foo/bar`, which at least currently requires a remote git clone
to resolve.
This doesn't change the behavior of git versions, but it uses our mock git
repo infrastructure and clones the `git-test` package instead of the *real*
URL from the mock `zlib` package.
This should speed up tests. We could probably refactor more so that the git
tests *all* use such a fixture, but the `checks` field that unfortunately
tightly couples the mock git repository and the `git_fetch` tests complicates
this. We could also consider *not* making `str()` resolve git versions, but
I did not dig into that here.
- [x] add a mock_git_test_package fixture that sets up a mock git repo *and*
monkeypatches the `git-test` package (like our git test packages do)
- [x] use fixture in `test_spec_format_path`
- [x] use fixture in `test_spec_format_path_posix`
- [x] use fixture in `test_spec_format_path_windows`
- [x] use fixture in `test_parse_single_spec`
Commit 330a9a7c9a aimed at preventing
generation of .cfg files when a given compiler does not exist
in a particular release. However, the check did not
include the full paths, so it always failed, resulting in empty
.cfg files. This commit fixes that.
* add new axom releases, sync changes between repos
* add new version of axom and add fallback depends for umpire/raja
* remove now-redundant flags; add cuda flags to the flags output by CachedCMakePackage
Co-authored-by: white238 <white238@users.noreply.github.com>
* w3nco %oneapi cflags: append -Wno-error=implicit-function-declaration
* update flag for %apple-clang, %clang
* [@spackbot] updating style on behalf of eugeneswalker
Upon close inspection of clingo answer sets, in some cases we have "equivalent" (i.e. same hash for the concrete spec) duplicates that differ only because of virtual nodes that are added to the answer set, without any edge using them.
This commit adds a property `autopush` to mirrors. When true, every source build is immediately followed by a push to the build cache. This is useful in ephemeral environments such as CI / containers.
To enable autopush on existing build caches, use `spack mirror set --autopush <name>`. The same flag can be used in `spack mirror add`.
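A minimal sketch of the post-build behavior (helper names here are hypothetical, not Spack's actual internals):
```python
def after_install(spec):
    # every source build is immediately followed by a push to any
    # build cache mirror that has autopush enabled
    for mirror in configured_binary_mirrors():   # hypothetical helper
        if mirror.autopush:
            push_to_build_cache(spec, mirror)    # hypothetical helper
```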
* Simplify config command and add BLAS/LAPACK location
* Use BLAS_ROOT and LAPACK_ROOT and disable use of system
package registry
* Adds location of BLAS_LIBRARIES and LAPACK_LIBRARIES to CMake
* Adds CMake variables to prevent picking up system installations
of BLAS/LAPACK. Fixes previous PR #43328 that was picking up
incorrect installations
* Adds version 7.2.1
* libzip: add up to v1.10.1
- update homepage and change download url to GitHub
- change build system to CMake for releases starting with 1.4
* [@spackbot] updating style on behalf of aumuell
* libzip: fix urls
* [@spackbot] updating style on behalf of aumuell
* libzip: do not add versions from libzip.org
these are old, and urllib refuses to fetch them
* libzip: deprecate versions from libzip.org
urllib refuses to fetch them, only curl would work
---------
Co-authored-by: aumuell <aumuell@users.noreply.github.com>
* acts: new version 33.1.0
* actsvg: new version 0.4.41
* geomodel: new package
* [@spackbot] updating style on behalf of wdconinc
* acts: plugin_cmake_variant(geomodel)
* geomodel: with when(+visualization)
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
* Include recent change for Ubuntu
Select option -disable-no-pie-on-ubuntu for some Ubuntu systems
823971df01
* Added conflict for new variant
* Updated conflict version
* Added mention of Ubuntu to variant description
* Kokkos Kernels: adding missing TPLs and pre-conditions
Adding variants and dependencies for rocBLAS and rocSPARSE.
Also adding a "when=" clause to the TPL variants that prevents
enabling the TPLs in versions of the library where they were not
yet available.
* Kokkos Kernels: remove comment for better format
* Kokkos Kernels: adding cusolver and rocsolver for at version 4.3.00
* Kokkos Ecosystem: updating packages for release 4.3.00
* Kokkos: adding arch for SG2042
* Removing sg2042 from spack_micro_arch_map
Removing it here and will work to add it in the proper generic spack location, likely: `spack/lib/spack/external/archspec/json/cpu/microarchitectures.json` ?
Allow reuse of specs that were built with compilers not in the current configuration. This means that specs from build caches don't need a matching compiler locally to be reused; the same applies when updating a distro. If a node needs to be built, only available compilers will be considered as candidates.
* add c++11 header to gold for compiler not defaulting to c++11
* glibc: add 2.39
* llvm: add 18.1.3
* fix #42314: remove cxx11 flag for llvm; should be controlled by cmake.
* modify patch
* llvm version
* add gmake version request
* yoda: new versions 1.9.9, 1.9.10, and 2.0.0
* [@spackbot] updating style on behalf of wdconinc
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
As in the original makefile "FFLAGS_PROMOTION = -fdefault-real-8
-fdefault-double-8" is only used when `precision=double`. This is the default
for the Spack package, so no change if `precision` is left unset.
* Generally use os.replace on Windows and Linux
* Windows behavior for os.replace differs when the destination exists
  and is a symlink to a directory: on Linux the dst is replaced, while
  on Windows this fails. This PR makes Windows behave like Linux
  (by deleting the dst before doing the rename unless src and dst
  are the same), as in the sketch below.
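A minimal sketch of the normalized helper (assuming plain paths; not Spack's exact implementation):
```python
import os
import sys

def replace(src: str, dst: str) -> None:
    if sys.platform == "win32" and os.path.islink(dst) and os.path.isdir(dst):
        if os.path.realpath(src) != os.path.realpath(dst):
            # on Windows a directory symlink is removed with rmdir,
            # which deletes the link itself, not its target
            os.rmdir(dst)
    os.replace(src, dst)
```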
* Nalu: updating Trilinos recipe a bit
Basic changes to build/install nalu properly using Spack.
Some more changes would be nice for instance adding an
option to build against Trilinos master or develop. Adding
a dependency on googletest to avoid the annoying build
failures in the unit-tests.
* Nalu: adding release 1.6.0
Nalu v1.6.0 can build cleanly against Trilinos 14.0.0 with the
proposed changes. The only other combo is master / master, but that
one is "floating" as these branches evolve over time. When a
new Nalu comes out we might want to add another fixed version to
keep this recipe up to date!
the autotools build system does something funky which causes a link line
where gcc's default link dirs are explicitly added and end up before the
-L from spack's libunwind, so that ultimately it links against the system
libunwind.
the cmake build system does better.
* upgrade new versions
* style fix
* update jaxlib deps (not cuda and bazel yet)
* update jaxlib cuda versions
* update jaxlib cuda versions
* update jaxlib cuda versions
* chore: style fix
* Update package.py
* Update package.py
* fix: typo
* docs: add source for cuda version
* py-jaxlib 0.4.14 also doesn't build on ppc64le
* Add 0.4.26
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* ParaView: Update version 5.12.0
Add 5.12.0 release
Update default to 5.12.0
* Add patch for building ParaView 5.12 with kits
* Drop VTKm from neoverse
* For avx builds, the start address of the values_ buffer in KernelParameters
is not correct, as it is computed based on 16-byte alignment.
* Style check error fix
* updating package.py for py-celery, py-kombu, py-amq
* added more py-kombu package versions
* fix copyrights and style on py-kombu/package.py
* removed extra spaces
* added py-billiard 4.2.0 and added back the license('BSD-3-Clause')
* removed extra spaces in py-celery/package.py
* fixed py-amqp 2.4.0 sha; fixed py-celery's dependency of py-click (when version restrictions)
* more clean up on specifying version bounds
* Relax compiler and target mismatches
The mismatch occurs on an edge. Previously it was assigned
the parent priority, now it is assigned the child priority.
This should make reuse from buildcaches or store more likely,
since most mismatches will be counted with "reused" priority.
* Optimize version badness for runtimes at very low priority
We don't want to e.g. switch other attributes because we
cannot reuse an old installed runtime.
* Optimize runtime attributes at very low priority
This is such that the version of the runtime would
not influence whether we should reuse a spec.
Compiler mismatches are considered for runtimes,
to avoid situations where compiling foo%gcc@9
brings in gcc-runtime%gcc@13 if gcc@13 is among
the available compilers
* Exclude specs without runtimes from reuse
This should ensure that we do not reuse specs that
could be broken, as they expect the compiler to be
installed in a specific place.
The installer runs `get_dependent_ids`, which follows edges outside the
subdag that's being installed, so it returns a superset of the actual
dependents.
That's generally fine, except that it calls `s.package` on every
dependent, which triggers a package class to be instantiated, which is a
lot of work.
Instead, compute the package id from the spec, since that's all that's
used anyways and does not trigger *lots* of slow and redundant
instantiations of package objects.
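A sketch consistent with that description (format details aside, nothing here needs a package object):
```python
def package_id(spec) -> str:
    # name, version and hash are all available on the concrete spec,
    # so no PackageBase subclass has to be instantiated
    return "{0}-{1}-{2}".format(spec.name, spec.version, spec.dag_hash())
```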
If ONEAPI_ROOT is not set as an environment variable, the current approach will raise an error.
Instead we can compute ONEAPI_ROOT from the compiler paths, as we do with vcvarsall.
The installation mechanism used on Linux to install py-pip (using pip
from the downloaded wheel to install the wheel) does not work on Windows.
This updates the installation of py-pip on Windows to download and
use a zipapp of a specific pip version in order to install the wheel
pip version that is requested.
`dpcpp` is deprecated by intel and has been superseded by `oneapi` compilers for a very long time.
---------
Co-authored-by: becker33 <becker33@users.noreply.github.com>
* SEACAS: Update package.py to handle new SEACAS project name
The base project name for the SEACAS project has changed from
"SEACASProj" to "SEACAS" as of @2022-10-14, so the package
needed to be updated to use the new project name when needed.
The refactor also changes several:
"-DSome_CMAKE_Option:BOOL=ON"
to
define("Some_CMAKE_Option", True)
* SEACAS, EXODUSII: New version; deprecate older versions; better variant descriptions
* [@spackbot] updating style on behalf of gsjaardema
* Fix long lines reported by flake8
---------
Co-authored-by: gsjaardema <gsjaardema@users.noreply.github.com>
* Add release 2.8.0
* Changing C compiler to hipcc
* Final release version
* Adding new cmake definition for rocm support
* Enabling rocm version support
* Update sha256
* Updating website URL
* Removing unnecessary C compiler spec
* Adding rocm-core dependency
* fixing rocm-core version
* fixing rocm-core version
* fixing style
* bugfix
* Add systemd
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* gobject-introspection: Correct glib versions
- The meson.build requirement that the glib version
is >= the gobject-introspective version is not in place
until v1.76.1.
- Prior to that, the requirement was glib >= 2.58.0.
- Bug introduced in acbf0d99c4, PR #42222.
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* util-linux: add v2.39.3
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* py-natsort: add new versions
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* geopm-service: default systemd support to true
- Make the dependency sticky to force a failure
if systemd compilation fails, or force
the user to disable the option.
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* geopm-service: Add initial multi-architecture support
- Restrict arch conflicts to 3.0.1
- Disable cpuid at configure time on non-x86_64 platforms.
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* geopm-service: update docstrings
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* Add py-geopmdpy
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
* Add geopm-runtime recipe
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
---------
Signed-off-by: Brad Geltz <brad.geltz@intel.com>
This PR allows the user to specify a path to a custom cert file (or directory) in
Spack's config:
```yaml
# This is where custom certs for proxy/firewall are stored.
# It can be a path or environment variable. To match ssl env configuration
# the default is the environment variable SSL_CERT_FILE
ssl_certs: $SSL_CERT_FILE
```
`config:ssl_certs` can be a path to a file or a directory, or it can be an environment
variable that resolves to one of those. When it points to something valid, Spack will
update the SSL context to include the custom certs, and fetching via `urllib` and `curl`
will trust the provided certs.
This should resolve many issues with fetching behind corporate firewalls.
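Roughly, the context update could look like this standard-library sketch (not Spack's exact code):
```python
import os
import ssl

def custom_ssl_context(certs: str) -> ssl.SSLContext:
    path = os.path.expandvars(certs)  # e.g. "$SSL_CERT_FILE"
    ctx = ssl.create_default_context()
    if os.path.isfile(path):
        ctx.load_verify_locations(cafile=path)
    elif os.path.isdir(path):
        ctx.load_verify_locations(capath=path)
    return ctx
```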
---------
Co-authored-by: psakievich <psakievich@users.noreply.github.com>
Co-authored-by: Alec Scott <alec@bcs.sh>
After #41373, where we stopped considering the source directory to be the stage for develop builds,
we resumed *deleting* the stage even after a successful build.
We don't want this for develop builds because developers need to iterate; we should keep the artifacts
unless they explicitly run `spack clean`.
Now:
- [x] Build artifacts for develop packages are not removed after a successful install
- [x] They are also not removed before an install starts, i.e. develop packages always
reuse prior artifacts, if available.
- [x] They can be deleted in any other context, e.g. by running `spack clean --stage`
Prior FleCSI releases relied on commits on the control-replication branch (cr)
of Legion. That branch has now been merged into master and is part of the
24.03.0 Legion release.
* (py-)onnx: new version 1.14.{0,1}, 1.15.0
Notes on `onnx`:
- The C++ standard was changed to 14 in 1.15, so no more filter_file is needed. (The C++ standard has since changed to 17 in master.)
Notes on `py-onnx`:
- `py-pybind11` was an unlisted requirement in CMakeLists.txt since 1.3 or so (before earliest spack package).
* py-onnx: depends_on pybind11 with type link, not run
* py-onnx: depends_on py-setuptools@64:
Users requested an option to filter between local/upstream results in `spack find` output.
```
# default behavior, same as without --install-tree argument
$ spack find --install-tree all
# show only local results
$ spack find --install-tree local
# show results from all upstreams
$ spack find --install-tree upstream
# show results from a particular upstream or the local install_tree
$ spack find --install-tree /path/to/install/tree/root
```
---------
Co-authored-by: becker33 <becker33@users.noreply.github.com>
* sundials: add new version
* note previous default
* update when clause for removed options
---------
Co-authored-by: David J. Gardner <gardner48@llnl.gov>
* SEACAS: Update package.py to handle new SEACAS project name
The base project name for the SEACAS project has changed from
"SEACASProj" to "SEACAS" as of @2022-10-14, so the package
needed to be updated to use the new project name when needed.
The refactor also changes several:
"-DSome_CMAKE_Option:BOOL=ON"
to
define("Some_CMAKE_Option", True)
* exodusii -- refactor and bring up-to-date
* Add missed patch file
* [@spackbot] updating style on behalf of gsjaardema
* Apply seacas windows patch here also.
* Update url so old checksums valid; redo new checksums
---------
Co-authored-by: gsjaardema <gsjaardema@users.noreply.github.com>
* Initial commit to bump rocprofiler-dev to 6.0 and add aqlprofile recipe
* bump to 6.0.2 and extracting binaries from deb pkg
* fixes for hpctoolkit build errors
* add yum and zyp aqlprofile packages
* fix style issues
* Allow compilers to function across compatible OS's
* Add documentation in the default yaml
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
* py-libsonata: new package starting at 0.1.25
* align spack version with the git branch
* Fix min required versions / use spack packages instead of submodules
* Create devcontainer.json
* Ensure codespace can be setup for current branch
* fix: find compilers in site scope
* fix: use cloud_pipelines ubuntu20.04 image
* fix: spack config --scope site add
* fix: use develop, not develop-root mirror
This adds a dependency on pkg-config, which in turn builds pkg-config
on pipelines using %oneapi/%cce: update the pkg-config build to disable
specific warnings-as-errors from these compilers.
Co-authored-by: Reid Priedhorsky <1682574+reidpr@users.noreply.github.com>
* Add macos-14 as a runner (Apple M1)
* Mark a test xfail
We need to check later if this test needs modifications
on Apple Silicon chips.
---------
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: alalazo <alalazo@users.noreply.github.com>
* buildcache sync: manifest-glob with arbitrary destination
The current implementation of --manifest-glob is a bit restrictive,
requiring the destination to be known by the generation stage of CI.
This allows specifying an arbitrary destination mirror URL.
* Add unit test for buildcache sync with manifest
* Fix test and arguments for manifest-glob with override destination
* Add testing path for unused mirror argument
* Remove a few compilers from static test data
These compilers were used only in a bunch of tests, so
they are added only there.
* Remove clang@3.3 from unit test configuration
* Parametrize compilers.yaml
* Remove specially named gcc from static data
The compilers are used in two tests
* Remove apple-clang and macOS compilers from static data
The compiler was used only in multimethod tests
* Remove clang@3.5 (compiler seems to be unused)
* Remove gcc@4.4.0 (compiler seems to be unused)
* Exclude x86_64 tests on other architectures
* Mark two tests as for clingo only
* Update version syntax in compilers.yaml
* Parametrize tcl tests on architectures
* Parametrize lmod tests on architectures
* Substitute gcc@4.5.0 with gcc@4.8.0 so it can be used on aarch64
* Fix a few issues with aarch64 and unit-tests
It's now possible to add config on the command line with `spack -c <CONFIG_VARS> ...`, but the new `command_line` scope isn't reflected in the help output for `--scope`:
```bash
> spack help config
...
--scope {defaults,system,site,user}[/PLATFORM] or env:ENVIRONMENT
configuration scope to read/modify
...
```
This PR adds:
- A new runtime for `%oneapi` compilers, called `intel-oneapi-runtime`
- Information to both `gcc-runtime` and `intel-oneapi-runtime`, to ensure
that we don't mix compilers using different soname for either `libgfortran`
or `libifcore`
To do so, the following internal mechanisms have been implemented:
- Possibility to inject virtual dependencies from the `runtime_constraints`
callback on packages
Information has been added to `gcc-runtime` to provide the correct soname
under different conditions on its `%gcc`.
Rules injected into the solver looks like:
```prolog
% Add a dependency on 'gfortran@5' for nodes compiled with gcc@=13.2.0 and using the 'fortran' language
attr("dependency_holds", node(ID, Package), "gfortran", "link") :-
attr("node", node(ID, Package)),
attr("node_compiler", node(ID, Package), "gcc"),
attr("node_compiler_version", node(ID, Package), "gcc", "13.2.0"),
not external(node(ID, Package)),
not runtime(Package),
attr("language", node(ID, Package), "fortran").
attr("virtual_node", node(RuntimeID, "gfortran")) :-
attr("depends_on", node(ID, Package), ProviderNode, "link"),
provider(ProviderNode, node(RuntimeID, "gfortran")),
attr("node", node(ID, Package)),
attr("node_compiler", node(ID, Package), "gcc"),
attr("node_compiler_version", node(ID, Package), "gcc", "13.2.0"),
not external(node(ID, Package)),
not runtime(Package),
attr("language", node(ID, Package), "fortran").
attr("node_version_satisfies", node(RuntimeID, "gfortran"), "5") :-
attr("depends_on", node(ID, Package), ProviderNode, "link"),
provider(ProviderNode, node(RuntimeID, "gfortran")),
attr("node", node(ID, Package)),
attr("node_compiler", node(ID, Package), "gcc"),
attr("node_compiler_version", node(ID, Package), "gcc", "13.2.0"),
not external(node(ID, Package)),
not runtime(Package),
attr("language", node(ID, Package), "fortran").
```
The default url couldn't be the one with v0.0.0-aws, since Spack was
replacing v0.0.0-aws with v<version_number>, deleting the
-aws suffix. The url_for_version method is used to keep this suffix.
This adds support for prereleases. Alpha, beta and release candidate
suffixes are ordered in the intuitive way:
```
1.2.0-alpha < 1.2.0-alpha.1 < 1.2.0-beta.2 < 1.2.0-rc.3 < 1.2.0 < 1.2.0-xyz
```
Alpha, beta and rc prereleases are defined as follows: split the version
string into components like before (on delimiters and string boundaries).
If there's a string component `alpha`, `beta` or `rc` followed by an optional
numeric component at the end, then the version is prerelease.
So `1.2.0-alpha.1 == 1.2.0alpha1 == 1.2.0.alpha1` are all the same, as usual.
The strings `alpha`, `beta` and `rc` are chosen because they match semver,
they are sufficiently long to be unambiguous, and all contain at least
one non-hex character to distinguish them from shasum/digest type suffixes.
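A sketch of that split (component parsing simplified; `parts` is the tuple of parsed components):
```python
PRERELEASE = ("alpha", "beta", "rc")

def split_prerelease(parts: tuple):
    """Split version components into (release, prerelease) tuples."""
    # trailing "alpha"/"beta"/"rc" with an optional numeric component
    if len(parts) >= 2 and parts[-2] in PRERELEASE and isinstance(parts[-1], int):
        return parts[:-2], (parts[-2], parts[-1])
    if parts and parts[-1] in PRERELEASE:
        return parts[:-1], (parts[-1],)
    return parts, ("final",)

# e.g. (1, 2, 0, "alpha", 1) -> ((1, 2, 0), ("alpha", 1))
```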
The comparison key is now stored as `(release_tuple, prerelease_tuple)`, so in
the above example:
```
((1,2,0),(ALPHA,)) < ((1,2,0),(ALPHA,1)) < ((1,2,0),(BETA,2)) < ((1,2,0),(RC,3)) < ((1,2,0),(FINAL,)) < ((1,2,0,"xyz"), (FINAL,))
```
The version ranges `@1.2.0:` and `@:1.1` do *not* include prereleases of
`1.2.0`.
So for packaging, if the `1.2.0alpha` and `1.2.0` versions have the same constraints on
dependencies, it's best to write
```python
depends_on("x@1:", when="@1.2.0alpha:")
```
However, `@1.2:` does include `1.2.0alpha`. This is because Spack considers
`1.2 < 1.2.0` as distinct versions, with `1.2 < 1.2.0alpha < 1.2.0` as a consequence.
Alternatively, the above `depends_on` statement can thus be written
```python
depends_on("x@1:", when="@1.2:")
```
which can be useful too. A short-hand to include prereleases, but you
can still be explicit to exclude the prerelease by specifying the patch version
number.
### Concretization
Concretization uses a different version order than `<`. Prereleases are ordered
between final releases and develop versions. That way, users should not
have to set `preferred=True` on every final release if they add just one
prerelease to a package. The concretizer is unlikely to pick a prerelease when
final releases are possible.
### Limitations
1. You can't express a range that includes all alpha releases but excludes all beta
releases. The only alternative is good old repeated nines: `@:1.2.0alpha99`.
2. The Python ecosystem defaults to `a`, `b`, `rc` strings, so translation of Python versions to
Spack versions requires expansion to `alpha`, `beta`, `rc`. It's mildly annoying, because
this means we may need to compute URLs differently (not done in this commit).
### Hash
Care is taken not to break hashes of versions that do not have a prerelease
suffix.
Generate CI scripts as powershell on Windows. This is intended to
output exactly the same bash scripts as before on Linux.
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
As the cmake build is triggered by scikit build, the usual spack option
for enabling tests had no effect and the heavy test suite ran all the time.
Used https://github.com/scikit-build/cmake-python-distributions/issues/172#issuecomment-890322263
to implement how to pass options to the actual `cmake` build.
I also excluded some tests that failed for me on alma9 (gcc 11.4.1),
so the rest of the test suite can be run.
Enable OpenBLAS's built-in CPU capability detection and kernel selection.
This allows run-time selection of the "best" kernels for the running CPU, rather
than what is specified at build time. For example, it allows OpenBLAS to use
AVX512 kernels when running on ZEN4, even when built targeting the "ZEN" architecture.
Co-authored-by: Branden Moore <branden.moore@amd.com>
* Changes to re-enable aws-pcluster pipelines
- Use compilers from pre-installed spack store such that compiler path relocation works when downloading from buildcache.
- Install gcc from hash so there is no risk of building gcc from source in the pipeline.
- `packages.yaml` files are now part of the pipelines.
- No more external `postinstall.sh`. The necessary steps are in `setup-pcluster.sh` and will be version controlled within this repo.
- Re-enable pipelines.
* Add and
* Debugging output & mv skylake -> skylake_avx512
* Explicitly check for packages
* Handle case with no intel compiler
* compatibility when using setup-pcluster.sh on a pre-installed cluster.
* Disable palace as parser cannot read require clause at the moment
* ifort cannot build superlu in buildcache
`ifort` is unable to handle the long file names used when cmake compiles
test programs inside the build cache.
* Fix spack commit for intel compiler installation
* Need to fetch other commits before using them
* fix style
* Add TODO
* Update packages.yaml to not use 'compiler:', 'target:' or 'provider:'
Synchronize with changes in https://github.com/spack/spack-configs/blob/main/AWS/parallelcluster/
* Use Intel compiler from later version (orig commit no longer found)
* Use envsubst to deal with quoted newlines
This is cleaner than the `eval` command used.
* Need to fetch tags for checkout on version number
* Intel compiler needs to be from version that has compatible DB
* Install intel compiler with commit that has DB ver 7
* Decouple the intel compiler installation from current commit
- Use a completely different spack installation such that this current pipeline
commit remains untouched.
- Make the script succeed even if the compiler installation fails (e.g. because
the database version has been updated)
- Make the install targets fall back to gcc in case the compiler did not install
correctly.
* Use generic target for x86_64_vX
There is no way to provision a skylake/icelake/zen runner. They are all in the
same pools under x86_64_v3 and x86_64_v4.
* Find the intel compiler in the current spack installation
* Remove SPACK_TARGET_ARCH
* Fix virtual package index & use package.yaml for intel compiler
* Use only one stack & pipeline per generic architecture
* Fix yaml format
* Cleanup typos
* Include fix for ifx.cfg to get the right gcc toolchain when linking
* [removeme] Adding timeout to debug hang in make (palace)
* Revert "[removeme] Adding timeout to debug hang in make (palace)"
This reverts commit fee8a01580489a4ea364368459e9353b46d0d7e2.
* palace x86_64_v4 gets stuck when compiling; try newer oneapi
* Update comment
* Use the latest container image
* Update gcc_hashes to match new container
* Use only one tag providing tags per extends call
Also removed an unnecessary tag.
* Move generic setup script out of individual stack
* Cleanup from last commit
* Enable checking signature for packages available on the container
* Remove commented packages / Add comment for palace
* Enable openmpi@5 which needs pmix>3
* don't look for intel compiler on aarch64
* py-python-lsp-server: add v1.10.0
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Remove py-wheel from package
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Running a `spack-python` script like this:
```python
import spack
import multiprocessing

def echo(args):
    print(args)

if __name__ == "__main__":
    pool = multiprocessing.Pool(2)
    pool.map(echo, range(10))
```
will fail in `develop` with an error like this:
```console
_pickle.PicklingError: Can't pickle <function echo at 0x104865820>: attribute lookup echo on __main__ failed
```
Python expects to be able to look up the function `echo` in `sys.modules["__main__"]` in
subprocesses spawned by `multiprocessing`, but because we use `InteractiveConsole` to
run `spack python`, the executed file isn't considered to be the `__main__` module, and
lookups in subprocesses fail. We tried to fake this by setting `__name__` to `__main__`
in the `spack python` command, but that doesn't fix the fact that no `__main__` module
exists.
Another annoyance with `InteractiveConsole` is that `__file__` is not defined in the
main script scope, so you can't use it in your scripts.
We can use the [runpy.run_path()](https://docs.python.org/3/library/runpy.html#runpy.run_path) function,
which has been around since Python 3.2, to fix this, as sketched after the list below.
- [x] Use `runpy` module to launch non-interactive `spack python` invocations
- [x] Only use `InteractiveConsole` for interactive `spack python`
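The non-interactive path then reduces to something like this sketch (`script_path` standing in for the file passed to `spack python`):
```python
import runpy

# executes the file as a real __main__ module, so multiprocessing can
# pickle top-level functions, and __file__ is defined in the script scope
runpy.run_path(script_path, run_name="__main__")
```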
Often in containers, the files we use to detect whether a cray system supports new features are not available.
Given that the cray containers only support the newer versions, and that these versions have been
around for a while at this point and few sites don't support them, this PR changes the logic for
detecting cray systems so that:
1. Don't even consider whether something is the `cray` platform if `opt/cray` is not in `MODULEPATH`
2. Only use the `cray` platform if we can read files in /opt/cray/pe and positively detect an older version
3. Otherwise, assume we're *not* on a cray (includes newer Cray PE's, which we treat as Linux)
Compilation with the old flags fails on PowerPC (power8le) due to syntax
errors in the output from the preprocessor. Compilation with the
extended set of flags works both on PowerPC and x86_64.
The correct set of flags was suggested from the berkeleygw developers:
https://groups.google.com/a/berkeleygw.org/g/help/c/ewi3RZgOyeE/m/jSIoe45PAgAJ
Uses spack-provided compiler prefix to call cpp when compiling berkeleygw with gcc.
Adds variants to turn off tests
Add variants for some missing TPL options
Add the variables required to build in ~shared
* Add pamgen to Trilinos as a variant to support SEACAS
This adds the ability to turn off and on pamgen as needed
through the variant interface for the Trilinos package.py.
Add changes for seacas package.py to build the appropriate
Trilinos variants.
Add zlib-api as depends_on instead of zlib directly for SEACAS
package.py
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* Add variant typescript for py-jupyter-server@:1, which then requires npm/node. Patch the build system for ~typescript so that it doesn't find any npm/node installations and attempts to build the typescript extension even though it shouldn't
* Fix formatting in var/spack/repos/builtin/packages/py-jupyter-server/package.py
* Constrain typescript variant to py-jupyter-server versions 1.10.2:1
* with when not needed if variant doesn't exist for other versions
This should fix issue #43192
Basically, there was an issue where a make variable was set to the output
of a shell function which included cd commands, and the value
of that variable was then used as a makefile target.
The cd commands in the shell function caused assorted informational
messages (e.g. "Entering directory ...") to be included in the
return of the shell function, corrupting the value of the variable.
The presence of colons in the corrupted value caused make to issue
"multiple target" errors.
This fix adds --no-print-directory flags to the calls to the
make function in the package's build method, which resolves the
issue above.
It is admittedly a crude fix, and will remove *all* informational
messages re directory changes, thereby potentially making it more
difficult to diagnose/debug future issues building this package.
However, I do not see a way to turn off these messages in a
more surgical manner.
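The shape of the fix, as a hedged sketch (the actual targets depend on the package):
```python
def build(self, spec, prefix):
    # --no-print-directory suppresses the "Entering directory ..."
    # messages that otherwise corrupt variables captured from
    # $(shell ...) functions in the makefile
    make("--no-print-directory", "all")
```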
* mgard: don't restrict protobuf version more than necessary
successfully built:
mgard@2022-11-18 ^protobuf@3.{4,21,25}
mgard@2023-01-10 ^protobuf@3.{4,25}
mgard@2023-03-31 ^protobuf@3.{4,25}
compile failures:
mgard@2022-11-18 ^protobuf@3.3
mgard@2023-01-10 ^protobuf@3.3
mgard@2023-03-31 ^protobuf@3.3
* mgard: add conflicts to address CI errors
* mgard: conflict between cuda and abseil@20240116.1
compiling mgard+cuda with gcc@12.3.0 and nvcc from cuda@12.3.0 against
protobuf pulling in abseil-cpp@20240116.1 results in the errors reported
here: https://github.com/abseil/abseil-cpp/issues/1629
* py-shacl: new version, update dependencies
Also updates the dependencies py-prettytable and py-rdflib.
* review comments
* Update var/spack/repos/builtin/packages/py-pyshacl/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-poetry-core: add required 1.8.1
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
`jinja2` can be a costly import, and right now it happens at startup every time we run
Spack. This slows down `spack --print-shell-vars` a bit, which is needed by `setup-env.*sh`.
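One way to avoid the startup cost is deferring the import to the point of use, as in this sketch (loader arguments illustrative):
```python
def render_template(name: str, **context) -> str:
    import jinja2  # deferred: not paid at startup, only when rendering

    env = jinja2.Environment(loader=jinja2.FileSystemLoader("templates"))
    return env.get_template(name).render(**context)
```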
Patch allowing Clingo to build with VS22 has landed both in Spack
and Clingo upstream, update Spack's bootstrap constraints to handle
this.
Additionally, properly scope the patch application in the clingo
package to handle upstream patch.
Currently (outside of this PR) when you `spack develop` a path, this path is treated as the staging
directory (this means that for example all build artifacts are placed in the develop path).
This PR creates a separate staging directory for all `spack develop`ed builds. It looks like
```
# the stage root
/the-stage-root-for-all-spack-builds/
spack-stage-<hash>
# Spack packages inheriting CMakePackage put their build artifacts here
spack-build-<hash>/
```
Unlike non-develop builds, there is no `spack-src` directory, `source_path` is the provided `dev_path`.
Instead, separately, in the `dev_path`, we have:
```
/dev/path/for/foo/
build-{arch}-<hash> -> /the-stage-root-for-all-spack-builds/spack-stage-<hash>/
```
The main benefit of this is that build artifacts for out-of-source builds that are relative to
`Stage.path` are easily identified (and you can delete them with `spack clean`).
Other behavior added here:
- [x] A symlink is made from the `dev_path` to the stage directory. This symlink name incorporates
spec details, so that multiple Spack environments that develop the same path will not conflict
with one another
- [x] `spack cd` and `spack location` have added a `-c` shorthand for `--source-dir`
Spack builds can still change the develop path (in particular to keep track of applied patches),
and for in-source builds, this doesn't change much (although logs would not be written into
the develop path). Packages inheriting from `CMakePackage` should get this benefit
automatically though.
Add current versions of the 17 and 18 releases
Stop making it nearly impossible to compose this correctly with code built with gcc
Build for compatibility by default like we do in every other llvm package
The `patch()` directive can now be invoked with `reverse=True` to apply a patch in reverse.
This is useful for reverting commits that caused errors in projects, even if only the forward
patch is available, e.g. via a GitHub commit patch URL.
`patch(..., reverse=True)` runs `patch -R` behind the scenes. This is a POSIX option so we
can expect it to be available on the `patch` command.
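Hypothetical usage in a package (URL, checksum, and version are placeholders):
```python
patch(
    "https://github.com/org/project/commit/<commit>.patch?full_index=1",
    sha256="<sha256-of-the-patch>",
    reverse=True,  # apply with `patch -R` to revert the commit
    when="@2.1.0",
)
```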
---------
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
set the FC variable to the MPI Fortran compiler and also set the F90 variable
to the same compiler for versions 9.8 and up. FC needs to be set because the
configure script still uses FC.
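As a sketch, assuming the usual Spack MPI provider interface:
```python
def setup_build_environment(self, env):
    fc = self.spec["mpi"].mpifc   # the MPI Fortran compiler wrapper
    env.set("FC", fc)             # configure still reads FC
    if self.spec.satisfies("@9.8:"):
        env.set("F90", fc)
```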
* unmaintained packages: add new versions
* Fix parallel and numactl
* Revert numactl changes
* rollback lua-sol2 version
* Update alluxio version format
* py-psyclone and py-fparser new releases added
* py-fparser: add missing releases for py-psyclone
* py-psyclone: actioned @adamjstewart comments
* py-psyclone: removed py-pytest-pylint
* py-pytest-pylint: added package @ latest version
* py-pytest-pylint: reformatted
* Update var/spack/repos/builtin/packages/py-psyclone/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-pytest-pylint: added build deps and runtime dep versions
* py-pytest-pylint: removed version from setuptools
* py-psyclone: add py-pytest-pylint test dep and alphabetize deps
* Update var/spack/repos/builtin/packages/py-pytest-pylint/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-psyclone: deps ordered
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
fixes #43097
Before this PR, the behavior of mixins used together with
builders was to completely mask the callbacks defined by
the class coming later in the MRO.
Here we fix the behavior by accumulating all callbacks
and de-duplicating them later, as in the sketch below.
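A minimal sketch of the accumulation (the attribute name is assumed, not Spack's actual internals):
```python
def gather_callbacks(builder_cls, attr="_callbacks"):
    # walk the MRO so mixin callbacks accumulate instead of being masked,
    # then de-duplicate by name, keeping the first occurrence
    seen, result = set(), []
    for klass in builder_cls.__mro__:
        for callback in getattr(klass, attr, ()):
            if callback.__name__ not in seen:
                seen.add(callback.__name__)
                result.append(callback)
    return result
```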
* package/qgis: add new version
* improve Qsci.pro
* improve
* fix undefined symbol qsciprinter error
* add import test
* fix bug
* add version 3.36
* [@spackbot] updating style on behalf of Sinan81
* fix long line
* only run import test when +python
* first attempt at stand-alone test
* add TODO
---------
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
Co-authored-by: Sinan81 <Sinan@world>
* proj: correct CMake arg for shared build with proj older than 7.0.0
* Actually use new CMake arg
* Update var/spack/repos/builtin/packages/proj/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
This adds Spack packages for these Perl distributions:
- Bio::DB::EUtilities and its dependencies:
- Bio::ASN1::EntrezGene
- Bio::Cluster
- Bio::Variation
* Properly specify Curl builder interface for static vs shared curl
with NMake system to ensure all built curls export expected
symbols.
* Symlinks curl library build artifact to more idiomatic name for
FindCurl.cmake implementations and other NMake consumers.
* Deprecating py-pylint@2.3 as it cannot build with python@3.8:
* Style fix
* Removed versions because can't build with python@3.7
---------
Co-authored-by: Gava, Francesco <francesco.gava@mclaren.com>
* py_cheap_repr: add initial package.py
* Update var/spack/repos/builtin/packages/py-cheap-repr/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-cheap-repr: use pypi link instead
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Added new versions @1.5.1 and @1.4.1 (sfcgal moved from github to gitlab)
Placed restrictions on which versions of cgal are supported for different
sfcgal versions. These restrictions are based on what was found in the
version history at https://gitlab.com/sfcgal/SFCGAL/-/blob/v1.5.1/NEWS?ref_type=tags
as well as the CMakeLists.txt for different versions.
@1.4 and @1.5 seem to require a specific version of cgal based on CMakeLists.txt.
Earlier versions (@1.3.8:1.3.10) claim to support cgal@4.3: (but the Spack recipe
says it did not work until @4.7, so sticking with that as the minimum). CMakeLists.txt
suggests they support cgal@5 as well, but the version history suggests otherwise.
* Expanding the duckdb package to fix the version number (required for external extensions to work) being pulled from git, and to add variants for the built-in extensions at build time. This also changes the build system from CMakePackage to MakefilePackage (as advised by upstream).
* Reorganized and cleaned up variants
- Updated the patch to work for 0.10.0+
- Removed (or made non-default) some unnecessary variants
- Added ninja as the default generator
- Set up some shared library dependencies
* enable tensorflow-2.11 support for ROCm
* add latest sha for mesa and limit the patches to older versions; similar
changes in #37910 to make gitlab-ci pass
* address review comments
Remove dependency on `importlib_metadata` and `pkg_resources`, which can be problematic if the version in PYTHONPATH is incompatible with the interpreter Spack is running under.
* Kokkos Kernels: adding missing TPLs and pre-conditions
Adding variants and dependencies for rocBLAS and rocSPARSE.
Also adding a "when=" clause to the TPL variants that prevents
enabling the TPLs in versions of the library where they were not
yet available.
* Kokkos Kernels: remove comment for better format
* Kokkos Kernels: fix issue with unpacking wrong number of args
After changing the tpls dictionary we need to update the unpacking
logic to catch the right number of outputs out of it!
* Kokkos Kernels: updating doc string for tpls var and using f-string
Improving comment a bit and switching to f-string for more readability.
When setup to do more testing will try to use f-string in the CMake
options generation part of the package.
* Style change
* sst-core now effectively depends on ncurses
* use --with-curses
* sst-core: update comment about ncurses
* should have curses for build, link, and run
Closes #43052.
Maybe moving the argument to the `find` subcommand is a good idea, but I
just wanted to get the docs fix out.
Co-authored-by: Patrice Peterson <patrice.peterson@itz.uni-halle.de>
This PR adds the ability to load spack extensions through `importlib.metadata` entry
points, in addition to the regular configuration variable.
It requires Python 3.8 or greater to be properly supported.
#42878 adds a post install filter of the netCDFConfig.cmake file
that replaces a valid CMake target on Windows with an invalid one.
Don't do this replacement on Windows.
* Updates for migraphx 6.0.0 & 6.0.2
* Style check error and audit check error fix
* Adding patch for half-include-directory
* The parameter GPU_TARGETS is used from 5.7 in migraphx
* Adding rocmlir dependency in migraphx and 6.0 updates in rocmlir
* Applying upcoming changes to make CK JIT optional and enable
compilation on Windows in order to build without ck dependency
* mpfr: missing dependency for version 4.0.1
mpfr 4.0.1 (like 4.0.2) needs autoconf-archive, which provides
the AX_PTHREAD macro
* autoconf-archive is also required for mpfr@4.0.0
New changes have been made to nvptx-tools that address dropping support
for sm_30 in later CUDA versions (12.0+).
Also refactor gcc to make nvptx-tools a dependency instead of a resource.
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Adds CHI::Driver::Memcached.
Modified perl-chi to make the tests work. Testing modules in perl-chi
were not loaded when testing CHI::Driver::Memcached, so added the "run"
type to these.
* ASP-based solver: improve reusing nodes with gcc-runtime
This PR skips emitting dependency constraints on "gcc-runtime",
for concrete specs that are considered for reuse.
Instead, an appropriate version of gcc-runtime is recomputed
considering also the concrete nodes from reused specs.
This ensures that root nodes in a DAG always have a runtime
at a version greater than or equal to that of their dependencies.
* Add unit-test for view with multiple runtimes
* Select latest version of runtimes in views
* Construct result keeping track of latest
* Keep ordering stable, just in case
1) Support for version @3.0. Unfortunately, download seems to require
registration now, so using the manual_download mechanism for @3:
2) Copying the hdf-eos5 patch from @vanderwb to enable
use of Spack compiler wrappers instead of h4cc
3) Patching an issue in the hdf-eos2 configure script. The
script will test for the jpeg and libz libraries, succeed,
append HAVE_LIBJPEG=1, etc. to confdefs.h, and then abort
because HAVE_LIBJPEG is not set in the running environment.
4) Adding some LDFLAGS to the build environment. Otherwise the
build seems to fail on the test script due to an rpc dependence
in HDF4.
Adds Net::Server::SS::PreFork and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Server::Starter
- Net::Server
- HTTP::Server::Simple
Adds Catalyst::Action::RenderView and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Catalyst::Action::RenderView
* package/imagemagick add new version, improve
* confirmed that the build fails when libsm is missing on linux
---------
Co-authored-by: Sinan81 <Sinan@world>
* perl-chi and deps: new packages
Adds CHI and its dependencies.
Installed OK with build-time tests. Added dependencies:
- CHI
* Add license
* legion: Add 23.09.0 and 23.12.0, remove control_replication.
The branch control_replication has been merged to master and should no
longer be used.
* flecsi: Switch to Legion master branch.
Legion control_replication has been merged to master.
* Fix Legion 23.09.0 and 23.12.0 build for ROCm 6.
* Add ncvis and opengl option for wxwidgets
* Style fixes for ncvis
* Replace in with satisfies for opengl constraint
Co-authored-by: Alec Scott <alec@bcs.sh>
---------
Co-authored-by: Alec Scott <alec@bcs.sh>
* when built with cmake, libogg does not build a shared library by default. This resolves that
* spack style fixes
* Clean up imports
* enforce +pic when +shared
* Execute `args.help` after setting main options so that extension commands will show with `spack -h`
---------
Co-authored-by: psakievich <psakiev@sandia.gov>
* Update package.py
1. add one compiler type named 'musl'
2. add a variant name 'multilib'
3. add a variant name 'cmodel'
* Added one compiler type named 'musl'.
Added a variant named 'multilib'.
Added a variant named 'cmodel'.
Added several versions.
* aarch64 is not supported.
The deps were added in #40945 to make it work on macOS 11, because the
old configure scripts only detect macOS 10. Apparently people reported the
autoreconf script caused issues, later fixed in #41057. However, even
with that fix, things are incorrect, because people now report:
```
libtool: You should recreate aclocal.m4 with macros from libtool 2.4.7
libtool: and run autoconf again.
```
HOWEVER, all this is unnecessary, because the underlying issue was
already fixed long ago; it just regressed at some point, and the fix
has been back in place since #41205.
This adds Readonly::XS. Since this module can not be used by itself, the
Spack package comes with a test override. This anticipates that the perl
builder will one day have a generic standalone module usage test.
This should fix issue #40780
We explicitly cast self.spec["python"].command to str in the filter_file
call in _fix_dtrace_shebang to avoid the error
==> Error: TypeError: expected str, bytes or os.PathLike object, not Executable
Not sure why the error is appearing (is it only for specific python versions, etc?),
but the fix should be quite safe.
* e4s: new packages: glvis, laghos
* gl: require: osmesa
* be explicit: glvis ^llvm so that llvm-amdgpu not chosen
* glvis fails on oneapi stack due to issue 42839
Spack merges ranges and concrete versions if they have non-empty
intersection. That is not enough for adjacent version ranges.
This commit ensures that disjoint ranges in version lists are merged
when their union forms a single contiguous range:
```python
"@1.0:2.0,2.1,2.2:3,4:6" # simplifies to "@1.0:6"
```
Libraries for openexr are named libOpenEXR*.so, etc., so the default libs
handler in spec does not find them.
Add a custom libs property to address this.
Partial fix for #42273
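A sketch of such a property, using Spack's `find_libraries` helper:
```python
@property
def libs(self):
    # the default handler searches for "libopenexr*"; the installed
    # libraries are capitalized, e.g. libOpenEXR*.so
    return find_libraries("libOpenEXR*", root=self.prefix, recursive=True)
```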
Co-authored-by: payerle <payerle@users.noreply.github.com>
For various reasons, the pmix dependency of 5.0.2 had to be advanced to at
least pmix 4.2.4; 5.0.1 and 5.0.0 can also build with pmix 4.2.4 or newer.
related to #42651
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Refactoring `SpackSolverSetup` is a bit easier with type annotations, so I started
adding some. This adds annotations for the (many) instance variables on
`SpackSolverSetup` as well as a few other places.
This also refactors `condition()` to reduce redundancy and to allow
`_get_condition_id()` to be called independently of the larger condition
function.
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* py-ipyvuetify: new package
* Limit py-jupyter-packing version to 0.7.x
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix py-jupyterlab version and type
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Fix py-ipyvue version range to exclude 2
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* rm py-wheel, already considered for PythonPackage
* fix: pynpm only required for build, reorder dependencies as in the pyproject.toml
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* snakemake: add Snakemake 8 with dependencies
* snakemake: add missing description
* Whitespace
* Whitespace
* Whitespace
* Whitespace
* py-conda-inject: add constraint for Python
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-executor-plugin-azure-batch: add constraint for Python
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-executor-plugin-cluster-generic: add constraint for Python
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake: add upper bound for Python
* py-snakemake-executor-plugin-drmaa: specify dependency type
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-executor-plugin-googlebatch: correct dependency version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-executor-plugin-tes: correct dependency version
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-snakemake-storage-plugin-s3: reorder
* snakemake: remove newly added variants
* snakemake: remove newly added variants
* snakemake: remove newly added variant
* snakemake: update version
* snakemake: update version
* snakemake: whitespace
* py-snakemake-storage-plugin-s3: update version
* snakemake: use newer version
* snakemake: whitespace
* snakemake: update interfaces
* py-snakemake-storage-plugin-gcs: link issue
* snakemake: update versions
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* package/libspatialite: add conflict, new version
* depends on new version of freexl
* fix bug
* remove manual download stuff
* improve style
* first depracate
* [@spackbot] updating style on behalf of Sinan81
* get rid of conflict, reorder deps
* remove manual download
---------
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
* Allow awscli-v2 to be installed without examples/ dir
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
* Update var/spack/repos/builtin/packages/awscli-v2/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
The CMake-based build is anticipated to work in all cases where the
Autotools-based build did, and to address all prior issues with less
maintenance of the package. In detail:
* Fixes#42735 (CMake's find_package helps with linking to proper
netcdf-c)
* Replaces older Autotools-based build
* All preexisting variants are handled
* Record hdf5 as an explicit dependency (was missing before)
* Add +tests option
Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
* proj: apply stdint.h patch in version 8
* Update var/spack/repos/builtin/packages/proj/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Some builds on Windows break when encountering paths with spaces. This
reencodes some paths in Windows 8.3 filename format (when on Windows):
this serves as an equivalent identifier for the file, but in a form that
does not have spaces.
8.3 filenames are also truncated in length, which could be helpful, but
that is not the primary intended purpose of using this format.
Overall
* nmake/msbuild packages do this generally for the install prefix
* curl/perl require additional modifications (as written now, each package
may require calls to `windows_sfn` to work when the Spack
root/install/staging prefixes contain spaces)
Some items for follow-up:
* Spack itself does not create paths with spaces "on top" of whatever
the user configures or where it is placed (e.g. the Spack root, the
staging directory, etc.), so it might be possible to edit some of these
paths once and avoid a proliferation of individual `windows_sfn`
calls in individual packages.
* This approach may result in the insertion of 8.3-style paths into
build artifacts (on Windows), handling this may require additional
bookkeeping (e.g. when relocating).
* Move spec_list into its own file, instead of __init__.py
* Remove spack.schema.spack
This module was introduced in #33960. It's almost an exact duplicate of
spack.schema.env, and is not used anywhere.
* Fix typo
* adds the spack recipe for py-jwcrypto
* split long line to fix E501
* Specify versions for py-cryptography and py-typing-extensions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* adds the spack recipe for reacton python package
* Fix versions for ipywidgets and typing-extensions
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-find-libpython: new package
* Update var/spack/repos/builtin/packages/py-find-libpython/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add spack recipe for ipyvue
* Specify version for ipywidgets
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Update package.py
fix bug.
* Update var/spack/repos/builtin/packages/pdal/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-ansible: add v2.16.3
* Apply suggestions from code review
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* Add specific python version requirements from setup.cfg
* Add additional ranges for py-setuptools
* Update var/spack/repos/builtin/packages/py-ansible/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Compiling version 2.2.4 fails (on a Debian system with only a minimal set of packages installed) with an error because `INT_MAX` is undeclared:
```
gd_gd2.c: In function '_gd2GetHeader':
gd_gd2.c:212:54: error: 'INT_MAX' undeclared (first use in this function)
  212 |    if (*ncx <= 0 || *ncy <= 0 || *ncx > INT_MAX / *ncy) {
      |                                          ^~~~~~~
gd_gd2.c:87:1: note: 'INT_MAX' is defined in header '<limits.h>'; did you forget to '#include <limits.h>'?
```
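One plausible package-level workaround, sketched under the assumption that `gd_gd2.c` includes `"gd.h"` at that path (this is not necessarily the fix that was merged):
```python
# Hedged sketch: prepend the missing <limits.h> include in a patch step.
from llnl.util.filesystem import filter_file  # Spack's in-place substitution helper


def add_limits_include(source_file="src/gd_gd2.c"):
    # Hypothetical: ensure INT_MAX is declared where it is used.
    filter_file(
        '#include "gd.h"',
        '#include <limits.h>\n#include "gd.h"',
        source_file,
        string=True,
    )
```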
* pythia6: deal with dead pythiasix.hepforge.org
* pythia6: rm main81.f from CMakeLists.txt
* [@spackbot] updating style on behalf of wdconinc
---------
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Any package `X` used as `depends_on("x", type="build")` will have
`X.setup_run_environment(env)` called, because it has to be able to
"run" in the build environment.
So there is no point in calling `setup_run_environment` from
`setup_dependent_build_environment`.
Also it's redundant to call `setup_run_environment` in
`setup_dependent_run_environment`, because (a) the latter is called _for
every parent edge_ instead of once per node, and (b) it's only called
after `setup_run_environment` has already run. Better to call
`setup_run_environment` once and only once.
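A minimal sketch of the resulting pattern, with illustrative names (not a real package): define the environment once in `setup_run_environment` and let Spack apply it in both contexts.
```python
# Hedged sketch: one environment hook is enough. Spack applies it when
# the package is used at runtime and, because build dependencies must be
# runnable, in dependents' build environments as well.
from spack.package import *


class Foo(Package):
    def setup_run_environment(self, env):
        # Called once per node; no need to repeat this in the
        # setup_dependent_* hooks.
        env.prepend_path("PATH", self.prefix.bin)
```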
The fftw object was originally created with spec["fftw:openmp"], but
referencing spec["fftw"] afterwards overwrites 'last_query' in the spec
object, so later use of fftw.libs was not returning the FFTW OpenMP libs.
Also allow the post-install fixup to support amdfftw as well as fftw.
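A minimal sketch of the pitfall and the fix (helper names are illustrative; `spec` is the usual spec object available during a package build):
```python
# Hedged sketch of the 'last_query' pitfall described above.
def openmp_libs_buggy(spec):
    fftw = spec["fftw:openmp"]  # records the ':openmp' qualifier
    _ = spec["fftw"]            # re-query overwrites last_query
    return fftw.libs            # no longer the OpenMP libraries


def openmp_libs_fixed(spec):
    fftw = spec["fftw:openmp"]  # query once, keep the qualified handle
    return fftw.libs            # FFTW OpenMP libraries, as intended
```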
Co-authored-by: Branden Moore <branden.moore@amd.com>
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
* new builtin package: py-biobb-model
* Update var/spack/repos/builtin/packages/py-biobb-model/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add package py-jacobi
* fix: add description
* fix: add description
* fix: add description
* [@spackbot] updating style on behalf of jonas-eschle
* Update package.py
* Update package.py
* Update var/spack/repos/builtin/packages/py-jacobi/package.py
I don't think numpy is used in "build", but that's not important.
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-google-cloud-storage: add new versions
* py-google-api-core: add new versions
* py-proto-plus: add new package
* py-google-api-core: add grpc variant
* py-google-api-core: add grpc variant
* py-google-api-core: add missing prefix
* py-google-cloud-batch: add new package
* py-google-cloud-logging: add new package
* py-google-cloud-appengine-logging: add new package
* py-google-cloud-audit-log: add new package
* py-grpc-google-iam-v1: add new package
* py-proto-plus: remove obvious dependency
* Whitespace
* Whitespace
* py-google-cloud-audit-log: correct conflict
* py-proto-plus: correct dependency type
* Whitespace
* py-google-auth: add new version
* py-google-resumable-media: add new version
* py-google-cloud-storage: constrain version of dependency
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-grpcio-status: use newer version
* py-google-resumable-media: add upper bound of dependency
* Add types of dependencies.
* py-grpcio: add new version
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* py-smote-variants: Added package py-smote-variants
Also added py-minisom and py-metric-learn as dependencies
* py-metric-learn: Added build dependency on setuptools
* py-smote-variants: Added a dependency on py-pytest-runner
As well as a comment about why statistics isn't included
* [@spackbot] updating style on behalf of alex391
---------
Co-authored-by: Alex C Leute <aclrc@rit.edu>
* py-charm4py: needs Cython<3.0
* Update var/spack/repos/builtin/packages/py-charm4py/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
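One plausible way to express the Cython ceiling in the recipe (a sketch, not necessarily the merged diff):
```python
# Hedged sketch: pin Cython below 3.0 at build time for py-charm4py.
from spack.package import *


class PyCharm4py(PythonPackage):
    depends_on("py-cython@:2", type="build")
```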
@@ -220,6 +220,40 @@ section of the configuration:
.. _binary_caches_oci:
---------------------------------
Automatic push to a build cache
---------------------------------
Sometimes it is convenient to push packages to a build cache as soon as they are installed. Spack can do this by setting the autopush flag when adding a mirror:
.. code-block:: console

   $ spack mirror add --autopush <name> <url or path>
Or the autopush flag can be set for an existing mirror:
.. code-block:: console

   $ spack mirror set --autopush <name>     # enable automatic push for an existing mirror
   $ spack mirror set --no-autopush <name>  # disable automatic push for an existing mirror
Then, after a package is installed, it is automatically pushed to all mirrors with ``autopush: true``. The command
.. code-block:: console

   $ spack install <package>
will have the same effect as
.. code-block:: console

   $ spack install <package>
   $ spack buildcache push <cache> <package>  # for all caches with autopush: true
.. note::

   Packages are automatically pushed to a build cache only if they are built from source.
@@ -73,9 +73,12 @@ are six configuration scopes. From lowest to highest:
Spack instance per project) or for site-wide settings on a multi-user
machine (e.g., for a common Spack instance).
#. **plugin**: Read from a Python project's entry points. Settings here affect
   all instances of Spack running with the same Python installation. This scope
   takes higher precedence than site, system, and default scopes.
#. **user**: Stored in the home directory: ``~/.spack/``. These settings
   affect all instances of Spack and take higher precedence than site,
   system, plugin, or defaults scopes.
#. **custom**: Stored in a custom directory specified by ``--config-scope``.
   If multiple scopes are listed on the command line, they are ordered
@@ -196,6 +199,45 @@ with MPICH. You can create different configuration scopes for use with
mpi: [mpich]
.. _plugin-scopes:
^^^^^^^^^^^^^
Plugin scopes
^^^^^^^^^^^^^
.. note::

   Python version >= 3.8 is required to enable plugin configuration.
Spack can be made aware of configuration scopes that are installed as part of a Python package. To do so, register a function that returns the scope's path under the ``"spack.config"`` entry point. Consider the Python package ``my_package`` that includes Spack configurations:
.. code-block:: console

   my-package/
   ├── src
   │   ├── my_package
   │   │   ├── __init__.py
   │   │   └── spack/
   │   │       └── config.yaml
   └── pyproject.toml
Adding the following to ``my_package``'s ``pyproject.toml`` will make ``my_package``'s ``spack/`` configurations visible to Spack when ``my_package`` is installed:
.. code-block:: toml

   [project.entry_points."spack.config"]
   my_package = "my_package:get_config_path"
The function ``my_package.get_config_path`` in ``my_package/__init__.py`` might look like:
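The function body is elided in this hunk; a plausible sketch, assuming the ``spack/`` scope directory ships next to ``__init__.py`` (an assumption, not the verbatim docs):

.. code-block:: python

   import pathlib


   def get_config_path():
       # Return the path of the spack/ config directory bundled with
       # my_package. Hypothetical sketch; the real docs may differ.
       return str(pathlib.Path(__file__).parent / "spack")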
@@ -111,3 +111,39 @@ The corresponding unit tests can be run giving the appropriate options to ``spac
(5 durations < 0.005s hidden. Use -vv to show these durations.)
=========================================== 5 passed in 5.06s ============================================
---------------------------------------
Registering Extensions via Entry Points
---------------------------------------
.. note::

   Python version >= 3.8 is required to register extensions via entry points.
Spack can be made aware of extensions that are installed as part of a Python package. To do so, register a function that returns the extension path, or paths, under the ``"spack.extensions"`` entry point. Consider the Python package ``my_package`` that includes a Spack extension:
.. code-block:: console

   my-package/
   ├── src
   │   ├── my_package
   │   │   └── __init__.py
   │   └── spack-scripting/  # the spack extensions
   └── pyproject.toml
Adding the following to ``my_package``'s ``pyproject.toml`` will make the ``spack-scripting`` extension visible to Spack when ``my_package`` is installed:
.. code-block:: toml

   [project.entry_points."spack.extensions"]
   my_package = "my_package:get_extension_path"
The function ``my_package.get_extension_path`` in ``my_package/__init__.py`` might look like:
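The body is elided here as well; a plausible sketch under the layout shown above (an assumption, not the verbatim docs):

.. code-block:: python

   import pathlib


   def get_extension_path():
       # Return the path of the spack-scripting extension in src/.
       # Hypothetical sketch based on the tree shown above.
       return str(pathlib.Path(__file__).parent.parent / "spack-scripting")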
A suffix not recognized as a pre-release is treated as an ordinary
string component, so ``1.2 < 1.2-mysuffix``.
The order on versions is defined as follows. A version string is split
into a list of components based on delimiters such as ``.``, ``-``, etc.
Lists are then ordered lexicographically, with components compared as
described below.
Finally, there are a few special string components that are considered
"infinity versions". They include ``develop``, ``main``, ``master``,
``head``, ``trunk``, and ``stable``. For example: ``1.2 < develop``.
These are useful for specifying the most recent development version of
a package (often a moving target like a git branch), without assigning
a specific version number. Infinity versions are not automatically used when determining the latest version of a package unless explicitly required by another package or user.
More formally, the order on versions is defined as follows. A version
string is split into a list of components based on delimiters such as
``.`` and ``-``, and on string boundaries. The components are split into
the **release** and a possible **pre-release** (if the last component is
numeric and the second to last is one of the strings ``alpha``, ``beta``,
or ``rc``). The release components are ordered lexicographically, with
comparison between different types of components as follows:
#. The following special strings are considered larger than any other
   numeric or non-numeric version component, and satisfy the following
   order among themselves: ``develop > main > master > head > trunk > stable``.
@@ -925,6 +949,9 @@ as follows:
#. All other non-numeric components are less than numeric components,
and are ordered alphabetically.
Finally, if the release components are equal, the pre-release components
are used to break the tie, in the obvious way.
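To make these rules concrete, here are a few comparisons implied by the text above, written against Spack's ``Version`` type (a sketch; assumes it runs where ``spack.version`` is importable):

.. code-block:: python

   from spack.version import Version

   assert Version("1.2") < Version("1.10")          # numeric parts compare as numbers
   assert Version("1.2") < Version("1.2-mysuffix")  # unrecognized suffix is an ordinary component
   assert Version("1.2rc1") < Version("1.2")        # pre-release sorts before the release
   assert Version("1.2") < Version("develop")       # infinity versions sort above everything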
The logic behind this sort order is two-fold:
#. Non-numeric versions are usually used for special cases while