Compare commits


616 Commits

Martin Aumüller
1527853efd intel-tbb: patch patch for Apple's patch (#40640)
While GNU patch 2.7.6 (as provided by Homebrew), for example, applies the previous
version of this patch without problems, Apple's patch 2.0-12u11-Apple fails
to determine which file to patch.

Adding two lines to the patch fixes that. Renamed the patch in order to
not require a `spack clean -m`.
2023-10-21 09:26:36 -04:00
Harmen Stoppels
d820cf73e9 py-kombu: fix setuptools bound (#40646) 2023-10-21 13:38:30 +02:00
Scott Wittenburg
8714b24420 py-kombu: pick older version of py-setuptools (#40642) 2023-10-21 08:38:03 +02:00
Lydéric Debusschère
0c18f81b80 [add] py-dict2css: new package (#40552)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 18:09:13 -06:00
Andrey Alekseenko
d442fac69a gromacs: add 2022.6, 2023.2, 2023.3 versions (#38906)
* gromacs: add 2022.6, 2023.2 versions
* gromacs: add version 2023.3
2023-10-20 17:28:45 -06:00
Garth N. Wells
76c57af021 py-fenics-ffcx: update to v0.7 (#40569) 2023-10-20 12:04:02 -05:00
Harmen Stoppels
27a0425e5d concretize separately: show concretization time per spec as they concretize when verbose (#40634) 2023-10-20 17:09:19 +02:00
Harmen Stoppels
4bade7ef96 gromacs +cp2k: build in CI (#40494)
* gromacs +cp2k: build in CI

* libxsmm: x86 only

* attempt to fix dbcsr + new mpich

* use c11 standard

* gromacs: does not depend on dbcsr

* cp2k: build with cmake in CI, s.t. dbcsr is a separate package

* cp2k: cmake patches for config files and C/C++ std

* cp2k: remove unnecessary constraints due to patch
2023-10-20 16:20:20 +02:00
Lydéric Debusschère
a0e33bf7b0 py-corner: new package (#40546)
* [add] py-corner: new package

* py-corner: remove py-wheel dependency per review comments

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 07:37:15 -05:00
Massimiliano Culpo
cbc39977ca ASP-based solver: minimize weights over edges (#40632)
With the introduction of multiple build dependencies from the same package in the DAG, we need to minimize a few weights accounting for edges rather than nodes. If we don't do that we might have multiple "optimal" solutions that differ only in how the same nodes are connected together. This commit ensures optimal versions are picked per parent in case of multiple choices for a dependency.
2023-10-20 14:37:07 +02:00
Adam J. Stewart
06fc24df5e TensorFlow/Keras/TensorBoard: add v2.14.0 (#40297)
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-10-20 14:23:54 +02:00
Claire Guilbaud
9543abd2d9 add recipes for sphinx-book-theme and its dependencies if unknown (#40312)
* add recipes for sphinx-book-theme and its dependencies if unknown

* fix version and missing https

* fix based on reviewers remarks
2023-10-20 07:18:41 -05:00
Lydéric Debusschère
004d3e4cca [add] py-fraction: new package (#40554)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 07:03:48 -05:00
Lydéric Debusschère
25aff66d34 [add] py-cssutils: new package (#40551)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 07:01:38 -05:00
Lydéric Debusschère
9bd77b2ed3 [add] py-css-parser: new package (#40550)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-10-20 07:00:46 -05:00
Manuela Kuhn
5de1c1c98f py-statsmodels: add 0.14.0 (#39156)
* py-statsmodels: add 0.14.0

* Fix style

* Update var/spack/repos/builtin/packages/py-statsmodels/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-statsmodels/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Remove python limits

* Remove comment

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-20 06:59:26 -05:00
Manuela Kuhn
5b9b5eaa28 py-dcm2bids: add v3.1.0 (#40447)
* py-dcm2bids: add 3.1.0

* Fix python restriction
2023-10-20 06:56:48 -05:00
Manuela Kuhn
00ee72396f py-bidscoin: add v4.1.1 and py-argparse-manpage: add new package (#40414)
* py-bidscoin: add 4.1.1

* Fix style

* Fix restrictions for dependencies
2023-10-20 06:55:41 -05:00
snehring
aa4d55004c Add package py-macs3 and dependencies (#40498)
* py-cykhash: adding new package py-cykhash

* py-hmmlearn: adding new package py-hmmlearn

* py-macs3: adding new package py-macs3

* py-macs3: adding python version restriction and other changes.
2023-10-20 06:53:41 -05:00
Harmen Stoppels
468f6c757e schema/compilers.py: fix validation of 2+ entries (#40627)
Fix the following syntax which validates only the first array entry:

```python
"compilers": {
    "type": "array",
    "items": [
        {
            "type": ...
        }
    ]
}
```

to

```python
"compilers": {
    "type": "array",
    "items": {
        "type": ...
    }
}
```

which validates the entire array.

Oops...
2023-10-20 09:51:49 +02:00
Adam J. Stewart
0907d43783 Drop support for external PythonX.Y (#40628)
On some systems, multiple pythonx.y are placed in the same prefix as
pythonx (where only one of them is associated with that pythonx).
Spack external detection for Python was willing to register all of
these as external versions. Moreover, the `package.py` for Python
was able to distinguish these.

This can cause an issue for some build systems, which will just look
for python3 for example, so if that python3 is actually python3.6,
and the build system needs 3.7 (which spack may have found in the
same prefix, and offered as a suitable external), it will fail when
invoking python3.

To avoid that issue, we simply avoid treating pythonx.y as external
candidates. In the above case, Spack would only detect a Python 3.6
external, and the build would be forced to use a Spack-built Python
3.7 (which we consider a good thing).
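
A minimal sketch of the kind of filtering described (hypothetical helper, not
Spack's actual detection code):

```python
import re

# Hypothetical sketch: accept "python"/"python3" as external candidates,
# but skip versioned "pythonX.Y" binaries living in the same prefix.
VERSIONED = re.compile(r"python\d+\.\d+$")

def external_candidates(executables):
    return [exe for exe in executables if not VERSIONED.search(exe)]

print(external_candidates(["python3", "python3.6", "python3.7"]))  # ['python3']
```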
2023-10-20 00:29:38 -06:00
wspear
c9e5173bbd TAU: Respect ~fortran for +mpi (#40617) 2023-10-19 23:24:17 -06:00
Vicente Bolea
0019faaa17 vtk-m: update to latest release (#40624) 2023-10-19 18:02:25 -06:00
Lydéric Debusschère
e30f53f206 perl: change permissions in order to apply patch on version 5.38.0 (#40609)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-10-20 00:00:24 +02:00
dependabot[bot]
f2ba25e09d build(deps): bump actions/checkout from 4.1.0 to 4.1.1 (#40584) 2023-10-19 23:04:40 +02:00
dependabot[bot]
405de56c71 build(deps): bump mypy from 1.6.0 to 1.6.1 in /lib/spack/docs (#40603) 2023-10-19 23:03:48 +02:00
dependabot[bot]
ba571f2404 build(deps): bump mypy from 1.6.0 to 1.6.1 in /.github/workflows/style (#40602) 2023-10-19 23:03:23 +02:00
dependabot[bot]
4c1785d5f6 build(deps): bump urllib3 from 2.0.6 to 2.0.7 in /lib/spack/docs (#40583) 2023-10-19 23:02:51 +02:00
Adam J. Stewart
fa4d5ee929 py-rasterio: add v1.3.9 (#40621) 2023-10-19 12:08:43 -07:00
Cody Balos
8720cec283 add nvechip to sundials components when mfem+rocm (#40512) 2023-10-19 12:08:24 -07:00
Harmen Stoppels
72b36ac144 Improve setup build / run / test environment (#35737)
This adds a `SetupContext` class which is responsible for setting
package.py module globals, and computing the changes to environment
variables for the build, test or run context.

The class uses `effective_deptypes` which takes a list of specs (e.g. single
item of a spec to build, or a list of environment roots) and a context
(build, run, test), and outputs a flat list of specs that affect the
environment together with a flag in what way they do so. This list is
topologically ordered from root to leaf, so that one can be assured that
dependents override variables set by dependencies, not the other way
around.

This is used to replace the logic in `modifications_from_dependencies`,
which has several issues: missing calls to `setup_run_environment`, and
the order in which operations are applied.

Further, it should improve performance a bit in certain cases, since
`effective_deptypes` runs in O(v + e) time, whereas `spack env activate`
currently can take up to O(v^2 + e) time due to loops over roots. Each
edge in the DAG is visited once by calling `effective_deptypes` with
`env.concrete_roots()`.

By marking and propagating flags through the DAG, this commit also fixes
a bug where Spack wouldn't call `setup_run_environment` for runtime
dependencies of link dependencies. And this PR ensures that Spack
correctly sets up the runtime environment of direct build dependencies.

Regarding test dependencies: in a build context they are build-time
test deps, whereas in a test context they are install-time test deps.
Since there is no way to distinguish the two kinds of test deps,
they are treated as both.

Further changes:

- all `package.py` module globals are guaranteed to be set before any of the
  `setup_(dependent)_(run|build)_env` functions is called
- traversal order during setup: first the group of externals, then the group
  of non-externals, with specs in each group traversed topologically (dependencies
  are set up before dependents)
- modules: only ever call `setup_dependent_run_environment` of *direct* link/run
   type deps
- the marker in `set_module_variables_for_package` is dropped, since we should
  call the method once per spec. This allows us to set only a cheap subset of
  globals on the module: for example it's not necessary to compute the expensive
  `cmake_args` and the like if the spec under consideration is not the root node to be
  built.
- `spack load`'s `--only` is deprecated (it has no effect now), and `spack load x`
  now means: do everything that's required for `x` to work at runtime, which
  requires runtime deps to be setup -- just like `spack env activate`.
- `spack load` no longer loads build deps (of build deps) ...
- `spack env activate` on partially installed or broken environments: this is all
  or nothing now. If some spec errors during setup of its runtime env, you'll only
  get the unconditional variables + a warning that says the runtime changes for
  specs couldn't be applied.
- Remove traversal in upward direction from `setup_dependent_*` in packages.
  Upward traversal may iterate to specs that aren't children of the roots
  (e.g. zlib / python have hundreds of dependents, only a small fraction is
  reachable from the roots. Packages should only modify the direct dependent
  they receive as an argument)
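
To illustrate the ordering guarantee above, here is a toy sketch (not Spack's
API) of why dependents override dependencies when the topological order runs
root to leaf:

```python
# Toy sketch, not Spack's implementation: given specs ordered root -> leaf,
# apply env modifications leaf-first so dependents (root-ward) win conflicts.
def combine_env(ordered_root_to_leaf):
    env = {}
    for _name, mods in reversed(ordered_root_to_leaf):
        env.update(mods)  # entries closer to the root overwrite earlier ones
    return env

order = [("root", {"CC": "root-cc"}), ("dep", {"CC": "dep-cc", "FC": "dep-fc"})]
print(combine_env(order))  # {'CC': 'root-cc', 'FC': 'dep-fc'}
```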
2023-10-19 20:44:05 +02:00
Harmen Stoppels
79896ee85c spack checksum: restore ability to select top n (#40531)
The ability to select the top N versions was removed in the checksum overhaul,
because initially numbers were used for commands.

Now that we've settled on characters for commands, let's make numbers pick the top
N again.
2023-10-19 11:33:01 -07:00
Vanessasaurus
408ee04014 Automated deployment to update package flux-core 2023-10-19 (#40605)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-10-19 10:16:42 -07:00
Harmen Stoppels
3f594e86a1 libvorbis: drop -force_cpusubtype_ALL flag (#40616)
This flag was only relevant when targeting powerpc from apple-clang,
which we don't do. The flag is removed from apple-clang@15. Let's drop
it unconditionally.
2023-10-19 10:15:18 -07:00
Scott Wittenburg
46c1a8e4c6 gitlab ci: Rework how mirrors are configured (#39939)
Improve how mirrors are used in gitlab ci, where until now they were treated
as plain strings.

By configuring ci mirrors ahead of time using the proposed mirror templates,
and by taking advantage of the expressiveness that spack now has for mirrors,
this PR will allow us to easily switch the protocol/url we use for fetching
binary dependencies.

This change also deprecates some gitlab functionality and marks it for
removal in Spack 0.23:

    - arguments to "spack ci generate":
        * --buildcache-destination
        * --copy-to
    - gitlab configuration options:
        * enable-artifacts-buildcache
        * temporary-storage-url-prefix
2023-10-19 11:04:59 -05:00
Satish Balay
b2d3e01fe6 petsc: add variant +sycl (#40562)
* petsc: add variant +sycl

* petsc: add gmake as a dependency, so that a consistent make gets used between petsc and slepc builds [which can have a different env for each build]
2023-10-19 07:31:02 -07:00
Harmen Stoppels
681639985a ci: remove incorrect compilers.yaml (#40610) 2023-10-19 16:11:42 +02:00
Massimiliano Culpo
a1ca1a944a ASP-based solver: single Spec instance per dag hash (#39590)
Reused specs used to be referenced directly in the built spec.

This might cause issues like in issue 39570 where two objects in
memory represent the same node, because two reused specs were
loaded from different sources but referred to the same spec
by DAG hash.

The issue is solved by copying concrete specs to a dictionary keyed
by dag hash.
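
A sketch of the deduplication idea (FakeSpec stands in for a real spec; Spack
specs do provide a `dag_hash()` method):

```python
class FakeSpec:
    def __init__(self, name, hash_):
        self.name, self._hash = name, hash_

    def dag_hash(self):
        return self._hash

# Two objects loaded from different sources, but with the same DAG hash
reused = [FakeSpec("zlib", "abc123"), FakeSpec("zlib", "abc123")]

deduped = {}
for spec in reused:
    deduped.setdefault(spec.dag_hash(), spec)  # first occurrence wins

assert len(deduped) == 1  # a single instance per DAG hash
```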
2023-10-19 16:00:45 +02:00
Tamara Dahlgren
4f49f7b9df Stand-alone test feature deprecation postponed to v0.22 (#40600) 2023-10-19 06:03:54 -06:00
Aiden Grossman
fb584853dd byte-unixbench: respect compiler choice (#39242)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-10-19 02:20:34 -06:00
Aiden Grossman
cc47b06756 connect-proxy: respect compiler choice (#39243) 2023-10-19 09:58:58 +02:00
Aiden Grossman
b68a620fc2 bioawk: respect compiler choice (#39241) 2023-10-19 09:55:57 +02:00
Aiden Grossman
e417ca54a0 busybox: respect compiler choice (#39239) 2023-10-19 09:55:06 +02:00
Michael Kuhn
5bbf8454d0 julia: Fix build for @1.9 (#39045)
julia@1.9 tries to download ittapi, which requires cmake. Disable it
explicitly.
2023-10-19 09:09:45 +02:00
Annop Wongwathanarat
67b8dd0913 acfl: add version 23.10 (#40510) 2023-10-18 17:07:40 -07:00
Harmen Stoppels
a42eb0d2bd unparse: also support generic type aliases (#40328) 2023-10-18 23:16:05 +02:00
Harmen Stoppels
294e659ae8 AutotoolsPackage / MakefilePackage: add gmake build dependency (#40380) 2023-10-18 19:56:54 +02:00
Harmen Stoppels
55198c49e5 llvm: fix ncurses+termlib linking in lldb (#40594) 2023-10-18 19:04:49 +02:00
Harmen Stoppels
dc071a3995 Fix dev-build keep_stage behavior (#40576)
`spack dev-build` would incorrectly set `keep_stage=True` for the
entire DAG, including for non-dev specs, even though the dev specs
have a DIYStage which never deletes sources.
2023-10-18 11:44:26 +00:00
Lydéric Debusschère
db5d0ac6ac [fix] py-werkzeug: add constraint in python dependence (#40590)
py-werkzeug@:0.12 does not work with python@3.10:

Test with py-werkzeug 0.12.2 and python 3.10:
```
$ python3.10 -c 'import werkzeug'
py-werkzeug-0.12.2/lib/python3.11/site-packages/werkzeug/datastructures.py", line 16, in <module>
from collections import Container, Iterable, MutableSet
ImportError: cannot import name 'Container' from 'collections'
```

Test with py-werkzeug 0.12.2 and python 3.9:
```
python3.9 -c "from collections import Container"
<string>:1: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.10 it will stop working
```
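
A constraint along these lines in the package recipe expresses the
incompatibility (a sketch; the exact bounds in the merged PR may differ):

```python
from spack.package import *

class PyWerkzeug(PythonPackage):
    # Sketch: old werkzeug imports ABCs that were removed from
    # `collections` in Python 3.10
    depends_on("python@:3.9", when="@:0.12", type=("build", "run"))
```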
2023-10-18 13:04:21 +02:00
Aiden Grossman
2802013dc6 Add license directive (#39346)
This patch adds a license directive to get the ball rolling on adding license
information about packages to Spack. I'm primarily interested in just getting
license data into Spack, but this would also help with other efforts that people
are interested in, such as adding license information to the ASP solve for
concretization to make sure licenses are compatible.

Usage:

Specifying the specific license that a package is released under in a project's
`package.py` is good practice. To specify a license, find the SPDX identifier for
a project and then add it using the license directive:

```python
   license("<SPDX Identifier HERE>")
```

For example, for Apache 2.0, you might write:

```python
   license("Apache-2.0")
```

Note that specifying a license without a when clause makes it apply to all
versions and variants of the package, which might not actually be the case.
For example, a project might have switched licenses at some point or have
certain build configurations that include files that are licensed differently.
To account for this, you can specify when licenses should be applied. For
example, to specify that a specific license identifier should only apply
to versions up to and including 1.5, you could write the following directive:

```python
   license("MIT", when="@:1.5")
```
2023-10-18 03:58:19 -07:00
Greg Becker
37bafce384 abi.py: fix typo, add type-hints (#38216)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-10-18 11:22:55 +02:00
jfavre
da0813b049 paraview: add variant for NVIDIA IndeX (#40577)
* add variant for NVIDIA IndeX
* remove whitespaces
2023-10-17 13:03:41 -06:00
Dennis Klein
e2bb2595b3 xmlto: add more dependencies (#40578)
`xmllint` is called by `xmlto` during generation of `libzmq`'s docs, so
`libxml2` is added as a dependency.

The docbook deps and the patches are taken from
https://src.fedoraproject.org/rpms/xmlto/blob/rawhide/f/xmlto.spec

There are still many more dependencies missing, but this is out of scope
for this patch (which is only concerned with the `libzmq` use case).
2023-10-17 11:58:46 -07:00
Mikael Simberg
b7cbcfdcab Add tracy 0.10 (#40573) 2023-10-17 11:35:55 -07:00
Peter Scheibel
9cde25b39e Allow / in GitVersion (#39398)
This commit allows version specifiers to refer to git branches that contain
forward slashes. For example, the following is valid syntax now:

    pkg@git.releases/1.0
   
It also adds a new method `Spec.format_path(fmt)` which is like `Spec.format`,
but also maps unsafe characters to `_` after interpolation. The difference is
as follows:

    >>> Spec("pkg@git.releases/1.0").format("{name}/{version}")
    'pkg/git.releases/1.0'

    >>> Spec("pkg@git.releases/1.0").format_path("{name}/{version}")
    'pkg/git.releases_1.0'

The `format_path` method is used in all projections. Notice that this method
also maps `=` to `_`

    >>> Spec("pkg@git.main=1.0").format_path("{name}/{version}")
    'pkg/git.main_1.0'
   
which should avoid syntax issues when `Spec.prefix` is literally copied into a
Makefile, as sometimes happens in AutotoolsPackage or MakefilePackage.
2023-10-17 20:33:59 +02:00
Rocco Meli
49ea0a8e2e Add mpi_f08 variant to CP2K (#40574)
* add mpi_f08 variant
* add conflict
* add conflict with released versions of cp2k and +mpi_f08
2023-10-17 11:33:13 -07:00
Adam J. Stewart
d317ddfebe py-rtree: add v1.1.0 (#40575) 2023-10-17 11:25:18 -07:00
Cameron Rutherford
b1eef4c82d hiop: 1.0.1 release (#40580) 2023-10-17 11:14:17 -07:00
Harmen Stoppels
a4ad365de0 patchelf: fix compilation with GCC 7 (#40581) 2023-10-17 11:12:09 -07:00
wspear
8c257d55b4 Support apple-clang in pdt (#40582) 2023-10-17 11:23:15 -06:00
Harmen Stoppels
bd165ebc4d Support spack env activate --with-view <name> <env> (#40549)
Currently `spack env activate --with-view` exists, but is a no-op.

So, it is not too much of a breaking change to make this redundant flag
accept a value `spack env activate --with-view <name>` which activates
a particular view by name.

The view name is stored in `SPACK_ENV_VIEW`.

This also fixes an issue where deactivating a view that was activated
with `--without-view` possibly removes entries from PATH, since now we
keep track of whether the default view was "enabled" or not.
2023-10-17 15:40:48 +02:00
Massimiliano Culpo
348e5cb522 packages: use "requires" to allow only selected compilers (#40567)
A few packages have encoded an idiom that pre-dates the introduction
of the 'requires' directive, and they cycle over all compilers
to conflict with the ones that are not supported.

Here instead we reverse the logic, and require the ones that
are supported.
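
In a recipe, the reversal looks roughly like this (a sketch with hypothetical
compiler choices):

```python
from spack.package import *

class Example(Package):
    # Old idiom: cycle over all compilers and conflict with the
    # unsupported ones. New idiom (sketch): require a supported one.
    requires(
        "%gcc",
        "%oneapi",
        policy="one_of",
        msg="example builds only with GCC or oneAPI",
    )
```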
2023-10-17 08:38:06 +02:00
Patrick Bridges
7cc17f208c Creation of Beatnik package and associated updates to silo and cabana spack package (#40382)
* Added initial package for building Beatnik with spack

* Fixed github ID for Jason as a maintainer.

* Major revision of beatnik spack package to properly support GPU spack builds with CUDA (and ROCm, though that is untested)

* Marked that beatnik 1.0 will require cabana 0.6.0. We will wait for the cabana 0.6.0 release before we release beatnik

* Update to beatnik package spec to compile with hipcc when +rocm

* Updated spack package for cabana for version 0.6.0 and appropriate heffte dependency

* Updated beatnik package to require cabana 0.6.0

* More updates to cabana and beatnik to build with cabana 0.6.0

* Finish removing BLT dependence from beatnik

* More updates to beatnik package for compiling on cray systems

* Updated beatnik package for new cabana package

* Changes to silo package for new silo version

* Fixed version specs for heffte to be able to concretize and build

* Fixed spack style issues for beatnik and silo packages

* More spack formatting fixes to beatnik and silo

* Patrick adopting silo package as maintainer for now

* Should address final style changes to beatnik package spec

* Yet more style fixes.

* Perhaps this is the final style fixes? :)

* Minor fix to cabana package on required versions
2023-10-16 21:13:31 -06:00
Eric Berquist
2913cd936a Add latest versions of rlwrap (#40563)
* Add latest versions of rlwrap
* rlwrap: fix URL for v0.46.1
2023-10-16 20:28:51 -06:00
Stephen Sachs
361a185ddb intel-oneapi-compilers: ifx is located in bin not bin/intel64 (#40561)
This is a fix on top of https://github.com/spack/spack/pull/40557 .
Tagging @rscohn2 for review.
2023-10-16 20:23:46 -06:00
Seth R. Johnson
9d5615620a py-furo: new version (#40559) 2023-10-16 17:10:31 -04:00
Adam J. Stewart
ae185087e7 py-grayskull: add new package (#40293)
* py-grayskull: add new package

* [@spackbot] updating style on behalf of adamjstewart

---------

Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-10-16 15:37:50 -05:00
Massimiliano Culpo
4a96d29e69 Use string representation of deptypes for concrete specs (#40566) 2023-10-16 22:36:22 +02:00
Adam J. Stewart
1e44f33163 py-fiona: add v1.9.5 (#40497) 2023-10-16 15:34:13 -05:00
Adam J. Stewart
348493abcd py-shapely: add v2.0.2 (#40523) 2023-10-16 15:33:56 -05:00
Adam J. Stewart
2bc4bfa877 py-grpcio: cython 3 still not supported (#40537) 2023-10-16 15:33:32 -05:00
Adam J. Stewart
3e3b287761 py-lightning: add v2.1.0 (#40496) 2023-10-16 14:14:46 -05:00
renjithravindrankannath
74bbb1ef1b Updating patch to enable flag mcode-object-version=none (#40367)
* Updating patch to add flag mcode-object-version=none when
   device libs are built as part of llvm-amdgpu
* Limiting patch to +rocm-device-libs variant and adding
   appropriate comment for the patch
* Updating llvm patch as per the mainline code
   Updating hsa-rocr patch as per the latest code
   Updating the if/elif condition for the hip test src path
* Updating flags for 5.5 releases and above
* Updating build flags and patches
2023-10-16 10:25:10 -07:00
Dom Heinzeller
22405fbb68 Fix version incompatibilities of py-pandas and py-openpyxl (#40472)
* Fix version incompatibilities of py-pandas and py-openpyxl

* Add variant excel for py-pandas

* Add package py-pyxlsb

* Add versions for py-xlsxwriter

* Define excel dependencies for py-pandas 1.4, 1.5, 2.0, 2.1

* Fix variant excel in py-pandas

* Add package py-odfpy, which is also a dependency for py-pandas@2.0:

* Rearrange excel dependencies for py-pandas

* Change url to pypi

* Add missing newline to fix style in py-odfpy
2023-10-16 10:28:38 -06:00
Diego Alvarez S
14d935bd6c Add nextflow 23.10.0 (#40547) 2023-10-16 09:06:36 -07:00
Garth N. Wells
363b9d3c7b fenics-basix: update for v0.7 (#40440)
* Update for Basix 0.7

* Version fix for nanobind dependency

* Simplification

* Version update and simplify dependencies

* Add comment on location of pyproject.toml

* Update var/spack/repos/builtin/packages/py-fenics-basix/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-16 08:53:02 -06:00
Stephen Sachs
8347ae3766 intel-oneapi-compilers: ifx uses --gcc-name & --gxx-name (#40557)
`ifx` uses the older syntax instead of `--gcc-toolchain`. Tested up to version
2023.2.0.
2023-10-16 10:24:21 -04:00
Garth N. Wells
1106f6b9f2 py-fenics-ufl: update version and add test (#40534)
* Update py-ufl version

* Syntax fix

* Syntax fix

* Add test

* Updates following comments
2023-10-16 06:52:59 -05:00
Lydéric Debusschère
e22117304e [add] py-cylc-uiserver: new recipe (#39983)
* [add] py-cylc-uiserver: new recipe

* py-cylc-uiserver: remove version constraint on the python dependency

* [fix] py-cylc-uiserver: add forgotten dependency py-graphql-core

---------

Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-16 06:51:11 -05:00
Vanessasaurus
10999c0283 fix: flux-core needs libarchive with +iconv (#40541)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2023-10-15 17:43:10 -06:00
Todd Gamblin
7adeee0980 README.md: tweak matrix description to indicate bridging (#40540)
This tweaks the matrix description to indicate that it's bridged with Slack,
so people don't think they're missing out (even though the icon says there are
only 3 users on Matrix).
2023-10-15 22:48:05 +00:00
Harmen Stoppels
a9cfa32c34 spack checksum: handle all versions dropped better (#40530)
* spack checksum: fix error when all versions are dropped

* add test
2023-10-15 15:08:11 -07:00
Garth N. Wells
718aa8b82f Version update and simplify dependencies (#40543) 2023-10-15 15:53:03 -06:00
Adam J. Stewart
dbf3bed380 py-torchdata: version rename (#40522) 2023-10-15 13:22:49 -07:00
Adam J. Stewart
ef55c7c916 Python: allow OneAPI 2024 when it's released (#40536) 2023-10-15 11:18:04 -06:00
Adam J. Stewart
2015d3d2bc py-click: fix Python 3.6 support (#40535) 2023-10-15 19:16:51 +02:00
Alec Scott
76bac6d4bf Add matrix space link and badge to README (#40532) 2023-10-15 09:28:08 +00:00
Miroslav Stoyanov
b960d476e3 tasmanian: patch for clang17 (#40515) 2023-10-15 00:07:15 -05:00
Veselin Dobrev
8a311d7746 mfem: add a patch for v4.6 for gcc 13, see mfem PR 3903 (#40495) 2023-10-15 00:01:53 -05:00
Miroslav Stoyanov
39d2baec8a heffte: fix rocm deps (#40514) 2023-10-14 23:55:01 -05:00
Dom Heinzeller
26e063177d Bug fixes in py-awscrt to fix build errors reported in #40386 (#40469)
* Bug fix in var/spack/repos/builtin/packages/py-awscrt/package.py: on Linux, tell aws-crt-python to use libcrypto from spack (openssl)

* Bug fix in var/spack/repos/builtin/packages/py-awscrt/package.py: add missing build dependencies cmake (for all), openssl (for linux)

* Update var/spack/repos/builtin/packages/py-awscrt/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-14 21:02:41 +00:00
Alex Richert
149d1946ee Add static support for proj (#40322)
* Add static-only option for proj

* Update proj

* update proj

* Update package.py

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update package.py

* proj: Add pic and static variant support for cmake
2023-10-14 13:09:24 -05:00
Manuela Kuhn
3604f6238d py-pydicom: add 2.4.3 (#40487) 2023-10-14 13:06:51 -05:00
Manuela Kuhn
2ad9470670 py-pybids: add 0.16.3 (#40486) 2023-10-14 13:05:36 -05:00
Manuela Kuhn
8dde74854a py-cfgv: add 3.4.0 (#40465) 2023-10-14 13:03:43 -05:00
Manuela Kuhn
fa5aadbbc0 py-charset-normalizer: add 3.3.0 (#40471) 2023-10-14 12:56:59 -05:00
Manuela Kuhn
0989cb8866 py-click: add 8.1.7 (#40473) 2023-10-14 12:56:14 -05:00
Manuela Kuhn
b536260eb5 py-chardet: add 5.2.0 (#40468) 2023-10-14 12:47:01 -05:00
Manuela Kuhn
8f2de4663e py-certifi: add 2023.7.22 (#40445) 2023-10-14 12:41:34 -05:00
Manuela Kuhn
0693892521 py-bidskit: add 2023.9.7 (#40444) 2023-10-14 12:40:40 -05:00
Manuela Kuhn
3be78717d2 py-pyqt6: add 6.5.2 (#40413) 2023-10-14 12:34:45 -05:00
Sam Gillingham
655d123785 py-python-fmask: update to latest versions (#40378)
* tidy and add new version

* add comment about dependencies

* whitespace
2023-10-14 12:28:51 -05:00
Lydéric Debusschère
b8cb36ce50 [add] py-graphene-tornado: new recipe, required by py-cylc-uiserver (#39985)
* [add] py-graphene-tornado: new recipe, required by py-cylc-uiserver

* py-graphene-tornado: take review comments into account

* py-graphene-tornado: add run type to dependencies py-jinja, py-tornado and py-werkzeug

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-14 12:25:02 -05:00
Manuela Kuhn
bc3cd02776 py-urllib3: add 2.0.6 (#40207)
* py-urllib3: add 2.0.5

* Add py-brotli package

* Group brotli dependencies and make limits more specific

* Add minimum version limits to variants

* Remove python upper limit for py-brotli

* Fix restrictions for py-brotli dependency

* Fix py-brotli dependency

* py-urllib3: add 2.0.6
2023-10-14 12:23:54 -05:00
Seth R. Johnson
a027adcaa2 cpr: new package (#40509)
* New package: cpr

* Support libcpr version 1.9

* Fix build phase for git

* Update var/spack/repos/builtin/packages/cpr/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* [@spackbot] updating style on behalf of sethrj

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-10-14 08:31:53 -07:00
Harmen Stoppels
3783032d28 screen: add v4.9.1 (#40529) 2023-10-14 08:29:55 -07:00
Harmen Stoppels
c0ac5e3f6b git: add 2.42 (#40528) 2023-10-14 08:28:52 -07:00
Michael Kuhn
87371d58d5 libbson, mongo-c-driver: add 1.24.4 (#40518) 2023-10-14 11:38:00 +02:00
Michael Kuhn
ef11fd7f75 libfuse: add 3.16.2 (#40519) 2023-10-14 11:37:07 +02:00
Michael Kuhn
d0f046e788 mariadb-c-client: add 3.3.7 (#40521) 2023-10-14 11:35:46 +02:00
Michael Kuhn
794fb9b252 rocksdb: add 8.6.7 (#40525) 2023-10-14 11:34:32 +02:00
Michael Kuhn
86c7d646c3 Fix pkgconfig dependencies (#40524)
`pkgconfig` is the correct virtual dependency.
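
For reference, the idiomatic form of such a dependency in a recipe (a minimal
sketch):

```python
from spack.package import *

class Example(AutotoolsPackage):
    # Depend on the virtual "pkgconfig"; either pkg-config or pkgconf
    # can satisfy it at concretization time.
    depends_on("pkgconfig", type="build")
```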
2023-10-14 11:33:36 +02:00
Michael Kuhn
7d96077667 glib: add 2.78.0, 2.76.6 (#40517) 2023-10-14 11:33:16 +02:00
Michael Kuhn
a6fbfedc08 sqlite: add 3.43.2 (#40520) 2023-10-14 11:32:25 +02:00
Alex Richert
a6cfeabc10 cairo: add shared and pic variants (#40302) 2023-10-13 14:21:43 -06:00
Martin Aumüller
a3a29006aa wayland: dot is a build dependency (#39854)
* wayland: dot is a build dependency

otherwise this build failure happens:
../spack-src/doc/meson.build:5:6: ERROR: Program 'dot' not found or not executable

* wayland: make building of documentation optional

renders several dependencies optional
2023-10-13 21:57:13 +02:00
Harmen Stoppels
a5cb7a9816 spack checksum: improve interactive filtering (#40403)
* spack checksum: improve interactive filtering

* fix signature of executable

* Fix restart when using editor

* Don't show [x version(s) are new] when no known versions (e.g. in spack create <url>)

* Test ^D in test_checksum_interactive_quit_from_ask_each

* formatting

* colorize / skip header on invalid command

* show original total, not modified total

* use colify for command list

* Warn about possible URL changes

* show possible URL change as comment

* make mypy happy

* drop numbers

* [o]pen editor -> [e]dit
2023-10-13 19:43:22 +00:00
Gabriel Cretin
edf4aa9f52 Fpocket: fix installation (#40499)
* Fpocket: fix edit() positional args + add install()

* Remove comments

* Fix line too long

* Fix line too long

* Remove extension specification in version

Co-authored-by: Alec Scott <alec@bcs.sh>

* Use f-strings

Co-authored-by: Alec Scott <alec@bcs.sh>

* Fix styling

* Use the default MakefilePackage install stage

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-10-13 11:30:20 -07:00
Dom Heinzeller
02c680ec3a texinfo package: fix external detection (#40470)
A complete texinfo install includes both `info` and `makeinfo`. Some
system installations of texinfo may exclude one or the other. This
updates the external finding logic to require both.
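
The updated rule amounts to something like this (hypothetical helper, not the
package's actual detection code):

```python
import os

# Hypothetical sketch: only treat a prefix as a usable external texinfo
# if both `info` and `makeinfo` are present.
def looks_like_complete_texinfo(executables):
    names = {os.path.basename(exe) for exe in executables}
    return {"info", "makeinfo"} <= names

print(looks_like_complete_texinfo(["/usr/bin/info"]))  # False
print(looks_like_complete_texinfo(["/usr/bin/info", "/usr/bin/makeinfo"]))  # True
```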
2023-10-13 10:39:08 -07:00
David Huber
8248e180ca Add gsi-ncdiag v1.1.2 (#40508) 2023-10-13 08:48:23 -07:00
Harmen Stoppels
c9677b2465 Expand multiple build systems section (#39589)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-10-13 14:59:44 +02:00
Massimiliano Culpo
3752fe9e42 Better error message when wrong platform is used (#40492)
fixes #40299
2023-10-13 11:18:55 +02:00
Matthew Chan
8a0de10f60 containerize: update docs to activate env before using container templates (#40493) 2023-10-13 06:59:44 +00:00
Adam J. Stewart
6aa8d76e32 py-cmocean: add v3.0.3 (#40482) 2023-10-12 20:32:48 -06:00
Nils Vu
fb1d0f60d9 catch2: add +pic and +shared options (#40337)
Also add latest version
2023-10-12 20:13:01 -06:00
Martin Aumüller
728eaa515f ospray: new versions 2.11.0 and 2.12.0 (#40394)
* openimagedenoise: checksum 2.0.1
* ospray: new versions 2.11.0 and 2.12.0
  - both depend on embree@4
  - also update dependency versions for rkcommon, openvkl, openimagedenoise and ispc
  - expose that dependency on openvkl is optional since @2.11 with variant "volumes"
* ospray: limit embree to @3 for ospray @:2.10
2023-10-12 14:13:15 -07:00
Julius Plehn
7c354095a9 Updates Variorum to 0.7.0 (#40488) 2023-10-12 11:27:06 -06:00
Harmen Stoppels
64ef33767f modules:prefix_inspections: allow empty dict (#40485)
Currently

```
modules:
  prefix_inspections:: {}
```

gives you the builtin defaults instead of no mapping.
2023-10-12 09:28:16 -07:00
Dennis Klein
265432f7b7 libzmq: Revert "libzmq: make location of libsodium explicit (#34553)" (#40477)
and make variants independent of upstream defaults
2023-10-12 09:15:00 -06:00
dependabot[bot]
aa7dfdb5c7 build(deps): bump python-levenshtein in /lib/spack/docs (#40461)
Bumps [python-levenshtein](https://github.com/maxbachmann/python-Levenshtein) from 0.22.0 to 0.23.0.
- [Release notes](https://github.com/maxbachmann/python-Levenshtein/releases)
- [Changelog](https://github.com/maxbachmann/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/maxbachmann/python-Levenshtein/compare/v0.22.0...v0.23.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-12 14:38:33 +00:00
Alec Scott
bfe37435a4 go: add v1.21.3 and deprecate previous versions due to CVE-2023-39533 (#40454) 2023-10-12 07:40:04 -06:00
Adam J. Stewart
285a50f862 PyTorch: fix build with Xcode 15 (#40460) 2023-10-12 07:27:03 -06:00
Harmen Stoppels
995e82e72b gettext: Add 0.22.3 and fix keyerror: "shared" (#39423)
After the merge of #37957 (Add static and pic variants), if a gettext install
from a build before that merge is present, building any package using gettext
fails with KeyError: "shared", because the use of self.spec.variants["shared"]
does not check for the presence of the new variant in the old installation
but expects that the new key variants["shared"] always exists.

Fix it with a fallback to the default of True and update gettext to v0.22.3
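
The fallback amounts to something like this (a sketch, not the exact diff):

```python
def is_shared(spec):
    # Sketch: old gettext installs predate the "shared" variant; fall back
    # to the variant's default (True) instead of assuming the key exists.
    if "shared" in spec.variants:
        return spec.variants["shared"].value
    return True  # default for installs from before #37957
```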

Co-authored-by: Bernharad Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2023-10-12 04:40:38 -06:00
Massimiliano Culpo
3935e047c6 Remove deprecated "extra_instructions" option for containers (#40365) 2023-10-12 12:12:15 +02:00
Harmen Stoppels
0fd2427d9b clingo: fix build with Python 3.12 (#40154) 2023-10-12 12:11:22 +02:00
Alec Scott
30d29d0201 acfl: use f-strings (#40433) 2023-10-12 11:30:22 +02:00
Tim Haines
3e1f2392d4 must: add versions 1.8.0 and 1.9.0 (#40141) 2023-10-11 22:34:22 -07:00
Lydéric Debusschère
6a12a40208 [add] py-cylc-rose: new recipe (#39980)
* [add] py-cylc-rose: new recipe

* py-cylc-rose: update recipe

---------

Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-11 22:31:46 -07:00
Leonhard Reichenbach
90e73391c2 opendatadetector: add version v3.0.0 (#39693) 2023-10-11 22:29:00 -07:00
Alec Scott
deec1b7c2e adios2: use f-strings (#40437) 2023-10-11 21:08:00 -06:00
Scott Wittenburg
d9cb1a1070 buildcache: Tell servers not to cache index or hash (#40339) 2023-10-11 18:13:57 -06:00
afzpatel
01747b50df fix ck build for 5.6.1 (#40304)
* initial commit to fix ck build for 5.6.1
* disable mlir for miopen-hip
* use satisfies for checking specs and add nlohmann-json dependency for 5.4 onwards
2023-10-11 14:43:21 -07:00
Alex Richert
df01a11e07 apr-util: Fix spack install apr-util +crypto ^openssl~shared (#40301) 2023-10-11 23:38:28 +02:00
Victor Brunini
7a4b479724 cmake: drop CMAKE_STATIC_LINKER_FLAGS (#40423)
Because those end up being passed to ar, which does not understand linker
arguments. This was making ldflags largely unusable for statically
linked cmake packages.
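
The distinction, roughly (a sketch with made-up paths):

```python
# Sketch: ldflags are meaningful to the linker driver, not to `ar`.
ldflags = "-L/opt/lib -Wl,-rpath,/opt/lib"
args = [
    f"-DCMAKE_EXE_LINKER_FLAGS={ldflags}",
    f"-DCMAKE_SHARED_LINKER_FLAGS={ldflags}",
    # no -DCMAKE_STATIC_LINKER_FLAGS: CMake passes those to ar, which
    # would reject linker arguments like -L or -Wl,...
]
```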
2023-10-11 23:30:44 +02:00
Harmen Stoppels
89e34d56a1 curl: add v8.4.0, allow r@8.3: to use it (#40442)
* curl: 8.4.0

* fix r curl upperbound range
2023-10-11 21:04:09 +02:00
Miroslav Stoyanov
a5853ee51a update for the tasmanain versions (#40453) 2023-10-11 12:33:31 -06:00
Alec Scott
537ab48167 acts: use f-strings (#40434) 2023-10-11 11:03:30 -07:00
Alec Scott
e43a090877 abduco: use f-string (#40432) 2023-10-11 19:51:26 +02:00
Alec Scott
275a2f35b5 adiak: use f-strings (#40435) 2023-10-11 10:48:39 -07:00
Alec Scott
dae746bb96 abacus: use f-string (#40431) 2023-10-11 19:48:32 +02:00
Alec Scott
3923b81d87 adios: use f-strings (#40436) 2023-10-11 10:47:42 -07:00
Alec Scott
5d582a5e48 7zip: use f-strings (#40430) 2023-10-11 19:46:23 +02:00
Alec Scott
7dbc712fba 3proxy: use f-strings (#40429) 2023-10-11 19:44:57 +02:00
Alec Scott
639ef9e24a adol-c: use f-strings (#40438) 2023-10-11 10:43:31 -07:00
Alex Richert
86d2200523 krb5: Fix spack install krb5 ^openssl~shared (#40306) 2023-10-11 19:37:48 +02:00
Massimiliano Culpo
fe6860e0d7 cmake: add v3.27.7 (#40441) 2023-10-11 10:20:49 -07:00
Manuela Kuhn
8f2e68aeb8 c-blosc2: add 2.10.5 (#40428) 2023-10-11 19:19:58 +02:00
Laura Weber
bc4c887452 tecplot: Add version 2023r1 (#40425) 2023-10-11 10:05:37 -07:00
Alec Scott
b3534b4435 restic: add v0.16.0 (#40439) 2023-10-11 19:04:26 +02:00
Massimiliano Culpo
861bb4d35a Update bootstrap buildcache to support Python 3.12 (#40404)
* Add support for Python 3.12
* Use optimized build of clingo
2023-10-11 19:03:17 +02:00
Harmen Stoppels
65e7ec0509 spider: respect <base> tag (#40443) 2023-10-11 08:49:50 -07:00
Manuela Kuhn
1ab8886695 qt-base: fix build without opengl (#40421) 2023-10-11 08:56:59 -04:00
dependabot[bot]
26136c337f build(deps): bump mypy from 1.5.1 to 1.6.0 in /.github/workflows/style (#40422)
Bumps [mypy](https://github.com/python/mypy) from 1.5.1 to 1.6.0.
- [Commits](https://github.com/python/mypy/compare/v1.5.1...v1.6.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-11 06:35:51 -06:00
dependabot[bot]
e3b71b32aa build(deps): bump mypy from 1.5.1 to 1.6.0 in /lib/spack/docs (#40424)
Bumps [mypy](https://github.com/python/mypy) from 1.5.1 to 1.6.0.
- [Commits](https://github.com/python/mypy/compare/v1.5.1...v1.6.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-11 12:35:41 +00:00
Alec Scott
6d1711f4c2 Update legacy .format() calls to fstrings in installer.py (#40426) 2023-10-11 12:35:37 +00:00
Massimiliano Culpo
26f291ef25 spack buildcache: fix a typo in a function call (#40446)
fixes #40415
2023-10-11 13:09:21 +02:00
Martin Aumüller
da030617a1 botan: checksum 3.2.0 (#40417) 2023-10-10 21:52:51 -06:00
Martin Aumüller
1ebfcd3b18 openvkl: add 1.3.2 (#40392)
* openvkl: add 1.3.2

works with (and requires) embree@4

* openvkl: simplify formatting with f-string

thank you for the suggestion in the review
2023-10-10 21:24:17 -06:00
Edward Hartnett
d385a57da3 w3emc: add v2.11.0 (#40376)
* added version 2.11.0

* more fixes
2023-10-10 15:11:14 -07:00
eugeneswalker
37df8bfc73 e4s arm stack: duplicate and target neoverse v1 (#40369)
* e4s arm stack: duplicate and target both neoverse n1, v1

* remove neoverse_n1 target until issue #40397 is resolved
2023-10-10 14:32:51 -07:00
Adam J. Stewart
b781a530a1 GCC: fix build with Apple Clang 15 (#40318) 2023-10-10 22:35:15 +02:00
Harmen Stoppels
390b0aa25c More helpful error when patch lookup fails (#40379) 2023-10-10 21:09:04 +02:00
Adam J. Stewart
620835e30c py-jupyter-packaging: remove duplicate packages (#38671)
* py-jupyter-packaging: remove duplicate packages

* Allow py-jupyter-packaging to be duplicated in DAG

* Deprecate version of py-jupyterlab that requires py-jupyter-packaging at run-time
2023-10-10 13:50:22 -05:00
Adam J. Stewart
da10487219 Update PyTorch ecosystem (#40321)
* Update PyTorch ecosystem

* py-pybind11: better documentation of supported compilers

* py-torchdata: add v0.7.0

* Black fixes

* py-torchtext: fix Python reqs

* py-horovod: py-torch 2.1.0 not yet supported
2023-10-10 13:45:32 -05:00
Miroslav Stoyanov
4d51810888 find rocm fix (#40388)
* find rocm fix
* format fix
* style fix
* formatting is broken
2023-10-10 10:42:26 -07:00
Harmen Stoppels
6c7b2e1056 git: optimize build by not setting CFLAGS (#40387) 2023-10-10 11:48:06 +02:00
Carlos Bederián
749e99bf11 python: add 3.11.6 (#40384) 2023-10-10 09:53:04 +02:00
Martin Aumüller
6db8e0a61e embree: checksum 4.3.0 (#40395) 2023-10-09 20:44:10 -06:00
Martin Aumüller
6fe914421a rkcommon: checksum 0.11.0 (#40391) 2023-10-09 20:43:56 -06:00
Andrew W Elble
9275f180bb openmm: new version 8.0.0 (#40396) 2023-10-09 20:33:40 -06:00
Tom Epperly
2541b42fc2 Add a new release sha256 hash (#37680) 2023-10-09 20:28:45 -06:00
Auriane R
fb340f130b Add pika 0.19.1 (#40385) 2023-10-09 20:23:54 -06:00
Dennis Klein
d2ddd99ef6 libzmq: add v4.3.5 (#40383) 2023-10-09 17:30:46 -06:00
kenche-linaro
492a8111b9 linaro-forge: added package file for rebranded product (#39587) 2023-10-09 12:14:14 -07:00
George Young
d846664165 paintor: new package @3.0 (#40359)
* paintor: new package @3.0
* Update var/spack/repos/builtin/packages/paintor/package.py
---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-10-09 09:54:44 -07:00
Vanessasaurus
31b3e4898b Add: flux-pmix 0.4.0 (#40323)
* Automated deployment to update package flux-pmix 2023-10-05

* Pin exactly to flux-core 0.49.0 when between 0.3 and 0.4

* Update var/spack/repos/builtin/packages/flux-pmix/package.py

Co-authored-by: Mark Grondona <mark.grondona@gmail.com>

* Update var/spack/repos/builtin/packages/flux-pmix/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: Mark Grondona <mark.grondona@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-10-09 09:29:18 -07:00
Harmen Stoppels
82f1267486 unparse: drop python 3.3 remnants (#40331) 2023-10-09 08:22:27 -07:00
Harmen Stoppels
19202b2528 docs: update Spack prerequisites (#40381) 2023-10-09 13:41:36 +00:00
Adam J. Stewart
831cbec71f py-pydevtool: add new package (#40377) 2023-10-09 15:09:59 +02:00
Patrick Broderick
bb2ff802e2 fftx: add v1.1.3 (#40283) 2023-10-09 15:06:59 +02:00
dependabot[bot]
83e9537f57 build(deps): bump python-levenshtein in /lib/spack/docs (#40220)
Bumps [python-levenshtein](https://github.com/maxbachmann/python-Levenshtein) from 0.21.1 to 0.22.0.
- [Release notes](https://github.com/maxbachmann/python-Levenshtein/releases)
- [Changelog](https://github.com/maxbachmann/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/maxbachmann/python-Levenshtein/compare/v0.21.1...v0.22.0)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-09 15:02:14 +02:00
Gavin John
3488e83deb py-s3cmd: Add new versions (#40212)
* Add new versions of py-s3cmd

* Use correct hashes
2023-10-09 07:46:38 -05:00
Jacob King
c116eee921 nimrod-aai: add v23.9. (#40303) 2023-10-09 14:39:40 +02:00
Adam J. Stewart
9cb291b41b py-jsonargparse: add v4.25.0 (#40185) 2023-10-09 14:33:45 +02:00
Thomas Dickerson
c0f1072dc7 racket packages: fix typo after multiple build systems support (#40088) 2023-10-09 14:21:13 +02:00
Jordan Galby
3108036533 elfutils: fix +debuginfod again with new libarchive versions (#40314) 2023-10-09 14:17:58 +02:00
Mike Renfro
215c699307 velvet: improved variants (#40225) 2023-10-09 12:47:54 +02:00
jmuddnv
f609093c6e Adding NVIDIA HPC SDK 23.9 (#40371) 2023-10-09 12:35:59 +02:00
Joe Schoonover
eb4fd98f09 feq-parse: add v2.0.3 (#40230) 2023-10-09 12:31:23 +02:00
Harmen Stoppels
08da9a854a parser: use non-capturing groups (#40373) 2023-10-09 07:18:27 +02:00
Mark (he/his) C. Miller
3a18fe04cc Update CITATION.cff with conf dates (#40375)
Add `start-date` and `end-date` to citation
2023-10-08 18:04:25 -07:00
Harmen Stoppels
512e41a84a py-setuptools: sdist + rpath patch backport (#40205) 2023-10-08 11:47:36 -06:00
Alex Richert
8089aedde1 gettext: Add static and pic options (#37957) 2023-10-08 17:42:21 +02:00
Manuela Kuhn
6b9e103305 py-tables: add 3.9.0 (#40340)
* py-tables: add 3.9.0

* Add conflict with apple-clang
2023-10-08 10:28:13 -05:00
Jen Herting
00396fbe6c [py-tokenizers] added version 0.13.3 (#40360) 2023-10-08 10:27:18 -05:00
Jen Herting
a3be9cb853 [py-tensorboardx] Added version 2.6.2.2 (#39731)
* [py-tensorboardx] Added version 2.6.2.2

* [py-tensorboardx] flake8

* [py-tensorboardx] requires py-setuptools-scm
2023-10-08 10:21:55 -05:00
Jen Herting
81f58229ab py-torch-sparse: add v0.6.17 (#39495)
* [py-torch-sparse] New version 0.6.17

* [py-torch-sparse] added dependency on parallel-hashmap

* [py-torch-sparse]

- spack only supports python@3.7:
- py-pytest-runner only needed with old versions
2023-10-08 10:20:46 -05:00
Jen Herting
2eb16a8ea2 [py-lvis] New package (#39080)
* [py-lvis] New package

* [py-lvis] flake8

* [py-lvis] os agnostic

* [py-lvis] added comment for imported dependency

* [py-lvis] style fix
2023-10-08 10:18:01 -05:00
Manuela Kuhn
9db782f8d9 py-bids-validator: add 1.13.1 (#40356)
* py-bids-validator: add 1.13.1

* Fix style
2023-10-08 10:16:08 -05:00
Lydéric Debusschère
633df54520 [add] py-cylc-flow: new recipe (#39986)
* [add] py-cylc-flow: new recipe

* py-cylc-flow: fix py-protobuf version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cylc-flow: fix py-colorama version

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-cylc-flow: update dependency on py-aiofiles

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-cylc-flow: update dependency on py-pyzmq

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-cylc-flow: remove useless dependency

* py-cylc-flow: fix indent

* py-cylc-flow: fix argument in depends_on; move lines

* py-cylc-flow: fix the type of the py-setuptools dependency

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-08 10:06:13 -05:00
Mark (he/his) C. Miller
e2a7f2ee9a Update CITATION.cff (#40363)
You will note the `Cite this repository` link is not working.

This commit fixes the underlying file...

* `authors` was not indented
* `authors` required by `preferred-citation`
* `authors` list required at top level (I simply duplicated)
* `"USA"` not correct country code
* `month` requires an integer month number
* Added URL to the actual pdf of the cited paper
* Used `identifiers` for doi and LLNL doc number
* added `abstract` copied from paper

Various fixes were confirmed by `cffconvert` using ``docker run -v `pwd`:/app citationcff/cffconvert --validate``
2023-10-07 17:44:31 -07:00
Massimiliano Culpo
28c49930e2 Remove warning for custom module configuration, when no module is enabled (#40358)
The warning was added in v0.20 and was slated for removal in v0.21
2023-10-07 09:21:04 +02:00
Ken Raffenetti
6c1868f8ae yaksa: add version 0.3 (#40368) 2023-10-06 22:28:15 -06:00
Massimiliano Culpo
4f992475f4 Python add v3.11.5 (#40330) 2023-10-06 18:03:00 -06:00
Alex Richert
7a358c9005 Change 'exit' to 'return' in setup-env.sh (#36137)
* Change 'exit' to 'return' in `setup-env.sh` to avoid losing the shell in some cases when sourcing twice.
2023-10-06 16:19:19 -07:00
Veselin Dobrev
b5079614b0 MFEM: add new version v4.6 (#40170)
* [mfem] Initial changes for v4.6

* [@spackbot] updating style on behalf of v-dobrev

* [mfem] Set the proper download link for v4.6
2023-10-06 15:31:23 -07:00
Alex Richert
482525d0f9 Update bufr recipe (#40033)
* Update bufr recipe
* Add v12.0.1
* style fixes
* remove test-related functionality for bufr
* Re-add testing

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-10-06 15:04:51 -07:00
snehring
599220924d wise2: adding new package wise2 (#40341) 2023-10-06 14:24:44 -07:00
Harmen Stoppels
d341be83e5 VersionRange: improve error message for empty range (#40345) 2023-10-06 14:19:49 -07:00
George Young
b027d7d0de metal: new package @2020-05-05 (#40355)
* metal: new package
* style

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-10-06 14:14:45 -07:00
Howard Pritchard
0357df0c8b openmpi: add 4.1.6 release (#40361)
related to #40232

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2023-10-06 13:06:37 -07:00
Edward Hartnett
f70ae6e3c4 g2: updated for 3.4.8 release (#40366)
* updated for 3.4.8 release
2023-10-06 13:02:17 -07:00
Manuela Kuhn
921ed1c21b py-expecttest: new package (#40347) 2023-10-06 14:34:47 -05:00
Manuela Kuhn
c95d43771a py-asttokens: add 2.4.0 (#40349) 2023-10-06 14:33:56 -05:00
Manuela Kuhn
db3d816f8b py-argcomplete: add 3.1.2 (#40348) 2023-10-06 14:32:21 -05:00
Manuela Kuhn
1d6a142608 py-anyio: add 4.0.0 (#40346) 2023-10-06 14:27:53 -05:00
George Young
98271c3712 topaz: new package @0.2.5 (#40352)
* topaz: new package @0.2.5

* switching over to pypi

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-10-06 13:00:03 -06:00
Manuela Kuhn
e3f6df884e py-zipp: add 3.17.0 (#40278)
* py-zipp: add 3.17.0

* Re-add python@3.7 dependency
2023-10-06 13:54:25 -05:00
Sam Gillingham
b0f36b2cd9 RIOS: add recent versions (#40243)
* add recent versions of RIOS

* fix depends_on syntax

* fix typo

* fix sha and add parallel variant

* remove self

* try doing in one
2023-10-06 13:52:15 -05:00
Lydéric Debusschère
5524492e25 [add] py-metomi-rose: new recipe, required by py-cylc-rose (#39981)
* [add] py-metomi-rose: new recipe, required by py-cylc-rose

* py-metomi-rose: remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere.tgcc@cea.fr>
2023-10-06 13:49:59 -05:00
Sam Gillingham
112f045352 py-tuiview: add recent versions of tuiview (#40244)
* add recent versions of tuiview and remove Qt4 version

* reformat

* fix stray tabs

* add back a deprecated 1.1.7

* tabs

* more tabs

* reformat

* comma
2023-10-06 13:49:06 -05:00
Harmen Stoppels
72ed8711a7 unparse: drop python 2 remnants (#40329) 2023-10-06 09:42:47 -07:00
Harmen Stoppels
55e0c2c900 openssh: 9.5p1 (#40354) 2023-10-06 09:37:42 -07:00
Massimiliano Culpo
e20c05fcdf Make "minimal" the default duplicate strategy (#39621)
* Allow branching out of the "generic build" unification set

For cases like the one in https://github.com/spack/spack/pull/39661
we need to relax rules on unification sets.

The issue is that, right now, nodes in the "generic build" unification
set are unified together with their build dependencies. This was done
out of caution to avoid the risk of circular dependencies, which would
ultimately cause a very slow solve.

For build tools like Cython, however, the build dependency is masked
by a long chain of "build, run" dependencies that belong in the
"generic build" unification space.

To allow splitting on cases like this, we relax the rule disallowing
branching out of the "generic build" unification set.

* Fix issue with pure build virtual dependencies

Pure build virtual dependencies were not accounted for properly in the
list of possible virtuals. This caused some facts connecting virtuals
to the corresponding providers to not be emitted, and in the end
led to unsat problems.

* Fixed a few issues in packages

py-gevent: restore dependency on py-cython@3
jsoncpp: fix typo in build dependency
ecp-data-vis-sdk: update spack.yaml and cmake recipe
py-statsmodels: add v0.13.5

* Make dependency on "blt" of type "build"
2023-10-06 10:24:21 +02:00
Harmen Stoppels
36183eac40 python: add 3.12.0 (but keep 3.11 preferred) (#40282) 2023-10-06 09:44:09 +02:00
Alex Richert
7254c76b68 ecFlow update (#40305)
* Support static openssl for ecflow

* Update ecflow/static openssl

* Update ssl settings in ecflow

* add pic variant for ecflow

* style fix

* Update package.py

* Update package.py
2023-10-05 20:04:44 -07:00
eugeneswalker
e0e6ff5a68 Revert "cray rhel: disable due to runner issues (#40324)" (#40335)
This reverts commit bf7f54449b.
2023-10-05 10:32:52 -07:00
snehring
b0d49d4973 pbmpi: adding new version and maintainer (#40319) 2023-10-05 10:13:50 -07:00
Harmen Stoppels
4ce5d14066 unparse: drop python 3.4 remnants (#40333) 2023-10-05 09:52:23 -07:00
Thomas Madlener
9e9653ac58 whizard: Make sure to detect LCIO if requested (#40316) 2023-10-05 08:53:56 -07:00
Auriane R
bec873aec9 Add pika 0.19.0 (#40313) 2023-10-05 09:34:22 +02:00
Harmen Stoppels
bf7f54449b cray rhel: disable due to runner issues (#40324) 2023-10-05 08:45:33 +02:00
Cameron Rutherford
9f0e3c0fed exago: add logging variant. (#40188) 2023-10-05 06:57:17 +02:00
eugeneswalker
79e7da9420 trilinos: add variant to build tests (#40284)
* trilinos: add variant: testing

* trilinos: rename +testing to +test
2023-10-04 16:32:30 -05:00
Harmen Stoppels
0f43074f3e Improve build isolation in PythonPipBuilder (#40224)
We run pip with `--no-build-isolation` because we don't want to let pip
install build deps.

As a consequence, when pip runs hooks, it runs hooks of *any* package it
can find in `sys.path`.

For Spack-built Python this includes user site packages -- there
shouldn't be any system site packages. So in this case it suffices to
set the environment variable PYTHONNOUSERSITE=1.

For external Python, more needs to be done, because there is no env
variable that disables both system and user site packages; setting the
`python -S` flag doesn't work because pip runs subprocesses that don't
inherit this flag (and there is no API to know if -S was passed).

So, for external Python, an empty venv is created before invoking pip in
Spack's build env; this ensures that pip can no longer see anything but
standard libraries and `PYTHONPATH`.

The downside of this is that pip will generate shebangs that point to
the python executable from the venv. So, for external python an extra
step is necessary where we fix up shebangs post install.
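
Roughly, the external-Python path does something like this (assumed commands,
not Spack's exact code):

```python
import os
import subprocess
import venv

# Sketch: build a bare venv so pip's subprocesses see only the stdlib and
# whatever Spack puts on PYTHONPATH (including pip itself).
venv.create("build-venv", with_pip=False, clear=True)

env = dict(os.environ, PYTHONNOUSERSITE="1")
subprocess.run(
    ["build-venv/bin/python", "-m", "pip", "install", "--no-build-isolation", "."],
    env=env,
    check=True,
)
```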
2023-10-04 14:38:50 -05:00
Ken Raffenetti
d297098504 yaksa: Allow unsupported host compiler with CUDA (#40298)
Fixes #40272.
2023-10-04 09:37:58 -07:00
Josh Bowden
284eaf1afe Damaris release v1.9.2 (#40285)
* Update to latest dot versions and improve installation of the Damaris Python module damaris4py

* fix for visit dependency typo

* whitespace check

* whitespace check

* fix for style issue

* reviewer suggestions for integrating Python added

* suggestion for boost depends statement added
2023-10-04 09:34:43 -06:00
Dom Heinzeller
da637dba84 Add new package awscli-v2 and its missing dependency awscrt (#40288)
* Add new package awscli-v2 and its missing dependency awscrt

* Remove boilerplate comments from awscli-v2 and awscrt packages

* Fix typos in var/spack/repos/builtin/packages/awscli-v2/package.py

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Address reviewer comments

* Remove py-pip version dependency from var/spack/repos/builtin/packages/awscli-v2/package.py

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-04 09:34:23 -06:00
Harmen Stoppels
931fce2c24 py-isort: needs setuptools build dep before v5 (#40234)
* py-isort: needs setuptools build dep before v5

Detected in #40224.

In the past, system setuptools could be picked up when using an external
python, so py-isort@4 would install fine. With the linked PR, pip can
only consider packages that Spack controls from PYTHONPATH, so the issue
of missing py-setuptools showed up.

* py-importlib-metadata: fix lowerbounds on python

* review

* py-isort unconditionally add optional setuptools dep to prevent picking up user package at runtime

* style

* drop optional py-setuptools run dep
2023-10-04 03:31:46 -06:00
Adam J. Stewart
42fbf17c82 py-einops: add v0.7.0 (#40296) 2023-10-04 04:28:49 -05:00
Harmen Stoppels
d9cacf664c petsc: add conflict on rocm 5.6: for now (#40300)
hipsparse@5.6.0 changed hipsparseSpSV_solve() API, but reverted in 5.6.1
2023-10-04 09:59:59 +02:00
Scott Wittenburg
7bf6780de2 ci: pull E4S images from github instead of dockerhub (#40307) 2023-10-04 09:55:06 +02:00
eugeneswalker
91178d40f3 e4s arm: disable bricks due to target=aarch64 not being respected (#40308) 2023-10-04 09:51:14 +02:00
kwryankrattiger
2817cd2936 ADIOS2: v2.8 is not compatible with HDF5 v1.14: (#40258) 2023-10-03 16:45:24 -05:00
Scott Wittenburg
92a6ddcbc3 ci: Change how job names appear in gitlab (#39963) 2023-10-03 15:16:41 -06:00
Sinan
58017f484c fix_qgis_build_with_pysip5 (#39941)
* fix_qgis_build_with_pysip5

* build fails with newer protobuf

* somehow findgdal can figure this out.

* Update var/spack/repos/builtin/packages/qgis/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fix gdal lib again

* qgis needs QtPositioning provided by qt+location option

* fix FindPyQt5 cmake file

* fix bug

* fix qsci sip issue

* fix bug

* blackify

* improve

* add latest LTR

* add build dep

* revert until bug is fixed

* specify proj version for qgis 3.28

* improve gdal libs search via indicating gdal-config

* make flake happy

* improve deps

* add 3.28.11, improve style

* fix style

* [@spackbot] updating style on behalf of Sinan81

---------

Co-authored-by: Sinan81 <Sinan@world>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2023-10-03 14:21:51 -05:00
George Young
86d2e1af97 falco: new package @1.2.1 (#40289)
* falco: new package @1.2.1
* specifying gmake
* Replacing homepage - readthedocs is empty

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-10-03 11:26:49 -07:00
Adam J. Stewart
bf23be291b py-tables: add v3.8.0 (#40295) 2023-10-03 11:12:55 -07:00
Lydéric Debusschère
3b32a9918c [add] py-graphene: new recipe, required by py-cylc-flow (#39988) 2023-10-03 11:21:36 -06:00
Jordan Galby
f0260c84b4 Fix binutils regression on +gas~ld fix (#40292) 2023-10-03 17:28:55 +02:00
eugeneswalker
8746c75db0 e4s ci stacks: sync with e4s-23.08 (#40263)
* e4s amd64 gcc ci stack: sync with e4s-23.08

* e4s amd64 oneapi ci stack: sync with e4s-23.08

* e4s ppc64 gcc ci stack: sync with e4s-23.08

* add new ci stack: e4s amd64 gcc w/ external rocm

* add new ci stack: e4s arm gcc ci

* updates

* py-scipy: -fvisibility issue is resolved in 2023.1.0: #39464

* paraview oneapi fails

* comment out pkgs that fail to build on power

* fix arm stack name

* fix cabana +cuda specification

* comment out failing specs

* visit fails build on arm

* comment out slepc arm builds due to make issue

* comment out failing dealii arm builds
2023-10-03 08:12:51 -07:00
Gavin John
e8f230199f py-biom-format: Fix package file issues and bring up to date (#39962)
* Fix python versioning issue for py-biom-format

* Update deps according to feedback

* Remove version requirement for py-cython

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Only depend on py-six for versions >=2.1.10

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add py-future as non-dependency for 2.1.15

* There we are. Everything anyone could ever want

* Missed cython version change

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-03 09:55:15 -05:00
GeorgeJuniorGG
1e3c7abc1c Adding dynaconf module (#39762)
* Adding dynaconf module

* Fixing style

* Adding dependency
2023-10-03 06:23:29 -05:00
Manuela Kuhn
12e51da102 py-yarl: add 1.9.2 (#40277) 2023-10-03 06:22:20 -05:00
Tom Payerle
992291c738 py-dipy: Update version to support python@3.10 (#40229)
* py-dipy: Update version to support python@3.10

py-dipy only adds support for python@3.10 in py-dipy@1.5.0

See #40228

* py-dipy: fix formatting issues

* py-dipy: another formatting fix

* py-dipy: Use depends instead of conflicts

* py-dipy: formatting fix

* py-dipy: Updating for @1.7.0

Added new minimum version requirements for
py-cython
py-numpy
py-scipy
py-h5py
as suggested by @manuelakuhn
(py-nibabel min version unchanged for @1.7.0)
2023-10-03 06:10:14 -05:00
Manuela Kuhn
78e63fa257 py-wcwidth: add 0.2.7 (#40256) 2023-10-03 06:04:30 -05:00
Manuela Kuhn
487ea8b263 py-virtualenv: add 20.24.5 (#40255)
* py-virtualenv: add 20.22.0

* py-platformdirs: add 3.5.3

* py-filelock: add 3.12.4

* py-distlib: add 0.3.7
2023-10-03 06:03:35 -05:00
Manuela Kuhn
0d877b4184 py-websocket-client: add 1.6.3 (#40274) 2023-10-03 05:58:53 -05:00
Manuela Kuhn
994544f208 py-werkzeug: add 2.3.7 and 3.0.0 (#40275) 2023-10-03 05:57:27 -05:00
Manuela Kuhn
36bb2a5d09 py-wrapt: add 1.15.0 (#40276) 2023-10-03 05:55:28 -05:00
Stephen Hudson
071c1c38dc py-libEnsemble: add v1.0.0 (#40203)
* py-libEnsemble: add v1.0.0

* Add version ranges for deps

* Add comments to variants

* Put when before type

* Fix line lengths

* Re-format

* Update var/spack/repos/builtin/packages/py-libensemble/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* Update var/spack/repos/builtin/packages/py-libensemble/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-10-03 05:48:17 -05:00
Lydéric Debusschère
b480ae2b7d [add] py-graphql-relay: new package (#39807)
* [add] py-graphql-relay: new package

* py-graphql-relay: Update package.py

Remove leftovers from version 3.2.0:
* archive name in pypi
* dependencies

* [fix] py-graphql-relay: remove constraint on python version; add dependency on py-rx because of ModuleNotFoundError during spack test

* [fix] py-graphql-relay: remove py-rx dependency; py-graphql-core: add dependencies for version 2.3.2

* py-graphql-core: Update from review; set build backend, py-poetry for version 3: and py-setuptools for version 2

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-10-03 05:45:06 -05:00
Lydéric Debusschère
7a390f503d [add] py-metomi-isodatetime: new recipe, required by py-cylc-flow (#39990)
* [add] py-metomi-isodatetime: new recipe, required by py-cylc-flow

* py-metomi-isodatetime: use sources from pypi instead of github
2023-10-03 05:42:46 -05:00
dependabot[bot]
b7cb3462d4 build(deps): bump actions/setup-python from 4.7.0 to 4.7.1 (#40287)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.7.0 to 4.7.1.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](61a6322f88...65d7f2d534)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-03 12:28:02 +02:00
dependabot[bot]
f2230100ac build(deps): bump urllib3 from 2.0.5 to 2.0.6 in /lib/spack/docs (#40286)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.0.5 to 2.0.6.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/v2.0.5...2.0.6)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-10-03 12:27:36 +02:00
Massimiliano Culpo
4b06862a7f Deactivate Cray sles, due to unavailable runner (#40291)
This reverts commit 0274091204.
2023-10-03 11:06:36 +02:00
Harmen Stoppels
06057d6dba Buildcache tarballs with rootfs structure (#39341)
Two changes in this PR:

1. Register absolute paths in tarballs, which makes it easier
   to use them as container image layers, or rootfs in general, outside
   of Spack. Spack supports this already on develop.
2. Assemble the tarfile entries "by hand", which has a few advantages:
   1. Avoid reading `/etc/passwd`, `/etc/groups`, `/etc/nsswitch.conf`
      which `tar.add(dir)` does _for each file it adds_
   2. Reduce the number of stat calls per file added by a factor two,
      compared to `tar.add`, which should help with slow, shared filesystems
      where these calls are expensive
   3. Create normalized `TarInfo` entries from the start, instead of letting
      Python create them and patching them after the fact
   4. Don't recurse into subdirs before processing files, to avoid
      keeping nested directories opened. (This changes the tar entry
      order slightly; it's like sorting by `(not is_dir, name)`.)
2023-10-03 09:56:18 +02:00
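A rough sketch of the hand-assembled, normalized tar entries described above (illustrative only; the real code also handles symlinks and deterministic ordering):

```
import os
import stat
import tarfile

def add_normalized(tar, path, arcname):
    st = os.lstat(path)  # a single stat call, reused below
    info = tarfile.TarInfo(name=arcname)
    info.mtime = 0             # reproducible timestamps
    info.uid = info.gid = 0    # no /etc/passwd or /etc/group lookups
    info.uname = info.gname = ""
    if stat.S_ISDIR(st.st_mode):
        info.type = tarfile.DIRTYPE
        info.mode = 0o755
        tar.addfile(info)
    else:
        info.type = tarfile.REGTYPE
        info.mode = 0o755 if st.st_mode & stat.S_IXUSR else 0o644
        info.size = st.st_size
        with open(path, "rb") as f:
            tar.addfile(info, f)
```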
Sam Reeve
bb03a1768b Add AdditiveFOAM package (#39295)
* Add AdditiveFOAM package
* Add AdditiveFOAM build and install

Co-authored-by: kjrstory <kjrstory@gmail.com>
Co-authored-by: Knapp, Gerry <knappgl@ornl.gov>

---------

Co-authored-by: kjrstory <kjrstory@gmail.com>
Co-authored-by: Knapp, Gerry <knappgl@ornl.gov>
2023-10-02 16:27:13 -07:00
Adam J. Stewart
75ed26258c py-torchgeo: add v0.5.0 (#40267)
* py-torchgeo: add v0.5.0

* Better documentation of dependency quirks
2023-10-02 15:23:49 -05:00
eugeneswalker
1da8477a3c vtk-m@2.0: depend on kokkos@3.7:3.9 per issue #40268 (#40281) 2023-10-02 12:26:55 -07:00
Christoph Weber
4c111554ae Add ipm package (#40069) 2023-10-02 11:46:17 -07:00
renjithravindrankannath
615312fcee Rocm 5.6.0 & 5.6.1 release updates (#39673)
* 5.6.0 updates
* Rocm 5.6.0 updates
* Style and audit corrections for 5.6
* Patching smi path for tests.
* Style correction
* 5.6.1 updates
* Updated hip tests for ci build  failure
   Updated hiprand with the release tag
   Taken care the review comment rocsolver
* Adding rocm-smi path for 5.6
* Adding the patch file
* Setting library directory uniform
* gl depends on mesa but it should not be llvm variant
* Fix for the issue 39520 by setting CMAKE_INSTALL_LIBDIR=lib
* i1 muls can sometimes happen after SCEV. They resulted in
   ISel failures because we were missing the patterns for them.
* 5.6.0 & 5.6.1 updates for migraphx, miopen-hip, mivisionx
* Revert "5.6.0 & 5.6.1 updates for migraphx, miopen-hip, mivisionx"
   This reverts commit f54c9c6c67.
* Revert operator mixup fix
* Splitting compiler-rt-linkage-for-host and operator mixup patch
* Adding missing patch for reverting operator mixup
* 5.6 update for composable-kernel,migraphx,miopen-hip and mivisionx
* Updating rvs, rcd and rccl for 5.6.1. adding comment for llvm patch
2023-10-02 11:36:51 -07:00
Joseph Wang
453625014d fix lhapdf package (#37384)
* fix lhapdf package
* [@spackbot] updating style on behalf of joequant

---------

Co-authored-by: joequant <joequant@users.noreply.github.com>
2023-10-02 11:33:54 -07:00
G-Ragghianti
1b75651af6 Implemented +sycl for slate (#39927)
* Implemented +sycl for slate

* style

* style

* add slate +sycl to ci

* slate +sycl: explicitly specify +mpi

* comment out slate+sycl+mpi in e4s oneapi stack

* removing requirement of intel-oneapi-mpi

* Added slate+sycl to e4s oneapi stack

* Removing obsolete comment

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-10-02 07:35:00 -07:00
Thomas Madlener
b3e3604f46 libtirpc: Add latest version and allow for builds on macOS (#40189) 2023-10-02 15:02:33 +02:00
Harmen Stoppels
6c4ce379ca buildcache: ignore errors of newer buildcache version (#40279)
Currently buildcaches are forward incompatible in an annoying way: spack
errors out when trying to use them.

With this change, you just get a warning.
2023-10-02 14:51:48 +02:00
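A sketch of the warn-instead-of-error behavior; the constant and function below are made up for illustration:

```
import warnings

SUPPORTED_LAYOUT_VERSIONS = (1, 2)  # hypothetical values

def check_layout_version(version, entry_name):
    # A layout version newer than we know about used to raise; now we warn
    # and skip, so old Spack tolerates mirrors written by newer Spack.
    if version not in SUPPORTED_LAYOUT_VERSIONS:
        warnings.warn(
            f"skipping {entry_name}: buildcache layout version {version} "
            "is newer than this Spack understands"
        )
        return False
    return True
```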
Harmen Stoppels
a9dcba76ce py-importlib-metadata: add patch releases (#40249)
importlib_metadata 4.11.4 fixes some nasty issues with
`metadata.entry_points(...)`, so ensure we have those bugfixes.
2023-10-02 10:09:36 +02:00
Harmen Stoppels
32f21f2a01 Spack python 3.12: PEP 695 unparse support (#40155) 2023-10-02 00:25:52 -07:00
Greg Sjaardema
e60bbd1bfc Cgns fix no fortran config (#40241)
* SEACAS: Update package.py to handle new SEACAS project name

The base project name for the SEACAS project has changed from
"SEACASProj" to "SEACAS" as of @2022-10-14, so the package
needed to be updated to use the new project name when needed.

The refactor also changes several options of the form
    "-DSome_CMAKE_Option:BOOL=ON"
to
    define("Some_CMAKE_Option", True)

* CGNS: If fortran not enabled, do not specify parallel fortran compiler

* Update package formatting as suggested by black

* Accept suggested change
2023-10-02 16:03:47 +09:00
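For context, a sketch of the two equivalent styles from the commit message, as they would appear in a CMakePackage recipe (the package, option, and variant names are placeholders):

```
from spack.package import CMakePackage

class ExamplePkg(CMakePackage):
    def cmake_args(self):
        return [
            "-DSome_CMAKE_Option:BOOL=ON",           # old, hand-formatted style
            self.define("Some_CMAKE_Option", True),  # same effect, typo-resistant
            # variant-driven flavor of the same helper:
            self.define_from_variant("ENABLE_FORTRAN", "fortran"),
        ]
```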
Freifrau von Bleifrei
71c5b948d0 discotec package: add selalib variant (#40222) 2023-10-02 07:14:50 +02:00
Tom Payerle
726d6b9881 zoltan: Fixes for #40198 (#40199)
Fix an issue in configure_args which resulted in duplicate "--with-ldflags" arguments (with different values) being passed to configure, and extend the fix to similar arguments.

Also, repeat some flags from "--with-libs" in "--with-ldflags", since when the flags were only in "--with-libs" they did not seem to be picked up everywhere. I suspect this is a bug in the configure script, but adding them to both locations seems to solve it and should not have any adverse effects.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-10-02 07:11:36 +02:00
Sinan
aff64c02e8 xv: new package (#40032)
Co-authored-by: Sinan81 <Sinan@world>
2023-10-02 07:06:00 +02:00
Matthew Archer
31ae5cba91 add maintainers to py-nanobind (#40235) 2023-10-02 07:03:49 +02:00
Sam Reeve
0a91d2411a cabana: add v0.6 (#40168) 2023-10-02 06:57:55 +02:00
snehring
5f3af3d5e4 perl-dbd-mysql update to 4.050 (#40245)
* perl-devel-checklib: adding new package perl-devel-checklib

* perl-dbd-mysql: adding new version 4.050 and a new build dep
2023-10-02 06:45:11 +02:00
Satish Balay
37158cb913 petsc,py-petsc4py,slepc,py-slepc4py: add version 3.20.0 (#40260) 2023-10-02 06:28:37 +02:00
Wouter Deconinck
a596e16a37 libxkbcommon: new versions 1.4.1, 1.5.0 (#40273) 2023-10-02 05:54:55 +02:00
Weiqun Zhang
4e69f5121f amrex: add v23.10 (#40270) 2023-10-02 05:53:16 +02:00
Wouter Deconinck
2a0f4393c3 assimp: new version 5.3.1 (#40271) 2023-10-02 05:52:30 +02:00
Wouter Deconinck
c9e1e7d90c acts: impose cxxstd variant on geant4 dependency (#39767) 2023-10-02 05:49:51 +02:00
Juan Miguel Carceller
7170f2252c rivet: remove deprecated versions and clean up recipe (#39861)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2023-10-02 05:48:35 +02:00
eugeneswalker
b09073e01e py-pandas@0.24.2 %oneapi: add cflag=-Wno-error=implicit-function-declaration (#39470) 2023-10-01 14:42:56 -06:00
eugeneswalker
2d509dc3eb py-scipy: -fvisibility issue is resolved in 2023.1.0: (#39464)
* py-scipy: -fvisibility issue is resolved in 2023.1.0:

* e4s oneapi ci: add py-scipy
2023-10-01 17:29:21 +00:00
Martin Aumüller
8a9d45cc29 embree: fix linux build on aarch64 for 3.13.5 (#39749)
- upstream patch does not apply cleanly to older versions, not required
  for newer ones
- also add conflict for older versions, except for 3.13.3 which works by
  chance
2023-10-01 09:04:37 -07:00
Wouter Deconinck
b25f8643ff geant4, vecgeom: support variant cxxstd=20 (#39785) 2023-10-01 21:33:37 +09:00
Juan Miguel Carceller
9120b6644d qt: change version for opengl dependencies (#39718) 2023-10-01 21:31:11 +09:00
eugeneswalker
68dbd25f5f e4s cray sles ci: expand spec list (#40162)
* e4s cray sles stack: expand spec list

* remove unnecessary packages:trilinos:one_of
2023-09-30 21:32:33 -07:00
Todd Gamblin
9e54134daf docs: Replace package list with packages.spack.io (#40251)
For a long time, the docs have generated a huge, static HTML package list. It has some
disadvantages:

* It's slow to load
* It's slow to build
* It's hard to search

We now have a nice website that can tell us about Spack packages, and it's searchable so
users can easily find the one or two packages out of 7400 that they're looking for. We
should link to this instead of including a static package list page in the docs.

- [x] Replace package list link with link to packages.spack.io
- [x] Remove `package_list.html` generation from `conf.py`.
- [x] Add a new section for "Links" to the docs.
- [x] Remove docstring notes from contribution guide (we haven't generated RST
      for package docstrings for a while)
- [x] Remove references to `package-list` from docs.
2023-10-01 05:36:22 +02:00
eugeneswalker
08a9345fcc e4s ci: add packages: drishti, dxt-explorer (#39597)
* e4s ci: add packages: drishti, dxt-explorer

* e4s oneapi ci: comment out dxt-explorer until r%oneapi issue #40257 is resolved
2023-09-30 06:15:58 +09:00
John W. Parent
7d072cc16f Windows: detect all available SDK versions (#39823)
Currently, Windows SDK detection will only pick up SDK versions
related to the current version of Windows Spack is running on.
However, in some circumstances, we want to detect other versions
of the SDK, for example, when compiling on Windows 11 for Windows
10 to ensure an API is compatible with Win10.
2023-09-29 20:17:10 +00:00
snehring
d81f457e7a modeltest-ng: adding new version, swapping maintainer (#40217)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2023-09-29 20:19:29 +02:00
Peter Scheibel
3969653f1b Cray manifest: compiler handling fixes (#40061)
* Make use of `prefix` in the Cray manifest schema (prepend it to
  the relative CC etc.) - this was a Spack error.
* Warn people when wrong-looking compilers are found in the manifest
  (i.e. non-existent CC path).
* Bypass compilers that we fail to add (don't allow a single bad
  compiler to terminate the entire read-cray-manifest action).
* Refactor Cray manifest tests: module-level variables have been
  replaced with fixtures, specifically using the `test_platform`
  fixture, which allows the unit tests to run with the new
  concretizer.
* Add unit test to check case where adding a compiler raises an
  exception (check that this doesn't prevent processing the
  rest of the manifest).
2023-09-29 09:47:30 -07:00
Jordan Galby
db37672abf Print error when missing git (#40254)
Like a missing curl.
2023-09-29 15:14:39 +00:00
eugeneswalker
1f75ca96df kokkos-nvcc-wrapper: add v4.1.00 (#40240) 2023-09-29 17:08:42 +02:00
Harmen Stoppels
605835fe42 Don't drop build deps on overwrite install (#40252)
If you `spack install x ^y` where `y` is a pure build dep of `x`, and
then uninstall `y`, and then `spack install --overwrite x ^y`, the build
fails because `y` is not re-installed.

Same can happen when you install a develop spec, run `spack gc`,
modify sources, and install again; develop specs rely on overwrite
install to work correctly.
2023-09-29 16:40:21 +02:00
Abhishek Yenpure
6c2748c37d Adding Catalyst 2.0 smoke test (#40112)
-- Performing formatting changes
-- Formatting file to conform with spack style
-- Adding updates from review
-- Removing old release candidates from the specification
-- Adding external conduit support for Catalyst
-- Adding Catalyst to `CMAKE_PREFIX_PATH` for the test to find
2023-09-29 16:36:24 +02:00
Massimiliano Culpo
210d221357 Test package detection in a systematic way (#18175)
This PR adds a new audit sub-command to check that detection of relevant packages
is performed correctly in a few scenarios mocking real use-cases. The data for each 
package being tested is in a YAML file called detection_test.yaml alongside the 
corresponding package.py file.

This is to allow encoding detection tests for compilers and other widely used tools, 
in preparation for compilers as dependencies.
2023-09-29 10:24:42 +02:00
Mark W. Krentel
c9ef5c8152 hpctoolkit: add version 2023.08.1 (#40248)
Add versions 2023.08.1 and branch 2023.08.stable. Note: the version
numbers are a little different. Here, 2023.08.1 means release no. 1
from the release/2023.08 branch.
2023-09-29 00:48:05 -06:00
eugeneswalker
0274091204 Revert "e4s cray sles: suspend ci until machine issues resolved (#40219)" (#40238)
This reverts commit 5165524ca6.
2023-09-29 07:25:50 +02:00
John W. Parent
3f2f0cc146 openblas package: fix lib suffix construction in libs (#40242) 2023-09-28 22:02:37 -06:00
Massimiliano Culpo
a236fce31f Partial removal of circular dependencies between spack and llnl (#40090)
Modifications:
- [x] Move `spack.util.string` to `llnl.string`
- [x] Remove dependency of `llnl` on `spack.error`
- [x] Move path of `spack.util.path` to `llnl.path`
- [x] Move `spack.util.environment.get_host_*` to `spack.spec`
2023-09-28 16:21:52 +00:00
snehring
f77a38a96b mash: fix compilation for aarch64 (#40218) 2023-09-28 10:16:11 -06:00
Harmen Stoppels
6d55066b94 Use st_nlink in hardlink tracking (#39328)
Only add potential hardlinks to a set/dict, instead of each file. This
should be much cheaper, since hardlinks are very rare.
2023-09-28 13:24:56 +00:00
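A sketch of the cheaper bookkeeping: only files that can actually be hardlinked (st_nlink > 1) enter the visited set, instead of every file.

```
import os

def iter_files_once(root):
    visited = set()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.lstat(path)
            if st.st_nlink > 1:  # hardlinks are rare, so this set stays tiny
                key = (st.st_dev, st.st_ino)
                if key in visited:
                    continue  # already yielded via another link
                visited.add(key)
            yield path
```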
Harmen Stoppels
78132f2d6b glibc: dont link libgcc_eh.a pre 2.17, and backport at_random auxval patch (#40013)
This resolves an interesting circular dependency between gcc and glibc:

1. glibc < 2.17 depends on libgcc.a and libgcc_eh.a
2. libgcc_eh.a is only built when gcc is configured with
   --enable-shared
3. but building shared libraries requires crt*.o and libc.so

Backport AT_RANDOM auxval changes to avoid dealing with wrong inline 
assembly (fallback code fails on ubuntu 23.04)
2023-09-28 13:30:43 +02:00
Brian Vanderwende
fba47e87d7 Support optional vars script arguments (#40064) 2023-09-28 06:52:22 -04:00
Massimiliano Culpo
bf8e8d9f5f cmake: add v3.27.6 (#40037)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-09-28 11:06:28 +02:00
Sam Gillingham
ebc0e9be19 kealib: add 1.5.1 and 1.5.2 versions (#40202)
* add recent kealib versions
* update depends_on so hl is not required for latest version
2023-09-27 19:08:28 -06:00
Vicente Bolea
b6f08f1d4e vtk-m: add VTK-m 2.1.0-rc1 (#39581) 2023-09-27 10:13:18 -06:00
Jared Popelar
d9724597ed Trilinos package: build on Windows (#34622)
Update Trilinos and dependencies to build a limited version of Trilinos
on Windows.

* Support trilinos~mpi~shared on Windows
* superlu: force CMake build on Windows
* boost: update to build on Windows (proper option formatting and
  build tool names)
* pcre, openblas: add CMake-based build (keep prior build system
  as default on platforms other than Windows)
* openblas: add patch when using Intel Fortran compiler (currently
  this is included as part of the hybrid %msvc compiler in Spack)

Co-authored-by: John Parent <john.parent@kitware.com>
2023-09-27 09:58:12 -06:00
kwryankrattiger
c90c946d52 ParaView: Add version 5.11.2 (#40213) 2023-09-27 10:20:13 -05:00
Martin Aumüller
bb66d15d08 ffmpeg: fix build with Xcode 15 (#40187)
As reported in #40159, a shared library build of ffmpeg 6.0 fails with the linker that was added with Xcode 15:
ld: building exports trie: duplicate symbol '_av_ac3_parse_header'
clang: error: linker command failed with exit code 1 (use -v to see invocation)

Forcing the old linker with -Wl,-ld_classic works around this.
2023-09-27 16:41:32 +02:00
snehring
1cd2d07f0b last: add new version, fix build on aarch64 (#40216)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2023-09-27 16:32:18 +02:00
Garth N. Wells
43cb49d87a py-nanonbind: add new versions (#40161) 2023-09-27 16:28:26 +02:00
eugeneswalker
5165524ca6 e4s cray sles: suspend ci until machine issues resolved (#40219) 2023-09-27 04:51:34 +02:00
Martin Aumüller
38dc3a6896 vtk: @9.2 need seacas@2022-10-14, fix macos (#39896)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2023-09-26 13:48:44 -06:00
John W. Parent
115e448bd3 Paraview package: update freetype find patch for shared libs (#40085)
VTK's (and therefore ParaView's) FindFreetype module required patching to
handle static import libraries from Freetype. However, it did not cover
shared libraries. This adds support for importing shared Freetype into the VTK build.
2023-09-26 10:41:24 -07:00
Harmen Stoppels
5c25437c9f spack --profile: dump to stderr (#40209) 2023-09-26 16:18:47 +02:00
Christoph Weber
0e72ff4a0d ploticus: new package (#39881)
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2023-09-26 08:15:12 -05:00
Matt Fraser
90edd18d1f Add NCCL v2.18.5-1 release to recipe (#40201) 2023-09-26 08:05:23 -05:00
Massimiliano Culpo
40dbadb868 crosstool-ng: add v1.26.0 (#40204) 2023-09-26 11:17:39 +02:00
Martin Aumüller
c16546bd4c libressl: fix package & new versions (#39741) 2023-09-26 10:39:29 +02:00
Dom Heinzeller
143e6a4fbb Always apply Python unixcompilers.py rpath fix, not just on cray (#39929)
* Rename var/spack/repos/builtin/packages/python/cray-rpath-3.1.patch as var/spack/repos/builtin/packages/python/rpath-non-gcc.patch and apply unconditionally

* Update var/spack/repos/builtin/packages/python/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-26 02:38:20 -06:00
Martin Aumüller
13816f19fd opencv: add 4.7.0 and 4.8.0 and fix ffmpeg dependency (#39013) 2023-09-26 10:28:29 +02:00
Martin Aumüller
34ff3b408b seacas: correctly limit fmt dependency (#39756)
versions older than 2022-10-14 do not build with fmt@10, either:
fmt/core.h:1579:3: error: static_assert failed due to requirement 'formattable'
2023-09-26 06:35:38 +02:00
Victor A. P. Magri
aec88ef3e6 hypre: add support for magma (#40121) 2023-09-26 06:27:14 +02:00
snehring
9fda22d942 bcl2fastq2: patch for compiling on aarch64 (#40195) 2023-09-26 06:26:19 +02:00
snehring
550613ee3a gmap-gsnap: add v2023-07-20 and aarch64 support (#40200) 2023-09-26 05:59:31 +02:00
Ethan Williams
5341834ebe Elbencho: add elbencho package (#40193)
* add elbencho package
* pep8 compliant
* add description
* depends on boost
* boost options
* dependencies
* fix ncurses not linking properly missing tinfo
* pep8 fix
* fix style of boost variants
* final style fix - spack style is happy
* correct cufile env variable name

---------

Co-authored-by: Ethan W <mail@ethanwilliams.xyz>
2023-09-25 18:53:28 -06:00
Alberto Invernizzi
fc027e34d1 apex: add new version 2.6.3 + deal with ompt build problem (#40073)
* add new version 2.6.3
* add conflict for openmp when using gcc
2023-09-25 16:00:58 -07:00
Harmen Stoppels
aef58776b4 openssl: new versions (#40194)
apparently there was still a 1.1.1 release
2023-09-25 21:54:05 +02:00
Cameron Smith
4e17a40d09 omegah: new version (#40192) 2023-09-25 13:53:36 -06:00
Matthieu Dorier
d97251a4c6 spdlog: add version 1.12.0 (#40186) 2023-09-25 12:23:02 -07:00
snehring
2a66a67132 subread: removing mtune arg to compiler (#40197) 2023-09-25 12:19:25 -07:00
eugeneswalker
10fd69ec74 new package: gsibec (#40137) 2023-09-25 11:28:04 -06:00
Harmen Stoppels
9889a0cac6 py-numcodecs: drop upperbound, add new version, avoid native compilation (#40183)
* py-numcodecs: drop upperbound, add new version, avoid native compilation

* py-numcodecs: add entrypoints

* Remove another upperbound on python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-25 07:28:32 -06:00
Harmen Stoppels
65ffd0dd63 jsoncpp: add meson build system, update package, update cmake ~ownlibs (#39639) 2023-09-25 10:13:30 +02:00
Bernhard Kaindl
3d2c779b87 openssh: add version 9.4p1 and fix compile issue with new zlib-1.3 (#40174)
The fix for the compile issue was improved by Bernhard Kaindl.
He also added fixes for two other classes of build failures:
- add missing openssl dependency version limit to older openssh versions
- add missing -fcommon for new compilers building old openssh versions

Co-authored-by: snehring <snehring@iastate.edu>
2023-09-25 08:30:10 +02:00
Harmen Stoppels
b57c2a10d4 phist: remove python 3.10 upperbound (#40175) 2023-09-25 08:28:13 +02:00
Diego Alvarez S
849a0a5eeb Add opendjk 11.0.20.1+1, 17.0.8.1+1 (#40035) 2023-09-25 05:09:30 +02:00
Rocco Meli
7e4a6160b9 namd: Add NAMD 3.0b3 and ARM support (#40134) 2023-09-25 05:04:17 +02:00
dependabot[bot]
646e7b4b00 build(deps): bump black in /.github/workflows/style (#40165)
Bumps [black](https://github.com/psf/black) from 23.1.0 to 23.9.1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.1.0...23.9.1)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-24 17:48:02 +00:00
eugeneswalker
eea86c3981 new package: ecmwf-atlas (#40136) 2023-09-24 10:43:09 -06:00
dependabot[bot]
37c48fc82c build(deps): bump urllib3 from 2.0.4 to 2.0.5 in /lib/spack/docs (#40119)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.0.4 to 2.0.5.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.0.4...v2.0.5)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-24 17:35:48 +02:00
George Young
47daba3dc1 blast-plus: add 2.14.1 (#40179)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-24 09:33:38 -06:00
Brian Vanderwende
46062d98fd visit: Add NetCDF format support (#40065) 2023-09-24 17:26:34 +02:00
François Trahay
4e37084ed4 EZTrace: add v2.1 (#39748) 2023-09-24 17:12:32 +02:00
Thomas Madlener
e7c229393d garfieldpp: Fix the signature of setup_dependent_build_environment (#40133) 2023-09-24 17:04:41 +02:00
Houjun Tang
87fd9c3e93 sw4: add v3.0 (#40146) 2023-09-24 16:54:32 +02:00
snehring
676f2a3175 hmmer: adding new version 3.4 (#40148) 2023-09-24 16:53:18 +02:00
Adam J. Stewart
5fe7f5329b py-torchvision: minimize dependencies (#40163) 2023-09-24 16:27:09 +02:00
Adam J. Stewart
c6a3dd03ab py-torchmetrics: add v1.2.0 (#40164) 2023-09-24 16:25:34 +02:00
Hartmut Kaiser
ebdd5e28f2 flecsi: bumping HPX requirement to V1.9.1 (#32355) 2023-09-24 16:24:33 +02:00
dependabot[bot]
068abdd105 build(deps): bump mypy from 1.5.0 to 1.5.1 in /.github/workflows/style (#40166)
Bumps [mypy](https://github.com/python/mypy) from 1.5.0 to 1.5.1.
- [Commits](https://github.com/python/mypy/compare/v1.5.0...v1.5.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-24 16:21:41 +02:00
dependabot[bot]
9ec372a86c build(deps): bump actions/checkout from 4.0.0 to 4.1.0 (#40172)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.0.0 to 4.1.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](3df4ab11eb...8ade135a41)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-24 16:08:15 +02:00
Alec Scott
5e05f6b7c6 bfs: add v3.0.2 (#40176) 2023-09-24 15:53:15 +02:00
Alec Scott
7988d8c67d glab: add v1.33.0 (#40177) 2023-09-24 15:53:01 +02:00
Alec Scott
658a3f2fdb hugo: add v0.118.2 (#40178) 2023-09-24 15:52:42 +02:00
Rocco Meli
582f0289af Add mda-xdrlib and update pyedr and panedr (#39912)
* ensure umpire~cuda~rocm when ~cuda~rocm

* Add mda-xdrlib and update pyedr and panedr

* Update var/spack/repos/builtin/packages/py-mda-xdrlib/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* code review

* fix python pin, conflated with py-tomli

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-09-24 02:07:59 -06:00
eugeneswalker
95b737d923 trilinos +shylu: patch to resolve trilinos issue #12048 (#40169)
* trilinos +shylu: patch to resolve trilinos issue #12048

* e4s ci: build xyce ^trilinos+shylu

---------

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2023-09-24 16:28:03 +09:00
Adam J. Stewart
d7e9a13f53 protobuf: add v4.24.3 (#40152) 2023-09-23 20:04:10 +02:00
Tom Payerle
25e45f9f07 py-tinyarray: add version 1.2.4, needed for python@3.10 (#40171)
* py-tinyarray: add version 1.2.4, needed for python@3.10

Added @1.2.4
Require @1.2.4 when using python@3.10:

* py-tinyarray: format fix
2023-09-23 08:57:46 -05:00
Cameron Smith
38cc51ec36 pumi: add version 2.2.8 (#40143) 2023-09-22 14:38:46 -07:00
Massimiliano Culpo
5c10c29923 Pin the version of tools used in CI, have dependabot manage their version (#40066) 2023-09-22 10:33:01 -07:00
eugeneswalker
daf95227bf e4s cray rhel stack: expand to full spec list (#40111)
* e4s cray rhel stack: expand to full spec list

* comment out gasnet; require %gcc for unzip

* require openblas@0.3.20 to get around %cce error; follow up with issue report

* comment out failing specs;

* comment out axom and xyce due to errors

* improve clarity of failing specs
2023-09-22 16:04:20 +00:00
Adam J. Stewart
480a5e0848 py-pyzmq: fix order of decorators (#40157) 2023-09-22 17:20:56 +02:00
Satish Balay
783e253f7d llvm: add version 17.0.1 (#40108) 2023-09-22 14:41:23 +02:00
Harmen Stoppels
716196930a Remove distutils dependency in Spack (#40153)
* msvc.py: don't import distutils

Introduced in #27021, this import makes Spack forward-incompatible with Python (distutils is removed in Python 3.12).

The module was already deprecated at the time of the PR.

* update spack package
2023-09-22 13:04:58 +02:00
Adam J. Stewart
f42402129a py-matplotlib: add v3.7.3 (#39961) 2023-09-22 11:23:57 +02:00
Adam J. Stewart
419878f035 Add support for macOS Sonoma (#40115) 2023-09-22 04:21:26 -05:00
Harmen Stoppels
32927fd1c1 py-pyzmq: force recythonize of older versions (#40151) 2023-09-22 03:18:34 -06:00
Harmen Stoppels
092a5a8d75 py-gpy: re-cythonize & drop python upperbound (#40089)
* py-gpy: drop cython induced python upperbound

* py-gpy: bump scipy

* py-fn-py: python bounds for old version, new version w/o

* py-statsmodels: force recythonization

* py-gpy: clarifying comment about cython build type
2023-09-22 04:18:01 -05:00
Adam J. Stewart
f082b19058 py-pyproj: add v3.6.1 (#40128) 2023-09-22 03:29:37 -05:00
Harmen Stoppels
eb7e26006f py-hpbandster: use build/run deps instead of build (#40102) 2023-09-22 02:18:39 -06:00
Adam J. Stewart
2d3d9640dc py-pandas: add v2.1.1 (#40127) 2023-09-22 10:16:01 +02:00
Adam J. Stewart
f95080393a libzmq: fix build with Apple Clang 15.0.0 (#40116) 2023-09-22 10:14:43 +02:00
Adam J. Stewart
0b8203940a OpenBLAS: add v0.3.24 (#40132)
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-09-22 08:58:23 +02:00
Adam J. Stewart
3bd7859e7f py-scikit-learn: add v1.3.1 (#40126) 2023-09-22 08:39:38 +02:00
eugeneswalker
599b612edf add met v11.1.0 (#40138) 2023-09-22 08:35:10 +02:00
eugeneswalker
6f9126d738 add metplus v5.1.0 (#40139) 2023-09-22 08:30:38 +02:00
eugeneswalker
59399ab1f8 add scotch v7.0.4 (#40140) 2023-09-22 08:30:22 +02:00
Lydéric Debusschère
b67f619448 py-aiofiles: Add version 0.7.0 required by py-cylc-flow (#39987)
* py-aiofiles: Add version 0.7.0 required by py-cylc-flow

* py-aiofiles: update from review

* depends on py-setuptools before 0.6
* depends on py-poetry-core after 0.7

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-aiofiles: Take the review into account; add an upper version constraint for Python.

* py-aiofiles: update from review, set lower version of python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-22 00:29:03 -05:00
eugeneswalker
2755706115 Revert "e4s-cray: drop in ci as there are no runners (#40122)" (#40144)
This reverts commit bd8d0324a9.
2023-09-21 22:38:26 -06:00
Thiago Genez
43ed8a12b7 openssh: Fix segfault on x86_64-darwin (#40044)
Import patches from homebrew
2023-09-22 01:41:22 +02:00
Alex Richert
3f5b4a4907 Update grib-util package.py (#40120) 2023-09-21 14:15:06 -07:00
Garth N. Wells
cd16478aba py-fenics-dolfinx: add version upper bound for Python dependency (#40125)
* py-fenics-dolfinx: add upper bound on Python version

* Small fix

* Update var/spack/repos/builtin/packages/py-fenics-dolfinx/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-21 13:17:45 -06:00
Rocco Meli
aa87c747f9 Add missing platform in Charm++ (#40051)
Add missing `linux` platform in Charm++ when building for `arm8`.
2023-09-21 20:31:17 +02:00
Steven R. Brandt
9210b19398 lorene: add C++ headers to the install (#39909) 2023-09-21 20:27:23 +02:00
George Young
4d156b9e6b relion: add 4.0.1, fix build on Rocky8 (#40074)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-21 20:24:03 +02:00
Jonas Thies
56df8b61a2 phist: fix compatibility with python@3.11: (#40082) 2023-09-21 20:21:35 +02:00
Anton Kozhevnikov
170867e38a SIRIUS: remove old versions (#40105) 2023-09-21 20:18:58 +02:00
Wileam Y. Phan
5ca9fd6c82 Fix false detection of llvm-amdgpu as llvm and llvm-doe (#40113) 2023-09-21 20:15:20 +02:00
Robert Underwood
ef165c80b3 py-nanobind add cmake path (#40079)
* py-nanobind add cmake path

* fix style

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2023-09-21 13:15:02 -05:00
Lydéric Debusschère
6859694e8e [add] py-graphql-ws: new recipe, required by py-cylc-uiserver (#39991)
* [add] py-graphql-ws: new recipe, required by py-cylc-uiserver

* py-graphql-ws: remove constraint on version for python
2023-09-21 13:14:11 -05:00
John W. Parent
aeaca77630 Harden compiler detection against errors (#39736)
Fixes #39622

Add a timeout to compiler detection and allow Spack to proceed when
this timeout occurs.

In all cases, the timeout is 120 seconds: it is assumed any compiler
invocation we do for the purposes of verifying it would resolve in
that amount of time.

Also refine executables that are tested as being possible MSVC
instances, and limit where we try to detect MSVC. In more detail:

* Compiler detection should timeout after a certain period of time.
  Because compiler detection executes arbitrary executables on the
  system, we could encounter a program that just hangs, or even a
  compiler that hangs on a license key or similar. A timeout
  prevents this from hanging Spack.
* Prevents things like cl-.* from being detected as potential MSVC
  installs. cl is always just cl in all cases that Spack supports.
  Change the MSVC class to indicate this.
* Prevent compilers unsupported on certain platforms from being
  detected there (i.e. don't look for MSVC on systems other than
  Windows).

The first point alone is sufficient to address #39622, but the
next two reduce the likelihood of timeouts (which is useful since
those slow down the user even if they are survivable).
2023-09-21 10:42:12 -07:00
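The timeout idea in miniature (a sketch, not Spack's actual detection code):

```
import subprocess

DETECTION_TIMEOUT = 120  # seconds, per the commit above

def probe_compiler(exe):
    # Run a candidate executable with a timeout so a hung binary (license
    # prompt, broken wrapper, ...) cannot stall detection indefinitely.
    try:
        result = subprocess.run(
            [exe, "--version"],
            capture_output=True,
            text=True,
            timeout=DETECTION_TIMEOUT,
        )
    except (subprocess.TimeoutExpired, OSError):
        return None  # skip this candidate and keep going
    return result.stdout
```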
AMD Toolchain Support
2a9d1d444b aocl-sparse: use .libs instead of hard-coded value for library computation (#39868)
Co-authored-by: matooley <phil.tooley@amd.com>
2023-09-21 18:47:29 +02:00
Massimiliano Culpo
abad16c198 Restore virtuals normalization on edge construction (#40130)
Put back normalization of the "virtuals" input as a sorted tuple. 

Without this we might get edges that differ only in the order of
virtuals, or that have lists, which are not hashable.

Add unit-tests to prevent regressions.
2023-09-21 17:02:34 +02:00
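The normalization in miniature (illustrative):

```
def normalize_virtuals(virtuals):
    # Sorted tuple: order-insensitive and hashable, unlike a raw list.
    return tuple(sorted(virtuals))

assert normalize_virtuals(["mpi", "blas"]) == normalize_virtuals(["blas", "mpi"])
edge_keys = {normalize_virtuals(["lapack", "blas"])}  # lists couldn't go in a set
```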
Massimiliano Culpo
4c0bc39054 Remove Python 3.6 from bootstrap tests on Ubuntu, add 3.11 (#40131) 2023-09-21 15:14:47 +02:00
Harmen Stoppels
501d322264 e4s: drop python 3.8 preference (#40123) 2023-09-21 15:07:21 +02:00
Manuela Kuhn
59fc09e93f py-versioneer: add 0.29 (#40076) 2023-09-21 07:22:20 -05:00
Harmen Stoppels
8bd54e2f8f cython: fix recythonize by default patch (#40096) 2023-09-21 07:09:47 -05:00
Sinan
bd8d121a23 package:pylint fix isort dependency versions (#40094)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2023-09-21 06:43:44 -05:00
Lydéric Debusschère
2bd487988f [add] py-ansimarkup: new recipe required by py-cylc-flow (#39989)
* [add] py-ansimarkup: new recipe required by py-cylc-flow

* py-ansimarkup: remove version constraint on python, add version 2.1.0
2023-09-21 06:36:15 -05:00
Manuela Kuhn
452e56f467 py-vcrpy: add 5.1.0 (#40075) 2023-09-21 06:28:25 -05:00
Christian Glusa
62e94b0cb7 py-scikit-sparse: add 0.4.11, 0.4.12 (#40077) 2023-09-21 06:27:04 -05:00
Thomas Dickerson
5d331d4141 Bazel patch specs were too restrictive (#40084)
These patches should always be applied: the existing Bazel code is always wrong; working on some older compilers was a lucky fluke.
2023-09-21 05:27:11 -05:00
Harmen Stoppels
bd8d0324a9 e4s-cray: drop in ci as there are no runners (#40122) 2023-09-21 09:54:06 +00:00
Adam J. Stewart
d79a3ecc28 py-gevent: relax dependency constraints (#40117) 2023-09-21 09:00:28 +02:00
Harmen Stoppels
dac3b45387 gptune: doesnt depend on cython (#40104) 2023-09-20 15:31:45 -07:00
Dom Heinzeller
31fe78e378 ESMF package: fix netcdf static libs and variant combinations (#39738)
Add "snapshot" variant for identifying UFS WM support
2023-09-20 10:28:52 -07:00
Patrick Gartung
e16ca49036 Revert PR 40091 which duplicates PR 38987 (#40107) 2023-09-20 13:03:49 -04:00
Auriane R
a7ee72708a Add support for C++23 in pika and pika-algorithms packages (#40078)
* Add C++23 support for pika

* Add C++23 support for pika-algorithms as well
2023-09-20 09:28:25 -06:00
Auriane R
c4a53cf376 Add 23 and 26 to the cxxstd variant for boost (#40081) 2023-09-20 16:21:06 +02:00
Massimiliano Culpo
e963d02a07 Fix a leftover typo from depflag rework (#40101) 2023-09-20 14:16:53 +00:00
Harmen Stoppels
f89451b4b8 cython: add build-tools tag (#40100) 2023-09-20 15:18:04 +02:00
Raffaele Solcà
9c87506c2c DLA-Future: add v0.2.1 (#40098) 2023-09-20 14:58:23 +02:00
Auriane R
35223543e9 Add 23 to the cxxstd variant of fmt (#40080) 2023-09-20 11:20:41 +02:00
Marc Mengel
e1b22325ea Add details on error messages from requirements (#40092) 2023-09-20 06:16:29 +02:00
Patrick Gartung
2389047072 Fix error in qt5.15 building qtlocation with gcc13 (#40091) 2023-09-19 16:18:34 -06:00
Richard Berger
f49c58708b FleCSI updates (#40087)
* flecsi: update description and flog variant help
* flecsi: use kokkos_cxx when needed
2023-09-19 14:28:13 -06:00
Daryl W. Grunau
d916073397 eospac: expose version 6.5.7 (#40060)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2023-09-19 11:44:20 -07:00
AMD Toolchain Support
bfa514af98 python version dependency for supporting nwchem-702 build (#40072)
Co-authored-by: viveshar <vivek.sharma2@amd.com>
2023-09-19 11:41:04 -07:00
Harmen Stoppels
c57e2140c2 ci: dont use nightly image tag (#40070) 2023-09-19 17:53:28 +02:00
Tuomas Koskela
9e21d490ea conquest: add new package (#40053) 2023-09-19 09:38:37 -06:00
Massimiliano Culpo
3b4ca0374e Use process pool executors for web-crawling and retrieving archives (#39888)
Fix a race condition between searching URLs and updating the shared
set `_visited`.
2023-09-19 15:32:59 +02:00
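A sketch of the race-free pattern (an assumed shape, not the actual crawler): workers never mutate a shared `_visited` set; the parent deduplicates between waves.

```
from concurrent.futures import ProcessPoolExecutor

def crawl(seed_urls, fetch_links):
    # fetch_links: a picklable, top-level function mapping url -> list of urls
    visited, frontier = set(), list(seed_urls)
    with ProcessPoolExecutor() as pool:
        while frontier:
            visited.update(frontier)
            found = {u for links in pool.map(fetch_links, frontier) for u in links}
            frontier = [u for u in found if u not in visited]
    return visited
```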
Harmen Stoppels
7d33c36a30 ci: drop redundant packages.yaml (#40068) 2023-09-19 11:47:48 +02:00
Massimiliano Culpo
4f14db19c4 ASP-based solver: declare deprecated versions iff config:deprecated:true (#40011)
By default, do not let deprecated versions enter the solve.

Previously you could concretize to something deprecated, only to get errors on install.

With this commit, we get errors on concretization, so the issue is caught earlier.
2023-09-19 09:44:49 +00:00
AMD Toolchain Support
a0dcf9620b Lammps: don't apply AOCC patches to versions containing the backport (#39844)
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
2023-09-19 08:52:15 +02:00
Tim Haines
9a9c3a984b extrae: add versions 4.0.5, 4.0.6 (#40063) 2023-09-19 08:45:59 +02:00
Manuela Kuhn
a3543e2248 py-typing-extensions: add 4.8.0 (#40049) 2023-09-19 01:02:31 -05:00
Adam J. Stewart
4cdbb04b15 py-lightning: add v2.0.8/9 (#40058) 2023-09-18 21:13:24 -06:00
Adam J. Stewart
2594be9459 py-rarfile: add v4.1 (#40054) 2023-09-18 19:58:39 -06:00
Scott Wittenburg
8f07983ab6 adios2: add smoke tests (#39970) 2023-09-18 19:43:46 -06:00
dependabot[bot]
8e34eaaa75 build(deps): bump docker/setup-buildx-action from 2.10.0 to 3.0.0 (#39967)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.10.0 to 3.0.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](885d1462b8...f95db51fdd)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-19 00:48:30 +00:00
Andrew-Dunning-NNL
7e7d373ab3 mdspan: new package (#40024)
* new package mdspan

* Update var/spack/repos/builtin/packages/mdspan/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* mdspan- fix style

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-09-18 18:44:17 -06:00
Adam J. Stewart
dbd520f851 Better detection of Python libs/headers (#39308) 2023-09-18 15:46:33 -07:00
afzpatel
4fd7fa5fc1 new package composable kernel (#39222)
* initial commit to add composable kernel package
* change dependencies to type build and add amdgpu_target variant
* fix spacing
* fix styling
* remove rocmmlir from miopen-hip recipe
* enable miopen with ck after 5.5.1
* fix typo
2023-09-18 14:24:45 -07:00
Johann Gaebler
84d2097a8c r-wru: Add R package wru (#39935) 2023-09-18 14:15:56 -07:00
Brian Vanderwende
842f19c6e3 grads: add versions and variants (#40017)
* Add grads versions, variants, and dependency
* Bugfix for hdf4 libs and style fixes
* Switch boolean variant checking syntax
2023-09-18 13:34:46 -07:00
eugeneswalker
577ea0a0a8 pmix: new versions (#40055) 2023-09-18 12:47:36 -07:00
Brian Vanderwende
4d8f9ff3e8 xxdiff: add new package (#40021)
* Add new package xxdiff

* style fixes

* Versioning updates to match best practices

* Full hash and flex, bison as build deps
2023-09-18 12:40:22 -07:00
rfeki
60ce6c7302 Add cuda as a variant to opa-psm2 (#39754)
Signed-off-by: Raafat Feki <rfeki@cornelisnetworks.com>
2023-09-18 12:28:37 -07:00
Manuela Kuhn
d111bde69e py-tqdm: add 4.66.1 (#39952) 2023-09-18 13:04:39 -06:00
eugeneswalker
3045ed0e43 Revert "gitlab ci: temporarily disable darwin aarch64 (#39974)" (#40052)
This reverts commit 2c13361b09.
2023-09-18 12:48:32 -06:00
Freifrau von Bleifrei
2de0e30016 SeLaLib: add new package (#39847) 2023-09-18 13:15:10 -05:00
Manuela Kuhn
15085ef6e5 py-tornado: add 6.3.3 (#39949)
* py-tornado: add 6.3.3

* Fix style
2023-09-18 11:18:36 -06:00
Manuela Kuhn
0c2849da4d py-twine: add 4.0.2 (#40026) 2023-09-18 11:50:51 -05:00
Manuela Kuhn
75eeab1297 py-types-psutil: add 5.9.5.16 (#40027) 2023-09-18 11:49:20 -05:00
Dom Heinzeller
b8e32ff6b3 Fix several build errors for hdf-eos2 (not only but in particular on macOS w/ apple-clang) (#39877)
Fix permissions for configure file in var/spack/repos/builtin/packages/hdf-eos2/package.py, fix dependencies to match what hdf provides, update compiler flags for apple-clang
2023-09-18 18:47:25 +02:00
Manuela Kuhn
e7924148af py-types-typed-ast: add 1.5.8.7 (#40047) 2023-09-18 11:47:00 -05:00
Manuela Kuhn
d454cf4711 py-types-urllib3: add 1.26.25.14 (#40048)
* py-types-urllib3: add 1.26.25.14

* Fix style
2023-09-18 11:44:26 -05:00
George Young
ba81ef50f5 cpat: new package @3.0.4 (#40022)
* cpat: new package @3.0.4

* Update package.py

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-18 11:37:21 -05:00
Manuela Kuhn
d5dd4b8b5d py-types-python-dateutil: add 2.8.19.14 (#40038) 2023-09-18 11:34:53 -05:00
Manuela Kuhn
0e47548cb6 py-types-requests: add 2.31.0.2 (#40039) 2023-09-18 11:34:01 -05:00
Manuela Kuhn
74974d85f6 py-types-setuptools: add 68.2.0.0 (#40040) 2023-09-18 11:33:01 -05:00
dependabot[bot]
7925bb575e build(deps): bump docker/login-action from 2.2.0 to 3.0.0 (#39966)
Bumps [docker/login-action](https://github.com/docker/login-action) from 2.2.0 to 3.0.0.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](465a07811f...343f7c4344)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-18 18:23:37 +02:00
dependabot[bot]
976cb02f78 build(deps): bump docker/setup-qemu-action from 2.2.0 to 3.0.0 (#39968)
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 2.2.0 to 3.0.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](2b82ce82d5...68827325e0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-18 18:22:58 +02:00
dependabot[bot]
f1bdc74789 build(deps): bump docker/build-push-action from 4.2.1 to 5.0.0 (#39969)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 4.2.1 to 5.0.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](0a97817b6a...0565240e2d)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-18 18:22:41 +02:00
Lydéric Debusschère
920347c21a py-werkzeug: Add version 0.12.2 required by py-graphene-tornado @2.6.1 (#39984) 2023-09-18 11:21:22 -05:00
Ben Cowan
c00ece6cf2 Enabled building GCC 13.1 on darwin/aarch64. (#38667) 2023-09-18 18:20:19 +02:00
Juan Miguel Carceller
7b0157c7e7 python: Don't prefer 3.10 (#40036) 2023-09-18 10:08:49 -06:00
Lydéric Debusschère
1b3a2ba06a [add] py-ldap3: new recipe, required by py-metomi-rose (#39982) 2023-09-18 11:06:23 -05:00
Manuela Kuhn
dc22a80f86 py-traits: add 6.4.2 (#39976) 2023-09-18 10:57:47 -05:00
Manuela Kuhn
27d6a75692 py-trove-classifiers: add 2023.8.7 (#39977) 2023-09-18 10:50:55 -05:00
George Young
70456ce4a7 py-rseqc: add 5.0.1 (#39953)
* py-rseqc: add 5.0.1

* Update package.py

* style ...

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-18 10:49:58 -05:00
Lydéric Debusschère
078369ec2b [chore] py-aniso8601: Add version 7.0.0 (#39808)
* [chore] py-aniso8601: Add version 3.0.2 required by py-graphene

* py-aniso8601: Add version 7.0.0 and constraints on python version

* py-aniso8601: fix style

* [fix] py-aniso8601: remove version 3.0.2 which depends on removed version of python; set the minimal version of python to 3.7

* py-aniso8601: remove version constraint on python
2023-09-18 10:46:30 -05:00
Diego Alvarez S
da8e022f6b Nextflow: add 22.04.3 (#40034) 2023-09-18 17:39:27 +02:00
George Young
c8a3f1a8ae chexmix: new package (#38441)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-18 17:36:44 +02:00
Michael Kuhn
be3f7b5da3 checksum: use FIXME instead of FIX ME when adding versions (#40050)
The former is highlighted by editors, while the latter is not.
2023-09-18 17:31:11 +02:00
Martin Aumüller
68d7ce3bb6 coin3d: apply upstream patches to fix build on Centos 8 (#39839)
fixes undefined references to dlerror, dlopen, dlsym, dlclose and
XDefaultScreenOfDisplay, XScreenNumberOfScreen, XCreatePixmap, XFree, XFreePixmap, XCloseDisplay, XOpenDisplay
2023-09-18 17:16:56 +02:00
Rémi Lacroix
7a490e95b6 Julia: Fix sha256 for some LLVM patches (#40041) 2023-09-18 10:52:00 +02:00
Joscha Legewie
088e4c6b64 grass: Update spec for gdal dependency (#38396) 2023-09-15 21:36:46 -05:00
Caetano Melone
76816d722a CI: add spec to job vars (#39905)
* CI: add details about built spec to ci job variables

Co-authored-by: Alec Scott <alec@bcs.sh>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2023-09-15 14:40:00 -06:00
Scott Wittenburg
cbabdf283c gitlab ci: submit the correct audience for protected publish jobs (#40046) 2023-09-15 20:09:18 +00:00
John W. Parent
060bc01273 Windows RPATHing: fix symlink error (#39933)
With 349ba83, you cannot symlink() if the link already exists.
Update the simulated RPATHing logic on Windows to account for that.
2023-09-15 12:55:18 -07:00
Harmen Stoppels
0280ac51ed linux-headers: drop rsync dep, add new version (#39867) 2023-09-15 17:23:29 +02:00
David Gardner
bd442fea40 update sha256 after new tarball uploads (#40030) 2023-09-15 17:20:37 +02:00
Massimiliano Culpo
fb9e5fcc4f Group primitive url/path handling functions together (#40028) 2023-09-15 15:43:23 +02:00
Jack Morrison
bc02453f6d Sandia OpenSHMEM: Update to include version 1.5.2 (#40025) 2023-09-15 08:38:47 -04:00
dependabot[bot]
74a6c48d96 build(deps): bump sphinx from 7.2.5 to 7.2.6 in /lib/spack/docs (#40029)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 7.2.5 to 7.2.6.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.2.5...v7.2.6)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-15 09:51:51 +02:00
Alex Richert
854f169ded Add shared and pic variants to libxml2 (#38094) 2023-09-14 19:57:52 -07:00
Adam Grofe
94c2043b28 Add Scine QCMaquis recipe to builtin packages (#39709)
* Add Scine QCMaquis recipe to builtin packages
* Format scine-qcmaquis recipe using black
* Remove import os from scine-qcmaquis recipe
* Reduce length of symmetries line in scine-qcmaquis
* Add myself as a maintainer of the scine-qcmaquis recipe
* Update Scine-QCMaquis recipe following the review
  PR URL: https://github.com/spack/spack/pull/39709
  Changes:
  - Updated Symmetries variant to use the multi feature
  - Added dependency on boost+chrono since it was undocumented
  - Use define_from_variant to set up CMake args
  - Make version 3.1.2 preferred since the 3.1.3 build is broken
* Change build_tests boolean condition

---------

Co-authored-by: Adam Grofe <adamgrofe@microsoft.com>
2023-09-14 17:57:42 -04:00
Scott Wittenburg
579df768ca gitlab ci: Put Cray binaries in public mirrors (#39938) 2023-09-14 15:11:29 -06:00
George Young
e004db8f77 meme: add 5.5.4, correct dependencies (#40014)
* meme: add 5.5.4, correct dependencies
* variant renaming

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-14 13:52:40 -05:00
George Young
2d1cca2839 gffread: new package @0.12.7 (#40015)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-14 13:34:22 -04:00
Alex Richert
8dae369a69 Set # of ninja procs to 'make_jobs' for scipy (#293) (#38831) 2023-09-14 13:08:59 -04:00
Alex Richert
0d76436780 Add pic option & maintainer to lz4 (#38095)
Co-authored-by: Greg Becker <becker33@llnl.gov>
2023-09-14 09:50:11 -07:00
Massimiliano Culpo
34402beeb7 Ensure PythonExtension is consistent when finding externals (#40012)
PythonExtension is a base class for PythonPackage, and
is meant to be used for any package that is a Python
extension but is not built using "python_pip".

The "update_external_dependency" method in the base
class calls another method that is defined in the derived
class.

Push "get_external_python_for_prefix" up in the hierarchy
to make method calls consistent.
2023-09-14 18:26:40 +02:00
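A toy sketch of the inconsistency being fixed (illustrative, not the real class bodies):

```python
class PythonExtension:
    def update_external_dependency(self):
        # Before the fix, this call only resolved on subclasses that
        # happened to define the method below.
        return self.get_external_python_for_prefix()

class PythonPackage(PythonExtension):
    def get_external_python_for_prefix(self):
        return "/usr"

# PythonExtension().update_external_dependency() raises AttributeError;
# defining get_external_python_for_prefix() on PythonExtension itself
# makes the call resolvable for every subclass.
```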
Dan LaManna
6a249944f5 Add OIDC tokens to gitlab-ci jobs (#39813)
* Add OIDC tokens to gitlab-ci jobs

This should allow us to start issuing just-in-time generated
credentials for CI jobs that need to modify binary mirrors. The "aud"
claim of the token describes what the token is allowed to do. The
claim is verified against a set of rules on the IAM role using signed
information from GitLab. See spack-infrastructure for the claim
verification logic.

---------

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2023-09-14 09:59:25 -06:00
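As a rough illustration of audience checking (hypothetical names and audience string, using PyJWT; the real verification lives in spack-infrastructure):

```python
import jwt  # PyJWT

def allowed_to_publish(token: str, signing_key: str) -> bool:
    try:
        jwt.decode(
            token,
            key=signing_key,
            algorithms=["RS256"],
            # Reject tokens whose "aud" claim was minted for a
            # different purpose.
            audience="protected-binary-mirror",
        )
        return True
    except jwt.InvalidTokenError:
        return False
```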
Harmen Stoppels
6838ee6bb8 Add efficient deptype flag and spack.deptypes module (#39472)
This commit replaces the internal representation of deptypes with `int`, which is more compact
and faster to operate with.

Double loops like:
```
any(x in ys for x in xs)
```
are replaced by the constant-time operation `bool(xs & ys)`, where `xs` and `ys` are dependency types. 

Global constants are exposed for convenience in `spack.deptypes`
2023-09-14 12:25:24 +02:00
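A minimal sketch of the bitmask idea, with illustrative flag values (the real constants live in `spack.deptypes`):

```python
BUILD, LINK, RUN, TEST = 1, 2, 4, 8  # illustrative bit flags

xs = BUILD | LINK  # deptypes on one edge
ys = LINK | RUN    # deptypes being queried

# Instead of any(x in ys for x in xs) over collections of strings:
print(bool(xs & ys))  # True, since LINK is set in both masks
```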
Massimiliano Culpo
d50f296d4f data-vis-sdk: make llvm~libomptarget the default (#40007)
Currently the configuration of the pipeline is such that
there are multiple "optimal" solutions. This is due to
the pipeline making ~lld the default for LLVM, while leaving
+libomptarget as the default from package.py.

Since LLVM recipe has a:

conflicts("~lld", "+libomptarget")

clingo is forced to change one of the two defaults, and
the penalty for doing so is the same for each. Hence, it
is random whether we'll get +libomptarget+lld
or ~libomptarget~lld.

This fixes it by changing also the default for libomptarget.
2023-09-14 09:19:47 +02:00
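A hedged `package.py`-style fragment (not the actual LLVM recipe) showing how aligning the two defaults removes the tie:

```python
from spack.package import *

class Llvm(CMakePackage):
    variant("lld", default=False, description="Build the LLD linker")
    # Making ~libomptarget the default as well means the conflict below
    # is satisfied without the solver having to flip either variant.
    variant("libomptarget", default=False, description="Build libomptarget")
    conflicts("~lld", when="+libomptarget")
```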
Alex Richert
e5d227e73d Support static case for find_libraries() in xz (#38100)
* Update xz

* add maintainer to xz

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2023-09-13 22:34:11 -04:00
Alex Richert
af7b4c5a2f Add pic variant for libpng (#37964) 2023-09-13 20:43:10 -04:00
Alex Richert
75e9742d71 Allow git to compile against static curl (#37960) 2023-09-13 16:03:06 -07:00
Billae
e2f2559a5a add trompeloeil package (#39944)
* add trompeloeil package
* remove maintainer, as he wishes not to be one
* fix style
2023-09-13 16:02:24 -07:00
Alex Richert
cac7f9774a Add static-only option for udunits (#38099)
* Add static-only option for udunits

* add maintainer to udunits
2023-09-13 15:58:55 -07:00
Alex Richert
6c4f8e62ae Add libs property/find_libraries to libiconv (#37962) 2023-09-13 15:52:30 -07:00
Alex Richert
cb03db3d69 Add pic and shared variants to GSL (#37961)
* Add pic and shared variants for gsl

* fix gsl variant logic

* make +pic default for gsl
2023-09-13 15:49:32 -07:00
Alex Richert
372bbb43a8 ip recipe updates (#39997) 2023-09-13 15:41:45 -07:00
Alex Richert
7b763faa1c Add develop version to g2tmpl (#39994)
* Add develop version to g2tmpl
* Update package.py
2023-09-13 15:39:42 -07:00
Alex Richert
4a79857b5e Add develop version for gfsio (#39995)
* Add develop version for gfsio
* Update package.py
2023-09-13 15:38:13 -07:00
Alex Richert
61df3b9080 Add develop version to landsfcutil (#39998) 2023-09-13 15:36:43 -07:00
Alex Richert
0b134aa711 Add two versions to ncio recipe (#39999) 2023-09-13 15:35:29 -07:00
Alex Richert
ea71477a9d Add develop version for nemsio (#40000) 2023-09-13 15:32:32 -07:00
Alex Richert
bc26848cee Add develop version to nemsiogfs (#40001) 2023-09-13 15:31:19 -07:00
Alex Richert
3e50ee70be Add develop version to sfcio (#40002)
* Add develop version to sfcio
* Update package.py
2023-09-13 15:30:14 -07:00
Alex Richert
0d8a20b05e Add develop version to sigio (#40003) 2023-09-13 15:28:48 -07:00
Alex Richert
74d63c2fd3 Add develop version to wrfio (#40004) 2023-09-13 15:17:42 -07:00
Alex Richert
a0622a2ee0 Add check() to sp recipe (#40005) 2023-09-13 15:14:07 -07:00
Alex Richert
a25a910ba0 Update upp recipe (#40006) 2023-09-13 15:10:05 -07:00
snehring
0e5ce57fd5 interproscan: adding new version and variant to include databases. (#40009) 2023-09-13 18:03:44 -04:00
Jordan Ogas
e86c07547d add charliecloud 0.34, deprecate previous (#40008) 2023-09-13 14:59:24 -07:00
Harmen Stoppels
d7b5a27d1d Revert "ASP-based solver: don't declare deprecated versions unless required (#38181)" (#40010)
This reverts commit babd29da50.
2023-09-13 23:33:55 +02:00
Laura Bellentani
eee8fdc438 scorep: Change in configuration option for nvhpc compiler (#39919)
* scorep version 8.1 added
* configure finds cudart and cupti in the nvhpcsdk suite
* style fixed
* changes to find libcuda.so in cuda directory

---------

Co-authored-by: Laura Bellentani <lbellen1@login01.leonardo.local>
2023-09-13 16:07:32 -04:00
Cameron Rutherford
5a91802807 hiop: Add version 1.0.0 (#39972) 2023-09-13 10:15:30 -07:00
Harmen Stoppels
7fd56da5b7 glibc: add older versions (#39978) 2023-09-13 10:06:42 -07:00
Adam J. Stewart
eefa5d6cb5 GDAL: add v3.7.2 (#39979) 2023-09-13 09:59:51 -07:00
AMD Toolchain Support
845973273a CP2K package: various AOCC compatibility fixes (#39773)
* cp2k: patch several old versions to help newer compilers
* cp2k: use -O2 optimization for AOCC compiler
* cp2k: do not support old AOCC compilers
* cp2k: simplify `when` clauses now that old compilers are conflicted out
* cp2k: give a more meaningful message for conflicts

Co-authored-by: Ning Li <ning.li@amd.com>
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
2023-09-13 09:48:45 -07:00
Gurkirat Singh
0696497ffa Add cmake package for v2.1.0 and master (git) versions of libfirefly (#39778)
* libfirefly: Add cmake package for v2.1.0 and master (git) versions

* Separate git URL from version declaration

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-09-13 09:17:33 -07:00
Massimiliano Culpo
babd29da50 ASP-based solver: don't declare deprecated versions unless required (#38181)
Currently, the concretizer emits facts for all versions known to Spack, including deprecated versions, and has a specific optimization objective to minimize their use.

This commit simplifies how deprecated versions are handled by considering possible versions for a spec only if they appear in a spec literal, or if `config:deprecated:true` is set directly or through the `--deprecated` flag. The optimization objective has also been removed, in favor of just ordering versions and placing deprecated ones last.

This results in:

a) no delayed errors on install, but concretization errors when deprecated versions would be the only option. This is in particular relevant for CI where it's better to get errors early
b) a slight concretization speed-up due to fewer facts
c) a simplification of the logic program.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-09-13 18:15:57 +02:00
Massimiliano Culpo
c4e2d24ca9 Spec.tree: add type hints (#39948) 2023-09-13 11:53:14 +02:00
Harmen Stoppels
2c13361b09 gitlab ci: temporarily disable darwin aarch64 (#39974)
as few or no runners are available
2023-09-13 11:11:33 +02:00
Harmen Stoppels
e8740b40da unifyfs: support openssl 3 (#39945)
* unifyfs: drop upperbound on deprecated openssl

The package uses deprecated MD5 functions from OpenSSL, which causes
warnings, but (a) Spack by default disables -Werror, and (b) those
functions will continue to exist in OpenSSL 3.

* unifyfs: enable parallel build, only make check sequential

* unifyfs: order class methods by install phases
2023-09-13 08:59:47 +02:00
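The parallel-build/serial-check split typically looks like this sketch (pattern only, not the exact unifyfs recipe):

```python
from spack.package import *

class Unifyfs(AutotoolsPackage):
    # versions, variants and dependencies elided

    def check(self):
        # The build stays parallel; only the test suite runs serially.
        with working_dir(self.build_directory):
            make("check", parallel=False)
```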
Harmen Stoppels
e45fc994aa py-maestrowf: new version, fix artificial py-cryptography dep, support openssl 3 (#39946)
* py-maestrowf: add new version 1.1.9, deprecate development releases

* py-maestrowf: drop py-cryptography in 1.1.9

* py-maestrowf: drop py-cryptography dependency entirely, since it is not a direct dependency

* py-merlin: new version, ensure openssl 3 compat

* py-merlin: drop py-coloredlogs@10: lowerbound

* py-maestrowf: add py-rich, reorder deps

* py-celery: explain why upperbound is in spack but not in requirements.txt
2023-09-13 07:30:52 +02:00
Alex Richert
ed3f5fba1f Update g2c recipe: add develop version and misc. variants (#39965)
* Update g2c recipe: add develop version and misc. variants
* add testing to g2c
* Update package.py
2023-09-12 18:23:48 -04:00
Adam J. Stewart
d0e8a4d26f py-lightly: add v1.4.18 (#39960) 2023-09-12 16:53:41 -04:00
Massimiliano Culpo
60a5f70b80 Fix typo in metaclasses (#39947) 2023-09-12 22:53:21 +02:00
Scott Wittenburg
fa9acb6a98 ci: Tag reindex job to match the image architecture (#39958) 2023-09-12 16:33:56 -04:00
Johann Gaebler
621d42d0c7 r-piggyback: Add R package piggyback (#39932) 2023-09-12 12:53:47 -07:00
Johann Gaebler
cfdaee4725 r-pl94171: Add R package PL94171 (#39926) 2023-09-12 12:51:23 -07:00
Massimiliano Culpo
306ba86709 binutils: add v2.41 (#39483) 2023-09-12 21:15:45 +02:00
George Young
b47fd61f18 bam-readcount: add 1.0.1, correct build flags (#39950)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-12 11:26:30 -07:00
Tamara Dahlgren
c10ff27600 CI/Add superlu-dist to broken stand-alone tests due to hang (#38908)
* Add superlu-dist to broken stand-alone CI tests
* Revert "disable superlu test (#38894)"
  This reverts commit e16397b5d8.
2023-09-12 11:09:25 -07:00
Adam Moody
9056f31f11 mpifileutils: switch to main branch (#39525) 2023-09-12 10:57:29 -07:00
Dom Heinzeller
513232cdb3 libjpeg-turbo: Fix Darwin lib install name (#39834)
* Fix Darwin lib install name in var/spack/repos/builtin/packages/libjpeg-turbo/package.py

* Update var/spack/repos/builtin/packages/libjpeg-turbo/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-09-12 10:49:30 -07:00
Dom Heinzeller
30893dd99a Filter spack compiler wrappers for h4cc in var/spack/repos/builtin/packages/hdf/package.py (#39870) 2023-09-12 10:41:04 -07:00
Alex Richert
b7c2411b50 g2 package bugfix (#39923)
* g2 package bugfix

* g2: fix w3emc dep

* fix setup_run_environment

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* style fix

* Update var/spack/repos/builtin/packages/g2/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Update var/spack/repos/builtin/packages/g2/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-09-12 10:40:37 -07:00
Weiqun Zhang
002e833993 amrex: add v23.09 (#39788) 2023-09-12 10:39:56 -07:00
Martin Aumüller
02c9296db4 openexr: add 2.4.3, 2.5.9, 3.1.11 & 3.2.0 (#39863)
* openexr: add 2.4.3, 2.5.9, 3.1.11 & 3.2.0

- 2.5.9 is the latest version compatible with OpenScenGraph
- improved compatibility with GCC 13

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* openexr: drop unsupported debug variant

--disable/enable-debug is not supported by ./configure in 1.3, 1.7, 2.0, 2.2 and 2.3

* openexr: transform into multi-build system package

simplifies package considerably, as nothing special seems to be required

* openexr: pkg-config is also used by @3

* openexr: use system libdeflate instead of internal

if no libdeflate is found, openexr would download and build its own starting with 3.2.0

* openexr: disable tests

tests would download lots of data during cmake and make build times noticeably longer

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-12 10:37:48 -07:00
Vincent Michaud-Rioux
562065c427 Update PennyLane packages to v0.32. (#39688)
* Update PennyLane packages to v0.32.

* Reformat.

* Couple small fixes.

* Fix Lightning cmake_args.

* Couple dep fixes in py-pennylane + plugins.

* Fix scipy condition.

* Add comment on conflicting requirement.
2023-09-12 12:19:29 -05:00
Harmen Stoppels
9d00bcb286 openssl: deprecate 1.1.1 (#39942) 2023-09-12 12:45:51 +02:00
Harmen Stoppels
a009a1a62a asp.py: fix deprecation warning (#39943) 2023-09-12 09:27:03 +00:00
Mikael Simberg
f7692d5699 Add gperftools 2.13 (#39940) 2023-09-12 11:26:19 +02:00
George Young
12e1768fdb filtlong: add v0.2.1, patch for %gcc@13: (#39775)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-12 04:03:02 -04:00
dependabot[bot]
28a3be3eca build(deps): bump black from 23.7.0 to 23.9.1 in /lib/spack/docs (#39937)
Bumps [black](https://github.com/psf/black) from 23.7.0 to 23.9.1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.7.0...23.9.1)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-12 07:44:19 +02:00
Brian Van Essen
33500b5169 lbann: make two variants sticky (#39710)
* Make the state of the Python Front End (PFE) and Python data reader
support sticky so that the concretizer does not arbitrarily disable
them.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-09-12 01:42:48 -04:00
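For reference, a sticky variant is declared like this hedged fragment (illustrative variant name, not the full LBANN recipe):

```python
from spack.package import *

class Lbann(CMakePackage):
    # sticky=True forbids the concretizer from flipping the value on
    # its own; only an explicit user request can change it.
    variant("pfe", default=True, sticky=True,
            description="Enable the Python Front End")
```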
Stephen Sachs
91011a8c5f [gromacs] Fix 2021.2 build error (#39722)
* [gromacs] Fix 2021.2 build error

See https://gromacs.bioexcel.eu/t/compilation-failure-for-gromacs-2021-1-and-2021-2-with-cmake-3-20-2/2129

* Update var/spack/repos/builtin/packages/gromacs/package.py

Co-authored-by: Andrey Alekseenko <al42and@gmail.com>

---------

Co-authored-by: Andrey Alekseenko <al42and@gmail.com>
2023-09-11 19:49:18 -04:00
Caetano Melone
4570c9de5b fish: add 3.6.0, 3.6.1, update cmake requirement (#39899) 2023-09-11 19:34:33 -04:00
Adam J. Stewart
e7507dcd08 py-build: add v1 (#39930) 2023-09-11 19:04:32 -04:00
Dom Heinzeller
4711758593 Add awscli@1.29.41 and its dependency py-botocore@1.31.41 (#39878)
* Add awscli@1.29.41 and its dependency py-botocore@1.31.41

* Update var/spack/repos/builtin/packages/awscli/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-09-11 18:02:46 -04:00
Sinan
b55e9e8248 package_qscintilla_build_with_qt6 (#39544)
* package_qscintilla_build_with_qt6

* Update var/spack/repos/builtin/packages/qscintilla/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* improve

* fix Qsci.abi3.so install

* simplify, fix, tidy

* fix style

* fix style

* fix style

* Update var/spack/repos/builtin/packages/qscintilla/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/qscintilla/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/qscintilla/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/qscintilla/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/qscintilla/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* improve

* improve style

* fix style

* make black happy

* add ver 2.14.1

* update

* make black happy

* Update var/spack/repos/builtin/packages/qscintilla/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* improve

---------

Co-authored-by: Sinan81 <Sinan@world>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-11 16:58:28 -05:00
Adam J. Stewart
2e62cfea3e py-black: add v23.9 (#39931) 2023-09-11 16:36:16 -05:00
Manuela Kuhn
286e1147d6 py-tifffile: add 2023.8.30 (#39918) 2023-09-11 16:34:20 -05:00
Manuela Kuhn
7e3d228d19 py-tinycss2: add 1.2.1 (#39921) 2023-09-11 16:32:42 -05:00
Manuela Kuhn
a1ab42c8c0 py-tomlkit: add 0.12.1 (#39922) 2023-09-11 16:31:55 -05:00
Rocco Meli
57a2f2ddde tidynamics: add v1.1.2 (#39911)
* ensure umpire~cuda~rocm when ~cuda~rocm

* update tidynamics
2023-09-11 16:17:17 -05:00
Mike Renfro
b7b7b2fac4 correct py-bx-python@0.8.8 python dependency (#39766)
* correct python dependency for 0.8.8

* add missing zlib dependency

* restore missing whitespace

* correct versions for 0.9.0, clarify rules for future releases

* Update package.py

* Remove whitespace for black check
2023-09-11 16:16:28 -05:00
Tim Haines
5ac1167250 fpchecker: add version 0.4.0 (#39924) 2023-09-11 17:16:20 -04:00
Richard Berger
fcc4185132 cuda: add versions 12.2.0 and 12.2.1 (#39684) 2023-09-11 23:13:06 +02:00
sid
943c8091c2 py-fairscale and dependencies py-pgzip libgit2 py-pygit2 (#39220)
* simple build of py-openai

* added variants to py-openai

* py-pandas-stubs is a dependency for py-openai

* fixed format and flake8 errors for py-openai

* black format error for py-pandas-stubs

* [@spackbot] updating style on behalf of sidpbury

* made style and format changes to py-openai

* made style and format changes to py-pandas-stubs

* py-types-pytz is a dependency for py-openai

* [@spackbot] updating style on behalf of sidpbury

* updated py-openpyxl for ver 3.0.7 and 3.1.2

* Update var/spack/repos/builtin/packages/py-pandas-stubs/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* ajs requested changes for py-openai

* updated py-openpyxl for supported python

* [@spackbot] updating style on behalf of sidpbury

* updated py-openpyxl

* removed requirements.txt dependencies in py-openpyxl

* removed python depends on from openpyxl

* updated package to support newer versions

* updated version of py-pygit2

* py-fairscale is a new package in support of torch

* py-pgzip is a dependency for py-fairscale

* switch fairscale pypi, added extra variant for convenience

* removed python dependency

* changed multiple requirement versions

* changes for upstream py-fairscale

* changes for upstream py-pygit2

* Update package.py

* Update package.py

* sorted out some of the dependency versions

* removed version 1.12.2 because dependency could not be met

* updated py-cached-property dependency

* suggested changes from adamjstewart

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-11 16:11:02 -05:00
Nils Vu
a7a6f2aaef spectre: add v2023.08.18 and v2023.09.07 (#39925) 2023-09-11 17:10:28 -04:00
Rocco Meli
5aaa82fb69 Update MDAnalysis (#39787)
* ensure umpire~cuda~rocm when ~cuda~rocm

* update mdanalysis

* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-11 16:06:20 -05:00
Simon Pintarelli
bc33d5f421 update q-e-sirius package (#39744) 2023-09-11 22:48:19 +02:00
Juan Miguel Carceller
0e4e232ad1 Add another url for whizard (#37354)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-09-11 22:45:21 +02:00
Scot Halverson
145d44cd97 gdrcopy: inherit from CudaPackage, update dependencies (#39596) 2023-09-11 22:36:20 +02:00
Fábio de Andrade Barboza
e797a89fe1 py-pipdeptree: add new package; py-hatchling: add 1.18.0 (#39697)
* py-hatchling: add 1.18.0

* py-pipdeptree: add new package

* py-hatchling: add Python 3.8 dependency

* Apply suggestion from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Fábio de Andrade Barboza <f168817@dac.unicamp.br>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-11 15:14:33 -05:00
Jose E. Roman
254a2bc3ea SLEPc: add v3.19.2 (#39805)
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2023-09-11 22:13:21 +02:00
Fábio de Andrade Barboza
873652a33e py-enum-tools: add new package and dependencies (#39672)
* Adds package 'enum-tools'

* Fixes syntax

* py-enum-tools: update pypi link and dependencies

* py-whey: add new package

* py-consolekit: add new package

* py-shippinglabel: add new package

* py-project-parser: add new package

* py-handy-archives: add new package

* py-domdf-python-tools: add new package

* py-dom-toml: add new package

* py-dist-meta: add new package

* py-deprecation-alias: add new package

* py-pygments: add versions 2.16.0 and 2.16.1

* py-mistletoe: add new package

* py-apeye: add new package

* py-apeye-core: add new package

* Fix dependencies' names and remove redundancies

* Fix style

* py-enum-tools: fix dependencies

* py-mistletoe: update dependencies

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-whey: fix dependencies

* py-whey: fix style

* py-apeye: fix dependencies

---------

Co-authored-by: Fábio de Andrade Barboza <f168817@dac.unicamp.br>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-09-11 15:12:46 -05:00
Travis
3f686734ab dakota: add v6.18 (#39882)
Co-authored-by: Travis Drayna <tdrayna@loki.local>
2023-09-11 22:06:16 +02:00
Thomas Madlener
e1373d5408 dd4hep: make sure to find libraries correctly (#39516) 2023-09-11 16:04:42 -04:00
dslarm
a2054564d8 ctffind: extend a Power 9 patch to AArch64 (#39137) 2023-09-11 16:04:20 -04:00
Sébastien Valat
ea75dbf7bd MALT: add v1.2.2 (#39890) 2023-09-11 21:57:36 +02:00
Sébastien Valat
63a67e525b NUMAPROF: add v1.1.5 (#39891) 2023-09-11 21:57:04 +02:00
Manuela Kuhn
95aaaeb5af py-dunamai: add 1.18.0 (#39849) 2023-09-11 14:56:34 -05:00
Harmen Stoppels
a0bd53148b glibc: fix deps, drop pre 2.17 (#39806)
* glibc: add missing deps, drop < 2.17

Building glibc 2.17 requires linking libgcc_{s,eh}.a, which themselves
depend on whatever glibc the current gcc uses for its runtime libraries.

Newer gcc appears to depend on GNU extensions of libdl, which means you
simply cannot build an old glibc with a gcc that was itself built against a new glibc.

The relevant fix in glibc is this commit:

commit 95f5a9a866695da4e038aa4e6ccbbfd5d9cf63b7
Author: Joseph Myers <joseph@codesourcery.com>
Date:   Tue Jul 3 19:14:59 2012 +0000

    Avoid use of libgcc_s and libgcc_eh when building glibc.

See also references to that commit in the glibc mailing list.

* update the gmake bound

* add --without-selinux
2023-09-11 10:45:53 -07:00
Massimiliano Culpo
3302b176fd Update archspec to latest commit (#39920)
- [x] Intel flags for old architectures
- [x] Support for Sapphire Rapids
- [x] Cache the "ancestors" computation
2023-09-11 10:03:35 -07:00
Taillefumier Mathieu
4672346d9c Support of versions of cp2k below 7 marked as deprecated (#39652)
SIRIUS support was introduced in CP2K version 7, but only became
usable in practice with version 9 (input format and functionality).
SIRIUS is therefore marked as a dependency conflict for CP2K
versions below 9.

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2023-09-11 18:46:52 +02:00
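A hedged fragment of how such a version-gated conflict can be expressed (illustrative, not the exact CP2K recipe):

```python
from spack.package import *

class Cp2k(MakefilePackage):
    variant("sirius", default=False, description="Enable SIRIUS support")
    # SIRIUS only became practical to use with CP2K 9, so older
    # versions are treated as a dependency conflict.
    conflicts("+sirius", when="@:8", msg="SIRIUS support requires CP2K 9+")
```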
Tom Payerle
4182e97761 elpa: Add --enable-scalapack-tests to configure when +autotune (#39728)
When building elpa +autotune, we apparently need to add
the configure option --enable-scalapack-tests
2023-09-11 18:37:56 +02:00
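A sketch of the conditional configure argument (pattern only, assuming the variant is named `autotune`):

```python
from spack.package import *

class Elpa(AutotoolsPackage):
    variant("autotune", default=False, description="Enable autotuning")

    def configure_args(self):
        args = []
        # +autotune builds need the ScaLAPACK test machinery enabled.
        if self.spec.satisfies("+autotune"):
            args.append("--enable-scalapack-tests")
        return args
```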
Seth R. Johnson
5598de88ff celeritas: add v0.3.2 (#39879) 2023-09-11 12:18:48 -04:00
Simon Pintarelli
d88ecf0af0 sirius: fix cuda_arch flags (#39889) 2023-09-11 12:13:49 -04:00
Scott Wittenburg
516f0461b8 gitlab ci: Use image with awscli for rebuild-index job (#39636)
Instead of pointing to the image on DockerHub, which rate limits us and
causes pipeline failures during busy times, use the version at ghcr.

And we might as well use the ghcr version everywhere else too.
2023-09-11 18:03:11 +02:00
Alex Richert
b8cf7c3835 w3emc: add develop version and more variants (#39884)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-09-11 18:02:50 +02:00
Annop Wongwathanarat
707684a7b7 acfl: map rocky9 to RHEL-9 (#39892) 2023-09-11 11:58:45 -04:00
Alex Richert
739aebbd18 sp: add develop version, build precision, and OpenMP support (#39874)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-09-11 17:41:32 +02:00
Tao Lin
71c053c391 Garfield++: Add the patch to fix the missing headers and set the environment variable HEED_DATABASE (#39904) 2023-09-11 11:13:22 -04:00
Thomas Gruber
9a0a4eceaf likwid: search library path for compilation with hwloc (#39659) 2023-09-11 11:03:21 -04:00
kjrstory
19f8e9147d openfoam-org: zoltan renumbering and decomposition (#39915) 2023-09-11 16:42:50 +02:00
George Young
e99750fd3c transdecoder: add 5.7.1 (#39916)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-11 16:37:36 +02:00
George Young
33cde87775 gffcompare: new package @0.12.6 (#39917)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-09-11 16:13:49 +02:00
Annop Wongwathanarat
e63597bf79 wrf: fix patches for 4.2 to 4.4.2 for aarch64 config (#39900) 2023-09-11 16:02:21 +02:00
Robert Cohn
2f63342677 intel-oneapi-mkl: hpcx-mpi support (#39750) 2023-09-11 15:47:52 +02:00
Freifrau von Bleifrei
04eae7316f DisCoTec: add new package (#35239)
Co-authored-by: Alexander Van Craen <40516079+vancraar@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-09-11 04:18:00 -04:00
Adam Fidel
67d6c086d8 singularity: fix package name in comment (#39822) 2023-09-10 12:33:10 -04:00
Mark (he/his) C. Miller
69370c9c8f Update package H5Z-ZFP to 1.1.1 (#39740)
* update to 1.1.1

* add maintainers

* add 1.1.0; set default
2023-09-10 09:27:24 -07:00
Rocco Meli
9948785220 spglib: add v2.1.0 (#39910)
* ensure umpire~cuda~rocm when ~cuda~rocm

* update spglib
2023-09-10 08:48:13 -07:00
dependabot[bot]
4047e025e0 build(deps): bump docker/build-push-action from 4.1.1 to 4.2.1 (#39901)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 4.1.1 to 4.2.1.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](2eb1c1961a...0a97817b6a)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-10 09:52:15 +02:00
896 changed files with 26760 additions and 8402 deletions

View File

@@ -10,3 +10,8 @@ updates:
directory: "/lib/spack/docs"
schedule:
interval: "daily"
# Requirements to run style checks
- package-ecosystem: "pip"
directory: "/.github/workflows/style"
schedule:
interval: "daily"

View File

@@ -22,8 +22,8 @@ jobs:
matrix:
operating_system: ["ubuntu-latest", "macos-latest"]
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages
@@ -34,6 +34,7 @@ jobs:
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage run $(which spack) audit externals
coverage combine
coverage xml
- name: Package audits (without coverage)
@@ -41,6 +42,7 @@ jobs:
run: |
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d # @v2.1.0
if: ${{ inputs.with_coverage == 'true' }}
with:

View File

@@ -24,7 +24,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -42,8 +42,8 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
@@ -62,7 +62,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -80,8 +80,8 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
@@ -99,7 +99,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -133,7 +133,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup repo
@@ -145,8 +145,8 @@ jobs:
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
@@ -158,13 +158,13 @@ jobs:
run: |
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
export PATH=/usr/local/opt/bison@2.7/bin:$PATH
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack external find --not-buildable cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
@@ -179,11 +179,11 @@ jobs:
run: |
brew install tree
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- name: Bootstrap clingo
run: |
set -ex
for ver in '3.6' '3.7' '3.8' '3.9' '3.10' ; do
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
@@ -204,7 +204,7 @@ jobs:
runs-on: ubuntu-20.04
steps:
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup repo
@@ -214,7 +214,7 @@ jobs:
- name: Bootstrap clingo
run: |
set -ex
for ver in '3.6' '3.7' '3.8' '3.9' '3.10' ; do
for ver in '3.7' '3.8' '3.9' '3.10' '3.11' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
@@ -247,7 +247,7 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -265,6 +265,7 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
@@ -283,7 +284,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- name: Setup non-root user
@@ -302,8 +303,8 @@ jobs:
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack -d gpg list
tree ~/.spack/bootstrap/store/
@@ -316,10 +317,11 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.4
spack bootstrap disable spack-install
spack -d gpg list
tree ~/.spack/bootstrap/store/
@@ -333,13 +335,13 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack bootstrap disable github-actions-v0.3
spack -d gpg list
tree ~/.spack/bootstrap/store/

View File

@@ -56,7 +56,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- name: Set Container Tag Normal (Nightly)
run: |
@@ -92,13 +92,13 @@ jobs:
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@2b82ce82d56a2a04d2637cd93a637ae1b359c0a7 # @v1
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3 # @v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@885d1462b80bc1c1c7f0b00334ad271f09369c55 # @v1
uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226 # @v1
- name: Log in to GitHub Container Registry
uses: docker/login-action@465a07811f14bebb1938fbed4728c6a1ff8901fc # @v1
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d # @v1
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -106,13 +106,13 @@ jobs:
- name: Log in to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@465a07811f14bebb1938fbed4728c6a1ff8901fc # @v1
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d # @v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@2eb1c1961a95fc15694676618e422e8ba1d63825 # @v2
uses: docker/build-push-action@0565240e2d4ab88bba5387d719585280857ece09 # @v2
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -35,7 +35,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0

View File

@@ -14,10 +14,10 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -0,0 +1,7 @@
black==23.9.1
clingo==5.6.2
flake8==6.1.0
isort==5.12.0
mypy==1.6.1
types-six==1.16.21.9
vermin==1.5.2

View File

@@ -15,7 +15,7 @@ jobs:
strategy:
matrix:
os: [ubuntu-latest]
python-version: ['3.7', '3.8', '3.9', '3.10', '3.11']
python-version: ['3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
concretizer: ['clingo']
on_develop:
- ${{ github.ref == 'refs/heads/develop' }}
@@ -45,12 +45,16 @@ jobs:
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
- python-version: '3.11'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -94,10 +98,10 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -133,7 +137,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -152,10 +156,10 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -185,12 +189,12 @@ jobs:
runs-on: macos-latest
strategy:
matrix:
python-version: ["3.10"]
python-version: ["3.11"]
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages

View File

@@ -18,15 +18,15 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: '3.11'
cache: 'pip'
- name: Install Python Packages
run: |
pip install --upgrade pip
pip install --upgrade vermin
pip install --upgrade pip setuptools
pip install -r .github/workflows/style/requirements.txt
- name: vermin (Spack's Core)
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
@@ -35,16 +35,17 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: '3.11'
cache: 'pip'
- name: Install Python packages
run: |
python3 -m pip install --upgrade pip setuptools types-six black==23.1.0 mypy isort clingo flake8
pip install --upgrade pip setuptools
pip install -r .github/workflows/style/requirements.txt
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -68,7 +69,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac # @v2
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- name: Setup repo and non-root user
run: |
git --version

View File

@@ -15,10 +15,10 @@ jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: 3.9
- name: Install Python packages
@@ -39,10 +39,10 @@ jobs:
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: 3.9
- name: Install Python packages
@@ -63,10 +63,10 @@ jobs:
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@3df4ab11eba7bda6032a0b82a6bb43b11571feac
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -27,12 +27,53 @@
# And here's the CITATION.cff format:
#
cff-version: 1.2.0
type: software
message: "If you are referencing Spack in a publication, please cite the paper below."
title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
abstract: >-
Large HPC centers spend considerable time supporting software for thousands of users, but the complexity of HPC software is quickly outpacing the capabilities of existing software management tools.
Scientific applications require specific versions of compilers, MPI, and other dependency libraries, so using a single, standard software stack is infeasible.
However, managing many configurations is difficult because the configuration space is combinatorial in size.
We introduce Spack, a tool used at Lawrence Livermore National Laboratory to manage this complexity.
Spack provides a novel, recursive specification syntax to invoke parametric builds of packages and dependencies.
It allows any number of builds to coexist on the same system, and it ensures that installed packages can find their dependencies, regardless of the environment.
We show through real-world use cases that Spack supports diverse and demanding applications, bringing order to HPC software chaos.
preferred-citation:
title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
type: conference-paper
doi: "10.1145/2807591.2807623"
url: "https://github.com/spack/spack"
url: "https://tgamblin.github.io/pubs/spack-sc15.pdf"
authors:
- family-names: "Gamblin"
given-names: "Todd"
- family-names: "LeGendre"
given-names: "Matthew"
- family-names: "Collette"
given-names: "Michael R."
- family-names: "Lee"
given-names: "Gregory L."
- family-names: "Moody"
given-names: "Adam"
- family-names: "de Supinski"
given-names: "Bronis R."
- family-names: "Futral"
given-names: "Scott"
conference:
name: "Supercomputing 2015 (SC15)"
city: "Austin"
region: "Texas"
country: "US"
date-start: 2015-11-15
date-end: 2015-11-20
month: 11
year: 2015
identifiers:
- description: "The concept DOI of the work."
type: doi
value: 10.1145/2807591.2807623
- description: "The DOE Document Release Number of the work"
type: other
value: "LLNL-CONF-669890"
authors:
- family-names: "Gamblin"
given-names: "Todd"
- family-names: "LeGendre"
@@ -47,12 +88,3 @@ preferred-citation:
given-names: "Bronis R."
- family-names: "Futral"
given-names: "Scott"
title: "The Spack Package Manager: Bringing Order to HPC Software Chaos"
conference:
name: "Supercomputing 2015 (SC15)"
city: "Austin"
region: "Texas"
country: "USA"
month: November 15-20
year: 2015
notes: LLNL-CONF-669890

View File

@@ -7,6 +7,7 @@
[![Read the Docs](https://readthedocs.org/projects/spack/badge/?version=latest)](https://spack.readthedocs.io)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Slack](https://slack.spack.io/badge.svg)](https://slack.spack.io)
[![Matrix](https://img.shields.io/matrix/spack-space%3Amatrix.org?label=Matrix)](https://matrix.to/#/#spack-space:matrix.org)
Spack is a multi-platform package manager that builds and installs
multiple versions and configurations of software. It works on Linux,
@@ -62,7 +63,10 @@ Resources:
* **Slack workspace**: [spackpm.slack.com](https://spackpm.slack.com).
To get an invitation, visit [slack.spack.io](https://slack.spack.io).
* [**Github Discussions**](https://github.com/spack/spack/discussions): not just for discussions, also Q&A.
* **Matrix space**: [#spack-space:matrix.org](https://matrix.to/#/#spack-space:matrix.org):
[bridged](https://github.com/matrix-org/matrix-appservice-slack#matrix-appservice-slack) to Slack.
* [**Github Discussions**](https://github.com/spack/spack/discussions):
not just for discussions, also Q&A.
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack)
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!

View File

@@ -9,15 +9,15 @@ bootstrap:
# may not be able to bootstrap all the software that Spack needs,
# depending on its type.
sources:
- name: 'github-actions-v0.5'
metadata: $spack/share/spack/bootstrap/github-actions-v0.5
- name: 'github-actions-v0.4'
metadata: $spack/share/spack/bootstrap/github-actions-v0.4
- name: 'github-actions-v0.3'
metadata: $spack/share/spack/bootstrap/github-actions-v0.3
- name: 'spack-install'
metadata: $spack/share/spack/bootstrap/spack-install
trusted:
# By default we trust bootstrapping from sources and from binaries
# produced on Github via the workflow
github-actions-v0.5: true
github-actions-v0.4: true
github-actions-v0.3: true
spack-install: true

View File

@@ -41,4 +41,4 @@ concretizer:
# "none": allows a single node for any package in the DAG.
# "minimal": allows the duplication of 'build-tools' nodes only (e.g. py-setuptools, cmake etc.)
# "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
strategy: none
strategy: minimal

View File

@@ -1,4 +1,3 @@
package_list.html
command_index.rst
spack*.rst
llnl*.rst

View File

@@ -45,7 +45,8 @@ Listing available packages
To install software with Spack, you need to know what software is
available. You can see a list of available package names at the
:ref:`package-list` webpage, or using the ``spack list`` command.
`packages.spack.io <https://packages.spack.io>`_ website, or
using the ``spack list`` command.
.. _cmd-spack-list:
@@ -60,7 +61,7 @@ can install:
:ellipsis: 10
There are thousands of them, so we've truncated the output above, but you
can find a :ref:`full list here <package-list>`.
can find a `full list here <https://packages.spack.io>`_.
Packages are listed by name in alphabetical order.
A pattern to match with no wildcards, ``*`` or ``?``,
will be treated as though it started and ended with

View File

@@ -3,6 +3,103 @@
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _concretizer-options:
==========================================
Concretization Settings (concretizer.yaml)
==========================================
The ``concretizer.yaml`` configuration file allows to customize aspects of the
algorithm used to select the dependencies you install. The default configuration
is the following:
.. literalinclude:: _spack_root/etc/spack/defaults/concretizer.yaml
:language: yaml
--------------------------------
Reuse already installed packages
--------------------------------
The ``reuse`` attribute controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use:
.. code-block:: console
% spack install --reuse <spec>
to enable reuse for a single installation, and you can use:
.. code-block:: console
spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: true`` is the default.
------------------------------------------
Selection of the target microarchitectures
------------------------------------------
The options under the ``targets`` attribute control which targets are considered during a solve.
Currently the options in this section are only configurable from the ``concretizer.yaml`` file
and there are no corresponding command line arguments to enable them for a single solve.
The ``granularity`` option can take two possible values: ``microarchitectures`` and ``generic``.
If set to:
.. code-block:: yaml
concretizer:
targets:
granularity: microarchitectures
Spack will consider all the microarchitectures known to ``archspec`` to label nodes for
compatibility. If instead the option is set to:
.. code-block:: yaml
concretizer:
targets:
granularity: generic
Spack will consider only generic microarchitectures. For instance, when running on an
Haswell node, Spack will consider ``haswell`` as the best target in the former case and
``x86_64_v3`` as the best target in the latter case.
The ``host_compatible`` option is a Boolean option that determines whether or not the
microarchitectures considered during the solve are constrained to be compatible with the
host Spack is currently running on. For instance, if this option is set to ``true``, a
user cannot concretize for ``target=icelake`` while running on an Haswell node.
---------------
Duplicate nodes
---------------
The ``duplicates`` attribute controls whether the DAG can contain multiple configurations of
the same package. This is mainly relevant for build dependencies, which may have their version
pinned by some nodes, and thus be required at different versions by different nodes in the same
DAG.
The ``strategy`` option controls how the solver deals with duplicates. If the value is ``none``,
then a single configuration per package is allowed in the DAG. This means, for instance, that only
a single ``cmake`` or a single ``py-setuptools`` version is allowed. The result would be a slightly
faster concretization, at the expense of making a few specs unsolvable.
If the value is ``minimal`` Spack will allow packages tagged as ``build-tools`` to have duplicates.
This allows, for instance, to concretize specs whose nodes require different, and incompatible, ranges
of some build tool. For instance, in the figure below the latest `py-shapely` requires a newer `py-setuptools`,
while `py-numpy` still needs an older version:
.. figure:: images/shapely_duplicates.svg
:scale: 70 %
:align: center
Up to Spack v0.20 ``duplicates:strategy:none`` was the default (and only) behavior. From Spack v0.21 the
default behavior is ``duplicates:strategy:minimal``.
.. _build-settings:
================================
@@ -232,76 +329,6 @@ Specific limitations include:
then Spack will not add a new external entry (``spack config blame packages``
can help locate all external entries).
.. _concretizer-options:
----------------------
Concretizer options
----------------------
``packages.yaml`` gives the concretizer preferences for specific packages,
but you can also use ``concretizer.yaml`` to customize aspects of the
algorithm it uses to select the dependencies you install:
.. literalinclude:: _spack_root/etc/spack/defaults/concretizer.yaml
:language: yaml
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Reuse already installed packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The ``reuse`` attribute controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use:
.. code-block:: console
% spack install --reuse <spec>
to enable reuse for a single installation, and you can use:
.. code-block:: console
spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: true`` is the default.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Selection of the target microarchitectures
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The options under the ``targets`` attribute control which targets are considered during a solve.
Currently the options in this section are only configurable from the ``concretizer.yaml`` file
and there are no corresponding command line arguments to enable them for a single solve.
The ``granularity`` option can take two possible values: ``microarchitectures`` and ``generic``.
If set to:
.. code-block:: yaml
concretizer:
targets:
granularity: microarchitectures
Spack will consider all the microarchitectures known to ``archspec`` to label nodes for
compatibility. If instead the option is set to:
.. code-block:: yaml
concretizer:
targets:
granularity: generic
Spack will consider only generic microarchitectures. For instance, when running on an
Haswell node, Spack will consider ``haswell`` as the best target in the former case and
``x86_64_v3`` as the best target in the latter case.
The ``host_compatible`` option is a Boolean option that determines whether or not the
microarchitectures considered during the solve are constrained to be compatible with the
host Spack is currently running on. For instance, if this option is set to ``true``, a
user cannot concretize for ``target=icelake`` while running on an Haswell node.
.. _package-requirements:
--------------------

View File

@@ -25,8 +25,8 @@ use Spack to build packages with the tools.
The Spack Python class ``IntelOneapiPackage`` is a base class that is
used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``,
``IntelOneapiTbb`` and other classes to implement the oneAPI
packages. See the :ref:`package-list` for the full list of available
oneAPI packages or use::
packages. Search for ``oneAPI`` at `<packages.spack.io>`_ for the full
list of available oneAPI packages, or use::
spack list -d oneAPI

View File

@@ -48,9 +48,6 @@
os.environ["COLIFY_SIZE"] = "25x120"
os.environ["COLUMNS"] = "120"
# Generate full package list if needed
subprocess.call(["spack", "list", "--format=html", "--update=package_list.html"])
# Generate a command index if an update is needed
subprocess.call(
[
@@ -214,6 +211,7 @@ def setup(sphinx):
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.StandardVersion"),
("py:class", "spack.spec.DependencySpec"),
("py:class", "spack.spec.InstallStatus"),
("py:class", "spack.spec.SpecfileReaderBase"),
("py:class", "spack.install_test.Pb"),
]

View File

@@ -212,18 +212,12 @@ under the ``container`` attribute of environments:
final:
- libgomp
# Extra instructions
extra_instructions:
final: |
RUN echo 'export PS1="\[$(tput bold)\]\[$(tput setaf 1)\][gromacs]\[$(tput setaf 2)\]\u\[$(tput sgr0)\]:\w $ "' >> ~/.bashrc
# Labels for the image
labels:
app: "gromacs"
mpi: "mpich"
A detailed description of the options available can be found in the
:ref:`container_config_options` section.
A detailed description of the options available can be found in the :ref:`container_config_options` section.
-------------------
Setting Base Images
@@ -525,6 +519,13 @@ the example below:
COPY data /share/myapp/data
{% endblock %}
The Dockerfile is generated by running:
.. code-block:: console
$ spack -e /opt/environment containerize
Note that the environment must be active for spack to read the template.
The recipe that gets generated contains the two extra instructions that we added in our template extension:
.. code-block:: Dockerfile

View File

@@ -310,53 +310,11 @@ Once all of the dependencies are installed, you can try building the documentati
$ make clean
$ make
If you see any warning or error messages, you will have to correct those before
your PR is accepted.
If you are editing the documentation, you should obviously be running the
documentation tests. But even if you are simply adding a new package, your
changes could cause the documentation tests to fail:
.. code-block:: console
package_list.rst:8745: WARNING: Block quote ends without a blank line; unexpected unindent.
At first, this error message will mean nothing to you, since you didn't edit
that file. Until you look at line 8745 of the file in question:
.. code-block:: rst
Description:
NetCDF is a set of software libraries and self-describing, machine-
independent data formats that support the creation, access, and sharing
of array-oriented scientific data.
Our documentation includes :ref:`a list of all Spack packages <package-list>`.
If you add a new package, its docstring is added to this page. The problem in
this case was that the docstring looked like:
.. code-block:: python
class Netcdf(Package):
"""
NetCDF is a set of software libraries and self-describing,
machine-independent data formats that support the creation,
access, and sharing of array-oriented scientific data.
"""
Docstrings cannot start with a newline character, or else Sphinx will complain.
Instead, they should look like:
.. code-block:: python
class Netcdf(Package):
"""NetCDF is a set of software libraries and self-describing,
machine-independent data formats that support the creation,
access, and sharing of array-oriented scientific data."""
Documentation changes can result in much more obfuscated warning messages.
If you don't understand what they mean, feel free to ask when you submit
your PR.
If you see any warning or error messages, you will have to correct those before your PR
is accepted. If you are editing the documentation, you should be running the
documentation tests to make sure there are no errors. Documentation changes can result
in some obfuscated warning messages. If you don't understand what they mean, feel free
to ask when you submit your PR.
--------
Coverage

File diff suppressed because it is too large

[Image diff: after, 108 KiB]

View File

@@ -54,9 +54,16 @@ or refer to the full manual below.
features
getting_started
basic_usage
Tutorial: Spack 101 <https://spack-tutorial.readthedocs.io>
replace_conda_homebrew
.. toctree::
:maxdepth: 2
:caption: Links
Tutorial (spack-tutorial.rtfd.io) <https://spack-tutorial.readthedocs.io>
Packages (packages.spack.io) <https://packages.spack.io>
Binaries (binaries.spack.io) <https://cache.spack.io>
.. toctree::
:maxdepth: 2
:caption: Reference
@@ -72,7 +79,6 @@ or refer to the full manual below.
repositories
binary_caches
command_index
package_list
chain
extensions
pipelines

View File

@@ -1,17 +0,0 @@
.. Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _package-list:
============
Package List
============
This is a list of things you can install using Spack. It is
automatically generated based on the packages in this Spack
version.
.. raw:: html
:file: package_list.html

View File

@@ -3635,7 +3635,8 @@ regardless of the build system. The arguments for the phase are:
The arguments ``spec`` and ``prefix`` are passed only for convenience, as they always
correspond to ``self.spec`` and ``self.spec.prefix`` respectively.
If the ``package.py`` encodes builders explicitly, the signature for a phase changes slightly:
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
@@ -3645,56 +3646,6 @@ If the ``package.py`` encodes builders explicitly, the signature for a phase cha
In this case the package is passed as the second argument, and ``self`` is the builder instance.
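A minimal sketch of such a builder-style phase (the ``install`` phase and its
body are illustrative only, not taken from the hunk above):

.. code-block:: python

   class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
       def install(self, pkg, spec, prefix):
           # ``self`` is the builder; ``pkg`` is the package instance, while
           # ``spec`` and ``prefix`` mirror pkg.spec and pkg.spec.prefix
           make("install")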
.. _multiple_build_systems:
^^^^^^^^^^^^^^^^^^^^^^
Multiple build systems
^^^^^^^^^^^^^^^^^^^^^^
There are cases where a piece of software actively supports two build systems, or changes build systems
as it evolves, or needs different build systems on different platforms. Spack allows dealing with
these cases natively, if a recipe is written using builders explicitly.
For instance, software that supports two build systems unconditionally should derive from
both ``*Package`` base classes, and declare the possible use of multiple build systems using
a directive:
.. code-block:: python
class ArpackNg(CMakePackage, AutotoolsPackage):
build_system("cmake", "autotools", default="cmake")
In this case the software can be built with both ``autotools`` and ``cmake``. Since the package
supports multiple build systems, it is necessary to declare which one is the default. The ``package.py``
will likely contain some overriding of default builder methods:
.. code-block:: python
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
def cmake_args(self):
pass
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
def configure_args(self):
pass
In more complex cases it might happen that the build system changes according to certain conditions,
for instance across versions. That can be expressed with conditional variant values:
.. code-block:: python
class ArpackNg(CMakePackage, AutotoolsPackage):
build_system(
conditional("cmake", when="@0.64:"),
conditional("autotools", when="@:0.63"),
default="cmake",
)
In the example the directive imposes a change from ``Autotools`` to ``CMake`` when going
from ``v0.63`` to ``v0.64``.
^^^^^^^^^^^^^^^^^^
Mixin base classes
^^^^^^^^^^^^^^^^^^
@@ -3741,6 +3692,106 @@ for instance:
In the example above ``Cp2k`` inherits all the conflicts and variants that ``CudaPackage`` defines.
.. _multiple_build_systems:
----------------------
Multiple build systems
----------------------
There are cases where a package actively supports two build systems, or changes build systems
as it evolves, or needs different build systems on different platforms. Spack allows dealing with
these cases by splitting the build instructions into separate builder classes.
For instance, software that supports two build systems unconditionally should derive from
both ``*Package`` base classes, and declare the possible use of multiple build systems using
a directive:
.. code-block:: python
class Example(CMakePackage, AutotoolsPackage):
variant("my_feature", default=True)
build_system("cmake", "autotools", default="cmake")
In this case the software can be built with both ``autotools`` and ``cmake``. Since the package
supports multiple build systems, it is necessary to declare which one is the default.
Additional build instructions are split into separate builder classes:
.. code-block:: python
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
def cmake_args(self):
return [
self.define_from_variant("MY_FEATURE", "my_feature")
]
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
def configure_args(self):
return self.with_or_without("my-feature", variant="my_feature")
In this example, ``spack install example +my_feature build_system=cmake`` will
pick the ``CMakeBuilder`` and invoke ``cmake -DMY_FEATURE:BOOL=ON``.
Similarly, ``spack install example +my_feature build_system=autotools`` will pick
the ``AutotoolsBuilder`` and invoke ``./configure --with-my-feature``.
Dependencies are always specified in the package class. When some dependencies
depend on the choice of the build system, it is possible to use when conditions as
usual:
.. code-block:: python
class Example(CMakePackage, AutotoolsPackage):
build_system("cmake", "autotools", default="cmake")
# Runtime dependencies
depends_on("ncurses")
depends_on("libxml2")
# Lowerbounds for cmake only apply when using cmake as the build system
with when("build_system=cmake"):
depends_on("cmake@3.18:", when="@2.0:", type="build")
depends_on("cmake@3:", type="build")
# Specify extra build dependencies used only in the configure script
with when("build_system=autotools"):
depends_on("perl", type="build")
depends_on("pkgconfig", type="build")
Very often projects switch from one build system to another, or add support
for a new build system from a certain version, which means that the choice
of the build system typically depends on a version range. Those situations can
be handled by using conditional values in the ``build_system`` directive:
.. code-block:: python
class Example(CMakePackage, AutotoolsPackage):
build_system(
conditional("cmake", when="@0.64:"),
conditional("autotools", when="@:0.63"),
default="cmake",
)
In the example the directive imposes a change from ``Autotools`` to ``CMake`` when going
from ``v0.63`` to ``v0.64``.
The ``build_system`` can be used as an ordinary variant, which also means that it can
be used in ``depends_on`` statements. This can be useful when a package *requires* that
its dependency has a CMake config file, meaning that the dependent can only build when the
dependency is built with CMake, and not Autotools. In that case, you can force the choice
of the build system in the dependent:
.. code-block:: python
class Dependent(CMakePackage):
depends_on("example build_system=cmake")
.. _install-environment:
-----------------------
@@ -6196,7 +6247,100 @@ follows:
"foo-package@{0}".format(version_str)
)
.. _package-lifecycle:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Add detection tests to packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
To ensure that software is detected correctly for multiple configurations
and on different systems, users can write a ``detection_test.yaml`` file and
put it in the package directory alongside the ``package.py`` file.
This YAML file contains enough information for Spack to mock an environment
and try to check if the detection logic yields the results that are expected.
As a general rule, attributes at the top-level of ``detection_test.yaml``
represent search mechanisms and they each map to a list of tests that should confirm
the validity of the package's detection logic.
The detection tests can be run with the following command:
.. code-block:: console
$ spack audit externals
Errors that have been detected are reported to screen.
""""""""""""""""""""""""""
Tests for PATH inspections
""""""""""""""""""""""""""
Detection tests relying on ``PATH`` inspections are listed under
the ``paths`` attribute:
.. code-block:: yaml
paths:
- layout:
- executables:
- "bin/clang-3.9"
- "bin/clang++-3.9"
script: |
echo "clang version 3.9.1-19ubuntu1 (tags/RELEASE_391/rc2)"
echo "Target: x86_64-pc-linux-gnu"
echo "Thread model: posix"
echo "InstalledDir: /usr/bin"
results:
- spec: 'llvm@3.9.1 +clang~lld~lldb'
Each test is performed by first creating a temporary directory structure as
specified in the corresponding ``layout``, then running
package detection, and finally checking that the outcome matches the expected
``results``. The exact details on how to specify both the ``layout`` and the
``results`` are reported in the table below:
.. list-table:: Test based on PATH inspections
:header-rows: 1
* - Option Name
- Description
- Allowed Values
- Required Field
* - ``layout``
- Specifies the filesystem tree used for the test
- List of objects
- Yes
* - ``layout:[0]:executables``
- Relative paths for the mock executables to be created
- List of strings
- Yes
* - ``layout:[0]:script``
- Mock logic for the executable
- Any valid shell script
- Yes
* - ``results``
- List of expected results
- List of objects (empty if no result is expected)
- Yes
* - ``results:[0]:spec``
- A spec that is expected from detection
- Any valid spec
- Yes
"""""""""""""""""""""""""""""""
Reuse tests from other packages
"""""""""""""""""""""""""""""""
When using a custom repository, it is possible to customize a package that already exists in ``builtin``
and reuse its external tests. To do so, just write a ``detection_test.yaml`` alongside the customized
``package.py`` with an ``includes`` attribute. For instance the ``detection_test.yaml`` for
``myrepo.llvm`` might look like:
.. code-block:: yaml
includes:
- "builtin.llvm"
This YAML file instructs Spack to run the detection tests defined in ``builtin.llvm`` in addition to
those locally defined in the file.
-----------------------------
Style guidelines for packages
@@ -6655,3 +6799,30 @@ To achieve backward compatibility with the single-class format Spack creates in
Overall the role of the adapter is to route access to attributes of methods first through the ``*Package``
hierarchy, and then back to the base class builder. This is schematically shown in the diagram above, where
the adapter role is to "emulate" a method resolution order like the one represented by the red arrows.
------------------------------
Specifying License Information
------------------------------
A significant portion of software that Spack packages is open source. Most open
source software is released under one or more common open source licenses.
Specifying the license that a package is released under in a project's
``package.py`` is good practice. To specify a license, find the SPDX identifier for
a project and then add it using the license directive:
.. code-block:: python
license("<SPDX Identifier HERE>")
Note that specifying a license without a when clause makes it apply to all
versions and variants of the package, which might not actually be the case.
For example, a project might have switched licenses at some point or have
certain build configurations that include files that are licensed differently.
To account for this, you can specify when licenses should be applied. For
example, to specify that a specific license identifier should only apply
to versions up to and including 1.5, you could write the following directive:
.. code-block:: python
license("...", when="@:1.5")

View File

@@ -213,6 +213,16 @@ pipeline jobs.
``spack ci generate``
^^^^^^^^^^^^^^^^^^^^^
Throughout this documentation, references to the "mirror" mean the target
mirror which is checked for the presence of up-to-date specs, and where
any scheduled jobs should push built binary packages. In the past, this
defaulted to the mirror at index 0 in the mirror configs, and could be
overridden using the ``--buildcache-destination`` argument. Starting with
Spack 0.23, ``spack ci generate`` will require you to identify this mirror
by the name "buildcache-destination". While you can configure any number
of mirrors as sources for your pipelines, you will need to identify the
destination mirror by name.
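A minimal sketch of the corresponding ``mirrors`` configuration (the S3 URL is
hypothetical):

.. code-block:: yaml

   mirrors:
     buildcache-destination: s3://my-bucket/buildcache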
Concretizes the specs in the active environment, stages them (as described in
:ref:`staging_algorithm`), and writes the resulting ``.gitlab-ci.yml`` to disk.
During concretization of the environment, ``spack ci generate`` also writes a

View File

@@ -4,7 +4,7 @@
SPDX-License-Identifier: (Apache-2.0 OR MIT)
=====================================
Using Spack to Replace Homebrew/Conda
Spack for Homebrew/Conda Users
=====================================
Spack is an incredibly powerful package manager, designed for supercomputers
@@ -191,18 +191,18 @@ The ``--fresh`` flag tells Spack to use the latest version of every package
where possible instead of trying to optimize for reuse of existing installed
packages.
The ``--force`` flag in addition tells Spack to overwrite its previous
concretization decisions, allowing you to choose a new version of Python.
If any of the new packages like Bash are already installed, ``spack install``
The ``--force`` flag in addition tells Spack to overwrite its previous
concretization decisions, allowing you to choose a new version of Python.
If any of the new packages like Bash are already installed, ``spack install``
won't re-install them, it will keep the symlinks in place.
-----------------------------------
Updating & Cleaning Up Old Packages
-----------------------------------
If you're looking to mimic the behavior of Homebrew, you may also want to
clean up out-of-date packages from your environment after an upgrade. To
upgrade your entire software stack within an environment and clean up old
If you're looking to mimic the behavior of Homebrew, you may also want to
clean up out-of-date packages from your environment after an upgrade. To
upgrade your entire software stack within an environment and clean up old
package versions, simply run the following commands:
.. code-block:: console
@@ -212,9 +212,9 @@ package versions, simply run the following commands:
$ spack concretize --fresh --force
$ spack install
$ spack gc
Running ``spack mark -i --all`` tells Spack to mark all of the existing
packages within an environment as "implicitly" installed. This tells
Running ``spack mark -i --all`` tells Spack to mark all of the existing
packages within an environment as "implicitly" installed. This tells
spack's garbage collection system that these packages should be cleaned up.
Don't worry however, this will not remove your entire environment.
@@ -223,8 +223,8 @@ a fresh concretization and will re-mark any packages that should remain
installed as "explicitly" installed.
**Note:** if you use multiple spack environments you should re-run ``spack install``
in each of your environments prior to running ``spack gc`` to prevent spack
from uninstalling any shared packages that are no longer required by the
in each of your environments prior to running ``spack gc`` to prevent spack
from uninstalling any shared packages that are no longer required by the
environment you just upgraded.
--------------

View File

@@ -1,13 +1,13 @@
sphinx==7.2.5
sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==1.3.0
python-levenshtein==0.21.1
python-levenshtein==0.23.0
docutils==0.18.1
pygments==2.16.1
urllib3==2.0.4
urllib3==2.0.7
pytest==7.4.2
isort==5.12.0
black==23.7.0
black==23.9.1
flake8==6.1.0
mypy==1.5.1
mypy==1.6.1

View File

@@ -1,9 +1,7 @@
Name, Supported Versions, Notes, Requirement Reason
Python, 3.6--3.11, , Interpreter for Spack
Python, 3.6--3.12, , Interpreter for Spack
C/C++ Compilers, , , Building software
make, , , Build software
patch, , , Build software
bash, , , Compiler wrappers
tar, , , Extract/create archives
gzip, , , Compress/Decompress archives
unzip, , , Compress/Decompress archives

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.1 (commit 9e1117bd8a2f0581bced161f2a2e8d6294d0300b)
* Version: 0.2.1 (commit df43a1834460bf94516136951c4729a3100603ec)
astunparse
----------------

View File

@@ -1,2 +1,2 @@
"""Init file to avoid namespace packages"""
__version__ = "0.2.0"
__version__ = "0.2.1"

View File

@@ -79,14 +79,18 @@ def __init__(self, name, parents, vendor, features, compilers, generation=0):
self.features = features
self.compilers = compilers
self.generation = generation
# Cache the ancestor computation
self._ancestors = None
@property
def ancestors(self):
"""All the ancestors of this microarchitecture."""
value = self.parents[:]
for parent in self.parents:
value.extend(a for a in parent.ancestors if a not in value)
return value
if self._ancestors is None:
value = self.parents[:]
for parent in self.parents:
value.extend(a for a in parent.ancestors if a not in value)
self._ancestors = value
return self._ancestors
def _to_set(self):
"""Returns a set of the nodes in this microarchitecture DAG."""

View File

@@ -145,6 +145,13 @@
"flags": "-march={name} -mtune=generic -mcx16 -msahf -mpopcnt -msse3 -msse4.1 -msse4.2 -mssse3"
}
],
"intel": [
{
"versions": "16.0:",
"name": "corei7",
"flags": "-march={name} -mtune=generic -mpopcnt"
}
],
"oneapi": [
{
"versions": "2021.2.0:",
@@ -217,6 +224,13 @@
"flags": "-march={name} -mtune=generic -mcx16 -msahf -mpopcnt -msse3 -msse4.1 -msse4.2 -mssse3 -mavx -mavx2 -mbmi -mbmi2 -mf16c -mfma -mlzcnt -mmovbe -mxsave"
}
],
"intel": [
{
"versions": "16.0:",
"name": "core-avx2",
"flags": "-march={name} -mtune={name} -fma -mf16c"
}
],
"oneapi": [
{
"versions": "2021.2.0:",
@@ -300,6 +314,13 @@
"flags": "-march={name} -mtune=generic -mcx16 -msahf -mpopcnt -msse3 -msse4.1 -msse4.2 -mssse3 -mavx -mavx2 -mbmi -mbmi2 -mf16c -mfma -mlzcnt -mmovbe -mxsave -mavx512f -mavx512bw -mavx512cd -mavx512dq -mavx512vl"
}
],
"intel": [
{
"versions": "16.0:",
"name": "skylake-avx512",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": "2021.2.0:",
@@ -1412,6 +1433,92 @@
]
}
},
"sapphirerapids": {
"from": [
"icelake"
],
"vendor": "GenuineIntel",
"features": [
"mmx",
"sse",
"sse2",
"ssse3",
"sse4_1",
"sse4_2",
"popcnt",
"aes",
"pclmulqdq",
"avx",
"rdrand",
"f16c",
"movbe",
"fma",
"avx2",
"bmi1",
"bmi2",
"rdseed",
"adx",
"clflushopt",
"xsavec",
"xsaveopt",
"avx512f",
"avx512vl",
"avx512bw",
"avx512dq",
"avx512cd",
"avx512vbmi",
"avx512ifma",
"sha_ni",
"clwb",
"rdpid",
"gfni",
"avx512_vbmi2",
"avx512_vpopcntdq",
"avx512_bitalg",
"avx512_vnni",
"vpclmulqdq",
"vaes",
"avx512_bf16",
"cldemote",
"movdir64b",
"movdiri",
"pdcm",
"serialize",
"waitpkg"
],
"compilers": {
"gcc": [
{
"versions": "11.0:",
"flags": "-march={name} -mtune={name}"
}
],
"clang": [
{
"versions": "12.0:",
"flags": "-march={name} -mtune={name}"
}
],
"intel": [
{
"versions": "2021.2:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": "2021.2:",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": "2021.2:",
"flags": "-march={name} -mtune={name}"
}
]
}
},
"k10": {
"from": ["x86_64"],
"vendor": "AuthenticAMD",
@@ -2065,8 +2172,6 @@
"pku",
"gfni",
"flush_l1d",
"erms",
"avic",
"avx512f",
"avx512dq",
"avx512ifma",
@@ -2083,12 +2188,12 @@
"compilers": {
"gcc": [
{
"versions": "10.3:13.0",
"versions": "10.3:12.2",
"name": "znver3",
"flags": "-march={name} -mtune={name} -mavx512f -mavx512dq -mavx512ifma -mavx512cd -mavx512bw -mavx512vl -mavx512vbmi -mavx512vbmi2 -mavx512vnni -mavx512bitalg"
},
{
"versions": "13.1:",
"versions": "12.3:",
"name": "znver4",
"flags": "-march={name} -mtune={name}"
}

lib/spack/llnl/path.py Normal file
View File

@@ -0,0 +1,105 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Path primitives that just require Python standard library."""
import functools
import sys
from typing import List, Optional
from urllib.parse import urlparse
class Path:
"""Enum to identify the path-style."""
unix: int = 0
windows: int = 1
platform_path: int = windows if sys.platform == "win32" else unix
def format_os_path(path: str, mode: int = Path.unix) -> str:
"""Formats the input path to use consistent, platform specific separators.
Absolute paths are converted between drive letters and a prepended '/' as per platform
requirement.
Parameters:
path: the path to be normalized, must be a string or expose the replace method.
mode: the path file separator style to normalize the passed path to.
Default is unix style, i.e. '/'
"""
if not path:
return path
if mode == Path.windows:
path = path.replace("/", "\\")
else:
path = path.replace("\\", "/")
return path
def convert_to_posix_path(path: str) -> str:
"""Converts the input path to POSIX style."""
return format_os_path(path, mode=Path.unix)
def convert_to_windows_path(path: str) -> str:
"""Converts the input path to Windows style."""
return format_os_path(path, mode=Path.windows)
def convert_to_platform_path(path: str) -> str:
"""Converts the input path to the current platform's native style."""
return format_os_path(path, mode=Path.platform_path)
def path_to_os_path(*parameters: str) -> List[str]:
"""Takes an arbitrary number of positional parameters, converts each argument of type
string to use a normalized filepath separator, and returns a list of all values.
"""
def _is_url(path_or_url: str) -> bool:
if "\\" in path_or_url:
return False
url_tuple = urlparse(path_or_url)
return bool(url_tuple.scheme) and len(url_tuple.scheme) > 1
result = []
for item in parameters:
if isinstance(item, str) and not _is_url(item):
item = convert_to_platform_path(item)
result.append(item)
return result
def system_path_filter(_func=None, arg_slice: Optional[slice] = None):
"""Filters function arguments to account for platform path separators.
Optional slicing range can be specified to select specific arguments
This decorator takes all (or a slice) of a method's positional arguments
and normalizes usage of filepath separators on a per platform basis.
Note: `**kwargs`, urls, and any type that is not a string are ignored
so in such cases where path normalization is required, that should be
handled by calling path_to_os_path directly as needed.
Parameters:
arg_slice: a slice object specifying the slice of arguments
in the decorated method over which filepath separators are
normalized
"""
def holder_func(func):
@functools.wraps(func)
def path_filter_caller(*args, **kwargs):
args = list(args)
if arg_slice:
args[arg_slice] = path_to_os_path(*args[arg_slice])
else:
args = path_to_os_path(*args)
return func(*args, **kwargs)
return path_filter_caller
if _func:
return holder_func(_func)
return holder_func
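# Hedged usage sketch of the decorator above; the decorated function is
# hypothetical, not part of this module:
#
#     @system_path_filter
#     def read_first_line(path: str) -> str:
#         with open(path) as f:
#             return f.readline()
#
# String positional arguments are routed through path_to_os_path, so path
# separators are normalized to the current platform before the body runs.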

lib/spack/llnl/string.py Normal file
View File

@@ -0,0 +1,67 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""String manipulation functions that do not have other dependencies than Python
standard library
"""
from typing import List, Optional
def comma_list(sequence: List[str], article: str = "") -> str:
if type(sequence) is not list:
sequence = list(sequence)
if not sequence:
return ""
if len(sequence) == 1:
return sequence[0]
out = ", ".join(str(s) for s in sequence[:-1])
if len(sequence) != 2:
out += "," # oxford comma
out += " "
if article:
out += article + " "
out += str(sequence[-1])
return out
def comma_or(sequence: List[str]) -> str:
"""Return a string with all the elements of the input joined by comma, but the last
one (which is joined by 'or').
"""
return comma_list(sequence, "or")
def comma_and(sequence: List[str]) -> str:
"""Return a string with all the elements of the input joined by comma, but the last
one (which is joined by 'and').
"""
return comma_list(sequence, "and")
def quote(sequence: List[str], q: str = "'") -> List[str]:
"""Quotes each item in the input list with the quote character passed as second argument."""
return [f"{q}{e}{q}" for e in sequence]
def plural(n: int, singular: str, plural: Optional[str] = None, show_n: bool = True) -> str:
"""Pluralize <singular> word by adding an s if n != 1.
Arguments:
n: number of things there are
singular: singular form of word
plural: optional plural form, for when it's not just singular + 's'
show_n: whether to include n in the result string (default True)
Returns:
"1 thing" if n == 1 or "n things" if n != 1
"""
number = f"{n} " if show_n else ""
if n == 1:
return f"{number}{singular}"
elif plural is not None:
return f"{number}{plural}"
else:
return f"{number}{singular}s"

lib/spack/llnl/url.py Normal file
View File

@@ -0,0 +1,459 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""URL primitives that just require Python standard library."""
import itertools
import os.path
import re
from typing import Optional, Set, Tuple
from urllib.parse import urlsplit, urlunsplit
# Archive extensions allowed in Spack
PREFIX_EXTENSIONS = ("tar", "TAR")
EXTENSIONS = ("gz", "bz2", "xz", "Z")
NO_TAR_EXTENSIONS = ("zip", "tgz", "tbz2", "tbz", "txz")
# Add PREFIX_EXTENSIONS and EXTENSIONS last so that .tar.gz is matched *before* .tar or .gz
ALLOWED_ARCHIVE_TYPES = (
tuple(".".join(ext) for ext in itertools.product(PREFIX_EXTENSIONS, EXTENSIONS))
+ PREFIX_EXTENSIONS
+ EXTENSIONS
+ NO_TAR_EXTENSIONS
)
CONTRACTION_MAP = {"tgz": "tar.gz", "txz": "tar.xz", "tbz": "tar.bz2", "tbz2": "tar.bz2"}
def find_list_urls(url: str) -> Set[str]:
r"""Find good list URLs for the supplied URL.
By default, returns the dirname of the archive path.
Provides special treatment for the following websites, which have a
unique list URL different from the dirname of the download URL:
========= =======================================================
GitHub https://github.com/<repo>/<name>/releases
GitLab https://gitlab.\*/<repo>/<name>/tags
BitBucket https://bitbucket.org/<repo>/<name>/downloads/?tab=tags
CRAN https://\*.r-project.org/src/contrib/Archive/<name>
PyPI https://pypi.org/simple/<name>/
LuaRocks https://luarocks.org/modules/<repo>/<name>
========= =======================================================
Note: this function is called by `spack versions`, `spack checksum`,
and `spack create`, but not by `spack fetch` or `spack install`.
Parameters:
url (str): The download URL for the package
Returns:
set: One or more list URLs for the package
"""
url_types = [
# GitHub
# e.g. https://github.com/llnl/callpath/archive/v1.0.1.tar.gz
(r"(.*github\.com/[^/]+/[^/]+)", lambda m: m.group(1) + "/releases"),
# GitLab API endpoint
# e.g. https://gitlab.dkrz.de/api/v4/projects/k202009%2Flibaec/repository/archive.tar.gz?sha=v1.0.2
(
r"(.*gitlab[^/]+)/api/v4/projects/([^/]+)%2F([^/]+)",
lambda m: m.group(1) + "/" + m.group(2) + "/" + m.group(3) + "/tags",
),
# GitLab non-API endpoint
# e.g. https://gitlab.dkrz.de/k202009/libaec/uploads/631e85bcf877c2dcaca9b2e6d6526339/libaec-1.0.0.tar.gz
(r"(.*gitlab[^/]+/(?!api/v4/projects)[^/]+/[^/]+)", lambda m: m.group(1) + "/tags"),
# BitBucket
# e.g. https://bitbucket.org/eigen/eigen/get/3.3.3.tar.bz2
(r"(.*bitbucket.org/[^/]+/[^/]+)", lambda m: m.group(1) + "/downloads/?tab=tags"),
# CRAN
# e.g. https://cran.r-project.org/src/contrib/Rcpp_0.12.9.tar.gz
# e.g. https://cloud.r-project.org/src/contrib/rgl_0.98.1.tar.gz
(
r"(.*\.r-project\.org/src/contrib)/([^_]+)",
lambda m: m.group(1) + "/Archive/" + m.group(2),
),
# PyPI
# e.g. https://pypi.io/packages/source/n/numpy/numpy-1.19.4.zip
# e.g. https://www.pypi.io/packages/source/n/numpy/numpy-1.19.4.zip
# e.g. https://pypi.org/packages/source/n/numpy/numpy-1.19.4.zip
# e.g. https://pypi.python.org/packages/source/n/numpy/numpy-1.19.4.zip
# e.g. https://files.pythonhosted.org/packages/source/n/numpy/numpy-1.19.4.zip
# e.g. https://pypi.io/packages/py2.py3/o/opencensus-context/opencensus_context-0.1.1-py2.py3-none-any.whl
(
r"(?:pypi|pythonhosted)[^/]+/packages/[^/]+/./([^/]+)",
lambda m: "https://pypi.org/simple/" + m.group(1) + "/",
),
# LuaRocks
# e.g. https://luarocks.org/manifests/gvvaughan/lpeg-1.0.2-1.src.rock
# e.g. https://luarocks.org/manifests/openresty/lua-cjson-2.1.0-1.src.rock
(
r"luarocks[^/]+/(?:modules|manifests)/(?P<org>[^/]+)/"
+ r"(?P<name>.+?)-[0-9.-]*\.src\.rock",
lambda m: "https://luarocks.org/modules/"
+ m.group("org")
+ "/"
+ m.group("name")
+ "/",
),
]
list_urls = {os.path.dirname(url)}
for pattern, fun in url_types:
match = re.search(pattern, url)
if match:
list_urls.add(fun(match))
return list_urls
def strip_query_and_fragment(url: str) -> Tuple[str, str]:
"""Strips query and fragment from a url, then returns the base url and the suffix.
Args:
url: URL to be stripped
Raises:
ValueError: when there is any error parsing the URL
"""
components = urlsplit(url)
stripped = components[:3] + (None, None)
query, frag = components[3:5]
suffix = ""
if query:
suffix += "?" + query
if frag:
suffix += "#" + frag
return urlunsplit(stripped), suffix
SOURCEFORGE_RE = re.compile(r"(.*(?:sourceforge\.net|sf\.net)/.*)(/download)$")
def split_url_on_sourceforge_suffix(url: str) -> Tuple[str, ...]:
"""If the input is a sourceforge URL, returns base URL and "/download" suffix. Otherwise,
returns the input URL and an empty string.
"""
match = SOURCEFORGE_RE.search(url)
if match is not None:
return match.groups()
return url, ""
def has_extension(path_or_url: str, ext: str) -> bool:
"""Returns true if the extension in input is present in path, false otherwise."""
prefix, _ = split_url_on_sourceforge_suffix(path_or_url)
if not ext.startswith(r"\."):
ext = rf"\.{ext}$"
if re.search(ext, prefix):
return True
return False
def extension_from_path(path_or_url: Optional[str]) -> Optional[str]:
"""Tries to match an allowed archive extension to the input. Returns the first match,
or None if no match was found.
Raises:
ValueError: if the input is None
"""
if path_or_url is None:
raise ValueError("Can't call extension() on None")
for t in ALLOWED_ARCHIVE_TYPES:
if has_extension(path_or_url, t):
return t
return None
def remove_extension(path_or_url: str, *, extension: str) -> str:
"""Returns the input with the extension removed"""
suffix = rf"\.{extension}$"
return re.sub(suffix, "", path_or_url)
def check_and_remove_ext(path: str, *, extension: str) -> str:
"""Returns the input path with the extension removed, if the extension is present in path.
Otherwise, returns the input unchanged.
"""
if not has_extension(path, extension):
return path
path, _ = split_url_on_sourceforge_suffix(path)
return remove_extension(path, extension=extension)
def strip_extension(path_or_url: str, *, extension: Optional[str] = None) -> str:
"""If a path contains the extension in input, returns the path stripped of the extension.
Otherwise, returns the input path.
If extension is None, attempts to strip any allowed extension from path.
"""
if extension is None:
for t in ALLOWED_ARCHIVE_TYPES:
if has_extension(path_or_url, ext=t):
extension = t
break
else:
return path_or_url
return check_and_remove_ext(path_or_url, extension=extension)
def split_url_extension(url: str) -> Tuple[str, ...]:
"""Some URLs have a query string, e.g.:
1. https://github.com/losalamos/CLAMR/blob/packages/PowerParser_v2.0.7.tgz?raw=true
2. http://www.apache.org/dyn/closer.cgi?path=/cassandra/1.2.0/apache-cassandra-1.2.0-rc2-bin.tar.gz
3. https://gitlab.kitware.com/vtk/vtk/repository/archive.tar.bz2?ref=v7.0.0
In (1), the query string needs to be stripped to get at the
extension, but in (2) & (3), the filename is IN a single final query
argument.
This strips the URL into three pieces: ``prefix``, ``ext``, and ``suffix``.
The suffix contains anything that was stripped off the URL to
get at the file extension. In (1), it will be ``'?raw=true'``, but
in (2), it will be empty. In (3) the suffix is a parameter that follows
after the file extension, e.g.:
1. ``('https://github.com/losalamos/CLAMR/blob/packages/PowerParser_v2.0.7', '.tgz', '?raw=true')``
2. ``('http://www.apache.org/dyn/closer.cgi?path=/cassandra/1.2.0/apache-cassandra-1.2.0-rc2-bin', '.tar.gz', None)``
3. ``('https://gitlab.kitware.com/vtk/vtk/repository/archive', '.tar.bz2', '?ref=v7.0.0')``
"""
# Strip off sourceforge download suffix.
# e.g. https://sourceforge.net/projects/glew/files/glew/2.0.0/glew-2.0.0.tgz/download
prefix, suffix = split_url_on_sourceforge_suffix(url)
ext = extension_from_path(prefix)
if ext is not None:
prefix = strip_extension(prefix)
return prefix, ext, suffix
try:
prefix, suf = strip_query_and_fragment(prefix)
except ValueError:
# FIXME: tty.debug("Got error parsing path %s" % path)
# Ignore URL parse errors here
return url, ""
ext = extension_from_path(prefix)
prefix = strip_extension(prefix)
suffix = suf + suffix
if ext is None:
ext = ""
return prefix, ext, suffix
def strip_version_suffixes(path_or_url: str) -> str:
"""Some tarballs contain extraneous information after the version:
* ``bowtie2-2.2.5-source``
* ``libevent-2.0.21-stable``
* ``cuda_8.0.44_linux.run``
These strings are not part of the version number and should be ignored.
This function strips those suffixes off and returns the remaining string.
The goal is that the version is always the last thing in ``path``:
* ``bowtie2-2.2.5``
* ``libevent-2.0.21``
* ``cuda_8.0.44``
Args:
path_or_url: The filename or URL for the package
Returns:
The ``path`` with any extraneous suffixes removed
"""
# NOTE: This could be done with complicated regexes in parse_version_offset
# NOTE: The problem is that we would have to add these regexes to the end
# NOTE: of every single version regex. Easier to just strip them off
# NOTE: permanently
suffix_regexes = [
# Download type
r"[Ii]nstall",
r"all",
r"code",
r"[Ss]ources?",
r"file",
r"full",
r"single",
r"with[a-zA-Z_-]+",
r"rock",
r"src(_0)?",
r"public",
r"bin",
r"binary",
r"run",
r"[Uu]niversal",
r"jar",
r"complete",
r"dynamic",
r"oss",
r"gem",
r"tar",
r"sh",
# Download version
r"release",
r"bin",
r"stable",
r"[Ff]inal",
r"rel",
r"orig",
r"dist",
r"\+",
# License
r"gpl",
# Arch
# Needs to come before and after OS, appears in both orders
r"ia32",
r"intel",
r"amd64",
r"linux64",
r"x64",
r"64bit",
r"x86[_-]64",
r"i586_64",
r"x86",
r"i[36]86",
r"ppc64(le)?",
r"armv?(7l|6l|64)",
# Other
r"cpp",
r"gtk",
r"incubating",
# OS
r"[Ll]inux(_64)?",
r"LINUX",
r"[Uu]ni?x",
r"[Ss]un[Oo][Ss]",
r"[Mm]ac[Oo][Ss][Xx]?",
r"[Oo][Ss][Xx]",
r"[Dd]arwin(64)?",
r"[Aa]pple",
r"[Ww]indows",
r"[Ww]in(64|32)?",
r"[Cc]ygwin(64|32)?",
r"[Mm]ingw",
r"centos",
# Arch
# Needs to come before and after OS, appears in both orders
r"ia32",
r"intel",
r"amd64",
r"linux64",
r"x64",
r"64bit",
r"x86[_-]64",
r"i586_64",
r"x86",
r"i[36]86",
r"ppc64(le)?",
r"armv?(7l|6l|64)?",
# PyPI
r"[._-]py[23].*\.whl",
r"[._-]cp[23].*\.whl",
r"[._-]win.*\.exe",
]
for regex in suffix_regexes:
# Remove the suffix from the end of the path
# This may be done multiple times
path_or_url = re.sub(r"[._-]?" + regex + "$", "", path_or_url)
return path_or_url
def expand_contracted_extension(extension: str) -> str:
"""Returns the expanded version of a known contracted extension.
This function maps extensions like ".tgz" to ".tar.gz". On unknown extensions,
return the input unmodified.
"""
extension = extension.strip(".")
return CONTRACTION_MAP.get(extension, extension)
def expand_contracted_extension_in_path(
path_or_url: str, *, extension: Optional[str] = None
) -> str:
"""Returns the input path or URL with any contraction extension expanded.
Args:
path_or_url: path or URL to be expanded
extension: if specified, only attempt to expand that extension
"""
extension = extension or extension_from_path(path_or_url)
if extension is None:
return path_or_url
expanded = expand_contracted_extension(extension)
if expanded != extension:
return re.sub(rf"{extension}", rf"{expanded}", path_or_url)
return path_or_url
def compression_ext_from_compressed_archive(extension: str) -> Optional[str]:
"""Returns compression extension for a compressed archive"""
extension = expand_contracted_extension(extension)
for ext in [*EXTENSIONS]:
if ext in extension:
return ext
return None
def strip_compression_extension(path_or_url: str, ext: Optional[str] = None) -> str:
"""Strips the compression extension from the input, and returns it. For instance,
"foo.tgz" becomes "foo.tar".
If no extension is given, try a default list of extensions.
Args:
path_or_url: input to be stripped
ext: if given, extension to be stripped
"""
if not extension_from_path(path_or_url):
return path_or_url
expanded_path = expand_contracted_extension_in_path(path_or_url)
candidates = [ext] if ext is not None else EXTENSIONS
for current_extension in candidates:
modified_path = check_and_remove_ext(expanded_path, extension=current_extension)
if modified_path != expanded_path:
return modified_path
return expanded_path
def allowed_archive(path_or_url: str) -> bool:
"""Returns true if the input is a valid archive, False otherwise."""
return (
False if not path_or_url else any(path_or_url.endswith(t) for t in ALLOWED_ARCHIVE_TYPES)
)
def determine_url_file_extension(path: str) -> str:
"""This returns the type of archive a URL refers to. This is
sometimes confusing because of URLs like:
(1) https://github.com/petdance/ack/tarball/1.93_02
Where the URL doesn't actually contain the filename. We need
to know what type it is so that we can appropriately name files
in mirrors.
"""
match = re.search(r"github.com/.+/(zip|tar)ball/", path)
if match:
if match.group(1) == "zip":
return "zip"
elif match.group(1) == "tar":
return "tar.gz"
prefix, ext, suffix = split_url_extension(path)
return ext

View File

@@ -11,6 +11,7 @@
import itertools
import numbers
import os
import pathlib
import posixpath
import re
import shutil
@@ -27,7 +28,8 @@
from llnl.util.symlink import islink, readlink, resolve_link_target_relative_to_the_link, symlink
from spack.util.executable import Executable, which
from spack.util.path import path_to_os_path, system_path_filter
from ..path import path_to_os_path, system_path_filter
if sys.platform != "win32":
import grp
@@ -154,6 +156,37 @@ def lookup(name):
shutil.copystat = copystat
def polite_path(components: Iterable[str]):
"""
Given a list of strings which are intended to be path components,
generate a path, and format each component to avoid generating extra
path entries.
For example all "/", "\", and ":" characters will be replaced with
"_". Other characters like "=" will also be replaced.
"""
return os.path.join(*[polite_filename(x) for x in components])
@memoized
def _polite_antipattern():
# A regex of all the characters we don't want in a filename
return re.compile(r"[^A-Za-z0-9_.-]")
def polite_filename(filename: str) -> str:
"""
Replace generally problematic filename characters with underscores.
This differs from sanitize_filename in that it is more aggressive in
changing characters in the name. For example it removes "=" which can
confuse path parsing in external tools.
"""
# This character set applies for both Windows and Linux. It does not
# account for reserved filenames in Windows.
return _polite_antipattern().sub("_", filename)
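# Hedged usage sketch; the values follow from the regex above, which replaces
# anything outside [A-Za-z0-9_.-] with "_":
#
#     polite_filename("intel@2021.2=beta")  # -> "intel_2021.2_beta"
#     polite_path(["gcc@12", "os=ubuntu"])  # -> "gcc_12/os_ubuntu" (POSIX)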
def getuid():
if sys.platform == "win32":
import ctypes
@@ -335,8 +368,7 @@ def groupid_to_group(x):
if string:
regex = re.escape(regex)
filenames = path_to_os_path(*filenames)
for filename in filenames:
for filename in path_to_os_path(*filenames):
msg = 'FILTER FILE: {0} [replacing "{1}"]'
tty.debug(msg.format(filename, regex))
@@ -2426,7 +2458,7 @@ def library_dependents(self):
"""
Set of directories where package binaries/libraries are located.
"""
return set([self.pkg.prefix.bin]) | self._additional_library_dependents
return set([pathlib.Path(self.pkg.prefix.bin)]) | self._additional_library_dependents
def add_library_dependent(self, *dest):
"""
@@ -2439,9 +2471,9 @@ def add_library_dependent(self, *dest):
"""
for pth in dest:
if os.path.isfile(pth):
self._additional_library_dependents.add(os.path.dirname)
self._additional_library_dependents.add(pathlib.Path(pth).parent)
else:
self._additional_library_dependents.add(pth)
self._additional_library_dependents.add(pathlib.Path(pth))
@property
def rpaths(self):
@@ -2454,7 +2486,7 @@ def rpaths(self):
dependent_libs.extend(list(find_all_shared_libraries(path, recursive=True)))
for extra_path in self._addl_rpaths:
dependent_libs.extend(list(find_all_shared_libraries(extra_path, recursive=True)))
return set(dependent_libs)
return set([pathlib.Path(x) for x in dependent_libs])
def add_rpath(self, *paths):
"""
@@ -2470,7 +2502,7 @@ def add_rpath(self, *paths):
"""
self._addl_rpaths = self._addl_rpaths | set(paths)
def _link(self, path, dest_dir):
def _link(self, path: pathlib.Path, dest_dir: pathlib.Path):
"""Perform link step of simulated rpathing, installing
simlinks of file in path to the dest_dir
location. This method deliberately prevents
@@ -2478,27 +2510,35 @@ def _link(self, path, dest_dir):
This is because it is both meaningless from an rpath
perspective, and will cause an error when Developer
mode is not enabled"""
file_name = os.path.basename(path)
dest_file = os.path.join(dest_dir, file_name)
if os.path.exists(dest_dir) and not dest_file == path:
def report_already_linked():
# We have either already symlinked or we are encountering a naming clash
# either way, we don't want to overwrite existing libraries
already_linked = islink(str(dest_file))
tty.debug(
"Linking library %s to %s failed, " % (str(path), str(dest_file))
+ "already linked."
if already_linked
else "library with name %s already exists at location %s."
% (str(file_name), str(dest_dir))
)
file_name = path.name
dest_file = dest_dir / file_name
if not dest_file.exists() and dest_dir.exists() and not dest_file == path:
try:
symlink(path, dest_file)
symlink(str(path), str(dest_file))
# For py2 compatibility, we have to catch the specific Windows error code
# associate with trying to create a file that already exists (winerror 183)
# Catch OSErrors missed by the SymlinkError checks
except OSError as e:
if sys.platform == "win32" and (e.winerror == 183 or e.errno == errno.EEXIST):
# We have either already symlinked or we are encountering a naming clash
# either way, we don't want to overwrite existing libraries
already_linked = islink(dest_file)
tty.debug(
"Linking library %s to %s failed, " % (path, dest_file) + "already linked."
if already_linked
else "library with name %s already exists at location %s."
% (file_name, dest_dir)
)
pass
report_already_linked()
else:
raise e
# catch errors we raise ourselves from Spack
except llnl.util.symlink.AlreadyExistsError:
report_already_linked()
def establish_link(self):
"""

View File

@@ -14,7 +14,7 @@
from llnl.util import lang, tty
import spack.util.string
from ..string import plural
if sys.platform != "win32":
import fcntl
@@ -169,7 +169,7 @@ def _attempts_str(wait_time, nattempts):
if nattempts <= 1:
return ""
attempts = spack.util.string.plural(nattempts, "attempt")
attempts = plural(nattempts, "attempt")
return " after {} and {}".format(lang.pretty_seconds(wait_time), attempts)

View File

@@ -11,8 +11,7 @@
from llnl.util import lang, tty
from spack.error import SpackError
from spack.util.path import system_path_filter
from ..path import system_path_filter
if sys.platform == "win32":
from win32file import CreateHardLink
@@ -66,7 +65,9 @@ def symlink(source_path: str, link_path: str, allow_broken_symlinks: bool = not
if not allow_broken_symlinks:
# Perform basic checks to make sure symlinking will succeed
if os.path.lexists(link_path):
raise SymlinkError(f"Link path ({link_path}) already exists. Cannot create link.")
raise AlreadyExistsError(
f"Link path ({link_path}) already exists. Cannot create link."
)
if not os.path.exists(source_path):
if os.path.isabs(source_path) and not allow_broken_symlinks:
@@ -78,7 +79,7 @@ def symlink(source_path: str, link_path: str, allow_broken_symlinks: bool = not
else:
# os.symlink can create a link when the given source path is relative to
# the link path. Emulate this behavior and check to see if the source exists
# relative to the link patg ahead of link creation to prevent broken
# relative to the link path ahead of link creation to prevent broken
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
@@ -234,7 +235,7 @@ def _windows_create_junction(source: str, link: str):
elif not os.path.exists(source):
raise SymlinkError("Source path does not exist, cannot create a junction.")
elif os.path.lexists(link):
raise SymlinkError("Link path already exists, cannot create a junction.")
raise AlreadyExistsError("Link path already exists, cannot create a junction.")
elif not os.path.isdir(source):
raise SymlinkError("Source path is not a directory, cannot create a junction.")
@@ -259,7 +260,7 @@ def _windows_create_hard_link(path: str, link: str):
elif not os.path.exists(path):
raise SymlinkError(f"File path {path} does not exist. Cannot create hard link.")
elif os.path.lexists(link):
raise SymlinkError(f"Link path ({link}) already exists. Cannot create hard link.")
raise AlreadyExistsError(f"Link path ({link}) already exists. Cannot create hard link.")
elif not os.path.isfile(path):
raise SymlinkError(f"File path ({link}) is not a file. Cannot create hard link.")
else:
@@ -336,7 +337,11 @@ def resolve_link_target_relative_to_the_link(link):
return os.path.join(link_dir, target)
class SymlinkError(SpackError):
class SymlinkError(RuntimeError):
"""Exception class for errors raised while creating symlinks,
junctions and hard links
"""
class AlreadyExistsError(SymlinkError):
"""Link path already exists."""

View File

@@ -8,8 +8,8 @@
from llnl.util.lang import memoized
import spack.spec
import spack.version
from spack.compilers.clang import Clang
from spack.spec import CompilerSpec
from spack.util.executable import Executable, ProcessError
@@ -17,7 +17,9 @@ class ABI:
"""This class provides methods to test ABI compatibility between specs.
The current implementation is rather rough and could be improved."""
def architecture_compatible(self, target, constraint):
def architecture_compatible(
self, target: spack.spec.Spec, constraint: spack.spec.Spec
) -> bool:
"""Return true if architecture of target spec is ABI compatible
to the architecture of constraint spec. If either the target
or constraint specs have no architecture, target is also defined
@@ -34,7 +36,7 @@ def _gcc_get_libstdcxx_version(self, version):
a compiler's libstdc++ or libgcc_s"""
from spack.build_environment import dso_suffix
spec = CompilerSpec("gcc", version)
spec = spack.spec.CompilerSpec("gcc", version)
compilers = spack.compilers.compilers_for_spec(spec)
if not compilers:
return None
@@ -77,16 +79,20 @@ def _gcc_compiler_compare(self, pversion, cversion):
return False
return plib == clib
def _intel_compiler_compare(self, pversion, cversion):
def _intel_compiler_compare(
self, pversion: spack.version.ClosedOpenRange, cversion: spack.version.ClosedOpenRange
) -> bool:
"""Returns true iff the intel version pversion and cversion
are ABI compatible"""
# Test major and minor versions. Ignore build version.
if len(pversion.version) < 2 or len(cversion.version) < 2:
return False
return pversion.version[:2] == cversion.version[:2]
pv = pversion.lo
cv = cversion.lo
return pv.up_to(2) == cv.up_to(2)
def compiler_compatible(self, parent, child, **kwargs):
def compiler_compatible(
self, parent: spack.spec.Spec, child: spack.spec.Spec, loose: bool = False
) -> bool:
"""Return true if compilers for parent and child are ABI compatible."""
if not parent.compiler or not child.compiler:
return True
@@ -95,7 +101,7 @@ def compiler_compatible(self, parent, child, **kwargs):
# Different compiler families are assumed ABI incompatible
return False
if kwargs.get("loose", False):
if loose:
return True
# TODO: Can we move the specialized ABI matching stuff
@@ -116,9 +122,10 @@ def compiler_compatible(self, parent, child, **kwargs):
return True
return False
def compatible(self, target, constraint, **kwargs):
def compatible(
self, target: spack.spec.Spec, constraint: spack.spec.Spec, loose: bool = False
) -> bool:
"""Returns true if target spec is ABI compatible to constraint spec"""
loosematch = kwargs.get("loose", False)
return self.architecture_compatible(target, constraint) and self.compiler_compatible(
target, constraint, loose=loosematch
target, constraint, loose=loose
)

View File

@@ -38,10 +38,13 @@ def _search_duplicate_compilers(error_cls):
import ast
import collections
import collections.abc
import glob
import inspect
import itertools
import pathlib
import pickle
import re
import warnings
from urllib.request import urlopen
import llnl.util.lang
@@ -798,3 +801,76 @@ def _analyze_variants_in_directive(pkg, constraint, directive, error_cls):
errors.append(err)
return errors
#: Sanity checks on package directives
external_detection = AuditClass(
group="externals",
tag="PKG-EXTERNALS",
description="Sanity checks for external software detection",
kwargs=("pkgs",),
)
def packages_with_detection_tests():
"""Return the list of packages with a corresponding detection_test.yaml file."""
import spack.config
import spack.util.path
to_be_tested = []
for current_repo in spack.repo.PATH.repos:
namespace = current_repo.namespace
packages_dir = pathlib.PurePath(current_repo.packages_path)
pattern = packages_dir / "**" / "detection_test.yaml"
pkgs_with_tests = [
f"{namespace}.{str(pathlib.PurePath(x).parent.name)}" for x in glob.glob(str(pattern))
]
to_be_tested.extend(pkgs_with_tests)
return to_be_tested
@external_detection
def _test_detection_by_executable(pkgs, error_cls):
"""Test drive external detection for packages"""
import spack.detection
errors = []
# Filter the packages and retain only the ones with detection tests
pkgs_with_tests = packages_with_detection_tests()
selected_pkgs = []
for current_package in pkgs_with_tests:
_, unqualified_name = spack.repo.partition_package_name(current_package)
# Check for both unqualified name and qualified name
if unqualified_name in pkgs or current_package in pkgs:
selected_pkgs.append(current_package)
selected_pkgs.sort()
if not selected_pkgs:
summary = "No detection test to run"
details = [f' "{p}" has no detection test' for p in pkgs]
warnings.warn("\n".join([summary] + details))
return errors
for pkg_name in selected_pkgs:
for idx, test_runner in enumerate(
spack.detection.detection_tests(pkg_name, spack.repo.PATH)
):
specs = test_runner.execute()
expected_specs = test_runner.expected_specs
not_detected = set(expected_specs) - set(specs)
if not_detected:
summary = pkg_name + ": cannot detect some specs"
details = [f'"{s}" was not detected [test_id={idx}]' for s in sorted(not_detected)]
errors.append(error_cls(summary=summary, details=details))
not_expected = set(specs) - set(expected_specs)
if not_expected:
summary = pkg_name + ": detected unexpected specs"
msg = '"{0}" was detected, but was not expected [test_id={1}]'
details = [msg.format(s, idx) for s in sorted(not_expected)]
errors.append(error_cls(summary=summary, details=details))
return errors

View File

@@ -23,7 +23,7 @@
import warnings
from contextlib import closing, contextmanager
from gzip import GzipFile
from typing import List, NamedTuple, Optional, Union
from typing import Dict, List, NamedTuple, Optional, Tuple, Union
from urllib.error import HTTPError, URLError
import llnl.util.filesystem as fsys
@@ -34,6 +34,7 @@
import spack.cmd
import spack.config as config
import spack.database as spack_db
import spack.error
import spack.hooks
import spack.hooks.sbang
import spack.mirror
@@ -215,11 +216,11 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
with self._index_file_cache.read_transaction(cache_key):
db._read_from_file(cache_path)
except spack_db.InvalidDatabaseVersionError as e:
msg = (
tty.warn(
f"you need a newer Spack version to read the buildcache index for the "
f"following mirror: '{mirror_url}'. {e.database_version_message}"
)
raise BuildcacheIndexError(msg) from e
return
spec_list = db.query_local(installed=False, in_buildcache=True)
@@ -624,8 +625,7 @@ def buildinfo_file_name(prefix):
"""
Filename of the binary package meta-data file
"""
name = os.path.join(prefix, ".spack/binary_distribution")
return name
return os.path.join(prefix, ".spack/binary_distribution")
def read_buildinfo_file(prefix):
@@ -646,8 +646,7 @@ class BuildManifestVisitor(BaseDirectoryVisitor):
directories."""
def __init__(self):
# Save unique identifiers of files to avoid
# relocating hardlink files for each path.
# Save unique identifiers of hardlinks to avoid relocating them multiple times
self.visited = set()
# Lists of files we will check
@@ -656,6 +655,8 @@ def __init__(self):
def seen_before(self, root, rel_path):
stat_result = os.lstat(os.path.join(root, rel_path))
if stat_result.st_nlink == 1:
return False
identifier = (stat_result.st_dev, stat_result.st_ino)
if identifier in self.visited:
return True
@@ -796,11 +797,7 @@ def tarball_directory_name(spec):
Return name of the tarball directory according to the convention
<os>-<architecture>/<compiler>/<package>-<version>/
"""
return os.path.join(
str(spec.architecture),
f"{spec.compiler.name}-{spec.compiler.version}",
f"{spec.name}-{spec.version}",
)
return spec.format_path("{architecture}/{compiler.name}-{compiler.version}/{name}-{version}")
def tarball_name(spec, ext):
@@ -808,10 +805,10 @@ def tarball_name(spec, ext):
Return the name of the tarfile according to the convention
<os>-<architecture>-<package>-<dag_hash><ext>
"""
return (
f"{spec.architecture}-{spec.compiler.name}-{spec.compiler.version}-"
f"{spec.name}-{spec.version}-{spec.dag_hash()}{ext}"
spec_formatted = spec.format_path(
"{architecture}-{compiler.name}-{compiler.version}-{name}-{version}-{hash}"
)
return f"{spec_formatted}{ext}"
def tarball_path_name(spec, ext):
@@ -912,7 +909,7 @@ def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_di
index_json_path,
url_util.join(cache_prefix, "index.json"),
keep_original=False,
extra_args={"ContentType": "application/json"},
extra_args={"ContentType": "application/json", "CacheControl": "no-cache"},
)
# Push the hash
@@ -920,7 +917,7 @@ def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_di
index_hash_path,
url_util.join(cache_prefix, "index.json.hash"),
keep_original=False,
extra_args={"ContentType": "text/plain"},
extra_args={"ContentType": "text/plain", "CacheControl": "no-cache"},
)
@@ -1156,57 +1153,99 @@ def gzip_compressed_tarfile(path):
yield tar
def deterministic_tarinfo(tarinfo: tarfile.TarInfo):
# We only add files, symlinks, hardlinks, and directories
# No character devices, block devices and FIFOs should ever enter a tarball.
if tarinfo.isdev():
return None
# For distribution, it makes no sense to keep user/group data, since (a) they don't exist
# on other machines, and (b) they lead to surprises as `tar x` run as root will change
# ownership if it can. We want to extract as the current user. By setting owner to root,
# root will extract as root, and non-privileged user will extract as themselves.
tarinfo.uid = 0
tarinfo.gid = 0
tarinfo.uname = ""
tarinfo.gname = ""
# Reset mtime to epoch time, our prefixes are not truly immutable, so files may get
# touched; as long as the content does not change, this ensures we get stable tarballs.
tarinfo.mtime = 0
# Normalize mode
if tarinfo.isfile() or tarinfo.islnk():
# If user can execute, use 0o755; else 0o644
# This is to avoid potentially unsafe world writable & exeutable files that may get
# extracted when Python or tar is run with privileges
tarinfo.mode = 0o644 if tarinfo.mode & 0o100 == 0 else 0o755
else: # symbolic link and directories
tarinfo.mode = 0o755
return tarinfo
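# Illustrative sketch (not part of the diff): passing the filter above to
# tarfile.TarFile.add yields owner-less, epoch-dated, mode-normalized entries, so
# re-creating an archive from an unchanged prefix is byte-for-byte stable (the gzip
# layer's own mtime is handled separately by gzip_compressed_tarfile). Paths are
# hypothetical.
import tarfile

with tarfile.open("/tmp/example.tar", "w") as tar:
    tar.add("/opt/spack/opt/zlib-1.2.13", arcname="zlib-1.2.13", filter=deterministic_tarinfo)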
def _tarinfo_name(p: str):
return p.lstrip("/")
def tar_add_metadata(tar: tarfile.TarFile, path: str, data: dict):
# Serialize buildinfo for the tarball
bstring = syaml.dump(data, default_flow_style=True).encode("utf-8")
tarinfo = tarfile.TarInfo(name=path)
tarinfo.size = len(bstring)
tar.addfile(deterministic_tarinfo(tarinfo), io.BytesIO(bstring))
def tarfile_of_spec_prefix(tar: tarfile.TarFile, prefix: str) -> None:
"""Create a tarfile of an install prefix of a spec. Skips existing buildinfo file.
Only adds regular files, symlinks and dirs. Skips devices, fifos. Preserves hardlinks.
Normalizes permissions like git. Tar entries are added in depth-first pre-order, with
dir entries partitioned by file | dir, and sorted alphabetically, for reproducibility.
Partitioning ensures only one dir is in memory at a time, and sorting improves compression.
Args:
tar: tarfile object to add files to
prefix: absolute install prefix of spec"""
if not os.path.isabs(prefix) or not os.path.isdir(prefix):
raise ValueError(f"prefix '{prefix}' must be an absolute path to a directory")
hardlink_to_tarinfo_name: Dict[Tuple[int, int], str] = dict()
stat_key = lambda stat: (stat.st_dev, stat.st_ino)
try: # skip buildinfo file if it exists
files_to_skip = [stat_key(os.lstat(buildinfo_file_name(prefix)))]
except OSError:
files_to_skip = []
dir_stack = [prefix]
while dir_stack:
dir = dir_stack.pop()
# Add the dir before its contents
dir_info = tarfile.TarInfo(_tarinfo_name(dir))
dir_info.type = tarfile.DIRTYPE
dir_info.mode = 0o755
tar.addfile(dir_info)
# Sort by name: reproducible & improves compression
with os.scandir(dir) as it:
entries = sorted(it, key=lambda entry: entry.name)
new_dirs = []
for entry in entries:
if entry.is_dir(follow_symlinks=False):
new_dirs.append(entry.path)
continue
file_info = tarfile.TarInfo(_tarinfo_name(entry.path))
s = entry.stat(follow_symlinks=False)
# Skip existing binary distribution files.
id = stat_key(s)
if id in files_to_skip:
continue
# Normalize the mode
file_info.mode = 0o644 if s.st_mode & 0o100 == 0 else 0o755
if entry.is_symlink():
file_info.type = tarfile.SYMTYPE
file_info.linkname = os.readlink(entry.path)
tar.addfile(file_info)
elif entry.is_file(follow_symlinks=False):
# Deduplicate hardlinks
if s.st_nlink > 1:
if id in hardlink_to_tarinfo_name:
file_info.type = tarfile.LNKTYPE
file_info.linkname = hardlink_to_tarinfo_name[id]
tar.addfile(file_info)
continue
hardlink_to_tarinfo_name[id] = file_info.name
# If file not yet seen, copy it.
file_info.type = tarfile.REGTYPE
file_info.size = s.st_size
with open(entry.path, "rb") as f:
tar.addfile(file_info, f)
dir_stack.extend(reversed(new_dirs)) # we pop, so reverse to stay alphabetical
def deterministic_tarinfo_without_buildinfo(tarinfo: tarfile.TarInfo):
"""Skip buildinfo file when creating a tarball, and normalize other tarinfo fields."""
if tarinfo.name.endswith("/.spack/binary_distribution"):
return None
return deterministic_tarinfo(tarinfo)
def _do_create_tarball(tarfile_path: str, binaries_dir: str, pkg_dir: str, buildinfo: dict):
def _do_create_tarball(tarfile_path: str, binaries_dir: str, buildinfo: dict):
with gzip_compressed_tarfile(tarfile_path) as tar:
tar.add(name=binaries_dir, arcname=pkg_dir, filter=deterministic_tarinfo_without_buildinfo)
tar_add_metadata(tar, buildinfo_file_name(pkg_dir), buildinfo)
# Tarball the install prefix
tarfile_of_spec_prefix(tar, binaries_dir)
# Serialize buildinfo for the tarball
bstring = syaml.dump(buildinfo, default_flow_style=True).encode("utf-8")
tarinfo = tarfile.TarInfo(name=_tarinfo_name(buildinfo_file_name(binaries_dir)))
tarinfo.type = tarfile.REGTYPE
tarinfo.size = len(bstring)
tarinfo.mode = 0o644
tar.addfile(tarinfo, io.BytesIO(bstring))
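# Hypothetical call-site sketch: bundle an install prefix plus its buildinfo metadata
# into a single reproducible tarball. The paths and buildinfo keys are made up.
_do_create_tarball(
    tarfile_path="/tmp/zlib-1.2.13.tar.gz",
    binaries_dir="/opt/spack/opt/zlib-1.2.13",
    buildinfo={"relative_prefix": "zlib-1.2.13"},
)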
class PushOptions(NamedTuple):
@@ -1278,14 +1317,12 @@ def _build_tarball_in_stage_dir(spec: Spec, out_url: str, stage_dir: str, option
):
raise NoOverwriteException(url_util.format(remote_specfile_path))
pkg_dir = os.path.basename(spec.prefix.rstrip(os.path.sep))
binaries_dir = spec.prefix
# create info for later relocation and create tar
buildinfo = get_buildinfo_dict(spec)
_do_create_tarball(tarfile_path, binaries_dir, pkg_dir, buildinfo)
_do_create_tarball(tarfile_path, binaries_dir, buildinfo)
# get the sha256 checksum of the tarball
checksum = checksum_tarball(tarfile_path)
@@ -1417,7 +1454,7 @@ def try_fetch(url_to_fetch):
try:
stage.fetch()
except web_util.FetchError:
except spack.error.FetchError:
stage.destroy()
return None
@@ -1580,9 +1617,10 @@ def dedupe_hardlinks_if_necessary(root, buildinfo):
for rel_path in buildinfo[key]:
stat_result = os.lstat(os.path.join(root, rel_path))
identifier = (stat_result.st_dev, stat_result.st_ino)
if identifier in visited:
continue
visited.add(identifier)
if stat_result.st_nlink > 1:
if identifier in visited:
continue
visited.add(identifier)
new_list.append(rel_path)
buildinfo[key] = new_list
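# Sketch of the invariant relied on above (not part of the diff): two paths alias the
# same hardlinked file exactly when their (st_dev, st_ino) pairs match, and only
# entries with st_nlink > 1 can have aliases, hence singletons are no longer tracked.
# File names are hypothetical.
import os

a = os.lstat("a.bin")
b = os.lstat("b.bin")
same_hardlink = a.st_nlink > 1 and (a.st_dev, a.st_ino) == (b.st_dev, b.st_ino)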
@@ -2144,7 +2182,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
if not os.path.exists(stage.save_filename):
try:
stage.fetch()
except web_util.FetchError:
except spack.error.FetchError:
continue
tty.debug("Found key {0}".format(fingerprint))
@@ -2296,7 +2334,7 @@ def _download_buildcache_entry(mirror_root, descriptions):
try:
stage.fetch()
break
except web_util.FetchError as e:
except spack.error.FetchError as e:
tty.debug(e)
else:
if fail_if_missing:

View File

@@ -228,7 +228,7 @@ def _install_and_test(
if not abstract_spec.intersects(candidate_spec):
continue
if python_spec is not None and python_spec not in abstract_spec:
if python_spec is not None and not abstract_spec.intersects(f"^{python_spec}"):
continue
for _, pkg_hash, pkg_sha256 in item["binaries"]:
@@ -446,16 +446,11 @@ def ensure_executables_in_path_or_raise(
current_bootstrapper.last_search["spec"],
current_bootstrapper.last_search["command"],
)
env_mods = spack.util.environment.EnvironmentModifications()
for dep in concrete_spec.traverse(
root=True, order="post", deptype=("link", "run")
):
env_mods.extend(
spack.user_environment.environment_modifications_for_spec(
dep, set_package_py_globals=False
)
cmd.add_default_envmod(
spack.user_environment.environment_modifications_for_specs(
concrete_spec, set_package_py_globals=False
)
cmd.add_default_envmod(env_mods)
)
return cmd
assert exception_handler, (

View File

@@ -40,11 +40,15 @@
import sys
import traceback
import types
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from typing import List, Tuple
import llnl.util.tty as tty
from llnl.string import plural
from llnl.util.filesystem import join_path
from llnl.util.lang import dedupe
from llnl.util.lang import dedupe, stable_partition
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
from llnl.util.tty.log import MultiProcessFd
@@ -54,17 +58,21 @@
import spack.build_systems.python
import spack.builder
import spack.config
import spack.deptypes as dt
import spack.main
import spack.package_base
import spack.paths
import spack.platforms
import spack.repo
import spack.schema.environment
import spack.spec
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.path
import spack.util.pattern
from spack import traverse
from spack.context import Context
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import spack_install_test_log
from spack.installer import InstallError
@@ -75,14 +83,12 @@
env_flag,
filter_system_paths,
get_path,
inspect_path,
is_system_path,
validate,
)
from spack.util.executable import Executable
from spack.util.log_parse import make_log_context, parse_log_events
from spack.util.module_cmd import load_module, module, path_from_modules
from spack.util.string import plural
#
# This can be set by the user to globally disable parallel builds.
@@ -109,7 +115,6 @@
SPACK_CCACHE_BINARY = "SPACK_CCACHE_BINARY"
SPACK_SYSTEM_DIRS = "SPACK_SYSTEM_DIRS"
# Platform-specific library suffix.
if sys.platform == "darwin":
dso_suffix = "dylib"
@@ -406,19 +411,13 @@ def set_compiler_environment_variables(pkg, env):
def set_wrapper_variables(pkg, env):
"""Set environment variables used by the Spack compiler wrapper
(which have the prefix `SPACK_`) and also add the compiler wrappers
to PATH.
"""Set environment variables used by the Spack compiler wrapper (which have the prefix
`SPACK_`) and also add the compiler wrappers to PATH.
This determines the injected -L/-I/-rpath options; each
of these specifies a search order and this function computes these
options in a manner that is intended to match the DAG traversal order
in `modifications_from_dependencies`: that method uses a post-order
traversal so that `PrependPath` actions from dependencies take lower
precedence; we use a post-order traversal here to match the visitation
order of `modifications_from_dependencies` (so we are visiting the
lowest priority packages first).
"""
This determines the injected -L/-I/-rpath options; each of these specifies a search order and
this function computes these options in a manner that is intended to match the DAG traversal
order in `SetupContext`. TODO: this is not the case yet, we're using post order, SetupContext
is using topo order."""
# Set environment variables if specified for
# the given compiler
compiler = pkg.compiler
@@ -537,45 +536,42 @@ def update_compiler_args_for_dep(dep):
env.set(SPACK_RPATH_DIRS, ":".join(rpath_dirs))
def set_module_variables_for_package(pkg):
def set_package_py_globals(pkg, context: Context = Context.BUILD):
"""Populate the Python module of a package with some useful global names.
This makes things easier for package writers.
"""
# Put a marker on this module so that it won't execute the body of this
# function again, since it is not needed
marker = "_set_run_already_called"
if getattr(pkg.module, marker, False):
return
module = ModuleChangePropagator(pkg)
jobs = determine_number_of_jobs(parallel=pkg.parallel)
m = module
m.make_jobs = jobs
# TODO: make these build deps that can be installed if not found.
m.make = MakeExecutable("make", jobs)
m.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
# own
if sys.platform == "win32":
m.nmake = Executable("nmake")
m.msbuild = Executable("msbuild")
# analog to configure for win32
m.cscript = Executable("cscript")
if context == Context.BUILD:
jobs = determine_number_of_jobs(parallel=pkg.parallel)
m.make_jobs = jobs
# Find the configure script in the archive path
# Don't use which for this; we want to find it in the current dir.
m.configure = Executable("./configure")
# TODO: make these build deps that can be installed if not found.
m.make = MakeExecutable("make", jobs)
m.gmake = MakeExecutable("gmake", jobs)
m.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
# own
if sys.platform == "win32":
m.nmake = Executable("nmake")
m.msbuild = Executable("msbuild")
# analog to configure for win32
m.cscript = Executable("cscript")
# Standard CMake arguments
m.std_cmake_args = spack.build_systems.cmake.CMakeBuilder.std_args(pkg)
m.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
m.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)
# Find the configure script in the archive path
# Don't use which for this; we want to find it in the current dir.
m.configure = Executable("./configure")
# Put spack compiler paths in module scope.
# Standard CMake arguments
m.std_cmake_args = spack.build_systems.cmake.CMakeBuilder.std_args(pkg)
m.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
m.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)
# Put spack compiler paths in module scope. (Some packages use them
# in setup_run_environment etc., so don't restrict this to context == Context.BUILD.)
link_dir = spack.paths.build_env_path
m.spack_cc = os.path.join(link_dir, pkg.compiler.link_paths["cc"])
m.spack_cxx = os.path.join(link_dir, pkg.compiler.link_paths["cxx"])
@@ -599,9 +595,6 @@ def static_to_shared_library(static_lib, shared_lib=None, **kwargs):
m.static_to_shared_library = static_to_shared_library
# Put a marker on this module so that it won't execute the body of this
# function again, since it is not needed
setattr(m, marker, True)
module.propagate_changes_to_mro()
@@ -727,12 +720,15 @@ def load_external_modules(pkg):
load_module(external_module)
def setup_package(pkg, dirty, context="build"):
def setup_package(pkg, dirty, context: Context = Context.BUILD):
"""Execute all environment setup routines."""
if context not in ["build", "test"]:
raise ValueError("'context' must be one of ['build', 'test'] - got: {0}".format(context))
if context not in (Context.BUILD, Context.TEST):
raise ValueError(f"'context' must be Context.BUILD or Context.TEST - got {context}")
set_module_variables_for_package(pkg)
# First populate the package.py's module with the relevant globals that could be used in any
# of the setup_* functions.
setup_context = SetupContext(pkg.spec, context=context)
setup_context.set_all_package_py_globals()
# Keep track of env changes from packages separately, since we want to
# issue warnings when packages make "suspicious" modifications.
@@ -740,13 +736,15 @@ def setup_package(pkg, dirty, context="build"):
env_mods = EnvironmentModifications()
# setup compilers for build contexts
need_compiler = context == "build" or (context == "test" and pkg.test_requires_compiler)
need_compiler = context == Context.BUILD or (
context == Context.TEST and pkg.test_requires_compiler
)
if need_compiler:
set_compiler_environment_variables(pkg, env_mods)
set_wrapper_variables(pkg, env_mods)
tty.debug("setup_package: grabbing modifications from dependencies")
env_mods.extend(modifications_from_dependencies(pkg.spec, context, custom_mods_only=False))
env_mods.extend(setup_context.get_env_modifications())
tty.debug("setup_package: collected all modifications from dependencies")
# architecture specific setup
@@ -754,7 +752,7 @@ def setup_package(pkg, dirty, context="build"):
target = platform.target(pkg.spec.architecture.target)
platform.setup_platform_environment(pkg, env_mods)
if context == "build":
if context == Context.BUILD:
tty.debug("setup_package: setup build environment for root")
builder = spack.builder.create(pkg)
builder.setup_build_environment(env_mods)
@@ -765,16 +763,7 @@ def setup_package(pkg, dirty, context="build"):
"config to assume that the package is part of the system"
" includes and omit it when invoked with '--cflags'."
)
elif context == "test":
tty.debug("setup_package: setup test environment for root")
env_mods.extend(
inspect_path(
pkg.spec.prefix,
spack.user_environment.prefix_inspections(pkg.spec.platform),
exclude=is_system_path,
)
)
pkg.setup_run_environment(env_mods)
elif context == Context.TEST:
env_mods.prepend_path("PATH", ".")
# First apply the clean environment changes
@@ -813,158 +802,245 @@ def setup_package(pkg, dirty, context="build"):
return env_base
def _make_runnable(pkg, env):
# Helper method which prepends a Package's bin/ prefix to the PATH
# environment variable
prefix = pkg.prefix
class EnvironmentVisitor:
def __init__(self, *roots: spack.spec.Spec, context: Context):
# For the roots (well, marked specs) we follow different edges
# than for their deps, depending on the context.
self.root_hashes = set(s.dag_hash() for s in roots)
for dirname in ["bin", "bin64"]:
bin_dir = os.path.join(prefix, dirname)
if os.path.isdir(bin_dir):
env.prepend_path("PATH", bin_dir)
if context == Context.BUILD:
# Drop direct run deps in build context
# We don't really distinguish between install and build time test deps,
# so we include them here as build-time test deps.
self.root_depflag = dt.BUILD | dt.TEST | dt.LINK
elif context == Context.TEST:
# This is more of an extended run environment
self.root_depflag = dt.TEST | dt.RUN | dt.LINK
elif context == Context.RUN:
self.root_depflag = dt.RUN | dt.LINK
def neighbors(self, item):
spec = item.edge.spec
if spec.dag_hash() in self.root_hashes:
depflag = self.root_depflag
else:
depflag = dt.LINK | dt.RUN
return traverse.sort_edges(spec.edges_to_dependencies(depflag=depflag))
def modifications_from_dependencies(
spec, context, custom_mods_only=True, set_package_py_globals=True
):
"""Returns the environment modifications that are required by
the dependencies of a spec and also applies modifications
to this spec's package at module scope, if need be.
class UseMode(Flag):
#: Entrypoint spec (a spec to be built; an env root, etc)
ROOT = auto()
Environment modifications include:
#: A spec used at runtime, but no executables in PATH
RUNTIME = auto()
- Updating PATH so that executables can be found
- Updating CMAKE_PREFIX_PATH and PKG_CONFIG_PATH so that their respective
tools can find Spack-built dependencies
- Running custom package environment modifications
#: A spec used at runtime, with executables in PATH
RUNTIME_EXECUTABLE = auto()
Custom package modifications can conflict with the default PATH changes
we make (specifically for the PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH
environment variables), so this applies changes in a fixed order:
#: A spec that's a direct build or test dep
BUILDTIME_DIRECT = auto()
- All modifications (custom and default) from external deps first
- All modifications from non-external deps afterwards
#: A spec that should be visible in search paths in a build env.
BUILDTIME = auto()
With that order, `PrependPath` actions from non-external default
environment modifications will take precedence over custom modifications
from external packages.
#: Flag is set when the (node, mode) is finalized
ADDED = auto()
A secondary constraint is that custom and default modifications are
grouped on a per-package basis: combined with the post-order traversal this
means that default modifications of dependents can override custom
modifications of dependencies (again, this would only occur for PATH,
CMAKE_PREFIX_PATH, or PKG_CONFIG_PATH).
Args:
spec (spack.spec.Spec): spec for which we want the modifications
context (str): either 'build' for build-time modifications or 'run'
for run-time modifications
custom_mods_only (bool): if True returns only custom modifications, if False
returns custom and default modifications
set_package_py_globals (bool): whether or not to set the global variables in the
package.py files (this may be problematic when using buildcaches that have
been built on a different but compatible OS)
"""
if context not in ["build", "run", "test"]:
raise ValueError(
"Expecting context to be one of ['build', 'run', 'test'], " "got: {0}".format(context)
def effective_deptypes(
*specs: spack.spec.Spec, context: Context = Context.BUILD
) -> List[Tuple[spack.spec.Spec, UseMode]]:
"""Given a list of input specs and a context, return a list of tuples of
all specs that contribute to (environment) modifications, together with
a flag specifying in what way they do so. The list is ordered topologically
from root to leaf, meaning that environment modifications should be applied
in reverse so that dependents override dependencies, not the other way around."""
visitor = traverse.TopoVisitor(
EnvironmentVisitor(*specs, context=context),
key=lambda x: x.dag_hash(),
root=True,
all_edges=True,
)
traverse.traverse_depth_first_with_visitor(traverse.with_artificial_edges(specs), visitor)
# Dictionary with "no mode" as default value, so it's easy to write modes[x] |= flag.
use_modes = defaultdict(lambda: UseMode(0))
nodes_with_type = []
for edge in visitor.edges:
parent, child, depflag = edge.parent, edge.spec, edge.depflag
# Mark the starting point
if parent is None:
use_modes[child] = UseMode.ROOT
continue
parent_mode = use_modes[parent]
# Nothing to propagate.
if not parent_mode:
continue
# Depending on the context, include particular deps from the root.
if UseMode.ROOT & parent_mode:
if context == Context.BUILD:
if (dt.BUILD | dt.TEST) & depflag:
use_modes[child] |= UseMode.BUILDTIME_DIRECT
if dt.LINK & depflag:
use_modes[child] |= UseMode.BUILDTIME
elif context == Context.TEST:
if (dt.RUN | dt.TEST) & depflag:
use_modes[child] |= UseMode.RUNTIME_EXECUTABLE
elif dt.LINK & depflag:
use_modes[child] |= UseMode.RUNTIME
elif context == Context.RUN:
if dt.RUN & depflag:
use_modes[child] |= UseMode.RUNTIME_EXECUTABLE
elif dt.LINK & depflag:
use_modes[child] |= UseMode.RUNTIME
# Propagate RUNTIME and RUNTIME_EXECUTABLE through link and run deps.
if (UseMode.RUNTIME | UseMode.RUNTIME_EXECUTABLE | UseMode.BUILDTIME_DIRECT) & parent_mode:
if dt.LINK & depflag:
use_modes[child] |= UseMode.RUNTIME
if dt.RUN & depflag:
use_modes[child] |= UseMode.RUNTIME_EXECUTABLE
# Propagate BUILDTIME through link deps.
if UseMode.BUILDTIME & parent_mode:
if dt.LINK & depflag:
use_modes[child] |= UseMode.BUILDTIME
# Finalize the spec; the invariant is that all in-edges are processed
# before out-edges, meaning that parent is done.
if not (UseMode.ADDED & parent_mode):
use_modes[parent] |= UseMode.ADDED
nodes_with_type.append((parent, parent_mode))
# Attach the leaf nodes, since we only added nodes with out-edges.
for spec, parent_mode in use_modes.items():
if parent_mode and not (UseMode.ADDED & parent_mode):
nodes_with_type.append((spec, parent_mode))
return nodes_with_type
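# Hypothetical usage sketch: classify every spec that contributes environment
# modifications when building `spec` (assumed concrete), then walk the result
# leaf-to-root so that dependents override dependencies.
for node, mode in reversed(effective_deptypes(spec, context=Context.BUILD)):
    if mode & UseMode.BUILDTIME_DIRECT:
        print(f"{node.name}: direct build/test dep; its executables belong on PATH")
    elif mode & UseMode.BUILDTIME:
        print(f"{node.name}: transitive link dep; searchable via CMAKE_PREFIX_PATH")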
class SetupContext:
"""This class encapsulates the logic to determine environment modifications, and is used as
well to set globals in modules of package.py."""
def __init__(self, *specs: spack.spec.Spec, context: Context) -> None:
"""Construct a ModificationsFromDag object.
Args:
specs: single root spec for build/test context, possibly more for run context
context: build, run, or test"""
if (context == Context.BUILD or context == Context.TEST) and not len(specs) == 1:
raise ValueError("Cannot setup build environment for multiple specs")
specs_with_type = effective_deptypes(*specs, context=context)
self.specs = specs
self.context = context
self.external: List[Tuple[spack.spec.Spec, UseMode]]
self.nonexternal: List[Tuple[spack.spec.Spec, UseMode]]
# Reverse so we go from leaf to root
self.nodes_in_subdag = set(id(s) for s, _ in specs_with_type)
# Split into non-external and external, maintaining topo order per group.
self.external, self.nonexternal = stable_partition(
reversed(specs_with_type), lambda t: t[0].external
)
self.should_be_runnable = UseMode.BUILDTIME_DIRECT | UseMode.RUNTIME_EXECUTABLE
self.should_setup_run_env = UseMode.RUNTIME | UseMode.RUNTIME_EXECUTABLE
self.should_setup_dependent_build_env = UseMode.BUILDTIME | UseMode.BUILDTIME_DIRECT
env = EnvironmentModifications()
if context == Context.RUN or context == Context.TEST:
self.should_be_runnable |= UseMode.ROOT
self.should_setup_run_env |= UseMode.ROOT
# Note: see computation of 'custom_mod_deps' and 'exe_deps' later in this
# function; these sets form the building blocks of those collections.
build_deps = set(spec.dependencies(deptype=("build", "test")))
link_deps = set(spec.traverse(root=False, deptype="link"))
build_link_deps = build_deps | link_deps
build_and_supporting_deps = set()
for build_dep in build_deps:
build_and_supporting_deps.update(build_dep.traverse(deptype="run"))
run_and_supporting_deps = set(spec.traverse(root=False, deptype=("run", "link")))
test_and_supporting_deps = set()
for test_dep in set(spec.dependencies(deptype="test")):
test_and_supporting_deps.update(test_dep.traverse(deptype="run"))
# Everything that calls setup_run_environment and setup_dependent_* needs globals set.
self.should_set_package_py_globals = (
self.should_setup_dependent_build_env | self.should_setup_run_env | UseMode.ROOT
)
# In a build context, the root and direct build deps need build-specific globals set.
self.needs_build_context = UseMode.ROOT | UseMode.BUILDTIME_DIRECT
# All dependencies that might have environment modifications to apply
custom_mod_deps = set()
if context == "build":
custom_mod_deps.update(build_and_supporting_deps)
# Tests may be performed after build
custom_mod_deps.update(test_and_supporting_deps)
else:
# test/run context
custom_mod_deps.update(run_and_supporting_deps)
if context == "test":
custom_mod_deps.update(test_and_supporting_deps)
custom_mod_deps.update(link_deps)
def set_all_package_py_globals(self):
"""Set the globals in modules of package.py files."""
for dspec, flag in chain(self.external, self.nonexternal):
pkg = dspec.package
# Determine 'exe_deps': the set of packages with binaries we want to use
if context == "build":
exe_deps = build_and_supporting_deps | test_and_supporting_deps
elif context == "run":
exe_deps = set(spec.traverse(deptype="run"))
elif context == "test":
exe_deps = test_and_supporting_deps
if self.should_set_package_py_globals & flag:
if self.context == Context.BUILD and self.needs_build_context & flag:
set_package_py_globals(pkg, context=Context.BUILD)
else:
# This includes runtime dependencies, also runtime deps of direct build deps.
set_package_py_globals(pkg, context=Context.RUN)
def default_modifications_for_dep(dep):
if dep in build_link_deps and not is_system_path(dep.prefix) and context == "build":
prefix = dep.prefix
for spec in dspec.dependents():
# Note: some specs have dependents that are unreachable from the root, so avoid
# setting globals for those.
if id(spec) not in self.nodes_in_subdag:
continue
dependent_module = ModuleChangePropagator(spec.package)
pkg.setup_dependent_package(dependent_module, spec)
dependent_module.propagate_changes_to_mro()
env.prepend_path("CMAKE_PREFIX_PATH", prefix)
def get_env_modifications(self) -> EnvironmentModifications:
"""Returns the environment variable modifications for the given input specs and context.
Environment modifications include:
- Updating PATH for packages that are required at runtime
- Updating CMAKE_PREFIX_PATH and PKG_CONFIG_PATH so that their respective
tools can find Spack-built dependencies (when context=build)
- Running custom package environment modifications (setup_run_environment,
setup_dependent_build_environment, setup_dependent_run_environment)
for directory in ("lib", "lib64", "share"):
pcdir = os.path.join(prefix, directory, "pkgconfig")
if os.path.isdir(pcdir):
env.prepend_path("PKG_CONFIG_PATH", pcdir)
The (partial) order imposed on the specs is externals first, then topological
from leaf to root. That way externals cannot contribute search paths that would shadow
Spack's prefixes, and dependents override variables set by dependencies."""
env = EnvironmentModifications()
for dspec, flag in chain(self.external, self.nonexternal):
tty.debug(f"Adding env modifications for {dspec.name}")
pkg = dspec.package
if dep in exe_deps and not is_system_path(dep.prefix):
_make_runnable(dep, env)
if self.should_setup_dependent_build_env & flag:
self._make_buildtime_detectable(dspec, env)
def add_modifications_for_dep(dep):
tty.debug("Adding env modifications for {0}".format(dep.name))
# Some callers of this function only want the custom modifications.
# For callers that want both custom and default modifications, we want
# to perform the default modifications here (this groups custom
# and default modifications together on a per-package basis).
if not custom_mods_only:
default_modifications_for_dep(dep)
for spec in self.specs:
builder = spack.builder.create(pkg)
builder.setup_dependent_build_environment(env, spec)
# Perform custom modifications here (PrependPath actions performed in
# the custom method override the default environment modifications
# we do to help the build, namely for PATH, CMAKE_PREFIX_PATH, and
# PKG_CONFIG_PATH)
if dep in custom_mod_deps:
dpkg = dep.package
if set_package_py_globals:
set_module_variables_for_package(dpkg)
if self.should_be_runnable & flag:
self._make_runnable(dspec, env)
current_module = ModuleChangePropagator(spec.package)
dpkg.setup_dependent_package(current_module, spec)
current_module.propagate_changes_to_mro()
if self.should_setup_run_env & flag:
# TODO: remove setup_dependent_run_environment...
for spec in dspec.dependents(deptype=dt.RUN):
if id(spec) in self.nodes_in_subdag:
pkg.setup_dependent_run_environment(env, spec)
pkg.setup_run_environment(env)
return env
if context == "build":
builder = spack.builder.create(dpkg)
builder.setup_dependent_build_environment(env, spec)
else:
dpkg.setup_dependent_run_environment(env, spec)
tty.debug("Added env modifications for {0}".format(dep.name))
def _make_buildtime_detectable(self, dep: spack.spec.Spec, env: EnvironmentModifications):
if is_system_path(dep.prefix):
return
# Note that we want to perform environment modifications in a fixed order.
# The Spec.traverse method provides this: i.e. in addition to
# the post-order semantics, it also guarantees a fixed traversal order
# among dependencies which are not constrained by post-order semantics.
for dspec in spec.traverse(root=False, order="post"):
if dspec.external:
add_modifications_for_dep(dspec)
env.prepend_path("CMAKE_PREFIX_PATH", dep.prefix)
for d in ("lib", "lib64", "share"):
pcdir = os.path.join(dep.prefix, d, "pkgconfig")
if os.path.isdir(pcdir):
env.prepend_path("PKG_CONFIG_PATH", pcdir)
for dspec in spec.traverse(root=False, order="post"):
# Default env modifications for non-external packages can override
# custom modifications of external packages (this can only occur
# for modifications to PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH)
if not dspec.external:
add_modifications_for_dep(dspec)
def _make_runnable(self, dep: spack.spec.Spec, env: EnvironmentModifications):
if is_system_path(dep.prefix):
return
return env
for d in ("bin", "bin64"):
bin_dir = os.path.join(dep.prefix, d)
if os.path.isdir(bin_dir):
env.prepend_path("PATH", bin_dir)
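# Hypothetical usage sketch mirroring setup_package above: compute and apply the
# environment for running a concrete spec (`spec` is assumed concrete;
# apply_modifications mutates os.environ).
setup_context = SetupContext(spec, context=Context.RUN)
setup_context.set_all_package_py_globals()
setup_context.get_env_modifications().apply_modifications()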
def get_cmake_prefix_path(pkg):
@@ -996,7 +1072,7 @@ def get_cmake_prefix_path(pkg):
def _setup_pkg_and_run(
serialized_pkg, function, kwargs, write_pipe, input_multiprocess_fd, jsfd1, jsfd2
):
context = kwargs.get("context", "build")
context: str = kwargs.get("context", "build")
try:
# We are in the child process. Python sets sys.stdin to
@@ -1012,7 +1088,7 @@ def _setup_pkg_and_run(
if not kwargs.get("fake", False):
kwargs["unmodified_env"] = os.environ.copy()
kwargs["env_modifications"] = setup_package(
pkg, dirty=kwargs.get("dirty", False), context=context
pkg, dirty=kwargs.get("dirty", False), context=Context.from_string(context)
)
return_value = function(pkg, kwargs)
write_pipe.send(return_value)

View File

@@ -46,6 +46,7 @@ class AutotoolsPackage(spack.package_base.PackageBase):
depends_on("gnuconfig", type="build", when="target=ppc64le:")
depends_on("gnuconfig", type="build", when="target=aarch64:")
depends_on("gnuconfig", type="build", when="target=riscv64:")
depends_on("gmake", type="build")
conflicts("platform=windows")
def flags_to_build_system_args(self, flags):

View File

@@ -142,10 +142,10 @@ def flags_to_build_system_args(self, flags):
# We specify for each of them.
if flags["ldflags"]:
ldflags = " ".join(flags["ldflags"])
ld_string = "-DCMAKE_{0}_LINKER_FLAGS={1}"
# cmake has separate linker arguments for types of builds.
for type in ["EXE", "MODULE", "SHARED", "STATIC"]:
self.cmake_flag_args.append(ld_string.format(type, ldflags))
self.cmake_flag_args.append(f"-DCMAKE_EXE_LINKER_FLAGS={ldflags}")
self.cmake_flag_args.append(f"-DCMAKE_MODULE_LINKER_FLAGS={ldflags}")
self.cmake_flag_args.append(f"-DCMAKE_SHARED_LINKER_FLAGS={ldflags}")
# CMake has libs options separated by language. Apply ours to each.
if flags["ldlibs"]:

View File

@@ -9,7 +9,8 @@
import spack.builder
import spack.package_base
from spack.directives import build_system, conflicts
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from ._checks import (
BaseBuilder,
@@ -29,7 +30,10 @@ class MakefilePackage(spack.package_base.PackageBase):
legacy_buildsystem = "makefile"
build_system("makefile")
conflicts("platform=windows", when="build_system=makefile")
with when("build_system=makefile"):
conflicts("platform=windows")
depends_on("gmake", type="build")
@spack.builder.builder("makefile")

View File

@@ -10,7 +10,7 @@
import spack.builder
import spack.package_base
from spack.directives import build_system, depends_on, variant
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from ._checks import BaseBuilder, execute_build_time_tests
@@ -47,6 +47,13 @@ class MesonPackage(spack.package_base.PackageBase):
variant("strip", default=False, description="Strip targets on install")
depends_on("meson", type="build")
depends_on("ninja", type="build")
# Python detection in meson requires distutils to be importable, but distutils no longer
# exists in Python 3.12. In Spack, we can't use setuptools as distutils replacement,
# because the distutils-precedence.pth startup file that setuptools ships with is not run
# when setuptools is in PYTHONPATH; it has to be in system site-packages. In a future meson
# release, the distutils requirement will be dropped, so this conflict can be relaxed.
# We have patches to make it work with meson 1.1 and above.
conflicts("^python@3.12:", when="^meson@:1.0")
def flags_to_build_system_args(self, flags):
"""Produces a list of all command line arguments to pass the specified

View File

@@ -61,6 +61,11 @@ def component_prefix(self):
"""Path to component <prefix>/<component>/<version>."""
return self.prefix.join(join_path(self.component_dir, self.spec.version))
@property
def env_script_args(self):
"""Additional arguments to pass to vars.sh script."""
return ()
def install(self, spec, prefix):
self.install_component(basename(self.url_for_version(spec.version)))
@@ -124,7 +129,7 @@ def setup_run_environment(self, env):
if "~envmods" not in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
join_path(self.component_prefix, "env", "vars.sh")
join_path(self.component_prefix, "env", "vars.sh"), *self.env_script_args
)
)
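# Hypothetical sketch of the new hook: a component package can override env_script_args
# so that its vars.sh is sourced with extra arguments; the class name and argument below
# are made up.
class IntelOneapiExampleComponent(IntelOneApiPackage):
    @property
    def env_script_args(self):
        return ("--include-intel-llvm",)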

View File

@@ -6,6 +6,7 @@
import os
import re
import shutil
import stat
from typing import Optional
import archspec
@@ -16,6 +17,7 @@
import spack.builder
import spack.config
import spack.deptypes as dt
import spack.detection
import spack.multimethod
import spack.package_base
@@ -24,6 +26,7 @@
from spack.directives import build_system, depends_on, extends, maintainers
from spack.error import NoHeadersError, NoLibrariesError, SpecError
from spack.install_test import test_part
from spack.util.executable import Executable
from spack.version import Version
from ._checks import BaseBuilder, execute_install_time_tests
@@ -226,7 +229,48 @@ def update_external_dependencies(self, extendee_spec=None):
python.external_path = self.spec.external_path
python._mark_concrete()
self.spec.add_dependency_edge(python, deptypes=("build", "link", "run"), virtuals=())
self.spec.add_dependency_edge(python, depflag=dt.BUILD | dt.LINK | dt.RUN, virtuals=())
def get_external_python_for_prefix(self):
"""
For an external package that extends python, find the most likely spec for the python
it depends on.
First search: an "installed" external that shares a prefix with this package
Second search: a configured external that shares a prefix with this package
Third search: search this prefix for a python package
Returns:
spack.spec.Spec: The external Spec for python most likely to be compatible with self.spec
"""
python_externals_installed = [
s for s in spack.store.STORE.db.query("python") if s.prefix == self.spec.external_path
]
if python_externals_installed:
return python_externals_installed[0]
python_external_config = spack.config.get("packages:python:externals", [])
python_externals_configured = [
spack.spec.parse_with_version_concrete(item["spec"])
for item in python_external_config
if item["prefix"] == self.spec.external_path
]
if python_externals_configured:
return python_externals_configured[0]
python_externals_detection = spack.detection.by_path(
["python"], path_hints=[self.spec.external_path]
)
python_externals_detected = [
d.spec
for d in python_externals_detection.get("python", [])
if d.prefix == self.spec.external_path
]
if python_externals_detected:
return python_externals_detected[0]
raise StopIteration("No external python could be detected for %s to depend on" % self.spec)
class PythonPackage(PythonExtension):
@@ -273,54 +317,16 @@ def list_url(cls):
name = cls.pypi.split("/")[0]
return "https://pypi.org/simple/" + name + "/"
def get_external_python_for_prefix(self):
"""
For an external package that extends python, find the most likely spec for the python
it depends on.
First search: an "installed" external that shares a prefix with this package
Second search: a configured external that shares a prefix with this package
Third search: search this prefix for a python package
Returns:
spack.spec.Spec: The external Spec for python most likely to be compatible with self.spec
"""
python_externals_installed = [
s for s in spack.store.STORE.db.query("python") if s.prefix == self.spec.external_path
]
if python_externals_installed:
return python_externals_installed[0]
python_external_config = spack.config.get("packages:python:externals", [])
python_externals_configured = [
spack.spec.parse_with_version_concrete(item["spec"])
for item in python_external_config
if item["prefix"] == self.spec.external_path
]
if python_externals_configured:
return python_externals_configured[0]
python_externals_detection = spack.detection.by_path(
["python"], path_hints=[self.spec.external_path]
)
python_externals_detected = [
d.spec
for d in python_externals_detection.get("python", [])
if d.prefix == self.spec.external_path
]
if python_externals_detected:
return python_externals_detected[0]
raise StopIteration("No external python could be detected for %s to depend on" % self.spec)
@property
def headers(self):
"""Discover header files in platlib."""
# Remove py- prefix in package name
name = self.spec.name[3:]
# Headers may be in either location
include = self.prefix.join(self.spec["python"].package.include)
platlib = self.prefix.join(self.spec["python"].package.platlib)
include = self.prefix.join(self.spec["python"].package.include).join(name)
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
headers = fs.find_all_headers(include) + fs.find_all_headers(platlib)
if headers:
@@ -334,18 +340,64 @@ def libs(self):
"""Discover libraries in platlib."""
# Remove py- prefix in package name
library = "lib" + self.spec.name[3:].replace("-", "?")
root = self.prefix.join(self.spec["python"].package.platlib)
name = self.spec.name[3:]
for shared in [True, False]:
libs = fs.find_libraries(library, root, shared=shared, recursive=True)
if libs:
return libs
root = self.prefix.join(self.spec["python"].package.platlib).join(name)
libs = fs.find_all_libraries(root, recursive=True)
if libs:
return libs
msg = "Unable to recursively locate {} libraries in {}"
raise NoLibrariesError(msg.format(self.spec.name, root))
def fixup_shebangs(path: str, old_interpreter: bytes, new_interpreter: bytes):
# Recurse into the install prefix and fixup shebangs
exe = stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH
dirs = [path]
hardlinks = set()
while dirs:
with os.scandir(dirs.pop()) as entries:
for entry in entries:
if entry.is_dir(follow_symlinks=False):
dirs.append(entry.path)
continue
# Only consider files, not symlinks
if not entry.is_file(follow_symlinks=False):
continue
lstat = entry.stat(follow_symlinks=False)
# Skip over files that are not executable
if not (lstat.st_mode & exe):
continue
# Don't modify hardlinks more than once
if lstat.st_nlink > 1:
key = (lstat.st_ino, lstat.st_dev)
if key in hardlinks:
continue
hardlinks.add(key)
# Finally replace shebangs if any.
with open(entry.path, "rb+") as f:
contents = f.read(2)
if contents != b"#!":
continue
contents += f.read()
if old_interpreter not in contents:
continue
f.seek(0)
f.write(contents.replace(old_interpreter, new_interpreter))
f.truncate()
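# Hypothetical usage sketch: rewrite shebangs that still point at a throwaway build
# venv so they reference the real interpreter (both paths are made up).
fixup_shebangs(
    path="/opt/spack/opt/py-example-1.0",
    old_interpreter=b"/tmp/spack-stage/build_env/bin/python",
    new_interpreter=b"/opt/spack/opt/python-3.11.4/bin/python",
)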
@spack.builder.builder("python_pip")
class PythonPipBuilder(BaseBuilder):
phases = ("install",)
@@ -442,8 +494,36 @@ def global_options(self, spec, prefix):
"""
return []
@property
def _build_venv_path(self):
"""Return the path to the virtual environment used for building when
python is external."""
return os.path.join(self.spec.package.stage.path, "build_env")
@property
def _build_venv_python(self) -> Executable:
"""Return the Python executable in the build virtual environment when
python is external."""
return Executable(os.path.join(self._build_venv_path, "bin", "python"))
def install(self, pkg, spec, prefix):
"""Install everything from build directory."""
python: Executable = spec["python"].command
# Since we invoke pip with --no-build-isolation, we have to make sure that pip cannot
# execute hooks from user and system site-packages.
if spec["python"].external:
# There are no environment variables to disable the system site-packages, so we use a
# virtual environment instead. The downside of this approach is that pip produces
# incorrect shebangs that refer to the virtual environment, which we have to fix up.
python("-m", "venv", "--without-pip", self._build_venv_path)
pip = self._build_venv_python
else:
# For a Spack managed Python, system site-packages is empty/unused by design, so it
# suffices to disable user site-packages, for which there is an environment variable.
pip = python
pip.add_default_env("PYTHONNOUSERSITE", "1")
pip.add_default_arg("-m")
pip.add_default_arg("pip")
args = PythonPipBuilder.std_args(pkg) + ["--prefix=" + prefix]
@@ -467,8 +547,31 @@ def install(self, pkg, spec, prefix):
else:
args.append(".")
pip = inspect.getmodule(pkg).pip
with fs.working_dir(self.build_directory):
pip(*args)
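# Net effect of the branches above, as a sketch (paths hypothetical): with a
# Spack-managed Python every pip(*args) call runs
#     PYTHONNOUSERSITE=1 /spack/prefix/bin/python -m pip <args>
# while with an external Python it runs the isolated venv's interpreter
#     /tmp/stage/build_env/bin/python -m pip <args>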
@spack.builder.run_after("install")
def fixup_shebangs_pointing_to_build(self):
"""When installing a package using an external python, we use a temporary virtual
environment which improves build isolation. The downside is that pip produces shebangs
that point to the temporary virtual environment. This method fixes them up to point to the
underlying Python."""
# No need to fixup shebangs if no build venv was used. (This post-install function also
# runs when install was overridden in another package, so check existence of the venv path.)
if not os.path.exists(self._build_venv_path):
return
# Use sys.executable, since that's what pip uses.
interpreter = (
lambda python: python("-c", "import sys; print(sys.executable)", output=str)
.strip()
.encode("utf-8")
)
fixup_shebangs(
path=self.spec.prefix,
old_interpreter=interpreter(self._build_venv_python),
new_interpreter=interpreter(self.spec["python"].command),
)
spack.builder.run_after("install")(execute_install_time_tests)

View File

@@ -64,7 +64,7 @@ class RacketBuilder(spack.builder.Builder):
@property
def subdirectory(self):
if self.racket_name:
if self.pkg.racket_name:
return "pkgs/{0}".format(self.pkg.racket_name)
return None

View File

@@ -49,7 +49,11 @@
TEMP_STORAGE_MIRROR_NAME = "ci_temporary_mirror"
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
# TODO: Remove this in Spack 0.23
SHARED_PR_MIRROR_URL = "s3://spack-binaries-prs/shared_pr_mirror"
JOB_NAME_FORMAT = (
"{name}{@version} {/hash:7} {%compiler.name}{@compiler.version}{arch=architecture}"
)
spack_gpg = spack.main.SpackCommand("gpg")
spack_compiler = spack.main.SpackCommand("compiler")
@@ -69,48 +73,23 @@ def __exit__(self, exc_type, exc_value, exc_traceback):
return False
def get_job_name(spec, osarch, build_group):
"""Given the necessary parts, format the gitlab job name
def get_job_name(spec: spack.spec.Spec, build_group: str = ""):
"""Given a spec and possibly a build group, return the job name. If the
resulting name is longer than 255 characters, it will be truncated.
Arguments:
spec (spack.spec.Spec): Spec job will build
osarch: Architecture TODO: (this is a spack.spec.ArchSpec,
but sphinx doesn't recognize the type and fails).
build_group (str): Name of build group this job belongs to (a CDash
notion)
Returns: The job name
"""
item_idx = 0
format_str = ""
format_args = []
format_str += "{{{0}}}".format(item_idx)
format_args.append(spec.name)
item_idx += 1
format_str += "/{{{0}}}".format(item_idx)
format_args.append(spec.dag_hash(7))
item_idx += 1
format_str += " {{{0}}}".format(item_idx)
format_args.append(spec.version)
item_idx += 1
format_str += " {{{0}}}".format(item_idx)
format_args.append(spec.compiler)
item_idx += 1
format_str += " {{{0}}}".format(item_idx)
format_args.append(osarch)
item_idx += 1
job_name = spec.format(JOB_NAME_FORMAT)
if build_group:
format_str += " {{{0}}}".format(item_idx)
format_args.append(build_group)
item_idx += 1
job_name = "{0} {1}".format(job_name, build_group)
return format_str.format(*format_args)
return job_name[:255]
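# Illustrative only: for a concrete spec, JOB_NAME_FORMAT expands to something like
#     "zlib@1.2.13 /abcdefg %gcc@12.2.0 arch=linux-ubuntu22.04-x86_64"
# (hash and arch made up), optionally suffixed with the build group; the [:255]
# slice guards against GitLab's job-name length limit.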
def _remove_reserved_tags(tags):
@@ -308,7 +287,7 @@ def append_dep(s, d):
dependencies.append({"spec": s, "depends": d})
for spec in spec_list:
for s in spec.traverse(deptype=all):
for s in spec.traverse(deptype="all"):
if s.external:
tty.msg("Will not stage external pkg: {0}".format(s))
continue
@@ -316,7 +295,7 @@ def append_dep(s, d):
skey = _spec_deps_key(s)
spec_labels[skey] = s
for d in s.dependencies(deptype=all):
for d in s.dependencies(deptype="all"):
dkey = _spec_deps_key(d)
if d.external:
tty.msg("Will not stage external dep: {0}".format(d))
@@ -337,7 +316,7 @@ def _spec_matches(spec, match_string):
def _format_job_needs(
dep_jobs, osname, build_group, prune_dag, rebuild_decisions, enable_artifacts_buildcache
dep_jobs, build_group, prune_dag, rebuild_decisions, enable_artifacts_buildcache
):
needs_list = []
for dep_job in dep_jobs:
@@ -347,7 +326,7 @@ def _format_job_needs(
if not prune_dag or rebuild:
needs_list.append(
{
"job": get_job_name(dep_job, dep_job.architecture, build_group),
"job": get_job_name(dep_job, build_group),
"artifacts": enable_artifacts_buildcache,
}
)
@@ -700,7 +679,7 @@ def generate_gitlab_ci_yaml(
remote_mirror_override (str): Typically only needed when one spack.yaml
is used to populate several mirrors with binaries, based on some
criteria. Spack protected pipelines populate different mirrors based
on branch name, facilitated by this option.
on branch name, facilitated by this option. DEPRECATED
"""
with spack.concretize.disable_compiler_existence_check():
with env.write_transaction():
@@ -797,17 +776,39 @@ def generate_gitlab_ci_yaml(
"instead.",
)
if "mirrors" not in yaml_root or len(yaml_root["mirrors"].values()) < 1:
tty.die("spack ci generate requires an env containing a mirror")
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
deprecated_mirror_config = False
buildcache_destination = None
if "buildcache-destination" in pipeline_mirrors:
if remote_mirror_override:
tty.die(
"Using the deprecated --buildcache-destination cli option and "
"having a mirror named 'buildcache-destination' at the same time "
"is not allowed"
)
buildcache_destination = pipeline_mirrors["buildcache-destination"]
else:
deprecated_mirror_config = True
# TODO: This will be an error in Spack 0.23
ci_mirrors = yaml_root["mirrors"]
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
# TODO: Remove this block in spack 0.23
remote_mirror_url = None
if deprecated_mirror_config:
if "mirrors" not in yaml_root or len(yaml_root["mirrors"].values()) < 1:
tty.die("spack ci generate requires an env containing a mirror")
ci_mirrors = yaml_root["mirrors"]
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
spack_buildcache_copy = os.environ.get("SPACK_COPY_BUILDCACHE", None)
if spack_buildcache_copy:
buildcache_copies = {}
buildcache_copy_src_prefix = remote_mirror_override or remote_mirror_url
buildcache_copy_src_prefix = (
buildcache_destination.fetch_url
if buildcache_destination
else remote_mirror_override or remote_mirror_url
)
buildcache_copy_dest_prefix = spack_buildcache_copy
# Check for a list of "known broken" specs that we should not bother
@@ -819,6 +820,7 @@ def generate_gitlab_ci_yaml(
enable_artifacts_buildcache = False
if "enable-artifacts-buildcache" in ci_config:
tty.warn("Support for enable-artifacts-buildcache will be removed in Spack 0.23")
enable_artifacts_buildcache = ci_config["enable-artifacts-buildcache"]
rebuild_index_enabled = True
@@ -827,13 +829,15 @@ def generate_gitlab_ci_yaml(
temp_storage_url_prefix = None
if "temporary-storage-url-prefix" in ci_config:
tty.warn("Support for temporary-storage-url-prefix will be removed in Spack 0.23")
temp_storage_url_prefix = ci_config["temporary-storage-url-prefix"]
# If a remote mirror override (alternate buildcache destination) was
# specified, add it here in case it has already built hashes we might
# generate.
# TODO: Remove this block in Spack 0.23
mirrors_to_check = None
if remote_mirror_override:
if deprecated_mirror_config and remote_mirror_override:
if spack_pipeline_type == "spack_protected_branch":
# Overriding the main mirror in this case might result
# in skipping jobs on a release pipeline because specs are
@@ -853,8 +857,9 @@ def generate_gitlab_ci_yaml(
cfg.default_modify_scope(),
)
# TODO: Remove this block in Spack 0.23
shared_pr_mirror = None
if spack_pipeline_type == "spack_pull_request":
if deprecated_mirror_config and spack_pipeline_type == "spack_pull_request":
stack_name = os.environ.get("SPACK_CI_STACK_NAME", "")
shared_pr_mirror = url_util.join(SHARED_PR_MIRROR_URL, stack_name)
spack.mirror.add(
@@ -906,6 +911,7 @@ def generate_gitlab_ci_yaml(
job_log_dir = os.path.join(pipeline_artifacts_dir, "logs")
job_repro_dir = os.path.join(pipeline_artifacts_dir, "reproduction")
job_test_dir = os.path.join(pipeline_artifacts_dir, "tests")
# TODO: Remove this line in Spack 0.23
local_mirror_dir = os.path.join(pipeline_artifacts_dir, "mirror")
user_artifacts_dir = os.path.join(pipeline_artifacts_dir, "user_data")
@@ -920,11 +926,11 @@ def generate_gitlab_ci_yaml(
rel_job_log_dir = os.path.relpath(job_log_dir, ci_project_dir)
rel_job_repro_dir = os.path.relpath(job_repro_dir, ci_project_dir)
rel_job_test_dir = os.path.relpath(job_test_dir, ci_project_dir)
# TODO: Remove this line in Spack 0.23
rel_local_mirror_dir = os.path.join(local_mirror_dir, ci_project_dir)
rel_user_artifacts_dir = os.path.relpath(user_artifacts_dir, ci_project_dir)
# Speed up staging by first fetching binary indices from all mirrors
# (including the override mirror we may have just added above).
try:
bindist.binary_index.update()
except bindist.FetchCacheError as e:
@@ -1023,19 +1029,23 @@ def main_script_replacements(cmd):
if "after_script" in job_object:
job_object["after_script"] = _unpack_script(job_object["after_script"])
osname = str(release_spec.architecture)
job_name = get_job_name(release_spec, osname, build_group)
job_name = get_job_name(release_spec, build_group)
job_vars = job_object.setdefault("variables", {})
job_vars["SPACK_JOB_SPEC_DAG_HASH"] = release_spec_dag_hash
job_vars["SPACK_JOB_SPEC_PKG_NAME"] = release_spec.name
job_vars["SPACK_JOB_SPEC_PKG_VERSION"] = release_spec.format("{version}")
job_vars["SPACK_JOB_SPEC_COMPILER_NAME"] = release_spec.format("{compiler.name}")
job_vars["SPACK_JOB_SPEC_COMPILER_VERSION"] = release_spec.format("{compiler.version}")
job_vars["SPACK_JOB_SPEC_ARCH"] = release_spec.format("{architecture}")
job_vars["SPACK_JOB_SPEC_VARIANTS"] = release_spec.format("{variants}")
job_object["needs"] = []
if spec_label in dependencies:
if enable_artifacts_buildcache:
# Get dependencies transitively, so they're all
# available in the artifacts buildcache.
dep_jobs = [d for d in release_spec.traverse(deptype=all, root=False)]
dep_jobs = [d for d in release_spec.traverse(deptype="all", root=False)]
else:
# In this case, "needs" is only used for scheduling
# purposes, so we only get the direct dependencies.
@@ -1046,7 +1056,6 @@ def main_script_replacements(cmd):
job_object["needs"].extend(
_format_job_needs(
dep_jobs,
osname,
build_group,
prune_dag,
rebuild_decisions,
@@ -1132,6 +1141,7 @@ def main_script_replacements(cmd):
},
)
# TODO: Remove this block in Spack 0.23
if enable_artifacts_buildcache:
bc_root = os.path.join(local_mirror_dir, "build_cache")
job_object["artifacts"]["paths"].extend(
@@ -1161,10 +1171,12 @@ def main_script_replacements(cmd):
_print_staging_summary(spec_labels, stages, mirrors_to_check, rebuild_decisions)
# Clean up remote mirror override if enabled
if remote_mirror_override:
spack.mirror.remove("ci_pr_mirror", cfg.default_modify_scope())
if spack_pipeline_type == "spack_pull_request":
spack.mirror.remove("ci_shared_pr_mirror", cfg.default_modify_scope())
# TODO: Remove this block in Spack 0.23
if deprecated_mirror_config:
if remote_mirror_override:
spack.mirror.remove("ci_pr_mirror", cfg.default_modify_scope())
if spack_pipeline_type == "spack_pull_request":
spack.mirror.remove("ci_shared_pr_mirror", cfg.default_modify_scope())
tty.debug("{0} build jobs generated in {1} stages".format(job_id, stage_id))
@@ -1195,10 +1207,28 @@ def main_script_replacements(cmd):
sync_job["needs"] = [
{"job": generate_job_name, "pipeline": "{0}".format(parent_pipeline_id)}
]
if "variables" not in sync_job:
sync_job["variables"] = {}
sync_job["variables"]["SPACK_COPY_ONLY_DESTINATION"] = (
buildcache_destination.fetch_url
if buildcache_destination
else remote_mirror_override or remote_mirror_url
)
if "buildcache-source" in pipeline_mirrors:
buildcache_source = pipeline_mirrors["buildcache-source"].fetch_url
else:
# TODO: Remove this condition in Spack 0.23
buildcache_source = os.environ.get("SPACK_SOURCE_MIRROR", None)
sync_job["variables"]["SPACK_BUILDCACHE_SOURCE"] = buildcache_source
output_object["copy"] = sync_job
job_id += 1
if job_id > 0:
# TODO: Remove this block in Spack 0.23
if temp_storage_url_prefix:
# There were some rebuild jobs scheduled, so we will need to
# schedule a job to clean up the temporary storage location
@@ -1232,6 +1262,13 @@ def main_script_replacements(cmd):
signing_job["when"] = "always"
signing_job["retry"] = {"max": 2, "when": ["always"]}
signing_job["interruptible"] = True
if "variables" not in signing_job:
signing_job["variables"] = {}
signing_job["variables"]["SPACK_BUILDCACHE_DESTINATION"] = (
buildcache_destination.push_url # need the s3 url for aws s3 sync
if buildcache_destination
else remote_mirror_override or remote_mirror_url
)
output_object["sign-pkgs"] = signing_job
@@ -1240,13 +1277,13 @@ def main_script_replacements(cmd):
stage_names.append("stage-rebuild-index")
final_job = spack_ci_ir["jobs"]["reindex"]["attributes"]
index_target_mirror = mirror_urls[0]
if remote_mirror_override:
index_target_mirror = remote_mirror_override
final_job["stage"] = "stage-rebuild-index"
target_mirror = remote_mirror_override or remote_mirror_url
if buildcache_destination:
target_mirror = buildcache_destination.push_url
final_job["script"] = _unpack_script(
final_job["script"],
op=lambda cmd: cmd.replace("{index_target_mirror}", index_target_mirror),
op=lambda cmd: cmd.replace("{index_target_mirror}", target_mirror),
)
final_job["when"] = "always"
@@ -1268,20 +1305,24 @@ def main_script_replacements(cmd):
"SPACK_CONCRETE_ENV_DIR": rel_concrete_env_dir,
"SPACK_VERSION": spack_version,
"SPACK_CHECKOUT_VERSION": version_to_clone,
# TODO: Remove this line in Spack 0.23
"SPACK_REMOTE_MIRROR_URL": remote_mirror_url,
"SPACK_JOB_LOG_DIR": rel_job_log_dir,
"SPACK_JOB_REPRO_DIR": rel_job_repro_dir,
"SPACK_JOB_TEST_DIR": rel_job_test_dir,
# TODO: Remove this line in Spack 0.23
"SPACK_LOCAL_MIRROR_DIR": rel_local_mirror_dir,
"SPACK_PIPELINE_TYPE": str(spack_pipeline_type),
"SPACK_CI_STACK_NAME": os.environ.get("SPACK_CI_STACK_NAME", "None"),
# TODO: Remove this line in Spack 0.23
"SPACK_CI_SHARED_PR_MIRROR_URL": shared_pr_mirror or "None",
"SPACK_REBUILD_CHECK_UP_TO_DATE": str(prune_dag),
"SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
"SPACK_REQUIRE_SIGNING": os.environ.get("SPACK_REQUIRE_SIGNING", "False"),
}
if remote_mirror_override:
# TODO: Remove this block in Spack 0.23
if deprecated_mirror_config and remote_mirror_override:
(output_object["variables"]["SPACK_REMOTE_MIRROR_OVERRIDE"]) = remote_mirror_override
spack_stack_name = os.environ.get("SPACK_CI_STACK_NAME", None)
@@ -2021,43 +2062,23 @@ def process_command(name, commands, repro_dir, run=True, exit_on_failure=True):
def create_buildcache(
input_spec: spack.spec.Spec,
*,
pipeline_mirror_url: Optional[str] = None,
buildcache_mirror_url: Optional[str] = None,
sign_binaries: bool = False,
input_spec: spack.spec.Spec, *, destination_mirror_urls: List[str], sign_binaries: bool = False
) -> List[PushResult]:
"""Create the buildcache at the provided mirror(s).
Arguments:
input_spec: Installed spec to package and push
buildcache_mirror_url: URL for the buildcache mirror
pipeline_mirror_url: URL for the pipeline mirror
destination_mirror_urls: list of URLs to push to
sign_binaries: Whether or not to sign buildcache entry
Returns: A list of PushResults, indicating success or failure.
"""
results = []
# Create buildcache in either the main remote mirror, or in the
# per-PR mirror, if this is a PR pipeline
if buildcache_mirror_url:
for mirror_url in destination_mirror_urls:
results.append(
PushResult(
success=push_mirror_contents(input_spec, buildcache_mirror_url, sign_binaries),
url=buildcache_mirror_url,
)
)
# Create another copy of that buildcache in the per-pipeline
# temporary storage mirror (this is only done if either
# artifacts buildcache is enabled or a temporary storage url
# prefix is set)
if pipeline_mirror_url:
results.append(
PushResult(
success=push_mirror_contents(input_spec, pipeline_mirror_url, sign_binaries),
url=pipeline_mirror_url,
success=push_mirror_contents(input_spec, mirror_url, sign_binaries), url=mirror_url
)
)
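For reference, a minimal sketch of calling the consolidated API (the mirror URLs below are placeholders, and the module is assumed imported as spack_ci, as in the call sites further down):

    # job_spec is assumed to be an installed concrete Spec; URLs are placeholders.
    for result in spack_ci.create_buildcache(
        input_spec=job_spec,
        destination_mirror_urls=["s3://example-mirror", "file:///tmp/local-mirror"],
        sign_binaries=False,
    ):
        print(result.success, result.url)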

View File

@@ -11,6 +11,7 @@
from textwrap import dedent
from typing import List, Match, Tuple
import llnl.string
import llnl.util.tty as tty
from llnl.util.filesystem import join_path
from llnl.util.lang import attr_setdefault, index_by
@@ -29,7 +30,6 @@
import spack.user_environment as uenv
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.string
# cmd has a submodule called "list" so preserve the python list module
python_list = list
@@ -516,7 +516,7 @@ def print_how_many_pkgs(specs, pkg_type=""):
category, e.g. if pkg_type is "installed" then the message
would be "3 installed packages"
"""
tty.msg("%s" % spack.util.string.plural(len(specs), pkg_type + " package"))
tty.msg("%s" % llnl.string.plural(len(specs), pkg_type + " package"))
def spack_is_git_repo():

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.tty as tty
import llnl.util.tty.colify
import llnl.util.tty.color as cl
import spack.audit
@@ -20,6 +21,15 @@ def setup_parser(subparser):
# Audit configuration files
sp.add_parser("configs", help="audit configuration files")
# Audit package recipes
external_parser = sp.add_parser("externals", help="check external detection in packages")
external_parser.add_argument(
"--list",
action="store_true",
dest="list_externals",
help="if passed, list which packages have detection tests",
)
# Https and other linting
https_parser = sp.add_parser("packages-https", help="check https in packages")
https_parser.add_argument(
@@ -29,7 +39,7 @@ def setup_parser(subparser):
# Audit package recipes
pkg_parser = sp.add_parser("packages", help="audit package recipes")
for group in [pkg_parser, https_parser]:
for group in [pkg_parser, https_parser, external_parser]:
group.add_argument(
"name",
metavar="PKG",
@@ -62,6 +72,18 @@ def packages_https(parser, args):
_process_reports(reports)
def externals(parser, args):
if args.list_externals:
msg = "@*{The following packages have detection tests:}"
tty.msg(cl.colorize(msg))
llnl.util.tty.colify.colify(spack.audit.packages_with_detection_tests(), indent=2)
return
pkgs = args.name or spack.repo.PATH.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
_process_reports(reports)
def list(parser, args):
for subcommand, check_tags in spack.audit.GROUPS.items():
print(cl.colorize("@*b{" + subcommand + "}:"))
@@ -78,6 +100,7 @@ def list(parser, args):
def audit(parser, args):
subcommands = {
"configs": configs,
"externals": externals,
"packages": packages,
"packages-https": packages_https,
"list": list,

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd.common.env_utility as env_utility
from spack.context import Context
description = (
"run a command in a spec's install environment, or dump its environment to screen or file"
@@ -14,4 +15,4 @@
def build_env(parser, args):
env_utility.emulate_env_utility("build-env", "build", args)
env_utility.emulate_env_utility("build-env", Context.BUILD, args)

View File

@@ -13,6 +13,7 @@
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.string import plural
from llnl.util.lang import elide_list
import spack.binary_distribution as bindist
@@ -32,7 +33,6 @@
from spack.cmd import display_specs
from spack.spec import Spec, save_dependency_specfiles
from spack.stage import Stage
from spack.util.string import plural
description = "create, download and install binary packages"
section = "packaging"
@@ -268,7 +268,7 @@ def _matching_specs(specs: List[Spec]) -> List[Spec]:
return [spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=any) for s in specs]
def push_fn(args):
def push_fn(args: argparse.Namespace):
"""create a binary package and push it to a mirror"""
if args.spec_file:
tty.warn(
@@ -414,7 +414,7 @@ def preview_fn(args):
)
def check_fn(args):
def check_fn(args: argparse.Namespace):
"""check specs against remote binary mirror(s) to see if any need to be rebuilt
this command uses the process exit code to indicate its result, specifically, if the
@@ -429,7 +429,7 @@ def check_fn(args):
specs = spack.cmd.parse_specs(args.spec or args.spec_file)
if specs:
specs = _matching_specs(specs, specs)
specs = _matching_specs(specs)
else:
specs = spack.cmd.require_active_env("buildcache check").all_specs()
@@ -527,7 +527,7 @@ def copy_buildcache_file(src_url, dest_url, local_path=None):
temp_stage.create()
temp_stage.fetch()
web_util.push_to_url(local_path, dest_url, keep_original=True)
except web_util.FetchError as e:
except spack.error.FetchError as e:
# Expected, since we have to try all the possible extensions
tty.debug("no such file: {0}".format(src_url))
tty.debug(e)

View File

@@ -7,6 +7,7 @@
import re
import sys
import llnl.string
import llnl.util.lang
from llnl.util import tty
@@ -15,6 +16,7 @@
import spack.spec
import spack.stage
import spack.util.crypto
import spack.util.web as web_util
from spack.cmd.common import arguments
from spack.package_base import PackageBase, deprecated_version, preferred_version
from spack.util.editor import editor
@@ -66,7 +68,7 @@ def setup_parser(subparser):
modes_parser.add_argument(
"--verify", action="store_true", default=False, help="verify known package checksums"
)
arguments.add_common_arguments(subparser, ["package"])
arguments.add_common_arguments(subparser, ["package", "jobs"])
subparser.add_argument(
"versions", nargs=argparse.REMAINDER, help="versions to generate checksums for"
)
@@ -96,7 +98,7 @@ def checksum(parser, args):
# Add latest version if requested
if args.latest:
remote_versions = pkg.fetch_remote_versions()
remote_versions = pkg.fetch_remote_versions(args.jobs)
if len(remote_versions) > 0:
latest_version = sorted(remote_versions.keys(), reverse=True)[0]
versions.append(latest_version)
@@ -119,27 +121,47 @@ def checksum(parser, args):
# if we get here, it's because no valid url was provided by the package
# do expensive fallback to try to recover
if remote_versions is None:
remote_versions = pkg.fetch_remote_versions()
remote_versions = pkg.fetch_remote_versions(args.jobs)
if version in remote_versions:
url_dict[version] = remote_versions[version]
if len(versions) <= 0:
if remote_versions is None:
remote_versions = pkg.fetch_remote_versions()
remote_versions = pkg.fetch_remote_versions(args.jobs)
url_dict = remote_versions
# A spidered URL can differ from the package.py *computed* URL, pointing to different tarballs.
# For example, GitHub release pages sometimes have multiple tarballs with different shasum:
# - releases/download/1.0/<pkg>-1.0.tar.gz (uploaded tarball)
# - archive/refs/tags/1.0.tar.gz (generated tarball)
# We want to ensure that `spack checksum` and `spack install` ultimately use the same URL, so
# here we check whether the crawled and computed URLs disagree and, if so, prioritize the
# former if that URL exists (by sending a HEAD request).
url_changed_for_version = set()
for version, url in url_dict.items():
possible_urls = pkg.all_urls_for_version(version)
if url not in possible_urls:
for possible_url in possible_urls:
if web_util.url_exists(possible_url):
url_dict[version] = possible_url
break
else:
url_changed_for_version.add(version)
if not url_dict:
tty.die(f"Could not find any remote versions for {pkg.name}")
# print an empty line to create a new output section block
print()
elif len(url_dict) > 1 and not args.batch and sys.stdin.isatty():
filtered_url_dict = spack.stage.interactive_version_filter(
url_dict, pkg.versions, url_changes=url_changed_for_version
)
if not filtered_url_dict:
exit(0)
url_dict = filtered_url_dict
else:
tty.info(f"Found {llnl.string.plural(len(url_dict), 'version')} of {pkg.name}")
version_hashes = spack.stage.get_checksums_for_versions(
url_dict,
pkg.name,
keep_stage=args.keep_stage,
batch=(args.batch or len(versions) > 0 or len(url_dict) == 1),
fetch_options=pkg.fetch_options,
url_dict, pkg.name, keep_stage=args.keep_stage, fetch_options=pkg.fetch_options
)
if args.verify:
@@ -239,7 +261,7 @@ def add_versions_to_package(pkg: PackageBase, version_lines: str):
parsed_version = Version(contents_version.group(1))
if parsed_version < new_versions[0][0]:
split_contents[i:i] = [new_versions.pop(0)[1], " # FIX ME", "\n"]
split_contents[i:i] = [new_versions.pop(0)[1], " # FIXME", "\n"]
num_versions_added += 1
elif parsed_version == new_versions[0][0]:
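The URL-preference loop earlier in this file uses Python's for/else, where the else branch runs only when the loop completes without a break. A self-contained sketch of the idiom, with made-up URLs standing in for the web_util.url_exists() check:

    candidates = ["https://example.com/a.tar.gz", "https://example.com/b.tar.gz"]
    for url in candidates:
        if url.endswith("b.tar.gz"):  # stand-in for web_util.url_exists(url)
            chosen = url
            break
    else:
        chosen = None  # reached only if no candidate matched (no break occurred)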

View File

@@ -191,6 +191,14 @@ def ci_generate(args):
"""
env = spack.cmd.require_active_env(cmd_name="ci generate")
if args.copy_to:
tty.warn("The flag --copy-to is deprecated and will be removed in Spack 0.23")
if args.buildcache_destination:
tty.warn(
"The flag --buildcache-destination is deprecated and will be removed in Spack 0.23"
)
output_file = args.output_file
copy_yaml_to = args.copy_to
run_optimizer = args.optimize
@@ -264,12 +272,6 @@ def ci_rebuild(args):
if not ci_config:
tty.die("spack ci rebuild requires an env containing ci cfg")
tty.msg(
"SPACK_BUILDCACHE_DESTINATION={0}".format(
os.environ.get("SPACK_BUILDCACHE_DESTINATION", None)
)
)
# Grab the environment variables we need. These either come from the
# pipeline generation step ("spack ci generate"), where they were written
# out as variables, or else provided by GitLab itself.
@@ -277,6 +279,7 @@ def ci_rebuild(args):
job_log_dir = os.environ.get("SPACK_JOB_LOG_DIR")
job_test_dir = os.environ.get("SPACK_JOB_TEST_DIR")
repro_dir = os.environ.get("SPACK_JOB_REPRO_DIR")
# TODO: Remove this in Spack 0.23
local_mirror_dir = os.environ.get("SPACK_LOCAL_MIRROR_DIR")
concrete_env_dir = os.environ.get("SPACK_CONCRETE_ENV_DIR")
ci_pipeline_id = os.environ.get("CI_PIPELINE_ID")
@@ -285,9 +288,12 @@ def ci_rebuild(args):
job_spec_pkg_name = os.environ.get("SPACK_JOB_SPEC_PKG_NAME")
job_spec_dag_hash = os.environ.get("SPACK_JOB_SPEC_DAG_HASH")
spack_pipeline_type = os.environ.get("SPACK_PIPELINE_TYPE")
# TODO: Remove this in Spack 0.23
remote_mirror_override = os.environ.get("SPACK_REMOTE_MIRROR_OVERRIDE")
# TODO: Remove this in Spack 0.23
remote_mirror_url = os.environ.get("SPACK_REMOTE_MIRROR_URL")
spack_ci_stack_name = os.environ.get("SPACK_CI_STACK_NAME")
# TODO: Remove this in Spack 0.23
shared_pr_mirror_url = os.environ.get("SPACK_CI_SHARED_PR_MIRROR_URL")
rebuild_everything = os.environ.get("SPACK_REBUILD_EVERYTHING")
require_signing = os.environ.get("SPACK_REQUIRE_SIGNING")
@@ -344,21 +350,36 @@ def ci_rebuild(args):
full_rebuild = True if rebuild_everything and rebuild_everything.lower() == "true" else False
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
deprecated_mirror_config = False
buildcache_destination = None
if "buildcache-destination" in pipeline_mirrors:
buildcache_destination = pipeline_mirrors["buildcache-destination"]
else:
deprecated_mirror_config = True
# TODO: This will be an error in Spack 0.23
# If no override url exists, then just push binary package to the
# normal remote mirror url.
# TODO: Remove in Spack 0.23
buildcache_mirror_url = remote_mirror_override or remote_mirror_url
if buildcache_destination:
buildcache_mirror_url = buildcache_destination.push_url
# Figure out what is our temporary storage mirror: Is it artifacts
# buildcache? Or temporary-storage-url-prefix? In some cases we need to
# force something or pipelines might not have a way to propagate build
# artifacts from upstream to downstream jobs.
# TODO: Remove this in Spack 0.23
pipeline_mirror_url = None
# TODO: Remove this in Spack 0.23
temp_storage_url_prefix = None
if "temporary-storage-url-prefix" in ci_config:
temp_storage_url_prefix = ci_config["temporary-storage-url-prefix"]
pipeline_mirror_url = url_util.join(temp_storage_url_prefix, ci_pipeline_id)
# TODO: Remove this in Spack 0.23
enable_artifacts_mirror = False
if "enable-artifacts-buildcache" in ci_config:
enable_artifacts_mirror = ci_config["enable-artifacts-buildcache"]
@@ -454,12 +475,14 @@ def ci_rebuild(args):
# If we decided there should be a temporary storage mechanism, add that
# mirror now so it's used when we check for a hash match already
# built for this spec.
# TODO: Remove this block in Spack 0.23
if pipeline_mirror_url:
mirror = spack.mirror.Mirror(pipeline_mirror_url, name=spack_ci.TEMP_STORAGE_MIRROR_NAME)
spack.mirror.add(mirror, cfg.default_modify_scope())
pipeline_mirrors.append(pipeline_mirror_url)
# Check configured mirrors for a built spec with a matching hash
# TODO: Remove this block in Spack 0.23
mirrors_to_check = None
if remote_mirror_override:
if spack_pipeline_type == "spack_protected_branch":
@@ -477,7 +500,8 @@ def ci_rebuild(args):
)
pipeline_mirrors.append(remote_mirror_override)
if spack_pipeline_type == "spack_pull_request":
# TODO: Remove this in Spack 0.23
if deprecated_mirror_config and spack_pipeline_type == "spack_pull_request":
if shared_pr_mirror_url != "None":
pipeline_mirrors.append(shared_pr_mirror_url)
@@ -499,6 +523,7 @@ def ci_rebuild(args):
tty.msg("No need to rebuild {0}, found hash match at: ".format(job_spec_pkg_name))
for match in matches:
tty.msg(" {0}".format(match["mirror_url"]))
# TODO: Remove this block in Spack 0.23
if enable_artifacts_mirror:
matching_mirror = matches[0]["mirror_url"]
build_cache_dir = os.path.join(local_mirror_dir, "build_cache")
@@ -513,7 +538,8 @@ def ci_rebuild(args):
# only want to keep the mirror being used by the current pipeline as its binary
# package destination. This ensures that when we rebuild everything, we only
# consume binary dependencies built in this pipeline.
if full_rebuild:
# TODO: Remove this in Spack 0.23
if deprecated_mirror_config and full_rebuild:
spack_ci.remove_other_mirrors(pipeline_mirrors, cfg.default_modify_scope())
# No hash match anywhere means we need to rebuild spec
@@ -579,7 +605,9 @@ def ci_rebuild(args):
"SPACK_COLOR=always",
"SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
"-j$(nproc)",
"install-deps/{}".format(job_spec.format("{name}-{version}-{hash}")),
"install-deps/{}".format(
ev.depfile.MakefileSpec(job_spec).safe_format("{name}-{version}-{hash}")
),
],
spack_cmd + ["install"] + root_install_args,
]
@@ -676,21 +704,25 @@ def ci_rebuild(args):
# print out some instructions on how to reproduce this build failure
# outside of the pipeline environment.
if install_exit_code == 0:
if buildcache_mirror_url or pipeline_mirror_url:
for result in spack_ci.create_buildcache(
input_spec=job_spec,
buildcache_mirror_url=buildcache_mirror_url,
pipeline_mirror_url=pipeline_mirror_url,
sign_binaries=spack_ci.can_sign_binaries(),
):
msg = tty.msg if result.success else tty.warn
msg(
"{} {} to {}".format(
"Pushed" if result.success else "Failed to push",
job_spec.format("{name}{@version}{/hash:7}", color=clr.get_color_when()),
result.url,
)
mirror_urls = [buildcache_mirror_url]
# TODO: Remove this block in Spack 0.23
if pipeline_mirror_url:
mirror_urls.append(pipeline_mirror_url)
for result in spack_ci.create_buildcache(
input_spec=job_spec,
destination_mirror_urls=mirror_urls,
sign_binaries=spack_ci.can_sign_binaries(),
):
msg = tty.msg if result.success else tty.warn
msg(
"{} {} to {}".format(
"Pushed" if result.success else "Failed to push",
job_spec.format("{name}{@version}{/hash:7}", color=clr.get_color_when()),
result.url,
)
)
# If this is a develop pipeline, check if the spec that we just built is
# on the broken-specs list. If so, remove it.

View File

@@ -12,7 +12,7 @@
import spack.cmd
import spack.config
import spack.dependency as dep
import spack.deptypes as dt
import spack.environment as ev
import spack.mirror
import spack.modules
@@ -114,16 +114,13 @@ def __call__(self, parser, namespace, jobs, option_string):
class DeptypeAction(argparse.Action):
"""Creates a tuple of valid dependency types from a deptype argument."""
"""Creates a flag of valid dependency types from a deptype argument."""
def __call__(self, parser, namespace, values, option_string=None):
deptype = dep.all_deptypes
if values:
deptype = tuple(x.strip() for x in values.split(","))
if deptype == ("all",):
deptype = "all"
deptype = dep.canonical_deptype(deptype)
if not values or values == "all":
deptype = dt.ALL
else:
deptype = dt.canonicalize(values.split(","))
setattr(namespace, self.dest, deptype)
@@ -285,9 +282,8 @@ def deptype():
return Args(
"--deptype",
action=DeptypeAction,
default=dep.all_deptypes,
help="comma-separated list of deptypes to traverse\n\ndefault=%s"
% ",".join(dep.all_deptypes),
default=dt.ALL,
help="comma-separated list of deptypes to traverse (default=%s)" % ",".join(dt.ALL_TYPES),
)

View File

@@ -7,14 +7,15 @@
import llnl.util.tty as tty
import spack.build_environment as build_environment
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.error
import spack.paths
import spack.spec
import spack.store
from spack import traverse
from spack import build_environment, traverse
from spack.context import Context
from spack.util.environment import dump_environment, pickle_environment
@@ -41,14 +42,14 @@ def setup_parser(subparser):
class AreDepsInstalledVisitor:
def __init__(self, context="build"):
if context not in ("build", "test"):
raise ValueError("context can only be build or test")
if context == "build":
self.direct_deps = ("build", "link", "run")
def __init__(self, context: Context = Context.BUILD):
if context == Context.BUILD:
# TODO: run deps shouldn't be required for build env.
self.direct_deps = dt.BUILD | dt.LINK | dt.RUN
elif context == Context.TEST:
self.direct_deps = dt.BUILD | dt.TEST | dt.LINK | dt.RUN
else:
self.direct_deps = ("build", "test", "link", "run")
raise ValueError("context can only be Context.BUILD or Context.TEST")
self.has_uninstalled_deps = False
@@ -71,11 +72,11 @@ def accept(self, item):
def neighbors(self, item):
# Direct deps: follow build & test edges.
# Transitive deps: follow link / run.
deptypes = self.direct_deps if item.depth == 0 else ("link", "run")
return item.edge.spec.edges_to_dependencies(deptype=deptypes)
depflag = self.direct_deps if item.depth == 0 else dt.LINK | dt.RUN
return item.edge.spec.edges_to_dependencies(depflag=depflag)
def emulate_env_utility(cmd_name, context, args):
def emulate_env_utility(cmd_name, context: Context, args):
if not args.spec:
tty.die("spack %s requires a spec." % cmd_name)
@@ -119,7 +120,7 @@ def emulate_env_utility(cmd_name, context, args):
hashes=True,
# This shows more than necessary, but we cannot dynamically change deptypes
# in Spec.tree(...).
deptypes="all" if context == "build" else ("build", "test", "link", "run"),
deptypes="all" if context == Context.BUILD else ("build", "test", "link", "run"),
),
)

View File

@@ -5,6 +5,7 @@
import os
import re
import sys
import urllib.parse
import llnl.util.tty as tty
@@ -62,6 +63,9 @@ class {class_name}({base_class_name}):
# notify when the package is updated.
# maintainers("github_user1", "github_user2")
# FIXME: Add the SPDX identifier of the project's license below.
license("UNKNOWN")
{versions}
{dependencies}
@@ -822,7 +826,12 @@ def get_versions(args, name):
if args.url is not None and args.template != "bundle" and valid_url:
# Find available versions
try:
url_dict = spack.util.web.find_versions_of_archive(args.url)
url_dict = spack.url.find_versions_of_archive(args.url)
if len(url_dict) > 1 and not args.batch and sys.stdin.isatty():
url_dict_filtered = spack.stage.interactive_version_filter(url_dict)
if url_dict_filtered is None:
exit(0)
url_dict = url_dict_filtered
except UndetectableVersionError:
# Use fake versions
tty.warn("Couldn't detect version in: {0}".format(args.url))
@@ -834,11 +843,7 @@ def get_versions(args, name):
url_dict = {version: args.url}
version_hashes = spack.stage.get_checksums_for_versions(
url_dict,
name,
first_stage_function=guesser,
keep_stage=args.keep_stage,
batch=(args.batch or len(url_dict) == 1),
url_dict, name, first_stage_function=guesser, keep_stage=args.keep_stage
)
versions = get_version_lines(version_hashes, url_dict)

View File

@@ -74,7 +74,7 @@ def dependencies(parser, args):
spec,
transitive=args.transitive,
expand_virtuals=args.expand_virtuals,
deptype=args.deptype,
depflag=args.deptype,
)
if spec.name in dependencies:

View File

@@ -8,7 +8,9 @@
import shutil
import sys
import tempfile
from typing import Optional
import llnl.string as string
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
@@ -28,7 +30,6 @@
import spack.schema.env
import spack.spec
import spack.tengine
import spack.util.string as string
from spack.util.environment import EnvironmentModifications
description = "manage virtual environments"
@@ -96,22 +97,16 @@ def env_activate_setup_parser(subparser):
view_options = subparser.add_mutually_exclusive_group()
view_options.add_argument(
"-v",
"--with-view",
action="store_const",
dest="with_view",
const=True,
default=True,
help="update PATH, etc., with associated view",
"-v",
metavar="name",
help="set runtime environment variables for specific view",
)
view_options.add_argument(
"-V",
"--without-view",
action="store_const",
dest="with_view",
const=False,
default=True,
help="do not update PATH, etc., with associated view",
"-V",
action="store_true",
help="do not set runtime environment variables for any view",
)
subparser.add_argument(
@@ -197,10 +192,20 @@ def env_activate(args):
# Activate new environment
active_env = ev.Environment(env_path)
# Check if runtime environment variables are requested, and if so, for what view.
view: Optional[str] = None
if args.with_view:
view = args.with_view
if not active_env.has_view(view):
tty.die(f"The environment does not have a view named '{view}'")
elif not args.without_view and active_env.has_view(ev.default_view_name):
view = ev.default_view_name
cmds += spack.environment.shell.activate_header(
env=active_env, shell=args.shell, prompt=env_prompt if args.prompt else None
env=active_env, shell=args.shell, prompt=env_prompt if args.prompt else None, view=view
)
env_mods.extend(spack.environment.shell.activate(env=active_env, add_view=args.with_view))
env_mods.extend(spack.environment.shell.activate(env=active_env, view=view))
cmds += env_mods.shell_modifications(args.shell)
sys.stdout.write(cmds)

View File

@@ -5,6 +5,7 @@
import argparse
import errno
import os
import re
import sys
from typing import List, Optional
@@ -156,11 +157,20 @@ def packages_to_search_for(
):
result = []
for current_tag in tags:
result.extend(spack.repo.PATH.packages_with_tags(current_tag))
result.extend(spack.repo.PATH.packages_with_tags(current_tag, full=True))
if names:
result = [x for x in result if x in names]
# Match both fully qualified and unqualified
parts = [rf"(^{x}$|[.]{x}$)" for x in names]
select_re = re.compile("|".join(parts))
result = [x for x in result if select_re.search(x)]
if exclude:
result = [x for x in result if x not in exclude]
# Match both fully qualified and unqualified
parts = [rf"(^{x}$|[.]{x}$)" for x in exclude]
select_re = re.compile("|".join(parts))
result = [x for x in result if not select_re.search(x)]
return result
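A minimal sketch of the qualified-name matching introduced above (the package and repo names are made up):

    import re

    names = ["hdf5"]  # hypothetical package name
    parts = [rf"(^{x}$|[.]{x}$)" for x in names]
    select_re = re.compile("|".join(parts))
    assert select_re.search("hdf5")          # unqualified name matches
    assert select_re.search("builtin.hdf5")  # fully qualified name matches
    assert not select_re.search("py-hdf5")   # a plain suffix does not match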

View File

@@ -74,19 +74,19 @@ def graph(parser, args):
if args.static:
args.dot = True
static_graph_dot(specs, deptype=args.deptype)
static_graph_dot(specs, depflag=args.deptype)
return
if args.dot:
builder = SimpleDAG()
if args.color:
builder = DAGWithDependencyTypes()
graph_dot(specs, builder=builder, deptype=args.deptype)
graph_dot(specs, builder=builder, depflag=args.deptype)
return
# ascii is default: user doesn't need to provide it explicitly
debug = spack.config.get("config:debug")
graph_ascii(specs[0], debug=debug, deptype=args.deptype)
graph_ascii(specs[0], debug=debug, depflag=args.deptype)
for spec in specs[1:]:
print() # extra line bt/w independent graphs
graph_ascii(spec, debug=debug)

View File

@@ -11,6 +11,7 @@
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.fetch_strategy as fs
import spack.install_test
import spack.repo
@@ -71,6 +72,10 @@ def variant(s):
return spack.spec.ENABLED_VARIANT_COLOR + s + plain_format
def license(s):
return spack.spec.VERSION_COLOR + s + plain_format
class VariantFormatter:
def __init__(self, variants):
self.variants = variants
@@ -160,7 +165,7 @@ def print_dependencies(pkg):
for deptype in ("build", "link", "run"):
color.cprint("")
color.cprint(section_title("%s Dependencies:" % deptype.capitalize()))
deps = sorted(pkg.dependencies_of_type(deptype))
deps = sorted(pkg.dependencies_of_type(dt.flag_from_string(deptype)))
if deps:
colify(deps, indent=4)
else:
@@ -347,6 +352,22 @@ def print_virtuals(pkg):
color.cprint(" None")
def print_licenses(pkg):
"""Output the licenses of the project."""
color.cprint("")
color.cprint(section_title("Licenses: "))
if len(pkg.licenses) == 0:
color.cprint(" None")
else:
pad = padder(pkg.licenses, 4)
for when_spec in pkg.licenses:
license_identifier = pkg.licenses[when_spec]
line = license(" {0}".format(pad(license_identifier))) + color.cescape(when_spec)
color.cprint(line)
def info(parser, args):
spec = spack.spec.Spec(args.package)
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
@@ -376,6 +397,7 @@ def info(parser, args):
(args.all or not args.no_dependencies, print_dependencies),
(args.all or args.virtuals, print_virtuals),
(args.all or args.tests, print_tests),
(args.all or True, print_licenses),
]
for print_it, func in sections:
if print_it:

View File

@@ -240,8 +240,7 @@ def default_log_file(spec):
"""Computes the default filename for the log file and creates
the corresponding directory if not present
"""
fmt = "test-{x.name}-{x.version}-{hash}.xml"
basename = fmt.format(x=spec, hash=spec.dag_hash())
basename = spec.format_path("test-{name}-{version}-{hash}.xml")
dirname = fs.os.path.join(spack.paths.reports_path, "junit")
fs.mkdirp(dirname)
return fs.os.path.join(dirname, basename)

View File

@@ -16,7 +16,7 @@
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.dependency
import spack.deptypes as dt
import spack.repo
from spack.version import VersionList
@@ -149,8 +149,8 @@ def rows_for_ncols(elts, ncols):
def get_dependencies(pkg):
all_deps = {}
for deptype in spack.dependency.all_deptypes:
deps = pkg.dependencies_of_type(deptype)
for deptype in dt.ALL_TYPES:
deps = pkg.dependencies_of_type(dt.flag_from_string(deptype))
all_deps[deptype] = [d for d in deps]
return all_deps
@@ -275,8 +275,8 @@ def head(n, span_id, title, anchor=None):
out.write("\n")
out.write("</dd>\n")
for deptype in spack.dependency.all_deptypes:
deps = pkg_cls.dependencies_of_type(deptype)
for deptype in dt.ALL_TYPES:
deps = pkg_cls.dependencies_of_type(dt.flag_from_string(deptype))
if deps:
out.write("<dt>%s Dependencies:</dt>\n" % deptype.capitalize())
out.write("<dd>\n")

View File

@@ -5,6 +5,8 @@
import sys
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.cmd.find
@@ -108,16 +110,14 @@ def load(parser, args):
)
return 1
with spack.store.STORE.db.read_transaction():
if "dependencies" in args.things_to_load:
include_roots = "package" in args.things_to_load
specs = [
dep for spec in specs for dep in spec.traverse(root=include_roots, order="post")
]
if args.things_to_load != "package,dependencies":
tty.warn(
"The `--only` flag in spack load is deprecated and will be removed in Spack v0.22"
)
env_mod = spack.util.environment.EnvironmentModifications()
with spack.store.STORE.db.read_transaction():
env_mod = uenv.environment_modifications_for_specs(*specs)
for spec in specs:
env_mod.extend(uenv.environment_modifications_for_spec(spec))
env_mod.prepend_path(uenv.spack_loaded_hashes_var, spec.dag_hash())
cmds = env_mod.shell_modifications(args.shell)

View File

@@ -6,10 +6,11 @@
import posixpath
import sys
from llnl.path import convert_to_posix_path
import spack.paths
import spack.util.executable
from spack.spec import Spec
from spack.util.path import convert_to_posix_path
description = "generate Windows installer"
section = "admin"

View File

@@ -176,17 +176,29 @@ def solve(parser, args):
output = sys.stdout if "asp" in show else None
setup_only = set(show) == {"asp"}
unify = spack.config.get("concretizer:unify")
allow_deprecated = spack.config.get("config:deprecated", False)
if unify != "when_possible":
# set up solver parameters
# Note: reuse and other concretizer prefs are passed as configuration
result = solver.solve(
specs, out=output, timers=args.timers, stats=args.stats, setup_only=setup_only
specs,
out=output,
timers=args.timers,
stats=args.stats,
setup_only=setup_only,
allow_deprecated=allow_deprecated,
)
if not setup_only:
_process_result(result, show, required_format, kwargs)
else:
for idx, result in enumerate(
solver.solve_in_rounds(specs, out=output, timers=args.timers, stats=args.stats)
solver.solve_in_rounds(
specs,
out=output,
timers=args.timers,
stats=args.stats,
allow_deprecated=allow_deprecated,
)
):
if "solutions" in show:
tty.msg("ROUND {0}".format(idx))

View File

@@ -5,6 +5,7 @@
import io
import sys
import llnl.string
import llnl.util.tty as tty
import llnl.util.tty.colify as colify
@@ -24,7 +25,7 @@ def report_tags(category, tags):
if isatty:
num = len(tags)
fmt = "{0} package tag".format(category)
buffer.write("{0}:\n".format(spack.util.string.plural(num, fmt)))
buffer.write("{0}:\n".format(llnl.string.plural(num, fmt)))
if tags:
colify.colify(tags, output=buffer, tty=isatty, indent=4)

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd.common.env_utility as env_utility
from spack.context import Context
description = (
"run a command in a spec's test environment, or dump its environment to screen or file"
@@ -14,4 +15,4 @@
def test_env(parser, args):
env_utility.emulate_env_utility("test-env", "test", args)
env_utility.emulate_env_utility("test-env", Context.TEST, args)

View File

@@ -88,9 +88,8 @@ def unload(parser, args):
)
return 1
env_mod = spack.util.environment.EnvironmentModifications()
env_mod = uenv.environment_modifications_for_specs(*specs).reversed()
for spec in specs:
env_mod.extend(uenv.environment_modifications_for_spec(spec).reversed())
env_mod.remove_path(uenv.spack_loaded_hashes_var, spec.dag_hash())
cmds = env_mod.shell_modifications(args.shell)

View File

@@ -12,6 +12,7 @@
import spack.fetch_strategy as fs
import spack.repo
import spack.spec
import spack.url
import spack.util.crypto as crypto
from spack.url import (
UndetectableNameError,
@@ -26,7 +27,6 @@
substitution_offsets,
)
from spack.util.naming import simplify_name
from spack.util.web import find_versions_of_archive
description = "debugging tool for url parsing"
section = "developer"
@@ -139,7 +139,7 @@ def url_parse(args):
if args.spider:
print()
tty.msg("Spidering for versions:")
versions = find_versions_of_archive(url)
versions = spack.url.find_versions_of_archive(url)
if not versions:
print(" Found no versions for {0}".format(name))

View File

@@ -37,10 +37,7 @@ def setup_parser(subparser):
action="store_true",
help="only list remote versions newer than the latest checksummed version",
)
subparser.add_argument(
"-c", "--concurrency", default=32, type=int, help="number of concurrent requests"
)
arguments.add_common_arguments(subparser, ["package"])
arguments.add_common_arguments(subparser, ["package", "jobs"])
def versions(parser, args):
@@ -68,7 +65,7 @@ def versions(parser, args):
if args.safe:
return
fetched_versions = pkg.fetch_remote_versions(args.concurrency)
fetched_versions = pkg.fetch_remote_versions(args.jobs)
if args.new:
if sys.stdout.isatty():

View File

@@ -13,6 +13,7 @@
import tempfile
from typing import List, Optional, Sequence
import llnl.path
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import path_contains_subdirectory, paths_containing_libs
@@ -24,7 +25,6 @@
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
from spack.util.path import system_path_filter
__all__ = ["Compiler"]
@@ -39,10 +39,17 @@ def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
version_arg (str): the argument used to extract version information
"""
compiler = spack.util.executable.Executable(compiler_path)
compiler_invocation_args = {
"output": str,
"error": str,
"ignore_errors": ignore_errors,
"timeout": 120,
"fail_on_error": True,
}
if version_arg:
output = compiler(version_arg, output=str, error=str, ignore_errors=ignore_errors)
output = compiler(version_arg, **compiler_invocation_args)
else:
output = compiler(output=str, error=str, ignore_errors=ignore_errors)
output = compiler(**compiler_invocation_args)
return output
@@ -153,7 +160,7 @@ def _parse_link_paths(string):
return implicit_link_dirs
@system_path_filter
@llnl.path.system_path_filter
def _parse_non_system_link_dirs(string: str) -> List[str]:
"""Parses link paths out of compiler debug output.
@@ -229,6 +236,9 @@ class Compiler:
# by any compiler
_all_compiler_rpath_libraries = ["libc", "libc++", "libstdc++"]
#: Platform matcher for Platform objects supported by compiler
is_supported_on_platform = lambda x: True
# Default flags used by a compiler to set an rpath
@property
def cc_rpath_arg(self):
@@ -594,8 +604,6 @@ def search_regexps(cls, language):
compiler_names = getattr(cls, "{0}_names".format(language))
prefixes = [""] + cls.prefixes
suffixes = [""]
# Windows compilers generally have an extension of some sort,
# as do most files on Windows; handle that case here
if sys.platform == "win32":
ext = r"\.(?:exe|bat)"
cls_suf = [suf + ext for suf in cls.suffixes]

View File

@@ -10,7 +10,7 @@
import itertools
import multiprocessing.pool
import os
from typing import Dict
from typing import Dict, List
import archspec.cpu
@@ -298,7 +298,7 @@ def select_new_compilers(compilers, scope=None):
return compilers_not_in_config
def supported_compilers():
def supported_compilers() -> List[str]:
"""Return a set of names of compilers supported by Spack.
See available_compilers() to get a list of all the available
@@ -306,10 +306,41 @@ def supported_compilers():
"""
# Hack to be able to call the compiler `apple-clang` while still
# using a valid python name for the module
return sorted(
name if name != "apple_clang" else "apple-clang"
for name in llnl.util.lang.list_modules(spack.paths.compilers_path)
)
return sorted(all_compiler_names())
def supported_compilers_for_host_platform() -> List[str]:
"""Return a set of compiler class objects supported by Spack
that are also supported by the current host platform
"""
host_plat = spack.platforms.real_host()
return supported_compilers_for_platform(host_plat)
def supported_compilers_for_platform(platform: spack.platforms.Platform) -> List[str]:
"""Return a set of compiler class objects supported by Spack
that are also supported by the provided platform
Args:
platform (str): string representation of platform
for which compiler compatability should be determined
"""
return [
name
for name in supported_compilers()
if class_for_compiler_name(name).is_supported_on_platform(platform)
]
def all_compiler_names() -> List[str]:
def replace_apple_clang(name):
return name if name != "apple_clang" else "apple-clang"
return [replace_apple_clang(name) for name in all_compiler_module_names()]
def all_compiler_module_names() -> List[str]:
return [name for name in llnl.util.lang.list_modules(spack.paths.compilers_path)]
@_auto_compiler_spec
@@ -628,7 +659,7 @@ def arguments_to_detect_version_fn(operating_system, paths):
def _default(search_paths):
command_arguments = []
files_to_be_tested = fs.files_in(*search_paths)
for compiler_name in spack.compilers.supported_compilers():
for compiler_name in spack.compilers.supported_compilers_for_host_platform():
compiler_cls = class_for_compiler_name(compiler_name)
for language in ("cc", "cxx", "f77", "fc"):
@@ -687,9 +718,11 @@ def _default(fn_args):
value = fn_args._replace(id=compiler_id._replace(version=version))
return value, None
error = "Couldn't get version for compiler {0}".format(path)
error = f"Couldn't get version for compiler {path}".format(path)
except spack.util.executable.ProcessError as e:
error = "Couldn't get version for compiler {0}\n".format(path) + str(e)
error = f"Couldn't get version for compiler {path}\n" + str(e)
except spack.util.executable.ProcessTimeoutError as e:
error = f"Couldn't get version for compiler {path}\n" + str(e)
except Exception as e:
# Catching "Exception" here is fine because it just
# means something went wrong running a candidate executable.
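The platform filtering above works because every compiler class carries an is_supported_on_platform predicate (a class-level lambda defaulting to True) that platform-specific subclasses override; a minimal sketch of the pattern, with hypothetical class names:

    class BaseCompiler:
        # Default: assume the compiler works on every platform.
        is_supported_on_platform = lambda platform: True

    class WindowsOnlyCompiler(BaseCompiler):
        is_supported_on_platform = lambda platform: platform == "windows"

    supported = [
        cls.__name__
        for cls in (BaseCompiler, WindowsOnlyCompiler)
        if cls.is_supported_on_platform("linux")
    ]
    assert supported == ["BaseCompiler"]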

View File

@@ -112,6 +112,7 @@ def extract_version_from_output(cls, output):
match = re.search(r"AOCC_(\d+)[._](\d+)[._](\d+)", output)
if match:
return ".".join(match.groups())
return "unknown"
@classmethod
def fc_version(cls, fortran_compiler):

View File

@@ -7,7 +7,6 @@
import re
import subprocess
import sys
from distutils.version import StrictVersion
from typing import Dict, List, Set
import spack.compiler
@@ -115,11 +114,11 @@ def command_str(self):
def get_valid_fortran_pth(comp_ver):
cl_ver = str(comp_ver)
sort_fn = lambda fc_ver: StrictVersion(fc_ver)
sort_fn = lambda fc_ver: Version(fc_ver)
sort_fc_ver = sorted(list(avail_fc_version), key=sort_fn)
for ver in sort_fc_ver:
if ver in fortran_mapping:
if StrictVersion(cl_ver) <= StrictVersion(fortran_mapping[ver]):
if Version(cl_ver) <= Version(fortran_mapping[ver]):
return fc_path[ver]
return None
@@ -154,9 +153,12 @@ class Msvc(Compiler):
#: Regex used to extract version from compiler's output
version_regex = r"([1-9][0-9]*\.[0-9]*\.[0-9]*)"
# The MSVC compiler class overrides this to prevent instances
# of erroneous matching on executable names that cannot be msvc
# compilers
suffixes = []
# Initialize, deferring to base class but then adding the vcvarsallfile
# file based on compiler executable path.
is_supported_on_platform = lambda x: isinstance(x, spack.platforms.Windows)
def __init__(self, *args, **kwargs):
# This positional argument "paths" is later parsed and process by the base class
@@ -167,6 +169,8 @@ def __init__(self, *args, **kwargs):
cspec = args[0]
new_pth = [pth if pth else get_valid_fortran_pth(cspec.version) for pth in paths]
paths[:] = new_pth
# Initialize, deferring to base class but then adding the vcvarsallfile
# file based on compiler executable path.
super().__init__(*args, **kwargs)
# To use the MSVC compilers, VCVARS must be invoked
# VCVARS is located at a fixed location, referencable

View File

@@ -155,7 +155,7 @@ def _valid_virtuals_and_externals(self, spec):
),
)
def choose_virtual_or_external(self, spec):
def choose_virtual_or_external(self, spec: spack.spec.Spec):
"""Given a list of candidate virtual and external packages, try to
find one that is most ABI compatible.
"""
@@ -744,8 +744,11 @@ def concretize_specs_together(*abstract_specs, **kwargs):
def _concretize_specs_together_new(*abstract_specs, **kwargs):
import spack.solver.asp
allow_deprecated = spack.config.get("config:deprecated", False)
solver = spack.solver.asp.Solver()
result = solver.solve(abstract_specs, tests=kwargs.get("tests", False))
result = solver.solve(
abstract_specs, tests=kwargs.get("tests", False), allow_deprecated=allow_deprecated
)
result.raise_if_unsat()
return [s.copy() for s in result.specs]

View File

@@ -272,13 +272,6 @@ def _os_pkg_manager(self):
raise spack.error.SpackError(msg)
return os_pkg_manager
@tengine.context_property
def extra_instructions(self):
Extras = namedtuple("Extra", ["build", "final"])
extras = self.container_config.get("extra_instructions", {})
build, final = extras.get("build", None), extras.get("final", None)
return Extras(build=build, final=final)
@tengine.context_property
def labels(self):
return self.container_config.get("labels", {})

View File

@@ -0,0 +1,29 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This module provides classes used in user and build environment"""
from enum import Enum
class Context(Enum):
"""Enum used to indicate the context in which an environment has to be setup: build,
run or test."""
BUILD = 1
RUN = 2
TEST = 3
def __str__(self):
return ("build", "run", "test")[self.value - 1]
@classmethod
def from_string(cls, s: str):
if s == "build":
return Context.BUILD
elif s == "run":
return Context.RUN
elif s == "test":
return Context.TEST
raise ValueError(f"context should be one of 'build', 'run', 'test', got {s}")

View File

@@ -4,6 +4,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import json
import os
import traceback
import warnings
import jsonschema
import jsonschema.exceptions
@@ -11,6 +14,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.deptypes as dt
import spack.error
import spack.hash_types as hash_types
import spack.platforms
@@ -45,9 +49,29 @@ def translated_compiler_name(manifest_compiler_name):
)
def compiler_from_entry(entry):
def compiler_from_entry(entry: dict, manifest_path: str):
# Note that manifest_path is only passed here to compose a
# useful warning message when paths appear to be missing.
compiler_name = translated_compiler_name(entry["name"])
paths = entry["executables"]
if "prefix" in entry:
prefix = entry["prefix"]
paths = dict(
(lang, os.path.join(prefix, relpath))
for (lang, relpath) in entry["executables"].items()
)
else:
paths = entry["executables"]
# Do a check for missing paths. Note that this isn't possible for
# all compiler entries, since their "paths" might actually be
# exe names like "cc" that depend on modules being loaded. Cray
# manifest entries are always paths though.
missing_paths = []
for path in paths.values():
if not os.path.exists(path):
missing_paths.append(path)
# to instantiate a compiler class we may need a concrete version:
version = "={}".format(entry["version"])
arch = entry["arch"]
@@ -56,8 +80,18 @@ def compiler_from_entry(entry):
compiler_cls = spack.compilers.class_for_compiler_name(compiler_name)
spec = spack.spec.CompilerSpec(compiler_cls.name, version)
paths = [paths.get(x, None) for x in ("cc", "cxx", "f77", "fc")]
return compiler_cls(spec, operating_system, target, paths)
path_list = [paths.get(x, None) for x in ("cc", "cxx", "f77", "fc")]
if missing_paths:
warnings.warn(
"Manifest entry refers to nonexistent paths:\n\t"
+ "\n\t".join(missing_paths)
+ f"\nfor {str(spec)}"
+ f"\nin {manifest_path}"
+ "\nPlease report this issue"
)
return compiler_cls(spec, operating_system, target, path_list)
def spec_from_entry(entry):
@@ -158,13 +192,13 @@ def entries_to_specs(entries):
dependencies = entry["dependencies"]
for name, properties in dependencies.items():
dep_hash = properties["hash"]
deptypes = properties["type"]
depflag = dt.canonicalize(properties["type"])
if dep_hash in spec_dict:
if entry["hash"] not in spec_dict:
continue
parent_spec = spec_dict[entry["hash"]]
dep_spec = spec_dict[dep_hash]
parent_spec._add_dependency(dep_spec, deptypes=deptypes, virtuals=())
parent_spec._add_dependency(dep_spec, depflag=depflag, virtuals=())
for spec in spec_dict.values():
spack.spec.reconstruct_virtuals_on_edges(spec)
@@ -186,12 +220,21 @@ def read(path, apply_updates):
tty.debug("{0}: {1} specs read from manifest".format(path, str(len(specs))))
compilers = list()
if "compilers" in json_data:
compilers.extend(compiler_from_entry(x) for x in json_data["compilers"])
compilers.extend(compiler_from_entry(x, path) for x in json_data["compilers"])
tty.debug("{0}: {1} compilers read from manifest".format(path, str(len(compilers))))
# Filter out the compilers that already appear in the configuration
compilers = spack.compilers.select_new_compilers(compilers)
if apply_updates and compilers:
spack.compilers.add_compilers_to_config(compilers, init_config=False)
for compiler in compilers:
try:
spack.compilers.add_compilers_to_config([compiler], init_config=False)
except Exception:
warnings.warn(
f"Could not add compiler {str(compiler.spec)}: "
f"\n\tfrom manifest: {path}"
"\nPlease reexecute with 'spack -d' and include the stack trace"
)
tty.debug(f"Include this\n{traceback.format_exc()}")
if apply_updates:
for spec in specs.values():
spack.store.STORE.db.add(spec, directory_layout=None)
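A hypothetical manifest fragment illustrating the prefix handling above; all names and paths are made up:

    import os

    # Hypothetical compiler entry; real manifests carry more fields (version, arch, ...).
    entry = {
        "name": "gcc",
        "prefix": "/opt/cray/pe/gcc/12.2.0",
        "executables": {"cc": "bin/gcc", "cxx": "bin/g++", "fc": "bin/gfortran"},
    }
    if "prefix" in entry:
        paths = {
            lang: os.path.join(entry["prefix"], rel)
            for lang, rel in entry["executables"].items()
        }
    # paths["cc"] -> "/opt/cray/pe/gcc/12.2.0/bin/gcc"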

View File

@@ -27,6 +27,8 @@
import time
from typing import Any, Callable, Dict, Generator, List, NamedTuple, Set, Type, Union
import spack.deptypes as dt
try:
import uuid
@@ -89,7 +91,7 @@
#: Types of dependencies tracked by the database
#: We store by DAG hash, so we track the dependencies that the DAG hash includes.
_TRACKED_DEPENDENCIES = ht.dag_hash.deptype
_TRACKED_DEPENDENCIES = ht.dag_hash.depflag
#: Default list of fields written for each install record
DEFAULT_INSTALL_RECORD_FIELDS = (
@@ -795,7 +797,7 @@ def _assign_dependencies(self, spec_reader, hash_key, installs, data):
tty.warn(msg)
continue
spec._add_dependency(child, deptypes=dtypes, virtuals=virtuals)
spec._add_dependency(child, depflag=dt.canonicalize(dtypes), virtuals=virtuals)
def _read_from_file(self, filename):
"""Fill database from file, do not maintain old data.
@@ -1146,7 +1148,7 @@ def _add(
# Retrieve optional arguments
installation_time = installation_time or _now()
for edge in spec.edges_to_dependencies(deptype=_TRACKED_DEPENDENCIES):
for edge in spec.edges_to_dependencies(depflag=_TRACKED_DEPENDENCIES):
if edge.spec.dag_hash() in self._data:
continue
# allow missing build-only deps. This prevents excessive
@@ -1154,7 +1156,7 @@ def _add(
# is missing a build dep; there's no need to install the
# build dep's build dep first, and there's no need to warn
# about it missing.
dep_allow_missing = allow_missing or edge.deptypes == ("build",)
dep_allow_missing = allow_missing or edge.depflag == dt.BUILD
self._add(
edge.spec,
directory_layout,
@@ -1198,10 +1200,10 @@ def _add(
self._data[key] = InstallRecord(new_spec, path, installed, ref_count=0, **extra_args)
# Connect dependencies from the DB to the new copy.
for dep in spec.edges_to_dependencies(deptype=_TRACKED_DEPENDENCIES):
for dep in spec.edges_to_dependencies(depflag=_TRACKED_DEPENDENCIES):
dkey = dep.spec.dag_hash()
upstream, record = self.query_by_spec_hash(dkey)
new_spec._add_dependency(record.spec, deptypes=dep.deptypes, virtuals=dep.virtuals)
new_spec._add_dependency(record.spec, depflag=dep.depflag, virtuals=dep.virtuals)
if not upstream:
record.ref_count += 1
@@ -1371,7 +1373,13 @@ def deprecate(self, spec, deprecator):
return self._deprecate(spec, deprecator)
@_autospec
def installed_relatives(self, spec, direction="children", transitive=True, deptype="all"):
def installed_relatives(
self,
spec,
direction="children",
transitive=True,
deptype: Union[dt.DepFlag, dt.DepTypes] = dt.ALL,
):
"""Return installed specs related to this one."""
if direction not in ("parents", "children"):
raise ValueError("Invalid direction: %s" % direction)

View File

@@ -3,64 +3,11 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Data structures that represent Spack's dependency relationships."""
from typing import Dict, List, Optional, Set, Tuple, Union
from typing import Dict, List
import spack.deptypes as dt
import spack.spec
#: The types of dependency relationships that Spack understands.
all_deptypes = ("build", "link", "run", "test")
#: Default dependency type if none is specified
default_deptype = ("build", "link")
#: Type hint for the arguments accepting a dependency type
DependencyArgument = Union[str, List[str], Tuple[str, ...]]
def deptype_chars(*type_tuples: str) -> str:
"""Create a string representing deptypes for many dependencies.
The string will be some subset of 'blrt', like 'bl ', 'b t', or
' lr ' where each letter in 'blrt' stands for 'build', 'link',
'run', and 'test' (the dependency types).
For a single dependency, this just indicates that the dependency has
the indicated deptypes. For a list of dependencies, this shows
whether ANY dependency in the list has the deptypes (so the deptypes
are merged).
"""
types: Set[str] = set()
for t in type_tuples:
if t:
types.update(t)
return "".join(t[0] if t in types else " " for t in all_deptypes)
def canonical_deptype(deptype: DependencyArgument) -> Tuple[str, ...]:
"""Convert deptype to a canonical sorted tuple, or raise ValueError.
Args:
deptype: string representing dependency type, or a list/tuple of such strings.
Can also be the builtin function ``all`` or the string 'all', which result in
a tuple of all dependency types known to Spack.
"""
if deptype in ("all", all):
return all_deptypes
elif isinstance(deptype, str):
if deptype not in all_deptypes:
raise ValueError("Invalid dependency type: %s" % deptype)
return (deptype,)
elif isinstance(deptype, (tuple, list, set)):
bad = [d for d in deptype if d not in all_deptypes]
if bad:
raise ValueError("Invalid dependency types: %s" % ",".join(str(t) for t in bad))
return tuple(sorted(set(deptype)))
raise ValueError("Invalid dependency type: %s" % repr(deptype))
class Dependency:
"""Class representing metadata for a dependency on a package.
@@ -93,7 +40,7 @@ def __init__(
self,
pkg: "spack.package_base.PackageBase",
spec: "spack.spec.Spec",
type: Optional[Tuple[str, ...]] = default_deptype,
depflag: dt.DepFlag = dt.DEFAULT,
):
"""Create a new Dependency.
@@ -110,11 +57,7 @@ def __init__(
# This dict maps condition specs to lists of Patch objects, just
# as the patches dict on packages does.
self.patches: Dict[spack.spec.Spec, "List[spack.patch.Patch]"] = {}
if type is None:
self.type = set(default_deptype)
else:
self.type = set(type)
self.depflag = depflag
@property
def name(self) -> str:
@@ -124,7 +67,7 @@ def name(self) -> str:
def merge(self, other: "Dependency"):
"""Merge constraints, deptypes, and patches of other into self."""
self.spec.constrain(other.spec)
self.type |= other.type
self.depflag |= other.depflag
# concatenate patch lists, or just copy them in
for cond, p in other.patches.items():
@@ -135,5 +78,5 @@ def merge(self, other: "Dependency"):
self.patches[cond] = other.patches[cond]
def __repr__(self) -> str:
types = deptype_chars(*self.type)
types = dt.flag_to_chars(self.depflag)
return f"<Dependency: {self.pkg.name} -> {self.spec} [{types}]>"

lib/spack/spack/deptypes.py (new file, 123 lines)
View File

@@ -0,0 +1,123 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Data structures that represent Spack's edge types."""
from typing import Iterable, List, Tuple, Union
#: Type hint for the low-level dependency input (enum.Flag is too slow)
DepFlag = int
#: Type hint for the high-level dependency input
DepTypes = Union[str, List[str], Tuple[str, ...]]
#: Individual dependency types
DepType = str # Python 3.8: Literal["build", "link", "run", "test"]
# Flag values. NOTE: these values are not arbitrary, since hash computation imposes
# the order (link, run, build, test) when depending on the same package multiple times,
# and we rely on default integer comparison to sort dependency types.
# New dependency types should be appended.
LINK = 0b0001
RUN = 0b0010
BUILD = 0b0100
TEST = 0b1000
#: The types of dependency relationships that Spack understands.
ALL_TYPES: Tuple[DepType, ...] = ("build", "link", "run", "test")
#: Default dependency type if none is specified
DEFAULT_TYPES: Tuple[DepType, ...] = ("build", "link")
#: A flag with all dependency types set
ALL: DepFlag = BUILD | LINK | RUN | TEST
#: Default dependency flag if none is specified
DEFAULT: DepFlag = BUILD | LINK
#: A tuple of all flag components
ALL_FLAGS: Tuple[DepFlag, DepFlag, DepFlag, DepFlag] = (BUILD, LINK, RUN, TEST)
def flag_from_string(s: str) -> DepFlag:
if s == "build":
return BUILD
elif s == "link":
return LINK
elif s == "run":
return RUN
elif s == "test":
return TEST
else:
raise ValueError(f"Invalid dependency type: {s}")
def flag_from_strings(deptype: Iterable[str]) -> DepFlag:
"""Transform an iterable of deptype strings into a flag."""
flag = 0
for deptype_str in deptype:
flag |= flag_from_string(deptype_str)
return flag
def canonicalize(deptype: DepTypes) -> DepFlag:
"""Convert deptype user input to a DepFlag, or raise ValueError.
Args:
deptype: string representing dependency type, or a list/tuple of such strings.
Can also be the builtin function ``all`` or the string 'all', which result in
a tuple of all dependency types known to Spack.
"""
if deptype in ("all", all):
return ALL
if isinstance(deptype, str):
return flag_from_string(deptype)
if isinstance(deptype, (tuple, list, set)):
return flag_from_strings(deptype)
raise ValueError(f"Invalid dependency type: {deptype!r}")
def flag_to_tuple(x: DepFlag) -> Tuple[DepType, ...]:
deptype: List[DepType] = []
if x & BUILD:
deptype.append("build")
if x & LINK:
deptype.append("link")
if x & RUN:
deptype.append("run")
if x & TEST:
deptype.append("test")
return tuple(deptype)
def flag_to_string(x: DepFlag) -> DepType:
if x == BUILD:
return "build"
elif x == LINK:
return "link"
elif x == RUN:
return "run"
elif x == TEST:
return "test"
else:
raise ValueError(f"Invalid dependency type flag: {x}")
def flag_to_chars(depflag: DepFlag) -> str:
"""Create a string representing deptypes for many dependencies.
The string will be some subset of 'blrt', like 'bl ', 'b t', or
' lr ' where each letter in 'blrt' stands for 'build', 'link',
'run', and 'test' (the dependency types).
For a single dependency, this just indicates that the dependency has
the indicated deptypes. For a list of dependencies, this shows
whether ANY dependency in the list has the deptypes (so the deptypes
are merged)."""
return "".join(
t_str[0] if t_flag & depflag else " " for t_str, t_flag in zip(ALL_TYPES, ALL_FLAGS)
)
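A minimal usage sketch of the new flag API; it also shows that plain integer comparison orders the flags link < run < build < test, which the hash-ordering comment above relies on:

    import spack.deptypes as dt

    flag = dt.canonicalize(("build", "link"))
    assert flag == dt.DEFAULT == dt.BUILD | dt.LINK
    assert dt.flag_to_tuple(flag) == ("build", "link")
    assert dt.flag_to_chars(flag) == "bl  "  # one column per type in "blrt"
    # Integer comparison yields the (link, run, build, test) hash order:
    assert sorted([dt.TEST, dt.BUILD, dt.RUN, dt.LINK]) == [dt.LINK, dt.RUN, dt.BUILD, dt.TEST]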

View File

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from .common import DetectedPackage, executable_prefix, update_configuration
from .path import by_path, executables_in_path
from .test import detection_tests
__all__ = [
"DetectedPackage",
@@ -11,4 +12,5 @@
"executables_in_path",
"executable_prefix",
"update_configuration",
"detection_tests",
]

View File

@@ -299,36 +299,36 @@ def find_windows_compiler_bundled_packages() -> List[str]:
class WindowsKitExternalPaths:
plat_major_ver = None
if sys.platform == "win32":
plat_major_ver = str(winOs.windows_version()[0])
@staticmethod
def find_windows_kit_roots() -> Optional[str]:
def find_windows_kit_roots() -> List[str]:
"""Return Windows kit root, typically %programfiles%\\Windows Kits\\10|11\\"""
if sys.platform != "win32":
return None
return []
program_files = os.environ["PROGRAMFILES(x86)"]
kit_base = os.path.join(
program_files, "Windows Kits", WindowsKitExternalPaths.plat_major_ver
)
return kit_base
kit_base = os.path.join(program_files, "Windows Kits", "**")
return glob.glob(kit_base)
@staticmethod
def find_windows_kit_bin_paths(kit_base: Optional[str] = None) -> List[str]:
"""Returns Windows kit bin directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
assert kit_base is not None, "unexpected value for kit_base"
kit_bin = os.path.join(kit_base, "bin")
return glob.glob(os.path.join(kit_bin, "[0-9]*", "*\\"))
assert kit_base, "Unexpectedly empty value for Windows kit base path"
kit_paths = []
for kit in kit_base:
kit_bin = os.path.join(kit, "bin")
kit_paths.extend(glob.glob(os.path.join(kit_bin, "[0-9]*", "*\\")))
return kit_paths
@staticmethod
def find_windows_kit_lib_paths(kit_base: Optional[str] = None) -> List[str]:
"""Returns Windows kit lib directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
assert kit_base is not None, "unexpected value for kit_base"
kit_lib = os.path.join(kit_base, "Lib")
return glob.glob(os.path.join(kit_lib, "[0-9]*", "*", "*\\"))
assert kit_base, "Unexpectedly empty value for Windows kit base path"
kit_paths = []
for kit in kit_base:
kit_lib = os.path.join(kit, "Lib")
kit_paths.extend(glob.glob(os.path.join(kit_lib, "[0-9]*", "*", "*\\")))
return kit_paths
@staticmethod
def find_windows_driver_development_kit_paths() -> List[str]:
@@ -347,23 +347,30 @@ def find_windows_kit_reg_installed_roots_paths() -> List[str]:
if not reg:
# couldn't find key, return empty list
return []
return WindowsKitExternalPaths.find_windows_kit_lib_paths(
reg.get_value("KitsRoot%s" % WindowsKitExternalPaths.plat_major_ver).value
)
kit_root_reg = re.compile(r"KitsRoot[0-9]+")
root_paths = []
for kit_root in filter(kit_root_reg.match, reg.get_values().keys()):
root_paths.extend(
WindowsKitExternalPaths.find_windows_kit_lib_paths(reg.get_value(kit_root).value)
)
return root_paths
@staticmethod
def find_windows_kit_reg_sdk_paths() -> List[str]:
reg = spack.util.windows_registry.WindowsRegistryView(
"SOFTWARE\\WOW6432Node\\Microsoft\\Microsoft SDKs\\Windows\\v%s.0"
% WindowsKitExternalPaths.plat_major_ver,
sdk_paths = []
sdk_regex = re.compile(r"v[0-9]+.[0-9]+")
windows_reg = spack.util.windows_registry.WindowsRegistryView(
"SOFTWARE\\WOW6432Node\\Microsoft\\Microsoft SDKs\\Windows",
root_key=spack.util.windows_registry.HKEY.HKEY_LOCAL_MACHINE,
)
if not reg:
# couldn't find key, return empty list
return []
return WindowsKitExternalPaths.find_windows_kit_lib_paths(
reg.get_value("InstallationFolder").value
)
for key in filter(sdk_regex.match, [x.name for x in windows_reg.get_subkeys()]):
reg = windows_reg.get_subkey(key)
sdk_paths.extend(
WindowsKitExternalPaths.find_windows_kit_lib_paths(
reg.get_value("InstallationFolder").value
)
)
return sdk_paths
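
The registry walk above goes through spack.util.windows_registry; the same enumeration can be sketched with the standard library's winreg. The key path and value-name convention follow the Windows SDK layout and should be read as an assumption:

import re
import sys

def installed_kit_roots():
    # Only meaningful on Windows; winreg is unavailable elsewhere
    if sys.platform != "win32":
        return []
    import winreg

    pattern = re.compile(r"KitsRoot[0-9]+")
    roots = []
    key = winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\Microsoft\Windows Kits\Installed Roots"
    )
    try:
        index = 0
        while True:
            name, value, _ = winreg.EnumValue(key, index)  # raises OSError when done
            if pattern.match(name):
                roots.append(value)
            index += 1
    except OSError:
        pass
    finally:
        winreg.CloseKey(key)
    return roots
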
def find_win32_additional_install_paths() -> List[str]:

View File

@@ -2,7 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Detection of software installed in the system based on paths inspections
"""Detection of software installed in the system, based on paths inspections
and running executables.
"""
import collections
@@ -322,12 +322,14 @@ def by_path(
path_hints: Optional[List[str]] = None,
max_workers: Optional[int] = None,
) -> Dict[str, List[DetectedPackage]]:
"""Return the list of packages that have been detected on the system,
searching by path.
"""Return the list of packages that have been detected on the system, keyed by
unqualified package name.
Args:
packages_to_search: list of package classes to be detected
packages_to_search: list of packages to be detected. Each package can be either unqualified
or fully qualified
path_hints: initial list of paths to be searched
max_workers: maximum number of workers to search for packages in parallel
"""
# TODO: Packages should be able to define both .libraries and .executables in the future
# TODO: determine_spec_details should get all relevant libraries and executables in one call
@@ -355,7 +357,8 @@ def by_path(
try:
detected = future.result(timeout=DETECTION_TIMEOUT)
if detected:
result[pkg_name].extend(detected)
_, unqualified_name = spack.repo.partition_package_name(pkg_name)
result[unqualified_name].extend(detected)
except Exception:
llnl.util.tty.debug(
f"[EXTERNAL DETECTION] Skipping {pkg_name}: timeout reached"

View File

@@ -0,0 +1,187 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Create and run mock e2e tests for package detection."""
import collections
import contextlib
import pathlib
import tempfile
from typing import Any, Deque, Dict, Generator, List, NamedTuple, Tuple
import jinja2
from llnl.util import filesystem
import spack.repo
import spack.spec
from spack.util import spack_yaml
from .path import by_path
class MockExecutables(NamedTuple):
"""Mock executables to be used in detection tests"""
#: Relative paths for mock executables to be created
executables: List[str]
#: Shell script for the mock executable
script: str
class ExpectedTestResult(NamedTuple):
"""Data structure to model assertions on detection tests"""
#: Spec to be detected
spec: str
class DetectionTest(NamedTuple):
"""Data structure to construct detection tests by PATH inspection.
Packages may have a YAML file containing the description of one or more detection tests
to be performed. Each test creates a few mock executable scripts in a temporary folder,
and checks that detection by PATH gives the expected results.
"""
pkg_name: str
layout: List[MockExecutables]
results: List[ExpectedTestResult]
class Runner:
"""Runs an external detection test"""
def __init__(self, *, test: DetectionTest, repository: spack.repo.RepoPath) -> None:
self.test = test
self.repository = repository
self.tmpdir = tempfile.TemporaryDirectory()
def execute(self) -> List[spack.spec.Spec]:
"""Executes a test and returns the specs that have been detected.
This function sets up a test in a temporary directory, according to the prescriptions
in the test layout, then performs a detection by executables and returns the specs that
have been detected.
"""
with self._mock_layout() as path_hints:
entries = by_path([self.test.pkg_name], path_hints=path_hints)
_, unqualified_name = spack.repo.partition_package_name(self.test.pkg_name)
specs = set(x.spec for x in entries[unqualified_name])
return list(specs)
@contextlib.contextmanager
def _mock_layout(self) -> Generator[List[str], None, None]:
hints = set()
try:
for entry in self.test.layout:
exes = self._create_executable_scripts(entry)
for mock_executable in exes:
hints.add(str(mock_executable.parent))
yield list(hints)
finally:
self.tmpdir.cleanup()
def _create_executable_scripts(self, mock_executables: MockExecutables) -> List[pathlib.Path]:
relative_paths = mock_executables.executables
script = mock_executables.script
script_template = jinja2.Template("#!/bin/bash\n{{ script }}\n")
result = []
for mock_exe_path in relative_paths:
rel_path = pathlib.Path(mock_exe_path)
abs_path = pathlib.Path(self.tmpdir.name) / rel_path
abs_path.parent.mkdir(parents=True, exist_ok=True)
abs_path.write_text(script_template.render(script=script))
filesystem.set_executable(abs_path)
result.append(abs_path)
return result
@property
def expected_specs(self) -> List[spack.spec.Spec]:
return [spack.spec.Spec(r.spec) for r in self.test.results]
def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runner]:
"""Returns a list of test runners for a given package.
Currently, detection tests are specified in a YAML file, called ``detection_test.yaml``,
alongside the ``package.py`` file.
This function reads that file to create a bunch of ``Runner`` objects.
Args:
pkg_name: name of the package to test
repository: repository where the package lives
"""
result = []
detection_tests_content = read_detection_tests(pkg_name, repository)
tests_by_path = detection_tests_content.get("paths", [])
for single_test_data in tests_by_path:
mock_executables = []
for layout in single_test_data["layout"]:
mock_executables.append(
MockExecutables(executables=layout["executables"], script=layout["script"])
)
expected_results = []
for assertion in single_test_data["results"]:
expected_results.append(ExpectedTestResult(spec=assertion["spec"]))
current_test = DetectionTest(
pkg_name=pkg_name, layout=mock_executables, results=expected_results
)
result.append(Runner(test=current_test, repository=repository))
return result
def read_detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> Dict[str, Any]:
"""Returns the normalized content of the detection_tests.yaml associated with the package
passed in input.
The content is merged with that of any package that is transitively included using the
"includes" attribute.
Args:
pkg_name: name of the package to test
repository: repository in which to search for packages
"""
content_stack, seen = [], set()
included_packages: Deque[str] = collections.deque()
root_detection_yaml, result = _detection_tests_yaml(pkg_name, repository)
included_packages.extend(result.get("includes", []))
seen |= set(result.get("includes", []))
while included_packages:
current_package = included_packages.popleft()
try:
current_detection_yaml, content = _detection_tests_yaml(current_package, repository)
except FileNotFoundError as e:
msg = (
f"cannot read the detection tests from the '{current_package}' package, "
f"included by {root_detection_yaml}"
)
raise FileNotFoundError(msg + f"\n\n\t{e}\n")
content_stack.append((current_package, content))
included_packages.extend(x for x in content.get("includes", []) if x not in seen)
seen |= set(content.get("includes", []))
result.setdefault("paths", [])
for pkg_name, content in content_stack:
result["paths"].extend(content.get("paths", []))
return result
def _detection_tests_yaml(
pkg_name: str, repository: spack.repo.RepoPath
) -> Tuple[pathlib.Path, Dict[str, Any]]:
pkg_dir = pathlib.Path(repository.filename_for_package_name(pkg_name)).parent
detection_tests_yaml = pkg_dir / "detection_test.yaml"
with open(str(detection_tests_yaml)) as f:
content = spack_yaml.load(f)
return detection_tests_yaml, content
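
To make the data flow concrete, a hand-built equivalent of what one detection_test.yaml entry turns into, reusing the classes defined above; the package name, script output, and spec are made-up values:

test = DetectionTest(
    pkg_name="gcc",
    layout=[
        MockExecutables(executables=["bin/gcc"], script='echo "gcc (GCC) 9.4.0"')
    ],
    results=[ExpectedTestResult(spec="gcc@9.4.0")],
)
runner = Runner(test=test, repository=repo)  # repo: a spack.repo.RepoPath instance
detected = runner.execute()                  # specs found in the mock layout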

View File

@@ -38,13 +38,14 @@ class OpenMpi(Package):
import llnl.util.lang
import llnl.util.tty.color
import spack.deptypes as dt
import spack.error
import spack.patch
import spack.spec
import spack.url
import spack.util.crypto
import spack.variant
from spack.dependency import Dependency, canonical_deptype, default_deptype
from spack.dependency import Dependency
from spack.fetch_strategy import from_kwargs
from spack.resource import Resource
from spack.version import (
@@ -63,6 +64,7 @@ class OpenMpi(Package):
"depends_on",
"extends",
"maintainers",
"license",
"provides",
"patch",
"variant",
@@ -436,7 +438,7 @@ def _execute_version(pkg, ver, **kwargs):
pkg.versions[version] = kwargs
def _depends_on(pkg, spec, when=None, type=default_deptype, patches=None):
def _depends_on(pkg, spec, when=None, type=dt.DEFAULT_TYPES, patches=None):
when_spec = make_when_spec(when)
if not when_spec:
return
@@ -447,7 +449,7 @@ def _depends_on(pkg, spec, when=None, type=default_deptype, patches=None):
if pkg.name == dep_spec.name:
raise CircularReferenceError("Package '%s' cannot depend on itself." % pkg.name)
type = canonical_deptype(type)
depflag = dt.canonicalize(type)
conditions = pkg.dependencies.setdefault(dep_spec.name, {})
# call this patches here for clarity -- we want patch to be a list,
@@ -477,12 +479,12 @@ def _depends_on(pkg, spec, when=None, type=default_deptype, patches=None):
# this is where we actually add the dependency to this package
if when_spec not in conditions:
dependency = Dependency(pkg, dep_spec, type=type)
dependency = Dependency(pkg, dep_spec, depflag=depflag)
conditions[when_spec] = dependency
else:
dependency = conditions[when_spec]
dependency.spec.constrain(dep_spec, deps=False)
dependency.type |= set(type)
dependency.depflag |= depflag
# apply patches to the dependency
for execute_patch in patches:
@@ -525,7 +527,7 @@ def _execute_conflicts(pkg):
@directive(("dependencies"))
def depends_on(spec, when=None, type=default_deptype, patches=None):
def depends_on(spec, when=None, type=dt.DEFAULT_TYPES, patches=None):
"""Creates a dict of deps with specs defining when they apply.
Args:
@@ -861,6 +863,44 @@ def _execute_maintainer(pkg):
return _execute_maintainer
def _execute_license(pkg, license_identifier: str, when):
# If when is not specified the license always holds
when_spec = make_when_spec(when)
if not when_spec:
return
for other_when_spec in pkg.licenses:
if when_spec.intersects(other_when_spec):
when_message = ""
if when_spec != make_when_spec(None):
when_message = f"when {when_spec}"
other_when_message = ""
if other_when_spec != make_when_spec(None):
other_when_message = f"when {other_when_spec}"
err_msg = (
f"{pkg.name} is specified as being licensed as {license_identifier} "
f"{when_message}, but it is also specified as being licensed under "
f"{pkg.licenses[other_when_spec]} {other_when_message}, which conflict."
)
raise OverlappingLicenseError(err_msg)
pkg.licenses[when_spec] = license_identifier
@directive("licenses")
def license(license_identifier: str, when=None):
"""Add a new license directive, to specify the SPDX identifier the software is
distributed under.
Args:
license_identifier: SPDX identifier specifying the license the software
is distributed under.
when: A spec specifying when the license applies.
"""
return lambda pkg: _execute_license(pkg, license_identifier, when)
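
A hypothetical package using the new directive; the when specs are kept disjoint, since overlapping when ranges with different identifiers raise OverlappingLicenseError:

from spack.package import *

class Example(Package):
    """Hypothetical package showing the license directive."""

    homepage = "https://example.com"
    url = "https://example.com/example-2.0.tar.gz"

    license("GPL-2.0-only", when="@:1")       # legacy releases
    license("Apache-2.0 OR MIT", when="@2:")  # relicensed from 2.0 onward
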
@directive("requirements")
def requires(*requirement_specs, policy="one_of", when=None, msg=None):
"""Allows a package to request a configuration to be present in all valid solutions.
@@ -919,3 +959,7 @@ class DependencyPatchError(DirectiveError):
class UnsupportedPackageDirective(DirectiveError):
"""Raised when an invalid or unsupported package directive is specified."""
class OverlappingLicenseError(DirectiveError):
"""Raised when two licenses are declared that apply on overlapping specs."""

View File

@@ -104,7 +104,7 @@ def relative_path_for_spec(self, spec):
_check_concrete(spec)
projection = spack.projections.get_projection(self.projections, spec)
path = spec.format(projection)
path = spec.format_path(projection)
return str(Path(path))
def write_spec(self, spec, path):
@@ -120,10 +120,8 @@ def write_host_environment(self, spec):
versioning. We use it in the case that an analysis later needs to
easily access this information.
"""
from spack.util.environment import get_host_environment_metadata
env_file = self.env_metadata_path(spec)
environ = get_host_environment_metadata()
environ = spack.spec.get_host_environment_metadata()
with open(env_file, "w") as fd:
sjson.dump(environ, fd)

View File

@@ -365,6 +365,7 @@
read,
root,
spack_env_var,
spack_env_view_var,
update_yaml,
)
@@ -397,5 +398,6 @@
"read",
"root",
"spack_env_var",
"spack_env_view_var",
"update_yaml",
]

View File

@@ -12,6 +12,7 @@
from enum import Enum
from typing import List, Optional
import spack.deptypes as dt
import spack.environment.environment as ev
import spack.spec
import spack.traverse as traverse
@@ -36,7 +37,9 @@ def from_string(s: str) -> "UseBuildCache":
def _deptypes(use_buildcache: UseBuildCache):
"""What edges should we follow for a given node? If it's a cache-only
node, then we can drop build type deps."""
return ("link", "run") if use_buildcache == UseBuildCache.ONLY else ("build", "link", "run")
return (
dt.LINK | dt.RUN if use_buildcache == UseBuildCache.ONLY else dt.BUILD | dt.LINK | dt.RUN
)
class DepfileNode:
@@ -69,13 +72,13 @@ def __init__(self, pkg_buildcache: UseBuildCache, deps_buildcache: UseBuildCache
self.adjacency_list: List[DepfileNode] = []
self.pkg_buildcache = pkg_buildcache
self.deps_buildcache = deps_buildcache
self.deptypes_root = _deptypes(pkg_buildcache)
self.deptypes_deps = _deptypes(deps_buildcache)
self.depflag_root = _deptypes(pkg_buildcache)
self.depflag_deps = _deptypes(deps_buildcache)
def neighbors(self, node):
"""Produce a list of spec to follow from node"""
deptypes = self.deptypes_root if node.depth == 0 else self.deptypes_deps
return traverse.sort_edges(node.edge.spec.edges_to_dependencies(deptype=deptypes))
depflag = self.depflag_root if node.depth == 0 else self.depflag_deps
return traverse.sort_edges(node.edge.spec.edges_to_dependencies(depflag=depflag))
def accept(self, node):
self.adjacency_list.append(

View File

@@ -28,6 +28,7 @@
import spack.compilers
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.error
import spack.fetch_strategy
import spack.hash_types as ht
@@ -63,6 +64,8 @@
#: environment variable used to indicate the active environment
spack_env_var = "SPACK_ENV"
#: environment variable used to indicate the active environment view
spack_env_view_var = "SPACK_ENV_VIEW"
#: currently activated environment
_active_environment: Optional["Environment"] = None
@@ -403,7 +406,7 @@ def _write_yaml(data, str_or_file):
def _eval_conditional(string):
"""Evaluate conditional definitions using restricted variable scope."""
valid_variables = spack.util.environment.get_host_environment()
valid_variables = spack.spec.get_host_environment()
valid_variables.update({"re": re, "env": os.environ})
return eval(string, valid_variables)
@@ -1395,7 +1398,10 @@ def _concretize_together_where_possible(
result_by_user_spec = {}
solver = spack.solver.asp.Solver()
for result in solver.solve_in_rounds(specs_to_concretize, tests=tests):
allow_deprecated = spack.config.get("config:deprecated", False)
for result in solver.solve_in_rounds(
specs_to_concretize, tests=tests, allow_deprecated=allow_deprecated
):
result_by_user_spec.update(result.specs_by_input)
result = []
@@ -1474,11 +1480,12 @@ def _concretize_separately(self, tests=False):
self._add_concrete_spec(s, concrete, new=False)
# Concretize any new user specs that we haven't concretized yet
arguments, root_specs = [], []
args, root_specs, i = [], [], 0
for uspec, uspec_constraints in zip(self.user_specs, self.user_specs.specs_as_constraints):
if uspec not in old_concretized_user_specs:
root_specs.append(uspec)
arguments.append((uspec_constraints, tests))
args.append((i, uspec_constraints, tests))
i += 1
# Ensure we don't try to bootstrap clingo in parallel
if spack.config.get("config:concretizer", "clingo") == "clingo":
@@ -1497,34 +1504,36 @@ def _concretize_separately(self, tests=False):
_ = spack.compilers.get_compiler_config()
# Early return if there is nothing to do
if len(arguments) == 0:
if len(args) == 0:
return []
# Solve the environment in parallel on Linux
start = time.time()
max_processes = min(
len(arguments), # Number of specs
spack.util.cpus.determine_number_of_jobs(parallel=True),
)
num_procs = min(len(args), spack.util.cpus.determine_number_of_jobs(parallel=True))
# TODO: revisit this print as soon as darwin is parallel too
# TODO: support parallel concretization on macOS and Windows
msg = "Starting concretization"
if sys.platform != "darwin":
pool_size = spack.util.parallel.num_processes(max_processes=max_processes)
if pool_size > 1:
msg = msg + " pool with {0} processes".format(pool_size)
if sys.platform not in ("darwin", "win32") and num_procs > 1:
msg += f" pool with {num_procs} processes"
tty.msg(msg)
concretized_root_specs = spack.util.parallel.parallel_map(
_concretize_task, arguments, max_processes=max_processes, debug=tty.is_debug()
)
batch = []
for i, concrete, duration in spack.util.parallel.imap_unordered(
_concretize_task, args, processes=num_procs, debug=tty.is_debug()
):
batch.append((i, concrete))
tty.verbose(f"[{duration:7.2f}s] {root_specs[i]}")
sys.stdout.flush()
# Add specs in original order
batch.sort(key=lambda x: x[0])
by_hash = {} # for attaching information on test dependencies
for root, (_, concrete) in zip(root_specs, batch):
self._add_concrete_spec(root, concrete)
by_hash[concrete.dag_hash()] = concrete
finish = time.time()
tty.msg("Environment concretized in %.2f seconds." % (finish - start))
by_hash = {}
for abstract, concrete in zip(root_specs, concretized_root_specs):
self._add_concrete_spec(abstract, concrete)
by_hash[concrete.dag_hash()] = concrete
tty.msg(f"Environment concretized in {finish - start:.2f} seconds")
# Unify the specs objects, so we get correct references to all parents
self._read_lockfile_dict(self._to_lockfile_dict())
@@ -1536,13 +1545,13 @@ def _concretize_separately(self, tests=False):
for h in self.specs_by_hash:
current_spec, computed_spec = self.specs_by_hash[h], by_hash[h]
for node in computed_spec.traverse():
test_edges = node.edges_to_dependencies(deptype="test")
test_edges = node.edges_to_dependencies(depflag=dt.TEST)
for current_edge in test_edges:
test_dependency = current_edge.spec
if test_dependency in current_spec[node.name]:
continue
current_spec[node.name].add_dependency_edge(
test_dependency.copy(), deptypes="test", virtuals=current_edge.virtuals
test_dependency.copy(), depflag=dt.TEST, virtuals=current_edge.virtuals
)
results = [
@@ -1591,16 +1600,14 @@ def concretize_and_add(self, user_spec, concrete_spec=None, tests=False):
@property
def default_view(self):
if not self.views:
raise SpackEnvironmentError("{0} does not have a view enabled".format(self.name))
if default_view_name not in self.views:
raise SpackEnvironmentError(
"{0} does not have a default view enabled".format(self.name)
)
if not self.has_view(default_view_name):
raise SpackEnvironmentError(f"{self.name} does not have a default view enabled")
return self.views[default_view_name]
def has_view(self, view_name: str) -> bool:
return view_name in self.views
def update_default_view(self, path_or_bool: Union[str, bool]) -> None:
"""Updates the path of the default view.
@@ -1686,62 +1693,34 @@ def check_views(self):
"Loading the environment view will require reconcretization." % self.name
)
def _env_modifications_for_default_view(self, reverse=False):
all_mods = spack.util.environment.EnvironmentModifications()
def _env_modifications_for_view(
self, view: ViewDescriptor, reverse: bool = False
) -> spack.util.environment.EnvironmentModifications:
try:
mods = uenv.environment_modifications_for_specs(*self.concrete_roots(), view=view)
except Exception as e:
# Failing to set up spec-specific changes shouldn't be a hard error.
tty.warn(
"couldn't load runtime environment due to {}: {}".format(e.__class__.__name__, e)
)
return spack.util.environment.EnvironmentModifications()
return mods.reversed() if reverse else mods
visited = set()
errors = []
for root_spec in self.concrete_roots():
if root_spec in self.default_view and root_spec.installed and root_spec.package:
for spec in root_spec.traverse(deptype="run", root=True):
if spec.name in visited:
# It is expected that only one instance of the package
# can be added to the environment - do not attempt to
# add multiple.
tty.debug(
"Not adding {0} to shell modifications: "
"this package has already been added".format(
spec.format("{name}/{hash:7}")
)
)
continue
else:
visited.add(spec.name)
try:
mods = uenv.environment_modifications_for_spec(spec, self.default_view)
except Exception as e:
msg = "couldn't get environment settings for %s" % spec.format(
"{name}@{version} /{hash:7}"
)
errors.append((msg, str(e)))
continue
all_mods.extend(mods.reversed() if reverse else mods)
return all_mods, errors
def add_default_view_to_env(self, env_mod):
"""
Collect the environment modifications to activate an environment using the
default view. Removes duplicate paths.
def add_view_to_env(
self, env_mod: spack.util.environment.EnvironmentModifications, view: str
) -> spack.util.environment.EnvironmentModifications:
"""Collect the environment modifications to activate an environment using the provided
view. Removes duplicate paths.
Args:
env_mod (spack.util.environment.EnvironmentModifications): the environment
modifications object that is modified.
"""
if default_view_name not in self.views:
# No default view to add to shell
env_mod: the environment modifications object that is modified.
view: the name of the view to activate."""
descriptor = self.views.get(view)
if not descriptor:
return env_mod
env_mod.extend(uenv.unconditional_environment_modifications(self.default_view))
mods, errors = self._env_modifications_for_default_view()
env_mod.extend(mods)
if errors:
for err in errors:
tty.warn(*err)
env_mod.extend(uenv.unconditional_environment_modifications(descriptor))
env_mod.extend(self._env_modifications_for_view(descriptor))
# deduplicate paths from specs mapped to the same location
for env_var in env_mod.group_by_name():
@@ -1749,23 +1728,21 @@ def add_default_view_to_env(self, env_mod):
return env_mod
def rm_default_view_from_env(self, env_mod):
"""
Collect the environment modifications to deactivate an environment using the
default view. Reverses the action of ``add_default_view_to_env``.
def rm_view_from_env(
self, env_mod: spack.util.environment.EnvironmentModifications, view: str
) -> spack.util.environment.EnvironmentModifications:
"""Collect the environment modifications to deactivate an environment using the provided
view. Reverses the action of ``add_view_to_env``.
Args:
env_mod (spack.util.environment.EnvironmentModifications): the environment
modifications object that is modified.
"""
if default_view_name not in self.views:
# No default view to add to shell
env_mod: the environment modifications object that is modified.
view: the name of the view to deactivate."""
descriptor = self.views.get(view)
if not descriptor:
return env_mod
env_mod.extend(uenv.unconditional_environment_modifications(self.default_view).reversed())
mods, _ = self._env_modifications_for_default_view(reverse=True)
env_mod.extend(mods)
env_mod.extend(uenv.unconditional_environment_modifications(descriptor).reversed())
env_mod.extend(self._env_modifications_for_view(descriptor, reverse=True))
return env_mod
@@ -2190,7 +2167,7 @@ def _read_lockfile_dict(self, d):
name, data = reader.name_and_data(node_dict)
for _, dep_hash, deptypes, _, virtuals in reader.dependencies_from_node_dict(data):
specs_by_hash[lockfile_key]._add_dependency(
specs_by_hash[dep_hash], deptypes=deptypes, virtuals=virtuals
specs_by_hash[dep_hash], depflag=dt.canonicalize(deptypes), virtuals=virtuals
)
# Traverse the root specs one at a time in the order they appear.
@@ -2418,10 +2395,12 @@ def _concretize_from_constraints(spec_constraints, tests=False):
invalid_constraints.extend(inv_variant_constraints)
def _concretize_task(packed_arguments):
spec_constraints, tests = packed_arguments
def _concretize_task(packed_arguments) -> Tuple[int, Spec, float]:
index, spec_constraints, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
return _concretize_from_constraints(spec_constraints, tests)
start = time.time()
spec = _concretize_from_constraints(spec_constraints, tests)
return index, spec, time.time() - start
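
The indexed-task pattern used by _concretize_task, sketched with the standard library alone (spack.util.parallel.imap_unordered wraps the same idea): tag each task with its position, collect results in completion order, then sort to restore input order.

import multiprocessing
import time

def task(packed):
    index, payload = packed
    start = time.time()
    result = payload * payload  # stand-in for the real concretization work
    return index, result, time.time() - start

if __name__ == "__main__":
    args = [(i, n) for i, n in enumerate([3, 1, 4, 1, 5])]
    with multiprocessing.Pool(processes=2) as pool:
        batch = []
        for index, result, duration in pool.imap_unordered(task, args):
            print(f"[{duration:7.2f}s] task {index}")  # progress as results land
            batch.append((index, result))              # arrives in completion order
    batch.sort(key=lambda item: item[0])               # restore submission order
    print([result for _, result in batch])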
def make_repo_path(root):

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from typing import Optional
import llnl.util.tty as tty
from llnl.util.tty.color import colorize
@@ -13,12 +14,14 @@
from spack.util.environment import EnvironmentModifications
def activate_header(env, shell, prompt=None):
def activate_header(env, shell, prompt=None, view: Optional[str] = None):
# Construct the commands to run
cmds = ""
if shell == "csh":
# TODO: figure out how to make color work for csh
cmds += "setenv SPACK_ENV %s;\n" % env.path
if view:
cmds += "setenv SPACK_ENV_VIEW %s;\n" % view
cmds += 'alias despacktivate "spack env deactivate";\n'
if prompt:
cmds += "if (! $?SPACK_OLD_PROMPT ) "
@@ -29,6 +32,8 @@ def activate_header(env, shell, prompt=None):
prompt = colorize("@G{%s} " % prompt, color=True)
cmds += "set -gx SPACK_ENV %s;\n" % env.path
if view:
cmds += "set -gx SPACK_ENV_VIEW %s;\n" % view
cmds += "function despacktivate;\n"
cmds += " spack env deactivate;\n"
cmds += "end;\n"
@@ -40,15 +45,21 @@ def activate_header(env, shell, prompt=None):
elif shell == "bat":
# TODO: Color
cmds += 'set "SPACK_ENV=%s"\n' % env.path
if view:
cmds += 'set "SPACK_ENV_VIEW=%s"\n' % view
# TODO: despacktivate
# TODO: prompt
elif shell == "pwsh":
cmds += "$Env:SPACK_ENV='%s'\n" % env.path
if view:
cmds += "$Env:SPACK_ENV_VIEW='%s'\n" % view
else:
if "color" in os.getenv("TERM", "") and prompt:
prompt = colorize("@G{%s}" % prompt, color=True, enclose=True)
cmds += "export SPACK_ENV=%s;\n" % env.path
if view:
cmds += "export SPACK_ENV_VIEW=%s;\n" % view
cmds += "alias despacktivate='spack env deactivate';\n"
if prompt:
cmds += "if [ -z ${SPACK_OLD_PS1+x} ]; then\n"
@@ -66,12 +77,14 @@ def deactivate_header(shell):
cmds = ""
if shell == "csh":
cmds += "unsetenv SPACK_ENV;\n"
cmds += "unsetenv SPACK_ENV_VIEW;\n"
cmds += "if ( $?SPACK_OLD_PROMPT ) "
cmds += ' eval \'set prompt="$SPACK_OLD_PROMPT" &&'
cmds += " unsetenv SPACK_OLD_PROMPT';\n"
cmds += "unalias despacktivate;\n"
elif shell == "fish":
cmds += "set -e SPACK_ENV;\n"
cmds += "set -e SPACK_ENV_VIEW;\n"
cmds += "functions -e despacktivate;\n"
#
# NOTE: Not changing fish_prompt (above) => no need to restore it here.
@@ -79,14 +92,19 @@ def deactivate_header(shell):
elif shell == "bat":
# TODO: Color
cmds += 'set "SPACK_ENV="\n'
cmds += 'set "SPACK_ENV_VIEW="\n'
# TODO: despacktivate
# TODO: prompt
elif shell == "pwsh":
cmds += "Set-Item -Path Env:SPACK_ENV\n"
cmds += "Set-Item -Path Env:SPACK_ENV_VIEW\n"
else:
cmds += "if [ ! -z ${SPACK_ENV+x} ]; then\n"
cmds += "unset SPACK_ENV; export SPACK_ENV;\n"
cmds += "fi;\n"
cmds += "if [ ! -z ${SPACK_ENV_VIEW+x} ]; then\n"
cmds += "unset SPACK_ENV_VIEW; export SPACK_ENV_VIEW;\n"
cmds += "fi;\n"
cmds += "alias despacktivate > /dev/null 2>&1 && unalias despacktivate;\n"
cmds += "if [ ! -z ${SPACK_OLD_PS1+x} ]; then\n"
cmds += " if [ \"$SPACK_OLD_PS1\" = '$$$$' ]; then\n"
@@ -100,24 +118,23 @@ def deactivate_header(shell):
return cmds
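
For a POSIX shell with a view named "default", the header functions above would emit something along these lines (paths illustrative, prompt handling and guards omitted):

# activate_header output
export SPACK_ENV=/path/to/env;
export SPACK_ENV_VIEW=default;
alias despacktivate='spack env deactivate';

# deactivate_header output (each unset is wrapped in an existence check)
unset SPACK_ENV; export SPACK_ENV;
unset SPACK_ENV_VIEW; export SPACK_ENV_VIEW;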
def activate(env, use_env_repo=False, add_view=True):
"""
Activate an environment and append environment modifications
def activate(
env: ev.Environment, use_env_repo=False, view: Optional[str] = "default"
) -> EnvironmentModifications:
"""Activate an environment and append environment modifications
To activate an environment, we add its configuration scope to the
existing Spack configuration, and we set active to the current
environment.
Arguments:
env (spack.environment.Environment): the environment to activate
use_env_repo (bool): use the packages exactly as they appear in the
environment's repository
add_view (bool): generate commands to add view to path variables
env: the environment to activate
use_env_repo: use the packages exactly as they appear in the environment's repository
view: generate commands to add runtime environment variables for named view
Returns:
spack.util.environment.EnvironmentModifications: Environment variables
modifications to activate environment.
"""
modifications to activate environment."""
ev.activate(env, use_env_repo=use_env_repo)
env_mods = EnvironmentModifications()
@@ -129,9 +146,9 @@ def activate(env, use_env_repo=False, add_view=True):
# become PATH variables.
#
try:
if add_view and ev.default_view_name in env.views:
if view and env.has_view(view):
with spack.store.STORE.db.read_transaction():
env.add_default_view_to_env(env_mods)
env.add_view_to_env(env_mods, view)
except (spack.repo.UnknownPackageError, spack.repo.UnknownNamespaceError) as e:
tty.error(e)
tty.die(
@@ -145,17 +162,15 @@ def activate(env, use_env_repo=False, add_view=True):
return env_mods
def deactivate():
"""
Deactivate an environment and collect corresponding environment modifications.
def deactivate() -> EnvironmentModifications:
"""Deactivate an environment and collect corresponding environment modifications.
Note: unloads the environment in its current state, not in the state it was
loaded in, meaning that specs that were removed from the spack environment
after activation are not unloaded.
Returns:
spack.util.environment.EnvironmentModifications: Environment variables
modifications to activate environment.
Environment variables modifications to activate environment.
"""
env_mods = EnvironmentModifications()
active = ev.active_environment()
@@ -163,10 +178,12 @@ def deactivate():
if active is None:
return env_mods
if ev.default_view_name in active.views:
active_view = os.getenv(ev.spack_env_view_var)
if active_view and active.has_view(active_view):
try:
with spack.store.STORE.db.read_transaction():
active.rm_default_view_from_env(env_mods)
active.rm_view_from_env(env_mods, active_view)
except (spack.repo.UnknownPackageError, spack.repo.UnknownNamespaceError) as e:
tty.warn(e)
tty.warn(

View File

@@ -128,3 +128,7 @@ def __init__(self, provided, required, constraint_type):
self.provided = provided
self.required = required
self.constraint_type = constraint_type
class FetchError(SpackError):
"""Superclass for fetch-related errors."""

View File

@@ -31,9 +31,11 @@
import urllib.parse
from typing import List, Optional
import llnl.url
import llnl.util
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.string import comma_and, quote
from llnl.util.filesystem import get_single_file, mkdirp, temp_cwd, temp_rename, working_dir
from llnl.util.symlink import symlink
@@ -46,9 +48,8 @@
import spack.util.web as web_util
import spack.version
import spack.version.git_ref_lookup
from spack.util.compression import decompressor_for, extension_from_path
from spack.util.compression import decompressor_for
from spack.util.executable import CommandNotFoundError, which
from spack.util.string import comma_and, quote
#: List of all fetch strategies, created by FetchStrategy metaclass.
all_strategies = []
@@ -400,7 +401,7 @@ def _fetch_curl(self, url):
try:
web_util.check_curl_code(curl.returncode)
except web_util.FetchError as err:
except spack.error.FetchError as err:
raise spack.fetch_strategy.FailedDownloadError(url, str(err))
self._check_headers(headers)
@@ -441,7 +442,7 @@ def expand(self):
# TODO: replace this by mime check.
if not self.extension:
self.extension = spack.url.determine_url_file_extension(self.url)
self.extension = llnl.url.determine_url_file_extension(self.url)
if self.stage.expanded:
tty.debug("Source already staged to %s" % self.stage.source_path)
@@ -570,7 +571,7 @@ def expand(self):
@_needs_stage
def archive(self, destination, **kwargs):
assert extension_from_path(destination) == "tar.gz"
assert llnl.url.extension_from_path(destination) == "tar.gz"
assert self.stage.source_path.startswith(self.stage.path)
tar = which("tar", required=True)
@@ -733,7 +734,11 @@ def version_from_git(git_exe):
@property
def git(self):
if not self._git:
self._git = spack.util.git.git()
try:
self._git = spack.util.git.git(required=True)
except CommandNotFoundError as exc:
tty.error(str(exc))
raise
# Disable advice for a quieter fetch
# https://github.com/git/git/blob/master/Documentation/RelNotes/1.7.2.txt
@@ -1289,7 +1294,7 @@ def fetch(self):
parsed_url = urllib.parse.urlparse(self.url)
if parsed_url.scheme != "s3":
raise web_util.FetchError("S3FetchStrategy can only fetch from s3:// urls.")
raise spack.error.FetchError("S3FetchStrategy can only fetch from s3:// urls.")
tty.debug("Fetching {0}".format(self.url))
@@ -1336,7 +1341,7 @@ def fetch(self):
parsed_url = urllib.parse.urlparse(self.url)
if parsed_url.scheme != "gs":
raise web_util.FetchError("GCSFetchStrategy can only fetch from gs:// urls.")
raise spack.error.FetchError("GCSFetchStrategy can only fetch from gs:// urls.")
tty.debug("Fetching {0}".format(self.url))
@@ -1430,7 +1435,7 @@ def from_kwargs(**kwargs):
on attribute names (e.g., ``git``, ``hg``, etc.)
Raises:
spack.util.web.FetchError: If no ``fetch_strategy`` matches the args.
spack.error.FetchError: If no ``fetch_strategy`` matches the args.
"""
for fetcher in all_strategies:
if fetcher.matches(kwargs):
@@ -1537,7 +1542,7 @@ def for_package_version(pkg, version=None):
# if it's a commit, we must use a GitFetchStrategy
if isinstance(version, spack.version.GitVersion):
if not hasattr(pkg, "git"):
raise web_util.FetchError(
raise spack.error.FetchError(
f"Cannot fetch git version for {pkg.name}. Package has no 'git' attribute"
)
# Populate the version with comparisons to other commits
@@ -1687,11 +1692,11 @@ def destroy(self):
shutil.rmtree(self.root, ignore_errors=True)
class NoCacheError(web_util.FetchError):
class NoCacheError(spack.error.FetchError):
"""Raised when there is no cached archive for a package."""
class FailedDownloadError(web_util.FetchError):
class FailedDownloadError(spack.error.FetchError):
"""Raised when a download fails."""
def __init__(self, url, msg=""):
@@ -1699,23 +1704,23 @@ def __init__(self, url, msg=""):
self.url = url
class NoArchiveFileError(web_util.FetchError):
class NoArchiveFileError(spack.error.FetchError):
"""Raised when an archive file is expected but none exists."""
class NoDigestError(web_util.FetchError):
class NoDigestError(spack.error.FetchError):
"""Raised after attempt to checksum when URL has no digest."""
class ExtrapolationError(web_util.FetchError):
class ExtrapolationError(spack.error.FetchError):
"""Raised when we can't extrapolate a version for a package."""
class FetcherConflict(web_util.FetchError):
class FetcherConflict(spack.error.FetchError):
"""Raised for packages with invalid fetch attributes."""
class InvalidArgsError(web_util.FetchError):
class InvalidArgsError(spack.error.FetchError):
"""Raised when a version can't be deduced from a set of arguments."""
def __init__(self, pkg=None, version=None, **args):
@@ -1728,11 +1733,11 @@ def __init__(self, pkg=None, version=None, **args):
super().__init__(msg, long_msg)
class ChecksumError(web_util.FetchError):
class ChecksumError(spack.error.FetchError):
"""Raised when archive fails to checksum."""
class NoStageError(web_util.FetchError):
class NoStageError(spack.error.FetchError):
"""Raised when fetch operations are called before set_stage()."""
def __init__(self, method):

View File

@@ -500,7 +500,7 @@ def get_projection_for_spec(self, spec):
proj = spack.projections.get_projection(self.projections, locator_spec)
if proj:
return os.path.join(self._root, locator_spec.format(proj))
return os.path.join(self._root, locator_spec.format_path(proj))
return self._root
def get_all_specs(self):
@@ -776,7 +776,7 @@ def get_relative_projection_for_spec(self, spec):
spec = spec.package.extendee_spec
p = spack.projections.get_projection(self.projections, spec)
return spec.format(p) if p else ""
return spec.format_path(p) if p else ""
def get_projection_for_spec(self, spec):
"""
@@ -791,7 +791,7 @@ def get_projection_for_spec(self, spec):
proj = spack.projections.get_projection(self.projections, spec)
if proj:
return os.path.join(self._root, spec.format(proj))
return os.path.join(self._root, spec.format_path(proj))
return self._root
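
The switch from spec.format to spec.format_path matters when a projection renders characters that are unsafe in file paths; format_path applies the same template but sanitizes the result for filesystem use. A hypothetical projection, with the spec and pattern as made-up values:

import spack.spec

spec = spack.spec.Spec("zlib@1.2.13")
print(spec.format("{name}-{version}"))  # zlib-1.2.13
# format_path would render the same template, escaping path-hostile characters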

Some files were not shown because too many files have changed in this diff.