Compare commits

378 Commits

Author SHA1 Message Date
Todd Gamblin
7894a6077d use underline instead of bold for the highlight 2024-03-17 23:43:05 -07:00
Todd Gamblin
0d010fe24f spack spec: add -d / --debug-nondefaults option
It can be hard to tell when the concretizer has made decisions that are not the defaults
for packages. This introduces a `spack spec -d` option that will highlight the
nondefault options in red.

You can use this to see:
- When a selected version is not the most preferred;
- When a selected compiler is not the most preferred;
- When a nondefault variant is selected;
- When the chosen target is not the native microarchitecture;
- When a chosen virtual provider is not the most preferred.

With `--reuse`, you'll see clearly which versions and options on reused packages are not
current with the latest `package.py` files. With `--fresh`, you can see which options
are forced by particular package preferences. e.g., `spack spec -d tar` shows `zstd`'s
`+programs` option in red because it's off by default, but `tar` requires it via
`depends_on("zstd+programs")`.

This should be useful for debugging strange/unintuitive concretizations.

- [x] Modify concretizer to return optimization weights
- [x] Add `_weights` metadata to the `Spec` object
- [x] Modify `Spec.format()` to colorize nondefaults when weights are nonzero
- [x] Add tests
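
A rough sketch of the weight-based highlighting described above, with purely
illustrative names (the real logic lives in the concretizer and `Spec.format()`):

```python
RED, RESET = "\033[0;31m", "\033[0m"

def highlight_nondefaults(tokens):
    # Each token of the formatted spec is paired with its optimization
    # weight; a nonzero weight means the concretizer paid a cost to
    # deviate from the default, so that token is painted red.
    return "".join(f"{RED}{t}{RESET}" if w else t for t, w in tokens)

# `+programs` is nondefault for zstd, so it would show in red:
print(highlight_nondefaults([("zstd", 0), ("+programs", 2)]))
```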
2024-03-17 23:43:01 -07:00
Todd Gamblin
d83a9c63b4 spec: simplify color formatting
- [x] Remove `colorize_spec()`, which uses older, now incorrect colorization logic. It
      is older than the current `Spec.format()` implementation and it doesn't work with
      `@=`. The only usages in `cmd/info.py` can be replaced with `Spec.cformat()`.
- [x] Remove `_SEPARATORS` and `COLOR_FORMATS`, which are no longer needed.
- [x] Simplify `Spec.format()` code to use more readable color constants directly.
- [x] Remove unused color constants.
- [x] Remove unused parameter for `write_attribute()`
- [x] Remove unused `Spec.colorized()` method
2024-03-17 23:40:20 -07:00
Seth R. Johnson
5ab10d57be geant4: add maintainer, clean args (#43218) 2024-03-16 17:29:03 +00:00
dependabot[bot]
96061d2c00 build(deps): bump black from 24.2.0 to 24.3.0 in /lib/spack/docs (#43228)
Bumps [black](https://github.com/psf/black) from 24.2.0 to 24.3.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.2.0...24.3.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-15 17:24:44 -06:00
dependabot[bot]
e78d20dc84 build(deps): bump black in /.github/workflows/style (#43227)
Bumps [black](https://github.com/psf/black) from 24.2.0 to 24.3.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.2.0...24.3.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-15 16:09:59 -07:00
Adam J. Stewart
6d2341c109 py-black: add v24.3.0 (#43226) 2024-03-15 16:09:17 -07:00
Seth R. Johnson
968ad02473 geant4: patch old versions to work on new compiler/ubuntu (#43212)
* geant4: patch old version for %clang@15

* Backport ascii-V10-07-03

* Apply to all compilers, but only for 10.5-10.6
2024-03-15 12:22:08 -06:00
Jose E. Roman
b93882804f New patch release SLEPc 3.20.2 (#43211) 2024-03-15 09:51:08 -07:00
Wouter Deconinck
f58ebd4fbb pandora{pfa,sdk,monitoring}: new HEP packages for particle flow analysis (#37714)
* pandorapfa: new package

* pandorasdk: new package

* [@spackbot] updating style on behalf of wdconinc

* pandorasdk: use self.define

* pandoramonitoring: new package

* pandorasdk: new version 3.4.2

* pandora{pfa,sdk,monitoring}: add maintainer jmcarcell

Co-authored-by: Juan Miguel Carceller <22276694+jmcarcell@users.noreply.github.com>

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Co-authored-by: Juan Miguel Carceller <22276694+jmcarcell@users.noreply.github.com>
2024-03-15 14:58:36 +01:00
Massimiliano Culpo
6f7f9528e5 cray-rhel: add a lower bound to mgard (#43187) 2024-03-15 11:25:56 +01:00
Greg Becker
59c7ff8683 Allow compilers to be configured in packages.yaml (#42016)
Co-authored-by: becker33 <becker33@users.noreply.github.com>
2024-03-15 11:01:49 +01:00
John W. Parent
4495e0341d Clingo bootstrapping: Remove msvc constraint (#43199)
Patch allowing Clingo to build with VS22 has landed both in Spack
and Clingo upstream, update Spack's bootstrap constraints to handle
this.

Additionally, properly scope the patch application in the clingo
package to handle upstream patch.
2024-03-14 19:08:38 -06:00
eugeneswalker
ba39924046 e4s cray ci: mgard is broken, disable spec (#43194) 2024-03-14 14:06:36 -07:00
Sergey Kosukhin
751c3fef86 nag: add version 7.2.7200 (#43188) 2024-03-14 15:04:05 -06:00
Adrien Bernede
102811adb9 Fix Axom: index out of range when configuring axom~mpi on toss_4 (#43186) 2024-03-14 14:49:09 -06:00
Massimiliano Culpo
8f56eb620f Improve error message when an unknown compiler is requested (#43143)
Fixes #43141
2024-03-14 13:47:56 -07:00
Peter Scheibel
ec517b40e9 spack develop: stage build artifacts in same root as non-dev builds (#41373)
Currently (outside of this PR), when you `spack develop` a path, that path is treated as the staging
directory (meaning, for example, that all build artifacts are placed in the develop path).

This PR creates a separate staging directory for all `spack develop`ed builds. It looks like

```
# the stage root
/the-stage-root-for-all-spack-builds/
    spack-stage-<hash>
        # Spack packages inheriting CMakePackage put their build artifacts here
        spack-build-<hash>/
```

Unlike non-develop builds, there is no `spack-src` directory, `source_path` is the provided `dev_path`.
Instead, separately, in the `dev_path`, we have:

```
/dev/path/for/foo/
    build-{arch}-<hash> -> /the-stage-root-for-all-spack-builds/spack-stage-<hash>/
```

The main benefit of this is that build artifacts for out-of-source builds that are relative to
`Stage.path` are easily identified (and you can delete them with `spack clean`).

Other behavior added here:

- [x] A symlink is made from the `dev_path` to the stage directory. This symlink name incorporates
    spec details, so that multiple Spack environments that develop the same path will not conflict
    with one another

- [x] `spack cd` and `spack location` have added a `-c` shorthand for `--source-dir`

Spack builds can still change the develop path (in particular to keep track of applied patches), 
and for in-source builds, this doesn't change much (although logs would not be written into 
the develop path). Packages inheriting from `CMakePackage` should get this benefit
automatically though.
2024-03-14 13:32:01 -07:00
Martin Aumüller
22cb3815fe ispc: add v1.22.0 & v1.23.0 (#43159)
* ispc: add v1.22.0 & v1.23.0
* ispc: fix build on macos
  apply upstream patch
* ispc: demote llvm to just a build dependency
  still builds fine
2024-03-14 10:59:45 -07:00
Greg Becker
f549354f78 move --deprecated arg to concretizer args (#43177) 2024-03-14 18:43:52 +01:00
Scott Wittenburg
dc212d0e59 mgard: add version 2023-12-09 (1.5.2) (#41493) 2024-03-14 11:15:47 -06:00
victor-decaria-nnl
8f14acb139 mfem: add MUMPS option (#42929)
* Added MUMPS option to mfem
* implement suggestions
* loosened mumps+metis dependency to just mumps
2024-03-14 09:09:24 -07:00
Harmen Stoppels
c38ef72b06 compiler.py: simplify implicit link dir bits (#43078) 2024-03-14 08:54:47 +01:00
jmuddnv
7d67d9ece4 nvhpc: add v24.3 (#43175) 2024-03-14 08:35:09 +01:00
Carlos Bederián
2c30962c74 amduprof: new package (#30575)
Co-authored-by: zzzoom <zzzoom@users.noreply.github.com>
2024-03-14 07:44:43 +01:00
dependabot[bot]
cc28334049 build(deps): bump pytest from 8.0.2 to 8.1.1 in /lib/spack/docs (#43134)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.0.2 to 8.1.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.0.2...8.1.1)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-14 07:40:15 +01:00
dependabot[bot]
dbdf5bacc4 build(deps): bump actions/checkout from 4.1.1 to 4.1.2 (#43152)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.1 to 4.1.2.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](b4ffde65f4...9bb56186c3)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-14 05:55:08 +01:00
Rocco Meli
531f01f0b9 highway: add v1.1.0 (#43174) 2024-03-14 05:43:37 +01:00
Sebastian Pipping
794593b478 expat: Add release 2.6.2 with security fixes (#43171) 2024-03-13 22:43:00 -06:00
Tom Scogland
afcf0d2e39 llvm-openmp: make llvm-openmp consistent with other llvm builds (#43165)
Add current versions of the 17 and 18 releases

Stop making it nearly impossible to compose this correctly with code built with gcc

Build for compatibility by default like we do in every other llvm package
2024-03-14 05:34:09 +01:00
dependabot[bot]
29ee861366 build(deps): bump docker/login-action from 3.0.0 to 3.1.0 (#43176)
Bumps [docker/login-action](https://github.com/docker/login-action) from 3.0.0 to 3.1.0.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](343f7c4344...e92390c5fb)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-14 05:27:16 +01:00
John W. Parent
b1a984ef02 msvc: patch property ref bug (#43173) 2024-03-13 22:28:36 +00:00
Christopher Christofi
cc545d8c9a jags: add version 4.3.2 (#43166) 2024-03-13 23:06:03 +01:00
Ashwin Kumar Karnad
49ff816fb0 octopus: disable gdlib by default (#43161) 2024-03-13 13:04:39 -06:00
Ashwin Kumar Karnad
8c33841567 Add iamashwin99 as a maintainer in Octopus package (#43163) 2024-03-13 12:59:41 -06:00
Ashwin Kumar Karnad
21b50fbbe3 octopus: Support new version octopus@14 (#43160)
* add checksum for octopus 14

* Update dependencies for BerkeleyGW package
2024-03-13 12:59:16 -06:00
Martin Aumüller
2a8e503a04 protobuf: apply centos 8 patch only to @3.4: (#43162)
does not apply cleanly to older versions
2024-03-13 12:54:02 -06:00
Wileam Y. Phan
4b695d4722 rocm-openmp-extras: Fix resource download URLs (#43147)
* rocm-openmp-extras: Fix resource download URLs

* Apply review suggestions
2024-03-13 12:53:46 -06:00
Juan Miguel Carceller
2fa816184e py-ruff: add version 0.3.0 and deprecate the oldest version (#43069)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-03-13 11:03:28 -07:00
Brian Spilner
25f622e809 update to cdo-2.4.0 (#43071) 2024-03-13 11:01:41 -07:00
吴坎
7506acabe7 antlr4-cpp-runtime: update dependencies (#43115)
* update dependencies

* Update package.py
2024-03-13 11:00:54 -07:00
Dom Heinzeller
3a828358cb Update var/spack/repos/builtin/packages/ecmwf-atlas/package.py: set correct ectrans/trans variant, configure tesselation variant (#43151) 2024-03-13 10:19:24 -07:00
Adam J. Stewart
94a1d1414a spack.patch: support reversing patches (#43040)
The `patch()` directive can now be invoked with `reverse=True` to apply a patch in reverse.
This is useful for reverting commits that caused errors in projects, even if only the forward
patch is available, e.g. via a GitHub commit patch URL.

`patch(..., reverse=True)` runs `patch -R` behind the scenes. This is a POSIX option so we
can expect it to be available on the `patch` command.
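
For example, a hedged `package.py` sketch (the URL, checksum, and version range
are placeholders, not values from this PR):

```python
# Revert an upstream commit for which only the forward patch exists;
# reverse=True makes Spack run `patch -R` when applying it.
patch(
    "https://github.com/example/project/commit/abc123.patch?full_index=1",
    sha256="<sha256 of the downloaded patch file>",
    reverse=True,
    when="@2.1",  # only the release that contains the bad commit
)
```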

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-12 23:22:10 -07:00
downloadico
0f080b38f4 abinit: add version 9.10.5 (#43148)
set the FC variable to the MPI Fortran compiler and also set the F90 variable
to the same compiler for versions 9.8 and up.  FC needs to be set because the
configure script still uses FC.
2024-03-12 14:04:35 -07:00
Alec Scott
f1ec4859c8 unmaintained packages: add new versions (#43112)
* unmaintained packages: add new versions
* Fix parallel and numactl
* Revert numactl changes
* rollback lua-sol2 version
* Update alluxio version format
2024-03-12 13:34:17 -07:00
Greg Sjaardema
63baba0308 SEACAS: Make latest release available in spack (#43123) 2024-03-12 11:32:32 -07:00
Christopher Christofi
aeec861544 py-art: add new package (#43127) 2024-03-12 11:29:13 -07:00
Christopher Christofi
e54d4678f9 py-lazy-loader: add new version (#43130) 2024-03-12 11:14:17 -07:00
Dom Heinzeller
187b8adb4f ecmwf-atlas@0.36: depend on ecbuild@3.4: (#43133) 2024-03-12 11:00:15 -07:00
Samuel K. Gutiérrez
d6fd96f024 libquo: Update default version from 1.3.1 to 1.4. (#43057)
Signed-off-by: Samuel K. Gutierrez <samuel@lanl.gov>
2024-03-12 10:28:30 -07:00
Andrey Perestoronin
e3b6d2c3c7 Intel oneapi compilers 2023.2.4 (#43144)
* added intel-oneapi-compilers

* added classics version
2024-03-12 12:54:19 -04:00
Tamara Dahlgren
1e9c46296c perl testing: refactor stand-alone testing into base class (#43044) 2024-03-12 15:39:18 +01:00
Valentin Volkl
48183b37be gaudi: add py-yaml dependency (#43116)
Needed to parse job options in YAML format (https://gitlab.cern.ch/gaudi/Gaudi/-/blob/master/GaudiKernel/python/GaudiKernel/ProcessJobOptions.py#L528).
In principle this is optional, but it is needed for test 35 (GaudiKernel.pytest.nose.test_Configurables), which fails without it. There seems to be no harm in adding this as a full dependency.
2024-03-12 08:58:32 -05:00
Oakley Brunt
9a3d248348 py-psyclone and py-fparser: Add new releases (#42744)
* py-psyclone and py-fparser new releases added

* py-fparser: add missing releases for py-psyclone

* py-psyclone: actioned @adamjstewart comments

* py-psyclone: removed py-pytest-pylint

* py-pytest-pylint: added package @ latest version

* py-pytest-pylint: reformatted

* Update var/spack/repos/builtin/packages/py-psyclone/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-pytest-pylint: added build deps and runtime dep versions

* py-pytest-pylint: removed version from setuptools

* py-psyclone: add py-pytest-pylint test dep and alphabetize deps

* Update var/spack/repos/builtin/packages/py-pytest-pylint/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-psyclone: deps ordered

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-12 14:46:48 +01:00
Pierre Blanchard
03e22adb5b sleef: add v3.6 (#42978) 2024-03-12 11:04:17 +01:00
Massimiliano Culpo
5f5fc78236 Update archspec to v0.2.3 (#42854) 2024-03-12 09:31:15 +01:00
Adam J. Stewart
e12a8a69c7 py-neuralgcm: add new package (#43117)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-03-11 21:02:03 -06:00
Christopher Christofi
001af62585 perl-math-symbolic: add required perl-parse-recdescent dependency (#42157)
* perl-math-symbolic: add required perl-parse-recdescent dependency

* perl-math-symbolic: add new runtime dependency
2024-03-11 17:09:51 -06:00
Christopher Christofi
f5e89df6f2 perl-parsetemplate: add new package (#41845)
* perl-parsetemplate: add new package

* perl-parsetemplate: update to latest copyright date and add license
2024-03-11 17:04:35 -06:00
Massimiliano Culpo
ce75adada6 Fix callbacks accumulation when using mixins with builders (#43100)
fixes #43097

Before this PR the behavior of mixins used together with
builders was to mask completely the callbacks defined from
the class coming later in the MRO.

Here we fix the behavior by accumulating all callbacks,
and de-duplicating them later.
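
A minimal sketch of the accumulate-then-de-duplicate idea, assuming a
hypothetical `_callbacks` class attribute rather than Spack's actual internals:

```python
def gather_callbacks(cls):
    seen, ordered = set(), []
    # Visit every class in the MRO instead of stopping at the first one
    # that defines callbacks, so a mixin no longer masks classes that
    # come later in the MRO.
    for klass in cls.__mro__:
        for callback in vars(klass).get("_callbacks", ()):
            if callback not in seen:  # de-duplicate, keep first occurrence
                seen.add(callback)
                ordered.append(callback)
    return ordered
```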
2024-03-11 20:48:21 +01:00
Adam J. Stewart
24d37df1a2 py-climax: add new package (#43121) 2024-03-11 12:47:20 -07:00
Stephen Sachs
a9d294c532 aocl-utils: source sha has changed (#43120)
* aocl-utils: source sha has changed

* aocl-utils: version 4.1 hash updated
2024-03-11 12:45:30 -07:00
Adam J. Stewart
9dcaa56db4 py-onnx-opcounter: add new package (#43119) 2024-03-11 12:42:16 -07:00
Adam J. Stewart
98162aa2e1 py-earth2mip: add new package (#43062)
* py-earth2mip: add new package

* Fix audit tests

* Make things easier to concretize

* Make things easier to concretize

* Fix magics build

* Fix onnx build

* blacken

* cmake > py-cmake

* Fix onnxruntime build

* onnxruntime: remove unknown cmake vars

* Trick bazelisk into using Spack bazel

* Different deps for main

* protobuf: add v3.25.3

* Enforce upper bound on jaxlib version

* py-chex: add v0.1.85

* py-dm-haiku: add v0.0.12

* py-jax: add v0.4.25

* Older jaxlib doesn't build on ppc64le either
2024-03-11 12:28:03 -07:00
Alec Scott
3934df622c fzf: add v0.47.0 (#43113) 2024-03-11 11:58:15 -07:00
tpeterka
dbf5d79557 updated diy/package.py to version 3.6.0 (#43101) 2024-03-11 11:56:26 -07:00
eugeneswalker
97e29e501d e4s ci stacks: add cp2k cpu and gpu specs (#42454)
* e4s ci stacks: add cp2k cpu and gpu specs

* remove non-building cp2k specs
2024-03-11 09:29:51 -07:00
Wouter Deconinck
258c651a8f geant4: new variant timemory (#43111)
* geant4: new variant timemory

* geant4: depends_on timemory@3.2:

* geant4: fix style
2024-03-11 15:50:02 +00:00
downloadico
43ca6da346 ruby: add version 3.3.0 (#43058)
* ruby: add version 3.3.0

* added dependency on libyaml
2024-03-11 08:49:24 -07:00
Adam J. Stewart
9786bd932b Update TensorFlow ecosystem (#41069) 2024-03-11 14:33:16 +01:00
Sinan
c72619d4db package/qgis: add new version (#42888)
* package/qgis: add new version

* improve Qsci.pro

* improve

* fix undefined symbol qsciprinter error

* add import test

* fix bug

* add version 3.36

* [@spackbot] updating style on behalf of Sinan81

* fix long line

* only run import test when +python

* first attempt at stand-alone test

* add TODO

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
Co-authored-by: Sinan81 <Sinan@world>
2024-03-11 05:09:29 -05:00
Mikael Simberg
8ecae17c46 dla-future: Add patch for compilation with newer ROCm versions (#42348) 2024-03-11 10:44:00 +01:00
Massimiliano Culpo
1e47ccb83a Remove dead code (#43114)
* Remove dead code in spack
* Remove dead code in llnl
2024-03-11 00:47:55 -07:00
Jonas Eschle
d6421a69eb fix typo in dependency (#43105) 2024-03-10 18:56:27 -07:00
dependabot[bot]
000dff2fd4 build(deps): bump docker/build-push-action from 5.1.0 to 5.2.0 (#43107)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5.1.0 to 5.2.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](4a13e500e5...af5a7ed5ba)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-10 18:55:44 -07:00
dependabot[bot]
1e413477dd build(deps): bump mypy from 1.8.0 to 1.9.0 in /lib/spack/docs (#43103)
Bumps [mypy](https://github.com/python/mypy) from 1.8.0 to 1.9.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.8.0...1.9.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-10 18:54:58 -07:00
Pranav Sivaraman
8955e63a68 feat: add LLVM v18.1.1 (#43109) 2024-03-10 08:55:39 +01:00
John W. Parent
bf14b424bb proj: correct CMake arg for shared build with proj older than 7.0.0 (#43089)
* proj: correct CMake arg for shared build with proj older than 7.0.0

* Actually use new CMake arg

* Update var/spack/repos/builtin/packages/proj/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
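
A hedged sketch of the version-dependent toggle (option and variant names are
assumed from PROJ's history, not taken from this PR):

```python
def cmake_args(self):
    # PROJ 7.0.0 adopted the standard BUILD_SHARED_LIBS switch; older
    # releases used their own BUILD_LIBPROJ_SHARED option instead.
    old = self.spec.satisfies("@:6")
    name = "BUILD_LIBPROJ_SHARED" if old else "BUILD_SHARED_LIBS"
    return [self.define(name, self.spec.satisfies("+shared"))]
```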
2024-03-09 00:40:27 -07:00
Arne Becker
14209a86a6 perl-bio-eutilities and deps: new packages (#42869)
This adds Spack packages for these Perl distributions:
- Bio::DB::EUtilities and its dependencies:
- Bio::ASN1::EntrezGene
- Bio::Cluster
- Bio::Variation
2024-03-08 10:42:26 -08:00
Arne Becker
b7d9900764 perl-test-warn: update version, dependencies (#42901)
* perl-test-warn: new package
  Adds Test::Warn
* Revive older version
* perl-test-warn: fix URL and deprecate old version
2024-03-08 10:40:20 -08:00
Arne Becker
bc155e7b90 perl-starman and deps: new packages (#42958)
Adds Starman and its dependencies.
Installed OK with build-time tests.
2024-03-08 10:36:20 -08:00
Massimiliano Culpo
65f9ba345f gnupg: add v2.4.5, remove deprecated versions (#43090)
Also updates dependencies:
  libassuan: add v2.5.7
  libgcrypt: deprecate end of life versions
2024-03-08 10:20:55 -08:00
Dave Keeshan
ca49bc5652 verible: Add revision v0.0-3607-g46de0f64 (#43086) 2024-03-08 10:18:39 -08:00
snehring
b84b85a7e0 visit: link against hip when built with vtk-m+rocm (#42452) 2024-03-08 19:09:35 +01:00
John W. Parent
016cdba16f Curl package: update Windows support (#42752)
* Properly specify Curl builder interface for static vs shared curl
  with NMake system to ensure all built curls export expected
  symbols.
* Symlinks curl library build artifact to more idiomatic name for
  FindCurl.cmake implementations and other NMake consumers.
2024-03-08 10:08:12 -08:00
Jonas Eschle
4806e6549f Add package zfit (#42667)
* Add package zfit (WIP)

* Add package zfit

* Add package zfit

* add maintainer

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

* enh: add extras, 0.18.1

* fix: add default

* fix:  typo

* fix:  typo

* chore: cleanup arguments

* chore: cleanup arguments

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* chore: cleanup arguments

* fix:  typo

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-zfit/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-08 11:04:35 -07:00
Adam J. Stewart
c14b277150 py-fiona: add v1.9.6 (#43099) 2024-03-08 10:32:15 -07:00
John W. Parent
919025d9f3 VTK package: use CMake config provide by PROJ (#43088)
PROJ can install its own config file: VTK should prefer that over manual detection.
2024-03-08 10:32:00 -07:00
Adam J. Stewart
52f57c90eb Retiring as PythonPackage maintainer (#43091) 2024-03-08 18:29:24 +01:00
fgava90
ee1fa3e50c Deprecating py-pylint@2.3 as it cannot build with python@3.8: (#43096)
* Deprecating py-pylint@2.3 as it cannot build with python@3.8:

* Style fix

* Removed versions because they can't build with python@3.7

---------

Co-authored-by: Gava, Francesco <francesco.gava@mclaren.com>
2024-03-08 07:43:46 -07:00
jmlapre
772928241b py-cheap-repr: new package (#42944)
* py_cheap_repr: add initial package.py

* Update var/spack/repos/builtin/packages/py-cheap-repr/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cheap-repr: use pypi link instead

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-03-08 06:45:25 -06:00
Martin Lang
7440bb4c36 bigdft-psolver: new versions 1.9.3, 1.9.4 (#42644) 2024-03-08 07:22:15 +01:00
Tom Payerle
c464866deb sfcgal: add v1.5.1, and restrict versions of cgal dependency (#42278)
Added new versions @1.5.1 and @1.4.1 (sfcgal moved from github to gitlab)

Placed restrictions on what versions of cgal are supported for different
sfcgal versions.  These restrictions are based on what was found in the
version history at https://gitlab.com/sfcgal/SFCGAL/-/blob/v1.5.1/NEWS?ref_type=tags
as well as the CMakeLists.txt for different versions.

@1.4 and @1.5 seem to require a specific version of cgal based on CMakeLists.txt

Earlier versions (@1.3.8:1.3.10) claim to support cgal@4.3: (but the Spack recipe
says it did not work until @4.7, so sticking with that as the minimum).  CMakeLists.txt
suggests they support cgal@5 as well, but the version history suggests otherwise.
2024-03-08 07:08:34 +01:00
Tom Payerle
799a8a5090 sfcgal: add custom libs property (#42276) 2024-03-08 07:06:38 +01:00
m-shunji
c218ee50e9 libint: add link option for fujitsu compiler (#42233)
Co-authored-by: inada-yoshie <inada.yoshie@fujitsu.com>
2024-03-08 07:03:18 +01:00
Teague Sterling
8ff7a20320 duckdb: add v0.10.0, overhaul of package recipe (#38922)
* Expand the duckdb package to fix the version number (required for external extensions to work) being pulled from git, and add build-time variants for the built-in extensions. This also changes the build system from CMakePackage to MakefilePackage (as advised by upstream).

* - Reorganized and cleaned up variants
 - Updated the patch to work for 0.10.0+
 - Removed (or made non-default) some unnecessary variants
 - Added ninja as the default generator
 - Set up some shared library dependencies;
2024-03-08 06:32:29 +01:00
Dave Keeshan
e3fe6bc0f7 Add 5.022 and cleanup how autoconf is done (#43085) 2024-03-07 18:29:27 -07:00
Sreenivasa Murthy Kolam
c6fcb1068f Enable tensorflow-2.11 support for ROCm (#38520)
* enable tensorflow-2.11 support for ROCm

* add latest sha for mesa and limit the patches to older versions; similar
changes in #37910 to enable the gitlab-ci pass

* address review comments
2024-03-07 15:23:40 -07:00
Weiqun Zhang
54ac3e72ed amrex: add v24.03 (#43083) 2024-03-07 13:59:29 -08:00
Matthew Thompson
274fbebc4c Update GFE packages (#43082)
* Update GFE packages
* Add pflogger 1.13.1
2024-03-07 13:47:33 -08:00
Tim Haines
d40eb19918 Dyninst: add v13.0.0 (#43068) 2024-03-07 13:43:18 -08:00
dependabot[bot]
31de670bd2 build(deps): bump dorny/paths-filter from 3.0.1 to 3.0.2 (#43000)
Bumps [dorny/paths-filter](https://github.com/dorny/paths-filter) from 3.0.1 to 3.0.2.
- [Release notes](https://github.com/dorny/paths-filter/releases)
- [Changelog](https://github.com/dorny/paths-filter/blob/master/CHANGELOG.md)
- [Commits](ebc4d7e9eb...de90cc6fb3)

---
updated-dependencies:
- dependency-name: dorny/paths-filter
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-07 11:54:24 -08:00
Harmen Stoppels
6c0961549b zlib-ng: 2.1.6 (#43060) 2024-03-07 12:35:36 -07:00
Tim Fuller
c090bc5ebe Drop optional dependencies of Spack (#43081)
Remove dependency on `importlib_metadata` and `pkg_resources`, which can be problematic if the version in PYTHONPATH is incompatible with the interpreter Spack is running under.
2024-03-07 17:52:49 +00:00
renjithravindrankannath
bca4d37d76 hip: update patch PARAMETERS_MIN_ALIGNMENT for 6.0 (#42834) 2024-03-07 18:31:23 +01:00
Sreenivasa Murthy Kolam
9b484d2eea fix build issue with rocm-openmp-extras for 5.4.3 release (#43046)
* fix build issue with rocm-openmp-extras for 5.4.3 release
https://github.com/spack/spack/issues/36930
* fix style errors
2024-03-07 10:23:07 -07:00
Adam J. Stewart
a57b0e1e2d py-lightning: add v2.2.1 (#43042) 2024-03-07 09:22:52 -08:00
Luc Berger
e3cb3b29d9 Kokkos Kernels: adding missing TPLs and pre-conditions (#43043)
* Kokkos Kernels: adding missing TPLs and pre-conditions
  Adding variants and dependencies for rocBLAS and rocSPARSE.
  Also adding a "when=" close to the TPL variants that prevents
  enabling the TPLs in versions of the library when it was not
  yet available.
* Kokkos Kernels: remove comment for better format
* Kokkos Kernels: fix issue with unpacking wrong number of args
  After changing the tpls dictionary we need to update the unpacking
  logic to catch the right number of outputs out of it!
* Kokkos Kernels: updating doc string for tpls var and using f-string
  Improving comment a bit and switching to f-string for more readability.
  When setup to do more testing will try to use f-string in the CMake
  options generation part of the package.
* Style change
2024-03-07 09:00:10 -08:00
Mark W. Krentel
ac48ecd375 meson: add version 1.3.2 (#43036) 2024-03-07 09:56:20 -07:00
Juan Miguel Carceller
0bb20d34db autotools: fix a typo in comment (#43080)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-03-07 16:53:14 +00:00
Eric Berquist
971fda5c33 sst-core: use ncurses for interactive sst-info (#43063)
* sst-core now effectively depends on ncurses

* use --with-curses

* sst-core: update comment about ncurses

* should have curses for build, link, and run
2024-03-07 10:45:47 -06:00
Mikael Simberg
dcc4423a9d pika: Add 0.23.0 (#43077) 2024-03-07 16:06:54 +01:00
runiq
82c380b563 Fix spack find bootstrapping docs (#43074)
Closes #43052.

Maybe moving the argument to the `find` subcommand is a good idea, but I
just wanted to get the docs fix out.

Co-authored-by: Patrice Peterson <patrice.peterson@itz.uni-halle.de>
2024-03-07 14:13:32 +01:00
Adam J. Stewart
8bcf6a31ae ML CI: variants are now required (#42851) 2024-03-07 11:42:13 +01:00
Vanessasaurus
ddd88e266a Automated deployment to update package flux-core 2024-03-07 (#43067) 2024-03-07 11:13:31 +01:00
Martin Aumüller
08c597d83e botan: add v3.3.0, v2.9.14 (#43047) 2024-03-06 09:29:04 -08:00
Martin Aumüller
bf5340755d embree: add v4.3.1 (#43048) 2024-03-06 09:25:22 -08:00
Satish Balay
f8e70a0c96 llvm: add 18.1.0 (#43049) 2024-03-06 17:36:03 +01:00
Tim Fuller
7e468aefd5 Allow loading extensions through python entry-points (#42370)
This PR adds the ability to load spack extensions through `importlib.metadata` entry 
points, in addition to the regular configuration variable.

It requires Python 3.8 or greater to be properly supported.
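
A minimal sketch of entry-point discovery via `importlib.metadata` (the group
name here is hypothetical; the PR defines the one Spack actually registers):

```python
from importlib.metadata import entry_points

def discover_extensions(group="spack.extensions"):
    try:
        eps = entry_points(group=group)      # Python 3.10+ keyword API
    except TypeError:
        eps = entry_points().get(group, [])  # Python 3.8/3.9 dict API
    return [ep.load() for ep in eps]         # import each registered extension
```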
2024-03-06 11:18:49 +01:00
John W. Parent
e685d04f84 netcdf-c package: Skip problematic post install on Windows (#43039)
#42878 adds a post install filter of the netCDFConfig.cmake file
that replaces a valid CMake target on Windows with an invalid one.
Don't do this replacement on Windows.
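
A hedged sketch of the resulting guard (the filter arguments are placeholders;
only the structure is the point):

```python
@run_after("install")
def fix_netcdf_config(self):
    # Skip on Windows, where the CMake target being rewritten is valid.
    if self.spec.satisfies("platform=windows"):
        return
    config = join_path(self.prefix.lib, "cmake", "netCDFConfig.cmake")
    filter_file("<old hdf5 target>", "<correct hdf5 library>", config)
```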
2024-03-05 21:44:36 -07:00
Adam J. Stewart
9d962f55b0 spack.patch: fix type hint circular import (#43041) 2024-03-05 17:26:44 -08:00
renjithravindrankannath
00d3066b97 migraphx: add v6.0.0 & v6.0.2 (#42918)
* Updates for migraphx 6.0.0 & 6.0.2
* Style check error and audit check error fix
* Adding patch for half-include-directory
* The parameter GPU_TARGETS is used from 5.7 in migraphx
* Adding rocmlir dependency in migraphx and 6.0 updates in rocmlir
* Applying upcoming changes to make CK JIT optional and enable
  compilation on Windows in order to build without ck dependency
2024-03-05 15:16:31 -08:00
Martin Lang
5ca0dcecb2 mpfr: missing dependency for version 4.0.1 (#43012)
* mpfr: missing dependency for version 4.0.1
  mpfr 4.0.1 (like 4.0.2) needs autoconf-archive, which provides the
  AX_PTHREAD macro
* autoconf-archive is also required for mpfr@4.0.0
2024-03-05 15:11:31 -08:00
Pranav Sivaraman
fa8fb7903b nvptx-tools: add v2023-09-13 (#40897)
New changes have been made to nvptx-tools that address dropping support
for sm_30 in later CUDA versions (12.0+).

Also refactor gcc to make nvptx-tools a dependency instead of a resource.

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-03-05 22:51:40 +01:00
Adam J. Stewart
f35ff441f2 spack.patch: add type hints (#42811)
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-03-05 13:19:43 -08:00
Arne Becker
9ea9ee05c8 perl-catalyst-view-json: new package (#42981)
Adds Catalyst::View::JSON
2024-03-05 12:04:38 -08:00
Arne Becker
24ddc49c1b perl-bsd-resource: new package (#42982)
Adds BSD::Resource
2024-03-05 12:03:52 -08:00
Arne Becker
2f4266161c perl-chart-gnuplot: new package (#42983)
Adds Chart::Gnuplot
2024-03-05 12:03:00 -08:00
Arne Becker
7bcb0fff7d perl-config-inifiles: new package (#42984)
Adds Config::IniFiles
2024-03-05 12:02:06 -08:00
Arne Becker
cd332c6370 perl-plack-middleware-assets and deps: new packages (#42990)
This adds Plack::Middleware::Assets and its missing deps:
- CSS::Minifier::XS
- JavaScript::Minifier::XS
- Test::DiagINC
2024-03-05 11:59:52 -08:00
Arne Becker
6daf9677f3 perl-plack-middleware-crossorigin: new package (#42991)
Adds Plack::Middleware::CrossOrigin
2024-03-05 11:59:25 -08:00
G-Ragghianti
cb6450977d Removing application of the ldconfig patch because it conflicts with (#43001)
gdrcopy version 2.4.1
2024-03-05 11:57:24 -08:00
Vanessasaurus
bf62ac0769 Automated deployment to update package flux-sched 2024-03-05 (#43005)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-03-05 11:53:29 -08:00
Sergey Kosukhin
0223fe746b serialbox: fix PYTHONPATH (#43034) 2024-03-05 11:48:18 -08:00
Arne Becker
12fba13441 perl-chi-driver-memcached: new package (#43011)
Adds CHI::Driver::Memcached.

Modified perl-chi to make the tests work. Testing modules in perl-chi
were not loaded when testing CHI::Driver::Memcached, so added the "run"
type to these.
2024-03-05 11:46:57 -08:00
Arne Becker
0c44f5a140 perl-html-template: new package (#43017)
Adds HTML::Template
2024-03-05 11:45:14 -08:00
Arne Becker
f4853790c5 perl-catalyst-devel and deps: new packages (#43013)
This add Catalyst::Devel and its missing dependencies:
- Catalyst::Plugin::ConfigLoader
- Catalyst::Plugin::Static::Simple
- File::ChangeNotify
2024-03-05 11:44:53 -08:00
Arne Becker
9ed2e396f4 perl-plack-middleware-deflater: new package (#43014)
Adds Plack::Middleware::Deflater
2024-03-05 11:43:39 -08:00
Arne Becker
3ee6fc937e perl-data-predicate: new package (#43015)
* perl-data-predicate: new package
Adds Data::Predicate
* Added description
2024-03-05 11:42:48 -08:00
Arne Becker
c9b6cc9a58 perl-exporter-auto: new package (#43016)
Adds Exporter::Auto
2024-03-05 11:41:46 -08:00
Arne Becker
58b394bcec perl-list-compare: new package (#43018)
Adds List::Compare
2024-03-05 11:39:05 -08:00
Arne Becker
4d89eeca9b perl-sereal and deps: new packages (#43022)
* perl-sereal and deps: new packages
This adds Sereal and its missing dependencies:
- Sereal::Encoder
- Sereal::Decoder
* Add missing files.
2024-03-05 11:38:24 -08:00
Arne Becker
bfc71e9dae perl-string-approx: new package (#43023)
Adds String::Approx
2024-03-05 11:35:04 -08:00
Arne Becker
f061dcda74 perl-string-numeric: new package (#43024)
Adds String::Numeric
2024-03-05 11:34:03 -08:00
Arne Becker
cc460894fd perl-test-file-contents: new package (#43025)
Adds Test::File::Contents
2024-03-05 11:32:50 -08:00
Arne Becker
5e09660e87 perl-test-mockobject and deps: new packages (#43026)
Adds Test::MockObject and its dependencies.
Installed OK with build-time tests. Added dependencies:
- UNIVERSAL::can
- UNIVERSAL::isa
2024-03-05 11:31:59 -08:00
Arne Becker
5a8efb3b14 perl-test-perl-critic: new package (#43027)
Adds Test::Perl::Critic
2024-03-05 11:30:12 -08:00
Arne Becker
99002027c4 perl-test-pod-coverage and deps: new packages (#43028)
Adds Test::Pod::Coverage and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Pod::Coverage
2024-03-05 11:28:28 -08:00
Arne Becker
a247879be3 perl-tie-ixhash: new package (#43029)
Adds Tie::IxHash
2024-03-05 11:25:59 -08:00
Sergey Kosukhin
7b46993fed eccodes: drop redundant build dependency (#43035) 2024-03-05 11:25:00 -08:00
Arne Becker
dd59f4ba34 perl-text-csv-xs: new package (#43030)
Adds Text::CSV_XS
2024-03-05 11:22:38 -08:00
Arne Becker
18ab14e659 perl-xml-hash-xs: new package (#43031)
Adds XML::Hash::XS
2024-03-05 11:20:00 -08:00
Massimiliano Culpo
28eb5e1bf6 archspec: add v0.2.3 (#43009) 2024-03-05 20:09:46 +01:00
Sergey Kosukhin
c658ddbfa3 icon: add new package (#43037) 2024-03-05 12:03:38 -07:00
Massimiliano Culpo
12963c894f libksba: add v1.6.6, remove deprecated versions (#43006) 2024-03-05 19:30:39 +01:00
Mark W. Krentel
61fa12508f hpcviewer: add version 2024.02 (#42997) 2024-03-05 10:15:32 -08:00
Martin Lang
daf6acef6e py-numpy: patch for AVX512 build flags on Intel Classic Compiler (#43020) 2024-03-05 17:27:05 +01:00
Alex Richert
d30621e787 dakota: make python dependency optional, add v6.19 (#42914)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-05 12:57:13 +01:00
Wouter Deconinck
dd4b365608 container: don't map develop to latest (#42952)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-03-05 10:47:04 +01:00
Massimiliano Culpo
157d47fc5a ASP-based solver: improve reusing nodes with gcc-runtime (#42408)
* ASP-based solver: improve reusing nodes with gcc-runtime

This PR skips emitting dependency constraints on "gcc-runtime",
for concrete specs that are considered for reuse.

Instead, an appropriate version of gcc-runtime is recomputed
considering also the concrete nodes from reused specs.

This ensures that root nodes in a DAG always have a runtime
at a version greater than or equal to that of their dependencies.

* Add unit-test for view with multiple runtimes
* Select latest version of runtimes in views
* Construct result keeping track of latest
* Keep ordering stable, just in case
2024-03-04 22:46:28 -08:00
Massimiliano Culpo
13daa1b692 libgpg-error: add v1.48 (#42872) 2024-03-05 05:56:50 +01:00
Mosè Giordano
f923e650f9 py-pythran: @:0.12.1 is incompatible with python@3.11: (#42994)
Ref: https://github.com/serge-sans-paille/pythran/issues/2101 and https://github.com/scipy/scipy/issues/18390.
2024-03-04 18:28:17 -07:00
Victoria Cherkas
1a1bbb8af2 metkit: add git url (#42867) 2024-03-04 21:15:03 +01:00
Victoria Cherkas
594fcc3c80 metkit: add additional eckit dependency constraint (#42871) 2024-03-04 21:14:09 +01:00
Tom Payerle
76ec19b26e hdf-eos2: support version @3.0, plus assorted fixes (#41782)
1) support for version @3.0
Unfortunately, download seems to require registration now
so using manual_download mechanism for @3:

2) Copying the hdf-eos5 patch from @vanderwb to enable
use of the Spack compiler wrappers instead of h4cc

3) Patching an issue in the hdf-eos2 configure script.  The
script will test for the jpeg and libz libraries, succeed, and
append HAVE_LIBJPEG=1, etc. to confdefs.h, and then abort
because HAVE_LIBJPEG is not set in the running environment.

4) Add some LDFLAGS to the build environment.  Otherwise it
seems to fail to build the test script due to the rpc dependence
in HDF4.
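
Point 4 might look roughly like this in the package; the exact flag is an
assumption (HDF4's RPC symbols typically come from libtirpc):

```python
def setup_build_environment(self, env):
    # Without this, configure-time test programs can fail to link
    # against HDF4 because of its RPC dependence.
    env.append_flags("LDFLAGS", "-ltirpc")
```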
2024-03-04 21:08:19 +01:00
Arne Becker
00baaf868e perl-perl-critic-moose and deps: new packages (#42992)
Adds Perl::Critic::Moose and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Perl::Critic
- Pod::Parser
- Perl::Tidy
- PPI
- PPIx::QuoteLike
- List::SomeUtils
- PPIx::Regexp
- B::Keywords
- PPIx::Utils
- String::Format
- Pod::Spell
- Test::SubCalls
- Test::Object
- Lingua::EN::Inflect
- Hook::LexWrap
2024-03-04 12:54:14 -07:00
Harmen Stoppels
3b06347f65 repo.py: cleanup packages_with_tags (#42980) 2024-03-04 20:50:04 +01:00
Eric Berquist
5b9e207db2 bear: add up to version 3.1.3 (#42993) 2024-03-04 12:17:24 -07:00
psakievich
d6fd9017c4 Document new environment variable expansion in projections (#42963)
Adding docs and test for #42917

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-03-04 12:17:08 -07:00
Sreenivasa Murthy Kolam
913d79238e llvm-amdgpu: remove the openmp variant. (#42807)
Add the rocm-openmp-extras package as a dependency for +openmp on ROCm
2024-03-04 12:16:53 -07:00
Arne Becker
250038fa9b perl-dbix-class and deps: new packages (#42986)
Adds DBIx::Class and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Class::C3::Componentised
- Data::Dumper::Concise
- Config::Any
- Context::Preserve
- Class::Accessor::Grouped
- Module::Find
- SQL::Abstract::Classic
- Class::C3
- SQL::Abstract
- Algorithm::C3
2024-03-04 12:11:38 -07:00
Arne Becker
26c553fce7 perl-net-server-ss-prefork and deps: new packages (#42989)
Adds Net::Server::SS::PreFork and its dependencies.
Installed OK with build-time tests. Added dependencies:
- Server::Starter
- Net::Server
- HTTP::Server::Simple
2024-03-04 12:11:14 -07:00
Arne Becker
e24c242fb7 perl-gzip-faster: new package (#42988)
Adds Gzip::Faster
2024-03-04 12:10:57 -07:00
Arne Becker
ca14ce2629 perl-catalyst-action-rest: new package (#42960)
Adds Catalyst::Action::REST
2024-03-04 10:59:57 -08:00
Arne Becker
44f443946c perl-graphviz and deps: new packages (#42987)
Adds GraphViz and its dependencies.
Installed OK with build-time tests. Added dependencies:
- XML::XPath
2024-03-04 11:58:12 -07:00
Arne Becker
6e6bc89bda perl-catalyst-action-renderview and deps: new packages (#42961)
Adds Catalyst::Action::RenderView and its dependencies.
Installed OK with build-time tests.
2024-03-04 10:57:50 -08:00
Arne Becker
8714ea6652 perl-catalyst-component-instancepercontext: new package (#42962)
Adds Catalyst::Component::InstancePerContext
2024-03-04 10:56:08 -08:00
Sinan
df92f0a7d4 imagemagick: add v7.0.9:7.1.1-29 (#42968)
* package/imagemagick add new version, improve
* confirmed that the build fails when libsm is missing on linux

---------

Co-authored-by: Sinan81 <Sinan@world>
2024-03-04 10:52:56 -08:00
Alec Scott
d24b91157c restic: add v0.16.4 (#42956) 2024-03-04 19:52:20 +01:00
Alec Scott
1a0f77388c direnv: add v2.34.0 (#42955) 2024-03-04 19:51:47 +01:00
Harmen Stoppels
34571d4ad6 linux-pam: add missing libtirpc dep (#42976) 2024-03-04 19:29:25 +01:00
Rocco Meli
a574f40732 add namd 3.0b6 (#42979) 2024-03-04 19:06:03 +01:00
Arne Becker
d4ffe244af perl-chi and deps: new packages (#42959)
* perl-chi and deps: new packages

Adds CHI and its dependencies.
Installed OK with build-time tests.

* Add license
2024-03-04 08:59:33 -08:00
AMD Toolchain Support
e08e66ad89 AOCL: add v4.2.0 (#42920)
* AOCL: add v4.2.0
   Co-authored-by: Phil Tooley <phil.tooley@amd.com> and
                vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
* Review comments for spack community PR #42920

---------

Co-authored-by: Phil Tooley <phil.tooley@amd.com> and vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-03-04 08:43:27 -08:00
Vanessasaurus
0543710258 flux-core: flux-security dependency with build/link (#42985) 2024-03-04 16:48:12 +01:00
Harmen Stoppels
5d994e48d5 modules: allow autoload: run, like in environment views (#42743) 2024-03-04 08:49:45 +01:00
Harmen Stoppels
d1fa23e9c6 versions: fix typing problems (#42903)
Fix the type declaration of VersionList.versions.

Fix further problems exposed by that fix.
2024-03-04 08:38:54 +01:00
Isaac Corley
f1db8b7871 torchgeo v0.5.2 (#42967) 2024-03-03 13:57:39 -07:00
Axel Huebl
c5cca54c27 Fix mgard: OpenMP on AppleClang (#42933)
macOS AppleClang does not provide OpenMP by default with XCode.
Use LLVM's OpenMP to fix compile errors of mgard with OpenMP (default).
2024-03-03 08:57:24 +01:00
Arne Becker
a9c1648db8 perl-log-dispatch-filerotate and dep: new package (#42965)
Adds Log::Dispatch::FileRotate and its dependency:
- Log::Dispatch

Installed OK, build-time tests ran successfully.
2024-03-03 00:40:27 -07:00
Arne Becker
3bd911377e perl-catalyst-plugin-cache: new package (#42964)
Adds Catalyst::Plugin::Cache
2024-03-02 23:34:26 -07:00
otsukay
fcb2f7d3aa Remove unnecessary if statements, which are harmful since +blas+lapack variants have been removed. (#42936)
Co-authored-by: Yuichi Otsuka <otsukay@riken.jp>
2024-03-02 22:13:47 -07:00
Arne Becker
a8a9e0160a perl-test-time-hires: new package (#42957)
Adds Test::Time::HiRes
2024-03-02 18:10:08 -08:00
jfavre
9ca6aaeafd libcatalyst: add fortran & python variants (#42941)
* Update package.py

Adding two variants 'fortran' and 'python' to enable language wrappers

* Update package.py

remove extra space
2024-03-02 17:55:46 -08:00
Adam J. Stewart
aed5c39312 py-numpy: add v1.26.4 (#42515) 2024-03-02 21:56:43 +01:00
Chris Marsh
eb36bb2a8f netcdf-c package: correct library names (#42878)
An incorrect hdf5 library name is added to pkg-config and CMake config
files when netcdf-c is built with CMake.
2024-03-02 11:05:59 -08:00
dependabot[bot]
8dcf860888 build(deps): bump codecov/codecov-action from 4.0.2 to 4.1.0 (#42860)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.0.2 to 4.1.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](0cfda1dd0a...54bcd8715e)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-02 10:39:09 -08:00
Arne Becker
a5b7cb6e6f perl-test-xml-simple and deps: new packages (#42873)
* perl-test-longstring: New package

Adds Test::LongString

* perl-test-xml-simple: new package

- Adds perl-test-xml-simple
2024-03-02 10:37:35 -08:00
Arne Becker
34c0bfefa6 perl-test-xml and deps: new packages (#42870)
- Adds perl-test-xml and dependency:
- Adds perl-xml-semanticdiff
2024-03-02 10:35:56 -08:00
Elliott Slaughter
ea5db048f3 legion: Add 23.09.0 and 23.12.0, remove control_replication. (#42915)
* legion: Add 23.09.0 and 23.12.0, remove control_replication.

The branch control_replication has been merged to master and should no
longer be used.

* flecsi: Switch to Legion master branch.

Legion control_replication has been merged to master.

* Fix Legion 23.09.0 and 23.12.0 build for ROCm 6.
2024-03-02 10:32:44 -08:00
Adam J. Stewart
e68a17f2c6 Bazel: fix patching of 4.2.4 (#42938) 2024-03-02 10:29:18 -08:00
Brian Vanderwende
4af9ec3d8a Add ncvis package and add option to wxwidgets (#38204)
* Add ncvis and opengl option for wxwidgets

* Style fixes for ncvis

* Replace in with satisfies for opengl constraint

Co-authored-by: Alec Scott <alec@bcs.sh>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2024-03-02 10:26:36 -08:00
Zachary Newell
eb90e2c894 Added NCCL version 2.20.3-1 (#42951)
Added NCCL version 2.20.3-1 to the package.py. I tested compiling it and running nccl-tests on Ubuntu 22.04.
2024-03-02 06:59:17 -06:00
Mosè Giordano
763f444d63 py-numba: add tbb variant (#42930)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-03-01 02:03:39 -07:00
Chris Marsh
6614c4322d Seacas: fix patch hash (#42934) 2024-03-01 01:48:36 -07:00
Chris Marsh
983422facf libogg does not build a shared library with cmake (#42877)
* when built with cmake, libogg does not build a shared library by default. This resolves that

* spack style fixes

* Clean up imports

* enforce +pic when +shared
2024-02-29 21:12:27 -06:00
Ye Luo
d0bdd66238 Add Quantum ESPRESSO 7.3.1 (#42927) 2024-02-29 20:08:25 -07:00
Tim Fuller
3a50d32299 Show extension commands with spack -h (#41726)
* Execute `args.help` after setting main options so that extension commands will show with `spack -h`

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2024-02-29 16:51:42 -08:00
psakievich
50ee3624c0 Support environment variable expansion inside module projections (#42917) 2024-02-29 16:49:37 -08:00
afzpatel
2840cb54b5 initial commit to fix ck gpu targets cmake arg (#42924) 2024-02-29 15:48:07 -06:00
eugeneswalker
5c482d0d7e reduce size of e4s to deal with large rebuild artifact (#42884) 2024-02-29 13:44:01 -07:00
joscot-linaro
f3ad990b55 linaro-forge: added 23.1.2 version (#42922) 2024-02-29 12:28:03 -08:00
Victoria Cherkas
977603cd96 Update fdb package.py with libs (#42874)
* Update fdb package.py with libs
* Formatting
2024-02-29 12:23:37 -08:00
Wanlin Wang
1f919255fd Update riscv-gnu-toolchain package.py (#42893)
* Added one compiler type named 'musl'.
  Added a variant named 'multilib'.
  Added a variant named 'cmodel'.
  Added several versions.
* aarch64 is not supported.
2024-02-29 12:11:40 -08:00
Adam J. Stewart
5140a9b6a3 py-keras: add v3.0.5 (#42697) 2024-02-29 17:56:03 +01:00
Chris Marsh
1732ad0af4 vtk: Update proj dependency (#42797)
* Update proj dependency to enable newer proj usage

* Allow for any proj version
2024-02-29 06:07:21 -08:00
dependabot[bot]
e7510ae292 build(deps): bump docker/setup-buildx-action from 3.0.0 to 3.1.0 (#42883)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.0.0 to 3.1.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](f95db51fdd...0d103c3126)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-29 14:48:24 +01:00
Harmen Stoppels
0c224ba4a7 libevent: remove autotools build deps again (#42908)
The deps were added in #40945 to make it work on macOS 11, because the
old configure scripts only detect macOS 10. Apparently people reported the
autoreconf script caused issues, later fixed in #41057. However, also
with that fix, things are incorrect, cause people now report:

```
libtool: You should recreate aclocal.m4 with macros from libtool 2.4.7
libtool: and run autoconf again.
```

HOWEVER, all of this is unnecessary: the underlying issue was already
fixed long ago. It regressed at some point, but the fix has been back
in place since #41205.
2024-02-29 09:57:21 +01:00
Terry Cojean
86b4a867ef ginkgo: add PAPI SDE support (#39425)
Signed-off-by: Terry Cojean <terry.cojean@kit.edu>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-29 06:04:34 +01:00
Arne Becker
6049e5f6eb perl-readonly-xs: new package (#42897)
This adds Readonly::XS. Since this module cannot be used by itself, the
Spack package comes with a test override. This anticipates that the perl
builder will one day have a generic standalone module usage test.
2024-02-28 14:27:22 -08:00
Arne Becker
0339690a59 perl-test-json, perl-json-any: New packages (#42896)
* perl-test-json: New package
  Adds Test::JSON
* Adds perl-json-any
2024-02-28 14:25:34 -08:00
Arne Becker
2bae1b7e00 perl-test-xpath: New package (#42895)
Adds Test::XPath
2024-02-28 14:23:41 -08:00
Arne Becker
ae5b605f99 perl-uri-find: New package (#42894)
Adds URI::Find
2024-02-28 14:22:19 -08:00
Arne Becker
35898d94d2 perl-net-cidr-lite: new package (#42898)
* perl-net-cidr-lite: new package
   Adds Net::CIDR::Lite
* Add license
2024-02-28 14:20:52 -08:00
Arne Becker
7e00bd5014 perl-mojolicious: new package (#42899)
Adds Mojolicious
2024-02-28 14:16:36 -08:00
Arne Becker
1f3aefb0a3 perl-cache-memcached and deps: new packages (#42911)
Adds Cache::Memcached and its dependencies.
Installed OK with build-time tests.
2024-02-28 14:02:38 -08:00
Harmen Stoppels
d4601d0e53 Unit tests: skip tests that intermittently fail on Windows (#42909) 2024-02-28 14:00:09 -08:00
Tom Payerle
935660e3d5 mysql: explicity cast python command to str in _fix_dtrace_shebang() (#40781)
This should fix issue #40780

We explicitly cast self.spec["python"].command to str in the filter_file
call in _fix_dtrace_shebang to avoid the error
==> Error: TypeError: expected str, bytes or os.PathLike object, not Executable

Not sure why the error is appearing (is it only for specific python versions, etc?),
but the fix should be quite safe.
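
The change itself is essentially one explicit cast, roughly (the filtered path
here is illustrative):

```python
# self.spec["python"].command is an Executable; filter_file wants a str.
python = str(self.spec["python"].command)
filter_file("^#!/usr/bin/python", f"#!{python}", "scripts/dtrace_shebang")
```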
2024-02-28 13:00:51 -08:00
Alec Scott
17bfc41841 bison: remove unnecessary deps, add variant for colored output (#42209) 2024-02-28 12:32:40 -08:00
Arne Becker
49b38e3a78 perl-yaml-syck: New package (#42892)
Adds YAML::Syck
2024-02-28 12:26:09 -08:00
Arne Becker
66f078ff84 perl-catalyst-runtime and deps: new packages (#42886)
* perl-catalyst-runtime and deps: new packages
  This add Perl Catalyst::Runtime and its missing dependencies.
  Adds:
  - perl-catalyst-runtime
  - perl-apache-logformat-compiler
  - perl-cgi-simple
  - perl-cgi-struct
  - perl-class-c3-adopt-next
  - perl-cookie-baker
  - perl-data-dump
  - perl-devel-stacktrace-ashtml
  - perl-filesys-notify-simple
  - perl-getopt-long-descriptive
  - perl-hash-multivalue
  - perl-http-body
  - perl-http-entity-parser
  - perl-http-headers-fast
  - perl-http-multipartparser
  - perl-moosex-emulate-class-accessor-fast
  - perl-moosex-getopt
  - perl-moosex-methodattributes
  - perl-moosex-role-parameterized
  - perl-path-class
  - perl-plack
  - perl-plack-middleware-fixmissingbodyinredirect
  - perl-plack-middleware-methodoverride
  - perl-plack-middleware-removeredundantbody
  - perl-plack-middleware-reverseproxy
  - perl-plack-test-externalserver
  - perl-posix-strftime-compiler
  - perl-stream-buffered
  - perl-string-rewriteprefix
  - perl-test-mocktime
  - perl-test-tcp
  - perl-test-time
  - perl-test-trap
  - perl-tree-simple
  - perl-tree-simple-visitorfactory
  - perl-uri-ws
  - perl-www-form-urlencoded
2024-02-28 12:17:45 -08:00
AMD Toolchain Support
304a63507a AOCC: add v4.2.0 (#42891)
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-02-28 20:24:57 +01:00
Jonas Eschle
c7afc0eb5f Upgrade TensorFlow Probability with newer versions (#42673)
* enh: add newer versions

* enh: add newer versions

* format

* fix typo

* Update package.py

* make jax and TF optional dependencies

* style fix

* remove dependency

* remove old TFP version

* fix:  style
2024-02-28 12:29:23 -06:00
kwryankrattiger
57cde78c56 ParaView Release Candidate 5.12.0-RC3 (#42654) 2024-02-28 09:41:10 -08:00
Arne Becker
b01f6308d5 perl-json-xs and deps: new packages (#42904)
Adds JSON::XS and its deps:
- Canary::Stability
- Types::Serialiser
2024-02-28 09:30:46 -08:00
eugeneswalker
13709bb7b7 e4s: new packages: glvis, laghos (#42847)
* e4s: new packages: glvis, laghos

* gl: require: osmesa

* be explicit: glvis ^llvm so that llvm-amdgpu not chosen

* glvis fails on oneapi stack due to issue 42839
2024-02-28 09:26:53 -08:00
Harmen Stoppels
661ae1f230 versions: simplify list if union not disjoint (#42902)
Spack merges ranges and concrete versions if they have non-empty
intersection. That is not enough for adjacent version ranges.

This commit ensures that disjoint ranges in version lists are simplified
if their union is not disjoint:

```python
"@1.0:2.0,2.1,2.2:3,4:6" # simplifies to "@1.0:6"
```
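
An integer-interval analogue of the new behavior (a sketch, not Spack's
implementation):

```python
def simplify(ranges):
    merged = []
    for lo, hi in sorted(ranges):
        # Merge when the union with the previous range is contiguous,
        # i.e. the two ranges overlap or are adjacent.
        if merged and lo <= merged[-1][1] + 1:
            merged[-1][1] = max(merged[-1][1], hi)
        else:
            merged.append([lo, hi])
    return merged

assert simplify([(10, 20), (21, 21), (22, 30), (40, 60)]) == [[10, 30], [40, 60]]
```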
2024-02-28 16:33:25 +01:00
Sinan
287e1039f5 package_py_systemd_python_improve (#42865)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2024-02-28 07:56:17 -06:00
Jonas Eschle
015dc4ee2e Add package zfit interface (#42666)
* Add package zfit interface

* add maintainer

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of jonas-eschle

* Update var/spack/repos/builtin/packages/py-zfit-interface/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-28 07:52:40 -06:00
Axel Huebl
46165982b1 C-Blosc2: Fuzzer Tests (#42881)
The fuzzer tests are a bit flaky and have linker issues on
clang. We should generally only build them when testing.
2024-02-27 21:08:33 -07:00
eugeneswalker
c9a111946e e4s oneapi: remove outdated package preferences (#42875) 2024-02-27 14:35:06 -08:00
renjithravindrankannath
62160021c1 Adding dependency of roctracer-dev and patch in miopen-hip (#42637) 2024-02-27 14:52:47 -07:00
Tom Payerle
3290e2c189 openexr: Add custom libs property (#42274)
Libraries for openexr are named libOpenEXR*.so, etc., so the default libs
handler in spec does not find them.

Add a custom libs property to address this.

Partial fix for #42273

Co-authored-by: payerle <payerle@users.noreply.github.com>
2024-02-27 10:45:29 +01:00
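A custom ``libs`` property for a case like this is usually a thin wrapper around
Spack's ``find_libraries`` helper; a hedged sketch (excerpt only, the real
package.py may differ):

```python
from spack.package import *

class Openexr(CMakePackage):  # illustrative excerpt, not the full package
    @property
    def libs(self):
        # the default handler searches for "libopenexr*"; the installed
        # libraries are actually named libOpenEXR*.so, so look for those
        return find_libraries("libOpenEXR*", root=self.prefix, recursive=True)
```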
George Young
2a9fc3452a regtools: add new package (#42852)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-27 10:44:34 +01:00
YI Zeping
2ea8a2e6ae btop: add cmake version restriction (#42835)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-27 10:08:22 +01:00
Lydéric Debusschère
fe4a4ddcf9 bazel: allow offline build of major versions 5 and 6 (#41575)
* bazel: allow offline build of major versions 5 and 6; add variant download_data

* bazel: add maintainer LydDeb

* bazel: install offline only; remove variant download_data

* bazel: fix variable name: resource_dico --> resource_dictionary

* bazel: fix style

* bazel: fix the build of version 4

* bazel: add comment about resources

* bazel: access to resource stages with self.stage

* bazel: add except to solve AttributeError: 'Stage' object has no attribute 'resource'

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2024-02-27 03:03:25 -06:00
Juan Miguel Carceller
c45714fd3c delphes: use the same C++ standard as in ROOT (#42816)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-02-27 09:54:26 +01:00
Juan Miguel Carceller
523d12d9a8 garfieldpp: Add version 5.0 (#42817)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-02-27 09:54:03 +01:00
Howard Pritchard
5340e0184d Open MPI: adjust pmix dependency for 5.0.x (#42827)
For various reasons, the pmix dependency of 5.0.2 had to be advanced to at
least pmix 4.2.4. Versions 5.0.1 and 5.0.0 can also build with pmix 4.2.4 or newer.

related to #42651

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-02-27 09:49:29 +01:00
AMD Toolchain Support
bca7698138 openfoam: add mpfr search paths (#42779)
Co-authored-by: Branden Moore <branden.moore@amd.com>
2024-02-27 09:37:48 +01:00
Peter Scheibel
5c26ce5385 skip test which is causing spurious failures on Windows (#42832) 2024-02-27 09:36:10 +01:00
Eisuke Kawashima
02137dda17 eigenexa: add 2.7–2.12 (#38170) 2024-02-27 09:23:08 +01:00
eugeneswalker
4abac88895 e4s ci: use ubuntu 22.04 images (#42843) 2024-02-27 01:12:53 -07:00
stepanvanecek
79c2a55e00 gpuscout: new package (#42761)
Co-authored-by: Stepan Vanecek <stepan@Stepans-MBP.fritz.box>
2024-02-27 09:05:09 +01:00
Carsten Uphoff
71c169293c double-batched: add v0.5.0 (#42850)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-02-27 08:59:42 +01:00
Wouter Deconinck
bcc5ded205 dd4hep: new version 1.28 (#42846) 2024-02-27 08:50:01 +01:00
Kensuke WATANABE
379a5d8fa0 root: add dependent package required for build time tests (#42849) 2024-02-27 08:43:08 +01:00
Juan Miguel Carceller
d8c2782949 bdsim: use the same C++ standard as in ROOT, add a patch (#42031)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-02-27 08:34:42 +01:00
dependabot[bot]
6dde6ca887 build(deps): bump pytest from 8.0.1 to 8.0.2 in /lib/spack/docs (#42861)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.0.1 to 8.0.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.0.1...8.0.2)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-27 08:30:03 +01:00
downloadico
8f8c262fb3 picard: add version 3.1.1 (#42862) 2024-02-27 08:29:06 +01:00
afzpatel
93b8e771b6 rocm-gdb: add v6.0.2 (#42855) 2024-02-27 08:22:56 +01:00
Todd Gamblin
48088ee24a refactor: add type annotations and refactor solver conditions (#42081)
Refactoring `SpackSolverSetup` is a bit easier with type annotations, so I started
adding some. This adds annotations for the (many) instance variables on
`SpackSolverSetup` as well as a few other places.

This also refactors `condition()` to reduce redundancy and to allow
`_get_condition_id()` to be called independently of the larger condition
function.


Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-26 22:26:01 +00:00
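As a generic illustration of the pattern (the attribute names below are hypothetical,
not the solver's real ones), annotating instance variables up front documents what the
setup object accumulates while generating facts:

```python
from typing import Dict, List, Optional

class SolverSetupSketch:
    """Hypothetical stand-in for SpackSolverSetup, for illustration only."""

    def __init__(self) -> None:
        # each annotation tells both the reader and mypy what accumulates here
        self.declared_versions: Dict[str, List[str]] = {}
        self.variant_values: Dict[str, List[str]] = {}
        self.current_package: Optional[str] = None
```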
Mikael Simberg
c7df258ca6 Update camp missing headers patch to be applied with all compilers (#42857) 2024-02-26 12:30:01 -05:00
Erik Heeren
b8e8fa2dcd py-find-libpython: 0.3.1 (#42853)
* py-find-libpython: 0.3.1

* py-find-libpython: sort versions
2024-02-26 10:07:51 -07:00
Adam J. Stewart
8e885f4eb2 ImageMagick: fewer dependencies on macOS (#42739) 2024-02-26 17:45:29 +01:00
Miranda Mundt
116308fa17 py-pyomo: add v6.7.1 (#42795)
* Update Pyomo spack package for 6.7.1 release

* Apply changes from @adamjstewart

* Update sphinx+Pyomo versions

* Whoops - typo
2024-02-26 10:14:36 -06:00
Adam J. Stewart
5eb4d858cb Update PyTorch ecosystem (#42819) 2024-02-25 19:20:25 -08:00
eugeneswalker
8dd5f36b68 e4s external rocm ci: use ubuntu 22 image with rocm 5.7.1 (#42842)
* e4s external rocm ci: use ubuntu 22 image with rocm 5.7.1

* comment out slate+rocm due to build error
2024-02-25 17:50:56 -08:00
eugeneswalker
e3ce3ab266 e4s ci: add py-mpi4py, py-numba (#42845) 2024-02-25 17:23:39 -08:00
Jeremy Fix
0618cb98d1 py-ipyvuetify: new package (#42836)
* py-ipyvuetify: new package

* Limit py-jupyter-packing version to 0.7.x

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix py-jupyterlab version and type

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix py-ipyvue version range to exclude 2

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* rm py-wheel, already considered for PythonPackage

* fix: pynpm only required for build, reorder dependencies as in the pyproject.toml

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-25 07:34:23 -06:00
Maciej Wójcik
3b4a27ce7b snakemake: new version with plugins (#42713)
* snakemake: add Snakemake 8 with dependencies

* snakemake: add missing description

* Whitespace

* Whitespace

* Whitespace

* Whitespace

* py-conda-inject: add constraint for Python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-azure-batch: add constraint for Python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-cluster-generic: add constraint for Python

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake: add upper bound for Python

* py-snakemake-executor-plugin-drmaa: specify dependency type

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-googlebatch: correct dependency version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-executor-plugin-tes: correct dependency version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-snakemake-storage-plugin-s3: reorder

* snakemake: remove newly added variants

* snakemake: remove newly added variants

* snakemake: remove newly added variant

* snakemake: update version

* snakemake: update version

* snakemake: whitespace

* py-snakemake-storage-plugin-s3: update version

* snakemake: use newer version

* snakemake: whitespace

* snakemake: update interfaces

* py-snakemake-storage-plugin-gcs: link issue

* snakemake: update versions

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-25 03:47:38 -07:00
Veselin Dobrev
07afbd5619 [laghos] Add a patch for MPI_Session (#42841) 2024-02-24 16:07:59 -07:00
Pranav Sivaraman
5d8cd207ec zoxide: new package (#42840)
* feat: zoxide package

* Apply suggestions from code review

Co-authored-by: Alec Scott <alec@bcs.sh>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2024-02-24 14:53:08 -07:00
Adam J. Stewart
3990589b08 py-lightly: add v1.5.0 (#42820) 2024-02-24 12:27:53 -08:00
dependabot[bot]
38d821461c build(deps): bump codecov/codecov-action from 4.0.1 to 4.0.2 (#42831)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.0.1 to 4.0.2.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](e0b68c6749...0cfda1dd0a)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-24 12:26:30 -08:00
Adam J. Stewart
80b13a0059 py-pandas: add v2.2.1 (#42838) 2024-02-24 12:20:55 -08:00
Maciej Wójcik
ab101d33be py-azure-...: add new versions (#42742)
* py-azure-core: add new versions

* py-azure-identity: add new versions, flatten dependencies

* py-azure-storage-blob: add new versions

* py-msal: add new versions

* py-azure-...: black is terrible

* py-azure-storage-blob: correct dependency

* Reorder

* Reorder
2024-02-24 12:29:05 -06:00
Sinan
cc742126ef package/libspatialite: add conflict, new version (#42573)
* package/libspatialite: add conflict, new version

* depends on new version of freexl

* fix bug

* remove manual download stuff

* improve style

* first deprecate

* [@spackbot] updating style on behalf of Sinan81

* get rid of conflict, reorder deps

* remove manual download

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2024-02-24 09:29:46 -06:00
Stephen Hudson
a1f21436a4 libEnsemble: add v1.2.1 (#42828) 2024-02-24 09:27:44 -06:00
Alex Richert
95fdc92c1f Allow awscli-v2 to be installed without examples/ dir (#42773)
* Allow awscli-v2 to be installed without examples/ dir

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update var/spack/repos/builtin/packages/awscli-v2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-24 09:25:59 -06:00
Alex Richert
6680c6b72e libjpeg-turbo: add v2.1.5, update recipe (#37963)
Co-authored-by: Alec Scott <hi@alecbcs.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-23 19:23:52 -07:00
Chris Marsh
74b6bf14b8 netcdf-cxx4: convert to CMake-based build (#42766)
The CMake-based build is anticipated to work in all cases where the
Autotools-based build did, and to address all prior issues with less
maintenance of the package. In detail:

* Fixes #42735 (CMake's find_package helps with linking to proper
  netcdf-c)
* Replaces older Autotools-based build
* All preexisting variants are handled
* Record hdf5 as an explicit dependency (was missing before)
* Add +tests option

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-02-23 15:37:33 -08:00
Vicente Bolea
7c315fc14b proj: apply stdint.h patch in version 8 (#42791)
* proj: apply stdint.h patch in version 8

* Update var/spack/repos/builtin/packages/proj/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 14:48:20 -07:00
John W. Parent
f51c9fc6c3 Windows path handling: change representation for paths with spaces (#42754)
Some builds on Windows break when encountering paths with spaces. This
reencodes some paths in Windows 8.3 filename format (when on Windows):
this serves as an equivalent identifier for the file, but in a form that
does not have spaces.

8.3 filenames are also truncated in length, which could be helpful, but
that is not the primary intended purpose of using this format.

Overall

* nmake/msbuild packages do this generally for the install prefix
* curl/perl require additional modifications (as written now, each package
  may require calls to `windows_sfn` to work when the Spack
  root/install/staging prefixes contain spaces)

Some items for follow-up:

* Spack itself does not create paths with spaces "on top" of whatever
  the user configures or where it is placed (e.g. the Spack root, the
  staging directory, etc.), so it might be possible to edit some of these
  paths once and avoid a proliferation of individual `windows_sfn`
  calls in individual packages.
* This approach may result in the insertion of 8.3-style paths into
  build artifacts (on Windows), handling this may require additional
  bookkeeping (e.g. when relocating).
2024-02-23 13:30:11 -08:00
John W. Parent
3e713bb0fa vtk package: support vtk@9 on Windows (#42751) 2024-02-23 11:58:58 -08:00
Peter Scheibel
55bbb10984 Alert user to failed concretizations (#42655)
With this change an error message is emitted when the result of concretization 
is in an inconsistent state.
2024-02-23 20:15:25 +01:00
Simon Pintarelli
6e37f873f5 tiled-mm: add v2.3 (#42829) 2024-02-23 19:53:51 +01:00
Massimiliano Culpo
d49cbecf8c Cleanup spack.schema (#42815)
* Move spec_list into its own file, instead of __init__.py

* Remove spack.schema.spack

This module was introduced in #33960. It's almost an exact duplicate of
spack.schema.env, and is not used anywhere.

* Fix typo
2024-02-23 10:23:54 -08:00
Dr Marco Claudio De La Pierre
fe07645e3b Update/add packages in the Nextflow ecosystem (#42776)
Signed-off-by: Dr Marco Claudio De La Pierre <marco.delapierre@gmail.com>
2024-02-23 17:53:46 +01:00
Ben Morgan
c5b8d5c92a geant4: new version v11.2.1 (#42822) 2024-02-23 08:07:35 -05:00
Jeremy Fix
2278816cb3 py-jwcrypto: new package (#42783)
* adds the spack recipe for py-jwcrypto

* split long line to fix E501

* Specify versions for py-cryptography and py-typing-extensions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:58:11 -07:00
Jeremy Fix
4bd305b6d3 py-reacton: new package (#42794)
* adds the spack recipe for reacton python package

* Fix versions for ipywidgets and typing-extensions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:48:06 -07:00
Erik Heeren
a26e0ff999 py-find-libpython: new package (#42804)
* py-find-libpython: new package

* Update var/spack/repos/builtin/packages/py-find-libpython/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:42:58 -07:00
Jeremy Fix
4550fb83c0 py-ipyvue: new package (#42789)
* add spack recipe for ipyvue

* Specify version for ipywidgets

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:37:48 -07:00
Todd Gamblin
5b96c2e89f py-sympy: add version 1.12 (#42770) 2024-02-23 05:42:49 -06:00
Sinan
18c8406091 pdal: fix version range for patch (#42769)
* Update package.py

fix bug.

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:28:41 -06:00
Caetano Melone
bccefa14cb py-codespell: add package (#42694)
* py-codespell: add package

* setuptools-scm conflict

confirmed via https://github.com/codespell-project/codespell/issues/3365

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* reorder dependencies and versions

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 05:07:24 -06:00
Maciej Wójcik
20fc5a174a py-s3transfer, py-boto3, py-botocore: add new versions (#42741)
* py-s3transfer: add new versions

* py-boto3: add new versions

* py-botocore: add new versions

* py-boto3: correct version ranges
2024-02-23 04:18:45 -06:00
Alec Scott
3abcb408d1 py-ansible: add v2.16.3 (#42734)
* py-ansible: add v2.16.3

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add specific python version requirements from setup.cfg

* Add additional ranges for py-setuptools

* Update var/spack/repos/builtin/packages/py-ansible/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-23 04:13:33 -06:00
Maciej Wójcik
a0d97d9294 py-argparse-dataclass: add new package (#42494)
* py-argparse-dataclass: add new package

* Remove obvious dependency
2024-02-23 04:11:21 -06:00
Massimiliano Culpo
0979a6a875 Remove dead code from Environment (#42818)
Environment.concretize_and_add is not used anywhere.
2024-02-23 10:48:07 +01:00
Massimiliano Culpo
98de8e3257 Fix wrong call to a function (#42814) 2024-02-23 06:37:22 +01:00
akimler
23b299c086 matio: add v1.5.26 (#42808) 2024-02-23 06:03:06 +01:00
Massimiliano Culpo
adcd3364c9 elpa: remove deprecated versions (#42802) 2024-02-23 06:02:06 +01:00
Tom Payerle
a2908fb9f7 qb3: add custom libs property (#42275) 2024-02-22 15:13:05 -07:00
John W. Parent
f514e72011 netcdf-c package: fix hdf5 linking on Windows (#42749) 2024-02-22 14:18:34 -07:00
Adam J. Stewart
b61d964eb8 PythonPackage: check purelib for libs/headers (#42602)
* PythonPackage: check purelib for libs/headers

* Update error messages too

* Fix functools.reduce argument order
2024-02-22 13:17:21 -08:00
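The last item refers to ``functools.reduce``'s signature: the function comes first and
the optional initializer last, an easy pair to swap when both are sequences:

```python
from functools import reduce

# reduce(function, iterable, initializer) -- the initializer is the third argument
header_dirs = [["include"], ["include/python3.11"], []]
flattened = reduce(lambda acc, dirs: acc + dirs, header_dirs, [])
print(flattened)  # ['include', 'include/python3.11']
```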
John W. Parent
2066eda3cd Seacas package: add Windows support (#42692) 2024-02-22 13:28:33 -07:00
Alex Richert
d8b186a381 dakota: add boost components (#42659)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-22 13:03:13 -07:00
Adam J. Stewart
d258aec099 GDAL: add v3.8.4 (#42805) 2024-02-22 11:42:14 -08:00
Harmen Stoppels
3d1d5f755f oci: when base image uses Image Manifest Version 2, follow suit (#42777) 2024-02-22 16:33:56 +01:00
Thomas-Ulrich
90778873d1 tandem: update package (#42785) 2024-02-22 15:45:20 +01:00
AMD Toolchain Support
1af57e13fb elpa: fix support for patched version (#42803)
Co-authored-by: Ning Li <ning.li@amd.com>
2024-02-22 15:43:12 +01:00
AMD Toolchain Support
28d25affcc ELPA: Linking fixes for BLAS and OpenMP (#42747)
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
2024-02-22 15:21:00 +01:00
Martin Lang
3ebaf33915 libgd: fix INT_MAX not defined (#42104)
Compiling version 2.2.4 fails (on a Debian system with only a minimal set of packages installed) with an error because `INT_MAX` is undeclared:
```
   263    gd_gd2.c: In function '_gd2GetHeader':
>> 264    gd_gd2.c:212:54: error: 'INT_MAX' undeclared (first use in this function)
   265      212 |                 if (*ncx <= 0 || *ncy <= 0 || *ncx > INT_MAX / *ncy) {
   266          |                                                      ^~~~~~~
   267    gd_gd2.c:87:1: note: 'INT_MAX' is defined in header '<limits.h>'; did you forget to '#include <limits.h>'?
```
2024-02-22 03:23:59 -07:00
Alec Scott
e8d981b405 rust: add v1.76.0 (#42798) 2024-02-22 03:09:31 -07:00
Steven Hahn
02f222f6a3 google benchmark: Add variant with libpfm4 (#42620)
Signed-off-by: Steven Hahn <hahnse@ornl.gov>
2024-02-22 07:29:49 +01:00
James Beal
8345c6fb85 delly2: add v1.2.6 (#42745)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-02-22 07:28:40 +01:00
Xavier Delaruelle
3f23634c46 environment-modules: add version 5.4.0 (#42763) 2024-02-22 07:10:55 +01:00
MatthewLieber
d5766431a0 mvapich: add v3.0 (#42756)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2024-02-22 07:09:14 +01:00
Martin Lang
1388bfe47d bigdft-atlab: add v1.9.3, v1.9.4 (#42643) 2024-02-22 07:05:00 +01:00
dependabot[bot]
579dec3b35 build(deps): bump urllib3 from 2.2.0 to 2.2.1 in /lib/spack/docs (#42757)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.2.0 to 2.2.1.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.2.0...2.2.1)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-22 07:01:45 +01:00
Martin Lang
9608ed9dbd cgal: add v5.5.3 (#42650) 2024-02-22 06:58:03 +01:00
Wouter Deconinck
42b739d6d5 podio: depends_on py-graphviz type run (for podio-vis) (#42787)
The podio-vis tool depends at run-time on py-graphviz, https://github.com/AIDASoft/podio/blob/master/tools/podio-vis#L10.
2024-02-22 06:14:38 +01:00
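In package.py terms this amounts to a run-type dependency (illustrative excerpt, not
the full package):

```python
depends_on("py-graphviz", type="run")  # podio-vis imports graphviz at run time
```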
Matthieu Dorier
91a0c71ed1 nlohmann-json-schema-validator: added version 2.2.0 and 2.3.0 (#42792) 2024-02-22 06:11:29 +01:00
Dom Heinzeller
6ee6fbe56b ecflow: apply ctsapi_cassert.patch for all compilers (#42793) 2024-02-22 06:09:54 +01:00
Mikael Simberg
be4eae3fa8 pika: add sanitizers variant (#42778) 2024-02-22 05:54:33 +01:00
Harmen Stoppels
ad70b88d5f spack gc: do not show uninstalled but needed specs (#42696) 2024-02-22 05:21:39 +01:00
eugeneswalker
c1d230f25f e4s ci stacks: add python packages (#42774)
* e4s ci stacks: add python packages

* comment out failing specs
2024-02-21 20:59:05 -07:00
John W. Parent
4bc52fc1a3 env activate: use Win-compatible print on Windows (#42755)
Use "echo" instead of "printf" on Windows.
2024-02-21 11:02:04 -08:00
John W. Parent
7d728822f0 Windows: fix error with can_symlink check (#42753) 2024-02-21 10:18:25 -08:00
John W. Parent
e96640d2b9 cgns package: don't use MPI wrappers on Windows (#42750) 2024-02-21 10:10:48 -08:00
Alex Richert
e196978c7c Add 'docs' variant to rust-bootstrap (#42768)
* Add 'docs' variant to rust-bootstrap

* remove docs for rust-bootstrap
2024-02-21 11:04:13 -07:00
Harmen Stoppels
de3d1e6c66 rocm: removal of deprecated <5.1 versions (#42676)
The package `aomp` is removed entirely, as it was too outdated to have non-deprecated dependencies.
2024-02-21 14:07:40 +01:00
Massimiliano Culpo
2d8e0825fe binutils: add v2.42 (#42760) 2024-02-20 23:03:51 -08:00
pauleonix
d5c06c4e2c asio: add patches 1.28.2 and 1.28.1 (#42762) 2024-02-20 23:02:43 -08:00
Wouter Deconinck
0d92b07dbd pythia6: deal with dead pythiasix.hepforge.org (#42162)
* pythia6: deal with dead pythiasix.hepforge.org

* pythia6: rm main81.f from CMakeLists.txt

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-02-20 15:30:54 -06:00
Auriane R
6cfcbc0167 Update conflict between stdexec and clang (#42765) 2024-02-20 13:32:54 -07:00
Massimiliano Culpo
f9e9fab2da clingo: add v5.7.1 (#42758) 2024-02-20 07:49:49 -08:00
Massimiliano Culpo
b3f790c259 btop: add v1.3.2 (#42759) 2024-02-20 07:46:35 -08:00
George Young
b5b5130bed pblat: add new package (#42517)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-19 22:28:27 +01:00
Vicente Bolea
d670a2a5ce adios2: update kokkos dependency (#42621)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-02-19 11:54:08 -07:00
Harmen Stoppels
0ee3a3c401 Use relative target in symlinks to modified files in view (#42699) 2024-02-19 16:33:38 +01:00
Dave Keeshan
ad6dc0d103 verible: add v0.0-3539-g9442853c (#42628) 2024-02-19 14:40:23 +01:00
Alex Richert
6e373b46f8 scorep: specify binutils headers and libs (#42656) 2024-02-19 14:35:22 +01:00
Thomas-Ulrich
abe617c4ea hipsycl: update package (#42518) 2024-02-19 14:34:05 +01:00
Satish Balay
a0e80b23b9 DTK: specify MPI compilers (#42592)
Co-authored-by: balay <balay@users.noreply.github.com>
2024-02-19 14:24:28 +01:00
Dom Heinzeller
4d051eb6ff ecflow: fix compilation with Intel classic compilers (#42622) 2024-02-19 14:23:09 +01:00
kinagaki-fj
0d89083cb5 omm-bundle: add new package (#42304) 2024-02-19 14:16:39 +01:00
Alex Richert
23e586fd85 ferret: add support for gcc@10: (#42660) 2024-02-19 14:14:11 +01:00
Richard Berger
c2b116175b kokkos: disable CUDA_MALLOC_ASYNC on cray-mpich (#42661)
Co-authored-by: Daniel Arndt <arndtd@ornl.gov>
2024-02-19 13:52:48 +01:00
Mikael Simberg
a1f90620c3 umpire: depend on camp~rocm when umpire itself has ~rocm (#42701) 2024-02-19 11:58:31 +01:00
Harmen Stoppels
668879141f remove a few redundant calls to setup_run_environment (#42718)
Any package `X` used as `depends_on("x", type="build")` will have
`X.setup_run_environment(env)` called, because it has to be able to
"run" in the build environment.

So there is no point in calling `setup_run_environment` from
`setup_dependent_build_environment`.

Also it's redundant to call `setup_run_environment` in
`setup_dependent_run_environment`, because (a) the latter is called _for
every parent edge_ instead of once per node, and (b) it's only called
after `setup_run_environment` is called anyway. Better to call
`setup_run_environment` once and only once.
2024-02-19 11:43:45 +01:00
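The upshot for package authors: define run-environment changes exactly once. A hedged
sketch (hypothetical package, not from the Spack repo):

```python
from spack.package import *

class Mytool(Package):  # illustrative only
    """Sketch of a package that sets up its run environment once."""

    def setup_run_environment(self, env):
        # Spack already applies this for build-type dependents (they must be
        # able to "run" in the build environment), so there is no need to
        # re-invoke it from setup_dependent_build_environment or
        # setup_dependent_run_environment.
        env.prepend_path("PATH", self.prefix.bin)
```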
Adam J. Stewart
267defd9d3 py-matplotlib: add v3.8.3 (#42698) 2024-02-19 11:40:18 +01:00
Ken Raffenetti
603d3f2283 mpich: Remove invalid pmi option (#42686)
pmi=off is not a valid configuration option. MPICH cannot function
without a PMI library. Fixes #42685.
2024-02-19 11:38:49 +01:00
James Beal
02bfbbe269 Bump for new version of bedtools2 (#42034)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2024-02-19 11:35:26 +01:00
Alec Scott
44d08d2c24 gnupg: make discoverable as external (#42736) 2024-02-19 11:27:39 +01:00
AMD Toolchain Support
c6faab10aa CP2K: fix multiple use of spec["fftw"] (#42724)
fftw object was originally created with spec["fftw:openmp"], but
referencing spec["fftw"] overwrites the 'last_query' in the spec object,
so later use of fftw.libs was not returning the FFTW OpenMP libs.

Also allow the post-install fixup to support amdfftw as well as fftw.

Co-authored-by: Branden Moore <branden.moore@amd.com>
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-02-19 11:21:11 +01:00
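A simplified sketch of the safe pattern (not the exact CP2K code): capture the
OpenMP-flavored libs before any other ``spec["fftw"]`` lookup can reset the cached query:

```python
def fftw_openmp_libs_sketch(spec):
    fftw = spec["fftw:openmp"]    # ":openmp" becomes the spec's last query
    libs = fftw.libs              # capture the OpenMP libs immediately
    # any later spec["fftw"] access overwrites that query, so reuse the
    # captured objects instead of re-indexing the spec
    return libs
```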
Geoffrey Lentner
baa203f115 duckdb: add v0.9.2 (#42374) 2024-02-19 03:02:43 -07:00
Alec Scott
575a33846f bfs: add v3.1.1 (#42740) 2024-02-19 10:54:00 +01:00
Adam J. Stewart
79df065730 py-shapely: add v2.0.3 (#42738) 2024-02-18 21:29:18 +01:00
William Moses
953ee2d0ca Bump enzyme to 0.0.100 (#42626) 2024-02-18 08:48:21 -08:00
Nai-Yuan Chiang
6796b581ed hiop new release, v1.0.3 (#42730) 2024-02-18 08:45:44 -08:00
Christoph Junghans
f49a5519b7 byfl: initial commit (#42731) 2024-02-18 08:43:56 -08:00
Tom Drever
e07a6f51dc Add py-click-option-group (#42678)
* Add py-click-option-group

* Specify dependency versions
2024-02-18 08:22:25 -06:00
Dani
cb73d71cf1 new builtin package: py-biobb-model (#42681)
* new builtin package: py-biobb-model

* Update var/spack/repos/builtin/packages/py-biobb-model/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-18 03:18:03 -06:00
Jonas Eschle
5949fc2c88 add package py-jacobi (#42672)
* add package py-jacobi

* fix:  add description

* fix:  add description

* fix:  add description

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

* Update package.py

* Update var/spack/repos/builtin/packages/py-jacobi/package.py

I don't think that numpy is used in "build"? But not important

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-18 03:17:35 -06:00
Jonas Eschle
fd10cfdebf add package py dotmap (#42665)
* add package py dotmap

* add maintainer

* [@spackbot] updating style on behalf of jonas-eschle

* fix:  add description

* [@spackbot] updating style on behalf of jonas-eschle

* Update package.py

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
2024-02-18 03:16:32 -06:00
Maciej Wójcik
32506e222d py-sysrsync: add new package (#42492)
* py-sysrsync: add new package

* py-sysrsync: specify dependency type

* py-sysrsync: add constraint for Python
2024-02-18 03:15:42 -06:00
Maciej Wójcik
a7d5cc6d68 py-google-...: add new versions and few new packages (#42671)
* py-google-cloud-storage: add new versions

* py-google-api-core: add new versions

* py-proto-plus: add new package

* py-google-api-core: add grpc variant

* py-google-api-core: add grpc variant

* py-google-api-core: add missing prefix

* py-google-cloud-batch: add new package

* py-google-cloud-logging: add new package

* py-google-cloud-appengine-logging: add new package

* py-google-cloud-audit-log: add new package

* py-grpc-google-iam-v1: add new package

* py-proto-plus: remove obvious dependency

* Whitespace

* Whitespace

* py-google-cloud-audit-log: correct conflict

* py-proto-plus: correct dependency type

* Whitespace

* py-google-auth: add new version

* py-google-resumable-media: add new version

* py-google-cloud-storage: constrain version of dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-grpcio-status: use newer version

* py-google-resumable-media: add upper bound of dependency

* Add types of dependencies.

* py-grpcio: add new version

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-18 03:15:02 -06:00
Vanessasaurus
222241f232 flux-core: add uuid (#42635)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-02-18 08:54:11 +01:00
Sinan
aa1820eb5c pdal: new package (#42714)
* new package pdal

* [@spackbot] updating style on behalf of Sinan81

* fix style

* add license

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* [@spackbot] updating style on behalf of Sinan81

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* Update var/spack/repos/builtin/packages/pdal/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* improve dependency spec

* add maintainer

* add conflict

* fix bug

* improve

* improve

* [@spackbot] updating style on behalf of Sinan81

* fix style

* specify cmake dependency version

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
Co-authored-by: Alec Scott <alec@bcs.sh>
2024-02-17 13:40:40 -08:00
George Young
16ea5f68ba cryodrgn: new package @2.3.0 (#42443)
* cryodrgn: new package @2.3.0

* correcting dependency ranges

* correcting dependency ranges

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-02-17 15:07:56 -06:00
Dani
aeec515b4f new builtin package: py-biobb-structure-utils (#42683) 2024-02-17 14:42:10 -06:00
Maciej Wójcik
6e2ec2950b py-kubernetes: add new versions (#42670)
* py-kubernetes: add new versions

* py-oauthlib: add new version
2024-02-17 14:32:20 -06:00
Maciej Wójcik
fe5772898d py-azure-... and py-msrest: add new versions (#42624)
* py-azure-batch: add new versions

* py-azure-core: add new versions

* py-azure-identity: add new versions

* py-azure-mgmt-batch: add new versions

* py-azure-mgmt-core: add new versions

* py-azure-storage-blob: add new versions

* py-msrest: add new versions

* Whitespace

* Whitespace

* py-msrest: add a note

* py-msrest: version-dependent URL

* py-azure-mgmt-batch: correct version of dependency
2024-02-17 14:24:11 -06:00
Alex Leute
384ddf8e93 py-smote-variants: Added package py-smote-variants (#42502)
* py-smote-variants: Added package py-smote-variants

Also added py-minisom and py-metric-learn as dependencies

* py-metric-learn: Added build dependency on setuptools

* py-smote-variants: Added a dependency on py-pytest-runner

As well as a comment about why statistics isn't included

* [@spackbot] updating style on behalf of alex391

---------

Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-02-17 14:20:03 -06:00
Tom Vander Aa
32c2e240f8 py-charm4py: needs Cython<3.0 (#42491)
* py-charm4py: needs Cython<3.0

* Update var/spack/repos/builtin/packages/py-charm4py/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-02-17 14:16:31 -06:00
956 changed files with 17130 additions and 11390 deletions

View File

@@ -22,7 +22,7 @@ jobs:
matrix:
operating_system: ["ubuntu-latest", "macos-latest"]
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{inputs.python_version}}
@@ -43,7 +43,7 @@ jobs:
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044 # @v2.1.0
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab # @v2.1.0
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits

View File

@@ -24,7 +24,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
@@ -62,7 +62,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
@@ -99,7 +99,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
@@ -133,7 +133,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup repo
@@ -158,7 +158,7 @@ jobs:
run: |
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: "3.12"
@@ -182,7 +182,7 @@ jobs:
run: |
brew install tree
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap clingo
run: |
set -ex
@@ -207,7 +207,7 @@ jobs:
runs-on: ubuntu-20.04
steps:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup repo
@@ -250,7 +250,7 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
@@ -287,7 +287,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- name: Setup non-root user
@@ -320,7 +320,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
@@ -338,7 +338,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh

View File

@@ -55,7 +55,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
@@ -96,10 +96,10 @@ jobs:
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226
uses: docker/setup-buildx-action@0d103c3126aa41d772a8362f6aa67afac040f80c
- name: Log in to GitHub Container Registry
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -107,13 +107,13 @@ jobs:
- name: Log in to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@4a13e500e55cf31b7a5d59a38ab2040ab0f42f56
uses: docker/build-push-action@af5a7ed5ba88268d5278f7203fb52cd833f66d6e
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -35,12 +35,12 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@ebc4d7e9ebcb0b1eb21480bb8f43113e996ac77a
- uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36
id: filter
with:
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below

View File

@@ -14,7 +14,7 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c

View File

@@ -1,4 +1,4 @@
black==24.2.0
black==24.3.0
clingo==5.7.1
flake8==7.0.0
isort==5.13.2

View File

@@ -51,7 +51,7 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
@@ -91,14 +91,14 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,linux,${{ matrix.concretizer }}
# Test shell integration
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
@@ -122,7 +122,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: shelltests,linux
@@ -137,7 +137,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -156,7 +156,7 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
@@ -181,7 +181,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044 # @v2.1.0
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab # @v2.1.0
with:
flags: unittests,linux,clingo
# Run unit tests on MacOS
@@ -191,7 +191,7 @@ jobs:
matrix:
python-version: ["3.11"]
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
@@ -216,6 +216,6 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,macos

View File

@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
@@ -69,7 +69,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633 # @v2
- name: Setup repo and non-root user
run: |
git --version

View File

@@ -15,7 +15,7 @@ jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
@@ -33,13 +33,13 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,windows
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
@@ -57,13 +57,13 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@e0b68c6749509c5f83f984dd99a76a1c1a231044
- uses: codecov/codecov-action@54bcd8715eee62d40e33596ef5e8f0f48dbbccab
with:
flags: unittests,windows
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c

View File

@@ -87,7 +87,7 @@ You can check what is installed in the bootstrapping store at any time using:
.. code-block:: console
% spack find -b
% spack -b find
==> Showing internal bootstrap store at "/Users/spack/.spack/bootstrap/store"
==> 11 installed packages
-- darwin-catalina-x86_64 / apple-clang@12.0.0 ------------------
@@ -101,7 +101,7 @@ In case it is needed you can remove all the software in the current bootstrappin
% spack clean -b
==> Removing bootstrapped software and configuration in "/Users/spack/.spack/bootstrap"
% spack find -b
% spack -b find
==> Showing internal bootstrap store at "/Users/spack/.spack/bootstrap/store"
==> 0 installed packages
@@ -175,4 +175,4 @@ bootstrapping.
This command needs to be run on a machine with internet access and the resulting folder
has to be moved over to the air-gapped system. Once the local sources are added using the
commands suggested at the prompt, they can be used to bootstrap Spack.
commands suggested at the prompt, they can be used to bootstrap Spack.

View File

@@ -173,6 +173,72 @@ arguments to ``Makefile.PL`` or ``Build.PL`` by overriding
]
^^^^^^^
Testing
^^^^^^^
``PerlPackage`` provides a simple stand-alone test of the successfully
installed package to confirm that installed perl module(s) can be used.
These tests can be performed any time after the installation using
``spack -v test run``. (For more information on the command, see
:ref:`cmd-spack-test-run`.)
The base class automatically detects perl modules based on the presence
of ``*.pm`` files under the package's library directory. For example,
the files under ``perl-bignum``'s perl library are:
.. code-block:: console
$ find . -name "*.pm"
./bigfloat.pm
./bigrat.pm
./Math/BigFloat/Trace.pm
./Math/BigInt/Trace.pm
./Math/BigRat/Trace.pm
./bigint.pm
./bignum.pm
which results in the package having the ``use_modules`` property containing:
.. code-block:: python
use_modules = [
    "bigfloat",
    "bigrat",
    "Math::BigFloat::Trace",
    "Math::BigInt::Trace",
    "Math::BigRat::Trace",
    "bigint",
    "bignum",
]
.. note::
This list can often be used to catch missing dependencies.
If the list is somehow wrong, you can provide the names of the modules
yourself by overriding ``use_modules`` like so:
.. code-block:: python
use_modules = ["bigfloat", "bigrat", "bigint", "bignum"]
If you only want a subset of the automatically detected modules to be
tested, you could instead define the ``skip_modules`` property on the
package. So, instead of overriding ``use_modules`` as shown above, you
could define the following:
.. code-block:: python
skip_modules = [
    "Math::BigFloat::Trace",
    "Math::BigInt::Trace",
    "Math::BigRat::Trace",
]
for the same use tests.
^^^^^^^^^^^^^^^^^^^^^
Alternatives to Spack
^^^^^^^^^^^^^^^^^^^^^

View File

@@ -73,9 +73,12 @@ are six configuration scopes. From lowest to highest:
Spack instance per project) or for site-wide settings on a multi-user
machine (e.g., for a common Spack instance).
#. **plugin**: Read from a Python project's entry points. Settings here affect
all instances of Spack running with the same Python installation. This scope takes higher precedence than site, system, and default scopes.
#. **user**: Stored in the home directory: ``~/.spack/``. These settings
affect all instances of Spack and take higher precedence than site,
system, or defaults scopes.
system, plugin, or defaults scopes.
#. **custom**: Stored in a custom directory specified by ``--config-scope``.
If multiple scopes are listed on the command line, they are ordered
@@ -196,6 +199,45 @@ with MPICH. You can create different configuration scopes for use with
mpi: [mpich]
.. _plugin-scopes:
^^^^^^^^^^^^^
Plugin scopes
^^^^^^^^^^^^^
.. note::
Python version >= 3.8 is required to enable plugin configuration.
Spack can be made aware of configuration scopes that are installed as part of a python package. To do so, register a function that returns the scope's path to the ``"spack.config"`` entry point. Consider the Python package ``my_package`` that includes Spack configurations:
.. code-block:: console
my-package/
├── src
│   ├── my_package
│   │   ├── __init__.py
│   │   └── spack/
│   │       └── config.yaml
└── pyproject.toml
adding the following to ``my_package``'s ``pyproject.toml`` will make ``my_package``'s ``spack/`` configurations visible to Spack when ``my_package`` is installed:
.. code-block:: toml
[project.entry_points."spack.config"]
my_package = "my_package:get_config_path"
The function ``my_package.get_config_path`` in ``my_package/__init__.py`` might look like
.. code-block:: python
import importlib.resources

def get_config_path():
    dirname = importlib.resources.files("my_package").joinpath("spack")
    if dirname.exists():
        return str(dirname)
.. _platform-scopes:
------------------------

View File

@@ -952,6 +952,17 @@ function, as shown in the example below:
^mpi: "{name}-{version}/{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}"
all: "{name}-{version}/{compiler.name}-{compiler.version}"
Projections also permit environment and spack configuration variable
expansions as shown below:
.. code-block:: yaml
  projections:
    all: "{name}-{version}/{compiler.name}-{compiler.version}/$date/$SYSTEM_ENV_VARIABLE"

where ``$date`` is a Spack configuration variable that expands to the current date in
``YYYY-MM-DD`` format and ``$SYSTEM_ENV_VARIABLE`` is an environment variable defined in the shell.
The entries in the projections configuration file must all be either
specs or the keyword ``all``. For each spec, the projection used will
be the first non-``all`` entry that the spec satisfies, or ``all`` if

View File

@@ -111,3 +111,39 @@ The corresponding unit tests can be run giving the appropriate options to ``spac
(5 durations < 0.005s hidden. Use -vv to show these durations.)
=========================================== 5 passed in 5.06s ============================================
---------------------------------------
Registering Extensions via Entry Points
---------------------------------------
.. note::
Python version >= 3.8 is required to register extensions via entry points.
Spack can be made aware of extensions that are installed as part of a python package. To do so, register a function that returns the extension path, or paths, to the ``"spack.extensions"`` entry point. Consider the Python package ``my_package`` that includes a Spack extension:
.. code-block:: console
my-package/
├── src
│   ├── my_package
│   │   └── __init__.py
│   └── spack-scripting/ # the spack extensions
└── pyproject.toml
adding the following to ``my_package``'s ``pyproject.toml`` will make the ``spack-scripting`` extension visible to Spack when ``my_package`` is installed:
.. code-block:: toml
[project.entry_points."spack.extensions"]
my_package = "my_package:get_extension_path"
The function ``my_package.get_extension_path`` in ``my_package/__init__.py`` might look like
.. code-block:: python
import importlib.resources

def get_extension_path():
    dirname = importlib.resources.files("my_package").joinpath("spack-scripting")
    if dirname.exists():
        return str(dirname)

View File

@@ -250,9 +250,10 @@ Compiler configuration
Spack has the ability to build packages with multiple compilers and
compiler versions. Compilers can be made available to Spack by
specifying them manually in ``compilers.yaml``, or automatically by
running ``spack compiler find``, but for convenience Spack will
automatically detect compilers the first time it needs them.
specifying them manually in ``compilers.yaml`` or ``packages.yaml``,
or automatically by running ``spack compiler find``, but for
convenience Spack will automatically detect compilers the first time
it needs them.
.. _cmd-spack-compilers:
@@ -457,6 +458,48 @@ specification. The operations available to modify the environment are ``set``, `
prepend_path: # Similar for append|remove_path
LD_LIBRARY_PATH: /ld/paths/added/by/setvars/sh
.. note::
Spack is in the process of moving compilers from a separate
attribute to be handled like all other packages. As part of this
process, the ``compilers.yaml`` section will eventually be replaced
by configuration in the ``packages.yaml`` section. This new
configuration is now available, although it is not yet the default
behavior.
Compilers can also be configured as external packages in the
``packages.yaml`` config file. Any external package for a compiler
(e.g. ``gcc`` or ``llvm``) will be treated as a configured compiler
assuming the paths to the compiler executables are determinable from
the prefix.
If the paths to the compiler executable are not determinable from the
prefix, you can add them to the ``extra_attributes`` field. Similarly,
all other fields from the compilers config can be added to the
``extra_attributes`` field for an external representing a compiler.
.. code-block:: yaml
packages:
  gcc:
    externals:
    - spec: gcc@12.2.0 arch=linux-rhel8-skylake
      prefix: /usr
      extra_attributes:
        environment:
          set:
            GCC_ROOT: /usr
  llvm:
    externals:
    - spec: llvm+clang@15.0.0 arch=linux-rhel8-skylake
      prefix: /usr
      extra_attributes:
        paths:
          cc: /usr/bin/clang-with-suffix
          cxx: /usr/bin/clang++-with-extra-info
          fc: /usr/bin/gfortran
          f77: /usr/bin/gfortran
        extra_rpaths:
        - /usr/lib/llvm/
^^^^^^^^^^^^^^^^^^^^^^^
Build Your Own Compiler

View File

@@ -273,9 +273,21 @@ builtin support through the ``depends_on`` function, the latter simply uses a ``
statement. Both module systems (at least in newer versions) do reference counting, so that if a
module is loaded by two different modules, it will only be unloaded after the others are.
The ``autoload`` key accepts the values ``none``, ``direct``, and ``all``. To disable it, use
``none``, and to enable, it's best to stick to ``direct``, which only autoloads the direct link and
run type dependencies, relying on recursive autoloading to load the rest.
The ``autoload`` key accepts the values:
* ``none``: no autoloading
* ``run``: autoload direct *run* type dependencies
* ``direct``: autoload direct *link and run* type dependencies
* ``all``: autoload all dependencies
In case of ``run`` and ``direct``, a ``module load`` triggers a recursive load.
The ``direct`` option is most correct: there are cases where pure link dependencies need to set
variables for themselves, or need to have variables of their own dependencies set.
In practice however, ``run`` is often sufficient, and may make ``module load`` snappier.
The ``all`` option is discouraged and seldom used.
A common complaint about autoloading is the large number of modules that are visible to the user.
Spack has a solution for this as well: ``hide_implicits: true``. This ensures that only those
@@ -297,11 +309,11 @@ Environment Modules requires version 4.7 or higher.
  tcl:
    hide_implicits: true
    all:
      autoload: direct
      autoload: direct  # or `run`
  lmod:
    hide_implicits: true
    all:
      autoload: direct
      autoload: direct  # or `run`
.. _anonymous_specs:

View File

@@ -5,9 +5,9 @@ sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.0
docutils==0.20.1
pygments==2.17.2
urllib3==2.2.0
pytest==8.0.1
urllib3==2.2.1
pytest==8.1.1
isort==5.13.2
black==24.2.0
black==24.3.0
flake8==7.0.0
mypy==1.8.0
mypy==1.9.0

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.2 (commit 1dc58a5776dd77e6fc6e4ba5626af5b1fb24996e)
* Version: 0.2.3 (commit 7b8fe60b69e2861e7dac104bc1c183decfcd3daf)
astunparse
----------------

View File

@@ -1,2 +1,3 @@
"""Init file to avoid namespace packages"""
__version__ = "0.2.2"
__version__ = "0.2.3"

View File

@@ -3,6 +3,7 @@
"""
import sys
from .cli import main
sys.exit(main())

View File

@@ -46,7 +46,11 @@ def _make_parser() -> argparse.ArgumentParser:
def cpu() -> int:
    """Run the `archspec cpu` subcommand."""
    print(archspec.cpu.host())
    try:
        print(archspec.cpu.host())
    except FileNotFoundError as exc:
        print(exc)
        return 1
    return 0

View File

@@ -5,10 +5,14 @@
"""The "cpu" package permits to query and compare different
CPU microarchitectures.
"""
from .microarchitecture import Microarchitecture, UnsupportedMicroarchitecture
from .microarchitecture import TARGETS, generic_microarchitecture
from .microarchitecture import version_components
from .detect import host
from .microarchitecture import (
    TARGETS,
    Microarchitecture,
    UnsupportedMicroarchitecture,
    generic_microarchitecture,
    version_components,
)
__all__ = [
"Microarchitecture",

View File

@@ -4,15 +4,17 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Detection of CPU microarchitectures"""
import collections
import functools
import os
import platform
import re
import struct
import subprocess
import warnings
from typing import Dict, List, Optional, Set, Tuple, Union
from .microarchitecture import generic_microarchitecture, TARGETS
from .schema import TARGETS_JSON
from ..vendor.cpuid.cpuid import CPUID
from .microarchitecture import TARGETS, Microarchitecture, generic_microarchitecture
from .schema import CPUID_JSON, TARGETS_JSON
#: Mapping from operating systems to chain of commands
#: to obtain a dictionary of raw info on the current cpu
@@ -22,43 +24,46 @@
#: functions checking the compatibility of the host with a given target
COMPATIBILITY_CHECKS = {}
# Constants for commonly used architectures
X86_64 = "x86_64"
AARCH64 = "aarch64"
PPC64LE = "ppc64le"
PPC64 = "ppc64"
RISCV64 = "riscv64"
def info_dict(operating_system):
"""Decorator to mark functions that are meant to return raw info on
the current cpu.
def detection(operating_system: str):
"""Decorator to mark functions that are meant to return partial information on the current cpu.
Args:
operating_system (str or tuple): operating system for which the marked
function is a viable factory of raw info dictionaries.
operating_system: operating system where this function can be used.
"""
def decorator(factory):
INFO_FACTORY[operating_system].append(factory)
@functools.wraps(factory)
def _impl():
info = factory()
# Check that info contains a few mandatory fields
msg = 'field "{0}" is missing from raw info dictionary'
assert "vendor_id" in info, msg.format("vendor_id")
assert "flags" in info, msg.format("flags")
assert "model" in info, msg.format("model")
assert "model_name" in info, msg.format("model_name")
return info
return _impl
return factory
return decorator
@info_dict(operating_system="Linux")
def proc_cpuinfo():
"""Returns a raw info dictionary by parsing the first entry of
``/proc/cpuinfo``
"""
info = {}
def partial_uarch(
name: str = "", vendor: str = "", features: Optional[Set[str]] = None, generation: int = 0
) -> Microarchitecture:
"""Construct a partial microarchitecture, from information gathered during system scan."""
return Microarchitecture(
name=name,
parents=[],
vendor=vendor,
features=features or set(),
compilers={},
generation=generation,
)
@detection(operating_system="Linux")
def proc_cpuinfo() -> Microarchitecture:
"""Returns a partial Microarchitecture, obtained from scanning ``/proc/cpuinfo``"""
data = {}
with open("/proc/cpuinfo") as file: # pylint: disable=unspecified-encoding
for line in file:
key, separator, value = line.partition(":")
@@ -70,11 +75,96 @@ def proc_cpuinfo():
#
# we are on a blank line separating two cpus. Exit early as
# we want to read just the first entry in /proc/cpuinfo
if separator != ":" and info:
if separator != ":" and data:
break
info[key.strip()] = value.strip()
return info
data[key.strip()] = value.strip()
architecture = _machine()
if architecture == X86_64:
return partial_uarch(
vendor=data.get("vendor_id", "generic"), features=_feature_set(data, key="flags")
)
if architecture == AARCH64:
return partial_uarch(
vendor=_canonicalize_aarch64_vendor(data),
features=_feature_set(data, key="Features"),
)
if architecture in (PPC64LE, PPC64):
generation_match = re.search(r"POWER(\d+)", data.get("cpu", ""))
try:
generation = int(generation_match.group(1))
except AttributeError:
# There might be no match under emulated environments. For instance
# emulating a ppc64le with QEMU and Docker still reports the host
# /proc/cpuinfo and not a Power
generation = 0
return partial_uarch(generation=generation)
if architecture == RISCV64:
if data.get("uarch") == "sifive,u74-mc":
data["uarch"] = "u74mc"
return partial_uarch(name=data.get("uarch", RISCV64))
return generic_microarchitecture(architecture)
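As a quick illustration of the new return type (a hedged sketch, not part of the diff; Linux-only, and ``archspec.cpu.detect`` is not a public entry point):

from archspec.cpu.detect import proc_cpuinfo

uarch = proc_cpuinfo()  # a partial Microarchitecture: no parents or compilers filled in
print(uarch.vendor, sorted(uarch.features)[:5])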
class CpuidInfoCollector:
"""Collects the information we need on the host CPU from cpuid"""
# pylint: disable=too-few-public-methods
def __init__(self):
self.cpuid = CPUID()
registers = self.cpuid.registers_for(**CPUID_JSON["vendor"]["input"])
self.highest_basic_support = registers.eax
self.vendor = struct.pack("III", registers.ebx, registers.edx, registers.ecx).decode(
"utf-8"
)
registers = self.cpuid.registers_for(**CPUID_JSON["highest_extension_support"]["input"])
self.highest_extension_support = registers.eax
self.features = self._features()
def _features(self):
result = set()
def check_features(data):
registers = self.cpuid.registers_for(**data["input"])
for feature_check in data["bits"]:
current = getattr(registers, feature_check["register"])
if self._is_bit_set(current, feature_check["bit"]):
result.add(feature_check["name"])
for call_data in CPUID_JSON["flags"]:
if call_data["input"]["eax"] > self.highest_basic_support:
continue
check_features(call_data)
for call_data in CPUID_JSON["extension-flags"]:
if call_data["input"]["eax"] > self.highest_extension_support:
continue
check_features(call_data)
return result
def _is_bit_set(self, register: int, bit: int) -> bool:
mask = 1 << bit
return register & mask > 0
@detection(operating_system="Windows")
def cpuid_info():
"""Returns a partial Microarchitecture, obtained from running the cpuid instruction"""
architecture = _machine()
if architecture == X86_64:
data = CpuidInfoCollector()
return partial_uarch(vendor=data.vendor, features=data.features)
return generic_microarchitecture(architecture)
def _check_output(args, env):
@@ -83,14 +173,25 @@ def _check_output(args, env):
return str(output.decode("utf-8"))
WINDOWS_MAPPING = {
"AMD64": "x86_64",
"ARM64": "aarch64",
}
def _machine():
""" "Return the machine architecture we are on"""
"""Return the machine architecture we are on"""
operating_system = platform.system()
# If we are not on Darwin, trust what Python tells us
if operating_system != "Darwin":
# If we are not on Darwin or Windows, trust what Python tells us
if operating_system not in ("Darwin", "Windows"):
return platform.machine()
# Normalize windows specific names
if operating_system == "Windows":
platform_machine = platform.machine()
return WINDOWS_MAPPING.get(platform_machine, platform_machine)
# On Darwin it might happen that we are on M1, but using an interpreter
# built for x86_64. In that case "platform.machine() == 'x86_64'", so we
# need to fix that.
@@ -103,54 +204,47 @@ def _machine():
if "Apple" in output:
# Note that a native Python interpreter on Apple M1 would return
# "arm64" instead of "aarch64". Here we normalize to the latter.
return "aarch64"
return AARCH64
return "x86_64"
return X86_64
@info_dict(operating_system="Darwin")
def sysctl_info_dict():
@detection(operating_system="Darwin")
def sysctl_info() -> Microarchitecture:
"""Returns a raw info dictionary parsing the output of sysctl."""
child_environment = _ensure_bin_usrbin_in_path()
def sysctl(*args):
def sysctl(*args: str) -> str:
return _check_output(["sysctl"] + list(args), env=child_environment).strip()
if _machine() == "x86_64":
flags = (
sysctl("-n", "machdep.cpu.features").lower()
+ " "
+ sysctl("-n", "machdep.cpu.leaf7_features").lower()
if _machine() == X86_64:
features = (
f'{sysctl("-n", "machdep.cpu.features").lower()} '
f'{sysctl("-n", "machdep.cpu.leaf7_features").lower()}'
)
info = {
"vendor_id": sysctl("-n", "machdep.cpu.vendor"),
"flags": flags,
"model": sysctl("-n", "machdep.cpu.model"),
"model name": sysctl("-n", "machdep.cpu.brand_string"),
}
else:
model = "unknown"
model_str = sysctl("-n", "machdep.cpu.brand_string").lower()
if "m2" in model_str:
model = "m2"
elif "m1" in model_str:
model = "m1"
elif "apple" in model_str:
model = "m1"
features = set(features.split())
info = {
"vendor_id": "Apple",
"flags": [],
"model": model,
"CPU implementer": "Apple",
"model name": sysctl("-n", "machdep.cpu.brand_string"),
}
return info
# Flags detected on Darwin turned to their linux counterpart
for darwin_flag, linux_flag in TARGETS_JSON["conversions"]["darwin_flags"].items():
if darwin_flag in features:
features.update(linux_flag.split())
return partial_uarch(vendor=sysctl("-n", "machdep.cpu.vendor"), features=features)
model = "unknown"
model_str = sysctl("-n", "machdep.cpu.brand_string").lower()
if "m2" in model_str:
model = "m2"
elif "m1" in model_str:
model = "m1"
elif "apple" in model_str:
model = "m1"
return partial_uarch(name=model, vendor="Apple")
def _ensure_bin_usrbin_in_path():
# Make sure that /sbin and /usr/sbin are in PATH as sysctl is
# usually found there
# Make sure that /sbin and /usr/sbin are in PATH as sysctl is usually found there
child_environment = dict(os.environ.items())
search_paths = child_environment.get("PATH", "").split(os.pathsep)
for additional_path in ("/sbin", "/usr/sbin"):
@@ -160,22 +254,10 @@ def _ensure_bin_usrbin_in_path():
return child_environment
def adjust_raw_flags(info):
"""Adjust the flags detected on the system to homogenize
slightly different representations.
"""
# Flags detected on Darwin turned to their linux counterpart
flags = info.get("flags", [])
d2l = TARGETS_JSON["conversions"]["darwin_flags"]
for darwin_flag, linux_flag in d2l.items():
if darwin_flag in flags:
info["flags"] += " " + linux_flag
def adjust_raw_vendor(info):
"""Adjust the vendor field to make it human readable"""
if "CPU implementer" not in info:
return
def _canonicalize_aarch64_vendor(data: Dict[str, str]) -> str:
"""Adjust the vendor field to make it human-readable"""
if "CPU implementer" not in data:
return "generic"
# Mapping numeric codes to vendor (ARM). This list is a merge from
# different sources:
@@ -185,43 +267,37 @@ def adjust_raw_vendor(info):
# https://github.com/gcc-mirror/gcc/blob/master/gcc/config/aarch64/aarch64-cores.def
# https://patchwork.kernel.org/patch/10524949/
arm_vendors = TARGETS_JSON["conversions"]["arm_vendors"]
arm_code = info["CPU implementer"]
if arm_code in arm_vendors:
info["CPU implementer"] = arm_vendors[arm_code]
arm_code = data["CPU implementer"]
return arm_vendors.get(arm_code, arm_code)
def raw_info_dictionary():
"""Returns a dictionary with information on the cpu of the current host.
def _feature_set(data: Dict[str, str], key: str) -> Set[str]:
return set(data.get(key, "").split())
This function calls all the viable factories one after the other until
there's one that is able to produce the requested information.
def detected_info() -> Microarchitecture:
"""Returns a partial Microarchitecture with information on the CPU of the current host.
This function calls all the viable factories one after the other until there's one that is
able to produce the requested information. Falls back to a generic microarchitecture if none
of the calls succeed.
"""
# pylint: disable=broad-except
info = {}
for factory in INFO_FACTORY[platform.system()]:
try:
info = factory()
return factory()
except Exception as exc:
warnings.warn(str(exc))
if info:
adjust_raw_flags(info)
adjust_raw_vendor(info)
break
return info
return generic_microarchitecture(_machine())
def compatible_microarchitectures(info):
"""Returns an unordered list of known micro-architectures that are
compatible with the info dictionary passed as argument.
Args:
info (dict): dictionary containing information on the host cpu
def compatible_microarchitectures(info: Microarchitecture) -> List[Microarchitecture]:
"""Returns an unordered list of known micro-architectures that are compatible with the
partial Microarchitecture passed as input.
"""
architecture_family = _machine()
# If a tester is not registered, be conservative and assume no known
# target is compatible with the host
# If a tester is not registered, assume no known target is compatible with the host
tester = COMPATIBILITY_CHECKS.get(architecture_family, lambda x, y: False)
return [x for x in TARGETS.values() if tester(info, x)] or [
generic_microarchitecture(architecture_family)
@@ -230,8 +306,8 @@ def compatible_microarchitectures(info):
def host():
"""Detects the host micro-architecture and returns it."""
# Retrieve a dictionary with raw information on the host's cpu
info = raw_info_dictionary()
# Retrieve information on the host's cpu
info = detected_info()
# Get a list of possible candidates for this micro-architecture
candidates = compatible_microarchitectures(info)
@@ -258,16 +334,15 @@ def sorting_fn(item):
return max(candidates, key=sorting_fn)
def compatibility_check(architecture_family):
def compatibility_check(architecture_family: Union[str, Tuple[str, ...]]):
"""Decorator to register a function as a proper compatibility check.
A compatibility check function takes the raw info dictionary as a first
argument and an arbitrary target as the second argument. It returns True
if the target is compatible with the info dictionary, False otherwise.
A compatibility check function takes a partial Microarchitecture object as a first argument,
and an arbitrary target Microarchitecture as the second argument. It returns True if the
target is compatible with the first argument, False otherwise.
Args:
architecture_family (str or tuple): architecture family for which
this test can be used, e.g. x86_64 or ppc64le etc.
architecture_family: architecture family for which this test can be used
"""
# Turn the argument into something iterable
if isinstance(architecture_family, str):
@@ -280,86 +355,57 @@ def decorator(func):
return decorator
@compatibility_check(architecture_family=("ppc64le", "ppc64"))
@compatibility_check(architecture_family=(PPC64LE, PPC64))
def compatibility_check_for_power(info, target):
"""Compatibility check for PPC64 and PPC64LE architectures."""
basename = platform.machine()
generation_match = re.search(r"POWER(\d+)", info.get("cpu", ""))
try:
generation = int(generation_match.group(1))
except AttributeError:
# There might be no match under emulated environments. For instance
# emulating a ppc64le with QEMU and Docker still reports the host
# /proc/cpuinfo and not a Power
generation = 0
# We can use a target if it descends from our machine type and our
# generation (9 for POWER9, etc) is at least its generation.
arch_root = TARGETS[basename]
arch_root = TARGETS[_machine()]
return (
target == arch_root or arch_root in target.ancestors
) and target.generation <= generation
) and target.generation <= info.generation
@compatibility_check(architecture_family="x86_64")
@compatibility_check(architecture_family=X86_64)
def compatibility_check_for_x86_64(info, target):
"""Compatibility check for x86_64 architectures."""
basename = "x86_64"
vendor = info.get("vendor_id", "generic")
features = set(info.get("flags", "").split())
# We can use a target if it descends from our machine type, is from our
# vendor, and we have all of its features
arch_root = TARGETS[basename]
arch_root = TARGETS[X86_64]
return (
(target == arch_root or arch_root in target.ancestors)
and target.vendor in (vendor, "generic")
and target.features.issubset(features)
and target.vendor in (info.vendor, "generic")
and target.features.issubset(info.features)
)
@compatibility_check(architecture_family="aarch64")
@compatibility_check(architecture_family=AARCH64)
def compatibility_check_for_aarch64(info, target):
"""Compatibility check for AARCH64 architectures."""
basename = "aarch64"
features = set(info.get("Features", "").split())
vendor = info.get("CPU implementer", "generic")
# At the moment it's not clear how to detect compatibility with
# At the moment, it's not clear how to detect compatibility with
# a specific version of the architecture
if target.vendor == "generic" and target.name != "aarch64":
if target.vendor == "generic" and target.name != AARCH64:
return False
arch_root = TARGETS[basename]
arch_root = TARGETS[AARCH64]
arch_root_and_vendor = arch_root == target.family and target.vendor in (
vendor,
info.vendor,
"generic",
)
# On macOS it seems impossible to get all the CPU features
# with sysctl info, but for ARM we can get the exact model
if platform.system() == "Darwin":
model_key = info.get("model", basename)
model = TARGETS[model_key]
model = TARGETS[info.name]
return arch_root_and_vendor and (target == model or target in model.ancestors)
return arch_root_and_vendor and target.features.issubset(features)
return arch_root_and_vendor and target.features.issubset(info.features)
@compatibility_check(architecture_family="riscv64")
@compatibility_check(architecture_family=RISCV64)
def compatibility_check_for_riscv64(info, target):
"""Compatibility check for riscv64 architectures."""
basename = "riscv64"
uarch = info.get("uarch")
# sifive unmatched board
if uarch == "sifive,u74-mc":
uarch = "u74mc"
# catch-all for unknown uarchs
else:
uarch = "riscv64"
arch_root = TARGETS[basename]
arch_root = TARGETS[RISCV64]
return (target == arch_root or arch_root in target.ancestors) and (
target == uarch or target.vendor == "generic"
target.name == info.name or target.vendor == "generic"
)

View File

@@ -13,6 +13,7 @@
import archspec
import archspec.cpu.alias
import archspec.cpu.schema
from .alias import FEATURE_ALIASES
from .schema import LazyDictionary
@@ -47,7 +48,7 @@ class Microarchitecture:
which has "broadwell" as a parent, supports running binaries
optimized for "broadwell".
vendor (str): vendor of the micro-architecture
features (list of str): supported CPU flags. Note that the semantic
features (set of str): supported CPU flags. Note that the semantic
of the flags in this field might vary among architectures, if
at all present. For instance x86_64 processors will list all
the flags supported by a given CPU while Arm processors will
@@ -180,24 +181,28 @@ def generic(self):
generics = [x for x in [self] + self.ancestors if x.vendor == "generic"]
return max(generics, key=lambda x: len(x.ancestors))
def to_dict(self, return_list_of_items=False):
"""Returns a dictionary representation of this object.
def to_dict(self):
"""Returns a dictionary representation of this object."""
return {
"name": str(self.name),
"vendor": str(self.vendor),
"features": sorted(str(x) for x in self.features),
"generation": self.generation,
"parents": [str(x) for x in self.parents],
"compilers": self.compilers,
}
Args:
return_list_of_items (bool): if True returns an ordered list of
items instead of the dictionary
"""
list_of_items = [
("name", str(self.name)),
("vendor", str(self.vendor)),
("features", sorted(str(x) for x in self.features)),
("generation", self.generation),
("parents", [str(x) for x in self.parents]),
]
if return_list_of_items:
return list_of_items
return dict(list_of_items)
@staticmethod
def from_dict(data) -> "Microarchitecture":
"""Construct a microarchitecture from a dictionary representation."""
return Microarchitecture(
name=data["name"],
parents=[TARGETS[x] for x in data["parents"]],
vendor=data["vendor"],
features=set(data["features"]),
compilers=data.get("compilers", {}),
generation=data.get("generation", 0),
)
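A minimal round-trip sketch for the new ``from_dict`` (assumptions: ``archspec`` is importable and ``broadwell`` is a registered target):

import archspec.cpu

uarch = archspec.cpu.TARGETS["broadwell"]
restored = archspec.cpu.Microarchitecture.from_dict(uarch.to_dict())
assert restored == uarch  # the dictionary representation preserves the target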
def optimization_flags(self, compiler, version):
"""Returns a string containing the optimization flags that needs
@@ -271,9 +276,7 @@ def tuplify(ver):
flags = flags_fmt.format(**compiler_entry)
return flags
msg = (
"cannot produce optimized binary for micro-architecture '{0}' with {1}@{2}"
)
msg = "cannot produce optimized binary for micro-architecture '{0}' with {1}@{2}"
if compiler_info:
versions = [x["versions"] for x in compiler_info]
msg += f' [supported compiler versions are {", ".join(versions)}]'
@@ -289,9 +292,7 @@ def generic_microarchitecture(name):
Args:
name (str): name of the micro-architecture
"""
return Microarchitecture(
name, parents=[], vendor="generic", features=[], compilers={}
)
return Microarchitecture(name, parents=[], vendor="generic", features=[], compilers={})
def version_components(version):
@@ -345,9 +346,7 @@ def fill_target_from_dict(name, data, targets):
compilers = values.get("compilers", {})
generation = values.get("generation", 0)
targets[name] = Microarchitecture(
name, parents, vendor, features, compilers, generation
)
targets[name] = Microarchitecture(name, parents, vendor, features, compilers, generation)
known_targets = {}
data = archspec.cpu.schema.TARGETS_JSON["microarchitectures"]

View File

@@ -7,7 +7,9 @@
"""
import collections.abc
import json
import os.path
import os
import pathlib
from typing import Tuple
class LazyDictionary(collections.abc.MutableMapping):
@@ -46,21 +48,65 @@ def __len__(self):
return len(self.data)
def _load_json_file(json_file):
json_dir = os.path.join(os.path.dirname(__file__), "..", "json", "cpu")
json_dir = os.path.abspath(json_dir)
#: Environment variable that might point to a directory with a user defined JSON file
DIR_FROM_ENVIRONMENT = "ARCHSPEC_CPU_DIR"
def _factory():
filename = os.path.join(json_dir, json_file)
with open(filename, "r", encoding="utf-8") as file:
return json.load(file)
#: Environment variable that might point to a directory with extensions to JSON files
EXTENSION_DIR_FROM_ENVIRONMENT = "ARCHSPEC_EXTENSION_CPU_DIR"
return _factory
def _json_file(filename: str, allow_custom: bool = False) -> Tuple[pathlib.Path, pathlib.Path]:
"""Given a filename, returns the absolute path for the main JSON file, and an
optional absolute path for an extension JSON file.
Args:
filename: filename for the JSON file
allow_custom: if True, allows overriding the location where the file resides
"""
json_dir = pathlib.Path(__file__).parent / ".." / "json" / "cpu"
if allow_custom and DIR_FROM_ENVIRONMENT in os.environ:
json_dir = pathlib.Path(os.environ[DIR_FROM_ENVIRONMENT])
json_dir = json_dir.absolute()
json_file = json_dir / filename
extension_file = None
if allow_custom and EXTENSION_DIR_FROM_ENVIRONMENT in os.environ:
extension_dir = pathlib.Path(os.environ[EXTENSION_DIR_FROM_ENVIRONMENT])
extension_dir = extension_dir.absolute()
extension_file = extension_dir / filename
return json_file, extension_file
def _load(json_file: pathlib.Path, extension_file: pathlib.Path):
with open(json_file, "r", encoding="utf-8") as file:
data = json.load(file)
if not extension_file or not extension_file.exists():
return data
with open(extension_file, "r", encoding="utf-8") as file:
extension_data = json.load(file)
top_level_sections = list(data.keys())
for key in top_level_sections:
if key not in extension_data:
continue
data[key].update(extension_data[key])
return data
#: In memory representation of the data in microarchitectures.json,
#: loaded on first access
TARGETS_JSON = LazyDictionary(_load_json_file("microarchitectures.json"))
TARGETS_JSON = LazyDictionary(_load, *_json_file("microarchitectures.json", allow_custom=True))
#: JSON schema for microarchitectures.json, loaded on first access
SCHEMA = LazyDictionary(_load_json_file("microarchitectures_schema.json"))
TARGETS_JSON_SCHEMA = LazyDictionary(_load, *_json_file("microarchitectures_schema.json"))
#: Information on how to call 'cpuid' to get information on the HOST CPU
CPUID_JSON = LazyDictionary(_load, *_json_file("cpuid.json", allow_custom=True))
#: JSON schema for cpuid.json, loaded on first access
CPUID_JSON_SCHEMA = LazyDictionary(_load, *_json_file("cpuid_schema.json"))
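A hedged usage sketch for the new override hooks (the directory is hypothetical; note that ``_json_file()`` resolves paths when ``archspec.cpu.schema`` is imported, so the variables must be set before that import):

import os

os.environ["ARCHSPEC_CPU_DIR"] = "/opt/archspec-overrides/cpu"  # hypothetical directory

import archspec.cpu.schema

# the JSON files are only read on first access, thanks to LazyDictionary
print(len(archspec.cpu.schema.TARGETS_JSON["microarchitectures"]))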

View File

@@ -9,11 +9,11 @@ language specific APIs.
Currently the repository contains the following JSON files:
```console
.
├── COPYRIGHT
└── cpu
   ├── microarchitectures.json # Contains information on CPU microarchitectures
   └── microarchitectures_schema.json # Schema for the file above
cpu/
├── cpuid.json # Contains information on CPUID calls to retrieve vendor and features on x86_64
├── cpuid_schema.json # Schema for the file above
├── microarchitectures.json # Contains information on CPU microarchitectures
└── microarchitectures_schema.json # Schema for the file above
```

File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,134 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Schema for microarchitecture definitions and feature aliases",
"type": "object",
"additionalProperties": false,
"properties": {
"vendor": {
"type": "object",
"additionalProperties": false,
"properties": {
"description": {
"type": "string"
},
"input": {
"type": "object",
"additionalProperties": false,
"properties": {
"eax": {
"type": "integer"
},
"ecx": {
"type": "integer"
}
}
}
}
},
"highest_extension_support": {
"type": "object",
"additionalProperties": false,
"properties": {
"description": {
"type": "string"
},
"input": {
"type": "object",
"additionalProperties": false,
"properties": {
"eax": {
"type": "integer"
},
"ecx": {
"type": "integer"
}
}
}
}
},
"flags": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"properties": {
"description": {
"type": "string"
},
"input": {
"type": "object",
"additionalProperties": false,
"properties": {
"eax": {
"type": "integer"
},
"ecx": {
"type": "integer"
}
}
},
"bits": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"properties": {
"name": {
"type": "string"
},
"register": {
"type": "string"
},
"bit": {
"type": "integer"
}
}
}
}
}
}
},
"extension-flags": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"properties": {
"description": {
"type": "string"
},
"input": {
"type": "object",
"additionalProperties": false,
"properties": {
"eax": {
"type": "integer"
},
"ecx": {
"type": "integer"
}
}
},
"bits": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"properties": {
"name": {
"type": "string"
},
"register": {
"type": "string"
},
"bit": {
"type": "integer"
}
}
}
}
}
}
}
}
}

View File

@@ -0,0 +1,20 @@
The MIT License (MIT)
Copyright (c) 2014 Anders Høst
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

View File

@@ -0,0 +1,76 @@
cpuid.py
========
Now, this is silly!
Pure Python library for accessing information about x86 processors
by querying the [CPUID](http://en.wikipedia.org/wiki/CPUID)
instruction. Well, not exactly pure Python...
It works by allocating a small piece of virtual memory, copying
a raw x86 function to that memory, giving the memory execute
permissions and then calling the memory as a function. The injected
function executes the CPUID instruction and copies the result back
to a ctypes.Structure where it can be read by Python.
It should work fine on both 32- and 64-bit versions of Windows and Linux
running x86 processors. Apple OS X and other BSD systems should also work,
though this has not been tested...
Why?
----
For poops and giggles. Plus, having access to a low-level feature
without having to compile a C wrapper is pretty neat.
Examples
--------
Getting info with eax=0:
import cpuid
q = cpuid.CPUID()
eax, ebx, ecx, edx = q(0)
Running the files:
$ python example.py
Vendor ID : GenuineIntel
CPU name : Intel(R) Xeon(R) CPU W3550 @ 3.07GHz
Vector instructions supported:
SSE : Yes
SSE2 : Yes
SSE3 : Yes
SSSE3 : Yes
SSE4.1 : Yes
SSE4.2 : Yes
SSE4a : --
AVX : --
AVX2 : --
$ python cpuid.py
CPUID A B C D
00000000 0000000b 756e6547 6c65746e 49656e69
00000001 000106a5 00100800 009ce3bd bfebfbff
00000002 55035a01 00f0b2e4 00000000 09ca212c
00000003 00000000 00000000 00000000 00000000
00000004 00000000 00000000 00000000 00000000
00000005 00000040 00000040 00000003 00001120
00000006 00000003 00000002 00000001 00000000
00000007 00000000 00000000 00000000 00000000
00000008 00000000 00000000 00000000 00000000
00000009 00000000 00000000 00000000 00000000
0000000a 07300403 00000044 00000000 00000603
0000000b 00000000 00000000 00000095 00000000
80000000 80000008 00000000 00000000 00000000
80000001 00000000 00000000 00000001 28100800
80000002 65746e49 2952286c 6f655820 2952286e
80000003 55504320 20202020 20202020 57202020
80000004 30353533 20402020 37302e33 007a4847
80000005 00000000 00000000 00000000 00000000
80000006 00000000 00000000 01006040 00000000
80000007 00000000 00000000 00000000 00000100
80000008 00003024 00000000 00000000 00000000

View File

@@ -0,0 +1,172 @@
# -*- coding: utf-8 -*-
#
# Copyright (c) 2024 Anders Høst
#
from __future__ import print_function
import platform
import os
import ctypes
from ctypes import c_uint32, c_long, c_ulong, c_size_t, c_void_p, POINTER, CFUNCTYPE
# Posix x86_64:
# Three first call registers : RDI, RSI, RDX
# Volatile registers : RAX, RCX, RDX, RSI, RDI, R8-11
# Windows x86_64:
# Three first call registers : RCX, RDX, R8
# Volatile registers : RAX, RCX, RDX, R8-11
# cdecl 32 bit:
# Three first call registers : Stack (%esp)
# Volatile registers : EAX, ECX, EDX
_POSIX_64_OPC = [
0x53, # push %rbx
0x89, 0xf0, # mov %esi,%eax
0x89, 0xd1, # mov %edx,%ecx
0x0f, 0xa2, # cpuid
0x89, 0x07, # mov %eax,(%rdi)
0x89, 0x5f, 0x04, # mov %ebx,0x4(%rdi)
0x89, 0x4f, 0x08, # mov %ecx,0x8(%rdi)
0x89, 0x57, 0x0c, # mov %edx,0xc(%rdi)
0x5b, # pop %rbx
0xc3 # retq
]
_WINDOWS_64_OPC = [
0x53, # push %rbx
0x89, 0xd0, # mov %edx,%eax
0x49, 0x89, 0xc9, # mov %rcx,%r9
0x44, 0x89, 0xc1, # mov %r8d,%ecx
0x0f, 0xa2, # cpuid
0x41, 0x89, 0x01, # mov %eax,(%r9)
0x41, 0x89, 0x59, 0x04, # mov %ebx,0x4(%r9)
0x41, 0x89, 0x49, 0x08, # mov %ecx,0x8(%r9)
0x41, 0x89, 0x51, 0x0c, # mov %edx,0xc(%r9)
0x5b, # pop %rbx
0xc3 # retq
]
_CDECL_32_OPC = [
0x53, # push %ebx
0x57, # push %edi
0x8b, 0x7c, 0x24, 0x0c, # mov 0xc(%esp),%edi
0x8b, 0x44, 0x24, 0x10, # mov 0x10(%esp),%eax
0x8b, 0x4c, 0x24, 0x14, # mov 0x14(%esp),%ecx
0x0f, 0xa2, # cpuid
0x89, 0x07, # mov %eax,(%edi)
0x89, 0x5f, 0x04, # mov %ebx,0x4(%edi)
0x89, 0x4f, 0x08, # mov %ecx,0x8(%edi)
0x89, 0x57, 0x0c, # mov %edx,0xc(%edi)
0x5f, # pop %edi
0x5b, # pop %ebx
0xc3 # ret
]
is_windows = os.name == "nt"
is_64bit = ctypes.sizeof(ctypes.c_voidp) == 8
class CPUID_struct(ctypes.Structure):
_register_names = ("eax", "ebx", "ecx", "edx")
_fields_ = [(r, c_uint32) for r in _register_names]
def __getitem__(self, item):
if item not in self._register_names:
raise KeyError(item)
return getattr(self, item)
def __repr__(self):
return "eax=0x{:x}, ebx=0x{:x}, ecx=0x{:x}, edx=0x{:x}".format(self.eax, self.ebx, self.ecx, self.edx)
class CPUID(object):
def __init__(self):
if platform.machine() not in ("AMD64", "x86_64", "x86", "i686"):
raise SystemError("Only available for x86")
if is_windows:
if is_64bit:
# VirtualAlloc seems to fail under some weird
# circumstances when ctypes.windll.kernel32 is
# used under 64 bit Python. CDLL fixes this.
self.win = ctypes.CDLL("kernel32.dll")
opc = _WINDOWS_64_OPC
else:
# Here ctypes.windll.kernel32 is needed to get the
# right DLL. Otherwise it will fail when running
# 32 bit Python on 64 bit Windows.
self.win = ctypes.windll.kernel32
opc = _CDECL_32_OPC
else:
opc = _POSIX_64_OPC if is_64bit else _CDECL_32_OPC
size = len(opc)
code = (ctypes.c_ubyte * size)(*opc)
if is_windows:
self.win.VirtualAlloc.restype = c_void_p
self.win.VirtualAlloc.argtypes = [ctypes.c_void_p, ctypes.c_size_t, ctypes.c_ulong, ctypes.c_ulong]
self.addr = self.win.VirtualAlloc(None, size, 0x1000, 0x40)
if not self.addr:
raise MemoryError("Could not allocate RWX memory")
ctypes.memmove(self.addr, code, size)
else:
from mmap import (
mmap,
MAP_PRIVATE,
MAP_ANONYMOUS,
PROT_WRITE,
PROT_READ,
PROT_EXEC,
)
self.mm = mmap(
-1,
size,
flags=MAP_PRIVATE | MAP_ANONYMOUS,
prot=PROT_WRITE | PROT_READ | PROT_EXEC,
)
self.mm.write(code)
self.addr = ctypes.addressof(ctypes.c_int.from_buffer(self.mm))
func_type = CFUNCTYPE(None, POINTER(CPUID_struct), c_uint32, c_uint32)
self.func_ptr = func_type(self.addr)
def __call__(self, eax, ecx=0):
struct = self.registers_for(eax=eax, ecx=ecx)
return struct.eax, struct.ebx, struct.ecx, struct.edx
def registers_for(self, eax, ecx=0):
"""Calls cpuid with eax and ecx set as the input arguments, and returns a structure
containing eax, ebx, ecx, and edx.
"""
struct = CPUID_struct()
self.func_ptr(struct, eax, ecx)
return struct
def __del__(self):
if is_windows:
self.win.VirtualFree.restype = c_long
self.win.VirtualFree.argtypes = [c_void_p, c_size_t, c_ulong]
self.win.VirtualFree(self.addr, 0, 0x8000)
else:
self.mm.close()
if __name__ == "__main__":
def valid_inputs():
cpuid = CPUID()
for eax in (0x0, 0x80000000):
highest, _, _, _ = cpuid(eax)
while eax <= highest:
regs = cpuid(eax)
yield (eax, regs)
eax += 1
print(" ".join(x.ljust(8) for x in ("CPUID", "A", "B", "C", "D")).strip())
for eax, regs in valid_inputs():
print("%08x" % eax, " ".join("%08x" % reg for reg in regs))

View File

@@ -0,0 +1,62 @@
# -*- coding: utf-8 -*-
#
# Copyright (c) 2024 Anders Høst
#
from __future__ import print_function
import struct
import cpuid
def cpu_vendor(cpu):
_, b, c, d = cpu(0)
return struct.pack("III", b, d, c).decode("utf-8")
def cpu_name(cpu):
name = "".join((struct.pack("IIII", *cpu(0x80000000 + i)).decode("utf-8")
for i in range(2, 5)))
return name.split('\x00', 1)[0]
def is_set(cpu, leaf, subleaf, reg_idx, bit):
"""
@param {leaf} %eax
@param {subleaf} %ecx, 0 in most cases
@param {reg_idx} idx of [%eax, %ebx, %ecx, %edx], 0-based
@param {bit} bit of reg selected by {reg_idx}, 0-based
"""
regs = cpu(leaf, subleaf)
if (1 << bit) & regs[reg_idx]:
return "Yes"
else:
return "--"
if __name__ == "__main__":
cpu = cpuid.CPUID()
print("Vendor ID : %s" % cpu_vendor(cpu))
print("CPU name : %s" % cpu_name(cpu))
print()
print("Vector instructions supported:")
print("SSE : %s" % is_set(cpu, 1, 0, 3, 25))
print("SSE2 : %s" % is_set(cpu, 1, 0, 3, 26))
print("SSE3 : %s" % is_set(cpu, 1, 0, 2, 0))
print("SSSE3 : %s" % is_set(cpu, 1, 0, 2, 9))
print("SSE4.1 : %s" % is_set(cpu, 1, 0, 2, 19))
print("SSE4.2 : %s" % is_set(cpu, 1, 0, 2, 20))
print("SSE4a : %s" % is_set(cpu, 0x80000001, 0, 2, 6))
print("AVX : %s" % is_set(cpu, 1, 0, 2, 28))
print("AVX2 : %s" % is_set(cpu, 7, 0, 1, 5))
print("BMI1 : %s" % is_set(cpu, 7, 0, 1, 3))
print("BMI2 : %s" % is_set(cpu, 7, 0, 1, 8))
# Intel RDT CMT/MBM
print("L3 Monitoring : %s" % is_set(cpu, 0xf, 0, 3, 1))
print("L3 Occupancy : %s" % is_set(cpu, 0xf, 1, 3, 0))
print("L3 Total BW : %s" % is_set(cpu, 0xf, 1, 3, 1))
print("L3 Local BW : %s" % is_set(cpu, 0xf, 1, 3, 2))

View File

@@ -42,11 +42,6 @@ def convert_to_posix_path(path: str) -> str:
return format_os_path(path, mode=Path.unix)
def convert_to_windows_path(path: str) -> str:
"""Converts the input path to Windows style."""
return format_os_path(path, mode=Path.windows)
def convert_to_platform_path(path: str) -> str:
"""Converts the input path to the current platform's native style."""
return format_os_path(path, mode=Path.platform_path)

View File

@@ -237,16 +237,6 @@ def _get_mime_type():
return file_command("-b", "-h", "--mime-type")
@memoized
def _get_mime_type_compressed():
"""Same as _get_mime_type but attempts to check for
compression first
"""
mime_uncompressed = _get_mime_type()
mime_uncompressed.add_default_arg("-Z")
return mime_uncompressed
def mime_type(filename):
"""Returns the mime type and subtype of a file.
@@ -262,21 +252,6 @@ def mime_type(filename):
return type, subtype
def compressed_mime_type(filename):
"""Same as mime_type but checks for type that has been compressed
Args:
filename (str): file to be analyzed
Returns:
Tuple containing the MIME type and subtype
"""
output = _get_mime_type_compressed()(filename, output=str, error=str).strip()
tty.debug("==> " + output)
type, _, subtype = output.partition("/")
return type, subtype
#: This generates the library filenames that may appear on any OS.
library_extensions = ["a", "la", "so", "tbd", "dylib"]
@@ -308,13 +283,6 @@ def paths_containing_libs(paths, library_names):
return rpaths_to_include
@system_path_filter
def same_path(path1, path2):
norm1 = os.path.abspath(path1).rstrip(os.path.sep)
norm2 = os.path.abspath(path2).rstrip(os.path.sep)
return norm1 == norm2
def filter_file(
regex: str,
repl: Union[str, Callable[[Match], str]],
@@ -909,17 +877,6 @@ def is_exe(path):
return os.path.isfile(path) and os.access(path, os.X_OK)
@system_path_filter
def get_filetype(path_name):
"""
Return the output of file path_name as a string to identify file type.
"""
file = Executable("file")
file.add_default_env("LC_ALL", "C")
output = file("-b", "-h", "%s" % path_name, output=str, error=str)
return output.strip()
def has_shebang(path):
"""Returns whether a path has a shebang line. Returns False if the file cannot be opened."""
try:
@@ -1169,20 +1126,6 @@ def write_tmp_and_move(filename):
shutil.move(tmp, filename)
@contextmanager
@system_path_filter
def open_if_filename(str_or_file, mode="r"):
"""Takes either a path or a file object, and opens it if it is a path.
If it's a file object, just yields the file object.
"""
if isinstance(str_or_file, str):
with open(str_or_file, mode) as f:
yield f
else:
yield str_or_file
@system_path_filter
def touch(path):
"""Creates an empty file at the specified path."""
@@ -1240,6 +1183,47 @@ def get_single_file(directory):
return fnames[0]
@system_path_filter
def windows_sfn(path: os.PathLike):
"""Returns 8.3 Filename (SFN) representation of
path
8.3 filenames (SFN, or short filenames) are a file
naming convention used prior to Win95 that Windows
still (and will continue to) support. This convention
caps filenames at 8 characters and, most importantly,
does not allow spaces, among other restrictions.
The scheme is generally the same as a normal Windows
file scheme, but all spaces are removed and the name
is truncated to 6 characters; the remaining characters are
replaced with ~N, where N is the ordinal of the file
within its directory, i.e. Program Files and Program Files (x86)
become PROGRA~1 and PROGRA~2 respectively.
Further, all file/directory names are upper-cased (although modern
Windows is case insensitive in practice).
Conversion is accomplished by fileapi.h GetShortPathNameW
Returns paths in 8.3 Filename form
Note: this method is a no-op on non-Windows platforms
Args:
path: Path to be transformed into SFN (8.3 filename) format
"""
# The SFN conversion should never run on Linux/macOS
if sys.platform != "win32":
return path
path = str(path)
import ctypes
k32 = ctypes.WinDLL("kernel32", use_last_error=True)
# stub Windows types TCHAR[LENGTH]
TCHAR_arr = ctypes.c_wchar * len(path)
ret_str = TCHAR_arr()
k32.GetShortPathNameW(path, ret_str, len(path))
return ret_str.value
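A hypothetical call (the exact short name depends on the volume's 8.3-name generation, which can be disabled on modern Windows):

short = windows_sfn(r"C:\Program Files (x86)\CMake")
# e.g. 'C:\\PROGRA~2\\CMAKE' on a typical system; returned unchanged on non-Windows platforms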
@contextmanager
def temp_cwd():
tmp_dir = tempfile.mkdtemp()
@@ -1254,19 +1238,6 @@ def temp_cwd():
shutil.rmtree(tmp_dir, **kwargs)
@contextmanager
@system_path_filter
def temp_rename(orig_path, temp_path):
same_path = os.path.realpath(orig_path) == os.path.realpath(temp_path)
if not same_path:
shutil.move(orig_path, temp_path)
try:
yield
finally:
if not same_path:
shutil.move(temp_path, orig_path)
@system_path_filter
def can_access(file_name):
"""True if we have read/write access to the file."""

View File

@@ -98,36 +98,6 @@ def caller_locals():
del stack
def get_calling_module_name():
"""Make sure that the caller is a class definition, and return the
enclosing module's name.
"""
# Passing zero here skips line context for speed.
stack = inspect.stack(0)
try:
# Make sure locals contain __module__
caller_locals = stack[2][0].f_locals
finally:
del stack
if "__module__" not in caller_locals:
raise RuntimeError(
"Must invoke get_calling_module_name() " "from inside a class definition!"
)
module_name = caller_locals["__module__"]
base_name = module_name.split(".")[-1]
return base_name
def attr_required(obj, attr_name):
"""Ensure that a class has a required attribute."""
if not hasattr(obj, attr_name):
raise RequiredAttributeError(
"No required attribute '%s' in class '%s'" % (attr_name, obj.__class__.__name__)
)
def attr_setdefault(obj, name, value):
"""Like dict.setdefault, but for objects."""
if not hasattr(obj, name):
@@ -513,42 +483,6 @@ def copy(self):
return clone
def in_function(function_name):
"""True if the caller was called from some function with
the supplied Name, False otherwise."""
stack = inspect.stack()
try:
for elt in stack[2:]:
if elt[3] == function_name:
return True
return False
finally:
del stack
def check_kwargs(kwargs, fun):
"""Helper for making functions with kwargs. Checks whether the kwargs
are empty after all of them have been popped off. If they're
not, raises an error describing which kwargs are invalid.
Example::
def foo(self, **kwargs):
x = kwargs.pop('x', None)
y = kwargs.pop('y', None)
z = kwargs.pop('z', None)
check_kwargs(kwargs, self.foo)
# This raises a TypeError:
foo(w='bad kwarg')
"""
if kwargs:
raise TypeError(
"'%s' is an invalid keyword argument for function %s()."
% (next(iter(kwargs)), fun.__name__)
)
def match_predicate(*args):
"""Utility function for making string matching predicates.
@@ -764,11 +698,6 @@ def pretty_seconds(seconds):
return pretty_seconds_formatter(seconds)(seconds)
class RequiredAttributeError(ValueError):
def __init__(self, message):
super().__init__(message)
class ObjectWrapper:
"""Base class that wraps an object. Derived classes can add new behavior
while staying undercover.
@@ -843,6 +772,30 @@ def __repr__(self):
return repr(self.instance)
def get_entry_points(*, group: str):
"""Wrapper for ``importlib.metadata.entry_points``
Args:
group: entry points to select
Returns:
EntryPoints for ``group`` or empty list if unsupported
"""
try:
import importlib.metadata # type: ignore # novermin
except ImportError:
return []
try:
return importlib.metadata.entry_points(group=group)
except TypeError:
# Prior to Python 3.10, entry_points accepted no parameters and always
# returned a dictionary of entry points, keyed by group. See
# https://docs.python.org/3/library/importlib.metadata.html#entry-points
return importlib.metadata.entry_points().get(group, [])
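For instance (a sketch using the standard ``console_scripts`` group rather than any Spack-specific group):

for entry_point in get_entry_points(group="console_scripts"):
    print(entry_point.name)  # yields nothing if importlib.metadata is unavailable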
def load_module_from_file(module_name, module_path):
"""Loads a python module from the path of the corresponding file.
@@ -911,25 +864,6 @@ def uniq(sequence):
return uniq_list
def star(func):
"""Unpacks arguments for use with Multiprocessing mapping functions"""
def _wrapper(args):
return func(*args)
return _wrapper
class Devnull:
"""Null stream with less overhead than ``os.devnull``.
See https://stackoverflow.com/a/2929954.
"""
def write(self, *_):
pass
def elide_list(line_list, max_num=10):
"""Takes a long list and limits it to a smaller number of elements,
replacing intervening elements with '...'. For example::

View File

@@ -815,10 +815,6 @@ def __init__(self, path):
super().__init__(msg)
class LockLimitError(LockError):
"""Raised when exceed maximum attempts to acquire a lock."""
class LockTimeoutError(LockError):
"""Raised when an attempt to acquire a lock times out."""

View File

@@ -189,6 +189,7 @@ def _windows_can_symlink() -> bool:
import llnl.util.filesystem as fs
fs.touchp(fpath)
fs.mkdirp(dpath)
try:
os.symlink(dpath, dlink)

View File

@@ -44,10 +44,6 @@ def is_debug(level=1):
return _debug >= level
def is_stacktrace():
return _stacktrace
def set_debug(level=0):
global _debug
assert level >= 0, "Debug level must be a non-negative value"
@@ -252,37 +248,6 @@ def die(message, *args, **kwargs) -> NoReturn:
sys.exit(1)
def get_number(prompt, **kwargs):
default = kwargs.get("default", None)
abort = kwargs.get("abort", None)
if default is not None and abort is not None:
prompt += " (default is %s, %s to abort) " % (default, abort)
elif default is not None:
prompt += " (default is %s) " % default
elif abort is not None:
prompt += " (%s to abort) " % abort
number = None
while number is None:
msg(prompt, newline=False)
ans = input()
if ans == str(abort):
return None
if ans:
try:
number = int(ans)
if number < 1:
msg("Please enter a valid number.")
number = None
except ValueError:
msg("Please enter a valid number.")
elif default is not None:
number = default
return number
def get_yes_or_no(prompt, **kwargs):
default_value = kwargs.get("default", None)

View File

@@ -1541,7 +1541,7 @@ def fetch_url_to_mirror(url):
response = spack.oci.opener.urlopen(
urllib.request.Request(
url=ref.manifest_url(),
headers={"Accept": "application/vnd.oci.image.manifest.v1+json"},
headers={"Accept": ", ".join(spack.oci.oci.manifest_content_type)},
)
)
except Exception:

View File

@@ -213,9 +213,6 @@ def _root_spec(spec_str: str) -> str:
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif platform == "windows":
# TODO (johnwparent): Remove version constraint when clingo patch is up
spec_str += " %msvc@:19.37"
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":

View File

@@ -147,7 +147,7 @@ def _add_compilers_if_missing() -> None:
mixed_toolchain=sys.platform == "darwin"
)
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers, init_config=False)
spack.compilers.add_compilers_to_config(new_compilers)
@contextlib.contextmanager

View File

@@ -541,7 +541,7 @@ def autoreconf(self, pkg, spec, prefix):
if os.path.exists(self.configure_abs_path):
return
# Else try to regenerate it, which reuquires a few build dependencies
# Else try to regenerate it, which requires a few build dependencies
ensure_build_dependencies_or_raise(
spec=spec,
dependencies=["autoconf", "automake", "libtool"],

View File

@@ -69,7 +69,7 @@ class MSBuildBuilder(BaseBuilder):
@property
def build_directory(self):
"""Return the directory containing the MSBuild solution or vcxproj."""
return self.pkg.stage.source_path
return fs.windows_sfn(self.pkg.stage.source_path)
@property
def toolchain_version(self):

View File

@@ -77,7 +77,11 @@ def ignore_quotes(self):
@property
def build_directory(self):
"""Return the directory containing the makefile."""
return self.pkg.stage.source_path if not self.makefile_root else self.makefile_root
return (
fs.windows_sfn(self.pkg.stage.source_path)
if not self.makefile_root
else fs.windows_sfn(self.makefile_root)
)
@property
def std_nmake_args(self):

View File

@@ -4,12 +4,15 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
from typing import Iterable
from llnl.util.filesystem import filter_file
from llnl.util.filesystem import filter_file, find
from llnl.util.lang import memoized
import spack.builder
import spack.package_base
from spack.directives import build_system, extends
from spack.install_test import SkipTest, test_part
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_build_time_tests
@@ -28,6 +31,58 @@ class PerlPackage(spack.package_base.PackageBase):
extends("perl", when="build_system=perl")
@property
@memoized
def _platform_dir(self):
"""Name of platform-specific module subdirectory."""
perl = self.spec["perl"].command
options = "-E", "use Config; say $Config{archname}"
out = perl(*options, output=str.split, error=str.split)
return out.strip()
@property
def use_modules(self) -> Iterable[str]:
"""Names of the package's perl modules."""
module_files = find(self.prefix.lib, ["*.pm"], recursive=True)
# Drop the platform directory, if present
if self._platform_dir:
platform_dir = self._platform_dir + os.sep
module_files = [m.replace(platform_dir, "") for m in module_files]
# Drop the extension and library path
prefix = self.prefix.lib + os.sep
modules = [os.path.splitext(m)[0].replace(prefix, "") for m in module_files]
# Drop the perl subdirectory as well
return ["::".join(m.split(os.sep)[1:]) for m in modules]
@property
def skip_modules(self) -> Iterable[str]:
"""Names of modules that should be skipped when running tests.
These are a subset of use_modules.
Returns:
List of strings of module names.
"""
return []
def test_use(self):
"""Test 'use module'"""
if not self.use_modules:
raise SkipTest("Test requires use_modules package property.")
perl = self.spec["perl"].command
for module in self.use_modules:
if module in self.skip_modules:
continue
with test_part(self, f"test_use-{module}", purpose=f"checking use of {module}"):
options = ["-we", f'use strict; use {module}; print("OK\n")']
out = perl(*options, output=str.split, error=str.split)
assert "OK" in out
@spack.builder.builder("perl")
class PerlBuilder(BaseBuilder):
@@ -52,7 +107,7 @@ class PerlBuilder(BaseBuilder):
phases = ("configure", "build", "install")
#: Names associated with package methods in the old build-system format
legacy_methods = ("configure_args", "check")
legacy_methods = ("configure_args", "check", "test_use")
#: Names associated with package attributes in the old build-system format
legacy_attributes = ()

View File

@@ -2,7 +2,10 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import functools
import inspect
import operator
import os
import re
import shutil
@@ -24,7 +27,7 @@
import spack.package_base
import spack.spec
import spack.store
from spack.directives import build_system, depends_on, extends, maintainers
from spack.directives import build_system, depends_on, extends
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part
from spack.spec import Spec
@@ -53,8 +56,6 @@ def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
class PythonExtension(spack.package_base.PackageBase):
maintainers("adamjstewart")
@property
def import_modules(self) -> Iterable[str]:
"""Names of modules that the Python package provides.
@@ -180,7 +181,7 @@ def add_files_to_view(self, view, merge_map, skip_if_exists=True):
except (OSError, KeyError):
target = None
if target:
os.symlink(target, dst)
os.symlink(os.path.relpath(target, os.path.dirname(dst)), dst)
else:
view.link(src, dst, spec=self.spec)
@@ -368,16 +369,19 @@ def headers(self) -> HeaderList:
# Remove py- prefix in package name
name = self.spec.name[3:]
# Headers may be in either location
# Headers should only be in include or platlib, but no harm in checking purelib too
include = self.prefix.join(self.spec["python"].package.include).join(name)
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
headers = fs.find_all_headers(include) + fs.find_all_headers(platlib)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
headers_list = map(fs.find_all_headers, [include, platlib, purelib])
headers = functools.reduce(operator.add, headers_list)
if headers:
return headers
msg = "Unable to locate {} headers in {} or {}"
raise NoHeadersError(msg.format(self.spec.name, include, platlib))
msg = "Unable to locate {} headers in {}, {}, or {}"
raise NoHeadersError(msg.format(self.spec.name, include, platlib, purelib))
@property
def libs(self) -> LibraryList:
@@ -386,15 +390,19 @@ def libs(self) -> LibraryList:
# Remove py- prefix in package name
name = self.spec.name[3:]
root = self.prefix.join(self.spec["python"].package.platlib).join(name)
# Libraries should only be in platlib, but no harm in checking purelib too
platlib = self.prefix.join(self.spec["python"].package.platlib).join(name)
purelib = self.prefix.join(self.spec["python"].package.purelib).join(name)
libs = fs.find_all_libraries(root, recursive=True)
find_all_libraries = functools.partial(fs.find_all_libraries, recursive=True)
libs_list = map(find_all_libraries, [platlib, purelib])
libs = functools.reduce(operator.add, libs_list)
if libs:
return libs
msg = "Unable to recursively locate {} libraries in {}"
raise NoLibrariesError(msg.format(self.spec.name, root))
msg = "Unable to recursively locate {} libraries in {} or {}"
raise NoLibrariesError(msg.format(self.spec.name, platlib, purelib))
@spack.builder.builder("python_pip")

View File

@@ -162,23 +162,9 @@ def hip_flags(amdgpu_target):
# Add compiler minimum versions based on the first release where the
# processor is included in llvm/lib/Support/TargetParser.cpp
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx900:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx906:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx908:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx90c")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack-")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack+")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx940")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx941")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx942")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1013")
depends_on("llvm-amdgpu@3.8.0:", when="amdgpu_target=gfx1030")
depends_on("llvm-amdgpu@3.9.0:", when="amdgpu_target=gfx1031")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx1032")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx1033")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx1034")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1035")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx1036")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1100")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1101")

View File

@@ -9,6 +9,8 @@
import inspect
from typing import List, Optional, Tuple
from llnl.util import lang
import spack.build_environment
#: Builder classes, as registered by the "builder" decorator
@@ -231,24 +233,27 @@ def __new__(mcs, name, bases, attr_dict):
for temporary_stage in (_RUN_BEFORE, _RUN_AFTER):
staged_callbacks = temporary_stage.callbacks
# We don't have callbacks in this class, move on
if not staged_callbacks:
# Here we have an adapter from an old-style package. This means there is no
# hierarchy of builders, and every callback that had to be combined between
# *Package and *Builder has been combined already by _PackageAdapterMeta
if name == "Adapter":
continue
# If we are here we have callbacks. To get a complete list, get first what
# was attached to parent classes, then prepend what we have registered here.
# If we are here we have callbacks. To get a complete list, we accumulate all the
# callbacks from base classes, deduplicate them, and then prepend what we have
# registered here.
#
# The order should be:
# 1. Callbacks are registered in order within the same class
# 2. Callbacks defined in derived classes precede those defined in base
# classes
callbacks_from_base = []
for base in bases:
callbacks_from_base = getattr(base, temporary_stage.attribute_name, None)
if callbacks_from_base:
break
else:
callbacks_from_base = []
current_callbacks = getattr(base, temporary_stage.attribute_name, None)
if not current_callbacks:
continue
callbacks_from_base.extend(current_callbacks)
callbacks_from_base = list(lang.dedupe(callbacks_from_base))
# Set the callbacks in this class and flush the temporary stage
attr_dict[temporary_stage.attribute_name] = staged_callbacks[:] + callbacks_from_base
del temporary_stage.callbacks[:]

View File

@@ -594,6 +594,15 @@ def _put_manifest(
base_manifest, base_config = base_images[architecture]
env = _retrieve_env_dict_from_config(base_config)
# If the base image uses `vnd.docker.distribution.manifest.v2+json`, then we use that too.
# This is because Singularity / Apptainer is very strict about not mixing them.
base_manifest_mediaType = base_manifest.get(
"mediaType", "application/vnd.oci.image.manifest.v1+json"
)
use_docker_format = (
base_manifest_mediaType == "application/vnd.docker.distribution.manifest.v2+json"
)
spack.user_environment.environment_modifications_for_specs(*specs).apply_modifications(env)
# Create an oci.image.config file
@@ -625,8 +634,8 @@ def _put_manifest(
# Upload the config file
upload_blob_with_retry(image_ref, file=config_file, digest=config_file_checksum)
oci_manifest = {
"mediaType": "application/vnd.oci.image.manifest.v1+json",
manifest = {
"mediaType": base_manifest_mediaType,
"schemaVersion": 2,
"config": {
"mediaType": base_manifest["config"]["mediaType"],
@@ -637,7 +646,11 @@ def _put_manifest(
*(layer for layer in base_manifest["layers"]),
*(
{
"mediaType": "application/vnd.oci.image.layer.v1.tar+gzip",
"mediaType": (
"application/vnd.docker.image.rootfs.diff.tar.gzip"
if use_docker_format
else "application/vnd.oci.image.layer.v1.tar+gzip"
),
"digest": str(checksums[s.dag_hash()].compressed_digest),
"size": checksums[s.dag_hash()].size,
}
@@ -646,11 +659,11 @@ def _put_manifest(
],
}
if annotations:
oci_manifest["annotations"] = annotations
if not use_docker_format and annotations:
manifest["annotations"] = annotations
# Finally upload the manifest
upload_manifest_with_retry(image_ref, oci_manifest=oci_manifest)
upload_manifest_with_retry(image_ref, manifest=manifest)
# delete the config file
os.unlink(config_file)

View File

@@ -404,6 +404,17 @@ def no_install_status():
)
@arg
def debug_nondefaults():
return Args(
"-d",
"--debug-nondefaults",
action="store_true",
default=False,
help="show non-default decisions in red",
)
@arg
def no_checksum():
return Args(
@@ -570,6 +581,14 @@ def add_concretizer_args(subparser):
default=None,
help="reuse installed dependencies only",
)
subgroup.add_argument(
"--deprecated",
action=ConfigSetAction,
dest="config:deprecated",
const=True,
default=None,
help="allow concretizer to select deprecated versions",
)
def add_connection_args(subparser, add_help):

View File

@@ -115,7 +115,7 @@ def emulate_env_utility(cmd_name, context: Context, args):
f"Not all dependencies of {spec.name} are installed. "
f"Cannot setup {context} environment:",
spec.tree(
install_status=True,
status_fn=spack.spec.Spec.install_status,
hashlen=7,
hashes=True,
# This shows more than necessary, but we cannot dynamically change deptypes

View File

@@ -89,7 +89,7 @@ def compiler_find(args):
paths, scope=None, mixed_toolchain=args.mixed_toolchain
)
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers, scope=args.scope, init_config=False)
spack.compilers.add_compilers_to_config(new_compilers, scope=args.scope)
n = len(new_compilers)
s = "s" if n > 1 else ""

View File

@@ -19,7 +19,7 @@
def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["jobs"])
arguments.add_common_arguments(subparser, ["jobs", "no_checksum", "spec"])
subparser.add_argument(
"-d",
"--source-path",
@@ -34,7 +34,6 @@ def setup_parser(subparser):
dest="ignore_deps",
help="do not try to install dependencies of requested packages",
)
arguments.add_common_arguments(subparser, ["no_checksum", "deprecated"])
subparser.add_argument(
"--keep-prefix",
action="store_true",
@@ -63,7 +62,6 @@ def setup_parser(subparser):
choices=["root", "all"],
help="run tests on only root packages or all packages",
)
arguments.add_common_arguments(subparser, ["spec"])
stop_group = subparser.add_mutually_exclusive_group()
stop_group.add_argument(
@@ -125,9 +123,6 @@ def dev_build(self, args):
if args.no_checksum:
spack.config.set("config:checksum", False, scope="command_line")
if args.deprecated:
spack.config.set("config:deprecated", True, scope="command_line")
tests = False
if args.test == "all":
tests = True

View File

@@ -270,7 +270,8 @@ def create_temp_env_directory():
def _tty_info(msg):
"""tty.info like function that prints the equivalent printf statement for eval."""
decorated = f'{colorize("@*b{==>}")} {msg}\n'
print(f"printf {shlex.quote(decorated)};")
executor = "echo" if sys.platform == "win32" else "printf"
print(f"{executor} {shlex.quote(decorated)};")
def env_activate(args):

View File

@@ -18,6 +18,7 @@
import spack.cray_manifest as cray_manifest
import spack.detection
import spack.error
import spack.repo
import spack.util.environment
from spack.cmd.common import arguments
@@ -152,9 +153,9 @@ def external_find(args):
def packages_to_search_for(
*, names: Optional[List[str]], tags: List[str], exclude: Optional[List[str]]
):
result = []
for current_tag in tags:
result.extend(spack.repo.PATH.packages_with_tags(current_tag, full=True))
result = list(
{pkg for tag in tags for pkg in spack.repo.PATH.packages_with_tags(tag, full=True)}
)
if names:
# Match both fully qualified and unqualified

View File

@@ -18,7 +18,7 @@
def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["no_checksum", "deprecated"])
arguments.add_common_arguments(subparser, ["no_checksum", "specs"])
subparser.add_argument(
"-m",
"--missing",
@@ -28,7 +28,7 @@ def setup_parser(subparser):
subparser.add_argument(
"-D", "--dependencies", action="store_true", help="also fetch all dependencies"
)
arguments.add_common_arguments(subparser, ["specs"])
arguments.add_concretizer_args(subparser)
subparser.epilog = (
"With an active environment, the specs "
"parameter can be omitted. In this case all (uninstalled"
@@ -40,9 +40,6 @@ def fetch(parser, args):
if args.no_checksum:
spack.config.set("config:checksum", False, scope="command_line")
if args.deprecated:
spack.config.set("config:deprecated", True, scope="command_line")
if args.specs:
specs = spack.cmd.parse_specs(args.specs, concretize=True)
else:

View File

@@ -475,10 +475,7 @@ def print_virtuals(pkg, args):
color.cprint(section_title("Virtual Packages: "))
if pkg.provided:
for when, specs in reversed(sorted(pkg.provided.items())):
line = " %s provides %s" % (
when.colorized(),
", ".join(s.colorized() for s in specs),
)
line = " %s provides %s" % (when.cformat(), ", ".join(s.cformat() for s in specs))
print(line)
else:

View File

@@ -176,7 +176,7 @@ def setup_parser(subparser):
dest="install_source",
help="install source files in prefix",
)
arguments.add_common_arguments(subparser, ["no_checksum", "deprecated"])
arguments.add_common_arguments(subparser, ["no_checksum"])
subparser.add_argument(
"-v",
"--verbose",
@@ -326,9 +326,6 @@ def install(parser, args):
if args.no_checksum:
spack.config.set("config:checksum", False, scope="command_line")
if args.deprecated:
spack.config.set("config:deprecated", True, scope="command_line")
if args.log_file and not args.log_format:
msg = "the '--log-format' must be specified when using '--log-file'"
tty.die(msg)

View File

@@ -53,6 +53,7 @@ def setup_parser(subparser):
"-S", "--stages", action="store_true", help="top level stage directory"
)
directories.add_argument(
"-c",
"--source-dir",
action="store_true",
help="source directory for a spec (requires it to be staged first)",

View File

@@ -28,7 +28,7 @@
def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["no_checksum", "deprecated"])
arguments.add_common_arguments(subparser, ["no_checksum"])
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="mirror_command")
@@ -72,6 +72,7 @@ def setup_parser(subparser):
" retrieve all versions of each package",
)
arguments.add_common_arguments(create_parser, ["specs"])
arguments.add_concretizer_args(create_parser)
# Destroy
destroy_parser = sp.add_parser("destroy", help=mirror_destroy.__doc__)
@@ -549,7 +550,4 @@ def mirror(parser, args):
if args.no_checksum:
spack.config.set("config:checksum", False, scope="command_line")
if args.deprecated:
spack.config.set("config:deprecated", True, scope="command_line")
action[args.mirror_command](args)

View File

@@ -19,7 +19,7 @@
def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["no_checksum", "deprecated", "specs"])
arguments.add_common_arguments(subparser, ["no_checksum", "specs"])
arguments.add_concretizer_args(subparser)
@@ -33,9 +33,6 @@ def patch(parser, args):
if args.no_checksum:
spack.config.set("config:checksum", False, scope="command_line")
if args.deprecated:
spack.config.set("config:deprecated", True, scope="command_line")
specs = spack.cmd.parse_specs(args.specs, concretize=False)
for spec in specs:
_patch(spack.cmd.matching_spec_from_env(spec).package)

View File

@@ -45,7 +45,9 @@ def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["long", "very_long", "namespaces"])
install_status_group = subparser.add_mutually_exclusive_group()
arguments.add_common_arguments(install_status_group, ["install_status", "no_install_status"])
arguments.add_common_arguments(
install_status_group, ["install_status", "no_install_status", "debug_nondefaults"]
)
subparser.add_argument(
"-y",
@@ -127,14 +129,13 @@ def _process_result(result, show, required_format, kwargs):
print()
if result.unsolved_specs and "solutions" in show:
tty.msg("Unsolved specs")
for spec in result.unsolved_specs:
print(spec)
print()
tty.msg(asp.Result.format_unsolved(result.unsolved_specs))
def solve(parser, args):
# these are the same options as `spack spec`
install_status_fn = spack.spec.Spec.install_status
fmt = spack.spec.DISPLAY_FORMAT
if args.namespaces:
fmt = "{namespace}." + fmt
@@ -144,8 +145,9 @@ def solve(parser, args):
"format": fmt,
"hashlen": None if args.very_long else 7,
"show_types": args.types,
"install_status": args.install_status,
"status_fn": install_status_fn if args.install_status else None,
"hashes": args.long or args.very_long,
"nondefaults": args.debug_nondefaults,
}
# process output options

View File

@@ -32,7 +32,9 @@ def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["long", "very_long", "namespaces"])
install_status_group = subparser.add_mutually_exclusive_group()
arguments.add_common_arguments(install_status_group, ["install_status", "no_install_status"])
arguments.add_common_arguments(
install_status_group, ["install_status", "no_install_status", "debug_nondefaults"]
)
format_group = subparser.add_mutually_exclusive_group()
format_group.add_argument(
@@ -75,6 +77,8 @@ def setup_parser(subparser):
def spec(parser, args):
install_status_fn = spack.spec.Spec.install_status
fmt = spack.spec.DISPLAY_FORMAT
if args.namespaces:
fmt = "{namespace}." + fmt
@@ -82,9 +86,11 @@ def spec(parser, args):
tree_kwargs = {
"cover": args.cover,
"format": fmt,
"hashes": args.long or args.very_long,
"nondefaults": args.debug_nondefaults,
"hashlen": None if args.very_long else 7,
"show_types": args.types,
"install_status": args.install_status,
"status_fn": install_status_fn if args.install_status else None,
}
# use a read transaction if we are getting install status for every
@@ -123,12 +129,14 @@ def spec(parser, args):
# repeated output. This happens because parse_specs outputs concrete
# specs for `/hash` inputs.
if not input.concrete:
tree_kwargs["hashes"] = False # Always False for input spec
# NOTE: can use tree_kwargs | overrides in python 3.9+
overrides = {"hashes": False}
overridden_kwargs = {**tree_kwargs, **overrides}
print("Input spec")
print("--------------------------------")
print(input.tree(**tree_kwargs))
print(input.tree(**overridden_kwargs))
print("Concretized")
print("--------------------------------")
tree_kwargs["hashes"] = args.long or args.very_long
print(output.tree(**tree_kwargs))

View File

@@ -22,7 +22,7 @@
def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["no_checksum", "deprecated", "specs"])
arguments.add_common_arguments(subparser, ["no_checksum", "specs"])
subparser.add_argument(
"-p", "--path", dest="path", help="path to stage package, does not add to spack tree"
)
@@ -33,9 +33,6 @@ def stage(parser, args):
if args.no_checksum:
spack.config.set("config:checksum", False, scope="command_line")
if args.deprecated:
spack.config.set("config:deprecated", True, scope="command_line")
if not args.specs:
env = ev.active_environment()
if not env:

View File

@@ -228,7 +228,7 @@ def create_reporter(args, specs_to_test, test_suite):
def test_list(args):
"""list installed packages with available tests"""
tagged = set(spack.repo.PATH.packages_with_tags(*args.tag)) if args.tag else set()
tagged = spack.repo.PATH.packages_with_tags(*args.tag) if args.tag else set()
def has_test_and_tags(pkg_class):
tests = spack.install_test.test_functions(pkg_class)

View File

@@ -334,6 +334,40 @@ def __init__(
# used for version checks for API, e.g. C++11 flag
self._real_version = None
def __eq__(self, other):
return (
self.cc == other.cc
and self.cxx == other.cxx
and self.fc == other.fc
and self.f77 == other.f77
and self.spec == other.spec
and self.operating_system == other.operating_system
and self.target == other.target
and self.flags == other.flags
and self.modules == other.modules
and self.environment == other.environment
and self.extra_rpaths == other.extra_rpaths
and self.enable_implicit_rpaths == other.enable_implicit_rpaths
)
def __hash__(self):
return hash(
(
self.cc,
self.cxx,
self.fc,
self.f77,
self.spec,
self.operating_system,
self.target,
str(self.flags),
str(self.modules),
str(self.environment),
str(self.extra_rpaths),
self.enable_implicit_rpaths,
)
)
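One detail worth noting in the __hash__ above: the mutable fields (flags, modules, environment, extra_rpaths) are stringified because they are not hashable. A quick illustration:

    try:
        hash({"cflags": ["-O2"]})
    except TypeError:
        # dicts and lists are unhashable, hence hash(str(self.flags)) etc.
        pass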
def verify_executables(self):
"""Raise an error if any of the compiler executables is not valid.
@@ -389,8 +423,7 @@ def implicit_rpaths(self):
# Put CXX first since it has the most linking issues
# And because it has flags that affect linking
exe_paths = [x for x in [self.cxx, self.cc, self.fc, self.f77] if x]
link_dirs = self._get_compiler_link_paths(exe_paths)
link_dirs = self._get_compiler_link_paths()
all_required_libs = list(self.required_libs) + Compiler._all_compiler_rpath_libraries
return list(paths_containing_libs(link_dirs, all_required_libs))
@@ -403,43 +436,33 @@ def required_libs(self):
# By default every compiler returns the empty list
return []
def _get_compiler_link_paths(self, paths):
first_compiler = next((c for c in paths if c), None)
if not first_compiler:
return []
if not self.verbose_flag:
# In this case there is no mechanism to learn what link directories
# are used by the compiler
def _get_compiler_link_paths(self):
cc = self.cc if self.cc else self.cxx
if not cc or not self.verbose_flag:
# Cannot determine implicit link paths without a compiler / verbose flag
return []
# What flag types apply to first_compiler, in what order
flags = ["cppflags", "ldflags"]
if first_compiler == self.cc:
flags = ["cflags"] + flags
elif first_compiler == self.cxx:
flags = ["cxxflags"] + flags
if cc == self.cc:
flags = ["cflags", "cppflags", "ldflags"]
else:
flags.append("fflags")
flags = ["cxxflags", "cppflags", "ldflags"]
try:
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
fout = os.path.join(tmpdir, "output")
fin = os.path.join(tmpdir, "main.c")
with open(fin, "w+") as csource:
with open(fin, "w") as csource:
csource.write(
"int main(int argc, char* argv[]) { " "(void)argc; (void)argv; return 0; }\n"
"int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n"
)
compiler_exe = spack.util.executable.Executable(first_compiler)
cc_exe = spack.util.executable.Executable(cc)
for flag_type in flags:
for flag in self.flags.get(flag_type, []):
compiler_exe.add_default_arg(flag)
cc_exe.add_default_arg(*self.flags.get(flag_type, []))
output = ""
with self.compiler_environment():
output = str(
compiler_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
) # str for py2
output = cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
return _parse_non_system_link_dirs(output)
except spack.util.executable.ProcessError as pe:
tty.debug("ProcessError: Command exited with non-zero status: " + pe.long_message)

View File

@@ -109,7 +109,7 @@ def _to_dict(compiler):
return {"compiler": d}
def get_compiler_config(scope=None, init_config=True):
def get_compiler_config(scope=None, init_config=False):
"""Return the compiler configuration for the specified architecture."""
config = spack.config.get("compilers", scope=scope) or []
@@ -118,6 +118,8 @@ def get_compiler_config(scope=None, init_config=True):
merged_config = spack.config.get("compilers")
if merged_config:
# Config is empty for this scope
# Do not init config because there is a non-empty scope
return config
_init_compiler_config(scope=scope)
@@ -125,6 +127,95 @@ def get_compiler_config(scope=None, init_config=True):
return config
def get_compiler_config_from_packages(scope=None):
"""Return the compiler configuration from packages.yaml"""
config = spack.config.get("packages", scope=scope)
if not config:
return []
packages = []
compiler_package_names = supported_compilers() + list(package_name_to_compiler_name.keys())
for name, entry in config.items():
if name not in compiler_package_names:
continue
externals_config = entry.get("externals", None)
if not externals_config:
continue
packages.extend(_compiler_config_from_package_config(externals_config))
return packages
def _compiler_config_from_package_config(config):
compilers = []
for entry in config:
compiler = _compiler_config_from_external(entry)
if compiler:
compilers.append(compiler)
return compilers
def _compiler_config_from_external(config):
spec = spack.spec.parse_with_version_concrete(config["spec"])
# use str(spec.versions) to allow `@x.y.z` instead of `@=x.y.z`
compiler_spec = spack.spec.CompilerSpec(
package_name_to_compiler_name.get(spec.name, spec.name), spec.version
)
extra_attributes = config.get("extra_attributes", {})
prefix = config.get("prefix", None)
compiler_class = class_for_compiler_name(compiler_spec.name)
paths = extra_attributes.get("paths", {})
compiler_langs = ["cc", "cxx", "fc", "f77"]
for lang in compiler_langs:
if paths.setdefault(lang, None):
continue
if not prefix:
continue
# Check for files that satisfy the naming scheme for this compiler
bindir = os.path.join(prefix, "bin")
for f, regex in itertools.product(os.listdir(bindir), compiler_class.search_regexps(lang)):
if regex.match(f):
paths[lang] = os.path.join(bindir, f)
if all(v is None for v in paths.values()):
return None
if not spec.architecture:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
target = host_platform.target("default_target").microarchitecture
else:
target = spec.target
if not target:
host_platform = spack.platforms.host()
target = host_platform.target("default_target").microarchitecture
operating_system = spec.os
if not operating_system:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
compiler_entry = {
"compiler": {
"spec": str(compiler_spec),
"paths": paths,
"flags": extra_attributes.get("flags", {}),
"operating_system": str(operating_system),
"target": str(target.family),
"modules": config.get("modules", []),
"environment": extra_attributes.get("environment", {}),
"extra_rpaths": extra_attributes.get("extra_rpaths", []),
"implicit_rpaths": extra_attributes.get("implicit_rpaths", None),
}
}
return compiler_entry
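A hedged example of the translation implemented above, with illustrative paths and versions (the actual os/target strings depend on the host):

    external = {
        "spec": "gcc@12.3.0",
        "prefix": "/usr",
        "extra_attributes": {"paths": {"cc": "/usr/bin/gcc", "cxx": "/usr/bin/g++"}},
    }
    # _compiler_config_from_external(external) would produce roughly:
    # {
    #     "compiler": {
    #         "spec": "gcc@12.3.0",
    #         "paths": {"cc": "/usr/bin/gcc", "cxx": "/usr/bin/g++",
    #                   "f77": None, "fc": None},
    #         "flags": {},
    #         "operating_system": "<host default_os>",
    #         "target": "<host target family>",
    #         "modules": [],
    #         "environment": {},
    #         "extra_rpaths": [],
    #         "implicit_rpaths": None,
    #     }
    # }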
def _init_compiler_config(*, scope):
"""Compiler search used when Spack has no compilers."""
compilers = find_compilers()
@@ -142,17 +233,20 @@ def compiler_config_files():
compiler_config = config.get("compilers", scope=name)
if compiler_config:
config_files.append(config.get_config_filename(name, "compilers"))
compiler_config_from_packages = get_compiler_config_from_packages(scope=name)
if compiler_config_from_packages:
config_files.append(config.get_config_filename(name, "packages"))
return config_files
def add_compilers_to_config(compilers, scope=None, init_config=True):
def add_compilers_to_config(compilers, scope=None):
"""Add compilers to the config for the specified architecture.
Arguments:
compilers: a list of Compiler objects.
scope: configuration scope to modify.
"""
compiler_config = get_compiler_config(scope, init_config)
compiler_config = get_compiler_config(scope, init_config=False)
for compiler in compilers:
if not compiler.cc:
tty.debug(f"{compiler.spec} does not have a C compiler")
@@ -184,6 +278,9 @@ def remove_compiler_from_config(compiler_spec, scope=None):
for current_scope in candidate_scopes:
removal_happened |= _remove_compiler_from_scope(compiler_spec, scope=current_scope)
msg = "`spack compiler remove` will not remove compilers defined in packages.yaml"
msg += "\nTo remove these compilers, either edit the config or use `spack external remove`"
tty.debug(msg)
return removal_happened
@@ -198,7 +295,7 @@ def _remove_compiler_from_scope(compiler_spec, scope):
True if one or more compiler entries were actually removed, False otherwise
"""
assert scope is not None, "a specific scope is needed when calling this function"
compiler_config = get_compiler_config(scope)
compiler_config = get_compiler_config(scope, init_config=False)
filtered_compiler_config = [
compiler_entry
for compiler_entry in compiler_config
@@ -221,7 +318,14 @@ def all_compilers_config(scope=None, init_config=True):
"""Return a set of specs for all the compiler versions currently
available to build with. These are instances of CompilerSpec.
"""
return get_compiler_config(scope, init_config)
from_packages_yaml = get_compiler_config_from_packages(scope)
if from_packages_yaml:
init_config = False
from_compilers_yaml = get_compiler_config(scope, init_config)
result = from_compilers_yaml + from_packages_yaml
key = lambda c: _compiler_from_config_entry(c["compiler"])
return list(llnl.util.lang.dedupe(result, key=key))
def all_compiler_specs(scope=None, init_config=True):
@@ -388,7 +492,7 @@ def find_specs_by_arch(compiler_spec, arch_spec, scope=None, init_config=True):
def all_compilers(scope=None, init_config=True):
config = get_compiler_config(scope, init_config=init_config)
config = all_compilers_config(scope, init_config=init_config)
compilers = list()
for items in config:
items = items["compiler"]
@@ -403,10 +507,7 @@ def compilers_for_spec(
"""This gets all compilers that satisfy the supplied CompilerSpec.
Returns an empty list if none are found.
"""
if use_cache:
config = all_compilers_config(scope, init_config)
else:
config = get_compiler_config(scope, init_config)
config = all_compilers_config(scope, init_config)
matches = set(find(compiler_spec, scope, init_config))
compilers = []
@@ -583,9 +684,7 @@ def get_compiler_duplicates(compiler_spec, arch_spec):
scope_to_compilers = {}
for scope in config.scopes:
compilers = compilers_for_spec(
compiler_spec, arch_spec=arch_spec, scope=scope, use_cache=False
)
compilers = compilers_for_spec(compiler_spec, arch_spec=arch_spec, scope=scope)
if compilers:
scope_to_compilers[scope] = compilers

View File

@@ -10,6 +10,8 @@
import tempfile
from typing import Dict, List, Set
import archspec.cpu
import spack.compiler
import spack.operating_systems.windows_os
import spack.platforms
@@ -186,6 +188,9 @@ def __init__(self, *args, **kwargs):
# get current platform architecture and format for vcvars argument
arch = spack.platforms.real_host().default.lower()
arch = arch.replace("-", "_")
if str(archspec.cpu.host().family) == "x86_64":
arch = "amd64"
self.vcvars_call = VCVarsInvocation(vcvars_script_path, arch, self.msvc_version)
env_cmds.append(self.vcvars_call)
# Below is a check for a valid fortran path
@@ -318,7 +323,7 @@ def fc_version(cls, fc):
fc_path[fc_ver] = fc
if os.getenv("ONEAPI_ROOT"):
try:
sps = spack.operating_systems.windows_os.WindowsOs.compiler_search_paths
sps = spack.operating_systems.windows_os.WindowsOs().compiler_search_paths
except AttributeError:
raise SpackError("Windows compiler search paths not established")
clp = spack.util.executable.which_string("cl", path=sps)

View File

@@ -764,6 +764,31 @@ def _add_platform_scope(
cfg.push_scope(scope_type(plat_name, plat_path))
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
"""Load configuration paths from entry points
A Python package can register entry point metadata so that Spack can find
its configuration by adding the following to the project's pyproject.toml:
.. code-block:: toml
[project.entry-points."spack.config"]
baz = "baz:get_spack_config_path"
The function ``get_spack_config_path`` returns the path to the package's
Spack configuration scope.
"""
config_paths: List[Tuple[str, str]] = []
for entry_point in lang.get_entry_points(group="spack.config"):
hook = entry_point.load()
if callable(hook):
config_path = hook()
if config_path and os.path.exists(config_path):
config_paths.append(("plugin-%s" % entry_point.name, str(config_path)))
return config_paths
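A sketch of how such an entry-point group is consumed with the standard library (llnl.util.lang.get_entry_points presumably wraps something like this; the fallback branch is for Python older than 3.10):

    import importlib.metadata

    def iter_config_hooks():
        try:
            eps = importlib.metadata.entry_points(group="spack.config")
        except TypeError:
            # Python 3.8/3.9: entry_points() returns a dict of groups.
            eps = importlib.metadata.entry_points().get("spack.config", [])
        for entry_point in eps:
            yield entry_point.name, entry_point.load()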
def _add_command_line_scopes(
cfg: Union[Configuration, lang.Singleton], command_line_scopes: List[str]
) -> None:
@@ -816,6 +841,9 @@ def create() -> Configuration:
# No site-level configs should be checked into spack by default.
configuration_paths.append(("site", os.path.join(spack.paths.etc_path)))
# Python packages can register configuration scopes via entry_points
configuration_paths.extend(config_paths_from_entry_points())
# User configuration can override both spack defaults and site config
# This is disabled if user asks for no local configuration.
if not disable_local_config:

View File

@@ -19,9 +19,6 @@
},
"os_package_manager": "dnf",
"build": "spack/fedora38",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/fedora:38"
}
@@ -33,9 +30,6 @@
},
"os_package_manager": "dnf",
"build": "spack/fedora37",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/fedora:37"
}
@@ -47,9 +41,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/rockylinux9",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/rockylinux:9"
}
@@ -61,9 +52,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/rockylinux8",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "docker.io/rockylinux:8"
}
@@ -75,9 +63,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/almalinux9",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "quay.io/almalinuxorg/almalinux:9"
}
@@ -89,9 +74,6 @@
},
"os_package_manager": "dnf_epel",
"build": "spack/almalinux8",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "quay.io/almalinuxorg/almalinux:8"
}
@@ -105,9 +87,6 @@
"build": "spack/centos-stream",
"final": {
"image": "quay.io/centos/centos:stream"
},
"build_tags": {
"develop": "latest"
}
},
"centos:7": {
@@ -115,10 +94,7 @@
"template": "container/centos_7.dockerfile"
},
"os_package_manager": "yum",
"build": "spack/centos7",
"build_tags": {
"develop": "latest"
}
"build": "spack/centos7"
},
"opensuse/leap:15": {
"bootstrap": {
@@ -126,9 +102,6 @@
},
"os_package_manager": "zypper",
"build": "spack/leap15",
"build_tags": {
"develop": "latest"
},
"final": {
"image": "opensuse/leap:latest"
}
@@ -148,19 +121,13 @@
"template": "container/ubuntu_2204.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-jammy",
"build_tags": {
"develop": "latest"
}
"build": "spack/ubuntu-jammy"
},
"ubuntu:20.04": {
"bootstrap": {
"template": "container/ubuntu_2004.dockerfile"
},
"build": "spack/ubuntu-focal",
"build_tags": {
"develop": "latest"
},
"os_package_manager": "apt"
},
"ubuntu:18.04": {
@@ -168,10 +135,7 @@
"template": "container/ubuntu_1804.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-bionic",
"build_tags": {
"develop": "latest"
}
"build": "spack/ubuntu-bionic"
}
},
"os_package_managers": {

View File

@@ -50,10 +50,7 @@ def build_info(image, spack_version):
if not build_image:
return None, None
# Translate version from git to docker if necessary
build_tag = image_data["build_tags"].get(spack_version, spack_version)
return build_image, build_tag
return build_image, spack_version
def os_package_manager_for(image):

View File

@@ -227,7 +227,7 @@ def read(path, apply_updates):
if apply_updates and compilers:
for compiler in compilers:
try:
spack.compilers.add_compilers_to_config([compiler], init_config=False)
spack.compilers.add_compilers_to_config([compiler])
except Exception:
warnings.warn(
f"Could not add compiler {str(compiler.spec)}: "

View File

@@ -1687,7 +1687,11 @@ def root(key, record):
with self.read_transaction():
roots = [rec.spec for key, rec in self._data.items() if root(key, rec)]
needed = set(id(spec) for spec in tr.traverse_nodes(roots, deptype=deptype))
return [rec.spec for rec in self._data.values() if id(rec.spec) not in needed]
return [
rec.spec
for rec in self._data.values()
if id(rec.spec) not in needed and rec.installed
]
def update_explicit(self, spec, explicit):
"""

View File

@@ -660,6 +660,7 @@ def patch(
level: int = 1,
when: WhenType = None,
working_dir: str = ".",
reverse: bool = False,
sha256: Optional[str] = None,
archive_sha256: Optional[str] = None,
) -> Patcher:
@@ -673,10 +674,10 @@ def patch(
level: patch level (as in the patch shell command)
when: optional anonymous spec that specifies when to apply the patch
working_dir: dir to change to before applying
reverse: reverse the patch
sha256: sha256 sum of the patch, used to verify the patch (only required for URL patches)
archive_sha256: sha256 sum of the *archive*, if the patch is compressed (only required for
compressed URL patches)
"""
def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependency]):
@@ -703,18 +704,22 @@ def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependenc
patch: spack.patch.Patch
if "://" in url_or_filename:
if sha256 is None:
raise ValueError("patch() with a url requires a sha256")
patch = spack.patch.UrlPatch(
pkg,
url_or_filename,
level,
working_dir,
working_dir=working_dir,
reverse=reverse,
ordering_key=ordering_key,
sha256=sha256,
archive_sha256=archive_sha256,
)
else:
patch = spack.patch.FilePatch(
pkg, url_or_filename, level, working_dir, ordering_key=ordering_key
pkg, url_or_filename, level, working_dir, reverse, ordering_key=ordering_key
)
cur_patches.append(patch)

View File

@@ -626,14 +626,13 @@ def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
new: If a string, create a FilesystemView rooted at that path. Default None. This
should only be used to regenerate the view, and cannot be used to access specs.
"""
root = new if new else self._current_root
if not root:
path = new if new else self._current_root
if not path:
# This can only be hit if we write a future bug
raise SpackEnvironmentViewError(
"Attempting to get nonexistent view from environment. "
f"View root is at {self.root}"
f"Attempting to get nonexistent view from environment. View root is at {self.root}"
)
return self._view(root)
return self._view(path)
def _view(self, root: str) -> SimpleFilesystemView:
"""Returns a view object for a given root dir."""
@@ -678,7 +677,9 @@ def specs_for_view(self, concrete_roots: List[Spec]) -> List[Spec]:
# Filter selected, installed specs
with spack.store.STORE.db.read_transaction():
return [s for s in specs if s in self and s.installed]
result = [s for s in specs if s in self and s.installed]
return self._exclude_duplicate_runtimes(result)
def regenerate(self, concrete_roots: List[Spec]) -> None:
specs = self.specs_for_view(concrete_roots)
@@ -765,6 +766,16 @@ def regenerate(self, concrete_roots: List[Spec]) -> None:
msg += str(e)
tty.warn(msg)
def _exclude_duplicate_runtimes(self, nodes):
all_runtimes = spack.repo.PATH.packages_with_tags("runtime")
runtimes_by_name = {}
for s in nodes:
if s.name not in all_runtimes:
continue
current_runtime = runtimes_by_name.get(s.name, s)
runtimes_by_name[s.name] = max(current_runtime, s, key=lambda x: x.version)
return [x for x in nodes if x.name not in all_runtimes or runtimes_by_name[x.name] == x]
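Behavior sketch for _exclude_duplicate_runtimes, with (name, version) tuples standing in for specs: among nodes whose package is tagged "runtime", only the highest-versioned instance of each name survives.

    all_runtimes = {"gcc-runtime"}
    nodes = [("gcc-runtime", 12), ("gcc-runtime", 13), ("zlib", 1)]

    best = {}
    for name, version in nodes:
        if name in all_runtimes:
            best[name] = max(best.get(name, version), version)

    kept = [n for n in nodes if n[0] not in all_runtimes or best[n[0]] == n[1]]
    assert kept == [("gcc-runtime", 13), ("zlib", 1)]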
def _create_environment(path):
return Environment(path)
@@ -1416,7 +1427,7 @@ def _concretize_separately(self, tests=False):
# Ensure we have compilers in compilers.yaml to avoid that
# processes try to write the config file in parallel
_ = spack.compilers.get_compiler_config()
_ = spack.compilers.get_compiler_config(init_config=True)
# Early return if there is nothing to do
if len(args) == 0:
@@ -1485,44 +1496,6 @@ def _concretize_separately(self, tests=False):
]
return results
def concretize_and_add(self, user_spec, concrete_spec=None, tests=False):
"""Concretize and add a single spec to the environment.
Concretize the provided ``user_spec`` and add it along with the
concretized result to the environment. If the given ``user_spec`` was
already present in the environment, this does not add a duplicate.
The concretized spec will be added unless the ``user_spec`` was
already present and an associated concrete spec was already present.
Args:
concrete_spec: if provided, then it is assumed that it is the
result of concretizing the provided ``user_spec``
"""
if self.unify is True:
msg = (
"cannot install a single spec in an environment that is "
"configured to be concretized together. Run instead:\n\n"
" $ spack add <spec>\n"
" $ spack install\n"
)
raise SpackEnvironmentError(msg)
spec = Spec(user_spec)
if self.add(spec):
concrete = concrete_spec or spec.concretized(tests=tests)
self._add_concrete_spec(spec, concrete)
else:
# spec might be in the user_specs, but not installed.
# TODO: Redo name-based comparison for old style envs
spec = next(s for s in self.user_specs if s.satisfies(user_spec))
concrete = self.specs_by_hash.get(spec.dag_hash())
if not concrete:
concrete = spec.concretized(tests=tests)
self._add_concrete_spec(spec, concrete)
return concrete
@property
def default_view(self):
if not self.has_view(default_view_name):
@@ -2212,7 +2185,7 @@ def _tree_to_display(spec):
return spec.tree(
recurse_dependencies=True,
format=spack.spec.DISPLAY_FORMAT,
install_status=True,
status_fn=spack.spec.Spec.install_status,
hashlen=7,
hashes=True,
)

View File

@@ -12,6 +12,7 @@
import re
import sys
import types
from pathlib import Path
from typing import List
import llnl.util.lang
@@ -132,10 +133,38 @@ def load_extension(name: str) -> str:
def get_extension_paths():
"""Return the list of canonicalized extension paths from config:extensions."""
extension_paths = spack.config.get("config:extensions") or []
extension_paths.extend(extension_paths_from_entry_points())
paths = [spack.util.path.canonicalize_path(p) for p in extension_paths]
return paths
def extension_paths_from_entry_points() -> List[str]:
"""Load extensions from a Python package's entry points.
A Python package can register entry point metadata so that Spack can find
its extensions by adding the following to the project's pyproject.toml:
.. code-block:: toml
[project.entry-points."spack.extensions"]
baz = "baz:get_spack_extensions"
The function ``get_spack_extensions`` returns paths to the package's
Spack extensions.
"""
extension_paths: List[str] = []
for entry_point in llnl.util.lang.get_entry_points(group="spack.extensions"):
hook = entry_point.load()
if callable(hook):
paths = hook() or []
if isinstance(paths, (Path, str)):
extension_paths.append(str(paths))
else:
extension_paths.extend(paths)
return extension_paths
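On the plugin side, the registered function can be as small as this (hypothetical package named baz; a single str/Path or a list of paths is accepted, per the branches above):

    # baz/__init__.py (hypothetical plugin)
    from pathlib import Path

    def get_spack_extensions():
        return Path(__file__).parent / "spack-ext"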
def get_command_paths():
"""Return the list of paths where to search for command files."""
command_paths = []

View File

@@ -950,14 +950,10 @@ def _main(argv=None):
parser.print_help()
return 1
# -h, -H, and -V are special as they do not require a command, but
# all the other options do nothing without a command.
# version is special as it does not require a command or loading and additional infrastructure
if args.version:
print(get_version())
return 0
elif args.help:
sys.stdout.write(parser.format_help(level=args.help))
return 0
# ------------------------------------------------------------------------
# This part of the `main()` sets up Spack's configuration.
@@ -996,6 +992,12 @@ def _main(argv=None):
print_setup_info(*args.print_shell_vars.split(","))
return 0
# -h and -H are special as they do not require a command, but
# all the other options do nothing without a command.
if args.help:
sys.stdout.write(parser.format_help(level=args.help))
return 0
# At this point we've considered all the options to spack itself, so we
# need a command or we're done.
if not args.command:

View File

@@ -121,43 +121,26 @@ def update_dictionary_extending_lists(target, update):
target[key] = update[key]
def dependencies(spec, request="all"):
"""Returns the list of dependent specs for a given spec, according to the
request passed as parameter.
def dependencies(spec: spack.spec.Spec, request: str = "all") -> List[spack.spec.Spec]:
"""Returns the list of dependent specs for a given spec.
Args:
spec: spec to be analyzed
request: either 'none', 'direct' or 'all'
request: one of "none", "run", "direct", "all"
Returns:
list of dependencies
The return list will be empty if request is 'none', will contain
the direct dependencies if request is 'direct', or the entire DAG
if request is 'all'.
list of requested dependencies
"""
if request not in ("none", "direct", "all"):
message = "Wrong value for argument 'request' : "
message += "should be one of ('none', 'direct', 'all')"
raise tty.error(message + " [current value is '%s']" % request)
if request == "none":
return []
elif request == "run":
return spec.dependencies(deptype=dt.RUN)
elif request == "direct":
return spec.dependencies(deptype=dt.RUN | dt.LINK)
elif request == "all":
return list(spec.traverse(order="topo", deptype=dt.LINK | dt.RUN, root=False))
if request == "direct":
return spec.dependencies(deptype=("link", "run"))
# FIXME : during module file creation nodes seem to be visited multiple
# FIXME : times even if cover='nodes' is given. This workaround permits
# FIXME : us to get a unique list of specs anyhow. Do we miss a merge
# FIXME : step among nodes that refer to the same package?
seen = set()
seen_add = seen.add
deps = sorted(
spec.traverse(order="post", cover="nodes", deptype=("link", "run"), root=False),
reverse=True,
)
return [d for d in deps if not (d in seen or seen_add(d))]
raise ValueError(f'request "{request}" is not one of "none", "direct", "run", "all"')
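The request-to-deptype mapping now reads as a simple table (names only in this sketch; the real code uses spack.deptypes flags through the dt alias):

    REQUEST_TO_DEPTYPE = {
        "none": (),
        "run": ("run",),
        "direct": ("run", "link"),
        "all": ("run", "link"),  # traversed over the whole DAG, topo order
    }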
def merge_config_rules(configuration, spec):

View File

@@ -161,7 +161,7 @@ def upload_blob(
def upload_manifest(
ref: ImageReference,
oci_manifest: dict,
manifest: dict,
tag: bool = True,
_urlopen: spack.oci.opener.MaybeOpen = None,
):
@@ -169,7 +169,7 @@ def upload_manifest(
Args:
ref: The image reference.
oci_manifest: The OCI manifest or index.
manifest: The manifest or index.
tag: When true, use the tag, otherwise use the digest,
this is relevant for multi-arch images, where the
tag is an index, referencing the manifests by digest.
@@ -179,7 +179,7 @@ def upload_manifest(
"""
_urlopen = _urlopen or spack.oci.opener.urlopen
data = json.dumps(oci_manifest, separators=(",", ":")).encode()
data = json.dumps(manifest, separators=(",", ":")).encode()
digest = Digest.from_sha256(hashlib.sha256(data).hexdigest())
size = len(data)
@@ -190,7 +190,7 @@ def upload_manifest(
url=ref.manifest_url(),
method="PUT",
data=data,
headers={"Content-Type": oci_manifest["mediaType"]},
headers={"Content-Type": manifest["mediaType"]},
)
response = _urlopen(request)

View File

@@ -64,7 +64,7 @@
install_test_root,
)
from spack.installer import InstallError, PackageInstaller
from spack.stage import DIYStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.executable import ProcessError, which
from spack.util.package_hash import package_hash
from spack.version import GitVersion, StandardVersion
@@ -566,6 +566,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
provided: Dict["spack.spec.Spec", Set["spack.spec.Spec"]]
provided_together: Dict["spack.spec.Spec", List[Set[str]]]
patches: Dict["spack.spec.Spec", List["spack.patch.Patch"]]
variants: Dict[str, Tuple["spack.variant.Variant", "spack.spec.Spec"]]
#: By default, packages are not virtual
#: Virtual packages override this attribute
@@ -1074,7 +1075,12 @@ def _make_stage(self):
# If it's a dev package (not transitively), use a DevelopStage object
dev_path_var = self.spec.variants.get("dev_path", None)
if dev_path_var:
return DIYStage(dev_path_var.value)
dev_path = dev_path_var.value
link_format = spack.config.get("config:develop_stage_link")
if not link_format:
link_format = "build-{arch}-{hash:7}"
stage_link = self.spec.format_path(link_format)
return DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
# To fetch the current version
source_stage = self._make_root_stage(self.fetcher)
@@ -1406,7 +1412,7 @@ def do_fetch(self, mirror_only=False):
return
checksum = spack.config.get("config:checksum")
fetch = self.stage.managed_by_spack
fetch = self.stage.needs_fetching
if (
checksum
and fetch
@@ -1479,9 +1485,6 @@ def do_stage(self, mirror_only=False):
if self.has_code:
self.do_fetch(mirror_only)
self.stage.expand_archive()
if not os.listdir(self.stage.path):
raise spack.error.FetchError("Archive was empty for %s" % self.name)
else:
# Support for post-install hooks requires a stage.source_path
fsys.mkdirp(self.stage.source_path)
@@ -1515,7 +1518,7 @@ def do_patch(self):
# If we encounter an archive that failed to patch, restage it
# so that we can apply all the patches again.
if os.path.isfile(bad_file):
if self.stage.managed_by_spack:
if self.stage.requires_patch_success:
tty.debug("Patching failed last time. Restaging.")
self.stage.restage()
else:
@@ -1536,6 +1539,8 @@ def do_patch(self):
tty.msg("No patches needed for {0}".format(self.name))
return
errors = []
# Apply all the patches for specs that match this one
patched = False
for patch in patches:
@@ -1545,12 +1550,16 @@ def do_patch(self):
tty.msg("Applied patch {0}".format(patch.path_or_url))
patched = True
except spack.error.SpackError as e:
tty.debug(e)
# Touch bad file if anything goes wrong.
tty.msg("Patch %s failed." % patch.path_or_url)
fsys.touch(bad_file)
raise
error_msg = f"Patch {patch.path_or_url} failed."
if self.stage.requires_patch_success:
tty.msg(error_msg)
raise
else:
tty.debug(error_msg)
tty.debug(e)
errors.append(e)
if has_patch_fun:
try:
@@ -1568,24 +1577,29 @@ def do_patch(self):
# printed a message for each patch.
tty.msg("No patches needed for {0}".format(self.name))
except spack.error.SpackError as e:
tty.debug(e)
# Touch bad file if anything goes wrong.
tty.msg("patch() function failed for {0}".format(self.name))
fsys.touch(bad_file)
raise
error_msg = f"patch() function failed for {self.name}"
if self.stage.requires_patch_success:
tty.msg(error_msg)
raise
else:
tty.debug(error_msg)
tty.debug(e)
errors.append(e)
# Get rid of any old failed file -- patches have either succeeded
# or are not needed. This is mostly defensive -- it's needed
# if the restage() method doesn't clean *everything* (e.g., for a repo)
if os.path.isfile(bad_file):
os.remove(bad_file)
if not errors:
# Get rid of any old failed file -- patches have either succeeded
# or are not needed. This is mostly defensive -- it's needed
# if we didn't restage
if os.path.isfile(bad_file):
os.remove(bad_file)
# touch good or no patches file so that we skip next time.
if patched:
fsys.touch(good_file)
else:
fsys.touch(no_patches_file)
# touch good or no patches file so that we skip next time.
if patched:
fsys.touch(good_file)
else:
fsys.touch(no_patches_file)
@classmethod
def all_patches(cls):

View File

@@ -9,9 +9,9 @@
import os.path
import pathlib
import sys
from typing import Any, Dict, Optional, Tuple, Type
import llnl.util.filesystem
import llnl.util.lang
from llnl.url import allowed_archive
import spack
@@ -25,15 +25,21 @@
from spack.util.executable import which, which_string
def apply_patch(stage, patch_path, level=1, working_dir="."):
def apply_patch(
stage: "spack.stage.Stage",
patch_path: str,
level: int = 1,
working_dir: str = ".",
reverse: bool = False,
) -> None:
"""Apply the patch at patch_path to code in the stage.
Args:
stage (spack.stage.Stage): stage with code that will be patched
patch_path (str): filesystem location for the patch to apply
level (int or None): patch level (default 1)
working_dir (str): relative path *within* the stage to change to
(default '.')
stage: stage with code that will be patched
patch_path: filesystem location for the patch to apply
level: patch level
working_dir: relative path *within* the stage to change to
reverse: reverse the patch
"""
git_utils_path = os.environ.get("PATH", "")
if sys.platform == "win32":
@@ -44,6 +50,10 @@ def apply_patch(stage, patch_path, level=1, working_dir="."):
git_root = git_root / "usr" / "bin"
git_utils_path = os.pathsep.join([str(git_root), git_utils_path])
args = ["-s", "-p", str(level), "-i", patch_path, "-d", working_dir]
if reverse:
args.append("-R")
# TODO: Decouple Spack's patch support on Windows from Git
# for Windows, and instead have Spack directly fetch, install, and
# utilize that patch.
@@ -52,22 +62,36 @@ def apply_patch(stage, patch_path, level=1, working_dir="."):
# flag is passed.
patch = which("patch", required=True, path=git_utils_path)
with llnl.util.filesystem.working_dir(stage.source_path):
patch("-s", "-p", str(level), "-i", patch_path, "-d", working_dir)
patch(*args)
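The argument assembly above, distilled into a standalone sketch that mirrors the -R handling:

    def patch_args(patch_path, level=1, working_dir=".", reverse=False):
        args = ["-s", "-p", str(level), "-i", patch_path, "-d", working_dir]
        if reverse:
            args.append("-R")
        return args

    assert patch_args("fix.patch", reverse=True)[-1] == "-R"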
class Patch:
"""Base class for patches.
Arguments:
pkg (str): the package that owns the patch
The owning package is not necessarily the package to apply the patch
to -- in the case where a dependent package patches its dependency,
it is the dependent's fullname.
"""
def __init__(self, pkg, path_or_url, level, working_dir):
sha256: str
def __init__(
self,
pkg: "spack.package_base.PackageBase",
path_or_url: str,
level: int,
working_dir: str,
reverse: bool = False,
) -> None:
"""Initialize a new Patch instance.
Args:
pkg: the package that owns the patch
path_or_url: the relative path or URL to a patch file
level: patch level
working_dir: relative path *within* the stage to change to
reverse: reverse the patch
"""
# validate level (must be an integer >= 0)
if not isinstance(level, int) or not level >= 0:
raise ValueError("Patch level needs to be a non-negative integer.")
@@ -75,59 +99,88 @@ def __init__(self, pkg, path_or_url, level, working_dir):
# Attributes shared by all patch subclasses
self.owner = pkg.fullname
self.path_or_url = path_or_url # needed for debug output
self.path = None # must be set before apply()
self.path: Optional[str] = None # must be set before apply()
self.level = level
self.working_dir = working_dir
self.reverse = reverse
def apply(self, stage: "spack.stage.Stage"):
def apply(self, stage: "spack.stage.Stage") -> None:
"""Apply a patch to source in a stage.
Arguments:
stage (spack.stage.Stage): stage where source code lives
Args:
stage: stage where source code lives
"""
if not self.path or not os.path.isfile(self.path):
raise NoSuchPatchError(f"No such patch: {self.path}")
apply_patch(stage, self.path, self.level, self.working_dir)
apply_patch(stage, self.path, self.level, self.working_dir, self.reverse)
@property
def stage(self):
return None
# TODO: Use TypedDict once Spack supports Python 3.8+ only
def to_dict(self) -> Dict[str, Any]:
"""Dictionary representation of the patch.
def to_dict(self):
"""Partial dictionary -- subclases should add to this."""
Returns:
A dictionary representation.
"""
return {
"owner": self.owner,
"sha256": self.sha256,
"level": self.level,
"working_dir": self.working_dir,
"reverse": self.reverse,
}
def __eq__(self, other):
def __eq__(self, other: object) -> bool:
"""Equality check.
Args:
other: another patch
Returns:
True if both patches have the same checksum, else False
"""
if not isinstance(other, Patch):
return NotImplemented
return self.sha256 == other.sha256
def __hash__(self):
def __hash__(self) -> int:
"""Unique hash.
Returns:
A unique hash based on the sha256.
"""
return hash(self.sha256)
class FilePatch(Patch):
"""Describes a patch that is retrieved from a file in the repository.
"""Describes a patch that is retrieved from a file in the repository."""
Arguments:
pkg (str): the class object for the package that owns the patch
relative_path (str): path to patch, relative to the repository
directory for a package.
level (int): level to pass to patch command
working_dir (str): path within the source directory where patch
should be applied
"""
_sha256: Optional[str] = None
def __init__(self, pkg, relative_path, level, working_dir, ordering_key=None):
def __init__(
self,
pkg: "spack.package_base.PackageBase",
relative_path: str,
level: int,
working_dir: str,
reverse: bool = False,
ordering_key: Optional[Tuple[str, int]] = None,
) -> None:
"""Initialize a new FilePatch instance.
Args:
pkg: the class object for the package that owns the patch
relative_path: path to patch, relative to the repository directory for a package.
level: level to pass to patch command
working_dir: path within the source directory where patch should be applied
reverse: reverse the patch
ordering_key: key used to ensure patches are applied in a consistent order
"""
self.relative_path = relative_path
# patches may be defined by relative paths to parent classes
# search mro to look for the file
abs_path = None
abs_path: Optional[str] = None
# At different times we call FilePatch on instances and classes
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
for cls in inspect.getmro(pkg_cls):
@@ -148,52 +201,94 @@ def __init__(self, pkg, relative_path, level, working_dir, ordering_key=None):
msg += "package %s.%s does not exist." % (pkg.namespace, pkg.name)
raise ValueError(msg)
super().__init__(pkg, abs_path, level, working_dir)
super().__init__(pkg, abs_path, level, working_dir, reverse)
self.path = abs_path
self._sha256 = None
self.ordering_key = ordering_key
@property
def sha256(self):
if self._sha256 is None:
def sha256(self) -> str:
"""Get the patch checksum.
Returns:
The sha256 of the patch file.
"""
if self._sha256 is None and self.path is not None:
self._sha256 = checksum(hashlib.sha256, self.path)
assert isinstance(self._sha256, str)
return self._sha256
def to_dict(self):
return llnl.util.lang.union_dicts(super().to_dict(), {"relative_path": self.relative_path})
@sha256.setter
def sha256(self, value: str) -> None:
"""Set the patch checksum.
Args:
value: the sha256
"""
self._sha256 = value
def to_dict(self) -> Dict[str, Any]:
"""Dictionary representation of the patch.
Returns:
A dictionary representation.
"""
data = super().to_dict()
data["relative_path"] = self.relative_path
return data
class UrlPatch(Patch):
"""Describes a patch that is retrieved from a URL.
"""Describes a patch that is retrieved from a URL."""
Arguments:
pkg (str): the package that owns the patch
url (str): URL where the patch can be fetched
level (int): level to pass to patch command
working_dir (str): path within the source directory where patch
should be applied
"""
def __init__(
self,
pkg: "spack.package_base.PackageBase",
url: str,
level: int = 1,
*,
working_dir: str = ".",
reverse: bool = False,
sha256: str, # This is required for UrlPatch
ordering_key: Optional[Tuple[str, int]] = None,
archive_sha256: Optional[str] = None,
) -> None:
"""Initialize a new UrlPatch instance.
def __init__(self, pkg, url, level=1, working_dir=".", ordering_key=None, **kwargs):
super().__init__(pkg, url, level, working_dir)
Arguments:
pkg: the package that owns the patch
url: URL where the patch can be fetched
level: level to pass to patch command
working_dir: path within the source directory where patch should be applied
reverse: reverse the patch
ordering_key: key used to ensure patches are applied in a consistent order
sha256: sha256 sum of the patch, used to verify the patch
archive_sha256: sha256 sum of the *archive*, if the patch is compressed
(only required for compressed URL patches)
"""
super().__init__(pkg, url, level, working_dir, reverse)
self.url = url
self._stage = None
self._stage: Optional["spack.stage.Stage"] = None
self.ordering_key = ordering_key
self.archive_sha256 = kwargs.get("archive_sha256")
if allowed_archive(self.url) and not self.archive_sha256:
if allowed_archive(self.url) and not archive_sha256:
raise PatchDirectiveError(
"Compressed patches require 'archive_sha256' "
"and patch 'sha256' attributes: %s" % self.url
)
self.archive_sha256 = archive_sha256
self.sha256 = kwargs.get("sha256")
if not self.sha256:
if not sha256:
raise PatchDirectiveError("URL patches require a sha256 checksum")
self.sha256 = sha256
def apply(self, stage: "spack.stage.Stage"):
def apply(self, stage: "spack.stage.Stage") -> None:
"""Apply a patch to source in a stage.
Args:
stage: stage where source code lives
"""
assert self.stage.expanded, "Stage must be expanded before applying patches"
# Get the patch file.
@@ -204,15 +299,20 @@ def apply(self, stage: "spack.stage.Stage"):
return super().apply(stage)
@property
def stage(self):
def stage(self) -> "spack.stage.Stage":
"""The stage in which to download (and unpack) the URL patch.
Returns:
The stage object.
"""
if self._stage:
return self._stage
fetch_digest = self.archive_sha256 or self.sha256
# Two checksums, one for compressed file, one for its contents
if self.archive_sha256:
fetcher = fs.FetchAndVerifyExpandedFile(
if self.archive_sha256 and self.sha256:
fetcher: fs.FetchStrategy = fs.FetchAndVerifyExpandedFile(
self.url, archive_sha256=self.archive_sha256, expanded_sha256=self.sha256
)
else:
@@ -231,7 +331,12 @@ def stage(self):
)
return self._stage
def to_dict(self):
def to_dict(self) -> Dict[str, Any]:
"""Dictionary representation of the patch.
Returns:
A dictionary representation.
"""
data = super().to_dict()
data["url"] = self.url
if self.archive_sha256:
@@ -239,8 +344,21 @@ def to_dict(self):
return data
def from_dict(dictionary, repository=None):
"""Create a patch from json dictionary."""
def from_dict(
dictionary: Dict[str, Any], repository: Optional["spack.repo.RepoPath"] = None
) -> Patch:
"""Create a patch from json dictionary.
Args:
dictionary: dictionary representation of a patch
repository: repository containing package
Returns:
A patch object.
Raises:
ValueError: If *owner* or *url*/*relative_path* are missing in the dictionary.
"""
repository = repository or spack.repo.PATH
owner = dictionary.get("owner")
if "owner" not in dictionary:
@@ -252,14 +370,21 @@ def from_dict(dictionary, repository=None):
pkg_cls,
dictionary["url"],
dictionary["level"],
dictionary["working_dir"],
working_dir=dictionary["working_dir"],
# Added in v0.22, fallback required for backwards compatibility
reverse=dictionary.get("reverse", False),
sha256=dictionary["sha256"],
archive_sha256=dictionary.get("archive_sha256"),
)
elif "relative_path" in dictionary:
patch = FilePatch(
pkg_cls, dictionary["relative_path"], dictionary["level"], dictionary["working_dir"]
pkg_cls,
dictionary["relative_path"],
dictionary["level"],
dictionary["working_dir"],
# Added in v0.22, fallback required for backwards compatibility
dictionary.get("reverse", False),
)
# If the patch in the repo changes, we cannot get it back, so we
@@ -267,7 +392,7 @@ def from_dict(dictionary, repository=None):
# TODO: handle this more gracefully.
sha256 = dictionary["sha256"]
checker = Checker(sha256)
if not checker.check(patch.path):
if patch.path and not checker.check(patch.path):
raise fs.ChecksumError(
"sha256 checksum failed for %s" % patch.path,
"Expected %s but got %s " % (sha256, checker.sum)
@@ -295,10 +420,17 @@ class PatchCache:
namespace2.package2:
<patch json>
... etc. ...
"""
def __init__(self, repository, data=None):
def __init__(
self, repository: "spack.repo.RepoPath", data: Optional[Dict[str, Any]] = None
) -> None:
"""Initialize a new PatchCache instance.
Args:
repository: repository containing package
data: nested dictionary of patches
"""
if data is None:
self.index = {}
else:
@@ -309,21 +441,39 @@ def __init__(self, repository, data=None):
self.repository = repository
@classmethod
def from_json(cls, stream, repository):
def from_json(cls, stream: Any, repository: "spack.repo.RepoPath") -> "PatchCache":
"""Initialize a new PatchCache instance from JSON.
Args:
stream: stream of data
repository: repository containing package
Returns:
A new PatchCache instance.
"""
return PatchCache(repository=repository, data=sjson.load(stream))
def to_json(self, stream):
def to_json(self, stream: Any) -> None:
"""Dump a JSON representation to a stream.
Args:
stream: stream of data
"""
sjson.dump({"patches": self.index}, stream)
def patch_for_package(self, sha256: str, pkg):
def patch_for_package(self, sha256: str, pkg: "spack.package_base.PackageBase") -> Patch:
"""Look up a patch in the index and build a patch object for it.
Arguments:
sha256: sha256 hash to look up
pkg (spack.package_base.PackageBase): Package object to get patch for.
We build patch objects lazily because building them requires that
we have information about the package's location in its repo."""
we have information about the package's location in its repo.
Args:
sha256: sha256 hash to look up
pkg: Package object to get patch for.
Returns:
The patch object.
"""
sha_index = self.index.get(sha256)
if not sha_index:
raise PatchLookupError(
@@ -346,7 +496,12 @@ def patch_for_package(self, sha256: str, pkg):
patch_dict["sha256"] = sha256
return from_dict(patch_dict, repository=self.repository)
def update_package(self, pkg_fullname):
def update_package(self, pkg_fullname: str) -> None:
"""Update the patch cache.
Args:
pkg_fullname: package to update.
"""
# remove this package from any patch entries that reference it.
empty = []
for sha256, package_to_patch in self.index.items():
@@ -372,14 +527,29 @@ def update_package(self, pkg_fullname):
p2p = self.index.setdefault(sha256, {})
p2p.update(package_to_patch)
def update(self, other):
"""Update this cache with the contents of another."""
def update(self, other: "PatchCache") -> None:
"""Update this cache with the contents of another.
Args:
other: another patch cache to merge
"""
for sha256, package_to_patch in other.index.items():
p2p = self.index.setdefault(sha256, {})
p2p.update(package_to_patch)
@staticmethod
def _index_patches(pkg_class, repository):
def _index_patches(
pkg_class: Type["spack.package_base.PackageBase"], repository: "spack.repo.RepoPath"
) -> Dict[Any, Any]:
"""Patch index for a specific patch.
Args:
pkg_class: package object to get patches for
repository: repository containing the package
Returns:
The patch index for that package.
"""
index = {}
# Add patches from the class

View File

@@ -3,6 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.util.path
def get_projection(projections, spec):
"""
@@ -11,7 +13,7 @@ def get_projection(projections, spec):
all_projection = None
for spec_like, projection in projections.items():
if spec.satisfies(spec_like):
return projection
return spack.util.path.substitute_path_variables(projection)
elif spec_like == "all":
all_projection = projection
all_projection = spack.util.path.substitute_path_variables(projection)
return all_projection

View File

@@ -25,7 +25,7 @@
import traceback
import types
import uuid
from typing import Any, Dict, List, Tuple, Union
from typing import Any, Dict, List, Set, Tuple, Union
import llnl.path
import llnl.util.filesystem as fs
@@ -746,19 +746,17 @@ def all_package_paths(self):
for name in self.all_package_names():
yield self.package_path(name)
def packages_with_tags(self, *tags, full=False):
"""Returns a list of packages matching any of the tags in input.
def packages_with_tags(self, *tags: str, full: bool = False) -> Set[str]:
"""Returns a set of packages matching any of the tags in input.
Args:
full: if True the package names in the output are fully-qualified
"""
r = set()
for repo in self.repos:
current = repo.packages_with_tags(*tags)
if full:
current = [f"{repo.namespace}.{x}" for x in current]
r |= set(current)
return sorted(r)
return {
f"{repo.namespace}.{pkg}" if full else pkg
for repo in self.repos
for pkg in repo.packages_with_tags(*tags)
}
def all_package_classes(self):
for name in self.all_package_names():
@@ -1169,15 +1167,10 @@ def all_package_paths(self):
for name in self.all_package_names():
yield self.package_path(name)
def packages_with_tags(self, *tags):
def packages_with_tags(self, *tags: str) -> Set[str]:
v = set(self.all_package_names())
index = self.tag_index
for t in tags:
t = t.lower()
v &= set(index[t])
return sorted(v)
v.intersection_update(*(self.tag_index[tag.lower()] for tag in tags))
return v
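A small sketch of the new set semantics with a toy tag index; callers that relied on the old sorted-list return value now need to sort explicitly.

tag_index = {"build-tools": {"cmake", "ninja"}, "runtime": {"gcc-runtime"}}

def packages_with_tags(all_names, *tags):
    v = set(all_names)
    v.intersection_update(*(tag_index[t.lower()] for t in tags))
    return v

names = {"cmake", "ninja", "gcc-runtime", "zlib"}
assert packages_with_tags(names, "build-tools") == {"cmake", "ninja"}
assert sorted(packages_with_tags(names, "build-tools")) == ["cmake", "ninja"]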
def all_package_classes(self):
"""Iterator over all package *classes* in the repository.

View File

@@ -6,7 +6,6 @@
import warnings
import llnl.util.lang
import llnl.util.tty
# jsonschema is imported lazily as it is heavy to import
@@ -62,25 +61,3 @@ def _deprecated_properties(validator, deprecated, instance, schema):
Validator = llnl.util.lang.Singleton(_make_validator)
spec_list_schema = {
"type": "array",
"default": [],
"items": {
"anyOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {
"matrix": {
"type": "array",
"items": {"type": "array", "items": {"type": "string"}},
},
"exclude": {"type": "array", "items": {"type": "string"}},
},
},
{"type": "string"},
{"type": "null"},
]
},
}

View File

@@ -63,6 +63,7 @@
"oneOf": [{"type": "string"}, {"type": "array", "items": {"type": "string"}}]
},
"stage_name": {"type": "string"},
"develop_stage_link": {"type": "string"},
"test_stage": {"type": "string"},
"extensions": {"type": "array", "items": {"type": "string"}},
"template_dirs": {"type": "array", "items": {"type": "string"}},

View File

@@ -10,7 +10,7 @@
"""
from typing import Any, Dict
import spack.schema
from .spec_list import spec_list_schema
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
@@ -20,7 +20,7 @@
"items": {
"type": "object",
"properties": {"when": {"type": "string"}},
"patternProperties": {r"^(?!when$)\w*": spack.schema.spec_list_schema},
"patternProperties": {r"^(?!when$)\w*": spec_list_schema},
},
}
}

View File

@@ -16,11 +16,11 @@
import spack.schema.merged
import spack.schema.projections
from .spec_list import spec_list_schema
#: Top level key in a manifest file
TOP_LEVEL_KEY = "spack"
projections_scheme = spack.schema.projections.properties["projections"]
properties: Dict[str, Any] = {
"spack": {
"type": "object",
@@ -34,7 +34,7 @@
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spack.schema.spec_list_schema,
"specs": spec_list_schema,
},
),
}

View File

@@ -34,7 +34,7 @@
dictionary_of_strings = {"type": "object", "patternProperties": {r"\w[\w-]*": {"type": "string"}}}
dependency_selection = {"type": "string", "enum": ["none", "direct", "all"]}
dependency_selection = {"type": "string", "enum": ["none", "run", "direct", "all"]}
module_file_configuration = {
"type": "object",

View File

@@ -1,46 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for spack environment
.. literalinclude:: _spack_root/lib/spack/spack/schema/spack.py
:lines: 20-
"""
from typing import Any, Dict
from llnl.util.lang import union_dicts
import spack.schema
import spack.schema.gitlab_ci as ci_schema # DEPRECATED
import spack.schema.merged as merged_schema
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
"spack": {
"type": "object",
"default": {},
"additionalProperties": False,
"properties": union_dicts(
# Include deprecated "gitlab-ci" section
ci_schema.properties,
# merged configuration scope schemas
merged_schema.properties,
# extra environment schema properties
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spack.schema.spec_list_schema,
},
),
}
}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack environment file schema",
"type": "object",
"additionalProperties": False,
"properties": properties,
}

View File

@@ -0,0 +1,24 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
matrix_schema = {"type": "array", "items": {"type": "array", "items": {"type": "string"}}}
spec_list_schema = {
"type": "array",
"default": [],
"items": {
"anyOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {
"matrix": matrix_schema,
"exclude": {"type": "array", "items": {"type": "string"}},
},
},
{"type": "string"},
{"type": "null"},
]
},
}
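A minimal validation sketch against the schema above, assuming the third-party jsonschema package is available (Spack imports it lazily elsewhere); the spec strings are illustrative.

import jsonschema

from spack.schema.spec_list import spec_list_schema

good = ["zlib@1.3", {"matrix": [["hdf5", "openmpi"], ["%gcc"]]}, None]
jsonschema.validate(good, spec_list_schema)  # passes silently

try:
    jsonschema.validate([{"matrix": "not-a-list"}], spec_list_schema)
except jsonschema.ValidationError as err:
    print("rejected:", err.message)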

View File

@@ -15,7 +15,7 @@
import types
import typing
import warnings
from typing import Callable, Dict, List, NamedTuple, Optional, Sequence, Set, Tuple, Union
from typing import Callable, Dict, Iterator, List, NamedTuple, Optional, Set, Tuple, Type, Union
import archspec.cpu
@@ -258,7 +258,7 @@ def remove_node(spec: spack.spec.Spec, facts: List[AspFunction]) -> List[AspFunc
return list(filter(lambda x: x.args[0] not in ("node", "virtual_node"), facts))
def _create_counter(specs, tests):
def _create_counter(specs: List[spack.spec.Spec], tests: bool):
strategy = spack.config.CONFIG.get("concretizer:duplicates:strategy", "none")
if strategy == "full":
return FullDuplicatesCounter(specs, tests=tests)
@@ -411,7 +411,7 @@ def raise_if_unsat(self):
"""
Raise an appropriate error if the result is unsatisfiable.
The error is an InternalConcretizerError, and includes the minimized cores
The error is a SolverError, and includes the minimized cores
resulting from the solve, formatted to be human readable.
"""
if self.satisfiable:
@@ -422,7 +422,7 @@ def raise_if_unsat(self):
constraints = constraints[0]
conflicts = self.format_minimal_cores()
raise InternalConcretizerError(constraints, conflicts=conflicts)
raise SolverError(constraints, conflicts=conflicts)
@property
def specs(self):
@@ -435,7 +435,10 @@ def specs(self):
@property
def unsolved_specs(self):
"""List of abstract input specs that were not solved."""
"""List of tuples pairing abstract input specs that were not
solved with their associated candidate spec from the solver
(if the solve completed).
"""
if self._unsolved_specs is None:
self._compute_specs_from_answer_set()
return self._unsolved_specs
@@ -449,7 +452,7 @@ def specs_by_input(self):
def _compute_specs_from_answer_set(self):
if not self.satisfiable:
self._concrete_specs = []
self._unsolved_specs = self.abstract_specs
self._unsolved_specs = list((x, None) for x in self.abstract_specs)
self._concrete_specs_by_input = {}
return
@@ -470,7 +473,22 @@ def _compute_specs_from_answer_set(self):
self._concrete_specs.append(answer[node])
self._concrete_specs_by_input[input_spec] = answer[node]
else:
self._unsolved_specs.append(input_spec)
self._unsolved_specs.append((input_spec, candidate))
@staticmethod
def format_unsolved(unsolved_specs):
"""Create a message providing info on unsolved user specs and for
each one show the associated candidate spec from the solver (if
there is one).
"""
msg = "Unsatisfied input specs:"
for input_spec, candidate in unsolved_specs:
msg += f"\n\tInput spec: {str(input_spec)}"
if candidate:
msg += f"\n\tCandidate spec: {str(candidate)}"
else:
msg += "\n\t(No candidate specs from solver)"
return msg
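A sketch of the message this produces, with plain strings standing in for Spec objects (the method only needs str()); the specs are illustrative.

unsolved = [("hdf5+mpi", "hdf5@1.14.3~mpi"), ("zlib@:1.2", None)]
print(Result.format_unsolved(unsolved))
# Unsatisfied input specs:
#         Input spec: hdf5+mpi
#         Candidate spec: hdf5@1.14.3~mpi
#         Input spec: zlib@:1.2
#         (No candidate specs from solver)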
def _normalize_packages_yaml(packages_yaml):
@@ -805,6 +823,14 @@ def on_model(model):
print("Statistics:")
pprint.pprint(self.control.statistics)
if result.satisfiable and result.unsolved_specs and setup.concretize_everything:
unsolved_str = Result.format_unsolved(result.unsolved_specs)
raise InternalConcretizerError(
"Internal Spack error: the solver completed but produced specs"
" that do not satisfy the request. Please report a bug at "
f"https://github.com/spack/spack/issues\n\t{unsolved_str}"
)
return result, timer, self.control.statistics
@@ -872,35 +898,41 @@ def __iter__(self):
return iter(self.data)
# types for condition caching in solver setup
ConditionSpecKey = Tuple[str, Optional[TransformFunction]]
ConditionIdFunctionPair = Tuple[int, List[AspFunction]]
ConditionSpecCache = Dict[str, Dict[ConditionSpecKey, ConditionIdFunctionPair]]
class SpackSolverSetup:
"""Class to set up and run a Spack concretization solve."""
def __init__(self, tests=False):
self.gen = None # set by setup()
def __init__(self, tests: bool = False):
# these are all initialized in setup()
self.gen: "ProblemInstanceBuilder" = ProblemInstanceBuilder()
self.possible_virtuals: Set[str] = set()
self.assumptions = []
self.declared_versions = collections.defaultdict(list)
self.possible_versions = collections.defaultdict(set)
self.deprecated_versions = collections.defaultdict(set)
self.assumptions: List[Tuple["clingo.Symbol", bool]] = [] # type: ignore[name-defined]
self.declared_versions: Dict[str, List[DeclaredVersion]] = collections.defaultdict(list)
self.possible_versions: Dict[str, Set[GitOrStandardVersion]] = collections.defaultdict(set)
self.deprecated_versions: Dict[str, Set[GitOrStandardVersion]] = collections.defaultdict(
set
)
self.possible_virtuals = None
self.possible_compilers = []
self.possible_oses = set()
self.variant_values_from_specs = set()
self.version_constraints = set()
self.target_constraints = set()
self.default_targets = []
self.compiler_version_constraints = set()
self.post_facts = []
self.possible_compilers: List = []
self.possible_oses: Set = set()
self.variant_values_from_specs: Set = set()
self.version_constraints: Set = set()
self.target_constraints: Set = set()
self.default_targets: List = []
self.compiler_version_constraints: Set = set()
self.post_facts: List = []
# (ID, CompilerSpec) -> dictionary of attributes
self.compiler_info = collections.defaultdict(dict)
self.reusable_and_possible: ConcreteSpecsByHash = ConcreteSpecsByHash()
self.reusable_and_possible = ConcreteSpecsByHash()
self._id_counter = itertools.count()
self._trigger_cache = collections.defaultdict(dict)
self._effect_cache = collections.defaultdict(dict)
self._id_counter: Iterator[int] = itertools.count()
self._trigger_cache: ConditionSpecCache = collections.defaultdict(dict)
self._effect_cache: ConditionSpecCache = collections.defaultdict(dict)
# Caches to optimize the setup phase of the solver
self.target_specs_cache = None
@@ -912,8 +944,8 @@ def __init__(self, tests=False):
self.concretize_everything = True
# Set during the call to setup
self.pkgs = None
self.explicitly_required_namespaces = {}
self.pkgs: Set[str] = set()
self.explicitly_required_namespaces: Dict[str, str] = {}
def pkg_version_rules(self, pkg):
"""Output declared versions of a package.
@@ -1197,6 +1229,38 @@ def variant_rules(self, pkg):
self.gen.newline()
def _get_condition_id(
self,
named_cond: spack.spec.Spec,
cache: ConditionSpecCache,
body: bool,
transform: Optional[TransformFunction] = None,
) -> int:
"""Get the id for one half of a condition (either a trigger or an imposed constraint).
Construct a key from the condition spec and any associated transformation, and
cache the ASP functions that they imply. The saved functions will be output
later in ``trigger_rules()`` and ``effect_rules()``.
Returns:
The id of the cached trigger or effect.
"""
pkg_cache = cache[named_cond.name]
named_cond_key = (str(named_cond), transform)
result = pkg_cache.get(named_cond_key)
if result:
return result[0]
cond_id = next(self._id_counter)
requirements = self.spec_clauses(named_cond, body=body)
if transform:
requirements = transform(named_cond, requirements)
pkg_cache[named_cond_key] = (cond_id, requirements)
return cond_id
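A stripped-down sketch of the caching pattern above: one id per unique (condition string, transform) key, with the derived clauses stored for later emission. The clause-building line is a stand-in for spec_clauses().

import itertools

_ids = itertools.count()
_cache = {}

def get_condition_id(cond_str, transform=None):
    key = (cond_str, transform)
    if key in _cache:
        return _cache[key][0]
    cond_id = next(_ids)
    clauses = [f"clause({cond_str})"]  # stand-in for spec_clauses()
    if transform:
        clauses = transform(clauses)
    _cache[key] = (cond_id, clauses)
    return cond_id

assert get_condition_id("zlib@1.3") == get_condition_id("zlib@1.3")  # cache hit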
def condition(
self,
required_spec: spack.spec.Spec,
@@ -1222,7 +1286,8 @@ def condition(
"""
named_cond = required_spec.copy()
named_cond.name = named_cond.name or name
assert named_cond.name, "must provide name for anonymous conditions!"
if not named_cond.name:
raise ValueError(f"Must provide a name for anonymous condition: '{named_cond}'")
# Check if we can emit the requirements before updating the condition ID counter.
# In this way, if a condition can't be emitted but the exception is handled in the caller,
@@ -1232,35 +1297,19 @@ def condition(
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
cache = self._trigger_cache[named_cond.name]
named_cond_key = (str(named_cond), transform_required)
if named_cond_key not in cache:
trigger_id = next(self._id_counter)
requirements = self.spec_clauses(named_cond, body=True, required_from=name)
if transform_required:
requirements = transform_required(named_cond, requirements)
cache[named_cond_key] = (trigger_id, requirements)
trigger_id, requirements = cache[named_cond_key]
trigger_id = self._get_condition_id(
named_cond, cache=self._trigger_cache, body=True, transform=transform_required
)
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_trigger(condition_id, trigger_id)))
if not imposed_spec:
return condition_id
cache = self._effect_cache[named_cond.name]
imposed_spec_key = (str(imposed_spec), transform_imposed)
if imposed_spec_key not in cache:
effect_id = next(self._id_counter)
requirements = self.spec_clauses(imposed_spec, body=False, required_from=name)
if transform_imposed:
requirements = transform_imposed(imposed_spec, requirements)
cache[imposed_spec_key] = (effect_id, requirements)
effect_id, requirements = cache[imposed_spec_key]
effect_id = self._get_condition_id(
imposed_spec, cache=self._effect_cache, body=False, transform=transform_imposed
)
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_effect(condition_id, effect_id)))
return condition_id
def impose(self, condition_id, imposed_spec, node=True, name=None, body=False):
@@ -1362,23 +1411,13 @@ def virtual_preferences(self, pkg_name, func):
def provider_defaults(self):
self.gen.h2("Default virtual providers")
msg = (
"Internal Error: possible_virtuals is not populated. Please report to the spack"
" maintainers"
)
assert self.possible_virtuals is not None, msg
self.virtual_preferences(
"all", lambda v, p, i: self.gen.fact(fn.default_provider_preference(v, p, i))
)
def provider_requirements(self):
self.gen.h2("Requirements on virtual providers")
msg = (
"Internal Error: possible_virtuals is not populated. Please report to the spack"
" maintainers"
)
parser = RequirementParser(spack.config.CONFIG)
assert self.possible_virtuals is not None, msg
for virtual_str in sorted(self.possible_virtuals):
rules = parser.rules_from_virtual(virtual_str)
if rules:
@@ -1577,35 +1616,57 @@ def flag_defaults(self):
fn.compiler_version_flag(compiler.name, compiler.version, name, flag)
)
def spec_clauses(self, *args, **kwargs):
"""Wrap a call to `_spec_clauses()` into a try/except block that
raises a comprehensible error message in case of failure.
def spec_clauses(
self,
spec: spack.spec.Spec,
*,
body: bool = False,
transitive: bool = True,
expand_hashes: bool = False,
concrete_build_deps=False,
required_from: Optional[str] = None,
) -> List[AspFunction]:
"""Wrap a call to `_spec_clauses()` into a try/except block with better error handling.
Arguments are as for ``_spec_clauses()`` except ``required_from``.
Arguments:
required_from: name of package that caused this call.
"""
requestor = kwargs.pop("required_from", None)
try:
clauses = self._spec_clauses(*args, **kwargs)
clauses = self._spec_clauses(
spec,
body=body,
transitive=transitive,
expand_hashes=expand_hashes,
concrete_build_deps=concrete_build_deps,
)
except RuntimeError as exc:
msg = str(exc)
if requestor:
msg += ' [required from package "{0}"]'.format(requestor)
if required_from:
msg += f" [required from package '{required_from}']"
raise RuntimeError(msg)
return clauses
def _spec_clauses(
self, spec, body=False, transitive=True, expand_hashes=False, concrete_build_deps=False
):
self,
spec: spack.spec.Spec,
*,
body: bool = False,
transitive: bool = True,
expand_hashes: bool = False,
concrete_build_deps: bool = False,
) -> List[AspFunction]:
"""Return a list of clauses for a spec mandates are true.
Arguments:
spec (spack.spec.Spec): the spec to analyze
body (bool): if True, generate clauses to be used in rule bodies
(final values) instead of rule heads (setters).
transitive (bool): if False, don't generate clauses from
dependencies (default True)
expand_hashes (bool): if True, descend into hashes of concrete specs
(default False)
concrete_build_deps (bool): if False, do not include pure build deps
of concrete specs (as they have no effect on runtime constraints)
spec: the spec to analyze
body: if True, generate clauses to be used in rule bodies (final values) instead
of rule heads (setters).
transitive: if False, don't generate clauses from dependencies (default True)
expand_hashes: if True, descend into hashes of concrete specs (default False)
concrete_build_deps: if False, do not include pure build deps of concrete specs
(as they have no effect on runtime constraints)
Normally, if called with ``transitive=True``, ``spec_clauses()`` just generates
hashes for the dependency requirements of concrete specs. If ``expand_hashes``
@@ -1615,7 +1676,7 @@ def _spec_clauses(
"""
clauses = []
f = _Body if body else _Head
f: Union[Type[_Head], Type[_Body]] = _Body if body else _Head
if spec.name:
clauses.append(f.node(spec.name) if not spec.virtual else f.virtual_node(spec.name))
@@ -1704,8 +1765,9 @@ def _spec_clauses(
# dependencies
if spec.concrete:
# older specs do not have package hashes, so we have to do this carefully
if getattr(spec, "_package_hash", None):
clauses.append(fn.attr("package_hash", spec.name, spec._package_hash))
package_hash = getattr(spec, "_package_hash", None)
if package_hash:
clauses.append(fn.attr("package_hash", spec.name, package_hash))
clauses.append(fn.attr("hash", spec.name, spec.dag_hash()))
edges = spec.edges_from_dependents()
@@ -1725,6 +1787,11 @@ def _spec_clauses(
dep = dspec.spec
if spec.concrete:
# GCC runtime is solved again by clingo, even on concrete specs, to give
# the possibility to reuse specs built against a different runtime.
if dep.name == "gcc-runtime":
continue
# We know dependencies are real for concrete specs. For abstract
# specs they just mean the dep is somehow in the DAG.
for dtype in dt.ALL_FLAGS:
@@ -1764,7 +1831,7 @@ def _spec_clauses(
return clauses
def define_package_versions_and_validate_preferences(
self, possible_pkgs, *, require_checksum: bool, allow_deprecated: bool
self, possible_pkgs: Set[str], *, require_checksum: bool, allow_deprecated: bool
):
"""Declare any versions in specs not declared in packages."""
packages_yaml = spack.config.get("packages")
@@ -1797,7 +1864,7 @@ def define_package_versions_and_validate_preferences(
if pkg_name not in packages_yaml or "version" not in packages_yaml[pkg_name]:
continue
version_defs = []
version_defs: List[GitOrStandardVersion] = []
for vstr in packages_yaml[pkg_name]["version"]:
v = vn.ver(vstr)
@@ -2008,13 +2075,6 @@ def target_defaults(self, specs):
def virtual_providers(self):
self.gen.h2("Virtual providers")
msg = (
"Internal Error: possible_virtuals is not populated. Please report to the spack"
" maintainers"
)
assert self.possible_virtuals is not None, msg
# what provides what
for vspec in sorted(self.possible_virtuals):
self.gen.fact(fn.virtual(vspec))
self.gen.newline()
@@ -2211,7 +2271,7 @@ def define_concrete_input_specs(self, specs, possible):
def setup(
self,
specs: Sequence[spack.spec.Spec],
specs: List[spack.spec.Spec],
*,
reuse: Optional[List[spack.spec.Spec]] = None,
allow_deprecated: bool = False,
@@ -2233,8 +2293,7 @@ def setup(
self.possible_virtuals = node_counter.possible_virtuals()
self.pkgs = node_counter.possible_dependencies()
runtimes = spack.repo.PATH.packages_with_tags("runtime")
self.pkgs.update(set(runtimes))
self.pkgs.update(spack.repo.PATH.packages_with_tags("runtime"))
# Fail if we already know an unreachable node is requested
for spec in specs:
@@ -2908,6 +2967,23 @@ def consume_facts(self):
self._setup.effect_rules()
def call_on_concrete(fun):
"""Annotation for SpecBuilder actions that indicates they should be called on concrete specs.
While building Specs, we set a lot of attributes on the Specs that are under
construction, but we may also just look up a concrete spec (e.g., if a specific hash
was reused). We don't need to set most attributes on these b/c we just look them up
in the DB.
Metadata attributes like concretizer weights aren't stored on the spec or in the DB
-- they're transient annotations for a single run. So we need their handlers to be
called on all specs, including concrete ones.
"""
fun._call_on_concrete = True
return fun
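A minimal sketch of how the marker is consumed: decorated handlers still run when the spec is already concrete, while undecorated ones are skipped. The handler names below are illustrative.

@call_on_concrete
def version_weight_handler(node, weight):
    ...

def node_os_handler(node, value):
    ...

for action in (version_weight_handler, node_os_handler):
    print(action.__name__, getattr(action, "_call_on_concrete", False))
# version_weight_handler True
# node_os_handler False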
class SpecBuilder:
"""Class with actions to rebuild a spec from ASP results."""
@@ -3007,6 +3083,7 @@ def node_flag_compiler_default(self, node):
def node_flag(self, node, flag_type, flag):
self._specs[node].compiler_flags.add_flag(flag_type, flag, False)
@call_on_concrete
def node_flag_source(self, node, flag_type, source):
self._flag_sources[(node, flag_type)].add(source)
@@ -3115,6 +3192,31 @@ def reorder_flags(self):
def deprecated(self, node: NodeArgument, version: str) -> None:
tty.warn(f'using "{node.pkg}@{version}" which is a deprecated version')
@call_on_concrete
def version_weight(self, node: NodeArgument, weight: int):
self._specs[node]._weights["version_weight"] = int(weight)
@call_on_concrete
def provider_weight(self, node: NodeArgument, virtual: str, weight: int):
self._specs[node]._weights.setdefault("provider_weight", 0)
self._specs[node]._weights["provider_weight"] += int(weight)
@call_on_concrete
def variant_not_default(self, node: NodeArgument, variant: str, value: str):
self._specs[node]._weights[f"variant_not_default({variant})"] = bool(value)
@call_on_concrete
def variant_default_not_used(self, node: NodeArgument, variant: str, value: str):
self._specs[node]._weights[f"variant_default_not_used({variant})"] = bool(value)
@call_on_concrete
def node_compiler_weight(self, node: NodeArgument, weight: int):
self._specs[node]._weights["node_compiler_weight"] = int(weight)
@call_on_concrete
def node_target_weight(self, node: NodeArgument, weight: int):
self._specs[node]._weights["node_target_weight"] = int(weight)
@staticmethod
def sort_fn(function_tuple):
"""Ensure attributes are evaluated in the correct order.
@@ -3173,13 +3275,12 @@ def build_specs(self, function_tuples):
if spack.repo.PATH.is_virtual(pkg):
continue
# if we've already gotten a concrete spec for this pkg,
# do not bother calling actions on it except for node_flag_source,
# since node_flag_source is tracking information not in the spec itself
# if we've already gotten a concrete spec for this pkg, do not bother
# calling actions on it except for certain metadata methods that add
# information not tracked on the spec itself.
spec = self._specs.get(args[0])
if spec and spec.concrete:
if name != "node_flag_source":
continue
if spec and spec.concrete and not getattr(action, "_call_on_concrete", None):
continue
action(*args)
@@ -3429,15 +3530,13 @@ def solve_in_rounds(
if not result.satisfiable or not result.specs:
break
input_specs = result.unsolved_specs
input_specs = list(x for (x, y) in result.unsolved_specs)
for spec in result.specs:
reusable_specs.extend(spec.traverse())
class UnsatisfiableSpecError(spack.error.UnsatisfiableSpecError):
"""
Subclass for new constructor signature for new concretizer
"""
"""There was an issue with the spec that was requested (i.e. a user error)."""
def __init__(self, msg):
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)
@@ -3447,8 +3546,21 @@ def __init__(self, msg):
class InternalConcretizerError(spack.error.UnsatisfiableSpecError):
"""
Subclass for new constructor signature for new concretizer
"""Errors that indicate a bug in Spack."""
def __init__(self, msg):
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)
self.provided = None
self.required = None
self.constraint_type = None
class SolverError(InternalConcretizerError):
"""For cases where the solver is unable to produce a solution.
Such cases are unexpected because we allow for solutions with errors,
so for example user specs that are over-constrained should still
get a solution.
"""
def __init__(self, provided, conflicts):
@@ -3461,7 +3573,7 @@ def __init__(self, provided, conflicts):
if conflicts:
msg += ", errors are:" + "".join([f"\n {conflict}" for conflict in conflicts])
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)
super().__init__(msg)
self.provided = provided

View File

@@ -175,7 +175,7 @@ pkg_fact(Package, version_declared(Version, Weight)) :- pkg_fact(Package, versio
% We cannot use a version declared for an installed package if we end up building it
:- pkg_fact(Package, version_declared(Version, Weight, "installed")),
attr("version", node(ID, Package), Version),
version_weight(node(ID, Package), Weight),
attr("version_weight", node(ID, Package), Weight),
not attr("hash", node(ID, Package), _),
internal_error("Reuse version weight used for built package").
@@ -215,7 +215,7 @@ possible_version_weight(node(ID, Package), Weight)
% we can't use the weight for an external version if we don't use the
% corresponding external spec.
:- attr("version", node(ID, Package), Version),
version_weight(node(ID, Package), Weight),
attr("version_weight", node(ID, Package), Weight),
pkg_fact(Package, version_declared(Version, Weight, "external")),
not external(node(ID, Package)),
internal_error("External weight used for built package").
@@ -223,18 +223,18 @@ possible_version_weight(node(ID, Package), Weight)
% we can't use a weight from an installed spec if we are building it
% and vice-versa
:- attr("version", node(ID, Package), Version),
version_weight(node(ID, Package), Weight),
attr("version_weight", node(ID, Package), Weight),
pkg_fact(Package, version_declared(Version, Weight, "installed")),
build(node(ID, Package)),
internal_error("Reuse version weight used for build package").
:- attr("version", node(ID, Package), Version),
version_weight(node(ID, Package), Weight),
attr("version_weight", node(ID, Package), Weight),
not pkg_fact(Package, version_declared(Version, Weight, "installed")),
not build(node(ID, Package)),
internal_error("Build version weight used for reused package").
1 { version_weight(node(ID, Package), Weight) : pkg_fact(Package, version_declared(Version, Weight)) } 1
1 { attr("version_weight", node(ID, Package), Weight) : pkg_fact(Package, version_declared(Version, Weight)) } 1
:- attr("version", node(ID, Package), Version),
attr("node", node(ID, Package)),
internal_error("version weights must exist and be unique").
@@ -583,7 +583,7 @@ do_not_impose(EffectID, node(X, Package))
% A provider may have different possible weights depending on whether it's an external
% or not, or on preferences expressed in packages.yaml etc. This rule ensures that
% we select the weight, among the possible ones, that minimizes the overall objective function.
1 { provider_weight(DependencyNode, VirtualNode, Weight) :
1 { attr("provider_weight", DependencyNode, VirtualNode, Weight) :
possible_provider_weight(DependencyNode, VirtualNode, Weight, _) } 1
:- provider(DependencyNode, VirtualNode), internal_error("Package provider weights must be unique").
@@ -626,7 +626,7 @@ error(100, "Attempted to use external for '{0}' which does not satisfy a unique
:- external(node(ID, Package)),
2 { external_version(node(ID, Package), Version, Weight) }.
version_weight(PackageNode, Weight) :- external_version(PackageNode, Version, Weight).
attr("version_weight", PackageNode, Weight) :- external_version(PackageNode, Version, Weight).
attr("version", PackageNode, Version) :- external_version(PackageNode, Version, Weight).
% if a package is not buildable, only externals or hashed specs are allowed
@@ -644,7 +644,7 @@ external(PackageNode) :- attr("external_spec_selected", PackageNode, _).
% we can't use the weight for an external version if we don't use the
% corresponding external spec.
:- attr("version", node(ID, Package), Version),
version_weight(node(ID, Package), Weight),
attr("version_weight", node(ID, Package), Weight),
pkg_fact(Package, version_declared(Version, Weight, "external")),
not external(node(ID, Package)),
internal_error("External weight used for internal spec").
@@ -886,7 +886,7 @@ attr("variant_value", PackageNode, Variant, Value)
% The rules below allow us to prefer default values for variants
% whenever possible. If a variant is set in a spec, or if it is
% specified in an external, we score it as if it was a default value.
variant_not_default(node(ID, Package), Variant, Value)
attr("variant_not_default", node(ID, Package), Variant, Value)
:- attr("variant_value", node(ID, Package), Variant, Value),
not variant_default_value(Package, Variant, Value),
% variants set explicitly on the CLI don't count as non-default
@@ -901,7 +901,7 @@ variant_not_default(node(ID, Package), Variant, Value)
% A default variant value that is not used
variant_default_not_used(node(ID, Package), Variant, Value)
attr("variant_default_not_used", node(ID, Package), Variant, Value)
:- variant_default_value(Package, Variant, Value),
node_has_variant(node(ID, Package), Variant),
not attr("variant_value", node(ID, Package), Variant, Value),
@@ -1086,7 +1086,7 @@ attr("node_target", PackageNode, Target)
:- attr("node", PackageNode), attr("node_target_set", PackageNode, Target).
% each node has the weight of its assigned target
node_target_weight(node(ID, Package), Weight)
attr("node_target_weight", node(ID, Package), Weight)
:- attr("node", node(ID, Package)),
attr("node_target", node(ID, Package), Target),
target_weight(Target, Weight).
@@ -1172,11 +1172,13 @@ attr("node_compiler_version_satisfies", PackageNode, Compiler, Constraint)
% If the compiler version was set from the command line,
% respect it verbatim
:- attr("node_compiler_version_set", PackageNode, Compiler, Version),
not attr("node_compiler_version", PackageNode, Compiler, Version).
error(100, "Cannot set the required compiler: {2}%{0}@{1}", Compiler, Version, Package)
:- attr("node_compiler_version_set", node(X, Package), Compiler, Version),
not attr("node_compiler_version", node(X, Package), Compiler, Version).
:- attr("node_compiler_set", PackageNode, Compiler),
not attr("node_compiler_version", PackageNode, Compiler, _).
error(100, "Cannot set the required compiler: {1}%{0}", Compiler, Package)
:- attr("node_compiler_set", node(X, Package), Compiler),
not attr("node_compiler_version", node(X, Package), Compiler, _).
% Cannot select a compiler if it is not supported on the OS
% Compilers that are explicitly marked as allowed
@@ -1211,16 +1213,12 @@ compiler_mismatch_required(PackageNode, DependencyNode)
#defined allow_compiler/2.
% compilers weighted by preference according to packages.yaml
node_compiler_weight(node(ID, Package), Weight)
:- node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, V),
attr("node_compiler_weight", PackageNode, Weight)
:- node_compiler(PackageNode, CompilerID),
compiler_weight(CompilerID, Weight).
node_compiler_weight(node(ID, Package), 100)
:- node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, V),
attr("node_compiler_weight", PackageNode, 100)
:- node_compiler(PackageNode, CompilerID),
not compiler_weight(CompilerID, _).
% For the time being, be strict and reuse only if the compiler match one we have on the system
@@ -1357,7 +1355,7 @@ build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", Package
% will instead build the latest bar. When we actually include transitive
% build deps in the solve, consider using them as a preference to resolve this.
:- attr("version", node(ID, Package), Version),
version_weight(node(ID, Package), Weight),
attr("version_weight", node(ID, Package), Weight),
pkg_fact(Package, version_declared(Version, Weight, "installed")),
not optimize_for_reuse().
@@ -1431,7 +1429,7 @@ opt_criterion(70, "version weight").
#minimize {
Weight@70+Priority
: attr("root", PackageNode),
version_weight(PackageNode, Weight),
attr("version_weight", PackageNode, Weight),
build_priority(PackageNode, Priority)
}.
@@ -1440,7 +1438,7 @@ opt_criterion(65, "number of non-default variants (roots)").
#minimize{ 0@65: #true }.
#minimize {
1@65+Priority,PackageNode,Variant,Value
: variant_not_default(PackageNode, Variant, Value),
: attr("variant_not_default", PackageNode, Variant, Value),
attr("root", PackageNode),
build_priority(PackageNode, Priority)
}.
@@ -1450,7 +1448,7 @@ opt_criterion(60, "preferred providers for roots").
#minimize{ 0@60: #true }.
#minimize{
Weight@60+Priority,ProviderNode,Virtual
: provider_weight(ProviderNode, Virtual, Weight),
: attr("provider_weight", ProviderNode, Virtual, Weight),
attr("root", ProviderNode),
build_priority(ProviderNode, Priority)
}.
@@ -1460,7 +1458,7 @@ opt_criterion(55, "default values of variants not being used (roots)").
#minimize{ 0@55: #true }.
#minimize{
1@55+Priority,PackageNode,Variant,Value
: variant_default_not_used(PackageNode, Variant, Value),
: attr("variant_default_not_used", PackageNode, Variant, Value),
attr("root", PackageNode),
build_priority(PackageNode, Priority)
}.
@@ -1471,7 +1469,7 @@ opt_criterion(50, "number of non-default variants (non-roots)").
#minimize{ 0@50: #true }.
#minimize {
1@50+Priority,PackageNode,Variant,Value
: variant_not_default(PackageNode, Variant, Value),
: attr("variant_not_default", PackageNode, Variant, Value),
not attr("root", PackageNode),
build_priority(PackageNode, Priority)
}.
@@ -1483,7 +1481,7 @@ opt_criterion(45, "preferred providers (non-roots)").
#minimize{ 0@45: #true }.
#minimize{
Weight@45+Priority,ProviderNode,Virtual
: provider_weight(ProviderNode, Virtual, Weight),
: attr("provider_weight", ProviderNode, Virtual, Weight),
not attr("root", ProviderNode),
build_priority(ProviderNode, Priority)
}.
@@ -1532,7 +1530,7 @@ opt_criterion(25, "version badness").
#minimize{ 0@25: #true }.
#minimize{
Weight@25+Priority,PackageNode
: version_weight(PackageNode, Weight),
: attr("version_weight", PackageNode, Weight),
build_priority(PackageNode, Priority)
}.
@@ -1542,7 +1540,7 @@ opt_criterion(20, "default values of variants not being used (non-roots)").
#minimize{ 0@20: #true }.
#minimize{
1@20+Priority,PackageNode,Variant,Value
: variant_default_not_used(PackageNode, Variant, Value),
: attr("variant_default_not_used", PackageNode, Variant, Value),
not attr("root", PackageNode),
build_priority(PackageNode, Priority)
}.
@@ -1553,7 +1551,7 @@ opt_criterion(15, "non-preferred compilers").
#minimize{ 0@15: #true }.
#minimize{
Weight@15+Priority,PackageNode
: node_compiler_weight(PackageNode, Weight),
: attr("node_compiler_weight", PackageNode, Weight),
build_priority(PackageNode, Priority)
}.
@@ -1573,7 +1571,7 @@ opt_criterion(5, "non-preferred targets").
#minimize{ 0@5: #true }.
#minimize{
Weight@5+Priority,PackageNode
: node_target_weight(PackageNode, Weight),
: attr("node_target_weight", PackageNode, Weight),
build_priority(PackageNode, Priority)
}.
@@ -1583,7 +1581,7 @@ opt_criterion(1, "edge wiring").
#minimize{ 0@1: #true }.
#minimize{
Weight@1,ParentNode,PackageNode
: version_weight(PackageNode, Weight),
: attr("version_weight", PackageNode, Weight),
not attr("root", PackageNode),
depends_on(ParentNode, PackageNode)
}.

View File

@@ -117,7 +117,7 @@ def _compute_cache_values(self):
self._possible_dependencies = set(self._link_run) | set(self._total_build)
def possible_packages_facts(self, gen, fn):
build_tools = set(spack.repo.PATH.packages_with_tags("build-tools"))
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
gen.h2("Packages with at most a single node")
for package_name in sorted(self.possible_dependencies() - build_tools):
gen.fact(fn.max_dupes(package_name, 1))
@@ -142,7 +142,7 @@ def possible_packages_facts(self, gen, fn):
class FullDuplicatesCounter(MinimalDuplicatesCounter):
def possible_packages_facts(self, gen, fn):
build_tools = set(spack.repo.PATH.packages_with_tags("build-tools"))
build_tools = spack.repo.PATH.packages_with_tags("build-tools")
counter = collections.Counter(
list(self._link_run) + list(self._total_build) + list(self._direct_build)
)

View File

@@ -125,31 +125,15 @@
IDENTIFIER_RE = r"\w[\w-]*"
# Coloring of specs when using color output. Fields are printed with
# different colors to enhance readability.
# See llnl.util.tty.color for descriptions of the color codes.
COMPILER_COLOR = "@g" #: color for highlighting compilers
VERSION_COLOR = "@c" #: color for highlighting versions
ARCHITECTURE_COLOR = "@m" #: color for highlighting architectures
ENABLED_VARIANT_COLOR = "@B" #: color for highlighting enabled variants
DISABLED_VARIANT_COLOR = "r" #: color for highlighting disabled variants
DEPENDENCY_COLOR = "@." #: color for highlighting dependencies
VARIANT_COLOR = "@B" #: color for highlighting variants
HASH_COLOR = "@K" #: color for highlighting package hashes
#: This map determines the coloring of specs when using color output.
#: We make the fields different colors to enhance readability.
#: See llnl.util.tty.color for descriptions of the color codes.
COLOR_FORMATS = {
"%": COMPILER_COLOR,
"@": VERSION_COLOR,
"=": ARCHITECTURE_COLOR,
"+": ENABLED_VARIANT_COLOR,
"~": DISABLED_VARIANT_COLOR,
"^": DEPENDENCY_COLOR,
"#": HASH_COLOR,
}
#: Regex used for splitting by spec field separators.
#: These need to be escaped to avoid metacharacters in
#: ``COLOR_FORMATS.keys()``.
_SEPARATORS = "[\\%s]" % "\\".join(COLOR_FORMATS.keys())
NONDEFAULT_COLOR = "@_R" #: color for highlighting non-defaults in spec output
#: Default format for Spec.format(). This format can be round-tripped, so that:
#: Spec(Spec("string").format()) == Spec("string)"
@@ -186,31 +170,11 @@ class InstallStatus(enum.Enum):
Options are artificially disjoint for display purposes
"""
INSTALLED = "@g{[+]}"
UPSTREAM = "@g{[^]}"
EXTERNAL = "@g{[e]}"
ABSENT = "@K{ - }"
MISSING = "@r{[-]}"
def colorize_spec(spec):
"""Returns a spec colorized according to the colors specified in
COLOR_FORMATS."""
class insert_color:
def __init__(self):
self.last = None
def __call__(self, match):
# ignore compiler versions (color same as compiler)
sep = match.group(0)
if self.last == "%" and sep == "@":
return clr.cescape(sep)
self.last = sep
return "%s%s" % (COLOR_FORMATS[sep], clr.cescape(sep))
return clr.colorize(re.sub(_SEPARATORS, insert_color(), str(spec)) + "@.")
installed = "@g{[+]} "
upstream = "@g{[^]} "
external = "@g{[e]} "
absent = "@K{ - } "
missing = "@r{[-]} "
OLD_STYLE_FMT_RE = re.compile(r"\${[A-Z]+}")
@@ -625,18 +589,6 @@ def __init__(self, *args):
else:
raise TypeError("__init__ takes 1 or 2 arguments. (%d given)" % nargs)
def _add_versions(self, version_list):
# If it already has a non-trivial version list, this is an error
if self.versions and self.versions != vn.any_version:
# Note: This may be impossible to reach by the current parser
# Keeping it in case the implementation changes.
raise MultipleVersionError(
"A spec cannot contain multiple version signifiers. Use a version list instead."
)
self.versions = vn.VersionList()
for version in version_list:
self.versions.add(version)
def _autospec(self, compiler_spec_like):
if isinstance(compiler_spec_like, CompilerSpec):
return compiler_spec_like
@@ -1384,6 +1336,10 @@ def __init__(
# Build spec should be the actual build spec unless marked dirty.
self._build_spec = None
# Optimization weight information from the solver for coloring non-defaults in
# spec formatting output.
self._weights = {}
if isinstance(spec_like, str):
spack.parser.parse_one_or_raise(spec_like, self)
@@ -1499,7 +1455,7 @@ def edge_attributes(self) -> str:
if not deptypes_str and not virtuals_str:
return ""
result = f"{deptypes_str} {virtuals_str}".strip()
return f"[{result}] "
return f"[{result}]"
def dependencies(
self, name=None, deptype: Union[dt.DepTypes, dt.DepFlag] = dt.ALL
@@ -1544,20 +1500,6 @@ def _dependencies_dict(self, depflag: dt.DepFlag = dt.ALL):
result[key] = list(group)
return result
#
# Private routines here are called by the parser when building a spec.
#
def _add_versions(self, version_list):
"""Called by the parser to add an allowable version."""
# If it already has a non-trivial version list, this is an error
if self.versions and self.versions != vn.any_version:
raise MultipleVersionError(
"A spec cannot contain multiple version signifiers." " Use a version list instead."
)
self.versions = vn.VersionList()
for version in version_list:
self.versions.add(version)
def _add_flag(self, name, value, propagate):
"""Called by the parser to add a known flag.
Known flags currently include "arch"
@@ -1626,14 +1568,6 @@ def _set_architecture(self, **kwargs):
else:
setattr(self.architecture, new_attr, new_value)
def _set_compiler(self, compiler):
"""Called by the parser to set the compiler."""
if self.compiler:
raise DuplicateCompilerSpecError(
"Spec for '%s' cannot have two compilers." % self.name
)
self.compiler = compiler
def _add_dependency(self, spec: "Spec", *, depflag: dt.DepFlag, virtuals: Tuple[str, ...]):
"""Called by the parser to add another spec as a dependency."""
if spec.name not in self._dependencies or not spec.name:
@@ -2091,7 +2025,12 @@ def to_node_dict(self, hash=ht.dag_hash):
if hasattr(variant, "_patches_in_order_of_appearance"):
d["patches"] = variant._patches_in_order_of_appearance
if self._concrete and hash.package_hash and self._package_hash:
if (
self._concrete
and hash.package_hash
and hasattr(self, "_package_hash")
and self._package_hash
):
# We use the attribute here instead of `self.package_hash()` because this
# should *always* be assigned at concretization time. We don't want to try
# to compute a package hash for concrete spec where a) the package might not
@@ -4062,6 +4001,7 @@ def _dup(self, other, deps: Union[bool, dt.DepTypes, dt.DepFlag] = True, clearde
self.compiler_flags.spec = self
self.variants = other.variants.copy()
self._build_spec = other._build_spec
self._weights = other._weights.copy()
# FIXME: we manage _patches_in_order_of_appearance specially here
# to keep it from leaking out of spec.py, but we should figure
@@ -4315,11 +4255,15 @@ def deps():
yield deps
def colorized(self):
return colorize_spec(self)
def format(self, format_string=DEFAULT_FORMAT, **kwargs):
r"""Prints out particular pieces of a spec, depending on what is in the format string.
def format(
self,
format_string: str = DEFAULT_FORMAT,
color: bool = False,
nondefaults: bool = False,
transform: Optional[Callable] = None,
):
r"""Prints out particular pieces of a spec, depending on what is
in the format string.
Using the ``{attribute}`` syntax, any field of the spec can be
selected. Those attributes can be recursive. For example,
@@ -4389,18 +4333,17 @@ def format(self, format_string=DEFAULT_FORMAT, **kwargs):
"""
ensure_modern_format_string(format_string)
color = kwargs.get("color", False)
transform = kwargs.get("transform", {})
transform = transform if transform is not None else {}
out = io.StringIO()
def write(s, c=None):
f = clr.cescape(s)
if c is not None:
f = COLOR_FORMATS[c] + f + "@."
clr.cwrite(f, stream=out, color=color)
def write(string, cformat=None):
escaped = clr.cescape(string)
if cformat is not None:
escaped = f"{cformat}{escaped}@."
clr.cwrite(escaped, stream=out, color=color)
def write_attribute(spec, attribute, color):
def write_attribute(spec, attribute):
attribute = attribute.lower()
sig = ""
@@ -4445,16 +4388,12 @@ def write_attribute(spec, attribute, color):
elif attribute == "spack_install":
write(morph(spec, spack.store.STORE.layout.root))
return
elif re.match(r"install_status", attribute):
write(self.install_status_symbol())
return
elif re.match(r"hash(:\d)?", attribute):
col = "#"
if ":" in attribute:
_, length = attribute.split(":")
write(sig + morph(spec, current.dag_hash(int(length))), col)
write(sig + morph(spec, current.dag_hash(int(length))), HASH_COLOR)
else:
write(sig + morph(spec, current.dag_hash()), col)
write(sig + morph(spec, current.dag_hash()), HASH_COLOR)
return
# Iterate over components using getattr to get next element
@@ -4498,18 +4437,39 @@ def write_attribute(spec, attribute, color):
return
# Set color codes for various attributes
col = None
def nondefault_or_color(color, *weights):
"""Spec attributes are nondefault relevant weights are present and nonzero."""
if nondefaults and any(self._weights.get(w) for w in weights):
return NONDEFAULT_COLOR
else:
return color
attr_color = None
if "variants" in parts:
col = "+"
write(sig)
for variant, boolean in current.ordered_variants():
attr_color = nondefault_or_color(
VARIANT_COLOR,
f"variant_not_default({variant.name})",
f"variant_default_not_used({variant.name})",
)
if not boolean:
write(" ")
write(morph(spec, str(variant)), attr_color)
return
elif "name" == parts[0]:
attr_color = nondefault_or_color(None, "provider_weight")
elif "architecture" in parts:
col = "="
# TODO: handle arch parts independently
attr_color = nondefault_or_color(ARCHITECTURE_COLOR, "node_target_weight")
elif "compiler" in parts or "compiler_flags" in parts:
col = "%"
attr_color = nondefault_or_color(COMPILER_COLOR, "node_compiler_weight")
elif "version" in parts or "versions" in parts:
col = "@"
attr_color = nondefault_or_color(VERSION_COLOR, "version_weight")
# Finally, write the output
write(sig + morph(spec, str(current)), col)
write(sig + morph(spec, str(current)), attr_color)
attribute = ""
in_attribute = False
@@ -4523,7 +4483,7 @@ def write_attribute(spec, attribute, color):
escape = True
elif in_attribute:
if c == "}":
write_attribute(self, attribute, color)
write_attribute(self, attribute)
attribute = ""
in_attribute = False
else:
@@ -4542,18 +4502,8 @@ def write_attribute(spec, attribute, color):
"Format string terminated while reading attribute." "Missing terminating }."
)
# remove leading whitespace from directives that add it for internal formatting.
# Arch, compiler flags, and variants add spaces for spec format correctness, but
# we don't really want them in formatted string output. We do want to preserve
# whitespace from the format string.
formatted_spec = out.getvalue()
whitespace_attrs = [r"{arch=[^}]*}", r"{architecture}", r"{compiler_flags}", r"{variants}"]
if any(re.match(rx, format_string) for rx in whitespace_attrs):
formatted_spec = formatted_spec.lstrip()
if any(re.search(f"{rx}$", format_string) for rx in whitespace_attrs):
formatted_spec = formatted_spec.rstrip()
return formatted_spec
return formatted_spec.strip()
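A hedged usage sketch of the new keyword arguments, assuming it runs under spack python so that spack.spec is importable; the spec itself is illustrative.

import spack.spec

s = spack.spec.Spec("zstd+programs").concretized()
# With nondefaults=True, fields whose solver weights are nonzero render in
# NONDEFAULT_COLOR instead of their usual colors.
print(s.format("{name}{@versions}{variants}", color=True, nondefaults=True))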
def cformat(self, *args, **kwargs):
"""Same as format, but color defaults to auto instead of False."""
@@ -4603,7 +4553,7 @@ def __str__(self):
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
)
sorted_dependencies = [
d.format("{edge_attributes}" + DEFAULT_FORMAT) for d in sorted_dependencies
d.format("{edge_attributes} " + DEFAULT_FORMAT) for d in sorted_dependencies
]
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
@@ -4623,25 +4573,20 @@ def colored_str(self):
def install_status(self):
"""Helper for tree to print DB install status."""
if not self.concrete:
return InstallStatus.ABSENT
return InstallStatus.absent
if self.external:
return InstallStatus.EXTERNAL
return InstallStatus.external
upstream, record = spack.store.STORE.db.query_by_spec_hash(self.dag_hash())
if not record:
return InstallStatus.ABSENT
return InstallStatus.absent
elif upstream and record.installed:
return InstallStatus.UPSTREAM
return InstallStatus.upstream
elif record.installed:
return InstallStatus.INSTALLED
return InstallStatus.installed
else:
return InstallStatus.MISSING
def install_status_symbol(self):
"""Get an install status symbol."""
status = self.install_status()
return clr.colorize(status.value)
return InstallStatus.missing
def _installed_explicitly(self):
"""Helper for tree to print DB install status."""
@@ -4667,8 +4612,9 @@ def tree(
show_types: bool = False,
depth_first: bool = False,
recurse_dependencies: bool = True,
install_status: bool = False,
status_fn: Optional[Callable[["Spec"], InstallStatus]] = None,
prefix: Optional[Callable[["Spec"], str]] = None,
nondefaults: bool = False,
) -> str:
"""Prints out this spec and its dependencies, tree-formatted
with indentation.
@@ -4688,9 +4634,11 @@ def tree(
show_types: if True, show the (merged) dependency type of a node
depth_first: if True, traverse the DAG depth first when representing it as a tree
recurse_dependencies: if True, recurse on dependencies
install_status: if True, show installation status next to each spec
status_fn: optional callable that takes a node as an argument and return its
installation status
prefix: optional callable that takes a node as an argument and return its
installation prefix
nondefaults: highlight non-default values in output
"""
out = ""
@@ -4702,9 +4650,6 @@ def tree(
):
node = dep_spec.spec
if install_status:
out += node.format("{install_status} ")
if prefix is not None:
out += prefix(node)
out += " " * indent
@@ -4712,6 +4657,15 @@ def tree(
if depth:
out += "%-4d" % d
if status_fn:
status = status_fn(node)
if status in list(InstallStatus):
out += clr.colorize(status.value, color=color)
elif status:
out += clr.colorize("@g{[+]} ", color=color)
else:
out += clr.colorize("@r{[-]} ", color=color)
if hashes:
out += clr.colorize("@K{%s} ", color=color) % node.dag_hash(hashlen)
@@ -4733,7 +4687,7 @@ def tree(
out += " " * d
if d > 0:
out += "^"
out += node.format(format, color=color) + "\n"
out += node.format(format, color=color, nondefaults=nondefaults) + "\n"
# Check if we wanted just the first line
if not recurse_dependencies:
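A hedged sketch of the new status_fn hook, which replaces the old install_status=True flag: callers now inject the status lookup themselves. Assumes a working Spack session.

import spack.spec

s = spack.spec.Spec("zlib").concretized()
print(s.tree(status_fn=spack.spec.Spec.install_status, hashes=True))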

View File

@@ -208,7 +208,103 @@ def _mirror_roots():
]
class Stage:
class LockableStagingDir:
"""A directory whose lifetime can be managed with a context
manager (but persists if the user requests it). Instances can have
a specified name and if they do, then for all instances that have
the same name, only one can enter the context manager at a time.
"""
def __init__(self, name, path, keep, lock):
# TODO: This uses a protected member of tempfile, but seemed the only
# TODO: way to get a temporary name. It won't be the same as the
# TODO: temporary stage area in _stage_root.
self.name = name
if name is None:
self.name = stage_prefix + next(tempfile._get_candidate_names())
# Use the provided path or construct an optionally named stage path.
if path is not None:
self.path = path
else:
self.path = os.path.join(get_stage_root(), self.name)
# Flag to decide whether to delete the stage folder on exit or not
self.keep = keep
# File lock for the stage directory. We use one file for all
# stage locks. See spack.database.Database.prefix_locker.lock for
# details on this approach.
self._lock = None
self._use_locks = lock
# When stages are reused, we need to know whether to re-create
# it. This marks whether it has been created/destroyed.
self.created = False
def _get_lock(self):
if not self._lock:
sha1 = hashlib.sha1(self.name.encode("utf-8")).digest()
lock_id = prefix_bits(sha1, bit_length(sys.maxsize))
stage_lock_path = os.path.join(get_stage_root(), ".lock")
self._lock = spack.util.lock.Lock(
stage_lock_path, start=lock_id, length=1, desc=self.name
)
return self._lock
def __enter__(self):
"""
Entering a stage context will create the stage directory
Returns:
self
"""
if self._use_locks:
self._get_lock().acquire_write(timeout=60)
self.create()
return self
def __exit__(self, exc_type, exc_val, exc_tb):
"""
Exiting from a stage context will delete the stage directory unless:
- it was explicitly requested not to do so
- an exception has been raised
Args:
exc_type: exception type
exc_val: exception value
exc_tb: exception traceback
Returns:
Boolean
"""
# Delete when there are no exceptions, unless asked to keep.
if exc_type is None and not self.keep:
self.destroy()
if self._use_locks:
self._get_lock().release_write()
def create(self):
"""
Ensures the top-level (config:build_stage) directory exists.
"""
# User has full permissions and group has only read permissions
if not os.path.exists(self.path):
mkdirp(self.path, mode=stat.S_IRWXU | stat.S_IRGRP | stat.S_IXGRP)
elif not os.path.isdir(self.path):
os.remove(self.path)
mkdirp(self.path, mode=stat.S_IRWXU | stat.S_IRGRP | stat.S_IXGRP)
# Make sure we can actually do something with the stage we made.
ensure_access(self.path)
self.created = True
def destroy(self):
raise NotImplementedError(f"{self.__class__.__name__} is abstract")
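A toy subclass illustrating the contract: concrete stages supply destroy(), while the base class handles naming, locking, and directory creation.

import shutil

class ScratchStage(LockableStagingDir):
    def destroy(self):
        shutil.rmtree(self.path, ignore_errors=True)
        self.created = False

# with ScratchStage(name=None, path=None, keep=False, lock=True) as stage:
#     ...  # stage.path exists here and is removed on clean exit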
class Stage(LockableStagingDir):
"""Manages a temporary stage directory for building.
A Stage object is a context manager that handles a directory where
@@ -251,7 +347,8 @@ class Stage:
"""
#: Most staging is managed by Spack. DIYStage is one exception.
managed_by_spack = True
needs_fetching = True
requires_patch_success = True
def __init__(
self,
@@ -297,6 +394,8 @@ def __init__(
The search function that provides the fetch strategy
instance.
"""
super().__init__(name, path, keep, lock)
# TODO: fetch/stage coupling needs to be reworked -- the logic
# TODO: here is convoluted and not modular enough.
if isinstance(url_or_fetch_strategy, str):
@@ -314,72 +413,8 @@ def __init__(
self.srcdir = None
# TODO: This uses a protected member of tempfile, but seemed the only
# TODO: way to get a temporary name. It won't be the same as the
# TODO: temporary stage area in _stage_root.
self.name = name
if name is None:
self.name = stage_prefix + next(tempfile._get_candidate_names())
self.mirror_paths = mirror_paths
# Use the provided path or construct an optionally named stage path.
if path is not None:
self.path = path
else:
self.path = os.path.join(get_stage_root(), self.name)
# Flag to decide whether to delete the stage folder on exit or not
self.keep = keep
# File lock for the stage directory. We use one file for all
# stage locks. See spack.database.Database.prefix_locker.lock for
# details on this approach.
self._lock = None
if lock:
sha1 = hashlib.sha1(self.name.encode("utf-8")).digest()
lock_id = prefix_bits(sha1, bit_length(sys.maxsize))
stage_lock_path = os.path.join(get_stage_root(), ".lock")
self._lock = spack.util.lock.Lock(
stage_lock_path, start=lock_id, length=1, desc=self.name
)
# When stages are reused, we need to know whether to re-create
# it. This marks whether it has been created/destroyed.
self.created = False
def __enter__(self):
"""
Entering a stage context will create the stage directory
Returns:
self
"""
if self._lock is not None:
self._lock.acquire_write(timeout=60)
self.create()
return self
def __exit__(self, exc_type, exc_val, exc_tb):
"""
Exiting from a stage context will delete the stage directory unless:
- it was explicitly requested not to do so
- an exception has been raised
Args:
exc_type: exception type
exc_val: exception value
exc_tb: exception traceback
Returns:
Boolean
"""
# Delete when there are no exceptions, unless asked to keep.
if exc_type is None and not self.keep:
self.destroy()
if self._lock is not None:
self._lock.release_write()
@property
def expected_archive_files(self):
"""Possible archive file paths."""
@@ -631,21 +666,6 @@ def restage(self):
"""
self.fetcher.reset()
def create(self):
"""
Ensures the top-level (config:build_stage) directory exists.
"""
# User has full permissions and group has only read permissions
if not os.path.exists(self.path):
mkdirp(self.path, mode=stat.S_IRWXU | stat.S_IRGRP | stat.S_IXGRP)
elif not os.path.isdir(self.path):
os.remove(self.path)
mkdirp(self.path, mode=stat.S_IRWXU | stat.S_IRGRP | stat.S_IXGRP)
# Make sure we can actually do something with the stage we made.
ensure_access(self.path)
self.created = True
def destroy(self):
"""Removes this stage directory."""
remove_linked_tree(self.path)
@@ -752,7 +772,8 @@ def __init__(self):
"cache_mirror",
"steal_source",
"disable_mirrors",
"managed_by_spack",
"needs_fetching",
"requires_patch_success",
]
)
@@ -808,8 +829,8 @@ class DIYStage:
directory naming convention.
"""
"""DIY staging is, by definition, not managed by Spack."""
managed_by_spack = False
needs_fetching = False
requires_patch_success = False
def __init__(self, path):
if path is None:
@@ -857,6 +878,65 @@ def cache_local(self):
tty.debug("Sources for DIY stages are not cached")
class DevelopStage(LockableStagingDir):
needs_fetching = False
requires_patch_success = False
def __init__(self, name, dev_path, reference_link):
super().__init__(name=name, path=None, keep=False, lock=True)
self.dev_path = dev_path
self.source_path = dev_path
# The path of a link that will point to this stage
if os.path.isabs(reference_link):
link_path = reference_link
else:
link_path = os.path.join(self.source_path, reference_link)
if not os.path.isdir(os.path.dirname(link_path)):
raise StageError(f"The directory containing {link_path} must exist")
self.reference_link = link_path
@property
def archive_file(self):
return None
def fetch(self, *args, **kwargs):
tty.debug("No fetching needed for develop stage.")
def check(self):
tty.debug("No checksum needed for develop stage.")
def expand_archive(self):
tty.debug("No expansion needed for develop stage.")
@property
def expanded(self):
"""Returns True since the source_path must exist."""
return True
def create(self):
super().create()
try:
llnl.util.symlink.symlink(self.path, self.reference_link)
except (llnl.util.symlink.AlreadyExistsError, FileExistsError):
pass
def destroy(self):
# Destroy all files, but do not follow symlinks
try:
shutil.rmtree(self.path)
except FileNotFoundError:
pass
self.created = False
def restage(self):
self.destroy()
self.create()
def cache_local(self):
tty.debug("Sources for Develop stages are not cached")
def ensure_access(file):
"""Ensure we can access a directory and die with an error if we can't."""
if not can_access(file):


@@ -102,7 +102,10 @@ def to_dict_or_value(self):
if self.microarchitecture.vendor == "generic":
return str(self)
return syaml.syaml_dict(self.microarchitecture.to_dict(return_list_of_items=True))
# Get rid of compiler flag information before turning the uarch into a dict
uarch_dict = self.microarchitecture.to_dict()
uarch_dict.pop("compilers", None)
return syaml.syaml_dict(uarch_dict.items())
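
The intent of the new branch: keep the full microarchitecture description but drop the compiler-flag table before it is written as YAML. A standalone sketch using archspec directly, assuming to_dict() exposes a "compilers" key as it does in current archspec releases:

import archspec.cpu

uarch = archspec.cpu.host()        # e.g. the "skylake" microarchitecture
uarch_dict = uarch.to_dict()
uarch_dict.pop("compilers", None)  # same stripping as above
print(sorted(uarch_dict))          # name, vendor, features, parents, ...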
def __repr__(self):
cls_name = self.__class__.__name__


@@ -164,3 +164,20 @@ def test_install_time_test_callback(tmpdir, config, mock_packages, mock_stage):
with open(s.package.tester.test_log_file, "r") as f:
results = f.read().replace("\n", " ")
assert "PyTestCallback test" in results
@pytest.mark.regression("43097")
@pytest.mark.usefixtures("builder_test_repository", "config")
def test_mixins_with_builders(working_env):
"""Tests that run_after and run_before callbacks are accumulated correctly,
when mixins are used with builders.
"""
s = spack.spec.Spec("builder-and-mixins").concretized()
builder = spack.builder.create(s.package)
# Check that callbacks added by the mixin are in the list
assert any(fn.__name__ == "before_install" for _, fn in builder.run_before_callbacks)
assert any(fn.__name__ == "after_install" for _, fn in builder.run_after_callbacks)
# Check that callback from the GenericBuilder are in the list too
assert any(fn.__name__ == "sanity_check_prefix" for _, fn in builder.run_after_callbacks)


@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
import sys
import pytest
@@ -247,3 +248,76 @@ def test_compiler_list_empty(no_compilers_yaml, working_env, compilers_dir):
out = compiler("list")
assert not out
assert compiler.returncode == 0
@pytest.mark.parametrize(
"external,expected",
[
(
{
"spec": "gcc@=7.7.7 os=foobar target=x86_64",
"prefix": "/path/to/fake",
"modules": ["gcc/7.7.7", "foobar"],
"extra_attributes": {
"paths": {
"cc": "/path/to/fake/gcc",
"cxx": "/path/to/fake/g++",
"fc": "/path/to/fake/gfortran",
"f77": "/path/to/fake/gfortran",
},
"flags": {"fflags": "-ffree-form"},
},
},
"""gcc@7.7.7:
\tpaths:
\t\tcc = /path/to/fake/gcc
\t\tcxx = /path/to/fake/g++
\t\tf77 = /path/to/fake/gfortran
\t\tfc = /path/to/fake/gfortran
\tflags:
\t\tfflags = ['-ffree-form']
\tmodules = ['gcc/7.7.7', 'foobar']
\toperating system = foobar
""",
),
(
{
"spec": "gcc@7.7.7",
"prefix": "{prefix}",
"modules": ["gcc/7.7.7", "foobar"],
"extra_attributes": {"flags": {"fflags": "-ffree-form"}},
},
"""gcc@7.7.7:
\tpaths:
\t\tcc = {compilers_dir}{sep}gcc-8{suffix}
\t\tcxx = {compilers_dir}{sep}g++-8{suffix}
\t\tf77 = {compilers_dir}{sep}gfortran-8{suffix}
\t\tfc = {compilers_dir}{sep}gfortran-8{suffix}
\tflags:
\t\tfflags = ['-ffree-form']
\tmodules = ['gcc/7.7.7', 'foobar']
\toperating system = debian6
""",
),
],
)
def test_compilers_shows_packages_yaml(
external, expected, no_compilers_yaml, working_env, compilers_dir
):
"""Spack should see a single compiler defined from packages.yaml"""
external["prefix"] = external["prefix"].format(prefix=os.path.dirname(compilers_dir))
gcc_entry = {"externals": [external]}
packages = spack.config.get("packages")
packages["gcc"] = gcc_entry
spack.config.set("packages", packages)
out = compiler("list")
assert out.count("gcc@7.7.7") == 1
out = compiler("info", "gcc@7.7.7")
assert out == expected.format(
compilers_dir=str(compilers_dir),
sep=os.sep,
suffix=".bat" if sys.platform == "win32" else "",
)
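
The get/mutate/set sequence above rewrites the whole packages section; assuming spack.config.set accepts colon-separated paths, as it is used elsewhere in Spack, the same entry could be written more directly:

import spack.config

# Equivalent single call for the test's external entry (assumption:
# the colon-path form of config.set; "external" is the dict built above).
spack.config.set("packages:gcc", {"externals": [external]})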


@@ -41,6 +41,7 @@ def test_dev_build_basics(tmpdir, install_mockery):
assert os.path.exists(str(tmpdir))
@pytest.mark.disable_clean_stage_check
def test_dev_build_before(tmpdir, install_mockery):
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
@@ -57,6 +58,7 @@ def test_dev_build_before(tmpdir, install_mockery):
assert not os.path.exists(spec.prefix)
@pytest.mark.disable_clean_stage_check
def test_dev_build_until(tmpdir, install_mockery):
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
@@ -74,6 +76,7 @@ def test_dev_build_until(tmpdir, install_mockery):
assert not spack.store.STORE.db.query(spec, installed=True)
@pytest.mark.disable_clean_stage_check
def test_dev_build_until_last_phase(tmpdir, install_mockery):
# Test that we ignore the last_phase argument if it is already last
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
@@ -93,6 +96,7 @@ def test_dev_build_until_last_phase(tmpdir, install_mockery):
assert os.path.exists(str(tmpdir))
@pytest.mark.disable_clean_stage_check
def test_dev_build_before_until(tmpdir, install_mockery, capsys):
spec = spack.spec.Spec(f"dev-build-test-install@0.0.0 dev_path={tmpdir}").concretized()
@@ -130,6 +134,7 @@ def mock_module_noop(*args):
pass
@pytest.mark.disable_clean_stage_check
def test_dev_build_drop_in(tmpdir, mock_packages, monkeypatch, install_mockery, working_env):
monkeypatch.setattr(os, "execvp", print_spack_cc)
monkeypatch.setattr(spack.build_environment, "module", mock_module_noop)


@@ -14,6 +14,7 @@
import spack.spec
from spack.main import SpackCommand
add = SpackCommand("add")
develop = SpackCommand("develop")
env = SpackCommand("env")
@@ -192,14 +193,16 @@ def test_develop_full_git_repo(
finally:
spec.package.do_clean()
# Now use "spack develop": look at the resulting stage directory and make
# Now use "spack develop": look at the resulting dev_path and make
# sure the git repo pulled includes the full branch history (or rather,
# more than just one commit).
env("create", "test")
with ev.read("test"):
with ev.read("test") as e:
add("git-test-commit")
develop("git-test-commit@1.2")
location = SpackCommand("location")
develop_stage_dir = location("git-test-commit").strip()
commits = _git_commit_list(develop_stage_dir)
e.concretize()
spec = e.all_specs()[0]
develop_dir = spec.variants["dev_path"].value
commits = _git_commit_list(develop_dir)
assert len(commits) > 1

Some files were not shown because too many files have changed in this diff.