Compare commits


656 Commits

Author SHA1 Message Date
Gregory Becker
dd668047fb prototype: check commits against the tip of known branches 2022-03-03 10:35:15 -08:00
Brian Van Essen
2ac11812ba lbann, dihydrogen: removing nice to have Python packages (#29302)
These are recipes that are not actually required for LBANN or DiHydrogen
to build. They should be concretized within the same environment or
installed via pip using the same Python that installed LBANN.
Removing them will help eliminate build-time failures that are
actually associated with Python tools, not LBANN.
2022-03-03 18:12:14 +00:00
dependabot[bot]
f6ea56276f build(deps): bump actions/setup-python from 2.3.1 to 3 (#29253)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 2.3.1 to 3.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](f382193329...0ebf233433)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-03 19:10:47 +01:00
Paul Kuberry
cfae035a7b py-pycompadre: add new package (#29316) 2022-03-03 19:10:04 +01:00
Eduardo Rothe
9dfaa3d8fd relion: add version 4.0-beta (#29318)
- add version 4.0-beta to available versions
- update stable version to 3.1.3

Co-authored-by: Eduardo Rothe <eduardo.rothe@epfl.ch>
2022-03-03 19:08:46 +01:00
haralmha
858c780e1f hbase: add v2.4.9 (#29319) 2022-03-03 19:04:58 +01:00
Tamara Dahlgren
afc397be3a OpenMPI: correct slurm-pmi conflict version (#29309) 2022-03-03 17:40:14 +01:00
Weston Ortiz
cd2cb4aef5 goma: add new package (#29307) 2022-03-03 17:26:08 +01:00
haralmha
ebf92b428e prophecy4f: add new package (#29289) 2022-03-03 15:03:10 +01:00
Carlos Bederián
1834ab971d nss: set C++ compiler (#29195) 2022-03-03 14:59:29 +01:00
dependabot[bot]
ed6274a0bf build(deps): bump actions/checkout from 2 to 3 (#29276)
Bumps [actions/checkout](https://github.com/actions/checkout) from 2 to 3.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v2...a12a3943b4bdde767164f792f33f40b04645d846)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-03 14:43:49 +01:00
dependabot[bot]
450bbdd854 build(deps): bump docker/login-action from 1.13.0 to 1.14.1 (#29277)
Bumps [docker/login-action](https://github.com/docker/login-action) from 1.13.0 to 1.14.1.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](6af3c118c8...dd4fa0671b)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-03 14:42:28 +01:00
Michael Kuhn
49069e4f58 installer: Fix cosmetic problem with terminal title (#29070)
The status displayed in the terminal title could be wrong when doing
distributed builds. For instance, doing `spack install glib` in two
different terminals could lead to the current package being reported as
`40/29` due to the way Spack handles retrying locks.

Work around this by keeping track of the package IDs that were already
encountered to avoid counting packages twice.
2022-03-03 14:21:15 +01:00
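The workaround described above — counting each package ID at most once — can be sketched as follows (hypothetical helper, not Spack's actual installer code):

```python
def deduplicated_progress(events):
    """Yield (done, total) progress pairs from a stream of
    (pkg_id, total) reports, counting each package ID only once.
    Retried locks re-report the same package; skip those repeats."""
    seen = set()
    done = 0
    for pkg_id, total in events:
        if pkg_id in seen:
            continue  # already counted: a retried lock, not a new package
        seen.add(pkg_id)
        done += 1
        yield done, total
```

With two terminals both reporting `glib`, the count now stays within the total instead of drifting past it as in the `40/29` example.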
kwryankrattiger
20471b8420 ECP-SDK: Propagate hdf5 to darshan-runtime (#29081) 2022-03-03 10:41:10 +00:00
Wouter Deconinck
5cd1e08e8a libdrm: new versions up to v2.4.110, build system meson (#27253) 2022-03-03 10:39:47 +01:00
Scott Wittenburg
c72735229f test/installer.py: remove commented code and inaccurate docstring (#29305) 2022-03-03 10:19:57 +01:00
Qian Jianhua
79c0f631de gfsio: support Fujitsu compiler (#29311) 2022-03-03 09:06:49 +00:00
Toyohisa Kameyama
01c1e6860f onednn: add version 2.5.2. (#29259) 2022-03-02 19:33:40 -06:00
kwryankrattiger
cbfe0d7492 HIP: Change mesa dep to gl (#29017)
* HIP: Change mesa18 dep to gl

* Mesa: Conflict with llvm-amdgpu when +llvm and swr

* Add def for suffix

* Disable llvm suffix patch.

* LLVM: Remove version suffix patches
2022-03-02 18:02:30 -07:00
Hadrien G
a6aff211d2 [acts] Add version 17 and 17.1, digitization plugin went in core (#28745) 2022-03-02 23:16:18 +01:00
Jen Herting
dfdb11bc71 julia: add version 1.7.2 (#29303) 2022-03-02 22:52:36 +01:00
Danny McClanahan
2c331a1d7f make @llnl.util.lang.memoized support kwargs (#21722)
* make memoized() support kwargs

* add testing for @memoized
2022-03-02 11:12:15 -08:00
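A minimal kwargs-aware memoization decorator in the spirit of this change might look like the sketch below (not the actual `llnl.util.lang` implementation); the cache key folds keyword arguments into a hashable frozenset:

```python
import functools

def memoized(func):
    """Cache results keyed on positional and keyword arguments.
    All arguments must be hashable."""
    cache = {}

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        key = (args, frozenset(kwargs.items()))
        if key not in cache:
            cache[key] = func(*args, **kwargs)
        return cache[key]

    wrapper.cache = cache  # exposed so tests can inspect or clear it
    return wrapper
```

One caveat of this simple scheme: `f(1, b=2)` and the equivalent positional call `f(1, 2)` produce different cache keys.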
kwryankrattiger
916c94fd65 ECP-SDK: Propagate cuda to ascent (#28878) 2022-03-02 13:39:36 -05:00
kwryankrattiger
1599d841c0 ECP-SDK: ParaView 5.11: required for CUDA (#29054)
* ECP-SDK: ParaView 5.11: required for CUDA

* Add conflict with ParaView@master

Because of the additional constraints for cuda, ParaView@master may be
selected unintentionally. Prefer older versions of ParaView without cuda
to master with cuda.
2022-03-02 13:17:25 -05:00
QuellynSnead
e6dcd382ee Paraview: Use 'UNIX Makefiles' instead of ninja as the CMake generator for XL (#29163)
* hypre: Add releases 2.21.0 and 2.22.0

* Revert "hypre: Add releases 2.21.0 and 2.22.0"

This reverts commit 8921cdb3ac.

* Address external linkage failures in elfutils 0.185:
  https://bugs.gentoo.org/794601
  https://sourceware.org/pipermail/elfutils-devel/2021q2/003862.html
Encountered while building within a Spack environment.

* Revert "Address external linkage failures in elfutils 0.185:"

This reverts commit 76b93e4504.

* paraview: The ninja generator has problems with XL and CCE

See https://gitlab.kitware.com/paraview/paraview/-/issues/21223

* paraview: Add variant to allow choice of cmake generator.
This will be necessary until problems with cmake+ninja on XL and
CCE builds can be resolved.

See https://gitlab.kitware.com/paraview/paraview/-/issues/21223

* paraview: ninja generator problems with XL/CCE
    By popular preference, abandon the idea of a special variant
    and select the generator based on compiler.

* Greg Becker suggested using the dedicated "generator" method to
pass the choice of makefile generator to cmake.

* paraview: The build errors I saw before with paraview%cce + ninja
have not reappeared in subsequent testing, so I'm dropping it from this
PR. If they re-occur I'll report the issue separately to KitWare.
2022-03-02 11:26:31 -05:00
Manuela Kuhn
a94f11a2b2 py-nibabel: add 3.2.2 (#29266)
* py-nibabel: add 3.2.2

* fix dependencies
2022-03-02 09:50:09 -06:00
Manuela Kuhn
a1f32cdaff py-msgpack: add 1.0.3 (#29216)
* py-msgpack: add 1.0.3

* remove dependencies
2022-03-02 09:49:38 -06:00
Massimiliano Culpo
8d118104c7 Fix typos when forwarding arguments to traverse_edges (#29261)
A few calls use `deptypes=...` instead of `deptype=...`
2022-03-02 08:43:26 +01:00
Phil Carns
a43e3f3ffb bmi package: add version 2.8.1 (#29274) 2022-03-01 18:56:44 -08:00
Wouter Deconinck
bab331ff34 assimp package: add version 5.2.2 (#29240) 2022-03-01 18:48:18 -08:00
Wouter Deconinck
d9113eb5ec gaudi package: add version 36.4 (#29241) 2022-03-01 18:47:21 -08:00
Valentin Volkl
4831daa2dc sherpa package: deprecate old versions with build failures (#29268) 2022-03-01 18:46:00 -08:00
Phil Carns
d5dd6471fc mochi-margo package: add version 0.9.7 (#29275) 2022-03-01 18:44:03 -08:00
Glenn Johnson
277f578707 Fix tags attribute for tar (#29189) 2022-03-01 18:35:44 -07:00
iarspider
3478b06262 py-gosam: use legacy setup.py invocation (fixes #29260) (#29264) 2022-03-01 18:31:48 -06:00
Richarda Butler
7268ab75f4 Superlu: Update Spack test dir (#27844)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-03-01 16:20:47 -08:00
Harmen Stoppels
f2c5092588 llvm: add missing system libs (#29270)
```
$ ./bin/llvm-config --link-static --system-libs
-lrt -ldl -lpthread -lm -lz -ltinfo -lxml2
```

`libz` and `libxml2` were missing.
2022-03-01 15:35:38 -07:00
Jen Herting
d05560ee32 New package: py-deepsig (#28480)
* [py-deepsig] New package and several cherry picks

* [py-deepsig] Cleanup of testing comments

* finished package with git reference

* fixed flake8 issue

* [py-deepsig] fixed copyright

* [py-deepsig] switched to version with functional requirements.txt. Ignoring missing README_RAW.md

* [py-deepsig] updated version in pypi variable

* [py-deepsig] simplifying missing readme workaround

Co-authored-by: Doug Heckman <dahdco@rit.edu>
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-03-01 13:21:04 -07:00
Manuela Kuhn
71f081366b py-neurora: add 1.1.6.1 (#29265) 2022-03-01 13:51:49 -06:00
Manuela Kuhn
327fcf7a54 py-nest-asyncio: add 1.5.4 (#29262) 2022-03-01 13:48:45 -06:00
Manuela Kuhn
8cefe38b0e py-keyrings-alt: add 4.1.0 (#29177) 2022-03-01 07:20:52 -07:00
Manuela Kuhn
88eb437d94 py-keyring: add 23.5.0 (#29176) 2022-03-01 06:54:10 -07:00
Manuela Kuhn
a155975b8b mesa: add conflict with llvm@13 (#28870)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-03-01 11:10:58 +01:00
Tamara Dahlgren
b20df12d09 test_env_install_two_specs_same_dep: properly check installed specs (#29222) 2022-03-01 10:35:14 +01:00
Harmen Stoppels
f60c6ca485 libuv: remove bash dependency (#29251) 2022-03-01 10:15:22 +01:00
Wouter Deconinck
a3947381c7 assimp: patch for issue that prevents 5.1: from working with qt (#29257) 2022-02-28 18:42:08 -07:00
Manuela Kuhn
6eef12cd10 py-nbclassic: add 0.3.5 (#29220)
* py-nbclassic: add 0.3.5

* Update var/spack/repos/builtin/packages/py-nbclassic/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fix style

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-02-28 16:36:06 -07:00
Cory Bloor
2bd016895d rocblas: add spack test support (#27436)
* rocblas: add spack test support

* rocblas: limit spack test support to @4.2.0:

* rocblas: simplify define
2022-03-01 00:26:08 +01:00
Jen Herting
10f3113b3c New package: py-ci-sdr (#28644)
* clean py-ci-sdr

* [py-ci-sdr] updated copyright

* [py-ci-sdr] added upper bound for python version

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-02-28 15:27:44 -06:00
Massimiliano Culpo
9b298fd7e4 Add a new test to catch exit code failure (#29244)
* Add a new test to catch exit code failure

fixes #29226

This introduces a new unit test that checks the return
code of `spack unit-test` when it is supposed to fail.

This is to prevent bugs like the one introduced in #25601
in which CI didn't catch a missing return statement.

In retrospect, it seems that the shell tests we have right
now all go through `tty.die` or similar code paths which
call `sys.exit(a)` explicitly. This new test instead checks
`spack unit-test` which relies on the return code from
command invocation in case of errors.
2022-02-28 12:55:24 -08:00
Toyohisa Kameyama
8f4e029e3a qt: Fix build with Fujitsu compiler. (#29199) 2022-02-27 22:12:36 -05:00
iarspider
d29253bd7a Add 'master' version for dmtcp (#29211)
* Add 'develop' version for dmtcp

* Update var/spack/repos/builtin/packages/dmtcp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-02-27 13:34:09 -06:00
iarspider
8eefab4033 New version: py-onnx 1.8.1 (#29214)
* New version: py-onnx 1.8.1

* Update package.py
2022-02-26 11:50:17 -06:00
Ondřej Čertík
abfd300eef libtermkey: Update to the latest release (#29229)
And add a missing dependency.

Fixes #26672.
2022-02-26 10:15:02 -07:00
Ondřej Čertík
dd5943bc6f libluv: update to the latest release (#29228)
Fixes #26673.
2022-02-26 10:11:53 -07:00
Seth R. Johnson
00ed99dc16 darwin: propagate build environment variables for Autotools (#28948)
GraphViz has failures due to overriding the default autoconf build
environment variables.
2022-02-26 11:19:40 -05:00
Harmen Stoppels
db2340007a openmpi: ~gpfs by default (#29218) 2022-02-25 17:51:07 -07:00
Manuela Kuhn
2610423e78 r-rlang: add 1.0.1 (#29040) 2022-02-25 18:19:57 -06:00
Manuela Kuhn
888eb11565 r-foreach: add 1.5.2 (#29062) 2022-02-25 18:19:29 -06:00
Manuela Kuhn
1cc5391443 r-cli: add 3.2.0 (#29005) 2022-02-25 18:18:31 -06:00
Scott Wittenburg
b082c33c85 commands: Propagate command return value as exit code (#29223) 2022-02-25 10:49:56 -08:00
Manuela Kuhn
d62b8f0bf3 py-jupyterlab: add 3.1.19 and 3.2.9 (#29174) 2022-02-24 23:24:13 -06:00
Kinh Nguyen
c17f8d938e r-units 0.8-0 (#29183) 2022-02-24 17:54:38 -08:00
Glenn Johnson
c6556b7a06 Adjust binutils detection (#29188)
The `spack external find binutils` command was failing to find my system
binutils because the regex was not matching. The name of the executable
follows the string 'GNU' in all three installations I tested, so I
changed the regex to look for that. On my CentOS-7 system, the version
had the RPM details appended, so I set the regex to capture only the
first three parts of the version.
2022-02-24 17:52:45 -08:00
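A version-detection regex along the lines described — keying off the 'GNU' string and capturing a bounded number of version components — could look like this sketch (the pattern Spack actually adopted may differ):

```python
import re

def parse_binutils_version(output):
    """Pull a version like '2.38' or '2.27-44' out of `ld --version`
    output, e.g. 'GNU ld version 2.27-44.base.el7_9.1' on CentOS 7."""
    match = re.search(r"GNU \S+ .*?(\d+\.\d+(?:[-.]\d+)?)", output)
    return match.group(1) if match else None
```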
Tim Haines
31c8567007 capstone: update URL (#29190)
* capstone: Change GitHub URL

* capstone: Remove 'master' branch

Only the 'next' branch is used for development/release now.
2022-02-24 17:51:28 -08:00
Harmen Stoppels
8c1e54180e disable gpfs through --with-gpfs=no (#29181) 2022-02-24 22:39:38 +01:00
Manuela Kuhn
dc01f9597e py-jedi: add 0.18.1 (#29130)
* py-jedi: add 0.18.1

* fix py-parso version
2022-02-24 15:09:58 -06:00
Manuela Kuhn
8f372fc88f py-jupyterlab-server: add 2.10.3 (#29172) 2022-02-24 15:07:24 -06:00
Manuela Kuhn
6cc2e7bcd4 py-ipython: add 8.0.1 (#29175) 2022-02-24 15:04:28 -06:00
Manuela Kuhn
d7b9ad6456 py-lxml: add 4.8.0 (#29184) 2022-02-24 14:57:29 -06:00
Seth R. Johnson
ceea479b56 vecgeom: add patch for @1.1.18 +cuda (#29185) 2022-02-24 09:34:42 -07:00
Harmen Stoppels
8bd9527a71 mpich: add pmi=cray support (#29160)
* mpich: add pmi=cray support

After marking `cray-pmi` as external:

```yaml
packages:
  cray-pmi:
    externals:
    - spec: cray-pmi@5.0.17
      prefix: /opt/cray/pe/pmi/5.0.17
```

You can now install

```
spack install something ^mpich pmi=cray
```

and

```console
srun $(spack location -i something)/bin/your_app
```
2022-02-24 16:35:01 +01:00
Manuela Kuhn
205e9f7d73 py-jupyter-client: add 7.1.2 (#29135) 2022-02-24 05:18:18 -07:00
Manuela Kuhn
a8b1fbde41 py-jupyter-core: add 4.9.2 (#29134) 2022-02-24 05:17:56 -07:00
Manuela Kuhn
3243731c1c py-jupyter-server: add 1.13.5 (#29145) 2022-02-24 04:32:52 -07:00
Manuela Kuhn
0ed94f9529 py-json5: add 0.9.6 (#29133) 2022-02-24 04:23:44 -07:00
Manuela Kuhn
96236f229f py-joblib: add 1.1.0 (#29131) 2022-02-24 03:11:47 -07:00
QuellynSnead
dccc58c0ad netlib-lapack: apply ibm-xl-3.9.1 patch to cce (#28812) 2022-02-24 10:55:35 +01:00
Diego Alvarez
2e0cf6f9ee nextflow: add v21.10.6 (#29164) 2022-02-24 10:54:14 +01:00
Carlos Bederián
b2a0b6d6c3 freeglut: add 3.2.2 (#29165) 2022-02-24 10:53:25 +01:00
Carlos Bederián
fddc58387c libdrm: add -fcommon to CFLAGS for %aocc@2.3.0: and %clang@11.0.0: (#29169) 2022-02-24 10:20:54 +01:00
Tamara Dahlgren
0b4f40ab79 Testing: Summarize test results and add verbose output (#28700) 2022-02-23 18:36:21 -08:00
eugeneswalker
b21d30d640 tau: java is runtime dep for paraprof (#29166) 2022-02-23 15:54:41 -08:00
eugeneswalker
b1f223d224 tau: unpin binutils version (#29161) 2022-02-23 16:28:57 -05:00
Nils Vu
9fef13ce95 blaze package: add blas/lapack/smp variants (#29010)
Also:

* spectre: disable blaze+blas in earlier versions
* cblas: fix a broken URL link
2022-02-23 11:25:25 -08:00
Simon Frasch
e4ba7bb044 spla: add version 1.5.3 (#29048) 2022-02-23 07:02:50 -08:00
Seth R. Johnson
ab1e9d717e util-linux-uuid: add conflict for new version and old compilers (#29149)
The system compiler on RHEL7 fails to build the latest linux-uuid.
```
util-linux-uuid@2.37.4%gcc@4.8.5 arch=linux-rhel7-haswell
```
results in:
```
libuuid/src/unparse.c:42:73: error: expected ';', ',' or ')' before 'fmt'
 static void uuid_fmt(const uuid_t uuid, char *buf, char const *restrict fmt)
```
It looks like it's assuming C99 by default, so there may be a better way
to handle this... but this at least works.
2022-02-23 07:35:34 -07:00
Brian Spilner
8e4ccf91e4 cdo: add 2.0.4 (#29157) 2022-02-23 13:39:56 +01:00
Carlos Bederián
125e4e00b4 cuda: add 11.6.1 (#29156) 2022-02-23 12:06:22 +01:00
Massimiliano Culpo
1ddad522a4 Move early exit for setup only argument (#29041)
See https://github.com/spack/spack/pull/28468/files#r809156986

If we exit before generating the:

 error("Dependencies must have compatible OS's with their dependents").
 ...

facts, we'll output a problem that is effectively
different from the one solved by clingo.
2022-02-23 01:46:52 -08:00
eugeneswalker
1cb82dc542 e4s ci: packages: prefer openturns@1.18 (#29154) 2022-02-22 17:17:34 -08:00
Harmen Stoppels
17c065a750 libtree: 3.0.3 (#29153) 2022-02-22 16:56:30 -08:00
Tom Scogland
a9ba40164a Checksum match (#28989)
* cmd/checksum: prefer url matching url_from_version

This is a minimal change toward getting the right archive from places
like github.  The heuristic is:

* if an archive url exists, take its version
* generate a url from the package with pkg.url_from_version
* if they match
  * stop considering other URLs for this version
  * otherwise, continue replacing the url for the version

I doubt this will always work, but it should address a variety of
versions of this bug.  A good test right now is `spack checksum gh`,
which checksums macos binaries without this, and the correct source
packages with it.

fixes #15985
related to #14129
related to #13940

* add heuristics to help create as well

Since create can't rely on an existing package, this commit adds another
pair of heuristics:
1. if the current version is a specifically listed archive, don't
   replace it
2. if the current url matches the result of applying
   `spack.url.substitute_version(a, ver)` for any a in archive_urls,
   prefer it and don't replace it

fixes #13940

* clean up style and a lingering debug import

* ok flake8, you got me

* document reference_package argument

* Update lib/spack/spack/util/web.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* try to appease sphinx

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-02-23 00:55:59 +00:00
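The second heuristic above — keep an archive URL when substituting the version back into it reproduces it — can be sketched with a naive stand-in for `spack.url.substitute_version` (the real function is considerably more careful):

```python
import re

def substitute_version(url, version):
    """Naive stand-in: swap the last dotted-number run in the final
    path segment of `url` for `version`."""
    return re.sub(r"\d+(?:\.\d+)+(?=[^/]*$)", version, url, count=1)

def choose_url(generated_url, archive_urls, version):
    """Prefer an explicitly listed archive URL that is stable under
    version substitution; otherwise fall back to the generated URL."""
    for a in archive_urls:
        if substitute_version(a, version) == a:
            return a  # stop considering other URLs for this version
    return generated_url
```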
Manuela Kuhn
535262844b py-iso8601: add 1.0.2 (#29128) 2022-02-22 20:21:54 +00:00
Manuela Kuhn
ed447e1ac7 py-stack-data: add new package (#29125) 2022-02-22 14:11:52 -06:00
Manuela Kuhn
d840c3a069 py-pure-eval: add 0.2.2 and get sources from pypi (#29123) 2022-02-22 14:10:46 -06:00
Manuela Kuhn
6a259ecd85 py-isodate: add 0.6.1 (#29129) 2022-02-22 14:07:31 -06:00
Todd Gamblin
36b0730fac Add spack --bootstrap option for accessing bootstrap store (#25601)
We can see what is in the bootstrap store with `spack find -b`, and we can clean it with `spack
clean -b`, but we can't do much else with it, and if there are bootstrap issues they can be hard to
debug.

We already have `spack --mock`, which allows you to swap in the mock packages from the command
line. This PR introduces `spack -b` / `spack --bootstrap`, which runs all of spack with
`ensure_bootstrap_configuration()` set. This means that you can run `spack -b find`, `spack -b
install`, `spack -b spec`, etc. to see what *would* happen with bootstrap configuration, to remove
specific bootstrap packages, etc. This will hopefully make developers' lives easier as they deal
with bootstrap packages.

This PR also uses a `nullcontext` context manager. `nullcontext` has been implemented in several
other places in Spack, and this PR consolidates them to `llnl.util.lang`, with a note that we can
delete the function if we ever require a new enough Python.

- [x] introduce `spack --bootstrap` option
- [x] consolidated all `nullcontext` usages to `llnl.util.lang`
2022-02-22 12:35:34 -07:00
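The consolidated helper behaves like `contextlib.nullcontext`, which only became available in Python 3.7; a back-ported sketch of the idea:

```python
from contextlib import contextmanager

@contextmanager
def nullcontext(enter_result=None):
    """A no-op context manager for code paths that only sometimes
    need a real one (e.g. a lock or a redirected stream)."""
    yield enter_result

# Typical use: take the lock only when one is configured.
def run(task, lock=None):
    with (lock if lock is not None else nullcontext()):
        return task()
```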
Adam J. Stewart
800933bbdf py-notebook: fix py-nbconvert dep (#29020) 2022-02-22 15:26:23 +01:00
Adam J. Stewart
2516885615 py-shapely: add v1.8.1 (#29023) 2022-02-22 15:24:56 +01:00
SXS Bot
2446771b63 spectre: add v2022.02.17 (#29045)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2022-02-22 15:21:45 +01:00
kwryankrattiger
f9843367ed ParaView: constrain +cuda variant to version < 5.8 (#29049) 2022-02-22 15:21:07 +01:00
dependabot[bot]
b08ed91309 build(deps): bump docker/login-action from 1.12.0 to 1.13.0 (#29053)
Bumps [docker/login-action](https://github.com/docker/login-action) from 1.12.0 to 1.13.0.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](42d299face...6af3c118c8)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-02-22 15:18:38 +01:00
Ethan Stam
0d3ecff903 ParaView: add use_vtkm as a multi-valued variant (#21977)
spelling fixes
2022-02-22 15:17:18 +01:00
Manuela Kuhn
aca6b73a6c py-flit: add v3.6.0 (#29124) 2022-02-22 14:58:21 +01:00
Manuela Kuhn
92970c2006 git-annex: add 10.20220121 (#28993) 2022-02-22 14:54:29 +01:00
Harmen Stoppels
f47e24381d zstd+programs: use xz for lzma lib (#29107) 2022-02-22 14:19:06 +01:00
Seth R. Johnson
2941afe9e0 qt: mark conflict between older versions and with newer xcode (#29122) 2022-02-22 14:15:15 +01:00
Harmen Stoppels
e6521e7379 llvm: new version 13.0.1 (#29119) 2022-02-22 06:22:22 -05:00
Tiziano Müller
37f021ef3c cp2k: bump version to 9.1, fix building with CUDA (#29108)
first step towards fixing #28554
2022-02-22 11:26:47 +01:00
Chuck Atkins
2ab1ace5f4 rhash: un-block intel builds (#29117) 2022-02-22 09:24:29 +01:00
liuyangzhuan
8daee48231 openturns: add v1.18 (#29097) 2022-02-22 09:07:58 +01:00
Manuela Kuhn
8485474140 py-imageio: add 2.16.0 (#29066)
* py-imageio: add 2.16.0

* Update var/spack/repos/builtin/packages/py-imageio/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-02-21 16:41:28 -07:00
Glenn Johnson
51488dbff5 update bioconductor packages to bioconductor 3.14 (#28900) 2022-02-21 15:38:27 -07:00
Manuela Kuhn
1953d986ae py-asttokens: add new package (#29073) 2022-02-21 14:47:28 -07:00
Manuela Kuhn
fef58db792 r-brobdingnag: add 1.2-7 (#28997) 2022-02-21 13:57:16 -06:00
Manuela Kuhn
36c64c8012 r-bh: add 1.78.0-0 (#28996) 2022-02-21 13:48:24 -06:00
Manuela Kuhn
8cd95b9f35 r-backports: add 1.4.1 (#28995) 2022-02-21 13:46:58 -06:00
Todd Gamblin
7912a8e90b bugfix: Not all concrete versions on the CLI should be considered real (#28620)
Some "concrete" versions on the command line, e.g. `qt@5` are really
meant to satisfy some actual concrete version from a package. We should
only assume the user is introducing a new, unknown version on the CLI
if we, well, don't know of any version that satisfies the user's
request.  So, if we know about `5.11.1` and `5.11.3` and they ask for
`5.11.2`, we'd ask the solver to consider `5.11.2` as a solution.  If
they just ask for `5`, though, `5.11.1` or `5.11.3` are fine solutions,
as they satisfy `@5`, so use them.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-02-21 11:46:37 -08:00
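The rule can be illustrated with a simplified sketch (hypothetical helper; Spack's real Version/Spec machinery is richer):

```python
def candidate_versions(requested, known):
    """Versions the solver should consider for a request like '5' or
    '5.11.2': known versions that satisfy the request, or -- only if
    none do -- the requested version itself as a new, unknown one."""
    def satisfies(version, prefix):
        want = prefix.split(".")
        return version.split(".")[: len(want)] == want

    matches = [v for v in known if satisfies(v, requested)]
    return matches or [requested]
```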
Todd Gamblin
2210f84a91 py-numpy, py-scipy: add rgommers as a maintainer (#29109) 2022-02-21 12:41:27 -07:00
Alberto Invernizzi
b8d042273a Bring back cuda@11.4.0 conflicts for GCC and clang; add 11.4.3:11.4.4 (#29076)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-02-21 20:28:56 +01:00
Christoph Junghans
9173fd7c61 votca: fix git url and allow build without xtp parts (#29056)
* votca: fix git url
* Votca: add xtp variant
2022-02-21 09:23:00 -08:00
Glenn Johnson
275608e2f2 update CRAN R packages (#28786) 2022-02-21 11:22:33 -06:00
Manuela Kuhn
f744640289 py-importlib-metadata: add 4.11.1 (#29069) 2022-02-21 09:23:32 -07:00
Manuela Kuhn
b9d26caab8 py-executing: add new package (#29074) 2022-02-21 08:56:42 -07:00
Manuela Kuhn
0dd5d493d5 py-pandas: fix tests (#29044) 2022-02-21 08:20:33 -07:00
Seth R. Johnson
e7894b4863 geant4: fix CMake-derived data path (#29091)
* geant4-data: use build+run-only depends

* geant4: point to dependent datadir

This is "used" in the configure step to set up the Geant4Config.cmake
file's persistent pointers to the data directory, but the dependency
is still listed as "run" -- though I'm not sure this is the right behavior
since the geant4 installation really does change as a function of the
data directory, and the installation is incomplete/erroneous
without using one.

* Style
2022-02-21 12:05:59 +00:00
Manuela Kuhn
94d75d0327 py-flit-core: add v3.6.0 (#29009) 2022-02-21 11:35:00 +01:00
Adam J. Stewart
96fceb6e38 py-liblas: add new package (#29027) 2022-02-21 11:23:07 +01:00
h-murai
fc8c3ada56 nwchem: add support for Fujitsu A64FX compilers on v7.0.2 (#28990) 2022-02-21 10:26:51 +01:00
Sebastian Pipping
60fe21ddd7 expat: add v2.4.6, deprecate v2.4.5 (#29100) 2022-02-21 10:24:47 +01:00
Erik Schnetter
0ffce33447 Add c-blosc v1.21.1, c-blosc2 v2.0.4 (#29101) 2022-02-21 10:16:24 +01:00
Seth R. Johnson
043794362a iwyu: add 0.17 and improve llvm target requirements (#29077)
* iwyu: new version 0.17

* iwyu: allow default llvm target when x86/x86_64
2022-02-21 10:14:21 +01:00
Tom Scogland
f36a6f3fc0 neovim remote: add new package (#29103) 2022-02-21 10:11:57 +01:00
Tom Scogland
7995b7eac4 py-pynvim: add new package (#29102) 2022-02-21 10:11:30 +01:00
Seth R. Johnson
509f1cc00a git: install keychain helper on macOS (#29090) 2022-02-21 10:09:49 +01:00
Seth R. Johnson
7459aa6c95 Various package fixes for macOS (#29024)
* trilinos: disable dl on macOS

* py-sphinx-argparse: add explicit poetry dependency

* libzmq: fix libbsd dependency

libbsd is *always* required when +libbsd (introduced in #28503) . #20893
had previously removed the macos dependency because libbsd wasn't always
enabled. Libbsd support is only available after 4.3.2 so change it to a
conflict rather than bumping the dependency.

* hdf5: work around GCC11.2 monterey fortran bug

* go-bootstrap: mark conflict for monterey
2022-02-21 09:08:08 +01:00
Olivier Cessenat
2852126196 perl-fth: add the external find feature for perl-fth and new release (#28886) 2022-02-20 23:20:27 -07:00
Sebastian Pipping
9e2d78cffc expat: Add latest release 2.4.5 with security fixes (#29086) 2022-02-19 15:53:20 -07:00
Seth R. Johnson
e19f29da66 silo package: fix patch for 4.11-bsd (#29075)
There's a check to skip for version '10.2-bsd', but the patch
does not apply to '11-bsd' either.
2022-02-19 07:26:32 -07:00
Harmen Stoppels
39fcafaf45 binutils: add v2.38 (#29067) 2022-02-19 06:02:34 -07:00
downloadico
22d07b328e New package: ovito (#28787) 2022-02-19 05:05:11 -07:00
Manuela Kuhn
f37855f5bb py-ipykernel: add 6.9.1 (#29071)
* py-ipykernel: add 6.9.1

* Add py-jupyter-core version restriction
2022-02-18 22:54:07 -07:00
Seth R. Johnson
f33770553f Add/remove conflicts for Apple silicon (M1/aarch64) (#28850)
* go: remove broken bootstrapping for macos aarch64

* qt: mark apple silicon conflict

* trilinos: remove apple silicon conflict

* Apply review suggestions

* python: add apple silicon conflict for 2.7
2022-02-18 21:56:32 -07:00
Manuela Kuhn
e886a61a6c r-distributional package: add version 0.3.0 (#29037) 2022-02-18 18:38:19 -08:00
eugeneswalker
bbb81d5d68 kokkos@:3.5.00 +sycl conflicts with %dpcpp@2022.0.0 (#29055) 2022-02-18 14:33:59 -08:00
Manuela Kuhn
99707beae7 py-humanize: add 4.0.0 (#29034) 2022-02-18 16:28:28 -06:00
Glenn Johnson
be53b5db96 py-tensorflow-estimator: new versions (#28093)
* py-tensorflow-estimator: new versions

* Set proper constraint on py-funcsigs

Only needed when the python version is less than 3.3.
2022-02-18 14:19:05 -06:00
Andrew W Elble
45f3b2fc52 py-keras: new versions (#27991)
* py-keras: new versions

* incorporate feedback

* adjust deps, add url_for_version()

* change to use pip

* remove build phase / remove unnecessary TEST_TMPDIR envvar

* switch url_for_version around
install, not build
2022-02-18 14:18:11 -06:00
Glenn Johnson
e5f6914bd2 py-tensorflow: add versions 2.5.0 and 2.6.0 (#27138)
* py-tensorflow: add versions 2.5.0 and 2.6.0

- add version 2.5.0
- add version 2.6.0
- add patches for newer protobuf
- set constraints

* Remove import os. left over from testing

* Remove unused patch file

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add py-clang dependency

* Adjust py-clang constraint

* Build tensorflow with tensorboard

- tensorflow
    - added 2.6.1 and 2.6.2 versions

- tensorboard
    - have bazel use number of jobs set by spack
    - add versions and constraints

- new package: py-tensorboard-data-server
- use wheel for py-tensorboard-plugin-wit
  This package cannot build with newer versions of bazel that are
  needed for newer versions of py-tensorboard.

* Update var/spack/repos/builtin/packages/py-clang/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Remove empty line at end of file

* Fix import sorting

* Adjust python dependencies on py-clang

* Add version 2.7.0 of py-tensorflow and py-tensorboard

* Adjust bazel constraints

* bazel-4 support begins with py-tensorflow-2.7.0

* Adjust dependencies

* Loosen cuda constraint on versions > 2.5

Tensorflow-2.5 and above can use cuda up to version 11.4.

* Add constraints to patch

The 0008-Fix-protobuf-errors-when-using-system-protobuf.patch patch
should only apply to versions 2.5 and above.

* Adjust constraints

- versions 2.4 and below need protobuf-3.12 and below
- versions 2.4 and above can use up to cuda-11.4
- versions 2.2 and below can not use cudnn-8
- the null_linker_bin patch should only be applied to versions 2.5 and
  above.

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-tensorflow/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix py-grpcio dependency for version 2.7

Also, make sure py-h5py mpi specs are consistent.

* Add llvm as run dependency.

* Fix python spec for py-tensorboard

* Fix py-google-auth spec for py-tensorboard

* Do not override the pip spec for tensorboard-plugin-wit

* Converted py-tensorboard-plugin-wit to wheel only package

* Fix bazel dependency spec in tensorflow

* Adjust pip masks

- allow tensorboard to be specified in pip constraints
- mask tensorflow-estimator

* Remove blank line at end of file

* Adjust pip constraints in setup.py

Also, adjust constraint on a patch that is fixed in 2.7

* Fix flake8 error

Adjust formatting for consistency.

* Get bazel dep right

* Fix old cudnn dependency, caught in audit test

* Adjust the regex to ensure proper line is changed

* Add py-libclang package

- Stripped the py-clang package down to just version 5
- added comments to indicate the purpose of py-clang and that
  py-libclang should be preferred
- set dependencies accordingly in py-tensorflow

* Remove cap on py-h5py dependency for v2.7

* Add TODO entries for tensorflow-io-gcs-filesystem

* Edit some comments

* Add phases and select python in PATH for tensorboard-data-server

* py-libclang

- remove py-wheel dependency
- remove raw string notation in filter_file

* py-tensorboard-data-server

- remove py-wheel dep
- remove py-pip dep
- use python from package class

* py-tensorboard-plugin-wit

- switch to PythonPackage
- add version 1.8.1
- remove unneeded code

* Add comment as to why a wheel is needed for tensorboard-plugin-wit

* remove which pip from tensorboard-data-server

* Fix dependency specs in tensorboard

* tweak dependencies for tensorflow

* fix python constraint

* Use llvm libs property

* py-tensorboard-data-server

- merge build into install
- use std_pip_args

* remove py-clang dependency

* remove my edits to py-tensorboard-plugin-wit

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-02-18 14:10:00 -06:00
Massimiliano Culpo
7fd94fc4bc spack external find: change default behavior (#29031)
See https://github.com/spack/spack/issues/25353#issuecomment-1041868116

This commit changes the default behavior of
```
$ spack external find
```
from searching all the possible packages Spack knows about to
search only for the ones tagged as being a "build-tool".

It also introduces a `--all` option to restore the old behavior.
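The new default can be sketched in plain Python (package names and the tag filter below are illustrative, not Spack's actual detection code):

```python
# Illustrative sketch: restrict detectable packages to those tagged as a
# "build-tool" unless the equivalent of --all is passed.
# (Package names and tags here are made up for the example.)
PACKAGES = {
    "cmake": ["build-tool"],
    "ninja": ["build-tool"],
    "openssl": [],
}

def external_find(find_all=False):
    # Old behavior (search every known package) now requires --all;
    # the default searches only packages tagged as a "build-tool".
    if find_all:
        return sorted(PACKAGES)
    return sorted(p for p, tags in PACKAGES.items() if "build-tool" in tags)

print(external_find())               # ['cmake', 'ninja']
print(external_find(find_all=True))  # ['cmake', 'ninja', 'openssl']
```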
2022-02-18 11:51:01 -07:00
Michael Kuhn
2ed52d32c7 libbson, mongo-c-driver: add 1.21.0 (#28978) 2022-02-18 07:08:48 -07:00
Michael Kuhn
31e538795e curl: add v7.81.0 (#28973) 2022-02-18 03:59:52 -07:00
Simon Frasch
0ad3319243 spfft: add version 1.0.6 (#29047) 2022-02-18 10:58:52 +01:00
Manuela Kuhn
4efd47d0c3 py-fasteners: add 0.17.3 (#28988) 2022-02-18 02:57:04 -07:00
Michael Kuhn
43016d0ff4 util-linux(-uuid): add v2.37.4 (#28985) 2022-02-18 02:08:48 -07:00
Michael Kuhn
5796de8bcb mariadb-c-client: add v3.2.6 (#28981) 2022-02-18 01:08:58 -07:00
Michael Kuhn
60e9af6e0f sqlite: add v3.37.2 (#28984) 2022-02-18 00:23:57 -07:00
Michael Kuhn
ae84ce535b pcre: add v8.45 (#28979) 2022-02-17 23:03:38 -07:00
Michael Kuhn
5bdc72e2ed pcre2: add v10.39 (#28980) 2022-02-17 22:53:47 -07:00
Michael Kuhn
7a1364fcb2 rdma-core: add v39.0 (#28987) 2022-02-17 22:44:52 -07:00
Seth R. Johnson
6c61c2695a darwin: robust macos version detection (#28991)
Prefer `sw_vers` to `platform.mac_ver`. In an anaconda3 installation, for example, the latter reports 10.16 on Monterey -- I think this is affected by how and where the python instance was built.

Use MACOSX_DEPLOYMENT_TARGET if present to override the operating system choice.
2022-02-17 20:50:41 -07:00
Toyohisa Kameyama
b768fb85c6 abinit: Add version 9.6.1 and support fujitsu compiler and fujitsu-fftw. (#28474)
* abinit: Add version 9.6.1 and support fujitsu compiler and fujitsu-fftw.
* fix conflicts +openmp.
* Add openmp conflicts.
2022-02-17 15:42:49 -08:00
Manuela Kuhn
b3f5f55f95 py-charset-normalizer: add 2.0.12 (#28945) 2022-02-17 15:44:38 -07:00
Adam J. Stewart
00f0d11b8f py-fiscalyear: add v0.4.0 (#29025) 2022-02-17 22:10:47 +01:00
Manuela Kuhn
e6142e8183 py-fonttools: add 4.29.1 and fix tests (#29032) 2022-02-17 15:06:57 -06:00
Manuela Kuhn
98fa2c6f10 py-gevent: add 21.12.0 (#29033) 2022-02-17 15:05:47 -06:00
Scott Wittenburg
38643dcd7e gitlab: Propagate stack name to downstream build jobs (#29019)
It will be useful for metrics gathering and possibly debugging to
have this environment variable available in the runner pods that
do the actual rebuilds.
2022-02-17 15:36:48 -05:00
Glenn Johnson
e7e6a16064 llvm: add libs property (#28621)
* llvm: add libs property

* Use llvm-config helper for libs property
2022-02-17 13:52:49 -06:00
Tamara Dahlgren
fefe65a35b Testing: optionally run tests on externally installed packages (#28701)
Since Spack does not install external packages, this commit skips them by
default when running stand-alone tests. The assumption is that such packages
have likely undergone an acceptance test process. 

However, the tests can be run against installed externals using 
```
% spack test run --externals ...
```
2022-02-17 19:47:42 +01:00
Michael Kuhn
4d669bfdf4 zstd: add v1.5.2 (#28986) 2022-02-17 07:05:23 -07:00
Harmen Stoppels
d93f9b82ac Reduce verbosity of patches=... variant (#29015)
* reduce verbosity of patches=... variant

* Special-case prefix-matches for satisfies of patches variant
2022-02-17 11:06:32 +01:00
Massimiliano Culpo
fa132614e0 ASP-based solver: don't sort when defining variant possible values (#29013)
fixes #28260

Since we iterate over variants from many packages, the variant values
may have types which are not comparable, which causes errors at runtime.
This is not a real issue, though, since we don't need the facts to be
ordered. Thus, to avoid needless sorting, the sorted function has been
removed, and a comment has been added advising any developer who needs
to inspect these clauses for debugging to add back sorting on the first
two items only.

It's kind of difficult to add a test for this, since the error depends on 
whether Python sorting algorithm ever needs to compare the third 
value of a tuple being ordered.
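The failure mode can be reproduced with plain Python tuples (a contrived example, not Spack's actual data):

```python
# Mixed-type variant values break sorting: when the first two tuple items
# tie, Python must compare 3 with "shared", which raises TypeError.
facts = [("variant_value", "pkg", 3), ("variant_value", "pkg", "shared")]
try:
    sorted(facts)
    raised = False
except TypeError:
    raised = True
print(raised)  # True
```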
2022-02-17 08:50:50 +01:00
Tom Scogland
a0bd6c8817 binutils: add external detection (#29022) 2022-02-17 08:45:24 +01:00
Michael Kuhn
dcdf5022ad libmd: add v1.0.4 (#28971) 2022-02-16 19:50:38 -07:00
Michael Kuhn
b021cf39aa libbsd: add v0.11.5 (#28969) 2022-02-16 19:50:15 -07:00
Tom Scogland
8f5fcc6e95 extensions: allow multiple "extends" directives (#28853)
* extensions: allow multiple "extends" directives

This will allow multiple extends directives in a package as long as only one of
them is selected as a dependency in the concrete spec.

* document the option to have multiple extends
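The constraint can be modeled as a small check (a hedged sketch; function and argument names are made up, not Spack's implementation):

```python
# Sketch of the rule: a package may declare several extendees, but a
# concrete spec may select at most one of them as a dependency.
def selected_extendee(declared_extends, concrete_deps):
    selected = [e for e in declared_extends if e in concrete_deps]
    if len(selected) > 1:
        raise ValueError("spec extends more than one package: %s" % selected)
    return selected[0] if selected else None

print(selected_extendee(["python", "lua"], {"lua", "cmake"}))  # lua
```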
2022-02-16 21:23:12 +00:00
Michael Kuhn
e8c5f195a7 libffi: add v3.4.2 (#28970) 2022-02-16 13:41:29 -07:00
Mark W. Krentel
36020d69d7 intel-gtpin: new package (#28542) 2022-02-16 18:22:26 +00:00
Todd Gamblin
b1ff9c05bc concretizer: refactor argument passing for reuse
Reuse previously was a very invasive change that required parameters to be added to all
the methods that called `concretize()` on a `Spec` object. With the addition of
concretizer configuration, we can use the config system to simplify this argument
passing and keep the code cleaner.

We decided that concretizer config options should be read at `Solver` instantiation
time, and if config changes between instantiation of a particular solver and
`solve()` invocation, the `Solver` should use the settings from `__init__()`.

- [x] remove `reuse` keyword argument from most concretize functions
- [x] refactor usages to use `spack.config.override("concretizer:reuse", True)`
- [x] rework argument passing in `Solver` so that parameters are set from config
      at instantiation time
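A minimal sketch of the override pattern, assuming a flat config dict (`spack.config.override` is scope-based and more involved in reality):

```python
import contextlib

CONFIG = {"concretizer:reuse": False}

@contextlib.contextmanager
def override(path, value):
    # Temporarily set a config entry, restoring the previous value on exit.
    old = CONFIG.get(path)
    CONFIG[path] = value
    try:
        yield
    finally:
        CONFIG[path] = old

with override("concretizer:reuse", True):
    print(CONFIG["concretizer:reuse"])  # True
print(CONFIG["concretizer:reuse"])      # False
```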
2022-02-16 10:17:18 -08:00
Todd Gamblin
d33973df6c docs: add section on concretizer configuration
* Document `concretizer.yaml`, `--reuse`, and `--fresh`.
2022-02-16 10:17:18 -08:00
Todd Gamblin
a2b8e0c3e9 commands: refactor --reuse handling to use config
`--reuse` was previously handled individually by each command that
needed it. We are growing more concretization options, and they'll
need their own section for commands that support them.

Now there are two concretization options:

* `--reuse`: Attempt to reuse packages from installs and buildcaches.
* `--fresh`: Opposite of reuse -- traditional spack install.

To handle these, this PR adds a `ConfigSetAction` for `argparse`, so
that you can write argparse code like this:

```
     subgroup.add_argument(
        '--reuse', action=ConfigSetAction, dest="concretizer:reuse",
        const=True, default=None,
        help='reuse installed dependencies/buildcaches when possible'
     )
```

With this, you don't need to add logic to pull the argument out and
handle it; the `ConfigSetAction` just does it for you. This can probably
be used to clean up some other commands later, as well.
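A self-contained sketch of how such an action might look (an illustration of the idea, not Spack's actual `ConfigSetAction`; the `CONFIG` dict stands in for the config system):

```python
import argparse

CONFIG = {}  # stand-in for the real config system

class ConfigSetAction(argparse.Action):
    """When the flag is given, set the config entry named by dest
    ('section:key') to const."""

    def __init__(self, option_strings, dest, const=None, default=None, **kwargs):
        super().__init__(option_strings, dest, nargs=0, const=const,
                         default=default, **kwargs)

    def __call__(self, parser, namespace, values, option_string=None):
        section, _, key = self.dest.partition(":")
        CONFIG.setdefault(section, {})[key] = self.const
        setattr(namespace, self.dest, self.const)

parser = argparse.ArgumentParser()
parser.add_argument(
    "--reuse", action=ConfigSetAction, dest="concretizer:reuse",
    const=True, default=None,
    help="reuse installed dependencies/buildcaches when possible")
parser.parse_args(["--reuse"])
print(CONFIG)  # {'concretizer': {'reuse': True}}
```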

Code that was previously passing `reuse=True` around everywhere has
been refactored to use config, and config is set from the CLI using
a new `add_concretizer_args()` function in `spack.cmd.common.arguments`.

- [x] Add `ConfigSetAction` to simplify concretizer config on the CLI
- [x] Refactor code so that it does not pass `reuse=True` to every function.
- [x] Refactor commands to use `add_concretizer_args()` and to pass
      concretizer config using the config system.
2022-02-16 10:17:18 -08:00
Todd Gamblin
f155de7462 tests: consolidate mock scope creation logic in conftest.py
Config scopes were different for `config` and `mutable_config`,
and `mutable_config` did not have a command line scope.

- [x] Fix by consolidating the creation logic for the two fixtures.
2022-02-16 10:17:18 -08:00
Todd Gamblin
800ed16e7a config: add a new concretizer config section
The concretizer is going to grow to have many more configuration options,
and we really need some structured config for that.

* We have the `config:concretizer` option that chooses the solver,
  but extending that is awkward (we'd need to replace a string with
  a `dict`) and the solver choice will be deprecated eventually.

* We have the `concretization` option in environments, but it's
  not a top-level config section -- it's just for environments,
  and it also only admits a string right now.

To avoid overlapping with either of these and to allow the most
extensibility in the future, this adds a new `concretizer` config
section that can be used in and outside of environments. There
is only one option right now: `reuse`.  This can expand to include
other options later.

Likely, we will soon deprecate `config:concretizer` and warn when
the user doesn't use `clingo`, and we will eventually (sometime later)
move the `together` / `separately` options from `concretization` into
the top-level `concretizer` section.

This commit just adds the new section and schema. Fully wiring it
up is TBD.
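The new section would look like this in `concretizer.yaml` (only the `reuse` option exists at this point):

```yaml
# concretizer.yaml -- new top-level config section
concretizer:
  # attempt to reuse installed packages and buildcaches when concretizing
  reuse: true
```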
2022-02-16 10:17:18 -08:00
Todd Gamblin
1903e45eec refactor: convert spack.solver.asp.solve() to a class
The solver has a lot of configuration associated with it. Rather
than adding arguments to everything, we should encapsulate that
in a class. This is the start of that work; it replaces `solve()`
and its kwargs with a class and properties.
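The shape of the change, as a hedged sketch (class layout and option names are illustrative, not Spack's exact API):

```python
class Solver:
    """Encapsulate solver configuration instead of threading kwargs
    through solve()."""

    def __init__(self, config=None):
        # Options are captured once, at instantiation time; later config
        # changes do not affect this instance.
        cfg = (config or {}).get("concretizer", {})
        self._reuse = bool(cfg.get("reuse", False))

    @property
    def reuse(self):
        return self._reuse

    def solve(self, specs):
        mode = "reuse" if self.reuse else "fresh"
        return [(spec, mode) for spec in specs]

solver = Solver({"concretizer": {"reuse": True}})
print(solver.solve(["zlib"]))  # [('zlib', 'reuse')]
```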
2022-02-16 10:17:18 -08:00
Mark W. Krentel
87a3b72ef0 Add 'stable' to the list of infinity version names. (#28772)
* Add 'stable' to the list of infinity version names.
Rename libunwind 1.5-head to 1.5-stable.

* Add stable to the infinite version list in packaging_guide.rst.
2022-02-16 09:08:51 -08:00
Manuela Kuhn
884da5e326 py-etelemetry: add 0.3.0 (#28983)
* py-etelemetry: add 0.3.0

* Update var/spack/repos/builtin/packages/py-etelemetry/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-02-16 10:22:38 -06:00
Manuela Kuhn
9fc9386944 py-filelock: add 3.5.0 (#29006) 2022-02-16 10:22:16 -06:00
Manuela Kuhn
e84a2db23d py-tomli-w: add new package (#29007) 2022-02-16 10:21:09 -06:00
Adam J. Stewart
5c1edbe00a py-build: python+ensurepip not required (#28562)
* py-build: python+ensurepip not required

* Add patch to fix issue when pip is in PYTHONPATH
2022-02-16 15:53:22 +01:00
Massimiliano Culpo
5272e72344 mesa: enable the swr variant if +llvm, instead of using conflicts (#29008)
* mesa: enable the `swr` variant if +llvm, instead of using conflicts

fixes #28994
2022-02-16 07:26:29 -07:00
Adam J. Stewart
3c1b2c0fc9 find_libraries: search for both .so and .dylib on macOS (#28924) 2022-02-16 14:07:44 +01:00
psakievich
e6ea4c788a Trilinos: Add UVM variant (#28949)
13.2+ still supports UVM as an option.
2022-02-16 07:29:49 -05:00
Robert Pavel
5cd8fc37ba flecsi: update constraints and add new dependency (#28788)
Updated flecsi constraints to better match internal development. Also
added dependency on `lanl-cmake-modules` for flecsi@2.1.1:
2022-02-16 10:51:01 +01:00
Carson Woods
3e69449ecd py-archspec: add versions up to v0.1.3 (#28928) 2022-02-16 10:44:31 +01:00
haralmha
ef921e4107 storm: add v2.3.0 (#28957) 2022-02-16 10:43:37 +01:00
kwryankrattiger
6d6641f706 VTK-m: Allow building +shared+cuda_native @1.7: (#28877) 2022-02-16 10:40:33 +01:00
Tom Scogland
8f19bf2f31 gh: add versions up to v2.5.1 (#28958) 2022-02-16 10:38:11 +01:00
Adam J. Stewart
fe5cb90f83 py-nbconvert: add v6.4.2 (#28965)
No version of py-nbconvert@5: can be concretized due to conflicting versions 
of flit-core that are required. This issue could be solved by separate 
concretization of build deps.
2022-02-16 10:24:52 +01:00
Michael Kuhn
b0dc83afff glib: add v2.70.4 (#28972) 2022-02-16 01:59:19 -07:00
Michael Kuhn
4c8582dfc8 gdbm: add v1.23 (#28974) 2022-02-16 09:53:45 +01:00
Manuela Kuhn
8654cc93ef py-entrypoints: add v0.4 (#28977) 2022-02-16 09:40:54 +01:00
Manuela Kuhn
5d58b94322 py-backports-entry-points-selectable: add v1.1.1 (#28954) 2022-02-16 09:40:10 +01:00
Manuela Kuhn
924be97a9b py-docutils: add 0.18.1 (#28976) 2022-02-15 22:20:34 -07:00
Manuela Kuhn
dfd83e60ac py-argon2-cffi: add 21.3.0 (#28968) 2022-02-15 22:14:24 -07:00
Manuela Kuhn
0f9f636f38 py-distlib: add 0.3.4 (#28975) 2022-02-15 22:35:17 -06:00
Manuela Kuhn
2fa892daa2 py-decorator: add 5.1.1 (#28953) 2022-02-15 18:57:23 -07:00
Manuela Kuhn
c433a35fdb py-datalad: add 0.15.5 (#28952)
* py-datalad: add 0.15.5

* Move dependency position
2022-02-15 17:02:53 -07:00
Manuela Kuhn
23ddfba16d py-cryptography: add 36.0.1 (#28950) 2022-02-15 16:44:37 -07:00
Filippo Spiga
52d4e209e2 Adding NVIDIA HPC SDK 22.2 (#28874) 2022-02-15 16:41:42 -07:00
Massimiliano Culpo
5b34b947a8 archspec: remove pyproject.toml to workaround PEP517 (#28956)
* archspec: remove pyproject.toml to workaround PEP517

If pyproject.toml is in the folder, that is preferred to the
setup.py packaged by poetry itself. Adding a dependency on
poetry for deploying a pure Python package seems wasteful,
since the package to be deployed just needs to be copied in
place, so we don't want to build rust for that.

* archspec: patch pyproject.toml to comply to PEP517

See https://python-poetry.org/docs/pyproject/#poetry-and-pep-517

* Fix style issues
2022-02-15 16:14:39 -07:00
Paul Romano
dea9766336 OpenMC: new version 0.13.0 (#28929) 2022-02-15 16:11:33 -07:00
Manuela Kuhn
939e94790f py-argon2-cffi-bindings: add new package (#28943) 2022-02-15 15:14:25 -07:00
Christoph Conrads
9da8f18e3a FEniCS: avoid HDF5 version 1.12+ (#28920)
The new HDF5 version 1.12 API causes compiler errors due to modified function prototypes. Note that version 1.11 is the development version of HDF5 1.12.
2022-02-15 15:09:08 -07:00
Manuela Kuhn
f707987275 py-coverage: add 6.3.1 (#28946) 2022-02-15 15:08:37 -07:00
Manuela Kuhn
39c4af5f79 py-numba: add 0.55.1 (#28933)
* py-numba: add 0.55.1

* Remove comment

* Pin down py-llvmlite version for older py-numba releases

* Remove py-llvmlite deps for releases not in spack

* Set upper bounds for python and py-numpy

* Add stricter upper bound to py-numpy for releases <=0.47
2022-02-15 15:02:29 -07:00
Manuela Kuhn
9cd311c82d py-argcomplete: add 2.0.0 (#28942) 2022-02-15 14:44:38 -07:00
Manuela Kuhn
53ca65b103 py-memory-profiler: add 0.60.0 (#28918) 2022-02-15 13:29:49 -07:00
Seth R. Johnson
c987d06a19 github: add default value for spack spec and error message (#28796) 2022-02-15 18:50:21 +00:00
Stephen Sachs
79f22423b8 intel compiler: fix link time error with LLVMgold.so (#28731)
The Intel compiler will, at link time, call `ld -plugin LLVMgold.so`, which
expects libraries like `libimfo.so` to be found either in the `LD_LIBRARY_PATH` or
in `LLVMgold.so`s RPATH.

As `LLVMgold.so` already uses RUNPATH, I used that to extend this to the
necessary library locations.

This PR should fix issues:
https://github.com/spack/spack/issues/10308
https://github.com/spack/spack/issues/18606
https://github.com/spack/spack/issues/17100
https://github.com/spack/spack/issues/21237
https://github.com/spack/spack/issues/4261

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-02-15 17:47:29 +00:00
Harmen Stoppels
cebe4fdf1d Make spack -e [env] spec show environment root specs (#25941) 2022-02-15 09:42:05 -08:00
Anton Kozhevnikov
d61c1f623c SIRIUS: optional dependency on Python (#28711)
* depend on Python only when building the module

* fixes for shared costa library

* add deprecated=True for old packages
2022-02-15 10:23:37 -07:00
Harmen Stoppels
55996d3ad4 Unalias despacktivate only when alias exists (#28939) 2022-02-15 16:21:19 +01:00
acastanedam
3640c258dc direnv: fix installation procedure (#28940)
Setting Spack's `$prefix` to `$DESTDIR` instead of `$PREFIX` installs the
package in `$prefix/usr/local` rather than `$prefix`, so when the package
is loaded the `direnv` executable is not "seen" by the environment.
2022-02-15 08:17:41 -07:00
Nils Vu
a8d440d3ab spectre: patch Boost pre v1.67 (#28931) 2022-02-15 15:13:13 +01:00
dlkuehn
7fc9c16f9e genometools: add v1.6.2 (#28936)
Co-authored-by: las_dkuehn <las_dkuehn@gilman-0107-02.las.iastate.edu>
2022-02-15 15:11:38 +01:00
Wouter Deconinck
6864b28fd8 dd4hep: add v1.20 (#28937)
No major changes to the build system, https://github.com/AIDASoft/DD4hep/compare/v01-19...v01-20
2022-02-15 15:06:43 +01:00
Cody Balos
cf0c9affff sundials: use make for some of the smoke tests (#28938) 2022-02-15 15:03:58 +01:00
Mark W. Krentel
09a8656f1f meson: add versions 0.61.2 and 0.60.3 (#28934) 2022-02-15 05:20:34 -07:00
Manuela Kuhn
9f1c6c0c29 py-anyio: add 3.5.0 (#28915)
* py-anyio: add 3.5.0

* Add wheel dependency
2022-02-15 03:38:17 -07:00
Mikael Simberg
fdec3b47cc apex: add patch to install missing header (#28844)
Co-authored-by: Mikael Simberg <mikael.simberg@iki.if>
2022-02-15 01:59:39 -07:00
Daniele Cesarini
c313a72e76 New quantum-espresso maintainer (#28927) 2022-02-15 09:32:46 +01:00
Manuela Kuhn
28caa0225f py-attrs: add 21.4.0 (#28914) 2022-02-14 22:50:55 -07:00
Manuela Kuhn
30fafa63e0 py-jsonschema: add 4.4.0 (#28919) 2022-02-14 22:50:37 -07:00
Seth R. Johnson
08cad7d0ee darwin: make sure MACOSX_DEPLOYMENT_TARGET has a minor component (#28926) 2022-02-15 05:50:22 +00:00
luker
9165e3fb86 updated cray-fftw (#28922)
added the latest cray-fftw version, added myself as a maintainer, updated the homepage
2022-02-14 22:14:39 -07:00
Pierre Neyron
d54a5d9dd8 py-pyopencl: fix dependency to py-six (#28916) (#28917) 2022-02-14 19:59:25 -07:00
Sreenivasa Murthy Kolam
76489eb213 replace mesa18 recipe with mesa recipe for the rocm releases (#28491) 2022-02-14 16:29:45 -06:00
ravil-mobile
64dd6378d4 easi: add new package (#28828)
Co-authored-by: ravil <ravil.dorozhinskii@tum.de>
2022-02-14 15:29:32 -07:00
Manuela Kuhn
67ea14098d py-bids-validator: add 1.8.9 (#28913) 2022-02-14 14:53:29 -07:00
MichaelLaufer
5b80c4ab6c New package: py-pyfr (#28847)
* Initial support for PyFR

* styling

* fixes

* more styling fixes

* fixes for dependencies

* add min rocblas version
2022-02-14 15:43:18 -06:00
Tom Vander Aa
1377c02e26 intel-oneapi-mpi: set I_MPI_ROOT in setup_dependent_build_environment (#28890)
* setting I_MPI_ROOT is sometimes needed for find_package(MPI) in cmake to work
2022-02-14 13:23:35 -07:00
Thomas Madlener
13013d0291 fastjet add v3.3.4, fjcontrib add v1.045 (#22667) 2022-02-14 13:23:06 -07:00
estewart08
fe35ce1843 AMD rocm-openmp-extras: add v4.5.2 (#28790) 2022-02-14 10:14:44 -07:00
Toyohisa Kameyama
5353032eef Python: fix build with Fujitsu compiler (#28744) 2022-02-14 08:06:34 -06:00
Olli Lupton
dc57f987a7 py-{mizani,plotnine}: add packages. (#28841)
* py-{mizani,plotnine}: add packages.

* Add missing python dependency.
2022-02-14 03:20:29 -07:00
G-Ragghianti
aa6e725633 magma: move the execution of stand-alone tests to preferred directory (#28895) 2022-02-14 11:19:45 +01:00
John Wohlbier
9649246764 palisade-development: change fppe-logreg variant to point to a stable tag instead of a branch (#28717)
Co-authored-by: John Wohlbier <jgwohlbier@sei.cmu.edu>
2022-02-14 11:18:38 +01:00
Adam J. Stewart
b0658b8b03 py-pandas: add v1.4.1 (#28904) 2022-02-14 11:01:13 +01:00
Adam J. Stewart
e0a0f41ff9 py-mypy: add new versions (#28905) 2022-02-14 11:01:00 +01:00
Seth R. Johnson
3a4ab5f96f trilinos: disable UVM for 13.2+ (#28879) 2022-02-14 10:54:15 +01:00
Jen Herting
2c56cbd2bc [py-ctgan] fixed py-torchvision version range (#28892) 2022-02-13 22:26:10 -07:00
Harmen Stoppels
bece9fd823 Bump libtree (#28908) 2022-02-13 15:50:15 -08:00
Seth R. Johnson
97e4b43ddc go: mark conflict for ancient compilers (#28888) 2022-02-13 14:20:27 -07:00
Adam J. Stewart
cab0e4cb24 py-grpcio: add v1.43.0 (#28784) 2022-02-12 20:13:21 +01:00
Sarah Osborn
8c943261a2 hypre: add v2.24.0 (#28896) 2022-02-12 12:47:55 +01:00
Steve Leak
9f1db4e3a5 cray: add NVIDIA to the list of canonical names (#28246) 2022-02-12 10:45:46 +01:00
Adam Moody
0a25370e78 lwgrp/dtcmp packages: add version 1.0.5 and 1.1.4, respectively(#28817) 2022-02-11 18:52:51 -08:00
Olli Lupton
b840557d64 ffmpeg package: add libx264 variant; fix license variants. (#28808)
Prior to this patch, setting +gpl did not pass --enable-gpl and
did not allow GPL-only codecs to be enabled.
2022-02-11 18:40:35 -08:00
Brian Van Essen
fc7b7cfeab LBANN SW stack packages: gcc-toolchain and clang support (#28769)
* Added support to LBANN, Hydrogen, DiHydrogen, and Aluminum to capture
  a gcc-toolchain cxxflags argument and pass it to a CMAKE_CUDA_FLAG
  argument when set.  This helps deal with compiling with clang on
  systems with old base gcc installations.
* Added a dependency on py-scipy when enabling tests on LBANN.
* Updated the C++ standard for Hydrogen to C++17.
* Added a new variant +apps to enable (or disable) python packages that
  are used by applications in the LBANN repo, but are not strictly
  required for building and using LBANN.
* Added a run time dependency for both py-pytest and py-scipy so that
  they are activated in any environment.
* Added support for building LBANN, Hydrogen, and DiHydrogen with the
  IBM ESSL BLAS library.  This requires explicit identification of
  additional LAPACK libraries, since ESSL does not implement LAPACK, but
  is found by CMake.
* Fixed a bug in the LBANN dependency on OpenCV for Power architectures.
  The +powerpc variant is only required for GCC toolchains and causes
  Clang to break. Switched to only enabling when using %gcc on power.
2022-02-11 17:33:15 -08:00
Richarda Butler
af4d555c20 Update/caliper stand alone test compiler (#28526) 2022-02-11 15:02:40 -08:00
Simon Frasch
dc949a5bda spfft: fix missing hipFFT dependency with latest ROCm versions (#28880) 2022-02-11 14:17:40 -07:00
Teodor Nikolov
7ec9958b48 [rocm-smi-lib] Disable pdf generation with a patch (#28842)
- Installation often hangs building the documentation. This happens when
  doxygen and latex are found. To avoid the issue, comment-out that part
  of the code until an explicit cmake variable to disable documentation
  generation is available.
2022-02-11 13:47:50 -07:00
Cody Balos
853200c42d sundials: add new version and fix smoke tests (#28894)
* sundials: fix smoke tests

* sundials: add new version

* use cmake+make instead of make for tests, fix style

* use cmake_bin workaround from https://github.com/spack/spack/pull/28622
2022-02-11 12:23:58 -08:00
Dom Heinzeller
ef030ed0ee Bug fix for findutils: apply nonnull.patch only for versions 4.8.0+ (#28857) 2022-02-11 11:47:27 -07:00
haralmha
e365b05821 openloops: remove README from install prefix (#28767) 2022-02-11 11:11:33 -07:00
Glenn Johnson
61528c0b0a swftools package: patches for GCC@10: (#28296) 2022-02-11 09:54:53 -08:00
Danny McClanahan
e8838109d8 move typing_extensions.py back into typing.py =\ (#28549) 2022-02-11 09:52:01 -08:00
Mikael Simberg
389b24c4dc hpx: add conflict for Asio/CUDA (#27947)
* Add missing Asio versions from 1.16.0 to 1.21.0

* Add conflict for Asio/CUDA to HPX package

Co-authored-by: Mikael Simberg <mikael.simberg@iki.if>
2022-02-11 11:03:55 +01:00
Adam Moody
6c47aa06fc libyogrt: add v1.27 (#28835)
Signed-off-by: Adam Moody <moody20@llnl.gov>
2022-02-11 11:02:35 +01:00
haralmha
7c9e015c25 coral: add 3.3.10 and libtirpc dependency (#28802)
Co-authored-by: Harald Minde Hansen <harald.minde.hansen@cern.ch>
2022-02-11 09:58:57 +00:00
Adam J. Stewart
d0e2462908 py-fiona: add v1.8.21 (#28810) 2022-02-11 10:57:42 +01:00
Nils Vu
2d223fb275 spectre: add v2022.02.08 (#28834) 2022-02-11 09:53:16 +00:00
Wouter Deconinck
ba1b203848 assimp: add v5.2.1 (#28818) 2022-02-11 10:47:12 +01:00
ravil-mobile
94826a3f1d pspamm: new package (#28825)
Co-authored-by: ravil <ravil.dorozhinskii@tum.de>
2022-02-11 10:39:24 +01:00
ravil-mobile
73463f61d7 impalajit-llvm: add new package (#28826)
Co-authored-by: ravil <ravil.dorozhinskii@tum.de>
2022-02-11 10:36:46 +01:00
Adam Moody
dff770667e mpifileutils: add v0.11.1 (#28837)
Signed-off-by: Adam Moody <moody20@llnl.gov>
2022-02-11 10:26:56 +01:00
Gilles Grospellier
bb3a1dc7a5 icet: add variants to enable 'opengl' and shared libraries. (#27281) 2022-02-11 09:26:29 +00:00
Gilles Grospellier
a783f92fba dotnet-core-sdk: add v6.0.2. (#28843) 2022-02-11 10:21:03 +01:00
Sergey Kosukhin
b11767de48 eccodes: update package (#28849)
* eccodes: modernize the package

* eccodes: add patch fixing location of NetCDF

* eccodes: add variant 'shared'

* eccodes: enable building with NAG compiler

* eccodes: add property 'libs'

* eccodes: add variant 'definitions'

* eccodes: add variant 'samples'

* eccodes: add variant 'tools'
2022-02-11 10:10:31 +01:00
haralmha
1e494eec2f cool: add v3.3.10 (#28858)
Co-authored-by: Harald Minde Hansen <harald.minde.hansen@cern.ch>
Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2022-02-11 09:41:45 +01:00
Olivier Cessenat
5b71839127 subversion: add the external find feature using svn (#28862) 2022-02-11 09:40:29 +01:00
kwryankrattiger
a45c7c185b ECP SDK: update package constraints (#28693)
* ECP-SDK: Require HDF5 1.12

* ECP-SDK: Require SDK spec for SDK packages
2022-02-11 09:35:30 +01:00
Seth R. Johnson
c16ce9408e qt: patch version 5 on macOS 12 SDK (#28865)
Note that the SDK is not the same as the system version: using
apple-clang@13 is a better match than `os=monterey` since this actually
fails on bigsur as well, as long as xcode 13 is being used.
2022-02-11 09:31:37 +01:00
Ben Darwin
4c396d2cee minc-toolkit: fix perl dependencies (#28868) 2022-02-11 09:30:36 +01:00
Sergey Kosukhin
c80c92aa39 serialbox: add new package (#28872) 2022-02-11 09:23:47 +01:00
kwryankrattiger
cba918c081 Ascent: add dependency on adios2, remove deprecated variant (#28876)
Also update specs to be less complicated for dray/mfem/vtk-h.
Removed deprecated adios variant.
2022-02-11 09:14:06 +01:00
Bryan Herman
37728900e6 PVM package: fix missing install files and runtime env (#28851) 2022-02-10 18:03:11 -08:00
Carlos Bederián
b547819c40 ucx package: add version 1.12.0 (#28752) 2022-02-10 17:53:42 -08:00
Jon Rood
e9b50324cf trilinos: add rocm_rdc variant (#28873)
* Add rocm_rdc variant to trilinos.

* Fix syntax error.
2022-02-10 16:53:22 -07:00
eugeneswalker
8d8822f749 arborx +rocm: use hipcc and depend on rocthrust (#28875) 2022-02-10 15:39:40 -08:00
Seth R. Johnson
2fa6cd6d23 macOS: always set MACOSX_DEPLOYMENT_TARGET (#28797)
* core: Make platform environment an instance method, not a class method

In preparation for accessing data constructed in __init__.

* macos: set consistent macosx deployment target

This should silence numerous warnings from mixed gcc/macos toolchains.

* perl: prevent too-new deployment target version

```
*** Unexpected MACOSX_DEPLOYMENT_TARGET=11
***
*** Please either set it to a valid macOS version number (e.g., 10.15) or to empty.
```

* Stylin'

* Add deployment target overrides to failing autoconf packages

* Move configure workaround to base autoconf package

This reverts commit 3c119eaf8b4fb37c943d503beacf5ad2aa513d4c.

* Stylin'

* macos: add utility functions for SDK

These aren't yet used but should probably be added to spack debug
report.
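The normalization hinted at by the perl error above could be sketched like this (an illustrative helper, not Spack's actual code):

```python
def deployment_target(macos_version):
    # Normalize to "major.minor" so tools like perl's Configure accept it
    # (e.g. "11" -> "11.0", "10.15.7" -> "10.15").
    parts = str(macos_version).split(".")
    if len(parts) == 1:
        parts.append("0")
    return ".".join(parts[:2])

print(deployment_target("11"))       # 11.0
print(deployment_target("10.15.7"))  # 10.15
```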
2022-02-10 23:22:30 +00:00
Adam J. Stewart
e502a264b5 MAGMA: declare cuda_arch support conflicts (#28811) 2022-02-10 16:52:23 -06:00
Harmen Stoppels
93e7efdc42 hsakmt-roct: dont look for libudev or compiler support libs (#28861) 2022-02-10 23:04:54 +01:00
Manuela Kuhn
16d1746fc3 py-llvmlite: add 0.38.0 (#28822) 2022-02-10 15:26:35 -06:00
Massimiliano Culpo
4437ff1fc3 Stabilize the concretization of ecp-data-vis-sdk by preferring conduit@0.7.2 (#28871) 2022-02-10 21:45:07 +01:00
Massimiliano Culpo
e6e109cbc5 ASP-based solver: reduce input facts and add heuristic (#28848)
* Remove node_target_satisfies/3 in favor of target_satisfies/2

When emitting input facts we don't need to couple target with
packages, but we can emit fewer facts independently and let
the grounder combine them.

* Remove compiler_version_satisfies/4 in favor of compiler_version_satisfies/3

When emitting input facts we don't need to couple compilers with
packages, but we can emit fewer facts independently and let
the grounder combine them.

* Introduce heuristic in the ASP-program

With heuristic we can drive clingo to make better
initial guesses, which lead to fewer choices and
conflicts in the overall solve
2022-02-10 11:37:10 -08:00
eugeneswalker
81a6d17f4c e4s: new specs: nccmp, nco, wannier90, lammps (#28833) 2022-02-10 10:47:19 -08:00
eugeneswalker
1e4c08f3a5 llvm-doe: rename @clacc to @develop.clacc for proper version matching (#28859) 2022-02-10 10:08:54 -08:00
Olivier Cessenat
8b39aa1b50 cvs: add the external find feature for cvs (#28863) 2022-02-10 12:35:22 -05:00
Christoph Junghans
5206bca19b lammps: add new versions (#27462)
* lammps: add new versions
2022-02-10 11:37:21 -05:00
Seth R. Johnson
92b26257f4 Fix CMakePackage.define for libs/headers (#28838)
The 'libs' property returned by a spec is neither a list nor a tuple.

Closes #28836.
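The issue can be illustrated with a small `define()` sketch that accepts any iterable (not Spack's exact implementation; `LibraryList` below is a minimal stand-in):

```python
def define(variable, value):
    # Convert a value into a CMake -D command-line argument. Sequence
    # handling must accept any iterable, since a spec's `libs` property
    # is a LibraryList -- neither a list nor a tuple.
    if isinstance(value, bool):
        return f"-D{variable}:BOOL={'ON' if value else 'OFF'}"
    if not isinstance(value, str) and hasattr(value, "__iter__"):
        value = ";".join(str(v) for v in value)
    return f"-D{variable}={value}"

class LibraryList:  # minimal stand-in: iterable, but not list or tuple
    def __init__(self, paths):
        self.paths = paths
    def __iter__(self):
        return iter(self.paths)

print(define("Foo_LIBRARIES", LibraryList(["/lib/a.so", "/lib/b.so"])))
# -DFoo_LIBRARIES=/lib/a.so;/lib/b.so
```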
2022-02-10 13:43:22 +00:00
luker
fa38af285c Adding support for rocmcc to cray-libsci package (#28601) 2022-02-10 10:50:33 +01:00
Greg Becker
130354b867 spack audit: fix spurious failures for target/platform conflicts (#28860) 2022-02-10 09:10:23 +01:00
Tamara Dahlgren
36ef59bc67 Tests: move has_test_method to spack.package (#28813) 2022-02-09 22:27:48 -08:00
Mark Grondona
634cba930e flux-core: add v0.34, v0.35 and deprecate v0.27.0 and earlier (#28852)
* flux-core: add versions 0.34.0 and 0.35.0

* flux-core: deprecate versions <= 0.27.0
2022-02-09 16:12:39 -08:00
Massimiliano Culpo
549c785227 Detecting "Cray" as "linux" during bootstrap (#28726)
* bootstrap: avoid detecting "Cray" and treat the platform as "linux"

* bootstrap: create a proper context manager to disable cray
2022-02-09 17:41:11 -05:00
Cory Bloor
a582670e15 rocrand: add spack test support and cleanup (#27437) 2022-02-09 14:30:56 -07:00
Robert Cohn
7f1759b52e Remove danvev as maintainer of intel packages (#28806)
* Remove danvev as maintainer of intel packages
* Update hash to trigger gitlab again
2022-02-09 09:51:36 -08:00
Olivier Cessenat
d4ca803fc5 freetype: add 2.11.1 and update Intel conflict (#28606) 2022-02-08 21:34:38 +01:00
Massimiliano Culpo
ffa58a20c6 CI: pin the version of pathlib to v2.3.6 (#28832) 2022-02-08 20:00:20 +00:00
G-Ragghianti
500e05ee50 SLATE package: tests (#28723)
This improves the stand-alone tests for slate by providing most
of the dependencies to the test framework and enabling stand-alone
tests on all versions except the oldest.
2022-02-08 10:30:57 -08:00
Toyohisa Kameyama
c1b51d6e99 fujitsu frontistr: New package (#28514)
* fujitsu-frontistr: new package.

* Add frontistr@5.3
Add fujitsu-frontistr static variant
style fix.

* update copyright year.
2022-02-07 18:42:26 -08:00
Michael Bentley
e97bffc3e8 flit: remove dependency from development branch version (#28789) 2022-02-07 18:35:18 -07:00
Adam J. Stewart
457fe842f0 py-scipy: add v1.8.0 (#28798) 2022-02-07 16:35:32 -07:00
Adam J. Stewart
1809a122ef py-pymumps: remove install_options (#28452) 2022-02-07 14:27:17 -08:00
Axel Huebl
8430e28a52 AMReX: +tiny_profile (#28785)
* AMReX: +tiny_profile

The tiny profiler options in AMReX are by default off but needed
by WarpX. Adds a new variant to control it.

* Add Erik Palmer as Co-Maintainer

... so he receives pings on updates of the package for review.
2022-02-07 15:15:25 -07:00
Tim Haines
c791ddc742 libiberty: add versions from v2.34.1 to v2.37 (#28792) 2022-02-07 06:12:08 -07:00
Harmen Stoppels
9a194dd0a9 Bump ca-certificates-mozilla (#28803) 2022-02-07 06:11:50 -07:00
VladimirUspenskii
0ce71b38b0 Add oneAPI packages from the 2022.1.2 release (#28750)
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
2022-02-07 14:01:50 +01:00
Stephen Sachs
14902a5821 intel-mkl: BLACS with intel-oneapi-mpi (#28476)
Identify the correct BLACS libraries when `intel-oneapi-mpi` is used.
2022-02-07 13:26:20 +01:00
Seth R. Johnson
efc35c2ffc gcc: fix build on apple-clang@13 (#28801) 2022-02-07 13:04:37 +01:00
Jordan Galby
37ae4c0fdb Support config variables in config.yaml extensions paths (#17772) 2022-02-07 11:40:52 +01:00
Adam J. Stewart
f3139555b1 py-torch: update dep constraints (#28743)
The version of the ONNX submodule was updated between the PyTorch
1.9 and 1.10 releases, which fixed builds with newer protobuf but
broke builds with older protobuf.

Also this adds minimum version reqs for numpy/typing-extensions
(which were not present before).
2022-02-06 21:33:12 -08:00
Harmen Stoppels
aa5e1a0723 git: new versions (#28777) 2022-02-06 10:58:17 -08:00
Seth R. Johnson
6e36c71d68 Fix GCC 8 build on macOS bigsur %apple-clang@12.0.5 (#28795)
* gcc: revise patch range on darwin

* gcc: add conflict to work around bootstrap failure

closes #23296 . See https://gcc.gnu.org/bugzilla/show_bug.cgi?id=100340
.

```
Comparing stages 2 and 3
Bootstrap comparison failure!
gcc/tree-ssa-operands.o differs
gcc/tree-ssanames.o differs
gcc/ipa-inline.o differs
gcc/tree-ssa-pre.o differs
gcc/gimple-loop-interchange.o differs
...
```
639 total differences.

* gcc: bump conflict up to correct later version
2022-02-06 07:48:16 -05:00
Axel Huebl
d82dcd35b3 WarpX: add v22.02 (#28742) 2022-02-04 23:53:51 -07:00
Toyohisa Kameyama
797c061743 openfdtd: add version 2.6.3, 2.7.1, 2.7.3 and support Fujitsu Compiler. (#28472) 2022-02-04 11:54:39 -08:00
Toyohisa Kameyama
c29e6dbd20 wrf: Add New version, support fujitsu compiler, refactor and bug fix. (#28567)
* wrf: Add version 4.3, support fujitsu compiler, output build log, and fix to build diffwrf.
2022-02-04 11:53:59 -08:00
Toyohisa Kameyama
cccd1ce376 atompaw: use AutotoolsPackage and support Fujitsu compiler. (#28471)
* atompaw: use AutotoolsPackage and support Fujitsu compiler.
2022-02-04 11:50:55 -08:00
Harmen Stoppels
73077f3a67 database: fix reindex with uninstalled deps (#28764)
* Fix reindex with uninstalled deps

When a prefix of a dep is removed, and the db is reindexed, it is added
through the dependent, but until now it incorrectly listed the spec as
'installed'.

There was also some questionable behavior in the db when the same spec
was added multiple times, it would always be marked installed.

* Always reserve path

* Only add installed spec's prefixes to install prefixes set

* Improve warning, and ensure ensure only ensures

* test: reindex with every file system remnant removed except for the old index; it should give a database with nothing installed, including records with installed==False, external==False, ref_count==0, explicit==True, and these should be removable from the database
2022-02-04 19:31:39 +00:00
Massimiliano Culpo
5881a03408 Use Spec.constrain to construct spec lists for stacks (#28783)
* stacks: add regression tests for matrix expansion

* Use constrain semantics to construct spec lists for stacks

* Fix semantics for constraining an anonymous spec. Add tests
2022-02-04 19:17:23 +00:00
Axel Huebl
d668dea97d WarpX/HiPACE: noacc PSATD also needs FFTW (#28756)
Forgot to add that `compute=noacc` also needs FFTW3 for PSATD builds.
2022-02-04 10:59:47 -08:00
Valentin Volkl
45a285a751 py-kubernetes: add new versions up to v21.7.0 (#28687)
* py-kubernetes: add new versions up to v21.7.0

* py-kubernetes: update python dependency
2022-02-04 11:25:10 -06:00
Marko Kabic
7af48f4570 Adding new package: COSTA (#28727) 2022-02-04 10:02:32 -07:00
Glenn Johnson
d0019d7049 cntk: fix build and add comments (#28676) 2022-02-04 10:56:21 -06:00
Adam J. Stewart
caa1a5cec7 py-mpi4py: fix install_options (#28759) 2022-02-04 11:22:11 +01:00
Axel Huebl
172ec0e3a1 py-warpx: needs no py-cmake (#28761)
Since in Spack we pull binaries out of the `warpx` package, we don't
need `py-cmake` to build `py-warpx`.
Generally, `py-cmake` in `pyproject.toml` is just a means for us to
tell `pip` to make a `cmake` CLI tool available.
2022-02-04 11:21:32 +01:00
eugeneswalker
4c04a0c0b7 e4s ci: uncomment variorum following pr #28754 (#28763) 2022-02-04 11:15:35 +01:00
Adam J. Stewart
9ca4c10b34 py-numpy: add v1.22.2 (#28771) 2022-02-04 11:11:41 +01:00
Tim Haines
fda89b3b4c flit: add version for development branch (#28773) 2022-02-04 11:10:11 +01:00
Tim Haines
1ba6e27830 fpchecker: add version for master branch (#28774) 2022-02-04 11:09:22 +01:00
Tim Haines
9c686a12c1 pruners-ninja: Add version for master branch (#28765) 2022-02-03 14:59:42 -08:00
Weiqun Zhang
c7def4121a amrex: add v22.02 (#28757) 2022-02-03 14:37:52 -08:00
Pieter Ghysels
bdf37db9e4 STRUMPACK new versions 6.3.0, 6.2.1, 6.2.0 (#28762)
* Add new versions for STRUMPACK, newer ButterflyPACK dependency

* small fix
2022-02-03 12:42:36 -08:00
liuyangzhuan
f503b3770b Bump butterflypack versions to v.2.1.0 (#28758)
* added package gptune with all its dependencies: adding py-autotune, pygmo, py-pyaml, py-autotune, py-gpy, py-lhsmdu, py-hpbandster, pagmo2, py-opentuner; modifying superlu-dist, py-scikit-optimize

* adding gptune package

* minor fix for macos spack test

* update patch for py-scikit-optimize; update test files for gptune

* fixing gptune package style error

* fixing unit tests

* a few changes reviewed in the PR

* improved gptune package.py with a few newly added/improved dependencies

* fixed a few style errors

* minor fix on package name py-pyro4

* fixing more style errors

* Update var/spack/repos/builtin/packages/py-scikit-optimize/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* resolved a few issues in the PR

* fixing file permissions

* a few minor changes

* style correction

* minor correction to jq package file

* Update var/spack/repos/builtin/packages/py-pyro4/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fixing a few issues in the PR

* adding py-selectors34 required by py-pyro4

* improved the superlu-dist package

* improved the superlu-dist package

* more changes to gptune and py-selectors34 based on the PR

* Update var/spack/repos/builtin/packages/py-selectors34/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* improved gptune package: 1. addressing comments of tldahlgren in PR 26936; 2. adding variant openmpi

* fixing style issue of gptune

* changing file mode

* improved gptune package: add variant mpispawn which depends on openmpi; add variant superlu and hypre for installing the drivers; modified hypre package file to add a gptune variant

* fixing style error

* corrected pddrive_spawn path in gptune test; enforcing gcc>7

* fixing style error

* setting environment variables when loading gptune

* removing debug print in hypre/package.py

* adding superlu-dist v7.2.0; fixing an issue with CMAKE_INSTALL_LIBDIR

* changing site_packages_dir to python_platlib

* not using python3.9 for py-gpy, which causes failures due to dropped support of tp_print

* more replacement of site_packages_dir

* fixing a few dependencies in gptune; added a gptune version

* adding url for gptune

* minor correction of gptune

* updating versions in butterflypack

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-02-03 12:42:21 -08:00
Matthieu Dorier
4a308d0d4b [py-xonsh] added py-xonsh package (#28746)
* [py-xonsh] added py-xonsh package

* [py-xonsh] change dependency to python 3.6

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-02-03 14:09:10 -06:00
eugeneswalker
48b2cf54cf variorum: needs jansson (#28754) 2022-02-03 10:45:19 -08:00
Mosè Giordano
5208fcf1ea libblastrampoline: Add versions 4.1.0, 5.0.0, 5.0.1 (#28732) 2022-02-03 11:55:48 +01:00
Chris White
98fe047671 Sundials: update version sha, add -D to cmake option (#28720) 2022-02-03 11:48:18 +01:00
Satish Balay
95f77dacd6 petsc, py-petsc4py: add versions 3.16.4 (#28729) 2022-02-03 11:47:34 +01:00
Harmen Stoppels
4134b318ac libgit2: add mmap variant, disabling it makes it work on filesystems that do not implement mmap (#28520) 2022-02-03 11:31:53 +01:00
liuyangzhuan
932408ac2b gptune: add v2.1.0 (#28739) 2022-02-03 11:02:46 +01:00
kwryankrattiger
5a640f9063 conduit: convert HDF5 constaint to compile flag (#28735) 2022-02-03 10:43:39 +01:00
Greg Sjaardema
7c5db7678a seacas: add v2022-01-27 (#28738) 2022-02-03 10:41:48 +01:00
Cyrus Harrison
cdf17fdfe5 conduit: add v0.8.2 (#28730) 2022-02-03 10:39:46 +01:00
Harmen Stoppels
2ada0fa5a2 julia: fix gfortran version detection (#28741) 2022-02-03 10:37:14 +01:00
Nils Vu
18b83c3833 Change my name (#28737) 2022-02-02 15:21:45 -08:00
Glenn Johnson
22b7d9cf67 llvm: patch for gcc-11 (#28547)
* llvm: patch for gcc-11

- added llvm-gcc11.patch
- adjusted version constraints on existing missing-includes patch.

* remove unnecessary comment

Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2022-02-02 13:32:16 -07:00
Adam J. Stewart
fa8c9e4939 py-torch: avoid build issue with newer protobuf (#28721) 2022-02-02 12:23:47 -07:00
Adam J. Stewart
3576e5f3d6 Revert "Deprecate Python 2 installations" (#28411)
This reverts commit 7b76e3982f.
2022-02-02 10:12:33 -08:00
Massimiliano Culpo
cd04109e17 Add a "sticky" property to variants (#28630)
* Add sticky variants

* Add unit tests for sticky variants

* Add documentation for sticky variants

* Revert "Revert 19736 because conflicts are avoided by clingo by default (#26721)"

This reverts commit 33ef7d57c1.

* Add stickiness to "allow-unsupported-compiler"
2022-02-02 10:05:24 -08:00
dependabot[bot]
dd7acecf3d build(deps): bump docker/build-push-action from 2.8.0 to 2.9.0 (#28719)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 2.8.0 to 2.9.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](1814d3dfb3...7f9d37fa54)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-02-02 11:57:31 +01:00
Carlos Bederián
d3bfdcb566 gromacs: Add versions 2021.4 and 2021.5 (#28722) 2022-02-02 03:44:12 -07:00
Marcus Boden
e1ec08b749 Snakemake: add v6.15.1 (#28724) 2022-02-02 10:32:36 +01:00
Glenn Johnson
3b5afef842 new packages: py-rst2pdf and py-smartypants (#28697)
* new packages: py-rst2pdf and py-smartypants

The py-smartypants package is a dependency of py-rst2pdf.

* add missing dependencies
2022-02-01 23:56:09 -07:00
Sebastian Pipping
b9bb303063 expat: Add latest release 2.4.4 with security fixes (#28692) 2022-02-01 16:29:29 -07:00
Adam J. Stewart
7144af9c3a py-argparse: skip dep for newer Python (#28718) 2022-02-01 15:38:11 -07:00
Rao Garimella
50c6075509 Jali package: add version 1.1.6 (#28545)
Boost dependency is only needed for @:1.1.5
2022-02-01 14:22:24 -08:00
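The Jali change above relies on Spack's version-range syntax: `@:1.1.5` is a range with no lower bound and an inclusive upper bound of 1.1.5, typically attached to a dependency via `depends_on("boost", when="@:1.1.5")`. The range semantics can be sketched with a small self-contained check (a toy model, not Spack's actual `Version` machinery):

```python
# Toy illustration of the "@:1.1.5" constraint from the Jali commit:
# boost is needed only for versions up to and including 1.1.5.
# NOT Spack's implementation; real specs are matched during concretization.
def needs_boost(version: str) -> bool:
    def key(v: str):
        # Compare dotted numeric versions component-wise, e.g. 1.1.5 -> (1, 1, 5)
        return tuple(int(part) for part in v.split("."))

    return key(version) <= key("1.1.5")


assert needs_boost("1.1.5")      # upper bound is inclusive
assert needs_boost("1.0.0")      # no lower bound
assert not needs_boost("1.1.6")  # newer versions drop the dependency
```

In a real package recipe the condition is written declaratively with `when=`, and Spack evaluates it when concretizing the spec.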
Harmen Stoppels
b93b64ca67 TermStatusLine: fix python 2.7 and add test (#28715)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-02-01 21:59:29 +01:00
Massimiliano Culpo
7c31d4d279 CI: macOS try to build latest GCC (#28709) 2022-02-01 21:29:27 +01:00
Nils Leif Fischer
4a29512113 py-rich: fix installation with poetry (#28710) 2022-02-01 14:19:40 -06:00
Teodor Nikolov
9cf20b9d32 Sarus: Add 1.4.1, use commits instead of tarballs, tests (#28705)
- To retrieve the correct spack version we need to get it from the git
  repo.

- Recommend installing the package with root for production

- Add Tomas as maintainer to sarus' spack package

- Add the option to disable unit tests in latest versions
2022-02-01 11:49:58 -07:00
Harmen Stoppels
165f686400 lua: add fetcher for luarocks (#28707) 2022-02-01 10:44:51 -07:00
Jose E. Roman
427d281d15 SLEPc: add v3.16.2 (#28706) 2022-02-01 09:38:28 -07:00
estewart08
0a595d488b rocm-openmp-extras: add v4.3.1 and v4.5.0 (#27653) 2022-02-01 16:39:33 +01:00
kwryankrattiger
099a16ed8f ParaView/VTK: Update hdf5 constraints (#28691)
hdf5@1.10: is required for the new version of vtkHDFReader
2022-02-01 09:50:11 -05:00
kwryankrattiger
eb7bc7fc4b Ascent/Conduit: Update the +fortran build config (#28646)
Do not silently turn off fortran support based on checking if
a fortran compiler exists. Let the build fail if there is no
working fortran compiler.
2022-02-01 07:44:48 -07:00
Adam J. Stewart
bdfcf7c92b py-torchgeo: specify libtiff variants (#28681) 2022-02-01 08:43:54 -06:00
Cyrus Harrison
3f6d045c53 conduit: add new variant and hcfg entry (#27822) 2022-02-01 13:53:30 +01:00
Cyrus Harrison
b300a9d7a5 dray: add new variant and host config entry (#28598) 2022-02-01 13:52:12 +01:00
Torbjörn Lönnemark
218b1c153c p7zip: fix build on gcc 11 (#28376)
Fixes the following build failure when building with gcc 11:

         478    ../../../../CPP/7zip/Archive/Wim/WimHandler.cpp: In member function 'virtual LONG NArchive::NWim::CHandler::GetArchiveProperty(PROPID, PROPVARIANT*)':
      >> 479    ../../../../CPP/7zip/Archive/Wim/WimHandler.cpp:308:11: error: use of an operand of type 'bool' in 'operator++' is forbidden in C++17
         480      308 |           numMethods++;
         481          |           ^~~~~~~~~~
      >> 482    ../../../../CPP/7zip/Archive/Wim/WimHandler.cpp:318:9: error: use of an operand of type 'bool' in 'operator++' is forbidden in C++17
         483      318 |         numMethods++;
         484          |         ^~~~~~~~~~
2022-02-01 12:26:00 +01:00
Nils Leif Fischer
c3bad2a935 gettext: find external (#28610) 2022-02-01 12:25:02 +01:00
Brent Huisman
dedf4a1470 Add version v0.6 to Arbor package. (#28628)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-02-01 12:24:28 +01:00
Sergey Kosukhin
0e28356aba Eccodes: update urls and add new versions (#28614) 2022-02-01 12:20:06 +01:00
Massimiliano Culpo
80a3f01bbb gcc: allow building 11.2.0 on macOS (#28662) 2022-02-01 11:19:24 +00:00
Cyrus Harrison
f5a70dece8 ascent: add patch for gcc 10 + 11 (#28592) 2022-02-01 12:18:48 +01:00
Mikhail Titov
d7e7542486 Update package versions: RADICAL-Cybertools (RE, RP, RS, RU) (#28523)
* rct: update packages (RE, RP, RS, RU) with new versions

* rct: minor version release for RP

* rct: fixed versions for dependencies (RS and RU)
2022-02-01 11:35:14 +01:00
Glenn Johnson
a3f0c3ca93 saga-gis: tweak dependencies and deprecate some versions (#28677)
- deprecate versions 2 and 3
- set needed opencv variants
- adjust proj constraint
- patch header path for opencv
2022-02-01 11:09:50 +01:00
Glenn Johnson
ab2187780b libdc1394: update dependencies and add libraw1394 dependency (#28675) 2022-02-01 11:09:00 +01:00
Adam J. Stewart
7ac66547da py-black: add v22.1.0 (#28680) 2022-02-01 11:06:24 +01:00
Pat Riehecky
4b52f0e4d7 jsonnet: add new package (#28645)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-02-01 10:22:52 +01:00
Mikael Simberg
d8dcb91297 pika: add v0.1.0 (#28684)
Co-authored-by: Mikael Simberg <mikael.simberg@iki.if>
2022-02-01 10:20:01 +01:00
Valentin Volkl
5e8ef57a82 alpaka: add v0.8.0 (#28689) 2022-02-01 10:19:40 +01:00
Vicente Bolea
c9e135729c vtk-m: add v1.7.1 release (#28690) 2022-02-01 10:19:19 +01:00
Nils Leif Fischer
8ccc4df6ff charmpp: default backend to 'multicore' on macOS (#28443) 2022-02-01 10:17:48 +01:00
Tom Scogland
545a429646 llvm: add llvm_config helper (#28670)
Add a helper to serve as a base for building better `libs` and similar methods
2022-02-01 09:58:39 +01:00
Paul Kuberry
bff5d253cf xyce: add 'plugin' variant (#28698) 2022-02-01 09:49:57 +01:00
Ryan Marcellino
3456a355ce maven: add v3.8.4 (#28699) 2022-02-01 09:48:52 +01:00
Dylan Jude
aedb106c5f samrai: add "--enable-shared" option as "+shared" (#28695) 2022-02-01 09:48:19 +01:00
Adam J. Stewart
a4a23ff713 py-torch: fix python_platlib reference (#28565) 2022-02-01 09:44:08 +01:00
Adam J. Stewart
728ac61bd8 pygmo: fix build (#28702) 2022-02-01 09:38:06 +01:00
Glenn Johnson
21524f5149 opencv: add new version, variant, and patch (#27374)
* opencv: add new version, variant, and patch

- added version 4.5.4
- added tesseract variant
- added patch to not add system paths

* Add leptonica depends and contrib conflicts

* Add dependencies for 1394 support

- new package: libraw1394
- add sdl dependency to libdc1394
- add conflict for openjpeg and jasper

* Adjust dependencies and conflicts for opencv modules

* rewrite of opencv

- all prebuilt apps are now variants and can be installed
- core is no longer a variant. It was always built anyway so it was not
  really a variant.
- contrib is no longer a variant. All of the contrib modules are now
  available as variants.
- components that can not be built with Spack are no longer variants.
  They are set to 'off' to prevent pulling from system.
- handle the case where a module and a component have the same name
- use `with when` framework
- adjust dependencies and conflicts
- new package: libraw1394
- have libdc1394 depend on libraw1394
- patch to find clp
- patch to find onnx
- patch for cvv to find Qt
- format with black

* Incorporate recommended changes

- fix variants and dependencies on packages that depend on opencv
- remove opencv-3.2 and patches
- add some new patches to handle different versions
- cntk needs further work
- the openvslam package was marked deprecated as it is no longer an
  active project and the repository has no code

* Remove gmake dependency.

* Remove sdl support

SDL is only used in an example case, but the examples are not built.

* remove openvslam

* Remove opencv+flann variant from 3dtk

* Back out cfitsio constraint from py-astropy

* remove opencv+flann variant from dlib

* remove boost constraint from 3dtk

* Remove non-opencv related bohrium changes

* Adjustments for cntk

- protobuf constraint at version 3.10
- need specific variants for opencv
- improve patch

* Deprecate CNTK package

* variant tweaks

- added appropriate conflicts for cublas
- made cuda/cudev relationship explicit
- moved openx to pending components as it needs an openvx package

* fix isort style error

* Use date version from kaldi rather than commit

* Revert changes from a bad rebase

* Add +flann to 3dtk and dlib

* Use compression support with libtiff

* remove `+datasets` from opencv dependency 

The py-torchgeo package does not need opencv+datasets.

* fix typo

zip --> zlib
2022-01-31 21:24:03 -06:00
eugeneswalker
6e99f328b6 e4s ci: add spec: gptune (#28604) 2022-01-31 18:44:25 -08:00
eugeneswalker
d6d29f7c09 mesa18: depends on python@3.8.99 max (#28345) 2022-01-31 13:20:33 -07:00
Pat Riehecky
a5d0687115 Update Packages: py-asgiref, py-h11, py-websockets (#28649) 2022-01-31 14:13:44 -06:00
iarspider
a909f2d910 Add checksum for py-cryptography 3.2.1 (#28685) 2022-01-31 14:11:41 -06:00
liuyangzhuan
3b64ca30c7 changing site_packages_dir to python_platlib for gptune (#28669)
* added package gptune with all its dependencies: adding py-autotune, pygmo, py-pyaml, py-autotune, py-gpy, py-lhsmdu, py-hpbandster, pagmo2, py-opentuner; modifying superlu-dist, py-scikit-optimize

* adding gptune package

* minor fix for macos spack test

* update patch for py-scikit-optimize; update test files for gptune

* fixing gptune package style error

* fixing unit tests

* a few changes reviewed in the PR

* improved gptune package.py with a few newly added/improved dependencies

* fixed a few style errors

* minor fix on package name py-pyro4

* fixing more style errors

* Update var/spack/repos/builtin/packages/py-scikit-optimize/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* resolved a few issues in the PR

* fixing file permissions

* a few minor changes

* style correction

* minor correction to jq package file

* Update var/spack/repos/builtin/packages/py-pyro4/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fixing a few issues in the PR

* adding py-selectors34 required by py-pyro4

* improved the superlu-dist package

* improved the superlu-dist package

* more changes to gptune and py-selectors34 based on the PR

* Update var/spack/repos/builtin/packages/py-selectors34/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* improved gptune package: 1. addressing comments of tldahlgren in PR 26936; 2. adding variant openmpi

* fixing style issue of gptune

* changing file mode

* improved gptune package: add variant mpispawn which depends on openmpi; add variant superlu and hypre for installing the drivers; modified hypre package file to add a gptune variant

* fixing style error

* corrected pddrive_spawn path in gptune test; enforcing gcc>7

* fixing style error

* setting environment variables when loading gptune

* removing debug print in hypre/package.py

* adding superlu-dist v7.2.0; fixing an issue with CMAKE_INSTALL_LIBDIR

* changing site_packages_dir to python_platlib

* not using python3.9 for py-gpy, which causes failures due to dropped support of tp_print

* more replacement of site_packages_dir

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-01-31 11:21:06 -08:00
Massimiliano Culpo
68cf57d0d0 Fix SECURITY.md file by adding v0.17.x to supported versions (#28661) 2022-01-31 10:04:06 -08:00
Harmen Stoppels
f080e593cb julia: add targets constraint to llvm (#28682) 2022-01-31 06:29:23 -07:00
Paul Kuberry
fff9a29401 trilinos: Fix cxxstd effects for Trilinos versions <13 (#28663) 2022-01-31 00:41:21 -07:00
Adam J. Stewart
1679274ac2 py-horovod: add py-tensorflow-estimator dependency (#28674) 2022-01-29 20:22:19 -06:00
Jen Herting
c78a754e5d New package: py-kaldiio (#28643)
* espnet first build with depends

* fixed flake8

* updated to lastest version and removed python dependency

* changed to pypi and version 2.17.2

* [py-kaldiio] depends on py-pytest-runner

* [py-kaldiio] updated copyright

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-01-29 12:52:23 -06:00
Valentin Volkl
813a0d9b19 prmon: better testing, fix checksums (#28672)
* prmon: make sure integration tests do not run in parallel

Some integration tests fail if not run on an otherwise idle machine.

* prmon: run unittests based on googletest

* prmon: fix checksums
2022-01-29 14:32:57 +01:00
Seth R. Johnson
aece700e14 superlu: simplify cmake logic and fix cuda conflict (#28658)
* superlu-dist: use CMakePackage helper functions

* Fix #28609

It's OK to have CUDA in the dependency tree as long as it's not being
used for superlu-cuda.
2022-01-29 12:58:09 +00:00
Pat Riehecky
a98a261b74 Update Packages: py-celery py-kombu py-amqp py-pytz py-click-plugins (#28648) 2022-01-28 16:47:10 -07:00
Pat Riehecky
6f3670d1eb New Package: py-python-dotenv (#28651) 2022-01-28 14:23:25 -07:00
Pat Riehecky
f189e5eb1c Update Package: py-pre-commit (#28665) 2022-01-28 14:36:48 -06:00
Massimiliano Culpo
bc06c1206d macholib, altgraph: update vendored dependency (#28664) 2022-01-28 10:55:12 -08:00
Massimiliano Culpo
4bd761d1d5 Update actions/setup-python to latest version (#28634) 2022-01-28 14:17:59 +01:00
Graeme A Stewart
02b48f5206 prmon: add up to v3.0.1, add new maintainers (#28657)
* Update prmon package to latest versions

- Add recent versions of the prmon package
- Add the spdlog dependency for versions >=3
- Add package developers as additional maintainers
- Update name of development branch to 'main'

* Correct checksum for v3.0.1
2022-01-28 04:42:02 -07:00
Michele Martone
cce8f3faf5 librsb: added v1.2.0.11/v1.3.0.0 (#28637) 2022-01-28 11:46:54 +01:00
Tim Haines
5519289f1d fpchecker: add v0.3.5 (#28639)
Co-authored-by: Tim Haines <thaines@cs.wisc.edu>
2022-01-28 11:44:18 +01:00
Nils Leif Fischer
d80241a203 spectre: allow disabling debug symbols (#28511)
* spectre: allow disabling debug symbols

* spectre: fix PCH builds

* spectre: support builds with shared libs
2022-01-28 11:40:02 +01:00
Desmond Orton
1e1e3ae80a metabat: switch to CMakePackage, deprecate old versions (#28384) 2022-01-28 11:34:35 +01:00
kwryankrattiger
a3755e5c76 ParaView: Require Ninja for builds (#28650) 2022-01-28 11:30:23 +01:00
Satish Balay
4c3bc0d3dc slepc: switch from using -with-arpack-dir to --with-arpack-include/lib options (#28654)
Also save configure.log, make.log
2022-01-28 11:20:35 +01:00
Adam J. Stewart
46b9911289 py-torch: add v1.10.2 (#28653) 2022-01-28 11:01:17 +01:00
Vasileios Karakasis
e06319a3dc Add more ReFrame versions (#28655) 2022-01-28 02:42:07 -07:00
Desmond Orton
bf80a9f83a at-spi2-core: patch for version 2.40.1 and beyond (#28497)
at-spi2-core automatically selects dbus-broker and enables systemd if it finds dbus-broker-launch, which some systems might have even without systemd being part of the actual spack environment. This is not ideal for a spack package.
2022-01-28 10:40:26 +01:00
Tom Vander Aa
f7f7a168e9 ucx backtrace-detail variant (#28625)
ucx has the configure option --[enable|disable]-backtrace-detail.
This option is not explicitly set by spack, causing problems on my system, because
./configure does not find the bfd.h header file / libbfd.so library.

Added variant + dependencies (binutils). Disabled by default.
2022-01-28 10:35:10 +01:00
liuyangzhuan
80f92cfdde gptune: add variants (mpispawn, hypre) (#27733)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-01-28 10:31:56 +01:00
Toyohisa Kameyama
693f0958b5 trilinos: version 12 cxxstd flags (#28582)
* trilinos: version 12 requires cxxstd=11

* trilinos: use cmake version 3.21 or older when trilinos version is 12

* conflict cxxstd=17 and cmake@3.2.[01]

* trilinos: version 12 requires cxxstd=11.

* Trilinos_CXX11_FLAGS is set to ' ' to avoid injecting the C++11 flag.

* set Trilinos_CXX11_FLAGS only for version 12 or older.
2022-01-28 01:29:18 -07:00
Seth R. Johnson
2fd26be988 Trilinos: minimize E4S CUDA build (#28591)
* trilinos: update dependencies

Use the tribits deps to clarify some dependencies, and group some together
using `with` statements, eliminating some transitive conflict duplication.

* trilinos: Restrict cuda incompatibility

* e4s: vastly reduce number of packages in trilinos-cuda build

Not clear who the customers of cuda-enabled trilinos are, or what options
they need, or which sets of options conflict...

* e4s: remove ~wrapper from trilinos+cuda
2022-01-27 21:08:15 -07:00
Jen Herting
405adce5ae New package: py-wordcloud (#28482)
* [py-wordcloud] created template

* [py-wordcloud]

- added dependencies
- added homepage
- added description
- removed fixmes
- updated copyright
2022-01-27 15:08:57 -07:00
Jen Herting
a2170c9e04 New package: py-spectral (#28481)
* simple build of spectral

* fixed trailing whitespace for style

* [py-spectral] updated copyright

* [py-spectral] runtime imports py-numpy

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-01-27 14:57:14 -07:00
Jen Herting
bae2a2660e New package: py-deepecho (#28488)
* [py-deepecho] created template

* [py-deepecho]

- added dependencies
- added homepage
- added description
- removed fixmes
- updated copyright

* [py-deepecho] flake8

* [py-deepecho] removed .999

* [py-deepecho] added dependency on py-pytest-runner
2022-01-27 15:25:28 -06:00
Harmen Stoppels
08fb190264 python: 3.9.10, 3.10.2 (#28635) 2022-01-27 14:03:02 -07:00
Massimiliano Culpo
b9b1665cb2 Pin the version of "coverage" in CI to 6.2 (#28638)
Hotfix for coverage v6.3 blocking our CI workflows
2022-01-27 20:35:05 +01:00
Paul Kuberry
ec9748eaf3 xyce: Add cxxstd variant with only valid value of 11 (#28616) 2022-01-27 11:40:57 -05:00
Max Zeyen
28ea1bab68 gpi-space: add new package (#27989) 2022-01-27 07:35:20 -07:00
Gregory Lee
61ea456f5d conduit: add v0.8.1 and patch to add algorithm for std sort (#28618) 2022-01-27 12:26:18 +01:00
Tim Moon
71c8375862 Update CMake to 3.21 in LBANN-related projects (#28507)
* Update CMake to 3.21 in LBANN-related projects

* Allow CMake 3.17 for older versions of LBANN, Hydrogen, Aluminum
2022-01-27 12:21:45 +01:00
kwryankrattiger
8106983ddb Vtkm kokkos variant (#28363)
* VTK-m: Make vtk-m consistent with ROCmPackage

* VTKm: Add kokkos variant

Specifying +kokkos will enable kokkos backend.
Specifying +kokkos with +rocm will require a kokkos with a ROCm backend.
Specifying +cuda enables VTK-m native CUDA backend. VTK-m native cuda backend
  is not compatible with the kokkos +cuda backend.

* VTK-m: Add cuda_native variant

Required to allow specifying a vtk-m spec that selects a
cuda_arch and predictably propagate that to the underlying kokkos
dependency.

This also makes it explicit whether kokkos with a cuda backend or
the VTK-m native cuda backend is used.
2022-01-27 01:08:13 -07:00
Nils Leif Fischer
28bda22e52 New packages: py-quaternionic and py-spherical (#28611) 2022-01-26 17:48:43 -05:00
kwryankrattiger
991438a242 Mesa(18): Use libllvm virtual package (#28365)
* Mesa(18): Use libllvm virtual package

* Mesa patch configuration

Patch Mesa to define LLVM_VERSION_SUFFIX if llvm is pre-release

* Patch llvm-config to define LLVM_VERSION_SUFFIX
2022-01-26 15:16:41 -05:00
Harmen Stoppels
92abffa2d4 cmake: new version 3.22.2 (#28608) 2022-01-26 11:28:00 -07:00
Massimiliano Culpo
3cf5df7e3b Ensure "spack unit-test" can bootstrap clingo (#28572) 2022-01-26 14:19:15 +01:00
Valentin Volkl
b700335be7 lhapdfsets: add new package (#25477) 2022-01-26 04:58:58 -07:00
haralmha
b7bb687d17 vbfnlo: use patch method and filter_file instead of patch directive (#28506)
* Patch only for @3.0.0beta5

* Replace version specific patch file with patch function

Co-authored-by: root <root@hahansencs8.cern.ch>
2022-01-26 11:16:02 +01:00
Harmen Stoppels
e3d62b2f7b Print 'Waiting for another process to install x, y, z' in distributed builds (#28535)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-01-26 10:42:08 +01:00
Seth R. Johnson
5300cbbb2e trilinos: Enable LAPACK functionality in amesos2 (#28577) 2022-01-25 17:30:40 -08:00
Richarda Butler
38a1036c46 Bugfix: Tasmanian stand alone test (#28496) 2022-01-26 01:10:28 +00:00
Ryan Marcellino
75db515af2 alluxio: new version v2.7.2 (#28589) 2022-01-25 19:46:37 -05:00
Adam J. Stewart
2ace2a750b Fix build of libnetworkit/py-networkit (#28458) 2022-01-25 17:07:43 -06:00
Cyrus Harrison
7134cab8c4 occa: new release 1.2.0 release, and commit refs for prior releases (#28590) 2022-01-25 13:26:32 -07:00
Adam J. Stewart
4c57503e7e py-tensorboard-plugin-wit: install wheel (#28566) 2022-01-25 14:00:17 -06:00
Daniele Cesarini
1f3f1100b3 New packages: BigDFT suite (#26853) 2022-01-25 10:35:09 -08:00
Seth R. Johnson
edb99a2b05 Nalu wind: simplify version dependencies with trilinos (#28588)
* nalu-wind: simplify trilinos variant requirements

* nalu-wind: simplify version dependencies

With spack versioning, 'master' < 'develop' and 'develop' > 2.18.2.
2022-01-25 12:22:21 -05:00
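The ordering rule cited in this commit ('master' < 'develop', and 'develop' greater than any numeric release) can be sketched with a toy sort key; this is a simplified illustration, not Spack's real `spack.version` implementation:

```python
# Toy model of Spack's version ordering: named development versions
# compare greater than every numeric release, and 'develop' ranks
# highest of all. NOT Spack's actual Version class.
DEV_RANK = {"trunk": 1, "head": 2, "master": 3, "main": 4, "develop": 5}


def version_key(v: str):
    """Sort key: (0, 0, numeric tuple) for releases, (1, rank, ()) for dev names."""
    if v in DEV_RANK:
        # Any (1, ...) tuple sorts after every (0, ...) tuple,
        # so named dev versions beat all numeric releases.
        return (1, DEV_RANK[v], ())
    return (0, 0, tuple(int(part) for part in v.split(".")))


assert version_key("develop") > version_key("master")
assert version_key("develop") > version_key("2.18.2")
assert version_key("2.18.2") > version_key("2.18.1")
```

Spack treats such named branches as "infinity versions" that compare greater than every numeric release, which is why `@master:` in nalu-wind also covers `develop`.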
eugeneswalker
fe76d6563a tau: %clang needs cmake for build (#28531)
* tau%clang needs cmake for build

* specify cmake@3.14:
2022-01-25 09:18:56 -08:00
Mark Grondona
4af334fb7e flux-pmix: add new package (#28559)
* pmix: add v4.1.0

* flux-pmix: add flux-pmix package
2022-01-25 09:35:36 -07:00
Daniele Cesarini
bed3a69498 mosquitto, cjson: added new packages (#28550)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-01-25 05:08:29 -07:00
Kai Torben Ohlhus
b36920283d octave: add maintainer @siko1056 (#28584) 2022-01-25 04:33:02 -07:00
Mikael Simberg
5cfaead675 Rename hpx-local to pika (#28585) 2022-01-25 04:32:39 -07:00
Wouter Deconinck
79d28db558 geant4: GEANT4_INSTALL_DATA=OFF and depends_on geant4-data type=run (#28581)
* [geant4] set GEANT4_INSTALL_DATA=OFF

* [geant4] downgrade geant4-data dependency to type=run
2022-01-25 11:38:13 +01:00
Jordan Ogas
5cb10f7362 charliecloud: add v0.26 (#28571) 2022-01-25 11:07:43 +01:00
Kai Torben Ohlhus
13eee02b41 octave: add v6.4.0 (#28580) 2022-01-25 09:44:31 +01:00
Andrew Davison
9fd870388c py-lazyarray: add versions from 0.3.3 to 0.5.2 (#28574)
* py-lazyarray: add versions from 0.3.3 to 0.5.2

* Restore py-setuptools dependency since pip replaces distutils with setuptools
2022-01-24 12:26:31 -07:00
Mark Grondona
a40d064030 flux-core: add v0.32.0 and v0.33.0 (#28347)
Additional changes:

 - update documentation of C4.1 link
 - do not set FLUX_PMI_LIBRARY_PATH
 - remove unnecessary ",master" from depends_on()
2022-01-24 09:20:49 -08:00
Wouter Deconinck
520b80fd5d gaudi: add v36.1, v36.2 and v36.3 (#28078) 2022-01-24 09:08:29 -07:00
Daniele Cesarini
ebedf3f83b cassandra: add v4.0.1, changed download url (#28551) 2022-01-24 10:09:38 +01:00
Adam J. Stewart
947c270446 Resource stage: no space before colon (#28560) 2022-01-24 10:01:28 +01:00
Adam J. Stewart
1c12e964ac grep: prevent recursive symlink (#28561) 2022-01-24 10:00:31 +01:00
Adam J. Stewart
471fa04c4b py-pandas: add v1.4.0 (#28563) 2022-01-24 09:59:52 +01:00
Adam J. Stewart
68c23549b0 py-pillow: add v9.0.0 (#28564) 2022-01-24 09:59:32 +01:00
Peter Brady
20f2ee99bc glm: add develop version (#28499)
* Add a new version to track development

The released versions do not properly install via cmake which leads to
errors when linking against the library.  These upstream problems have
been addressed on the glm development branch.

* Move git to class level and remove redundant depends
2022-01-24 09:47:43 +01:00
Sebastian Ehlert
03ba35920d dftd4: add v3.3.0 (#28558) 2022-01-23 08:14:13 -07:00
Timo Heister
8c85f72322 aspect: development branch is called main now (#28512) 2022-01-23 13:37:17 +01:00
Todd Gamblin
ecfc57ea3e m4: move patches hosted at savannah into the repo (#28515)
These URLs were giving 404's to me and to some of our users --
they do not seem to be reliable, so moving them into the tree.
2022-01-23 13:35:05 +01:00
iarspider
7e5afd1e73 libzmq: add variants "docs", "libbsd" (#28503) 2022-01-23 13:27:18 +01:00
Erik Schnetter
03e93345b1 compose: add new package (#28371) 2022-01-23 12:18:29 +01:00
Erik Schnetter
065e445e4d openmpi: declare more build dependencies for @master (#28309) 2022-01-23 11:10:02 +01:00
Jordan Galby
b526eafa45 Fix spack -C command_line_scope with other flags (#28418)
The option `spack -C` was broken if another flag that touched config was
also set (`spack -C ... -d`, `-c config:...` etc...).
2022-01-23 11:02:13 +01:00
Peter Brady
551c44f0fd petaca: add shared library variant (#28543) 2022-01-23 10:44:29 +01:00
John Wohlbier
3893b90cb9 palisade: add a named version for the "fppe-logreg" branch (#28544)
Co-authored-by: John Wohlbier <jgwohlbier@sei.cmu.edu>
2022-01-23 10:43:32 +01:00
Wouter Deconinck
5ff72ca079 acts: add v15.0.1 (#28539)
This is a bugfix version only, for an issue observed in production environments.
2022-01-23 10:32:20 +01:00
Erik Schnetter
4d70e4f281 libfyaml: add v0.7.12 (#28555) 2022-01-23 10:17:12 +01:00
Wouter Deconinck
ffb3f3a20d Python: ensurepip install instead of upgrade (#28553) 2022-01-22 21:55:56 -06:00
Axel Huebl
c38d34223a openPMD-api: add version 0.14.4 (#28546) 2022-01-21 17:35:16 -08:00
Danny McClanahan
0c2de252f1 introduce llnl.util.compat to remove sys.version_info checks (#21720)
- also split typing.py into typing_extensions and add py2 shims
2022-01-21 12:32:52 -08:00
Sebastian Schmitt
796f5a3cbc Bump py-brian2 to 2.5.0.2 (#28533) 2022-01-21 13:34:23 -06:00
Sreenivasa Murthy Kolam
417227e1eb bump up the version for 3.10.5 release (#28536) 2022-01-21 09:53:37 -07:00
Seth R. Johnson
9ac207c901 vecgeom/veccore: fix version interdependency (#28164)
* vecgeom: require exact version of veccore

Fixes configure error from downstream package:
```
CMake Error at /rnsdhpc/code/spack/opt/spack/apple-clang/cmake/7zgbrwt/share/cmake-3.22/Modules/CMakeFindDependencyMacro.cmake:47 (find_package):
  Could not find a configuration file for package "VecCore" that is
  compatible with requested version "0.8.0".

  The following configuration files were considered but not accepted:

    /rnsdhpc/code/spack/var/spack/environments/celeritas/.spack-env/view/lib/cmake/VecCore/VecCoreConfig.cmake, version: 0.6.0
```

* veccore: add new versions
2022-01-21 13:11:12 +00:00
Bram Veenboer
bd43467cbf CUDA: add v11.6.0 (#28439) 2022-01-21 11:08:20 +00:00
Wouter Deconinck
9b803a00b3 [geant4-data] depends_on g4ensdfstate also @11 (#28388)
I guess that one got dropped by mistake in the geant4 upgrade...
2022-01-21 10:54:19 +00:00
Danny McClanahan
645b40b249 add six.raise_from() to preserve exception traceback (#28532)
* add six.raise_from() to preserve exception traceback

* add tests for code coverage
2022-01-21 00:24:12 -08:00
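For context, `six.raise_from()` is a portable wrapper around Python 3's `raise ... from ...` exception chaining, which keeps the original exception attached as `__cause__`. On Python 3 alone the effect looks like this (a minimal sketch, not Spack's code; `parse_config` is a hypothetical example function):

```python
# Minimal Python 3 illustration of exception chaining, the behavior
# six.raise_from() provides portably across Python 2 and 3.
def parse_config(text):
    try:
        return int(text)
    except ValueError as e:
        # Equivalent to: six.raise_from(RuntimeError(...), e)
        raise RuntimeError("bad config value: %r" % text) from e

try:
    parse_config("not-a-number")
except RuntimeError as err:
    # The original ValueError survives as the cause, with its traceback.
    assert isinstance(err.__cause__, ValueError)
```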
Scott Wittenburg
282fd57114 trilinos: conflict with cuda >= 11.6.0 (#28530) 2022-01-20 21:08:34 -05:00
Patrick Bridges
76b7095445 Add flags to cabana to enable hypre and heffte when they are part of … (#28517)
* Add flags to cabana to enable hypre and heffte when they are part of spec. Also add googletest to build dependencies

* Fixed mixed spaced and tabs

* Update package.py

* Update package.py

* Update package.py

* Modified to request specifically heFFTe version 2.0.0 due to
limitations in heFFTe cmakefiles.

* Update var/spack/repos/builtin/packages/cabana/package.py

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>

* Integrated more heffte and hypre versions into cabana requests

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>
2022-01-20 15:32:25 -07:00
Anton Kozhevnikov
5719b40ded SIRIUS: add Eigen dependency (#28525)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-01-20 22:10:03 +01:00
Erik Schnetter
de7de9778f hdf5: Add missing "close" statement (#28317)
* hdf5: Add missing "close" statement

* hdf5: Apply fortran-kinds patch only to @1.10.7
2022-01-20 14:11:34 -06:00
Harmen Stoppels
9a23fbbcf0 Julia: build without vendored dependencies (#27280)
* Julia unvendored

* restrict LBT
2022-01-20 11:07:04 -05:00
Mosè Giordano
1a5add3021 libblastrampoline: Add v4.0.0 and co-maintainer (#28521) 2022-01-20 05:08:24 -07:00
Jon Rood
5e699d5354 Remove unnecessary CMAKE_C_COMPILER from nalu-wind. (#28513) 2022-01-19 18:57:31 -08:00
Nils Leif Fischer
c11ce3bd1f New package: SpECTRE numerical relativity code (#28399)
* New package: SpECTRE numerical relativity code
2022-01-19 10:18:07 -08:00
kwryankrattiger
d5297b29be ParaView/VTK: Constrain version for ADIOS2 patch. (#28505)
* ParaView/VTK: Constrain version for ADIOS2 patch.

Older available versions of ParaView/VTK predate
ADIOS2 support.

ParaView lower bound is 5.8 and VTK lower bound is 8.2.0

* ParaView: Gate the ADIOS2 by version
2022-01-19 13:15:24 -05:00
Adam J. Stewart
5e351ffff4 py-shiboken: fix build with new Python (#28464) 2022-01-19 10:32:29 -07:00
Charles Doutriaux
192f5cf66d Adding Kosh python package v2.0 (#28426) 2022-01-19 09:14:31 -07:00
Adam J. Stewart
22426f17d1 SIPPackage: fix build system (#28447) 2022-01-19 12:35:46 +01:00
Adam J. Stewart
cb95a30be7 py-astropy: remove install_options (#28448) 2022-01-19 12:35:01 +01:00
Adam J. Stewart
5688252a1f py-pygpu: remove install_options (#28449)
* py-pygpu: remove install_options

* Fix dependencies
2022-01-19 12:34:36 +01:00
Adam J. Stewart
cdac2cd4f2 py-pymol: don't install with pip (#28450) 2022-01-19 12:33:59 +01:00
Adam J. Stewart
d12bc3121a py-pyodbc: remove install_options (#28454) 2022-01-19 12:33:08 +01:00
Adam J. Stewart
3b2c7633eb py-numpy: remove install_options (#28459) 2022-01-19 12:31:47 +01:00
Adam J. Stewart
abe77ec8ab py-scipy: remove -j flag (#28460) 2022-01-19 12:31:23 +01:00
Adam J. Stewart
d6c841f67c py-python-meep: remove install_options (#28461)
* py-python-meep: remove install_options

* Remove unused import
2022-01-19 12:27:38 +01:00
Adam J. Stewart
e6ba8eb060 py-scs: fix CUDA build (#28463) 2022-01-19 12:17:22 +01:00
Stephen Sachs
d25e4889ba Remove '==> OpenFOAM bashrc env: ' from module (#28444)
It seems that spack reads the output of `setup_run_environment` to build the actual spack modules and lmod modules. So, any output here will be used verbatim on the shell.

This patch fixes https://github.com/spack/spack/issues/26733
2022-01-19 12:00:31 +01:00
Richarda Butler
7fb11ea84f caliper: fix stand alone test (#28390) 2022-01-19 11:59:46 +01:00
Adam J. Stewart
d853e2db57 py-pyside2: fix various build issues (#28455) 2022-01-19 11:57:40 +01:00
Jen Herting
017c6786da py-smac: new package (#28490) 2022-01-19 11:51:53 +01:00
Logan Harbour
9723d918bb spack compiler info exit 1 when no compilers are found (#28493)
fixes #28492
2022-01-19 11:50:15 +01:00
dependabot[bot]
93ebc4f47b build(deps): bump docker/build-push-action from 2.7.0 to 2.8.0 (#28495)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 2.7.0 to 2.8.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](a66e35b9cb...1814d3dfb3)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-01-19 11:27:59 +01:00
Cameron Rutherford
4918b7d007 exago: add v1.3.0 (#28498) 2022-01-19 11:15:15 +01:00
Hadrien G
7ae2d2e387 acts: update requirements on autodiff (#28501) 2022-01-19 10:59:04 +01:00
Thomas Helfer
8da204de61 mgis, tfel: add explicit dependency to py-numpy (#28502) 2022-01-19 10:57:13 +01:00
Harmen Stoppels
3dccb913a9 spack: add tcl/lmod as dependencies (#28355) 2022-01-19 10:02:04 +01:00
Jen Herting
97376f4694 New package: py-meldmd (#28479)
* [py-meldmd] created template

* [py-meldmd] added dependencies

* [py-meldmd]

- added homepage
- added description
- removed fixmes

* [py-meldmd] updated copyright
2022-01-18 20:17:24 -07:00
Melven Roehrig-Zoellner
10a0ce8c49 valgrind package: add versions 3.18.0 and 3.18.1 (#28453) 2022-01-18 18:28:05 -08:00
Adam J. Stewart
95fe66cb18 Add xgboost 1.5.2 (#28462)
Also remove obsolete py-dask-xgboost package
2022-01-18 18:11:08 -08:00
Adam J. Stewart
05df2c3371 py-sphinx: add v4.4.0 (#28465) 2022-01-18 18:06:44 -08:00
Adam J. Stewart
c5b1fcf5f9 py-torchmetrics: add v0.7.0 (#28489) 2022-01-18 18:00:09 -08:00
Jen Herting
68c0bdb698 New package: py-pyrfr (#28485)
* [py-pyrfr] create template

* [py-pyrfr]

- added homepage
- added description
- added dependencies
- updated copyright
- removed fixmes
2022-01-18 16:20:42 -07:00
Mathew Cleveland
e7b4a9bfd7 draco: new version 7.13.0 (#28359)
* Add release draco-7.13.0

* fix hash

Co-authored-by: Cleveland <cleveland@lanl.gov>
2022-01-18 17:40:04 -05:00
Jen Herting
24d9d4544e renamed package to align with usage (#28487)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-01-18 16:27:33 -06:00
Jen Herting
cac86345e5 New package: py-fastai (#28151)
* [py-fastai] new package

* [py-fastai] flake8

* [py-fastai] added dependencies from settings.ini

* [py-fastai] py-pillow -> pil

* [py-fastai] set version for py-setuptools

* [py-fastai] removed .999

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-01-18 13:44:29 -07:00
Peter Brady
b5bb06cc36 New package: cppcoro (#28467)
A library for c++ coroutine abstractions
2022-01-18 12:03:33 -08:00
Tom Payerle
4d10df3003 petsc: add libspqr.so to list of suite-sparse libraries (#28478)
Should fix  #28477
2022-01-18 11:20:40 -07:00
Harmen Stoppels
e72f87ec64 Switch lmod default all:autoload from none to direct (#28357)
* Switch lmod module all autoload default from none to direct

* Fix the docs
2022-01-18 09:06:41 -08:00
Paul Kuberry
612430859e Xyce: add version 7.4.0 and '+shared' variant (#28469) 2022-01-18 10:05:39 -07:00
Jen Herting
bd987d9278 [py-tpot] fixed dependency and added version 0.11.7 (#28484)
* [py-tpot] fixed py-scipy dependency

* [py-tpot] added version 0.11.7
2022-01-18 10:45:49 -06:00
Joe Kaushal
58c598ffe6 new package: op2-dsl (#28396) 2022-01-18 09:20:33 -07:00
renjithravindrankannath
2f7a32a74d Updated recipes for 4.5.2 and corrected run style errors (#28413) 2022-01-18 14:12:22 +01:00
Jon Rood
7ae8a5d55b Add ROCmPackage to trilinos (#28424)
* Add ROCmPackage to trilinos.

* Simplify Trilinos ROCm support. Add MI200 to Kokkos AMD GPU arch map.
2022-01-18 06:24:32 -05:00
Teodor Nikolov
64913af04a New package: Sarus (#28349)
And tini
2022-01-18 03:59:09 -07:00
Mark W. Krentel
d642a396af hpctoolkit: add version 2022.01.15 (#28473) 2022-01-18 10:41:11 +01:00
Adam J. Stewart
5ca6ba6c8b py-tomopy: add v1.11.0 (#28466) 2022-01-17 21:50:43 -06:00
Hadrien G
f88b40308d ACTS: fix autodiff version requirements (#28440) 2022-01-17 12:14:29 -07:00
Tamara Dahlgren
f238835b65 is_system_path: return False if path is None (#28403) 2022-01-17 08:44:10 -07:00
kwryankrattiger
7e3677db6f Fix issue when propagating cuda with other variant (#28377)
+cuda was hiding propagation of hdf5 and adios2 to paraview
2022-01-17 13:57:43 +01:00
Erik Schnetter
b2f92a1307 mpc: add v1.2.1 (#28307) 2022-01-17 13:49:45 +01:00
Timothy Brown
e17434b9e2 WRF: add the ability to compile with Intel. (#28382) 2022-01-17 13:48:47 +01:00
Luc Berger
6add69d658 Kokkos Kernels: release maintenance (#28409)
1. adding latest release 3.5.0
2. updating cmake requirement to match that of Kokkos
3. adding logic to depend on the right version of Kokkos by default
2022-01-17 12:48:20 +00:00
Eric Brugger
20796b2045 VTK-m: Add testlib variant. (#28324) 2022-01-17 13:43:41 +01:00
Sreenivasa Murthy Kolam
53801f3ebb deprecate rocm-4.1.0 and older rocm releases (#28395) 2022-01-17 13:08:24 +01:00
eugeneswalker
0f3637e332 nalu-wind: depends on trilinos +gtest (#28414) 2022-01-17 12:55:42 +01:00
Adam J. Stewart
6c4e765f40 Open3D: use Spack-installed 3rd party deps when possible (#28422) 2022-01-17 12:54:22 +01:00
Adam J. Stewart
e7c9f05cd9 miniconda: source conda.sh setup script (#28321) 2022-01-17 12:53:41 +01:00
Adam J. Stewart
5d56a3b306 py-numpy: add v1.22.1 (#28423) 2022-01-17 12:52:43 +01:00
Glenn Horton-Smith
58fd0d859a net-snmp: add v5.9.1 (#28268) 2022-01-17 12:51:15 +01:00
Hadrien G
f5a2f693dc acts: add v16.0.0 (#28438) 2022-01-17 04:11:27 -07:00
Mark W. Krentel
905c6973c9 hpcviewer: add v2022.01. (#28425) 2022-01-17 10:15:48 +01:00
Erik Schnetter
350d2da8b9 findutils: ensure __nonnull is defined on macOS (#28316) 2022-01-17 10:14:39 +01:00
Adam J. Stewart
20bccc3843 mesa18: explicitly specify python executable (#28420) 2022-01-17 09:50:44 +01:00
Nils Leif Fischer
624b3a00ff brigand: only build tests when requested (#28228) 2022-01-17 09:45:34 +01:00
Adam J. Stewart
b93ff6fdff PyTorch: specify CUDA version incompatibility (#28432) 2022-01-17 09:41:23 +01:00
Adam J. Stewart
5c4a250bf9 NCCL: specify cuda_arch (#28433) 2022-01-17 09:40:38 +01:00
Adam J. Stewart
5e595586a7 zip: add external detection (#28434) 2022-01-17 09:39:17 +01:00
Vanessasaurus
dbb36ef8b1 Fixing pthread-stack-min-fix.patch patch version range (#28437)
pthread-stack-min-fix.patch should not be applied to 1.73.0
See https://github.com/spack/spack/issues/28436
2022-01-17 01:32:11 -07:00
Sebastian Pipping
425995db3e expat: Add latest release 2.4.3 with security fixes (#28435) 2022-01-16 17:23:02 -07:00
Luc Berger
e093187266 Kokkos: updating package list, maintainers and minimum cmake version (#28410)
* Kokkos: updating package list, maintainers and minimum cmake version
* Kokkos: updating maintainers list

Updating maintainers list to have the correct GitHub handle for Jan.
2022-01-16 11:50:34 -08:00
Christoph Junghans
a817538870 votca: add v2022 (#28431) 2022-01-16 07:50:02 -07:00
Brian Van Essen
b8d159c62d hwloc: added support to find external (#28348) 2022-01-15 16:44:13 -07:00
Peter Brady
415d662ec0 Fix boost build failures on newer glibc 2022-01-15 14:02:30 -08:00
Peter Brady
30b3cd3b44 chaco: new package (#28342)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-01-15 14:08:18 -07:00
Peter Brady
1dd9238021 chaparral: add new package (#28343) 2022-01-15 14:05:03 -07:00
Nicholas Knoblauch
a5c0a4dca4 Add libdeflate as variant for htslib (#24141) 2022-01-15 10:07:45 -07:00
Sreenivasa Murthy Kolam
ebdf1cb73e rocm ecosystem: bump version for 4.5.2 release (#28358) 2022-01-15 08:59:15 -07:00
Marcus Boden
9e87a6a4b7 snakemake: add v6.13.1 and new variant (#28393) 2022-01-15 08:56:01 -07:00
Nils Leif Fischer
3f903c49e4 charmpp: use CMake for versions 7.0.0+ (#28401) 2022-01-14 23:56:16 -07:00
Todd Gamblin
93377942d1 Update copyright year to 2022 2022-01-14 22:50:21 -08:00
Todd Gamblin
e3527983ac spack license update-copyright-year now updates the LICENSE-MIT file
`spack license update-copyright-year` was updating license headers but not the MIT
license file. Make it do that and add a test.

Also simplify the way we bump the latest copyright year so that we only need to
update it in one place.
2022-01-14 22:50:21 -08:00
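A helper in the spirit of `spack license update-copyright-year` can be sketched with a single regex pass over each file's text; the function name and header wording here are hypothetical, not Spack's actual implementation:

```python
import re

# Hypothetical helper: rewrite "Copyright <start>-<old>" ranges so the
# closing year only needs to be bumped in one place per release.
def bump_copyright(text, year):
    return re.sub(r"(Copyright \d{4}-)\d{4}", r"\g<1>%d" % year, text)

print(bump_copyright("Copyright 2013-2021 LLNS, LLC", 2022))
# -> Copyright 2013-2022 LLNS, LLC
```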
Jen Herting
1422bde25a kaldi: new version 2021-11-16 (#28154)
* [kaldi] Added version 2021-11-16

* [kaldi] Added logic for new version and when cuda 11 is used

* [kaldi] Added patch file when cuda 11 as cub is now built into it

* [kaldi] removed .999 and simplified some logic

Co-authored-by: Doug Heckman <dahdco@rit.edu>
2022-01-14 23:08:15 -07:00
Chris White
ed105fcc76 New Package: ATS (#28013)
* add py-ats package

* add new 7.0.10 tag

* add myself as a maintainer

* add dependencies for python and setuptools

* style

* added todo for flux

* words

* update versions users should use
2022-01-14 22:32:12 -07:00
Brian Spilner
39f19bcc16 CDO package: add version 2.0.3 (#28416) 2022-01-14 18:53:55 -08:00
Adam J. Stewart
a2181e9d25 Python: add ensurepip variant (#28205) 2022-01-14 20:09:42 -06:00
Adam J. Stewart
6235184522 valgrind: default to ~ubsan (#28389) 2022-01-14 16:29:16 -07:00
Satish Balay
9ed32a3c65 gdbm@1.22: fix build on MacOS (#28379) 2022-01-14 14:02:50 -07:00
Peter Brady
2917aad6f9 scorpio: add new package (#28375)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-01-14 13:47:36 -07:00
Adam J. Stewart
3540f8200a PythonPackage: install packages with pip (#27798)
* Use pip to bootstrap pip

* Bootstrap wheel from source

* Update PythonPackage to install using pip

* Update several packages

* Add wheel as base class dep

* Build phase no longer exists

* Add py-poetry package, fix py-flit-core bootstrapping

* Fix isort build

* Clean up many more packages

* Remove unused import

* Fix unit tests

* Don't directly run setup.py

* Typo fix

* Remove unused imports

* Fix issues caught by CI

* Remove custom setup.py file handling

* Use PythonPackage for installing wheels

* Remove custom phases in PythonPackages

* Remove <phase>_args methods

* Remove unused import

* Fix various packages

* Try to test Python packages directly in CI

* Actually run the pipeline

* Fix more packages

* Fix mappings, fix packages

* Fix dep version

* Work around bug in concretizer

* Various concretization fixes

* Fix gitlab yaml, packages

* Fix typo in gitlab yaml

* Skip more packages that fail to concretize

* Fix? jupyter ecosystem concretization issues

* Solve Jupyter concretization issues

* Prevent duplicate entries in PYTHONPATH

* Skip fenics-dolfinx

* Build fewer Python packages

* Fix missing npm dep

* Specify image

* More package fixes

* Add backends for every from-source package

* Fix version arg

* Remove GitLab CI stuff, add py-installer package

* Remove test deps, re-add install_options

* Function declaration syntax fix

* More build fixes

* Update spack create template

* Update PythonPackage documentation

* Fix documentation build

* Fix unit tests

* Remove pip flag added only in newer pip

* flux: add explicit dependency on jsonschema

* Update packages that have been added since this was branched off of develop

* Move Python 2 deprecation to a separate PR

* py-neurolab: add build dep on py-setuptools

* Use wheels for pip/wheel

* Allow use of pre-installed pip for external Python

* pip -> python -m pip

* Use python -m pip for all packages

* Fix py-wrapt

* Add both platlib and purelib to PYTHONPATH

* py-pyyaml: setuptools is needed for all versions

* py-pyyaml: link flags aren't needed

* Appease spack audit packages

* Some build backend is required for all versions, distutils -> setuptools

* Correctly handle different setup.py filename

* Use wheels for py-tomli to avoid circular dep on py-flit-core

* Fix busco installation procedure

* Clarify things in spack create template

* Test other Python build backends

* Undo changes to busco

* Various fixes

* Don't test other backends
2022-01-14 12:37:57 -06:00
eugeneswalker
0b2507053e hsakmt-roct@4.5.0: needs pkgconf (#28400) 2022-01-14 08:38:19 -07:00
Adam J. Stewart
e0f044561e Python: improve site_packages_dir handling (#28346)
* Python: improve site_packages_dir handling

* Replace all site_packages_dir with purelib/platlib
2022-01-13 20:11:16 -06:00
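The purelib/platlib distinction above comes from Python's own install scheme: pure-Python packages and compiled extension modules can install to different directories, and both must be importable. The stdlib exposes the paths directly (an illustrative query, not Spack's code):

```python
import sysconfig

# Pure-Python modules install to purelib; compiled extension modules
# install to platlib. On some systems these differ (e.g. lib vs lib64),
# so both directories must end up on PYTHONPATH.
purelib = sysconfig.get_path("purelib")
platlib = sysconfig.get_path("platlib")
print(purelib)
print(platlib)
```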
Adam J. Stewart
2e238307c7 Python: add maintainers for Python 2 (#28386) 2022-01-13 20:05:42 -06:00
Jim Galarowicz
b7accb6a9d Add new survey package to spack. (#25518)
* Add new package to spack.  survey is a lightweight application performance tool that also gathers system information and stores it as metadata.

* Add maintainer and note about source access.

* Update the man path per spack reviewer suggestion.

* Remove redundant settings for PYTHONPATH, PATH, and MANPATH.

* Move to a one mpi collector approach for cce/tce integration.

* Add pyyaml dependency

* Make further spack reviewer changes to python type specs, mpi args, build type variant.

* Add reviewer requested changes.

* Add reviewer docstring requested changes.

* Add more updates from spack reviewer comments.

* Update the versions to use tags, not branches

* Redo dashes to fix issue with spack testing.

Co-authored-by: Jim Galarowicz <jgalarowicz@newmexicoconsortium.org>
2022-01-13 19:59:05 -06:00
Erik
eda565f3b1 AMReX Smoke Test (#27411) 2022-01-13 17:43:43 +00:00
Andrew W Elble
1ea4497802 musl: set syslibdir correctly, new versions (#28373) 2022-01-13 03:50:11 -07:00
Adam J. Stewart
7d3a696d27 fenics: specify python exe (#28323) 2022-01-12 12:47:21 -07:00
Harmen Stoppels
d74396ad21 Do not initialize config on spack compiler list (#28042)
When `spack compiler list` is run without being restricted to a
particular scope, and no compilers are found, say that none are 
available, and hint that the user should run `spack compiler find` to
auto-detect compilers.

* Improve docs
* Check if stdin is a tty
* add a test
2022-01-12 16:26:28 +00:00
Massimiliano Culpo
91fc4cf28f bootstrap: fix bootstrapping GnuPG from different macOS versions (#28350) 2022-01-12 08:18:16 -08:00
Nils Leif Fischer
38fee7e0da libogg: patch stdint include, add v1.3.5 (#28332)
Backport a patch for v1.3.4 that fixes an unsigned typedef problem
on macOS: https://github.com/xiph/ogg/pull/64

Also add v1.3.5 that has this issue fixed.
2022-01-12 09:17:52 -07:00
Andrew W Elble
b3043cae8f grace: workaround for buffer overflows (#28232)
spack paths can be long and this overflows (at least) these buffers
inside of the bundled T1lib inside of the grace distribution, leading
to crashes on startup.
2022-01-12 17:14:37 +01:00
kwryankrattiger
6c9b781e6e ECP-DAV: Propagate adios2 variant to paraview (#26747)
Depends on #26728 and #26175
2022-01-12 11:14:12 -05:00
Matthieu Dorier
04b9d87fca gptune: fix build by setting compilers to mpi wrappers (#28235) 2022-01-12 17:13:37 +01:00
Harmen Stoppels
868eaeb70e lmod: add v8.6.5 (#28353) 2022-01-12 08:26:43 -07:00
Nils Leif Fischer
0e1e705e15 charmpp: disable pre-7.0.0 macOS builds with clang@7: (#28221)
Charm++ versions below 7.0.0 have build issues on macOS, mainly due to the
pre-7.0.0 `VERSION` file conflicting with other version files on the
system: https://github.com/UIUC-PPL/charm/issues/2844. Specifically, it
conflicts with LLVM's `<version>` header that was added in llvm@7.0.0 to
comply with the C++20 standard:
https://en.cppreference.com/w/cpp/header/version. The conflict only occurs
on case-insensitive file systems, as typically used on macOS machines.
2022-01-12 15:53:49 +01:00
Adam J. Stewart
faaf38ca7d py-torch: use conditional variants (#28242) 2022-01-12 15:52:16 +01:00
Todd Gamblin
54d741ba54 unparser: handle package-level loops, if statements, and with blocks
Many packages implement logic at the class level to handle complex dependencies and
conflicts. Others have started using `with when("@1.0"):` blocks since we added that
capability. The loops and other control logic can cause some pure directive logic not to
be removed by our package hashing logic -- and in many cases that's a lot of code that
will cause unnecessary rebuilds.

This commit changes the unparser so that it will descend into these blocks. Specifically:

  1. Descend into loops, if statements, and with blocks at the class level.
  2. Don't look inside function definitions (in or outside a class).
  3. Don't look at nested class definitions (they don't have directives)
  4. Add logic to *remove* empty loops/with blocks/if statements if all directives
     in them were removed.

This allows our package hash to ignore a lot of pure metadata that it was not ignoring
before, and makes it less sensitive.

In addition, we add `maintainers` and `tags` to the list of metadata attributes that
Spack should remove from packages when constructing canonical source for a package
hash.

- [x] Make unparser handle if/for/while/with at class level.
- [x] Add tests for control logic removal.
- [x] Add a test to ensure that all packages are not only unparseable, but also
      that their canonical source is still compilable. This is a test for
      our control logic removal.
- [x] Add another unparse test package that has complex logic.
2022-01-12 06:14:18 -08:00
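The class-level traversal described above can be illustrated with the stdlib `ast` module: directive-like calls inside `if`/`for`/`with` blocks at class scope are collected, while function bodies and nested classes are skipped. This is a simplified sketch, not Spack's unparser, and the package source is a made-up example:

```python
import ast

SRC = """
class Mypackage:
    version("1.0")
    with when("@1.0"):
        depends_on("mpi")
    for v in ("2.0", "3.0"):
        version(v)
    def install(self, spec, prefix):
        make()
"""

def class_level_calls(tree):
    """Collect call names at class level, descending into if/for/while/with
    blocks but never into function bodies or nested class definitions."""
    names = []

    def visit(body):
        for node in body:
            if isinstance(node, (ast.FunctionDef, ast.ClassDef)):
                continue  # don't look inside functions or nested classes
            if isinstance(node, (ast.If, ast.For, ast.While, ast.With)):
                visit(node.body)  # descend into class-level control blocks
                visit(getattr(node, "orelse", []))
            elif isinstance(node, ast.Expr) and isinstance(node.value, ast.Call):
                func = node.value.func
                if isinstance(func, ast.Name):
                    names.append(func.id)

    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            visit(node.body)
    return names

print(class_level_calls(ast.parse(SRC)))  # directive calls only, not make()
```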
Todd Gamblin
101f080138 unparser: add unparser unit tests
These are the unit tests from astunparse, converted to pytest, with a few backports from
upstream cpython. These should hopefully keep `unparser.py` well covered as we change it.
2022-01-12 06:14:18 -08:00
Todd Gamblin
4d7226832d unparser: rename t to node to mirror upstream
These refactors have happened in upstream `ast.unparse()`
2022-01-12 06:14:18 -08:00
Todd Gamblin
0370324f1f unparser: rename _Class() methods to visit_Class() to mirror upstream
These are refactors that have happened in upstream `ast.unparse()`
2022-01-12 06:14:18 -08:00
Todd Gamblin
ec16c2d7c2 unparser: do a better job of roundtripping strings
Handle complex f-strings.  Backport of:

    a993e901eb#
2022-01-12 06:14:18 -08:00
Todd Gamblin
e9612696fd unparser: treat print(a, b, c) and print((a, b, c)) the same
We can't tell `print(a, b, c)` and `print((a, b, c))` apart -- both of these expressions
generate different ASTs in Python 2 and Python 3.  However, we can decide that we don't
care.  This commit treats both of them the same when `py_ver_consistent` is set with
`unparse()`.

This means that the package hash won't notice changes from printing a tuple to printing
multiple values, but we don't care, because this is extremely unlikely to affect the build.
More than likely this is just an error message for the user of the package.

- [x] treat `print(a, b, c)` and `print((a, b, c))` the same in py2 and py3
- [x] add another package parsing test -- legion -- that exercises this feature
2022-01-12 06:14:18 -08:00
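The ambiguity above is easy to see with the stdlib: under Python 3 the two forms parse to different ASTs (three arguments vs. one tuple argument), whereas Python 2's grammar made both a print statement taking a tuple, so the distinction cannot survive a cross-version round-trip (illustrative only):

```python
import ast

multi = ast.parse("print(a, b, c)").body[0].value
tup = ast.parse("print((a, b, c))").body[0].value

# In Python 3, the first call has three arguments; the second has a
# single tuple argument. Python 2 parsed both the same way.
assert len(multi.args) == 3
assert len(tup.args) == 1 and isinstance(tup.args[0], ast.Tuple)
```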
Todd Gamblin
a18a0e7a47 commands: add spack pkg source and spack pkg hash
To make it easier to see how package hashes change and how they are computed, add two
commands:

* `spack pkg source <spec>`: dumps source code for a package to the terminal

* `spack pkg source --canonical <spec>`: dumps canonicalized source code for a
   package to the terminal. It strips comments, directives, and known-unused
   multimethods from the package. It is used to generate package hashes.

* `spack pkg hash <spec>`: This gives the package hash for a particular spec.
  It is generated from the canonical source code for the spec.

- [x] `add spack pkg source` and `spack pkg hash`
- [x] add tests
- [x] fix bug in multimethod resolution with boolean `@when` values

Co-authored-by: Greg Becker <becker33@llnl.gov>
2022-01-12 06:14:18 -08:00
Todd Gamblin
106ae7abe6 package_hash: switch to using canonical source instead of AST repr
We are planning to switch to using full hashes for Spack specs, which means that the
package hash will be included in the deployment descriptor. This means we need a more
robust package hash than simply dumping the `repr` of the AST.

The AST repr that we previously used for package content is unreliable because it can
vary between python versions (Python's AST actually changes fairly frequently).

- [x] change `package_hash`, `package_ast`, and `canonical_source` to accept a string for
      alternate source instead of a filename.
- [x] consolidate package hash tests in `test/util/package_hash.py`.
- [x] remove old `package_content` method.
- [x] make `package_hash` do what `canonical_source_hash` was doing before.
- [x] modify `content_hash` in `package.py` to use the new `package_hash` function.

Co-authored-by: Danny McClanahan <1305167+cosmicexplorer@users.noreply.github.com>
2022-01-12 06:14:18 -08:00
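The motivation above — `repr` of the AST varies across Python versions, while a canonical rendering of the source does not — can be sketched with `ast.unparse` (Python 3.9+) and `hashlib`. This is an illustrative stand-in, not Spack's `package_hash`:

```python
import ast
import hashlib

def canonical_hash(source):
    """Hash a canonical rendering of the source rather than repr(AST).

    Round-tripping through parse/unparse normalizes surface details
    (whitespace, redundant parentheses; ast.parse already discards
    comments), so the hash tracks semantics rather than raw text.
    Spack's real package_hash additionally strips docstrings and
    directives; that is omitted here.
    """
    canonical = ast.unparse(ast.parse(source))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

assert canonical_hash("x = (1 + 2)\n") == canonical_hash("x   =   1 + 2")
```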
Todd Gamblin
39afe946a0 unparser: Don't omit parenthesis when unparsing a slice
Backport of
  c102a14825

Includes support for Python 2.
2022-01-12 06:14:18 -08:00
Todd Gamblin
0776c3b4d6 unparser: Don't put unnecessary parentheses on class declarations
Backport of
* 25160cdc47
2022-01-12 06:14:18 -08:00
Todd Gamblin
ff5e73d6eb package_hash: add test to ensure that every package in Spack can be unparsed
- [x] add option to canonical source to *not* filter multimethods
- [x] add test to unparse every package in builtin
2022-01-12 06:14:18 -08:00
Todd Gamblin
b6dde510bd package_hash: add test to ensure consistency across Python versions
Our package hash is supposed to be consistent from python version to python version.
Test this by adding some known unparse inputs and ensuring that they always have the
same canonical hash.  This test relies on the fact that we run Spack's unit tests
across many python versions.  We can't compute for several python versions within the
same test run so we precompute the hashes and check them in CI.
2022-01-12 06:14:18 -08:00
Todd Gamblin
800229a448 package_hash: fix handling of multimethods and add tests
Package hashing was not properly handling multimethods. In particular, it was removing
any functions that had decorators from the output, so we'd miss things like
`@run_after("install")`, etc.

There were also problems with handling multiple `@when`'s in a single file, and with
handling `@when` functions that *had* to be evaluated dynamically.

- [x] Rework static `@when` resolution for package hash
- [x] Ensure that functions with decorators are not removed from output
- [x] Add tests for many different @when scenarios (multiple @when's,
      combining with other decorators, default/no default, etc.)

Co-authored-by: Danny McClanahan <1305167+cosmicexplorer@users.noreply.github.com>
2022-01-12 06:14:18 -08:00
Todd Gamblin
93a6c51d88 package_hash: rework RemoveDirectives and add a test
Previously we used `directives.__all__` to get directive names, but it wasn't
quite right -- it included `DirectiveMeta`, etc.  It's not wrong, but it's also
not the clearest way to do this.

- [x] Refactor `@directive` to track names in `directive_names` global
- [x] Rename `_directive_names` to `_directive_dict_names` in `DirectiveMeta`
- [x] Add a test for `RemoveDirectives`

Co-authored-by: Danny McClanahan <1305167+cosmicexplorer@users.noreply.github.com>
2022-01-12 06:14:18 -08:00
Todd Gamblin
8880a00862 package_hash: remove all unassigned strings, not just docstrings
Some packages use top-level unassigned strings instead of comments, either just after a
docstring or in the body somewhere else. Ignore those strings because they have no
effect on package behavior.

- [x] adjust RemoveDocstrings to remove all free-standing strings.
- [x] move tests for util/package_hash.py to test/util/package_hash.py
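
A minimal sketch of the idea, as an `ast.NodeTransformer` that drops bare string statements. This is an illustration only (Python 3.9+ for `ast.unparse`), not Spack's actual `RemoveDocstrings` code:

```python
import ast

class StripFreeStrings(ast.NodeTransformer):
    """Drop statements that are bare string constants (docstrings and
    free-standing "comment" strings).  A sketch of the idea only --
    not Spack's actual RemoveDocstrings implementation."""

    def visit_Expr(self, node):
        if isinstance(node.value, ast.Constant) and isinstance(node.value.value, str):
            return None  # returning None deletes the statement
        return node

source = '"""module docstring"""\nx = 1\n"free-standing note"\ny = 2\n'
tree = StripFreeStrings().visit(ast.parse(source))
ast.fix_missing_locations(tree)
print(ast.unparse(tree))  # only the assignments remain
```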

Co-authored-by: Danny McClanahan <1305167+cosmicexplorer@users.noreply.github.com>
2022-01-12 06:14:18 -08:00
Todd Gamblin
572fbf4f49 unparser: handle unicode string literals consistently across Python versions
Python 2 and 3 represent string literals differently in the AST. Python 2 requires '\x'
literals, and Python 3 source is always unicode, and allows unicode to be written
directly.  These also unparse differently by default.

- [x] modify unparser to write both out the way `repr` would in Python 2 when
  `py_ver_consistent` is provided.
2022-01-12 06:14:18 -08:00
Todd Gamblin
396c37d82f unparser: implement operator precedence algorithm for unparser
Backport operator precedence algorithm from here:
    397b96f6d7

This eliminates unnecessary parentheses from our unparsed output and makes Spack's unparser
consistent with the one in upstream Python 3.9+, with one exception.

Our unparser normalizes argument order when `py_ver_consistent` is set, so that star arguments
in function calls come last.  We have to do this because Python 2's AST doesn't have information
about their actual order.

If we ever support only Python 3.9 and higher, we can easily switch over to `ast.unparse`, as
the unparsing is consistent except for this detail (modulo future changes to `ast.unparse`).
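
The effect of precedence-aware unparsing can be checked against the stdlib unparser in Python 3.9+ (an illustration, not Spack's backported code):

```python
import ast

# With operator-precedence tracking, parentheses are emitted only where
# they change meaning: "a + b * c" round-trips without parentheses, while
# "(a + b) * c" keeps them because removing them would alter the result.
print(ast.unparse(ast.parse("a + b * c")))    # a + b * c
print(ast.unparse(ast.parse("(a + b) * c")))  # (a + b) * c
```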
2022-01-12 06:14:18 -08:00
Todd Gamblin
afb358313a unparser: refactor delimiting with context managers in ast.unparse
Backport of 4b3b1226e8
2022-01-12 06:14:18 -08:00
Todd Gamblin
5847eb1e65 unparser: add block() context manager for indentation
This is a backport of a refactor from cpython 3.9
2022-01-12 06:14:18 -08:00
Todd Gamblin
2badd6500e unparse: Make unparsing consistent for 2.7 and 3.5-3.10
Previously, there were differences in the unparsed code for Python 2.7 and for 3.5-3.10.
This makes unparsed code the same across these Python versions by:

    1. Ensuring there are no spaces between unary operators and
       their operands.
    2. Ensuring that *args and **kwargs are always the last arguments,
       regardless of the python version.
    3. Always unparsing print as a function.
    4. Not putting an extra comma after Python 2 class definitions.

Without these changes, the same source can generate different code for different
Python versions, depending on subtle AST differences.

One place where single source will generate an inconsistent AST is with
multi-argument print statements, e.g.:

```
    print("foo", "bar", "baz")
```

In Python 2, this prints a tuple; in Python 3, it is the print function with
multiple arguments.  Use `from __future__ import print_function` to avoid
this inconsistency.
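
The Python 3 half of that difference is easy to see with the stdlib `ast` module (illustration only):

```python
import ast

# In Python 3, a multi-argument print parses as an ordinary function call;
# in Python 2 (without the __future__ import) the same source parsed as a
# Print statement wrapping a tuple, so the two ASTs unparse differently.
call = ast.parse('print("foo", "bar", "baz")').body[0].value
print(type(call).__name__, call.func.id, len(call.args))  # Call print 3
```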
2022-01-12 06:14:18 -08:00
Todd Gamblin
b324fe5d95 externals: add astunparse
Add `astunparse` as `spack_astunparse`. This library unparses Python ASTs and we're
adding it under our own name so that we can make modifications to it.

Ultimately this will be used to make `package_hash` consistent across Python versions.
2022-01-12 06:14:18 -08:00
Andrew W Elble
a38cdddd37 perl-tk: add missing dependencies (#28240) 2022-01-12 14:53:25 +01:00
Dylan Simon
2cf22ad5f2 e2fsprogs: fix non-root install (#28255) 2022-01-12 14:52:46 +01:00
Harmen Stoppels
c8e01752a1 Use depends_on over load in lmod module files generated by Spack (#28352) 2022-01-12 13:29:22 +00:00
snehring
640a4f7dcd poamsa: fix build errors for gcc10+ (#28262) 2022-01-12 14:09:55 +01:00
Danny McClanahan
435a241869 yarn: add runtime dependency on node-js@4.0: (#27654) 2022-01-12 13:05:49 +00:00
H. Joe Lee
a5ff3206f9 HDF5 GPU VFD: add new package. (#28272) 2022-01-12 13:59:51 +01:00
Melven Roehrig-Zoellner
5bb5bf3efb ITensor: add v3.1.10 and 'shared' variant (#28370) 2022-01-12 05:20:10 -07:00
Adam J. Stewart
55a62a1ee5 py-async-timeout: fix checksum issue (#28329) 2022-01-12 10:53:12 +01:00
Nils Leif Fischer
7fa81a66af libtheora: disable docs by default (#28330) 2022-01-12 10:24:58 +01:00
Nils Leif Fischer
4e0372b9b7 curl: add support for external detection (#28331) 2022-01-12 10:24:10 +01:00
Massimiliano Culpo
5476e5d035 Remove tut since it requires deprecated Python 3.6 (#28360) 2022-01-12 09:34:50 +01:00
kwryankrattiger
363a565ce3 Packaging: Virtual package for libllvm (#27200)
Add an abstraction around libllvm to allow libllvm
providers to be specified for all packages.

This targets allowing mesa to build against llvm-amdgpu, intel-llvm,
llvm, or any other custom LLVM variant that arises for specific GPU
toolchains.
2022-01-11 13:28:13 -08:00
Erik Schnetter
d4a468c160 memkind: add v1.12.0 (#28306) 2022-01-11 11:23:36 -07:00
Erik Schnetter
6690bb7411 openblas: New version 0.3.19 (#28308) 2022-01-11 16:55:23 +01:00
Adam J. Stewart
2ab871dd17 py-aiohttp: switch to PyPI tarball (#28333) 2022-01-11 14:33:02 +01:00
Adam J. Stewart
2ea8abef34 Open3D: add missing LLVM dep (#28334) 2022-01-11 14:32:34 +01:00
Erik Schnetter
67d9abb5f1 ffmpeg: add v4.4.1 (#28300) 2022-01-11 14:21:43 +01:00
Harmen Stoppels
f611eb0283 hdf5: prefer stable over experimental releases (#28340) 2022-01-11 10:32:28 +01:00
Harmen Stoppels
78debc1143 mbedtls: add v2.16.12, v2.28.0 and v3.1.0 (#28281) 2022-01-11 10:28:58 +01:00
Adam J. Stewart
7b76e3982f Deprecate Python 2 installations (#28003)
* Deprecate Python 2 installations

* Deprecate py-python-meep

* Deprecate older easybuild backend libs

* Deprecate Python 3.6

* Deprecate miniconda2
2022-01-10 15:45:35 -06:00
Erik Schnetter
675210bd8b automake: New version 1.16.5 (#28299) 2022-01-10 21:33:14 +01:00
Erik Schnetter
0236e88af4 gawk: New version 5.1.1 (#28301) 2022-01-10 21:32:53 +01:00
Erik Schnetter
c939af8185 gdbm: New version 1.22 (#28302) 2022-01-10 21:32:40 +01:00
Erik Schnetter
33d8e8f31c grep: New version 3.7 (#28303) 2022-01-10 21:32:33 +01:00
Erik Schnetter
f678d5818f lorene: Beautify package title in documentation (#28305) 2022-01-10 21:32:20 +01:00
Erik Schnetter
4ef8e15ed3 wi4mpi: New version 3.5.0 (#28313) 2022-01-10 21:31:30 +01:00
Erik Schnetter
09ca7c4d9c wget: New version 1.21.2 (#28312) 2022-01-10 21:31:13 +01:00
Erik Schnetter
76d9adae8c openssl: New version 1.1.1m (#28310) 2022-01-10 21:30:58 +01:00
Peter Scheibel
9f7fb6d01a stage.steal_source: preserve symlinks
This avoids dangling-symlink errors. The `ignore_dangling_symlinks` option would be more targeted, but it is only available for Python >= 3.2. (#28318)
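
A sketch of the underlying fix, using `shutil.copytree(symlinks=True)` with hypothetical temp paths; this is not Spack's actual `steal_source` code:

```python
import os
import shutil
import tempfile

# Copying with symlinks=True copies each link itself instead of following
# it, so a dangling link no longer raises an error during the copy.
src = tempfile.mkdtemp()
dst = os.path.join(tempfile.mkdtemp(), "copy")
open(os.path.join(src, "real.txt"), "w").close()
os.symlink("missing-target", os.path.join(src, "dangling"))  # dangling link

shutil.copytree(src, dst, symlinks=True)
print(os.path.islink(os.path.join(dst, "dangling")))  # the link survives
```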
2022-01-10 10:10:49 -08:00
Adam J. Stewart
cc32b08205 Python: set default config_vars (#28290)
* Python: set default config_vars

* Add missing commas

* dso_suffix not present for some reason

* Remove use of default_site_packages_dir

* Use config_vars during bootstrapping too

* Catch more errors

* Fix unit tests

* Catch more errors

* Update docstring
2022-01-10 12:00:06 -06:00
7270 changed files with 39917 additions and 20886 deletions

View File

@@ -16,19 +16,29 @@ body:
attributes:
label: Steps to reproduce the issue
description: |
Fill in the exact spec you are trying to build and the relevant part of the error message
placeholder: |
Fill in the console output from the exact spec you are trying to build.
value: |
```console
$ spack install <spec>
$ spack spec -I <spec>
...
```
- type: textarea
id: error
attributes:
label: Error message
description: |
Please post the error message from spack inside the `<details>` tag below:
value: |
<details><summary>Error message</summary><pre>
...
</pre></details>
validations:
required: true
- type: textarea
id: information
attributes:
label: Information on your system
description: Please include the output of `spack debug report`
description: Please include the output of `spack debug report`.
validations:
required: true
- type: markdown

View File

@@ -31,7 +31,7 @@ jobs:
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -61,7 +61,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -90,7 +90,7 @@ jobs:
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -118,7 +118,7 @@ jobs:
bzip2 curl file gcc-c++ gcc gcc-fortran tar git gpg2 gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -138,7 +138,7 @@ jobs:
- name: Install dependencies
run: |
brew install cmake bison@2.7 tree
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -157,8 +157,8 @@ jobs:
- name: Install dependencies
run: |
brew install tree
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Bootstrap clingo
@@ -174,8 +174,8 @@ jobs:
matrix:
python-version: ['2.7', '3.5', '3.6', '3.7', '3.8', '3.9']
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Setup repo and non-root user
@@ -202,7 +202,7 @@ jobs:
apt-get install -y \
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- uses: actions/checkout@v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846
- name: Setup repo and non-root user
run: |
git --version
@@ -231,7 +231,7 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree \
gawk
- uses: actions/checkout@v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846
- name: Setup repo and non-root user
run: |
git --version
@@ -256,7 +256,7 @@ jobs:
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- uses: actions/checkout@v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
@@ -272,7 +272,7 @@ jobs:
brew install gawk tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- uses: actions/checkout@v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh

View File

@@ -37,7 +37,7 @@ jobs:
name: Build ${{ matrix.dockerfile[0] }}
steps:
- name: Checkout
uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- name: Set Container Tag Normal (Nightly)
run: |
@@ -67,7 +67,7 @@ jobs:
uses: docker/setup-buildx-action@94ab11c41e45d028884a99163086648e898eed25 # @v1
- name: Log in to GitHub Container Registry
uses: docker/login-action@42d299face0c5c43a0487c477f595ac9cf22f1a7 # @v1
uses: docker/login-action@dd4fa0671be5250ee6f50aedf4cb05514abda2c7 # @v1
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -75,13 +75,13 @@ jobs:
- name: Log in to DockerHub
if: ${{ github.event_name != 'pull_request' }}
uses: docker/login-action@42d299face0c5c43a0487c477f595ac9cf22f1a7 # @v1
uses: docker/login-action@dd4fa0671be5250ee6f50aedf4cb05514abda2c7 # @v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[1] }}
uses: docker/build-push-action@a66e35b9cbcf4ad0ea91ffcaf7bbad63ad9e0229 # @v2
uses: docker/build-push-action@7f9d37fa544684fb73bfe4835ed7214c255ce02b # @v2
with:
file: share/spack/docker/${{matrix.dockerfile[1]}}
platforms: ${{ matrix.dockerfile[2] }}

View File

@@ -2,19 +2,7 @@
. share/spack/setup-env.sh
echo -e "config:\n build_jobs: 2" > etc/spack/config.yaml
spack config add "packages:all:target:[x86_64]"
# TODO: remove this explicit setting once apple-clang detection is fixed
cat <<EOF > etc/spack/compilers.yaml
compilers:
- compiler:
spec: apple-clang@11.0.3
paths:
cc: /usr/bin/clang
cxx: /usr/bin/clang++
f77: /usr/local/bin/gfortran-9
fc: /usr/local/bin/gfortran-9
modules: []
operating_system: catalina
target: x86_64
EOF
spack compiler find
spack compiler info apple-clang
spack debug report
spack solve zlib

View File

@@ -24,23 +24,23 @@ jobs:
name: gcc with clang
runs-on: macos-latest
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: 3.9
- name: spack install
run: |
. .github/workflows/install_spack.sh
# 9.2.0 is the latest version on which we apply homebrew patch
spack install -v --fail-fast gcc@9.2.0 %apple-clang
spack install -v --fail-fast gcc@11.2.0 %apple-clang
install_jupyter_clang:
name: jupyter
runs-on: macos-latest
timeout-minutes: 700
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: 3.9
- name: spack install
@@ -52,8 +52,8 @@ jobs:
name: scipy, mpl, pd
runs-on: macos-latest
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: 3.9
- name: spack install

View File

@@ -15,8 +15,8 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: 3.9
- name: Install Python Packages
@@ -31,10 +31,10 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: 3.9
- name: Install Python packages
@@ -57,7 +57,7 @@ jobs:
packages: ${{ steps.filter.outputs.packages }}
with_coverage: ${{ steps.coverage.outputs.with_coverage }}
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
@@ -106,10 +106,10 @@ jobs:
- python-version: 3.9
concretizer: original
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -121,12 +121,16 @@ jobs:
patchelf cmake bison libbison-dev kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml]
pip install --upgrade pip six setuptools pytest codecov "coverage[toml]<=6.2"
# ensure style checks are not skipped in unit tests for python >= 3.6
# note that true/false (i.e., 1/0) are opposite in conditions in python and bash
if python -c 'import sys; sys.exit(not sys.version_info >= (3, 6))'; then
pip install --upgrade flake8 isort>=4.3.5 mypy>=0.900 black
fi
- name: Pin pathlib for Python 2.7
if: ${{ matrix.python-version == 2.7 }}
run: |
pip install -U pathlib2==2.3.6
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -167,10 +171,10 @@ jobs:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: 3.9
- name: Install System packages
@@ -180,7 +184,7 @@ jobs:
sudo apt-get install -y coreutils kcov csh zsh tcsh fish dash bash
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml]
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -214,7 +218,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -233,10 +237,10 @@ jobs:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: 3.9
- name: Install System packages
@@ -248,7 +252,7 @@ jobs:
patchelf kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml] clingo
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2 clingo
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -282,16 +286,16 @@ jobs:
matrix:
python-version: [3.8]
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools
pip install --upgrade pytest codecov coverage[toml]
pip install --upgrade pytest codecov coverage[toml]==6.2
- name: Setup Homebrew packages
run: |
brew install dash fish gcc gnupg2 kcov
@@ -327,13 +331,13 @@ jobs:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
- uses: actions/checkout@a12a3943b4bdde767164f792f33f40b04645d846 # @v2
- uses: actions/setup-python@0ebf233433c08fb9061af664d501c3f3ff0e9e20 # @v2
with:
python-version: 3.9
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml]
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2
- name: Package audits (with coverage)
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
run: |

View File

@@ -34,10 +34,18 @@ includes the sbang tool directly in bin/sbang. These packages are covered
by various permissive licenses. A summary listing follows. See the
license included with each package for full details.
PackageName: altgraph
PackageHomePage: https://altgraph.readthedocs.io/en/latest/index.html
PackageLicenseDeclared: MIT
PackageName: argparse
PackageHomePage: https://pypi.python.org/pypi/argparse
PackageLicenseDeclared: Python-2.0
PackageName: astunparse
PackageHomePage: https://github.com/simonpercivall/astunparse
PackageLicenseDeclared: Python-2.0
PackageName: attrs
PackageHomePage: https://github.com/python-attrs/attrs
PackageLicenseDeclared: MIT
@@ -62,6 +70,10 @@ PackageName: jsonschema
PackageHomePage: https://pypi.python.org/pypi/jsonschema
PackageLicenseDeclared: MIT
PackageName: macholib
PackageHomePage: https://macholib.readthedocs.io/en/latest/index.html
PackageLicenseDeclared: MIT
PackageName: markupsafe
PackageHomePage: https://pypi.python.org/pypi/MarkupSafe
PackageLicenseDeclared: BSD-3-Clause
@@ -93,11 +105,3 @@ PackageLicenseDeclared: Apache-2.0 OR MIT
PackageName: six
PackageHomePage: https://pypi.python.org/pypi/six
PackageLicenseDeclared: MIT
PackageName: macholib
PackageHomePage: https://macholib.readthedocs.io/en/latest/index.html
PackageLicenseDeclared: MIT
PackageName: altgraph
PackageHomePage: https://altgraph.readthedocs.io/en/latest/index.html
PackageLicenseDeclared: MIT

View File

@@ -1,6 +1,6 @@
MIT License
Copyright (c) 2013-2020 LLNS, LLC and other Spack Project Developers.
Copyright (c) 2013-2022 LLNS, LLC and other Spack Project Developers.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

View File

@@ -10,6 +10,7 @@ For more on Spack's release structure, see
| Version | Supported |
| ------- | ------------------ |
| develop | :white_check_mark: |
| 0.17.x | :white_check_mark: |
| 0.16.x | :white_check_mark: |
## Reporting a Vulnerability

View File

@@ -1,7 +1,7 @@
#!/bin/sh
# -*- python -*-
#
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,6 +1,6 @@
#!/bin/sh
#
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -0,0 +1,17 @@
# -------------------------------------------------------------------------
# This is the default spack configuration file.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing
# `$SPACK_ROOT/etc/spack/concretizer.yaml`, `~/.spack/concretizer.yaml`,
# or by adding a `concretizer:` section to an environment.
# -------------------------------------------------------------------------
concretizer:
# Whether to consider installed packages or packages from buildcaches when
# concretizing specs. If `true`, we'll try to use as many installs/binaries
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization.
reuse: false

View File

@@ -155,14 +155,17 @@ config:
# The concretization algorithm to use in Spack. Options are:
#
# 'original': Spack's original greedy, fixed-point concretizer. This
# algorithm can make decisions too early and will not backtrack
# sufficiently for many specs.
#
# 'clingo': Uses a logic solver under the hood to solve DAGs with full
# backtracking and optimization for user preferences. Spack will
# try to bootstrap the logic solver, if not already available.
#
# 'original': Spack's original greedy, fixed-point concretizer. This
# algorithm can make decisions too early and will not backtrack
# sufficiently for many specs. This will soon be deprecated in
# favor of clingo.
#
# See `concretizer.yaml` for more settings you can fine-tune when
# using clingo.
concretizer: clingo

View File

@@ -46,7 +46,13 @@ modules:
enable:
- tcl
tcl:
all:
autoload: none
# Default configurations if lmod is enabled
lmod:
all:
autoload: direct
hierarchy:
- mpi

View File

@@ -34,6 +34,7 @@ packages:
java: [openjdk, jdk, ibm-java]
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
libllvm: [llvm, llvm-amdgpu]
lua-lang: [lua, lua-luajit]
mariadb-client: [mariadb-c-client, mariadb]
mkl: [intel-mkl]

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -194,9 +194,9 @@ Reusing installed dependencies
.. warning::
The ``--reuse`` option described here is experimental, and it will
likely be replaced with a different option and configuration settings
in the next Spack release.
The ``--reuse`` option described here will become the default installation
method in the next Spack version, and you will be able to get the current
behavior by using ``spack install --fresh``.
By default, when you run ``spack install``, Spack tries to build a new
version of the package you asked for, along with updated versions of
the ``mpich`` will be built with the installed versions, if possible.
You can use the :ref:`spack spec -I <cmd-spack-spec>` command to see what
will be reused and what will be built before you install.
You can configure Spack to use the ``--reuse`` behavior by default in
``concretizer.yaml``.
.. _cmd-spack-uninstall:
^^^^^^^^^^^^^^^^^^^
@@ -1280,7 +1283,7 @@ Normally users don't have to bother specifying the architecture if they
are installing software for their current host, as in that case the
values will be detected automatically. If you need fine-grained control
over which packages use which targets (or over *all* packages' default
target), see :ref:`concretization-preferences`.
target), see :ref:`package-preferences`.
.. admonition:: Cray machines

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -209,11 +209,49 @@ Specific limitations include:
then Spack will not add a new external entry (``spack config blame packages``
can help locate all external entries).
.. _concretization-preferences:
.. _concretizer-options:
--------------------------
Concretization Preferences
--------------------------
----------------------
Concretizer options
----------------------
``packages.yaml`` gives the concretizer preferences for specific packages,
but you can also use ``concretizer.yaml`` to customize aspects of the
algorithm it uses to select the dependencies you install:
.. code-block:: yaml
concretizer:
# Whether to consider installed packages or packages from buildcaches when
# concretizing specs. If `true`, we'll try to use as many installs/binaries
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization.
reuse: false
^^^^^^^^^^^^^^^^
``reuse``
^^^^^^^^^^^^^^^^
This controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use ``spack install --reuse`` to enable reuse for a single installation,
and you can use ``spack install --fresh`` to do a fresh install if ``reuse`` is
enabled by default.
.. note::
``reuse: false`` is the current default, but ``reuse: true`` will be the default
in the next Spack release. You will still be able to use ``spack install --fresh``
to get the old behavior.
.. _package-preferences:
-------------------
Package Preferences
-------------------
Spack can be configured to prefer certain compilers, package
versions, dependencies, and variants during concretization.
@@ -269,6 +307,7 @@ concretization rules. A provider lists a value that packages may
``depend_on`` (e.g, MPI) and a list of rules for fulfilling that
dependency.
.. _package_permissions:
-------------------

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

@@ -649,7 +649,7 @@ follow `the next section <intel-install-libs_>`_ instead.
* If you specified a custom variant (for example ``+vtune``) you may want to add this as your
preferred variant in the packages configuration for the ``intel-parallel-studio`` package
as described in :ref:`concretization-preferences`. Otherwise you will have to specify
as described in :ref:`package-preferences`. Otherwise you will have to specify
the variant every time ``intel-parallel-studio`` is being used as ``mkl``, ``fftw`` or ``mpi``
implementation to avoid pulling in a different variant.
@@ -811,13 +811,13 @@ by one of the following means:
$ spack install libxc@3.0.0%intel
* Alternatively, request Intel compilers implicitly by concretization preferences.
* Alternatively, request Intel compilers implicitly by package preferences.
Configure the order of compilers in the appropriate ``packages.yaml`` file,
under either an ``all:`` or client-package-specific entry, in a
``compiler:`` list. Consult the Spack documentation for
`Configuring Package Preferences <https://spack-tutorial.readthedocs.io/en/latest/tutorial_configuration.html#configuring-package-preferences>`_
and
:ref:`Concretization Preferences <concretization-preferences>`.
:ref:`Package Preferences <package-preferences>`.
Example: ``etc/spack/packages.yaml`` might simply contain:
@@ -867,7 +867,7 @@ virtual package, in order of decreasing preference. To learn more about the
``providers:`` settings, see the Spack tutorial for
`Configuring Package Preferences <https://spack-tutorial.readthedocs.io/en/latest/tutorial_configuration.html#configuring-package-preferences>`_
and the section
:ref:`Concretization Preferences <concretization-preferences>`.
:ref:`Package Preferences <package-preferences>`.
Example: The following fairly minimal example for ``packages.yaml`` shows how
to exclusively use the standalone ``intel-mkl`` package for all the linear

@@ -9,216 +9,80 @@
PythonPackage
-------------
Python packages and modules have their own special build system.
Python packages and modules have their own special build system. This
documentation covers everything you'll need to know in order to write
a Spack build recipe for a Python library.
^^^^^^
Phases
^^^^^^
^^^^^^^^^^^
Terminology
^^^^^^^^^^^
The ``PythonPackage`` base class provides the following phases that
can be overridden:
In the Python ecosystem, there are a number of terms that are
important to understand.
* ``build``
* ``build_py``
* ``build_ext``
* ``build_clib``
* ``build_scripts``
* ``install``
* ``install_lib``
* ``install_headers``
* ``install_scripts``
* ``install_data``
**PyPI**
The `Python Package Index <https://pypi.org/>`_, where most Python
libraries are hosted.
These are all standard ``setup.py`` commands and can be found by running:
**sdist**
Source distributions, distributed as tarballs (.tar.gz) and zip
files (.zip). Contain the source code of the package.
.. code-block:: console
**bdist**
Built distributions, distributed as wheels (.whl). Contain the
pre-built library.
$ python setup.py --help-commands
**wheel**
A binary distribution format common in the Python ecosystem. This
file is actually just a zip file containing specific metadata and
code. See the
`documentation <https://packaging.python.org/en/latest/specifications/binary-distribution-format/>`_
for more details.
**build frontend**
Command-line tools used to build and install wheels. Examples
include `pip <https://pip.pypa.io/>`_,
`build <https://pypa-build.readthedocs.io/>`_, and
`installer <https://installer.readthedocs.io/>`_.
By default, only the ``build`` and ``install`` phases are run:
**build backend**
Libraries used to define how to build a wheel. Examples
include `setuptools <https://setuptools.pypa.io/>`__,
`flit <https://flit.readthedocs.io/>`_, and
`poetry <https://python-poetry.org/>`_.
#. ``build`` - build everything needed to install
#. ``install`` - install everything from build directory
^^^^^^^^^^^
Downloading
^^^^^^^^^^^
If for whatever reason you need to run more phases, simply modify your
``phases`` list like so:
.. code-block:: python
phases = ['build_ext', 'install']
Each phase provides a function ``<phase>`` that runs:
.. code-block:: console
$ python -s setup.py --no-user-cfg <phase>
Each phase also has a ``<phase_args>`` function that can pass arguments to
this call. All of these functions are empty except for the ``install_args``
function, which passes ``--prefix=/path/to/installation/prefix``. There is
also some additional logic specific to setuptools and eggs.
If you need to run a phase that is not a standard ``setup.py`` command,
you'll need to define a function for it like so:
.. code-block:: python
phases = ['configure', 'build', 'install']
def configure(self, spec, prefix):
self.setup_py('configure')
^^^^^^
Wheels
^^^^^^
Some Python packages are closed-source and distributed as wheels.
Instead of using the ``PythonPackage`` base class, you should extend
the ``Package`` base class and implement the following custom installation
procedure:
.. code-block:: python
def install(self, spec, prefix):
pip = which('pip')
pip('install', self.stage.archive_file, '--prefix={0}'.format(prefix))
This will require a dependency on pip, as mentioned below.
^^^^^^^^^^^^^^^
Important files
^^^^^^^^^^^^^^^
Python packages can be identified by the presence of a ``setup.py`` file.
This file is used by package managers like ``pip`` to determine a
package's dependencies and the version of dependencies required, so if
the ``setup.py`` file is not accurate, the package will not build properly.
For this reason, the ``setup.py`` file should be fairly reliable. If the
documentation and ``setup.py`` disagree on something, the ``setup.py``
file should be considered to be the truth. As dependencies are added or
removed, the documentation is much more likely to become outdated than
the ``setup.py``.
The Python ecosystem has evolved significantly over the years. Before
setuptools became popular, most packages listed their dependencies in a
``requirements.txt`` file. Once setuptools took over, these dependencies
were listed directly in the ``setup.py``. Newer PEPs introduced additional
files, like ``setup.cfg`` and ``pyproject.toml``. You should look out for
all of these files, as they may all contain important information about
package dependencies.
Some Python packages are closed-source and are distributed as Python
wheels. For example, ``py-azureml-sdk`` downloads a ``.whl`` file. This
file is simply a zip file, and can be extracted using:
.. code-block:: console
$ unzip *.whl
The zip file will not contain a ``setup.py``, but it will contain a
``METADATA`` file which contains all the information you need to
write a ``package.py`` build recipe.
.. _pypi:
^^^^
PyPI
^^^^
The vast majority of Python packages are hosted on PyPI (The Python
Package Index), which is :ref:`preferred over GitHub <pypi-vs-github>`
for downloading packages. ``pip`` only supports packages hosted on PyPI, making
it the only option for developers who want a simple installation.
Search for "PyPI <package-name>" to find the download page. Note that
some pages are versioned, and the first result may not be the newest
version. Click on the "Latest Version" button to the top right to see
if a newer version is available. The download page is usually at::
The first step in packaging a Python library is to figure out where
to download it from. The vast majority of Python packages are hosted
on `PyPI <https://pypi.org/>`_, which is
:ref:`preferred over GitHub <pypi-vs-github>` for downloading
packages. Search for the package name on PyPI to find the project
page. The project page is usually located at::
https://pypi.org/project/<package-name>
Since PyPI is so common, the ``PythonPackage`` base class has a
``pypi`` attribute that can be set. Once set, ``pypi`` will be used
to define the ``homepage``, ``url``, and ``list_url``. For example,
the following:
.. code-block:: python
homepage = 'https://pypi.org/project/setuptools/'
url = 'https://pypi.org/packages/source/s/setuptools/setuptools-49.2.0.zip'
list_url = 'https://pypi.org/simple/setuptools/'
is equivalent to:
.. code-block:: python
pypi = 'setuptools/setuptools-49.2.0.zip'
^^^^^^^^^^^
Description
^^^^^^^^^^^
The top of the PyPI downloads page contains a description of the
package. The first line is usually a short description, while there
may be a several line "Project Description" that follows. Choose whichever
is more useful. You can also get these descriptions on the command-line
using:
.. code-block:: console
$ python setup.py --description
$ python setup.py --long-description
^^^^^^^^
Homepage
^^^^^^^^
Package developers use ``setup.py`` to upload new versions to PyPI.
The ``setup`` method often passes metadata like ``homepage`` to PyPI.
This metadata is displayed on the left side of the download page.
Search for the text "Homepage" under "Project links" to find it. You
should use this page instead of the PyPI page if they differ. You can
also get the homepage on the command-line by running:
.. code-block:: console
$ python setup.py --url
^^^
URL
^^^
If ``pypi`` is set as mentioned above, ``url`` and ``list_url`` will
be automatically set for you. If both ``.tar.gz`` and ``.zip`` versions
are available, ``.tar.gz`` is preferred. If some releases offer both
``.tar.gz`` and ``.zip`` versions, but some only offer ``.zip`` versions,
use ``.zip``.
Some Python packages are closed-source and do not ship ``.tar.gz`` or ``.zip``
files on either PyPI or GitHub. If this is the case, you can still download
and install a Python wheel. For example, ``py-azureml-sdk`` is closed source
and can be downloaded from::
On the project page, there is a "Download files" tab containing
download URLs. Whenever possible, we prefer to build Spack packages
from source. If PyPI only has wheels, check to see if the project is
hosted on GitHub and see if GitHub has source distributions. The
project page usually has a "Homepage" and/or "Source code" link for
this. If the project is closed-source, it may only have wheels
available. For example, ``py-azureml-sdk`` is closed-source and can
be downloaded from::
https://pypi.io/packages/py3/a/azureml_sdk/azureml_sdk-1.11.0-py3-none-any.whl
Once you've found a URL to download the package from, run:
You may see Python-specific or OS-specific URLs. Note that when you add a
``.whl`` URL, you should add ``expand=False`` to ensure that Spack doesn't
try to extract the wheel:
.. code-block:: console
.. code-block:: python
$ spack create <url>
version('1.11.0', sha256='d8c9d24ea90457214d798b0d922489863dad518adde3638e08ef62de28fb183a', expand=False)
to create a new package template.
.. _pypi-vs-github:
@@ -226,11 +90,13 @@ try to extract the wheel:
PyPI vs. GitHub
"""""""""""""""
Many packages are hosted on PyPI, but are developed on GitHub or another
version control systems. The tarball can be downloaded from either
location, but PyPI is preferred for the following reasons:
Many packages are hosted on PyPI, but are developed on GitHub or
another version control system hosting service. The source code can
be downloaded from either location, but PyPI is preferred for the
following reasons:
#. PyPI contains the bare minimum number of files needed to install the package.
#. PyPI contains the bare minimum number of files needed to install
the package.
You may notice that the tarball you download from PyPI does not
have the same checksum as the tarball you download from GitHub.
@@ -267,252 +133,124 @@ location, but PyPI is preferred for the following reasons:
PyPI is nice because it makes it physically impossible to
re-release the same version of a package with a different checksum.
Use the :ref:`pypi attribute <pypi>` to facilitate construction of PyPI package
references.
The only reason to use GitHub instead of PyPI is if PyPI only has
wheels or if the PyPI sdist is missing a file needed to build the
package. If this is the case, please add a comment above the ``url``
explaining this.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^
PyPI
^^^^
There are a few dependencies common to the ``PythonPackage`` build system.
""""""
Python
""""""
Obviously, every ``PythonPackage`` needs Python at build-time to run
``python setup.py build && python setup.py install``. Python is also
needed at run-time if you want to import the module. Due to backwards
incompatible changes between Python 2 and 3, it is very important to
specify which versions of Python are supported. If the documentation
mentions that Python 3 is required, this can be specified as:
Since PyPI is so commonly used to host Python libraries, the
``PythonPackage`` base class has a ``pypi`` attribute that can be
set. Once set, ``pypi`` will be used to define the ``homepage``,
``url``, and ``list_url``. For example, the following:
.. code-block:: python
depends_on('python@3:', type=('build', 'run'))
homepage = 'https://pypi.org/project/setuptools/'
url = 'https://pypi.org/packages/source/s/setuptools/setuptools-49.2.0.zip'
list_url = 'https://pypi.org/simple/setuptools/'
If Python 2 is required, this would look like:
is equivalent to:
.. code-block:: python
depends_on('python@:2', type=('build', 'run'))
pypi = 'setuptools/setuptools-49.2.0.zip'
If Python 2.7 is the only version that works, you can use:
If a package has a different homepage listed on PyPI, you can
override it by setting your own ``homepage``.
^^^^^^^^^^^
Description
^^^^^^^^^^^
The top of the PyPI project page contains a short description of the
package. The "Project description" tab may also contain a longer
description of the package. Either of these can be used to populate
the package docstring.
^^^^^^^^^^^^^
Build backend
^^^^^^^^^^^^^
Once you've determined the basic metadata for a package, the next
step is to determine the build backend. ``PythonPackage`` uses
`pip <https://pip.pypa.io/>`_ to install the package, but pip
requires a backend to actually build the package.
To determine the build backend, look for a ``pyproject.toml`` file.
If there is no ``pyproject.toml`` file and only a ``setup.py`` or
``setup.cfg`` file, you can assume that the project uses
:ref:`setuptools`. If there is a ``pyproject.toml`` file, see if it
contains a ``[build-system]`` section. For example:
.. code-block:: toml
[build-system]
requires = [
"setuptools>=42",
"wheel",
]
build-backend = "setuptools.build_meta"
This section does two things: the ``requires`` key lists build
dependencies of the project, and the ``build-backend`` key defines
the build backend. All of these build dependencies should be added as
dependencies to your package:
.. code-block:: python
depends_on('python@2.7:2.8', type=('build', 'run'))
depends_on('py-setuptools@42:', type='build')
The documentation may not always specify supported Python versions.
Another place to check is in the ``setup.py`` or ``setup.cfg`` file.
Look for a line containing ``python_requires``. An example from
`py-numpy <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-numpy/package.py>`_
looks like:
Note that ``py-wheel`` is already listed as a build dependency in the
``PythonPackage`` base class, so you don't need to add it unless you
need to specify a specific version requirement or change the
dependency type.
.. code-block:: python
See `PEP 517 <https://www.python.org/dev/peps/pep-0517/>`_ and
`PEP 518 <https://www.python.org/dev/peps/pep-0518/>`_ for more
information on the design of ``pyproject.toml``.
python_requires='>=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*'
Depending on which build backend a project uses, there are various
places that run-time dependencies can be listed.
"""""""""
distutils
"""""""""
You may also find a version check at the top of the ``setup.py``:
Before the introduction of setuptools and other build backends,
Python packages had to rely on the built-in distutils library.
Distutils is missing many of the features that setuptools and other
build backends offer, and users are encouraged to use setuptools
instead. In fact, distutils was deprecated in Python 3.10 and will be
removed in Python 3.12. Because of this, pip actually replaces all
imports of distutils with setuptools. If a package uses distutils,
you should instead add a build dependency on setuptools. Check for a
``requirements.txt`` file that may list dependencies of the project.
.. code-block:: python
if sys.version_info[:2] < (2, 7) or (3, 0) <= sys.version_info[:2] < (3, 4):
raise RuntimeError("Python version 2.7 or >= 3.4 required.")
This can be converted to Spack's spec notation like so:
.. code-block:: python
depends_on('python@2.7:2.8,3.4:', type=('build', 'run'))
If you are writing a recipe for a package that only distributes
wheels, look for a section in the ``METADATA`` file that looks like::
Requires-Python: >=3.5,<4
This would be translated to:
.. code-block:: python
extends('python')
depends_on('python@3.5:3', type=('build', 'run'))
Many ``setup.py`` or ``setup.cfg`` files also contain information like::
Programming Language :: Python :: 2
Programming Language :: Python :: 2.6
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3
Programming Language :: Python :: 3.3
Programming Language :: Python :: 3.4
Programming Language :: Python :: 3.5
Programming Language :: Python :: 3.6
This is a list of versions of Python that the developer likely tests.
However, you should not use this to restrict the versions of Python
the package uses unless one of the two methods above (``python_requires``
or ``sys.version_info``) is used. There is no logic in setuptools
that prevents the package from building for Python versions not in
this list, and often new releases like Python 3.7 or 3.8 work just fine.
.. _setuptools:
""""""""""
setuptools
""""""""""
Originally, the Python language had a single build system called
distutils, which is built into Python. Distutils provided a common
framework for package authors to describe their project and how it
should be built. However, distutils was not without limitations.
Most notably, there was no way to list a project's dependencies
with distutils. Along came setuptools, a non-builtin build system
designed to overcome the limitations of distutils. Both projects
use a similar API, making the transition easy while adding much
needed functionality. Today, setuptools is used in around 90% of
the Python packages in Spack.
Since setuptools isn't built-in to Python, you need to add it as a
dependency. To determine whether or not a package uses setuptools,
search the file for an import statement like:
.. code-block:: python
import setuptools
or:
.. code-block:: python
from setuptools import setup
Some packages are designed to work with both setuptools and distutils,
so you may find something like:
.. code-block:: python
try:
from setuptools import setup
except ImportError:
from distutils.core import setup
This uses setuptools if available, and falls back to distutils if not.
In this case, you would still want to add a setuptools dependency, as
it offers us more control over the installation.
Unless specified otherwise, setuptools is usually a build-only dependency.
That is, it is needed to install the software, but is not needed at
run-time. This can be specified as:
.. code-block:: python
depends_on('py-setuptools', type='build')
"""
pip
"""
Packages distributed as Python wheels will require an extra dependency
on pip:
.. code-block:: python
depends_on('py-pip', type='build')
We will use pip to install the actual wheel.
""""""
cython
""""""
Compared to compiled languages, interpreted languages like Python can
be quite a bit slower. To work around this, some Python developers
rewrite computationally demanding sections of code in C, a process
referred to as "cythonizing". In order to build these packages, you
need to add a build dependency on cython:
.. code-block:: python
depends_on('py-cython', type='build')
Look for references to "cython" in the ``setup.py`` to determine
whether or not this is necessary. Cython may be optional, but
even then you should list it as a required dependency. Spack is
designed to compile software, and is meant for HPC facilities
where speed is crucial. There is no reason why someone would not
want an optimized version of a library instead of the pure-Python
version.
Note that some release tarballs come pre-cythonized, and cython is
not needed as a dependency. However, this is becoming less common
as Python continues to evolve and developers discover that cythonized
sources are no longer compatible with newer versions of Python and
need to be re-cythonized.
^^^^^^^^^^^^^^^^^^^
Python dependencies
^^^^^^^^^^^^^^^^^^^
When you install a package with ``pip``, it reads the ``setup.py``
file in order to determine the dependencies of the package.
If the dependencies are not yet installed, ``pip`` downloads them
and installs them for you. This may sound convenient, but Spack
cannot rely on this behavior for two reasons:
#. Spack needs to be able to install packages on air-gapped networks.
If there is no internet connection, ``pip`` can't download the
package dependencies. By explicitly listing every dependency in
the ``package.py``, Spack knows what to download ahead of time.
#. Duplicate installations of the same dependency may occur.
Spack supports *activation* of Python extensions, which involves
symlinking the package installation prefix to the Python installation
prefix. If your package is missing a dependency, that dependency
will be installed to the installation directory of the same package.
If you try to activate the package + dependency, it may cause a
problem if that package has already been activated.
For these reasons, you must always explicitly list all dependencies.
Although the documentation may list the package's dependencies,
often the developers assume people will use ``pip`` and won't have to
worry about it. Always check the ``setup.py`` to find the true
dependencies.
If the package relies on ``distutils``, it may not explicitly list its
dependencies. Check for statements like:
.. code-block:: python
try:
import numpy
except ImportError:
raise ImportError("numpy must be installed prior to installation")
Obviously, this means that ``py-numpy`` is a dependency.
If the package uses ``setuptools``, check for the following clues:
If the ``pyproject.toml`` lists ``setuptools.build_meta`` as a
``build-backend``, or if the package has a ``setup.py`` that imports
``setuptools``, or if the package has a ``setup.cfg`` file, then it
uses setuptools to build. Setuptools is a replacement for the
distutils library, and has almost the exact same API. Dependencies
can be listed in the ``setup.py`` or ``setup.cfg`` file. Look for the
following arguments:
* ``python_requires``
As mentioned above, this specifies which versions of Python are
required.
This specifies the version of Python that is required.
* ``setup_requires``
@@ -524,43 +262,88 @@ If the package uses ``setuptools``, check for the following clues:
These packages are required for building and installation. You can
add them with ``type=('build', 'run')``.
* ``extra_requires``
* ``extras_require``
These packages are optional dependencies that enable additional
functionality. You should add a variant that optionally adds these
dependencies. This variant should be False by default.
* ``test_requires``
* ``tests_require``
These are packages that are required to run the unit tests for the
package. These dependencies can be specified using the
``type='test'`` dependency type. However, the PyPI tarballs rarely
contain unit tests, so there is usually no reason to add these.
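The ``extras_require`` pattern described above translates mechanically into a variant plus a conditional dependency. A minimal sketch, assuming a hypothetical extra named ``yaml`` provided by ``pyyaml``; in a real ``package.py`` the ``variant`` and ``depends_on`` directives come from Spack's package DSL, and are stubbed here only so the fragment runs standalone:

```python
# Sketch: translating a setuptools extras_require entry into Spack
# directives. The extra name ('yaml') and dependency (py-pyyaml) are
# illustrative assumptions. In a real package.py these stub
# definitions would not appear; Spack provides the directives.
recorded = []

def variant(name, default=False, description=''):
    recorded.append(('variant', name, default))

def depends_on(spec, type=None, when=None):
    recorded.append(('depends_on', spec, when))

# setup.py declares:  extras_require={'yaml': ['pyyaml>=5.1']}
# The corresponding Spack recipe lines, with the variant off by default:
variant('yaml', default=False, description='Enable YAML support')
depends_on('py-pyyaml@5.1:', type=('build', 'run'), when='+yaml')
```

The ``when='+yaml'`` clause is what makes the dependency optional: it is only pulled in when a user asks for the variant.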
In the root directory of the package, you may notice a
``requirements.txt`` file. It may look like this file contains a list
of all of the package's dependencies. Don't be fooled. This file is
used by tools like Travis to install the pre-requisites for the
package... and a whole bunch of other things. It often contains
dependencies only needed for unit tests, like:
See https://setuptools.pypa.io/en/latest/userguide/dependency_management.html
for more information on how setuptools handles dependency management.
See `PEP 440 <https://www.python.org/dev/peps/pep-0440/#version-specifiers>`_
for documentation on version specifiers in setuptools.
* mock
* nose
* pytest
""""
flit
""""
It can also contain dependencies for building the documentation, like
sphinx. If you can't find any information about the package's
dependencies, you can take a look in ``requirements.txt``, but be sure
not to add test or documentation dependencies.
There are actually two possible ``build-backend`` values for flit, ``flit``
and ``flit_core``. If you see these in the ``pyproject.toml``, add a
build dependency to your package. With flit, all dependencies are
listed directly in the ``pyproject.toml`` file. Older versions of
flit used to store this info in a ``flit.ini`` file, so check for
this too.
Newer PEPs have added alternative ways to specify a package's dependencies.
If you don't see any dependencies listed in the ``setup.py``, look for a
``setup.cfg`` or ``pyproject.toml``. These files can be used to store the
same ``install_requires`` information that ``setup.py`` used to use.
Either of these files may contain keys like:
If you are writing a recipe for a package that only distributes wheels,
check the ``METADATA`` file for lines like::
* ``requires-python``
This specifies the version of Python that is required
* ``dependencies`` or ``requires``
These packages are required for building and installation. You can
add them with ``type=('build', 'run')``.
* ``project.optional-dependencies`` or ``requires-extra``
This section includes keys with lists of optional dependencies
needed to enable those features. You should add a variant that
optionally adds these dependencies. This variant should be False
by default.
See https://flit.readthedocs.io/en/latest/pyproject_toml.html for
more information.
""""""
poetry
""""""
Like flit, poetry also has two possible ``build-backend`` values, ``poetry``
and ``poetry_core``. If you see these in the ``pyproject.toml``, add
a build dependency to your package. With poetry, all dependencies are
listed directly in the ``pyproject.toml`` file. Dependencies are
listed in a ``[tool.poetry.dependencies]`` section, and use a
`custom syntax <https://python-poetry.org/docs/dependency-specification/#version-constraints>`_
for specifying the version requirements. Note that ``~=`` works
differently in poetry than in setuptools and flit for versions that
start with a zero.
""""""
wheels
""""""
Some Python packages are closed-source and are distributed as Python
wheels. For example, ``py-azureml-sdk`` downloads a ``.whl`` file. This
file is simply a zip file, and can be extracted using:
.. code-block:: console
$ unzip *.whl
The zip file will not contain a ``setup.py``, but it will contain a
``METADATA`` file which contains all the information you need to
write a ``package.py`` build recipe. Check for lines like::
Requires-Python: >=3.5,<4
Requires-Dist: azureml-core (~=1.11.0)
Requires-Dist: azureml-dataset-runtime[fuse] (~=1.11.0)
Requires-Dist: azureml-train (~=1.11.0)
@@ -572,62 +355,58 @@ check the ``METADATA`` file for lines like::
Requires-Dist: azureml-train-automl (~=1.11.0); extra == 'automl'
Lines that use ``Requires-Dist`` are similar to ``install_requires``.
Lines that use ``Provides-Extra`` are similar to ``extra_requires``,
and you can add a variant for those dependencies. The ``~=1.11.0``
syntax is equivalent to ``1.11.0:1.11``.
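The ``~=`` translation rule above can be sketched as a small helper; ``compatible_release_to_spack`` is a hypothetical name used for illustration, not part of Spack's API:

```python
# Translate a PEP 440 compatible-release specifier (~=X.Y.Z) into the
# equivalent Spack version range, e.g. '~=1.11.0' -> '1.11.0:1.11'.
def compatible_release_to_spack(spec):
    assert spec.startswith('~='), 'only handles compatible-release specs'
    version = spec[2:].strip()
    parts = version.split('.')
    # The upper bound drops the last release segment: ~=1.11.0 allows
    # any 1.11.x, so the Spack range is 1.11.0:1.11.
    upper = '.'.join(parts[:-1])
    return '{0}:{1}'.format(version, upper)

print(compatible_release_to_spack('~=1.11.0'))  # -> 1.11.0:1.11
```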
""""""""""
setuptools
""""""""""
Setuptools is a bit of a special case. If a package requires setuptools
at run-time, how do they express this? They could add it to
``install_requires``, but setuptools is imported long before this and is
needed to read this line. And since you can't install the package
without setuptools, the developers assume that setuptools will already
be there, so they never mention when it is required. We don't want to
add run-time dependencies if they aren't needed, so you need to
determine whether or not setuptools is needed. Grep the installation
directory for any files containing a reference to ``setuptools`` or
``pkg_resources``. Both modules come from ``py-setuptools``.
``pkg_resources`` is particularly common in scripts found in
``prefix/bin``.
``Requires-Python`` is equivalent to ``python_requires`` and
``Requires-Dist`` is equivalent to ``install_requires``.
``Provides-Extra`` is used to name optional features (variants) and
a ``Requires-Dist`` with ``extra == 'foo'`` will list any
dependencies needed for that feature.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to setup.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The default build and install phases should be sufficient to install
most packages. However, you may want to pass additional flags to
either phase.
The default install phase should be sufficient to install most
packages. However, the installation instructions for a package may
suggest passing certain flags to the ``setup.py`` call. The
``PythonPackage`` class has two techniques for doing this.
You can view the available options for a particular phase with:
""""""""""""""
Global options
""""""""""""""
.. code-block:: console
$ python setup.py <phase> --help
Each phase provides a ``<phase_args>`` function that can be used to
pass arguments to that phase. For example,
`py-numpy <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-numpy/package.py>`_
adds:
These flags are added directly after ``setup.py`` when pip runs
``python setup.py install``. For example, the ``py-pyyaml`` package
has an optional dependency on ``libyaml`` that can be enabled like so:
.. code-block:: python
def build_args(self, spec, prefix):
args = []
def global_options(self, spec, prefix):
options = []
if '+libyaml' in spec:
options.append('--with-libyaml')
else:
options.append('--without-libyaml')
return options
# From NumPy 1.10.0 on it's possible to do a parallel build.
if self.version >= Version('1.10.0'):
# But Parallel build in Python 3.5+ is broken. See:
# https://github.com/spack/spack/issues/7927
# https://github.com/scipy/scipy/issues/7112
if spec['python'].version < Version('3.5'):
args = ['-j', str(make_jobs)]
return args
"""""""""""""""
Install options
"""""""""""""""
These flags are added directly after ``install`` when pip runs
``python setup.py install``. For example, the ``py-pyyaml`` package
allows you to specify the directories to search for ``libyaml``:
.. code-block:: python
def install_options(self, spec, prefix):
options = []
if '+libyaml' in spec:
options.extend([
spec['libyaml'].libs.search_flags,
spec['libyaml'].headers.include_flags,
])
return options
^^^^^^^
@@ -669,9 +448,9 @@ a "package" is a directory containing files like:
whereas a "module" is a single Python file.
The ``PythonPackage`` base class automatically detects these module
names for you. If, for whatever reason, the module names detected
are wrong, you can provide the names yourself by overriding
The ``PythonPackage`` base class automatically detects these package
and module names for you. If, for whatever reason, the module names
detected are wrong, you can provide the names yourself by overriding
``import_modules`` like so:
.. code-block:: python
@@ -692,10 +471,8 @@ This can be expressed like so:
@property
def import_modules(self):
modules = ['yaml']
if '+libyaml' in self.spec:
modules.append('yaml.cyaml')
return modules
@@ -713,8 +490,8 @@ Unit tests
""""""""""
The package may have its own unit or regression tests. Spack can
run these tests during the installation by adding phase-appropriate
test methods.
run these tests during the installation by adding test methods after
installation.
For example, ``py-numpy`` adds the following as a check to run
after the ``install`` phase:
@@ -740,34 +517,14 @@ when testing is enabled during the installation (i.e., ``spack install
Setup file in a sub-directory
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In order to be compatible with package managers like ``pip``, the package
is required to place its ``setup.py`` in the root of the tarball. However,
not every Python package cares about ``pip`` or PyPI. If you are installing
a package that is not hosted on PyPI, you may find that it places its
``setup.py`` in a sub-directory. To handle this, add the directory containing
``setup.py`` to the package like so:
Many C/C++ libraries provide optional Python bindings in a
subdirectory. To tell pip which directory to build from, you can
override the ``build_directory`` attribute. For example, if a package
provides Python bindings in a ``python`` directory, you can use:
.. code-block:: python
build_directory = 'source'
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Alternate names for setup.py
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
As previously mentioned, packages need to call their setup script ``setup.py``
in order to be compatible with package managers like ``pip``. However, some
packages like
`py-meep <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-meep/package.py>`_ and
`py-adios <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-adios/package.py>`_
come with multiple setup scripts, one for a serial build and another for a
parallel build. You can override the default name to use like so:
.. code-block:: python
def setup_file(self):
return 'setup-mpi.py' if '+mpi' in self.spec else 'setup.py'
build_directory = 'python'
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -781,10 +538,14 @@ on Python are not necessarily ``PythonPackage``'s.
Choosing a build system
"""""""""""""""""""""""
First of all, you need to select a build system. ``spack create`` usually
does this for you, but if for whatever reason you need to do this manually,
choose ``PythonPackage`` if and only if the package contains a ``setup.py``
file.
First of all, you need to select a build system. ``spack create``
usually does this for you, but if for whatever reason you need to do
this manually, choose ``PythonPackage`` if and only if the package
contains one of the following files:
* ``pyproject.toml``
* ``setup.py``
* ``setup.cfg``
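The file-based rule above can be sketched as a small check; this is an illustration of the rule, not Spack's actual implementation:

```python
# Sketch (not Spack's real code) of the rule above: a package is a
# PythonPackage candidate if and only if its source root contains one
# of these build files.
import os

PYTHON_BUILD_FILES = ('pyproject.toml', 'setup.py', 'setup.cfg')

def looks_like_python_package(source_root):
    return any(
        os.path.exists(os.path.join(source_root, name))
        for name in PYTHON_BUILD_FILES
    )
```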
"""""""""""""""""""""""
Choosing a package name
@@ -857,10 +618,9 @@ having to add that module to ``PYTHONPATH``.
When deciding between ``extends`` and ``depends_on``, the best rule of
thumb is to check the installation prefix. If Python libraries are
installed to ``prefix/lib/python2.7/site-packages`` (where 2.7 is the
MAJOR.MINOR version of Python you used to install the package), then
you should use ``extends``. If Python libraries are installed elsewhere
or the only files that get installed reside in ``prefix/bin``, then
installed to ``<prefix>/lib/pythonX.Y/site-packages``, then you
should use ``extends``. If Python libraries are installed elsewhere
or the only files that get installed reside in ``<prefix>/bin``, then
don't use ``extends``, as symlinking the package wouldn't be useful.
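The prefix inspection behind this rule of thumb can be sketched as follows; the helper name is hypothetical and exists only to illustrate the check:

```python
# Hypothetical helper illustrating the rule of thumb above: prefer
# extends() only when the package installs into a
# <prefix>/lib/pythonX.Y/site-packages layout.
import os

def installs_into_site_packages(prefix):
    lib = os.path.join(prefix, 'lib')
    if not os.path.isdir(lib):
        return False
    for entry in os.listdir(lib):
        if entry.startswith('python'):
            if os.path.isdir(os.path.join(lib, entry, 'site-packages')):
                return True
    return False
```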
^^^^^^^^^^^^^^^^^^^^^
@@ -893,4 +653,17 @@ External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on Python packaging, see:
https://packaging.python.org/
* https://packaging.python.org/
For more information on build and installation frontend tools, see:
* pip: https://pip.pypa.io/
* build: https://pypa-build.readthedocs.io/
* installer: https://installer.readthedocs.io/
For more information on build backend tools, see:
* setuptools: https://setuptools.pypa.io/
* flit: https://flit.readthedocs.io/
* poetry: https://python-poetry.org/

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -13,12 +13,16 @@ Spack has many configuration files. Here is a quick list of them, in
case you want to skip directly to specific docs:
* :ref:`compilers.yaml <compiler-config>`
* :ref:`concretizer.yaml <concretizer-options>`
* :ref:`config.yaml <config-yaml>`
* :ref:`mirrors.yaml <mirrors>`
* :ref:`modules.yaml <modules>`
* :ref:`packages.yaml <build-settings>`
* :ref:`repos.yaml <repositories>`
You can also add any of these as inline configuration in ``spack.yaml``
in an :ref:`environment <environment-configuration>`.
-----------
YAML Format
-----------

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -38,8 +38,7 @@ obtained by cloning the corresponding git repository:
.. code-block:: console
$ pwd
/home/user
$ cd ~/
$ mkdir tmp && cd tmp
$ git clone https://github.com/alalazo/spack-scripting.git
Cloning into 'spack-scripting'...
@@ -62,7 +61,7 @@ paths to ``config.yaml``. In the case of our example this means ensuring that:
config:
extensions:
- /home/user/tmp/spack-scripting
- ~/tmp/spack-scripting
is part of your configuration file. Once this is setup any command that the extension provides
will be available from the command line:

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -271,9 +271,10 @@ Compiler configuration
----------------------
Spack has the ability to build packages with multiple compilers and
compiler versions. Spack searches for compilers on your machine
automatically the first time it is run. It does this by inspecting
your ``PATH``.
compiler versions. Compilers can be made available to Spack by
specifying them manually in ``compilers.yaml``, or automatically by
running ``spack compiler find``, but for convenience Spack will
automatically detect compilers the first time it needs them.
.. _cmd-spack-compilers:
@@ -281,7 +282,7 @@ your ``PATH``.
``spack compilers``
^^^^^^^^^^^^^^^^^^^
You can see which compilers spack has found by running ``spack
You can see which compilers are available to Spack by running ``spack
compilers`` or ``spack compiler list``:
.. code-block:: console
@@ -320,9 +321,10 @@ An alias for ``spack compiler find``.
``spack compiler find``
^^^^^^^^^^^^^^^^^^^^^^^
If you do not see a compiler in this list, but you want to use it with
Spack, you can simply run ``spack compiler find`` with the path to
where the compiler is installed. For example:
Lists the compilers currently available to Spack. If you do not see
a compiler in this list, but you want to use it with Spack, you can
simply run ``spack compiler find`` with the path to where the
compiler is installed. For example:
.. code-block:: console

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -615,44 +615,39 @@ modifications to either ``CPATH`` or ``LIBRARY_PATH``.
Autoload dependencies
"""""""""""""""""""""
In some cases it can be useful to have module files that automatically load
their dependencies. This may be the case for Python extensions, if not
activated using ``spack activate``:
Often it is required for a module to have its (transitive) dependencies loaded as well.
One example where this is useful is when one package needs to use executables provided
by its dependency; when the dependency is autoloaded, the executable will be in the
PATH. Similarly for scripting languages such as Python, packages and their dependencies
have to be loaded together.
Autoloading is enabled by default for LMod, as it has great builtin support for it
through the ``depends_on`` function. For Environment Modules it is disabled by default.
Autoloading can also be enabled conditionally:
.. code-block:: yaml
modules:
default:
tcl:
^python:
autoload: 'direct'
modules:
default:
tcl:
all:
autoload: none
^python:
autoload: direct
The configuration file above will produce module files that load their
direct dependencies if the installed package depends on ``python``.
The allowed values for the ``autoload`` statement are either ``none``,
``direct`` or ``all``. The default is ``none``.
.. tip::
Building external software
Setting ``autoload`` to ``direct`` for all packages can be useful
when building software outside of a Spack installation that depends on
artifacts in that installation. E.g. (adjust ``lmod`` vs ``tcl``
as appropriate):
.. code-block:: yaml
modules:
default:
lmod:
all:
autoload: 'direct'
``direct`` or ``all``.
.. note::
TCL prerequisites
In the ``tcl`` section of the configuration file it is possible to use
the ``prerequisites`` directive that accepts the same values as
``autoload``. It will produce module files that have a ``prereq``
statement instead of automatically loading other modules.
statement, which can be used to autoload dependencies in some versions
of Environment Modules.
------------------------
Maintaining Module Files

View File

@@ -1,4 +1,4 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -705,7 +705,8 @@ as follows:
#. The following special strings are considered larger than any other
numeric or non-numeric version component, and satisfy the following
order between themselves: ``develop > main > master > head > trunk``.
order between themselves:
``develop > main > master > head > trunk > stable``.
#. Numbers are ordered numerically, are less than special strings, and
larger than other non-numeric components.
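The ordering rules above can be sketched as a sort key; this is a hedged illustration of the stated rules, not Spack's actual version-comparison code:

```python
# Sketch of the component-ordering rule above (not Spack's real code):
# special strings compare above numbers, which compare above other
# non-numeric components.
SPECIAL_STRINGS = ['stable', 'trunk', 'head', 'master', 'main', 'develop']

def component_key(component):
    if component in SPECIAL_STRINGS:
        # Largest tier, ordered among themselves: develop > main > ...
        return (2, SPECIAL_STRINGS.index(component))
    if isinstance(component, int):
        return (1, component)   # numbers: below special strings
    return (0, component)       # other non-numeric components: smallest
```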
@@ -1441,6 +1442,32 @@ The ``when`` clause follows the same syntax and accepts the same
values as the ``when`` argument of
:py:func:`spack.directives.depends_on`
^^^^^^^^^^^^^^^
Sticky Variants
^^^^^^^^^^^^^^^
The variant directive can be marked as ``sticky`` by setting to ``True`` the
corresponding argument:
.. code-block:: python
variant('bar', default=False, sticky=True)
A ``sticky`` variant differs from a regular one in that it is always set
to either:
#. An explicit value appearing in a spec literal or
#. Its default value
The concretizer thus is not free to pick an alternate value to work
around conflicts, but will error out instead.
Setting this property on a variant is useful in cases where the
variant allows some dangerous or controversial options (e.g. using unsupported versions
of a compiler for a library) and the packager wants to ensure that
allowing these options is done on purpose by the user, rather than
automatically by the solver.
^^^^^^^^^^^^^^^^^^^
Overriding Variants
^^^^^^^^^^^^^^^^^^^
@@ -2443,6 +2470,24 @@ Now, the ``py-numpy`` package can be used as an argument to ``spack
activate``. When it is activated, all the files in its prefix will be
symbolically linked into the prefix of the python package.
A package can only extend one other package at a time. To support packages
that may extend one of a list of other packages, Spack supports multiple
``extends`` directives as long as at most one of them is selected as
a dependency during concretization. For example, a lua package could extend
either lua or luajit, but not both:
.. code-block:: python
class LuaLpeg(Package):
...
variant('use_lua', default=True)
extends('lua', when='+use_lua')
extends('lua-luajit', when='~use_lua')
...
Now, a user can install, and activate, the ``lua-lpeg`` package for either
lua or luajit.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding additional constraints
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -2832,7 +2877,7 @@ be concretized on their system. For example, one user may prefer packages
built with OpenMPI and the Intel compiler. Another user may prefer
packages be built with MVAPICH and GCC.
See the :ref:`concretization-preferences` section for more details.
See the :ref:`package-preferences` section for more details.
.. _group_when_spec:

2
lib/spack/env/cc vendored
View File

@@ -1,7 +1,7 @@
#!/bin/sh
# shellcheck disable=SC2034 # evals in this script fool shellcheck
#
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -6,6 +6,13 @@
"""This module contains the following external, potentially separately
licensed, packages that are included in Spack:
altgraph
--------
* Homepage: https://altgraph.readthedocs.io/en/latest/index.html
* Usage: dependency of macholib
* Version: 0.17.2
archspec
--------
@@ -24,6 +31,22 @@
vendored copy ever needs to be updated again:
https://github.com/spack/spack/pull/6786/commits/dfcef577b77249106ea4e4c69a6cd9e64fa6c418
astunparse
----------------
* Homepage: https://github.com/simonpercivall/astunparse
* Usage: Unparsing Python ASTs for package hashes in Spack
* Version: 1.6.3 (plus modifications)
* Note: This is in ``spack.util.unparse`` because it's very heavily
modified, and we want to track coverage for it.
Specifically, we have modified this library to generate consistent unparsed ASTs
regardless of the Python version. It is based on:
1. The original ``astunparse`` library;
2. Modifications for consistency;
3. Backports from the ``ast.unparse`` function in Python 3.9 and later
The unparsing is now mostly consistent with upstream ``ast.unparse``, so if
we ever require Python 3.9 or higher, we can drop this external package.
attrs
----------------
@@ -68,6 +91,13 @@
* Version: 3.2.0 (last version before 2.7 and 3.6 support was dropped)
* Note: We don't include tests or benchmarks; just what Spack needs.
macholib
--------
* Homepage: https://macholib.readthedocs.io/en/latest/index.html#
* Usage: Manipulation of Mach-o binaries for relocating macOS buildcaches on Linux
* Version: 1.15.2
markupsafe
----------
@@ -124,18 +154,4 @@
* Usage: Python 2 and 3 compatibility utilities.
* Version: 1.16.0
macholib
--------
* Homepage: https://macholib.readthedocs.io/en/latest/index.html#
* Usage: Manipulation of Mach-o binaries for relocating macOS buildcaches on Linux
* Version: 1.12
altgraph
--------
* Homepage: https://altgraph.readthedocs.io/en/latest/index.html
* Usage: dependency of macholib
* Version: 0.16.1
"""

View File

@@ -1,4 +1,4 @@
'''
"""
altgraph.Dot - Interface to the dot language
============================================
@@ -107,7 +107,7 @@
- for more details on how to control the graph drawing process see the
`graphviz reference
<http://www.research.att.com/sw/tools/graphviz/refs.html>`_.
'''
"""
import os
import warnings
@@ -115,25 +115,34 @@
class Dot(object):
'''
"""
A class providing a **graphviz** (dot language) representation
allowing a fine grained control over how the graph is being
displayed.
If the :command:`dot` and :command:`dotty` programs are not in the current
system path their location needs to be specified in the constructor.
'''
"""
def __init__(
self, graph=None, nodes=None, edgefn=None, nodevisitor=None,
edgevisitor=None, name="G", dot='dot', dotty='dotty',
neato='neato', graphtype="digraph"):
'''
self,
graph=None,
nodes=None,
edgefn=None,
nodevisitor=None,
edgevisitor=None,
name="G",
dot="dot",
dotty="dotty",
neato="neato",
graphtype="digraph",
):
"""
Initialization.
'''
"""
self.name, self.attr = name, {}
assert graphtype in ['graph', 'digraph']
assert graphtype in ["graph", "digraph"]
self.type = graphtype
self.temp_dot = "tmp_dot.dot"
@@ -148,8 +157,10 @@ def __init__(
if graph is not None and nodes is None:
nodes = graph
if graph is not None and edgefn is None:
def edgefn(node, graph=graph):
return graph.out_nbrs(node)
if nodes is None:
nodes = ()
@@ -177,20 +188,19 @@ def edgefn(node, graph=graph):
self.edge_style(head, tail, **edgestyle)
def style(self, **attr):
'''
"""
Changes the overall style
'''
"""
self.attr = attr
def display(self, mode='dot'):
'''
def display(self, mode="dot"):
"""
Displays the current graph via dotty
'''
"""
if mode == 'neato':
if mode == "neato":
self.save_dot(self.temp_neo)
neato_cmd = "%s -o %s %s" % (
self.neato, self.temp_dot, self.temp_neo)
neato_cmd = "%s -o %s %s" % (self.neato, self.temp_dot, self.temp_neo)
os.system(neato_cmd)
else:
self.save_dot(self.temp_dot)
@@ -199,24 +209,24 @@ def display(self, mode='dot'):
os.system(plot_cmd)
def node_style(self, node, **kwargs):
'''
"""
Modifies a node style to the dot representation.
'''
"""
if node not in self.edges:
self.edges[node] = {}
self.nodes[node] = kwargs
def all_node_style(self, **kwargs):
'''
"""
Modifies all node styles
'''
"""
for node in self.nodes:
self.node_style(node, **kwargs)
def edge_style(self, head, tail, **kwargs):
'''
"""
Modifies an edge style to the dot representation.
'''
"""
if tail not in self.nodes:
raise GraphError("invalid node %s" % (tail,))
@@ -229,10 +239,10 @@ def edge_style(self, head, tail, **kwargs):
def iterdot(self):
# write graph title
if self.type == 'digraph':
yield 'digraph %s {\n' % (self.name,)
elif self.type == 'graph':
yield 'graph %s {\n' % (self.name,)
if self.type == "digraph":
yield "digraph %s {\n" % (self.name,)
elif self.type == "graph":
yield "graph %s {\n" % (self.name,)
else:
raise GraphError("unsupported graphtype %s" % (self.type,))
@@ -240,11 +250,11 @@ def iterdot(self):
# write overall graph attributes
for attr_name, attr_value in sorted(self.attr.items()):
yield '%s="%s";' % (attr_name, attr_value)
yield '\n'
yield "\n"
# some reusable patterns
cpatt = '%s="%s",' # to separate attributes
epatt = '];\n' # to end attributes
cpatt = '%s="%s",' # to separate attributes
epatt = "];\n" # to end attributes
# write node attributes
for node_name, node_attr in sorted(self.nodes.items()):
@@ -256,25 +266,24 @@ def iterdot(self):
# write edge attributes
for head in sorted(self.edges):
for tail in sorted(self.edges[head]):
if self.type == 'digraph':
if self.type == "digraph":
yield '\t"%s" -> "%s" [' % (head, tail)
else:
yield '\t"%s" -- "%s" [' % (head, tail)
for attr_name, attr_value in \
sorted(self.edges[head][tail].items()):
for attr_name, attr_value in sorted(self.edges[head][tail].items()):
yield cpatt % (attr_name, attr_value)
yield epatt
# finish file
yield '}\n'
yield "}\n"
def __iter__(self):
return self.iterdot()
def save_dot(self, file_name=None):
'''
"""
Saves the current graph representation into a file
'''
"""
if not file_name:
warnings.warn(DeprecationWarning, "always pass a file_name")
@@ -284,19 +293,18 @@ def save_dot(self, file_name=None):
for chunk in self.iterdot():
fp.write(chunk)
def save_img(self, file_name=None, file_type="gif", mode='dot'):
'''
def save_img(self, file_name=None, file_type="gif", mode="dot"):
"""
Saves the dot file as an image file
'''
"""
if not file_name:
warnings.warn(DeprecationWarning, "always pass a file_name")
file_name = "out"
if mode == 'neato':
if mode == "neato":
self.save_dot(self.temp_neo)
neato_cmd = "%s -o %s %s" % (
self.neato, self.temp_dot, self.temp_neo)
neato_cmd = "%s -o %s %s" % (self.neato, self.temp_dot, self.temp_neo)
os.system(neato_cmd)
plot_cmd = self.dot
else:
@@ -305,5 +313,9 @@ def save_img(self, file_name=None, file_type="gif", mode='dot'):
file_name = "%s.%s" % (file_name, file_type)
create_cmd = "%s -T%s %s -o %s" % (
plot_cmd, file_type, self.temp_dot, file_name)
plot_cmd,
file_type,
self.temp_dot,
file_name,
)
os.system(create_cmd)

View File

@@ -13,9 +13,10 @@
#--Nathan Denny, May 27, 1999
"""
from altgraph import GraphError
from collections import deque
from altgraph import GraphError
class Graph(object):
"""
@@ -58,8 +59,10 @@ def __init__(self, edges=None):
raise GraphError("Cannot create edge from %s" % (item,))
def __repr__(self):
return '<Graph: %d nodes, %d edges>' % (
self.number_of_nodes(), self.number_of_edges())
return "<Graph: %d nodes, %d edges>" % (
self.number_of_nodes(),
self.number_of_edges(),
)
def add_node(self, node, node_data=None):
"""
@@ -111,7 +114,7 @@ def add_edge(self, head_id, tail_id, edge_data=1, create_nodes=True):
self.nodes[tail_id][0].append(edge)
self.nodes[head_id][1].append(edge)
except KeyError:
raise GraphError('Invalid nodes %s -> %s' % (head_id, tail_id))
raise GraphError("Invalid nodes %s -> %s" % (head_id, tail_id))
# store edge information
self.edges[edge] = (head_id, tail_id, edge_data)
@@ -124,13 +127,12 @@ def hide_edge(self, edge):
time.
"""
try:
head_id, tail_id, edge_data = \
self.hidden_edges[edge] = self.edges[edge]
head_id, tail_id, edge_data = self.hidden_edges[edge] = self.edges[edge]
self.nodes[tail_id][0].remove(edge)
self.nodes[head_id][1].remove(edge)
del self.edges[edge]
except KeyError:
raise GraphError('Invalid edge %s' % edge)
raise GraphError("Invalid edge %s" % edge)
def hide_node(self, node):
"""
@@ -144,7 +146,7 @@ def hide_node(self, node):
self.hide_edge(edge)
del self.nodes[node]
except KeyError:
raise GraphError('Invalid node %s' % node)
raise GraphError("Invalid node %s" % node)
def restore_node(self, node):
"""
@@ -157,7 +159,7 @@ def restore_node(self, node):
self.restore_edge(edge)
del self.hidden_nodes[node]
except KeyError:
raise GraphError('Invalid node %s' % node)
raise GraphError("Invalid node %s" % node)
def restore_edge(self, edge):
"""
@@ -170,7 +172,7 @@ def restore_edge(self, edge):
self.edges[edge] = head_id, tail_id, data
del self.hidden_edges[edge]
except KeyError:
raise GraphError('Invalid edge %s' % edge)
raise GraphError("Invalid edge %s" % edge)
def restore_all_edges(self):
"""
@@ -203,7 +205,7 @@ def edge_by_id(self, edge):
head, tail, data = self.edges[edge]
except KeyError:
head, tail = None, None
raise GraphError('Invalid edge %s' % edge)
raise GraphError("Invalid edge %s" % edge)
return (head, tail)
@@ -339,7 +341,7 @@ def out_edges(self, node):
try:
return list(self.nodes[node][1])
except KeyError:
raise GraphError('Invalid node %s' % node)
raise GraphError("Invalid node %s" % node)
def inc_edges(self, node):
"""
@@ -348,7 +350,7 @@ def inc_edges(self, node):
try:
return list(self.nodes[node][0])
except KeyError:
raise GraphError('Invalid node %s' % node)
raise GraphError("Invalid node %s" % node)
def all_edges(self, node):
"""
@@ -488,7 +490,7 @@ def iterdfs(self, start, end=None, forward=True):
The forward parameter specifies whether it is a forward or backward
traversal.
"""
visited, stack = set([start]), deque([start])
visited, stack = {start}, deque([start])
if forward:
get_edges = self.out_edges
@@ -515,7 +517,7 @@ def iterdata(self, start, end=None, forward=True, condition=None):
condition callback is only called when node_data is not None.
"""
visited, stack = set([start]), deque([start])
visited, stack = {start}, deque([start])
if forward:
get_edges = self.out_edges
@@ -547,7 +549,7 @@ def _iterbfs(self, start, end=None, forward=True):
traversal. Returns a list of tuples where the first value is the hop
value the second value is the node id.
"""
queue, visited = deque([(start, 0)]), set([start])
queue, visited = deque([(start, 0)]), {start}
# the direction of the bfs depends on the edges that are sampled
if forward:

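Editor's note: the repeated `set([start])` → `{start}` rewrites in the hunks above are purely cosmetic; a set literal builds the same object without the intermediate list. A minimal check (not part of the diff):

```python
start = "a"
# Old spelling: wraps the element in a list, then calls set()
old_style = set([start])
# New spelling: set literal, same resulting set
new_style = {start}
assert old_style == new_style and type(new_style) is set
```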

@@ -1,7 +1,7 @@
'''
"""
altgraph.GraphAlgo - Graph algorithms
=====================================
'''
"""
from altgraph import GraphError
@@ -28,9 +28,9 @@ def dijkstra(graph, start, end=None):
Adapted to altgraph by Istvan Albert, Pennsylvania State University -
June, 9 2004
"""
D = {} # dictionary of final distances
P = {} # dictionary of predecessors
Q = _priorityDictionary() # estimated distances of non-final vertices
D = {} # dictionary of final distances
P = {} # dictionary of predecessors
Q = _priorityDictionary() # estimated distances of non-final vertices
Q[start] = 0
for v in Q:
@@ -44,7 +44,8 @@ def dijkstra(graph, start, end=None):
if w in D:
if vwLength < D[w]:
raise GraphError(
"Dijkstra: found better path to already-final vertex")
"Dijkstra: found better path to already-final vertex"
)
elif w not in Q or vwLength < Q[w]:
Q[w] = vwLength
P[w] = v
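Editor's note: the `dijkstra` hunk above only reformats the comments around `D` (final distances), `P` (predecessors) and `Q` (tentative distances). For reference, the same relaxation logic can be sketched with the standard library's `heapq` in place of `_priorityDictionary`; the adjacency-dict graph shape used here is an assumption for illustration, not altgraph's API:

```python
import heapq

def dijkstra_sketch(adj, start):
    # adj: {node: {neighbor: edge_length}}; returns final distances D.
    D, heap = {}, [(0, start)]
    while heap:
        d, v = heapq.heappop(heap)
        if v in D:
            continue  # distance to v already final
        D[v] = d
        for w, length in adj[v].items():
            if w not in D:
                heapq.heappush(heap, (d + length, w))
    return D

assert dijkstra_sketch({"a": {"b": 1, "c": 4}, "b": {"c": 1}, "c": {}}, "a") == {
    "a": 0, "b": 1, "c": 2
}
```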
@@ -76,7 +77,7 @@ def shortest_path(graph, start, end):
# Utility classes and functions
#
class _priorityDictionary(dict):
'''
"""
Priority dictionary using binary heaps (internal use only)
David Eppstein, UC Irvine, 8 Mar 2002
@@ -92,22 +93,22 @@ class _priorityDictionary(dict):
order. Each item is not removed until the next item is requested,
so D[x] will still return a useful value until the next iteration
of the for-loop. Each operation takes logarithmic amortized time.
'''
"""
def __init__(self):
'''
"""
Initialize priorityDictionary by creating binary heap of pairs
(value,key). Note that changing or removing a dict entry will not
remove the old pair from the heap until it is found by smallest()
or until the heap is rebuilt.
'''
"""
self.__heap = []
dict.__init__(self)
def smallest(self):
'''
"""
Find smallest item after removing deleted items from front of heap.
'''
"""
if len(self) == 0:
raise IndexError("smallest of empty priorityDictionary")
heap = self.__heap
@@ -115,9 +116,11 @@ def smallest(self):
lastItem = heap.pop()
insertionPoint = 0
while 1:
smallChild = 2*insertionPoint+1
if smallChild+1 < len(heap) and \
heap[smallChild] > heap[smallChild+1]:
smallChild = 2 * insertionPoint + 1
if (
smallChild + 1 < len(heap)
and heap[smallChild] > heap[smallChild + 1]
):
smallChild += 1
if smallChild >= len(heap) or lastItem <= heap[smallChild]:
heap[insertionPoint] = lastItem
@@ -127,22 +130,24 @@ def smallest(self):
return heap[0][1]
def __iter__(self):
'''
"""
Create destructive sorted iterator of priorityDictionary.
'''
"""
def iterfn():
while len(self) > 0:
x = self.smallest()
yield x
del self[x]
return iterfn()
def __setitem__(self, key, val):
'''
"""
Change value stored in dictionary and add corresponding pair to heap.
Rebuilds the heap if the number of deleted items gets large, to avoid
memory leakage.
'''
"""
dict.__setitem__(self, key, val)
heap = self.__heap
if len(heap) > 2 * len(self):
@@ -152,15 +157,15 @@ def __setitem__(self, key, val):
newPair = (val, key)
insertionPoint = len(heap)
heap.append(None)
while insertionPoint > 0 and newPair < heap[(insertionPoint-1)//2]:
heap[insertionPoint] = heap[(insertionPoint-1)//2]
insertionPoint = (insertionPoint-1)//2
while insertionPoint > 0 and newPair < heap[(insertionPoint - 1) // 2]:
heap[insertionPoint] = heap[(insertionPoint - 1) // 2]
insertionPoint = (insertionPoint - 1) // 2
heap[insertionPoint] = newPair
def setdefault(self, key, val):
'''
"""
Reimplement setdefault to pass through our customized __setitem__.
'''
"""
if key not in self:
self[key] = val
return self[key]
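Editor's note: the reformatted `__setitem__` loop above is a textbook heap sift-up with `(insertionPoint - 1) // 2` parent indexing. Isolated as a standalone sketch (not the class method itself), it looks like this:

```python
def sift_up(heap, new_pair):
    # Bubble new_pair up while it is smaller than its parent,
    # mirroring the (insertionPoint - 1) // 2 indexing in the hunk above.
    insertion_point = len(heap)
    heap.append(None)
    while insertion_point > 0 and new_pair < heap[(insertion_point - 1) // 2]:
        heap[insertion_point] = heap[(insertion_point - 1) // 2]
        insertion_point = (insertion_point - 1) // 2
    heap[insertion_point] = new_pair

heap = []
for pair in [(3, "c"), (1, "a"), (2, "b")]:
    sift_up(heap, pair)
assert heap[0] == (1, "a")  # smallest (value, key) pair sits at the root
```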


@@ -1,11 +1,11 @@
'''
"""
altgraph.GraphStat - Functions providing various graph statistics
=================================================================
'''
"""
def degree_dist(graph, limits=(0, 0), bin_num=10, mode='out'):
'''
def degree_dist(graph, limits=(0, 0), bin_num=10, mode="out"):
"""
Computes the degree distribution for a graph.
Returns a list of tuples where the first element of the tuple is the
@@ -15,10 +15,10 @@ def degree_dist(graph, limits=(0, 0), bin_num=10, mode='out'):
Example::
....
'''
"""
deg = []
if mode == 'inc':
if mode == "inc":
get_deg = graph.inc_degree
else:
get_deg = graph.out_degree
@@ -34,38 +34,38 @@ def degree_dist(graph, limits=(0, 0), bin_num=10, mode='out'):
return results
_EPS = 1.0/(2.0**32)
_EPS = 1.0 / (2.0 ** 32)
def _binning(values, limits=(0, 0), bin_num=10):
'''
"""
Bins data that falls between certain limits, if the limits are (0, 0) the
minimum and maximum values are used.
Returns a list of tuples where the first element of the tuple is the
center of the bin and the second element of the tuple are the counts.
'''
"""
if limits == (0, 0):
min_val, max_val = min(values) - _EPS, max(values) + _EPS
else:
min_val, max_val = limits
# get bin size
bin_size = (max_val - min_val)/float(bin_num)
bin_size = (max_val - min_val) / float(bin_num)
bins = [0] * (bin_num)
# will ignore these outliers for now
for value in values:
try:
if (value - min_val) >= 0:
index = int((value - min_val)/float(bin_size))
index = int((value - min_val) / float(bin_size))
bins[index] += 1
except IndexError:
pass
# make it ready for an x,y plot
result = []
center = (bin_size/2) + min_val
center = (bin_size / 2) + min_val
for i, y in enumerate(bins):
x = center + bin_size * i
result.append((x, y))
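Editor's note: for context, the `_binning` helper touched above widens the range by `_EPS`, drops values below `min_val`, silently ignores out-of-range indices, and reports each bin by its center. A self-contained replica (with the `IndexError` guard kept) behaves like this:

```python
_EPS = 1.0 / (2.0 ** 32)

def binning(values, limits=(0, 0), bin_num=10):
    # Replica of altgraph.GraphStat._binning as reformatted above.
    if limits == (0, 0):
        min_val, max_val = min(values) - _EPS, max(values) + _EPS
    else:
        min_val, max_val = limits
    bin_size = (max_val - min_val) / float(bin_num)
    bins = [0] * bin_num
    for value in values:
        try:
            if (value - min_val) >= 0:
                bins[int((value - min_val) / float(bin_size))] += 1
        except IndexError:
            pass  # outliers beyond the last bin are ignored
    center = (bin_size / 2) + min_val
    return [(center + bin_size * i, count) for i, count in enumerate(bins)]

# Four values split evenly across two bins
assert [count for _x, count in binning([1, 1, 2, 2], bin_num=2)] == [2, 2]
```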


@@ -1,31 +1,29 @@
'''
"""
altgraph.GraphUtil - Utility classes and functions
==================================================
'''
"""
import random
from collections import deque
from altgraph import Graph
from altgraph import GraphError
from altgraph import Graph, GraphError
def generate_random_graph(
node_num, edge_num, self_loops=False, multi_edges=False):
'''
def generate_random_graph(node_num, edge_num, self_loops=False, multi_edges=False):
"""
Generates and returns a :py:class:`~altgraph.Graph.Graph` instance with
*node_num* nodes randomly connected by *edge_num* edges.
'''
"""
g = Graph.Graph()
if not multi_edges:
if self_loops:
max_edges = node_num * node_num
else:
max_edges = node_num * (node_num-1)
max_edges = node_num * (node_num - 1)
if edge_num > max_edges:
raise GraphError(
"inconsistent arguments to 'generate_random_graph'")
raise GraphError("inconsistent arguments to 'generate_random_graph'")
nodes = range(node_num)
@@ -52,17 +50,16 @@ def generate_random_graph(
return g
def generate_scale_free_graph(
steps, growth_num, self_loops=False, multi_edges=False):
'''
def generate_scale_free_graph(steps, growth_num, self_loops=False, multi_edges=False):
"""
Generates and returns a :py:class:`~altgraph.Graph.Graph` instance that
will have *steps* \* *growth_num* nodes and a scale free (powerlaw)
will have *steps* \\* *growth_num* nodes and a scale free (powerlaw)
connectivity. Starting with a fully connected graph with *growth_num*
nodes at every step *growth_num* nodes are added to the graph and are
connected to existing nodes with a probability proportional to the degree
of these existing nodes.
'''
# FIXME: The code doesn't seem to do what the documentation claims.
"""
# The code doesn't seem to do what the documentation claims.
graph = Graph.Graph()
# initialize the graph
@@ -113,7 +110,7 @@ def filter_stack(graph, head, filters):
in *removes*.
"""
visited, removes, orphans = set([head]), set(), set()
visited, removes, orphans = {head}, set(), set()
stack = deque([(head, head)])
get_data = graph.node_data
get_edges = graph.out_edges
@@ -137,8 +134,6 @@ def filter_stack(graph, head, filters):
visited.add(tail)
stack.append((last_good, tail))
orphans = [
(lg, tl)
for (lg, tl) in orphans if tl not in removes]
orphans = [(lg, tl) for (lg, tl) in orphans if tl not in removes]
return visited, removes, orphans


@@ -27,7 +27,7 @@ def __init__(self, graph=None, debug=0):
graph.add_node(self, None)
def __repr__(self):
return '<%s>' % (type(self).__name__,)
return "<%s>" % (type(self).__name__,)
def flatten(self, condition=None, start=None):
"""
@@ -58,6 +58,7 @@ def iter_edges(lst, n):
if ident not in seen:
yield self.findNode(ident)
seen.add(ident)
return iter_edges(outraw, 3), iter_edges(incraw, 2)
def edgeData(self, fromNode, toNode):
@@ -87,12 +88,12 @@ def filterStack(self, filters):
visited, removes, orphans = filter_stack(self.graph, self, filters)
for last_good, tail in orphans:
self.graph.add_edge(last_good, tail, edge_data='orphan')
self.graph.add_edge(last_good, tail, edge_data="orphan")
for node in removes:
self.graph.hide_node(node)
return len(visited)-1, len(removes), len(orphans)
return len(visited) - 1, len(removes), len(orphans)
def removeNode(self, node):
"""
@@ -135,7 +136,7 @@ def getRawIdent(self, node):
"""
if node is self:
return node
ident = getattr(node, 'graphident', None)
ident = getattr(node, "graphident", None)
return ident
def __contains__(self, node):
@@ -192,8 +193,7 @@ def msg(self, level, s, *args):
Print a debug message with the given level
"""
if s and level <= self.debug:
print("%s%s %s" % (
" " * self.indent, s, ' '.join(map(repr, args))))
print("%s%s %s" % (" " * self.indent, s, " ".join(map(repr, args))))
def msgin(self, level, s, *args):
"""


@@ -1,4 +1,4 @@
'''
"""
altgraph - a python graph library
=================================
@@ -138,13 +138,11 @@
@newfield contributor: Contributors:
@contributor: U{Reka Albert <http://www.phys.psu.edu/~ralbert/>}
'''
# import pkg_resources
# __version__ = pkg_resources.require('altgraph')[0].version
# pkg_resources is not finding the altgraph import despite the fact that it is in sys.path
# there is no .dist-info or .egg-info for pkg_resources to query the version from
# so it must be set manually
__version__ = '0.16.1'
"""
import pkg_resources
__version__ = pkg_resources.require("altgraph")[0].version
class GraphError(ValueError):
pass


@@ -3,21 +3,43 @@
"""
from __future__ import print_function
import sys
import struct
import os
from .mach_o import MH_FILETYPE_SHORTNAMES, LC_DYSYMTAB, LC_SYMTAB
from .mach_o import load_command, S_ZEROFILL, section_64, section
from .mach_o import LC_REGISTRY, LC_ID_DYLIB, LC_SEGMENT, fat_header
from .mach_o import LC_SEGMENT_64, MH_CIGAM_64, MH_MAGIC_64, FAT_MAGIC
from .mach_o import mach_header, fat_arch64, FAT_MAGIC_64, fat_arch
from .mach_o import LC_REEXPORT_DYLIB, LC_PREBOUND_DYLIB, LC_LOAD_WEAK_DYLIB
from .mach_o import LC_LOAD_UPWARD_DYLIB, LC_LOAD_DYLIB, mach_header_64
from .mach_o import MH_CIGAM, MH_MAGIC
from .ptypes import sizeof
import struct
import sys
from macholib.util import fileview
from .mach_o import (
FAT_MAGIC,
FAT_MAGIC_64,
LC_DYSYMTAB,
LC_ID_DYLIB,
LC_LOAD_DYLIB,
LC_LOAD_UPWARD_DYLIB,
LC_LOAD_WEAK_DYLIB,
LC_PREBOUND_DYLIB,
LC_REEXPORT_DYLIB,
LC_REGISTRY,
LC_SEGMENT,
LC_SEGMENT_64,
LC_SYMTAB,
MH_CIGAM,
MH_CIGAM_64,
MH_FILETYPE_SHORTNAMES,
MH_MAGIC,
MH_MAGIC_64,
S_ZEROFILL,
fat_arch,
fat_arch64,
fat_header,
load_command,
mach_header,
mach_header_64,
section,
section_64,
)
from .ptypes import sizeof
try:
from macholib.compat import bytes
except ImportError:
@@ -31,23 +53,23 @@
if sys.version_info[0] == 2:
range = xrange # noqa: F821
__all__ = ['MachO']
__all__ = ["MachO"]
_RELOCATABLE = set((
_RELOCATABLE = {
# relocatable commands that should be used for dependency walking
LC_LOAD_DYLIB,
LC_LOAD_UPWARD_DYLIB,
LC_LOAD_WEAK_DYLIB,
LC_PREBOUND_DYLIB,
LC_REEXPORT_DYLIB,
))
}
_RELOCATABLE_NAMES = {
LC_LOAD_DYLIB: 'load_dylib',
LC_LOAD_UPWARD_DYLIB: 'load_upward_dylib',
LC_LOAD_WEAK_DYLIB: 'load_weak_dylib',
LC_PREBOUND_DYLIB: 'prebound_dylib',
LC_REEXPORT_DYLIB: 'reexport_dylib',
LC_LOAD_DYLIB: "load_dylib",
LC_LOAD_UPWARD_DYLIB: "load_upward_dylib",
LC_LOAD_WEAK_DYLIB: "load_weak_dylib",
LC_PREBOUND_DYLIB: "prebound_dylib",
LC_REEXPORT_DYLIB: "reexport_dylib",
}
@@ -65,13 +87,14 @@ def lc_str_value(offset, cmd_info):
cmd_load, cmd_cmd, cmd_data = cmd_info
offset -= sizeof(cmd_load) + sizeof(cmd_cmd)
return cmd_data[offset:].strip(b'\x00')
return cmd_data[offset:].strip(b"\x00")
class MachO(object):
"""
Provides reading/writing the Mach-O header of a specific existing file
"""
# filename - the original filename of this mach-o
# sizediff - the current deviation from the initial mach-o size
# header - the mach-o header
@@ -91,7 +114,7 @@ def __init__(self, filename):
# initialized by load
self.fat = None
self.headers = []
with open(filename, 'rb') as fp:
with open(filename, "rb") as fp:
self.load(fp)
def __repr__(self):
@@ -99,7 +122,7 @@ def __repr__(self):
def load(self, fh):
assert fh.tell() == 0
header = struct.unpack('>I', fh.read(4))[0]
header = struct.unpack(">I", fh.read(4))[0]
fh.seek(0)
if header in (FAT_MAGIC, FAT_MAGIC_64):
self.load_fat(fh)
@@ -112,11 +135,9 @@ def load(self, fh):
def load_fat(self, fh):
self.fat = fat_header.from_fileobj(fh)
if self.fat.magic == FAT_MAGIC:
archs = [fat_arch.from_fileobj(fh)
for i in range(self.fat.nfat_arch)]
archs = [fat_arch.from_fileobj(fh) for i in range(self.fat.nfat_arch)]
elif self.fat.magic == FAT_MAGIC_64:
archs = [fat_arch64.from_fileobj(fh)
for i in range(self.fat.nfat_arch)]
archs = [fat_arch64.from_fileobj(fh) for i in range(self.fat.nfat_arch)]
else:
raise ValueError("Unknown fat header magic: %r" % (self.fat.magic))
@@ -132,19 +153,18 @@ def rewriteLoadCommands(self, *args, **kw):
def load_header(self, fh, offset, size):
fh.seek(offset)
header = struct.unpack('>I', fh.read(4))[0]
header = struct.unpack(">I", fh.read(4))[0]
fh.seek(offset)
if header == MH_MAGIC:
magic, hdr, endian = MH_MAGIC, mach_header, '>'
magic, hdr, endian = MH_MAGIC, mach_header, ">"
elif header == MH_CIGAM:
magic, hdr, endian = MH_CIGAM, mach_header, '<'
magic, hdr, endian = MH_CIGAM, mach_header, "<"
elif header == MH_MAGIC_64:
magic, hdr, endian = MH_MAGIC_64, mach_header_64, '>'
magic, hdr, endian = MH_MAGIC_64, mach_header_64, ">"
elif header == MH_CIGAM_64:
magic, hdr, endian = MH_CIGAM_64, mach_header_64, '<'
magic, hdr, endian = MH_CIGAM_64, mach_header_64, "<"
else:
raise ValueError("Unknown Mach-O header: 0x%08x in %r" % (
header, fh))
raise ValueError("Unknown Mach-O header: 0x%08x in %r" % (header, fh))
hdr = MachOHeader(self, fh, offset, size, magic, hdr, endian)
self.headers.append(hdr)
@@ -157,6 +177,7 @@ class MachOHeader(object):
"""
Provides reading/writing the Mach-O header of a specific existing file
"""
# filename - the original filename of this mach-o
# sizediff - the current deviation from the initial mach-o size
# header - the mach-o header
@@ -189,15 +210,19 @@ def __init__(self, parent, fh, offset, size, magic, hdr, endian):
def __repr__(self):
return "<%s filename=%r offset=%d size=%d endian=%r>" % (
type(self).__name__, self.parent.filename, self.offset, self.size,
self.endian)
type(self).__name__,
self.parent.filename,
self.offset,
self.size,
self.endian,
)
def load(self, fh):
fh = fileview(fh, self.offset, self.size)
fh.seek(0)
self.sizediff = 0
kw = {'_endian_': self.endian}
kw = {"_endian_": self.endian}
header = self.mach_header.from_fileobj(fh, **kw)
self.header = header
# if header.magic != self.MH_MAGIC:
@@ -236,8 +261,9 @@ def load(self, fh):
section_cls = section_64
expected_size = (
sizeof(klass) + sizeof(load_command) +
(sizeof(section_cls) * cmd_cmd.nsects)
sizeof(klass)
+ sizeof(load_command)
+ (sizeof(section_cls) * cmd_cmd.nsects)
)
if cmd_load.cmdsize != expected_size:
raise ValueError("Segment size mismatch")
@@ -248,12 +274,12 @@ def load(self, fh):
low_offset = min(low_offset, cmd_cmd.fileoff)
else:
# this one has multiple segments
for j in range(cmd_cmd.nsects):
for _j in range(cmd_cmd.nsects):
# read the segment
seg = section_cls.from_fileobj(fh, **kw)
# if the segment has a size and is not zero filled
# then its beginning is the offset of this segment
not_zerofill = ((seg.flags & S_ZEROFILL) != S_ZEROFILL)
not_zerofill = (seg.flags & S_ZEROFILL) != S_ZEROFILL
if seg.offset > 0 and seg.size > 0 and not_zerofill:
low_offset = min(low_offset, seg.offset)
if not_zerofill:
@@ -266,7 +292,7 @@ def load(self, fh):
# data is a list of segments
cmd_data = segs
# XXX: Disabled for now because writing back doesn't work
# These are disabled for now because writing back doesn't work
# elif cmd_load.cmd == LC_CODE_SIGNATURE:
# c = fh.tell()
# fh.seek(cmd_cmd.dataoff)
@@ -280,17 +306,17 @@ def load(self, fh):
else:
# data is a raw str
data_size = (
cmd_load.cmdsize - sizeof(klass) - sizeof(load_command)
)
data_size = cmd_load.cmdsize - sizeof(klass) - sizeof(load_command)
cmd_data = fh.read(data_size)
cmd.append((cmd_load, cmd_cmd, cmd_data))
read_bytes += cmd_load.cmdsize
# make sure the header made sense
if read_bytes != header.sizeofcmds:
raise ValueError("Read %d bytes, header reports %d bytes" % (
read_bytes, header.sizeofcmds))
raise ValueError(
"Read %d bytes, header reports %d bytes"
% (read_bytes, header.sizeofcmds)
)
self.total_size = sizeof(self.mach_header) + read_bytes
self.low_offset = low_offset
@@ -303,8 +329,9 @@ def walkRelocatables(self, shouldRelocateCommand=_shouldRelocateCommand):
if shouldRelocateCommand(lc.cmd):
name = _RELOCATABLE_NAMES[lc.cmd]
ofs = cmd.name - sizeof(lc.__class__) - sizeof(cmd.__class__)
yield idx, name, data[ofs:data.find(b'\x00', ofs)].decode(
sys.getfilesystemencoding())
yield idx, name, data[
ofs : data.find(b"\x00", ofs) # noqa: E203
].decode(sys.getfilesystemencoding())
def rewriteInstallNameCommand(self, loadcmd):
"""Rewrite the load command of this dylib"""
@@ -317,8 +344,9 @@ def changedHeaderSizeBy(self, bytes):
self.sizediff += bytes
if (self.total_size + self.sizediff) > self.low_offset:
print(
"WARNING: Mach-O header in %r may be too large to relocate" % (
self.parent.filename,))
"WARNING: Mach-O header in %r may be too large to relocate"
% (self.parent.filename,)
)
def rewriteLoadCommands(self, changefunc):
"""
@@ -327,22 +355,22 @@ def rewriteLoadCommands(self, changefunc):
data = changefunc(self.parent.filename)
changed = False
if data is not None:
if self.rewriteInstallNameCommand(
data.encode(sys.getfilesystemencoding())):
if self.rewriteInstallNameCommand(data.encode(sys.getfilesystemencoding())):
changed = True
for idx, name, filename in self.walkRelocatables():
for idx, _name, filename in self.walkRelocatables():
data = changefunc(filename)
if data is not None:
if self.rewriteDataForCommand(idx, data.encode(
sys.getfilesystemencoding())):
if self.rewriteDataForCommand(
idx, data.encode(sys.getfilesystemencoding())
):
changed = True
return changed
def rewriteDataForCommand(self, idx, data):
lc, cmd, old_data = self.commands[idx]
hdrsize = sizeof(lc.__class__) + sizeof(cmd.__class__)
align = struct.calcsize('Q')
data = data + (b'\x00' * (align - (len(data) % align)))
align = struct.calcsize("Q")
data = data + (b"\x00" * (align - (len(data) % align)))
newsize = hdrsize + len(data)
self.commands[idx] = (lc, cmd, data)
self.changedHeaderSizeBy(newsize - lc.cmdsize)
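Editor's note: `rewriteDataForCommand` pads the new load-command data to an 8-byte boundary (`struct.calcsize("Q")`). One behavior worth noting, unchanged by the reformat: when `len(data)` is already aligned, the expression appends a full extra quadword of NULs. A minimal sketch of the same padding expression:

```python
import struct

ALIGN = struct.calcsize("Q")  # 8 on common platforms

def pad(data, align=ALIGN):
    # Same expression as in rewriteDataForCommand above.
    return data + (b"\x00" * (align - (len(data) % align)))

assert len(pad(b"abc")) == 8        # 3 bytes -> padded to 8
assert len(pad(b"12345678")) == 16  # already aligned: a full extra quadword
```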
@@ -352,10 +380,17 @@ def rewriteDataForCommand(self, idx, data):
def synchronize_size(self):
if (self.total_size + self.sizediff) > self.low_offset:
raise ValueError(
("New Mach-O header is too large to relocate in %r "
"(new size=%r, max size=%r, delta=%r)") % (
self.parent.filename, self.total_size + self.sizediff,
self.low_offset, self.sizediff))
(
"New Mach-O header is too large to relocate in %r "
"(new size=%r, max size=%r, delta=%r)"
)
% (
self.parent.filename,
self.total_size + self.sizediff,
self.low_offset,
self.sizediff,
)
)
self.header.sizeofcmds += self.sizediff
self.total_size = sizeof(self.mach_header) + self.header.sizeofcmds
self.sizediff = 0
@@ -396,16 +431,16 @@ def write(self, fileobj):
# zero out the unused space, doubt this is strictly necessary
# and is generally probably already the case
fileobj.write(b'\x00' * (self.low_offset - fileobj.tell()))
fileobj.write(b"\x00" * (self.low_offset - fileobj.tell()))
def getSymbolTableCommand(self):
for lc, cmd, data in self.commands:
for lc, cmd, _data in self.commands:
if lc.cmd == LC_SYMTAB:
return cmd
return None
def getDynamicSymbolTableCommand(self):
for lc, cmd, data in self.commands:
for lc, cmd, _data in self.commands:
if lc.cmd == LC_DYSYMTAB:
return cmd
return None
@@ -414,22 +449,23 @@ def get_filetype_shortname(self, filetype):
if filetype in MH_FILETYPE_SHORTNAMES:
return MH_FILETYPE_SHORTNAMES[filetype]
else:
return 'unknown'
return "unknown"
def main(fn):
m = MachO(fn)
seen = set()
for header in m.headers:
for idx, name, other in header.walkRelocatables():
for _idx, name, other in header.walkRelocatables():
if other not in seen:
seen.add(other)
print('\t' + name + ": " + other)
print("\t" + name + ": " + other)
if __name__ == '__main__':
if __name__ == "__main__":
import sys
files = sys.argv[1:] or ['/bin/ls']
files = sys.argv[1:] or ["/bin/ls"]
for fn in files:
print(fn)
main(fn)
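Editor's note: `load` and `load_header` above dispatch on the first four bytes, read big-endian with `struct.unpack(">I", ...)`; a byte-swapped ("CIGAM") magic signals a little-endian file. The logic reduces to a small table; the constants are restated here to keep the sketch self-contained (they are defined in `macholib.mach_o`), and this function is an illustration, not the module's API:

```python
import struct

# Mach-O magic numbers, as defined in macholib.mach_o / <mach-o/loader.h>
MH_MAGIC, MH_CIGAM = 0xFEEDFACE, 0xCEFAEDFE
MH_MAGIC_64, MH_CIGAM_64 = 0xFEEDFACF, 0xCFFAEDFE

def detect_endian(first_four_bytes):
    # Mirrors load_header: a swapped magic means the file's byte order
    # is the opposite of the big-endian read.
    (magic,) = struct.unpack(">I", first_four_bytes)
    if magic in (MH_MAGIC, MH_MAGIC_64):
        return ">"
    if magic in (MH_CIGAM, MH_CIGAM_64):
        return "<"
    raise ValueError("Unknown Mach-O header: 0x%08x" % magic)

assert detect_endian(struct.pack(">I", MH_MAGIC_64)) == ">"
assert detect_endian(struct.pack(">I", MH_CIGAM)) == "<"
```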


@@ -8,10 +8,10 @@
from altgraph.ObjectGraph import ObjectGraph
from macholib.dyld import dyld_find
from macholib.MachO import MachO
from macholib.itergraphreport import itergraphreport
from macholib.MachO import MachO
__all__ = ['MachOGraph']
__all__ = ["MachOGraph"]
try:
unicode
@@ -25,13 +25,14 @@ def __init__(self, filename):
self.headers = ()
def __repr__(self):
return '<%s graphident=%r>' % (type(self).__name__, self.graphident)
return "<%s graphident=%r>" % (type(self).__name__, self.graphident)
class MachOGraph(ObjectGraph):
"""
Graph data structure of Mach-O dependencies
"""
def __init__(self, debug=0, graph=None, env=None, executable_path=None):
super(MachOGraph, self).__init__(debug=debug, graph=graph)
self.env = env
@@ -41,16 +42,18 @@ def __init__(self, debug=0, graph=None, env=None, executable_path=None):
def locate(self, filename, loader=None):
if not isinstance(filename, (str, unicode)):
raise TypeError("%r is not a string" % (filename,))
if filename.startswith('@loader_path/') and loader is not None:
if filename.startswith("@loader_path/") and loader is not None:
fn = self.trans_table.get((loader.filename, filename))
if fn is None:
loader_path = loader.loader_path
try:
fn = dyld_find(
filename, env=self.env,
filename,
env=self.env,
executable_path=self.executable_path,
loader_path=loader_path)
loader_path=loader_path,
)
self.trans_table[(loader.filename, filename)] = fn
except ValueError:
return None
@@ -60,8 +63,8 @@ def locate(self, filename, loader=None):
if fn is None:
try:
fn = dyld_find(
filename, env=self.env,
executable_path=self.executable_path)
filename, env=self.env, executable_path=self.executable_path
)
self.trans_table[filename] = fn
except ValueError:
return None
@@ -83,11 +86,11 @@ def run_file(self, pathname, caller=None):
m = self.findNode(pathname, loader=caller)
if m is None:
if not os.path.exists(pathname):
raise ValueError('%r does not exist' % (pathname,))
raise ValueError("%r does not exist" % (pathname,))
m = self.createNode(MachO, pathname)
self.createReference(caller, m, edge_data='run_file')
self.createReference(caller, m, edge_data="run_file")
self.scan_node(m)
self.msgout(2, '')
self.msgout(2, "")
return m
def load_file(self, name, caller=None):
@@ -103,20 +106,20 @@ def load_file(self, name, caller=None):
self.scan_node(m)
else:
m = self.createNode(MissingMachO, name)
self.msgout(2, '')
self.msgout(2, "")
return m
def scan_node(self, node):
self.msgin(2, 'scan_node', node)
self.msgin(2, "scan_node", node)
for header in node.headers:
for idx, name, filename in header.walkRelocatables():
for _idx, name, filename in header.walkRelocatables():
assert isinstance(name, (str, unicode))
assert isinstance(filename, (str, unicode))
m = self.load_file(filename, caller=node)
self.createReference(node, m, edge_data=name)
self.msgout(2, '', node)
self.msgout(2, "", node)
def itergraphreport(self, name='G'):
def itergraphreport(self, name="G"):
nodes = map(self.graph.describe_node, self.graph.iterdfs(self))
describe_edge = self.graph.describe_edge
return itergraphreport(nodes, describe_edge, name=name)
@@ -134,5 +137,5 @@ def main(args):
g.graphreport()
if __name__ == '__main__':
main(sys.argv[1:] or ['/bin/ls'])
if __name__ == "__main__":
main(sys.argv[1:] or ["/bin/ls"])


@@ -1,11 +1,17 @@
import os
from macholib.MachOGraph import MachOGraph, MissingMachO
from macholib.util import iter_platform_files, in_system_path, mergecopy, \
mergetree, flipwritable, has_filename_filter
from macholib.dyld import framework_info
from collections import deque
from macholib.dyld import framework_info
from macholib.MachOGraph import MachOGraph, MissingMachO
from macholib.util import (
flipwritable,
has_filename_filter,
in_system_path,
iter_platform_files,
mergecopy,
mergetree,
)
class ExcludedMachO(MissingMachO):
pass
@@ -23,22 +29,20 @@ def createNode(self, cls, name):
def locate(self, filename, loader=None):
newname = super(FilteredMachOGraph, self).locate(filename, loader)
print("locate", filename, loader, "->", newname)
if newname is None:
return None
return self.delegate.locate(newname, loader=loader)
class MachOStandalone(object):
def __init__(
self, base, dest=None, graph=None, env=None,
executable_path=None):
self.base = os.path.join(os.path.abspath(base), '')
def __init__(self, base, dest=None, graph=None, env=None, executable_path=None):
self.base = os.path.join(os.path.abspath(base), "")
if dest is None:
dest = os.path.join(self.base, 'Contents', 'Frameworks')
dest = os.path.join(self.base, "Contents", "Frameworks")
self.dest = dest
self.mm = FilteredMachOGraph(
self, graph=graph, env=env, executable_path=executable_path)
self, graph=graph, env=env, executable_path=executable_path
)
self.changemap = {}
self.excludes = []
self.pending = deque()
@@ -80,8 +84,7 @@ def copy_dylib(self, filename):
# when two libraries link to the same dylib but using different
# symlinks.
if os.path.islink(filename):
dest = os.path.join(
self.dest, os.path.basename(os.path.realpath(filename)))
dest = os.path.join(self.dest, os.path.basename(os.path.realpath(filename)))
else:
dest = os.path.join(self.dest, os.path.basename(filename))
@@ -96,9 +99,9 @@ def mergetree(self, src, dest):
return mergetree(src, dest)
def copy_framework(self, info):
dest = os.path.join(self.dest, info['shortname'] + '.framework')
destfn = os.path.join(self.dest, info['name'])
src = os.path.join(info['location'], info['shortname'] + '.framework')
dest = os.path.join(self.dest, info["shortname"] + ".framework")
destfn = os.path.join(self.dest, info["name"])
src = os.path.join(info["location"], info["shortname"] + ".framework")
if not os.path.exists(dest):
self.mergetree(src, dest)
self.pending.append((destfn, iter_platform_files(dest)))
@@ -107,7 +110,7 @@ def copy_framework(self, info):
def run(self, platfiles=None, contents=None):
mm = self.mm
if contents is None:
contents = '@executable_path/..'
contents = "@executable_path/.."
if platfiles is None:
platfiles = iter_platform_files(self.base)
@@ -121,18 +124,20 @@ def run(self, platfiles=None, contents=None):
mm.run_file(fn, caller=ref)
changemap = {}
skipcontents = os.path.join(os.path.dirname(self.dest), '')
skipcontents = os.path.join(os.path.dirname(self.dest), "")
machfiles = []
for node in mm.flatten(has_filename_filter):
machfiles.append(node)
dest = os.path.join(
contents, os.path.normpath(node.filename[len(skipcontents):]))
contents,
os.path.normpath(node.filename[len(skipcontents) :]), # noqa: E203
)
changemap[node.filename] = dest
def changefunc(path):
if path.startswith('@loader_path/'):
# XXX: This is a quick hack for py2app: In that
if path.startswith("@loader_path/"):
# This is a quick hack for py2app: In that
# usecase paths like this are found in the load
# commands of relocatable wheels. Those don't
# need rewriting.
@@ -140,9 +145,8 @@ def changefunc(path):
res = mm.locate(path)
rv = changemap.get(res)
if rv is None and path.startswith('@loader_path/'):
rv = changemap.get(mm.locate(mm.trans_table.get(
(node.filename, path))))
if rv is None and path.startswith("@loader_path/"):
rv = changemap.get(mm.locate(mm.trans_table.get((node.filename, path))))
return rv
for node in machfiles:
@@ -150,14 +154,14 @@ def changefunc(path):
if fn is None:
continue
rewroteAny = False
for header in node.headers:
for _header in node.headers:
if node.rewriteLoadCommands(changefunc):
rewroteAny = True
if rewroteAny:
old_mode = flipwritable(fn)
try:
with open(fn, 'rb+') as f:
for header in node.headers:
with open(fn, "rb+") as f:
for _header in node.headers:
f.seek(0)
node.write(f)
f.seek(0, 2)


@@ -3,12 +3,20 @@
"""
from __future__ import with_statement
from macholib.mach_o import relocation_info, dylib_reference, dylib_module
from macholib.mach_o import dylib_table_of_contents, nlist, nlist_64
from macholib.mach_o import MH_CIGAM_64, MH_MAGIC_64
import sys
__all__ = ['SymbolTable']
from macholib.mach_o import (
MH_CIGAM_64,
MH_MAGIC_64,
dylib_module,
dylib_reference,
dylib_table_of_contents,
nlist,
nlist_64,
relocation_info,
)
__all__ = ["SymbolTable"]
if sys.version_info[0] == 2:
range = xrange # noqa: F821
@@ -21,7 +29,7 @@ def __init__(self, macho, header=None, openfile=None):
if header is None:
header = macho.headers[0]
self.macho_header = header
with openfile(macho.filename, 'rb') as fh:
with openfile(macho.filename, "rb") as fh:
self.symtab = header.getSymbolTableCommand()
self.dysymtab = header.getDynamicSymbolTableCommand()
@@ -43,22 +51,32 @@ def readSymbolTable(self, fh):
else:
cls = nlist
for i in range(cmd.nsyms):
for _i in range(cmd.nsyms):
cmd = cls.from_fileobj(fh, _endian_=self.macho_header.endian)
if cmd.n_un == 0:
nlists.append((cmd, ''))
nlists.append((cmd, ""))
else:
nlists.append(
(cmd, strtab[cmd.n_un:strtab.find(b'\x00', cmd.n_un)]))
(
cmd,
strtab[cmd.n_un : strtab.find(b"\x00", cmd.n_un)], # noqa: E203
)
)
return nlists
def readDynamicSymbolTable(self, fh):
cmd = self.dysymtab
nlists = self.nlists
self.localsyms = nlists[cmd.ilocalsym:cmd.ilocalsym+cmd.nlocalsym]
self.extdefsyms = nlists[cmd.iextdefsym:cmd.iextdefsym+cmd.nextdefsym]
self.undefsyms = nlists[cmd.iundefsym:cmd.iundefsym+cmd.nundefsym]
self.localsyms = nlists[
cmd.ilocalsym : cmd.ilocalsym + cmd.nlocalsym # noqa: E203
]
self.extdefsyms = nlists[
cmd.iextdefsym : cmd.iextdefsym + cmd.nextdefsym # noqa: E203
]
self.undefsyms = nlists[
cmd.iundefsym : cmd.iundefsym + cmd.nundefsym # noqa: E203
]
if cmd.tocoff == 0:
self.toc = None
else:
@@ -75,7 +93,7 @@ def readmodtab(self, fh, off, n):
def readsym(self, fh, off, n):
fh.seek(self.macho_header.offset + off)
refs = []
for i in range(n):
for _i in range(n):
ref = dylib_reference.from_fileobj(fh)
isym, flags = divmod(ref.isym_flags, 256)
refs.append((self.nlists[isym], flags))


@@ -5,4 +5,4 @@
And also Apple's documentation.
"""
__version__ = '1.10'
__version__ = "1.15.2"


@@ -1,26 +1,24 @@
from __future__ import print_function, absolute_import
from __future__ import absolute_import, print_function
import os
import sys
from macholib import macho_dump, macho_standalone
from macholib.util import is_platform_file
from macholib import macho_dump
from macholib import macho_standalone
gCommand = None
def check_file(fp, path, callback):
if not os.path.exists(path):
print(
'%s: %s: No such file or directory' % (gCommand, path),
file=sys.stderr)
print("%s: %s: No such file or directory" % (gCommand, path), file=sys.stderr)
return 1
try:
is_plat = is_platform_file(path)
except IOError as msg:
print('%s: %s: %s' % (gCommand, path, msg), file=sys.stderr)
print("%s: %s: %s" % (gCommand, path, msg), file=sys.stderr)
return 1
else:
@@ -34,10 +32,9 @@ def walk_tree(callback, paths):
for base in paths:
if os.path.isdir(base):
for root, dirs, files in os.walk(base):
for root, _dirs, files in os.walk(base):
for fn in files:
err |= check_file(
sys.stdout, os.path.join(root, fn), callback)
err |= check_file(sys.stdout, os.path.join(root, fn), callback)
else:
err |= check_file(sys.stdout, base, callback)
@@ -60,17 +57,17 @@ def main():
gCommand = sys.argv[1]
if gCommand == 'dump':
if gCommand == "dump":
walk_tree(macho_dump.print_file, sys.argv[2:])
elif gCommand == 'find':
elif gCommand == "find":
walk_tree(lambda fp, path: print(path, file=fp), sys.argv[2:])
elif gCommand == 'standalone':
elif gCommand == "standalone":
for dn in sys.argv[2:]:
macho_standalone.standaloneApp(dn)
elif gCommand in ('help', '--help'):
elif gCommand in ("help", "--help"):
print_usage(sys.stdout)
sys.exit(0)
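The `check_file`/`walk_tree` pair above accumulates a per-file error flag with `|=` while walking a tree. A small self-contained sketch of that accumulation pattern (the pass/fail predicate here is made up):

```python
import os
import tempfile

def check_file(path):
    """Return 1 if the file fails a (hypothetical) check, else 0."""
    return 0 if path.endswith(".ok") else 1

def walk_tree(base):
    err = 0
    for root, _dirs, files in os.walk(base):
        for fn in files:
            err |= check_file(os.path.join(root, fn))
    return err

with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "a.ok"), "w").close()
    print(walk_tree(d))  # 0: every file passes
    open(os.path.join(d, "b.bad"), "w").close()
    print(walk_tree(d))  # 1: at least one file failed
```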


@@ -1,7 +1,8 @@
"""
Internal helpers for basic commandline tools
"""
from __future__ import print_function, absolute_import
from __future__ import absolute_import, print_function
import os
import sys
@@ -10,15 +11,16 @@
def check_file(fp, path, callback):
if not os.path.exists(path):
print('%s: %s: No such file or directory' % (
sys.argv[0], path), file=sys.stderr)
print(
"%s: %s: No such file or directory" % (sys.argv[0], path), file=sys.stderr
)
return 1
try:
is_plat = is_platform_file(path)
except IOError as msg:
print('%s: %s: %s' % (sys.argv[0], path, msg), file=sys.stderr)
print("%s: %s: %s" % (sys.argv[0], path, msg), file=sys.stderr)
return 1
else:
@@ -38,10 +40,9 @@ def main(callback):
for base in args:
if os.path.isdir(base):
for root, dirs, files in os.walk(base):
for root, _dirs, files in os.walk(base):
for fn in files:
err |= check_file(
sys.stdout, os.path.join(root, fn), callback)
err |= check_file(sys.stdout, os.path.join(root, fn), callback)
else:
err |= check_file(sys.stdout, base, callback)


@@ -2,18 +2,45 @@
dyld emulation
"""
import ctypes
import os
import platform
import sys
from itertools import chain
import os
import sys
from macholib.framework import framework_info
from macholib.dylib import dylib_info
from macholib.framework import framework_info
__all__ = [
'dyld_find', 'framework_find',
'framework_info', 'dylib_info',
]
__all__ = ["dyld_find", "framework_find", "framework_info", "dylib_info"]
if sys.platform == "darwin" and [
int(x) for x in platform.mac_ver()[0].split(".")[:2]
] >= [10, 16]:
try:
libc = ctypes.CDLL("libSystem.dylib")
except OSError:
_dyld_shared_cache_contains_path = None
else:
try:
_dyld_shared_cache_contains_path = libc._dyld_shared_cache_contains_path
except AttributeError:
_dyld_shared_cache_contains_path = None
else:
_dyld_shared_cache_contains_path.restype = ctypes.c_bool
_dyld_shared_cache_contains_path.argtypes = [ctypes.c_char_p]
if sys.version_info[0] != 2:
__dyld_shared_cache_contains_path = _dyld_shared_cache_contains_path
def _dyld_shared_cache_contains_path(path):
return __dyld_shared_cache_contains_path(path.encode())
else:
_dyld_shared_cache_contains_path = None
# These are the defaults as per man dyld(1)
#
@@ -31,13 +58,16 @@
"/usr/lib",
]
# XXX: Is this function still needed?
if sys.version_info[0] == 2:
def _ensure_utf8(s):
if isinstance(s, unicode): # noqa: F821
return s.encode('utf8')
return s.encode("utf8")
return s
else:
def _ensure_utf8(s):
if s is not None and not isinstance(s, str):
raise ValueError(s)
@@ -48,31 +78,31 @@ def _dyld_env(env, var):
if env is None:
env = os.environ
rval = env.get(var)
if rval is None or rval == '':
if rval is None or rval == "":
return []
return rval.split(':')
return rval.split(":")
def dyld_image_suffix(env=None):
if env is None:
env = os.environ
return env.get('DYLD_IMAGE_SUFFIX')
return env.get("DYLD_IMAGE_SUFFIX")
def dyld_framework_path(env=None):
return _dyld_env(env, 'DYLD_FRAMEWORK_PATH')
return _dyld_env(env, "DYLD_FRAMEWORK_PATH")
def dyld_library_path(env=None):
return _dyld_env(env, 'DYLD_LIBRARY_PATH')
return _dyld_env(env, "DYLD_LIBRARY_PATH")
def dyld_fallback_framework_path(env=None):
return _dyld_env(env, 'DYLD_FALLBACK_FRAMEWORK_PATH')
return _dyld_env(env, "DYLD_FALLBACK_FRAMEWORK_PATH")
def dyld_fallback_library_path(env=None):
return _dyld_env(env, 'DYLD_FALLBACK_LIBRARY_PATH')
return _dyld_env(env, "DYLD_FALLBACK_LIBRARY_PATH")
def dyld_image_suffix_search(iterator, env=None):
@@ -83,8 +113,8 @@ def dyld_image_suffix_search(iterator, env=None):
def _inject(iterator=iterator, suffix=suffix):
for path in iterator:
if path.endswith('.dylib'):
yield path[:-len('.dylib')] + suffix + '.dylib'
if path.endswith(".dylib"):
yield path[: -len(".dylib")] + suffix + ".dylib"
else:
yield path + suffix
yield path
@@ -102,7 +132,7 @@ def dyld_override_search(name, env=None):
if framework is not None:
for path in dyld_framework_path(env):
yield os.path.join(path, framework['name'])
yield os.path.join(path, framework["name"])
# If DYLD_LIBRARY_PATH is set then use the first file that exists
# in the path. If none use the original name.
@@ -114,16 +144,18 @@ def dyld_executable_path_search(name, executable_path=None):
# If we haven't done any searching and found a library and the
# dylib_name starts with "@executable_path/" then construct the
# library name.
if name.startswith('@executable_path/') and executable_path is not None:
yield os.path.join(executable_path, name[len('@executable_path/'):])
if name.startswith("@executable_path/") and executable_path is not None:
yield os.path.join(
executable_path, name[len("@executable_path/") :] # noqa: E203
)
def dyld_loader_search(name, loader_path=None):
# If we haven't done any searching and found a library and the
# dylib_name starts with "@loader_path/" then construct the
# library name.
if name.startswith('@loader_path/') and loader_path is not None:
yield os.path.join(loader_path, name[len('@loader_path/'):])
if name.startswith("@loader_path/") and loader_path is not None:
yield os.path.join(loader_path, name[len("@loader_path/") :]) # noqa: E203
def dyld_default_search(name, env=None):
@@ -136,11 +168,11 @@ def dyld_default_search(name, env=None):
if fallback_framework_path:
for path in fallback_framework_path:
yield os.path.join(path, framework['name'])
yield os.path.join(path, framework["name"])
else:
for path in _DEFAULT_FRAMEWORK_FALLBACK:
yield os.path.join(path, framework['name'])
yield os.path.join(path, framework["name"])
fallback_library_path = dyld_fallback_library_path(env)
if fallback_library_path:
@@ -158,12 +190,20 @@ def dyld_find(name, executable_path=None, env=None, loader_path=None):
"""
name = _ensure_utf8(name)
executable_path = _ensure_utf8(executable_path)
for path in dyld_image_suffix_search(chain(
dyld_override_search(name, env),
dyld_executable_path_search(name, executable_path),
dyld_loader_search(name, loader_path),
dyld_default_search(name, env),
), env):
for path in dyld_image_suffix_search(
chain(
dyld_override_search(name, env),
dyld_executable_path_search(name, executable_path),
dyld_loader_search(name, loader_path),
dyld_default_search(name, env),
),
env,
):
if (
_dyld_shared_cache_contains_path is not None
and _dyld_shared_cache_contains_path(path)
):
return path
if os.path.isfile(path):
return path
raise ValueError("dylib %s could not be found" % (name,))
@@ -182,9 +222,9 @@ def framework_find(fn, executable_path=None, env=None):
return dyld_find(fn, executable_path=executable_path, env=env)
except ValueError:
pass
fmwk_index = fn.rfind('.framework')
fmwk_index = fn.rfind(".framework")
if fmwk_index == -1:
fmwk_index = len(fn)
fn += '.framework'
fn += ".framework"
fn = os.path.join(fn, os.path.basename(fn[:fmwk_index]))
return dyld_find(fn, executable_path=executable_path, env=env)
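`dyld_find` above works by chaining generators of candidate paths and returning the first one that exists. A stripped-down sketch of that search strategy (the directories and the `LIB_PATH` variable are illustrative, not dyld's real search order):

```python
import os
from itertools import chain

def env_candidates(name, env):
    # Mirrors the DYLD_*_PATH overrides: try each search dir first.
    for d in env.get("LIB_PATH", "").split(":"):
        if d:
            yield os.path.join(d, name)

def default_candidates(name):
    for d in ("/opt/lib", "/usr/lib"):
        yield os.path.join(d, name)

def find_lib(name, env, exists=os.path.isfile):
    for path in chain(env_candidates(name, env), default_candidates(name)):
        if exists(path):
            return path
    raise ValueError("lib %s could not be found" % (name,))

# Pretend only /usr/lib/libz.dylib exists.
fake = {"/usr/lib/libz.dylib"}.__contains__
print(find_lib("libz.dylib", {}, exists=fake))  # /usr/lib/libz.dylib
```

Because each stage is a generator, later (cheaper-priority) candidates are never even constructed once an earlier one matches.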


@@ -4,9 +4,10 @@
import re
__all__ = ['dylib_info']
__all__ = ["dylib_info"]
_DYLIB_RE = re.compile(r"""(?x)
_DYLIB_RE = re.compile(
r"""(?x)
(?P<location>^.*)(?:^|/)
(?P<name>
(?P<shortname>\w+?)
@@ -14,7 +15,8 @@
(?:_(?P<suffix>[^._]+))?
\.dylib$
)
""")
"""
)
def dylib_info(filename):
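The reformatted `re.compile` call above just splits a verbose-mode (`(?x)`) pattern across lines for readability. A comparable standalone pattern for pulling the short name and version suffix out of a dylib filename (a simplified sketch, not macholib's actual `_DYLIB_RE`):

```python
import re

DYLIB_RE = re.compile(
    r"""(?x)                        # verbose mode: whitespace/comments ignored
    (?P<shortname>\w+?)             # e.g. "foo"
    (?:\.(?P<version>[^._]+))?      # optional ".A"-style version component
    \.dylib$
    """
)

m = DYLIB_RE.search("foo.A.dylib")
print(m.group("shortname"), m.group("version"))  # foo A
```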


@@ -4,9 +4,10 @@
import re
__all__ = ['framework_info']
__all__ = ["framework_info"]
_STRICT_FRAMEWORK_RE = re.compile(r"""(?x)
_STRICT_FRAMEWORK_RE = re.compile(
r"""(?x)
(?P<location>^.*)(?:^|/)
(?P<name>
(?P<shortname>[-_A-Za-z0-9]+).framework/
@@ -14,7 +15,8 @@
(?P=shortname)
(?:_(?P<suffix>[^_]+))?
)$
""")
"""
)
def framework_info(filename):


@@ -1,7 +1,5 @@
"""
Utilities for creating dot output from a MachOGraph
XXX: need to rewrite this based on altgraph.Dot
"""
from collections import deque
@@ -11,28 +9,28 @@
except ImportError:
imap = map
__all__ = ['itergraphreport']
__all__ = ["itergraphreport"]
def itergraphreport(nodes, describe_edge, name='G'):
def itergraphreport(nodes, describe_edge, name="G"):
edges = deque()
nodetoident = {}
def nodevisitor(node, data, outgoing, incoming):
return {'label': str(node)}
return {"label": str(node)}
def edgevisitor(edge, data, head, tail):
return {}
yield 'digraph %s {\n' % (name,)
attr = dict(rankdir='LR', concentrate='true')
yield "digraph %s {\n" % (name,)
attr = {"rankdir": "LR", "concentrate": "true"}
cpatt = '%s="%s"'
for item in attr.iteritems():
yield '\t%s;\n' % (cpatt % item,)
for item in attr.items():
yield "\t%s;\n" % (cpatt % item,)
# find all packages (subgraphs)
for (node, data, outgoing, incoming) in nodes:
nodetoident[node] = getattr(data, 'identifier', node)
for (node, data, _outgoing, _incoming) in nodes:
nodetoident[node] = getattr(data, "identifier", node)
# create sets for subgraph, write out descriptions
for (node, data, outgoing, incoming) in nodes:
@@ -43,17 +41,19 @@ def edgevisitor(edge, data, head, tail):
# describe node
yield '\t"%s" [%s];\n' % (
node,
','.join([
(cpatt % item) for item in
nodevisitor(node, data, outgoing, incoming).iteritems()
]),
",".join(
[
(cpatt % item)
for item in nodevisitor(node, data, outgoing, incoming).items()
]
),
)
graph = []
while edges:
edge, data, head, tail = edges.popleft()
if data in ('run_file', 'load_dylib'):
if data in ("run_file", "load_dylib"):
graph.append((edge, data, head, tail))
def do_graph(edges, tabs):
@@ -64,10 +64,10 @@ def do_graph(edges, tabs):
yield edgestr % (
head,
tail,
','.join([(cpatt % item) for item in attribs.iteritems()]),
",".join([(cpatt % item) for item in attribs.items()]),
)
for s in do_graph(graph, '\t'):
for s in do_graph(graph, "\t"):
yield s
yield '}\n'
yield "}\n"
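`itergraphreport` above yields DOT source line by line instead of building one big string. A minimal generator in the same style (the node attributes and edge list are illustrative):

```python
def iter_dot(name, edges):
    """Yield DOT source for a directed graph, one chunk at a time."""
    yield "digraph %s {\n" % (name,)
    for attr, value in {"rankdir": "LR", "concentrate": "true"}.items():
        yield '\t%s="%s";\n' % (attr, value)
    for head, tail in edges:
        yield '\t"%s" -> "%s";\n' % (head, tail)
    yield "}\n"

dot = "".join(iter_dot("G", [("a", "b"), ("b", "c")]))
print(dot.startswith("digraph G {"))  # True
print('"a" -> "b"' in dot)            # True
```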

File diff suppressed because it is too large.


@@ -5,15 +5,14 @@
import sys
from macholib._cmdline import main as _main
from macholib.mach_o import CPU_TYPE_NAMES, MH_CIGAM_64, MH_MAGIC_64, get_cpu_subtype
from macholib.MachO import MachO
from macholib.mach_o import get_cpu_subtype, CPU_TYPE_NAMES
from macholib.mach_o import MH_CIGAM_64, MH_MAGIC_64
ARCH_MAP = {
('<', '64-bit'): 'x86_64',
('<', '32-bit'): 'i386',
('>', '64-bit'): 'ppc64',
('>', '32-bit'): 'ppc',
("<", "64-bit"): "x86_64",
("<", "32-bit"): "i386",
(">", "64-bit"): "ppc64",
(">", "32-bit"): "ppc",
}
@@ -24,34 +23,34 @@ def print_file(fp, path):
seen = set()
if header.MH_MAGIC == MH_MAGIC_64 or header.MH_MAGIC == MH_CIGAM_64:
sz = '64-bit'
sz = "64-bit"
else:
sz = '32-bit'
sz = "32-bit"
arch = CPU_TYPE_NAMES.get(
header.header.cputype, header.header.cputype)
arch = CPU_TYPE_NAMES.get(header.header.cputype, header.header.cputype)
subarch = get_cpu_subtype(
header.header.cputype, header.header.cpusubtype)
subarch = get_cpu_subtype(header.header.cputype, header.header.cpusubtype)
print(' [%s endian=%r size=%r arch=%r subarch=%r]' % (
header.__class__.__name__, header.endian, sz, arch, subarch),
file=fp)
for idx, name, other in header.walkRelocatables():
print(
" [%s endian=%r size=%r arch=%r subarch=%r]"
% (header.__class__.__name__, header.endian, sz, arch, subarch),
file=fp,
)
for _idx, _name, other in header.walkRelocatables():
if other not in seen:
seen.add(other)
print('\t' + other, file=fp)
print('', file=fp)
print("\t" + other, file=fp)
print("", file=fp)
def main():
print(
"WARNING: 'macho_dump' is deprecated, use 'python -mmacholib dump' "
"instead")
"WARNING: 'macho_dump' is deprecated, use 'python -mmacholib dump' " "instead"
)
_main(print_file)
if __name__ == '__main__':
if __name__ == "__main__":
try:
sys.exit(main())
except KeyboardInterrupt:


@@ -1,5 +1,6 @@
#!/usr/bin/env python
from __future__ import print_function
from macholib._cmdline import main as _main
@@ -9,12 +10,12 @@ def print_file(fp, path):
def main():
print(
"WARNING: 'macho_find' is deprecated, "
"use 'python -mmacholib dump' instead")
"WARNING: 'macho_find' is deprecated, " "use 'python -mmacholib dump' instead"
)
_main(print_file)
if __name__ == '__main__':
if __name__ == "__main__":
try:
main()
except KeyboardInterrupt:


@@ -8,10 +8,8 @@
def standaloneApp(path):
if not (os.path.isdir(path) and os.path.exists(
os.path.join(path, 'Contents'))):
print(
'%s: %s does not look like an app bundle' % (sys.argv[0], path))
if not (os.path.isdir(path) and os.path.exists(os.path.join(path, "Contents"))):
print("%s: %s does not look like an app bundle" % (sys.argv[0], path))
sys.exit(1)
files = MachOStandalone(path).run()
strip_files(files)
@@ -20,12 +18,13 @@ def standaloneApp(path):
def main():
print(
"WARNING: 'macho_standalone' is deprecated, use "
"'python -mmacholib standalone' instead")
"'python -mmacholib standalone' instead"
)
if not sys.argv[1:]:
raise SystemExit('usage: %s [appbundle ...]' % (sys.argv[0],))
raise SystemExit("usage: %s [appbundle ...]" % (sys.argv[0],))
for fn in sys.argv[1:]:
standaloneApp(fn)
if __name__ == '__main__':
if __name__ == "__main__":
main()


@@ -4,12 +4,12 @@
"""
import struct
import sys
from itertools import chain, starmap
try:
from itertools import izip, imap
from itertools import imap, izip
except ImportError:
izip, imap = zip, map
from itertools import chain, starmap
__all__ = """
sizeof
@@ -44,7 +44,7 @@ def sizeof(s):
"""
Return the size of an object when packed
"""
if hasattr(s, '_size_'):
if hasattr(s, "_size_"):
return s._size_
elif isinstance(s, bytes):
@@ -58,14 +58,15 @@ class MetaPackable(type):
Fixed size struct.unpack-able types use from_tuple as their designated
initializer
"""
def from_mmap(cls, mm, ptr, **kw):
return cls.from_str(mm[ptr:ptr+cls._size_], **kw)
return cls.from_str(mm[ptr : ptr + cls._size_], **kw) # noqa: E203
def from_fileobj(cls, f, **kw):
return cls.from_str(f.read(cls._size_), **kw)
def from_str(cls, s, **kw):
endian = kw.get('_endian_', cls._endian_)
endian = kw.get("_endian_", cls._endian_)
return cls.from_tuple(struct.unpack(endian + cls._format_, s), **kw)
def from_tuple(cls, tpl, **kw):
@@ -73,7 +74,7 @@ def from_tuple(cls, tpl, **kw):
class BasePackable(object):
_endian_ = '>'
_endian_ = ">"
def to_str(self):
raise NotImplementedError
@@ -82,7 +83,7 @@ def to_fileobj(self, f):
f.write(self.to_str())
def to_mmap(self, mm, ptr):
mm[ptr:ptr+self._size_] = self.to_str()
mm[ptr : ptr + self._size_] = self.to_str() # noqa: E203
# This defines a class with a custom metaclass, we'd normally
@@ -92,9 +93,10 @@ def to_mmap(self, mm, ptr):
def _make():
def to_str(self):
cls = type(self)
endian = getattr(self, '_endian_', cls._endian_)
endian = getattr(self, "_endian_", cls._endian_)
return struct.pack(endian + cls._format_, self)
return MetaPackable("Packable", (BasePackable,), {'to_str': to_str})
return MetaPackable("Packable", (BasePackable,), {"to_str": to_str})
Packable = _make()
@@ -109,8 +111,8 @@ def pypackable(name, pytype, format):
size, items = _formatinfo(format)
def __new__(cls, *args, **kwds):
if '_endian_' in kwds:
_endian_ = kwds.pop('_endian_')
if "_endian_" in kwds:
_endian_ = kwds.pop("_endian_")
else:
_endian_ = cls._endian_
@@ -118,12 +120,11 @@ def __new__(cls, *args, **kwds):
result._endian_ = _endian_
return result
return type(Packable)(name, (pytype, Packable), {
'_format_': format,
'_size_': size,
'_items_': items,
'__new__': __new__,
})
return type(Packable)(
name,
(pytype, Packable),
{"_format_": format, "_size_": size, "_items_": items, "__new__": __new__},
)
def _formatinfo(format):
@@ -131,7 +132,7 @@ def _formatinfo(format):
Calculate the size and number of items in a struct format.
"""
size = struct.calcsize(format)
return size, len(struct.unpack(format, b'\x00' * size))
return size, len(struct.unpack(format, b"\x00" * size))
class MetaStructure(MetaPackable):
@@ -142,17 +143,17 @@ class MetaStructure(MetaPackable):
we can do a bunch of calculations up front and pack or
unpack the whole thing in one struct call.
"""
def __new__(cls, clsname, bases, dct):
fields = dct['_fields_']
fields = dct["_fields_"]
names = []
types = []
structmarks = []
format = ''
format = ""
items = 0
size = 0
def struct_property(name, typ):
def _get(self):
return self._objects_[name]
@@ -169,16 +170,16 @@ def _set(self, obj):
types.append(typ)
format += typ._format_
size += typ._size_
if (typ._items_ > 1):
if typ._items_ > 1:
structmarks.append((items, typ._items_, typ))
items += typ._items_
dct['_structmarks_'] = structmarks
dct['_names_'] = names
dct['_types_'] = types
dct['_size_'] = size
dct['_items_'] = items
dct['_format_'] = format
dct["_structmarks_"] = structmarks
dct["_names_"] = names
dct["_types_"] = types
dct["_size_"] = size
dct["_items_"] = items
dct["_format_"] = format
return super(MetaStructure, cls).__new__(cls, clsname, bases, dct)
def from_tuple(cls, tpl, **kw):
@@ -196,7 +197,7 @@ def from_tuple(cls, tpl, **kw):
# See metaclass discussion earlier in this file
def _make():
class_dict = {}
class_dict['_fields_'] = ()
class_dict["_fields_"] = ()
def as_method(function):
class_dict[function.__name__] = function
@@ -219,7 +220,7 @@ def __init__(self, *args, **kwargs):
@as_method
def _get_packables(self):
for obj in imap(self._objects_.__getitem__, self._names_):
if hasattr(obj, '_get_packables'):
if hasattr(obj, "_get_packables"):
for obj in obj._get_packables():
yield obj
@@ -228,18 +229,19 @@ def _get_packables(self):
@as_method
def to_str(self):
return struct.pack(
self._endian_ + self._format_, *self._get_packables())
return struct.pack(self._endian_ + self._format_, *self._get_packables())
@as_method
def __cmp__(self, other):
if type(other) is not type(self):
raise TypeError(
'Cannot compare objects of type %r to objects of type %r' % (
type(other), type(self)))
"Cannot compare objects of type %r to objects of type %r"
% (type(other), type(self))
)
if sys.version_info[0] == 2:
_cmp = cmp # noqa: F821
else:
def _cmp(a, b):
if a < b:
return -1
@@ -251,7 +253,8 @@ def _cmp(a, b):
raise TypeError()
for cmpval in starmap(
_cmp, izip(self._get_packables(), other._get_packables())):
_cmp, izip(self._get_packables(), other._get_packables())
):
if cmpval != 0:
return cmpval
return 0
@@ -289,12 +292,12 @@ def __ge__(self, other):
@as_method
def __repr__(self):
result = []
result.append('<')
result.append("<")
result.append(type(self).__name__)
for nm in self._names_:
result.append(' %s=%r' % (nm, getattr(self, nm)))
result.append('>')
return ''.join(result)
result.append(" %s=%r" % (nm, getattr(self, nm)))
result.append(">")
return "".join(result)
return MetaStructure("Structure", (BasePackable,), class_dict)
@@ -308,17 +311,17 @@ def __repr__(self):
long = int
# export common packables with predictable names
p_char = pypackable('p_char', bytes, 'c')
p_int8 = pypackable('p_int8', int, 'b')
p_uint8 = pypackable('p_uint8', int, 'B')
p_int16 = pypackable('p_int16', int, 'h')
p_uint16 = pypackable('p_uint16', int, 'H')
p_int32 = pypackable('p_int32', int, 'i')
p_uint32 = pypackable('p_uint32', long, 'I')
p_int64 = pypackable('p_int64', long, 'q')
p_uint64 = pypackable('p_uint64', long, 'Q')
p_float = pypackable('p_float', float, 'f')
p_double = pypackable('p_double', float, 'd')
p_char = pypackable("p_char", bytes, "c")
p_int8 = pypackable("p_int8", int, "b")
p_uint8 = pypackable("p_uint8", int, "B")
p_int16 = pypackable("p_int16", int, "h")
p_uint16 = pypackable("p_uint16", int, "H")
p_int32 = pypackable("p_int32", int, "i")
p_uint32 = pypackable("p_uint32", long, "I")
p_int64 = pypackable("p_int64", long, "q")
p_uint64 = pypackable("p_uint64", long, "Q")
p_float = pypackable("p_float", float, "f")
p_double = pypackable("p_double", float, "d")
# Deprecated names, need trick to emit deprecation warning.
p_byte = p_int8
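The `p_*` helpers above wrap `struct` format codes (`"I"` for uint32, `"q"` for int64, and so on) in Python types carrying a configurable endianness. The underlying mechanics reduce to plain `struct` calls like these:

```python
import struct

# Big-endian (">") vs little-endian ("<") packing of the same uint32.
print(struct.pack(">I", 1))  # b'\x00\x00\x00\x01'
print(struct.pack("<I", 1))  # b'\x01\x00\x00\x00'

# _formatinfo-style helper: size and item count of a composite format.
fmt = ">hHi"
size = struct.calcsize(fmt)
items = len(struct.unpack(fmt, b"\x00" * size))
print(size, items)  # 8 3
```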


@@ -1,18 +1,18 @@
import os
import sys
import shutil
import stat
import struct
import shutil
import sys
from macholib import mach_o
MAGIC = [
struct.pack('!L', getattr(mach_o, 'MH_' + _))
for _ in ['MAGIC', 'CIGAM', 'MAGIC_64', 'CIGAM_64']
struct.pack("!L", getattr(mach_o, "MH_" + _))
for _ in ["MAGIC", "CIGAM", "MAGIC_64", "CIGAM_64"]
]
FAT_MAGIC_BYTES = struct.pack('!L', mach_o.FAT_MAGIC)
FAT_MAGIC_BYTES = struct.pack("!L", mach_o.FAT_MAGIC)
MAGIC_LEN = 4
STRIPCMD = ['/usr/bin/strip', '-x', '-S', '-']
STRIPCMD = ["/usr/bin/strip", "-x", "-S", "-"]
try:
unicode
@@ -20,7 +20,7 @@
unicode = str
def fsencoding(s, encoding=sys.getfilesystemencoding()):
def fsencoding(s, encoding=sys.getfilesystemencoding()): # noqa: M511,B008
"""
Ensure the given argument is in filesystem encoding (not unicode)
"""
@@ -66,16 +66,17 @@ def __init__(self, fileobj, start, size):
self._end = start + size
def __repr__(self):
return '<fileview [%d, %d] %r>' % (
self._start, self._end, self._fileobj)
return "<fileview [%d, %d] %r>" % (self._start, self._end, self._fileobj)
def tell(self):
return self._fileobj.tell() - self._start
def _checkwindow(self, seekto, op):
if not (self._start <= seekto <= self._end):
raise IOError("%s to offset %d is outside window [%d, %d]" % (
op, seekto, self._start, self._end))
raise IOError(
"%s to offset %d is outside window [%d, %d]"
% (op, seekto, self._start, self._end)
)
def seek(self, offset, whence=0):
seekto = offset
@@ -87,21 +88,22 @@ def seek(self, offset, whence=0):
seekto += self._end
else:
raise IOError("Invalid whence argument to seek: %r" % (whence,))
self._checkwindow(seekto, 'seek')
self._checkwindow(seekto, "seek")
self._fileobj.seek(seekto)
def write(self, bytes):
here = self._fileobj.tell()
self._checkwindow(here, 'write')
self._checkwindow(here + len(bytes), 'write')
self._checkwindow(here, "write")
self._checkwindow(here + len(bytes), "write")
self._fileobj.write(bytes)
def read(self, size=sys.maxsize):
if size < 0:
raise ValueError(
"Invalid size %s while reading from %s", size, self._fileobj)
"Invalid size %s while reading from %s", size, self._fileobj
)
here = self._fileobj.tell()
self._checkwindow(here, 'read')
self._checkwindow(here, "read")
bytes = min(size, self._end - here)
return self._fileobj.read(bytes)
@@ -110,8 +112,7 @@ def mergecopy(src, dest):
"""
copy2, but only if the destination isn't up to date
"""
if os.path.exists(dest) and \
os.stat(dest).st_mtime >= os.stat(src).st_mtime:
if os.path.exists(dest) and os.stat(dest).st_mtime >= os.stat(src).st_mtime:
return
copy2(src, dest)
@@ -138,13 +139,16 @@ def mergetree(src, dst, condition=None, copyfn=mergecopy, srcbase=None):
continue
try:
if os.path.islink(srcname):
# XXX: This is naive at best, should check srcbase(?)
realsrc = os.readlink(srcname)
os.symlink(realsrc, dstname)
elif os.path.isdir(srcname):
mergetree(
srcname, dstname,
condition=condition, copyfn=copyfn, srcbase=srcbase)
srcname,
dstname,
condition=condition,
copyfn=copyfn,
srcbase=srcbase,
)
else:
copyfn(srcname, dstname)
except (IOError, os.error) as why:
@@ -158,10 +162,10 @@ def sdk_normalize(filename):
Normalize a path to strip out the SDK portion, normally so that it
can be decided whether it is in a system path or not.
"""
if filename.startswith('/Developer/SDKs/'):
pathcomp = filename.split('/')
if filename.startswith("/Developer/SDKs/"):
pathcomp = filename.split("/")
del pathcomp[1:4]
filename = '/'.join(pathcomp)
filename = "/".join(pathcomp)
return filename
@@ -173,9 +177,9 @@ def in_system_path(filename):
Return True if the file is in a system path
"""
fn = sdk_normalize(os.path.realpath(filename))
if fn.startswith('/usr/local/'):
if fn.startswith("/usr/local/"):
return False
elif fn.startswith('/System/') or fn.startswith('/usr/'):
elif fn.startswith("/System/") or fn.startswith("/usr/"):
if fn in NOT_SYSTEM_FILES:
return False
return True
@@ -187,7 +191,7 @@ def has_filename_filter(module):
"""
Return False if the module does not have a filename attribute
"""
return getattr(module, 'filename', None) is not None
return getattr(module, "filename", None) is not None
def get_magic():
@@ -204,16 +208,16 @@ def is_platform_file(path):
if not os.path.exists(path) or os.path.islink(path):
return False
# If the header is fat, we need to read into the first arch
with open(path, 'rb') as fileobj:
with open(path, "rb") as fileobj:
bytes = fileobj.read(MAGIC_LEN)
if bytes == FAT_MAGIC_BYTES:
# Read in the fat header
fileobj.seek(0)
header = mach_o.fat_header.from_fileobj(fileobj, _endian_='>')
header = mach_o.fat_header.from_fileobj(fileobj, _endian_=">")
if header.nfat_arch < 1:
return False
# Read in the first fat arch header
arch = mach_o.fat_arch.from_fileobj(fileobj, _endian_='>')
arch = mach_o.fat_arch.from_fileobj(fileobj, _endian_=">")
fileobj.seek(arch.offset)
# Read magic off the first header
bytes = fileobj.read(MAGIC_LEN)
@@ -227,7 +231,7 @@ def iter_platform_files(dst):
"""
Walk a directory and yield each full path that is a Mach-O file
"""
for root, dirs, files in os.walk(dst):
for root, _dirs, files in os.walk(dst):
for fn in files:
fn = os.path.join(root, fn)
if is_platform_file(fn):
@@ -242,7 +246,7 @@ def strip_files(files, argv_max=(256 * 1024)):
while tostrip:
cmd = list(STRIPCMD)
flips = []
pathlen = sum([len(s) + 1 for s in cmd])
pathlen = sum(len(s) + 1 for s in cmd)
while pathlen < argv_max:
if not tostrip:
break


@@ -1,4 +1,4 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -6,79 +6,98 @@
This is a fake set of symbols to allow spack to import typing in python
versions where we do not support type checking (<3)
"""
Annotated = None
Any = None
Callable = None
ForwardRef = None
Generic = None
Literal = None
Optional = None
Tuple = None
TypeVar = None
Union = None
AbstractSet = None
ByteString = None
Container = None
Hashable = None
ItemsView = None
Iterable = None
Iterator = None
KeysView = None
Mapping = None
MappingView = None
MutableMapping = None
MutableSequence = None
MutableSet = None
Sequence = None
Sized = None
ValuesView = None
Awaitable = None
AsyncIterator = None
AsyncIterable = None
Coroutine = None
Collection = None
AsyncGenerator = None
AsyncContextManager = None
Reversible = None
SupportsAbs = None
SupportsBytes = None
SupportsComplex = None
SupportsFloat = None
SupportsInt = None
SupportsRound = None
ChainMap = None
Dict = None
List = None
OrderedDict = None
Set = None
FrozenSet = None
NamedTuple = None
Generator = None
AnyStr = None
cast = None
from collections import defaultdict
# (1) Unparameterized types.
Annotated = object
Any = object
AnyStr = object
ByteString = object
Counter = object
Final = object
Hashable = object
NoReturn = object
Sized = object
SupportsAbs = object
SupportsBytes = object
SupportsComplex = object
SupportsFloat = object
SupportsIndex = object
SupportsInt = object
SupportsRound = object
# (2) Parameterized types.
AbstractSet = defaultdict(lambda: object)
AsyncContextManager = defaultdict(lambda: object)
AsyncGenerator = defaultdict(lambda: object)
AsyncIterable = defaultdict(lambda: object)
AsyncIterator = defaultdict(lambda: object)
Awaitable = defaultdict(lambda: object)
Callable = defaultdict(lambda: object)
ChainMap = defaultdict(lambda: object)
ClassVar = defaultdict(lambda: object)
Collection = defaultdict(lambda: object)
Container = defaultdict(lambda: object)
ContextManager = defaultdict(lambda: object)
Coroutine = defaultdict(lambda: object)
DefaultDict = defaultdict(lambda: object)
Deque = defaultdict(lambda: object)
Dict = defaultdict(lambda: object)
ForwardRef = defaultdict(lambda: object)
FrozenSet = defaultdict(lambda: object)
Generator = defaultdict(lambda: object)
Generic = defaultdict(lambda: object)
ItemsView = defaultdict(lambda: object)
Iterable = defaultdict(lambda: object)
Iterator = defaultdict(lambda: object)
KeysView = defaultdict(lambda: object)
List = defaultdict(lambda: object)
Literal = defaultdict(lambda: object)
Mapping = defaultdict(lambda: object)
MappingView = defaultdict(lambda: object)
MutableMapping = defaultdict(lambda: object)
MutableSequence = defaultdict(lambda: object)
MutableSet = defaultdict(lambda: object)
NamedTuple = defaultdict(lambda: object)
Optional = defaultdict(lambda: object)
OrderedDict = defaultdict(lambda: object)
Reversible = defaultdict(lambda: object)
Sequence = defaultdict(lambda: object)
Set = defaultdict(lambda: object)
Tuple = defaultdict(lambda: object)
Type = defaultdict(lambda: object)
TypedDict = defaultdict(lambda: object)
Union = defaultdict(lambda: object)
ValuesView = defaultdict(lambda: object)
# (3) Type variable declarations.
TypeVar = lambda *args, **kwargs: None
# (4) Functions.
cast = lambda _type, x: x
get_args = None
get_origin = None
get_type_hints = None
no_type_check = None
no_type_check_decorator = None
NoReturn = None
# these are the typing extension symbols
ClassVar = None
Final = None
Protocol = None
Type = None
TypedDict = None
ContextManager = None
Counter = None
Deque = None
DefaultDict = None
SupportsIndex = None
final = None
IntVar = None
Literal = None
NewType = None
overload = None
runtime_checkable = None
Text = None
TYPE_CHECKING = None
## typing_extensions
# We get a ModuleNotFoundError when attempting to import anything from typing_extensions
# if we separate this into a separate typing_extensions.py file for some reason.
# (1) Unparameterized types.
IntVar = object
Literal = object
NewType = object
Text = object
# (2) Parameterized types.
Protocol = defaultdict(lambda: object)
# (3) Macro for avoiding evaluation except during type checking.
TYPE_CHECKING = False
# (4) Decorators.
final = lambda x: x
overload = lambda x: x
runtime_checkable = lambda x: x
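The parameterized stubs above exploit that `Dict[str, int]` is just `Dict.__getitem__((str, int))`: a `defaultdict` with a constant factory accepts any subscript, including tuple keys, and returns a plain `object`, so py3-style annotations evaluate harmlessly on interpreters without real typing support. The mechanism in isolation:

```python
from collections import defaultdict

Dict = defaultdict(lambda: object)
Optional = defaultdict(lambda: object)

# Any subscription, including tuple keys, yields the placeholder class.
print(Dict[str, int] is object)        # True
print(Optional["anything"] is object)  # True

# So code written with typing-style subscripts still imports and runs:
def lookup(table, key):
    # conceptually: (Dict[str, int], str) -> Optional[int]
    return table.get(key)

print(lookup({"a": 1}, "a"))  # 1
```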


@@ -1,4 +1,4 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)


@@ -1,4 +1,4 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)


@@ -1,4 +1,4 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)


@@ -0,0 +1,39 @@
+# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
+# Spack Project Developers. See the top-level COPYRIGHT file for details.
+#
+# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+
+# isort: off
+import sys
+
+if sys.version_info < (3,):
+    from itertools import ifilter as filter
+    from itertools import imap as map
+    from itertools import izip as zip
+    from itertools import izip_longest as zip_longest  # novm
+    from urllib import urlencode as urlencode
+    from urllib import urlopen as urlopen
+else:
+    filter = filter
+    map = map
+    zip = zip
+    from itertools import zip_longest as zip_longest  # novm # noqa: F401
+    from urllib.parse import urlencode as urlencode  # novm # noqa: F401
+    from urllib.request import urlopen as urlopen  # novm # noqa: F401
+
+if sys.version_info >= (3, 3):
+    from collections.abc import Hashable as Hashable  # novm
+    from collections.abc import Iterable as Iterable  # novm
+    from collections.abc import Mapping as Mapping  # novm
+    from collections.abc import MutableMapping as MutableMapping  # novm
+    from collections.abc import MutableSequence as MutableSequence  # novm
+    from collections.abc import MutableSet as MutableSet  # novm
+    from collections.abc import Sequence as Sequence  # novm
+else:
+    from collections import Hashable as Hashable  # noqa: F401
+    from collections import Iterable as Iterable  # noqa: F401
+    from collections import Mapping as Mapping  # noqa: F401
+    from collections import MutableMapping as MutableMapping  # noqa: F401
+    from collections import MutableSequence as MutableSequence  # noqa: F401
+    from collections import MutableSet as MutableSet  # noqa: F401
+    from collections import Sequence as Sequence  # noqa: F401
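The new compat module centralizes version checks so callers import one canonical name instead of repeating the `sys.version_info` test in every file. A runnable sketch of the same pattern (standalone, not the actual `llnl.util.compat` file):

```python
# Version-gated re-exports: pick the implementation once, here,
# so callers never repeat the sys.version_info check.
import sys

if sys.version_info >= (3, 3):
    from collections.abc import Mapping  # noqa: F401
else:  # pragma: no cover (Python 2 / <3.3 path)
    from collections import Mapping  # noqa: F401

from itertools import zip_longest  # Python 2 spelled this izip_longest

# A caller would then write `from mycompat import Mapping, zip_longest`.
print(isinstance({}, Mapping))                        # -> True
print(list(zip_longest([1, 2], ['a'], fillvalue=0)))  # -> [(1, 'a'), (2, 0)]
```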

View File

@@ -1,4 +1,4 @@
-# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
+# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
@@ -21,15 +21,11 @@
 import six

 from llnl.util import tty
+from llnl.util.compat import Sequence
 from llnl.util.lang import dedupe, memoized

 from spack.util.executable import Executable

-if sys.version_info >= (3, 3):
-    from collections.abc import Sequence  # novm
-else:
-    from collections import Sequence
-
 __all__ = [
     'FileFilter',
     'FileList',
@@ -1642,12 +1638,18 @@ def find_libraries(libraries, root, shared=True, recursive=False):
         raise TypeError(message)

     # Construct the right suffix for the library
-    if shared is True:
-        suffix = 'dylib' if sys.platform == 'darwin' else 'so'
+    if shared:
+        # Used on both Linux and macOS
+        suffixes = ['so']
+        if sys.platform == 'darwin':
+            # Only used on macOS
+            suffixes.append('dylib')
     else:
-        suffix = 'a'
+        suffixes = ['a']

     # List of libraries we are searching with suffixes
-    libraries = ['{0}.{1}'.format(lib, suffix) for lib in libraries]
+    libraries = ['{0}.{1}'.format(lib, suffix) for lib in libraries
+                 for suffix in suffixes]

     if not recursive:
         # If not recursive, look for the libraries directly in root
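The change above switches from a single platform-chosen suffix to a list of candidate suffixes, so on macOS both `.so` and `.dylib` names are searched. A standalone sketch of just that expansion (`candidate_names` is a hypothetical helper extracted from `find_libraries` for illustration):

```python
import sys


def candidate_names(libraries, shared=True, platform=sys.platform):
    """Pair every library stem with every candidate suffix."""
    if shared:
        suffixes = ['so']  # used on both Linux and macOS
        if platform == 'darwin':
            suffixes.append('dylib')  # only used on macOS
    else:
        suffixes = ['a']
    # Same double comprehension as the diff: libraries x suffixes.
    return ['{0}.{1}'.format(lib, suffix) for lib in libraries
            for suffix in suffixes]


print(candidate_names(['libfoo'], platform='darwin'))
# -> ['libfoo.so', 'libfoo.dylib']
```

Note the ordering: each stem yields its `.so` name before its `.dylib` name, because the suffix loop is the inner one.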

View File

@@ -1,10 +1,11 @@
-# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
+# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

 from __future__ import division

+import contextlib
 import functools
 import inspect
 import os
@@ -12,19 +13,10 @@
 import sys
 from datetime import datetime, timedelta

 import six
 from six import string_types

-if sys.version_info < (3, 0):
-    from itertools import izip_longest  # novm
-    zip_longest = izip_longest
-else:
-    from itertools import zip_longest  # novm
-
-if sys.version_info >= (3, 3):
-    from collections.abc import Hashable, MutableMapping  # novm
-else:
-    from collections import Hashable, MutableMapping
+from llnl.util.compat import MutableMapping, zip_longest

 # Ignore emacs backups when listing modules
 ignore_modules = [r'^\.#', '~$']
@@ -174,6 +166,19 @@ def union_dicts(*dicts):
     return result

+
+# Used as a sentinel that disambiguates tuples passed in *args from coincidentally
+# matching tuples formed from kwargs item pairs.
+_kwargs_separator = (object(),)
+
+
+def stable_args(*args, **kwargs):
+    """A key factory that performs a stable sort of the parameters."""
+    key = args
+    if kwargs:
+        key += _kwargs_separator + tuple(sorted(kwargs.items()))
+    return key
+
 def memoized(func):
     """Decorator that caches the results of a function, storing them in
     an attribute of that function.
@@ -181,15 +186,23 @@ def memoized(func):
     func.cache = {}

     @functools.wraps(func)
-    def _memoized_function(*args):
-        if not isinstance(args, Hashable):
-            # Not hashable, so just call the function.
-            return func(*args)
+    def _memoized_function(*args, **kwargs):
+        key = stable_args(*args, **kwargs)

-        if args not in func.cache:
-            func.cache[args] = func(*args)
-        return func.cache[args]
+        try:
+            return func.cache[key]
+        except KeyError:
+            ret = func(*args, **kwargs)
+            func.cache[key] = ret
+            return ret
+        except TypeError as e:
+            # TypeError is raised when indexing into a dict if the key is unhashable.
+            raise six.raise_from(
+                UnhashableArguments(
+                    "args + kwargs '{}' was not hashable for function '{}'"
+                    .format(key, func.__name__),
+                ),
+                e)

     return _memoized_function
@@ -931,3 +944,15 @@ def elide_list(line_list, max_num=10):
         return line_list[:max_num - 1] + ['...'] + line_list[-1:]
     else:
         return line_list
+
+
+@contextlib.contextmanager
+def nullcontext(*args, **kwargs):
+    """Empty context manager.
+    TODO: replace with contextlib.nullcontext() if we ever require python 3.7.
+    """
+    yield
+
+
+class UnhashableArguments(TypeError):
+    """Raise when an @memoized function receives unhashable arg or kwarg values."""

View File

@@ -1,4 +1,4 @@
-# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
+# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
-# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
+# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
-# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
+# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -1,4 +1,4 @@
-# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
+# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

Some files were not shown because too many files have changed in this diff.