Compare commits

673 Commits

Author SHA1 Message Date
Todd Gamblin
50396b41b4 refactor: Move spack.util.executable -> llnl.util.executable 2023-01-01 23:37:25 -08:00
Todd Gamblin
b4a2e2b82c refactor: move most of spack.util.environment -> llnl.util.envmod 2023-01-01 22:29:11 -08:00
Todd Gamblin
d3f8bf1b96 refactor: move windows path logic out of spack.util.path and into llnl.util.path 2023-01-01 22:29:11 -08:00
Todd Gamblin
fd5cf345e1 refactor: Move spack.util.string to llnl.util.string 2023-01-01 22:29:11 -08:00
Todd Gamblin
ca265ea0c2 style: fix spurious mypy errors from numpy (#34732)
Spack imports `pytest`, which *can* import `numpy`. Recent versions of `numpy` require
Python 3.8 or higher, and they use 3.8 type annotations in their type stubs (`.pyi`
files). At the same time, we tell `mypy` to target Python 3.7, as we still support older
versions of Python.

What all this means is that if you run `mypy` on `spack`, `mypy` will follow all the
static import statements, and it ends up giving you this error when it finds numpy stuff
that is newer than the target Python version:

```
==> Running mypy checks
src/spack/var/spack/environments/default/.spack-env/._view/4g7jd4ibkg4gopv4rosq3kn2vsxrxm2f/lib/python3.11/site-packages/numpy/__init__.pyi:638: error: Positional-only parameters are only supported in Python 3.8 and greater  [syntax]
Found 1 error in 1 file (errors prevented further checking)
  mypy found errors
```

We can fix this by telling `mypy` to skip all imports of `numpy` in `pyproject.toml`:

```toml
   [[tool.mypy.overrides]]
   module = 'numpy'
   follow_imports = 'skip'
   follow_imports_for_stubs = true
```

- [x] don't follow imports from `numpy` in `mypy`
- [x] get rid of old rule not to follow `jinja2` imports, as we now require Python 3
2023-01-01 01:05:17 +00:00
Glenn Johnson
7a92579480 py-fisher: add version 0.1.10 (#34738) 2022-12-31 12:48:21 -06:00
Glenn Johnson
190dfd0269 py-youtube-dl: add version 2021.12.17 (#34740) 2022-12-31 12:42:58 -06:00
Massimiliano Culpo
b549548f69 Simplify creation of test and install reports (#34712)
The code in Spack to generate install and test reports currently suffers from unneeded complexity. For
instance, we have classes in Spack core, like `spack.reporters.CDash`, that need an
`argparse.Namespace` to be initialized and have "hard-coded" string literals on which they branch to
change their behavior:

```python
if do_fn.__name__ == "do_test" and skip_externals:
    package["result"] = "skipped"
else:
    package["result"] = "success"
package["stdout"] = fetch_log(pkg, do_fn, self.dir)
package["installed_from_binary_cache"] = pkg.installed_from_binary_cache
if do_fn.__name__ == "_install_task" and installed_already:
    return
```
This PR attempts to polish the major issues encountered in both `spack.report` and `spack.reporters`.

Details:
- [x] `spack.reporters` is now a package that contains both the base class `Reporter` and all 
      the derived classes (`JUnit` and `CDash`)
- [x] Classes derived from `spack.reporters.Reporter` don't take an `argparse.Namespace` anymore
       as argument to `__init__`. The rationale is that code for commands should be built upon Spack
       core classes, not vice-versa.
- [x] An `argparse.Action` has been coded to create the correct `Reporter` object based on command
       line arguments
- [x] The context managers to generate reports from either `spack install` or from `spack test` have
       been greatly simplified, and have been made less "dynamic" in nature. In particular, the `collect_info`
       class has been deleted in favor of two more specific context managers. This allows for a simpler
       structure of the code, and less knowledge required of client code (in particular on which method to patch)
- [x] The `InfoCollector` class has been turned into a simple hierarchy, so as to avoid conditional statements
       within methods that assume knowledge of the context in which the method is called (see the sketch below).
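
A schematic illustration of that last item (hypothetical class and method names, not the PR's actual code): the string-literal branching from the snippet above becomes polymorphism in a small hierarchy.

```python
# Hypothetical sketch: instead of branching on do_fn.__name__, each context
# gets its own collector subclass that knows its own result logic.
class InfoCollector:
    def result_for(self, package):
        return "success"


class TestInfoCollector(InfoCollector):
    def __init__(self, skip_externals):
        self.skip_externals = skip_externals

    def result_for(self, package):
        # The "do_test and skip_externals" special case now lives in the
        # test-specific subclass rather than in a generic method.
        if self.skip_externals and package.get("external", False):
            return "skipped"
        return "success"
```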
2022-12-30 10:15:38 -08:00
Heiko Bauke
79268cedd2 mpl: add v0.2.1, v0.2.0 (#34716) 2022-12-30 09:21:58 -08:00
Satish Balay
2004171b7e petsc, py-petsc4py: add v3.18.3 (#34725) 2022-12-30 10:49:21 +01:00
Todd Gamblin
06312ddf18 bugfix: setgid tests fail when primary group is unknown (#34729)
On systems with remote groups, the primary user group may be remote and may not exist on
the local system (i.e., it might just be a number). On the CLI, it looks like this:

```console
> touch foo
> l foo
-rw-r--r-- 1 gamblin2 57095 0 Dec 29 22:24 foo
> chmod 2000 foo
chmod: changing permissions of 'foo': Operation not permitted
```

Here, the local machine doesn't know about per-user groups, so they appear as gids in
`ls` output. `57095` is also `gamblin2`'s uid, but the local machine doesn't know that
`gamblin2` is in the `57095` group.

Unfortunately, it seems that Python's `os.chmod()` just fails silently, setting
permissions to `0o0000` instead of `0o2000`. We can avoid this by ensuring that the file
has a group the user is known to be a member of.

- [x] Add `ensure_known_group()` in the permissions tests.
- [x] Call `ensure_known_group()` on tempfile in `test_chmod_real_entries_ignores_suid_sgid` (see the sketch below).
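
A minimal sketch of what such a helper has to do; the body below is an assumption for illustration, not the PR's implementation.

```python
import os


def ensure_known_group(path):
    """Give `path` a group the current user is known to belong to, so a
    later chmod of setgid bits isn't silently dropped (sketch)."""
    known_gids = os.getgroups()  # groups this process is actually a member of
    if known_gids and os.stat(path).st_gid not in known_gids:
        os.chown(path, -1, known_gids[0])  # -1 leaves the owner unchanged
```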
2022-12-30 10:24:35 +01:00
Todd Gamblin
3a0db729c7 docs: avoid errors by using type hints instead of doc types (#34707)
There are a number of places in our docstrings where we write "list of X" as the type, even though napoleon doesn't actually support this. It ends up causing warnings when generating docs.

Now that we require Python 3, we don't have to rely on types in docstrings -- we can just use Python type hints and omit the types of args and return values from docstrings.

We should probably do this for all types in docstrings eventually, but this PR focuses on the ones that generate warnings during doc builds.

Some `mypy` annoyances we should consider in the future:
1. Adding some of these type annotations gets you:
    ```
    note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
    ```
   because they are in unannotated functions (like constructors where we don't really need any annotations).
   You can silence these with `disable_error_code = "annotation-unchecked"` in `pyproject.toml`
2. Right now we support running `mypy` in Python `3.6`.  That means we have to support `mypy` `0.971`, which does not support `disable_error_code = "annotation-unchecked"`, so I just filter `[annotation-unchecked]` lines out in `spack style`.
3. I would rather just turn on `check_untyped_defs` and get more `mypy` coverage everywhere, but that will require about 1,000 fixes.  We should probably do that eventually.
4. We could also consider only running `mypy` on newer python versions.  This is not easy to do while supporting `3.6`, because you have to use `if TYPE_CHECKING` for a lot of things to ensure that 3.6 still parses correctly.  If we only supported `3.7` and above we could use [`from __future__ import annotations`](https://mypy.readthedocs.io/en/stable/runtime_troubles.html#future-annotations-import-pep-563), but we have to support 3.6 for now. Sigh.

- [x] Convert a number of docstring types to Python type hints
- [x] Get rid of "list of" wherever it appears (illustrated below)
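
An illustrative before/after of that conversion (a made-up function, not from the diff):

```python
from typing import List, Optional


# Before, the type lived in the docstring, which napoleon can't parse:
#
#     Args:
#         names (list of str): package names to search through
#
# After, the type moves into a Python type hint and the docstring only
# describes the argument:
def first_match(names: List[str], prefix: str = "py-") -> Optional[str]:
    """Return the first name with the given prefix, if any.

    Args:
        names: package names to search through
        prefix: prefix to match against
    """
    return next((n for n in names if n.startswith(prefix)), None)
```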
2022-12-29 16:45:09 -08:00
dependabot[bot]
9759331f43 build(deps): bump actions/setup-python from 4.3.1 to 4.4.0 (#34667)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.3.1 to 4.4.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](2c3dd9e7e2...5ccb29d877)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-29 14:57:58 +01:00
downloadico
ceca97518a trinity: add version 2.15.0-FULL (#34666) 2022-12-29 11:13:47 +01:00
Brent Huisman
1929d5e3de arbor: add v0.8.1 (#34660) 2022-12-29 11:07:17 +01:00
Lucas Frérot
238e9c3613 tamaas: added v2.6.0 (#34676) 2022-12-29 11:04:33 +01:00
Jim Galarowicz
d43e7cb5cd survey: add v1.0.7 (#34679) 2022-12-29 11:00:45 +01:00
Christopher Christofi
51a037d52a perl-archive-zip: add 1.68 (#34684) 2022-12-29 10:57:32 +01:00
Tim Haines
c91f8c2f14 boost: apply 'intel-oneapi-linux-jam.patch' to all versions since 1.76 (#34670) 2022-12-29 10:56:45 +01:00
Christopher Christofi
04ad42e5ee perl-appconfig: add v1.71 (#34685) 2022-12-29 10:55:41 +01:00
Alex Hedges
d02c71e443 git-filter-repo: add new package (#34690) 2022-12-29 10:53:19 +01:00
David Zmick
ca6e178890 jq: set -D_REENTRANT for builds on darwin (#34691) 2022-12-29 10:49:09 +01:00
Jed Brown
b145085fff libceed: add v0.11.0 (#34694) 2022-12-29 10:31:25 +01:00
AMD Toolchain Support
3a4b96e61c AOCC: add v4.0.0 (#33833) 2022-12-29 10:30:35 +01:00
Adam J. Stewart
36d87a4783 py-numpy: add v1.24.1 (#34697) 2022-12-29 10:23:20 +01:00
Wouter Deconinck
6d2645f73b libpsl: new versions through 0.21.2 (#34699)
This adds the final bugfix versions through 0.21.2, just released.

With 0.21.1 the tag name pattern was changed, hence the `url_for_version` override sketched below.
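
A hedged sketch of what such an override can look like in the package; the URL patterns here are assumptions for illustration (`Version` is in scope in a Spack `package.py`).

```python
def url_for_version(self, version):
    # Assumed: release tags switched from "libpsl-X.Y.Z" to "X.Y.Z" at 0.21.1.
    base = "https://github.com/rockdaboot/libpsl/releases/download"
    if version >= Version("0.21.1"):
        return f"{base}/{version}/libpsl-{version}.tar.gz"
    return f"{base}/libpsl-{version}/libpsl-{version}.tar.gz"
```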
2022-12-29 10:22:27 +01:00
Wouter Deconinck
44f7363fbe cernlib: depends_on libxaw libxt (#34448)
Based on the following lines in the top level `CMakeLists.txt` (I can't deep link since gitlab.cern.ch is not public), `cernlib` needs an explicit dependency on `libxaw` and `libxt`:
```cmake
find_package(X11  REQUIRED)
message(STATUS "CERNLIB: X11_Xt_LIB=${X11_Xt_LIB} X11_Xaw_LIB=${X11_Xaw_LIB} X11_LIBRARIES=${X11_LIBRARIES}")
```
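
On the Spack side, the corresponding declarations in `cernlib`'s `package.py` would look roughly like this:

```python
# X11 toolkit libraries that CMake's find_package(X11) expects to locate.
depends_on("libxaw")
depends_on("libxt")
```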
2022-12-29 09:25:07 +01:00
Wouter Deconinck
9d936a2a75 singularity, apptainer: --without-conmon into @property config_options (#34474)
Per https://github.com/spack/spack/issues/34192, apptainer does not support `--without-conmon`, so we introduce a base class `config_options` property that can be overridden in the `apptainer` package.
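
Schematically (class names abbreviated, not the exact package code):

```python
class SingularityBase:
    @property
    def config_options(self):
        # singularity proper still wants this flag
        return ["--without-conmon"]


class Apptainer(SingularityBase):
    @property
    def config_options(self):
        # apptainer dropped --without-conmon, so override with nothing
        return []
```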
2022-12-29 09:24:41 +01:00
Wouter Deconinck
18438c395d dd4hep: depends_on virtual tbb instead of intel-tbb (#34704)
Recent changes to dd4hep removed the explicit dependency
on an older version of intel-tbb. This PR makes that explicit
in the Spack package by depending on the virtual `tbb` instead.
2022-12-29 09:13:28 +01:00
wspear
28a30bcea6 veloc: add v1.6 and dependencies (#34706) 2022-12-29 09:12:51 +01:00
Alex Richert
536c7709c2 Change regex in bacio patch to avoid python re bug (#34668) 2022-12-29 08:50:27 +01:00
Todd Gamblin
e28738a01e bugfix: make texinfo build properly with gettext (#34312)
`texinfo` depends on `gettext`, and it builds a perl module that uses gettext via XS
module FFI. Unfortunately, the XS module build asks perl to tell it what compiler to
use instead of respecting the one passed to configure.

Without this change, the build fails with this error:

```
parsetexi/api.c:33:10: fatal error: 'libintl.h' file not found
         ^~~~~~~~~~~
```

We need the gettext dependency and the spack wrappers to ensure XS builds properly.

- [x] Add needed `gettext` dependency to `texinfo`
- [x] Override XS compiler with `PERL_EXT_CC` (sketched below)
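
A hedged sketch of that override, assuming Spack's usual `setup_build_environment` hook and the `spack_cc` wrapper variable:

```python
def setup_build_environment(self, env):
    # Point the XS module build at Spack's compiler wrapper instead of
    # whatever compiler perl's Config.pm happens to record.
    env.set("PERL_EXT_CC", spack_cc)
```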

Co-authored-by: Paul Kuberry <pakuber@sandia.gov>
2022-12-28 15:20:53 -08:00
Todd Gamblin
5f8c706128 Consolidate how Spack uses git (#34700)
Local `git` tests will fail with `fatal: transport 'file' not allowed` when using git 2.38.1 or higher, due to a fix for `CVE-2022-39253`.

This was fixed in CI in #33429, but that doesn't help the issue for anyone's local environment. Instead of fixing this with git config in CI, we should ensure that the tests run anywhere.

- [x] Introduce `spack.util.git`.
- [x] Use `spack.util.git.get_git()` to get a git executable, instead of `which("git")` everywhere (usage sketched below).
- [x] Make all `git` tests use a `git` fixture that goes through `spack.util.git.get_git()`.
- [x] Add `-c protocol.file.allow=always` to all `git` invocations under `pytest`.
- [x] Revert changes from #33429, which are no longer needed.
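
Usage would look something like this (`repo_url` and `dest` are placeholders):

```python
import spack.util.git

git = spack.util.git.get_git()  # replaces scattered which("git") call sites
if git:
    repo_url, dest = "file:///tmp/some-repo.git", "/tmp/clone"  # placeholders
    # protocol.file.allow keeps file:// transports working under git >= 2.38.1
    git("-c", "protocol.file.allow=always", "clone", repo_url, dest)
```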
2022-12-28 00:44:11 -08:00
Rémi Lacroix
558695793f CPMD: Remove now unused "import" 2022-12-27 13:06:08 -08:00
Rémi Lacroix
b43a27674b CPMD: Update for open-source release
CPMD has been open-sourced on GitHub, so manual download is no longer needed. The patches have been included in the new 4.3 release.
2022-12-27 13:06:08 -08:00
Massimiliano Culpo
3d961b9a1f spack graph: rework to use Jinja templates and builders (#34637)
`spack graph` has been reworked to use:

- Jinja templates
- builder objects to construct the template context when DOT graphs are requested. 

This allowed adding a new colored output for DOT graphs that highlights both
the dependency types and the nodes that are needed at runtime for a given spec.
2022-12-27 15:25:53 +01:00
Todd Gamblin
d100ac8923 types: fix type annotations and remove novm annotations for llnl module
Apparently I forgot to do this in #34305.
2022-12-26 22:28:44 +01:00
Harmen Stoppels
e8fa8c5f01 timer: pick a single unit based on max duration. 2022-12-26 22:28:44 +01:00
Todd Gamblin
be6bb413df spack solve: use consistent units for time
`spack solve` is supposed to show you times you can compare: setup, ground, solve, etc.,
all in a list. You're also supposed to be able to compare easily across runs. With
`pretty_seconds()` (introduced in #33900), it's easy to miss the units, e.g., spot the
bottleneck here:

```console
> spack solve --timers tcl
    setup        22.125ms
    load         16.083ms
    ground        8.298ms
    solve       848.055us
    total        58.615ms
```

It's easier to see what matters if these are all in the same units, e.g.:

```
> spack solve --timers tcl
    setup         0.0147s
    load          0.0130s
    ground        0.0078s
    solve         0.0008s
    total         0.0463s
```

And the units won't fluctuate from run to run as you make changes.

- [x] make `spack solve` timings consistent like before
2022-12-26 22:28:44 +01:00
Adam J. Stewart
d23c302ca2 qt-base: ~network by default (#34688) 2022-12-26 10:19:03 -06:00
Rohit Goswami
ed0c1cea91 py-pytest-datadir: Init at 1.4.1 (#34692)
* py-pytest-datadir: Init at 1.4.1

* py-pytest-data-dir: Fix missing dep

Co-authored-by: "Adam J. Stewart" <ajstewart426@gmail.com>

Co-authored-by: "Adam J. Stewart" <ajstewart426@gmail.com>
2022-12-24 11:42:05 -07:00
Lucas Frérot
ffc42e287d py-uvw: added v0.5.0 (#34677) 2022-12-24 11:12:44 -06:00
Ralf Gommers
ba0d182e10 Update py-meson-python (0.11.0, 0.12.0) and meson (0.64.1, 1.0.0) (#34675)
* Update py-meson-python versions (0.11.0, 0.12.0)

* Update `meson` to version 0.64.1

* Add Meson 1.0.0

* Apply code review suggestions
2022-12-23 19:22:19 -07:00
David Zmick
8d8104de2c tmux: add 3.3a (#34671) 2022-12-24 01:52:32 +01:00
Adam J. Stewart
7975e0afbc QMakeBuilder: fix bug introduced during multi-bs refactor (#34683) 2022-12-23 13:57:44 -06:00
Adam J. Stewart
4a43522763 py-kornia: add v0.6.9 (#34652) 2022-12-22 15:13:52 -07:00
Adam J. Stewart
30343d65ba libelf: fix build on macOS x86_64 (#34646) 2022-12-22 14:58:32 -07:00
Alex Richert
38c1639c9c bacio: fix typo in patch method (#34663) 2022-12-22 18:59:32 +01:00
Wouter Deconinck
be5033c869 sherpa: add v2.2.13 (#34628) 2022-12-22 10:58:21 -07:00
Adam J. Stewart
eb67497020 ML CI: Linux x86_64 (#34299)
* ML CI: Linux x86_64

* Update comments

* Rename again

* Rename comments

* Update to match other arches

* No compiler

* Compiler was wrong anyway

* Faster TF
2022-12-22 11:31:40 -06:00
Loïc Pottier
371268a9aa added py-dynim package (#34651)
Signed-off-by: Loïc Pottier <48072795+lpottier@users.noreply.github.com>
2022-12-22 09:55:18 -06:00
Andrew Wood
344e8d142a Restrict a patch of rhash to versions >=1.3.6 (#34310) 2022-12-22 08:02:15 -07:00
Harmen Stoppels
161fbfadf4 Fix combine_phase_logs text encoding issues (#34657)
Avoid text decoding and encoding when combining log files, instead
combine in binary mode.

Also do a buffered copy which is sometimes faster for large log files.
2022-12-22 15:32:48 +01:00
Wladimir Arturo Garces Carrillo
3304312b26 neve: add new package (#34596)
Co-authored-by: WladIMirG <WladIMirG@users.noreply.github.com>
2022-12-22 07:27:07 -07:00
Alec Scott
3279ee7068 Add --fresh to docs to actually upgrade spack environments (#34433) 2022-12-22 11:19:24 +00:00
Todd Gamblin
8f3f838763 docs: show module documentation before submodules (#34258)
Currently, the Spack docs show documentation for submodules *before* documentation for
the packages that contain them. This means that if you put docs in `__init__.py` in
some package, the docs in there will be shown *after* the docs for all submodules of the
package instead of at the top as an intro to the package. See, e.g.,
[the lockfile docs](https://spack.readthedocs.io/en/latest/spack.environment.html#module-spack.environment),
which should be at the
[top of that page](https://spack.readthedocs.io/en/latest/spack.environment.html).

- [x] add the `--module-first` option to sphinx so that it generates module docs at top of page.
2022-12-22 11:50:48 +01:00
Todd Gamblin
09864d00c5 docs: remove monitors and analyzers (#34358)
These experimental features were removed in #31130, but the docs were not.

- [x] remove the `spack monitor` and `spack analyze` docs
2022-12-22 11:47:13 +01:00
Benjamin S. Kirk
0f7fa27327 librsvg: add 2.40.21, which does not require rust (#34585)
* librsvg: add 2.40.21, which does not require rust and has some security backports

https://download.gnome.org/sources/librsvg/2.40/librsvg-2.40.21.news

* librsvg: prevent finding broken gtkdoc binaries when ~doc is selected.

On my CentOS7 hosts, ./configure finds e.g. /bin/gtkdoc-rebase even when
~doc is selected.  These tools use Python2, and fail with an error:
"ImportError: No module named site"

So prevent ./configure from finding these broken tools when not building
the +doc variant.
2022-12-22 11:28:30 +01:00
Rocco Meli
a27139c081 openbabel: add 3.1.0 and 3.1.1 (#34631) 2022-12-22 03:17:50 -07:00
Vasileios Karakasis
4d4338db16 reframe: rework recipe, add v4.0.0-dev4 (#34584) 2022-12-22 09:53:42 +01:00
Annop Wongwathanarat
6d64ffdd1a quantum-espresso: enable linking with armpl-gcc and acfl for BLAS and FFT (#34416) 2022-12-22 09:50:51 +01:00
Harmen Stoppels
e9ea9e2316 index.json.hash, no fatal error if key cannot be fetched (#34643) 2022-12-22 09:48:05 +01:00
Benjamin Fovet
2a5509ea90 kokkos: add v3.7.01 (#34645)
Co-authored-by: Benjamin Fovet <benjamin.fovet@cea.fr>
2022-12-22 09:45:13 +01:00
Adam J. Stewart
b9d027f0cc py-pytorch-lightning: add v1.8.6 (#34647) 2022-12-22 09:43:57 +01:00
Christopher Christofi
6d54dc2a44 perl-config-simple: add 4.58 (#34649) 2022-12-22 09:43:41 +01:00
renjithravindrankannath
6cd9cbf578 Using corresponding commit ids of hiprand for each releases (#34545) 2022-12-22 09:36:14 +01:00
Alex Richert
72e81796d1 bacio: patch for v2.4.1 (#34575) 2022-12-22 09:12:29 +01:00
Andre Merzky
f116e6762a add py-psij-python and py-pystache packages (#34357)
* add psij package and deps

* update hashes, URLs

* linting

* Update var/spack/repos/builtin/packages/py-psij-python/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pystache/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pystache/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* Update package.py

apply suggested change

* Update package.py

apply suggested change

* Update package.py

ensure maintainer inheritance

* add psij to exaworks meta-package

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-12-21 20:07:35 -07:00
Howard Pritchard
c74bbc6723 paraview: patch catalyst etc. to build with oneapi (#33562)
Without this patch, the build of paraview has a meltdown when reaching 3rd-party catalyst and other packages,
with these types of errors:

   335    /tmp/foo/spack-stage/spack-stage-paraview-5.10.1-gscoqxhhakjyyfirdefuhmi2bzw4scho/spack-src/VTK/ThirdParty/fmt/vtkfmt/vtkfmt/format.h:1732:11: error: cannot capture a bi
            t-field by reference
   336          if (sign) *it++ = static_cast<Char>(data::signs[sign]);
   337              ^

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2022-12-21 17:07:24 -07:00
Harmen Stoppels
492a603d5e json: remove python 2 only code (#34615) 2022-12-21 14:18:12 -07:00
Adam J. Stewart
dab68687bd py-cartopy: older versions don't support Python 3.10 (#34626) 2022-12-21 21:23:22 +01:00
Thomas Madlener
1a32cea114 podio: add v0.16.2 (#34606) 2022-12-21 12:52:47 -07:00
Hector Martinez-Seara
aaec76652b relion: add v4.0.0 (#34600) 2022-12-21 20:41:13 +01:00
Michael Kuhn
f748911ea0 glib: add 2.74.3 (#34603) 2022-12-21 20:40:04 +01:00
Cory Bloor
e60e74694f rocm: make amdgpu_target sticky (#34591)
The sticky property will prevent clingo from changing the amdgpu_target
to work around conflicts. This is the same behaviour as was adopted for
cuda_arch in 055c9d125d.
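
For reference, a sticky variant declaration looks roughly like this (value list abbreviated, not the exact recipe):

```python
variant(
    "amdgpu_target",
    values=("none", "gfx906", "gfx908", "gfx90a"),  # abbreviated list
    default="none",
    multi=True,
    sticky=True,  # the concretizer may not flip this to work around conflicts
)
```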
2022-12-21 20:21:20 +01:00
Sergey Kosukhin
2ef026b8c6 eckit: skip broken test (#34610) 2022-12-21 20:20:05 +01:00
Mark W. Krentel
a6c2569b18 hpctoolkit: replace filter_file with upstream patch (#34604)
Replace the filter_file that fixed older configure scripts for rocm 5.3 with an
upstream patch.  Further, the patch is no longer needed for develop or
later releases.
2022-12-21 20:18:58 +01:00
shanedsnyder
5483b5ff99 darshan-runtime,darshan-util,py-darshan: update package versions for darshan-3.4.2 (#34583) 2022-12-21 20:18:27 +01:00
louisespellacy-arm
2b78a7099d arm-forge: add 22.1.2 (#34569) 2022-12-21 20:09:42 +01:00
lpoirel
34cdc6f52b starpu: add conflict for ~blocking +simgrid (#34616)
see 1f5a911d43
2022-12-21 20:09:23 +01:00
Adam J. Stewart
3aafdb06c9 py-pyproj: add new versions (#34633) 2022-12-21 20:00:53 +01:00
Harmen Stoppels
4a22c1c699 urlopen: handle timeout in opener (#34639) 2022-12-21 19:40:26 +01:00
Andrey Perestoronin
f021479ef0 feat: 🎸 Add new 2023.0.0 oneVPL package (#34642) 2022-12-21 11:07:41 -07:00
Manuela Kuhn
3f374fb62f py-vcrpy: add 4.2.1 (#34636) 2022-12-21 11:02:55 -07:00
Niclas Jansson
949be42f32 neko: add v0.5.0 (#34640) 2022-12-21 11:02:37 -07:00
Rob Falgout
e5abd5abc1 hypre: add v2.27.0 (#34625) 2022-12-21 11:02:23 -07:00
Harmen Stoppels
4473d5d811 etags for index.json invalidation, test coverage (#34641)
Implement an alternative strategy to do index.json invalidation.

The current approach of pairs of index.json / index.json.hash is
problematic because it leads to races.

The standard solution for cache invalidation is etags, which are
supported by both http and s3 protocols, which allows one to do
conditional fetches.

This PR implements that for the http/https schemes. It should also work
for s3 schemes, but that requires other PRs to be merged.

Also it improves unit tests for index.json fetches.
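
In general terms the technique looks like this (a generic `urllib` sketch, not Spack's implementation): remember the `ETag` from a previous fetch and send it back in `If-None-Match`, so an unchanged `index.json` costs a cheap 304 instead of a full download.

```python
import urllib.error
import urllib.request


def fetch_if_changed(url, etag=None):
    request = urllib.request.Request(url)
    if etag:
        request.add_header("If-None-Match", etag)  # conditional fetch
    try:
        response = urllib.request.urlopen(request)
    except urllib.error.HTTPError as err:
        if err.code == 304:  # Not Modified: the cached copy is still valid
            return None, etag
        raise
    return response.read(), response.headers.get("ETag")
```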
2022-12-21 18:41:59 +01:00
Mikael Simberg
c3e61664cf Add patch for pika on macOS (#34619) 2022-12-21 13:41:49 +01:00
Manuela Kuhn
c3217775c3 py-scikit-image: add 0.19.3 (#34618)
* py-scikit-image: add 0.19.3

* Update var/spack/repos/builtin/packages/py-scikit-image/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-20 12:59:48 -06:00
kwryankrattiger
58a7e11db9 DAV: VTK-m needs to install examples for smoke test (#34611)
The SDK deployment aims to validate and run VTK-m via Spack
deployments, so the examples should be installed.
2022-12-20 09:56:50 -08:00
Andrey Perestoronin
ac570bb5c4 2023.0.0 oneAPI release promotion (#34617) 2022-12-20 08:47:08 -07:00
Massimiliano Culpo
b2c806f6fc archspec: add support for zen4 (#34609)
Also add:
- Upper bound for Xeon Phi compiler support
- Better detection for a64fx
2022-12-20 11:22:50 +01:00
Nicholas Knoblauch
bd613b3124 Remove dep on jupyter meta-package (#34573) 2022-12-19 17:22:34 -06:00
Manuela Kuhn
f1b85bc653 py-nipype: add 1.8.5 and py-looseversion: add new package (#34608) 2022-12-19 13:25:22 -06:00
Hector Martinez-Seara
e1fab4dd51 Gromacs: added version 2022.4 (#34599) 2022-12-19 15:30:20 +01:00
Anton Kozhevnikov
a924079f66 [ELPA] add sha256 for elpa-2022.11.001.rc2.tar.gz (#33439) 2022-12-19 04:12:02 -07:00
Adam J. Stewart
c5aff1d412 py-horovod: patch no longer applies (#34593) 2022-12-19 11:49:02 +01:00
Adam J. Stewart
6c9602ee64 aws-sdk-cpp: add v1.10.32 (#34592) 2022-12-19 11:48:31 +01:00
Adam J. Stewart
64327bfef0 py-pyvista: add v0.37.0 (#34590) 2022-12-19 11:48:01 +01:00
Adam J. Stewart
05c3cb7cc9 netcdf-cxx: add patch to fix macOS build (#34588) 2022-12-19 11:46:33 +01:00
Adam J. Stewart
c87b251639 XNNPACK: fix build on macOS, update deps (#34555) 2022-12-19 11:44:56 +01:00
Adam J. Stewart
f2332a17d3 Node.js: new versions, newer Python support, macOS fixes (#34478) 2022-12-19 11:40:31 +01:00
Adam J. Stewart
c7f24a132e py-numpy: add v1.24.0 (#34602) 2022-12-18 17:17:06 -07:00
Alec Scott
96a7af1dd2 Add py-docstring-to-markdown v0.11 (#34595) 2022-12-18 14:59:47 -06:00
eugeneswalker
db1caa9e92 intel-oneapi-dpl: add v2022.0.0 (#34601) 2022-12-18 13:44:23 -05:00
iarspider
237d26460d LLVM: replace libelf dependency with elf (#34265)
* LLVM: replace libelf dependency with elf

I didn't test this extensively, but in CMS LLVM builds just fine with elfutils.

* [@spackbot] updating style on behalf of iarspider

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-12-17 14:44:27 -08:00
Harmen Stoppels
1020b65297 fix != -> == typo (#34568) 2022-12-17 20:15:15 +01:00
Adam J. Stewart
dceb4c9d65 Update PyTorch ecosystem (#34582) 2022-12-17 11:51:59 -07:00
Alex Richert
50570ea334 Add static-only option for ESMF (#34576) 2022-12-17 04:27:22 -07:00
eugeneswalker
7e836b925d e4s: disable mac stack due to binary relocation issue #32571 (#34560) 2022-12-17 10:53:15 +00:00
Benjamin S. Kirk
cec3da61d2 Add gimp & dependent packages (#34558)
* exiv2: add new versions

* babl: new package required to build GIMP

* gegl: new package required to build GIMP

* gexiv2: new package required to build GIMP

* libmypaint: new package required to build GIMP

* mypaint-brushes: new package required to build GIMP

* vala: new package required to build GIMP

* GIMP: new package definition for building GIMP-2.10 from source

* libjxl: update for 0.7.0

* libwmf: a library for reading vector images in Windows Metafile Format (WMF)

* libde265: an open source implementation of the h.265 video codec

* libwebp: add new versions

* GIMP: additional variants for building GIMP-2.10 from source

* libde265: remove boilerplate

* fixes for style precheck

* updates based on feedback

* fixes for style precheck
2022-12-17 03:52:56 -07:00
Mikhail Titov
7ed53cf083 Update package versions: RADICAL-Cybertools (RE, RG, RP, RS, RU) (#34572)
* rct: update packages (RE, RG, RP, RS, RU) with new versions

* re: fixed radical-pilot requirement for radical-entk
2022-12-16 22:24:00 -07:00
eugeneswalker
bdc3ab5b54 intel-oneapi-compilers: add v2023.0.0 (#34571) 2022-12-16 21:38:51 -07:00
Jack Morrison
5a985e33ea Add --enable-orterun-prefix-by-default configure option for OpenMPI (#34469) 2022-12-16 17:59:24 -07:00
Bernhard Kaindl
9817593c1c Automake requires Thread::Queue, but it is only provided with perl+threads. (#34076)
Update the depends_on("perl") to depends_on("perl+threads").

This and #34074 are needed to properly handle e.g. the perl-Thread-Queue
rpm package:

It may not be installed on RedHat-based hosts, which can lead to automake
build failures when `spack external find perl` or `spack external find --all`
was used to register the system-provided perl install.
2022-12-16 16:11:11 -08:00
Rémi Lacroix
1cc78dac38 octopus: Ensure MPI is used consistently (#33969)
Some variants have MPI dependencies; make sure they can be used only when the `mpi` variant is enabled.
2022-12-16 15:17:37 -08:00
Marco De La Pierre
e2c5fe4aa3 adding 2nd bunch of nf-core deps from update/nextflow-tools (#34562)
* adding 2nd bunch of nf-core deps from update/nextflow-tools

* Update var/spack/repos/builtin/packages/py-a2wsgi/package.py

* Update var/spack/repos/builtin/packages/py-apispec/package.py

* Update var/spack/repos/builtin/packages/py-bagit-profile/package.py

* Update var/spack/repos/builtin/packages/py-bagit-profile/package.py

* Update var/spack/repos/builtin/packages/py-bagit-profile/package.py

* Update var/spack/repos/builtin/packages/py-bdbag/package.py

* Update var/spack/repos/builtin/packages/py-schema-salad/package.py

* Update var/spack/repos/builtin/packages/py-schema-salad/package.py

* Update var/spack/repos/builtin/packages/py-tuspy/package.py

* Update var/spack/repos/builtin/packages/py-schema-salad/package.py

* Update var/spack/repos/builtin/packages/py-schema-salad/package.py

* Update var/spack/repos/builtin/packages/py-bdbag/package.py

* Update var/spack/repos/builtin/packages/py-bdbag/package.py

* Update var/spack/repos/builtin/packages/py-bioblend/package.py

* Update var/spack/repos/builtin/packages/py-circus/package.py

* Update var/spack/repos/builtin/packages/py-circus/package.py

* Update var/spack/repos/builtin/packages/py-cloudbridge/package.py

* Update var/spack/repos/builtin/packages/py-cloudbridge/package.py

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-16 15:18:49 -07:00
Marco De La Pierre
1bf87dbb5d Adding first bunch of recipes for dependencies of nf-core-tools (#34537)
* nextflow recipe: added latest stable version

* tower-cli recipe: added latest release

* recipes tower-agent and tower-cli renamed to nf-tower-agent and nf-tower-cli

* recipes nf-tower-agent and nf-tower-cli: small fix

* nf-core-tools recipe: added most py- dependencies

* nf-core-tools: recipe without galaxy-tool-util (for testing)

* fixed typos in py-yacman recipe

* fixed typos in py-pytest-workflow recipe

* fixed typo in nf-core-tools recipe

* fixed typos in py-yacman recipe

* fixes in recipes for py-questionary and py-url-normalize

* fixes to py-yacman recipe

* style fixes to py- packages that are dependencies to nf-core-tools

* fix in py-requests-cache recipe

* added missing dep in py-requests-cache recipe

* nf-core-tools deps: removed redundant python dep for py packages oyaml and piper

* nf-core-tools recipe: final, incl dep on py-galaxy-tool-util

* nf-core-tools: new version with extra dependency

* commit to merge packages on focus from update/nextflow-tools

* nf-core: commenting galaxy dep for this pr

* Update var/spack/repos/builtin/packages/py-requests-cache/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-requests-cache/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* removed nf-core-tools from this branch, will be back at the end

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-16 14:28:51 -07:00
Sam Reeve
ffe527b141 Add HACCabana proxy app (#34567) 2022-12-16 14:03:08 -07:00
John W. Parent
642c5b876b Compiler detection: avoid false recognition of MSVC (#34574)
Interim fix for #34559

Spack's MSVC compiler definition uses ifx as the Fortran compiler.
Prior to #33385, the Spack MSVC compiler definition required the
executable to be called "ifx.exe"; #33385 replaced this with just
"ifx", which inadvertently led to ifx falsely indicating the
presence of MSVC on non-Windows systems (which leads to future
errors when attempting to query/use those compiler objects).

This commit applies a short-term fix by updating MSVC Fortran
version detection to always indicate a failure on non-Windows.
2022-12-16 19:22:04 +00:00
Brian Spilner
8b7bd6dc74 new release cdo-2.1.1 (#34548) 2022-12-16 11:16:32 -08:00
Adam J. Stewart
2f97dc7aa6 py-pytorch-lightning: add v1.8.5 (#34557) 2022-12-16 11:10:19 -08:00
Adam J. Stewart
958d542f81 GDAL: add v3.6.1 (#34556) 2022-12-16 10:32:54 -08:00
Vicente Bolea
b1aae1c2ed vtk-m: add v2.0.0-rc1 (#34561) 2022-12-16 10:31:10 -08:00
SXS Bot
690f9d69fe spectre: add v2022.12.16 (#34570)
* spectre: add v2022.12.16
* [@spackbot] updating style on behalf of sxs-bot

Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2022-12-16 11:27:56 -07:00
Marc Joos
a78c16a609 add version 3.6.4 to wi4mpi (#34565) 2022-12-16 10:26:46 -08:00
snehring
7bb2d3cca3 nwchem: restricting current versions to python@3.9 at latest (#34506) 2022-12-16 17:20:19 +01:00
Paul Kuberry
7216050dd3 libzmq: make location of libsodium explicit (#34553) 2022-12-15 16:17:15 -07:00
eugeneswalker
2f26e422d6 nco: add v5.0.6 (#34512) 2022-12-15 15:42:13 -07:00
Brian Van Essen
3477d578a3 roctracer: fixed a bug in how the external is identified (#33517)
Make the package a proper ROCm package.
2022-12-15 23:29:36 +01:00
Zack Galbreath
aa8e1ba606 gitlab ci: more resources for slow builds (#34505) 2022-12-15 14:35:54 -07:00
Manuela Kuhn
08e007e9a6 py-traits: add 6.4.1 (#34550) 2022-12-15 13:10:16 -06:00
eugeneswalker
d6fb65ebc6 eckit: add v1.19.0 (#34510) 2022-12-15 10:38:24 -08:00
eugeneswalker
2b5be919dd odc: add v1.4.5 (#34513) 2022-12-15 10:38:06 -08:00
Sebastian Grimberg
cc2dff48a8 arpack-ng: add variant for ISO C binding support (#34529)
Co-authored-by: Sebastian Grimberg <sjg@amazon.com>
2022-12-15 10:56:13 -07:00
Massimiliano Culpo
22922bf74c Propagate exceptions from Spack python console (#34547)
fixes #34489

Customize sys.excepthook to raise SystemExit when
any unhandled exception reaches the hook.
2022-12-15 18:08:53 +01:00
Sean Koyama
8a02463d7d IntelOneApiPackage: add envmods variant to toggle environment modifications by oneapi packages (#34253)
Co-authored-by: Sean Koyama <skoyama@anl.gov>
Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
2022-12-15 17:52:09 +01:00
Harmen Stoppels
c6465bd9bd Add a proper deprecation warning for update-index -d (#34520) 2022-12-15 17:45:32 +01:00
Harmen Stoppels
9025caed6e Remove warning in download_tarball (#34549) 2022-12-15 14:03:30 +00:00
Massimiliano Culpo
7056a4bffd Forward lookup of the "run_tests" attribute (#34531)
fixes #34518

Fix an issue due to the MRO chain of the package wrapper
during build. Before this PR we were always returning
False when the builder object was created before the
run_tests method was monkey patched.
2022-12-15 09:35:33 +01:00
snehring
d2aa8466eb metabat: adding missing build dependency (#34530) 2022-12-15 09:23:59 +01:00
Loïc Pottier
6e4684fbca talass: fixed URLs so the package is reachable (#34387)
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
2022-12-15 09:23:05 +01:00
Brian Vanderwende
fcbf617d38 ncl: add RPC lib with ncl+hdf4 (#34451) 2022-12-15 09:22:00 +01:00
downloadico
1f8b55a021 Add G'MIC package with only the "cli" target available (#34533) 2022-12-15 09:19:50 +01:00
David Gardner
b5f8ed07fb sundials: fix typo in smoke tests (#34539) 2022-12-15 09:07:54 +01:00
Thomas Madlener
65bd9b9ac5 podio, edm4hep: add v0.7.2 and v0.16.1 respectively (#34526)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-12-15 09:02:16 +01:00
Adam J. Stewart
6250d84b41 cpuinfo: new versions, shared libs (#34544) 2022-12-15 09:00:51 +01:00
Wouter Deconinck
99056e03bd acts: new versions 19.11.0, 21.0.0, 21.1.0 (#34540)
* acts: new versions 19.11.0, 21.0.0, 21.1.0

https://github.com/acts-project/acts/compare/v19.10.0...v19.11.0:
- python 3.8 required if ACTS_BUILD_EXAMPLES_PYTHON_BINDINGS

https://github.com/acts-project/acts/compare/v20.3.0...v21.0.0:
- python 3.8 required if ACTS_BUILD_EXAMPLES_PYTHON_BINDINGS

https://github.com/acts-project/acts/compare/v21.0.0...v21.1.0:
- no build system changes

* acts: depends_on python@3.8: when sometimes
2022-12-15 08:56:32 +01:00
Fabien Bruneval
1db849ee5f libcint: Fix +coulomb_erf and add +pypzpx (#34524) 2022-12-15 05:31:58 +01:00
Thomas Madlener
2f82b213df lcio: add latest version (#34527) 2022-12-15 05:06:59 +01:00
Axel Huebl
2a5f0158bc ParaView: Add openPMD Support (#33821)
openPMD, a metadata standard on top of backends like ADIOS2 and HDF5,
is implemented in ParaView 5.9+ via a Python3 module.

Simplify Conflicts & Variant

Add to ECP Data Vis SDK
2022-12-14 20:45:27 -07:00
Manuela Kuhn
21a1f7dd97 py-traitlets: add 5.7.1 (#34525) 2022-12-14 21:34:51 -06:00
David Boehme
4b5ed94af4 caliper: add version 2.9.0 (#34538) 2022-12-15 03:52:53 +01:00
snehring
06788019a4 apptainer: add new version 1.1.4 (#34536) 2022-12-15 02:06:22 +01:00
Sam Grayson
cab8f795a7 Patch dill._dill._is_builtin_module (#34534)
* Patch dill._dill._is_builtin_module

* Fix style

* Add test
2022-12-14 16:03:03 -07:00
finkandreas
2db38bfa38 py-archspec: replace removed .build_directory with .stage.source_path (#34521) 2022-12-15 00:00:21 +01:00
Harmen Stoppels
ea029442e6 Revert "Revert "Use urllib handler for s3:// and gs://, improve url_exists through HEAD requests (#34324)"" (#34498)
This reverts commit 8035eeb36d.

And also removes logic around an additional HEAD request to prevent
a more expensive GET request on wrong content-type. Since large files
are typically an attachment and only downloaded when reading the
stream, it's not an optimization that helps much, and in fact the logic
was broken since the GET request was done unconditionally.
2022-12-14 23:47:11 +01:00
Axel Huebl
43e38d0d12 WarpX 22.11, 22.12 & PICMI-Standard (#34517)
* PICMI: 0.0.22

* WarpX: 22.11, 22.12
2022-12-14 13:59:16 -08:00
Marco De La Pierre
2522c8b754 edits to 8x existing recipes, mostly new versions, plus two dependency fixes (#34516) 2022-12-14 12:28:33 -07:00
Marco De La Pierre
f64cb29aea Nextflow, Tower Agent, Tower CLI: updates (#34515)
* renamed tower-agent and tower-cli with prefix nf-

* new nextflow package version

* added newest versions (today) for nf-tower-agent and nf-tower-cli
2022-12-14 11:44:22 -07:00
Erik Heeren
80e30222e1 New neuroscience packages: py-bmtk, py-neurotools (#34464)
* Add py-bmtk and py-neurotools

* py-bmtk: version bump

* [@spackbot] updating style on behalf of heerener

* Maybe the copyright needs to be extended to 2022 for the check to pass

* Process review remarks

* Update var/spack/repos/builtin/packages/py-neurotools/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-14 12:00:26 -06:00
eugeneswalker
55356e9edb bufr: add v11.6, 11.7, 11.7.1 (#34509) 2022-12-14 06:24:26 -07:00
eugeneswalker
eec09f791d fms: add v2019.01.03 (#34511) 2022-12-14 04:50:00 -07:00
Harmen Stoppels
9032179b34 Use update-index --mirror-url <url> instead of -d <url> (#34519) 2022-12-14 10:03:18 +01:00
Alberto Sartori
45b40115fb justbuild: add v1.0.0 (#34467) 2022-12-14 01:17:42 -07:00
snehring
e030833129 r-rgdal: adding new version 1.6-2 (#34502) 2022-12-13 20:06:31 -07:00
Harmen Stoppels
e055dc0e64 Use file paths/urls correctly (#34452)
The main issue that's fixed is that Spack passes paths (as strings) to
functions that require urls. That wasn't an issue on unix, since there
you can simply concatenate `file://` and `path` and all is good, but on
Windows that gives invalid file urls. Also on Unix, Spack would not deal with uri encoding like x%20y for file paths. 

It also removes Spack's custom url.parse function, which had its own incorrect interpretation of file urls, taking file://x/y to mean the relative path x/y instead of hostname=x and path=/y. Also it automatically interpolated variables, which is surprising for a function that parses URLs.

Instead of all sorts of ad-hoc `if windows: fix_broken_file_url` this PR
adds two helper functions around Python's own `pathname2url` and its reverse.

Also fixes a bug where some `spack buildcache` commands
used `-d` as a flag to mean `--mirror-url` requiring a URL, and others
`--directory`, requiring a path. It is now the latter consistently.
2022-12-13 23:44:13 +01:00
Matthias Wolf
c45729cba1 py-submitit: add 1.4.5 (#34460) 2022-12-13 16:11:14 -06:00
Manuela Kuhn
b02b2f0f00 py-tifffile: add 2022.10.10 (#34499) 2022-12-13 16:09:01 -06:00
Manuela Kuhn
3ded50cc8c py-sphinxcontrib-qthelp: add 1.0.3 (#34495) 2022-12-13 16:08:06 -06:00
Manuela Kuhn
a7280cd5bb py-sqlalchemy: add 1.4.45 (#34497) 2022-12-13 16:07:34 -06:00
Paul Kuberry
2837b47ea5 trilinos: extend range of Teuchos patch (#34504) 2022-12-13 14:49:20 -07:00
Matthew Thompson
ea2c61c683 Update pFunit, add gFTL, gFTL-Shared, fArgParse, pFlogger, yaFyaml (#34476)
* Add GFE packages, Update pFUnit
* Remove citibeth as maintainer per her request
* Version 3.3.0 is an odd duck. Needs a v

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-12-13 22:40:33 +01:00
Bernhard Kaindl
217b34825a py-tensorboard-data-server: build needs rust+rustfmt (#34465) 2022-12-13 10:56:31 -07:00
Mosè Giordano
17d90f4cbc pcre2: add new versions and update URL (#34477) 2022-12-13 10:48:27 -07:00
Annop Wongwathanarat
7a5bd8cac4 gromacs: enable linking with acfl FFT (#34494) 2022-12-13 09:32:42 -08:00
Harmen Stoppels
333da47dc7 Don't fetch to order mirrors (#34359)
When installing binary tarballs, Spack has to download from its
binary mirrors.

Sometimes Spack has cache available for these mirrors.

That cache helps to order mirrors to increase the likelihood of
getting a direct hit.

However, currently, when Spack can't find a spec in any local cache
of mirrors, it's very dumb:

- A while ago it used to query each mirror to see if it had a spec,
  and use that information to order the mirrors again, only to go
  ahead and redo exactly a part of what it just did: fetch the spec
  from that mirror
- Recently, it was changed to download a full index.json, which
  can be multiple dozens of MBs of data and may take a minute to
  process thanks to the blazing fast performance you get with
  Python.

In a typical use case of concretizing with reuse, the full index.json
is already available, and it is likely that the local cache gives a perfect
mirror ordering on install. (There's typically no need to update any
caches).

However, in the use case of Gitlab CI, the build jobs don't have cache,
and it would be smart to just do direct fetches instead of all the
redundant work of (1) and/or (2).

Also, direct fetches from mirrors will soon be fast enough to
prefer these direct fetches over the excruciating slowness of
index.json files.
2022-12-13 17:07:11 +01:00
dependabot[bot]
8b68b4ae72 build(deps): bump actions/checkout from 3.1.0 to 3.2.0 (#34480)
Bumps [actions/checkout](https://github.com/actions/checkout) from 3.1.0 to 3.2.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](93ea575cb5...755da8c3cf)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-13 09:05:50 -07:00
Adam J. Stewart
40a3fdefa8 py-cartopy: add v0.21.1 (#34482) 2022-12-13 07:12:24 -07:00
Adam J. Stewart
a61474f2c1 libicd: macOS now supported (#34483) 2022-12-13 07:12:00 -07:00
Aidan Heerdegen
b95a75779b Fix markdown links in rst files (#34488) 2022-12-13 14:11:38 +00:00
Harmen Stoppels
0ff6a1bd1c spack/package.py: improve editor support for some +/- static props (#34319) 2022-12-13 13:55:32 +01:00
Massimiliano Culpo
f9cfc2f57e scons: fix signature for install_args (#34481) 2022-12-13 12:21:44 +01:00
Adam J. Stewart
f4fb20e27e py-shapely: add v2.0.0 (#34475) 2022-12-13 09:59:23 +01:00
Massimiliano Culpo
3ff5d49102 Be strict on the markers used in unit tests (#33884) 2022-12-13 09:21:57 +01:00
Erik Heeren
238d4f72f5 py-pyld: add with dependency (#34472)
* py-pyld: add with dependency

* py-pyld and py-frozendict: update copyright expiration

* [@spackbot] updating style on behalf of heerener
2022-12-12 20:15:43 -07:00
Matthias Wolf
c5bc469eeb py-sh: new versions (#34458)
* py-sh: new versions

* style
2022-12-12 20:15:28 -07:00
Sam Grayson
b01e7dca9d Update packages for running azure (#34403)
* Update packages for running azure

* Update py-msal-extensions

* Respond to comments
2022-12-12 21:10:50 -06:00
Jean Luca Bez
c62906f781 New python package: Drishti (#33316)
* include Drishti

* fix syntax

* Update var/spack/repos/builtin/packages/drishti/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

* Update var/spack/repos/builtin/packages/drishti/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-12 19:33:50 -07:00
Sam Grayson
94bac8d6dd Add new package: micromamba (#34195)
* Add new packages

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* style

* wip

* Respond to comments

* Respond to comments

* Spack style

* Remove linkage=full_static to pass package audit

* Spack style

* Moved tl-expected version
2022-12-12 14:00:41 -06:00
Manuela Kuhn
cd9c9b47e8 py-sphinxcontrib-devhelp: add 1.0.2 (#34462)
* py-sphinxcontrib-devhelp: add 1.0.2

* Update var/spack/repos/builtin/packages/py-sphinxcontrib-devhelp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-12 13:57:06 -06:00
Manuela Kuhn
8560295529 py-sphinxcontrib-applehelp: add 1.0.2 (#34461)
* py-sphinxcontrib-applehelp: add 1.0.2

* Update var/spack/repos/builtin/packages/py-sphinxcontrib-applehelp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-12 13:56:48 -06:00
Adam J. Stewart
fd248ad0b8 GEOS: add v3.10-3.11 (#34473) 2022-12-12 11:50:03 -08:00
renjithravindrankannath
0578ccc0e6 ROCm 5.3.0 updates (#33320)
* ROCm 5.3.0 updates
* New patches for 5.3.0 on hip and hsakmt
* Adding additional build arguments in hip and llvm
* RVS updates for 5.3.0 release
* New patches and rocm-tensile, rocprofiler-dev, roctracer-dev recipe updates for 5.3.0
* Reverting OPENMP fix from rocm-tensile
* Removing the patch to compile without git and adding support to build without it
* Install libraries into the lib directory instead of lib64 across all platforms
* Setting lib install directory to lib
* Disable gallivm coroutine for libllvm15
* Update llvm-amdgpu prefix path in hip-config.cmake.in
  Removing libllvm15 from Mesa dependency
* hip-config.cmake.in update required from 5.2
* hip-config.cmake.in update required from 5.2 and above
* hip-config.cmake.in update required for all 5.2 release above
* Style check correction in hip update
* ginkgo: add missing include
* Patching hsa include path for rocm 5.3
* Restricting patch for llvm-15
* Style check error correction
* PIC flag required for the new test applications
* Passing -DCMAKE_POSITION_INDEPENDENT_CODE=ON in the cmake_args instead of setting -fPIC in CFLAGS

Co-authored-by: Cordell Bloor <Cordell.Bloor@amd.com>
2022-12-12 13:46:20 -06:00
Glenn Johnson
fcc2ab8b4b julia: have recipe explicitly use Spack compiler wrapper (#34365) 2022-12-12 19:53:26 +01:00
Vanessasaurus
76511ac039 Automated deployment to update package flux-core 2022-12-12 (#34456) 2022-12-12 11:47:36 -07:00
Jim Edwards
e4547982b3 allow esmf to use parallelio without mpi (#34182)
* allow esmf to use parallelio without mpi
* add hash for 8.4.0
* spack no longer sets arch to cray
2022-12-12 09:50:41 -08:00
Manuela Kuhn
80722fbaa3 py-snowballstemmer: add 2.2.0 (#34459) 2022-12-12 10:23:55 -06:00
Wouter Deconinck
c2fa444344 geant4: rm preference for 10.7.3 now that 11.1.0 is out (#34445) 2022-12-12 09:05:47 -07:00
Stephen Sachs
088ece1219 [texinfo] @7.0: needs c-11 syntax (#34261)
gnulib/lib/malloca.c uses the single-value `static_assert()`, which is only available with C11
syntax. `gcc` seems to be fine, but `icc` needs an extra flag.
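
One way to express that in the package (a sketch using Spack's `flag_handler` hook; the exact flag `icc` needs is an assumption here):

```python
def flag_handler(self, name, flags):
    # Assumed: icc needs an explicit -std=c11 for gnulib's static_assert().
    if name == "cflags" and self.spec.satisfies("%intel"):
        flags.append("-std=c11")
    return (flags, None, None)  # (injected, env, build-system) flag groups
```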

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-12-12 16:52:26 +01:00
Veselin Dobrev
fcdd275564 MFEM: fix issue with cxxflags (#34435) 2022-12-12 16:52:00 +01:00
Mikael Simberg
b6d6a1ab2c Build tests for fmt conditionally (#34424) 2022-12-12 16:49:05 +01:00
Robert Blake
7efcb5ae73 Fixes to the silo packages for 4.11. (#34275) 2022-12-12 07:39:24 -07:00
Mikael Simberg
06e6389258 stdexec: skip build phase (#34425)
Since it's a header-only library there's nothing to build. However, the
default targets include tests and examples, and there's no option to turn
them off at configuration time.
2022-12-12 07:16:40 -07:00
Simon Flood
b7f0f7879d foam-extend: add v4.1 (released Oct 2019) (#34398) 2022-12-12 07:16:17 -07:00
Bernhard Kaindl
f7cfbe2702 hdf5: "hdf5@1.13:" needs a depends_on "cmake@3.18:" for build. (#34447) 2022-12-12 15:12:55 +01:00
Wouter Deconinck
1466f8d602 geant4-data: depends_on g4emlow@7.9.1 when @10.6 (#34444)
Per https://geant4.web.cern.ch/node/1837 the correct dependency for 10.6 is on `g4emlow@7.9.1`, not on both `g4emlow@7.9` and `g4emlow@7.9.1`.

This is a minor cosmetic fix. The concretization for 10.6 works just fine here. But this removes the duplicate entry.
2022-12-12 07:11:42 -07:00
Glenn Johnson
9fdb36585f Fix openblas build with intel compiler (#34432)
This PR patches the f_check script to detect the ifort compiler and
ensure that F_COMPILER is set to INTEL. This problem was introduced with
openblas-0.3.21. Without this patch, the value of F_COMPILER falls back
to G77, and icc rather than ifort is used for the linking stage. That
results in the openblas library missing libifcore, which in turn means
many Fortran programs can not be compiled with ifort.
2022-12-12 14:27:54 +01:00
Filippo Spiga
1f0a9fdc11 Adding NVIDIA HPC SDK 22.11 (#33954) 2022-12-12 14:26:39 +01:00
Jen Herting
0baba62900 arrow: dependency fixes (#33666)
+python needs more dependencies
don't look for dependency spec when it's not there
2022-12-12 14:26:02 +01:00
iarspider
4a0e34eda8 Add checksum for py-prometheus-client 0.14.1 (#34259) 2022-12-12 13:32:02 +01:00
Luke Diorio-Toth
88f2f59d92 Added ARM/aarch64 conflict to Eddy/Rivas lab tools (#34190) 2022-12-12 13:26:57 +01:00
Bernhard Kaindl
c1d11975f5 intel-parallel-studio: package is only available for x86_64 (#34392) 2022-12-12 12:09:29 +01:00
Glenn Johnson
cca56291c6 libgit2: add pcre dependency for @0.99: (#34289) 2022-12-12 11:55:49 +01:00
dependabot[bot]
ef155c16f0 build(deps): bump actions/setup-python from 4.3.0 to 4.3.1 (#34413)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.3.0 to 4.3.1.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](13ae5bb136...2c3dd9e7e2)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-12 11:37:06 +01:00
Adam J. Stewart
0952d314bd py-pytorch-lightning: add v1.8.4 (#34426) 2022-12-12 11:35:20 +01:00
Wileam Y. Phan
f29ac34558 nvhpc: add v22.11 (#34410) 2022-12-12 11:35:00 +01:00
snehring
47628521b9 delly2: add v1.1.6 (#34411) 2022-12-12 11:31:26 +01:00
Todd Gamblin
62da76cb5d directives: depends_on should not admit anonymous specs (#34368)
Writing a long dependency like:

```python
     depends_on(
         "llvm"
         "targets=amdgpu,bpf,nvptx,webassembly"
         "version_suffix=jl +link_llvm_dylib ~internal_unwind"
     )
```

when it should be formatted like this:

```python
     depends_on(
         "llvm"
         " targets=amdgpu,bpf,nvptx,webassembly"
         " version_suffix=jl +link_llvm_dylib ~internal_unwind"
     )
```

can cause really subtle errors. Specifically, you'll get something like this in
the package sanity tests:

```
    AttributeError: 'NoneType' object has no attribute 'rpartition'
```

because Spack happily constructs a class that has a dependency with name `None`.

We can catch this earlier by banning anonymous dependency specs directly in
`depends_on()`.  This causes the package itself to fail to parse, and emits
a much better error message:

```
==> Error: Invalid dependency specification in package 'julia':
    llvmtargets=amdgpu,bpf,nvptx,webassemblyversion_suffix=jl +link_llvm_dylib ~internal_unwind
```
2022-12-12 11:24:28 +01:00
Brian Vanderwende
65c914fff7 netcdf-c: add libxml2 when +dap (#34178) 2022-12-12 11:04:38 +01:00
Mikael Simberg
dd7b2deb47 Only restrict CMake version in Umpire when examples and rocm are enabled (#32025)
* Only restrict CMake version in umpire when examples and rocm are enabled

* Add CMAKE_HIP_ARCHITECTURES to Umpire and lift cmake version restriction

Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2022-12-12 10:55:37 +01:00
Adam J. Stewart
7d72aeb4fe py-tensorboard-data-server: add Linux aarch64 support (#34437) 2022-12-12 10:40:48 +01:00
John W. Parent
43d97afd8b Bump CMake version to 3.25.1 (#34336) 2022-12-12 10:35:27 +01:00
Robert Cohn
39f13853ba intel-oneapi-* conflicts for non linux, x86 (#34441) 2022-12-12 09:23:14 +01:00
Sebastian Pipping
d65b9c559a expat: Add latest release 2.5.0 with security fixes (#34453) 2022-12-12 00:08:44 -07:00
Stephen Sachs
bde5720a81 glib: Add list_url+list_depth to list versions (#33904)
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2022-12-12 06:51:09 +01:00
Harmen Stoppels
2371ec7497 openblas: fix bound :7.3 to :7.3.0 (#34443)
This patch:

https://gcc.gnu.org/legacy-ml/gcc-patches/2018-01/msg01962.html

is actually in Amazon Linux GCC 7.3.1, which we use in CI.

So we should not hold openblas back because of it.

Old versions of OpenBLAS fail to detect the host arch of some of the
AVX512 cpus of build nodes, causing build failures.

Of course we should try to set ARCH properly in OpenBLAS to avoid that
it looks up the build arch, but that's quite some work.
2022-12-11 19:02:07 +01:00
Todd Gamblin
aa3b6e598f pkg grep: use capfd instead of executable for tests 2022-12-10 16:43:44 -08:00
Todd Gamblin
8035eeb36d Revert "Use urllib handler for s3:// and gs://, improve url_exists through HEAD requests (#34324)"
This reverts commit db8f115013.
2022-12-10 16:43:44 -08:00
Michael Kuhn
57383a2294 py-scipy: print error message if no Fortran compiler is available (#34439) 2022-12-10 20:19:50 +01:00
Adam J. Stewart
9517dab409 py-scikit-learn: add v1.2.0 (#34408) 2022-12-10 11:10:31 -06:00
Manuela Kuhn
84fa4e6c4c py-setuptools-scm-git-archive: add 1.4 (#34422) 2022-12-10 09:58:39 -06:00
Harmen Stoppels
f33507961d py-{boto3,botocore,jmespath,s3transfer} bump (#34423) 2022-12-10 09:07:58 -06:00
Adam J. Stewart
46010ef1e1 valgrind: add v3.20.0, mark macOS conflict (#34436) 2022-12-10 12:19:42 +01:00
Abhik Sarkar
f9d9d43b63 Support for building Pmix with Debian/Ubuntu external dependencies (#32690)
* Debian-like distros use the multiarch implementation spec
https://wiki.ubuntu.com/MultiarchSpec
Instead of being limited to /usr/lib64, architecture-based
lib directories are used. For instance, under ubuntu a library package
on x86_64 installs libraries under /usr/lib/x86_64-linux-gnu.
Building pmix with external dependencies like hwloc or libevent
fails: with prefix set to /usr, that prefix works for
headers and binaries but does not work for libraries, since the default
library location /usr/lib64 does not hold the installed libraries.
Pmix build options --with-libevent and --with-libhwloc allow us to
specify dependent library locations (see the sketch after this list).
This commit is an effort to highlight and resolve such an issue when
users want to use Debian-like distro library packages and use spack to
build pmix. There may be other packages that are impacted in a similar way.

* Adding libs property to hwloc and libevent and some cleanups to pmix patch

* Fixing style and adding comment on Pmix' 32-bit hwloc version detection issue
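
A sketch of how those options end up pointing at the dependency prefixes (illustrative, not the exact recipe):

```python
def configure_args(self):
    spec = self.spec
    return [
        # explicit prefixes so multiarch layouts like
        # /usr/lib/x86_64-linux-gnu are found via the dependency specs
        "--with-libevent={0}".format(spec["libevent"].prefix),
        "--with-libhwloc={0}".format(spec["hwloc"].prefix),
    ]
```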
2022-12-09 18:30:45 -08:00
Harmen Stoppels
db8f115013 Use urllib handler for s3:// and gs://, improve url_exists through HEAD requests (#34324)
* `url_exists` improvements (take 2)

Make `url_exists` do HEAD request for http/https/s3 protocols

Rework the opener: construct it once and only once, dynamically dispatch
to the right one based on config.
2022-12-10 00:20:29 +01:00
Manuela Kuhn
09b5476049 py-simplejson: add 3.18.0 (#34430) 2022-12-09 13:11:30 -07:00
Sinan
14c4896ec2 package/qt-base: add conflict for older gcc (#34420) 2022-12-09 12:47:29 -07:00
Ben Morgan
b5ef5c2eb5 geant4: version bumps for Geant4 11.1.0 release (#34428)
* geant4: version bumps for Geant4 11.1.0

- Version bumps for new data libraries
  - g4ndl 4.7
  - g4emlow 8.2
- Add geant4-data@11.1.0
- Checksum new Geant4 11.1.0 release
  - Limit +python variant to maximum of :11.0 due to removal of
    Geant4Py in 11.1
  - Update CLHEP dependency to at least 2.4.6.0 for this release
  - Update VecGeom dependency to at least 1.2.0 for this release,
    closing version ranges for older releases to prevent multiple
    versions satisfying requirement

* geant4: correct max version for python support
2022-12-09 12:26:22 -07:00
Scott Wittenburg
675afd884d gitlab ci: more resources for paraview and py-torch (#34412) 2022-12-09 11:58:37 -07:00
shanedsnyder
0f5482dc9a [darshan-runtime, darshan-util, py-darshan]: darshan 3.4.1 release updates (#34294) 2022-12-09 19:56:53 +01:00
Jen Herting
069e5f874c New package: py-torchdiffeq (#34409)
* [py-torchdiffeq] new package

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-12-09 12:38:14 -06:00
Manuela Kuhn
cad01a03cb py-nbformat: add 5.7.0 and new package py-hatch-nodejs-version (#34361) 2022-12-09 12:32:41 -06:00
Manuela Kuhn
f10f8ed013 py-setupmeta: add 3.3.2 (#34421) 2022-12-09 12:32:19 -06:00
Todd Gamblin
d991ec90e3 new command: spack pkg grep to search package files (#34388)
It's very common for us to tell users to grep through the existing Spack packages to
find examples of what they want, and it's also very common for package developers to do
it. Now, searching packages is even easier.

`spack pkg grep` runs grep on all `package.py` files in repos known to Spack. It has no
special options other than the search string; all options passed to it are forwarded
along to `grep`.

```console
> spack pkg grep --help
usage: spack pkg grep [--help] ...

positional arguments:
  grep_args  arguments for grep

options:
  --help     show this help message and exit
```

```console
> spack pkg grep CMakePackage | head -3
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/3dtk/package.py:class _3dtk(CMakePackage):
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/abseil-cpp/package.py:class AbseilCpp(CMakePackage):
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/accfft/package.py:class Accfft(CMakePackage, CudaPackage):
```

```console
> spack pkg grep -Eho '(\S*)\(PythonPackage\)' | head -3
AwsParallelcluster(PythonPackage)
Awscli(PythonPackage)
Bueno(PythonPackage)
```

## Return Value

This retains the return value semantics of `grep`:
* 0 for found
* 1 for not found
* >1 for error

## Choosing a `grep`

You can set the ``SPACK_GREP`` environment variable to choose the ``grep``
executable this command should use.
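For example, `SPACK_GREP=ggrep spack pkg grep CMakePackage` would run the same search
with a GNU grep installed under the name `ggrep` (the name here is just an illustration).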
2022-12-09 10:07:54 -08:00
snehring
8353d1539f py-torchvision: fix typo in version restriction for ffmpeg (#34415) 2022-12-09 11:05:43 -07:00
iarspider
bf3d18bf06 Add checksum for py-packaging11 0.12.3 (#34402) 2022-12-09 06:43:44 -07:00
John W. Parent
0e69710f41 Windows: reenable unit tests (#33385)
Unit tests on Windows are supposed to pass for any PR to pass CI.
However, the return code for the unit test command was not being
checked, which meant this check was always passing (effectively
disabled). This PR

* Properly checks the result of the unit tests and fails if the
  unit tests fail
* Fixes (or disables on Windows) a number of tests which have
  "drifted" out of support on Windows since this check was
  effectively disabled
2022-12-09 13:27:46 +00:00
Harmen Stoppels
ec62150ed7 binary distribution: warn about issues (#34152) 2022-12-09 13:25:32 +01:00
Massimiliano Culpo
d37dc37504 btop++: add new package (#34399) 2022-12-09 12:59:46 +01:00
iarspider
38d37897d4 Add checksum for py-onnxmltools 1.11.1 (#34400) 2022-12-09 04:04:20 -07:00
Todd Gamblin
606eef43bd bugfix: spack load shell test can fail on macos (#34419)
At some point the `a` mock package became an `AutotoolsPackage`, and that means it
depends on `gnuconfig` on macOS. This was causing one of our shell tests to fail on
macOS because it was testing for `{a.prefix.bin}:{b.prefix.bin}` in `PATH`, but
`gnuconfig` shows up between them.

- [x] simplify the test to check `spack load --sh a` and `spack load --sh b` separately
2022-12-09 10:36:54 +00:00
Mikael Simberg
02a30f8d95 Add pika-algorithms package and pika 0.11.0 (#34397)
* Add 20 as a valid option for cxxstd to fmt

* Add pika 0.11.0

* Fix version constraint for p2300 variant in pika package

* Add pika-algorithms package
2022-12-09 11:26:48 +01:00
Harmen Stoppels
7e054cb7fc s3: cache client instance (#34372) 2022-12-09 08:50:32 +01:00
Manuela Kuhn
d29cb87ecc py-reportlab: add 3.6.12 (#34396)
* py-reportlab: add 3.6.12

* Update var/spack/repos/builtin/packages/py-reportlab/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-08 20:08:00 -06:00
Bernhard Kaindl
f8c0d9728d intel-mkl: It is only available for x86_64 (#34391) 2022-12-08 18:10:00 -07:00
Bernhard Kaindl
f5bff16745 bcache: Fix check for libintl to work correctly (#34383) 2022-12-08 17:37:10 -07:00
Adam J. Stewart
2d1cb6d64a bash: add v5.2, readline patches (#34301) 2022-12-08 13:46:21 -07:00
Peter Scheibel
c6e35da2c7 Cray manifest: automatically convert 'cray' platform to 'linux' (#34177)
* Automatically convert 'cray' platform to 'linux'
2022-12-08 11:28:06 -08:00
Manuela Kuhn
f1cd327186 py-rdflib: add 6.2.0 (#34394) 2022-12-08 13:07:26 -06:00
Victor Lopez Herrero
391ad8cec4 dlb: new package (#34211) 2022-12-08 05:57:48 -07:00
Larry Knox
2c668f4bfd Update hdf5 vol async version (#34376)
* Add version hdf5-vol-async@1.4
2022-12-08 05:37:34 -07:00
Glenn Johnson
52fdae83f0 pixman: add libs property (#34281) 2022-12-08 06:34:49 +01:00
Michael Kuhn
0ea81affd1 py-torch: fix build with gcc@12: (#34352) 2022-12-08 06:31:00 +01:00
Brian Van Essen
ddc6e233c7 libxcrypt: building @:4.4.17 requires automake@1.14: 2022-12-08 03:17:28 +01:00
Jon Rood
7ee4499f2b Add texinfo dependency for binutils through version 2.38. (#34173) 2022-12-08 03:08:37 +01:00
Marco De La Pierre
641adae961 Add recipe for singularity-hpc, py-spython (#34234)
* adding recipe for singularity-hpc - 1st go

* typo in singularity-hpc recipe

* singularity-hpc, spython recipes: added platform variant

* singularity-hpc, spython recipes: platform variant renamed to runtime

* style fix

* another style fix

* yet another style fix (why are they not reported altogether)

* singularity-hpc recipe: added Vanessa as maintainer

* singularity-hpc recipe: add podman variant

* singularity-hpc recipe: added variant for module system

* shpc recipe: add version for py-semver dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-spython recipe: no need to specify generic python dep for a python pkg

* py-spython: py-requests not needed

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-07 20:07:30 -06:00
John W. Parent
aed77efb9a Windows: Prevent SameFileError when rpathing (#34332) 2022-12-07 16:58:44 -08:00
Massimiliano Culpo
ab6499ce1e parser: refactor with coarser token granularity (#34151)
## Motivation

Our parser grew to be quite complex, with a 2-state lexer and logic in the parser
that has up to 5 levels of nested conditionals. In the future, to turn compilers into
proper dependencies, we'll have to increase the complexity further as we foresee
the need to add:
1. Edge attributes
2. Spec nesting

to the spec syntax (see https://github.com/spack/seps/pull/5 for an initial discussion of
those changes).  The main attempt here is thus to _simplify the existing code_ before
we start extending it later. We try to do that by adopting a different token granularity,
and by using more complex regexes for tokenization. This allows us to have a "flatter"
encoding for the parser, i.e., one with fewer nested conditionals and a near-trivial lexer.

There are places, namely in `VERSION`, where we have to use negative lookahead judiciously
to avoid ambiguity. Specifically, the following is ambiguous without `(?!\s*=)` in `VERSION_RANGE`
and an extra final `\b` in `VERSION`:

```
@ 1.2.3     :        develop  # This is a version range 1.2.3:develop
@ 1.2.3     :        develop=foo  # This is a version range 1.2.3: followed by a key-value pair
```
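
A toy reproduction of the disambiguation (hypothetical, heavily simplified regexes; the
real tokens in `spack/parser.py` are more elaborate):

```python
import re

# The trailing \b and the (?!\s*=) lookahead keep the range token from
# swallowing "develop" when it is really the key of a key-value pair.
VERSION = r"[a-zA-Z0-9_][a-zA-Z0-9_.\-]*\b"
VERSION_RANGE = rf"@\s*{VERSION}\s*:(?:\s*{VERSION}(?!\s*=))?"

for text in ["@ 1.2.3 : develop", "@ 1.2.3 : develop=foo"]:
    print(repr(re.match(VERSION_RANGE, text).group()))
# '@ 1.2.3 : develop'  -> one range token 1.2.3:develop
# '@ 1.2.3 :'          -> range 1.2.3:, leaving develop=foo as a key-value pair
```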

## Differences with the previous parser

~There are currently 2 known differences with the previous parser, which have been added on purpose:~

- ~No spaces allowed after a sigil (e.g. `foo @ 1.2.3` is invalid while `foo @1.2.3` is valid)~
- ~`/<hash> @1.2.3` can be parsed as a concrete spec followed by an anonymous spec (before was invalid)~

~We can recover the previous behavior on both ones but, especially for the second one, it seems the current behavior in the PR is more consistent.~

The parser is currently 100% backward compatible.

## Error handling

Being based on more complex regexes, we can possibly improve error
handling by adding regexes for common issues and hinting users based on them.
I'll leave that for a follow-up PR, but there's a stub for this approach in this PR.

## Performance

To be sure we don't add any performance penalty with this new encoding, I measured:
```console
$ spack python -m timeit -s "import spack.spec" -c "spack.spec.Spec(<spec>)"
```
for different specs on my machine:

* **Spack:** 0.20.0.dev0 (c9db4e50ba045f5697816187accaf2451cb1aae7)
* **Python:** 3.8.10
* **Platform:** linux-ubuntu20.04-icelake
* **Concretizer:** clingo

results are:

| Spec          | develop       | this PR |
| ------------- | ------------- | ------- |
| `trilinos`  |  28.9 usec | 13.1 usec |
| `trilinos @1.2.10:1.4.20,2.0.1`  | 131 usec  | 120 usec |
| `trilinos %gcc`  | 44.9 usec  | 20.9 usec |
| `trilinos +foo`  | 44.1 usec  | 21.3 usec |
| `trilinos foo=bar`  | 59.5 usec  | 25.6 usec |
| `trilinos foo=bar ^ mpich foo=baz`  | 120 usec  | 82.1 usec |

so this new parser seems to be consistently faster than the previous one.

## Modifications

In this PR we just substituted the Spec parser, which means:
- [x] Deleted the `SpecParser` and `SpecLexer` classes in `spec.py`; deleted `spack/parse.py`
- [x] Added a new parser in `spack/parser.py`
- [x] Hooked the new parser in all the places the previous one was used
- [x] Adapted unit tests in `test/spec_syntax.py`


## Possible future improvements

Random thoughts while working on the PR:
- Currently we transform hashes and files into specs during parsing. I think
we might want to introduce an additional step and parse special objects like
a `FileSpec` etc. in-between parsing and concretization.
2022-12-07 14:56:53 -08:00
Houjun Tang
412bec45aa SW4: new package (#34252)
* sw4
* use h5z-zfp develop
* update for macos
* Update package.py

Co-authored-by: Houjun Tang <tang@Houjuns-MacBook-Pro.local>
2022-12-07 14:26:05 -07:00
Manuela Kuhn
c3dcd94ebc py-numba: add 0.56.4 (#34362) 2022-12-07 14:18:45 -07:00
Hanqi Guo
cb8f642297 ftk: add 0.0.7.1 (#34146) 2022-12-07 22:13:46 +01:00
Manuela Kuhn
92f19c8491 py-pywavelets: add 1.4.1 (#34369)
* py-pywavelets: add 1.4.1

* Update var/spack/repos/builtin/packages/py-pywavelets/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pywavelets/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-07 13:40:10 -07:00
iarspider
f3f8b31be5 XRootD: add checksum + patch for 5.5.1 (#34209)
* Update package.py
* Add full_index to patch URL
* Update var/spack/repos/builtin/packages/xrootd/package.py
* Restore list_url

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-12-07 12:06:38 -08:00
MatthewLieber
63cadf04ea osu-micro-benchmarks: add v7.0.1 (#34221)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2022-12-07 21:02:43 +01:00
eugeneswalker
541e75350f libnrm: allow mpi other than mpich (#34232) 2022-12-07 11:57:13 -08:00
Stephen Sachs
8806e74419 [quantum-espresso] Parallel make fails for 6.{6,7} (#34238)
* [quantum-espresso] Parallel make fails for 6.{6,7}
  I ran into a race condition in `make` with the Intel compiler on icelake when building QE 6.6 and 6.7.
* Fix comment

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-12-07 11:52:12 -08:00
Sam Gillingham
381f8161b1 update kealib to 1.5.0 (#34237) 2022-12-07 11:49:41 -08:00
shanedsnyder
884123b7ce darshan-util: fix location of input for darshan-util tests (#34245)
* fix location of input for darshan-util tests
  Darshan log file used for test input was removed from the Darshan
  repo after the 3.4.0 release. This commit adds logic to use a
  different log file as test input for later Darshan versions.
2022-12-07 11:48:55 -08:00
Eric Müller
35aa875762 meep: add new versions and additional variants incl. dependencies (#34242)
* libctl: add new version
  Change-Id: I16f91cfab198c66b60407ab5bb2cb3ebeac6bc19
* New package: libgdsii
  Change-Id: I34b52260ab68ecc857ddf8cc63b124adc2689a51
* New package: mpb
  Change-Id: I6fdf5321c33d6bdbcaa1569026139a8483a3bcf8
* meep: add new version and variants
  Change-Id: I0b60a9a4d9a329f7bde9027514467e17376e6a39
* meep: use with_or_without
  Change-Id: I05584cb13df8ee153ed385e77d367cb34e39777e
2022-12-07 11:44:26 -08:00
Sam Grayson
9b0e79fcab Fix Apptainer (#34329)
* Fix Apptainer
* Add comments
2022-12-07 11:05:22 -08:00
kwryankrattiger
8ba0faa9ee Paraview catalyst updates (#34364)
* LibCatalyst: Fix version of pre-release develop version
* ParaView: Requires libcatalyst@2:
* ParaView: Apply adios2 module no kit patch to 5.11

This patch is still pending in VTK and didn't make it into 5.11 as anticipated.
2022-12-07 10:27:47 -08:00
Bernhard Kaindl
d464185bba bcache: support external gettext when libintl is in glibc (#34114)
* bcache: support external gettext when `libintl` is in glibc

Many glibc-based Linux systems don't have gettext's libintl because
libintl is included in the standard system's glibc (libc) itself.

When using `spack external find gettext` on those systems, packages like
`bcache`, which unconditionally link with `-lintl`, fail to link.

Description of the fix:

The libs property of Spack's gettext recipe returns the list of libs,
so when gettext provides libintl, use it. When it does not, there is no
separate libintl library and the libintl API is provided by glibc.
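
A minimal sketch of the idea from the consuming side (assumed code, not the actual
bcache recipe):

```python
# Hypothetical sketch: gettext's libs property lacks libintl when the
# libintl API comes from glibc itself, so only link -lintl when it is
# really provided by the gettext in the spec.
def setup_build_environment(self, env):
    gettext = self.spec["gettext"]
    if "intl" in gettext.libs.names:
        env.append_flags("LDFLAGS", gettext.libs.search_flags)
        env.append_flags("LIBS", "-lintl")
```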

Tested with `spack external find gettext` on glibc-based Linux and
in musl-based Alpine Linux to make sure that when -lintl is really
needed, it is really used and nothing breaks.
2022-12-07 11:39:02 -05:00
G-Ragghianti
7f4d71252b Package magma: cleaned up cmake config (#33766) 2022-12-07 16:30:20 +01:00
Matthias Wolf
7950311767 likwid: add a permission fixing script a la singularity (#33503) 2022-12-07 15:51:02 +01:00
Greg Becker
194f9a9ca9 compiler flags: fix mixed flags from cli and yaml (#34218) 2022-12-06 16:32:08 -08:00
Cameron Rutherford
a72021fd63 Fix dependency specification for CuSolver variant in HiOp. (#34138)
Co-authored-by: pelesh <peless@ornl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-12-06 21:57:57 +01:00
Massimiliano Culpo
d910b3725b Add back depends_on directives needed to bootstrap on Python 3.6 (#34355)
This partially reverts commit 95b5d54129.
2022-12-06 20:08:26 +01:00
David Zmick
99f209019e htop: new version 3.2.1 (#34346) 2022-12-06 12:03:46 -07:00
Manuela Kuhn
c11a4e0ad3 py-nbclient: add 0.7.2 and py-jupyter-core: add 5.1.0 (#34348) 2022-12-06 10:43:28 -07:00
Nicholas Sly
4a429ec315 mercurial/py-pybind11: print_string no longer exists (#34340)
* Fix mercurial print_str failure.

* Perform same fix on py-pybind11 for print_string missing method.

Co-authored-by: Nicholas Cameron Sly <sly1@llnl.gov>
2022-12-06 09:28:00 -08:00
eugeneswalker
eadccfe332 trilinos: +teko conflicts with ~ml (#34339) 2022-12-06 09:19:25 -07:00
Harmen Stoppels
dfab5b5ceb Stop checking for {s3://path}/index.html (#34325) 2022-12-06 09:19:04 -07:00
lorddavidiii
862029215c cfitsio: add v4.2.0 (#34316) 2022-12-06 09:18:51 -07:00
Hadrien G
559c3de213 ROOT: new versions and associated dependency constraints (#34185)
* Add new root versions and associated dependency constraints

* Please style guide

* Avoid conflicts where possible

* Untested prototype of macOS version detection

* Fixes for macOS version prototype

* More logical ordering

* More correctness and style fixes

* Try to use spack's macos_version

* Add some forgotten @s

* Actually, Spack can't build Python 3.6 anymore, and thus no older PyROOT

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2022-12-06 08:25:52 -07:00
Harmen Stoppels
e3bf7358d7 Avoid stat call in llnl.util.symlink on non-windows (#34321) 2022-12-06 15:17:15 +00:00
Harmen Stoppels
b58ec9e2b9 Remove legacy yaml from buildcache fetch (#34347) 2022-12-06 16:12:20 +01:00
Adam J. Stewart
95b5d54129 pip/wheel/setuptools: extend PythonExtension (#34137)
* pip/wheel/setuptools: extend PythonExtension

* Base class still required
2022-12-06 08:58:05 -06:00
Houjun Tang
bcce9c3e9c Fix compile errors with latest HDF5 1.13.3 (#34337)
* Fix compile errors with latest HDF5 1.13.3

* format

* Update var/spack/repos/builtin/packages/hdf5-vol-async/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-06 08:15:52 -06:00
Massimiliano Culpo
4c05fe569c Bootstrap most of Spack dependencies using environments (#34029)
This commit reworks the bootstrapping procedure to use Spack environments 
as much as possible.

The `spack.bootstrap` module has also been reorganized into a Python package. 
A distinction is made between "core" Spack dependencies (clingo, GnuPG, patchelf)
and other dependencies. For a number of reasons, explained in the `spack.bootstrap.core`
module docstring, "core" dependencies are bootstrapped with the current ad-hoc
method. 

All the other dependencies are instead bootstrapped using a Spack environment
that lives in a directory specific to the interpreter and the architecture being used.
2022-12-06 11:54:02 +01:00
Sam Grayson
e550665df7 Update packages (#34344) 2022-12-05 23:52:05 -06:00
Glenn Johnson
d92d34b162 graphite2: add dependency on freetype (#34292) 2022-12-05 14:38:52 -08:00
Miguel Dias Costa
f27be808a4 berkeleygw: add back python dependencies and tweak testsuite (#34125)
* slightly raise tolerance of some tests
2022-12-05 23:37:19 +01:00
Seth R. Johnson
855d3519b6 SWIG: new version 4.1.0 (#34250) 2022-12-05 23:30:14 +01:00
downloadico
37f232e319 psrcat: fixed typo/undefined variable problem (#34334)
replaced the reference to the undefined "bindir" variable with prefix.bin
2022-12-05 15:19:54 -07:00
Luke Diorio-Toth
ac1c29eac0 pharokka and py-phanotate: new packages (#34333)
* pharokka and py-phanotate: new packages

* move libxcrypt edit

I don't need libxcrypt when not building dev infernal. Moving to a different PR
2022-12-05 16:16:59 -06:00
Bernhard Kaindl
56072172f5 jellyfish: add variants for python and ruby bindings (#33832)
Co-authored-by: teachers-uk-net <stuart.morrison@kcl.ac.uk>
2022-12-05 14:10:57 -07:00
Peter Scheibel
64d957dece cray-mpich: fix dependencies for externals from manifest (#34231)
The cray manifest shows dependency information for cray-mpich, which we never previously cared about
because it can only be used as an external. This updates Spack's dependency information to make cray-mpich
specs read in from the cray external manifest usable.
2022-12-05 12:11:56 -08:00
Loïc Pottier
3edc85ec21 redis: newer version and added TLS support (#34230)
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-12-05 12:44:13 -07:00
Manuela Kuhn
d8006a9495 py-nodeenv: add 1.7.0 (#34314) 2022-12-05 12:32:30 -07:00
Manuela Kuhn
a2cfc07412 py-num2words: add 0.5.12 (#34315) 2022-12-05 12:25:01 -07:00
Sreenivasa Murthy Kolam
1295ea5d40 Drop support for older rocm releases - 4.5.0 till 5.0.2 (#34264)
* initial changes for rocm recipes
* drop support for older releases
* drop support for older rocm releases - add more recipes
* drop support for older releases
* address style issues
* address style error
* fix errors
* address review comments
2022-12-05 10:47:45 -08:00
Hector Martinez-Seara
4664b3cd1e Added plumed version 2.8.1 including gromacs compatibility (#34268)
* Added plumed version 2.8.1 including gromacs compatibility
* Corrected ~mpi and +mpi variants in new depends
* Fixed regression in logic for plumed+gromacs@2020.6 support
2022-12-05 10:43:08 -08:00
Richard Berger
dc7e0e3ef6 LAMMPS: Add new versions (#32522)
* LAMMPS: Add version 20220803 and 20220623.1

* LAMMPS: Add 20220915, 20221103, and 20220623.2
2022-12-05 12:52:10 -05:00
H. Joe Lee
9aa615aa98 feat(Hermes)!: add yaml-cpp dependency (#34330)
Version 0.9.0-beta requires yaml-cpp for parsing the YAML configuration file format.

P.S. I'm using https://www.conventionalcommits.org/en/v1.0.0/#specification for this commit message.
2022-12-05 09:49:45 -08:00
downloadico
85b6bf99a4 Add packages related to the LWA software stack (#34112)
* epsic: add epsic package to spack

* psrcat: add psrcat to spack

* psrchive: add psrchive to spack

* tempo: add tempo package to spack
2022-12-05 09:48:04 -08:00
Todd Gamblin
78ec3d5662 clingo: add version 5.6.2 (#34317)
See release notes at https://github.com/potassco/clingo/releases/tag/v5.6.2
2022-12-05 10:39:30 -07:00
andriish
a7b5f2ef39 Add the very first version of cernlib package (#33911)
* Add the very first version of cernlib
* Update package.py
* Update package.py

Co-authored-by: Andrii Verbytskyi <andriish@pcatlas18.mpp.mpg.de>
2022-12-05 09:31:15 -08:00
Manuela Kuhn
f71701f39d py-nbclassic: add 0.4.8 and new package py-notebook-shim (#34320)
* py-nbclassic: add 0.4.8 and new package py-notebook-shim

* Add missing dependencies
2022-12-05 09:51:28 -07:00
Todd Gamblin
54008a2342 vermin: remove all novm comments from code (#34308)
All the vermin annotations we were using were for optional features introduced in early
Python 3 versions. We no longer need any of them, as we only support Python 3.6+. If we
start optionally using features from newer Pythons than 3.6, we will need more vermin
annotations.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-12-05 08:59:27 -07:00
Manuela Kuhn
1670c325c6 py-llvmlite: add 0.39.1 (#34318) 2022-12-05 09:22:05 -06:00
SXS Bot
534a994b4c spectre: add v2022.12.02 (#34277)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2022-12-05 12:45:04 +01:00
Jean-Luc Fattebert
359efca201 Thermo4PFM: new package (#34287) 2022-12-05 12:03:44 +01:00
Glenn Johnson
65809140f3 gurobi: add v10.0.0, v9.5.2 (#34291) 2022-12-05 11:58:09 +01:00
Glenn Johnson
3f1622f9e7 freeglut: add dependency on libxxf86vm (#34293) 2022-12-05 11:57:27 +01:00
Glenn Johnson
8332a59194 wannier90: gfortran-10 support and libs property (#34278) 2022-12-05 11:43:23 +01:00
HELICS-bot
05abea3a3a helics: add v3.3.2 (#34297)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2022-12-05 11:42:03 +01:00
Luke Diorio-Toth
e7fc9ea243 mmseqs2: new package (#34189) 2022-12-05 11:38:56 +01:00
Adam J. Stewart
eea3ea7675 py-torch: fix patching on ppc64le (#34283) 2022-12-05 11:37:31 +01:00
Auriane R
895ac2626d Add conflicts between gcc@12.2.0 and rocblas@5.2.1:5.2.3 (#34217)
* Add conflicts with gcc@12.2.0

* Add more links for reference
2022-12-05 11:13:22 +01:00
Todd Gamblin
94dc86e163 web: remove checks for SSL verification support (#34307)
We no longer support Python <3.6, so we don't need to check whether Python supports SSL
verification in `spack.util.web`.

- [x] Remove a bunch of logic we needed to appease Python 2
2022-12-05 08:46:27 +01:00
Manuela Kuhn
729b1c9fa6 py-mne: add 1.2.2, 1.2.3 and dependency packages (#34295)
* py-mne: add 1.2.2 and dependency packages

* py-mne: add 1.2.3

* Remove unnecessary when statement
2022-12-05 00:02:27 -06:00
Todd Gamblin
82b7fe649f typing: move from comment annotations to Python 3.6 annotations (#34305)
We've stopped supporting Python 2, and contributors are noticing that our CI no longer
allows Python 2.7 comment type hints. They end up having to adapt them, but this adds
extra unrelated work to PRs.

- [x] Move to 3.6 type hints across the entire code base
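
For a sense of the mechanical change (an illustrative example, not a specific Spack
function):

```python
# Before: Python 2-compatible comment type hints
def fetch(self, url, timeout=10):
    # type: (str, int) -> bytes
    ...

# After: Python 3.6+ inline annotations
def fetch(self, url: str, timeout: int = 10) -> bytes:
    ...
```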
2022-12-04 21:41:12 -08:00
Adam J. Stewart
76417d6ac6 py-torchmetrics: add v0.11.0 (#34220) 2022-12-04 15:32:02 -07:00
Glenn Johnson
fe995542ab py-tensorflow: patch for cuBLAS error (#34279) 2022-12-03 09:17:12 -07:00
wspear
8f5209063d Use correct method for selecting compiler names. (#34175) 2022-12-02 15:02:24 -08:00
Enrico Usai
241a8f6be6 aws-parallelcluster: Add v2.11.9 (#34270) 2022-12-02 15:57:18 -07:00
Seth R. Johnson
a8a0a6916a doxygen: add build-tools tag (#34249)
* doxygen: add build-tool tag
   This allows it to be included automatically as an external. No one links against doxygen so this should be ok.
* doxygen: add self as maintainer
2022-12-02 15:56:57 -07:00
Ben Morgan
8d10dce651 vecgeom: add new 1.2.1 version (#34240)
* vecgeom: add new 1.2.1 version
* vecgeom: introduce conflict between gcc/cuda

Recent tests of vecgeom in Spack environments have shown that the build
with +cuda fails with GCC >= 11.3 and CUDA < 11.7 with error

...lib/gcc/x86_64-pc-linux-gnu/11.3.0/include/serializeintrin.h(41):
error: identifier "__builtin_ia32_serialize" is undefined

1 error detected in the compilation of
".../VecGeom/source/BVHManager.cu".

Other GCC/CUDA combinations appear o.k.

Avoid this error in spack, and document it for users, with a conflict
directive to express the restriction.
2022-12-02 15:56:43 -07:00
Phil Carns
a2938c9348 add mochi-margo 0.11.1 point release (#34271) 2022-12-02 15:56:29 -07:00
snehring
8017f4b55b libvips: adding version 8.13.3 (#34228) 2022-12-02 15:32:22 -07:00
Andrew W Elble
588d2e295f py-alphafold: update to 2.2.4, update dependencies (#33876)
* py-alphafold: update to 2.2.4, update dependencies

* style
2022-12-02 22:12:06 +00:00
Manuela Kuhn
c10b84f08d py-nilearn: fix dependency version (#34284) 2022-12-02 15:04:46 -07:00
Greg Becker
99044bedd7 patch command: add concretizer args (#34282)
* patch command: add concretizer args
* tab completion
2022-12-02 14:02:20 -08:00
Seth R. Johnson
3afe6f1adc ROOT: add math/gsl conflict and change version-dependent features to conditional variants (#34244)
* ROOT: add GSL/math dependency
* ROOT: use conditional variants instead of conflicts
2022-12-02 12:05:43 -07:00
Manuela Kuhn
fcd9038225 py-neurokit2: add 0.2.2 (#34267) 2022-12-02 11:41:43 -07:00
H. Joe Lee
9d82024f1a feat(Hermes): update version to 0.9.0-beta. (#34243)
* feat(Hermes): update version to 0.9.0-beta.
   This is the latest release.
* feat(Hermes): fix checksum.
  Credit: @tldahlgren
2022-12-02 11:31:09 -07:00
Tamara Dahlgren
bcefe6a73e Docs: Minor change 'several'->'over a dozen' (#34274) 2022-12-02 10:27:37 -08:00
Todd Gamblin
87562042df concretizer: use only attr() for Spec attributes (#31202)
All Spec attributes are now represented as `attr(attribute_name, ... args ...)`, e.g.
`attr(node, "hdf5")` instead of `node("hdf5")`, as we *have* to maintain the `attr()`
form anyway, and it simplifies the encoding to just maintain one form of the Spec
information.

Background
----------

In #20644, we unified the way conditionals are done in the concretizer, but this
introduced a nasty aspect to the encoding: we have to maintain everything we want in
general conditions in two forms: `predicate(...)` and `attr("predicate", ...)`. For
example, here's the start of the table of spec attributes we had to maintain:

```prolog
node(Package)                      :- attr("node", Package).
virtual_node(Virtual)              :- attr("virtual_node", Virtual).
hash(Package, Hash)                :- attr("hash", Package, Hash).
version(Package, Version)          :- attr("version", Package, Version).
...
```

```prolog
attr("node", Package)              :- node(Package).
attr("virtual_node", Virtual)      :- virtual_node(Virtual).
attr("hash", Package, Hash)        :- hash(Package, Hash).
attr("version", Package, Version)  :- version(Package, Version).
...
```

This adds cognitive load to understanding how the concretizer works, as you have to
understand the equivalence between the two forms of spec attributes. It also makes the
general condition logic in #20644 hard to explain, and it's easy to forget to add a new
equivalence to this list when adding new spec attributes (at least two people have been
bitten by this).

Solution
--------

- [x] remove the equivalence list from `concretize.lp`
- [x] simplify `spec_clauses()`, `condition()`, and other functions in `asp.py` that need
      to deal with `Spec` attributes (see the sketch after this list).
- [x] Convert all old-form spec attributes in `concretize.lp` to the `attr()` form
- [x] Simplify `display.lp`, where we also had to maintain a list of spec attributes. Now
      we only need to show `attr/2`, `attr/3`, and `attr/4`.
- [x] Simplify model extraction logic in `asp.py`.
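
To illustrate the simplification (a hypothetical sketch, not the actual `asp.py` code):

```python
# With a single uniform form, every fact goes through one constructor and
# no table of predicate-name equivalences has to be maintained.
def attr_fact(*args):
    return "attr({})".format(", ".join(f'"{arg}"' for arg in args))

print(attr_fact("node", "hdf5"))             # attr("node", "hdf5")
print(attr_fact("version", "hdf5", "1.13"))  # attr("version", "hdf5", "1.13")
```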

Performance
-----------

This seems to result in a smaller grounded problem (as there are no longer duplicated
`attr("foo", ...)` / `foo(...)` predicates in the program), but it also adds a slight
performance overhead vs. develop. Ultimately, simplifying the encoding will be a win,
particularly for improving error messages.

Notes
-----

This will simplify future node refactors in `concretize.lp` (e.g., not identifying nodes
by package name, which we need for separate build dependencies).

I'm still not entirely used to reading `attr()` notation, but I think it's ultimately
clearer than what we did before. We need more uniform naming, and it's now clear what is
part of a solution. We should probably continue making the encoding of `concretize.lp`
simpler and more self-explanatory. It may make sense to rename `attr` to something like
`node_attr` and to simplify the names of node attributes. It also might make sense to do
something similar for other types of predicates in `concretize.lp`.
2022-12-02 18:56:18 +01:00
Manuela Kuhn
10d10b612a py-keyrings-alt: add 4.2.0 (#34262)
* py-keyrings-alt: add 4.2.0

* Add missing py-jaraco-classes dependency
2022-12-02 09:08:53 -07:00
iarspider
69dd742dc9 Add checksum for py-hatchling 1.8.1 (#34260) 2022-12-02 09:29:38 -06:00
Tamara Dahlgren
18efd817b1 Bugfix: Fetch should not force use of curl to check url existence (#34225)
* Bugfix: Fetch should not force use of curl to check url existence

* Switch type hints from comments to actual hints
2022-12-02 04:50:23 -07:00
Manuela Kuhn
65a5369d6a py-flask: add 2.2.2 and fix dependencies for py-werkzeug and py-markupsafe (#32849)
* py-flask: add 2.2.2, py-werkzeug: add 2.2.2, py-markupsafe: add 2.1.1

* Remove py-dataclasses dependency
2022-12-01 22:56:18 -07:00
iarspider
f66ec00fa9 Herwig3: make njet, vbfnlo dependencies optional... (#33941)
* Herwig3: make njet, vbfnlo dependencies optional...
  also drop openloops dependency when building on PowerPC
* Update package.py
2022-12-01 17:19:46 -08:00
Manuela Kuhn
f63fb2f521 py-twine: add 4.0.1, py-readme-renderer: add 37.3 (#34203)
* py-twine: add 4.0.1

* Remove py-setuptools as run dependency
2022-12-01 12:29:10 -06:00
Annop Wongwathanarat
dfa00f5a8d acfl: add post-installation check by running examples (#34172) 2022-12-01 09:47:59 -07:00
Alec Scott
6602780657 Add py-python-lsp-server and dependencies (#34149)
* Add py-python-lsp-server and dependencies

* Update var/spack/repos/builtin/packages/py-python-lsp-server/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Relax version range constraints on py-python-lsp-jsonrpc and add missing dep

* Add runtime dependency flag to setuptools dependencies

* Remove unused python@3.6: dependency and move setuptools-scm to build dep only

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-01 09:34:49 -07:00
iarspider
8420c610fa gnuplot: make readline optional (#34179)
* gnuplot: make readline optional
* Update package.py

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-12-01 09:11:14 -07:00
Greg Becker
b139cab687 conditional variant values: allow boolean (#33939) 2022-12-01 08:25:57 +01:00
Adam J. Stewart
99fcc57607 py-scipy: hardcode to use blis.pc (#34171) 2022-11-30 12:29:20 -08:00
Larry Knox
5a394d37b7 Add HDF5 version 1.13.3. (#34165)
* Add HDF5 version 1.13.3.
* Remove maintainers no longer with The HDFGroup.
* Fix indentation.
2022-11-30 11:23:38 -08:00
Loïc Pottier
472074cb7c redis-plus-plus: newer version and added TLS support (#34197)
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
2022-11-30 11:18:47 -08:00
Adam J. Stewart
45c8d7f457 py-segmentation-models-pytorch: add v0.3.1 (#34214) 2022-11-30 10:58:27 -08:00
Edward Hartnett
dce1f01f1a new w3emc version (#34219)
* updated version of w3emc package
* fixed sha
2022-11-30 10:54:45 -08:00
Sam Grayson
03cc83bc67 Add py-yt 4.x versions (#30418)
* Add py-yt 4.x versions

* Fix spelling

* Add yt dependencies

* Refine cython dependency

* Tweak depends_on for py-yt 4.x

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix comments from code review

* Fix formatting

* Fix stuff

* Fix constraints

* Update py-yt to 4.1.2

* Updated packages

* Fix py-tomli checksum

* Remove `expand` from `py-tomli/package.py`

* Respond to Adam's comments

* Update checksums

* Update checksums

* Respond to comments

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-11-30 12:03:26 -06:00
eugeneswalker
f452741e3d e4s ci: use 2022-12-01 runner images (#34212) 2022-11-30 09:52:30 -08:00
eugeneswalker
99b68e646d e4s ci: hpx: set max_cpu_count=512 (#33977) 2022-11-30 09:16:57 -08:00
iarspider
f78c8265f4 Add checksum for py-kiwisolver 1.4.4 (#34121) 2022-11-30 09:18:18 -06:00
iarspider
5d3efbba14 Fix recipe for py-onnx-runtime (#34130)
* Fix recipe

* Update package.py

* Update recipe following review

* Update var/spack/repos/builtin/packages/py-onnx-runtime/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove unused imports

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-11-30 09:17:22 -06:00
Sajid Ali
7423f52cd3 PMIx: enable python bindings (#34107) 2022-11-30 15:55:24 +01:00
Massimiliano Culpo
43d93f7773 Deduplicate code to propagate module changes across MRO (#34157) 2022-11-30 11:10:42 +01:00
Harmen Stoppels
f8dec3e87f Single pass text replacement (#34180) 2022-11-30 10:21:51 +01:00
Manuela Kuhn
ef06b9db5b py-bidscoin, py-multiecho: add new packages (#34168) 2022-11-29 17:37:50 -07:00
Todd Gamblin
c64c9649be debug: move "nonexistent config path" message to much higher verbosity level (#34201)
We currently report that searched config paths don't exist at debug level 1, which
clutters the output quite a bit:

```console
> spack -d solve --fresh --show asp hdf5 > hdf5.lp
==> [2022-11-29-14:18:21.035133] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/darwin/concretizer.yaml
==> [2022-11-29-14:18:21.035151] Skipping nonexistent config path /Users/gamblin2/.spack/concretizer.yaml
==> [2022-11-29-14:18:21.035169] Skipping nonexistent config path /Users/gamblin2/.spack/darwin/concretizer.yaml
==> [2022-11-29-14:18:21.035238] Reading config from file /Users/gamblin2/src/spack/etc/spack/defaults/repos.yaml
==> [2022-11-29-14:18:21.035996] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/defaults/darwin/repos.yaml
==> [2022-11-29-14:18:21.036021] Skipping nonexistent config path /etc/spack/repos.yaml
==> [2022-11-29-14:18:21.036039] Skipping nonexistent config path /etc/spack/darwin/repos.yaml
==> [2022-11-29-14:18:21.036057] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/repos.yaml
==> [2022-11-29-14:18:21.036072] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/darwin/repos.yaml
==> [2022-11-29-14:18:21.036088] Skipping nonexistent config path /Users/gamblin2/.spack/repos.yaml
==> [2022-11-29-14:18:21.036105] Skipping nonexistent config path /Users/gamblin2/.spack/darwin/repos.yaml
==> [2022-11-29-14:18:21.071828] Reading config from file /Users/gamblin2/src/spack/etc/spack/defaults/config.yaml
==> [2022-11-29-14:18:21.081628] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/defaults/darwin/config.yaml
==> [2022-11-29-14:18:21.081669] Skipping nonexistent config path /etc/spack/config.yaml
==> [2022-11-29-14:18:21.081692] Skipping nonexistent config path /etc/spack/darwin/config.yaml
==> [2022-11-29-14:18:21.081712] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/config.yaml
==> [2022-11-29-14:18:21.081731] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/darwin/config.yaml
==> [2022-11-29-14:18:21.081748] Skipping nonexistent config path /Users/gamblin2/.spack/config.yaml
==> [2022-11-29-14:18:21.081764] Skipping nonexistent config path /Users/gamblin2/.spack/darwin/config.yaml
==> [2022-11-29-14:18:21.134909] Reading config from file /Users/gamblin2/src/spack/etc/spack/defaults/packages.yaml
==> [2022-11-29-14:18:21.148695] Reading config from file /Users/gamblin2/src/spack/etc/spack/defaults/darwin/packages.yaml
==> [2022-11-29-14:18:21.152555] Skipping nonexistent config path /etc/spack/packages.yaml
==> [2022-11-29-14:18:21.152582] Skipping nonexistent config path /etc/spack/darwin/packages.yaml
==> [2022-11-29-14:18:21.152601] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/packages.yaml
==> [2022-11-29-14:18:21.152620] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/darwin/packages.yaml
==> [2022-11-29-14:18:21.152637] Skipping nonexistent config path /Users/gamblin2/.spack/packages.yaml
==> [2022-11-29-14:18:21.152654] Skipping nonexistent config path /Users/gamblin2/.spack/darwin/packages.yaml
==> [2022-11-29-14:18:21.853915] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/defaults/compilers.yaml
==> [2022-11-29-14:18:21.853962] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/defaults/darwin/compilers.yaml
==> [2022-11-29-14:18:21.853987] Skipping nonexistent config path /etc/spack/compilers.yaml
==> [2022-11-29-14:18:21.854007] Skipping nonexistent config path /etc/spack/darwin/compilers.yaml
==> [2022-11-29-14:18:21.854025] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/compilers.yaml
==> [2022-11-29-14:18:21.854043] Skipping nonexistent config path /Users/gamblin2/src/spack/etc/spack/darwin/compilers.yaml
==> [2022-11-29-14:18:21.854060] Skipping nonexistent config path /Users/gamblin2/.spack/compilers.yaml
==> [2022-11-29-14:18:21.854093] Reading config from file /Users/gamblin2/.spack/darwin/compilers.yaml
```

It is very rare that I want to know this much information about config search, so I've
moved this to level 3. Now at level 1, we can see much more clearly what configs were
actually found:

```console
> spack -d solve --fresh --show asp hdf5 > hdf5.lp
==> [2022-11-29-14:19:04.035457] Imported solve from built-in commands
==> [2022-11-29-14:19:04.035818] Imported solve from built-in commands
==> [2022-11-29-14:19:04.037626] Reading config from file /Users/gamblin2/src/spack/etc/spack/defaults/concretizer.yaml
==> [2022-11-29-14:19:04.040033] Reading config from file /Users/gamblin2/src/spack/etc/spack/defaults/repos.yaml
==> [2022-11-29-14:19:04.080852] Reading config from file /Users/gamblin2/src/spack/etc/spack/defaults/config.yaml
==> [2022-11-29-14:19:04.133241] Reading config from file /Users/gamblin2/src/spack/etc/spack/defaults/packages.yaml
==> [2022-11-29-14:19:04.147175] Reading config from file /Users/gamblin2/src/spack/etc/spack/defaults/darwin/packages.yaml
==> [2022-11-29-14:19:05.157896] Reading config from file /Users/gamblin2/.spack/darwin/compilers.yaml
```

You can still get the old messages with `spack -ddd` (to activate debug level 3).
2022-11-29 16:17:13 -08:00
Paul R. C. Kent
2c6b52f137 add 15.0.5, 15.0.6 (#34194) 2022-11-29 15:26:13 -08:00
kwryankrattiger
33422acef0 CI: Update Data and Vis SDK Stack (#34009)
* CI: Update Data and Vis SDK Stack

* Update image to match target deployments (E4S)
* Enable all packages
* Test supported variants of ParaView and VisIt

* Sensei: Update Python hint for newer cmake

* Sensei: add Python3 hint
2022-11-29 14:49:55 -07:00
Stephen Sachs
428f635142 icc@2021.6.0 does not support gcc@12 headers (#34191)
Error message:
```
/shared/spack/opt/spack/linux-amzn2-x86_64_v3/gcc-7.3.1/gcc-12.2.0-4tairupdxg2tg2yhvjdlbs7xbd7wudl3/bin/../include/c++/12.2.0/bits/random.h(104): error: expected a declaration
{ __extension__ using type = unsigned __int128; };
^
```

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-11-29 14:36:48 -05:00
kwryankrattiger
c6c74e98ff Dav sdk catalyst (#34010)
* SDK: Add Catalyst 1 and 2 support to the SDK

* LibCatalyst: Remove unused python3 variant from package
2022-11-29 11:28:32 -06:00
Valentin Volkl
d9b438ec76 evtgen: add v02.02.00 (#34187)
* evtgen: add v02.02.00
* format
2022-11-29 09:54:14 -07:00
Cristian Le
c6ee30497c Fix libxc cflag (#34000)
Using standard c99 should not be specific to intel compilers.
2022-11-29 13:45:28 +01:00
Loïc Pottier
1270ae1526 hiredis: updated package definition to use CMake (#33949) 2022-11-29 13:44:50 +01:00
psakievich
d15fead30c Add maintainer to Exawind stack and Trilinos (#34174)
* Add maintainer to Nalu-Wind and Trilinos

* Add to trilinos

* Exawind too

* amr-wind too
2022-11-28 17:23:17 -08:00
Valentin Volkl
23aaaf2d28 genfit: add v02-00-01 (#34159) 2022-11-28 17:09:47 -08:00
Hans Fangohr
56f9c76394 fix typo in path for sanity check (#34117)
- typo breaks install
2022-11-28 16:49:57 -07:00
Sam Grayson
49cda811fc Add new version of snakemake (#34041)
* Add new version of snakemake

* Add myself as a maintainer

* py-retry -> py-reretry

* Added snakemake variants for storage systems

* Updated comments

* Responded to Adam's comments

* Fixed spack style

* Add build/run dependency types
2022-11-28 15:10:10 -07:00
Benjamin Meyers
a97312535a New package: py-statmorph (#34158)
* New package py-statmorph w/ dependencies. Add py-astropy@5.1

* [@spackbot] updating style on behalf of meyersbs

* [py-statmorph,py-astropy,py-pyerfa] minor fixes
2022-11-28 15:46:10 -06:00
Benjamin Meyers
a0180ef741 New package: py-stui (#34156)
* New package py-stui

* [py-stui] add maintainer

* [@spackbot] updating style on behalf of meyersbs

* [py-stui] fix deps
2022-11-28 15:45:40 -06:00
iarspider
d640a573a8 Add checksum for py-rsa 4.9 (#34115)
* Add checksum for py-rsa 4.9

* Update package.py

* Update var/spack/repos/builtin/packages/py-rsa/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-11-28 14:10:21 -07:00
Cameron Smith
b3679406d0 omegah: new scorec version, fix cuda flags (#34169) 2022-11-28 13:03:27 -08:00
eugeneswalker
587488882a e4s ci: add hdf5-vol-async; remove expired comments (#34110) 2022-11-28 19:35:42 +00:00
Satish Balay
a17844a367 petsc, py-petsc4py: add 3.18.2 (#34161) 2022-11-28 10:51:48 -08:00
Manuela Kuhn
093a37750c py-bidskit: new package and dcm2niix: add 1.0.20220720 (#34162)
* py-bidskit: new package and dcm2niix: add 1.0.20220720

* Remove list_url
2022-11-28 11:05:49 -07:00
iarspider
173cc7e973 Add checksum for py-traitlets 5.3.0 (#34127) 2022-11-28 11:05:35 -07:00
Greg Becker
451e3ff50b warn about removal of deprecated format strings (#34101)
* warn about removal of deprecated format strings

Co-authored-by: becker33 <becker33@users.noreply.github.com>
2022-11-28 10:03:49 -08:00
Manuela Kuhn
523c4c2b63 py-dcm2bids: add new package (#34163) 2022-11-28 10:46:04 -07:00
Thomas-Ulrich
35e5a916bc easi: update package, rework impalajit (#34032) 2022-11-28 15:54:21 +01:00
Annop Wongwathanarat
1374577659 acfl: provides blas, lapack, and fftw-api@3 (#34154) 2022-11-28 14:25:32 +01:00
Hector Martinez-Seara
4c017403db texinfo: add v7.0 (#34150) 2022-11-28 06:22:13 -07:00
Massimiliano Culpo
fdfda72371 Use a module-like object to propagate changes in the MRO, when setting build env (#34059)
This fixes an issue introduced in #32340, which changed the semantics of the "module"
object passed to the "setup_dependent_package" callback.
2022-11-28 14:18:26 +01:00
Harmen Stoppels
efa1dba9e4 Revert "Revert "gitlab: Add shared PR mirror to places pipelines look for binaries. (#33746)" (#34087)" (#34153)
This reverts commit 63e4406514.
2022-11-28 06:06:03 -07:00
Annop Wongwathanarat
2a7ae2a700 armpl-gcc: add post-installation check by running examples (#34086) 2022-11-28 13:55:51 +01:00
Adam J. Stewart
a1b4e1bccd Add type hints to Prefix class (#34135) 2022-11-28 13:49:57 +01:00
Alec Scott
066ec31604 Add restic v0.14.0 (#34148) 2022-11-28 08:39:37 +01:00
Alec Scott
bb1888dbd4 direnv: add v2.32.2 (#34147) 2022-11-27 22:09:53 +01:00
Michael Kuhn
bc17b6cefb rust: add 1.65.0 (#34124) 2022-11-27 16:03:46 +01:00
Umashankar Sivakumar
46a0cd8e55 SingularityCE: Add conmon+squashfs as dependencies (#33891) 2022-11-27 13:05:49 +01:00
fpruvost
b2ceb23165 pastix: add new version 6.2.2 (#34066) 2022-11-27 00:33:05 +01:00
Harmen Stoppels
2fad966139 neovim: fix deptypes (#34060) 2022-11-27 00:23:24 +01:00
Luke Diorio-Toth
0b01c8c950 py-fastpath: new package (#34142)
* py-fastpath: new package

* Update var/spack/repos/builtin/packages/py-fastpath/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-11-26 15:57:45 -07:00
Eric Berquist
613d0b7e8e emacs: add variant treesitter for Emacs 29+ (#34134) 2022-11-26 23:45:00 +01:00
Wouter Deconinck
21c29ee375 prmon: Add missing depends_on py-numpy, py-pandas when +plot (#34123) 2022-11-26 23:43:25 +01:00
Luke Diorio-Toth
e236339e5a aragorn: add newer versions and URL (#34140) 2022-11-26 23:34:59 +01:00
Luke Diorio-Toth
7a03525c35 minced: add v0.4.2 (#34141) 2022-11-26 23:30:56 +01:00
Hans Fangohr
17ca86a309 Octopus: branch for Octopus development is now "main" (#34128)
Historically, development of the Octopus code was done on the "develop" branch
on https://gitlab.com/octopus-code/octopus but now development takes place on
"main" (since Q3 2022).

This PR keeps the Spack label `octopus@develop`, as that indicates the development
branch on git more clearly than `octopus@main` would, while fetching from the `main`
branch (there is no real choice here: the `develop` branch is not touched anymore).
Sticking to `octopus@develop` as the version label also keeps backwards compatibility.
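
In recipe terms this amounts to something like the following (a minimal sketch):

```python
# The "develop" label stays, but it now tracks the renamed "main" branch.
version("develop", branch="main")
```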
2022-11-26 15:29:49 -07:00
marcost2
ce71a38703 nvtop: Add 2.0.3, 2.0.4, 3.0.0 and 3.0.1 (#34145)
* Also add the option to compile support for Intel GPUs
2022-11-26 21:04:48 +01:00
Satish Balay
12c23f2724 Revert "url_exists related improvements (#34095)" (#34144)
This reverts commit d06fd26c9a.

The problem is that Bitbucket's API forwards download requests to an S3 bucket using a temporary URL. This URL includes a signature for the request, which embeds the HTTP verb. That means only GET requests are allowed, and HEAD requests would fail verification, leading to 403 errors. The same is observed when using `curl -LI ...`.
2022-11-26 17:56:36 +00:00
Carlos Bederián
b8ae0fbbf4 amdblis: symlink libblis-mt to libblis (#32819) 2022-11-26 00:33:29 +01:00
Sebastian Ehlert
6b5c86e0be toml-f: add 0.2.4 and 0.3.1 (#34025) 2022-11-25 16:29:58 -07:00
Victoria Cherkas
1ed1b49c9b metkit, fdb: Add latest versions (#33289) 2022-11-25 23:54:51 +01:00
petertea
4265d5e111 Update TotalView versions and website (#33418)
Co-authored-by: Peter Thompson <thompson81@llnl.gov>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-11-25 23:42:56 +01:00
snehring
8c0fb91d4e ffmpeg: adding version 5.1.2 (#33758)
* ffmpeg: add version 5.1.2 and switch to conditional variants

Also: py-torchvision: restrict ffmpeg dependency
2022-11-25 23:37:17 +01:00
Thomas Madlener
567532b9e5 lcio: Add new version and restrictions on c++ standard (#33997) 2022-11-25 23:08:23 +01:00
Adam J. Stewart
47d59e571e py-pandas: add v1.5.2 (#34091) 2022-11-25 23:03:16 +01:00
Adam J. Stewart
93ff19c9b7 py-pytorch-lightning: add v1.8.3 (#34096) 2022-11-25 22:59:39 +01:00
Harmen Stoppels
2167cbf72c Track locks by (dev, ino); close file handlers between tests (#34122) 2022-11-25 10:57:33 +01:00
Sergey Kosukhin
7a5e527cab zlib: fix shared libraries when '%nvhpc' (#34039) 2022-11-24 20:10:18 +01:00
iarspider
a25868594c Add checksum for py-uncertainties 3.1.7 (#34116) 2022-11-24 12:11:58 -06:00
Brent Huisman
dd5263694b Arbor: Yank v0.5 (#34094)
v0.5 does not build due to a change in setting `arch` introduced in v0.5.2; compatibility with v0.5 was not kept in `arbor/package.py`. Since v0.5.2 is compatible with `arbor/package.py` and is API-compatible with v0.5, any users relying on v0.5 can rely on v0.5.2 instead.
2022-11-24 11:19:32 +01:00
Thomas-Ulrich
5fca1c9aff hipsycl: add v0.9.3 (#34052) 2022-11-23 23:14:01 -07:00
Tim Haines
1d7393c281 dyninst: add checksums for all supported versions (#34051) 2022-11-24 03:45:56 +01:00
Jen Herting
f0bc551718 New package: py-kt-legacy (#34104)
* first build of keras-tuner with dataset kt-legacy

* [py-kt-legacy] fixed homepage

* [py-kt-legacy] depends on setuptools

* [py-kt-legacy] fixed import

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: Sid Pendelberry <sid@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-11-23 19:26:21 -07:00
Adam J. Stewart
46b9a09843 py-cartopy: older versions don't support newer matplotlib (#34109) 2022-11-24 02:44:20 +01:00
Stephen Sachs
c0898565b9 openfoam: Fix openfoam@2012_220610 %intel (add #include <array>) (#34088) 2022-11-24 02:34:40 +01:00
Benjamin Meyers
3018e7f63d Add py-urwid@2.1.2 (#34103)
* Add py-urwid@2.1.2

* [@spackbot] updating style on behalf of meyersbs
2022-11-23 19:25:02 -06:00
Satish Balay
dfa1a42420 petsc, slepc: enable parallel builds (#34024) 2022-11-24 02:16:20 +01:00
Adam J. Stewart
2c8ab85e6a gnuplot: fix build with Apple Clang (#34092) 2022-11-24 01:33:33 +01:00
Nicolas Cornu
b2505aed5c HighFive: bump to 2.6.2 (#34090) 2022-11-24 00:20:36 +01:00
Valentin Volkl
7847d4332e docs: update info on XCode requirements (#34097) 2022-11-24 00:20:09 +01:00
Massimiliano Culpo
70bcbba5eb ecflow: polish recipe (#34043) 2022-11-23 21:38:37 +01:00
Tom Scogland
0182603609 Control Werror by converting to Wno-error (#30882)
Using `-Werror` is good practice for development and testing, but causes us a great
deal of heartburn supporting multiple compiler versions, especially as newer compiler
versions add warnings for released packages.  This PR adds support for suppressing
`-Werror` through spack's compiler wrappers.  There are currently three modes for
the `flags:keep_werror` setting:

* `none`: (default) cancel all `-Werror`, `-Werror=*` and `-Werror-*` flags by
  converting them to `-Wno-error[=]*` flags (see the sketch after this section)
* `specific`: preserve explicitly selected warnings as errors, such as
  `-Werror=format-truncation`, but reverse the blanket `-Werror`
* `all`: keeps all `-Werror` flags

These can be set globally in config.yaml, through the config command-line flags, or
overridden by a particular package (some packages use Werror as a proxy for determining
support for other compiler features).  We chose to use this approach because:

1. removing `-Werror` flags entirely broke *many* build systems, especially autoconf-based
   ones, because of things like checking `-Werror=feature` and assuming that, if that did
   not error, other flags related to that feature would also work
2. Attempting to preserve `-Werror` in some phases but not others caused similar issues
3. The per-package setting came about because some packages, even with all these
   protections, still use `-Werror` unsafely.  Currently there are roughly 3 such packages
   known.
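
A minimal sketch of the default `none` mode (assumed behavior; the real logic lives in
Spack's compiler wrappers):

```python
# Rewrite blanket and specific -Werror flags to their -Wno-error forms.
def cancel_werror(flags):
    out = []
    for flag in flags:
        if flag == "-Werror":
            out.append("-Wno-error")
        elif flag.startswith("-Werror="):
            out.append("-Wno-error=" + flag[len("-Werror="):])
        else:
            out.append(flag)
    return out

print(cancel_werror(["-O2", "-Werror", "-Werror=format-truncation"]))
# ['-O2', '-Wno-error', '-Wno-error=format-truncation']
```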
2022-11-23 12:29:17 -08:00
Jen Herting
bf1b846f26 [py-antlr4-python3-runtime] Added versions 4.9.3 and 4.10 (#34102)
* Working updates to py-antlr4-python3-runtime and py-omegaconf

* [py-antlr4-python3-runtime] added version 4.9.3

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: Benjamin Meyers <bsmits@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-11-23 12:30:11 -07:00
Harmen Stoppels
d06fd26c9a url_exists related improvements (#34095)
For reasons beyond me Python thinks it's a great idea to upgrade HEAD
requests to GET requests when following redirects. So, this PR adds a
better `HTTPRedirectHandler`, and also moves some ad-hoc logic around
for dealing with disabling SSL certs verification.
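
A sketch of the redirect fix (assumed approach, not the exact class from this PR):

```python
import urllib.request

class HeadPreservingRedirectHandler(urllib.request.HTTPRedirectHandler):
    """Keep the original HTTP verb across redirects instead of letting
    urllib silently rewrite a HEAD request into a GET."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        new = super().redirect_request(req, fp, code, msg, headers, newurl)
        if new is not None:
            new.method = req.get_method()  # e.g. keep HEAD as HEAD
        return new

opener = urllib.request.build_opener(HeadPreservingRedirectHandler())
```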

Also, I'm stumped by the fact that Spack's `url_exists` does not use
HEAD requests at all, so in certain cases Spack awkwardly downloads
something first to see if it can download it, and then downloads it
again because it knows it can download it. So, this PR ensures that both
urllib and botocore use HEAD requests.

Finally, it also removes some things that were there to support currently
unsupported Python versions.

Notice that the HTTP spec [section 10.3.2](https://datatracker.ietf.org/doc/html/rfc2616.html#section-10.3.2) just talks about how to deal
with POST requests on redirect (whether to follow or not):

>   If the 301 status code is received in response to a request other
>   than GET or HEAD, the user agent MUST NOT automatically redirect the
>   request unless it can be confirmed by the user, since this might
>   change the conditions under which the request was issued.

>   Note: When automatically redirecting a POST request after
>   receiving a 301 status code, some existing HTTP/1.0 user agents
>   will erroneously change it into a GET request.

Python has a comment about this; they chose to go with the "erroneous change".
But they then mess up the HEAD request while following the redirect, probably
because they were too busy discussing how to deal with POST.

See https://github.com/python/cpython/pull/99731
2022-11-23 19:26:24 +00:00
kwryankrattiger
5d2c9636ff E4S: Conservatively add ecp-data-vis-sdk (#33621)
* E4S: Conservatively add ecp-data-vis-sdk

* Remove ascent from CUDA SDK stack to stop hanging on Dray

* Adios2: Newer FindPython uses Python_EXECUTABLE
2022-11-23 11:01:30 -08:00
Harmen Stoppels
63e4406514 Revert "gitlab: Add shared PR mirror to places pipelines look for binaries. (#33746)" (#34087)
This reverts commit 5c4137baf1.
2022-11-23 10:41:52 -08:00
Jen Herting
d56380fc07 New package: py-imagecodecs (#34098)
* [libjpeg-turbo] Added version 2.1.3

* [imagecodecs] Added jpeg dependency, commented out conflicting libraries

* [WIP]

* [py-imagecodecs] modifying setup.py to work with spack install locations

* [py-imagecodecs] Removed comments and unneeded dependencies

* [py-imagecodecs] removed some comments and fixed up some flake8 complaints

* [py-imagecodecs] flake8

* [py-imagecodecs] fixed import

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-11-23 10:39:45 -07:00
Adam J. Stewart
f89cc96b0c libelf: fix build on macOS arm64 (#34036) 2022-11-23 09:27:44 -07:00
iarspider
cf952d41d8 Add checksum for py-cffi 1.15.1 (#34081) 2022-11-23 08:52:21 -07:00
iarspider
5f737c5a71 Add checksum for py-parsimonious 0.10.0 (#34079) 2022-11-23 08:59:30 -06:00
Bernhard Kaindl
a845b1f984 openloops: add check for added Fortran compiler (#34014) 2022-11-23 07:59:13 -07:00
Henning Glawe
b8d059e8f4 berkeleygw: use mpi variant for scalapack (#33948)
The package.py assumed "+mpi" in many places, without checking for the variant.
This problem went undetected, as a hard dependency on scalapack pulled an mpi
implementation into the dependency chain (this is also fixed).

Also, the +mpi variant is used to select between serial and parallel mode:
it has to enable both MPI and ScaLAPACK, since the two are interdependent
(compilation fails because each checks for the other if it is not enabled).
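
The pattern boils down to something like this (a hypothetical fragment, not the exact
recipe):

```python
# MPI and ScaLAPACK must be switched on and off together,
# so both hang off the same variant.
variant("mpi", default=True, description="Build with MPI and ScaLAPACK")
depends_on("mpi", when="+mpi")
depends_on("scalapack", when="+mpi")
```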

Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2022-11-23 07:42:03 -07:00
Cory Bloor
1006c77374 rocm: add minimum versions for amdgpu_targets (#34030) 2022-11-23 14:23:38 +01:00
Alec Scott
38d4fd7711 Add conflicts statements to flux-core to limit builds to linux based platforms (#34068) 2022-11-23 06:08:46 -07:00
Harmen Stoppels
643ce586de libxcrypt: add v4.33 (#34069) 2022-11-23 05:36:47 -07:00
Adam J. Stewart
5b3b0130f2 Build System docs: consistent headers (#34047) 2022-11-23 13:35:55 +01:00
Harmen Stoppels
55c77d659e make/ninja: use the right number of jobs (#34057) 2022-11-23 12:35:15 +01:00
Alexander Knieps
fe1c105161 capnproto: update to v0.10.2 (#34063)
Co-authored-by: Alexander Knieps <a.knieps@fz-juelich.de>
2022-11-23 12:32:10 +01:00
Bernhard Kaindl
09f2b6f5f5 boost: At least with older Xcode, boost can't build with lzma (#34075)
Reference: https://lists.boost.org/Archives/boost/2019/11/247380.php
As reported at the end of #33998 and at the link above, liblzma with older Xcode on
MacOSX 10 misses `_lzma_cputhreads`, so boost can't use liblzma there.
2022-11-23 03:36:22 -07:00
Matthieu Dorier
73fe21ba41 [mochi-margo] fixed dependency to Argobots (#34082) 2022-11-23 03:36:05 -07:00
Jen Herting
81fb87cedf New package: py-rasterstats (#34070)
* Fixed dependencies for rasterstats

* Fixed flake8 errors

* Fix flake8 error

* Cleans up package desc., adds build dependency on setuptools.

* Fixes flake8 error

Co-authored-by: Bailey Brown <bobits@rit.edu>
2022-11-23 02:55:14 -07:00
Tim Haines
7de39c44b1 dyninst: add v12.2.1 (#34050) 2022-11-23 10:13:00 +01:00
Keita Iwabuchi
c902e27e52 Metall package: add v0.22, v0.23, and v0.23.1 (#34073) 2022-11-23 09:57:32 +01:00
Takahiro Ueda
65b991a4c5 form: new version 4.3.0 (#34078) 2022-11-23 09:51:55 +01:00
Mosè Giordano
65520311a6 ccache: add new versions (#34067) 2022-11-23 01:42:44 -07:00
Umar Arshad
def79731d0 span-lite: Add new versions (#34072) 2022-11-23 02:21:58 +01:00
Michael Kuhn
0fd3c9f451 cmd/checksum: allow adding new versions to package (#24532)
This adds super-lazy maintainer mode to `spack checksum`: Instead of
only printing the new checksums to the terminal, `-a` and
`--add-to-package` will add the new checksums to the `package.py` file
and open it in the editor afterwards for final checks.
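
Illustrative only: the kind of line the new flags append to a `package.py`
(the version number and sha256 below are made up).

```python
# Appended automatically by `spack checksum --add-to-package`:
version("1.2.3", sha256="2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824")
```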
2022-11-22 16:30:49 -08:00
Adam J. Stewart
c5883fffd7 Python: drop EOL versions (#33898)
This PR removes [end of life](https://endoflife.date/python) versions of Python from Spack. Specifically, this includes all versions of Python older than 3.7.

See https://github.com/spack/spack/discussions/31824 for rationale. Deprecated in #32615. And #28003.

For anyone using software that relies on Python 2, you have a few options:

* Upgrade the software to support Python 3. The `2to3` tool may get you most of the way there, although more complex libraries may need manual tweaking.
* Add Python 2 as an [external package](https://spack.readthedocs.io/en/latest/build_settings.html#external-packages). Many Python libraries do not support Python 2, but you may be able to add older versions that did once upon a time.
* Use Spack 0.19. Spack 0.19 is the last release to officially support Python 3.6 and older
* Create and maintain your own [custom repository](https://spack.readthedocs.io/en/latest/repositories.html). Basically, you would need a package for Python 2 and any other Python 2-specific libraries you need.
2022-11-22 15:02:30 -08:00
Harmen Stoppels
4bf964e6b3 spack uninstall: use topo order (#34053) 2022-11-22 07:22:07 -07:00
Bernhard Kaindl
bcc0fda4e2 berkeleygw: fix build (no change to attribute spec.compiler_flags) (#34019) 2022-11-22 07:13:56 -07:00
iarspider
69987fd323 cpu-features: Fix appending -DBUILD_SHARED_LIBS (#34055)
Also:
* Use the release tarball for v0.7.0 to fix spack warning

Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-11-22 06:10:34 -07:00
Dominic Hofer
9a16234ed4 eckit: add v1.20.2, v1.16.3 (#33200)
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2022-11-22 05:45:17 -07:00
Massimiliano Culpo
bd198312c9 Revert "Warn about removal of deprecated format strings (#33829)" (#34056)
This reverts commit 7f9af8d4a0.
2022-11-22 12:35:36 +01:00
Greg Becker
7f9af8d4a0 Warn about removal of deprecated format strings (#33829)
Co-authored-by: becker33 <becker33@users.noreply.github.com>
2022-11-22 10:56:57 +01:00
John W. Parent
793a7bc6a9 Windows: add registry query and SDK/WDK packages (#33021)
* Add a WindowsRegistryView class, which can query for existing
  package installations on Windows. This is particularly important
  because some Windows packages (including those added here)
  do not allow two simultaneous installs, and this can be
  queried in order to provide a clear error message.
* Consolidate external path detection logic for Windows into
  WindowsKitExternalPaths and WindowsCompilerExternalPaths objects.
* Add external-only packages win-sdk and wgl
* Add win-wdk (including external detection) which depends on
  win-sdk
* Replace prior msmpi implementation with a source-based install
  (depends on win-wdk). This install can control the install
  destination (unlike the binary installation).
* Update MSVC compiler to choose vcvars based on win-sdk dependency
* Provide "msbuild" module-level variable to packages during build
* When creating symlinks on Windows, need to explicitly specify when
  a symlink target is a directory
* executables_in_path no longer defaults to using PATH (this is
  now expected to be taken care of by the caller)
2022-11-22 00:27:42 -08:00
genric
376afd631c py-kubernetes: add version 25.3.0 (#33915) 2022-11-22 05:46:41 +01:00
Harmen Stoppels
e287c6ac4b gnuconfig: bump with 2022-09-17 (#34035) 2022-11-22 05:31:29 +01:00
Sergey Kosukhin
e864744b60 py-fprettify: new version 0.3.7 (#34040) 2022-11-21 21:26:31 -06:00
Andrew W Elble
5b3af53b10 qiskit: updates (#33877) 2022-11-21 21:10:10 -06:00
Harmen Stoppels
44c22a54c9 Spec traversal: add option for topological ordering (#33817)
Spec traversals can now specify a topological ordering. A topologically-
ordered traversal with input specs X1, X2... will

* include all of X1, X2... and their children
* be ordered such that a given node is guaranteed to appear before any
  of its children in the traversal

Other notes:

* Input specs can be children of other input specs (this is useful if
  a user specifies a set of specs to uninstall: some of those specs
  might be children of others)
* `direction="parents"` will produce a reversed topological order
  (children always come before parents).
* `cover="edges"` will generate a list of edges L such that (a) input
  edges will always appear before output edges and (b) if you create
  a list with the destination of each edge in L the result is
  topologically ordered
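
A sketch of the node ordering, assuming the option is exposed on
`Spec.traverse` as this message describes (the spec name is arbitrary):

```python
import spack.spec

root = spack.spec.Spec("zlib").concretized()
for node in root.traverse(order="topo"):
    # Topological guarantee: each node is yielded before any of its
    # children, so dependents always precede their dependencies here.
    print(node.name)
```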
2022-11-21 18:33:35 -08:00
Chris Green
f97f37550a libjpeg-turbo: make build_system settings comprehensive (#34046) 2022-11-22 03:15:32 +01:00
Massimiliano Culpo
0e4ee3d352 Speed-up a few unit-tests (#34044)
* test_suite.py: speed up slow test by using mock packages

* Don't resolve the sha during unit-tests

* Skip long-running test that fails, instead of executing it
2022-11-21 23:50:55 +01:00
Brian Van Essen
05fc800db9 Fixed the rdma-core package to find its external library (#33798) 2022-11-21 15:45:47 -07:00
Dom Heinzeller
2387c116ad ecflow: add v5.8.3, update with changes from JCSDA-EMC fork (#34038) 2022-11-21 11:01:52 -07:00
Scott Wittenburg
6411cbd803 ci: restore ability to reproduce gitlab job failures (#33953) 2022-11-21 10:39:03 -06:00
Harmen Stoppels
8ea366b33f uninstall: fix accidental cubic complexity (#34005)
* uninstall: fix accidental cubic complexity

Currently spack uninstall runs in worst case cubic time complexity
thanks to traversal during traversal during traversal while collecting
the specs to be uninstalled.

Also brings down the number of error messages printed to something
linear in the number of matching specs instead of quadratic.
2022-11-21 16:44:48 +01:00
Drew Whitehouse
9a2fbf373c openvdb: update to v10.0.0 (#33835) 2022-11-21 06:38:28 +01:00
Mosè Giordano
9e1fef8813 texinfo: require also makeinfo executable (#33370)
* texinfo: require also `makeinfo` executable
* texinfo: add versions 6.6, 6.7, 6.8
* texinfo: add `info` and `makeinfo` sanity checks
2022-11-21 05:20:11 +01:00
Alec Scott
f8a6e3ad90 hugo: add v0.106.0 (#34023) 2022-11-21 04:22:32 +01:00
Adam J. Stewart
0706919b09 Python: specify tcl/tk version requirements (#34027) 2022-11-20 19:13:47 -06:00
Wouter Deconinck
b9b93ce272 qt6: new packages (#29555)
* qt6: initial commit of several basic qt6 packages

* Qt6: fix style issues

* [qt6] fix style issues, trailing spaces

* [qt6] rename to qt-* ecosystem; remove imports

* [qt6] rename dependencies; change version strings

* [qt6] list_urls

* [qt6] homepage links

* [qt6] missing closing quotes failed style check

* qt-declarative: use private _versions

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* qt-quick3d, qt-quicktimeline, qt-shadertools: use private _versions

* qt-base: rework feature defines and use run_tests

* qt: new version 6.2.4

* flake8 whitespace before comma

* qt-base: variant opengl when +gui

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* qt6: rebase and apply new black style

* qt6: apply style isort fixes

* qt6: new version 6.3.0 and 6.3.1

* qt6: add 6.3.0 and 6.3.1 to versions list

* qt6: multi-argument join_path

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* qt-base: fix isort

* qt-shadertools: no cmake_args needed

* qt-declarative: imports up front

* qt-quick3d: fix import

* qt-declarative: remove useless cmake_args

* qt-shadertools: imports and join_path fixes

* qt-quick3d: join_path fixes

* qt-declarative: join_path fixes

* Update features based on gui usage

* Update dependencies, cmake args, mac support

* Update features based on linux

* More updates

* qt-base: fix style

* qt-base: archive_files join_path

* qt-base: new version 6.3.2

* qt-{declarative,quick3d,quicktimeline,shadertools}@6.3.2

* qt-base: require libxcb@1.13: and use system xcb_xinput when on linux

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-11-20 20:03:52 -05:00
Pedro Ciambra
87cb9760ce mold: new package for the mold linker (#34017) 2022-11-21 01:32:47 +01:00
Adam J. Stewart
d472e28bfe py-numpy: add v1.23.5 (#34026) 2022-11-21 00:40:37 +01:00
Bernhard Kaindl
dbc81549db krb5: Add new versions 1.19.4 and 1.20.1 (#34021)
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2022-11-21 00:39:13 +01:00
Sam Grayson
dc00c4fdae Update dask and related packages (#33925)
* Update dask and related packages

* Update package dependency specs

* Run spack style

* Add new version of locket

* Respond to comments

* Added constraints

* Add version constraints for py-dask+distributed

* Run spack style

* Update var/spack/repos/builtin/packages/py-dask/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Deprecated dask versions

* Deprecated more dask and distributed

* spack style --fix

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-11-20 12:43:32 -06:00
Bernhard Kaindl
f1b9da16c8 bash: Update 5.1 to 5.1.16 (#34022) 2022-11-20 18:36:24 +01:00
Adam J. Stewart
93ce943301 py-torch: add note about MPS variant (#34018) 2022-11-19 18:10:06 -07:00
Jen Herting
632b36ab5d New package: py-ahpy (#34008)
* first build of ahpy

* updated to limit python to >4

* added from spack.package import * to >4

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-11-19 12:10:14 -07:00
Paul Romano
94c76c5823 OpenMC: add v0.13.2 (#33903)
* openmc: add v0.13.2

* Fix style formatting

* Update Python version dependency

* Update numpy version dependency
2022-11-19 12:22:58 -06:00
Massimiliano Culpo
45b4cedb7e spack find: remove deprecated "--bootstrap" option (#34015) 2022-11-19 16:09:34 +01:00
Erik Schnetter
6d0a8f78b2 libxcrypt: Disable -Werror (#34013) 2022-11-19 06:29:50 -07:00
Chris Green
409cf185ce package_base.py: Fix #34006: test msg needs to be a string (#34007) 2022-11-19 13:02:51 +01:00
iarspider
602984460d Boost: enable lzma and zstd iostreams (#33998) 2022-11-19 12:07:52 +01:00
Olivier Cessenat
62b1d52a1e graphviz: remove 1. cyclic dep when +pangocairo and 2. error with poppler+glib (#32120)
* graphviz: remove cyclic dep to svg when +pangocairo, and fix error with poppler+glib
2022-11-19 11:53:09 +01:00
Jim Edwards
790bd175e0 parallelio: update package to use mpi-serial, add extra module info (#33153) 2022-11-19 03:50:01 -07:00
Adam J. Stewart
2f057d729d py-scipy: add v1.9 (#31810) 2022-11-19 11:16:01 +01:00
Jerome Soumagne
a124185090 mercury: add version 2.2.0 (#31966)
add psm, psm2 and hwloc variants

Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-11-19 11:12:52 +01:00
Michael Kuhn
c5235bbe86 sqlite: add 3.40.0 (#33975) 2022-11-18 18:42:03 -07:00
Chris Green
e715901cb2 PackageBase should not define builder legacy attributes (#33942)
* Add a regression test for 33928

* PackageBase should not set `(build|install)_time_test_callbacks`

* Fix audits by preserving the current semantic

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-11-18 22:22:51 +01:00
Robert Blake
1db914f567 Updating faiss with new versions (#33983)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-11-18 21:43:41 +01:00
Robert Blake
688dae7058 Updating hiredis with new software versions (#33982) 2022-11-18 21:43:20 +01:00
Richard Berger
ddb460ec8d charliecloud: new version 0.30 (#34004) 2022-11-18 11:29:33 -08:00
snehring
703e5fe44a trnascan-se: adding missing build dep (#33978) 2022-11-18 11:22:21 -08:00
Bernhard Kaindl
c601bdf7bf gcc: Ensure matching assembler/binutils on RHEL8 (#33994)
gcc@10: needs binutils newer than RHEL7/8's for guaranteed operation. Therefore, on RHEL7/8, reject ~binutils: you need to add +binutils to be sure the binutils are recent enough.

See this discussion with the OpenBLAS devs for reference:
https://github.com/xianyi/OpenBLAS/issues/3805#issuecomment-1319878852
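
A hedged sketch of such a constraint in gcc's `package.py` (the version and
OS triggers here are assumed for illustration, not copied from the change):

```python
conflicts(
    "~binutils",
    when="@10: os=rhel8",
    msg="gcc@10: on RHEL7/8 needs +binutils: the system binutils are too old",
)
```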

Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2022-11-18 10:55:52 -08:00
Satish Balay
778dddc523 pflotran: add "rxn" variant (#33995) 2022-11-18 10:34:27 -08:00
Robert Underwood
acc19ad34f LibPressio support for MGARD (#33999)
Co-authored-by: Robert Underwood <runderwood@anl.gov>
2022-11-18 10:26:50 -08:00
Adam J. Stewart
f4826e1b33 py-mypy: add new versions; add new py-types packages (#34002)
* py-mypy: add new versions
* Add new packages
2022-11-18 10:23:55 -08:00
snehring
05ff7e657c py-cutadapt: adding version 4.1 (#33959)
* py-dnaio: adding version 0.9.1
py-cutadapt: adding version 4.1

* py-cutadapt: remove old python versions

* py-dnaio: remove old python versions

* py-cutadapt: add cython dep
2022-11-18 10:33:57 -07:00
Massimiliano Culpo
839a14c0ba Improve error message for requirements (#33988)
refers to #33985
2022-11-18 15:49:46 +01:00
Bernhard Kaindl
9aafbec121 openblas: Fix build on ARM Neoverse with gcc@:9 (no sve2+bf16) (#33968)
Also improve the InstallError message when +fortran but no FC was added.

Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2022-11-18 02:54:04 -07:00
iarspider
20071e0c04 Update libjpeg-turbo using new multibuildsystem approach (#33971)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-11-18 01:41:53 -07:00
Adam J. Stewart
51bb2f23a3 py-pytorch-lightning: add v1.8.2 (#33984) 2022-11-17 23:13:55 -07:00
Ross Miller
2060d51bd0 libxml2: make older versions depend on python@:3.9 (#33952)
Add a dependency on python versions less than 3.10 in order to work
around a bug in libxml2's configure script that fails to parse python
version strings with more than one character for the minor version.

The bug is present in v2.10.1, but has been fixed in 2.10.2.
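
A hedged sketch of the workaround (version bounds taken from the message; the
variant condition in `when=` is assumed):

```python
# Older libxml2's configure can't parse two-digit minors like "3.10".
depends_on("python@:3.9", when="@:2.10.1+python", type=("build", "run"))
```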

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-11-18 05:16:11 +01:00
Harmen Stoppels
e5af0ccc09 libuv-julia: use static libs for julia (#33980) 2022-11-18 03:09:07 +01:00
Jen Herting
284859e742 [lerc] added version 4.0.0 (#33974)
* [lerc] added version 4.0.0

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-11-17 19:50:58 -06:00
Cory Quammen
37e77f7a15 ParaView: add ParaView-5.11.0 new release (#33972) 2022-11-17 18:06:08 -07:00
Harmen Stoppels
d2432e1ba4 julia: 1.8.3 (#33976) 2022-11-17 16:18:32 -07:00
David
5809ba0e3f ompss-2: new package (#33844) 2022-11-17 15:03:32 -08:00
Robert Underwood
95e294b2e8 fixes for ndzip and sperr (#33882)
* fixes for ndzip
* fix commits in spack
* new and fixed sperr release

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2022-11-17 14:25:48 -08:00
Glenn Johnson
cdaac58488 Spack Bioconductor package updates (#33852)
* add version 1.46.0 to bioconductor package r-a4
* add version 1.46.0 to bioconductor package r-a4base
* add version 1.46.0 to bioconductor package r-a4classif
* add version 1.46.0 to bioconductor package r-a4core
* add version 1.46.0 to bioconductor package r-a4preproc
* add version 1.46.0 to bioconductor package r-a4reporting
* add version 1.52.0 to bioconductor package r-absseq
* add version 1.28.0 to bioconductor package r-acde
* add version 1.76.0 to bioconductor package r-acgh
* add version 2.54.0 to bioconductor package r-acme
* add version 1.68.0 to bioconductor package r-adsplit
* add version 1.70.0 to bioconductor package r-affxparser
* add version 1.76.0 to bioconductor package r-affy
* add version 1.74.0 to bioconductor package r-affycomp
* add version 1.58.0 to bioconductor package r-affycompatible
* add version 1.56.0 to bioconductor package r-affycontam
* add version 1.70.0 to bioconductor package r-affycoretools
* add version 1.46.0 to bioconductor package r-affydata
* add version 1.50.0 to bioconductor package r-affyilm
* add version 1.68.0 to bioconductor package r-affyio
* add version 1.74.0 to bioconductor package r-affyplm
* add version 1.44.0 to bioconductor package r-affyrnadegradation
* add version 1.46.0 to bioconductor package r-agdex
* add version 3.30.0 to bioconductor package r-agilp
* add version 2.48.0 to bioconductor package r-agimicrorna
* add version 1.30.0 to bioconductor package r-aims
* add version 1.30.0 to bioconductor package r-aldex2
* add version 1.36.0 to bioconductor package r-allelicimbalance
* add version 1.24.0 to bioconductor package r-alpine
* add version 2.60.0 to bioconductor package r-altcdfenvs
* add version 2.22.0 to bioconductor package r-anaquin
* add version 1.26.0 to bioconductor package r-aneufinder
* add version 1.26.0 to bioconductor package r-aneufinderdata
* add version 1.70.0 to bioconductor package r-annaffy
* add version 1.76.0 to bioconductor package r-annotate
* add version 1.60.0 to bioconductor package r-annotationdbi
* add version 1.22.0 to bioconductor package r-annotationfilter
* add version 1.40.0 to bioconductor package r-annotationforge
* add version 3.6.0 to bioconductor package r-annotationhub
* add version 3.28.0 to bioconductor package r-aroma-light
* add version 1.30.0 to bioconductor package r-bamsignals
* add version 2.14.0 to bioconductor package r-beachmat
* add version 2.58.0 to bioconductor package r-biobase
* add version 2.6.0 to bioconductor package r-biocfilecache
* add version 0.44.0 to bioconductor package r-biocgenerics
* add version 1.8.0 to bioconductor package r-biocio
* add version 1.16.0 to bioconductor package r-biocneighbors
* add version 1.32.1 to bioconductor package r-biocparallel
* add version 1.14.0 to bioconductor package r-biocsingular
* add version 2.26.0 to bioconductor package r-biocstyle
* add version 3.16.0 to bioconductor package r-biocversion
* add version 2.54.0 to bioconductor package r-biomart
* add version 1.26.0 to bioconductor package r-biomformat
* add version 2.66.0 to bioconductor package r-biostrings
* add version 1.46.0 to bioconductor package r-biovizbase
* add version 1.8.0 to bioconductor package r-bluster
* add version 1.66.1 to bioconductor package r-bsgenome
* add version 1.34.0 to bioconductor package r-bsseq
* add version 1.40.0 to bioconductor package r-bumphunter
* add version 2.64.0 to bioconductor package r-category
* add version 2.28.0 to bioconductor package r-champ
* add version 2.30.0 to bioconductor package r-champdata
* add version 1.48.0 to bioconductor package r-chipseq
* add version 4.6.0 to bioconductor package r-clusterprofiler
* add version 1.34.0 to bioconductor package r-cner
* add version 1.30.0 to bioconductor package r-codex
* add version 2.14.0 to bioconductor package r-complexheatmap
* add version 1.72.0 to bioconductor package r-ctc
* add version 2.26.0 to bioconductor package r-decipher
* add version 0.24.0 to bioconductor package r-delayedarray
* add version 1.20.0 to bioconductor package r-delayedmatrixstats
* add version 1.38.0 to bioconductor package r-deseq2
* add version 1.44.0 to bioconductor package r-dexseq
* add version 1.40.0 to bioconductor package r-dirichletmultinomial
* add version 2.12.0 to bioconductor package r-dmrcate
* add version 1.72.0 to bioconductor package r-dnacopy
* add version 3.24.1 to bioconductor package r-dose
* add version 2.46.0 to bioconductor package r-dss
* add version 3.40.0 to bioconductor package r-edger
* add version 1.18.0 to bioconductor package r-enrichplot
* add version 2.22.0 to bioconductor package r-ensembldb
* add version 1.44.0 to bioconductor package r-exomecopy
* add version 2.6.0 to bioconductor package r-experimenthub
* add version 1.24.0 to bioconductor package r-fgsea
* add version 2.70.0 to bioconductor package r-gcrma
* add version 1.34.0 to bioconductor package r-gdsfmt
* add version 1.80.0 to bioconductor package r-genefilter
* add version 1.34.0 to bioconductor package r-genelendatabase
* add version 1.70.0 to bioconductor package r-genemeta
* add version 1.76.0 to bioconductor package r-geneplotter
* add version 1.20.0 to bioconductor package r-genie3
* add version 1.34.3 to bioconductor package r-genomeinfodb
* update r-genomeinfodbdata
* add version 1.34.0 to bioconductor package r-genomicalignments
* add version 1.50.2 to bioconductor package r-genomicfeatures
* add version 1.50.1 to bioconductor package r-genomicranges
* add version 2.66.0 to bioconductor package r-geoquery
* add version 1.46.0 to bioconductor package r-ggbio
* add version 3.6.2 to bioconductor package r-ggtree
* add version 2.8.0 to bioconductor package r-glimma
* add version 1.10.0 to bioconductor package r-glmgampoi
* add version 5.52.0 to bioconductor package r-globaltest
* update r-go-db
* add version 1.18.0 to bioconductor package r-gofuncr
* add version 2.24.0 to bioconductor package r-gosemsim
* add version 1.50.0 to bioconductor package r-goseq
* add version 2.64.0 to bioconductor package r-gostats
* add version 1.76.0 to bioconductor package r-graph
* add version 1.60.0 to bioconductor package r-gseabase
* add version 1.30.0 to bioconductor package r-gtrellis
* add version 1.42.0 to bioconductor package r-gviz
* add version 1.26.0 to bioconductor package r-hdf5array
* add version 1.70.0 to bioconductor package r-hypergraph
* add version 1.34.0 to bioconductor package r-illumina450probevariants-db
* add version 0.40.0 to bioconductor package r-illuminaio
* add version 1.72.0 to bioconductor package r-impute
* add version 1.36.0 to bioconductor package r-interactivedisplaybase
* add version 2.32.0 to bioconductor package r-iranges
* add version 1.58.0 to bioconductor package r-kegggraph
* add version 1.38.0 to bioconductor package r-keggrest
* add version 3.54.0 to bioconductor package r-limma
* add version 2.50.0 to bioconductor package r-lumi
* add version 1.74.0 to bioconductor package r-makecdfenv
* add version 1.76.0 to bioconductor package r-marray
* add version 1.10.0 to bioconductor package r-matrixgenerics
* add version 1.6.0 to bioconductor package r-metapod
* add version 2.44.0 to bioconductor package r-methylumi
* add version 1.44.0 to bioconductor package r-minfi
* add version 1.32.0 to bioconductor package r-missmethyl
* add version 1.78.0 to bioconductor package r-mlinterfaces
* add version 1.10.0 to bioconductor package r-mscoreutils
* add version 2.24.0 to bioconductor package r-msnbase
* add version 2.54.0 to bioconductor package r-multtest
* add version 1.36.0 to bioconductor package r-mzid
* add version 2.32.0 to bioconductor package r-mzr
* add version 1.60.0 to bioconductor package r-oligoclasses
* update r-org-hs-eg-db
* add version 1.40.0 to bioconductor package r-organismdbi
* add version 1.38.0 to bioconductor package r-pathview
* add version 1.90.0 to bioconductor package r-pcamethods
* update r-pfam-db
* add version 1.42.0 to bioconductor package r-phyloseq
* add version 1.60.0 to bioconductor package r-preprocesscore
* add version 1.30.0 to bioconductor package r-protgenerics
* add version 1.32.0 to bioconductor package r-quantro
* add version 2.30.0 to bioconductor package r-qvalue
* add version 1.74.0 to bioconductor package r-rbgl
* add version 2.38.0 to bioconductor package r-reportingtools
* add version 2.42.0 to bioconductor package r-rgraphviz
* add version 2.42.0 to bioconductor package r-rhdf5
* add version 1.10.0 to bioconductor package r-rhdf5filters
* add version 1.20.0 to bioconductor package r-rhdf5lib
* add version 2.0.0 to bioconductor package r-rhtslib
* add version 1.74.0 to bioconductor package r-roc
* add version 1.26.0 to bioconductor package r-rots
* add version 2.14.0 to bioconductor package r-rsamtools
* add version 1.58.0 to bioconductor package r-rtracklayer
* add version 0.36.0 to bioconductor package r-s4vectors
* add version 1.6.0 to bioconductor package r-scaledmatrix
* add version 1.26.0 to bioconductor package r-scater
* add version 1.12.0 to bioconductor package r-scdblfinder
* add version 1.26.0 to bioconductor package r-scran
* add version 1.8.0 to bioconductor package r-scuttle
* add version 1.64.0 to bioconductor package r-seqlogo
* add version 1.56.0 to bioconductor package r-shortread
* add version 1.72.0 to bioconductor package r-siggenes
* add version 1.20.0 to bioconductor package r-singlecellexperiment
* add version 1.32.0 to bioconductor package r-snprelate
* add version 1.48.0 to bioconductor package r-snpstats
* add version 2.34.0 to bioconductor package r-somaticsignatures
* add version 1.10.0 to bioconductor package r-sparsematrixstats
* add version 1.38.0 to bioconductor package r-spem
* add version 1.36.0 to bioconductor package r-sseq
* add version 1.28.0 to bioconductor package r-summarizedexperiment
* add version 3.46.0 to bioconductor package r-sva
* add version 1.36.0 to bioconductor package r-tfbstools
* add version 1.20.0 to bioconductor package r-tmixclust
* add version 2.50.0 to bioconductor package r-topgo
* add version 1.22.0 to bioconductor package r-treeio
* add version 1.26.0 to bioconductor package r-tximport
* add version 1.26.0 to bioconductor package r-tximportdata
* add version 1.44.0 to bioconductor package r-variantannotation
* add version 3.66.0 to bioconductor package r-vsn
* add version 2.4.0 to bioconductor package r-watermelon
* add version 2.44.0 to bioconductor package r-xde
* add version 1.56.0 to bioconductor package r-xmapbridge
* add version 0.38.0 to bioconductor package r-xvector
* add version 1.24.0 to bioconductor package r-yapsa
* add version 1.24.0 to bioconductor package r-yarn
* add version 1.44.0 to bioconductor package r-zlibbioc
* make version resource consistent for r-bsgenome-hsapiens-ucsc-hg19
* make version resource consistent for r-go-db
* make version resource consistent for r-kegg-db
* make version resource consistent for r-org-hs-eg-db
* make version resource consistent for r-pfam-db
* new package: r-ggrastr
* Patches not needed for new version
* new package: r-hdo-db
* new package: r-ggnewscale
* new package: r-gson
* Actually depends on ggplot2@3.4.0:
* Fix formatting of r-hdo-db
* Fix dependency version specifiers
* Clean up duplicate dependency references
2022-11-17 14:04:45 -08:00
Erik Heeren
13389f7eb8 glib: fix URLs (#33919) 2022-11-17 13:58:07 -07:00
Adam J. Stewart
4964633614 py-tensorflow: add patch releases, remove v0.X (#33963) 2022-11-17 14:29:48 -06:00
Jared Popelar
381bedf369 Hdf5 package: build on Windows (#31141)
* Enable hdf5 build (including +mpi) on Windows
* This includes updates to hdf5 dependencies openssl (minor edit) and
  bzip2 (more-extensive edits)
* Add binary-based installation of msmpi (this is currently the only
  supported MPI implementation in Spack for Windows). Note that this
  does not install to the Spack-specified prefix. This implementation
  will be replaced with a source-based implementation

Co-authored-by: John Parent <john.parent@kitware.com>
2022-11-17 10:40:53 -08:00
John W. Parent
6811651a0f Update CMake version to 3.25.0 (#33957)
CMake has an official 3.25.0 release; update the package version to reflect this.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-11-17 10:58:38 -07:00
Chris Green
22aada0e20 Waf build system: fix typo in legacy_attributes (#33958)
Fix erroneous duplication of `build_time_test_callbacks` in
`legacy_attributes`: one of the duplicates should be
`install_time_test_callbacks`
2022-11-17 16:54:46 +01:00
Massimiliano Culpo
4a71020cd2 Python: do not set PYTHONHOME during build (#33956)
Setting PYTHONHOME is rarely needed (since each interpreter has
various ways of setting it automatically) and very often it is
difficult to get right manually.

For instance, the change done to set PYTHONHOME to
sysconfig["base_prefix"] broke bootstrapping dev dependencies
of Spack for me, when working inside a virtual environment in Linux.
2022-11-17 15:41:50 +01:00
Adam J. Stewart
294e6f80a0 py-rasterio: add v1.3.4 (#33961) 2022-11-17 15:23:26 +01:00
Harmen Stoppels
cc2d0eade6 docs: fix typo in multiple build systems (#33965) 2022-11-17 15:20:10 +01:00
Harmen Stoppels
f00e411287 relocate.py: small refactor for file_is_relocatable (#33967) 2022-11-17 13:21:27 +01:00
Massimiliano Culpo
da0a6280ac Remove deprecated subcommands from "spack bootstrap" (#33964)
These commands are slated for removal in v0.20
2022-11-17 12:42:57 +01:00
Ashwin Kumar
6ee6844473 Octopus: Separate serial and MPI dependencies (#33836)
* make mpi a variant instead of a dependency
* separate serial and MPI dependencies
* configure args depending on serial or mpi variant
* reformat with black
2022-11-17 04:34:03 -07:00
Adam J. Stewart
69822b0d82 py-torchmetrics: add v0.10.3 (#33962) 2022-11-17 11:18:57 +01:00
Michael Kuhn
93eecae0c3 Add sgid notice when running on AFS (#30247) 2022-11-17 09:17:41 +01:00
Massimiliano Culpo
61f5d85525 Python: fix bug detection, trying to access self (#33955)
Typo introduced in #33847
2022-11-17 08:38:49 +01:00
Ben Boeckel
a90e86de75 zlib, openssl: return to http URLs (#33324)
In #3113, `https` was removed to ensure that `curl` can be bootstrapped
without SSL being present. This was lost in #25672 which aimed to use
`https` where possible.

Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-11-16 22:25:55 -07:00
Stephen Sachs
7247a493ab [xz] icc does not support attribute __symver__ (#33839)
xz has added the attribute __symver__ for compilers that identify as GCC>=10.0 via
__GNUC__. Intel's `icc` sets __GNUC__ but currently does not support this
attribute:
https://community.intel.com/t5/Intel-C-Compiler/symver-not-supported/m-p/1429028/emcs_t/S2h8ZW1haWx8dG9waWNfc3Vic2NyaXB0aW9ufExBQVRWMjIyUFFZTlZTfDE0MjkwMjh8U1VCU0NSSVBUSU9OU3xoSw#M40459
2022-11-16 20:06:41 -07:00
Sergey Kosukhin
cd8ec60ae9 serialbox: update patch to handle UTF-8 conversion errors (#33317) 2022-11-16 20:06:28 -07:00
Manuela Kuhn
fe597dfb0c py-jupyter-server: add 1.21.0 (#33310) 2022-11-16 20:02:20 -07:00
Brian Vanderwende
c721aab006 lib/spack/spack/store.py: Fix #28170 for padding relocation (#33122) 2022-11-17 03:56:00 +01:00
Manuela Kuhn
6a08e9ed08 py-jupyterlab: add 3.4.8 (#33308) 2022-11-16 19:53:59 -07:00
Christian Kniep
7637efb363 blast-plus: update to 2.13.0 and add cpio as build dependency (#33187) 2022-11-17 03:42:25 +01:00
Sebastian Grabowski
b31f1b0353 * ninja: Set min. required version of re2c dependency, fixes #33127 (#33128)
Reviewed-by: haampie, bernhardkaindl
2022-11-17 03:28:06 +01:00
Jonathon Anderson
497682260f gettext: On ppc64le, for older versions, use system cdefs.h (#33411)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-11-17 02:55:02 +01:00
Massimiliano Culpo
e47beceb8a Clean unit-test workflow file (#33945)
Delete statements related to Python 2.7, and avoid installing
patchelf since now we can bootstrap it.
2022-11-16 23:28:49 +01:00
chenwany
067976f4b8 updated v2.11.8 (#33944) 2022-11-16 13:04:15 -08:00
Greg Becker
1263b5c444 initial implementation of slingshot detection (#33793) 2022-11-16 13:01:37 -08:00
psakievich
90f0a8eacc Upstreams: add canonicalize path (#33946) 2022-11-16 14:37:32 -06:00
John W. Parent
61a7420c94 Windows bootstrapping: remove unneeded call to add dll to PATH (#33622)
#32942 fixed bootstrapping on Windows by having the core Spack
code explicitly add the Clingo package bin/ directory as a
DLL path.

Since then, #33400 has been merged, which ensures that the Python
module installed by the Spack `clingo` package can find the DLLs
in bin/.

Note that this only works for Spack instances which have been
bootstrapped after #33400: for installations bootstrapped before
then, you will need to run `spack clean -b` (this would only
be needed for Spack instances running on Windows).
2022-11-16 12:04:57 -08:00
Stephen Sachs
39a1f1462b SIP build system: fix "python not defined in builder" (#33906)
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-11-16 13:36:25 -06:00
Wouter Deconinck
6de5d8e68c qt: new version 5.15.6 and 5.15.7 (#33933) 2022-11-16 11:35:01 -07:00
Robert Underwood
b0f2523350 add sz3 and mdz smoke test for inclusion in e4s (#33864)
* add sz3 and mdz smoke test

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2022-11-16 10:05:43 -08:00
snehring
bc8cc39871 py-isal: use external libisal (#33932)
* py-isal: adding some missing build deps

* py-isal: use external libisal
2022-11-16 10:58:15 -07:00
Pat Riehecky
b36a8f4f2e New Package: rarpd (#28686) 2022-11-16 08:29:58 -07:00
Harmen Stoppels
0a952f8b7b docs updates for spack env depfile (#33937) 2022-11-16 15:47:31 +01:00
Matthias Wolf
26a0384171 neovim: fix build dependencies (#33935)
Building neovim@0.8.0 complains (for me) about Lua's lpeg and mpack
packages not being available at build time. Removing the link-only
setting in the dependencies for these two packages fixes the build for
me.
2022-11-16 07:30:08 -07:00
Harmen Stoppels
d18cccf7c5 spack env depfile in Gitlab CI should use install-deps/pkg-version-hash target (#33936) 2022-11-16 14:02:56 +00:00
Kevin Broch
fbe6b4b486 Change code suggestions to output black formatter compliant code (#33931) 2022-11-16 12:08:20 +01:00
Nathalie Furmento
5fe08a5647 starpu: add v1.3.10 (#33934) 2022-11-16 12:04:31 +01:00
Vanessasaurus
ac2fc4f271 Automated deployment to update package flux-core 2022-11-09 (#33780)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2022-11-16 11:58:51 +01:00
Adam J. Stewart
93430496e2 Getting Started: Python 2 is no longer supported (#33927) 2022-11-16 08:36:49 +01:00
Michael Kuhn
901b31a7aa docs: fix typo (#33926) 2022-11-15 16:06:12 -08:00
Garrett Morrison
0cec2d3110 zfp: update for version 1.0.0 (#33452)
* zfp: updates and many fixes for version 1.0.0

Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-11-15 16:21:57 -07:00
Satish Balay
40e4884e8b xsdk: add version @0.8.0 (#33794)
- enable sundials+magma+ginkgo
- enable dealii+sundials
- disable dealii~cgal due to build errors
- enable petsc+rocm
- enable slepc+cuda+rocm
- enable strumpack+cuda+slate
- enable slate build with non-gcc compilers
- enable pumi+shared
- enable mfem+shared
- enable ginkgo+mpi
- add hiop
- and exago
  - use exago~ipopt due to mumps~mpi conflict with mpi.h
  - add raja variant [used by exago/hiop]
    ~raja builds exago in pflow-only mode - i.e. exago~hiop~ipopt~python~cuda ^hiop~cuda [Default on MacOS]

Co-authored-by: Cody Balos <balos1@llnl.gov>
2022-11-15 16:52:58 -06:00
Jean-Luc Fattebert
f18425a51f Update BML versions list in package.py (#33920)
* Update BML versions list in package.py
* Update package.py
2022-11-15 14:08:10 -08:00
Massimiliano Culpo
472893c5c4 Run Python 3.6 unit tests on ubuntu-20.04 (#33918) 2022-11-15 22:33:11 +01:00
SXS Bot
289bbf74f6 spectre: add v2022.11.15 (#33921)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2022-11-15 13:05:02 -08:00
Harmen Stoppels
a0182c069f Show time per phase (#33874) 2022-11-15 21:10:07 +01:00
Erik Schnetter
4ecb6ecaff meson: add new version 0.64.0 (#33880) 2022-11-15 12:42:11 -07:00
Saqib Khan
d5193f73d8 New Package: Prime95/Mprime (#33895)
* New Package: Prime95/Mprime
* Fix trailing whitespaces in prime95
* Fix checksum for prime95

Signed-off-by: saqibkh <saqibkhan@utexas.edu>
2022-11-15 11:39:02 -08:00
Brent Huisman
1aab5bb9f2 Add v0.8 to Arbor (#33916) 2022-11-15 12:57:29 -06:00
Erik Schnetter
8dda4ff60b nsimd: Update Python requirements (#33879)
We need Python 3.0:3.9
2022-11-15 10:05:14 -08:00
iarspider
0811f81a09 thepeg: make rivet dependency optional... (#33912)
* thepeg: make rivet dependency optional...
* add "libs" variant, move compiler flags to flag_handler

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-11-15 09:36:15 -08:00
Harmen Stoppels
af74680405 depfile: improve tab completion (#33773)
This PR allows you to do:

```
spack env create -d .
spack -e . add python
spack -e . concretize
spack -e . env depfile -o Makefile

make in<tab>              # -> install
make install-<tab>        # -> install-deps/
make install-deps/py<tab> # -> install-deps/python-x.y.z-hash
make install/zl<tab>      # -> install/zlib-x.y.z-hash

make SP<tab>              # -> make SPACK
make SPACK_<tab>          # -> make SPACK_INSTALL_FLAGS=
```
2022-11-15 18:03:17 +01:00
Harmen Stoppels
d1715c5fdf Fixup: start the timer before the phase (#33917) 2022-11-15 16:52:43 +01:00
Harmen Stoppels
b245f1ece1 Fix incorrect timer (#33900)
Revamp the timer so we always have a designated begin and end.

Fix a bug where the phase timer was stopped before the phase started,
resulting in incorrect timing reports in timers.json.
2022-11-15 16:33:47 +01:00
Harmen Stoppels
e10c47c53d glib: add missing libelf dep (#33894) 2022-11-15 07:18:10 -07:00
Harmen Stoppels
0697d20fd4 openssh: add libxcrypt (#33892) 2022-11-15 06:58:12 -07:00
Brian Van Essen
fd4f905ce5 External find now searches all dynamic linker paths (#33800)
Add spack.ld_so_conf.host_dynamic_linker_search_paths

Retrieve the current host runtime search paths for shared libraries;
for GNU and musl Linux we try to retrieve the dynamic linker from the
current Python interpreter and then find the corresponding config file
(e.g. ld.so.conf or ld-musl-<arch>.path). Similar can be done for
BSD and others, but this is not implemented yet. The default paths
are always returned. We don't check if the listed directories exist.

Use this in spack external find for libraries.
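
A usage sketch for the new helper (module path as written in this message;
the return value is assumed to be an iterable of directories):

```python
from spack.ld_so_conf import host_dynamic_linker_search_paths

for directory in host_dynamic_linker_search_paths():
    print(directory)  # e.g. /lib64, /usr/lib64, plus ld.so.conf entries
```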

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-11-15 14:48:15 +01:00
Harmen Stoppels
d36c7b20d2 python: missing libxcrypt dep (#33847)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-11-15 14:36:16 +01:00
Harmen Stoppels
2948248d7a Remove exit(0) (#33896)
Since they cause --backtrace to report backtraces even with exit code 0
2022-11-15 14:35:44 +01:00
Jonathon Anderson
850c54c3b1 Revert "fix perl libxcrypt.so dep (#33846)" (#33909)
This reverts commit bf1b2a828c, as libxcrypt's configure script requires Perl, leading to a circular dependency.
2022-11-15 14:04:31 +01:00
Harmen Stoppels
90fb16033e gitlab: report load in generate job (#33888) 2022-11-15 13:21:21 +01:00
Matthieu Dorier
13a68d547d mochi-margo: add v0.11 (#33910) 2022-11-15 13:19:18 +01:00
Massimiliano Culpo
857ae5a74b fixup 2022-11-15 12:42:48 +01:00
Massimiliano Culpo
b3124bff7c Stop using six in Spack (#33905)
Since we dropped support for Python 2.7, there's no need
to use `six` anymore. We still need to vendor it until
we update our vendored dependencies.
2022-11-15 10:07:54 +01:00
Scott Wittenburg
5c4137baf1 gitlab: Add shared PR mirror to places pipelines look for binaries. (#33746)
While binaries built for PRs that get merged must still be rebuilt
in develop pipelines, they can be used by other PRs that find they
would otherwise need to rebuild them.  Now that spackbot is
managing copying PR binaries from merged PRs into a shared location,
keeping it pruned to a reasonable size, and making sure the indices
are up to date, spack can use these mirrors as a potential source
of binaries.
2022-11-14 19:37:23 -07:00
Stephen Sachs
a9dcd4c01e [py-pmw-patched] needs setuptools to build (#33902)
Error message:
```
ModuleNotFoundError: No module named 'setuptools'
```

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-11-14 18:11:23 -06:00
Harmen Stoppels
2cd7322b11 swig: needs zlib (#33890) 2022-11-14 13:22:12 -07:00
Terry Cojean
f9e9ecd0c1 Ginkgo 1.5.0 version, new MPI variant, related fixes (#33838)
* Ginkgo 1.5.0 release, new MPI variant

* Fix ROCTHRUST/ROCPRIM issues

* Fix deal.II issue with Ginkgo 1.5.0

* Also fix hipRAND+rocRAND RPATH settings

* Turn off CCACHE for spack builds.

Co-authored-by: Veselin Dobrev <dobrev@llnl.gov>

Co-authored-by: Veselin Dobrev <dobrev@llnl.gov>
2022-11-14 12:33:11 -06:00
Veselin Dobrev
d756034161 Alquimia: tweak for building xsdk v0.8.0 with ROCm/HIP (#33881) 2022-11-14 12:15:50 -06:00
psakievich
2460c4fc28 Add $date option to the list of config variables (#33875)
I'm finding I often want the date in my paths and it would be nice if spack had a config variable for this.

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-11-14 10:13:30 -08:00
Cory Bloor
6e39efbb9a rocm: add all GFX9, GFX10 and GFX11 amdgpu_targets (#33871)
This change adds all documented AMDGPU processors from GFX9 through GFX11 and sorts the list.
2022-11-14 10:11:22 -08:00
Satish Balay
277e35c3b0 superlu-dist: add version 8.1.2 (#33868) 2022-11-14 10:09:55 -08:00
Harmen Stoppels
bf1b2a828c fix perl libxcrypt.so dep (#33846)
Detected using https://github.com/spack/spack/pull/28109
2022-11-14 18:56:43 +01:00
Dom Heinzeller
2913f8b42b hdf4: fix build on Apple M1 (#33740) 2022-11-14 17:49:34 +01:00
Harmen Stoppels
57f4c922e9 util-linux: fix deps (#33893) 2022-11-14 17:26:21 +01:00
Todd Gamblin
8d82fecce9 Update CHANGELOG.md for v0.19.0
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-11-14 08:22:29 -06:00
Todd Gamblin
96126cbf17 Update SECURITY.md for v0.19 2022-11-14 08:22:29 -06:00
Todd Gamblin
6ecb57e91f Bump version to v0.20.0.dev0 2022-11-14 08:22:29 -06:00
Harmen Stoppels
a75af62fe3 Get rid of context for exceptions outside PackageBase (#33887) 2022-11-14 14:11:22 +01:00
Massimiliano Culpo
e4e02dbeae Fix a bug/typo in a config_values.py fixture (#33886) 2022-11-14 05:26:14 -07:00
Massimiliano Culpo
3efa4ee26f Remove support for running with Python 2.7 (#33063)
* Remove CI jobs related to Python 2.7

* Remove Python 2.7 specific code from Spack core

* Remove externals for Python 2 only

* Remove llnl.util.compat
2022-11-14 13:11:28 +01:00
Harmen Stoppels
f4c3d98064 libxcrypt: 4.4.31 (#33885) 2022-11-14 04:30:04 -07:00
Harmen Stoppels
a4cec82841 Speed up traverse unit tests (#33840) 2022-11-14 09:34:14 +01:00
Veselin Dobrev
3812edd0db [plasma] add support for building with 'cray-libsci' (#33869) 2022-11-13 21:47:24 -06:00
Harmen Stoppels
3ea9c8529a tau: checksum (#33873) 2022-11-13 13:06:08 -07:00
Adam J. Stewart
ed28797f83 GDAL: add v3.6.0 (#33856)
* GDAL: add v3.6.0

* Explicitly control BASISU

* More reasonable variant defaults
2022-11-13 13:56:11 -06:00
Adam J. Stewart
eadb6ae774 py-pytorch-lightning: add v1.8.1 (#33854) 2022-11-13 11:45:23 -08:00
Veselin Dobrev
a5d35c3077 [sundials] fix cmake argument generation for '+magma' (#33858)
[dealii] force cmake to accept Scalapack settings from Spack
2022-11-13 09:50:57 -06:00
Todd Gamblin
3d811617e6 hotfix: ensure that schema is compatible with tutorial VM config
We added a hotfix to releases/v0.19 with a feature flag, but the flag
is incompatible with the config schema on `develop`.

- [x] Ensure schema is compatible on develop even though config option is unused.
2022-11-13 09:14:00 -06:00
Massimiliano Culpo
03224e52d2 Speed-up bootstrap and architecture unit tests (#33865)
* Speed-up bootstrap mirror unit test

The unit test doesn't need to concretize, since it checks
only metadata for the mirror.

* architecture.py: use "default_mock_concretization" for slow test
2022-11-13 13:09:22 +01:00
Greg Becker
4ebe57cd64 update tutorial command to newest release branch (#33867) 2022-11-12 13:29:38 -08:00
Greg Becker
343cd04a54 use spack.version_info as source of version truth for spack tutorial command (#33860)
* Use spack.spack_version_info as source of truth

Co-authored-by: Todd Gamblin <gamblin2@llnl.gov>
2022-11-12 11:42:59 -08:00
Morten Kristensen
ed45385b7b py-vermin: add latest version 1.5.1 (#33861) 2022-11-12 10:31:23 -06:00
Cameron Rutherford
8a3b596042 Update ExaGO for 1.5.0 release. (#33853)
* Update ExaGO for 1.5.0 release
2022-11-11 21:58:56 -06:00
Satish Balay
1792327874 magma: add version 2.7.0 (#33814)
* magma: add version 2.7.0

Co-authored-by: Stan Tomov <tomov@icl.utk.edu>
2022-11-11 21:58:02 -06:00
Veselin Dobrev
d0dedda9a9 [mfem] add a patch to fix some issues with building with ROCm (#33810) 2022-11-11 14:42:42 -08:00
Harmen Stoppels
368dde437a libxcrypt: 4.4.30 (#33845)
* libxcrypt: 4.4.30

* libxcrypt.so.2 by default

* add libs prop, since libxcrypt provides libcrypt
2022-11-11 15:22:17 -06:00
Massimiliano Culpo
022a2d2eaf Speed-up unit tests by caching default mock concretization (#33755) 2022-11-11 21:32:40 +01:00
kwryankrattiger
5f8511311c ParaView: Add variant for VisItBridge (#33783)
* ParaView: Add variant for VisItBridge

* ParaView: Add kwryankrattiger has package maintainer
2022-11-11 13:09:43 -07:00
Greg Becker
2d2c591633 ensure view projections for extensions always point to extendee (#33848) 2022-11-11 10:59:23 -07:00
Satish Balay
d49c992b23 xsdk: update maintainers (#33822) 2022-11-11 08:30:28 -06:00
Satish Balay
f1392bbd49 trilinos: add version 13.4.1 (#33533)
And update dependency on superlu-dist (now works with @8)
2022-11-11 09:28:47 -05:00
Harmen Stoppels
c14dc2f56a docs: updates related to extensions (#33837) 2022-11-11 13:24:17 +01:00
Harmen Stoppels
0f54a63dfd remove activate/deactivate support in favor of environments (#29317)
Environments and environment views have taken over the role of `spack activate/deactivate`, and we should deprecate these commands for several reasons:

- Global activation is a really poor idea:
   - Install prefixes should be immutable; since they can have multiple, unrelated dependents; see below
   - Added complexity elsewhere: verification of installations, tarballs for build caches, creation of environment views of packages with unrelated extensions "globally activated"... by removing the feature, it gets easier for people to contribute, and we'd end up with fewer bugs due to edge cases.
- Environments accomplish the same thing for non-global "activation" i.e. `spack view`, but better.

Also we write in the docs:

```
However, Spack global activations have two potential drawbacks:

#. Activated packages that involve compiled C extensions may still
   need their dependencies to be loaded manually.  For example,
   ``spack load openblas`` might be required to make ``py-numpy``
   work.

#. Global activations "break" a core feature of Spack, which is that
   multiple versions of a package can co-exist side-by-side.  For example,
   suppose you wish to run a Python package in two different
   environments but the same basic Python --- one with
   ``py-numpy@1.7`` and one with ``py-numpy@1.8``.  Spack extensions
   will not support this potential debugging use case.
```

Now that environments are established and views can take over the role of activation
non-destructively, we can remove global activation/deactivation.
2022-11-11 00:50:07 -08:00
Greg Becker
f11778bb02 remove deprecated top-level module config (#33828)
* remove deprecated top-level module config per deprecation in 0.18
2022-11-10 23:35:33 -08:00
Brian Van Essen
3437926cde Fixed a bug where the external HIP library is found in a nested directory, even on newer releases of ROCm. (#33772) 2022-11-10 23:32:55 -08:00
Pranith
d25375da55 emacs: Add version 28.2 (#33255) 2022-11-11 04:07:11 +01:00
Yang Zongze
0b302034df global: add --with-ncurses with prefix to configure (#33136) 2022-11-11 03:40:03 +01:00
Seth R. Johnson
b9f69a8dfa go-bootstrap: improve error message for Monterey (#31800)
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2022-11-11 02:22:37 +01:00
Loïc Pottier
c3e9aeeed0 redis-plus-plus: added initial support (#33803)
* redis-plus-plus: added initial support
* redis-plus-plus: use cmake arg provided by Spack
* redis-plus-plus: oops, tiny typo

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
Signed-off-by: Loïc Pottier <lpottier@arnor>
Co-authored-by: Loïc Pottier <lpottier@arnor>
2022-11-10 17:19:45 -08:00
Greg Becker
277234c044 remove deprecated concretization environment key (#33774) 2022-11-11 00:11:52 +00:00
Harmen Stoppels
0077a25639 Use bfs order in spack find --deps tree (#33782)
* Use bfs order in spack find --deps tree
* fix printing tests
2022-11-10 15:39:07 -08:00
Massimiliano Culpo
6a3e20023e Delete directory with vestigial configuration file (#33825) 2022-11-11 00:10:29 +01:00
MatthewLieber
f92987b11f adding sha for 7.0 release (#33823)
Co-authored-by: Matthew Lieber <lieber.31@osu.edu>
2022-11-10 14:24:05 -08:00
Satish Balay
61f198e8af xsdk: deprecate 0.6.0, and remove older deprecated releases (#33815)
xsdk-examples: deprecate @0.2.0, remove @0.1.0 in sync with older xsdk releases
2022-11-10 14:16:17 -08:00
Glenn Johnson
4d90d663a3 Cran updates (#33819)
* add version 1.7-20 to r-ade4
* add version 1.1-13 to r-adephylo
* add version 0.3-20 to r-adespatial
* add version 1.2-0 to r-afex
* add version 0.8-19 to r-amap
* add version 0.1.8 to r-aplot
* add version 2.0.8 to r-biasedurn
* add version 2.4-4 to r-bio3d
* add version 1.30.19 to r-biocmanager
* add version 1.2-9 to r-brobdingnag
* add version 0.4.1 to r-bslib
* add version 3.7.3 to r-callr
* add version 3.1-1 to r-car
* add version 0.3-62 to r-clue
* add version 1.2.0 to r-colourpicker
* add version 1.8.1 to r-commonmark
* add version 0.4.3 to r-cpp11
* add version 1.14.4 to r-data-table
* add version 1.34 to r-desolve
* add version 2.4.5 to r-devtools
* add version 0.6.30 to r-digest
* add version 0.26 to r-dt
* add version 1.7-12 to r-e1071
* add version 1.8.2 to r-emmeans
* add version 1.1.16 to r-exomedepth
* add version 0.1-8 to r-expint
* add version 0.4.0 to r-fontawesome
* add version 1.5-2 to r-fracdiff
* add version 1.10.0 to r-future-apply
* add version 3.0.1 to r-ggmap
* add version 3.4.0 to r-ggplot2
* add version 2.1.0 to r-ggraph
* add version 0.6.4 to r-ggsignif
* add version 0.6-7 to r-gmp
* add version 2022.10-2 to r-gparotation
* add version 0.8.3 to r-graphlayouts
* add version 1.8.8 to r-grbase
* add version 2.1-0 to r-gstat
* add version 0.18.6 to r-insight
* add version 1.3.1 to r-irkernel
* add version 1.8.3 to r-jsonlite
* add version 1.7.0 to r-lava
* add version 1.1-31 to r-lme4
* add version 5.6.17 to r-lpsolve
* add version 5.5.2.0-17.9 to r-lpsolveapi
* add version 1.2.9 to r-mapproj
* add version 3.4.1 to r-maps
* add version 1.1-5 to r-maptools
* add version 1.3 to r-markdown
* add version 6.0.0 to r-mclust
* add version 4.2-2 to r-memuse
* add version 1.8-41 to r-mgcv
* add version 1.2.5 to r-minqa
* add version 0.3.7 to r-nanotime
* add version 2.4.1.1 to r-nfactors
* add version 3.1-160 to r-nlme
* add version 2.7-1 to r-nmof
* add version 0.60-16 to r-np
* add version 2.0.4 to r-openssl
* add version 4.2.5.1 to r-openxlsx
* add version 0.3-8 to r-pbdzmq
* add version 2.0-3 to r-pcapp
* add version 0.7.0 to r-philentropy
* add version 2.0.3 to r-pkgcache
* add version 1.3.1 to r-pkgload
* add version 1.10-4 to r-polyclip
* add version 3.8.0 to r-processx
* add version 1.7.2 to r-ps
* add version 2.12.1 to r-r-utils
* add version 1.2.4 to r-ragg
* add version 3.7 to r-rainbow
* add version 0.0.20 to r-rcppannoy
* add version 0.3.3.9.3 to r-rcppeigen
* add version 0.3.12 to r-rcppgsl
* add version 1.0.2 to r-recipes
* add version 1.1.10 to r-rmutil
* add version 0.10.24 to r-rmysql
* add version 2.4.8 to r-rnexml
* add version 4.1.19 to r-rpart
* add version 1.7-2 to r-rrcov
* add version 0.8.28 to r-rsconnect
* add version 0.25 to r-servr
* add version 1.7.3 to r-shiny
* add version 3.0-0 to r-spatstat-data
* add version 3.0-3 to r-spatstat-geom
* add version 3.0-1 to r-spatstat-random
* add version 3.0-0 to r-spatstat-sparse
* add version 3.0-1 to r-spatstat-utils
* add version 1.8.0 to r-styler
* add version 3.4.1 to r-sys
* add version 1.5-2 to r-tclust
* add version 0.0.9 to r-tfmpvalue
* add version 1.2.0 to r-tidyselect
* add version 0.10-52 to r-tseries
* add version 4.2.2 to r-v8
* add version 0.5.0 to r-vctrs
* add version 2.6-4 to r-vegan
* add version 0.7.0 to r-wk
* add version 0.34 to r-xfun
* add version 1.0.6 to r-xlconnect
* add version 3.99-0.12 to r-xml
* add version 0.12.2 to r-xts
* add version 1.0-33 to r-yaimpute
* add version 2.3.6 to r-yaml
* add version 2.2.2 to r-zip
* add version 2.2-7 to r-deoptim
* add version 4.3.1 to r-ergm
* add version 0.18 to r-evaluate
* add version 1.29.0 to r-future
* add version 0.0.8 to r-ggfun
* add version 0.9.2 to r-ggrepel
* add version 1.9.0 to r-lubridate
* add version 4.10.1 to r-plotly
* add version 0.2.12 to r-rcppcctz
* add version 1.2 to r-rook
* add version 1.6-1 to r-segmented
* add version 4.2.1 to r-seurat
* add version 4.1.3 to r-seuratobject
* add version 1.0-9 to r-sf
* add version 1.5-1 to r-sp
* add version 1.8.1 to r-styler
* new package: r-timechange
* new package: r-stars
* new package: r-sftime
* new package: r-spatstat-explore

Co-authored-by: glennpj <glennpj@users.noreply.github.com>
2022-11-10 12:06:13 -07:00
Rémi Lacroix
7a7e9eb04f pandoc: add version 2.19.2 (#33811) 2022-11-10 10:47:28 -08:00
Chris White
3ea4b53bf6 add utilities and examples variants to conduit (#33804) 2022-11-10 11:05:24 -07:00
Olivier Cessenat
ad0d908d8d octave: new version 7.3.0 (#33788) 2022-11-10 09:42:00 -08:00
Harmen Stoppels
9a793fe01b gcc: drop target bootstrap flags for aarch64 (#33813)
See https://github.com/spack/spack/issues/31184

GCC bootstrap logic adds `-mcpu` for libatomic (iirc), which conflicts
with the `-march` flag we provide.
2022-11-10 17:42:45 +01:00
Dave Love
6dd3c78924 elpa: Fix build on ppc64le (#33639) 2022-11-10 14:51:17 +01:00
Axel Huebl
5b080d63fb pybind11: v2.10.1 (#33645)
Add the latest release of pybind11:
  https://github.com/pybind/pybind11/releases/tag/v2.10.1
2022-11-10 05:01:57 -07:00
Adam J. Stewart
ea8e3c27a4 py-einops: add v0.6.0 (#33799)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-11-10 11:54:14 +01:00
iarspider
30ffd6d33e dcap: add variant ~plugins: Disables the build of plugins (#33524) 2022-11-10 11:20:32 +01:00
iarspider
c1aec72f60 cpu-features: Add variant to enable BUILD_SHARED_LIBS=True (#33809)
* Allow building shared libraries for cpu-features
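
A hedged sketch of the pattern (the variant name "shared" is assumed): map
the variant onto CMake's flag with the standard helper instead of appending
the string by hand.

```python
def cmake_args(self):
    return [self.define_from_variant("BUILD_SHARED_LIBS", "shared")]
```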
2022-11-10 11:14:52 +01:00
Massimiliano Culpo
cfd0dc6d89 spack location: fix attribute lookup after multiple build systems (#33791)
fixes #33785
2022-11-10 01:53:59 -07:00
Satish Balay
60b3d32072 py-mpi4py: add version 3.1.4 (#33805) 2022-11-09 23:33:56 -07:00
Glenn Johnson
5142ebdd57 udunits: Add libs property to recipe to find libudunits2 (#33764) 2022-11-10 01:43:20 +01:00
Harmen Stoppels
6b782e6d7e ucx: fix int overflow: use ssize_t (#33784) 2022-11-10 01:18:22 +01:00
Saqib Khan
168bced888 New Package: stressapptest (#33736)
Signed-off-by: saqibkh <saqibkhan@utexas.edu>
2022-11-10 01:11:46 +01:00
Saqib Khan
489de38890 New Package: y-cruncher (#33754)
y-cruncher is a program that can compute Pi and
other constants to trillions of digits.

Signed-off-by: saqibkh <saqibkhan@utexas.edu>
2022-11-10 00:50:54 +01:00
Robert Underwood
2a20520cc8 updates and fixes for libpressio (#33789)
* updates and fixes for libpressio

* differentiate between standalone and build tests

* add e4s tags

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2022-11-09 16:22:58 -07:00
Jean Luca Bez
ae6213b193 New package: py-darshan (#33430)
* include py-darshan

* include requested changes

* fix required versions

* fix style

* fix style

* Update package.py

* Update var/spack/repos/builtin/packages/py-darshan/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-11-09 16:44:45 -06:00
Chris White
bb1cd430c0 always use the cxx compiler as a host compiler (#33771) 2022-11-09 13:22:43 -08:00
Filippo Spiga
36877abd02 Adding libpfm 4.12.0 (#33779) 2022-11-09 12:54:50 -08:00
snehring
62db008e42 sentieon-genomics: adding version 202112.06 (#33786) 2022-11-09 12:52:57 -08:00
James Willenbring
b10d75b1c6 Update package.py (#33787)
Proposing to add myself as a maintainer of the Trilinos Spack package after a related conversation with @kuberry.
2022-11-09 12:40:33 -08:00
kwryankrattiger
078767946c Boost: Change comment to conflict for MPI/Python (#33767)
Boost 1.64.0 has build errors when building the python and MPI modules. This was previously just a comment in the package.py which allowed broken specs to concretize. The comments are now expressed in conflicts to prevent this.
2022-11-09 09:05:27 -06:00
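A hedged sketch of the comment-to-conflict pattern in a `package.py` (constraints and messages illustrative, not the PR's exact directives):
```python
from spack.package import *

class Boost(Package):
    variant("python", default=False, description="Build Boost.Python")
    variant("mpi", default=False, description="Build Boost.MPI")

    # hard conflicts instead of an advisory comment, so broken specs
    # can no longer concretize
    conflicts("+python", when="@1.64.0", msg="Boost 1.64.0 fails to build the python module")
    conflicts("+mpi", when="@1.64.0", msg="Boost 1.64.0 fails to build the MPI module")
```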
snehring
9ca7165ef0 postgresql: fix weird spack message (#33770) 2022-11-09 15:25:22 +01:00
Harmen Stoppels
d1d668a9d5 Revert "fix racy sbang (#33549)" (#33778)
This reverts commit 4d28a64661.
2022-11-09 09:31:27 +00:00
Greg Becker
284c3a3fd8 ensure external PythonPackages have python deps (#33777)
Currently, external `PythonPackage`s cause install failures because the logic in `PythonPackage` assumes that it can ask for `spec["python"]`.  Because we chop off externals' dependencies, an external Python extension may not have a `python` dependency.

This PR resolves the issue by guaranteeing that a `python` node is present in one of two ways:
1. If there is already a `python` node in the DAG, we wire the external up to it.
2. If there is no existing `python` node, we wire up a synthetic external `python` node, and we assume that it has the same prefix as the external.

The assumption in (2) isn't always valid, but it's better than leaving the user with a non-working `PythonPackage`.

The logic here is specific to `python`, but other types of extensions could take advantage of it.  Packages need only define `update_external_dependencies(self)`, and this method will be called on externals after concretization.  This likely needs to be fleshed out in the future so that any added nodes are included in concretization, but for now we only bolt on dependencies post-concretization.

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2022-11-09 08:25:30 +00:00
Massimiliano Culpo
ec89c47aee Account for patchelf binaries when creating local bootstrap mirror (#33776)
This was overlooked when we added binary patchelf buildcaches
2022-11-08 23:05:03 -08:00
Veselin Dobrev
49114ffff7 MFEM: more updates for v4.5 (#33603)
* [mfem] updates related to building with cuda

* [hypre] tweak to support building with external ROCm/HIP

* [mfem] more tweaks related to building with +rocm

* [mfem] temporary (?) workaround for issue #33684

* [mfem] fix style

* [mfem] fix +shared+miniapps install
2022-11-08 22:56:58 -08:00
iarspider
05fd39477e Add checksum for py-protobuf 4.21.7, protobuf 21.7; remove protobuf and py-protobuf 2.x (#32977)
* Add checksum for py-protobuf 4.21.7, protobuf 21.7

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

* Update package.py

* Delete protoc2.5.0_aarch64.patch

* Update package.py

* Restore but deprecate py-protobuf 3.0.0a/b; deprecate py-tensorflow 0.x

* Fix audit

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-11-08 16:10:29 -07:00
1896 changed files with 25844 additions and 27011 deletions

View File

@@ -19,8 +19,8 @@ jobs:
package-audits:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages

View File

@@ -24,7 +24,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -62,7 +62,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -99,7 +99,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -133,7 +133,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup repo
@@ -158,7 +158,7 @@ jobs:
run: |
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -179,7 +179,7 @@ jobs:
run: |
brew install tree
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
- name: Bootstrap clingo
run: |
set -ex
@@ -204,7 +204,7 @@ jobs:
runs-on: ubuntu-20.04
steps:
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup repo
@@ -214,7 +214,7 @@ jobs:
- name: Bootstrap clingo
run: |
set -ex
for ver in '2.7' '3.6' '3.7' '3.8' '3.9' '3.10' ; do
for ver in '3.6' '3.7' '3.8' '3.9' '3.10' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
@@ -247,7 +247,7 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -283,7 +283,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -316,7 +316,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
@@ -333,7 +333,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh

View File

@@ -50,7 +50,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
- name: Set Container Tag Normal (Nightly)
run: |

View File

@@ -20,12 +20,6 @@ jobs:
uses: ./.github/workflows/valid-style.yml
with:
with_coverage: ${{ needs.changes.outputs.core }}
audit-ancient-python:
uses: ./.github/workflows/audit.yaml
needs: [ changes ]
with:
with_coverage: ${{ needs.changes.outputs.core }}
python_version: 2.7
all-prechecks:
needs: [ prechecks ]
runs-on: ubuntu-latest
@@ -41,7 +35,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
@@ -85,7 +79,7 @@ jobs:
needs: [ prechecks ]
uses: ./.github/workflows/windows_python.yml
all:
needs: [ windows, unit-tests, bootstrap, audit-ancient-python ]
needs: [ windows, unit-tests, bootstrap ]
runs-on: ubuntu-latest
steps:
- name: Success

View File

@@ -1,15 +1,9 @@
# (c) 2021 Lawrence Livermore National Laboratory
Set-Location spack
# (c) 2022 Lawrence Livermore National Laboratory
git config --global user.email "spack@example.com"
git config --global user.name "Test User"
git config --global core.longpaths true
# See https://github.com/git/git/security/advisories/GHSA-3wp6-j8xr-qw85 (CVE-2022-39253)
# This is needed to let some fixture in our unit-test suite run
git config --global protocol.file.allow always
if ($(git branch --show-current) -ne "develop")
{
git branch develop origin/develop

View File

@@ -2,10 +2,6 @@
git config --global user.email "spack@example.com"
git config --global user.name "Test User"
# See https://github.com/git/git/security/advisories/GHSA-3wp6-j8xr-qw85 (CVE-2022-39253)
# This is needed to let some fixture in our unit-test suite run
git config --global protocol.file.allow always
# create a local pr base branch
if [[ -n $GITHUB_BASE_REF ]]; then
git fetch origin "${GITHUB_BASE_REF}:${GITHUB_BASE_REF}"

View File

@@ -11,39 +11,46 @@ concurrency:
jobs:
# Run unit tests with different configurations on linux
ubuntu:
runs-on: ubuntu-latest
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['2.7', '3.6', '3.7', '3.8', '3.9', '3.10', '3.11']
os: [ubuntu-latest]
python-version: ['3.7', '3.8', '3.9', '3.10', '3.11']
concretizer: ['clingo']
on_develop:
- ${{ github.ref == 'refs/heads/develop' }}
include:
- python-version: 2.7
- python-version: '3.11'
os: ubuntu-latest
concretizer: original
on_develop: ${{ github.ref == 'refs/heads/develop' }}
- python-version: '3.11'
concretizer: original
- python-version: '3.6'
os: ubuntu-20.04
concretizer: clingo
on_develop: ${{ github.ref == 'refs/heads/develop' }}
exclude:
- python-version: '3.7'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
- python-version: '3.8'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
- python-version: '3.9'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
- python-version: '3.10'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -52,24 +59,11 @@ jobs:
# Needed for unit tests
sudo apt-get -y install \
coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
patchelf cmake bison libbison-dev kcov
cmake bison libbison-dev kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov[toml] pytest-xdist
# Install pytest-cov only on recent Python, to avoid stalling on Python 2.7 due
# to bugs on an unmaintained version of the package when used with xdist.
if [[ ${{ matrix.python-version }} != "2.7" ]]; then
pip install --upgrade pytest-cov
fi
# ensure style checks are not skipped in unit tests for python >= 3.6
# note that true/false (i.e., 1/0) are opposite in conditions in python and bash
if python -c 'import sys; sys.exit(not sys.version_info >= (3, 6))'; then
pip install --upgrade flake8 "isort>=4.3.5" "mypy>=0.900" "click==8.0.4" "black<=21.12b0"
fi
- name: Pin pathlib for Python 2.7
if: ${{ matrix.python-version == 2.7 }}
run: |
pip install -U pathlib2==2.3.6 toml
pip install --upgrade pip six setuptools pytest codecov[toml] pytest-xdist pytest-cov
pip install --upgrade flake8 "isort>=4.3.5" "mypy>=0.900" "click" "black"
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -82,6 +76,7 @@ jobs:
run: |
. share/spack/setup-env.sh
spack bootstrap disable spack-install
spack bootstrap now
spack -v solve zlib
- name: Run unit tests
env:
@@ -89,7 +84,7 @@ jobs:
SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
SPACK_TEST_PARALLEL: 2
COVERAGE: true
UNIT_TEST_COVERAGE: ${{ (matrix.python-version == '3.11') }}
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
@@ -99,10 +94,10 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -138,7 +133,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -150,25 +145,22 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack -d solve zlib
spack -d bootstrap now --dev
spack unit-test -k 'not cvs and not svn and not hg' -x --verbose
# Test for the clingo based solver (using clingo-cffi)
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: '3.11'
- name: Install System packages
run: |
sudo apt-get -y update
# Needed for unit tests
sudo apt-get -y install \
coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
patchelf kcov
sudo apt-get -y install coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml] pytest-cov clingo pytest-xdist
@@ -193,10 +185,10 @@ jobs:
matrix:
python-version: ["3.10"]
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages

View File

@@ -18,8 +18,8 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: '3.11'
cache: 'pip'
@@ -28,23 +28,23 @@ jobs:
pip install --upgrade pip
pip install --upgrade vermin
- name: vermin (Spack's Core)
run: vermin --backport argparse --violations --backport typing -t=2.7- -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport argparse --violations --backport typing -t=2.7- -t=3.6- -vvv var/spack/repos
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: '3.11'
cache: 'pip'
- name: Install Python packages
run: |
python3 -m pip install --upgrade pip six setuptools types-six click==8.0.2 'black==21.12b0' mypy isort clingo flake8
python3 -m pip install --upgrade pip six setuptools types-six black mypy isort clingo flake8
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.

View File

@@ -10,15 +10,15 @@ concurrency:
defaults:
run:
shell:
powershell Invoke-Expression -Command ".\share\spack\qa\windows_test_setup.ps1"; {0}
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
with:
python-version: 3.9
- name: Install Python packages
@@ -26,13 +26,11 @@ jobs:
python -m pip install --upgrade pip six pywin32 setuptools codecov pytest-cov clingo
- name: Create local develop
run: |
.\spack\.github\workflows\setup_git.ps1
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
echo F|xcopy .\spack\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
cd spack
dir
spack unit-test -x --verbose --cov --cov-config=pyproject.toml --ignore=lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
@@ -41,10 +39,10 @@ jobs:
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
with:
python-version: 3.9
- name: Install Python packages
@@ -52,12 +50,11 @@ jobs:
python -m pip install --upgrade pip six pywin32 setuptools codecov coverage pytest-cov clingo
- name: Create local develop
run: |
.\spack\.github\workflows\setup_git.ps1
./.github/workflows/setup_git.ps1
- name: Command Unit Test
run: |
echo F|xcopy .\spack\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
cd spack
spack unit-test -x --verbose --cov --cov-config=pyproject.toml lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
@@ -66,10 +63,10 @@ jobs:
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
- uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
with:
python-version: 3.9
- name: Install Python packages
@@ -78,81 +75,81 @@ jobs:
- name: Build Test
run: |
spack compiler find
echo F|xcopy .\spack\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
spack external find cmake
spack external find ninja
spack -d install abseil-cpp
make-installer:
runs-on: windows-latest
steps:
- name: Disable Windows Symlinks
run: |
git config --global core.symlinks false
shell:
powershell
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six pywin32 setuptools
- name: Add Light and Candle to Path
run: |
$env:WIX >> $GITHUB_PATH
- name: Run Installer
run: |
.\spack\share\spack\qa\setup_spack.ps1
spack make-installer -s spack -g SILENT pkg
echo "installer_root=$((pwd).Path)" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
env:
ProgressPreference: SilentlyContinue
- uses: actions/upload-artifact@83fd05a356d7e2593de66fc9913b3002723633cb
with:
name: Windows Spack Installer Bundle
path: ${{ env.installer_root }}\pkg\Spack.exe
- uses: actions/upload-artifact@83fd05a356d7e2593de66fc9913b3002723633cb
with:
name: Windows Spack Installer
path: ${{ env.installer_root}}\pkg\Spack.msi
execute-installer:
needs: make-installer
runs-on: windows-latest
defaults:
run:
shell: pwsh
steps:
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six pywin32 setuptools
- name: Setup installer directory
run: |
mkdir -p spack_installer
echo "spack_installer=$((pwd).Path)\spack_installer" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
- uses: actions/download-artifact@v3
with:
name: Windows Spack Installer Bundle
path: ${{ env.spack_installer }}
- name: Execute Bundled Installer
run: |
$proc = Start-Process ${{ env.spack_installer }}\spack.exe "/install /quiet" -Passthru
$handle = $proc.Handle # cache proc.Handle
$proc.WaitForExit();
$LASTEXITCODE
env:
ProgressPreference: SilentlyContinue
- uses: actions/download-artifact@v3
with:
name: Windows Spack Installer
path: ${{ env.spack_installer }}
- name: Execute MSI
run: |
$proc = Start-Process ${{ env.spack_installer }}\spack.msi "/quiet" -Passthru
$handle = $proc.Handle # cache proc.Handle
$proc.WaitForExit();
$LASTEXITCODE
# TODO: johnwparent - reduce the size of the installer operations
# make-installer:
# runs-on: windows-latest
# steps:
# - name: Disable Windows Symlinks
# run: |
# git config --global core.symlinks false
# shell:
# powershell
# - uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
# with:
# fetch-depth: 0
# - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
# with:
# python-version: 3.9
# - name: Install Python packages
# run: |
# python -m pip install --upgrade pip six pywin32 setuptools
# - name: Add Light and Candle to Path
# run: |
# $env:WIX >> $GITHUB_PATH
# - name: Run Installer
# run: |
# ./share/spack/qa/setup_spack_installer.ps1
# spack make-installer -s . -g SILENT pkg
# echo "installer_root=$((pwd).Path)" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
# env:
# ProgressPreference: SilentlyContinue
# - uses: actions/upload-artifact@83fd05a356d7e2593de66fc9913b3002723633cb
# with:
# name: Windows Spack Installer Bundle
# path: ${{ env.installer_root }}\pkg\Spack.exe
# - uses: actions/upload-artifact@83fd05a356d7e2593de66fc9913b3002723633cb
# with:
# name: Windows Spack Installer
# path: ${{ env.installer_root}}\pkg\Spack.msi
# execute-installer:
# needs: make-installer
# runs-on: windows-latest
# defaults:
# run:
# shell: pwsh
# steps:
# - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
# with:
# python-version: 3.9
# - name: Install Python packages
# run: |
# python -m pip install --upgrade pip six pywin32 setuptools
# - name: Setup installer directory
# run: |
# mkdir -p spack_installer
# echo "spack_installer=$((pwd).Path)\spack_installer" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
# - uses: actions/download-artifact@v3
# with:
# name: Windows Spack Installer Bundle
# path: ${{ env.spack_installer }}
# - name: Execute Bundled Installer
# run: |
# $proc = Start-Process ${{ env.spack_installer }}\spack.exe "/install /quiet" -Passthru
# $handle = $proc.Handle # cache proc.Handle
# $proc.WaitForExit();
# $LASTEXITCODE
# env:
# ProgressPreference: SilentlyContinue
# - uses: actions/download-artifact@v3
# with:
# name: Windows Spack Installer
# path: ${{ env.spack_installer }}
# - name: Execute MSI
# run: |
# $proc = Start-Process ${{ env.spack_installer }}\spack.msi "/quiet" -Passthru
# $handle = $proc.Handle # cache proc.Handle
# $proc.WaitForExit();
# $LASTEXITCODE

View File

@@ -1,16 +1,284 @@
# v0.19.0 (2022-11-11)
`v0.19.0` is a major feature release.
## Major features in this release
1. **Package requirements**
Spack's traditional [package preferences](
https://spack.readthedocs.io/en/latest/build_settings.html#package-preferences)
are soft, but we've added hard requirements to `packages.yaml` and `spack.yaml`
(#32528, #32369). Package requirements use the same syntax as specs:
```yaml
packages:
libfabric:
require: "@1.13.2"
mpich:
require:
- one_of: ["+cuda", "+rocm"]
```
More details in [the docs](
https://spack.readthedocs.io/en/latest/build_settings.html#package-requirements).
2. **Environment UI Improvements**
* Fewer surprising modifications to `spack.yaml` (#33711):
* `spack install` in an environment will no longer add to the `specs:` list; you'll
need to either use `spack add <spec>` or `spack install --add <spec>`.
* Similarly, `spack uninstall` will not remove from your environment's `specs:`
list; you'll need to use `spack remove` or `spack uninstall --remove`.
This will make it easier to manage an environment, as there is clear separation
between the stack to be installed (`spack.yaml`/`spack.lock`) and which parts of
it should be installed (`spack install` / `spack uninstall`).
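A minimal sketch of the new workflow (package name illustrative):
```console
$ spack add zlib                 # record the spec in spack.yaml
$ spack install                  # install everything in the environment
$ spack install --add zlib       # or add and install in one step
$ spack uninstall --remove zlib  # uninstall and drop it from spack.yaml
```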
* `concretizer:unify:true` is now the default mode for new environments (#31787)
We see more users creating `unify:true` environments now. Users who need
`unify:false` can add it to their environment to get the old behavior. This will
concretize every spec in the environment independently.
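For users opting back out, a minimal sketch of an environment's `spack.yaml` (spec illustrative):
```yaml
spack:
  concretizer:
    unify: false  # concretize every spec independently, as in v0.18
  specs:
  - zlib
```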
3. **Include environment configuration from URLs** (#29026, [docs](
https://spack.readthedocs.io/en/latest/environments.html#included-configurations))
You can now include configuration in your environment directly from a URL:
```yaml
spack:
include:
- https://github.com/path/to/raw/config/compilers.yaml
```
4. **Multiple Build Systems**
An increasing number of packages in the ecosystem need the ability to support
multiple build systems (#30738, [docs](
https://spack.readthedocs.io/en/latest/packaging_guide.html#multiple-build-systems)),
either across versions, across platforms, or within the same version of the software.
This has been hard to support through multiple inheritance, as methods from different
build system superclasses would conflict. `package.py` files can now define separate
builder classes with installation logic for different build systems, e.g.:
```python
class ArpackNg(CMakePackage, AutotoolsPackage):
build_system(
conditional("cmake", when="@0.64:"),
conditional("autotools", when="@:0.63"),
default="cmake",
)
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
def cmake_args(self):
pass
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
def configure_args(self):
pass
```
5. **Compiler and variant propagation**
Currently, compiler flags and variants are inconsistent: compiler flags set for a
package are inherited by its dependencies, while variants are not. We should have
these be consistent by allowing for inheritance to be enabled or disabled for both
variants and compiler flags.
Example syntax:
- `package ++variant`:
enabled variant that will be propagated to dependencies
- `package +variant`:
enabled variant that will NOT be propagated to dependencies
- `package ~~variant`:
disabled variant that will be propagated to dependencies
- `package ~variant`:
disabled variant that will NOT be propagated to dependencies
- `package cflags==-g`:
`cflags` will be propagated to dependencies
- `package cflags=-g`:
`cflags` will NOT be propagated to dependencies
Syntax for non-boolean variants is similar to compiler flags. More in the docs for
[variants](
https://spack.readthedocs.io/en/latest/basic_usage.html#variants) and [compiler flags](
https://spack.readthedocs.io/en/latest/basic_usage.html#compiler-flags).
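A short sketch combining both forms on one spec (package and variant names illustrative):
```console
$ spack install mpileaks ++debug cflags==-g  # variant and flag propagate to dependencies
$ spack install mpileaks +debug cflags=-g    # variant and flag apply to mpileaks only
```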
6. **Enhancements to git version specifiers**
* `v0.18.0` added the ability to use git commits as versions. You can now use the
`git.` prefix to specify git tags or branches as versions. All of these are valid git
versions in `v0.19` (#31200):
```console
foo@abcdef1234abcdef1234abcdef1234abcdef1234 # raw commit
foo@git.abcdef1234abcdef1234abcdef1234abcdef1234 # commit with git prefix
foo@git.develop # the develop branch
foo@git.0.19 # use the 0.19 tag
```
* `v0.19` also gives you more control over how Spack interprets git versions, in case
Spack cannot detect the version from the git repository. You can suffix a git
version with `=<version>` to force Spack to concretize it as a particular version
(#30998, #31914, #32257):
```console
# use mybranch, but treat it as version 3.2 for version comparison
foo@git.mybranch=3.2
# use the given commit, but treat it as develop for version comparison
foo@git.abcdef1234abcdef1234abcdef1234abcdef1234=develop
```
More in [the docs](
https://spack.readthedocs.io/en/latest/basic_usage.html#version-specifier)
7. **Changes to Cray EX Support**
Cray machines have historically had their own "platform" within Spack, because we
needed to go through the module system to leverage compilers and MPI installations on
these machines. The Cray EX programming environment now provides standalone `craycc`
executables and proper `mpicc` wrappers, so Spack can treat EX machines like Linux
with extra packages (#29392).
We expect this to greatly reduce bugs, as external packages and compilers can now be
used by prefix instead of through modules. We will also no longer be subject to
reproducibility issues when modules change from Cray PE release to release and from
site to site. This also simplifies dealing with the underlying Linux OS on cray
systems, as Spack will properly model the machine's OS as either SuSE or RHEL.
8. **Improvements to tests and testing in CI**
* `spack ci generate --tests` will generate a `.gitlab-ci.yml` file that not only does
builds but also runs tests for built packages (#27877). Public GitHub pipelines now
also run tests in CI.
* `spack test run --explicit` will only run tests for packages that are explicitly
installed, instead of all packages.
9. **Experimental binding link model**
You can add a new option to `config.yaml` to make Spack embed absolute paths to
needed shared libraries in ELF executables and shared libraries on Linux (#31948, [docs](
https://spack.readthedocs.io/en/latest/config_yaml.html#shared-linking-bind)):
```yaml
config:
shared_linking:
type: rpath
bind: true
```
This can improve launch time at scale for parallel applications, and it can make
installations less susceptible to environment variables like `LD_LIBRARY_PATH`,
especially when dealing with external libraries that use `RUNPATH`. You can think of
this as a faster, even higher-precedence version of `RPATH`.
## Other new features of note
* `spack spec` prints dependencies more legibly. Dependencies in the output now appear
at the *earliest* level of indentation possible (#33406)
* You can override `package.py` attributes like `url`, directly in `packages.yaml`
(#33275, [docs](
https://spack.readthedocs.io/en/latest/build_settings.html#assigning-package-attributes))
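A minimal sketch, assuming the `package_attributes:` key from #33275 (mirror URL illustrative):
```yaml
packages:
  openmpi:
    package_attributes:
      url: https://example.com/mirrors/openmpi-4.1.4.tar.bz2
```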
* There are a number of new architecture-related format strings you can use in Spack
configuration files to specify paths (#29810, [docs](
https://spack.readthedocs.io/en/latest/configuration.html#config-file-variables))
* Spack now supports bootstrapping Clingo on Windows (#33400)
* There is now support for an `RPATH`-like library model on Windows (#31930)
## Performance Improvements
* Major performance improvements for installation from binary caches (#27610, #33628,
#33636, #33608, #33590, #33496)
* Test suite can now be parallelized using `xdist` (used in GitHub Actions) (#32361)
* Reduce lock contention for parallel builds in environments (#31643)
## New binary caches and stacks
* We now build nearly all of E4S with `oneapi` in our buildcache (#31781, #31804,
#31804, #31803, #31840, #31991, #32117, #32107, #32239)
* Added 3 new machine learning-centric stacks to binary cache: `x86_64_v3`, CUDA, ROCm
(#31592, #33463)
## Removals and Deprecations
* Support for Python 3.5 is dropped (#31908). Only Python 2.7 and 3.6+ are officially
supported.
* This is the last Spack release that will support Python 2 (#32615). Spack `v0.19`
will emit a deprecation warning if you run it with Python 2, and Python 2 support will
soon be removed from the `develop` branch.
* `LD_LIBRARY_PATH` is no longer set by default by `spack load` or module loads.
Setting `LD_LIBRARY_PATH` in Spack environments/modules can cause binaries from
outside of Spack to crash, and Spack's own builds use `RPATH` and do not need
`LD_LIBRARY_PATH` set in order to run. If you still want the old behavior, you
can run these commands to configure Spack to set `LD_LIBRARY_PATH`:
```console
spack config add modules:prefix_inspections:lib64:[LD_LIBRARY_PATH]
spack config add modules:prefix_inspections:lib:[LD_LIBRARY_PATH]
```
* The `spack:concretization:[together|separately]` option has been removed after being
deprecated in `v0.18`. Use `concretizer:unify:[true|false]`.
* `config:module_roots` is no longer supported after being deprecated in `v0.18`. Use
configuration in module sets instead (#28659, [docs](
https://spack.readthedocs.io/en/latest/module_file_support.html)).
* `spack activate` and `spack deactivate` are no longer supported, having been
deprecated in `v0.18`. Use an environment with a view instead of
activating/deactivating ([docs](
https://spack.readthedocs.io/en/latest/environments.html#configuration-in-spack-yaml)).
* The old YAML format for buildcaches is now deprecated (#33707). If you are using an
old buildcache with YAML metadata you will need to regenerate it with JSON metadata.
* `spack bootstrap trust` and `spack bootstrap untrust` are deprecated in favor of
`spack bootstrap enable` and `spack bootstrap disable` and will be removed in `v0.20`.
(#33600)
* The `graviton2` architecture has been renamed to `neoverse_n1`, and `graviton3`
is now `neoverse_v1`. Buildcaches using the old architecture names will need to be rebuilt.
* The terms `blacklist` and `whitelist` have been replaced with `include` and `exclude`
in all configuration files (#31569). You can use `spack config update` to
automatically fix your configuration files.
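As a sketch, a module configuration like the following (package names illustrative; before/after shown together for comparison) is what `spack config update` rewrites:
```yaml
# before (deprecated)
modules:
  default:
    tcl:
      whitelist: [gcc]
      blacklist: [readline]
# after
modules:
  default:
    tcl:
      include: [gcc]
      exclude: [readline]
```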
## Notable Bugfixes
* Permission setting on installation now handles effective uid properly (#19980)
* `buildable:true` for an MPI implementation now overrides `buildable:false` for `mpi` (#18269)
* Improved error messages when attempting to use an unconfigured compiler (#32084)
* Do not punish explicitly requested compiler mismatches in the solver (#30074)
* `spack stage`: add missing --fresh and --reuse (#31626)
* Fixes for adding build system executables like `cmake` to package scope (#31739)
* Bugfix for binary relocation with aliased strings produced by newer `binutils` (#32253)
## Spack community stats
* 6,751 total packages, 335 new since `v0.18.0`
* 141 new Python packages
* 89 new R packages
* 303 people contributed to this release
* 287 committers to packages
* 57 committers to core
# v0.18.1 (2022-07-19)
### Spack Bugfixes
* Fix several bugs related to bootstrapping (#30834,#31042,#31180)
* Fix a regression that was causing spec hashes to differ between
Python 2 and Python 3 (#31092)
* Fixed compiler flags for oneAPI and DPC++ (#30856)
* Fixed several issues related to concretization (#31142,#31153,#31170,#31226)
* Improved support for Cray manifest file and `spack external find` (#31144,#31201,#31173,#31186)
* Assign a version to openSUSE Tumbleweed according to the GLIBC version
in the system (#19895)
* Improved Dockerfile generation for `spack containerize` (#29741,#31321)
* Fixed a few bugs related to concurrent execution of commands (#31509,#31493,#31477)
### Package updates
* WarpX: add v22.06, fixed libs property (#30866,#31102)

View File

@@ -10,8 +10,8 @@ For more on Spack's release structure, see
| Version | Supported |
| ------- | ------------------ |
| develop | :white_check_mark: |
| 0.17.x | :white_check_mark: |
| 0.16.x | :white_check_mark: |
| 0.19.x | :white_check_mark: |
| 0.18.x | :white_check_mark: |
## Reporting a Vulnerability

View File

@@ -31,13 +31,11 @@ import os
import os.path
import sys
min_python3 = (3, 5)
min_python3 = (3, 6)
if sys.version_info[:2] < (2, 7) or (
sys.version_info[:2] >= (3, 0) and sys.version_info[:2] < min_python3
):
if sys.version_info[:2] < min_python3:
v_info = sys.version_info[:3]
msg = "Spack requires Python 2.7 or %d.%d or higher " % min_python3
msg = "Spack requires Python %d.%d or higher " % min_python3
msg += "You are running spack with Python %d.%d.%d." % v_info
sys.exit(msg)

View File

@@ -19,7 +19,7 @@ config:
install_tree:
root: $spack/opt/spack
projections:
all: "${ARCHITECTURE}/${COMPILERNAME}-${COMPILERVER}/${PACKAGE}-${VERSION}-${HASH}"
all: "{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}"
# install_tree can include an optional padded length (int or boolean)
# default is False (do not pad)
# if padded_length is True, Spack will pad as close to the system max path
@@ -214,4 +214,8 @@ config:
# Number of seconds a buildcache's index.json is cached locally before probing
# for updates, within a single Spack invocation. Defaults to 10 minutes.
binary_index_ttl: 600
flags:
# Whether to keep -Werror flags active in package builds.
keep_werror: 'none'

View File

@@ -1,162 +0,0 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _analyze:
=======
Analyze
=======
The analyze command is a front-end to various tools that let us analyze
package installations. Each analyzer is a module for a different kind
of analysis that can be done on a package installation, including (but not
limited to) binary, log, or text analysis. Thus, the analyze command group
allows you to take an existing package install, choose an analyzer,
and extract some output for the package using it.
-----------------
Analyzer Metadata
-----------------
For all analyzers, we write to an ``analyzers`` folder in ``~/.spack``, or the
value that you specify in your spack config at ``config:analyzers_dir``.
For example, here we see the results of running an analysis on zlib:
.. code-block:: console
$ tree ~/.spack/analyzers/
└── linux-ubuntu20.04-skylake
└── gcc-9.3.0
└── zlib-1.2.11-sl7m27mzkbejtkrajigj3a3m37ygv4u2
├── environment_variables
│   └── spack-analyzer-environment-variables.json
├── install_files
│   └── spack-analyzer-install-files.json
└── libabigail
└── spack-analyzer-libabigail-libz.so.1.2.11.xml
This means that you can always find analyzer output in this folder, and it
is organized with the same logic as the package install it was run for.
If you want to customize this top level folder, simply provide the ``--path``
argument to ``spack analyze run``. The nested organization will be maintained
within your custom root.
-----------------
Listing Analyzers
-----------------
If you aren't familiar with Spack's analyzers, you can quickly list those that
are available:
.. code-block:: console
$ spack analyze list-analyzers
install_files : install file listing read from install_manifest.json
environment_variables : environment variables parsed from spack-build-env.txt
config_args : config args loaded from spack-configure-args.txt
libabigail : Application Binary Interface (ABI) features for objects
In the above, the first three are fairly simple - parsing metadata files from
a package install directory to save
-------------------
Analyzing a Package
-------------------
The analyze command, akin to install, will accept a package spec to perform
an analysis for. The package must be installed. Let's walk through an example
with zlib. We first ask to analyze it. However, since we have more than one
install, we are asked to disambiguate:
.. code-block:: console
$ spack analyze run zlib
==> Error: zlib matches multiple packages.
Matching packages:
fz2bs56 zlib@1.2.11%gcc@7.5.0 arch=linux-ubuntu18.04-skylake
sl7m27m zlib@1.2.11%gcc@9.3.0 arch=linux-ubuntu20.04-skylake
Use a more specific spec.
We can then specify the spec version that we want to analyze:
.. code-block:: console
$ spack analyze run zlib/fz2bs56
If you don't provide any specific analyzer names, by default all analyzers
(shown in the ``list-analyzers`` subcommand list) will be run. If an analyzer does not
have any result, it will be skipped. For example, here is a result running for
zlib:
.. code-block:: console
$ ls ~/.spack/analyzers/linux-ubuntu20.04-skylake/gcc-9.3.0/zlib-1.2.11-sl7m27mzkbejtkrajigj3a3m37ygv4u2/
spack-analyzer-environment-variables.json
spack-analyzer-install-files.json
spack-analyzer-libabigail-libz.so.1.2.11.xml
If you want to run a specific analyzer, ask for it with ``--analyzer``. Here we run
spack analyze on libabigail (already installed) *using* libabigail:
.. code-block:: console
$ spack analyze run --analyzer abigail libabigail
.. _analyze_monitoring:
----------------------
Monitoring An Analysis
----------------------
For any kind of analysis, you can
use a `spack monitor <https://github.com/spack/spack-monitor>`_ "Spackmon"
as a server to upload the same run metadata to. You can
follow the instructions in the `spack monitor documentation <https://spack-monitor.readthedocs.org>`_
to first create a server along with a username and token for yourself.
You can then use this guide to interact with the server.
You should first export our spack monitor token and username to the environment:
.. code-block:: console
$ export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
$ export SPACKMON_USER=spacky
By default, the host for your server is expected to be at ``http://127.0.0.1``
with a prefix of ``ms1``, and if this is the case, you can simply add the
``--monitor`` flag to the install command:
.. code-block:: console
$ spack analyze run --monitor wget
If you need to customize the host or the prefix, you can do that as well:
.. code-block:: console
$ spack analyze run --monitor --monitor-prefix monitor --monitor-host https://monitor-service.io wget
If your server doesn't have authentication, you can skip it:
.. code-block:: console
$ spack analyze run --monitor --monitor-disable-auth wget
Regardless of your choice, when you run analyze on an installed package (whether
it was installed with ``--monitor`` or not), you'll see the results generated as they did
before, and a message that the monitor server was pinged:
.. code-block:: console
$ spack analyze --monitor wget
...
==> Sending result for wget bin/wget to monitor.

View File

@@ -1114,21 +1114,21 @@ set of arbitrary versions, such as ``@1.0,1.5,1.7`` (``1.0``, ``1.5``,
or ``1.7``). When you supply such a specifier to ``spack install``,
it constrains the set of versions that Spack will install.
For packages with a ``git`` attribute, ``git`` references
may be specified instead of a numerical version i.e. branches, tags
and commits. Spack will stage and build based off the ``git``
reference provided. Acceptable syntaxes for this are:
.. code-block:: sh
# branches and tags
foo@git.develop # use the develop branch
foo@git.0.19 # use the 0.19 tag
# commit hashes
foo@abcdef1234abcdef1234abcdef1234abcdef1234 # 40 character hashes are automatically treated as git commits
foo@git.abcdef1234abcdef1234abcdef1234abcdef1234
Spack versions from git reference either have an associated version supplied by the user,
or infer a relationship to known versions from the structure of the git repository. If an
associated version is supplied by the user, Spack treats the git version as equivalent to that
@@ -1244,8 +1244,8 @@ For example, for the ``stackstart`` variant:
.. code-block:: sh
mpileaks stackstart=4 # variant will be propagated to dependencies
mpileaks stackstart==4 # only mpileaks will have this variant value
mpileaks stackstart==4 # variant will be propagated to dependencies
mpileaks stackstart=4 # only mpileaks will have this variant value
^^^^^^^^^^^^^^
Compiler Flags
@@ -1672,9 +1672,13 @@ own install prefix. However, certain packages are typically installed
`Python <https://www.python.org>`_ packages are typically installed in the
``$prefix/lib/python-2.7/site-packages`` directory.
Spack has support for this type of installation as well. In Spack,
a package that can live inside the prefix of another package is called
an *extension*. Suppose you have Python installed like so:
In Spack, installation prefixes are immutable, so this type of installation
is not directly supported. However, it is possible to create views that
allow you to merge install prefixes of multiple packages into a single new prefix.
Views are a convenient way to get a more traditional filesystem structure.
Using *extensions*, you can ensure that Python packages always share the
same prefix in the view as Python itself. Suppose you have
Python installed like so:
.. code-block:: console
@@ -1712,8 +1716,6 @@ You can find extensions for your Python installation like this:
py-ipython@2.3.1 py-pygments@2.0.1 py-setuptools@11.3.1
py-matplotlib@1.4.2 py-pyparsing@2.0.3 py-six@1.9.0
==> None activated.
The extensions are a subset of what's returned by ``spack list``, and
they are packages like any other. They are installed into their own
prefixes, and you can see this with ``spack find --paths``:
@@ -1741,32 +1743,72 @@ directly when you run ``python``:
ImportError: No module named numpy
>>>
^^^^^^^^^^^^^^^^
Using Extensions
^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Extensions in Environments
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
There are four ways to get ``numpy`` working in Python. The first is
to use :ref:`shell-support`. You can simply ``load`` the extension,
and it will be added to the ``PYTHONPATH`` in your current shell:
The recommended way of working with extensions such as ``py-numpy``
above is through :ref:`Environments <environments>`. For example,
the following creates an environment in the current working directory
with a filesystem view in the ``./view`` directory:
.. code-block:: console
$ spack load python
$ spack load py-numpy
$ spack env create --with-view view --dir .
$ spack -e . add py-numpy
$ spack -e . concretize
$ spack -e . install
We recommend environments for two reasons. Firstly, environments
can be activated (requires :ref:`shell-support`):
.. code-block:: console
$ spack env activate .
which sets all the right environment variables such as ``PATH`` and
``PYTHONPATH``. This ensures that
.. code-block:: console
$ python
>>> import numpy
works. Secondly, even without shell support, the view ensures
that Python can locate its extensions:
.. code-block:: console
$ ./view/bin/python
>>> import numpy
See :ref:`environments` for a more in-depth description of Spack
environments and customizations to views.
^^^^^^^^^^^^^^^^^^^^
Using ``spack load``
^^^^^^^^^^^^^^^^^^^^
A more traditional way of using Spack and extensions is ``spack load``
(requires :ref:`shell-support`). This will add the extension to ``PYTHONPATH``
in your current shell, and Python itself will be available in the ``PATH``:
.. code-block:: console
$ spack load py-numpy
$ python
>>> import numpy
Now ``import numpy`` will succeed for as long as you keep your current
session open.
The loaded packages can be checked using ``spack find --loaded``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Loading Extensions via Modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Instead of using Spack's environment modification capabilities through
the ``spack load`` command, you can load numpy through your
environment modules (using ``environment-modules`` or ``lmod``). This
will also add the extension to the ``PYTHONPATH`` in your current
shell.
Apart from ``spack env activate`` and ``spack load``, you can load numpy
through your environment modules (using ``environment-modules`` or
``lmod``). This will also add the extension to the ``PYTHONPATH`` in
your current shell.
.. code-block:: console
@@ -1776,130 +1818,6 @@ If you do not know the name of the specific numpy module you wish to
load, you can use the ``spack module tcl|lmod loads`` command to get
the name of the module from the Spack spec.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Activating Extensions in a View
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Another way to use extensions is to create a view, which merges the
python installation along with the extensions into a single prefix.
See :ref:`configuring_environment_views` for a more in-depth description
of views.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Activating Extensions Globally
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
As an alternative to creating a merged prefix with Python and its extensions,
and prior to support for views, Spack has provided a means to install the
extension into the Spack installation prefix for the extendee. This has
typically been useful since extendable packages typically search their own
installation path for addons by default.
Global activations are performed with the ``spack activate`` command:
.. _cmd-spack-activate:
^^^^^^^^^^^^^^^^^^
``spack activate``
^^^^^^^^^^^^^^^^^^
.. code-block:: console
$ spack activate py-numpy
==> Activated extension py-setuptools@11.3.1%gcc@4.4.7 arch=linux-debian7-x86_64-3c74eb69 for python@2.7.8%gcc@4.4.7.
==> Activated extension py-nose@1.3.4%gcc@4.4.7 arch=linux-debian7-x86_64-5f70f816 for python@2.7.8%gcc@4.4.7.
==> Activated extension py-numpy@1.9.1%gcc@4.4.7 arch=linux-debian7-x86_64-66733244 for python@2.7.8%gcc@4.4.7.
Several things have happened here. The user requested that
``py-numpy`` be activated in the ``python`` installation it was built
with. Spack knows that ``py-numpy`` depends on ``py-nose`` and
``py-setuptools``, so it activated those packages first. Finally,
once all dependencies were activated in the ``python`` installation,
``py-numpy`` was activated as well.
If we run ``spack extensions`` again, we now see the three new
packages listed as activated:
.. code-block:: console
$ spack extensions python
==> python@2.7.8%gcc@4.4.7 arch=linux-debian7-x86_64-703c7a96
==> 36 extensions:
geos py-ipython py-pexpect py-pyside py-sip
py-basemap py-libxml2 py-pil py-pytz py-six
py-biopython py-mako py-pmw py-rpy2 py-sympy
py-cython py-matplotlib py-pychecker py-scientificpython py-virtualenv
py-dateutil py-mpi4py py-pygments py-scikit-learn
py-epydoc py-mx py-pylint py-scipy
py-gnuplot py-nose py-pyparsing py-setuptools
py-h5py py-numpy py-pyqt py-shiboken
==> 12 installed:
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-dateutil@2.4.0 py-nose@1.3.4 py-pyside@1.2.2
py-dateutil@2.4.0 py-numpy@1.9.1 py-pytz@2014.10
py-ipython@2.3.1 py-pygments@2.0.1 py-setuptools@11.3.1
py-matplotlib@1.4.2 py-pyparsing@2.0.3 py-six@1.9.0
==> 3 currently activated:
-- linux-debian7-x86_64 / gcc@4.4.7 --------------------------------
py-nose@1.3.4 py-numpy@1.9.1 py-setuptools@11.3.1
Now, when a user runs python, ``numpy`` will be available for import
*without* the user having to explicitly load it. ``python@2.7.8`` now
acts like a system Python installation with ``numpy`` installed inside
of it.
Spack accomplishes this by symbolically linking the *entire* prefix of
the ``py-numpy`` package into the prefix of the ``python`` package. To the
python interpreter, it looks like ``numpy`` is installed in the
``site-packages`` directory.
The only limitation of global activation is that you can only have a *single*
version of an extension activated at a time. This is because multiple
versions of the same extension would conflict if symbolically linked
into the same prefix. Users who want a different version of a package
can still get it by using environment modules or views, but they will have to
explicitly load their preferred version.
^^^^^^^^^^^^^^^^^^^^^^^^^^
``spack activate --force``
^^^^^^^^^^^^^^^^^^^^^^^^^^
If, for some reason, you want to activate a package *without* its
dependencies, you can use ``spack activate --force``:
.. code-block:: console
$ spack activate --force py-numpy
==> Activated extension py-numpy@1.9.1%gcc@4.4.7 arch=linux-debian7-x86_64-66733244 for python@2.7.8%gcc@4.4.7.
.. _cmd-spack-deactivate:
^^^^^^^^^^^^^^^^^^^^
``spack deactivate``
^^^^^^^^^^^^^^^^^^^^
We've seen how activating an extension can be used to set up a default
version of a Python module. Obviously, you may want to change that at
some point. ``spack deactivate`` is the command for this. There are
several variants:
* ``spack deactivate <extension>`` will deactivate a single
extension. If another activated extension depends on this one,
Spack will warn you and exit with an error.
* ``spack deactivate --force <extension>`` deactivates an extension
regardless of packages that depend on it.
* ``spack deactivate --all <extension>`` deactivates an extension and
all of its dependencies. Use ``--force`` to disregard dependents.
* ``spack deactivate --all <extendee>`` deactivates *all* activated
extensions of a package. For example, to deactivate *all* python
extensions, use:
.. code-block:: console
$ spack deactivate --all python
-----------------------
Filesystem requirements
-----------------------

View File

@@ -5,9 +5,9 @@
.. _cachedcmakepackage:
------------------
CachedCMakePackage
------------------
-----------
CachedCMake
-----------
The CachedCMakePackage base class is used for CMake-based workflows
that create a CMake cache file prior to running ``cmake``. This is

View File

@@ -5,9 +5,9 @@
.. _cudapackage:
-----------
CudaPackage
-----------
----
Cuda
----
Different from other packages, ``CudaPackage`` does not represent a build system.
Instead its goal is to simplify and unify usage of ``CUDA`` in other packages by providing a `mixin-class <https://en.wikipedia.org/wiki/Mixin>`_.
@@ -80,7 +80,7 @@ standard CUDA compiler flags.
**cuda_flags**
This built-in static method returns a list of command line flags
for the chosen ``cuda_arch`` value(s). The flags are intended to
be passed to the CUDA compiler driver (i.e., ``nvcc``).

View File

@@ -6,9 +6,9 @@
.. _inteloneapipackage:
====================
IntelOneapiPackage
====================
===========
IntelOneapi
===========
.. contents::
@@ -36,7 +36,7 @@ For more information on a specific package, do::
Intel no longer releases new versions of Parallel Studio, which can be
used in Spack via the :ref:`intelpackage`. All of its components can
now be found in oneAPI.
Examples
========

View File

@@ -5,9 +5,9 @@
.. _intelpackage:
------------
IntelPackage
------------
-----
Intel
-----
.. contents::

View File

@@ -5,9 +5,9 @@
.. _pythonpackage:
-------------
PythonPackage
-------------
------
Python
------
Python packages and modules have their own special build system. This
documentation covers everything you'll need to know in order to write
@@ -724,10 +724,9 @@ extends vs. depends_on
This is very similar to the naming dilemma above, with a slight twist.
As mentioned in the :ref:`Packaging Guide <packaging_extensions>`,
``extends`` and ``depends_on`` are very similar, but ``extends`` adds
the ability to *activate* the package. Activation involves symlinking
everything in the installation prefix of the package to the installation
prefix of Python. This allows the user to import a Python module without
``extends`` and ``depends_on`` are very similar, but ``extends`` ensures
that the extension and extendee share the same prefix in views.
This allows the user to import a Python module without
having to add that module to ``PYTHONPATH``.
When deciding between ``extends`` and ``depends_on``, the best rule of
@@ -735,7 +734,7 @@ thumb is to check the installation prefix. If Python libraries are
installed to ``<prefix>/lib/pythonX.Y/site-packages``, then you
should use ``extends``. If Python libraries are installed elsewhere
or the only files that get installed reside in ``<prefix>/bin``, then
don't use ``extends``, as symlinking the package wouldn't be useful.
don't use ``extends``.
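As a rough sketch (package names hypothetical), the two cases look like:

.. code-block:: python

   from spack.package import *

   class PyExample(PythonPackage):
       # installs into <prefix>/lib/pythonX.Y/site-packages; PythonPackage
       # already declares ``extends("python")``, so nothing extra is needed
       pass

   class ExampleTool(Package):
       # only installs scripts into <prefix>/bin, so a plain dependency suffices
       depends_on("python", type=("build", "run"))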
^^^^^^^^^^^^^^^^^^^^^
Alternatives to Spack

View File

@@ -5,9 +5,9 @@
.. _rocmpackage:
-----------
ROCmPackage
-----------
----
ROCm
----
The ``ROCmPackage`` is not a build system but a helper package. Like ``CudaPackage``,
it provides standard variants, dependencies, and conflicts to facilitate building
@@ -25,7 +25,7 @@ This package provides the following variants:
* **rocm**
This variant is used to enable/disable building with ``rocm``.
The default is disabled (or ``False``).
* **amdgpu_target**

View File

@@ -5,9 +5,9 @@
.. _rpackage:
--------
RPackage
--------
--
R
--
Like Python, R has its own built-in build system.
@@ -193,10 +193,10 @@ Build system dependencies
As an extension of the R ecosystem, your package will obviously depend
on R to build and run. Normally, we would use ``depends_on`` to express
this, but for R packages, we use ``extends``. ``extends`` is similar to
``depends_on``, but adds an additional feature: the ability to "activate"
the package by symlinking it to the R installation directory. Since
every R package needs this, the ``RPackage`` base class contains:
this, but for R packages, we use ``extends``. This implies a special
dependency on R, which is used to set environment variables such as
``R_LIBS`` uniformly. Since every R package needs this, the ``RPackage``
base class contains:
.. code-block:: python

   extends("r")

View File

@@ -5,15 +5,15 @@
.. _sourceforgepackage:
------------------
SourceforgePackage
------------------
-----------
Sourceforge
-----------
``SourceforgePackage`` is a
`mixin-class <https://en.wikipedia.org/wiki/Mixin>`_. It automatically
sets the URL based on a list of Sourceforge mirrors listed in
`sourceforge_mirror_path`, which defaults to a half dozen known mirrors.
Refer to the package source
(`<https://github.com/spack/spack/blob/develop/lib/spack/spack/build_systems/sourceforge.py>`__) for the current list of mirrors used by Spack.
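A minimal sketch of typical usage (the package name and path fragment are hypothetical):

.. code-block:: python

   class Example(AutotoolsPackage, SourceforgePackage):
       """Hypothetical package hosted on Sourceforge."""

       # project-specific fragment appended to each known mirror URL
       sourceforge_mirror_path = "example/example-1.0.tar.gz"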
@@ -29,7 +29,7 @@ This package provides a method for populating mirror URLs.
It is decorated with `property` so its results are treated as
a package attribute.
Refer to
`<https://spack.readthedocs.io/en/latest/packaging_guide.html#mirrors-of-the-main-url>`__
for information on how Spack uses the `urls` attribute during
fetching.

View File

@@ -37,12 +37,6 @@
os.symlink(os.path.abspath("../../.."), link_name, target_is_directory=True)
sys.path.insert(0, os.path.abspath("_spack_root/lib/spack/external"))
sys.path.insert(0, os.path.abspath("_spack_root/lib/spack/external/pytest-fallback"))
if sys.version_info[0] < 3:
sys.path.insert(0, os.path.abspath("_spack_root/lib/spack/external/yaml/lib"))
else:
sys.path.insert(0, os.path.abspath("_spack_root/lib/spack/external/yaml/lib3"))
sys.path.append(os.path.abspath("_spack_root/lib/spack/"))
# Add the Spack bin directory to the path so that we can use its output in docs.
@@ -80,6 +74,7 @@
"--force", # Overwrite existing files
"--no-toc", # Don't create a table of contents file
"--output-dir=.", # Directory to place all output
"--module-first", # emit module docs before submodule docs
]
sphinx_apidoc(apidoc_args + ["_spack_root/lib/spack/spack"])
sphinx_apidoc(apidoc_args + ["_spack_root/lib/spack/llnl"])
@@ -160,8 +155,8 @@ def setup(sphinx):
master_doc = "index"
# General information about the project.
project = u"Spack"
copyright = u"2013-2021, Lawrence Livermore National Laboratory."
project = "Spack"
copyright = "2013-2021, Lawrence Livermore National Laboratory."
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
@@ -206,12 +201,14 @@ def setup(sphinx):
("py:class", "_frozen_importlib_external.SourceFileLoader"),
("py:class", "clingo.Control"),
("py:class", "six.moves.urllib.parse.ParseResult"),
("py:class", "TextIO"),
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),
("py:class", "spack.build_systems._checks.BaseBuilder"),
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.VersionBase"),
("py:class", "spack.spec.DependencySpec"),
]
# The reST default role (used for this markup: `text`) to use for all documents.
@@ -350,7 +347,7 @@ class SpackStyle(DefaultStyle):
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass [howto/manual]).
latex_documents = [
("index", "Spack.tex", u"Spack Documentation", u"Todd Gamblin", "manual"),
("index", "Spack.tex", "Spack Documentation", "Todd Gamblin", "manual"),
]
# The name of an image file (relative to this directory) to place at the top of
@@ -378,7 +375,7 @@ class SpackStyle(DefaultStyle):
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [("index", "spack", u"Spack Documentation", [u"Todd Gamblin"], 1)]
man_pages = [("index", "spack", "Spack Documentation", ["Todd Gamblin"], 1)]
# If true, show URL addresses after external links.
# man_show_urls = False
@@ -393,8 +390,8 @@ class SpackStyle(DefaultStyle):
(
"index",
"Spack",
u"Spack Documentation",
u"Todd Gamblin",
"Spack Documentation",
"Todd Gamblin",
"Spack",
"One line description of project.",
"Miscellaneous",

View File

@@ -394,7 +394,7 @@ are indicated at the start of the path with ``~`` or ``~user``.
Spack-specific variables
^^^^^^^^^^^^^^^^^^^^^^^^
Spack understands several special variables. These are:
Spack understands over a dozen special variables. These are:
* ``$env``: name of the currently active :ref:`environment <environments>`
* ``$spack``: path to the prefix of this Spack installation
@@ -416,6 +416,8 @@ Spack understands several special variables. These are:
ArchSpec. E.g. ``skylake`` or ``neoverse-n1``.
* ``$target_family``. The target family for the current host, as
detected by ArchSpec. E.g. ``x86_64`` or ``aarch64``.
* ``$date``: the current date in the format YYYY-MM-DD
Note that, as with shell variables, you can write these as ``$varname``
or with braces to distinguish the variable from surrounding characters:

View File

@@ -253,27 +253,6 @@ to update them.
multiple runs of ``spack style`` just to re-compute line numbers and
makes it much easier to fix errors directly off of the CI output.
.. warning::
Flake8 and ``pep8-naming`` require a number of dependencies in order
to run. If you installed ``py-flake8`` and ``py-pep8-naming``, the
easiest way to ensure the right packages are on your ``PYTHONPATH`` is
to run::
spack activate py-flake8
spack activate pep8-naming
so that all of the dependencies are symlinked to a central
location. If you see an error message like:
.. code-block:: console
Traceback (most recent call last):
File: "/usr/bin/flake8", line 5, in <module>
from pkg_resources import load_entry_point
ImportError: No module named pkg_resources
that means Flake8 couldn't find setuptools in your ``PYTHONPATH``.
^^^^^^^^^^^^^^^^^^^
Documentation Tests
@@ -309,13 +288,9 @@ All of these can be installed with Spack, e.g.
.. code-block:: console
$ spack activate py-sphinx
$ spack activate py-sphinx-rtd-theme
$ spack activate py-sphinxcontrib-programoutput
$ spack load py-sphinx py-sphinx-rtd-theme py-sphinxcontrib-programoutput
so that all of the dependencies are symlinked into that Python's
tree. Alternatively, you could arrange for their library
directories to be added to PYTHONPATH. If you see an error message
so that all of the dependencies are added to PYTHONPATH. If you see an error message
like:
.. code-block:: console

View File

@@ -175,14 +175,11 @@ Spec-related modules
^^^^^^^^^^^^^^^^^^^^
:mod:`spack.spec`
Contains :class:`~spack.spec.Spec` and :class:`~spack.spec.SpecParser`.
Also implements most of the logic for normalization and concretization
Contains :class:`~spack.spec.Spec`. Also implements most of the logic for concretization
of specs.
:mod:`spack.parse`
Contains some base classes for implementing simple recursive descent
parsers: :class:`~spack.parse.Parser` and :class:`~spack.parse.Lexer`.
Used by :class:`~spack.spec.SpecParser`.
:mod:`spack.parser`
Contains :class:`~spack.parser.SpecParser` and functions related to parsing specs.
:mod:`spack.concretize`
Contains :class:`~spack.concretize.Concretizer` implementation,

View File

@@ -233,8 +233,8 @@ packages will be listed as roots of the Environment.
All of the Spack commands that act on the list of installed specs are
Environment-sensitive in this way, including ``install``,
``uninstall``, ``activate``, ``deactivate``, ``find``, ``extensions``,
and more. In the :ref:`environment-configuration` section we will discuss
``uninstall``, ``find``, ``extensions``, and more. In the
:ref:`environment-configuration` section we will discuss
Environment-sensitive commands further.
^^^^^^^^^^^^^^^^^^^^^
@@ -1070,19 +1070,23 @@ the include is conditional.
Building a subset of the environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The generated ``Makefile``\s contain install targets for each spec. Given the hash
of a particular spec, you can use the ``.install/<hash>`` target to install the
spec with its dependencies. There is also ``.install-deps/<hash>`` to *only* install
The generated ``Makefile``\s contain install targets for each spec, identified
by ``<name>-<version>-<hash>``. This allows you to install only a subset of the
packages in the environment. When packages are unique in the environment, it's
enough to know the name and let tab-completion fill out the version and hash.
The following phony targets are available: ``install/<spec>`` to install the
spec with its dependencies, and ``install-deps/<spec>`` to *only* install
its dependencies. This can be useful when certain flags should only apply to
dependencies. Below we show a use case where a spec is installed with verbose
output (``spack install --verbose``) while its dependencies are installed silently:
.. code:: console
$ spack env depfile -o Makefile --make-target-prefix my_env
$ spack env depfile -o Makefile
# Install dependencies in parallel, only show a log on error.
$ make -j16 my_env/.install-deps/<hash> SPACK_INSTALL_FLAGS=--show-log-on-error
$ make -j16 install-deps/python-3.11.0-<hash> SPACK_INSTALL_FLAGS=--show-log-on-error
# Install the root spec with verbose output.
$ make -j16 my_env/.install/<hash> SPACK_INSTALL_FLAGS=--verbose
$ make -j16 install/python-3.11.0-<hash> SPACK_INSTALL_FLAGS=--verbose

View File

@@ -21,8 +21,9 @@ be present on the machine where Spack is run:
:header-rows: 1
These requirements can be easily installed on most modern Linux systems;
on macOS, XCode is required. Spack is designed to run on HPC
platforms like Cray. Not all packages should be expected
on macOS, the Command Line Tools package is required, and a full XCode suite
may be necessary for some packages such as Qt and apple-gl. Spack is designed
to run on HPC platforms like Cray. Not all packages should be expected
to work on all platforms.
A build matrix showing which packages are working on which systems is shown below.
@@ -1704,9 +1705,11 @@ dependencies or incompatible build tools like autoconf. Here are several
packages known to work on Windows:
* abseil-cpp
* bzip2
* clingo
* cpuinfo
* cmake
* hdf5
* glm
* nasm
* netlib-lapack (requires Intel Fortran)

View File

@@ -67,7 +67,6 @@ or refer to the full manual below.
build_settings
environments
containers
monitoring
mirrors
module_file_support
repositories
@@ -78,12 +77,6 @@ or refer to the full manual below.
extensions
pipelines
.. toctree::
:maxdepth: 2
:caption: Research
analyze
.. toctree::
:maxdepth: 2
:caption: Contributing

View File

@@ -1,265 +0,0 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _monitoring:
==========
Monitoring
==========
You can use a `spack monitor <https://github.com/spack/spack-monitor>`_ "Spackmon"
server to store a database of your packages, builds, and associated metadata
for provenance, research, or some other kind of development. You should
follow the instructions in the `spack monitor documentation <https://spack-monitor.readthedocs.org>`_
to first create a server along with a username and token for yourself.
You can then use this guide to interact with the server.
-------------------
Analysis Monitoring
-------------------
To read about how to monitor an analysis (meaning you want to send analysis results
to a server) see :ref:`analyze_monitoring`.
---------------------
Monitoring An Install
---------------------
Since an install is typically when you build packages, we logically want
to tell spack to monitor during this step. Let's start with an example
where we want to monitor the install of hdf5. Unless you have disabled authentication
for the server, we first want to export our spack monitor token and username to the environment:
.. code-block:: console
$ export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
$ export SPACKMON_USER=spacky
By default, the host for your server is expected to be at ``http://127.0.0.1``
with a prefix of ``ms1``, and if this is the case, you can simply add the
``--monitor`` flag to the install command:
.. code-block:: console
$ spack install --monitor hdf5
If you need to customize the host or the prefix, you can do that as well:
.. code-block:: console
$ spack install --monitor --monitor-prefix monitor --monitor-host https://monitor-service.io hdf5
As a precaution, the spack client exits early if you have not provided
authentication credentials. For example, if you run the command above without
exporting your username or token, you'll see:
.. code-block:: console
==> Error: You are required to export SPACKMON_TOKEN and SPACKMON_USER
This extra check is to ensure that we don't start any builds,
and then discover that you forgot to export your token. However, if
your monitoring server has authentication disabled, you can tell this to
the client to skip this step:
.. code-block:: console
$ spack install --monitor --monitor-disable-auth hdf5
If the service is not running, you'll cleanly exit early - the install will
not continue if you've asked it to monitor and there is no service.
For example, here is what you'll see if the monitoring service is not running:
.. code-block:: console
[Errno 111] Connection refused
If you want to continue builds (and stop monitoring) you can set the ``--monitor-keep-going``
flag.
.. code-block:: console
$ spack install --monitor --monitor-keep-going hdf5
This could mean that if a request fails, you only have partial or no data
added to your monitoring database. This setting will not be applied to the
first request to check if the server is running, but to subsequent requests.
If you don't have a monitor server running and you want to build, simply
don't provide the ``--monitor`` flag! Finally, if you want to provide one or
more tags to your build, you can do:
.. code-block:: console
# Add one tag, "pizza"
$ spack install --monitor --monitor-tags pizza hdf5
# Add two tags, "pizza" and "pasta"
$ spack install --monitor --monitor-tags pizza,pasta hdf5
----------------------------
Monitoring with Containerize
----------------------------
The same argument group is available to add to a containerize command.
^^^^^^
Docker
^^^^^^
To add monitoring to a Docker container recipe generation using the defaults,
and assuming a monitor server running on localhost, you would
start with a spack.yaml in your present working directory:
.. code-block:: yaml
spack:
specs:
- samtools
And then do:
.. code-block:: console
# preview first
spack containerize --monitor
# and then write to a Dockerfile
spack containerize --monitor > Dockerfile
The install command will be edited to include commands for enabling monitoring.
However, getting secrets into the container for your monitor server is something
that should be done carefully. Specifically you should:
- Never try to define secrets as ENV, ARG, or using ``--build-arg``
- Do not try to get the secret into the container via a "temporary" file that you remove (it in fact will still exist in a layer)
Instead, it's recommended to use buildkit `as explained here <https://pythonspeed.com/articles/docker-build-secrets/>`_.
You'll need to again export environment variables for your spack monitor server:
.. code-block:: console
$ export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
$ export SPACKMON_USER=spacky
And then use buildkit along with your build and identifying the name of the secret:
.. code-block:: console
$ DOCKER_BUILDKIT=1 docker build --secret id=st,env=SPACKMON_TOKEN --secret id=su,env=SPACKMON_USER -t spack/container .
The secrets are expected to come from your environment, and then will be temporarily mounted and available
at ``/run/secrets/<name>``. If you forget to supply them (and authentication is required) the build
will fail. If you need to build on your host (and interact with a spack monitor at localhost) you'll
need to tell Docker to use the host network:
.. code-block:: console
$ DOCKER_BUILDKIT=1 docker build --network="host" --secret id=st,env=SPACKMON_TOKEN --secret id=su,env=SPACKMON_USER -t spack/container .
^^^^^^^^^^^
Singularity
^^^^^^^^^^^
To add monitoring to a Singularity container build, the spack.yaml needs to
be modified slightly to specify wanting a different format:
.. code-block:: yaml
spack:
specs:
- samtools
container:
format: singularity
Again, generate the recipe:
.. code-block:: console
# preview first
$ spack containerize --monitor
# then write to a Singularity recipe
$ spack containerize --monitor > Singularity
Singularity doesn't have a direct way to define secrets at build time, so we have
to do a bit of manual work: add a file, source the secrets in it, and remove it.
Since Singularity doesn't have layers like Docker, deleting a file will truly
remove it from the container and history. So let's say we have this file,
``secrets.sh``:
.. code-block:: console
# secrets.sh
export SPACKMON_USER=spack
export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
We would then generate the Singularity recipe, and add a files section,
a source of that file at the start of ``%post``, and **importantly**
a removal of the file at the end of that same section.
.. code-block::
Bootstrap: docker
From: spack/ubuntu-bionic:latest
Stage: build
%files
secrets.sh /opt/secrets.sh
%post
. /opt/secrets.sh
# spack install commands are here
...
# Don't forget to remove here!
rm /opt/secrets.sh
You can then build the container as you normally would.
.. code-block:: console
$ sudo singularity build container.sif Singularity
------------------
Monitoring Offline
------------------
In the case that you want to save monitor results to your filesystem
and then upload them later (perhaps you are in an environment where you don't
have credentials or it isn't safe to use them) you can use the ``--monitor-save-local``
flag.
.. code-block:: console
$ spack install --monitor --monitor-save-local hdf5
This will save results in a subfolder, "monitor" in your designated spack
reports folder, which defaults to ``$HOME/.spack/reports/monitor``. When
you are ready to upload them to a spack monitor server:
.. code-block:: console
$ spack monitor upload ~/.spack/reports/monitor
You can choose the root directory of results as shown above, or a specific
subdirectory. The command accepts other arguments to specify configuration
for the monitor.

View File

@@ -2397,13 +2397,15 @@ this because uninstalling the dependency would break the package.
``build``, ``link``, and ``run`` dependencies all affect the hash of Spack
packages (along with ``sha256`` sums of patches and archives used to build the
package, and a [canonical hash](https://github.com/spack/spack/pull/28156) of
package, and a `canonical hash <https://github.com/spack/spack/pull/28156>`_ of
the ``package.py`` recipes). ``test`` dependencies do not affect the package
hash, as they are only used to construct a test environment *after* building and
installing a given package installation. Older versions of Spack did not include
build dependencies in the hash, but this has been
[fixed](https://github.com/spack/spack/pull/28504) as of [Spack
``v0.18``](https://github.com/spack/spack/releases/tag/v0.18.0)
build dependencies in the hash, but this has been
`fixed <https://github.com/spack/spack/pull/28504>`_ as of |Spack v0.18|_.
.. |Spack v0.18| replace:: Spack ``v0.18``
.. _Spack v0.18: https://github.com/spack/spack/releases/tag/v0.18.0
If the dependency type is not specified, Spack uses a default of
``('build', 'link')``. This is the common case for compiled languages.
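A short illustrative sketch (the package and its dependency choices are hypothetical):

.. code-block:: python

   class Example(Package):
       """Hypothetical recipe with explicit dependency types."""

       depends_on("gmake", type="build")       # build-only: affects the hash
       depends_on("zlib")                      # default type ("build", "link")
       depends_on("py-pytest", type="test")    # test-only: not in the hash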
@@ -2634,9 +2636,12 @@ extendable package:
extends('python')
...
Now, the ``py-numpy`` package can be used as an argument to ``spack
activate``. When it is activated, all the files in its prefix will be
symbolically linked into the prefix of the python package.
This accomplishes a few things. Firstly, the Python package can set special
variables such as ``PYTHONPATH`` for all extensions when the run or build
environment is set up. Secondly, filesystem views can ensure that extensions
are put in the same prefix as their extendee. This ensures that Python in
a view can always locate its Python packages, even without environment
variables set.
A package can only extend one other package at a time. To support packages
that may extend one of a list of other packages, Spack supports multiple
@@ -2684,9 +2689,8 @@ variant(s) are selected. This may be accomplished with conditional
...
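A hedged sketch of such a conditional ``extends`` (names and variants are hypothetical):

.. code-block:: python

   class Example(Package):
       """Hypothetical package that can extend either python or lua."""

       variant("python", default=False, description="Build python bindings")
       variant("lua", default=False, description="Build lua bindings")

       extends("python", when="+python")
       extends("lua", when="+lua")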
Sometimes, certain files in one package will conflict with those in
another, which means they cannot both be activated (symlinked) at the
same time. In this case, you can tell Spack to ignore those files
when it does the activation:
another, which means they cannot both be used in a view at the
same time. In this case, you can tell Spack to ignore those files:
.. code-block:: python

   extends("python", ignore=r"bin/.*")
@@ -2698,7 +2702,7 @@ when it does the activation:
...
The code above will prevent everything in the ``$prefix/bin/`` directory
from being linked in at activation time.
from being linked in a view.
.. note::
@@ -2722,67 +2726,6 @@ extensions; as a consequence python extension packages (those inheriting from
``PythonPackage``) likewise override ``add_files_to_view`` in order to rewrite
shebang lines which point to the Python interpreter.
^^^^^^^^^^^^^^^^^^^^^^^^^
Activation & deactivation
^^^^^^^^^^^^^^^^^^^^^^^^^
Adding an extension to a view is referred to as an activation. If the view is
maintained in the Spack installation prefix of the extendee this is called a
global activation. Activations may involve updating some centralized state
that is maintained by the extendee package, so there can be additional work
for adding extensions compared with non-extension packages.
Spack's ``Package`` class has default ``activate`` and ``deactivate``
implementations that handle symbolically linking extensions' prefixes
into a specified view. Extendable packages can override these methods
to add custom activate/deactivate logic of their own. For example,
the ``activate`` and ``deactivate`` methods in the Python class handle
symbolic linking of extensions, but they also handle details surrounding
Python's ``.pth`` files, and other aspects of Python packaging.
Spack's extensions mechanism is designed to be extensible, so that
other packages (like Ruby, R, Perl, etc.) can provide their own
custom extension management logic, as they may not handle modules the
same way that Python does.
Let's look at Python's activate function:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
:pyobject: Python.activate
:linenos:
This function is called on the *extendee* (Python). It first calls
``activate`` in the superclass, which handles symlinking the
extension package's prefix into the specified view. It then does
some special handling of the ``easy-install.pth`` file, part of
Python's setuptools.
Deactivate behaves similarly to activate, but it unlinks files:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
:pyobject: Python.deactivate
:linenos:
Both of these methods call some custom functions in the Python
package. See the source for Spack's Python package for details.
^^^^^^^^^^^^^^^^^^^^
Activation arguments
^^^^^^^^^^^^^^^^^^^^
You may have noticed that the ``activate`` function defined above
takes keyword arguments. These are the keyword arguments from
``extends()``, and they are passed to both activate and deactivate.
This capability allows an extension to customize its own activation by
passing arguments to the extendee. Extendees can likewise implement
custom ``activate()`` and ``deactivate()`` functions to suit their
needs.
The only keyword argument supported by default is the ``ignore``
argument, which can take a regex, list of regexes, or a predicate to
determine which files *not* to symlink during activation.
.. _virtual-dependencies:
--------------------
@@ -3584,7 +3527,7 @@ will likely contain some overriding of default builder methods:
    def cmake_args(self):
        pass

class Autotoolsbuilder(spack.build_systems.autotools.AutotoolsBuilder):
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
    def configure_args(self):
        pass

View File

@@ -184,13 +184,48 @@ simply run the following commands:
.. code-block:: console
$ spack env activate myenv
$ spack concretize --force
$ spack concretize --fresh --force
$ spack install
The ``--force`` flag tells Spack to overwrite its previous concretization
decisions, allowing you to choose a new version of Python. If any of the new
packages like Bash are already installed, ``spack install`` won't re-install
them, it will keep the symlinks in place.
The ``--fresh`` flag tells Spack to use the latest version of every package
where possible instead of trying to optimize for reuse of existing installed
packages.
The ``--force`` flag in addition tells Spack to overwrite its previous
concretization decisions, allowing you to choose a new version of Python.
If any of the new packages like Bash are already installed, ``spack install``
won't re-install them, it will keep the symlinks in place.
-----------------------------------
Updating & Cleaning Up Old Packages
-----------------------------------
If you're looking to mimic the behavior of Homebrew, you may also want to
clean up out-of-date packages from your environment after an upgrade. To
upgrade your entire software stack within an environment and clean up old
package versions, simply run the following commands:
.. code-block:: console
$ spack env activate myenv
$ spack mark -i --all
$ spack concretize --fresh --force
$ spack install
$ spack gc
Running ``spack mark -i --all`` tells Spack to mark all of the existing
packages within an environment as "implicitly" installed. This tells
spack's garbage collection system that these packages should be cleaned up.
Don't worry, however: this will not remove your entire environment.
Running ``spack install`` will reexamine your spack environment after
a fresh concretization and will re-mark any packages that should remain
installed as "explicitly" installed.
**Note:** if you use multiple spack environments you should re-run ``spack install``
in each of your environments prior to running ``spack gc`` to prevent spack
from uninstalling any shared packages that are no longer required by the
environment you just upgraded.
--------------
Uninstallation

View File

@@ -1,5 +1,5 @@
Name, Supported Versions, Notes, Requirement Reason
Python, 2.7/3.6-3.11, , Interpreter for Spack
Python, 3.6--3.11, , Interpreter for Spack
C/C++ Compilers, , , Building software
make, , , Build software
patch, , , Build software

lib/spack/env/cc vendored
View File

@@ -440,6 +440,47 @@ while [ $# -ne 0 ]; do
        continue
    fi

    if [ -n "${SPACK_COMPILER_FLAGS_KEEP}" ] ; then
        # NOTE: the eval is required to allow `|` alternatives inside the variable
        eval "\
        case \"\$1\" in
            $SPACK_COMPILER_FLAGS_KEEP)
                append other_args_list \"\$1\"
                shift
                continue
                ;;
        esac
        "
    fi
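    # For illustration only (hypothetical value): with
    #   SPACK_COMPILER_FLAGS_KEEP="-Werror|-pedantic"
    # any argument matching one of the |-separated patterns is appended to
    # other_args_list untouched and skips the replace logic below.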
    # the replace list is a space-separated list of pipe-separated pairs,
    # the first in each pair is the original prefix to be matched, the
    # second is the replacement prefix
    if [ -n "${SPACK_COMPILER_FLAGS_REPLACE}" ] ; then
        for rep in ${SPACK_COMPILER_FLAGS_REPLACE} ; do
            before=${rep%|*}
            after=${rep#*|}
            eval "\
            stripped=\"\${1##$before}\"
            "
            if [ "$stripped" = "$1" ] ; then
                continue
            fi
            replaced="$after$stripped"

            # it matched, remove it
            shift

            if [ -z "$replaced" ] ; then
                # completely removed, continue OUTER loop
                continue 2
            fi

            # re-build argument list with replacement
            set -- "$replaced" "$@"
        done
    fi
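    # For illustration only (hypothetical value): with
    #   SPACK_COMPILER_FLAGS_REPLACE="-O3|-O2 -march=native|"
    # a leading -O3 is rewritten to -O2, and -march=native is dropped
    # entirely, since an empty replacement removes the argument.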
case "$1" in
-isystem*)
arg="${1#-isystem}"

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.0 (commit 77640e572725ad97f18e63a04857155752ace045)
* Version: 0.2.0 (commit e44bad9c7b6defac73696f64078b2fe634719b62)
argparse
--------

View File

@@ -1,2 +1,2 @@
"""Init file to avoid namespace packages"""
__version__ = "0.1.2"
__version__ = "0.2.0"

View File

@@ -3,13 +3,12 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Aliases for microarchitecture features."""
# pylint: disable=useless-object-inheritance
from .schema import TARGETS_JSON, LazyDictionary
_FEATURE_ALIAS_PREDICATE = {}
class FeatureAliasTest(object):
class FeatureAliasTest:
"""A test that must be passed for a feature alias to succeed.
Args:
@@ -48,7 +47,7 @@ def alias_predicate(func):
# Check we didn't register anything else with the same name
if name in _FEATURE_ALIAS_PREDICATE:
msg = 'the alias predicate "{0}" already exists'.format(name)
msg = f'the alias predicate "{name}" already exists'
raise KeyError(msg)
_FEATURE_ALIAS_PREDICATE[name] = func

View File

@@ -11,8 +11,6 @@
import subprocess
import warnings
import six
from .microarchitecture import generic_microarchitecture, TARGETS
from .schema import TARGETS_JSON
@@ -80,10 +78,9 @@ def proc_cpuinfo():
def _check_output(args, env):
output = subprocess.Popen( # pylint: disable=consider-using-with
args, stdout=subprocess.PIPE, env=env
).communicate()[0]
return six.text_type(output.decode("utf-8"))
with subprocess.Popen(args, stdout=subprocess.PIPE, env=env) as proc:
output = proc.communicate()[0]
return str(output.decode("utf-8"))
def _machine():
@@ -273,7 +270,7 @@ def compatibility_check(architecture_family):
this test can be used, e.g. x86_64 or ppc64le etc.
"""
# Turn the argument into something iterable
if isinstance(architecture_family, six.string_types):
if isinstance(architecture_family, str):
architecture_family = (architecture_family,)
def decorator(func):

View File

@@ -5,14 +5,11 @@
"""Types and functions to manage information
on CPU microarchitectures.
"""
# pylint: disable=useless-object-inheritance
import functools
import platform
import re
import warnings
import six
import archspec
import archspec.cpu.alias
import archspec.cpu.schema
@@ -27,7 +24,7 @@ def coerce_target_names(func):
@functools.wraps(func)
def _impl(self, other):
if isinstance(other, six.string_types):
if isinstance(other, str):
if other not in TARGETS:
msg = '"{0}" is not a valid target name'
raise ValueError(msg.format(other))
@@ -38,7 +35,7 @@ def _impl(self, other):
return _impl
class Microarchitecture(object):
class Microarchitecture:
"""Represents a specific CPU micro-architecture.
Args:
@@ -150,7 +147,7 @@ def __str__(self):
def __contains__(self, feature):
# Feature must be of a string type, so be defensive about that
if not isinstance(feature, six.string_types):
if not isinstance(feature, str):
msg = "only objects of string types are accepted [got {0}]"
raise TypeError(msg.format(str(type(feature))))
@@ -168,7 +165,7 @@ def family(self):
"""Returns the architecture family a given target belongs to"""
roots = [x for x in [self] + self.ancestors if not x.ancestors]
msg = "a target is expected to belong to just one architecture family"
msg += "[found {0}]".format(", ".join(str(x) for x in roots))
msg += f"[found {', '.join(str(x) for x in roots)}]"
assert len(roots) == 1, msg
return roots.pop()
@@ -318,9 +315,6 @@ def _known_microarchitectures():
"""Returns a dictionary of the known micro-architectures. If the
current host platform is unknown adds it too as a generic target.
"""
# pylint: disable=fixme
# TODO: Simplify this logic using object_pairs_hook to OrderedDict
# TODO: when we stop supporting python2.6
def fill_target_from_dict(name, data, targets):
"""Recursively fills targets by adding the micro-architecture

View File

@@ -5,16 +5,12 @@
"""Global objects with the content of the microarchitecture
JSON file and its schema
"""
import collections.abc
import json
import os.path
try:
from collections.abc import MutableMapping # novm
except ImportError:
from collections import MutableMapping # pylint: disable=deprecated-class
class LazyDictionary(MutableMapping):
class LazyDictionary(collections.abc.MutableMapping):
"""Lazy dictionary that gets constructed on first access to any object key
Args:
@@ -56,7 +52,7 @@ def _load_json_file(json_file):
def _factory():
filename = os.path.join(json_dir, json_file)
with open(filename, "r") as file: # pylint: disable=unspecified-encoding
with open(filename, "r", encoding="utf-8") as file:
return json.load(file)
return _factory

View File

@@ -961,21 +961,21 @@
],
"intel": [
{
"versions": "18.0:",
"versions": "18.0:2021.2",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"versions": ":2021.2",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"versions": ":2021.2",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
@@ -1905,6 +1905,86 @@
]
}
},
"zen4": {
"from": ["zen3", "x86_64_v4"],
"vendor": "AuthenticAMD",
"features": [
"bmi1",
"bmi2",
"f16c",
"fma",
"fsgsbase",
"avx",
"avx2",
"rdseed",
"clzero",
"aes",
"pclmulqdq",
"cx16",
"movbe",
"mmx",
"sse",
"sse2",
"sse4a",
"ssse3",
"sse4_1",
"sse4_2",
"abm",
"xsavec",
"xsaveopt",
"clflushopt",
"popcnt",
"clwb",
"vaes",
"vpclmulqdq",
"pku",
"gfni",
"flush_l1d",
"erms",
"avic",
"avx512f",
"avx512dq",
"avx512ifma",
"avx512cd",
"avx512bw",
"avx512vl",
"avx512_bf16",
"avx512vbmi",
"avx512_vbmi2",
"avx512_vnni",
"avx512_bitalg",
"avx512_vpopcntdq"
],
"compilers": {
"gcc": [
{
"versions": "10.3:",
"name": "znver3",
"flags": "-march={name} -mtune={name} -mavx512f -mavx512dq -mavx512ifma -mavx512cd -mavx512bw -mavx512vl -mavx512vbmi -mavx512vbmi2 -mavx512vnni -mavx512bitalg"
}
],
"clang": [
{
"versions": "12.0:",
"name": "znver3",
"flags": "-march={name} -mtune={name} -mavx512f -mavx512dq -mavx512ifma -mavx512cd -mavx512bw -mavx512vl -mavx512vbmi -mavx512vbmi2 -mavx512vnni -mavx512bitalg"
}
],
"aocc": [
{
"versions": "3.0:3.9",
"name": "znver3",
"flags": "-march={name} -mtune={name} -mavx512f -mavx512dq -mavx512ifma -mavx512cd -mavx512bw -mavx512vl -mavx512vbmi -mavx512vbmi2 -mavx512vnni -mavx512bitalg",
"warnings": "Zen4 processors are not fully supported by AOCC versions < 4.0. For optimal performance please upgrade to a newer version of AOCC"
},
{
"versions": "4.0:",
"name": "znver4",
"flags": "-march={name} -mtune={name}"
}
]
}
},
"ppc64": {
"from": [],
"vendor": "generic",
@@ -2302,7 +2382,6 @@
"fp",
"asimd",
"evtstrm",
"pmull",
"sha1",
"sha2",
"crc32",

View File

@@ -71,13 +71,12 @@
import re
import math
import multiprocessing
import io
import sys
import threading
import time
from contextlib import contextmanager
from six import StringIO
from six import string_types
_error_matches = [
"^FAIL: ",
@@ -246,7 +245,7 @@ def __getitem__(self, line_no):
def __str__(self):
"""Returns event lines and context."""
out = StringIO()
out = io.StringIO()
for i in range(self.start, self.end):
if i == self.line_no:
out.write(' >> %-6d%s' % (i, self[i]))
@@ -386,7 +385,7 @@ def parse(self, stream, context=6, jobs=None):
(tuple): two lists containing ``BuildError`` and
``BuildWarning`` objects.
"""
if isinstance(stream, string_types):
if isinstance(stream, str):
with open(stream) as f:
return self.parse(f, context, jobs)

File diff suppressed because it is too large

View File

@@ -1,289 +0,0 @@
A. HISTORY OF THE SOFTWARE
==========================
Python was created in the early 1990s by Guido van Rossum at Stichting
Mathematisch Centrum (CWI, see http://www.cwi.nl) in the Netherlands
as a successor of a language called ABC. Guido remains Python's
principal author, although it includes many contributions from others.
In 1995, Guido continued his work on Python at the Corporation for
National Research Initiatives (CNRI, see http://www.cnri.reston.va.us)
in Reston, Virginia where he released several versions of the
software.
In May 2000, Guido and the Python core development team moved to
BeOpen.com to form the BeOpen PythonLabs team. In October of the same
year, the PythonLabs team moved to Digital Creations (now Zope
Corporation, see http://www.zope.com). In 2001, the Python Software
Foundation (PSF, see http://www.python.org/psf/) was formed, a
non-profit organization created specifically to own Python-related
Intellectual Property. Zope Corporation is a sponsoring member of
the PSF.
All Python releases are Open Source (see http://www.opensource.org for
the Open Source Definition). Historically, most, but not all, Python
releases have also been GPL-compatible; the table below summarizes
the various releases.
Release Derived Year Owner GPL-
from compatible? (1)
0.9.0 thru 1.2 1991-1995 CWI yes
1.3 thru 1.5.2 1.2 1995-1999 CNRI yes
1.6 1.5.2 2000 CNRI no
2.0 1.6 2000 BeOpen.com no
1.6.1 1.6 2001 CNRI yes (2)
2.1 2.0+1.6.1 2001 PSF no
2.0.1 2.0+1.6.1 2001 PSF yes
2.1.1 2.1+2.0.1 2001 PSF yes
2.2 2.1.1 2001 PSF yes
2.1.2 2.1.1 2002 PSF yes
2.1.3 2.1.2 2002 PSF yes
2.2.1 2.2 2002 PSF yes
2.2.2 2.2.1 2002 PSF yes
2.2.3 2.2.2 2003 PSF yes
2.3 2.2.2 2002-2003 PSF yes
2.3.1 2.3 2002-2003 PSF yes
2.3.2 2.3.1 2002-2003 PSF yes
2.3.3 2.3.2 2002-2003 PSF yes
2.3.4 2.3.3 2004 PSF yes
2.3.5 2.3.4 2005 PSF yes
2.4 2.3 2004 PSF yes
2.4.1 2.4 2005 PSF yes
2.4.2 2.4.1 2005 PSF yes
2.4.3 2.4.2 2006 PSF yes
2.4.4 2.4.3 2006 PSF yes
2.5 2.4 2006 PSF yes
2.5.1 2.5 2007 PSF yes
2.5.2 2.5.1 2008 PSF yes
2.5.3 2.5.2 2008 PSF yes
2.6 2.5 2008 PSF yes
2.6.1 2.6 2008 PSF yes
2.6.2 2.6.1 2009 PSF yes
2.6.3 2.6.2 2009 PSF yes
2.6.4 2.6.3 2009 PSF yes
2.6.5 2.6.4 2010 PSF yes
3.0 2.6 2008 PSF yes
3.0.1 3.0 2009 PSF yes
3.1 3.0.1 2009 PSF yes
3.1.1 3.1 2009 PSF yes
3.1.2 3.1.1 2010 PSF yes
3.1.3 3.1.2 2010 PSF yes
3.1.4 3.1.3 2011 PSF yes
3.2 3.1 2011 PSF yes
3.2.1 3.2 2011 PSF yes
3.2.2 3.2.1 2011 PSF yes
3.2.3 3.2.2 2012 PSF yes
Footnotes:
(1) GPL-compatible doesn't mean that we're distributing Python under
the GPL. All Python licenses, unlike the GPL, let you distribute
a modified version without making your changes open source. The
GPL-compatible licenses make it possible to combine Python with
other software that is released under the GPL; the others don't.
(2) According to Richard Stallman, 1.6.1 is not GPL-compatible,
because its license has a choice of law clause. According to
CNRI, however, Stallman's lawyer has told CNRI's lawyer that 1.6.1
is "not incompatible" with the GPL.
Thanks to the many outside volunteers who have worked under Guido's
direction to make these releases possible.
B. TERMS AND CONDITIONS FOR ACCESSING OR OTHERWISE USING PYTHON
===============================================================
PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
--------------------------------------------
1. This LICENSE AGREEMENT is between the Python Software Foundation
("PSF"), and the Individual or Organization ("Licensee") accessing and
otherwise using this software ("Python") in source or binary form and
its associated documentation.
2. Subject to the terms and conditions of this License Agreement, PSF hereby
grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
analyze, test, perform and/or display publicly, prepare derivative works,
distribute, and otherwise use Python alone or in any derivative version,
provided, however, that PSF's License Agreement and PSF's notice of copyright,
i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
2011, 2012 Python Software Foundation; All Rights Reserved" are retained in Python
alone or in any derivative version prepared by Licensee.
3. In the event Licensee prepares a derivative work that is based on
or incorporates Python or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python.
4. PSF is making Python available to Licensee on an "AS IS"
basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
7. Nothing in this License Agreement shall be deemed to create any
relationship of agency, partnership, or joint venture between PSF and
Licensee. This License Agreement does not grant permission to use PSF
trademarks or trade name in a trademark sense to endorse or promote
products or services of Licensee, or any third party.
8. By copying, installing or otherwise using Python, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.
BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0
-------------------------------------------
BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1
1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an
office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the
Individual or Organization ("Licensee") accessing and otherwise using
this software in source or binary form and its associated
documentation ("the Software").
2. Subject to the terms and conditions of this BeOpen Python License
Agreement, BeOpen hereby grants Licensee a non-exclusive,
royalty-free, world-wide license to reproduce, analyze, test, perform
and/or display publicly, prepare derivative works, distribute, and
otherwise use the Software alone or in any derivative version,
provided, however, that the BeOpen Python License is retained in the
Software, alone or in any derivative version prepared by Licensee.
3. BeOpen is making the Software available to Licensee on an "AS IS"
basis. BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE
SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS
AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY
DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
5. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
6. This License Agreement shall be governed by and interpreted in all
respects by the law of the State of California, excluding conflict of
law provisions. Nothing in this License Agreement shall be deemed to
create any relationship of agency, partnership, or joint venture
between BeOpen and Licensee. This License Agreement does not grant
permission to use BeOpen trademarks or trade names in a trademark
sense to endorse or promote products or services of Licensee, or any
third party. As an exception, the "BeOpen Python" logos available at
http://www.pythonlabs.com/logos.html may be used according to the
permissions granted on that web page.
7. By copying, installing or otherwise using the software, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.
CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1
---------------------------------------
1. This LICENSE AGREEMENT is between the Corporation for National
Research Initiatives, having an office at 1895 Preston White Drive,
Reston, VA 20191 ("CNRI"), and the Individual or Organization
("Licensee") accessing and otherwise using Python 1.6.1 software in
source or binary form and its associated documentation.
2. Subject to the terms and conditions of this License Agreement, CNRI
hereby grants Licensee a nonexclusive, royalty-free, world-wide
license to reproduce, analyze, test, perform and/or display publicly,
prepare derivative works, distribute, and otherwise use Python 1.6.1
alone or in any derivative version, provided, however, that CNRI's
License Agreement and CNRI's notice of copyright, i.e., "Copyright (c)
1995-2001 Corporation for National Research Initiatives; All Rights
Reserved" are retained in Python 1.6.1 alone or in any derivative
version prepared by Licensee. Alternately, in lieu of CNRI's License
Agreement, Licensee may substitute the following text (omitting the
quotes): "Python 1.6.1 is made available subject to the terms and
conditions in CNRI's License Agreement. This Agreement together with
Python 1.6.1 may be located on the Internet using the following
unique, persistent identifier (known as a handle): 1895.22/1013. This
Agreement may also be obtained from a proxy server on the Internet
using the following URL: http://hdl.handle.net/1895.22/1013".
3. In the event Licensee prepares a derivative work that is based on
or incorporates Python 1.6.1 or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python 1.6.1.
4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS"
basis. CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
7. This License Agreement shall be governed by the federal
intellectual property law of the United States, including without
limitation the federal copyright law, and, to the extent such
U.S. federal law does not apply, by the law of the Commonwealth of
Virginia, excluding Virginia's conflict of law provisions.
Notwithstanding the foregoing, with regard to derivative works based
on Python 1.6.1 that incorporate non-separable material that was
previously distributed under the GNU General Public License (GPL), the
law of the Commonwealth of Virginia shall govern this License
Agreement only as to issues arising under or with respect to
Paragraphs 4, 5, and 7 of this License Agreement. Nothing in this
License Agreement shall be deemed to create any relationship of
agency, partnership, or joint venture between CNRI and Licensee. This
License Agreement does not grant permission to use CNRI trademarks or
trade name in a trademark sense to endorse or promote products or
services of Licensee, or any third party.
8. By clicking on the "ACCEPT" button where indicated, or by copying,
installing or otherwise using Python 1.6.1, Licensee agrees to be
bound by the terms and conditions of this License Agreement.
ACCEPT
CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2
--------------------------------------------------
Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam,
The Netherlands. All rights reserved.
Permission to use, copy, modify, and distribute this software and its
documentation for any purpose and without fee is hereby granted,
provided that the above copyright notice appear in all copies and that
both that copyright notice and this permission notice appear in
supporting documentation, and that the name of Stichting Mathematisch
Centrum or CWI not be used in advertising or publicity pertaining to
distribution of the software without specific, written prior
permission.
STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

View File

@@ -1 +0,0 @@
from .functools32 import *

View File

@@ -1,158 +0,0 @@
"""Drop-in replacement for the thread module.
Meant to be used as a brain-dead substitute so that threaded code does
not need to be rewritten for when the thread module is not present.
Suggested usage is::
try:
try:
import _thread # Python >= 3
except:
import thread as _thread # Python < 3
except ImportError:
import _dummy_thread as _thread
"""
# Exports only things specified by thread documentation;
# skipping obsolete synonyms allocate(), start_new(), exit_thread().
__all__ = ['error', 'start_new_thread', 'exit', 'get_ident', 'allocate_lock',
'interrupt_main', 'LockType']
# A dummy value
TIMEOUT_MAX = 2**31
# NOTE: this module can be imported early in the extension building process,
# and so top level imports of other modules should be avoided. Instead, all
# imports are done when needed on a function-by-function basis. Since threads
# are disabled, the import lock should not be an issue anyway (??).
class error(Exception):
"""Dummy implementation of _thread.error."""
def __init__(self, *args):
self.args = args
def start_new_thread(function, args, kwargs={}):
"""Dummy implementation of _thread.start_new_thread().
Compatibility is maintained by making sure that ``args`` is a
tuple and ``kwargs`` is a dictionary. If an exception is raised
and it is SystemExit (which can be done by _thread.exit()) it is
caught and nothing is done; all other exceptions are printed out
by using traceback.print_exc().
If the executed function calls interrupt_main the KeyboardInterrupt will be
raised when the function returns.
"""
if type(args) != type(tuple()):
raise TypeError("2nd arg must be a tuple")
if type(kwargs) != type(dict()):
raise TypeError("3rd arg must be a dict")
global _main
_main = False
try:
function(*args, **kwargs)
except SystemExit:
pass
except:
import traceback
traceback.print_exc()
_main = True
global _interrupt
if _interrupt:
_interrupt = False
raise KeyboardInterrupt
def exit():
"""Dummy implementation of _thread.exit()."""
raise SystemExit
def get_ident():
"""Dummy implementation of _thread.get_ident().
Since this module should only be used when _threadmodule is not
available, it is safe to assume that the current process is the
only thread. Thus a constant can be safely returned.
"""
return -1
def allocate_lock():
"""Dummy implementation of _thread.allocate_lock()."""
return LockType()
def stack_size(size=None):
"""Dummy implementation of _thread.stack_size()."""
if size is not None:
raise error("setting thread stack size not supported")
return 0
class LockType(object):
"""Class implementing dummy implementation of _thread.LockType.
Compatibility is maintained by maintaining self.locked_status
which is a boolean that stores the state of the lock. Pickling of
the lock, though, should not be done since if the _thread module is
then used with an unpickled ``lock()`` from here problems could
occur from this class not having atomic methods.
"""
def __init__(self):
self.locked_status = False
def acquire(self, waitflag=None, timeout=-1):
"""Dummy implementation of acquire().
For blocking calls, self.locked_status is automatically set to
True and returned appropriately based on value of
``waitflag``. If it is non-blocking, then the value is
actually checked and not set if it is already acquired. This
is all done so that threading.Condition's assert statements
aren't triggered and throw a little fit.
"""
if waitflag is None or waitflag:
self.locked_status = True
return True
else:
if not self.locked_status:
self.locked_status = True
return True
else:
if timeout > 0:
import time
time.sleep(timeout)
return False
__enter__ = acquire
def __exit__(self, typ, val, tb):
self.release()
def release(self):
"""Release the dummy lock."""
# XXX Perhaps shouldn't actually bother to test? Could lead
# to problems for complex, threaded code.
if not self.locked_status:
raise error
self.locked_status = False
return True
def locked(self):
return self.locked_status
# Used to signal that interrupt_main was called in a "thread"
_interrupt = False
# True when not executing in a "thread"
_main = True
def interrupt_main():
"""Set _interrupt flag to True to have start_new_thread raise
KeyboardInterrupt upon exiting."""
if _main:
raise KeyboardInterrupt
else:
global _interrupt
_interrupt = True

View File

@@ -1,423 +0,0 @@
"""functools.py - Tools for working with functions and callable objects
"""
# Python module wrapper for _functools C module
# to allow utilities written in Python to be added
# to the functools module.
# Written by Nick Coghlan <ncoghlan at gmail.com>
# and Raymond Hettinger <python at rcn.com>
# Copyright (C) 2006-2010 Python Software Foundation.
# See C source code for _functools credits/copyright
__all__ = ['update_wrapper', 'wraps', 'WRAPPER_ASSIGNMENTS', 'WRAPPER_UPDATES',
'total_ordering', 'cmp_to_key', 'lru_cache', 'reduce', 'partial']
from _functools import partial, reduce
from collections import MutableMapping, namedtuple
from .reprlib32 import recursive_repr as _recursive_repr
from weakref import proxy as _proxy
import sys as _sys
try:
from thread import allocate_lock as Lock
except ImportError:
from ._dummy_thread32 import allocate_lock as Lock
################################################################################
### OrderedDict
################################################################################
class _Link(object):
__slots__ = 'prev', 'next', 'key', '__weakref__'
class OrderedDict(dict):
'Dictionary that remembers insertion order'
# An inherited dict maps keys to values.
# The inherited dict provides __getitem__, __len__, __contains__, and get.
# The remaining methods are order-aware.
# Big-O running times for all methods are the same as regular dictionaries.
# The internal self.__map dict maps keys to links in a doubly linked list.
# The circular doubly linked list starts and ends with a sentinel element.
# The sentinel element never gets deleted (this simplifies the algorithm).
# The sentinel is in self.__hardroot with a weakref proxy in self.__root.
# The prev links are weakref proxies (to prevent circular references).
# Individual links are kept alive by the hard reference in self.__map.
# Those hard references disappear when a key is deleted from an OrderedDict.
def __init__(self, *args, **kwds):
'''Initialize an ordered dictionary. The signature is the same as
regular dictionaries, but keyword arguments are not recommended because
their insertion order is arbitrary.
'''
if len(args) > 1:
raise TypeError('expected at most 1 arguments, got %d' % len(args))
try:
self.__root
except AttributeError:
self.__hardroot = _Link()
self.__root = root = _proxy(self.__hardroot)
root.prev = root.next = root
self.__map = {}
self.__update(*args, **kwds)
def __setitem__(self, key, value,
dict_setitem=dict.__setitem__, proxy=_proxy, Link=_Link):
'od.__setitem__(i, y) <==> od[i]=y'
# Setting a new item creates a new link at the end of the linked list,
# and the inherited dictionary is updated with the new key/value pair.
if key not in self:
self.__map[key] = link = Link()
root = self.__root
last = root.prev
link.prev, link.next, link.key = last, root, key
last.next = link
root.prev = proxy(link)
dict_setitem(self, key, value)
def __delitem__(self, key, dict_delitem=dict.__delitem__):
'od.__delitem__(y) <==> del od[y]'
# Deleting an existing item uses self.__map to find the link which gets
# removed by updating the links in the predecessor and successor nodes.
dict_delitem(self, key)
link = self.__map.pop(key)
link_prev = link.prev
link_next = link.next
link_prev.next = link_next
link_next.prev = link_prev
def __iter__(self):
'od.__iter__() <==> iter(od)'
# Traverse the linked list in order.
root = self.__root
curr = root.next
while curr is not root:
yield curr.key
curr = curr.next
def __reversed__(self):
'od.__reversed__() <==> reversed(od)'
# Traverse the linked list in reverse order.
root = self.__root
curr = root.prev
while curr is not root:
yield curr.key
curr = curr.prev
def clear(self):
'od.clear() -> None. Remove all items from od.'
root = self.__root
root.prev = root.next = root
self.__map.clear()
dict.clear(self)
def popitem(self, last=True):
'''od.popitem() -> (k, v), return and remove a (key, value) pair.
Pairs are returned in LIFO order if last is true or FIFO order if false.
'''
if not self:
raise KeyError('dictionary is empty')
root = self.__root
if last:
link = root.prev
link_prev = link.prev
link_prev.next = root
root.prev = link_prev
else:
link = root.next
link_next = link.next
root.next = link_next
link_next.prev = root
key = link.key
del self.__map[key]
value = dict.pop(self, key)
return key, value
def move_to_end(self, key, last=True):
'''Move an existing element to the end (or beginning if last==False).
Raises KeyError if the element does not exist.
When last=True, acts like a fast version of self[key]=self.pop(key).
'''
link = self.__map[key]
link_prev = link.prev
link_next = link.next
link_prev.next = link_next
link_next.prev = link_prev
root = self.__root
if last:
last = root.prev
link.prev = last
link.next = root
last.next = root.prev = link
else:
first = root.next
link.prev = root
link.next = first
root.next = first.prev = link
def __sizeof__(self):
sizeof = _sys.getsizeof
n = len(self) + 1 # number of links including root
size = sizeof(self.__dict__) # instance dictionary
size += sizeof(self.__map) * 2 # internal dict and inherited dict
size += sizeof(self.__hardroot) * n # link objects
size += sizeof(self.__root) * n # proxy objects
return size
update = __update = MutableMapping.update
keys = MutableMapping.keys
values = MutableMapping.values
items = MutableMapping.items
__ne__ = MutableMapping.__ne__
__marker = object()
def pop(self, key, default=__marker):
'''od.pop(k[,d]) -> v, remove specified key and return the corresponding
value. If key is not found, d is returned if given, otherwise KeyError
is raised.
'''
if key in self:
result = self[key]
del self[key]
return result
if default is self.__marker:
raise KeyError(key)
return default
def setdefault(self, key, default=None):
'od.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in od'
if key in self:
return self[key]
self[key] = default
return default
@_recursive_repr()
def __repr__(self):
'od.__repr__() <==> repr(od)'
if not self:
return '%s()' % (self.__class__.__name__,)
return '%s(%r)' % (self.__class__.__name__, list(self.items()))
def __reduce__(self):
'Return state information for pickling'
items = [[k, self[k]] for k in self]
inst_dict = vars(self).copy()
for k in vars(OrderedDict()):
inst_dict.pop(k, None)
if inst_dict:
return (self.__class__, (items,), inst_dict)
return self.__class__, (items,)
def copy(self):
'od.copy() -> a shallow copy of od'
return self.__class__(self)
@classmethod
def fromkeys(cls, iterable, value=None):
'''OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S.
If not specified, the value defaults to None.
'''
self = cls()
for key in iterable:
self[key] = value
return self
def __eq__(self, other):
'''od.__eq__(y) <==> od==y. Comparison to another OD is order-sensitive
while comparison to a regular mapping is order-insensitive.
'''
if isinstance(other, OrderedDict):
return len(self)==len(other) and \
all(p==q for p, q in zip(self.items(), other.items()))
return dict.__eq__(self, other)
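The order-sensitive `__eq__` above is the main behavioral difference from plain dict comparison. A minimal sketch, using the stdlib `collections.OrderedDict` (which this backport mirrors):

```python
from collections import OrderedDict

a = OrderedDict([("x", 1), ("y", 2)])
b = OrderedDict([("y", 2), ("x", 1)])

print(a == b)        # False: OD-to-OD comparison is order-sensitive
print(a == dict(b))  # True: comparison to a regular mapping ignores order
```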
# update_wrapper() and wraps() are tools to help write
# wrapper functions that can handle naive introspection
WRAPPER_ASSIGNMENTS = ('__module__', '__name__', '__doc__')
WRAPPER_UPDATES = ('__dict__',)
def update_wrapper(wrapper,
wrapped,
assigned = WRAPPER_ASSIGNMENTS,
updated = WRAPPER_UPDATES):
"""Update a wrapper function to look like the wrapped function
wrapper is the function to be updated
wrapped is the original function
assigned is a tuple naming the attributes assigned directly
from the wrapped function to the wrapper function (defaults to
functools.WRAPPER_ASSIGNMENTS)
updated is a tuple naming the attributes of the wrapper that
are updated with the corresponding attribute from the wrapped
function (defaults to functools.WRAPPER_UPDATES)
"""
wrapper.__wrapped__ = wrapped
for attr in assigned:
try:
value = getattr(wrapped, attr)
except AttributeError:
pass
else:
setattr(wrapper, attr, value)
for attr in updated:
getattr(wrapper, attr).update(getattr(wrapped, attr, {}))
# Return the wrapper so this can be used as a decorator via partial()
return wrapper
def wraps(wrapped,
assigned = WRAPPER_ASSIGNMENTS,
updated = WRAPPER_UPDATES):
"""Decorator factory to apply update_wrapper() to a wrapper function
Returns a decorator that invokes update_wrapper() with the decorated
function as the wrapper argument and the arguments to wraps() as the
remaining arguments. Default arguments are as for update_wrapper().
This is a convenience function to simplify applying partial() to
update_wrapper().
"""
return partial(update_wrapper, wrapped=wrapped,
assigned=assigned, updated=updated)
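To illustrate what `update_wrapper()`/`wraps()` buy you, here is a small sketch using the stdlib `functools` (same API as this backport): without `@wraps`, the decorated function would report `__name__ == 'wrapper'` and lose its docstring.

```python
import functools

def log_calls(fn):
    @functools.wraps(fn)  # copies __name__, __doc__, __module__; updates __dict__
    def wrapper(*args, **kwargs):
        print("calling", fn.__name__)
        return fn(*args, **kwargs)
    return wrapper

@log_calls
def add(a, b):
    """Add two numbers."""
    return a + b

print(add(1, 2))     # prints "calling add", then 3
print(add.__name__)  # 'add', not 'wrapper'
```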
def total_ordering(cls):
"""Class decorator that fills in missing ordering methods"""
convert = {
'__lt__': [('__gt__', lambda self, other: not (self < other or self == other)),
('__le__', lambda self, other: self < other or self == other),
('__ge__', lambda self, other: not self < other)],
'__le__': [('__ge__', lambda self, other: not self <= other or self == other),
('__lt__', lambda self, other: self <= other and not self == other),
('__gt__', lambda self, other: not self <= other)],
'__gt__': [('__lt__', lambda self, other: not (self > other or self == other)),
('__ge__', lambda self, other: self > other or self == other),
('__le__', lambda self, other: not self > other)],
'__ge__': [('__le__', lambda self, other: (not self >= other) or self == other),
('__gt__', lambda self, other: self >= other and not self == other),
('__lt__', lambda self, other: not self >= other)]
}
roots = set(dir(cls)) & set(convert)
if not roots:
raise ValueError('must define at least one ordering operation: < > <= >=')
root = max(roots) # prefer __lt__ to __le__ to __gt__ to __ge__
for opname, opfunc in convert[root]:
if opname not in roots:
opfunc.__name__ = opname
opfunc.__doc__ = getattr(int, opname).__doc__
setattr(cls, opname, opfunc)
return cls
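A short usage sketch (the stdlib `functools.total_ordering` behaves the same way): define `__eq__` plus one root operator, and the decorator derives the rest from the table above.

```python
import functools

@functools.total_ordering
class Version:
    def __init__(self, parts):
        self.parts = parts
    def __eq__(self, other):
        return self.parts == other.parts
    def __lt__(self, other):  # the single "root" operator
        return self.parts < other.parts

print(Version((1, 2)) <= Version((1, 3)))  # True; __le__ derived from __lt__/__eq__
```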
def cmp_to_key(mycmp):
"""Convert a cmp= function into a key= function"""
class K(object):
__slots__ = ['obj']
def __init__(self, obj):
self.obj = obj
def __lt__(self, other):
return mycmp(self.obj, other.obj) < 0
def __gt__(self, other):
return mycmp(self.obj, other.obj) > 0
def __eq__(self, other):
return mycmp(self.obj, other.obj) == 0
def __le__(self, other):
return mycmp(self.obj, other.obj) <= 0
def __ge__(self, other):
return mycmp(self.obj, other.obj) >= 0
def __ne__(self, other):
return mycmp(self.obj, other.obj) != 0
__hash__ = None
return K
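Usage sketch for `cmp_to_key` (the stdlib equivalent behaves identically): it wraps an old-style three-way comparison function so it can drive `sorted()`.

```python
import functools

def by_length_then_alpha(a, b):
    # old-style cmp: negative, zero, or positive
    return (len(a) - len(b)) or (-1 if a < b else int(a > b))

words = ["pear", "fig", "apple", "kiwi"]
print(sorted(words, key=functools.cmp_to_key(by_length_then_alpha)))
# ['fig', 'kiwi', 'pear', 'apple']
```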
_CacheInfo = namedtuple("CacheInfo", "hits misses maxsize currsize")
def lru_cache(maxsize=100):
"""Least-recently-used cache decorator.
If *maxsize* is set to None, the LRU features are disabled and the cache
can grow without bound.
Arguments to the cached function must be hashable.
View the cache statistics named tuple (hits, misses, maxsize, currsize) with
f.cache_info(). Clear the cache and statistics with f.cache_clear().
Access the underlying function with f.__wrapped__.
See: http://en.wikipedia.org/wiki/Cache_algorithms#Least_Recently_Used
"""
# Users should only access the lru_cache through its public API:
# cache_info, cache_clear, and f.__wrapped__
# The internals of the lru_cache are encapsulated for thread safety and
# to allow the implementation to change (including a possible C version).
def decorating_function(user_function,
tuple=tuple, sorted=sorted, len=len, KeyError=KeyError):
hits, misses = [0], [0]
kwd_mark = (object(),) # separates positional and keyword args
lock = Lock() # needed because OrderedDict isn't threadsafe
if maxsize is None:
cache = dict() # simple cache without ordering or size limit
@wraps(user_function)
def wrapper(*args, **kwds):
key = args
if kwds:
key += kwd_mark + tuple(sorted(kwds.items()))
try:
result = cache[key]
hits[0] += 1
return result
except KeyError:
pass
result = user_function(*args, **kwds)
cache[key] = result
misses[0] += 1
return result
else:
cache = OrderedDict() # ordered least recent to most recent
cache_popitem = cache.popitem
cache_renew = cache.move_to_end
@wraps(user_function)
def wrapper(*args, **kwds):
key = args
if kwds:
key += kwd_mark + tuple(sorted(kwds.items()))
with lock:
try:
result = cache[key]
cache_renew(key) # record recent use of this key
hits[0] += 1
return result
except KeyError:
pass
result = user_function(*args, **kwds)
with lock:
cache[key] = result # record recent use of this key
misses[0] += 1
if len(cache) > maxsize:
cache_popitem(0) # purge least recently used cache entry
return result
def cache_info():
"""Report cache statistics"""
with lock:
return _CacheInfo(hits[0], misses[0], maxsize, len(cache))
def cache_clear():
"""Clear the cache and cache statistics"""
with lock:
cache.clear()
hits[0] = misses[0] = 0
wrapper.cache_info = cache_info
wrapper.cache_clear = cache_clear
return wrapper
return decorating_function
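A quick sketch of the public API (`cache_info`/`cache_clear`), using the stdlib `functools.lru_cache`, which this backport mirrors:

```python
import functools

@functools.lru_cache(maxsize=100)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(10)
print(fib.cache_info())  # CacheInfo(hits=8, misses=11, maxsize=100, currsize=11)
fib.cache_clear()        # resets both the cache and the statistics
```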

View File

@@ -1,157 +0,0 @@
"""Redo the builtin repr() (representation) but with limits on most sizes."""
__all__ = ["Repr", "repr", "recursive_repr"]
import __builtin__ as builtins
from itertools import islice
try:
from thread import get_ident
except ImportError:
from _dummy_thread32 import get_ident
def recursive_repr(fillvalue='...'):
'Decorator to make a repr function return fillvalue for a recursive call'
def decorating_function(user_function):
repr_running = set()
def wrapper(self):
key = id(self), get_ident()
if key in repr_running:
return fillvalue
repr_running.add(key)
try:
result = user_function(self)
finally:
repr_running.discard(key)
return result
# Can't use functools.wraps() here because of bootstrap issues
wrapper.__module__ = getattr(user_function, '__module__')
wrapper.__doc__ = getattr(user_function, '__doc__')
wrapper.__name__ = getattr(user_function, '__name__')
wrapper.__annotations__ = getattr(user_function, '__annotations__', {})
return wrapper
return decorating_function
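Sketch of what `recursive_repr` is for, using the stdlib `reprlib` equivalent: it lets a `__repr__` that reaches itself terminate with a fill value instead of recursing forever.

```python
from reprlib import recursive_repr

class Node:
    def __init__(self):
        self.children = []

    @recursive_repr(fillvalue="<...>")
    def __repr__(self):
        return "Node(%r)" % (self.children,)

n = Node()
n.children.append(n)  # self-referential structure
print(repr(n))        # Node([<...>]) rather than a RecursionError
```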
class Repr:
def __init__(self):
self.maxlevel = 6
self.maxtuple = 6
self.maxlist = 6
self.maxarray = 5
self.maxdict = 4
self.maxset = 6
self.maxfrozenset = 6
self.maxdeque = 6
self.maxstring = 30
self.maxlong = 40
self.maxother = 30
def repr(self, x):
return self.repr1(x, self.maxlevel)
def repr1(self, x, level):
typename = type(x).__name__
if ' ' in typename:
parts = typename.split()
typename = '_'.join(parts)
if hasattr(self, 'repr_' + typename):
return getattr(self, 'repr_' + typename)(x, level)
else:
return self.repr_instance(x, level)
def _repr_iterable(self, x, level, left, right, maxiter, trail=''):
n = len(x)
if level <= 0 and n:
s = '...'
else:
newlevel = level - 1
repr1 = self.repr1
pieces = [repr1(elem, newlevel) for elem in islice(x, maxiter)]
if n > maxiter: pieces.append('...')
s = ', '.join(pieces)
if n == 1 and trail: right = trail + right
return '%s%s%s' % (left, s, right)
def repr_tuple(self, x, level):
return self._repr_iterable(x, level, '(', ')', self.maxtuple, ',')
def repr_list(self, x, level):
return self._repr_iterable(x, level, '[', ']', self.maxlist)
def repr_array(self, x, level):
header = "array('%s', [" % x.typecode
return self._repr_iterable(x, level, header, '])', self.maxarray)
def repr_set(self, x, level):
x = _possibly_sorted(x)
return self._repr_iterable(x, level, 'set([', '])', self.maxset)
def repr_frozenset(self, x, level):
x = _possibly_sorted(x)
return self._repr_iterable(x, level, 'frozenset([', '])',
self.maxfrozenset)
def repr_deque(self, x, level):
return self._repr_iterable(x, level, 'deque([', '])', self.maxdeque)
def repr_dict(self, x, level):
n = len(x)
if n == 0: return '{}'
if level <= 0: return '{...}'
newlevel = level - 1
repr1 = self.repr1
pieces = []
for key in islice(_possibly_sorted(x), self.maxdict):
keyrepr = repr1(key, newlevel)
valrepr = repr1(x[key], newlevel)
pieces.append('%s: %s' % (keyrepr, valrepr))
if n > self.maxdict: pieces.append('...')
s = ', '.join(pieces)
return '{%s}' % (s,)
def repr_str(self, x, level):
s = builtins.repr(x[:self.maxstring])
if len(s) > self.maxstring:
i = max(0, (self.maxstring-3)//2)
j = max(0, self.maxstring-3-i)
s = builtins.repr(x[:i] + x[len(x)-j:])
s = s[:i] + '...' + s[len(s)-j:]
return s
def repr_int(self, x, level):
s = builtins.repr(x) # XXX Hope this isn't too slow...
if len(s) > self.maxlong:
i = max(0, (self.maxlong-3)//2)
j = max(0, self.maxlong-3-i)
s = s[:i] + '...' + s[len(s)-j:]
return s
def repr_instance(self, x, level):
try:
s = builtins.repr(x)
# Bugs in x.__repr__() can cause arbitrary
# exceptions -- then make up something
except Exception:
return '<%s instance at %x>' % (x.__class__.__name__, id(x))
if len(s) > self.maxother:
i = max(0, (self.maxother-3)//2)
j = max(0, self.maxother-3-i)
s = s[:i] + '...' + s[len(s)-j:]
return s
def _possibly_sorted(x):
# Since not all sequences of items can be sorted and comparison
# functions may raise arbitrary exceptions, return an unsorted
# sequence in that case.
try:
return sorted(x)
except Exception:
return list(x)
aRepr = Repr()
repr = aRepr.repr
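Usage sketch (the stdlib `reprlib.Repr` exposes the same knobs): each `max*` attribute caps how much of the corresponding type is rendered.

```python
import reprlib

r = reprlib.Repr()
r.maxlist = 3     # show at most 3 list elements
r.maxstring = 12  # cap string reprs at 12 characters

print(r.repr(list(range(100))))  # [0, 1, 2, ...]
print(r.repr("x" * 100))         # 'xxx...xxxx'  (elided in the middle)
```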

View File

@@ -1,103 +0,0 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
This is a fake set of symbols that allows Spack to import ``typing`` on
Python versions where we do not support type checking (Python < 3).
"""
from collections import defaultdict
# (1) Unparameterized types.
Annotated = object
Any = object
AnyStr = object
ByteString = object
Counter = object
Final = object
Hashable = object
NoReturn = object
Sized = object
SupportsAbs = object
SupportsBytes = object
SupportsComplex = object
SupportsFloat = object
SupportsIndex = object
SupportsInt = object
SupportsRound = object
# (2) Parameterized types.
AbstractSet = defaultdict(lambda: object)
AsyncContextManager = defaultdict(lambda: object)
AsyncGenerator = defaultdict(lambda: object)
AsyncIterable = defaultdict(lambda: object)
AsyncIterator = defaultdict(lambda: object)
Awaitable = defaultdict(lambda: object)
Callable = defaultdict(lambda: object)
ChainMap = defaultdict(lambda: object)
ClassVar = defaultdict(lambda: object)
Collection = defaultdict(lambda: object)
Container = defaultdict(lambda: object)
ContextManager = defaultdict(lambda: object)
Coroutine = defaultdict(lambda: object)
DefaultDict = defaultdict(lambda: object)
Deque = defaultdict(lambda: object)
Dict = defaultdict(lambda: object)
ForwardRef = defaultdict(lambda: object)
FrozenSet = defaultdict(lambda: object)
Generator = defaultdict(lambda: object)
Generic = defaultdict(lambda: object)
ItemsView = defaultdict(lambda: object)
Iterable = defaultdict(lambda: object)
Iterator = defaultdict(lambda: object)
KeysView = defaultdict(lambda: object)
List = defaultdict(lambda: object)
Literal = defaultdict(lambda: object)
Mapping = defaultdict(lambda: object)
MappingView = defaultdict(lambda: object)
MutableMapping = defaultdict(lambda: object)
MutableSequence = defaultdict(lambda: object)
MutableSet = defaultdict(lambda: object)
NamedTuple = defaultdict(lambda: object)
Optional = defaultdict(lambda: object)
OrderedDict = defaultdict(lambda: object)
Reversible = defaultdict(lambda: object)
Sequence = defaultdict(lambda: object)
Set = defaultdict(lambda: object)
Tuple = defaultdict(lambda: object)
Type = defaultdict(lambda: object)
TypedDict = defaultdict(lambda: object)
Union = defaultdict(lambda: object)
ValuesView = defaultdict(lambda: object)
# (3) Type variable declarations.
TypeVar = lambda *args, **kwargs: None
# (4) Functions.
cast = lambda _type, x: x
get_args = None
get_origin = None
get_type_hints = None
no_type_check = None
no_type_check_decorator = None
## typing_extensions
# For reasons we have not tracked down, moving the following symbols into a
# separate typing_extensions.py file triggers a ModuleNotFoundError when
# importing anything from typing_extensions, so they live here instead.
# (1) Unparameterized types.
IntVar = object
Literal = object
NewType = object
Text = object
# (2) Parameterized types.
Protocol = defaultdict(lambda: object)
# (3) Macro for avoiding evaluation except during type checking.
TYPE_CHECKING = False
# (4) Decorators.
final = lambda x: x
overload = lambda x: x
runtime_checkable = lambda x: x
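The reason the parameterized names are `defaultdict(lambda: object)` rather than plain `object`: subscripting a defaultdict with a missing key just returns the factory's value, so annotations like `Dict[str, int]` evaluate without error. A minimal sketch of the trick in isolation:

```python
from collections import defaultdict

Dict = defaultdict(lambda: object)
Optional = defaultdict(lambda: object)

# Both subscripts below hit the factory and return `object`,
# so the annotations evaluate cleanly at function-definition time.
def lookup(table: Dict[str, int], key: str) -> Optional[int]:
    return table.get(key)

print(lookup({"a": 1}, "a"))  # 1
```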

View File

@@ -7,11 +7,10 @@
import argparse
import errno
import io
import re
import sys
from six import StringIO
class Command(object):
"""Parsed representation of a command from argparse.
@@ -181,7 +180,7 @@ def __init__(self, prog, out=None, aliases=False, rst_levels=_rst_levels):
self.rst_levels = rst_levels
def format(self, cmd):
string = StringIO()
string = io.StringIO()
string.write(self.begin_command(cmd.prog))
if cmd.description:

View File

@@ -1,39 +0,0 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# isort: off
import sys
if sys.version_info < (3,):
from itertools import ifilter as filter
from itertools import imap as map
from itertools import izip as zip
from itertools import izip_longest as zip_longest # novm
from urllib import urlencode as urlencode
from urllib import urlopen as urlopen
else:
filter = filter
map = map
zip = zip
from itertools import zip_longest as zip_longest # novm # noqa: F401
from urllib.parse import urlencode as urlencode # novm # noqa: F401
from urllib.request import urlopen as urlopen # novm # noqa: F401
if sys.version_info >= (3, 3):
from collections.abc import Hashable as Hashable # novm
from collections.abc import Iterable as Iterable # novm
from collections.abc import Mapping as Mapping # novm
from collections.abc import MutableMapping as MutableMapping # novm
from collections.abc import MutableSequence as MutableSequence # novm
from collections.abc import MutableSet as MutableSet # novm
from collections.abc import Sequence as Sequence # novm
else:
from collections import Hashable as Hashable # noqa: F401
from collections import Iterable as Iterable # noqa: F401
from collections import Mapping as Mapping # noqa: F401
from collections import MutableMapping as MutableMapping # noqa: F401
from collections import MutableSequence as MutableSequence # noqa: F401
from collections import MutableSet as MutableSet # noqa: F401
from collections import Sequence as Sequence # noqa: F401

View File

@@ -0,0 +1,982 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Utilities for setting and modifying environment variables."""
import collections
import contextlib
import inspect
import json
import os
import os.path
import re
import shlex
import sys
import llnl.util.executable as executable
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.path import path_to_os_path
is_windows = sys.platform == "win32"
system_paths = (
["/", "/usr", "/usr/local"]
if not is_windows
else ["C:\\", "C:\\Program Files", "C:\\Program Files (x86)", "C:\\Users", "C:\\ProgramData"]
)
suffixes = ["bin", "bin64", "include", "lib", "lib64"] if not is_windows else []
system_dirs = [os.path.join(p, s) for s in suffixes for p in system_paths] + system_paths
def is_system_path(path):
"""Predicate that given a path returns True if it is a system path,
False otherwise.
Args:
path (str): path to a directory
Returns:
True or False
"""
return path and os.path.normpath(path) in system_dirs
def filter_system_paths(paths):
"""Return only paths that are not system paths."""
return [p for p in paths if not is_system_path(p)]
def deprioritize_system_paths(paths):
"""Put system paths at the end of paths, otherwise preserving order."""
filtered_paths = filter_system_paths(paths)
fp = set(filtered_paths)
return filtered_paths + [p for p in paths if p not in fp]
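A self-contained sketch of the two path filters above, with `system_dirs` shrunk to a toy POSIX list (an assumption for illustration):

```python
system_dirs = ["/usr/bin", "/usr/local/bin"]  # toy stand-in for the real list

def filter_system(paths):
    return [p for p in paths if p not in system_dirs]

def deprioritize_system(paths):
    front = filter_system(paths)
    return front + [p for p in paths if p in system_dirs]

paths = ["/usr/bin", "/opt/tool/bin", "/usr/local/bin", "/home/me/bin"]
print(filter_system(paths))        # ['/opt/tool/bin', '/home/me/bin']
print(deprioritize_system(paths))  # ['/opt/tool/bin', '/home/me/bin', '/usr/bin', '/usr/local/bin']
```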
_shell_set_strings = {
"sh": "export {0}={1};\n",
"csh": "setenv {0} {1};\n",
"fish": "set -gx {0} {1};\n",
"bat": 'set "{0}={1}"\n',
}
_shell_unset_strings = {
"sh": "unset {0};\n",
"csh": "unsetenv {0};\n",
"fish": "set -e {0};\n",
"bat": 'set "{0}="\n',
}
tracing_enabled = False
def prune_duplicate_paths(paths):
"""Returns the paths with duplicates removed, order preserved."""
return list(llnl.util.lang.dedupe(paths))
def get_path(name):
path = os.environ.get(name, "").strip()
if path:
return path.split(os.pathsep)
else:
return []
def env_flag(name):
if name in os.environ:
value = os.environ[name].lower()
return value == "true" or value == "1"
return False
def path_set(var_name, directories):
path_str = os.pathsep.join(str(dir) for dir in directories)
os.environ[var_name] = path_str
def path_put_first(var_name, directories):
"""Puts the provided directories first in the path, adding them
if they're not already there.
"""
path = os.environ.get(var_name, "").split(os.pathsep)
for dir in directories:
if dir in path:
path.remove(dir)
new_path = tuple(directories) + tuple(path)
path_set(var_name, new_path)
@contextlib.contextmanager
def set_env(**kwargs):
"""Temporarily sets and restores environment variables.
Variables can be set as keyword arguments to this function.
"""
saved = {}
for var, value in kwargs.items():
if var in os.environ:
saved[var] = os.environ[var]
if value is None:
if var in os.environ:
del os.environ[var]
else:
os.environ[var] = value
yield
for var, value in kwargs.items():
if var in saved:
os.environ[var] = saved[var]
else:
if var in os.environ:
del os.environ[var]
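Hypothetical usage of `set_env()`, assuming this module is importable as `llnl.util.envmod`; passing `None` unsets a variable for the duration of the block.

```python
import os
from llnl.util.envmod import set_env  # assumes Spack's llnl package is on sys.path

os.environ["LANG"] = "en_US.UTF-8"
with set_env(LANG="C", DEBUG=None):  # set LANG, unset DEBUG
    assert os.environ["LANG"] == "C"
assert os.environ["LANG"] == "en_US.UTF-8"  # restored on exit
```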
class NameModifier(object):
def __init__(self, name, **kwargs):
self.name = name
self.separator = kwargs.get("separator", os.pathsep)
self.args = {"name": name, "separator": self.separator}
self.args.update(kwargs)
def __eq__(self, other):
if not isinstance(other, NameModifier):
return False
return self.name == other.name
def update_args(self, **kwargs):
self.__dict__.update(kwargs)
self.args.update(kwargs)
class NameValueModifier(object):
def __init__(self, name, value, **kwargs):
self.name = name
self.value = value
self.separator = kwargs.get("separator", os.pathsep)
self.args = {"name": name, "value": value, "separator": self.separator}
self.args.update(kwargs)
def __eq__(self, other):
if not isinstance(other, NameValueModifier):
return False
return (
self.name == other.name
and self.value == other.value
and self.separator == other.separator
)
def update_args(self, **kwargs):
self.__dict__.update(kwargs)
self.args.update(kwargs)
class SetEnv(NameValueModifier):
def execute(self, env):
tty.debug("SetEnv: {0}={1}".format(self.name, str(self.value)), level=3)
env[self.name] = str(self.value)
class AppendFlagsEnv(NameValueModifier):
def execute(self, env):
tty.debug("AppendFlagsEnv: {0}={1}".format(self.name, str(self.value)), level=3)
if self.name in env and env[self.name]:
env[self.name] += self.separator + str(self.value)
else:
env[self.name] = str(self.value)
class UnsetEnv(NameModifier):
def execute(self, env):
tty.debug("UnsetEnv: {0}".format(self.name), level=3)
# Avoid throwing if the variable was not set
env.pop(self.name, None)
class RemoveFlagsEnv(NameValueModifier):
def execute(self, env):
tty.debug("RemoveFlagsEnv: {0}-{1}".format(self.name, str(self.value)), level=3)
environment_value = env.get(self.name, "")
flags = environment_value.split(self.separator) if environment_value else []
flags = [f for f in flags if f != self.value]
env[self.name] = self.separator.join(flags)
class SetPath(NameValueModifier):
def execute(self, env):
string_path = concatenate_paths(self.value, separator=self.separator)
tty.debug("SetPath: {0}={1}".format(self.name, string_path), level=3)
env[self.name] = string_path
class AppendPath(NameValueModifier):
def execute(self, env):
tty.debug("AppendPath: {0}+{1}".format(self.name, str(self.value)), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories.append(path_to_os_path(os.path.normpath(self.value)).pop())
env[self.name] = self.separator.join(directories)
class PrependPath(NameValueModifier):
def execute(self, env):
tty.debug("PrependPath: {0}+{1}".format(self.name, str(self.value)), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories = [path_to_os_path(os.path.normpath(self.value)).pop()] + directories
env[self.name] = self.separator.join(directories)
class RemovePath(NameValueModifier):
def execute(self, env):
tty.debug("RemovePath: {0}-{1}".format(self.name, str(self.value)), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories = [
path_to_os_path(os.path.normpath(x)).pop()
for x in directories
if x != path_to_os_path(os.path.normpath(self.value)).pop()
]
env[self.name] = self.separator.join(directories)
class DeprioritizeSystemPaths(NameModifier):
def execute(self, env):
tty.debug("DeprioritizeSystemPaths: {0}".format(self.name), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories = deprioritize_system_paths(
[path_to_os_path(os.path.normpath(x)).pop() for x in directories]
)
env[self.name] = self.separator.join(directories)
class PruneDuplicatePaths(NameModifier):
def execute(self, env):
tty.debug("PruneDuplicatePaths: {0}".format(self.name), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories = prune_duplicate_paths(
[path_to_os_path(os.path.normpath(x)).pop() for x in directories]
)
env[self.name] = self.separator.join(directories)
class EnvironmentModifications(object):
"""Keeps track of requests to modify the current environment.
Each call to a method that modifies the environment stores extra
information on the caller in the request:
* 'filename' : filename of the module where the caller is defined
* 'lineno' : line number where the request occurred
* 'context' : line of code that issued the request
"""
def __init__(self, other=None, traced=None):
"""Initializes a new instance, copying commands from 'other'
if it is not None.
Args:
other (EnvironmentModifications): list of environment modifications
to be extended (optional)
traced (bool): enable or disable stack trace inspection to log the origin
of the environment modifications.
"""
self.traced = tracing_enabled if traced is None else bool(traced)
self.env_modifications = []
if other is not None:
self.extend(other)
def __iter__(self):
return iter(self.env_modifications)
def __len__(self):
return len(self.env_modifications)
def extend(self, other):
self._check_other(other)
self.env_modifications.extend(other.env_modifications)
@staticmethod
def _check_other(other):
if not isinstance(other, EnvironmentModifications):
raise TypeError("other must be an instance of EnvironmentModifications")
def _maybe_trace(self, kwargs):
"""Provide the modification with stack trace info so that we can track its
origin to find issues in packages. This is very slow and expensive."""
if not self.traced:
return
stack = inspect.stack()
try:
_, filename, lineno, _, context, index = stack[2]
context = context[index].strip()
except Exception:
filename = "unknown file"
lineno = "unknown line"
context = "unknown context"
kwargs.update({"filename": filename, "lineno": lineno, "context": context})
def set(self, name, value, **kwargs):
"""Stores a request to set an environment variable.
Args:
name: name of the environment variable to be set
value: value of the environment variable
"""
self._maybe_trace(kwargs)
item = SetEnv(name, value, **kwargs)
self.env_modifications.append(item)
def append_flags(self, name, value, sep=" ", **kwargs):
"""
Stores in the current object a request to append to an env variable
Args:
name: name of the environment variable to be appended to
value: value to append to the environment variable
Appends with spaces separating different additions to the variable
"""
self._maybe_trace(kwargs)
kwargs.update({"separator": sep})
item = AppendFlagsEnv(name, value, **kwargs)
self.env_modifications.append(item)
def unset(self, name, **kwargs):
"""Stores a request to unset an environment variable.
Args:
name: name of the environment variable to be unset
"""
self._maybe_trace(kwargs)
item = UnsetEnv(name, **kwargs)
self.env_modifications.append(item)
def remove_flags(self, name, value, sep=" ", **kwargs):
"""
Stores in the current object a request to remove flags from an
env variable
Args:
name: name of the environment variable to be removed from
value: value to remove to the environment variable
sep: separator to assume for environment variable
"""
self._maybe_trace(kwargs)
kwargs.update({"separator": sep})
item = RemoveFlagsEnv(name, value, **kwargs)
self.env_modifications.append(item)
def set_path(self, name, elements, **kwargs):
"""Stores a request to set a path generated from a list.
Args:
name: name of the environment variable to be set.
elements: elements of the path to set.
"""
self._maybe_trace(kwargs)
item = SetPath(name, elements, **kwargs)
self.env_modifications.append(item)
def append_path(self, name, path, **kwargs):
"""Stores a request to append a path to a path list.
Args:
name: name of the path list in the environment
path: path to be appended
"""
self._maybe_trace(kwargs)
item = AppendPath(name, path, **kwargs)
self.env_modifications.append(item)
def prepend_path(self, name, path, **kwargs):
"""Same as `append_path`, but the path is pre-pended.
Args:
name: name of the path list in the environment
path: path to be pre-pended
"""
self._maybe_trace(kwargs)
item = PrependPath(name, path, **kwargs)
self.env_modifications.append(item)
def remove_path(self, name, path, **kwargs):
"""Stores a request to remove a path from a path list.
Args:
name: name of the path list in the environment
path: path to be removed
"""
self._maybe_trace(kwargs)
item = RemovePath(name, path, **kwargs)
self.env_modifications.append(item)
def deprioritize_system_paths(self, name, **kwargs):
"""Stores a request to deprioritize system paths in a path list,
otherwise preserving the order.
Args:
name: name of the path list in the environment.
"""
self._maybe_trace(kwargs)
item = DeprioritizeSystemPaths(name, **kwargs)
self.env_modifications.append(item)
def prune_duplicate_paths(self, name, **kwargs):
"""Stores a request to remove duplicates from a path list, otherwise
preserving the order.
Args:
name: name of the path list in the environment.
"""
self._maybe_trace(kwargs)
item = PruneDuplicatePaths(name, **kwargs)
self.env_modifications.append(item)
def group_by_name(self):
"""Returns a dict of the modifications grouped by variable name.
Returns:
dict mapping the environment variable name to the modifications to
be done on it
"""
modifications = collections.defaultdict(list)
for item in self:
modifications[item.name].append(item)
return modifications
def is_unset(self, var_name):
modifications = self.group_by_name()
var_updates = modifications.get(var_name, None)
if not var_updates:
# We did not explicitly unset it
return False
# The last modification must unset the variable for it to be considered
# unset
return type(var_updates[-1]) == UnsetEnv
def clear(self):
"""
Clears the current list of modifications
"""
self.env_modifications = []
def reversed(self):
"""
Returns an EnvironmentModifications object that will reverse self.
Only creates reversals for additions to the environment, as reversing
``unset`` and ``remove_path`` modifications is impossible.
Reversible operations are set(), prepend_path(), append_path(),
set_path(), and append_flags().
"""
rev = EnvironmentModifications()
for envmod in reversed(self.env_modifications):
if type(envmod) == SetEnv:
tty.debug("Reversing `Set` environment operation may lose " "original value")
rev.unset(envmod.name)
elif type(envmod) == AppendPath:
rev.remove_path(envmod.name, envmod.value)
elif type(envmod) == PrependPath:
rev.remove_path(envmod.name, envmod.value)
elif type(envmod) == SetPath:
tty.debug("Reversing `SetPath` environment operation may lose " "original value")
rev.unset(envmod.name)
elif type(envmod) == AppendFlagsEnv:
rev.remove_flags(envmod.name, envmod.value)
else:
# This is an un-reversable operation
tty.warn(
"Skipping reversal of unreversable operation"
"%s %s" % (type(envmod), envmod.name)
)
return rev
def apply_modifications(self, env=None):
"""Applies the modifications and clears the list."""
# Use os.environ if not specified
# Do not copy, we want to modify it in place
if env is None:
env = os.environ
modifications = self.group_by_name()
# Apply modifications one variable at a time
for name, actions in sorted(modifications.items()):
for x in actions:
x.execute(env)
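A sketch tying the pieces together, assuming the module is importable as `llnl.util.envmod`: modifications are recorded first, applied to a mapping later, and (where possible) undone via `reversed()`.

```python
from llnl.util.envmod import EnvironmentModifications  # assumed import path

env = {}  # any mutable mapping works; os.environ is the default
mods = EnvironmentModifications()
mods.set("FOO", "bar")
mods.prepend_path("PATH", "/opt/tool/bin")
mods.apply_modifications(env)
print(env["FOO"])       # bar

undo = mods.reversed()  # unset FOO, remove the prepended PATH entry
undo.apply_modifications(env)
print("FOO" in env)     # False
```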
def shell_modifications(self, shell="sh", explicit=False, env=None):
"""Return shell code to apply the modifications and clears the list."""
modifications = self.group_by_name()
if env is None:
env = os.environ
new_env = env.copy()
for name, actions in sorted(modifications.items()):
for x in actions:
x.execute(new_env)
if "MANPATH" in new_env and not new_env.get("MANPATH").endswith(":"):
new_env["MANPATH"] += ":"
cmds = ""
for name in sorted(set(modifications)):
new = new_env.get(name, None)
old = env.get(name, None)
if explicit or new != old:
if new is None:
cmds += _shell_unset_strings[shell].format(name)
else:
if sys.platform != "win32":
cmd = _shell_set_strings[shell].format(name, shlex.quote(new_env[name]))
else:
cmd = _shell_set_strings[shell].format(name, new_env[name])
cmds += cmd
return cmds
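Sketch of rendering the pending changes as shell code instead of applying them in-process (POSIX example, same assumed import path as above):

```python
from llnl.util.envmod import EnvironmentModifications  # assumed import path

mods = EnvironmentModifications()
mods.set("CC", "/usr/bin/gcc")
mods.unset("CFLAGS")

print(mods.shell_modifications(shell="sh", env={"CFLAGS": "-O2"}))
# export CC=/usr/bin/gcc;
# unset CFLAGS;
```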
@staticmethod
def from_sourcing_file(filename, *arguments, **kwargs):
"""Constructs an instance of a
:py:class:`llnl.util.envmod.EnvironmentModifications` object
that has the same effect as sourcing a file.
Args:
filename (str): the file to be sourced
*arguments (list): arguments to pass on the command line
Keyword Args:
shell (str): the shell to use (default: ``bash``)
shell_options (str): options passed to the shell (default: ``-c``)
source_command (str): the command to run (default: ``source``)
suppress_output (str): redirect used to suppress output of command
(default: ``&> /dev/null``)
concatenate_on_success (str): operator used to execute a command
only when the previous command succeeds (default: ``&&``)
exclude ([str or re]): ignore any modifications of these
variables (default: [])
include ([str or re]): always respect modifications of these
variables (default: []). Supersedes any excluded variables.
clean (bool): in addition to removing empty entries,
also remove duplicate entries (default: False).
"""
tty.debug("EnvironmentModifications.from_sourcing_file: {0}".format(filename))
# Check if the file actually exists
if not os.path.isfile(filename):
msg = "Trying to source non-existing file: {0}".format(filename)
raise RuntimeError(msg)
# Prepare include and exclude lists of environment variable names
exclude = kwargs.get("exclude", [])
include = kwargs.get("include", [])
clean = kwargs.get("clean", False)
# Other variables unrelated to sourcing a file
exclude.extend(
[
# Bash internals
"SHLVL",
"_",
"PWD",
"OLDPWD",
"PS1",
"PS2",
"ENV",
# Environment modules v4
"LOADEDMODULES",
"_LMFILES_",
"BASH_FUNC_module()",
"MODULEPATH",
"MODULES_(.*)",
r"(\w*)_mod(quar|share)",
# Lmod configuration
r"LMOD_(.*)",
"MODULERCFILE",
]
)
# Compute the environments before and after sourcing
before = sanitize(
environment_after_sourcing_files(os.devnull, **kwargs),
exclude=exclude,
include=include,
)
file_and_args = (filename,) + arguments
after = sanitize(
environment_after_sourcing_files(file_and_args, **kwargs),
exclude=exclude,
include=include,
)
# Delegate to the other factory
return EnvironmentModifications.from_environment_diff(before, after, clean)
@staticmethod
def from_environment_diff(before, after, clean=False):
"""Constructs an instance of a
:py:class:`llnl.util.envmod.EnvironmentModifications` object
from the diff of two dictionaries.
Args:
before (dict): environment before the modifications are applied
after (dict): environment after the modifications are applied
clean (bool): in addition to removing empty entries, also remove
duplicate entries
"""
# Fill the EnvironmentModifications instance
env = EnvironmentModifications()
# New variables
new_variables = list(set(after) - set(before))
# Variables that have been unset
unset_variables = list(set(before) - set(after))
# Variables that have been modified
common_variables = set(before).intersection(set(after))
modified_variables = [x for x in common_variables if before[x] != after[x]]
# Consistent output order - looks nicer, easier comparison...
new_variables.sort()
unset_variables.sort()
modified_variables.sort()
def return_separator_if_any(*args):
separators = ":", ";"
for separator in separators:
for arg in args:
if separator in arg:
return separator
return None
# Add variables to env.
# Assume that variables with 'PATH' in the name or that contain
# separators like ':' or ';' are more likely to be paths
for x in new_variables:
sep = return_separator_if_any(after[x])
if sep:
env.prepend_path(x, after[x], separator=sep)
elif "PATH" in x:
env.prepend_path(x, after[x])
else:
# We just need to set the variable to the new value
env.set(x, after[x])
for x in unset_variables:
env.unset(x)
for x in modified_variables:
value_before = before[x]
value_after = after[x]
sep = return_separator_if_any(value_before, value_after)
if sep:
before_list = value_before.split(sep)
after_list = value_after.split(sep)
# Filter out empty strings
before_list = list(filter(None, before_list))
after_list = list(filter(None, after_list))
# Remove duplicate entries (worse matching, bloats env)
if clean:
before_list = list(llnl.util.lang.dedupe(before_list))
after_list = list(llnl.util.lang.dedupe(after_list))
# The reassembled cleaned entries
value_before = sep.join(before_list)
value_after = sep.join(after_list)
# Paths that have been removed
remove_list = [ii for ii in before_list if ii not in after_list]
# Check that nothing has been added in the middle of
# before_list
remaining_list = [ii for ii in before_list if ii in after_list]
try:
start = after_list.index(remaining_list[0])
end = after_list.index(remaining_list[-1])
search = sep.join(after_list[start : end + 1])
except IndexError:
env.prepend_path(x, value_after)
continue
if search not in value_before:
# We just need to set the variable to the new value
env.prepend_path(x, value_after)
else:
try:
prepend_list = after_list[:start]
prepend_list.reverse() # Preserve order after prepend
except KeyError:
prepend_list = []
try:
append_list = after_list[end + 1 :]
except KeyError:
append_list = []
for item in remove_list:
env.remove_path(x, item)
for item in append_list:
env.append_path(x, item)
for item in prepend_list:
env.prepend_path(x, item)
else:
# We just need to set the variable to the new value
env.set(x, value_after)
return env
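A small POSIX sketch of the diff factory (assumed import path as above): new variables are set, vanished ones unset, and path-like changes become prepend/append/remove operations.

```python
from llnl.util.envmod import EnvironmentModifications  # assumed import path

before = {"PATH": "/usr/bin", "TMPDIR": "/tmp"}
after = {"PATH": "/opt/tool/bin:/usr/bin", "CC": "gcc"}

mods = EnvironmentModifications.from_environment_diff(before, after)
env = dict(before)
mods.apply_modifications(env)
print(env)  # {'PATH': '/opt/tool/bin:/usr/bin', 'CC': 'gcc'}
```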
def concatenate_paths(paths, separator=os.pathsep):
"""Concatenates an iterable of paths into a string of paths separated by
separator, defaulting to colon.
Args:
paths: iterable of paths
separator: the separator to use, default ';' windows, ':' otherwise
Returns:
string
"""
return separator.join(str(item) for item in paths)
def set_or_unset_not_first(variable, changes, errstream):
"""Check if we are going to set or unset something after other
modifications have already been requested.
"""
indexes = [
ii
for ii, item in enumerate(changes)
if ii != 0 and not item.args.get("force", False) and type(item) in [SetEnv, UnsetEnv]
]
if indexes:
good = "\t \t{context} at {filename}:{lineno}"
nogood = "\t--->\t{context} at {filename}:{lineno}"
message = "Suspicious requests to set or unset '{var}' found"
errstream(message.format(var=variable))
for ii, item in enumerate(changes):
print_format = nogood if ii in indexes else good
errstream(print_format.format(**item.args))
def validate(env, errstream):
"""Validates the environment modifications to check for the presence of
suspicious patterns. Prompts a warning for everything that was found.
Current checks:
- set or unset variables after other changes on the same variable
Args:
env: list of environment modifications
"""
if not env.traced:
return
modifications = env.group_by_name()
for variable, list_of_changes in sorted(modifications.items()):
set_or_unset_not_first(variable, list_of_changes, errstream)
def inspect_path(root, inspections, exclude=None):
"""Inspects ``root`` to search for the subdirectories in ``inspections``.
Adds every path found to a list of prepend-path commands and returns it.
Args:
root (str): absolute path where to search for subdirectories
inspections (dict): maps relative paths to a list of environment
variables that will be modified if the path exists. The
modifications are not performed immediately, but stored in a
command object that is returned to the client
exclude (typing.Callable): optional callable. If present it must accept an
absolute path and return True if it should be excluded from the
inspection
Examples:
The following lines execute an inspection in ``/usr`` to search for
``/usr/include`` and ``/usr/lib64``. If found we want to prepend
``/usr/include`` to ``CPATH`` and ``/usr/lib64`` to ``MY_LIB64_PATH``.
.. code-block:: python
# Set up the dictionary containing the inspection
inspections = {
'include': ['CPATH'],
'lib64': ['MY_LIB64_PATH']
}
# Get back the list of command needed to modify the environment
env = inspect_path('/usr', inspections)
# Eventually execute the commands
env.apply_modifications()
Returns:
instance of EnvironmentModifications containing the requested
modifications
"""
if exclude is None:
exclude = lambda x: False
env = EnvironmentModifications()
# Inspect the prefix to check for the existence of common directories
for relative_path, variables in inspections.items():
expected = os.path.join(root, relative_path)
if os.path.isdir(expected) and not exclude(expected):
for variable in variables:
env.prepend_path(variable, expected)
return env
@contextlib.contextmanager
def preserve_environment(*variables):
"""Ensures that the value of the environment variables passed as
arguments is the same before entering to the context manager and after
exiting it.
Variables that are unset before entering the context manager will be
explicitly unset on exit.
Args:
variables (list): list of environment variables to be preserved
"""
cache = {}
for var in variables:
# The environment variable to be preserved might not be there.
# In that case store None as a placeholder.
cache[var] = os.environ.get(var, None)
yield
for var in variables:
value = cache[var]
msg = "[PRESERVE_ENVIRONMENT]"
if value is not None:
# Print a debug statement if the value changed
if var not in os.environ:
msg += ' {0} was unset, will be reset to "{1}"'
tty.debug(msg.format(var, value))
elif os.environ[var] != value:
msg += ' {0} was set to "{1}", will be reset to "{2}"'
tty.debug(msg.format(var, os.environ[var], value))
os.environ[var] = value
elif var in os.environ:
msg += ' {0} was set to "{1}", will be unset'
tty.debug(msg.format(var, os.environ[var]))
del os.environ[var]
def environment_after_sourcing_files(*files, **kwargs):
"""Returns a dictionary with the environment that one would have
after sourcing the files passed as argument.
Args:
*files: each item can either be a string containing the path
of the file to be sourced or a sequence, where the first element
is the file to be sourced and the remaining are arguments to be
passed to the command line
Keyword Args:
env (dict): the initial environment (default: current environment)
shell (str): the shell to use (default: ``/bin/bash``)
shell_options (str): options passed to the shell (default: ``-c``)
source_command (str): the command to run (default: ``source``)
suppress_output (str): redirect used to suppress output of command
(default: ``&> /dev/null``)
concatenate_on_success (str): operator used to execute a command
only when the previous command succeeds (default: ``&&``)
"""
# Set the shell executable that will be used to source files
shell_cmd = kwargs.get("shell", "/bin/bash")
shell_options = kwargs.get("shell_options", "-c")
source_command = kwargs.get("source_command", "source")
suppress_output = kwargs.get("suppress_output", "&> /dev/null")
concatenate_on_success = kwargs.get("concatenate_on_success", "&&")
shell = executable.Executable(" ".join([shell_cmd, shell_options]))
def _source_single_file(file_and_args, environment):
source_file = [source_command]
source_file.extend(x for x in file_and_args)
source_file = " ".join(source_file)
# If a Python interpreter is found on PATH use it; if not, fall back
# to sys.executable. Below we just need a working Python interpreter,
# not necessarily sys.executable.
python_cmd = executable.which("python3", "python", "python2")
python_cmd = python_cmd.path if python_cmd else sys.executable
dump_cmd = "import os, json; print(json.dumps(dict(os.environ)))"
dump_environment = python_cmd + ' -E -c "{0}"'.format(dump_cmd)
# Try to source the file
source_file_arguments = " ".join(
[
source_file,
suppress_output,
concatenate_on_success,
dump_environment,
]
)
output = shell(source_file_arguments, output=str, env=environment, ignore_quotes=True)
return json.loads(output)
current_environment = kwargs.get("env", dict(os.environ))
for f in files:
# Normalize the input to the helper function
if isinstance(f, str):
f = [f]
current_environment = _source_single_file(f, environment=current_environment)
return current_environment
def sanitize(environment, exclude, include):
"""Returns a copy of the input dictionary where all the keys that
match an excluded pattern and don't match an included pattern are
removed.
Args:
environment (dict): input dictionary
exclude (list): literals or regex patterns to be excluded
include (list): literals or regex patterns to be included
"""
def set_intersection(fullset, *args):
# A set intersection using string literals and regexes
meta = "[" + re.escape("[$()*?[]^{|}") + "]"
subset = fullset & set(args) # As literal
for name in args:
if re.search(meta, name):
pattern = re.compile(name)
for k in fullset:
if re.match(pattern, k):
subset.add(k)
return subset
# Don't modify input, make a copy instead
environment = dict(environment)
# include supersedes any excluded items
prune = set_intersection(set(environment), *exclude)
prune -= set_intersection(prune, *include)
for k in prune:
environment.pop(k, None)
return environment
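Sketch of `sanitize()` (assumed import path as above): exclusions may be literals or regexes, and an `include` match rescues a key that an `exclude` pattern would prune.

```python
from llnl.util.envmod import sanitize  # assumed import path

env = {"PS1": r"\u@\h", "LMOD_CMD": "/usr/bin/lmod", "PATH": "/usr/bin"}
cleaned = sanitize(env, exclude=["PS1", r"LMOD_(.*)"], include=["LMOD_CMD"])
print(sorted(cleaned))  # ['LMOD_CMD', 'PATH'] -- include wins over exclude
```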

View File

@@ -8,14 +8,10 @@
import shlex
import subprocess
import sys
from six import string_types, text_type
from six.moves import shlex_quote
from typing import List, Optional, Union
import llnl.util.tty as tty
import spack.error
from spack.util.path import Path, format_os_path, path_to_os_path, system_path_filter
from llnl.util.path import Path, format_os_path, path_to_os_path, system_path_filter
__all__ = ["Executable", "which", "ProcessError"]
@@ -30,13 +26,20 @@ def __init__(self, name):
# filter back to platform dependent path
self.exe = path_to_os_path(*self.exe)
self.default_env = {}
from spack.util.environment import EnvironmentModifications # no cycle
self.default_envmod = EnvironmentModifications()
self._default_envmod = None
self.returncode = None
if not self.exe:
raise ProcessError("Cannot construct executable for '%s'" % name)
raise ProcessError(f"Cannot construct executable for '{name}'")
@property
def default_envmod(self):
from llnl.util.envmod import EnvironmentModifications
if self._default_envmod is None:
self._default_envmod = EnvironmentModifications()
return self._default_envmod
@system_path_filter
def add_default_arg(self, arg):
@@ -125,6 +128,8 @@ def __call__(self, *args, **kwargs):
By default, the subprocess inherits the parent's file descriptors.
"""
from llnl.util.envmod import EnvironmentModifications
# Environment
env_arg = kwargs.get("env", None)
@@ -133,8 +138,6 @@ def __call__(self, *args, **kwargs):
self.default_envmod.apply_modifications(env)
env.update(self.default_env)
from spack.util.environment import EnvironmentModifications # no cycle
# Apply env argument
if isinstance(env_arg, EnvironmentModifications):
env_arg.apply_modifications(env)
@@ -168,7 +171,7 @@ def __call__(self, *args, **kwargs):
raise ValueError("Cannot use `str` as input stream.")
def streamify(arg, mode):
if isinstance(arg, string_types):
if isinstance(arg, str):
return open(arg, mode), True
elif arg in (str, str.split):
return subprocess.PIPE, False
@@ -213,44 +216,43 @@ def streamify(arg, mode):
result = ""
if output in (str, str.split):
if sys.platform == "win32":
outstr = text_type(out.decode("ISO-8859-1"))
outstr = str(out.decode("ISO-8859-1"))
else:
outstr = text_type(out.decode("utf-8"))
outstr = str(out.decode("utf-8"))
result += outstr
if output is str.split:
sys.stdout.write(outstr)
if error in (str, str.split):
if sys.platform == "win32":
errstr = text_type(err.decode("ISO-8859-1"))
errstr = str(err.decode("ISO-8859-1"))
else:
errstr = text_type(err.decode("utf-8"))
errstr = str(err.decode("utf-8"))
result += errstr
if error is str.split:
sys.stderr.write(errstr)
rc = self.returncode = proc.returncode
if fail_on_error and rc != 0 and (rc not in ignore_errors):
long_msg = cmd_line_string
msg = f"Command {cmd_line_string} exited with status {proc.returncode}."
if result:
# If the output is not captured in the result, it will have
# been stored either in the specified files (e.g. if
# 'output' specifies a file) or written to the parent's
# stdout/stderr (e.g. if 'output' is not specified)
long_msg += "\n" + result
msg += "\n" + result
raise ProcessError("Command exited with status %d:" % proc.returncode, long_msg)
raise ProcessError(msg)
return result
except OSError as e:
raise ProcessError("%s: %s" % (self.exe[0], e.strerror), "Command: " + cmd_line_string)
raise ProcessError(f"{self.exe[0]}: {e.strerror}\n Command: '{cmd_line_string}'")
except subprocess.CalledProcessError as e:
if fail_on_error:
raise ProcessError(
str(e),
"\nExit status %d when invoking command: %s"
% (proc.returncode, cmd_line_string),
f"{str(e)}\n"
f" Exit status {proc.returncode} when invoking command: '{cmd_line_string}'"
)
finally:
@@ -278,16 +280,31 @@ def __str__(self):
@system_path_filter
def which_string(*args, **kwargs):
"""Like ``which()``, but return a string instead of an ``Executable``."""
path = kwargs.get("path", os.environ.get("PATH", ""))
required = kwargs.get("required", False)
def which_string(*args: str, path: Optional[Union[List[str], str]] = None, required: bool = False):
"""Finds an executable in the path like command-line which.
if isinstance(path, string_types):
If given multiple executables, returns the first one that is found.
If no executables are found, returns None.
Parameters:
*args: One or more executables to search for
Keyword Arguments:
path: colon-separated (semicolon-separated on Windows) string or list of
paths to search. Defaults to ``os.environ["PATH"]``
required: If set to ``True``, raise an error if executable not found
Returns:
Absolute path of the first executable found.
"""
if path is None:
path = os.environ.get("PATH") or ""
if isinstance(path, str):
path = path.split(os.pathsep)
for name in args:
win_candidates = []
win_candidates: List[str] = []
if sys.platform == "win32" and (not name.endswith(".exe") and not name.endswith(".bat")):
win_candidates = [name + ext for ext in [".exe", ".bat"]]
candidate_names = [name] if not win_candidates else win_candidates
@@ -312,19 +329,19 @@ def which_string(*args, **kwargs):
return exe
if required:
raise CommandNotFoundError("spack requires '%s'. Make sure it is in your path." % args[0])
raise CommandNotFoundError(args[0])
return None
def which(*args, **kwargs):
def which(*args: str, path: Optional[Union[List[str], str]] = None, required: bool = False):
"""Finds an executable in the path like command-line which.
If given multiple executables, returns the first one that is found.
If no executables are found, returns None.
Parameters:
*args (str): One or more executables to search for
*args: One or more executables to search for
Keyword Arguments:
path (list or str): The path to search. Defaults to ``PATH``
@@ -333,13 +350,17 @@ def which(*args, **kwargs):
Returns:
Executable: The first executable that is found in the path
"""
exe = which_string(*args, **kwargs)
return Executable(shlex_quote(exe)) if exe else None
exe = which_string(*args, path=path, required=required)
return Executable(shlex.quote(exe)) if exe else None
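Hypothetical usage after the refactor, assuming the module lands at `llnl.util.executable`: the first executable found wins, and the result can be invoked directly.

```python
from llnl.util.executable import which, which_string  # assumed import path

tar = which("gtar", "tar")  # returns None if neither is found
if tar:
    tar("--version")        # Executable objects are callable

print(which_string("tar", path=["/usr/bin", "/bin"]))
```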
class ProcessError(spack.error.SpackError):
class ProcessError(Exception):
"""ProcessErrors are raised when Executables exit with an error code."""
class CommandNotFoundError(spack.error.SpackError):
class CommandNotFoundError(Exception):
"""Raised when ``which()`` can't find a required executable."""
def __init__(self, command: str):
super().__init__(f"Couldn't find command '{command}'. Make sure it is in your path.")
self.command = command

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import collections.abc
import errno
import glob
import hashlib
@@ -17,16 +18,12 @@
from contextlib import contextmanager
from sys import platform as _platform
import six
from llnl.util import tty
from llnl.util.compat import Sequence
from llnl.util.executable import CommandNotFoundError, Executable, which
from llnl.util.lang import dedupe, memoized
from llnl.util.path import path_to_os_path, system_path_filter
from llnl.util.symlink import islink, symlink
from spack.util.executable import CommandNotFoundError, Executable, which
from spack.util.path import path_to_os_path, system_path_filter
is_windows = _platform == "win32"
if not is_windows:
@@ -101,7 +98,9 @@ def getuid():
def rename(src, dst):
# On Windows, os.rename will fail if the destination file already exists
if is_windows:
if os.path.exists(dst):
# Windows path existence checks will sometimes fail on junctions/links/symlinks
# so check for that case
if os.path.exists(dst) or os.path.islink(dst):
os.remove(dst)
os.rename(src, dst)
@@ -290,9 +289,10 @@ def groupid_to_group(x):
shutil.copy(filename, tmp_filename)
try:
extra_kwargs = {}
if sys.version_info > (3, 0):
extra_kwargs = {"errors": "surrogateescape"}
# To avoid translating line endings (\n to \r\n and vice versa)
# we force os.open to ignore translations and use the line endings
# the file comes with
extra_kwargs = {"errors": "surrogateescape", "newline": ""}
# Open as a text file and filter until the end of the file is
# reached or we found a marker in the line if it was specified
@@ -522,7 +522,7 @@ def chgrp(path, group, follow_symlinks=True):
if is_windows:
raise OSError("Function 'chgrp' is not supported on Windows")
if isinstance(group, six.string_types):
if isinstance(group, str):
gid = grp.getgrnam(group).gr_gid
else:
gid = group
@@ -1000,45 +1000,16 @@ def hash_directory(directory, ignore=[]):
return md5_hash.hexdigest()
def _try_unlink(path):
try:
os.unlink(path)
except (IOError, OSError):
# But if that fails, that's OK.
pass
@contextmanager
@system_path_filter
def write_tmp_and_move(path, mode="w"):
"""Write to a temporary file in the same directory, then move into place."""
# Rely on NamedTemporaryFile to give a unique file without races
# in the directory of the target file.
file = tempfile.NamedTemporaryFile(
prefix="." + os.path.basename(path),
suffix=".tmp",
dir=os.path.dirname(path),
mode=mode,
delete=False, # we delete it ourselves
)
tmp_path = file.name
try:
yield file
except BaseException:
# On any failure, try to remove the temporary file.
_try_unlink(tmp_path)
raise
finally:
# Always close the file descriptor
file.close()
# Atomically move into existence.
try:
os.rename(tmp_path, path)
except (IOError, OSError):
_try_unlink(tmp_path)
raise
def write_tmp_and_move(filename):
"""Write to a temporary file, then move into place."""
dirname = os.path.dirname(filename)
basename = os.path.basename(filename)
tmp = os.path.join(dirname, ".%s.tmp" % basename)
with open(tmp, "w") as f:
yield f
shutil.move(tmp, filename)
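Hypothetical usage of the simplified `write_tmp_and_move` (the import path is an assumption; the function is used as a context manager): the content lands at the final path only after the block completes.

```python
import os
from llnl.util.filesystem import write_tmp_and_move  # assumed import path

os.makedirs("/tmp/demo", exist_ok=True)
with write_tmp_and_move("/tmp/demo/config.yaml") as f:
    f.write("key: value\n")
# '.config.yaml.tmp' was written first, then moved into place
```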
@contextmanager
@@ -1048,7 +1019,7 @@ def open_if_filename(str_or_file, mode="r"):
If it's a file object, just yields the file object.
"""
if isinstance(str_or_file, six.string_types):
if isinstance(str_or_file, str):
with open(str_or_file, mode) as f:
yield f
else:
@@ -1338,46 +1309,34 @@ def visit_directory_tree(root, visitor, rel_path="", depth=0):
depth (int): current depth from the root
"""
dir = os.path.join(root, rel_path)
if sys.version_info >= (3, 5, 0):
dir_entries = sorted(os.scandir(dir), key=lambda d: d.name) # novermin
else:
dir_entries = os.listdir(dir)
dir_entries.sort()
dir_entries = sorted(os.scandir(dir), key=lambda d: d.name)
for f in dir_entries:
if sys.version_info >= (3, 5, 0):
rel_child = os.path.join(rel_path, f.name)
islink = f.is_symlink()
# On Windows, symlinks to directories are distinct from
# symlinks to files, and it is possible to create a
# broken symlink to a directory (e.g. using os.symlink
# without `target_is_directory=True`), invoking `isdir`
# on a symlink on Windows that is broken in this manner
# will result in an error. In this case we can work around
# the issue by reading the target and resolving the
# directory ourselves
try:
isdir = f.is_dir()
except OSError as e:
if is_windows and hasattr(e, "winerror") and e.winerror == 5 and islink:
# if path is a symlink, determine destination and
# evaluate file vs directory
link_target = resolve_link_target_relative_to_the_link(f)
# link_target might be relative but
# resolve_link_target_relative_to_the_link
# will ensure that if so, that it is relative
# to the CWD and therefore
# makes sense
isdir = os.path.isdir(link_target)
else:
raise e
else:
rel_child = os.path.join(rel_path, f)
lexists, islink, isdir = lexists_islink_isdir(os.path.join(dir, f))
if not lexists:
continue
rel_child = os.path.join(rel_path, f.name)
islink = f.is_symlink()
# On Windows, symlinks to directories are distinct from
# symlinks to files, and it is possible to create a
# broken symlink to a directory (e.g. using os.symlink
# without `target_is_directory=True`), invoking `isdir`
# on a symlink on Windows that is broken in this manner
# will result in an error. In this case we can work around
# the issue by reading the target and resolving the
# directory ourselves
try:
isdir = f.is_dir()
except OSError as e:
if is_windows and hasattr(e, "winerror") and e.winerror == 5 and islink:
# if path is a symlink, determine destination and
# evaluate file vs directory
link_target = resolve_link_target_relative_to_the_link(f)
# link_target might be relative but
# resolve_link_target_relative_to_the_link
# will ensure that if so, that it is relative
# to the CWD and therefore
# makes sense
isdir = os.path.isdir(link_target)
else:
raise e
if not isdir and not islink:
# handle non-symlink files
@@ -1638,14 +1597,14 @@ def find(root, files, recursive=True):
Parameters:
root (str): The root directory to start searching from
files (str or Sequence): Library name(s) to search for
files (str or collections.abc.Sequence): Library name(s) to search for
recursive (bool): if False search only root folder,
if True descends top-down from the root. Defaults to True.
Returns:
list: The files that have been found
"""
if isinstance(files, six.string_types):
if isinstance(files, str):
files = [files]
if recursive:
@@ -1702,14 +1661,14 @@ def _find_non_recursive(root, search_files):
# Utilities for libraries and headers
class FileList(Sequence):
class FileList(collections.abc.Sequence):
"""Sequence of absolute paths to files.
Provides a few convenience methods to manipulate file paths.
"""
def __init__(self, files):
if isinstance(files, six.string_types):
if isinstance(files, str):
files = [files]
self.files = list(dedupe(files))
@@ -1805,7 +1764,7 @@ def directories(self):
def directories(self, value):
value = value or []
# Accept a single directory as input
if isinstance(value, six.string_types):
if isinstance(value, str):
value = [value]
self._directories = [path_to_os_path(os.path.normpath(x))[0] for x in value]
@@ -1941,9 +1900,9 @@ def find_headers(headers, root, recursive=False):
Returns:
HeaderList: The headers that have been found
"""
if isinstance(headers, six.string_types):
if isinstance(headers, str):
headers = [headers]
elif not isinstance(headers, Sequence):
elif not isinstance(headers, collections.abc.Sequence):
message = "{0} expects a string or sequence of strings as the "
message += "first argument [got {1} instead]"
message = message.format(find_headers.__name__, type(headers))
@@ -2107,9 +2066,9 @@ def find_system_libraries(libraries, shared=True):
Returns:
LibraryList: The libraries that have been found
"""
if isinstance(libraries, six.string_types):
if isinstance(libraries, str):
libraries = [libraries]
elif not isinstance(libraries, Sequence):
elif not isinstance(libraries, collections.abc.Sequence):
message = "{0} expects a string or sequence of strings as the "
message += "first argument [got {1} instead]"
message = message.format(find_system_libraries.__name__, type(libraries))
@@ -2164,9 +2123,9 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
Returns:
LibraryList: The libraries that have been found
"""
if isinstance(libraries, six.string_types):
if isinstance(libraries, str):
libraries = [libraries]
elif not isinstance(libraries, Sequence):
elif not isinstance(libraries, collections.abc.Sequence):
message = "{0} expects a string or sequence of strings as the "
message += "first argument [got {1} instead]"
message = message.format(find_libraries.__name__, type(libraries))
@@ -2323,10 +2282,17 @@ def add_rpath(self, *paths):
"""
self._addl_rpaths = self._addl_rpaths | set(paths)
def _link(self, path, dest):
def _link(self, path, dest_dir):
"""Perform link step of simulated rpathing, installing
simlinks of file in path to the dest_dir
location. This method deliberately prevents
the case where a path points to a file inside the dest_dir.
This is because it is both meaningless from an rpath
perspective, and will cause an error when Developer
mode is not enabled"""
file_name = os.path.basename(path)
dest_file = os.path.join(dest, file_name)
if os.path.exists(dest):
dest_file = os.path.join(dest_dir, file_name)
if os.path.exists(dest_dir) and not dest_file == path:
try:
symlink(path, dest_file)
# For py2 compatibility, we have to catch the specific Windows error code
@@ -2340,7 +2306,7 @@ def _link(self, path, dest):
"Linking library %s to %s failed, " % (path, dest_file) + "already linked."
if already_linked
else "library with name %s already exists at location %s."
% (file_name, dest)
% (file_name, dest_dir)
)
pass
else:


@@ -5,9 +5,11 @@
from __future__ import division
import collections.abc
import contextlib
import functools
import inspect
import itertools
import os
import re
import sys
@@ -15,11 +17,6 @@
from datetime import datetime, timedelta
from typing import Any, Callable, Iterable, List, Tuple
import six
from six import string_types
from llnl.util.compat import MutableMapping, MutableSequence, zip_longest
# Ignore emacs backups when listing modules
ignore_modules = [r"^\.#", "~$"]
@@ -200,14 +197,9 @@ def _memoized_function(*args, **kwargs):
return ret
except TypeError as e:
# TypeError is raised when indexing into a dict if the key is unhashable.
raise six.raise_from(
UnhashableArguments(
"args + kwargs '{}' was not hashable for function '{}'".format(
key, func.__name__
),
),
e,
)
raise UnhashableArguments(
"args + kwargs '{}' was not hashable for function '{}'".format(key, func.__name__),
) from e
return _memoized_function
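The `six.raise_from` indirection collapses into native `raise ... from` syntax; a minimal sketch of the equivalent chaining, assuming `UnhashableArguments` from this module is in scope:

```python
try:
    {}[["not", "hashable"]]  # lists cannot be dict keys: TypeError
except TypeError as e:
    # "from e" preserves the original exception as __cause__.
    raise UnhashableArguments("args + kwargs were not hashable") from e
```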
@@ -312,7 +304,7 @@ def lazy_eq(lseq, rseq):
# zip_longest is implemented in native code, so use it for speed.
# use zip_longest instead of zip because it allows us to tell
# which iterator was longer.
for left, right in zip_longest(liter, riter, fillvalue=done):
for left, right in itertools.zip_longest(liter, riter, fillvalue=done):
if (left is done) or (right is done):
return False
@@ -332,7 +324,7 @@ def lazy_lt(lseq, rseq):
liter = lseq()
riter = rseq()
for left, right in zip_longest(liter, riter, fillvalue=done):
for left, right in itertools.zip_longest(liter, riter, fillvalue=done):
if (left is done) or (right is done):
return left is done # left was shorter than right
@@ -482,7 +474,7 @@ def add_func_to_class(name, func):
@lazy_lexicographic_ordering
class HashableMap(MutableMapping):
class HashableMap(collections.abc.MutableMapping):
"""This is a hashable, comparable dictionary. Hash is performed on
a tuple of the values in the dictionary."""
@@ -574,7 +566,7 @@ def match_predicate(*args):
def match(string):
for arg in args:
if isinstance(arg, string_types):
if isinstance(arg, str):
if re.search(arg, string):
return True
elif isinstance(arg, list) or isinstance(arg, tuple):
@@ -749,6 +741,18 @@ def _n_xxx_ago(x):
raise ValueError(msg)
def pretty_seconds_formatter(seconds):
if seconds >= 1:
multiplier, unit = 1, "s"
elif seconds >= 1e-3:
multiplier, unit = 1e3, "ms"
elif seconds >= 1e-6:
multiplier, unit = 1e6, "us"
else:
multiplier, unit = 1e9, "ns"
return lambda s: "%.3f%s" % (multiplier * s, unit)
def pretty_seconds(seconds):
"""Seconds to string with appropriate units
@@ -758,15 +762,7 @@ def pretty_seconds(seconds):
Returns:
str: Time string with units
"""
if seconds >= 1:
value, unit = seconds, "s"
elif seconds >= 1e-3:
value, unit = seconds * 1e3, "ms"
elif seconds >= 1e-6:
value, unit = seconds * 1e6, "us"
else:
value, unit = seconds * 1e9, "ns"
return "%.3f%s" % (value, unit)
return pretty_seconds_formatter(seconds)(seconds)
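Splitting unit selection (`pretty_seconds_formatter`) from formatting lets a caller pick a unit once and reuse it for many values of similar magnitude; a small sketch assuming both functions are in scope:

```python
fmt = pretty_seconds_formatter(0.004)  # selects milliseconds
print(fmt(0.004), fmt(0.0072))  # 4.000ms 7.200ms

print(pretty_seconds(90))    # 90.000s
print(pretty_seconds(2e-7))  # 200.000ns
```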
class RequiredAttributeError(ValueError):
@@ -887,32 +883,28 @@ def load_module_from_file(module_name, module_path):
ImportError: when the module can't be loaded
FileNotFoundError: when module_path doesn't exist
"""
import importlib.util
if module_name in sys.modules:
return sys.modules[module_name]
# This recipe is adapted from https://stackoverflow.com/a/67692/771663
if sys.version_info[0] == 3 and sys.version_info[1] >= 5:
import importlib.util
spec = importlib.util.spec_from_file_location(module_name, module_path) # novm
module = importlib.util.module_from_spec(spec) # novm
# The module object needs to exist in sys.modules before the
# loader executes the module code.
#
# See https://docs.python.org/3/reference/import.html#loading
sys.modules[spec.name] = module
spec = importlib.util.spec_from_file_location(module_name, module_path)
module = importlib.util.module_from_spec(spec)
# The module object needs to exist in sys.modules before the
# loader executes the module code.
#
# See https://docs.python.org/3/reference/import.html#loading
sys.modules[spec.name] = module
try:
spec.loader.exec_module(module)
except BaseException:
try:
spec.loader.exec_module(module)
except BaseException:
try:
del sys.modules[spec.name]
except KeyError:
pass
raise
elif sys.version_info[0] == 2:
import imp
module = imp.load_source(module_name, module_path)
del sys.modules[spec.name]
except KeyError:
pass
raise
return module
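Call sites are unaffected by dropping the Python 2 branch; a hedged sketch with a hypothetical module path:

```python
# The module is registered in sys.modules before its code executes, so
# a second call (or a recursive import) returns the cached module object.
mod = load_module_from_file("external_pkg", "/path/to/package.py")
assert mod is load_module_from_file("external_pkg", "/path/to/package.py")
```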
@@ -998,10 +990,9 @@ def enum(**kwargs):
def stable_partition(
input_iterable, # type: Iterable
predicate_fn, # type: Callable[[Any], bool]
):
# type: (...) -> Tuple[List[Any], List[Any]]
input_iterable: Iterable,
predicate_fn: Callable[[Any], bool],
) -> Tuple[List[Any], List[Any]]:
"""Partition the input iterable according to a custom predicate.
Args:
@@ -1030,7 +1021,7 @@ def ensure_last(lst, *elements):
lst.append(lst.pop(lst.index(elt)))
class TypedMutableSequence(MutableSequence):
class TypedMutableSequence(collections.abc.MutableSequence):
"""Base class that behaves like a list, just with a different type.
Client code can inherit from this base class:
@@ -1073,23 +1064,20 @@ class GroupedExceptionHandler(object):
"""A generic mechanism to coalesce multiple exceptions and preserve tracebacks."""
def __init__(self):
self.exceptions = [] # type: List[Tuple[str, Exception, List[str]]]
self.exceptions: List[Tuple[str, Exception, List[str]]] = []
def __bool__(self):
"""Whether any exceptions were handled."""
return bool(self.exceptions)
def forward(self, context):
# type: (str) -> GroupedExceptionForwarder
def forward(self, context: str) -> "GroupedExceptionForwarder":
"""Return a contextmanager which extracts tracebacks and prefixes a message."""
return GroupedExceptionForwarder(context, self)
def _receive_forwarded(self, context, exc, tb):
# type: (str, Exception, List[str]) -> None
def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
self.exceptions.append((context, exc, tb))
def grouped_message(self, with_tracebacks=True):
# type: (bool) -> str
def grouped_message(self, with_tracebacks: bool = True) -> str:
"""Print out an error message coalescing all the forwarded errors."""
each_exception_message = [
"{0} raised {1}: {2}{3}".format(
@@ -1107,8 +1095,7 @@ class GroupedExceptionForwarder(object):
"""A contextmanager to capture exceptions and forward them to a
GroupedExceptionHandler."""
def __init__(self, context, handler):
# type: (str, GroupedExceptionHandler) -> None
def __init__(self, context: str, handler: GroupedExceptionHandler):
self._context = context
self._handler = handler
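For orientation, a sketch of how the handler and forwarder compose; this assumes, per the docstrings above, that the forwarder records the exception and suppresses it inside the `with` block (its `__exit__` is not shown in this hunk):

```python
handler = GroupedExceptionHandler()

for context in ("fetch", "install"):
    with handler.forward(context):
        raise RuntimeError("simulated failure in " + context)

if handler:  # truthy because exceptions were captured
    print(handler.grouped_message(with_tracebacks=False))
```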


@@ -9,13 +9,11 @@
import sys
import time
from datetime import datetime
from typing import Dict, Tuple # novm
import llnl.util.string
import llnl.util.tty as tty
from llnl.util.lang import pretty_seconds
import spack.util.string
if sys.platform != "win32":
import fcntl
@@ -81,7 +79,7 @@ class OpenFileTracker(object):
def __init__(self):
"""Create a new ``OpenFileTracker``."""
self._descriptors = {} # type: Dict[Tuple[int, int], OpenFile]
self._descriptors = {}
def get_fh(self, path):
"""Get a filehandle for a lockfile.
@@ -103,7 +101,7 @@ def get_fh(self, path):
try:
# see whether we've seen this inode/pid before
stat = os.stat(path)
key = (stat.st_ino, pid)
key = (stat.st_dev, stat.st_ino, pid)
open_file = self._descriptors.get(key)
except OSError as e:
@@ -129,32 +127,32 @@ def get_fh(self, path):
# if we just created the file, we'll need to get its inode here
if not stat:
inode = os.fstat(fd).st_ino
key = (inode, pid)
stat = os.fstat(fd)
key = (stat.st_dev, stat.st_ino, pid)
self._descriptors[key] = open_file
open_file.refs += 1
return open_file.fh
def release_fh(self, path):
"""Release a filehandle, only closing it if there are no more references."""
try:
inode = os.stat(path).st_ino
except OSError as e:
if e.errno != errno.ENOENT: # only handle file not found
raise
inode = None # this will not be in self._descriptors
key = (inode, os.getpid())
def release_by_stat(self, stat):
key = (stat.st_dev, stat.st_ino, os.getpid())
open_file = self._descriptors.get(key)
assert open_file, "Attempted to close non-existing lock path: %s" % path
assert open_file, "Attempted to close non-existing inode: %s" % stat.st_inode
open_file.refs -= 1
if not open_file.refs:
del self._descriptors[key]
open_file.fh.close()
def release_by_fh(self, fh):
self.release_by_stat(os.fstat(fh.fileno()))
def purge(self):
for key in list(self._descriptors.keys()):
self._descriptors[key].fh.close()
del self._descriptors[key]
#: Open file descriptors for locks in this process. Used to prevent one process
#: from opening the same file many times for different byte range locks
@@ -166,7 +164,7 @@ def _attempts_str(wait_time, nattempts):
if nattempts <= 1:
return ""
attempts = spack.util.string.plural(nattempts, "attempt")
attempts = llnl.util.string.plural(nattempts, "attempt")
return " after {} and {}".format(pretty_seconds(wait_time), attempts)
@@ -432,8 +430,7 @@ def _unlock(self):
"""
fcntl.lockf(self._file, fcntl.LOCK_UN, self._length, self._start, os.SEEK_SET)
file_tracker.release_fh(self.path)
file_tracker.release_by_fh(self._file)
self._file = None
self._reads = 0
self._writes = 0
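The substantive fix in this file is the keying change: inode numbers are unique only within a single filesystem, so two lock files on different mounts could previously collide in the tracker. A minimal sketch of the new key (helper name hypothetical):

```python
import os

def lock_key(path):
    # st_dev disambiguates equal inode numbers on different mounts.
    st = os.stat(path)
    return (st.st_dev, st.st_ino, os.getpid())
```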

lib/spack/llnl/util/path.py (new file)

@@ -0,0 +1,159 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Utilities for managing Linux and Windows paths."""
# TODO: look at using pathlib since we now only support Python 3
import os
import re
import sys
from urllib.parse import urlparse
is_windows = sys.platform == "win32"
def is_path_url(path):
if "\\" in path:
return False
url_tuple = urlparse(path)
return bool(url_tuple.scheme) and len(url_tuple.scheme) > 1
def win_exe_ext():
return ".exe"
def path_to_os_path(*pths):
"""
Takes an arbitrary number of positional parameters,
converts each argument of type string to use a normalized
filepath separator, and returns a list of all values
"""
ret_pths = []
for pth in pths:
if isinstance(pth, str) and not is_path_url(pth):
pth = convert_to_platform_path(pth)
ret_pths.append(pth)
return ret_pths
def sanitize_file_path(pth):
"""
Formats strings to contain only characters that can
be used to generate legal file paths.
Criteria for legal files based on
https://en.wikipedia.org/wiki/Filename#Comparison_of_filename_limitations
Args:
pth: string containing path to be created
on the host filesystem
Return:
sanitized string that can legally be made into a path
"""
# on unix, splitting path by separators will remove
# instances of illegal characters on join
pth_cmpnts = pth.split(os.path.sep)
if is_windows:
drive_match = r"[a-zA-Z]:"
is_abs = bool(re.match(drive_match, pth_cmpnts[0]))
drive = pth_cmpnts[0] + os.path.sep if is_abs else ""
pth_cmpnts = pth_cmpnts[1:] if drive else pth_cmpnts
illegal_chars = r'[<>?:"|*\\]'
else:
drive = "/" if not pth_cmpnts[0] else ""
illegal_chars = r"[/]"
pth = []
for cmp in pth_cmpnts:
san_cmp = re.sub(illegal_chars, "", cmp)
pth.append(san_cmp)
return drive + os.path.join(*pth)
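Two hedged examples of what these rules produce; the expected outputs are reasoned from the code above, not captured from a run:

```python
# Windows: <>?:"|* and stray backslashes are stripped per component,
# but a leading drive letter survives.
sanitize_file_path(r"C:\spack\bad<name>.txt")  # -> C:\spack\badname.txt

# POSIX: only "/" is illegal inside a component, so most names pass.
sanitize_file_path("/spack/ok-name.txt")       # -> /spack/ok-name.txt
```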
def system_path_filter(_func=None, arg_slice=None):
"""
Filters function arguments to account for platform path separators.
Optional slicing range can be specified to select specific arguments
This decorator takes all (or a slice) of a method's positional arguments
and normalizes usage of filepath separators on a per platform basis.
Note: **kwargs, URLs, and any type that is not a string are ignored;
where path normalization is required for such values, call
path_to_os_path directly as needed.
Parameters:
arg_slice (slice): a slice object specifying the slice of arguments
in the decorated method over which filepath separators are
normalized
"""
from functools import wraps
def holder_func(func):
@wraps(func)
def path_filter_caller(*args, **kwargs):
args = list(args)
if arg_slice:
args[arg_slice] = path_to_os_path(*args[arg_slice])
else:
args = path_to_os_path(*args)
return func(*args, **kwargs)
return path_filter_caller
if _func:
return holder_func(_func)
return holder_func
class Path:
"""
Describes the filepath separator types in an enum style,
with a helper attribute exposing the path type of the current platform.
"""
unix = 0
windows = 1
platform_path = windows if is_windows else unix
def format_os_path(path, mode=Path.unix):
"""
Format path to use consistent, platform specific
separators. Absolute paths are converted between
drive letters and a prepended '/' as per platform
requirement.
Parameters:
path (str): the path to be normalized, must be a string
or expose the replace method.
mode (Path): the path separator style to normalize the
passed path to. Default is unix style, i.e. '/'
"""
if not path:
return path
if mode == Path.windows:
path = path.replace("/", "\\")
else:
path = path.replace("\\", "/")
return path
def convert_to_posix_path(path):
return format_os_path(path, mode=Path.unix)
def convert_to_windows_path(path):
return format_os_path(path, mode=Path.windows)
def convert_to_platform_path(path):
return format_os_path(path, mode=Path.platform_path)
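A usage sketch of the decorator and converters defined above; the decorated function is hypothetical:

```python
@system_path_filter
def first_line(path):
    # On Windows the argument arrives with backslash separators; on
    # other platforms it passes through untouched.
    with open(path) as f:
        return f.readline()

print(convert_to_posix_path("C:\\spack\\opt"))  # C:/spack/opt
print(convert_to_windows_path("/spack/opt"))    # \spack\opt
```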


@@ -23,8 +23,11 @@ def symlink(real_path, link_path):
On Windows, use junctions if os.symlink fails.
"""
if not is_windows or _win32_can_symlink():
if not is_windows:
os.symlink(real_path, link_path)
elif _win32_can_symlink():
# Windows requires target_is_directory=True when the target is a dir.
os.symlink(real_path, link_path, target_is_directory=os.path.isdir(real_path))
else:
try:
# Try to use junctions
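On Windows, links to directories and links to files are distinct objects, which is why the flag is now computed from the link target; a sketch under the assumption that the paths exist:

```python
import os

target = r"C:\spack\opt\zlib"
link = r"C:\spack\view\zlib"
# Without target_is_directory=True, Windows creates a file-style
# symlink that is broken when the target is a directory.
os.symlink(target, link, target_is_directory=os.path.isdir(target))
```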


@@ -6,6 +6,7 @@
from __future__ import unicode_literals
import contextlib
import io
import os
import struct
import sys
@@ -14,10 +15,6 @@
from datetime import datetime
from sys import platform as _platform
import six
from six import StringIO
from six.moves import input
if _platform != "win32":
import fcntl
import termios
@@ -183,7 +180,7 @@ def msg(message, *args, **kwargs):
else:
cwrite("@*b{%s==>} %s%s" % (st_text, get_timestamp(), cescape(_output_filter(message))))
for arg in args:
print(indent + _output_filter(six.text_type(arg)))
print(indent + _output_filter(str(arg)))
def info(message, *args, **kwargs):
@@ -201,13 +198,13 @@ def info(message, *args, **kwargs):
st_text = process_stacktrace(st_countback)
cprint(
"@%s{%s==>} %s%s"
% (format, st_text, get_timestamp(), cescape(_output_filter(six.text_type(message)))),
% (format, st_text, get_timestamp(), cescape(_output_filter(str(message)))),
stream=stream,
)
for arg in args:
if wrap:
lines = textwrap.wrap(
_output_filter(six.text_type(arg)),
_output_filter(str(arg)),
initial_indent=indent,
subsequent_indent=indent,
break_long_words=break_long_words,
@@ -215,7 +212,7 @@ def info(message, *args, **kwargs):
for line in lines:
stream.write(line + "\n")
else:
stream.write(indent + _output_filter(six.text_type(arg)) + "\n")
stream.write(indent + _output_filter(str(arg)) + "\n")
def verbose(message, *args, **kwargs):
@@ -238,7 +235,7 @@ def error(message, *args, **kwargs):
kwargs.setdefault("format", "*r")
kwargs.setdefault("stream", sys.stderr)
info("Error: " + six.text_type(message), *args, **kwargs)
info("Error: " + str(message), *args, **kwargs)
def warn(message, *args, **kwargs):
@@ -247,7 +244,7 @@ def warn(message, *args, **kwargs):
kwargs.setdefault("format", "*Y")
kwargs.setdefault("stream", sys.stderr)
info("Warning: " + six.text_type(message), *args, **kwargs)
info("Warning: " + str(message), *args, **kwargs)
def die(message, *args, **kwargs):
@@ -271,7 +268,7 @@ def get_number(prompt, **kwargs):
while number is None:
msg(prompt, newline=False)
ans = input()
if ans == six.text_type(abort):
if ans == str(abort):
return None
if ans:
@@ -336,11 +333,11 @@ def hline(label=None, **kwargs):
cols -= 2
cols = min(max_width, cols)
label = six.text_type(label)
label = str(label)
prefix = char * 2 + " "
suffix = " " + (cols - len(prefix) - clen(label)) * char
out = StringIO()
out = io.StringIO()
out.write(prefix)
out.write(label)
out.write(suffix)
@@ -372,10 +369,5 @@ def ioctl_gwinsz(fd):
return int(rc[0]), int(rc[1])
else:
if sys.version_info[0] < 3:
raise RuntimeError(
"Terminal size not obtainable on Windows with a\
Python version older than 3"
)
rc = (os.environ.get("LINES", 25), os.environ.get("COLUMNS", 80))
return int(rc[0]), int(rc[1])


@@ -8,11 +8,10 @@
"""
from __future__ import division, unicode_literals
import io
import os
import sys
from six import StringIO, text_type
from llnl.util.tty import terminal_size
from llnl.util.tty.color import cextra, clen
@@ -134,7 +133,7 @@ def colify(elts, **options):
)
# elts needs to be an array of strings so we can count the elements
elts = [text_type(elt) for elt in elts]
elts = [str(elt) for elt in elts]
if not elts:
return (0, ())
@@ -232,7 +231,7 @@ def transpose():
def colified(elts, **options):
"""Invokes the ``colify()`` function but returns the result as a string
instead of writing it to an output string."""
sio = StringIO()
sio = io.StringIO()
options["output"] = sio
colify(elts, **options)
return sio.getvalue()


@@ -65,8 +65,6 @@
import sys
from contextlib import contextmanager
import six
class ColorParseError(Exception):
"""Raised when a color format fails to parse."""
@@ -259,7 +257,7 @@ def cescape(string):
Returns:
(str): the string with color codes escaped
"""
string = six.text_type(string)
string = str(string)
string = string.replace("@", "@@")
string = string.replace("}", "}}")
return string


@@ -21,14 +21,12 @@
import traceback
from contextlib import contextmanager
from threading import Thread
from types import ModuleType # novm
from typing import Optional # novm
from six import StringIO, string_types
from types import ModuleType
from typing import Optional
import llnl.util.tty as tty
termios = None # type: Optional[ModuleType]
termios: Optional[ModuleType] = None
try:
import termios as term_mod
@@ -241,8 +239,7 @@ def __exit__(self, exc_type, exception, traceback):
"""If termios was available, restore old settings."""
if self.old_cfg:
self._restore_default_terminal_settings()
if sys.version_info >= (3,):
atexit.unregister(self._restore_default_terminal_settings)
atexit.unregister(self._restore_default_terminal_settings)
# restore SIGSTP and SIGCONT handlers
if self.old_handlers:
@@ -309,7 +306,7 @@ def __init__(self, file_like):
self.file_like = file_like
if isinstance(file_like, string_types):
if isinstance(file_like, str):
self.open = True
elif _file_descriptors_work(file_like):
self.open = False
@@ -323,12 +320,9 @@ def __init__(self, file_like):
def unwrap(self):
if self.open:
if self.file_like:
if sys.version_info < (3,):
self.file = open(self.file_like, "w")
else:
self.file = open(self.file_like, "w", encoding="utf-8") # novm
self.file = open(self.file_like, "w", encoding="utf-8")
else:
self.file = StringIO()
self.file = io.StringIO()
return self.file
else:
# We were handed an already-open file object. In this case we also
@@ -699,13 +693,10 @@ def __init__(self, sys_attr):
self.sys_attr = sys_attr
self.saved_stream = None
if sys.platform.startswith("win32"):
if sys.version_info < (3, 5):
libc = ctypes.CDLL(ctypes.util.find_library("c"))
if hasattr(sys, "gettotalrefcount"): # debug build
libc = ctypes.CDLL("ucrtbased")
else:
if hasattr(sys, "gettotalrefcount"): # debug build
libc = ctypes.CDLL("ucrtbased")
else:
libc = ctypes.CDLL("api-ms-win-crt-stdio-l1-1-0")
libc = ctypes.CDLL("api-ms-win-crt-stdio-l1-1-0")
kernel32 = ctypes.WinDLL("kernel32")
@@ -794,7 +785,7 @@ def __enter__(self):
raise RuntimeError("file argument must be set by __init__ ")
# Open both write and reading on logfile
if type(self.logfile) == StringIO:
if type(self.logfile) == io.StringIO:
self._ioflag = True
# cannot have two streams on tempfile, so we must make our own
sys.stdout = self.logfile
@@ -927,13 +918,10 @@ def _writer_daemon(
if sys.version_info < (3, 8) or sys.platform != "darwin":
os.close(write_fd)
# Use line buffering (3rd param = 1) since Python 3 has a bug
# 1. Use line buffering (3rd param = 1) since Python 3 has a bug
# that prevents unbuffered text I/O.
if sys.version_info < (3,):
in_pipe = os.fdopen(read_multiprocess_fd.fd, "r", 1)
else:
# Python 3.x before 3.7 does not open with UTF-8 encoding by default
in_pipe = os.fdopen(read_multiprocess_fd.fd, "r", 1, encoding="utf-8")
# 2. Python 3.x before 3.7 does not open with UTF-8 encoding by default
in_pipe = os.fdopen(read_multiprocess_fd.fd, "r", 1, encoding="utf-8")
if stdin_multiprocess_fd:
stdin = os.fdopen(stdin_multiprocess_fd.fd)
@@ -1023,7 +1011,7 @@ def _writer_daemon(
finally:
# send written data back to parent if we used a StringIO
if isinstance(log_file, StringIO):
if isinstance(log_file, io.StringIO):
control_pipe.send(log_file.getvalue())
log_file_wrapper.close()
close_connection_and_file(read_multiprocess_fd, in_pipe)


@@ -24,8 +24,7 @@
import traceback
import llnl.util.tty.log as log
from spack.util.executable import which
from llnl.util.executable import which
termios = None
try:


@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.19.0.dev0"
__version__ = "0.20.0.dev0"
spack_version = __version__


@@ -5,12 +5,12 @@
import os
from llnl.util.executable import Executable, ProcessError
from llnl.util.lang import memoized
import spack.spec
from spack.compilers.clang import Clang
from spack.spec import CompilerSpec
from spack.util.executable import Executable, ProcessError
class ABI(object):


@@ -37,15 +37,14 @@ def _search_duplicate_compilers(error_cls):
"""
import ast
import collections
import collections.abc
import inspect
import itertools
import pickle
import re
from six.moves.urllib.request import urlopen
from urllib.request import urlopen
import llnl.util.lang
from llnl.util.compat import Sequence
import spack.config
import spack.patch
@@ -81,7 +80,7 @@ def __hash__(self):
return hash(value)
class AuditClass(Sequence):
class AuditClass(collections.abc.Sequence):
def __init__(self, group, tag, description, kwargs):
"""Return an object that acts as a decorator to register functions
associated with a specific class of sanity checks.
@@ -288,7 +287,7 @@ def _check_build_test_callbacks(pkgs, error_cls):
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
test_callbacks = pkg_cls.build_time_test_callbacks
test_callbacks = getattr(pkg_cls, "build_time_test_callbacks", None)
if test_callbacks and "test" in test_callbacks:
msg = '{0} package contains "test" method in ' "build_time_test_callbacks"


@@ -9,21 +9,26 @@
import json
import multiprocessing.pool
import os
import re
import shutil
import sys
import tarfile
import tempfile
import time
import traceback
import urllib.error
import urllib.parse
import urllib.request
import warnings
from contextlib import closing
from urllib.error import HTTPError, URLError
import ruamel.yaml as yaml
from six.moves.urllib.error import HTTPError, URLError
import llnl.util.filesystem as fsys
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.executable import which
from llnl.util.filesystem import BaseDirectoryVisitor, mkdirp, visit_directory_tree
import spack.cmd
@@ -38,6 +43,7 @@
import spack.store
import spack.util.file_cache as file_cache
import spack.util.gpg
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
@@ -46,7 +52,6 @@
from spack.relocate import utf8_paths_to_single_binary_regex
from spack.spec import Spec
from spack.stage import Stage
from spack.util.executable import which
_build_cache_relative_path = "build_cache"
_build_cache_keys_relative_path = "_pgp"
@@ -266,10 +271,7 @@ def find_by_hash(self, find_hash, mirrors_to_check=None):
None, just assumes all configured mirrors.
"""
if find_hash not in self._mirrors_for_spec:
# Not found in the cached index, pull the latest from the server.
self.update(with_cooldown=True)
if find_hash not in self._mirrors_for_spec:
return None
return []
results = self._mirrors_for_spec[find_hash]
if not mirrors_to_check:
return results
@@ -345,7 +347,6 @@ def update(self, with_cooldown=False):
for cached_mirror_url in self._local_index_cache:
cache_entry = self._local_index_cache[cached_mirror_url]
cached_index_hash = cache_entry["index_hash"]
cached_index_path = cache_entry["index_path"]
if cached_mirror_url in configured_mirror_urls:
# Only do a fetch if the last fetch was longer than TTL ago
@@ -364,13 +365,14 @@ def update(self, with_cooldown=False):
# May need to fetch the index and update the local caches
try:
needs_regen = self._fetch_and_cache_index(
cached_mirror_url, expect_hash=cached_index_hash
cached_mirror_url,
cache_entry=cache_entry,
)
self._last_fetch_times[cached_mirror_url] = (now, True)
all_methods_failed = False
except FetchCacheError as fetch_error:
except FetchIndexError as e:
needs_regen = False
fetch_errors.extend(fetch_error.errors)
fetch_errors.append(e)
self._last_fetch_times[cached_mirror_url] = (now, False)
# The need to regenerate implies a need to clear as well.
spec_cache_clear_needed |= needs_regen
@@ -399,29 +401,36 @@ def update(self, with_cooldown=False):
# already have in our cache must be fetched, stored, and represented
# locally.
for mirror_url in configured_mirror_urls:
if mirror_url not in self._local_index_cache:
# Need to fetch the index and update the local caches
try:
needs_regen = self._fetch_and_cache_index(mirror_url)
self._last_fetch_times[mirror_url] = (now, True)
all_methods_failed = False
except FetchCacheError as fetch_error:
fetch_errors.extend(fetch_error.errors)
needs_regen = False
self._last_fetch_times[mirror_url] = (now, False)
# Generally speaking, a new mirror wouldn't imply the need to
# clear the spec cache, so leave it as is.
if needs_regen:
spec_cache_regenerate_needed = True
if mirror_url in self._local_index_cache:
continue
# Need to fetch the index and update the local caches
try:
needs_regen = self._fetch_and_cache_index(mirror_url)
self._last_fetch_times[mirror_url] = (now, True)
all_methods_failed = False
except FetchIndexError as e:
fetch_errors.append(e)
needs_regen = False
self._last_fetch_times[mirror_url] = (now, False)
# Generally speaking, a new mirror wouldn't imply the need to
# clear the spec cache, so leave it as is.
if needs_regen:
spec_cache_regenerate_needed = True
self._write_local_index_cache()
if all_methods_failed:
raise FetchCacheError(fetch_errors)
elif spec_cache_regenerate_needed:
if fetch_errors:
tty.warn(
"The following issues were ignored while updating the indices of binary caches",
FetchCacheError(fetch_errors),
)
if spec_cache_regenerate_needed:
self.regenerate_spec_cache(clear_existing=spec_cache_clear_needed)
def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
def _fetch_and_cache_index(self, mirror_url, cache_entry={}):
"""Fetch a buildcache index file from a remote mirror and cache it.
If we already have a cached index from this mirror, then we first
@@ -429,102 +438,50 @@ def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
Args:
mirror_url (str): Base url of mirror
expect_hash (str): If provided, this hash will be compared against
the index hash we retrieve from the mirror, to determine if we
need to fetch the index or not.
cache_entry (dict): Old cache metadata with keys ``index_hash``, ``index_path``,
``etag``
Returns:
True if this function thinks the concrete spec cache,
``_mirrors_for_spec``, should be regenerated. Returns False
otherwise.
Throws:
FetchCacheError: a composite exception.
"""
index_fetch_url = url_util.join(mirror_url, _build_cache_relative_path, "index.json")
hash_fetch_url = url_util.join(mirror_url, _build_cache_relative_path, "index.json.hash")
True if the local index.json was updated.
if not web_util.url_exists(index_fetch_url):
# A binary mirror is not required to have an index, so avoid
# raising FetchCacheError in that case.
Throws:
FetchIndexError
"""
# TODO: get rid of this request, handle 404 better
if not web_util.url_exists(
url_util.join(mirror_url, _build_cache_relative_path, "index.json")
):
return False
old_cache_key = None
fetched_hash = None
errors = []
# Fetch the hash first so we can check if we actually need to fetch
# the index itself.
try:
_, _, fs = web_util.read_from_url(hash_fetch_url)
fetched_hash = codecs.getreader("utf-8")(fs).read()
except (URLError, web_util.SpackWebError) as url_err:
errors.append(
RuntimeError(
"Unable to read index hash {0} due to {1}: {2}".format(
hash_fetch_url, url_err.__class__.__name__, str(url_err)
)
)
etag = cache_entry.get("etag", None)
if etag:
fetcher = EtagIndexFetcher(mirror_url, etag)
else:
fetcher = DefaultIndexFetcher(
mirror_url, local_hash=cache_entry.get("index_hash", None)
)
# The only case where we'll skip attempting to fetch the buildcache
# index from the mirror is when we already have a hash for this
# mirror, we were able to retrieve one from the mirror, and
# the two hashes are the same.
if expect_hash and fetched_hash:
if fetched_hash == expect_hash:
tty.debug("Cached index for {0} already up to date".format(mirror_url))
return False
else:
# We expected a hash, we fetched a hash, and they were not the
# same. If we end up fetching an index successfully and
# replacing our entry for this mirror, we should clean up the
# existing cache file
if mirror_url in self._local_index_cache:
existing_entry = self._local_index_cache[mirror_url]
old_cache_key = existing_entry["index_path"]
result = fetcher.conditional_fetch()
tty.debug("Fetching index from {0}".format(index_fetch_url))
# Fetch index itself
try:
_, _, fs = web_util.read_from_url(index_fetch_url)
index_object_str = codecs.getreader("utf-8")(fs).read()
except (URLError, web_util.SpackWebError) as url_err:
errors.append(
RuntimeError(
"Unable to read index {0} due to {1}: {2}".format(
index_fetch_url, url_err.__class__.__name__, str(url_err)
)
)
)
raise FetchCacheError(errors)
locally_computed_hash = compute_hash(index_object_str)
if fetched_hash is not None and locally_computed_hash != fetched_hash:
msg = (
"Computed hash ({0}) did not match remote ({1}), "
"indicating error in index transmission"
).format(locally_computed_hash, expect_hash)
errors.append(RuntimeError(msg))
# We somehow got an index that doesn't match the remote one, maybe
# the next time we try we'll be successful.
raise FetchCacheError(errors)
# Nothing to do
if result.fresh:
return False
# Persist new index.json
url_hash = compute_hash(mirror_url)
cache_key = "{0}_{1}.json".format(url_hash[:10], locally_computed_hash[:10])
cache_key = "{}_{}.json".format(url_hash[:10], result.hash[:10])
self._index_file_cache.init_entry(cache_key)
with self._index_file_cache.write_transaction(cache_key) as (old, new):
new.write(index_object_str)
new.write(result.data)
self._local_index_cache[mirror_url] = {
"index_hash": locally_computed_hash,
"index_hash": result.hash,
"index_path": cache_key,
"etag": result.etag,
}
# clean up the old cache_key if necessary
old_cache_key = cache_entry.get("index_path", None)
if old_cache_key:
self._index_file_cache.remove(old_cache_key)
@@ -621,7 +578,9 @@ class UnsignedPackageException(spack.error.SpackError):
def compute_hash(data):
return hashlib.sha256(data.encode("utf-8")).hexdigest()
if isinstance(data, str):
data = data.encode("utf-8")
return hashlib.sha256(data).hexdigest()
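`compute_hash` now accepts either `str` or `bytes`, since index data may already be decoded by the time it is hashed; both spellings produce the same digest:

```python
assert compute_hash("index") == compute_hash(b"index")
```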
def build_cache_relative_path():
@@ -914,8 +873,6 @@ def _fetch_spec_from_mirror(spec_url):
return Spec.from_dict(specfile_json)
if spec_url.endswith(".json"):
return Spec.from_json(spec_file_contents)
if spec_url.endswith(".yaml"):
return Spec.from_yaml(spec_file_contents)
tp = multiprocessing.pool.ThreadPool(processes=concurrency)
try:
@@ -990,8 +947,6 @@ def file_read_method(file_path):
"*.spec.json.sig",
"--include",
"*.spec.json",
"--include",
"*.spec.yaml",
cache_prefix,
tmpspecsdir,
]
@@ -1001,7 +956,7 @@ def file_read_method(file_path):
"Using aws s3 sync to download specs from {0} to {1}".format(cache_prefix, tmpspecsdir)
)
aws(*sync_command_args, output=os.devnull, error=os.devnull)
file_list = fsys.find(tmpspecsdir, ["*.spec.json.sig", "*.spec.json", "*.spec.yaml"])
file_list = fsys.find(tmpspecsdir, ["*.spec.json.sig", "*.spec.json"])
read_fn = file_read_method
except Exception:
tty.warn("Failed to use aws s3 sync to retrieve specs, falling back to parallel fetch")
@@ -1037,9 +992,7 @@ def url_read_method(url):
file_list = [
url_util.join(cache_prefix, entry)
for entry in web_util.list_url(cache_prefix)
if entry.endswith(".yaml")
or entry.endswith("spec.json")
or entry.endswith("spec.json.sig")
if entry.endswith("spec.json") or entry.endswith("spec.json.sig")
]
read_fn = url_read_method
except KeyError as inst:
@@ -1101,14 +1054,6 @@ def generate_package_index(cache_prefix, concurrency=32):
tty.error("Unabled to generate package index, {0}".format(err))
return
if any(x.endswith(".yaml") for x in file_list):
msg = (
"The mirror in '{}' contains specs in the deprecated YAML format.\n\n\tSupport for "
"this format will be removed in v0.20, please regenerate the build cache with a "
"recent Spack\n"
).format(cache_prefix)
warnings.warn(msg)
tty.debug("Retrieving spec descriptor files from {0} to build index".format(cache_prefix))
tmpdir = tempfile.mkdtemp()
@@ -1195,7 +1140,7 @@ def generate_key_index(key_prefix, tmpdir=None):
def _build_tarball(
spec,
outdir,
out_url,
force=False,
relative=False,
unsigned=False,
@@ -1218,8 +1163,7 @@ def _build_tarball(
tarfile_dir = os.path.join(cache_prefix, tarball_directory_name(spec))
tarfile_path = os.path.join(tarfile_dir, tarfile_name)
spackfile_path = os.path.join(cache_prefix, tarball_path_name(spec, ".spack"))
remote_spackfile_path = url_util.join(outdir, os.path.relpath(spackfile_path, tmpdir))
remote_spackfile_path = url_util.join(out_url, os.path.relpath(spackfile_path, tmpdir))
mkdirp(tarfile_dir)
if web_util.url_exists(remote_spackfile_path):
@@ -1236,15 +1180,11 @@ def _build_tarball(
specfile_name = tarball_name(spec, ".spec.json")
specfile_path = os.path.realpath(os.path.join(cache_prefix, specfile_name))
signed_specfile_path = "{0}.sig".format(specfile_path)
deprecated_specfile_path = specfile_path.replace(".spec.json", ".spec.yaml")
remote_specfile_path = url_util.join(
outdir, os.path.relpath(specfile_path, os.path.realpath(tmpdir))
out_url, os.path.relpath(specfile_path, os.path.realpath(tmpdir))
)
remote_signed_specfile_path = "{0}.sig".format(remote_specfile_path)
remote_specfile_path_deprecated = url_util.join(
outdir, os.path.relpath(deprecated_specfile_path, os.path.realpath(tmpdir))
)
# If force and exists, overwrite. Otherwise raise exception on collision.
if force:
@@ -1252,12 +1192,8 @@ def _build_tarball(
web_util.remove_url(remote_specfile_path)
if web_util.url_exists(remote_signed_specfile_path):
web_util.remove_url(remote_signed_specfile_path)
if web_util.url_exists(remote_specfile_path_deprecated):
web_util.remove_url(remote_specfile_path_deprecated)
elif (
web_util.url_exists(remote_specfile_path)
or web_util.url_exists(remote_signed_specfile_path)
or web_util.url_exists(remote_specfile_path_deprecated)
elif web_util.url_exists(remote_specfile_path) or web_util.url_exists(
remote_signed_specfile_path
):
raise NoOverwriteException(url_util.format(remote_specfile_path))
@@ -1313,12 +1249,10 @@ def _build_tarball(
with open(spec_file, "r") as inputfile:
content = inputfile.read()
if spec_file.endswith(".yaml"):
spec_dict = yaml.load(content)
elif spec_file.endswith(".json"):
if spec_file.endswith(".json"):
spec_dict = sjson.load(content)
else:
raise ValueError("{0} not a valid spec file type (json or yaml)".format(spec_file))
raise ValueError("{0} not a valid spec file type".format(spec_file))
spec_dict["buildcache_layout_version"] = 1
bchecksum = {}
bchecksum["hash_algorithm"] = "sha256"
@@ -1353,12 +1287,12 @@ def _build_tarball(
# push the key to the build cache's _pgp directory so it can be
# imported
if not unsigned:
push_keys(outdir, keys=[key], regenerate_index=regenerate_index, tmpdir=tmpdir)
push_keys(out_url, keys=[key], regenerate_index=regenerate_index, tmpdir=tmpdir)
# create an index.json for the build_cache directory so specs can be
# found
if regenerate_index:
generate_package_index(url_util.join(outdir, os.path.relpath(cache_prefix, tmpdir)))
generate_package_index(url_util.join(out_url, os.path.relpath(cache_prefix, tmpdir)))
finally:
shutil.rmtree(tmpdir)
@@ -1539,7 +1473,7 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
# Assumes we care more about finding a spec file by preferred ext
# than by mirror priority. This can be made less complicated as
# we remove support for deprecated spec formats and buildcache layouts.
for ext in ["json.sig", "json", "yaml"]:
for ext in ["json.sig", "json"]:
for mirror_to_try in mirrors_to_try:
specfile_url = "{0}.{1}".format(mirror_to_try["specfile"], ext)
spackfile_url = mirror_to_try["spackfile"]
@@ -1576,13 +1510,6 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
# the remaining mirrors, looking for one we can use.
tarball_stage = try_fetch(spackfile_url)
if tarball_stage:
if ext == "yaml":
msg = (
"Reading {} from mirror.\n\n\tThe YAML format for buildcaches is "
"deprecated and will be removed in v0.20\n"
).format(spackfile_url)
warnings.warn(msg)
return {
"tarball_stage": tarball_stage,
"specfile_stage": local_specfile_stage,
@@ -1606,10 +1533,6 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
)
)
tty.warn(
"download_tarball() was unable to download "
+ "{0} from any configured mirrors".format(spec)
)
return None
@@ -1634,7 +1557,7 @@ def make_package_relative(workdir, spec, allow_root):
if "elf" in platform.binary_formats:
relocate.make_elf_binaries_relative(cur_path_names, orig_path_names, old_layout_root)
relocate.raise_if_not_relocatable(cur_path_names, allow_root)
allow_root or relocate.ensure_binaries_are_relocatable(cur_path_names)
orig_path_names = list()
cur_path_names = list()
for linkname in buildinfo.get("relocate_links", []):
@@ -1652,7 +1575,7 @@ def check_package_relocatable(workdir, spec, allow_root):
cur_path_names = list()
for filename in buildinfo["relocate_binaries"]:
cur_path_names.append(os.path.join(workdir, filename))
relocate.raise_if_not_relocatable(cur_path_names, allow_root)
allow_root or relocate.ensure_binaries_are_relocatable(cur_path_names)
def dedupe_hardlinks_if_necessary(root, buildinfo):
@@ -1826,8 +1749,6 @@ def _extract_inner_tarball(spec, filename, extract_to, unsigned, remote_checksum
spackfile_path = os.path.join(stagepath, spackfile_name)
tarfile_name = tarball_name(spec, ".tar.gz")
tarfile_path = os.path.join(extract_to, tarfile_name)
deprecated_yaml_name = tarball_name(spec, ".spec.yaml")
deprecated_yaml_path = os.path.join(extract_to, deprecated_yaml_name)
json_name = tarball_name(spec, ".spec.json")
json_path = os.path.join(extract_to, json_name)
with closing(tarfile.open(spackfile_path, "r")) as tar:
@@ -1839,8 +1760,6 @@ def _extract_inner_tarball(spec, filename, extract_to, unsigned, remote_checksum
if os.path.exists(json_path):
specfile_path = json_path
elif os.path.exists(deprecated_yaml_path):
specfile_path = deprecated_yaml_path
else:
raise ValueError("Cannot find spec file for {0}.".format(extract_to))
@@ -1887,10 +1806,8 @@ def extract_tarball(spec, download_result, allow_root=False, unsigned=False, for
content = inputfile.read()
if specfile_path.endswith(".json.sig"):
spec_dict = Spec.extract_json_from_clearsig(content)
elif specfile_path.endswith(".json"):
spec_dict = sjson.load(content)
else:
spec_dict = syaml.load(content)
spec_dict = sjson.load(content)
bchecksum = spec_dict["binary_cache_checksum"]
filename = download_result["tarball_stage"].save_filename
@@ -1902,7 +1819,7 @@ def extract_tarball(spec, download_result, allow_root=False, unsigned=False, for
or int(spec_dict["buildcache_layout_version"]) < 1
):
# Handle the older buildcache layout where the .spack file
# contains a spec json/yaml, maybe an .asc file (signature),
# contains a spec json, maybe an .asc file (signature),
# and another tarball containing the actual install tree.
tmpdir = tempfile.mkdtemp()
try:
@@ -2053,17 +1970,12 @@ def try_direct_fetch(spec, mirrors=None):
"""
Try to find the spec directly on the configured mirrors
"""
deprecated_specfile_name = tarball_name(spec, ".spec.yaml")
specfile_name = tarball_name(spec, ".spec.json")
signed_specfile_name = tarball_name(spec, ".spec.json.sig")
specfile_is_signed = False
specfile_is_json = True
found_specs = []
for mirror in spack.mirror.MirrorCollection(mirrors=mirrors).values():
buildcache_fetch_url_yaml = url_util.join(
mirror.fetch_url, _build_cache_relative_path, deprecated_specfile_name
)
buildcache_fetch_url_json = url_util.join(
mirror.fetch_url, _build_cache_relative_path, specfile_name
)
@@ -2077,28 +1989,19 @@ def try_direct_fetch(spec, mirrors=None):
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_json)
except (URLError, web_util.SpackWebError, HTTPError) as url_err_x:
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_yaml)
specfile_is_json = False
except (URLError, web_util.SpackWebError, HTTPError) as url_err_y:
tty.debug(
"Did not find {0} on {1}".format(
specfile_name, buildcache_fetch_url_signed_json
),
url_err,
level=2,
)
tty.debug(
"Did not find {0} on {1}".format(specfile_name, buildcache_fetch_url_json),
url_err_x,
level=2,
)
tty.debug(
"Did not find {0} on {1}".format(specfile_name, buildcache_fetch_url_yaml),
url_err_y,
level=2,
)
continue
tty.debug(
"Did not find {0} on {1}".format(
specfile_name, buildcache_fetch_url_signed_json
),
url_err,
level=2,
)
tty.debug(
"Did not find {0} on {1}".format(specfile_name, buildcache_fetch_url_json),
url_err_x,
level=2,
)
continue
specfile_contents = codecs.getreader("utf-8")(fs).read()
# read the spec from the build cache file. All specs in build caches
@@ -2107,10 +2010,8 @@ def try_direct_fetch(spec, mirrors=None):
if specfile_is_signed:
specfile_json = Spec.extract_json_from_clearsig(specfile_contents)
fetched_spec = Spec.from_dict(specfile_json)
elif specfile_is_json:
fetched_spec = Spec.from_json(specfile_contents)
else:
fetched_spec = Spec.from_yaml(specfile_contents)
fetched_spec = Spec.from_json(specfile_contents)
fetched_spec._mark_concrete()
found_specs.append(
@@ -2132,8 +2033,8 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
spec (spack.spec.Spec): The spec to look for in binary mirrors
mirrors_to_check (dict): Optionally override the configured mirrors
with the mirrors in this dictionary.
index_only (bool): Do not attempt direct fetching of ``spec.json``
files from remote mirrors, only consider the indices.
index_only (bool): When ``index_only`` is set to ``True``, only the local
cache is checked, no requests are made.
Return:
A list of objects, each containing a ``mirror_url`` and ``spec`` key
@@ -2321,7 +2222,7 @@ def needs_rebuild(spec, mirror_url):
specfile_path = os.path.join(cache_prefix, specfile_name)
# Only check for the presence of the json version of the spec. If the
# mirror only has the yaml version, or doesn't have the spec at all, we
# mirror only has the json version, or doesn't have the spec at all, we
# need to rebuild.
return not web_util.url_exists(specfile_path)
@@ -2429,7 +2330,6 @@ def download_single_spec(concrete_spec, destination, mirror_url=None):
"url": [
tarball_name(concrete_spec, ".spec.json.sig"),
tarball_name(concrete_spec, ".spec.json"),
tarball_name(concrete_spec, ".spec.yaml"),
],
"path": destination,
"required": True,
@@ -2470,3 +2370,126 @@ def __call__(self, spec, **kwargs):
# Matching a spec constraint
matches = [s for s in self.possible_specs if s.satisfies(spec)]
return matches
class FetchIndexError(Exception):
def __str__(self):
if len(self.args) == 1:
return str(self.args[0])
else:
return "{}, due to: {}".format(self.args[0], self.args[1])
FetchIndexResult = collections.namedtuple("FetchIndexResult", "etag hash data fresh")
class DefaultIndexFetcher:
"""Fetcher for index.json, using separate index.json.hash as cache invalidation strategy"""
def __init__(self, url, local_hash, urlopen=web_util.urlopen):
self.url = url
self.local_hash = local_hash
self.urlopen = urlopen
self.headers = {"User-Agent": web_util.SPACK_USER_AGENT}
def get_remote_hash(self):
# Failure to fetch index.json.hash is not fatal
url_index_hash = url_util.join(self.url, _build_cache_relative_path, "index.json.hash")
try:
response = self.urlopen(urllib.request.Request(url_index_hash, headers=self.headers))
except urllib.error.URLError:
return None
# Validate the hash
remote_hash = response.read(64)
if not re.match(rb"[a-f\d]{64}$", remote_hash):
return None
return remote_hash.decode("utf-8")
def conditional_fetch(self):
# Do an intermediate fetch for the hash
# and a conditional fetch for the contents
# Early exit if our cache is up to date.
if self.local_hash and self.local_hash == self.get_remote_hash():
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
# Otherwise, download index.json
url_index = url_util.join(self.url, _build_cache_relative_path, "index.json")
try:
response = self.urlopen(urllib.request.Request(url_index, headers=self.headers))
except urllib.error.URLError as e:
raise FetchIndexError("Could not fetch index from {}".format(url_index), e)
try:
result = codecs.getreader("utf-8")(response).read()
except ValueError as e:
return FetchCacheError("Remote index {} is invalid".format(url_index), e)
computed_hash = compute_hash(result)
# We don't handle computed_hash != remote_hash here, which can happen
# when remote index.json and index.json.hash are out of sync, or if
# the hash algorithm changed.
# The most likely scenario is that index.json got updated
# while we fetched index.json.hash. Warning about an issue thus feels
# wrong, as it's more of an issue with race conditions in the cache
# invalidation strategy.
# For now we only handle etags on http(s), since 304 error handling
# in s3:// is not there yet.
if urllib.parse.urlparse(self.url).scheme not in ("http", "https"):
etag = None
else:
etag = web_util.parse_etag(
response.headers.get("Etag", None) or response.headers.get("etag", None)
)
return FetchIndexResult(
etag=etag,
hash=computed_hash,
data=result,
fresh=False,
)
class EtagIndexFetcher:
"""Fetcher for index.json, using ETags headers as cache invalidation strategy"""
def __init__(self, url, etag, urlopen=web_util.urlopen):
self.url = url
self.etag = etag
self.urlopen = urlopen
def conditional_fetch(self):
# Just do a conditional fetch immediately
url = url_util.join(self.url, _build_cache_relative_path, "index.json")
headers = {
"User-Agent": web_util.SPACK_USER_AGENT,
"If-None-Match": '"{}"'.format(self.etag),
}
try:
response = self.urlopen(urllib.request.Request(url, headers=headers))
except urllib.error.HTTPError as e:
if e.getcode() == 304:
# Not modified; that means fresh.
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
raise FetchIndexError("Could not fetch index {}".format(url), e) from e
except urllib.error.URLError as e:
raise FetchIndexError("Could not fetch index {}".format(url), e) from e
try:
result = codecs.getreader("utf-8")(response).read()
except ValueError as e:
raise FetchIndexError("Remote index {} is invalid".format(url), e) from e
headers = response.headers
etag_header_value = headers.get("Etag", None) or headers.get("etag", None)
return FetchIndexResult(
etag=web_util.parse_etag(etag_header_value),
hash=compute_hash(result),
data=result,
fresh=False,
)
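A sketch of how the two fetchers chain across runs, against a hypothetical mirror URL:

```python
mirror = "https://mirror.example.com"

# First run: no cached state yet, so fall back to hash-based checking.
first = DefaultIndexFetcher(mirror, local_hash=None).conditional_fetch()

# Later runs: if the server returned an ETag, an HTTP 304 reply means
# the cached index.json is still current and nothing is re-downloaded.
if first.etag:
    later = EtagIndexFetcher(mirror, first.etag).conditional_fetch()
    if later.fresh:
        print("index.json unchanged")
```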

File diff suppressed because it is too large


@@ -0,0 +1,25 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Function and classes needed to bootstrap Spack itself."""
from .config import ensure_bootstrap_configuration, is_bootstrapping
from .core import (
all_core_root_specs,
ensure_core_dependencies,
ensure_patchelf_in_path_or_raise,
)
from .environment import BootstrapEnvironment, ensure_environment_dependencies
from .status import status_message
__all__ = [
"is_bootstrapping",
"ensure_bootstrap_configuration",
"ensure_core_dependencies",
"ensure_patchelf_in_path_or_raise",
"all_core_root_specs",
"ensure_environment_dependencies",
"BootstrapEnvironment",
"status_message",
]


@@ -0,0 +1,218 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Common basic functions used through the spack.bootstrap package"""
import fnmatch
import os.path
import re
import sys
import sysconfig
import warnings
import archspec.cpu
import llnl.util.envmod
import llnl.util.executable
import llnl.util.filesystem as fs
from llnl.util import tty
import spack.store
from .config import spec_for_current_python
def _python_import(module):
try:
__import__(module)
except ImportError:
return False
return True
def _try_import_from_store(module, query_spec, query_info=None):
"""Return True if the module can be imported from an already
installed spec, False otherwise.
Args:
module: Python module to be imported
query_spec: spec that may provide the module
query_info (dict or None): if a dict is passed it is populated with the
command found and the concrete spec providing it
"""
# If it is a string, assume it's one of the root specs provided by this module
if isinstance(query_spec, str):
# We have to run as part of this python interpreter
query_spec += " ^" + spec_for_current_python()
installed_specs = spack.store.db.query(query_spec, installed=True)
for candidate_spec in installed_specs:
pkg = candidate_spec["python"].package
module_paths = [
os.path.join(candidate_spec.prefix, pkg.purelib),
os.path.join(candidate_spec.prefix, pkg.platlib),
] # type: list[str]
path_before = list(sys.path)
# NOTE: try module_paths first and last: trying them last allows an existing
# version already on sys.path to be picked up, possibly depending on something
# in the store, while trying them first lets the bootstrap version work even
# when an incompatible version is on sys.path
orders = [
module_paths + sys.path,
sys.path + module_paths,
]
for path in orders:
sys.path = path
try:
_fix_ext_suffix(candidate_spec)
if _python_import(module):
msg = (
f"[BOOTSTRAP MODULE {module}] The installed spec "
f'"{query_spec}/{candidate_spec.dag_hash()}" '
f'provides the "{module}" Python module'
)
tty.debug(msg)
if query_info is not None:
query_info["spec"] = candidate_spec
return True
except Exception as exc: # pylint: disable=broad-except
msg = (
"unexpected error while trying to import module "
f'"{module}" from spec "{candidate_spec}" [error="{str(exc)}"]'
)
warnings.warn(msg)
else:
msg = "Spec {0} did not provide module {1}"
warnings.warn(msg.format(candidate_spec, module))
sys.path = path_before
return False
def _fix_ext_suffix(candidate_spec):
"""Fix the external suffixes of Python extensions on the fly for
platforms that may need it
Args:
candidate_spec (Spec): installed spec with a Python module
to be checked.
"""
# Here we map target families to the patterns expected
# by pristine CPython. Only architectures with known issues
# are included. Known issues:
#
# [RHEL + ppc64le]: https://github.com/spack/spack/issues/25734
#
_suffix_to_be_checked = {
"ppc64le": {
"glob": "*.cpython-*-powerpc64le-linux-gnu.so",
"re": r".cpython-[\w]*-powerpc64le-linux-gnu.so",
"fmt": r"{module}.cpython-{major}{minor}m-powerpc64le-linux-gnu.so",
}
}
# If the current architecture is not problematic return
generic_target = archspec.cpu.host().family
if str(generic_target) not in _suffix_to_be_checked:
return
# If there's no EXT_SUFFIX (Python < 3.5) or the suffix matches
# the expectations, return since the package is surely good
ext_suffix = sysconfig.get_config_var("EXT_SUFFIX")
if ext_suffix is None:
return
expected = _suffix_to_be_checked[str(generic_target)]
if fnmatch.fnmatch(ext_suffix, expected["glob"]):
return
# If we are here it means the current interpreter expects different names
# than pristine CPython. So:
# 1. Find what we have installed
# 2. Create symbolic links for the other names, if they're not there already
# Check if standard names are installed and if we have to create
# link for this interpreter
standard_extensions = fs.find(candidate_spec.prefix, expected["glob"])
link_names = [re.sub(expected["re"], ext_suffix, s) for s in standard_extensions]
for file_name, link_name in zip(standard_extensions, link_names):
if os.path.exists(link_name):
continue
os.symlink(file_name, link_name)
# Check if this interpreter installed something and we have to create
# links for a standard CPython interpreter
non_standard_extensions = fs.find(candidate_spec.prefix, "*" + ext_suffix)
for abs_path in non_standard_extensions:
directory, filename = os.path.split(abs_path)
module = filename.split(".")[0]
link_name = os.path.join(
directory,
expected["fmt"].format(
module=module, major=sys.version_info[0], minor=sys.version_info[1]
),
)
if os.path.exists(link_name):
continue
os.symlink(abs_path, link_name)
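# Illustrative walk-through (assumed values, per the RHEL/ppc64le issue above):
# an interpreter may report EXT_SUFFIX spelled with "ppc64le" while pristine
# CPython spells it "powerpc64le". In that case an extension installed as
#     zmq.cpython-38-powerpc64le-linux-gnu.so
# gets a sibling symlink carrying the interpreter's suffix, and extensions
# carrying the interpreter's suffix get symlinks back to the pristine names,
# so the module resolves under either naming convention.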
def _executables_in_store(executables, query_spec, query_info=None):
"""Return True if at least one of the executables can be retrieved from
a spec in store, False otherwise.
The executables must provide the same functionality; they are
alternates of one another, i.e. the function returns True on the
first executable found.
Args:
executables: list of executables to be searched
query_spec: spec that may provide the executable
query_info (dict or None): if a dict is passed it is populated with the
command found and the concrete spec providing it
"""
executables_str = ", ".join(executables)
msg = "[BOOTSTRAP EXECUTABLES {0}] Try installed specs with query '{1}'"
tty.debug(msg.format(executables_str, query_spec))
installed_specs = spack.store.db.query(query_spec, installed=True)
if installed_specs:
for concrete_spec in installed_specs:
bin_dir = concrete_spec.prefix.bin
# If we have a "bin" directory and it contains
# the executables we are looking for
if (
os.path.exists(bin_dir)
and os.path.isdir(bin_dir)
and llnl.util.executable.which_string(*executables, path=bin_dir)
):
llnl.util.envmod.path_put_first("PATH", [bin_dir])
if query_info is not None:
query_info["command"] = llnl.util.executable.which(*executables, path=bin_dir)
query_info["spec"] = concrete_spec
return True
return False
def _root_spec(spec_str):
"""Add a proper compiler and target to a spec used during bootstrapping.
Args:
spec_str (str): spec to be bootstrapped. Must be without compiler and target.
"""
# Add a proper compiler hint to the root spec. We use GCC for
# everything but macOS and Windows.
if str(spack.platforms.host()) == "darwin":
spec_str += " %apple-clang"
elif str(spack.platforms.host()) == "windows":
spec_str += " %msvc"
else:
spec_str += " %gcc"
target = archspec.cpu.host().family
spec_str += f" target={target}"
tty.debug(f"[BOOTSTRAP ROOT SPEC] {spec_str}")
return spec_str
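To make `_root_spec` concrete, here is what it is expected to produce (hedged: the exact compiler and target depend on the host, and the import path assumes the package layout shown in this changeset):

```python
from spack.bootstrap._common import _root_spec

# On a Linux x86_64 host this prints something like
#   gnupg@2.3: %gcc target=x86_64
# while on macOS the hint would be %apple-clang with the host's target family.
print(_root_spec("gnupg@2.3:"))
```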

View File

@@ -0,0 +1,169 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Manage configuration swapping for bootstrapping purposes"""
import contextlib
import os.path
import sys
from llnl.util import tty
import spack.compilers
import spack.config
import spack.environment
import spack.modules
import spack.paths
import spack.platforms
import spack.repo
import spack.spec
import spack.store
import spack.util.path
#: Reference counter for the bootstrapping configuration context manager
_REF_COUNT = 0
def is_bootstrapping():
"""Return True if we are in a bootstrapping context, False otherwise."""
return _REF_COUNT > 0
def spec_for_current_python():
"""For bootstrapping purposes we are just interested in the Python
minor version (all patches are ABI compatible with the same minor).
See:
https://www.python.org/dev/peps/pep-0513/
https://stackoverflow.com/a/35801395/771663
"""
version_str = ".".join(str(x) for x in sys.version_info[:2])
return f"python@{version_str}"
def root_path():
"""Root of all the bootstrap related folders"""
return spack.util.path.canonicalize_path(
spack.config.get("bootstrap:root", spack.paths.default_user_bootstrap_path)
)
def store_path():
"""Path to the store used for bootstrapped software"""
enabled = spack.config.get("bootstrap:enable", True)
if not enabled:
msg = 'bootstrapping is currently disabled. Use "spack bootstrap enable" to enable it'
raise RuntimeError(msg)
return _store_path()
@contextlib.contextmanager
def spack_python_interpreter():
"""Override the current configuration to set the interpreter under
which Spack is currently running as the only Python external spec
available.
"""
python_prefix = sys.exec_prefix
external_python = spec_for_current_python()
entry = {
"buildable": False,
"externals": [{"prefix": python_prefix, "spec": str(external_python)}],
}
with spack.config.override("packages:python::", entry):
yield
def _store_path():
bootstrap_root_path = root_path()
return spack.util.path.canonicalize_path(os.path.join(bootstrap_root_path, "store"))
def _config_path():
bootstrap_root_path = root_path()
return spack.util.path.canonicalize_path(os.path.join(bootstrap_root_path, "config"))
@contextlib.contextmanager
def ensure_bootstrap_configuration():
"""Swap the current configuration for the one used to bootstrap Spack.
The context manager is reference counted to ensure we don't swap multiple
times if there's nested use of it in the stack. One compelling use case
is bootstrapping patchelf during the bootstrap of clingo.
"""
global _REF_COUNT # pylint: disable=global-statement
already_swapped = bool(_REF_COUNT)
_REF_COUNT += 1
try:
if already_swapped:
yield
else:
with _ensure_bootstrap_configuration():
yield
finally:
_REF_COUNT -= 1
def _read_and_sanitize_configuration():
"""Read the user configuration that needs to be reused for bootstrapping
and remove the entries that should not be copied over.
"""
# Read the "config" section but pop the install tree (the entry will not be
# considered due to the use_store context manager, so it will be confusing
# to have it in the configuration).
config_yaml = spack.config.get("config")
config_yaml.pop("install_tree", None)
user_configuration = {"bootstrap": spack.config.get("bootstrap"), "config": config_yaml}
return user_configuration
def _bootstrap_config_scopes():
tty.debug("[BOOTSTRAP CONFIG SCOPE] name=_builtin")
config_scopes = [spack.config.InternalConfigScope("_builtin", spack.config.config_defaults)]
configuration_paths = (spack.config.configuration_defaults_path, ("bootstrap", _config_path()))
for name, path in configuration_paths:
platform = spack.platforms.host().name
platform_scope = spack.config.ConfigScope(
"/".join([name, platform]), os.path.join(path, platform)
)
generic_scope = spack.config.ConfigScope(name, path)
config_scopes.extend([generic_scope, platform_scope])
msg = "[BOOTSTRAP CONFIG SCOPE] name={0}, path={1}"
tty.debug(msg.format(generic_scope.name, generic_scope.path))
tty.debug(msg.format(platform_scope.name, platform_scope.path))
return config_scopes
def _add_compilers_if_missing():
arch = spack.spec.ArchSpec.frontend_arch()
if not spack.compilers.compilers_for_arch(arch):
new_compilers = spack.compilers.find_new_compilers()
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers, init_config=False)
@contextlib.contextmanager
def _ensure_bootstrap_configuration():
bootstrap_store_path = store_path()
user_configuration = _read_and_sanitize_configuration()
with spack.environment.no_active_environment():
with spack.platforms.prevent_cray_detection(), spack.platforms.use_platform(
spack.platforms.real_host()
), spack.repo.use_repositories(spack.paths.packages_path), spack.store.use_store(
bootstrap_store_path
):
# Default configuration scopes excluding command line
# and builtin but accounting for platform specific scopes
config_scopes = _bootstrap_config_scopes()
with spack.config.use_configuration(*config_scopes):
# We may need to compile code from sources, so ensure we
# have compilers for the current platform
_add_compilers_if_missing()
spack.config.set("bootstrap", user_configuration["bootstrap"])
spack.config.set("config", user_configuration["config"])
with spack.modules.disable_modules():
with spack_python_interpreter():
yield
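The reference counting above is easiest to see in a small sketch (hypothetical driver code; the module path follows this changeset's layout):

```python
import spack.bootstrap.config as bootstrap_config

assert not bootstrap_config.is_bootstrapping()
with bootstrap_config.ensure_bootstrap_configuration():
    assert bootstrap_config.is_bootstrapping()
    # Re-entering, e.g. bootstrapping patchelf while bootstrapping clingo,
    # does not swap the configuration a second time.
    with bootstrap_config.ensure_bootstrap_configuration():
        assert bootstrap_config.is_bootstrapping()
assert not bootstrap_config.is_bootstrapping()
```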

View File

@@ -0,0 +1,580 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Bootstrap Spack core dependencies from binaries.
This module contains logic to bootstrap software required by Spack from binaries served in the
bootstrapping mirrors. The logic is quite different from an installation done by a Spack user,
for the following reasons:
1. The binaries are all compiled on the same OS for a given platform (e.g. they are compiled on
``centos7`` on ``linux``), but they will be installed and used on the host OS. They are also
targeted at the most generic architecture possible. That makes the binaries difficult to reuse
with other specs in an environment without ad-hoc logic.
2. Bootstrapping has a fallback procedure where we try to install software by default from the
most recent binaries, and proceed to older versions of the mirror, until we try building from
sources as a last resort. This allows us not to be blocked on architectures where we don't
have binaries readily available, but it is also not compatible with how
environments work (they don't have fallback procedures).
3. Among the binaries we have clingo, so we can't concretize that with clingo :-)
4. clingo, GnuPG and patchelf binaries need to be verified by sha256 sum (all the other binaries
we might add on top of that in principle can be verified with GPG signatures).
"""
import copy
import functools
import json
import os
import os.path
import sys
import uuid
from typing import Callable, List, Optional
import llnl.util.envmod
import llnl.util.executable
from llnl.util import tty
from llnl.util.lang import GroupedExceptionHandler
import spack.binary_distribution
import spack.config
import spack.detection
import spack.environment
import spack.modules
import spack.paths
import spack.platforms
import spack.platforms.linux
import spack.repo
import spack.spec
import spack.store
import spack.user_environment
import spack.util.path
import spack.util.spack_yaml
import spack.util.url
import spack.version
from ._common import (
_executables_in_store,
_python_import,
_root_spec,
_try_import_from_store,
)
from .config import spack_python_interpreter, spec_for_current_python
#: Name of the file containing metadata about the bootstrapping source
METADATA_YAML_FILENAME = "metadata.yaml"
#: Whether the current platform is Windows
IS_WINDOWS = sys.platform == "win32"
#: Map a bootstrapper type to the corresponding class
_bootstrap_methods = {}
def bootstrapper(bootstrapper_type: str):
"""Decorator to register classes implementing bootstrapping
methods.
Args:
bootstrapper_type: string identifying the class
"""
def _register(cls):
_bootstrap_methods[bootstrapper_type] = cls
return cls
return _register
class Bootstrapper:
"""Interface for "core" software bootstrappers"""
config_scope_name = ""
def __init__(self, conf):
self.conf = conf
self.name = conf["name"]
self.url = conf["info"]["url"]
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])
@property
def mirror_url(self):
"""Mirror url associated with this bootstrapper"""
# Absolute paths
if os.path.isabs(self.url):
return spack.util.url.format(self.url)
# Check for "://" and assume it's a URL if we find it
if "://" in self.url:
return self.url
# Otherwise, it's a relative path
return spack.util.url.format(os.path.join(self.metadata_dir, self.url))
@property
def mirror_scope(self):
"""Mirror scope to be pushed onto the bootstrapping configuration when using
this bootstrapper.
"""
return spack.config.InternalConfigScope(
self.config_scope_name, {"mirrors:": {self.name: self.mirror_url}}
)
def try_import(self, module: str, abstract_spec_str: str) -> bool:
"""Try to import a Python module from a spec satisfying the abstract spec
passed as argument.
Args:
module: Python module name to try importing
abstract_spec_str: abstract spec that can provide the Python module
Return:
True if the Python module could be imported, False otherwise
"""
return False
def try_search_path(self, executables: List[str], abstract_spec_str: str) -> bool:
"""Try to search some executables in the prefix of specs satisfying the abstract
spec passed as argument.
Args:
executables: executables to be found
abstract_spec_str: abstract spec that can provide the executables
Return:
True if the executables are found, False otherwise
"""
return False
@bootstrapper(bootstrapper_type="buildcache")
class BuildcacheBootstrapper(Bootstrapper):
"""Install the software needed during bootstrapping from a buildcache."""
def __init__(self, conf):
super().__init__(conf)
self.last_search = None
self.config_scope_name = f"bootstrap_buildcache-{uuid.uuid4()}"
@staticmethod
def _spec_and_platform(abstract_spec_str):
"""Return the spec object and platform we need to use when
querying the buildcache.
Args:
abstract_spec_str: abstract spec string we are looking for
"""
# Try to install from an unsigned binary cache
abstract_spec = spack.spec.Spec(abstract_spec_str)
# On Cray we want to use Linux binaries if available from mirrors
bincache_platform = spack.platforms.real_host()
return abstract_spec, bincache_platform
def _read_metadata(self, package_name):
"""Return metadata about the given package."""
json_filename = f"{package_name}.json"
json_dir = self.metadata_dir
json_path = os.path.join(json_dir, json_filename)
with open(json_path, encoding="utf-8") as stream:
data = json.load(stream)
return data
def _install_by_hash(self, pkg_hash, pkg_sha256, index, bincache_platform):
index_spec = next(x for x in index if x.dag_hash() == pkg_hash)
# Reconstruct the compiler that we need to use for bootstrapping
compiler_entry = {
"modules": [],
"operating_system": str(index_spec.os),
"paths": {
"cc": "/dev/null",
"cxx": "/dev/null",
"f77": "/dev/null",
"fc": "/dev/null",
},
"spec": str(index_spec.compiler),
"target": str(index_spec.target.family),
}
with spack.platforms.use_platform(bincache_platform):
with spack.config.override("compilers", [{"compiler": compiler_entry}]):
spec_str = "/" + pkg_hash
query = spack.binary_distribution.BinaryCacheQuery(all_architectures=True)
matches = spack.store.find([spec_str], multiple=False, query_fn=query)
for match in matches:
spack.binary_distribution.install_root_node(
match, allow_root=True, unsigned=True, force=True, sha256=pkg_sha256
)
def _install_and_test(self, abstract_spec, bincache_platform, bincache_data, test_fn):
# Ensure we see only the buildcache being used to bootstrap
with spack.config.override(self.mirror_scope):
# This index is currently needed to get the compiler used to build some
# specs that we know by dag hash.
spack.binary_distribution.binary_index.regenerate_spec_cache()
index = spack.binary_distribution.update_cache_and_get_specs()
if not index:
raise RuntimeError("The binary index is empty")
for item in bincache_data["verified"]:
candidate_spec = item["spec"]
# This will be None for things that don't depend on python
python_spec = item.get("python", None)
# Skip specs which are not compatible
if not abstract_spec.satisfies(candidate_spec):
continue
if python_spec is not None and python_spec not in abstract_spec:
continue
for _, pkg_hash, pkg_sha256 in item["binaries"]:
self._install_by_hash(pkg_hash, pkg_sha256, index, bincache_platform)
info = {}
if test_fn(query_spec=abstract_spec, query_info=info):
self.last_search = info
return True
return False
def try_import(self, module, abstract_spec_str):
test_fn, info = functools.partial(_try_import_from_store, module), {}
if test_fn(query_spec=abstract_spec_str, query_info=info):
return True
tty.debug(f"Bootstrapping {module} from pre-built binaries")
abstract_spec, bincache_platform = self._spec_and_platform(
abstract_spec_str + " ^" + spec_for_current_python()
)
data = self._read_metadata(module)
return self._install_and_test(abstract_spec, bincache_platform, data, test_fn)
def try_search_path(self, executables, abstract_spec_str):
test_fn, info = functools.partial(_executables_in_store, executables), {}
if test_fn(query_spec=abstract_spec_str, query_info=info):
self.last_search = info
return True
abstract_spec, bincache_platform = self._spec_and_platform(abstract_spec_str)
tty.debug(f"Bootstrapping {abstract_spec.name} from pre-built binaries")
data = self._read_metadata(abstract_spec.name)
return self._install_and_test(abstract_spec, bincache_platform, data, test_fn)
@bootstrapper(bootstrapper_type="install")
class SourceBootstrapper(Bootstrapper):
"""Install the software needed during bootstrapping from sources."""
def __init__(self, conf):
super().__init__(conf)
self.last_search = None
self.config_scope_name = f"bootstrap_source-{uuid.uuid4()}"
def try_import(self, module, abstract_spec_str):
info = {}
if _try_import_from_store(module, abstract_spec_str, query_info=info):
self.last_search = info
return True
tty.debug(f"Bootstrapping {module} from sources")
# If we compile code from sources, detecting a few build tools
# might reduce compilation time by a fair amount
_add_externals_if_missing()
# Try to build and install from sources
with spack_python_interpreter():
# Add hint to use frontend operating system on Cray
concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
if module == "clingo":
# TODO: remove when the old concretizer is deprecated # pylint: disable=fixme
concrete_spec._old_concretize( # pylint: disable=protected-access
deprecation_warning=False
)
else:
concrete_spec.concretize()
msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
tty.debug(msg.format(module, abstract_spec_str))
# Install the spec that should make the module importable
with spack.config.override(self.mirror_scope):
concrete_spec.package.do_install(fail_fast=True)
if _try_import_from_store(module, query_spec=concrete_spec, query_info=info):
self.last_search = info
return True
return False
def try_search_path(self, executables, abstract_spec_str):
info = {}
if _executables_in_store(executables, abstract_spec_str, query_info=info):
self.last_search = info
return True
tty.debug(f"Bootstrapping {abstract_spec_str} from sources")
# If we compile code from sources, detecting a few build tools
# might reduce compilation time by a fair amount
_add_externals_if_missing()
concrete_spec = spack.spec.Spec(abstract_spec_str)
if concrete_spec.name == "patchelf":
concrete_spec._old_concretize( # pylint: disable=protected-access
deprecation_warning=False
)
else:
concrete_spec.concretize()
msg = "[BOOTSTRAP] Try installing '{0}' from sources"
tty.debug(msg.format(abstract_spec_str))
with spack.config.override(self.mirror_scope):
concrete_spec.package.do_install()
if _executables_in_store(executables, concrete_spec, query_info=info):
self.last_search = info
return True
return False
def create_bootstrapper(conf):
"""Return a bootstrap object built according to the configuration argument"""
btype = conf["type"]
return _bootstrap_methods[btype](conf)
def source_is_enabled_or_raise(conf):
"""Raise ValueError if the source is not enabled for bootstrapping"""
trusted, name = spack.config.get("bootstrap:trusted"), conf["name"]
if not trusted.get(name, False):
raise ValueError("source is not trusted")
def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str] = None):
"""Make the requested module available for import, or raise.
This function tries to import a Python module in the current interpreter
using, in order, the methods configured in bootstrap.yaml.
If none of the methods succeed, an exception is raised. The function exits
on first success.
Args:
module: module to be imported in the current interpreter
abstract_spec: abstract spec that might provide the module. If not
given, it defaults to the name of the module.
Raises:
ImportError: if the module couldn't be imported
"""
# If we can import it already, that's great
tty.debug(f"[BOOTSTRAP MODULE {module}] Try importing from Python")
if _python_import(module):
return
abstract_spec = abstract_spec or module
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"]):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_import(module, abstract_spec):
return
assert exception_handler, (
f"expected at least one exception to have been raised at this point: "
f"while bootstrapping {module}"
)
msg = f'cannot bootstrap the "{module}" Python module '
if abstract_spec:
msg += f'from spec "{abstract_spec}" '
if tty.is_debug():
msg += exception_handler.grouped_message(with_tracebacks=True)
else:
msg += exception_handler.grouped_message(with_tracebacks=False)
msg += "\nRun `spack --debug ...` for more detailed errors"
raise ImportError(msg)
def ensure_executables_in_path_or_raise(
executables: list,
abstract_spec: str,
cmd_check: Optional[Callable[[llnl.util.executable.Executable], bool]] = None,
):
"""Ensure that some executables are in path or raise.
Args:
executables (list): list of executables to be searched in the PATH,
in order. The function exits on the first one found.
abstract_spec (str): abstract spec that provides the executables
cmd_check (object): callable predicate that takes a
``llnl.util.executable.Executable`` command and validates it. Should return
``True`` if the executable is acceptable, ``False`` otherwise.
Can be used, e.g., to ensure a suitable version of the command before
accepting it for bootstrapping.
Raises:
RuntimeError: if the executables cannot be ensured to be in PATH
Return:
Executable object
"""
cmd = llnl.util.executable.which(*executables)
if cmd:
if not cmd_check or cmd_check(cmd):
return cmd
executables_str = ", ".join(executables)
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"]):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_search_path(executables, abstract_spec):
# Additional environment variables needed
concrete_spec, cmd = (
current_bootstrapper.last_search["spec"],
current_bootstrapper.last_search["command"],
)
env_mods = llnl.util.envmod.EnvironmentModifications()
for dep in concrete_spec.traverse(
root=True, order="post", deptype=("link", "run")
):
env_mods.extend(
spack.user_environment.environment_modifications_for_spec(
dep, set_package_py_globals=False
)
)
cmd.add_default_envmod(env_mods)
return cmd
assert exception_handler, (
f"expected at least one exception to have been raised at this point: "
f"while bootstrapping {executables_str}"
)
msg = f"cannot bootstrap any of the {executables_str} executables "
if abstract_spec:
msg += f'from spec "{abstract_spec}" '
if tty.is_debug():
msg += exception_handler.grouped_message(with_tracebacks=True)
else:
msg += exception_handler.grouped_message(with_tracebacks=False)
msg += "\nRun `spack --debug ...` for more detailed errors"
raise RuntimeError(msg)
def _add_externals_if_missing():
search_list = [
# clingo
spack.repo.path.get_pkg_class("cmake"),
spack.repo.path.get_pkg_class("bison"),
# GnuPG
spack.repo.path.get_pkg_class("gawk"),
]
if IS_WINDOWS:
search_list.append(spack.repo.path.get_pkg_class("winbison"))
detected_packages = spack.detection.by_executable(search_list)
spack.detection.update_configuration(detected_packages, scope="bootstrap")
def clingo_root_spec():
"""Return the root spec used to bootstrap clingo"""
return _root_spec("clingo-bootstrap@spack+python")
def ensure_clingo_importable_or_raise():
"""Ensure that the clingo module is available for import."""
ensure_module_importable_or_raise(module="clingo", abstract_spec=clingo_root_spec())
def gnupg_root_spec():
"""Return the root spec used to bootstrap GnuPG"""
return _root_spec("gnupg@2.3:")
def ensure_gpg_in_path_or_raise():
"""Ensure gpg or gpg2 are in the PATH or raise."""
return ensure_executables_in_path_or_raise(
executables=["gpg2", "gpg"], abstract_spec=gnupg_root_spec()
)
def patchelf_root_spec():
"""Return the root spec used to bootstrap patchelf"""
# 0.13.1 is the last version not to require C++17.
return _root_spec("patchelf@0.13.1:")
def verify_patchelf(patchelf):
"""Older patchelf versions can produce broken binaries, so we
verify the version here.
Arguments:
patchelf (llnl.util.executable.Executable): patchelf executable
"""
out = patchelf("--version", output=str, error=os.devnull, fail_on_error=False).strip()
if patchelf.returncode != 0:
return False
parts = out.split(" ")
if len(parts) < 2:
return False
try:
version = spack.version.Version(parts[1])
except ValueError:
return False
return version >= spack.version.Version("0.13.1")
def ensure_patchelf_in_path_or_raise():
"""Ensure patchelf is in the PATH or raise."""
# The old concretizer is not smart and we're doing its job: if the latest patchelf
# does not concretize because the compiler doesn't support C++17, we try to
# concretize again with an upper bound (@0.13.1:0.13).
try:
return ensure_executables_in_path_or_raise(
executables=["patchelf"], abstract_spec=patchelf_root_spec(), cmd_check=verify_patchelf
)
except RuntimeError:
return ensure_executables_in_path_or_raise(
executables=["patchelf"],
abstract_spec=_root_spec("patchelf@0.13.1:0.13"),
cmd_check=verify_patchelf,
)
def ensure_core_dependencies():
"""Ensure the presence of all the core dependencies."""
if sys.platform.lower() == "linux":
ensure_patchelf_in_path_or_raise()
if not IS_WINDOWS:
ensure_gpg_in_path_or_raise()
ensure_clingo_importable_or_raise()
def all_core_root_specs():
"""Return a list of all the core root specs that may be used to bootstrap Spack"""
return [clingo_root_spec(), gnupg_root_spec(), patchelf_root_spec()]
def bootstrapping_sources(scope: Optional[str] = None):
"""Return the list of configured sources of software for bootstrapping Spack
Args:
scope: if a valid configuration scope is given, return the
list only from that scope
"""
source_configs = spack.config.get("bootstrap:sources", default=None, scope=scope)
source_configs = source_configs or []
list_of_sources = []
for entry in source_configs:
current = copy.copy(entry)
metadata_dir = spack.util.path.canonicalize_path(entry["metadata"])
metadata_yaml = os.path.join(metadata_dir, METADATA_YAML_FILENAME)
with open(metadata_yaml, encoding="utf-8") as stream:
current.update(spack.util.spack_yaml.load(stream))
list_of_sources.append(current)
return list_of_sources
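A sketch of how a new bootstrapping method would plug into the registry defined at the top of this file (the class and `conf` values are hypothetical; the decorator and the `name`/`info`/`metadata` keys come from the code above):

```python
from spack.bootstrap.core import Bootstrapper, bootstrapper, create_bootstrapper

@bootstrapper(bootstrapper_type="noop")
class NoOpBootstrapper(Bootstrapper):
    """Hypothetical source that never provides anything."""

    def try_import(self, module, abstract_spec_str):
        return False

    def try_search_path(self, executables, abstract_spec_str):
        return False

# The shape of "conf" mirrors what bootstrapping_sources() assembles.
conf = {"name": "noop", "info": {"url": "https://example.com"}, "metadata": "."}
assert isinstance(create_bootstrapper(conf), NoOpBootstrapper)
```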

View File

@@ -0,0 +1,191 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Bootstrap non-core Spack dependencies from an environment."""
import glob
import hashlib
import os
import pathlib
import sys
import warnings
import archspec.cpu
import llnl.util.executable
from llnl.util import tty
import spack.build_environment
import spack.environment
import spack.spec
import spack.tengine
import spack.util.path
from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path
class BootstrapEnvironment(spack.environment.Environment):
"""Environment to install dependencies of Spack for a given interpreter and architecture"""
@classmethod
def spack_dev_requirements(cls):
"""Spack development requirements"""
return [
isort_root_spec(),
mypy_root_spec(),
black_root_spec(),
flake8_root_spec(),
pytest_root_spec(),
]
@classmethod
def environment_root(cls):
"""Environment root directory"""
bootstrap_root_path = root_path()
python_part = spec_for_current_python().replace("@", "")
arch_part = archspec.cpu.host().family
interpreter_part = hashlib.md5(sys.exec_prefix.encode()).hexdigest()[:5]
environment_dir = f"{python_part}-{arch_part}-{interpreter_part}"
return pathlib.Path(
spack.util.path.canonicalize_path(
os.path.join(bootstrap_root_path, "environments", environment_dir)
)
)
@classmethod
def view_root(cls):
"""Location of the view"""
return cls.environment_root().joinpath("view")
@classmethod
def pythonpaths(cls):
"""Paths to be added to sys.path or PYTHONPATH"""
python_dir_part = f"python{'.'.join(str(x) for x in sys.version_info[:2])}"
glob_expr = str(cls.view_root().joinpath("**", python_dir_part, "**"))
result = glob.glob(glob_expr)
if not result:
msg = f"Cannot find any Python path in {cls.view_root()}"
warnings.warn(msg)
return result
@classmethod
def bin_dirs(cls):
"""Paths to be added to PATH"""
return [cls.view_root().joinpath("bin")]
@classmethod
def spack_yaml(cls):
"""Environment spack.yaml file"""
return cls.environment_root().joinpath("spack.yaml")
def __init__(self):
if not self.spack_yaml().exists():
self._write_spack_yaml_file()
super().__init__(self.environment_root())
def update_installations(self):
"""Update the installations of this environment.
The update is done using a depfile on Linux and macOS, and using the ``install_all``
method of environments on Windows.
"""
with tty.SuppressOutput(msg_enabled=False, warn_enabled=False):
specs = self.concretize()
if specs:
colorized_specs = [
spack.spec.Spec(x).cformat("{name}{@version}")
for x in self.spack_dev_requirements()
]
tty.msg(f"[BOOTSTRAPPING] Installing dependencies ({', '.join(colorized_specs)})")
self.write(regenerate=False)
if sys.platform == "win32":
self.install_all()
else:
self._install_with_depfile()
self.write(regenerate=True)
def update_syspath_and_environ(self):
"""Update ``sys.path`` and the PATH, PYTHONPATH environment variables to point to
the environment view.
"""
# Make minimal modifications to sys.path and environment variables. In particular,
# keep PYTHONPATH / sys.path as small as possible, since their size may impact
# the performance of the current interpreter
sys.path.extend(self.pythonpaths())
os.environ["PATH"] = os.pathsep.join(
[str(x) for x in self.bin_dirs()] + os.environ.get("PATH", "").split(os.pathsep)
)
os.environ["PYTHONPATH"] = os.pathsep.join(
os.environ.get("PYTHONPATH", "").split(os.pathsep)
+ [str(x) for x in self.pythonpaths()]
)
def _install_with_depfile(self):
spackcmd = llnl.util.executable.which("spack")
spackcmd(
"-e",
str(self.environment_root()),
"env",
"depfile",
"-o",
str(self.environment_root().joinpath("Makefile")),
)
make = llnl.util.executable.which("make")
kwargs = {}
if not tty.is_debug():
kwargs = {"output": os.devnull, "error": os.devnull}
make(
"-C",
str(self.environment_root()),
"-j",
str(spack.build_environment.determine_number_of_jobs(parallel=True)),
**kwargs,
)
def _write_spack_yaml_file(self):
tty.msg(
"[BOOTSTRAPPING] Spack has missing dependencies, creating a bootstrapping environment"
)
env = spack.tengine.make_environment()
template = env.get_template("bootstrap/spack.yaml")
context = {
"python_spec": spec_for_current_python(),
"python_prefix": sys.exec_prefix,
"architecture": archspec.cpu.host().family,
"environment_path": self.environment_root(),
"environment_specs": self.spack_dev_requirements(),
"store_path": store_path(),
}
self.environment_root().mkdir(parents=True, exist_ok=True)
self.spack_yaml().write_text(template.render(context), encoding="utf-8")
def isort_root_spec():
"""Return the root spec used to bootstrap isort"""
return _root_spec("py-isort@4.3.5:")
def mypy_root_spec():
"""Return the root spec used to bootstrap mypy"""
return _root_spec("py-mypy@0.900:")
def black_root_spec():
"""Return the root spec used to bootstrap black"""
return _root_spec("py-black")
def flake8_root_spec():
"""Return the root spec used to bootstrap flake8"""
return _root_spec("py-flake8")
def pytest_root_spec():
"""Return the root spec used to bootstrap flake8"""
return _root_spec("py-pytest")
def ensure_environment_dependencies():
"""Ensure Spack dependencies from the bootstrap environment are installed and ready to use"""
with BootstrapEnvironment() as env:
env.update_installations()
env.update_syspath_and_environ()
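A sketch of where the environment lands on disk (hypothetical paths; the five-character digest depends on `sys.exec_prefix`):

```python
from spack.bootstrap.environment import BootstrapEnvironment

# For CPython 3.10 on an x86_64 Linux host this prints something like
#   ~/.spack/bootstrap/environments/python3.10-x86_64-ab12c
#   ~/.spack/bootstrap/environments/python3.10-x86_64-ab12c/spack.yaml
print(BootstrapEnvironment.environment_root())
print(BootstrapEnvironment.spack_yaml())
```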

View File

@@ -0,0 +1,169 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Query the status of bootstrapping on this machine"""
import platform
import llnl.util.executable
from ._common import _executables_in_store, _python_import, _try_import_from_store
from .config import ensure_bootstrap_configuration
from .core import clingo_root_spec, patchelf_root_spec
from .environment import (
BootstrapEnvironment,
black_root_spec,
flake8_root_spec,
isort_root_spec,
mypy_root_spec,
pytest_root_spec,
)
def _required_system_executable(exes, msg):
"""Search for an executable is the system path only."""
if isinstance(exes, str):
exes = (exes,)
if llnl.util.executable.which_string(*exes):
return True, None
return False, msg
def _required_executable(exes, query_spec, msg):
"""Search for an executable in the system path or in the bootstrap store."""
if isinstance(exes, str):
exes = (exes,)
if llnl.util.executable.which_string(*exes) or _executables_in_store(exes, query_spec):
return True, None
return False, msg
def _required_python_module(module, query_spec, msg):
"""Check if a Python module is available in the current interpreter or
if it can be loaded from the bootstrap store
"""
if _python_import(module) or _try_import_from_store(module, query_spec):
return True, None
return False, msg
def _missing(name, purpose, system_only=True):
"""Message to be printed if an executable is not found"""
msg = '[{2}] MISSING "{0}": {1}'
if not system_only:
return msg.format(name, purpose, "@*y{{B}}")
return msg.format(name, purpose, "@*y{{-}}")
def _core_requirements():
_core_system_exes = {
"make": _missing("make", "required to build software from sources"),
"patch": _missing("patch", "required to patch source code before building"),
"bash": _missing("bash", "required for Spack compiler wrapper"),
"tar": _missing("tar", "required to manage code archives"),
"gzip": _missing("gzip", "required to compress/decompress code archives"),
"unzip": _missing("unzip", "required to compress/decompress code archives"),
"bzip2": _missing("bzip2", "required to compress/decompress code archives"),
"git": _missing("git", "required to fetch/manage git repositories"),
}
if platform.system().lower() == "linux":
_core_system_exes["xz"] = _missing("xz", "required to compress/decompress code archives")
# Executables that are not bootstrapped yet
result = [_required_system_executable(exe, msg) for exe, msg in _core_system_exes.items()]
# Python modules
result.append(
_required_python_module(
"clingo", clingo_root_spec(), _missing("clingo", "required to concretize specs", False)
)
)
return result
def _buildcache_requirements():
_buildcache_exes = {
"file": _missing("file", "required to analyze files for buildcaches"),
("gpg2", "gpg"): _missing("gpg2", "required to sign/verify buildcaches", False),
}
if platform.system().lower() == "darwin":
_buildcache_exes["otool"] = _missing("otool", "required to relocate binaries")
# Executables that are not bootstrapped yet
result = [_required_system_executable(exe, msg) for exe, msg in _buildcache_exes.items()]
if platform.system().lower() == "linux":
result.append(
_required_executable(
"patchelf",
patchelf_root_spec(),
_missing("patchelf", "required to relocate binaries", False),
)
)
return result
def _optional_requirements():
_optional_exes = {
"zstd": _missing("zstd", "required to compress/decompress code archives"),
"svn": _missing("svn", "required to manage subversion repositories"),
"hg": _missing("hg", "required to manage mercurial repositories"),
}
# Executables that are not bootstrapped yet
result = [_required_system_executable(exe, msg) for exe, msg in _optional_exes.items()]
return result
def _development_requirements():
# Ensure we trigger environment modifications if we have an environment
if BootstrapEnvironment.spack_yaml().exists():
with BootstrapEnvironment() as env:
env.update_syspath_and_environ()
return [
_required_executable(
"isort", isort_root_spec(), _missing("isort", "required for style checks", False)
),
_required_executable(
"mypy", mypy_root_spec(), _missing("mypy", "required for style checks", False)
),
_required_executable(
"flake8", flake8_root_spec(), _missing("flake8", "required for style checks", False)
),
_required_executable(
"black", black_root_spec(), _missing("black", "required for code formatting", False)
),
_required_python_module(
"pytest", pytest_root_spec(), _missing("pytest", "required to run unit-test", False)
),
]
def status_message(section):
"""Return a status message to be printed to screen that refers to the
section passed as argument and a bool which is True if there are missing
dependencies.
Args:
section (str): either 'core' or 'buildcache' or 'optional' or 'develop'
"""
pass_token, fail_token = "@*g{[PASS]}", "@*r{[FAIL]}"
# Each entry contains the header of the section and a list of requirements
spack_sections = {
"core": ("{0} @*{{Core Functionalities}}", _core_requirements),
"buildcache": ("{0} @*{{Binary packages}}", _buildcache_requirements),
"optional": ("{0} @*{{Optional Features}}", _optional_requirements),
"develop": ("{0} @*{{Development Dependencies}}", _development_requirements),
}
msg, required_software = spack_sections[section]
with ensure_bootstrap_configuration():
missing_software = False
for found, err_msg in required_software():
if not found:
missing_software = True
msg += "\n " + err_msg
msg += "\n"
msg = msg.format(pass_token if not missing_software else fail_token)
return msg, missing_software
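A sketch of a full status report, along the lines of what a `spack bootstrap status` command would print (the returned messages carry Spack color markup such as `@*g{[PASS]}`, hence the `colorize` call):

```python
from llnl.util.tty.color import colorize

from spack.bootstrap.status import status_message

for section in ("core", "buildcache", "optional", "develop"):
    msg, _missing = status_message(section)
    print(colorize(msg))
```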

View File

@@ -33,25 +33,36 @@
calls you can make from within the install() function.
"""
import inspect
import io
import multiprocessing
import os
import re
import shutil
import sys
import traceback
import types
from six import StringIO
from typing import List, Tuple
import llnl.util.tty as tty
from llnl.util.filesystem import install, install_tree, mkdirp
from llnl.util.envmod import (
EnvironmentModifications,
env_flag,
filter_system_paths,
get_path,
inspect_path,
is_system_path,
system_dirs,
validate,
)
from llnl.util.executable import Executable
from llnl.util.lang import dedupe
from llnl.util.string import plural
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
from llnl.util.tty.log import MultiProcessFd
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.build_systems.python
import spack.builder
import spack.config
import spack.install_test
@@ -64,25 +75,12 @@
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.path
import spack.util.pattern
from spack.error import NoHeadersError, NoLibrariesError
from spack.installer import InstallError
from spack.util.cpus import cpus_available
from spack.util.environment import (
EnvironmentModifications,
env_flag,
filter_system_paths,
get_path,
inspect_path,
is_system_path,
system_dirs,
validate,
)
from spack.util.executable import Executable
from spack.util.log_parse import make_log_context, parse_log_events
from spack.util.module_cmd import load_module, module, path_from_modules
from spack.util.string import plural
#
# This can be set by the user to globally disable parallel builds.
@@ -285,6 +283,23 @@ def clean_environment():
return env
def _add_werror_handling(keep_werror, env):
keep_flags = set()
# List of (flag, replacement) pairs
replace_flags: List[Tuple[str, str]] = []
if keep_werror == "all":
keep_flags.add("-Werror*")
else:
if keep_werror == "specific":
keep_flags.add("-Werror-*")
keep_flags.add("-Werror=*")
# This extra case is to handle -Werror-implicit-function-declaration
replace_flags.append(("-Werror-", "-Wno-error="))
replace_flags.append(("-Werror", "-Wno-error"))
env.set("SPACK_COMPILER_FLAGS_KEEP", "|".join(keep_flags))
env.set("SPACK_COMPILER_FLAGS_REPLACE", " ".join(["|".join(item) for item in replace_flags]))
def set_compiler_environment_variables(pkg, env):
assert pkg.spec.concrete
compiler = pkg.compiler
@@ -331,6 +346,13 @@ def set_compiler_environment_variables(pkg, env):
env.set("SPACK_DTAGS_TO_STRIP", compiler.disable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", compiler.enable_new_dtags)
if pkg.keep_werror is not None:
keep_werror = pkg.keep_werror
else:
keep_werror = spack.config.get("config:flags:keep_werror")
_add_werror_handling(keep_werror, env)
# Set the target parameters that the compiler will add
# Don't set on cray platform because the targeting module handles this
if spec.satisfies("platform=cray"):
@@ -353,10 +375,8 @@ def set_compiler_environment_variables(pkg, env):
if isinstance(pkg.flag_handler, types.FunctionType):
handler = pkg.flag_handler
else:
if sys.version_info >= (3, 0):
handler = pkg.flag_handler.__func__
else:
handler = pkg.flag_handler.im_func
handler = pkg.flag_handler.__func__
injf, envf, bsf = handler(pkg, flag, spec.compiler_flags[flag][:])
inject_flags[flag] = injf or []
env_flags[flag] = envf or []
@@ -542,14 +562,18 @@ def determine_number_of_jobs(
return min(max_cpus, config_default)
def _set_variables_for_single_module(pkg, module):
"""Helper function to set module variables for single module."""
def set_module_variables_for_package(pkg):
"""Populate the Python module of a package with some useful global names.
This makes things easier for package writers.
"""
# Put a marker on this module so that it won't execute the body of this
# function again, since it is not needed
marker = "_set_run_already_called"
if getattr(module, marker, False):
if getattr(pkg.module, marker, False):
return
module = ModuleChangePropagator(pkg)
jobs = determine_number_of_jobs(parallel=pkg.parallel)
m = module
@@ -560,15 +584,13 @@ def _set_variables_for_single_module(pkg, module):
m.gmake = MakeExecutable("gmake", jobs)
m.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# easy shortcut to os.environ
m.env = os.environ
# Find the configure script in the archive path
# Don't use which for this; we want to find it in the current dir.
m.configure = Executable("./configure")
if sys.platform == "win32":
m.nmake = Executable("nmake")
m.msbuild = Executable("msbuild")
# Standard CMake arguments
m.std_cmake_args = spack.build_systems.cmake.CMakeBuilder.std_args(pkg)
m.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
@@ -581,21 +603,6 @@ def _set_variables_for_single_module(pkg, module):
m.spack_f77 = os.path.join(link_dir, pkg.compiler.link_paths["f77"])
m.spack_fc = os.path.join(link_dir, pkg.compiler.link_paths["fc"])
# Emulate some shell commands for convenience
m.pwd = os.getcwd
m.cd = os.chdir
m.mkdir = os.mkdir
m.makedirs = os.makedirs
m.remove = os.remove
m.removedirs = os.removedirs
m.symlink = symlink
m.mkdirp = mkdirp
m.install = install
m.install_tree = install_tree
m.rmtree = shutil.rmtree
m.move = shutil.move
# Useful directories within the prefix are encapsulated in
# a Prefix object.
m.prefix = pkg.prefix
@@ -616,20 +623,7 @@ def static_to_shared_library(static_lib, shared_lib=None, **kwargs):
# Put a marker on this module so that it won't execute the body of this
# function again, since it is not needed
setattr(m, marker, True)
def set_module_variables_for_package(pkg):
"""Populate the module scope of install() with some useful functions.
This makes things easier for package writers.
"""
# If a user makes their own package repo, e.g.
# spack.pkg.mystuff.libelf.Libelf, and they inherit from an existing class
# like spack.pkg.original.libelf.Libelf, then set the module variables
# for both classes so the parent class can still use them if it gets
# called. parent_class_modules includes pkg.module.
modules = parent_class_modules(pkg.__class__)
for mod in modules:
_set_variables_for_single_module(pkg, mod)
module.propagate_changes_to_mro()
def _static_to_shared_library(arch, compiler, static_lib, shared_lib=None, **kwargs):
@@ -739,25 +733,6 @@ def get_rpaths(pkg):
return list(dedupe(filter_system_paths(rpaths)))
def parent_class_modules(cls):
"""
Get list of superclass modules that descend from spack.package_base.PackageBase
Includes cls.__module__
"""
if not issubclass(cls, spack.package_base.PackageBase) or issubclass(
spack.package_base.PackageBase, cls
):
return []
result = []
module = sys.modules.get(cls.__module__)
if module:
result = [module]
for c in cls.__bases__:
result.extend(parent_class_modules(c))
return result
def load_external_modules(pkg):
"""Traverse a package's spec DAG and load any external modules.
@@ -978,22 +953,9 @@ def add_modifications_for_dep(dep):
if set_package_py_globals:
set_module_variables_for_package(dpkg)
# Allow dependencies to modify the module
# Get list of modules that may need updating
modules = []
for cls in inspect.getmro(type(spec.package)):
module = cls.module
if module == spack.package_base:
break
modules.append(module)
# Execute changes as if on a single module
# copy dict to ensure prior changes are available
changes = spack.util.pattern.Bunch()
dpkg.setup_dependent_package(changes, spec)
for module in modules:
module.__dict__.update(changes.__dict__)
current_module = ModuleChangePropagator(spec.package)
dpkg.setup_dependent_package(current_module, spec)
current_module.propagate_changes_to_mro()
if context == "build":
builder = spack.builder.create(dpkg)
@@ -1271,6 +1233,8 @@ def make_stack(tb, stack=None):
obj = frame.f_locals["self"]
if isinstance(obj, spack.package_base.PackageBase):
break
else:
return None
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
@@ -1339,7 +1303,7 @@ class ChildError(InstallError):
# List of errors considered "build errors", for which we'll show log
# context instead of Python context.
build_errors = [("spack.util.executable", "ProcessError")]
build_errors = [("llnl.util.executable", "ProcessError")]
def __init__(self, msg, module, classname, traceback_string, log_name, log_type, context):
super(ChildError, self).__init__(msg)
@@ -1352,7 +1316,7 @@ def __init__(self, msg, module, classname, traceback_string, log_name, log_type,
@property
def long_message(self):
out = StringIO()
out = io.StringIO()
out.write(self._long_message if self._long_message else "")
have_log = self.log_name and os.path.exists(self.log_name)
@@ -1437,3 +1401,51 @@ def write_log_summary(out, log_type, log, last=None):
# If no errors are found but warnings are, display warnings
out.write("\n%s found in %s log:\n" % (plural(nwar, "warning"), log_type))
out.write(make_log_context(warnings))
class ModuleChangePropagator:
"""Wrapper class to accept changes to a package.py Python module, and propagate them in the
MRO of the package.
It is mainly used as a substitute of the ``package.py`` module, when calling the
"setup_dependent_package" function during build environment setup.
"""
_PROTECTED_NAMES = ("package", "current_module", "modules_in_mro", "_set_attributes")
def __init__(self, package):
self._set_self_attributes("package", package)
self._set_self_attributes("current_module", package.module)
#: Modules for the classes in the MRO up to PackageBase
modules_in_mro = []
for cls in inspect.getmro(type(package)):
module = cls.module
if module == self.current_module:
continue
if module == spack.package_base:
break
modules_in_mro.append(module)
self._set_self_attributes("modules_in_mro", modules_in_mro)
self._set_self_attributes("_set_attributes", {})
def _set_self_attributes(self, key, value):
super().__setattr__(key, value)
def __getattr__(self, item):
return getattr(self.current_module, item)
def __setattr__(self, key, value):
if key in ModuleChangePropagator._PROTECTED_NAMES:
msg = f'Cannot set attribute "{key}" in ModuleChangePropagator'
raise AttributeError(msg)
setattr(self.current_module, key, value)
self._set_attributes[key] = value
def propagate_changes_to_mro(self):
for module_in_mro in self.modules_in_mro:
module_in_mro.__dict__.update(self._set_attributes)
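A sketch of how this class is meant to be driven (hedged: `pkg` stands for some concrete `PackageBase` instance, and the dependency hook is simplified):

```python
from spack.build_environment import ModuleChangePropagator

def hypothetical_setup_dependent_package(module, dependent_spec):
    # Attribute writes land on the package's own module and are recorded...
    module.injected_tool = "value provided by a dependency"

changes = ModuleChangePropagator(pkg)  # pkg: a concrete package instance
hypothetical_setup_dependent_package(changes, pkg.spec)
# ...then replayed on the module of every superclass up to PackageBase.
changes.propagate_changes_to_mro()
```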

View File

@@ -3,30 +3,30 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import six
from typing import List
import llnl.util.lang
import spack.builder
import spack.installer
import spack.relocate
import spack.spec
import spack.store
def sanity_check_prefix(builder):
def sanity_check_prefix(builder: spack.builder.Builder):
"""Check that specific directories and files are created after installation.
The files to be checked are in the ``sanity_check_is_file`` attribute of the
package object, while the directories are in the ``sanity_check_is_dir``.
Args:
builder (spack.builder.Builder): builder that installed the package
builder: builder that installed the package
"""
pkg = builder.pkg
def check_paths(path_list, filetype, predicate):
if isinstance(path_list, six.string_types):
if isinstance(path_list, str):
path_list = [path_list]
for path in path_list:
@@ -45,7 +45,7 @@ def check_paths(path_list, filetype, predicate):
raise spack.installer.InstallError(msg.format(pkg.name))
def apply_macos_rpath_fixups(builder):
def apply_macos_rpath_fixups(builder: spack.builder.Builder):
"""On Darwin, make installed libraries more easily relocatable.
Some build systems (handrolled, autotools, makefiles) can set their own
@@ -57,20 +57,22 @@ def apply_macos_rpath_fixups(builder):
packages) that do not install relocatable libraries by default.
Args:
builder (spack.builder.Builder): builder that installed the package
builder: builder that installed the package
"""
spack.relocate.fixup_macos_rpaths(builder.spec)
def ensure_build_dependencies_or_raise(spec, dependencies, error_msg):
def ensure_build_dependencies_or_raise(
spec: spack.spec.Spec, dependencies: List[spack.spec.Spec], error_msg: str
):
"""Ensure that some build dependencies are present in the concrete spec.
If not, raise a RuntimeError with a helpful error message.
Args:
spec (spack.spec.Spec): concrete spec to be checked.
dependencies (list of spack.spec.Spec): list of abstract specs to be satisfied
error_msg (str): brief error message to be prepended to a longer description
spec: concrete spec to be checked.
dependencies: list of abstract specs to be satisfied
error_msg: brief error message to be prepended to a longer description
Raises:
RuntimeError: when the required build dependencies are not found
@@ -85,33 +87,35 @@ def ensure_build_dependencies_or_raise(spec, dependencies, error_msg):
# Raise an exception on missing deps.
msg = (
"{0}: missing dependencies: {1}.\n\nPlease add "
"the following lines to the package:\n\n".format(error_msg, ", ".join(missing_deps))
"the following lines to the package:\n\n".format(
error_msg, ", ".join(str(d) for d in missing_deps)
)
)
for dep in missing_deps:
msg += " depends_on('{0}', type='build', when='@{1} {2}')\n".format(
msg += ' depends_on("{0}", type="build", when="@{1} {2}")\n'.format(
dep, spec.version, "build_system=autotools"
)
msg += "\nUpdate the version (when='@{0}') as needed.".format(spec.version)
msg += '\nUpdate the version (when="@{0}") as needed.'.format(spec.version)
raise RuntimeError(msg)
def execute_build_time_tests(builder):
def execute_build_time_tests(builder: spack.builder.Builder):
"""Execute the build-time tests prescribed by builder.
Args:
builder (Builder): builder prescribing the test callbacks. The name of the callbacks is
builder: builder prescribing the test callbacks. The name of the callbacks is
stored as a list of strings in the ``build_time_test_callbacks`` attribute.
"""
builder.pkg.run_test_callbacks(builder, builder.build_time_test_callbacks, "build")
def execute_install_time_tests(builder):
def execute_install_time_tests(builder: spack.builder.Builder):
"""Execute the install-time tests prescribed by builder.
Args:
builder (Builder): builder prescribing the test callbacks. The name of the callbacks is
builder: builder prescribing the test callbacks. The name of the callbacks is
stored as a list of strings in the ``install_time_test_callbacks`` attribute.
"""
builder.pkg.run_test_callbacks(builder, builder.install_time_test_callbacks, "install")
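For context, these helpers are not called directly: build systems attach them to phases with `spack.builder.run_after`, as the Generic builder does for the macOS rpath fixups later in this changeset. A minimal sketch (the import path assumes the `_checks` module location shown here):

```python
import spack.builder
from spack.build_systems._checks import (
    apply_macos_rpath_fixups,
    execute_build_time_tests,
)

# Run build-time test callbacks after the "build" phase...
spack.builder.run_after("build")(execute_build_time_tests)
# ...and make libraries relocatable after "install", but only on macOS.
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
```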

View File

@@ -2,11 +2,11 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.executable
import llnl.util.filesystem as fs
import spack.directives
import spack.package_base
import spack.util.executable
from .autotools import AutotoolsBuilder, AutotoolsPackage
@@ -22,7 +22,7 @@ def configure(self, pkg, spec, prefix):
prezip = spec["aspell"].prefix.bin.prezip
destdir = prefix
sh = spack.util.executable.which("sh")
sh = llnl.util.executable.which("sh")
sh(
"./configure",
"--vars",

View File

@@ -7,10 +7,11 @@
import os.path
import stat
import subprocess
from typing import List # novm
from typing import List
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.executable import Executable
import spack.build_environment
import spack.builder
@@ -18,7 +19,6 @@
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from spack.operating_systems.mac_os import macos_version
from spack.util.executable import Executable
from spack.version import Version
from ._checks import (
@@ -138,7 +138,7 @@ class AutotoolsBuilder(BaseBuilder):
patch_libtool = True
#: Targets for ``make`` during the :py:meth:`~.AutotoolsBuilder.build` phase
build_targets = [] # type: List[str]
build_targets: List[str] = []
#: Targets for ``make`` during the :py:meth:`~.AutotoolsBuilder.install` phase
install_targets = ["install"]
@@ -152,7 +152,7 @@ class AutotoolsBuilder(BaseBuilder):
force_autoreconf = False
#: Options to be passed to autoreconf when using the default implementation
autoreconf_extra_args = [] # type: List[str]
autoreconf_extra_args: List[str] = []
#: If False deletes all the .la files in the prefix folder after the installation.
#: If True instead it installs them.

View File

@@ -34,22 +34,22 @@ class CachedCMakeBuilder(CMakeBuilder):
#: Phases of a Cached CMake package
#: Note: the initconfig phase is used for developer builds as a final phase to stop on
phases = ("initconfig", "cmake", "build", "install") # type: Tuple[str, ...]
phases: Tuple[str, ...] = ("initconfig", "cmake", "build", "install")
#: Names associated with package methods in the old build-system format
legacy_methods = CMakeBuilder.legacy_methods + (
legacy_methods: Tuple[str, ...] = CMakeBuilder.legacy_methods + (
"initconfig_compiler_entries",
"initconfig_mpi_entries",
"initconfig_hardware_entries",
"std_initconfig_entries",
"initconfig_package_entries",
) # type: Tuple[str, ...]
)
#: Names associated with package attributes in the old build-system format
legacy_attributes = CMakeBuilder.legacy_attributes + (
legacy_attributes: Tuple[str, ...] = CMakeBuilder.legacy_attributes + (
"cache_name",
"cache_path",
) # type: Tuple[str, ...]
)
@property
def cache_name(self):
@@ -205,13 +205,7 @@ def initconfig_hardware_entries(self):
entries.append(cmake_cache_path("CUDA_TOOLKIT_ROOT_DIR", cudatoolkitdir))
cudacompiler = "${CUDA_TOOLKIT_ROOT_DIR}/bin/nvcc"
entries.append(cmake_cache_path("CMAKE_CUDA_COMPILER", cudacompiler))
if spec.satisfies("^mpi"):
entries.append(cmake_cache_path("CMAKE_CUDA_HOST_COMPILER", "${MPI_CXX_COMPILER}"))
else:
entries.append(
cmake_cache_path("CMAKE_CUDA_HOST_COMPILER", "${CMAKE_CXX_COMPILER}")
)
entries.append(cmake_cache_path("CMAKE_CUDA_HOST_COMPILER", "${CMAKE_CXX_COMPILER}"))
return entries

View File

@@ -2,6 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections.abc
import inspect
import os
import platform
@@ -9,15 +10,11 @@
import sys
from typing import List, Tuple
import six
import llnl.util.filesystem as fs
from llnl.util.compat import Sequence
import spack.build_environment
import spack.builder
import spack.package_base
import spack.util.path
from spack.directives import build_system, depends_on, variant
from spack.multimethod import when
@@ -155,13 +152,13 @@ class CMakeBuilder(BaseBuilder):
"""
#: Phases of a CMake package
phases = ("cmake", "build", "install") # type: Tuple[str, ...]
phases: Tuple[str, ...] = ("cmake", "build", "install")
#: Names associated with package methods in the old build-system format
legacy_methods = ("cmake_args", "check") # type: Tuple[str, ...]
legacy_methods: Tuple[str, ...] = ("cmake_args", "check")
#: Names associated with package attributes in the old build-system format
legacy_attributes = (
legacy_attributes: Tuple[str, ...] = (
"generator",
"build_targets",
"install_targets",
@@ -171,7 +168,7 @@ class CMakeBuilder(BaseBuilder):
"std_cmake_args",
"build_dirname",
"build_directory",
) # type: Tuple[str, ...]
)
#: The build system generator to use.
#:
@@ -184,7 +181,7 @@ class CMakeBuilder(BaseBuilder):
generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
#: Targets to be used during the build phase
build_targets = [] # type: List[str]
build_targets: List[str] = []
#: Targets to be used during the install phase
install_targets = ["install"]
#: Callback names for build-time test
@@ -302,7 +299,7 @@ def define(cmake_var, value):
value = "ON" if value else "OFF"
else:
kind = "STRING"
if isinstance(value, Sequence) and not isinstance(value, six.string_types):
if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
value = ";".join(str(v) for v in value)
else:
value = str(value)
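
With `six` dropped, the string test reduces to plain `str`, and `Sequence` comes from the standard library. A self-contained sketch of the branch above (helper name hypothetical):

```python
import collections.abc

def to_cmake_string(value):
    # Strings are Sequences themselves, so exclude them before joining;
    # CMake encodes list values as ";"-separated strings
    if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
        return ";".join(str(v) for v in value)
    return str(value)

assert to_cmake_string(["foo", "bar"]) == "foo;bar"
assert to_cmake_string("on") == "on"
```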

View File

@@ -35,10 +35,10 @@ class GenericBuilder(BaseBuilder):
phases = ("install",)
#: Names associated with package methods in the old build-system format
legacy_methods = () # type: Tuple[str, ...]
legacy_methods: Tuple[str, ...] = ()
#: Names associated with package attributes in the old build-system format
legacy_attributes = ("archive_files",) # type: Tuple[str, ...]
legacy_attributes: Tuple[str, ...] = ("archive_files",)
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)

View File

@@ -13,7 +13,7 @@ class GNUMirrorPackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for GNU packages."""
#: Path of the package in a GNU mirror
gnu_mirror_path = None # type: Optional[str]
gnu_mirror_path: Optional[str] = None
#: List of GNU mirrors used by Spack
base_mirrors = [

View File

@@ -11,6 +11,8 @@
import xml.etree.ElementTree as ElementTree
import llnl.util.tty as tty
from llnl.util.envmod import EnvironmentModifications
from llnl.util.executable import Executable
from llnl.util.filesystem import (
HeaderList,
LibraryList,
@@ -25,8 +27,6 @@
import spack.error
from spack.build_environment import dso_suffix
from spack.package_base import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.util.prefix import Prefix
from spack.version import Version, ver

View File

@@ -4,11 +4,11 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import llnl.util.executable
from llnl.util.filesystem import find
import spack.builder
import spack.package_base
import spack.util.executable
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
@@ -41,11 +41,11 @@ class LuaPackage(spack.package_base.PackageBase):
@property
def lua(self):
return spack.util.executable.Executable(self.spec["lua-lang"].prefix.bin.lua)
return llnl.util.executable.Executable(self.spec["lua-lang"].prefix.bin.lua)
@property
def luarocks(self):
lr = spack.util.executable.Executable(self.spec["lua-lang"].prefix.bin.luarocks)
lr = llnl.util.executable.Executable(self.spec["lua-lang"].prefix.bin.luarocks)
return lr

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import List # novm
from typing import List
import llnl.util.filesystem as fs
@@ -77,7 +77,7 @@ class MakefileBuilder(BaseBuilder):
)
#: Targets for ``make`` during the :py:meth:`~.MakefileBuilder.build` phase
build_targets = [] # type: List[str]
build_targets: List[str] = []
#: Targets for ``make`` during the :py:meth:`~.MakefileBuilder.install` phase
install_targets = ["install"]

View File

@@ -3,12 +3,12 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.filesystem as fs
from llnl.util.executable import which
import spack.builder
import spack.package_base
from spack.directives import build_system, depends_on
from spack.multimethod import when
from spack.util.executable import which
from ._checks import BaseBuilder

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
from typing import List # novm
from typing import List
import llnl.util.filesystem as fs
@@ -95,7 +95,7 @@ class MesonBuilder(BaseBuilder):
"build_directory",
)
build_targets = [] # type: List[str]
build_targets: List[str] = []
install_targets = ["install"]
build_time_test_callbacks = ["check"]

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import List # novm
from typing import List
import llnl.util.filesystem as fs
@@ -72,7 +72,7 @@ class NMakeBuilder(BaseBuilder):
)
#: Targets for ``make`` during the :py:meth:`~.NMakeBuilder.build` phase
build_targets = [] # type: List[str]
build_targets: List[str] = []
#: Targets for ``make`` during the :py:meth:`~.NMakeBuilder.install` phase
install_targets = ["install"]

View File

@@ -8,10 +8,11 @@
import shutil
from os.path import basename, dirname, isdir
from llnl.util.envmod import EnvironmentModifications
from llnl.util.executable import Executable
from llnl.util.filesystem import find_headers, find_libraries, join_path
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.directives import conflicts, variant
from .generic import Package
@@ -25,6 +26,23 @@ class IntelOneApiPackage(Package):
# organization (e.g. University/Company).
redistribute_source = False
for c in [
"target=ppc64:",
"target=ppc64le:",
"target=aarch64:",
"platform=darwin:",
"platform=cray:",
"platform=windows:",
]:
conflicts(c, msg="This package is only available for x86_64 and Linux")
# Add variant to toggle environment modifications from vars.sh
variant(
"envmods",
default=True,
description="Toggles environment modifications",
)
@staticmethod
def update_description(cls):
"""Updates oneapi package descriptions with common text."""
@@ -103,11 +121,13 @@ def setup_run_environment(self, env):
$ source {prefix}/{component}/{version}/env/vars.sh
"""
env.extend(
EnvironmentModifications.from_sourcing_file(
join_path(self.component_prefix, "env", "vars.sh")
# Only if environment modifications are desired (default is +envmods)
if "+envmods" in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
join_path(self.component_prefix, "env", "vars.sh")
)
)
)
class IntelOneApiLibraryPackage(IntelOneApiPackage):
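
The hunk above interleaves the removed and added bodies of `setup_run_environment`, so here is a clean rendition of the resulting method: sourcing of `vars.sh` is now gated on the new `envmods` variant, which defaults to true (install with `~envmods` to opt out):

```python
def setup_run_environment(self, env):
    # Only if environment modifications are desired (default is +envmods)
    if "+envmods" in self.spec:
        env.extend(
            EnvironmentModifications.from_sourcing_file(
                join_path(self.component_prefix, "env", "vars.sh")
            )
        )
```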

View File

@@ -5,13 +5,13 @@
import inspect
import os
from llnl.util.executable import Executable
from llnl.util.filesystem import filter_file
import spack.builder
import spack.package_base
from spack.directives import build_system, extends
from spack.package_base import PackageBase
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_build_time_tests

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import glob
import inspect
import os
import re
@@ -16,6 +15,7 @@
import spack.builder
import spack.multimethod
import spack.package_base
import spack.spec
from spack.directives import build_system, depends_on, extends
from spack.error import NoHeadersError, NoLibrariesError, SpecError
from spack.version import Version
@@ -177,7 +177,7 @@ class PythonPackage(PythonExtension):
"""Specialized class for packages that are built using pip."""
#: Package name, version, and extension on PyPI
pypi = None # type: Optional[str]
pypi: Optional[str] = None
maintainers = ["adamjstewart", "pradyunsg"]
@@ -200,7 +200,7 @@ class PythonPackage(PythonExtension):
# package manually
depends_on("py-wheel", type="build")
py_namespace = None # type: Optional[str]
py_namespace: Optional[str] = None
@lang.classproperty
def homepage(cls):
@@ -219,13 +219,34 @@ def list_url(cls):
name = cls.pypi.split("/")[0]
return "https://pypi.org/simple/" + name + "/"
def update_external_dependencies(self):
"""
Ensure all external python packages have a python dependency.
If another package in the DAG depends on python, we use that
python for the dependency of the external. If not, we assume
that the external PythonPackage is installed into the same
directory as the python it depends on.
"""
# TODO: Include this in the solve, rather than instantiating post-concretization
if "python" not in self.spec:
if "python" in self.spec.root:
python = self.spec.root["python"]
else:
python = spack.spec.Spec("python")
repo = spack.repo.path.repo_for_pkg(python)
python.namespace = repo.namespace
python._mark_concrete()
python.external_path = self.prefix
self.spec.add_dependency_edge(python, ("build", "link", "run"))
@property
def headers(self):
"""Discover header files in platlib."""
# Headers may be in either location
include = self.prefix.join(self.include)
platlib = self.prefix.join(self.platlib)
include = self.prefix.join(self.spec["python"].package.include)
platlib = self.prefix.join(self.spec["python"].package.platlib)
headers = fs.find_all_headers(include) + fs.find_all_headers(platlib)
if headers:
@@ -234,29 +255,13 @@ def headers(self):
msg = "Unable to locate {} headers in {} or {}"
raise NoHeadersError(msg.format(self.spec.name, include, platlib))
@property
def include(self):
include = glob.glob(self.prefix.include.join("python*"))
if include:
return include[0]
return self.spec["python"].package.include
@property
def platlib(self):
for libname in ("lib", "lib64"):
platlib = glob.glob(self.prefix.join(libname).join("python*").join("site-packages"))
if platlib:
return platlib[0]
return self.spec["python"].package.platlib
@property
def libs(self):
"""Discover libraries in platlib."""
# Remove py- prefix in package name
library = "lib" + self.spec.name[3:].replace("-", "?")
root = self.prefix.join(self.platlib)
root = self.prefix.join(self.spec["python"].package.platlib)
for shared in [True, False]:
libs = fs.find_libraries(library, root, shared=shared, recursive=True)
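
The `library` pattern is derived from the package name by stripping the `py-` prefix and replacing hyphens with `?`, which `find_libraries` treats as a single-character wildcard. A quick worked example with a hypothetical package name:

```python
name = "py-ruamel-yaml"  # hypothetical py- package
library = "lib" + name[3:].replace("-", "?")
assert library == "libruamel?yaml"  # matches e.g. libruamel.yaml.so
```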

View File

@@ -81,6 +81,6 @@ def install(self, pkg, spec, prefix):
def check(self):
"""Search the Makefile for a ``check:`` target and runs it if found."""
with working_dir(self.build_directory):
self._if_make_target_execute("check")
self.pkg._if_make_target_execute("check")
spack.builder.run_after("build")(execute_build_time_tests)

View File

@@ -22,10 +22,10 @@ class RBuilder(GenericBuilder):
"""
#: Names associated with package methods in the old build-system format
legacy_methods = (
legacy_methods: Tuple[str, ...] = (
"configure_args",
"configure_vars",
) + GenericBuilder.legacy_methods # type: Tuple[str, ...]
) + GenericBuilder.legacy_methods
def configure_args(self):
"""Arguments to pass to install via ``--configure-args``."""
@@ -64,10 +64,10 @@ class RPackage(Package):
# package attributes that can be expanded to set the homepage, url,
# list_url, and git values
# For CRAN packages
cran = None # type: Optional[str]
cran: Optional[str] = None
# For Bioconductor packages
bioc = None # type: Optional[str]
bioc: Optional[str] = None
GenericBuilder = RBuilder

View File

@@ -8,13 +8,13 @@
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.envmod import env_flag
from llnl.util.executable import Executable, ProcessError
import spack.builder
from spack.build_environment import SPACK_NO_PARALLEL_MAKE, determine_number_of_jobs
from spack.directives import build_system, extends
from spack.package_base import PackageBase
from spack.util.environment import env_flag
from spack.util.executable import Executable, ProcessError
class RacketPackage(PackageBase):
@@ -34,7 +34,7 @@ class RacketPackage(PackageBase):
extends("racket", when="build_system=racket")
racket_name = None # type: Optional[str]
racket_name: Optional[str] = None
parallel = True
@lang.classproperty
@@ -51,7 +51,7 @@ class RacketBuilder(spack.builder.Builder):
phases = ("install",)
#: Names associated with package methods in the old build-system format
legacy_methods = tuple() # type: Tuple[str, ...]
legacy_methods: Tuple[str, ...] = tuple()
#: Names associated with package attributes in the old build-system format
legacy_attributes = ("build_directory", "build_time_test_callbacks", "subdirectory")
@@ -59,7 +59,7 @@ class RacketBuilder(spack.builder.Builder):
#: Callback names for build-time test
build_time_test_callbacks = ["check"]
racket_name = None # type: Optional[str]
racket_name: Optional[str] = None
@property
def subdirectory(self):

View File

@@ -96,18 +96,33 @@ class ROCmPackage(PackageBase):
"gfx803",
"gfx900",
"gfx900:xnack-",
"gfx902",
"gfx904",
"gfx906",
"gfx908",
"gfx90a",
"gfx906:xnack-",
"gfx908",
"gfx908:xnack-",
"gfx909",
"gfx90a",
"gfx90a:xnack-",
"gfx90a:xnack+",
"gfx90c",
"gfx940",
"gfx1010",
"gfx1011",
"gfx1012",
"gfx1013",
"gfx1030",
"gfx1031",
"gfx1032",
"gfx1033",
"gfx1034",
"gfx1035",
"gfx1036",
"gfx1100",
"gfx1101",
"gfx1102",
"gfx1103",
)
variant("rocm", default=False, description="Enable ROCm support")
@@ -117,6 +132,7 @@ class ROCmPackage(PackageBase):
"amdgpu_target",
description="AMD GPU architecture",
values=spack.variant.any_combination_of(*amdgpu_targets),
sticky=True,
when="+rocm",
)
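
`sticky=True` prevents the concretizer from flipping `amdgpu_target` on its own to satisfy other constraints; only an explicit user request changes the value. A minimal sketch of a sticky variant in a hypothetical package:

```python
from spack.package import *


class Foo(Package):
    """Hypothetical package illustrating a sticky variant."""

    # sticky: keep the default unless the user sets +debug/~debug explicitly
    variant("debug", default=False, sticky=True, description="Debug build")
```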
@@ -144,6 +160,29 @@ def hip_flags(amdgpu_target):
# depends_on('hip@:6.0', when='amdgpu_target=gfx701')
# to indicate minimum version for each architecture.
# Add compiler minimum versions based on the first release where the
# processor is included in llvm/lib/Support/TargetParser.cpp
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx900:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx906:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx908:xnack-")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx90c")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack-")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack+")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx940")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1013")
depends_on("llvm-amdgpu@3.8.0:", when="amdgpu_target=gfx1030")
depends_on("llvm-amdgpu@3.9.0:", when="amdgpu_target=gfx1031")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx1032")
depends_on("llvm-amdgpu@4.1.0:", when="amdgpu_target=gfx1033")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx1034")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1035")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx1036")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1100")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1101")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1102")
depends_on("llvm-amdgpu@5.3.0:", when="amdgpu_target=gfx1103")
# Compiler conflicts
# TODO: add conflicts statements along the lines of

View File

@@ -46,10 +46,10 @@ class SConsBuilder(BaseBuilder):
phases = ("build", "install")
#: Names associated with package methods in the old build-system format
legacy_methods = ("install_args", "build_test")
legacy_methods = ("build_test",)
#: Same as legacy_methods, but the signature is different
legacy_long_methods = ("build_args",)
legacy_long_methods = ("build_args", "install_args")
#: Names associated with package attributes in the old build-system format
legacy_attributes = ("build_time_test_callbacks",)
@@ -66,13 +66,13 @@ def build(self, pkg, spec, prefix):
args = self.build_args(spec, prefix)
inspect.getmodule(self.pkg).scons(*args)
def install_args(self):
def install_args(self, spec, prefix):
"""Arguments to pass to install."""
return []
def install(self, pkg, spec, prefix):
"""Install the package."""
args = self.install_args()
args = self.install_args(spec, prefix)
inspect.getmodule(self.pkg).scons("install", *args)

View File

@@ -157,7 +157,7 @@ def configure(self, pkg, spec, prefix):
]
)
self.python(configure, *args)
self.pkg.python(configure, *args)
def configure_args(self):
"""Arguments to pass to configure."""

View File

@@ -14,7 +14,7 @@ class SourceforgePackage(spack.package_base.PackageBase):
packages."""
#: Path of the package in a Sourceforge mirror
sourceforge_mirror_path = None # type: Optional[str]
sourceforge_mirror_path: Optional[str] = None
#: List of Sourceforge mirrors used by Spack
base_mirrors = [

Some files were not shown because too many files have changed in this diff.