Compare commits

...

158 Commits

Author SHA1 Message Date
Todd Gamblin
50396b41b4 refactor: Move spack.util.executable -> llnl.util.executable 2023-01-01 23:37:25 -08:00
Todd Gamblin
b4a2e2b82c refactor: move most of spack.util.environment -> llnl.util.envmod 2023-01-01 22:29:11 -08:00
Todd Gamblin
d3f8bf1b96 refactor: move windows path logic out of spack.util.path and into llnl.util.path 2023-01-01 22:29:11 -08:00
Todd Gamblin
fd5cf345e1 refactor: Move spack.util.string to llnl.util.string 2023-01-01 22:29:11 -08:00
Todd Gamblin
ca265ea0c2 style: fix spurious mypy errors from numpy (#34732)
Spack imports `pytest`, which *can* import `numpy`. Recent versions of `numpy` require
Python 3.8 or higher, and they use 3.8 type annotations in their type stubs (`.pyi`
files). At the same time, we tell `mypy` to target Python 3.7, as we still support older
versions of Python.

What all this means is that if you run `mypy` on `spack`, `mypy` will follow all the
static import statements, and it ends up giving you this error when it finds numpy stuff
that is newer than the target Python version:

```
==> Running mypy checks
src/spack/var/spack/environments/default/.spack-env/._view/4g7jd4ibkg4gopv4rosq3kn2vsxrxm2f/lib/python3.11/site-packages/numpy/__init__.pyi:638: error: Positional-only parameters are only supported in Python 3.8 and greater  [syntax]
Found 1 error in 1 file (errors prevented further checking)
  mypy found errors
```

We can fix this by telling `mypy` to skip all imports of `numpy` in `pyproject.toml`:

```toml
   [[tool.mypy.overrides]]
   module = 'numpy'
   follow_imports = 'skip'
   follow_imports_for_stubs = true
```

- [x] don't follow imports from `numpy` in `mypy`
- [x] get rid of old rule not to follow `jinja2` imports, as we now require Python 3
2023-01-01 01:05:17 +00:00
Glenn Johnson
7a92579480 py-fisher: add version 0.1.10 (#34738) 2022-12-31 12:48:21 -06:00
Glenn Johnson
190dfd0269 py-youtube-dl: add version 2021.12.17 (#34740) 2022-12-31 12:42:58 -06:00
Massimiliano Culpo
b549548f69 Simplify creation of test and install reports (#34712)
The code in Spack to generate install and test reports currently suffers from unneeded complexity. For
instance, we have classes in Spack core packages, like `spack.reporters.CDash`, that need an
`argparse.Namespace` to be initialized and have "hard-coded" string literals on which they branch to
change their behavior:

```python
if do_fn.__name__ == "do_test" and skip_externals:
    package["result"] = "skipped"
else:
    package["result"] = "success"
package["stdout"] = fetch_log(pkg, do_fn, self.dir)
package["installed_from_binary_cache"] = pkg.installed_from_binary_cache
if do_fn.__name__ == "_install_task" and installed_already:
    return
```
This PR attempts to polish away the major issues encountered in both `spack.report` and `spack.reporters`.

Details:
- [x] `spack.reporters` is now a package that contains both the base class `Reporter` and all 
      the derived classes (`JUnit` and `CDash`)
- [x] Classes derived from `spack.reporters.Reporter` don't take an `argparse.Namespace` anymore
       as argument to `__init__`. The rationale is that code for commands should be built upon Spack
       core classes, not vice-versa.
- [x] An `argparse.Action` has been coded to create the correct `Reporter` object based on command
       line arguments (see the sketch below)
- [x] The context managers to generate reports from either `spack install` or from `spack test` have
       been greatly simplified, and have been made less "dynamic" in nature. In particular, the `collect_info`
       class has been deleted in favor of two more specific context managers. This allows for a simpler
       structure of the code, and requires less knowledge of client code (in particular on which method to patch)
- [x] The `InfoCollector` class has been turned into a simple hierarchy, so as to avoid conditional statements
       within methods that assume knowledge of the context in which the method is called.
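
A minimal sketch of the `argparse.Action` pattern described above (class names and arguments are hypothetical stand-ins, not Spack's actual reporters):

```python
import argparse

# Hypothetical stand-ins for reporter classes like spack.reporters.JUnit
# and spack.reporters.CDash; they take plain arguments, not a Namespace.
class JUnit:
    def __init__(self, filename):
        self.filename = filename

class CDash:
    def __init__(self, upload_url):
        self.upload_url = upload_url

class CreateReporter(argparse.Action):
    """Translate command-line arguments into the right Reporter object."""
    def __call__(self, parser, namespace, values, option_string=None):
        setattr(namespace, self.dest, values)
        if values == "junit":
            namespace.reporter = JUnit(filename="report.xml")
        elif values == "cdash":
            namespace.reporter = CDash(upload_url="https://cdash.example.com")

parser = argparse.ArgumentParser()
parser.add_argument("--log-format", action=CreateReporter, choices=["junit", "cdash"])
args = parser.parse_args(["--log-format", "junit"])
print(type(args.reporter).__name__)  # -> JUnit
```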
2022-12-30 10:15:38 -08:00
Heiko Bauke
79268cedd2 mpl: add v0.2.1, v0.2.0 (#34716) 2022-12-30 09:21:58 -08:00
Satish Balay
2004171b7e petsc, py-petsc4py: add v3.18.3 (#34725) 2022-12-30 10:49:21 +01:00
Todd Gamblin
06312ddf18 bugfix: setgid tests fail when primary group is unknown (#34729)
On systems with remote groups, the primary user group may be remote and may not exist on
the local system (i.e., it might just be a number). On the CLI, it looks like this:

```console
> touch foo
> l foo
-rw-r--r-- 1 gamblin2 57095 0 Dec 29 22:24 foo
> chmod 2000 foo
chmod: changing permissions of 'foo': Operation not permitted
```

Here, the local machine doesn't know about per-user groups, so they appear as gids in
`ls` output. `57095` is also `gamblin2`'s uid, but the local machine doesn't know that
`gamblin2` is in the `57095` group.

Unfortunately, it seems that Python's `os.chmod()` just fails silently, setting
permissions to `0o0000` instead of `0o2000`. We can avoid this by ensuring that the file
has a group the user is known to be a member of.

- [x] Add `ensure_known_group()` in the permissions tests.
- [x] Call `ensure_known_group()` on tempfile in `test_chmod_real_entries_ignores_suid_sgid`.
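
A sketch of what such a helper can look like (illustrative, not the exact test code): `os.getgroups()` only returns groups the current user is actually a member of, so any of those gids is safe to assign before applying setgid bits:

```python
import os

def ensure_known_group(path):
    """Give `path` a group the current user is known to belong to."""
    known_groups = os.getgroups()  # gids this user is actually a member of
    if known_groups and os.stat(path).st_gid not in known_groups:
        os.chown(path, -1, known_groups[0])  # -1 leaves the owner unchanged
```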
2022-12-30 10:24:35 +01:00
Todd Gamblin
3a0db729c7 docs: avoid errors by using type hints instead of doc types (#34707)
There are a number of places in our docstrings where we write "list of X" as the type, even though napoleon doesn't actually support this. It ends up causing warnings when generating docs.

Now that we require Python 3, we don't have to rely on type hints in docs -- we can just use Python type hints and omit the types of args and return values from docstrings.

We should probably do this for all types in docstrings eventually, but this PR focuses on the ones that generate warnings during doc builds.

Some `mypy` annoyances we should consider in the future:
1. Adding some of these type annotations gets you:
    ```
    note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs  [annotation-unchecked]
    ```
   because they are in unannotated functions (like constructors where we don't really need any annotations).
   You can silence these with `disable_error_code = "annotation-unchecked"` in `pyproject.toml`
2. Right now we support running `mypy` in Python `3.6`.  That means we have to support `mypy` `0.971`, which does not support `disable_error_code = "annotation-unchecked"`, so I just filter `[annotation-unchecked]` lines out in `spack style`.
3. I would rather just turn on `check_untyped_defs` and get more `mypy` coverage everywhere, but that will require about 1,000 fixes.  We should probably do that eventually.
4. We could also consider only running `mypy` on newer python versions.  This is not easy to do while supporting `3.6`, because you have to use `if TYPE_CHECKING` for a lot of things to ensure that 3.6 still parses correctly.  If we only supported `3.7` and above we could use [`from __future__ import annotations`](https://mypy.readthedocs.io/en/stable/runtime_troubles.html#future-annotations-import-pep-563), but we have to support 3.6 for now. Sigh.

- [x] Convert a number of docstring types to Python type hints
- [x] Get rid of "list of" wherever it appears
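
An example of the kind of conversion this PR makes (an illustrative function, not one from the Spack codebase):

```python
from typing import List, Optional

# Before: types lived in the docstring, including the "list of str" form
# that napoleon can't parse:
#
#   def install(specs, explicit=None):
#       """Install specs.
#
#       Args:
#           specs (list of str): specs to install
#           explicit (bool): whether this is an explicit install
#       """

# After: Python type hints carry the types; the docstring only describes.
def install(specs: List[str], explicit: Optional[bool] = None) -> None:
    """Install specs.

    Args:
        specs: specs to install
        explicit: whether this is an explicit install
    """
```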
2022-12-29 16:45:09 -08:00
dependabot[bot]
9759331f43 build(deps): bump actions/setup-python from 4.3.1 to 4.4.0 (#34667)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.3.1 to 4.4.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](2c3dd9e7e2...5ccb29d877)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-29 14:57:58 +01:00
downloadico
ceca97518a trinity: add version 2.15.0-FULL (#34666) 2022-12-29 11:13:47 +01:00
Brent Huisman
1929d5e3de arbor: add v0.8.1 (#34660) 2022-12-29 11:07:17 +01:00
Lucas Frérot
238e9c3613 tamaas: added v2.6.0 (#34676) 2022-12-29 11:04:33 +01:00
Jim Galarowicz
d43e7cb5cd survey: add v1.0.7 (#34679) 2022-12-29 11:00:45 +01:00
Christopher Christofi
51a037d52a perl-archive-zip: add 1.68 (#34684) 2022-12-29 10:57:32 +01:00
Tim Haines
c91f8c2f14 boost: apply 'intel-oneapi-linux-jam.patch' to all versions since 1.76 (#34670) 2022-12-29 10:56:45 +01:00
Christopher Christofi
04ad42e5ee perl-appconfig: add v1.71 (#34685) 2022-12-29 10:55:41 +01:00
Alex Hedges
d02c71e443 git-filter-repo: add new package (#34690) 2022-12-29 10:53:19 +01:00
David Zmick
ca6e178890 jq: set -D_REENTRANT for builds on darwin (#34691) 2022-12-29 10:49:09 +01:00
Jed Brown
b145085fff libceed: add v0.11.0 (#34694) 2022-12-29 10:31:25 +01:00
AMD Toolchain Support
3a4b96e61c AOCC: add v4.0.0 (#33833) 2022-12-29 10:30:35 +01:00
Adam J. Stewart
36d87a4783 py-numpy: add v1.24.1 (#34697) 2022-12-29 10:23:20 +01:00
Wouter Deconinck
6d2645f73b libpsl: new versions through 0.21.2 (#34699)
This adds the final bugfix versions through 0.21.2, just released.

With 0.21.1 the tag name pattern was changed, hence the `url_for_version` override.
2022-12-29 10:22:27 +01:00
Wouter Deconinck
44f7363fbe cernlib: depends_on libxaw libxt (#34448)
Based on the following lines in the top level `CMakeLists.txt` (I can't deep link since gitlab.cern.ch is not public), `cernlib` needs an explicit dependency on `libxaw` and `libxt`:
```cmake
find_package(X11  REQUIRED)
message(STATUS "CERNLIB: X11_Xt_LIB=${X11_Xt_LIB} X11_Xaw_LIB=${X11_Xaw_LIB} X11_LIBRARIES=${X11_LIBRARIES}")
```
2022-12-29 09:25:07 +01:00
Wouter Deconinck
9d936a2a75 singularity, apptainer: --without-conmon into @property config_options (#34474)
Per https://github.com/spack/spack/issues/34192, apptainer does not support `--without-conmon`, so we introduce a base class `config_options` property that can be overridden in the `apptainer` package.
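
The pattern, roughly (a hypothetical sketch of the idea, not the exact package classes):

```python
class SingularityBase:
    @property
    def config_options(self):
        # configure options shared by the singularity family of packages
        return ["--without-conmon"]

class Apptainer(SingularityBase):
    @property
    def config_options(self):
        # apptainer does not support --without-conmon, so override it away
        return []
```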
2022-12-29 09:24:41 +01:00
Wouter Deconinck
18438c395d dd4hep: depends_on virtual tbb instead of intel-tbb (#34704)
Recent changes to dd4hep removed the explicit dependency
on an older version of intel-tbb. This commit makes the spack
package depend on the virtual tbb instead.
2022-12-29 09:13:28 +01:00
wspear
28a30bcea6 veloc: add v1.6 and dependencies (#34706) 2022-12-29 09:12:51 +01:00
Alex Richert
536c7709c2 Change regex in bacio patch to avoid python re bug (#34668) 2022-12-29 08:50:27 +01:00
Todd Gamblin
e28738a01e bugfix: make texinfo build properly with gettext (#34312)
`texinfo` depends on `gettext`, and it builds a perl module that uses gettext via XS
module FFI. Unfortunately, the XS module build asks perl to tell it what compiler to
use instead of respecting the one passed to configure.

Without this change, the build fails with this error:

```
parsetexi/api.c:33:10: fatal error: 'libintl.h' file not found
         ^~~~~~~~~~~
```

We need the gettext dependency and the spack wrappers to ensure XS builds properly.

- [x] Add needed `gettext` dependency to `texinfo`
- [x] Override XS compiler with `PERL_EXT_CC`
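
A sketch of that override (assuming Spack's standard `setup_build_environment` hook; package code, so not runnable outside a Spack checkout):

```python
from spack.package import *

class Texinfo(AutotoolsPackage):
    depends_on("gettext")

    def setup_build_environment(self, env):
        # The XS module build asks perl what compiler to use instead of
        # respecting the one passed to configure; PERL_EXT_CC overrides
        # that choice so the Spack compiler wrapper is used.
        env.set("PERL_EXT_CC", self.compiler.cc)
```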

Co-authored-by: Paul Kuberry <pakuber@sandia.gov>
2022-12-28 15:20:53 -08:00
Todd Gamblin
5f8c706128 Consolidate how Spack uses git (#34700)
Local `git` tests will fail with `fatal: transport 'file' not allowed` when using git 2.38.1 or higher, due to a fix for `CVE-2022-39253`.

This was fixed in CI in #33429, but that doesn't help the issue for anyone's local environment. Instead of fixing this with git config in CI, we should ensure that the tests run anywhere.

- [x] Introduce `spack.util.git`.
- [x] Use `spack.util.git.get_git()` to get a git executable, instead of `which("git")` everywhere.
- [x] Make all `git` tests use a `git` fixture that goes through `spack.util.git.get_git()`.
- [x] Add `-c protocol.file.allow=always` to all `git` invocations under `pytest`.
- [x] Revert changes from #33429, which are no longer needed.
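
A sketch of the idea (simplified; the real `spack.util.git.get_git()` returns a Spack `Executable`):

```python
import os
import shutil

def get_git():
    """Locate git once, and inject test-only configuration centrally."""
    git = shutil.which("git")
    if git is None:
        return None
    cmd = [git]
    if "PYTEST_CURRENT_TEST" in os.environ:  # set by pytest while tests run
        # git >= 2.38.1 disallows file:// transport by default
        # (CVE-2022-39253); test fixtures clone from local paths,
        # so re-allow it only under pytest.
        cmd += ["-c", "protocol.file.allow=always"]
    return cmd
```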
2022-12-28 00:44:11 -08:00
Rémi Lacroix
558695793f CPMD: Remove now unused "import" 2022-12-27 13:06:08 -08:00
Rémi Lacroix
b43a27674b CPMD: Update for open-source release
CPMD has been open-sourced on GitHub, so manual download is no longer needed. The patches have been included in the new 4.3 release.
2022-12-27 13:06:08 -08:00
Massimiliano Culpo
3d961b9a1f spack graph: rework to use Jinja templates and builders (#34637)
`spack graph` has been reworked to use:

- Jinja templates
- builder objects to construct the template context when DOT graphs are requested. 

This allowed adding a new colored output for DOT graphs that highlights both
the dependency types and the nodes that are needed at runtime for a given spec.
2022-12-27 15:25:53 +01:00
Todd Gamblin
d100ac8923 types: fix type annotations and remove novm annotations for llnl module
Apparently I forgot to do this in #34305.
2022-12-26 22:28:44 +01:00
Harmen Stoppels
e8fa8c5f01 timer: pick a single unit based on max duration. 2022-12-26 22:28:44 +01:00
Todd Gamblin
be6bb413df spack solve: use consistent units for time
`spack solve` is supposed to show you times you can compare: setup, ground, solve, etc.,
all in a list. You're also supposed to be able to compare easily across runs. With
`pretty_seconds()` (introduced in #33900), it's easy to miss the units, e.g., spot the
bottleneck here:

```console
> spack solve --timers tcl
    setup        22.125ms
    load         16.083ms
    ground        8.298ms
    solve       848.055us
    total        58.615ms
```

It's easier to see what matters if these are all in the same units, e.g.:

```
> spack solve --timers tcl
    setup         0.0147s
    load          0.0130s
    ground        0.0078s
    solve         0.0008s
    total         0.0463s
```

And the units won't fluctuate from run to run as you make changes.

- [x] make `spack solve` timings consistent like before
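
A sketch of the formatting this implies (illustrative, not the actual timer code):

```python
def print_phase_times(times):
    """Print all phases in seconds so the columns are directly comparable."""
    width = max(len(name) for name in times)
    for name, seconds in times.items():
        print(f"    {name:<{width}}  {seconds:9.4f}s")

print_phase_times(
    {"setup": 0.0147, "load": 0.0130, "ground": 0.0078, "solve": 0.0008, "total": 0.0463}
)
```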
2022-12-26 22:28:44 +01:00
Adam J. Stewart
d23c302ca2 qt-base: ~network by default (#34688) 2022-12-26 10:19:03 -06:00
Rohit Goswami
ed0c1cea91 py-pytest-datadir: Init at 1.4.1 (#34692)
* py-pytest-datadir: Init at 1.4.1

* py-pytest-data-dir: Fix missing dep

Co-authored-by: "Adam J. Stewart" <ajstewart426@gmail.com>

Co-authored-by: "Adam J. Stewart" <ajstewart426@gmail.com>
2022-12-24 11:42:05 -07:00
Lucas Frérot
ffc42e287d py-uvw: added v0.5.0 (#34677) 2022-12-24 11:12:44 -06:00
Ralf Gommers
ba0d182e10 Update py-meson-python (0.11.0, 0.12.0) and meson (0.64.1, 1.0.0) (#34675)
* Update py-meson-python versions (0.11.0, 0.12.0)

* Update `meson` to version 0.64.1

* Add Meson 1.0.0

* Apply code review suggestions
2022-12-23 19:22:19 -07:00
David Zmick
8d8104de2c tmux: add 3.3a (#34671) 2022-12-24 01:52:32 +01:00
Adam J. Stewart
7975e0afbc QMakeBuilder: fix bug introduced during multi-bs refactor (#34683) 2022-12-23 13:57:44 -06:00
Adam J. Stewart
4a43522763 py-kornia: add v0.6.9 (#34652) 2022-12-22 15:13:52 -07:00
Adam J. Stewart
30343d65ba libelf: fix build on macOS x86_64 (#34646) 2022-12-22 14:58:32 -07:00
Alex Richert
38c1639c9c bacio: fix typo in patch method (#34663) 2022-12-22 18:59:32 +01:00
Wouter Deconinck
be5033c869 sherpa: add v2.2.13 (#34628) 2022-12-22 10:58:21 -07:00
Adam J. Stewart
eb67497020 ML CI: Linux x86_64 (#34299)
* ML CI: Linux x86_64

* Update comments

* Rename again

* Rename comments

* Update to match other arches

* No compiler

* Compiler was wrong anyway

* Faster TF
2022-12-22 11:31:40 -06:00
Loïc Pottier
371268a9aa added py-dynim package (#34651)
Signed-off-by: Loïc Pottier <48072795+lpottier@users.noreply.github.com>
2022-12-22 09:55:18 -06:00
Andrew Wood
344e8d142a Restrict a patch of rhash to versions >=1.3.6 (#34310) 2022-12-22 08:02:15 -07:00
Harmen Stoppels
161fbfadf4 Fix combine_phase_logs text encoding issues (#34657)
Avoid text decoding and encoding when combining log files; instead,
combine in binary mode.

Also do a buffered copy, which is sometimes faster for large log files.
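
The gist, as a sketch (the real helper is `combine_phase_logs`; this is a simplified stand-in):

```python
import shutil

def combine_logs(log_files, output_path, buffer_size=64 * 1024):
    """Concatenate log files bytewise, with no decode/encode round-trip."""
    with open(output_path, "wb") as out:
        for path in log_files:
            with open(path, "rb") as src:
                shutil.copyfileobj(src, out, buffer_size)
```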
2022-12-22 15:32:48 +01:00
Wladimir Arturo Garces Carrillo
3304312b26 neve: add new package (#34596)
Co-authored-by: WladIMirG <WladIMirG@users.noreply.github.com>
2022-12-22 07:27:07 -07:00
Alec Scott
3279ee7068 Add --fresh to docs to actually upgrade spack environments (#34433) 2022-12-22 11:19:24 +00:00
Todd Gamblin
8f3f838763 docs: show module documentation before submodules (#34258)
Currently, the Spack docs show documentation for submodules *before* documentation for
the package itself on package doc pages. This means that if you put docs in `__init__.py` in
some package, the docs in there will be shown *after* the docs for all submodules of the
package instead of at the top as an intro to the package. See, e.g.,
[the lockfile docs](https://spack.readthedocs.io/en/latest/spack.environment.html#module-spack.environment),
which should be at the
[top of that page](https://spack.readthedocs.io/en/latest/spack.environment.html).

- [x] add the `--module-first` option to sphinx so that it generates module docs at top of page.
2022-12-22 11:50:48 +01:00
Todd Gamblin
09864d00c5 docs: remove monitors and analyzers (#34358)
These experimental features were removed in #31130, but the docs were not.

- [x] remove the `spack monitor` and `spack analyze` docs
2022-12-22 11:47:13 +01:00
Benjamin S. Kirk
0f7fa27327 librsvg: add 2.40.21, which does not require rust (#34585)
* librsvg: add 2.40.21, which does not require rust and has some security backports

https://download.gnome.org/sources/librsvg/2.40/librsvg-2.40.21.news

* librsvg: prevent finding broken gtkdoc binaries when ~doc is selected.

On my CentOS7 hosts, ./configure finds e.g. /bin/gtkdoc-rebase even when
~doc is selected.  These tools use Python2, and fail with an error:
"ImportError: No module named site"

So prevent ./configure from finding these broken tools when not building
the +doc variant.
2022-12-22 11:28:30 +01:00
Rocco Meli
a27139c081 openbabel: add 3.1.0 and 3.1.1 (#34631) 2022-12-22 03:17:50 -07:00
Vasileios Karakasis
4d4338db16 reframe: rework recipe, add v4.0.0-dev4 (#34584) 2022-12-22 09:53:42 +01:00
Annop Wongwathanarat
6d64ffdd1a quantum-espresso: enable linking with armpl-gcc and acfl for BLAS and FFT (#34416) 2022-12-22 09:50:51 +01:00
Harmen Stoppels
e9ea9e2316 index.json.hash, no fatal error if key cannot be fetched (#34643) 2022-12-22 09:48:05 +01:00
Benjamin Fovet
2a5509ea90 kokkos: add v3.7.01 (#34645)
Co-authored-by: Benjamin Fovet <benjamin.fovet@cea.fr>
2022-12-22 09:45:13 +01:00
Adam J. Stewart
b9d027f0cc py-pytorch-lightning: add v1.8.6 (#34647) 2022-12-22 09:43:57 +01:00
Christopher Christofi
6d54dc2a44 perl-config-simple: add 4.58 (#34649) 2022-12-22 09:43:41 +01:00
renjithravindrankannath
6cd9cbf578 Using corresponding commit ids of hiprand for each releases (#34545) 2022-12-22 09:36:14 +01:00
Alex Richert
72e81796d1 bacio: patch for v2.4.1 (#34575) 2022-12-22 09:12:29 +01:00
Andre Merzky
f116e6762a add py-psij-python and py-pystache packages (#34357)
* add psij package and deps

* update hashes, URLs

* linting

* Update var/spack/repos/builtin/packages/py-psij-python/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pystache/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pystache/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* Update package.py

apply suggested change

* Update package.py

apply suggested change

* Update package.py

ensure maintainer inheritance

* add psij to exaworks meta-package

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-12-21 20:07:35 -07:00
Howard Pritchard
c74bbc6723 paraview: patch catalyst etc. to build with oneapi (#33562)
Without this patch, the build of paraview has a meltdown when reaching 3rd-party catalyst and other packages,
with these types of errors:

   335    /tmp/foo/spack-stage/spack-stage-paraview-5.10.1-gscoqxhhakjyyfirdefuhmi2bzw4scho/spack-src/VTK/ThirdParty/fmt/vtkfmt/vtkfmt/format.h:1732:11: error: cannot capture a bit-field by reference
   336          if (sign) *it++ = static_cast<Char>(data::signs[sign]);
   337              ^

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2022-12-21 17:07:24 -07:00
Harmen Stoppels
492a603d5e json: remove python 2 only code (#34615) 2022-12-21 14:18:12 -07:00
Adam J. Stewart
dab68687bd py-cartopy: older versions don't support Python 3.10 (#34626) 2022-12-21 21:23:22 +01:00
Thomas Madlener
1a32cea114 podio: add v0.16.2 (#34606) 2022-12-21 12:52:47 -07:00
Hector Martinez-Seara
aaec76652b relion: add v4.0.0 (#34600) 2022-12-21 20:41:13 +01:00
Michael Kuhn
f748911ea0 glib: add 2.74.3 (#34603) 2022-12-21 20:40:04 +01:00
Cory Bloor
e60e74694f rocm: make amdgpu_target sticky (#34591)
The sticky property will prevent clingo from changing the amdgpu_target
to work around conflicts. This is the same behaviour as was adopted for
cuda_arch in 055c9d125d.
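
In package terms the change is roughly this (a sketch using Spack's `variant` directive and its `sticky` keyword; the value list is abbreviated and this is not the exact `ROCmPackage` code):

```python
from spack.package import *

class ROCmPackage(PackageBase):
    # sticky=True forbids the concretizer from silently changing this
    # variant's value to work around conflicts; only the default or an
    # explicit user request may decide it.
    variant(
        "amdgpu_target",
        default="gfx906",
        description="AMD GPU architecture",
        values=("gfx906", "gfx908", "gfx90a"),  # abbreviated value list
        multi=True,
        sticky=True,
    )
```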
2022-12-21 20:21:20 +01:00
Sergey Kosukhin
2ef026b8c6 eckit: skip broken test (#34610) 2022-12-21 20:20:05 +01:00
Mark W. Krentel
a6c2569b18 hpctoolkit: replace filter_file with upstream patch (#34604)
Replace the filter_file, which fixed older configure scripts for rocm 5.3,
with an upstream patch.  Further, the patch is no longer needed for develop or
later releases.
2022-12-21 20:18:58 +01:00
shanedsnyder
5483b5ff99 darshan-runtime,darshan-util,py-darshan: update package versions for darshan-3.4.2 (#34583) 2022-12-21 20:18:27 +01:00
louisespellacy-arm
2b78a7099d arm-forge: add 22.1.2 (#34569) 2022-12-21 20:09:42 +01:00
lpoirel
34cdc6f52b starpu: add conflict for ~blocking +simgrid (#34616)
see 1f5a911d43
2022-12-21 20:09:23 +01:00
Adam J. Stewart
3aafdb06c9 py-pyproj: add new versions (#34633) 2022-12-21 20:00:53 +01:00
Harmen Stoppels
4a22c1c699 urlopen: handle timeout in opener (#34639) 2022-12-21 19:40:26 +01:00
Andrey Perestoronin
f021479ef0 feat: 🎸 Add new 2023.0.0 oneVPL package (#34642) 2022-12-21 11:07:41 -07:00
Manuela Kuhn
3f374fb62f py-vcrpy: add 4.2.1 (#34636) 2022-12-21 11:02:55 -07:00
Niclas Jansson
949be42f32 neko: add v0.5.0 (#34640) 2022-12-21 11:02:37 -07:00
Rob Falgout
e5abd5abc1 hypre: add v2.27.0 (#34625) 2022-12-21 11:02:23 -07:00
Harmen Stoppels
4473d5d811 etags for index.json invalidation, test coverage (#34641)
Implement an alternative strategy to do index.json invalidation.

The current approach of pairs of index.json / index.json.hash is
problematic because it leads to races.

The standard solution for cache invalidation is etags, which are
supported by both the http and s3 protocols and allow one to do
conditional fetches.

This PR implements that for the http/https schemes. It should also work
for s3 schemes, but that requires other PRs to be merged.

Also it improves unit tests for index.json fetches.
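
A minimal sketch of an etag-based conditional fetch with the standard library (illustrative, not the Spack implementation):

```python
import urllib.error
import urllib.request

def fetch_index(url, cached_etag=None):
    """Fetch index.json unless the server says our cached copy is current."""
    request = urllib.request.Request(url)
    if cached_etag:
        request.add_header("If-None-Match", cached_etag)
    try:
        with urllib.request.urlopen(request) as response:
            return response.read(), response.headers.get("ETag")
    except urllib.error.HTTPError as err:
        if err.code == 304:  # Not Modified: cached index.json is still valid
            return None, cached_etag
        raise
```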
2022-12-21 18:41:59 +01:00
Mikael Simberg
c3e61664cf Add patch for pika on macOS (#34619) 2022-12-21 13:41:49 +01:00
Manuela Kuhn
c3217775c3 py-scikit-image: add 0.19.3 (#34618)
* py-scikit-image: add 0.19.3

* Update var/spack/repos/builtin/packages/py-scikit-image/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-20 12:59:48 -06:00
kwryankrattiger
58a7e11db9 DAV: VTK-m needs to install examples for smoke test (#34611)
The SDK deployment aims to validate and run VTK-m via spack
deployments, so the examples should be installed.
2022-12-20 09:56:50 -08:00
Andrey Perestoronin
ac570bb5c4 2023.0.0 oneAPI release promotion (#34617) 2022-12-20 08:47:08 -07:00
Massimiliano Culpo
b2c806f6fc archspec: add support for zen4 (#34609)
Also add:
- Upper bound for Xeon Phi compiler support
- Better detection for a64fx
2022-12-20 11:22:50 +01:00
Nicholas Knoblauch
bd613b3124 Remove dep on jupyter meta-package (#34573) 2022-12-19 17:22:34 -06:00
Manuela Kuhn
f1b85bc653 py-nipype: add 1.8.5 and py-looseversion: add new package (#34608) 2022-12-19 13:25:22 -06:00
Hector Martinez-Seara
e1fab4dd51 Gromacs: added version 2022.4 (#34599) 2022-12-19 15:30:20 +01:00
Anton Kozhevnikov
a924079f66 [ELPA] add sha256 for elpa-2022.11.001.rc2.tar.gz (#33439) 2022-12-19 04:12:02 -07:00
Adam J. Stewart
c5aff1d412 py-horovod: patch no longer applies (#34593) 2022-12-19 11:49:02 +01:00
Adam J. Stewart
6c9602ee64 aws-sdk-cpp: add v1.10.32 (#34592) 2022-12-19 11:48:31 +01:00
Adam J. Stewart
64327bfef0 py-pyvista: add v0.37.0 (#34590) 2022-12-19 11:48:01 +01:00
Adam J. Stewart
05c3cb7cc9 netcdf-cxx: add patch to fix macOS build (#34588) 2022-12-19 11:46:33 +01:00
Adam J. Stewart
c87b251639 XNNPACK: fix build on macOS, update deps (#34555) 2022-12-19 11:44:56 +01:00
Adam J. Stewart
f2332a17d3 Node.js: new versions, newer Python support, macOS fixes (#34478) 2022-12-19 11:40:31 +01:00
Adam J. Stewart
c7f24a132e py-numpy: add v1.24.0 (#34602) 2022-12-18 17:17:06 -07:00
Alec Scott
96a7af1dd2 Add py-docstring-to-markdown v0.11 (#34595) 2022-12-18 14:59:47 -06:00
eugeneswalker
db1caa9e92 intel-oneapi-dpl: add v2022.0.0 (#34601) 2022-12-18 13:44:23 -05:00
iarspider
237d26460d LLVM: replace libelf dependency with elf (#34265)
* LLVM: replace libelf dependency with elf

I didn't test this extensively, but in CMS LLVM builds just fine with elfutils.

* [@spackbot] updating style on behalf of iarspider

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-12-17 14:44:27 -08:00
Harmen Stoppels
1020b65297 fix != -> == typo (#34568) 2022-12-17 20:15:15 +01:00
Adam J. Stewart
dceb4c9d65 Update PyTorch ecosystem (#34582) 2022-12-17 11:51:59 -07:00
Alex Richert
50570ea334 Add static-only option for ESMF (#34576) 2022-12-17 04:27:22 -07:00
eugeneswalker
7e836b925d e4s: disable mac stack due to binary relocation issue #32571 (#34560) 2022-12-17 10:53:15 +00:00
Benjamin S. Kirk
cec3da61d2 Add gimp & dependent packages (#34558)
* exiv2: add new versions

* babl: new package required to build GIMP

* gegl: new package required to build GIMP

* gexiv2: new package required to build GIMP

* libmypaint: new package required to build GIMP

* mypaint-brushes: new package required to build GIMP

* vala: new package required to build GIMP

* GIMP: new package definition for building GIMP-2.10 from source

* libjxl: update for 0.7.0

* libwmf: a library for reading vector images in Windows Metafile Format (WMF)

* libde265: an open source implementation of the h.265 video codec

* libwebp: add new versions

* GIMP: additional variants for building GIMP-2.10 from source

* libde265: remove boilerplate

* fixes for style precheck

* updates based on feedback

* fixes for style precheck
2022-12-17 03:52:56 -07:00
Mikhail Titov
7ed53cf083 Update package versions: RADICAL-Cybertools (RE, RG, RP, RS, RU) (#34572)
* rct: update packages (RE, RG, RP, RS, RU) with new versions

* re: fixed radical-pilot requirement for radical-entk
2022-12-16 22:24:00 -07:00
eugeneswalker
bdc3ab5b54 intel-oneapi-compilers: add v2023.0.0 (#34571) 2022-12-16 21:38:51 -07:00
Jack Morrison
5a985e33ea Add --enable-orterun-prefix-by-default configure option for OpenMPI (#34469) 2022-12-16 17:59:24 -07:00
Bernhard Kaindl
9817593c1c Automake requires Thread::Queue, but it is only provided with perl+threads. (#34076)
Update the depends_on("perl") to depends_on("perl+threads").

This and #34074 is needed to properly handle e.g. the perl-Thread-Queue
rpm package:

It may not be installed on RedHat-based hosts, which can lead to automake
build failures when `spack external find perl` or `spack external find --all`
is used to register the system-provided perl install.
2022-12-16 16:11:11 -08:00
Rémi Lacroix
1cc78dac38 octopus: Ensure MPI is used consistently (#33969)
Some variants have MPI dependencies; make sure they can be used only when the `mpi` variant is enabled.
2022-12-16 15:17:37 -08:00
Marco De La Pierre
e2c5fe4aa3 adding 2nd bunch of nf-core deps from update/nextflow-tools (#34562)
* adding 2nd bunch of nf-core deps from update/nextflow-tools

* Update var/spack/repos/builtin/packages/py-a2wsgi/package.py

* Update var/spack/repos/builtin/packages/py-apispec/package.py

* Update var/spack/repos/builtin/packages/py-bagit-profile/package.py

* Update var/spack/repos/builtin/packages/py-bagit-profile/package.py

* Update var/spack/repos/builtin/packages/py-bagit-profile/package.py

* Update var/spack/repos/builtin/packages/py-bdbag/package.py

* Update var/spack/repos/builtin/packages/py-schema-salad/package.py

* Update var/spack/repos/builtin/packages/py-schema-salad/package.py

* Update var/spack/repos/builtin/packages/py-tuspy/package.py

* Update var/spack/repos/builtin/packages/py-schema-salad/package.py

* Update var/spack/repos/builtin/packages/py-schema-salad/package.py

* Update var/spack/repos/builtin/packages/py-bdbag/package.py

* Update var/spack/repos/builtin/packages/py-bdbag/package.py

* Update var/spack/repos/builtin/packages/py-bioblend/package.py

* Update var/spack/repos/builtin/packages/py-circus/package.py

* Update var/spack/repos/builtin/packages/py-circus/package.py

* Update var/spack/repos/builtin/packages/py-cloudbridge/package.py

* Update var/spack/repos/builtin/packages/py-cloudbridge/package.py

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-16 15:18:49 -07:00
Marco De La Pierre
1bf87dbb5d Adding first bunch of recipes for dependencies of nf-core-tools (#34537)
* nextflow recipe: added latest stable version

* tower-cli recipe: added latest release

* recipes tower-agent and tower-cli renamed to nf-tower-agent and nf-tower-cli

* recipes nf-tower-agent and nf-tower-cli: small fix

* nf-core-tools recipe: added most py- dependencies

* nf-core-tools: recipe without galaxy-tool-util (for testing)

* fixed typos in py-yacman recipe

* fixed typos in py-pytest-workflow recipe

* fixed typo in nf-core-tools recipe

* fixed typos in py-yacman recipe

* fixes in recipes for py-questionary and py-url-normalize

* fixes to py-yacman recipe

* style fixes to py- packages that are dependencies to nf-core-tools

* fix in py-requests-cache recipe

* added missing dep in py-requests-cache recipe

* nf-core-tools deps: removed redundant python dep for py packages oyaml and piper

* nf-core-tools recipe: final, incl dep on py-galaxy-tool-util

* nf-core-tools: new version with extra dependency

* commit to merge packages on focus from update/nextflow-tools

* nf-core: commenting galaxy dep for this pr

* Update var/spack/repos/builtin/packages/py-requests-cache/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-requests-cache/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* removed nf-core-tools from this branch, will be back at the end

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-16 14:28:51 -07:00
Sam Reeve
ffe527b141 Add HACCabana proxy app (#34567) 2022-12-16 14:03:08 -07:00
John W. Parent
642c5b876b Compiler detection: avoid false recognition of MSVC (#34574)
Interim fix for #34559

Spack's MSVC compiler definition uses ifx as the Fortran compiler.
Prior to #33385, the Spack MSVC compiler definition required the
executable to be called "ifx.exe"; #33385 replaced this with just
"ifx", which inadvertently led to ifx falsely indicating the
presence of MSVC on non-Windows systems (which leads to future
errors when attempting to query/use those compiler objects).

This commit applies a short-term fix by updating MSVC Fortran
version detection to always indicate a failure on non-Windows.
2022-12-16 19:22:04 +00:00
Brian Spilner
8b7bd6dc74 new release cdo-2.1.1 (#34548) 2022-12-16 11:16:32 -08:00
Adam J. Stewart
2f97dc7aa6 py-pytorch-lightning: add v1.8.5 (#34557) 2022-12-16 11:10:19 -08:00
Adam J. Stewart
958d542f81 GDAL: add v3.6.1 (#34556) 2022-12-16 10:32:54 -08:00
Vicente Bolea
b1aae1c2ed vtk-m: add v2.0.0-rc1 (#34561) 2022-12-16 10:31:10 -08:00
SXS Bot
690f9d69fe spectre: add v2022.12.16 (#34570)
* spectre: add v2022.12.16
* [@spackbot] updating style on behalf of sxs-bot

Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2022-12-16 11:27:56 -07:00
Marc Joos
a78c16a609 add version 3.6.4 to wi4mpi (#34565) 2022-12-16 10:26:46 -08:00
snehring
7bb2d3cca3 nwchem: restricting current versions to python@3.9 at latest (#34506) 2022-12-16 17:20:19 +01:00
Paul Kuberry
7216050dd3 libzmq: make location of libsodium explicit (#34553) 2022-12-15 16:17:15 -07:00
eugeneswalker
2f26e422d6 nco: add v5.0.6 (#34512) 2022-12-15 15:42:13 -07:00
Brian Van Essen
3477d578a3 roctracer: fixed a bug in how the external is identified (#33517)
Make the package a proper ROCm package.
2022-12-15 23:29:36 +01:00
Zack Galbreath
aa8e1ba606 gitlab ci: more resources for slow builds (#34505) 2022-12-15 14:35:54 -07:00
Manuela Kuhn
08e007e9a6 py-traits: add 6.4.1 (#34550) 2022-12-15 13:10:16 -06:00
eugeneswalker
d6fb65ebc6 eckit: add v1.19.0 (#34510) 2022-12-15 10:38:24 -08:00
eugeneswalker
2b5be919dd odc: add v1.4.5 (#34513) 2022-12-15 10:38:06 -08:00
Sebastian Grimberg
cc2dff48a8 arpack-ng: add variant for ISO C binding support (#34529)
Co-authored-by: Sebastian Grimberg <sjg@amazon.com>
2022-12-15 10:56:13 -07:00
Massimiliano Culpo
22922bf74c Propagate exceptions from Spack python console (#34547)
fixes #34489

Customize sys.excepthook to raise SystemExit when
any unhandled exception reaches the hook.
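
The mechanism, sketched (it mirrors the description above; not the exact Spack code):

```python
import sys

def spack_excepthook(exc_type, exc_value, tb):
    sys.__excepthook__(exc_type, exc_value, tb)  # print the traceback as usual
    raise SystemExit(1)  # then propagate the failure as a non-zero exit

sys.excepthook = spack_excepthook
```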
2022-12-15 18:08:53 +01:00
Sean Koyama
8a02463d7d IntelOneApiPackage: add envmods variant to toggle environment modifications by oneapi packages (#34253)
Co-authored-by: Sean Koyama <skoyama@anl.gov>
Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
2022-12-15 17:52:09 +01:00
Harmen Stoppels
c6465bd9bd Add a proper deprecation warning for update-index -d (#34520) 2022-12-15 17:45:32 +01:00
Harmen Stoppels
9025caed6e Remove warning in download_tarball (#34549) 2022-12-15 14:03:30 +00:00
Massimiliano Culpo
7056a4bffd Forward lookup of the "run_tests" attribute (#34531)
fixes #34518

Fix an issue caused by the MRO chain of the package wrapper
during build. Before this PR, the lookup always returned
False when the builder object was created before the
run_tests attribute was monkey-patched.
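
A sketch of the forwarding-lookup fix (illustrative): resolve the attribute at access time instead of copying it when the wrapper is built, so later monkey patches are still visible:

```python
class BuilderWrapper:
    def __init__(self, pkg):
        self.pkg = pkg

    @property
    def run_tests(self):
        # Looked up on every access, so a run_tests monkey-patched onto
        # the package after this wrapper was created is still seen.
        return getattr(self.pkg, "run_tests", False)

class FakePackage:
    pass

pkg = FakePackage()
builder = BuilderWrapper(pkg)
pkg.run_tests = True      # patched after the builder exists
print(builder.run_tests)  # -> True
```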
2022-12-15 09:35:33 +01:00
snehring
d2aa8466eb metabat: adding missing build dependency (#34530) 2022-12-15 09:23:59 +01:00
Loïc Pottier
6e4684fbca talass: fixed URLs so the package is reachable (#34387)
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
2022-12-15 09:23:05 +01:00
Brian Vanderwende
fcbf617d38 ncl: add RPC lib with ncl+hdf4 (#34451) 2022-12-15 09:22:00 +01:00
downloadico
1f8b55a021 Add G'MIC package with only the "cli" target available (#34533) 2022-12-15 09:19:50 +01:00
David Gardner
b5f8ed07fb sundials: fix typo in smoke tests (#34539) 2022-12-15 09:07:54 +01:00
Thomas Madlener
65bd9b9ac5 podio, edm4hep: add v0.7.2 and v0.16.1 respectively (#34526)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2022-12-15 09:02:16 +01:00
Adam J. Stewart
6250d84b41 cpuinfo: new versions, shared libs (#34544) 2022-12-15 09:00:51 +01:00
Wouter Deconinck
99056e03bd acts: new versions 19.11.0, 21.0.0, 21.1.0 (#34540)
* acts: new versions 19.11.0, 21.0.0, 21.1.0

https://github.com/acts-project/acts/compare/v19.10.0...v19.11.0:
- python 3.8 required if ACTS_BUILD_EXAMPLES_PYTHON_BINDINGS

https://github.com/acts-project/acts/compare/v20.3.0...v21.0.0:
- python 3.8 required if ACTS_BUILD_EXAMPLES_PYTHON_BINDINGS

https://github.com/acts-project/acts/compare/v21.0.0...v21.1.0:
- no build system changes

* acts: depends_on python@3.8: when sometimes
2022-12-15 08:56:32 +01:00
Fabien Bruneval
1db849ee5f libcint: Fix +coulomb_erf and add +pypzpx (#34524) 2022-12-15 05:31:58 +01:00
Thomas Madlener
2f82b213df lcio: add latest version (#34527) 2022-12-15 05:06:59 +01:00
Axel Huebl
2a5f0158bc ParaView: Add openPMD Support (#33821)
openPMD, a metadata standard on top of backends like ADIOS2 and HDF5,
is implemented in ParaView 5.9+ via a Python3 module.

Simplify Conflicts & Variant

Add to ECP Data Vis SDK
2022-12-14 20:45:27 -07:00
Manuela Kuhn
21a1f7dd97 py-traitlets: add v5.7.1 (#34525) 2022-12-14 21:34:51 -06:00
David Boehme
4b5ed94af4 caliper: add version 2.9.0 (#34538) 2022-12-15 03:52:53 +01:00
snehring
06788019a4 apptainer: add new version 1.1.4 (#34536) 2022-12-15 02:06:22 +01:00
Sam Grayson
cab8f795a7 Patch dill._dill._is_builtin_module (#34534)
* Patch dill._dill._is_builtin_module

* Fix style

* Add test
2022-12-14 16:03:03 -07:00
finkandreas
2db38bfa38 py-archspec: replace removed .build_directory with .stage.source_path (#34521) 2022-12-15 00:00:21 +01:00
Harmen Stoppels
ea029442e6 Revert "Revert "Use urllib handler for s3:// and gs://, improve url_exists through HEAD requests (#34324)"" (#34498)
This reverts commit 8035eeb36d.

And also removes logic around an additional HEAD request to prevent
a more expensive GET request on wrong content-type. Since large files
are typically an attachment and only downloaded when reading the
stream, it's not an optimization that helps much, and in fact the logic
was broken since the GET request was done unconditionally.
2022-12-14 23:47:11 +01:00
Axel Huebl
43e38d0d12 WarpX 22.11, 22.12 & PICMI-Standard (#34517)
* PICMI: 0.0.22

* WarpX: 22.11, 22.12
2022-12-14 13:59:16 -08:00
460 changed files with 6351 additions and 4000 deletions

View File

@@ -20,7 +20,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages

View File

@@ -4,10 +4,6 @@ git config --global user.email "spack@example.com"
git config --global user.name "Test User"
git config --global core.longpaths true
-# See https://github.com/git/git/security/advisories/GHSA-3wp6-j8xr-qw85 (CVE-2022-39253)
-# This is needed to let some fixture in our unit-test suite run
-git config --global protocol.file.allow always
if ($(git branch --show-current) -ne "develop")
{
git branch develop origin/develop

View File

@@ -2,10 +2,6 @@
git config --global user.email "spack@example.com"
git config --global user.name "Test User"
-# See https://github.com/git/git/security/advisories/GHSA-3wp6-j8xr-qw85 (CVE-2022-39253)
-# This is needed to let some fixture in our unit-test suite run
-git config --global protocol.file.allow always
# create a local pr base branch
if [[ -n $GITHUB_BASE_REF ]]; then
git fetch origin "${GITHUB_BASE_REF}:${GITHUB_BASE_REF}"

View File

@@ -50,7 +50,7 @@ jobs:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -97,7 +97,7 @@ jobs:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -154,7 +154,7 @@ jobs:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -188,7 +188,7 @@ jobs:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages

View File

@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: '3.11'
cache: 'pip'
@@ -38,7 +38,7 @@ jobs:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912 # @v2
with:
python-version: '3.11'
cache: 'pip'

View File

@@ -18,7 +18,7 @@ jobs:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
with:
python-version: 3.9
- name: Install Python packages
@@ -42,7 +42,7 @@ jobs:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
with:
python-version: 3.9
- name: Install Python packages
@@ -66,7 +66,7 @@ jobs:
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
-  - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
+  - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
with:
python-version: 3.9
- name: Install Python packages
@@ -90,7 +90,7 @@ jobs:
# - uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
# with:
# fetch-depth: 0
-  # - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
+  # - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
# with:
# python-version: 3.9
# - name: Install Python packages
@@ -121,7 +121,7 @@ jobs:
# run:
# shell: pwsh
# steps:
-  # - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
+  # - uses: actions/setup-python@5ccb29d8773c3f3f653e1705f474dfaa8a06a912
# with:
# python-version: 3.9
# - name: Install Python packages

View File

@@ -1,162 +0,0 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _analyze:
=======
Analyze
=======
The analyze command is a front-end to various tools that let us analyze
package installations. Each analyzer is a module for a different kind
of analysis that can be done on a package installation, including (but not
limited to) binary, log, or text analysis. Thus, the analyze command group
allows you to take an existing package install, choose an analyzer,
and extract some output for the package using it.
-----------------
Analyzer Metadata
-----------------
For all analyzers, we write to an ``analyzers`` folder in ``~/.spack``, or the
value that you specify in your spack config at ``config:analyzers_dir``.
For example, here we see the results of running an analysis on zlib:
.. code-block:: console
$ tree ~/.spack/analyzers/
└── linux-ubuntu20.04-skylake
└── gcc-9.3.0
└── zlib-1.2.11-sl7m27mzkbejtkrajigj3a3m37ygv4u2
├── environment_variables
│   └── spack-analyzer-environment-variables.json
├── install_files
│   └── spack-analyzer-install-files.json
└── libabigail
└── spack-analyzer-libabigail-libz.so.1.2.11.xml
This means that you can always find analyzer output in this folder, and it
is organized with the same logic as the package install it was run for.
If you want to customize this top level folder, simply provide the ``--path``
argument to ``spack analyze run``. The nested organization will be maintained
within your custom root.
-----------------
Listing Analyzers
-----------------
If you aren't familiar with Spack's analyzers, you can quickly list those that
are available:
.. code-block:: console
$ spack analyze list-analyzers
install_files : install file listing read from install_manifest.json
environment_variables : environment variables parsed from spack-build-env.txt
config_args : config args loaded from spack-configure-args.txt
libabigail : Application Binary Interface (ABI) features for objects
In the above, the first three are fairly simple - parsing metadata files from
a package install directory to save
-------------------
Analyzing a Package
-------------------
The analyze command, akin to install, will accept a package spec to perform
an analysis for. The package must be installed. Let's walk through an example
with zlib. We first ask to analyze it. However, since we have more than one
install, we are asked to disambiguate:
.. code-block:: console
$ spack analyze run zlib
==> Error: zlib matches multiple packages.
Matching packages:
fz2bs56 zlib@1.2.11%gcc@7.5.0 arch=linux-ubuntu18.04-skylake
sl7m27m zlib@1.2.11%gcc@9.3.0 arch=linux-ubuntu20.04-skylake
Use a more specific spec.
We can then specify the spec version that we want to analyze:
.. code-block:: console
$ spack analyze run zlib/fz2bs56
If you don't provide any specific analyzer names, by default all analyzers
(shown in the ``list-analyzers`` subcommand list) will be run. If an analyzer does not
have any result, it will be skipped. For example, here is a result running for
zlib:
.. code-block:: console
$ ls ~/.spack/analyzers/linux-ubuntu20.04-skylake/gcc-9.3.0/zlib-1.2.11-sl7m27mzkbejtkrajigj3a3m37ygv4u2/
spack-analyzer-environment-variables.json
spack-analyzer-install-files.json
spack-analyzer-libabigail-libz.so.1.2.11.xml
If you want to run a specific analyzer, ask for it with `--analyzer`. Here we run
spack analyze on libabigail (already installed) _using_ libabigail1
.. code-block:: console
$ spack analyze run --analyzer abigail libabigail
.. _analyze_monitoring:
----------------------
Monitoring An Analysis
----------------------
For any kind of analysis, you can
use a `spack monitor <https://github.com/spack/spack-monitor>`_ "Spackmon"
as a server to upload the same run metadata to. You can
follow the instructions in the `spack monitor documentation <https://spack-monitor.readthedocs.org>`_
to first create a server along with a username and token for yourself.
You can then use this guide to interact with the server.
You should first export our spack monitor token and username to the environment:
.. code-block:: console
$ export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
$ export SPACKMON_USER=spacky
By default, the host for your server is expected to be at ``http://127.0.0.1``
with a prefix of ``ms1``, and if this is the case, you can simply add the
``--monitor`` flag to the install command:
.. code-block:: console
$ spack analyze run --monitor wget
If you need to customize the host or the prefix, you can do that as well:
.. code-block:: console
$ spack analyze run --monitor --monitor-prefix monitor --monitor-host https://monitor-service.io wget
If your server doesn't have authentication, you can skip it:
.. code-block:: console
$ spack analyze run --monitor --monitor-disable-auth wget
Regardless of your choice, when you run analyze on an installed package (whether
it was installed with ``--monitor`` or not, you'll see the results generating as they did
before, and a message that the monitor server was pinged:
.. code-block:: console
$ spack analyze --monitor wget
...
==> Sending result for wget bin/wget to monitor.

View File

@@ -74,6 +74,7 @@
"--force", # Overwrite existing files
"--no-toc", # Don't create a table of contents file
"--output-dir=.", # Directory to place all output
"--module-first", # emit module docs before submodule docs
]
sphinx_apidoc(apidoc_args + ["_spack_root/lib/spack/spack"])
sphinx_apidoc(apidoc_args + ["_spack_root/lib/spack/llnl"])
@@ -200,12 +201,14 @@ def setup(sphinx):
("py:class", "_frozen_importlib_external.SourceFileLoader"),
("py:class", "clingo.Control"),
("py:class", "six.moves.urllib.parse.ParseResult"),
("py:class", "TextIO"),
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),
("py:class", "spack.build_systems._checks.BaseBuilder"),
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.VersionBase"),
("py:class", "spack.spec.DependencySpec"),
]
# The reST default role (used for this markup: `text`) to use for all documents.

View File

@@ -67,7 +67,6 @@ or refer to the full manual below.
build_settings
environments
containers
-monitoring
mirrors
module_file_support
repositories
@@ -78,12 +77,6 @@ or refer to the full manual below.
extensions
pipelines
-.. toctree::
-:maxdepth: 2
-:caption: Research
-analyze
.. toctree::
:maxdepth: 2
:caption: Contributing

View File

@@ -1,265 +0,0 @@
.. Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _monitoring:
==========
Monitoring
==========
You can use a `spack monitor <https://github.com/spack/spack-monitor>`_ "Spackmon"
server to store a database of your packages, builds, and associated metadata
for provenance, research, or some other kind of development. You should
follow the instructions in the `spack monitor documentation <https://spack-monitor.readthedocs.org>`_
to first create a server along with a username and token for yourself.
You can then use this guide to interact with the server.
-------------------
Analysis Monitoring
-------------------
To read about how to monitor an analysis (meaning you want to send analysis results
to a server) see :ref:`analyze_monitoring`.
---------------------
Monitoring An Install
---------------------
Since an install is typically when you build packages, we logically want
to tell spack to monitor during this step. Let's start with an example
where we want to monitor the install of hdf5. Unless you have disabled authentication
for the server, we first want to export our spack monitor token and username to the environment:
.. code-block:: console
$ export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
$ export SPACKMON_USER=spacky
By default, the host for your server is expected to be at ``http://127.0.0.1``
with a prefix of ``ms1``, and if this is the case, you can simply add the
``--monitor`` flag to the install command:
.. code-block:: console
$ spack install --monitor hdf5
If you need to customize the host or the prefix, you can do that as well:
.. code-block:: console
$ spack install --monitor --monitor-prefix monitor --monitor-host https://monitor-service.io hdf5
As a precaution, we cut out early in the spack client if you have not provided
authentication credentials. For example, if you run the command above without
exporting your username or token, you'll see:
.. code-block:: console
==> Error: You are required to export SPACKMON_TOKEN and SPACKMON_USER
This extra check is to ensure that we don't start any builds,
and then discover that you forgot to export your token. However, if
your monitoring server has authentication disabled, you can tell this to
the client to skip this step:
.. code-block:: console
$ spack install --monitor --monitor-disable-auth hdf5
If the service is not running, you'll cleanly exit early - the install will
not continue if you've asked it to monitor and there is no service.
For example, here is what you'll see if the monitoring service is not running:
.. code-block:: console
[Errno 111] Connection refused
If you want to continue builds (and stop monitoring) you can set the ``--monitor-keep-going``
flag.
.. code-block:: console
$ spack install --monitor --monitor-keep-going hdf5
This could mean that if a request fails, you only have partial or no data
added to your monitoring database. This setting will not be applied to the
first request to check if the server is running, but to subsequent requests.
If you don't have a monitor server running and you want to build, simply
don't provide the ``--monitor`` flag! Finally, if you want to provide one or
more tags to your build, you can do:
.. code-block:: console
# Add one tag, "pizza"
$ spack install --monitor --monitor-tags pizza hdf5
# Add two tags, "pizza" and "pasta"
$ spack install --monitor --monitor-tags pizza,pasta hdf5
----------------------------
Monitoring with Containerize
----------------------------
The same argument group is available to add to a containerize command.
^^^^^^
Docker
^^^^^^
To add monitoring to a Docker container recipe generation using the defaults,
and assuming a monitor server running on localhost, you would
start with a spack.yaml in your present working directory:
.. code-block:: yaml
spack:
specs:
- samtools
And then do:
.. code-block:: console
# preview first
spack containerize --monitor
# and then write to a Dockerfile
spack containerize --monitor > Dockerfile
The install command will be edited to include commands for enabling monitoring.
However, getting secrets into the container for your monitor server is something
that should be done carefully. Specifically you should:
- Never try to define secrets as ENV, ARG, or using ``--build-arg``
- Do not try to get the secret into the container via a "temporary" file that you remove (it in fact will still exist in a layer)
Instead, it's recommended to use buildkit `as explained here <https://pythonspeed.com/articles/docker-build-secrets/>`_.
You'll need to again export environment variables for your spack monitor server:
.. code-block:: console
$ export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
$ export SPACKMON_USER=spacky
And then use buildkit along with your build and identifying the name of the secret:
.. code-block:: console
$ DOCKER_BUILDKIT=1 docker build --secret id=st,env=SPACKMON_TOKEN --secret id=su,env=SPACKMON_USER -t spack/container .
The secrets are expected to come from your environment, and then will be temporarily mounted and available
at ``/run/secrets/<name>``. If you forget to supply them (and authentication is required) the build
will fail. If you need to build on your host (and interact with a spack monitor at localhost) you'll
need to tell Docker to use the host network:
.. code-block:: console
$ DOCKER_BUILDKIT=1 docker build --network="host" --secret id=st,env=SPACKMON_TOKEN --secret id=su,env=SPACKMON_USER -t spack/container .
^^^^^^^^^^^
Singularity
^^^^^^^^^^^
To add monitoring to a Singularity container build, the spack.yaml needs to
be modified slightly to specify wanting a different format:
.. code-block:: yaml
spack:
specs:
- samtools
container:
format: singularity
Again, generate the recipe:
.. code-block:: console
# preview first
$ spack containerize --monitor
# then write to a Singularity recipe
$ spack containerize --monitor > Singularity
Singularity doesn't have a direct way to define secrets at build time, so we have
to do a bit of a manual command to add a file, source secrets in it, and remove it.
Since Singularity doesn't have layers like Docker, deleting a file will truly
remove it from the container and history. So let's say we have this file,
``secrets.sh``:
.. code-block:: console
# secrets.sh
export SPACKMON_USER=spack
export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
We would then generate the Singularity recipe, and add a files section,
a source of that file at the start of ``%post``, and **importantly**
a removal of the final at the end of that same section.
.. code-block::
Bootstrap: docker
From: spack/ubuntu-bionic:latest
Stage: build
%files
secrets.sh /opt/secrets.sh
%post
. /opt/secrets.sh
# spack install commands are here
...
# Don't forget to remove here!
rm /opt/secrets.sh
You can then build the container as your normally would.
.. code-block:: console
$ sudo singularity build container.sif Singularity
------------------
Monitoring Offline
------------------
In the case that you want to save monitor results to your filesystem
and then upload them later (perhaps you are in an environment where you don't
have credentials or it isn't safe to use them) you can use the ``--monitor-save-local``
flag.
.. code-block:: console
$ spack install --monitor --monitor-save-local hdf5
This will save results in a subfolder, "monitor" in your designated spack
reports folder, which defaults to ``$HOME/.spack/reports/monitor``. When
you are ready to upload them to a spack monitor server:
.. code-block:: console
$ spack monitor upload ~/.spack/reports/monitor
You can choose the root directory of results as shown above, or a specific
subdirectory. The command accepts other arguments to specify configuration
for the monitor.

View File

@@ -184,13 +184,48 @@ simply run the following commands:
.. code-block:: console
$ spack env activate myenv
$ spack concretize --force
$ spack concretize --fresh --force
$ spack install
The ``--force`` flag tells Spack to overwrite its previous concretization
decisions, allowing you to choose a new version of Python. If any of the new
packages like Bash are already installed, ``spack install`` won't re-install
them, it will keep the symlinks in place.
The ``--fresh`` flag tells Spack to use the latest version of every package
where possible instead of trying to optimize for reuse of existing installed
packages.
The ``--force`` flag in addition tells Spack to overwrite its previous
concretization decisions, allowing you to choose a new version of Python.
If any of the new packages like Bash are already installed, ``spack install``
won't re-install them, it will keep the symlinks in place.
-----------------------------------
Updating & Cleaning Up Old Packages
-----------------------------------
If you're looking to mimic the behavior of Homebrew, you may also want to
clean up out-of-date packages from your environment after an upgrade. To
upgrade your entire software stack within an environment and clean up old
package versions, simply run the following commands:
.. code-block:: console
$ spack env activate myenv
$ spack mark -i --all
$ spack concretize --fresh --force
$ spack install
$ spack gc
Running ``spack mark -i --all`` tells Spack to mark all of the existing
packages within an environment as "implicitly" installed. This tells
spack's garbage collection system that these packages should be cleaned up.
Don't worry, however: this will not remove your entire environment.
Running ``spack install`` will reexamine your spack environment after
a fresh concretization and will re-mark any packages that should remain
installed as "explicitly" installed.
**Note:** If you use multiple spack environments, you should re-run ``spack install``
in each of your environments prior to running ``spack gc`` to prevent spack
from uninstalling any shared packages that are no longer required by the
environment you just upgraded.
--------------
Uninstallation

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.0 (commit 77640e572725ad97f18e63a04857155752ace045)
* Version: 0.2.0 (commit e44bad9c7b6defac73696f64078b2fe634719b62)
argparse
--------

View File

@@ -1,2 +1,2 @@
"""Init file to avoid namespace packages"""
__version__ = "0.1.2"
__version__ = "0.2.0"

View File

@@ -3,13 +3,12 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Aliases for microarchitecture features."""
# pylint: disable=useless-object-inheritance
from .schema import TARGETS_JSON, LazyDictionary
_FEATURE_ALIAS_PREDICATE = {}
class FeatureAliasTest(object):
class FeatureAliasTest:
"""A test that must be passed for a feature alias to succeed.
Args:
@@ -48,7 +47,7 @@ def alias_predicate(func):
# Check we didn't register anything else with the same name
if name in _FEATURE_ALIAS_PREDICATE:
msg = 'the alias predicate "{0}" already exists'.format(name)
msg = f'the alias predicate "{name}" already exists'
raise KeyError(msg)
_FEATURE_ALIAS_PREDICATE[name] = func
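The registration guard above follows a common registry-decorator pattern. As a rough,
self-contained sketch of the idea (``register`` and ``always_true`` are illustrative
names, not part of archspec's API):

.. code-block:: python

   # Minimal sketch of a registry decorator that refuses duplicates,
   # mirroring the guard in alias_predicate above (names are illustrative).
   _REGISTRY = {}

   def register(func):
       name = func.__name__
       if name in _REGISTRY:
           raise KeyError(f'the alias predicate "{name}" already exists')
       _REGISTRY[name] = func
       return func

   @register
   def always_true(value):
       return True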

View File

@@ -11,8 +11,6 @@
import subprocess
import warnings
import six
from .microarchitecture import generic_microarchitecture, TARGETS
from .schema import TARGETS_JSON
@@ -80,10 +78,9 @@ def proc_cpuinfo():
def _check_output(args, env):
output = subprocess.Popen( # pylint: disable=consider-using-with
args, stdout=subprocess.PIPE, env=env
).communicate()[0]
return six.text_type(output.decode("utf-8"))
with subprocess.Popen(args, stdout=subprocess.PIPE, env=env) as proc:
output = proc.communicate()[0]
return str(output.decode("utf-8"))
def _machine():
@@ -273,7 +270,7 @@ def compatibility_check(architecture_family):
this test can be used, e.g. x86_64 or ppc64le etc.
"""
# Turn the argument into something iterable
if isinstance(architecture_family, six.string_types):
if isinstance(architecture_family, str):
architecture_family = (architecture_family,)
def decorator(func):
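The ``_check_output`` rewrite above swaps a bare ``Popen`` for the context-manager
form, which waits for the process and closes its pipes even if an exception is
raised. A minimal runnable sketch of the same pattern:

.. code-block:: python

   import subprocess

   def check_output(args, env=None):
       # The context manager reaps the process and closes its pipes,
       # even if communicate() raises.
       with subprocess.Popen(args, stdout=subprocess.PIPE, env=env) as proc:
           output = proc.communicate()[0]
       return output.decode("utf-8")

   print(check_output(["echo", "hello"]))  # prints: hello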

View File

@@ -5,14 +5,11 @@
"""Types and functions to manage information
on CPU microarchitectures.
"""
# pylint: disable=useless-object-inheritance
import functools
import platform
import re
import warnings
import six
import archspec
import archspec.cpu.alias
import archspec.cpu.schema
@@ -27,7 +24,7 @@ def coerce_target_names(func):
@functools.wraps(func)
def _impl(self, other):
if isinstance(other, six.string_types):
if isinstance(other, str):
if other not in TARGETS:
msg = '"{0}" is not a valid target name'
raise ValueError(msg.format(other))
@@ -38,7 +35,7 @@ def _impl(self, other):
return _impl
class Microarchitecture(object):
class Microarchitecture:
"""Represents a specific CPU micro-architecture.
Args:
@@ -150,7 +147,7 @@ def __str__(self):
def __contains__(self, feature):
# Feature must be of a string type, so be defensive about that
if not isinstance(feature, six.string_types):
if not isinstance(feature, str):
msg = "only objects of string types are accepted [got {0}]"
raise TypeError(msg.format(str(type(feature))))
@@ -168,7 +165,7 @@ def family(self):
"""Returns the architecture family a given target belongs to"""
roots = [x for x in [self] + self.ancestors if not x.ancestors]
msg = "a target is expected to belong to just one architecture family"
msg += "[found {0}]".format(", ".join(str(x) for x in roots))
msg += f"[found {', '.join(str(x) for x in roots)}]"
assert len(roots) == 1, msg
return roots.pop()
@@ -318,9 +315,6 @@ def _known_microarchitectures():
"""Returns a dictionary of the known micro-architectures. If the
current host platform is unknown, adds it too as a generic target.
"""
# pylint: disable=fixme
# TODO: Simplify this logic using object_pairs_hook to OrderedDict
# TODO: when we stop supporting python2.6
def fill_target_from_dict(name, data, targets):
"""Recursively fills targets by adding the micro-architecture

View File

@@ -5,16 +5,12 @@
"""Global objects with the content of the microarchitecture
JSON file and its schema
"""
import collections.abc
import json
import os.path
try:
from collections.abc import MutableMapping # novm
except ImportError:
from collections import MutableMapping # pylint: disable=deprecated-class
class LazyDictionary(MutableMapping):
class LazyDictionary(collections.abc.MutableMapping):
"""Lazy dictionary that gets constructed on first access to any object key
Args:
@@ -56,7 +52,7 @@ def _load_json_file(json_file):
def _factory():
filename = os.path.join(json_dir, json_file)
with open(filename, "r") as file: # pylint: disable=unspecified-encoding
with open(filename, "r", encoding="utf-8") as file:
return json.load(file)
return _factory
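The ``LazyDictionary`` above defers the JSON load until first access by wrapping it
in a factory closure. A self-contained sketch of that pattern (simplified, not
archspec's exact implementation):

.. code-block:: python

   import collections.abc

   class LazyDict(collections.abc.MutableMapping):
       """Constructs its underlying dict on first access."""

       def __init__(self, factory):
           self.factory = factory  # zero-argument callable
           self._data = None

       @property
       def data(self):
           if self._data is None:
               self._data = self.factory()
           return self._data

       def __getitem__(self, key):
           return self.data[key]

       def __setitem__(self, key, value):
           self.data[key] = value

       def __delitem__(self, key):
           del self.data[key]

       def __iter__(self):
           return iter(self.data)

       def __len__(self):
           return len(self.data)

   lazy = LazyDict(lambda: {"a": 1})  # nothing is built yet
   print(lazy["a"])                   # the factory runs here; prints 1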

View File

@@ -961,21 +961,21 @@
],
"intel": [
{
"versions": "18.0:",
"versions": "18.0:2021.2",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"versions": ":2021.2",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"versions": ":2021.2",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
@@ -1905,6 +1905,86 @@
]
}
},
"zen4": {
"from": ["zen3", "x86_64_v4"],
"vendor": "AuthenticAMD",
"features": [
"bmi1",
"bmi2",
"f16c",
"fma",
"fsgsbase",
"avx",
"avx2",
"rdseed",
"clzero",
"aes",
"pclmulqdq",
"cx16",
"movbe",
"mmx",
"sse",
"sse2",
"sse4a",
"ssse3",
"sse4_1",
"sse4_2",
"abm",
"xsavec",
"xsaveopt",
"clflushopt",
"popcnt",
"clwb",
"vaes",
"vpclmulqdq",
"pku",
"gfni",
"flush_l1d",
"erms",
"avic",
"avx512f",
"avx512dq",
"avx512ifma",
"avx512cd",
"avx512bw",
"avx512vl",
"avx512_bf16",
"avx512vbmi",
"avx512_vbmi2",
"avx512_vnni",
"avx512_bitalg",
"avx512_vpopcntdq"
],
"compilers": {
"gcc": [
{
"versions": "10.3:",
"name": "znver3",
"flags": "-march={name} -mtune={name} -mavx512f -mavx512dq -mavx512ifma -mavx512cd -mavx512bw -mavx512vl -mavx512vbmi -mavx512vbmi2 -mavx512vnni -mavx512bitalg"
}
],
"clang": [
{
"versions": "12.0:",
"name": "znver3",
"flags": "-march={name} -mtune={name} -mavx512f -mavx512dq -mavx512ifma -mavx512cd -mavx512bw -mavx512vl -mavx512vbmi -mavx512vbmi2 -mavx512vnni -mavx512bitalg"
}
],
"aocc": [
{
"versions": "3.0:3.9",
"name": "znver3",
"flags": "-march={name} -mtune={name} -mavx512f -mavx512dq -mavx512ifma -mavx512cd -mavx512bw -mavx512vl -mavx512vbmi -mavx512vbmi2 -mavx512vnni -mavx512bitalg",
"warnings": "Zen4 processors are not fully supported by AOCC versions < 4.0. For optimal performance please upgrade to a newer version of AOCC"
},
{
"versions": "4.0:",
"name": "znver4",
"flags": "-march={name} -mtune={name}"
}
]
}
},
"ppc64": {
"from": [],
"vendor": "generic",
@@ -2302,7 +2382,6 @@
"fp",
"asimd",
"evtstrm",
"pmull",
"sha1",
"sha2",
"crc32",

View File

@@ -0,0 +1,982 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Utilities for setting and modifying environment variables."""
import collections
import contextlib
import inspect
import json
import os
import os.path
import re
import shlex
import sys
import llnl.util.executable as executable
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.path import path_to_os_path
is_windows = sys.platform == "win32"
system_paths = (
["/", "/usr", "/usr/local"]
if not is_windows
else ["C:\\", "C:\\Program Files", "C:\\Program Files (x86)", "C:\\Users", "C:\\ProgramData"]
)
suffixes = ["bin", "bin64", "include", "lib", "lib64"] if not is_windows else []
system_dirs = [os.path.join(p, s) for s in suffixes for p in system_paths] + system_paths
def is_system_path(path):
"""Predicate that given a path returns True if it is a system path,
False otherwise.
Args:
path (str): path to a directory
Returns:
True or False
"""
return path and os.path.normpath(path) in system_dirs
def filter_system_paths(paths):
"""Return only paths that are not system paths."""
return [p for p in paths if not is_system_path(p)]
def deprioritize_system_paths(paths):
"""Put system paths at the end of paths, otherwise preserving order."""
filtered_paths = filter_system_paths(paths)
fp = set(filtered_paths)
return filtered_paths + [p for p in paths if p not in fp]
_shell_set_strings = {
"sh": "export {0}={1};\n",
"csh": "setenv {0} {1};\n",
"fish": "set -gx {0} {1};\n",
"bat": 'set "{0}={1}"\n',
}
_shell_unset_strings = {
"sh": "unset {0};\n",
"csh": "unsetenv {0};\n",
"fish": "set -e {0};\n",
"bat": 'set "{0}="\n',
}
tracing_enabled = False
def prune_duplicate_paths(paths):
"""Returns the paths with duplicates removed, order preserved."""
return list(llnl.util.lang.dedupe(paths))
def get_path(name):
path = os.environ.get(name, "").strip()
if path:
return path.split(os.pathsep)
else:
return []
def env_flag(name):
if name in os.environ:
value = os.environ[name].lower()
return value == "true" or value == "1"
return False
def path_set(var_name, directories):
path_str = os.pathsep.join(str(dir) for dir in directories)
os.environ[var_name] = path_str
def path_put_first(var_name, directories):
"""Puts the provided directories first in the path, adding them
if they're not already there.
"""
path = os.environ.get(var_name, "").split(os.pathsep)
for dir in directories:
if dir in path:
path.remove(dir)
new_path = tuple(directories) + tuple(path)
path_set(var_name, new_path)
@contextlib.contextmanager
def set_env(**kwargs):
"""Temporarily sets and restores environment variables.
Variables can be set as keyword arguments to this function.
"""
saved = {}
for var, value in kwargs.items():
if var in os.environ:
saved[var] = os.environ[var]
if value is None:
if var in os.environ:
del os.environ[var]
else:
os.environ[var] = value
yield
for var, value in kwargs.items():
if var in saved:
os.environ[var] = saved[var]
else:
if var in os.environ:
del os.environ[var]
class NameModifier(object):
def __init__(self, name, **kwargs):
self.name = name
self.separator = kwargs.get("separator", os.pathsep)
self.args = {"name": name, "separator": self.separator}
self.args.update(kwargs)
def __eq__(self, other):
if not isinstance(other, NameModifier):
return False
return self.name == other.name
def update_args(self, **kwargs):
self.__dict__.update(kwargs)
self.args.update(kwargs)
class NameValueModifier(object):
def __init__(self, name, value, **kwargs):
self.name = name
self.value = value
self.separator = kwargs.get("separator", os.pathsep)
self.args = {"name": name, "value": value, "separator": self.separator}
self.args.update(kwargs)
def __eq__(self, other):
if not isinstance(other, NameValueModifier):
return False
return (
self.name == other.name
and self.value == other.value
and self.separator == other.separator
)
def update_args(self, **kwargs):
self.__dict__.update(kwargs)
self.args.update(kwargs)
class SetEnv(NameValueModifier):
def execute(self, env):
tty.debug("SetEnv: {0}={1}".format(self.name, str(self.value)), level=3)
env[self.name] = str(self.value)
class AppendFlagsEnv(NameValueModifier):
def execute(self, env):
tty.debug("AppendFlagsEnv: {0}={1}".format(self.name, str(self.value)), level=3)
if self.name in env and env[self.name]:
env[self.name] += self.separator + str(self.value)
else:
env[self.name] = str(self.value)
class UnsetEnv(NameModifier):
def execute(self, env):
tty.debug("UnsetEnv: {0}".format(self.name), level=3)
# Avoid throwing if the variable was not set
env.pop(self.name, None)
class RemoveFlagsEnv(NameValueModifier):
def execute(self, env):
tty.debug("RemoveFlagsEnv: {0}-{1}".format(self.name, str(self.value)), level=3)
environment_value = env.get(self.name, "")
flags = environment_value.split(self.separator) if environment_value else []
flags = [f for f in flags if f != self.value]
env[self.name] = self.separator.join(flags)
class SetPath(NameValueModifier):
def execute(self, env):
string_path = concatenate_paths(self.value, separator=self.separator)
tty.debug("SetPath: {0}={1}".format(self.name, string_path), level=3)
env[self.name] = string_path
class AppendPath(NameValueModifier):
def execute(self, env):
tty.debug("AppendPath: {0}+{1}".format(self.name, str(self.value)), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories.append(path_to_os_path(os.path.normpath(self.value)).pop())
env[self.name] = self.separator.join(directories)
class PrependPath(NameValueModifier):
def execute(self, env):
tty.debug("PrependPath: {0}+{1}".format(self.name, str(self.value)), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories = [path_to_os_path(os.path.normpath(self.value)).pop()] + directories
env[self.name] = self.separator.join(directories)
class RemovePath(NameValueModifier):
def execute(self, env):
tty.debug("RemovePath: {0}-{1}".format(self.name, str(self.value)), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories = [
path_to_os_path(os.path.normpath(x)).pop()
for x in directories
if x != path_to_os_path(os.path.normpath(self.value)).pop()
]
env[self.name] = self.separator.join(directories)
class DeprioritizeSystemPaths(NameModifier):
def execute(self, env):
tty.debug("DeprioritizeSystemPaths: {0}".format(self.name), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories = deprioritize_system_paths(
[path_to_os_path(os.path.normpath(x)).pop() for x in directories]
)
env[self.name] = self.separator.join(directories)
class PruneDuplicatePaths(NameModifier):
def execute(self, env):
tty.debug("PruneDuplicatePaths: {0}".format(self.name), level=3)
environment_value = env.get(self.name, "")
directories = environment_value.split(self.separator) if environment_value else []
directories = prune_duplicate_paths(
[path_to_os_path(os.path.normpath(x)).pop() for x in directories]
)
env[self.name] = self.separator.join(directories)
class EnvironmentModifications(object):
"""Keeps track of requests to modify the current environment.
Each call to a method to modify the environment stores the extra
information on the caller in the request:
* 'filename' : filename of the module where the caller is defined
* 'lineno': line number where the request occurred
* 'context' : line of code that issued the request that failed
"""
def __init__(self, other=None, traced=None):
"""Initializes a new instance, copying commands from 'other'
if it is not None.
Args:
other (EnvironmentModifications): list of environment modifications
to be extended (optional)
traced (bool): enable or disable stack trace inspection to log the origin
of the environment modifications.
"""
self.traced = tracing_enabled if traced is None else bool(traced)
self.env_modifications = []
if other is not None:
self.extend(other)
def __iter__(self):
return iter(self.env_modifications)
def __len__(self):
return len(self.env_modifications)
def extend(self, other):
self._check_other(other)
self.env_modifications.extend(other.env_modifications)
@staticmethod
def _check_other(other):
if not isinstance(other, EnvironmentModifications):
raise TypeError("other must be an instance of EnvironmentModifications")
def _maybe_trace(self, kwargs):
"""Provide the modification with stack trace info so that we can track its
origin to find issues in packages. This is very slow and expensive."""
if not self.traced:
return
stack = inspect.stack()
try:
_, filename, lineno, _, context, index = stack[2]
context = context[index].strip()
except Exception:
filename = "unknown file"
lineno = "unknown line"
context = "unknown context"
kwargs.update({"filename": filename, "lineno": lineno, "context": context})
def set(self, name, value, **kwargs):
"""Stores a request to set an environment variable.
Args:
name: name of the environment variable to be set
value: value of the environment variable
"""
self._maybe_trace(kwargs)
item = SetEnv(name, value, **kwargs)
self.env_modifications.append(item)
def append_flags(self, name, value, sep=" ", **kwargs):
"""
Stores in the current object a request to append to an env variable
Args:
name: name of the environment variable to be appended to
value: value to append to the environment variable
Appends with spaces separating different additions to the variable
"""
self._maybe_trace(kwargs)
kwargs.update({"separator": sep})
item = AppendFlagsEnv(name, value, **kwargs)
self.env_modifications.append(item)
def unset(self, name, **kwargs):
"""Stores a request to unset an environment variable.
Args:
name: name of the environment variable to be unset
"""
self._maybe_trace(kwargs)
item = UnsetEnv(name, **kwargs)
self.env_modifications.append(item)
def remove_flags(self, name, value, sep=" ", **kwargs):
"""
Stores in the current object a request to remove flags from an
env variable
Args:
name: name of the environment variable to be removed from
value: value to remove from the environment variable
sep: separator to assume for environment variable
"""
self._maybe_trace(kwargs)
kwargs.update({"separator": sep})
item = RemoveFlagsEnv(name, value, **kwargs)
self.env_modifications.append(item)
def set_path(self, name, elements, **kwargs):
"""Stores a request to set a path generated from a list.
Args:
name: name of the environment variable to be set.
elements: elements of the path to set.
"""
self._maybe_trace(kwargs)
item = SetPath(name, elements, **kwargs)
self.env_modifications.append(item)
def append_path(self, name, path, **kwargs):
"""Stores a request to append a path to a path list.
Args:
name: name of the path list in the environment
path: path to be appended
"""
self._maybe_trace(kwargs)
item = AppendPath(name, path, **kwargs)
self.env_modifications.append(item)
def prepend_path(self, name, path, **kwargs):
"""Same as `append_path`, but the path is pre-pended.
Args:
name: name of the path list in the environment
path: path to be pre-pended
"""
self._maybe_trace(kwargs)
item = PrependPath(name, path, **kwargs)
self.env_modifications.append(item)
def remove_path(self, name, path, **kwargs):
"""Stores a request to remove a path from a path list.
Args:
name: name of the path list in the environment
path: path to be removed
"""
self._maybe_trace(kwargs)
item = RemovePath(name, path, **kwargs)
self.env_modifications.append(item)
def deprioritize_system_paths(self, name, **kwargs):
"""Stores a request to deprioritize system paths in a path list,
otherwise preserving the order.
Args:
name: name of the path list in the environment.
"""
self._maybe_trace(kwargs)
item = DeprioritizeSystemPaths(name, **kwargs)
self.env_modifications.append(item)
def prune_duplicate_paths(self, name, **kwargs):
"""Stores a request to remove duplicates from a path list, otherwise
preserving the order.
Args:
name: name of the path list in the environment.
"""
self._maybe_trace(kwargs)
item = PruneDuplicatePaths(name, **kwargs)
self.env_modifications.append(item)
def group_by_name(self):
"""Returns a dict of the modifications grouped by variable name.
Returns:
dict mapping the environment variable name to the modifications to
be done on it
"""
modifications = collections.defaultdict(list)
for item in self:
modifications[item.name].append(item)
return modifications
def is_unset(self, var_name):
modifications = self.group_by_name()
var_updates = modifications.get(var_name, None)
if not var_updates:
# We did not explicitly unset it
return False
# The last modification must unset the variable for it to be considered
# unset
return type(var_updates[-1]) == UnsetEnv
def clear(self):
"""
Clears the current list of modifications
"""
self.env_modifications = []
def reversed(self):
"""
Returns the EnvironmentModifications object that will reverse self
Only creates reversals for additions to the environment, as reversing
``unset`` and ``remove_path`` modifications is impossible.
Reversible operations are set(), prepend_path(), append_path(),
set_path(), and append_flags().
"""
rev = EnvironmentModifications()
for envmod in reversed(self.env_modifications):
if type(envmod) == SetEnv:
tty.debug("Reversing `Set` environment operation may lose " "original value")
rev.unset(envmod.name)
elif type(envmod) == AppendPath:
rev.remove_path(envmod.name, envmod.value)
elif type(envmod) == PrependPath:
rev.remove_path(envmod.name, envmod.value)
elif type(envmod) == SetPath:
tty.debug("Reversing `SetPath` environment operation may lose " "original value")
rev.unset(envmod.name)
elif type(envmod) == AppendFlagsEnv:
rev.remove_flags(envmod.name, envmod.value)
else:
# This is an irreversible operation
tty.warn(
"Skipping reversal of irreversible operation "
"%s %s" % (type(envmod), envmod.name)
)
return rev
def apply_modifications(self, env=None):
"""Applies the modifications and clears the list."""
# Use os.environ if not specified
# Do not copy, we want to modify it in place
if env is None:
env = os.environ
modifications = self.group_by_name()
# Apply modifications one variable at a time
for name, actions in sorted(modifications.items()):
for x in actions:
x.execute(env)
def shell_modifications(self, shell="sh", explicit=False, env=None):
"""Return shell code to apply the modifications and clears the list."""
modifications = self.group_by_name()
if env is None:
env = os.environ
new_env = env.copy()
for name, actions in sorted(modifications.items()):
for x in actions:
x.execute(new_env)
if "MANPATH" in new_env and not new_env.get("MANPATH").endswith(":"):
new_env["MANPATH"] += ":"
cmds = ""
for name in sorted(set(modifications)):
new = new_env.get(name, None)
old = env.get(name, None)
if explicit or new != old:
if new is None:
cmds += _shell_unset_strings[shell].format(name)
else:
if sys.platform != "win32":
cmd = _shell_set_strings[shell].format(name, shlex.quote(new_env[name]))
else:
cmd = _shell_set_strings[shell].format(name, new_env[name])
cmds += cmd
return cmds
@staticmethod
def from_sourcing_file(filename, *arguments, **kwargs):
"""Constructs an instance of a
:py:class:`llnl.util.envmod.EnvironmentModifications` object
that has the same effect as sourcing a file.
Args:
filename (str): the file to be sourced
*arguments (list): arguments to pass on the command line
Keyword Args:
shell (str): the shell to use (default: ``bash``)
shell_options (str): options passed to the shell (default: ``-c``)
source_command (str): the command to run (default: ``source``)
suppress_output (str): redirect used to suppress output of command
(default: ``&> /dev/null``)
concatenate_on_success (str): operator used to execute a command
only when the previous command succeeds (default: ``&&``)
exclude ([str or re]): ignore any modifications of these
variables (default: [])
include ([str or re]): always respect modifications of these
variables (default: []). Supersedes any excluded variables.
clean (bool): in addition to removing empty entries,
also remove duplicate entries (default: False).
"""
tty.debug("EnvironmentModifications.from_sourcing_file: {0}".format(filename))
# Check if the file actually exists
if not os.path.isfile(filename):
msg = "Trying to source non-existing file: {0}".format(filename)
raise RuntimeError(msg)
# Prepare include and exclude lists of environment variable names
exclude = kwargs.get("exclude", [])
include = kwargs.get("include", [])
clean = kwargs.get("clean", False)
# Other variables unrelated to sourcing a file
exclude.extend(
[
# Bash internals
"SHLVL",
"_",
"PWD",
"OLDPWD",
"PS1",
"PS2",
"ENV",
# Environment modules v4
"LOADEDMODULES",
"_LMFILES_",
"BASH_FUNC_module()",
"MODULEPATH",
"MODULES_(.*)",
r"(\w*)_mod(quar|share)",
# Lmod configuration
r"LMOD_(.*)",
"MODULERCFILE",
]
)
# Compute the environments before and after sourcing
before = sanitize(
environment_after_sourcing_files(os.devnull, **kwargs),
exclude=exclude,
include=include,
)
file_and_args = (filename,) + arguments
after = sanitize(
environment_after_sourcing_files(file_and_args, **kwargs),
exclude=exclude,
include=include,
)
# Delegate to the other factory
return EnvironmentModifications.from_environment_diff(before, after, clean)
@staticmethod
def from_environment_diff(before, after, clean=False):
"""Constructs an instance of a
:py:class:`llnl.util.envmod.EnvironmentModifications` object
from the diff of two dictionaries.
Args:
before (dict): environment before the modifications are applied
after (dict): environment after the modifications are applied
clean (bool): in addition to removing empty entries, also remove
duplicate entries
"""
# Fill the EnvironmentModifications instance
env = EnvironmentModifications()
# New variables
new_variables = list(set(after) - set(before))
# Variables that have been unset
unset_variables = list(set(before) - set(after))
# Variables that have been modified
common_variables = set(before).intersection(set(after))
modified_variables = [x for x in common_variables if before[x] != after[x]]
# Consistent output order - looks nicer, easier comparison...
new_variables.sort()
unset_variables.sort()
modified_variables.sort()
def return_separator_if_any(*args):
separators = ":", ";"
for separator in separators:
for arg in args:
if separator in arg:
return separator
return None
# Add variables to env.
# Assume that variables with 'PATH' in the name or that contain
# separators like ':' or ';' are more likely to be paths
for x in new_variables:
sep = return_separator_if_any(after[x])
if sep:
env.prepend_path(x, after[x], separator=sep)
elif "PATH" in x:
env.prepend_path(x, after[x])
else:
# We just need to set the variable to the new value
env.set(x, after[x])
for x in unset_variables:
env.unset(x)
for x in modified_variables:
value_before = before[x]
value_after = after[x]
sep = return_separator_if_any(value_before, value_after)
if sep:
before_list = value_before.split(sep)
after_list = value_after.split(sep)
# Filter out empty strings
before_list = list(filter(None, before_list))
after_list = list(filter(None, after_list))
# Remove duplicate entries (worse matching, bloats env)
if clean:
before_list = list(llnl.util.lang.dedupe(before_list))
after_list = list(llnl.util.lang.dedupe(after_list))
# The reassembled cleaned entries
value_before = sep.join(before_list)
value_after = sep.join(after_list)
# Paths that have been removed
remove_list = [ii for ii in before_list if ii not in after_list]
# Check that nothing has been added in the middle of
# before_list
remaining_list = [ii for ii in before_list if ii in after_list]
try:
start = after_list.index(remaining_list[0])
end = after_list.index(remaining_list[-1])
search = sep.join(after_list[start : end + 1])
except IndexError:
env.prepend_path(x, value_after)
continue
if search not in value_before:
# We just need to set the variable to the new value
env.prepend_path(x, value_after)
else:
try:
prepend_list = after_list[:start]
prepend_list.reverse() # Preserve order after prepend
except KeyError:
prepend_list = []
try:
append_list = after_list[end + 1 :]
except KeyError:
append_list = []
for item in remove_list:
env.remove_path(x, item)
for item in append_list:
env.append_path(x, item)
for item in prepend_list:
env.prepend_path(x, item)
else:
# We just need to set the variable to the new value
env.set(x, value_after)
return env
def concatenate_paths(paths, separator=os.pathsep):
"""Concatenates an iterable of paths into a string of paths separated by
separator, defaulting to colon.
Args:
paths: iterable of paths
separator: the separator to use; defaults to ``;`` on Windows and ``:`` otherwise
Returns:
string
"""
return separator.join(str(item) for item in paths)
def set_or_unset_not_first(variable, changes, errstream):
"""Check if we are going to set or unset something after other
modifications have already been requested.
"""
indexes = [
ii
for ii, item in enumerate(changes)
if ii != 0 and not item.args.get("force", False) and type(item) in [SetEnv, UnsetEnv]
]
if indexes:
good = "\t \t{context} at {filename}:{lineno}"
nogood = "\t--->\t{context} at {filename}:{lineno}"
message = "Suspicious requests to set or unset '{var}' found"
errstream(message.format(var=variable))
for ii, item in enumerate(changes):
print_format = nogood if ii in indexes else good
errstream(print_format.format(**item.args))
def validate(env, errstream):
"""Validates the environment modifications to check for the presence of
suspicious patterns. Prompts a warning for everything that was found.
Current checks:
- set or unset variables after other changes on the same variable
Args:
env: list of environment modifications
"""
if not env.traced:
return
modifications = env.group_by_name()
for variable, list_of_changes in sorted(modifications.items()):
set_or_unset_not_first(variable, list_of_changes, errstream)
def inspect_path(root, inspections, exclude=None):
"""Inspects ``root`` to search for the subdirectories in ``inspections``.
Adds every path found to a list of prepend-path commands and returns it.
Args:
root (str): absolute path where to search for subdirectories
inspections (dict): maps relative paths to a list of environment
variables that will be modified if the path exists. The
modifications are not performed immediately, but stored in a
command object that is returned to the client.
exclude (typing.Callable): optional callable. If present it must accept an
absolute path and return True if it should be excluded from the
inspection
Examples:
The following lines execute an inspection in ``/usr`` to search for
``/usr/include`` and ``/usr/lib64``. If found we want to prepend
``/usr/include`` to ``CPATH`` and ``/usr/lib64`` to ``MY_LIB64_PATH``.
.. code-block:: python
# Set up the dictionary containing the inspection
inspections = {
'include': ['CPATH'],
'lib64': ['MY_LIB64_PATH']
}
# Get back the list of command needed to modify the environment
env = inspect_path('/usr', inspections)
# Eventually execute the commands
env.apply_modifications()
Returns:
instance of EnvironmentModifications containing the requested
modifications
"""
if exclude is None:
exclude = lambda x: False
env = EnvironmentModifications()
# Inspect the prefix to check for the existence of common directories
for relative_path, variables in inspections.items():
expected = os.path.join(root, relative_path)
if os.path.isdir(expected) and not exclude(expected):
for variable in variables:
env.prepend_path(variable, expected)
return env
@contextlib.contextmanager
def preserve_environment(*variables):
"""Ensures that the value of the environment variables passed as
arguments is the same before entering to the context manager and after
exiting it.
Variables that are unset before entering the context manager will be
explicitly unset on exit.
Args:
variables (list): list of environment variables to be preserved
"""
cache = {}
for var in variables:
# The environment variable to be preserved might not be there.
# In that case store None as a placeholder.
cache[var] = os.environ.get(var, None)
yield
for var in variables:
value = cache[var]
msg = "[PRESERVE_ENVIRONMENT]"
if value is not None:
# Print a debug statement if the value changed
if var not in os.environ:
msg += ' {0} was unset, will be reset to "{1}"'
tty.debug(msg.format(var, value))
elif os.environ[var] != value:
msg += ' {0} was set to "{1}", will be reset to "{2}"'
tty.debug(msg.format(var, os.environ[var], value))
os.environ[var] = value
elif var in os.environ:
msg += ' {0} was set to "{1}", will be unset'
tty.debug(msg.format(var, os.environ[var]))
del os.environ[var]
def environment_after_sourcing_files(*files, **kwargs):
"""Returns a dictionary with the environment that one would have
after sourcing the files passed as argument.
Args:
*files: each item can either be a string containing the path
of the file to be sourced or a sequence, where the first element
is the file to be sourced and the remaining are arguments to be
passed to the command line
Keyword Args:
env (dict): the initial environment (default: current environment)
shell (str): the shell to use (default: ``/bin/bash``)
shell_options (str): options passed to the shell (default: ``-c``)
source_command (str): the command to run (default: ``source``)
suppress_output (str): redirect used to suppress output of command
(default: ``&> /dev/null``)
concatenate_on_success (str): operator used to execute a command
only when the previous command succeeds (default: ``&&``)
"""
# Set the shell executable that will be used to source files
shell_cmd = kwargs.get("shell", "/bin/bash")
shell_options = kwargs.get("shell_options", "-c")
source_command = kwargs.get("source_command", "source")
suppress_output = kwargs.get("suppress_output", "&> /dev/null")
concatenate_on_success = kwargs.get("concatenate_on_success", "&&")
shell = executable.Executable(" ".join([shell_cmd, shell_options]))
def _source_single_file(file_and_args, environment):
source_file = [source_command]
source_file.extend(x for x in file_and_args)
source_file = " ".join(source_file)
# If the environment contains 'python' use it, if not
# go with sys.executable. Below we just need a working
# Python interpreter, not necessarily sys.executable.
python_cmd = executable.which("python3", "python", "python2")
python_cmd = python_cmd.path if python_cmd else sys.executable
dump_cmd = "import os, json; print(json.dumps(dict(os.environ)))"
dump_environment = python_cmd + ' -E -c "{0}"'.format(dump_cmd)
# Try to source the file
source_file_arguments = " ".join(
[
source_file,
suppress_output,
concatenate_on_success,
dump_environment,
]
)
output = shell(source_file_arguments, output=str, env=environment, ignore_quotes=True)
return json.loads(output)
current_environment = kwargs.get("env", dict(os.environ))
for f in files:
# Normalize the input to the helper function
if isinstance(f, str):
f = [f]
current_environment = _source_single_file(f, environment=current_environment)
return current_environment
def sanitize(environment, exclude, include):
"""Returns a copy of the input dictionary where all the keys that
match an excluded pattern and don't match an included pattern are
removed.
Args:
environment (dict): input dictionary
exclude (list): literals or regex patterns to be excluded
include (list): literals or regex patterns to be included
"""
def set_intersection(fullset, *args):
# A set intersection using string literals and regexs
meta = "[" + re.escape("[$()*?[]^{|}") + "]"
subset = fullset & set(args) # As literal
for name in args:
if re.search(meta, name):
pattern = re.compile(name)
for k in fullset:
if re.match(pattern, k):
subset.add(k)
return subset
# Don't modify input, make a copy instead
environment = dict(environment)
# include supersedes any excluded items
prune = set_intersection(set(environment), *exclude)
prune -= set_intersection(prune, *include)
for k in prune:
environment.pop(k, None)
return environment
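Taken together, this module implements a deferred-modification model: requests are
queued on an ``EnvironmentModifications`` object and applied (or rendered as shell
code) later. A hedged usage sketch, assuming the module is importable as
``llnl.util.envmod``:

.. code-block:: python

   from llnl.util.envmod import EnvironmentModifications

   env_mod = EnvironmentModifications()
   env_mod.set("FOO", "bar")
   env_mod.prepend_path("PATH", "/opt/tools/bin")
   env_mod.unset("DEBUG")

   # Apply to a plain dict instead of os.environ to inspect the result
   env = {"PATH": "/usr/bin", "DEBUG": "1"}
   env_mod.apply_modifications(env)
   print(env)  # on POSIX: {'PATH': '/opt/tools/bin:/usr/bin', 'FOO': 'bar'}

   # The same queue can be rendered as shell code instead
   print(env_mod.shell_modifications(shell="sh"))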

View File

@@ -8,11 +8,10 @@
import shlex
import subprocess
import sys
from typing import List, Optional, Union
import llnl.util.tty as tty
import spack.error
from spack.util.path import Path, format_os_path, path_to_os_path, system_path_filter
from llnl.util.path import Path, format_os_path, path_to_os_path, system_path_filter
__all__ = ["Executable", "which", "ProcessError"]
@@ -27,13 +26,20 @@ def __init__(self, name):
# filter back to platform dependent path
self.exe = path_to_os_path(*self.exe)
self.default_env = {}
from spack.util.environment import EnvironmentModifications # no cycle
self.default_envmod = EnvironmentModifications()
self._default_envmod = None
self.returncode = None
if not self.exe:
raise ProcessError("Cannot construct executable for '%s'" % name)
raise ProcessError(f"Cannot construct executable for '{name}'")
@property
def default_envmod(self):
from llnl.util.envmod import EnvironmentModifications
if self._default_envmod is None:
self._default_envmod = EnvironmentModifications()
return self._default_envmod
@system_path_filter
def add_default_arg(self, arg):
@@ -122,6 +128,8 @@ def __call__(self, *args, **kwargs):
By default, the subprocess inherits the parent's file descriptors.
"""
from llnl.util.envmod import EnvironmentModifications
# Environment
env_arg = kwargs.get("env", None)
@@ -130,8 +138,6 @@ def __call__(self, *args, **kwargs):
self.default_envmod.apply_modifications(env)
env.update(self.default_env)
from spack.util.environment import EnvironmentModifications # no cycle
# Apply env argument
if isinstance(env_arg, EnvironmentModifications):
env_arg.apply_modifications(env)
@@ -227,27 +233,26 @@ def streamify(arg, mode):
rc = self.returncode = proc.returncode
if fail_on_error and rc != 0 and (rc not in ignore_errors):
long_msg = cmd_line_string
msg = f"Command {cmd_line_string} exited with status {proc.returncode}."
if result:
# If the output is not captured in the result, it will have
# been stored either in the specified files (e.g. if
# 'output' specifies a file) or written to the parent's
# stdout/stderr (e.g. if 'output' is not specified)
long_msg += "\n" + result
msg += "\n" + result
raise ProcessError("Command exited with status %d:" % proc.returncode, long_msg)
raise ProcessError(msg)
return result
except OSError as e:
raise ProcessError("%s: %s" % (self.exe[0], e.strerror), "Command: " + cmd_line_string)
raise ProcessError(f"{self.exe[0]}: {e.strerror}\n Command: '{cmd_line_string}'")
except subprocess.CalledProcessError as e:
if fail_on_error:
raise ProcessError(
str(e),
"\nExit status %d when invoking command: %s"
% (proc.returncode, cmd_line_string),
f"{str(e)}\n"
f" Exit status {proc.returncode} when invoking command: '{cmd_line_string}'"
)
finally:
@@ -275,16 +280,31 @@ def __str__(self):
@system_path_filter
def which_string(*args, **kwargs):
"""Like ``which()``, but return a string instead of an ``Executable``."""
path = kwargs.get("path", os.environ.get("PATH", ""))
required = kwargs.get("required", False)
def which_string(*args: str, path: Optional[Union[List[str], str]] = None, required: bool = False):
"""Finds an executable in the path like command-line which.
If given multiple executables, returns the first one that is found.
If no executables are found, returns None.
Parameters:
*args: One or more executables to search for
Keyword Arguments:
path: colon-separated (semicolon-separated on windows) string or list of
paths to search. Defaults to ``os.environ["PATH"]``
required: If set to ``True``, raise an error if executable not found
Returns:
Absolute path of the first executable found.
"""
if path is None:
path = os.environ.get("PATH") or ""
if isinstance(path, str):
path = path.split(os.pathsep)
for name in args:
win_candidates = []
win_candidates: List[str] = []
if sys.platform == "win32" and (not name.endswith(".exe") and not name.endswith(".bat")):
win_candidates = [name + ext for ext in [".exe", ".bat"]]
candidate_names = [name] if not win_candidates else win_candidates
@@ -309,19 +329,19 @@ def which_string(*args, **kwargs):
return exe
if required:
raise CommandNotFoundError("spack requires '%s'. Make sure it is in your path." % args[0])
raise CommandNotFoundError(args[0])
return None
def which(*args, **kwargs):
def which(*args: str, path: Optional[Union[List[str], str]] = None, required: bool = False):
"""Finds an executable in the path like command-line which.
If given multiple executables, returns the first one that is found.
If no executables are found, returns None.
Parameters:
*args (str): One or more executables to search for
*args: One or more executables to search for
Keyword Arguments:
path (list or str): The path to search. Defaults to ``PATH``
@@ -330,13 +350,17 @@ def which(*args, **kwargs):
Returns:
Executable: The first executable that is found in the path
"""
exe = which_string(*args, **kwargs)
exe = which_string(*args, path=path, required=required)
return Executable(shlex.quote(exe)) if exe else None
class ProcessError(spack.error.SpackError):
class ProcessError(Exception):
"""ProcessErrors are raised when Executables exit with an error code."""
class CommandNotFoundError(spack.error.SpackError):
class CommandNotFoundError(Exception):
"""Raised when ``which()`` can't find a required executable."""
def __init__(self, command: str):
super().__init__(f"Couldn't find command '{command}'. Make sure it is in your path.")
self.command = command
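For reference, the reworked interface is used roughly like this hedged sketch
(assuming the module is importable as ``llnl.util.executable``; ``output=str`` is
the existing convention for capturing stdout):

.. code-block:: python

   from llnl.util.executable import CommandNotFoundError, which

   ls = which("ls")                 # returns an Executable, or None
   if ls:
       print(ls("-l", output=str))  # capture stdout as a string

   try:
       which("no-such-tool-xyz", required=True)
   except CommandNotFoundError as e:
       print(e.command)             # "no-such-tool-xyz"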

View File

@@ -19,12 +19,11 @@
from sys import platform as _platform
from llnl.util import tty
from llnl.util.executable import CommandNotFoundError, Executable, which
from llnl.util.lang import dedupe, memoized
from llnl.util.path import path_to_os_path, system_path_filter
from llnl.util.symlink import islink, symlink
from spack.util.executable import CommandNotFoundError, Executable, which
from spack.util.path import path_to_os_path, system_path_filter
is_windows = _platform == "win32"
if not is_windows:

View File

@@ -741,6 +741,18 @@ def _n_xxx_ago(x):
raise ValueError(msg)
def pretty_seconds_formatter(seconds):
if seconds >= 1:
multiplier, unit = 1, "s"
elif seconds >= 1e-3:
multiplier, unit = 1e3, "ms"
elif seconds >= 1e-6:
multiplier, unit = 1e6, "us"
else:
multiplier, unit = 1e9, "ns"
return lambda s: "%.3f%s" % (multiplier * s, unit)
def pretty_seconds(seconds):
"""Seconds to string with appropriate units
@@ -750,15 +762,7 @@ def pretty_seconds(seconds):
Returns:
str: Time string with units
"""
if seconds >= 1:
value, unit = seconds, "s"
elif seconds >= 1e-3:
value, unit = seconds * 1e3, "ms"
elif seconds >= 1e-6:
value, unit = seconds * 1e6, "us"
else:
value, unit = seconds * 1e9, "ns"
return "%.3f%s" % (value, unit)
return pretty_seconds_formatter(seconds)(seconds)
class RequiredAttributeError(ValueError):
@@ -886,8 +890,8 @@ def load_module_from_file(module_name, module_path):
# This recipe is adapted from https://stackoverflow.com/a/67692/771663
spec = importlib.util.spec_from_file_location(module_name, module_path) # novm
module = importlib.util.module_from_spec(spec) # novm
spec = importlib.util.spec_from_file_location(module_name, module_path)
module = importlib.util.module_from_spec(spec)
# The module object needs to exist in sys.modules before the
# loader executes the module code.
#
@@ -986,10 +990,9 @@ def enum(**kwargs):
def stable_partition(
input_iterable, # type: Iterable
predicate_fn, # type: Callable[[Any], bool]
):
# type: (...) -> Tuple[List[Any], List[Any]]
input_iterable: Iterable,
predicate_fn: Callable[[Any], bool],
) -> Tuple[List[Any], List[Any]]:
"""Partition the input iterable according to a custom predicate.
Args:
@@ -1061,23 +1064,20 @@ class GroupedExceptionHandler(object):
"""A generic mechanism to coalesce multiple exceptions and preserve tracebacks."""
def __init__(self):
self.exceptions = [] # type: List[Tuple[str, Exception, List[str]]]
self.exceptions: List[Tuple[str, Exception, List[str]]] = []
def __bool__(self):
"""Whether any exceptions were handled."""
return bool(self.exceptions)
def forward(self, context):
# type: (str) -> GroupedExceptionForwarder
def forward(self, context: str) -> "GroupedExceptionForwarder":
"""Return a contextmanager which extracts tracebacks and prefixes a message."""
return GroupedExceptionForwarder(context, self)
def _receive_forwarded(self, context, exc, tb):
# type: (str, Exception, List[str]) -> None
def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
self.exceptions.append((context, exc, tb))
def grouped_message(self, with_tracebacks=True):
# type: (bool) -> str
def grouped_message(self, with_tracebacks: bool = True) -> str:
"""Print out an error message coalescing all the forwarded errors."""
each_exception_message = [
"{0} raised {1}: {2}{3}".format(
@@ -1095,8 +1095,7 @@ class GroupedExceptionForwarder(object):
"""A contextmanager to capture exceptions and forward them to a
GroupedExceptionHandler."""
def __init__(self, context, handler):
# type: (str, GroupedExceptionHandler) -> None
def __init__(self, context: str, handler: GroupedExceptionHandler):
self._context = context
self._handler = handler
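The ``pretty_seconds`` refactor above factors unit selection into
``pretty_seconds_formatter``, so one formatter (and thus one unit) can be reused
across related timings. A small hedged sketch of both entry points:

.. code-block:: python

   from llnl.util.lang import pretty_seconds, pretty_seconds_formatter

   print(pretty_seconds(90.0))    # 90.000s
   print(pretty_seconds(0.004))   # 4.000ms
   print(pretty_seconds(2.5e-7))  # 250.000ns

   # Reusing one formatter keeps a batch of timings in the same unit
   fmt = pretty_seconds_formatter(0.004)
   for s in (0.004, 0.0041, 0.0039):
       print(fmt(s))              # 4.000ms, 4.100ms, 3.900ms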

View File

@@ -10,11 +10,10 @@
import time
from datetime import datetime
import llnl.util.string
import llnl.util.tty as tty
from llnl.util.lang import pretty_seconds
import spack.util.string
if sys.platform != "win32":
import fcntl
@@ -165,7 +164,7 @@ def _attempts_str(wait_time, nattempts):
if nattempts <= 1:
return ""
attempts = spack.util.string.plural(nattempts, "attempt")
attempts = llnl.util.string.plural(nattempts, "attempt")
return " after {} and {}".format(pretty_seconds(wait_time), attempts)

lib/spack/llnl/util/path.py (new file, 159 lines)
View File

@@ -0,0 +1,159 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Utilities for managing Linux and Windows paths."""
# TODO: look at using pathlib since we now only support Python 3
import os
import re
import sys
from urllib.parse import urlparse
is_windows = sys.platform == "win32"
def is_path_url(path):
if "\\" in path:
return False
url_tuple = urlparse(path)
return bool(url_tuple.scheme) and len(url_tuple.scheme) > 1
def win_exe_ext():
return ".exe"
def path_to_os_path(*pths):
"""
Takes an arbitrary number of positional parameters
converts each argument of type string to use a normalized
filepath separator, and returns a list of all values
"""
ret_pths = []
for pth in pths:
if isinstance(pth, str) and not is_path_url(pth):
pth = convert_to_platform_path(pth)
ret_pths.append(pth)
return ret_pths
def sanitize_file_path(pth):
"""
Formats strings to contain only characters that can
be used to generate legal file paths.
Criteria for legal files based on
https://en.wikipedia.org/wiki/Filename#Comparison_of_filename_limitations
Args:
pth: string containing path to be created
on the host filesystem
Return:
sanitized string that can legally be made into a path
"""
# on unix, splitting path by separators will remove
# instances of illegal characters on join
pth_cmpnts = pth.split(os.path.sep)
if is_windows:
drive_match = r"[a-zA-Z]:"
is_abs = bool(re.match(drive_match, pth_cmpnts[0]))
drive = pth_cmpnts[0] + os.path.sep if is_abs else ""
pth_cmpnts = pth_cmpnts[1:] if drive else pth_cmpnts
illegal_chars = r'[<>?:"|*\\]'
else:
drive = "/" if not pth_cmpnts[0] else ""
illegal_chars = r"[/]"
pth = []
for cmp in pth_cmpnts:
san_cmp = re.sub(illegal_chars, "", cmp)
pth.append(san_cmp)
return drive + os.path.join(*pth)
def system_path_filter(_func=None, arg_slice=None):
"""
Filters function arguments to account for platform path separators.
Optional slicing range can be specified to select specific arguments
This decorator takes all (or a slice) of a method's positional arguments
and normalizes usage of filepath separators on a per platform basis.
Note: **kwargs, urls, and any type that is not a string are ignored
so in such cases where path normalization is required, that should be
handled by calling path_to_os_path directly as needed.
Parameters:
arg_slice (slice): a slice object specifying the slice of arguments
in the decorated method over which filepath separators are
normalized
"""
from functools import wraps
def holder_func(func):
@wraps(func)
def path_filter_caller(*args, **kwargs):
args = list(args)
if arg_slice:
args[arg_slice] = path_to_os_path(*args[arg_slice])
else:
args = path_to_os_path(*args)
return func(*args, **kwargs)
return path_filter_caller
if _func:
return holder_func(_func)
return holder_func
class Path:
"""
Describes the filepath separator types in an enum style, with a helper
attribute exposing the path type of the current platform.
"""
unix = 0
windows = 1
platform_path = windows if is_windows else unix
def format_os_path(path, mode=Path.unix):
"""
Format path to use consistent, platform specific
separators. Absolute paths are converted between
drive letters and a prepended '/' as per platform
requirement.
Parameters:
path (str): the path to be normalized, must be a string
or expose the replace method.
mode (Path): the path separator style to normalize the
passed path to. Default is unix style, i.e. '/'
"""
if not path:
return path
if mode == Path.windows:
path = path.replace("/", "\\")
else:
path = path.replace("\\", "/")
return path
def convert_to_posix_path(path):
return format_os_path(path, mode=Path.unix)
def convert_to_windows_path(path):
return format_os_path(path, mode=Path.windows)
def convert_to_platform_path(path):
return format_os_path(path, mode=Path.platform_path)
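A hedged usage sketch of the converters defined above (assuming the module is
importable as ``llnl.util.path``):

.. code-block:: python

   from llnl.util.path import (
       convert_to_posix_path,
       convert_to_windows_path,
       sanitize_file_path,
   )

   print(convert_to_posix_path("C:\\spack\\opt"))  # C:/spack/opt
   print(convert_to_windows_path("/spack/opt"))    # \spack\opt
   # Illegal characters are platform-specific: '<' and '>' are stripped
   # on Windows but left alone on POSIX.
   print(sanitize_file_path("pkg<1.0>/lib"))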

View File

@@ -21,12 +21,12 @@
import traceback
from contextlib import contextmanager
from threading import Thread
from types import ModuleType # novm
from typing import Optional # novm
from types import ModuleType
from typing import Optional
import llnl.util.tty as tty
termios = None # type: Optional[ModuleType]
termios: Optional[ModuleType] = None
try:
import termios as term_mod

View File

@@ -24,8 +24,7 @@
import traceback
import llnl.util.tty.log as log
from spack.util.executable import which
from llnl.util.executable import which
termios = None
try:

View File

@@ -5,12 +5,12 @@
import os
from llnl.util.executable import Executable, ProcessError
from llnl.util.lang import memoized
import spack.spec
from spack.compilers.clang import Clang
from spack.spec import CompilerSpec
from spack.util.executable import Executable, ProcessError
class ABI(object):

View File

@@ -9,12 +9,16 @@
import json
import multiprocessing.pool
import os
import re
import shutil
import sys
import tarfile
import tempfile
import time
import traceback
import urllib.error
import urllib.parse
import urllib.request
import warnings
from contextlib import closing
from urllib.error import HTTPError, URLError
@@ -24,6 +28,7 @@
import llnl.util.filesystem as fsys
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.executable import which
from llnl.util.filesystem import BaseDirectoryVisitor, mkdirp, visit_directory_tree
import spack.cmd
@@ -38,6 +43,7 @@
import spack.store
import spack.util.file_cache as file_cache
import spack.util.gpg
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
@@ -46,7 +52,6 @@
from spack.relocate import utf8_paths_to_single_binary_regex
from spack.spec import Spec
from spack.stage import Stage
from spack.util.executable import which
_build_cache_relative_path = "build_cache"
_build_cache_keys_relative_path = "_pgp"
@@ -342,7 +347,6 @@ def update(self, with_cooldown=False):
for cached_mirror_url in self._local_index_cache:
cache_entry = self._local_index_cache[cached_mirror_url]
cached_index_hash = cache_entry["index_hash"]
cached_index_path = cache_entry["index_path"]
if cached_mirror_url in configured_mirror_urls:
# Only do a fetch if the last fetch was longer than TTL ago
@@ -361,13 +365,14 @@ def update(self, with_cooldown=False):
# May need to fetch the index and update the local caches
try:
needs_regen = self._fetch_and_cache_index(
cached_mirror_url, expect_hash=cached_index_hash
cached_mirror_url,
cache_entry=cache_entry,
)
self._last_fetch_times[cached_mirror_url] = (now, True)
all_methods_failed = False
except FetchCacheError as fetch_error:
except FetchIndexError as e:
needs_regen = False
fetch_errors.extend(fetch_error.errors)
fetch_errors.append(e)
self._last_fetch_times[cached_mirror_url] = (now, False)
# The need to regenerate implies a need to clear as well.
spec_cache_clear_needed |= needs_regen
@@ -396,20 +401,22 @@ def update(self, with_cooldown=False):
# already have in our cache must be fetched, stored, and represented
# locally.
for mirror_url in configured_mirror_urls:
if mirror_url not in self._local_index_cache:
# Need to fetch the index and update the local caches
try:
needs_regen = self._fetch_and_cache_index(mirror_url)
self._last_fetch_times[mirror_url] = (now, True)
all_methods_failed = False
except FetchCacheError as fetch_error:
fetch_errors.extend(fetch_error.errors)
needs_regen = False
self._last_fetch_times[mirror_url] = (now, False)
# Generally speaking, a new mirror wouldn't imply the need to
# clear the spec cache, so leave it as is.
if needs_regen:
spec_cache_regenerate_needed = True
if mirror_url in self._local_index_cache:
continue
# Need to fetch the index and update the local caches
try:
needs_regen = self._fetch_and_cache_index(mirror_url)
self._last_fetch_times[mirror_url] = (now, True)
all_methods_failed = False
except FetchIndexError as e:
fetch_errors.append(e)
needs_regen = False
self._last_fetch_times[mirror_url] = (now, False)
# Generally speaking, a new mirror wouldn't imply the need to
# clear the spec cache, so leave it as is.
if needs_regen:
spec_cache_regenerate_needed = True
self._write_local_index_cache()
@@ -423,7 +430,7 @@ def update(self, with_cooldown=False):
if spec_cache_regenerate_needed:
self.regenerate_spec_cache(clear_existing=spec_cache_clear_needed)
def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
def _fetch_and_cache_index(self, mirror_url, cache_entry={}):
"""Fetch a buildcache index file from a remote mirror and cache it.
If we already have a cached index from this mirror, then we first
@@ -431,102 +438,50 @@ def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
Args:
mirror_url (str): Base url of mirror
expect_hash (str): If provided, this hash will be compared against
the index hash we retrieve from the mirror, to determine if we
need to fetch the index or not.
cache_entry (dict): Old cache metadata with keys ``index_hash``, ``index_path``,
``etag``
Returns:
True if this function thinks the concrete spec cache,
``_mirrors_for_spec``, should be regenerated. Returns False
otherwise.
Throws:
FetchCacheError: a composite exception.
"""
index_fetch_url = url_util.join(mirror_url, _build_cache_relative_path, "index.json")
hash_fetch_url = url_util.join(mirror_url, _build_cache_relative_path, "index.json.hash")
True if the local index.json was updated.
if not web_util.url_exists(index_fetch_url):
# A binary mirror is not required to have an index, so avoid
# raising FetchCacheError in that case.
Throws:
FetchIndexError
"""
# TODO: get rid of this request, handle 404 better
if not web_util.url_exists(
url_util.join(mirror_url, _build_cache_relative_path, "index.json")
):
return False
old_cache_key = None
fetched_hash = None
errors = []
# Fetch the hash first so we can check if we actually need to fetch
# the index itself.
try:
_, _, fs = web_util.read_from_url(hash_fetch_url)
fetched_hash = codecs.getreader("utf-8")(fs).read()
except (URLError, web_util.SpackWebError) as url_err:
errors.append(
RuntimeError(
"Unable to read index hash {0} due to {1}: {2}".format(
hash_fetch_url, url_err.__class__.__name__, str(url_err)
)
)
etag = cache_entry.get("etag", None)
if etag:
fetcher = EtagIndexFetcher(mirror_url, etag)
else:
fetcher = DefaultIndexFetcher(
mirror_url, local_hash=cache_entry.get("index_hash", None)
)
# The only case where we'll skip attempting to fetch the buildcache
# index from the mirror is when we already have a hash for this
# mirror, we were able to retrieve one from the mirror, and
# the two hashes are the same.
if expect_hash and fetched_hash:
if fetched_hash == expect_hash:
tty.debug("Cached index for {0} already up to date".format(mirror_url))
return False
else:
# We expected a hash, we fetched a hash, and they were not the
# same. If we end up fetching an index successfully and
# replacing our entry for this mirror, we should clean up the
# existing cache file
if mirror_url in self._local_index_cache:
existing_entry = self._local_index_cache[mirror_url]
old_cache_key = existing_entry["index_path"]
result = fetcher.conditional_fetch()
tty.debug("Fetching index from {0}".format(index_fetch_url))
# Fetch index itself
try:
_, _, fs = web_util.read_from_url(index_fetch_url)
index_object_str = codecs.getreader("utf-8")(fs).read()
except (URLError, web_util.SpackWebError) as url_err:
errors.append(
RuntimeError(
"Unable to read index {0} due to {1}: {2}".format(
index_fetch_url, url_err.__class__.__name__, str(url_err)
)
)
)
raise FetchCacheError(errors)
locally_computed_hash = compute_hash(index_object_str)
if fetched_hash is not None and locally_computed_hash != fetched_hash:
msg = (
"Computed index hash [{0}] did not match remote [{1}, url:{2}] "
"indicating error in index transmission"
).format(locally_computed_hash, fetched_hash, hash_fetch_url)
errors.append(RuntimeError(msg))
# We somehow got an index that doesn't match the remote one; maybe
# the next time we try we'll be successful.
raise FetchCacheError(errors)
# Nothing to do
if result.fresh:
return False
# Persist new index.json
url_hash = compute_hash(mirror_url)
cache_key = "{0}_{1}.json".format(url_hash[:10], locally_computed_hash[:10])
cache_key = "{}_{}.json".format(url_hash[:10], result.hash[:10])
self._index_file_cache.init_entry(cache_key)
with self._index_file_cache.write_transaction(cache_key) as (old, new):
new.write(index_object_str)
new.write(result.data)
self._local_index_cache[mirror_url] = {
"index_hash": locally_computed_hash,
"index_hash": result.hash,
"index_path": cache_key,
"etag": result.etag,
}
# clean up the old cache_key if necessary
old_cache_key = cache_entry.get("index_path", None)
if old_cache_key:
self._index_file_cache.remove(old_cache_key)
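For orientation, the per-mirror entry written above has this shape (a sketch; the URL, hash fragments, and ETag below are invented for illustration):

```python
# Sketch of one _local_index_cache entry as written above; all values
# here are made up.
local_index_cache = {
    "https://mirror.example.com/buildcache": {
        "index_hash": "1f3a5b7c9d...",               # sha256 of the fetched index.json
        "index_path": "0a1b2c3d4e_1f3a5b7c9d.json",  # key into _index_file_cache
        "etag": '"abc123"',                          # HTTP ETag, when the server sent one
    }
}
```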
@@ -623,7 +578,9 @@ class UnsignedPackageException(spack.error.SpackError):
def compute_hash(data):
return hashlib.sha256(data.encode("utf-8")).hexdigest()
if isinstance(data, str):
data = data.encode("utf-8")
return hashlib.sha256(data).hexdigest()
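Since ``compute_hash`` now encodes ``str`` input itself, callers can pass either decoded text or raw response bytes; a minimal check of that property:

```python
import hashlib


def compute_hash(data):
    # str is normalized to UTF-8 bytes before hashing, as in the diff above
    if isinstance(data, str):
        data = data.encode("utf-8")
    return hashlib.sha256(data).hexdigest()


assert compute_hash("index.json") == compute_hash(b"index.json")
```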
def build_cache_relative_path():
@@ -1576,10 +1533,6 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
)
)
tty.warn(
"download_tarball() was unable to download "
+ "{0} from any configured mirrors".format(spec)
)
return None
@@ -2417,3 +2370,126 @@ def __call__(self, spec, **kwargs):
# Matching a spec constraint
matches = [s for s in self.possible_specs if s.satisfies(spec)]
return matches
class FetchIndexError(Exception):
def __str__(self):
if len(self.args) == 1:
return str(self.args[0])
else:
return "{}, due to: {}".format(self.args[0], self.args[1])
FetchIndexResult = collections.namedtuple("FetchIndexResult", "etag hash data fresh")
class DefaultIndexFetcher:
"""Fetcher for index.json, using separate index.json.hash as cache invalidation strategy"""
def __init__(self, url, local_hash, urlopen=web_util.urlopen):
self.url = url
self.local_hash = local_hash
self.urlopen = urlopen
self.headers = {"User-Agent": web_util.SPACK_USER_AGENT}
def get_remote_hash(self):
# Failure to fetch index.json.hash is not fatal
url_index_hash = url_util.join(self.url, _build_cache_relative_path, "index.json.hash")
try:
response = self.urlopen(urllib.request.Request(url_index_hash, headers=self.headers))
except urllib.error.URLError:
return None
# Validate the hash
remote_hash = response.read(64)
if not re.match(rb"[a-f\d]{64}$", remote_hash):
return None
return remote_hash.decode("utf-8")
def conditional_fetch(self):
# Do an intermediate fetch for the hash
# and a conditional fetch for the contents
# Early exit if our cache is up to date.
if self.local_hash and self.local_hash == self.get_remote_hash():
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
# Otherwise, download index.json
url_index = url_util.join(self.url, _build_cache_relative_path, "index.json")
try:
response = self.urlopen(urllib.request.Request(url_index, headers=self.headers))
except urllib.error.URLError as e:
raise FetchIndexError("Could not fetch index from {}".format(url_index), e)
try:
result = codecs.getreader("utf-8")(response).read()
except ValueError as e:
raise FetchIndexError("Remote index {} is invalid".format(url_index), e) from e
computed_hash = compute_hash(result)
# We don't handle computed_hash != remote_hash here, which can happen
# when remote index.json and index.json.hash are out of sync, or if
# the hash algorithm changed.
# The most likely scenario is that index.json got updated
# while we fetched index.json.hash. Warning about an issue thus feels
# wrong, as it's more of an issue with race conditions in the cache
# invalidation strategy.
# For now we only handle etags on http(s), since 304 error handling
# in s3:// is not there yet.
if urllib.parse.urlparse(self.url).scheme not in ("http", "https"):
etag = None
else:
etag = web_util.parse_etag(
response.headers.get("Etag", None) or response.headers.get("etag", None)
)
return FetchIndexResult(
etag=etag,
hash=computed_hash,
data=result,
fresh=False,
)
class EtagIndexFetcher:
"""Fetcher for index.json, using ETags headers as cache invalidation strategy"""
def __init__(self, url, etag, urlopen=web_util.urlopen):
self.url = url
self.etag = etag
self.urlopen = urlopen
def conditional_fetch(self):
# Just do a conditional fetch immediately
url = url_util.join(self.url, _build_cache_relative_path, "index.json")
headers = {
"User-Agent": web_util.SPACK_USER_AGENT,
"If-None-Match": '"{}"'.format(self.etag),
}
try:
response = self.urlopen(urllib.request.Request(url, headers=headers))
except urllib.error.HTTPError as e:
if e.getcode() == 304:
# Not modified; that means fresh.
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
raise FetchIndexError("Could not fetch index {}".format(url), e) from e
except urllib.error.URLError as e:
raise FetchIndexError("Could not fetch index {}".format(url), e) from e
try:
result = codecs.getreader("utf-8")(response).read()
except ValueError as e:
raise FetchIndexError("Remote index {} is invalid".format(url), e) from e
headers = response.headers
etag_header_value = headers.get("Etag", None) or headers.get("etag", None)
return FetchIndexResult(
etag=web_util.parse_etag(etag_header_value),
hash=compute_hash(result),
data=result,
fresh=False,
)
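Putting the two classes together, the selection logic in ``_fetch_and_cache_index`` above reduces to roughly the following sketch, reusing the names from the diff (``cache_entry`` is the stored per-mirror metadata):

```python
def fetch_index(mirror_url, cache_entry):
    # Prefer the ETag strategy when we have one cached; otherwise fall back
    # to comparing our stored hash against the remote index.json.hash.
    etag = cache_entry.get("etag")
    if etag:
        fetcher = EtagIndexFetcher(mirror_url, etag)
    else:
        fetcher = DefaultIndexFetcher(mirror_url, local_hash=cache_entry.get("index_hash"))

    result = fetcher.conditional_fetch()  # FetchIndexResult(etag, hash, data, fresh)
    if result.fresh:
        return None  # local cache is already up to date; nothing to persist
    return result    # caller persists result.data and records result.hash / result.etag
```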

View File

@@ -12,12 +12,12 @@
import archspec.cpu
import llnl.util.envmod
import llnl.util.executable
import llnl.util.filesystem as fs
from llnl.util import tty
import spack.store
import spack.util.environment
import spack.util.executable
from .config import spec_for_current_python
@@ -186,11 +186,11 @@ def _executables_in_store(executables, query_spec, query_info=None):
if (
os.path.exists(bin_dir)
and os.path.isdir(bin_dir)
and spack.util.executable.which_string(*executables, path=bin_dir)
and llnl.util.executable.which_string(*executables, path=bin_dir)
):
spack.util.environment.path_put_first("PATH", [bin_dir])
llnl.util.envmod.path_put_first("PATH", [bin_dir])
if query_info is not None:
query_info["command"] = spack.util.executable.which(*executables, path=bin_dir)
query_info["command"] = llnl.util.executable.which(*executables, path=bin_dir)
query_info["spec"] = concrete_spec
return True
return False

View File

@@ -29,7 +29,10 @@
import os.path
import sys
import uuid
from typing import Callable, List, Optional
import llnl.util.envmod
import llnl.util.executable
from llnl.util import tty
from llnl.util.lang import GroupedExceptionHandler
@@ -45,8 +48,6 @@
import spack.spec
import spack.store
import spack.user_environment
import spack.util.environment
import spack.util.executable
import spack.util.path
import spack.util.spack_yaml
import spack.util.url
@@ -70,12 +71,12 @@
_bootstrap_methods = {}
def bootstrapper(bootstrapper_type):
def bootstrapper(bootstrapper_type: str):
"""Decorator to register classes implementing bootstrapping
methods.
Args:
bootstrapper_type (str): string identifying the class
bootstrapper_type: string identifying the class
"""
def _register(cls):
@@ -119,26 +120,26 @@ def mirror_scope(self):
self.config_scope_name, {"mirrors:": {self.name: self.mirror_url}}
)
def try_import(self, module: str, abstract_spec_str: str): # pylint: disable=unused-argument
def try_import(self, module: str, abstract_spec_str: str) -> bool:
"""Try to import a Python module from a spec satisfying the abstract spec
passed as argument.
Args:
module (str): Python module name to try importing
abstract_spec_str (str): abstract spec that can provide the Python module
module: Python module name to try importing
abstract_spec_str: abstract spec that can provide the Python module
Return:
True if the Python module could be imported, False otherwise
"""
return False
def try_search_path(self, executables, abstract_spec_str): # pylint: disable=unused-argument
def try_search_path(self, executables: List[str], abstract_spec_str: str) -> bool:
"""Try to search some executables in the prefix of specs satisfying the abstract
spec passed as argument.
Args:
executables (list of str): executables to be found
abstract_spec_str (str): abstract spec that can provide the Python module
executables: executables to be found
abstract_spec_str: abstract spec that can provide the Python module
Return:
True if the executables are found, False otherwise
@@ -347,7 +348,7 @@ def source_is_enabled_or_raise(conf):
raise ValueError("source is not trusted")
def ensure_module_importable_or_raise(module, abstract_spec=None):
def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str] = None):
"""Make the requested module available for import, or raise.
This function tries to import a Python module in the current interpreter
@@ -357,8 +358,8 @@ def ensure_module_importable_or_raise(module, abstract_spec=None):
on first success.
Args:
module (str): module to be imported in the current interpreter
abstract_spec (str): abstract spec that might provide the module. If not
module: module to be imported in the current interpreter
abstract_spec: abstract spec that might provide the module. If not
given it defaults to "module"
Raises:
@@ -395,7 +396,11 @@ def ensure_module_importable_or_raise(module, abstract_spec=None):
raise ImportError(msg)
def ensure_executables_in_path_or_raise(executables, abstract_spec, cmd_check=None):
def ensure_executables_in_path_or_raise(
executables: list,
abstract_spec: str,
cmd_check: Optional[Callable[[llnl.util.executable.Executable], bool]] = None,
):
"""Ensure that some executables are in path or raise.
Args:
@@ -403,7 +408,7 @@ def ensure_executables_in_path_or_raise(executables, abstract_spec, cmd_check=No
in order. The function exits on the first one found.
abstract_spec (str): abstract spec that provides the executables
cmd_check (object): callable predicate that takes a
``spack.util.executable.Executable`` command and validates it. Should return
``llnl.util.executable.Executable`` command and validates it. Should return
``True`` if the executable is acceptable, ``False`` otherwise.
Can be used to, e.g., ensure a suitable version of the command before
accepting for bootstrapping.
@@ -415,7 +420,7 @@ def ensure_executables_in_path_or_raise(executables, abstract_spec, cmd_check=No
Executable object
"""
cmd = spack.util.executable.which(*executables)
cmd = llnl.util.executable.which(*executables)
if cmd:
if not cmd_check or cmd_check(cmd):
return cmd
@@ -434,7 +439,7 @@ def ensure_executables_in_path_or_raise(executables, abstract_spec, cmd_check=No
current_bootstrapper.last_search["spec"],
current_bootstrapper.last_search["command"],
)
env_mods = spack.util.environment.EnvironmentModifications()
env_mods = llnl.util.envmod.EnvironmentModifications()
for dep in concrete_spec.traverse(
root=True, order="post", deptype=("link", "run")
):
@@ -509,7 +514,7 @@ def verify_patchelf(patchelf):
Arguments:
patchelf (spack.util.executable.Executable): patchelf executable
patchelf (llnl.util.executable.Executable): patchelf executable
"""
out = patchelf("--version", output=str, error=os.devnull, fail_on_error=False).strip()
if patchelf.returncode != 0:
@@ -555,11 +560,11 @@ def all_core_root_specs():
return [clingo_root_spec(), gnupg_root_spec(), patchelf_root_spec()]
def bootstrapping_sources(scope=None):
def bootstrapping_sources(scope: Optional[str] = None):
"""Return the list of configured sources of software for bootstrapping Spack
Args:
scope (str or None): if a valid configuration scope is given, return the
scope: if a valid configuration scope is given, return the
list only from that scope
"""
source_configs = spack.config.get("bootstrap:sources", default=None, scope=scope)

View File

@@ -12,12 +12,12 @@
import archspec.cpu
import llnl.util.executable
from llnl.util import tty
import spack.build_environment
import spack.environment
import spack.tengine
import spack.util.executable
from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path
@@ -120,7 +120,7 @@ def update_syspath_and_environ(self):
)
def _install_with_depfile(self):
spackcmd = spack.util.executable.which("spack")
spackcmd = llnl.util.executable.which("spack")
spackcmd(
"-e",
str(self.environment_root()),
@@ -129,7 +129,7 @@ def _install_with_depfile(self):
"-o",
str(self.environment_root().joinpath("Makefile")),
)
make = spack.util.executable.which("make")
make = llnl.util.executable.which("make")
kwargs = {}
if not tty.is_debug():
kwargs = {"output": os.devnull, "error": os.devnull}

View File

@@ -5,7 +5,7 @@
"""Query the status of bootstrapping on this machine"""
import platform
import spack.util.executable
import llnl.util.executable
from ._common import _executables_in_store, _python_import, _try_import_from_store
from .config import ensure_bootstrap_configuration
@@ -24,7 +24,7 @@ def _required_system_executable(exes, msg):
"""Search for an executable is the system path only."""
if isinstance(exes, str):
exes = (exes,)
if spack.util.executable.which_string(*exes):
if llnl.util.executable.which_string(*exes):
return True, None
return False, msg
@@ -33,7 +33,7 @@ def _required_executable(exes, query_spec, msg):
"""Search for an executable in the system path or in the bootstrap store."""
if isinstance(exes, str):
exes = (exes,)
if spack.util.executable.which_string(*exes) or _executables_in_store(exes, query_spec):
if llnl.util.executable.which_string(*exes) or _executables_in_store(exes, query_spec):
return True, None
return False, msg

View File

@@ -43,7 +43,19 @@
from typing import List, Tuple
import llnl.util.tty as tty
from llnl.util.envmod import (
EnvironmentModifications,
env_flag,
filter_system_paths,
get_path,
inspect_path,
is_system_path,
system_dirs,
validate,
)
from llnl.util.executable import Executable
from llnl.util.lang import dedupe
from llnl.util.string import plural
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
from llnl.util.tty.log import MultiProcessFd
@@ -63,25 +75,12 @@
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.path
import spack.util.pattern
from spack.error import NoHeadersError, NoLibrariesError
from spack.installer import InstallError
from spack.util.cpus import cpus_available
from spack.util.environment import (
EnvironmentModifications,
env_flag,
filter_system_paths,
get_path,
inspect_path,
is_system_path,
system_dirs,
validate,
)
from spack.util.executable import Executable
from spack.util.log_parse import make_log_context, parse_log_events
from spack.util.module_cmd import load_module, module, path_from_modules
from spack.util.string import plural
#
# This can be set by the user to globally disable parallel builds.
@@ -1304,7 +1303,7 @@ class ChildError(InstallError):
# List of errors considered "build errors", for which we'll show log
# context instead of Python context.
build_errors = [("spack.util.executable", "ProcessError")]
build_errors = [("llnl.util.executable", "ProcessError")]
def __init__(self, msg, module, classname, traceback_string, log_name, log_type, context):
super(ChildError, self).__init__(msg)

View File

@@ -3,23 +3,25 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from typing import List
import llnl.util.lang
import spack.builder
import spack.installer
import spack.relocate
import spack.spec
import spack.store
def sanity_check_prefix(builder):
def sanity_check_prefix(builder: spack.builder.Builder):
"""Check that specific directories and files are created after installation.
The files to be checked are in the ``sanity_check_is_file`` attribute of the
package object, while the directories are in the ``sanity_check_is_dir`` attribute.
Args:
builder (spack.builder.Builder): builder that installed the package
builder: builder that installed the package
"""
pkg = builder.pkg
@@ -43,7 +45,7 @@ def check_paths(path_list, filetype, predicate):
raise spack.installer.InstallError(msg.format(pkg.name))
def apply_macos_rpath_fixups(builder):
def apply_macos_rpath_fixups(builder: spack.builder.Builder):
"""On Darwin, make installed libraries more easily relocatable.
Some build systems (handrolled, autotools, makefiles) can set their own
@@ -55,20 +57,22 @@ def apply_macos_rpath_fixups(builder):
packages) that do not install relocatable libraries by default.
Args:
builder (spack.builder.Builder): builder that installed the package
builder: builder that installed the package
"""
spack.relocate.fixup_macos_rpaths(builder.spec)
def ensure_build_dependencies_or_raise(spec, dependencies, error_msg):
def ensure_build_dependencies_or_raise(
spec: spack.spec.Spec, dependencies: List[spack.spec.Spec], error_msg: str
):
"""Ensure that some build dependencies are present in the concrete spec.
If not, raise a RuntimeError with a helpful error message.
Args:
spec (spack.spec.Spec): concrete spec to be checked.
dependencies (list of spack.spec.Spec): list of abstract specs to be satisfied
error_msg (str): brief error message to be prepended to a longer description
spec: concrete spec to be checked.
dependencies: list of abstract specs to be satisfied
error_msg: brief error message to be prepended to a longer description
Raises:
RuntimeError: when the required build dependencies are not found
@@ -83,7 +87,9 @@ def ensure_build_dependencies_or_raise(spec, dependencies, error_msg):
# Raise an exception on missing deps.
msg = (
"{0}: missing dependencies: {1}.\n\nPlease add "
"the following lines to the package:\n\n".format(error_msg, ", ".join(missing_deps))
"the following lines to the package:\n\n".format(
error_msg, ", ".join(str(d) for d in missing_deps)
)
)
for dep in missing_deps:
@@ -95,21 +101,21 @@ def ensure_build_dependencies_or_raise(spec, dependencies, error_msg):
raise RuntimeError(msg)
def execute_build_time_tests(builder):
def execute_build_time_tests(builder: spack.builder.Builder):
"""Execute the build-time tests prescribed by builder.
Args:
builder (Builder): builder prescribing the test callbacks. The names of the callbacks
builder: builder prescribing the test callbacks. The names of the callbacks
are stored as a list of strings in the ``build_time_test_callbacks`` attribute.
"""
builder.pkg.run_test_callbacks(builder, builder.build_time_test_callbacks, "build")
def execute_install_time_tests(builder):
def execute_install_time_tests(builder: spack.builder.Builder):
"""Execute the install-time tests prescribed by builder.
Args:
builder (Builder): builder prescribing the test callbacks. The names of the callbacks
builder: builder prescribing the test callbacks. The names of the callbacks
are stored as a list of strings in the ``install_time_test_callbacks`` attribute.
"""
builder.pkg.run_test_callbacks(builder, builder.install_time_test_callbacks, "install")

View File

@@ -2,11 +2,11 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.executable
import llnl.util.filesystem as fs
import spack.directives
import spack.package_base
import spack.util.executable
from .autotools import AutotoolsBuilder, AutotoolsPackage
@@ -22,7 +22,7 @@ def configure(self, pkg, spec, prefix):
prezip = spec["aspell"].prefix.bin.prezip
destdir = prefix
sh = spack.util.executable.which("sh")
sh = llnl.util.executable.which("sh")
sh(
"./configure",
"--vars",

View File

@@ -11,6 +11,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.executable import Executable
import spack.build_environment
import spack.builder
@@ -18,7 +19,6 @@
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from spack.operating_systems.mac_os import macos_version
from spack.util.executable import Executable
from spack.version import Version
from ._checks import (

View File

@@ -15,7 +15,6 @@
import spack.build_environment
import spack.builder
import spack.package_base
import spack.util.path
from spack.directives import build_system, depends_on, variant
from spack.multimethod import when

View File

@@ -11,6 +11,8 @@
import xml.etree.ElementTree as ElementTree
import llnl.util.tty as tty
from llnl.util.envmod import EnvironmentModifications
from llnl.util.executable import Executable
from llnl.util.filesystem import (
HeaderList,
LibraryList,
@@ -25,8 +27,6 @@
import spack.error
from spack.build_environment import dso_suffix
from spack.package_base import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.util.prefix import Prefix
from spack.version import Version, ver

View File

@@ -4,11 +4,11 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import llnl.util.executable
from llnl.util.filesystem import find
import spack.builder
import spack.package_base
import spack.util.executable
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
@@ -41,11 +41,11 @@ class LuaPackage(spack.package_base.PackageBase):
@property
def lua(self):
return spack.util.executable.Executable(self.spec["lua-lang"].prefix.bin.lua)
return llnl.util.executable.Executable(self.spec["lua-lang"].prefix.bin.lua)
@property
def luarocks(self):
lr = spack.util.executable.Executable(self.spec["lua-lang"].prefix.bin.luarocks)
lr = llnl.util.executable.Executable(self.spec["lua-lang"].prefix.bin.luarocks)
return lr

View File

@@ -3,12 +3,12 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.filesystem as fs
from llnl.util.executable import which
import spack.builder
import spack.package_base
from spack.directives import build_system, depends_on
from spack.multimethod import when
from spack.util.executable import which
from ._checks import BaseBuilder

View File

@@ -8,11 +8,11 @@
import shutil
from os.path import basename, dirname, isdir
from llnl.util.envmod import EnvironmentModifications
from llnl.util.executable import Executable
from llnl.util.filesystem import find_headers, find_libraries, join_path
from spack.directives import conflicts
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.directives import conflicts, variant
from .generic import Package
@@ -36,6 +36,13 @@ class IntelOneApiPackage(Package):
]:
conflicts(c, msg="This package is only available for x86_64 and Linux")
# Add variant to toggle environment modifications from vars.sh
variant(
"envmods",
default=True,
description="Toggles environment modifications",
)
@staticmethod
def update_description(cls):
"""Updates oneapi package descriptions with common text."""
@@ -114,11 +121,13 @@ def setup_run_environment(self, env):
$ source {prefix}/{component}/{version}/env/vars.sh
"""
env.extend(
EnvironmentModifications.from_sourcing_file(
join_path(self.component_prefix, "env", "vars.sh")
# Only if environment modifications are desired (default is +envmods)
if "+envmods" in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
join_path(self.component_prefix, "env", "vars.sh")
)
)
)
class IntelOneApiLibraryPackage(IntelOneApiPackage):

View File

@@ -5,13 +5,13 @@
import inspect
import os
from llnl.util.executable import Executable
from llnl.util.filesystem import filter_file
import spack.builder
import spack.package_base
from spack.directives import build_system, extends
from spack.package_base import PackageBase
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_build_time_tests

View File

@@ -81,6 +81,6 @@ def install(self, pkg, spec, prefix):
def check(self):
"""Search the Makefile for a ``check:`` target and runs it if found."""
with working_dir(self.build_directory):
self._if_make_target_execute("check")
self.pkg._if_make_target_execute("check")
spack.builder.run_after("build")(execute_build_time_tests)

View File

@@ -8,13 +8,13 @@
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.envmod import env_flag
from llnl.util.executable import Executable, ProcessError
import spack.builder
from spack.build_environment import SPACK_NO_PARALLEL_MAKE, determine_number_of_jobs
from spack.directives import build_system, extends
from spack.package_base import PackageBase
from spack.util.environment import env_flag
from spack.util.executable import Executable, ProcessError
class RacketPackage(PackageBase):

View File

@@ -132,6 +132,7 @@ class ROCmPackage(PackageBase):
"amdgpu_target",
description="AMD GPU architecture",
values=spack.variant.any_combination_of(*amdgpu_targets),
sticky=True,
when="+rocm",
)

View File

@@ -124,7 +124,12 @@ def __init__(self, wrapped_pkg_object, root_builder):
wrapper_cls = type(self)
bases = (package_cls, wrapper_cls)
new_cls_name = package_cls.__name__ + "Wrapper"
new_cls = type(new_cls_name, bases, {})
# Forward attributes that might be monkey patched later
new_cls = type(
new_cls_name,
bases,
{"run_tests": property(lambda x: x.wrapped_package_object.run_tests)},
)
new_cls.__module__ = package_cls.__module__
self.__class__ = new_cls
self.__dict__.update(wrapped_pkg_object.__dict__)
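The ``run_tests`` property exists so that monkey patching the wrapped package later remains visible through the wrapper. A toy version of the same pattern (class names here are illustrative, not Spack's):

```python
class Pkg:
    run_tests = False


pkg = Pkg()
# Dynamically build a wrapper class whose run_tests reads through to the
# wrapped object, as the diff above does with type(...).
PkgWrapper = type(
    "PkgWrapper",
    (Pkg,),
    {"run_tests": property(lambda self: self.wrapped_package_object.run_tests)},
)
wrapper = PkgWrapper()
wrapper.wrapped_package_object = pkg

pkg.run_tests = True       # monkey patch the original object...
assert wrapper.run_tests   # ...and the wrapper sees the change
```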
@@ -473,6 +478,10 @@ class Builder(collections.abc.Sequence, metaclass=BuilderMeta):
legacy_methods: Tuple[str, ...] = ()
legacy_attributes: Tuple[str, ...] = ()
# type hints for some of the legacy methods
build_time_test_callbacks: List[str]
install_time_test_callbacks: List[str]
#: List of glob expressions. Each expression must either be
#: absolute or relative to the package source path.
#: Matching artifacts found at the end of the build process will be
@@ -509,7 +518,7 @@ def setup_build_environment(self, env):
Spack's store.
Args:
env (spack.util.environment.EnvironmentModifications): environment
env (llnl.util.envmod.EnvironmentModifications): environment
modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
@@ -537,7 +546,7 @@ def setup_dependent_build_environment(self, env, dependent_spec):
variable.
Args:
env (spack.util.environment.EnvironmentModifications): environment
env (llnl.util.envmod.EnvironmentModifications): environment
modifications to be applied when the dependent package is built.
Package authors can call methods on it to alter the build environment.

View File

@@ -33,15 +33,14 @@
import spack.mirror
import spack.paths
import spack.repo
import spack.util.executable as exe
import spack.util.git
import spack.util.gpg as gpg_util
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack.error import SpackError
from spack.reporters.cdash import CDash
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import build_stamp as cdash_build_stamp
from spack.util.pattern import Bunch
JOB_RETRY_CONDITIONS = [
"always",
@@ -486,7 +485,7 @@ def get_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
whether or not the stack was changed. Returns True if the environment
manifest changed between the provided revisions (or additionally if the
`.gitlab-ci.yml` file itself changed). Returns False otherwise."""
git = exe.which("git")
git = spack.util.git.git()
if git:
with fs.working_dir(spack.paths.prefix):
git_log = git(
@@ -1655,7 +1654,7 @@ def get_spack_info():
entry, otherwise, return a string containing the spack version."""
git_path = os.path.join(spack.paths.prefix, ".git")
if os.path.exists(git_path):
git = exe.which("git")
git = spack.util.git.git()
if git:
with fs.working_dir(spack.paths.prefix):
git_log = git("log", "-1", output=str, error=os.devnull, fail_on_error=False)
@@ -1695,7 +1694,7 @@ def setup_spack_repro_version(repro_dir, checkout_commit, merge_commit=None):
spack_git_path = spack.paths.prefix
git = exe.which("git")
git = spack.util.git.git()
if not git:
tty.error("reproduction of pipeline job requires git")
return False
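The new ``spack.util.git`` helper itself is not shown in this diff; judging from the call sites (plain ``git()`` may return ``None``, while ``git(required=True)`` raises), a plausible minimal shape would be this hypothetical sketch:

```python
# Hypothetical reconstruction of spack.util.git; the real module is not
# part of this diff, so treat this as an inference from the call sites.
import llnl.util.executable


def git(required: bool = False):
    """Return a git Executable, or None if git is not found (raises if required)."""
    return llnl.util.executable.which("git", required=required)
```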
@@ -2358,10 +2357,14 @@ def populate_buildgroup(self, job_names):
tty.warn(msg)
def report_skipped(self, spec, directory_name, reason):
cli_args = self.args()
cli_args.extend(["package", [spec.name]])
it = iter(cli_args)
kv = {x.replace("--", "").replace("-", "_"): next(it) for x in it}
reporter = CDash(Bunch(**kv))
configuration = CDashConfiguration(
upload_url=self.upload_url,
packages=[spec.name],
build=self.build_name,
site=self.site,
buildstamp=self.build_stamp,
track=None,
ctest_parsing=False,
)
reporter = CDash(configuration=configuration)
reporter.test_skipped_report(directory_name, spec, reason)

View File

@@ -16,6 +16,7 @@
import ruamel.yaml as yaml
from ruamel.yaml.error import MarkedYAMLError
import llnl.util.string
import llnl.util.tty as tty
from llnl.util.filesystem import join_path
from llnl.util.lang import attr_setdefault, index_by
@@ -33,7 +34,6 @@
import spack.traverse as traverse
import spack.user_environment as uenv
import spack.util.spack_json as sjson
import spack.util.string
# cmd has a submodule called "list" so preserve the python list module
python_list = list
@@ -523,7 +523,7 @@ def print_how_many_pkgs(specs, pkg_type=""):
category, e.g. if pkg_type is "installed" then the message
would be "3 installed packages"
"""
tty.msg("%s" % spack.util.string.plural(len(specs), pkg_type + " package"))
tty.msg("%s" % llnl.util.string.plural(len(specs), pkg_type + " package"))
def spack_is_git_repo():

View File

@@ -14,9 +14,9 @@
import spack.paths
import spack.repo
import spack.util.git
import spack.util.spack_json as sjson
from spack.cmd import spack_is_git_repo
from spack.util.executable import which
description = "show contributors to packages"
section = "developer"
@@ -116,7 +116,7 @@ def blame(parser, args):
# make sure this is a git repo
if not spack_is_git_repo():
tty.die("This spack is not a git clone. Can't use 'spack blame'")
git = which("git", required=True)
git = spack.util.git.git(required=True)
# Get name of file to blame
blame_file = None

View File

@@ -11,6 +11,7 @@
import urllib.parse
import llnl.util.tty as tty
from llnl.util.string import plural
import spack.binary_distribution as bindist
import spack.cmd
@@ -30,7 +31,6 @@
from spack.error import SpecError
from spack.spec import Spec, save_dependency_specfiles
from spack.stage import Stage
from spack.util.string import plural
description = "create, download and install binary packages"
section = "packaging"
@@ -712,9 +712,23 @@ def update_index(mirror_url, update_keys=False):
bindist.generate_key_index(keys_url)
def _mirror_url_from_args_deprecated_format(args):
# In Spack 0.19 the -d flag was equivalent to --mirror-url.
# Spack 0.20 deprecates this, so in 0.21 -d means --directory.
if args.directory and url_util.validate_scheme(urllib.parse.urlparse(args.directory).scheme):
tty.warn(
"Passing a URL to `update-index -d <url>` is deprecated "
"and will be removed in Spack 0.21. "
"Use `update-index --mirror-url <url>` instead."
)
return spack.mirror.push_url_from_mirror_url(args.directory)
else:
return _mirror_url_from_args(args)
def update_index_fn(args):
"""Update a buildcache index."""
push_url = _mirror_url_from_args(args)
push_url = _mirror_url_from_args_deprecated_format(args)
update_index(push_url, update_keys=args.keys)

View File

@@ -6,10 +6,11 @@
import os
import llnl.util.tty as tty
from llnl.util.executable import ProcessError
from llnl.util.filesystem import mkdirp, working_dir
import spack.paths
from spack.util.executable import ProcessError, which
import spack.util.git
_SPACK_UPSTREAM = "https://github.com/spack/spack"
@@ -32,7 +33,7 @@ def setup_parser(subparser):
def get_origin_info(remote):
git_dir = os.path.join(spack.paths.prefix, ".git")
git = which("git", required=True)
git = spack.util.git.git(required=True)
try:
branch = git("symbolic-ref", "--short", "HEAD", output=str)
except ProcessError:
@@ -69,13 +70,13 @@ def clone(parser, args):
if files_in_the_way:
tty.die(
"There are already files there! " "Delete these files before boostrapping spack.",
*files_in_the_way
*files_in_the_way,
)
tty.msg("Installing:", "%s/bin/spack" % prefix, "%s/lib/spack/..." % prefix)
with working_dir(prefix):
git = which("git", required=True)
git = spack.util.git.git(required=True)
git("init", "--shared", "-q")
git("remote", "add", "origin", origin_url)
git("fetch", "origin", "%s:refs/remotes/origin/%s" % (branch, branch), "-n", "-q")

View File

@@ -13,6 +13,7 @@
import spack.dependency as dep
import spack.environment as ev
import spack.modules
import spack.reporters
import spack.spec
import spack.store
from spack.util.pattern import Args
@@ -123,6 +124,64 @@ def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, deptype)
def _cdash_reporter(namespace):
"""Helper function to create a CDash reporter. This function gets an early reference to the
argparse namespace under construction, so it can later use it to create the object.
"""
def _factory():
def installed_specs(args):
if getattr(args, "spec", ""):
packages = args.spec
elif getattr(args, "specs", ""):
packages = args.specs
elif getattr(args, "package", ""):
# Ensure CI 'spack test run' can output CDash results
packages = args.package
else:
packages = []
for file in args.specfiles:
with open(file, "r") as f:
s = spack.spec.Spec.from_yaml(f)
packages.append(s.format())
return packages
configuration = spack.reporters.CDashConfiguration(
upload_url=namespace.cdash_upload_url,
packages=installed_specs(namespace),
build=namespace.cdash_build,
site=namespace.cdash_site,
buildstamp=namespace.cdash_buildstamp,
track=namespace.cdash_track,
ctest_parsing=getattr(namespace, "ctest_parsing", False),
)
return spack.reporters.CDash(configuration=configuration)
return _factory
class CreateReporter(argparse.Action):
"""Create the correct object to generate reports for installation and testing."""
def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, values)
if values == "junit":
setattr(namespace, "reporter", spack.reporters.JUnit)
elif values == "cdash":
setattr(namespace, "reporter", _cdash_reporter(namespace))
@arg
def log_format():
return Args(
"--log-format",
default=None,
action=CreateReporter,
choices=("junit", "cdash"),
help="format to be used for log files",
)
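At parse time the action stores both the raw choice and the reporter class; a small sketch of the effect (the import path for ``CreateReporter`` is assumed from context):

```python
import argparse

import spack.reporters
from spack.cmd.common.arguments import CreateReporter  # path assumed

parser = argparse.ArgumentParser()
parser.add_argument(
    "--log-format", default=None, action=CreateReporter, choices=("junit", "cdash")
)

ns = parser.parse_args(["--log-format", "junit"])
assert ns.log_format == "junit"              # stored by the action itself
assert ns.reporter is spack.reporters.JUnit  # class selected by CreateReporter
```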
# TODO: merge constraint and installed_specs
@arg
def constraint():

View File

@@ -10,6 +10,7 @@
import urllib.parse
import llnl.util.tty as tty
from llnl.util.executable import ProcessError, which
from llnl.util.filesystem import mkdirp
import spack.repo
@@ -23,7 +24,6 @@
parse_version,
)
from spack.util.editor import editor
from spack.util.executable import ProcessError, which
from spack.util.naming import (
mod_to_class,
simplify_name,
@@ -829,7 +829,7 @@ def get_versions(args, name):
valid_url = True
try:
parsed = urllib.parse.urlparse(args.url)
if not parsed.scheme or parsed.scheme != "file":
if not parsed.scheme or parsed.scheme == "file":
valid_url = False # No point in spidering these
except (ValueError, TypeError):
valid_url = False

View File

@@ -12,13 +12,14 @@
from glob import glob
import llnl.util.tty as tty
from llnl.util.executable import which
from llnl.util.filesystem import working_dir
import spack.config
import spack.paths
import spack.platforms
import spack.util.git
from spack.main import get_version
from spack.util.executable import which
description = "debugging commands for troubleshooting Spack"
section = "developer"
@@ -35,7 +36,7 @@ def _debug_tarball_suffix():
now = datetime.now()
suffix = now.strftime("%Y-%m-%d-%H%M%S")
git = which("git")
git = spack.util.git.git()
if not git:
return "nobranch-nogit-%s" % suffix

View File

@@ -13,7 +13,6 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.solver.asp as asp
import spack.util.environment
import spack.util.spack_json as sjson
description = "compare two specs"

View File

@@ -11,7 +11,9 @@
import tempfile
import llnl.util.filesystem as fs
import llnl.util.string as string
import llnl.util.tty as tty
from llnl.util.envmod import EnvironmentModifications
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
@@ -28,8 +30,6 @@
import spack.schema.env
import spack.tengine
import spack.traverse as traverse
import spack.util.string as string
from spack.util.environment import EnvironmentModifications
description = "manage virtual environments"
section = "environments"

View File

@@ -18,7 +18,6 @@
import spack.cray_manifest as cray_manifest
import spack.detection
import spack.error
import spack.util.environment
description = "manage external packages in Spack configuration"
section = "config"

View File

@@ -2,17 +2,20 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import llnl.util.tty as tty
from llnl.util import tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.store
from spack.graph import graph_ascii, graph_dot
from spack.graph import (
DAGWithDependencyTypes,
SimpleDAG,
graph_ascii,
graph_dot,
static_graph_dot,
)
description = "generate graphs of package dependency relationships"
section = "basic"
@@ -36,6 +39,12 @@ def setup_parser(subparser):
action="store_true",
help="graph static (possible) deps, don't concretize (implies --dot)",
)
subparser.add_argument(
"-c",
"--color",
action="store_true",
help="use different colors for different dependency types",
)
subparser.add_argument(
"-i",
@@ -48,11 +57,14 @@ def setup_parser(subparser):
def graph(parser, args):
if args.installed:
if args.specs:
tty.die("Can't specify specs with --installed")
args.dot = True
if args.installed and args.specs:
tty.die("cannot specify specs with --installed")
if args.color and not args.dot:
tty.die("the --color option can be used only with --dot")
if args.installed:
args.dot = True
env = ev.active_environment()
if env:
specs = env.all_specs()
@@ -68,13 +80,19 @@ def graph(parser, args):
if args.static:
args.dot = True
static_graph_dot(specs, deptype=args.deptype)
return
if args.dot:
graph_dot(specs, static=args.static, deptype=args.deptype)
builder = SimpleDAG()
if args.color:
builder = DAGWithDependencyTypes()
graph_dot(specs, builder=builder, deptype=args.deptype)
return
elif specs: # ascii is default: user doesn't need to provide it explicitly
debug = spack.config.get("config:debug")
graph_ascii(specs[0], debug=debug, deptype=args.deptype)
for spec in specs[1:]:
print() # extra line bt/w independent graphs
graph_ascii(spec, debug=debug)
# ascii is default: user doesn't need to provide it explicitly
debug = spack.config.get("config:debug")
graph_ascii(specs[0], debug=debug, deptype=args.deptype)
for spec in specs[1:]:
print() # extra line bt/w independent graphs
graph_ascii(spec, debug=debug)

View File

@@ -8,9 +8,10 @@
import shutil
import sys
import textwrap
from typing import List
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util import lang, tty
import spack.build_environment
import spack.cmd
@@ -232,12 +233,7 @@ def setup_parser(subparser):
if 'all' is chosen, run package tests during installation for all
packages. If neither are chosen, don't run tests for any packages.""",
)
subparser.add_argument(
"--log-format",
default=None,
choices=spack.report.valid_formats,
help="format to be used for log files",
)
arguments.add_common_arguments(subparser, ["log_format"])
subparser.add_argument(
"--log-file",
default=None,
@@ -262,6 +258,12 @@ def default_log_file(spec):
return fs.os.path.join(dirname, basename)
def report_filename(args: argparse.Namespace, specs: List[spack.spec.Spec]) -> str:
"""Return the filename to be used for reporting to JUnit or CDash format."""
result = args.log_file or args.cdash_upload_url or default_log_file(specs[0])
return result
def install_specs(specs, install_kwargs, cli_args):
try:
if ev.active_environment():
@@ -361,19 +363,8 @@ def print_cdash_help():
parser.print_help()
def _create_log_reporter(args):
# TODO: remove args injection to spack.report.collect_info, since a class in core
# TODO: shouldn't know which command line arguments a command uses.
reporter = spack.report.collect_info(
spack.package_base.PackageInstaller, "_install_task", args.log_format, args
)
if args.log_file:
reporter.filename = args.log_file
return reporter
def install_all_specs_from_active_environment(
install_kwargs, only_concrete, cli_test_arg, reporter
install_kwargs, only_concrete, cli_test_arg, reporter_factory
):
"""Install all specs from the active environment
@@ -415,12 +406,10 @@ def install_all_specs_from_active_environment(
tty.msg(msg)
return
if not reporter.filename:
reporter.filename = default_log_file(specs[0])
reporter.specs = specs
reporter = reporter_factory(specs) or lang.nullcontext()
tty.msg("Installing environment {0}".format(env.name))
with reporter("build"):
with reporter:
env.install_all(**install_kwargs)
tty.debug("Regenerating environment views for {0}".format(env.name))
@@ -439,7 +428,7 @@ def compute_tests_install_kwargs(specs, cli_test_arg):
return False
def specs_from_cli(args, install_kwargs, reporter):
def specs_from_cli(args, install_kwargs):
"""Return abstract and concrete spec parsed from the command line."""
abstract_specs = spack.cmd.parse_specs(args.spec)
install_kwargs["tests"] = compute_tests_install_kwargs(abstract_specs, args.test)
@@ -449,7 +438,9 @@ def specs_from_cli(args, install_kwargs, reporter):
)
except SpackError as e:
tty.debug(e)
reporter.concretization_report(e.message)
if args.log_format is not None:
reporter = args.reporter()
reporter.concretization_report(report_filename(args, abstract_specs), e.message)
raise
return abstract_specs, concrete_specs
@@ -514,7 +505,17 @@ def install(parser, args):
if args.deprecated:
spack.config.set("config:deprecated", True, scope="command_line")
reporter = _create_log_reporter(args)
def reporter_factory(specs):
if args.log_format is None:
return None
context_manager = spack.report.build_context_manager(
reporter=args.reporter(),
filename=report_filename(args, specs=specs),
specs=specs,
)
return context_manager
install_kwargs = install_kwargs_from_args(args)
if not args.spec and not args.specfiles:
@@ -523,12 +524,12 @@ def install(parser, args):
install_kwargs=install_kwargs,
only_concrete=args.only_concrete,
cli_test_arg=args.test,
reporter=reporter,
reporter_factory=reporter_factory,
)
return
# Specs from CLI
abstract_specs, concrete_specs = specs_from_cli(args, install_kwargs, reporter)
abstract_specs, concrete_specs = specs_from_cli(args, install_kwargs)
# Concrete specs from YAML or JSON files
specs_from_file = concrete_specs_from_file(args)
@@ -538,11 +539,8 @@ def install(parser, args):
if len(concrete_specs) == 0:
tty.die("The `spack install` command requires a spec to install.")
if not reporter.filename:
reporter.filename = default_log_file(concrete_specs[0])
reporter.specs = concrete_specs
with reporter("build"):
reporter = reporter_factory(concrete_specs) or lang.nullcontext()
with reporter:
if args.overwrite:
require_user_confirmation_for_overwrite(concrete_specs, args)
install_kwargs["overwrite"] = [spec.dag_hash() for spec in concrete_specs]

View File

@@ -13,15 +13,11 @@
import llnl.util.tty as tty
import spack.paths
from spack.util.executable import which
description = "list and check license headers on files in spack"
section = "developer"
level = "long"
#: need the git command to check new files
git = which("git")
#: SPDX license id must appear in the first <license_lines> lines of a file
license_lines = 7
@@ -238,9 +234,6 @@ def setup_parser(subparser):
def license(parser, args):
if not git:
tty.die("spack license requires git in your environment")
licensed_files[:] = [re.compile(regex) for regex in licensed_files]
commands = {

View File

@@ -5,13 +5,14 @@
import sys
import llnl.util.envmod
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.cmd.find
import spack.environment as ev
import spack.store
import spack.user_environment as uenv
import spack.util.environment
description = "add package to the user environment"
section = "user environment"
@@ -110,7 +111,7 @@ def load(parser, args):
dep for spec in specs for dep in spec.traverse(root=include_roots, order="post")
]
env_mod = spack.util.environment.EnvironmentModifications()
env_mod = llnl.util.envmod.EnvironmentModifications()
for spec in specs:
env_mod.extend(uenv.environment_modifications_for_spec(spec))
env_mod.prepend_path(uenv.spack_loaded_hashes_var, spec.dag_hash())

View File

@@ -6,10 +6,11 @@
import posixpath
import sys
import llnl.util.executable
from llnl.util.path import convert_to_posix_path
import spack.paths
import spack.util.executable
from spack.spec import Spec
from spack.util.path import convert_to_posix_path
description = "generate Windows installer"
section = "admin"
@@ -100,7 +101,7 @@ def make_installer(parser, args):
spack_logo = posixpath.join(posix_root, "share/spack/logo/favicon.ico")
try:
spack.util.executable.Executable(cmake_path)(
llnl.util.executable.Executable(cmake_path)(
"-S",
source_dir,
"-B",
@@ -111,30 +112,30 @@ def make_installer(parser, args):
"-DSPACK_LOGO=%s" % spack_logo,
"-DSPACK_GIT_VERBOSITY=%s" % git_verbosity,
)
except spack.util.executable.ProcessError:
except llnl.util.executable.ProcessError:
print("Failed to generate installer")
return spack.util.executable.ProcessError.returncode
return llnl.util.executable.ProcessError.returncode
try:
spack.util.executable.Executable(cpack_path)(
llnl.util.executable.Executable(cpack_path)(
"--config", "%s/CPackConfig.cmake" % output_dir, "-B", "%s/" % output_dir
)
except spack.util.executable.ProcessError:
except llnl.util.executable.ProcessError:
print("Failed to generate installer")
return spack.util.executable.ProcessError.returncode
return llnl.util.executable.ProcessError.returncode
try:
spack.util.executable.Executable(os.environ.get("WIX") + "/bin/candle.exe")(
llnl.util.executable.Executable(os.environ.get("WIX") + "/bin/candle.exe")(
"-ext",
"WixBalExtension",
"%s/bundle.wxs" % output_dir,
"-out",
"%s/bundle.wixobj" % output_dir,
)
except spack.util.executable.ProcessError:
except llnl.util.executable.ProcessError:
print("Failed to generate installer chain")
return spack.util.executable.ProcessError.returncode
return llnl.util.executable.ProcessError.returncode
try:
spack.util.executable.Executable(os.environ.get("WIX") + "/bin/light.exe")(
llnl.util.executable.Executable(os.environ.get("WIX") + "/bin/light.exe")(
"-sw1134",
"-ext",
"WixBalExtension",
@@ -142,9 +143,9 @@ def make_installer(parser, args):
"-out",
"%s/Spack.exe" % output_dir,
)
except spack.util.executable.ProcessError:
except llnl.util.executable.ProcessError:
print("Failed to generate installer chain")
return spack.util.executable.ProcessError.returncode
return llnl.util.executable.ProcessError.returncode
print("Successfully generated Spack.exe in %s" % (output_dir))
else:
print("The make-installer command is currently only supported on Windows.")

View File

@@ -10,6 +10,7 @@
import os
import sys
import llnl.util.executable as exe
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
@@ -17,7 +18,6 @@
import spack.cmd.common.arguments as arguments
import spack.paths
import spack.repo
import spack.util.executable as exe
import spack.util.package_hash as ph
description = "query packages associated with particular git revisions"

View File

@@ -127,8 +127,10 @@ def python_interpreter(args):
console.runsource(startup.read(), startup_file, "exec")
if args.python_command:
propagate_exceptions_from(console)
console.runsource(args.python_command)
elif args.python_args:
propagate_exceptions_from(console)
sys.argv = args.python_args
with open(args.python_args[0]) as file:
console.runsource(file.read(), args.python_args[0], "exec")
@@ -149,3 +151,18 @@ def python_interpreter(args):
platform.machine(),
)
)
def propagate_exceptions_from(console):
"""Set sys.excepthook to let uncaught exceptions return 1 to the shell.
Args:
console (code.InteractiveConsole): the console that needs a change in sys.excepthook
"""
console.push("import sys")
console.push("_wrapped_hook = sys.excepthook")
console.push("def _hook(exc_type, exc_value, exc_tb):")
console.push(" _wrapped_hook(exc_type, exc_value, exc_tb)")
console.push(" sys.exit(1)")
console.push("")
console.push("sys.excepthook = _hook")

View File

@@ -10,10 +10,11 @@
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.executable import which
from llnl.util.filesystem import working_dir
import spack.paths
from spack.util.executable import which
import spack.util.git
description = "runs source code style checks on spack"
section = "developer"
@@ -47,6 +48,13 @@ def grouper(iterable, n, fillvalue=None):
#: tools we run in spack style
tools = {}
#: warnings to ignore in mypy
mypy_ignores = [
# same as `disable_error_code = "annotation-unchecked"` in pyproject.toml, which
# doesn't exist in mypy 0.971 for Python 3.6
"[annotation-unchecked]",
]
def is_package(f):
"""Whether flake8 should consider a file as a core file or a package.
@@ -81,7 +89,7 @@ def changed_files(base="develop", untracked=True, all_files=False, root=None):
if root is None:
root = spack.paths.prefix
git = which("git", required=True)
git = spack.util.git.git(required=True)
# ensure base is in the repo
base_sha = git(
@@ -210,6 +218,10 @@ def translate(match):
for line in output.split("\n"):
if not line:
continue
if any(ignore in line for ignore in mypy_ignores):
# some mypy annotations can't be disabled in older mypys (e.g. 0.971, which
# is the only mypy that supports python 3.6), so we filter them here.
continue
if not args.root_relative and re_obj:
line = re_obj.sub(translate, line)
print(line)

View File

@@ -5,6 +5,7 @@
import io
import sys
import llnl.util.string
import llnl.util.tty as tty
import llnl.util.tty.colify as colify
@@ -24,7 +25,7 @@ def report_tags(category, tags):
if isatty:
num = len(tags)
fmt = "{0} package tag".format(category)
buffer.write("{0}:\n".format(spack.util.string.plural(num, fmt)))
buffer.write("{0}:\n".format(llnl.util.string.plural(num, fmt)))
if tags:
colify.colify(tags, output=buffer, tty=isatty, indent=4)

View File

@@ -13,8 +13,8 @@
import sys
import textwrap
import llnl.util.tty as tty
import llnl.util.tty.colify as colify
from llnl.util import lang, tty
from llnl.util.tty import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
@@ -63,12 +63,7 @@ def setup_parser(subparser):
run_parser.add_argument(
"--keep-stage", action="store_true", help="Keep testing directory for debugging"
)
run_parser.add_argument(
"--log-format",
default=None,
choices=spack.report.valid_formats,
help="format to be used for log files",
)
arguments.add_common_arguments(run_parser, ["log_format"])
run_parser.add_argument(
"--log-file",
default=None,
@@ -231,10 +226,23 @@ def test_run(args):
# Set up reporter
setattr(args, "package", [s.format() for s in test_suite.specs])
reporter = spack.report.collect_info(
spack.package_base.PackageBase, "do_test", args.log_format, args
)
if not reporter.filename:
reporter = create_reporter(args, specs_to_test, test_suite) or lang.nullcontext()
with reporter:
test_suite(
remove_directory=not args.keep_stage,
dirty=args.dirty,
fail_first=args.fail_first,
externals=args.externals,
)
def create_reporter(args, specs_to_test, test_suite):
if args.log_format is None:
return None
filename = args.cdash_upload_url
if not filename:
if args.log_file:
if os.path.isabs(args.log_file):
log_file = args.log_file
@@ -243,16 +251,15 @@ def test_run(args):
log_file = os.path.join(log_dir, args.log_file)
else:
log_file = os.path.join(os.getcwd(), "test-%s" % test_suite.name)
reporter.filename = log_file
reporter.specs = specs_to_test
filename = log_file
with reporter("test", test_suite.stage):
test_suite(
remove_directory=not args.keep_stage,
dirty=args.dirty,
fail_first=args.fail_first,
externals=args.externals,
)
context_manager = spack.report.test_context_manager(
reporter=args.reporter(),
filename=filename,
specs=specs_to_test,
raw_logs_dir=test_suite.stage,
)
return context_manager
def test_list(args):

View File

@@ -15,8 +15,8 @@
import spack.cmd.common.arguments as arguments
import spack.config
import spack.paths
import spack.util.git
import spack.util.gpg
from spack.util.executable import which
from spack.util.spack_yaml import syaml_dict
description = "set up spack for our tutorial (WARNING: modifies config!)"
@@ -84,7 +84,7 @@ def tutorial(parser, args):
# If you don't put this last, you'll get import errors for the code
# that follows (exacerbated by the various lazy singletons we use)
tty.msg("Ensuring we're on the releases/v{0}.{1} branch".format(*spack.spack_version_info[:2]))
git = which("git", required=True)
git = spack.util.git.git(required=True)
with working_dir(spack.paths.prefix):
git("checkout", tutorial_branch)
# NO CODE BEYOND HERE

View File

@@ -6,11 +6,12 @@
import os
import sys
import llnl.util.envmod
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.user_environment as uenv
import spack.util.environment
description = "remove package from the user environment"
section = "user environment"
@@ -82,7 +83,7 @@ def unload(parser, args):
)
return 1
env_mod = spack.util.environment.EnvironmentModifications()
env_mod = llnl.util.envmod.EnvironmentModifications()
for spec in specs:
env_mod.extend(uenv.environment_modifications_for_spec(spec).reversed())
env_mod.remove_path(uenv.spack_loaded_hashes_var, spec.dag_hash())

View File

@@ -13,18 +13,18 @@
import tempfile
from typing import List, Optional, Sequence
import llnl.util.envmod as envmod
import llnl.util.executable
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import path_contains_subdirectory, paths_containing_libs
from llnl.util.path import system_path_filter
import spack.compilers
import spack.error
import spack.spec
import spack.util.executable
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
from spack.util.path import system_path_filter
__all__ = ["Compiler"]
@@ -40,7 +40,7 @@ def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
compiler_path (path): path of the compiler to be invoked
version_arg (str): the argument used to extract version information
"""
compiler = spack.util.executable.Executable(compiler_path)
compiler = llnl.util.executable.Executable(compiler_path)
if version_arg:
output = compiler(version_arg, output=str, error=str, ignore_errors=ignore_errors)
else:
@@ -54,7 +54,7 @@ def get_compiler_version_output(compiler_path, *args, **kwargs):
# not just executable name. If we don't do this, and the path changes
# (e.g., during testing), we can get incorrect results.
if not os.path.isabs(compiler_path):
compiler_path = spack.util.executable.which_string(compiler_path, required=True)
compiler_path = llnl.util.executable.which_string(compiler_path, required=True)
return _get_compiler_version_output(compiler_path, *args, **kwargs)
@@ -156,14 +156,14 @@ def _parse_link_paths(string):
@system_path_filter
def _parse_non_system_link_dirs(string):
def _parse_non_system_link_dirs(string: str) -> List[str]:
"""Parses link paths out of compiler debug output.
Args:
string (str): compiler debug output as a string
string: compiler debug output as a string
Returns:
(list of str): implicit link paths parsed from the compiler output
Implicit link paths parsed from the compiler output
"""
link_dirs = _parse_link_paths(string)
@@ -175,7 +175,7 @@ def _parse_non_system_link_dirs(string):
# system paths. Note that 'filter_system_paths' only checks for an
# exact match, while 'in_system_subdirectory' checks if a path contains
# a system directory as a subdirectory
link_dirs = filter_system_paths(link_dirs)
link_dirs = envmod.filter_system_paths(link_dirs)
return list(p for p in link_dirs if not in_system_subdirectory(p))
@@ -339,7 +339,7 @@ def verify_executables(self):
def accessible_exe(exe):
# compilers may contain executable names (on Cray or user edited)
if not os.path.isabs(exe):
exe = spack.util.executable.which_string(exe)
exe = llnl.util.executable.which_string(exe)
if not exe:
return False
return os.path.isfile(exe) and os.access(exe, os.X_OK)
@@ -371,7 +371,7 @@ def real_version(self):
if real_version == spack.version.Version("unknown"):
return self.version
self._real_version = real_version
except spack.util.executable.ProcessError:
except llnl.util.executable.ProcessError:
self._real_version = self.version
return self._real_version
@@ -422,7 +422,7 @@ def _get_compiler_link_paths(self, paths):
csource.write(
"int main(int argc, char* argv[]) { " "(void)argc; (void)argv; return 0; }\n"
)
compiler_exe = spack.util.executable.Executable(first_compiler)
compiler_exe = llnl.util.executable.Executable(first_compiler)
for flag_type in flags:
for flag in self.flags.get(flag_type, []):
compiler_exe.add_default_arg(flag)
@@ -433,7 +433,7 @@ def _get_compiler_link_paths(self, paths):
compiler_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
) # str for py2
return _parse_non_system_link_dirs(output)
except spack.util.executable.ProcessError as pe:
except llnl.util.executable.ProcessError as pe:
tty.debug("ProcessError: Command exited with non-zero status: " + pe.long_message)
return []
finally:
@@ -532,7 +532,7 @@ def get_real_version(self):
Use the runtime environment of the compiler (modules and environment
modifications) to enable the compiler to run properly on any platform.
"""
cc = spack.util.executable.Executable(self.cc)
cc = llnl.util.executable.Executable(self.cc)
with self.compiler_environment():
output = cc(
self.version_argument,
@@ -658,7 +658,7 @@ def compiler_environment(self):
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes
env = spack.util.environment.EnvironmentModifications()
env = llnl.util.envmod.EnvironmentModifications()
env.extend(spack.schema.environment.parse(self.environment))
env.apply_modifications()
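The version-probing pattern used throughout `compiler.py` — wrap an absolute compiler path in `Executable`, invoke it with a version flag, and fall back when the process fails — keeps the same shape after the move to `llnl.util.executable`. A condensed sketch (the path and flag below are illustrative):

```python
import llnl.util.executable


def probe_version(compiler_path: str, version_arg: str = "--version") -> str:
    """Run the compiler with a version flag and return the captured output."""
    compiler = llnl.util.executable.Executable(compiler_path)
    try:
        # output=str / error=str capture the streams into the return value
        # instead of printing them.
        return compiler(version_arg, output=str, error=str)
    except llnl.util.executable.ProcessError:
        return "unknown"


# e.g. probe_version("/usr/bin/gcc")
```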

View File

@@ -17,6 +17,7 @@
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.envmod import get_path
import spack.compiler
import spack.config
@@ -24,7 +25,6 @@
import spack.paths
import spack.platforms
import spack.spec
from spack.util.environment import get_path
from spack.util.naming import mod_to_class
_path_instance_vars = ["cc", "cxx", "f77", "fc"]
@@ -680,7 +680,7 @@ def _default(fn_args):
return value, None
error = "Couldn't get version for compiler {0}".format(path)
except spack.util.executable.ProcessError as e:
except llnl.util.executable.ProcessError as e:
error = "Couldn't get version for compiler {0}\n".format(path) + str(e)
except Exception as e:
# Catching "Exception" here is fine because it just

View File

@@ -6,13 +6,13 @@
import re
import shutil
import llnl.util.executable
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.symlink import symlink
import spack.compiler
import spack.compilers.clang
import spack.util.executable
import spack.version
@@ -90,13 +90,13 @@ def setup_custom_environment(self, pkg, env):
# Use special XCode versions of compiler wrappers when using XCode
# Overwrites build_environment's setting of SPACK_CC and SPACK_CXX
xcrun = spack.util.executable.Executable("xcrun")
xcrun = llnl.util.executable.Executable("xcrun")
xcode_clang = xcrun("-f", "clang", output=str).strip()
xcode_clangpp = xcrun("-f", "clang++", output=str).strip()
env.set("SPACK_CC", xcode_clang, force=True)
env.set("SPACK_CXX", xcode_clangpp, force=True)
xcode_select = spack.util.executable.Executable("xcode-select")
xcode_select = llnl.util.executable.Executable("xcode-select")
# Get the path of the active developer directory
real_root = xcode_select("--print-path", output=str).strip()

View File

@@ -6,11 +6,11 @@
import os
import re
import llnl.util.executable
from llnl.util.filesystem import ancestor
import spack.compiler
import spack.compilers.apple_clang as apple_clang
import spack.util.executable
from spack.version import ver
@@ -204,7 +204,7 @@ def stdcxx_libs(self):
def prefix(self):
# GCC reports its install prefix when running ``-print-search-dirs``
# on the first line ``install: <prefix>``.
cc = spack.util.executable.Executable(self.cc)
cc = llnl.util.executable.Executable(self.cc)
with self.compiler_environment():
gcc_output = cc("-print-search-dirs", output=str, error=str)

View File

@@ -10,10 +10,11 @@
from distutils.version import StrictVersion
from typing import Dict, List, Set
import llnl.util.executable
import spack.compiler
import spack.operating_systems.windows_os
import spack.platforms
import spack.util.executable
from spack.compiler import Compiler
from spack.error import SpackError
from spack.version import Version
@@ -160,6 +161,8 @@ def setup_custom_environment(self, pkg, env):
def fc_version(cls, fc):
# We're using intel for the Fortran compilers, which exist if
# ONEAPI_ROOT is a meaningful variable
if not sys.platform == "win32":
return "unknown"
fc_ver = cls.default_version(fc)
avail_fc_version.add(fc_ver)
fc_path[fc_ver] = fc
@@ -168,7 +171,7 @@ def fc_version(cls, fc):
sps = spack.operating_systems.windows_os.WindowsOs.compiler_search_paths
except AttributeError:
raise SpackError("Windows compiler search paths not established")
clp = spack.util.executable.which_string("cl", path=sps)
clp = llnl.util.executable.which_string("cl", path=sps)
ver = cls.default_version(clp)
else:
ver = fc_ver

View File

@@ -36,7 +36,7 @@
import re
import sys
from contextlib import contextmanager
from typing import List
from typing import Dict, List, Optional
import ruamel.yaml as yaml
from ruamel.yaml.error import MarkedYAMLError
@@ -391,41 +391,44 @@ class Configuration(object):
This class makes it easy to add a new scope on top of an existing one.
"""
def __init__(self, *scopes):
# convert to typing.OrderedDict when we drop 3.6, or OrderedDict when we reach 3.9
scopes: Dict[str, ConfigScope]
def __init__(self, *scopes: ConfigScope):
"""Initialize a configuration with an initial list of scopes.
Args:
scopes (list of ConfigScope): list of scopes to add to this
scopes: list of scopes to add to this
Configuration, ordered from lowest to highest precedence
"""
self.scopes = collections.OrderedDict()
for scope in scopes:
self.push_scope(scope)
self.format_updates = collections.defaultdict(list)
self.format_updates: Dict[str, List[str]] = collections.defaultdict(list)
@_config_mutator
def push_scope(self, scope):
def push_scope(self, scope: ConfigScope):
"""Add a higher precedence scope to the Configuration."""
tty.debug("[CONFIGURATION: PUSH SCOPE]: {}".format(str(scope)), level=2)
self.scopes[scope.name] = scope
@_config_mutator
def pop_scope(self):
def pop_scope(self) -> ConfigScope:
"""Remove the highest precedence scope and return it."""
name, scope = self.scopes.popitem(last=True)
name, scope = self.scopes.popitem(last=True) # type: ignore[call-arg]
tty.debug("[CONFIGURATION: POP SCOPE]: {}".format(str(scope)), level=2)
return scope
@_config_mutator
def remove_scope(self, scope_name):
def remove_scope(self, scope_name: str) -> Optional[ConfigScope]:
"""Remove scope by name; has no effect when ``scope_name`` does not exist"""
scope = self.scopes.pop(scope_name, None)
tty.debug("[CONFIGURATION: POP SCOPE]: {}".format(str(scope)), level=2)
return scope
@property
def file_scopes(self):
def file_scopes(self) -> List[ConfigScope]:
"""List of writable scopes with an associated file."""
return [
s
@@ -433,21 +436,21 @@ def file_scopes(self):
if (type(s) == ConfigScope or type(s) == SingleFileScope)
]
def highest_precedence_scope(self):
def highest_precedence_scope(self) -> ConfigScope:
"""Non-internal scope with highest precedence."""
return next(reversed(self.file_scopes), None)
return next(reversed(self.file_scopes))
def highest_precedence_non_platform_scope(self):
def highest_precedence_non_platform_scope(self) -> ConfigScope:
"""Non-internal non-platform scope with highest precedence
Platform-specific scopes are of the form scope/platform"""
generator = reversed(self.file_scopes)
highest = next(generator, None)
highest = next(generator)
while highest and highest.is_platform_dependent:
highest = next(generator, None)
highest = next(generator)
return highest
def matching_scopes(self, reg_expr):
def matching_scopes(self, reg_expr) -> List[ConfigScope]:
"""
List of all scopes whose names match the provided regular expression.
@@ -456,7 +459,7 @@ def matching_scopes(self, reg_expr):
"""
return [s for s in self.scopes.values() if re.search(reg_expr, s.name)]
def _validate_scope(self, scope):
def _validate_scope(self, scope: Optional[str]) -> ConfigScope:
"""Ensure that scope is valid in this configuration.
This should be used by routines in ``config.py`` to validate
@@ -481,7 +484,7 @@ def _validate_scope(self, scope):
"Invalid config scope: '%s'. Must be one of %s" % (scope, self.scopes.keys())
)
def get_config_filename(self, scope, section):
def get_config_filename(self, scope, section) -> str:
"""For some scope and section, get the name of the configuration file."""
scope = self._validate_scope(scope)
return scope.get_section_filename(section)
@@ -495,7 +498,9 @@ def clear_caches(self):
scope.clear()
@_config_mutator
def update_config(self, section, update_data, scope=None, force=False):
def update_config(
self, section: str, update_data: Dict, scope: Optional[str] = None, force: bool = False
):
"""Update the configuration file for a particular scope.
Overwrites contents of a section in a scope with update_data,
@@ -1315,14 +1320,15 @@ def raw_github_gitlab_url(url):
return url
def collect_urls(base_url):
def collect_urls(base_url: str) -> list:
"""Return a list of configuration URLs.
Arguments:
base_url (str): URL for a configuration (yaml) file or a directory
base_url: URL for a configuration (yaml) file or a directory
containing yaml file(s)
Returns: (list) list of configuration file(s) or empty list if none
Returns:
List of configuration file(s) or empty list if none
"""
if not base_url:
return []
@@ -1337,20 +1343,21 @@ def collect_urls(base_url):
return [link for link in links if link.endswith(extension)]
def fetch_remote_configs(url, dest_dir, skip_existing=True):
def fetch_remote_configs(url: str, dest_dir: str, skip_existing: bool = True) -> str:
"""Retrieve configuration file(s) at the specified URL.
Arguments:
url (str): URL for a configuration (yaml) file or a directory containing
url: URL for a configuration (yaml) file or a directory containing
yaml file(s)
dest_dir (str): destination directory
skip_existing (bool): Skip files that already exist in dest_dir if
dest_dir: destination directory
skip_existing: Skip files that already exist in dest_dir if
``True``; otherwise, replace those files
Returns: (str) path to the corresponding file if URL is or contains a
single file and it is the only file in the destination directory or
the root (dest_dir) directory if multiple configuration files exist
or are retrieved.
Returns:
Path to the corresponding file if URL is or contains a
single file and it is the only file in the destination directory or
the root (dest_dir) directory if multiple configuration files exist
or are retrieved.
"""
def _fetch_file(url):
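The annotations added above make the `Configuration` model explicit: an ordered dict of named `ConfigScope`s in which the most recently pushed scope wins. A small sketch of the stack discipline (the scope names here are hypothetical):

```python
import spack.config

cfg = spack.config.Configuration()  # starts with no scopes
cfg.push_scope(spack.config.InternalConfigScope("defaults", {}))
cfg.push_scope(spack.config.InternalConfigScope("overrides", {}))

# "overrides" now has the highest precedence; pop_scope removes and returns it.
top = cfg.pop_scope()

# remove_scope is a no-op (returns None) for names that were never pushed.
assert cfg.remove_scope("no-such-scope") is None
```

Note that `highest_precedence_scope()` now raises `StopIteration` on an empty configuration instead of returning `None`, matching its new non-optional return type.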

View File

@@ -10,7 +10,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.util.executable as executable
import spack.util.git
#: Global variable used to cache in memory the content of images.json
_data = None
@@ -97,7 +97,7 @@ def _verify_ref(url, ref, enforce_sha):
# Do a checkout in a temporary directory
msg = 'Cloning "{0}" to verify ref "{1}"'.format(url, ref)
tty.info(msg, stream=sys.stderr)
git = executable.which("git", required=True)
git = spack.util.git.git(required=True)
with fs.temporary_dir():
git("clone", "-q", url, ".")
sha = git(

View File

@@ -347,7 +347,7 @@ def find_win32_additional_install_paths():
windows_search_ext.extend(
spack.config.get("config:additional_external_search_paths", default=[])
)
windows_search_ext.extend(spack.util.environment.get_path("PATH"))
windows_search_ext.extend(llnl.util.envmod.get_path("PATH"))
return windows_search_ext

View File

@@ -12,10 +12,10 @@
import sys
import warnings
import llnl.util.envmod
import llnl.util.filesystem
import llnl.util.tty
import spack.util.environment
import spack.util.ld_so_conf
from .common import ( # find_windows_compiler_bundled_packages,
@@ -83,9 +83,9 @@ def libraries_in_ld_and_system_library_path(path_hints=None):
"""
path_hints = (
path_hints
or spack.util.environment.get_path("LD_LIBRARY_PATH")
+ spack.util.environment.get_path("DYLD_LIBRARY_PATH")
+ spack.util.environment.get_path("DYLD_FALLBACK_LIBRARY_PATH")
or llnl.util.envmod.get_path("LD_LIBRARY_PATH")
+ llnl.util.envmod.get_path("DYLD_LIBRARY_PATH")
+ llnl.util.envmod.get_path("DYLD_FALLBACK_LIBRARY_PATH")
+ spack.util.ld_so_conf.host_dynamic_linker_search_paths()
)
search_paths = llnl.util.filesystem.search_paths_for_libraries(*path_hints)
@@ -93,7 +93,7 @@ def libraries_in_ld_and_system_library_path(path_hints=None):
def libraries_in_windows_paths(path_hints):
path_hints.extend(spack.util.environment.get_path("PATH"))
path_hints.extend(llnl.util.envmod.get_path("PATH"))
search_paths = llnl.util.filesystem.search_paths_for_libraries(*path_hints)
# on Windows, some libraries (.dlls) are found in the bin directory or sometimes
# at the search root. Add both of those options to the search scheme
@@ -236,7 +236,7 @@ def by_executable(packages_to_check, path_hints=None):
path_hints (list): list of paths to be searched. If None the list will be
constructed based on the PATH environment variable.
"""
path_hints = spack.util.environment.get_path("PATH") if path_hints is None else path_hints
path_hints = llnl.util.envmod.get_path("PATH") if path_hints is None else path_hints
exe_pattern_to_pkgs = collections.defaultdict(list)
for pkg in packages_to_check:
if hasattr(pkg, "executables"):
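Every one of these call sites relies on `get_path`, which reads an environment variable and splits it on the platform's path separator. Its behavior after the move is assumed to be the usual one; a standalone sketch:

```python
import os


def get_path(name: str) -> list:
    """Assumed behavior of llnl.util.envmod.get_path."""
    value = os.environ.get(name, "").strip()
    return value.split(os.pathsep) if value else []


# With PATH=/usr/bin:/bin this returns ["/usr/bin", "/bin"];
# an unset variable yields [].
print(get_path("PATH"))
```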

View File

@@ -20,6 +20,7 @@
import spack.spec
import spack.util.spack_json as sjson
from spack.error import SpackError
from spack.util.environment import get_host_environment_metadata
is_windows = sys.platform == "win32"
# Note: Posixpath is used here as opposed to
@@ -120,8 +121,6 @@ def write_host_environment(self, spec):
versioning. We use it in the case that an analysis later needs to
easily access this information.
"""
from spack.util.environment import get_host_environment_metadata
env_file = self.env_metadata_path(spec)
environ = get_host_environment_metadata()
with open(env_file, "w") as fd:

View File

@@ -16,6 +16,7 @@
import ruamel.yaml as yaml
import llnl.util.envmod
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import dedupe
@@ -1535,7 +1536,7 @@ def check_views(self):
)
def _env_modifications_for_default_view(self, reverse=False):
all_mods = spack.util.environment.EnvironmentModifications()
all_mods = llnl.util.envmod.EnvironmentModifications()
visited = set()
@@ -1576,7 +1577,7 @@ def add_default_view_to_env(self, env_mod):
default view. Removes duplicate paths.
Args:
env_mod (spack.util.environment.EnvironmentModifications): the environment
env_mod (llnl.util.envmod.EnvironmentModifications): the environment
modifications object that is modified.
"""
if default_view_name not in self.views:
@@ -1603,7 +1604,7 @@ def rm_default_view_from_env(self, env_mod):
default view. Reverses the action of ``add_default_view_to_env``.
Args:
env_mod (spack.util.environment.EnvironmentModifications): the environment
env_mod (llnl.util.envmod.EnvironmentModifications): the environment
modifications object that is modified.
"""
if default_view_name not in self.views:

View File

@@ -5,12 +5,12 @@
import os
import llnl.util.tty as tty
from llnl.util.envmod import EnvironmentModifications
from llnl.util.tty.color import colorize
import spack.environment as ev
import spack.repo
import spack.store
from spack.util.environment import EnvironmentModifications
def activate_header(env, shell, prompt=None):
@@ -111,7 +111,7 @@ def activate(env, use_env_repo=False, add_view=True):
add_view (bool): generate commands to add view to path variables
Returns:
spack.util.environment.EnvironmentModifications: Environment variables
llnl.util.envmod.EnvironmentModifications: Environment variables
modifications to activate environment.
"""
ev.activate(env, use_env_repo=use_env_repo)
@@ -150,7 +150,7 @@ def deactivate():
after activation are not unloaded.
Returns:
spack.util.environment.EnvironmentModifications: Environment variables
llnl.util.envmod.EnvironmentModifications: Environment variables
modifications to activate environment.
"""
env_mods = EnvironmentModifications()

View File

@@ -35,6 +35,7 @@
import llnl.util
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.executable import CommandNotFoundError, which
from llnl.util.filesystem import (
get_single_file,
mkdirp,
@@ -42,19 +43,19 @@
temp_rename,
working_dir,
)
from llnl.util.string import comma_and, quote
from llnl.util.symlink import symlink
import spack.config
import spack.error
import spack.url
import spack.util.crypto as crypto
import spack.util.git
import spack.util.pattern as pattern
import spack.util.url as url_util
import spack.util.web as web_util
import spack.version
from spack.util.compression import decompressor_for, extension_from_path
from spack.util.executable import CommandNotFoundError, which
from spack.util.string import comma_and, quote
#: List of all fetch strategies, created by FetchStrategy metaclass.
all_strategies = []
@@ -765,7 +766,7 @@ def version_from_git(git_exe):
@property
def git(self):
if not self._git:
self._git = which("git", required=True)
self._git = spack.util.git.git()
# Disable advice for a quieter fetch
# https://github.com/git/git/blob/master/Documentation/RelNotes/1.7.2.txt

View File

@@ -4,8 +4,8 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import urllib.parse
import urllib.response
import spack.util.web as web_util
from urllib.error import URLError
from urllib.request import BaseHandler
def gcs_open(req, *args, **kwargs):
@@ -16,8 +16,13 @@ def gcs_open(req, *args, **kwargs):
gcsblob = gcs_util.GCSBlob(url)
if not gcsblob.exists():
raise web_util.SpackWebError("GCS blob {0} does not exist".format(gcsblob.blob_path))
raise URLError("GCS blob {0} does not exist".format(gcsblob.blob_path))
stream = gcsblob.get_blob_byte_stream()
headers = gcsblob.get_blob_headers()
return urllib.response.addinfourl(stream, headers, url)
class GCSHandler(BaseHandler):
def gs_open(self, req):
return gcs_open(req)
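`GCSHandler` hooks into `urllib` purely by method name: `BaseHandler` subclasses expose `<scheme>_open`, so an opener built with this handler dispatches `gs://` requests to `gcs_open`, and the switch to raising `URLError` keeps failures inside `urllib`'s own error hierarchy. A wiring sketch (the opener construction and URL are assumptions, not shown in this diff):

```python
import urllib.request

# GCSHandler is the class defined in the file above; its gs_open() method
# makes build_opener route gs:// URLs through gcs_open().
opener = urllib.request.build_opener(GCSHandler())  # noqa: F821

response = opener.open("gs://my-bucket/config.yaml")  # hypothetical blob
data = response.read()  # bytes from the blob's byte stream
```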

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
r"""Functions for graphing DAGs of dependencies.
This file contains code for graphing DAGs of software packages
@@ -35,88 +34,17 @@
/
o boost
graph_dot() will output a graph of a spec (or multiple specs) in dot
format.
Note that ``graph_ascii`` assumes a single spec while ``graph_dot``
can take a number of specs as input.
graph_dot() will output a graph of a spec (or multiple specs) in dot format.
"""
import heapq
import itertools
import enum
import sys
from typing import List, Optional, Set, TextIO, Tuple, Union
import llnl.util.tty.color
import spack.dependency
__all__ = ["graph_ascii", "AsciiGraph", "graph_dot"]
def node_label(spec):
return spec.format("{name}{@version}{/hash:7}")
def topological_sort(spec, deptype="all"):
"""Return a list of dependency specs in topological sorting order.
The spec argument is not modified by the function.
This function assumes specs don't have cycles, i.e. that we are really
operating with a DAG.
Args:
spec (spack.spec.Spec): the spec to be analyzed
deptype (str or tuple): dependency types to account for when
constructing the list
"""
deptype = spack.dependency.canonical_deptype(deptype)
# Work on a copy so this is nondestructive
spec = spec.copy(deps=True)
nodes = spec.index(deptype=deptype)
def dependencies(specs):
"""Return all the dependencies (including transitive) for a spec."""
return list(
set(itertools.chain.from_iterable(s.dependencies(deptype=deptype) for s in specs))
)
def dependents(specs):
"""Return all the dependents (including those of transitive dependencies)
for a spec.
"""
candidates = list(
set(itertools.chain.from_iterable(s.dependents(deptype=deptype) for s in specs))
)
return [x for x in candidates if x.name in nodes]
topological_order, children = [], {}
# Map a spec encoded as (id, name) to a list of its transitive dependencies
for spec in itertools.chain.from_iterable(nodes.values()):
children[(id(spec), spec.name)] = [x for x in dependencies([spec]) if x.name in nodes]
# To return a result that is topologically ordered we need to add nodes
# only after their dependencies. The first nodes we can add are leaf nodes,
# i.e. nodes that have no dependencies.
ready = [
spec for spec in itertools.chain.from_iterable(nodes.values()) if not dependencies([spec])
]
heapq.heapify(ready)
while ready:
# Pop a "ready" node and add it to the topologically ordered list
s = heapq.heappop(ready)
topological_order.append(s)
# Check if adding the last node made other nodes "ready"
for dep in dependents([s]):
children[(id(dep), dep.name)].remove(s)
if not children[(id(dep), dep.name)]:
heapq.heappush(ready, dep)
return topological_order
import spack.spec
import spack.tengine
def find(seq, predicate):
@@ -133,13 +61,17 @@ def find(seq, predicate):
return -1
# Names of different graph line states. We record previous line
# states so that we can easily determine what to do when connecting.
states = ("node", "collapse", "merge-right", "expand-right", "back-edge")
NODE, COLLAPSE, MERGE_RIGHT, EXPAND_RIGHT, BACK_EDGE = states
class _GraphLineState(enum.Enum):
"""Names of different graph line states."""
NODE = enum.auto()
COLLAPSE = enum.auto()
MERGE_RIGHT = enum.auto()
EXPAND_RIGHT = enum.auto()
BACK_EDGE = enum.auto()
class AsciiGraph(object):
class AsciiGraph:
def __init__(self):
# These can be set after initialization or after a call to
# graph() to change behavior.
@@ -152,13 +84,13 @@ def __init__(self):
# See llnl.util.tty.color for details on color characters.
self.colors = "rgbmcyRGBMCY"
# Internal vars are used in the graph() function and are
# properly initialized there.
# Internal vars are used in the graph() function and are initialized there
self._name_to_color = None # Node name to color
self._out = None # Output stream
self._frontier = None # frontier
self._prev_state = None # State of previous line
self._prev_index = None # Index of expansion point of prev line
self._pos = None
def _indent(self):
self._out.write(self.indent * " ")
@@ -169,7 +101,7 @@ def _write_edge(self, string, index, sub=0):
if not self._frontier[index]:
return
name = self._frontier[index][sub]
edge = "@%s{%s}" % (self._name_to_color[name], string)
edge = f"@{self._name_to_color[name]}{{{string}}}"
self._out.write(edge)
def _connect_deps(self, i, deps, label=None):
@@ -204,14 +136,14 @@ def _connect_deps(self, i, deps, label=None):
return self._connect_deps(j, deps, label)
collapse = True
if self._prev_state == EXPAND_RIGHT:
if self._prev_state == _GraphLineState.EXPAND_RIGHT:
# Special case where previous line expanded and i is off by 1.
self._back_edge_line([], j, i + 1, True, label + "-1.5 " + str((i + 1, j)))
collapse = False
else:
# Previous node also expanded here, so i is off by one.
if self._prev_state == NODE and self._prev_index < i:
if self._prev_state == _GraphLineState.NODE and self._prev_index < i:
i += 1
if i - j > 1:
@@ -222,21 +154,21 @@ def _connect_deps(self, i, deps, label=None):
self._back_edge_line([j], -1, -1, collapse, label + "-2 " + str((i, j)))
return True
elif deps:
if deps:
self._frontier.insert(i, deps)
return False
return False
def _set_state(self, state, index, label=None):
if state not in states:
raise ValueError("Invalid graph state!")
self._prev_state = state
self._prev_index = index
if self.debug:
self._out.write(" " * 20)
self._out.write("%-20s" % (str(self._prev_state) if self._prev_state else ""))
self._out.write("%-20s" % (str(label) if label else ""))
self._out.write("%s" % self._frontier)
self._out.write(f"{str(self._prev_state) if self._prev_state else '':<20}")
self._out.write(f"{str(label) if label else '':<20}")
self._out.write(f"{self._frontier}")
def _back_edge_line(self, prev_ends, end, start, collapse, label=None):
"""Write part of a backwards edge in the graph.
@@ -309,7 +241,7 @@ def advance(to_pos, edges):
else:
advance(flen, lambda: [("| ", self._pos)])
self._set_state(BACK_EDGE, end, label)
self._set_state(_GraphLineState.BACK_EDGE, end, label)
self._out.write("\n")
def _node_label(self, node):
@@ -321,13 +253,13 @@ def _node_line(self, index, node):
for c in range(index):
self._write_edge("| ", c)
self._out.write("%s " % self.node_character)
self._out.write(f"{self.node_character} ")
for c in range(index + 1, len(self._frontier)):
self._write_edge("| ", c)
self._out.write(self._node_label(node))
self._set_state(NODE, index)
self._set_state(_GraphLineState.NODE, index)
self._out.write("\n")
def _collapse_line(self, index):
@@ -338,7 +270,7 @@ def _collapse_line(self, index):
for c in range(index, len(self._frontier)):
self._write_edge(" /", c)
self._set_state(COLLAPSE, index)
self._set_state(_GraphLineState.COLLAPSE, index)
self._out.write("\n")
def _merge_right_line(self, index):
@@ -351,7 +283,7 @@ def _merge_right_line(self, index):
for c in range(index + 1, len(self._frontier)):
self._write_edge("| ", c)
self._set_state(MERGE_RIGHT, index)
self._set_state(_GraphLineState.MERGE_RIGHT, index)
self._out.write("\n")
def _expand_right_line(self, index):
@@ -365,7 +297,7 @@ def _expand_right_line(self, index):
for c in range(index + 2, len(self._frontier)):
self._write_edge(" \\", c)
self._set_state(EXPAND_RIGHT, index)
self._set_state(_GraphLineState.EXPAND_RIGHT, index)
self._out.write("\n")
def write(self, spec, color=None, out=None):
@@ -391,7 +323,13 @@ def write(self, spec, color=None, out=None):
self._out = llnl.util.tty.color.ColorStream(out, color=color)
# We'll traverse the spec in topological order as we graph it.
nodes_in_topological_order = topological_sort(spec, deptype=self.deptype)
nodes_in_topological_order = [
edge.spec
for edge in spack.traverse.traverse_edges_topo(
[spec], direction="children", deptype=self.deptype
)
]
nodes_in_topological_order.reverse()
# Work on a copy to be nondestructive
spec = spec.copy()
@@ -506,87 +444,153 @@ def graph_ascii(spec, node="o", out=None, debug=False, indent=0, color=None, dep
graph.write(spec, color=color, out=out)
def graph_dot(specs, deptype="all", static=False, out=None):
"""Generate a graph in dot format of all provided specs.
class DotGraphBuilder:
"""Visit edges of a graph a build DOT options for nodes and edges"""
Print out a dot formatted graph of all the dependencies between
packages. Output can be passed to graphviz, e.g.:
def __init__(self):
self.nodes: Set[Tuple[str, str]] = set()
self.edges: Set[Tuple[str, str, str]] = set()
.. code-block:: console
def visit(self, edge: spack.spec.DependencySpec):
"""Visit an edge and builds up entries to render the graph"""
if edge.parent is None:
self.nodes.add(self.node_entry(edge.spec))
return
spack graph --dot qt | dot -Tpdf > spack-graph.pdf
self.nodes.add(self.node_entry(edge.parent))
self.nodes.add(self.node_entry(edge.spec))
self.edges.add(self.edge_entry(edge))
def node_entry(self, node: spack.spec.Spec) -> Tuple[str, str]:
"""Return a tuple of (node_id, node_options)"""
raise NotImplementedError("Need to be implemented by derived classes")
def edge_entry(self, edge: spack.spec.DependencySpec) -> Tuple[str, str, str]:
"""Return a tuple of (parent_id, child_id, edge_options)"""
raise NotImplementedError("Need to be implemented by derived classes")
def context(self):
"""Return the context to be used to render the DOT graph template"""
result = {"nodes": self.nodes, "edges": self.edges}
return result
def render(self) -> str:
"""Return a string with the output in DOT format"""
environment = spack.tengine.make_environment()
template = environment.get_template("misc/graph.dot")
return template.render(self.context())
class SimpleDAG(DotGraphBuilder):
"""Simple DOT graph, with nodes colored uniformly and edges without properties"""
def node_entry(self, node):
format_option = "{name}{@version}{%compiler}{/hash:7}"
return node.dag_hash(), f'[label="{node.format(format_option)}"]'
def edge_entry(self, edge):
return edge.parent.dag_hash(), edge.spec.dag_hash(), None
class StaticDag(DotGraphBuilder):
"""DOT graph for possible dependencies"""
def node_entry(self, node):
return node.name, f'[label="{node.name}"]'
def edge_entry(self, edge):
return edge.parent.name, edge.spec.name, None
class DAGWithDependencyTypes(DotGraphBuilder):
"""DOT graph with link,run nodes grouped together and edges colored according to
the dependency types.
"""
def __init__(self):
super().__init__()
self.main_unified_space: Set[str] = set()
def visit(self, edge):
if edge.parent is None:
for node in spack.traverse.traverse_nodes([edge.spec], deptype=("link", "run")):
self.main_unified_space.add(node.dag_hash())
super().visit(edge)
def node_entry(self, node):
node_str = node.format("{name}{@version}{%compiler}{/hash:7}")
options = f'[label="{node_str}", group="build_dependencies", fillcolor="coral"]'
if node.dag_hash() in self.main_unified_space:
options = f'[label="{node_str}", group="main_psid"]'
return node.dag_hash(), options
def edge_entry(self, edge):
colormap = {"build": "dodgerblue", "link": "crimson", "run": "goldenrod"}
return (
edge.parent.dag_hash(),
edge.spec.dag_hash(),
f"[color=\"{':'.join(colormap[x] for x in edge.deptypes)}\"]",
)
def _static_edges(specs, deptype):
for spec in specs:
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
possible = pkg_cls.possible_dependencies(expand_virtuals=True, deptype=deptype)
for parent_name, dependencies in possible.items():
for dependency_name in dependencies:
yield spack.spec.DependencySpec(
spack.spec.Spec(parent_name),
spack.spec.Spec(dependency_name),
deptypes=deptype,
)
def static_graph_dot(
specs: List[spack.spec.Spec],
deptype: Optional[Union[str, Tuple[str, ...]]] = "all",
out: Optional[TextIO] = None,
):
"""Static DOT graph with edges to all possible dependencies.
Args:
specs: abstract specs to be represented
deptype: dependency types to consider
out: optional output stream. If None sys.stdout is used
"""
out = out or sys.stdout
builder = StaticDag()
for edge in _static_edges(specs, deptype):
builder.visit(edge)
out.write(builder.render())
def graph_dot(
specs: List[spack.spec.Spec],
builder: Optional[DotGraphBuilder] = None,
deptype: Optional[Union[str, Tuple[str, ...]]] = "all",
out: Optional[TextIO] = None,
):
"""DOT graph of the concrete specs passed as input.
Args:
specs: specs to be represented
builder: builder to use to render the graph
deptype: dependency types to consider
out: optional output stream. If None sys.stdout is used
"""
if not specs:
raise ValueError("Must provide specs to graph_dot")
if out is None:
out = sys.stdout
deptype = spack.dependency.canonical_deptype(deptype)
builder = builder or SimpleDAG()
for edge in spack.traverse.traverse_edges(
specs, cover="edges", order="breadth", deptype=deptype
):
builder.visit(edge)
def static_graph(spec, deptype):
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
possible = pkg_cls.possible_dependencies(expand_virtuals=True, deptype=deptype)
nodes = set() # elements are (node name, node label)
edges = set() # elements are (src key, dest key)
for name, deps in possible.items():
nodes.add((name, name))
edges.update((name, d) for d in deps)
return nodes, edges
def dynamic_graph(spec, deptypes):
nodes = set() # elements are (node key, node label)
edges = set() # elements are (src key, dest key)
for s in spec.traverse(deptype=deptype):
nodes.add((s.dag_hash(), node_label(s)))
for d in s.dependencies(deptype=deptype):
edge = (s.dag_hash(), d.dag_hash())
edges.add(edge)
return nodes, edges
nodes = set()
edges = set()
for spec in specs:
if static:
n, e = static_graph(spec, deptype)
else:
n, e = dynamic_graph(spec, deptype)
nodes.update(n)
edges.update(e)
out.write("digraph G {\n")
out.write(' labelloc = "b"\n')
out.write(' rankdir = "TB"\n')
out.write(' ranksep = "1"\n')
out.write(" edge[\n")
out.write(" penwidth=4")
out.write(" ]\n")
out.write(" node[\n")
out.write(" fontname=Monaco,\n")
out.write(" penwidth=4,\n")
out.write(" fontsize=24,\n")
out.write(" margin=.2,\n")
out.write(" shape=box,\n")
out.write(" fillcolor=lightblue,\n")
out.write(' style="rounded,filled"')
out.write(" ]\n")
# write nodes
out.write("\n")
for key, label in nodes:
out.write(' "%s" [label="%s"]\n' % (key, label))
# write edges
out.write("\n")
for src, dest in edges:
out.write(' "%s" -> "%s"\n' % (src, dest))
# ensure that roots are all at the top of the plot
dests = set([d for _, d in edges])
roots = ['"%s"' % k for k, _ in nodes if k not in dests]
out.write("\n")
out.write(" { rank=min; %s; }" % "; ".join(roots))
out.write("\n")
out.write("}\n")
out.write(builder.render())
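The rewrite above replaces the monolithic `graph_dot` with a small visitor hierarchy: `graph_dot` traverses edges breadth-first and hands each one to a `DotGraphBuilder`, which only decides how nodes and edges are rendered. A hedged sketch of a custom builder (the subclass and spec below are illustrative):

```python
import sys

import spack.spec
from spack.graph import DotGraphBuilder, graph_dot


class NamesOnlyDAG(DotGraphBuilder):
    """Hypothetical builder that labels nodes with the package name only."""

    def node_entry(self, node):
        # Contract from the base class: return (node_id, DOT node options).
        return node.dag_hash(), f'[label="{node.name}"]'

    def edge_entry(self, edge):
        # Contract: return (parent_id, child_id, edge options); None = default.
        return edge.parent.dag_hash(), edge.spec.dag_hash(), None


spec = spack.spec.Spec("zlib").concretized()  # any concrete spec works
graph_dot([spec], builder=NamesOnlyDAG(), out=sys.stdout)
```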

View File

@@ -6,6 +6,7 @@
import os
import llnl.util.tty as tty
from llnl.util.executable import Executable
from llnl.util.filesystem import BaseDirectoryVisitor, visit_directory_tree
from llnl.util.lang import elide_list
@@ -13,7 +14,6 @@
import spack.config
import spack.relocate
from spack.util.elf import ElfParsingError, parse_elf
from spack.util.executable import Executable
def is_shared_library_elf(filepath):

View File

@@ -6,11 +6,11 @@
import os
import llnl.util.tty as tty
from llnl.util.executable import Executable, which
from llnl.util.filesystem import mkdirp
from llnl.util.symlink import symlink
from spack.util.editor import editor
from spack.util.executable import Executable, which
def pre_install(spec):

View File

@@ -13,6 +13,7 @@
import spack.error
import spack.paths
import spack.util.path
import spack.util.prefix
import spack.util.spack_json as sjson
from spack.spec import Spec

View File

@@ -40,6 +40,8 @@
import llnl.util.filesystem as fs
import llnl.util.lock as lk
import llnl.util.tty as tty
from llnl.util.envmod import EnvironmentModifications
from llnl.util.executable import which
from llnl.util.lang import pretty_seconds
from llnl.util.tty.color import colorize
from llnl.util.tty.log import log_output
@@ -53,11 +55,9 @@
import spack.package_prefs as prefs
import spack.repo
import spack.store
import spack.util.executable
import spack.util.path
import spack.util.timer as timer
from spack.util.environment import EnvironmentModifications, dump_environment
from spack.util.executable import which
from spack.util.environment import dump_environment
#: Counter to support unique spec sequencing that is used to ensure packages
#: with the same priority are (initially) processed in the order in which they
@@ -460,11 +460,10 @@ def combine_phase_logs(phase_log_files, log_path):
phase_log_files (list): a list or iterator of logs to combine
log_path (str): the path to combine them to
"""
with open(log_path, "w") as log_file:
with open(log_path, "bw") as log_file:
for phase_log_file in phase_log_files:
with open(phase_log_file, "r") as phase_log:
log_file.write(phase_log.read())
with open(phase_log_file, "br") as phase_log:
shutil.copyfileobj(phase_log, log_file)
def dump_packages(spec, path):
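The two-line change to `combine_phase_logs` above is worth spelling out: binary mode avoids decoding build output that may not be valid UTF-8, and `shutil.copyfileobj` streams in fixed-size chunks instead of loading each log into memory. The same pattern in isolation:

```python
import shutil


def combine_logs(phase_log_files, log_path):
    """Concatenate log files byte-for-byte into log_path."""
    with open(log_path, "wb") as log_file:
        for phase_log_file in phase_log_files:
            with open(phase_log_file, "rb") as phase_log:
                # Streams in chunks; nothing is decoded, so malformed
                # bytes in build output cannot raise UnicodeDecodeError.
                shutil.copyfileobj(phase_log, log_file)
```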

View File

@@ -26,6 +26,7 @@
import archspec.cpu
import llnl.util.envmod
import llnl.util.lang
import llnl.util.tty as tty
import llnl.util.tty.colify
@@ -44,9 +45,7 @@
import spack.spec
import spack.store
import spack.util.debug
import spack.util.environment
import spack.util.executable as exe
import spack.util.path
import spack.util.git
from spack.error import SpackError
#: names of profile statistics
@@ -136,7 +135,7 @@ def get_version():
version = spack.spack_version
git_path = os.path.join(spack.paths.prefix, ".git")
if os.path.exists(git_path):
git = exe.which("git")
git = spack.util.git.git()
if not git:
return version
rev = git(
@@ -575,7 +574,7 @@ def setup_main_options(args):
if args.debug:
spack.util.debug.register_interrupt_handler()
spack.config.set("config:debug", True, scope="command_line")
spack.util.environment.tracing_enabled = True
llnl.util.envmod.tracing_enabled = True
if args.timestamp:
tty.set_timestamp(True)

View File

@@ -667,8 +667,7 @@ def push_url_from_directory(output_directory):
"""Given a directory in the local filesystem, return the URL on
which to push binary packages.
"""
scheme = urllib.parse.urlparse(output_directory, scheme="<missing>").scheme
if scheme != "<missing>":
if url_util.validate_scheme(urllib.parse.urlparse(output_directory).scheme):
raise ValueError("expected a local path, but got a URL instead")
mirror_url = url_util.path_to_file_url(output_directory)
mirror = spack.mirror.MirrorCollection().lookup(mirror_url)
@@ -685,8 +684,7 @@ def push_url_from_mirror_name(mirror_name):
def push_url_from_mirror_url(mirror_url):
"""Given a mirror URL, return the URL on which to push binary packages."""
scheme = urllib.parse.urlparse(mirror_url, scheme="<missing>").scheme
if scheme == "<missing>":
if not url_util.validate_scheme(urllib.parse.urlparse(mirror_url).scheme):
raise ValueError('"{0}" is not a valid URL'.format(mirror_url))
mirror = spack.mirror.MirrorCollection().lookup(mirror_url)
return url_util.format(mirror.push_url)
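Both hunks above hinge on how `urllib.parse.urlparse` treats schemes; `validate_scheme` itself is a Spack helper whose exact accept list is not shown here. The standard-library behavior that both the old sentinel trick and the new check rely on:

```python
import urllib.parse

# Local filesystem paths parse with an empty scheme...
assert urllib.parse.urlparse("/tmp/mirror").scheme == ""

# ...while real URLs carry one.
assert urllib.parse.urlparse("s3://bucket/mirror").scheme == "s3"

# The old sentinel trick: the default scheme is used only when none is present.
assert urllib.parse.urlparse("/tmp/mirror", scheme="<missing>").scheme == "<missing>"
```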

View File

@@ -36,6 +36,7 @@
import re
from typing import Optional
import llnl.util.envmod
import llnl.util.filesystem
import llnl.util.tty as tty
from llnl.util.lang import dedupe
@@ -51,7 +52,6 @@
import spack.schema.environment
import spack.store
import spack.tengine as tengine
import spack.util.environment
import spack.util.file_permissions as fp
import spack.util.path
import spack.util.spack_yaml as syaml
@@ -732,8 +732,8 @@ def environment_modifications(self):
spec.prefix = view.get_projection_for_spec(spec)
env = spack.util.environment.inspect_path(
spec.prefix, prefix_inspections, exclude=spack.util.environment.is_system_path
env = llnl.util.envmod.inspect_path(
spec.prefix, prefix_inspections, exclude=llnl.util.envmod.is_system_path
)
# Let the extendee/dependency modify their extensions/dependencies

View File

@@ -9,6 +9,7 @@
import posixpath
from typing import Any, Dict
import llnl.util.envmod
import llnl.util.lang as lang
import spack.compilers
@@ -17,7 +18,6 @@
import spack.repo
import spack.spec
import spack.tengine as tengine
import spack.util.environment
from .common import BaseConfiguration, BaseContext, BaseFileLayout, BaseModuleFileWriter
@@ -71,7 +71,7 @@ def guess_core_compilers(name, store=False):
# A compiler is considered to be a core compiler if any of the
# C, C++ or Fortran compilers reside in a system directory
is_system_compiler = any(
os.path.dirname(x) in spack.util.environment.system_dirs
os.path.dirname(x) in llnl.util.envmod.system_dirs
for x in compiler["paths"].values()
if x is not None
)

View File

@@ -10,8 +10,8 @@
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.envmod import get_path
from spack.util.environment import get_path
from spack.util.module_cmd import module
from .linux_distro import LinuxDistro

View File

@@ -8,8 +8,8 @@
import re
import llnl.util.lang
from llnl.util.executable import Executable
from spack.util.executable import Executable
from spack.version import Version
from ._operating_system import OperatingSystem
@@ -50,7 +50,7 @@ def macos_version():
try:
output = Executable("sw_vers")(output=str, fail_on_error=False)
except Exception:
# FileNotFoundError, or spack.util.executable.ProcessError
# FileNotFoundError, or llnl.util.executable.ProcessError
pass
else:
match = re.search(r"ProductVersion:\s*([0-9.]+)", output)

View File

@@ -19,12 +19,12 @@
# import most common types used in packages
from typing import Dict, List, Optional
import llnl.util.executable
import llnl.util.filesystem
from llnl.util.executable import *
from llnl.util.filesystem import *
from llnl.util.symlink import symlink
import spack.util.executable
# These props will be overridden when the build env is set up.
from spack.build_environment import MakeExecutable
from spack.build_systems.aspell_dict import AspellDictPackage
@@ -87,7 +87,6 @@
on_package_attributes,
)
from spack.spec import InvalidSpecDetected, Spec
from spack.util.executable import *
from spack.variant import (
any_combination_of,
auto_or_any_combination_of,

View File

@@ -29,8 +29,11 @@
import warnings
from typing import Any, Callable, Dict, Iterable, List, Optional, Tuple, Type
import llnl.util.envmod
import llnl.util.filesystem as fsys
import llnl.util.path
import llnl.util.tty as tty
from llnl.util.executable import ProcessError, which
from llnl.util.lang import classproperty, memoized, nullcontext
from llnl.util.link_tree import LinkTree
@@ -51,14 +54,12 @@
import spack.spec
import spack.store
import spack.url
import spack.util.environment
import spack.util.path
import spack.util.web
from spack.filesystem_view import YamlFilesystemView
from spack.install_test import TestFailure, TestSuite
from spack.installer import InstallError, PackageInstaller
from spack.stage import ResourceStage, Stage, StageComposite, stage_prefix
from spack.util.executable import ProcessError, which
from spack.util.package_hash import package_hash
from spack.util.prefix import Prefix
from spack.util.web import FetchError
@@ -204,9 +205,9 @@ def __init__(cls, name, bases, attr_dict):
def platform_executables(cls):
def to_windows_exe(exe):
if exe.endswith("$"):
exe = exe.replace("$", "%s$" % spack.util.path.win_exe_ext())
exe = exe.replace("$", "%s$" % llnl.util.path.win_exe_ext())
else:
exe += spack.util.path.win_exe_ext()
exe += llnl.util.path.win_exe_ext()
return exe
plat_exe = []
@@ -1993,7 +1994,7 @@ def run_test(
for line in out:
print(line.rstrip("\n"))
if exc_type is spack.util.executable.ProcessError:
if exc_type is llnl.util.executable.ProcessError:
out = io.StringIO()
spack.build_environment.write_log_summary(
out, "test", self.test_log_file, last=1
@@ -2101,7 +2102,7 @@ def setup_run_environment(self, env):
"""Sets up the run environment for a package.
Args:
env (spack.util.environment.EnvironmentModifications): environment
env (llnl.util.envmod.EnvironmentModifications): environment
modifications to be applied when the package is run. Package authors
can call methods on it to alter the run environment.
"""
@@ -2118,7 +2119,7 @@ def setup_dependent_run_environment(self, env, dependent_spec):
for dependencies.
Args:
env (spack.util.environment.EnvironmentModifications): environment
env (llnl.util.envmod.EnvironmentModifications): environment
modifications to be applied when the dependent package is run.
Package authors can call methods on it to alter the build environment.

View File

@@ -5,7 +5,7 @@
import os
from spack.util.executable import Executable, which
from llnl.util.executable import Executable, which
def compile_c_and_execute(source_file, include_flags, link_flags):

View File

@@ -11,6 +11,7 @@
import llnl.util.filesystem
import llnl.util.lang
from llnl.util.executable import which, which_string
import spack
import spack.error
@@ -20,19 +21,9 @@
import spack.util.spack_json as sjson
from spack.util.compression import allowed_archive
from spack.util.crypto import Checker, checksum
from spack.util.executable import which, which_string
def apply_patch(stage, patch_path, level=1, working_dir="."):
"""Apply the patch at patch_path to code in the stage.
Args:
stage (spack.stage.Stage): stage with code that will be patched
patch_path (str): filesystem location for the patch to apply
level (int or None): patch level (default 1)
working_dir (str): relative path *within* the stage to change to
(default '.')
"""
def get_patch_exe():
git_utils_path = os.environ.get("PATH", "")
if sys.platform == "win32":
git = which_string("git", required=True)
@@ -46,7 +37,20 @@ def apply_patch(stage, patch_path, level=1, working_dir="."):
# Note for future developers: The GNU port of patch to windows
# has issues handling CRLF line endings unless the --binary
# flag is passed.
patch = which("patch", required=True, path=git_utils_path)
return which("patch", required=True, path=git_utils_path)
def apply_patch(stage, patch_path, level=1, working_dir="."):
"""Apply the patch at patch_path to code in the stage.
Args:
stage (spack.stage.Stage): stage with code that will be patched
patch_path (str): filesystem location for the patch to apply
level (int or None): patch level (default 1)
working_dir (str): relative path *within* the stage to change to
(default '.')
"""
patch = get_patch_exe()
with llnl.util.filesystem.working_dir(stage.source_path):
patch("-s", "-p", str(level), "-i", patch_path, "-d", working_dir)

View File

@@ -4,10 +4,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
import llnl.util.envmod
import llnl.util.lang
import spack.util.environment
from .cray import Cray
from .darwin import Darwin
from .linux import Linux
@@ -64,7 +63,7 @@ def prevent_cray_detection():
"""Context manager that prevents the detection of the Cray platform"""
reset()
try:
with spack.util.environment.set_env(MODULEPATH=""):
with llnl.util.envmod.set_env(MODULEPATH=""):
yield
finally:
reset()
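`set_env` is a context manager that temporarily overrides environment variables and restores them on exit, which is what lets `prevent_cray_detection` blank `MODULEPATH` without side effects. A minimal sketch, assuming the moved helper behaves like the usual implementation:

```python
import os

import llnl.util.envmod

os.environ["MODULEPATH"] = "/opt/cray/modulefiles"  # illustrative value

with llnl.util.envmod.set_env(MODULEPATH=""):
    # Inside the block the override is visible...
    assert os.environ["MODULEPATH"] == ""

# ...and the previous value is restored on exit.
assert os.environ["MODULEPATH"] == "/opt/cray/modulefiles"
```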

Some files were not shown because too many files have changed in this diff.