Compare commits


113 Commits

Author SHA1 Message Date
Gregory Becker
36d263a2cb black 2023-01-18 10:36:17 -08:00
Gregory Becker
99ecbcf3ef solver: fix build deps that are also non-build deps 2022-12-15 09:42:10 -08:00
Gregory Becker
a14403d641 solver: use process spaces for separate concretization of build deps 2022-12-14 22:17:17 -08:00
Marco De La Pierre
2522c8b754 edits to 8x existing recipes, mostly new versions, plus two dependency fixes (#34516) 2022-12-14 12:28:33 -07:00
Marco De La Pierre
f64cb29aea Nextflow, Tower Agent, Tower CLI: updates (#34515)
* renamed tower-agent and tower-cli with prefix nf-

* new nextflow package version

* added newest versions (today) for nf-tower-agent and nf-tower-cli
2022-12-14 11:44:22 -07:00
Erik Heeren
80e30222e1 New neuroscience packages: py-bmtk, py-neurotools (#34464)
* Add py-bmtk and py-neurotools

* py-bmtk: version bump

* [@spackbot] updating style on behalf of heerener

* Maybe the copyright needs to be extended to 2022 for the check to pass

* Process review remarks

* Update var/spack/repos/builtin/packages/py-neurotools/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-14 12:00:26 -06:00
eugeneswalker
55356e9edb bufr: add v11.6, 11.7, 11.7.1 (#34509) 2022-12-14 06:24:26 -07:00
eugeneswalker
eec09f791d fms: add v2019.01.03 (#34511) 2022-12-14 04:50:00 -07:00
Harmen Stoppels
9032179b34 Use update-index --mirror-url <url> instead of -d <url> (#34519) 2022-12-14 10:03:18 +01:00
Alberto Sartori
45b40115fb justbuild: add v1.0.0 (#34467) 2022-12-14 01:17:42 -07:00
snehring
e030833129 r-rgdal: adding new version 1.6-2 (#34502) 2022-12-13 20:06:31 -07:00
Harmen Stoppels
e055dc0e64 Use file paths/urls correctly (#34452)
The main issue that's fixed is that Spack passes paths (as strings) to
functions that require URLs. That wasn't an issue on Unix, since there
you can simply concatenate `file://` and `path` and all is good, but on
Windows that gives invalid file URLs. Also, on Unix, Spack would not deal with URI encoding like x%20y in file paths.

It also removes Spack's custom url.parse function, which had its own incorrect interpretation of file URLs, taking file://x/y to mean the relative path x/y instead of hostname=x and path=/y. That function also automatically interpolated variables, which is surprising behavior for a function that parses URLs.

Instead of all sorts of ad-hoc `if windows: fix_broken_file_url` fixes, this PR
adds two helper functions around Python's own path2url and its reverse.

Also fixes a bug where some `spack buildcache` commands
used `-d` as a flag to mean `--mirror-url` requiring a URL, and others
`--directory`, requiring a path. It is now the latter consistently.
2022-12-13 23:44:13 +01:00
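For reference, a minimal sketch of such a helper pair built on Python's standard converters; the function names here are illustrative, not necessarily those added by the PR:

```python
# Illustrative sketch only: path <-> file URL helpers around Python's own
# pathname2url/url2pathname (names and details may differ from the PR).
import os
import urllib.parse
import urllib.request


def path_to_file_url(path):
    # Absolute paths avoid the file://x/y host-vs-path ambiguity noted above.
    return urllib.parse.urljoin("file:", urllib.request.pathname2url(os.path.abspath(path)))


def file_url_to_path(url):
    return urllib.request.url2pathname(urllib.parse.urlparse(url).path)
```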
Matthias Wolf
c45729cba1 py-submitit: add 1.4.5 (#34460) 2022-12-13 16:11:14 -06:00
Manuela Kuhn
b02b2f0f00 py-tifffile: add 2022.10.10 (#34499) 2022-12-13 16:09:01 -06:00
Manuela Kuhn
3ded50cc8c py-sphinxcontrib-qthelp: add 1.0.3 (#34495) 2022-12-13 16:08:06 -06:00
Manuela Kuhn
a7280cd5bb py-sqlalchemy: add 1.4.45 (#34497) 2022-12-13 16:07:34 -06:00
Paul Kuberry
2837b47ea5 trilinos: extend range of Teuchos patch (#34504) 2022-12-13 14:49:20 -07:00
Matthew Thompson
ea2c61c683 Update pFunit, add gFTL, gFTL-Shared, fArgParse, pFlogger, yaFyaml (#34476)
* Add GFE packages, Update pFUnit
* Remove citibeth as maintainer per her request
* Version 3.3.0 is an odd duck. Needs a v

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-12-13 22:40:33 +01:00
Bernhard Kaindl
217b34825a py-tensorboard-data-server: build needs rust+rustfmt (#34465) 2022-12-13 10:56:31 -07:00
Mosè Giordano
17d90f4cbc pcre2: add new versions and update URL (#34477) 2022-12-13 10:48:27 -07:00
Annop Wongwathanarat
7a5bd8cac4 gromacs: enable linking with acfl FFT (#34494) 2022-12-13 09:32:42 -08:00
Harmen Stoppels
333da47dc7 Don't fetch to order mirrors (#34359)
When installing binary tarballs, Spack has to download from its
binary mirrors.

Sometimes Spack has cache available for these mirrors.

That cache helps to order mirrors to increase the likelihood of
getting a direct hit.

However, currently, when Spack can't find a spec in any local cache
of mirrors, it's very dumb:

- A while ago it used to query each mirror to see if it had the spec,
  and use that information to reorder the mirrors, only to then redo
  part of what it had just done: fetch the spec from that mirror
- Recently, it was changed to download a full index.json, which
  can be multiple dozens of MBs of data and may take a minute to
  process thanks to the blazing fast performance you get with
  Python.

In a typical use case of concretizing with reuse, the full index.json
is already available, and it is likely that the local cache gives a perfect
mirror ordering on install. (There's typically no need to update any
caches.)

However, in the use case of Gitlab CI, the build jobs don't have cache,
and it would be smart to just do direct fetches instead of all the
redundant work of (1) and/or (2).

Also, direct fetches from mirrors will soon be fast enough to
prefer these direct fetches over the excruciating slowness of
index.json files.
2022-12-13 17:07:11 +01:00
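A rough sketch of the ordering idea, assuming a simple mapping from mirror URL to the set of hashes seen in its cached index (illustrative, not Spack's actual code):

```python
# Illustrative: try mirrors whose cached index already lists the spec first,
# then fall back to direct fetches from the remaining mirrors.
def order_mirrors(spec_hash, cached_indices, mirrors):
    hits = [m for m in mirrors if spec_hash in cached_indices.get(m, set())]
    misses = [m for m in mirrors if m not in hits]
    return hits + misses
```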
dependabot[bot]
8b68b4ae72 build(deps): bump actions/checkout from 3.1.0 to 3.2.0 (#34480)
Bumps [actions/checkout](https://github.com/actions/checkout) from 3.1.0 to 3.2.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](93ea575cb5...755da8c3cf)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-13 09:05:50 -07:00
Adam J. Stewart
40a3fdefa8 py-cartopy: add v0.21.1 (#34482) 2022-12-13 07:12:24 -07:00
Adam J. Stewart
a61474f2c1 libicd: macOS now supported (#34483) 2022-12-13 07:12:00 -07:00
Aidan Heerdegen
b95a75779b Fix markdown links in rst files (#34488) 2022-12-13 14:11:38 +00:00
Harmen Stoppels
0ff6a1bd1c spack/package.py: improve editor support for some +/- static props (#34319) 2022-12-13 13:55:32 +01:00
Massimiliano Culpo
f9cfc2f57e scons: fix signature for install_args (#34481) 2022-12-13 12:21:44 +01:00
Adam J. Stewart
f4fb20e27e py-shapely: add v2.0.0 (#34475) 2022-12-13 09:59:23 +01:00
Massimiliano Culpo
3ff5d49102 Be strict on the markers used in unit tests (#33884) 2022-12-13 09:21:57 +01:00
Erik Heeren
238d4f72f5 py-pyld: add with dependency (#34472)
* py-pyld: add with dependency

* py-pyld and py-frozendict: update copyright expiration

* [@spackbot] updating style on behalf of heerener
2022-12-12 20:15:43 -07:00
Matthias Wolf
c5bc469eeb py-sh: new versions (#34458)
* py-sh: new versions

* style
2022-12-12 20:15:28 -07:00
Sam Grayson
b01e7dca9d Update packages for running azure (#34403)
* Update packages for running azure

* Update py-msal-extensions

* Respond to comments
2022-12-12 21:10:50 -06:00
Jean Luca Bez
c62906f781 New python package: Drishti (#33316)
* include Drishti

* fix syntax

* Update var/spack/repos/builtin/packages/drishti/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

* Update var/spack/repos/builtin/packages/drishti/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-12 19:33:50 -07:00
Sam Grayson
94bac8d6dd Add new package: micromamba (#34195)
* Add new packages

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* style

* wip

* Respond to comments

* Respond to comments

* Spack style

* Remove linkage=full_static to pass package audit

* Spack style

* Moved tl-expected version
2022-12-12 14:00:41 -06:00
Manuela Kuhn
cd9c9b47e8 py-sphinxcontrib-devhelp: add 1.0.2 (#34462)
* py-sphinxcontrib-devhelp: add 1.0.2

* Update var/spack/repos/builtin/packages/py-sphinxcontrib-devhelp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-12 13:57:06 -06:00
Manuela Kuhn
8560295529 py-sphinxcontrib-applehelp: add 1.0.2 (#34461)
* py-sphinxcontrib-applehelp: add 1.0.2

* Update var/spack/repos/builtin/packages/py-sphinxcontrib-applehelp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-12 13:56:48 -06:00
Adam J. Stewart
fd248ad0b8 GEOS: add v3.10-3.11 (#34473) 2022-12-12 11:50:03 -08:00
renjithravindrankannath
0578ccc0e6 ROCm 5.3.0 updates (#33320)
* ROCm 5.3.0 updates
* New patches for 5.3.0 on hip and hsakmt
* Adding additional build arguments in hip and llvm
* RVS updates for 5.3.0 release
* New patches and rocm-tensile, rocprofiler-dev, roctracer-dev recipe updates for 5.3.0
* Reverting OPENMP fix from rocm-tensile
* Removing the patch to compile without git and adding the change without it
* Install libraries into the lib directory instead of lib64 across all platforms
* Setting lib install directory to lib
* Disable gallivm coroutine for libllvm15
* Update llvm-amdgpu prefix path in hip-config.cmake.in
  Removing libllvm15 from the Mesa dependency
* hip-config.cmake.in update required from 5.2
* hip-config.cmake.in update required from 5.2 and above
* hip-config.cmake.in update required for all releases 5.2 and above
* Style check correction in hip update
* ginkgo: add missing include
* Patching hsa include path for rocm 5.3
* Restricting patch for llvm-15
* Style check error correction
* PIC flag required for the new test applications
* Passing -DCMAKE_POSITION_INDEPENDENT_CODE=ON in the cmake_args instead of setting -fPIC in CFLAGS

Co-authored-by: Cordell Bloor <Cordell.Bloor@amd.com>
2022-12-12 13:46:20 -06:00
Glenn Johnson
fcc2ab8b4b julia: have recipe explicitly use Spack compiler wrapper (#34365) 2022-12-12 19:53:26 +01:00
Vanessasaurus
76511ac039 Automated deployment to update package flux-core 2022-12-12 (#34456) 2022-12-12 11:47:36 -07:00
Jim Edwards
e4547982b3 allow esmf to use parallelio without mpi (#34182)
* allow esmf to use parallelio without mpi
* add hash for 8.4.0
* spack no longer sets arch to cray
2022-12-12 09:50:41 -08:00
Manuela Kuhn
80722fbaa3 py-snowballstemmer: add 2.2.0 (#34459) 2022-12-12 10:23:55 -06:00
Wouter Deconinck
c2fa444344 geant4: rm preference for 10.7.3 now that 11.1.0 is out (#34445) 2022-12-12 09:05:47 -07:00
Stephen Sachs
088ece1219 [texinfo] @7.0: needs c-11 syntax (#34261)
gnulib/lib/malloca.c uses the single-argument `static_assert()`, which is only
available with C11 syntax. `gcc` seems to be fine, but `icc` needs an extra flag.

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-12-12 16:52:26 +01:00
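One hedged way to express that in a recipe is a `flag_handler` that appends the compiler's C11 flag for Intel; this is a sketch, not necessarily the fix the PR uses:

```python
# Sketch only: append the C11 flag for icc so single-argument static_assert()
# compiles (the actual texinfo fix may differ).
def flag_handler(self, name, flags):
    if name == "cflags" and self.spec.satisfies("%intel"):
        flags.append(self.compiler.c11_flag)
    return (flags, None, None)
```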
Veselin Dobrev
fcdd275564 MFEM: fix issue with cxxflags (#34435) 2022-12-12 16:52:00 +01:00
Mikael Simberg
b6d6a1ab2c Build tests for fmt conditionally (#34424) 2022-12-12 16:49:05 +01:00
Robert Blake
7efcb5ae73 Fixes to the silo packages for 4.11. (#34275) 2022-12-12 07:39:24 -07:00
Mikael Simberg
06e6389258 stdexec: skip build phase (#34425)
Since it's a header-only library there's nothing to build. However, the
default targets include tests and examples, and there's no option to turn
them off at configure time.
2022-12-12 07:16:40 -07:00
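For illustration, a header-only recipe can make the build phase a no-op roughly like this (a sketch; the real stdexec recipe may differ):

```python
from spack.package import *


class Stdexec(CMakePackage):
    """Sketch of a header-only package that skips the build phase."""

    def build(self, spec, prefix):
        pass  # header-only: nothing to compile
```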
Simon Flood
b7f0f7879d foam-extend: add v4.1 (released Oct 2019) (#34398) 2022-12-12 07:16:17 -07:00
Bernhard Kaindl
f7cfbe2702 hdf5: "hdf5@1.13:" needs a depends_on "cmake@3.18:" for build. (#34447) 2022-12-12 15:12:55 +01:00
Wouter Deconinck
1466f8d602 geant4-data: depends_on g4emlow@7.9.1 when @10.6 (#34444)
Per https://geant4.web.cern.ch/node/1837 the correct dependency for 10.6 is on `g4emlow@7.9.1`, not on both `g4emlow@7.9` and `g4emlow@7.9.1`.

This is a minor cosmetic fix. The concretization for 10.6 works just fine here. But this removes the duplicate entry.
2022-12-12 07:11:42 -07:00
Glenn Johnson
9fdb36585f Fix openblas build with intel compiler (#34432)
This PR patches the f_check script to detect the ifort compiler and
ensure that F_COMPILER is set to INTEL. This problem was introduced with
openblas-0.3.21. Without this patch, the value of F_COMPILER falls back
to G77, and icc rather than ifort is used for the linking stage. That
results in the openblas library missing libifcore, which in turn means
many Fortran programs cannot be compiled with ifort.
2022-12-12 14:27:54 +01:00
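In recipe terms, such a fix is typically carried as a conditional patch; a hypothetical sketch (the patch file name and constraint are illustrative):

```python
# Hypothetical: apply the f_check fix only where the regression exists.
patch("f_check-detect-ifort.patch", when="@0.3.21: %intel")
```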
Filippo Spiga
1f0a9fdc11 Adding NVIDIA HPC SDK 22.11 (#33954) 2022-12-12 14:26:39 +01:00
Jen Herting
0baba62900 arrow: dependency fixes (#33666)
+python needs more dependencies
don't look for dependency spec when it's not there
2022-12-12 14:26:02 +01:00
iarspider
4a0e34eda8 Add checksum for py-prometheus-client 0.14.1 (#34259) 2022-12-12 13:32:02 +01:00
Luke Diorio-Toth
88f2f59d92 Added ARM/aarch64 conflict to Eddy/Rivas lab tools (#34190) 2022-12-12 13:26:57 +01:00
Bernhard Kaindl
c1d11975f5 intel-parallel-studio: package is only available for x86_64 (#34392) 2022-12-12 12:09:29 +01:00
Glenn Johnson
cca56291c6 libgit2: add pcre dependency for @0.99: (#34289) 2022-12-12 11:55:49 +01:00
dependabot[bot]
ef155c16f0 build(deps): bump actions/setup-python from 4.3.0 to 4.3.1 (#34413)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.3.0 to 4.3.1.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](13ae5bb136...2c3dd9e7e2)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-12 11:37:06 +01:00
Adam J. Stewart
0952d314bd py-pytorch-lightning: add v1.8.4 (#34426) 2022-12-12 11:35:20 +01:00
Wileam Y. Phan
f29ac34558 nvhpc: add v22.11 (#34410) 2022-12-12 11:35:00 +01:00
snehring
47628521b9 delly2: add v1.1.6 (#34411) 2022-12-12 11:31:26 +01:00
Todd Gamblin
62da76cb5d directives: depends_on should not admit anonymous specs (#34368)
Writing a long dependency like:

```python
     depends_on(
         "llvm"
         "targets=amdgpu,bpf,nvptx,webassembly"
         "version_suffix=jl +link_llvm_dylib ~internal_unwind"
     )
```

when it should be formatted like this:

```python
     depends_on(
         "llvm"
         " targets=amdgpu,bpf,nvptx,webassembly"
         " version_suffix=jl +link_llvm_dylib ~internal_unwind"
     )
```

can cause really subtle errors. Specifically, you'll get something like this in
the package sanity tests:

```
    AttributeError: 'NoneType' object has no attribute 'rpartition'
```

because Spack happily constructs a class that has a dependency with name `None`.

We can catch this earlier by banning anonymous dependency specs directly in
`depends_on()`.  This causes the package itself to fail to parse, and emits
a much better error message:

```
==> Error: Invalid dependency specification in package 'julia':
    llvmtargets=amdgpu,bpf,nvptx,webassemblyversion_suffix=jl +link_llvm_dylib ~internal_unwind
```
2022-12-12 11:24:28 +01:00
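The guard itself can be sketched as follows (illustrative; the PR's actual check may be worded differently):

```python
# Illustrative: reject anonymous dependency specs inside depends_on().
import spack.spec


def _ensure_named(spec_string, pkg_name):
    if not spack.spec.Spec(spec_string).name:
        raise ValueError(
            "Invalid dependency specification in package '{0}': {1}".format(
                pkg_name, spec_string
            )
        )
```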
Brian Vanderwende
65c914fff7 netcdf-c: add libxml2 when +dap (#34178) 2022-12-12 11:04:38 +01:00
Mikael Simberg
dd7b2deb47 Only restrict CMake version in Umpire when examples and rocm are enabled (#32025)
* Only restrict CMake version in umpire when examples and rocm are enabled

* Add CMAKE_HIP_ARCHITECTURES to Umpire and lift cmake version restriction

Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2022-12-12 10:55:37 +01:00
Adam J. Stewart
7d72aeb4fe py-tensorboard-data-server: add Linux aarch64 support (#34437) 2022-12-12 10:40:48 +01:00
John W. Parent
43d97afd8b Bump CMake version to 3.25.1 (#34336) 2022-12-12 10:35:27 +01:00
Robert Cohn
39f13853ba intel-oneapi-* conflicts for non linux, x86 (#34441) 2022-12-12 09:23:14 +01:00
Sebastian Pipping
d65b9c559a expat: Add latest release 2.5.0 with security fixes (#34453) 2022-12-12 00:08:44 -07:00
Stephen Sachs
bde5720a81 glib: Add list_url+list_depth to list versions (#33904)
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
2022-12-12 06:51:09 +01:00
Harmen Stoppels
2371ec7497 openblas: fix bound :7.3 to :7.3.0 (#34443)
This patch:

https://gcc.gnu.org/legacy-ml/gcc-patches/2018-01/msg01962.html

is actually in Amazon Linux GCC 7.3.1, which we use in CI.

So we should not hold openblas back because of it.

Old versions of OpenBLAS fail to detect the host arch of some of the
AVX512 CPUs of build nodes, causing build failures.

Of course we should try to set ARCH properly in OpenBLAS so that it does
not look up the build arch, but that's quite some work.
2022-12-11 19:02:07 +01:00
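For context: Spack version ranges are inclusive and prefix-matched, so `@:7.3` also matches 7.3.1 while `@:7.3.0` does not. A hypothetical directive showing the tightened bound (the real change may use a different directive):

```python
# Hypothetical sketch: with the bound at :7.3.0, GCC 7.3.1 (which already
# carries the upstream patch) is no longer held back.
conflicts("%gcc@:7.3.0", when="@:0.3.20", msg="requires the 2018 upstream GCC fix")
```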
Todd Gamblin
aa3b6e598f pkg grep: use capfd instead of executable for tests 2022-12-10 16:43:44 -08:00
Todd Gamblin
8035eeb36d Revert "Use urllib handler for s3:// and gs://, improve url_exists through HEAD requests (#34324)"
This reverts commit db8f115013.
2022-12-10 16:43:44 -08:00
Michael Kuhn
57383a2294 py-scipy: print error message if no Fortran compiler is available (#34439) 2022-12-10 20:19:50 +01:00
Adam J. Stewart
9517dab409 py-scikit-learn: add v1.2.0 (#34408) 2022-12-10 11:10:31 -06:00
Manuela Kuhn
84fa4e6c4c py-setuptools-scm-git-archive: add 1.4 (#34422) 2022-12-10 09:58:39 -06:00
Harmen Stoppels
f33507961d py-{boto3,botocore,jmespath,s3transfer} bump (#34423) 2022-12-10 09:07:58 -06:00
Adam J. Stewart
46010ef1e1 valgrind: add v3.20.0, mark macOS conflict (#34436) 2022-12-10 12:19:42 +01:00
Abhik Sarkar
f9d9d43b63 Support for building Pmix with Debian/Ubuntu external dependencies (#32690)
* Debian-like distros use the multiarch implementation spec:
https://wiki.ubuntu.com/MultiarchSpec
Instead of being limited to /usr/lib64, architecture-based
lib directories are used. For instance, under Ubuntu a library package
on x86_64 installs binaries under /usr/lib/x86_64-linux-gnu.
Building pmix with external dependencies like hwloc or libevent
fails because, with the prefix set to /usr, that prefix works for
headers and binaries but does not work for libraries: the default
library location /usr/lib64 does not hold the installed libraries.
The pmix build options --with-libevent and --with-libhwloc allow us to
specify the dependent library locations. This commit is an effort to
highlight and resolve such an issue when users want to use Debian-like
distro library packages and use Spack to build pmix.
There may be other packages that are impacted in a similar way.

* Adding a libs property to hwloc and libevent and some cleanups to the pmix patch

* Fixing style and adding a comment on Pmix's 32-bit hwloc version detection issue
2022-12-09 18:30:45 -08:00
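The `libs` property mentioned in the second bullet can be sketched like this (illustrative), letting dependents locate libraries even in Debian multiarch directories:

```python
from llnl.util.filesystem import find_libraries
from spack.package import *


class Libevent(AutotoolsPackage):
    """Sketch: expose installed libraries to dependents."""

    @property
    def libs(self):
        # Recursive search covers multiarch paths like
        # /usr/lib/x86_64-linux-gnu as well as lib/lib64.
        return find_libraries("libevent", root=self.prefix, recursive=True)
```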
Harmen Stoppels
db8f115013 Use urllib handler for s3:// and gs://, improve url_exists through HEAD requests (#34324)
* `url_exists` improvements (take 2)

Make `url_exists` do a HEAD request for http/https/s3 protocols

Rework the opener: construct it once and only once, dynamically dispatch
to the right one based on config.
2022-12-10 00:20:29 +01:00
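A minimal sketch of the HEAD-request behavior for plain http/https (the s3:// and gs:// handlers are omitted; this is an illustration, not the PR's exact code):

```python
import urllib.error
import urllib.request


def url_exists(url, timeout=10):
    # HEAD avoids downloading the body just to test for existence.
    try:
        with urllib.request.urlopen(
            urllib.request.Request(url, method="HEAD"), timeout=timeout
        ):
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise
```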
Manuela Kuhn
09b5476049 py-simplejson: add 3.18.0 (#34430) 2022-12-09 13:11:30 -07:00
Sinan
14c4896ec2 package/qt-base: add conflict for older gcc (#34420) 2022-12-09 12:47:29 -07:00
Ben Morgan
b5ef5c2eb5 geant4: version bumps for Geant4 11.1.0 release (#34428)
* geant4: version bumps for Geant4 11.1.0

- Version bumps for new data libraries
  - g4ndl 4.7
  - g4emlow 8.2
- Add geant4-data@11.1.0
- Checksum new Geant4 11.1.0 release
  - Limit +python variant to maximum of :11.0 due to removal of
    Geant4Py in 11.1
  - Update CLHEP dependency to at least 2.4.6.0 for this release
  - Update VecGeom dependency to at least 1.2.0 for this release,
    closing version ranges for older releases to prevent multiple
    versions satisfying requirement

* geant4: correct max version for python support
2022-12-09 12:26:22 -07:00
Scott Wittenburg
675afd884d gitlab ci: more resources for paraview and py-torch (#34412) 2022-12-09 11:58:37 -07:00
shanedsnyder
0f5482dc9a [darshan-runtime, darshan-util, py-darshan]: darshan 3.4.1 release updates (#34294) 2022-12-09 19:56:53 +01:00
Jen Herting
069e5f874c New package: py-torchdiffeq (#34409)
* [py-torchdiffeq] new package

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-12-09 12:38:14 -06:00
Manuela Kuhn
cad01a03cb py-nbformat: add 5.7.0 and new package py-hatch-nodejs-version (#34361) 2022-12-09 12:32:41 -06:00
Manuela Kuhn
f10f8ed013 py-setupmeta: add 3.3.2 (#34421) 2022-12-09 12:32:19 -06:00
Todd Gamblin
d991ec90e3 new command: spack pkg grep to search package files (#34388)
It's very common for us to tell users to grep through the existing Spack packages to
find examples of what they want, and it's also very common for package developers to do
it. Now, searching packages is even easier.

`spack pkg grep` runs grep on all `package.py` files in repos known to Spack. It has no
special options other than the search string; all options passed to it are forwarded
along to `grep`.

```console
> spack pkg grep --help
usage: spack pkg grep [--help] ...

positional arguments:
  grep_args  arguments for grep

options:
  --help     show this help message and exit
```

```console
> spack pkg grep CMakePackage | head -3
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/3dtk/package.py:class _3dtk(CMakePackage):
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/abseil-cpp/package.py:class AbseilCpp(CMakePackage):
/Users/gamblin2/src/spack/var/spack/repos/builtin/packages/accfft/package.py:class Accfft(CMakePackage, CudaPackage):
```

```console
> spack pkg grep -Eho '(\S*)\(PythonPackage\)' | head -3
AwsParallelcluster(PythonPackage)
Awscli(PythonPackage)
Bueno(PythonPackage)
```

## Return Value

This retains the return value semantics of `grep`:
* 0 for found
* 1 for not found
* >1 for error

## Choosing a `grep`

You can set the ``SPACK_GREP`` environment variable to choose the ``grep``
executable this command should use.
2022-12-09 10:07:54 -08:00
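The core of the command can be sketched in a few lines (illustrative; the real implementation handles repos and errors more carefully):

```python
# Illustrative sketch: forward arguments to grep over all known package.py
# files, preserving grep's return-code semantics.
import os
import subprocess

import spack.repo


def pkg_grep(grep_args):
    grep = os.environ.get("SPACK_GREP", "grep")
    files = [
        spack.repo.path.filename_for_package_name(name)
        for name in spack.repo.path.all_package_names()
    ]
    return subprocess.call([grep, *grep_args, *files])
```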
snehring
8353d1539f py-torchvision: fix typo in version restriction for ffmpeg (#34415) 2022-12-09 11:05:43 -07:00
iarspider
bf3d18bf06 Add checksum for py-packaging11 0.12.3 (#34402) 2022-12-09 06:43:44 -07:00
John W. Parent
0e69710f41 Windows: reenable unit tests (#33385)
Unit tests on Windows are supposed to pass for any PR to pass CI.
However, the return code for the unit test command was not being
checked, which meant this check was always passing (effectively
disabled). This PR

* Properly checks the result of the unit tests and fails if the
  unit tests fail
* Fixes (or disables on Windows) a number of tests which have
  "drifted" out of support on Windows since this check was
  effectively disabled
2022-12-09 13:27:46 +00:00
Harmen Stoppels
ec62150ed7 binary distribution: warn about issues (#34152) 2022-12-09 13:25:32 +01:00
Massimiliano Culpo
d37dc37504 btop++: add new package (#34399) 2022-12-09 12:59:46 +01:00
iarspider
38d37897d4 Add checksum for py-onnxmltools 1.11.1 (#34400) 2022-12-09 04:04:20 -07:00
Todd Gamblin
606eef43bd bugfix: spack load shell test can fail on macos (#34419)
At some point the `a` mock package became an `AutotoolsPackage`, and that means it
depends on `gnuconfig` on macOS. This was causing one of our shell tests to fail on
macOS because it was testing for `{a.prefix.bin}:{b.prefix.bin}` in `PATH`, but
`gnuconfig` shows up between them.

- [x] simplify the test to check `spack load --sh a` and `spack load --sh b` separately
2022-12-09 10:36:54 +00:00
Mikael Simberg
02a30f8d95 Add pika-algorithms package and pika 0.11.0 (#34397)
* Add 20 as a valid option for cxxstd to fmt

* Add pika 0.11.0

* Fix version constraint for p2300 variant in pika package

* Add pika-algorithms package
2022-12-09 11:26:48 +01:00
Harmen Stoppels
7e054cb7fc s3: cache client instance (#34372) 2022-12-09 08:50:32 +01:00
Manuela Kuhn
d29cb87ecc py-reportlab: add 3.6.12 (#34396)
* py-reportlab: add 3.6.12

* Update var/spack/repos/builtin/packages/py-reportlab/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-08 20:08:00 -06:00
Bernhard Kaindl
f8c0d9728d intel-mkl: It is only available for x86_64 (#34391) 2022-12-08 18:10:00 -07:00
Bernhard Kaindl
f5bff16745 bcache: Fix check for libintl to work correctly (#34383) 2022-12-08 17:37:10 -07:00
Adam J. Stewart
2d1cb6d64a bash: add v5.2, readline patches (#34301) 2022-12-08 13:46:21 -07:00
Peter Scheibel
c6e35da2c7 Cray manifest: automatically convert 'cray' platform to 'linux' (#34177)
* Automatically convert 'cray' platform to 'linux'
2022-12-08 11:28:06 -08:00
Manuela Kuhn
f1cd327186 py-rdflib: add 6.2.0 (#34394) 2022-12-08 13:07:26 -06:00
Victor Lopez Herrero
391ad8cec4 dlb: new package (#34211) 2022-12-08 05:57:48 -07:00
Larry Knox
2c668f4bfd Update hdf5 vol async version (#34376)
* Add version hdf5-vol-async@1.4
2022-12-08 05:37:34 -07:00
Glenn Johnson
52fdae83f0 pixman: add libs property (#34281) 2022-12-08 06:34:49 +01:00
Michael Kuhn
0ea81affd1 py-torch: fix build with gcc@12: (#34352) 2022-12-08 06:31:00 +01:00
Brian Van Essen
ddc6e233c7 libxcrypt: building @:4.4.17 requires automake@1.14: 2022-12-08 03:17:28 +01:00
Jon Rood
7ee4499f2b Add texinfo dependency for binutils through version 2.38. (#34173) 2022-12-08 03:08:37 +01:00
Marco De La Pierre
641adae961 Add recipe for singularity-hpc, py-spython (#34234)
* adding recipe for singularity-hpc - 1st go

* typo in singularity-hpc recipe

* singularity-hpc, spython recipes: added platform variant

* singularity-hpc, spython recipes: platform variant renamed to runtime

* style fix

* another style fix

* yet another style fix (why are they not reported altogether)

* singularity-hpc recipe: added Vanessa as maintainer

* singularity-hpc recipe: add podman variant

* singularity-hpc recipe: added variant for module system

* shpc recipe: add version for py-semver dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-spython recipe: no need to specify generic python dep for a python pkg

* py-spython: py-requests not needed

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-12-07 20:07:30 -06:00
John W. Parent
aed77efb9a Windows: Prevent SameFileError when rpathing (#34332) 2022-12-07 16:58:44 -08:00
274 changed files with 4696 additions and 2313 deletions


@@ -19,8 +19,8 @@ jobs:
package-audits:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages


@@ -24,7 +24,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -62,7 +62,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -99,7 +99,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -133,7 +133,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup repo
@@ -158,7 +158,7 @@ jobs:
run: |
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -179,7 +179,7 @@ jobs:
run: |
brew install tree
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
- name: Bootstrap clingo
run: |
set -ex
@@ -204,7 +204,7 @@ jobs:
runs-on: ubuntu-20.04
steps:
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup repo
@@ -247,7 +247,7 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -283,7 +283,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- name: Setup non-root user
@@ -316,7 +316,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
@@ -333,7 +333,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh


@@ -50,7 +50,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
- name: Set Container Tag Normal (Nightly)
run: |


@@ -35,7 +35,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0


@@ -1,6 +1,4 @@
# (c) 2021 Lawrence Livermore National Laboratory
Set-Location spack
# (c) 2022 Lawrence Livermore National Laboratory
git config --global user.email "spack@example.com"
git config --global user.name "Test User"


@@ -47,10 +47,10 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -94,10 +94,10 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -133,7 +133,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -151,10 +151,10 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -185,10 +185,10 @@ jobs:
matrix:
python-version: ["3.10"]
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages


@@ -18,8 +18,8 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
with:
python-version: '3.11'
cache: 'pip'
@@ -35,10 +35,10 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9 # @v2
with:
python-version: '3.11'
cache: 'pip'


@@ -10,15 +10,15 @@ concurrency:
defaults:
run:
shell:
powershell Invoke-Expression -Command ".\share\spack\qa\windows_test_setup.ps1"; {0}
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
with:
python-version: 3.9
- name: Install Python packages
@@ -26,13 +26,11 @@ jobs:
python -m pip install --upgrade pip six pywin32 setuptools codecov pytest-cov clingo
- name: Create local develop
run: |
.\spack\.github\workflows\setup_git.ps1
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
echo F|xcopy .\spack\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
cd spack
dir
spack unit-test -x --verbose --cov --cov-config=pyproject.toml --ignore=lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
@@ -41,10 +39,10 @@ jobs:
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
with:
python-version: 3.9
- name: Install Python packages
@@ -52,12 +50,11 @@ jobs:
python -m pip install --upgrade pip six pywin32 setuptools codecov coverage pytest-cov clingo
- name: Create local develop
run: |
.\spack\.github\workflows\setup_git.ps1
./.github/workflows/setup_git.ps1
- name: Command Unit Test
run: |
echo F|xcopy .\spack\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
cd spack
spack unit-test -x --verbose --cov --cov-config=pyproject.toml lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
@@ -66,10 +63,10 @@ jobs:
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
- uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
with:
python-version: 3.9
- name: Install Python packages
@@ -78,81 +75,81 @@ jobs:
- name: Build Test
run: |
spack compiler find
echo F|xcopy .\spack\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
spack external find cmake
spack external find ninja
spack -d install abseil-cpp
make-installer:
runs-on: windows-latest
steps:
- name: Disable Windows Symlinks
run: |
git config --global core.symlinks false
shell:
powershell
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six pywin32 setuptools
- name: Add Light and Candle to Path
run: |
$env:WIX >> $GITHUB_PATH
- name: Run Installer
run: |
.\spack\share\spack\qa\setup_spack.ps1
spack make-installer -s spack -g SILENT pkg
echo "installer_root=$((pwd).Path)" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
env:
ProgressPreference: SilentlyContinue
- uses: actions/upload-artifact@83fd05a356d7e2593de66fc9913b3002723633cb
with:
name: Windows Spack Installer Bundle
path: ${{ env.installer_root }}\pkg\Spack.exe
- uses: actions/upload-artifact@83fd05a356d7e2593de66fc9913b3002723633cb
with:
name: Windows Spack Installer
path: ${{ env.installer_root}}\pkg\Spack.msi
execute-installer:
needs: make-installer
runs-on: windows-latest
defaults:
run:
shell: pwsh
steps:
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six pywin32 setuptools
- name: Setup installer directory
run: |
mkdir -p spack_installer
echo "spack_installer=$((pwd).Path)\spack_installer" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
- uses: actions/download-artifact@v3
with:
name: Windows Spack Installer Bundle
path: ${{ env.spack_installer }}
- name: Execute Bundled Installer
run: |
$proc = Start-Process ${{ env.spack_installer }}\spack.exe "/install /quiet" -Passthru
$handle = $proc.Handle # cache proc.Handle
$proc.WaitForExit();
$LASTEXITCODE
env:
ProgressPreference: SilentlyContinue
- uses: actions/download-artifact@v3
with:
name: Windows Spack Installer
path: ${{ env.spack_installer }}
- name: Execute MSI
run: |
$proc = Start-Process ${{ env.spack_installer }}\spack.msi "/quiet" -Passthru
$handle = $proc.Handle # cache proc.Handle
$proc.WaitForExit();
$LASTEXITCODE
# TODO: johnwparent - reduce the size of the installer operations
# make-installer:
# runs-on: windows-latest
# steps:
# - name: Disable Windows Symlinks
# run: |
# git config --global core.symlinks false
# shell:
# powershell
# - uses: actions/checkout@755da8c3cf115ac066823e79a1e1788f8940201b
# with:
# fetch-depth: 0
# - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
# with:
# python-version: 3.9
# - name: Install Python packages
# run: |
# python -m pip install --upgrade pip six pywin32 setuptools
# - name: Add Light and Candle to Path
# run: |
# $env:WIX >> $GITHUB_PATH
# - name: Run Installer
# run: |
# ./share/spack/qa/setup_spack_installer.ps1
# spack make-installer -s . -g SILENT pkg
# echo "installer_root=$((pwd).Path)" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
# env:
# ProgressPreference: SilentlyContinue
# - uses: actions/upload-artifact@83fd05a356d7e2593de66fc9913b3002723633cb
# with:
# name: Windows Spack Installer Bundle
# path: ${{ env.installer_root }}\pkg\Spack.exe
# - uses: actions/upload-artifact@83fd05a356d7e2593de66fc9913b3002723633cb
# with:
# name: Windows Spack Installer
# path: ${{ env.installer_root}}\pkg\Spack.msi
# execute-installer:
# needs: make-installer
# runs-on: windows-latest
# defaults:
# run:
# shell: pwsh
# steps:
# - uses: actions/setup-python@2c3dd9e7e29afd70cc0950079bde6c979d1f69f9
# with:
# python-version: 3.9
# - name: Install Python packages
# run: |
# python -m pip install --upgrade pip six pywin32 setuptools
# - name: Setup installer directory
# run: |
# mkdir -p spack_installer
# echo "spack_installer=$((pwd).Path)\spack_installer" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
# - uses: actions/download-artifact@v3
# with:
# name: Windows Spack Installer Bundle
# path: ${{ env.spack_installer }}
# - name: Execute Bundled Installer
# run: |
# $proc = Start-Process ${{ env.spack_installer }}\spack.exe "/install /quiet" -Passthru
# $handle = $proc.Handle # cache proc.Handle
# $proc.WaitForExit();
# $LASTEXITCODE
# env:
# ProgressPreference: SilentlyContinue
# - uses: actions/download-artifact@v3
# with:
# name: Windows Spack Installer
# path: ${{ env.spack_installer }}
# - name: Execute MSI
# run: |
# $proc = Start-Process ${{ env.spack_installer }}\spack.msi "/quiet" -Passthru
# $handle = $proc.Handle # cache proc.Handle
# $proc.WaitForExit();
# $LASTEXITCODE


@@ -2397,13 +2397,15 @@ this because uninstalling the dependency would break the package.
``build``, ``link``, and ``run`` dependencies all affect the hash of Spack
packages (along with ``sha256`` sums of patches and archives used to build the
package, and a [canonical hash](https://github.com/spack/spack/pull/28156) of
package, and a `canonical hash <https://github.com/spack/spack/pull/28156>`_ of
the ``package.py`` recipes). ``test`` dependencies do not affect the package
hash, as they are only used to construct a test environment *after* building and
installing a given package installation. Older versions of Spack did not include
build dependencies in the hash, but this has been
[fixed](https://github.com/spack/spack/pull/28504) as of [Spack
``v0.18``](https://github.com/spack/spack/releases/tag/v0.18.0)
build dependencies in the hash, but this has been
`fixed <https://github.com/spack/spack/pull/28504>`_ as of |Spack v0.18|_.
.. |Spack v0.18| replace:: Spack ``v0.18``
.. _Spack v0.18: https://github.com/spack/spack/releases/tag/v0.18.0
If the dependency type is not specified, Spack uses a default of
``('build', 'link')``. This is the common case for compiler languages.

View File

@@ -99,7 +99,9 @@ def getuid():
def rename(src, dst):
# On Windows, os.rename will fail if the destination file already exists
if is_windows:
if os.path.exists(dst):
# Windows path existence checks will sometimes fail on junctions/links/symlinks
# so check for that case
if os.path.exists(dst) or os.path.islink(dst):
os.remove(dst)
os.rename(src, dst)
@@ -288,7 +290,10 @@ def groupid_to_group(x):
shutil.copy(filename, tmp_filename)
try:
extra_kwargs = {"errors": "surrogateescape"}
# To avoid translating line endings (\n to \r\n and vis versa)
# we force os.open to ignore translations and use the line endings
# the file comes with
extra_kwargs = {"errors": "surrogateescape", "newline": ""}
# Open as a text file and filter until the end of the file is
# reached or we found a marker in the line if it was specified
@@ -2278,10 +2283,17 @@ def add_rpath(self, *paths):
"""
self._addl_rpaths = self._addl_rpaths | set(paths)
def _link(self, path, dest):
def _link(self, path, dest_dir):
"""Perform link step of simulated rpathing, installing
simlinks of file in path to the dest_dir
location. This method deliberately prevents
the case where a path points to a file inside the dest_dir.
This is because it is both meaningless from an rpath
perspective, and will cause an error when Developer
mode is not enabled"""
file_name = os.path.basename(path)
dest_file = os.path.join(dest, file_name)
if os.path.exists(dest):
dest_file = os.path.join(dest_dir, file_name)
if os.path.exists(dest_dir) and not dest_file == path:
try:
symlink(path, dest_file)
# For py2 compatibility, we have to catch the specific Windows error code
@@ -2295,7 +2307,7 @@ def _link(self, path, dest):
"Linking library %s to %s failed, " % (path, dest_file) + "already linked."
if already_linked
else "library with name %s already exists at location %s."
% (file_name, dest)
% (file_name, dest_dir)
)
pass
else:


@@ -266,10 +266,7 @@ def find_by_hash(self, find_hash, mirrors_to_check=None):
None, just assumes all configured mirrors.
"""
if find_hash not in self._mirrors_for_spec:
# Not found in the cached index, pull the latest from the server.
self.update(with_cooldown=True)
if find_hash not in self._mirrors_for_spec:
return None
return []
results = self._mirrors_for_spec[find_hash]
if not mirrors_to_check:
return results
@@ -418,7 +415,12 @@ def update(self, with_cooldown=False):
if all_methods_failed:
raise FetchCacheError(fetch_errors)
elif spec_cache_regenerate_needed:
if fetch_errors:
tty.warn(
"The following issues were ignored while updating the indices of binary caches",
FetchCacheError(fetch_errors),
)
if spec_cache_regenerate_needed:
self.regenerate_spec_cache(clear_existing=spec_cache_clear_needed)
def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
@@ -504,9 +506,9 @@ def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
if fetched_hash is not None and locally_computed_hash != fetched_hash:
msg = (
"Computed hash ({0}) did not match remote ({1}), "
"Computed index hash [{0}] did not match remote [{1}, url:{2}] "
"indicating error in index transmission"
).format(locally_computed_hash, expect_hash)
).format(locally_computed_hash, fetched_hash, hash_fetch_url)
errors.append(RuntimeError(msg))
# We somehow got an index that doesn't match the remote one, maybe
# the next time we try we'll be successful.
@@ -1181,7 +1183,7 @@ def generate_key_index(key_prefix, tmpdir=None):
def _build_tarball(
spec,
outdir,
out_url,
force=False,
relative=False,
unsigned=False,
@@ -1204,8 +1206,7 @@ def _build_tarball(
tarfile_dir = os.path.join(cache_prefix, tarball_directory_name(spec))
tarfile_path = os.path.join(tarfile_dir, tarfile_name)
spackfile_path = os.path.join(cache_prefix, tarball_path_name(spec, ".spack"))
remote_spackfile_path = url_util.join(outdir, os.path.relpath(spackfile_path, tmpdir))
remote_spackfile_path = url_util.join(out_url, os.path.relpath(spackfile_path, tmpdir))
mkdirp(tarfile_dir)
if web_util.url_exists(remote_spackfile_path):
@@ -1224,7 +1225,7 @@ def _build_tarball(
signed_specfile_path = "{0}.sig".format(specfile_path)
remote_specfile_path = url_util.join(
outdir, os.path.relpath(specfile_path, os.path.realpath(tmpdir))
out_url, os.path.relpath(specfile_path, os.path.realpath(tmpdir))
)
remote_signed_specfile_path = "{0}.sig".format(remote_specfile_path)
@@ -1329,12 +1330,12 @@ def _build_tarball(
# push the key to the build cache's _pgp directory so it can be
# imported
if not unsigned:
push_keys(outdir, keys=[key], regenerate_index=regenerate_index, tmpdir=tmpdir)
push_keys(out_url, keys=[key], regenerate_index=regenerate_index, tmpdir=tmpdir)
# create an index.json for the build_cache directory so specs can be
# found
if regenerate_index:
generate_package_index(url_util.join(outdir, os.path.relpath(cache_prefix, tmpdir)))
generate_package_index(url_util.join(out_url, os.path.relpath(cache_prefix, tmpdir)))
finally:
shutil.rmtree(tmpdir)
@@ -2079,8 +2080,8 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
spec (spack.spec.Spec): The spec to look for in binary mirrors
mirrors_to_check (dict): Optionally override the configured mirrors
with the mirrors in this dictionary.
index_only (bool): Do not attempt direct fetching of ``spec.json``
files from remote mirrors, only consider the indices.
index_only (bool): When ``index_only`` is set to ``True``, only the local
cache is checked, no requests are made.
Return:
A list of objects, each containing a ``mirror_url`` and ``spec`` key


@@ -545,8 +545,9 @@ def ensure_core_dependencies():
"""Ensure the presence of all the core dependencies."""
if sys.platform.lower() == "linux":
ensure_patchelf_in_path_or_raise()
if not IS_WINDOWS:
ensure_gpg_in_path_or_raise()
ensure_clingo_importable_or_raise()
ensure_gpg_in_path_or_raise()
def all_core_root_specs():


@@ -37,14 +37,12 @@
import multiprocessing
import os
import re
import shutil
import sys
import traceback
import types
from typing import List, Tuple
import llnl.util.tty as tty
from llnl.util.filesystem import install, install_tree, mkdirp
from llnl.util.lang import dedupe
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
@@ -52,6 +50,7 @@
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.build_systems.python
import spack.builder
import spack.config
import spack.install_test
@@ -586,9 +585,6 @@ def set_module_variables_for_package(pkg):
m.gmake = MakeExecutable("gmake", jobs)
m.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# easy shortcut to os.environ
m.env = os.environ
# Find the configure script in the archive path
# Don't use which for this; we want to find it in the current dir.
m.configure = Executable("./configure")
@@ -608,21 +604,6 @@ def set_module_variables_for_package(pkg):
m.spack_f77 = os.path.join(link_dir, pkg.compiler.link_paths["f77"])
m.spack_fc = os.path.join(link_dir, pkg.compiler.link_paths["fc"])
# Emulate some shell commands for convenience
m.pwd = os.getcwd
m.cd = os.chdir
m.mkdir = os.mkdir
m.makedirs = os.makedirs
m.remove = os.remove
m.removedirs = os.removedirs
m.symlink = symlink
m.mkdirp = mkdirp
m.install = install
m.install_tree = install_tree
m.rmtree = shutil.rmtree
m.move = shutil.move
# Useful directories within the prefix are encapsulated in
# a Prefix object.
m.prefix = pkg.prefix


@@ -10,6 +10,7 @@
from llnl.util.filesystem import find_headers, find_libraries, join_path
from spack.directives import conflicts
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -25,6 +26,16 @@ class IntelOneApiPackage(Package):
# organization (e.g. University/Company).
redistribute_source = False
for c in [
"target=ppc64:",
"target=ppc64le:",
"target=aarch64:",
"platform=darwin:",
"platform=cray:",
"platform=windows:",
]:
conflicts(c, msg="This package in only available for x86_64 and Linux")
@staticmethod
def update_description(cls):
"""Updates oneapi package descriptions with common text."""


@@ -46,10 +46,10 @@ class SConsBuilder(BaseBuilder):
phases = ("build", "install")
#: Names associated with package methods in the old build-system format
legacy_methods = ("install_args", "build_test")
legacy_methods = ("build_test",)
#: Same as legacy_methods, but the signature is different
legacy_long_methods = ("build_args",)
legacy_long_methods = ("build_args", "install_args")
#: Names associated with package attributes in the old build-system format
legacy_attributes = ("build_time_test_callbacks",)
@@ -66,13 +66,13 @@ def build(self, pkg, spec, prefix):
args = self.build_args(spec, prefix)
inspect.getmodule(self.pkg).scons(*args)
def install_args(self):
def install_args(self, spec, prefix):
"""Arguments to pass to install."""
return []
def install(self, pkg, spec, prefix):
"""Install the package."""
args = self.install_args()
args = self.install_args(spec, prefix)
inspect.getmodule(self.pkg).scons("install", *args)


@@ -1264,7 +1264,7 @@ def generate_gitlab_ci_yaml(
final_job["stage"] = "stage-rebuild-index"
final_job["script"] = [
"spack buildcache update-index --keys -d {0}".format(index_target_mirror)
"spack buildcache update-index --keys --mirror-url {0}".format(index_target_mirror)
]
final_job["when"] = "always"
final_job["retry"] = service_job_retries


@@ -8,6 +8,7 @@
import shutil
import sys
import tempfile
import urllib.parse
import llnl.util.tty as tty
@@ -45,7 +46,7 @@ def setup_parser(subparser):
"-r",
"--rel",
action="store_true",
help="make all rpaths relative" + " before creating tarballs.",
help="make all rpaths relative before creating tarballs.",
)
create.add_argument(
"-f", "--force", action="store_true", help="overwrite tarball if it exists."
@@ -54,13 +55,13 @@ def setup_parser(subparser):
"-u",
"--unsigned",
action="store_true",
help="create unsigned buildcache" + " tarballs for testing",
help="create unsigned buildcache tarballs for testing",
)
create.add_argument(
"-a",
"--allow-root",
action="store_true",
help="allow install root string in binary files " + "after RPATH substitution",
help="allow install root string in binary files after RPATH substitution",
)
create.add_argument(
"-k", "--key", metavar="key", type=str, default=None, help="Key for signing."
@@ -71,31 +72,31 @@ def setup_parser(subparser):
"--directory",
metavar="directory",
type=str,
help="local directory where " + "buildcaches will be written.",
help="local directory where buildcaches will be written.",
)
output.add_argument(
"-m",
"--mirror-name",
metavar="mirror-name",
type=str,
help="name of the mirror where " + "buildcaches will be written.",
help="name of the mirror where buildcaches will be written.",
)
output.add_argument(
"--mirror-url",
metavar="mirror-url",
type=str,
help="URL of the mirror where " + "buildcaches will be written.",
help="URL of the mirror where buildcaches will be written.",
)
create.add_argument(
"--rebuild-index",
action="store_true",
default=False,
help="Regenerate buildcache index " + "after building package(s)",
help="Regenerate buildcache index after building package(s)",
)
create.add_argument(
"--spec-file",
default=None,
help=("Create buildcache entry for spec from json or " + "yaml file"),
help="Create buildcache entry for spec from json or yaml file",
)
create.add_argument(
"--only",
@@ -124,19 +125,19 @@ def setup_parser(subparser):
"-a",
"--allow-root",
action="store_true",
help="allow install root string in binary files " + "after RPATH substitution",
help="allow install root string in binary files after RPATH substitution",
)
install.add_argument(
"-u",
"--unsigned",
action="store_true",
help="install unsigned buildcache" + " tarballs for testing",
help="install unsigned buildcache tarballs for testing",
)
install.add_argument(
"-o",
"--otherarch",
action="store_true",
help="install specs from other architectures" + " instead of default platform and OS",
help="install specs from other architectures instead of default platform and OS",
)
arguments.add_common_arguments(install, ["specs"])
@@ -155,7 +156,7 @@ def setup_parser(subparser):
"-a",
"--allarch",
action="store_true",
help="list specs for all available architectures" + " instead of default platform and OS",
help="list specs for all available architectures instead of default platform and OS",
)
arguments.add_common_arguments(listcache, ["specs"])
listcache.set_defaults(func=list_fn)
@@ -204,7 +205,7 @@ def setup_parser(subparser):
check.add_argument(
"--spec-file",
default=None,
help=("Check single spec from json or yaml file instead of release " + "specs file"),
help=("Check single spec from json or yaml file instead of release specs file"),
)
check.set_defaults(func=check_fn)
@@ -217,7 +218,7 @@ def setup_parser(subparser):
download.add_argument(
"--spec-file",
default=None,
help=("Download built tarball for spec (from json or yaml file) " + "from mirror"),
help=("Download built tarball for spec (from json or yaml file) from mirror"),
)
download.add_argument(
"-p", "--path", default=None, help="Path to directory where tarball should be downloaded"
@@ -234,7 +235,7 @@ def setup_parser(subparser):
getbuildcachename.add_argument(
"--spec-file",
default=None,
help=("Path to spec json or yaml file for which buildcache name is " + "desired"),
help=("Path to spec json or yaml file for which buildcache name is desired"),
)
getbuildcachename.set_defaults(func=get_buildcache_name_fn)
@@ -294,7 +295,27 @@ def setup_parser(subparser):
# Update buildcache index without copying any additional packages
update_index = subparsers.add_parser("update-index", help=update_index_fn.__doc__)
update_index.add_argument("-d", "--mirror-url", default=None, help="Destination mirror url")
update_index_out = update_index.add_mutually_exclusive_group(required=True)
update_index_out.add_argument(
"-d",
"--directory",
metavar="directory",
type=str,
help="local directory where buildcaches will be written.",
)
update_index_out.add_argument(
"-m",
"--mirror-name",
metavar="mirror-name",
type=str,
help="name of the mirror where buildcaches will be written.",
)
update_index_out.add_argument(
"--mirror-url",
metavar="mirror-url",
type=str,
help="URL of the mirror where buildcaches will be written.",
)
update_index.add_argument(
"-k",
"--keys",
@@ -305,6 +326,15 @@ def setup_parser(subparser):
update_index.set_defaults(func=update_index_fn)
def _mirror_url_from_args(args):
if args.directory:
return spack.mirror.push_url_from_directory(args.directory)
if args.mirror_name:
return spack.mirror.push_url_from_mirror_name(args.mirror_name)
if args.mirror_url:
return spack.mirror.push_url_from_mirror_url(args.mirror_url)
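The three destination flags above live on a mutually exclusive argparse group, so `_mirror_url_from_args` can assume exactly one of them is set. A minimal, self-contained sketch of that pattern (hypothetical standalone parser, not the full buildcache command):

    import argparse

    parser = argparse.ArgumentParser(prog="demo")
    output = parser.add_mutually_exclusive_group(required=True)
    output.add_argument("-d", "--directory", help="local mirror directory")
    output.add_argument("-m", "--mirror-name", help="named mirror from config")
    output.add_argument("--mirror-url", help="push mirror URL")

    # argparse enforces that exactly one option is given: zero or two of
    # them cause parse_args() to exit with an error.
    args = parser.parse_args(["-d", "/tmp/mirror"])
    assert args.directory == "/tmp/mirror" and args.mirror_name is None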
def _matching_specs(args):
"""Return a list of matching specs read from either a spec file (JSON or YAML),
a query over the store or a query over the active environment.
@@ -323,9 +353,9 @@ def _matching_specs(args):
tty.die(
"build cache file creation requires at least one"
" installed package spec, an active environment,"
" or else a path to a json or yaml file containing a spec"
" to install"
)
@@ -353,15 +383,7 @@ def _concrete_spec_from_args(args):
def create_fn(args):
"""create a binary package and push it to a mirror"""
push_url = _mirror_url_from_args(args)
matches = _matching_specs(args)
msg = "Pushing binary packages to {0}/build_cache".format(push_url)
@@ -575,11 +597,11 @@ def sync_fn(args):
source_location = None
if args.src_directory:
source_location = args.src_directory
scheme = urllib.parse.urlparse(source_location, scheme="<missing>").scheme
if scheme != "<missing>":
raise ValueError('"--src-directory" expected a local path; got a URL, instead')
# Ensure that the mirror lookup does not mistake this for named mirror
source_location = "file://" + source_location
source_location = url_util.path_to_file_url(source_location)
elif args.src_mirror_name:
source_location = args.src_mirror_name
result = spack.mirror.MirrorCollection().lookup(source_location)
@@ -587,7 +609,7 @@ def sync_fn(args):
raise ValueError('no configured mirror named "{name}"'.format(name=source_location))
elif args.src_mirror_url:
source_location = args.src_mirror_url
scheme = urllib.parse.urlparse(source_location, scheme="<missing>").scheme
if scheme == "<missing>":
raise ValueError('"{url}" is not a valid URL'.format(url=source_location))
@@ -598,11 +620,11 @@ def sync_fn(args):
dest_location = None
if args.dest_directory:
dest_location = args.dest_directory
scheme = urllib.parse.urlparse(dest_location, scheme="<missing>").scheme
if scheme != "<missing>":
raise ValueError('"--dest-directory" expected a local path; got a URL, instead')
# Ensure that the mirror lookup does not mistake this for named mirror
dest_location = "file://" + dest_location
dest_location = url_util.path_to_file_url(dest_location)
elif args.dest_mirror_name:
dest_location = args.dest_mirror_name
result = spack.mirror.MirrorCollection().lookup(dest_location)
@@ -610,7 +632,7 @@ def sync_fn(args):
raise ValueError('no configured mirror named "{name}"'.format(name=dest_location))
elif args.dest_mirror_url:
dest_location = args.dest_mirror_url
scheme = urllib.parse.urlparse(dest_location, scheme="<missing>").scheme
if scheme == "<missing>":
raise ValueError('"{url}" is not a valid URL'.format(url=dest_location))
@@ -692,11 +714,8 @@ def update_index(mirror_url, update_keys=False):
def update_index_fn(args):
"""Update a buildcache index."""
outdir = "file://."
if args.mirror_url:
outdir = args.mirror_url
update_index(outdir, update_keys=args.keys)
push_url = _mirror_url_from_args(args)
update_index(push_url, update_keys=args.keys)
def buildcache(parser, args):

View File

@@ -356,7 +356,7 @@ def ci_rebuild(args):
# dependencies from previous stages available since we do not
# allow pushing binaries to the remote mirror during PR pipelines.
enable_artifacts_mirror = True
pipeline_mirror_url = "file://" + local_mirror_dir
pipeline_mirror_url = url_util.path_to_file_url(local_mirror_dir)
mirror_msg = "artifact buildcache enabled, mirror url: {0}".format(pipeline_mirror_url)
tty.debug(mirror_msg)

View File

@@ -7,6 +7,7 @@
import os
import re
import urllib.parse
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
@@ -827,8 +828,8 @@ def get_versions(args, name):
valid_url = True
try:
parsed = urllib.parse.urlparse(args.url)
if not parsed.scheme or parsed.scheme == "file":
valid_url = False # No point in spidering these
except (ValueError, TypeError):
valid_url = False
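The scheme test works because urllib.parse.urlparse returns an empty scheme for plain filesystem paths; for illustration (standard-library behavior):

    from urllib.parse import urlparse

    print(urlparse("file:///tmp/foo.tar.gz").scheme)       # "file"
    print(urlparse("https://example.com/foo.tgz").scheme)  # "https"
    print(urlparse("/tmp/foo.tar.gz").scheme)              # "" -- bare paths have no scheme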

View File

@@ -11,6 +11,7 @@
import spack.mirror
import spack.paths
import spack.util.gpg
import spack.util.url
description = "handle GPG actions for spack"
section = "packaging"
@@ -98,7 +99,7 @@ def setup_parser(subparser):
"--directory",
metavar="directory",
type=str,
help="local directory where " + "keys will be published.",
help="local directory where keys will be published.",
)
output.add_argument(
"-m",
@@ -212,7 +213,8 @@ def gpg_publish(args):
mirror = None
if args.directory:
url = spack.util.url.path_to_file_url(args.directory)
mirror = spack.mirror.Mirror(url, url)
elif args.mirror_name:
mirror = spack.mirror.MirrorCollection().lookup(args.mirror_name)
elif args.mirror_url:

View File

@@ -357,11 +357,10 @@ def versions_per_spec(args):
def create_mirror_for_individual_specs(mirror_specs, directory_hint, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(
directory_hint, mirror_specs, skip_unstable_versions
)
tty.msg("Summary for mirror in {}".format(directory_hint))
process_mirror_stats(present, mirrored, error)
@@ -389,9 +388,7 @@ def local_mirror_url_from_user(directory_hint):
mirror_directory = spack.util.path.canonicalize_path(
directory_hint or spack.config.get("config:source_cache")
)
return url_util.path_to_file_url(mirror_directory)
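The body of spack.util.url.path_to_file_url is not part of this diff; as a rough stand-in only (an assumption, not Spack's actual implementation), the stdlib already covers both directions, including percent-encoding and Windows drive letters:

    import pathlib
    import urllib.parse
    import urllib.request

    def path_to_file_url_sketch(path):
        # Assumed stand-in: pathlib absolutizes and percent-encodes the path.
        return pathlib.Path(path).resolve().as_uri()

    def file_url_to_path_sketch(url):
        # Inverse direction: decode the percent-encoded path component.
        return urllib.request.url2pathname(urllib.parse.urlparse(url).path)

    print(path_to_file_url_sketch("some dir/mirror"))
    # e.g. file:///home/me/some%20dir/mirror
    print(file_url_to_path_sketch("file:///home/me/some%20dir/mirror"))
    # /home/me/some dir/mirror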
def mirror_create(args):

View File

@@ -5,6 +5,9 @@
from __future__ import print_function
import argparse
import itertools
import os
import sys
import llnl.util.tty as tty
@@ -14,6 +17,7 @@
import spack.cmd.common.arguments as arguments
import spack.paths
import spack.repo
import spack.util.executable as exe
import spack.util.package_hash as ph
description = "query packages associated with particular git revisions"
@@ -65,6 +69,14 @@ def setup_parser(subparser):
"rev2", nargs="?", default="HEAD", help="revision to compare to rev1 (default is HEAD)"
)
# explicitly add help for `spack pkg grep` with just `--help` and NOT `-h`. This is so
# that the very commonly used -h (no filename) argument can be passed through to grep
grep_parser = sp.add_parser("grep", help=pkg_grep.__doc__, add_help=False)
grep_parser.add_argument(
"grep_args", nargs=argparse.REMAINDER, default=None, help="arguments for grep"
)
grep_parser.add_argument("--help", action="help", help="show this help message and exit")
source_parser = sp.add_parser("source", help=pkg_source.__doc__)
source_parser.add_argument(
"-c",
@@ -157,18 +169,88 @@ def pkg_hash(args):
print(ph.package_hash(spec))
def get_grep(required=False):
"""Get a grep command to use with ``spack pkg grep``."""
return exe.which(os.environ.get("SPACK_GREP") or "grep", required=required)
def pkg_grep(args, unknown_args):
"""grep for strings in package.py files from all repositories"""
grep = get_grep(required=True)
# add a little color to the output if we can
if "GNU" in grep("--version", output=str):
grep.add_default_arg("--color=auto")
# determines number of files to grep at a time
grouper = lambda e: e[0] // 500
# set up iterator and save the first group to ensure we don't end up with a group of size 1
groups = itertools.groupby(enumerate(spack.repo.path.all_package_paths()), grouper)
if not groups:
return 0 # no packages to search
# You can force GNU grep to show filenames on every line with -H, but not POSIX grep.
# POSIX grep only shows filenames when you're grepping 2 or more files. Since we
# don't know which one we're running, we ensure there are always >= 2 files by
# saving the prior group of paths and adding it to a straggling group of 1 if needed.
# This works unless somehow there is only one package in all of Spack.
_, first_group = next(groups)
prior_paths = [path for _, path in first_group]
# grep returns 1 for nothing found, 0 for something found, and > 1 for error
return_code = 1
# assemble args and run grep on a group of paths
def grep_group(paths):
all_args = args.grep_args + unknown_args + paths
grep(*all_args, fail_on_error=False)
return grep.returncode
for _, group in groups:
paths = [path for _, path in group] # extract current path group
if len(paths) == 1:
# Only the very last group can have length 1. If it does, combine
# it with the prior group to ensure more than one path is grepped.
prior_paths += paths
else:
# otherwise run grep on the prior group
error = grep_group(prior_paths)
if error != 1:
return_code = error
if error > 1: # fail fast on error
return error
prior_paths = paths
# Handle the last remaining group after the loop
error = grep_group(prior_paths)
if error != 1:
return_code = error
return return_code
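The grouping logic above batches package paths so each grep invocation gets a bounded argument list while still guaranteeing at least two files per call. The same idiom in miniature (toy data, batch size 3 instead of 500):

    import itertools

    paths = ["pkg{}.py".format(i) for i in range(7)]

    # Group consecutive items by index // batch_size.
    for _, group in itertools.groupby(enumerate(paths), lambda e: e[0] // 3):
        print([p for _, p in group])
    # ['pkg0.py', 'pkg1.py', 'pkg2.py']
    # ['pkg3.py', 'pkg4.py', 'pkg5.py']
    # ['pkg6.py']  <- a straggler like this is merged into the prior batch above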
def pkg(parser, args, unknown_args):
if not spack.cmd.spack_is_git_repo():
tty.die("This spack is not a git clone. Can't use 'spack pkg'")
action = {
"add": pkg_add,
"added": pkg_added,
"changed": pkg_changed,
"diff": pkg_diff,
"hash": pkg_hash,
"list": pkg_list,
"removed": pkg_removed,
"source": pkg_source,
}
# grep is special as it passes unknown arguments through
if args.pkg_command == "grep":
return pkg_grep(args, unknown_args)
elif unknown_args:
tty.die("unrecognized arguments: %s" % " ".join(unknown_args))
else:
return action[args.pkg_command](args)

View File

@@ -9,6 +9,7 @@
import platform
import re
import shutil
import sys
import tempfile
from typing import List, Optional, Sequence
@@ -27,6 +28,8 @@
__all__ = ["Compiler"]
is_windows = sys.platform == "win32"
@llnl.util.lang.memoized
def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
@@ -592,7 +595,16 @@ def search_regexps(cls, language):
# defined for the compiler
compiler_names = getattr(cls, "{0}_names".format(language))
prefixes = [""] + cls.prefixes
suffixes = [""] + cls.suffixes
suffixes = [""]
# Windows compilers generally have an extension of some sort
# as do most files on Windows, handle that case here
if is_windows:
ext = r"\.(?:exe|bat)"
cls_suf = [suf + ext for suf in cls.suffixes]
ext_suf = [ext]
suffixes = suffixes + cls.suffixes + cls_suf + ext_suf
else:
suffixes = suffixes + cls.suffixes
regexp_fmt = r"^({0}){1}({2})$"
return [
re.compile(regexp_fmt.format(prefix, re.escape(name), suffix))
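For instance, with an empty prefix, the compiler name "cl", and the Windows extension suffix, the format above yields a pattern that accepts the extension while anchoring the name (illustrative values only):

    import re

    regexp_fmt = r"^({0}){1}({2})$"
    pattern = re.compile(regexp_fmt.format("", re.escape("cl"), r"\.(?:exe|bat)"))

    print(bool(pattern.match("cl.exe")))  # True
    print(bool(pattern.match("cl.bat")))  # True
    print(bool(pattern.match("clang")))   # False -- the name is anchored
    print(bool(pattern.match("cl")))      # False -- covered by the empty-suffix pattern instead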

View File

@@ -722,6 +722,8 @@ def _default_make_compilers(cmp_id, paths):
compiler_cls = spack.compilers.class_for_compiler_name(compiler_name)
spec = spack.spec.CompilerSpec(compiler_cls.name, version)
paths = [paths.get(x, None) for x in ("cc", "cxx", "f77", "fc")]
# TODO: johnwparent - revisit the following line as per discussion at:
# https://github.com/spack/spack/pull/33385/files#r1040036318
target = archspec.cpu.host()
compiler = compiler_cls(spec, operating_system, str(target.family), paths)
return [compiler]

View File

@@ -42,16 +42,16 @@ def get_valid_fortran_pth(comp_ver):
class Msvc(Compiler):
# Subclasses use possible names of C compiler
cc_names: List[str] = ["cl.exe"]
cc_names: List[str] = ["cl"]
# Subclasses use possible names of C++ compiler
cxx_names: List[str] = ["cl.exe"]
cxx_names: List[str] = ["cl"]
# Subclasses use possible names of Fortran 77 compiler
f77_names: List[str] = ["ifx.exe"]
f77_names: List[str] = ["ifx"]
# Subclasses use possible names of Fortran 90 compiler
fc_names: List[str] = ["ifx.exe"]
fc_names: List[str] = ["ifx"]
# Named wrapper links within build_env_path
# Due to the challenges of supporting compiler wrappers

View File

@@ -61,9 +61,16 @@ def compiler_from_entry(entry):
def spec_from_entry(entry):
arch_str = ""
if "arch" in entry:
local_platform = spack.platforms.host()
spec_platform = entry["arch"]["platform"]
# Note that Cray systems are now treated as Linux. Specs
# in the manifest which specify "cray" as the platform
# should be registered in the DB as "linux"
if local_platform.name == "linux" and spec_platform.lower() == "cray":
spec_platform = "linux"
arch_format = "arch={platform}-{os}-{target}"
arch_str = arch_format.format(
platform=entry["arch"]["platform"],
platform=spec_platform,
os=entry["arch"]["platform_os"],
target=entry["arch"]["target"]["name"],
)
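With the translation applied, a manifest entry that reported platform "cray" ends up registered with a Linux arch string, e.g. (illustrative values):

    arch_format = "arch={platform}-{os}-{target}"
    print(arch_format.format(platform="linux", os="rhel8", target="x86_64"))
    # arch=linux-rhel8-x86_64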

View File

@@ -361,6 +361,8 @@ def _depends_on(pkg, spec, when=None, type=default_deptype, patches=None):
return
dep_spec = spack.spec.Spec(spec)
if not dep_spec.name:
raise DependencyError("Invalid dependency specification in package '%s':" % pkg.name, spec)
if pkg.name == dep_spec.name:
raise CircularReferenceError("Package '%s' cannot depend on itself." % pkg.name)
@@ -769,7 +771,11 @@ class DirectiveError(spack.error.SpackError):
"""This is raised when something is wrong with a package directive."""
class DependencyError(DirectiveError):
"""This is raised when a dependency specification is invalid."""
class CircularReferenceError(DependencyError):
"""This is raised when something depends on itself."""

View File

@@ -11,6 +11,8 @@
import stat
import sys
import time
import urllib.parse
import urllib.request
import ruamel.yaml as yaml
@@ -42,6 +44,7 @@
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url
from spack.filesystem_view import (
SimpleFilesystemView,
inverse_view_func_parser,
@@ -926,46 +929,54 @@ def included_config_scopes(self):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
include_url = urllib.parse.urlparse(config_path)
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
config_path = urllib.request.url2pathname(include_url.path)
# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
# Stage any remote configuration file(s)
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
remote_path = urllib.request.url2pathname(include_url.path)
basename = os.path.basename(remote_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(remote_path)
)
# Recognize the configuration stage directory
# is flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path,
self.config_stage_dir,
skip_existing=True,
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
)
config_path = staged_path
elif include_url.scheme:
raise ValueError(
"Unsupported URL scheme for environment include: {}".format(config_path)
)
# treat relative paths as relative to the environment
if not os.path.isabs(config_path):
@@ -995,7 +1006,7 @@ def included_config_scopes(self):
if missing:
msg = "Detected {0} missing include path(s):".format(len(missing))
msg += "\n {0}".format("\n ".join(missing))
tty.die("{0}\nPlease correct and try again.".format(msg))
raise spack.config.ConfigFileError(msg)
return scopes
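Stripped of the staging details, the new include handling is a scheme dispatch on urllib.parse.urlparse; a simplified sketch (hypothetical helper; staging and Spack's error types elided):

    import urllib.parse
    import urllib.request

    def classify_include(config_path):
        url = urllib.parse.urlparse(config_path)
        if url.scheme == "file":
            # file:// URLs become direct local includes.
            return "local", urllib.request.url2pathname(url.path)
        elif url.scheme in ("http", "https", "ftp"):
            # Remote configs are staged before use.
            return "remote", config_path
        elif url.scheme:
            raise ValueError("Unsupported URL scheme for include: " + config_path)
        # No scheme at all: a plain path, possibly relative to the environment.
        return "local", config_path

    print(classify_include("file:///etc/spack/packages.yaml"))  # ('local', '/etc/spack/packages.yaml')
    print(classify_include("repos.yaml"))                       # ('local', 'repos.yaml')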

View File

@@ -314,17 +314,7 @@ def mirror_id(self):
@property
def candidate_urls(self):
return [self.url] + (self.mirrors or [])
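For reference, the hand-rolled encoding that the deleted workaround had to skip on Windows behaves like this; percent-encoding the ':' is what mangled drive-letter paths (illustrative path):

    import urllib.parse

    path = "/C:/spack mirror/archive.tar.gz"
    print("file://" + urllib.parse.quote(path))
    # file:///C%3A/spack%20mirror/archive.tar.gz -- ':' becomes %3A, hence the old platform check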
@_needs_stage
def fetch(self):
@@ -496,7 +486,9 @@ def archive(self, destination):
if not self.archive_file:
raise NoArchiveFileError("Cannot call archive() before fetching.")
web_util.push_to_url(
self.archive_file, url_util.path_to_file_url(destination), keep_original=True
)
@_needs_stage
def check(self):
@@ -549,8 +541,7 @@ class CacheURLFetchStrategy(URLFetchStrategy):
@_needs_stage
def fetch(self):
reg_str = r"^file://"
path = re.sub(reg_str, "", self.url)
path = url_util.file_url_string_to_path(self.url)
# check whether the cache file exists.
if not os.path.isfile(path):
@@ -799,7 +790,7 @@ def source_id(self):
def mirror_id(self):
repo_ref = self.commit or self.tag or self.branch
if repo_ref:
repo_path = urllib.parse.urlparse(self.url).path
result = os.path.sep.join(["git", repo_path, repo_ref])
return result
@@ -1145,7 +1136,7 @@ def source_id(self):
def mirror_id(self):
if self.revision:
repo_path = urllib.parse.urlparse(self.url).path
result = os.path.sep.join(["svn", repo_path, self.revision])
return result
@@ -1256,7 +1247,7 @@ def source_id(self):
def mirror_id(self):
if self.revision:
repo_path = urllib.parse.urlparse(self.url).path
result = os.path.sep.join(["hg", repo_path, self.revision])
return result
@@ -1328,7 +1319,7 @@ def fetch(self):
tty.debug("Already downloaded {0}".format(self.archive_file))
return
parsed_url = urllib.parse.urlparse(self.url)
if parsed_url.scheme != "s3":
raise web_util.FetchError("S3FetchStrategy can only fetch from s3:// urls.")
@@ -1375,7 +1366,7 @@ def fetch(self):
tty.debug("Already downloaded {0}".format(self.archive_file))
return
parsed_url = urllib.parse.urlparse(self.url)
if parsed_url.scheme != "gs":
raise web_util.FetchError("GCSFetchStrategy can only fetch from gs:// urls.")
@@ -1680,7 +1671,8 @@ def store(self, fetcher, relative_dest):
def fetcher(self, target_path, digest, **kwargs):
path = os.path.join(self.root, target_path)
url = url_util.path_to_file_url(path)
return CacheURLFetchStrategy(url, digest, **kwargs)
def destroy(self):
shutil.rmtree(self.root, ignore_errors=True)

View File

@@ -2,9 +2,9 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import urllib.parse
import urllib.response
import spack.util.url as url_util
import spack.util.web as web_util
@@ -12,7 +12,7 @@ def gcs_open(req, *args, **kwargs):
"""Open a reader stream to a blob object on GCS"""
import spack.util.gcs as gcs_util
url = urllib.parse.urlparse(req.get_full_url())
gcsblob = gcs_util.GCSBlob(url)
if not gcsblob.exists():

View File

@@ -48,6 +48,7 @@
import spack.compilers
import spack.error
import spack.hooks
import spack.mirror
import spack.package_base
import spack.package_prefs as prefs
import spack.repo
@@ -419,18 +420,24 @@ def _try_install_from_binary_cache(pkg, explicit, unsigned=False, timer=timer.NU
otherwise, ``False``
timer (Timer):
"""
# Early exit if no mirrors are configured.
if not spack.mirror.MirrorCollection():
return False
pkg_id = package_id(pkg)
tty.debug("Searching for binary cache of {0}".format(pkg_id))
timer.start("search")
matches = binary_distribution.get_mirrors_for_spec(pkg.spec, index_only=True)
timer.stop("search")
if not matches:
return False
return _process_binary_cache_tarball(
pkg,
pkg.spec,
explicit,
unsigned,
mirrors_for_spec=matches,
timer=timer,
)

View File

@@ -17,15 +17,18 @@
import os.path
import sys
import traceback
import urllib.parse
import ruamel.yaml.error as yaml_error
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.fetch_strategy as fs
import spack.mirror
import spack.spec
import spack.url as url
import spack.util.spack_json as sjson
@@ -507,19 +510,13 @@ def mirror_cache_and_stats(path, skip_unstable_versions=False):
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
@@ -670,10 +667,10 @@ def push_url_from_directory(output_directory):
"""Given a directory in the local filesystem, return the URL on
which to push binary packages.
"""
scheme = urllib.parse.urlparse(output_directory, scheme="<missing>").scheme
if scheme != "<missing>":
raise ValueError("expected a local path, but got a URL instead")
mirror_url = "file://" + output_directory
mirror_url = url_util.path_to_file_url(output_directory)
mirror = spack.mirror.MirrorCollection().lookup(mirror_url)
return url_util.format(mirror.push_url)
@@ -688,7 +685,7 @@ def push_url_from_mirror_name(mirror_name):
def push_url_from_mirror_url(mirror_url):
"""Given a mirror URL, return the URL on which to push binary packages."""
scheme = urllib.parse.urlparse(mirror_url, scheme="<missing>").scheme
if scheme == "<missing>":
raise ValueError('"{0}" is not a valid URL'.format(mirror_url))
mirror = spack.mirror.MirrorCollection().lookup(mirror_url)

View File

@@ -8,13 +8,25 @@
Everything in this module is automatically imported into Spack package files.
"""
from os import chdir, environ, getcwd, makedirs, mkdir, remove, removedirs
from shutil import move, rmtree
# Emulate some shell commands for convenience
env = environ
cd = chdir
pwd = getcwd
# import most common types used in packages
from typing import Dict, List, Optional
import llnl.util.filesystem
from llnl.util.filesystem import *
from llnl.util.symlink import symlink
import spack.util.executable
# These props will be overridden when the build env is set up.
from spack.build_environment import MakeExecutable
from spack.build_systems.aspell_dict import AspellDictPackage
from spack.build_systems.autotools import AutotoolsPackage
from spack.build_systems.bundle import BundlePackage
@@ -83,3 +95,10 @@
disjoint_sets,
)
from spack.version import Version, ver
# These are just here for editor support; they will be replaced when the build env
# is set up.
make = MakeExecutable("make", jobs=1)
gmake = MakeExecutable("gmake", jobs=1)
ninja = MakeExecutable("ninja", jobs=1)
configure = Executable(join_path(".", "configure"))

View File

@@ -29,8 +29,9 @@ class Test(Platform):
back_os = "debian6"
default_os = "debian6"
def __init__(self, name=None):
name = name or "test"
super(Test, self).__init__(name)
self.add_target(self.default, spack.target.Target(self.default))
self.add_target(self.front_end, spack.target.Target(self.front_end))

View File

@@ -754,6 +754,14 @@ def _all_package_names(self, include_virtuals):
def all_package_names(self, include_virtuals=False):
return self._all_package_names(include_virtuals)
def package_path(self, name):
"""Get path to package.py file for this repo."""
return self.repo_for_pkg(name).package_path(name)
def all_package_paths(self):
for name in self.all_package_names():
yield self.package_path(name)
def packages_with_tags(self, *tags):
r = set()
for repo in self.repos:
@@ -1153,6 +1161,14 @@ def all_package_names(self, include_virtuals=False):
return names
return [x for x in names if not self.is_virtual(x)]
def package_path(self, name):
"""Get path to package.py file for this repo."""
return os.path.join(self.root, packages_dir_name, name, package_file_name)
def all_package_paths(self):
for name in self.all_package_names():
yield self.package_path(name)
def packages_with_tags(self, *tags):
v = set(self.all_package_names())
index = self.tag_index
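A short usage sketch for the two new repo helpers (package name assumed for illustration):

    import spack.repo

    # Resolve one package.py through whichever repository provides the package:
    print(spack.repo.path.package_path("zlib"))

    # Lazily walk every package.py Spack knows about (what pkg_grep feeds to grep):
    for path in spack.repo.path.all_package_paths():
        print(path)
        break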

View File

@@ -4,12 +4,12 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import urllib.error
import urllib.parse
import urllib.request
import urllib.response
from io import BufferedReader, IOBase
import spack.util.s3 as s3_util
import spack.util.url as url_util
# NOTE(opadron): Workaround issue in boto where its StreamingBody
@@ -43,8 +43,8 @@ def __getattr__(self, key):
def _s3_open(url):
parsed = urllib.parse.urlparse(url)
s3 = s3_util.get_s3_session(url, method="fetch")
bucket = parsed.netloc
key = parsed.path

View File

@@ -496,10 +496,12 @@ def _compute_specs_from_answer_set(self):
best = min(self.answers)
opt, _, answer = best
for input_spec in self.abstract_specs:
key = (input_spec.name, "0")
if input_spec.virtual:
providers = [
spec.name for spec in answer.values() if spec.package.provides(input_spec.name)
]
key = (providers[0], "0")
candidate = answer.get(key)
if candidate and candidate.satisfies(input_spec):
@@ -1562,7 +1564,9 @@ class Body(object):
for dtype in dspec.deptypes:
# skip build dependencies of already-installed specs
if concrete_build_deps or dtype != "build":
clauses.append(fn.attr("depends_on", spec.name, dep.name, dtype))
clauses.append(
fn.attr("depends_on_unknown", spec.name, dep.name, dtype)
)
# Ensure Spack will not co-concretize this with another provider
# for the same virtual
@@ -1975,6 +1979,11 @@ def _facts_from_concrete_spec(self, spec, possible):
h = spec.dag_hash()
if spec.name in possible and h not in self.seen_hashes:
self.reusable_and_possible[h] = spec
try:
# Only consider installed packages for repos we know
spack.repo.path.get(spec)
except (spack.repo.UnknownNamespaceError, spack.repo.UnknownPackageError):
return
# this indicates that there is a spec like this installed
self.gen.fact(fn.installed_hash(spec.name, h))
@@ -2166,89 +2175,102 @@ def __init__(self, specs, hash_lookup=None):
# from this dictionary during reconstruction
self._hash_lookup = hash_lookup or {}
def hash(self, pkg, psid, h):
key = (pkg, psid)
if key not in self._specs:
self._specs[key] = self._hash_lookup[h]
def node(self, pkg, psid):
key = (pkg, psid)
if key not in self._specs:
self._specs[key] = spack.spec.Spec(pkg)
def _arch(self, pkg, psid):
key = (pkg, psid)
arch = self._specs[key].architecture
if not arch:
arch = spack.spec.ArchSpec()
self._specs[key].architecture = arch
return arch
def node_platform(self, pkg, psid, platform):
self._arch(pkg, psid).platform = platform
def node_os(self, pkg, psid, os):
self._arch(pkg, psid).os = os
def node_target(self, pkg, psid, target):
self._arch(pkg, psid).target = target
def variant_value(self, pkg, psid, name, value):
# FIXME: is there a way not to special case 'dev_path' everywhere?
key = (pkg, psid)
if name == "dev_path":
self._specs[key].variants.setdefault(
name, spack.variant.SingleValuedVariant(name, value)
)
return
if name == "patches":
self._specs[key].variants.setdefault(
name, spack.variant.MultiValuedVariant(name, value)
)
return
self._specs[key].update_variant_validate(name, value)
def version(self, pkg, psid, version):
key = (pkg, psid)
self._specs[key].versions = spack.version.ver([version])
def node_compiler(self, pkg, psid, compiler):
key = (pkg, psid)
self._specs[key].compiler = spack.spec.CompilerSpec(compiler)
def node_compiler_version(self, pkg, psid, compiler, version):
key = (pkg, psid)
self._specs[key].compiler.versions = spack.version.VersionList([version])
def node_flag_compiler_default(self, pkg, psid):
key = (pkg, psid)
self._flag_compiler_defaults.add(key)
def node_flag(self, pkg, psid, flag_type, flag):
key = (pkg, psid)
self._specs[key].compiler_flags.add_flag(flag_type, flag, False)
def node_flag_source(self, pkg, psid, flag_type, source):
self._flag_sources[(pkg, psid, flag_type)].add(source)
def no_flags(self, pkg, psid, flag_type):
key = (pkg, psid)
self._specs[key].compiler_flags[flag_type] = []
def external_spec_selected(self, pkg, psid, idx):
"""This means that the external spec and index idx
has been selected for this package.
"""
packages_yaml = spack.config.get("packages")
packages_yaml = _normalize_packages_yaml(packages_yaml)
spec_info = packages_yaml[pkg]["externals"][int(idx)]
key = (pkg, psid)
self._specs[key].external_path = spec_info.get("prefix", None)
self._specs[key].external_modules = spack.spec.Spec._format_module_list(
spec_info.get("modules", None)
)
self._specs[key].extra_attributes = spec_info.get("extra_attributes", {})
def depends_on(self, pkg, psid1, dep, psid2, type):
pkg_key = (pkg, psid1)
dep_key = (dep, psid2)
dependencies = self._specs[pkg_key].edges_to_dependencies(name=dep)
# TODO: assertion to be removed when cross-compilation is handled correctly
msg = "Current solver does not handle multiple dependency edges of the same name"
assert len(dependencies) < 2, msg
if not dependencies:
self._specs[pkg_key].add_dependency_edge(self._specs[dep_key], (type,))
else:
# TODO: This assumes that each solve unifies dependencies
dependencies[0].add_type(type)
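All of these callbacks now key the builder's _specs table on a (package name, process-space id) pair instead of the bare name, which is what lets a build dependency concretize separately from the root's space "0". A toy illustration of the keying:

    specs = {}

    def node(pkg, psid):
        # Same package, different process space -> distinct entries.
        key = (pkg, psid)
        specs.setdefault(key, "Spec({})".format(pkg))

    node("cmake", "0")  # cmake as seen from the root's space
    node("cmake", "1")  # cmake concretized again as a build-only dependency
    print(sorted(specs))  # [('cmake', '0'), ('cmake', '1')]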
@@ -2267,7 +2289,8 @@ def reorder_flags(self):
compilers = dict((c.spec, c) for c in all_compilers_in_config())
cmd_specs = dict((s.name, s) for spec in self._command_line_specs for s in spec.traverse())
for key, spec in self._specs.items():
name, psid = key
# if bootstrapping, compiler is not in config and has no flags
flagmap_from_compiler = {}
if spec.compiler in compilers:
@@ -2279,7 +2302,7 @@ def reorder_flags(self):
# order is determined by the DAG. A spec's flags come after any of its ancestors
# on the compile line
source_key = (spec.name, psid, flag_type)
if source_key in self._flag_sources:
order = [s.name for s in spec.traverse(order="post", direction="parents")]
sorted_sources = sorted(
@@ -2300,7 +2323,7 @@ def reorder_flags(self):
spec.compiler_flags.update({flag_type: ordered_compiler_flags})
def deprecated(self, pkg, psid, version):
msg = 'using "{0}@{1}" which is a deprecated version'
tty.warn(msg.format(pkg, version))
@@ -2348,12 +2371,14 @@ def build_specs(self, function_tuples):
# predicates on virtual packages.
if name != "error":
pkg = args[0]
psid = args[1]
if spack.repo.path.is_virtual(pkg):
continue
# if we've already gotten a concrete spec for this pkg,
# do not bother calling actions on it.
key = (pkg, psid)
spec = self._specs.get(key)
if spec and spec.concrete:
continue

File diff suppressed because it is too large

View File

@@ -10,9 +10,10 @@
%==============================================================================
% Spec attributes
#show attr/2.
#show attr/3.
#show attr/4.
#show attr/5.
#show attr/6.
% names of optimization criteria
#show opt_criterion/2.

View File

@@ -1289,7 +1289,7 @@ def __init__(
# have package.py files for.
self._normal = normal
self._concrete = concrete
self._external_path = external_path
self.external_modules = Spec._format_module_list(external_modules)
# This attribute is used to store custom information for
@@ -1326,6 +1326,14 @@ def _format_module_list(modules):
modules = list(modules)
return modules
@property
def external_path(self):
return pth.path_to_os_path(self._external_path)[0]
@external_path.setter
def external_path(self, ext_path):
self._external_path = ext_path
@property
def external(self):
return bool(self.external_path) or bool(self.external_modules)
@@ -2925,9 +2933,10 @@ def _new_concretize(self, tests=False):
providers = [spec.name for spec in answer.values() if spec.package.provides(name)]
name = providers[0]
key = (name, "0")
assert key in answer
concretized = answer[key]
self._dup(concretized)
def concretize(self, tests=False):

View File

@@ -13,13 +13,16 @@
from llnl.util.filesystem import join_path, visit_directory_tree
import spack.binary_distribution as bindist
import spack.caches
import spack.config
import spack.fetch_strategy
import spack.hooks.sbang as sbang
import spack.main
import spack.mirror
import spack.repo
import spack.store
import spack.util.gpg
import spack.util.url as url_util
import spack.util.web as web_util
from spack.binary_distribution import get_buildfile_manifest
from spack.directory_layout import DirectoryLayout
@@ -58,7 +61,7 @@ def mirror_dir(tmpdir_factory):
@pytest.fixture(scope="function")
def test_mirror(mirror_dir):
mirror_url = "file://%s" % mirror_dir
mirror_url = url_util.path_to_file_url(mirror_dir)
mirror_cmd("add", "--scope", "site", "test-mirror-func", mirror_url)
yield mirror_dir
mirror_cmd("rm", "--scope=site", "test-mirror-func")
@@ -200,8 +203,7 @@ def test_default_rpaths_create_install_default_layout(mirror_dir):
buildcache_cmd("create", "-auf", "-d", mirror_dir, cspec.name)
# Create mirror index
mirror_url = "file://{0}".format(mirror_dir)
buildcache_cmd("update-index", "-d", mirror_url)
buildcache_cmd("update-index", "-d", mirror_dir)
# List the buildcaches in the mirror
buildcache_cmd("list", "-alv")
@@ -266,8 +268,7 @@ def test_relative_rpaths_create_default_layout(mirror_dir):
buildcache_cmd("create", "-aur", "-d", mirror_dir, cspec.name)
# Create mirror index
mirror_url = "file://%s" % mirror_dir
buildcache_cmd("update-index", "-d", mirror_url)
buildcache_cmd("update-index", "-d", mirror_dir)
# Uninstall the package and deps
uninstall_cmd("-y", "--dependents", gspec.name)
@@ -323,9 +324,9 @@ def test_push_and_fetch_keys(mock_gnupghome):
testpath = str(mock_gnupghome)
mirror = os.path.join(testpath, "mirror")
mirrors = {"test-mirror": mirror}
mirrors = {"test-mirror": url_util.path_to_file_url(mirror)}
mirrors = spack.mirror.MirrorCollection(mirrors)
mirror = spack.mirror.Mirror("file://" + mirror)
mirror = spack.mirror.Mirror(url_util.path_to_file_url(mirror))
gpg_dir1 = os.path.join(testpath, "gpg1")
gpg_dir2 = os.path.join(testpath, "gpg2")
@@ -389,7 +390,7 @@ def test_spec_needs_rebuild(monkeypatch, tmpdir):
# Create a temp mirror directory for buildcache usage
mirror_dir = tmpdir.join("mirror_dir")
mirror_url = "file://{0}".format(mirror_dir.strpath)
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
s = Spec("libdwarf").concretized()
@@ -421,7 +422,7 @@ def test_generate_index_missing(monkeypatch, tmpdir, mutable_config):
# Create a temp mirror directory for buildcache usage
mirror_dir = tmpdir.join("mirror_dir")
mirror_url = "file://{0}".format(mirror_dir.strpath)
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
spack.config.set("mirrors", {"test": mirror_url})
s = Spec("libdwarf").concretized()
@@ -514,7 +515,6 @@ def test_update_sbang(tmpdir, test_mirror):
# Need a fake mirror with *function* scope.
mirror_dir = test_mirror
mirror_url = "file://{0}".format(mirror_dir)
# Assume all commands will concretize old_spec the same way.
install_cmd("--no-cache", old_spec.name)
@@ -523,7 +523,7 @@ def test_update_sbang(tmpdir, test_mirror):
buildcache_cmd("create", "-u", "-a", "-d", mirror_dir, old_spec_hash_str)
# Need to force an update of the buildcache index
buildcache_cmd("update-index", "-d", mirror_url)
buildcache_cmd("update-index", "-d", mirror_dir)
# Uninstall the original package.
uninstall_cmd("-y", old_spec_hash_str)

View File

@@ -10,22 +10,15 @@
import pytest
import spack.binary_distribution
import spack.main
import spack.spec
import spack.util.url
install = spack.main.SpackCommand("install")
pytestmark = pytest.mark.skipif(sys.platform == "win32", reason="does not run on windows")
def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmpdir):
with tmpdir.as_cwd():
@@ -33,12 +26,13 @@ def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmpdi
install(str(spec))
# Runs fine the first time, throws the second time
spack.binary_distribution._build_tarball(spec, ".", unsigned=True)
out_url = spack.util.url.path_to_file_url(str(tmpdir))
spack.binary_distribution._build_tarball(spec, out_url, unsigned=True)
with pytest.raises(spack.binary_distribution.NoOverwriteException):
spack.binary_distribution._build_tarball(spec, ".", unsigned=True)
spack.binary_distribution._build_tarball(spec, out_url, unsigned=True)
# Should work fine with force=True
spack.binary_distribution._build_tarball(spec, ".", force=True, unsigned=True)
spack.binary_distribution._build_tarball(spec, out_url, force=True, unsigned=True)
# Remove the tarball and try again.
# This must *also* throw, because of the existing .spec.json file
@@ -51,4 +45,4 @@ def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmpdi
)
with pytest.raises(spack.binary_distribution.NoOverwriteException):
spack.binary_distribution._build_tarball(spec, ".", unsigned=True)
spack.binary_distribution._build_tarball(spec, out_url, unsigned=True)

View File

@@ -10,6 +10,7 @@
import spack.cmd.create
import spack.stage
import spack.util.executable
import spack.util.url as url_util
pytestmark = pytest.mark.skipif(sys.platform == "win32", reason="does not run on windows")
@@ -50,7 +51,7 @@ def url_and_build_system(request, tmpdir):
filename, system = request.param
tmpdir.ensure("archive", filename)
tar("czf", "archive.tar.gz", "archive")
url = "file://" + str(tmpdir.join("archive.tar.gz"))
url = url_util.path_to_file_url(str(tmpdir.join("archive.tar.gz")))
yield url, system
orig_dir.chdir()

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path
import sys
import pytest
@@ -123,6 +124,10 @@ def test_old_style_compatibility_with_super(spec_str, method_name, expected):
assert value == expected
@pytest.mark.skipif(
sys.platform == "win32",
reason="log_ouput cannot currently be used outside of subprocess on Windows",
)
@pytest.mark.regression("33928")
@pytest.mark.usefixtures("builder_test_repository", "config", "working_env")
@pytest.mark.disable_clean_stage_check

View File

@@ -4,26 +4,24 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import pytest
from llnl.util.filesystem import mkdirp, touch
import spack.config
import spack.util.url as url_util
from spack.fetch_strategy import CacheURLFetchStrategy, NoCacheError
from spack.stage import Stage
is_windows = sys.platform == "win32"
@pytest.mark.parametrize("_fetch_method", ["curl", "urllib"])
def test_fetch_missing_cache(tmpdir, _fetch_method):
"""Ensure raise a missing cache file."""
testpath = str(tmpdir)
non_existing = os.path.join(testpath, "non-existing")
with spack.config.override("config:url_fetch_method", _fetch_method):
abs_pref = "" if is_windows else "/"
url = "file://" + abs_pref + "not-a-real-cache-file"
url = url_util.path_to_file_url(non_existing)
fetcher = CacheURLFetchStrategy(url=url)
with Stage(fetcher, path=testpath):
with pytest.raises(NoCacheError, match=r"No cache"):
@@ -36,11 +34,7 @@ def test_fetch(tmpdir, _fetch_method):
testpath = str(tmpdir)
cache = os.path.join(testpath, "cache.tar.gz")
touch(cache)
url = url_util.path_to_file_url(cache)
with spack.config.override("config:url_fetch_method", _fetch_method):
fetcher = CacheURLFetchStrategy(url=url)
with Stage(fetcher, path=testpath) as stage:

View File

@@ -322,7 +322,7 @@ def make_rebuild_index_job(use_artifact_buildcache, optimize, use_dependencies):
result = {
"stage": "stage-rebuild-index",
"script": "spack buildcache update-index -d s3://mirror",
"script": "spack buildcache update-index --mirror-url s3://mirror",
"tags": ["tag-0", "tag-1"],
"image": {"name": "spack/centos7", "entrypoint": [""]},
"after_script": ['rm -rf "./spack"'],

View File

@@ -231,7 +231,7 @@ def test_ci_generate_with_env(
assert "rebuild-index" in yaml_contents
rebuild_job = yaml_contents["rebuild-index"]
expected = "spack buildcache update-index --keys -d {0}".format(mirror_url)
expected = "spack buildcache update-index --keys --mirror-url {0}".format(mirror_url)
assert rebuild_job["script"][0] == expected
assert "variables" in yaml_contents
@@ -810,10 +810,10 @@ def create_rebuild_env(tmpdir, pkg_name, broken_tests=False):
env_dir = working_dir.join("concrete_env")
mirror_dir = working_dir.join("mirror")
mirror_url = "file://{0}".format(mirror_dir.strpath)
mirror_url = url_util.path_to_file_url(mirror_dir.strpath)
broken_specs_path = os.path.join(working_dir.strpath, "naughty-list")
broken_specs_url = url_util.join("file://", broken_specs_path)
broken_specs_url = url_util.path_to_file_url(broken_specs_path)
temp_storage_url = "file:///path/to/per/pipeline/storage"
broken_tests_packages = [pkg_name] if broken_tests else []

View File

@@ -16,6 +16,7 @@
import llnl.util.link_tree
import spack.cmd.env
import spack.config
import spack.environment as ev
import spack.environment.shell
import spack.error
@@ -29,7 +30,6 @@
from spack.stage import stage_prefix
from spack.util.executable import Executable
from spack.util.path import substitute_path_variables
from spack.util.web import FetchError
from spack.version import Version
# TODO-27021
@@ -707,9 +707,9 @@ def test_with_config_bad_include():
e.concretize()
err = str(exc)
assert "not retrieve configuration" in err
assert os.path.join("no", "such", "directory") in err
assert "missing include" in err
assert "/no/such/directory" in err
assert os.path.join("no", "such", "file.yaml") in err
assert ev.active_environment() is None
@@ -827,7 +827,7 @@ def test_env_with_included_config_missing_file(tmpdir, mutable_empty_config):
f.write("spack:\n include:\n - {0}\n".format(missing_file.strpath))
env = ev.Environment(tmpdir.strpath)
with pytest.raises(FetchError, match="No such file or directory"):
with pytest.raises(spack.config.ConfigError, match="missing include path"):
ev.activate(env)

View File

@@ -347,7 +347,7 @@ def _determine_variants(cls, exes, version_str):
assert "externals" in packages_yaml["gcc"]
externals = packages_yaml["gcc"]["externals"]
assert len(externals) == 1
assert externals[0]["prefix"] == "/opt/gcc/bin"
assert externals[0]["prefix"] == os.path.sep + os.path.join("opt", "gcc", "bin")
def test_new_entries_are_reported_correctly(

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import sys
from textwrap import dedent
from spack.main import SpackCommand
@@ -18,12 +19,24 @@ def test_list():
def test_list_cli_output_format(mock_tty_stdout):
out = list("mpileaks")
# Currently logging on Windows detaches stdout
# from the terminal so we miss some output during tests
# TODO: (johnwparent): Once logging is amended on Windows,
# restore this test
if not sys.platform == "win32":
out_str = dedent(
"""\
mpileaks
==> 1 packages
"""
)
else:
out_str = dedent(
"""\
mpileaks
"""
)
assert out == out_str
def test_list_filter(mock_packages):

View File

@@ -11,6 +11,8 @@
import spack.cmd.mirror
import spack.config
import spack.environment as ev
import spack.spec
import spack.util.url as url_util
from spack.main import SpackCommand, SpackCommandError
mirror = SpackCommand("mirror")
@@ -43,15 +45,6 @@ def tmp_scope():
yield scope_name
@pytest.mark.disable_clean_stage_check
@pytest.mark.regression("8083")
def test_regression_8083(tmpdir, capfd, mock_packages, mock_fetch, config):
@@ -89,7 +82,7 @@ def source_for_pkg_with_hash(mock_packages, tmpdir):
local_path = os.path.join(str(tmpdir), local_url_basename)
with open(local_path, "w") as f:
f.write(s.package.hashed_content)
local_url = "file://" + local_path
local_url = url_util.path_to_file_url(local_path)
s.package.versions[spack.version.Version("1.0")]["url"] = local_url

View File

@@ -13,6 +13,7 @@
from llnl.util.filesystem import mkdirp, working_dir
import spack.cmd.pkg
import spack.main
import spack.repo
from spack.util.executable import which
@@ -293,3 +294,24 @@ def test_pkg_hash(mock_packages):
output = pkg("hash", "multimethod").strip().split()
assert len(output) == 1 and all(len(elt) == 32 for elt in output)
@pytest.mark.skipif(not spack.cmd.pkg.get_grep(), reason="grep is not installed")
def test_pkg_grep(mock_packages, capfd):
# only splice-* mock packages have the string "splice" in them
pkg("grep", "-l", "splice", output=str)
output, _ = capfd.readouterr()
assert output.strip() == "\n".join(
spack.repo.path.get_pkg_class(name).module.__file__
for name in ["splice-a", "splice-h", "splice-t", "splice-vh", "splice-z"]
)
# ensure that this string isn't found
pkg("grep", "abcdefghijklmnopqrstuvwxyz", output=str, fail_on_error=False)
assert pkg.returncode == 1
output, _ = capfd.readouterr()
assert output.strip() == ""
# ensure that we return > 1 for an error
pkg("grep", "--foobarbaz-not-an-option", output=str, fail_on_error=False)
assert pkg.returncode == 2

View File

@@ -208,9 +208,7 @@ def _warn(*args, **kwargs):
@pytest.mark.skipif(sys.platform == "win32", reason="Envs unsupported on Windows")
class TestUninstallFromEnv(object):
"""Tests an installation with two environments e1 and e2, which each have
shared package installations:

View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import sys
import jinja2
@@ -339,7 +338,7 @@ def test_concretize_compiler_flag_propagate(self):
assert spec.satisfies("^openblas cflags='-g'")
@pytest.mark.skipif(
os.environ.get("SPACK_TEST_SOLVER") == "original" or sys.platform == "win32",
os.environ.get("SPACK_TEST_SOLVER") == "original",
reason="Optional compiler propagation isn't deprecated for original concretizer",
)
def test_concretize_compiler_flag_does_not_propagate(self):
@@ -349,7 +348,7 @@ def test_concretize_compiler_flag_does_not_propagate(self):
assert not spec.satisfies("^openblas cflags='-g'")
@pytest.mark.skipif(
os.environ.get("SPACK_TEST_SOLVER") == "original" or sys.platform == "win32",
os.environ.get("SPACK_TEST_SOLVER") == "original",
reason="Optional compiler propagation isn't deprecated for original concretizer",
)
def test_concretize_propagate_compiler_flag_not_passed_to_dependent(self):
@@ -449,7 +448,7 @@ def test_concretize_two_virtuals_with_dual_provider_and_a_conflict(self):
s.concretize()
@pytest.mark.skipif(
os.environ.get("SPACK_TEST_SOLVER") == "original" or sys.platform == "win32",
os.environ.get("SPACK_TEST_SOLVER") == "original",
reason="Optional compiler propagation isn't deprecated for original concretizer",
)
def test_concretize_propagate_disabled_variant(self):
@@ -466,7 +465,6 @@ def test_concretize_propagated_variant_is_not_passed_to_dependent(self):
assert spec.satisfies("^openblas+shared")
@pytest.mark.skipif(sys.platform == "win32", reason="No Compiler for Arch on Win")
def test_no_matching_compiler_specs(self, mock_low_high_config):
# only relevant when not building compilers as needed
with spack.concretize.enable_compiler_existence_check():
@@ -527,7 +525,7 @@ def test_compiler_inheritance(self, compiler_str):
def test_external_package(self):
spec = Spec("externaltool%gcc")
spec.concretize()
assert spec["externaltool"].external_path == posixpath.sep + posixpath.join(
assert spec["externaltool"].external_path == os.path.sep + os.path.join(
"path", "to", "external_tool"
)
assert "externalprereq" not in spec
@@ -558,10 +556,10 @@ def test_nobuild_package(self):
def test_external_and_virtual(self):
spec = Spec("externaltest")
spec.concretize()
assert spec["externaltool"].external_path == posixpath.sep + posixpath.join(
assert spec["externaltool"].external_path == os.path.sep + os.path.join(
"path", "to", "external_tool"
)
assert spec["stuff"].external_path == posixpath.sep + posixpath.join(
assert spec["stuff"].external_path == os.path.sep + os.path.join(
"path", "to", "external_virtual_gcc"
)
assert spec["externaltool"].compiler.satisfies("gcc")
@@ -1815,7 +1813,6 @@ def test_git_hash_assigned_version_is_preferred(self):
c = s.concretized()
assert hash in str(c)
@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
@pytest.mark.parametrize("git_ref", ("a" * 40, "0.2.15", "main"))
def test_git_ref_version_is_equivalent_to_specified_version(self, git_ref):
if spack.config.get("config:concretizer") == "original":
@@ -1827,7 +1824,6 @@ def test_git_ref_version_is_equivalent_to_specified_version(self, git_ref):
assert s.satisfies("@develop")
assert s.satisfies("@0.1:")
@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
@pytest.mark.parametrize("git_ref", ("a" * 40, "0.2.15", "fbranch"))
def test_git_ref_version_errors_if_unknown_version(self, git_ref):
if spack.config.get("config:concretizer") == "original":

View File

@@ -270,7 +270,7 @@ def test_external_mpi(self):
# ensure that once config is in place, external is used
spec = Spec("mpi")
spec.concretize()
assert spec["mpich"].external_path == os.sep + os.path.join("dummy", "path")
assert spec["mpich"].external_path == os.path.sep + os.path.join("dummy", "path")
def test_external_module(self, monkeypatch):
"""Test that packages can find externals specified by module
@@ -305,7 +305,7 @@ def mock_module(cmd, module):
# ensure that once config is in place, external is used
spec = Spec("mpi")
spec.concretize()
assert spec["mpich"].external_path == "/dummy/path"
assert spec["mpich"].external_path == os.path.sep + os.path.join("dummy", "path")
def test_buildable_false(self):
conf = syaml.load_config(

View File

@@ -48,6 +48,7 @@
import spack.util.executable
import spack.util.gpg
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
from spack.fetch_strategy import FetchStrategyComposite, URLFetchStrategy
from spack.util.pattern import Bunch
from spack.util.web import FetchError
@@ -259,6 +260,17 @@ def _verify_executables_noop(*args):
return None
def _host():
"""Mock archspec host so there is no inconsistency on the Windows platform
This function cannot be local as it needs to be pickleable"""
return archspec.cpu.Microarchitecture("x86_64", [], "generic", [], {}, 0)
@pytest.fixture(scope="function")
def archspec_host_is_spack_test_host(monkeypatch):
monkeypatch.setattr(archspec.cpu, "host", _host)
#
# Disable checks on compiler executable existence
#
@@ -1119,7 +1131,7 @@ def mock_archive(request, tmpdir_factory):
"Archive", ["url", "path", "archive_file", "expanded_archive_basedir"]
)
archive_file = str(tmpdir.join(archive_name))
url = "file://" + archive_file
url = url_util.path_to_file_url(archive_file)
# Return the url
yield Archive(
@@ -1320,7 +1332,7 @@ def mock_git_repository(tmpdir_factory):
tmpdir = tmpdir_factory.mktemp("mock-git-repo-submodule-dir-{0}".format(submodule_count))
tmpdir.ensure(spack.stage._source_path_subdir, dir=True)
repodir = tmpdir.join(spack.stage._source_path_subdir)
suburls.append((submodule_count, "file://" + str(repodir)))
suburls.append((submodule_count, url_util.path_to_file_url(str(repodir))))
with repodir.as_cwd():
git("init")
@@ -1348,7 +1360,7 @@ def mock_git_repository(tmpdir_factory):
git("init")
git("config", "user.name", "Spack")
git("config", "user.email", "spack@spack.io")
url = "file://" + str(repodir)
url = url_util.path_to_file_url(str(repodir))
for number, suburl in suburls:
git("submodule", "add", suburl, "third_party/submodule{0}".format(number))
@@ -1450,7 +1462,7 @@ def mock_hg_repository(tmpdir_factory):
# Initialize the repository
with repodir.as_cwd():
url = "file://" + str(repodir)
url = url_util.path_to_file_url(str(repodir))
hg("init")
# Commit file r0
@@ -1484,7 +1496,7 @@ def mock_svn_repository(tmpdir_factory):
tmpdir = tmpdir_factory.mktemp("mock-svn-stage")
tmpdir.ensure(spack.stage._source_path_subdir, dir=True)
repodir = tmpdir.join(spack.stage._source_path_subdir)
url = "file://" + str(repodir)
url = url_util.path_to_file_url(str(repodir))
# Initialize the repository
with repodir.as_cwd():


@@ -233,6 +233,34 @@ def test_generate_specs_from_manifest():
assert openmpi_spec["hwloc"]
def test_translate_cray_platform_to_linux(monkeypatch):
"""Manifests might list specs on newer Cray platforms as being "cray",
but Spack identifies such platforms as "linux". Make sure we
automatically transform these entries.
"""
test_linux_platform = spack.platforms.test.Test("linux")
def the_host_is_linux():
return test_linux_platform
monkeypatch.setattr(spack.platforms, "host", the_host_is_linux)
cray_arch = JsonArchEntry(platform="cray", os="rhel8", target="x86_64").to_dict()
spec_json = JsonSpecEntry(
name="cray-mpich",
hash="craympichfakehashaaa",
prefix="/path/to/cray-mpich/",
version="1.0.0",
arch=cray_arch,
compiler=_common_compiler.spec_json(),
dependencies={},
parameters={},
).to_dict()
(spec,) = entries_to_specs([spec_json]).values()
assert spec.architecture.platform == "linux"
def test_translate_compiler_name():
nvidia_compiler = JsonCompilerEntry(
name="nvidia",


@@ -719,13 +719,13 @@ def test_external_entries_in_db(mutable_database):
assert not rec.spec.external_modules
rec = mutable_database.get_record("externaltool")
assert rec.spec.external_path == os.sep + os.path.join("path", "to", "external_tool")
assert rec.spec.external_path == os.path.sep + os.path.join("path", "to", "external_tool")
assert not rec.spec.external_modules
assert rec.explicit is False
rec.spec.package.do_install(fake=True, explicit=True)
rec = mutable_database.get_record("externaltool")
assert rec.spec.external_path == os.sep + os.path.join("path", "to", "external_tool")
assert rec.spec.external_path == os.path.sep + os.path.join("path", "to", "external_tool")
assert not rec.spec.external_modules
assert rec.explicit is True


@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest
import spack.directives
import spack.repo
import spack.spec
@@ -60,3 +61,10 @@ def test_extends_spec(config, mock_packages):
assert extender.dependencies
assert extender.package.extends(extendee)
@pytest.mark.regression("34368")
def test_error_on_anonymous_dependency(config, mock_packages):
pkg = spack.repo.path.get_pkg_class("a")
with pytest.raises(spack.directives.DependencyError):
spack.directives._depends_on(pkg, "@4.5")


@@ -488,7 +488,7 @@ def fake_package_list(compiler, architecture, pkgs):
def test_bootstrapping_compilers_with_different_names_from_spec(
install_mockery, mutable_config, mock_fetch
install_mockery, mutable_config, mock_fetch, archspec_host_is_spack_test_host
):
with spack.config.override("config:install_missing_compilers", True):
with spack.concretize.disable_compiler_existence_check():


@@ -15,6 +15,7 @@
import spack.repo
import spack.util.executable
import spack.util.spack_json as sjson
import spack.util.url as url_util
from spack.spec import Spec
from spack.stage import Stage
from spack.util.executable import which
@@ -54,7 +55,7 @@ def check_mirror():
with Stage("spack-mirror-test") as stage:
mirror_root = os.path.join(stage.path, "test-mirror")
# register mirror with spack config
mirrors = {"spack-mirror-test": "file://" + mirror_root}
mirrors = {"spack-mirror-test": url_util.path_to_file_url(mirror_root)}
with spack.config.override("mirrors", mirrors):
with spack.config.override("config:checksum", False):
specs = [Spec(x).concretized() for x in repos]


@@ -25,6 +25,7 @@
import spack.repo
import spack.store
import spack.util.gpg
import spack.util.url as url_util
from spack.fetch_strategy import FetchStrategyComposite, URLFetchStrategy
from spack.paths import mock_gpg_keys_path
from spack.relocate import (
@@ -89,7 +90,7 @@ def test_buildcache(mock_archive, tmpdir):
spack.mirror.create(mirror_path, specs=[])
# register mirror with spack config
mirrors = {"spack-mirror-test": "file://" + mirror_path}
mirrors = {"spack-mirror-test": url_util.path_to_file_url(mirror_path)}
spack.config.set("mirrors", mirrors)
stage = spack.stage.Stage(mirrors["spack-mirror-test"], name="build_cache", keep=True)


@@ -16,6 +16,7 @@
import spack.paths
import spack.repo
import spack.util.compression
import spack.util.url as url_util
from spack.spec import Spec
from spack.stage import Stage
from spack.util.executable import Executable
@@ -87,7 +88,7 @@ def mock_patch_stage(tmpdir_factory, monkeypatch):
)
def test_url_patch(mock_patch_stage, filename, sha256, archive_sha256, config):
# Make a patch object
url = "file://" + filename
url = url_util.path_to_file_url(filename)
s = Spec("patch").concretized()
patch = spack.patch.UrlPatch(s.package, url, sha256=sha256, archive_sha256=archive_sha256)


@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import sys
import pytest
@@ -11,6 +12,8 @@
import spack.variant
from spack.parser import SpecParser, SpecTokenizationError, Token, TokenType
is_windows = sys.platform == "win32"
def simple_package_name(name):
"""A simple package name in canonical form"""
@@ -834,6 +837,7 @@ def test_error_conditions(text, exc_cls):
SpecParser(text).next_spec()
@pytest.mark.skipif(is_windows, reason="Spec parsing does not currently support Windows paths")
def test_parse_specfile_simple(specfile_for, tmpdir):
specfile = tmpdir.join("libdwarf.json")
s = specfile_for("libdwarf", specfile)
@@ -879,6 +883,7 @@ def test_parse_filename_missing_slash_as_spec(specfile_for, tmpdir, filename):
)
@pytest.mark.skipif(is_windows, reason="Spec parsing does not currently support Windows paths")
def test_parse_specfile_dependency(default_mock_concretization, tmpdir):
"""Ensure we can use a specfile as a dependency"""
s = default_mock_concretization("libdwarf")


@@ -19,6 +19,7 @@
import spack.paths
import spack.stage
import spack.util.executable
import spack.util.url as url_util
from spack.resource import Resource
from spack.stage import DIYStage, ResourceStage, Stage, StageComposite
from spack.util.path import canonicalize_path
@@ -41,10 +42,6 @@
_include_hidden = 2
_include_extra = 3
_file_prefix = "file://"
if sys.platform == "win32":
_file_prefix += "/"
# Mock fetch directories are expected to appear as follows:
#
@@ -218,7 +215,7 @@ def create_stage_archive(expected_file_list=[_include_readme]):
# Create the archive directory and associated file
archive_dir = tmpdir.join(_archive_base)
archive = tmpdir.join(_archive_fn)
archive_url = _file_prefix + str(archive)
archive_url = url_util.path_to_file_url(str(archive))
archive_dir.ensure(dir=True)
# Create the optional files as requested and make sure expanded
@@ -283,7 +280,7 @@ def mock_expand_resource(tmpdir):
archive_name = "resource.tar.gz"
archive = tmpdir.join(archive_name)
archive_url = _file_prefix + str(archive)
archive_url = url_util.path_to_file_url(str(archive))
filename = "resource-file.txt"
test_file = resource_dir.join(filename)
@@ -414,7 +411,7 @@ def test_noexpand_stage_file(self, mock_stage_archive, mock_noexpand_resource):
property of the stage should refer to the path of that file.
"""
test_noexpand_fetcher = spack.fetch_strategy.from_kwargs(
url=_file_prefix + mock_noexpand_resource, expand=False
url=url_util.path_to_file_url(mock_noexpand_resource), expand=False
)
with Stage(test_noexpand_fetcher) as stage:
stage.fetch()
@@ -432,7 +429,7 @@ def test_composite_stage_with_noexpand_resource(
resource_dst_name = "resource-dst-name.sh"
test_resource_fetcher = spack.fetch_strategy.from_kwargs(
url=_file_prefix + mock_noexpand_resource, expand=False
url=url_util.path_to_file_url(mock_noexpand_resource), expand=False
)
test_resource = Resource("test_resource", test_resource_fetcher, resource_dst_name, None)
resource_stage = ResourceStage(test_resource_fetcher, root_stage, test_resource)


@@ -6,111 +6,44 @@
"""Test Spack's URL handling utility functions."""
import os
import os.path
import posixpath
import re
import sys
import urllib.parse
import pytest
import spack.paths
import spack.util.url as url_util
from spack.util.path import convert_to_posix_path
is_windows = sys.platform == "win32"
if is_windows:
drive_m = re.search(r"[A-Za-z]:", spack.paths.test_path)
drive = drive_m.group() if drive_m else None
def test_url_parse():
def test_url_local_file_path(tmpdir):
# Create a file
path = str(tmpdir.join("hello.txt"))
with open(path, "wb") as f:
f.write(b"hello world")
parsed = url_util.parse("/path/to/resource", scheme="fake")
assert parsed.scheme == "fake"
assert parsed.netloc == ""
assert parsed.path == "/path/to/resource"
# Go from path -> url -> path.
roundtrip = url_util.local_file_path(url_util.path_to_file_url(path))
parsed = url_util.parse("file:///path/to/resource")
assert parsed.scheme == "file"
assert parsed.netloc == ""
assert parsed.path == "/path/to/resource"
# Verify it's the same file.
assert os.path.samefile(roundtrip, path)
parsed = url_util.parse("file:///path/to/resource", scheme="fake")
assert parsed.scheme == "file"
assert parsed.netloc == ""
assert parsed.path == "/path/to/resource"
parsed = url_util.parse("file://path/to/resource")
assert parsed.scheme == "file"
expected = convert_to_posix_path(os.path.abspath(posixpath.join("path", "to", "resource")))
if is_windows:
expected = expected.lstrip(drive)
assert parsed.path == expected
if is_windows:
parsed = url_util.parse("file://%s\\path\\to\\resource" % drive)
assert parsed.scheme == "file"
expected = "/" + posixpath.join("path", "to", "resource")
assert parsed.path == expected
parsed = url_util.parse("https://path/to/resource")
assert parsed.scheme == "https"
assert parsed.netloc == "path"
assert parsed.path == "/to/resource"
parsed = url_util.parse("gs://path/to/resource")
assert parsed.scheme == "gs"
assert parsed.netloc == "path"
assert parsed.path == "/to/resource"
spack_root = spack.paths.spack_root
parsed = url_util.parse("file://$spack")
assert parsed.scheme == "file"
if is_windows:
spack_root = "/" + convert_to_posix_path(spack_root)
assert parsed.netloc + parsed.path == spack_root
# Test if it accepts urlparse objects
parsed = urllib.parse.urlparse(url_util.path_to_file_url(path))
assert os.path.samefile(url_util.local_file_path(parsed), path)
def test_url_local_file_path():
spack_root = spack.paths.spack_root
sep = os.path.sep
lfp = url_util.local_file_path("/a/b/c.txt")
assert lfp == sep + os.path.join("a", "b", "c.txt")
def test_url_local_file_path_no_file_scheme():
assert url_util.local_file_path("https://example.com/hello.txt") is None
assert url_util.local_file_path("C:\\Program Files\\hello.txt") is None
lfp = url_util.local_file_path("file:///a/b/c.txt")
assert lfp == sep + os.path.join("a", "b", "c.txt")
if is_windows:
lfp = url_util.local_file_path("file://a/b/c.txt")
expected = os.path.abspath(os.path.join("a", "b", "c.txt"))
assert lfp == expected
def test_relative_path_to_file_url(tmpdir):
# Create a file
path = str(tmpdir.join("hello.txt"))
with open(path, "wb") as f:
f.write(b"hello world")
lfp = url_util.local_file_path("file://$spack/a/b/c.txt")
expected = os.path.abspath(os.path.join(spack_root, "a", "b", "c.txt"))
assert lfp == expected
if is_windows:
lfp = url_util.local_file_path("file:///$spack/a/b/c.txt")
expected = os.path.abspath(os.path.join(spack_root, "a", "b", "c.txt"))
assert lfp == expected
lfp = url_util.local_file_path("file://$spack/a/b/c.txt")
expected = os.path.abspath(os.path.join(spack_root, "a", "b", "c.txt"))
assert lfp == expected
# not a file:// URL - so no local file path
lfp = url_util.local_file_path("http:///a/b/c.txt")
assert lfp is None
lfp = url_util.local_file_path("http://a/b/c.txt")
assert lfp is None
lfp = url_util.local_file_path("http:///$spack/a/b/c.txt")
assert lfp is None
lfp = url_util.local_file_path("http://$spack/a/b/c.txt")
assert lfp is None
with tmpdir.as_cwd():
roundtrip = url_util.local_file_path(url_util.path_to_file_url("hello.txt"))
assert os.path.samefile(roundtrip, path)
def test_url_join_local_paths():
@@ -179,26 +112,6 @@ def test_url_join_local_paths():
== "https://mirror.spack.io/build_cache/my-package"
)
# file:// URL path components are *NOT* canonicalized
spack_root = spack.paths.spack_root
if sys.platform != "win32":
join_result = url_util.join("/a/b/c", "$spack")
assert join_result == "file:///a/b/c/$spack" # not canonicalized
format_result = url_util.format(join_result)
# canonicalize by hand
expected = url_util.format(
os.path.abspath(os.path.join("/", "a", "b", "c", "." + spack_root))
)
assert format_result == expected
# see test_url_join_absolute_paths() for more on absolute path components
join_result = url_util.join("/a/b/c", "/$spack")
assert join_result == "file:///$spack" # not canonicalized
format_result = url_util.format(join_result)
expected = url_util.format(spack_root)
assert format_result == expected
# For s3:// URLs, the "netloc" (bucket) is considered part of the path.
# Make sure join() can cross bucket boundaries in this case.
args = ["s3://bucket/a/b", "new-bucket", "c"]
@@ -253,38 +166,7 @@ def test_url_join_absolute_paths():
# works as if everything before the http:// URL was left out
assert url_util.join("literally", "does", "not", "matter", p, "resource") == join_result
# It's important to keep in mind that this logic applies even if the
# component's path is not an absolute path!
# For example:
p = "./d"
# ...is *NOT* an absolute path
# ...is also *NOT* an absolute path component
u = "file://./d"
# ...is a URL
# The path of this URL is *NOT* an absolute path
# HOWEVER, the URL, itself, *is* an absolute path component
# (We just need...
cwd = os.getcwd()
# ...to work out what resource it points to)
if sys.platform == "win32":
convert_to_posix_path(cwd)
cwd = "/" + cwd
# So, even though parse() assumes "file://" URL, the scheme is still
# significant in URL path components passed to join(), even if the base
# is a file:// URL.
path_join_result = "file:///a/b/c/d"
assert url_util.join("/a/b/c", p) == path_join_result
assert url_util.join("file:///a/b/c", p) == path_join_result
url_join_result = "file://{CWD}/d".format(CWD=cwd)
assert url_util.join("/a/b/c", u) == url_join_result
assert url_util.join("file:///a/b/c", u) == url_join_result
assert url_util.join("file:///a/b/c", "./d") == "file:///a/b/c/d"
# Finally, resolve_href should have no effect for how absolute path
# components are handled because local hrefs can not be absolute path


@@ -4,7 +4,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import os
import posixpath
import sys
import pytest
@@ -12,15 +11,17 @@
import llnl.util.tty as tty
import spack.config
import spack.mirror
import spack.paths
import spack.util.s3
import spack.util.url as url_util
import spack.util.web
from spack.version import ver
def _create_url(relative_url):
web_data_path = posixpath.join(spack.paths.test_path, "data", "web")
return "file://" + posixpath.join(web_data_path, relative_url)
web_data_path = os.path.join(spack.paths.test_path, "data", "web")
return url_util.path_to_file_url(os.path.join(web_data_path, relative_url))
root = _create_url("index.html")
@@ -184,6 +185,7 @@ def test_get_header():
@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows (yet)")
def test_list_url(tmpdir):
testpath = str(tmpdir)
testpath_url = url_util.path_to_file_url(testpath)
os.mkdir(os.path.join(testpath, "dir"))
@@ -198,7 +200,7 @@ def test_list_url(tmpdir):
pass
list_url = lambda recursive: list(
sorted(spack.util.web.list_url(testpath, recursive=recursive))
sorted(spack.util.web.list_url(testpath_url, recursive=recursive))
)
assert list_url(False) == ["file-0.txt", "file-1.txt", "file-2.txt"]
@@ -246,14 +248,24 @@ def get_object(self, Bucket=None, Key=None):
def test_gather_s3_information(monkeypatch, capfd):
mock_connection_data = {
"access_token": "AAAAAAA",
"profile": "SPacKDeV",
"access_pair": ("SPA", "CK"),
"endpoint_url": "https://127.0.0.1:8888",
}
mirror = spack.mirror.Mirror.from_dict(
{
"fetch": {
"access_token": "AAAAAAA",
"profile": "SPacKDeV",
"access_pair": ("SPA", "CK"),
"endpoint_url": "https://127.0.0.1:8888",
},
"push": {
"access_token": "AAAAAAA",
"profile": "SPacKDeV",
"access_pair": ("SPA", "CK"),
"endpoint_url": "https://127.0.0.1:8888",
},
}
)
session_args, client_args = spack.util.s3.get_mirror_s3_connection_info(mock_connection_data)
session_args, client_args = spack.util.s3.get_mirror_s3_connection_info(mirror, "push")
# Session args are used to create the S3 Session object
assert "aws_session_token" in session_args
@@ -273,10 +285,10 @@ def test_gather_s3_information(monkeypatch, capfd):
def test_remove_s3_url(monkeypatch, capfd):
fake_s3_url = "s3://my-bucket/subdirectory/mirror"
def mock_create_s3_session(url, connection={}):
def get_s3_session(url, method="fetch"):
return MockS3Client()
monkeypatch.setattr(spack.util.s3, "create_s3_session", mock_create_s3_session)
monkeypatch.setattr(spack.util.s3, "get_s3_session", get_s3_session)
current_debug_level = tty.debug_level()
tty.set_debug(1)
@@ -292,10 +304,10 @@ def mock_create_s3_session(url, connection={}):
def test_s3_url_exists(monkeypatch, capfd):
def mock_create_s3_session(url, connection={}):
def get_s3_session(url, method="fetch"):
return MockS3Client()
monkeypatch.setattr(spack.util.s3, "create_s3_session", mock_create_s3_session)
monkeypatch.setattr(spack.util.s3, "get_s3_session", get_s3_session)
fake_s3_url_exists = "s3://my-bucket/subdirectory/my-file"
assert spack.util.web.url_exists(fake_s3_url_exists)


@@ -122,7 +122,7 @@ def path_to_os_path(*pths):
"""
ret_pths = []
for pth in pths:
if type(pth) is str and not is_path_url(pth):
if isinstance(pth, str) and not is_path_url(pth):
pth = convert_to_platform_path(pth)
ret_pths.append(pth)
return ret_pths
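A short standalone illustration of why the isinstance() check above is the safer spelling: str subclasses fail the exact-type test but are still perfectly good path strings:

class TaggedPath(str):
    """Hypothetical str subclass standing in for any wrapped path string."""

p = TaggedPath("/tmp/example")
assert type(p) is not str  # the old `type(pth) is str` check would skip conversion
assert isinstance(p, str)  # the new check still converts it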


@@ -4,27 +4,74 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import urllib.parse
from typing import Any, Dict, Tuple
import spack
import spack.util.url as url_util
import spack.config
#: Map (mirror name, method) tuples to s3 client instances.
s3_client_cache: Dict[Tuple[str, str], Any] = dict()
def get_mirror_connection(url, url_type="push"):
connection = {}
# Try to find a mirror for potential connection information
# Check to see if desired file starts with any of the mirror URLs
rebuilt_path = url_util.format(url)
# Gather a dict mapping each push URL to its mirror object
mirror_dict = {x.push_url: x for x in spack.mirror.MirrorCollection().values()}
# Ensure most specific URLs (longest) are presented first
mirror_url_keys = mirror_dict.keys()
mirror_url_keys = sorted(mirror_url_keys, key=len, reverse=True)
for mURL in mirror_url_keys:
# See if desired URL starts with the mirror's push URL
if rebuilt_path.startswith(mURL):
connection = mirror_dict[mURL].to_dict()[url_type]
break
return connection
def get_s3_session(url, method="fetch"):
# import boto and friends as late as possible. We don't want to require boto as a
# dependency unless the user actually wants to access S3 mirrors.
from boto3 import Session
from botocore import UNSIGNED
from botocore.client import Config
from botocore.exceptions import ClientError
# Circular dependency
from spack.mirror import MirrorCollection
global s3_client_cache
# Parse the URL if not already done.
if not isinstance(url, urllib.parse.ParseResult):
url = urllib.parse.urlparse(url)
url_str = url.geturl()
def get_mirror_url(mirror):
return mirror.fetch_url if method == "fetch" else mirror.push_url
# Get all configured mirrors that could match.
all_mirrors = MirrorCollection()
mirrors = [
(name, mirror)
for name, mirror in all_mirrors.items()
if url_str.startswith(get_mirror_url(mirror))
]
if not mirrors:
name, mirror = None, {}
else:
# In case we have more than one mirror, we pick the longest matching url.
# The heuristic being that it's more specific, and you can have different
# credentials for a sub-bucket (if that is a thing).
name, mirror = max(
mirrors, key=lambda name_and_mirror: len(get_mirror_url(name_and_mirror[1]))
)
key = (name, method)
# Did we already create a client for this? Then return it.
if key in s3_client_cache:
return s3_client_cache[key]
# Otherwise, create it.
s3_connection, s3_client_args = get_mirror_s3_connection_info(mirror, method)
session = Session(**s3_connection)
# if no access credentials provided above, then access anonymously
if not session.get_credentials():
s3_client_args["config"] = Config(signature_version=UNSIGNED)
client = session.client("s3", **s3_client_args)
client.ClientError = ClientError
# Cache the client.
s3_client_cache[key] = client
return client
def _parse_s3_endpoint_url(endpoint_url):
@@ -34,53 +81,37 @@ def _parse_s3_endpoint_url(endpoint_url):
return endpoint_url
def get_mirror_s3_connection_info(connection):
def get_mirror_s3_connection_info(mirror, method):
"""Create s3 config for session/client from a Mirror instance (or just set defaults
when no mirror is given)."""
from spack.mirror import Mirror
s3_connection = {}
s3_connection_is_dict = connection and isinstance(connection, dict)
if s3_connection_is_dict:
if connection.get("access_token"):
s3_connection["aws_session_token"] = connection["access_token"]
if connection.get("access_pair"):
s3_connection["aws_access_key_id"] = connection["access_pair"][0]
s3_connection["aws_secret_access_key"] = connection["access_pair"][1]
if connection.get("profile"):
s3_connection["profile_name"] = connection["profile"]
s3_client_args = {"use_ssl": spack.config.get("config:verify_ssl")}
endpoint_url = os.environ.get("S3_ENDPOINT_URL")
# access token
if isinstance(mirror, Mirror):
access_token = mirror.get_access_token(method)
if access_token:
s3_connection["aws_session_token"] = access_token
# access pair
access_pair = mirror.get_access_pair(method)
if access_pair and access_pair[0] and access_pair[1]:
s3_connection["aws_access_key_id"] = access_pair[0]
s3_connection["aws_secret_access_key"] = access_pair[1]
# profile
profile = mirror.get_profile(method)
if profile:
s3_connection["profile_name"] = profile
# endpoint url
endpoint_url = mirror.get_endpoint_url(method) or os.environ.get("S3_ENDPOINT_URL")
else:
endpoint_url = os.environ.get("S3_ENDPOINT_URL")
if endpoint_url:
s3_client_args["endpoint_url"] = _parse_s3_endpoint_url(endpoint_url)
elif s3_connection_is_dict and connection.get("endpoint_url"):
s3_client_args["endpoint_url"] = _parse_s3_endpoint_url(connection["endpoint_url"])
return (s3_connection, s3_client_args)
def create_s3_session(url, connection={}):
url = url_util.parse(url)
if url.scheme != "s3":
raise ValueError(
"Can not create S3 session from URL with scheme: {SCHEME}".format(SCHEME=url.scheme)
)
# NOTE(opadron): import boto and friends as late as possible. We don't
# want to require boto as a dependency unless the user actually wants to
# access S3 mirrors.
from boto3 import Session # type: ignore[import]
from botocore.exceptions import ClientError # type: ignore[import]
s3_connection, s3_client_args = get_mirror_s3_connection_info(connection)
session = Session(**s3_connection)
# if no access credentials provided above, then access anonymously
if not session.get_credentials():
from botocore import UNSIGNED # type: ignore[import]
from botocore.client import Config # type: ignore[import]
s3_client_args["config"] = Config(signature_version=UNSIGNED)
client = session.client("s3", **s3_client_args)
client.ClientError = ClientError
return client
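The heart of the rewrite is that clients are now memoized per (mirror name, method) pair rather than rebuilt on every request. A boto3-free sketch of that cache shape, with a stub object standing in for the real Session/client construction:

from typing import Any, Dict, Tuple

_client_cache: Dict[Tuple[str, str], Any] = {}

def get_client(mirror_name, method="fetch"):
    key = (mirror_name, method)
    if key not in _client_cache:
        # In the real code this is where Session(**s3_connection).client("s3", ...) runs.
        _client_cache[key] = object()
    return _client_cache[key]

assert get_client("my-mirror", "push") is get_client("my-mirror", "push")
assert get_client("my-mirror", "push") is not get_client("my-mirror", "fetch")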


@@ -8,18 +8,14 @@
"""
import itertools
import os
import posixpath
import re
import sys
import urllib.parse
import urllib.request
from spack.util.path import (
canonicalize_path,
convert_to_platform_path,
convert_to_posix_path,
)
is_windows = sys.platform == "win32"
from spack.util.path import convert_to_posix_path
def _split_all(path):
@@ -49,82 +45,22 @@ def local_file_path(url):
file or directory referenced by it. Otherwise, return None.
"""
if isinstance(url, str):
url = parse(url)
url = urllib.parse.urlparse(url)
if url.scheme == "file":
if is_windows:
pth = convert_to_platform_path(url.netloc + url.path)
if re.search(r"^\\[A-Za-z]:", pth):
pth = pth.lstrip("\\")
return pth
return url.path
return urllib.request.url2pathname(url.path)
return None
def parse(url, scheme="file"):
"""Parse a url.
def path_to_file_url(path):
if not os.path.isabs(path):
path = os.path.abspath(path)
return urllib.parse.urljoin("file:", urllib.request.pathname2url(path))
Path variable substitution is performed on file URLs as needed. The
variables are documented at
https://spack.readthedocs.io/en/latest/configuration.html#spack-specific-variables.
Arguments:
url (str): URL to be parsed
scheme (str): associated URL scheme
Returns:
(urllib.parse.ParseResult): For file scheme URLs, the
netloc and path components are concatenated and passed through
spack.util.path.canonicalize_path(). Otherwise, the returned value
is the same as urllib's urlparse() with allow_fragments=False.
"""
# guarantee a value passed in is of proper url format. Guarantee
# allows for easier string manipulation across platforms
if isinstance(url, str):
require_url_format(url)
url = escape_file_url(url)
url_obj = (
urllib.parse.urlparse(
url,
scheme=scheme,
allow_fragments=False,
)
if isinstance(url, str)
else url
)
(scheme, netloc, path, params, query, _) = url_obj
scheme = (scheme or "file").lower()
if scheme == "file":
# (The user explicitly provides the file:// scheme.)
# examples:
# file://C:\\a\\b\\c
# file://X:/a/b/c
path = canonicalize_path(netloc + path)
path = re.sub(r"^/+", "/", path)
netloc = ""
drive_ltr_lst = re.findall(r"[A-Za-z]:\\", path)
is_win_path = bool(drive_ltr_lst)
if is_windows and is_win_path:
drive_ltr = drive_ltr_lst[0].strip("\\")
path = re.sub(r"[\\]*" + drive_ltr, "", path)
netloc = "/" + drive_ltr.strip("\\")
if sys.platform == "win32":
path = convert_to_posix_path(path)
return urllib.parse.ParseResult(
scheme=scheme,
netloc=netloc,
path=path,
params=params,
query=query,
fragment=None,
)
def file_url_string_to_path(url):
return urllib.request.url2pathname(urllib.parse.urlparse(url).path)
def format(parsed_url):
@@ -133,7 +69,7 @@ def format(parsed_url):
Returns a canonicalized format of the given URL as a string.
"""
if isinstance(parsed_url, str):
parsed_url = parse(parsed_url)
parsed_url = urllib.parse.urlparse(parsed_url)
return parsed_url.geturl()
@@ -179,18 +115,6 @@ def join(base_url, path, *extra, **kwargs):
# For canonicalizing file:// URLs, take care to explicitly differentiate
# between absolute and relative join components.
# '$spack' is not an absolute path component
join_result = spack.util.url.join('/a/b/c', '$spack') ; join_result
'file:///a/b/c/$spack'
spack.util.url.format(join_result)
'file:///a/b/c/opt/spack'
# '/$spack' *is* an absolute path component
join_result = spack.util.url.join('/a/b/c', '/$spack') ; join_result
'file:///$spack'
spack.util.url.format(join_result)
'file:///opt/spack'
"""
paths = [
(x) if isinstance(x, str) else x.geturl() for x in itertools.chain((base_url, path), extra)
@@ -260,7 +184,7 @@ def join(base_url, path, *extra, **kwargs):
def _join(base_url, path, *extra, **kwargs):
base_url = parse(base_url)
base_url = urllib.parse.urlparse(base_url)
resolve_href = kwargs.get("resolve_href", False)
(scheme, netloc, base_path, params, query, _) = base_url
@@ -365,20 +289,3 @@ def parse_git_url(url):
raise ValueError("bad port in git url: %s" % url)
return (scheme, user, hostname, port, path)
def is_url_format(url):
return re.search(r"^(file://|http://|https://|ftp://|s3://|gs://|ssh://|git://|/)", url)
def require_url_format(url):
if not is_url_format(url):
raise ValueError("Invalid url format from url: %s" % url)
def escape_file_url(url):
drive_ltr = re.findall(r"[A-Za-z]:\\", url)
if is_windows and drive_ltr:
url = url.replace(drive_ltr[0], "/" + drive_ltr[0])
return url
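The replacement helpers above lean entirely on the standard library. A quick standalone demonstration of the roundtrip they provide, including the percent-encoding that bare "file://" + path concatenation used to get wrong:

import urllib.parse
import urllib.request

url = urllib.parse.urljoin("file:", urllib.request.pathname2url("/tmp/x y"))
print(url)  # file:///tmp/x%20y on POSIX; pathname2url also handles drive letters
path = urllib.request.url2pathname(urllib.parse.urlparse(url).path)
print(path)  # /tmp/x y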


@@ -15,6 +15,7 @@
import ssl
import sys
import traceback
import urllib.parse
from html.parser import HTMLParser
from urllib.error import URLError
from urllib.request import Request, urlopen
@@ -68,7 +69,7 @@ def uses_ssl(parsed_url):
if not endpoint_url:
return True
if url_util.parse(endpoint_url, scheme="https").scheme == "https":
if urllib.parse.urlparse(endpoint_url).scheme == "https":
return True
elif parsed_url.scheme == "gs":
@@ -79,7 +80,8 @@ def uses_ssl(parsed_url):
def read_from_url(url, accept_content_type=None):
url = url_util.parse(url)
if isinstance(url, str):
url = urllib.parse.urlparse(url)
context = None
# Timeout in seconds for web requests
@@ -143,13 +145,9 @@ def read_from_url(url, accept_content_type=None):
def push_to_url(local_file_path, remote_path, keep_original=True, extra_args=None):
if sys.platform == "win32":
if remote_path[1] == ":":
remote_path = "file://" + remote_path
remote_url = url_util.parse(remote_path)
remote_file_path = url_util.local_file_path(remote_url)
if remote_file_path is not None:
remote_url = urllib.parse.urlparse(remote_path)
if remote_url.scheme == "file":
remote_file_path = url_util.local_file_path(remote_url)
mkdirp(os.path.dirname(remote_file_path))
if keep_original:
shutil.copy(local_file_path, remote_file_path)
@@ -175,9 +173,7 @@ def push_to_url(local_file_path, remote_path, keep_original=True, extra_args=None):
while remote_path.startswith("/"):
remote_path = remote_path[1:]
s3 = s3_util.create_s3_session(
remote_url, connection=s3_util.get_mirror_connection(remote_url)
)
s3 = s3_util.get_s3_session(remote_url, method="push")
s3.upload_file(local_file_path, remote_url.netloc, remote_path, ExtraArgs=extra_args)
if not keep_original:
@@ -367,7 +363,7 @@ def url_exists(url, curl=None):
Returns (bool): True if it exists; False otherwise.
"""
tty.debug("Checking existence of {0}".format(url))
url_result = url_util.parse(url)
url_result = urllib.parse.urlparse(url)
# Check if a local file
local_path = url_util.local_file_path(url_result)
@@ -377,9 +373,7 @@ def url_exists(url, curl=None):
# Check if Amazon Simple Storage Service (S3) .. urllib-based fetch
if url_result.scheme == "s3":
# Check for URL-specific connection information
s3 = s3_util.create_s3_session(
url_result, connection=s3_util.get_mirror_connection(url_result)
) # noqa: E501
s3 = s3_util.get_s3_session(url_result, method="fetch")
try:
s3.get_object(Bucket=url_result.netloc, Key=url_result.path.lstrip("/"))
@@ -429,7 +423,7 @@ def _debug_print_delete_results(result):
def remove_url(url, recursive=False):
url = url_util.parse(url)
url = urllib.parse.urlparse(url)
local_path = url_util.local_file_path(url)
if local_path:
@@ -441,7 +435,7 @@ def remove_url(url, recursive=False):
if url.scheme == "s3":
# Try to find a mirror for potential connection information
s3 = s3_util.create_s3_session(url, connection=s3_util.get_mirror_connection(url))
s3 = s3_util.get_s3_session(url, method="push")
bucket = url.netloc
if recursive:
# Because list_objects_v2 can only return up to 1000 items
@@ -538,9 +532,9 @@ def _iter_local_prefix(path):
def list_url(url, recursive=False):
url = url_util.parse(url)
url = urllib.parse.urlparse(url)
local_path = url_util.local_file_path(url)
if local_path:
if recursive:
return list(_iter_local_prefix(local_path))
@@ -551,7 +545,7 @@ def list_url(url, recursive=False):
]
if url.scheme == "s3":
s3 = s3_util.create_s3_session(url, connection=s3_util.get_mirror_connection(url))
s3 = s3_util.get_s3_session(url, method="fetch")
if recursive:
return list(_iter_s3_prefix(s3, url))
@@ -669,7 +663,7 @@ def _spider(url, collect_nested):
collect = current_depth < depth
for root in root_urls:
root = url_util.parse(root)
root = urllib.parse.urlparse(root)
spider_args.append((root, collect))
tp = multiprocessing.pool.ThreadPool(processes=concurrency)
@@ -708,11 +702,11 @@ def _urlopen(req, *args, **kwargs):
del kwargs["context"]
opener = urlopen
if url_util.parse(url).scheme == "s3":
if urllib.parse.urlparse(url).scheme == "s3":
import spack.s3_handler
opener = spack.s3_handler.open
elif url_util.parse(url).scheme == "gs":
elif urllib.parse.urlparse(url).scheme == "gs":
import spack.gcs_handler
opener = spack.gcs_handler.gcs_open
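With url_util.parse gone, scheme dispatch is plain urllib. A minimal sketch of the pattern _urlopen now uses (handler names taken from the diff above; the returned strings are illustrative):

import urllib.parse

def opener_name(url):
    scheme = urllib.parse.urlparse(url).scheme
    if scheme == "s3":
        return "spack.s3_handler.open"
    if scheme == "gs":
        return "spack.gcs_handler.gcs_open"
    return "urllib.request.urlopen"

assert opener_name("s3://bucket/key") == "spack.s3_handler.open"
assert opener_name("https://mirror.spack.io/x") == "urllib.request.urlopen"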


@@ -1,6 +1,6 @@
# content of pytest.ini
[pytest]
addopts = --durations=30 -ra
addopts = --durations=30 -ra --strict-markers
testpaths = lib/spack/spack/test
python_files = *.py
filterwarnings =
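--strict-markers makes pytest error out at collection time on any marker that was never registered, so markers such as the @pytest.mark.regression("34368") used earlier in this diff must be declared somewhere. A hedged sketch of one way to do that from conftest.py (the description string is illustrative):

def pytest_configure(config):
    # Register the marker so --strict-markers accepts it at collection time.
    config.addinivalue_line(
        "markers", "regression(issue): test tracks a previously fixed issue"
    )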


@@ -263,7 +263,7 @@ spack:
- cat /proc/loadavg || true
image: ecpe4s/ubuntu20.04-runner-x86_64:2022-12-01
broken-tests-packages:
- gptune
@@ -274,6 +274,8 @@ spack:
- llvm
- llvm-amdgpu
- rocblas
- paraview
- py-torch
runner-attributes:
tags: [ "spack", "huge", "x86_64" ]
variables:


@@ -113,6 +113,7 @@ spack:
mappings:
- match:
- llvm
- py-torch
runner-attributes:
tags: [ "spack", "huge", "x86_64_v4" ]
variables:


@@ -116,6 +116,7 @@ spack:
mappings:
- match:
- llvm
- py-torch
runner-attributes:
tags: [ "spack", "huge", "x86_64_v4" ]
variables:


@@ -119,6 +119,7 @@ spack:
mappings:
- match:
- llvm
- py-torch
runner-attributes:
tags: [ "spack", "huge", "x86_64_v4" ]
variables:


@@ -42,7 +42,8 @@ do
succeeds _spack_completions "${line[@]}" ''
# Test that completion with flags works
contains '-h --help' _spack_completions "${line[@]}" -
# all commands but spack pkg grep have -h; all have --help
contains '--help' _spack_completions "${line[@]}" -
done <<- EOF
$(spack commands --aliases --format=subcommands)
EOF


@@ -111,7 +111,8 @@ contains "b@" echo $LIST_CONTENT
does_not_contain "a@" echo $LIST_CONTENT
fails spack -m load -l
# test a variable MacOS clears and one it doesn't for recursive loads
contains "export PATH=$(spack -m location -i a)/bin:$(spack -m location -i b)/bin" spack -m load --sh a
contains "export PATH=$(spack -m location -i a)/bin" spack -m load --sh a
contains "export PATH=$(spack -m location -i b)/bin" spack -m load --sh b
succeeds spack -m load --only dependencies a
succeeds spack -m load --only package a
fails spack -m load d


@@ -1,3 +0,0 @@
spack compiler find
echo F|xcopy .\spack\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
spack external find cmake


@@ -0,0 +1,2 @@
spack compiler find
spack external find cmake


@@ -0,0 +1,3 @@
if ($LASTEXITCODE -ne 0){
throw "Unit Tests have failed"
}


@@ -1,11 +1,5 @@
Set-Location ../
$env:python_pf_ver="C:\hostedtoolcache\windows\Python\3.9.5\x64\python.exe"
cmd /c "`"spack\bin\spack_cmd.bat`" print " |
foreach {
if ($_ -match "=") {
$v = $_.split("=")
[Environment]::SetEnvironmentVariable($v[0], $v[1])
}
}
$ErrorActionPreference = "SilentlyContinue"
Write-Output F|xcopy .\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
# The line below prevents the _spack_root symlink from causing issues with cyclic symlinks on Windows
(Get-Item '.\lib\spack\docs\_spack_root').Delete()
./share/spack/setup-env.ps1


@@ -556,7 +556,7 @@ _spack_buildcache_sync() {
}
_spack_buildcache_update_index() {
SPACK_COMPREPLY="-h --help -d --mirror-url -k --keys"
SPACK_COMPREPLY="-h --help -d --directory -m --mirror-name --mirror-url -k --keys"
}
_spack_cd() {
@@ -1450,7 +1450,7 @@ _spack_pkg() {
then
SPACK_COMPREPLY="-h --help"
else
SPACK_COMPREPLY="add list diff added changed removed source hash"
SPACK_COMPREPLY="add list diff added changed removed grep source hash"
fi
}
@@ -1508,6 +1508,15 @@ _spack_pkg_removed() {
fi
}
_spack_pkg_grep() {
if $list_options
then
SPACK_COMPREPLY="--help"
else
SPACK_COMPREPLY=""
fi
}
_spack_pkg_source() {
if $list_options
then

View File

@@ -3,6 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import sys
from spack.package import *
@@ -23,6 +25,12 @@ def compiler_search_prefix(self):
def install(self, spec, prefix):
# Create the minimal compiler that will fool `spack compiler find`
mkdirp(self.compiler_search_prefix)
with open(self.compiler_search_prefix.icx, "w") as f:
f.write('#!/bin/bash\necho "oneAPI DPC++ Compiler %s"' % str(spec.version))
set_executable(self.compiler_search_prefix.icx)
comp = self.compiler_search_prefix.icx
if sys.platform == "win32":
comp = comp + ".bat"
comp_string = "@echo off\necho oneAPI DPC++ Compiler %s" % str(spec.version)
else:
comp_string = '#!/bin/bash\necho "oneAPI DPC++ Compiler %s"' % str(spec.version)
with open(comp, "w") as f:
f.write(comp_string)
set_executable(comp)


@@ -47,11 +47,13 @@ class Arrow(CMakePackage, CudaPackage):
depends_on("rapidjson")
depends_on("re2+shared", when="+compute")
depends_on("re2+shared", when="+gandiva")
depends_on("re2+shared", when="+python")
depends_on("snappy~shared", when="+snappy @9:")
depends_on("snappy~shared", when="@8:")
depends_on("thrift+pic", when="+parquet")
depends_on("utf8proc@2.7.0: +shared", when="+compute")
depends_on("utf8proc@2.7.0: +shared", when="+gandiva")
depends_on("utf8proc@2.7.0: +shared", when="+python")
depends_on("xsimd@8.1.0:", when="@9.0.0:")
depends_on("zlib+pic", when="+zlib @9:")
depends_on("zlib+pic", when="@:8")
@@ -145,7 +147,12 @@ def cmake_args(self):
args.append(self.define_from_variant("ARROW_WITH_ZSTD", "zstd"))
with when("@:8"):
for dep in ("flatbuffers", "rapidjson", "snappy", "zlib", "zstd"):
dep_list = ("flatbuffers", "rapidjson", "zlib", "zstd")
if self.spec.satisfies("+snappy"):
dep_list.append("snappy")
for dep in dep_list:
args.append("-D{0}_HOME={1}".format(dep.upper(), self.spec[dep].prefix))
args.append("-DZLIB_LIBRARIES={0}".format(self.spec["zlib"].libs))


@@ -15,11 +15,12 @@ class Atmi(CMakePackage):
homepage = "https://github.com/RadeonOpenCompute/atmi"
git = "https://github.com/RadeonOpenCompute/atmi.git"
url = "https://github.com/RadeonOpenCompute/atmi/archive/rocm-5.2.0.tar.gz"
url = "https://github.com/RadeonOpenCompute/atmi/archive/rocm-5.3.0.tar.gz"
tags = ["rocm"]
maintainers = ["srekolam", "renjithravindrankannath"]
version("5.3.0", sha256="dffc0eb0bc1617843e7f728dbd6c8b12326c5c8baa34369aa267aab40f5deb6a")
version("5.2.3", sha256="5f66c59e668cf968e86b556a0a52ee0202d1b370d8406e291a874cbfd200ee17")
version("5.2.1", sha256="6b33445aa67444c038cd756f855a58a72dd35db57e7b63da37fe78a8585b982b")
version("5.2.0", sha256="33e77905a607734157d46c736c924c7c50b6b13f2b2ddbf711cb08e37f2efa4f")
@@ -126,6 +127,7 @@ class Atmi(CMakePackage):
"5.2.0",
"5.2.1",
"5.2.3",
"5.3.0",
]:
depends_on("comgr@" + ver, type="link", when="@" + ver)
depends_on("hsa-rocr-dev@" + ver, type="link", when="@" + ver)


@@ -16,16 +16,31 @@ class Bash(AutotoolsPackage, GNUMirrorPackage):
maintainers = ["adamjstewart"]
version("5.2", sha256="a139c166df7ff4471c5e0733051642ee5556c1cc8a4a78f145583c5c81ab32fb")
version("5.1", sha256="cc012bc860406dcf42f64431bcd3d2fa7560c02915a601aba9cd597a39329baa")
version("5.0", sha256="b4a80f2ac66170b2913efbfb9f2594f1f76c7b1afd11f799e22035d63077fb4d")
version("4.4", sha256="d86b3392c1202e8ff5a423b302e6284db7f8f435ea9f39b5b1b20fd3ac36dfcb")
version("4.3", sha256="afc687a28e0e24dc21b988fa159ff9dbcf6b7caa92ade8645cc6d5605cd024d4")
depends_on("ncurses")
depends_on("readline@8.2:", when="@5.2:")
depends_on("readline@5.0:")
depends_on("iconv")
depends_on("gettext")
patches = [
("5.2", "001", "f42f2fee923bc2209f406a1892772121c467f44533bedfe00a176139da5d310a"),
("5.2", "002", "45cc5e1b876550eee96f95bffb36c41b6cb7c07d33f671db5634405cd00fd7b8"),
("5.2", "003", "6a090cdbd334306fceacd0e4a1b9e0b0678efdbbdedbd1f5842035990c8abaff"),
("5.2", "004", "38827724bba908cf5721bd8d4e595d80f02c05c35f3dd7dbc4cd3c5678a42512"),
("5.2", "005", "ece0eb544368b3b4359fb8464caa9d89c7a6743c8ed070be1c7d599c3675d357"),
("5.2", "006", "d1e0566a257d149a0d99d450ce2885123f9995e9c01d0a5ef6df7044a72a468c"),
("5.2", "007", "2500a3fc21cb08133f06648a017cebfa27f30ea19c8cbe8dfefdf16227cfd490"),
("5.2", "008", "6b4bd92fd0099d1bab436b941875e99e0cb3c320997587182d6267af1844b1e8"),
("5.2", "009", "f95a817882eaeb0cb78bce82859a86bbb297a308ced730ebe449cd504211d3cd"),
("5.2", "010", "c7705e029f752507310ecd7270aef437e8043a9959e4d0c6065a82517996c1cd"),
("5.2", "011", "831b5f25bf3e88625f3ab315043be7498907c551f86041fa3b914123d79eb6f4"),
("5.2", "012", "2fb107ce1fb8e93f36997c8b0b2743fc1ca98a454c7cc5a3fcabec533f67d42c"),
("5.1", "001", "ebb07b3dbadd98598f078125d0ae0d699295978a5cdaef6282fe19adef45b5fa"),
("5.1", "002", "15ea6121a801e48e658ceee712ea9b88d4ded022046a6147550790caf04f5dbe"),
("5.1", "003", "22f2cc262f056b22966281babf4b0a2f84cb7dd2223422e5dcd013c3dcbab6b1"),


@@ -26,7 +26,7 @@ class Bcache(MakefilePackage):
def setup_build_environment(self, env):
# Add -lintl if provided by gettext, otherwise libintl is provided by the system's glibc:
if any("libintl" in filename for filename in self.libs):
if any("libintl." in filename.split("/")[-1] for filename in self.spec["gettext"].libs):
env.append_flags("LDFLAGS", "-lintl")
patch(


@@ -81,12 +81,9 @@ class Binutils(AutotoolsPackage, GNUMirrorPackage):
depends_on("m4", type="build", when="@:2.29 +gold")
depends_on("bison", type="build", when="@:2.29 +gold")
# 2.38 with +gas needs makeinfo due to a bug, see:
# https://sourceware.org/bugzilla/show_bug.cgi?id=28909
depends_on("texinfo", type="build", when="@2.38 +gas")
# 2.34 needs makeinfo due to a bug, see:
# 2.34:2.38 needs makeinfo due to a bug, see:
# https://sourceware.org/bugzilla/show_bug.cgi?id=25491
depends_on("texinfo", type="build", when="@2.34")
depends_on("texinfo", type="build", when="@2.34:2.38")
conflicts("+gold", when="platform=darwin", msg="Binutils cannot build linkers on macOS")


@@ -0,0 +1,26 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Btop(MakefilePackage):
"""Resource monitor that shows usage and stats for processor,
memory, disks, network and processes.
"""
homepage = "https://github.com/aristocratos/btop#documents"
url = "https://github.com/aristocratos/btop/archive/refs/tags/v1.2.13.tar.gz"
maintainers = ["alalazo"]
version("1.2.13", sha256="668dc4782432564c35ad0d32748f972248cc5c5448c9009faeb3445282920e02")
conflicts("%gcc@:9", msg="C++ 20 is required")
build_targets = ["STATIC=true", "VERBOSE=true"]
@property
def install_targets(self):
return [f"PREFIX={self.prefix}", "install"]

View File

@@ -25,6 +25,9 @@ class Bufr(CMakePackage):
"jbathegit",
]
version("11.7.1", sha256="6533ce6eaa6b02c0cb5424cfbc086ab120ccebac3894980a4daafd4dfadd71f8")
version("11.7.0", sha256="6a76ae8e7682bbc790321bf80c2f9417775c5b01a5c4f10763df92e01b20b9ca")
version("11.6.0", sha256="af4c04e0b394aa9b5f411ec5c8055888619c724768b3094727e8bb7d3ea34a54")
version("11.5.0", sha256="d154839e29ef1fe82e58cf20232e9f8a4f0610f0e8b6a394b7ca052e58f97f43")
def _setup_bufr_environment(self, env, suffix):


@@ -14,6 +14,7 @@ class Cli11(CMakePackage):
url = "https://github.com/CLIUtils/CLI11/archive/v1.9.1.tar.gz"
maintainers = ["nightlark"]
version("2.3.1", sha256="378da73d2d1d9a7b82ad6ed2b5bda3e7bc7093c4034a1d680a2e009eb067e7b2")
version("2.1.1", sha256="d69023d1d0ab6a22be86b4f59d449422bc5efd9121868f4e284d6042e52f682e")
version("2.1.0", sha256="2661b0112b02478bad3dc7f1749c4825bfc7e37b440cbb4c8c0e2ffaa3999112")
version("2.0.0", sha256="2c672f17bf56e8e6223a3bfb74055a946fa7b1ff376510371902adb9cb0ab6a3")


@@ -28,6 +28,7 @@ class Cmake(Package):
executables = ["^cmake$"]
version("master", branch="master")
version("3.25.1", sha256="1c511d09516af493694ed9baf13c55947a36389674d657a2d5e0ccedc6b291d8")
version("3.25.0", sha256="306463f541555da0942e6f5a0736560f70c487178b9d94a5ae7f34d0538cdd48")
version("3.24.3", sha256="b53aa10fa82bff84ccdb59065927b72d3bee49f4d86261249fc0984b3b367291")
version("3.24.2", sha256="0d9020f06f3ddf17fb537dc228e1a56c927ee506b486f55fe2dc19f69bf0c8db")

View File

@@ -14,7 +14,7 @@ class Comgr(CMakePackage):
homepage = "https://github.com/RadeonOpenCompute/ROCm-CompilerSupport"
git = "https://github.com/RadeonOpenCompute/ROCm-CompilerSupport.git"
url = "https://github.com/RadeonOpenCompute/ROCm-CompilerSupport/archive/rocm-5.2.3.tar.gz"
url = "https://github.com/RadeonOpenCompute/ROCm-CompilerSupport/archive/rocm-5.3.0.tar.gz"
tags = ["rocm"]
maintainers = ["srekolam", "renjithravindrankannath", "haampie"]
@@ -22,6 +22,7 @@ class Comgr(CMakePackage):
version("master", branch="amd-stg-open")
version("5.3.0", sha256="072f849d79476d87d31d62b962e368762368d540a9da02ee2675963dc4942b2c")
version("5.2.3", sha256="36d67dbe791d08ad0a02f0f3aedd46059848a0a232c5f999670103b0410c89dc")
version("5.2.1", sha256="ebeaea8e653fc2b9d67d3271be44690ac7876ee679baa01d47863e75362b8c85")
version("5.2.0", sha256="5f63fa93739ee9230756ef93c53019474b6cdddea3b588492d785dae1b08c087")
@@ -138,6 +139,7 @@ class Comgr(CMakePackage):
"5.2.0",
"5.2.1",
"5.2.3",
"5.3.0",
"master",
]:
# llvm libs are linked statically, so this *could* be a build dep

View File

@@ -25,10 +25,14 @@ class DarshanRuntime(AutotoolsPackage):
test_requires_compiler = True
version("main", branch="main", submodules=True)
version(
"3.4.1",
sha256="77c0a4675d94a0f9df5710e5b8658cc9ef0f0981a6dafb114d0389b1af64774c",
preferred=True,
)
version(
"3.4.0",
sha256="7cc88b7c130ec3b574f6b73c63c3c05deec67b1350245de6d39ca91d4cff0842",
preferred=True,
)
version(
"3.4.0-pre1", sha256="57d0fd40329b9f8a51bdc9d7635b646692b341d80339115ab203357321706c09"
@@ -52,6 +56,7 @@ class DarshanRuntime(AutotoolsPackage):
depends_on("mpi", when="+mpi")
depends_on("zlib")
depends_on("hdf5", when="+hdf5")
depends_on("parallel-netcdf", when="+parallel-netcdf")
depends_on("papi", when="+apxc")
depends_on("autoconf", type="build", when="@main")
depends_on("automake", type="build", when="@main")
@@ -64,6 +69,12 @@ class DarshanRuntime(AutotoolsPackage):
variant("mpi", default=True, description="Compile with MPI support")
variant("hdf5", default=False, description="Compile with HDF5 module", when="@3.2:")
variant(
"parallel-netcdf",
default=False,
description="Compile with Parallel NetCDF module",
when="@3.4.1:",
)
variant("apmpi", default=False, description="Compile with AutoPerf MPI module", when="@3.3:")
variant(
"apmpi_sync",
@@ -103,6 +114,8 @@ def configure_args(self):
extra_args.append("--enable-hdf5-mod=%s" % spec["hdf5"].prefix)
else:
extra_args.append("--enable-hdf5-mod")
if "+parallel-netcdf" in spec:
extra_args.append("--enable-pnetcdf-mod")
if "+apmpi" in spec:
extra_args.append("--enable-apmpi-mod")
if "+apmpi_sync" in spec:

View File

@@ -21,10 +21,14 @@ class DarshanUtil(AutotoolsPackage):
tags = ["e4s"]
version("main", branch="main", submodules="True")
version(
"3.4.1",
sha256="77c0a4675d94a0f9df5710e5b8658cc9ef0f0981a6dafb114d0389b1af64774c",
preferred=True,
)
version(
"3.4.0",
sha256="7cc88b7c130ec3b574f6b73c63c3c05deec67b1350245de6d39ca91d4cff0842",
preferred=True,
)
version(
"3.4.0-pre1", sha256="57d0fd40329b9f8a51bdc9d7635b646692b341d80339115ab203357321706c09"

View File

@@ -4,7 +4,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
from spack.pkg.builtin.boost import Boost
class Delly2(MakefilePackage):
@@ -14,19 +13,27 @@ class Delly2(MakefilePackage):
short-read massively parallel sequencing data."""
homepage = "https://github.com/dellytools/delly"
url = "https://github.com/dellytools/delly/archive/refs/tags/v1.1.6.tar.gz"
git = "https://github.com/dellytools/delly.git"
maintainers = ["snehring"]
version("1.1.6", sha256="08961e9c81431eb486476fa71eea94941ad24ec1970b71e5a7720623a39bfd2a")
version("0.9.1", tag="v0.9.1")
version("2017-08-03", commit="e32a9cd55c7e3df5a6ae4a91f31a0deb354529fc", deprecated=True)
variant("openmp", default=False, description="Build with openmp support")
depends_on("htslib", type=("build", "link"))
depends_on("boost", type=("build", "link"))
# TODO: replace this with an explicit list of components of Boost,
# for instance depends_on('boost +filesystem')
# See https://github.com/spack/spack/pull/22303 for reference
depends_on(Boost.with_default_variants)
depends_on(
"boost@:1.78.0+iostreams+filesystem+system+program_options+date_time",
when="@:0.9.1",
type=("build", "link"),
)
depends_on(
"boost+iostreams+filesystem+system+program_options+date_time",
when="@0.9.1:",
type=("build", "link"),
)
depends_on("bcftools", type="run")
def edit(self, spec, prefix):
@@ -49,13 +56,17 @@ def edit(self, spec, prefix):
makefile.filter(".boost:", "# .boost:")
else:
env["EBROOTHTSLIB"] = self.spec["htslib"].prefix
filter_file("BUILT_PROGRAMS =.*$", "BUILT_PROGRAMS = src/delly src/dpe", "Makefile")
if self.spec.satisfies("@0.9.1"):
filter_file(
"BUILT_PROGRAMS =.*$", "BUILT_PROGRAMS = src/delly src/dpe", "Makefile"
)
filter_file("${SUBMODULES}", "", "Makefile", string=True)
def install(self, spec, prefix):
mkdirp(prefix.bin)
with working_dir("src"):
install("delly", prefix.bin)
install("dpe", prefix.bin)
if self.spec.satisfies("@0.9.1") or self.spec.satisfies("@2017-08-03"):
install("dpe", prefix.bin)
if self.spec.satisfies("@2017-08-03"):
install("cov", prefix.bin)


@@ -0,0 +1,44 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Dlb(AutotoolsPackage):
"""DLB is a dynamic library designed to speed up HPC hybrid applications
(i.e., two levels of parallelism) by improving the load balance of the
outer level of parallelism (e.g., MPI) by dynamically redistributing the
computational resources at the inner level of parallelism (e.g., OpenMP)
at run time."""
homepage = "https://pm.bsc.es/dlb"
url = "https://pm.bsc.es/ftp/dlb/releases/dlb-3.2.tar.gz"
git = "https://github.com/bsc-pm/dlb.git"
maintainers = ["vlopezh"]
version("main", branch="main")
version("3.2", sha256="b1c65ce3179b5275cfdf0bf921c0565a4a3ebcfdab72d7cef014957c17136c7e")
version("3.1", sha256="d63ee89429fdb54af5510ed956f86d11561911a7860b46324f25200d32d0d333")
version("3.0.2", sha256="75b6cf83ea24bb0862db4ed86d073f335200a0b54e8af8fee6dcf32da443b6b8")
version("3.0.1", sha256="04f8a7aa269d02fc8561d0a61d64786aa18850367ce4f95d086ca12ab3eb7d24")
version("3.0", sha256="e3fc1d51e9ded6d4d40d37f8568da4c4d72d1a8996bdeff2dfbbd86c9b96e36a")
variant("debug", default=False, description="Builds additional debug libraries")
variant("mpi", default=False, description="Builds MPI libraries")
depends_on("mpi", when="+mpi")
depends_on("python", type="build")
depends_on("autoconf", type="build", when="@main")
depends_on("automake", type="build", when="@main")
depends_on("libtool", type="build", when="@main")
def configure_args(self):
args = []
args.extend(self.enable_or_disable("debug"))
args.extend(self.enable_or_disable("instrumentation-debug", variant="debug"))
args.extend(self.with_or_without("mpi"))
return args


@@ -0,0 +1,30 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Drishti(PythonPackage):
"""
Drishti is a command-line tool to guide end-users in optimizing I/O in their applications
by detecting typical I/O performance pitfalls and providing a set of recommendations.
"""
homepage = "https://github.com/hpc-io/drishti"
git = "https://github.com/hpc-io/drishti"
pypi = "drishti-io/drishti-io-0.4.tar.gz"
maintainers = ["jeanbez", "sbyna"]
version("master", branch="master")
version("0.4", sha256="bbbb272b4f6f44ae762f6cba28a2c589e15608691c559af0cc2f552590335d7b")
depends_on("darshan-util", type=("run"))
depends_on("python@3.6:", type=("build", "run"))
depends_on("py-pandas", type=("build", "run"))
depends_on("py-rich@12.5.1", type=("build", "run"))
depends_on("py-darshan", type=("build", "run"))

View File

@@ -24,6 +24,7 @@ class Esmf(MakefilePackage):
# Develop is a special name for spack and is always considered the newest version
version("develop", branch="develop")
# generate chksum with spack checksum esmf@x.y.z
version("8.4.0", sha256="28531810bf1ae78646cda6494a53d455d194400f19dccd13d6361871de42ed0f")
version(
"8.3.1",
sha256="6c39261e55dcdf9781cdfa344417b9606f7f961889d5ec626150f992f04f146d",
@@ -245,7 +246,8 @@ def edit(self, spec, prefix):
#######
# ESMF_OS must be set for Cray systems
if "platform=cray" in self.spec:
# But spack no longer gives arch == cray
if self.compiler.name == "cce" or "^cray-mpich" in self.spec:
os.environ["ESMF_OS"] = "Unicos"
#######
@@ -326,7 +328,7 @@ def edit(self, spec, prefix):
##############
# ParallelIO #
##############
if "+parallelio" in spec and "+mpi" in spec:
if "+parallelio" in spec:
os.environ["ESMF_PIO"] = "external"
os.environ["ESMF_PIO_LIBPATH"] = spec["parallelio"].prefix.lib
os.environ["ESMF_PIO_INCLUDE"] = spec["parallelio"].prefix.include

Some files were not shown because too many files have changed in this diff.