Compare commits


275 Commits

Author SHA1 Message Date
Wouter Deconinck
23197b78f9 feat: spack graph --mermaid 2024-05-04 16:13:12 -07:00
Seth Bromberger
eac5ea869f tmux: add v3.4 and missing yacc build dep (#43975) 2024-05-04 22:33:19 +02:00
Juan Miguel Carceller
f5946c4621 pythia8: Add a gzip variant and filter some compiler wrapper paths (#43807)
* Add a gzip variant and filter some compiler wrapper paths

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-04 09:33:44 -05:00
Harmen Stoppels
8564ab19c3 fix iconv from libc (#43996)
* fix iconv from libc

* fix args in glib
2024-05-04 10:30:43 +02:00
Scott Wittenburg
aae7a22d39 gitlab: release branch pipelines rebuild what changed (#43990) 2024-05-04 10:11:48 +02:00
eugeneswalker
09cea230b4 e4s-alc: add v1.0.2 (#44001) 2024-05-03 22:24:08 -06:00
wspear
a1f34ec58b Add a gmake dependency for TAU (#43870)
We discovered a case where the system gmake can get confused by spack's build environment. Let's use a spack-provided gmake for consistent builds.
2024-05-03 20:15:40 -07:00
Auriane R
4d7cd4c0bf Add py-flash-attn@2.4.2 (#43960) 2024-05-03 16:40:46 -07:00
Christopher Christofi
4adbfa3835 fix package permissions docs link for gaussian packages (#43998) 2024-05-03 17:35:39 -06:00
Adam J. Stewart
8a1b69c1d3 Modernize py-torch-geometric and dependencies (#43984)
* Modernize py-torch-geometric and dependencies
* Forgot one maintainer
2024-05-03 16:21:41 -07:00
Matthew Thompson
a1d69f8661 hdf package: fix building with apple-clang@15: (#43964) 2024-05-03 16:11:55 -07:00
Carlos Bederián
e05dbc529e globalarrays package: add missing dependencies for armci=ofi/openib (#43971) 2024-05-03 16:08:29 -07:00
jdomke
99d33bf1f2 hpl: linking against fujitsu ssl2 if available (#43978)
* linking against fujitsu ssl2 if available

---------

Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:08:12 -07:00
jdomke
bd1918cd71 adding checks for clang compilers that behave exactly like aocc (#43976)
Co-authored-by: domke <673751-domke@users.noreply.gitlab.com>
2024-05-03 16:01:14 -07:00
snehring
2a967c7df4 graphicsmagick package: add version 1.3.43 (#43977) 2024-05-03 15:59:10 -07:00
downloadico
7596aac958 New package: py-pylith (#43987) 2024-05-03 15:56:21 -07:00
Frédéric Simonis
c73ded8ed6 preCICE: increase baseline versions for next ver (#43985) 2024-05-03 15:55:46 -07:00
Satish Balay
df1d783035 sowing: add @master (#43991) 2024-05-03 15:37:39 -07:00
Massimiliano Culpo
47051c3690 Add a default provider for armci (#43997) 2024-05-03 23:48:58 +02:00
Owen Solberg
3fd83c637d Add checksum for R v4.4.0 (#43973)
Co-authored-by: Owen Solberg <owen.solberg@sana.com>
2024-05-03 14:47:53 -07:00
Jon Rood
ef5afb66da openfast: updates to package (#43994) 2024-05-03 15:22:12 -06:00
snehring
ecc4336bf9 r-asreml: adding new package (#43970)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-03 12:20:58 -07:00
Harmen Stoppels
d2ed217796 concretizer args: --fresh-roots == --reuse-deps (#43988)
Since reuse is the default now, `--reuse-deps` can be confusing, as it
technically does not imply roots are fresh.

So add `--fresh-roots`, which is also easier to discover when running
`spack concretize --fre<tab>`
2024-05-03 12:12:36 -07:00
Dom Heinzeller
272c7c069a Add -fPIC to hdf-eos2 builds (#43794)
* Add -fPIC to hdf-eos2 builds
* Use self.compiler.cc_pic_flag instead of -fPIC
* Fix style in var/spack/repos/builtin/packages/hdf-eos2/package.py
2024-05-03 11:38:33 -07:00
Jeff Hammond
23f16041cd NWChem ARMCI-MPI variant and optional TCE builds (#43883)
* WIP NWChem with ARMCI-MPI and TCE optional
* rename armci-mpi to armcimpi
* add version for git
* add ARMCI-MPI support to NWChem package
   EXTERNAL_ARMCI_PATH needs to be set to the ARMCI-MPI package install prefix.
   I could not get it from spec["armcimpi"], but it worked to use the dependent build environment method to export a generic environment variable instead.
* i suppose i can maintain NWChem as well
* check rejects option that dozens of packages use
   i do not have time to fight with this nonsense
   Run . share/spack/setup-env.sh
   ==> Error: armcimpi version 'master' has extra arguments: 'branch'
   Valid arguments for a url fetcher are:
       'url', 'sha256', 'md5', 'sha1', 'sha224', 'sha384', 'sha512', and 'checksum'
   Error: Process completed with exit code 1.
* style
* ARMCI-MPI needs depends_on; the rest seems fixed
* fix ARMCI selection per review feedback from @zzzoom
   https://github.com/spack/spack/pull/43883#discussion_r1585014147
* address reviewer feedback from @yizeyi18
   https://github.com/spack/spack/pull/43883#discussion_r1587228084
   elaborate on what +extratce does, in terms of the NWChem build environment variables,
   some of which are documented on https://nwchemgit.github.io/TCE.html.
* style
* Update var/spack/repos/builtin/packages/nwchem/package.py

---------

Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
Co-authored-by: Carlos Bederián <4043375+zzzoom@users.noreply.github.com>
2024-05-03 11:13:54 -07:00
Frank Willmore
e2329adac0 Update package.py for vacuumms 1.2.0 (#43948)
* updating vacuumms for new release
* formatting
* re-order versions and remove triple quote
2024-05-03 11:06:15 -07:00
jalcaraz
4ec788ca12 [TAU package] Update with +rocprofv2. Updated some tests. (#43937)
* [TAU package] Update with +rocprofv2. Updated some tests.

Updated with +rocprofv2 flag. Only works with rocm-core >= 6.0.0

Can be tested with (Omnia):
spack install tau@master +rocm+rocprofv2 %gcc@11
Needs the last commit in our local repository, can be done by modifying the "git = " line, or waiting until the public one is updated.


In the case that tests cause issues when building TAU, there is the flag:
disable_tests = False

The rocm test is disabled by default, as the PR regarding tests loading dependencies is not solved (PR#43682).

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2024-05-03 08:33:13 -07:00
Andrew W Elble
c1cea9d304 py-tensorflow: fix aarch64 build (#43852)
* py-tensorflow: fix aarch64 build

* [@spackbot] updating style on behalf of aweits

* patch format

* change patch strategy, actually get the logic correct in
the patch

* only patch 2.16.1 onwards for aarch64

* __CUDACC__ not __NVCC__

* !(defined(__NVCC__) && defined(__CUDACC__))

---------

Co-authored-by: aweits <aweits@users.noreply.github.com>
2024-05-03 15:46:13 +02:00
Massimiliano Culpo
5c96e67bb1 Make runtimes depend on libc only on linux (#43981) 2024-05-03 15:40:02 +02:00
Marc T. Henry de Frahan
7008bb6335 Add rosco package (#43822)
* Add rosco package

* format

* remove unused

* Add in other versions

* start adding some patches

* finish adding the patches
2024-05-03 01:58:35 -06:00
Adam J. Stewart
14561fafff py-torch: set TORCH_CUDA_ARCH_LIST globally for dependents (#43962) 2024-05-03 09:11:59 +02:00
Massimiliano Culpo
89bf1edb6e Spec.satisfies: fix a bug with concrete spec from JSON (#43968)
Fix a bug triggered by a missing virtual on some transitive edge in a subdag of a pure build dependency.
2024-05-02 22:16:02 -04:00
Todd Gamblin
cc85dc05b7 docs: re-enable google analytics (#43974)
We recently switched to using the new ReadTheDocs with "addons". That includes its own
analytics, which is nice, but we also want to continue using our GA4 analytics.

Adding GA4 is no longer supported by RTD, so we have to add it manually.

- [x] re-add the gtag to all pages, manually

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-02 21:56:19 -04:00
Adam J. Stewart
ae171f8b83 py-torch: conflict with newer cuda (#43969) 2024-05-02 18:47:33 -07:00
snehring
578dd18b34 minimap2: adding version 2.28 (#43972)
* minimap2: adding version 2.28
* minimap2: adding maintainer

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-02 18:45:59 -07:00
Garth N. Wells
a7a51ee5cf py-fenics-ffcx: update to v0.8 (#43929)
* Update FFCx and UFCx to v0.8
* Syntax fix
2024-05-02 09:53:31 -07:00
Stephanie Labasan Brink
960cc90667 variorum: add branch dev and version 0.8.0 (#43829)
* variorum: add branch dev and version 0.8.0
* formatting
2024-05-02 09:50:05 -07:00
Alex Richert
dea44bad8b grads: fix libpng and g2c deps (#43909)
* grads: fix libpng and g2c deps
* Update package.py
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:42:15 -07:00
Richard Berger
e37870ff43 rdma-core: add versions 51.0, 50.0 and 49.1 (#43926) 2024-05-02 09:41:23 -07:00
Sajid Ali
3751642a27 mold: add new versions (#43931) 2024-05-02 09:39:09 -07:00
Nils Vu
0f386697c6 spectre: add versions, update constraints (#43936) 2024-05-02 09:38:06 -07:00
Alex Richert
67ce103b2c Update prod-util recipe (#43940)
* update prod-util recipe
* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-05-02 09:24:39 -07:00
Alex Richert
a8c9fa0e45 g2tmpl: add v1.12.0 (#43941) 2024-05-02 09:23:42 -07:00
Alex Richert
b56a133fce update grib-util recipe (#43939) 2024-05-02 09:22:03 -07:00
Seth R. Johnson
f0b3d33145 Celeritas: new version 0.4.3 (#43949) 2024-05-02 09:04:30 -07:00
Adam J. Stewart
32564da9d0 rust: add conflict for older GCC (#43958) 2024-05-02 08:36:06 -07:00
Stephen Hudson
8f2faf65dc libEnsemble: add v1.3.0 (#43944) 2024-05-02 08:35:06 -07:00
eugeneswalker
1d59637051 e4s ci: add py-amrex (#43904) 2024-05-02 08:57:26 -06:00
jdomke
97dc353cb0 libc: detect ARM flavor (#43959)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-05-02 15:59:29 +02:00
wspear
beebe2c9d3 Accept later pythons for tau@2.33+ (#43911)
* Accept later pythons for tau@2.33+

* separate forward compat bounds
2024-05-02 07:13:21 -06:00
Harmen Stoppels
2eb7add8c4 gcc: write builtin specs before modifications (#43956) 2024-05-02 06:40:10 -06:00
Harmen Stoppels
a9fea9f611 nghttp2: add diffutils (#43954) 2024-05-02 14:22:53 +02:00
Harmen Stoppels
9b62a9c238 gitlab ci: cache user cache (#43952) 2024-05-02 12:06:30 +02:00
wspear
f7eb0ccfc9 Revert #43701 (#43935)
PR #43701 is broken, preventing scorep from being installed. As explained in #43700 the issue is internal to scorep and can not be fixed in the package, at least by the method being attempted here.
2024-05-02 09:56:21 +02:00
Adrien Bernede
a0aa35667c Ignore external packages when pushing to buildcache automatically (--autopush) (#43930) 2024-05-02 09:50:10 +02:00
Luc Berger
b1d4fd14bc Nalu: adding support for Trilinos 14.2.0 for Nalu 1.6.0 (#43857) 2024-05-02 01:10:27 -06:00
John W. Parent
7e8415a3a6 Windows: auto-add WGL/SDK as externals (#43752)
Adds a pre-concretization check for the Windows SDK and WGL (Windows
GL) packages as non-buildable externals.

This is a redo of https://github.com/spack/spack/pull/43459, but makes
sure to modify the configuration scope outside of the bootstrap scope:
whichever is highest-precedence in the user's environment at the time
the concretization runs, which should either be an env scope or the
~ scope.

Adds pytest fixture mocking the check for WGL and WSDK as if they were
present.
2024-05-02 01:05:03 -06:00
Simon Pintarelli
7f4f42894d update sirius package.py (#43872) 2024-05-02 08:23:10 +02:00
Massimiliano Culpo
4e876b4014 Allow more control over which specs are reused (#42782)
This PR gives users finer control over which specs are reused during concretization.

The value of the `concretizer:reuse` config option now can take an object with the following properties:
- `roots`: true if reusing roots, false if reusing just dependencies
- `exclude`: list of constraints; specs matching any of them are not reused
- `include`: list of constraints; only specs matching at least one of them are reused
- `from`: allows selecting the sources of reused specs

### Examples

#### Reuse only specs compiled with GCC
```yaml
concretizer:
  reuse:
    roots: true
    include:
    - "%gcc"
```

#### `openmpi` must be used from externals, and it must be the only external used
```yaml
concretizer:
  reuse:
    roots: true
    from:
    - type: local
      exclude:
      - "openmpi"
    - type: buildcache
      exclude:
      - "openmpi"
    - type: external
      include:
      - "openmpi"
```
2024-05-01 23:05:26 -04:00
Wouter Deconinck
77a8a4fe08 abseil-cpp: new version 20240116.2 (#43666) 2024-05-01 17:06:57 -07:00
Thomas Bouvier
597e5a4e5e Update py-nvidia-dali to v1.36.0 (#43928)
* py-nvidia-dali: add v1.36.0
* fix: do not expand downloaded wheel
2024-05-01 16:35:43 -07:00
MatthewLieber
3c31c32f62 Adding sha for 7.4 release of OSU Micro Benchmarks (#43899)
Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2024-05-01 16:27:52 -07:00
Weiqun Zhang
3a93a716e4 amrex: add v24.05 (#43938) 2024-05-01 16:17:05 -07:00
Filippo Spiga
82229a0784 Adding EXTRAE 4.1.2 (#43907) 2024-05-01 15:55:54 -07:00
Axel Huebl
5d846a69d1 ADIOS2: Campaign Variant (#43906)
With v2.10+, ADIOS added a campaign manager. This is auto-enabled
if SQLite3 is found.

Add explicit control for it now and disable it by default, to avoid
picking up system dependencies or bloating the ADIOS2 dependencies
by default. Also, it is not yet mature enough to be used by default:
https://github.com/ornladios/ADIOS2/issues/4148
2024-05-01 15:50:45 -07:00
Stephen Herbener
d21aa1cc12 Removed use of mpi wrappers for fms and mapl package.py scripts. These were causing (#43726)
builds to fail on MacOS and CMake appears to handle the build fine without the mpi wrappers.
2024-05-01 15:41:30 -07:00
Jake Koester
7896ff51f6 bumped up version of kokkos-kernels (#43920) 2024-05-01 12:20:51 -06:00
Chris Mc
5849a24a74 jwt-cpp: add v0.7.0 (#41894)
* jwt-cpp: add version 0.7.0

* Update var/spack/repos/builtin/packages/jwt-cpp/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-01 12:20:35 -06:00
Alex Leute
38c49d6b82 py-simpy: New package (#43497)
* pypi build of py-simpy

* [py-simpy] toml explicitly called out

* py-simpy: New version

* py-setuptools-scm: Added new version

* py-setuptools-scm: add url_for_version

Because versions @:7 have an underscore (setuptools_scm) in the URL, and
newer versions have a dash (setuptools-scm).

* py-setuptools-scm: fix flake8 line too long

* py-simpy: Fix hash

---------

Co-authored-by: Sid Pendelberry <sid@rit.edu>
Co-authored-by: Jen Herting <jen@herting.cc>
2024-05-01 12:06:52 -06:00
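
As a hedged illustration of the `url_for_version` approach described in the commit above (the URL template follows the usual PyPI sdist layout but is an assumption, not copied from the real package):

```python
from spack.package import *


class PySetuptoolsScm(PythonPackage):
    """Sketch only: pick the sdist name by version, as described above."""

    def url_for_version(self, version):
        # @:7 releases use an underscore (setuptools_scm); later ones a dash.
        name = "setuptools_scm" if version < Version("8") else "setuptools-scm"
        return f"https://pypi.io/packages/source/s/setuptools-scm/{name}-{version}.tar.gz"
```
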
fpruvost
0d8900986d qrmumps: 3.1 (#43913) 2024-05-01 12:00:25 -06:00
dependabot[bot]
62554cebc4 build(deps): bump black in /.github/workflows/style (#43879)
Bumps [black](https://github.com/psf/black) from 24.4.0 to 24.4.2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.0...24.4.2)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:37:54 +02:00
Wouter Deconinck
067155cff5 containers: add ubuntu 24.04 (#43881)
* containers: add ubuntu 24.04

* containers: use python3-boto3 pkg instead of pip install
2024-05-01 13:37:13 +02:00
Wouter Deconinck
08e68d779f ci: update upload-artifact to v4 (in build-containers) (#43880) 2024-05-01 13:35:12 +02:00
Adam J. Stewart
05b04cd4c3 Update PyTorch ecosystem (#43823)
* Update PyTorch ecosystem

* Don't expose protobuf namespace to other packages, breaks torchtext build

* Revert "Don't expose protobuf namespace to other packages, breaks torchtext build"

This reverts commit 50a4b08f65.
2024-05-01 13:30:46 +02:00
dependabot[bot]
be48f762a9 build(deps): bump pytest from 8.1.1 to 8.2.0 in /lib/spack/docs (#43908)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.1.1 to 8.2.0.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.1.1...8.2.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-01 13:29:22 +02:00
Axel Huebl
de5b4840e9 Add quantiphy (#43894)
* Add quantiphy

Add https://github.com/KenKundert/quantiphy .

* Simplify Version Range
2024-04-30 15:26:35 -07:00
Harmen Stoppels
20f9884445 curl: perl is temporarily required (#43921) 2024-04-30 23:32:06 +02:00
Harmen Stoppels
deb78bcd93 re2c: depends on gmake (#43916)
regressed in 650a668a9d
2024-04-30 23:30:36 +02:00
Garth N. Wells
06239de0e9 py-fenics-ufl: add new version (#43848)
* Bump version to 2024.1.0
* Bump to .post0
* Dependency fix
* Format file
* Run black on package file
2024-04-30 14:10:55 -07:00
Garth N. Wells
1f904c38b3 Add version 1.9.2. (#43838) 2024-04-30 12:19:31 -07:00
Wouter Deconinck
f2d0ba8fcc PackageStillNeededError: add pkg that needs spec to exception msg (#43845)
* PackageStillNeededError: add pkg that needs spec to exception msg
* PackageStillNeededError: f-string with short fmt and hash
* PackageStillNeededError: split long string
2024-04-30 12:11:47 -07:00
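
A hypothetical sketch of the improved message shape described above (the real exception lives in Spack's database code; the constructor arguments here are assumptions, though `{name}{/hash:7}` is real `Spec.format` syntax):

```python
class PackageStillNeededError(Exception):
    """Sketch only: name the specs that still need the one being removed."""

    def __init__(self, spec, dependents):
        # "{name}{/hash:7}" formats a package name with a short hash.
        needed_by = ", ".join(s.format("{name}{/hash:7}") for s in dependents)
        super().__init__(
            f"cannot uninstall {spec.format('{name}{/hash:7}')}: "
            f"still needed by {needed_by}"
        )
```
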
Satish Balay
49d3eb1723 petsc, py-petsc4py: add v3.21.1 (#43887) 2024-04-30 10:07:33 -07:00
Dom Heinzeller
7c5439f48a Update crtm-fix and crtm from JCSDA fork (#43855) 2024-04-30 10:07:16 -07:00
Harmen Stoppels
7f2cedd31f hack: drop glibc and musl in old concretizer (#43914)
The old concretizer creates a cyclic graph when expanding virtuals for
`iconv`, which is a bug. This hack drops glibc and musl as possible
providers for `iconv` in the old concretizer to work around it.
2024-04-30 13:55:48 +02:00
Harmen Stoppels
d47951a1e3 glibc: provides iconv (#43897)
`iconv` is a bit of a weird virtual because the only shared API between
`glibc` and `libiconv` is:

```
iconv
iconv_open
iconv_close
```

whereas `libiconv` has further symbols [iconvctl](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconvctl.3.html), [iconv_open_into](https://www.gnu.org/software/libiconv/documentation/libiconv-1.17/iconv_open_into.3.html), and an `iconv` executable and `libcharset.so`. Packages that need those will have to do `depends_on("[virtuals=iconv] libiconv")`.
2024-04-30 07:40:00 +02:00
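
A minimal, hypothetical package.py fragment showing the `depends_on` form quoted above (package name, URL, and checksum are invented):

```python
from spack.package import *


class NeedsFullIconv(AutotoolsPackage):
    """Sketch only: a package that uses iconvctl / iconv_open_into."""

    homepage = "https://example.com"
    url = "https://example.com/needs-full-iconv-1.0.tar.gz"

    version("1.0", sha256="0" * 64)  # placeholder checksum

    # glibc only provides iconv/iconv_open/iconv_close, so pin the provider:
    depends_on("[virtuals=iconv] libiconv")
```
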
Teague Sterling
f2bd0c5cf1 Fix duckdb version handling again (#43762)
* Added package to build Ollama
* Update package.py
   Add license and documentation
* [@spackbot] updating style on behalf of teaguesterling
* We can now use OVERRIDE_GIT_DESCRIBE to set the version in DuckDB
* Update duckdb/package.py
   - use f-string
   - fix version specs to address inconsistencies
* Update package.py
   Fix spec definitions further
* [@spackbot] updating style on behalf of teaguesterling

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-04-29 18:51:04 -07:00
one
4362382223 Update package.py (#43764) 2024-04-29 18:49:49 -07:00
Nathalie Furmento
ba4859b33d starpu: add release 1.4.5 and dependency for building starpupy (#43810) 2024-04-29 18:48:37 -07:00
Carlos Bederián
e8472714ef ucc: add 1.3.0 (#43828) 2024-04-29 18:45:00 -07:00
Mikael Simberg
ee6960e53e boost: Add 1.85.0 (#43788)
* boost: Add 1.85.0
* Add conflict for Boost 1.85.0 stacktrace change
2024-04-29 18:41:38 -07:00
Nathalie Furmento
dad266c955 starpu: add branch 1.4 (#43837) 2024-04-29 18:38:43 -07:00
Garth N. Wells
7a234ce00a (py-)fenics-basix: update for new version (#43841)
* Bump version to 2024.1.0
* Bump py-basix version
* Revert UFL change
* Python version update
2024-04-29 18:11:36 -07:00
Loris Ercole
a0c2ed97c8 Update scine-qcmaquis (#43817)
`scine-qcmaquis` is updated to version 3.1.4.
The option to build the OpenMolcas interface is added, and some
dependencies are clarified.
2024-04-29 16:38:29 -07:00
Rémi Lacroix
a3aa5b59cd dotnet-core-sdk: Fix environment setup (#43842)
The "DOTNET_CLI_TELEMETRY_OPTOUT" environment variable should be defined when using the product, not when installing it (the installation phase is just extract the files anyway).

Also use `~` instead of `-` to check for the variant, and fix the second argument for `env.set`, which should also be a string.
2024-04-29 14:47:37 -07:00
Zach Jibben
f7dbb59d13 Update Petaca (#43849)
* Add recent petaca releases
* Add Petaca 24.04
2024-04-29 14:42:11 -07:00
Wouter Deconinck
0df27bc0f7 nopayloadclient: new package (#43853) 2024-04-29 14:16:57 -07:00
Massimiliano Culpo
877e09dcc1 Delete leftover file (#43869)
This file is not needed anymore, was introduced in #19688
2024-04-29 22:45:41 +02:00
Adam J. Stewart
c4439e86a2 Prettier: add new package (#43866) 2024-04-29 13:39:52 -07:00
Pramod Kumbhar
aa00dcac96 gprofng-gui: new package (#43873) 2024-04-29 13:38:48 -07:00
Adam J. Stewart
4c9a946b3b py-keras: add v3.3.3 (#43885) 2024-04-29 13:30:52 -07:00
Wouter Deconinck
0c6e6ad226 xbae: new package (#43889) 2024-04-29 13:05:56 -07:00
Adam J. Stewart
bf8f32443f py-einops: add v0.8.0 (#43890) 2024-04-29 12:54:33 -07:00
Wouter Deconinck
c2eef8bab2 fontconfig: depends_on gperf when 2.11.1: (#43892) 2024-04-29 11:56:59 -07:00
afzpatel
2df4b307d7 hipblaslt: new package (#43846)
* initial commit to add hipblaslt package
* remove master and update patch
* add docstring comment
2024-04-29 11:48:03 -07:00
Ryan Marcellino
3c57440c10 openjdk: add v21_35 (#40699)
* openjdk: add v21+35

* add provides java@21

* Update var/spack/repos/builtin/packages/openjdk/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-04-29 10:51:17 -05:00
Harmen Stoppels
3e6e9829da compiler.py: fix early return (#43898) 2024-04-29 08:53:27 -06:00
Gregory Becker
859745f1a9 Run audits on windows
Add debug log for external detection tests. The debug log
is used to print which test is being executed.

Skip version audit on Windows where appropriate
2024-04-29 14:13:10 +02:00
Massimiliano Culpo
ddabb8b12c Fix concretization when installing missing compilers (#43876)
Restore the previous behavior when config:install_missing_compilers
is True. The libc of the missing compiler is inferred from the
Python process.
2024-04-29 08:20:33 +02:00
Jeff Hammond
16bba32124 add ILP64 option for BLIS (#43882)
Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
2024-04-29 08:14:25 +02:00
Michael Kuhn
7d87369ead autoconf: fix typo in m4 dependencies (#43893)
m4 1.4.8 is actually required starting with autoconf 2.72 according to
the NEWS file.
2024-04-28 18:34:12 -05:00
Adam J. Stewart
7723bd28ed Revert "package/npm update (#43692)" (#43884)
This reverts commit 03a074ebe7.
2024-04-27 08:58:12 -06:00
Harmen Stoppels
43f3a35150 gcc: generate spec file and fix external libc default paths after install from cache (#43839)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-27 08:49:20 -06:00
Jonathon Anderson
ae9f2d4d40 containers: Add Fedora 40, 39 (#43847) 2024-04-26 20:02:04 -05:00
Huston Rogers
5a3814ff15 Miniconda3: Added Versions: 24.3.0, 24.1.2, 23.11.0, 23.9.0, 23.5.2, 23.5.1, 23.5.0, 23.3.1, 23.1.0 (#43868)
* Added Versions of miniconda3: 24.3.0, 24.1.2, 23.11.0, 23.9.0, 23.5.2, 23.5.1, 23.5.0, 23.3.1, 23.1.0

* fixed style

---------

Co-authored-by: James H. Rogers <jhrogers@spear.hpc.msstate.edu>
2024-04-26 18:58:58 -06:00
Sakib Rahman
946c539dbd osg-ca-certs: new version osg 1.119 and igtf 1.128 (#43871)
* Update package.py to osg 1.119 and igtf 1.128

* Remove unnecessary white space

* Add missing comma

* change url to use git and use the commit hash from the repository instead of the sha256sum of the tar.gz

* [@spackbot] updating style on behalf of rahmans1

---------

Co-authored-by: rahmans1 <rahmans1@users.noreply.github.com>
2024-04-26 18:39:12 -06:00
Robert Cohn
0037462f9e [SOS] update license (#43864)
Auto-generated license was wrong
2024-04-26 18:45:29 +02:00
Massimiliano Culpo
e5edac4d0c ASP-based solver: update os compatibility for macOS (#43862) 2024-04-26 18:34:46 +02:00
Jeff Hammond
3e1474dbbb ARMCI-MPI: add new package (#43813)
Signed-off-by: Jeff Hammond <jeff.science@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-26 16:35:33 +02:00
Abhishek Yenpure
0f502bb6c3 libcatalyst: add v2.0.0 (#43804) 2024-04-26 16:27:46 +02:00
Robert Cohn
1eecbd3208 [intel-oneapi-*] no redistribution (#43826) 2024-04-26 12:03:39 +00:00
Jack Morrison
6e92b9180c Add tests-sos package + Variants to sos package (#43830)
* * Add initial tests-sos package
* Remove failing call of missing setup_compiler_environment from sos
  package
* Add several variants for sos package

* [@spackbot] updating style on behalf of jack-morrison
2024-04-26 07:26:21 -04:00
Massimiliano Culpo
ac9012da0c spack audit externals: allow selecting platforms and checking extra attributes (#43782) 2024-04-26 12:47:17 +02:00
Mosè Giordano
e3cb4f09f0 julia: add v1.10.2 (#41151)
* julia: add v1.10.2

* julia: add patch to remove suite-sparse cuda stub files

* julia: use permalinks for patches
2024-04-26 12:44:54 +02:00
Harmen Stoppels
2e8600bb71 gcc: simplify specs file, make binutils locatable (#43861) 2024-04-26 01:30:51 -06:00
Harmen Stoppels
d946c37cbb ldflags=* are compiler flags, not linker flags (#43820)
We run `extend spack_flags_list SPACK_LDFLAGS` for `$mode in ld|ccld`.

That's problematic, cause `ccld` needs `-Wl,--flag` whereas `ld` needs
`--flag` directly. Only `-L` and `-l` are common to compiler & linker.

In all build systems `LDFLAGS` is for the compiler not the linker, cause
any linker flag `-x` can be passed as a compiler flag `-Wl,-x`, and there
are many compiler flags that affect the linker invocation, like `-fopenmp`,
`-fuse-ld=`, `-fsanitize=` etc.

So don't pass `LDFLAGS` to the linker directly.

This way users can set `ldflags: -Wl,--allow-shlib-undefined` in compilers.yaml
to work around an issue where the linker tries to resolve the `libcuda.so.1`
stub lib which cannot be located by design in `cuda`.
2024-04-26 09:19:03 +02:00
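
A small sketch of the convention the commit explains, assuming only that `-L`/`-l` are shared between compiler driver and linker (this is not Spack's wrapper code, which is shell):

```python
def ldflag_for_ccld(flag: str) -> str:
    """Rewrite a raw linker flag for use on a compiler-driver command line."""
    if flag.startswith(("-L", "-l")):
        return flag  # understood by both the compiler driver and ld
    return f"-Wl,{flag}"  # any other linker flag must be wrapped


print(ldflag_for_ccld("--allow-shlib-undefined"))  # -> -Wl,--allow-shlib-undefined
print(ldflag_for_ccld("-L/opt/cuda/lib64/stubs"))  # unchanged
```
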
Harmen Stoppels
47a9f0bdf7 llvm: use --gcc_install_dir in config files (#43795) 2024-04-26 09:01:02 +02:00
Alex Richert
2bf900a893 Update package.py (#43836) 2024-04-25 16:00:43 -07:00
Andrey Perestoronin
99bba0b1ce Add intel-oneapi-mpi 2021.12.1 patch package (#43850)
* Add intel-oneapi-mpi package

* Fix style
2024-04-25 13:38:47 -06:00
Juan Miguel Carceller
a8506f9022 glew package: add ld flags when compiling with ^apple-gl (#43429) 2024-04-25 11:18:00 -07:00
Harmen Stoppels
4a40a76291 build_environment.py: expand SPACK_MANAGED_DIRS with realpath (#43844) 2024-04-25 13:33:50 +02:00
downloadico
fe9ddf22fc spatialdata: add spatialdata package to spack (#43500) 2024-04-25 04:18:08 -07:00
Adam J. Stewart
1cae1299eb CI: remove ML ROCm stack (#43825) 2024-04-25 12:56:59 +02:00
Adam J. Stewart
8b106416c0 py-lightning: add v2.2.3 (#43824) 2024-04-25 12:03:01 +02:00
jalcaraz
e2088b599e [TAU Package] Updates for rocm (#43790)
* Updates for rocm

Updated for rocm@6
Added conflict between rocprofiler and roctracer.
Request either +rocprofiler or +roctracer when +rocm. In this case, it automatically builds for one, instead of displaying the message. 
Request +rocm when using either +rocprofiler or +roctracer. In this case, it automatically builds with +rocm, instead of displaying the message.

Disabled the tests. Will update them with the new test method.

* [@spackbot] updating style on behalf of jordialcaraz

---------

Co-authored-by: jordialcaraz <jordialcaraz@users.noreply.github.com>
2024-04-24 16:35:41 -07:00
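
The variant relationships described above might look roughly like this package.py fragment (directive arguments are assumptions; the real tau recipe may differ):

```python
# Fragment of a hypothetical package.py class body:
variant("rocprofiler", default=False, description="Profile with rocprofiler")
variant("roctracer", default=False, description="Trace with roctracer")

# +rocm implies exactly one backend, and either backend implies +rocm, so
# the concretizer picks a consistent combination instead of erroring out.
requires("+rocprofiler", "+roctracer", policy="one_of", when="+rocm")
requires("+rocm", when="+rocprofiler")
requires("+rocm", when="+roctracer")
```
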
Ken Raffenetti
56446685ca mpich: add 4.2.0 release (#42687) 2024-04-24 12:33:59 -07:00
Teo
47a8d875c8 update halide versions; sync llvm (#43793) 2024-04-24 12:27:42 -07:00
Adam J. Stewart
56b2d250c1 py-keras: add v3.3.1-2 (#43798) 2024-04-24 12:23:34 -07:00
David Boehme
abbd09b4b2 Add Caliper v2.11 (#43802) 2024-04-24 12:21:56 -07:00
Rémi Lacroix
9e5fdc6614 dotnet-core-sdk: Add versions 7.0.18 and 8.0.4 (#43814) 2024-04-24 12:19:33 -07:00
Harmen Stoppels
1224a3e8cf clang.py: detect flang-new (#43815)
If a flang-new exists, which is rather unlikely, it probably means the
user wants it as a fortran compiler.
2024-04-24 19:11:02 +02:00
Jake Koester
6c3218920f Trilinos: update kokkos dependency (#43785)
* fix so trilinos@master uses correct kokkos (@4.3.00)

* Update var/spack/repos/builtin/packages/trilinos/package.py
2024-04-24 10:42:08 -06:00
Peter Scheibel
02cc3ea005 Add new redistribute() directive (#20185)
Some packages can't be redistributed in source or binary form. We need an explicit way to say that in a package.

This adds a `redistribute()` directive so that package authors can write, e.g.:

```python
    redistribute(source=False, binary=False)
```

You can also do this conditionally with `when=`, as with other directives, e.g.:

```python
    # 12.0 and higher are proprietary
    redistribute(source=False, binary=False, when="@12.0:")

    # can't redistribute when we depend on some proprietary dependency
    redistribute(source=False, binary=False, when="^proprietary-dependency")
```


This prevents Spack from adding their sources or binaries to public mirrors and build caches. You can still add things unconditionally *if* you run either:
* `spack mirror create --private`
* `spack buildcache push --private`

But the default behavior for build caches is not to include non-redistributable packages in either mirrors or build caches.  We have previously done this manually for our public buildcache, but with this we can start maintaining redistributability directly in packages.

Caveats: currently the default for `redistribute()` is `True` for both `source` and `binary`, and you can only set either of them to `False` via this directive.

- [x] add `redistribute()` directive
- [x] add `redistribute_source` and `redistribute_binary` class methods to `PackageBase`
- [x] add `--private` option to `spack mirror`
- [x] add `--private` option to `spack buildcache push`
- [x] test exclusion of packages from source mirror (both as a root and as a dependency)
- [x] test exclusion of packages from binary mirror (both as a root and as a dependency)
2024-04-24 09:41:03 -07:00
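
Putting the directive to use, a hypothetical package.py might read (name, URL, versions, and checksums invented):

```python
from spack.package import *


class Proprietary(Package):
    """Sketch only: a package whose newer releases cannot be redistributed."""

    homepage = "https://example.com"
    url = "https://example.com/proprietary-11.0.tar.gz"

    version("12.0", sha256="0" * 64)  # placeholder checksum
    version("11.0", sha256="1" * 64)  # placeholder checksum

    # Keep 12.0+ sources and binaries out of public mirrors and build
    # caches, unless --private is passed as described above.
    redistribute(source=False, binary=False, when="@12.0:")
```
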
John W. Parent
641ab95a31 Revert "Windows: add win-sdk/wgl externals during bootstrapping (#43459)" (#43819)
This reverts commit 9e2558bd56.
2024-04-24 18:24:28 +02:00
Adam J. Stewart
e8b76c27e4 py-matplotlib: drop test dep (#43765)
test dependencies constrain build / link type deps, so avoid that
2024-04-24 18:03:13 +02:00
Massimiliano Culpo
0dbe4d54b6 Tune default compiler preferences (#43805)
Add `%oneapi`, remove compilers that have been discontinued upstream
2024-04-24 18:00:40 +02:00
Harmen Stoppels
1eb6977049 rust: use system libs (#43797) 2024-04-24 09:46:11 -06:00
Harmen Stoppels
3f1cfdb7d7 libc: from current python process (#43787)
If there's no compiler we currently don't have any external libc for the solver.

This commit adds a fallback on libc from the current Python process, which works if it is dynamically linked.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-24 05:10:48 -06:00
Massimiliano Culpo
d438d7993d nf-*-cli: fix typo in conditional (#43806)
The default runner changed on GitHub for macOS, and that
revealed a bug in a package when running audits
2024-04-24 11:30:35 +02:00
Todd Gamblin
aa0825d642 Refactor to improve spec format speed (#43712)
When looking at where we spend our time in solver setup, I noticed a fair bit of time is spent
in `Spec.format()`, and `Spec.format()` is a pretty old, slow, convoluted method.

This PR does a number of things:
- [x] Consolidate most of what was being done manually with a character loop and several
      regexes into a single regex.
- [x] Precompile regexes where we keep them 
- [x] Remove the `transform=` argument to `Spec.format()` which was only used in one 
      place in the code (modules) to uppercase env var names, but added a lot of complexity
- [x] Avoid escaping and colorizing specs unless necessary
- [x] Refactor a lot of the colorization logic to avoid unnecessary object construction
- [x] Add type hints and remove some spots in the code where we were using nonexistent
      arguments to `format()`.
- [x] Add trivial cases to `__str__` in `VariantMap` and `VersionList` to avoid sorting
- [x] Avoid calling `isinstance()` in the main loop of `Spec.format()`
- [x] Don't bother constructing a `string` representation for the result of `_prev_version`
      as it is only used for comparisons.

In my timings (on all the specs formatted in a solve of `hdf5`), this is over 2.67x faster than the 
original `format()`, and it seems to reduce setup time by around a second (for `hdf5`).
2024-04-23 10:52:15 -07:00
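
A toy illustration of the consolidation pattern the PR describes: one precompiled regex with a substitution callback replacing a character loop plus several ad-hoc regexes (the token syntax here is invented, not Spack's real format grammar):

```python
import re

_TOKEN = re.compile(r"\{(\w+)\}")  # precompiled once, as the PR recommends


def format_spec(template: str, attrs: dict) -> str:
    # Single pass over the template; each {name} token resolves via one callback.
    return _TOKEN.sub(lambda m: str(attrs.get(m.group(1), m.group(0))), template)


print(format_spec("{name}@{version}", {"name": "hdf5", "version": "1.14.3"}))
# -> hdf5@1.14.3
```
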
Greg Becker
978c20f35a concretizer: update reuse: default to True (#41302) 2024-04-23 17:42:14 +02:00
Marc T. Henry de Frahan
d535124500 Update amr-wind package with versions (#43728) 2024-04-23 05:07:42 -06:00
Harmen Stoppels
01f61a2eba Remove import distro from packages and docs (#43772) 2024-04-23 12:47:33 +02:00
Massimiliano Culpo
7d5e27d5e8 Do not detect a compiler without a C compiler (#43778) 2024-04-23 12:20:33 +02:00
Harmen Stoppels
d210425eef nettle: remove openssl dep (#43770) 2024-04-23 07:44:15 +02:00
Nathalie Furmento
6be07da201 starpu: fix release 1.4.4 (#43730) 2024-04-22 15:56:23 -07:00
Filippo Spiga
02b38716bf Adding UCX 1.16.0 (#43743)
* Adding UCX 1.16.0
* Fixed hash
2024-04-22 15:52:20 -07:00
Adam J. Stewart
d7bc624c61 Tags: add more build tools (#43766)
* Tags: add more build tools
* py-pythran: add maintainer
2024-04-22 15:34:38 -07:00
Adam J. Stewart
b7cecc9726 py-scikit-image: add v0.23 (#43767) 2024-04-22 15:32:55 -07:00
Adam J. Stewart
393a2f562b py-keras: add v3.3.0 (#43783) 2024-04-22 15:04:06 -07:00
Massimiliano Culpo
682fcec0b2 zig: add v0.12.0 (#43774) 2024-04-22 15:02:45 -07:00
Harmen Stoppels
d6baae525f repo.py: drop deleted packages from provider cache (#43779)
The reverse provider lookup may have stale entries for deleted packages, which used to cause errors. It's hard to invalidate those cache entries, so this commit simply drops entries w/o invalidating the cache.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-04-22 19:03:44 +02:00
Kyle Knoepfel
e1f2612581 Adjust severity of irreversible operations (#43721) 2024-04-22 16:41:53 +02:00
Harmen Stoppels
080fc875eb compiler.py: reduce verbosity of implicit link dirs parsing (#43777) 2024-04-22 16:07:14 +02:00
Harmen Stoppels
69f417b26a view: dont warn about externals (#43771)
since it's the status quo on linux now that libc is an external by default
2024-04-22 16:05:32 +02:00
Harmen Stoppels
80b5106611 bootstrap: no need to add dummy compilers (#43775) 2024-04-22 16:01:41 +02:00
Massimiliano Culpo
34146c197a Add libc dependency to compiled packages and runtime deps
This commit differentiate linux from other platforms by
using libc compatibility as a criterion for deciding
which buildcaches / binaries can be reused. Other
platforms still use OS compatibility.

On linux a libc is injected by all compilers as an implicit
external, and the compatibility criterion is that a libc is
compatible with all other libcs with the same name and a
version that is lesser or equal.

Some concretization unit tests use libc when run on linux.
2024-04-22 15:18:06 +02:00
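
The compatibility criterion stated above reduces to a comparison like this minimal sketch (not Spack's code):

```python
def libc_compatible(host, binary):
    """host/binary are (name, version-tuple) pairs, e.g. ("glibc", (2, 35))."""
    (host_name, host_ver), (bin_name, bin_ver) = host, binary
    # Same libc name, and the binary's libc must not be newer than the host's.
    return host_name == bin_name and bin_ver <= host_ver


assert libc_compatible(("glibc", (2, 35)), ("glibc", (2, 31)))      # reusable
assert not libc_compatible(("glibc", (2, 31)), ("glibc", (2, 35)))  # too new
assert not libc_compatible(("musl", (1, 2)), ("glibc", (2, 31)))    # other libc
```
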
Harmen Stoppels
209a3bf302 Compiler.default_libc
Some logic to detect what libc the c / cxx compilers use by default,
based on `-dynamic-linker`.

The function `compiler.default_libc()` returns a `Spec` of the form
`glibc@x.y` or `musl@x.y` with the `external_path` property set.

The idea is this can be injected as a dependency.

If we can't run the dynamic linker directly, fall back to `ldd` relative
to the prefix computed from `ld.so.`
2024-04-22 15:18:06 +02:00
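
A rough sketch of the detection idea, assuming a glibc-style `--version` banner (the regex and default path are assumptions; Spack's real `Compiler.default_libc` is more involved and returns a `Spec`):

```python
import re
import subprocess


def guess_libc(dynamic_linker="/lib64/ld-linux-x86-64.so.2"):
    """Sketch only: run the dynamic linker and parse its version banner."""
    try:
        out = subprocess.run(
            [dynamic_linker, "--version"], capture_output=True, text=True
        ).stdout
    except OSError:
        return None  # the real code falls back to ldd relative to ld.so
    match = re.search(r"version ([0-9]+\.[0-9]+)", out)
    return f"glibc@{match.group(1)}" if match else None
```
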
Harmen Stoppels
e8c41cdbcb database.py: stream of json objects forward compat (#43598)
In the future we may transform the database from a single JSON object to
a stream of JSON objects.

This paves the way for constant time writes and constant time rereads
when only O(1) changes are made. Currently both are linear time.

This commit gives just enough forward compat for Spack to produce a
friendly error when we would move to a stream of json objects, and a db
would look like this:

```json
{"database": {"version": "<something newer>"}}
```
2024-04-22 09:43:41 +02:00
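
The friendly-error behavior might look like this hedged sketch (the version values and message text are invented):

```python
import json

SUPPORTED_DB_VERSIONS = {"5", "6", "7"}  # hypothetical known schema versions


def check_db_version(text: str) -> dict:
    """Sketch only: refuse to read a database newer than we understand."""
    data = json.loads(text)
    version = data.get("database", {}).get("version")
    if version not in SUPPORTED_DB_VERSIONS:
        raise RuntimeError(
            f"spack database version {version!r} is too new; "
            "upgrade Spack to read this database"
        )
    return data


check_db_version('{"database": {"version": "7"}}')     # ok
# check_db_version('{"database": {"version": "99"}}')  # raises RuntimeError
```
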
Massimiliano Culpo
a450dd31fa Fix a bug preventing to set platform= on externals (#43758)
closes #43406
2024-04-22 09:15:22 +02:00
Alex Richert
7c1a309453 perl: remove mkdirp from setup_dependent_package (#43733) 2024-04-20 21:34:07 +02:00
Massimiliano Culpo
78b6fa96e5 ci.py: visit all edges (#43761) 2024-04-20 21:29:32 +02:00
Jordan Ogas
1b315a9ede mpich: add v4.2.1 (#43753) 2024-04-20 19:45:25 +02:00
Harmen Stoppels
82df0e549d compiler wrapper: prioritize spack store paths in -L, -I, -rpath (#43593)
* compiler wrapper: prioritize spack managed paths in search order

This commit partitions search paths of -L, -I (and -rpath) into three
groups, from highest priority to lowest:

1. Spack managed directories: these include absolute paths such as
   stores and the stage dir, as well as all relative paths since they
   are relative to a Spack owned dir
2. Non-system dirs: these are for externals that live in non-system
   locations
3. System dirs: your typical `/usr/lib` etc.

It's very easy for Spack to know the prefixes it owns; it's much more
difficult to tell system dirs from non-system dirs. Before this commit
Spack tried to distinguish only system and non-system dirs, and failed
for very trivial cases like `/usr/lib/x/..` which comes up often, since
build systems sometimes copy search paths from `gcc -print-search-dirs`.

Potentially this implementation is even faster than the current state of
things, since a loop over paths is replaced with an eval'ed `case ...`.

* Trigger a pipeline

* Revert "Trigger a pipeline"

This reverts commit 5d7fa863de.

* remove redundant return statement
2024-04-20 13:23:37 -04:00
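
A Python model of the three-way partition described above (the real logic lives in the shell compiler wrapper; the prefixes below are invented):

```python
import os.path

SPACK_MANAGED = ("/opt/spack/store", "/opt/spack/stage")  # hypothetical
SYSTEM_DIRS = {"/lib", "/lib64", "/usr/lib", "/usr/lib64", "/usr/local/lib"}


def priority(path):
    """1 = Spack-managed, 2 = non-system, 3 = system."""
    norm = os.path.normpath(path)
    # Relative paths are relative to a Spack-owned directory.
    if not os.path.isabs(norm) or norm.startswith(SPACK_MANAGED):
        return 1
    return 3 if norm in SYSTEM_DIRS else 2


# The trivial case the old heuristic got wrong: /usr/lib/x/.. is a system dir.
assert priority("/usr/lib/x/..") == 3
assert priority("/opt/spack/store/gcc-13/lib") == 1
assert priority("/opt/custom/openssl/lib") == 2
```
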
Harmen Stoppels
f5591f9068 ci.py: simplify, and dont warn excessively about externals (#43759) 2024-04-20 15:09:54 +02:00
FrederickDeny
98c08d277d e4s-alc: add new package (#43750)
* Added e4s-cl@1.0.3

* add e4s-alc package

* removed trailing whitespace
2024-04-19 15:54:23 -06:00
Jordan Galby
facca4e2c8 ccache: 4.9.1 and 4.8.3 (#43748) 2024-04-19 23:46:38 +02:00
Adam J. Stewart
764029bcd1 py-ruff: add v0.4.0 (#43740) 2024-04-19 15:44:19 -06:00
Harmen Stoppels
44cb4eca93 environment.py: fix excessive re-reads (#43746) 2024-04-19 13:39:34 -06:00
Jack Morrison
39888d4df6 libfabric: Add version 1.21.0 (#43735) 2024-04-19 11:09:59 -07:00
Vincent Michaud-Rioux
f68ea49e54 Update py-pennylane + Lightning plugins + few deps (#43706)
* Update PennyLane packages to v0.32.
* Reformat.
* Couple small fixes.
* Fix Lightning cmake_args.
* Couple dep fixes in py-pennylane + plugins.
* Fix scipy condition.
* Add comment on conflicting requirement.
* Update py-pl versions
* Update lightning versions.
* Fix copyright.
* Fix license.
* Update pl-kokkos versions
* run black
* Fix L-Kokkos build and update autoray.
* build step only required for older versions. update autograd
* Fix LK@0.31 kokkos compat issue. Introduce url_for_version.
* Fix few more version bounds.
2024-04-19 11:05:09 -07:00
Olivier Cessenat
78b5e4cdfa perl-fth: new version 0.529 (#43727) 2024-04-19 10:58:34 -07:00
Micael Oliveira
26515b8871 fms: add two variants (#43734)
* fms: add two variants supporting existing build options.
* Style fixes.
2024-04-19 10:51:58 -07:00
Harmen Stoppels
74640987c7 ruamel yaml: fix quadratic complexity bug (#43745) 2024-04-19 14:33:42 +02:00
Harmen Stoppels
d6154645c7 chai / raja / umpire: compile entire project with hipcc again (#43738) 2024-04-19 11:18:12 +02:00
James Smillie
faed43704b py-numpy package: enable build on Windows (#43686)
* Add conflicts for some blas implementations that don't build on
  Windows (or with %msvc)
* Need to enclose CC/CXX variables in quotes in case those paths
  have spaces, otherwise Meson runs into errors
* On Windows, Python dependencies now add <prefix>/Scripts to the
  PATH (this is established as a standard in PEP 370)
2024-04-18 16:38:08 -06:00
John W. Parent
6fba31ce34 Windows: Update MSVC + oneAPI detection and integration (#43646)
* Later versions of oneAPI have moved, so update detection to find it
  in both old and new location
* Remove reliance on ONEAPI_ROOT env variable when determining Fortran
  compiler version for %msvc
* When finding a Fortran compiler for MSVC, there was logic enforcing
  a maximum MSVC version for a given oneAPI Fortran version. This
  mapping was out of date and excluding valid combinations, so has
  been removed (the logic now just picks the latest available
  oneAPI Fortran compiler for any given MSVC version).
2024-04-18 21:53:56 +00:00
Olivier Cessenat
112cead00b keepassxc: new version 2.7.7 (#43729) 2024-04-18 14:54:08 -06:00
John W. Parent
9e2558bd56 Windows: add win-sdk/wgl externals during bootstrapping (#43459)
On Windows, bootstrapping logic now searches for and adds the win-sdk
and wgl packages to the user's top scope as externals if they are not
present.

These packages are generally required to install most packages with
Spack on Windows, and are only available as externals, so it is
assumed that doing this automatically would be useful and avoid
a mandatory manual step for each new Spack instance.

Note this is the first case of bootstrapping logic modifying
configuration other than the bootstrap configuration.
2024-04-18 10:20:04 -07:00
eugeneswalker
019058226f py-netcdf4 %oneapi: cflags append -Wno-error=int-conversion (#43629) 2024-04-18 10:11:24 -07:00
George Young
ac0040f67d spaceranger: new manual download package @2.1.1 (#42391)
* spaceranger: new manual download package @2.1.1
* Adding license url

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-04-18 10:11:01 -07:00
wspear
38f341f12d Added tau@2.33.2 (#43682) 2024-04-18 10:10:18 -07:00
George Young
26ad22743f cellranger: new manual download package @7.1.0 (#38486)
* cellranger: new manual download package @7.1.0
* cellranger: updating to @7.2.0
* updating website
* Adding license url

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-04-18 10:09:00 -07:00
George Young
46c2b8a565 xeniumranger: new manual download package @1.7.1 (#42389)
* xeniumranger: new manual download package @1.7.1
* Adding license url

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-04-18 10:08:04 -07:00
James Taliaferro
5cbb59f2b8 New package: editorconfig (#43670)
* New package: editorconfig
* remove FIXMEs
* add description for editorconfig
2024-04-18 10:05:15 -07:00
Adam J. Stewart
f29fa1cfdf CI: remove MXNet (#43704) 2024-04-18 10:04:03 -07:00
Olivier Cessenat
c69951d6e1 gxsview: compiles against system qt and vtk on rhel8 (#43722)
* gxsview: compiles against system qt and vtk on rhel8
* Update gxsview/package.py for blanks around operator
* Update gxsview/package.py import blank line
* Update gxsview/package.py for style
* Update gxsview/package.py checking vtk version
2024-04-18 10:18:05 -06:00
Adam J. Stewart
f406f27d9c ML CI: remove extra xgboost (#43709) 2024-04-18 09:08:25 -07:00
Tamara Dahlgren
36ea208e12 Twitter->X: Reflect the name (only) change (#43690) 2024-04-18 08:52:54 -07:00
Kyle Knoepfel
17e0774189 Make sure variable is None if exception is raised. (#43707) 2024-04-18 08:50:15 -07:00
Sam Gillingham
3162c2459d update py-python-fmask to version 0.5.9 (#43698)
* update py-python-fmask to version 0.5.9
* add gillins and neilflood as maintainers
* remove spaces
* remove blank line
* put maintainers higher
2024-04-18 08:48:15 -07:00
Massimiliano Culpo
7cad6c62a3 Associate condition sets from cli to root node (#43710)
This PR prevents a condition_set from having nodes that are not associated with the corresponding root node through some (transitive) dependencies.
2024-04-18 17:27:12 +02:00
Harmen Stoppels
eb2ddf6fa2 asp.py: do not copy 2024-04-18 15:39:26 +02:00
Harmen Stoppels
2bc2902fed spec.py: early return in __str__ 2024-04-18 15:39:26 +02:00
Mikael Simberg
b362362291 cvise package: add version 2.10.0 and ncurses constraint (#43319) 2024-04-17 13:41:33 -07:00
Rocco Meli
32bb5c7523 mkl interface (#43673) 2024-04-17 16:19:35 -04:00
FrederickDeny
a2b76c68a0 Added e4s-cl@1.0.3 (#43693) 2024-04-17 13:45:36 -06:00
Wouter Deconinck
62132919e1 xrootd: new version 5.6.9 (#43684) 2024-04-17 12:26:52 -07:00
Wouter Deconinck
b06929f6df xerces-c: new version 3.2.5 (#43687) 2024-04-17 12:24:27 -07:00
Wouter Deconinck
0f33de157b assimp: new version 5.4.0 (#43689) 2024-04-17 10:21:15 -07:00
Sinan
03a074ebe7 package/npm update (#43692)
* package/npm update
* add conflicts to exclude certain version intervals

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2024-04-17 10:06:09 -07:00
Adam J. Stewart
4d12b6a4fd py-shapely: add v2.0.4 (#43702) 2024-04-17 18:51:48 +02:00
Adam J. Stewart
26bb15e1fb py-sphinx: add v7.3 (#43703) 2024-04-17 18:51:31 +02:00
Bill Williams
1bf92c7881 [Score-P] Make local with-or-without not use "yes" (#43701)
Score-P does not accept "--with-foo=yes", but only "--with-foo" or "--with-foo=some-valid-specific-choice-or-path". This keeps Spack from generating config flags that will cause Score-P to barf.
2024-04-17 09:32:47 -07:00
Todd Gamblin
eefe0b2eec Improve spack find output in environments (#42334)
This adds some improvements to `spack find` output when in environments based
around some thoughts about what users want to know when they're in an env.

If you're working in an environment, you mostly care about:
* What are the roots
* Which ones are installed / not installed
* What's been added that still needs to be concretized

So, this PR adds a couple tweaks to display that information more clearly:

- [x] We now display install status next to every root. You can easily see
      which are installed and which aren't.

- [x] When you run `spack find -l` in an env, the roots now show their concrete
      hash (if they've been concretized). They previously would show `-------`
      (b/c the root spec itself is abstract), but showing the concretized root's
      hash is a lot more useful.

- [x] Newly added/unconcretized specs still show `-------`, which now makes more
      sense, b/c they are not concretized.

- [x] There is a new option, `-r` / `--only-roots` to *only* show env roots if
      you don't want to look at all the installed specs.

- [x] Roots in the installed spec list are now highlighted as bold. This is
      actually an old feature from the first env implementation, but various
      refactors had disabled it inadvertently.
2024-04-17 16:22:05 +00:00
Wouter Deconinck
de6c6f0cd9 py-pyparsing: new version 3.1.2 (#43579) 2024-04-17 07:53:03 -06:00
James Smillie
309d3aa1ec Python package: fix install of static libs on Windows (#43564) 2024-04-16 16:40:15 -06:00
Andrey Perestoronin
feff11f914 intel-oneapi-dnn-2024.1.0: add DNN package version (#43679)
* add onednn package

* fix style

* Update var/spack/repos/builtin/packages/intel-oneapi-dnn/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Robert Cohn <rscohn2@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-04-16 15:26:14 -04:00
John W. Parent
de3b324983 Windows filesystem utilities (bugfix): improve SFN usage (#43645)
Reduce incidence of spurious errors by:
* Ensuring we're passing the buffer by reference
* Get the correct short string size from Windows API instead of computing ourselves
* Ensure sufficient space for null terminator character

Add test for `windows_sfn`
2024-04-16 11:02:02 -07:00
kwryankrattiger
747cd374df Run after_script aggregator with spack python (#43669) 2024-04-16 19:03:44 +02:00
Wouter Deconinck
8b3ac40436 acts: new version 34.0.0 (#43680) 2024-04-16 11:03:31 -06:00
Wouter Deconinck
28e9be443c (py-)onnx: new version 1.16.0 (#43675) 2024-04-16 09:58:43 -07:00
John W. Parent
1381bede80 zstd: 1.5.6 does not build on Windows (#43677)
Conflict until a fix has been merged upstream
2024-04-16 09:36:27 -07:00
Wouter Deconinck
6502785908 podio: +rntuple requires root +root7 (#43672) 2024-04-16 09:27:07 -07:00
Erik Heeren
53257408a3 py-bluepyopt: 1.14.11 (#43678) 2024-04-16 09:21:24 -07:00
Wouter Deconinck
28d02dff60 pythia8: new version 8.311 (#43667) 2024-04-16 09:19:06 -07:00
Wouter Deconinck
9d60b42a97 jwt-cpp: new version 0.7.0, scitokens-cpp: new versions to 1.1.1 (#43657)
* jwt-cpp: new version 0.7.0, depends_on nlohmann-json
* scitokens-cpp: new versions to 1.1.1
* scitokens-cpp: conflicts ^jwt-cpp@0.7: when @:1.1
2024-04-16 09:17:18 -07:00
Harmen Stoppels
9ff5a30574 concretize.lp: fix issue with reuse of conditional variants (#43676)
Currently if you request pkg +example where example is a conditional
variant, and you have a pkg in the database for which the condition
did not hold (so no +example nor ~example), the solver would reuse it
regardless, not imposing +example.

The change rules out exactly one thing: variant_set without variant_value,
which in practice could only happen when not node_has_variant (i.e. when
under the current package.py rules the variant's when condition did not
trigger).
2024-04-16 16:09:32 +00:00
Wouter Deconinck
9a6c013365 wayland: +doc requires graphviz +expat for HTML tables (#43668) 2024-04-16 01:28:47 -06:00
Andrey Prokopenko
9f62a3e819 arborx: add v1.6 (#43623) 2024-04-16 09:27:14 +02:00
Maciej Wójcik
e380e9a0ab gromacs: prevent version conflict after enabling plumed (#43449) 2024-04-16 09:07:50 +02:00
Adam J. Stewart
8415ea9ada py-black: switch maintainer (#43652) 2024-04-16 08:51:12 +02:00
Josh Bowden
6960766e0c Damaris: add v1.10.0 (#43664)
Co-authored-by: Joshua Bowden <joshua-chales.bowden@inria.fr>
2024-04-15 23:43:39 -06:00
Michael Kuhn
0c2ca8c841 octave: add 8.4.0 and 9.1.0 (#43518) 2024-04-15 13:32:20 -07:00
Jen Herting
273960fdbb [py-pybedtools] added version 0.10.0 (#43625) 2024-04-15 13:27:22 -07:00
eugeneswalker
0cd2a1102c crtm: add noaa versions and package mods (#43635)
* crtm: add noaa versions and package mods
* crtm@v2.4.1-jedi: add missing depends_on netcdf-fortran, ecbuild from jcsda spack fork
2024-04-15 13:24:07 -07:00
Teague Sterling
e40676e901 Add ollama package (#43655)
* Added package to build Ollama
* Update package.py
  Add license and documentation
* [@spackbot] updating style on behalf of teaguesterling
2024-04-15 13:13:07 -07:00
Tristan Carel
4ddb07e94f py-morphio: add version 3.3.7, update license (#43420) 2024-04-15 14:06:48 -06:00
eugeneswalker
50585d55c5 hdf: %oneapi: -Wno-error=implicit-function-declaration (#43631)
* hdf: %oneapi: -Wno-error=implicit-function-declaration

* [@spackbot] updating style on behalf of eugeneswalker
2024-04-15 12:17:07 -07:00
Adam J. Stewart
5d6b5f3f6f GDAL: fix MDB build (#43665) 2024-04-15 10:38:56 -07:00
Juan Miguel Carceller
2351c19489 glew: remove a few unused options (#43399)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-04-15 18:49:51 +02:00
snehring
08d49361f0 iq-tree: add v2.3.1 (#43442) 2024-04-15 18:48:58 +02:00
Ken Raffenetti
c3c63e5ca4 mpich: Update PMI configure options (#43551)
Add a "default" option that passes no option to configure. Existing
options changed in the MPICH 4.2.0 release, so update the package to
reflect those changes.
2024-04-15 18:40:37 +02:00
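
The "default" option might be wired up roughly like this package.py sketch (the variant name and value set are assumptions, not necessarily mpich's exact ones):

```python
from spack.package import *


class MpichSketch(AutotoolsPackage):
    """Sketch only: a 'default' value that passes no configure option."""

    variant(
        "pmi",
        default="default",
        values=("default", "pmi", "pmi2", "pmix"),
        description="PMI interface ('default' lets configure decide)",
    )

    def configure_args(self):
        args = []
        pmi = self.spec.variants["pmi"].value
        if pmi != "default":  # "default" passes nothing to configure
            args.append(f"--with-pmi={pmi}")
        return args
```
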
Richard Berger
e72d4075bd LAMMPS: add v20240207.1 (#43538)
Add workaround for undefined HIP_PATH in older versions
2024-04-15 16:34:30 +00:00
Todd Gamblin
f9f97bf22b tests: Spec tests shouldn't fetch remote git repositories. (#43656)
Currently, some of the tests in `spec_format` and `spec_semantics` fetch
the actual zlib repository when run, because they call `str()` on specs
like `zlib@foo/bar`, which at least currently requires a remote git clone
to resolve.

This doesn't change the behavior of git versions, but it uses our mock git
repo infrastructure and clones the `git-test` package instead of the *real*
URL from the mock `zlib` package.

This should speed up tests.  We could probably refactor more so that the git
tests *all* use such a fixture, but the `checks` field that unfortunately
tightly couples the mock git repository and the `git_fetch` tests complicates
this. We could also consider *not* making `str()` resolve git versions, but
I did not dig into that here.

- [x] add a mock_git_test_package fixture that sets up a mock git repo *and*
      monkeypatches the `git-test` package (like our git test packages do)
- [x] use fixture in `test_spec_format_path`
- [x] use fixture in `test_spec_format_path_posix`
- [x] use fixture in `test_spec_format_path_windows`
- [x] use fixture in `test_parse_single_spec`
2024-04-15 09:20:23 -07:00
Massimiliano Culpo
8033455d5f hdf5: require mpich+fortran when hdf5+fortran (#43591) 2024-04-15 18:17:28 +02:00
Eric Berquist
50a5a6fea4 tree-sitter: add versions up to 0.22.2 (#43608) 2024-04-15 09:10:19 -07:00
eugeneswalker
0de8a0e3f3 wgrib2: oneapi -> comp_sys="intel_linux" (#43632) 2024-04-15 18:04:41 +02:00
SXS Bot
0a26e74cc8 spectre: add v2024.04.12 (#43641)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-04-15 18:03:51 +02:00
dependabot[bot]
9dfd91efbb build(deps): bump docker/setup-buildx-action from 3.2.0 to 3.3.0 (#43542)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.2.0 to 3.3.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](2b51285047...d70bba72b1)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 18:00:47 +02:00
dependabot[bot]
1a7baadbff build(deps): bump python-levenshtein in /lib/spack/docs (#43543)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.25.0 to 0.25.1.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.25.0...v0.25.1)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 18:00:24 +02:00
Marc Perache
afcfd56ae5 range-v3: add v0.12.0 (#40103) 2024-04-15 17:43:14 +02:00
dependabot[bot]
7eb2e704b6 build(deps): bump codecov/codecov-action from 4.1.1 to 4.3.0 (#43562)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.1.1 to 4.3.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](c16abc29c9...84508663e9)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 17:34:31 +02:00
Josh Milthorpe
564b4fa263 hipcub: depend on matching version of hip+cuda when +cuda (#42970) 2024-04-15 17:33:26 +02:00
Adam J. Stewart
0a941b43ca PyTorch: build with external cpuinfo (#40758) 2024-04-15 17:26:52 +02:00
one
35ff24ddea googletest: fix reversed pthreads variant logic (#43649) 2024-04-15 11:18:27 -04:00
Rocco Meli
7019e4e3cb openbabel: add CMake patch for 3.1.1 (#43612) 2024-04-15 17:07:54 +02:00
Weston Ortiz
cb16b8a047 goma: new version 7.6.1 (#43617) 2024-04-15 17:04:34 +02:00
Adam J. Stewart
381acb3726 Build systems: fix docstrings (#43618) 2024-04-15 17:01:52 +02:00
Adam J. Stewart
d87ea0b256 py-maturin: add v1.5.1 (#43619) 2024-04-15 16:55:17 +02:00
Adam J. Stewart
1a757e7f70 py-lightning: add v2.2.2 (#43621) 2024-04-15 16:54:34 +02:00
eugeneswalker
704e2c53a8 py-eccodes: add v1.4.2 (#43633) 2024-04-15 16:44:57 +02:00
renjithravindrankannath
478d8a668c rocm-opencl: add dependency on aqlprofile (#43637) 2024-04-15 16:44:16 +02:00
dependabot[bot]
7903f9fcfd build(deps): bump black from 24.3.0 to 24.4.0 in /lib/spack/docs (#43642)
Bumps [black](https://github.com/psf/black) from 24.3.0 to 24.4.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.3.0...24.4.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 16:40:37 +02:00
dependabot[bot]
670d3d3fdc build(deps): bump black in /.github/workflows/style (#43643)
Bumps [black](https://github.com/psf/black) from 24.3.0 to 24.4.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.3.0...24.4.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-15 16:40:14 +02:00
Vanessasaurus
e8aab6b31c flux-core: add v0.61.2 (#43648)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-04-15 16:34:58 +02:00
Adam J. Stewart
1ce408ecc5 py-ruff: add v0.3.7 (#43620) 2024-04-15 16:33:11 +02:00
Adam J. Stewart
dc81a2dcdb py-rasterio: add v1.3.10 (#43653) 2024-04-15 16:32:34 +02:00
Rocco Meli
b10f51f020 charmpp: add archs including Cray shasta with ARM (#43191)
Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-04-15 16:28:31 +02:00
Hariharan Devarajan
4f4e3f5607 dlio-profiler: add releases up to v0.0.5 (#43530) 2024-04-15 16:21:52 +02:00
Wouter Deconinck
00fb80e766 util-linux: new version 2.40 (#43661) 2024-04-15 16:16:52 +02:00
Michael Kuhn
057603cad8 Fix pkgconfig dependencies (#43651)
pkgconfig is the virtual package, pkg-config and pkgconf are
implementations.
2024-04-15 16:05:11 +02:00
kjrstory
5b8b6e492d su2: add v8.0.0, v8.0.1 (#43662) 2024-04-15 15:25:16 +02:00
Wouter Deconinck
763279cd61 spdlog: new version 1.13.0 (#43658) 2024-04-15 12:36:10 +02:00
Wouter Deconinck
e4237b9153 zlib: new version 1.3.1 (#43660) 2024-04-15 10:59:06 +02:00
Wouter Deconinck
d288658cf0 zstd: new version 1.5.6 (#43659) 2024-04-15 09:04:42 +02:00
394 changed files with 5989 additions and 2362 deletions


@@ -17,10 +17,16 @@ concurrency:
jobs:
# Run audits on all the packages in the built-in repository
package-audits:
runs-on: ${{ matrix.operating_system }}
runs-on: ${{ matrix.system.os }}
strategy:
matrix:
operating_system: ["ubuntu-latest", "macos-latest"]
system:
- { os: windows-latest, shell: 'powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}' }
- { os: ubuntu-latest, shell: bash }
- { os: macos-latest, shell: bash }
defaults:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@9bb56186c3b09b4f86b1c65136769dd318469633
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -29,21 +35,33 @@ jobs:
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml]
- name: Setup for Windows run
if: runner.os == 'Windows'
run: |
python -m pip install --upgrade pywin32
- name: Package audits (with coverage)
if: ${{ inputs.with_coverage == 'true' }}
if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage run $(which spack) audit externals
coverage run $(which spack) -d audit externals
coverage combine
coverage xml
- name: Package audits (without coverage)
if: ${{ inputs.with_coverage == 'false' }}
if: ${{ inputs.with_coverage == 'false' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
$(which spack) audit packages
$(which spack) audit externals
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
. share/spack/setup-env.sh
spack -d audit packages
spack -d audit externals
- name: Package audits (without coverage)
if: ${{ runner.os == 'Windows' }}
run: |
. share/spack/setup-env.sh
spack -d audit packages
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits


@@ -45,12 +45,15 @@ jobs:
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
[ubuntu-noble, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:24.04'],
[almalinux8, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:8'],
[almalinux9, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:9'],
[rockylinux8, 'linux/amd64,linux/arm64', 'rockylinux:8'],
[rockylinux9, 'linux/amd64,linux/arm64', 'rockylinux:9'],
[fedora37, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:37'],
[fedora38, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:38']]
[fedora38, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:38'],
[fedora39, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:39'],
[fedora40, 'linux/amd64,linux/arm64,linux/ppc64le', 'fedora:40']]
name: Build ${{ matrix.dockerfile[0] }}
if: github.repository == 'spack/spack'
steps:
@@ -87,16 +90,16 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@a8a3f3ad30e3422c9c7b888a15615d19a852ae32
uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@2b51285047da1547ffb1b2203d8be4c0af6b1f20
uses: docker/setup-buildx-action@d70bba72b1f3fd22344832f00baa16ece964efeb
- name: Log in to GitHub Container Registry
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
@@ -120,3 +123,14 @@ jobs:
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}
merge-dockerfiles:
runs-on: ubuntu-latest
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@65462800fd760344b1a7b4382951275a0abb4808
with:
name: dockerfiles
pattern: dockerfiles_*
delete-merged: true


@@ -1,4 +1,4 @@
black==24.3.0
black==24.4.2
clingo==5.7.1
flake8==7.0.0
isort==5.13.2


@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
@@ -124,7 +124,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
@@ -185,7 +185,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
@@ -223,7 +223,7 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}


@@ -33,7 +33,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
@@ -59,7 +59,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@c16abc29c95fcf9174b58eb7e1abf4c866893bc8
- uses: codecov/codecov-action@84508663e988701840491b86de86b666e8a86bed
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}


@@ -88,7 +88,7 @@ Resources:
[bridged](https://github.com/matrix-org/matrix-appservice-slack#matrix-appservice-slack) to Slack.
* [**Github Discussions**](https://github.com/spack/spack/discussions):
for Q&A and discussions. Note the pinned discussions for announcements.
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
* **X**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack):
only for announcements. Please use other venues for discussions.


@@ -15,7 +15,7 @@ concretizer:
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization. If `dependencies`, we'll only reuse dependencies but
# give you a fresh concretization for your root specs.
reuse: dependencies
reuse: true
# Options that tune which targets are considered for concretization. The
# concretization process is very sensitive to the number targets, and the time
# needed to reach a solution increases noticeably with the number of targets


@@ -0,0 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]


@@ -19,7 +19,6 @@ packages:
- apple-clang
- clang
- gcc
- intel
providers:
elf: [libelf]
fuse: [macfuse]


@@ -0,0 +1,19 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]


@@ -15,9 +15,10 @@
# -------------------------------------------------------------------------
packages:
all:
compiler: [gcc, intel, pgi, clang, xl, nag, fj, aocc]
compiler: [gcc, clang, oneapi, xl, nag, fj, aocc]
providers:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
D: [ldc]
daal: [intel-oneapi-daal]
@@ -35,6 +36,7 @@ packages:
java: [openjdk, jdk, ibm-java]
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
libc: [glibc, musl]
libgfortran: [ gcc-runtime ]
libglx: [mesa+glx, mesa18+glx]
libifcore: [ intel-oneapi-runtime ]

lib/spack/docs/_templates/layout.html

@@ -0,0 +1,12 @@
{% extends "!layout.html" %}
{%- block extrahead %}
<!-- Google tag (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-S0PQ7WV75K"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'G-S0PQ7WV75K');
</script>
{% endblock %}


@@ -21,23 +21,86 @@ is the following:
Reuse already installed packages
--------------------------------
The ``reuse`` attribute controls whether Spack will prefer to use installed packages (``true``), or
whether it will do a "fresh" installation and prefer the latest settings from
``package.py`` files and ``packages.yaml`` (``false``).
You can use:
The ``reuse`` attribute controls how aggressively Spack reuses binary packages during concretization. The
attribute can either be a single value, or an object for more complex configurations.
In the former case ("single value") it allows Spack to:
1. Reuse installed packages and buildcaches for all the specs to be concretized, when ``true``
2. Reuse installed packages and buildcaches only for the dependencies of the root specs, when ``dependencies``
3. Disregard reusing installed packages and buildcaches, when ``false``
If finer control over which specs are reused is needed, the value of this attribute can be
an object with the following keys:
1. ``roots``: if ``true`` root specs are reused, if ``false`` only dependencies of root specs are reused
2. ``from``: list of sources from which reused specs are taken
Each source in ``from`` is itself an object:
.. list-table:: Attributes for a source of reusable specs
   :header-rows: 1

   * - Attribute name
     - Description
   * - type (mandatory, string)
     - Can be ``local``, ``buildcache``, or ``external``
   * - include (optional, list of specs)
     - If present, reusable specs must match at least one of the constraints in the list
   * - exclude (optional, list of specs)
     - If present, reusable specs must not match any of the constraints in the list.
For instance, the following configuration:
.. code-block:: yaml

   concretizer:
     reuse:
       roots: true
       from:
       - type: local
         include:
         - "%gcc"
         - "%clang"
tells the concretizer to reuse all specs compiled with either ``gcc`` or ``clang`` that are installed
in the local store. Any spec from remote buildcaches is disregarded.
To reduce the boilerplate in configuration files, default values for the ``include`` and
``exclude`` options can be pushed up one level:
.. code-block:: yaml

   concretizer:
     reuse:
       roots: true
       include:
       - "%gcc"
       from:
       - type: local
       - type: buildcache
       - type: local
         include:
         - "foo %oneapi"
In the example above we reuse all specs compiled with ``gcc`` from the local store
and remote buildcaches, and we also reuse ``foo %oneapi``. Note that the last source of
specs overrides the default ``include`` attribute.
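
An ``exclude`` list works the same way. For instance, the following sketch (a hypothetical
configuration, using the same schema as above) reuses local specs except anything built
with ``gcc@9``:

.. code-block:: yaml

   concretizer:
     reuse:
       roots: true
       from:
       - type: local
         exclude:
         - "%gcc@9"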
For one-off concretizations, there are command line arguments for each of the simple "single value"
configurations. This means a user can:
.. code-block:: console

   % spack install --reuse <spec>

to enable reuse for a single installation, and you can use:
to enable reuse for a single installation, or:

.. code-block:: console

   spack install --fresh <spec>

to do a fresh install if ``reuse`` is enabled by default.
``reuse: dependencies`` is the default.
.. seealso::


@@ -203,6 +203,9 @@ The OS that are currently supported are summarized in the table below:
* - Ubuntu 22.04
- ``ubuntu:22.04``
- ``spack/ubuntu-jammy``
* - Ubuntu 24.04
- ``ubuntu:24.04``
- ``spack/ubuntu-noble``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
@@ -233,6 +236,12 @@ The OS that are currently supported are summarized in the table below:
* - Fedora Linux 38
- ``fedora:38``
- ``spack/fedora38``
* - Fedora Linux 39
- ``fedora:39``
- ``spack/fedora39``
* - Fedora Linux 40
- ``fedora:40``
- ``spack/fedora40``


@@ -552,11 +552,11 @@ With either interpreter you can run a single command:
.. code-block:: console
$ spack python -c 'import distro; distro.linux_distribution()'
('Ubuntu', '18.04', 'Bionic Beaver')
$ spack python -c 'from spack.spec import Spec; Spec("python").concretized()'
...
$ spack python -i ipython -c 'import distro; distro.linux_distribution()'
Out[1]: ('Ubuntu', '18.04', 'Bionic Beaver')
$ spack python -i ipython -c 'from spack.spec import Spec; Spec("python").concretized()'
Out[1]: ...
or a file:
@@ -1071,9 +1071,9 @@ Announcing a release
We announce releases in all of the major Spack communication channels.
Publishing the release takes care of GitHub. The remaining channels are
Twitter, Slack, and the mailing list. Here are the steps:
X, Slack, and the mailing list. Here are the steps:
#. Announce the release on Twitter.
#. Announce the release on X.
* Compose the tweet on the ``@spackpm`` account per the
``spack-twitter`` slack channel.


@@ -1572,6 +1572,8 @@ Microsoft Visual Studio
"""""""""""""""""""""""
Microsoft Visual Studio provides the only Windows C/C++ compiler that is currently supported by Spack.
Spack additionally requires that the Windows SDK (including WGL) be installed as part of your
Visual Studio installation, as it is required to build many packages from source.
We require several specific components to be included in the Visual Studio installation.
One is the C/C++ toolset, which can be selected as "Desktop development with C++" or "C++ build tools,"
@@ -1579,6 +1581,7 @@ depending on installation type (Professional, Build Tools, etc.) The other requ
"C++ CMake tools for Windows," which can be selected from among the optional packages.
This provides CMake and Ninja for use during Spack configuration.
If you already have Visual Studio installed, you can make sure these components are installed by
rerunning the installer. Next to your installation, select "Modify" and look at the
"Installation details" pane on the right.


@@ -6435,9 +6435,12 @@ the ``paths`` attribute:
echo "Target: x86_64-pc-linux-gnu"
echo "Thread model: posix"
echo "InstalledDir: /usr/bin"
platforms: ["linux", "darwin"]
results:
- spec: 'llvm@3.9.1 +clang~lld~lldb'
If the ``platforms`` attribute is present, tests are run only if the current host
matches one of the listed platforms.
Each test is performed by first creating a temporary directory structure as
specified in the corresponding ``layout`` and by then running
package detection and checking that the outcome matches the expected
@@ -6471,6 +6474,10 @@ package detection and checking that the outcome matches the expected
- A spec that is expected from detection
- Any valid spec
- Yes
* - ``results:[0]:extra_attributes``
- Extra attributes expected on the associated Spec
- Nested dictionary with string as keys, and regular expressions as leaf values
- No
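
For instance, a result entry can require that detected extra attributes match given
regular expressions (an illustrative sketch; the ``compilers`` keys are hypothetical):

.. code-block:: yaml

   results:
   - spec: 'llvm@3.9.1 +clang~lld~lldb'
     extra_attributes:
       compilers:
         c: '.*/bin/clang-?.*'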
"""""""""""""""""""""""""""""""
Reuse tests from other packages


@@ -2,12 +2,12 @@ sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.17.2
urllib3==2.2.1
pytest==8.1.1
pytest==8.2.0
isort==5.13.2
black==24.3.0
black==24.4.0
flake8==7.0.0
mypy==1.9.0

lib/spack/env/cc

@@ -47,7 +47,8 @@ SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_LINKER_ARG
SPACK_SHORT_SPEC
SPACK_SYSTEM_DIRS"
SPACK_SYSTEM_DIRS
SPACK_MANAGED_DIRS"
# Optional parameters that aren't required to be set
@@ -173,22 +174,6 @@ preextend() {
unset IFS
}
# system_dir PATH
# test whether a path is a system directory
system_dir() {
IFS=':' # SPACK_SYSTEM_DIRS is colon-separated
path="$1"
for sd in $SPACK_SYSTEM_DIRS; do
if [ "${path}" = "${sd}" ] || [ "${path}" = "${sd}/" ]; then
# success if path starts with a system prefix
unset IFS
return 0
fi
done
unset IFS
return 1 # fail if path starts no system prefix
}
# Fail with a clear message if the input contains any bell characters.
if eval "[ \"\${*#*${lsep}}\" != \"\$*\" ]"; then
die "Compiler command line contains our separator ('${lsep}'). Cannot parse."
@@ -201,6 +186,18 @@ for param in $params; do
fi
done
# eval this because SPACK_MANAGED_DIRS and SPACK_SYSTEM_DIRS are inputs we don't wanna loop over.
# moving the eval inside the function would eval it every call.
eval "\
path_order() {
case \"\$1\" in
$SPACK_MANAGED_DIRS) return 0 ;;
$SPACK_SYSTEM_DIRS) return 2 ;;
/*) return 1 ;;
esac
}
"
# Check if optional parameters are defined
# If we aren't asking for debug flags, don't add them
if [ -z "${SPACK_ADD_DEBUG_FLAGS:-}" ]; then
@@ -420,11 +417,12 @@ input_command="$*"
parse_Wl() {
while [ $# -ne 0 ]; do
if [ "$wl_expect_rpath" = yes ]; then
if system_dir "$1"; then
append return_system_rpath_dirs_list "$1"
else
append return_rpath_dirs_list "$1"
fi
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
wl_expect_rpath=no
else
case "$1" in
@@ -432,21 +430,25 @@ parse_Wl() {
arg="${1#-rpath=}"
if [ -z "$arg" ]; then
shift; continue
elif system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
--rpath=*)
arg="${1#--rpath=}"
if [ -z "$arg" ]; then
shift; continue
elif system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
-rpath|--rpath)
wl_expect_rpath=yes
@@ -473,12 +475,20 @@ categorize_arguments() {
return_other_args_list=""
return_isystem_was_used=""
return_isystem_spack_store_include_dirs_list=""
return_isystem_system_include_dirs_list=""
return_isystem_include_dirs_list=""
return_spack_store_include_dirs_list=""
return_system_include_dirs_list=""
return_include_dirs_list=""
return_spack_store_lib_dirs_list=""
return_system_lib_dirs_list=""
return_lib_dirs_list=""
return_spack_store_rpath_dirs_list=""
return_system_rpath_dirs_list=""
return_rpath_dirs_list=""
@@ -546,29 +556,32 @@ categorize_arguments() {
arg="${1#-isystem}"
return_isystem_was_used=true
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
append return_isystem_system_include_dirs_list "$arg"
else
append return_isystem_include_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_isystem_spack_store_include_dirs_list "$arg" ;;
1) append return_isystem_include_dirs_list "$arg" ;;
2) append return_isystem_system_include_dirs_list "$arg" ;;
esac
;;
-I*)
arg="${1#-I}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
append return_system_include_dirs_list "$arg"
else
append return_include_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_include_dirs_list "$arg" ;;
1) append return_include_dirs_list "$arg" ;;
2) append return_system_include_dirs_list "$arg" ;;
esac
;;
-L*)
arg="${1#-L}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
append return_system_lib_dirs_list "$arg"
else
append return_lib_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_lib_dirs_list "$arg" ;;
1) append return_lib_dirs_list "$arg" ;;
2) append return_system_lib_dirs_list "$arg" ;;
esac
;;
-l*)
# -loopopt=0 is generated erroneously in autoconf <= 2.69,
@@ -601,29 +614,32 @@ categorize_arguments() {
break
elif [ "$xlinker_expect_rpath" = yes ]; then
# Register the path of -Xlinker -rpath <other args> -Xlinker <path>
if system_dir "$1"; then
append return_system_rpath_dirs_list "$1"
else
append return_rpath_dirs_list "$1"
fi
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
xlinker_expect_rpath=no
else
case "$1" in
-rpath=*)
arg="${1#-rpath=}"
if system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
--rpath=*)
arg="${1#--rpath=}"
if system_dir "$arg"; then
append return_system_rpath_dirs_list "$arg"
else
append return_rpath_dirs_list "$arg"
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
;;
-rpath|--rpath)
xlinker_expect_rpath=yes
@@ -661,16 +677,25 @@ categorize_arguments() {
}
categorize_arguments "$@"
include_dirs_list="$return_include_dirs_list"
lib_dirs_list="$return_lib_dirs_list"
rpath_dirs_list="$return_rpath_dirs_list"
system_include_dirs_list="$return_system_include_dirs_list"
system_lib_dirs_list="$return_system_lib_dirs_list"
system_rpath_dirs_list="$return_system_rpath_dirs_list"
isystem_was_used="$return_isystem_was_used"
isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
isystem_include_dirs_list="$return_isystem_include_dirs_list"
other_args_list="$return_other_args_list"
spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
system_include_dirs_list="$return_system_include_dirs_list"
include_dirs_list="$return_include_dirs_list"
spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
system_lib_dirs_list="$return_system_lib_dirs_list"
lib_dirs_list="$return_lib_dirs_list"
spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
system_rpath_dirs_list="$return_system_rpath_dirs_list"
rpath_dirs_list="$return_rpath_dirs_list"
isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
isystem_include_dirs_list="$return_isystem_include_dirs_list"
isystem_was_used="$return_isystem_was_used"
other_args_list="$return_other_args_list"
#
# Add flags from Spack's cppflags, cflags, cxxflags, fcflags, fflags, and
@@ -730,7 +755,7 @@ esac
# Linker flags
case "$mode" in
ld|ccld)
ccld)
extend spack_flags_list SPACK_LDFLAGS
;;
esac
@@ -738,16 +763,25 @@ esac
IFS="$lsep"
categorize_arguments $spack_flags_list
unset IFS
spack_flags_include_dirs_list="$return_include_dirs_list"
spack_flags_lib_dirs_list="$return_lib_dirs_list"
spack_flags_rpath_dirs_list="$return_rpath_dirs_list"
spack_flags_system_include_dirs_list="$return_system_include_dirs_list"
spack_flags_system_lib_dirs_list="$return_system_lib_dirs_list"
spack_flags_system_rpath_dirs_list="$return_system_rpath_dirs_list"
spack_flags_isystem_was_used="$return_isystem_was_used"
spack_flags_isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
spack_flags_isystem_include_dirs_list="$return_isystem_include_dirs_list"
spack_flags_other_args_list="$return_other_args_list"
spack_flags_isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
spack_flags_isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
spack_flags_isystem_include_dirs_list="$return_isystem_include_dirs_list"
spack_flags_spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
spack_flags_system_include_dirs_list="$return_system_include_dirs_list"
spack_flags_include_dirs_list="$return_include_dirs_list"
spack_flags_spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
spack_flags_system_lib_dirs_list="$return_system_lib_dirs_list"
spack_flags_lib_dirs_list="$return_lib_dirs_list"
spack_flags_spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
spack_flags_system_rpath_dirs_list="$return_system_rpath_dirs_list"
spack_flags_rpath_dirs_list="$return_rpath_dirs_list"
spack_flags_isystem_was_used="$return_isystem_was_used"
spack_flags_other_args_list="$return_other_args_list"
# On macOS insert headerpad_max_install_names linker flag
@@ -767,11 +801,13 @@ if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
# Append RPATH directories. Note that in the case of the
# top-level package these directories may not exist yet. For dependencies
# it is assumed that paths have already been confirmed.
extend spack_store_rpath_dirs_list SPACK_STORE_RPATH_DIRS
extend rpath_dirs_list SPACK_RPATH_DIRS
fi
fi
if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
extend spack_store_lib_dirs_list SPACK_STORE_LINK_DIRS
extend lib_dirs_list SPACK_LINK_DIRS
fi
@@ -798,38 +834,50 @@ case "$mode" in
;;
esac
case "$mode" in
cpp|cc|as|ccld)
if [ "$spack_flags_isystem_was_used" = "true" ] || [ "$isystem_was_used" = "true" ]; then
extend isystem_spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend isystem_include_dirs_list SPACK_INCLUDE_DIRS
else
extend spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend include_dirs_list SPACK_INCLUDE_DIRS
fi
;;
esac
#
# Finally, reassemble the command line.
#
args_list="$flags_list"
# Insert include directories just prior to any system include directories
# Include search paths partitioned by (in store, non-system, system)
# NOTE: adding ${lsep} to the prefix here turns every added element into two
extend args_list spack_flags_include_dirs_list "-I"
extend args_list include_dirs_list "-I"
extend args_list spack_flags_spack_store_include_dirs_list -I
extend args_list spack_store_include_dirs_list -I
extend args_list spack_flags_include_dirs_list -I
extend args_list include_dirs_list -I
extend args_list spack_flags_isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list isystem_include_dirs_list "-isystem${lsep}"
case "$mode" in
cpp|cc|as|ccld)
if [ "$spack_flags_isystem_was_used" = "true" ]; then
extend args_list SPACK_INCLUDE_DIRS "-isystem${lsep}"
elif [ "$isystem_was_used" = "true" ]; then
extend args_list SPACK_INCLUDE_DIRS "-isystem${lsep}"
else
extend args_list SPACK_INCLUDE_DIRS "-I"
fi
;;
esac
extend args_list spack_flags_system_include_dirs_list -I
extend args_list system_include_dirs_list -I
extend args_list spack_flags_isystem_system_include_dirs_list "-isystem${lsep}"
extend args_list isystem_system_include_dirs_list "-isystem${lsep}"
# Library search paths
# Library search paths partitioned by (in store, non-system, system)
extend args_list spack_flags_spack_store_lib_dirs_list "-L"
extend args_list spack_store_lib_dirs_list "-L"
extend args_list spack_flags_lib_dirs_list "-L"
extend args_list lib_dirs_list "-L"
extend args_list spack_flags_system_lib_dirs_list "-L"
extend args_list system_lib_dirs_list "-L"
@@ -839,8 +887,12 @@ case "$mode" in
if [ -n "$dtags_to_add" ] ; then
append args_list "$linker_arg$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "$rpath"
extend args_list spack_store_rpath_dirs_list "$rpath"
extend args_list spack_flags_rpath_dirs_list "$rpath"
extend args_list rpath_dirs_list "$rpath"
extend args_list spack_flags_system_rpath_dirs_list "$rpath"
extend args_list system_rpath_dirs_list "$rpath"
;;
@@ -848,8 +900,12 @@ case "$mode" in
if [ -n "$dtags_to_add" ] ; then
append args_list "$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_rpath_dirs_list "-rpath${lsep}"
extend args_list rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_system_rpath_dirs_list "-rpath${lsep}"
extend args_list system_rpath_dirs_list "-rpath${lsep}"
;;


@@ -497,7 +497,7 @@ def copy_attributes(self, t, memo=None):
Tag.attrib, merge_attrib]:
if hasattr(self, a):
if memo is not None:
setattr(t, a, copy.deepcopy(getattr(self, a, memo)))
setattr(t, a, copy.deepcopy(getattr(self, a), memo))
else:
setattr(t, a, getattr(self, a))
# fmt: on
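# Note: the removed form passed ``memo`` as the *default* argument of
# getattr(), so copy.deepcopy() never received the memo dict; the fixed
# form forwards memo to deepcopy(), letting shared objects be copied only once.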


@@ -0,0 +1,13 @@
diff --git a/lib/spack/external/_vendoring/ruamel/yaml/comments.py b/lib/spack/external/_vendoring/ruamel/yaml/comments.py
index 1badeda585..892c868af3 100644
--- a/lib/spack/external/_vendoring/ruamel/yaml/comments.py
+++ b/lib/spack/external/_vendoring/ruamel/yaml/comments.py
@@ -497,7 +497,7 @@ def copy_attributes(self, t, memo=None):
Tag.attrib, merge_attrib]:
if hasattr(self, a):
if memo is not None:
- setattr(t, a, copy.deepcopy(getattr(self, a, memo)))
+ setattr(t, a, copy.deepcopy(getattr(self, a), memo))
else:
setattr(t, a, getattr(self, a))
# fmt: on


@@ -1234,10 +1234,12 @@ def windows_sfn(path: os.PathLike):
import ctypes
k32 = ctypes.WinDLL("kernel32", use_last_error=True)
# Method with null values returns size of short path name
sz = k32.GetShortPathNameW(path, None, 0)
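# (when the buffer is too small, the return value is the required size in
# characters, including the terminating null)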
# stub Windows types TCHAR[LENGTH]
TCHAR_arr = ctypes.c_wchar * len(path)
TCHAR_arr = ctypes.c_wchar * sz
ret_str = TCHAR_arr()
k32.GetShortPathNameW(path, ret_str, len(path))
k32.GetShortPathNameW(path, ctypes.byref(ret_str), sz)
return ret_str.value


@@ -12,7 +12,7 @@
import traceback
from datetime import datetime
from sys import platform as _platform
from typing import NoReturn
from typing import Any, NoReturn
if _platform != "win32":
import fcntl
@@ -158,21 +158,22 @@ def get_timestamp(force=False):
return ""
def msg(message, *args, **kwargs):
def msg(message: Any, *args: Any, newline: bool = True) -> None:
if not msg_enabled():
return
if isinstance(message, Exception):
message = "%s: %s" % (message.__class__.__name__, str(message))
message = f"{message.__class__.__name__}: {message}"
else:
message = str(message)
newline = kwargs.get("newline", True)
st_text = ""
if _stacktrace:
st_text = process_stacktrace(2)
if newline:
cprint("@*b{%s==>} %s%s" % (st_text, get_timestamp(), cescape(_output_filter(message))))
else:
cwrite("@*b{%s==>} %s%s" % (st_text, get_timestamp(), cescape(_output_filter(message))))
nl = "\n" if newline else ""
cwrite(f"@*b{{{st_text}==>}} {get_timestamp()}{cescape(_output_filter(message))}{nl}")
for arg in args:
print(indent + _output_filter(str(arg)))


@@ -237,7 +237,6 @@ def transpose():
def colified(
elts: List[Any],
cols: int = 0,
output: Optional[IO] = None,
indent: int = 0,
padding: int = 2,
tty: Optional[bool] = None,


@@ -62,6 +62,7 @@
import re
import sys
from contextlib import contextmanager
from typing import Optional
class ColorParseError(Exception):
@@ -95,7 +96,7 @@ def __init__(self, message):
} # white
# Regex to be used for color formatting
color_re = r"@(?:@|\.|([*_])?([a-zA-Z])?(?:{((?:[^}]|}})*)})?)"
COLOR_RE = re.compile(r"@(?:(@)|(\.)|([*_])?([a-zA-Z])?(?:{((?:[^}]|}})*)})?)")
# Mapping from color arguments to values for tty.set_color
color_when_values = {"always": True, "auto": None, "never": False}
@@ -203,77 +204,64 @@ def color_when(value):
set_color_when(old_value)
class match_to_ansi:
def __init__(self, color=True, enclose=False, zsh=False):
self.color = _color_when_value(color)
self.enclose = enclose
self.zsh = zsh
def escape(self, s):
"""Returns a TTY escape sequence for a color"""
if self.color:
if self.zsh:
result = rf"\e[0;{s}m"
else:
result = f"\033[{s}m"
if self.enclose:
result = rf"\[{result}\]"
return result
else:
return ""
def __call__(self, match):
"""Convert a match object generated by ``color_re`` into an ansi
color code. This can be used as a handler in ``re.sub``.
"""
style, color, text = match.groups()
m = match.group(0)
if m == "@@":
return "@"
elif m == "@.":
return self.escape(0)
elif m == "@":
raise ColorParseError("Incomplete color format: '%s' in %s" % (m, match.string))
string = styles[style]
if color:
if color not in colors:
raise ColorParseError(
"Invalid color specifier: '%s' in '%s'" % (color, match.string)
)
string += ";" + str(colors[color])
colored_text = ""
if text:
colored_text = text + self.escape(0)
return self.escape(string) + colored_text
def _escape(s: str, color: bool, enclose: bool, zsh: bool) -> str:
"""Returns a TTY escape sequence for a color"""
if color:
if zsh:
result = rf"\e[0;{s}m"
else:
result = f"\033[{s}m"
if enclose:
result = rf"\[{result}\]"
return result
else:
return ""
def colorize(string, **kwargs):
def colorize(
string: str, color: Optional[bool] = None, enclose: bool = False, zsh: bool = False
) -> str:
"""Replace all color expressions in a string with ANSI control codes.
Args:
string (str): The string to replace
string: The string to replace
Returns:
str: The filtered string
The filtered string
Keyword Arguments:
color (bool): If False, output will be plain text without control
codes, for output to non-console devices.
enclose (bool): If True, enclose ansi color sequences with
color: If False, output will be plain text without control codes, for output to
non-console devices (default: automatically choose color or not)
enclose: If True, enclose ansi color sequences with
square brackets to prevent misestimation of terminal width.
zsh (bool): If True, use zsh ansi codes instead of bash ones (for variables like PS1)
zsh: If True, use zsh ansi codes instead of bash ones (for variables like PS1)
"""
color = _color_when_value(kwargs.get("color", get_color_when()))
zsh = kwargs.get("zsh", False)
string = re.sub(color_re, match_to_ansi(color, kwargs.get("enclose")), string, zsh)
string = string.replace("}}", "}")
return string
color = color if color is not None else get_color_when()
def match_to_ansi(match):
"""Convert a match object generated by ``COLOR_RE`` into an ansi
color code. This can be used as a handler in ``re.sub``.
"""
escaped_at, dot, style, color_code, text = match.groups()
if escaped_at:
return "@"
elif dot:
return _escape(0, color, enclose, zsh)
elif not (style or color_code):
raise ColorParseError(
f"Incomplete color format: '{match.group(0)}' in '{match.string}'"
)
ansi_code = _escape(f"{styles[style]};{colors.get(color_code, '')}", color, enclose, zsh)
if text:
return f"{ansi_code}{text}{_escape(0, color, enclose, zsh)}"
else:
return ansi_code
return COLOR_RE.sub(match_to_ansi, string).replace("}}", "}")
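# e.g. colorize("@*b{hello}", color=True) -> "\033[1;34mhello\033[0m" (bold blue)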
def clen(string):
@@ -305,7 +293,7 @@ def cprint(string, stream=None, color=None):
cwrite(string + "\n", stream, color)
def cescape(string):
def cescape(string: str) -> str:
"""Escapes special characters needed for color codes.
Replaces the following symbols with their equivalent literal forms:
@@ -321,10 +309,7 @@ def cescape(string):
Returns:
(str): the string with color codes escaped
"""
string = str(string)
string = string.replace("@", "@@")
string = string.replace("}", "}}")
return string
return string.replace("@", "@@").replace("}", "}}")
class ColorStream:


@@ -1046,7 +1046,7 @@ def _extracts_errors(triggers, summary):
group="externals",
tag="PKG-EXTERNALS",
description="Sanity checks for external software detection",
kwargs=("pkgs",),
kwargs=("pkgs", "debug_log"),
)
@@ -1069,7 +1069,7 @@ def packages_with_detection_tests():
@external_detection
def _test_detection_by_executable(pkgs, error_cls):
def _test_detection_by_executable(pkgs, debug_log, error_cls):
"""Test drive external detection for packages"""
import spack.detection
@@ -1095,6 +1095,7 @@ def _test_detection_by_executable(pkgs, error_cls):
for idx, test_runner in enumerate(
spack.detection.detection_tests(pkg_name, spack.repo.PATH)
):
debug_log(f"[{__file__}]: running test {idx} for package {pkg_name}")
specs = test_runner.execute()
expected_specs = test_runner.expected_specs
@@ -1111,4 +1112,75 @@ def _test_detection_by_executable(pkgs, error_cls):
details = [msg.format(s, idx) for s in sorted(not_expected)]
errors.append(error_cls(summary=summary, details=details))
matched_detection = []
for candidate in expected_specs:
try:
idx = specs.index(candidate)
matched_detection.append((candidate, specs[idx]))
except (AttributeError, ValueError):
pass
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type
if not isinstance(_detected, type(_expected)):
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
# If they are string expected is a regex
if isinstance(_expected, str):
try:
_regex = re.compile(_expected)
except re.error:
_summary = f'{pkg_name}: illegal regex in "{_spec}" extra attributes'
_details = [f"{_expected} is not a valid regex"]
return [error_cls(summary=_summary, details=_details)]
if not _regex.match(_detected):
_summary = (
f'{pkg_name}: error when trying to match "{_expected}" '
f"in extra attributes"
)
_details = [f"{_detected} does not match the regex"]
return [error_cls(summary=_summary, details=_details)]
if isinstance(_expected, dict):
_not_detected = set(_expected.keys()) - set(_detected.keys())
if _not_detected:
_summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
_details = [
f'"{_expected}" was expected',
f'"{_detected}" was detected',
] + [f'attribute "{s}" was not detected' for s in sorted(_not_detected)]
result.append(error_cls(summary=_summary, details=_details))
_common = set(_expected.keys()) & set(_detected.keys())
for _key in _common:
result.extend(
_compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
)
return result
for expected, detected in matched_detection:
# We might not want to test all attributes, so avoid not_expected
not_detected = set(expected.extra_attributes) - set(detected.extra_attributes)
if not_detected:
summary = f"{pkg_name}: cannot detect some attributes for spec {expected}"
details = [
f'"{s}" was not detected [test_id={idx}]' for s in sorted(not_detected)
]
errors.append(error_cls(summary=summary, details=details))
common = set(expected.extra_attributes) & set(detected.extra_attributes)
for key in common:
errors.extend(
_compare_extra_attribute(
expected.extra_attributes[key],
detected.extra_attributes[key],
_spec=expected,
)
)
return errors


@@ -173,35 +173,14 @@ def _read_metadata(self, package_name: str) -> Any:
return data
def _install_by_hash(
self,
pkg_hash: str,
pkg_sha256: str,
index: List[spack.spec.Spec],
bincache_platform: spack.platforms.Platform,
self, pkg_hash: str, pkg_sha256: str, bincache_platform: spack.platforms.Platform
) -> None:
index_spec = next(x for x in index if x.dag_hash() == pkg_hash)
# Reconstruct the compiler that we need to use for bootstrapping
compiler_entry = {
"modules": [],
"operating_system": str(index_spec.os),
"paths": {
"cc": "/dev/null",
"cxx": "/dev/null",
"f77": "/dev/null",
"fc": "/dev/null",
},
"spec": str(index_spec.compiler),
"target": str(index_spec.target.family),
}
with spack.platforms.use_platform(bincache_platform):
with spack.config.override("compilers", [{"compiler": compiler_entry}]):
spec_str = "/" + pkg_hash
query = spack.binary_distribution.BinaryCacheQuery(all_architectures=True)
matches = spack.store.find([spec_str], multiple=False, query_fn=query)
for match in matches:
spack.binary_distribution.install_root_node(
match, unsigned=True, force=True, sha256=pkg_sha256
)
query = spack.binary_distribution.BinaryCacheQuery(all_architectures=True)
for match in spack.store.find([f"/{pkg_hash}"], multiple=False, query_fn=query):
spack.binary_distribution.install_root_node(
match, unsigned=True, force=True, sha256=pkg_sha256
)
def _install_and_test(
self,
@@ -232,7 +211,7 @@ def _install_and_test(
continue
for _, pkg_hash, pkg_sha256 in item["binaries"]:
self._install_by_hash(pkg_hash, pkg_sha256, index, bincache_platform)
self._install_by_hash(pkg_hash, pkg_sha256, bincache_platform)
info: ConfigDictionary = {}
if test_fn(query_spec=abstract_spec, query_info=info):
@@ -559,6 +538,41 @@ def ensure_patchelf_in_path_or_raise() -> spack.util.executable.Executable:
)
def ensure_winsdk_external_or_raise() -> None:
"""Ensure the Windows SDK + WGL are available on system
If both of these packages are found, the Spack user or bootstrap
configuration (depending on where Spack is running)
will be updated to include all versions and variants detected.
If either the WDK or WSDK is not found, this method will raise
a RuntimeError.
**NOTE:** This modifies the Spack config in the current scope,
either user or environment depending on the calling context.
This is different from all other current bootstrap dependency
checks.
"""
if set(["win-sdk", "wgl"]).issubset(spack.config.get("packages").keys()):
return
externals = spack.detection.by_path(["win-sdk", "wgl"])
if not set(["win-sdk", "wgl"]) == externals.keys():
missing_packages_lst = []
if "wgl" not in externals:
missing_packages_lst.append("wgl")
if "win-sdk" not in externals:
missing_packages_lst.append("win-sdk")
missing_packages = " & ".join(missing_packages_lst)
raise RuntimeError(
f"Unable to find the {missing_packages}, please install these packages \
via the Visual Studio installer \
before proceeding with Spack or provide the path to a non standard install with \
'spack external find --path'"
)
# wgl/sdk are not required for bootstrapping Spack, but
# are required for building anything non trivial
# add to user config so they can be used by subsequent Spack ops
spack.detection.update_configuration(externals, buildable=False)
def ensure_core_dependencies() -> None:
"""Ensure the presence of all the core dependencies."""
if sys.platform.lower() == "linux":


@@ -43,7 +43,7 @@
from collections import defaultdict
from enum import Flag, auto
from itertools import chain
from typing import List, Tuple
from typing import List, Set, Tuple
import llnl.util.tty as tty
from llnl.string import plural
@@ -68,6 +68,7 @@
import spack.repo
import spack.schema.environment
import spack.spec
import spack.stage
import spack.store
import spack.subprocess_context
import spack.user_environment
@@ -80,7 +81,7 @@
from spack.installer import InstallError
from spack.util.cpus import determine_number_of_jobs
from spack.util.environment import (
SYSTEM_DIRS,
SYSTEM_DIR_CASE_ENTRY,
EnvironmentModifications,
env_flag,
filter_system_paths,
@@ -103,9 +104,13 @@
# Spack's compiler wrappers.
#
SPACK_ENV_PATH = "SPACK_ENV_PATH"
SPACK_MANAGED_DIRS = "SPACK_MANAGED_DIRS"
SPACK_INCLUDE_DIRS = "SPACK_INCLUDE_DIRS"
SPACK_LINK_DIRS = "SPACK_LINK_DIRS"
SPACK_RPATH_DIRS = "SPACK_RPATH_DIRS"
SPACK_STORE_INCLUDE_DIRS = "SPACK_STORE_INCLUDE_DIRS"
SPACK_STORE_LINK_DIRS = "SPACK_STORE_LINK_DIRS"
SPACK_STORE_RPATH_DIRS = "SPACK_STORE_RPATH_DIRS"
SPACK_RPATH_DEPS = "SPACK_RPATH_DEPS"
SPACK_LINK_DEPS = "SPACK_LINK_DEPS"
SPACK_PREFIX = "SPACK_PREFIX"
@@ -418,7 +423,7 @@ def set_compiler_environment_variables(pkg, env):
env.set("SPACK_COMPILER_SPEC", str(spec.compiler))
env.set("SPACK_SYSTEM_DIRS", ":".join(SYSTEM_DIRS))
env.set("SPACK_SYSTEM_DIRS", SYSTEM_DIR_CASE_ENTRY)
compiler.setup_custom_environment(pkg, env)
@@ -546,9 +551,26 @@ def update_compiler_args_for_dep(dep):
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
env.set(SPACK_LINK_DIRS, ":".join(link_dirs))
env.set(SPACK_INCLUDE_DIRS, ":".join(include_dirs))
env.set(SPACK_RPATH_DIRS, ":".join(rpath_dirs))
# Spack managed directories include the stage, store and upstream stores. We extend this with
# their real paths to make it more robust (e.g. /tmp vs /private/tmp on macOS).
spack_managed_dirs: Set[str] = {
spack.stage.get_stage_root(),
spack.store.STORE.db.root,
*(db.root for db in spack.store.STORE.db.upstream_dbs),
}
spack_managed_dirs.update([os.path.realpath(p) for p in spack_managed_dirs])
env.set(SPACK_MANAGED_DIRS, "|".join(f'"{p}/"*' for p in sorted(spack_managed_dirs)))
is_spack_managed = lambda p: any(p.startswith(store) for store in spack_managed_dirs)
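# stable_partition splits each list into (spack-managed, other) halves, preserving order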
link_dirs_spack, link_dirs_system = stable_partition(link_dirs, is_spack_managed)
include_dirs_spack, include_dirs_system = stable_partition(include_dirs, is_spack_managed)
rpath_dirs_spack, rpath_dirs_system = stable_partition(rpath_dirs, is_spack_managed)
env.set(SPACK_LINK_DIRS, ":".join(link_dirs_system))
env.set(SPACK_INCLUDE_DIRS, ":".join(include_dirs_system))
env.set(SPACK_RPATH_DIRS, ":".join(rpath_dirs_system))
env.set(SPACK_STORE_LINK_DIRS, ":".join(link_dirs_spack))
env.set(SPACK_STORE_INCLUDE_DIRS, ":".join(include_dirs_spack))
env.set(SPACK_STORE_RPATH_DIRS, ":".join(rpath_dirs_spack))
def set_package_py_globals(pkg, context: Context = Context.BUILD):


@@ -16,7 +16,7 @@
class CargoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
"""Specialized class for packages built using cargo."""
#: This attribute is used in UI queries that need to know the build
#: system base class


@@ -21,7 +21,7 @@
class MakefilePackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
"""Specialized class for packages built using Makefiles."""
#: This attribute is used in UI queries that need to know the build
#: system base class


@@ -14,7 +14,7 @@
from llnl.util.link_tree import LinkTree
from spack.build_environment import dso_suffix
from spack.directives import conflicts, license, variant
from spack.directives import conflicts, license, redistribute, variant
from spack.package_base import InstallError
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -30,7 +30,7 @@ class IntelOneApiPackage(Package):
# oneAPI license does not allow mirroring outside of the
# organization (e.g. University/Company).
redistribute_source = False
redistribute(source=False, binary=False)
for c in [
"target=ppc64:",


@@ -80,6 +80,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.package_base import PackageBase
from spack.util.environment import EnvironmentModifications
class ROCmPackage(PackageBase):
@@ -156,30 +157,23 @@ def hip_flags(amdgpu_target):
archs = ",".join(amdgpu_target)
return "--amdgpu-target={0}".format(archs)
# ASAN
@staticmethod
def asan_on(env, llvm_path):
def asan_on(self, env: EnvironmentModifications):
llvm_path = self.spec["llvm-amdgpu"].prefix
env.set("CC", llvm_path + "/bin/clang")
env.set("CXX", llvm_path + "/bin/clang++")
env.set("ASAN_OPTIONS", "detect_leaks=0")
for root, dirs, files in os.walk(llvm_path):
for root, _, files in os.walk(llvm_path):
if "libclang_rt.asan-x86_64.so" in files:
asan_lib_path = root
env.prepend_path("LD_LIBRARY_PATH", asan_lib_path)
SET_DWARF_VERSION_4 = ""
try:
# This will throw an error if imported on a non-Linux platform.
import distro
distname = distro.id()
except ImportError:
distname = "unknown"
if "rhel" in distname or "sles" in distname:
if "rhel" in self.spec.os or "sles" in self.spec.os:
SET_DWARF_VERSION_4 = "-gdwarf-5"
else:
SET_DWARF_VERSION_4 = ""
env.set("CFLAGS", "-fsanitize=address -shared-libasan -g " + SET_DWARF_VERSION_4)
env.set("CXXFLAGS", "-fsanitize=address -shared-libasan -g " + SET_DWARF_VERSION_4)
env.set("CFLAGS", f"-fsanitize=address -shared-libasan -g {SET_DWARF_VERSION_4}")
env.set("CXXFLAGS", f"-fsanitize=address -shared-libasan -g {SET_DWARF_VERSION_4}")
env.set("LDFLAGS", "-Wl,--enable-new-dtags -fuse-ld=lld -fsanitize=address -g -Wl,")
# HIP version vs Architecture


@@ -16,8 +16,8 @@
import tempfile
import time
import zipfile
from collections import namedtuple
from typing import List, Optional
from collections import defaultdict, namedtuple
from typing import Dict, List, Optional, Set, Tuple
from urllib.error import HTTPError, URLError
from urllib.parse import urlencode
from urllib.request import HTTPHandler, Request, build_opener
@@ -113,54 +113,24 @@ def _remove_reserved_tags(tags):
return [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
def _spec_deps_key(s):
def _spec_ci_label(s):
return f"{s.name}/{s.dag_hash(7)}"
def _add_dependency(spec_label, dep_label, deps):
if spec_label == dep_label:
return
if spec_label not in deps:
deps[spec_label] = set()
deps[spec_label].add(dep_label)
PlainNodes = Dict[str, spack.spec.Spec]
PlainEdges = Dict[str, Set[str]]
def _get_spec_dependencies(specs, deps, spec_labels):
spec_deps_obj = _compute_spec_deps(specs)
if spec_deps_obj:
dependencies = spec_deps_obj["dependencies"]
specs = spec_deps_obj["specs"]
for entry in specs:
spec_labels[entry["label"]] = entry["spec"]
for entry in dependencies:
_add_dependency(entry["spec"], entry["depends"], deps)
def stage_spec_jobs(specs):
"""Take a set of release specs and generate a list of "stages", where the
jobs in any stage are dependent only on jobs in previous stages. This
allows us to maximize build parallelism within the gitlab-ci framework.
def stage_spec_jobs(specs: List[spack.spec.Spec]) -> Tuple[PlainNodes, PlainEdges, List[Set[str]]]:
"""Turn a DAG into a list of stages (set of nodes), the list is ordered topologically, so that
each node in a stage has dependencies only in previous stages.
Arguments:
specs (Iterable): Specs to build
Returns: A tuple of information objects describing the specs, dependencies
and stages:
spec_labels: A dictionary mapping the spec labels (which are formatted
as pkg-name/hash-prefix) to concrete specs.
deps: A dictionary where the keys should also have appeared as keys in
the spec_labels dictionary, and the values are the set of
dependencies for that spec.
stages: An ordered list of sets, each of which contains all the jobs to
built in that stage. The jobs are expressed in the same format as
the keys in the spec_labels and deps objects.
specs: Specs to build
Returns: A tuple (nodes, edges, stages) where ``nodes`` maps labels to specs, ``edges`` maps
labels to a set of labels of dependencies, and ``stages`` is a topologically ordered list
of sets of labels.
"""
# The convenience method below, "_remove_satisfied_deps()", does not modify
@@ -177,17 +147,12 @@ def _remove_satisfied_deps(deps, satisfied_list):
return new_deps
deps = {}
spec_labels = {}
nodes, edges = _extract_dag(specs)
_get_spec_dependencies(specs, deps, spec_labels)
# Save the original deps, as we need to return them at the end of the
# function. In the while loop below, the "dependencies" variable is
# overwritten rather than being modified each time through the loop,
# thus preserving the original value of "deps" saved here.
dependencies = deps
unstaged = set(spec_labels.keys())
# Save the original edges, as we need to return them at the end of the function. In the loop
# below, the "dependencies" variable is rebound rather than mutated, so "edges" is not mutated.
dependencies = edges
unstaged = set(nodes.keys())
stages = []
while dependencies:
@@ -203,7 +168,7 @@ def _remove_satisfied_deps(deps, satisfied_list):
if unstaged:
stages.append(unstaged.copy())
return spec_labels, deps, stages
return nodes, edges, stages
def _print_staging_summary(spec_labels, stages, mirrors_to_check, rebuild_decisions):
@@ -235,87 +200,22 @@ def _print_staging_summary(spec_labels, stages, mirrors_to_check, rebuild_decisi
tty.msg(msg)
def _compute_spec_deps(spec_list):
"""
Computes all the dependencies for the spec(s) and generates a JSON
object which provides both a list of unique spec names as well as a
comprehensive list of all the edges in the dependency graph. For
example, given a single spec like 'readline@7.0', this function
generates the following JSON object:
def _extract_dag(specs: List[spack.spec.Spec]) -> Tuple[PlainNodes, PlainEdges]:
"""Extract a sub-DAG as plain old Python objects with external nodes removed."""
nodes: PlainNodes = {}
edges: PlainEdges = defaultdict(set)
for edge in traverse.traverse_edges(specs, cover="edges"):
if (edge.parent and edge.parent.external) or edge.spec.external:
continue
child_id = _spec_ci_label(edge.spec)
nodes[child_id] = edge.spec
if edge.parent:
parent_id = _spec_ci_label(edge.parent)
nodes[parent_id] = edge.parent
edges[parent_id].add(child_id)
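# e.g. for a readline spec (labels shortened, cf. the removed example below):
#   nodes = {"readline/ip6aiun": <Spec>, "ncurses/y43rifz": <Spec>, ...}
#   edges = {"readline/ip6aiun": {"ncurses/y43rifz", ...}, ...}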

.. code-block:: JSON

{
"dependencies": [
{
"depends": "readline/ip6aiun",
"spec": "readline/ip6aiun"
},
{
"depends": "ncurses/y43rifz",
"spec": "readline/ip6aiun"
},
{
"depends": "ncurses/y43rifz",
"spec": "readline/ip6aiun"
},
{
"depends": "pkgconf/eg355zb",
"spec": "ncurses/y43rifz"
},
{
"depends": "pkgconf/eg355zb",
"spec": "readline/ip6aiun"
}
],
"specs": [
{
"spec": "readline@7.0%apple-clang@9.1.0 arch=darwin-highs...",
"label": "readline/ip6aiun"
},
{
"spec": "ncurses@6.1%apple-clang@9.1.0 arch=darwin-highsi...",
"label": "ncurses/y43rifz"
},
{
"spec": "pkgconf@1.5.4%apple-clang@9.1.0 arch=darwin-high...",
"label": "pkgconf/eg355zb"
}
]
}
"""
spec_labels = {}
specs = []
dependencies = []
def append_dep(s, d):
dependencies.append({"spec": s, "depends": d})
for spec in spec_list:
for s in spec.traverse(deptype="all"):
if s.external:
tty.msg(f"Will not stage external pkg: {s}")
continue
skey = _spec_deps_key(s)
spec_labels[skey] = s
for d in s.dependencies(deptype="all"):
dkey = _spec_deps_key(d)
if d.external:
tty.msg(f"Will not stage external dep: {d}")
continue
append_dep(skey, dkey)
for spec_label, concrete_spec in spec_labels.items():
specs.append({"label": spec_label, "spec": concrete_spec})
deps_json_obj = {"specs": specs, "dependencies": dependencies}
return deps_json_obj
return nodes, edges
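Note: the staging computed above is a classic topological layering (Kahn-style). A minimal self-contained sketch of the same idea, with a hypothetical helper name stage_dag and toy labels, assuming edges maps each label to the set of labels it depends on:

    from typing import Dict, List, Set

    def stage_dag(nodes: Set[str], edges: Dict[str, Set[str]]) -> List[Set[str]]:
        """Partition labels into stages; each stage depends only on earlier stages."""
        remaining = {label: set(edges.get(label, ())) for label in nodes}
        stages: List[Set[str]] = []
        while remaining:
            # A label is ready once all of its dependencies are already staged.
            ready = {label for label, deps in remaining.items() if not deps}
            if not ready:
                raise ValueError("cycle detected in dependency graph")
            stages.append(ready)
            for label in ready:
                del remaining[label]
            for deps in remaining.values():
                deps -= ready
        return stages

    # stage_dag({"a", "b", "c"}, {"a": {"b"}, "b": {"c"}}) -> [{"c"}, {"b"}, {"a"}]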
def _spec_matches(spec, match_string):
@@ -327,7 +227,7 @@ def _format_job_needs(
):
needs_list = []
for dep_job in dep_jobs:
dep_spec_key = _spec_deps_key(dep_job)
dep_spec_key = _spec_ci_label(dep_job)
rebuild = rebuild_decisions[dep_spec_key].rebuild
if not prune_dag or rebuild:

View File

@@ -334,8 +334,7 @@ def display_specs(specs, args=None, **kwargs):
variants (bool): Show variants with specs
indent (int): indent each line this much
groups (bool): display specs grouped by arch/compiler (default True)
decorators (dict): dictionary mapping specs to decorators
header_callback (typing.Callable): called at start of arch/compiler groups
decorator (typing.Callable): function to call to decorate specs
all_headers (bool): show headers even when arch/compiler aren't defined
output (typing.IO): A file object to write to. Default is ``sys.stdout``
@@ -384,15 +383,13 @@ def get_arg(name, default=None):
vfmt = "{variants}" if variants else ""
format_string = nfmt + "{@version}" + ffmt + vfmt
transform = {"package": decorator, "fullpackage": decorator}
def fmt(s, depth=0):
"""Formatter function for all output specs"""
string = ""
if hashes:
string += gray_hash(s, hlen) + " "
string += depth * " "
string += s.cformat(format_string, transform=transform)
string += decorator(s, s.cformat(format_string))
return string
def format_list(specs):
@@ -451,7 +448,7 @@ def filter_loaded_specs(specs):
return [x for x in specs if x.dag_hash() in hashes]
def print_how_many_pkgs(specs, pkg_type=""):
def print_how_many_pkgs(specs, pkg_type="", suffix=""):
"""Given a list of specs, this will print a message about how many
specs are in that list.
@@ -462,7 +459,7 @@ def print_how_many_pkgs(specs, pkg_type=""):
category, e.g. if pkg_type is "installed" then the message
would be "3 installed packages"
"""
tty.msg("%s" % llnl.string.plural(len(specs), pkg_type + " package"))
tty.msg("%s" % llnl.string.plural(len(specs), pkg_type + " package") + suffix)
def spack_is_git_repo():

View File

@@ -84,7 +84,7 @@ def externals(parser, args):
return
pkgs = args.name or spack.repo.PATH.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs, debug_log=tty.debug)
_process_reports(reports)

View File

@@ -133,6 +133,11 @@ def setup_parser(subparser: argparse.ArgumentParser):
help="when pushing to an OCI registry, tag an image containing all root specs and their "
"runtime dependencies",
)
push.add_argument(
"--private",
action="store_true",
help="for a private mirror, include non-redistributable packages",
)
arguments.add_common_arguments(push, ["specs", "jobs"])
push.set_defaults(func=push_fn)
@@ -367,6 +372,25 @@ def _make_pool() -> MaybePool:
return NoPool()
def _skip_no_redistribute_for_public(specs):
remaining_specs = list()
removed_specs = list()
for spec in specs:
if spec.package.redistribute_binary:
remaining_specs.append(spec)
else:
removed_specs.append(spec)
if removed_specs:
colified_output = tty.colify.colified(list(s.name for s in removed_specs), indent=4)
tty.debug(
"The following specs will not be added to the binary cache"
" because they cannot be redistributed:\n"
f"{colified_output}\n"
"You can use `--private` to include them."
)
return remaining_specs
def push_fn(args):
"""create a binary package and push it to a mirror"""
if args.spec_file:
@@ -417,6 +441,8 @@ def push_fn(args):
root="package" in args.things_to_install,
dependencies="dependencies" in args.things_to_install,
)
if not args.private:
specs = _skip_no_redistribute_for_public(specs)
# When pushing multiple specs, print the url once ahead of time, as well as how
# many specs are being pushed.

View File

@@ -563,12 +563,13 @@ def add_concretizer_args(subparser):
help="reuse installed packages/buildcaches when possible",
)
subgroup.add_argument(
"--fresh-roots",
"--reuse-deps",
action=ConfigSetAction,
dest="concretizer:reuse",
const="dependencies",
default=None,
help="reuse installed dependencies only",
help="concretize with fresh roots and reused dependencies",
)
subgroup.add_argument(
"--deprecated",

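Note: per the ConfigSetAction in this hunk (dest="concretizer:reuse", const="dependencies"), running spack concretize --fresh-roots (or the legacy --reuse-deps spelling) is equivalent to placing this in any config scope:

    concretizer:
      reuse: dependencies   # roots are concretized fresh; dependencies may be reused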
View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import copy
import sys
import llnl.util.lang
@@ -14,6 +13,7 @@
import spack.cmd as cmd
import spack.environment as ev
import spack.repo
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
@@ -69,6 +69,12 @@ def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["long", "very_long", "tags", "namespaces"])
subparser.add_argument(
"-r",
"--only-roots",
action="store_true",
help="don't show full list of installed specs in an environment",
)
subparser.add_argument(
"-c",
"--show-concretized",
@@ -189,26 +195,22 @@ def query_arguments(args):
return q_args
def setup_env(env):
def make_env_decorator(env):
"""Create a function for decorating specs when in an environment."""
def strip_build(seq):
return set(s.copy(deps=("link", "run")) for s in seq)
added = set(strip_build(env.added_specs()))
roots = set(strip_build(env.roots()))
removed = set(strip_build(env.removed_specs()))
roots = set(env.roots())
removed = set(env.removed_specs())
def decorator(spec, fmt):
# add +/-/* to show added/removed/root specs
if any(spec.dag_hash() == r.dag_hash() for r in roots):
return color.colorize("@*{%s}" % fmt)
return color.colorize(f"@*{{{fmt}}}")
elif spec in removed:
return color.colorize("@K{%s}" % fmt)
return color.colorize(f"@K{{{fmt}}}")
else:
return "%s" % fmt
return fmt
return decorator, added, roots, removed
return decorator
def display_env(env, args, decorator, results):
@@ -223,28 +225,51 @@ def display_env(env, args, decorator, results):
"""
tty.msg("In environment %s" % env.name)
if not env.user_specs:
tty.msg("No root specs")
else:
tty.msg("Root specs")
num_roots = len(env.user_specs) or "No"
tty.msg(f"{num_roots} root specs")
# Root specs cannot be displayed with prefixes, since those are not
# set for abstract specs. Same for hashes
root_args = copy.copy(args)
root_args.paths = False
concrete_specs = {
root: concrete_root
for root, concrete_root in zip(env.concretized_user_specs, env.concrete_roots())
}
# Roots are displayed with variants, etc. so that we can see
# specifically what the user asked for.
def root_decorator(spec, string):
"""Decorate root specs with their install status if needed"""
concrete = concrete_specs.get(spec)
if concrete:
status = color.colorize(concrete.install_status().value)
hash = concrete.dag_hash()
else:
status = color.colorize(spack.spec.InstallStatus.absent.value)
hash = "-" * 32
# TODO: status has two extra spaces on the end of it, but fixing this and other spec
# TODO: space format idiosyncrasies is complicated. Fix this eventually
status = status[:-2]
if args.long or args.very_long:
hash = color.colorize(f"@K{{{hash[: 7 if args.long else None]}}}")
return f"{status} {hash} {string}"
else:
return f"{status} {string}"
with spack.store.STORE.db.read_transaction():
cmd.display_specs(
env.user_specs,
root_args,
decorator=lambda s, f: color.colorize("@*{%s}" % f),
args,
# these are overrides of CLI args
paths=False,
long=False,
very_long=False,
# these enforce details in the root specs to show what the user asked for
namespaces=True,
show_flags=True,
show_full_compiler=True,
decorator=root_decorator,
variants=True,
)
print()
print()
if args.show_concretized:
tty.msg("Concretized roots")
@@ -254,7 +279,7 @@ def display_env(env, args, decorator, results):
# Display a header for the installed packages section IF there are installed
# packages. If there aren't any, we'll just end up printing "0 installed packages"
# later.
if results:
if results and not args.only_roots:
tty.msg("Installed packages")
@@ -263,9 +288,10 @@ def find(parser, args):
results = args.specs(**q_args)
env = ev.active_environment()
decorator = lambda s, f: f
if env:
decorator, _, roots, _ = setup_env(env)
if not env and args.only_roots:
tty.die("-r / --only-roots requires an active environment")
decorator = make_env_decorator(env) if env else lambda s, f: f
# use groups by default except with format.
if args.groups is None:
@@ -292,9 +318,12 @@ def find(parser, args):
if env:
display_env(env, args, decorator, results)
cmd.display_specs(results, args, decorator=decorator, all_headers=True)
count_suffix = " (not shown)"
if not args.only_roots:
cmd.display_specs(results, args, decorator=decorator, all_headers=True)
count_suffix = ""
# print number of installed packages last (as the list may be long)
if sys.stdout.isatty() and args.groups:
pkg_type = "loaded" if args.loaded else "installed"
spack.cmd.print_how_many_pkgs(results, pkg_type)
spack.cmd.print_how_many_pkgs(results, pkg_type, suffix=count_suffix)

View File

@@ -9,7 +9,7 @@
import spack.environment as ev
import spack.store
from spack.cmd.common import arguments
from spack.graph import DAGWithDependencyTypes, SimpleDAG, graph_ascii, graph_dot, static_graph_dot
from spack.graph import (
    DAGWithDependencyTypes,
    DotGraph,
    MermaidGraph,
    SimpleDAG,
    graph_ascii,
    graph_dot,
    static_graph_dot,
)
description = "generate graphs of package dependency relationships"
section = "basic"
@@ -33,6 +33,9 @@ def setup_parser(subparser):
method.add_argument(
"-d", "--dot", action="store_true", help="generate graph in dot format and print to stdout"
)
method.add_argument(
"-m", "--mermaid", action="store_true", help="generate graph in mermaid format and print to stdout"
)
subparser.add_argument(
"-s",
@@ -85,10 +88,14 @@ def graph(parser, args):
static_graph_dot(specs, depflag=args.deptype)
return
if args.dot:
builder = SimpleDAG()
if args.dot or args.mermaid:
if args.dot:
graph = DotGraph()
if args.mermaid:
graph = MermaidGraph()
builder = SimpleDAG(graph=graph)
if args.color:
builder = DAGWithDependencyTypes()
builder = DAGWithDependencyTypes(graph=graph)
graph_dot(specs, builder=builder, depflag=args.deptype)
return

View File

@@ -263,8 +263,8 @@ def _fmt_name_and_default(variant):
return color.colorize(f"@c{{{variant.name}}} @C{{[{_fmt_value(variant.default)}]}}")
def _fmt_when(when, indent):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(when)}")
def _fmt_when(when: "spack.spec.Spec", indent: int):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(str(when))}")
def _fmt_variant_description(variant, width, indent):
@@ -441,7 +441,7 @@ def get_url(version):
return "No URL"
url = get_url(preferred) if pkg.has_code else ""
line = version(" {0}".format(pad(preferred))) + color.cescape(url)
line = version(" {0}".format(pad(preferred))) + color.cescape(str(url))
color.cwrite(line)
print()
@@ -464,7 +464,7 @@ def get_url(version):
continue
for v, url in vers:
line = version(" {0}".format(pad(v))) + color.cescape(url)
line = version(" {0}".format(pad(v))) + color.cescape(str(url))
color.cprint(line)
@@ -475,10 +475,7 @@ def print_virtuals(pkg, args):
color.cprint(section_title("Virtual Packages: "))
if pkg.provided:
for when, specs in reversed(sorted(pkg.provided.items())):
line = " %s provides %s" % (
when.colorized(),
", ".join(s.colorized() for s in specs),
)
line = " %s provides %s" % (when.cformat(), ", ".join(s.cformat() for s in specs))
print(line)
else:
@@ -497,7 +494,9 @@ def print_licenses(pkg, args):
pad = padder(pkg.licenses, 4)
for when_spec in pkg.licenses:
license_identifier = pkg.licenses[when_spec]
line = license(" {0}".format(pad(license_identifier))) + color.cescape(when_spec)
line = license(" {0}".format(pad(license_identifier))) + color.cescape(
str(when_spec)
)
color.cprint(line)

View File

@@ -71,6 +71,11 @@ def setup_parser(subparser):
help="the number of versions to fetch for each spec, choose 'all' to"
" retrieve all versions of each package",
)
create_parser.add_argument(
"--private",
action="store_true",
help="for a private mirror, include non-redistributable packages",
)
arguments.add_common_arguments(create_parser, ["specs"])
arguments.add_concretizer_args(create_parser)
@@ -359,7 +364,6 @@ def concrete_specs_from_user(args):
specs = filter_externals(specs)
specs = list(set(specs))
specs.sort(key=lambda s: (s.name, s.version))
specs, _ = lang.stable_partition(specs, predicate_fn=not_excluded_fn(args))
return specs
@@ -404,36 +408,50 @@ def concrete_specs_from_cli_or_file(args):
return specs
def not_excluded_fn(args):
"""Return a predicate that evaluate to True if a spec was not explicitly
excluded by the user.
"""
exclude_specs = []
if args.exclude_file:
exclude_specs.extend(specs_from_text_file(args.exclude_file, concretize=False))
if args.exclude_specs:
exclude_specs.extend(spack.cmd.parse_specs(str(args.exclude_specs).split()))
class IncludeFilter:
def __init__(self, args):
self.exclude_specs = []
if args.exclude_file:
self.exclude_specs.extend(specs_from_text_file(args.exclude_file, concretize=False))
if args.exclude_specs:
self.exclude_specs.extend(spack.cmd.parse_specs(str(args.exclude_specs).split()))
self.private = args.private
def not_excluded(x):
return not any(x.satisfies(y) for y in exclude_specs)
def __call__(self, x):
return all([self._not_license_excluded(x), self._not_cmdline_excluded(x)])
return not_excluded
def _not_license_excluded(self, x):
"""True if the spec is for a private mirror, or as long as the
package does not explicitly forbid redistributing source."""
if self.private:
return True
elif x.package_class.redistribute_source(x):
return True
else:
tty.debug(
"Skip adding {0} to mirror: the package.py file"
" indicates that a public mirror should not contain"
" it.".format(x.name)
)
return False
def _not_cmdline_excluded(self, x):
"""True if a spec was not explicitly excluded by the user."""
return not any(x.satisfies(y) for y in self.exclude_specs)
def concrete_specs_from_environment(selection_fn):
def concrete_specs_from_environment():
env = ev.active_environment()
assert env, "an active environment is required"
mirror_specs = env.all_specs()
mirror_specs = filter_externals(mirror_specs)
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=selection_fn)
return mirror_specs
def all_specs_with_all_versions(selection_fn):
def all_specs_with_all_versions():
specs = [spack.spec.Spec(n) for n in spack.repo.all_package_names()]
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs.sort(key=lambda s: (s.name, s.version))
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=selection_fn)
return mirror_specs
@@ -454,12 +472,6 @@ def versions_per_spec(args):
return num_versions
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
def process_mirror_stats(present, mirrored, error):
p, m, e = len(present), len(mirrored), len(error)
tty.msg(
@@ -505,30 +517,28 @@ def mirror_create(args):
# When no directory is provided, the source dir is used
path = args.directory or spack.caches.fetch_cache_location()
mirror_specs, mirror_fn = _specs_and_action(args)
mirror_fn(mirror_specs, path=path, skip_unstable_versions=args.skip_unstable_versions)
def _specs_and_action(args):
include_fn = IncludeFilter(args)
if args.all and not ev.active_environment():
create_mirror_for_all_specs(
path=path,
skip_unstable_versions=args.skip_unstable_versions,
selection_fn=not_excluded_fn(args),
)
return
mirror_specs = all_specs_with_all_versions()
mirror_fn = create_mirror_for_all_specs
elif args.all and ev.active_environment():
mirror_specs = concrete_specs_from_environment()
mirror_fn = create_mirror_for_individual_specs
else:
mirror_specs = concrete_specs_from_user(args)
mirror_fn = create_mirror_for_individual_specs
if args.all and ev.active_environment():
create_mirror_for_all_specs_inside_environment(
path=path,
skip_unstable_versions=args.skip_unstable_versions,
selection_fn=not_excluded_fn(args),
)
return
mirror_specs = concrete_specs_from_user(args)
create_mirror_for_individual_specs(
mirror_specs, path=path, skip_unstable_versions=args.skip_unstable_versions
)
mirror_specs, _ = lang.stable_partition(mirror_specs, predicate_fn=include_fn)
return mirror_specs, mirror_fn
def create_mirror_for_all_specs(path, skip_unstable_versions, selection_fn):
mirror_specs = all_specs_with_all_versions(selection_fn=selection_fn)
def create_mirror_for_all_specs(mirror_specs, path, skip_unstable_versions):
mirror_cache, mirror_stats = spack.mirror.mirror_cache_and_stats(
path, skip_unstable_versions=skip_unstable_versions
)
@@ -540,11 +550,10 @@ def create_mirror_for_all_specs(path, skip_unstable_versions, selection_fn):
process_mirror_stats(*mirror_stats.stats())
def create_mirror_for_all_specs_inside_environment(path, skip_unstable_versions, selection_fn):
mirror_specs = concrete_specs_from_environment(selection_fn=selection_fn)
create_mirror_for_individual_specs(
mirror_specs, path=path, skip_unstable_versions=skip_unstable_versions
)
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
def mirror_destroy(args):

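Note: _specs_and_action above relies on lang.stable_partition to split the candidate specs by the IncludeFilter predicate while preserving order. A sketch of that helper's behavior, as an illustrative reimplementation rather than the vendored llnl.util.lang code:

    from typing import Callable, Iterable, List, Tuple, TypeVar

    T = TypeVar("T")

    def stable_partition(
        items: Iterable[T], predicate_fn: Callable[[T], bool]
    ) -> Tuple[List[T], List[T]]:
        """Split items into (matching, non-matching) lists, preserving input order."""
        matching: List[T] = []
        rest: List[T] = []
        for item in items:
            (matching if predicate_fn(item) else rest).append(item)
        return matching, rest

    # e.g. mirror_specs, excluded = stable_partition(candidate_specs, include_fn)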
View File

@@ -214,8 +214,6 @@ def unit_test(parser, args, unknown_args):
# Ensure clingo is available before switching to the
# mock configuration used by unit tests
# Note: skip on windows here because for the moment,
# clingo is wholly unsupported from bootstrap
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
if pytest is None:

View File

@@ -20,8 +20,10 @@
import spack.compilers
import spack.error
import spack.schema.environment
import spack.spec
import spack.util.executable
import spack.util.libc
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
@@ -107,7 +109,6 @@ def _parse_link_paths(string):
"""
lib_search_paths = False
raw_link_dirs = []
tty.debug("parsing implicit link info")
for line in string.splitlines():
if lib_search_paths:
if line.startswith("\t"):
@@ -122,7 +123,7 @@ def _parse_link_paths(string):
continue
if _LINKER_LINE_IGNORE.match(line):
continue
tty.debug("linker line: %s" % line)
tty.debug(f"implicit link dirs: link line: {line}")
next_arg = False
for arg in line.split():
@@ -138,15 +139,12 @@ def _parse_link_paths(string):
link_dir_arg = _LINK_DIR_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
tty.debug("linkdir: %s" % link_dir)
raw_link_dirs.append(link_dir)
link_dir_arg = _LIBPATH_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
tty.debug("libpath: %s", link_dir)
raw_link_dirs.append(link_dir)
tty.debug("found raw link dirs: %s" % ", ".join(raw_link_dirs))
implicit_link_dirs = list()
visited = set()
@@ -156,7 +154,7 @@ def _parse_link_paths(string):
implicit_link_dirs.append(normalized_path)
visited.add(normalized_path)
tty.debug("found link dirs: %s" % ", ".join(implicit_link_dirs))
tty.debug(f"implicit link dirs: result: {', '.join(implicit_link_dirs)}")
return implicit_link_dirs
@@ -417,17 +415,35 @@ def real_version(self):
self._real_version = self.version
return self._real_version
def implicit_rpaths(self):
def implicit_rpaths(self) -> List[str]:
if self.enable_implicit_rpaths is False:
return []
# Put CXX first since it has the most linking issues
# And because it has flags that affect linking
link_dirs = self._get_compiler_link_paths()
output = self.compiler_verbose_output
if not output:
return []
link_dirs = _parse_non_system_link_dirs(output)
all_required_libs = list(self.required_libs) + Compiler._all_compiler_rpath_libraries
return list(paths_containing_libs(link_dirs, all_required_libs))
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
output = self.compiler_verbose_output
if not output:
return None
dynamic_linker = spack.util.libc.parse_dynamic_linker(output)
if not dynamic_linker:
return None
return spack.util.libc.libc_from_dynamic_linker(dynamic_linker)
@property
def required_libs(self):
"""For executables created with this compiler, the compiler libraries
@@ -436,17 +452,17 @@ def required_libs(self):
# By default every compiler returns the empty list
return []
def _get_compiler_link_paths(self):
@property
def compiler_verbose_output(self) -> Optional[str]:
"""Verbose output from compiling a dummy C source file. Output is cached."""
if not hasattr(self, "_compile_c_source_output"):
self._compile_c_source_output = self._compile_dummy_c_source()
return self._compile_c_source_output
def _compile_dummy_c_source(self) -> Optional[str]:
cc = self.cc if self.cc else self.cxx
if not cc or not self.verbose_flag:
# Cannot determine implicit link paths without a compiler / verbose flag
return []
# What flag types apply to first_compiler, in what order
if cc == self.cc:
flags = ["cflags", "cppflags", "ldflags"]
else:
flags = ["cxxflags", "cppflags", "ldflags"]
return None
try:
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
@@ -458,20 +474,19 @@ def _get_compiler_link_paths(self):
"int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n"
)
cc_exe = spack.util.executable.Executable(cc)
for flag_type in flags:
for flag_type in ["cflags" if cc == self.cc else "cxxflags", "cppflags", "ldflags"]:
cc_exe.add_default_arg(*self.flags.get(flag_type, []))
with self.compiler_environment():
output = cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
return _parse_non_system_link_dirs(output)
return cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
except spack.util.executable.ProcessError as pe:
tty.debug("ProcessError: Command exited with non-zero status: " + pe.long_message)
return []
return None
finally:
shutil.rmtree(tmpdir, ignore_errors=True)
@property
def verbose_flag(self):
def verbose_flag(self) -> Optional[str]:
"""
This property should be overridden in the compiler subclass if a
verbose flag is available.
@@ -669,8 +684,8 @@ def __str__(self):
@contextlib.contextmanager
def compiler_environment(self):
# yield immediately if no modules
if not self.modules:
# Avoid modifying os.environ if possible.
if not self.modules and not self.environment:
yield
return
@@ -687,13 +702,9 @@ def compiler_environment(self):
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes
env = spack.util.environment.EnvironmentModifications()
env.extend(spack.schema.environment.parse(self.environment))
env.apply_modifications()
spack.schema.environment.parse(self.environment).apply_modifications()
yield
except BaseException:
raise
finally:
# Restore environment regardless of whether inner code succeeded
os.environ.clear()

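Note: stripped of Spack's Executable wrapper and flag handling, _compile_dummy_c_source amounts to compiling a throwaway C file with the compiler's verbose flag and capturing the output. A standalone sketch using only the standard library; the compiler path and verbose flag are caller-supplied assumptions:

    import os
    import subprocess
    import tempfile
    from typing import Optional

    def compiler_verbose_output(cc: str, verbose_flag: str = "-v") -> Optional[str]:
        """Compile a dummy C file and return the compiler's verbose output, if any."""
        with tempfile.TemporaryDirectory(prefix="spack-implicit-link-info") as tmpdir:
            fin = os.path.join(tmpdir, "main.c")
            fout = os.path.join(tmpdir, "output")
            with open(fin, "w") as f:
                f.write("int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n")
            try:
                result = subprocess.run(
                    [cc, verbose_flag, fin, "-o", fout],
                    capture_output=True, text=True, check=True
                )
            except (OSError, subprocess.CalledProcessError):
                return None
            # Most compilers print implicit link directories on stderr.
            return result.stderr + result.stdout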
View File

@@ -967,10 +967,11 @@ def _default_make_compilers(cmp_id, paths):
make_mixed_toolchain(flat_compilers)
# Finally, create the compiler list
compilers = []
compilers: List["spack.compiler.Compiler"] = []
for compiler_id, _, compiler in flat_compilers:
make_compilers = getattr(compiler_id.os, "make_compilers", _default_make_compilers)
compilers.extend(make_compilers(compiler_id, compiler))
candidates = make_compilers(compiler_id, compiler)
compilers.extend(x for x in candidates if x.cc is not None)
return compilers

View File

@@ -38,10 +38,10 @@ class Clang(Compiler):
cxx_names = ["clang++"]
# Subclasses use possible names of Fortran 77 compiler
f77_names = ["flang"]
f77_names = ["flang-new", "flang"]
# Subclasses use possible names of Fortran 90 compiler
fc_names = ["flang"]
fc_names = ["flang-new", "flang"]
version_argument = "--version"
@@ -171,10 +171,11 @@ def extract_version_from_output(cls, output):
match = re.search(
# Normal clang compiler versions are left as-is
r"clang version ([^ )\n]+)-svn[~.\w\d-]*|"
r"(?:clang|flang-new) version ([^ )\n]+)-svn[~.\w\d-]*|"
# Don't include hyphenated patch numbers in the version
# (see https://github.com/spack/spack/pull/14365 for details)
r"clang version ([^ )\n]+?)-[~.\w\d-]*|" r"clang version ([^ )\n]+)",
r"(?:clang|flang-new) version ([^ )\n]+?)-[~.\w\d-]*|"
r"(?:clang|flang-new) version ([^ )\n]+)",
output,
)
if match:

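Note: a quick sanity check of the widened regex from this hunk; the banner strings below are made up:

    import re

    VERSION_RE = re.compile(
        r"(?:clang|flang-new) version ([^ )\n]+)-svn[~.\w\d-]*|"
        r"(?:clang|flang-new) version ([^ )\n]+?)-[~.\w\d-]*|"
        r"(?:clang|flang-new) version ([^ )\n]+)"
    )

    for banner in ("clang version 17.0.6", "flang-new version 18.1.4"):
        match = VERSION_RE.search(banner)
        print(next(group for group in match.groups() if group))
    # -> 17.0.6
    # -> 18.1.4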
View File

@@ -8,7 +8,7 @@
import subprocess
import sys
import tempfile
from typing import Dict, List, Set
from typing import Dict, List
import archspec.cpu
@@ -20,15 +20,7 @@
from spack.error import SpackError
from spack.version import Version, VersionRange
avail_fc_version: Set[str] = set()
fc_path: Dict[str, str] = dict()
fortran_mapping = {
"2021.3.0": "19.29.30133",
"2021.2.1": "19.28.29913",
"2021.2.0": "19.28.29334",
"2021.1.0": "19.28.29333",
}
FC_PATH: Dict[str, str] = dict()
class CmdCall:
@@ -115,15 +107,13 @@ def command_str(self):
return f"{script} {self.arch} {self.sdk_ver} {self.vcvars_ver}"
def get_valid_fortran_pth(comp_ver):
cl_ver = str(comp_ver)
def get_valid_fortran_pth():
"""Assign maximum available fortran compiler version"""
# TODO (johnwparent): validate compatibility w/ try compiler
# functionality when added
sort_fn = lambda fc_ver: Version(fc_ver)
sort_fc_ver = sorted(list(avail_fc_version), key=sort_fn)
for ver in sort_fc_ver:
if ver in fortran_mapping:
if Version(cl_ver) <= Version(fortran_mapping[ver]):
return fc_path[ver]
return None
sort_fc_ver = sorted(list(FC_PATH.keys()), key=sort_fn)
return FC_PATH[sort_fc_ver[-1]] if sort_fc_ver else None
class Msvc(Compiler):
@@ -167,11 +157,9 @@ def __init__(self, *args, **kwargs):
# This positional argument "paths" is later parsed and process by the base class
# via the call to `super` later in this method
paths = args[3]
# This positional argument "cspec" is also parsed and handled by the base class
# constructor
cspec = args[0]
new_pth = [pth if pth else get_valid_fortran_pth(cspec.version) for pth in paths]
paths[:] = new_pth
latest_fc = get_valid_fortran_pth()
new_pth = [pth if pth else latest_fc for pth in paths[2:]]
paths[2:] = new_pth
# Initialize, deferring to base class but then adding the vcvarsallfile
# file based on compiler executable path.
super().__init__(*args, **kwargs)
@@ -183,7 +171,7 @@ def __init__(self, *args, **kwargs):
# and stores their path, but their respective VCVARS
# file must be invoked before usage.
env_cmds = []
compiler_root = os.path.join(self.cc, "../../../../../../..")
compiler_root = os.path.join(os.path.dirname(self.cc), "../../../../../..")
vcvars_script_path = os.path.join(compiler_root, "Auxiliary", "Build", "vcvars64.bat")
# get current platform architecture and format for vcvars argument
arch = spack.platforms.real_host().default.lower()
@@ -198,11 +186,34 @@ def __init__(self, *args, **kwargs):
# paths[2] refers to the fc path and is a generic check
# for a fortran compiler
if paths[2]:
def get_oneapi_root(pth: str):
"""From within a prefix known to be a oneAPI path
determine the oneAPI root path from arbitrary point
under root
Args:
pth: path prefixed within oneAPI root
"""
if not pth:
return ""
while os.path.basename(pth) and os.path.basename(pth) != "oneAPI":
pth = os.path.dirname(pth)
return pth
# If this script is found, it sets up all the oneAPI variables
oneapi_root = os.path.join(self.cc, "../../..")
oneapi_root = get_oneapi_root(self.fc)
if not oneapi_root:
raise RuntimeError(f"Non-oneAPI Fortran compiler {self.fc} assigned to MSVC")
oneapi_root_setvars = os.path.join(oneapi_root, "setvars.bat")
# some oneAPI exes return a version more precise than their
# install paths specify, so we determine path from
# the install path rather than the fc executable itself
numver = r"\d+\.\d+(?:\.\d+)?"
pattern = f"((?:{numver})|(?:latest))"
version_from_path = re.search(pattern, self.fc).group(1)
oneapi_version_setvars = os.path.join(
oneapi_root, "compiler", str(self.ifx_version), "env", "vars.bat"
oneapi_root, "compiler", version_from_path, "env", "vars.bat"
)
# order matters here, the specific version env must be invoked first,
# otherwise it will be ignored if the root setvars sets up the oneapi
@@ -314,23 +325,19 @@ def setup_custom_environment(self, pkg, env):
@classmethod
def fc_version(cls, fc):
# We're using intel for the Fortran compilers, which exist if
# ONEAPI_ROOT is a meaningful variable
if not sys.platform == "win32":
return "unknown"
fc_ver = cls.default_version(fc)
avail_fc_version.add(fc_ver)
fc_path[fc_ver] = fc
if os.getenv("ONEAPI_ROOT"):
try:
sps = spack.operating_systems.windows_os.WindowsOs().compiler_search_paths
except AttributeError:
raise SpackError("Windows compiler search paths not established")
clp = spack.util.executable.which_string("cl", path=sps)
ver = cls.default_version(clp)
else:
ver = fc_ver
return ver
FC_PATH[fc_ver] = fc
try:
sps = spack.operating_systems.windows_os.WindowsOs().compiler_search_paths
except AttributeError:
raise SpackError(
"Windows compiler search paths not established, "
"please report this behavior to github.com/spack/spack"
)
clp = spack.util.executable.which_string("cl", path=sps)
return cls.default_version(clp) if clp else fc_ver
@classmethod
def f77_version(cls, f77):

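Note: the version_from_path extraction above can be exercised in isolation; the sample installation path below is hypothetical:

    import re

    numver = r"\d+\.\d+(?:\.\d+)?"
    pattern = f"((?:{numver})|(?:latest))"

    fc = r"C:\Program Files (x86)\Intel\oneAPI\compiler\2023.2.1\windows\bin\ifx.exe"
    print(re.search(pattern, fc).group(1))  # -> 2023.2.1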
View File

@@ -64,7 +64,7 @@ def verbose_flag(self):
#
# This way, we at least enable the implicit rpath detection, which is
# based on compilation of a C file (see method
# spack.compiler._get_compiler_link_paths): in the case of a mixed
# spack.compiler._compile_dummy_c_source): in the case of a mixed
# NAG/GCC toolchain, the flag will be passed to g++ (e.g.
# 'g++ -Wl,-v ./main.c'), otherwise, the flag will be passed to nagfor
# (e.g. 'nagfor -Wl,-v ./main.c' - note that nagfor recognizes '.c'

View File

@@ -74,6 +74,10 @@ class Concretizer:
#: during concretization. Used for testing and for mirror creation
check_for_compiler_existence = None
#: Packages that the old concretizer cannot deal with correctly, and cannot build anyway.
#: Those will not be considered as providers for virtuals.
non_buildable_packages = {"glibc", "musl"}
def __init__(self, abstract_spec=None):
if Concretizer.check_for_compiler_existence is None:
Concretizer.check_for_compiler_existence = not spack.config.get(
@@ -113,7 +117,11 @@ def _valid_virtuals_and_externals(self, spec):
pref_key = lambda spec: 0 # no-op pref key
if spec.virtual:
candidates = spack.repo.PATH.providers_for(spec)
candidates = [
s
for s in spack.repo.PATH.providers_for(spec)
if s.name not in self.non_buildable_packages
]
if not candidates:
raise spack.error.UnsatisfiableProviderSpecError(candidates[0], spec)

View File

@@ -1562,8 +1562,9 @@ def ensure_latest_format_fn(section: str) -> Callable[[YamlConfigDict], bool]:
def use_configuration(
*scopes_or_paths: Union[ConfigScope, str]
) -> Generator[Configuration, None, None]:
"""Use the configuration scopes passed as arguments within the
context manager.
"""Use the configuration scopes passed as arguments within the context manager.
This function invalidates caches, and is therefore very slow.
Args:
*scopes_or_paths: scope objects or paths to be used

View File

@@ -12,9 +12,31 @@
},
"os_package_manager": "yum_amazon"
},
"fedora:40": {
"bootstrap": {
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:40"
},
"os_package_manager": "dnf",
"build": "spack/fedora40",
"final": {
"image": "docker.io/fedora:40"
}
},
"fedora:39": {
"bootstrap": {
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:39"
},
"os_package_manager": "dnf",
"build": "spack/fedora39",
"final": {
"image": "docker.io/fedora:39"
}
},
"fedora:38": {
"bootstrap": {
"template": "container/fedora_38.dockerfile",
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:38"
},
"os_package_manager": "dnf",
@@ -25,7 +47,7 @@
},
"fedora:37": {
"bootstrap": {
"template": "container/fedora_37.dockerfile",
"template": "container/fedora.dockerfile",
"image": "docker.io/fedora:37"
},
"os_package_manager": "dnf",
@@ -116,6 +138,13 @@
},
"os_package_manager": "apt"
},
"ubuntu:24.04": {
"bootstrap": {
"template": "container/ubuntu_2404.dockerfile"
},
"os_package_manager": "apt",
"build": "spack/ubuntu-noble"
},
"ubuntu:22.04": {
"bootstrap": {
"template": "container/ubuntu_2204.dockerfile"

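Note: with the fedora:39/40 and ubuntu:24.04 entries added, an environment can request those images through the usual container section; a minimal spack.yaml sketch, with an illustrative spec list:

    spack:
      specs:
      - zlib
      container:
        format: docker
        images:
          os: ubuntu:24.04
          spack: develop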
View File

@@ -25,6 +25,7 @@
import socket
import sys
import time
from json import JSONDecoder
from typing import (
Any,
Callable,
@@ -818,7 +819,8 @@ def _read_from_file(self, filename):
"""
try:
with open(filename, "r") as f:
fdata = sjson.load(f)
# In the future we may use a stream of JSON objects, hence `raw_decode` for compat.
fdata, _ = JSONDecoder().raw_decode(f.read())
except Exception as e:
raise CorruptDatabaseError("error parsing database:", str(e)) from e
@@ -833,27 +835,24 @@ def check(cond, msg):
# High-level file checks
db = fdata["database"]
check("installs" in db, "no 'installs' in JSON DB.")
check("version" in db, "no 'version' in JSON DB.")
installs = db["installs"]
# TODO: better version checking semantics.
version = vn.Version(db["version"])
if version > _DB_VERSION:
raise InvalidDatabaseVersionError(self, _DB_VERSION, version)
elif version < _DB_VERSION:
if not any(old == version and new == _DB_VERSION for old, new in _SKIP_REINDEX):
tty.warn(
"Spack database version changed from %s to %s. Upgrading."
% (version, _DB_VERSION)
)
elif version < _DB_VERSION and not any(
old == version and new == _DB_VERSION for old, new in _SKIP_REINDEX
):
tty.warn(f"Spack database version changed from {version} to {_DB_VERSION}. Upgrading.")
self.reindex(spack.store.STORE.layout)
installs = dict(
(k, v.to_dict(include_fields=self._record_fields))
for k, v in self._data.items()
)
self.reindex(spack.store.STORE.layout)
installs = dict(
(k, v.to_dict(include_fields=self._record_fields)) for k, v in self._data.items()
)
else:
check("installs" in db, "no 'installs' in JSON DB.")
installs = db["installs"]
spec_reader = reader(version)

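Note: raw_decode parses one JSON document from the front of the text and reports where parsing stopped, which is what leaves room for a future stream of concatenated objects:

    from json import JSONDecoder

    text = '{"database": {"version": "7"}} trailing bytes'
    fdata, end = JSONDecoder().raw_decode(text)
    print(fdata)        # {'database': {'version': '7'}}
    print(text[end:])   # ' trailing bytes' -- ignored instead of raising an error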
View File

@@ -83,26 +83,15 @@ def executables_in_path(path_hints: List[str]) -> Dict[str, str]:
return path_to_dict(search_paths)
def get_elf_compat(path):
"""For ELF files, get a triplet (EI_CLASS, EI_DATA, e_machine) and see if
it is host-compatible."""
# On platforms supporting ELF, we try to be a bit smarter about shared
# libraries, by dropping those that are not host compatible.
with open(path, "rb") as f:
elf = elf_utils.parse_elf(f, only_header=True)
return (elf.is_64_bit, elf.is_little_endian, elf.elf_hdr.e_machine)
def accept_elf(path, host_compat):
"""Accept an ELF file if the header matches the given compat triplet,
obtained with :py:func:`get_elf_compat`. In case it's not an ELF (e.g.
static library, or some arbitrary file, fall back to is_readable_file)."""
"""Accept an ELF file if the header matches the given compat triplet. In case it's not an ELF
(e.g. static library, or some arbitrary file, fall back to is_readable_file)."""
# Fast path: assume libraries at least have .so in their basename.
# Note: don't replace with splitext, because of libsmth.so.1.2.3 file names.
if ".so" not in os.path.basename(path):
return llnl.util.filesystem.is_readable_file(path)
try:
return host_compat == get_elf_compat(path)
return host_compat == elf_utils.get_elf_compat(path)
except (OSError, elf_utils.ElfParsingError):
return llnl.util.filesystem.is_readable_file(path)
@@ -155,7 +144,7 @@ def libraries_in_ld_and_system_library_path(
search_paths = list(llnl.util.lang.dedupe(search_paths, key=file_identifier))
try:
host_compat = get_elf_compat(sys.executable)
host_compat = elf_utils.get_elf_compat(sys.executable)
accept = lambda path: accept_elf(path, host_compat)
except (OSError, elf_utils.ElfParsingError):
accept = llnl.util.filesystem.is_readable_file

View File

@@ -11,6 +11,7 @@
from llnl.util import filesystem
import spack.platforms
import spack.repo
import spack.spec
from spack.util import spack_yaml
@@ -32,6 +33,8 @@ class ExpectedTestResult(NamedTuple):
#: Spec to be detected
spec: str
#: Attributes expected in the external spec
extra_attributes: Dict[str, str]
class DetectionTest(NamedTuple):
@@ -100,7 +103,10 @@ def _create_executable_scripts(self, mock_executables: MockExecutables) -> List[
@property
def expected_specs(self) -> List[spack.spec.Spec]:
return [spack.spec.Spec(r.spec) for r in self.test.results]
return [
spack.spec.Spec.from_detection(item.spec, extra_attributes=item.extra_attributes)
for item in self.test.results
]
def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runner]:
@@ -117,9 +123,13 @@ def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runn
"""
result = []
detection_tests_content = read_detection_tests(pkg_name, repository)
current_platform = str(spack.platforms.host())
tests_by_path = detection_tests_content.get("paths", [])
for single_test_data in tests_by_path:
if current_platform not in single_test_data.get("platforms", [current_platform]):
continue
mock_executables = []
for layout in single_test_data["layout"]:
mock_executables.append(
@@ -127,7 +137,11 @@ def detection_tests(pkg_name: str, repository: spack.repo.RepoPath) -> List[Runn
)
expected_results = []
for assertion in single_test_data["results"]:
expected_results.append(ExpectedTestResult(spec=assertion["spec"]))
expected_results.append(
ExpectedTestResult(
spec=assertion["spec"], extra_attributes=assertion.get("extra_attributes", {})
)
)
current_test = DetectionTest(
pkg_name=pkg_name, layout=mock_executables, results=expected_results

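Note: the assertions parsed above come from a package's detection test YAML; a sketch of the shape implied by this hunk ("paths", "layout", "platforms", "results", "spec", "extra_attributes"), where the package name, platform list, and attribute values are made up:

    paths:
    - layout:
      - executables:
        - "bin/mycc"
        script: "echo mycc version 1.0.0"
      platforms: ["linux"]
      results:
      - spec: "mycc@1.0.0"
        extra_attributes:
          compilers:
            c: ".*/bin/mycc"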
View File

@@ -27,6 +27,7 @@ class OpenMpi(Package):
* ``variant``
* ``version``
* ``requires``
* ``redistribute``
"""
import collections
@@ -63,6 +64,7 @@ class OpenMpi(Package):
__all__ = [
"DirectiveError",
"DirectiveMeta",
"DisableRedistribute",
"version",
"conflicts",
"depends_on",
@@ -75,6 +77,7 @@ class OpenMpi(Package):
"resource",
"build_system",
"requires",
"redistribute",
]
#: These are variant names used by Spack internally; packages can't use them
@@ -598,6 +601,64 @@ def _execute_depends_on(pkg: "spack.package_base.PackageBase"):
return _execute_depends_on
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
@directive("disable_redistribute")
def redistribute(source=None, binary=None, when: WhenType = None):
"""Can be used inside a Package definition to declare that
the package source and/or compiled binaries should not be
redistributed.
By default, Packages allow source/binary distribution (i.e. in
mirrors). Because of this, and because overlapping enable/
disable specs are not allowed, this directive only allows users
to explicitly disable redistribution for specs.
"""
return lambda pkg: _execute_redistribute(pkg, source, binary, when)
def _execute_redistribute(
pkg: "spack.package_base.PackageBase", source=None, binary=None, when: WhenType = None
):
if source is None and binary is None:
return
elif (source is True) or (binary is True):
raise DirectiveError(
"Source/binary distribution are true by default, they can only "
"be explicitly disabled."
)
if source is None:
source = True
if binary is None:
binary = True
when_spec = _make_when_spec(when)
if not when_spec:
return
if source is False:
max_constraint = spack.spec.Spec(f"{pkg.name}@{when_spec.versions}")
if not max_constraint.satisfies(when_spec):
raise DirectiveError("Source distribution can only be disabled for versions")
if when_spec in pkg.disable_redistribute:
disable = pkg.disable_redistribute[when_spec]
if not source:
disable.source = True
if not binary:
disable.binary = True
else:
pkg.disable_redistribute[when_spec] = DisableRedistribute(
source=not source, binary=not binary
)
@directive(("extendees", "dependencies"))
def extends(spec, when=None, type=("build", "run"), patches=None):
"""Same as depends_on, but also adds this package to the extendee list.

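Note: a package author would use the new directive like so; the package below is hypothetical and the checksum is a placeholder:

    from spack.package import *

    class Acme(Package):
        """Example of a package that may be installed but never mirrored."""

        homepage = "https://example.com/acme"
        url = "https://example.com/acme-1.0.tar.gz"

        version("1.0", sha256="0" * 64)

        # Neither the source tarball nor built binaries may be redistributed.
        redistribute(source=False, binary=False)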
View File

@@ -106,17 +106,16 @@ def environment_name(path: Union[str, pathlib.Path]) -> str:
return path_str
def check_disallowed_env_config_mods(scopes):
def ensure_no_disallowed_env_config_mods(scopes: List[spack.config.ConfigScope]) -> None:
for scope in scopes:
with spack.config.use_configuration(scope):
if spack.config.get("config:environments_root"):
raise SpackEnvironmentError(
"Spack environments are prohibited from modifying 'config:environments_root' "
"because it can make the definition of the environment ill-posed. Please "
"remove from your environment and place it in a permanent scope such as "
"defaults, system, site, etc."
)
return scopes
config = scope.get_section("config")
if config and "environments_root" in config["config"]:
raise SpackEnvironmentError(
"Spack environments are prohibited from modifying 'config:environments_root' "
"because it can make the definition of the environment ill-posed. Please "
"remove from your environment and place it in a permanent scope such as "
"defaults, system, site, etc."
)
def default_manifest_yaml():
@@ -2463,6 +2462,10 @@ def __init__(self, manifest_dir: Union[pathlib.Path, str]) -> None:
self.scope_name = f"env:{environment_name(self.manifest_dir)}"
self.config_stage_dir = os.path.join(env_subdir_path(manifest_dir), "config")
#: Configuration scopes associated with this environment. Note that these are not
#: invalidated by a re-read of the manifest file.
self._config_scopes: Optional[List[spack.config.ConfigScope]] = None
if not self.manifest_file.exists():
msg = f"cannot find '{manifest_name}' in {self.manifest_dir}"
raise SpackEnvironmentError(msg)
@@ -2808,16 +2811,19 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
@property
def env_config_scopes(self) -> List[spack.config.ConfigScope]:
"""A list of all configuration scopes for the environment manifest.
Returns: All configuration scopes associated with the environment
"""
config_name = self.scope_name
env_scope = spack.config.SingleFileScope(
config_name, str(self.manifest_file), spack.schema.env.schema, [TOP_LEVEL_KEY]
)
return check_disallowed_env_config_mods(self.included_config_scopes + [env_scope])
"""A list of all configuration scopes for the environment manifest. On the first call this
instantiates all the scopes, on subsequent calls it returns the cached list."""
if self._config_scopes is not None:
return self._config_scopes
scopes: List[spack.config.ConfigScope] = [
*self.included_config_scopes,
spack.config.SingleFileScope(
self.scope_name, str(self.manifest_file), spack.schema.env.schema, [TOP_LEVEL_KEY]
),
]
ensure_no_disallowed_env_config_mods(scopes)
self._config_scopes = scopes
return scopes
def prepare_config_scope(self) -> None:
"""Add the manifest's scopes to the global configuration search path."""

View File

@@ -662,9 +662,6 @@ def add_specs(self, *specs: spack.spec.Spec) -> None:
return
# Drop externals
for s in specs:
if s.external:
tty.warn("Skipping external package: " + s.short_spec)
specs = [s for s in specs if not s.external]
self._sanity_check_view_projection(specs)

View File

@@ -34,7 +34,7 @@
/
o boost
graph_dot() will output a graph of a spec (or multiple specs) in dot format.
graph_dot() will output a graph of a spec (or multiple specs) in dot or mermaid format.
"""
import enum
import sys
@@ -446,10 +446,27 @@ def graph_ascii(
graph.write(spec, color=color, out=out)
class DotGraphBuilder:
"""Visit edges of a graph a build DOT options for nodes and edges"""
class DotGraph:
"""Configuration for DOT graphs"""
def __init__(self):
self.label = "label="
self.template = "misc/graph.dot"
class MermaidGraph:
"""Configuration for Mermaid graphs"""
def __init__(self):
self.label = ""
self.template = "misc/graph.md"
class GraphBuilder:
"""Visit edges of a graph a build options for nodes and edges"""
def __init__(self, graph=DotGraph()):
self.graph: Union[DotGraph, MermaidGraph] = graph
self.nodes: Set[Tuple[str, str]] = set()
self.edges: Set[Tuple[str, str, str]] = set()
@@ -472,40 +489,40 @@ def edge_entry(self, edge: spack.spec.DependencySpec) -> Tuple[str, str, str]:
raise NotImplementedError("Need to be implemented by derived classes")
def context(self):
"""Return the context to be used to render the DOT graph template"""
"""Return the context to be used to render the graph template"""
result = {"nodes": self.nodes, "edges": self.edges}
return result
def render(self) -> str:
"""Return a string with the output in DOT format"""
"""Return a string with the output in format"""
environment = spack.tengine.make_environment()
template = environment.get_template("misc/graph.dot")
template = environment.get_template(self.graph.template)
return template.render(self.context())
class SimpleDAG(DotGraphBuilder):
"""Simple DOT graph, with nodes colored uniformly and edges without properties"""
class SimpleDAG(GraphBuilder):
"""Simple graph, with nodes colored uniformly and edges without properties"""
def node_entry(self, node):
format_option = "{name}{@version}{%compiler}{/hash:7}"
return node.dag_hash(), f'[label="{node.format(format_option)}"]'
return node.dag_hash(), f'[{self.graph.label}"{node.format(format_option)}"]'
def edge_entry(self, edge):
return edge.parent.dag_hash(), edge.spec.dag_hash(), None
class StaticDag(DotGraphBuilder):
"""DOT graph for possible dependencies"""
class StaticDag(GraphBuilder):
"""Graph for possible dependencies"""
def node_entry(self, node):
return node.name, f'[label="{node.name}"]'
return node.name, f'[{self.graph.label}"{node.name}"]'
def edge_entry(self, edge):
return edge.parent.name, edge.spec.name, None
class DAGWithDependencyTypes(DotGraphBuilder):
"""DOT graph with link,run nodes grouped together and edges colored according to
class DAGWithDependencyTypes(GraphBuilder):
"""Graph with link,run nodes grouped together and edges colored according to
the dependency types.
"""
@@ -521,7 +538,7 @@ def visit(self, edge):
def node_entry(self, node):
node_str = node.format("{name}{@version}{%compiler}{/hash:7}")
options = f'[label="{node_str}", group="build_dependencies", fillcolor="coral"]'
options = f'[{self.graph.label}"{node_str}", group="build_dependencies", fillcolor="coral"]'
if node.dag_hash() in self.main_unified_space:
options = f'[label="{node_str}", group="main_psid"]'
return node.dag_hash(), options
@@ -574,7 +591,7 @@ def static_graph_dot(
def graph_dot(
specs: List[spack.spec.Spec],
builder: Optional[DotGraphBuilder] = None,
builder: Optional[GraphBuilder] = None,
depflag: dt.DepFlag = dt.ALL,
out: Optional[TextIO] = None,
):

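Note: with the MermaidGraph configuration (empty label prefix, misc/graph.md template), the rendered output of spack graph --mermaid should look roughly like the following; node text and hashes are illustrative, and the exact template output may differ:

    graph TD
      abcdefg["zlib@1.3.1%gcc@13.2.0/abcdefg"]
      hijklmn["pkgconf@2.2.0%gcc@13.2.0/hijklmn"]
      abcdefg --> hijklmn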
View File

@@ -12,6 +12,10 @@
def post_install(spec, explicit):
# Push package to all buildcaches with autopush==True
# Do nothing if spec is an external package
if spec.external:
return
# Do nothing if package was not installed from source
pkg = spec.package
if pkg.installed_from_binary_cache:

View File

@@ -489,6 +489,9 @@ def _process_binary_cache_tarball(
with timer.measure("install"), spack.util.path.filter_padding():
binary_distribution.extract_tarball(pkg.spec, download_result, force=False, timer=timer)
if hasattr(pkg, "_post_buildcache_install_hook"):
pkg._post_buildcache_install_hook()
pkg.installed_from_binary_cache = True
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)
return True

View File

@@ -83,6 +83,17 @@ def configuration(module_set_name):
)
_FORMAT_STRING_RE = re.compile(r"({[^}]*})")
def _format_env_var_name(spec, var_name_fmt):
"""Format the variable name, but uppercase any formatted fields."""
fmt_parts = _FORMAT_STRING_RE.split(var_name_fmt)
return "".join(
spec.format(part).upper() if _FORMAT_STRING_RE.match(part) else part for part in fmt_parts
)
def _check_tokens_are_valid(format_string, message):
"""Checks that the tokens used in the format string are valid in
the context of module file and environment variable naming.
@@ -737,20 +748,12 @@ def environment_modifications(self):
exclude = self.conf.exclude_env_vars
# We may have tokens to substitute in environment commands
# Prepare a suitable transformation dictionary for the names
# of the environment variables. This means turn the valid
# tokens uppercase.
transform = {}
for token in _valid_tokens:
transform[token] = lambda s, string: str.upper(string)
for x in env:
# Ensure all the tokens are valid in this context
msg = "some tokens cannot be expanded in an environment variable name"
_check_tokens_are_valid(x.name, message=msg)
# Transform them
x.name = self.spec.format(x.name, transform=transform)
x.name = _format_env_var_name(self.spec, x.name)
if self.modification_needs_formatting(x):
try:
# Not every command has a value

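Note: the helper above uppercases only the expanded {token} fields and leaves literal text untouched; the same idea with a plain callable standing in for Spec.format:

    import re

    _FORMAT_STRING_RE = re.compile(r"({[^}]*})")

    def format_env_var_name(expand, var_name_fmt: str) -> str:
        """Uppercase expanded {token} fields; keep literal parts as written."""
        parts = _FORMAT_STRING_RE.split(var_name_fmt)
        return "".join(
            expand(part).upper() if _FORMAT_STRING_RE.match(part) else part
            for part in parts
        )

    print(format_env_var_name(lambda tok: "zlib", "{name}_ROOT"))  # -> ZLIB_ROOT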
View File

@@ -73,17 +73,24 @@ def vs_install_paths(self):
def msvc_paths(self):
return [os.path.join(path, "VC", "Tools", "MSVC") for path in self.vs_install_paths]
@property
def oneapi_root(self):
root = os.environ.get("ONEAPI_ROOT", "") or os.path.join(
os.environ.get("ProgramFiles(x86)", ""), "Intel", "oneAPI"
)
if os.path.exists(root):
return root
@property
def compiler_search_paths(self):
# First Strategy: Find MSVC directories using vswhere
_compiler_search_paths = []
for p in self.msvc_paths:
_compiler_search_paths.extend(glob.glob(os.path.join(p, "*", "bin", "Hostx64", "x64")))
if os.getenv("ONEAPI_ROOT"):
oneapi_root = self.oneapi_root
if oneapi_root:
_compiler_search_paths.extend(
glob.glob(
os.path.join(str(os.getenv("ONEAPI_ROOT")), "compiler", "*", "windows", "bin")
)
glob.glob(os.path.join(oneapi_root, "compiler", "**", "bin"), recursive=True)
)
# Second strategy: Find MSVC via the registry

View File

@@ -468,7 +468,41 @@ def _names(when_indexed_dictionary):
return sorted(all_names)
class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
class RedistributionMixin:
"""Logic for determining whether a Package is source/binary
redistributable.
"""
#: Store whether a given Spec source/binary should not be
#: redistributed.
disable_redistribute: Dict["spack.spec.Spec", "spack.directives.DisableRedistribute"]
# Source redistribution must be determined before concretization
# (because source mirrors work with un-concretized Specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror.
"""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an
installed instance of this package.
"""
for when_spec, disable_redistribute in self.__class__.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass=PackageMeta):
"""This is the superclass for all spack packages.
***The Package class***
@@ -2521,7 +2555,12 @@ class PackageStillNeededError(InstallError):
"""Raised when package is still needed by another on uninstall."""
def __init__(self, spec, dependents):
super().__init__("Cannot uninstall %s" % spec)
spec_fmt = spack.spec.DEFAULT_FORMAT + " /{hash:7}"
dep_fmt = "{name}{@versions} /{hash:7}"
super().__init__(
f"Cannot uninstall {spec.format(spec_fmt)}, "
f"needed by {[dep.format(dep_fmt) for dep in dependents]}"
)
self.spec = spec
self.dependents = dependents

View File

@@ -726,14 +726,14 @@ def first_repo(self):
"""Get the first repo in precedence order."""
return self.repos[0] if self.repos else None
@llnl.util.lang.memoized
def _all_package_names_set(self, include_virtuals):
return {name for repo in self.repos for name in repo.all_package_names(include_virtuals)}
@llnl.util.lang.memoized
def _all_package_names(self, include_virtuals):
"""Return all unique package names in all repositories."""
all_pkgs = set()
for repo in self.repos:
for name in repo.all_package_names(include_virtuals):
all_pkgs.add(name)
return sorted(all_pkgs, key=lambda n: n.lower())
return sorted(self._all_package_names_set(include_virtuals), key=lambda n: n.lower())
def all_package_names(self, include_virtuals=False):
return self._all_package_names(include_virtuals)
@@ -794,7 +794,11 @@ def patch_index(self):
@autospec
def providers_for(self, vpkg_spec):
providers = self.provider_index.providers_for(vpkg_spec)
providers = [
spec
for spec in self.provider_index.providers_for(vpkg_spec)
if spec.name in self._all_package_names_set(include_virtuals=False)
]
if not providers:
raise UnknownPackageError(vpkg_spec.fullname)
return providers

View File

@@ -9,13 +9,40 @@
"""
from typing import Any, Dict
LIST_OF_SPECS = {"type": "array", "items": {"type": "string"}}
properties: Dict[str, Any] = {
"concretizer": {
"type": "object",
"additionalProperties": False,
"properties": {
"reuse": {
"oneOf": [{"type": "boolean"}, {"type": "string", "enum": ["dependencies"]}]
"oneOf": [
{"type": "boolean"},
{"type": "string", "enum": ["dependencies"]},
{
"type": "object",
"properties": {
"roots": {"type": "boolean"},
"include": LIST_OF_SPECS,
"exclude": LIST_OF_SPECS,
"from": {
"type": "array",
"items": {
"type": "object",
"properties": {
"type": {
"type": "string",
"enum": ["local", "buildcache", "external"],
},
"include": LIST_OF_SPECS,
"exclude": LIST_OF_SPECS,
},
},
},
},
},
]
},
"enable_node_namespace": {"type": "boolean"},
"targets": {

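Note: an example that validates against the extended reuse schema in this hunk; the spec strings are illustrative:

    concretizer:
      reuse:
        roots: true
        include: ["%gcc"]
        exclude: ["openssl"]
        from:
        - type: local
        - type: buildcache
          include: ["mpich"]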
View File

@@ -6,6 +6,7 @@
import collections.abc
import copy
import enum
import functools
import itertools
import os
import pathlib
@@ -15,6 +16,7 @@
import types
import typing
import warnings
from contextlib import contextmanager
from typing import Callable, Dict, Iterator, List, NamedTuple, Optional, Set, Tuple, Type, Union
import archspec.cpu
@@ -40,6 +42,8 @@
import spack.spec
import spack.store
import spack.util.crypto
import spack.util.elf
import spack.util.libc
import spack.util.path
import spack.util.timer
import spack.variant
@@ -119,6 +123,17 @@ def __str__(self):
return f"{self._name_.lower()}"
@contextmanager
def spec_with_name(spec, name):
"""Context manager to temporarily set the name of a spec"""
old_name = spec.name
spec.name = name
try:
yield spec
finally:
spec.name = old_name
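Note: typical use of the context manager above, assuming anonymous specs report their name as None:

    import spack.spec

    spec = spack.spec.Spec("@4.5")            # anonymous constraint: no package name
    with spec_with_name(spec, "mpich"):
        assert spec.name == "mpich"           # per-package facts can be emitted here
    assert spec.name is None                  # the original (absent) name is restored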
class RequirementKind(enum.Enum):
"""Purpose / provenance of a requirement"""
@@ -271,6 +286,34 @@ def all_compilers_in_config(configuration):
return spack.compilers.all_compilers_from(configuration)
def all_libcs() -> Set[spack.spec.Spec]:
"""Return a set of all libc specs targeted by any configured compiler. If none, fall back to
libc determined from the current Python process if dynamically linked."""
libcs = {
c.default_libc for c in all_compilers_in_config(spack.config.CONFIG) if c.default_libc
}
if libcs:
return libcs
libc = spack.util.libc.libc_from_current_python_process()
return {libc} if libc else set()
def libc_is_compatible(lhs: spack.spec.Spec, rhs: spack.spec.Spec) -> bool:
return (
lhs.name == rhs.name
and lhs.external_path == rhs.external_path
and lhs.version >= rhs.version
)
def using_libc_compatibility() -> bool:
"""Returns True if we are currently using libc compatibility"""
return spack.platforms.host().name == "linux"
def extend_flag_list(flag_list, new_flags):
"""Extend a list of flags, preserving order and precedence.
@@ -554,6 +597,23 @@ def _spec_with_default_name(spec_str, name):
return spec
def _external_config_with_implicit_externals(configuration):
# Read packages.yaml and normalize it, so that it will not contain entries referring to
# virtual packages.
packages_yaml = _normalize_packages_yaml(configuration.get("packages"))
# Add externals for libc from compilers on Linux
if not using_libc_compatibility():
return packages_yaml
for compiler in all_compilers_in_config(configuration):
libc = compiler.default_libc
if libc:
entry = {"spec": f"{libc} %{compiler.spec}", "prefix": libc.external_path}
packages_yaml.setdefault(libc.name, {}).setdefault("externals", []).append(entry)
return packages_yaml
class ErrorHandler:
def __init__(self, model):
self.model = model
@@ -749,12 +809,22 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
A tuple of the solve result, the timer for the different phases of the
solve, and the internal statistics from clingo.
"""
# avoid circular import
import spack.bootstrap
output = output or DEFAULT_OUTPUT_CONFIGURATION
timer = spack.util.timer.Timer()
# Initialize the control object for the solver
self.control = control or default_clingo_control()
# ensure core deps are present on Windows
# needs to modify active config scope, so cannot be run within
# bootstrap config scope
if sys.platform == "win32":
tty.debug("Ensuring basic dependencies {win-sdk, wgl} available")
spack.bootstrap.core.ensure_winsdk_external_or_raise()
timer.start("setup")
asp_problem = setup.setup(specs, reuse=reuse, allow_deprecated=allow_deprecated)
if output.out is not None:
@@ -772,10 +842,16 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
if spack.config.CONFIG.get("concretizer:duplicates:strategy", "none") != "none":
self.control.load(os.path.join(parent_dir, "heuristic_separate.lp"))
self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
# Binary compatibility is based on libc on Linux, and on the os tag elsewhere
if using_libc_compatibility():
self.control.load(os.path.join(parent_dir, "libc_compatibility.lp"))
else:
self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
timer.stop("load")
# Grounding is the first step in the solve -- it turns our facts
@@ -972,6 +1048,9 @@ def __init__(self, tests: bool = False):
self.pkgs: Set[str] = set()
self.explicitly_required_namespaces: Dict[str, str] = {}
# list of unique libc specs targeted by compilers (or an educated guess if no compiler)
self.libcs: List[spack.spec.Spec] = []
def pkg_version_rules(self, pkg):
"""Output declared versions of a package.
@@ -1153,6 +1232,9 @@ def pkg_rules(self, pkg, tests):
def trigger_rules(self):
"""Flushes all the trigger rules collected so far, and clears the cache."""
if not self._trigger_cache:
return
self.gen.h2("Trigger conditions")
for name in self._trigger_cache:
cache = self._trigger_cache[name]
@@ -1166,6 +1248,9 @@ def trigger_rules(self):
def effect_rules(self):
"""Flushes all the effect rules collected so far, and clears the cache."""
if not self._effect_cache:
return
self.gen.h2("Imposed requirements")
for name in self._effect_cache:
cache = self._effect_cache[name]
@@ -1323,34 +1408,38 @@ def condition(
Returns:
int: id of the condition created by this function
"""
named_cond = required_spec.copy()
named_cond.name = named_cond.name or name
if not named_cond.name:
raise ValueError(f"Must provide a name for anonymous condition: '{named_cond}'")
name = required_spec.name or name
if not name:
raise ValueError(f"Must provide a name for anonymous condition: '{required_spec}'")
# Check if we can emit the requirements before updating the condition ID counter.
# In this way, if a condition can't be emitted but the exception is handled in the caller,
# we won't emit partial facts.
with spec_with_name(required_spec, name):
# Check if we can emit the requirements before updating the condition ID counter.
# In this way, if a condition can't be emitted but the exception is handled in the
# caller, we won't emit partial facts.
condition_id = next(self._id_counter)
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
condition_id = next(self._id_counter)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
trigger_id = self._get_condition_id(
named_cond, cache=self._trigger_cache, body=True, transform=transform_required
)
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_trigger(condition_id, trigger_id)))
trigger_id = self._get_condition_id(
required_spec, cache=self._trigger_cache, body=True, transform=transform_required
)
self.gen.fact(
fn.pkg_fact(required_spec.name, fn.condition_trigger(condition_id, trigger_id))
)
if not imposed_spec:
return condition_id
effect_id = self._get_condition_id(
imposed_spec, cache=self._effect_cache, body=False, transform=transform_imposed
)
self.gen.fact(
fn.pkg_fact(required_spec.name, fn.condition_effect(condition_id, effect_id))
)
if not imposed_spec:
return condition_id
effect_id = self._get_condition_id(
imposed_spec, cache=self._effect_cache, body=False, transform=transform_imposed
)
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_effect(condition_id, effect_id)))
return condition_id
def impose(self, condition_id, imposed_spec, node=True, name=None, body=False):
imposed_constraints = self.spec_clauses(imposed_spec, body=body, required_from=name)
for pred in imposed_constraints:
@@ -1537,14 +1626,31 @@ def emit_facts_from_requirement_rules(self, rules: List[RequirementRule]):
requirement_weight += 1
def external_packages(self):
"""Facts on external packages, as read from packages.yaml"""
# Read packages.yaml and normalize it, so that it
# will not contain entries referring to virtual
# packages.
packages_yaml = spack.config.get("packages")
packages_yaml = _normalize_packages_yaml(packages_yaml)
"""Facts on external packages, from packages.yaml and implicit externals."""
packages_yaml = _external_config_with_implicit_externals(spack.config.CONFIG)
self.gen.h1("External packages")
spec_filters = []
concretizer_yaml = spack.config.get("concretizer")
reuse_yaml = concretizer_yaml.get("reuse")
if isinstance(reuse_yaml, typing.Mapping):
default_include = reuse_yaml.get("include", [])
default_exclude = reuse_yaml.get("exclude", [])
for source in reuse_yaml.get("from", []):
if source["type"] != "external":
continue
include = source.get("include", default_include)
exclude = source.get("exclude", default_exclude)
spec_filters.append(
SpecFilter(
factory=lambda: [],
is_usable=lambda x: True,
include=include,
exclude=exclude,
)
)
for pkg_name, data in packages_yaml.items():
if pkg_name == "all":
continue
@@ -1553,7 +1659,6 @@ def external_packages(self):
if pkg_name not in spack.repo.PATH:
continue
self.gen.h2("External package: {0}".format(pkg_name))
# Check if the external package is buildable. If it is
# not then "external(<pkg>)" is a fact, unless we can
# reuse an already installed spec.
@@ -1563,7 +1668,17 @@ def external_packages(self):
# Read a list of all the specs for this package
externals = data.get("externals", [])
external_specs = [spack.spec.parse_with_version_concrete(x["spec"]) for x in externals]
candidate_specs = [
spack.spec.parse_with_version_concrete(x["spec"]) for x in externals
]
external_specs = []
if spec_filters:
for current_filter in spec_filters:
current_filter.factory = lambda: candidate_specs
external_specs.extend(current_filter.selected_specs())
else:
external_specs.extend(candidate_specs)
# Order the external versions to prefer more recent versions
# even if specs in packages.yaml are not ordered that way
@@ -1593,6 +1708,7 @@ def external_imposition(input_spec, requirements):
self.gen.newline()
self.trigger_rules()
self.effect_rules()
def preferred_variants(self, pkg_name):
"""Facts on concretization preferences, as read from packages.yaml"""
@@ -1814,6 +1930,16 @@ def _spec_clauses(
if dep.name == "gcc-runtime":
continue
# libc is also solved again by clingo, but in this case the compatibility
# is not encoded in the parent node - so we need to emit explicit facts
if "libc" in dspec.virtuals:
for libc in self.libcs:
if libc_is_compatible(libc, dep):
clauses.append(
fn.attr("compatible_libc", spec.name, libc.name, libc.version)
)
continue
# We know dependencies are real for concrete specs. For abstract
# specs they just mean the dep is somehow in the DAG.
for dtype in dt.ALL_FLAGS:
@@ -2269,6 +2395,7 @@ def setup(
node_counter = _create_counter(specs, tests=self.tests)
self.possible_virtuals = node_counter.possible_virtuals()
self.pkgs = node_counter.possible_dependencies()
self.libcs = sorted(all_libcs()) # type: ignore[type-var]
# Fail if we already know an unreachable node is requested
for spec in specs:
@@ -2278,13 +2405,17 @@ def setup(
if missing_deps:
raise spack.spec.InvalidDependencyError(spec.name, missing_deps)
for node in spack.traverse.traverse_nodes(specs):
for node in traverse.traverse_nodes(specs):
if node.namespace is not None:
self.explicitly_required_namespaces[node.name] = node.namespace
self.gen = ProblemInstanceBuilder()
compiler_parser = CompilerParser(configuration=spack.config.CONFIG).with_input_specs(specs)
if using_libc_compatibility():
for libc in self.libcs:
self.gen.fact(fn.allowed_libc(libc.name, libc.version))
if not allow_deprecated:
self.gen.fact(fn.deprecated_versions_not_allowed())
@@ -2414,18 +2545,42 @@ def visit(node):
def define_runtime_constraints(self):
"""Define the constraints to be imposed on the runtimes"""
recorder = RuntimePropertyRecorder(self)
# TODO: Use only available compilers?
for compiler in self.possible_compilers:
compiler_with_different_cls_names = {"oneapi": "intel-oneapi-compilers"}
compiler_with_different_cls_names = {
"oneapi": "intel-oneapi-compilers",
"clang": "llvm",
}
compiler_cls_name = compiler_with_different_cls_names.get(
compiler.spec.name, compiler.spec.name
)
try:
compiler_cls = spack.repo.PATH.get_pkg_class(compiler_cls_name)
if hasattr(compiler_cls, "runtime_constraints"):
compiler_cls.runtime_constraints(spec=compiler.spec, pkg=recorder)
except spack.repo.UnknownPackageError:
pass
# Inject libc from available compilers, on Linux
if not compiler.available:
continue
if hasattr(compiler_cls, "runtime_constraints"):
compiler_cls.runtime_constraints(spec=compiler.spec, pkg=recorder)
current_libc = compiler.compiler_obj.default_libc
# If this is a compiler yet to be built (config:install_missing_compilers:true)
# infer libc from the Python process
if not current_libc and compiler.compiler_obj.cc is None:
current_libc = spack.util.libc.libc_from_current_python_process()
if using_libc_compatibility() and current_libc:
recorder("*").depends_on(
"libc", when=f"%{compiler.spec}", type="link", description="Add libc"
)
recorder("*").depends_on(
str(current_libc),
when=f"%{compiler.spec}",
type="link",
description="Add libc",
)
recorder.consume_facts()
@@ -2803,6 +2958,13 @@ class CompilerParser:
def __init__(self, configuration) -> None:
self.compilers: Set[KnownCompiler] = set()
for c in all_compilers_in_config(configuration):
if using_libc_compatibility() and not c.default_libc:
warnings.warn(
f"cannot detect libc from {c.spec}. The compiler will not be used "
f"during concretization."
)
continue
target = c.target if c.target != "any" else None
candidate = KnownCompiler(
spec=c.spec, os=c.operating_system, target=target, available=True, compiler_obj=c
@@ -3167,12 +3329,8 @@ def no_flags(self, node, flag_type):
self._specs[node].compiler_flags[flag_type] = []
def external_spec_selected(self, node, idx):
"""This means that the external spec and index idx
has been selected for this package.
"""
packages_yaml = spack.config.get("packages")
packages_yaml = _normalize_packages_yaml(packages_yaml)
"""This means that the external spec and index idx has been selected for this package."""
packages_yaml = _external_config_with_implicit_externals(spack.config.CONFIG)
spec_info = packages_yaml[node.pkg]["externals"][int(idx)]
self._specs[node].external_path = spec_info.get("prefix", None)
self._specs[node].external_modules = spack.spec.Spec._format_module_list(
@@ -3452,25 +3610,159 @@ def _has_runtime_dependencies(spec: spack.spec.Spec) -> bool:
return True
class SpecFilter:
"""Given a method to produce a list of specs, this class can filter them according to
different criteria.
"""
def __init__(
self,
factory: Callable[[], List[spack.spec.Spec]],
is_usable: Callable[[spack.spec.Spec], bool],
include: List[str],
exclude: List[str],
) -> None:
"""
Args:
factory: factory to produce a list of specs
is_usable: predicate that takes a spec in input and returns False if the spec
should not be considered for this filter, True otherwise.
include: if present, a "good" spec must match at least one entry in the list
exclude: if present, a "good" spec must not match any entry in the list
"""
self.factory = factory
self.is_usable = is_usable
self.include = include
self.exclude = exclude
def is_selected(self, s: spack.spec.Spec) -> bool:
if not self.is_usable(s):
return False
if self.include and not any(s.satisfies(c) for c in self.include):
return False
if self.exclude and any(s.satisfies(c) for c in self.exclude):
return False
return True
def selected_specs(self) -> List[spack.spec.Spec]:
return [s for s in self.factory() if self.is_selected(s)]
@staticmethod
def from_store(configuration, include, exclude) -> "SpecFilter":
"""Constructs a filter that takes the specs from the current store."""
packages = _external_config_with_implicit_externals(configuration)
is_reusable = functools.partial(_is_reusable, packages=packages, local=True)
factory = functools.partial(_specs_from_store, configuration=configuration)
return SpecFilter(factory=factory, is_usable=is_reusable, include=include, exclude=exclude)
@staticmethod
def from_buildcache(configuration, include, exclude) -> "SpecFilter":
"""Constructs a filter that takes the specs from the configured buildcaches."""
packages = _external_config_with_implicit_externals(configuration)
is_reusable = functools.partial(_is_reusable, packages=packages, local=False)
return SpecFilter(
factory=_specs_from_mirror, is_usable=is_reusable, include=include, exclude=exclude
)
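# A minimal usage sketch with an in-memory factory (specs are hypothetical);
# the from_store/from_buildcache constructors above differ only in the
# factory and the usability predicate they plug in.
import spack.spec

candidates = [spack.spec.Spec(s) for s in ("zlib@1.3", "openssl@3.1", "zlib@1.2")]
only_recent_zlib = SpecFilter(
    factory=lambda: candidates,
    is_usable=lambda s: True,  # accept everything the factory yields
    include=["zlib"],  # a selected spec must match at least one include
    exclude=["@1.2"],  # and must match no exclude
)
assert [str(s) for s in only_recent_zlib.selected_specs()] == ["zlib@1.3"]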
def _specs_from_store(configuration):
store = spack.store.create(configuration)
with store.db.read_transaction():
return store.db.query(installed=True)
def _specs_from_mirror():
try:
return spack.binary_distribution.update_cache_and_get_specs()
except (spack.binary_distribution.FetchCacheError, IndexError):
# this is raised when no mirrors had indices.
# TODO: update mirror configuration so it can indicate that the
# TODO: source cache (or any mirror really) doesn't have binaries.
return []
class ReuseStrategy(enum.Enum):
ROOTS = enum.auto()
DEPENDENCIES = enum.auto()
NONE = enum.auto()
class ReusableSpecsSelector:
"""Selects specs that can be reused during concretization."""
def __init__(self, configuration: spack.config.Configuration) -> None:
self.configuration = configuration
self.store = spack.store.create(configuration)
self.reuse_strategy = ReuseStrategy.ROOTS
reuse_yaml = self.configuration.get("concretizer:reuse", False)
self.reuse_sources = []
if not isinstance(reuse_yaml, typing.Mapping):
if reuse_yaml is False:
self.reuse_strategy = ReuseStrategy.NONE
if reuse_yaml == "dependencies":
self.reuse_strategy = ReuseStrategy.DEPENDENCIES
self.reuse_sources.extend(
[
SpecFilter.from_store(
configuration=self.configuration, include=[], exclude=[]
),
SpecFilter.from_buildcache(
configuration=self.configuration, include=[], exclude=[]
),
]
)
else:
roots = reuse_yaml.get("roots", True)
if roots is True:
self.reuse_strategy = ReuseStrategy.ROOTS
else:
self.reuse_strategy = ReuseStrategy.DEPENDENCIES
default_include = reuse_yaml.get("include", [])
default_exclude = reuse_yaml.get("exclude", [])
default_sources = [{"type": "local"}, {"type": "buildcache"}]
for source in reuse_yaml.get("from", default_sources):
include = source.get("include", default_include)
exclude = source.get("exclude", default_exclude)
if source["type"] == "local":
self.reuse_sources.append(
SpecFilter.from_store(self.configuration, include=include, exclude=exclude)
)
elif source["type"] == "buildcache":
self.reuse_sources.append(
SpecFilter.from_buildcache(
self.configuration, include=include, exclude=exclude
)
)
def reusable_specs(self, specs: List[spack.spec.Spec]) -> List[spack.spec.Spec]:
if self.reuse_strategy == ReuseStrategy.NONE:
return []
result = []
for reuse_source in self.reuse_sources:
result.extend(reuse_source.selected_specs())
# If we only want to reuse dependencies, remove the root specs
if self.reuse_strategy == ReuseStrategy.DEPENDENCIES:
result = [spec for spec in result if not any(root in spec for root in specs)]
return result
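# A hedged sketch of the concretizer:reuse mapping consumed above, written as
# equivalent Python data (key names follow the parsing logic; values are
# hypothetical):
reuse_config = {
    "roots": True,  # False switches the strategy to DEPENDENCIES
    "include": ["%gcc"],  # default include filter for every source
    "exclude": ["+debug"],  # default exclude filter for every source
    "from": [
        {"type": "local"},  # installed specs from the local store
        {"type": "buildcache"},  # specs from configured binary mirrors
        {"type": "external", "include": ["cmake"]},  # handled in external_packages()
    ],
}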
class Solver:
"""This is the main external interface class for solving.
It manages solver configuration and preferences in one place. It sets up the solve
and passes the setup method to the driver, as well.
Properties of interest:
``reuse (bool)``
Whether to try to reuse existing installs/binaries
"""
def __init__(self):
self.driver = PyclingoDriver()
# These properties are settable via spack configuration, and overridable
# by setting them directly as properties.
self.reuse = spack.config.get("concretizer:reuse", False)
self.selector = ReusableSpecsSelector(configuration=spack.config.CONFIG)
@staticmethod
def _check_input_and_extract_concrete_specs(specs):
@@ -3484,39 +3776,6 @@ def _check_input_and_extract_concrete_specs(specs):
spack.spec.Spec.ensure_valid_variants(s)
return reusable
def _reusable_specs(self, specs):
reusable_specs = []
if self.reuse:
packages = spack.config.get("packages")
# Specs from the local Database
with spack.store.STORE.db.read_transaction():
reusable_specs.extend(
s
for s in spack.store.STORE.db.query(installed=True)
if _is_reusable(s, packages, local=True)
)
# Specs from buildcaches
try:
reusable_specs.extend(
s
for s in spack.binary_distribution.update_cache_and_get_specs()
if _is_reusable(s, packages, local=False)
)
except (spack.binary_distribution.FetchCacheError, IndexError):
# this is raised when no mirrors had indices.
# TODO: update mirror configuration so it can indicate that the
# TODO: source cache (or any mirror really) doesn't have binaries.
pass
# If we only want to reuse dependencies, remove the root specs
if self.reuse == "dependencies":
reusable_specs = [
spec for spec in reusable_specs if not any(root in spec for root in specs)
]
return reusable_specs
def solve(
self,
specs,
@@ -3542,7 +3801,7 @@ def solve(
# Check upfront that the variants are admissible
specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self._reusable_specs(specs))
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
output = OutputConfiguration(timers=timers, stats=stats, out=out, setup_only=setup_only)
result, _, _ = self.driver.solve(
@@ -3571,7 +3830,7 @@ def solve_in_rounds(
"""
specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self._reusable_specs(specs))
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
# Tell clingo that we don't have to solve all the inputs at once


@@ -127,10 +127,12 @@ trigger_node(TriggerID, Node, Node) :-
trigger_condition_holds(TriggerID, Node),
literal(TriggerID).
% Since we trigger the existence of literal nodes from a condition, we need to construct
% the condition_set/2 manually below
% Since we trigger the existence of literal nodes from a condition, we need to construct the condition_set/2
mentioned_in_literal(Root, Mentioned) :- mentioned_in_literal(TriggerID, Root, Mentioned), solve_literal(TriggerID).
condition_set(node(min_dupe_id, Root), node(min_dupe_id, Mentioned)) :- mentioned_in_literal(Root, Mentioned).
condition_set(node(min_dupe_id, Root), node(min_dupe_id, Root)) :- mentioned_in_literal(Root, Root).
1 { condition_set(node(min_dupe_id, Root), node(0..Y-1, Mentioned)) : max_dupes(Mentioned, Y) } 1 :-
mentioned_in_literal(Root, Mentioned), Mentioned != Root.
% Discriminate between "roots" that have been explicitly requested, and roots that are deduced from "virtual roots"
explicitly_requested_root(node(min_dupe_id, Package)) :-
@@ -138,6 +140,20 @@ explicitly_requested_root(node(min_dupe_id, Package)) :-
trigger_and_effect(Package, TriggerID, EffectID),
imposed_constraint(EffectID, "root", Package).
% Keep track of which nodes are associated with which root DAG
associated_with_root(RootNode, RootNode) :- attr("root", RootNode).
associated_with_root(RootNode, ChildNode) :-
depends_on(ParentNode, ChildNode),
associated_with_root(RootNode, ParentNode).
% We cannot have a node in the root condition set, that is not associated with that root
:- attr("root", RootNode),
condition_set(RootNode, node(X, Package)),
not virtual(Package),
not associated_with_root(RootNode, node(X, Package)).
#defined concretize_everything/0.
#defined literal/1.
@@ -891,12 +907,8 @@ error(100, "{0} variant '{1}' cannot have values '{2}' and '{3}' as they come fr
Set1 < Set2, % see[1]
build(node(ID, Package)).
% variant_set is an explicitly set variant value. If it's not 'set',
% we revert to the default value. If it is set, we force the set value
attr("variant_value", PackageNode, Variant, Value)
:- attr("node", PackageNode),
node_has_variant(PackageNode, Variant),
attr("variant_set", PackageNode, Variant, Value).
:- attr("variant_set", node(ID, Package), Variant, Value),
not attr("variant_value", node(ID, Package), Variant, Value).
% The rules below allow us to prefer default values for variants
% whenever possible. If a variant is set in a spec, or if it is
@@ -981,14 +993,13 @@ pkg_fact(Package, variant_single_value("dev_path"))
% Platform semantics
%-----------------------------------------------------------------------------
% if no platform is set, fall back to the default
error(100, "platform '{0}' is not allowed on the current host", Platform)
:- attr("node_platform", _, Platform), not allowed_platform(Platform).
% NOTE: Currently we have a single allowed platform per DAG, therefore there is no
% need to have additional optimization criteria. If we ever add cross-platform dags,
% this needs to be changed.
:- 2 { allowed_platform(Platform) }, internal_error("More than one allowed platform detected").
attr("node_platform", PackageNode, Platform)
:- attr("node", PackageNode),
not attr("node_platform_set", PackageNode),
node_platform_default(Platform).
1 { attr("node_platform", PackageNode, Platform) : allowed_platform(Platform) } 1
:- attr("node", PackageNode).
% setting platform on a node is a hard constraint
attr("node_platform", PackageNode, Platform)
@@ -1012,14 +1023,6 @@ error(100, "Cannot select '{0} os={1}' (operating system '{1}' is not buildable)
attr("node_os", node(X, Package), OS),
not buildable_os(OS).
% can't have dependencies on incompatible OS's
error(100, "{0} and dependency {1} have incompatible operating systems 'os={2}' and 'os={3}'", Package, Dependency, PackageNodeOS, DependencyOS)
:- depends_on(node(X, Package), node(Y, Dependency)),
attr("node_os", node(X, Package), PackageNodeOS),
attr("node_os", node(Y, Dependency), DependencyOS),
not os_compatible(PackageNodeOS, DependencyOS),
build(node(X, Package)).
% give OS choice weights according to os declarations
node_os_weight(PackageNode, Weight)
:- attr("node", PackageNode),
@@ -1032,13 +1035,6 @@ os_compatible(OS, OS) :- os(OS).
% Transitive compatibility among operating systems
os_compatible(OS1, OS3) :- os_compatible(OS1, OS2), os_compatible(OS2, OS3).
% We can select only operating systems compatible with the ones
% for which we can build software. We need a cardinality constraint
% since we might have more than one "buildable_os(OS)" fact.
:- not 1 { os_compatible(CurrentOS, ReusedOS) : buildable_os(CurrentOS) },
attr("node_os", Package, ReusedOS),
internal_error("Reused OS incompatible with build OS").
% If an OS is set explicitly respect the value
attr("node_os", PackageNode, OS) :- attr("node_os_set", PackageNode, OS), attr("node", PackageNode).
@@ -1086,6 +1082,9 @@ error(100, "{0} compiler '{2}@{3}' incompatible with 'target={1}'", Package, Tar
compiler_version(CompilerID, Version),
build(node(X, Package)).
#defined compiler_supports_target/2.
#defined compiler_available/1.
% if a target is set explicitly, respect it
attr("node_target", PackageNode, Target)
:- attr("node", PackageNode), attr("node_target_set", PackageNode, Target).


@@ -0,0 +1,37 @@
% Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Libc compatibility rules for reusing solves.
%
% These rules are used on Linux
%=============================================================================
% A package cannot be reused if the libc is not compatible with it
:- provider(node(X, LibcPackage), node(0, "libc")),
attr("version", node(X, LibcPackage), LibcVersion),
attr("hash", node(R, ReusedPackage), Hash),
% Libc packages can be reused without the "compatible_libc" attribute
ReusedPackage != LibcPackage,
not attr("compatible_libc", node(R, ReusedPackage), LibcPackage, LibcVersion).
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% A libc is needed in the DAG
:- has_built_packages(), not provider(_, node(0, "libc")).
% The libc must be chosen among available ones
:- has_built_packages(),
provider(node(X, LibcPackage), node(0, "libc")),
attr("node", node(X, LibcPackage)),
attr("version", node(X, LibcPackage), LibcVersion),
not allowed_libc(LibcPackage, LibcVersion).
% A built node must depend on libc
:- build(PackageNode),
provider(LibcNode, node(0, "libc")),
not external(PackageNode),
not depends_on(PackageNode, LibcNode).


@@ -7,21 +7,26 @@
% OS compatibility rules for reusing solves.
% os_compatible(RecentOS, OlderOS)
% OlderOS binaries can be used on RecentOS
%
% These rules are used on every platform but Linux
%=============================================================================
% macOS
os_compatible("sonoma", "ventura").
os_compatible("ventura", "monterey").
os_compatible("monterey", "bigsur").
os_compatible("bigsur", "catalina").
% Ubuntu
os_compatible("ubuntu22.04", "ubuntu21.10").
os_compatible("ubuntu21.10", "ubuntu21.04").
os_compatible("ubuntu21.04", "ubuntu20.10").
os_compatible("ubuntu20.10", "ubuntu20.04").
os_compatible("ubuntu20.04", "ubuntu19.10").
os_compatible("ubuntu19.10", "ubuntu19.04").
os_compatible("ubuntu19.04", "ubuntu18.10").
os_compatible("ubuntu18.10", "ubuntu18.04").
% can't have dependencies on incompatible OS's
error(100, "{0} and dependency {1} have incompatible operating systems 'os={2}' and 'os={3}'", Package, Dependency, PackageNodeOS, DependencyOS)
:- depends_on(node(X, Package), node(Y, Dependency)),
attr("node_os", node(X, Package), PackageNodeOS),
attr("node_os", node(Y, Dependency), DependencyOS),
not os_compatible(PackageNodeOS, DependencyOS),
build(node(X, Package)).
%EL8
os_compatible("rhel8", "rocky8").
% We can select only operating systems compatible with the ones
% for which we can build software. We need a cardinality constraint
% since we might have more than one "buildable_os(OS)" fact.
:- not 1 { os_compatible(CurrentOS, ReusedOS) : buildable_os(CurrentOS) },
attr("node_os", Package, ReusedOS).


@@ -51,7 +51,6 @@
import collections
import collections.abc
import enum
import io
import itertools
import os
import pathlib
@@ -59,7 +58,7 @@
import re
import socket
import warnings
from typing import Any, Callable, Dict, List, Optional, Set, Tuple, Union
from typing import Any, Callable, Dict, List, Match, Optional, Set, Tuple, Union
import llnl.path
import llnl.string
@@ -121,36 +120,44 @@
"SpecDeprecatedError",
]
SPEC_FORMAT_RE = re.compile(
r"(?:" # this is one big or, with matches ordered by priority
# OPTION 1: escaped character (needs to be first to catch opening \{)
# Note that an unterminated \ at the end of a string is left untouched
r"(?:\\(.))"
r"|" # or
# OPTION 2: an actual format string
r"{" # non-escaped open brace {
r"([%@/]|arch=)?" # optional sigil (to print sigil in color)
r"(?:\^([^}\.]+)\.)?" # optional ^depname. (to get attr from dependency)
# after the sigil or depname, we can have a hash expression or another attribute
r"(?:" # one of
r"(hash\b)(?:\:(\d+))?" # hash followed by :<optional length>
r"|" # or
r"([^}]*)" # another attribute to format
r")" # end one of
r"(})?" # finish format string with non-escaped close brace }, or missing if not present
r"|"
# OPTION 3: mismatched close brace (option 2 would consume a matched open brace)
r"(})" # brace
r")",
re.IGNORECASE,
)
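# A hedged sketch of how the regex tokenizes a format string; the group order
# matches the unpacking in format_attribute() below (assumes a Spack checkout
# so spack.spec is importable):
import spack.spec

for m in spack.spec.SPEC_FORMAT_RE.finditer("{name}-{@version}-{/hash:7}"):
    esc, sig, dep, dag_hash, hash_len, attribute, close, unmatched = m.groups()
    print(sig, dag_hash or attribute, hash_len)
# prints, roughly: "None name None", then "@ version None", then "/ hash 7"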
#: Valid pattern for an identifier in Spack
IDENTIFIER_RE = r"\w[\w-]*"
# Coloring of specs when using color output. Fields are printed with
# different colors to enhance readability.
# See llnl.util.tty.color for descriptions of the color codes.
COMPILER_COLOR = "@g" #: color for highlighting compilers
VERSION_COLOR = "@c" #: color for highlighting versions
ARCHITECTURE_COLOR = "@m" #: color for highlighting architectures
ENABLED_VARIANT_COLOR = "@B" #: color for highlighting enabled variants
DISABLED_VARIANT_COLOR = "r" #: color for highlighting disabled variants
DEPENDENCY_COLOR = "@." #: color for highlighting dependencies
VARIANT_COLOR = "@B" #: color for highlighting variants
HASH_COLOR = "@K" #: color for highlighting package hashes
#: This map determines the coloring of specs when using color output.
#: We make the fields different colors to enhance readability.
#: See llnl.util.tty.color for descriptions of the color codes.
COLOR_FORMATS = {
"%": COMPILER_COLOR,
"@": VERSION_COLOR,
"=": ARCHITECTURE_COLOR,
"+": ENABLED_VARIANT_COLOR,
"~": DISABLED_VARIANT_COLOR,
"^": DEPENDENCY_COLOR,
"#": HASH_COLOR,
}
#: Regex used for splitting by spec field separators.
#: These need to be escaped to avoid metacharacters in
#: ``COLOR_FORMATS.keys()``.
_SEPARATORS = "[\\%s]" % "\\".join(COLOR_FORMATS.keys())
#: Default format for Spec.format(). This format can be round-tripped, so that:
#: Spec(Spec("string").format()) == Spec("string")
DEFAULT_FORMAT = (
@@ -193,26 +200,7 @@ class InstallStatus(enum.Enum):
missing = "@r{[-]} "
def colorize_spec(spec):
"""Returns a spec colorized according to the colors specified in
COLOR_FORMATS."""
class insert_color:
def __init__(self):
self.last = None
def __call__(self, match):
# ignore compiler versions (color same as compiler)
sep = match.group(0)
if self.last == "%" and sep == "@":
return clr.cescape(sep)
self.last = sep
return "%s%s" % (COLOR_FORMATS[sep], clr.cescape(sep))
return clr.colorize(re.sub(_SEPARATORS, insert_color(), str(spec)) + "@.")
# regexes used in spec formatting
OLD_STYLE_FMT_RE = re.compile(r"\${[A-Z]+}")
@@ -911,6 +899,9 @@ def flags():
yield flags
def __str__(self):
if not self:
return ""
sorted_items = sorted((k, v) for k, v in self.items() if v)
result = ""
@@ -3907,9 +3898,13 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
if not self._dependencies:
return False
# If we arrived here, then rhs is abstract. At the moment we don't care about the edge
# structure of an abstract DAG, so we check if any edge could satisfy the properties
# we ask for.
# If we arrived here, the lhs root node satisfies the rhs root node. Now we need to check
# all the edges that have an abstract parent, and verify that they match some edge in the
# lhs.
#
# It might happen that the rhs brings in concrete sub-DAGs. For those we don't need to
# verify the edge properties, cause everything is encoded in the hash of the nodes that
# will be verified later.
lhs_edges: Dict[str, Set[DependencySpec]] = collections.defaultdict(set)
for rhs_edge in other.traverse_edges(root=False, cover="edges"):
# If we are checking for ^mpi we need to verify if there is any edge
@@ -3919,6 +3914,10 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
if not rhs_edge.virtuals:
continue
# Skip edges from a concrete sub-DAG
if rhs_edge.parent.concrete:
continue
if not lhs_edges:
# Construct a map of the link/run subDAG + direct "build" edges,
# keyed by dependency name
@@ -4292,10 +4291,7 @@ def deps():
yield deps
def colorized(self):
return colorize_spec(self)
def format(self, format_string=DEFAULT_FORMAT, **kwargs):
def format(self, format_string: str = DEFAULT_FORMAT, color: Optional[bool] = False) -> str:
r"""Prints out particular pieces of a spec, depending on what is
in the format string.
@@ -4358,79 +4354,65 @@ def format(self, format_string=DEFAULT_FORMAT, **kwargs):
literal ``\`` character.
Args:
format_string (str): string containing the format to be expanded
Keyword Args:
color (bool): True if returned string is colored
transform (dict): maps full-string formats to a callable \
that accepts a string and returns another one
format_string: string containing the format to be expanded
color: True for colorized result; False for no color; None for auto color.
"""
ensure_modern_format_string(format_string)
color = kwargs.get("color", False)
transform = kwargs.get("transform", {})
out = io.StringIO()
def safe_color(sigil: str, string: str, color_fmt: Optional[str]) -> str:
# avoid colorizing if there is no color or the string is empty
if (color is False) or not color_fmt or not string:
return sigil + string
# escape and add the sigil here to avoid multiple concatenations
if sigil == "@":
sigil = "@@"
return clr.colorize(f"{color_fmt}{sigil}{clr.cescape(string)}@.", color=color)
def write(s, c=None):
f = clr.cescape(s)
if c is not None:
f = COLOR_FORMATS[c] + f + "@."
clr.cwrite(f, stream=out, color=color)
def format_attribute(match_object: Match) -> str:
(esc, sig, dep, hash, hash_len, attribute, close_brace, unmatched_close_brace) = (
match_object.groups()
)
if esc:
return esc
elif unmatched_close_brace:
raise SpecFormatStringError(f"Unmatched close brace: '{format_string}'")
elif not close_brace:
raise SpecFormatStringError(f"Missing close brace: '{format_string}'")
def write_attribute(spec, attribute, color):
attribute = attribute.lower()
current = self if dep is None else self[dep]
sig = ""
if attribute.startswith(("@", "%", "/")):
# color sigils that are inside braces
sig = attribute[0]
attribute = attribute[1:]
elif attribute.startswith("arch="):
sig = " arch=" # include space as separator
attribute = attribute[5:]
current = spec
if attribute.startswith("^"):
attribute = attribute[1:]
dep, attribute = attribute.split(".", 1)
current = self[dep]
# Hash attributes can return early.
# NOTE: we currently treat abstract_hash like an attribute and ignore
# any length associated with it. We may want to change that.
if hash:
if sig and sig != "/":
raise SpecFormatSigilError(sig, "DAG hashes", hash)
try:
length = int(hash_len) if hash_len else None
except ValueError:
raise SpecFormatStringError(f"Invalid hash length: '{hash_len}'")
return safe_color(sig or "", current.dag_hash(length), HASH_COLOR)
if attribute == "":
raise SpecFormatStringError("Format string attributes must be non-empty")
attribute = attribute.lower()
parts = attribute.split(".")
assert parts
# check that the sigil is valid for the attribute.
if sig == "@" and parts[-1] not in ("versions", "version"):
if not sig:
sig = ""
elif sig == "@" and parts[-1] not in ("versions", "version"):
raise SpecFormatSigilError(sig, "versions", attribute)
elif sig == "%" and attribute not in ("compiler", "compiler.name"):
raise SpecFormatSigilError(sig, "compilers", attribute)
elif sig == "/" and not re.match(r"(abstract_)?hash(:\d+)?$", attribute):
elif sig == "/" and attribute != "abstract_hash":
raise SpecFormatSigilError(sig, "DAG hashes", attribute)
elif sig == " arch=" and attribute not in ("architecture", "arch"):
raise SpecFormatSigilError(sig, "the architecture", attribute)
# find the morph function for our attribute
morph = transform.get(attribute, lambda s, x: x)
# Special cases for non-spec attributes and hashes.
# These must be the only non-dep component of the format attribute
if attribute == "spack_root":
write(morph(spec, spack.paths.spack_root))
return
elif attribute == "spack_install":
write(morph(spec, spack.store.STORE.layout.root))
return
elif re.match(r"hash(:\d)?", attribute):
col = "#"
if ":" in attribute:
_, length = attribute.split(":")
write(sig + morph(spec, current.dag_hash(int(length))), col)
else:
write(sig + morph(spec, current.dag_hash()), col)
return
elif sig == "arch=":
if attribute not in ("architecture", "arch"):
raise SpecFormatSigilError(sig, "the architecture", attribute)
sig = " arch=" # include space as separator
# Iterate over components using getattr to get next element
for idx, part in enumerate(parts):
@@ -4439,7 +4421,7 @@ def write_attribute(spec, attribute, color):
if part.startswith("_"):
raise SpecFormatStringError("Attempted to format private attribute")
else:
if isinstance(current, vt.VariantMap):
if part == "variants" and isinstance(current, vt.VariantMap):
# subscript instead of getattr for variant names
current = current[part]
else:
@@ -4463,62 +4445,31 @@ def write_attribute(spec, attribute, color):
raise SpecFormatStringError(m)
if isinstance(current, vn.VersionList):
if current == vn.any_version:
# We don't print empty version lists
return
# don't print empty version lists
return ""
if callable(current):
raise SpecFormatStringError("Attempted to format callable object")
if current is None:
# We're not printing anything
return
# not printing anything
return ""
# Set color codes for various attributes
col = None
color = None
if "variants" in parts:
col = "+"
color = VARIANT_COLOR
elif "architecture" in parts:
col = "="
color = ARCHITECTURE_COLOR
elif "compiler" in parts or "compiler_flags" in parts:
col = "%"
color = COMPILER_COLOR
elif "version" in parts or "versions" in parts:
col = "@"
color = VERSION_COLOR
# Finally, write the output
write(sig + morph(spec, str(current)), col)
# return colored output
return safe_color(sig, str(current), color)
attribute = ""
in_attribute = False
escape = False
for c in format_string:
if escape:
out.write(c)
escape = False
elif c == "\\":
escape = True
elif in_attribute:
if c == "}":
write_attribute(self, attribute, color)
attribute = ""
in_attribute = False
else:
attribute += c
else:
if c == "}":
raise SpecFormatStringError(
"Encountered closing } before opening { in %s" % format_string
)
elif c == "{":
in_attribute = True
else:
out.write(c)
if in_attribute:
raise SpecFormatStringError(
"Format string terminated while reading attribute." "Missing terminating }."
)
formatted_spec = out.getvalue()
return formatted_spec.strip()
return SPEC_FORMAT_RE.sub(format_attribute, format_string).strip()
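# A minimal usage sketch (hypothetical spec; assumes a Spack checkout):
import spack.spec

s = spack.spec.Spec("zlib@1.3.1%gcc@12.3.0")
print(s.format("{name}-{version}"))  # -> zlib-1.3.1
print(s.format("{name}{@version}{%compiler}"))  # -> zlib@1.3.1%gcc@12.3.0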
def cformat(self, *args, **kwargs):
"""Same as format, but color defaults to auto instead of False."""
@@ -4526,6 +4477,16 @@ def cformat(self, *args, **kwargs):
kwargs.setdefault("color", None)
return self.format(*args, **kwargs)
@property
def spack_root(self):
"""Special field for using ``{spack_root}`` in Spec.format()."""
return spack.paths.spack_root
@property
def spack_install(self):
"""Special field for using ``{spack_install}`` in Spec.format()."""
return spack.store.STORE.layout.root
def format_path(
# self, format_string: str, _path_ctor: Optional[pathlib.PurePath] = None
self,
@@ -4551,18 +4512,27 @@ def format_path(
path_ctor = _path_ctor or pathlib.PurePath
format_string_as_path = path_ctor(format_string)
if format_string_as_path.is_absolute():
if format_string_as_path.is_absolute() or (
# Paths that begin with a single "\" on windows are relative, but we still
# want to preserve the initial "\\" to be consistent with PureWindowsPath.
# Ensure that this '\' is not passed to polite_filename() so it's not converted to '_'
(os.name == "nt" or path_ctor == pathlib.PureWindowsPath)
and format_string_as_path.parts[0] == "\\"
):
output_path_components = [format_string_as_path.parts[0]]
input_path_components = list(format_string_as_path.parts[1:])
else:
output_path_components = []
input_path_components = list(format_string_as_path.parts)
output_path_components += [
fs.polite_filename(self.format(x)) for x in input_path_components
fs.polite_filename(self.format(part)) for part in input_path_components
]
return str(path_ctor(*output_path_components))
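# A minimal usage sketch (hypothetical spec; POSIX paths assumed): separators
# in the format string survive, while each formatted piece is sanitized.
import spack.spec

s = spack.spec.Spec("zlib@1.3.1")
print(s.format_path("{name}/{version}"))  # -> zlib/1.3.1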
def __str__(self):
if not self._dependencies:
return self.format()
root_str = [self.format()]
sorted_dependencies = sorted(
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)


@@ -390,11 +390,11 @@ def test_built_spec_cache(mirror_dir):
assert any([r["spec"] == s for r in results])
def fake_dag_hash(spec):
def fake_dag_hash(spec, length=None):
# Generate an arbitrary hash that is intended to be different than
# whatever a Spec reported before (to test actions that trigger when
# the hash changes)
return "tal4c7h4z0gqmixb1eqa92mjoybxn5l6"
return "tal4c7h4z0gqmixb1eqa92mjoybxn5l6"[:length]
@pytest.mark.usefixtures(


@@ -63,7 +63,8 @@ def build_environment(working_env):
os.environ["SPACK_LINKER_ARG"] = "-Wl,"
os.environ["SPACK_DTAGS_TO_ADD"] = "--disable-new-dtags"
os.environ["SPACK_DTAGS_TO_STRIP"] = "--enable-new-dtags"
os.environ["SPACK_SYSTEM_DIRS"] = "/usr/include /usr/lib"
os.environ["SPACK_SYSTEM_DIRS"] = "/usr/include|/usr/lib"
os.environ["SPACK_MANAGED_DIRS"] = f"{prefix}/opt/spack"
os.environ["SPACK_TARGET_ARGS"] = ""
if "SPACK_DEPENDENCIES" in os.environ:


@@ -15,7 +15,7 @@
import spack.config
import spack.spec
from spack.paths import build_env_path
from spack.util.environment import SYSTEM_DIRS, set_env
from spack.util.environment import SYSTEM_DIR_CASE_ENTRY, set_env
from spack.util.executable import Executable, ProcessError
#
@@ -127,7 +127,7 @@
spack_cflags = ["-Wall"]
spack_cxxflags = ["-Werror"]
spack_fflags = ["-w"]
spack_ldflags = ["-L", "foo"]
spack_ldflags = ["-Wl,--gc-sections", "-L", "foo"]
spack_ldlibs = ["-lfoo"]
lheaderpad = ["-Wl,-headerpad_max_install_names"]
@@ -159,7 +159,8 @@ def wrapper_environment(working_env):
SPACK_DEBUG_LOG_ID="foo-hashabc",
SPACK_COMPILER_SPEC="gcc@4.4.7",
SPACK_SHORT_SPEC="foo@1.2 arch=linux-rhel6-x86_64 /hashabc",
SPACK_SYSTEM_DIRS=":".join(SYSTEM_DIRS),
SPACK_SYSTEM_DIRS=SYSTEM_DIR_CASE_ENTRY,
SPACK_MANAGED_DIRS="/path/to/spack-1/opt/spack/*|/path/to/spack-2/opt/spack/*",
SPACK_CC_RPATH_ARG="-Wl,-rpath,",
SPACK_CXX_RPATH_ARG="-Wl,-rpath,",
SPACK_F77_RPATH_ARG="-Wl,-rpath,",
@@ -278,7 +279,6 @@ def test_ld_flags(wrapper_environment, wrapper_flags):
test_args,
["ld"]
+ test_include_paths
+ [spack_ldflags[i] + spack_ldflags[i + 1] for i in range(0, len(spack_ldflags), 2)]
+ test_library_paths
+ ["--disable-new-dtags"]
+ test_rpaths
@@ -306,13 +306,14 @@ def test_cc_flags(wrapper_environment, wrapper_flags):
[real_cc]
+ target_args
+ test_include_paths
+ [spack_ldflags[i] + spack_ldflags[i + 1] for i in range(0, len(spack_ldflags), 2)]
+ ["-Lfoo"]
+ test_library_paths
+ ["-Wl,--disable-new-dtags"]
+ test_wl_rpaths
+ test_args_without_paths
+ spack_cppflags
+ spack_cflags
+ ["-Wl,--gc-sections"]
+ spack_ldlibs,
)
@@ -324,12 +325,13 @@ def test_cxx_flags(wrapper_environment, wrapper_flags):
[real_cc]
+ target_args
+ test_include_paths
+ [spack_ldflags[i] + spack_ldflags[i + 1] for i in range(0, len(spack_ldflags), 2)]
+ ["-Lfoo"]
+ test_library_paths
+ ["-Wl,--disable-new-dtags"]
+ test_wl_rpaths
+ test_args_without_paths
+ spack_cppflags
+ ["-Wl,--gc-sections"]
+ spack_ldlibs,
)
@@ -341,13 +343,14 @@ def test_fc_flags(wrapper_environment, wrapper_flags):
[real_cc]
+ target_args
+ test_include_paths
+ [spack_ldflags[i] + spack_ldflags[i + 1] for i in range(0, len(spack_ldflags), 2)]
+ ["-Lfoo"]
+ test_library_paths
+ ["-Wl,--disable-new-dtags"]
+ test_wl_rpaths
+ test_args_without_paths
+ spack_fflags
+ spack_cppflags
+ ["-Wl,--gc-sections"]
+ spack_ldlibs,
)
@@ -907,3 +910,108 @@ def test_linker_strips_loopopt(wrapper_environment, wrapper_flags):
result = cc(*(test_args + ["-loopopt=0", "-c", "x.c"]), output=str)
result = result.strip().split("\n")
assert "-loopopt=0" in result
def test_spack_managed_dirs_are_prioritized(wrapper_environment):
# We have two different stores with 5 packages divided over them
pkg1 = "/path/to/spack-1/opt/spack/linux-ubuntu22.04-zen2/gcc-13.2.0/pkg-1.0-abcdef"
pkg2 = "/path/to/spack-1/opt/spack/linux-ubuntu22.04-zen2/gcc-13.2.0/pkg-2.0-abcdef"
pkg3 = "/path/to/spack-2/opt/spack/linux-ubuntu22.04-zen2/gcc-13.2.0/pkg-3.0-abcdef"
pkg4 = "/path/to/spack-2/opt/spack/linux-ubuntu22.04-zen2/gcc-13.2.0/pkg-4.0-abcdef"
pkg5 = "/path/to/spack-2/opt/spack/linux-ubuntu22.04-zen2/gcc-13.2.0/pkg-5.0-abcdef"
variables = {
# cppflags, ldflags from the command line, config or package.py take highest priority
"SPACK_CPPFLAGS": f"-I/usr/local/include -I/external-1/include -I{pkg1}/include",
"SPACK_LDFLAGS": f"-L/usr/local/lib -L/external-1/lib -L{pkg1}/lib "
f"-Wl,-rpath,/usr/local/lib -Wl,-rpath,/external-1/lib -Wl,-rpath,{pkg1}/lib",
# automatic -L, -Wl,-rpath, -I flags from dependencies -- on the spack side they are
# already partitioned into "spack owned prefixes" and "non-spack owned prefixes"
"SPACK_STORE_LINK_DIRS": f"{pkg4}/lib:{pkg5}/lib",
"SPACK_STORE_RPATH_DIRS": f"{pkg4}/lib:{pkg5}/lib",
"SPACK_STORE_INCLUDE_DIRS": f"{pkg4}/include:{pkg5}/include",
"SPACK_LINK_DIRS": "/external-3/lib:/external-4/lib",
"SPACK_RPATH_DIRS": "/external-3/lib:/external-4/lib",
"SPACK_INCLUDE_DIRS": "/external-3/include:/external-4/include",
}
with set_env(SPACK_TEST_COMMAND="dump-args", **variables):
effective_call = (
cc(
# system paths
"-I/usr/include",
"-L/usr/lib",
"-Wl,-rpath,/usr/lib",
# some other externals
"-I/external-2/include",
"-L/external-2/lib",
"-Wl,-rpath,/external-2/lib",
# relative paths are considered "spack managed" since they are in the stage dir
"-I..",
"-L..",
"-Wl,-rpath,..", # pathological but simpler for the test.
# spack store paths
f"-I{pkg2}/include",
f"-I{pkg3}/include",
f"-L{pkg2}/lib",
f"-L{pkg3}/lib",
f"-Wl,-rpath,{pkg2}/lib",
f"-Wl,-rpath,{pkg3}/lib",
"hello.c",
"-o",
"hello",
output=str,
)
.strip()
.split("\n")
)
dash_I = [flag[2:] for flag in effective_call if flag.startswith("-I")]
dash_L = [flag[2:] for flag in effective_call if flag.startswith("-L")]
dash_Wl_rpath = [flag[11:] for flag in effective_call if flag.startswith("-Wl,-rpath")]
assert dash_I == [
# spack owned dirs from SPACK_*FLAGS
f"{pkg1}/include",
# spack owned dirs from command line & automatic flags for deps (in that order)]
"..",
f"{pkg2}/include", # from command line
f"{pkg3}/include", # from command line
f"{pkg4}/include", # from SPACK_STORE_INCLUDE_DIRS
f"{pkg5}/include", # from SPACK_STORE_INCLUDE_DIRS
# non-system dirs from SPACK_*FLAGS
"/external-1/include",
# non-system dirs from command line & automatic flags for deps (in that order)
"/external-2/include", # from command line
"/external-3/include", # from SPACK_INCLUDE_DIRS
"/external-4/include", # from SPACK_INCLUDE_DIRS
# system dirs from SPACK_*FLAGS
"/usr/local/include",
# system dirs from command line
"/usr/include",
]
assert (
dash_L
== dash_Wl_rpath
== [
# spack owned dirs from SPACK_*FLAGS
f"{pkg1}/lib",
# spack owned dirs from command line & automatic flags for deps (in that order)
"..",
f"{pkg2}/lib", # from command line
f"{pkg3}/lib", # from command line
f"{pkg4}/lib", # from SPACK_STORE_LINK_DIRS
f"{pkg5}/lib", # from SPACK_STORE_LINK_DIRS
# non-system dirs from SPACK_*FLAGS
"/external-1/lib",
# non-system dirs from command line & automatic flags for deps (in that order)
"/external-2/lib", # from command line
"/external-3/lib", # from SPACK_LINK_DIRS
"/external-4/lib", # from SPACK_LINK_DIRS
# system dirs from SPACK_*FLAGS
"/usr/local/lib",
# system dirs from command line
"/usr/lib",
]
)


@@ -446,3 +446,10 @@ def test_push_and_install_with_mirror_marked_unsigned_does_not_require_extra_fla
spec.package.do_uninstall(force=True)
spec.package.do_install(**kwargs)
def test_skip_no_redistribute(mock_packages, config):
specs = list(Spec("no-redistribute-dependent").concretized().traverse())
filtered = spack.cmd.buildcache._skip_no_redistribute_for_public(specs)
assert not any(s.name == "no-redistribute" for s in filtered)
assert any(s.name == "no-redistribute-dependent" for s in filtered)


@@ -117,13 +117,13 @@ def test_specs_staging(config, tmpdir):
with repo.use_repositories(builder.root):
spec_a = Spec("a").concretized()
spec_a_label = ci._spec_deps_key(spec_a)
spec_b_label = ci._spec_deps_key(spec_a["b"])
spec_c_label = ci._spec_deps_key(spec_a["c"])
spec_d_label = ci._spec_deps_key(spec_a["d"])
spec_e_label = ci._spec_deps_key(spec_a["e"])
spec_f_label = ci._spec_deps_key(spec_a["f"])
spec_g_label = ci._spec_deps_key(spec_a["g"])
spec_a_label = ci._spec_ci_label(spec_a)
spec_b_label = ci._spec_ci_label(spec_a["b"])
spec_c_label = ci._spec_ci_label(spec_a["c"])
spec_d_label = ci._spec_ci_label(spec_a["d"])
spec_e_label = ci._spec_ci_label(spec_a["e"])
spec_f_label = ci._spec_ci_label(spec_a["f"])
spec_g_label = ci._spec_ci_label(spec_a["g"])
spec_labels, dependencies, stages = ci.stage_spec_jobs([spec_a])


@@ -123,17 +123,24 @@ def test_root_and_dep_match_returns_root(mock_packages, mutable_mock_env_path):
@pytest.mark.parametrize(
"arg,config", [("--reuse", True), ("--fresh", False), ("--reuse-deps", "dependencies")]
"arg,conf",
[
("--reuse", True),
("--fresh", False),
("--reuse-deps", "dependencies"),
("--fresh-roots", "dependencies"),
],
)
def test_concretizer_arguments(mutable_config, mock_packages, arg, config):
def test_concretizer_arguments(mutable_config, mock_packages, arg, conf):
"""Ensure that ConfigSetAction is doing the right thing."""
spec = spack.main.SpackCommand("spec")
assert spack.config.get("concretizer:reuse", None) is None
assert spack.config.get("concretizer:reuse", None, scope="command_line") is None
spec(arg, "zlib")
assert spack.config.get("concretizer:reuse", None) == config
assert spack.config.get("concretizer:reuse", None) == conf
assert spack.config.get("concretizer:reuse", None, scope="command_line") == conf
def test_use_buildcache_type():


@@ -858,8 +858,7 @@ def test_with_config_bad_include_activate(environment_from_manifest, tmpdir):
"""
)
e = ev.Environment(env_root)
with e:
with ev.Environment(env_root) as e:
e.concretize()
# we've created an environment with some included config files (which do
@@ -869,7 +868,7 @@ def test_with_config_bad_include_activate(environment_from_manifest, tmpdir):
os.remove(abs_include_path)
os.remove(include1)
with pytest.raises(spack.config.ConfigFileError) as exc:
ev.activate(e)
ev.activate(ev.Environment(env_root))
err = exc.value.message
assert "missing include" in err
@@ -1063,8 +1062,7 @@ def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutab
"""
)
e = ev.Environment(tmp_path)
with e:
with ev.Environment(tmp_path):
config("change", "packages:mpich:require:~debug")
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
spack.spec.Spec("mpich+debug").concretized()
@@ -1081,7 +1079,7 @@ def test_config_change_new(mutable_mock_env_path, tmp_path, mock_packages, mutab
require: "@3.0.3"
"""
)
with e:
with ev.Environment(tmp_path):
assert spack.spec.Spec("mpich").concretized().satisfies("@3.0.3")
with pytest.raises(spack.config.ConfigError, match="not a list"):
config("change", "packages:mpich:require:~debug")


@@ -88,6 +88,7 @@ def __init__(
exclude_file=None,
exclude_specs=None,
directory=None,
private=False,
):
self.specs = specs or []
self.all = all
@@ -96,6 +97,7 @@ def __init__(
self.dependencies = dependencies
self.exclude_file = exclude_file
self.exclude_specs = exclude_specs
self.private = private
self.directory = directory
@@ -104,7 +106,7 @@ def test_exclude_specs(mock_packages, config):
specs=["mpich"], versions_per_spec="all", exclude_specs="mpich@3.0.1:3.0.2 mpich@1.0"
)
mirror_specs = spack.cmd.mirror.concrete_specs_from_user(args)
mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
expected_include = set(
spack.spec.Spec(x).concretized() for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
)
@@ -113,6 +115,19 @@ def test_exclude_specs(mock_packages, config):
assert not any(spec.satisfies(y) for spec in mirror_specs for y in expected_exclude)
def test_exclude_specs_public_mirror(mock_packages, config):
args = MockMirrorArgs(
specs=["no-redistribute-dependent"],
versions_per_spec="all",
dependencies=True,
private=False,
)
mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
assert not any(s.name == "no-redistribute" for s in mirror_specs)
assert any(s.name == "no-redistribute-dependent" for s in mirror_specs)
def test_exclude_file(mock_packages, tmpdir, config):
exclude_path = os.path.join(str(tmpdir), "test-exclude.txt")
with open(exclude_path, "w") as exclude_file:
@@ -125,7 +140,7 @@ def test_exclude_file(mock_packages, tmpdir, config):
args = MockMirrorArgs(specs=["mpich"], versions_per_spec="all", exclude_file=exclude_path)
mirror_specs = spack.cmd.mirror.concrete_specs_from_user(args)
mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
expected_include = set(
spack.spec.Spec(x).concretized() for x in ["mpich@3.0.3", "mpich@3.0.4", "mpich@3.0"]
)
@@ -262,11 +277,9 @@ def test_mirror_destroy(
class TestMirrorCreate:
@pytest.mark.regression("31736", "31985")
def test_all_specs_with_all_versions_dont_concretize(self):
args = MockMirrorArgs(exclude_file=None, exclude_specs=None)
specs = spack.cmd.mirror.all_specs_with_all_versions(
selection_fn=spack.cmd.mirror.not_excluded_fn(args)
)
assert all(not s.concrete for s in specs)
args = MockMirrorArgs(all=True, exclude_file=None, exclude_specs=None)
mirror_specs, _ = spack.cmd.mirror._specs_and_action(args)
assert all(not s.concrete for s in mirror_specs)
@pytest.mark.parametrize(
"cli_args,error_str",
@@ -324,8 +337,8 @@ def test_error_conditions(self, cli_args, error_str):
],
)
def test_exclude_specs_from_user(self, cli_args, not_expected, config):
specs = spack.cmd.mirror.concrete_specs_from_user(MockMirrorArgs(**cli_args))
assert not any(s.satisfies(y) for s in specs for y in not_expected)
mirror_specs, _ = spack.cmd.mirror._specs_and_action(MockMirrorArgs(**cli_args))
assert not any(s.satisfies(y) for s in mirror_specs for y in not_expected)
@pytest.mark.parametrize("abstract_specs", [("bowtie", "callpath")])
def test_specs_from_cli_are_the_same_as_from_file(self, abstract_specs, config, tmpdir):


@@ -14,6 +14,7 @@
import spack.compilers
import spack.spec
import spack.util.environment
import spack.util.module_cmd
from spack.compiler import Compiler
from spack.util.executable import Executable, ProcessError
@@ -137,14 +138,6 @@ def __init__(self):
environment={},
)
def _get_compiler_link_paths(self):
# Mock os.path.isdir so the link paths don't have to exist
old_isdir = os.path.isdir
os.path.isdir = lambda x: True
ret = super()._get_compiler_link_paths()
os.path.isdir = old_isdir
return ret
@property
def name(self):
return "mockcompiler"
@@ -162,34 +155,25 @@ def verbose_flag(self):
required_libs = ["libgfortran"]
def test_implicit_rpaths(dirs_with_libfiles, monkeypatch):
@pytest.mark.not_on_windows("Not supported on Windows (yet)")
def test_implicit_rpaths(dirs_with_libfiles):
lib_to_dirs, all_dirs = dirs_with_libfiles
def try_all_dirs(*args):
return all_dirs
monkeypatch.setattr(MockCompiler, "_get_compiler_link_paths", try_all_dirs)
expected_rpaths = set(lib_to_dirs["libstdc++"] + lib_to_dirs["libgfortran"])
compiler = MockCompiler()
compiler._compile_c_source_output = "ld " + " ".join(f"-L{d}" for d in all_dirs)
retrieved_rpaths = compiler.implicit_rpaths()
assert set(retrieved_rpaths) == expected_rpaths
assert set(retrieved_rpaths) == set(lib_to_dirs["libstdc++"] + lib_to_dirs["libgfortran"])
no_flag_dirs = ["/path/to/first/lib", "/path/to/second/lib64"]
no_flag_output = "ld -L%s -L%s" % tuple(no_flag_dirs)
flag_dirs = ["/path/to/first/with/flag/lib", "/path/to/second/lib64"]
flag_output = "ld -L%s -L%s" % tuple(flag_dirs)
without_flag_output = "ld -L/path/to/first/lib -L/path/to/second/lib64"
with_flag_output = "ld -L/path/to/first/with/flag/lib -L/path/to/second/lib64"
def call_compiler(exe, *args, **kwargs):
# This method can replace Executable.__call__ to emulate a compiler that
# changes libraries depending on a flag.
if "--correct-flag" in exe.exe:
return flag_output
return no_flag_output
return with_flag_output
return without_flag_output
@pytest.mark.not_on_windows("Not supported on Windows (yet)")
@@ -203,8 +187,8 @@ def call_compiler(exe, *args, **kwargs):
("cc", "cppflags"),
],
)
@pytest.mark.enable_compiler_link_paths
def test_get_compiler_link_paths(monkeypatch, exe, flagname):
@pytest.mark.enable_compiler_execution
def test_compile_dummy_c_source_adds_flags(monkeypatch, exe, flagname):
# create fake compiler that emits mock verbose output
compiler = MockCompiler()
monkeypatch.setattr(Executable, "__call__", call_compiler)
@@ -221,40 +205,38 @@ def test_get_compiler_link_paths(monkeypatch, exe, flagname):
assert False
# Test without flags
assert compiler._get_compiler_link_paths() == no_flag_dirs
assert compiler._compile_dummy_c_source() == without_flag_output
if flagname:
# set flags and test
compiler.flags = {flagname: ["--correct-flag"]}
assert compiler._get_compiler_link_paths() == flag_dirs
assert compiler._compile_dummy_c_source() == with_flag_output
def test_get_compiler_link_paths_no_path():
@pytest.mark.enable_compiler_execution
def test_compile_dummy_c_source_no_path():
compiler = MockCompiler()
compiler.cc = None
compiler.cxx = None
compiler.f77 = None
compiler.fc = None
assert compiler._get_compiler_link_paths() == []
assert compiler._compile_dummy_c_source() is None
def test_get_compiler_link_paths_no_verbose_flag():
@pytest.mark.enable_compiler_execution
def test_compile_dummy_c_source_no_verbose_flag():
compiler = MockCompiler()
compiler._verbose_flag = None
assert compiler._get_compiler_link_paths() == []
assert compiler._compile_dummy_c_source() is None
@pytest.mark.not_on_windows("Not supported on Windows (yet)")
@pytest.mark.enable_compiler_link_paths
def test_get_compiler_link_paths_load_env(working_env, monkeypatch, tmpdir):
@pytest.mark.enable_compiler_execution
def test_compile_dummy_c_source_load_env(working_env, monkeypatch, tmpdir):
gcc = str(tmpdir.join("gcc"))
with open(gcc, "w") as f:
f.write(
"""#!/bin/sh
f"""#!/bin/sh
if [ "$ENV_SET" = "1" ] && [ "$MODULE_LOADED" = "1" ]; then
echo '"""
+ no_flag_output
+ """'
printf '{without_flag_output}'
fi
"""
)
@@ -274,7 +256,7 @@ def module(*args):
compiler.environment = {"set": {"ENV_SET": "1"}}
compiler.modules = ["turn_on"]
assert compiler._get_compiler_link_paths() == no_flag_dirs
assert compiler._compile_dummy_c_source() == without_flag_output
# Get the desired flag from the specified compiler spec.
@@ -912,3 +894,66 @@ def prepare_executable(name):
# Test that null entries don't fail
compiler.cc = None
compiler.verify_executables()
@pytest.mark.parametrize(
"detected_versions,expected_length",
[
# If we detect a C compiler we expect the result to be valid
(
[
spack.compilers.DetectVersionArgs(
id=spack.compilers.CompilerID(
os="ubuntu20.04", compiler_name="clang", version="12.0.0"
),
variation=spack.compilers.NameVariation(prefix="", suffix="-12"),
language="cc",
path="/usr/bin/clang-12",
),
spack.compilers.DetectVersionArgs(
id=spack.compilers.CompilerID(
os="ubuntu20.04", compiler_name="clang", version="12.0.0"
),
variation=spack.compilers.NameVariation(prefix="", suffix="-12"),
language="cxx",
path="/usr/bin/clang++-12",
),
],
1,
),
# If we detect only a C++ compiler we expect the result to be discarded
(
[
spack.compilers.DetectVersionArgs(
id=spack.compilers.CompilerID(
os="ubuntu20.04", compiler_name="clang", version="12.0.0"
),
variation=spack.compilers.NameVariation(prefix="", suffix="-12"),
language="cxx",
path="/usr/bin/clang++-12",
)
],
0,
),
],
)
def test_detection_requires_c_compiler(detected_versions, expected_length):
"""Tests that compilers automatically added to the configuration have
at least a C compiler.
"""
result = spack.compilers.make_compiler_list(detected_versions)
assert len(result) == expected_length
def test_compiler_environment(working_env):
"""Test whether environment modifications from compilers are applied in compiler_environment"""
os.environ.pop("TEST", None)
compiler = Compiler(
"gcc@=13.2.0",
operating_system="ubuntu20.04",
target="x86_64",
paths=["/test/bin/gcc", "/test/bin/g++"],
environment={"set": {"TEST": "yes"}},
)
with compiler.compiler_environment():
assert os.environ["TEST"] == "yes"


@@ -13,6 +13,7 @@
import llnl.util.lang
import spack.compiler
import spack.compilers
import spack.concretize
import spack.config
@@ -23,6 +24,7 @@
import spack.platforms
import spack.repo
import spack.solver.asp
import spack.util.libc
import spack.variant as vt
from spack.concretize import find_spec
from spack.spec import CompilerSpec, Spec
@@ -67,6 +69,24 @@ def check_concretize(abstract_spec):
return concrete
@pytest.fixture(scope="function", autouse=True)
def binary_compatibility(monkeypatch, request):
"""Selects whether we use OS compatibility for binaries, or libc compatibility."""
if spack.platforms.real_host().name != "linux":
return
if "mock_packages" not in request.fixturenames:
# Only builtin.mock has a mock glibc package
return
if "database" in request.fixturenames or "mutable_database" in request.fixturenames:
# Databases have been created without glibc support
return
monkeypatch.setattr(spack.solver.asp, "using_libc_compatibility", lambda: True)
monkeypatch.setattr(spack.compiler.Compiler, "default_libc", Spec("glibc@=2.28"))
@pytest.fixture(
params=[
# no_deps
@@ -1327,6 +1347,9 @@ def mock_fn(*args, **kwargs):
def test_reuse_installed_packages_when_package_def_changes(
self, context, mutable_database, repo_with_changing_recipe
):
# this test applies only with reuse turned off in the concretizer
spack.config.set("concretizer:reuse", False)
# Install a spec
root = Spec("root").concretized()
dependency = root["changing"].copy()
@@ -1350,6 +1373,22 @@ def test_reuse_installed_packages_when_package_def_changes(
# Structure and package hash will be different without reuse
assert root.dag_hash() != new_root_without_reuse.dag_hash()
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
@pytest.mark.regression("43663")
def test_no_reuse_when_variant_condition_does_not_hold(self, mutable_database, mock_packages):
spack.config.set("concretizer:reuse", True)
# Install a spec for which the `version_based` variant condition does not hold
old = Spec("conditional-variant-pkg @1").concretized()
old.package.do_install(fake=True, explicit=True)
# Then explicitly require a spec with `+version_based`, which shouldn't reuse the previous spec
new1 = Spec("conditional-variant-pkg +version_based").concretized()
assert new1.satisfies("@2 +version_based")
new2 = Spec("conditional-variant-pkg +two_whens").concretized()
assert new2.satisfies("@2 +two_whens +version_based")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_reuse_with_flags(self, mutable_database, mutable_config):
spack.config.set("concretizer:reuse", True)
@@ -1436,6 +1475,8 @@ def test_os_selection_when_multiple_choices_are_possible(
):
s = Spec(spec_str).concretized()
for node in s.traverse():
if node.name == "glibc":
continue
assert node.satisfies(expected_os)
@pytest.mark.regression("22718")
@@ -1748,7 +1789,8 @@ def test_best_effort_coconcretize(self, specs, expected):
for s in result.specs:
concrete_specs.update(s.traverse())
assert len(concrete_specs) == expected
libc_offset = 1 if spack.solver.asp.using_libc_compatibility() else 0
assert len(concrete_specs) == expected + libc_offset
@pytest.mark.parametrize(
"specs,expected_spec,occurances",
@@ -1868,29 +1910,16 @@ def test_version_weight_and_provenance(self):
result_spec = result.specs[0]
num_specs = len(list(result_spec.traverse()))
libc_offset = 1 if spack.solver.asp.using_libc_compatibility() else 0
criteria = [
(num_specs - 1, None, "number of packages to build (vs. reuse)"),
(num_specs - 1 - libc_offset, None, "number of packages to build (vs. reuse)"),
(2, 0, "version badness"),
]
for criterion in criteria:
assert criterion in result.criteria
assert criterion in result.criteria, result_spec
assert result_spec.satisfies("^b@1.0")
@pytest.mark.regression("31169")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_not_reusing_incompatible_os(self):
root_spec = Spec("b")
s = root_spec.concretized()
wrong_os = s.copy()
wrong_os.architecture = spack.spec.ArchSpec("test-ubuntu2204-x86_64")
with spack.config.override("concretizer:reuse", True):
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, [root_spec], reuse=[wrong_os])
concrete_spec = result.specs[0]
assert concrete_spec.satisfies("os={}".format(s.architecture.os))
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_reuse_succeeds_with_config_compatible_os(self):
root_spec = Spec("b")
@@ -2095,7 +2124,11 @@ def test_external_python_extension_find_dependency_from_installed(self, monkeypa
# install python external
python = Spec("python").concretized()
monkeypatch.setattr(spack.store.STORE.db, "query", lambda x: [python])
def query(*args, **kwargs):
return [python]
monkeypatch.setattr(spack.store.STORE.db, "query", query)
# ensure that we can't fake this by getting it from config
external_conf.pop("python")
@@ -2382,6 +2415,53 @@ def test_reuse_specs_from_non_available_compilers(self, mutable_config, mutable_
for s in root.traverse(root=False):
assert s.satisfies("%gcc@10.2.1")
@pytest.mark.regression("43406")
def test_externals_with_platform_explicitly_set(self, tmp_path):
"""Tests that users can specify platform=xxx in an external spec"""
external_conf = {
"mpich": {
"buildable": False,
"externals": [{"spec": "mpich@=2.0.0 platform=test", "prefix": str(tmp_path)}],
}
}
spack.config.set("packages", external_conf)
s = Spec("mpich").concretized()
assert s.external
@pytest.mark.regression("43875")
def test_concretize_missing_compiler(self, mutable_config, monkeypatch):
"""Tests that Spack can concretize a spec with a missing compiler when the
option is active.
"""
def _default_libc(self):
if self.cc is None:
return None
return Spec("glibc@=2.28")
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
monkeypatch.setattr(spack.compiler.Compiler, "default_libc", property(_default_libc))
monkeypatch.setattr(
spack.util.libc, "libc_from_current_python_process", lambda: Spec("glibc@=2.28")
)
mutable_config.set("config:install_missing_compilers", True)
s = Spec("a %gcc@=13.2.0").concretized()
assert s.satisfies("%gcc@13.2.0")
@pytest.mark.regression("43267")
def test_spec_with_build_dep_from_json(self, tmp_path):
"""Tests that we can correctly concretize a spec, when we express its dependency as a
concrete spec to be read from JSON.
The bug was triggered by missing virtuals on edges that were trimmed from pure build
dependencies.
"""
build_dep = Spec("dttop").concretized()
json_file = tmp_path / "build.json"
json_file.write_text(build_dep.to_json())
s = Spec(f"dtuse ^{str(json_file)}").concretized()
assert s["dttop"].dag_hash() == build_dep.dag_hash()
@pytest.fixture()
def duplicates_test_repository():
@@ -2520,6 +2600,29 @@ def test_no_multiple_solutions_with_different_edges_same_nodes(self):
assert len(edges) == 1
assert edges[0].spec.satisfies("@=60")
@pytest.mark.regression("43647")
def test_specifying_different_versions_build_deps(self):
"""Tests that we can concretize a spec with nodes using the same build
dependency pinned at different versions, when the constraint is specified
in the root spec.
o hdf5@1.0
|\
o | pinned-gmake@1.0
o | gmake@3.0
/
o gmake@4.1
"""
hdf5_str = "hdf5@1.0 ^gmake@4.1"
pinned_str = "pinned-gmake@1.0 ^gmake@3.0"
input_specs = [Spec(hdf5_str), Spec(pinned_str)]
solver = spack.solver.asp.Solver()
result = solver.solve(input_specs)
assert any(x.satisfies(hdf5_str) for x in result.specs)
assert any(x.satisfies(pinned_str) for x in result.specs)
@pytest.mark.parametrize(
"v_str,v_opts,checksummed",
@@ -2734,3 +2837,78 @@ def test_concretization_version_order():
Version("develop"), # likely development version
Version("2.0"), # deprecated
]
@pytest.mark.only_clingo("Original concretizer cannot reuse specs")
@pytest.mark.parametrize(
"roots,reuse_yaml,expected,not_expected,expected_length",
[
(
["mpileaks"],
{"roots": True, "include": ["^mpich"]},
["^mpich"],
["^mpich2", "^zmpi"],
2,
),
(
["mpileaks"],
{"roots": True, "include": ["externaltest"]},
["externaltest"],
["^mpich", "^mpich2", "^zmpi"],
1,
),
],
)
@pytest.mark.usefixtures("database", "mock_store")
@pytest.mark.not_on_windows("Expected length is different on Windows")
def test_filtering_reused_specs(
roots, reuse_yaml, expected, not_expected, expected_length, mutable_config, monkeypatch
):
"""Tests that we can select which specs are to be reused, using constraints as filters"""
# Assume all specs have a runtime dependency
monkeypatch.setattr(spack.solver.asp, "_has_runtime_dependencies", lambda x: True)
mutable_config.set("concretizer:reuse", reuse_yaml)
selector = spack.solver.asp.ReusableSpecsSelector(mutable_config)
specs = selector.reusable_specs(roots)
assert len(specs) == expected_length
for constraint in expected:
assert all(x.satisfies(constraint) for x in specs)
for constraint in not_expected:
assert all(not x.satisfies(constraint) for x in specs)
@pytest.mark.usefixtures("database", "mock_store")
@pytest.mark.parametrize(
"reuse_yaml,expected_length",
[({"from": [{"type": "local"}]}, 17), ({"from": [{"type": "buildcache"}]}, 0)],
)
@pytest.mark.not_on_windows("Expected length is different on Windows")
def test_selecting_reused_sources(reuse_yaml, expected_length, mutable_config, monkeypatch):
"""Tests that we can turn on/off sources of reusable specs"""
# Assume all specs have a runtime dependency
monkeypatch.setattr(spack.solver.asp, "_has_runtime_dependencies", lambda x: True)
mutable_config.set("concretizer:reuse", reuse_yaml)
selector = spack.solver.asp.ReusableSpecsSelector(mutable_config)
specs = selector.reusable_specs(["mpileaks"])
assert len(specs) == expected_length
@pytest.mark.parametrize(
"specs,include,exclude,expected",
[
# "foo" discarded by include rules (everything compiled with GCC)
(["cmake@3.27.9 %gcc", "foo %clang"], ["%gcc"], [], ["cmake@3.27.9 %gcc"]),
# "cmake" discarded by exclude rules (everything compiled with GCC but cmake)
(["cmake@3.27.9 %gcc", "foo %gcc"], ["%gcc"], ["cmake"], ["foo %gcc"]),
],
)
def test_spec_filters(specs, include, exclude, expected):
specs = [Spec(x) for x in specs]
expected = [Spec(x) for x in expected]
f = spack.solver.asp.SpecFilter(
factory=lambda: specs, is_usable=lambda x: True, include=include, exclude=exclude
)
assert f.selected_specs() == expected


@@ -1276,7 +1276,7 @@ def test_user_config_path_is_default_when_env_var_is_empty(working_env):
def test_default_install_tree(monkeypatch, default_config):
s = spack.spec.Spec("nonexistent@x.y.z %none@a.b.c arch=foo-bar-baz")
monkeypatch.setattr(s, "dag_hash", lambda: "abc123")
monkeypatch.setattr(s, "dag_hash", lambda length: "abc123")
_, _, projections = spack.store.parse_install_tree(spack.config.get("config"))
assert s.format(projections["all"]) == "foo-bar-baz/none-a.b.c/nonexistent-x.y.z-abc123"


@@ -34,6 +34,7 @@
import spack.binary_distribution
import spack.caches
import spack.cmd.buildcache
import spack.compiler
import spack.compilers
import spack.config
import spack.database
@@ -55,6 +56,7 @@
import spack.util.gpg
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.version
from spack.fetch_strategy import URLFetchStrategy
from spack.util.pattern import Bunch
@@ -268,10 +270,6 @@ def clean_test_environment():
ev.deactivate()
def _verify_executables_noop(*args):
return None
def _host():
"""Mock archspec host so there is no inconsistency on the Windows platform
This function cannot be local as it needs to be pickleable"""
@@ -297,9 +295,7 @@ def mock_compiler_executable_verification(request, monkeypatch):
If a test is marked in that way this is a no-op."""
if "enable_compiler_verification" not in request.keywords:
monkeypatch.setattr(
spack.compiler.Compiler, "verify_executables", _verify_executables_noop
)
monkeypatch.setattr(spack.compiler.Compiler, "verify_executables", _return_none)
# Hooks to add command line options or set other custom behaviors.
@@ -767,6 +763,28 @@ def mutable_empty_config(tmpdir_factory, configuration_dir):
yield cfg
# From https://github.com/pytest-dev/pytest/issues/363#issuecomment-1335631998
# Current suggested implementation from that issue, compatible with pytest >= 6.2;
# this may be subject to change as new versions of pytest are released
# and the suggested solution is updated
@pytest.fixture(scope="session")
def monkeypatch_session():
with pytest.MonkeyPatch.context() as monkeypatch:
yield monkeypatch
@pytest.fixture(scope="session", autouse=True)
def mock_wsdk_externals(monkeypatch_session):
"""Skip check for required external packages on Windows during testing
Note: In general this should cover this behavior for all tests,
however any session scoped fixture involving concretization should
include this fixture
"""
monkeypatch_session.setattr(
spack.bootstrap.core, "ensure_winsdk_external_or_raise", _return_none
)
@pytest.fixture(scope="function")
def concretize_scope(mutable_config, tmpdir):
"""Adds a scope for concretization preferences"""
@@ -846,7 +864,13 @@ def _store_dir_and_cache(tmpdir_factory):
@pytest.fixture(scope="session")
def mock_store(tmpdir_factory, mock_repo_path, mock_configuration_scopes, _store_dir_and_cache):
def mock_store(
tmpdir_factory,
mock_wsdk_externals,
mock_repo_path,
mock_configuration_scopes,
_store_dir_and_cache,
):
"""Creates a read-only mock database with some packages installed note
that the ref count for dyninst here will be 3, as it's recycled
across each install.
@@ -933,26 +957,16 @@ def dirs_with_libfiles(tmpdir_factory):
yield lib_to_dirs, all_dirs
def _compiler_link_paths_noop(*args):
return []
def _return_none(*args):
return None
@pytest.fixture(scope="function", autouse=True)
def disable_compiler_execution(monkeypatch, request):
"""
This fixture can be disabled for tests of the compiler link path
functionality by::
@pytest.mark.enable_compiler_link_paths
If a test is marked in that way this is a no-op."""
if "enable_compiler_link_paths" not in request.keywords:
# Compiler.determine_implicit_rpaths actually runs the compiler. So
# replace that function with a noop that simulates finding no implicit
# RPATHs
monkeypatch.setattr(
spack.compiler.Compiler, "_get_compiler_link_paths", _compiler_link_paths_noop
)
"""Disable compiler execution to determine implicit link paths and libc flavor and version.
To re-enable, use `@pytest.mark.enable_compiler_execution`"""
if "enable_compiler_execution" not in request.keywords:
monkeypatch.setattr(spack.compiler.Compiler, "_compile_dummy_c_source", _return_none)
@pytest.fixture(scope="function")
@@ -1440,6 +1454,15 @@ def mock_git_repository(git, tmpdir_factory):
yield t
@pytest.fixture(scope="function")
def mock_git_test_package(mock_git_repository, mutable_mock_repo, monkeypatch):
# install a fake git version in the package class
pkg_class = spack.repo.PATH.get_pkg_class("git-test")
monkeypatch.delattr(pkg_class, "git")
monkeypatch.setitem(pkg_class.versions, spack.version.Version("git"), mock_git_repository.url)
return pkg_class
@pytest.fixture(scope="session")
def mock_hg_repository(tmpdir_factory):
"""Creates a very simple hg repository with two commits."""
@@ -1971,17 +1994,24 @@ def mock_modules_root(tmp_path, monkeypatch):
monkeypatch.setattr(spack.modules.common, "root_path", fn)
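# counter used below to give each generated test repo a unique namespace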
_repo_name_id = 0
def create_test_repo(tmpdir, pkg_name_content_tuples):
global _repo_name_id
repo_path = str(tmpdir)
repo_yaml = tmpdir.join("repo.yaml")
with open(str(repo_yaml), "w") as f:
f.write(
"""\
f"""\
repo:
namespace: testcfgrequirements
namespace: testrepo{str(_repo_name_id)}
"""
)
_repo_name_id += 1
packages_dir = tmpdir.join("packages")
for pkg_name, pkg_str in pkg_name_content_tuples:
pkg_dir = packages_dir.ensure(pkg_name, dir=True)


@@ -1,5 +1,5 @@
concretizer:
# reuse is missing on purpose, see "test_concretizer_arguments"
reuse: True
targets:
granularity: microarchitectures
host_compatible: false


@@ -1106,3 +1106,31 @@ def test_database_construction_doesnt_use_globals(tmpdir, config, nullify_global
lock_cfg = lock_cfg or spack.database.lock_configuration(config)
db = spack.database.Database(str(tmpdir), lock_cfg=lock_cfg)
assert os.path.exists(db.database_directory)
def test_database_read_works_with_trailing_data(tmp_path, default_mock_concretization):
# Populate a database
root = str(tmp_path)
db = spack.database.Database(root)
spec = default_mock_concretization("a")
db.add(spec, directory_layout=None)
specs_in_db = db.query_local()
assert spec in specs_in_db
# Append anything to the end of the database file
with open(db._index_path, "a") as f:
f.write(json.dumps({"hello": "world"}))
# Read the database and check that it ignores the trailing data
assert spack.database.Database(root).query_local() == specs_in_db
def test_database_errors_with_just_a_version_key(tmp_path):
root = str(tmp_path)
db = spack.database.Database(root)
next_version = f"{spack.database._DB_VERSION}.next"
with open(db._index_path, "w") as f:
f.write(json.dumps({"database": {"version": next_version}}))
with pytest.raises(spack.database.InvalidDatabaseVersionError):
spack.database.Database(root).query_local()


@@ -10,6 +10,7 @@
import spack.repo
import spack.spec
import spack.version
from spack.test.conftest import create_test_repo
def test_false_directives_do_not_exist(mock_packages):
@@ -142,3 +143,86 @@ def test_version_type_validation():
# Try passing a bogus type; it's just that we want a nice error message
with pytest.raises(spack.version.VersionError, match=msg):
spack.directives._execute_version(package(name="python"), {})
_pkgx = (
"x",
"""\
class X(Package):
version("1.3")
version("1.2")
version("1.1")
version("1.0")
variant("foo", default=False)
redistribute(binary=False, when="@1.1")
redistribute(binary=False, when="@1.0:1.2+foo")
redistribute(source=False, when="@1.0:1.2")
""",
)
_pkgy = (
"y",
"""\
class Y(Package):
version("2.1")
version("2.0")
variant("bar", default=False)
redistribute(binary=False, source=False)
""",
)
@pytest.fixture
def _create_test_repo(tmpdir, mutable_config):
yield create_test_repo(tmpdir, [_pkgx, _pkgy])
@pytest.fixture
def test_repo(_create_test_repo, monkeypatch, mock_stage):
with spack.repo.use_repositories(_create_test_repo) as mock_repo_path:
yield mock_repo_path
@pytest.mark.parametrize(
"spec_str,distribute_src,distribute_bin",
[
("x@1.1~foo", False, False),
("x@1.2+foo", False, False),
("x@1.2~foo", False, True),
("x@1.0~foo", False, True),
("x@1.3+foo", True, True),
("y@2.0", False, False),
("y@2.1+bar", False, False),
],
)
def test_redistribute_directive(test_repo, spec_str, distribute_src, distribute_bin):
spec = spack.spec.Spec(spec_str)
assert spec.package_class.redistribute_source(spec) == distribute_src
concretized_spec = spec.concretized()
assert concretized_spec.package.redistribute_binary == distribute_bin
def test_redistribute_override_when():
"""Allow a user to call `redistribute` twice to separately disable
source and binary distribution for the same when spec.
The second call should not undo the effect of the first.
"""
class MockPackage:
name = "mock"
disable_redistribute = {}
cls = MockPackage
spack.directives._execute_redistribute(cls, source=False, when="@1.0")
spec_key = spack.directives._make_when_spec("@1.0")
assert not cls.disable_redistribute[spec_key].binary
assert cls.disable_redistribute[spec_key].source
spack.directives._execute_redistribute(cls, binary=False, when="@1.0")
assert cls.disable_redistribute[spec_key].binary
assert cls.disable_redistribute[spec_key].source


@@ -912,7 +912,6 @@ def test_find_first_file(tmpdir, bfs_depth):
def test_rename_dest_exists(tmpdir):
@contextmanager
def setup_test_files():
a = tmpdir.join("a", "file1")
@@ -999,3 +998,40 @@ def setup_test_dirs():
assert os.path.exists(link2)
assert os.path.realpath(link2) == a
shutil.rmtree(tmpdir.join("f"))
@pytest.mark.skipif(sys.platform != "win32", reason="No-op on non Windows")
def test_windows_sfn(tmpdir):
# first check some standard Windows locations
# that we know require SFN (short file name) paths;
# this is basically a smoke test
# ensure spaces are replaced + path abbreviated
assert fs.windows_sfn("C:\\Program Files (x86)") == "C:\\PROGRA~2"
# ensure path without spaces is still properly shortened
assert fs.windows_sfn("C:\\ProgramData") == "C:\\PROGRA~3"
# test user created paths
# ensure longer path with spaces is properly abbreviated
a = tmpdir.join("d", "this is a test", "a", "still test")
# ensure longer path is properly abbreviated
b = tmpdir.join("d", "long_path_with_no_spaces", "more_long_path")
# ensure path not in need of abbreviation is properly roundtripped
c = tmpdir.join("d", "this", "is", "short")
# ensure paths that are the same in the first six letters
# are incremented post tilde
d = tmpdir.join("d", "longerpath1")
e = tmpdir.join("d", "longerpath2")
fs.mkdirp(a)
fs.mkdirp(b)
fs.mkdirp(c)
fs.mkdirp(d)
fs.mkdirp(e)
# check only the part of the path we can control;
# the pytest prefix may or may not be mangled by windows_sfn,
# depending on user/pytest config
assert "d\\THISIS~1\\a\\STILLT~1" in fs.windows_sfn(a)
assert "d\\LONG_P~1\\MORE_L~1" in fs.windows_sfn(b)
assert "d\\this\\is\\short" in fs.windows_sfn(c)
assert "d\\LONGER~1" in fs.windows_sfn(d)
assert "d\\LONGER~2" in fs.windows_sfn(e)
shutil.rmtree(tmpdir.join("d"))


@@ -146,9 +146,6 @@ def test_autoload_all(self, modulefile_content, module_configuration):
assert len([x for x in content if "depends_on(" in x]) == 5
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
)
def test_alter_environment(self, modulefile_content, module_configuration):
"""Tests modifications to run-time environment."""


@@ -114,9 +114,6 @@ def test_prerequisites_all(
assert len([x for x in content if "prereq" in x]) == 5
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is specific for x86_64"
)
def test_alter_environment(self, modulefile_content, module_configuration):
"""Tests modifications to run-time environment."""


@@ -703,22 +703,25 @@ def check_prop(check_spec, fmt_str, prop, getter):
actual = spec.format(named_str)
assert expected == actual
def test_spec_formatting_escapes(self, default_mock_concretization):
spec = default_mock_concretization("multivalue-variant cflags=-O2")
sigil_mismatches = [
@pytest.mark.parametrize(
"fmt_str",
[
"{@name}",
"{@version.concrete}",
"{%compiler.version}",
"{/hashd}",
"{arch=architecture.os}",
]
],
)
def test_spec_formatting_sigil_mismatches(self, default_mock_concretization, fmt_str):
spec = default_mock_concretization("multivalue-variant cflags=-O2")
for fmt_str in sigil_mismatches:
with pytest.raises(SpecFormatSigilError):
spec.format(fmt_str)
with pytest.raises(SpecFormatSigilError):
spec.format(fmt_str)
bad_formats = [
@pytest.mark.parametrize(
"fmt_str",
[
r"{}",
r"name}",
r"\{name}",
@@ -728,11 +731,12 @@ def test_spec_formatting_escapes(self, default_mock_concretization):
r"{dag_hash}",
r"{foo}",
r"{+variants.debug}",
]
for fmt_str in bad_formats:
with pytest.raises(SpecFormatStringError):
spec.format(fmt_str)
],
)
def test_spec_formatting_bad_formats(self, default_mock_concretization, fmt_str):
spec = default_mock_concretization("multivalue-variant cflags=-O2")
with pytest.raises(SpecFormatStringError):
spec.format(fmt_str)
def test_combination_of_wildcard_or_none(self):
# Test that using 'none' and another value raises
@@ -1096,22 +1100,22 @@ def test_unsatisfiable_virtual_deps_bindings(self, spec_str):
@pytest.mark.parametrize(
"spec_str,format_str,expected",
[
("zlib@git.foo/bar", "{name}-{version}", str(pathlib.Path("zlib-git.foo_bar"))),
("zlib@git.foo/bar", "{name}-{version}-{/hash}", None),
("zlib@git.foo/bar", "{name}/{version}", str(pathlib.Path("zlib", "git.foo_bar"))),
("git-test@git.foo/bar", "{name}-{version}", str(pathlib.Path("git-test-git.foo_bar"))),
("git-test@git.foo/bar", "{name}-{version}-{/hash}", None),
("git-test@git.foo/bar", "{name}/{version}", str(pathlib.Path("git-test", "git.foo_bar"))),
(
"zlib@{0}=1.0%gcc".format("a" * 40),
"git-test@{0}=1.0%gcc".format("a" * 40),
"{name}/{version}/{compiler}",
str(pathlib.Path("zlib", "{0}_1.0".format("a" * 40), "gcc")),
str(pathlib.Path("git-test", "{0}_1.0".format("a" * 40), "gcc")),
),
(
"zlib@git.foo/bar=1.0%gcc",
"git-test@git.foo/bar=1.0%gcc",
"{name}/{version}/{compiler}",
str(pathlib.Path("zlib", "git.foo_bar_1.0", "gcc")),
str(pathlib.Path("git-test", "git.foo_bar_1.0", "gcc")),
),
],
)
def test_spec_format_path(spec_str, format_str, expected):
def test_spec_format_path(spec_str, format_str, expected, mock_git_test_package):
_check_spec_format_path(spec_str, format_str, expected)
@@ -1129,45 +1133,57 @@ def _check_spec_format_path(spec_str, format_str, expected, path_ctor=None):
"spec_str,format_str,expected",
[
(
"zlib@git.foo/bar",
"git-test@git.foo/bar",
r"C:\\installroot\{name}\{version}",
r"C:\installroot\zlib\git.foo_bar",
r"C:\installroot\git-test\git.foo_bar",
),
(
"zlib@git.foo/bar",
"git-test@git.foo/bar",
r"\\hostname\sharename\{name}\{version}",
r"\\hostname\sharename\zlib\git.foo_bar",
r"\\hostname\sharename\git-test\git.foo_bar",
),
# leading '/' is preserved on windows but converted to '\'
# note that it's still not "absolute" -- absolute windows paths start with a drive.
(
"git-test@git.foo/bar",
r"/installroot/{name}/{version}",
r"\installroot\git-test\git.foo_bar",
),
# Windows doesn't attribute any significance to a leading
# "/" so it is discarded
("zlib@git.foo/bar", r"/installroot/{name}/{version}", r"installroot\zlib\git.foo_bar"),
],
)
def test_spec_format_path_windows(spec_str, format_str, expected):
def test_spec_format_path_windows(spec_str, format_str, expected, mock_git_test_package):
_check_spec_format_path(spec_str, format_str, expected, path_ctor=pathlib.PureWindowsPath)
@pytest.mark.parametrize(
"spec_str,format_str,expected",
[
("zlib@git.foo/bar", r"/installroot/{name}/{version}", "/installroot/zlib/git.foo_bar"),
("zlib@git.foo/bar", r"//installroot/{name}/{version}", "//installroot/zlib/git.foo_bar"),
(
"git-test@git.foo/bar",
r"/installroot/{name}/{version}",
"/installroot/git-test/git.foo_bar",
),
(
"git-test@git.foo/bar",
r"//installroot/{name}/{version}",
"//installroot/git-test/git.foo_bar",
),
# This is likely unintentional on Linux: Firstly, "\" is not a
# path separator for POSIX, so this is treated as a single path
# component (containing literal "\" characters); secondly,
# Spec.format treats "\" as an escape character, so it is
# discarded (unless directly following another "\")
(
"zlib@git.foo/bar",
"git-test@git.foo/bar",
r"C:\\installroot\package-{name}-{version}",
r"C__installrootpackage-zlib-git.foo_bar",
r"C__installrootpackage-git-test-git.foo_bar",
),
# "\" is not a POSIX separator, and Spec.format treats "\{" as a literal
# "{", which means that the resulting format string is invalid
("zlib@git.foo/bar", r"package\{name}\{version}", None),
("git-test@git.foo/bar", r"package\{name}\{version}", None),
],
)
def test_spec_format_path_posix(spec_str, format_str, expected):
def test_spec_format_path_posix(spec_str, format_str, expected, mock_git_test_package):
_check_spec_format_path(spec_str, format_str, expected, path_ctor=pathlib.PurePosixPath)


@@ -551,12 +551,12 @@ def _specfile_for(spec_str, filename):
"^[deptypes=build,link] zlib",
),
(
"zlib@git.foo/bar",
"git-test@git.foo/bar",
[
Token(TokenType.UNQUALIFIED_PACKAGE_NAME, "zlib"),
Token(TokenType.UNQUALIFIED_PACKAGE_NAME, "git-test"),
Token(TokenType.GIT_VERSION, "@git.foo/bar"),
],
"zlib@git.foo/bar",
"git-test@git.foo/bar",
),
# Variant propagation
(
@@ -585,7 +585,7 @@ def _specfile_for(spec_str, filename):
),
],
)
def test_parse_single_spec(spec_str, tokens, expected_roundtrip):
def test_parse_single_spec(spec_str, tokens, expected_roundtrip, mock_git_test_package):
parser = SpecParser(spec_str)
assert tokens == parser.tokens()
assert expected_roundtrip == str(parser.next_spec())


@@ -0,0 +1,26 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest
from spack.util import libc
@pytest.mark.parametrize(
"libc_prefix,startfile_prefix,expected",
[
# Ubuntu
("/usr", "/usr/lib/x86_64-linux-gnu", "/usr/include/x86_64-linux-gnu"),
("/usr", "/usr/lib/x86_64-linux-musl", "/usr/include/x86_64-linux-musl"),
("/usr", "/usr/lib/aarch64-linux-gnu", "/usr/include/aarch64-linux-gnu"),
("/usr", "/usr/lib/aarch64-linux-musl", "/usr/include/aarch64-linux-musl"),
# rhel-like
("/usr", "/usr/lib64", "/usr/include"),
("/usr", "/usr/lib", "/usr/include"),
],
)
@pytest.mark.not_on_windows("The unit test deals with unix-like paths")
def test_header_dir_computation(libc_prefix, startfile_prefix, expected):
"""Tests that we compute the correct header directory from the prefix of the libc startfiles"""
assert libc.libc_include_dir_from_startfile_prefix(libc_prefix, startfile_prefix) == expected


@@ -641,6 +641,30 @@ def substitute_rpath_and_pt_interp_in_place_or_raise(
return False
def pt_interp(path: str) -> Optional[str]:
"""Retrieve the interpreter of an executable at `path`."""
try:
with open(path, "rb") as f:
elf = parse_elf(f, interpreter=True)
except (OSError, ElfParsingError):
return None
if not elf.has_pt_interp:
return None
return elf.pt_interp_str.decode("utf-8")
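# For example (hypothetical paths), on many x86_64 Linux systems:
#   pt_interp("/usr/bin/python3") -> "/lib64/ld-linux-x86-64.so.2"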
def get_elf_compat(path):
"""Get a triplet (EI_CLASS, EI_DATA, e_machine) from an ELF file, which can be used to see if
two ELF files are compatible."""
# On ELF platforms that support it, we try to be a bit smarter when it comes to shared
# libraries, by dropping those that are not host compatible.
with open(path, "rb") as f:
elf = parse_elf(f, only_header=True)
return (elf.is_64_bit, elf.is_little_endian, elf.elf_hdr.e_machine)
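# For example (hypothetical paths), a shared library is a plausible candidate for
# the current host only if its triplet matches that of a known-good binary:
#   get_elf_compat("/usr/lib/libfoo.so") == get_elf_compat("/usr/bin/ls")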
class ElfCStringUpdatesFailed(Exception):
def __init__(
self, rpath: Optional[UpdateCStringAction], pt_interp: Optional[UpdateCStringAction]


@@ -36,6 +36,8 @@
SYSTEM_DIRS = [os.path.join(p, s) for s in SUFFIXES for p in SYSTEM_PATHS] + SYSTEM_PATHS
#: used in the compiler wrapper's `/usr/lib|/usr/lib64|...)` case entry
SYSTEM_DIR_CASE_ENTRY = "|".join(sorted(f'"{d}{suff}"' for d in SYSTEM_DIRS for suff in ("", "/")))
_SHELL_SET_STRINGS = {
"sh": "export {0}={1};\n",
@@ -642,8 +644,8 @@ def reversed(self) -> "EnvironmentModifications":
elif isinstance(envmod, AppendFlagsEnv):
rev.remove_flags(envmod.name, envmod.value)
else:
tty.warn(
f"Skipping reversal of unreversable operation {type(envmod)} {envmod.name}"
tty.debug(
f"Skipping reversal of irreversible operation {type(envmod)} {envmod.name}"
)
return rev


@@ -0,0 +1,176 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import re
import shlex
import sys
from subprocess import PIPE, run
from typing import Optional
import spack.spec
import spack.util.elf
def _libc_from_ldd(ldd: str) -> Optional["spack.spec.Spec"]:
try:
result = run([ldd, "--version"], stdout=PIPE, stderr=PIPE, check=False)
stdout = result.stdout.decode("utf-8")
except Exception:
return None
if not re.search(r"\b(?:gnu|glibc|arm)\b", stdout, re.IGNORECASE):
return None
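# the first line of `ldd --version` looks like e.g. "ldd (GNU libc) 2.31";
# capture the trailing version after the parenthesized vendor string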
version_str = re.match(r".+\(.+\) (.+)", stdout)
if not version_str:
return None
try:
return spack.spec.Spec(f"glibc@={version_str.group(1)}")
except Exception:
return None
def libc_from_dynamic_linker(dynamic_linker: str) -> Optional["spack.spec.Spec"]:
if not os.path.exists(dynamic_linker):
return None
# The dynamic linker is usually installed in the same /lib(64)?/ld-*.so path across all
# distros. The rest of libc is elsewhere, e.g. /usr. Typically the dynamic linker is then
# a symlink into /usr/lib, which we use to determine the actual install prefix of
# libc.
realpath = os.path.realpath(dynamic_linker)
prefix = os.path.dirname(realpath)
# Remove the multiarch suffix if it exists
if os.path.basename(prefix) not in ("lib", "lib64"):
prefix = os.path.dirname(prefix)
# Non-standard install layout -- just bail.
if os.path.basename(prefix) not in ("lib", "lib64"):
return None
prefix = os.path.dirname(prefix)
# Now try to figure out whether this is glibc or musl, which are the only libcs we support.
# With recent glibc we can simply execute the dynamic loader; for musl that always works.
try:
result = run([dynamic_linker, "--version"], stdout=PIPE, stderr=PIPE, check=False)
stdout = result.stdout.decode("utf-8")
stderr = result.stderr.decode("utf-8")
except Exception:
return None
# musl prints to stderr
if stderr.startswith("musl libc"):
version_str = re.search(r"^Version (.+)$", stderr, re.MULTILINE)
if not version_str:
return None
try:
spec = spack.spec.Spec(f"musl@={version_str.group(1)}")
spec.external_path = prefix
return spec
except Exception:
return None
elif re.search(r"\b(?:gnu|glibc|arm)\b", stdout, re.IGNORECASE):
# output is like "ld.so (...) stable release version 2.33."
match = re.search(r"version (\d+\.\d+(?:\.\d+)?)", stdout)
if not match:
return None
try:
version = match.group(1)
spec = spack.spec.Spec(f"glibc@={version}")
spec.external_path = prefix
return spec
except Exception:
return None
else:
# Could not get the version by running the dynamic linker directly. Instead locate `ldd`
# relative to the dynamic linker.
ldd = os.path.join(prefix, "bin", "ldd")
if not os.path.exists(ldd):
# If `/lib64/ld.so` was not a symlink to `/usr/lib/ld.so` we can try to use /usr as
# prefix. This is the case on ubuntu 18.04 where /lib != /usr/lib.
if prefix != "/":
return None
prefix = "/usr"
ldd = os.path.join(prefix, "bin", "ldd")
if not os.path.exists(ldd):
return None
maybe_spec = _libc_from_ldd(ldd)
if not maybe_spec:
return None
maybe_spec.external_path = prefix
return maybe_spec
def libc_from_current_python_process() -> Optional["spack.spec.Spec"]:
if not sys.executable:
return None
dynamic_linker = spack.util.elf.pt_interp(sys.executable)
if not dynamic_linker:
return None
return libc_from_dynamic_linker(dynamic_linker)
def startfile_prefix(prefix: str, compatible_with: str = sys.executable) -> Optional[str]:
# Search for crt1.o at max depth 2 compatible with the ELF file provided in compatible_with.
# This is useful for finding external libc startfiles on a multiarch system.
try:
compat = spack.util.elf.get_elf_compat(compatible_with)
accept = lambda path: spack.util.elf.get_elf_compat(path) == compat
except Exception:
accept = lambda path: True
queue = [(0, prefix)]
while queue:
depth, path = queue.pop()
try:
iterator = os.scandir(path)
except OSError:
continue
with iterator:
for entry in iterator:
try:
if entry.is_dir(follow_symlinks=True):
if depth < 2:
queue.append((depth + 1, entry.path))
elif entry.name == "crt1.o" and accept(entry.path):
return path
except Exception:
continue
return None
def parse_dynamic_linker(output: str):
"""Parse -dynamic-linker /path/to/ld.so from compiler output"""
for line in reversed(output.splitlines()):
if "-dynamic-linker" not in line:
continue
args = shlex.split(line)
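# accept both "-dynamic-linker <path>" and "--dynamic-linker=<path>" style arguments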
for idx in reversed(range(1, len(args))):
arg = args[idx]
if arg == "-dynamic-linker" or args == "--dynamic-linker":
return args[idx + 1]
elif arg.startswith("--dynamic-linker=") or arg.startswith("-dynamic-linker="):
return arg.split("=", 1)[1]
def libc_include_dir_from_startfile_prefix(
libc_prefix: str, startfile_prefix: str
) -> Optional[str]:
"""Heuristic to determine the glibc include directory from the startfile prefix. Replaces
$libc_prefix/lib*/<multiarch> with $libc_prefix/include/<multiarch>. This function does not
check if the include directory actually exists or is correct."""
parts = os.path.relpath(startfile_prefix, libc_prefix).split(os.path.sep)
if parts[0] not in ("lib", "lib64", "libx32", "lib32"):
return None
parts[0] = "include"
return os.path.join(libc_prefix, *parts)
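Taken together, these helpers form a small detection pipeline: resolve the dynamic linker of a known-good binary to a libc spec, then locate the matching startfiles and header directory. A minimal sketch of how they might be chained (the resulting spec and paths are illustrative assumptions, not guaranteed output), assuming the new module lands at spack.util.libc as the imports elsewhere in this diff suggest:
import sys
import spack.util.libc as libc
# e.g. Spec("glibc@=2.31") with external_path "/usr" on a typical Ubuntu host
libc_spec = libc.libc_from_current_python_process()
if libc_spec is not None:
    # directory containing a crt1.o whose ELF triplet matches this Python binary
    startfiles = libc.startfile_prefix(libc_spec.external_path, sys.executable)
    if startfiles is not None:
        # heuristic include dir, e.g. /usr/include/x86_64-linux-gnu on multiarch
        headers = libc.libc_include_dir_from_startfile_prefix(
            libc_spec.external_path, startfiles
        )
        print(libc_spec, startfiles, headers)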


@@ -707,6 +707,7 @@ def _spider(url: urllib.parse.ParseResult, collect_nested: bool, _visited: Set[s
raw_link = metadata_parser.fragments.pop()
abs_link = url_util.join(response_url, raw_link.strip(), resolve_href=True)
fragment_response_url = None
try:
# This seems to be text/html, though text/fragment+html is also used
fragment_response_url, _, fragment_response = read_from_url(abs_link, "text/html")


@@ -638,6 +638,9 @@ def copy(self):
return clone
def __str__(self):
if not self:
return ""
# print keys in order
sorted_keys = sorted(self.keys())


@@ -146,13 +146,11 @@ def from_string(string: str):
@staticmethod
def typemin():
return StandardVersion("", ((), (ALPHA,)), ("",))
return _STANDARD_VERSION_TYPEMIN
@staticmethod
def typemax():
return StandardVersion(
"infinity", ((VersionStrComponent(len(infinity_versions)),), (FINAL,)), ("",)
)
return _STANDARD_VERSION_TYPEMAX
def __bool__(self):
return True
@@ -390,6 +388,13 @@ def up_to(self, index):
return self[:index]
_STANDARD_VERSION_TYPEMIN = StandardVersion("", ((), (ALPHA,)), ("",))
_STANDARD_VERSION_TYPEMAX = StandardVersion(
"infinity", ((VersionStrComponent(len(infinity_versions)),), (FINAL,)), ("",)
)
class GitVersion(ConcreteVersion):
"""Class to represent versions interpreted from git refs.
@@ -1019,6 +1024,9 @@ def __hash__(self):
return hash(tuple(self.versions))
def __str__(self):
if not self.versions:
return ""
return ",".join(
f"={v}" if isinstance(v, StandardVersion) else str(v) for v in self.versions
)
@@ -1127,7 +1135,9 @@ def _prev_version(v: StandardVersion) -> StandardVersion:
components[1::2] = separators[: len(release)]
if prerelease_type != FINAL:
components.extend((PRERELEASE_TO_STRING[prerelease_type], *prerelease[1:]))
return StandardVersion("".join(str(c) for c in components), (release, prerelease), separators)
# this is only used for comparison functions, so don't bother making a string
return StandardVersion(None, (release, prerelease), separators)
def Version(string: Union[str, int]) -> Union[GitVersion, StandardVersion]:


@@ -12,7 +12,7 @@ markers =
requires_executables: tests that requires certain executables in PATH to run
nomockstage: use a stage area specifically created for this test, instead of relying on a common mock stage
enable_compiler_verification: enable compiler verification within unit tests
enable_compiler_link_paths: verifies compiler link paths within unit tests
enable_compiler_execution: enable compiler execution to detect link paths and libc
disable_clean_stage_check: avoid failing tests if there are leftover files in the stage area
only_clingo: mark unit tests that run only with clingo
only_original: mark unit tests that are specific to the original concretizer


@@ -78,24 +78,14 @@ default:
when: never
- if: $SPACK_CI_ENABLE_STACKS =~ /.+/ && $SPACK_CI_STACK_NAME !~ $SPACK_CI_ENABLE_STACKS
when: never
- if: $CI_COMMIT_REF_NAME == "develop"
# Pipelines on develop only rebuild what is missing from the mirror
- if: $CI_COMMIT_REF_NAME == "develop" || $CI_COMMIT_REF_NAME =~ /^releases\/v.*/
# Pipelines on develop/release branches only rebuild what is missing from the mirror
when: always
variables:
SPACK_PIPELINE_TYPE: "spack_protected_branch"
SPACK_COPY_BUILDCACHE: "${PROTECTED_MIRROR_PUSH_DOMAIN}/${CI_COMMIT_REF_NAME}"
SPACK_REQUIRE_SIGNING: "True"
OIDC_TOKEN_AUDIENCE: "protected_binary_mirror"
- if: $CI_COMMIT_REF_NAME =~ /^releases\/v.*/
# Pipelines on release branches always rebuild everything
when: always
variables:
SPACK_PIPELINE_TYPE: "spack_protected_branch"
SPACK_COPY_BUILDCACHE: "${PROTECTED_MIRROR_PUSH_DOMAIN}/${CI_COMMIT_REF_NAME}"
SPACK_PRUNE_UNTOUCHED: "False"
SPACK_PRUNE_UP_TO_DATE: "False"
SPACK_REQUIRE_SIGNING: "True"
OIDC_TOKEN_AUDIENCE: "protected_binary_mirror"
- if: $CI_COMMIT_TAG =~ /^develop-[\d]{4}-[\d]{2}-[\d]{2}$/ || $CI_COMMIT_TAG =~ /^v.*/
# Pipelines on tags (release or dev snapshots) only copy binaries from one mirror to another
when: always
@@ -170,6 +160,7 @@ default:
artifacts:
paths:
- "${CI_PROJECT_DIR}/jobs_scratch_dir"
- "${CI_PROJECT_DIR}/tmp/_user_cache/cache"
variables:
KUBERNETES_CPU_REQUEST: 4000m
KUBERNETES_MEMORY_REQUEST: 16G
@@ -761,32 +752,9 @@ ml-linux-x86_64-cuda-build:
- artifacts: True
job: ml-linux-x86_64-cuda-generate
########################################
# Machine Learning - Linux x86_64 (ROCm)
########################################
.ml-linux-x86_64-rocm:
extends: [ ".linux_x86_64_v3" ]
variables:
SPACK_CI_STACK_NAME: ml-linux-x86_64-rocm
ml-linux-x86_64-rocm-generate:
extends: [ ".generate-x86_64", .ml-linux-x86_64-rocm, ".tags-x86_64_v4" ]
image: ghcr.io/spack/linux-ubuntu22.04-x86_64_v2:v2024-01-29
ml-linux-x86_64-rocm-build:
extends: [ ".build", ".ml-linux-x86_64-rocm" ]
trigger:
include:
- artifact: jobs_scratch_dir/cloud-ci-pipeline.yml
job: ml-linux-x86_64-rocm-generate
strategy: depend
needs:
- artifacts: True
job: ml-linux-x86_64-rocm-generate
########################################
#########################################
# Machine Learning - Darwin aarch64 (MPS)
########################################
#########################################
.ml-darwin-aarch64-mps:
extends: [".darwin_aarch64"]
variables:


@@ -13,7 +13,7 @@ ci:
before_script-:
- - cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
- - spack list --count # ensure that spack's cache is populated
- - touch ${SPACK_USER_CACHE_PATH}/cache/*/* # bump mtime of cache so it is not invalidated
- - spack env activate --without-view ${SPACK_CONCRETE_ENV_DIR}
- spack compiler list
- if [ -n "$SPACK_BUILD_JOBS" ]; then spack config add "config:build_jobs:$SPACK_BUILD_JOBS"; fi
@@ -29,7 +29,7 @@ ci:
after_script:
- - cat /proc/loadavg || true
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
- - time python ${CI_PROJECT_DIR}/share/spack/gitlab/cloud_pipelines/scripts/common/aggregate_package_logs.spack.py
- - time ./bin/spack python ${CI_PROJECT_DIR}/share/spack/gitlab/cloud_pipelines/scripts/common/aggregate_package_logs.spack.py
--prefix /home/software/spack:${CI_PROJECT_DIR}/opt/spack
--log install_times.json
${SPACK_ARTIFACTS_ROOT}/user_data/install_times.json || true
@@ -119,7 +119,7 @@ ci:
# Disable local configs to avoid issues on shell runners
SPACK_DISABLE_LOCAL_CONFIG: "1"
before_script:
- - export SPACK_USER_CACHE_PATH="${CI_PROJECT_DIR}/_user_cache/"
- - export SPACK_USER_CACHE_PATH="${CI_PROJECT_DIR}/tmp/_user_cache/"
- - uname -a || true
- grep -E "vendor|model name" /proc/cpuinfo 2>/dev/null | sort -u || head -n10 /proc/cpuinfo 2>/dev/null || true
- nproc || true


@@ -71,7 +71,6 @@ ci:
- match:
- dealii
- mxnet
- rocblas
build-job:
tags: [ "spack", "huge" ]

Some files were not shown because too many files have changed in this diff.