Compare commits

..

224 Commits

Author SHA1 Message Date
Juan Miguel Carceller
363215717d py-awkward: add v2.6.6 and py-awkward-cpp v35 (#46372)
* py-awkward: add v2.6.6 and py-awkward-cpp v35

* Add dependencies on python and numpy

* Add a conflict with GCC 14

* Move conflict to py-awkward-cpp

* py-awkward: 2.1.1 depends on py-awkward-cpp@12

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-14 15:04:46 -06:00
Alec Scott
ca46fec985 glab: add v1.46.1 (#46353) 2024-09-14 15:47:53 -05:00
Weston Ortiz
7df4ac70da goma: add v7.7.0 (#46362) 2024-09-14 15:45:21 -05:00
Adam J. Stewart
88aa96b53b py-torchmetrics: add v1.4.2 (#46389) 2024-09-14 15:15:29 -05:00
Richard Berger
4da2444a43 py-future: add version 1.0.0 (#46375) 2024-09-14 15:09:49 -05:00
Richard Berger
1a55f2cdab enchant: new versions, update homepage and url (#46386) 2024-09-14 15:05:56 -05:00
Richard Berger
5398b1be15 texlive: add new versions (#46374)
* texlive: use https

* texlive: add 20230313 and 20240312 versions
2024-09-14 13:31:15 -05:00
Jen Herting
a80d2d42d6 py-pydantic-core: new package (#46306)
* py-pydantic-core: new package

* [py-pydantic-core] fixed license (checked_by)

* [py-pydantic-core] removed unnecessary python dependency

---------

Co-authored-by: Alex C Leute <acl2809@rit.edu>
2024-09-13 13:56:57 -05:00
Jen Herting
60104f916d py-pypdf: new package (#46303)
* [py-pypdf] New package

* [py-pypdf] added webpage
2024-09-13 09:56:47 -05:00
dependabot[bot]
d04358c369 build(deps): bump urllib3 from 2.2.2 to 2.2.3 in /lib/spack/docs (#46368)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.2.2 to 2.2.3.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.2.2...2.2.3)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-13 08:15:21 -05:00
Kyle Knoepfel
4fe417b620 Optionally output package namespace (#46359) 2024-09-13 08:08:58 -05:00
jeffmauldin
e3c5d5817b Seacas add libcatalyst variant 2 (#46339)
* Update seacas package.py
  Add a libcatalyst variant to the seacas package.
  When seacas is installed with "seacas +libcatalyst",
  a dependency on the spack package "libcatalyst"
  (which is Catalyst API 2 from Kitware) is added, and
  the appropriate cmake variable for the catalyst TPL
  is set. The mpi variant option in seacas (i.e. build
  with mpi or build without mpi) is passed on to
  libcatalyst. The default of the libcatalyst variant
  is false/off, so if seacas is installed without
  "+libcatalyst" in the spec it behaves exactly as
  it did before the introduction of this variant
  (see the sketch below).
* shortened line 202 to comply with the < 100 characters per line style requirement
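
A minimal sketch of the variant wiring described above, assuming directive names and a guessed TPL option name (`TPL_ENABLE_Catalyst2` is an assumption, not copied from the recipe):

```python
from spack.package import *


class Seacas(CMakePackage):
    # ...rest of the recipe elided...

    variant("libcatalyst", default=False, description="Use Catalyst API 2 from Kitware")
    # Pass seacas's own mpi setting through to libcatalyst.
    depends_on("libcatalyst+mpi", when="+libcatalyst+mpi")
    depends_on("libcatalyst~mpi", when="+libcatalyst~mpi")

    def cmake_args(self):
        args = []
        if self.spec.satisfies("+libcatalyst"):
            # Enable the Catalyst TPL in the CMake configuration (name assumed).
            args.append(self.define("TPL_ENABLE_Catalyst2", True))
        return args
```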
2024-09-13 05:36:49 -06:00
Harmen Stoppels
7573ea2ae5 audit: deprecate certain globals (#44895) 2024-09-13 13:21:58 +02:00
Wouter Deconinck
ef35811f39 nasm: add v2.16.03 (#46357)
* nasm: add v2.16.03
* nasm: don't need 2.16.02
2024-09-13 04:40:34 -06:00
Juan Miguel Carceller
a798f40d04 zziplib: add v0.13.78 (#46361)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-09-13 04:14:03 -06:00
David Collins
a85afb829e bash: use libc malloc on musl instead of internal malloc (#46370) 2024-09-13 03:54:56 -06:00
Harmen Stoppels
71df434d1f remove self-import cycles (#46371) 2024-09-13 11:16:36 +02:00
Joseph Wang
668aba15a0 py-pycontour: add dev/fixes (#46290) 2024-09-13 09:56:26 +02:00
eugeneswalker
2277052a6b e4s rocm external ci stack: upgrade to rocm v6.2.0 (#46328)
* e4s rocm external ci stack: upgrade to rocm v6.2.0

* magma: add rocm 6.2.0
2024-09-12 17:51:19 -07:00
James Smillie
79d938fb1f py-pip package: fix bootstrap for Python 3.12+ on Windows (#46332) 2024-09-12 14:40:30 -07:00
Wouter Deconinck
6051d56014 fastjet: avoid plugins=all,cxx combinations (#46276)
* fastjet: avoid plugins=all,cxx combinations

* fastjet: depends_on fortran when plugins=all or plugins=pxcone
2024-09-12 15:46:08 -05:00
Wouter Deconinck
94af1d2dfe tl-expected: add v1.1.0; deprecated custom calver version (#46036)
* tl-expected: add v1.1.0; deprecated calver version
* tl-expected: fix url
2024-09-12 12:08:37 -07:00
Jen Herting
d4bc1b8be2 py-httpx: added version 0.27.0 and 0.27.2 (#46311)
* py-httpx: New version
* [py-httpx] fix when for dependencies
* [py-httpx] organized dependencies
* [py-httpx] added version 0.27.2

---------

Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-09-12 12:00:20 -07:00
Alec Scott
a81baaa12e emacs: add v29.4 (#46350)
* emacs: add v29.4
* confirmed license
* emacs: update git link for https clones

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-12 12:48:43 -06:00
Alec Scott
1a17d0b535 go: add v1.23.1 (#46354) 2024-09-12 10:34:43 -07:00
Alec Scott
6d16d7ff83 fd: add v10.2.0 (#46352)
* fd: add v10.2.0
* fd: add rust build dependency constraint
2024-09-12 10:32:33 -07:00
Alec Scott
c89c84770f restic: add v0.17.1 (#46351) 2024-09-12 10:25:04 -07:00
Alec Scott
9154df9062 ripgrep: add v14.1.1 (#46348)
* ripgrep: add v14.1.1
* ripgrep: fix rust dependency type
2024-09-12 10:18:25 -07:00
Alec Scott
56f3ae18f6 kubectl: add v1.31.0 (#46343) 2024-09-12 10:17:16 -07:00
Alec Scott
9a4f83be1d fzf: add v0.55.0 (#46342) 2024-09-12 10:16:15 -07:00
Vanessasaurus
6b10f80cca flux-sched: add v0.37.0, v0.38.0 (#46215)
* Automated deployment to update package flux-sched 2024-09-05
* flux-sched: add back check for run environment
* flux-sched: add conflict for gcc and clang above 0.37.0

---------

Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-09-12 10:12:21 -07:00
Lydéric Debusschère
e8e8d59803 cpp-argparse: Add version 3.1 (#46319) 2024-09-12 10:10:23 -07:00
dependabot[bot]
d8d1bc5d7e build(deps): bump pytest from 8.3.2 to 8.3.3 in /lib/spack/docs (#46315)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.2 to 8.3.3.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.3.2...8.3.3)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-12 08:57:05 -07:00
Alec Scott
3531ee28f8 curl: add v8.8.0 (#44622)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-09-12 17:05:41 +02:00
Rocco Meli
8b18f95e05 ELSI: add v2.11 and dlaf variant (#46317)
* provide dlaf libs

* fix incompatibility with ntpoly

* add cuda conflicts

* [@spackbot] updating style on behalf of RMeli

* Update var/spack/repos/builtin/packages/elsi/package.py

Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>

* move pexsi conflict to context

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
2024-09-12 06:34:48 -06:00
Xuefeng Ding
128cac34e0 geant4: add hdf5 variant (#46312) 2024-09-12 06:34:26 -06:00
Harmen Stoppels
8fa65b286c fix various pkgs depending on libtool only (#46299)
Autotools packages with a configure script generate their own libtool
executable, so there's no point in `depends_on("libtool", type="build")`.

The libtool executable in `<libtool prefix>/bin/libtool` is configured
for the wrong toolchain (libtool's %compiler instead of the package's
%compiler).

Some packages link to `libltdl.so`, which is fine, but had the wrong
dependency type (see the sketch below).
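
Schematically, the kind of change this implies in a recipe (illustrative directives, not the actual diffs):

```python
# Redundant for a configure-based package: configure generates its own libtool.
# depends_on("libtool", type="build")

# What a package that links against libltdl.so actually needs:
depends_on("libtool", type="link")
```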
2024-09-12 14:05:50 +02:00
Mikael Simberg
6cb16c39ab pika: Add conflicts between pika's and apex's allocator options (#46318) 2024-09-12 13:46:39 +02:00
Wouter Deconinck
7403469413 w3m: add v0.5.3.git20230121 (#45747) 2024-09-12 13:44:55 +02:00
Wouter Deconinck
533de03fef libssh: add v0.9.8, v0.10.6, v0.11.0 (#45689) 2024-09-12 13:41:48 +02:00
Wouter Deconinck
3fea1d6710 codespaces: add ubuntu22.04 (#46100) 2024-09-12 13:40:05 +02:00
Wouter Deconinck
257ebce108 py-zope-*: add new versions (#46338) 2024-09-12 13:39:17 +02:00
Wouter Deconinck
2623b9727f mdspan: fix typo in satisfies condition (#46340) 2024-09-12 13:37:57 +02:00
Wouter Deconinck
68df483bc6 re2: add versions through 2024-07-02; add variant icu (#46337) 2024-09-12 13:37:30 +02:00
Wouter Deconinck
0cc38d685f pcre2: add v10.44 (#46341) 2024-09-12 04:24:49 -06:00
Mikael Simberg
cec770b26e pika: Add conflict between HIP and GCC (libstdc++) >= 13 (#46291) 2024-09-12 09:03:58 +02:00
Todd Gamblin
f417e9f00c acts: further simplify cxxstd handling (#46333)
See https://github.com/spack/spack/pull/46314#discussion_r1752940332.

This further simplifies `cxxstd` variant handling in `acts` by removing superfluous
version constraints from dependencies for `geant4` and `root`.

The version constraints in the loop are redundant with the conditional variant
values here:

```python
    _cxxstd_values = (
        conditional("14", when="@:0.8.1"),
        conditional("17", when="@:35"),
        conditional("20", when="@24:"),
    )
    _cxxstd_common = {
        "values": _cxxstd_values,
        "multi": False,
        "description": "Use the specified C++ standard when building.",
    }
    variant("cxxstd", default="17", when="@:35", **_cxxstd_common)
    variant("cxxstd", default="20", when="@36:", **_cxxstd_common)
```

So we can simplify the dependencies in the loop to:

```python
    for _cxxstd in _cxxstd_values:
        for _v in _cxxstd:
            depends_on(f"geant4 cxxstd={_v.value}", when=f"cxxstd={_v.value} +geant4")
            depends_on(f"geant4 cxxstd={_v.value}", when=f"cxxstd={_v.value} +fatras_geant4")
            depends_on(f"root cxxstd={_v.value}", when=f"cxxstd={_v.value} +tgeo")
```

And avoid the potential for impossible variant expressions.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-11 23:14:50 -06:00
Peter Scheibel
527e0fb4b4 bugfix: unit test broken everywhere by package change (#46344)
Openmpi provider statements were changed in #46102. The package change
was fine in and of itself, but apparently one of our tests depends on
the precise constraints used in those statements. I updated the test
to remove the checks for constraints that were removed.
2024-09-11 19:52:40 -07:00
Stephen Nicholas Swatman
c7e3fc9049 boost: add v1.86.0 (#46321)
This commit adds Boost v1.86.0.
2024-09-11 20:37:43 -06:00
Stephen Nicholas Swatman
5c59bb87a4 acts: add v36.3.0 (#46320)
This commit adds v36.3.0 of ACTS, and also marks a previously missing
conflict with Boost 1.85.0.
2024-09-11 16:55:26 -06:00
Stephen Nicholas Swatman
bca975e66a hep tag: new versions as of 2024/09/11 (#46322)
This commit updates acts-algebra-plugins to v0.25.0, actsvg to v0.4.47,
detray to v0.74.2, geomodel to v6.5.0, and vecmem to v1.7.0.
2024-09-11 16:49:18 -06:00
Pranav Sivaraman
8160cd1b9e scnlib: new package (#46313) 2024-09-11 13:18:12 -07:00
Adam J. Stewart
5833fe0c36 py-scikit-learn: add v1.5.2 (#46327) 2024-09-11 12:23:42 -07:00
Adam J. Stewart
ff1bc9e555 py-xgboost: add v2.1.1 (#46260)
* py-xgboost: add v2.1.1
* setuptools no longer needed
* Better comment
2024-09-11 12:00:48 -07:00
Dom Heinzeller
6db1defba0 New packages: py-regionmask and py-pyogrio (#46209)
* Add py-geopandas versions 1.0.0 and 1.0.1, update dependencies
* New package py-pyogrio - dependency of newer py-geopandas
* New package py-regionmask - needs py-geopandas
2024-09-11 11:37:13 -07:00
Todd Gamblin
3099662df2 Fix variant expressions that will never match (#46314)
In #44425, we add stricter variant audits that catch expressions that can never match.

This fixes 13 packages that had this type of issue.

Most packages had lingering spec expressions from before conditional variants with
`when=` statements were added. For example:

* Given `conflicts("+a", when="~b")`, if the package has since added
  `variant("a", when="+b")`, the conflict is no longer needed, because
  `a` and `b` will never exist together (sketched below).

* Similarly, two packages that depended on `py-torch` depended on
  `py-torch~cuda~cudnn`, which can't match because the `cudnn` variant
  doesn't exist when `cuda` is disabled. Note that neither `+foo` nor `~foo`
  matches (intentionally) if the `foo` variant doesn't exist.

* Some packages referred to impossible version/variant combinations, e.g.,
  `ceed@1.0.0+mfem~petsc` when the `petsc` variant only exists at version `2`
  or higher.
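
A schematic package.py fragment illustrating the first two bullets (hypothetical package, not one of the 13 fixed here):

```python
# The conditional variant makes the old conflict unsatisfiable:
variant("a", default=False, when="+b")  # "a" only exists when +b is set
# conflicts("+a", when="~b")            # redundant: +a and ~b can never coexist

# Since "cudnn" only exists on py-torch when cuda is enabled,
#     depends_on("py-torch~cuda~cudnn")
# can never match; the working constraint is simply:
depends_on("py-torch~cuda")
```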

Some of these correct real issues (e.g. the `py-torch` dependencies would have never
worked). Others simply declutter old code in packages by making all constraints
consistent with version and variant updates.

The only one of these that I think is not all that useful is the one for `acts`,
where looping over `cxxstd` versions and package versions ends up adding some
constraints that are impossible. The additional dependencies could never have
happened, and the code is more complicated with the needed extra constraint.
I think *probably* the best thing to do in `acts` is just not to use a loop
and to write out the constraints explicitly, but maybe the code is easier to
maintain as written.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-11 10:59:16 -07:00
Satish Balay
2185749bb4 openmpi: update dependency wrt mpi standard versions (#46102)
* openmpi: update mpi version dependency
* Use "disjoint sets" for version ranges
2024-09-11 09:13:17 -07:00
Dom Heinzeller
6287d98455 Update FMS: apply patch for fms@2023.03, add variant shared and patch for fms@2024.02 (#46238)
* Update var/spack/repos/builtin/packages/fms/package.py: apply patch for fms@2023.03 to fix compiler options bug in cmake config, add variant shared and corresponding patch for fms@2024.02
* Fix fms package audit: use c9bba516ba.patch?full_index=1 instead of c9bba516ba.patch
* Update checksum of patch for fms@2023.03
2024-09-11 09:08:52 -07:00
Harmen Stoppels
6c4990525d fixesproto: add xextproto dep (#46325)
needed according to pc file
2024-09-11 10:48:56 -05:00
Pierre Augier
5beed521bd py-fluidfft and co: add new packages (#46236)
* Add new packages: py-fluidfft and co

* py-fluidfft: add few contrains (version dependencies)
2024-09-11 09:45:56 -06:00
Adam J. Stewart
5fa8890bd3 CUDA: support Grace Hopper 9.0a compute capability (#45540)
* CUDA: support Grace Hopper 9.0a compute capability

* Fix other packages

* Add type annotations

* Support ancient Python versions

* isort

* spec -> self.spec

Co-authored-by: Andrew W Elble <aweits@rit.edu>

* [@spackbot] updating style on behalf of adamjstewart

---------

Co-authored-by: Andrew W Elble <aweits@rit.edu>
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2024-09-11 17:43:20 +02:00
Jannek Squar
122c3c2dbb CDO: add patch for missing algorithm header (#45692)
* add cdo patch for missing algorithm header

* add patch for 2.2.2:2.3.0

Signed-off-by: Jannek <squar@informatik.uni-hamburg.de>

* Add conflict of old cdo with new gcc

* Update var/spack/repos/builtin/packages/cdo/add_algorithm_header_222.patch

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Adjust patch hash

* Update var/spack/repos/builtin/packages/cdo/package.py

fix copy-paste-error on hash

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Signed-off-by: Jannek <squar@informatik.uni-hamburg.de>
Co-authored-by: Jannek <squar@informatik.uni-hamburg.de>
Co-authored-by: Try2Code <stark.dreamdetective@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-11 08:00:56 -05:00
Adam J. Stewart
607d158a44 OpenCV: add v4.8-4.10 (#46223) 2024-09-11 14:52:39 +02:00
Rocco Meli
892b5d8037 cp2k: add v2024.3 (#46297) 2024-09-11 09:58:03 +02:00
Massimiliano Culpo
4f00c7173d nvhpc: add 'compiler' tag to the override (#46301)
fixes #46295

A proper solution would be a tag directive that accumulates tags
with the ones defined in base classes.

For the time being, rewrite them explicitly.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-11 09:47:55 +02:00
James Smillie
fc4a4ec70d boost package: fix Windows build (#43732)
* Boost: Adjust bootstrapping/b2 options as needed for Windows (the
  bootstrapping phase sufficiently differs between Windows/Unix
  that it is handled entirely within its own branch).
* Boost: Paths in user-config.jam should be POSIX, including on Windows
* Python: `.libs` for the Python package should return link libraries
  on Windows. The libraries are also stored in a different directory.
2024-09-11 01:43:28 -06:00
afzpatel
e1da0a7312 Bump up the version for ROCm-6.2.0 (#45701)
* initial update for rocm 6.2
* fix typo in rocprofiler-register
* update rocm-device-libs
* add patch to use clang 18 for roctracer-dev
* add updates for rocm-opencl and rocm-validation-suite
* add hipify-clang patch
* update remaining packages to 6.2
* update hipblaslt mivisionx patches
* update rocm-tensile to 6.2.0
* add hipsparselt changes for 6.2
* add rocm-openmp-extras patch
* add build-time test for rocprofiler-register
* update flang-legacy support for 6.2
* simplify version range
* update boost dependency in rpp

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-09-10 20:13:21 -06:00
Taillefumier Mathieu
f58ee3ea2b SIRIUS: add v7.6.1 (#46296)
bug fix update for sirius
2024-09-10 13:08:32 -06:00
Massimiliano Culpo
ffdfa498bf Deprecate config:install_missing_compilers (#46237)
The option config:install_missing_compilers is currently buggy,
and has been for a while. Remove it, since it won't be needed
when compilers are treated as dependencies.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-10 20:02:05 +02:00
Teague Sterling
decefe0234 perl-bio-ensembl-io: new package (#44509)
* Adding the perl-bio-db-bigfile package

* Adding perl-bio-ensembl-io package

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Updating dependent package handling

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Updating dependent package handling

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Reverting variants

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Rename package.py to package.py

* Update package.py

* Update package.py

* Removing unneeded dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Updated from testing by @ebiarnie

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Updated from testing by @ebiarnie

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing depends

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fix styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-09-10 11:45:59 -06:00
Pranav Sivaraman
e2e5e74b99 fast-float: new package (#46137)
* fast-float: new package

* fast-float: add test dependency

* fast-float: fix doctest dependency type

* fast-float: convert deps to tuple

* fast-float: add v6.1.5 and v6.1.6

* fast-float: patch older versions to use find_package(doctest)
2024-09-10 12:36:23 -05:00
Tony Weaver
0629c5df38 py-your: Changed software pull location (#46201)
* py-your: new package

Spack package recipe for YOUR, Your Unified Reader.  YOUR processes pulsar data in different formats.

Output below from spack install py-your
spack install py-your
==> Installing py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x [83/83]
==> No binary for py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x found: installing from source
==> Fetching https://github.com/thepetabyteproject/your/archive/refs/tags/0.6.7.tar.gz
==> No patches needed for py-your
==> py-your: Executing phase: 'install'
==> py-your: Successfully installed py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x
  Stage: 1.43s.  Install: 0.99s.  Post-install: 0.12s.  Total: 3.12s

* Removed setup_run_environment

After some testing, both spack load and module load for the package will include the bin directory generated by py-your as well as the path to the version of python the package was built with, without the need for the setup_run_environment function.

I removed that function (although, like Tamara, I thought it would be necessary based on other package setups I used as a basis for this package).

Note: I also updated the required version of py-astropy from py-astropy@4.0: to py-astropy@6.1.0:. In my test builds, the install was picking up py-astropy@4.0.1.post1 and numpy 1.26. However, when I tried to run some of the code, I was getting errors about py-astropy making numpy calls that have since been removed. The newer version of py-astropy corrects these. Ideally this would be handled in the py-astropy package, to make sure numpy isn't too new.

* Changed  software pull location

The original package pulled a tagged release version from GitHub. That tagged version was created in 2022 and has not been updated since. It no longer runs, because newer versions of numpy have removed deprecation warnings for several of their calls. The main branch of this repository has addressed these numpy issues, as well as some other important fixes, but no new release has been generated. Because of this, and the minimal development that now appears to be going on, it is probably best to always pull from the main branch.

* [@spackbot] updating style on behalf of aweaver1fandm

* py-your: Changed software pull location

1. Restored original URL and version (0.6.7) as requested
2. Updated the py-numpy dependency versions to be constrained based on the version of your. Because of numpy deprecations related to your version 0.6.7, we need to ensure that the numpy version used is not 1.24 or greater, because the deprecated calls were removed starting with that version (see the sketch below).
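
The constraint in point 2 would look roughly like this (exact bounds assumed from the description above):

```python
# numpy 1.24 removed the deprecated calls that your 0.6.7 still relies on.
depends_on("py-numpy@:1.23", type=("build", "run"), when="@0.6.7")
```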
2024-09-10 10:37:05 -05:00
Robert Underwood
79d778f8cd Add additional cuda-toolkit location variable used by py-torch (#46245)
Supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-10 17:20:05 +02:00
Teague Sterling
6f5ba44431 perl-bioperl: add v1.6.924, v1.7.8, deprecate v1.007002, refactor dependencies, update url (#46213)
* perl-bioperl: add v1.6.924

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fix style

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* perl-bioperl: add v1.6.924, v1.7.2, deprecate v1.007002, refactor dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* perl-bioperl: add v1.7.8

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* perl-bioperl: update url

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* perl-bioperl: cleanup version urls

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* remove v1.7.2

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-09-10 11:36:43 +01:00
Harmen Stoppels
0905edf592 r: do not create dir in setup_dependent_package (#46282) 2024-09-10 09:04:09 +02:00
Harmen Stoppels
16dba78288 spec.py: dedent format logic (#46279) 2024-09-10 09:02:37 +02:00
Todd Gamblin
b220938d42 bugfix: elfutils has no bzip2 or xz variants (#46294)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-10 06:20:39 +00:00
Mikael Simberg
78117877e0 boost: Refactor header-only install and add missing compiled libraries (#46281) 2024-09-10 08:16:47 +02:00
Harmen Stoppels
975f4fbf84 cmake: remove last occurrences of std_cmake_args globals (#46288) 2024-09-10 07:56:51 +02:00
AcriusWinter
dc853b2cf4 gptune: new test API (#45383)
* gptune: new test API
* gptune: cleanup; finish API changes; separate unrelated test parts
* gptune: standalone test cleanup with timeout constraints
* gptune: ensure stand-alone test bash failures terminate; enable in CI
* gptune: add directory to terminate_bash_failures
* gptune/stand-alone tests: use satisfies for checking variants

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-09-09 15:21:55 -07:00
Philipp Edelmann
69b628a3f0 makedepf90: new package (#45816)
* makedepf90: new package

* reorder versions and dependencies
2024-09-09 16:40:25 -05:00
David Collins
98fb9c23f9 numactl: Add versions 2.0.16-2.0.18 (#46150)
* Add numactl 2.0.16-2.0.18

* Create link-with-latomic-if-needed-v2.0.16.patch

Add a link to libatomic, if needed, for numactl v2.0.16.

* Add some missing patches to v2.0.16

* Create numactl-2.0.18-syscall-NR-ppc64.patch

In short, we need numactl to set __NR_set_mempolicy_home_node on ppc64, if it's not already defined.

* Apply a necessary patch for v2.0.18 on PPC64

* Add libatomic patch for v2.0.16
2024-09-09 14:33:13 -06:00
Pranav Sivaraman
d5e08abe46 py-poxy: add new package (#46214)
* py-poxy: add new package
* py-poxy: depends on misk >= 0.8.1
2024-09-09 12:19:54 -07:00
Harmen Stoppels
2f789f01d3 Revert "Set module variables for all packages before running setup_dependent_…" (#46283)
This reverts commit 6f08db4631.
2024-09-09 10:37:26 -07:00
Kevin Kuriakose
0229240df4 darshan-runtime: fix JOBID determination (#46148) 2024-09-09 10:53:15 -06:00
Harmen Stoppels
9059756a11 reindex: do not assume fixed layout (#46170)
`spack reindex` relies on projections from configuration to locate
installed specs and prefixes. This is problematic because config can
change over time, and we have reasons to do so when turning compilers
into dependencies (removing `{compiler.name}-{compiler.version}` from
projections).

This commit makes reindex recursively search for .spack/ metadirs (see the sketch below).
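
A minimal sketch of the recursive-search idea (function name and details are illustrative, not the PR's code):

```python
import os

def find_install_prefixes(store_root: str):
    """Yield install prefixes below store_root that contain a .spack
    metadata directory, without assuming any projection layout."""
    for dirpath, dirnames, _files in os.walk(store_root):
        if ".spack" in dirnames:
            yield dirpath
            dirnames.clear()  # don't descend below an installed prefix
```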
2024-09-09 17:26:30 +02:00
Pierre Augier
67089b967e Add new package py-fluiddyn (#46235) 2024-09-09 09:47:44 -05:00
Mikael Simberg
98e35f7232 pika: Add conflict with fmt 11 and newer (#46280) 2024-09-09 14:30:32 +02:00
Teague Sterling
3ff441c5b0 perl-graphviz: add graphviz dependency (#46268)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-09-09 11:08:34 +02:00
Jordan Galby
2502a3c2b6 Fix regression in spec format string for indiviual variants (#46206)
Fix a regression in {variants.X} and {variants.X.value} spec format strings.
2024-09-09 10:42:04 +02:00
Adam J. Stewart
9bdc97043e PyTorch: unpin pybind11 dependency (#46266) 2024-09-09 08:42:09 +02:00
snehring
66ee4caeab nekrs: add v23.0, add new build system (#45992) 2024-09-09 07:32:14 +02:00
Eric Berquist
18218d732a intel-pin: add up to v3.31 (#46185) 2024-09-09 07:28:53 +02:00
Paul Ferrell
4c6b3ccb40 charliecloud: fix libsquashfuse dependency type (#46189)
Should have been a link dependency, not a build dependency.
2024-09-09 07:26:20 +02:00
Satish Balay
1d9c8a9034 xsdk: add amrex variant (#46190)
and remove compiler conditionals [as the amrex conflict with clang no longer exists since #22967]
2024-09-09 07:19:31 +02:00
AMD Toolchain Support
216619bb53 namd: variant updates (#45825)
* Add missing variant, already used in recipe (avxtiles)

* Add memopt variant 

Co-authored-by: viveshar <vivek.sharma2@amd.com>
2024-09-09 07:12:16 +02:00
Wouter Deconinck
47c771f03f assimp: add v5.4.3, enable testing (#46267) 2024-09-09 06:51:57 +02:00
Wouter Deconinck
c7139eb690 wayland-protocols: add v1.37 (#46269) 2024-09-09 06:50:30 +02:00
Wouter Deconinck
74ba81368a py-vector: add v1.5.1 (#46271) 2024-09-09 06:49:35 +02:00
Wouter Deconinck
ef11fcdf34 protobuf: patch @3.22:3.24.3 when ^abseil-cpp@20240116: (#46273) 2024-09-09 06:48:50 +02:00
Wouter Deconinck
c4f3348af1 (py-)onnx: add v1.16.2; onnx: enable testing (#46274) 2024-09-09 06:47:45 +02:00
Richard Berger
fec2f30d5a lammps: add 20240829 and 20230802.4 releases (#46131) 2024-09-09 06:46:12 +02:00
Wouter Deconinck
6af92f59ec xrootd: add v5.7.1 (#46270) 2024-09-09 06:45:15 +02:00
Pierre Augier
cb47c5f0ac Add package py-transonic (#46234) 2024-09-08 20:03:09 -05:00
Dom Heinzeller
32727087f1 Add patch vfile_cassert.patch for ecflow@5.11.4 (#46095) 2024-09-07 07:54:43 -06:00
Jen Herting
5f9c6299d1 [py-langsmith] added version 0.1.81 (#46259) 2024-09-07 04:29:45 -06:00
Jen Herting
541e40e252 py-httpcore: add v1.0.5 (#46123)
* py-httpcore: Added new version

* [py-httpcore]

- added version 0.18.0
- restructured dependencies as everything has a when and
  type/when ordering was all over the place

* [py-httpcore] ordered dependencies in the order listed in v1.0.5 pyproject.toml

---------

Co-authored-by: Alex C Leute <aclrc@rit.edu>
2024-09-07 02:28:15 -06:00
Jen Herting
ddc8790896 [py-lpips] new package (#46256) 2024-09-06 18:54:51 -06:00
Jen Herting
7d6b643b58 [py-torch-fidelity] New package (#46257)
* [py-torch-fidelity] New package
* [py-torch-fidelity] add patch to fix missing requirements.txt
2024-09-06 18:54:36 -06:00
Dan Lipsa
6f08db4631 Set module variables for all packages before running setup_dependent_package (#44327)
When a package is running `setup_dependent_package` on a parent, ensure
that module variables like `spack_cc` are available. This was often
true prior to this commit, but externals were an exception.

---------

Co-authored-by: John Parent <john.parent@kitware.com>
2024-09-06 18:54:09 -06:00
eugeneswalker
7086d6f1ac use updated container with cmake@3.30.2 (#46262) 2024-09-06 18:48:45 -06:00
Robert Underwood
7a313295ac Sz3 fix (#46263)
* Updated version of sz3
  Supersedes #46128
* Add Robertu94 to maintainers for SZ3

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-06 14:35:55 -07:00
Jen Herting
3bdaaaf47c New package: py-monai (#46140)
* New package: py-monai

* Fixed linked issues with py-monai

* [py-monai] removed extra line

* [py-monai]

- New version 1.3.2
- ran black
- update copyright

* [py-monai] added license

* [py-monai]

- moved build only dependencies
- specified newer python requirements for newer versions

---------

Co-authored-by: vehrc <vehrc@rit.edu>
2024-09-06 15:19:12 -06:00
Massimiliano Culpo
a3fa54812f Remove deprecated config options (#44061)
These options were deprecated in v0.21 and slated for removal in v0.23.
2024-09-06 15:14:36 -06:00
Kyle Gerheiser
afc9615abf libfabric: Add versions v1.22.0 and v1.21.1 (#46188)
* Add libfabric v1.22.0 and v1.21.1

* Fix whitespace
2024-09-06 14:21:41 -05:00
Adam J. Stewart
19352af10b GEOS: add v3.13.0 (#46261) 2024-09-06 12:17:41 -07:00
Robert Underwood
e4f8cff286 Updated version of sz3 (#46246)
Supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-06 12:15:03 -07:00
Sam Reeve
76df9de26a Update ExaCA backend handling and add version 2.0 (#46243)
* Add exaca 2.0
* Add kokkos/cuda/hip backend support for exaca
2024-09-06 14:53:21 -04:00
Auriane R.
983e7427f7 Replace if ... in spec with spec.satisfies in f* and g* packages (#46127)
* Replace if ... in spec with spec.satisfies in f* and g* packages (example below)

* gromacs: ^amdfftw -> ^[virtuals=fftw-api] amdfftw

* flamemaster: add virtuals lapack for the amdlibflame dependency
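
Schematically, the idiom change looks like this (flag and option names are placeholders):

```python
# Before: membership test on the spec
if "+mpi" in spec:
    args.append("--enable-mpi")

# After: explicit constraint matching
if spec.satisfies("+mpi"):
    args.append("--enable-mpi")
```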

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-06 13:22:19 -05:00
Adam J. Stewart
d9df520e85 py-pandas: relax build dependencies (#46229) 2024-09-06 11:19:49 -07:00
Adam J. Stewart
b51fd9f5a6 py-torchgeo: zipfile-deflate64 no longer required (#46233) 2024-09-06 11:18:10 -07:00
Robert Underwood
8058cd34e4 Require a newer version of cudnn for cupy (#46248)
cupy 12 needs a newer version of cudnn, as documented here. Supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-06 10:56:46 -07:00
Robert Underwood
2f488b9329 Use an HTTP git url for libpressio-opt (#46249)
Supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-06 10:43:23 -07:00
Peter Scheibel
78a4d3e7d2 Mixed-source cflags (#41049)
Allow flags from different sources (compilers, `require:`, command-line
specs, and `depends_on`) to be merged together, and enforce a consistent
order among them.

The order is based on the sources, e.g. flags on specs from the command
line always come last. Some flag order consistency issues are fixed:

1. Flags from `compilers.yaml` and the command line were always intra- and
   inter-source order consistent.
2. Flags from dependents and packages.yaml (introduced via `require:`)
   were not: for `-a -b` from one source and `-c` from another, the final
   result might rearrange `-a -b`, and would also be inconsistent in terms
   of whether `-c` came before or after.

(1) is/was handled by going back to the original source, i.e., flags are
retrieved directly from the command line spec rather than the solver.

(2) is addressed by:

* Keeping track of grouped flags in the solver
* Keeping track of flag sources in the solver on a per-flag basis

The latter info is used in this PR to enforce DAG ordering on flags
applied from multiple dependents to the same package, e.g., for this
graph:

```
   a
  /|\
 b | c
  \|/
   d
```

If `a`, `b`, and `c` impose flags on `d`, the combined flags on `d` will
contain the flags of `a`, `b`, and `c` -- in that order. 

Conflicting flags are allowed (e.g. -O2 and -O3). `Spec.satisfies()` has
been updated such that X satisfies Y as long as X has *at least* all of
the flags that Y has. This is also true in the solver constraints.
`.satisfies` does not account for how order can change behavior (so
`-O2 -O3` can satisfy `-O3 -O2`); it is expected that this can be
addressed later (e.g. by prohibiting flag conflicts).

`Spec.constrain` and `.intersects` have been updated to be consistent
with this new definition of `.satisfies`.
2024-09-06 10:37:33 -07:00
Juan Miguel Carceller
2b9a621d19 sleef: add the PIC flag (#46217) 2024-09-06 09:37:20 -06:00
Juan Miguel Carceller
fa5f4f1cab pthreadpool: add the PIC flag (#46216) 2024-09-06 15:11:03 +02:00
John W. Parent
4042afaa99 Bootstrap GnuPG and file on Windows (#41810)
Spack can now bootstrap two new dependencies on Windows: GnuPG, and file. 

These dependencies are modeled as separate packages, and they install cross-compiled binaries.
Details on how the binaries are built are in https://github.com/spack/windows-bootstrap-resources
2024-09-06 14:26:46 +02:00
Harmen Stoppels
7fdf1029b7 fish: use shlex.quote instead of custom quote (#46251) 2024-09-06 11:38:14 +00:00
Stephen Nicholas Swatman
814f4f20c0 py-pybind11: add v2.13.0-v2.13.4 and new conflict (#45872)
This PR adds py-pybind11 versions 2.13.0, 2.13.1, 2.13.2, 2.13.3, and
2.13.4. It also adds a new conflict between gcc 14 and pybind versions
up to and including 2.13.1.
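
The added constraint presumably takes the usual directive form (a sketch, not the exact diff):

```python
conflicts("%gcc@14:", when="@:2.13.1", msg="pybind11 up to 2.13.1 does not build with GCC 14")
```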
2024-09-06 10:38:35 +02:00
Harmen Stoppels
7c473937ba db: more type hints (#46242) 2024-09-06 09:13:14 +02:00
Robert Underwood
9d754c127a Require a newer version of cudnn for cupy (#46247)
cupy 12 needs a newer version of cudnn, as documented here. Supersedes #46128

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2024-09-05 23:12:46 -06:00
Massimiliano Culpo
9a8bff01ad Allow deprecating more than one property in config (#46221)
* Allow deprecating more than one property in config

This internal change allows the customization of errors
and warnings to be printed when deprecating a property.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* fix

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Use a list comprehension for "issues"

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-05 15:33:20 -07:00
Nicole C.
434a703bcf Windows: Update pytest with echo and remove others (#45620)
* Windows: Update pytest with echo and remove others

* Fix style

* Remove unused pytest import in undevelop

* Update test_test_part_pass to be Windows compatible

* Style
2024-09-05 15:34:19 -05:00
Adam J. Stewart
a42108438d py-contourpy: add v1.3.0 (#46225) 2024-09-05 22:16:14 +02:00
Sebastian Pipping
1be9b7f53c expat: Add 2.6.3 with security fixes + deprecate vulnerable 2.6.2 (#46208) 2024-09-05 14:09:34 -06:00
Dom Heinzeller
6b05a80745 Bug fixes for building py-netcdf4 and py-ruamel-yaml-clib with apple-clang@15 (#46184)
* Fix compiler flags in py-netcdf4 for apple-clang@15:
* Fix compiler flags in py-ruamel-yaml-clib for apple-clang@15:
2024-09-05 11:29:26 -07:00
Fernando Ayats
4d36b0a5ef npb: fix mpi rank mismatch errors (#46075)
The MPI variant has several rank mismatch errors, which are silently
ignored. This downgrades the errors to warnings.
2024-09-05 11:27:15 -07:00
Adam J. Stewart
636843f330 PyTorch: update ecosystem (#46220) 2024-09-05 11:06:57 -07:00
Adam J. Stewart
d4378b6e09 awscli-v2: remove upperbound on cryptography (#46222) 2024-09-05 11:04:56 -07:00
Adam J. Stewart
69b54d9039 py-imageio: add v2.35.1 (#46227) 2024-09-05 10:57:57 -07:00
Adam J. Stewart
618866b35c py-laspy: add v2.3.5 (#46228) 2024-09-05 10:56:51 -07:00
Adam J. Stewart
47bd8a6b26 py-pyvista: add v0.44.1 (#46231) 2024-09-05 10:53:42 -07:00
Adam J. Stewart
93d31225db py-tifffile: add v2024 (#46232) 2024-09-05 10:52:21 -07:00
Alex Richert
c3a1d1f91e Add when's to some tau dependencies (#46212)
* Add when's to some tau dependencies

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-09-05 10:35:01 -07:00
estewart08
33621a9860 [rocm-openmp-extras] - Add support for flang-legacy in 6.1.2 (#46130)
* [rocm-openmp-extras] - Add support for flang-legacy in 6.1.2
* [rocm-openmp-extras] - Remove unused variable flang_legacy_dir
* [rocm-openmp-extras] - Limit flang-legacy build to 6.1 and newer ROCm versions
2024-09-05 12:27:50 -05:00
etiennemlb
055eb3cd94 PLUMED: Using a C compiler variable (#46082)
* Using a C compiler variable

* homogeneity
2024-09-05 10:43:28 -06:00
Massimiliano Culpo
c4d18671fe Avoid best-effort expansion of stacks (#40792)
fixes #40791

Currently stacks behave differently if used in unify:false
environments, which leads to inconsistencies during concretization.

For instance, two abstract user specs that do not intersect with
each other might map to the same concrete spec in the
environment. This is clearly wrong.

This PR removes the best effort expansion, so that user specs
are always applied strictly.
2024-09-05 10:32:15 -06:00
Thomas Madlener
5d9f0cf44e gaudi: Specify boost dependencies explicitly and cleanup package (#46194)
* gaudi: Specify boost components and add +fiber for v39

* gaudi: Limit fmt version to allow building master branch

* Make boost dependencies a bit more readable

* Remove patches for no longer existing versions
2024-09-05 08:27:04 -06:00
Harmen Stoppels
02faa7b97e reindex: ensure database is empty before reindex (#46199)
fixes two tests that did not clear the in-memory bits of a database
before calling reindex.
2024-09-05 14:53:29 +02:00
Mikael Simberg
d37749cedd pika: Add 0.28.0 (#46207)
* pika: Add 0.28.0

* Add conflict between pika 0.28.0 and dla-future
2024-09-05 14:12:21 +02:00
Harmen Stoppels
7e5e6f2833 Pass Database layout in constructor (#46219)
Ensures that Database instances do not reference a global
`spack.store.STORE.layout`. Simplify Database.{add,reindex} signature.
2024-09-05 10:49:03 +00:00
Massimiliano Culpo
37ea9657cf Remove test_external_package_module (#46218)
This test was possibly meant for the Cray platform, and
currently is a no-op.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-05 12:10:59 +02:00
Harmen Stoppels
2107a88514 spack deprecate: deprecate --link-type flag (#46202) 2024-09-05 11:06:46 +02:00
Auriane R.
1a4b07e730 Replace if ... in spec with spec.satisfies in d* and e* packages (#46126)
* Replace if ... in spec with spec.satisfies in d* and e* packages

* Use virtuals for different mpi implementations in esmf

* esmf: ^[virtuals=mpi] mpt

* extrae: ^[virtuals=mpi] intel-oneapi-mpi

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-04 22:33:51 -06:00
Jack Morrison
c98045e028 libfabric: Add CUDA variant (#46203)
Signed-off-by: Jack Morrison <jack.morrison@cornelisnetworks.com>
2024-09-04 21:43:25 -06:00
mvlopri
bc5456a791 seacas: require +metis and +mpi instead of +parmetis (#46205)
This change aligns the build condition for parmetis with the
depends_on condition.
The current build condition for parmetis looks for "+parmetis" in
the spec, which is not added by the depends_on, as that adds
"^parmetis" instead (sketched below).
2024-09-04 21:03:02 -06:00
Pranav Sivaraman
656720a387 py-schema: add v0.7.7 (#46210)
* py-schema: add v0.7.7

* py-schema: fix when spec

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-04 20:07:35 -06:00
Mikael Simberg
9604c0a9b3 boost: Conditionally include/exclude Boost.Json depending on Boost version (#46200) 2024-09-04 16:50:06 -06:00
Pranav Sivaraman
ee96194486 py-misk: add new package (#46153) 2024-09-04 11:25:01 -07:00
Felix Thaler
ab21fc1daf Added jump package (#46164) 2024-09-04 11:16:20 -07:00
Pranav Sivaraman
d593ad0c06 py-trieregex: new package (#46154)
* py-trieregex: new package
2024-09-04 10:58:08 -07:00
Weiqun Zhang
254fe6ed6e amrex: add v24.09 (#46171) 2024-09-04 10:54:49 -07:00
renjithravindrankannath
7e20874f54 rocm-openmp-extras: Avoiding registration of duplicate check-targets and fix for the failure in hostexec (#45658)
* Adding additional check for omptarget library for amdgpu in nvidia environment
* Avoiding registration of duplicate check-targets when built on cuda
* Adding hsa library path in LD_LIBRARY_PATH
* Correction in hsa prefix library path in LD_LIBRARY_PATH
2024-09-04 10:31:28 -07:00
Vanessasaurus
cd4c40fdbd Automated deployment to update package flux-core 2024-09-04 (#46191)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-09-04 10:28:44 -07:00
Harmen Stoppels
c13e8e49fe goaccess: new package (#46193) 2024-09-04 10:23:28 -07:00
Adam J. Stewart
35a2a0b3d0 py-rasterio: add v1.3.11 (#46195)
* py-rasterio: add v1.3.11
* Use default_args
2024-09-04 09:45:16 -07:00
Adam J. Stewart
22d69724f4 py-fiona: add v1.10.0 (#46196) 2024-09-04 09:36:43 -07:00
Adam J. Stewart
f6e3f6eec7 py-numpy: add v2.1.1 (#46197) 2024-09-04 09:33:07 -07:00
G-Ragghianti
866c440f0c Updating git repo location (#46183) 2024-09-04 10:34:05 -05:00
Juan Miguel Carceller
fe6f5b87dc texlive: clean up recipe (#45863)
* texlive: clean up recipe

* Update the poppler dependency

* Fix typo

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-09-04 10:20:23 -05:00
Vicente Bolea
3df3f40984 paraview: add new v5.13.0 release (#46091) 2024-09-04 10:14:52 -05:00
Satish Balay
33f4a40df4 trilinos: update @[master,develop] dependency on kokkos (#46182)
==> Error: InstallError: For Trilinos@[master,develop], ^kokkos version in spec must match version in Trilinos source code. Specify ^kokkos@4.4.00 for trilinos@[master,develop] instead of ^kokkos@4.3.01.
2024-09-04 07:34:20 -06:00
Kyle Knoepfel
7ba0132f66 llvm: improve detection regexes (#46187) 2024-09-04 04:12:42 -06:00
Mikael Simberg
744f034dfb Add conflict between llvm-amdgpu until version 5 and ninja since version 1.12.0 (#45788)
* Add conflict between llvm-amdgpu and ninja since version 1.12.0

* Update llvm-amdgpu and ninja conflict to extend to 6.0
2024-09-04 10:37:48 +02:00
Massimiliano Culpo
d41fb3d542 llvm: be more strict with detection (#46179) 2024-09-03 22:37:30 +02:00
Laurent Aphecetche
9b077a360e root: fix @loader_path on macOS (#44826)
* root: fix @loader_path on macOS

* root: use loader patch from upstream

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* add missing comma

* root: rm unused patch

* root: conflict on macos 12 for @:6.27 when +python

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-03 09:50:31 -05:00
Stephen Nicholas Swatman
5c297d8322 root: fix X11 and OpenGL-related issues on macOS (#45632)
* root: Add dependency on libglx

We have been trying to build the Acts package on MacOS, and in this
process we have been running into problems with the ROOT spec on that
operating system; the primary issue we are encountering is that the
compiler is unable to find the `GL/glx.h` header, which is part of glx.
It seems, therefore, that ROOT depends on libglx, but this is not
currently encoded in the spec. This commit ensures that ROOT depends on
the virtual libglx package when both the OpenGL and X11 variants are
enabled.

* Enable builtin glew on MacOS

* Allow `root+opengl+aqua~x` on macOS
2024-09-03 08:32:17 -05:00
Massimiliano Culpo
9e18e63053 solver: minor cleanup and optimization (#46176)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-03 10:35:54 +00:00
Harmen Stoppels
069286bda7 database: remove a few class properties (#46175) 2024-09-03 11:22:14 +02:00
Richard Berger
1679b5e141 nvpl updates (#45922)
* nvpl-blas: add new version 0.3.0

* nvpl-lapack: add new version 0.2.3.1
2024-09-03 08:07:24 +02:00
Pierre Augier
1f935ac356 Add py-pytest-allclose package (#45877) 2024-09-02 14:34:24 -06:00
Pariksheet Nanda
e66e572656 py-cellprofiler: add 4.2.6 new package (#44824)
* py-cellprofiler: add 4.2.6 new package

* py-mysqlclient: Limit pkg-config patch to @1.4:

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-09-02 14:21:39 -05:00
JCGoran
17d3d17d46 py-sympy: add version 1.13.0 (#46163)
* sympy: add version 1.13.0

* sympy: update deps
2024-09-02 09:51:44 -06:00
Thomas Madlener
c814fb5fe6 podio: Add the new datasource variant once it is available (#46078)
* podio: Add the new datasource variant once it is available

* Make sure to require a suitable minimal root version
2024-09-02 08:39:43 -05:00
Harmen Stoppels
fe2d06399f db: type hints (#46168) 2024-09-02 14:44:49 +02:00
Harmen Stoppels
b5dec35113 projections: simplify expression (#46167) 2024-09-02 12:38:47 +00:00
Harmen Stoppels
20565ba8ab remove dead code: fs.is_writable_dir was used on file (#46166) 2024-09-02 11:55:40 +00:00
Wouter Deconinck
c47a3ee05b package_base: sort deprecated versions later in preferred_version (#46025) 2024-09-02 10:42:15 +00:00
Wouter Deconinck
aaa7469b92 py-vector: add v1.4.2, v1.5.0; variant awkward (#46039) 2024-09-02 09:41:15 +02:00
Asher Mancinelli
1f8a6d8e8b [neovim] add utf8proc dependency (#46064)
I believe utf8proc was added as a neovim dependency in neovim/neovim#26165
and is only in the master branch.
2024-09-02 09:28:30 +02:00
Richard Berger
654bf45c01 kokkos-nvcc-wrapper: add new 4.4.00 version (#46067) 2024-09-02 09:27:13 +02:00
Wouter Deconinck
daa42be47f ddt: deprecate all versions in favor of linaro-forge (#46115) 2024-09-02 09:15:47 +02:00
Satish Balay
ca179deb8e petsc, py-petsc4py: add v3.21.5 (#46151) 2024-09-02 09:09:05 +02:00
Satish Balay
8f6092bf83 xsdk: remove develop and 0.7.0, and deprecate 0.8.0 (#46121) 2024-09-02 09:04:36 +02:00
pauleonix
6b649ccf4f cuda: add v12.6.1 (#46143)
Update build system conflict between CUDA 12.6 and Clang 18
2024-09-02 08:46:03 +02:00
Georgia Stuart
d463d4566d docs: update conditional definition arch (#46139)
Signed-off-by: Georgia Stuart <gstuart@umass.edu>

Co-authored-by: Jordan Galby <67924449+Jordan474@users.noreply.github.com>
2024-09-02 08:32:26 +02:00
Adam J. Stewart
f79be3022b py-torchgeo: add v0.6.0 (#46158) 2024-09-02 08:30:02 +02:00
eugeneswalker
7327e731b4 e4s ci stacks: add geopm-runtime (#45881) 2024-09-02 08:24:10 +02:00
Adam J. Stewart
ab6d494fe4 py-horovod: py-torch 2.1.0 now supported (#46152) 2024-09-02 08:21:16 +02:00
Christophe Prud'homme
e5aa74e7cb package cln 1.3.7 feelpp/spack#2 (#46162)
* package cln 1.3.7 feelpp/spack#2

* add myself as maintainer

* fix style issue, rm blankline
2024-09-01 22:38:26 -05:00
Gavin John
728f8e2654 nanomath: add version 1.4.0 (#46159) 2024-09-01 17:23:57 -06:00
Stephen Sachs
9a58a6da0d [openmpi] Add optional debug build variant (#45708) 2024-09-01 11:02:04 -05:00
Stephen Nicholas Swatman
395491815a dd4hep: mark conflict with root@6.31.1: (#45855)
dd4hep versions up to and including 1.27 had a conflict with root
versions starting from 6.31.1, as shown in
https://github.com/AIDASoft/DD4hep/issues/1210. This PR explicitly adds
that conflict to the spec.
2024-09-01 10:54:20 -05:00
Joseph Wang
fd98ebed9d add rapidjson conflict for gcc14 (#46007) 2024-09-01 10:01:53 -05:00
Harmen Stoppels
f7de621d0c Remove redundant inspect.getmodule(self) idiom in packages (#46073) 2024-09-01 11:25:51 +02:00
dunatotatos
a5f404cff5 Update py-numcodecs. (#45715) 2024-08-31 15:15:18 -06:00
Martin Pokorny
8100b0d575 casacore: add new versions 3.6.1, 3.6.0, 3.2.1 (#46068) 2024-08-31 15:14:52 -06:00
Juan Miguel Carceller
b38ab54028 whizard: add a patch when using hepmc3 3.3.0 or newer (#45862)
* whizard: add a patch when using hepmc3 3.3.0 or newer

* whizard: comment with patch origin

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-08-31 13:13:42 -06:00
Stephen Nicholas Swatman
412f22b76a podio: apply patch for gcc 14 builds (#45854)
* podio: apply patch for gcc 14 builds

Podio versions after 0.17.0 but before 1.0.0 are broken when using gcc
14 because of a missing include, which is addressed in the podio pull
request at https://github.com/AIDASoft/podio/pull/600. This commit
patches pre-1.0.0 versions of Podio so they can be compiled with gcc 14,
which is important for building Acts.

* Style

* Style 2

* Fixes

* Add comment:

* Add sha256
2024-08-31 13:42:02 -05:00
Jen Herting
d226ef31bd New package: py-jsonlines (#46124)
* py-jsonlines: new package

* py-jsonlines: fix dependency

---------

Co-authored-by: Alex C Leute <acl2809@rit.edu>
2024-08-31 12:30:07 -05:00
Jen Herting
ae32af927d New package: py-ops (#46122)
* New package: py-ops

* [py-ops]

- added version 2.16.0
- ran black
- updated copyright
- added license()

---------

Co-authored-by: vehrc <vehrc@rit.edu>
2024-08-31 12:11:27 -05:00
Alex Richert
400dd40492 sigio: add v2.3.3 (#46116) 2024-08-31 12:01:08 -05:00
dependabot[bot]
04bdff33ad build(deps): bump actions/setup-python from 5.1.1 to 5.2.0 (#46129)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5.1.1 to 5.2.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](39cd14951b...f677139bbe)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-31 11:51:49 -05:00
Pranav Sivaraman
017e3dd417 doctest: add new package (#46138) 2024-08-31 11:48:20 -05:00
Alex Richert
f7e3902ca8 landsfcutil: add v2.4.2 (#46144) 2024-08-31 09:13:10 -05:00
Alex Richert
89da8d4c84 gfsio: add v1.4.2 (#46145) 2024-08-31 09:12:23 -05:00
Alex Richert
8cac74699b sfcio: add v1.4.2 (#46146)
* sfcio: add v1.4.2

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA
2024-08-31 09:11:13 -05:00
dependabot[bot]
db311eef46 build(deps): bump actions/upload-artifact from 4.3.6 to 4.4.0 (#46149)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 4.3.6 to 4.4.0.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](834a144ee9...50769540e7)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-31 08:38:16 -05:00
etiennemlb
1427735876 unzip: use more generic strip flag for cce (#46087)
* Use more generic strip flag for cce

* [@spackbot] updating style on behalf of etiennemlb

* Apply always
2024-08-30 12:57:24 -05:00
Kacper Kornet
f88ca8cc1f plumed: add v2.9.1 (#46022) 2024-08-30 15:29:49 +02:00
Massimiliano Culpo
bf1f4e15ee boost: remove Compiler.cxx_names (#46037) 2024-08-30 13:25:40 +02:00
Harmen Stoppels
dd756d53de windows-vis: fix failing pipeline (#46135)
* seacas: fix gnu parallel dep

* add vtk@9.0 platform=windows conflict
2024-08-30 12:57:16 +02:00
Massimiliano Culpo
1c1970e727 Put some more constraint on a few mpi providers (#46132)
This should help avoid selecting, by default, niche implementations that are supposed to be externals.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-08-30 11:16:35 +02:00
Massimiliano Culpo
c283fce487 Remove DetectedPackage class (#46071)
This PR simplifies the code doing external spec detection by removing
the `DetectedPackage` class. Functions that used to accept or return lists
of `DetectedPackage` now accept or return lists of specs.

Performance doesn't seem to change if we use `Spec.__reduce__` instead
of `DetectedPackage.__reduce__`.
2024-08-30 08:11:17 +00:00
535 changed files with 6330 additions and 3865 deletions

View File

@@ -1,4 +1,5 @@
 {
+"name": "Ubuntu 20.04",
 "image": "ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4:2023.08.01",
 "postCreateCommand": "./.devcontainer/postCreateCommand.sh"
 }

View File

@@ -0,0 +1,5 @@
+{
+"name": "Ubuntu 22.04",
+"image": "ghcr.io/spack/ubuntu-22.04:v2024-05-07",
+"postCreateCommand": "./.devcontainer/postCreateCommand.sh"
+}

View File

@@ -29,7 +29,7 @@ jobs:
 shell: ${{ matrix.system.shell }}
 steps:
 - uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
-- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
 with:
 python-version: ${{inputs.python_version}}
 - name: Install Python packages

View File

@@ -63,7 +63,7 @@ jobs:
 uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
 with:
 fetch-depth: 0
-- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
 with:
 python-version: "3.12"
 - name: Bootstrap clingo
@@ -112,10 +112,10 @@ jobs:
 runs-on: ${{ matrix.runner }}
 strategy:
 matrix:
-runner: ['macos-13', 'macos-14', "ubuntu-latest"]
+runner: ['macos-13', 'macos-14', "ubuntu-latest", "windows-latest"]
 steps:
 - name: Setup macOS
-if: ${{ matrix.runner != 'ubuntu-latest' }}
+if: ${{ matrix.runner != 'ubuntu-latest' && matrix.runner != 'windows-latest'}}
 run: |
 brew install tree
 # Remove GnuPG since we want to bootstrap it
@@ -124,11 +124,16 @@ jobs:
 if: ${{ matrix.runner == 'ubuntu-latest' }}
 run: |
 sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
+- name: Setup Windows
+if: ${{ matrix.runner == 'windows-latest' }}
+run: |
+Remove-Item -Path (Get-Command gpg).Path
+Remove-Item -Path (Get-Command file).Path
 - name: Checkout
 uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
 with:
 fetch-depth: 0
-- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
+- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
 with:
 python-version: |
 3.8
@@ -137,11 +142,20 @@ jobs:
 3.11
 3.12
 - name: Set bootstrap sources
+env:
+SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
+SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
 run: |
+${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
+spack bootstrap disable github-actions-v0.4
+- name: Disable from source bootstrap
+if: ${{ matrix.runner != 'windows-latest' }}
+run: |
 source share/spack/setup-env.sh
-spack bootstrap disable github-actions-v0.4
 spack bootstrap disable spack-install
 - name: Bootstrap clingo
+# No binary clingo on Windows yet
+if: ${{ matrix.runner != 'windows-latest' }}
 run: |
 set -e
 for ver in '3.8' '3.9' '3.10' '3.11' '3.12' ; do
@@ -164,7 +178,24 @@ jobs:
fi
done
- name: Bootstrap GnuPG
env:
SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
run: |
source share/spack/setup-env.sh
${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
spack -d gpg list
tree ~/.spack/bootstrap/store/
${{ env.VALIDATE_LAST_EXIT }}
tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
- name: Bootstrap File
env:
SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
run: |
${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
spack -d python share/spack/qa/bootstrap-file.py
${{ env.VALIDATE_LAST_EXIT }}
tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
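The `${{ cond && 'ps1' || 'sh' }}` expressions above emulate a ternary with short-circuiting `&&`/`||`, since workflow expressions have no if/else form. A small Python analogue of the same idiom, illustrative only:

    runner = "windows-latest"              # stand-in for matrix.runner
    ext = runner == "windows-latest" and "ps1" or "sh"
    assert ext == "ps1"
    # Pitfall of the and/or chain: a falsy "true" branch falls through.
    assert (True and "" or "sh") == "sh"   # yields "sh", not ""
    # A real conditional expression avoids that, where the language has one:
    ext = "ps1" if runner == "windows-latest" else "sh"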


@@ -87,7 +87,7 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@834a144ee995460fba8ed112a2fc961b36a5ec5a
uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
with:
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
@@ -126,7 +126,7 @@ jobs:
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@834a144ee995460fba8ed112a2fc961b36a5ec5a
uses: actions/upload-artifact/merge@50769540e7f4bd5e21e526ee35c689e35e0d6874
with:
name: dockerfiles
pattern: dockerfiles_*


@@ -17,7 +17,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: 3.9
- name: Install Python packages


@@ -43,7 +43,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -91,7 +91,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
- name: Install System packages
@@ -151,7 +151,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
- name: Install System packages
@@ -188,7 +188,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -225,7 +225,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: 3.9
- name: Install Python packages


@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
cache: 'pip'
@@ -38,7 +38,7 @@ jobs:
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
- uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
with:
python-version: '3.11'
cache: 'pip'


@@ -115,12 +115,6 @@ config:
suppress_gpg_warnings: false
# If set to true, Spack will attempt to build any compiler on the spec
# that is not already available. If set to False, Spack will only use
# compilers already configured in compilers.yaml
install_missing_compilers: false
# If set to true, Spack will always check checksums after downloading
# archives. If false, Spack skips the checksum step.
checksum: true


@@ -72,3 +72,13 @@ packages:
permissions:
read: world
write: user
cray-mpich:
buildable: false
cray-mvapich2:
buildable: false
fujitsu-mpi:
buildable: false
hpcx-mpi:
buildable: false
spectrum-mpi:
buildable: false


@@ -218,6 +218,7 @@ def setup(sphinx):
("py:class", "spack.spec.SpecfileReaderBase"),
("py:class", "spack.install_test.Pb"),
("py:class", "spack.filesystem_view.SimpleFilesystemView"),
("py:class", "spack.traverse.EdgeAndDepth"),
]
# The reST default role (used for this markup: `text`) to use for all documents.


@@ -181,10 +181,6 @@ Spec-related modules
:mod:`spack.parser`
Contains :class:`~spack.parser.SpecParser` and functions related to parsing specs.
:mod:`spack.concretize`
Contains :class:`~spack.concretize.Concretizer` implementation,
which allows site administrators to change Spack's :ref:`concretization-policies`.
:mod:`spack.version`
Implements a simple :class:`~spack.version.Version` class with simple
comparison semantics. Also implements :class:`~spack.version.VersionRange`


@@ -863,7 +863,7 @@ named list ``compilers`` is ``['%gcc', '%clang', '%intel']`` on
spack:
definitions:
- compilers: ['%gcc', '%clang']
- when: arch.satisfies('x86_64:')
- when: arch.satisfies('target=x86_64:')
compilers: ['%intel']
.. note::


@@ -663,11 +663,7 @@ build the package.
When including a bootstrapping phase as in the example above, the result is that
the bootstrapped compiler packages will be pushed to the binary mirror (and the
local artifacts mirror) before the actual release specs are built. In this case,
the jobs corresponding to subsequent release specs are configured to
``install_missing_compilers``, so that if spack is asked to install a package
with a compiler it doesn't know about, it can be quickly installed from the
binary mirror first.
local artifacts mirror) before the actual release specs are built.
Since bootstrapping compilers is optional, those items can be left out of the
environment/stack file, and in that case no bootstrapping will be done (only the


@@ -5,8 +5,8 @@ sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.18.0
urllib3==2.2.2
pytest==8.3.2
urllib3==2.2.3
pytest==8.3.3
isort==5.13.2
black==24.8.0
flake8==7.1.1


@@ -27,8 +27,6 @@
from llnl.util.lang import dedupe, memoized
from llnl.util.symlink import islink, readlink, resolve_link_target_relative_to_the_link, symlink
from spack.util.executable import Executable, which
from ..path import path_to_os_path, system_path_filter
if sys.platform != "win32":
@@ -53,7 +51,6 @@
"find_all_headers",
"find_libraries",
"find_system_libraries",
"fix_darwin_install_name",
"force_remove",
"force_symlink",
"getuid",
@@ -248,42 +245,6 @@ def path_contains_subdirectory(path, root):
return norm_path.startswith(norm_root)
@memoized
def file_command(*args):
"""Creates entry point to `file` system command with provided arguments"""
file_cmd = which("file", required=True)
for arg in args:
file_cmd.add_default_arg(arg)
return file_cmd
@memoized
def _get_mime_type():
"""Generate method to call `file` system command to aquire mime type
for a specified path
"""
if sys.platform == "win32":
# -h option (no-dereference) does not exist in Windows
return file_command("-b", "--mime-type")
else:
return file_command("-b", "-h", "--mime-type")
def mime_type(filename):
"""Returns the mime type and subtype of a file.
Args:
filename: file to be analyzed
Returns:
Tuple containing the MIME type and subtype
"""
output = _get_mime_type()(filename, output=str, error=str).strip()
tty.debug("==> " + output)
type, _, subtype = output.partition("/")
return type, subtype
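The helpers above (removed here and re-homed under spack.util.filesystem later in this set of changes) wrap the `file -b --mime-type` command and split its "type/subtype" output on the first slash. A hedged sketch of just the splitting step, with an invented output string:

    # Illustrative only: how mime_type() splits what `file` prints.
    output = "text/x-shellscript"
    mtype, _, subtype = output.partition("/")
    assert (mtype, subtype) == ("text", "x-shellscript")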
#: This generates the library filenames that may appear on any OS.
library_extensions = ["a", "la", "so", "tbd", "dylib"]
@@ -1679,41 +1640,6 @@ def safe_remove(*files_or_dirs):
raise
@system_path_filter
def fix_darwin_install_name(path):
"""Fix install name of dynamic libraries on Darwin to have full path.
There are two parts of this task:
1. Use ``install_name('-id', ...)`` to change install name of a single lib
2. Use ``install_name('-change', ...)`` to change the cross linking between
libs. The function assumes that all libraries are in one folder and
currently won't follow subfolders.
Parameters:
path (str): directory in which .dylib files are located
"""
libs = glob.glob(join_path(path, "*.dylib"))
for lib in libs:
# fix install name first:
install_name_tool = Executable("install_name_tool")
install_name_tool("-id", lib, lib)
otool = Executable("otool")
long_deps = otool("-L", lib, output=str).split("\n")
deps = [dep.partition(" ")[0][1::] for dep in long_deps[2:-1]]
# fix all dependencies:
for dep in deps:
for loc in libs:
# We really want to check for either
# dep == os.path.basename(loc) or
# dep == join_path(builddir, os.path.basename(loc)),
# but we don't know builddir (nor how symbolic links look
# in builddir). We thus only compare the basenames.
if os.path.basename(dep) == os.path.basename(loc):
install_name_tool("-change", dep, loc, lib)
break
def find_first(root: str, files: Union[Iterable[str], str], bfs_depth: int = 2) -> Optional[str]:
"""Find the first file matching a pattern.


@@ -46,12 +46,14 @@ def _search_duplicate_compilers(error_cls):
import pickle
import re
import warnings
from typing import Iterable, List, Set, Tuple
from urllib.request import urlopen
import llnl.util.lang
import spack.config
import spack.patch
import spack.paths
import spack.repo
import spack.spec
import spack.util.crypto
@@ -73,7 +75,9 @@ def __init__(self, summary, details):
self.details = tuple(details)
def __str__(self):
return self.summary + "\n" + "\n".join([" " + detail for detail in self.details])
if self.details:
return f"{self.summary}\n" + "\n".join(f" {detail}" for detail in self.details)
return self.summary
def __eq__(self, other):
if self.summary != other.summary or self.details != other.details:
@@ -257,40 +261,6 @@ def _search_duplicate_specs_in_externals(error_cls):
return errors
@config_packages
def _deprecated_preferences(error_cls):
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.23)"""
# TODO (v0.23): remove this audit as the attributes will not be allowed in config
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
def make_error(attribute_name, config_data, summary):
s = io.StringIO()
s.write("Occurring in the following file:\n")
dict_view = syaml.syaml_dict((k, v) for k, v in config_data.items() if k == attribute_name)
syaml.dump_config(dict_view, stream=s, blame=True)
return error_cls(summary=summary, details=[s.getvalue()])
if "all" in packages_yaml and "version" in packages_yaml["all"]:
summary = "Using the deprecated 'version' attribute under 'packages:all'"
errors.append(make_error("version", packages_yaml["all"], summary))
for package_name in packages_yaml:
if package_name == "all":
continue
package_conf = packages_yaml[package_name]
for attribute in ("compiler", "providers", "target"):
if attribute not in package_conf:
continue
summary = (
f"Using the deprecated '{attribute}' attribute " f"under 'packages:{package_name}'"
)
errors.append(make_error(attribute, package_conf, summary))
return errors
@config_packages
def _avoid_mismatched_variants(error_cls):
"""Warns if variant preferences have mismatched types or names."""
@@ -713,6 +683,88 @@ def _ensure_env_methods_are_ported_to_builders(pkgs, error_cls):
return errors
class DeprecatedMagicGlobals(ast.NodeVisitor):
def __init__(self, magic_globals: Iterable[str]):
super().__init__()
self.magic_globals: Set[str] = set(magic_globals)
# State to track whether we're in a class function
self.depth: int = 0
self.in_function: bool = False
self.path = (ast.Module, ast.ClassDef, ast.FunctionDef)
# Defined locals in the current function (heuristically at least)
self.locals: Set[str] = set()
# List of (name, lineno) tuples for references to magic globals
self.references_to_globals: List[Tuple[str, int]] = []
def descend_in_function_def(self, node: ast.AST) -> None:
if not isinstance(node, self.path[self.depth]):
return
self.depth += 1
if self.depth == len(self.path):
self.in_function = True
super().generic_visit(node)
if self.depth == len(self.path):
self.in_function = False
self.locals.clear()
self.depth -= 1
def generic_visit(self, node: ast.AST) -> None:
# Recurse into function definitions
if self.depth < len(self.path):
return self.descend_in_function_def(node)
elif not self.in_function:
return
elif isinstance(node, ast.Global):
for name in node.names:
if name in self.magic_globals:
self.references_to_globals.append((name, node.lineno))
elif isinstance(node, ast.Assign):
# visit the rhs before lhs
super().visit(node.value)
for target in node.targets:
super().visit(target)
elif isinstance(node, ast.Name) and node.id in self.magic_globals:
if isinstance(node.ctx, ast.Load) and node.id not in self.locals:
self.references_to_globals.append((node.id, node.lineno))
elif isinstance(node.ctx, ast.Store):
self.locals.add(node.id)
else:
super().generic_visit(node)
@package_properties
def _uses_deprecated_globals(pkgs, error_cls):
"""Ensure that packages do not use deprecated globals"""
errors = []
for pkg_name in pkgs:
# some packages scheduled to be removed in v0.23 are not worth fixing.
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
if all(v.get("deprecated", False) for v in pkg_cls.versions.values()):
continue
file = spack.repo.PATH.filename_for_package_name(pkg_name)
tree = ast.parse(open(file).read())
visitor = DeprecatedMagicGlobals(("std_cmake_args",))
visitor.visit(tree)
if visitor.references_to_globals:
errors.append(
error_cls(
f"Package '{pkg_name}' uses deprecated globals",
[
f"{file}:{line} references '{name}'"
for name, line in visitor.references_to_globals
],
)
)
return errors
@package_https_directives
def _linting_package_file(pkgs, error_cls):
"""Check for correctness of links"""


@@ -54,6 +54,7 @@
import spack.util.archive
import spack.util.crypto
import spack.util.file_cache as file_cache
import spack.util.filesystem as ssys
import spack.util.gpg
import spack.util.parallel
import spack.util.path
@@ -105,7 +106,7 @@ class BuildCacheDatabase(spack_db.Database):
record_fields = ("spec", "ref_count", "in_buildcache")
def __init__(self, root):
super().__init__(root, lock_cfg=spack_db.NO_LOCK)
super().__init__(root, lock_cfg=spack_db.NO_LOCK, layout=None)
self._write_transaction_impl = llnl.util.lang.nullcontext
self._read_transaction_impl = llnl.util.lang.nullcontext
@@ -687,7 +688,7 @@ def get_buildfile_manifest(spec):
# Non-symlinks.
for rel_path in visitor.files:
abs_path = os.path.join(root, rel_path)
m_type, m_subtype = fsys.mime_type(abs_path)
m_type, m_subtype = ssys.mime_type(abs_path)
if relocate.needs_binary_relocation(m_type, m_subtype):
# Why is this branch not part of needs_binary_relocation? :(
@@ -788,7 +789,9 @@ def sign_specfile(key: str, specfile_path: str) -> str:
return signed_specfile_path
def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_dir, concurrency):
def _read_specs_and_push_index(
file_list, read_method, cache_prefix, db: BuildCacheDatabase, temp_dir, concurrency
):
"""Read all the specs listed in the provided list, using thread given thread parallelism,
generate the index, and push it to the mirror.
@@ -812,7 +815,7 @@ def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_di
else:
continue
db.add(fetched_spec, None)
db.add(fetched_spec)
db.mark(fetched_spec, "in_buildcache", True)
# Now generate the index, compute its hash, and push the two files to
@@ -1765,7 +1768,7 @@ def _oci_update_index(
for spec_dict in spec_dicts:
spec = Spec.from_dict(spec_dict)
db.add(spec, directory_layout=None)
db.add(spec)
db.mark(spec, "in_buildcache", True)
# Create the index.json file
@@ -2562,7 +2565,7 @@ def install_root_node(spec, unsigned=False, force=False, sha256=None):
tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, download_result, force)
spack.hooks.post_install(spec, False)
spack.store.STORE.db.add(spec, spack.store.STORE.layout)
spack.store.STORE.db.add(spec)
def install_single_spec(spec, unsigned=False, force=False):


@@ -9,6 +9,7 @@
all_core_root_specs,
ensure_clingo_importable_or_raise,
ensure_core_dependencies,
ensure_file_in_path_or_raise,
ensure_gpg_in_path_or_raise,
ensure_patchelf_in_path_or_raise,
)
@@ -19,6 +20,7 @@
"is_bootstrapping",
"ensure_bootstrap_configuration",
"ensure_core_dependencies",
"ensure_file_in_path_or_raise",
"ensure_gpg_in_path_or_raise",
"ensure_clingo_importable_or_raise",
"ensure_patchelf_in_path_or_raise",


@@ -472,7 +472,8 @@ def ensure_clingo_importable_or_raise() -> None:
def gnupg_root_spec() -> str:
"""Return the root spec used to bootstrap GnuPG"""
return _root_spec("gnupg@2.3:")
root_spec_name = "win-gpg" if IS_WINDOWS else "gnupg"
return _root_spec(f"{root_spec_name}@2.3:")
def ensure_gpg_in_path_or_raise() -> None:
@@ -482,6 +483,19 @@ def ensure_gpg_in_path_or_raise() -> None:
)
def file_root_spec() -> str:
"""Return the root spec used to bootstrap file"""
root_spec_name = "win-file" if IS_WINDOWS else "file"
return _root_spec(root_spec_name)
def ensure_file_in_path_or_raise() -> None:
"""Ensure file is in the PATH or raise"""
return ensure_executables_in_path_or_raise(
executables=["file"], abstract_spec=file_root_spec()
)
def patchelf_root_spec() -> str:
"""Return the root spec used to bootstrap patchelf"""
# 0.13.1 is the last version not to require C++17.
@@ -565,14 +579,15 @@ def ensure_core_dependencies() -> None:
"""Ensure the presence of all the core dependencies."""
if sys.platform.lower() == "linux":
ensure_patchelf_in_path_or_raise()
if not IS_WINDOWS:
ensure_gpg_in_path_or_raise()
elif sys.platform == "win32":
ensure_file_in_path_or_raise()
ensure_gpg_in_path_or_raise()
ensure_clingo_importable_or_raise()
def all_core_root_specs() -> List[str]:
"""Return a list of all the core root specs that may be used to bootstrap Spack"""
return [clingo_root_spec(), gnupg_root_spec(), patchelf_root_spec()]
return [clingo_root_spec(), gnupg_root_spec(), patchelf_root_spec(), file_root_spec()]
def bootstrapping_sources(scope: Optional[str] = None):


@@ -88,7 +88,7 @@ def _core_requirements() -> List[RequiredResponseType]:
def _buildcache_requirements() -> List[RequiredResponseType]:
_buildcache_exes = {
"file": _missing("file", "required to analyze files for buildcaches"),
"file": _missing("file", "required to analyze files for buildcaches", system_only=False),
("gpg2", "gpg"): _missing("gpg2", "required to sign/verify buildcaches", False),
}
if platform.system().lower() == "darwin":


@@ -1553,21 +1553,21 @@ class ModuleChangePropagator:
_PROTECTED_NAMES = ("package", "current_module", "modules_in_mro", "_set_attributes")
def __init__(self, package):
def __init__(self, package: spack.package_base.PackageBase) -> None:
self._set_self_attributes("package", package)
self._set_self_attributes("current_module", package.module)
#: Modules for the classes in the MRO up to PackageBase
modules_in_mro = []
for cls in type(package).__mro__:
module = cls.module
for cls in package.__class__.__mro__:
module = getattr(cls, "module", None)
if module == self.current_module:
continue
if module == spack.package_base:
if module is None or module is spack.package_base:
break
if module is self.current_module:
continue
modules_in_mro.append(module)
self._set_self_attributes("modules_in_mro", modules_in_mro)
self._set_self_attributes("_set_attributes", {})


@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
import os.path
import stat
@@ -549,13 +548,12 @@ def autoreconf(self, pkg, spec, prefix):
tty.warn("* a custom AUTORECONF phase in the package *")
tty.warn("*********************************************************")
with fs.working_dir(self.configure_directory):
m = inspect.getmodule(self.pkg)
# This line is what is needed most of the time
# --install, --verbose, --force
autoreconf_args = ["-ivf"]
autoreconf_args += self.autoreconf_search_path_args
autoreconf_args += self.autoreconf_extra_args
m.autoreconf(*autoreconf_args)
self.pkg.module.autoreconf(*autoreconf_args)
@property
def autoreconf_search_path_args(self):
@@ -579,7 +577,9 @@ def set_configure_or_die(self):
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
inspect.getmodule(self.pkg).configure = Executable(self.configure_abs_path)
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self):
"""Return the list of all the arguments that must be passed to configure,
@@ -596,7 +596,7 @@ def configure(self, pkg, spec, prefix):
options += self.configure_args()
with fs.working_dir(self.build_directory, create=True):
inspect.getmodule(self.pkg).configure(*options)
pkg.module.configure(*options)
def build(self, pkg, spec, prefix):
"""Run "make" on the build targets specified by the builder."""
@@ -604,12 +604,12 @@ def build(self, pkg, spec, prefix):
params = ["V=1"]
params += self.build_targets
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*params)
pkg.module.make(*params)
def install(self, pkg, spec, prefix):
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*self.install_targets)
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
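This file and the build-system files that follow all make the same substitution: `inspect.getmodule(self.pkg)` becomes `pkg.module`. Both resolve to the module object in which the package class is defined, which is where Spack injects build-time helpers such as `make` and `configure`; the attribute access simply avoids the `inspect` lookup. A toy demonstration of the equivalence (plain class, not a Spack package):

    import inspect
    import sys

    class Demo:
        pass

    obj = Demo()
    via_inspect = inspect.getmodule(obj)            # module that defines Demo
    via_name = sys.modules[type(obj).__module__]    # roughly what a .module property returns
    assert via_inspect is via_name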


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
@@ -72,9 +70,7 @@ def check_args(self):
def build(self, pkg, spec, prefix):
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).cargo(
"install", "--root", "out", "--path", ".", *self.build_args
)
pkg.module.cargo("install", "--root", "out", "--path", ".", *self.build_args)
def install(self, pkg, spec, prefix):
"""Copy build files into package prefix."""
@@ -86,4 +82,4 @@ def install(self, pkg, spec, prefix):
def check(self):
"""Run "cargo test"."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).cargo("test", *self.check_args)
self.pkg.module.cargo("test", *self.check_args)


@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections.abc
import inspect
import os
import pathlib
import platform
@@ -544,24 +543,24 @@ def cmake(self, pkg, spec, prefix):
options += self.cmake_args()
options.append(os.path.abspath(self.root_cmakelists_dir))
with fs.working_dir(self.build_directory, create=True):
inspect.getmodule(self.pkg).cmake(*options)
pkg.module.cmake(*options)
def build(self, pkg, spec, prefix):
"""Make the build targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
inspect.getmodule(self.pkg).make(*self.build_targets)
pkg.module.make(*self.build_targets)
elif self.generator == "Ninja":
self.build_targets.append("-v")
inspect.getmodule(self.pkg).ninja(*self.build_targets)
pkg.module.ninja(*self.build_targets)
def install(self, pkg, spec, prefix):
"""Make the install targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
inspect.getmodule(self.pkg).make(*self.install_targets)
pkg.module.make(*self.install_targets)
elif self.generator == "Ninja":
inspect.getmodule(self.pkg).ninja(*self.install_targets)
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)


@@ -3,6 +3,9 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import re
from typing import Iterable, List
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.multimethod import when
@@ -44,6 +47,7 @@ class CudaPackage(PackageBase):
"87",
"89",
"90",
"90a",
)
# FIXME: keep cuda and cuda_arch separate to make usage easier until
@@ -70,6 +74,27 @@ def cuda_flags(arch_list):
for s in arch_list
]
@staticmethod
def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
"""Adds a decimal place to each CUDA arch.
>>> compute_capabilities(['90', '90a'])
['9.0', '9.0a']
Args:
arch_list: A list of integer strings, optionally followed by a suffix.
Returns:
A list of float strings, optionally followed by a suffix.
"""
pattern = re.compile(r"(\d+)")
capabilities = []
for arch in arch_list:
_, number, letter = re.split(pattern, arch)
number = "{0:.1f}".format(float(number) / 10.0)
capabilities.append(number + letter)
return capabilities
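The subtle piece in compute_capabilities is `re.split` with a capturing group, which keeps the matched digits in the result and yields an empty leading field, so the three-way unpack works. A quick illustration with invented arch strings:

    import re

    pattern = re.compile(r"(\d+)")
    print(re.split(pattern, "90a"))    # ['', '90', 'a']
    print(re.split(pattern, "75"))     # ['', '75', '']
    _, number, letter = re.split(pattern, "90a")
    print(f"{float(number) / 10.0:.1f}{letter}")   # 9.0a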
depends_on("cuda", when="+cuda")
# CUDA version vs Architecture
@@ -145,7 +170,8 @@ def cuda_flags(arch_list):
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
conflicts("%clang@18:", when="+cuda ^cuda@:12.6")
conflicts("%clang@18:", when="+cuda ^cuda@:12.5")
conflicts("%clang@19:", when="+cuda ^cuda@:12.6")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")


@@ -3,8 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
@@ -82,7 +80,7 @@ def check_args(self):
def build(self, pkg, spec, prefix):
"""Runs ``go build`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).go("build", *self.build_args)
pkg.module.go("build", *self.build_args)
def install(self, pkg, spec, prefix):
"""Install built binaries into prefix bin."""
@@ -95,4 +93,4 @@ def install(self, pkg, spec, prefix):
def check(self):
"""Run ``go test .`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).go("test", *self.check_args)
self.pkg.module.go("test", *self.check_args)


@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import List
import llnl.util.filesystem as fs
@@ -103,12 +102,12 @@ def edit(self, pkg, spec, prefix):
def build(self, pkg, spec, prefix):
"""Run "make" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*self.build_targets)
pkg.module.make(*self.build_targets)
def install(self, pkg, spec, prefix):
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*self.install_targets)
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)


@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
from typing import List
@@ -195,19 +194,19 @@ def meson(self, pkg, spec, prefix):
options += self.std_meson_args
options += self.meson_args()
with fs.working_dir(self.build_directory, create=True):
inspect.getmodule(self.pkg).meson(*options)
pkg.module.meson(*options)
def build(self, pkg, spec, prefix):
"""Make the build targets"""
options = ["-v"]
options += self.build_targets
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).ninja(*options)
pkg.module.ninja(*options)
def install(self, pkg, spec, prefix):
"""Make the install targets"""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).ninja(*self.install_targets)
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)


@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import List # novm
import llnl.util.filesystem as fs
@@ -104,7 +103,7 @@ def msbuild_install_args(self):
def build(self, pkg, spec, prefix):
"""Run "msbuild" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).msbuild(
pkg.module.msbuild(
*self.std_msbuild_args,
*self.msbuild_args(),
self.define_targets(*self.build_targets),
@@ -114,6 +113,6 @@ def install(self, pkg, spec, prefix):
"""Run "msbuild" on the install targets specified by the builder.
This is INSTALL by default"""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).msbuild(
pkg.module.msbuild(
*self.msbuild_install_args(), self.define_targets(*self.install_targets)
)


@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import List # novm
import llnl.util.filesystem as fs
@@ -132,9 +131,7 @@ def build(self, pkg, spec, prefix):
if self.makefile_name:
opts.append("/F{}".format(self.makefile_name))
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).nmake(
*opts, *self.build_targets, ignore_quotes=self.ignore_quotes
)
pkg.module.nmake(*opts, *self.build_targets, ignore_quotes=self.ignore_quotes)
def install(self, pkg, spec, prefix):
"""Run "nmake" on the install targets specified by the builder.
@@ -146,6 +143,4 @@ def install(self, pkg, spec, prefix):
opts.append("/F{}".format(self.makefile_name))
opts.append(self.define("PREFIX", fs.windows_sfn(prefix)))
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).nmake(
*opts, *self.install_targets, ignore_quotes=self.ignore_quotes
)
pkg.module.nmake(*opts, *self.install_targets, ignore_quotes=self.ignore_quotes)


@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import spack.builder
import spack.package_base
from spack.directives import build_system, extends
@@ -47,7 +45,7 @@ class OctaveBuilder(BaseBuilder):
def install(self, pkg, spec, prefix):
"""Install the package from the archive file"""
inspect.getmodule(self.pkg).octave(
pkg.module.octave(
"--quiet",
"--norc",
"--built-in-docstrings-file=/dev/null",


@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
from typing import Iterable
@@ -134,7 +133,7 @@ def build_method(self):
def build_executable(self):
"""Returns the executable method to build the perl package"""
if self.build_method == "Makefile.PL":
build_executable = inspect.getmodule(self.pkg).make
build_executable = self.pkg.module.make
elif self.build_method == "Build.PL":
build_executable = Executable(os.path.join(self.pkg.stage.source_path, "Build"))
return build_executable
@@ -158,7 +157,7 @@ def configure(self, pkg, spec, prefix):
options = ["Build.PL", "--install_base", prefix]
options += self.configure_args()
inspect.getmodule(self.pkg).perl(*options)
pkg.module.perl(*options)
# It is possible that the shebang in the Build script that is created from
# Build.PL may be too long causing the build to fail. Patching the shebang


@@ -4,7 +4,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import functools
import inspect
import operator
import os
import re
@@ -228,7 +227,7 @@ def test_imports(self) -> None:
# Make sure we are importing the installed modules,
# not the ones in the source directory
python = inspect.getmodule(self).python # type: ignore[union-attr]
python = self.module.python
for module in self.import_modules:
with test_part(
self,
@@ -315,9 +314,9 @@ def get_external_python_for_prefix(self):
)
python_externals_detected = [
d.spec
for d in python_externals_detection.get("python", [])
if d.prefix == self.spec.external_path
spec
for spec in python_externals_detection.get("python", [])
if spec.external_path == self.spec.external_path
]
if python_externals_detected:
return python_externals_detected[0]


@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from llnl.util.filesystem import working_dir
import spack.builder
@@ -66,17 +64,17 @@ def qmake_args(self):
def qmake(self, pkg, spec, prefix):
"""Run ``qmake`` to configure the project and generate a Makefile."""
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).qmake(*self.qmake_args())
pkg.module.qmake(*self.qmake_args())
def build(self, pkg, spec, prefix):
"""Make the build targets"""
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make()
pkg.module.make()
def install(self, pkg, spec, prefix):
"""Make the install targets"""
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make("install")
pkg.module.make("install")
def check(self):
"""Search the Makefile for a ``check:`` target and runs it if found."""


@@ -2,10 +2,10 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from typing import Optional, Tuple
import llnl.util.lang as lang
from llnl.util.filesystem import mkdirp
from spack.directives import extends
@@ -37,6 +37,7 @@ def configure_vars(self):
def install(self, pkg, spec, prefix):
"""Installs an R package."""
mkdirp(pkg.module.r_lib_dir)
config_args = self.configure_args()
config_vars = self.configure_vars()
@@ -44,14 +45,14 @@ def install(self, pkg, spec, prefix):
args = ["--vanilla", "CMD", "INSTALL"]
if config_args:
args.append("--configure-args={0}".format(" ".join(config_args)))
args.append(f"--configure-args={' '.join(config_args)}")
if config_vars:
args.append("--configure-vars={0}".format(" ".join(config_vars)))
args.append(f"--configure-vars={' '.join(config_vars)}")
args.extend(["--library={0}".format(self.pkg.module.r_lib_dir), self.stage.source_path])
args.extend([f"--library={pkg.module.r_lib_dir}", self.stage.source_path])
inspect.getmodule(self.pkg).R(*args)
pkg.module.R(*args)
class RPackage(Package):
@@ -80,27 +81,21 @@ class RPackage(Package):
@lang.classproperty
def homepage(cls):
if cls.cran:
return "https://cloud.r-project.org/package=" + cls.cran
return f"https://cloud.r-project.org/package={cls.cran}"
elif cls.bioc:
return "https://bioconductor.org/packages/" + cls.bioc
return f"https://bioconductor.org/packages/{cls.bioc}"
@lang.classproperty
def url(cls):
if cls.cran:
return (
"https://cloud.r-project.org/src/contrib/"
+ cls.cran
+ "_"
+ str(list(cls.versions)[0])
+ ".tar.gz"
)
return f"https://cloud.r-project.org/src/contrib/{cls.cran}_{str(list(cls.versions)[0])}.tar.gz"
@lang.classproperty
def list_url(cls):
if cls.cran:
return "https://cloud.r-project.org/src/contrib/Archive/" + cls.cran + "/"
return f"https://cloud.r-project.org/src/contrib/Archive/{cls.cran}/"
@property
def git(self):
if self.bioc:
return "https://git.bioconductor.org/packages/" + self.bioc
return f"https://git.bioconductor.org/packages/{self.bioc}"


@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import glob
import inspect
import spack.builder
import spack.package_base
@@ -52,10 +51,10 @@ def build(self, pkg, spec, prefix):
gemspecs = glob.glob("*.gemspec")
rakefiles = glob.glob("Rakefile")
if gemspecs:
inspect.getmodule(self.pkg).gem("build", "--norc", gemspecs[0])
pkg.module.gem("build", "--norc", gemspecs[0])
elif rakefiles:
jobs = inspect.getmodule(self.pkg).make_jobs
inspect.getmodule(self.pkg).rake("package", "-j{0}".format(jobs))
jobs = pkg.module.make_jobs
pkg.module.rake("package", "-j{0}".format(jobs))
else:
# Some Ruby packages only ship `*.gem` files, so nothing to build
pass
@@ -70,6 +69,6 @@ def install(self, pkg, spec, prefix):
# if --install-dir is not used, GEM_PATH is deleted from the
# environment, and Gems required to build native extensions will
# not be found. Those extensions are built during `gem install`.
inspect.getmodule(self.pkg).gem(
pkg.module.gem(
"install", "--norc", "--ignore-dependencies", "--install-dir", prefix, gems[0]
)


@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import spack.builder
import spack.package_base
from spack.directives import build_system, depends_on
@@ -63,8 +61,7 @@ def build_args(self, spec, prefix):
def build(self, pkg, spec, prefix):
"""Build the package."""
args = self.build_args(spec, prefix)
inspect.getmodule(self.pkg).scons(*args)
pkg.module.scons(*self.build_args(spec, prefix))
def install_args(self, spec, prefix):
"""Arguments to pass to install."""
@@ -72,9 +69,7 @@ def install_args(self, spec, prefix):
def install(self, pkg, spec, prefix):
"""Install the package."""
args = self.install_args(spec, prefix)
inspect.getmodule(self.pkg).scons("install", *args)
pkg.module.scons("install", *self.install_args(spec, prefix))
def build_test(self):
"""Run unit tests after build.


@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import os
import re
@@ -86,14 +85,13 @@ def import_modules(self):
def python(self, *args, **kwargs):
"""The python ``Executable``."""
inspect.getmodule(self).python(*args, **kwargs)
self.pkg.module.python(*args, **kwargs)
def test_imports(self):
"""Attempts to import modules of the installed package."""
# Make sure we are importing the installed modules,
# not the ones in the source directory
python = inspect.getmodule(self).python
for module in self.import_modules:
with spack.install_test.test_part(
self,
@@ -101,7 +99,7 @@ def test_imports(self):
purpose="checking import of {0}".format(module),
work_dir="spack-test",
):
python("-c", "import {0}".format(module))
self.python("-c", "import {0}".format(module))
@spack.builder.builder("sip")
@@ -136,7 +134,7 @@ def configure(self, pkg, spec, prefix):
"""Configure the package."""
# https://www.riverbankcomputing.com/static/Docs/sip/command_line_tools.html
args = ["--verbose", "--target-dir", inspect.getmodule(self.pkg).python_platlib]
args = ["--verbose", "--target-dir", pkg.module.python_platlib]
args.extend(self.configure_args())
# https://github.com/Python-SIP/sip/commit/cb0be6cb6e9b756b8b0db3136efb014f6fb9b766
@@ -155,7 +153,7 @@ def build(self, pkg, spec, prefix):
args = self.build_args()
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*args)
pkg.module.make(*args)
def build_args(self):
"""Arguments to pass to build."""
@@ -166,7 +164,7 @@ def install(self, pkg, spec, prefix):
args = self.install_args()
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make("install", *args)
pkg.module.make("install", *args)
def install_args(self):
"""Arguments to pass to install."""


@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
from llnl.util.filesystem import working_dir
import spack.builder
@@ -90,11 +88,11 @@ def build_directory(self):
def python(self, *args, **kwargs):
"""The python ``Executable``."""
inspect.getmodule(self.pkg).python(*args, **kwargs)
self.pkg.module.python(*args, **kwargs)
def waf(self, *args, **kwargs):
"""Runs the waf ``Executable``."""
jobs = inspect.getmodule(self.pkg).make_jobs
jobs = self.pkg.module.make_jobs
with working_dir(self.build_directory):
self.python("waf", "-j{0}".format(jobs), *args, **kwargs)


@@ -6,7 +6,6 @@
import collections.abc
import copy
import functools
import inspect
from typing import List, Optional, Tuple
from llnl.util import lang
@@ -97,11 +96,10 @@ class hierarchy (look at AspellDictPackage for an example of that)
Args:
pkg (spack.package_base.PackageBase): package object for which we need a builder
"""
package_module = inspect.getmodule(pkg)
package_buildsystem = buildsystem_name(pkg)
default_builder_cls = BUILDER_CLS[package_buildsystem]
builder_cls_name = default_builder_cls.__name__
builder_cls = getattr(package_module, builder_cls_name, None)
builder_cls = getattr(pkg.module, builder_cls_name, None)
if builder_cls:
return builder_cls(pkg)


@@ -115,15 +115,11 @@ def audit(parser, args):
def _process_reports(reports):
for check, errors in reports:
if errors:
msg = "{0}: {1} issue{2} found".format(
check, len(errors), "" if len(errors) == 1 else "s"
)
header = "@*b{" + msg + "}"
print(cl.colorize(header))
status = f"{len(errors)} issue{'' if len(errors) == 1 else 's'} found"
print(cl.colorize(f"{check}: @*r{{{status}}}"))
numdigits = len(str(len(errors)))
for idx, error in enumerate(errors):
print(str(idx + 1) + ". " + str(error))
print(f"{idx + 1:>{numdigits}}. {error}")
raise SystemExit(1)
else:
msg = "{0}: 0 issues found.".format(check)
header = "@*b{" + msg + "}"
print(cl.colorize(header))
print(cl.colorize(f"{check}: @*g{{passed}}"))


@@ -7,6 +7,7 @@
import copy
import os
import re
import shlex
import sys
from argparse import ArgumentParser, Namespace
from typing import IO, Any, Callable, Dict, Iterable, List, Optional, Sequence, Set, Tuple, Union
@@ -18,6 +19,7 @@
import spack.cmd
import spack.main
import spack.paths
import spack.platforms
from spack.main import section_descriptions
description = "list available spack commands"
@@ -139,7 +141,7 @@ def usage(self, usage: str) -> str:
cmd = self.parser.prog.replace(" ", "-")
if cmd in self.documented:
string += "\n:ref:`More documentation <cmd-{0}>`\n".format(cmd)
string = f"{string}\n:ref:`More documentation <cmd-{cmd}>`\n"
return string
@@ -249,33 +251,27 @@ def body(
Function body.
"""
if positionals:
return """
return f"""
if $list_options
then
{0}
{self.optionals(optionals)}
else
{1}
{self.positionals(positionals)}
fi
""".format(
self.optionals(optionals), self.positionals(positionals)
)
"""
elif subcommands:
return """
return f"""
if $list_options
then
{0}
{self.optionals(optionals)}
else
{1}
{self.subcommands(subcommands)}
fi
""".format(
self.optionals(optionals), self.subcommands(subcommands)
)
"""
else:
return """
{0}
""".format(
self.optionals(optionals)
)
return f"""
{self.optionals(optionals)}
"""
def positionals(self, positionals: Sequence[str]) -> str:
"""Return the syntax for reporting positional arguments.
@@ -304,7 +300,7 @@ def optionals(self, optionals: Sequence[str]) -> str:
Returns:
Syntax for optional flags.
"""
return 'SPACK_COMPREPLY="{0}"'.format(" ".join(optionals))
return f'SPACK_COMPREPLY="{" ".join(optionals)}"'
def subcommands(self, subcommands: Sequence[str]) -> str:
"""Return the syntax for reporting subcommands.
@@ -315,7 +311,7 @@ def subcommands(self, subcommands: Sequence[str]) -> str:
Returns:
Syntax for subcommand parsers
"""
return 'SPACK_COMPREPLY="{0}"'.format(" ".join(subcommands))
return f'SPACK_COMPREPLY="{" ".join(subcommands)}"'
# Map argument destination names to their complete commands
@@ -395,7 +391,7 @@ def _fish_dest_get_complete(prog: str, dest: str) -> Optional[str]:
subcmd = s[1] if len(s) == 2 else ""
for (prog_key, pos_key), value in _dest_to_fish_complete.items():
if subcmd.startswith(prog_key) and re.match("^" + pos_key + "$", dest):
if subcmd.startswith(prog_key) and re.match(f"^{pos_key}$", dest):
return value
return None
@@ -427,24 +423,6 @@ def format(self, cmd: Command) -> str:
+ self.complete(cmd.prog, positionals, optionals, subcommands)
)
def _quote(self, string: str) -> str:
"""Quote string and escape special characters if necessary.
Args:
string: Input string.
Returns:
Quoted string.
"""
# Goal here is to match fish_indent behavior
# Strings without spaces (or other special characters) do not need to be escaped
if not any([sub in string for sub in [" ", "'", '"']]):
return string
string = string.replace("'", r"\'")
return f"'{string}'"
def optspecs(
self,
prog: str,
@@ -463,7 +441,7 @@ def optspecs(
optspec_var = "__fish_spack_optspecs_" + prog.replace(" ", "_").replace("-", "_")
if optionals is None:
return "set -g %s\n" % optspec_var
return f"set -g {optspec_var}\n"
# Build optspec by iterating over options
args = []
@@ -490,11 +468,11 @@ def optspecs(
long = [f[2:] for f in flags if f.startswith("--")]
while len(short) > 0 and len(long) > 0:
arg = "%s/%s%s" % (short.pop(), long.pop(), required)
arg = f"{short.pop()}/{long.pop()}{required}"
while len(short) > 0:
arg = "%s/%s" % (short.pop(), required)
arg = f"{short.pop()}/{required}"
while len(long) > 0:
arg = "%s%s" % (long.pop(), required)
arg = f"{long.pop()}{required}"
args.append(arg)
@@ -503,7 +481,7 @@ def optspecs(
# indicate that such subcommand exists.
args = " ".join(args)
return "set -g %s %s\n" % (optspec_var, args)
return f"set -g {optspec_var} {args}\n"
@staticmethod
def complete_head(
@@ -524,12 +502,14 @@ def complete_head(
subcmd = s[1] if len(s) == 2 else ""
if index is None:
return "complete -c %s -n '__fish_spack_using_command %s'" % (s[0], subcmd)
return f"complete -c {s[0]} -n '__fish_spack_using_command {subcmd}'"
elif nargs in [argparse.ZERO_OR_MORE, argparse.ONE_OR_MORE, argparse.REMAINDER]:
head = "complete -c %s -n '__fish_spack_using_command_pos_remainder %d %s'"
return (
f"complete -c {s[0]} -n '__fish_spack_using_command_pos_remainder "
f"{index} {subcmd}'"
)
else:
head = "complete -c %s -n '__fish_spack_using_command_pos %d %s'"
return head % (s[0], index, subcmd)
return f"complete -c {s[0]} -n '__fish_spack_using_command_pos {index} {subcmd}'"
def complete(
self,
@@ -597,25 +577,18 @@ def positionals(
if choices is not None:
# If there are choices, we provide a completion for all possible values.
commands.append(head + " -f -a %s" % self._quote(" ".join(choices)))
commands.append(f"{head} -f -a {shlex.quote(' '.join(choices))}")
else:
# Otherwise, we try to find a predefined completion for it
value = _fish_dest_get_complete(prog, args)
if value is not None:
commands.append(head + " " + value)
commands.append(f"{head} {value}")
return "\n".join(commands) + "\n"
def prog_comment(self, prog: str) -> str:
"""Return a comment line for the command.
Args:
prog: Program name.
Returns:
Comment line.
"""
return "\n# %s\n" % prog
"""Return a comment line for the command."""
return f"\n# {prog}\n"
def optionals(
self,
@@ -658,28 +631,28 @@ def optionals(
for f in flags:
if f.startswith("--"):
long = f[2:]
prefix += " -l %s" % long
prefix = f"{prefix} -l {long}"
elif f.startswith("-"):
short = f[1:]
assert len(short) == 1
prefix += " -s %s" % short
prefix = f"{prefix} -s {short}"
# Check if option require argument.
# Currently multi-argument options are not supported, so we treat it like one argument.
if nargs != 0:
prefix += " -r"
prefix = f"{prefix} -r"
if dest is not None:
# If there are choices, we provide a completion for all possible values.
commands.append(prefix + " -f -a %s" % self._quote(" ".join(dest)))
commands.append(f"{prefix} -f -a {shlex.quote(' '.join(dest))}")
else:
# Otherwise, we try to find a predefined completion for it
value = _fish_dest_get_complete(prog, dest)
if value is not None:
commands.append(prefix + " " + value)
commands.append(f"{prefix} {value}")
if help:
commands.append(prefix + " -d %s" % self._quote(help))
commands.append(f"{prefix} -d {shlex.quote(help)}")
return "\n".join(commands) + "\n"
@@ -697,11 +670,11 @@ def subcommands(self, prog: str, subcommands: List[Tuple[ArgumentParser, str, st
head = self.complete_head(prog, 0)
for _, subcommand, help in subcommands:
command = head + " -f -a %s" % self._quote(subcommand)
command = f"{head} -f -a {shlex.quote(subcommand)}"
if help is not None and len(help) > 0:
help = help.split("\n")[0]
command += " -d %s" % self._quote(help)
command = f"{command} -d {shlex.quote(help)}"
commands.append(command)
@@ -747,7 +720,7 @@ def rst_index(out: IO) -> None:
for i, cmd in enumerate(sorted(commands)):
description = description.capitalize() if i == 0 else ""
ref = ":ref:`%s <spack-%s>`" % (cmd, cmd)
ref = f":ref:`{cmd} <spack-{cmd}>`"
comma = "," if i != len(commands) - 1 else ""
bar = "| " if i % 8 == 0 else " "
out.write(line % (description, bar + ref + comma))
@@ -858,10 +831,10 @@ def _commands(parser: ArgumentParser, args: Namespace) -> None:
# check header first so we don't open out files unnecessarily
if args.header and not os.path.exists(args.header):
tty.die("No such file: '%s'" % args.header)
tty.die(f"No such file: '{args.header}'")
if args.update:
tty.msg("Updating file: %s" % args.update)
tty.msg(f"Updating file: {args.update}")
with open(args.update, "w") as f:
prepend_header(args, f)
formatter(args, f)


@@ -14,7 +14,6 @@
installation and its deprecator.
"""
import argparse
import os
import llnl.util.tty as tty
from llnl.util.symlink import symlink
@@ -76,12 +75,7 @@ def setup_parser(sp):
)
sp.add_argument(
"-l",
"--link-type",
type=str,
default="soft",
choices=["soft", "hard"],
help="type of filesystem link to use for deprecation (default soft)",
"-l", "--link-type", type=str, default=None, choices=["soft", "hard"], help="(deprecated)"
)
sp.add_argument(
@@ -91,6 +85,9 @@ def setup_parser(sp):
def deprecate(parser, args):
"""Deprecate one spec in favor of another"""
if args.link_type is not None:
tty.warn("The --link-type option is deprecated and will be removed in a future release.")
env = ev.active_environment()
specs = spack.cmd.parse_specs(args.specs)
@@ -144,7 +141,5 @@ def deprecate(parser, args):
if not answer:
tty.die("Will not deprecate any packages.")
link_fn = os.link if args.link_type == "hard" else symlink
for dcate, dcator in zip(all_deprecate, all_deprecators):
dcate.package.do_deprecate(dcator, link_fn)
dcate.package.do_deprecate(dcator, symlink)


@@ -48,6 +48,7 @@ def setup_parser(subparser):
options = [
("--detectable", print_detectable.__doc__),
("--maintainers", print_maintainers.__doc__),
("--namespace", print_namespace.__doc__),
("--no-dependencies", "do not " + print_dependencies.__doc__),
("--no-variants", "do not " + print_variants.__doc__),
("--no-versions", "do not " + print_versions.__doc__),
@@ -189,6 +190,15 @@ def print_maintainers(pkg, args):
color.cprint(section_title("Maintainers: ") + mnt)
def print_namespace(pkg, args):
"""output package namespace"""
repo = spack.repo.PATH.get_repo(pkg.namespace)
color.cprint("")
color.cprint(section_title("Namespace:"))
color.cprint(f" @c{{{repo.namespace}}} at {repo.root}")
def print_phases(pkg, args):
"""output installation phases"""
@@ -522,6 +532,7 @@ def info(parser, args):
# Now output optional information in expected order
sections = [
(args.all or args.maintainers, print_maintainers),
(args.all or args.namespace, print_namespace),
(args.all or args.detectable, print_detectable),
(args.all or args.tags, print_tags),
(args.all or not args.no_versions, print_versions),


@@ -273,24 +273,24 @@ def find_compilers(
valid_compilers = {}
for name, detected in detected_packages.items():
compilers = [x for x in detected if CompilerConfigFactory.from_external_spec(x.spec)]
compilers = [x for x in detected if CompilerConfigFactory.from_external_spec(x)]
if not compilers:
continue
valid_compilers[name] = compilers
def _has_fortran_compilers(x):
if "compilers" not in x.spec.extra_attributes:
if "compilers" not in x.extra_attributes:
return False
return "fortran" in x.spec.extra_attributes["compilers"]
return "fortran" in x.extra_attributes["compilers"]
if mixed_toolchain:
gccs = [x for x in valid_compilers.get("gcc", []) if _has_fortran_compilers(x)]
if gccs:
best_gcc = sorted(
gccs, key=lambda x: spack.spec.parse_with_version_concrete(x.spec).version
gccs, key=lambda x: spack.spec.parse_with_version_concrete(x).version
)[-1]
gfortran = best_gcc.spec.extra_attributes["compilers"]["fortran"]
gfortran = best_gcc.extra_attributes["compilers"]["fortran"]
for name in ("llvm", "apple-clang"):
if name not in valid_compilers:
continue
@@ -298,11 +298,11 @@ def _has_fortran_compilers(x):
for candidate in candidates:
if _has_fortran_compilers(candidate):
continue
candidate.spec.extra_attributes["compilers"]["fortran"] = gfortran
candidate.extra_attributes["compilers"]["fortran"] = gfortran
new_compilers = []
for name, detected in valid_compilers.items():
for config in CompilerConfigFactory.from_specs([x.spec for x in detected]):
for config in CompilerConfigFactory.from_specs(detected):
c = _compiler_from_config_entry(config["compiler"])
if c in known_compilers:
continue


@@ -19,35 +19,23 @@
import spack.tengine
import spack.util.path
class Concretizer:
"""(DEPRECATED) Only contains logic to enable/disable compiler existence checks."""
#: Controls whether we check that compiler versions actually exist
#: during concretization. Used for testing and for mirror creation
check_for_compiler_existence = None
def __init__(self):
if Concretizer.check_for_compiler_existence is None:
Concretizer.check_for_compiler_existence = not spack.config.get(
"config:install_missing_compilers", False
)
CHECK_COMPILER_EXISTENCE = True
@contextmanager
def disable_compiler_existence_check():
saved = Concretizer.check_for_compiler_existence
Concretizer.check_for_compiler_existence = False
global CHECK_COMPILER_EXISTENCE
CHECK_COMPILER_EXISTENCE, saved = False, CHECK_COMPILER_EXISTENCE
yield
Concretizer.check_for_compiler_existence = saved
CHECK_COMPILER_EXISTENCE = saved
@contextmanager
def enable_compiler_existence_check():
saved = Concretizer.check_for_compiler_existence
Concretizer.check_for_compiler_existence = True
global CHECK_COMPILER_EXISTENCE
CHECK_COMPILER_EXISTENCE, saved = True, CHECK_COMPILER_EXISTENCE
yield
Concretizer.check_for_compiler_existence = saved
CHECK_COMPILER_EXISTENCE = saved
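The class-level flag becomes a module-level global that the two context managers swap and restore. As written (matching the previous behavior), the flag is reset after `yield` without a `try/finally`, so an exception raised inside the `with` block would skip the restore. A hardened sketch of the same pattern, with hypothetical names:

    from contextlib import contextmanager

    CHECK = True

    @contextmanager
    def disabled_check():
        global CHECK
        CHECK, saved = False, CHECK
        try:
            yield
        finally:
            CHECK = saved            # restored even if the body raises

    with disabled_check():
        assert CHECK is False
    assert CHECK is True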
def find_spec(spec, condition, default=None):


@@ -308,8 +308,7 @@ def __call__(self):
return t.render(**self.to_dict())
import spack.container.writers.docker # noqa: E402
# Import after function definition all the modules in this package,
# so that registration of writers will happen automatically
import spack.container.writers.singularity # noqa: E402
from . import docker # noqa: F401 E402
from . import singularity # noqa: F401 E402
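As the retained comment says, these imports exist purely for their side effect: each writer module registers itself when imported. A generic sketch of that import-time registration pattern, with invented names:

    _writers = {}

    def writer(name):
        def decorator(cls):
            _writers[name] = cls     # runs when the writer module is imported
            return cls
        return decorator

    @writer("docker")
    class DockerWriter:
        pass

    print(_writers)   # {'docker': <class '__main__.DockerWriter'>}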


@@ -14,12 +14,14 @@
import llnl.util.tty as tty
import spack.cmd
import spack.compilers
import spack.deptypes as dt
import spack.error
import spack.hash_types as hash_types
import spack.platforms
import spack.repo
import spack.spec
import spack.store
from spack.schema.cray_manifest import schema as manifest_schema
#: Cray systems can store a Spack-compatible description of system
@@ -237,7 +239,7 @@ def read(path, apply_updates):
tty.debug(f"Include this\n{traceback.format_exc()}")
if apply_updates:
for spec in specs.values():
spack.store.STORE.db.add(spec, directory_layout=None)
spack.store.STORE.db.add(spec)
class ManifestValidationError(spack.error.SpackError):


@@ -59,7 +59,11 @@
import spack.util.lock as lk
import spack.util.spack_json as sjson
import spack.version as vn
from spack.directory_layout import DirectoryLayoutError, InconsistentInstallDirectoryError
from spack.directory_layout import (
DirectoryLayout,
DirectoryLayoutError,
InconsistentInstallDirectoryError,
)
from spack.error import SpackError
from spack.util.crypto import bit_length
@@ -203,12 +207,12 @@ class InstallRecord:
def __init__(
self,
spec: "spack.spec.Spec",
path: str,
path: Optional[str],
installed: bool,
ref_count: int = 0,
explicit: bool = False,
installation_time: Optional[float] = None,
deprecated_for: Optional["spack.spec.Spec"] = None,
deprecated_for: Optional[str] = None,
in_buildcache: bool = False,
origin=None,
):
@@ -595,9 +599,11 @@ class Database:
def __init__(
self,
root: str,
*,
upstream_dbs: Optional[List["Database"]] = None,
is_upstream: bool = False,
lock_cfg: LockConfiguration = DEFAULT_LOCK_CFG,
layout: Optional[DirectoryLayout] = None,
) -> None:
"""Database for Spack installations.
@@ -620,6 +626,7 @@ def __init__(
"""
self.root = root
self.database_directory = os.path.join(self.root, _DB_DIRNAME)
self.layout = layout
# Set up layout of database files within the db dir
self._index_path = os.path.join(self.database_directory, "index.json")
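The bare `*` added to the Database signature makes every parameter after `root` keyword-only, which is why call sites such as BuildCacheDatabase above now spell out `lock_cfg=` and `layout=`. A two-line illustration with a hypothetical function:

    def connect(root, *, layout=None):    # mirrors the new keyword-only shape
        return root, layout

    connect("/opt/spack", layout=None)    # fine
    # connect("/opt/spack", None)         # TypeError: takes 1 positional argument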
@@ -664,14 +671,6 @@ def __init__(
self.upstream_dbs = list(upstream_dbs) if upstream_dbs else []
# whether there was an error at the start of a read transaction
self._error = None
# For testing: if this is true, an exception is thrown when missing
# dependencies are detected (rather than just printing a warning
# message)
self._fail_when_missing_deps = False
self._write_transaction_impl = lk.WriteTransaction
self._read_transaction_impl = lk.ReadTransaction
@@ -774,7 +773,13 @@ def query_local_by_spec_hash(self, hash_key):
with self.read_transaction():
return self._data.get(hash_key, None)
def _assign_dependencies(self, spec_reader, hash_key, installs, data):
def _assign_dependencies(
self,
spec_reader: Type["spack.spec.SpecfileReaderBase"],
hash_key: str,
installs: dict,
data: Dict[str, InstallRecord],
):
# Add dependencies from other records in the install DB to
# form a full spec.
spec = data[hash_key].spec
@@ -787,26 +792,20 @@ def _assign_dependencies(self, spec_reader, hash_key, installs, data):
for dname, dhash, dtypes, _, virtuals in spec_reader.read_specfile_dep_specs(
yaml_deps
):
# It is important that we always check upstream installations
# in the same order, and that we always check the local
# installation first: if a downstream Spack installs a package
# then dependents in that installation could be using it.
# If a hash is installed locally and upstream, there isn't
# enough information to determine which one a local package
# depends on, so the convention ensures that this isn't an
# issue.
upstream, record = self.query_by_spec_hash(dhash, data=data)
# It is important that we always check upstream installations in the same order,
# and that we always check the local installation first: if a downstream Spack
# installs a package then dependents in that installation could be using it. If a
# hash is installed locally and upstream, there isn't enough information to
# determine which one a local package depends on, so the convention ensures that
# this isn't an issue.
_, record = self.query_by_spec_hash(dhash, data=data)
child = record.spec if record else None
if not child:
msg = "Missing dependency not in database: " "%s needs %s-%s" % (
spec.cformat("{name}{/hash:7}"),
dname,
dhash[:7],
tty.warn(
f"Missing dependency not in database: "
f"{spec.cformat('{name}{/hash:7}')} needs {dname}-{dhash[:7]}"
)
if self._fail_when_missing_deps:
raise MissingDependenciesError(msg)
tty.warn(msg)
continue
spec._add_dependency(child, depflag=dt.canonicalize(dtypes), virtuals=virtuals)
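``dt.canonicalize`` folds the dependency-type strings read from the spec file into a single flag value; a toy illustration, assuming the usual ``spack.deptypes`` constants:

import spack.deptypes as dt

depflag = dt.canonicalize(("build", "link"))
assert depflag == dt.BUILD | dt.LINK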
@@ -846,7 +845,7 @@ def check(cond, msg):
):
tty.warn(f"Spack database version changed from {version} to {_DB_VERSION}. Upgrading.")
self.reindex(spack.store.STORE.layout)
self.reindex()
installs = dict(
(k, v.to_dict(include_fields=self._record_fields)) for k, v in self._data.items()
)
@@ -873,8 +872,8 @@ def invalid_record(hash_key, error):
# (i.e., its specs are a true Merkle DAG, unlike most specs.)
# Pass 1: Iterate through database and build specs w/o dependencies
data = {}
installed_prefixes = set()
data: Dict[str, InstallRecord] = {}
installed_prefixes: Set[str] = set()
for hash_key, rec in installs.items():
try:
# This constructs a spec DAG from the list of all installs
@@ -911,7 +910,7 @@ def invalid_record(hash_key, error):
self._data = data
self._installed_prefixes = installed_prefixes
def reindex(self, directory_layout):
def reindex(self):
"""Build database index from scratch based on a directory layout.
Locks the DB if it isn't locked already.
@@ -926,105 +925,116 @@ def _read_suppress_error():
if os.path.isfile(self._index_path):
self._read_from_file(self._index_path)
except CorruptDatabaseError as e:
self._error = e
tty.warn(f"Reindexing corrupt database, error was: {e}")
self._data = {}
self._installed_prefixes = set()
transaction = lk.WriteTransaction(
self.lock, acquire=_read_suppress_error, release=self._write
)
with transaction:
if self._error:
tty.warn("Spack database was corrupt. Will rebuild. Error was:", str(self._error))
self._error = None
old_data = self._data
old_installed_prefixes = self._installed_prefixes
with lk.WriteTransaction(self.lock, acquire=_read_suppress_error, release=self._write):
old_installed_prefixes, self._installed_prefixes = self._installed_prefixes, set()
old_data, self._data = self._data, {}
try:
self._construct_from_directory_layout(directory_layout, old_data)
self._reindex(old_data)
except BaseException:
# If anything explodes, restore old data, skip write.
self._data = old_data
self._installed_prefixes = old_installed_prefixes
raise
def _construct_entry_from_directory_layout(
self, directory_layout, old_data, spec, deprecator=None
):
# Try to recover explicit value from old DB, but
# default it to True if DB was corrupt. This is
# just to be conservative in case a command like
# "autoremove" is run by the user after a reindex.
tty.debug("RECONSTRUCTING FROM SPEC.YAML: {0}".format(spec))
explicit = True
inst_time = os.stat(spec.prefix).st_ctime
if old_data is not None:
old_info = old_data.get(spec.dag_hash())
if old_info is not None:
explicit = old_info.explicit
inst_time = old_info.installation_time
def _reindex(self, old_data: Dict[str, InstallRecord]):
# Specs on the file system are the source of truth for record.spec. The old database values,
# if available, are the source of truth for the rest of the record.
assert self.layout, "Database layout must be set to reindex"
extra_args = {"explicit": explicit, "installation_time": inst_time}
self._add(spec, directory_layout, **extra_args)
if deprecator:
self._deprecate(spec, deprecator)
specs_from_fs = self.layout.all_specs()
deprecated_for = self.layout.deprecated_for(specs_from_fs)
def _construct_from_directory_layout(self, directory_layout, old_data):
# Read first the `spec.yaml` files in the prefixes. They should be
# considered authoritative with respect to DB reindexing, as
# entries in the DB may be corrupted in a way that still makes
# them readable. If we considered DB entries authoritative
# instead, we would perpetuate errors over a reindex.
with directory_layout.disable_upstream_check():
# Initialize data in the reconstructed DB
self._data = {}
self._installed_prefixes = set()
known_specs: List[spack.spec.Spec] = [
*specs_from_fs,
*(deprecated for _, deprecated in deprecated_for),
*(rec.spec for rec in old_data.values()),
]
# Start inspecting the installed prefixes
processed_specs = set()
upstream_hashes = {
dag_hash for upstream in self.upstream_dbs for dag_hash in upstream._data
}
upstream_hashes.difference_update(spec.dag_hash() for spec in known_specs)
for spec in directory_layout.all_specs():
self._construct_entry_from_directory_layout(directory_layout, old_data, spec)
processed_specs.add(spec)
def create_node(edge: spack.spec.DependencySpec, is_upstream: bool):
if is_upstream:
return
for spec, deprecator in directory_layout.all_deprecated_specs():
self._construct_entry_from_directory_layout(
directory_layout, old_data, spec, deprecator
self._data[edge.spec.dag_hash()] = InstallRecord(
spec=edge.spec.copy(deps=False),
path=edge.spec.external_path if edge.spec.external else None,
installed=edge.spec.external,
)
# Store all nodes of known specs, excluding ones found in upstreams
tr.traverse_breadth_first_with_visitor(
known_specs,
tr.CoverNodesVisitor(
NoUpstreamVisitor(upstream_hashes, create_node), key=tr.by_dag_hash
),
)
# Store the prefix and other information for specs that were found on the file system
for s in specs_from_fs:
record = self._data[s.dag_hash()]
record.path = s.prefix
record.installed = True
record.explicit = True # conservative assumption
record.installation_time = os.stat(s.prefix).st_ctime
# Deprecate specs
for new, old in deprecated_for:
self._data[old.dag_hash()].deprecated_for = new.dag_hash()
# Copy data we have from the old database
for old_record in old_data.values():
record = self._data[old_record.spec.dag_hash()]
record.explicit = old_record.explicit
record.installation_time = old_record.installation_time
record.origin = old_record.origin
record.deprecated_for = old_record.deprecated_for
# Warn when the spec has been removed from the file system (i.e. it was not detected)
if not record.installed and old_record.installed:
tty.warn(
f"Spec {old_record.spec.short_spec} was marked installed in the database "
"but was not found on the file system. It is now marked as missing."
)
processed_specs.add(spec)
for key, entry in old_data.items():
# We already took care of this spec using
# `spec.yaml` from its prefix.
if entry.spec in processed_specs:
msg = "SKIPPING RECONSTRUCTION FROM OLD DB: {0}"
msg += " [already reconstructed from spec.yaml]"
tty.debug(msg.format(entry.spec))
continue
def create_edge(edge: spack.spec.DependencySpec, is_upstream: bool):
if not edge.parent:
return
parent_record = self._data[edge.parent.dag_hash()]
if is_upstream:
upstream, child_record = self.query_by_spec_hash(edge.spec.dag_hash())
assert upstream and child_record, "Internal error: upstream spec not found"
else:
child_record = self._data[edge.spec.dag_hash()]
parent_record.spec._add_dependency(
child_record.spec, depflag=edge.depflag, virtuals=edge.virtuals
)
# If we arrived here it very likely means that
# we have external specs that are not dependencies
# of other specs. This may be the case for externally
# installed compilers or externally installed
# applications.
tty.debug("RECONSTRUCTING FROM OLD DB: {0}".format(entry.spec))
try:
layout = None if entry.spec.external else directory_layout
kwargs = {
"spec": entry.spec,
"directory_layout": layout,
"explicit": entry.explicit,
"installation_time": entry.installation_time,
}
self._add(**kwargs)
processed_specs.add(entry.spec)
except Exception as e:
# Something went wrong, so the spec was not restored
# from old data
tty.debug(e)
# Then store edges
tr.traverse_breadth_first_with_visitor(
known_specs,
tr.CoverEdgesVisitor(
NoUpstreamVisitor(upstream_hashes, create_edge), key=tr.by_dag_hash
),
)
self._check_ref_counts()
# Finally update the ref counts
for record in self._data.values():
for dep in record.spec.dependencies(deptype=_TRACKED_DEPENDENCIES):
dep_record = self._data.get(dep.dag_hash())
if dep_record: # dep might be upstream
dep_record.ref_count += 1
if record.deprecated_for:
self._data[record.deprecated_for].ref_count += 1
self._check_ref_counts()
def _check_ref_counts(self):
"""Ensure consistency of reference counts in the DB.
@@ -1033,7 +1043,7 @@ def _check_ref_counts(self):
Does no locking.
"""
counts = {}
counts: Dict[str, int] = {}
for key, rec in self._data.items():
counts.setdefault(key, 0)
for dep in rec.spec.dependencies(deptype=_TRACKED_DEPENDENCIES):
@@ -1117,29 +1127,23 @@ def _read(self):
def _add(
self,
spec,
directory_layout=None,
explicit=False,
installation_time=None,
allow_missing=False,
spec: "spack.spec.Spec",
explicit: bool = False,
installation_time: Optional[float] = None,
allow_missing: bool = False,
):
"""Add an install record for this spec to the database.
Assumes spec is installed in ``directory_layout.path_for_spec(spec)``.
Also ensures dependencies are present and updated in the DB as
either installed or missing.
Also ensures dependencies are present and updated in the DB as either installed or missing.
Args:
spec (spack.spec.Spec): spec to be added
directory_layout: layout of the spec installation
spec: spec to be added
explicit:
Possible values: True, False, any
A spec that was installed following a specific user
request is marked as explicit. If instead it was
pulled-in as a dependency of a user requested spec
it's considered implicit.
A spec that was installed following a specific user request is marked as explicit.
If instead it was pulled-in as a dependency of a user requested spec it's
considered implicit.
installation_time:
Date and time of installation
@@ -1150,48 +1154,42 @@ def _add(
raise NonConcreteSpecAddError("Specs added to DB must be concrete.")
key = spec.dag_hash()
spec_pkg_hash = spec._package_hash
spec_pkg_hash = spec._package_hash # type: ignore[attr-defined]
upstream, record = self.query_by_spec_hash(key)
if upstream:
return
# Retrieve optional arguments
installation_time = installation_time or _now()
for edge in spec.edges_to_dependencies(depflag=_TRACKED_DEPENDENCIES):
if edge.spec.dag_hash() in self._data:
continue
# allow missing build-only deps. This prevents excessive
# warnings when a spec is installed, and its build dep
# is missing a build dep; there's no need to install the
# build dep's build dep first, and there's no need to warn
# about it missing.
dep_allow_missing = allow_missing or edge.depflag == dt.BUILD
self._add(
edge.spec,
directory_layout,
explicit=False,
installation_time=installation_time,
allow_missing=dep_allow_missing,
# allow missing build-only deps. This prevents excessive warnings when a spec is
# installed, and its build dep is missing a build dep; there's no need to install
# the build dep's build dep first, and there's no need to warn about it missing.
allow_missing=allow_missing or edge.depflag == dt.BUILD,
)
# Make sure the directory layout agrees whether the spec is installed
if not spec.external and directory_layout:
path = directory_layout.path_for_spec(spec)
if not spec.external and self.layout:
path = self.layout.path_for_spec(spec)
installed = False
try:
directory_layout.ensure_installed(spec)
self.layout.ensure_installed(spec)
installed = True
self._installed_prefixes.add(path)
except DirectoryLayoutError as e:
if not (allow_missing and isinstance(e, InconsistentInstallDirectoryError)):
msg = (
"{0} is being {1} in the database with prefix {2}, "
"but this directory does not contain an installation of "
"the spec, due to: {3}"
)
action = "updated" if key in self._data else "registered"
tty.warn(msg.format(spec.short_spec, action, path, str(e)))
tty.warn(
f"{spec.short_spec} is being {action} in the database with prefix {path}, "
"but this directory does not contain an installation of "
f"the spec, due to: {e}"
)
elif spec.external_path:
path = spec.external_path
installed = True
@@ -1202,23 +1200,27 @@ def _add(
if key not in self._data:
# Create a new install record with no deps initially.
new_spec = spec.copy(deps=False)
extra_args = {"explicit": explicit, "installation_time": installation_time}
# Commands other than 'spack install' may add specs to the DB,
# we can record the source of an installed Spec with 'origin'
if hasattr(spec, "origin"):
extra_args["origin"] = spec.origin
self._data[key] = InstallRecord(new_spec, path, installed, ref_count=0, **extra_args)
self._data[key] = InstallRecord(
new_spec,
path=path,
installed=installed,
ref_count=0,
explicit=explicit,
installation_time=installation_time,
origin=None if not hasattr(spec, "origin") else spec.origin,
)
# Connect dependencies from the DB to the new copy.
for dep in spec.edges_to_dependencies(depflag=_TRACKED_DEPENDENCIES):
dkey = dep.spec.dag_hash()
upstream, record = self.query_by_spec_hash(dkey)
assert record, f"Missing dependency {dep.spec.short_spec} in DB"
new_spec._add_dependency(record.spec, depflag=dep.depflag, virtuals=dep.virtuals)
if not upstream:
record.ref_count += 1
# Mark concrete once everything is built, and preserve
# the original hashes of concrete specs.
# Mark concrete once everything is built, and preserve the original hashes of concrete
# specs.
new_spec._mark_concrete()
new_spec._hash = key
new_spec._package_hash = spec_pkg_hash
@@ -1231,7 +1233,7 @@ def _add(
self._data[key].explicit = explicit
@_autospec
def add(self, spec, directory_layout, explicit=False):
def add(self, spec: "spack.spec.Spec", *, explicit: bool = False) -> None:
"""Add spec at path to database, locking and reading DB to sync.
``add()`` will lock and read from the DB on disk.
@@ -1240,9 +1242,9 @@ def add(self, spec, directory_layout, explicit=False):
# TODO: ensure that spec is concrete?
# Entire add is transactional.
with self.write_transaction():
self._add(spec, directory_layout, explicit=explicit)
self._add(spec, explicit=explicit)
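The effect on callers, sketched below: the layout argument disappears and ``explicit`` becomes keyword-only (``spec`` is assumed to be a concrete Spec):

import spack.store

# before: spack.store.STORE.db.add(spec, spack.store.STORE.layout, explicit=True)
spack.store.STORE.db.add(spec, explicit=True)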
def _get_matching_spec_key(self, spec, **kwargs):
def _get_matching_spec_key(self, spec: "spack.spec.Spec", **kwargs) -> str:
"""Get the exact spec OR get a single spec that matches."""
key = spec.dag_hash()
upstream, record = self.query_by_spec_hash(key)
@@ -1254,12 +1256,12 @@ def _get_matching_spec_key(self, spec, **kwargs):
return key
@_autospec
def get_record(self, spec, **kwargs):
def get_record(self, spec: "spack.spec.Spec", **kwargs) -> Optional[InstallRecord]:
key = self._get_matching_spec_key(spec, **kwargs)
upstream, record = self.query_by_spec_hash(key)
return record
def _decrement_ref_count(self, spec):
def _decrement_ref_count(self, spec: "spack.spec.Spec") -> None:
key = spec.dag_hash()
if key not in self._data:
@@ -1276,7 +1278,7 @@ def _decrement_ref_count(self, spec):
for dep in spec.dependencies(deptype=_TRACKED_DEPENDENCIES):
self._decrement_ref_count(dep)
def _increment_ref_count(self, spec):
def _increment_ref_count(self, spec: "spack.spec.Spec") -> None:
key = spec.dag_hash()
if key not in self._data:
@@ -1285,14 +1287,14 @@ def _increment_ref_count(self, spec):
rec = self._data[key]
rec.ref_count += 1
def _remove(self, spec):
def _remove(self, spec: "spack.spec.Spec") -> "spack.spec.Spec":
"""Non-locking version of remove(); does real work."""
key = self._get_matching_spec_key(spec)
rec = self._data[key]
# This install prefix is now free for other specs to use, even if the
# spec is only marked uninstalled.
if not rec.spec.external and rec.installed:
if not rec.spec.external and rec.installed and rec.path:
self._installed_prefixes.remove(rec.path)
if rec.ref_count > 0:
@@ -1316,7 +1318,7 @@ def _remove(self, spec):
return rec.spec
@_autospec
def remove(self, spec):
def remove(self, spec: "spack.spec.Spec") -> "spack.spec.Spec":
"""Removes a spec from the database. To be called on uninstall.
Reads the database, then:
@@ -1331,7 +1333,7 @@ def remove(self, spec):
with self.write_transaction():
return self._remove(spec)
def deprecator(self, spec):
def deprecator(self, spec: "spack.spec.Spec") -> Optional["spack.spec.Spec"]:
"""Return the spec that the given spec is deprecated for, or None"""
with self.read_transaction():
spec_key = self._get_matching_spec_key(spec)
@@ -1342,14 +1344,14 @@ def deprecator(self, spec):
else:
return None
def specs_deprecated_by(self, spec):
def specs_deprecated_by(self, spec: "spack.spec.Spec") -> List["spack.spec.Spec"]:
"""Return all specs deprecated in favor of the given spec"""
with self.read_transaction():
return [
rec.spec for rec in self._data.values() if rec.deprecated_for == spec.dag_hash()
]
def _deprecate(self, spec, deprecator):
def _deprecate(self, spec: "spack.spec.Spec", deprecator: "spack.spec.Spec") -> None:
spec_key = self._get_matching_spec_key(spec)
spec_rec = self._data[spec_key]
@@ -1367,17 +1369,17 @@ def _deprecate(self, spec, deprecator):
self._data[spec_key] = spec_rec
@_autospec
def mark(self, spec, key, value):
def mark(self, spec: "spack.spec.Spec", key, value) -> None:
"""Mark an arbitrary record on a spec."""
with self.write_transaction():
return self._mark(spec, key, value)
def _mark(self, spec, key, value):
def _mark(self, spec: "spack.spec.Spec", key, value) -> None:
record = self._data[self._get_matching_spec_key(spec)]
setattr(record, key, value)
@_autospec
def deprecate(self, spec, deprecator):
def deprecate(self, spec: "spack.spec.Spec", deprecator: "spack.spec.Spec") -> None:
"""Marks a spec as deprecated in favor of its deprecator"""
with self.write_transaction():
return self._deprecate(spec, deprecator)
@@ -1385,16 +1387,16 @@ def deprecate(self, spec, deprecator):
@_autospec
def installed_relatives(
self,
spec,
direction="children",
transitive=True,
spec: "spack.spec.Spec",
direction: str = "children",
transitive: bool = True,
deptype: Union[dt.DepFlag, dt.DepTypes] = dt.ALL,
):
) -> Set["spack.spec.Spec"]:
"""Return installed specs related to this one."""
if direction not in ("parents", "children"):
raise ValueError("Invalid direction: %s" % direction)
relatives = set()
relatives: Set[spack.spec.Spec] = set()
for spec in self.query(spec):
if transitive:
to_add = spec.traverse(direction=direction, root=False, deptype=deptype)
@@ -1405,17 +1407,13 @@ def installed_relatives(
for relative in to_add:
hash_key = relative.dag_hash()
upstream, record = self.query_by_spec_hash(hash_key)
_, record = self.query_by_spec_hash(hash_key)
if not record:
reltype = "Dependent" if direction == "parents" else "Dependency"
msg = "Inconsistent state! %s %s of %s not in DB" % (
reltype,
hash_key,
spec.dag_hash(),
tty.warn(
f"Inconsistent state: "
f"{'dependent' if direction == 'parents' else 'dependency'} {hash_key} of "
f"{spec.dag_hash()} not in DB"
)
if self._fail_when_missing_deps:
raise MissingDependenciesError(msg)
tty.warn(msg)
continue
if not record.installed:
@@ -1425,7 +1423,7 @@ def installed_relatives(
return relatives
@_autospec
def installed_extensions_for(self, extendee_spec):
def installed_extensions_for(self, extendee_spec: "spack.spec.Spec"):
"""Returns the specs of all packages that extend the given spec"""
for spec in self.query():
if spec.package.extends(extendee_spec):
@@ -1684,7 +1682,7 @@ def unused_specs(
self,
root_hashes: Optional[Container[str]] = None,
deptype: Union[dt.DepFlag, dt.DepTypes] = dt.LINK | dt.RUN,
) -> "List[spack.spec.Spec]":
) -> List["spack.spec.Spec"]:
"""Return all specs that are currently installed but not needed by root specs.
By default, roots are all explicit specs in the database. If a set of root
@@ -1728,6 +1726,33 @@ def update_explicit(self, spec, explicit):
rec.explicit = explicit
class NoUpstreamVisitor:
"""Gives edges to upstream specs, but does follow edges from upstream specs."""
def __init__(
self,
upstream_hashes: Set[str],
on_visit: Callable[["spack.spec.DependencySpec", bool], None],
):
self.upstream_hashes = upstream_hashes
self.on_visit = on_visit
def accept(self, item: tr.EdgeAndDepth) -> bool:
self.on_visit(item.edge, self.is_upstream(item))
return True
def is_upstream(self, item: tr.EdgeAndDepth) -> bool:
return item.edge.spec.dag_hash() in self.upstream_hashes
def neighbors(self, item: tr.EdgeAndDepth):
# Prune edges from upstream nodes, only follow database tracked dependencies
return (
[]
if self.is_upstream(item)
else item.edge.spec.edges_to_dependencies(depflag=_TRACKED_DEPENDENCIES)
)
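How the visitor is driven, condensed from the ``_reindex`` hunk above (the callback body is illustrative; ``known_specs`` and ``upstream_hashes`` are built as shown there):

def on_visit(edge, is_upstream):
    # Nodes are created in a first pass, edges wired in a second;
    # upstream nodes are visited, but their edges are never followed.
    if not is_upstream:
        print(edge.spec.dag_hash())

tr.traverse_breadth_first_with_visitor(
    known_specs,
    tr.CoverNodesVisitor(NoUpstreamVisitor(upstream_hashes, on_visit), key=tr.by_dag_hash),
)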
class UpstreamDatabaseLockingError(SpackError):
"""Raised when an operation would need to lock an upstream database"""

View File

@@ -2,17 +2,11 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from .common import (
DetectedPackage,
executable_prefix,
set_virtuals_nonbuildable,
update_configuration,
)
from .common import executable_prefix, set_virtuals_nonbuildable, update_configuration
from .path import by_path, executables_in_path
from .test import detection_tests
__all__ = [
"DetectedPackage",
"by_path",
"executables_in_path",
"executable_prefix",

View File

@@ -6,9 +6,9 @@
function to update packages.yaml given a list of detected packages.
Ideally, each detection method should be placed in a specific subpackage
and implement at least a function that returns a list of DetectedPackage
objects. The update in packages.yaml can then be done using the function
provided here.
and implement at least a function that returns a list of specs.
The update in packages.yaml can then be done using the function provided here.
The module also contains other functions that might be useful across different
detection mechanisms.
@@ -17,9 +17,10 @@
import itertools
import os
import os.path
import pathlib
import re
import sys
from typing import Dict, List, NamedTuple, Optional, Set, Tuple, Union
from typing import Dict, List, Optional, Set, Tuple, Union
import llnl.util.tty
@@ -30,27 +31,6 @@
import spack.util.windows_registry
class DetectedPackage(NamedTuple):
"""Information on a package that has been detected."""
#: Spec that was detected
spec: spack.spec.Spec
#: Prefix of the spec
prefix: str
def __reduce__(self):
return DetectedPackage.restore, (str(self.spec), self.prefix, self.spec.extra_attributes)
@staticmethod
def restore(
spec_str: str, prefix: str, extra_attributes: Optional[Dict[str, str]]
) -> "DetectedPackage":
spec = spack.spec.Spec.from_detection(
spec_str=spec_str, external_path=prefix, extra_attributes=extra_attributes
)
return DetectedPackage(spec=spec, prefix=prefix)
def _externals_in_packages_yaml() -> Set[spack.spec.Spec]:
"""Returns all the specs mentioned as externals in packages.yaml"""
packages_yaml = spack.config.get("packages")
@@ -65,7 +45,7 @@ def _externals_in_packages_yaml() -> Set[spack.spec.Spec]:
def _pkg_config_dict(
external_pkg_entries: List[DetectedPackage],
external_pkg_entries: List["spack.spec.Spec"],
) -> Dict[str, Union[bool, List[Dict[str, ExternalEntryType]]]]:
"""Generate a package specific config dict according to the packages.yaml schema.
@@ -85,22 +65,19 @@ def _pkg_config_dict(
pkg_dict = spack.util.spack_yaml.syaml_dict()
pkg_dict["externals"] = []
for e in external_pkg_entries:
if not _spec_is_valid(e.spec):
if not _spec_is_valid(e):
continue
external_items: List[Tuple[str, ExternalEntryType]] = [
("spec", str(e.spec)),
("prefix", e.prefix),
("spec", str(e)),
("prefix", pathlib.Path(e.external_path).as_posix()),
]
if e.spec.external_modules:
external_items.append(("modules", e.spec.external_modules))
if e.external_modules:
external_items.append(("modules", e.external_modules))
if e.spec.extra_attributes:
if e.extra_attributes:
external_items.append(
(
"extra_attributes",
spack.util.spack_yaml.syaml_dict(e.spec.extra_attributes.items()),
)
("extra_attributes", spack.util.spack_yaml.syaml_dict(e.extra_attributes.items()))
)
# external_items.extend(e.spec.extra_attributes.items())
@@ -221,33 +198,32 @@ def library_prefix(library_dir: str) -> str:
def update_configuration(
detected_packages: Dict[str, List[DetectedPackage]],
detected_packages: Dict[str, List["spack.spec.Spec"]],
scope: Optional[str] = None,
buildable: bool = True,
) -> List[spack.spec.Spec]:
"""Add the packages passed as arguments to packages.yaml
Args:
detected_packages: list of DetectedPackage objects to be added
detected_packages: list of specs to be added
scope: configuration scope where to add the detected packages
buildable: whether the detected packages are buildable or not
"""
predefined_external_specs = _externals_in_packages_yaml()
pkg_to_cfg, all_new_specs = {}, []
for package_name, entries in detected_packages.items():
new_entries = [e for e in entries if (e.spec not in predefined_external_specs)]
new_entries = [s for s in entries if s not in predefined_external_specs]
pkg_config = _pkg_config_dict(new_entries)
external_entries = pkg_config.get("externals", [])
assert not isinstance(external_entries, bool), "unexpected value for external entry"
all_new_specs.extend([x.spec for x in new_entries])
all_new_specs.extend(new_entries)
if buildable is False:
pkg_config["buildable"] = False
pkg_to_cfg[package_name] = pkg_config
pkgs_cfg = spack.config.get("packages", scope=scope)
pkgs_cfg = spack.config.merge_yaml(pkgs_cfg, pkg_to_cfg)
spack.config.set("packages", pkgs_cfg, scope=scope)
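Downstream usage, sketched: detection now yields plain ``Spec`` objects keyed by package name, so the output of ``by_path`` feeds straight in (the path hint and scope are hypothetical):

from spack.detection import by_path, update_configuration

detected = by_path(["cmake"], path_hints=["/usr/bin"])
update_configuration(detected, scope="user", buildable=True)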

View File

@@ -24,7 +24,6 @@
import spack.util.ld_so_conf
from .common import (
DetectedPackage,
WindowsCompilerExternalPaths,
WindowsKitExternalPaths,
_convert_to_iterable,
@@ -229,7 +228,7 @@ def prefix_from_path(self, *, path: str) -> str:
def detect_specs(
self, *, pkg: Type["spack.package_base.PackageBase"], paths: List[str]
) -> List[DetectedPackage]:
) -> List["spack.spec.Spec"]:
"""Given a list of files matching the search patterns, returns a list of detected specs.
Args:
@@ -295,16 +294,16 @@ def detect_specs(
warnings.warn(msg)
continue
if spec.external_path:
prefix = spec.external_path
if not spec.external_path:
spec.external_path = prefix
result.append(DetectedPackage(spec=spec, prefix=prefix))
result.append(spec)
return result
def find(
self, *, pkg_name: str, repository, initial_guess: Optional[List[str]] = None
) -> List[DetectedPackage]:
) -> List["spack.spec.Spec"]:
"""For a given package, returns a list of detected specs.
Args:
@@ -388,7 +387,7 @@ def by_path(
*,
path_hints: Optional[List[str]] = None,
max_workers: Optional[int] = None,
) -> Dict[str, List[DetectedPackage]]:
) -> Dict[str, List["spack.spec.Spec"]]:
"""Return the list of packages that have been detected on the system, keyed by
unqualified package name.

View File

@@ -68,7 +68,7 @@ def execute(self) -> List[spack.spec.Spec]:
with self._mock_layout() as path_hints:
entries = by_path([self.test.pkg_name], path_hints=path_hints)
_, unqualified_name = spack.repo.partition_package_name(self.test.pkg_name)
specs = set(x.spec for x in entries[unqualified_name])
specs = set(entries[unqualified_name])
return list(specs)
@contextlib.contextmanager

View File

@@ -4,17 +4,14 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import errno
import glob
import os
import posixpath
import re
import shutil
import sys
from contextlib import contextmanager
from pathlib import Path
from typing import List, Optional, Tuple
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.symlink import readlink
import spack.config
@@ -23,13 +20,8 @@
import spack.util.spack_json as sjson
from spack.error import SpackError
# Note: posixpath is used here as opposed to
# os.path.join due to spack.spec.Spec.format
# requiring forward slash path separators at this stage
default_projections = {
"all": posixpath.join(
"{architecture}", "{compiler.name}-{compiler.version}", "{name}-{version}-{hash}"
)
"all": "{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}"
}
@@ -39,6 +31,42 @@ def _check_concrete(spec):
raise ValueError("Specs passed to a DirectoryLayout must be concrete!")
def _get_spec(prefix: str) -> Optional["spack.spec.Spec"]:
"""Returns a spec if the prefix contains a spec file in the .spack subdir"""
for f in ("spec.json", "spec.yaml"):
try:
return spack.spec.Spec.from_specfile(os.path.join(prefix, ".spack", f))
except Exception:
continue
return None
def specs_from_metadata_dirs(root: str) -> List["spack.spec.Spec"]:
stack = [root]
specs = []
while stack:
prefix = stack.pop()
spec = _get_spec(prefix)
if spec:
spec.prefix = prefix
specs.append(spec)
continue
try:
scandir = os.scandir(prefix)
except OSError:
continue
with scandir as entries:
for entry in entries:
if entry.is_dir(follow_symlinks=False):
stack.append(entry.path)
return specs
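A usage sketch for the new helper: walk a store root, recover specs from their ``.spack`` metadata, and index them by DAG hash (root path hypothetical):

specs = specs_from_metadata_dirs("/opt/spack/opt/spack")
by_hash = {s.dag_hash(): s for s in specs}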
class DirectoryLayout:
"""A directory layout is used to associate unique paths with specs.
Different installations are going to want different layouts for their
@@ -152,20 +180,9 @@ def read_spec(self, path):
def spec_file_path(self, spec):
"""Gets full path to spec file"""
_check_concrete(spec)
# Attempts to convert to JSON if possible.
# Otherwise just returns the YAML.
yaml_path = os.path.join(self.metadata_path(spec), self._spec_file_name_yaml)
json_path = os.path.join(self.metadata_path(spec), self.spec_file_name)
if os.path.exists(yaml_path) and fs.can_write_to_dir(yaml_path):
self.write_spec(spec, json_path)
try:
os.remove(yaml_path)
except OSError as err:
tty.debug("Could not remove deprecated {0}".format(yaml_path))
tty.debug(err)
elif os.path.exists(yaml_path):
return yaml_path
return json_path
return yaml_path if os.path.exists(yaml_path) else json_path
def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
"""Gets full path to spec file for deprecated spec
@@ -199,23 +216,7 @@ def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
deprecated_spec.dag_hash() + "_" + self.spec_file_name,
)
if os.path.exists(yaml_path) and fs.can_write_to_dir(yaml_path):
self.write_spec(deprecated_spec, json_path)
try:
os.remove(yaml_path)
except (IOError, OSError) as err:
tty.debug("Could not remove deprecated {0}".format(yaml_path))
tty.debug(err)
elif os.path.exists(yaml_path):
return yaml_path
return json_path
@contextmanager
def disable_upstream_check(self):
self.check_upstream = False
yield
self.check_upstream = True
return yaml_path if os.path.exists(yaml_path) else json_path
def metadata_path(self, spec):
return os.path.join(spec.prefix, self.metadata_dir)
@@ -271,53 +272,6 @@ def ensure_installed(self, spec):
"Spec file in %s does not match hash!" % spec_file_path
)
def all_specs(self):
if not os.path.isdir(self.root):
return []
specs = []
for _, path_scheme in self.projections.items():
path_elems = ["*"] * len(path_scheme.split(posixpath.sep))
# NOTE: Does not validate filename extension; should happen later
path_elems += [self.metadata_dir, "spec.json"]
pattern = os.path.join(self.root, *path_elems)
spec_files = glob.glob(pattern)
if not spec_files: # we're probably looking at legacy yaml...
path_elems += [self.metadata_dir, "spec.yaml"]
pattern = os.path.join(self.root, *path_elems)
spec_files = glob.glob(pattern)
specs.extend([self.read_spec(s) for s in spec_files])
return specs
def all_deprecated_specs(self):
if not os.path.isdir(self.root):
return []
deprecated_specs = set()
for _, path_scheme in self.projections.items():
path_elems = ["*"] * len(path_scheme.split(posixpath.sep))
# NOTE: Does not validate filename extension; should happen later
path_elems += [
self.metadata_dir,
self.deprecated_dir,
"*_spec.*",
] # + self.spec_file_name]
pattern = os.path.join(self.root, *path_elems)
spec_files = glob.glob(pattern)
get_depr_spec_file = lambda x: os.path.join(
os.path.dirname(os.path.dirname(x)), self.spec_file_name
)
deprecated_specs |= set(
(self.read_spec(s), self.read_spec(get_depr_spec_file(s))) for s in spec_files
)
return deprecated_specs
def specs_by_hash(self):
by_hash = {}
for spec in self.all_specs():
by_hash[spec.dag_hash()] = spec
return by_hash
def path_for_spec(self, spec):
"""Return absolute path from the root to a directory for the spec."""
_check_concrete(spec)
@@ -383,6 +337,35 @@ def remove_install_directory(self, spec, deprecated=False):
raise e
path = os.path.dirname(path)
def all_specs(self) -> List["spack.spec.Spec"]:
"""Returns a list of all specs detected in self.root, detected by `.spack` directories.
Their prefix is set to the directory containing the `.spack` directory. Note that these
specs may follow a different layout than the current layout if it was changed after
installation."""
return specs_from_metadata_dirs(self.root)
def deprecated_for(
self, specs: List["spack.spec.Spec"]
) -> List[Tuple["spack.spec.Spec", "spack.spec.Spec"]]:
"""Returns a list of tuples of specs (new, old) where new is deprecated for old"""
spec_with_deprecated = []
for spec in specs:
try:
deprecated = os.scandir(
os.path.join(str(spec.prefix), self.metadata_dir, self.deprecated_dir)
)
except OSError:
continue
with deprecated as entries:
for entry in entries:
try:
deprecated_spec = spack.spec.Spec.from_specfile(entry.path)
spec_with_deprecated.append((spec, deprecated_spec))
except Exception:
continue
return spec_with_deprecated
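Together with ``all_specs`` this supplies the reindex code its inputs; a short sketch, with a hypothetical root:

layout = DirectoryLayout("/opt/spack/opt/spack")
specs = layout.all_specs()
for new, old in layout.deprecated_for(specs):
    print(f"{old.short_spec} is deprecated in favor of {new.short_spec}")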
class DirectoryLayoutError(SpackError):
"""Superclass for directory layout errors."""

View File

@@ -58,9 +58,8 @@
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
from spack.spec import Spec
from spack.spec_list import InvalidSpecConstraintError, SpecList
from spack.spec_list import SpecList
from spack.util.path import substitute_path_variables
from spack.variant import UnknownVariantError
#: environment variable used to indicate the active environment
spack_env_var = "SPACK_ENV"
@@ -1625,10 +1624,10 @@ def _concretize_separately(self, tests=False):
# Concretize any new user specs that we haven't concretized yet
args, root_specs, i = [], [], 0
for uspec, uspec_constraints in zip(self.user_specs, self.user_specs.specs_as_constraints):
for uspec in self.user_specs:
if uspec not in old_concretized_user_specs:
root_specs.append(uspec)
args.append((i, [str(x) for x in uspec_constraints], tests))
args.append((i, str(uspec), tests))
i += 1
# Ensure we don't try to bootstrap clingo in parallel
@@ -2508,52 +2507,11 @@ def display_specs(specs):
print(tree_string)
def _concretize_from_constraints(spec_constraints, tests=False):
# Accept only valid constraints from list and concretize spec
# Get the named spec even if out of order
root_spec = [s for s in spec_constraints if s.name]
if len(root_spec) != 1:
m = "The constraints %s are not a valid spec " % spec_constraints
m += "concretization target. all specs must have a single name "
m += "constraint for concretization."
raise InvalidSpecConstraintError(m)
spec_constraints.remove(root_spec[0])
invalid_constraints = []
while True:
# Attach all anonymous constraints to one named spec
s = root_spec[0].copy()
for c in spec_constraints:
if c not in invalid_constraints:
s.constrain(c)
try:
return s.concretized(tests=tests)
except spack.spec.InvalidDependencyError as e:
invalid_deps_string = ["^" + d for d in e.invalid_deps]
invalid_deps = [
c
for c in spec_constraints
if any(c.satisfies(invd) for invd in invalid_deps_string)
]
if len(invalid_deps) != len(invalid_deps_string):
raise e
invalid_constraints.extend(invalid_deps)
except UnknownVariantError as e:
invalid_variants = e.unknown_variants
inv_variant_constraints = [
c for c in spec_constraints if any(name in c.variants for name in invalid_variants)
]
if len(inv_variant_constraints) != len(invalid_variants):
raise e
invalid_constraints.extend(inv_variant_constraints)
def _concretize_task(packed_arguments) -> Tuple[int, Spec, float]:
index, spec_constraints, tests = packed_arguments
spec_constraints = [Spec(x) for x in spec_constraints]
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
spec = _concretize_from_constraints(spec_constraints, tests)
spec = Spec(spec_str).concretized(tests=tests)
return index, spec, time.time() - start
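With the constraint plumbing removed, each parallel task round-trips a single spec string; a sketch of the new packed-argument shape (the spec string is hypothetical):

index, spec, elapsed = _concretize_task((0, "zlib@1.3", False))
print(index, spec.dag_hash()[:7], f"{elapsed:.1f}s")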

View File

@@ -276,52 +276,6 @@ def _do_fake_install(pkg: "spack.package_base.PackageBase") -> None:
dump_packages(pkg.spec, packages_dir)
def _packages_needed_to_bootstrap_compiler(
compiler: "spack.spec.CompilerSpec", architecture: "spack.spec.ArchSpec", pkgs: list
) -> List[Tuple["spack.package_base.PackageBase", bool]]:
"""
Return a list of packages required to bootstrap `pkg`s compiler
Checks Spack's compiler configuration for a compiler that
matches the package spec.
Args:
compiler: the compiler to bootstrap
architecture: the architecture for which to bootstrap the compiler
pkgs: the packages that may need their compiler installed
Return:
list of tuples of packages and a boolean, for concretized compiler-related
packages that need to be installed and bool values specify whether the
package is the bootstrap compiler (``True``) or one of its dependencies
(``False``). The list will be empty if there are no compilers.
"""
tty.debug(f"Bootstrapping {compiler} compiler")
compilers = spack.compilers.compilers_for_spec(compiler, arch_spec=architecture)
if compilers:
return []
dep = spack.compilers.pkg_spec_for_compiler(compiler)
# Set the architecture for the compiler package in a way that allows the
# concretizer to back off if needed for the older bootstrapping compiler
dep.constrain(f"platform={str(architecture.platform)}")
dep.constrain(f"os={str(architecture.os)}")
dep.constrain(f"target={architecture.target.microarchitecture.family.name}:")
# concrete CompilerSpec has less info than concrete Spec
# concretize as Spec to add that information
dep.concretize()
# mark compiler as depended-on by the packages that use it
for pkg in pkgs:
dep._dependents.add(
spack.spec.DependencySpec(pkg.spec, dep, depflag=dt.BUILD, virtuals=())
)
packages = [(s.package, False) for s in dep.traverse(order="post", root=False)]
packages.append((dep.package, True))
return packages
def _hms(seconds: int) -> str:
"""
Convert seconds to hours, minutes, seconds
@@ -451,7 +405,7 @@ def _process_external_package(pkg: "spack.package_base.PackageBase", explicit: b
# Add to the DB
tty.debug(f"{pre} registering into DB")
spack.store.STORE.db.add(spec, None, explicit=explicit)
spack.store.STORE.db.add(spec, explicit=explicit)
def _process_binary_cache_tarball(
@@ -493,7 +447,7 @@ def _process_binary_cache_tarball(
pkg._post_buildcache_install_hook()
pkg.installed_from_binary_cache = True
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)
spack.store.STORE.db.add(pkg.spec, explicit=explicit)
return True
@@ -967,26 +921,6 @@ def __init__(
if package_id(d) != self.pkg_id
)
# Handle bootstrapped compiler
#
# The bootstrapped compiler is not a dependency in the spec, but it is
# a dependency of the build task. Here we add it to self.dependencies
compiler_spec = self.pkg.spec.compiler
arch_spec = self.pkg.spec.architecture
strict = spack.concretize.Concretizer().check_for_compiler_existence
if (
not spack.compilers.compilers_for_spec(compiler_spec, arch_spec=arch_spec)
and not strict
):
# The compiler is in the queue, identify it as dependency
dep = spack.compilers.pkg_spec_for_compiler(compiler_spec)
dep.constrain(f"platform={str(arch_spec.platform)}")
dep.constrain(f"os={str(arch_spec.os)}")
dep.constrain(f"target={arch_spec.target.microarchitecture.family.name}:")
dep.concretize()
dep_id = package_id(dep)
self.dependencies.add(dep_id)
# List of uninstalled dependencies, which is used to establish
# the priority of the build task.
#
@@ -1165,53 +1099,6 @@ def __str__(self) -> str:
installed = f"installed ({len(self.installed)}) = {self.installed}"
return f"{self.pid}: {requests}; {tasks}; {installed}; {failed}"
def _add_bootstrap_compilers(
self,
compiler: "spack.spec.CompilerSpec",
architecture: "spack.spec.ArchSpec",
pkgs: List["spack.package_base.PackageBase"],
request: BuildRequest,
all_deps,
) -> None:
"""
Add bootstrap compilers and dependencies to the build queue.
Args:
compiler: the compiler to bootstrap
architecture: the architecture for which to bootstrap the compiler
pkgs: the package list with possible compiler dependencies
request: the associated install request
all_deps (defaultdict(set)): dictionary of all dependencies and
associated dependents
"""
packages = _packages_needed_to_bootstrap_compiler(compiler, architecture, pkgs)
for comp_pkg, is_compiler in packages:
pkgid = package_id(comp_pkg.spec)
if pkgid not in self.build_tasks:
self._add_init_task(comp_pkg, request, is_compiler, all_deps)
elif is_compiler:
# ensure it's queued as a compiler
self._modify_existing_task(pkgid, "compiler", True)
def _modify_existing_task(self, pkgid: str, attr, value) -> None:
"""
Update a task in-place to modify its behavior.
Currently used to update the ``compiler`` field on tasks
that were originally created as a dependency of a compiler,
but are compilers in their own right.
For example, ``intel-oneapi-compilers-classic`` depends on
``intel-oneapi-compilers``, which can cause the latter to be
queued first as a non-compiler, and only later as a compiler.
"""
for i, tup in enumerate(self.build_pq):
key, task = tup
if task.pkg_id == pkgid:
tty.debug(f"Modifying task for {pkgid} to treat it as a compiler", level=2)
setattr(task, attr, value)
self.build_pq[i] = (key, task)
def _add_init_task(
self,
pkg: "spack.package_base.PackageBase",
@@ -1541,42 +1428,7 @@ def _add_tasks(self, request: BuildRequest, all_deps):
tty.warn(f"Installation request refused: {str(err)}")
return
install_compilers = spack.config.get("config:install_missing_compilers", False)
install_deps = request.install_args.get("install_deps")
# Bootstrap compilers first
if install_deps and install_compilers:
packages_per_compiler: Dict[
"spack.spec.CompilerSpec",
Dict["spack.spec.ArchSpec", List["spack.package_base.PackageBase"]],
] = {}
for dep in request.traverse_dependencies():
dep_pkg = dep.package
compiler = dep_pkg.spec.compiler
arch = dep_pkg.spec.architecture
if compiler not in packages_per_compiler:
packages_per_compiler[compiler] = {}
if arch not in packages_per_compiler[compiler]:
packages_per_compiler[compiler][arch] = []
packages_per_compiler[compiler][arch].append(dep_pkg)
compiler = request.pkg.spec.compiler
arch = request.pkg.spec.architecture
if compiler not in packages_per_compiler:
packages_per_compiler[compiler] = {}
if arch not in packages_per_compiler[compiler]:
packages_per_compiler[compiler][arch] = []
packages_per_compiler[compiler][arch].append(request.pkg)
for compiler, archs in packages_per_compiler.items():
for arch, packages in archs.items():
self._add_bootstrap_compilers(compiler, arch, packages, request, all_deps)
if install_deps:
for dep in request.traverse_dependencies():
@@ -1608,10 +1460,6 @@ def _add_tasks(self, request: BuildRequest, all_deps):
fail_fast = bool(request.install_args.get("fail_fast"))
self.fail_fast = self.fail_fast or fail_fast
def _add_compiler_package_to_config(self, pkg: "spack.package_base.PackageBase") -> None:
compiler_search_prefix = getattr(pkg, "compiler_search_prefix", pkg.spec.prefix)
spack.compilers.find_compilers([compiler_search_prefix])
def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
"""
Perform the installation of the requested spec and/or dependency
@@ -1639,8 +1487,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
if use_cache:
if _install_from_cache(pkg, explicit, unsigned):
self._update_installed(task)
if task.compiler:
self._add_compiler_package_to_config(pkg)
return
elif cache_only:
raise InstallError("No binary found when cache-only was specified", pkg=pkg)
@@ -1668,11 +1514,8 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
)
# Note: PARENT of the build process adds the new package to
# the database, so that we don't need to re-read from file.
spack.store.STORE.db.add(pkg.spec, spack.store.STORE.layout, explicit=explicit)
spack.store.STORE.db.add(pkg.spec, explicit=explicit)
# If a compiler, ensure it is added to the configuration
if task.compiler:
self._add_compiler_package_to_config(pkg)
except spack.build_environment.StopPhase as e:
# A StopPhase exception means that do_install was asked to
# stop early from clients, and is not an error at this point
@@ -2073,10 +1916,6 @@ def install(self) -> None:
path = spack.util.path.debug_padded_filter(pkg.prefix)
_print_installed_pkg(path)
# It's an already installed compiler, add it to the config
if task.compiler:
self._add_compiler_package_to_config(pkg)
else:
# At this point we've failed to get a write or a read
# lock, which means another process has taken a write

View File

@@ -29,7 +29,6 @@
import spack.config
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.oci.image
import spack.repo
import spack.spec

View File

@@ -46,7 +46,6 @@
import spack.deptypes as dt
import spack.environment
import spack.error
import spack.modules.common
import spack.paths
import spack.projections as proj
import spack.repo
@@ -352,7 +351,7 @@ def get_module(module_type, spec, get_full_path, module_set_name="default", requ
except spack.repo.UnknownPackageError:
upstream, record = spack.store.STORE.db.query_by_spec_hash(spec.dag_hash())
if upstream:
module = spack.modules.common.upstream_module_index.upstream_module(spec, module_type)
module = upstream_module_index.upstream_module(spec, module_type)
if not module:
return None

View File

@@ -104,6 +104,7 @@
from spack.spec import InvalidSpecDetected, Spec
from spack.util.cpus import determine_number_of_jobs
from spack.util.executable import *
from spack.util.filesystem import file_command, fix_darwin_install_name, mime_type
from spack.variant import (
any_combination_of,
auto_or_any_combination_of,

View File

@@ -16,7 +16,6 @@
import glob
import hashlib
import importlib
import inspect
import io
import os
import re
@@ -117,11 +116,10 @@ def preferred_version(pkg: "PackageBase"):
Arguments:
pkg: The package whose versions are to be assessed.
"""
# Here we sort first on the fact that a version is marked
# as preferred in the package, then on the fact that the
# version is not develop, then lexicographically
key_fn = lambda v: (pkg.versions[v].get("preferred", False), not v.isdevelop(), v)
return max(pkg.versions, key=key_fn)
from spack.solver.asp import concretization_version_order
version, _ = max(pkg.versions.items(), key=concretization_version_order)
return version
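A toy illustration of the shared ordering key, assuming per-version metadata dicts shaped like the entries of ``pkg.versions``:

from spack.solver.asp import concretization_version_order
from spack.version import Version

versions = {
    Version("2.1"): {},
    Version("2.0"): {"preferred": True},
    Version("develop"): {},
}
best, _ = max(versions.items(), key=concretization_version_order)
assert str(best) == "2.0"  # preferred beats newer and develop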
class WindowsRPath:
@@ -1744,7 +1742,7 @@ def _has_make_target(self, target):
bool: True if 'target' is found, else False
"""
# Prevent altering LC_ALL for 'make' outside this function
make = copy.deepcopy(inspect.getmodule(self).make)
make = copy.deepcopy(self.module.make)
# Use English locale for missing target message comparison
make.add_default_env("LC_ALL", "C")
@@ -1794,7 +1792,7 @@ def _if_make_target_execute(self, target, *args, **kwargs):
"""
if self._has_make_target(target):
# Execute target
inspect.getmodule(self).make(target, *args, **kwargs)
self.module.make(target, *args, **kwargs)
def _has_ninja_target(self, target):
"""Checks to see if 'target' is a valid target in a Ninja build script.
@@ -1805,7 +1803,7 @@ def _has_ninja_target(self, target):
Returns:
bool: True if 'target' is found, else False
"""
ninja = inspect.getmodule(self).ninja
ninja = self.module.ninja
# Check if we have a Ninja build script
if not os.path.exists("build.ninja"):
@@ -1834,7 +1832,7 @@ def _if_ninja_target_execute(self, target, *args, **kwargs):
"""
if self._has_ninja_target(target):
# Execute target
inspect.getmodule(self).ninja(target, *args, **kwargs)
self.module.ninja(target, *args, **kwargs)
def _get_needed_resources(self):
# We use intersects here because it would also work if self.spec is abstract

View File

@@ -12,7 +12,6 @@
import macholib.mach_o
import macholib.MachO
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import memoized
@@ -25,6 +24,7 @@
import spack.store
import spack.util.elf as elf
import spack.util.executable as executable
import spack.util.filesystem as ssys
import spack.util.path
from .relocate_text import BinaryFilePrefixReplacer, TextFilePrefixReplacer
@@ -664,7 +664,7 @@ def is_binary(filename):
Returns:
True or False
"""
m_type, _ = fs.mime_type(filename)
m_type, _ = ssys.mime_type(filename)
msg = "[{0}] -> ".format(filename)
if m_type == "application":
@@ -692,7 +692,7 @@ def fixup_macos_rpath(root, filename):
True if fixups were applied, else False
"""
abspath = os.path.join(root, filename)
if fs.mime_type(abspath) != ("application", "x-mach-binary"):
if ssys.mime_type(abspath) != ("application", "x-mach-binary"):
return False
# Get Mach-O header commands

View File

@@ -116,7 +116,7 @@ def rewire_node(spec, explicit):
# spec being added to look for mismatches)
spack.store.STORE.layout.write_spec(spec, spack.store.STORE.layout.spec_file_path(spec))
# add to database, not sure about explicit
spack.store.STORE.db.add(spec, spack.store.STORE.layout, explicit=explicit)
spack.store.STORE.db.add(spec, explicit=explicit)
# run post install hooks
spack.hooks.post_install(spec, explicit)

View File

@@ -3,22 +3,28 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This module contains jsonschema files for all of Spack's YAML formats."""
import typing
import warnings
import llnl.util.lang
class DeprecationMessage(typing.NamedTuple):
message: str
error: bool
# jsonschema is imported lazily as it is heavy to import
# and increases the start-up time
def _make_validator():
import jsonschema
import spack.parser
def _validate_spec(validator, is_spec, instance, schema):
"""Check if the attributes on instance are valid specs."""
import jsonschema
import spack.parser
if not validator.is_type(instance, "object"):
return
@@ -32,27 +38,31 @@ def _deprecated_properties(validator, deprecated, instance, schema):
if not (validator.is_type(instance, "object") or validator.is_type(instance, "array")):
return
# Get a list of the deprecated properties, return if there is none
deprecated_properties = [x for x in instance if x in deprecated["properties"]]
if not deprecated_properties:
if not deprecated:
return
# Retrieve the template message
msg_str_or_func = deprecated["message"]
if isinstance(msg_str_or_func, str):
msg = msg_str_or_func.format(properties=deprecated_properties)
else:
msg = msg_str_or_func(instance, deprecated_properties)
if msg is None:
return
deprecations = {
name: DeprecationMessage(message=x["message"], error=x["error"])
for x in deprecated
for name in x["names"]
}
is_error = deprecated["error"]
if not is_error:
warnings.warn(msg)
else:
import jsonschema
# Get a list of the deprecated properties, return if there is none
issues = [entry for entry in instance if entry in deprecations]
if not issues:
return
yield jsonschema.ValidationError(msg)
# Process issues
errors = []
for name in issues:
msg = deprecations[name].message.format(name=name)
if deprecations[name].error:
errors.append(msg)
else:
warnings.warn(msg)
if errors:
yield jsonschema.ValidationError("\n".join(errors))
return jsonschema.validators.extend(
jsonschema.Draft4Validator,

View File

@@ -75,7 +75,6 @@
"verify_ssl": {"type": "boolean"},
"ssl_certs": {"type": "string"},
"suppress_gpg_warnings": {"type": "boolean"},
"install_missing_compilers": {"type": "boolean"},
"debug": {"type": "boolean"},
"checksum": {"type": "boolean"},
"deprecated": {"type": "boolean"},
@@ -96,12 +95,21 @@
"binary_index_ttl": {"type": "integer", "minimum": 0},
"aliases": {"type": "object", "patternProperties": {r"\w[\w-]*": {"type": "string"}}},
},
"deprecatedProperties": {
"properties": ["concretizer"],
"message": "Spack supports only clingo as a concretizer from v0.23. "
"The config:concretizer config option is ignored.",
"error": False,
},
"deprecatedProperties": [
{
"names": ["concretizer"],
"message": "Spack supports only clingo as a concretizer from v0.23. "
"The config:concretizer config option is ignored.",
"error": False,
},
{
"names": ["install_missing_compilers"],
"message": "The config:install_missing_compilers option has been deprecated in "
"Spack v0.23, and is currently ignored. It will be removed from config in "
"Spack v0.25.",
"error": False,
},
],
}
}

View File

@@ -109,7 +109,6 @@
"require": requirements,
"prefer": prefer_and_conflict,
"conflict": prefer_and_conflict,
"version": {}, # Here only to warn users on ignored properties
"target": {
"type": "array",
"default": [],
@@ -140,14 +139,6 @@
},
"variants": variants,
},
"deprecatedProperties": {
"properties": ["version"],
"message": "setting version preferences in the 'all' section of packages.yaml "
"is deprecated and will be removed in v0.23\n\n\tThese preferences "
"will be ignored by Spack. You can set them only in package-specific sections "
"of the same file.\n",
"error": False,
},
}
},
"patternProperties": {
@@ -165,14 +156,11 @@
# version strings
"items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
},
"target": {}, # Here only to warn users on ignored properties
"compiler": {}, # Here only to warn users on ignored properties
"buildable": {"type": "boolean", "default": True},
"permissions": permissions,
# If 'get_full_repo' is promoted to a Package-level
# attribute, it could be useful to set it here
"package_attributes": package_attributes,
"providers": {}, # Here only to warn users on ignored properties
"variants": variants,
"externals": {
"type": "array",
@@ -204,18 +192,6 @@
},
},
},
"deprecatedProperties": {
"properties": ["target", "compiler", "providers"],
"message": "setting 'compiler:', 'target:' or 'provider:' preferences in "
"a package-specific section of packages.yaml is deprecated, and will be "
"removed in v0.23.\n\n\tThese preferences will be ignored by Spack, and "
"can be set only in the 'all' section of the same file. "
"You can run:\n\n\t\t$ spack audit configs\n\n\tto get better diagnostics, "
"including files:lines where the deprecated attributes are used.\n\n"
"\tUse requirements to enforce conditions on specific packages: "
f"{REQUIREMENT_URL}\n",
"error": False,
},
}
},
}

View File

@@ -579,7 +579,7 @@ def _is_checksummed_version(version_info: Tuple[GitOrStandardVersion, dict]):
return _is_checksummed_git_version(version)
def _concretization_version_order(version_info: Tuple[GitOrStandardVersion, dict]):
def concretization_version_order(version_info: Tuple[GitOrStandardVersion, dict]):
"""Version order key for concretization, where preferred > not preferred,
not deprecated > deprecated, finite > any infinite component; only if all are
the same do we use default version ordering."""
@@ -1022,6 +1022,102 @@ def __iter__(self):
ConditionSpecCache = Dict[str, Dict[ConditionSpecKey, ConditionIdFunctionPair]]
class ConstraintOrigin(enum.Enum):
"""Generates identifiers that can be pased into the solver attached
to constraints, and then later retrieved to determine the origin of
those constraints when ``SpecBuilder`` creates Specs from the solve
result.
"""
DEPENDS_ON = 1
REQUIRE = 2
@staticmethod
def _SUFFIXES() -> Dict["ConstraintOrigin", str]:
return {ConstraintOrigin.DEPENDS_ON: "_dep", ConstraintOrigin.REQUIRE: "_req"}
@staticmethod
def append_type_suffix(pkg_id: str, kind: "ConstraintOrigin") -> str:
"""Given a package identifier and a constraint kind, generate a string ID."""
suffix = ConstraintOrigin._SUFFIXES()[kind]
return f"{pkg_id}{suffix}"
@staticmethod
def strip_type_suffix(source: str) -> Tuple[int, Optional[str]]:
"""Take a combined package/type ID generated by
``append_type_suffix``, and extract the package ID and
an associated weight.
"""
if not source:
return -1, None
for kind, suffix in ConstraintOrigin._SUFFIXES().items():
if source.endswith(suffix):
return kind.value, source[: -len(suffix)]
return -1, source
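A round-trip sketch (the package id is hypothetical):

tagged = ConstraintOrigin.append_type_suffix("mpich", ConstraintOrigin.REQUIRE)
kind, pkg_id = ConstraintOrigin.strip_type_suffix(tagged)
assert (kind, pkg_id) == (ConstraintOrigin.REQUIRE.value, "mpich")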
class SourceContext:
"""Tracks context in which a Spec's clause-set is generated (i.e.
with ``SpackSolverSetup.spec_clauses``).
Facts generated for the spec may include this context.
"""
def __init__(self):
# This can be "literal" for constraints that come from a user
# spec (e.g. from the command line); it can be the output of
# `ConstraintOrigin.append_type_suffix`; the default is "none"
# (which means it isn't important to keep track of the source
# in that case).
self.source = "none"
class ConditionIdContext(SourceContext):
"""Derived from a ``ConditionContext``: for clause-sets generated by
imposed/required specs, stores an associated transform.
This is primarily used for tracking whether we are generating clauses
in the context of a required spec, or for an imposed spec.
Is not a subclass of ``ConditionContext`` because it exists in a
lower-level context with less information.
"""
def __init__(self):
super().__init__()
self.transform = None
class ConditionContext(SourceContext):
"""Tracks context in which a condition (i.e. ``SpackSolverSetup.condition``)
is generated (e.g. for a `depends_on`).
This may modify the required/imposed specs generated as relevant
for the context.
"""
def __init__(self):
super().__init__()
# transformation applied to facts from the required spec. Defaults
# to leave facts as they are.
self.transform_required = None
# transformation applied to facts from the imposed spec. Defaults
# to removing "node" and "virtual_node" facts.
self.transform_imposed = None
def requirement_context(self) -> ConditionIdContext:
ctxt = ConditionIdContext()
ctxt.source = self.source
ctxt.transform = self.transform_required
return ctxt
def impose_context(self) -> ConditionIdContext:
ctxt = ConditionIdContext()
ctxt.source = self.source
ctxt.transform = self.transform_imposed
return ctxt
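A short sketch of how a caller wires these contexts together (hypothetical package name "hdf5"; remove_node is the transform referenced elsewhere in this diff):

    ctxt = ConditionContext()
    ctxt.source = ConstraintOrigin.append_type_suffix("hdf5", ConstraintOrigin.REQUIRE)
    ctxt.transform_imposed = remove_node   # strip "node"/"virtual_node" facts
    req_ctxt = ctxt.requirement_context()  # req_ctxt.transform is None here
    imp_ctxt = ctxt.impose_context()       # imp_ctxt.transform is remove_node
    # both halves share the origin: req_ctxt.source == imp_ctxt.source == "hdf5_req"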
class SpackSolverSetup:
"""Class to set up and run a Spack concretization solve."""
@@ -1197,8 +1293,9 @@ def compiler_facts(self):
if compiler.compiler_obj is not None:
c = compiler.compiler_obj
for flag_type, flags in c.flags.items():
flag_group = " ".join(flags)
for flag in flags:
self.gen.fact(fn.compiler_flag(compiler_id, flag_type, flag))
self.gen.fact(fn.compiler_flag(compiler_id, flag_type, flag, flag_group))
if compiler.available:
self.gen.fact(fn.compiler_available(compiler_id))
@@ -1375,7 +1472,7 @@ def _get_condition_id(
named_cond: spack.spec.Spec,
cache: ConditionSpecCache,
body: bool,
transform: Optional[TransformFunction] = None,
context: ConditionIdContext,
) -> int:
"""Get the id for one half of a condition (either a trigger or an imposed constraint).
@@ -1389,15 +1486,15 @@ def _get_condition_id(
"""
pkg_cache = cache[named_cond.name]
named_cond_key = (str(named_cond), transform)
named_cond_key = (str(named_cond), context.transform)
result = pkg_cache.get(named_cond_key)
if result:
return result[0]
cond_id = next(self._id_counter)
requirements = self.spec_clauses(named_cond, body=body)
if transform:
requirements = transform(named_cond, requirements)
requirements = self.spec_clauses(named_cond, body=body, context=context)
if context.transform:
requirements = context.transform(named_cond, requirements)
pkg_cache[named_cond_key] = (cond_id, requirements)
return cond_id
@@ -1408,8 +1505,7 @@ def condition(
imposed_spec: Optional[spack.spec.Spec] = None,
name: Optional[str] = None,
msg: Optional[str] = None,
transform_required: Optional[TransformFunction] = None,
transform_imposed: Optional[TransformFunction] = remove_node,
context: Optional[ConditionContext] = None,
):
"""Generate facts for a dependency or virtual provider condition.
@@ -1418,10 +1514,8 @@ def condition(
imposed_spec: the constraints that are imposed when this condition is triggered
name: name for `required_spec` (required if required_spec is anonymous, ignored if not)
msg: description of the condition
transform_required: transformation applied to facts from the required spec. Defaults
to leave facts as they are.
transform_imposed: transformation applied to facts from the imposed spec. Defaults
to removing "node" and "virtual_node" facts.
context: if provided, indicates how to modify the clause-sets for the required/imposed
specs based on the type of constraint they are generated for (e.g. `depends_on`)
Returns:
int: id of the condition created by this function
"""
@@ -1429,14 +1523,19 @@ def condition(
if not name:
raise ValueError(f"Must provide a name for anonymous condition: '{required_spec}'")
if not context:
context = ConditionContext()
context.transform_imposed = remove_node
with spec_with_name(required_spec, name):
# Check if we can emit the requirements before updating the condition ID counter.
# In this way, if a condition can't be emitted but the exception is handled in the
# caller, we won't emit partial facts.
condition_id = next(self._id_counter)
requirement_context = context.requirement_context()
trigger_id = self._get_condition_id(
required_spec, cache=self._trigger_cache, body=True, transform=transform_required
required_spec, cache=self._trigger_cache, body=True, context=requirement_context
)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
@@ -1446,8 +1545,9 @@ def condition(
if not imposed_spec:
return condition_id
impose_context = context.impose_context()
effect_id = self._get_condition_id(
imposed_spec, cache=self._effect_cache, body=False, transform=transform_imposed
imposed_spec, cache=self._effect_cache, body=False, context=impose_context
)
self.gen.fact(
fn.pkg_fact(required_spec.name, fn.condition_effect(condition_id, effect_id))
@@ -1455,8 +1555,8 @@ def condition(
return condition_id
def impose(self, condition_id, imposed_spec, node=True, name=None, body=False):
imposed_constraints = self.spec_clauses(imposed_spec, body=body, required_from=name)
def impose(self, condition_id, imposed_spec, node=True, body=False):
imposed_constraints = self.spec_clauses(imposed_spec, body=body)
for pred in imposed_constraints:
# imposed "node"-like conditions are no-ops
if not node and pred.args[0] in ("node", "virtual_node"):
@@ -1528,14 +1628,14 @@ def dependency_holds(input_spec, requirements):
if t & depflag
]
self.condition(
cond,
dep.spec,
name=pkg.name,
msg=msg,
transform_required=track_dependencies,
transform_imposed=dependency_holds,
context = ConditionContext()
context.source = ConstraintOrigin.append_type_suffix(
pkg.name, ConstraintOrigin.DEPENDS_ON
)
context.transform_required = track_dependencies
context.transform_imposed = dependency_holds
self.condition(cond, dep.spec, name=pkg.name, msg=msg, context=context)
self.gen.newline()
@@ -1613,17 +1713,21 @@ def emit_facts_from_requirement_rules(self, rules: List[RequirementRule]):
when_spec = spack.spec.Spec(pkg_name)
try:
# With virtual we want to emit "node" and "virtual_node" in imposed specs
transform: Optional[TransformFunction] = remove_node
if virtual:
transform = None
context = ConditionContext()
context.source = ConstraintOrigin.append_type_suffix(
pkg_name, ConstraintOrigin.REQUIRE
)
if not virtual:
context.transform_imposed = remove_node
# else: for virtuals we want to emit "node" and
# "virtual_node" in imposed specs
member_id = self.condition(
required_spec=when_spec,
imposed_spec=spec,
name=pkg_name,
transform_imposed=transform,
msg=f"{input_spec} is a requirement for package {pkg_name}",
context=context,
)
except Exception as e:
# Do not raise if the rule comes from the 'all' subsection, since usability
@@ -1678,6 +1782,10 @@ def external_packages(self):
if pkg_name not in spack.repo.PATH:
continue
# This package is not among possible dependencies
if pkg_name not in self.pkgs:
continue
# Check if the external package is buildable. If it is
# not then "external(<pkg>)" is a fact, unless we can
# reuse an already installed spec.
@@ -1718,7 +1826,9 @@ def external_imposition(input_spec, requirements):
]
try:
self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
context = ConditionContext()
context.transform_imposed = external_imposition
self.condition(spec, spec, msg=msg, context=context)
except (spack.error.SpecError, RuntimeError) as e:
warnings.warn(f"while setting up external spec {spec}: {e}")
continue
@@ -1793,6 +1903,7 @@ def spec_clauses(
expand_hashes: bool = False,
concrete_build_deps=False,
required_from: Optional[str] = None,
context: Optional[SourceContext] = None,
) -> List[AspFunction]:
"""Wrap a call to `_spec_clauses()` into a try/except block with better error handling.
@@ -1808,6 +1919,7 @@ def spec_clauses(
transitive=transitive,
expand_hashes=expand_hashes,
concrete_build_deps=concrete_build_deps,
context=context,
)
except RuntimeError as exc:
msg = str(exc)
@@ -1824,6 +1936,7 @@ def _spec_clauses(
transitive: bool = True,
expand_hashes: bool = False,
concrete_build_deps: bool = False,
context: Optional[SourceContext] = None,
) -> List[AspFunction]:
"""Return a list of clauses for a spec mandates are true.
@@ -1835,6 +1948,8 @@ def _spec_clauses(
expand_hashes: if True, descend into hashes of concrete specs (default False)
concrete_build_deps: if False, do not include pure build deps of concrete specs
(as they have no effect on runtime constraints)
context: tracks what constraint this clause set is generated for (e.g. a
`depends_on` constraint in a package.py file)
Normally, if called with ``transitive=True``, ``spec_clauses()`` just generates
hashes for the dependency requirements of concrete specs. If ``expand_hashes``
@@ -1921,13 +2036,19 @@ def _spec_clauses(
self.compiler_version_constraints.add(spec.compiler)
# compiler flags
source = context.source if context else "none"
for flag_type, flags in spec.compiler_flags.items():
flag_group = " ".join(flags)
for flag in flags:
clauses.append(f.node_flag(spec.name, flag_type, flag))
clauses.append(
f.node_flag(spec.name, fn.node_flag(flag_type, flag, flag_group, source))
)
if not spec.concrete and flag.propagate is True:
clauses.append(
f.propagate(
spec.name, fn.node_flag(flag_type, flag), fn.edge_types("link", "run")
spec.name,
fn.node_flag(flag_type, flag, flag_group, source),
fn.edge_types("link", "run"),
)
)
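Assuming a command-line spec such as zlib cflags="-O2 -g" (a hypothetical example; command-line specs get source "literal" via literal_specs below), the loop above emits one clause per flag while preserving the group and origin:

    # node_flag("zlib", node_flag("cflags", "-O2", "-O2 -g", "literal"))
    # node_flag("zlib", node_flag("cflags", "-g",  "-O2 -g", "literal"))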
@@ -2009,6 +2130,7 @@ def _spec_clauses(
body=body,
expand_hashes=expand_hashes,
concrete_build_deps=concrete_build_deps,
context=context,
)
)
@@ -2026,7 +2148,7 @@ def define_package_versions_and_validate_preferences(
# like being a "develop" version or being preferred exist only at a
# package.py level, sort them in this partial list here
package_py_versions = sorted(
pkg_cls.versions.items(), key=_concretization_version_order, reverse=True
pkg_cls.versions.items(), key=concretization_version_order, reverse=True
)
if require_checksum and pkg_cls.has_code:
@@ -2609,10 +2731,6 @@ def define_runtime_constraints(self):
continue
current_libc = compiler.compiler_obj.default_libc
# If this is a compiler yet to be built (config:install_missing_compilers:true)
# infer libc from the Python process
if not current_libc and compiler.compiler_obj.cc is None:
current_libc = spack.util.libc.libc_from_current_python_process()
if using_libc_compatibility() and current_libc:
recorder("*").depends_on(
@@ -2644,7 +2762,9 @@ def literal_specs(self, specs):
effect_id, requirements = cache[imposed_spec_key]
else:
effect_id = next(self._id_counter)
requirements = self.spec_clauses(spec)
context = SourceContext()
context.source = "literal"
requirements = self.spec_clauses(spec, context=context)
root_name = spec.name
for clause in requirements:
clause_name = clause.args[0]
@@ -3032,7 +3152,7 @@ def with_input_specs(self, input_specs: List["spack.spec.Spec"]) -> "CompilerPar
Args:
input_specs: specs to be concretized
"""
strict = spack.concretize.Concretizer().check_for_compiler_existence
strict = spack.concretize.CHECK_COMPILER_EXISTENCE
default_os = str(spack.platforms.host().default_os)
default_target = str(archspec.cpu.host().family)
for s in traverse.traverse_nodes(input_specs):
@@ -3344,7 +3464,6 @@ def __init__(self, specs, hash_lookup=None):
self._result = None
self._command_line_specs = specs
self._flag_sources = collections.defaultdict(lambda: set())
self._flag_compiler_defaults = set()
# Pass in as arguments reusable specs and plug them in
# from this dictionary during reconstruction
@@ -3402,14 +3521,10 @@ def node_compiler_version(self, node, compiler, version):
self._specs[node].compiler = spack.spec.CompilerSpec(compiler)
self._specs[node].compiler.versions = vn.VersionList([vn.Version(version)])
def node_flag_compiler_default(self, node):
self._flag_compiler_defaults.add(node)
def node_flag(self, node, flag_type, flag):
self._specs[node].compiler_flags.add_flag(flag_type, flag, False)
def node_flag_source(self, node, flag_type, source):
self._flag_sources[(node, flag_type)].add(source)
def node_flag(self, node, node_flag):
self._specs[node].compiler_flags.add_flag(
node_flag.flag_type, node_flag.flag, False, node_flag.flag_group, node_flag.source
)
def external_spec_selected(self, node, idx):
"""This means that the external spec and index idx has been selected for this package."""
@@ -3450,15 +3565,23 @@ def virtual_on_edge(self, parent_node, provider_node, virtual):
dependencies[0].update_virtuals((virtual,))
def reorder_flags(self):
"""Order compiler flags on specs in predefined order.
We order flags so that any node's flags will take priority over
those of its dependents. That is, the deepest node in the DAG's
flags will appear last on the compile line, in the order they
were specified.
"""For each spec, determine the order of compiler flags applied to it.
The solver determines which flags are on nodes; this routine
imposes order afterwards.
imposes order afterwards. The order is:
1. Flags applied in compiler definitions should come first
2. Flags applied by dependents are ordered topologically (with a
dependency on `traverse` to resolve the partial order into a
stable total order)
3. Flags from requirements are then applied (requirements always
come from the package and never a parent)
4. Command-line flags should come last
Additionally, for each source (requirements, compiler, command line, and
dependents), flags from that source should retain their order and grouping:
e.g. for `y cflags="-z -a"` "-z" and "-a" should never have any intervening
flags inserted, and should always appear in that order.
"""
# reverse compilers so we get highest priority compilers that share a spec
compilers = dict(
@@ -3473,40 +3596,78 @@ def reorder_flags(self):
flagmap_from_compiler = compilers[spec.compiler].flags
for flag_type in spec.compiler_flags.valid_compiler_flags():
from_compiler = flagmap_from_compiler.get(flag_type, [])
from_sources = []
# order is determined by the DAG. A spec's flags come after any of its ancestors
# on the compile line
node = SpecBuilder.make_node(pkg=spec.name)
source_key = (node, flag_type)
if source_key in self._flag_sources:
order = [
SpecBuilder.make_node(pkg=s.name)
for s in spec.traverse(order="post", direction="parents")
]
sorted_sources = sorted(
self._flag_sources[source_key], key=lambda s: order.index(s)
ordered_flags = []
# 1. Put compiler flags first
from_compiler = tuple(flagmap_from_compiler.get(flag_type, []))
extend_flag_list(ordered_flags, from_compiler)
# 2. Add all sources (the compiler is one of them, so skip any
# flag group that matches it exactly)
flag_groups = set()
for flag in self._specs[node].compiler_flags.get(flag_type, []):
flag_groups.add(
spack.spec.CompilerFlag(
flag.flag_group,
propagate=flag.propagate,
flag_group=flag.flag_group,
source=flag.source,
)
)
# add flags from each source, lowest to highest precedence
for node in sorted_sources:
all_src_flags = list()
per_pkg_sources = [self._specs[node]]
if node.pkg in cmd_specs:
per_pkg_sources.append(cmd_specs[node.pkg])
for source in per_pkg_sources:
all_src_flags.extend(source.compiler_flags.get(flag_type, []))
extend_flag_list(from_sources, all_src_flags)
# For flags that are applied by dependents, put flags from parents
# before children; we depend on the stability of traverse() to
# achieve a stable flag order for flags introduced in this manner.
topo_order = list(s.name for s in spec.traverse(order="post", direction="parents"))
lex_order = list(sorted(flag_groups))
def _order_index(flag_group):
source = flag_group.source
# Note: if 'require: ^dependency cflags=...' is ever possible,
# this will topologically sort for require as well
type_index, pkg_source = ConstraintOrigin.strip_type_suffix(source)
if pkg_source in topo_order:
major_index = topo_order.index(pkg_source)
# If for x->y, x has multiple depends_on declarations that
# are activated, and each adds cflags to y, we fall back on
# alphabetical ordering to maintain a total order
minor_index = lex_order.index(flag_group)
else:
major_index = len(topo_order) + lex_order.index(flag_group)
minor_index = 0
return (type_index, major_index, minor_index)
prioritized_groups = sorted(flag_groups, key=lambda x: _order_index(x))
for grp in prioritized_groups:
grp_flags = tuple(
x for (x, y) in spack.compiler.tokenize_flags(grp.flag_group)
)
if grp_flags == from_compiler:
continue
as_compiler_flags = list(
spack.spec.CompilerFlag(
x,
propagate=grp.propagate,
flag_group=grp.flag_group,
source=grp.source,
)
for x in grp_flags
)
extend_flag_list(ordered_flags, as_compiler_flags)
# 3. Now put cmd-line flags last
if node.pkg in cmd_specs:
cmd_flags = cmd_specs[node.pkg].compiler_flags.get(flag_type, [])
extend_flag_list(ordered_flags, cmd_flags)
# compiler flags from compilers config are lowest precedence
ordered_compiler_flags = list(llnl.util.lang.dedupe(from_compiler + from_sources))
compiler_flags = spec.compiler_flags.get(flag_type, [])
msg = "%s does not equal %s" % (set(compiler_flags), set(ordered_flags))
assert set(compiler_flags) == set(ordered_flags), msg
msg = f"{set(compiler_flags)} does not equal {set(ordered_compiler_flags)}"
assert set(compiler_flags) == set(ordered_compiler_flags), msg
spec.compiler_flags.update({flag_type: ordered_compiler_flags})
spec.compiler_flags.update({flag_type: ordered_flags})
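Putting the four rules together on a hypothetical node: if the compiler definition supplies cflags="-O2", a dependent's depends_on imposes "-fPIC", a requirement adds "-g", and the command line adds "-DNDEBUG", the reordered result would be:

    # cflags == ["-O2", "-fPIC", "-g", "-DNDEBUG"]
    #            compiler, depends_on, require, command line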
def deprecated(self, node: NodeArgument, version: str) -> None:
tty.warn(f'using "{node.pkg}@{version}" which is a deprecated version')
@@ -3570,10 +3731,9 @@ def build_specs(self, function_tuples):
continue
# if we've already gotten a concrete spec for this pkg,
# do not bother calling actions on it except for node_flag_source,
# since node_flag_source is tracking information not in the spec itself
# do not bother calling actions on it
spec = self._specs.get(args[0])
if spec and spec.concrete and name != "node_flag_source":
if spec and spec.concrete:
continue
action(*args)
@@ -3633,7 +3793,8 @@ def _develop_specs_from_env(spec, env):
assert spec.variants["dev_path"].value == path, error_msg
else:
spec.variants.setdefault("dev_path", spack.variant.SingleValuedVariant("dev_path", path))
spec.constrain(dev_info["spec"])
assert spec.satisfies(dev_info["spec"])
def _is_reusable(spec: spack.spec.Spec, packages, local: bool) -> bool:

View File

@@ -43,9 +43,7 @@
internal_error("Only nodes can have node_compiler_version").
:- attr("variant_value", PackageNode, _, _), not attr("node", PackageNode),
internal_error("variant_value true for a non-node").
:- attr("node_flag_compiler_default", PackageNode), not attr("node", PackageNode),
internal_error("node_flag_compiler_default true for non-node").
:- attr("node_flag", PackageNode, _, _), not attr("node", PackageNode),
:- attr("node_flag", PackageNode, _), not attr("node", PackageNode),
internal_error("node_flag assigned for non-node").
:- attr("external_spec_selected", PackageNode, _), not attr("node", PackageNode),
internal_error("external_spec_selected for non-node").
@@ -53,10 +51,6 @@
internal_error("non-node depends on something").
:- attr("depends_on", _, ChildNode, _), not attr("node", ChildNode),
internal_error("something depends_on a non-node").
:- attr("node_flag_source", Node, _, _), not attr("node", Node),
internal_error("node_flag_source assigned for a non-node").
:- attr("node_flag_source", _, _, SourceNode), not attr("node", SourceNode),
internal_error("node_flag_source assigned with a non-node source").
:- attr("virtual_node", VirtualNode), not provider(_, VirtualNode),
internal_error("virtual node with no provider").
:- provider(_, VirtualNode), not attr("virtual_node", VirtualNode),
@@ -154,7 +148,6 @@ unification_set(SetID, VirtualNode)
% TODO: literals, at the moment, can only influence the "root" unification set. This needs to be extended later.
% Node attributes that have multiple node arguments (usually, only the first argument is a node)
multiple_nodes_attribute("node_flag_source").
multiple_nodes_attribute("depends_on").
multiple_nodes_attribute("virtual_on_edge").
multiple_nodes_attribute("provider_set").
@@ -392,7 +385,6 @@ trigger_condition_holds(ID, RequestorNode) :-
attr(Name, node(X, A1), A2, A3) : condition_requirement(ID, Name, A1, A2, A3), condition_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name);
attr(Name, node(X, A1), A2, A3, A4) : condition_requirement(ID, Name, A1, A2, A3, A4), condition_nodes(ID, PackageNode, node(X, A1));
% Special cases
attr("node_flag_source", node(X, A1), A2, node(Y, A3)) : condition_requirement(ID, "node_flag_source", A1, A2, A3), condition_nodes(ID, PackageNode, node(X, A1)), condition_nodes(ID, PackageNode, node(Y, A3));
not cannot_hold(ID, PackageNode).
condition_holds(ConditionID, node(X, Package))
@@ -440,13 +432,6 @@ attr(Name, node(X, A1), A2) :- impose(ID, PackageNode), imposed_constrai
attr(Name, node(X, A1), A2, A3) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3), imposed_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name).
attr(Name, node(X, A1), A2, A3, A4) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3, A4), imposed_nodes(ID, PackageNode, node(X, A1)).
% For node flag sources we need to look at the condition_set of the source, since it is the dependent
% of the package on which I want to impose the constraint
attr("node_flag_source", node(X, A1), A2, node(Y, A3))
:- impose(ID, node(X, A1)),
imposed_constraint(ID, "node_flag_source", A1, A2, A3),
condition_set(node(Y, A3), node(X, A1)).
% Provider set is relevant only for literals, since it's the only place where `^[virtuals=foo] bar`
% might appear in the HEAD of a rule
attr("provider_set", node(min_dupe_id, Provider), node(min_dupe_id, Virtual))
@@ -487,8 +472,8 @@ virtual_condition_holds(node(Y, A2), Virtual)
% we cannot have additional flag values when we are working with concrete specs
:- attr("node", node(ID, Package)),
attr("hash", node(ID, Package), Hash),
attr("node_flag", node(ID, Package), FlagType, Flag),
not imposed_constraint(Hash, "node_flag", Package, FlagType, Flag),
attr("node_flag", node(ID, Package), node_flag(FlagType, Flag, _, _)),
not imposed_constraint(Hash, "node_flag", Package, node_flag(FlagType, Flag, _, _)),
internal_error("imposed hash without imposing all flag values").
#defined condition/2.
@@ -789,22 +774,15 @@ required_provider(Provider, Virtual)
:- provider(node(Y, Package), node(X, Virtual)), required_provider(Provider, Virtual), Package != Provider.
% TODO: the following two choice rules allow the solver to add compiler
% TODO: the following choice rule allows the solver to add compiler
% flags if their only source is from a requirement. This is overly-specific
% and should use a more-generic approach like in https://github.com/spack/spack/pull/37180
{ attr("node_flag", node(ID, Package), FlagType, FlagValue) } :-
{ attr("node_flag", node(ID, Package), NodeFlag) } :-
requirement_group_member(ConditionID, Package, RequirementID),
activate_requirement(node(ID, Package), RequirementID),
pkg_fact(Package, condition_effect(ConditionID, EffectID)),
imposed_constraint(EffectID, "node_flag_set", Package, FlagType, FlagValue).
{ attr("node_flag_source", node(NodeID1, Package1), FlagType, node(NodeID2, Package2)) } :-
requirement_group_member(ConditionID, Package1, RequirementID),
activate_requirement(node(NodeID1, Package1), RequirementID),
pkg_fact(Package1, condition_effect(ConditionID, EffectID)),
imposed_constraint(EffectID, "node_flag_source", Package1, FlagType, Package2),
imposed_nodes(EffectID, node(NodeID2, Package2), node(NodeID1, Package1)).
imposed_constraint(EffectID, "node_flag_set", Package, NodeFlag).
requirement_weight(node(ID, Package), Group, W) :-
W = #min {
@@ -1050,23 +1028,22 @@ variant_is_propagated(PackageNode, Variant) :-
% 1. The same flag type is not set on this node
% 2. This node has the same compiler as the propagation source
propagated_flag(node(PackageID, Package), node_flag(FlagType, Flag), SourceNode) :-
propagate(node(PackageID, Package), node_flag(FlagType, Flag), _),
not attr("node_flag_set", node(PackageID, Package), FlagType, _),
propagated_flag(node(PackageID, Package), node_flag(FlagType, Flag, FlagGroup, Source), SourceNode) :-
propagate(node(PackageID, Package), node_flag(FlagType, Flag, FlagGroup, Source), _),
not attr("node_flag_set", node(PackageID, Package), node_flag(FlagType, _, _, "literal")),
% Same compiler as propagation source
node_compiler(node(PackageID, Package), CompilerID),
node_compiler(SourceNode, CompilerID),
attr("propagate", SourceNode, node_flag(FlagType, Flag), _),
attr("propagate", SourceNode, node_flag(FlagType, Flag, FlagGroup, Source), _),
node(PackageID, Package) != SourceNode,
not runtime(Package).
attr("node_flag", PackageNode, FlagType, Flag) :- propagated_flag(PackageNode, node_flag(FlagType, Flag), _).
attr("node_flag_source", PackageNode, FlagType, SourceNode) :- propagated_flag(PackageNode, node_flag(FlagType, _), SourceNode).
attr("node_flag", PackageNode, NodeFlag) :- propagated_flag(PackageNode, NodeFlag, _).
% Cannot propagate the same flag from two distinct sources
error(100, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source1, Source2, Package, FlagType) :-
propagated_flag(node(ID, Package), node_flag(FlagType, _), node(_, Source1)),
propagated_flag(node(ID, Package), node_flag(FlagType, _), node(_, Source2)),
propagated_flag(node(ID, Package), node_flag(FlagType, _, _, _), node(_, Source1)),
propagated_flag(node(ID, Package), node_flag(FlagType, _, _, _), node(_, Source2)),
Source1 < Source2.
%----
@@ -1342,7 +1319,7 @@ node_compiler_weight(node(ID, Package), 100)
not compiler_weight(CompilerID, _).
% For the time being, be strict and reuse only if the compiler match one we have on the system
error(100, "Compiler {1}@{2} requested for {0} cannot be found. Set install_missing_compilers:true if intended.", Package, Compiler, Version)
error(100, "Compiler {1}@{2} requested for {0} cannot be found.", Package, Compiler, Version)
:- attr("node_compiler_version", node(ID, Package), Compiler, Version),
not node_compiler(node(ID, Package), _).
@@ -1353,32 +1330,18 @@ error(100, "Compiler {1}@{2} requested for {0} cannot be found. Set install_miss
% Compiler flags
%-----------------------------------------------------------------------------
% remember where flags came from
attr("node_flag_source", PackageNode, FlagType, PackageNode) :- attr("node_flag_set", PackageNode, FlagType, _).
attr("node_flag_source", PackageNode, FlagType, PackageNode) :- attr("node_flag", PackageNode, FlagType, _), attr("hash", PackageNode, _).
% compiler flags from compilers.yaml are put on nodes if compiler matches
attr("node_flag", PackageNode, FlagType, Flag)
:- compiler_flag(CompilerID, FlagType, Flag),
attr("node_flag", PackageNode, node_flag(FlagType, Flag, FlagGroup, CompilerID))
:- compiler_flag(CompilerID, FlagType, Flag, FlagGroup),
node_compiler(PackageNode, CompilerID),
flag_type(FlagType),
compiler_id(CompilerID),
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, Version).
attr("node_flag_compiler_default", PackageNode)
:- not attr("node_flag_set", PackageNode, FlagType, _),
compiler_flag(CompilerID, FlagType, Flag),
node_compiler(PackageNode, CompilerID),
flag_type(FlagType),
compiler_id(CompilerID),
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, Version).
attr("node_flag", PackageNode, NodeFlag) :- attr("node_flag_set", PackageNode, NodeFlag).
% Flag set to something
attr("node_flag", PackageNode, FlagType, Flag) :- attr("node_flag_set", PackageNode, FlagType, Flag).
#defined compiler_flag/3.
#defined compiler_flag/4.
%-----------------------------------------------------------------------------

View File

@@ -230,6 +230,13 @@ class NodeArgument(NamedTuple):
pkg: str
class NodeFlag(NamedTuple):
flag_type: str
flag: str
flag_group: str
source: str
def intermediate_repr(sym):
"""Returns an intermediate representation of clingo models for Spack's spec builder.
@@ -248,6 +255,13 @@ def intermediate_repr(sym):
return NodeArgument(
id=intermediate_repr(sym.arguments[0]), pkg=intermediate_repr(sym.arguments[1])
)
elif sym.name == "node_flag":
return NodeFlag(
flag_type=intermediate_repr(sym.arguments[0]),
flag=intermediate_repr(sym.arguments[1]),
flag_group=intermediate_repr(sym.arguments[2]),
source=intermediate_repr(sym.arguments[3]),
)
except RuntimeError:
# This happens when using clingo w/ CFFI and trying to access ".name" for symbols
# that are not functions

View File

@@ -13,6 +13,7 @@
#show attr/2.
#show attr/3.
#show attr/4.
#show attr/5.
% names of optimization criteria
#show opt_criterion/2.

View File

@@ -781,17 +781,49 @@ class CompilerFlag(str):
propagate (bool): if ``True`` the flag value will
be passed to the package's dependencies. If
``False`` it will not
flag_group (str): if this flag was introduced along
with several flags via a single source, then
this will store all such flags
source (str): identifies the type of constraint that
introduced this flag (e.g. if a package has
``depends_on(... cflags=-g)``, then the ``source``
for "-g" would indicate ``depends_on``.
"""
def __new__(cls, value, **kwargs):
obj = str.__new__(cls, value)
obj.propagate = kwargs.pop("propagate", False)
obj.flag_group = kwargs.pop("flag_group", value)
obj.source = kwargs.pop("source", None)
return obj
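A minimal sketch of the constructor defaults (flag values are arbitrary):

    flag = CompilerFlag("-g", propagate=True, flag_group="-O2 -g", source="req")
    # flag == "-g"; flag.propagate is True; flag.flag_group == "-O2 -g"
    bare = CompilerFlag("-O2")
    # bare.propagate is False; bare.flag_group == "-O2" (falls back to the value);
    # bare.source is None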
_valid_compiler_flags = ["cflags", "cxxflags", "fflags", "ldflags", "ldlibs", "cppflags"]
def _shared_subset_pair_iterate(container1, container2):
"""
[0, a, c, d, f]
[a, d, e, f]
yields [(a, a), (d, d), (f, f)]
no repeated elements
"""
a_idx, b_idx = 0, 0
max_a, max_b = len(container1), len(container2)
while a_idx < max_a and b_idx < max_b:
if container1[a_idx] == container2[b_idx]:
yield (container1[a_idx], container2[b_idx])
a_idx += 1
b_idx += 1
else:
while container1[a_idx] < container2[b_idx]:
a_idx += 1
while container1[a_idx] > container2[b_idx]:
b_idx += 1
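A quick check of the pairing behavior documented above, with sorted, duplicate-free inputs (hypothetical flag values):

    left = ["-a", "-c", "-d", "-f"]
    right = ["-a", "-d", "-e", "-f"]
    pairs = list(_shared_subset_pair_iterate(left, right))
    # pairs == [("-a", "-a"), ("-d", "-d"), ("-f", "-f")]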
class FlagMap(lang.HashableMap):
__slots__ = ("spec",)
@@ -800,23 +832,9 @@ def __init__(self, spec):
self.spec = spec
def satisfies(self, other):
return all(f in self and self[f] == other[f] for f in other)
return all(f in self and set(self[f]) >= set(other[f]) for f in other)
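Under the new subset semantics, a flag map carrying a superset of another's flags now satisfies it (hypothetical values):

    # self:  {"cflags": ["-O2", "-g"]}    other: {"cflags": ["-g"]}
    # old: self["cflags"] == other["cflags"] is False -> not satisfied
    # new: set(self["cflags"]) >= set(other["cflags"]) is True -> satisfied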
def intersects(self, other):
common_types = set(self) & set(other)
for flag_type in common_types:
if not self[flag_type] or not other[flag_type]:
# At least one of the two is empty
continue
if self[flag_type] != other[flag_type]:
return False
if not all(
f1.propagate == f2.propagate for f1, f2 in zip(self[flag_type], other[flag_type])
):
# At least one propagation flag didn't match
return False
return True
def constrain(self, other):
@@ -824,28 +842,28 @@ def constrain(self, other):
Return whether the spec changed.
"""
if other.spec and other.spec._concrete:
for k in self:
if k not in other:
raise UnsatisfiableCompilerFlagSpecError(self[k], "<absent>")
changed = False
for k in other:
if k in self and not set(self[k]) <= set(other[k]):
raise UnsatisfiableCompilerFlagSpecError(
" ".join(f for f in self[k]), " ".join(f for f in other[k])
)
elif k not in self:
self[k] = other[k]
for flag_type in other:
if flag_type not in self:
self[flag_type] = other[flag_type]
changed = True
else:
extra_other = set(other[flag_type]) - set(self[flag_type])
if extra_other:
self[flag_type] = list(self[flag_type]) + list(
x for x in other[flag_type] if x in extra_other
)
changed = True
# Next, if any flags in other propagate, we force them to propagate in our case
shared = list(sorted(set(other[flag_type]) - extra_other))
for x, y in _shared_subset_pair_iterate(shared, sorted(self[flag_type])):
if x.propagate:
y.propagate = True
# TODO: what happens if flag groups with a partial (but not complete)
# intersection specify different behaviors for flag propagation?
# Check that the propagation values match
if self[k] == other[k]:
for i in range(len(other[k])):
if self[k][i].propagate != other[k][i].propagate:
raise UnsatisfiableCompilerFlagSpecError(
self[k][i].propagate, other[k][i].propagate
)
return changed
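A sketch of the rewritten union semantics (hypothetical values):

    # self:  {"cflags": ["-O2"]}    other: {"cflags": ["-O2", "-g"]}
    # after self.constrain(other): self["cflags"] == ["-O2", "-g"], changed is True
    # shared flags whose counterpart in `other` propagates are forced to
    # propagate in `self` as well (via _shared_subset_pair_iterate)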
@staticmethod
@@ -858,7 +876,7 @@ def copy(self):
clone[name] = compiler_flag
return clone
def add_flag(self, flag_type, value, propagation):
def add_flag(self, flag_type, value, propagation, flag_group=None, source=None):
"""Stores the flag's value in CompilerFlag and adds it
to the FlagMap
@@ -869,7 +887,8 @@ def add_flag(self, flag_type, value, propagation):
propagation (bool): if ``True`` the flag value will be passed to
the packages' dependencies. If ``False`` it will not be passed
"""
flag = CompilerFlag(value, propagate=propagation)
flag_group = flag_group or value
flag = CompilerFlag(value, propagate=propagation, flag_group=flag_group, source=source)
if flag_type not in self:
self[flag_type] = [flag]
@@ -1552,7 +1571,9 @@ def _get_dependency(self, name):
raise spack.error.SpecError(err_msg.format(name, len(deps)))
return deps[0]
def edges_from_dependents(self, name=None, depflag: dt.DepFlag = dt.ALL):
def edges_from_dependents(
self, name=None, depflag: dt.DepFlag = dt.ALL
) -> List[DependencySpec]:
"""Return a list of edges connecting this node in the DAG
to parents.
@@ -1562,7 +1583,9 @@ def edges_from_dependents(self, name=None, depflag: dt.DepFlag = dt.ALL):
"""
return [d for d in self._dependents.select(parent=name, depflag=depflag)]
def edges_to_dependencies(self, name=None, depflag: dt.DepFlag = dt.ALL):
def edges_to_dependencies(
self, name=None, depflag: dt.DepFlag = dt.ALL
) -> List[DependencySpec]:
"""Return a list of edges connecting this node in the DAG
to children.
@@ -1661,8 +1684,9 @@ def _add_flag(self, name, value, propagate):
elif name in valid_flags:
assert self.compiler_flags is not None
flags_and_propagation = spack.compiler.tokenize_flags(value, propagate)
flag_group = " ".join(x for (x, y) in flags_and_propagation)
for flag, propagation in flags_and_propagation:
self.compiler_flags.add_flag(name, flag, propagation)
self.compiler_flags.add_flag(name, flag, propagation, flag_group)
else:
# FIXME:
# All other flags represent variants. 'foo=true' and 'foo=false'
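Judging only from the usage above (the exact return shape of tokenize_flags is an assumption), a value of "-O2 -g" with propagate=True would tokenize as:

    # tokenize_flags("-O2 -g", True) -> [("-O2", True), ("-g", True)]
    # each flag is then stored with flag_group="-O2 -g", keeping the pair intact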
@@ -3887,43 +3911,43 @@ def format_attribute(match_object: Match) -> str:
for idx, part in enumerate(parts):
if not part:
raise SpecFormatStringError("Format string attributes must be non-empty")
if part.startswith("_"):
elif part.startswith("_"):
raise SpecFormatStringError("Attempted to format private attribute")
else:
if part == "variants" and isinstance(current, VariantMap):
# subscript instead of getattr for variant names
elif isinstance(current, VariantMap):
# subscript instead of getattr for variant names
try:
current = current[part]
else:
# aliases
if part == "arch":
part = "architecture"
elif part == "version":
# version (singular) requires a concrete versions list. Avoid
# pedantic errors by using versions (plural) when not concrete.
# These two are not entirely equivalent for pkg@=1.2.3:
# - version prints '1.2.3'
# - versions prints '=1.2.3'
if not current.versions.concrete:
part = "versions"
try:
current = getattr(current, part)
except AttributeError:
parent = ".".join(parts[:idx])
m = "Attempted to format attribute %s." % attribute
m += "Spec %s has no attribute %s" % (parent, part)
raise SpecFormatStringError(m)
if isinstance(current, vn.VersionList):
if current == vn.any_version:
# don't print empty version lists
return ""
if callable(current):
raise SpecFormatStringError("Attempted to format callable object")
if current is None:
# not printing anything
except KeyError:
raise SpecFormatStringError(f"Variant '{part}' does not exist")
else:
# aliases
if part == "arch":
part = "architecture"
elif part == "version" and not current.versions.concrete:
# version (singular) requires a concrete versions list. Avoid
# pedantic errors by using versions (plural) when not concrete.
# These two are not entirely equivalent for pkg@=1.2.3:
# - version prints '1.2.3'
# - versions prints '=1.2.3'
part = "versions"
try:
current = getattr(current, part)
except AttributeError:
raise SpecFormatStringError(
f"Attempted to format attribute {attribute}. "
f"Spec {'.'.join(parts[:idx])} has no attribute {part}"
)
if isinstance(current, vn.VersionList) and current == vn.any_version:
# don't print empty version lists
return ""
if callable(current):
raise SpecFormatStringError("Attempted to format callable object")
if current is None:
# not printing anything
return ""
# Set color codes for various attributes
color = None
if "architecture" in parts:
@@ -4682,6 +4706,10 @@ def _load(cls, data):
return hash_dict[root_spec_hash]["node_spec"]
@classmethod
def read_specfile_dep_specs(cls, deps, hash_type=ht.dag_hash.name):
raise NotImplementedError("Subclasses must implement this method.")
class SpecfileV1(SpecfileReaderBase):
@classmethod

View File

@@ -39,7 +39,6 @@
import spack.paths
import spack.resource
import spack.spec
import spack.stage
import spack.util.crypto
import spack.util.lock
import spack.util.path as sup
@@ -981,8 +980,8 @@ def interactive_version_filter(
data = buffer.getvalue().encode("utf-8")
short_hash = hashlib.sha1(data).hexdigest()[:7]
filename = f"{spack.stage.stage_prefix}versions-{short_hash}.txt"
filepath = os.path.join(spack.stage.get_stage_root(), filename)
filename = f"{stage_prefix}versions-{short_hash}.txt"
filepath = os.path.join(get_stage_root(), filename)
# Write contents
with open(filepath, "wb") as f:

View File

@@ -173,7 +173,12 @@ def __init__(
self.hash_length = hash_length
self.upstreams = upstreams
self.lock_cfg = lock_cfg
self.db = spack.database.Database(root, upstream_dbs=upstreams, lock_cfg=lock_cfg)
self.layout = spack.directory_layout.DirectoryLayout(
root, projections=projections, hash_length=hash_length
)
self.db = spack.database.Database(
root, upstream_dbs=upstreams, lock_cfg=lock_cfg, layout=self.layout
)
timeout_format_str = (
f"{str(lock_cfg.package_timeout)}s" if lock_cfg.package_timeout else "No timeout"
@@ -187,13 +192,9 @@ def __init__(
self.root, default_timeout=lock_cfg.package_timeout
)
self.layout = spack.directory_layout.DirectoryLayout(
root, projections=projections, hash_length=hash_length
)
def reindex(self) -> None:
"""Convenience function to reindex the store DB with its own layout."""
return self.db.reindex(self.layout)
return self.db.reindex()
def __reduce__(self):
return Store, (
@@ -261,7 +262,7 @@ def restore(token):
def _construct_upstream_dbs_from_install_roots(
install_roots: List[str], _test: bool = False
install_roots: List[str],
) -> List[spack.database.Database]:
accumulated_upstream_dbs: List[spack.database.Database] = []
for install_root in reversed(install_roots):
@@ -271,7 +272,6 @@ def _construct_upstream_dbs_from_install_roots(
is_upstream=True,
upstream_dbs=upstream_dbs,
)
next_db._fail_when_missing_deps = _test
next_db._read()
accumulated_upstream_dbs.insert(0, next_db)

View File

@@ -592,7 +592,7 @@ def test_setting_attributes(self, default_mock_concretization):
# We can also propagate the settings to classes in the MRO
module_wrapper.propagate_changes_to_mro()
for cls in type(s.package).__mro__:
for cls in s.package.__class__.__mro__:
current_module = cls.module
if current_module == spack.package_base:
break

View File

@@ -379,9 +379,8 @@ def test_buildcache_create_install(
def test_correct_specs_are_pushed(
things_to_install, expected, tmpdir, monkeypatch, default_mock_concretization, temporary_store
):
# Concretize dttop and add it to the temporary database (without prefixes)
spec = default_mock_concretization("dttop")
temporary_store.db.add(spec, directory_layout=None)
spec.package.do_install(fake=True)
slash_hash = f"/{spec.dag_hash()}"
class DontUpload(spack.binary_distribution.Uploader):

View File

@@ -591,14 +591,12 @@ def test_config_prefer_upstream(
"""
mock_db_root = str(tmpdir_factory.mktemp("mock_db_root"))
prepared_db = spack.database.Database(mock_db_root)
upstream_layout = gen_mock_layout("/a/")
prepared_db = spack.database.Database(mock_db_root, layout=gen_mock_layout("/a/"))
for spec in ["hdf5 +mpi", "hdf5 ~mpi", "boost+debug~icu+graph", "dependency-install", "patch"]:
dep = spack.spec.Spec(spec)
dep.concretize()
prepared_db.add(dep, upstream_layout)
prepared_db.add(dep)
downstream_db_root = str(tmpdir_factory.mktemp("mock_downstream_db_root"))
db_for_test = spack.database.Database(downstream_db_root, upstream_dbs=[prepared_db])

View File

@@ -18,8 +18,6 @@
develop = SpackCommand("develop")
env = SpackCommand("env")
pytestmark = pytest.mark.not_on_windows("does not run on windows")
@pytest.mark.usefixtures("mutable_mock_env_path", "mock_packages", "mock_fetch", "mutable_config")
class TestDevelop:

View File

@@ -2336,103 +2336,6 @@ def test_stack_yaml_force_remove_from_matrix(tmpdir):
assert mpileaks_spec not in after_conc
def test_stack_concretize_extraneous_deps(tmpdir, mock_packages):
# FIXME: The new concretizer doesn't handle yet soft
# FIXME: constraints for stacks
# FIXME: This now works for statically-determinable invalid deps
# FIXME: But it still does not work for dynamically determined invalid deps
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
definitions:
- packages: [libelf, mpileaks]
- install:
- matrix:
- [$packages]
- ['^zmpi', '^mpich']
specs:
- $install
"""
)
with tmpdir.as_cwd():
env("create", "test", "./spack.yaml")
with ev.read("test"):
concretize()
test = ev.read("test")
for user, concrete in test.concretized_specs():
assert concrete.concrete
assert not user.concrete
if user.name == "libelf":
assert not concrete.satisfies("^mpi")
elif user.name == "mpileaks":
assert concrete.satisfies("^mpi")
def test_stack_concretize_extraneous_variants(tmpdir, mock_packages):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
definitions:
- packages: [libelf, mpileaks]
- install:
- matrix:
- [$packages]
- ['~shared', '+shared']
specs:
- $install
"""
)
with tmpdir.as_cwd():
env("create", "test", "./spack.yaml")
with ev.read("test"):
concretize()
test = ev.read("test")
for user, concrete in test.concretized_specs():
assert concrete.concrete
assert not user.concrete
if user.name == "libelf":
assert "shared" not in concrete.variants
if user.name == "mpileaks":
assert concrete.variants["shared"].value == user.variants["shared"].value
def test_stack_concretize_extraneous_variants_with_dash(tmpdir, mock_packages):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
definitions:
- packages: [libelf, mpileaks]
- install:
- matrix:
- [$packages]
- ['shared=False', '+shared-libs']
specs:
- $install
"""
)
with tmpdir.as_cwd():
env("create", "test", "./spack.yaml")
with ev.read("test"):
concretize()
ev.read("test")
# Regression test for handling of variants with dashes in them
# will fail before this point if code regresses
assert True
def test_stack_definition_extension(tmpdir):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
@@ -4301,9 +4204,6 @@ def test_env_include_mixed_views(tmp_path, mutable_mock_env_path, mutable_config
{''.join(includes)}
specs:
- mpileaks
packages:
mpileaks:
compiler: [gcc]
"""
)

View File

@@ -44,7 +44,7 @@ def test_find_external_single_package(mock_executable):
assert len(specs_by_package) == 1 and "cmake" in specs_by_package
detected_spec = specs_by_package["cmake"]
assert len(detected_spec) == 1 and detected_spec[0].spec == Spec("cmake@1.foo")
assert len(detected_spec) == 1 and detected_spec[0] == Spec("cmake@1.foo")
def test_find_external_two_instances_same_package(mock_executable):
@@ -61,10 +61,10 @@ def test_find_external_two_instances_same_package(mock_executable):
)
assert len(detected_specs) == 2
spec_to_path = {e.spec: e.prefix for e in detected_specs}
spec_to_path = {s: s.external_path for s in detected_specs}
assert spec_to_path[Spec("cmake@1.foo")] == (
spack.detection.executable_prefix(str(cmake1.parent))
)
), spec_to_path
assert spec_to_path[Spec("cmake@3.17.2")] == (
spack.detection.executable_prefix(str(cmake2.parent))
)
@@ -72,12 +72,8 @@ def test_find_external_two_instances_same_package(mock_executable):
def test_find_external_update_config(mutable_config):
entries = [
spack.detection.DetectedPackage(
Spec.from_detection("cmake@1.foo", external_path="/x/y1/"), "/x/y1/"
),
spack.detection.DetectedPackage(
Spec.from_detection("cmake@3.17.2", external_path="/x/y2/"), "/x/y2/"
),
Spec.from_detection("cmake@1.foo", external_path="/x/y1"),
Spec.from_detection("cmake@3.17.2", external_path="/x/y2"),
]
pkg_to_entries = {"cmake": entries}
@@ -88,8 +84,8 @@ def test_find_external_update_config(mutable_config):
cmake_cfg = pkgs_cfg["cmake"]
cmake_externals = cmake_cfg["externals"]
assert {"spec": "cmake@1.foo", "prefix": "/x/y1/"} in cmake_externals
assert {"spec": "cmake@3.17.2", "prefix": "/x/y2/"} in cmake_externals
assert {"spec": "cmake@1.foo", "prefix": "/x/y1"} in cmake_externals
assert {"spec": "cmake@3.17.2", "prefix": "/x/y2"} in cmake_externals
def test_get_executables(working_env, mock_executable):
@@ -229,19 +225,15 @@ def test_find_external_merge(mutable_config, mutable_mock_repo, tmp_path):
"""Checks that 'spack find external' doesn't overwrite an existing spec in packages.yaml."""
pkgs_cfg_init = {
"find-externals1": {
"externals": [{"spec": "find-externals1@1.1", "prefix": "/preexisting-prefix/"}],
"externals": [{"spec": "find-externals1@1.1", "prefix": "/preexisting-prefix"}],
"buildable": False,
}
}
mutable_config.update_config("packages", pkgs_cfg_init)
entries = [
spack.detection.DetectedPackage(
Spec.from_detection("find-externals1@1.1", external_path="/x/y1/"), "/x/y1/"
),
spack.detection.DetectedPackage(
Spec.from_detection("find-externals1@1.2", external_path="/x/y2/"), "/x/y2/"
),
Spec.from_detection("find-externals1@1.1", external_path="/x/y1"),
Spec.from_detection("find-externals1@1.2", external_path="/x/y2"),
]
pkg_to_entries = {"find-externals1": entries}
scope = spack.config.default_modify_scope("packages")
@@ -251,8 +243,8 @@ def test_find_external_merge(mutable_config, mutable_mock_repo, tmp_path):
pkg_cfg = pkgs_cfg["find-externals1"]
pkg_externals = pkg_cfg["externals"]
assert {"spec": "find-externals1@1.1", "prefix": "/preexisting-prefix/"} in pkg_externals
assert {"spec": "find-externals1@1.2", "prefix": "/x/y2/"} in pkg_externals
assert {"spec": "find-externals1@1.1", "prefix": "/preexisting-prefix"} in pkg_externals
assert {"spec": "find-externals1@1.2", "prefix": "/x/y2"} in pkg_externals
def test_list_detectable_packages(mutable_config, mutable_mock_repo):
@@ -278,7 +270,7 @@ def _determine_variants(cls, exes, version_str):
assert len(detected_specs) == 1
gcc = detected_specs[0].spec
gcc = detected_specs[0]
assert gcc.name == "gcc"
assert gcc.external_path == os.path.sep + os.path.join("opt", "gcc", "bin")

View File

@@ -334,7 +334,6 @@ def test_find_command_basic_usage(database):
assert "mpileaks" in output
@pytest.mark.not_on_windows("envirnment is not yet supported on windows")
@pytest.mark.regression("9875")
def test_find_prefix_in_env(
mutable_mock_env_path, install_mockery, mock_fetch, mock_packages, mock_archive

View File

@@ -16,8 +16,6 @@
add = spack.main.SpackCommand("add")
install = spack.main.SpackCommand("install")
pytestmark = pytest.mark.not_on_windows("does not run on windows")
@pytest.mark.db
def test_gc_without_build_dependency(mutable_database):

View File

@@ -19,7 +19,6 @@
import spack.cmd.common.arguments
import spack.cmd.install
import spack.compilers as compilers
import spack.config
import spack.environment as ev
import spack.hash_types as ht
@@ -29,7 +28,7 @@
from spack.error import SpackError
from spack.main import SpackCommand
from spack.parser import SpecSyntaxError
from spack.spec import CompilerSpec, Spec
from spack.spec import Spec
install = SpackCommand("install")
env = SpackCommand("env")
@@ -916,68 +915,6 @@ def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
assert "foo: No such file or directory" in content
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
def test_compiler_bootstrap(
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch
):
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
spack.config.set("config:install_missing_compilers", True)
assert CompilerSpec("gcc@=12.0") not in compilers.all_compiler_specs()
# Test succeeds if it does not raise an error
install("pkg-a%gcc@=12.0")
@pytest.mark.not_on_windows("Binary mirrors not supported on windows")
def test_compiler_bootstrap_from_binary_mirror(
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch, tmpdir
):
"""
Make sure installing compiler from buildcache registers compiler
"""
# Create a temp mirror directory for buildcache usage
mirror_dir = tmpdir.join("mirror_dir")
mirror_url = "file://{0}".format(mirror_dir.strpath)
# Install a compiler, because we want to put it in a buildcache
install("gcc@=10.2.0")
# Put installed compiler in the buildcache
buildcache("push", "-u", "-f", mirror_dir.strpath, "gcc@10.2.0")
# Now uninstall the compiler
uninstall("-y", "gcc@10.2.0")
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
spack.config.set("config:install_missing_compilers", True)
assert CompilerSpec("gcc@=10.2.0") not in compilers.all_compiler_specs()
# Configure the mirror where we put that buildcache w/ the compiler
mirror("add", "test-mirror", mirror_url)
# Now make sure that when the compiler is installed from binary mirror,
# it also gets configured as a compiler. Test succeeds if it does not
# raise an error
install("--no-check-signature", "--cache-only", "--only", "dependencies", "pkg-b%gcc@=10.2.0")
install("--no-cache", "--only", "package", "pkg-b%gcc@10.2.0")
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
@pytest.mark.regression("16221")
def test_compiler_bootstrap_already_installed(
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch
):
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
spack.config.set("config:install_missing_compilers", True)
assert CompilerSpec("gcc@=12.0") not in compilers.all_compiler_specs()
# Test succeeds if it does not raise an error
install("gcc@=12.0")
install("pkg-a%gcc@=12.0")
def test_install_fails_no_args(tmpdir):
# ensure no spack.yaml in directory
with tmpdir.as_cwd():

View File

@@ -31,7 +31,6 @@ def test_it_just_runs(pkg):
"mpilander",
"mvapich2",
"openmpi",
"openmpi@1.6.5",
"openmpi@1.7.5:",
"openmpi@2.0.0:",
"spectrum-mpi",

View File

@@ -2,9 +2,10 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
import spack.store
from spack.database import Database
from spack.main import SpackCommand
install = SpackCommand("install")
@@ -23,31 +24,55 @@ def test_reindex_basic(mock_packages, mock_archive, mock_fetch, install_mockery)
assert spack.store.STORE.db.query() == all_installed
def test_reindex_db_deleted(mock_packages, mock_archive, mock_fetch, install_mockery):
def _clear_db(tmp_path):
empty_db = Database(str(tmp_path))
with empty_db.write_transaction():
pass
shutil.rmtree(spack.store.STORE.db.database_directory)
shutil.copytree(empty_db.database_directory, spack.store.STORE.db.database_directory)
# force a re-read of the database
assert len(spack.store.STORE.db.query()) == 0
def test_reindex_db_deleted(mock_packages, mock_archive, mock_fetch, install_mockery, tmp_path):
install("libelf@0.8.13")
install("libelf@0.8.12")
all_installed = spack.store.STORE.db.query()
os.remove(spack.store.STORE.db._index_path)
_clear_db(tmp_path)
reindex()
assert spack.store.STORE.db.query() == all_installed
def test_reindex_with_deprecated_packages(
mock_packages, mock_archive, mock_fetch, install_mockery
mock_packages, mock_archive, mock_fetch, install_mockery, tmp_path
):
install("libelf@0.8.13")
install("libelf@0.8.12")
deprecate("-y", "libelf@0.8.12", "libelf@0.8.13")
all_installed = spack.store.STORE.db.query(installed=any)
non_deprecated = spack.store.STORE.db.query(installed=True)
db = spack.store.STORE.db
all_installed = db.query(installed=any)
non_deprecated = db.query(installed=True)
_clear_db(tmp_path)
os.remove(spack.store.STORE.db._index_path)
reindex()
assert spack.store.STORE.db.query(installed=any) == all_installed
assert spack.store.STORE.db.query(installed=True) == non_deprecated
assert db.query(installed=any) == all_installed
assert db.query(installed=True) == non_deprecated
old_libelf = db.query_local_by_spec_hash(
db.query_local("libelf@0.8.12", installed=any)[0].dag_hash()
)
new_libelf = db.query_local_by_spec_hash(
db.query_local("libelf@0.8.13", installed=True)[0].dag_hash()
)
assert old_libelf.deprecated_for == new_libelf.spec.dag_hash()
assert new_libelf.deprecated_for is None
assert new_libelf.ref_count == 1

View File

@@ -50,7 +50,6 @@ def fake_stage(pkg, mirror_only=False):
return expected_path
@pytest.mark.not_on_windows("PermissionError")
def test_stage_path(check_stage_path):
"""Verify that --path only works with single specs."""
stage("--path={0}".format(check_stage_path), "trivial-install-test-package")

View File

@@ -2,10 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest
import spack.environment as ev
import spack.spec
from spack.main import SpackCommand
@@ -14,8 +10,6 @@
env = SpackCommand("env")
concretize = SpackCommand("concretize")
pytestmark = pytest.mark.not_on_windows("does not run on windows")
def test_undevelop(tmpdir, mutable_config, mock_packages, mutable_mock_env_path):
# setup environment

View File

@@ -205,7 +205,6 @@ def _warn(*args, **kwargs):
# Note: I want to use https://docs.pytest.org/en/7.1.x/how-to/skipping.html#skip-all-test-functions-of-a-class-or-module
# the style formatter insists on separating these two lines.
@pytest.mark.not_on_windows("Envs unsupported on Windows")
class TestUninstallFromEnv:
"""Tests an installation with two environments e1 and e2, which each have
shared package installations:

View File

@@ -13,6 +13,7 @@
import llnl.util.lang
import spack.binary_distribution
import spack.compiler
import spack.compilers
import spack.concretize
@@ -400,14 +401,6 @@ def test_spec_flags_maintain_order(self, mutable_config, gcc11_with_flags):
s.compiler_flags[x] == ["-O0", "-g"] for x in ("cflags", "cxxflags", "fflags")
)
@pytest.mark.xfail(reason="Broken, needs to be fixed")
def test_compiler_flags_from_compiler_and_dependent(self):
client = Spec("cmake-client %clang@12.2.0 platform=test os=fe target=fe cflags==-g")
client.concretize()
cmake = client["cmake"]
for spec in [client, cmake]:
assert spec.compiler_flags["cflags"] == ["-O3", "-g"]
def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12_with_flags):
mutable_config.set("compilers", [clang12_with_flags])
# Correct arch to use test compiler that has flags
@@ -441,6 +434,13 @@ def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12
["hypre cflags='-g'", "^openblas cflags='-O3'"],
["^openblas cflags='-g'"],
),
# Setting propagation on parent and dependency -> the
# dependency propagation flags override
(
"hypre cflags=='-g' ^openblas cflags=='-O3'",
["hypre cflags='-g'", "^openblas cflags='-O3'"],
["^openblas cflags='-g'"],
),
# Propagation doesn't go across build dependencies
(
"cmake-client cflags=='-O2 -g'",
@@ -648,20 +648,6 @@ def test_external_package(self):
assert "externalprereq" not in spec
assert spec["externaltool"].compiler.satisfies("gcc")
def test_external_package_module(self):
# No tcl modules on darwin/linux machines
# and Windows does not (currently) allow for bash calls
# TODO: improved way to check for this.
platform = spack.platforms.real_host().name
if platform == "darwin" or platform == "linux" or platform == "windows":
return
spec = Spec("externalmodule")
spec.concretize()
assert spec["externalmodule"].external_modules == ["external-module"]
assert "externalprereq" not in spec
assert spec["externalmodule"].compiler.satisfies("gcc")
def test_nobuild_package(self):
"""Test that a non-buildable package raise an error if no specs
in packages.yaml are compatible with the request.
@@ -775,15 +761,15 @@ def test_regression_issue_7239(self):
s = Spec("mpileaks")
s.concretize()
assert llnl.util.lang.ObjectWrapper not in type(s).__mro__
assert llnl.util.lang.ObjectWrapper not in s.__class__.__mro__
# Spec wrapped in a build interface
build_interface = s["mpileaks"]
assert llnl.util.lang.ObjectWrapper in type(build_interface).__mro__
assert llnl.util.lang.ObjectWrapper in build_interface.__class__.__mro__
# Mimics asking the build interface from a build interface
build_interface = s["mpileaks"]["mpileaks"]
assert llnl.util.lang.ObjectWrapper in type(build_interface).__mro__
assert llnl.util.lang.ObjectWrapper in build_interface.__class__.__mro__
@pytest.mark.regression("7705")
def test_regression_issue_7705(self):
@@ -1301,7 +1287,7 @@ def mock_fn(*args, **kwargs):
return [first_spec]
if mock_db:
temporary_store.db.add(first_spec, None)
temporary_store.db.add(first_spec)
else:
monkeypatch.setattr(spack.binary_distribution, "update_cache_and_get_specs", mock_fn)
@@ -1366,7 +1352,7 @@ def test_no_reuse_when_variant_condition_does_not_hold(self, mutable_database, m
def test_reuse_with_flags(self, mutable_database, mutable_config):
spack.config.set("concretizer:reuse", True)
spec = Spec("pkg-a cflags=-g cxxflags=-g").concretized()
spack.store.STORE.db.add(spec, None)
spec.package.do_install(fake=True)
testspec = Spec("pkg-a cflags=-g")
testspec.concretize()
@@ -2109,11 +2095,13 @@ def test_external_python_extension_find_dependency_from_detection(self, monkeypa
"""Test that python extensions have access to a python dependency
when python isn't otherwise in the DAG"""
python_spec = Spec("python@=detected")
prefix = os.path.sep + "fake"
python_spec = Spec.from_detection("python@=detected", external_path=prefix)
def find_fake_python(classes, path_hints):
return {"python": [spack.detection.DetectedPackage(python_spec, prefix=path_hints[0])]}
return {
"python": [Spec.from_detection("python@=detected", external_path=path_hints[0])]
}
monkeypatch.setattr(spack.detection, "by_path", find_fake_python)
external_conf = {
@@ -2128,7 +2116,8 @@ def find_fake_python(classes, path_hints):
assert "python" in spec["py-extension1"]
assert spec["python"].prefix == prefix
assert spec["python"] == python_spec
assert spec["python"].external
assert spec["python"].satisfies(python_spec)
def test_external_python_extension_find_unified_python(self):
"""Test that python extensions use the same python as other specs in unified env"""
@@ -2386,26 +2375,6 @@ def test_externals_with_platform_explicitly_set(self, tmp_path):
s = Spec("mpich").concretized()
assert s.external
@pytest.mark.regression("43875")
def test_concretize_missing_compiler(self, mutable_config, monkeypatch):
"""Tests that Spack can concretize a spec with a missing compiler when the
option is active.
"""
def _default_libc(self):
if self.cc is None:
return None
return Spec("glibc@=2.28")
monkeypatch.setattr(spack.concretize.Concretizer, "check_for_compiler_existence", False)
monkeypatch.setattr(spack.compiler.Compiler, "default_libc", property(_default_libc))
monkeypatch.setattr(
spack.util.libc, "libc_from_current_python_process", lambda: Spec("glibc@=2.28")
)
mutable_config.set("config:install_missing_compilers", True)
s = Spec("pkg-a %gcc@=13.2.0").concretized()
assert s.satisfies("%gcc@13.2.0")
@pytest.mark.regression("43267")
def test_spec_with_build_dep_from_json(self, tmp_path):
"""Tests that we can correctly concretize a spec, when we express its dependency as a
@@ -2959,7 +2928,7 @@ def test_concretization_version_order():
result = [
v
for v, _ in sorted(
versions, key=spack.solver.asp._concretization_version_order, reverse=True
versions, key=spack.solver.asp.concretization_version_order, reverse=True
)
]
assert result == [

View File

@@ -8,8 +8,6 @@
import spack.solver.asp
import spack.spec
pytestmark = [pytest.mark.not_on_windows("Windows uses old concretizer")]
version_error_messages = [
"Cannot satisfy 'fftw@:1.0' and 'fftw@1.1:",
" required because quantum-espresso depends on fftw@:1.0",

View File

@@ -19,8 +19,6 @@
from spack.test.conftest import create_test_repo
from spack.util.url import path_to_file_url
pytestmark = [pytest.mark.not_on_windows("Windows uses old concretizer")]
def update_packages_config(conf_str):
conf = syaml.load_config(conf_str)

View File

@@ -27,7 +27,6 @@ def test_listing_possible_os():
assert expected_os in output
@pytest.mark.not_on_windows("test unsupported on Windows")
@pytest.mark.maybeslow
@pytest.mark.requires_executables("git")
def test_bootstrap_phase(minimal_configuration, config_dumper, capsys):

View File

@@ -7,6 +7,7 @@
import functools
import json
import os
import re
import shutil
import sys
@@ -40,20 +41,21 @@
@pytest.fixture()
def upstream_and_downstream_db(tmpdir, gen_mock_layout):
mock_db_root = str(tmpdir.mkdir("mock_db_root"))
upstream_write_db = spack.database.Database(mock_db_root)
upstream_db = spack.database.Database(mock_db_root, is_upstream=True)
upstream_layout = gen_mock_layout("/a/")
upstream_write_db = spack.database.Database(mock_db_root, layout=upstream_layout)
upstream_db = spack.database.Database(mock_db_root, is_upstream=True, layout=upstream_layout)
# Generate initial DB file to avoid reindex
with open(upstream_write_db._index_path, "w") as db_file:
upstream_write_db._write_to_file(db_file)
upstream_layout = gen_mock_layout("/a/")
downstream_db_root = str(tmpdir.mkdir("mock_downstream_db_root"))
downstream_db = spack.database.Database(downstream_db_root, upstream_dbs=[upstream_db])
downstream_db = spack.database.Database(
downstream_db_root, upstream_dbs=[upstream_db], layout=gen_mock_layout("/b/")
)
with open(downstream_db._index_path, "w") as db_file:
downstream_db._write_to_file(db_file)
downstream_layout = gen_mock_layout("/b/")
yield upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout
yield upstream_write_db, upstream_db, downstream_db
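Taken together, the fixture changes above capture the API shift running through this diff: spack.database.Database now takes its directory layout at construction time, and Database.add no longer accepts a layout argument. A minimal sketch of the new call shape (layout=None is also accepted, as the trailing-data test further down shows; the root path is hypothetical):

import spack.database
import spack.spec

# The layout is bound once at construction; add() derives record paths
# from db.layout instead of a per-call argument.
db = spack.database.Database("/tmp/db_root", layout=None)
spec = spack.spec.Spec("pkg-b").concretized()  # assumes a mock-packages test context
db.add(spec)  # formerly: db.add(spec, layout)
assert spec in db.query_local()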
@pytest.mark.parametrize(
@@ -69,14 +71,14 @@ def upstream_and_downstream_db(tmpdir, gen_mock_layout):
def test_query_by_install_tree(
install_tree, result, upstream_and_downstream_db, mock_packages, monkeypatch, config
):
up_write_db, up_db, up_layout, down_db, down_layout = upstream_and_downstream_db
up_write_db, up_db, down_db = upstream_and_downstream_db
# Set the upstream DB to contain "pkg-c" and the downstream DB to contain "pkg-b"
b = spack.spec.Spec("pkg-b").concretized()
c = spack.spec.Spec("pkg-c").concretized()
up_write_db.add(c, up_layout)
up_write_db.add(c)
up_db._read()
down_db.add(b, down_layout)
down_db.add(b)
specs = down_db.query(install_tree=install_tree.format(u=up_db.root, d=down_db.root))
assert [s.name for s in specs] == result
@@ -86,9 +88,7 @@ def test_spec_installed_upstream(
upstream_and_downstream_db, mock_custom_repository, config, monkeypatch
):
"""Test whether Spec.installed_upstream() works."""
upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout = (
upstream_and_downstream_db
)
upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
# a known installed spec should say that it's installed
with spack.repo.use_repositories(mock_custom_repository):
@@ -96,7 +96,7 @@ def test_spec_installed_upstream(
assert not spec.installed
assert not spec.installed_upstream
upstream_write_db.add(spec, upstream_layout)
upstream_write_db.add(spec)
upstream_db._read()
monkeypatch.setattr(spack.store.STORE, "db", downstream_db)
@@ -112,9 +112,7 @@ def test_spec_installed_upstream(
@pytest.mark.usefixtures("config")
def test_installed_upstream(upstream_and_downstream_db, tmpdir):
upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout = (
upstream_and_downstream_db
)
upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("x")
@@ -125,7 +123,7 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("w").concretized()
for dep in spec.traverse(root=False):
upstream_write_db.add(dep, upstream_layout)
upstream_write_db.add(dep)
upstream_db._read()
for dep in spec.traverse(root=False):
@@ -135,11 +133,11 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
upstream_db.get_by_hash(dep.dag_hash())
new_spec = spack.spec.Spec("w").concretized()
downstream_db.add(new_spec, downstream_layout)
downstream_db.add(new_spec)
for dep in new_spec.traverse(root=False):
upstream, record = downstream_db.query_by_spec_hash(dep.dag_hash())
assert upstream
assert record.path == upstream_layout.path_for_spec(dep)
assert record.path == upstream_db.layout.path_for_spec(dep)
upstream, record = downstream_db.query_by_spec_hash(new_spec.dag_hash())
assert not upstream
assert record.installed
@@ -148,32 +146,32 @@ def test_installed_upstream(upstream_and_downstream_db, tmpdir):
downstream_db._check_ref_counts()
@pytest.mark.usefixtures("config")
def test_removed_upstream_dep(upstream_and_downstream_db, tmpdir):
upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout = (
upstream_and_downstream_db
)
def test_removed_upstream_dep(upstream_and_downstream_db, tmpdir, capsys, config):
upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("z")
builder.add_package("y", dependencies=[("z", None, None)])
with spack.repo.use_repositories(builder):
spec = spack.spec.Spec("y").concretized()
y = spack.spec.Spec("y").concretized()
z = y["z"]
upstream_write_db.add(spec["z"], upstream_layout)
# add the dependency to the upstream DB and the dependent to the downstream DB
upstream_write_db.add(z)
upstream_db._read()
downstream_db.add(y)
# remove the dependency from the upstream DB
upstream_write_db.remove(z)
upstream_db._read()
new_spec = spack.spec.Spec("y").concretized()
downstream_db.add(new_spec, downstream_layout)
upstream_write_db.remove(new_spec["z"])
upstream_db._read()
new_downstream = spack.database.Database(downstream_db.root, upstream_dbs=[upstream_db])
new_downstream._fail_when_missing_deps = True
with pytest.raises(spack.database.MissingDependenciesError):
new_downstream._read()
# then rereading the downstream DB should warn about the missing dep
downstream_db._read_from_file(downstream_db._index_path)
assert (
f"Missing dependency not in database: y/{y.dag_hash(7)} needs z"
in capsys.readouterr().err
)
@pytest.mark.usefixtures("config")
@@ -182,9 +180,7 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db, tmpdir):
DB. When a package is recorded as installed in both, the results should
refer to the downstream DB.
"""
upstream_write_db, upstream_db, upstream_layout, downstream_db, downstream_layout = (
upstream_and_downstream_db
)
upstream_write_db, upstream_db, downstream_db = upstream_and_downstream_db
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("x")
@@ -192,8 +188,8 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db, tmpdir):
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("x").concretized()
downstream_db.add(spec, downstream_layout)
upstream_write_db.add(spec, upstream_layout)
downstream_db.add(spec)
upstream_write_db.add(spec)
upstream_db._read()
upstream, record = downstream_db.query_by_spec_hash(spec.dag_hash())
@@ -207,33 +203,22 @@ def test_add_to_upstream_after_downstream(upstream_and_downstream_db, tmpdir):
try:
orig_db = spack.store.STORE.db
spack.store.STORE.db = downstream_db
assert queried_spec.prefix == downstream_layout.path_for_spec(spec)
assert queried_spec.prefix == downstream_db.layout.path_for_spec(spec)
finally:
spack.store.STORE.db = orig_db
@pytest.mark.usefixtures("config", "temporary_store")
def test_cannot_write_upstream(tmpdir, gen_mock_layout):
roots = [str(tmpdir.mkdir(x)) for x in ["a", "b"]]
layouts = [gen_mock_layout(x) for x in ["/ra/", "/rb/"]]
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"))
builder.add_package("x")
def test_cannot_write_upstream(tmp_path, mock_packages, config):
# Instantiate the database that will be used as the upstream DB and make
# sure it has an index file
upstream_db_independent = spack.database.Database(roots[1])
with upstream_db_independent.write_transaction():
with spack.database.Database(str(tmp_path)).write_transaction():
pass
upstream_dbs = spack.store._construct_upstream_dbs_from_install_roots([roots[1]], _test=True)
# Create it as an upstream
db = spack.database.Database(str(tmp_path), is_upstream=True)
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("x")
spec.concretize()
with pytest.raises(spack.database.ForbiddenLockError):
upstream_dbs[0].add(spec, layouts[1])
with pytest.raises(spack.database.ForbiddenLockError):
db.add(spack.spec.Spec("pkg-a").concretized())
@pytest.mark.usefixtures("config", "temporary_store")
@@ -248,17 +233,17 @@ def test_recursive_upstream_dbs(tmpdir, gen_mock_layout):
with spack.repo.use_repositories(builder.root):
spec = spack.spec.Spec("x").concretized()
db_c = spack.database.Database(roots[2])
db_c.add(spec["z"], layouts[2])
db_c = spack.database.Database(roots[2], layout=layouts[2])
db_c.add(spec["z"])
db_b = spack.database.Database(roots[1], upstream_dbs=[db_c])
db_b.add(spec["y"], layouts[1])
db_b = spack.database.Database(roots[1], upstream_dbs=[db_c], layout=layouts[1])
db_b.add(spec["y"])
db_a = spack.database.Database(roots[0], upstream_dbs=[db_b, db_c])
db_a.add(spec["x"], layouts[0])
db_a = spack.database.Database(roots[0], upstream_dbs=[db_b, db_c], layout=layouts[0])
db_a.add(spec["x"])
upstream_dbs_from_scratch = spack.store._construct_upstream_dbs_from_install_roots(
[roots[1], roots[2]], _test=True
[roots[1], roots[2]]
)
db_a_from_scratch = spack.database.Database(
roots[0], upstream_dbs=upstream_dbs_from_scratch
@@ -366,7 +351,7 @@ def _check_db_sanity(database):
_check_merkleiness()
def _check_remove_and_add_package(database, spec):
def _check_remove_and_add_package(database: spack.database.Database, spec):
"""Remove a spec from the DB, then add it and make sure everything's
still ok once it is added. This checks that it was
removed, that it's back when added again, and that ref
@@ -386,7 +371,7 @@ def _check_remove_and_add_package(database, spec):
assert concrete_spec not in remaining
# add it back and make sure everything is ok.
database.add(concrete_spec, spack.store.STORE.layout)
database.add(concrete_spec)
installed = database.query()
assert concrete_spec in installed
assert installed == original
@@ -396,7 +381,7 @@ def _check_remove_and_add_package(database, spec):
database._check_ref_counts()
def _mock_install(spec):
def _mock_install(spec: str):
s = spack.spec.Spec(spec).concretized()
s.package.do_install(fake=True)
@@ -636,7 +621,7 @@ def test_080_root_ref_counts(mutable_database):
assert mutable_database.get_record("mpich").ref_count == 1
# Put the spec back
mutable_database.add(rec.spec, spack.store.STORE.layout)
mutable_database.add(rec.spec)
# record is present again
assert len(mutable_database.query("mpileaks ^mpich", installed=any)) == 1
@@ -998,9 +983,12 @@ def test_reindex_removed_prefix_is_not_installed(mutable_database, mock_store, c
# Reindex should pick up libelf as a dependency of libdwarf
spack.store.STORE.reindex()
# Reindexing should warn about libelf not being found on the filesystem
err = capfd.readouterr()[1]
assert "this directory does not contain an installation of the spec" in err
# Reindexing should warn about libelf not found on the filesystem
assert re.search(
"libelf@0.8.13.+ was marked installed in the database "
"but was not found on the file system",
capfd.readouterr().err,
)
# And we should still have libelf in the database, but not installed.
assert not mutable_database.query_one("libelf", installed=True)
@@ -1117,9 +1105,9 @@ def test_database_construction_doesnt_use_globals(tmpdir, config, nullify_global
def test_database_read_works_with_trailing_data(tmp_path, default_mock_concretization):
# Populate a database
root = str(tmp_path)
db = spack.database.Database(root)
db = spack.database.Database(root, layout=None)
spec = default_mock_concretization("pkg-a")
db.add(spec, directory_layout=None)
db.add(spec)
specs_in_db = db.query_local()
assert spec in specs_in_db
@@ -1140,3 +1128,53 @@ def test_database_errors_with_just_a_version_key(tmp_path):
with pytest.raises(spack.database.InvalidDatabaseVersionError):
spack.database.Database(root).query_local()
def test_reindex_with_upstreams(tmp_path, monkeypatch, mock_packages, config):
# Reindexing should not put install records of upstream entries into the local database. Here
# we install `mpileaks` locally with dependencies in the upstream. And we even install
# `mpileaks` with the same hash in the upstream. After reindexing, `mpileaks` should still be
# in the local db, and `callpath` should not.
mpileaks = spack.spec.Spec("mpileaks").concretized()
callpath = mpileaks.dependencies("callpath")[0]
upstream_store = spack.store.create(
{"config": {"install_tree": {"root": str(tmp_path / "upstream")}}}
)
monkeypatch.setattr(spack.store, "STORE", upstream_store)
callpath.package.do_install(fake=True)
local_store = spack.store.create(
{
"config": {"install_tree": {"root": str(tmp_path / "local")}},
"upstreams": {"my-upstream": {"install_tree": str(tmp_path / "upstream")}},
}
)
monkeypatch.setattr(spack.store, "STORE", local_store)
mpileaks.package.do_install(fake=True)
# Sanity check that callpath is from upstream.
assert not local_store.db.query_local("callpath")
assert local_store.db.query("callpath")
# Install mpileaks also upstream with the same hash to ensure that determining upstreamness
# checks local installs before upstream databases, even when the local database is being
# reindexed.
monkeypatch.setattr(spack.store, "STORE", upstream_store)
mpileaks.package.do_install(fake=True)
# Delete the local database
shutil.rmtree(local_store.db.database_directory)
# Create a new instance s.t. we don't have cached specs in memory
reindexed_local_store = spack.store.create(
{
"config": {"install_tree": {"root": str(tmp_path / "local")}},
"upstreams": {"my-upstream": {"install_tree": str(tmp_path / "upstream")}},
}
)
reindexed_local_store.db.reindex()
assert not reindexed_local_store.db.query_local("callpath")
assert reindexed_local_store.db.query("callpath") == [callpath]
assert reindexed_local_store.db.query_local("mpileaks") == [mpileaks]
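For reference, the store-creation dict used in this test has the same shape a config scope would provide; a small sketch of chaining a local store to a single upstream (the names and paths are hypothetical):

import spack.store

local_store = spack.store.create(
    {
        "config": {"install_tree": {"root": "/opt/spack/local"}},  # hypothetical root
        "upstreams": {"shared": {"install_tree": "/opt/spack/upstream"}},  # hypothetical upstream
    }
)
# Queries consult the local DB first and then fall through to upstream DBs.
results = local_store.db.query("mpileaks")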

View File

@@ -11,11 +11,7 @@
def test_detection_update_config(mutable_config):
# mock detected package
detected_packages = collections.defaultdict(list)
detected_packages["cmake"] = [
spack.detection.common.DetectedPackage(
spec=spack.spec.Spec("cmake@3.27.5"), prefix="/usr/bin"
)
]
detected_packages["cmake"] = [spack.spec.Spec("cmake@3.27.5", external_path="/usr/bin")]
# update config for new package
spack.detection.common.update_configuration(detected_packages)
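The hunk above replaces spack.detection.common.DetectedPackage with plain Spec objects that carry their prefix in external_path. A minimal sketch of the new shape, reusing the cmake example from the test:

import collections

import spack.spec

detected = collections.defaultdict(list)
# Each detected external is now just a Spec; the prefix travels on the
# Spec itself rather than in a DetectedPackage wrapper.
detected["cmake"] = [spack.spec.Spec("cmake@3.27.5", external_path="/usr/bin")]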

View File

@@ -860,3 +860,33 @@ def test_env_view_on_non_empty_dir_errors(tmp_path, config, mock_packages, tempo
env.install_all(fake=True)
with pytest.raises(ev.SpackEnvironmentError, match="because it is a non-empty dir"):
env.regenerate_views()
@pytest.mark.parametrize(
"matrix_line", [("^zmpi", "^mpich"), ("~shared", "+shared"), ("shared=False", "+shared-libs")]
)
@pytest.mark.regression("40791")
def test_stack_enforcement_is_strict(tmp_path, matrix_line, config, mock_packages):
"""Ensure that constraints in matrices are applied strictly after expansion, to avoid
inconsistencies between abstract user specs and concrete specs.
"""
manifest = tmp_path / "spack.yaml"
manifest.write_text(
f"""\
spack:
definitions:
- packages: [libelf, mpileaks]
- install:
- matrix:
- [$packages]
- [{", ".join(item for item in matrix_line)}]
specs:
- $install
concretizer:
unify: false
"""
)
# Here we raise different exceptions depending on whether we solve serially or not
with pytest.raises(Exception):
with ev.Environment(tmp_path) as e:
e.concretize()

View File

@@ -0,0 +1,333 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest
import spack.build_systems.generic
import spack.config
import spack.environment as ev
import spack.error
import spack.package_base
import spack.repo
import spack.util.spack_yaml as syaml
import spack.version
from spack.spec import Spec
from spack.test.conftest import create_test_repo
"""
These tests include the following package DAGs:
Firstly, w, x, and y, where w and x apply cflags to y.
w
|\
x |
|/
y
Secondly, v and y, where v does not apply cflags to y; this is for testing
mixing with compiler flag propagation in the absence of compiler flags applied
by dependents.
v
|
y
Finally, a diamond DAG to check that the topological order is resolved into
a total order:
t
|\
u x
|/
y
"""
_pkgx = (
"x",
"""\
class X(Package):
version("1.1")
version("1.0")
variant("activatemultiflag", default=False)
depends_on('y cflags="-d1"', when="~activatemultiflag")
depends_on('y cflags="-d1 -d2"', when="+activatemultiflag")
""",
)
_pkgy = (
"y",
"""\
class Y(Package):
version("2.1")
version("2.0")
""",
)
_pkgw = (
"w",
"""\
class W(Package):
version("3.1")
version("3.0")
variant("moveflaglater", default=False)
depends_on('x +activatemultiflag')
depends_on('y cflags="-d0"', when="~moveflaglater")
depends_on('y cflags="-d3"', when="+moveflaglater")
""",
)
_pkgv = (
"v",
"""\
class V(Package):
version("4.1")
version("4.0")
depends_on("y")
""",
)
_pkgt = (
"t",
"""\
class T(Package):
version("5.0")
depends_on("u")
depends_on("x+activatemultiflag")
depends_on("y cflags='-c1 -c2'")
""",
)
_pkgu = (
"u",
"""\
class U(Package):
version("6.0")
depends_on("y cflags='-e1 -e2'")
""",
)
@pytest.fixture
def _create_test_repo(tmpdir, mutable_config):
yield create_test_repo(tmpdir, [_pkgt, _pkgu, _pkgv, _pkgw, _pkgx, _pkgy])
@pytest.fixture
def test_repo(_create_test_repo, monkeypatch, mock_stage):
with spack.repo.use_repositories(_create_test_repo) as mock_repo_path:
yield mock_repo_path
def update_concretize_scope(conf_str, section):
conf = syaml.load_config(conf_str)
spack.config.set(section, conf[section], scope="concretize")
def test_mix_spec_and_requirements(concretize_scope, test_repo):
conf_str = """\
packages:
y:
require: cflags="-c"
"""
update_concretize_scope(conf_str, "packages")
s1 = Spec('y cflags="-a"').concretized()
assert s1.satisfies('cflags="-a -c"')
def test_mix_spec_and_dependent(concretize_scope, test_repo):
s1 = Spec('x ^y cflags="-a"').concretized()
assert s1["y"].satisfies('cflags="-a -d1"')
def _compiler_cfg_one_entry_with_cflags(cflags):
return f"""\
compilers::
- compiler:
spec: gcc@12.100.100
paths:
cc: /usr/bin/fake-gcc
cxx: /usr/bin/fake-g++
f77: null
fc: null
flags:
cflags: {cflags}
operating_system: debian6
modules: []
"""
def test_mix_spec_and_compiler_cfg(concretize_scope, test_repo):
conf_str = _compiler_cfg_one_entry_with_cflags("-Wall")
update_concretize_scope(conf_str, "compilers")
s1 = Spec('y %gcc@12.100.100 cflags="-O2"').concretized()
assert s1.satisfies('cflags="-Wall -O2"')
@pytest.mark.parametrize(
"cmd_flags,req_flags,cmp_flags,dflags,expected_order",
[
("-a -b", "-c", None, False, "-c -a -b"),
("-x7 -x4", "-x5 -x6", None, False, "-x5 -x6 -x7 -x4"),
("-x7 -x4", "-x5 -x6", "-x3 -x8", False, "-x3 -x8 -x5 -x6 -x7 -x4"),
("-x7 -x4", "-x5 -x6", "-x3 -x8", True, "-x3 -x8 -d1 -d2 -x5 -x6 -x7 -x4"),
("-x7 -x4", None, "-x3 -x8", False, "-x3 -x8 -x7 -x4"),
("-x7 -x4", None, "-x3 -x8", True, "-x3 -x8 -d1 -d2 -x7 -x4"),
# The remaining tests cover cases of intersection
("-a -b", "-a -c", None, False, "-c -a -b"),
("-a -b", None, "-a -c", False, "-c -a -b"),
("-a -b", "-a -c", "-a -d", False, "-d -c -a -b"),
("-a -d2 -d1", "-d2 -c", "-d1 -b", True, "-b -c -a -d2 -d1"),
("-a", "-d0 -d2 -c", "-d1 -b", True, "-b -d1 -d0 -d2 -c -a"),
],
)
def test_flag_order_and_grouping(
concretize_scope, test_repo, cmd_flags, req_flags, cmp_flags, dflags, expected_order
):
"""Check consistent flag ordering and grouping on a package "y"
with flags introduced from a variety of sources.
The ordering rules are explained in ``asp.SpecBuilder.reorder_flags``.
"""
if req_flags:
conf_str = f"""\
packages:
y:
require: cflags="{req_flags}"
"""
update_concretize_scope(conf_str, "packages")
if cmp_flags:
conf_str = _compiler_cfg_one_entry_with_cflags(cmp_flags)
update_concretize_scope(conf_str, "compilers")
compiler_spec = ""
if cmp_flags:
compiler_spec = "%gcc@12.100.100"
if dflags:
spec_str = f"x+activatemultiflag {compiler_spec} ^y"
expected_dflags = "-d1 -d2"
else:
spec_str = f"y {compiler_spec}"
expected_dflags = None
if cmd_flags:
spec_str += f' cflags="{cmd_flags}"'
root_spec = Spec(spec_str).concretized()
spec = root_spec["y"]
satisfy_flags = " ".join(x for x in [cmd_flags, req_flags, cmp_flags, expected_dflags] if x)
assert spec.satisfies(f'cflags="{satisfy_flags}"')
assert spec.compiler_flags["cflags"] == expected_order.split()
def test_two_dependents_flag_mixing(concretize_scope, test_repo):
root_spec1 = Spec("w~moveflaglater").concretized()
spec1 = root_spec1["y"]
assert spec1.compiler_flags["cflags"] == "-d0 -d1 -d2".split()
root_spec2 = Spec("w+moveflaglater").concretized()
spec2 = root_spec2["y"]
assert spec2.compiler_flags["cflags"] == "-d3 -d1 -d2".split()
def test_propagate_and_compiler_cfg(concretize_scope, test_repo):
conf_str = _compiler_cfg_one_entry_with_cflags("-f2")
update_concretize_scope(conf_str, "compilers")
root_spec = Spec("v %gcc@12.100.100 cflags=='-f1'").concretized()
assert root_spec["y"].satisfies("cflags='-f1 -f2'")
# Note: setting flags on a dependency overrides propagation, which
# is tested in test/concretize.py:test_compiler_flag_propagation
def test_propagate_and_pkg_dep(concretize_scope, test_repo):
root_spec1 = Spec("x ~activatemultiflag cflags=='-f1'").concretized()
assert root_spec1["y"].satisfies("cflags='-f1 -d1'")
def test_propagate_and_require(concretize_scope, test_repo):
conf_str = """\
packages:
y:
require: cflags="-f2"
"""
update_concretize_scope(conf_str, "packages")
root_spec1 = Spec("v cflags=='-f1'").concretized()
assert root_spec1["y"].satisfies("cflags='-f1 -f2'")
# Next, check that a requirement does not "undo" a request for
# propagation from the command-line spec
conf_str = """\
packages:
v:
require: cflags="-f1"
"""
update_concretize_scope(conf_str, "packages")
root_spec2 = Spec("v cflags=='-f1'").concretized()
assert root_spec2["y"].satisfies("cflags='-f1'")
# Note: requirements cannot enforce propagation: any attempt to do
# so will generate a concretization error; this likely relates to
# the note about #37180 in concretize.lp
def test_dev_mix_flags(tmp_path, concretize_scope, mutable_mock_env_path, test_repo):
src_dir = tmp_path / "x-src"
env_content = f"""\
spack:
specs:
- y %gcc@12.100.100 cflags=='-fsanitize=address'
develop:
y:
spec: y cflags=='-fsanitize=address'
path: {src_dir}
"""
conf_str = _compiler_cfg_one_entry_with_cflags("-f1")
update_concretize_scope(conf_str, "compilers")
manifest_file = tmp_path / ev.manifest_name
manifest_file.write_text(env_content)
e = ev.create("test", manifest_file)
with e:
e.concretize()
e.write()
(result,) = list(j for i, j in e.concretized_specs() if j.name == "y")
assert result["y"].satisfies("cflags='-fsanitize=address -f1'")
def test_diamond_dep_flag_mixing(concretize_scope, test_repo):
"""A diamond where each dependent applies flags to the bottom
dependency. The goal is to ensure that the flag ordering is
(a) topological and (b) repeatable for elements not subject to
this partial ordering (i.e. the flags for the left and right
nodes of the diamond always appear in the same order).
`Spec.traverse` is responsible for handling both of these needs.
"""
root_spec1 = Spec("t").concretized()
spec1 = root_spec1["y"]
assert spec1.satisfies('cflags="-c1 -c2 -d1 -d2 -e1 -e2"')
assert spec1.compiler_flags["cflags"] == "-c1 -c2 -e1 -e2 -d1 -d2".split()
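Read as a whole, the expected orders in the parametrize table pin down a precedence among cflags sources: compiler-config flags come first, then flags applied by dependents, then requirements, then the command line (which, appearing last, wins where later flags override earlier ones), and a flag present in several sources is grouped at its last, highest-precedence occurrence. A sketch restating one table row, assuming the y requirement cflags="-x5 -x6" and the gcc@12.100.100 compiler flags cflags="-x3 -x8" are configured as in the test:

root = Spec('x+activatemultiflag %gcc@12.100.100 ^y cflags="-x7 -x4"').concretized()
assert root["y"].compiler_flags["cflags"] == [
    "-x3", "-x8",  # compiler configuration
    "-d1", "-d2",  # applied by dependent x
    "-x5", "-x6",  # packages.yaml requirement
    "-x7", "-x4",  # command line
]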

View File

@@ -11,6 +11,8 @@
import llnl.util.filesystem as fs
import spack.config
import spack.database
import spack.error
import spack.mirror
import spack.patch
@@ -255,8 +257,8 @@ def install_upstream(tmpdir_factory, gen_mock_layout, install_mockery):
installs are using the upstream installs).
"""
mock_db_root = str(tmpdir_factory.mktemp("mock_db_root"))
prepared_db = spack.database.Database(mock_db_root)
upstream_layout = gen_mock_layout("/a/")
prepared_db = spack.database.Database(mock_db_root, layout=upstream_layout)
spack.config.CONFIG.push_scope(
spack.config.InternalConfigScope(
name="install-upstream-fixture",
@@ -266,8 +268,7 @@ def install_upstream(tmpdir_factory, gen_mock_layout, install_mockery):
def _install_upstream(*specs):
for spec_str in specs:
s = spack.spec.Spec(spec_str).concretized()
prepared_db.add(s, upstream_layout)
prepared_db.add(Spec(spec_str).concretized())
downstream_root = str(tmpdir_factory.mktemp("mock_downstream_db_root"))
return downstream_root, upstream_layout
@@ -280,7 +281,7 @@ def test_installed_upstream_external(install_upstream, mock_fetch):
"""
store_root, _ = install_upstream("externaltool")
with spack.store.use_store(store_root):
dependent = spack.spec.Spec("externaltest")
dependent = Spec("externaltest")
dependent.concretize()
new_dependency = dependent["externaltool"]
@@ -299,8 +300,8 @@ def test_installed_upstream(install_upstream, mock_fetch):
"""
store_root, upstream_layout = install_upstream("dependency-install")
with spack.store.use_store(store_root):
dependency = spack.spec.Spec("dependency-install").concretized()
dependent = spack.spec.Spec("dependent-install").concretized()
dependency = Spec("dependency-install").concretized()
dependent = Spec("dependent-install").concretized()
new_dependency = dependent["dependency-install"]
assert new_dependency.installed_upstream
@@ -607,7 +608,7 @@ def test_install_from_binary_with_missing_patch_succeeds(
s.to_json(f)
# And register it in the database
temporary_store.db.add(s, directory_layout=temporary_store.layout, explicit=True)
temporary_store.db.add(s, explicit=True)
# Push it to a binary cache
mirror = spack.mirror.Mirror.from_local_path(str(tmp_path / "my_build_cache"))

View File

@@ -12,8 +12,6 @@
import py
import pytest
import archspec.cpu
import llnl.util.filesystem as fs
import llnl.util.lock as ulk
import llnl.util.tty as tty
@@ -435,76 +433,6 @@ def test_fake_install(install_mockery):
assert os.path.isdir(pkg.prefix.lib)
def test_packages_needed_to_bootstrap_compiler_none(install_mockery):
spec = spack.spec.Spec("trivial-install-test-package")
spec.concretize()
assert spec.concrete
packages = inst._packages_needed_to_bootstrap_compiler(
spec.compiler, spec.architecture, [spec.package]
)
assert not packages
@pytest.mark.xfail(reason="fails when assuming Spec.package can only be called on concrete specs")
def test_packages_needed_to_bootstrap_compiler_packages(install_mockery, monkeypatch):
spec = spack.spec.Spec("trivial-install-test-package")
spec.concretize()
def _conc_spec(compiler):
return spack.spec.Spec("pkg-a").concretized()
# Ensure we can get past functions that would otherwise preclude
# obtaining packages.
monkeypatch.setattr(spack.compilers, "compilers_for_spec", _none)
monkeypatch.setattr(spack.compilers, "pkg_spec_for_compiler", _conc_spec)
monkeypatch.setattr(spack.spec.Spec, "concretize", _noop)
packages = inst._packages_needed_to_bootstrap_compiler(
spec.compiler, spec.architecture, [spec.package]
)
assert packages
def test_update_tasks_for_compiler_packages_as_compiler(mock_packages, config, monkeypatch):
spec = spack.spec.Spec("trivial-install-test-package").concretized()
installer = inst.PackageInstaller([spec.package], {})
# Add a task to the queue
installer._add_init_task(spec.package, installer.build_requests[0], False, {})
# monkeypatch to make the list of compilers be what we test
def fake_package_list(compiler, architecture, pkgs):
return [(spec.package, True)]
monkeypatch.setattr(inst, "_packages_needed_to_bootstrap_compiler", fake_package_list)
installer._add_bootstrap_compilers("fake", "fake", "fake", None, {})
# Check that the only task is now a compiler task
assert len(installer.build_pq) == 1
assert installer.build_pq[0][1].compiler
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64",
reason="OneAPI compiler is not supported on other architectures",
)
def test_bootstrapping_compilers_with_different_names_from_spec(
install_mockery, mutable_config, mock_fetch, archspec_host_is_spack_test_host
):
"""Tests that, when we bootstrap '%oneapi' we can translate it to the
'intel-oneapi-compilers' package.
"""
with spack.config.override("config:install_missing_compilers", True):
with spack.concretize.disable_compiler_existence_check():
spec = spack.spec.Spec("trivial-install-test-package%oneapi@=22.2.0").concretized()
spec.package.do_install()
assert (
spack.spec.CompilerSpec("oneapi@=22.2.0") in spack.compilers.all_compiler_specs()
)
def test_dump_packages_deps_ok(install_mockery, tmpdir, mock_packages):
"""Test happy path for dump_packages with dependencies."""
@@ -696,26 +624,6 @@ def test_check_deps_status_upstream(install_mockery, monkeypatch):
assert inst.package_id(dep) in installer.installed
def test_add_bootstrap_compilers(install_mockery, monkeypatch):
from collections import defaultdict
def _pkgs(compiler, architecture, pkgs):
spec = spack.spec.Spec("mpi").concretized()
return [(spec.package, True)]
installer = create_installer(["trivial-install-test-package"], {})
request = installer.build_requests[0]
all_deps = defaultdict(set)
monkeypatch.setattr(inst, "_packages_needed_to_bootstrap_compiler", _pkgs)
installer._add_bootstrap_compilers("fake", "fake", [request.pkg], request, all_deps)
ids = list(installer.build_tasks)
assert len(ids) == 1
task = installer.build_tasks[ids[0]]
assert task.compiler
def test_prepare_for_install_on_installed(install_mockery, monkeypatch):
"""Test of _prepare_for_install's early return for installed task path."""
installer = create_installer(["dependent-install"], {})
@@ -729,18 +637,6 @@ def test_prepare_for_install_on_installed(install_mockery, monkeypatch):
installer._prepare_for_install(task)
def test_installer_init_requests(install_mockery):
"""Test of installer initial requests."""
spec_name = "dependent-install"
with spack.config.override("config:install_missing_compilers", True):
installer = create_installer([spec_name], {})
# There is only one explicit request in this case
assert len(installer.build_requests) == 1
request = installer.build_requests[0]
assert request.pkg.name == spec_name
def test_install_task_use_cache(install_mockery, monkeypatch):
installer = create_installer(["trivial-install-test-package"], {})
request = installer.build_requests[0]

View File

@@ -286,6 +286,7 @@ def compilers(compiler, arch_spec):
# TODO (post-34236): Remove when remove deprecated run_test(), etc.
@pytest.mark.not_on_windows("echo not available on Windows")
@pytest.mark.parametrize(
"msg,installed,purpose,expected",
[

View File

@@ -105,25 +105,21 @@ def test_schema_validation(meta_schema, config_name):
def test_deprecated_properties(module_suffixes_schema):
# Test that an error is reported when 'error: True'
msg_fmt = r"deprecated properties detected [properties={properties}]"
module_suffixes_schema["deprecatedProperties"] = {
"properties": ["tcl"],
"message": msg_fmt,
"error": True,
}
msg_fmt = r"{name} is deprecated"
module_suffixes_schema["deprecatedProperties"] = [
{"names": ["tcl"], "message": msg_fmt, "error": True}
]
v = spack.schema.Validator(module_suffixes_schema)
data = {"tcl": {"all": {"suffixes": {"^python": "py"}}}}
expected_match = "deprecated properties detected"
expected_match = "tcl is deprecated"
with pytest.raises(jsonschema.ValidationError, match=expected_match):
v.validate(data)
# Test that just a warning is reported when 'error: False'
module_suffixes_schema["deprecatedProperties"] = {
"properties": ["tcl"],
"message": msg_fmt,
"error": False,
}
module_suffixes_schema["deprecatedProperties"] = [
{"names": ["tcl"], "message": msg_fmt, "error": False}
]
v = spack.schema.Validator(module_suffixes_schema)
data = {"tcl": {"all": {"suffixes": {"^python": "py"}}}}
# The next validation doesn't raise anymore
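The hunk above documents the schema migration for deprecatedProperties: a single dict keyed by "properties" becomes a list of entries keyed by "names", each carrying its own message template (with a {name} placeholder) and error flag. A minimal sketch of the new shape; the second entry is hypothetical:

module_suffixes_schema["deprecatedProperties"] = [
    # Each entry independently deprecates a group of property names.
    {"names": ["tcl"], "message": "{name} is deprecated", "error": True},
    # error=False downgrades the report from an error to a warning.
    {"names": ["lmod"], "message": "{name} is deprecated", "error": False},  # hypothetical
]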

View File

@@ -245,6 +245,65 @@ def test_abstract_specs_can_constrain_each_other(self, lhs, rhs, expected):
assert c1 == c2
assert c1 == expected
@pytest.mark.parametrize(
"lhs,rhs,expected_lhs,expected_rhs,propagated_lhs,propagated_rhs",
[
(
'mpich cppflags="-O3"',
'mpich cppflags="-O2"',
'mpich cppflags="-O3 -O2"',
'mpich cppflags="-O2 -O3"',
[],
[],
),
(
'mpich cflags="-O3 -g"',
'mpich cflags=="-O3"',
'mpich cflags="-O3 -g"',
'mpich cflags=="-O3 -g"',
[("cflags", "-O3")],
[("cflags", "-O3")],
),
],
)
def test_constrain_compiler_flags(
self, lhs, rhs, expected_lhs, expected_rhs, propagated_lhs, propagated_rhs
):
"""Constraining is asymmetric for compiler flags. Also note that
Spec equality does not account for flag propagation, so the checks
here are manual.
"""
lhs, rhs, expected_lhs, expected_rhs = (
Spec(lhs),
Spec(rhs),
Spec(expected_lhs),
Spec(expected_rhs),
)
assert lhs.intersects(rhs)
assert rhs.intersects(lhs)
c1, c2 = lhs.copy(), rhs.copy()
c1.constrain(rhs)
c2.constrain(lhs)
assert c1 == expected_lhs
assert c2 == expected_rhs
for x in [c1, c2]:
assert x.satisfies(lhs)
assert x.satisfies(rhs)
def _propagated_flags(_spec):
result = set()
for flagtype in _spec.compiler_flags:
for flag in _spec.compiler_flags[flagtype]:
if flag.propagate:
result.add((flagtype, flag))
return result
assert set(propagated_lhs) <= _propagated_flags(c1)
assert set(propagated_rhs) <= _propagated_flags(c2)
def test_constrain_specs_by_hash(self, default_mock_concretization, database):
"""Test that Specs specified only by their hashes can constrain eachother."""
mpich_dag_hash = "/" + database.query_one("mpich").dag_hash()
@@ -311,14 +370,11 @@ def test_concrete_specs_which_satisfies_abstract(self, lhs, rhs, default_mock_co
("mpich~~foo", "mpich++foo"),
("mpich++foo", "mpich~~foo"),
("mpich foo==True", "mpich foo==False"),
('mpich cppflags="-O3"', 'mpich cppflags="-O2"'),
('mpich cppflags="-O3"', 'mpich cppflags=="-O3"'),
("libelf@0:2.0", "libelf@2.1:3"),
("libelf@0:2.5%gcc@4.8:4.9", "libelf@2.1:3%gcc@4.5:4.7"),
("libelf+debug", "libelf~debug"),
("libelf+debug~foo", "libelf+debug+foo"),
("libelf debug=True", "libelf debug=False"),
('libelf cppflags="-O3"', 'libelf cppflags="-O2"'),
("libelf platform=test target=be os=be", "libelf target=fe os=fe"),
("namespace=builtin.mock", "namespace=builtin"),
],
@@ -347,10 +403,6 @@ def test_constraining_abstract_specs_with_empty_intersection(self, lhs, rhs):
("mpich", "mpich++foo"),
("mpich", "mpich~~foo"),
("mpich", "mpich foo==1"),
# Flags semantics is currently different from other variant
("mpich", 'mpich cflags="-O3"'),
("mpich cflags=-O3", 'mpich cflags="-O3 -Ofast"'),
("mpich cflags=-O2", 'mpich cflags="-O3"'),
("multivalue-variant foo=bar", "multivalue-variant +foo"),
("multivalue-variant foo=bar", "multivalue-variant ~foo"),
("multivalue-variant fee=bar", "multivalue-variant fee=baz"),
@@ -689,6 +741,13 @@ def test_spec_formatting(self, default_mock_concretization):
("{/hash}", "/", lambda s: "/" + s.dag_hash()),
]
variants_segments = [
("{variants.debug}", spec, "debug"),
("{variants.foo}", spec, "foo"),
("{^pkg-a.variants.bvv}", spec["pkg-a"], "bvv"),
("{^pkg-a.variants.foo}", spec["pkg-a"], "foo"),
]
other_segments = [
("{spack_root}", spack.paths.spack_root),
("{spack_install}", spack.store.STORE.layout.root),
@@ -716,6 +775,12 @@ def check_prop(check_spec, fmt_str, prop, getter):
callpath, fmt_str = depify("callpath", named_str, sigil)
assert spec.format(fmt_str) == getter(callpath)
for named_str, test_spec, variant_name in variants_segments:
assert test_spec.format(named_str) == str(test_spec.variants[variant_name])
assert test_spec.format(named_str[:-1] + ".value}") == str(
test_spec.variants[variant_name].value
)
for named_str, expected in other_segments:
actual = spec.format(named_str)
assert expected == actual
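The new variants_segments cases extend Spec.format with per-variant selectors: {variants.<name>} renders the whole variant, {variants.<name>.value} just its value, and the {^pkg-a....} form addresses a dependency's variants. A sketch of the call shape on some concrete spec with a 'debug' variant (outputs are illustrative):

spec.format("{variants.debug}")        # full variant rendering, e.g. "+debug"
spec.format("{variants.debug.value}")  # just the value, e.g. "True"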
@@ -775,6 +840,7 @@ def test_spec_formatting_sigil_mismatches(self, default_mock_concretization, fmt
r"{dag_hash}",
r"{foo}",
r"{+variants.debug}",
r"{variants.this_variant_does_not_exist}",
],
)
def test_spec_formatting_bad_formats(self, default_mock_concretization, fmt_str):
@@ -1451,8 +1517,8 @@ def test_abstract_contains_semantic(lhs, rhs, expected, mock_packages):
(CompilerSpec, "gcc@5", "gcc@5-tag", (True, False, True)),
# Flags (flags are a map, so for convenience we initialize a full Spec)
# Note: the semantic is that of sv variants, not mv variants
(Spec, "cppflags=-foo", "cppflags=-bar", (False, False, False)),
(Spec, "cppflags='-bar -foo'", "cppflags=-bar", (False, False, False)),
(Spec, "cppflags=-foo", "cppflags=-bar", (True, False, False)),
(Spec, "cppflags='-bar -foo'", "cppflags=-bar", (True, True, False)),
(Spec, "cppflags=-foo", "cppflags=-foo", (True, True, True)),
(Spec, "cppflags=-foo", "cflags=-foo", (True, False, False)),
# Versions

Some files were not shown because too many files have changed in this diff.