Compare commits


58 Commits

Author SHA1 Message Date
Harmen Stoppels
1d78283946 troubleshoot ci 2025-05-12 16:46:58 +02:00
Diego Alvarez S.
5a8a7b83f6 blast-plus: fix python version to <=3.11 (#50416) 2025-05-12 09:24:08 +02:00
Dmitri Smirnov
70de20eaa2 optix-dev: new package for NVIDIA OptiX SDK headers (#50080) 2025-05-12 09:05:22 +02:00
Adam J. Stewart
d8172e2c29 GDAL: add v3.11.0 (#50405) 2025-05-12 08:58:52 +02:00
Diego Alvarez S.
3371cc55ed nextflow: add v25.04.0 (#50412) 2025-05-12 08:50:13 +02:00
Satish Balay
35ee3706bb petsc, py-petsc4py: add v3.23.2 (#50409) 2025-05-12 08:48:41 +02:00
Diego Alvarez S.
9493cf016b Add seqkit v2.10.0 (#50414) 2025-05-12 08:47:25 +02:00
Seth R. Johnson
f6caa1c824 covfie: add v0.14 and maintainer (#50417) 2025-05-12 08:36:57 +02:00
Matthieu Dorier
2c9f94e6a5 py-confluent-kafka: new package (#50418) 2025-05-12 08:36:17 +02:00
pauleonix
2504dcf4f8 cuda: add v12.8.1 (#49379) 2025-05-12 08:34:12 +02:00
Alec Scott
82baa658a4 typos: add v1.32.0 (#50421) 2025-05-12 08:33:03 +02:00
Jon Rood
21384be78d exawind: depends on fortran as well (#50393) 2025-05-10 10:44:38 +02:00
Massimiliano Culpo
acd47147a5 solver: support concrete multivalued variants (#50325)
The solver now supports key:=val syntax for multivalued variants in specs originating from input, externals, requirements, directives, and when-conditions

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-05-09 23:08:21 +02:00
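The key:=val syntax mentioned above can be illustrated with a small parsing sketch (a hypothetical helper, not Spack's actual spec parser): `:=` marks the variant's value set as concrete, while plain `=` leaves it open for the solver to extend.

```python
# Hypothetical sketch of the key:=val vs key=val distinction for
# multivalued variants; this is NOT Spack's real parser.
def parse_variant(token: str):
    """Return (name, values, concrete) for a token like 'languages:=c,cxx'."""
    if ":=" in token:
        name, _, rhs = token.partition(":=")
        concrete = True   # the value set is exactly these values
    else:
        name, _, rhs = token.partition("=")
        concrete = False  # the solver may still add further values
    return name, tuple(rhs.split(",")), concrete

print(parse_variant("languages:=c,cxx"))  # ('languages', ('c', 'cxx'), True)
print(parse_variant("languages=c"))       # ('languages', ('c',), False)
```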
Harmen Stoppels
3ed6736b2c compiler.py: be more defensive (#50403) 2025-05-09 16:58:52 +00:00
Massimiliano Culpo
842954f6a4 solver: treat external nodes as concrete for optimization purposes (#50165)
This PR is a step towards treating externals as concrete specs. Specifically, it moves the optimization weights of external nodes into the group of "reused" specs, and doesn't count externals as specs to be built. 

It still keeps the one-to-many mapping between an external spec in `packages.yaml` and the corresponding specs in the DB. To make a hashed external preferred over a non-hashed one, the version weights of externals are demoted to the lowest priority.

**Change in behavior**:
- When possible, Spack will now prefer to mix compilers in a DAG and use the latest possible version for each node, rather than using a single compiler and employing old versions of some nodes because of conflicts
- In general, using externals by default is now triggered by putting their weights in the "reused" group. This means that any penalty they might induce will never have the same priority as something that needs to be built. This behavior reflects reality, but changes some default choices from the previous state.

Modifications:
- External nodes weights are now moved to the group of "reused" specs
- Runtimes are treated as concrete as well (to avoid, e.g., `gcc` not being selected because it "builds" the runtime package)
- Shuffle version weights, so that externals are least preferred (counterbalanced by the fact that they are in the "reused" group)
- Split provider weights on edges from version badness on edges

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-05-09 17:41:16 +02:00
Harmen Stoppels
5c918e40a6 builtin: remove llnl.util.tty import (#50396)
* builtin: remove llnl.util.tty import

* hipsycl: remove unnecessary imports
2025-05-09 15:35:18 +02:00
Harmen Stoppels
98570929aa builtin: github url pull -> commit (#50399)
Rate limits on github.com's pull/ URLs are ~1 per minute; the rate limits
for merged commits are higher and, on top of that, less dynamic.
2025-05-09 15:25:39 +02:00
Gregor Olenik
75bcf58b30 neon: add new package (#50278) 2025-05-09 15:20:24 +02:00
Harshula Jayasuriya
4a03cac6cc fms: require +pic when +shared (#50130) 2025-05-09 06:15:04 -06:00
Thomas Madlener
75faab206b lcio: add version 2.22.6 (#50368)
Restrict the dependency on "c" to previous versions, as this has since been fixed

Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2025-05-09 05:37:56 -06:00
Sergey Kosukhin
c9ab0d8fcb icon: add version 2025.04 (#50245) 2025-05-09 10:22:08 +02:00
John W. Parent
c45e02d58f mpilander: conflict with Windows (#49733) 2025-05-09 10:04:46 +02:00
Thomas-Ulrich
33c8f518ae seissol: fix build by adding language dependence (#50302) 2025-05-09 10:01:00 +02:00
Rémi Lacroix
2491a9abff conquest: explicitly configure the MPI compilers. (#50287) 2025-05-09 09:58:45 +02:00
G-Ragghianti
1a26ec7b8b parsec: new version and compiler dependency (#50292) 2025-05-09 09:29:27 +02:00
Patrick Diehl
89a79d3df0 hpx: add fetching APEX and specify develop (#50289)
* Add fetching APEX and specify develop

* Using the spack package

* [@spackbot] updating style on behalf of diehlpk

* Add restrictions for 1.5
2025-05-09 09:28:30 +02:00
Marc T. Henry de Frahan
ce700d69d7 Add amr-wind versions (#50373) 2025-05-09 00:10:53 -06:00
Tamara Dahlgren
a505fb1f37 unit tests: switch TestSpecList to use mock packages (#50353) 2025-05-09 07:45:49 +02:00
吴坎
f039b22093 Update package.py (#50378) 2025-05-08 23:34:41 -06:00
Dave Keeshan
18ea8f813e yosys: add v0.53 (#50372) 2025-05-08 23:34:23 -06:00
Jonas Eschle
d7e740defa py-hepstats: new package (#43697)
* enh: add py-hepstats package
* fix: version
* fix: update pypi version
* fix: update hash
* fix: use github package
* fix: allow download from pypi
* chore: remove unused Bazel, cleanup imports
* enh:  add 0.9.2 version
* fix: update dependencies for version 0.9.0 and adjust build system
* chore:  move to new builtin directory

---------

Co-authored-by: jonas-eschle <jonas-eschle@users.noreply.github.com>
2025-05-08 23:29:02 -06:00
Chris Green
c21dc1a27a jsonnet: Support CMake builds with external nlohmann-json (#49284)
* jsonnet: Support CMake builds with external `nlohmann-json`

* New version 0.21.0
2025-05-08 23:23:45 -06:00
Sinan
f30d8ea2a5 package/lemon,qjson,qtkeychain: fix c compiler dependency (#50311)
* package/lemon,qjson,qtkeychain: fix c compiler dependency
* remove generated

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2025-05-08 16:44:23 -07:00
Scott Wittenburg
03cb30cb96 binary_distribution: Do not look in sub-mirrors when indexing (#50389)
When indexing top-level specs, e.g. in s3://spack-binaries/develop,
do not sync manifests from all the stacks. Instead, add the path
to the spec manifests to the URL to sync, so that only items in
s3://spack-binaries/develop/v3/manifests/spec are copied to the
local system.
2025-05-08 17:25:35 -06:00
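The narrowing described above amounts to appending the spec-manifest path to the mirror URL before syncing; a minimal sketch (the helper name is hypothetical, the v3/manifests/spec path comes from the commit message):

```python
# Sketch: instead of syncing the whole mirror prefix, append the
# spec-manifest path so only those objects are copied locally.
def manifest_sync_url(mirror_url: str) -> str:
    return mirror_url.rstrip("/") + "/v3/manifests/spec"

print(manifest_sync_url("s3://spack-binaries/develop"))
# -> s3://spack-binaries/develop/v3/manifests/spec
```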
Scott Wittenburg
f6da037129 binary_distribution: Handle fetch error during rebuild-index (#50387)
Allow rebuild-index to continue if fetching some specs fails
for any reason, and issue a warning indicating which manifest
is associated with the failed fetch.
2025-05-08 13:54:43 -06:00
Kyle Shores
31c2897fd8 musica: adding a netcdf-fortran dependency (#50252) 2025-05-08 13:41:33 -06:00
jgraciahlrs
1a379215da Allow usage of config variables and env variables with include_concrete (#45871)
* Allow usage of spack config vars in concrete env path
* Update docs on usage of spack config vars in concrete env path
2025-05-08 14:23:02 -05:00
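A hedged sketch of what this enables in a `spack.yaml` (the paths and the `SCRATCH` variable are illustrative assumptions, not from the PR itself):

```yaml
# Hypothetical spack.yaml fragment: include_concrete paths may now use
# spack config variables and environment variables.
spack:
  include_concrete:
    - $spack/../concrete-envs/mpi   # spack config variable
    - ${SCRATCH}/envs/gpu           # environment variable
```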
Robert Maaskant
0f7c1b5e38 go: add v1.23.9 and v1.24.3 (#50346) 2025-05-08 13:57:00 -05:00
ShujieL
7e3af5d42d dd4hep: add v1.32 (#50359)
* Update package.py for dd4hep 1.32

* Update package.py to fix the podio-dd4hep version

* fix the dd4hep 1.32 hash

Co-authored-by: Sakib Rahman <rahmans@myumanitoba.ca>

---------

Co-authored-by: Sakib Rahman <rahmans@myumanitoba.ca>
2025-05-08 11:52:57 -07:00
Victor A. P. Magri
f45e312f81 raja: add gpu-profiling variant (#50354) 2025-05-08 12:40:56 -06:00
Chris Marsh
a82e21e82f add 0.61.2 and fix numpy version constraints (#50352) 2025-05-08 12:24:13 -06:00
Robert Maaskant
1ba40b99ee yq: add v4.45.2 (#50345) 2025-05-08 12:07:22 -06:00
Robert Maaskant
60f2698a4a trivy: add v0.62.0 and v0.62.1 (#50344) 2025-05-08 12:07:04 -06:00
Harmen Stoppels
b3772f8bb6 builtin: remove unused imports from build_systems (#50385) 2025-05-08 19:27:24 +02:00
Harmen Stoppels
cd75e52ba2 yaml_cpp: do not import spack.spec (#50382) 2025-05-08 10:52:35 -06:00
Harmen Stoppels
b0b316c646 builtin: add a few missing __init__.py (#50374) 2025-05-08 18:45:09 +02:00
Harmen Stoppels
7bbf581169 singularity-eos: remove conditional depends_on (#50381) 2025-05-08 18:42:06 +02:00
Harmen Stoppels
7b93d01a68 builtin: remove various redundant wildcard imports (#50380) 2025-05-08 18:38:18 +02:00
Harmen Stoppels
a95fa26857 docs/comments: fix typo with wildcard import (#50379) 2025-05-08 18:37:45 +02:00
Harmen Stoppels
6f2393a345 builtin: delete spack.store import (#50383) 2025-05-08 10:31:11 -06:00
Mikael Simberg
9fa2bb375c fmt: add v11.2.0 (#50343) 2025-05-08 05:54:54 -06:00
Harmen Stoppels
98c08ce5c6 repo.py: enable search paths when spack.repo.PATH is assigned (#50370)
Fixes a bug where `custom_repo.get_pkg_class("foo")` failed executing `import spack_repo.builtin` even if the builtin repo was configured globally.

Instead of assigning `spack.repo.PATH` directly, the `spack.repo.enable_repo` function now assigns and enables the repo, meaning that the Python module search paths are also modified.
2025-05-08 13:42:20 +02:00
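The distinction the fix relies on can be shown with a toy sketch (not Spack's real code): plain assignment only swaps the global, while an enable function also makes the repo's Python modules importable by extending `sys.path`.

```python
import sys

# Toy sketch of "assign" vs "enable" for a repo global. The class and
# paths are hypothetical; Spack's real objects differ.
class RepoPath:
    def __init__(self, python_root: str):
        self.python_root = python_root

PATH = None

def enable_repo(repo: RepoPath) -> None:
    global PATH
    PATH = repo  # assign the global, as before
    if repo.python_root not in sys.path:
        sys.path.insert(0, repo.python_root)  # and enable imports too

enable_repo(RepoPath("/tmp/my-repo"))
print(PATH.python_root in sys.path)  # True
```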
Caetano Melone
83f115894b glib: add preferred version 2.78.3 (#50356)
Versions of glib above 2.78.3 don't build (https://github.com/spack/spack/issues/49358). Until this is fixed, we should set a confirmed version as preferred, per https://github.com/spack/spack/issues/49358#issuecomment-2706251681.
2025-05-08 09:27:07 +02:00
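In a package recipe this kind of pin is expressed with the `preferred=True` argument to the `version` directive; an illustrative fragment (hashes elided, the newer version number is only an example):

```python
# Illustrative package.py fragment: preferred=True steers the
# concretizer to 2.78.3 until newer glib versions build again.
class Glib(MesonPackage):
    version("2.80.0", sha256="...")                 # fails to build, see #49358
    version("2.78.3", sha256="...", preferred=True)
```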
Tamara Dahlgren
59339be48f test/cmd/find.py: switch to use mock_packages (#50358) 2025-05-08 08:33:56 +02:00
snehring
ef0599b53c cryodrgn: adding v3.4.3 (#48804)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-05-07 12:43:37 -07:00
Veselin Dobrev
9c4207a551 mesa: add the latest v24.* and v25.* versions (#47642)
* [mesa] Add latest version: 24.2.7
* Fix the llvm build for @18: when libunwind is disabled
* [mesa] Updating to the latest 24.* and 25.* versions
* Add libshmfence dependency

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-05-07 12:09:23 -07:00
Sinan
eb95390ce7 package/qscintilla: fix build issue (#50317)
* package/qscintilla: fix build issue

* add maintainer

* package/qscintilla: fix build issue

* add maintainer

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2025-05-07 19:45:14 +02:00
Sinan
527d723db0 package_qgis add new versions (#50328)
* package_qgis add new versions

* restore deprecated version

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2025-05-07 09:50:58 -07:00
151 changed files with 931 additions and 1514 deletions


@@ -1,73 +0,0 @@
name: audit
on:
workflow_call:
inputs:
with_coverage:
required: true
type: string
python_version:
required: true
type: string
concurrency:
group: audit-${{inputs.python_version}}-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
# Run audits on all the packages in the built-in repository
package-audits:
runs-on: ${{ matrix.system.os }}
strategy:
matrix:
system:
- { os: windows-latest, shell: 'powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}' }
- { os: ubuntu-latest, shell: bash }
- { os: macos-latest, shell: bash }
defaults:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml]
- name: Setup for Windows run
if: runner.os == 'Windows'
run: |
python -m pip install --upgrade pywin32
- name: Package audits (with coverage)
env:
COVERAGE_FILE: coverage/.coverage-audits-${{ matrix.system.os }}
if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage run $(which spack) audit configs
coverage run $(which spack) -d audit externals
coverage combine
- name: Package audits (without coverage)
if: ${{ inputs.with_coverage == 'false' && runner.os != 'Windows' }}
run: |
. share/spack/setup-env.sh
spack -d audit packages
spack -d audit configs
spack -d audit externals
- name: Package audits (without coverage)
if: ${{ runner.os == 'Windows' }}
run: |
spack -d audit packages
./share/spack/qa/validate_last_exit.ps1
spack -d audit configs
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b
if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
with:
name: coverage-audits-${{ matrix.system.os }}
path: coverage
include-hidden-files: true


@@ -1,194 +0,0 @@
name: Bootstrapping
on:
# This Workflow can be triggered manually
workflow_dispatch:
workflow_call:
schedule:
# nightly at 2:16 AM
- cron: '16 2 * * *'
concurrency:
group: bootstrap-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
distros-clingo-sources:
runs-on: ubuntu-latest
container: ${{ matrix.image }}
strategy:
matrix:
image: ["fedora:latest", "opensuse/leap:latest"]
steps:
- name: Setup Fedora
if: ${{ matrix.image == 'fedora:latest' }}
run: |
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static gawk
- name: Setup OpenSUSE
if: ${{ matrix.image == 'opensuse/leap:latest' }}
run: |
# Harden CI by applying the workaround described here: https://www.suse.com/support/kb/doc/?id=000019505
zypper update -y || zypper update -y
zypper install -y \
bzip2 curl file gcc-c++ gcc gcc-fortran tar git gpg2 gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.6
spack bootstrap disable github-actions-v0.5
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
clingo-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: "3.12"
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.6
spack bootstrap disable github-actions-v0.5
spack external find --not-buildable cmake bison
spack -d solve zlib
tree $HOME/.spack/bootstrap/store/
gnupg-sources:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: [ 'macos-13', 'macos-14', "ubuntu-latest" ]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: brew install tree gawk
- name: Remove system executables
run: |
while [ -n "$(command -v gpg gpg2 patchelf)" ]; do
sudo rm $(command -v gpg gpg2 patchelf)
done
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack solve zlib
spack bootstrap disable github-actions-v0.6
spack bootstrap disable github-actions-v0.5
spack -d gpg list
tree ~/.spack/bootstrap/store/
from-binaries:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: brew install tree
- name: Remove system executables
run: |
while [ -n "$(command -v gpg gpg2 patchelf)" ]; do
sudo rm $(command -v gpg gpg2 patchelf)
done
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: |
3.8
3.9
3.10
3.11
3.12
3.13
- name: Set bootstrap sources
run: |
source share/spack/setup-env.sh
spack bootstrap disable github-actions-v0.5
spack bootstrap disable spack-install
- name: Bootstrap clingo
run: |
set -e
for ver in '3.8' '3.9' '3.10' '3.11' '3.12' '3.13'; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
if [[ -d "$ver_dir" ]] ; then
echo "Testing $ver_dir"
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
old_path="$PATH"
export PATH="$ver_dir:$PATH"
./bin/spack-tmpconfig -b ./.github/workflows/bin/bootstrap-test.sh
export PATH="$old_path"
fi
fi
if (($not_found)) ; then
echo Required python version $ver not found in runner!
exit 1
fi
done
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
spack -d gpg list
tree $HOME/.spack/bootstrap/store/
windows:
runs-on: "windows-latest"
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: "3.12"
- name: Setup Windows
run: |
Remove-Item -Path (Get-Command gpg).Path
Remove-Item -Path (Get-Command file).Path
- name: Bootstrap clingo
run: |
./share/spack/setup-env.ps1
spack bootstrap disable github-actions-v0.6
spack bootstrap disable github-actions-v0.5
spack external find --not-buildable cmake bison
spack -d solve zlib
./share/spack/qa/validate_last_exit.ps1
tree $env:userprofile/.spack/bootstrap/store/
- name: Bootstrap GnuPG
run: |
./share/spack/setup-env.ps1
spack -d gpg list
./share/spack/qa/validate_last_exit.ps1
tree $env:userprofile/.spack/bootstrap/store/


@@ -1,140 +0,0 @@
name: Containers
on:
# This Workflow can be triggered manually
workflow_dispatch:
# Build new Spack develop containers nightly.
schedule:
- cron: '34 0 * * *'
# Run on pull requests that modify this file
pull_request:
branches:
- develop
paths:
- '.github/workflows/build-containers.yml'
- 'share/spack/docker/*'
- 'share/spack/templates/container/*'
- 'lib/spack/spack/container/*'
# Let's also build & tag Spack containers on releases.
release:
types: [published]
concurrency:
group: build_containers-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
deploy-images:
runs-on: ubuntu-latest
permissions:
packages: write
strategy:
# Even if one container fails to build we still want the others
# to continue their builds.
fail-fast: false
# A matrix of Dockerfile paths, associated tags, and which architectures
# they support.
matrix:
# Meaning of the various items in the matrix list
# 0: Container name (e.g. ubuntu-bionic)
# 1: Platforms to build for
# 2: Base image (e.g. ubuntu:22.04)
dockerfile: [[amazon-linux, 'linux/amd64,linux/arm64', 'amazonlinux:2'],
[centos-stream9, 'linux/amd64,linux/arm64', 'centos:stream9'],
[leap15, 'linux/amd64,linux/arm64', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64', 'ubuntu:22.04'],
[ubuntu-noble, 'linux/amd64,linux/arm64', 'ubuntu:24.04'],
[almalinux8, 'linux/amd64,linux/arm64', 'almalinux:8'],
[almalinux9, 'linux/amd64,linux/arm64', 'almalinux:9'],
[rockylinux8, 'linux/amd64,linux/arm64', 'rockylinux:8'],
[rockylinux9, 'linux/amd64,linux/arm64', 'rockylinux:9'],
[fedora39, 'linux/amd64,linux/arm64', 'fedora:39'],
[fedora40, 'linux/amd64,linux/arm64', 'fedora:40']]
name: Build ${{ matrix.dockerfile[0] }}
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Determine latest release tag
id: latest
run: |
git fetch --quiet --tags
echo "tag=$(git tag --list --sort=-v:refname | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | head -n 1)" | tee -a $GITHUB_OUTPUT
- uses: docker/metadata-action@369eb591f429131d6889c46b94e711f089e6ca96
id: docker_meta
with:
images: |
ghcr.io/${{ github.repository_owner }}/${{ matrix.dockerfile[0] }}
${{ github.repository_owner }}/${{ matrix.dockerfile[0] }}
tags: |
type=schedule,pattern=nightly
type=schedule,pattern=develop
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
type=ref,event=branch
type=ref,event=pr
type=raw,value=latest,enable=${{ github.ref == format('refs/tags/{0}', steps.latest.outputs.tag) }}
- name: Generate the Dockerfile
env:
SPACK_YAML_OS: "${{ matrix.dockerfile[2] }}"
run: |
.github/workflows/bin/generate_spack_yaml_containerize.sh
. share/spack/setup-env.sh
mkdir -p dockerfiles/${{ matrix.dockerfile[0] }}
spack containerize --last-stage=bootstrap | tee dockerfiles/${{ matrix.dockerfile[0] }}/Dockerfile
printf "Preparing to build ${{ env.container }} from dockerfiles/${{ matrix.dockerfile[0] }}/Dockerfile"
if [ ! -f "dockerfiles/${{ matrix.dockerfile[0] }}/Dockerfile" ]; then
printf "dockerfiles/${{ matrix.dockerfile[0] }}/Dockerfile does not exist"
exit 1;
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b
with:
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@49b3bc8e6bdd4a60e6116a5414239cba5943d3cf
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@6524bf65af31da8d45b59e8c27de4bd072b392f5
- name: Log in to GitHub Container Registry
uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Log in to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}
merge-dockerfiles:
runs-on: ubuntu-latest
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@6f51ac03b9356f520e9adb1b1b7802705f340c2b
with:
name: dockerfiles
pattern: dockerfiles_*
delete-merged: true


@@ -1,130 +0,0 @@
name: ci
on:
push:
branches:
- develop
- releases/**
pull_request:
branches:
- develop
- releases/**
merge_group:
concurrency:
group: ci-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
# Check which files have been updated by the PR
changes:
runs-on: ubuntu-latest
# Set job outputs to values from filter step
outputs:
bootstrap: ${{ steps.filter.outputs.bootstrap }}
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
if: ${{ github.event_name == 'push' || github.event_name == 'merge_group' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36
id: filter
with:
# For merge group events, compare against the target branch (main)
base: ${{ github.event_name == 'merge_group' && github.event.merge_group.base_ref || '' }}
# For merge group events, use the merge group head ref
ref: ${{ github.event_name == 'merge_group' && github.event.merge_group.head_sha || github.ref }}
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below
# Don't run if we only modified packages in the
# built-in repository or documentation
filters: |
bootstrap:
- 'var/spack/repos/spack_repo/builtin/packages/clingo-bootstrap/**'
- 'var/spack/repos/spack_repo/builtin/packages/clingo/**'
- 'var/spack/repos/spack_repo/builtin/packages/python/**'
- 'var/spack/repos/spack_repo/builtin/packages/re2c/**'
- 'var/spack/repos/spack_repo/builtin/packages/gnupg/**'
- 'var/spack/repos/spack_repo/builtin/packages/libassuan/**'
- 'var/spack/repos/spack_repo/builtin/packages/libgcrypt/**'
- 'var/spack/repos/spack_repo/builtin/packages/libgpg-error/**'
- 'var/spack/repos/spack_repo/builtin/packages/libksba/**'
- 'var/spack/repos/spack_repo/builtin/packages/npth/**'
- 'var/spack/repos/spack_repo/builtin/packages/pinentry/**'
- 'lib/spack/**'
- 'share/spack/**'
- '.github/workflows/bootstrap.yml'
- '.github/workflows/ci.yaml'
core:
- './!(var/**)/**'
packages:
- 'var/**'
# Some links for easier reference:
#
# "github" context: https://docs.github.com/en/actions/reference/context-and-expression-syntax-for-github-actions#github-context
# job outputs: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#jobsjob_idoutputs
# setting environment variables from earlier steps: https://docs.github.com/en/actions/reference/workflow-commands-for-github-actions#setting-an-environment-variable
#
bootstrap:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.bootstrap == 'true' }}
needs: [ prechecks, changes ]
uses: ./.github/workflows/bootstrap.yml
secrets: inherit
unit-tests:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks, changes ]
uses: ./.github/workflows/unit_tests.yaml
secrets: inherit
prechecks:
needs: [ changes ]
uses: ./.github/workflows/prechecks.yml
secrets: inherit
with:
with_coverage: ${{ needs.changes.outputs.core }}
with_packages: ${{ needs.changes.outputs.packages }}
import-check:
needs: [ changes ]
uses: ./.github/workflows/import-check.yaml
all-prechecks:
needs: [ prechecks ]
if: ${{ always() }}
runs-on: ubuntu-latest
steps:
- name: Success
run: |
if [ "${{ needs.prechecks.result }}" == "failure" ] || [ "${{ needs.prechecks.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
else
exit 0
fi
coverage:
needs: [ unit-tests, prechecks ]
if: ${{ needs.changes.outputs.core }}
uses: ./.github/workflows/coverage.yml
secrets: inherit
all:
needs: [ unit-tests, coverage, bootstrap ]
if: ${{ always() }}
runs-on: ubuntu-latest
# See https://docs.github.com/en/actions/writing-workflows/choosing-what-your-workflow-does/accessing-contextual-information-about-workflow-runs#needs-context
steps:
- name: Status summary
run: |
if [ "${{ needs.unit-tests.result }}" == "failure" ] || [ "${{ needs.unit-tests.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
elif [ "${{ needs.bootstrap.result }}" == "failure" ] || [ "${{ needs.bootstrap.result }}" == "canceled" ]; then
echo "Bootstrap tests failed."
exit 1
else
exit 0
fi


@@ -1,36 +0,0 @@
name: coverage
on:
workflow_call:
jobs:
# Upload coverage reports to codecov once as a single bundle
upload:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.11'
cache: 'pip'
- name: Install python dependencies
run: pip install -r .github/workflows/requirements/coverage/requirements.txt
- name: Download coverage artifact files
uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16
with:
pattern: coverage-*
path: coverage
merge-multiple: true
- run: ls -la coverage
- run: coverage combine -a coverage/.coverage*
- run: coverage xml
- name: "Upload coverage report to CodeCov"
uses: codecov/codecov-action@1e68e06f1dbfde0e4cefc87efeba9e4643565303
with:
verbose: true
fail_ci_if_error: false
token: ${{ secrets.CODECOV_TOKEN }}


@@ -1,49 +0,0 @@
name: import-check
on:
workflow_call:
jobs:
# Check we don't make the situation with circular imports worse
import-check:
runs-on: ubuntu-latest
steps:
- uses: julia-actions/setup-julia@v2
with:
version: '1.10'
- uses: julia-actions/cache@v2
# PR: use the base of the PR as the old commit
- name: Checkout PR base commit
if: github.event_name == 'pull_request'
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
ref: ${{ github.event.pull_request.base.sha }}
path: old
# not a PR: use the previous commit as the old commit
- name: Checkout previous commit
if: github.event_name != 'pull_request'
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 2
path: old
- name: Checkout previous commit
if: github.event_name != 'pull_request'
run: git -C old reset --hard HEAD^
- name: Checkout new commit
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
path: new
- name: Install circular import checker
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
repository: haampie/circular-import-fighter
ref: 4cdb0bf15f04ab6b49041d5ef1bfd9644cce7f33
path: circular-import-fighter
- name: Install dependencies
working-directory: circular-import-fighter
run: make -j dependencies
- name: Circular import check
working-directory: circular-import-fighter
run: make -j compare "SPACK_ROOT=../old ../new"


@@ -1,31 +0,0 @@
name: Windows Paraview Nightly
on:
schedule:
- cron: '0 2 * * *' # Run at 2 am
defaults:
run:
shell:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six pywin32 setuptools coverage
- name: Build Test
run: |
spack compiler find
spack external find cmake ninja win-sdk win-wdk wgl msmpi
spack -d install -y --cdash-upload-url https://cdash.spack.io/submit.php?project=Spack+on+Windows --cdash-track Nightly --only dependencies paraview
exit 0


@@ -1,108 +0,0 @@
name: prechecks
on:
workflow_call:
inputs:
with_coverage:
required: true
type: string
with_packages:
required: true
type: string
concurrency:
group: style-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
# Validate that the code can be run on all the Python versions supported by Spack
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.13'
cache: 'pip'
cache-dependency-path: '.github/workflows/requirements/style/requirements.txt'
- name: Install Python Packages
run: |
pip install -r .github/workflows/requirements/style/requirements.txt
- name: vermin (Spack's Core)
run: |
vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: |
vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv var/spack/repos var/spack/test_repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 2
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.13'
cache: 'pip'
cache-dependency-path: '.github/workflows/requirements/style/requirements.txt'
- name: Install Python packages
run: |
pip install -r .github/workflows/requirements/style/requirements.txt
- name: Run style tests
run: |
bin/spack style --base HEAD^1
bin/spack license verify
pylint -j $(nproc) --disable=all --enable=unspecified-encoding --ignore-paths=lib/spack/external lib
audit:
uses: ./.github/workflows/audit.yaml
secrets: inherit
with:
with_coverage: ${{ inputs.with_coverage }}
python_version: '3.13'
verify-checksums:
# do not run if the commit message or PR description contains [skip-verify-checksums]
if: >-
${{ inputs.with_packages == 'true' &&
!contains(github.event.pull_request.body, '[skip-verify-checksums]') &&
!contains(github.event.head_commit.message, '[skip-verify-checksums]') }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 2
- name: Verify Added Checksums
run: |
bin/spack ci verify-versions HEAD^1 HEAD
# Check that spack can bootstrap the development environment on Python 3.6 - RHEL8
bootstrap-dev-rhel8:
runs-on: ubuntu-latest
container: registry.access.redhat.com/ubi8/ubi
steps:
- name: Install dependencies
run: |
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory '*'
git fetch --unshallow
. .github/workflows/bin/setup_git.sh
useradd spack-test
chown -R spack-test .
- name: Bootstrap Spack development environment
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack debug report
spack -d bootstrap now --dev
spack -d style -t black
spack unit-test -V

View File

@@ -1,34 +0,0 @@
name: sync with spack/spack-packages
on:
push:
branches:
- develop
jobs:
sync:
if: github.repository == 'spack/spack'
runs-on: ubuntu-latest
steps:
- name: Checkout spack/spack
run: git clone https://github.com/spack/spack.git
- name: Checkout spack/spack-packages
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
ssh-key: ${{ secrets.SYNC_PACKAGES_KEY }}
path: spack-packages
repository: spack/spack-packages
- name: Install git-filter-repo
run: |
curl -LfsO https://raw.githubusercontent.com/newren/git-filter-repo/refs/tags/v2.47.0/git-filter-repo
echo "67447413e273fc76809289111748870b6f6072f08b17efe94863a92d810b7d94 git-filter-repo" | sha256sum -c -
chmod +x git-filter-repo
sudo mv git-filter-repo /usr/local/bin/
- name: Sync spack/spack-packages with spack/spack
run: |
cd spack-packages
git-filter-repo --quiet --source ../spack --subdirectory-filter var/spack/repos --refs develop
- name: Push
run: |
cd spack-packages
git push git@github.com:spack/spack-packages.git develop:develop --force

View File

@@ -1,12 +1,7 @@
name: unit tests
on:
workflow_dispatch:
workflow_call:
concurrency:
group: unit_tests-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
pull_request:
jobs:
# Run unit tests with different configurations on linux
@@ -88,146 +83,10 @@ jobs:
name: coverage-${{ matrix.os }}-python${{ matrix.python-version }}
path: coverage
include-hidden-files: true
# Test shell integration
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.11'
- name: Install System packages
run: |
sudo apt-get -y update
# Needed for shell tests
sudo apt-get install -y coreutils csh zsh tcsh fish dash bash subversion
# On Ubuntu 24.04, kcov was removed. It may come back in a future Ubuntu release
- name: Set up Homebrew
id: set-up-homebrew
uses: Homebrew/actions/setup-homebrew@40e9946c182a64b3db1bf51be0dcb915f7802aa9
- name: Install kcov with brew
run: "brew install kcov"
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml] pytest-xdist
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/bin/setup_git.sh
- name: Run shell tests
env:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b
with:
name: coverage-shell
path: coverage
include-hidden-files: true
- name: Setup tmate session
if: ${{ failure() }}
uses: mxschmitt/action-tmate@v3
# Test RHEL8 UBI with platform Python. This job is run
# only on PRs modifying core Spack
rhel8-platform-python:
runs-on: ubuntu-latest
container: registry.access.redhat.com/ubi8/ubi
steps:
- name: Install dependencies
run: |
dnf install -y \
bzip2 curl gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory '*'
git fetch --unshallow
. .github/workflows/bin/setup_git.sh
useradd spack-test
chown -R spack-test .
- name: Run unit tests
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack -d bootstrap now --dev
spack unit-test -k 'not cvs and not svn and not hg' -x --verbose
# Test for the clingo based solver (using clingo-cffi)
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.13'
- name: Install System packages
run: |
sudo apt-get -y update
sudo apt-get -y install coreutils gfortran graphviz gnupg2
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml] pytest-cov clingo
pip install --upgrade flake8 "isort>=4.3.5" "mypy>=0.900" "click" "black"
- name: Run unit tests (full suite with coverage)
env:
COVERAGE: true
COVERAGE_FILE: coverage/.coverage-clingo-cffi
run: |
. share/spack/setup-env.sh
spack bootstrap disable spack-install
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.6
spack bootstrap status
spack solve zlib
spack unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml lib/spack/spack/test/concretization/core.py
- uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b
with:
name: coverage-clingo-cffi
path: coverage
include-hidden-files: true
# Run unit tests on macOS
macos:
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
run: |
pip install --upgrade pip setuptools
pip install --upgrade pytest coverage[toml] pytest-xdist pytest-cov
- name: Setup Homebrew packages
run: |
brew install dash fish gcc gnupg kcov
- name: Run unit tests
env:
SPACK_TEST_PARALLEL: 4
COVERAGE_FILE: coverage/.coverage-${{ matrix.os }}-python${{ matrix.python-version }}
run: |
git --version
. .github/workflows/bin/setup_git.sh
. share/spack/setup-env.sh
$(which spack) bootstrap disable spack-install
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b
with:
name: coverage-${{ matrix.os }}-python${{ matrix.python-version }}
path: coverage
include-hidden-files: true
# Run unit tests on Windows
windows:
defaults:
run:
@@ -258,3 +117,6 @@ jobs:
name: coverage-windows
path: coverage
include-hidden-files: true
- name: Setup tmate session
if: ${{ failure() }}
uses: mxschmitt/action-tmate@v3

View File

@@ -539,7 +539,9 @@ from the command line.
You can also include an environment directly in the ``spack.yaml`` file. It
involves adding the ``include_concrete`` heading in the yaml followed by the
absolute path to the independent environments.
absolute path to the independent environments. Note that you may use Spack
config variables such as ``$spack`` or environment variables as long as the
expression expands to an absolute path.
.. code-block:: yaml
@@ -549,7 +551,7 @@ absolute path to the independent environments.
unify: true
include_concrete:
- /absolute/path/to/environment1
- /absolute/path/to/environment2
- $spack/../path/to/environment2
Once the ``spack.yaml`` has been updated you must concretize the environment to

View File

@@ -497,7 +497,7 @@ extends Spack's ``Package`` class. For example, here is
.. code-block:: python
:linenos:
from spack import *
from spack.package import *
class Libelf(Package):
""" ... description ... """
@@ -1089,7 +1089,7 @@ You've already seen the ``homepage`` and ``url`` package attributes:
.. code-block:: python
:linenos:
from spack import *
from spack.package import *
class Mpich(Package):
@@ -6183,7 +6183,7 @@ running:
.. code-block:: python
from spack import *
from spack.package import *
This is already part of the boilerplate for packages created with
``spack create``.

View File

@@ -350,7 +350,7 @@ def _ensure_no_folders_without_package_py(error_cls):
for repository in spack.repo.PATH.repos:
missing = []
for entry in os.scandir(repository.packages_path):
if not entry.is_dir():
if not entry.is_dir() or entry.name == "__pycache__":
continue
package_py = pathlib.Path(entry.path) / spack.repo.package_file_name
if not package_py.exists():

View File

@@ -717,7 +717,11 @@ def _read_specs_and_push_index(
temp_dir: Location to write index.json and hash for pushing
"""
for file in file_list:
fetched_spec = spack.spec.Spec.from_dict(read_method(file))
try:
fetched_spec = spack.spec.Spec.from_dict(read_method(file))
except Exception as e:
tty.warn(f"Unable to fetch spec for manifest {file} due to: {e}")
continue
db.add(fetched_spec)
db.mark(fetched_spec, "in_buildcache", True)
@@ -750,6 +754,7 @@ def file_read_method(manifest_path):
cache_entry.destroy()
return spec_dict
url_to_list = url_util.join(url, buildcache_relative_specs_url())
sync_command_args = [
"s3",
"sync",
@@ -757,11 +762,11 @@ def file_read_method(manifest_path):
"*",
"--include",
"*.spec.manifest.json",
url,
url_to_list,
tmpspecsdir,
]
tty.debug(f"Using aws s3 sync to download manifests from {url} to {tmpspecsdir}")
tty.debug(f"Using aws s3 sync to download manifests from {url_to_list} to {tmpspecsdir}")
try:
aws(*sync_command_args, output=os.devnull, error=os.devnull)
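The try/except added around `Spec.from_dict` lets one malformed manifest be warned about and skipped instead of aborting the whole index rebuild. A minimal sketch of the same defensive pattern (the names here are illustrative, not Spack's API):

```python
import json
import warnings

def read_all(items, parse):
    """Parse each item; warn and skip on failure instead of aborting the batch."""
    results = []
    for item in items:
        try:
            results.append(parse(item))
        except Exception as exc:
            warnings.warn(f"skipping {item!r}: {exc}")
    return results

# One malformed entry does not prevent the valid ones from loading.
specs = read_all(['{"a": 1}', "not json", '{"b": 2}'], json.loads)
```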

View File

@@ -190,7 +190,7 @@ def archspec_name(self) -> str:
def cc(self) -> Optional[str]:
assert self.spec.concrete, "cannot retrieve C compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("c", None)
return self.spec.extra_attributes.get("compilers", {}).get("c", None)
return self._cc_path()
def _cc_path(self) -> Optional[str]:
@@ -201,7 +201,7 @@ def _cc_path(self) -> Optional[str]:
def cxx(self) -> Optional[str]:
assert self.spec.concrete, "cannot retrieve C++ compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("cxx", None)
return self.spec.extra_attributes.get("compilers", {}).get("cxx", None)
return self._cxx_path()
def _cxx_path(self) -> Optional[str]:
@@ -212,7 +212,7 @@ def _cxx_path(self) -> Optional[str]:
def fortran(self):
assert self.spec.concrete, "cannot retrieve Fortran compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("fortran", None)
return self.spec.extra_attributes.get("compilers", {}).get("fortran", None)
return self._fortran_path()
def _fortran_path(self) -> Optional[str]:
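The change in each property replaces direct indexing (`extra_attributes["compilers"]`) with a chained `.get`, so an external spec declared without a `compilers` entry yields `None` instead of raising `KeyError`. A toy illustration of the difference:

```python
extra_attributes = {}  # an external spec declared without a "compilers" entry

# Direct indexing raises KeyError when "compilers" is absent:
try:
    cc = extra_attributes["compilers"].get("c", None)
except KeyError:
    cc = "raised KeyError"

# Chained .get with a {} fallback degrades gracefully to None:
cc_safe = extra_attributes.get("compilers", {}).get("c", None)
```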

View File

@@ -56,7 +56,7 @@ def is_package(f):
"""Whether flake8 should consider a file as a core file or a package.
We run flake8 with different exceptions for the core and for
packages, since we allow `from spack import *` and poking globals
packages, since we allow `from spack.package import *` and poking globals
into packages.
"""
return f.startswith("var/spack/") and f.endswith("package.py")

View File

@@ -1049,7 +1049,11 @@ def add_view(name, values):
def _process_concrete_includes(self):
"""Extract and load into memory included concrete spec data."""
self.included_concrete_envs = self.manifest[TOP_LEVEL_KEY].get(included_concrete_name, [])
_included_concrete_envs = self.manifest[TOP_LEVEL_KEY].get(included_concrete_name, [])
# Expand config and environment variables
self.included_concrete_envs = [
spack.util.path.canonicalize_path(_env) for _env in _included_concrete_envs
]
if self.included_concrete_envs:
if os.path.exists(self.lock_path):
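The new code runs each included path through `spack.util.path.canonicalize_path` so config and environment variables are expanded before use. Roughly the same effect can be sketched with the standard library (this is an analogue, not Spack's actual implementation, and `MY_ENVS` is a hypothetical variable):

```python
import os

os.environ["MY_ENVS"] = "/opt/envs"  # hypothetical variable for the example

def canonicalize(path):
    # Expand ~ and environment variables, then make the result absolute.
    return os.path.abspath(os.path.expandvars(os.path.expanduser(path)))

included = ["$MY_ENVS/environment1", "$MY_ENVS/environment2"]
expanded = [canonicalize(p) for p in included]
```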

View File

@@ -550,7 +550,6 @@ def setup_main_options(args):
spack.config.CONFIG.scopes["command_line"].sections["repos"] = syaml.syaml_dict(
[(key, [spack.paths.mock_packages_path])]
)
spack.repo.PATH = spack.repo.create(spack.config.CONFIG)
# If the user asked for it, don't check ssl certs.
if args.insecure:
@@ -561,6 +560,8 @@ def setup_main_options(args):
for config_var in args.config_vars or []:
spack.config.add(fullpath=config_var, scope="command_line")
spack.repo.enable_repo(spack.repo.create(spack.config.CONFIG))
# On Windows10 console handling for ASCI/VT100 sequences is not
# on by default. Turn on before we try to write to console
# with color

View File

@@ -91,29 +91,8 @@ class ReposFinder:
Returns a loader based on the inspection of the current repository list.
"""
def __init__(self):
self._repo_init = _path
self._repo: Optional[RepoType] = None
@property
def current_repository(self):
if self._repo is None:
self._repo = self._repo_init()
return self._repo
@current_repository.setter
def current_repository(self, value):
self._repo = value
@contextlib.contextmanager
def switch_repo(self, substitute: "RepoType"):
"""Switch the current repository list for the duration of the context manager."""
old = self._repo
try:
self._repo = substitute
yield
finally:
self._repo = old
#: The current list of repositories.
repo_path: "RepoPath"
def find_spec(self, fullname, python_path, target=None):
# "target" is not None only when calling importlib.reload()
@@ -134,14 +113,11 @@ def compute_loader(self, fullname: str):
namespace, dot, module_name = fullname.rpartition(".")
# If it's a module in some repo, or if it is the repo's namespace, let the repo handle it.
current_repo = self.current_repository
is_repo_path = isinstance(current_repo, RepoPath)
if is_repo_path:
repos = current_repo.repos
else:
repos = [current_repo]
for repo in repos:
if not hasattr(self, "repo_path"):
return None
for repo in self.repo_path.repos:
# We are using the namespace of the repo and the repo contains the package
if namespace == repo.full_namespace:
# With 2 nested conditionals we can call "repo.real_name" only once
@@ -156,9 +132,7 @@ def compute_loader(self, fullname: str):
# No repo provides the namespace, but it is a valid prefix of
# something in the RepoPath.
if is_repo_path and current_repo.by_namespace.is_prefix(
fullname[len(PKG_MODULE_PREFIX_V1) :]
):
if self.repo_path.by_namespace.is_prefix(fullname[len(PKG_MODULE_PREFIX_V1) :]):
return SpackNamespaceLoader()
return None
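`ReposFinder` and `SpackNamespaceLoader` plug into Python's import machinery: a finder on `sys.meta_path` claims module names under the repo namespace and a loader materializes them. A minimal sketch of that mechanism with illustrative names (not Spack's classes):

```python
import importlib
import importlib.abc
import importlib.machinery
import sys
import types

class DemoNamespaceLoader(importlib.abc.Loader):
    """Create an empty module object; nothing to execute for a pure namespace."""
    def create_module(self, spec):
        return types.ModuleType(spec.name)
    def exec_module(self, module):
        pass

class DemoNamespaceFinder(importlib.abc.MetaPathFinder):
    """Serve empty namespace modules for names under a fixed prefix."""
    prefix = "demo_pkg_namespace"
    def find_spec(self, fullname, path=None, target=None):
        if fullname != self.prefix and not fullname.startswith(self.prefix + "."):
            return None  # let the other finders on sys.meta_path handle it
        return importlib.machinery.ModuleSpec(fullname, DemoNamespaceLoader())

sys.meta_path.insert(0, DemoNamespaceFinder())
demo = importlib.import_module("demo_pkg_namespace")
```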
@@ -258,6 +232,8 @@ def get_all_package_diffs(type: str, repo: "Repo", rev1="HEAD^1", rev2="HEAD") -
changed: Set[str] = set()
for path in lines:
dir_name, _, _ = path.partition("/")
if not nm.valid_module_name(dir_name, repo.package_api):
continue
pkg_name = nm.pkg_dir_to_pkg_name(dir_name, repo.package_api)
if pkg_name not in added and pkg_name not in removed:
changed.add(pkg_name)
@@ -662,7 +638,6 @@ def __init__(
if isinstance(repo, str):
assert cache is not None, "cache must hold a value, when repo is a string"
repo = Repo(repo, cache=cache, overrides=overrides)
repo.finder(self)
self.put_last(repo)
except RepoError as e:
tty.warn(
@@ -672,6 +647,20 @@ def __init__(
f" spack repo rm {repo}",
)
def enable(self) -> None:
"""Set the relevant search paths for package module loading"""
REPOS_FINDER.repo_path = self
for p in reversed(self.python_paths()):
if p not in sys.path:
sys.path.insert(0, p)
def disable(self) -> None:
"""Disable the search paths for package module loading"""
del REPOS_FINDER.repo_path
for p in self.python_paths():
if p in sys.path:
sys.path.remove(p)
def ensure_unwrapped(self) -> "RepoPath":
"""Ensure we unwrap this object from any dynamic wrapper (like Singleton)"""
return self
@@ -845,9 +834,6 @@ def python_paths(self) -> List[str]:
def get_pkg_class(self, pkg_name: str) -> Type["spack.package_base.PackageBase"]:
"""Find a class for the spec's package and return the class object."""
for p in self.python_paths():
if p not in sys.path:
sys.path.insert(0, p)
return self.repo_for_pkg(pkg_name).get_pkg_class(pkg_name)
@autospec
@@ -1094,9 +1080,6 @@ def check(condition, msg):
# Class attribute overrides by package name
self.overrides = overrides or {}
# Optional reference to a RepoPath to influence module import from spack.pkg
self._finder: Optional[RepoPath] = None
# Map from package name to corresponding file stat
self._fast_package_checker: Optional[FastPackageChecker] = None
@@ -1108,9 +1091,6 @@ def check(condition, msg):
def package_api_str(self) -> str:
return f"v{self.package_api[0]}.{self.package_api[1]}"
def finder(self, value: RepoPath) -> None:
self._finder = value
def real_name(self, import_name: str) -> Optional[str]:
"""Allow users to import Spack packages using Python identifiers.
@@ -1363,11 +1343,9 @@ def get_pkg_class(self, pkg_name: str) -> Type["spack.package_base.PackageBase"]
fullname += ".package"
class_name = nm.pkg_name_to_class_name(pkg_name)
if self.python_path and self.python_path not in sys.path:
sys.path.insert(0, self.python_path)
try:
with REPOS_FINDER.switch_repo(self._finder or self):
module = importlib.import_module(fullname)
module = importlib.import_module(fullname)
except ImportError as e:
raise UnknownPackageError(fullname) from e
except Exception as e:
@@ -1560,12 +1538,6 @@ def create_or_construct(
return from_path(repo_yaml_dir)
def _path(configuration=None):
"""Get the singleton RepoPath instance for Spack."""
configuration = configuration or spack.config.CONFIG
return create(configuration=configuration)
def create(configuration: spack.config.Configuration) -> RepoPath:
"""Create a RepoPath from a configuration object.
@@ -1588,8 +1560,10 @@ def create(configuration: spack.config.Configuration) -> RepoPath:
return RepoPath(*repo_dirs, cache=spack.caches.MISC_CACHE, overrides=overrides)
#: Singleton repo path instance
PATH: RepoPath = llnl.util.lang.Singleton(_path) # type: ignore
#: Global package repository instance.
PATH: RepoPath = llnl.util.lang.Singleton(
lambda: create(configuration=spack.config.CONFIG)
) # type: ignore[assignment]
# Add the finder to sys.meta_path
REPOS_FINDER = ReposFinder()
@@ -1615,20 +1589,27 @@ def use_repositories(
Returns:
Corresponding RepoPath object
"""
global PATH
paths = [getattr(x, "root", x) for x in paths_and_repos]
scope_name = "use-repo-{}".format(uuid.uuid4())
scope_name = f"use-repo-{uuid.uuid4()}"
repos_key = "repos:" if override else "repos"
spack.config.CONFIG.push_scope(
spack.config.InternalConfigScope(name=scope_name, data={repos_key: paths})
)
PATH, saved = create(configuration=spack.config.CONFIG), PATH
old_repo, new_repo = PATH, create(configuration=spack.config.CONFIG)
old_repo.disable()
enable_repo(new_repo)
try:
with REPOS_FINDER.switch_repo(PATH): # type: ignore
yield PATH
yield new_repo
finally:
spack.config.CONFIG.remove_scope(scope_name=scope_name)
PATH = saved
enable_repo(old_repo)
def enable_repo(repo_path: RepoPath) -> None:
"""Set the global package repository and make them available in module search paths."""
global PATH
PATH = repo_path
PATH.enable()
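After this refactor, `use_repositories` swaps the global `PATH` via `enable_repo` and restores the previous repo in a `finally` block. The shape of that swap-and-restore pattern, reduced to its essentials (a sketch with stand-in values, not Spack's real objects):

```python
import contextlib

PATH = "builtin-repo"  # stands in for the global RepoPath singleton

def enable_repo(repo):
    """Install a repo as the global one (sys.path juggling omitted here)."""
    global PATH
    PATH = repo

@contextlib.contextmanager
def use_repositories(repo):
    """Temporarily swap the global repo, restoring the old one on exit."""
    old = PATH
    enable_repo(repo)
    try:
        yield PATH
    finally:
        enable_repo(old)

with use_repositories("mock-repo") as active:
    inside = (active, PATH)
after = PATH
```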
class MockRepositoryBuilder:

View File

@@ -128,8 +128,6 @@ class Provenance(enum.IntEnum):
SPEC = enum.auto()
# A dev spec literal
DEV_SPEC = enum.auto()
# An external spec declaration
EXTERNAL = enum.auto()
# The 'packages' section of the configuration
PACKAGES_YAML = enum.auto()
# A package requirement
@@ -138,6 +136,8 @@ class Provenance(enum.IntEnum):
PACKAGE_PY = enum.auto()
# An installed spec
INSTALLED = enum.auto()
# An external spec declaration
EXTERNAL = enum.auto()
# lower provenance for installed git refs so concretizer prefers StandardVersion installs
INSTALLED_GIT_VERSION = enum.auto()
# A runtime injected from another package (e.g. a compiler)
@@ -2512,7 +2512,22 @@ def _spec_clauses(
if self.pkg_class(spec.name).has_variant(vname):
clauses.append(f.variant_value(spec.name, vname, value))
else:
clauses.append(f.variant_value(spec.name, vname, value))
variant_clause = f.variant_value(spec.name, vname, value)
if (
variant.concrete
and variant.type == vt.VariantType.MULTI
and not spec.concrete
):
if body is False:
variant_clause.args = (
f"concrete_{variant_clause.args[0]}",
*variant_clause.args[1:],
)
else:
clauses.append(
fn.attr("concrete_variant_request", spec.name, vname, value)
)
clauses.append(variant_clause)
# compiler flags
source = context.source if context else "none"

View File

@@ -159,10 +159,12 @@ unification_set(SetID, VirtualNode)
% TODO: literals, at the moment, can only influence the "root" unification set. This needs to be extended later.
% Node attributes that have multiple node arguments (usually, only the first argument is a node)
multiple_nodes_attribute("depends_on").
multiple_nodes_attribute("virtual_on_edge").
multiple_nodes_attribute("provider_set").
% Node attributes that need custom rules in ASP, e.g. because they involve multiple nodes
node_attributes_with_custom_rules("depends_on").
node_attributes_with_custom_rules("virtual_on_edge").
node_attributes_with_custom_rules("provider_set").
node_attributes_with_custom_rules("concrete_variant_set").
node_attributes_with_custom_rules("concrete_variant_request").
trigger_condition_holds(TriggerID, node(min_dupe_id, Package)) :-
solve_literal(TriggerID),
@@ -397,12 +399,26 @@ trigger_condition_holds(ID, RequestorNode) :-
trigger_node(ID, PackageNode, RequestorNode);
attr(Name, node(X, A1)) : condition_requirement(ID, Name, A1), condition_nodes(ID, PackageNode, node(X, A1));
attr(Name, node(X, A1), A2) : condition_requirement(ID, Name, A1, A2), condition_nodes(ID, PackageNode, node(X, A1));
attr(Name, node(X, A1), A2, A3) : condition_requirement(ID, Name, A1, A2, A3), condition_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name);
attr(Name, node(X, A1), A2, A3) : condition_requirement(ID, Name, A1, A2, A3), condition_nodes(ID, PackageNode, node(X, A1)), not node_attributes_with_custom_rules(Name);
attr(Name, node(X, A1), A2, A3, A4) : condition_requirement(ID, Name, A1, A2, A3, A4), condition_nodes(ID, PackageNode, node(X, A1));
% Special cases
attr("depends_on", node(X, A1), node(Y, A2), A3) : condition_requirement(ID, "depends_on", A1, A2, A3), condition_nodes(ID, PackageNode, node(X, A1)), condition_nodes(ID, PackageNode, node(Y, A2));
not cannot_hold(ID, PackageNode).
condition_with_concrete_variant(ID, Package, Variant) :- condition_requirement(ID, "concrete_variant_request", Package, Variant, _).
cannot_hold(ID, PackageNode) :-
not attr("variant_value", node(X, A1), Variant, Value),
condition_with_concrete_variant(ID, A1, Variant),
condition_requirement(ID, "concrete_variant_request", A1, Variant, Value),
condition_nodes(ID, PackageNode, node(X, A1)).
cannot_hold(ID, PackageNode) :-
attr("variant_value", node(X, A1), Variant, Value),
condition_with_concrete_variant(ID, A1, Variant),
not condition_requirement(ID, "concrete_variant_request", A1, Variant, Value),
condition_nodes(ID, PackageNode, node(X, A1)).
condition_holds(ConditionID, node(X, Package))
:- pkg_fact(Package, condition_trigger(ConditionID, TriggerID)),
trigger_condition_holds(TriggerID, node(X, Package)).
@@ -449,8 +465,8 @@ imposed_nodes(ConditionID, PackageNode, node(X, A1))
% Conditions that hold may impose constraints on other specs
attr(Name, node(X, A1)) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1), imposed_nodes(ID, PackageNode, node(X, A1)).
attr(Name, node(X, A1), A2) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2), imposed_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name).
attr(Name, node(X, A1), A2, A3) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3), imposed_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name).
attr(Name, node(X, A1), A2) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2), imposed_nodes(ID, PackageNode, node(X, A1)), not node_attributes_with_custom_rules(Name).
attr(Name, node(X, A1), A2, A3) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3), imposed_nodes(ID, PackageNode, node(X, A1)), not node_attributes_with_custom_rules(Name).
attr(Name, node(X, A1), A2, A3, A4) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3, A4), imposed_nodes(ID, PackageNode, node(X, A1)).
% Provider set is relevant only for literals, since it's the only place where `^[virtuals=foo] bar`
@@ -471,6 +487,15 @@ provider(ProviderNode, VirtualNode) :- attr("provider_set", ProviderNode, Virtua
imposed_constraint(ID, "depends_on", A1, A2, A3),
internal_error("Build deps must land in exactly one duplicate").
% For := we must keep track of the origin of the fact, since we need to check
% each condition separately, i.e. foo:=a,b in one place and foo:=c in another
% should not make foo:=a,b,c possible
attr("concrete_variant_set", node(X, A1), Variant, Value, ID)
:- impose(ID, PackageNode),
imposed_nodes(ID, PackageNode, node(X, A1)),
imposed_constraint(ID, "concrete_variant_set", A1, Variant, Value).
% The rule below accounts for expressions like:
%
% root ^dep %compiler
@@ -1149,6 +1174,22 @@ error(100, "No valid value for variant '{1}' of package '{0}'", Package, Variant
% if a variant is set to anything, it is considered 'set'.
attr("variant_set", PackageNode, Variant) :- attr("variant_set", PackageNode, Variant, _).
% Setting a concrete variant implies setting a variant
concrete_variant_value(PackageNode, Variant, Value, Origin) :- attr("concrete_variant_set", PackageNode, Variant, Value, Origin).
attr("variant_set", PackageNode, Variant, Value) :- attr("concrete_variant_set", PackageNode, Variant, Value, _).
% Concrete variant values must be in the answer set
:- concrete_variant_value(PackageNode, Variant, Value, _), not attr("variant_value", PackageNode, Variant, Value).
% Extra variant values are not allowed, if the variant is concrete
variant_is_concrete(PackageNode, Variant, Origin) :- concrete_variant_value(PackageNode, Variant, _, Origin).
error(100, "The variant {0} in package {1} specified as := has the extra value {2}", Variant, PackageNode, Value)
:- variant_is_concrete(PackageNode, Variant, Origin),
attr("variant_value", PackageNode, Variant, Value),
not concrete_variant_value(PackageNode, Variant, Value, Origin).
% A variant cannot have a value that is not also a possible value
% This only applies to packages we need to build -- concrete packages may
% have been built w/different variants from older/different package versions.
@@ -1638,7 +1679,12 @@ build(PackageNode) :- attr("node", PackageNode), not concrete(PackageNode).
% 200 - 299 Shifted priorities for build nodes; correspond to priorities 0 - 99.
% 100 - 199 Unshifted priorities. Currently only includes minimizing #builds and minimizing dupes.
% 0 - 99 Priorities for non-built nodes.
build_priority(PackageNode, 200) :- build(PackageNode), attr("node", PackageNode).
treat_node_as_concrete(node(X, Package)) :- external(node(X, Package)).
treat_node_as_concrete(node(X, Package)) :- attr("node", node(X, Package)), runtime(Package).
build_priority(PackageNode, 200) :- build(PackageNode), attr("node", PackageNode), not treat_node_as_concrete(PackageNode).
build_priority(PackageNode, 0) :- build(PackageNode), attr("node", PackageNode), treat_node_as_concrete(PackageNode).
build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", PackageNode).
% don't assign versions from installed packages unless reuse is enabled
@@ -1695,7 +1741,7 @@ opt_criterion(310, "requirement weight").
% Try hard to reuse installed packages (i.e., minimize the number built)
opt_criterion(110, "number of packages to build (vs. reuse)").
#minimize { 0@110: #true }.
#minimize { 1@110,PackageNode : build(PackageNode) }.
#minimize { 1@110,PackageNode : build(PackageNode), not treat_node_as_concrete(PackageNode) }.
opt_criterion(100, "number of nodes from the same package").
#minimize { 0@100: #true }.
@@ -1860,48 +1906,59 @@ opt_criterion(10, "target mismatches").
not runtime(Dependency)
}.
opt_criterion(5, "non-preferred targets").
#minimize{ 0@205: #true }.
#minimize{ 0@5: #true }.
opt_criterion(7, "non-preferred targets").
#minimize{ 0@207: #true }.
#minimize{ 0@7: #true }.
#minimize{
Weight@5+Priority,node(X, Package)
Weight@7+Priority,node(X, Package)
: node_target_weight(node(X, Package), Weight),
build_priority(node(X, Package), Priority),
not runtime(Package)
}.
opt_criterion(4, "preferred providers (language runtimes)").
#minimize{ 0@204: #true }.
#minimize{ 0@4: #true }.
opt_criterion(5, "preferred providers (language runtimes)").
#minimize{ 0@205: #true }.
#minimize{ 0@5: #true }.
#minimize{
Weight@4+Priority,ProviderNode,X,Virtual
Weight@5+Priority,ProviderNode,X,Virtual
: provider_weight(ProviderNode, node(X, Virtual), Weight),
language_runtime(Virtual),
build_priority(ProviderNode, Priority)
}.
% Choose more recent versions for runtimes
opt_criterion(3, "version badness (runtimes)").
#minimize{ 0@203: #true }.
#minimize{ 0@3: #true }.
opt_criterion(4, "version badness (runtimes)").
#minimize{ 0@204: #true }.
#minimize{ 0@4: #true }.
#minimize{
Weight@3,node(X, Package)
Weight@4,node(X, Package)
: version_weight(node(X, Package), Weight),
runtime(Package)
}.
% Choose best target for runtimes
opt_criterion(2, "non-preferred targets (runtimes)").
#minimize{ 0@202: #true }.
#minimize{ 0@2: #true }.
opt_criterion(3, "non-preferred targets (runtimes)").
#minimize{ 0@203: #true }.
#minimize{ 0@3: #true }.
#minimize{
Weight@2,node(X, Package)
Weight@3,node(X, Package)
: node_target_weight(node(X, Package), Weight),
runtime(Package)
}.
% Choose more recent versions for nodes
opt_criterion(1, "edge wiring").
opt_criterion(2, "providers on edges").
#minimize{ 0@202: #true }.
#minimize{ 0@2: #true }.
#minimize{
Weight@2,ParentNode,ProviderNode,Virtual
: provider_weight(ProviderNode, Virtual, Weight),
not attr("root", ProviderNode),
depends_on(ParentNode, ProviderNode)
}.
% Choose more recent versions for nodes
opt_criterion(1, "version badness on edges").
#minimize{ 0@201: #true }.
#minimize{ 0@1: #true }.
#minimize{
@@ -1912,14 +1969,6 @@ opt_criterion(1, "edge wiring").
}.
#minimize{ 0@201: #true }.
#minimize{ 0@1: #true }.
#minimize{
Weight@1,ParentNode,ProviderNode,Virtual
: provider_weight(ProviderNode, Virtual, Weight),
not attr("root", ProviderNode),
depends_on(ParentNode, ProviderNode)
}.
%-----------
% Notes

View File

@@ -4670,6 +4670,9 @@ def substitute_abstract_variants(spec: Spec):
# in $spack/lib/spack/spack/spec_list.py
unknown = []
for name, v in spec.variants.items():
if v.concrete and v.type == vt.VariantType.MULTI:
continue
if name == "dev_path":
v.type = vt.VariantType.SINGLE
v.concrete = True

View File

@@ -106,7 +106,7 @@ def __init__(self):
def restore(self):
spack.config.CONFIG = self.config
spack.repo.PATH = spack.repo.create(self.config)
spack.repo.enable_repo(spack.repo.create(self.config))
spack.platforms.host = self.platform
spack.store.STORE = self.store
self.test_patches.restore()
@@ -129,7 +129,6 @@ def restore(self):
def store_patches():
global patches
module_patches = list()
class_patches = list()
if not patches:

View File

@@ -2028,13 +2028,12 @@ def test_ci_verify_versions_valid(
tmpdir,
):
repo, _, commits = mock_git_package_changes
spack.repo.PATH.put_first(repo)
with spack.repo.use_repositories(repo):
monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
out = ci_cmd("verify-versions", commits[-1], commits[-3])
assert "Validated diff-test@2.1.5" in out
assert "Validated diff-test@2.1.6" in out
out = ci_cmd("verify-versions", commits[-1], commits[-3])
assert "Validated diff-test@2.1.5" in out
assert "Validated diff-test@2.1.6" in out
def test_ci_verify_versions_standard_invalid(
@@ -2045,23 +2044,21 @@ def test_ci_verify_versions_standard_invalid(
verify_git_versions_invalid,
):
repo, _, commits = mock_git_package_changes
spack.repo.PATH.put_first(repo)
monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
out = ci_cmd("verify-versions", commits[-1], commits[-3], fail_on_error=False)
assert "Invalid checksum found diff-test@2.1.5" in out
assert "Invalid commit for diff-test@2.1.6" in out
with spack.repo.use_repositories(repo):
    monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
    out = ci_cmd("verify-versions", commits[-1], commits[-3], fail_on_error=False)
    assert "Invalid checksum found diff-test@2.1.5" in out
    assert "Invalid commit for diff-test@2.1.6" in out
def test_ci_verify_versions_manual_package(monkeypatch, mock_packages, mock_git_package_changes):
repo, _, commits = mock_git_package_changes
spack.repo.PATH.put_first(repo)
monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
pkg_class = spack.spec.Spec("diff-test").package_class
monkeypatch.setattr(pkg_class, "manual_download", True)
out = ci_cmd("verify-versions", commits[-1], commits[-2])
assert "Skipping manual download package: diff-test" in out
with spack.repo.use_repositories(repo):
    monkeypatch.setattr(spack.repo, "builtin_repo", lambda: repo)
    pkg_class = spack.spec.Spec("diff-test").package_class
    monkeypatch.setattr(pkg_class, "manual_download", True)
    out = ci_cmd("verify-versions", commits[-1], commits[-2])
    assert "Skipping manual download package: diff-test" in out

View File

@@ -448,7 +448,7 @@ def test_find_loaded(database, working_env):
@pytest.mark.regression("37712")
def test_environment_with_version_range_in_compiler_doesnt_fail(tmp_path):
def test_environment_with_version_range_in_compiler_doesnt_fail(tmp_path, mock_packages):
"""Tests that having an active environment with a root spec containing a compiler constrained
by a version range (i.e. @X.Y rather than the single version @=X.Y) doesn't result in an error
when invoking "spack find".

View File

@@ -188,6 +188,7 @@ class Root(Package):
url = "http://www.example.com/root-1.0.tar.gz"
version("1.0", sha256="abcde")
depends_on("middle")
depends_on("changing")
conflicts("^changing~foo")
@@ -196,6 +197,20 @@ class Root(Package):
package_py.parent.mkdir(parents=True)
package_py.write_text(root_pkg_str)
middle_pkg_str = """
from spack.package import *
class Middle(Package):
homepage = "http://www.example.com"
url = "http://www.example.com/root-1.0.tar.gz"
version("1.0", sha256="abcde")
depends_on("changing")
"""
package_py = packages_dir / "middle" / "package.py"
package_py.parent.mkdir(parents=True)
package_py.write_text(middle_pkg_str)
changing_template = """
from spack.package import *
@@ -445,9 +460,7 @@ def test_mixing_compilers_only_affects_subdag(self):
if "c" not in x or not x.name.startswith("dt-diamond"):
continue
expected_gcc = x.name != "dt-diamond"
assert (
bool(x.dependencies(name="llvm", deptype="build")) is not expected_gcc
), x.long_spec
assert bool(x.dependencies(name="llvm", deptype="build")) is not expected_gcc, x.tree()
assert bool(x.dependencies(name="gcc", deptype="build")) is expected_gcc
assert x.satisfies("%clang") is not expected_gcc
assert x.satisfies("%gcc") is expected_gcc
@@ -1007,11 +1020,17 @@ def test_concretize_anonymous_dep(self, spec_str):
("bowtie@1.4.0", "%gcc@10.2.1"),
# Version with conflicts and no valid gcc select another compiler
("bowtie@1.3.0", "%clang@15.0.0"),
# If a higher gcc is available, with a worse os, still prefer that
("bowtie@1.2.2", "%gcc@11.1.0"),
# If a higher gcc is available, with a worse os, still prefer that,
# assuming the two operating systems are compatible
("bowtie@1.2.2 %gcc", "%gcc@11.1.0"),
],
)
def test_compiler_conflicts_in_package_py(self, spec_str, expected_str, gcc11_with_flags):
def test_compiler_conflicts_in_package_py(
self, spec_str, expected_str, gcc11_with_flags, mutable_config
):
mutable_config.set(
"concretizer:os_compatible", {"debian6": ["redhat6"], "redhat6": ["debian6"]}
)
with spack.config.override("packages", {"gcc": {"externals": [gcc11_with_flags]}}):
s = spack.concretize.concretize_one(spec_str)
assert s.satisfies(expected_str)
@@ -1224,20 +1243,6 @@ def test_activating_test_dependencies(self, spec_str, tests_arg, with_dep, witho
node = s[pkg_name]
assert not node.dependencies(deptype="test"), msg.format(pkg_name)
@pytest.mark.regression("20019")
def test_compiler_match_is_preferred_to_newer_version(self, compiler_factory):
# This spec depends on openblas. Openblas has a conflict
# that doesn't allow newer versions with gcc@4.4.0. Check
# that an old version of openblas is selected, rather than
# a different compiler for just that node.
with spack.config.override(
"packages", {"gcc": {"externals": [compiler_factory(spec="gcc@10.1.0 os=redhat6")]}}
):
spec_str = "simple-inheritance+openblas os=redhat6 %gcc@10.1.0"
s = spack.concretize.concretize_one(spec_str)
assert "openblas@0.2.15" in s
assert s["openblas"].satisfies("%gcc@10.1.0")
@pytest.mark.regression("19981")
def test_target_ranges_in_conflicts(self):
with pytest.raises(spack.error.SpackError):
@@ -1863,7 +1868,7 @@ def test_misleading_error_message_on_version(self, mutable_database):
@pytest.mark.regression("31148")
def test_version_weight_and_provenance(self):
"""Test package preferences during coconcretization."""
"""Test package preferences during concretization."""
reusable_specs = [
spack.concretize.concretize_one(spec_str) for spec_str in ("pkg-b@0.9", "pkg-b@1.0")
]
@@ -1873,17 +1878,18 @@ def test_version_weight_and_provenance(self):
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, [root_spec], reuse=reusable_specs)
# The result here should have a single spec to build ('pkg-a')
# and it should be using pkg-b@1.0 with a version badness of 2
# The provenance is:
# Version badness should be > 0 only for reused specs. For instance, for pkg-b
# the version provenance is:
#
# version_declared("pkg-b","1.0",0,"package_py").
# version_declared("pkg-b","0.9",1,"package_py").
# version_declared("pkg-b","1.0",2,"installed").
# version_declared("pkg-b","0.9",3,"installed").
#
# Depending on the target, it may also use gnuconfig
v_weights = [x for x in result.criteria if x[2] == "version badness (non roots)"][0]
reused_weights, built_weights, _ = v_weights
assert reused_weights > 2 and built_weights == 0
result_spec = result.specs[0]
assert (2, 0, "version badness (non roots)") in result.criteria
assert result_spec.satisfies("^pkg-b@1.0")
assert result_spec["pkg-b"].dag_hash() == reusable_specs[1].dag_hash()
@@ -1930,7 +1936,9 @@ def test_git_ref_version_succeeds_with_unknown_version(self, git_ref):
def test_installed_externals_are_reused(
self, mutable_database, repo_with_changing_recipe, tmp_path
):
"""Test that external specs that are in the DB can be reused."""
"""Tests that external specs that are in the DB can be reused, if they result in a
better optimization score.
"""
external_conf = {
"changing": {
"buildable": False,
@@ -1940,22 +1948,23 @@ def test_installed_externals_are_reused(
spack.config.set("packages", external_conf)
# Install the external spec
external1 = spack.concretize.concretize_one("changing@1.0")
PackageInstaller([external1.package], fake=True, explicit=True).install()
assert external1.external
middle_pkg = spack.concretize.concretize_one("middle")
PackageInstaller([middle_pkg.package], fake=True, explicit=True).install()
assert middle_pkg["changing"].external
changing_external = middle_pkg["changing"]
# Modify the package.py file
repo_with_changing_recipe.change({"delete_variant": True})
# Try to concretize the external without reuse and confirm the hash changed
with spack.config.override("concretizer:reuse", False):
external2 = spack.concretize.concretize_one("changing@1.0")
assert external2.dag_hash() != external1.dag_hash()
root_no_reuse = spack.concretize.concretize_one("root")
assert root_no_reuse["changing"].dag_hash() != changing_external.dag_hash()
# ... while with reuse we have the same hash
with spack.config.override("concretizer:reuse", True):
external3 = spack.concretize.concretize_one("changing@1.0")
assert external3.dag_hash() == external1.dag_hash()
root_with_reuse = spack.concretize.concretize_one("root")
assert root_with_reuse["changing"].dag_hash() == changing_external.dag_hash()
@pytest.mark.regression("31484")
def test_user_can_select_externals_with_require(self, mutable_database, tmp_path):
@@ -3440,3 +3449,120 @@ def test_using_externals_with_compilers(mutable_config, mock_packages, tmp_path)
libelf = s["libelf"]
assert libelf.external and libelf.satisfies("%gcc")
@pytest.mark.regression("50161")
def test_installed_compiler_and_better_external(
install_mockery, do_not_check_runtimes_on_reuse, mutable_config
):
"""Tests that we always prefer a higher-priority external compiler, when we have a
lower-priority compiler installed, and we try to concretize a spec without specifying
the compiler dependency.
"""
pkg_b = spack.concretize.concretize_one(spack.spec.Spec("pkg-b %clang"))
PackageInstaller([pkg_b.package], fake=True, explicit=True).install()
with spack.config.override("concretizer:reuse", False):
pkg_a = spack.concretize.concretize_one("pkg-a")
assert pkg_a["c"].satisfies("gcc@10"), pkg_a.tree()
assert pkg_a["pkg-b"]["c"].satisfies("gcc@10")
with spack.config.override("concretizer:reuse", False):
mpileaks = spack.concretize.concretize_one("mpileaks")
assert mpileaks.satisfies("%gcc@10")
@pytest.mark.regression("50006")
def test_concrete_multi_valued_variants_in_externals(mutable_config, mock_packages, tmp_path):
"""Tests that concrete multivalued variants in externals cannot be extended with additional
values when concretizing.
"""
packages_yaml = syaml.load_config(
f"""
packages:
gcc:
buildable: false
externals:
- spec: gcc@12.1.0 languages:='c,c++'
prefix: {tmp_path / 'gcc-12'}
extra_attributes:
compilers:
c: {tmp_path / 'gcc-12'}/bin/gcc
cxx: {tmp_path / 'gcc-12'}/bin/g++
- spec: gcc@14.1.0 languages:=fortran
prefix: {tmp_path / 'gcc-14'}
extra_attributes:
compilers:
fortran: {tmp_path / 'gcc-14'}/bin/gfortran
"""
)
mutable_config.set("packages", packages_yaml["packages"])
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
spack.concretize.concretize_one("pkg-b %gcc@14")
s = spack.concretize.concretize_one("pkg-b %gcc")
assert s["c"].satisfies("gcc@12.1.0"), s.tree()
assert s["c"].external
assert s["c"].satisfies("languages=c,c++") and not s["c"].satisfies("languages=fortran")
def test_concrete_multi_valued_in_input_specs(default_mock_concretization):
"""Tests that we can use := to specify exactly multivalued variants in input specs."""
s = default_mock_concretization("gcc languages:=fortran")
assert not s.external and s["c"].external
assert s.satisfies("languages:=fortran")
assert not s.satisfies("languages=c") and not s.satisfies("languages=c++")
def test_concrete_multi_valued_variants_in_requirements(mutable_config, mock_packages, tmp_path):
"""Tests that concrete multivalued variants can be imposed by requirements."""
packages_yaml = syaml.load_config(
"""
packages:
pkg-a:
require:
- libs:=static
"""
)
mutable_config.set("packages", packages_yaml["packages"])
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
spack.concretize.concretize_one("pkg-a libs=shared")
spack.concretize.concretize_one("pkg-a libs=shared,static")
s = spack.concretize.concretize_one("pkg-a")
assert s.satisfies("libs:=static")
assert not s.satisfies("libs=shared")
def test_concrete_multi_valued_variants_in_depends_on(default_mock_concretization):
"""Tests the use of := in depends_on directives"""
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
default_mock_concretization("gmt-concrete-mv-dependency ^mvdefaults foo:=c")
default_mock_concretization("gmt-concrete-mv-dependency ^mvdefaults foo:=a,c")
default_mock_concretization("gmt-concrete-mv-dependency ^mvdefaults foo:=b,c")
s = default_mock_concretization("gmt-concrete-mv-dependency")
assert s.satisfies("^mvdefaults foo:=a,b"), s.tree()
assert not s.satisfies("^mvdefaults foo=c")
def test_concrete_multi_valued_variants_when_args(default_mock_concretization):
"""Tests the use of := in conflicts and when= arguments"""
# Check conflicts("foo:=a,b", when="@0.9")
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
default_mock_concretization("mvdefaults@0.9 foo:=a,b")
for c in ("foo:=a", "foo:=a,b,c", "foo:=a,c", "foo:=b,c"):
s = default_mock_concretization(f"mvdefaults@0.9 {c}")
assert s.satisfies(c)
# Check depends_on("pkg-b", when="foo:=b,c")
s = default_mock_concretization("mvdefaults foo:=b,c")
assert s.satisfies("^pkg-b")
for c in ("foo:=a", "foo:=a,b,c", "foo:=a,b", "foo:=a,c"):
s = default_mock_concretization(f"mvdefaults {c}")
assert not s.satisfies("^pkg-b")
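The `:=` syntax these tests exercise marks a multivalued variant as concrete: `foo=a` means "at least the value a", while `foo:=a,b` means "exactly the values a and b". A minimal self-contained sketch of that satisfaction check (illustrative only, not Spack's actual implementation):

```python
def satisfies(concrete_values, requested_values, exact):
    """Check whether a concrete variant value set satisfies a request.

    exact=True corresponds to the ':=' (concrete) syntax: the value sets
    must match exactly.  exact=False is the ordinary '=' semantics of
    'at least these values'.
    """
    if exact:
        return set(concrete_values) == set(requested_values)
    return set(requested_values) <= set(concrete_values)

# A spec concretized with foo:=a,b has exactly the values {a, b}:
have = {"a", "b"}
assert satisfies(have, {"a"}, exact=False)      # foo=a holds
assert satisfies(have, {"a", "b"}, exact=True)  # foo:=a,b holds
assert not satisfies(have, {"c"}, exact=False)  # foo=c fails
assert not satisfies(have, {"a"}, exact=True)   # foo:=a fails: b is also present
```

This is why, in the externals test above, `pkg-b %gcc@14` is unsatisfiable: the gcc@14 external is concretely `languages:=fortran`, so the `c` language request cannot extend it.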

View File

@@ -1270,13 +1270,10 @@ def test_virtual_requirement_respects_any_of(concretize_scope, mock_packages):
packages:
all:
require:
- "%gcc"
pkg-a:
require:
- "%gcc@10"
- "%gcc@9"
""",
True,
["%gcc@10"],
["%gcc@9"],
),
],
)
@@ -1303,4 +1300,4 @@ def test_requirements_on_compilers_and_reuse(
assert is_pkgb_reused == expected_reuse
for c in expected_contraints:
assert pkga.satisfies(c), print(pkga.tree())
assert pkga.satisfies(c)

View File

@@ -312,7 +312,6 @@ class TestRepoPath:
def test_creation_from_string(self, mock_test_cache):
repo = spack.repo.RepoPath(spack.paths.mock_packages_path, cache=mock_test_cache)
assert len(repo.repos) == 1
assert repo.repos[0]._finder is repo
assert repo.by_namespace["builtin.mock"] is repo.repos[0]
def test_get_repo(self, mock_test_cache):

View File

@@ -55,6 +55,7 @@ def parser_and_speclist():
return parser, result
@pytest.mark.usefixtures("mock_packages")
class TestSpecList:
@pytest.mark.regression("28749")
@pytest.mark.parametrize(
@@ -83,8 +84,8 @@ class TestSpecList:
),
# A constraint affects both the root and a dependency
(
[{"matrix": [["gromacs"], ["%gcc"], ["+plumed ^plumed%gcc"]]}],
["gromacs+plumed%gcc ^plumed%gcc"],
[{"matrix": [["version-test-root"], ["%gcc"], ["^version-test-pkg%gcc"]]}],
["version-test-root%gcc ^version-test-pkg%gcc"],
),
],
)
@@ -158,7 +159,7 @@ def test_spec_list_recursion_specs_as_constraints(self):
assert result.specs == DEFAULT_SPECS
@pytest.mark.regression("16841")
def test_spec_list_matrix_exclude(self, mock_packages):
def test_spec_list_matrix_exclude(self):
parser = SpecListParser()
result = parser.parse_user_specs(
name="specs",
@@ -171,7 +172,7 @@ def test_spec_list_matrix_exclude(self, mock_packages):
)
assert len(result.specs) == 1
def test_spec_list_exclude_with_abstract_hashes(self, mock_packages, install_mockery):
def test_spec_list_exclude_with_abstract_hashes(self, install_mockery):
# Put mpich in the database so it can be referred to by hash.
mpich_1 = spack.concretize.concretize_one("mpich+debug")
mpich_2 = spack.concretize.concretize_one("mpich~debug")

View File

@@ -40,9 +40,7 @@ spack -p --lines 20 spec mpileaks%gcc
$coverage_run $(which spack) bootstrap status --dev --optional
# Check that we can import Spack packages directly as a first import
# TODO: this check is disabled, because sys.path is only updated once
# spack.repo.PATH.get_pkg_class is called.
# $coverage_run $(which spack) python -c "import spack.pkg.builtin.mpileaks; repr(spack.pkg.builtin.mpileaks.Mpileaks)"
$coverage_run $(which spack) python -c "from spack_repo.builtin.packages.mpileaks.package import Mpileaks"
#-----------------------------------------------------------
# Run unit tests with code coverage

View File

@@ -0,0 +1,3 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -0,0 +1,3 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -102,7 +102,7 @@ class Adios(AutotoolsPackage):
# Fix a bug in configure.ac that causes automake issues on RHEL 7.7
patch(
"https://github.com/ornladios/ADIOS/pull/207.patch?full_index=1",
"https://github.com/ornladios/ADIOS/commit/17aee8aeed64612cd8cfa0b949147091a5525bbe.patch?full_index=1",
when="@1.12.0: +mpi",
sha256="aea47e56013b57c2d5d36e23e0ae6010541c3333a84003784437768c2e350b05",
)

View File

@@ -232,7 +232,7 @@ class Adios2(CMakePackage, CudaPackage, ROCmPackage):
# Add missing include <memory>
# https://github.com/ornladios/adios2/pull/2710
patch(
"https://github.com/ornladios/adios2/pull/2710.patch?full_index=1",
"https://github.com/ornladios/adios2/commit/72363a5ed1015c2bbb1c057d4d6b2e5662de12ec.patch?full_index=1",
when="@2.5:2.7.1",
sha256="8221073d1b2f8944395a88a5d60a15c7370646b62f5fc6309867bbb6a8c2096c",
)

View File

@@ -20,6 +20,12 @@ class AmrWind(CMakePackage, CudaPackage, ROCmPackage):
license("BSD-3-Clause")
version("main", branch="main", submodules=True)
version(
"3.4.2", tag="v3.4.2", commit="ed475a0533dfacf1fdff0b707518ccf99040d9f9", submodules=True
)
version(
"3.4.1", tag="v3.4.1", commit="effe63ca9061e6d2bd5c5e84b690d29c0869f029", submodules=True
)
version(
"3.4.0", tag="v3.4.0", commit="38d1b9fd0b70aab4a01fd507f039750c2508bd1c", submodules=True
)

View File

@@ -13,13 +13,15 @@ class ApacheTvm(CMakePackage, CudaPackage):
hardware backend."""
homepage = "https://tvm.apache.org/"
url = "https://dlcdn.apache.org/tvm/tvm-v0.16.0/apache-tvm-src-v0.16.0.tar.gz"
license("Apache-2.0", checked_by="alex391")
url = "https://github.com/apache/tvm/releases/download/v0.19.0/apache-tvm-src-v0.19.0.tar.gz"
version("0.19.0", sha256="13fd707eae37b9b2b77bccd39668764f61ae6824d50cd1ab8164df1c75565be1")
version(
"0.16.0",
sha256="55e2629c39248ef3b1ee280e34a960182bd17bea7ae0d0fa132bbdaaf5aba1ac",
url="https://dlcdn.apache.org/tvm/tvm-v0.16.0/apache-tvm-src-v0.16.0.tar.gz",
deprecated=True,
)
@@ -28,10 +30,16 @@ class ApacheTvm(CMakePackage, CudaPackage):
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("cmake@3.18:", type="build")
depends_on("python@3.7:3.8", type=("build", "run"))
depends_on("python@3.7:", type=("build", "run"))
conflicts("^python@3.9.0:", when="@:0.16")
depends_on("zlib-api", type=("link", "run"))
depends_on("ncurses", type=("link", "run"))
depends_on("llvm@4:18.1.8", type="build", when="+llvm")
depends_on("llvm@4:", type="build", when="+llvm")
conflicts("^llvm@19.0.0:", when="@:0.16+llvm")
depends_on("cuda@8:", when="+cuda")
def cmake_args(self):

View File

@@ -155,9 +155,9 @@ class Ascent(CMakePackage, CudaPackage, ROCmPackage):
# patch for fix typo in coord_type
# https://github.com/Alpine-DAV/ascent/pull/1408
patch(
"https://github.com/Alpine-DAV/ascent/pull/1408.patch?full_index=1",
"https://github.com/Alpine-DAV/ascent/commit/21f33494eed2016f97a266b3c23f33ff1bf39619.patch?full_index=1",
when="@0.9.3 %oneapi@2025:",
sha256="7de7f51e57f3d743c39ad80d8783a4eb482be1def51eb2d3f9259246c661f164",
sha256="0dc417d8a454d235cdeb9e0f0bb527dc3c42a1eb6ae80e8bd5b33ead19198329",
)
##########################################################################

View File

@@ -33,7 +33,7 @@ class Assimp(CMakePackage):
version("4.0.1", sha256="60080d8ab4daaab309f65b3cffd99f19eb1af8d05623fff469b9b652818e286e")
patch(
"https://patch-diff.githubusercontent.com/raw/assimp/assimp/pull/4203.patch?full_index=1",
"https://github.com/assimp/assimp/commit/92b5c284ce58fb64af2ee1f11e86aa8a65c78d03.patch?full_index=1",
sha256="24135e88bcef205e118f7a3f99948851c78d3f3e16684104dc603439dd790d74",
when="@5.1:5.2.2",
)

View File

@@ -189,8 +189,8 @@ class Bazel(Package):
# https://github.com/bazelbuild/bazel/issues/18642
patch(
"https://github.com/bazelbuild/bazel/pull/20785.patch?full_index=1",
sha256="85dde31d129bbd31e004c5c87f23cdda9295fbb22946dc6d362f23d83bae1fd8",
"https://github.com/bazelbuild/bazel/commit/fe9754f0c14e15bd02fb231995cba473278d853b.patch?full_index=1",
sha256="c2dd258f11786b32f75d91013a3ebd3d0876be2628acadb2ac8331e06fd2e657",
when="@6.0:6.4",
)
conflicts("%gcc@13:", when="@:5")

View File

@@ -79,7 +79,7 @@ def patch(self):
depends_on("lzo", when="+lzo")
depends_on("pcre", when="+pcre")
depends_on("python", when="+python")
depends_on("python@:3.11", when="+python")
depends_on("perl", when="+perl")
depends_on("lmdb", when="@2.7.1:")

View File

@@ -42,8 +42,8 @@ class BufrQuery(CMakePackage, PythonExtension):
# Patches
patch(
"https://github.com/NOAA-EMC/bufr-query/pull/20.patch?full_index=1",
sha256="3acf11082c9e76e64dbbda4f62ac0cbc234dca7e60c85a275e778417cfd65001",
"https://github.com/NOAA-EMC/bufr-query/commit/a27d75e0c2a7c1b819520154fb330af202d65dcf.patch?full_index=1",
sha256="6146db89605f24a92f75bd3bb7521e3fb545dd25d1f3bfc1d917b862e7f3c6c9",
when="+python @:0.0.2",
)

View File

@@ -2,7 +2,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.store
from spack.package import *
@@ -41,11 +40,6 @@ class CbtfKrell(CMakePackage):
description="The build type to build",
values=("Debug", "Release", "RelWithDebInfo"),
)
variant(
"crayfe",
default=False,
description="build only the FE tool using the runtime_dir to point to target build.",
)
# Fix build errors with gcc >= 10
patch(
@@ -147,44 +141,6 @@ def set_mpi_cmake_options(self, spec, cmake_options):
cmake_options.extend(mpi_options)
def set_cray_login_node_cmake_options(self, spec, cmake_options):
# Appends to cmake_options the options that will enable
# the appropriate Cray login node libraries
cray_login_node_options = []
rt_platform = "cray"
# How do we get the compute node (CNL) cbtf package
# install directory path. spec['cbtf'].prefix is the
# login node path for this build, as we are building
# the login node components with this spack invocation. We
# need these paths to be the ones created in the CNL
# spack invocation.
be_cbtf = spack.store.db.query_one("cbtf arch=cray-CNL-haswell")
be_cbtfk = spack.store.db.query_one("cbtf-krell arch=cray-CNL-haswell")
be_papi = spack.store.db.query_one("papi arch=cray-CNL-haswell")
be_boost = spack.store.db.query_one("boost arch=cray-CNL-haswell")
be_mont = spack.store.db.query_one("libmonitor arch=cray-CNL-haswell")
be_unw = spack.store.db.query_one("libunwind arch=cray-CNL-haswell")
be_xer = spack.store.db.query_one("xerces-c arch=cray-CNL-haswell")
be_dyn = spack.store.db.query_one("dyninst arch=cray-CNL-haswell")
be_mrnet = spack.store.db.query_one("mrnet arch=cray-CNL-haswell")
cray_login_node_options.append("-DCN_RUNTIME_PLATFORM=%s" % rt_platform)
# Use install directories as CMAKE args for the building
# of login cbtf-krell
cray_login_node_options.append("-DCBTF_CN_RUNTIME_DIR=%s" % be_cbtf.prefix)
cray_login_node_options.append("-DCBTF_KRELL_CN_RUNTIME_DIR=%s" % be_cbtfk.prefix)
cray_login_node_options.append("-DPAPI_CN_RUNTIME_DIR=%s" % be_papi.prefix)
cray_login_node_options.append("-DBOOST_CN_RUNTIME_DIR=%s" % be_boost.prefix)
cray_login_node_options.append("-DLIBMONITOR_CN_RUNTIME_DIR=%s" % be_mont.prefix)
cray_login_node_options.append("-DLIBUNWIND_CN_RUNTIME_DIR=%s" % be_unw.prefix)
cray_login_node_options.append("-DXERCESC_CN_RUNTIME_DIR=%s" % be_xer.prefix)
cray_login_node_options.append("-DDYNINST_CN_RUNTIME_DIR=%s" % be_dyn.prefix)
cray_login_node_options.append("-DMRNET_CN_RUNTIME_DIR=%s" % be_mrnet.prefix)
cmake_options.extend(cray_login_node_options)
def cmake_args(self):
spec = self.spec
@@ -218,11 +174,6 @@ def cmake_args(self):
# Add any MPI implementations coming from variant settings
self.set_mpi_cmake_options(spec, cmake_args)
if self.spec.satisfies("+crayfe"):
# We need to build target/compute node components/libraries first
# then pass those libraries to the cbtf-krell login node build
self.set_cray_login_node_cmake_options(spec, cmake_args)
return cmake_args
def setup_run_environment(self, env: EnvironmentModifications) -> None:

View File

@@ -50,7 +50,7 @@ class Compadre(CMakePackage):
# fixes duplicate symbol issue with static library build
patch(
"https://patch-diff.githubusercontent.com/raw/sandialabs/Compadre/pull/286.patch?full_index=1",
"https://github.com/sandialabs/Compadre/commit/af91a6ee3831dc951445df76053ec6315c58cb45.patch?full_index=1",
sha256="e267b74f8ecb8dd23970848ed919d29b7d442f619ce80983e02a19f1d9582c61",
when="@1.5.0",
)

View File

@@ -214,9 +214,9 @@ class Conduit(CMakePackage):
# Add missing include for numeric_limits
# https://github.com/LLNL/conduit/pull/773
patch(
"https://github.com/LLNL/conduit/pull/773.patch?full_index=1",
"https://github.com/LLNL/conduit/commit/eb7dfce2229aac3b9644d422a44948509034e3c6.patch?full_index=1",
when="@:0.7.2",
sha256="784d74942a63acf698c31b39848b46b4b755bf06faa6aa6fb81be61783ec0c30",
sha256="379a1b68928d9078e7302efe694f43c51c8f2c26db4a58ab3fd753746b96b284",
)
def setup_build_environment(self, env: EnvironmentModifications) -> None:

View File

@@ -87,6 +87,8 @@ def edit(self, spec, prefix):
else:
defs_file = FileFilter("./src/system.make")
defs_file.filter(".*FC=.*", f"FC={spec['mpi'].mpifc}")
defs_file.filter(".*F77=.*", f"F77={spec['mpi'].mpif77}")
defs_file.filter(".*COMPFLAGS=.*", f"COMPFLAGS= {fflags}")
defs_file.filter(".*LINKFLAGS=.*", f"LINKFLAGS= {ldflags}")
defs_file.filter(".*BLAS=.*", f"BLAS= {lapack_ld} {blas_ld}")

View File

@@ -5,7 +5,7 @@
from spack.package import *
class Covfie(CMakePackage, CudaPackage):
class Covfie(CMakePackage, CudaPackage, ROCmPackage):
"""Covfie is a library for compositional descriptions of storage methods for
vector fields and other structured multi-dimensional data."""
@@ -14,18 +14,19 @@ class Covfie(CMakePackage, CudaPackage):
git = "https://github.com/acts-project/covfie.git"
list_url = "https://github.com/acts-project/covfie/tags"
maintainers("stephenswat")
maintainers("stephenswat", "sethrj")
license("MPL-2.0")
version("main", branch="main")
version("0.14.0", sha256="b4d8afa712c6fc0e2bc6474367d65fad652864b18d0255c5f2c18fd4c6943993")
version("0.13.0", sha256="e9cd0546c7bc9539f440273bbad303c97215ccd87403cedb4aa387a313938d57")
version("0.12.1", sha256="c33d7707ee30ab5fa8df686a780600343760701023ac0b23355627e1f2f044de")
version("0.12.0", sha256="e35e94075a40e89c4691ff373e3061577295d583a2546c682b2d652d9fce7828")
version("0.11.0", sha256="39fcd0f218d3b4f3aacc6af497a8cda8767511efae7a72b47781f10fd4340f4f")
version("0.10.0", sha256="d44142b302ffc193ad2229f1d2cc6d8d720dd9da8c37989ada4f23018f86c964")
depends_on("c", type="build")
depends_on("c", type="build", when="@:0.13")
depends_on("cxx", type="build")
depends_on("cmake@3.21:", type="build", when="@0.11:")
@@ -35,7 +36,9 @@ def cmake_args(self):
args = [
self.define("COVFIE_PLATFORM_CPU", True),
self.define_from_variant("COVFIE_PLATFORM_CUDA", "cuda"),
self.define_from_variant("COVFIE_PLATFORM_HIP", "rocm"),
self.define("COVFIE_QUIET", True),
self.define("COVFIE_BUILD_TESTS", self.run_tests),
]
return args

View File

@@ -4,8 +4,6 @@
import os
import llnl.util.tty as tty
from spack.package import *
from spack.util.module_cmd import get_path_args_from_module_line, module

View File

@@ -15,23 +15,26 @@ class Cryodrgn(PythonPackage):
license("GPL-3.0-only", checked_by="A-N-Other")
version("3.4.3", sha256="eadc41190d3c6abe983164db299ebb0d7340840281774eaaea1a12627a80dc10")
version("2.3.0", sha256="9dd75967fddfa56d6b2fbfc56933c50c9fb994326112513f223e8296adbf0afc")
depends_on("python@3.7:", type=("build", "run"))
depends_on("python@3.9:3.11", type=("build", "run"), when="@3.4.3:")
depends_on("python@3.7:3.11", type=("build", "run"), when="@2.3.0")
depends_on("py-setuptools@61:", type="build")
depends_on("py-setuptools-scm@6.2:", type="build")
depends_on("py-torch@1:", type=("build", "run"))
depends_on("py-pandas@:1", type=("build", "run"))
depends_on("py-numpy", type=("build", "run"))
depends_on("py-matplotlib", type=("build", "run"))
depends_on("py-numpy@:1.26", type=("build", "run"))
depends_on("py-matplotlib@:3.6", type=("build", "run"))
depends_on("py-pyyaml", type=("build", "run"))
depends_on("py-scipy@1.3.1:", type=("build", "run"))
depends_on("py-scikit-learn", type=("build", "run"))
depends_on("py-seaborn@:0.11", type=("build", "run"))
depends_on("py-cufflinks", type=("build", "run"))
depends_on("py-jupyterlab", type=("build", "run"))
depends_on("py-notebook@:6", type=("build", "run"), when="@3.4.3:")
depends_on("py-umap-learn", type=("build", "run"))
depends_on("py-ipywidgets@:7", type=("build", "run"))
depends_on("py-healpy", type=("build", "run"))

View File

@@ -21,6 +21,16 @@
# format returned by platform.system() and 'arch' by platform.machine()
_versions = {
"12.8.1": {
"Linux-aarch64": (
"353cbab1b57282a1001071796efd95c1e40ec27a3375e854d12637eaa1c6107c",
"https://developer.download.nvidia.com/compute/cuda/12.8.1/local_installers/cuda_12.8.1_570.124.06_linux_sbsa.run",
),
"Linux-x86_64": (
"228f6bcaf5b7618d032939f431914fc92d0e5ed39ebe37098a24502f26a19797",
"https://developer.download.nvidia.com/compute/cuda/12.8.1/local_installers/cuda_12.8.1_570.124.06_linux.run",
),
},
"12.8.0": {
"Linux-aarch64": (
"5bc211f00c4f544da6e3fc3a549b3eb0a7e038439f5f3de71caa688f2f6b132c",

View File

@@ -25,6 +25,7 @@ class Dd4hep(CMakePackage):
license("LGPL-3.0-or-later")
version("master", branch="master")
version("1.32", sha256="8bde4eab9af9841e040447282ea7df3a16e4bcec587c3a1e32f41987da9b1b4d")
version("1.31", sha256="9c06a1b4462fc1b51161404889c74b37350162d0b0ac2154db27e3f102670bd1")
version("1.30", sha256="02de46151e945eff58cffd84b4b86d35051f4436608199c3efb4d2e1183889fe")
version("1.29", sha256="435d25a7ef093d8bf660f288b5a89b98556b4c1c293c55b93bf641fb4cba77e9")
@@ -54,9 +55,9 @@ class Dd4hep(CMakePackage):
patch("cmake_language.patch", when="@:1.17")
# Fix missing SimCaloHits when using the LCIO format
patch(
"https://patch-diff.githubusercontent.com/raw/AIDASoft/DD4hep/pull/1019.patch?full_index=1",
"https://github.com/AIDASoft/DD4hep/commit/2c77055fb05744a4d367123c634bcb42291df030.patch?full_index=1",
when="@1.19:1.23",
sha256="6466719c82de830ce728db57004fb7db03983587a63b804f6dc95c6b92b3fc76",
sha256="7bac1e08d2f83edb467da7f950b841021ecc649cc4cf21fd9043bd6d757c4e05",
)
# variants for subpackages

View File

@@ -96,9 +96,9 @@ class Edm4hep(CMakePackage):
# Fix missing nljson import
# NOTE that downstream packages (dd4hep) may fail for 0.99 and before
patch(
"https://patch-diff.githubusercontent.com/raw/key4hep/EDM4hep/pull/379.patch?full_index=1",
"https://github.com/key4hep/EDM4hep/commit/18799dacfdaf5d746134c957de48607aa2665d75.patch?full_index=1",
when="@0.99.1",
sha256="c4be2f27c7bda4d033f92fee14e48ddf59fbe606d208e8288d9bdb3dec5ad5c2",
sha256="374f0b7635c632e5a57d23ad163efab7370ab471c62e2713a41aa26e33d8f221",
)
def cmake_args(self):

View File

@@ -35,6 +35,7 @@ class Exawind(CMakePackage, CudaPackage, ROCmPackage):
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("fortran", type="build")
for arch in CudaPackage.cuda_arch_values:
depends_on(

View File

@@ -50,8 +50,8 @@ class Flatbuffers(CMakePackage):
# https://github.com/google/flatbuffers/issues/5950
# Possibly affects earlier releases but I haven't tried to apply it.
patch(
"https://github.com/google/flatbuffers/pull/6020.patch?full_index=1",
sha256="579cb6fa4430d4304b93c7a1df7e922f3c3ec614c445032877ad328c209d5462",
"https://github.com/google/flatbuffers/commit/515a4052a750dfe6df8d143c8f23cd8aaf51f9d7.patch?full_index=1",
sha256="f76b8777d7e719834ba0d83535b35e7c17ce474cfbc1286671d936191f784dc1",
when="@1.12.0:1%gcc@10:",
)

View File

@@ -115,7 +115,7 @@ class FluxSched(CMakePackage, AutotoolsPackage):
patch("jobid-sign-compare-fix.patch", when="@:0.22.0")
patch(
"https://github.com/flux-framework/flux-sched/pull/1338.patch?full_index=1",
"https://github.com/flux-framework/flux-sched/commit/da6156addab5ec127b36cdee03c5d1f3e458d363.patch?full_index=1",
when="@0.42.2 %oneapi@2025:",
sha256="b46579efa70176055f88493caa3fefbfea5a5663a33d9c561b71e83046f763c5",
)

View File

@@ -71,11 +71,13 @@ class Fms(CMakePackage):
)
variant("shared", description="Build shared libraries", when="@2024.02:", default=False)
# To build a shared/dynamic library, both `pic` and `shared` are required:
requires("+pic", when="+shared", msg="The +shared variant requires +pic")
# What the following patch is providing is available in version 2024.03
# and newer so it is only needed to 2024.02
patch(
-"https://github.com/NOAA-GFDL/fms/pull/1559.patch?full_index=1",
-sha256="2b12a6c35f357c3dddcfa5282576e56ab0e8e6c1ad1dab92a2c85ce3dfb815d4",
+"https://github.com/NOAA-GFDL/fms/commit/361352e0b410373ba259259627f4b714b15cff57.patch?full_index=1",
+sha256="6c085485919d493c350d1692ea0b6b403fca1246c0c4bde3b50b44a08d887694",
when="@2024.02",
)

@@ -17,6 +17,7 @@ class Fmt(CMakePackage):
license("MIT")
version("11.2.0", sha256="203eb4e8aa0d746c62d8f903df58e0419e3751591bb53ff971096eaa0ebd4ec3")
version("11.1.4", sha256="49b039601196e1a765e81c5c9a05a61ed3d33f23b3961323d7322e4fe213d3e6")
version("11.1.3", sha256="7df2fd3426b18d552840c071c977dc891efe274051d2e7c47e2c83c3918ba6df")
version("11.1.2", sha256="ef54df1d4ba28519e31bf179f6a4fb5851d684c328ca051ce5da1b52bf8b1641")

@@ -35,7 +35,7 @@ class Ftgl(CMakePackage):
# https://github.com/kraj/ftgl/commit/37ed7d606a0dfecdcb4ab0c26d1b0132cd96d5fa
# freetype 2.13.3 changed the type of many external chars to unsigned char!
patch(
-"https://patch-diff.githubusercontent.com/raw/frankheckenbach/ftgl/pull/20.patch?full_index=1",
+"https://github.com/frankheckenbach/ftgl/commit/21e050670bc7217a3f61b90cdab5b543e676380c.patch?full_index=1",
sha256="e2a0810fbf68403931bef4fbfda22e010e01421c92eeaa45f62e4e47f2381ebd",
when="^freetype@2.13.3:",
)

@@ -22,7 +22,7 @@ paths:
fi
platforms: ["darwin", "linux"]
results:
-- spec: "gcc@9.4.0 languages=c,c++"
+- spec: "gcc@9.4.0 languages:=c,c++"
extra_attributes:
compilers:
c: ".*/bin/gcc"
@@ -45,7 +45,7 @@ paths:
fi
platforms: ["darwin", "linux"]
results:
-- spec: "gcc@5.5.0 languages=c,c++,fortran"
+- spec: "gcc@5.5.0 languages:=c,c++,fortran"
extra_attributes:
compilers:
c: ".*/bin/gcc-5$"
@@ -115,7 +115,7 @@ paths:
fi
platforms: [darwin]
results:
-- spec: "gcc@14.1.0 languages=c"
+- spec: "gcc@14.1.0 languages:=c"
extra_attributes:
compilers:
c: ".*/bin/gcc-14$"

@@ -672,15 +672,13 @@ def determine_variants(cls, exes, version_str):
translation = {"cxx": "c++"}
for lang, compiler in compilers.items():
languages.add(translation.get(lang, lang))
-variant_str = "languages={0}".format(",".join(languages))
+variant_str = "languages:={0}".format(",".join(languages))
return variant_str, {"compilers": compilers}
@classmethod
def validate_detected_spec(cls, spec, extra_attributes):
# For GCC 'compilers' is a mandatory attribute
-msg = 'the extra attribute "compilers" must be set for ' 'the detected spec "{0}"'.format(
-spec
-)
+msg = f'the extra attribute "compilers" must be set for the detected spec "{spec}"'
assert "compilers" in extra_attributes, msg
compilers = extra_attributes["compilers"]
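Several of the hunks above migrate `languages=...` spec strings to the new `languages:=...` form; per the solver change in this series, `:=` marks a multivalued variant as concrete (an exact value set) rather than an open-ended one. A minimal sketch of the syntactic distinction, using a hypothetical parser rather than Spack's actual implementation:

```python
# Hypothetical helper (not Spack's real API): split a "name@version key:=v1,v2"
# spec string into parts, distinguishing concrete (":=") from abstract ("=")
# multivalued variants.
def parse_spec(spec):
    parts = spec.split()
    name, _, version = parts[0].partition("@")
    variants = {}
    for token in parts[1:]:
        if ":=" in token:
            # ":=" means the value set is exact, not a lower bound
            key, _, values = token.partition(":=")
            variants[key] = {"values": values.split(","), "concrete": True}
        elif "=" in token:
            key, _, values = token.partition("=")
            variants[key] = {"values": values.split(","), "concrete": False}
    return {"name": name, "version": version, "variants": variants}
```

For example, `parse_spec("gcc@9.4.0 languages:=c,c++")` would report `languages` as concrete with values `["c", "c++"]`, while the old `languages=c,c++` form would be reported as abstract.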

@@ -30,6 +30,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
license("MIT")
maintainers("adamjstewart")
+version("3.11.0", sha256="ba1a17a74428bfd5c789ce293f59b6a3d8bfabab747431c33331ac0ac579ea71")
version("3.10.2", sha256="67b4e08acd1cc4b6bd67b97d580be5a8118b586ad6a426b09d5853898deeada5")
version("3.10.1", sha256="9211eac72b53f5f85d23cf6d83ee20245c6d818733405024e71f2af41e5c5f91")
version("3.10.0", sha256="af821a3bcf68cf085724c21c9b53605fd451d83af3c8854d8bf194638eb734a8")
@@ -104,6 +105,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
version("2.0.0", sha256="91704fafeea2349c5e268dc1e2d03921b3aae64b05ee01d59fdfc1a6b0ffc061")
# Optional dependencies
# https://gdal.org/en/stable/development/building_from_source.html
variant("archive", default=False, when="@3.7:", description="Optional for vsi7z VFS driver")
variant(
"armadillo",
@@ -129,6 +131,12 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
variant("ecw", default=False, description="Required for ECW driver")
variant("epsilon", default=False, when="@:3.2", description="Required for EPSILON driver")
variant("expat", default=True, description="Required for XML parsing in many OGR drivers")
+variant(
+"exprtk",
+default=False,
+when="@3.11:",
+description="Required for advanced C++ VRT expressions",
+)
variant("filegdb", default=False, description="Required for FileGDB driver")
variant("fme", default=False, when="@:3.4", description="Required for FME driver")
variant("freexl", default=False, description="Required for XLS driver")
@@ -152,7 +160,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
variant("jxl", default=False, when="@3.4:", description="Required for JPEGXL driver")
variant("kdu", default=False, description="Required for JP2KAK and JPIPKAK drivers")
variant("kea", default=False, description="Required for KEA driver")
-variant("lerc", default=False, when="@2.4:", description="Required for LERC compression")
+variant("lerc", default=True, when="@2.4:", description="Required for LERC compression")
variant("libaec", default=False, when="@3.8:", description="Optional for GRIB driver")
variant("libcsf", default=False, description="Required for PCRaster driver")
variant("libkml", default=False, description="Required for LIBKML driver")
@@ -182,6 +190,12 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
when="build_system=cmake",
description="Required for MSSQLSpatial driver",
)
+variant(
+"muparser",
+default=True,
+when="@3.11:",
+description="Required for nominal C++ VRT expressions",
+)
variant("mysql", default=False, description="Required for MySQL driver")
variant("netcdf", default=False, description="Required for NetCDF driver")
variant("odbc", default=False, description="Required for many OGR drivers")
@@ -222,7 +236,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
)
variant(
"qhull",
-default=False,
+default=True,
when="@2.1:",
description="Used for linear interpolation of gdal_grid",
)
@@ -307,6 +321,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# depends_on('ecw', when='+ecw')
# depends_on('libepsilon', when='+epsilon')
depends_on("expat@1.95:", when="+expat")
+depends_on("exprtk", when="+exprtk")
# depends_on('filegdb', when='+filegdb')
# depends_on('fme', when='+fme')
depends_on("freexl", when="+freexl")
@@ -350,6 +365,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# depends_on('lizardtech-lidar', when='+mrsid_lidar')
# depends_on('mssql_ncli', when='+mssql_ncli')
# depends_on('mssql_odbc', when='+mssql_odbc')
+depends_on("muparser", when="+muparser")
depends_on("mysql", when="+mysql")
depends_on("netcdf-c@4.7:", when="@3.9:+netcdf")
depends_on("netcdf-c", when="+netcdf")
@@ -422,17 +438,17 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
depends_on("py-numpy@1.0.0:", type=("build", "run"), when="+python")
# https://github.com/OSGeo/gdal/issues/9751
depends_on("py-numpy@:1", when="@:3.8+python", type=("build", "run"))
-depends_on("swig", type="build", when="+python")
+depends_on("swig@4:", type="build", when="+python")
depends_on("java@7:", type=("build", "link", "run"), when="@3.2:+java")
depends_on("java@6:", type=("build", "link", "run"), when="@2.4:+java")
depends_on("java@5:", type=("build", "link", "run"), when="@2.1:+java")
depends_on("java@4:", type=("build", "link", "run"), when="@:2.0+java")
depends_on("ant", type="build", when="+java")
-depends_on("swig", type="build", when="+java")
+depends_on("swig@4:", type="build", when="+java")
depends_on("perl", type=("build", "run"), when="+perl")
-depends_on("swig", type="build", when="+perl")
+depends_on("swig@4:", type="build", when="+perl")
depends_on("php", type=("build", "link", "run"), when="+php")
-depends_on("swig", type="build", when="+php")
+depends_on("swig@4:", type="build", when="+php")
# https://gdal.org/development/rfc/rfc88_googletest.html
depends_on("googletest@1.10:", type="test")
@@ -492,7 +508,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# https://github.com/OSGeo/gdal/issues/3782
patch(
-"https://github.com/OSGeo/gdal/pull/3786.patch?full_index=1",
+"https://github.com/OSGeo/gdal/commit/b1a01a6790d428038e3c7cd81ca54d6d468b68b9.patch?full_index=1",
when="@3.3.0",
level=2,
sha256="9f9824296e75b34b3e78284ec772a5ac8f8ba92c17253ea9ca242caf766767ce",
@@ -501,7 +517,7 @@ class Gdal(CMakePackage, AutotoolsPackage, PythonExtension):
# https://github.com/spack/spack/issues/41299
# ensures the correct build specific libproj is used with cmake builds (gdal >=3.5.0)
patch(
-"https://patch-diff.githubusercontent.com/raw/OSGeo/gdal/pull/8964.patch?full_index=1",
+"https://github.com/OSGeo/gdal/commit/cc1213052fbfc6aca8fd7268f39e84f38a7b4155.patch?full_index=1",
when="@3.5:3.8",
sha256="52459dc9903ced5005ba81515762a55cd829d8f5420607405c211c4a77c2bf79",
)
@@ -566,6 +582,7 @@ def cmake_args(self):
self.define_from_variant("GDAL_USE_DEFLATE", "deflate"),
self.define_from_variant("GDAL_USE_ECW", "ecw"),
self.define_from_variant("GDAL_USE_EXPAT", "expat"),
+self.define_from_variant("GDAL_USE_EXPRTK", "exprtk"),
self.define_from_variant("GDAL_USE_FILEGDB", "filegdb"),
self.define_from_variant("GDAL_USE_FREEXL", "freexl"),
self.define_from_variant("GDAL_USE_FYBA", "fyba"),
@@ -595,6 +612,7 @@ def cmake_args(self):
self.define_from_variant("GDAL_USE_MRSID", "mrsid"),
self.define_from_variant("GDAL_USE_MSSQL_NCLI", "mssql_ncli"),
self.define_from_variant("GDAL_USE_MSSQL_ODBC", "mssql_odbc"),
+self.define_from_variant("GDAL_USE_MUPARSER", "muparser"),
self.define_from_variant("GDAL_USE_MYSQL", "mysql"),
self.define_from_variant("GDAL_USE_NETCDF", "netcdf"),
self.define_from_variant("GDAL_USE_ODBC", "odbc"),

@@ -61,7 +61,7 @@ class Genie(Package):
# Disables this check.
patch("genie_disable_gopt_with_compiler_check.patch", level=0, when="@2.11:")
patch(
-"https://patch-diff.githubusercontent.com/raw/GENIE-MC/Generator/pull/376.patch?full_index=1",
+"https://github.com/GENIE-MC/Generator/commit/be723d688ea0e1070b972b9fc3b52a557cfe79b5.patch?full_index=1",
sha256="7eca9bf44251cd99edd962483ca24c5072f8e2eee688f1e95b076425f2dc59f6",
when="@3.4.2",
)

@@ -82,7 +82,7 @@ class Geos(CMakePackage):
variant("shared", default=True, description="Build shared library")
patch(
-"https://github.com/libgeos/geos/pull/461.patch?full_index=1",
+"https://github.com/libgeos/geos/commit/cb127eeac823c8b48364c1b437844a5b65ff4748.patch?full_index=1",
sha256="ab78db7ff2e8fc89e899b8233cf77d90b24d88940dd202c4219decba479c8d35",
when="@3.8:3.9",
)

@@ -127,7 +127,7 @@ class Ginkgo(CMakePackage, CudaPackage, ROCmPackage):
# Correctly find rocthrust through CMake
patch(
-"https://github.com/ginkgo-project/ginkgo/pull/1668.patch?full_index=1",
+"https://github.com/ginkgo-project/ginkgo/commit/369b12a5f4431577d60a61e67f2b0537b428abca.patch?full_index=1",
sha256="27d6ae6c87bec15464d20a963c336e89eac92625d07e3f9548e33cd7b952a496",
when="+rocm @1.8.0",
)

@@ -30,7 +30,11 @@ class Glib(MesonPackage):
# Even minor versions are stable, odd minor versions are development, only add even numbers
version("2.82.5", sha256="05c2031f9bdf6b5aba7a06ca84f0b4aced28b19bf1b50c6ab25cc675277cbc3f")
version("2.82.2", sha256="ab45f5a323048b1659ee0fbda5cecd94b099ab3e4b9abf26ae06aeb3e781fd63")
-version("2.78.3", sha256="609801dd373796e515972bf95fc0b2daa44545481ee2f465c4f204d224b2bc21")
+version(
+"2.78.3",
+sha256="609801dd373796e515972bf95fc0b2daa44545481ee2f465c4f204d224b2bc21",
+preferred=True,
+)
version("2.78.0", sha256="44eaab8b720877ce303c5540b657b126f12dc94972d9880b52959f43fb537b30")
version("2.76.6", sha256="1136ae6987dcbb64e0be3197a80190520f7acab81e2bfb937dc85c11c8aa9f04")
version("2.76.4", sha256="5a5a191c96836e166a7771f7ea6ca2b0069c603c7da3cba1cd38d1694a395dda")

@@ -80,12 +80,12 @@ class Gnina(CMakePackage, CudaPackage):
depends_on("openblas~fortran", when="@:1.1")
patch(
-"https://patch-diff.githubusercontent.com/raw/gnina/gnina/pull/280.patch?full_index=1",
+"https://github.com/gnina/gnina/commit/b59e958c5d02c9348b7d327fa54a4b1bae5d55c4.patch?full_index=1",
when="@1.3",
sha256="88d1760423cedfdb992409b0bfe3f9939ab5900f52074364db9ad8b87f4845d4",
)
patch(
-"https://patch-diff.githubusercontent.com/raw/gnina/gnina/pull/282.patch?full_index=1",
+"https://github.com/gnina/gnina/commit/c33e0ccff8a5a36053599509f394cc2b84311563.patch?full_index=1",
when="@1.3",
sha256="6a1db3d63039a11ecc6e753b325962773e0084673d54a0d93a503bca8b08fb9e",
)

@@ -39,9 +39,11 @@ class Go(Package):
license("BSD-3-Clause")
version("1.24.3", sha256="229c08b600b1446798109fae1f569228102c8473caba8104b6418cb5bc032878")
version("1.24.2", sha256="9dc77ffadc16d837a1bf32d99c624cb4df0647cee7b119edd9e7b1bcc05f2e00")
version("1.24.1", sha256="8244ebf46c65607db10222b5806aeb31c1fcf8979c1b6b12f60c677e9a3c0656")
version("1.24.0", sha256="d14120614acb29d12bcab72bd689f257eb4be9e0b6f88a8fb7e41ac65f8556e5")
version("1.23.9", sha256="08f6419547563ed9e7037d12b9c8909677c72f75f62ef85887ed9dbf49b8d2dd")
version("1.23.8", sha256="0ca1f1e37ea255e3ce283af3f4e628502fb444587da987a5bb96d6c6f15930d4")
version("1.23.7", sha256="7cfabd46b73eb4c26b19d69515dd043d7183a6559acccd5cfdb25eb6b266a458")
version("1.23.6", sha256="039c5b04e65279daceee8a6f71e70bd05cf5b801782b6f77c6e19e2ed0511222")

@@ -4,9 +4,6 @@
import json
import os
-from os import path
-from llnl.util import filesystem
from spack.package import *
@@ -109,18 +106,16 @@ def cmake_args(self):
]
# LLVM directory containing all installed CMake files
# (e.g.: configs consumed by client projects)
-llvm_cmake_dirs = filesystem.find(spec["llvm"].prefix, "LLVMExports.cmake")
+llvm_cmake_dirs = find(spec["llvm"].prefix, "LLVMExports.cmake")
if len(llvm_cmake_dirs) != 1:
raise InstallError(
"concretized llvm dependency must provide "
"a unique directory containing CMake client "
"files, found: {0}".format(llvm_cmake_dirs)
)
-args.append("-DLLVM_DIR:String={0}".format(path.dirname(llvm_cmake_dirs[0])))
+args.append("-DLLVM_DIR:String={0}".format(os.path.dirname(llvm_cmake_dirs[0])))
# clang internal headers directory
-llvm_clang_include_dirs = filesystem.find(
-spec["llvm"].prefix, "__clang_cuda_runtime_wrapper.h"
-)
+llvm_clang_include_dirs = find(spec["llvm"].prefix, "__clang_cuda_runtime_wrapper.h")
if len(llvm_clang_include_dirs) != 1:
raise InstallError(
"concretized llvm dependency must provide a "
@@ -128,11 +123,11 @@ def cmake_args(self):
"headers, found: {0}".format(llvm_clang_include_dirs)
)
args.append(
-"-DCLANG_INCLUDE_PATH:String={0}".format(path.dirname(llvm_clang_include_dirs[0]))
+"-DCLANG_INCLUDE_PATH:String={0}".format(os.path.dirname(llvm_clang_include_dirs[0]))
)
# target clang++ executable
-llvm_clang_bin = path.join(spec["llvm"].prefix.bin, "clang++")
-if not filesystem.is_exe(llvm_clang_bin):
+llvm_clang_bin = os.path.join(spec["llvm"].prefix.bin, "clang++")
+if not is_exe(llvm_clang_bin):
raise InstallError(
"concretized llvm dependency must provide a "
"valid clang++ executable, found invalid: "
@@ -152,7 +147,7 @@ def cmake_args(self):
@run_after("install")
def filter_config_file(self):
def edit_config(filename, editor):
-config_file_paths = filesystem.find(self.prefix, filename)
+config_file_paths = find(self.prefix, filename)
if len(config_file_paths) != 1:
raise InstallError(
"installed hipSYCL must provide a unique compiler driver"
@@ -186,7 +181,7 @@ def adjust_core_config(config):
# ptx backend
rpaths = set()
if self.spec.satisfies("~rocm"):
-so_paths = filesystem.find_libraries(
+so_paths = find_libraries(
"libc++", self.spec["llvm"].prefix, shared=True, recursive=True
)
if len(so_paths) != 1:
@@ -195,8 +190,8 @@ def adjust_core_config(config):
"unique directory containing libc++.so, "
"found: {0}".format(so_paths)
)
-rpaths.add(path.dirname(so_paths[0]))
-so_paths = filesystem.find_libraries(
+rpaths.add(os.path.dirname(so_paths[0]))
+so_paths = find_libraries(
"libc++abi", self.spec["llvm"].prefix, shared=True, recursive=True
)
if len(so_paths) != 1:
@@ -205,7 +200,7 @@ def adjust_core_config(config):
"unique directory containing libc++abi, "
"found: {0}".format(so_paths)
)
-rpaths.add(path.dirname(so_paths[0]))
+rpaths.add(os.path.dirname(so_paths[0]))
def adjust_cuda_config(config):
config["default-cuda-link-line"] += " " + " ".join(

@@ -66,7 +66,7 @@ class Hpx(CMakePackage, CudaPackage, ROCmPackage):
values=lambda x: isinstance(x, str) and (x.isdigit() or x == "auto"),
)
-instrumentation_values = ("apex", "google_perftools", "papi", "valgrind", "thread_debug")
+instrumentation_values = ("google_perftools", "papi", "valgrind", "thread_debug")
variant(
"instrumentation",
values=any_combination_of(*instrumentation_values),
@@ -93,10 +93,11 @@ class Hpx(CMakePackage, CudaPackage, ROCmPackage):
variant("examples", default=False, description="Build examples")
variant("async_mpi", default=False, description="Enable MPI Futures.")
variant("async_cuda", default=False, description="Enable CUDA Futures.")
+variant("apex", default=False, description="Enable APEX support")
# Build dependencies
depends_on("cxx", type="build")
+depends_on("apex", when="+apex")
depends_on("python", type=("build", "test", "run"))
depends_on("git", type="build")
depends_on("cmake", type="build")
@@ -122,7 +123,6 @@ class Hpx(CMakePackage, CudaPackage, ROCmPackage):
depends_on("cuda", when="+async_cuda")
-depends_on("otf2", when="instrumentation=apex")
depends_on("gperftools", when="instrumentation=google_perftools")
depends_on("papi", when="instrumentation=papi")
depends_on("valgrind", when="instrumentation=valgrind")
@@ -155,6 +155,7 @@ class Hpx(CMakePackage, CudaPackage, ROCmPackage):
# Restrictions for 1.5.x
conflicts("cxxstd=11", when="@1.5:")
+depends_on("apex@2.3:", when="@1.5")
# Restrictions for 1.2.X
with when("@:1.2.1"):
@@ -211,8 +212,6 @@ class Hpx(CMakePackage, CudaPackage, ROCmPackage):
conflicts("~generic_coroutines", when="target=aarch64:", msg=_msg_generic_coroutines_target)
conflicts("~generic_coroutines", when="target=arm:", msg=_msg_generic_coroutines_target)
-# Patches APEX
-patch("git_external.patch", when="@1.3.0 instrumentation=apex")
patch("mimalloc_no_version_requirement.patch", when="@:1.8.0 malloc=mimalloc")
def url_for_version(self, version):
@@ -242,6 +241,7 @@ def cmake_args(self):
self.define_from_variant("HPX_WITH_EXAMPLES", "examples"),
self.define_from_variant("HPX_WITH_ASYNC_MPI", "async_mpi"),
self.define_from_variant("HPX_WITH_ASYNC_CUDA", "async_cuda"),
+self.define_from_variant("HPX_WITH_APEX", "apex"),
self.define("HPX_WITH_TESTS", self.run_tests),
self.define("HPX_WITH_NETWORKING", "networking=none" not in spec),
self.define("HPX_WITH_PARCELPORT_TCP", spec.satisfies("networking=tcp")),
@@ -278,14 +278,4 @@ def cmake_args(self):
self.define("HPX_WITH_LOGGING", True),
]
-if spec.satisfies("instrumentation=apex"):
-args += [
-self.define("APEX_WITH_OTF2", True),
-self.define("OTF2_ROOT", spec["otf2"].prefix),
-]
-# it seems like there was a bug in the default version of APEX in 1.5.x
-if spec.satisfies("@1.5"):
-args += [self.define("HPX_WITH_APEX_TAG", "v2.3.0")]
return args

@@ -14,11 +14,16 @@ class Icon(AutotoolsPackage):
homepage = "https://www.icon-model.org"
url = "https://gitlab.dkrz.de/icon/icon-model/-/archive/icon-2024.01-public/icon-model-icon-2024.01-public.tar.gz"
git = "https://gitlab.dkrz.de/icon/icon-model.git"
+submodules = True
maintainers("skosukhin", "Try2Code")
license("BSD-3-Clause", checked_by="skosukhin")
+version(
+"2025.04", tag="icon-2025.04-public", commit="1be2ca66ea0de149971d2e77e88a9f11c764bd22"
+)
version("2024.10", sha256="5c461c783eb577c97accd632b18140c3da91c1853d836ca2385f376532e9bad1")
version("2024.07", sha256="f53043ba1b36b8c19d0d2617ab601c3b9138b90f8ff8ca6db0fd079665eb5efa")
version("2024.01-1", sha256="3e57608b7e1e3cf2f4cb318cfe2fdb39678bd53ca093955d99570bd6d7544184")
@@ -95,10 +100,9 @@ class Icon(AutotoolsPackage):
# Optimization Features:
variant("mixed-precision", default=False, description="Enable mixed-precision dynamical core")
-depends_on("c", type="build") # generated
-depends_on("cxx", type="build") # generated
-depends_on("fortran", type="build") # generated
+depends_on("c", type="build")
+depends_on("cxx", type="build")
+depends_on("fortran", type="build")
depends_on("python", type="build")
depends_on("perl", type="build")
depends_on("cmake@3.18:", type="build")
@@ -205,11 +209,13 @@ def configure_args(self):
"-arch=sm_{0}".format(self.nvidia_targets[gpu]),
"-ccbin={0}".format(spack_cxx),
]
-flags["ICON_LDFLAGS"].extend(self.compiler.stdcxx_libs)
libs += self.spec["cuda"].libs
else:
args.append("--disable-gpu")
+if gpu in self.nvidia_targets or "+comin" in self.spec:
+flags["ICON_LDFLAGS"].extend(self.compiler.stdcxx_libs)
if self.compiler.name == "gcc":
flags["CFLAGS"].append("-g")
flags["ICON_CFLAGS"].append("-O3")

@@ -81,16 +81,16 @@ class Ispc(CMakePackage):
# Fix build with Apple clang 15
patch(
-"https://github.com/ispc/ispc/pull/2785.patch?full_index=1",
+"https://github.com/ispc/ispc/commit/a25cbdcdb86cb35ea40dcddeba03564128f83eca.patch?full_index=1",
when="@1.22:1.23.0",
-sha256="f6a413bf86e49d520d23df7132004d1f09caa512187f369549a4a783859fbc41",
+sha256="fb807ff565f8b07e9517a57658fa434958ad53241ce84216b3490c91f9e937eb",
)
# Fix library lookup for NCurses in CMake
patch(
-"https://patch-diff.githubusercontent.com/raw/ispc/ispc/pull/2638.patch?full_index=1",
+"https://github.com/ispc/ispc/commit/408f831b6200439c3bc3f98fb62066f4980c1271.patch?full_index=1",
when="@1.18:1.20",
-sha256="3f7dae8d4a683fca2a6157bbcb7cbe9692ff2094b0f4afaf29be121c02b0b3ad",
+sha256="c4621feaa73c8cb6ee2bbcebe218bc0275517aaa5f8fc4a45a962c60a8168c95",
)
def setup_build_environment(self, env: EnvironmentModifications) -> None:

@@ -3,41 +3,77 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-from spack.build_systems.python import PythonPipBuilder
+import spack.build_systems.python
+from spack.build_systems import cmake, makefile
from spack.package import *
-class Jsonnet(MakefilePackage):
+class Jsonnet(MakefilePackage, CMakePackage):
"""A data templating language for app and tool developers based on JSON"""
homepage = "https://jsonnet.org/"
git = "https://github.com/google/jsonnet.git"
url = "https://github.com/google/jsonnet/archive/refs/tags/v0.18.0.tar.gz"
-maintainers("jcpunk")
+maintainers("greenc-FNAL", "gartung", "jcpunk", "marcmengel", "marcpaterno")
license("Apache-2.0")
version("master", branch="master")
version("0.21.0", sha256="a12ebca72e43e7061ffe4ef910e572b95edd7778a543d6bf85f6355bd290300e")
version("0.20.0", sha256="77bd269073807731f6b11ff8d7c03e9065aafb8e4d038935deb388325e52511b")
version("0.19.1", sha256="f5a20f2dc98fdebd5d42a45365f52fa59a7e6b174e43970fea4f9718a914e887")
version("0.18.0", sha256="85c240c4740f0c788c4d49f9c9c0942f5a2d1c2ae58b2c71068107bc80a3ced4")
version("0.17.0", sha256="076b52edf888c01097010ad4299e3b2e7a72b60a41abbc65af364af1ed3c8dbe")
variant("python", default=False, description="Provide Python bindings for jsonnet")
build_system("makefile", conditional("cmake", when="@0.21.0:"), default="makefile")
conflicts("%gcc@:5.4.99", when="@0.18.0:")
-depends_on("c", type="build") # generated
-depends_on("cxx", type="build") # generated
+depends_on("c", type="build")
+depends_on("cxx", type="build")
with when("build_system=cmake"):
depends_on("nlohmann-json@3.6.1:")
variant("python", default=False, description="Provide Python bindings for jsonnet")
extends("python", when="+python")
depends_on("py-setuptools", type=("build",), when="+python")
depends_on("py-pip", type=("build",), when="+python")
depends_on("py-wheel", type=("build",), when="+python")
class MakefileBuilder(makefile.MakefileBuilder):
@property
def install_targets(self):
return ["PREFIX={0}".format(self.prefix), "install"]
@run_after("install")
def python_install(self):
-if "+python" in self.spec:
-pip(*PythonPipBuilder.std_args(self), f"--prefix={self.prefix}", ".")
+if self.pkg.spec.satisfies("+python"):
+pip(
+*spack.build_systems.python.PythonPipBuilder.std_args(self.pkg),
+f"--prefix={self.pkg.prefix}",
+".",
+)
class CMakeBuilder(cmake.CMakeBuilder):
def cmake_args(self):
return [
self.define("USE_SYSTEM_JSON", True),
self.define("BUILD_SHARED_BINARIES", True),
self.define("BUILD_TESTS", self.pkg.run_tests),
]
@run_after("install")
def python_install(self):
if self.pkg.spec.satisfies("+python"):
pip(
*spack.build_systems.python.PythonPipBuilder.std_args(self.pkg),
f"--prefix={self.pkg.prefix}",
".",
)

@@ -271,7 +271,7 @@ class Julia(MakefilePackage):
# Fix libstdc++ not being found (https://github.com/JuliaLang/julia/issues/47987)
patch(
-"https://github.com/JuliaLang/julia/pull/48342.patch?full_index=1",
+"https://github.com/JuliaLang/julia/commit/7d2499dd35bebfcd8419d2d51611ba4ac1a19a9c.patch?full_index=1",
sha256="10f7cab89c8353b2648a968d2c8e8ed8bd90961df3227084f1d69d3d482933d7",
when="@1.8.4:1.8.5",
)
@@ -280,8 +280,8 @@ class Julia(MakefilePackage):
# applicable to previous versions of the library too
# (https://github.com/JuliaLang/julia/issues/49895).
patch(
-"https://github.com/JuliaLang/julia/pull/49909.patch?full_index=1",
-sha256="7fa53516b97d83ccf06f6d387c04d337849808f7e8ee2bdc2e79894d84578afc",
+"https://github.com/JuliaLang/julia/commit/5d43397ee52323f1c015513b2be3909078b646ef.patch?full_index=1",
+sha256="15f9f2a7b6ae21aa5de8655970c673a953e1d46018e901f7fff98aead8e4a929",
when="@1.6.4:1.9.0",
)

@@ -20,6 +20,7 @@ class Lcio(CMakePackage):
license("BSD-3-Clause")
version("master", branch="master")
+version("2.22.6", sha256="69271f021198d15390a0134110ab5c1cbeea9a183cef3f94f0d1ee91fa4748bb")
version("2.22.5", sha256="a756521a2419f8d25d4a4f1bab0008e16c9947020d015f2f6ce457ab0a0429bf")
version("2.22.4", sha256="5d60eeb4df8611059f4bc839ac098f5d7e3608a662591e9cbae48aed07995514")
version("2.22.3", sha256="5b9715786c5e953f8854881c5d0c4a48030a5491f1701232b82e960ac7980162")
@@ -66,7 +67,7 @@ class Lcio(CMakePackage):
variant("rootdict", default=True, description="Turn on to build/install ROOT dictionary.")
variant("examples", default=False, description="Turn on to build LCIO examples")
-depends_on("c", type="build")
+depends_on("c", type="build", when="@:2.22.5")
depends_on("cxx", type="build")
depends_on("sio@0.0.2:", when="@2.14:")

@@ -22,7 +22,8 @@ class Lemon(CMakePackage):
# soplex not mentioned in docs but shown in cmakecache
# variant("soplex", default=False, description="Enable SOPLEX solver backend") #TODO
-depends_on("cxx", type="build") # generated
+depends_on("c", type="build")
+depends_on("cxx", type="build")
depends_on("glpk", when="+glpk")
depends_on("cplex", when="+ilog")

@@ -21,7 +21,8 @@ class Lerc(CMakePackage):
version("4.0.0", sha256="91431c2b16d0e3de6cbaea188603359f87caed08259a645fd5a3805784ee30a0")
version("3.0", sha256="8c0148f5c22d823eff7b2c999b0781f8095e49a7d3195f13c68c5541dd5740a1")
-depends_on("cxx", type="build") # generated
+depends_on("c", type="build")
+depends_on("cxx", type="build")
depends_on("cmake@3.11:", type="build")
depends_on("cmake@3.12:", type="build", when="@4.0.0:")

@@ -6,7 +6,6 @@
import sys
import llnl.util.filesystem as fsys
import llnl.util.tty as tty
from spack.package import *

@@ -2,7 +2,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.build_systems
import spack.build_systems.autotools
import spack.build_systems.cmake
from spack.package import *

@@ -3,7 +3,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import sys
import spack.build_systems
import spack.build_systems.autotools
from spack.package import *

@@ -82,7 +82,7 @@ class Libzmq(AutotoolsPackage):
# Fix build issues with gcc-12
patch(
-"https://github.com/zeromq/libzmq/pull/4334.patch?full_index=1",
+"https://github.com/zeromq/libzmq/commit/a01d259db372bff5e049aa966da4efce7259af67.patch?full_index=1",
sha256="edca864cba914481a5c97d2e975ba64ca1d2fbfc0044e9a78c48f1f7b2bedb6f",
when="@4.3.4",
)

@@ -1212,6 +1212,7 @@ def cmake_args(self):
[
define("LLVM_ENABLE_RUNTIMES", runtimes),
define("RUNTIMES_CMAKE_ARGS", runtime_cmake_args),
+define("LIBCXXABI_USE_LLVM_UNWINDER", not spec.satisfies("libunwind=none")),
]
)

@@ -35,9 +35,9 @@ class LuaSol2(CMakePackage):
depends_on("lua", type=("link", "run"))
patch(
-"https://github.com/ThePhD/sol2/pull/1606.patch?full_index=1",
+"https://github.com/ThePhD/sol2/commit/d805d027e0a0a7222e936926139f06e23828ce9f.patch?full_index=1",
when="@3.3.0 %oneapi@2025:",
-sha256="ed6c5924a0639fb1671e6d7dacbb88dce70aa006bcee2f380b6acd34da89664c",
+sha256="ea7e30be5d6e0d71aded18d68b16725fa4e5f86f99757048fa18c7ece92417c5",
)
def cmake_args(self):

@@ -22,10 +22,12 @@ class Mesa(MesonPackage):
version("main", branch="main")
version(
-"23.3.6",
-sha256="cd3d6c60121dea73abbae99d399dc2facaecde1a8c6bd647e6d85410ff4b577b",
+"25.0.5",
+sha256="c0d245dea0aa4b49f74b3d474b16542e4a8799791cd33d676c69f650ad4378d0",
preferred=True,
)
+version("24.3.4", sha256="e641ae27191d387599219694560d221b7feaa91c900bcec46bf444218ed66025")
+version("23.3.6", sha256="cd3d6c60121dea73abbae99d399dc2facaecde1a8c6bd647e6d85410ff4b577b")
version("23.3.3", sha256="518307c0057fa3cee8b58df78be431d4df5aafa7edc60d09278b2d7a0a80f3b4")
version("23.2.1", sha256="64de0616fc2d801f929ab1ac2a4f16b3e2783c4309a724c8a259b20df8bbc1cc")
version("23.1.9", sha256="295ba27c28146ed09214e8ce79afa1659edf9d142decc3c91f804552d64f7510")
@@ -66,6 +68,7 @@ class Mesa(MesonPackage):
depends_on("python@:3.11", when="@:23.2", type="build")
depends_on("py-packaging", type="build", when="^python@3.12:")
depends_on("py-mako@0.8.0:", type="build")
+depends_on("py-pyyaml", when="@24.2:", type="build")
depends_on("unwind")
depends_on("expat")
depends_on("zlib-api")
@@ -126,6 +129,9 @@ class Mesa(MesonPackage):
depends_on("libxt")
depends_on("xrandr")
depends_on("glproto@1.4.14:")
+# In @24.3:, "libxshmfence@1.1:" is needed when:
+# (with_dri_platform == 'drm') or (with_any_vk), see mesa's meson.build.
+depends_on("libxshmfence@1.1:", when="@24.3:")
# version specific issue
# https://gcc.gnu.org/bugzilla/show_bug.cgi?id=96130
@@ -198,7 +204,6 @@ def meson_args(self):
args = [
"-Dvulkan-drivers=",
"-Dgallium-vdpau=disabled",
-"-Dgallium-omx=disabled",
"-Dgallium-va=disabled",
"-Dgallium-xa=disabled",
"-Dgallium-nine=false",
@@ -209,6 +214,9 @@ def meson_args(self):
# gallium-xvmc was removed in @main and @2.23:
if self.spec.satisfies("@:22.2"):
args.append("-Dgallium-xvmc=disabled")
+# the option 'gallium-omx' is present in @24.2.4 and removed in @main
+if spec.satisfies("@:24.2.4"):
+args.append("-Dgallium-omx=disabled")
args_platforms = []
args_gallium_drivers = ["swrast"]
@@ -247,10 +255,14 @@ def meson_args(self):
if "+egl" in spec:
num_frontends += 1
-args.extend(["-Degl=enabled", "-Dgbm=enabled", "-Ddri3=enabled"])
+args.extend(["-Degl=enabled", "-Dgbm=enabled"])
+if spec.satisfies("@:24.2.4"):
+args.extend(["-Ddri3=enabled"])
args_platforms.append("surfaceless")
else:
-args.extend(["-Degl=disabled", "-Dgbm=disabled", "-Ddri3=disabled"])
+args.extend(["-Degl=disabled", "-Dgbm=disabled"])
+if spec.satisfies("@:24.2.4"):
+args.extend(["-Ddri3=disabled"])
args.append(opt_bool("+opengl" in spec, "opengl"))
args.append(opt_enable("+opengles" in spec, "gles1"))

@@ -507,7 +507,7 @@ class Mfem(Package, CudaPackage, ROCmPackage):
patch("mfem-4.5.patch", when="@4.5.0")
patch("mfem-4.6.patch", when="@4.6.0")
patch(
-"https://github.com/mfem/mfem/pull/4005.patch?full_index=1",
+"https://github.com/mfem/mfem/commit/0ddb7aba31a0161fca08ff9dd617e6d36a565366.patch?full_index=1",
when="@4.6.0 +gslib+shared+miniapps",
sha256="2a31682d876626529e2778a216d403648b83b90997873659a505d982d0e65beb",
)

@@ -45,8 +45,8 @@ class MongoCDriver(AutotoolsPackage, CMakePackage):
variant("zstd", default=True, description="Enable zstd support.")
patch(
-"https://github.com/mongodb/mongo-c-driver/pull/466.patch?full_index=1",
-sha256="d8802d91226c176ba46d5b82413757121331d556a3a3d57ab65b70e175cab296",
+"https://github.com/mongodb/mongo-c-driver/commit/5d759ff62f0c1389075b8b40932b7fdc11b4e12d.patch?full_index=1",
+sha256="d4b6be7e885ef3e2ce12811c307b8b22fa297bfb9ea97d8493eef1d053f206a4",
when="@1.8.1",
)

@@ -38,6 +38,8 @@ class Mpilander(CMakePackage):
conflicts("%apple-clang@:7.4")
conflicts("%intel@:16")
conflicts("platform=windows")
def cmake_args(self):
args = [
# tests and examples

@@ -2,10 +2,12 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+from spack.build_systems.autotools import AutotoolsBuilder
from spack.build_systems.cmake import CMakeBuilder
from spack.package import *
- class Muparser(CMakePackage, Package):
+ class Muparser(CMakePackage, AutotoolsPackage):
"""C++ math expression parser library."""
homepage = "https://beltoforion.de/en/muparser/"
@@ -22,7 +24,9 @@ class Muparser(CMakePackage, Package):
patch("auto_ptr.patch", when="@2.2.5")
variant("samples", default=True, description="enable samples", when="build_system=cmake")
- variant("openmp", default=True, description="enable OpenMP support", when="build_system=cmake")
+ variant(
+ "openmp", default=False, description="enable OpenMP support", when="build_system=cmake"
+ )
variant(
"wide_char",
default=False,
@@ -31,13 +35,13 @@ class Muparser(CMakePackage, Package):
)
variant("shared", default=True, description="enable shared libs", when="build_system=cmake")
# Non-CMake build system is not supported by windows
conflicts("platform=windows", when="@:2.2.5")
- build_system(conditional("cmake", when="@2.2.6:"), "generic", default="cmake")
+ build_system(conditional("cmake", when="@2.2.6:"), "autotools", default="cmake")
- depends_on("c", type="build") # generated
- depends_on("cxx", type="build") # generated
+ depends_on("c", type="build")
+ depends_on("cxx", type="build")
class CMakeBuilder(CMakeBuilder):
def cmake_args(self):
return [
self.define_from_variant("ENABLE_SAMPLES", "samples"),
@@ -46,17 +50,14 @@ def cmake_args(self):
self.define_from_variant("ENABLE_WIDE_CHAR", "wide_char"),
]
- @when("@:2.2.5")
- def install(self, spec, prefix):
- options = [
+ class AutotoolsBuilder(AutotoolsBuilder):
+ parallel = False
+ def configure_args(self):
+ return [
"--disable-debug",
"--disable-samples",
"--disable-dependency-tracking",
"CXXFLAGS={0}".format(self.compiler.cxx11_flag),
- "--prefix=%s" % prefix,
]
- configure(*options)
- make(parallel=False)
- make("install")
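The `build_system(conditional("cmake", when="@2.2.6:"), "autotools", default="cmake")` directive restricts which build system each version may use. As a toy model of that selection logic (not Spack's actual solver; versions simplified to tuples):

```python
def allowed_build_systems(version: tuple) -> list:
    # cmake is conditional on @2.2.6:, autotools is always available.
    systems = []
    if version >= (2, 2, 6):
        systems.append("cmake")
    systems.append("autotools")
    return systems


def pick_build_system(version: tuple, default: str = "cmake") -> str:
    # Prefer the declared default when this version allows it.
    allowed = allowed_build_systems(version)
    return default if default in allowed else allowed[0]


print(pick_build_system((2, 2, 5)))  # old release falls back to autotools
print(pick_build_system((2, 3, 0)))  # newer release uses the default, cmake
```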


@@ -40,6 +40,7 @@ class Musica(CMakePackage):
depends_on("cxx", type="build")
depends_on("fortran", type="build")
depends_on("mpi", when="+mpi")
+ depends_on("netcdf-fortran", when="+tuvx")
def cmake_args(self):
args = [


@@ -0,0 +1,54 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Neon(CMakePackage):
"""NeoN is a PDE solver for CFD frameworks."""
homepage = "https://github.com/exasim-project/neon"
git = "https://github.com/exasim-project/neon.git"
maintainers("greole", "HenningScheufler")
license("MIT", checked_by="greole")
version("main", branch="main")
variant("cuda", default=False, description="Compile with CUDA support")
variant("hip", default=False, description="Compile with HIP support")
variant("omp", default=False, description="Compile with OMP support")
variant("threads", default=True, description="Compile with Threads support")
variant("ginkgo", default=True, description="Compile with Ginkgo")
variant("petsc", default=False, description="Compile with PETSc")
variant("sundials", default=True, description="Compile with Sundials")
variant("test", default=False, description="Compile and install tutorial programs")
variant("adios2", default=False, description="Compile with ADIOS2 support")
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("mpi@3")
depends_on("cuda@12.6", when="+cuda")
depends_on("hip", when="+hip")
depends_on("kokkos@4.3.00")
depends_on("ginkgo@develop", when="+ginkgo")
depends_on("petsc", when="+petsc")
depends_on("sundials", when="+sundials")
depends_on("adios2", when="+adios2")
def cmake_args(self):
args = [
self.define_from_variant("NeoN_WITH_GINKGO", "ginkgo"),
self.define_from_variant("NeoN_WITH_OMP", "omp"),
self.define_from_variant("NeoN_WITH_THREADS", "threads"),
self.define_from_variant("NeoN_WITH_ADIOS2", "adios2"),
self.define_from_variant("NeoN_WITH_SUNDIALS", "sundials"),
self.define_from_variant("NeoN_WITH_PETSC", "petsc"),
self.define_from_variant("NeoN_BUILD_TESTS", "test"),
self.define_from_variant("Kokkos_ENABLE_CUDA", "cuda"),
self.define_from_variant("Kokkos_ENABLE_HIP", "hip"),
self.define("CPM_USE_LOCAL_PACKAGES", True),
]
return args
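Each `define_from_variant` call above turns a Spack variant into a CMake cache entry. For boolean variants its behavior can be approximated outside Spack as follows (an illustrative sketch, not the real implementation):

```python
def define(name: str, value) -> str:
    # Booleans become CMake BOOL cache entries; other values pass through.
    if isinstance(value, bool):
        return f"-D{name}:BOOL={'ON' if value else 'OFF'}"
    return f"-D{name}={value}"


def define_from_variant(variants: dict, cmake_var: str, variant: str) -> str:
    # Look up the variant's current value and render it as a -D flag.
    return define(cmake_var, variants[variant])


variants = {"cuda": False, "ginkgo": True}
print(define_from_variant(variants, "Kokkos_ENABLE_CUDA", "cuda"))
print(define_from_variant(variants, "NeoN_WITH_GINKGO", "ginkgo"))
```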


@@ -98,12 +98,12 @@ class NetcdfC(CMakePackage, AutotoolsPackage):
# Fix headers
# See https://github.com/Unidata/netcdf-c/pull/1505
patch(
- "https://github.com/Unidata/netcdf-c/pull/1505.patch?full_index=1",
+ "https://github.com/Unidata/netcdf-c/commit/cca9ae64f622bb2b7f164fa352c820b5fe4d132c.patch?full_index=1",
sha256="495b3e5beb7f074625bcec2ca76aebd339e42719e9c5ccbedbdcc4ffb81a7450",
)
# See https://github.com/Unidata/netcdf-c/pull/1508
patch(
- "https://github.com/Unidata/netcdf-c/pull/1508.patch?full_index=1",
+ "https://github.com/Unidata/netcdf-c/commit/f0dc61a73c8a35432034c8d262f1893a0090c3ed.patch?full_index=1",
sha256="19e7f31b96536928621b1c29bb6d1a57bcb7aa672cea8719acf9ac934cdd2a3e",
)


@@ -122,7 +122,7 @@ class NetlibLapack(CMakePackage):
# renaming with _64 suffixes pushes code beyond fortran column 72
patch(
- "https://github.com/Reference-LAPACK/lapack/pull/1093.patch?full_index=1",
+ "https://github.com/Reference-LAPACK/lapack/commit/0799b59571a4bbb434c62ef2346146123aa19d8d.patch?full_index=1",
sha256="b1af8b6ef2113a59aba006319ded0c1a282533c3815289e1c9e91185f63ee9fe",
when="@3.6:3.12.1",
)
@@ -132,7 +132,7 @@ class NetlibLapack(CMakePackage):
when="@3.12:3.12.1",
)
patch(
- "https://github.com/Reference-LAPACK/lapack/pull/1099.patch?full_index=1",
+ "https://github.com/Reference-LAPACK/lapack/commit/447fd4e7844b81e62deff09b6b2f7961eecc7590.patch?full_index=1",
sha256="3059ebf898cbca5101db77b77c645ab144a3cecbe58dd2bb46d9b84e7debee92",
when="@3.12:3.12.1",
)


@@ -13,6 +13,11 @@ class Nextflow(Package):
maintainers("dialvarezs", "marcodelapierre")
+ version(
+ "25.04.0",
+ sha256="33d888b1e0127566950719316bac735975e15800018768cceb7d3d77ad0719eb",
+ expand=False,
+ )
version(
"24.10.5",
sha256="a9733a736cfecdd70e504b942e823da7005f9afc288902e67afe86b43dc9bcdb",


@@ -37,8 +37,8 @@ class Ocaml(Package):
# constants. Fixes compatibility with the integrated assembler in clang 11.0.0.
# (Jacob Young, review by Nicolas Ojeda Bar)
patch(
- "https://github.com/ocaml/ocaml/pull/9981.patch?full_index=1",
- sha256="12700c697f0d5227e8eddd62e4308ec3cd67c0a5a5a1b7eec376686a5fd63a5c",
+ "https://github.com/ocaml/ocaml/commit/8a46d76bf9359b5cc505b3f2f9c81eb624c631fa.patch?full_index=1",
+ sha256="805cdd458c3849e0050600bfeac7cbe4a1da78aae7b686b529a475e63948048f",
when="@:4.11.0 %clang@11:",
)


@@ -47,8 +47,8 @@ class Openmm(CMakePackage, CudaPackage):
# `openmm@7.5.1+cuda`, which is the version currently required by
# `py-alphafold`.
patch(
- "https://github.com/openmm/openmm/pull/3154.patch?full_index=1",
- sha256="90bc01b34cf998e90220669b3ed55cd3c42000ad364234033aac631ed754e9bd",
+ "https://github.com/openmm/openmm/commit/71bc7c8c70ffbccd82891dec7fd4f4deb99af64d.patch?full_index=1",
+ sha256="9562e03eb8d43ba4d8f0f7b2a3326cc464985fc148804cf9e4340fd7a87bb8e7",
when="@7.5.1+cuda",
)


@@ -7,8 +7,6 @@
import re
import sys
- import llnl.util.tty as tty
- import spack.compilers.config
from spack.package import *


@@ -92,29 +92,29 @@ class OpenpmdApi(CMakePackage):
# CMake: Fix Python Install Directory
patch(
- "https://github.com/openPMD/openPMD-api/pull/1393.patch?full_index=1",
- sha256="b5cecbdbe16d98c0ba352fa861fcdf9d7c7cc85f21226fa03effa7d62a7cb276",
+ "https://github.com/openPMD/openPMD-api/commit/31e3c42eb6687269adfb0e63c35269db328ea6ec.patch?full_index=1",
+ sha256="e8b57bcdc965643f46280408244f4d574bff09d0c19c863f42395a7203a89385",
when="@0.15.0",
)
# macOS AppleClang12 Fixes
patch(
- "https://github.com/openPMD/openPMD-api/pull/1395.patch?full_index=1",
- sha256="791c0a9d1dc09226beb26e8e67824b3337d95f4a2a6e7e64637ea8f0d95eee61",
+ "https://github.com/openPMD/openPMD-api/commit/c9b0f70294ef8d9ac89018c9b439815be9e77b96.patch?full_index=1",
+ sha256="83714efc90fe6d4f909bdde1b0578a43e6a013a5db6b10e87466665122fd6b21",
when="@0.15.0",
)
# forgot to bump version.hpp in 0.15.1
patch(
- "https://github.com/openPMD/openPMD-api/pull/1417.patch?full_index=1",
- sha256="c306483f1f94b308775a401c9cd67ee549fac6824a2264f5985499849fe210d5",
+ "https://github.com/openPMD/openPMD-api/commit/b3d3057e141af3a40dde5f00262a5671979a95c7.patch?full_index=1",
+ sha256="f31d0adcd407d20d559aa67e5f6ec2d81c6579b8b0166918c5178c02af180fba",
when="@0.15.1",
)
# fix superbuild control in 0.16.0
patch(
- "https://github.com/openPMD/openPMD-api/pull/1678.patch?full_index=1",
- sha256="e49fe79691bbb5aae2224d218f29801630d33f3a923c518f6bfb39ec22fd6a72",
+ "https://github.com/openPMD/openPMD-api/commit/3dc3a463d18dd5f87c38ee64d93bc7814b1cbb5d.patch?full_index=1",
+ sha256="474a7ccf11f0892717271fe3974a6ee046c15187a6ba12c75085a0d092071c9c",
when="@0.16.0",
)


@@ -4,7 +4,6 @@
import os
- import spack.store
from spack.package import *
from ..boost.package import Boost
@@ -38,11 +37,6 @@ class Openspeedshop(CMakePackage):
variant(
"runtime", default=False, description="build only the runtime libraries and collectors."
)
- variant(
- "crayfe",
- default=False,
- description="build only the FE tool using the runtime_dir to point to target build.",
- )
variant("cuda", default=False, description="build with cuda packages included.")
variant(
@@ -123,11 +117,6 @@ class Openspeedshop(CMakePackage):
depends_on("cbtf-krell@develop", when="@develop", type=("build", "link", "run"))
depends_on("cbtf-krell@1.9.3:9999", when="@2.4.0:9999", type=("build", "link", "run"))
- depends_on("cbtf-krell@develop+crayfe", when="@develop+crayfe", type=("build", "link", "run"))
- depends_on(
- "cbtf-krell@1.9.3:9999+crayfe", when="@2.4.0:9999+crayfe", type=("build", "link", "run")
- )
depends_on("cbtf-krell@develop+mpich2", when="@develop+mpich2", type=("build", "link", "run"))
depends_on(
"cbtf-krell@1.9.3:9999+mpich2", when="@2.4.0:9999+mpich2", type=("build", "link", "run")
@@ -164,29 +153,6 @@ class Openspeedshop(CMakePackage):
build_directory = "build_openspeedshop"
- def set_cray_login_node_cmake_options(self, spec, cmake_options):
- # Appends to cmake_options the options that will enable the appropriate
- # Cray login node libraries
- cray_login_node_options = []
- rt_platform = "cray"
- # How do we get the compute node (CNL) cbtf package install
- # directory path?
- # spec['cbtf'].prefix is the login node value for this build, as
- # we only get here when building the login node components and
- # that is all that is known to spack.
- store = spack.store
- be_ck = store.db.query_one("cbtf-krell arch=cray-CNL-haswell")
- # Equivalent to install-tool cmake arg:
- # '-DCBTF_KRELL_CN_RUNTIME_DIR=%s'
- # % <base dir>/cbtf_v2.4.0.release/compute)
- cray_login_node_options.append("-DCBTF_KRELL_CN_RUNTIME_DIR=%s" % be_ck.prefix)
- cray_login_node_options.append("-DRUNTIME_PLATFORM=%s" % rt_platform)
- cmake_options.extend(cray_login_node_options)
def cmake_args(self):
spec = self.spec
@@ -240,13 +206,6 @@ def cmake_args(self):
if spec.satisfies("+cuda"):
cmake_args.extend(["-DCBTF_ARGONAVIS_DIR=%s" % spec["cbtf-argonavis"].prefix])
- if spec.satisfies("+crayfe"):
- # We need to build target/compute node
- # components/libraries first then pass
- # those libraries to the openspeedshop
- # login node build
- self.set_cray_login_node_cmake_options(spec, cmake_args)
return cmake_args
def set_defaultbase_cmake_options(self, spec, cmake_options):


@@ -4,7 +4,6 @@
import os
- import spack.store
from spack.package import *
from ..boost.package import Boost
@@ -41,11 +40,6 @@ class OpenspeedshopUtils(CMakePackage):
variant(
"runtime", default=False, description="build only the runtime libraries and collectors."
)
- variant(
- "crayfe",
- default=False,
- description="build only the FE tool using the runtime_dir to point to target build.",
- )
variant("cuda", default=False, description="build with cuda packages included.")
variant(
@@ -117,11 +111,6 @@ class OpenspeedshopUtils(CMakePackage):
depends_on("cbtf-krell@develop", when="@develop", type=("build", "link", "run"))
depends_on("cbtf-krell@1.9.3:9999", when="@2.4.0:9999", type=("build", "link", "run"))
- depends_on("cbtf-krell@develop+crayfe", when="@develop+crayfe", type=("build", "link", "run"))
- depends_on(
- "cbtf-krell@1.9.3:9999+crayfe", when="@2.4.0:9999+crayfe", type=("build", "link", "run")
- )
depends_on("cbtf-krell@develop+mpich2", when="@develop+mpich2", type=("build", "link", "run"))
depends_on(
"cbtf-krell@1.9.3:9999+mpich2", when="@2.4.0:9999+mpich2", type=("build", "link", "run")
@@ -158,28 +147,6 @@ class OpenspeedshopUtils(CMakePackage):
build_directory = "build_openspeedshop"
- def set_cray_login_node_cmake_options(self, spec, cmake_options):
- # Appends to cmake_options the options that will enable the appropriate
- # Cray login node libraries
- cray_login_node_options = []
- rt_platform = "cray"
- # How do we get the compute node (CNL) cbtf package install
- # directory path?
- # spec['cbtf'].prefix is the login node value for this build, as
- # we only get here when building the login node components and
- # that is all that is known to spack.
- be_ck = spack.store.db.query_one("cbtf-krell arch=cray-CNL-haswell")
- # Equivalent to install-tool cmake arg:
- # '-DCBTF_KRELL_CN_RUNTIME_DIR=%s'
- # % <base dir>/cbtf_v2.4.0elease/compute)
- cray_login_node_options.append("-DCBTF_KRELL_CN_RUNTIME_DIR=%s" % be_ck.prefix)
- cray_login_node_options.append("-DRUNTIME_PLATFORM=%s" % rt_platform)
- cmake_options.extend(cray_login_node_options)
def cmake_args(self):
# Appends base options to cmake_args
spec = self.spec
@@ -220,13 +187,6 @@ def cmake_args(self):
]
)
- if spec.satisfies("+crayfe"):
- # We need to build target/compute node
- # components/libraries first then pass
- # those libraries to the openspeedshop
- # login node build
- self.set_cray_login_node_cmake_options(spec, cmake_args)
cmake_args.extend(["-DBUILD_QT3_GUI=FALSE"])
return cmake_args


@@ -0,0 +1,43 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class OptixDev(Package):
"""OptiX is an application framework for achieving optimal ray tracing
performance on the GPU. This package contains the minimal set of necessary
header files for building an application with OptiX support, including
access to the OptiX functions provided by the NVIDIA display driver.
See https://github.com/NVIDIA/optix-dev. It is not necessary to use this
package when installing the complete OptiX SDK; header files are already
included there."""
homepage = "https://developer.nvidia.com/rtx/ray-tracing/optix"
url = "https://github.com/NVIDIA/optix-dev/archive/refs/tags/v9.0.0.tar.gz"
license("LicenseRef-NvidiaProprietary AND BSD-3-Clause")
maintainers("plexoos")
build_system("generic")
version("9.0.0", sha256="069a5860040ea611e7eb6317f8e3bb0f0d54a5acac744568f7290d7cb8711c05")
version("8.1.0", sha256="aa32dfb55f37ff92964a5545b056094d86635441b3513e1d45a9410404b6d7c2")
version("8.0.0", sha256="b32e74c9f5c13549ff3a9760076271b5b6ec28f93fe6a8dd0bde74d7e5c58e05")
version("7.7.0", sha256="02e5acdb8870a5668c763d47043d61586c1c4e72395d64e7bdd99ea04bc4222d")
version("7.6.0", sha256="4fe1e047d0e80980e57c469e3491f88cd3c3b735462b35cb3a0c2797a751fb1e")
version("7.5.0", sha256="9053ba3636dd612ad5e50106a56ea4022e719a2d35c914c61fc9bc681b0e64d6")
version("7.4.0", sha256="91d35f1ba0f519f9e98582586478a64d323e7d7263b8b8349797c4aeb7fc53af")
version("7.3.0", sha256="a74b0120e308258f5b5d5b30f905e13e1ceeeca2058aaee58310c84647fcc31d")
version("7.2.0", sha256="a82bc7da75f3db81be73826a00b694c858be356258323d454f4e1aa78a5670f8")
version("7.1.0", sha256="70b9adac04e5a36185e715a74306f22426334b6a3850dd7f1a2744212c83f9e1")
version("7.0.0", sha256="8b294bcd4d23ced20310d73ed320c7bc3ecbb79e3d50f00eb5d97a3639d129a3")
depends_on("c", type="build")
depends_on("cxx", type="build")
def install(self, spec, prefix):
install_tree("include", prefix.include)
install("LICENSE.txt", prefix)
install("license_info.txt", prefix)
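`install_tree` and `install` come from Spack's build environment; the effect of this install phase is roughly the following stdlib operation (a sketch with illustrative paths, not Spack's actual helpers):

```python
import os
import shutil
import tempfile


def sketch_install(stage: str, prefix: str) -> None:
    # Mirror install_tree("include", prefix.include): copy the header
    # tree, then drop the two license files at the top of the prefix.
    shutil.copytree(os.path.join(stage, "include"), os.path.join(prefix, "include"))
    for name in ("LICENSE.txt", "license_info.txt"):
        shutil.copy2(os.path.join(stage, name), prefix)


# Build a throwaway stage directory to demonstrate the copy.
stage = tempfile.mkdtemp()
prefix = os.path.join(tempfile.mkdtemp(), "optix-dev")
os.makedirs(os.path.join(stage, "include"))
open(os.path.join(stage, "include", "optix.h"), "w").close()
for name in ("LICENSE.txt", "license_info.txt"):
    open(os.path.join(stage, name), "w").close()
os.makedirs(prefix)
sketch_install(stage, prefix)
print(sorted(os.listdir(prefix)))
```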


@@ -2,7 +2,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
- from spack import *
from spack.package import *


@@ -26,6 +26,7 @@ class Parsec(CMakePackage, CudaPackage):
license("BSD-3-Clause-Open-MPI")
version("master", branch="master")
+ version("4.0.2411", sha256="3f5750565b9f673626284dd0ba835dadea3633577fee50ac217baf43a335f2ef")
version("3.0.2209", sha256="67d383d076991484cb2a265f56420abdea7cc1f329c63ac65a3e96fbfb6cc295")
version("3.0.2012", sha256="7a8403ca67305738f3974cbc7a51b64c4ec353ae9170f2468262a9a52035eff6")
version(
@@ -59,6 +60,8 @@ class Parsec(CMakePackage, CudaPackage):
# TODO: Spack does not handle cross-compilation atm
# variant('xcompile', default=False, description='Cross compile')
depends_on("c", type="build")
depends_on("cmake@3.18:", type="build")
depends_on("python", type="build")
depends_on("flex", type="build")