Compare commits


94 Commits

Author SHA1 Message Date
M. Eric Irrgang
06f9bcf734 Update for py-gmxapi for 0.4.1. (#37834)
* Update for py-gmxapi for 0.4.1.

* Note 0.4.1 hash from PyPI.
* Note relaxed dependencies for future versions.

* Update var/spack/repos/builtin/packages/py-gmxapi/package.py
2023-05-24 22:03:10 -05:00
Manuela Kuhn
ee2725762f py-charset-normalizer: add 3.1.0 (#37880) 2023-05-24 21:57:29 -05:00
Manuela Kuhn
eace0a177c py-bids-validator: add 1.11.0 (#37845) 2023-05-24 21:36:45 -05:00
Manuela Kuhn
80c7d74707 py-bottleneck: add 1.3.7 (#37847) 2023-05-24 21:36:01 -05:00
Manuela Kuhn
a6f5bf821d py-certifi: add 2023.5.7 (#37848) 2023-05-24 21:35:13 -05:00
Manuela Kuhn
b214406253 py-attrs: add 23.1.0 (#37817)
* py-attrs: add 23.1.0

* Add missing dependency
2023-05-24 20:21:31 -05:00
Manuela Kuhn
5b003d80e5 py-babel: add 2.12.1 (#37818)
* py-babel: add 2.12.1

* Update var/spack/repos/builtin/packages/py-babel/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-24 20:13:02 -05:00
Adam J. Stewart
185b2d3ee7 py-rasterio: add v1.3.7 (#37886) 2023-05-24 18:48:16 -04:00
Adam J. Stewart
71bb2a1899 py-lightly: add v1.4.6 (#37885) 2023-05-24 16:43:07 -04:00
Nathalie Furmento
785c31b730 starpu: add release 1.4.1 (#37883) 2023-05-24 09:36:56 -07:00
QuellynSnead
175da4a88a paraview (protobuf failure) #37437 (#37440)
When attempting to build paraview@5.10.1 using a recent Intel
compiler (Classic or OneAPI) or the IBM XL compiler, the build
fails if the version of protobuf used is > 3.18
2023-05-24 11:09:48 -05:00
willdunklin
73fc1ef11c sensei: Allow Paraview 5.11 for sensei develop version (#37719) 2023-05-24 11:00:35 -05:00
Stephen Sachs
2d77e44f6f Pcluster local buildcache (#37852)
* [pcluster pipeline] Use local buildcache instead of upstream spack

Spack currently does not relocate compiler references from upstream spack
installations. When using a buildcache we don't need an upstream spack.

* gcc needs to be installed via postinstall to get correct deps

* quantum-espresso@gcc@12.3.0 returns ICE on neoverse_{n,v}1

* Force gitlab to pull the new container

* Revert "Force gitlab to pull the new container"

This reverts commit 3af5f4cd88.

It seems the GitLab version does not yet support "pull_policy" in .gitlab-ci.yml.

* Gitlab keeps picking up wrong container. Renaming

* Update containers once more after failed build
2023-05-24 06:55:00 -07:00
Greg Becker
033599c4cd bugfix: env concretize after remove (#37877) 2023-05-24 15:41:57 +02:00
Harmen Stoppels
8096ed4b22 spack remove: fix traversal when user specs intersect (#37882)
drop unnecessary double loop over the matching user specs.
2023-05-24 09:23:46 -04:00
Simon Pintarelli
b49bfe25af update nlcglib package (#37578) 2023-05-24 11:17:15 +02:00
Houjun Tang
8b2f34d802 Add async vol v1.6 (#37875) 2023-05-24 01:47:46 -04:00
H. Joe Lee
3daed0d6a7 hdf5-vol-daos: add a new package (#35653)
* hdf5-vol-daos: add a new package
* hdf5-vol-daos: address @soumagne review

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-05-23 23:42:42 -04:00
Glenn Johnson
d6c1f75e8d julia: remove myself from maintainers list (#37868) 2023-05-23 16:22:56 -05:00
Laura Weber
c80a4c1ddc Updated hash for latest maintenance release (2022.2.1) (#37842) 2023-05-23 14:08:55 -07:00
Glenn Johnson
466abcb62d gate: remove myself as maintainer (#37862) 2023-05-23 14:03:52 -07:00
Glenn Johnson
69e99f0c16 Remove myself as maintainer of R packages (#37859)
* Remove myself as maintainer of R packages
  I will no longer have the time to properly maintain these packages.
* fix flake8 test for import
2023-05-23 15:35:32 -05:00
Glenn Johnson
bbee6dfc58 bart: remove myself as maintainer (#37860) 2023-05-23 16:08:07 -04:00
Glenn Johnson
2d60cf120b heasoft: remove myself as maintainer (#37866) 2023-05-23 14:37:57 -05:00
Glenn Johnson
db17fc2f33 opencv: remove myself from maintainers list (#37870) 2023-05-23 14:34:52 -05:00
eugeneswalker
c62080d498 e4s ci: add dealii (#32484) 2023-05-23 21:34:31 +02:00
Glenn Johnson
f9bbe549fa gatetools: remove myself as maintainer (#37863) 2023-05-23 15:32:54 -04:00
H. Joe Lee
55d7fec69c daos: add a new package (#35649) 2023-05-23 21:30:23 +02:00
Glenn Johnson
e938907150 reditools: remove myself as maintainer (#37871) 2023-05-23 15:28:16 -04:00
Glenn Johnson
0c40b86e96 itk: remove myself as maintainer (#37867) 2023-05-23 15:27:55 -04:00
Glenn Johnson
3d4cf0d8eb mumax: remove myself as maintainer (#37869) 2023-05-23 15:23:28 -04:00
Glenn Johnson
966e19d278 gurobi: remove myself as maintainer (#37865) 2023-05-23 15:23:05 -04:00
Glenn Johnson
8f930462bd fplo: remove myself as maintainer (#37861) 2023-05-23 15:17:50 -04:00
kjrstory
bf4fccee15 New package: FDS (#37850) 2023-05-23 11:59:05 -07:00
Manuela Kuhn
784771a008 py-bleach: add 6.0.0 (#37846) 2023-05-23 11:50:53 -07:00
Glenn Johnson
e4a9d9ae5b Bioc updates (#37297)
* add version 1.48.0 to bioconductor package r-a4
* add version 1.48.0 to bioconductor package r-a4base
* add version 1.48.0 to bioconductor package r-a4classif
* add version 1.48.0 to bioconductor package r-a4core
* add version 1.48.0 to bioconductor package r-a4preproc
* add version 1.48.0 to bioconductor package r-a4reporting
* add version 1.54.0 to bioconductor package r-absseq
* add version 1.30.0 to bioconductor package r-acde
* add version 1.78.0 to bioconductor package r-acgh
* add version 2.56.0 to bioconductor package r-acme
* add version 1.70.0 to bioconductor package r-adsplit
* add version 1.72.0 to bioconductor package r-affxparser
* add version 1.78.0 to bioconductor package r-affy
* add version 1.76.0 to bioconductor package r-affycomp
* add version 1.58.0 to bioconductor package r-affycontam
* add version 1.72.0 to bioconductor package r-affycoretools
* add version 1.48.0 to bioconductor package r-affydata
* add version 1.52.0 to bioconductor package r-affyilm
* add version 1.70.0 to bioconductor package r-affyio
* add version 1.76.0 to bioconductor package r-affyplm
* add version 1.46.0 to bioconductor package r-affyrnadegradation
* add version 1.48.0 to bioconductor package r-agdex
* add version 3.32.0 to bioconductor package r-agilp
* add version 2.50.0 to bioconductor package r-agimicrorna
* add version 1.32.0 to bioconductor package r-aims
* add version 1.32.0 to bioconductor package r-aldex2
* add version 1.38.0 to bioconductor package r-allelicimbalance
* add version 1.26.0 to bioconductor package r-alpine
* add version 2.62.0 to bioconductor package r-altcdfenvs
* add version 2.24.0 to bioconductor package r-anaquin
* add version 1.28.0 to bioconductor package r-aneufinder
* add version 1.28.0 to bioconductor package r-aneufinderdata
* add version 1.72.0 to bioconductor package r-annaffy
* add version 1.78.0 to bioconductor package r-annotate
* add version 1.62.0 to bioconductor package r-annotationdbi
* add version 1.24.0 to bioconductor package r-annotationfilter
* add version 1.42.0 to bioconductor package r-annotationforge
* add version 3.8.0 to bioconductor package r-annotationhub
* add version 3.30.0 to bioconductor package r-aroma-light
* add version 1.32.0 to bioconductor package r-bamsignals
* add version 2.16.0 to bioconductor package r-beachmat
* add version 2.60.0 to bioconductor package r-biobase
* add version 2.8.0 to bioconductor package r-biocfilecache
* add version 0.46.0 to bioconductor package r-biocgeneric
* add version 1.10.0 to bioconductor package r-biocio
* add version 1.18.0 to bioconductor package r-biocneighbors
* add version 1.34.0 to bioconductor package r-biocparallel
* add version 1.16.0 to bioconductor package r-biocsingular
* add version 2.28.0 to bioconductor package r-biocstyle
* add version 3.17.1 to bioconductor package r-biocversion
* add version 2.56.0 to bioconductor package r-biomart
* add version 1.28.0 to bioconductor package r-biomformat
* add version 2.68.0 to bioconductor package r-biostrings
* add version 1.48.0 to bioconductor package r-biovizbase
* add version 1.10.0 to bioconductor package r-bluster
* add version 1.68.0 to bioconductor package r-bsgenome
* add version 1.36.0 to bioconductor package r-bsseq
* add version 1.42.0 to bioconductor package r-bumphunter
* add version 2.66.0 to bioconductor package r-category
* add version 2.30.0 to bioconductor package r-champ
* add version 2.32.0 to bioconductor package r-champdata
* add version 1.50.0 to bioconductor package r-chipseq
* add version 4.8.0 to bioconductor package r-clusterprofiler
* add version 1.36.0 to bioconductor package r-cner
* add version 1.32.0 to bioconductor package r-codex
* add version 2.16.0 to bioconductor package r-complexheatmap
* add version 1.74.0 to bioconductor package r-ctc
* add version 2.28.0 to bioconductor package r-decipher
* add version 0.26.0 to bioconductor package r-delayedarray
* add version 1.22.0 to bioconductor package r-delayedmatrixstats
* add version 1.40.0 to bioconductor package r-deseq2
* add version 1.46.0 to bioconductor package r-dexseq
* add version 1.42.0 to bioconductor package r-dirichletmultinomial
* add version 2.14.0 to bioconductor package r-dmrcate
* add version 1.74.0 to bioconductor package r-dnacopy
* add version 3.26.0 to bioconductor package r-dose
* add version 2.48.0 to bioconductor package r-dss
* add version 3.42.0 to bioconductor package r-edger
* add version 1.20.0 to bioconductor package r-enrichplot
* add version 2.24.0 to bioconductor package r-ensembldb
* add version 1.46.0 to bioconductor package r-exomecopy
* add version 2.8.0 to bioconductor package r-experimenthub
* add version 1.26.0 to bioconductor package r-fgsea
* add version 2.72.0 to bioconductor package r-gcrma
* add version 1.36.0 to bioconductor package r-gdsfmt
* add version 1.82.0 to bioconductor package r-genefilter
* add version 1.36.0 to bioconductor package r-genelendatabase
* add version 1.72.0 to bioconductor package r-genemeta
* add version 1.78.0 to bioconductor package r-geneplotter
* add version 1.22.0 to bioconductor package r-genie3
* add version 1.36.0 to bioconductor package r-genomeinfodb
* update r-genomeinfodbdata
* add version 1.36.0 to bioconductor package r-genomicalignments
* add version 1.52.0 to bioconductor package r-genomicfeatures
* add version 1.52.0 to bioconductor package r-genomicranges
* add version 2.68.0 to bioconductor package r-geoquery
* add version 1.48.0 to bioconductor package r-ggbio
* add version 3.8.0 to bioconductor package r-ggtree
* add version 2.10.0 to bioconductor package r-glimma
* add version 1.12.0 to bioconductor package r-glmgampoi
* add version 5.54.0 to bioconductor package r-globaltest
* update r-go-db
* add version 1.20.0 to bioconductor package r-gofuncr
* add version 2.26.0 to bioconductor package r-gosemsim
* add version 1.52.0 to bioconductor package r-goseq
* add version 2.66.0 to bioconductor package r-gostats
* add version 1.78.0 to bioconductor package r-graph
* add version 1.62.0 to bioconductor package r-gseabase
* add version 1.32.0 to bioconductor package r-gtrellis
* add version 1.44.0 to bioconductor package r-gviz
* add version 1.28.0 to bioconductor package r-hdf5array
* add version 1.72.0 to bioconductor package r-hypergraph
* add version 1.36.0 to bioconductor package r-illumina450probevariants-db
* add version 0.42.0 to bioconductor package r-illuminaio
* add version 1.74.0 to bioconductor package r-impute
* add version 1.38.0 to bioconductor package r-interactivedisplaybase
* add version 2.34.0 to bioconductor package r-iranges
* add version 1.60.0 to bioconductor package r-kegggraph
* add version 1.40.0 to bioconductor package r-keggrest
* add version 3.56.0 to bioconductor package r-limma
* add version 2.52.0 to bioconductor package r-lumi
* add version 1.76.0 to bioconductor package r-makecdfenv
* add version 1.78.0 to bioconductor package r-marray
* add version 1.12.0 to bioconductor package r-matrixgenerics
* add version 1.8.0 to bioconductor package r-metapod
* add version 2.46.0 to bioconductor package r-methylumi
* add version 1.46.0 to bioconductor package r-minfi
* add version 1.34.0 to bioconductor package r-missmethyl
* add version 1.80.0 to bioconductor package r-mlinterfaces
* add version 1.12.0 to bioconductor package r-mscoreutils
* add version 2.26.0 to bioconductor package r-msnbase
* add version 2.56.0 to bioconductor package r-multtest
* add version 1.38.0 to bioconductor package r-mzid
* add version 2.34.0 to bioconductor package r-mzr
* add version 1.62.0 to bioconductor package r-oligoclasses
* update r-org-hs-eg-db
* add version 1.42.0 to bioconductor package r-organismdbi
* add version 1.40.0 to bioconductor package r-pathview
* add version 1.92.0 to bioconductor package r-pcamethods
* update r-pfam-db
* add version 1.44.0 to bioconductor package r-phyloseq
* add version 1.62.0 to bioconductor package r-preprocesscore
* add version 1.32.0 to bioconductor package r-protgenerics
* add version 1.34.0 to bioconductor package r-quantro
* add version 2.32.0 to bioconductor package r-qvalue
* add version 1.76.0 to bioconductor package r-rbgl
* add version 2.40.0 to bioconductor package r-reportingtools
* add version 2.44.0 to bioconductor package r-rgraphviz
* add version 2.44.0 to bioconductor package r-rhdf5
* add version 1.12.0 to bioconductor package r-rhdf5filters
* add version 1.22.0 to bioconductor package r-rhdf5lib
* add version 1.76.0 to bioconductor package r-roc
* add version 1.28.0 to bioconductor package r-rots
* add version 2.16.0 to bioconductor package r-rsamtools
* add version 1.60.0 to bioconductor package r-rtracklayer
* add version 0.38.0 to bioconductor package r-s4vectors
* add version 1.8.0 to bioconductor package r-scaledmatrix
* add version 1.28.0 to bioconductor package r-scater
* add version 1.14.0 to bioconductor package r-scdblfinder
* add version 1.28.0 to bioconductor package r-scran
* add version 1.10.0 to bioconductor package r-scuttle
* add version 1.66.0 to bioconductor package r-seqlogo
* add version 1.58.0 to bioconductor package r-shortread
* add version 1.74.0 to bioconductor package r-siggenes
* add version 1.22.0 to bioconductor package r-singlecellexperiment
* add version 1.34.0 to bioconductor package r-snprelate
* add version 1.50.0 to bioconductor package r-snpstats
* add version 2.36.0 to bioconductor package r-somaticsignatures
* add version 1.12.0 to bioconductor package r-sparsematrixstats
* add version 1.40.0 to bioconductor package r-spem
* add version 1.38.0 to bioconductor package r-sseq
* add version 1.30.0 to bioconductor package r-summarizedexperiment
* add version 3.48.0 to bioconductor package r-sva
* add version 1.38.0 to bioconductor package r-tfbstools
* add version 1.22.0 to bioconductor package r-tmixclust
* add version 2.52.0 to bioconductor package r-topgo
* add version 1.24.0 to bioconductor package r-treeio
* add version 1.28.0 to bioconductor package r-tximport
* add version 1.28.0 to bioconductor package r-tximportdata
* add version 1.46.0 to bioconductor package r-variantannotation
* add version 3.68.0 to bioconductor package r-vsn
* add version 2.6.0 to bioconductor package r-watermelon
* add version 2.46.0 to bioconductor package r-xde
* add version 1.58.0 to bioconductor package r-xmapbridge
* add version 0.40.0 to bioconductor package r-xvector
* add version 1.26.0 to bioconductor package r-yapsa
* add version 1.26.0 to bioconductor package r-yarn
* add version 1.46.0 to bioconductor package r-zlibbioc
* Revert "add version 1.82.0 to bioconductor package r-genefilter"
  This reverts commit 1702071c6d.
* Revert "add version 0.38.0 to bioconductor package r-s4vectors"
  This reverts commit 58a7df2387.
* add version 0.38.0 to bioconductor package r-s4vectors
* Revert "add version 1.28.0 to bioconductor package r-aneufinder"
  This reverts commit 0a1f59de6c.
* add version 1.28.0 to bioconductor package r-aneufinder
* Revert "add version 2.16.0 to bioconductor package r-beachmat"
  This reverts commit cd49fb8e4c.
* add version 2.16.0 to bioconductor package r-beachmat
* Revert "add version 4.8.0 to bioconductor package r-clusterprofiler"
  This reverts commit 6e9a951cbe.
* add version 4.8.0 to bioconductor package r-clusterprofiler
* Fix syntax error
* r-genefilter: add version 1.82.0
* new package: r-basilisk-utils
* new package: r-basilisk
* new package: r-densvis
* new package: r-dir-expiry
* r-affyplm: add zlib dependency
* r-cner: add zlib dependency
* r-mzr: add zlib dependency
* r-rhdf5filters: add zstd dependency
* r-shortread: add zlib dependency
* r-snpstats: add zlib dependency

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-05-23 11:40:00 -07:00
snehring
a6886983dc usalign: new package (#37646)
* usalign: adding new package
* usalign: updating shasum, adding note about distribution
2023-05-23 11:30:40 -07:00
Andrey Parfenov
93a34a9635 hpcg: apply patch with openmp pragma changes for intel and oneapi compilers (#37856)
Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>
2023-05-23 11:52:45 -04:00
Todd Gamblin
91a54029f9 libgcrypt: patch 1.10.2 on macos (#37844)
macOS doesn't have `getrandom`, and 1.10.2 fails to compile because of this.

There's an upstream fix at https://dev.gnupg.org/T6442 that will be in the next
`libgcrypt` release, but the patch is available now.
2023-05-23 06:03:13 -04:00
Juan Miguel Carceller
5400b49ed6 dd4hep: add LD_LIBRARY_PATH for plugins for Gaudi (#37824)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-05-23 10:28:26 +01:00
Juan Miguel Carceller
c17fc3c0c1 gaudi: add gaudi to LD_LIBRARY_PATH (#37821)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-05-23 10:27:55 +01:00
Juan Miguel Carceller
6f248836ea dd4hep: restrict podio versions (#37699)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-05-23 10:20:31 +01:00
snehring
693c1821b0 py-pastml: adding version for compatibility with py-topiary-asr (#37828) 2023-05-22 14:44:00 -05:00
Manuela Kuhn
62afe3bd5a py-asttokens: add 2.2.1 (#37816) 2023-05-22 14:40:08 -05:00
genric
53a756d045 py-dask: add v2023.4.1 (#37550)
* py-dask: add v2023.4.1

* address review comments
2023-05-22 14:29:23 -05:00
Adam J. Stewart
321b687ae6 py-huggingface-hub: add v0.14.1, cli variant (#37815) 2023-05-22 11:19:41 -07:00
Adam J. Stewart
c8617f0574 py-fiona: add v1.9.4 (#37780) 2023-05-22 13:17:13 -05:00
Adam J. Stewart
7843e2ead0 azcopy: add new package (#37693) 2023-05-22 11:09:06 -07:00
Juan Miguel Carceller
dca3d071d7 gaudi: fix issue with fmt::format (#37810)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-05-22 10:33:05 -07:00
eugeneswalker
436f077482 tau %oneapi: -Wno-error=implicit-function-declaration (#37829) 2023-05-22 13:13:02 -04:00
simonleary-umass-edu
ab3f705019 deleted package.py better error message (#37814)
adds the namespace to the exception object's string representation
2023-05-22 09:59:07 -07:00
Tamara Dahlgren
d739989ec8 swig: convert to new stand-alone test process (#37786) 2023-05-22 09:39:30 -07:00
Jordan Galby
52ee1967d6 llvm: Fix hwloc@1 and hwloc@:2.3 compatibility (#35387) 2023-05-22 10:28:57 -05:00
Andrey Prokopenko
1af7284b5d arborx: new version 1.4 (#37809) 2023-05-21 12:25:53 -07:00
Todd Gamblin
e1bcefd805 Update CHANGELOG.md for v0.20.0 2023-05-21 01:48:34 +02:00
Manuela Kuhn
2159b0183d py-argcomplete: add 3.0.8 (#37797)
* py-argcomplete: add 3.0.8

* Update var/spack/repos/builtin/packages/py-argcomplete/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* [@spackbot] updating style on behalf of manuelakuhn

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-20 11:31:28 -05:00
kwryankrattiger
078fd225a9 Mochi-Margo: Add patch for pthreads detection (#36109) 2023-05-20 07:43:27 -07:00
Manuela Kuhn
83974828c7 libtiff: disable use of sphinx (#37803) 2023-05-19 21:37:19 -05:00
Manuela Kuhn
2412f74557 py-anyio: add 3.6.2 (#37796) 2023-05-19 21:36:39 -05:00
Manuela Kuhn
db06d3621d py-alabaster: add 0.7.13 (#37798) 2023-05-19 21:34:00 -05:00
Jose E. Roman
c25170d2f9 New patch release SLEPc 3.19.1 (#37675)
* New patch release SLEPc 3.19.1

* py-slepc4py: add explicit dependency on py-numpy
2023-05-19 21:33:21 -05:00
Vanessasaurus
b3dfe13670 Automated deployment to update package flux-security 2023-05-16 (#37696)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-05-19 12:07:57 -07:00
Harmen Stoppels
6358e84b48 fix binutils dep of spack itself (#37738) 2023-05-19 12:02:36 -07:00
Swann Perarnau
8e634d8e49 aml: v0.2.1 (#37621)
* aml: v0.2.1

* add version 0.2.1
* fix hip variant bug

* [fix] pkgconf required for all builds

On top of needing pkgconf for autoreconf builds, the release configure
script needs pkgconf to detect dependencies if any of the hwloc, ze, or
opencl variants are active.

* Remove deprecation for v0.2.0 based on PR advice.
2023-05-19 11:58:28 -07:00
Mark W. Krentel
1a21376515 intel-xed: add version 2023.04.16 (#37582)
* intel-xed: add version 2023.04.16
 1. add version 2023.04.16
 2. adjust the mbuild resource to better match the xed version at the time
 3. replace three conflicts() with one new requires() for x86_64 target
 4. add patch for libxed-ild for some new avx512 instructions
    * [@spackbot] updating style on behalf of mwkrentel
    * Fix the build for 2023.04.16.  XED requires its source directory to be exactly 'xed', so add a symlink.
 5. move the mbuild resource up one level; xed wants it to be in the same directory as the xed source dir
 6. deprecate 10.2019.03
    * semantic style fix: add OSError to except
    * [@spackbot] updating style on behalf of mwkrentel

---------

Co-authored-by: mwkrentel <mwkrentel@users.noreply.github.com>
2023-05-19 10:28:18 -07:00
Harmen Stoppels
bf45a2b6d3 spack env create: generate a view when newly created env has concrete specs (#37799) 2023-05-19 18:44:54 +02:00
Thomas-Ulrich
475ce955e7 hipsycl: add v0.9.4 (#37247) 2023-05-19 18:29:45 +02:00
Robert Underwood
5e44289787 updates for the libpressio ecosystem (#37764)
* updates for the libpressio ecosystem

* [@spackbot] updating style on behalf of robertu94

* style fix: remove FIXME

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-05-19 09:24:31 -07:00
Massimiliano Culpo
e66888511f archspec: fix entry in the JSON file (#37793) 2023-05-19 09:57:57 -04:00
Tamara Dahlgren
e9e5beee1f fortrilinos: convert to new stand-alone test process (#37783) 2023-05-19 08:06:33 -04:00
Tamara Dahlgren
ffd134c09d formetis: converted to new stand-alone test process (#37785) 2023-05-19 08:05:23 -04:00
Massimiliano Culpo
bfadd5c9a5 lmod: allow core compiler to be specified with a version range (#37789)
Use CompilerSpec with satisfies instead of string equality tests

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-05-19 13:21:40 +02:00
Greg Becker
16e9279420 compiler specs: do not print '@=' when clear from context (#37787)
Ensure that spack compiler add/find/list and lists of concrete specs
print the compiler effectively as {compiler.name}{@compiler.version}.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-05-19 11:31:27 +02:00
Satish Balay
ac0903ef9f llvm: add version 16.0.3 (#37472) 2023-05-19 03:37:57 -04:00
Cyrus Harrison
648839dffd add conduit 0.8.8 release (#37776) 2023-05-19 00:34:19 -04:00
Pieter Ghysels
489a604920 Add STRUMPACK versions 7.1.2 and 7.1.3 (#37779) 2023-05-18 23:18:50 -04:00
eugeneswalker
2ac3435810 legion +rocm: apply patch for --offload-arch (#37775)
* legion +rocm: apply patch for --offload-arch

* constrain to latest version
2023-05-18 23:03:50 -04:00
Alec Scott
69ea180d26 fzf: add v0.40.0 and refactor package (#37569)
* fzf: add v0.40.0 and refactor package
* Remove unused imports
2023-05-18 15:23:20 -07:00
Alec Scott
f52f217df0 roctracer-dev-api: add v5.5.0 (#37484) 2023-05-18 15:11:36 -07:00
Alec Scott
df74aa5d7e amqp-cpp: add v4.3.24 (#37504) 2023-05-18 15:09:30 -07:00
Alec Scott
41932c53ae libjwt: add v1.15.3 (#37521) 2023-05-18 15:05:27 -07:00
Alec Scott
4296db794f rdkit: add v2023_03_1 (#37529) 2023-05-18 15:05:07 -07:00
H. Joe Lee
9ab9302409 py-jarvis-util: add a new package (#37729)
* py-jarvis-util: add a new package

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-18 17:28:59 -04:00
Benjamin Meyers
0187376e54 Update py-nltk (#37703)
* Update py-nltk

* [@spackbot] updating style on behalf of meyersbs

* Update var/spack/repos/builtin/packages/py-nltk/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-05-18 17:14:20 -04:00
Aditya Bhamidipati
7340d2cb83 update package nccl version 2.17, 2.18 (#37721) 2023-05-18 15:47:01 -05:00
Benjamin Meyers
641d4477d5 New package py-tensorly (#37705) 2023-05-18 15:38:23 -05:00
Benjamin Meyers
3ff2fb69af New package py-scikit-tensor-py3 (#37706) 2023-05-18 15:37:43 -05:00
vucoda
e3024b1bcb Update py-jedi 0.17.2 checksum in package.py (#37700)
The PyPI checksum for jedi-0.17.2.tar.gz does not match the one in the package.py file.

https://pypi.org/project/jedi/0.17.2/#copy-hash-modal-60e943e3-1192-4b12-90e2-4d639cb5b4f7
2023-05-18 15:15:47 -05:00
Dom Heinzeller
e733b87865 Remove references to gmake executable, only use make (#37280) 2023-05-18 19:03:03 +00:00
Victor Lopez Herrero
919985dc1b dlb: add v3.3.1 (#37759) 2023-05-18 10:29:41 -07:00
Michael Kuhn
d746f7d427 harfbuzz: add 7.3.0 (#37753)
Do not prefer 5.3.1 anymore since it does not build with newer compilers. Linux distributions have also moved to 6.x and newer.
2023-05-18 10:27:48 -07:00
kaanolgu
b6deab515b [babelstream] FIX: maintainers list missing a comma (#37761) 2023-05-18 10:25:04 -07:00
Glenn Johnson
848220c4ba update R: add version 4.3.0 (#37090) 2023-05-18 10:13:28 -07:00
Glenn Johnson
98462bd27e Cran updates (#37296)
* add version 1.28.3 to r-hexbin
* add version 5.0-1 to r-hmisc
* add version 0.5.5 to r-htmltools
* add version 1.6.2 to r-htmlwidgets
* add version 1.4.2 to r-igraph
* add version 0.42.19 to r-imager
* add version 1.0-5 to r-inum
* add version 0.9-14 to r-ipred
* add version 1.3.2 to r-irkernel
* add version 2.2.0 to r-janitor
* add version 0.1-10 to r-jpeg
* add version 1.2.2 to r-jsonify
* add version 0.9-32 to r-kernlab
* add version 1.7-2 to r-klar
* add version 1.42 to r-knitr
* add version 1.14.0 to r-ks
* add version 2.11.0 to r-labelled
* add version 1.7.2.1 to r-lava
* add version 0.6-15 to r-lavaan
* add version 2.1.2 to r-leaflet
* add version 2.9-0 to r-lfe
* add version 1.1.6 to r-lhs
* add version 1.1-33 to r-lme4
* add version 1.5-9.7 to r-locfit
* add version 0.4.3 to r-log4r
* add version 5.6.18 to r-lpsolve
* add version 0.2-11 to r-lwgeom
* add version 2.7.4 to r-magick
* add version 1.22.1 to r-maldiquant
* add version 1.2.11 to r-mapproj
* add version 1.6 to r-markdown
* add version 7.3-59 to r-mass
* add version 1.5-4 to r-matrix
* add version 0.63.0 to r-matrixstats
* add version 4.2-3 to r-memuse
* add version 4.0-0 to r-metafor
* add version 1.8-42 to r-mgcv
* add version 3.15.0 to r-mice
* add version 0.4-5 to r-mitml
* add version 2.0.0 to r-mixtools
* add version 0.1.11 to r-modelr
* add version 1.4-23 to r-multcomp
* add version 0.1-9 to r-multcompview
* add version 0.1-13 to r-mutoss
* add version 1.18.1 to r-network
* add version 3.3.4 to r-nleqslv
* add version 3.1-162 to r-nlme
* add version 0.26 to r-nmf
* add version 0.60-17 to r-np
* add version 4.2.5.2 to r-openxlsx
* add version 2022.11-16 to r-ordinal
* add version 0.6.0.8 to r-osqp
* add version 0.9.1 to r-packrat
* add version 1.35.0 to r-parallelly
* add version 1.3-13 to r-party
* add version 1.2-20 to r-partykit
* add version 1.7-0 to r-pbapply
* add version 0.3-9 to r-pbdzmq
* add version 1.2 to r-pegas
* add version 1.5-1 to r-phytools
* add version 1.9.0 to r-pillar
* add version 1.4.0 to r-pkgbuild
* add version 2.1.0 to r-pkgcache
* add version 0.5.0 to r-pkgdepends
* add version 2.0.7 to r-pkgdown
* add version 1.3.2 to r-pkgload
* add version 0.1-8 to r-png
* add version 1.1.22 to r-polspline
* add version 1.0.1 to r-pool
* add version 1.4.1 to r-posterior
* add version 3.8.1 to r-processx
* add version 2023.03.31 to r-prodlim
* add version 1.0-12 to r-proj4
* add version 2.5.0 to r-projpred
* add version 0.1.6 to r-pryr
* add version 1.7.5 to r-ps
* add version 1.0.1 to r-purrr
* add version 1.3.2 to r-qqconf
* add version 0.25.5 to r-qs
* add version 1.60 to r-qtl
* add version 0.4.22 to r-quantmod
* add version 5.95 to r-quantreg
* add version 0.7.8 to r-questionr
* add version 1.2.5 to r-ragg
* add version 0.15.1 to r-ranger
* add version 3.6-20 to r-raster
* add version 2.2.13 to r-rbibutils
* add version 1.0.10 to r-rcpp
* add version 0.12.2.0.0 to r-rcpparmadillo
* add version 0.1.7 to r-rcppde
* add version 0.3.13 to r-rcppgsl
* add version 1.98-1.12 to r-rcurl
* add version 1.2-1 to r-rda
* add version 2.1.4 to r-readr
* add version 1.4.2 to r-readxl
* add version 1.0.6 to r-recipes
* add version 1.1.6 to r-repr
* add version 1.2.16 to r-reproducible
* add version 0.3.0 to r-require
* add version 1.28 to r-reticulate
* add version 2.0.7 to r-rfast
* add version 1.6-6 to r-rgdal
* add version 0.6-2 to r-rgeos
* add version 1.1.3 to r-rgl
* add version 0.2.18 to r-rinside
* add version 4-14 to r-rjags
* add version 1.3-1.8 to r-rjsonio
* add version 2.21 to r-rmarkdown
* add version 0.9-2 to r-rmpfr
* add version 0.7-1 to r-rmpi
* add version 6.6-0 to r-rms
* add version 0.10.25 to r-rmysql
* add version 0.8.7 to r-rncl
* add version 2.4.11 to r-rnexml
* add version 0.95-1 to r-robustbase
* add version 1.3-20 to r-rodbc
* add version 7.2.3 to r-roxygen2
* add version 1.4.5 to r-rpostgres
* add version 0.7-5 to r-rpostgresql
* add version 0.8.29 to r-rsconnect
* add version 0.4-15 to r-rsnns
* add version 2.3.1 to r-rsqlite
* add version 0.7.2 to r-rstatix
* add version 1.1.2 to r-s2
* add version 0.4.5 to r-sass
* add version 0.1.9 to r-scatterpie
* add version 0.3-43 to r-scatterplot3d
* add version 3.2.4 to r-scs
* add version 1.6-4 to r-segmented
* add version 4.2-30 to r-seqinr
* add version 0.26 to r-servr
* add version 4.3.0 to r-seurat
* add version 1.0-12 to r-sf
* add version 0.4.2 to r-sfheaders
* add version 1.1-15 to r-sfsmisc
* add version 1.7.4 to r-shiny
* add version 1.9.0 to r-signac
* add version 1.6.0.3 to r-smoof
* add version 0.1.7-1 to r-sourcetools
* add version 1.6-0 to r-sp
* add version 1.3-0 to r-spacetime
* add version 7.3-16 to r-spatial
* add version 2.0-0 to r-spatialeco
* add version 1.2-8 to r-spatialreg
* add version 3.0-5 to r-spatstat
* add version 3.0-1 to r-spatstat-data
* add version 3.1-0 to r-spatstat-explore
* add version 3.1-0 to r-spatstat-geom
* add version 3.1-0 to r-spatstat-linnet
* add version 3.1-4 to r-spatstat-random
* add version 3.0-1 to r-spatstat-sparse
* add version 3.0-2 to r-spatstat-utils
* add version 2.2.2 to r-spdata
* add version 1.2-8 to r-spdep
* add version 0.6-1 to r-stars
* add version 1.5.0 to r-statmod
* add version 4.8.0 to r-statnet-common
* add version 1.7.12 to r-stringi
* add version 1.5.0 to r-stringr
* add version 1.9.1 to r-styler
* add version 3.5-5 to r-survival
* add version 1.5-4 to r-tclust
* add version 1.7-29 to r-terra
* add version 3.1.7 to r-testthat
* add version 1.1-2 to r-th-data
* add version 1.2 to r-tictoc
* add version 1.3.2 to r-tidycensus
* add version 1.2.3 to r-tidygraph
* add version 1.3.0 to r-tidyr
* add version 2.0.0 to r-tidyverse
* add version 0.2.0 to r-timechange
* add version 0.45 to r-tinytex
* add version 0.4.1 to r-triebeard
* add version 1.0-9 to r-truncnorm
* add version 0.10-53 to r-tseries
* add version 0.8-1 to r-units
* add version 4.3.0 to r-v8
* add version 1.4-11 to r-vcd
* add version 1.14.0 to r-vcfr
* add version 0.6.2 to r-vctrs
* add version 1.1-8 to r-vgam
* add version 0.4.0 to r-vioplot
* add version 1.6.1 to r-vroom
* add version 1.72-1 to r-wgcna
* add version 0.4.1 to r-whisker
* add version 0.7.2 to r-wk
* add version 0.39 to r-xfun
* add version 1.7.5.1 to r-xgboost
* add version 1.0.7 to r-xlconnect
* add version 3.99-0.14 to r-xml
* add version 0.13.1 to r-xts
* add version 2.3.7 to r-yaml
* add version 2.3.0 to r-zip
* add version 1.8-12 to r-zoo
* r-bigmem: dependency on uuid
* r-bio3d: dependency on zlib
* r-devtools: dependency cleanup
* r-dose: dependency cleanup
* r-dss: dependency cleanup
* r-enrichplot: dependency cleanup
* r-fgsea: dependency cleanup
* r-geor: dependency cleanup
* r-ggridges: dependency cleanup
* r-lobstr: dependency cleanup
* r-lubridate: dependency cleanup
* r-mnormt: dependency cleanup
* r-sctransform: version format correction
* r-seuratobject: dependency cleanup
* r-tidyselect: dependency cleanup
* r-tweenr: dependency cleanup
* r-uwot: dependency cleanup
* new package: r-clock
* new package: r-conflicted
* new package: r-diagram
* new package: r-doby
* new package: r-httr2
* new package: r-kableextra
* new package: r-mclogit
* new package: r-memisc
* new package: r-spatstat-model
* r-rmysql: use mariadb-client
* r-snpstats: add zlib dependency
* r-qs: add zstd dependency
* r-rcppcnpy: add zlib dependency
* black reformatting
* Revert "r-dose: dependency cleanup"
  This reverts commit 4c8ae8f5615ee124fff01ce43eddd3bb5d06b9bc.
* Revert "r-dss: dependency cleanup"
  This reverts commit a6c5c15c617a9a688fdcfe2b70c501c3520d4706.
* Revert "r-enrichplot: dependency cleanup"
  This reverts commit 65e116c18a94d885bc1a0ae667c1ef07d1fe5231.
* Revert "r-fgsea: dependency cleanup"
  This reverts commit ffe2cdcd1f73f69d66167b941970ede0281b56d7.
* r-rda: this package is back in CRAN
* r-sctransform: fix copyright
* r-seurat: fix copyright
* r-seuratobject: fix copyright
* Revert "add version 6.0-94 to r-caret"
  This reverts commit 236260597de97a800bfc699aec1cd1d0e3d1ac60.
* add version 6.0-94 to r-caret
* Revert "add version 1.8.5 to r-emmeans"
  This reverts commit 64a129beb0bd88d5c88fab564cade16c03b956ec.
* add version 1.8.5 to r-emmeans
* Revert "add version 5.0-1 to r-hmisc"
  This reverts commit 517643f4fd8793747365dfcfc264b894d2f783bd.
* add version 5.0-1 to r-hmisc
* Revert "add version 1.42 to r-knitr"
  This reverts commit 2a0d9a4c1f0ba173f7423fed59ba725bac902c37.
* add version 1.42 to r-knitr
* Revert "add version 1.6 to r-markdown"
  This reverts commit 4b5565844b5704559b819d2e775fe8dec625af99.
* add version 1.6 to r-markdown
* Revert "add version 0.26 to r-nmf"
  This reverts commit 4c44a788b17848f2cda67b32312a342c0261caec.
* add version 0.26 to r-nmf
* Revert "add version 2.3.1 to r-rsqlite"
  This reverts commit 5722ee2297276e4db8beee461d39014b0b17e420.
* add version 2.3.1 to r-rsqlite
* Revert "add version 1.0-12 to r-sf"
  This reverts commit ee1734fd62cc02ca7a9359a87ed734f190575f69.
* add version 1.0-12 to r-sf
* fix syntax error
2023-05-18 09:57:43 -07:00
640 changed files with 3359 additions and 515 deletions

View File

@@ -1,3 +1,221 @@
# v0.20.0 (2023-05-21)
`v0.20.0` is a major feature release.
## Features in this release
1. **`requires()` directive and enhanced package requirements**
We've added some more enhancements to requirements in Spack (#36286).
There is a new `requires()` directive for packages. `requires()` is the opposite of
`conflicts()`. You can use it to impose constraints on this package when certain
conditions are met:
```python
requires(
    "%apple-clang",
    when="platform=darwin",
    msg="This package builds only with clang on macOS"
)
```
More on this in [the docs](
https://spack.rtfd.io/en/latest/packaging_guide.html#conflicts-and-requirements).
You can also now add a `when:` clause to `requires:` in your `packages.yaml`
configuration or in an environment:
```yaml
packages:
  openmpi:
    require:
    - any_of: ["%gcc"]
      when: "@:4.1.4"
      message: "Only OpenMPI 4.1.5 and up can build with fancy compilers"
```
More details can be found [here](
https://spack.readthedocs.io/en/latest/build_settings.html#package-requirements)
2. **Exact versions**
Spack did not previously have a way to distinguish a version if it was a prefix of
some other version. For example, `@3.2` would match `3.2`, `3.2.1`, `3.2.2`, etc. You
can now match *exactly* `3.2` with `@=3.2`. This is useful, for example, if you need
to patch *only* the `3.2` version of a package. The new syntax is described in [the docs](
https://spack.readthedocs.io/en/latest/basic_usage.html#version-specifier).
Generally, when writing packages, you should prefer to use ranges like `@3.2` over
the specific versions, as this allows the concretizer more leeway when selecting
versions of dependencies. More details and recommendations are in the [packaging guide](
https://spack.readthedocs.io/en/latest/packaging_guide.html#ranges-versus-specific-versions).
See #36273 for full details on the version refactor.
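To make the difference concrete, here is a minimal, hypothetical package sketch (the package name, URL, and checksums are placeholders, not taken from any real package) showing a patch constrained to exactly version `3.2`:
```python
from spack.package import *


class Example(Package):
    """Hypothetical package used only to illustrate exact-version matching."""

    homepage = "https://example.com"
    url = "https://example.com/example-3.2.tar.gz"

    version("3.2.1", sha256="0" * 64)  # placeholder checksum
    version("3.2", sha256="1" * 64)  # placeholder checksum

    # `when="@=3.2"` matches only 3.2 itself; `when="@3.2"` would also match 3.2.1.
    patch("fix-3.2.patch", when="@=3.2")
```
With a range like `when="@3.2"`, the same patch would also be applied to `3.2.1` and any other release whose version starts with `3.2`.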
3. **New testing interface**
Writing package tests is now much simpler with a new [test interface](
https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-tests).
Writing a test is now as easy as adding a method that starts with `test_`:
```python
class MyPackage(Package):
    ...

    def test_always_fails(self):
        """use assert to always fail"""
        assert False

    def test_example(self):
        """run installed example"""
        example = which(self.prefix.bin.example)
        example()
```
You can use Python's native `assert` statement to implement your checks -- no more
need to fiddle with `run_test` or other test framework methods. Spack will
introspect the class and run `test_*` methods when you run `spack test`.
4. **More stable concretization**
* Now, `spack concretize` will *only* concretize the new portions of the environment
and will not change existing parts of an environment unless you specify `--force`.
This has always been true for `unify:false`, but not for `unify:true` and
`unify:when_possible` environments. Now it is true for all of them (#37438, #37681).
* The concretizer has a new `--reuse-deps` argument that *only* reuses dependencies.
That is, it will always treat the *roots* of your environment as it would with
`--fresh`. This allows you to upgrade just the roots of your environment while
keeping everything else stable (#30990).
5. **Weekly develop snapshot releases**
Since last year, we have maintained a buildcache of `develop` at
https://binaries.spack.io/develop, but the cache can grow to contain so many builds
as to be unwieldy. When we get a stable `develop` build, we snapshot the release and
add a corresponding tag to the Spack repository. So, you can use a stack from a specific
day. There are now tags in the spack repository like:
* `develop-2023-05-14`
* `develop-2023-05-18`
that correspond to build caches like:
* https://binaries.spack.io/develop-2023-05-14/e4s
* https://binaries.spack.io/develop-2023-05-18/e4s
We plan to store these snapshot releases weekly.
6. **Specs in buildcaches can be referenced by hash.**
* Previously, you could run `spack buildcache list` and see the hashes in
buildcaches, but referring to them by hash would fail.
* You can now run commands like `spack spec` and `spack install` and refer to
buildcache hashes directly, e.g. `spack install /abc123` (#35042)
7. **New package and buildcache index websites**
Our public websites for searching packages have been completely revamped and updated.
You can check them out here:
* *Package Index*: https://packages.spack.io
* *Buildcache Index*: https://cache.spack.io
Both are searchable and more interactive than before. Currently major releases are
shown; UI for browsing `develop` snapshots is coming soon.
8. **Default CMake and Meson build types are now Release**
Spack has historically defaulted to building with optimization and debugging, but
packages like `llvm` can be enormous with debug turned on. Our default build type for
all Spack packages is now `Release` (#36679, #37436). This has a number of benefits:
* much smaller binaries;
* higher default optimization level; and
* defining `NDEBUG` disables assertions, which may lead to further speedups.
You can still get the old behavior back through requirements and package preferences.
## Other new commands and directives
* `spack checksum` can automatically add new versions to packages (#24532)
* new command: `spack pkg grep` to easily search package files (#34388)
* New `maintainers` directive (#35083); a short sketch follows this list
* Add `spack buildcache push` (alias to `buildcache create`) (#34861)
* Allow using `-j` to control the parallelism of concretization (#37608)
* Add `--exclude` option to 'spack external find' (#35013)
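For reference, the new `maintainers` directive listed above is used inside a package recipe roughly as follows (a minimal sketch; the package and the GitHub handles `alice` and `bob` are placeholders):
```python
from spack.package import *


class Example(Package):
    """Hypothetical package; only the maintainers line is the point of this sketch."""

    # New style: a directive call, replacing the old `maintainers = [...]` class attribute.
    maintainers("alice", "bob")
```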
## Other new features of note
* editing: add higher-precedence `SPACK_EDITOR` environment variable
* Many YAML formatting improvements from updating `ruamel.yaml` to the latest version
supporting Python 3.6. (#31091, #24885, #37008).
* Requirements and preferences should not define (non-git) versions (#37687, #37747)
* Environments now store spack version/commit in `spack.lock` (#32801)
* User can specify the name of the `packages` subdirectory in repositories (#36643)
* Add container images supporting RHEL alternatives (#36713)
* make version(...) kwargs explicit (#36998)
## Notable refactors
* buildcache create: reproducible tarballs (#35623)
* Bootstrap most of Spack dependencies using environments (#34029)
* Split `satisfies(..., strict=True/False)` into two functions (#35681)
* spack install: simplify behavior when inside environments (#35206)
## Binary cache and stack updates
* Major simplification of CI boilerplate in stacks (#34272, #36045)
* Many improvements to our CI pipeline's reliability
## Removals, Deprecations, and disablements
* Module file generation is disabled by default; you'll need to enable it to use it (#37258)
* Support for Python 2 was deprecated in `v0.19.0` and has been removed. `v0.20.0` only
supports Python 3.6 and higher.
* Deprecated target names are no longer recognized by Spack. Use generic names instead:
* `graviton` is now `cortex_a72`
* `graviton2` is now `neoverse_n1`
* `graviton3` is now `neoverse_v1`
* `blacklist` and `whitelist` in module configuration were deprecated in `v0.19.0` and are
removed in this release. Use `exclude` and `include` instead.
* The `ignore=` parameter of the `extends()` directive has been removed. It was not used by
any builtin packages and is no longer needed to avoid conflicts in environment views (#35588).
* Support for the old YAML buildcache format has been removed. It was deprecated in `v0.19.0` (#34347).
* `spack find --bootstrap` has been removed. It was deprecated in `v0.19.0`. Use `spack
--bootstrap find` instead (#33964).
* `spack bootstrap trust` and `spack bootstrap untrust` are now removed, having been
deprecated in `v0.19.0`. Use `spack bootstrap enable` and `spack bootstrap disable`.
* The `--mirror-name`, `--mirror-url`, and `--directory` options to buildcache and
mirror commands were deprecated in `v0.19.0` and have now been removed. They have been
replaced by positional arguments (#37457).
* Deprecate `env:` as top level environment key (#37424)
* deprecate buildcache create --rel, buildcache install --allow-root (#37285)
* Support for very old perl-like spec format strings (e.g., `$_$@$%@+$+$=`) has been
removed (#37425). This was deprecated in `v0.15` (#10556).
## Notable Bugfixes
* bugfix: don't fetch package metadata for unknown concrete specs (#36990)
* Improve package source code context display on error (#37655)
* Relax environment manifest filename requirements and lockfile identification criteria (#37413)
* `installer.py`: drop build edges of installed packages by default (#36707)
* Bugfix: package requirements with git commits (#35057, #36347)
* Package requirements: allow single specs in requirement lists (#36258)
* conditional variant values: allow boolean (#33939)
* spack uninstall: follow run/link edges on --dependents (#34058)
## Spack community stats
* 7,179 total packages, 499 new since `v0.19.0`
* 329 new Python packages
* 31 new R packages
* 336 people contributed to this release
* 317 committers to packages
* 62 committers to core
# v0.19.1 (2023-02-07)
### Spack Bugfixes

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.1 (commit 4b1f21802a23b536bbcce73d3c631a566b20e8bd)
* Version: 0.2.1 (commit 9e1117bd8a2f0581bced161f2a2e8d6294d0300b)
astunparse
----------------

View File

@@ -2803,7 +2803,7 @@
"flags" : "-march=armv8.2-a+fp16+dotprod+crypto -mtune=cortex-a72"
},
{
"versions": "10.2",
"versions": "10.2:10.2.99",
"flags" : "-mcpu=zeus"
},
{

View File

@@ -589,7 +589,6 @@ def set_module_variables_for_package(pkg):
# TODO: make these build deps that can be installed if not found.
m.make = MakeExecutable("make", jobs)
m.gmake = MakeExecutable("gmake", jobs)
m.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their

View File

@@ -7,7 +7,7 @@
import llnl.util.lang as lang
from spack.directives import extends, maintainers
from spack.directives import extends
from .generic import GenericBuilder, Package
@@ -71,8 +71,6 @@ class RPackage(Package):
GenericBuilder = RBuilder
maintainers("glennpj")
#: This attribute is used in UI queries that need to know the build
#: system base class
build_system_class = "RPackage"

View File

@@ -347,7 +347,7 @@ def iter_groups(specs, indent, all_headers):
spack.spec.architecture_color,
architecture if architecture else "no arch",
spack.spec.compiler_color,
f"{compiler}" if compiler else "no compiler",
f"{compiler.display_str}" if compiler else "no compiler",
)
# Sometimes we want to display specs that are not yet concretized.

View File

@@ -98,7 +98,7 @@ def compiler_find(args):
config = spack.config.config
filename = config.get_config_filename(args.scope, "compilers")
tty.msg("Added %d new compiler%s to %s" % (n, s, filename))
colify(reversed(sorted(c.spec for c in new_compilers)), indent=4)
colify(reversed(sorted(c.spec.display_str for c in new_compilers)), indent=4)
else:
tty.msg("Found no new compilers")
tty.msg("Compilers are defined in the following files:")
@@ -112,13 +112,13 @@ def compiler_remove(args):
tty.die("No compilers match spec %s" % cspec)
elif not args.all and len(compilers) > 1:
tty.error("Multiple compilers match spec %s. Choose one:" % cspec)
colify(reversed(sorted([c.spec for c in compilers])), indent=4)
colify(reversed(sorted([c.spec.display_str for c in compilers])), indent=4)
tty.msg("Or, use `spack compiler remove -a` to remove all of them.")
sys.exit(1)
for compiler in compilers:
spack.compilers.remove_compiler_from_config(compiler.spec, scope=args.scope)
tty.msg("Removed compiler %s" % compiler.spec)
tty.msg("Removed compiler %s" % compiler.spec.display_str)
def compiler_info(args):
@@ -130,7 +130,7 @@ def compiler_info(args):
tty.die("No compilers match spec %s" % cspec)
else:
for c in compilers:
print(str(c.spec) + ":")
print(c.spec.display_str + ":")
print("\tpaths:")
for cpath in ["cc", "cxx", "f77", "fc"]:
print("\t\t%s = %s" % (cpath, getattr(c, cpath, None)))
@@ -188,7 +188,7 @@ def compiler_list(args):
os_str += "-%s" % target
cname = "%s{%s} %s" % (spack.spec.compiler_color, name, os_str)
tty.hline(colorize(cname), char="-")
colify(reversed(sorted(c.spec for c in compilers)))
colify(reversed(sorted(c.spec.display_str for c in compilers)))
def compiler(parser, args):

View File

@@ -302,7 +302,7 @@ def env_create(args):
# the environment should not include a view.
with_view = None
_env_create(
env = _env_create(
args.create_env,
init_file=args.envfile,
dir=args.dir,
@@ -310,6 +310,9 @@ def env_create(args):
keep_relative=args.keep_relative,
)
# Generate views, only really useful for environments created from spack.lock files.
env.regenerate_views()
def _env_create(name_or_path, *, init_file=None, dir=False, with_view=None, keep_relative=False):
"""Create a new environment, with an optional yaml description.

View File

@@ -1221,28 +1221,27 @@ def remove(self, query_spec, list_name=user_speclist_name, force=False):
old_specs = set(self.user_specs)
new_specs = set()
for spec in matches:
if spec in list_to_change:
try:
list_to_change.remove(spec)
self.update_stale_references(list_name)
new_specs = set(self.user_specs)
except spack.spec_list.SpecListError:
# define new specs list
new_specs = set(self.user_specs)
msg = f"Spec '{spec}' is part of a spec matrix and "
msg += f"cannot be removed from list '{list_to_change}'."
if force:
msg += " It will be removed from the concrete specs."
# Mock new specs, so we can remove this spec from concrete spec lists
new_specs.remove(spec)
tty.warn(msg)
if spec not in list_to_change:
continue
try:
list_to_change.remove(spec)
self.update_stale_references(list_name)
new_specs = set(self.user_specs)
except spack.spec_list.SpecListError:
# define new specs list
new_specs = set(self.user_specs)
msg = f"Spec '{spec}' is part of a spec matrix and "
msg += f"cannot be removed from list '{list_to_change}'."
if force:
msg += " It will be removed from the concrete specs."
# Mock new specs, so we can remove this spec from concrete spec lists
new_specs.remove(spec)
tty.warn(msg)
else:
if list_name == user_speclist_name:
self.manifest.remove_user_spec(str(spec))
else:
if list_name == user_speclist_name:
for user_spec in matches:
self.manifest.remove_user_spec(str(user_spec))
else:
for user_spec in matches:
self.manifest.remove_definition(str(user_spec), list_name=list_name)
self.manifest.remove_definition(str(spec), list_name=list_name)
# If force, update stale concretized specs
for spec in old_specs - new_specs:
@@ -1352,6 +1351,10 @@ def concretize(self, force=False, tests=False):
self.concretized_order = []
self.specs_by_hash = {}
# Remove concrete specs that no longer correlate to a user spec
for spec in set(self.concretized_user_specs) - set(self.user_specs):
self.deconcretize(spec)
# Pick the right concretization strategy
if self.unify == "when_possible":
return self._concretize_together_where_possible(tests=tests)
@@ -1365,6 +1368,16 @@ def concretize(self, force=False, tests=False):
msg = "concretization strategy not implemented [{0}]"
raise SpackEnvironmentError(msg.format(self.unify))
def deconcretize(self, spec):
# spec has to be a root of the environment
index = self.concretized_user_specs.index(spec)
dag_hash = self.concretized_order.pop(index)
del self.concretized_user_specs[index]
# If this was the only user spec that concretized to this concrete spec, remove it
if dag_hash not in self.concretized_order:
del self.specs_by_hash[dag_hash]
def _get_specs_to_concretize(
self,
) -> Tuple[Set[spack.spec.Spec], Set[spack.spec.Spec], List[spack.spec.Spec]]:

View File

@@ -7,7 +7,7 @@
import itertools
import os.path
import posixpath
from typing import Any, Dict
from typing import Any, Dict, List
import llnl.util.lang as lang
@@ -56,7 +56,7 @@ def make_context(spec, module_set_name, explicit):
return LmodContext(conf)
def guess_core_compilers(name, store=False):
def guess_core_compilers(name, store=False) -> List[spack.spec.CompilerSpec]:
"""Guesses the list of core compilers installed in the system.
Args:
@@ -64,21 +64,19 @@ def guess_core_compilers(name, store=False):
modules.yaml configuration file
Returns:
List of core compilers, if found, or None
List of found core compilers
"""
core_compilers = []
for compiler_config in spack.compilers.all_compilers_config():
for compiler in spack.compilers.all_compilers():
try:
compiler = compiler_config["compiler"]
# A compiler is considered to be a core compiler if any of the
# C, C++ or Fortran compilers reside in a system directory
is_system_compiler = any(
os.path.dirname(x) in spack.util.environment.SYSTEM_DIRS
for x in compiler["paths"].values()
if x is not None
os.path.dirname(getattr(compiler, x, "")) in spack.util.environment.SYSTEM_DIRS
for x in ("cc", "cxx", "f77", "fc")
)
if is_system_compiler:
core_compilers.append(str(compiler["spec"]))
core_compilers.append(compiler.spec)
except (KeyError, TypeError, AttributeError):
continue
@@ -89,10 +87,10 @@ def guess_core_compilers(name, store=False):
modules_cfg = spack.config.get(
"modules:" + name, {}, scope=spack.config.default_modify_scope()
)
modules_cfg.setdefault("lmod", {})["core_compilers"] = core_compilers
modules_cfg.setdefault("lmod", {})["core_compilers"] = [str(x) for x in core_compilers]
spack.config.set("modules:" + name, modules_cfg, scope=spack.config.default_modify_scope())
return core_compilers or None
return core_compilers
class LmodConfiguration(BaseConfiguration):
@@ -104,7 +102,7 @@ class LmodConfiguration(BaseConfiguration):
default_projections = {"all": posixpath.join("{name}", "{version}")}
@property
def core_compilers(self):
def core_compilers(self) -> List[spack.spec.CompilerSpec]:
"""Returns the list of "Core" compilers
Raises:
@@ -112,14 +110,18 @@ def core_compilers(self):
specified in the configuration file or the sequence
is empty
"""
value = configuration(self.name).get("core_compilers") or guess_core_compilers(
self.name, store=True
)
compilers = [
spack.spec.CompilerSpec(c) for c in configuration(self.name).get("core_compilers", [])
]
if not value:
if not compilers:
compilers = guess_core_compilers(self.name, store=True)
if not compilers:
msg = 'the key "core_compilers" must be set in modules.yaml'
raise CoreCompilersNotFoundError(msg)
return value
return compilers
@property
def core_specs(self):
@@ -283,16 +285,18 @@ def token_to_path(self, name, value):
# If we are dealing with a core compiler, return 'Core'
core_compilers = self.conf.core_compilers
if name == "compiler" and str(value) in core_compilers:
if name == "compiler" and any(
spack.spec.CompilerSpec(value).satisfies(c) for c in core_compilers
):
return "Core"
# CompilerSpec does not have an hash, as we are not allowed to
# CompilerSpec does not have a hash, as we are not allowed to
# use different flavors of the same compiler
if name == "compiler":
return path_part_fmt.format(token=value)
# In case the hierarchy token refers to a virtual provider
# we need to append an hash to the version to distinguish
# we need to append a hash to the version to distinguish
# among flavors of the same library (e.g. openblas~openmp vs.
# openblas+openmp)
path = path_part_fmt.format(token=value)

View File

@@ -108,6 +108,5 @@
# These are just here for editor support; they will be replaced when the build env
# is set up.
make = MakeExecutable("make", jobs=1)
gmake = MakeExecutable("gmake", jobs=1)
ninja = MakeExecutable("ninja", jobs=1)
configure = Executable(join_path(".", "configure"))

View File

@@ -1239,7 +1239,7 @@ def get_pkg_class(self, pkg_name):
try:
module = importlib.import_module(fullname)
except ImportError:
raise UnknownPackageError(pkg_name)
raise UnknownPackageError(fullname)
except Exception as e:
msg = f"cannot load package '{pkg_name}' from the '{self.namespace}' repository: {e}"
raise RepoError(msg) from e

View File

@@ -679,6 +679,16 @@ def from_dict(d):
d = d["compiler"]
return CompilerSpec(d["name"], vn.VersionList.from_dict(d))
@property
def display_str(self):
"""Equivalent to {compiler.name}{@compiler.version} for Specs, without extra
@= for readability."""
if self.concrete:
return f"{self.name}@{self.version}"
elif self.versions != vn.any_version:
return f"{self.name}@{self.versions}"
return self.name
def __str__(self):
out = self.name
if self.versions and self.versions != vn.any_version:
@@ -1730,14 +1740,14 @@ def traverse_edges(self, **kwargs):
def short_spec(self):
"""Returns a version of the spec with the dependencies hashed
instead of completely enumerated."""
spec_format = "{name}{@version}{%compiler}"
spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
spec_format += "{variants}{arch=architecture}{/hash:7}"
return self.format(spec_format)
@property
def cshort_spec(self):
"""Returns an auto-colorized version of ``self.short_spec``."""
spec_format = "{name}{@version}{%compiler}"
spec_format = "{name}{@version}{%compiler.name}{@compiler.version}"
spec_format += "{variants}{arch=architecture}{/hash:7}"
return self.cformat(spec_format)

View File

@@ -204,8 +204,8 @@ def test_compiler_find_mixed_suffixes(no_compilers_yaml, working_env, clangdir):
os.environ["PATH"] = str(clangdir)
output = compiler("find", "--scope=site")
assert "clang@=11.0.0" in output
assert "gcc@=8.4.0" in output
assert "clang@11.0.0" in output
assert "gcc@8.4.0" in output
config = spack.compilers.get_compiler_config("site", False)
clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@=11.0.0")
@@ -246,8 +246,8 @@ def test_compiler_find_prefer_no_suffix(no_compilers_yaml, working_env, clangdir
os.environ["PATH"] = str(clangdir)
output = compiler("find", "--scope=site")
assert "clang@=11.0.0" in output
assert "gcc@=8.4.0" in output
assert "clang@11.0.0" in output
assert "gcc@8.4.0" in output
config = spack.compilers.get_compiler_config("site", False)
clang = next(c["compiler"] for c in config if c["compiler"]["spec"] == "clang@=11.0.0")

View File

@@ -390,6 +390,19 @@ def test_remove_after_concretize():
assert not any(s.name == "mpileaks" for s in env_specs)
def test_remove_before_concretize():
e = ev.create("test")
e.unify = True
e.add("mpileaks")
e.concretize()
e.remove("mpileaks")
e.concretize()
assert not list(e.concretized_specs())
def test_remove_command():
env("create", "test")
assert "test" in env("list")
@@ -3299,3 +3312,22 @@ def test_environment_created_in_users_location(mutable_config, tmpdir):
assert dir_name in out
assert env_dir in ev.root(dir_name)
assert os.path.isdir(os.path.join(env_dir, dir_name))
def test_environment_created_from_lockfile_has_view(mock_packages, tmpdir):
"""When an env is created from a lockfile, a view should be generated for it"""
env_a = str(tmpdir.join("a"))
env_b = str(tmpdir.join("b"))
# Create an environment and install a package in it
env("create", "-d", env_a)
with ev.Environment(env_a):
add("libelf")
install("--fake")
# Create another environment from the lockfile of the first environment
env("create", "-d", env_b, os.path.join(env_a, "spack.lock"))
# Make sure the view was created
with ev.Environment(env_b) as e:
assert os.path.isdir(e.view_path_default)

View File

@@ -4,7 +4,7 @@ lmod:
hash_length: 0
core_compilers:
- 'clang@3.3'
- 'clang@12.0.0'
core_specs:
- 'mpich@3.0.1'

View File

@@ -0,0 +1,5 @@
enable:
- lmod
lmod:
core_compilers:
- 'clang@12.0.0'

View File

@@ -0,0 +1,5 @@
enable:
- lmod
lmod:
core_compilers:
- 'clang@=12.0.0'

View File

@@ -45,6 +45,18 @@ def provider(request):
@pytest.mark.usefixtures("config", "mock_packages")
class TestLmod(object):
@pytest.mark.regression("37788")
@pytest.mark.parametrize("modules_config", ["core_compilers", "core_compilers_at_equal"])
def test_layout_for_specs_compiled_with_core_compilers(
self, modules_config, module_configuration, factory
):
"""Tests that specs compiled with core compilers are in the 'Core' folder. Also tests that
we can use both ``compiler@version`` and ``compiler@=version`` to specify a core compiler.
"""
module_configuration(modules_config)
module, spec = factory("libelf%clang@12.0.0")
assert "Core" in module.layout.available_path_parts
def test_file_layout(self, compiler, provider, factory, module_configuration):
"""Tests the layout of files in the hierarchy is the one expected."""
module_configuration("complex_hierarchy")
@@ -61,7 +73,7 @@ def test_file_layout(self, compiler, provider, factory, module_configuration):
# is transformed to r"Core" if the compiler is listed among core
# compilers
# Check that specs listed as core_specs are transformed to "Core"
if compiler == "clang@=3.3" or spec_string == "mpich@3.0.1":
if compiler == "clang@=12.0.0" or spec_string == "mpich@3.0.1":
assert "Core" in layout.available_path_parts
else:
assert compiler.replace("@=", "/") in layout.available_path_parts

View File

@@ -788,20 +788,17 @@ deprecated-ci-build:
########################################
.aws-pcluster-generate-image:
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:latest", "entrypoint": [""] }
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:v2023-05-25", "entrypoint": [""] }
.aws-pcluster-generate:
before_script:
# Setup postinstall Spack as upstream installation
# Use gcc from local container buildcache
- - . "./share/spack/setup-env.sh"
- . /etc/profile.d/modules.sh
- if [[ -f /bootstrap/spack/etc/spack/packages.yaml ]]; then cp /bootstrap/spack/etc/spack/packages.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/config.yaml ]]; then cp /bootstrap/spack/etc/spack/config.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/modules.yaml ]]; then cp /bootstrap/spack/etc/spack/modules.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/mirrors.yaml ]]; then cp /bootstrap/spack/etc/spack/mirrors.yaml ./etc/spack/; fi
- if [[ -d /bootstrap/spack/opt/spack ]]; then spack config add "upstreams:postinstall:install_tree:/bootstrap/spack/opt/spack"; fi
- spack mirror add local-cache /bootstrap/local-cache
- spack gpg trust /bootstrap/public-key
- cd "${CI_PROJECT_DIR}" && curl -sOL https://raw.githubusercontent.com/spack/spack-configs/main/AWS/parallelcluster/postinstall.sh
- sed -i -e '/nohup/s/&$//' -e 's/nohup//' -e "s/spack arch -t/echo ${SPACK_TARGET_ARCH}/g" postinstall.sh
- sed -i -e "s/spack arch -t/echo ${SPACK_TARGET_ARCH}/g" postinstall.sh
- /bin/bash postinstall.sh -fg
- spack config --scope site add "packages:all:target:\"target=${SPACK_TARGET_ARCH}\""
after_script:

View File

@@ -35,18 +35,15 @@ spack:
ci:
pipeline-gen:
- build-job:
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:latest", "entrypoint": [""] }
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:v2023-05-25", "entrypoint": [""] }
before_script:
- - . "./share/spack/setup-env.sh"
- . /etc/profile.d/modules.sh
- spack --version
- spack arch
# Setup postinstall Spack as upstream installation
- - if [[ -f /bootstrap/spack/etc/spack/packages.yaml ]]; then cp /bootstrap/spack/etc/spack/packages.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/config.yaml ]]; then cp /bootstrap/spack/etc/spack/config.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/modules.yaml ]]; then cp /bootstrap/spack/etc/spack/modules.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/mirrors.yaml ]]; then cp /bootstrap/spack/etc/spack/mirrors.yaml ./etc/spack/; fi
- if [[ -d /bootstrap/spack/opt/spack ]]; then spack config add "upstreams:postinstall:install_tree:/bootstrap/spack/opt/spack"; fi
# Use gcc from local container buildcache
- - spack mirror add local-cache /bootstrap/local-cache
- spack gpg trust /bootstrap/public-key
- - /bin/bash "${SPACK_ARTIFACTS_ROOT}/postinstall.sh" -fg
- spack config --scope site add "packages:all:target:\"target=${SPACK_TARGET_ARCH}\""
- signing-job:

View File

@@ -16,7 +16,7 @@ spack:
- openfoam
- palace
# - py-devito
- quantum-espresso
# - quantum-espresso
# - wrf
- optimized_libs:
@@ -36,19 +36,16 @@ spack:
ci:
pipeline-gen:
- build-job:
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:latest", "entrypoint": [""] }
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:v2023-05-25", "entrypoint": [""] }
tags: ["aarch64"]
before_script:
- - . "./share/spack/setup-env.sh"
- . /etc/profile.d/modules.sh
- spack --version
- spack arch
# Setup postinstall Spack as upstream installation
- - if [[ -f /bootstrap/spack/etc/spack/packages.yaml ]]; then cp /bootstrap/spack/etc/spack/packages.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/config.yaml ]]; then cp /bootstrap/spack/etc/spack/config.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/modules.yaml ]]; then cp /bootstrap/spack/etc/spack/modules.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/mirrors.yaml ]]; then cp /bootstrap/spack/etc/spack/mirrors.yaml ./etc/spack/; fi
- if [[ -d /bootstrap/spack/opt/spack ]]; then spack config add "upstreams:postinstall:install_tree:/bootstrap/spack/opt/spack"; fi
# Use gcc from local container buildcache
- - spack mirror add local-cache /bootstrap/local-cache
- spack gpg trust /bootstrap/public-key
- - /bin/bash "${SPACK_ARTIFACTS_ROOT}/postinstall.sh" -fg
- spack config --scope site add "packages:all:target:\"target=${SPACK_TARGET_ARCH}\""
- signing-job:

View File

@@ -16,7 +16,7 @@ spack:
- openfoam
- palace
# - py-devito
- quantum-espresso
# - quantum-espresso
# - wrf
- optimized_libs:
@@ -36,19 +36,16 @@ spack:
ci:
pipeline-gen:
- build-job:
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:latest", "entrypoint": [""] }
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:v2023-05-25", "entrypoint": [""] }
tags: ["aarch64"]
before_script:
- - . "./share/spack/setup-env.sh"
- . /etc/profile.d/modules.sh
- spack --version
- spack arch
# Setup postinstall Spack as upstream installation
- - if [[ -f /bootstrap/spack/etc/spack/packages.yaml ]]; then cp /bootstrap/spack/etc/spack/packages.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/config.yaml ]]; then cp /bootstrap/spack/etc/spack/config.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/modules.yaml ]]; then cp /bootstrap/spack/etc/spack/modules.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/mirrors.yaml ]]; then cp /bootstrap/spack/etc/spack/mirrors.yaml ./etc/spack/; fi
- if [[ -d /bootstrap/spack/opt/spack ]]; then spack config add "upstreams:postinstall:install_tree:/bootstrap/spack/opt/spack"; fi
# Use gcc from local container buildcache
- - spack mirror add local-cache /bootstrap/local-cache
- spack gpg trust /bootstrap/public-key
- - /bin/bash "${SPACK_ARTIFACTS_ROOT}/postinstall.sh" -fg
- spack config --scope site add "packages:all:target:\"target=${SPACK_TARGET_ARCH}\""
- signing-job:

View File

@@ -35,18 +35,15 @@ spack:
ci:
pipeline-gen:
- build-job:
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:latest", "entrypoint": [""] }
image: { "name": "ghcr.io/spack/pcluster-amazonlinux-2:v2023-05-25", "entrypoint": [""] }
before_script:
- - . "./share/spack/setup-env.sh"
- . /etc/profile.d/modules.sh
- spack --version
- spack arch
# Setup postinstall Spack as upstream installation
- - if [[ -f /bootstrap/spack/etc/spack/packages.yaml ]]; then cp /bootstrap/spack/etc/spack/packages.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/config.yaml ]]; then cp /bootstrap/spack/etc/spack/config.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/modules.yaml ]]; then cp /bootstrap/spack/etc/spack/modules.yaml ./etc/spack/; fi
- if [[ -f /bootstrap/spack/etc/spack/mirrors.yaml ]]; then cp /bootstrap/spack/etc/spack/mirrors.yaml ./etc/spack/; fi
- if [[ -d /bootstrap/spack/opt/spack ]]; then spack config add "upstreams:postinstall:install_tree:/bootstrap/spack/opt/spack"; fi
# Use gcc from local container buildcache
- - spack mirror add local-cache /bootstrap/local-cache
- spack gpg trust /bootstrap/public-key
- - /bin/bash "${SPACK_ARTIFACTS_ROOT}/postinstall.sh" -fg
- spack config --scope site add "packages:all:target:\"target=${SPACK_TARGET_ARCH}\""
- signing-job:

View File

@@ -70,6 +70,7 @@ spack:
- charliecloud
- conduit
- datatransferkit
- dealii
- dyninst
- ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp ^hdf5@1.14
- exaworks

View File

@@ -31,6 +31,7 @@ class Aml(AutotoolsPackage):
# version string is generated from git tags, requires entire repo
version("master", branch="master", submodules=True, get_full_repo=True)
version("0.2.1", sha256="bae49e89ed0f2a2ad3547430e79b7e4c018d6228c6ed951a12d59afd0b35f71c")
version("0.2.0", sha256="2044a2f3f1d7a19827dd9c0726172b690189b4d3fe938656c4160c022468cc4a")
version(
"0.1.0",
@@ -45,10 +46,12 @@ class Aml(AutotoolsPackage):
variant("ze", default=False, description="Support for memory operations on top of Level Zero.")
variant("hip", default=False, description="Support for memory operations on top of HIP.")
variant("cuda", default=False, description="Support for memory operations on top of CUDA.")
variant("hwloc", default=False, description="Enable feature related to topology management")
variant("hwloc", default=True, description="Enable feature related to topology management")
variant(
"hip-platform",
values=disjoint_sets(("amd", "nvidia")),
values=("none", conditional("amd", when="+hip"), conditional("nvidia", when="+cuda")),
default="none",
multi=False,
description="HIP backend platform.",
)
@@ -68,6 +71,10 @@ class Aml(AutotoolsPackage):
depends_on("hwloc@2.1:", when="+hwloc")
# - ocl-icd >= 2.1 becomes a dependency when +opencl variant is used.
depends_on("ocl-icd@2.1:", when="+opencl")
# Required on master for autoconf to pull the right pkg.m4 macros,
# and on other builds to detect dependencies
# Note: This does NOT work with pkg-config but requires pkgconf!
depends_on("pkgconf", type="build")
# when on master, we need all the autotools and extras to generate files.
with when("@master"):
@@ -75,9 +82,6 @@ class Aml(AutotoolsPackage):
depends_on("autoconf", type="build")
depends_on("automake", type="build")
depends_on("libtool", type="build")
# Required to have pkg config macros in configure.
# Note: This does NOT work with pkg-config but requires pkgconf!
depends_on("pkgconf", type="build")
# Required to generate AML version in configure.
depends_on("git", type="build")
@@ -91,9 +95,9 @@ def configure_args(self):
config_args.extend(self.with_or_without(b))
if self.spec.satisfies("%oneapi"):
config_args += ["--with-openmp-flags=-fiopenmp -fopenmp-targets=spir64"]
if "hip-platform=amd" in self.spec:
if self.spec.variants["hip-platform"].value == "amd":
config_args += ["--with-hip-platform=amd"]
if "hip-platform=nvidia" in self.spec:
if self.spec.variants["hip-platform"].value == "nvidia":
config_args += ["--with-hip-platform=nvidia"]
return config_args
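Editor's note: with the variant reworked from disjoint_sets to a single-valued variant defaulting to "none", configure_args now reads the value directly. A hedged sketch of that query, assuming a Spack session with the builtin repo; the spec string is illustrative:

import spack.spec

spec = spack.spec.Spec("aml+hip hip-platform=amd")
platform = spec.variants["hip-platform"].value  # "amd", "nvidia", or "none"
if platform == "amd":
    print("--with-hip-platform=amd")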

View File

@@ -17,6 +17,7 @@ class AmqpCpp(CMakePackage):
maintainers("lpottier")
version("4.3.24", sha256="c3312f8af813cacabf6c257dfaf41bf9e66606bbf7d62d085a9b7da695355245")
version("4.3.19", sha256="ca29bb349c498948576a4604bed5fd3c27d87240b271a4441ccf04ba3797b31d")
variant(

View File

@@ -18,6 +18,7 @@ class Arborx(CMakePackage, CudaPackage, ROCmPackage):
maintainers("aprokop")
version("master", branch="master")
version("1.4", sha256="803a1018a6305cf3fea161172b3ada49537f59261279d91c2abbcce9492ee7af")
version("1.3", sha256="3f1e17f029a460ab99f8396e2772cec908eefc4bf3868c8828907624a2d0ce5d")
version("1.2", sha256="ed1939110b2330b7994dcbba649b100c241a2353ed2624e627a200a398096c20")
version("1.1", sha256="2b5f2d2d5cec57c52f470c2bf4f42621b40271f870b4f80cb57e52df1acd90ce")
@@ -61,7 +62,8 @@ class Arborx(CMakePackage, CudaPackage, ROCmPackage):
# Standalone Kokkos
depends_on("kokkos@3.1.00:", when="~trilinos")
depends_on("kokkos@3.4.00:", when="@1.2~trilinos")
depends_on("kokkos@3.6.00:", when="@1.3:~trilinos")
depends_on("kokkos@3.6.00:", when="@1.3~trilinos")
depends_on("kokkos@3.7.01:", when="@1.4:~trilinos")
for backend in kokkos_backends:
depends_on("kokkos+%s" % backend.lower(), when="~trilinos+%s" % backend.lower())
@@ -83,7 +85,8 @@ class Arborx(CMakePackage, CudaPackage, ROCmPackage):
depends_on("trilinos+kokkos", when="+trilinos")
depends_on("trilinos+openmp", when="+trilinos+openmp")
depends_on("trilinos@13.2.0:", when="@1.2+trilinos")
depends_on("trilinos@13.4.0:", when="@1.3:+trilinos")
depends_on("trilinos@13.4.0:", when="@1.3+trilinos")
depends_on("trilinos@14.0.0:", when="@1.4:+trilinos")
conflicts("~serial", when="+trilinos")
conflicts("+cuda", when="+trilinos")

View File

@@ -0,0 +1,22 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Azcopy(Package):
"""AzCopy is a command-line utility that you can use to copy data to and from containers and
file shares in Azure Storage accounts.
"""
homepage = "https://github.com/Azure/azure-storage-azcopy"
url = "https://github.com/Azure/azure-storage-azcopy/archive/refs/tags/v10.18.1.tar.gz"
version("10.18.1", sha256="80292625d7f1a6fc41688c5948b3a20cfdae872464d37d831e20999430819c3f")
depends_on("go", type="build")
def install(self, spec, prefix):
go("build", "-o", prefix.bin.azcopy)

View File

@@ -25,7 +25,8 @@ class Babelstream(CMakePackage, CudaPackage, ROCmPackage):
version("4.0", sha256="a9cd39277fb15d977d468435eb9b894f79f468233f0131509aa540ffda4f5953")
version("main", branch="main")
version("develop", branch="develop")
maintainers("tomdeakin", "kaanolgu" "tom91136", "robj0nes")
maintainers("tomdeakin", "kaanolgu", "tom91136", "robj0nes")
# Languages
# Also supported variants are cuda and rocm (for HIP)

View File

@@ -12,8 +12,6 @@ class Bart(MakefilePackage, CudaPackage):
homepage = "https://mrirecon.github.io/bart/"
url = "https://github.com/mrirecon/bart/archive/v0.5.00.tar.gz"
maintainers("glennpj")
version("0.7.00", sha256="a16afc4b632c703d95b5c34e47acd82fafc19f51f9aff442373eecfef08bfc41")
version("0.6.00", sha256="dbbd33d1e3ed3324fe21f90a3b62cb51765fe369f21df100b46a32004928f18d")
version("0.5.00", sha256="30eedcda0f0ef3808157542e0d67df5be49ee41e4f41487af5c850632788f643")

View File

@@ -106,7 +106,7 @@ def edit(self, spec, prefix):
def build(self, spec, prefix):
with working_dir("lib"):
gmake("all")
make("all")
def install(self, spec, prefix):
with working_dir("lib"):

View File

@@ -43,6 +43,7 @@ class Conduit(CMakePackage):
# is to bridge any spack dependencies that are still using the name master
version("master", branch="develop", submodules=True)
# note: 2021-05-05 latest tagged release is now preferred instead of develop
version("0.8.8", sha256="99811e9c464b6f841f52fcd47e982ae47cbb01cba334cff43eabe13eea58c0df")
version("0.8.7", sha256="f3bf44d860783f4e0d61517c5e280c88144af37414569f4cf86e2d29b3ba5293")
version("0.8.6", sha256="8ca5d37033143ed7181c7286dd25a3f6126ba0358889066f13a2b32f68fc647e")
version("0.8.5", sha256="b4a6f269a81570a4597e2565927fd0ed2ac45da0a2500ce5a71c26f7c92c5483")

View File

@@ -0,0 +1,42 @@
From 08d0017f06695d4837f1c509ca39d61b32bdae2b Mon Sep 17 00:00:00 2001
From: Sean Koyama <skoyama@anl.gov>
Date: Mon, 6 Mar 2023 23:02:08 +0000
Subject: [PATCH] LIBPATH fix for ALT_PREFIX
---
site_scons/prereq_tools/base.py | 14 +++++++-------
1 file changed, 7 insertions(+), 7 deletions(-)
diff --git a/site_scons/prereq_tools/base.py b/site_scons/prereq_tools/base.py
index 4df1347be..da32d3dd1 100644
--- a/site_scons/prereq_tools/base.py
+++ b/site_scons/prereq_tools/base.py
@@ -1247,18 +1247,18 @@ class PreReqComponent():
ipath = os.path.join(path, "include")
if not os.path.exists(ipath):
ipath = None
- lpath = None
+ lpaths = []
for lib in ['lib64', 'lib']:
- lpath = os.path.join(path, lib)
- if not os.path.exists(lpath):
- lpath = None
- if ipath is None and lpath is None:
+ lp = os.path.join(path, lib)
+ if os.path.exists(lp):
+ lpaths.append(lp)
+ if not ipath and not lpaths:
continue
env = self.__env.Clone()
if ipath:
env.AppendUnique(CPPPATH=[ipath])
- if lpath:
- env.AppendUnique(LIBPATH=[lpath])
+ if lpaths:
+ env.AppendUnique(LIBPATH=lpaths)
if not comp.has_missing_targets(env):
self.__prebuilt_path[name] = path
return path
--
2.34.1

View File

@@ -0,0 +1,70 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Daos(SConsPackage):
"""The Distributed Asynchronous Object Storage (DAOS) is an open-source
software-defined object store designed from the ground up for massively
distributed Non Volatile Memory (NVM)."""
homepage = "https://github.com/daos-stack/daos"
git = "https://github.com/daos-stack/daos.git"
maintainers("hyoklee")
version("master", branch="master", submodules=True)
version("2.2.0", tag="v2.2.0", submodules=True)
variant(
"debug", default=False, description="Enable debugging info and strict compile warnings"
)
patch("0001-LIBPATH-fix-for-ALT_PREFIX.2.patch", when="@2.2.0:")
depends_on("argobots@1.1:")
depends_on("boost", type="build")
depends_on("cmocka", type="build")
depends_on("go", type="build")
depends_on("hwloc")
depends_on("isa-l@2.30.0:")
depends_on("isa-l-crypto@2.23.0:")
depends_on("libfabric@1.15.1:")
depends_on("libfuse@3.6.1:")
depends_on("libuuid")
depends_on("libunwind")
depends_on("libyaml")
depends_on("mercury@2.2.0:+boostsys")
depends_on("openssl")
depends_on("pmdk@1.12.1:")
depends_on("protobuf-c@1.3.3:")
depends_on("py-distro")
depends_on("readline")
depends_on("scons@4.4.0:")
depends_on("spdk@23.01:+shared+rdma+dpdk")
depends_on("ucx@1.12.1:")
def build_args(self, spec, prefix):
args = ["PREFIX={0}".format(prefix), "USE_INSTALLED=all"]
if "+debug" in spec:
args.append("--debug=explain,findlibs,includes")
# Construct ALT_PREFIX and make sure that '/usr' is last.
alt_prefix = []
for node in spec.traverse():
alt_prefix.append(format(node.prefix))
args.extend(
[
"WARNING_LEVEL=warning",
"ALT_PREFIX=%s" % ":".join([str(elem) for elem in alt_prefix]),
"GO_BIN={0}".format(spec["go"].prefix.bin) + "/go",
]
)
return args
def install_args(self, spec, prefix):
args = ["PREFIX={0}".format(prefix)]
return args
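Editor's note: a self-contained illustration of the scons argument list that build_args assembles; the prefixes below are made up, and '/usr', when it appears among the traversed prefixes, is meant to end up last in ALT_PREFIX:

prefix = "/opt/spack/daos-2.2.0"                       # illustrative install prefix
dep_prefixes = ["/opt/spack/mercury-2.2.0", "/opt/spack/libfabric-1.15.1", "/usr"]
args = [
    "PREFIX={0}".format(prefix),
    "USE_INSTALLED=all",
    "WARNING_LEVEL=warning",
    "ALT_PREFIX=" + ":".join(dep_prefixes),            # searched for prebuilt dependencies
    "GO_BIN=/opt/spack/go-1.20.4/bin/go",              # illustrative go location
]
print("scons " + " ".join(args))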

View File

@@ -157,6 +157,7 @@ class Dd4hep(CMakePackage):
depends_on("lcio", when="+lcio")
depends_on("edm4hep", when="+edm4hep")
depends_on("podio", when="+edm4hep")
depends_on("podio@:0.16.03", when="@:1.23 +edm4hep")
depends_on("podio@0.16:", when="@1.24: +edm4hep")
depends_on("py-pytest", type=("build", "test"))
@@ -222,6 +223,8 @@ def setup_run_environment(self, env):
env.set("DD4HEP", self.prefix.examples)
env.set("DD4hep_DIR", self.prefix)
env.set("DD4hep_ROOT", self.prefix)
env.set("LD_LIBRARY_PATH", self.prefix.lib)
env.set("LD_LIBRARY_PATH", self.prefix.lib64)
def url_for_version(self, version):
# dd4hep releases are dashes and padded with a leading zero

View File

@@ -20,6 +20,7 @@ class Dlb(AutotoolsPackage):
maintainers("vlopezh")
version("main", branch="main")
version("3.3.1", sha256="1b245acad80b03eb83e815fd59dcfc598cfddd899de4504cf6a9572fe5359f40")
version("3.3", sha256="55b87aea14f3954d8878912f3134938db235e6984fae26fdf5134148007eb722")
version("3.2", sha256="b1c65ce3179b5275cfdf0bf921c0565a4a3ebcfdab72d7cef014957c17136c7e")
version("3.1", sha256="d63ee89429fdb54af5510ed956f86d11561911a7860b46324f25200d32d0d333")

View File

@@ -0,0 +1,94 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Fds(MakefilePackage):
"""
Fire Dynamics Simulator (FDS) is a large-eddy simulation (LES) code for low-speed flows,
with an emphasis on smoke and heat transport from fires.
FDS and Smokeview are free and open-source software tools provided by the National Institute
of Standards and Technology (NIST) of the United States Department of Commerce. Pursuant
to Title 17, Section 105 of the United States Code, this software is not subject to copyright
protection and is in the public domain. View the full disclaimer for NIST-developed software.
"""
maintainers("kjrstory")
homepage = "https://pages.nist.gov/fds-smv"
url = "https://github.com/firemodels/fds/archive/refs/tags/FDS-6.8.0.tar.gz"
git = "https://github.com/firemodels/fds.git"
version("6.8.0", commit="886e0096535519b7358a3c4393c91da3caee5072")
depends_on("mpi")
depends_on("mkl")
build_directory = "Build"
requires(
"%gcc",
"%intel",
"%oneapi",
policy="one_of",
msg="FDS builds only with GNU Fortran or Intel Fortran",
)
requires(
"^intel-mkl",
"^intel-oneapi-mkl",
policy="one_of",
msg="FDS builds require either Intel MKL or Intel oneAPI MKL library",
)
requires(
"^openmpi",
when="%gcc platform=linux",
msg="OpenMPI can only be used with GNU Fortran on Linux platform",
)
requires(
"^intel-mpi^intel-mkl",
when="%intel platform=linux",
msg="Intel MPI and Intel MKL can only be used with Intel Fortran on Linux platform",
)
requires(
"^intel-oneapi-mpi^intel-oneapi-mkl",
when="%oneapi platform=linux",
msg="Intel oneAPI MPI and MKL can only be used with oneAPI Fortran on Linux platform",
)
requires(
"^openmpi%intel",
when="platform=darwin",
msg="OpenMPI can only be used with Intel Fortran on macOS",
)
def edit(self, spec, prefix):
env["MKL_ROOT"] = self.spec["mkl"].prefix
if spec.compiler.name == "oneapi":
env["INTEL_IFORT"] = "ifx"
makefile = FileFilter("Build/makefile")
makefile.filter(r"\.\./Scripts", "./Scripts")
makefile.filter(r"\.\.\\Scripts", ".\\Scripts")
@property
def build_targets(self):
spec = self.spec
mpi_mapping = {"openmpi": "ompi", "intel-oneapi-mpi": "impi", "intel-mpi": "impi"}
compiler_mapping = {"gcc": "gnu", "oneapi": "intel", "intel": "intel"}
platform_mapping = {"linux": "linux", "darwin": "osx"}
mpi_prefix = mpi_mapping[spec["mpi"].name]
compiler_prefix = compiler_mapping[spec.compiler.name]
platform_prefix = platform_mapping[spec.architecture.platform]
return ["{}_{}_{}".format(mpi_prefix, compiler_prefix, platform_prefix)]
def install(self, spec, prefix):
mkdirp(prefix.bin)
with working_dir(self.build_directory):
install("*.mod", prefix.bin)
install("*.o", prefix.bin)
install("fds_" + self.build_targets[0], prefix.bin + "/fds")

View File

@@ -20,6 +20,7 @@ class FluxSecurity(AutotoolsPackage):
maintainers("grondo")
version("master", branch="master")
version("0.9.0", sha256="2258120c6f32ca0b5b13b166bae56d9bd82a44c6eeaa6bc6187e4a4419bdbcc0")
version("0.8.0", sha256="9963628063b4abdff6bece03208444c8f23fbfda33c20544c48b21e9f4819ce2")
# Need autotools when building on master:

View File

@@ -14,6 +14,8 @@ class Formetis(CMakePackage):
maintainers("sethrj")
test_requires_compiler = True
version("0.0.2", sha256="0067c03ca822f4a3955751acb470f21eed489256e2ec5ff24741eb2b638592f1")
variant("mpi", default=False, description="Enable ParMETIS support")
@@ -53,8 +55,8 @@ def cached_tests_work_dir(self):
"""The working directory for cached test sources."""
return join_path(self.test_suite.current_test_cache_dir, self.examples_src_dir)
def test(self):
"""Perform stand-alone/smoke tests on the installed package."""
def test_metis(self):
"""build and run metis"""
cmake_args = [
self.define("CMAKE_PREFIX_PATH", self.prefix),
self.define("CMAKE_Fortran_COMPILER", self.compiler.fc),
@@ -63,20 +65,11 @@ def test(self):
if "+mpi" in self.spec:
cmake_args.append(self.define("ParMETIS_ROOT", self.spec["parmetis"].prefix))
cmake_args.append(self.cached_tests_work_dir)
cmake = which(self.spec["cmake"].prefix.bin.cmake)
make = which("make")
self.run_test(
"cmake", cmake_args, purpose="test: calling cmake", work_dir=self.cached_tests_work_dir
)
self.run_test(
"make", [], purpose="test: building the tests", work_dir=self.cached_tests_work_dir
)
self.run_test(
"metis",
[],
[],
purpose="test: checking the installation",
installed=False,
work_dir=self.cached_tests_work_dir,
)
with working_dir(self.cached_tests_work_dir):
cmake(*cmake_args)
make()
metis = which("metis")
metis()

View File

@@ -91,27 +91,20 @@ def setup_smoke_tests(self):
install test subdirectory for use during `spack test run`."""
self.cache_extra_test_sources([self.examples_src_dir])
def test(self):
"""Perform stand-alone/smoke tests using installed package."""
def test_installation(self):
"""build and run ctest against the installed software"""
cmake_args = [
self.define("CMAKE_PREFIX_PATH", self.prefix),
self.define("CMAKE_CXX_COMPILER", self.compiler.cxx),
self.define("CMAKE_Fortran_COMPILER", self.compiler.fc),
self.cached_tests_work_dir,
]
self.run_test(
"cmake", cmake_args, purpose="test: calling cmake", work_dir=self.cached_tests_work_dir
)
cmake = which(self.spec["cmake"].prefix.bin.cmake)
ctest = which("ctest")
make = which("make")
self.run_test(
"make", [], purpose="test: calling make", work_dir=self.cached_tests_work_dir
)
self.run_test(
"ctest",
["-V"],
["100% tests passed"],
installed=False,
purpose="test: testing the installation",
work_dir=self.cached_tests_work_dir,
)
with working_dir(self.cached_tests_work_dir, create=True):
cmake(*cmake_args)
make()
out = ctest("-V", output=str.split, error=str.split)
assert "100% tests passed" in out

View File

@@ -22,8 +22,6 @@ class Fplo(MakefilePackage):
url = "file://{0}/FPLO22.00-62.tar.gz".format(os.getcwd())
manual_download = True
maintainers("glennpj")
version("22.00-62", sha256="0d1d4e9c1e8e41900901e26c3cd08ee39dcfdeb3f2c4c8862055eaf704b6d69e")
# TODO: Try to get LAPACK to work with something other than MKL. The build

View File

@@ -2,9 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import shutil
from spack.package import *
@@ -15,38 +13,30 @@ class Fzf(MakefilePackage):
homepage = "https://github.com/junegunn/fzf"
url = "https://github.com/junegunn/fzf/archive/0.17.5.tar.gz"
maintainers("alecbcs")
executables = ["^fzf$"]
version("0.22.0", sha256="3090748bb656333ed98490fe62133760e5da40ba4cd429a8142b4a0b69d05586")
version("0.17.5", sha256="de3b39758e01b19bbc04ee0d5107e14052d3a32ce8f40d4a63d0ed311394f7ee")
version("0.17.4", sha256="a4b009638266b116f422d159cd1e09df64112e6ae3490964db2cd46636981ff0")
version("0.17.3", sha256="e843904417adf926613431e4403fded24fade56269446e92aac6ff1db86af81e")
version("0.17.1", sha256="9c881e55780c0f56b5a30b87df756634d853bfd3938e7e53cb2df6ed63aa84a7")
version("0.17.0-2", sha256="a084415231b452b92a6b8aa87a69c0c02ee59bfe95774bf0d4fcc9a6251ece20")
version("0.17.0", sha256="23569faf64cd6831c09aad7030c8b4bace0eb7a979c580b33cc4e4f9ff303e29")
version("0.16.11", sha256="e3067d4ad58d7be51eba9a35c06518cd7145c0cc297882796c7e40285f268a99")
version("0.16.10", sha256="a6b9d8abcba4239d30201cc7911e9c305a5cd750081ce5cd389f8e7425f4dc93")
version("0.16.9", sha256="dd9434576c68313481613a5bd52dbf623eee37a5c87f7bb66ca71ac8add5ff94")
version("0.16.8", sha256="daef99f67cff3dad261dbcf2aef995bb78b360bcc7098d7230cb11674e1ee1d4")
version("0.40.0", sha256="9597f297a6811d300f619fff5aadab8003adbcc1566199a43886d2ea09109a65")
depends_on("go@1.11:", type="build")
depends_on("go@1.17:", type="build")
variant("vim", default=False, description="Install vim plugins for fzf")
patch("github_mirrors.patch", when="@:0.17.5")
@classmethod
def determine_version(cls, exe):
candidate = Executable(exe)("--version", output=str, error=str)
match = re.match(r"(^[\d.]+)", candidate)
output = Executable(exe)("--version", output=str, error=str)
match = re.match(r"(^[\d.]+)", output)
return match.group(1) if match else None
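Editor's note: a quick, self-contained check of the version-detection regex against typical `fzf --version` output; the sample string is illustrative:

import re

output = "0.40.0 (d579e335)"                 # example `fzf --version` output
match = re.match(r"(^[\d.]+)", output)
print(match.group(1) if match else None)     # -> 0.40.0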
@when("@:0.17.5")
def patch(self):
glide_home = os.path.join(self.build_directory, "glide_home")
os.environ["GLIDE_HOME"] = glide_home
shutil.rmtree(glide_home, ignore_errors=True)
os.mkdir(glide_home)
def setup_build_environment(self, env):
# Point GOPATH at the top of the staging dir for the build step.
env.prepend_path("GOPATH", self.stage.path)
# Set required environment variables since we
# are not using git to pull down the repository.
env.set("FZF_VERSION", self.spec.version)
env.set("FZF_REVISION", "tarball")
def install(self, spec, prefix):
make("install")
@@ -54,6 +44,9 @@ def install(self, spec, prefix):
mkdir(prefix.bin)
install("bin/fzf", prefix.bin)
mkdirp(prefix.share.fzf.shell)
install_tree("shell", prefix.share.fzf.shell)
@run_after("install")
def post_install(self):
if "+vim" in self.spec:

View File

@@ -26,8 +26,6 @@ class Gate(CMakePackage):
homepage = "http://opengatecollaboration.org/"
url = "https://github.com/OpenGATE/Gate/archive/v9.0.tar.gz"
maintainers("glennpj")
version("9.1", sha256="aaab874198500b81d45b27cc6d6a51e72cca9519910b893a5c85c8e6d3ffa4fc")
version("9.0", sha256="8354f392facc0b7ae2ddf0eed61cc43136195b198ba399df25e874886b8b69cb")

View File

@@ -12,8 +12,6 @@ class Gatetools(PythonPackage):
homepage = "https://github.com/OpenGATE/GateTools"
pypi = "gatetools/gatetools-0.9.14.tar.gz"
maintainers("glennpj")
version("0.11.2", sha256="6eef8a779278b862823ae79d6aab210db4f7889c9127b2c2e4c3a4195f9a9928")
version("0.9.14", sha256="78fe864bb52fd4c6aeeee90d8f6c1bc5406ce02ac6f48712379efac606b5c006")

View File

@@ -0,0 +1,22 @@
diff --git a/GaudiHive/src/FetchLeavesFromFile.cpp b/GaudiHive/src/FetchLeavesFromFile.cpp
index 55c60e6a1..5ed8efa91 100644
--- a/GaudiHive/src/FetchLeavesFromFile.cpp
+++ b/GaudiHive/src/FetchLeavesFromFile.cpp
@@ -67,7 +67,7 @@ namespace Gaudi {
DataObject* obj = nullptr;
evtSvc()
->retrieveObject( m_rootNode, obj )
- .orThrow( fmt::format( "failed to retrieve {} from {}", m_rootNode.value(), m_dataSvcName ), name() );
+ .orThrow( fmt::format( "failed to retrieve {} from {}", m_rootNode.value(), m_dataSvcName.value() ), name() );
}
// result
IDataStoreLeaves::LeavesList all_leaves;
@@ -93,7 +93,7 @@ namespace Gaudi {
->retrieveObject( reg->identifier(), obj )
.orElse( [&]() {
failure_msg =
- fmt::format( "failed to retrieve {} from {}", reg->identifier(), m_dataSvcName );
+ fmt::format( "failed to retrieve {} from {}", reg->identifier(), m_dataSvcName.value() );
// we do not really care about the exception we throw because traverseSubTree will just use
// it to abort the traversal
throw GaudiException( failure_msg, name(), StatusCode::FAILURE );

View File

@@ -53,6 +53,7 @@ class Gaudi(CMakePackage):
# fixes for the cmake config which could not find newer boost versions
patch("link_target_fixes.patch", when="@33.0:34")
patch("link_target_fixes32.patch", when="@:32.2")
patch("fmt_fix.patch", when="@36.6:36.12 ^fmt@10:")
# These dependencies are needed for a minimal Gaudi build
depends_on("aida")
@@ -137,6 +138,8 @@ def setup_run_environment(self, env):
# environment as in Gaudi.xenv
env.prepend_path("PATH", self.prefix.scripts)
env.prepend_path("PYTHONPATH", self.prefix.python)
env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib)
env.prepend_path("LD_LIBRARY_PATH", self.prefix.lib64)
def url_for_version(self, version):
major = str(version[0])

View File

@@ -56,7 +56,7 @@ class Gmake(AutotoolsPackage, GNUMirrorPackage):
tags = ["build-tools"]
executables = ["^g?make$"]
executables = ["^make$"]
@classmethod
def determine_version(cls, exe):
@@ -84,6 +84,3 @@ def setup_dependent_package(self, module, dspec):
module.make = MakeExecutable(
self.spec.prefix.bin.make, determine_number_of_jobs(parallel=dspec.package.parallel)
)
module.gmake = MakeExecutable(
self.spec.prefix.bin.gmake, determine_number_of_jobs(parallel=dspec.package.parallel)
)
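Editor's note: the narrowed executables pattern means externally installed gmake binaries are no longer detected, and dependent packages now get only the make symbol injected. A small illustration of the regex change; the candidate names are examples:

import re

old_pattern, new_pattern = r"^g?make$", r"^make$"
for exe in ("make", "gmake"):
    print(exe, bool(re.match(old_pattern, exe)), bool(re.match(new_pattern, exe)))
# make  True True
# gmake True False   <- gmake executables are no longer picked up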

View File

@@ -23,8 +23,6 @@ class Gurobi(Package):
homepage = "https://www.gurobi.com"
manual_download = True
maintainers("glennpj")
version("10.0.0", sha256="91a9ce1464f5f948809fcdfbdeb55f77698ed8a6d6cfa6985295424b6ece2bd4")
version("9.5.2", sha256="95d8ca18b7f86116ba834a27fd6228c5b1708ae67927e7ea0e954c09374a2d0f")
version("9.5.1", sha256="fa82859d33f08fb8aeb9da66b0fbd91718ed573c534f571aa52372c9deb891da")

View File

@@ -18,13 +18,10 @@ class Harfbuzz(MesonPackage, AutotoolsPackage):
conditional("autotools", when="@:2.9"), conditional("meson", when="@3:"), default="meson"
)
version("7.3.0", sha256="20770789749ac9ba846df33983dbda22db836c70d9f5d050cb9aa5347094a8fb")
version("7.2.0", sha256="fc5560c807eae0efd5f95b5aa4c65800c7a8eed6642008a6b1e7e3ffff7873cc")
version("6.0.0", sha256="1d1010a1751d076d5291e433c138502a794d679a7498d1268ee21e2d4a140eb4")
version(
"5.3.1",
sha256="4a6ce097b75a8121facc4ba83b5b083bfec657f45b003cd5a3424f2ae6b4434d",
preferred=True,
)
version("5.3.1", sha256="4a6ce097b75a8121facc4ba83b5b083bfec657f45b003cd5a3424f2ae6b4434d")
version("5.1.0", sha256="2edb95db668781aaa8d60959d21be2ff80085f31b12053cdd660d9a50ce84f05")
version("4.2.1", sha256="bd17916513829aeff961359a5ccebba6de2f4bf37a91faee3ac29c120e3d7ee1")
version("4.1.0", sha256="f7984ff4241d4d135f318a93aa902d910a170a8265b7eaf93b5d9a504eed40c8")

View File

@@ -24,6 +24,7 @@ class Hdf5VolAsync(CMakePackage):
tags = ["e4s"]
version("develop", branch="develop")
version("1.6", tag="v1.6")
version("1.5", tag="v1.5")
version("1.4", tag="v1.4")
version("1.3", tag="v1.3")

View File

@@ -0,0 +1,47 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Hdf5VolDaos(CMakePackage):
"""The HDF5 DAOS VOL connector is an external VOL connector
that interfaces with the DAOS API"""
homepage = "https://github.com/HDFGroup/vol-daos"
url = (
"https://github.com/HDFGroup/vol-daos/releases/download/v1.2.0/hdf5_vol_daos-1.2.0.tar.bz2"
)
git = "https://github.com/HDFGroup/vol-daos.git"
maintainers("hyoklee", "soumagne")
version("master", branch="master", submodules=True)
version("1.2.0", sha256="669c1443605068f24c033783ef72619afcec4844902b3e0bffa19ddeea39779f")
depends_on("cmake@2.8.12.2:", type="build")
depends_on("daos@2.2.0:")
depends_on("hdf5@1.14.0:+hl+mpi+map")
def cmake_args(self):
"""Populate cmake arguments for HDF5 DAOS."""
define = self.define
cmake_args = [
define("BUILD_SHARED_LIBS", True),
define("BUILD_TESTING", self.run_tests),
define("PC_DAOS_INCLUDEDIR", self.spec["daos"].prefix + "/include"),
define("PC_DAOS_LIBDIR", self.spec["daos"].prefix + "/lib64"),
]
return cmake_args
def setup_run_environment(self, env):
env.prepend_path("HDF5_PLUGIN_PATH", self.prefix.lib)
def check(self):
"""Unit tests fail when run in parallel."""
with working_dir(self.build_directory):
make("test", parallel=False)

View File

@@ -24,8 +24,6 @@ class Heasoft(AutotoolsPackage):
homepage = "https://heasarc.gsfc.nasa.gov/docs/software/lheasoft/"
url = "https://heasarc.gsfc.nasa.gov/FTP/software/lheasoft/lheasoft6.29/heasoft-6.29src.tar.gz"
maintainers("glennpj")
version("6.30", sha256="7f828f6050809653319f94d715c1b6815fbc09adfdcb61f2f0f1d7a6af10684a")
version("6.29", sha256="534fec04baa2586326fd7240805f2606620f3b7d7078a80fdd95c9c1177c9e68")

View File

@@ -24,10 +24,12 @@ class Hipsycl(CMakePackage):
provides("sycl")
version("stable", branch="stable", submodules=True)
version("0.9.4", commit="99d9e24d462b35e815e0e59c1b611936c70464ae", submodules=True)
version("0.9.3", commit="51507bad524c33afe8b124804091b10fa25618dc", submodules=True)
version("0.9.2", commit="49fd02499841ae884c61c738610e58c27ab51fdb", submodules=True)
version("0.9.1", commit="fe8465cd5399a932f7221343c07c9942b0fe644c", submodules=True)
version("0.8.0", commit="2daf8407e49dd32ebd1c266e8e944e390d28b22a", submodules=True)
version("develop", branch="develop", submodules=True)
variant("cuda", default=False, description="Enable CUDA backend for SYCL kernels")
@@ -37,6 +39,8 @@ class Hipsycl(CMakePackage):
depends_on("python@3:")
depends_on("llvm@8: +clang", when="~cuda")
depends_on("llvm@9: +clang", when="+cuda")
# https://github.com/OpenSYCL/OpenSYCL/pull/918 was introduced after 0.9.4
conflicts("llvm@16:", when="@:0.9.4")
# LLVM PTX backend requires cuda7:10.1 (https://tinyurl.com/v82k5qq)
depends_on("cuda@9:10.1", when="@0.8.1: +cuda ^llvm@9")
depends_on("cuda@9:", when="@0.8.1: +cuda ^llvm@10:")

View File

@@ -38,6 +38,16 @@ class Hpcg(AutotoolsPackage):
sha256="722c13837b287e979442f8372274aa5910a290aa39f1ed1ff646116be08dcae9",
when="%arm",
)
patch(
"https://github.com/hpcg-benchmark/hpcg/commit/e9e0b7e6cae23e1f30dd983c2ce2d3bd34d56f75.patch?full_index=1",
sha256="722c13837b287e979442f8372274aa5910a290aa39f1ed1ff646116be08dcae9",
when="%oneapi",
)
patch(
"https://github.com/hpcg-benchmark/hpcg/commit/e9e0b7e6cae23e1f30dd983c2ce2d3bd34d56f75.patch?full_index=1",
sha256="722c13837b287e979442f8372274aa5910a290aa39f1ed1ff646116be08dcae9",
when="%intel",
)
depends_on("mpi@1.1:")
@@ -50,6 +60,8 @@ def configure(self, spec, prefix):
not spec.satisfies("%aocc")
and not spec.satisfies("%cce")
and not spec.satisfies("%arm")
and not spec.satisfies("%intel")
and not spec.satisfies("%oneapi")
):
CXXFLAGS += " -ftree-vectorizer-verbose=0 "
if spec.satisfies("%cce"):

View File

@@ -0,0 +1,32 @@
Fixes a bug where libxed-ild didn't understand some instructions.
https://github.com/intelxed/xed/issues/298
diff --git a/src/dec/xed-ild.c b/src/dec/xed-ild.c
index 8bf7cbe..0ab90ac 100644
--- a/src/dec/xed-ild.c
+++ b/src/dec/xed-ild.c
@@ -1385,6 +1385,7 @@ void xed_ild_lookup_init(void) {
void xed_ild_init(void) {
init_prefix_table();
xed_ild_lookup_init();
+ xed_init_chip_model_info();
#if defined(XED_EXTENSION_XOP_DEFINED)
xed_ild_chip_init();
#endif
diff --git a/xed_mbuild.py b/xed_mbuild.py
index db34179..5a7842c 100755
--- a/xed_mbuild.py
+++ b/xed_mbuild.py
@@ -1081,7 +1081,8 @@ def build_xed_ild_library(env, lib_env, lib_dag, sources_to_replace):
'xed-ild-disp-l3.c', # generated
'xed-ild-eosz.c', # generated
'xed-ild-easz.c', # generated
- 'xed-ild-imm-l3.c'] # generated
+ 'xed-ild-imm-l3.c', # generated
+ 'xed-error-enum.c',] # generated
common_objs = lib_env.make_obj(common_sources)
ild_objs += xbc.build_dir_join(lib_env, common_objs)

View File

@@ -21,6 +21,7 @@ class IntelXed(Package):
# Current versions now have actual releases and tags.
version("main", branch="main")
version("2023.04.16", tag="v2023.04.16")
version("2022.10.11", tag="v2022.10.11")
version("2022.08.11", tag="v2022.08.11")
version("2022.04.17", tag="v2022.04.17")
@@ -28,30 +29,31 @@ class IntelXed(Package):
version("11.2.0", tag="11.2.0")
# The old 2019.03.01 version (before there were tags).
version("10.2019.03", commit="b7231de4c808db821d64f4018d15412640c34113")
version("10.2019.03", commit="b7231de4c808db821d64f4018d15412640c34113", deprecated=True)
resource(name="mbuild", placement="mbuild", git=mbuild_git, branch="main", when="@main")
# XED wants the mbuild directory adjacent to xed in the same directory.
mdir = join_path("..", "mbuild")
resource(name="mbuild", placement=mdir, git=mbuild_git, branch="main", when="@main")
# Match xed more closely with the version of mbuild at the time.
resource(
name="mbuild",
placement="mbuild",
git=mbuild_git,
commit="09b6654be0c52bf1df44e88c88b411a67b624cbd",
when="@:9999",
name="mbuild", placement=mdir, git=mbuild_git, tag="v2022.07.28", when="@2022.07:9999"
)
resource(name="mbuild", placement=mdir, git=mbuild_git, tag="v2022.04.17", when="@:2022.06")
variant("debug", default=False, description="Enable debug symbols")
variant("pic", default=False, description="Compile with position independent code.")
# The current mfile uses python3 by name.
depends_on("python@3.4:", type="build")
depends_on("python@3.6:", type="build")
patch("1201-segv.patch", when="@12.0.1")
patch("2019-python3.patch", when="@10.2019.03")
patch("libxed-ild.patch", when="@12.0:2022.12")
conflicts("target=ppc64:", msg="intel-xed only runs on x86")
conflicts("target=ppc64le:", msg="intel-xed only runs on x86")
conflicts("target=aarch64:", msg="intel-xed only runs on x86")
requires("target=x86_64:", msg="intel-xed only runs on x86/x86_64")
mycflags = [] # type: List[str]
@@ -67,10 +69,19 @@ def flag_handler(self, name, flags):
def install(self, spec, prefix):
# XED needs PYTHONPATH to find the mbuild directory.
mbuild_dir = join_path(self.stage.source_path, "mbuild")
mbuild_dir = join_path(self.stage.source_path, "..", "mbuild")
python_path = os.getenv("PYTHONPATH", "")
os.environ["PYTHONPATH"] = mbuild_dir + ":" + python_path
# In 2023.04.16, the xed source directory must be exactly 'xed',
# so add a symlink, but don't fail if the link already exists.
# See: https://github.com/intelxed/xed/issues/300
try:
lname = join_path(self.stage.source_path, "..", "xed")
os.symlink("spack-src", lname)
except OSError:
pass
mfile = Executable(join_path(".", "mfile.py"))
args = ["-j", str(make_jobs), "--cc=%s" % spack_cc, "--no-werror"]

View File

@@ -22,8 +22,6 @@ class Itk(CMakePackage):
homepage = "https://itk.org/"
url = "https://github.com/InsightSoftwareConsortium/ITK/releases/download/v5.1.1/InsightToolkit-5.1.1.tar.gz"
maintainers("glennpj")
version("5.3.0", sha256="57a4471133dc8f76bde3d6eb45285c440bd40d113428884a1487472b7b71e383")
version("5.3rc02", sha256="163aaf4a6cecd5b70ff718c1a986c746581797212fd1b629fa81f12ae4756d14")
version(

View File

@@ -23,7 +23,7 @@ class Julia(MakefilePackage):
url = "https://github.com/JuliaLang/julia/releases/download/v1.7.0/julia-1.7.0.tar.gz"
git = "https://github.com/JuliaLang/julia.git"
maintainers("glennpj", "vchuravy", "haampie", "giordano")
maintainers("vchuravy", "haampie", "giordano")
version("master", branch="master")
version("1.9.0", sha256="48f4c8a7d5f33d0bc6ce24226df20ab49e385c2d0c3767ec8dfdb449602095b2")

View File

@@ -0,0 +1,24 @@
diff -ruN spack-src/CMakeLists.txt spack-src-patched/CMakeLists.txt
--- spack-src/CMakeLists.txt 2023-05-18 14:18:00.897162488 -0400
+++ spack-src-patched/CMakeLists.txt 2023-05-18 14:20:09.532413649 -0400
@@ -495,7 +495,7 @@
if("${Legion_HIP_ARCH}" STREQUAL "")
set(HIP_GENCODE "")
else()
- set(HIP_GENCODE "--offload-target=${Legion_HIP_ARCH}")
+ set(HIP_GENCODE "--offload-arch=${Legion_HIP_ARCH}")
endif()
endif()
diff -ruN spack-src/runtime/runtime.mk spack-src-patched/runtime/runtime.mk
--- spack-src/runtime/runtime.mk 2023-05-18 14:18:00.969164248 -0400
+++ spack-src-patched/runtime/runtime.mk 2023-05-18 14:20:51.317470176 -0400
@@ -499,7 +499,7 @@
HIPCC_FLAGS += -O2
endif
ifneq ($(strip $(HIP_ARCH)),)
- HIPCC_FLAGS += --offload-target=$(HIP_ARCH)
+ HIPCC_FLAGS += --offload-arch=$(HIP_ARCH)
endif
LEGION_LD_FLAGS += -lm -L$(HIP_PATH)/lib -lamdhip64
else ifeq ($(strip $(HIP_TARGET)),CUDA)

View File

@@ -70,6 +70,9 @@ class Legion(CMakePackage, ROCmPackage):
depends_on("kokkos@3.3.01:~cuda", when="+kokkos~cuda")
depends_on("kokkos@3.3.01:~cuda+openmp", when="+kokkos+openmp")
# https://github.com/spack/spack/issues/37232#issuecomment-1553376552
patch("hip-offload-arch.patch", when="@23.03.0 +rocm")
# HIP specific
variant(
"hip_hijack",

View File

@@ -38,6 +38,10 @@ def flag_handler(self, name, flags):
# flags, and the build system ensures that
return (None, flags, None)
# 1.10.2 fails on macOS when trying to use the Linux getrandom() call
# https://dev.gnupg.org/T6442
patch("rndgetentropy_no_getrandom.patch", when="@=1.10.2 platform=darwin")
def check(self):
# Without this hack, `make check` fails on macOS when SIP is enabled
# https://bugs.gnupg.org/gnupg/issue2056

View File

@@ -0,0 +1,34 @@
diff --git a/random/rndgetentropy.c b/random/rndgetentropy.c
index 513da0b..d8eedce 100644
--- a/random/rndgetentropy.c
+++ b/random/rndgetentropy.c
@@ -81,27 +81,8 @@ _gcry_rndgetentropy_gather_random (void (*add)(const void*, size_t,
do
{
_gcry_pre_syscall ();
- if (fips_mode ())
- {
- /* DRBG chaining defined in SP 800-90A (rev 1) specify
- * the upstream (kernel) DRBG needs to be reseeded for
- * initialization of downstream (libgcrypt) DRBG. For this
- * in RHEL, we repurposed the GRND_RANDOM flag of getrandom API.
- * The libgcrypt DRBG is initialized with 48B of entropy, but
- * the kernel can provide only 32B at a time after reseeding
- * so we need to limit our requests to 32B here.
- * This is clarified in IG 7.19 / IG D.K. for FIPS 140-2 / 3
- * and might not be applicable on other FIPS modules not running
- * RHEL kernel.
- */
- nbytes = length < 32 ? length : 32;
- ret = getrandom (buffer, nbytes, GRND_RANDOM);
- }
- else
- {
- nbytes = length < sizeof (buffer) ? length : sizeof (buffer);
- ret = getentropy (buffer, nbytes);
- }
+ nbytes = length < sizeof (buffer) ? length : sizeof (buffer);
+ ret = getentropy (buffer, nbytes);
_gcry_post_syscall ();
}
while (ret == -1 && errno == EINTR);

View File

@@ -15,6 +15,7 @@ class Libjwt(AutotoolsPackage):
maintainers("bollig")
version("1.15.3", sha256="cb2fd95123689e7d209a3a8c060e02f68341c9a5ded524c0cd881a8cd20d711f")
version("1.15.2", sha256="a366531ad7d5d559b1f8c982e7bc7cece7eaefacf7e91ec36d720609c01dc410")
version("1.13.1", sha256="4df55ac89c6692adaf3badb43daf3241fd876612c9ab627e250dfc4bb59993d9")
version("1.12.1", sha256="d29e4250d437340b076350e910e69fd5539ef8b92528d0306745cec0e343cc17")

View File

@@ -15,6 +15,8 @@ class LibpressioNvcomp(CMakePackage, CudaPackage):
maintainers("robertu94")
version("0.0.5", sha256="2f2a2567c77db550badaf594cda824fa313470b143f69bcef308eeb80b4876c2")
version("0.0.4", sha256="6ff7d0f3167dead7584c994a6a11782f20eb3dd4844307e4ee8b2aebcd8571e9")
version("0.0.3", sha256="21409d34f9281bfd7b83b74f5f8fc6d34794f3161391405538c060fb59534597")
version("0.0.2", commit="38d7aa7c283681cbe5b7f17b900f72f9f25be51c")

View File

@@ -17,6 +17,8 @@ class LibpressioOpt(CMakePackage):
version("develop", branch="develop")
version("sdr-develop", branch="develop", git="git@github.com:szcompressor/SDRFramework")
version("0.15.0", sha256="0f092ae287e555c890d0ab77df83a7acf619a2b05ab104cef8647df4f886d759")
version("0.14.0", sha256="1e8d348f9777f3d49764b22b1f2abefd4b972cb9b1fa27c867373d32c8f1c57d")
version("0.13.5", sha256="cc0e6a46335aa3552b8ab57757d39855f4fba71e661f706ec99519cb2c8a2f3c")
version("0.13.4", sha256="e9f715d11fe3558a31e1d9a939150209449ec8636ded047cb0adcd3db07569ae")
version("0.13.3", sha256="98436b7fa6a53dd9cc09a9b978dc81c299501930cb8b844713080fc42d39d173")
@@ -28,6 +30,8 @@ class LibpressioOpt(CMakePackage):
version("0.11.0", sha256="cebbc512fcaa537d2af1a6919d6e0400cdc13595d71d9b90b74ad3eb865c9767")
depends_on("libpressio+libdistributed+mpi")
depends_on("libpressio@0.93.0:", when="@0.14.0:")
depends_on("libpressio@0.95.0:", when="@0.15.0:")
depends_on("libpressio@0.88.0:", when="@0.13.5:")
depends_on("libpressio@0.85.0:", when="@0.13.3:")
depends_on("libpressio@0.66.1:", when="@:0.13.2")

View File

@@ -17,6 +17,7 @@ class LibpressioRmetric(CMakePackage):
version("master", branch="master")
# note versions <= 0.0.3 do not build with spack
version("0.0.7", sha256="b01df5102076412064849335c2c928a4a5ba23e1f1f515062d9166b0a7531179")
version("0.0.6", sha256="b23a79448cd32b51a7301d6cebf4e228289712dd77dd76d86821741467e9af46")
version("0.0.5", sha256="51eb192314ef083790dd0779864cab527845bd8de699b3a33cd065c248eae24c")
version("0.0.4", sha256="166af5e84d7156c828a3f0dcc5bf531793ea4ec44bbf468184fbab96e1f0a91f")

View File

@@ -21,5 +21,6 @@ class LibpressioSperr(CMakePackage):
depends_on("pkgconfig", type="build")
version("master", branch="master")
version("0.0.3", sha256="e0d1fd083419aaaa243cbf780b7de17aeb96533000071088aa21ec238d358ecc")
version("0.0.2", sha256="61995d687f9e7e798e17ec7238d19d917890dc0ff5dec18293b840c4d6f8c115")
version("0.0.1", sha256="e2c164822708624b97654046b42abff704594cba6537d6d0646d485bdf2d03ca")

View File

@@ -17,6 +17,8 @@ class LibpressioTools(CMakePackage):
tags = ["e4s"]
version("master", branch="master")
version("0.3.0", sha256="2f309557df3e8df9e492691213933865a5dbfb051c03404e33918f4765223025")
version("0.2.0", sha256="75048950f0dfa0e20f2651991875822f36fceb84bdda12d1c0361d49912392b8")
version("0.1.6", sha256="a67a364f46dea29ff1b3e5c52c0a5abf2d9d53412fb8d424f6bd71252bfa7792")
version("0.1.5", sha256="b35f495fae53df87dd2abf58c0c51ed17710b16aaa2d0842a543fddd3b2a8035")
version("0.1.4", sha256="39adc4b09a63548a416ee4b1dcc87ec8578b15a176a11a2845c276c6c211f2d0")
@@ -43,6 +45,7 @@ class LibpressioTools(CMakePackage):
depends_on("boost")
# 0.1.0 changed a bunch of things in the build system, make sure everything is up to date
depends_on("libpressio@0.89.0:", when="@0.2.0:")
depends_on("libpressio@0.88.0:", when="@0.1.6:")
depends_on("libpressio@0.85.0:", when="@0.1.0:0.1.5")
depends_on("libpressio-opt@0.13.3:", when="@0.1.0:+opt")

View File

@@ -16,6 +16,7 @@ class LibpressioTthresh(CMakePackage):
maintainers("robertu94")
version("main", branch="main")
version("0.0.7", sha256="5e364ef72dd1ed1cf786d2b7aef89624fdcf1a0ca845777ce54c365b35a75be2")
version("0.0.6", sha256="e9dc4754421d892a86516c6bb892f6ff582e9ea3c242c1c052104e4f6944cbec")
version("0.0.5", sha256="af47c90e9c16825312e390a7fb30d9d128847afb69ad6c2f6608bd80f60bae23")
version("0.0.3", sha256="b0b0a4876d3362deafc2bb326be33882132e3d1666e0c5f916fd6fad74a18688")

View File

@@ -20,6 +20,12 @@ class Libpressio(CMakePackage, CudaPackage):
tests_require_compiler = True
version("master", branch="master")
version("develop", branch="develop")
version("0.95.1", sha256="c2e4f81d1491781cd47f2baba64acfbba9a7d6203c9b01369f8b1a8f94e0bb2b")
version("0.94.0", sha256="4250597cdd54043a7d5009ffc3feea3eac9496cdd38ea3f61f9727b7acd09add")
version("0.93.0", sha256="1da5940aaf0190a810988dcd8f415b9c8db53bbbdfcb627d899921c89170d990")
version("0.92.0", sha256="e9cab155deb07aabdca4ece2c826be905ed33f16c95f82f24eb01d181fce6109")
version("0.91.1", sha256="35cd4b93e410a83c626c9c168d59ade3bf26a453bcbf50dfd77b6d141184b97c")
version("0.91.0", sha256="6220988dc964c36cdffdbc5e055261ac7a0189ad80b67a962189683648209d2e")
version("0.90.2", sha256="1fe3f4073952a96bda1b3d7c237bc5d64d1f7bf13bfe1830074852ea33006bf9")
version("0.88.3", sha256="b2df2ed11f77eb2e07206f7bdfa4754017559017235c3324820021ef451fd48b")
version("0.88.2", sha256="f5de6aff5ff906b164d6b2199ada10a8e32fb1e2a6295da3f0b79d9626661a46")
@@ -329,6 +335,7 @@ def cmake_args(self):
args.append("-DLIBPRESSIO_HAS_SZ3=ON")
if "+cuda" in self.spec:
args.append("-DLIBPRESSIO_HAS_CUFILE=ON")
args.append("-DLIBPRESSIO_HAS_CUDA=ON")
if "+mgardx" in self.spec:
args.append("-DLIBPRESSIO_HAS_MGARDx=ON")
if "+bzip2" in self.spec:

View File

@@ -16,6 +16,8 @@ class Libstdcompat(CMakePackage):
maintainers("robertu94")
version("master", branch="master")
version("0.0.17", sha256="8c8a3f2727dd28c51ab10e02a1114e39b683d6d9ea119d5c2a953f8c41d6bedd")
version("0.0.16", sha256="1287251b694adb80210536ab6eb75c1ff2c4ed8b77023208a757ae27c9dae0bb")
version("0.0.15", sha256="af374a8883a32d874f6cd18cce4e4344e32f9d60754be403a5ac7114feca2a28")
version("0.0.14", sha256="9a794d43a1d79aec3350b89d8c06689b8b32cf75e2742cdfa9dc0e3f2be6f04e")
version("0.0.13", sha256="460656189e317e108a489af701fa8f33f13a93d96380788e692a1c68100e0388")

View File

@@ -111,6 +111,7 @@ def patch(self):
class CMakeBuilder(CMakeBuilder):
def cmake_args(self):
args = [self.define_from_variant(var) for var in VARIANTS]
args.append("-Dsphinx=OFF")
# Remove empty strings
args = [arg for arg in args if arg]
@@ -123,5 +124,6 @@ def configure_args(self):
args = []
for var in VARIANTS:
args.extend(self.enable_or_disable(var))
args.append("--disable-sphinx")
return args

View File

@@ -35,6 +35,7 @@ class Llvm(CMakePackage, CudaPackage):
family = "compiler" # Used by lmod
version("main", branch="main")
version("16.0.3", sha256="0bd71bc687a4e5a250c40afb0decefc50c85178fcce726137b682039de63919b")
version("16.0.2", sha256="97c3c6aafb53c4bb0ed2781a18d6f05e75445e24bb1dc57a32b74f8d710ac19f")
version("16.0.1", sha256="b5a9ff1793b1e2d388a3819bf35797002b1d2e40bb35a10c65605e0ea1435271")
version("16.0.0", sha256="cba969a0782a3a398658d439f047b5e548ea04724f4fbfdbe17cfc946f4cd3ed")
@@ -235,7 +236,7 @@ class Llvm(CMakePackage, CudaPackage):
# openmp dependencies
depends_on("perl-data-dumper", type=("build"))
depends_on("hwloc")
depends_on("hwloc@2.0.1:", when="@9:")
depends_on("hwloc@2.0.1:", when="@13")
depends_on("elf", when="+cuda") # libomptarget
depends_on("libffi", when="+libomptarget") # libomptarget
@@ -374,6 +375,13 @@ class Llvm(CMakePackage, CudaPackage):
# when/if the bugfix is merged
patch("D133513.diff", level=0, when="@14:15+lldb+python")
# Fix hwloc@:2.3 (Conditionally disable hwloc@2.0 and hwloc@2.4 code)
patch(
"https://github.com/llvm/llvm-project/commit/3a362a9f38b95978160377ee408dbc7d14af9aad.patch?full_index=1",
sha256="25bc503f7855229620e56e76161cf4654945aef0be493a2d8d9e94a088157b7c",
when="@14:15",
)
# The functions and attributes below implement external package
# detection for LLVM. See:
#

View File

@@ -0,0 +1,563 @@
diff --git a/configure.ac b/configure.ac
index 4da931a3..90a49cab 100755
--- a/configure.ac
+++ b/configure.ac
@@ -53,6 +53,18 @@ AC_REQUIRE_CPP
AC_CHECK_SIZEOF([long int])
+dnl
+dnl margo relies on pthreads because of prio_pool implementation and
+dnl argobots profiling shim
+dnl
+AX_PTHREAD([],
+ [AC_MSG_ERROR([Could not find working pthreads])])
+LIBS="$PTHREAD_LIBS $LIBS"
+CFLAGS="$CFLAGS $PTHREAD_CFLAGS"
+dnl subst for .pc file generation
+AC_SUBST([PTHREAD_LIBS], ["$PTHREAD_LIBS"])
+AC_SUBST([PTHREAD_CFLAGS], ["$PTHREAD_CFLAGS"])
+
dnl
dnl Verify pkg-config
dnl
diff --git a/m4/ax_pthread.m4 b/m4/ax_pthread.m4
new file mode 100644
index 00000000..9f35d139
--- /dev/null
+++ b/m4/ax_pthread.m4
@@ -0,0 +1,522 @@
+# ===========================================================================
+# https://www.gnu.org/software/autoconf-archive/ax_pthread.html
+# ===========================================================================
+#
+# SYNOPSIS
+#
+# AX_PTHREAD([ACTION-IF-FOUND[, ACTION-IF-NOT-FOUND]])
+#
+# DESCRIPTION
+#
+# This macro figures out how to build C programs using POSIX threads. It
+# sets the PTHREAD_LIBS output variable to the threads library and linker
+# flags, and the PTHREAD_CFLAGS output variable to any special C compiler
+# flags that are needed. (The user can also force certain compiler
+# flags/libs to be tested by setting these environment variables.)
+#
+# Also sets PTHREAD_CC and PTHREAD_CXX to any special C compiler that is
+# needed for multi-threaded programs (defaults to the value of CC
+# respectively CXX otherwise). (This is necessary on e.g. AIX to use the
+# special cc_r/CC_r compiler alias.)
+#
+# NOTE: You are assumed to not only compile your program with these flags,
+# but also to link with them as well. For example, you might link with
+# $PTHREAD_CC $CFLAGS $PTHREAD_CFLAGS $LDFLAGS ... $PTHREAD_LIBS $LIBS
+# $PTHREAD_CXX $CXXFLAGS $PTHREAD_CFLAGS $LDFLAGS ... $PTHREAD_LIBS $LIBS
+#
+# If you are only building threaded programs, you may wish to use these
+# variables in your default LIBS, CFLAGS, and CC:
+#
+# LIBS="$PTHREAD_LIBS $LIBS"
+# CFLAGS="$CFLAGS $PTHREAD_CFLAGS"
+# CXXFLAGS="$CXXFLAGS $PTHREAD_CFLAGS"
+# CC="$PTHREAD_CC"
+# CXX="$PTHREAD_CXX"
+#
+# In addition, if the PTHREAD_CREATE_JOINABLE thread-attribute constant
+# has a nonstandard name, this macro defines PTHREAD_CREATE_JOINABLE to
+# that name (e.g. PTHREAD_CREATE_UNDETACHED on AIX).
+#
+# Also HAVE_PTHREAD_PRIO_INHERIT is defined if pthread is found and the
+# PTHREAD_PRIO_INHERIT symbol is defined when compiling with
+# PTHREAD_CFLAGS.
+#
+# ACTION-IF-FOUND is a list of shell commands to run if a threads library
+# is found, and ACTION-IF-NOT-FOUND is a list of commands to run it if it
+# is not found. If ACTION-IF-FOUND is not specified, the default action
+# will define HAVE_PTHREAD.
+#
+# Please let the authors know if this macro fails on any platform, or if
+# you have any other suggestions or comments. This macro was based on work
+# by SGJ on autoconf scripts for FFTW (http://www.fftw.org/) (with help
+# from M. Frigo), as well as ac_pthread and hb_pthread macros posted by
+# Alejandro Forero Cuervo to the autoconf macro repository. We are also
+# grateful for the helpful feedback of numerous users.
+#
+# Updated for Autoconf 2.68 by Daniel Richard G.
+#
+# LICENSE
+#
+# Copyright (c) 2008 Steven G. Johnson <stevenj@alum.mit.edu>
+# Copyright (c) 2011 Daniel Richard G. <skunk@iSKUNK.ORG>
+# Copyright (c) 2019 Marc Stevens <marc.stevens@cwi.nl>
+#
+# This program is free software: you can redistribute it and/or modify it
+# under the terms of the GNU General Public License as published by the
+# Free Software Foundation, either version 3 of the License, or (at your
+# option) any later version.
+#
+# This program is distributed in the hope that it will be useful, but
+# WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
+# Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program. If not, see <https://www.gnu.org/licenses/>.
+#
+# As a special exception, the respective Autoconf Macro's copyright owner
+# gives unlimited permission to copy, distribute and modify the configure
+# scripts that are the output of Autoconf when processing the Macro. You
+# need not follow the terms of the GNU General Public License when using
+# or distributing such scripts, even though portions of the text of the
+# Macro appear in them. The GNU General Public License (GPL) does govern
+# all other use of the material that constitutes the Autoconf Macro.
+#
+# This special exception to the GPL applies to versions of the Autoconf
+# Macro released by the Autoconf Archive. When you make and distribute a
+# modified version of the Autoconf Macro, you may extend this special
+# exception to the GPL to apply to your modified version as well.
+
+#serial 31
+
+AU_ALIAS([ACX_PTHREAD], [AX_PTHREAD])
+AC_DEFUN([AX_PTHREAD], [
+AC_REQUIRE([AC_CANONICAL_HOST])
+AC_REQUIRE([AC_PROG_CC])
+AC_REQUIRE([AC_PROG_SED])
+AC_LANG_PUSH([C])
+ax_pthread_ok=no
+
+# We used to check for pthread.h first, but this fails if pthread.h
+# requires special compiler flags (e.g. on Tru64 or Sequent).
+# It gets checked for in the link test anyway.
+
+# First of all, check if the user has set any of the PTHREAD_LIBS,
+# etcetera environment variables, and if threads linking works using
+# them:
+if test "x$PTHREAD_CFLAGS$PTHREAD_LIBS" != "x"; then
+ ax_pthread_save_CC="$CC"
+ ax_pthread_save_CFLAGS="$CFLAGS"
+ ax_pthread_save_LIBS="$LIBS"
+ AS_IF([test "x$PTHREAD_CC" != "x"], [CC="$PTHREAD_CC"])
+ AS_IF([test "x$PTHREAD_CXX" != "x"], [CXX="$PTHREAD_CXX"])
+ CFLAGS="$CFLAGS $PTHREAD_CFLAGS"
+ LIBS="$PTHREAD_LIBS $LIBS"
+ AC_MSG_CHECKING([for pthread_join using $CC $PTHREAD_CFLAGS $PTHREAD_LIBS])
+ AC_LINK_IFELSE([AC_LANG_CALL([], [pthread_join])], [ax_pthread_ok=yes])
+ AC_MSG_RESULT([$ax_pthread_ok])
+ if test "x$ax_pthread_ok" = "xno"; then
+ PTHREAD_LIBS=""
+ PTHREAD_CFLAGS=""
+ fi
+ CC="$ax_pthread_save_CC"
+ CFLAGS="$ax_pthread_save_CFLAGS"
+ LIBS="$ax_pthread_save_LIBS"
+fi
+
+# We must check for the threads library under a number of different
+# names; the ordering is very important because some systems
+# (e.g. DEC) have both -lpthread and -lpthreads, where one of the
+# libraries is broken (non-POSIX).
+
+# Create a list of thread flags to try. Items with a "," contain both
+# C compiler flags (before ",") and linker flags (after ","). Other items
+# starting with a "-" are C compiler flags, and remaining items are
+# library names, except for "none" which indicates that we try without
+# any flags at all, and "pthread-config" which is a program returning
+# the flags for the Pth emulation library.
+
+ax_pthread_flags="pthreads none -Kthread -pthread -pthreads -mthreads pthread --thread-safe -mt pthread-config"
+
+# The ordering *is* (sometimes) important. Some notes on the
+# individual items follow:
+
+# pthreads: AIX (must check this before -lpthread)
+# none: in case threads are in libc; should be tried before -Kthread and
+# other compiler flags to prevent continual compiler warnings
+# -Kthread: Sequent (threads in libc, but -Kthread needed for pthread.h)
+# -pthread: Linux/gcc (kernel threads), BSD/gcc (userland threads), Tru64
+# (Note: HP C rejects this with "bad form for `-t' option")
+# -pthreads: Solaris/gcc (Note: HP C also rejects)
+# -mt: Sun Workshop C (may only link SunOS threads [-lthread], but it
+# doesn't hurt to check since this sometimes defines pthreads and
+# -D_REENTRANT too), HP C (must be checked before -lpthread, which
+# is present but should not be used directly; and before -mthreads,
+# because the compiler interprets this as "-mt" + "-hreads")
+# -mthreads: Mingw32/gcc, Lynx/gcc
+# pthread: Linux, etcetera
+# --thread-safe: KAI C++
+# pthread-config: use pthread-config program (for GNU Pth library)
+
+case $host_os in
+
+ freebsd*)
+
+ # -kthread: FreeBSD kernel threads (preferred to -pthread since SMP-able)
+ # lthread: LinuxThreads port on FreeBSD (also preferred to -pthread)
+
+ ax_pthread_flags="-kthread lthread $ax_pthread_flags"
+ ;;
+
+ hpux*)
+
+ # From the cc(1) man page: "[-mt] Sets various -D flags to enable
+ # multi-threading and also sets -lpthread."
+
+ ax_pthread_flags="-mt -pthread pthread $ax_pthread_flags"
+ ;;
+
+ openedition*)
+
+ # IBM z/OS requires a feature-test macro to be defined in order to
+ # enable POSIX threads at all, so give the user a hint if this is
+ # not set. (We don't define these ourselves, as they can affect
+ # other portions of the system API in unpredictable ways.)
+
+ AC_EGREP_CPP([AX_PTHREAD_ZOS_MISSING],
+ [
+# if !defined(_OPEN_THREADS) && !defined(_UNIX03_THREADS)
+ AX_PTHREAD_ZOS_MISSING
+# endif
+ ],
+ [AC_MSG_WARN([IBM z/OS requires -D_OPEN_THREADS or -D_UNIX03_THREADS to enable pthreads support.])])
+ ;;
+
+ solaris*)
+
+ # On Solaris (at least, for some versions), libc contains stubbed
+ # (non-functional) versions of the pthreads routines, so link-based
+ # tests will erroneously succeed. (N.B.: The stubs are missing
+ # pthread_cleanup_push, or rather a function called by this macro,
+ # so we could check for that, but who knows whether they'll stub
+ # that too in a future libc.) So we'll check first for the
+ # standard Solaris way of linking pthreads (-mt -lpthread).
+
+ ax_pthread_flags="-mt,-lpthread pthread $ax_pthread_flags"
+ ;;
+esac
+
+# Are we compiling with Clang?
+
+AC_CACHE_CHECK([whether $CC is Clang],
+ [ax_cv_PTHREAD_CLANG],
+ [ax_cv_PTHREAD_CLANG=no
+ # Note that Autoconf sets GCC=yes for Clang as well as GCC
+ if test "x$GCC" = "xyes"; then
+ AC_EGREP_CPP([AX_PTHREAD_CC_IS_CLANG],
+ [/* Note: Clang 2.7 lacks __clang_[a-z]+__ */
+# if defined(__clang__) && defined(__llvm__)
+ AX_PTHREAD_CC_IS_CLANG
+# endif
+ ],
+ [ax_cv_PTHREAD_CLANG=yes])
+ fi
+ ])
+ax_pthread_clang="$ax_cv_PTHREAD_CLANG"
+
+
+# GCC generally uses -pthread, or -pthreads on some platforms (e.g. SPARC)
+
+# Note that for GCC and Clang -pthread generally implies -lpthread,
+# except when -nostdlib is passed.
+# This is problematic using libtool to build C++ shared libraries with pthread:
+# [1] https://gcc.gnu.org/bugzilla/show_bug.cgi?id=25460
+# [2] https://bugzilla.redhat.com/show_bug.cgi?id=661333
+# [3] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=468555
+# To solve this, first try -pthread together with -lpthread for GCC
+
+AS_IF([test "x$GCC" = "xyes"],
+ [ax_pthread_flags="-pthread,-lpthread -pthread -pthreads $ax_pthread_flags"])
+
+# Clang takes -pthread (never supported any other flag), but we'll try with -lpthread first
+
+AS_IF([test "x$ax_pthread_clang" = "xyes"],
+ [ax_pthread_flags="-pthread,-lpthread -pthread"])
+
+
+# The presence of a feature test macro requesting re-entrant function
+# definitions is, on some systems, a strong hint that pthreads support is
+# correctly enabled
+
+case $host_os in
+ darwin* | hpux* | linux* | osf* | solaris*)
+ ax_pthread_check_macro="_REENTRANT"
+ ;;
+
+ aix*)
+ ax_pthread_check_macro="_THREAD_SAFE"
+ ;;
+
+ *)
+ ax_pthread_check_macro="--"
+ ;;
+esac
+AS_IF([test "x$ax_pthread_check_macro" = "x--"],
+ [ax_pthread_check_cond=0],
+ [ax_pthread_check_cond="!defined($ax_pthread_check_macro)"])
+
+
+if test "x$ax_pthread_ok" = "xno"; then
+for ax_pthread_try_flag in $ax_pthread_flags; do
+
+ case $ax_pthread_try_flag in
+ none)
+ AC_MSG_CHECKING([whether pthreads work without any flags])
+ ;;
+
+ *,*)
+ PTHREAD_CFLAGS=`echo $ax_pthread_try_flag | sed "s/^\(.*\),\(.*\)$/\1/"`
+ PTHREAD_LIBS=`echo $ax_pthread_try_flag | sed "s/^\(.*\),\(.*\)$/\2/"`
+ AC_MSG_CHECKING([whether pthreads work with "$PTHREAD_CFLAGS" and "$PTHREAD_LIBS"])
+ ;;
+
+ -*)
+ AC_MSG_CHECKING([whether pthreads work with $ax_pthread_try_flag])
+ PTHREAD_CFLAGS="$ax_pthread_try_flag"
+ ;;
+
+ pthread-config)
+ AC_CHECK_PROG([ax_pthread_config], [pthread-config], [yes], [no])
+ AS_IF([test "x$ax_pthread_config" = "xno"], [continue])
+ PTHREAD_CFLAGS="`pthread-config --cflags`"
+ PTHREAD_LIBS="`pthread-config --ldflags` `pthread-config --libs`"
+ ;;
+
+ *)
+ AC_MSG_CHECKING([for the pthreads library -l$ax_pthread_try_flag])
+ PTHREAD_LIBS="-l$ax_pthread_try_flag"
+ ;;
+ esac
+
+ ax_pthread_save_CFLAGS="$CFLAGS"
+ ax_pthread_save_LIBS="$LIBS"
+ CFLAGS="$CFLAGS $PTHREAD_CFLAGS"
+ LIBS="$PTHREAD_LIBS $LIBS"
+
+ # Check for various functions. We must include pthread.h,
+ # since some functions may be macros. (On the Sequent, we
+ # need a special flag -Kthread to make this header compile.)
+ # We check for pthread_join because it is in -lpthread on IRIX
+ # while pthread_create is in libc. We check for pthread_attr_init
+ # due to DEC craziness with -lpthreads. We check for
+ # pthread_cleanup_push because it is one of the few pthread
+ # functions on Solaris that doesn't have a non-functional libc stub.
+ # We try pthread_create on general principles.
+
+ AC_LINK_IFELSE([AC_LANG_PROGRAM([#include <pthread.h>
+# if $ax_pthread_check_cond
+# error "$ax_pthread_check_macro must be defined"
+# endif
+ static void *some_global = NULL;
+ static void routine(void *a)
+ {
+ /* To avoid any unused-parameter or
+ unused-but-set-parameter warning. */
+ some_global = a;
+ }
+ static void *start_routine(void *a) { return a; }],
+ [pthread_t th; pthread_attr_t attr;
+ pthread_create(&th, 0, start_routine, 0);
+ pthread_join(th, 0);
+ pthread_attr_init(&attr);
+ pthread_cleanup_push(routine, 0);
+ pthread_cleanup_pop(0) /* ; */])],
+ [ax_pthread_ok=yes],
+ [])
+
+ CFLAGS="$ax_pthread_save_CFLAGS"
+ LIBS="$ax_pthread_save_LIBS"
+
+ AC_MSG_RESULT([$ax_pthread_ok])
+ AS_IF([test "x$ax_pthread_ok" = "xyes"], [break])
+
+ PTHREAD_LIBS=""
+ PTHREAD_CFLAGS=""
+done
+fi
+
+
+# Clang needs special handling, because older versions handle the -pthread
+# option in a rather... idiosyncratic way
+
+if test "x$ax_pthread_clang" = "xyes"; then
+
+ # Clang takes -pthread; it has never supported any other flag
+
+ # (Note 1: This will need to be revisited if a system that Clang
+ # supports has POSIX threads in a separate library. This tends not
+ # to be the way of modern systems, but it's conceivable.)
+
+ # (Note 2: On some systems, notably Darwin, -pthread is not needed
+ # to get POSIX threads support; the API is always present and
+ # active. We could reasonably leave PTHREAD_CFLAGS empty. But
+ # -pthread does define _REENTRANT, and while the Darwin headers
+ # ignore this macro, third-party headers might not.)
+
+ # However, older versions of Clang make a point of warning the user
+ # that, in an invocation where only linking and no compilation is
+ # taking place, the -pthread option has no effect ("argument unused
+ # during compilation"). They expect -pthread to be passed in only
+ # when source code is being compiled.
+ #
+ # Problem is, this is at odds with the way Automake and most other
+ # C build frameworks function, which is that the same flags used in
+ # compilation (CFLAGS) are also used in linking. Many systems
+ # supported by AX_PTHREAD require exactly this for POSIX threads
+ # support, and in fact it is often not straightforward to specify a
+ # flag that is used only in the compilation phase and not in
+ # linking. Such a scenario is extremely rare in practice.
+ #
+ # Even though use of the -pthread flag in linking would only print
+ # a warning, this can be a nuisance for well-run software projects
+ # that build with -Werror. So if the active version of Clang has
+ # this misfeature, we search for an option to squash it.
+
+ AC_CACHE_CHECK([whether Clang needs flag to prevent "argument unused" warning when linking with -pthread],
+ [ax_cv_PTHREAD_CLANG_NO_WARN_FLAG],
+ [ax_cv_PTHREAD_CLANG_NO_WARN_FLAG=unknown
+ # Create an alternate version of $ac_link that compiles and
+ # links in two steps (.c -> .o, .o -> exe) instead of one
+ # (.c -> exe), because the warning occurs only in the second
+ # step
+ ax_pthread_save_ac_link="$ac_link"
+ ax_pthread_sed='s/conftest\.\$ac_ext/conftest.$ac_objext/g'
+ ax_pthread_link_step=`AS_ECHO(["$ac_link"]) | sed "$ax_pthread_sed"`
+ ax_pthread_2step_ac_link="($ac_compile) && (echo ==== >&5) && ($ax_pthread_link_step)"
+ ax_pthread_save_CFLAGS="$CFLAGS"
+ for ax_pthread_try in '' -Qunused-arguments -Wno-unused-command-line-argument unknown; do
+ AS_IF([test "x$ax_pthread_try" = "xunknown"], [break])
+ CFLAGS="-Werror -Wunknown-warning-option $ax_pthread_try -pthread $ax_pthread_save_CFLAGS"
+ ac_link="$ax_pthread_save_ac_link"
+ AC_LINK_IFELSE([AC_LANG_SOURCE([[int main(void){return 0;}]])],
+ [ac_link="$ax_pthread_2step_ac_link"
+ AC_LINK_IFELSE([AC_LANG_SOURCE([[int main(void){return 0;}]])],
+ [break])
+ ])
+ done
+ ac_link="$ax_pthread_save_ac_link"
+ CFLAGS="$ax_pthread_save_CFLAGS"
+ AS_IF([test "x$ax_pthread_try" = "x"], [ax_pthread_try=no])
+ ax_cv_PTHREAD_CLANG_NO_WARN_FLAG="$ax_pthread_try"
+ ])
+
+ case "$ax_cv_PTHREAD_CLANG_NO_WARN_FLAG" in
+ no | unknown) ;;
+ *) PTHREAD_CFLAGS="$ax_cv_PTHREAD_CLANG_NO_WARN_FLAG $PTHREAD_CFLAGS" ;;
+ esac
+
+fi # $ax_pthread_clang = yes
+
+
+
+# Various other checks:
+if test "x$ax_pthread_ok" = "xyes"; then
+ ax_pthread_save_CFLAGS="$CFLAGS"
+ ax_pthread_save_LIBS="$LIBS"
+ CFLAGS="$CFLAGS $PTHREAD_CFLAGS"
+ LIBS="$PTHREAD_LIBS $LIBS"
+
+ # Detect AIX lossage: JOINABLE attribute is called UNDETACHED.
+ AC_CACHE_CHECK([for joinable pthread attribute],
+ [ax_cv_PTHREAD_JOINABLE_ATTR],
+ [ax_cv_PTHREAD_JOINABLE_ATTR=unknown
+ for ax_pthread_attr in PTHREAD_CREATE_JOINABLE PTHREAD_CREATE_UNDETACHED; do
+ AC_LINK_IFELSE([AC_LANG_PROGRAM([#include <pthread.h>],
+ [int attr = $ax_pthread_attr; return attr /* ; */])],
+ [ax_cv_PTHREAD_JOINABLE_ATTR=$ax_pthread_attr; break],
+ [])
+ done
+ ])
+ AS_IF([test "x$ax_cv_PTHREAD_JOINABLE_ATTR" != "xunknown" && \
+ test "x$ax_cv_PTHREAD_JOINABLE_ATTR" != "xPTHREAD_CREATE_JOINABLE" && \
+ test "x$ax_pthread_joinable_attr_defined" != "xyes"],
+ [AC_DEFINE_UNQUOTED([PTHREAD_CREATE_JOINABLE],
+ [$ax_cv_PTHREAD_JOINABLE_ATTR],
+ [Define to necessary symbol if this constant
+ uses a non-standard name on your system.])
+ ax_pthread_joinable_attr_defined=yes
+ ])
+
+ AC_CACHE_CHECK([whether more special flags are required for pthreads],
+ [ax_cv_PTHREAD_SPECIAL_FLAGS],
+ [ax_cv_PTHREAD_SPECIAL_FLAGS=no
+ case $host_os in
+ solaris*)
+ ax_cv_PTHREAD_SPECIAL_FLAGS="-D_POSIX_PTHREAD_SEMANTICS"
+ ;;
+ esac
+ ])
+ AS_IF([test "x$ax_cv_PTHREAD_SPECIAL_FLAGS" != "xno" && \
+ test "x$ax_pthread_special_flags_added" != "xyes"],
+ [PTHREAD_CFLAGS="$ax_cv_PTHREAD_SPECIAL_FLAGS $PTHREAD_CFLAGS"
+ ax_pthread_special_flags_added=yes])
+
+ AC_CACHE_CHECK([for PTHREAD_PRIO_INHERIT],
+ [ax_cv_PTHREAD_PRIO_INHERIT],
+ [AC_LINK_IFELSE([AC_LANG_PROGRAM([[#include <pthread.h>]],
+ [[int i = PTHREAD_PRIO_INHERIT;
+ return i;]])],
+ [ax_cv_PTHREAD_PRIO_INHERIT=yes],
+ [ax_cv_PTHREAD_PRIO_INHERIT=no])
+ ])
+ AS_IF([test "x$ax_cv_PTHREAD_PRIO_INHERIT" = "xyes" && \
+ test "x$ax_pthread_prio_inherit_defined" != "xyes"],
+ [AC_DEFINE([HAVE_PTHREAD_PRIO_INHERIT], [1], [Have PTHREAD_PRIO_INHERIT.])
+ ax_pthread_prio_inherit_defined=yes
+ ])
+
+ CFLAGS="$ax_pthread_save_CFLAGS"
+ LIBS="$ax_pthread_save_LIBS"
+
+ # More AIX lossage: compile with *_r variant
+ if test "x$GCC" != "xyes"; then
+ case $host_os in
+ aix*)
+ AS_CASE(["x/$CC"],
+ [x*/c89|x*/c89_128|x*/c99|x*/c99_128|x*/cc|x*/cc128|x*/xlc|x*/xlc_v6|x*/xlc128|x*/xlc128_v6],
+ [#handle absolute path differently from PATH based program lookup
+ AS_CASE(["x$CC"],
+ [x/*],
+ [
+ AS_IF([AS_EXECUTABLE_P([${CC}_r])],[PTHREAD_CC="${CC}_r"])
+ AS_IF([test "x${CXX}" != "x"], [AS_IF([AS_EXECUTABLE_P([${CXX}_r])],[PTHREAD_CXX="${CXX}_r"])])
+ ],
+ [
+ AC_CHECK_PROGS([PTHREAD_CC],[${CC}_r],[$CC])
+ AS_IF([test "x${CXX}" != "x"], [AC_CHECK_PROGS([PTHREAD_CXX],[${CXX}_r],[$CXX])])
+ ]
+ )
+ ])
+ ;;
+ esac
+ fi
+fi
+
+test -n "$PTHREAD_CC" || PTHREAD_CC="$CC"
+test -n "$PTHREAD_CXX" || PTHREAD_CXX="$CXX"
+
+AC_SUBST([PTHREAD_LIBS])
+AC_SUBST([PTHREAD_CFLAGS])
+AC_SUBST([PTHREAD_CC])
+AC_SUBST([PTHREAD_CXX])
+
+# Finally, execute ACTION-IF-FOUND/ACTION-IF-NOT-FOUND:
+if test "x$ax_pthread_ok" = "xyes"; then
+ ifelse([$1],,[AC_DEFINE([HAVE_PTHREAD],[1],[Define if you have POSIX threads libraries and header files.])],[$1])
+ :
+else
+ ax_pthread_ok=no
+ $2
+fi
+AC_LANG_POP
+])dnl AX_PTHREAD
diff --git a/maint/margo.pc.in b/maint/margo.pc.in
index fd82d3cb..0f65cbbe 100644
--- a/maint/margo.pc.in
+++ b/maint/margo.pc.in
@@ -8,5 +8,5 @@ Description: Argobots bindings for Mercury RPC library
Version: @PACKAGE_VERSION@
URL: https://xgitlab.cels.anl.gov/sds/margo
Requires: @PC_REQUIRES@
-Libs: -L${libdir} -lmargo
-Cflags: -I${includedir}
+Libs: -L${libdir} -lmargo @PTHREAD_LIBS@
+Cflags: -I${includedir} @PTHREAD_CFLAGS@

View File

@@ -65,6 +65,10 @@ class MochiMargo(AutotoolsPackage):
depends_on("mercury@1.0.0:", type=("build", "link", "run"), when="@:0.5.1")
depends_on("mercury@2.0.0:", type=("build", "link", "run"), when="@0.5.2:")
# Fix pthread detection
# https://github.com/mochi-hpc/mochi-margo/pull/177
patch("mochi-margo-pthreads.patch", when="@0.9,0.9.1:0.9.7")
def autoreconf(self, spec, prefix):
sh = which("sh")
sh("./prepare.sh")

View File

@@ -15,8 +15,6 @@ class Mumax(MakefilePackage, CudaPackage):
homepage = "https://mumax.github.io"
url = "https://github.com/mumax/3/archive/v3.10.tar.gz"
maintainers("glennpj")
version(
"3.10",
sha256="42c858661cec3896685ff4babea11e711f71fd6ea37d20c2bed7e4a918702caa",

View File

@@ -17,6 +17,8 @@ class Nccl(MakefilePackage, CudaPackage):
maintainers("adamjstewart")
libraries = ["libnccl.so"]
version("2.18.1-1", sha256="0e4ede5cf8df009bff5aeb3a9f194852c03299ae5664b5a425b43358e7a9eef2")
version("2.17.1-1", sha256="1311a6fd7cd44ad6d4523ba03065ce694605843fd30a5c0f77aa3d911abe706d")
version("2.16.2-1", sha256="7f7c738511a8876403fc574d13d48e7c250d934d755598d82e14bab12236fc64")
version("2.15.5-1", sha256="f4ac3c74d469c9cd718f82e1477759785db9b9f8cc9d9ecc103485805b8394a3")
version("2.14.3-1", sha256="0fffb6f91e029ea4d95efabd7bddc6eecf8bf136e4f46bf812bff7d8eee53c79")

View File

@@ -6,22 +6,23 @@
from spack.package import *
class Nlcglib(CMakePackage, CudaPackage):
class Nlcglib(CMakePackage, CudaPackage, ROCmPackage):
"""Nonlinear CG methods for wave-function optimization in DFT."""
homepage = "https://github.com/simonpintarelli/nlcglib"
git = "https://github.com/simonpintarelli/nlcglib.git"
url = "https://github.com/simonpintarelli/nlcglib/archive/v0.9.tar.gz"
maintainers("simonpintarelli")
maintainers = ["simonpintarelli"]
version("master", branch="master")
version("develop", branch="develop")
version("master", branch="master")
version("1.0b", sha256="086c46f06a117f267cbdf1df4ad42a8512689a9610885763f463469fb15e82dc")
version("0.9", sha256="8d5bc6b85ee714fb3d6480f767e7f43e5e7d569116cf60e48f533a7f50a37a08")
variant("wrapper", default=False, description="Use nvcc-wrapper for CUDA build")
variant("openmp", default=False)
variant("openmp", default=True)
variant("tests", default=False)
variant(
"build_type",
default="Release",
@@ -29,38 +30,73 @@ class Nlcglib(CMakePackage, CudaPackage):
values=("Debug", "Release", "RelWithDebInfo"),
)
depends_on("cmake@3.21:", type="build")
depends_on("mpi")
depends_on("lapack")
depends_on("kokkos +cuda~cuda_relocatable_device_code+cuda_lambda")
depends_on("kokkos-nvcc-wrapper", when="+wrapper")
depends_on("kokkos +cuda~cuda_relocatable_device_code+cuda_lambda+wrapper", when="+wrapper")
depends_on("cmake@3.15:", type="build")
depends_on(
"kokkos+cuda~cuda_relocatable_device_code+cuda_lambda+openmp+wrapper",
when="+openmp+wrapper",
)
depends_on("kokkos~cuda~rocm", when="~cuda~rocm")
depends_on("kokkos+openmp", when="+openmp")
depends_on("googletest", type="build", when="+tests")
depends_on("nlohmann-json")
with when("@:0.9"):
conflicts("+rocm")
conflicts("^kokkos@4:")
with when("+rocm"):
variant("magma", default=True, description="Use magma eigenvalue solver (AMDGPU)")
depends_on("magma+rocm", when="+magma")
depends_on("kokkos+rocm")
depends_on("rocblas")
depends_on("rocsolver")
with when("+cuda"):
depends_on("kokkos+cuda+cuda_lambda+wrapper", when="%gcc")
depends_on("kokkos+cuda")
def cmake_args(self):
options = []
options = [
self.define_from_variant("USE_OPENMP", "openmp"),
self.define_from_variant("BUILD_TESTS", "tests"),
self.define_from_variant("USE_ROCM", "rocm"),
self.define_from_variant("USE_MAGMA", "magma"),
self.define_from_variant("USE_CUDA", "cuda"),
]
if "+openmp" in self.spec:
options.append("-DUSE_OPENMP=On")
else:
options.append("-DUSE_OPENMP=Off")
if self.spec["blas"].name in ["intel-mkl", "intel-parallel-studio"]:
options.append("-DLAPACK_VENDOR=MKL")
options += [self.define("LAPACK_VENDOR", "MKL")]
elif self.spec["blas"].name in ["intel-oneapi-mkl"]:
options += [self.define("LAPACK_VENDOR", "MKLONEAPI")]
elif self.spec["blas"].name in ["openblas"]:
options.append("-DLAPACK_VENDOR=OpenBLAS")
options += [self.define("LAPACK_VENDOR", "OpenBLAS")]
else:
raise Exception("blas/lapack must be either openblas or mkl.")
options.append("-DBUILD_TESTS=OFF")
if "+wrapper" in self.spec:
options.append("-DCMAKE_CXX_COMPILER=%s" % self.spec["kokkos-nvcc-wrapper"].kokkos_cxx)
if "+cuda%gcc" in self.spec:
options += [
self.define(
"CMAKE_CXX_COMPILER", "{0}".format(self.spec["kokkos-nvcc-wrapper"].kokkos_cxx)
)
]
if "+cuda" in self.spec:
cuda_arch = self.spec.variants["cuda_arch"].value
if cuda_arch[0] != "none":
options += ["-DCMAKE_CUDA_FLAGS=-arch=sm_{0}".format(cuda_arch[0])]
cuda_archs = self.spec.variants["cuda_arch"].value
if "@:0.9" in self.spec:
cuda_flags = " ".join(
["-gencode arch=compute_{0},code=sm_{0}".format(x) for x in cuda_archs]
)
options += [self.define("CMAKE_CUDA_FLAGS", cuda_flags)]
else:
options += [self.define("CMAKE_CUDA_ARCHITECTURES", cuda_archs)]
if "^cuda+allow-unsupported-compilers" in self.spec:
options += [self.define("CMAKE_CUDA_FLAGS", "--allow-unsupported-compiler")]
if "+rocm" in self.spec:
options.append(self.define("CMAKE_CXX_COMPILER", self.spec["hip"].hipcc))
archs = ",".join(self.spec.variants["amdgpu_target"].value)
options.append("-DHIP_HCC_FLAGS=--amdgpu-target={0}".format(archs))
options.append(
"-DCMAKE_CXX_FLAGS=--amdgpu-target={0} --offload-arch={0}".format(archs)
)
return options
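
Aside: the rewrite above replaces hand-built "-D..." strings with the CMakePackage helpers. A short sketch, using a hypothetical "Demo" package, of what those helpers produce:

from spack.package import *

class Demo(CMakePackage):
    """Hypothetical package illustrating define()/define_from_variant()."""

    variant("openmp", default=True, description="Build with OpenMP")
    variant("tests", default=False, description="Build unit tests")

    def cmake_args(self):
        return [
            # Boolean variants become ON/OFF cache entries automatically.
            self.define_from_variant("USE_OPENMP", "openmp"),  # -DUSE_OPENMP=ON or OFF
            self.define_from_variant("BUILD_TESTS", "tests"),
            # Fixed values are formatted for you: -DLAPACK_VENDOR=OpenBLAS
            self.define("LAPACK_VENDOR", "OpenBLAS"),
        ]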

View File

@@ -16,7 +16,7 @@ class Opencv(CMakePackage, CudaPackage):
url = "https://github.com/opencv/opencv/archive/4.5.0.tar.gz"
git = "https://github.com/opencv/opencv.git"
maintainers("bvanessen", "adamjstewart", "glennpj")
maintainers("bvanessen", "adamjstewart")
version("master", branch="master")
version("4.6.0", sha256="1ec1cba65f9f20fe5a41fda1586e01c70ea0c9a6d7b67c9e13edf0cfe2239277")

View File

@@ -217,6 +217,12 @@ class Paraview(CMakePackage, CudaPackage, ROCmPackage):
depends_on("netcdf-c")
depends_on("pegtl")
depends_on("protobuf@3.4:")
# Paraview 5.10 can't build with protobuf > 3.18
# https://github.com/spack/spack/issues/37437
depends_on("protobuf@3.4:3.18", when="@:5.10%oneapi")
depends_on("protobuf@3.4:3.18", when="@:5.10%intel@2021:")
depends_on("protobuf@3.4:3.18", when="@:5.10%xl")
depends_on("protobuf@3.4:3.18", when="@:5.10%xl_r")
depends_on("libxml2")
depends_on("lz4")
depends_on("xz")

View File

@@ -12,7 +12,9 @@ class PyAlabaster(PythonPackage):
homepage = "https://alabaster.readthedocs.io/"
pypi = "alabaster/alabaster-0.7.10.tar.gz"
git = "https://github.com/sphinx-doc/alabaster.git"
version("0.7.13", sha256="a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2")
version("0.7.12", sha256="a661d72d58e6ea8a57f7a86e37d86716863ee5e92788398526d58b26a4e4dc02")
version("0.7.10", sha256="37cdcb9e9954ed60912ebc1ca12a9d12178c26637abdf124e3cde2341c257fe0")
version("0.7.9", sha256="47afd43b08a4ecaa45e3496e139a193ce364571e7e10c6a87ca1a4c57eb7ea08")

View File

@@ -13,6 +13,7 @@ class PyAnyio(PythonPackage):
homepage = "https://github.com/agronholm/anyio"
pypi = "anyio/anyio-3.2.1.tar.gz"
version("3.6.2", sha256="25ea0d673ae30af41a0c442f81cf3b38c7e79fdc7b60335a4c14e05eb0947421")
version("3.6.1", sha256="413adf95f93886e442aea925f3ee43baa5a765a64a0f52c6081894f9992fdd0b")
version("3.5.0", sha256="a0aeffe2fb1fdf374a8e4b471444f0f3ac4fb9f5a5b542b48824475e0042a5a6")
version("3.3.4", sha256="67da67b5b21f96b9d3d65daa6ea99f5d5282cb09f50eb4456f8fb51dffefc3ff")
@@ -20,8 +21,9 @@ class PyAnyio(PythonPackage):
depends_on("python@3.6.2:", type=("build", "run"))
depends_on("py-setuptools@42:", type="build")
depends_on("py-setuptools-scm+toml@3.4:", type="build")
depends_on("py-wheel@0.29:", type="build")
depends_on("py-setuptools-scm+toml@3.4:", type="build")
depends_on("py-idna@2.8:", type=("build", "run"))
depends_on("py-sniffio@1.1:", type=("build", "run"))
depends_on("py-typing-extensions", when="^python@:3.7", type=("build", "run"))

View File

@@ -12,14 +12,17 @@ class PyArgcomplete(PythonPackage):
homepage = "https://github.com/kislyuk/argcomplete"
pypi = "argcomplete/argcomplete-1.12.0.tar.gz"
version("3.0.8", sha256="b9ca96448e14fa459d7450a4ab5a22bbf9cee4ba7adddf03e65c398b5daeea28")
version("2.0.0", sha256="6372ad78c89d662035101418ae253668445b391755cfe94ea52f1b9d22425b20")
version("1.12.3", sha256="2c7dbffd8c045ea534921e63b0be6fe65e88599990d8dc408ac8c542b72a5445")
version("1.12.0", sha256="2fbe5ed09fd2c1d727d4199feca96569a5b50d44c71b16da9c742201f7cc295c")
version("1.1.1", sha256="cca45b5fe07000994f4f06a0b95bd71f7b51b04f81c3be0b4ea7b666e4f1f084")
depends_on("python@3.6:", when="@2:", type=("build", "run"))
depends_on("py-setuptools", type="build")
depends_on("py-importlib-metadata@0.23:4", when="@1.12.3: ^python@:3.7", type=("build", "run"))
depends_on("py-importlib-metadata@0.23:6", when="@3.0.6: ^python@:3.7", type=("build", "run"))
depends_on(
"py-importlib-metadata@0.23:4", when="@1.12.3:2 ^python@:3.7", type=("build", "run")
)
depends_on("py-importlib-metadata@0.23:3", when="@1.12.2 ^python@:3.7", type=("build", "run"))
depends_on("py-importlib-metadata@0.23:2", when="@1.12.1 ^python@:3.7", type=("build", "run"))
depends_on("py-importlib-metadata@0.23:1", when="@1.12.0 ^python@:3.7", type=("build", "run"))

View File

@@ -12,6 +12,7 @@ class PyAsttokens(PythonPackage):
homepage = "https://github.com/gristlabs/asttokens"
pypi = "asttokens/asttokens-2.0.5.tar.gz"
version("2.2.1", sha256="4622110b2a6f30b77e1473affaa97e711bc2f07d3f10848420ff1898edbe94f3")
version("2.0.8", sha256="c61e16246ecfb2cde2958406b4c8ebc043c9e6d73aaa83c941673b35e5d3a76b")
version("2.0.5", sha256="9a54c114f02c7a9480d56550932546a3f1fe71d8a02f1bc7ccd0ee3ee35cf4d5")

View File

@@ -13,6 +13,7 @@ class PyAttrs(PythonPackage):
pypi = "attrs/attrs-20.3.0.tar.gz"
git = "https://github.com/python-attrs/attrs"
version("23.1.0", sha256="6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015")
version("22.2.0", sha256="c9227bfc2f01993c03f68db37d1d15c9690188323c067c641f1a35ca58185f99")
version("22.1.0", sha256="29adc2665447e5191d0e7c568fde78b21f9672d344281d0c6e1ab085429b22b6")
version("21.4.0", sha256="626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd")
@@ -26,5 +27,12 @@ class PyAttrs(PythonPackage):
version("18.1.0", sha256="e0d0eb91441a3b53dab4d9b743eafc1ac44476296a2053b6ca3af0b139faf87b")
version("16.3.0", sha256="80203177723e36f3bbe15aa8553da6e80d47bfe53647220ccaa9ad7a5e473ccc")
depends_on("py-setuptools@40.6.0:", when="@19.1.0:", type="build")
depends_on("py-setuptools", type="build")
depends_on("py-hatchling", when="@23.1:", type="build")
depends_on("py-hatch-vcs", when="@23.1:", type="build")
depends_on("py-hatch-fancy-pypi-readme", when="@23.1:", type="build")
with when("@:22.2.0"):
depends_on("py-setuptools@40.6.0:", when="@19.1", type="build")
depends_on("py-setuptools", type="build")
depends_on("py-importlib-metadata", when="@23.1: ^python@3.7", type=("build", "run"))

View File

@@ -15,6 +15,7 @@ class PyBabel(PythonPackage):
pypi = "Babel/Babel-2.7.0.tar.gz"
git = "https://github.com/python-babel/babel"
version("2.12.1", sha256="cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455")
version("2.10.3", sha256="7614553711ee97490f732126dc077f8d0ae084ebc6a96e23db1482afabdb2c51")
version("2.9.1", sha256="bc0c176f9f6a994582230df350aa6e05ba2ebe4b3ac317eab29d9be5d2768da0")
version("2.7.0", sha256="e86135ae101e31e2c8ec20a4e0c5220f4eed12487d5cf3f78be7e98d3a57fc28")
@@ -22,7 +23,6 @@ class PyBabel(PythonPackage):
version("2.4.0", sha256="8c98f5e5f8f5f088571f2c6bd88d530e331cbbcb95a7311a0db69d3dca7ec563")
version("2.3.4", sha256="c535c4403802f6eb38173cd4863e419e2274921a01a8aad8a5b497c131c62875")
depends_on("python@3.6:", when="@2.10:", type=("build", "run"))
depends_on("python@2.7:2.8,3.4:", type=("build", "run"))
depends_on("py-setuptools", type=("build", "run"))
depends_on("py-pytz@2015.7:", type=("build", "run"))
depends_on("py-pytz@2015.7:", when="@2.12: ^python@:3.8", type=("build", "run"))
depends_on("py-pytz@2015.7:", when="@:2.10", type=("build", "run"))

View File

@@ -12,6 +12,7 @@ class PyBidsValidator(PythonPackage):
homepage = "https://github.com/bids-standard/bids-validator"
pypi = "bids-validator/bids-validator-1.7.2.tar.gz"
version("1.11.0", sha256="408c56748b7cf98cf7c31822f33a8d89c5e6e7db5254c345107e8d527576ff53")
version("1.9.8", sha256="ff39799bb205f92d6f2c322f0b8eff0d1c0288f4291a0b18fce61afa4dfd7f3e")
version("1.9.4", sha256="4bf07d375f231a2ad2f450beeb3ef6c54f93194fd993aa5157d57a8fba48ed50")
version("1.8.9", sha256="01fcb5a8fe6de1280cdfd5b37715103ffa0bafb3c739ca7f5ffc41e46549612e")

View File

@@ -12,6 +12,7 @@ class PyBleach(PythonPackage):
homepage = "https://github.com/mozilla/bleach"
pypi = "bleach/bleach-3.1.0.tar.gz"
version("6.0.0", sha256="1a1a85c1595e07d8db14c5f09f09e6433502c51c595970edc090551f0db99414")
version("5.0.1", sha256="0d03255c47eb9bd2f26aa9bb7f2107732e7e8fe195ca2f64709fcf3b0a4a085c")
version("4.1.0", sha256="0900d8b37eba61a802ee40ac0061f8c2b5dee29c1927dd1d233e075ebf5a71da")
version("4.0.0", sha256="ffa9221c6ac29399cc50fcc33473366edd0cf8d5e2cbbbb63296dc327fb67cc8")
@@ -19,10 +20,6 @@ class PyBleach(PythonPackage):
version("3.1.0", sha256="3fdf7f77adcf649c9911387df51254b813185e32b2c6619f690b593a617e19fa")
version("1.5.0", sha256="978e758599b54cd3caa2e160d74102879b230ea8dc93871d0783721eef58bc65")
depends_on("python@3.7:", when="@5:", type=("build", "run"))
depends_on("python@3.6:", when="@4:", type=("build", "run"))
depends_on("python@2.7:2,3.5:", when="@3.1.3:", type=("build", "run"))
depends_on("python@2.7:2,3.4:", type=("build", "run"))
depends_on("py-setuptools", type=("build", "run"))
depends_on("py-six@1.9.0:", type=("build", "run"))
depends_on("py-webencodings", type=("build", "run"))

View File

@@ -12,6 +12,7 @@ class PyBottleneck(PythonPackage):
homepage = "https://github.com/pydata/bottleneck"
pypi = "Bottleneck/Bottleneck-1.0.0.tar.gz"
version("1.3.7", sha256="e1467e373ad469da340ed0ff283214d6531cc08bfdca2083361a3aa6470681f8")
version("1.3.5", sha256="2c0d27afe45351f6f421893362621804fa7dea14fe29a78eaa52d4323f646de7")
version("1.3.2", sha256="20179f0b66359792ea283b69aa16366419132f3b6cf3adadc0c48e2e8118e573")
version("1.3.1", sha256="451586370462cb623d6ad604a545d1e97fb51d2ab5252b1ac57350a83e494a28")

View File

@@ -14,6 +14,7 @@ class PyCertifi(PythonPackage):
homepage = "https://github.com/certifi/python-certifi"
pypi = "certifi/certifi-2020.6.20.tar.gz"
version("2023.5.7", sha256="0f0d56dc5a6ad56fd4ba36484d6cc34451e1c6548c61daad8c320169f91eddc7")
version("2022.12.7", sha256="35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3")
version("2022.9.14", sha256="36973885b9542e6bd01dea287b2b4b3b21236307c56324fcc3f1160f2d655ed5")
version("2021.10.8", sha256="78884e7c1d4b00ce3cea67b44566851c4343c120abd683433ce934a68ea58872")
@@ -27,5 +28,4 @@ class PyCertifi(PythonPackage):
version("2017.4.17", sha256="f7527ebf7461582ce95f7a9e03dd141ce810d40590834f4ec20cddd54234c10a")
version("2017.1.23", sha256="81877fb7ac126e9215dfb15bfef7115fdc30e798e0013065158eed0707fd99ce")
depends_on("python@3.6:", when="@2022.05.18.1:", type=("build", "run"))
depends_on("py-setuptools", type="build")

View File

@@ -13,10 +13,9 @@ class PyCharsetNormalizer(PythonPackage):
homepage = "https://github.com/ousret/charset_normalizer"
pypi = "charset-normalizer/charset-normalizer-2.0.7.tar.gz"
version("3.1.0", sha256="34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5")
version("2.1.1", sha256="5a3d016c7c547f69d6f81fb0db9449ce888b418b5b9952cc5e6e66843e9dd845")
version("2.0.12", sha256="2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597")
version("2.0.7", sha256="e019de665e2bcf9c2b64e2e5aa025fa991da8720daa3c1138cadd2fd1856aed0")
depends_on("python@3.6:", when="@2.1:", type=("build", "run"))
depends_on("python@3.5:", type=("build", "run"))
depends_on("py-setuptools", type="build")

View File

@@ -14,6 +14,7 @@ class PyDask(PythonPackage):
maintainers("skosukhin")
version("2023.4.1", sha256="9dc72ebb509f58f3fe518c12dd5a488c67123fdd66ccb0b968b34fd11e512153")
version("2022.10.2", sha256="42cb43f601709575fa46ce09e74bea83fdd464187024f56954e09d9b428ceaab")
version("2021.6.2", sha256="8588fcd1a42224b7cfcd2ebc8ad616734abb6b1a4517efd52d89c7dd66eb91f8")
version("2021.4.1", sha256="195e4eeb154222ea7a1c368119b5f321ee4ec9d78531471fe0145a527f744aa8")
@@ -75,16 +76,24 @@ class PyDask(PythonPackage):
depends_on("python@3.8:", type=("build", "run"), when="@2022.10.2:")
depends_on("py-setuptools", type="build")
depends_on("py-setuptools@62.6:", type="build", when="@2023.4.1:")
depends_on("py-versioneer@0.28+toml", type="build", when="@2023.4.1:")
# Common requirements
depends_on("py-packaging@20:", type="build", when="@2022.10.2:")
depends_on("py-pyyaml", type=("build", "run"), when="@2.17.1:")
depends_on("py-pyyaml@5.3.1:", type=("build", "run"), when="@2022.10.2:")
depends_on("py-cloudpickle@1.1.1:", type=("build", "run"), when="@2021.3.1:")
depends_on("py-cloudpickle@1.5.0:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-fsspec@0.6.0:", type=("build", "run"), when="@2021.3.1:")
depends_on("py-fsspec@2021.09.0:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-toolz@0.8.2:", type=("build", "run"), when="@2021.3.1:")
depends_on("py-toolz@0.10.0:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-partd@0.3.10:", type=("build", "run"), when="@2021.3.1:")
depends_on("py-partd@1.2.0:", type=("build", "run"), when="@2023.4.0:")
depends_on("py-click@7.0:", type=("build", "run"), when="@2022.10.2:")
depends_on("py-click@8.0:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-importlib-metadata@4.13.0:", type=("build", "run"), when="@2023.4.0:")
# Requirements for dask.array
depends_on("py-numpy", type=("build", "run"), when="@:0.17.1 +array")
@@ -94,6 +103,7 @@ class PyDask(PythonPackage):
depends_on("py-numpy@1.15.1:", type=("build", "run"), when="@2020.12.0: +array")
depends_on("py-numpy@1.16.0:", type=("build", "run"), when="@2021.3.1: +array")
depends_on("py-numpy@1.18.0:", type=("build", "run"), when="@2022.10.2: +array")
depends_on("py-numpy@1.21.0:", type=("build", "run"), when="@2023.4.0: +array")
depends_on("py-toolz", type=("build", "run"), when="@:0.6.1 +array")
depends_on("py-toolz@0.7.2:", type=("build", "run"), when="@0.7.0: +array")
@@ -136,6 +146,7 @@ class PyDask(PythonPackage):
depends_on("py-numpy@1.15.1:", type=("build", "run"), when="@2020.12.0: +dataframe")
depends_on("py-numpy@1.16.0:", type=("build", "run"), when="@2021.3.1: +dataframe")
depends_on("py-numpy@1.18.0:", type=("build", "run"), when="@2022.10.2: +dataframe")
depends_on("py-numpy@1.21.0:", type=("build", "run"), when="@2023.4.0: +dataframe")
depends_on("py-pandas@0.16.0:", type=("build", "run"), when="+dataframe")
depends_on("py-pandas@0.18.0:", type=("build", "run"), when="@0.9.0: +dataframe")
@@ -144,6 +155,7 @@ class PyDask(PythonPackage):
depends_on("py-pandas@0.23.0:", type=("build", "run"), when="@2.11.0: +dataframe")
depends_on("py-pandas@0.25.0:", type=("build", "run"), when="@2020.12.0: +dataframe")
depends_on("py-pandas@1.0:", type=("build", "run"), when="@2022.10.2: +dataframe")
depends_on("py-pandas@1.3:", type=("build", "run"), when="@2023.4.0: +dataframe")
depends_on("py-toolz", type=("build", "run"), when="@:0.6.1 +dataframe")
depends_on("py-toolz@0.7.2:", type=("build", "run"), when="@0.7.0: +dataframe")
@@ -193,12 +205,15 @@ class PyDask(PythonPackage):
)
depends_on("py-distributed@2021.6.2", type=("build", "run"), when="@2021.6.2 +distributed")
depends_on("py-distributed@2022.10.2", type=("build", "run"), when="@2022.10.2 +distributed")
depends_on("py-distributed@2023.4.1", type=("build", "run"), when="@2023.4.1 +distributed")
# Requirements for dask.diagnostics
depends_on("py-bokeh@1.0.0:", type=("build", "run"), when="@2.0.0: +diagnostics")
depends_on("py-bokeh@1.0.0:1,2.0.1:", type=("build", "run"), when="@2.26.0: +diagnostics")
depends_on("py-bokeh@2.4.2:2", type=("build", "run"), when="@2022.10.2: +diagnostics")
depends_on("py-bokeh@2.4.2:2", type=("build", "run"), when="@2022.10.2:2023.3 +diagnostics")
depends_on("py-bokeh@2.4.2:", type=("build", "run"), when="@2023.4.0: +diagnostics")
depends_on("py-jinja2", type=("build", "run"), when="@2022.10.2: +diagnostics")
depends_on("py-jinja2@2.10.3", type=("build", "run"), when="@2023.4.0: +diagnostics")
# Requirements for dask.delayed
depends_on("py-cloudpickle@0.2.1:", type=("build", "run"), when="@2.7.0: +delayed")

View File

@@ -30,6 +30,7 @@ class PyDistributed(PythonPackage):
"distributed.diagnostics",
]
version("2023.4.1", sha256="0140376338efdcf8db1d03f7c1fdbb5eab2a337b03e955d927c116824ee94ac5")
version("2022.10.2", sha256="53f0a5bf6efab9a5ab3345cd913f6d3f3d4ea444ee2edbea331c7fef96fd67d0")
version("2022.2.1", sha256="fb62a75af8ef33bbe1aa80a68c01a33a93c1cd5a332dd017ab44955bf7ecf65b")
version("2021.6.2", sha256="d7d112a86ab049dcefa3b21fd1baea4212a2c03d22c24bd55ad38d21a7f5d148")
@@ -50,22 +51,29 @@ class PyDistributed(PythonPackage):
depends_on("python@3.6:", when="@2:", type=("build", "run"))
depends_on("python@3.7:", when="@2021.4.1:", type=("build", "run"))
depends_on("python@3.8:", when="@2022.2.1:", type=("build", "run"))
depends_on("py-setuptools", type=("build", "run"))
depends_on("py-setuptools", type="build")
depends_on("py-setuptools@62.6:", type="build", when="@2023.4.1:")
depends_on("py-versioneer@0.28+toml", type="build", when="@2023.4.1:")
# In Spack py-dask+distributed depends on py-distributed, not the other way around.
# Hence, no need for depends_on("py-dask", ...)
depends_on("py-click@6.6:", type=("build", "run"))
depends_on("py-click@8.0:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-cloudpickle@0.2.2:", type=("build", "run"), when="@:2.16.0")
depends_on("py-cloudpickle@1.3.0:", type=("build", "run"), when="@2.17.0:2.20.0")
depends_on("py-cloudpickle@1.5.0:", type=("build", "run"), when="@2.21.0:")
depends_on("py-jinja2", type=("build", "run"), when="@2022.2.1:")
depends_on("py-jinja2@2.10.3", type=("build", "run"), when="@2023.4.1:")
depends_on("py-locket@1:", type=("build", "run"), when="@2022.2.1:")
depends_on("py-msgpack", type=("build", "run"), when="@:2.10.0")
depends_on("py-msgpack@0.6.0:", type=("build", "run"), when="@2.11.0:")
depends_on("py-msgpack@1.0.0:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-packaging@20.0:", type=("build", "run"), when="@2022.2.1:")
depends_on("py-psutil@5.0:", type=("build", "run"))
depends_on("py-psutil@5.7.0:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-six", type=("build", "run"), when="@:1")
depends_on("py-sortedcontainers@:1,2.0.2:", type=("build", "run"))
depends_on("py-sortedcontainers@2.0.5:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-tblib", type=("build", "run"), when="@:2.10.0")
depends_on("py-tblib@1.6.0:", type=("build", "run"), when="@2.11.0:")
depends_on("py-toolz@0.7.4:", type=("build", "run"), when="@:2.12.0")
@@ -77,8 +85,12 @@ class PyDistributed(PythonPackage):
depends_on("py-tornado@6.0.3:", type=("build", "run"), when="^python@3.8:")
depends_on("py-tornado@6.0.3:6.1", type=("build", "run"), when="@2022.10.2:")
depends_on("py-zict@0.1.3:", type=("build", "run"))
depends_on("py-zict@2.2.0:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-pyyaml", type=("build", "run"))
depends_on("py-pyyaml@5.3.1:", type=("build", "run"), when="@2023.4.1:")
depends_on("py-urllib3", type=("build", "run"), when="@2022.10.2:")
depends_on("py-urllib3@1.24.3:", type=("build", "run"), when="@2023.4.1:")
def patch(self):
filter_file("^dask .*", "", "requirements.txt")
if self.spec.satisfies("@:2023.3"):
filter_file("^dask .*", "", "requirements.txt")

View File

@@ -16,6 +16,7 @@ class PyFiona(PythonPackage):
maintainers("adamjstewart")
version("master", branch="master")
version("1.9.4", sha256="49f18cbcd3b1f97128c1bb038c3451b2e1be25baa52f02ce906c25cf75af95b6")
version("1.9.3", sha256="60f3789ad9633c3a26acf7cbe39e82e3c7a12562c59af1d599fc3e4e8f7f8f25")
version("1.9.2", sha256="f9263c5f97206bf2eb2c010d52e8ffc54e96886b0e698badde25ff109b32952a")
version("1.9.1", sha256="3a3725e94840a387fef48726d60db6a6791563f366939d22378a4661f8941be7")
@@ -51,8 +52,8 @@ class PyFiona(PythonPackage):
depends_on("py-click-plugins@1:", type=("build", "run"))
depends_on("py-cligj@0.5:", type=("build", "run"))
depends_on("py-importlib-metadata", when="@1.9.2: ^python@:3.9", type=("build", "run"))
depends_on("py-munch@2.3.2:", when="@1.9:", type=("build", "run"))
depends_on("py-munch", type=("build", "run"))
depends_on("py-six", when="@1.9.4:", type=("build", "run"))
depends_on("py-six@1.7:", when="@:1.8", type=("build", "run"))
# setup.py or release notes
depends_on("gdal@3.1:", when="@1.9:", type=("build", "link", "run"))
@@ -60,7 +61,8 @@ class PyFiona(PythonPackage):
# Historical dependencies
depends_on("py-setuptools", when="@:1.9.1", type=("build", "run"))
depends_on("py-six@1.7:", when="@:1.8", type=("build", "run"))
depends_on("py-munch@2.3.2:", when="@1.9.0:1.9.3", type=("build", "run"))
depends_on("py-munch", when="@:1.8", type=("build", "run"))
# error: implicit declaration of function 'OSRFixup' is invalid in C99
conflicts("%apple-clang@12:", when="@:1.8.9")

View File

@@ -19,6 +19,7 @@ class PyGmxapi(PythonPackage):
maintainers("eirrgang", "peterkasson")
pypi = "gmxapi/gmxapi-0.4.0.tar.gz"
version("0.4.1", sha256="cc7a2e509ab8a59c187d388dcfd21ea78b785c3b355149b1818085f34dbda62a")
version("0.4.0", sha256="7fd58e6a4b1391043379e8ba55555ebeba255c5b394f5df9d676e6a5571d7eba")
depends_on("gromacs@2022.1:~mdrun_only+shared")
@@ -30,9 +31,9 @@ class PyGmxapi(PythonPackage):
depends_on("py-numpy@1.8:", type=("build", "run"))
depends_on("py-setuptools@42:", type="build")
depends_on("py-packaging", type=("build", "run"))
depends_on("py-pybind11@2.6:", type=("build", "run"))
depends_on("py-pybind11@2.6:", type="build")
depends_on("py-pybind11@2.6:", when="@:0.4", type=("build", "run"))
depends_on("py-pytest@4.6:", type="test")
depends_on("py-wheel", type="build")
def setup_build_environment(self, env):
env.set("GROMACS_DIR", self.spec["gromacs"].prefix)

View File

@@ -14,18 +14,28 @@ class PyHuggingfaceHub(PythonPackage):
homepage = "https://github.com/huggingface/huggingface_hub"
pypi = "huggingface_hub/huggingface_hub-0.0.10.tar.gz"
version("0.14.1", sha256="9ab899af8e10922eac65e290d60ab956882ab0bf643e3d990b1394b6b47b7fbc")
version("0.10.1", sha256="5c188d5b16bec4b78449f8681f9975ff9d321c16046cc29bcf0d7e464ff29276")
version("0.0.10", sha256="556765e4c7edd2d2c4c733809bae1069dca20e10ff043870ec40d53e498efae2")
version("0.0.8", sha256="be5b9a7ed36437bb10a780d500154d426798ec16803ff3406f7a61107e4ebfc2")
depends_on("python@3.7:", when="@0.10:", type=("build", "run"))
depends_on("python@3.6:", type=("build", "run"))
variant(
"cli",
default=False,
when="@0.10:",
description="Install dependencies for CLI-specific features",
)
depends_on("py-setuptools", type="build")
depends_on("py-filelock", type=("build", "run"))
depends_on("py-fsspec", when="@0.14:", type=("build", "run"))
depends_on("py-requests", type=("build", "run"))
depends_on("py-tqdm@4.42.1:", type=("build", "run"))
depends_on("py-tqdm", type=("build", "run"))
depends_on("py-pyyaml@5.1:", when="@0.10:", type=("build", "run"))
depends_on("py-typing-extensions@3.7.4.3:", when="@0.10:", type=("build", "run"))
depends_on("py-typing-extensions", when="@0.0.10:", type=("build", "run"))
depends_on("py-importlib-metadata", when="^python@:3.7", type=("build", "run"))
depends_on("py-packaging@20.9:", when="@0.10:", type=("build", "run"))
depends_on("py-inquirerpy@0.3.4", when="@0.14:+cli", type=("build", "run"))

View File

@@ -0,0 +1,22 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyInquirerpy(PythonPackage):
"""Python port of Inquirer.js
(A collection of common interactive command-line user interfaces).
"""
homepage = "https://github.com/kazhala/InquirerPy"
pypi = "inquirerpy/InquirerPy-0.3.4.tar.gz"
version("0.3.4", sha256="89d2ada0111f337483cb41ae31073108b2ec1e618a49d7110b0d7ade89fc197e")
depends_on("python@3.7:3", type=("build", "run"))
depends_on("py-poetry-core@1:", type="build")
depends_on("py-prompt-toolkit@3.0.1:3", type=("build", "run"))
depends_on("py-pfzy@0.3.1:0.3", type=("build", "run"))

View File

@@ -0,0 +1,24 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class PyJarvisUtil(PythonPackage):
"""Jarvis-util is a library which contains various utilities to aid with
creating shell scripts within Python. This library contains wrappers
for executing shell commands locally, SSH, SCP, MPI, argument parsing,
and various other random utilities."""
homepage = "https://github.com/scs-lab/jarvis-util"
git = "https://github.com/scs-lab/jarvis-util.git"
url = "https://github.com/scs-lab/jarvis-util/archive/refs/tags/v0.0.1.tar.gz"
maintainers("lukemartinlogan", "hyoklee")
version("0.0.1", sha256="1c5fbbfec410f1df8dc28edc87dd4421c3708f5bd22bf7ef010138d5c4a1ff8f")
depends_on("py-setuptools", type="build")
depends_on("py-psutil", type=("build", "run"))
depends_on("py-pyaml", type=("build", "run"))

View File

@@ -14,7 +14,7 @@ class PyJedi(PythonPackage):
version("0.18.1", sha256="74137626a64a99c8eb6ae5832d99b3bdd7d29a3850fe2aa80a4126b2a7d949ab")
version("0.18.0", sha256="92550a404bad8afed881a137ec9a461fed49eca661414be45059329614ed0707")
version("0.17.2", sha256="08d43addcbd656ed07e929631f8071eec567092bf16f2c19fc7bc272a97a77ef")
version("0.17.2", sha256="86ed7d9b750603e4ba582ea8edc678657fb4007894a12bcf6f4bb97892f31d20")
version("0.17.1", sha256="807d5d4f96711a2bcfdd5dfa3b1ae6d09aa53832b182090b222b5efb81f52f63")
version("0.15.1", sha256="ba859c74fa3c966a22f2aeebe1b74ee27e2a462f56d3f5f7ca4a59af61bfe42e")
version("0.15.0", sha256="9f16cb00b2aee940df2efc1d7d7c848281fd16391536a3d4561f5aea49db1ee6")

Some files were not shown because too many files have changed in this diff.